Author: Tina Gasperson
In 1993, the National Center for Supercomputing Applications (NCSA) gave us Mosaic, the first Web browser with a graphical user interface. Today, the NCSA is still innovating, with a project that monitors how global climate change is affecting plants and wildlife, one that tracks oil spills, and another that predicts the possible effects of seismic activity on bridges and other structures. To facilitate communication and collaboration among research stations, NCSA is making use of the Web infrastructure it helped to launch almost 15 years ago in a research program called the CyberCollaboratory. Not surprisingly, open source software is an integral part of the Web-based initiative.
Jim Myers, associate director for Cyberenvironments at NCSA, says open source is important because of the heavy customization needed to connect components that meet the requirements of different researchers. Cyberenvironments integrates distributed computing and data resources to provide new scientific processes and greater productivity for researchers in disparate locations. To tie together all the pieces, Myers selected the Liferay open source portal. He looked at GridSphere and Sakai, two other open source portals, before going with Liferay, which fit the bill because it was “ready to scale. Still open source, but further along the curve toward enterprise.” Myers liked Liferay’s wider selection of “off the shelf portlets,” which he says made it possible to “concentrate on getting inside and changing the things we needed to. We could be less concerned about having to take on the burden of doing the maintenance. Enough people were contributing to Liferay that it made sense.”
The portlets make it easy for NCSA developers to quickly customize the portal according to researchers’ needs. Myers says getting that quick customization wouldn’t be so easy if NCSA were dealing with a traditional proprietary application. “We’re trying to work with users who are doing something that hasn’t been done before, and we’re not clear upfront with what’s going to be needed. It’s very hard to gather requirements and do that very close interaction with users [when dealing with a proprietary vendor]. It slows the process down, and it’s very important that we work in an agile style.”
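For readers unfamiliar with the technology, a portlet is a pluggable Java component that renders one fragment of a portal page, which is what makes this kind of incremental customization practical. The sketch below is a minimal, hypothetical example written against the standard JSR-168 portlet API that Liferay supports; the class name and output are illustrative only, not NCSA's actual code.

```java
import java.io.IOException;
import java.io.PrintWriter;

import javax.portlet.GenericPortlet;
import javax.portlet.PortletException;
import javax.portlet.RenderRequest;
import javax.portlet.RenderResponse;

// Hypothetical portlet: renders a single fragment of the portal page.
public class SpillStatusPortlet extends GenericPortlet {

    @Override
    protected void doView(RenderRequest request, RenderResponse response)
            throws PortletException, IOException {
        // A portlet emits an HTML fragment, not a whole document;
        // the portal composes many such fragments into one page.
        response.setContentType("text/html");
        PrintWriter out = response.getWriter();
        out.println("<p>Latest simulated spill trajectory: (placeholder)</p>");
    }
}
```

Because each portlet is packaged and deployed independently, developers can add or rework one tool for a research group without touching the rest of the portal, which fits the agile, requirements-as-you-go style Myers describes.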
The CyberCollaboratory hosts 400 registered users, who participate in wikis, blogs, document sharing, and message boards to foster interaction. NCSA developers have created a number of custom data analysis tools for the community, including Google Maps mashups and social network analysis tools. One of the first projects the CyberCollaboratory hosted demonstrated how field researchers could discuss oil spills and even create spill simulations, tracking hypothetical spill trajectories using historical data and predicting what effects a spill could have in a given geographical area.
This kind of collaboration creates permanency in the research, Myers says. “We had to start thinking in this mode of how is this stuff going to last for a long time. We needed to do much more to make the systems auditable by the scientists. We want the person who does research next to be able to plug their component in.” The portal makes this “building upon” possible because it keeps all the research, all the discussion, all the collaboration tied together in a searchable archive. “Other people in the community can go back to the portal and use the data. We do a lot of our software development based on grants that last from one to five years at the most. If the portal is not open source, then someone has to pick up and start from scratch when you’re gone.”
Categories:
- Open Source
- Science & Research
- Case Study