Research Community Looks to Open Source SDN to Help Distribute Data from the Large Hadron Collider


When the Large Hadron Collider (LHC) starts back up in June, the data collected and distributed worldwide for research will surpass the 200 petabytes exchanged among LHC sites the last time the collider was operational. Network challenges at this scale are different from those enterprises typically confront, but Harvey Newman, Professor of Physics at Caltech and a leader in global-scale networking and computing for the high energy physics community for the last 30 years, and Julian Bunn, Principal Computational Scientist at Caltech, hope to introduce to this rarefied environment a technology that enterprises are also now contemplating: Software Defined Networking (SDN). Network World Editor in Chief John Dix recently sat down with Newman and Bunn to get a glimpse inside the demanding world of research networks and the promise of SDN.

Can we start with an explanation of the different players in your world?


Harvey Newman, Professor of Physics at Caltech

NEWMAN: My group is a high energy physics group with a focus on the Large Hadron Collider (LHC) program that is about to start data taking at a higher energy than ever before, but over the years we've also had responsibility for the development of international networking for our field. So we deal with many teams of users located at sites throughout the world, as well as individuals and groups that are managing data operations, and network organizations like the Energy Sciences Network, Internet2, and GEANT (in addition to the national networks in Europe and the regional networks of the United States and Brazil).

Read more at Network World.