The world’s most important scientific facilities, from the CERN Large Hadron Collider to the National Radio Astronomy Observatory, handle massive amounts of data every day, data that must be mined, stored, analyzed and visualized. It’s a colossal task, one that calls for help from the top minds in data management.

So the National Science Foundation (NSF) is turning to expert computer scientists from the University of Utah’s School of Computing and five other top universities to help these facilities and other research projects manage their data in faster and more affordable ways.

Members of the UofU’s School of Computing are part of the new CI Compass, an NSF Center of Excellence dedicated to helping these research facilities manage their “data lifecycle” more effectively.

“The NSF has invested hundreds of millions of dollars in large facilities, such as massive telescopes and oceanographic observatories. The problem is that each has become a technological island, and it’s difficult for them to complete their scientific mission and get up to speed in their data needs,” said UofU School of Computing professor Valerio Pascucci, who is director of the U’s Center for Extreme Data Management Analysis and Visualization. “They don’t have sufficient internal expertise. So we work with each of them to advise them on the latest solutions and modernize their software infrastructure, to do things faster or cheaper, and to make sure they don’t become stale and outdated.”

Joining the UofU in the new center are researchers from Indiana University, Texas Tech University, the University of North Carolina at Chapel Hill, the University of Notre Dame and the University of Southern California. In addition to Pascucci, the UofU team includes School of Computing research associate professor Robert Ricci and researchers Giorgio Scorzelli and Steve Petruzza.