The Future of Computing

By Katelyn Begany

November 23, 2013

Following renovations, the Calvin Lab’s second floor (formerly lab space) became the Simons Institute’s cafe and common area. Credit: Muli Safra

Last May, the Simons Foundation, a private organization committed to advancing research in basic science and mathematics, awarded UC Berkeley a competitive $60 million grant to establish the Simons Institute for the Theory of Computing. This global center for theoretical computer science sprang into full-scale operation this fall with two programs: Theoretical Foundations of Big Data Analysis and Real Analysis in Computer Science. The central foci of research at the Simons Institute will continuously evolve in the form of intensive semester-long programs drawing experts from across the theory of computing. At the helm of this dynamic research landscape is a scientific leadership team composed of three UC Berkeley computer science professors: Director Richard Karp, Associate Director Alistair Sinclair, and Senior Scientist Christos Papadimitriou. Established at a time of unprecedented reliance on computers, the Simons Institute has the potential to become a critical resource for researchers in academia and industry alike.

The theory of computing is largely concerned with determining which problems can, in principle, be solved by a computer, as well as the temporal and spatial (i.e., processing time and memory) requirements of those computations. According to Karp, “In the early days of computer science, people defined it as a science of the artificial, meaning that it was removed from natural phenomena.” While “computer” may conjure images of laptops and desktops for most people, to a theoretical computer scientist a “computer” is any conceivable abstract mathematical machine. In fact, the modern computer derives from the hypothetical Turing machine conceived by the mathematician Alan Turing. In addition to exploring central problems in theoretical computer science, the Simons Institute seeks principles of computing in the algorithms, data structures, and computation techniques needed to advance basic science.
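To make the idea of an abstract mathematical machine concrete, here is a minimal sketch (in Python, purely for illustration and not drawn from the article) of a Turing machine simulator: a tape, a read/write head, and a finite table of transition rules are all it takes to define a “computer” in this sense.

```python
# Minimal Turing machine simulator (illustrative sketch only).
# The machine below simply inverts a binary string and halts.

def run_turing_machine(tape, transitions, start_state, halt_state, blank="_"):
    """Simulate a single-tape Turing machine and return the final tape contents."""
    tape = list(tape)
    head, state = 0, start_state
    while state != halt_state:
        # Extend the tape with blanks if the head moves off either end.
        if head < 0:
            tape.insert(0, blank)
            head = 0
        if head >= len(tape):
            tape.append(blank)
        symbol = tape[head]
        new_state, new_symbol, move = transitions[(state, symbol)]
        tape[head] = new_symbol
        head += 1 if move == "R" else -1
        state = new_state
    return "".join(tape).strip(blank)

# Transition table: in state "scan", flip each bit and move right;
# halt when the blank symbol marks the end of the input.
flip_bits = {
    ("scan", "0"): ("scan", "1", "R"),
    ("scan", "1"): ("scan", "0", "R"),
    ("scan", "_"): ("halt", "_", "R"),
}

print(run_turing_machine("10110", flip_bits, "scan", "halt"))  # -> 01001
```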

Since the Internet and World Wide Web became ubiquitous, the limits of computing have been challenged and pushed forward at an accelerating pace as the desire to store and analyze massive amounts of data mushrooms. Attracting experts from astronomy, business, computer science, engineering, finance, mathematics, statistics, and industry, the Theoretical Foundations of Big Data Analysis program aims to reveal unified principles of computing that surface when working with problems in big data. Through a combination of presentations and discourse, participants will tackle challenges in this emerging field such as data compression, dimensionality reduction, and the scaling of inference and optimization algorithms to massive datasets.
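As a taste of one of these challenges, the sketch below illustrates dimensionality reduction by Gaussian random projection, the idea behind the Johnson–Lindenstrauss lemma. The synthetic dataset and target dimension are invented for illustration and are not taken from the program itself.

```python
# Dimensionality reduction by random projection: compress high-dimensional
# points while approximately preserving pairwise distances.
import numpy as np

rng = np.random.default_rng(0)
n_points, original_dim, reduced_dim = 500, 10_000, 300

# Synthetic high-dimensional data standing in for a massive dataset.
X = rng.normal(size=(n_points, original_dim))

# Random projection matrix, scaled so distances are preserved in expectation.
R = rng.normal(size=(original_dim, reduced_dim)) / np.sqrt(reduced_dim)
X_reduced = X @ R

# Compare a few pairwise distances before and after projection.
for i, j in [(0, 1), (2, 3), (4, 5)]:
    d_orig = np.linalg.norm(X[i] - X[j])
    d_red = np.linalg.norm(X_reduced[i] - X_reduced[j])
    print(f"pair ({i},{j}): original {d_orig:.1f}, reduced {d_red:.1f}")
```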

Joshua Bloom, Associate Professor of Astronomy, participant in the Theoretical Foundations of Big Data Analysis program, and “big data” guru, has encountered many of these obstacles in his research. Bloom and his team faced a modern needle-in-a-haystack problem: identifying the rare cosmological explosions of dying stars called supernovae. Given that no supernova has been observed in our galaxy since 1604, they employed the world’s largest infrared telescope to scan the universe. The challenge was twofold: to harness the massive torrents of data streaming in from the telescope and to sift through them in real time to optimally focus this costly resource on supernovae. Their research, which combined novel database technology, parallel computing techniques, and machine-learning algorithms, is a testament to the fact that computer science is no longer solely the realm of computer scientists and engineers.
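To give a flavor of this kind of needle-in-a-haystack filtering (and not as a depiction of Bloom’s actual pipeline), the sketch below trains an off-the-shelf classifier on synthetic, highly imbalanced “candidate” data and keeps only the highest-scoring events for costly follow-up.

```python
# Rare-event classification sketch: flag the few promising candidates in a
# large stream so follow-up resources go where they matter most.
# Data are synthetic; the model choice is purely illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic candidate detections: roughly 1% are true transients (class 1).
X, y = make_classification(
    n_samples=20_000, n_features=20, weights=[0.99, 0.01], random_state=0
)
X_train, X_stream, y_train, y_stream = train_test_split(
    X, y, test_size=0.5, random_state=0
)

# Train offline, then score the incoming "stream" and keep only the
# highest-probability candidates for telescope follow-up.
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

scores = model.predict_proba(X_stream)[:, 1]
follow_up = scores > 0.9
print(f"{follow_up.sum()} of {len(X_stream)} candidates flagged for follow-up")
```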

In a reminder that computer science, statistics, and mathematics are not simply static and immutable tools, Bloom challenges scientists to break away from problems that can be solved through the use of established analytical tools alone. He calls this the “novelty²” challenge of raising the impact of research “to the second power” by simultaneously contributing to the development of novel analytical tools and to basic science. “Some of the best things that are going to happen on this campus are not going to be because somebody from the computer science department teams up with someone in seismology and helps the seismologist figure out something they didn’t know before,” Bloom predicts. “It’s going to be because the seismologist found a type of data and a type of question that wasn’t easily answered by existing theory and frameworks within statistics or computer science.”

Continued education in advanced statistical, mathematical, and computational sciences is an essential prerequisite for novelty² research and the formation of symbiotic partnerships. The Simons Institute will play a central role in germinating collaborative research by mediating interactions among computing experts with diverse sets of knowledge. The extraordinary award of the Simons Institute to UC Berkeley is much more than a triumph for the Department of Computer Science or for the university’s prestige; it is a beacon of a bright future for cutting-edge collaborative scientific research at UC Berkeley.

This article is part of the Fall 2013 issue.
