SETI@home Is Over. But the Search for Alien Life Continues

In 1995, the computer scientist David Gedye had an idea that could only originate at a cocktail party. What if the world’s personal computers were linked together on the internet to create a virtual supercomputer that could help with SETI, the search for extraterrestrial intelligence? The network would be able to sort through the massive amounts of data being collected by radio telescopes, seeking signals that might point to an alien civilization around another star. A distributed supercomputer sounded outlandish at the time, but within four years, Gedye and his collaborators had built the software to make it a reality. They called it SETI@home.

On Tuesday, researchers at the Berkeley SETI Research Center announced they would stop distributing new data to SETI@home users at the end of March. It marks the culmination of an unprecedented 20-year experiment that engaged millions of people from almost every country on Earth. But all experiments must come to an end, and SETI@home is no exception. So far, the researchers at Berkeley have only been able to analyze small portions of the SETI@home data. They had to hit pause on the public-facing part of the experiment to analyze the full two decades of radio astronomy data they’ve collected to see what they might find.
“For 20 years, there’s been this fight between keeping the project running and getting the results out to the scientific community,” says Eric Korpela, the director of SETI@home. “At this point, we can’t even be sure that we haven’t found anything because we’ve been doing most of our data analysis on small test databases rather than the whole sky.”

Officially launched at Berkeley on May 17, 1999, the SETI@home initiative helped address one of the biggest challenges in the search for extraterrestrial intelligence: noise. Professional alien hunters are in the business of searching for weak radio signals in a vast sky washed out by interference from satellites, TV stations, and astrophysical phenomena like pulsars. This means they are fundamentally grappling with a big data problem—they’re looking for a single signal sent by ET floating on a vast ocean of radio flotsam.
Filtering through all this data requires computing power—and lots of it. More processors crunching data from outer space means a more sensitive analysis of more signals. By borrowing unused processing power from personal computers around the world, SETI@home could plow through radio telescope data faster than ever before. When a computer was idle, the SETI@home program launched a screensaver that showed a field of colorful spikes representing signals collected at the Arecibo radio telescope in Puerto Rico as it scanned the cosmos. And for anyone who downloaded the software, it meant that if ET called Earth, it could very well be your own CPU that picked up the phone.
It didn’t take long for the idea to catch on. SETI@home quickly grew into what its collaborator, the nonprofit Planetary Society, has called its “most successful public participation project ever undertaken.” As WIRED reported in 2000, within months of SETI@home’s launch, more than 2.6 million people in 226 countries were volunteering their spare processing power to parse the mounds of data generated by alien-hunting radio telescopes. Together, they ran about 25 trillion calculations per second, which made SETI@home more than twice as powerful as the best supercomputer in the world at that time.

“We didn’t anticipate how fast it would grow,” says Dan Werthimer, who helped create SETI@home and now serves as its chief scientist. “It grew exponentially and I think it’s because people are really excited about the question of whether we’re alone. It’s not very often that people can participate in a science project that has these sorts of profound implications.”