For years you have been a loyal SETI@home user, searching for that elusive extraterrestrial signal at every idle moment your computer could spare. You have processed hundreds of work units and returned thousands of potential candidate signals to SETI@home headquarters. But never yet have you seen anything resembling the “Real Thing.” Until now. There, flashing on your screen is a beautiful gaussian, as clear and crisp as anything a SETI scientist could have dreamt up. Could it be that after all these years you have just detected a signal from E.T.?
With your heart pounding you rush to your computer and send urgent messages to SETI@home or The Planetary Society. “A BIG signal is heading your way!” you write breathlessly. “Could you please be sure to process it quickly? It just might be the signal we’ve been waiting for...” Sadly, the SETI@home crew appears surprisingly unaffected by your excitement. “Your candidate signal has been received and stored in our database,” you are told calmly, “and will be analyzed within six months.” They promise to be in touch if it proves to be a strong candidate for an extraterrestrial signal. And that is that.
The magic moment passes, the months go by, and no word arrives from SETI@home. Your original enthusiasm has greatly subsided, but once in a while you still wonder about that unexplained gaussian: “Could it have been the Real Thing? And if not, what was it?” You realize that you may never know the answer.
So it has been for many dedicated SETI@home users, who have occasionally run across promising-looking patterns in their work units. As members of the SETI@home team often explain, they cannot analyze a candidate signal, even the most promising one, at the time it comes in. This is because a candidate signal does not stand by itself. It has to be confirmed by other signals coming from the same point in the sky at the same frequency, but measured at different times. “Signals are constantly coming in from clients,” explains SETI@home Project Scientist Eric Korpela, “and they build up in a database. Every six months or so we sift through what we have, ranking the different candidates and locating the most promising ones.” Then, when the opportunity arises, the SETI@home crew revisits the strongest candidates with a radio telescope, looking for a repeat performance (i.e., a consistent signal).
“This system,” explains Korpela, “worked well as long as we had only millions of candidate signals to work with.” Such was the case with SERENDIP, the sky survey project that spawned SETI@home. Now, however, after six years of SETI@home, with millions of users processing and returning data, the candidate signals list is approaching 2 billion! With such an enormous, and fast-growing, amount of data to process, a program that only analyzes the data every six months runs the risk of falling further and further behind.
And so Korpela, working closely with Chief Scientist Dan Werthimer and Software Engineer Jeff Cobb, came up with a new plan: Real Time Analysis of incoming signals. Under this new method, candidates sent in by users around the world will be quickly analyzed and compared to existing signals, and each candidate’s promise as a true E.T. signal will be evaluated almost in real time.
How will this be possible? Cobb explains: the entire surface of the sky will be divided into 50 million unique “pixels,” creating a virtual map of the heavens. Every candidate signal sent by users will be tagged with a unique number called a “cubic pixel” (“qpix”) number, which will indicate precisely in which pixel in the sky it was detected, and at what frequency. That unique pixel in the sky-map will then become a “hot spot,” indicating that a new signal has been detected at that location. The signal itself, along with its qpix number, will then be stored in the database.
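The article does not spell out how a “qpix” number is actually computed. A minimal sketch of the idea, assuming an equirectangular sky grid of roughly 50 million pixels and a narrow frequency band around 1420 MHz (all bin counts and band edges here are illustrative assumptions, not the project’s actual scheme), might look like this:

```python
N_RA_BINS = 10000    # assumed: right-ascension bins
N_DEC_BINS = 5000    # assumed: declination bins (50 million sky pixels total)
N_FREQ_BINS = 1024   # assumed: coarse frequency bins across the band

def qpix(ra_deg, dec_deg, freq_hz, f_lo=1418.75e6, f_hi=1421.25e6):
    """Pack sky position and frequency into a single integer key.

    Hypothetical encoding for illustration: SETI@home's real qpix
    scheme is not described in the article.
    """
    ra_bin = min(int(ra_deg / 360.0 * N_RA_BINS), N_RA_BINS - 1)
    dec_bin = min(int((dec_deg + 90.0) / 180.0 * N_DEC_BINS), N_DEC_BINS - 1)
    f_bin = min(int((freq_hz - f_lo) / (f_hi - f_lo) * N_FREQ_BINS), N_FREQ_BINS - 1)
    # Combine the three bin indices into one number, like digits of a
    # mixed-radix integer: one key per (sky pixel, frequency bin).
    return (ra_bin * N_DEC_BINS + dec_bin) * N_FREQ_BINS + f_bin
```

Two detections at nearly the same position and frequency land in the same bin and so get the same key, which is what lets the database treat that pixel as a single “hot spot.”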
The rest of the work, said Cobb, will be done by a program he calls the “Near Time Persistency Checker,” which will be running continuously as new signals are entered into the database. It will zero in on the “hot spots” where new candidate signals have been identified, and compare the new signals with any others that have previously been identified at the same spot. The program will then determine whether the new data warrants an increase in score for that particular location, a decrease, or no change. As its name suggests, the most crucial factor the Near Time Persistency Checker will be analyzing is persistency – whether a signal has already been detected at that point in the past at the same or a close frequency. Other factors include the strength and shape of the signal.
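The scoring logic described above can be sketched in a few lines. This is a toy in-memory version, assuming a simple score that grows with repeat detections; the real Near Time Persistency Checker runs against SETI@home’s server-side database and its actual scoring formula is not given in the article:

```python
from collections import defaultdict

# Hypothetical in-memory stand-ins for the signal database and the
# per-location score table.
db = defaultdict(list)      # qpix key -> powers of previously seen signals
scores = defaultdict(float) # qpix key -> current candidate score

def persistency_check(qpix_key, power):
    """Toy version of the persistency check: a repeat detection at the
    same sky pixel and frequency raises that location's score (weighted
    by signal strength); a first-time detection leaves it unchanged."""
    prior = db[qpix_key]
    if prior:                      # persistency: seen at this spot before
        scores[qpix_key] += power  # strength also contributes
    db[qpix_key].append(power)
    return scores[qpix_key]
```

A first signal at a fresh hot spot scores zero; only when a second signal arrives at the same qpix does the score rise, mirroring the rule that a candidate “does not stand by itself.”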
Since the Near Time Persistency Checker will be running constantly, the rankings of the different candidate signals will also be updated constantly. “We won’t have to do a massive analysis of all the data every six months in order to come up with a list of the top candidates,” explained Cobb, “because that list will now be generated automatically.” Before the Arecibo reobservations in 2003, for example, the SETI@home crew spent months analyzing the data to come up with a list of the 200 most promising candidates. Once the new approach is implemented, this will no longer be necessary: the list, and the rankings, will be regenerated continuously, every single day.
The conceptual breakthrough that makes this possible, according to Korpela and Cobb, is the ability to assign a single “qpix” number to a candidate, which indicates both its location in the sky and its frequency. This, explained Cobb, makes it possible to search through the database extremely quickly and match the appropriate old and new signals with each other. Without the “qpix,” the search would have to be done according to three separate parameters, the two celestial coordinates and the frequency. This would have made real time analysis so complex as to be practically impossible.
Even so, Cobb cautions: “We are still in the proof-of-concept stage for this approach. There may be roadblocks ahead that we had not considered.” But if all goes well, Korpela and Cobb hope to have the system running within a few months, analyzing all SETI@home signals as they come in.
And what of the SETI@home user who thought he saw something but never knew what became of the promising gaussian? He or she will not have to wait any longer. Once real time analysis is implemented, the SETI@home team hopes to post a running ranking of the different signals on the website, together with the users who contributed to each candidate. Curious users will simply go to the list to find out the fate of “their” signal. It may simply be a test signal, or radio interference; but it may, just possibly, be the Real Thing.