In science, a deluge of data
The torrents of digital data pouring out of scientific research have spawned a debate over who should have access to the data, how it can be stored and who will pay to do so.
Vinton Cerf, a vice president of Google, said the issue has become crucial for public and private institutions alike.
And Alan Blatecky, the director of advanced cyberinfrastructure at the National Science Foundation in Virginia, said: “Data is the new currency for research. The question is how do you address the cost issues, because there is no new money.”
There is growing international recognition of the scope of the problem. The Research Data Alliance, begun last August with just eight researchers, now has more than 750 academic, corporate and government scientists and information technology specialists in 50 countries.
Agencies in the United States are proposing to “support increased public access to the results of research funded by the federal government.”
Dr. Cerf and Francine Berman, a computer scientist at Rensselaer Polytechnic Institute in Troy, New York, argue in a paper published in the journal Science that companies and colleges must invest in new computer data centers so that crucial research data is not irretrievably lost.
“There is no economic ‘magic bullet’ that does not require someone, somewhere, to pay,” they wrote.
Dr. Berman leads the United States branch of the Research Data Alliance.