A group of California computer scientists has built a tool for analyzing climate change, mapping clean water access, and formulating strategies to eradicate malaria, cancer, and AIDS, all using a system they can rightfully claim to be the most powerful computer network on the planet. And here’s what they want to talk about: how paltry the whole thing is.

The Berkeley Open Infrastructure for Network Computing may be rocking 157 petaFLOPS (and counting), but it isn’t nearly as muscular as scientists thought it would be when the idea of volunteer computing emerged in the mid-‘90s. When it became clear in the early aughts that projects like BOINC and IBM’s World Community Grid could use middleware to harness the spare computing power of huge numbers of PCs, running simulations to solve massively parallel problems (think: scanning astronomical data to find extraterrestrial life), the consensus among researchers was that the future of data-driven investigation had arrived. The technology remains awe-inspiring, but it still isn’t self-sufficient. In order to grow, BOINC needed to generate enthusiasm, to sign people up, to make studies of pulsars and peptides feel like a movement.

I’ve been volunteer computing for BOINC since the late ‘90s, when I was an undergrad at the University of California-Berkeley. Since then I’ve installed the software on half a dozen new computers and tracked volunteer computing statistics as they have, depressingly, slumped. The systems count millions of total volunteers, but the data make clear that only a few hundred thousand are actively returning results. And this for software that could run on machines numbering in the low billions globally. (If you want to join us, check out this 10-minute guide to getting started.)

Public relations efforts around volunteer computing have been uniformly amateurish. The sum of the project’s recruiting is a page on BOINC’s website that you have to really study to understand. And it’s incredibly outdated, giving volunteers tips on how to write letters to computer magazines in countries around the world to boost exposure. “To get a magazine to write about BOINC,” it reads, “you need to convince them that there’s something new and exciting.” The site also urges volunteers to update the page when and if they reach out to media, so as not to duplicate efforts. As of this writing, the last such update was in September 2014.

For an explanation, I reached out to Dr. David Anderson, BOINC’s project director, architect, and developer. He’s an adjunct computer science professor at the University of Houston and a research scientist at the University of California-Berkeley. His day jobs don’t allow for spending much time or funds pursuing volunteers to join BOINC.