As you can imagine, there's a lot of talk of using supercomputers and distributed networks to perform difficult calculations, e.g. the Mersenne prime search or large-number factorization. That got me wondering how the computing power of the most powerful known supercomputers compares to the theoretical computing power of the largest known or theoretical botnets.
Would it make sense for a state actor like the NSA to use a stealth botnet of tens or hundreds of millions of devices to perform massively parallel calculations as an extension of, or in lieu of, whatever centralized supercomputer they use? How would you (or can you) calculate the power of a botnet of arbitrary size? I'm assuming you'd need to make a lot of assumptions about minimum processor capability, etc.
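A rough way to answer your own question is a back-of-envelope model: aggregate FLOPS = devices × per-device FLOPS, discounted by how many are online and how much CPU you can steal without being noticed. Here's a minimal sketch; every number in it (device count, 50 GFLOPS per device, 30% online, 10% usable CPU) is an assumption for illustration, not a measurement:

```python
def botnet_flops(devices, flops_per_device, online_fraction, usable_fraction):
    """Estimate aggregate FLOPS: raw per-device compute discounted by
    how many devices are online and what fraction of CPU goes unnoticed."""
    return devices * flops_per_device * online_fraction * usable_fraction

# Assumed: 100M compromised consumer devices at ~50 GFLOPS each,
# 30% online at any moment, 10% of CPU usable without detection.
estimate = botnet_flops(100e6, 50e9, 0.30, 0.10)
print(f"Botnet estimate: {estimate / 1e15:.0f} PFLOPS")

# Compare against an exascale-class supercomputer (~1 EFLOPS peak).
supercomputer = 1e18
print(f"Ratio vs 1 EFLOPS machine: {estimate / supercomputer:.2f}x")
```

The interesting part is that the answer is dominated by the discount factors, not the device count: change the usable-CPU fraction and the estimate swings by an order of magnitude, which is why "how stealthy" matters more than "how big". It also ignores the real killer for this workload, which is interconnect: a botnet has effectively zero inter-node bandwidth, so only embarrassingly parallel problems apply.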
The "cloud" would be a much better option, as it is incredibly powerful and its stability comes from running on highly similar hardware. See this classic text; the researcher who originally wrote it was later hired by Google [1]
And last: botnet >> supercomputer in terms of scalability, since it's cost-free, and because a supercomputer can always be part of a botnet, but not the other way around.