A while ago, I complained that CPU frequency has seen no real advance in 10 years. Adding more cores (virtual ones, no less!) will not drastically improve a machine's computing performance. It only enables parallel computing, which, if used correctly, can improve overall performance but won't change the order of magnitude of computing speed.
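To put a rough number on that "won't change the order of magnitude" point, here is a back-of-envelope Amdahl's-law sketch of my own (not something from the linked page); the 90% parallel fraction is an assumption picked purely for illustration:

```python
# Amdahl's law: if a fraction p of the work parallelizes and the rest
# stays serial, the best speedup on n cores is 1 / ((1 - p) + p / n).

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Ideal speedup when `parallel_fraction` of the work scales across `cores`."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

if __name__ == "__main__":
    p = 0.90  # assume 90% of the workload parallelizes cleanly
    for n in (2, 4, 8, 16, 1_000_000):
        print(f"{n:>9} cores -> {amdahl_speedup(p, n):5.2f}x speedup")
    # Even with effectively unlimited cores the speedup tops out at
    # 1 / (1 - p) = 10x here; piling on cores stops paying off long
    # before that, unlike a straight clock-frequency increase.
```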
It comes as little surprise that the GPU is showing steadier progress, especially in scientific computing and cryptography.
I came across this page, which explains why CPU clock rates have been limited for so long. I hope you'll find it as interesting as I did.
Finally, a question. Are we close to seeing real-world applications of memristors as computer components? Bonus points if your answer comes with a link to a reliable source of information.