so not a total flake. I don't buy neuro-augs in the next 15 years though. 30, perhaps. We've already had a paraplegic woman piloting an F-35 simulator with her mind, plus various advances in biotechnology, robotics etc. Never say never. But not on his timescale.
Thing is, this is where I think the "softer" sciences like sociology and psychology come in. Just because something is technically possible doesn't mean it will see widespread adoption among the general population. Take your example, for instance. Why would an average person with no disabilities choose to undergo surgery or injection with nanobots (with all the potential risks such an operation would necessarily entail) in order to mentally control machines, when non-integrative interfaces are already well-developed?
Don't get me wrong; I find the idea of human-machine interfaces fascinating, and it's something I would definitely consider if it were proven to be reasonably safe, if it offered some kind of advantage or improvement in quality of life (and if I could afford it!). But I suspect I'm an outlier as far as this sort of thing goes. I value my intelligence and my morality, but I don't place any intrinsic value on my status as an unmodified human being. I'd expect significant economic and/or social pressures would have to exist before widespread adoption of such emerging technologies became a reality. You're not just buying a shiny new toy here; you're modifying a part of what you are, and that's a big step even for someone willing to take it.
Let's not forget that there are wider social implications to the development and adoption of technology. What if the reason people choose to cybernetically modify themselves is that economic pressures strongly urge them to, in order to remain competitive in a labour market shaped by increasing automation, expert systems, artificial intelligence, that kind of thing? That's not a politically neutral development or an unalloyed good. But I suspect you know all this already.
Optimism's all well and good, but I think he reckons we're going to hit some critical point where it all goes a bit exponential and the work of 20 years is bootstrapped in 5.
Kurzweil seems to strongly subscribe to a version of the hard take-off model of the technological singularity, which I don't, for various reasons.