Eliezer Yudkowsky, of the Singularity Institute, interviews virtual reality pioneer Jaron Lanier. Lanier is a proponent of “phenotropic computing”, the idea that “intelligent” computer routines could be programmed to interact via interfaces designed for humans. In this interview, however, Lanier doesn’t discuss phenotropic computing; instead, he sets out to demolish any faith in the “singularity”, relentlessly scorning blind faith in the reducibility of consciousness.
Lanier has a ton of credibility: he was mentored by Marvin Minsky, has known Dennett for decades, and is responsible for some of the most interesting work in virtual reality of the past few decades. I would hate to ever get into an argument with Lanier, but Yudkowsky holds his own quite well. I’ve watched several interviews conducted by Yudkowsky, and this has to be the most difficult by far. He does a great job, but in this debate, which I’ve watched twice, I’m persuaded by Lanier.