Welcome back to our regular Friday feature, The Future in Five Questions. Today we have Nick Bostrom, the author of “Superintelligence,” the New York Times bestselling book that united Elon Musk and Bill Gates over concerns about the existential risks of AI. His ongoing research focuses on, among other things, whether AI should have rights. Responses have been edited for length and clarity.

What’s one underrated big idea?

The moral status of future digital minds. The idea is that as AI systems become more sophisticated and match smaller mammals in their capabilities, and then even more so as they approach human capabilities, they might begin to have claims to moral status to match. This issue is very much on the fringes of the debate, even though it’s now roughly where AI safety was in the 2012-2014 period. It would be premature for legislation to enter this area. There are still so many fundamental unknowns here. We don’t have a good grasp of exactly what the criteria would be for when a digital mind is conscious or when it has other properties that might grant moral status.

What’s a technology you think is overhyped?

A few years ago, I thought 3D printing was overhyped. Like, oh, in the future, everybody will have a 3D printer in their home! It always seemed implausible to me that you would want to print out your little plastic utensils and replace your china with that. Even if you could print out that coffee mug, how much of a limiting factor was getting hold of a coffee mug for consumers in the first place?

What book most shaped your conception of the future?

K. Eric Drexler’s “Engines of Creation,” where he laid out his vision for the future of nanotechnology back in 1986. Then there was a book by a philosopher, John Leslie, “The End of the World,” which was an early example of trying to think about risks to our future. And also a book by Hans Moravec, “Mind Children,” that was an early discussion of the future of AI.

What could government be doing regarding tech that it isn’t?

Lawmakers and regulators should be more on top of the rapid advances in synthetic biology. The culture in the field of nuclear physics looked very different after Hiroshima and Nagasaki. Nuclear physicists realized that what they were doing was not just creative science, but that they had some responsibility, and a need for secrecy, when their work could be used to manufacture nuclear weapons. That same understanding is not very widespread in the biotech world. There is still an emphasis on democratizing access, open publishing and making tools available, but it is a field as dangerous as nuclear physics. More so, in fact: even if you knew all the designs, tools and tips needed to manufacture a nuclear weapon, you still could not make one, because you’d need very difficult-to-obtain raw materials, whereas in biology you’re not limited by any hard-to-obtain ingredients, so all you require is the knowledge.

What has surprised you most this year?

The failure to fund the pandemic preparedness bill, specifically the failure to get it fully funded the first time around. I had thought, given COVID, this would be one of the few things that would have bipartisan support. I would have thought that a lot of political interests would have been served by pushing for this.