Happy Friday and welcome back to our weekly feature: The Future in 5 Questions. Today we have writer David Auerbach, who spends quite a lot of time contemplating technology’s place in society. A former software engineer at Google and Microsoft, he has come to the unsettling conclusion that it’s actually too late to rein in our digital networks in the macro sense — and in a new book, he explains that we’re now trapped in “Meganets” that no one can really control. Read on to hear his thoughts on the runaway consequences of networked systems, how AI actually thinks about data, and the lack of technical expertise in Congress. Responses have been edited for length and clarity.

What’s one underrated big idea?

That the systems we interact with, often in fairly simple ways, are having subtle and only partly intended consequences. So much of the operation of these technological systems is beneath the surface and out of the control even of the engineers and executives who own them and profit off of them. The deployment of the technology that I deem the “Meganet” — the assemblage of a huge number of servers and huge numbers of people interacting with those servers in a constant feedback loop — is actually creating its own sort of ecological rules that are beyond what we can humanly control. And so what we perceive to be matters of human control are actually closer to something like the global economy, which we acknowledge to be not quite as controllable as we once thought it could be.

What’s a technology you think is overhyped?

Deep learning and AI. They have accomplished amazing things. But I think people are talking about these technologies in misdirected ways. AIs are great at pattern recognition in the absence of explicit criteria. They just need to know what qualifies as a feature. And that’s good for sifting through a lot of amateur data. But in doing so, they’re going to reiterate biases in ways that are very hard to fix.
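Auerbach’s point about models reiterating bias can be made concrete with a toy sketch. Everything below is hypothetical — the data, the feature name, and the task are invented for illustration — but it shows the mechanism: a model trained only to match patterns in past decisions will turn any skew in those decisions into a rule.

```python
# Minimal sketch of bias reiteration: a toy "classifier" that learns
# the most common label per feature value from historical examples,
# then applies that pattern to new cases.
from collections import Counter, defaultdict

def train(examples):
    """examples: list of (feature_value, label) pairs."""
    counts = defaultdict(Counter)
    for feature, label in examples:
        counts[feature][label] += 1
    # The learned "model" is just the majority label for each feature value.
    return {f: c.most_common(1)[0][0] for f, c in counts.items()}

# Hypothetical screening data in which past decisions skewed by zip code.
history = [
    ("zip_A", "interview"), ("zip_A", "interview"), ("zip_A", "reject"),
    ("zip_B", "reject"), ("zip_B", "reject"), ("zip_B", "interview"),
]

model = train(history)
print(model["zip_A"])  # interview — the historical skew becomes the rule
print(model["zip_B"])  # reject
```

No explicit criterion for rejecting anyone was ever written down; the pattern was simply recognized and reproduced, which is why such biases are hard to locate and fix after the fact.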
A lot of what we want to blame on AI is actually more attributable to the general structure of networked societies. And AI is going to exacerbate some of those tendencies by making them more opaque.

What book most shaped your conception of the future?

Stanisław Lem’s “Summa Technologiae” — it’s a nonfiction work of speculation about the future of human interactions with technology. For something that was written in the ’60s, it is incredibly relevant today. It touches on things like VR, and it touches on the limits of human knowledge. He was heavily influenced by cybernetics, the study of control systems. He took the work of Norbert Wiener and W. Ross Ashby — two of the big cybernetics people — and put it into a more speculative but fairly rigorous mindset. Like: how should we expect technology to condition our experience of the world? Because there’s always this tendency to just take for granted the world in which we’ve grown up.

What could government be doing regarding tech that it isn’t?

Learning about it. The idea that you can regulate this stuff without actually understanding the guts of it — the principles of how computers treat data, even at the simplest level — leads to a lot of completely unhelpful fixes and recommendations. There seem to be vanishingly few politicians who even have advisors with real technical knowledge. A lot of the designated experts are actually not as expert as they used to be. So I think the problem is actually worsening.

What has surprised you most this year?

The process by which apathy and indifference toward COVID mitigation emerged in a city like New York. New York went from being a very pro-vaccine, pro-mask town to one that became laissez-faire about it (even with regard to children) in a very short period of time. A lot of people basically stopped caring about boosters. A year ago, you would have gone on the subway and everyone would have been masked. Now it’s maybe 50-50 at best.
That tipping-point mentality — and the seeming difficulty in identifying the exact rationale for what tipped things over — is something that I find very fascinating.