This week a legion of the biggest players in the world of computer graphics descended on the Los Angeles Convention Center for the annual SIGGRAPH convention (the Special Interest Group on Computer Graphics and Interactive Techniques, in case you were wondering). Possibly the biggest announcement made there this week came from Nvidia, which unveiled its new “Grace Hopper Superchip” meant to power the next wave of AI development (more on that in the item below). But beyond the headlines, conferences like SIGGRAPH are an opportunity for the world’s tech giants (and scrappy start-ups) to show off some of their most impractical, cutting-edge technology — the things you won’t see listed on Amazon anytime soon, but without which we’d all still be playing around with a Commodore 64.

Case in point: Meta demonstrated some far-out, very much not-for-sale VR technology, which the company recently described in a very detailed blog post with some nifty video examples. “Varifocal” and “perspective-correct passthrough” might not mean anything to the non-VR nerd, but they’re intuitive concepts: The former is simply the ability to shift your visual focus between near and far objects as you would in the real world, and the latter the ability to seamlessly view the world around you while still immersed in virtual reality. These are both extremely important problems to solve before any kind of 3D-centric digital future becomes reality: People want to use their eyes like they do in the real world, and they won’t consistently wear a headset that cuts them off from their surroundings. Both are also devilishly hard to get right, technologically.

Experimental prototypes shown off by a Reality Labs research team at SIGGRAPH this week purport to accomplish both. (I wasn’t there, but the video clips they showed off are pretty compelling.) Douglas Lanman, a top researcher at Reality Labs, wrote in the blog post announcing them that these skunkworks-style projects are meant “to consider what might be one day, rather than what needs to be right now.”

That means they also raise serious policy questions that might be addressed “one day.” One is simply how your inventions will affect society. In Meta’s earlier, pre-world-bestriding-conglomerate days, it likely never occurred to Mark Zuckerberg to consult with, say, a team of social science researchers to measure the possible impact of his social media platform on political polarization. Partly as a result of that experience, tech firms are now worrying more, and earlier, about what their devices might do to us. I spoke with Zvika Krieger, a consultant and Meta’s former director of responsible innovation, about the role that experimental projects like these new headsets play in the tech-world ecosystem. He described a delicate dance in which the big tech companies are trying to navigate economic headwinds, corporate competition and the vagaries of academia all at the same time, with the goal of positioning themselves to break new ground and make a lot of money without incurring the wrath (and bad press) that, for example, Facebook has endured in recent years.

Goggles that can capture reality at the astonishing level of detail shown off by Meta this week also have serious policy implications, with an entire field of research popping up around the ethics of biometric tracking and virtual-reality immersion.
Krieger told me that Big Tech’s in-house research projects are already paying very close attention to those social risks — especially as privacy-shattering brain-computer interfaces lurk on the horizon. “It’s not just Elon Musk and Neuralink. Multiple companies are working on this and I’ll let your imagination run wild in terms of all the ethical issues where these computers are being designed to read your brainwaves,” Krieger said. “Companies get that if this technology is ever going to be mainstream, there’s going to have to be super-intentional work done around consumer protections… I have seen some crazy shit.”

Turning that “crazy shit” into something acceptable to consumers and regulators alike is an ongoing process that involves major contributions from academic-led research labs like the ones churning out Meta’s new prototypes. And those prototypes might, in fact, ultimately be as much a part of that process as they are a part of pushing the technology itself forward.

Meta’s prototypes at SIGGRAPH might be a preview of real technology coming someday to your consumer devices. But they might actually be more important simply as a way to launch a public conversation about that technology — so that when some version of it does hit the shelves, the freakout has already been both addressed and priced in.

“A lot of this stuff is all about trial balloons,” Krieger said. “Both in terms of whether the ethics community or the policy community are going to have a freak-out, as well as a trial balloon for the market in terms of whether the stock goes up after we prove these things… a lot of it is about signaling, and not whether this is going to hit the market anytime soon.”