America’s IT networks need to be updated, badly — just ask one of the thousands of travelers grounded last Wednesday after a Federal Aviation Administration computer failure.

When I met Jordan Shapiro at CES last week, government modernization was very much on her mind. She’s an economic and data analyst at the center-left Progressive Policy Institute, and as she sees it, the government has a crucial responsibility not just to maintain antiquated systems like the FAA’s, but also to keep pace with fast-moving new areas like AI and data privacy so it can serve citizens better day to day.

With that in mind, I called her today to talk about the roadblocks that keep the government from modernizing its own technology — and more broadly, how to balance what she sees as the government’s duty to create a fair playing field with the necessity of allowing for free-market competition.

An edited and condensed version of our conversation follows.

Where do things stand now, when it comes to the way the government tries to keep its systems modernized and accessible?

It can be very difficult to navigate government services, and government data systems maybe aren't even capturing the information that we need in order to improve those systems. The government has hundreds of agencies, all of which have their own processes and technologies at varying levels of being in or out of date. As far as I’m aware, there's not quite enough cohesive planning to move that process forward and create an easier system. The way government makes it easier for individuals to access its products and services is one of the main things I would like to see improve.

What are the biggest points of friction right now when it comes to improving government IT for citizens?

I’m looking at things that are “whole-of-government” opportunities to improve. One of those is privacy. American citizens should know that their data isn’t being exploited. We have things like the Privacy Act of 1974, which protects citizens’ government data — is that law robust enough to deal with the multifaceted data that we are now collecting about people?

Another is, how can we create a better system for users to verify their information with government service providers? In President Biden's executive order on improving government customer service, he talks about the “time tax” — if you apply for benefits with the Social Security Administration, you fill out forms, then you fill out similar forms to get Medicare and Medicaid, and with Health and Human Services, and it goes on and on. We have these antiquated identity systems to prove who we are to various services, and we saw during Covid that this created a lot of fraud. I’m a big advocate for digital identity services, and that overlaps with privacy.

How much does modernizing these systems overlap with protecting people’s fundamental rights as citizens?

These systems are more and more essential parts of what it means to be a democratic citizen, and we want to make sure that they’re accountable to us — not that we’re accountable to them. Still, at PPI we believe in light-touch regulation for innovation, and that over-regulating innovation and technology can stymie its ability to flourish. For example, the Biden administration put out a blueprint for the use of AI, and I think that is a form of holding technology accountable to us. Although it's not binding, it provides a sense of how we have democratic values embedded in how we want the technology to be used. I would like to see more work like that.
What are the biggest roadblocks right now to that work?

We’re seeing major tech regulation happen around the world, and it’s still unclear how those regulations are going to interact with new technologies. So in my work, one of the biggest barriers is that I don’t want to be overly prescriptive about what I suggest regulators do, because it’s new territory. We don't know how that is then going to interact with a product, or with future innovation in this space. One potential solution is to create more opportunities for sandboxing, and trial and error, and iteration [outside the regulatory framework].

What are the biggest risks of failing at that? What does the world look like where government doesn’t do this?

There are different incentives at play in society — for a lot of companies, optimization is more about making the best product, making money, generating brand loyalty. Safety, privacy, and other concerns might not be the primary consideration. That is the role of government: to say, we need to make sure that there are guardrails in place so that these technologies aren’t exploiting people. For me, the government is there to make sure that there is a pro-social, pro-consumer optimization and set of incentives when new technologies are being developed.