The rise of generative AI has raised questions for policymakers about how to govern these powerful new technologies. How will we validate the authenticity of content in a world where virtually any media can be synthetically generated? How will we prove a person’s identity now that machines have passed the Turing Test to demonstrate something resembling human intelligence? And, from a financial perspective, as AI agents come online, will these systems be able to transact on our behalf, or will they be entirely shut out from the financial system?
Each of these is a technical problem, which means it will require a technical, in addition to a policy, solution. Fortunately, some of those solutions may lie in cryptocurrency and blockchain. By embracing these technologies, Congress and other federal policymakers can help guide AI toward beneficial ends.
AI-generated deepfakes are a top-of-mind concern for almost all elected officials. This is understandable: deepfakes are not only a policy question; officials themselves have a great deal to lose from even one sufficiently convincing, well-timed deepfake. In January, for instance, a political consultant used an AI-generated recording of President Biden's voice to discourage would-be voters from going to the New Hampshire primary polls.
Deepfakes can be exploited in more than one way. Even more recently, White House Press Secretary Karine Jean-Pierre insinuated that embarrassing videos of President Biden were deepfakes, even though they were authentic (though some were deceptively cropped). In other words, deepfakes can be used to impersonate politicians, and the prospect of deepfakes can be used to dismiss inconvenient, but legitimate, media.
A group called the Coalition for Content Provenance and Authenticity (C2PA) is developing a technical standard to address this problem, but it has proven flawed in practice: authenticity metadata can often be easily changed during photo editing. Blockchain technologies, with their immutable ledgers, look like the better option. Camera makers could adopt them to record a verifiable, pixel-for-pixel fingerprint of every photograph their products take, so that any later alteration would be immediately detectable. The Numbers Protocol is one among several examples of new approaches that push in this direction.
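As a rough illustration of how such provenance anchoring could work (the function names and record format below are hypothetical, not how Numbers Protocol or C2PA actually implement this), a capture device could hash the raw image bytes at the moment of capture and publish that fingerprint to an immutable ledger; any later edit would then produce a mismatching hash:

```python
import hashlib
import json
import time


def fingerprint_image(image_bytes: bytes) -> str:
    """Compute a SHA-256 digest of the raw image bytes (pixel-for-pixel)."""
    return hashlib.sha256(image_bytes).hexdigest()


def build_provenance_record(image_bytes: bytes, device_id: str) -> dict:
    """Bundle the fingerprint with capture metadata. In a real system, this
    record (or its hash) would be written to a blockchain at capture time."""
    return {
        "sha256": fingerprint_image(image_bytes),
        "device_id": device_id,
        "captured_at": int(time.time()),
    }


def verify_image(image_bytes: bytes, ledger_record: dict) -> bool:
    """Re-hash the image and compare against the anchored record.
    Changing even a single pixel yields a different digest."""
    return fingerprint_image(image_bytes) == ledger_record["sha256"]


if __name__ == "__main__":
    original = b"\x89PNG...raw bytes straight from the camera sensor..."
    record = build_provenance_record(original, device_id="camera-1234")
    print(json.dumps(record, indent=2))

    edited = original + b"tampered"
    print(verify_image(original, record))  # True: matches the anchored fingerprint
    print(verify_image(edited, record))    # False: any edit is immediately detectable
```

The key design point is that the fingerprint, not the image itself, goes on-chain, so verification is cheap and the photo itself never needs to be published.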
Large language models (LLMs) like ChatGPT can also be used to mimic a person's writing style and impersonate them. An LLM trained for this kind of impersonation might not fool a person's family, but it may be convincing enough to fool a coworker, an acquaintance, or a customer service representative at a bank. Even before the proliferation of language models, online identity validation was a major issue: beyond its role in preventing common cybercrimes like identity theft, many policymakers have sought to require age verification for social media platforms.
Once again, the crypto industry may hold a convenient answer. Digital identity infrastructure based on biometric authentication and zero-knowledge proofs — a mathematical tool that has been well-tested in the crypto industry — could allow any website to use a robust, yet privacy-preserving, means of identity validation.
Users could verify only the information they need to effect a particular transaction, such as their age, rather than having to turn over, and potentially expose, all of the personal information contained on a document like a driver's license. An identity infrastructure of this kind could be created by private firms (Worldcoin (WLD), a project backed by OpenAI CEO Sam Altman, is one example), by a nonprofit standards body, or by the government itself (similar to digital public infrastructure created by countries like Estonia).
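To make the idea of selective disclosure concrete, here is a deliberately simplified commit-and-reveal sketch (not a true zero-knowledge proof, and every name in it is hypothetical): an issuer commits to each credential field with a salted hash, and the user later reveals only the single field a website needs, which the site checks against the commitment without ever seeing the rest of the document:

```python
import hashlib
import os


def commit(value: str, salt: bytes) -> str:
    """Bind a single credential field to a salted SHA-256 commitment."""
    return hashlib.sha256(salt + value.encode()).hexdigest()


# --- Issuer (e.g., a DMV): publish commitments, one per field ---
salt_birth_year = os.urandom(16)
credential_commitments = {
    "birth_year": commit("1990", salt_birth_year),
    # name, address, license number, etc. remain committed but undisclosed
}

# --- User: disclose only the field the website actually needs ---
disclosure = {"field": "birth_year", "value": "1990", "salt": salt_birth_year}


# --- Verifier (the website): check the disclosed field against the commitment ---
def verify_disclosure(disclosure: dict, commitments: dict) -> bool:
    expected = commitments[disclosure["field"]]
    return commit(disclosure["value"], disclosure["salt"]) == expected


assert verify_disclosure(disclosure, credential_commitments)
print("Age field verified; no other personal data was revealed.")
```

A production system built on zero-knowledge proofs could go further and prove a statement such as "over 18" without revealing the birth year at all; the sketch above only shows the narrower principle of revealing one field instead of the whole document.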
Finally, and more speculatively, the rise of AI agents presents novel challenges for internet governance. Many AI observers believe that models with significant agentic capabilities, meaning the ability to take complex actions on a human's behalf, will reach the market soon. Software engineering agents like Cognition AI's "Devin" have already been demonstrated. If the last decade of AI progress has shown us anything, it's that we should expect the technology to develop quickly from here.
These agents could, in the fullness of time, become trusted "advisors" to their human users. They may interact with other agents and other humans on their users' behalf, and those interactions may eventually include financial transactions. Yet few users, and even fewer banks and financial regulators (if any), will allow these agents to access the traditional, dollar-based banking system.
Cryptocurrencies themselves, and especially stablecoins, can play a role here. Because stablecoins live on public blockchains, software agents can hold and transfer them programmatically without a bank account; and because stablecoins backed by dollar assets maintain one-to-one parity with USD, they would be instantly familiar to the human users on both sides of a financial transaction.
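As a purely illustrative sketch (the spending limits, intent format, and every name below are hypothetical, not any existing wallet's API), an agent could be given a dollar-denominated spending policy set by its human user and convert approved amounts into the integer base units in which stablecoin transfers are actually denominated:

```python
from dataclasses import dataclass

USDC_DECIMALS = 6  # many dollar-backed stablecoins use 6 decimal places


def usd_to_base_units(amount_usd: float, decimals: int = USDC_DECIMALS) -> int:
    """Convert a human-readable dollar amount into integer token units."""
    return round(amount_usd * 10 ** decimals)


@dataclass
class SpendingPolicy:
    """Limits the human delegates to the agent."""
    per_tx_limit_usd: float
    daily_limit_usd: float
    spent_today_usd: float = 0.0

    def authorize(self, amount_usd: float) -> bool:
        ok = (amount_usd <= self.per_tx_limit_usd
              and self.spent_today_usd + amount_usd <= self.daily_limit_usd)
        if ok:
            self.spent_today_usd += amount_usd
        return ok


def agent_payment(policy: SpendingPolicy, recipient: str, amount_usd: float) -> dict | None:
    """Build a payment intent the agent's wallet could turn into an on-chain transfer."""
    if not policy.authorize(amount_usd):
        return None  # outside the limits the human delegated
    return {
        "to": recipient,
        "token": "USD-stablecoin",
        "amount_base_units": usd_to_base_units(amount_usd),
    }


policy = SpendingPolicy(per_tx_limit_usd=50, daily_limit_usd=200)
print(agent_payment(policy, "0xRecipientPlaceholder", 19.99))   # authorized
print(agent_payment(policy, "0xRecipientPlaceholder", 500.00))  # None: over the limit
```

The point of the sketch is the delegation model: the human sets dollar-denominated limits, and the agent transacts within them using a token that both machines and people can reason about as dollars.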
This is just one potential option that may or may not come to fruition. But well-regulated and widely adopted stablecoins are a good idea for other reasons, as former House Speaker Paul Ryan recently articulated in the Wall Street Journal. Stablecoins present many opportunities for financial innovation, but their potential benefits to AI are still underrated.
In their own ways, each of these questions echoes longstanding challenges in the governance of the internet—the source of both high-quality information and so-called "fake news." Online identity theft has been common for decades. And while cryptocurrencies are now a mature technology, they still struggle to interact with a regulatory and financial system that was not designed with them — or the internet at all — in mind. By pursuing these policy and technical solutions, we can build a digital ecosystem ready for the technologies of the next Industrial Revolution, driven by AI.
This article is for general information purposes and is not intended to be and should not be taken as legal or investment advice. The views, thoughts, and opinions expressed here are the author’s alone and do not necessarily reflect or represent the views and opinions of Cointelegraph.