Industry figures are divided on a contentious California artificial intelligence bill that passed on Aug. 28.
The legislation would compel AI firms to implement new safety protocols, including an “emergency stop” capability for AI models.
The Safe and Secure Innovation for Frontier Artificial Intelligence Models Act (SB 1047) passed the California Senate 29–9 on Aug. 28.
The bill now goes to Governor Gavin Newsom’s desk for signature.
Elon Musk was among those who expressed support for the bill. On X, he said it was a “tough call” but favored the legislation due to the “potential risk” of AI.
Not all tech figures are similarly persuaded, however. OpenAI chief strategy officer Jason Kwon is among those who have criticized the legislation.
Calanthia Mei, co-founder of the decentralized AI network Masa, said she was not in favor of the new rules, suggesting they were the result of an undue rush to legislate.
“Premature regulations like this will not only drive talent out of California; it will drive talent out of America,” Mei told Cointelegraph. She added:
“The risk sits in the likely possibility that America’s current and proposed regulatory frameworks cap the growth of the AI industry.”
Raheel Govindji, the CEO of the decentralized AI project DecideAI, took a contrasting view.
“We are in favor of legislation,” Govindji told Cointelegraph.
Govindji said DecideAI supports a killswitch controlled by a decentralized autonomous organization (DAO), the project’s proposed way to democratize and decentralize an emergency stop.
AI is a fast-moving industry
The fast-moving nature of the AI industry has stoked fears about its unfettered development.
In an Aug. 22 letter, former staff and whistleblowers at OpenAI warned, “Developing frontier AI models without adequate safety precautions poses foreseeable risks of catastrophic harm to the public.”
But to others, the rapid pace of AI development is something to be celebrated rather than feared.
“In contrast to other transformative technologies, the speed of AI innovation is unparalleled. Builders are shipping new products, features and applications every day,” Mei said.
“We as builders don’t even know where the ceiling of AI is; how would the government know the ceiling of AI? Setting limits for high-potential technologies is unwise.”
Mei warned that the legislation’s ultimate cost would be to “drive talent out of the US” as it “did to crypto.”
Those in favor
Govindji proposes that a “DAO-controlled killswitch” could support the requirements of the legislation while still retaining “collective and transparent decision-making.”
The bill states that an AI model should be able to “promptly enact a full shutdown” but does not define “promptly,” leaving considerable room for interpretation.
For now, it is unknown whether a DAO model and its democratic voting system would be prompt enough to satisfy legislators. Govindji is confident it would be: according to him, DecideAI “will be ahead of the curve in providing AI which is a social good.”
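DecideAI has not published the mechanics of such a killswitch, so the following is only a rough Python sketch of the general idea: token-weighted votes accumulate toward a quorum, and the shutdown fires only once a proposal passes within a fixed voting window. The class, the function names and the six-hour window are illustrative assumptions, not DecideAI’s design.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone


@dataclass
class KillswitchProposal:
    """A hypothetical shutdown proposal decided by token-weighted DAO voting."""
    votes_for: int = 0
    votes_against: int = 0
    opened_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    voting_window: timedelta = timedelta(hours=6)  # assumed window; the bill never defines "promptly"

    def vote(self, weight: int, in_favor: bool) -> None:
        """Record a token-weighted vote while the window is still open."""
        if datetime.now(timezone.utc) - self.opened_at > self.voting_window:
            raise RuntimeError("Voting window closed; shutdown decision must be escalated")
        if in_favor:
            self.votes_for += weight
        else:
            self.votes_against += weight

    def passes(self, quorum: int, threshold: float = 0.5) -> bool:
        """True once quorum is met and the share of 'for' votes exceeds the threshold."""
        total = self.votes_for + self.votes_against
        return total >= quorum and self.votes_for / total > threshold


def enact_full_shutdown() -> None:
    """Placeholder for halting training and inference of the covered model."""
    print("Full shutdown enacted")


# Usage: tokenholders vote, and the killswitch fires only if the proposal passes.
proposal = KillswitchProposal()
proposal.vote(weight=400, in_favor=True)
proposal.vote(weight=150, in_favor=False)
if proposal.passes(quorum=500):
    enact_full_shutdown()
```

Whether a vote like this could complete quickly enough to count as a “prompt” shutdown is exactly the open question the legislation leaves unanswered.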
AI firm Anthropic has also publicly supported the bill.
In an open letter to Governor Newsom, Anthropic CEO Dario Amodei said, “AI systems are advancing in capabilities extremely quickly, which offers both great promise for California’s economy and substantial risk [...] We believe SB 1047, particularly after recent amendments, likely presents a feasible compliance burden for companies like ours, in light of the importance of averting catastrophic misuse.”
An earlier version of the bill included criminal penalties for companies that failed to comply. After consultation with the industry, this provision was watered down to civil penalties only.
Bill SB 1047
SB 1047 will apply only to “covered models,” and the definition of which models are covered will shift over time.
At implementation, a covered model will be an AI that costs over $100 million to develop or is “trained using a quantity of computing power greater than 10^26 integer or floating-point operations.”
California’s Government Operations Agency will determine any changes to the computing power threshold.
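Taken at face value, the article’s either/or reading of those initial thresholds reduces to a simple check. The snippet below is an illustrative Python sketch, not language from the bill; the function name and parameters are assumptions for the example.

```python
# Hypothetical check of the "covered model" triggers described above.
# The figures mirror the bill's initial thresholds; because the compute
# threshold can be adjusted later, both values are kept as named constants.

COST_THRESHOLD_USD = 100_000_000   # development cost trigger
COMPUTE_THRESHOLD_OPS = 10**26     # integer or floating-point operations used in training


def is_covered_model(training_cost_usd: float, training_compute_ops: float) -> bool:
    """Return True if a model meets either trigger as the article describes them."""
    return (
        training_cost_usd > COST_THRESHOLD_USD
        or training_compute_ops > COMPUTE_THRESHOLD_OPS
    )


# Example: a model trained with 3e26 operations is covered even if it cost under $100 million.
print(is_covered_model(training_cost_usd=80_000_000, training_compute_ops=3e26))  # True
```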
In a letter to Governor Newsom, OpenAI’s Kwon argued that AI regulation should be handled only at the federal level “rather than a patchwork of state laws.”
Given the overwhelming concentration of tech and AI firms in California, SB 1047 could serve as de facto national legislation for now.
The situation could change should the legislation cause AI firms to flee to other states, but to avoid SB 1047 entirely, the firms would also need to cease all operations and services in California.