Near plans to build world’s largest 1.4T parameter open-source AI model

Near Protocol unveils plans for the world’s largest open-source AI model, using crowdsourced research and aiming to decentralize AI tech.

Near Protocol has unveiled an ambitious plan to build the world’s largest open-source artificial intelligence model on the opening day of its Redacted conference in Bangkok, Thailand. The 1.4 trillion-parameter model would be roughly 3.5 times bigger than Meta’s largest current open-source Llama model.
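Assuming the comparison is to Llama 3.1 405B, Meta’s largest open-weights model at the time of writing (an assumption, since the article does not name the specific model), a quick back-of-the-envelope check recovers the roughly 3.5x figure:

```python
# Rough parameter-count comparison; 405B (Llama 3.1) is an assumption about
# which Meta model the "3.5 times bigger" figure refers to.
near_params = 1.4e12    # planned Near model: 1.4 trillion parameters
llama_params = 405e9    # Llama 3.1 405B (assumed reference point)

print(f"{near_params / llama_params:.2f}x")  # ~3.46x, i.e. roughly 3.5 times bigger
```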

It will be created through competitive crowdsourced research and development from thousands of contributors on the new Near AI Research hub, with participants able to join the training of a small 500-million-parameter model starting today, Nov. 10.

Near Protocol’s ambitious AI model

The project will grow in size and sophistication across seven models, with only the best contributors making the leap to progressively larger and more complex models. The models will be monetized, with privacy preserved through encrypted Trusted Execution Environments, so that contributors can be rewarded and the models constantly updated as the technology progresses.

Related: Near patches critical bug that could crash every node on the network

The expensive training and compute will be funded with token sales, Near Protocol co-founder Illia Polosukhin told Cointelegraph at the Redacted conference in Bangkok.


Near co-founder Illia Polosukhin. Source: Cointelegraph

“It costs about $160 million, so it’s a lot, obviously, but actually, in crypto, it is raiseable money,” he said. Polosukhin added:

“Then the tokenholders get repaid from all the inferences that happen when this model is used. So we have a business model, we have a way to monetize it, we have a way to raise money, and we have a way to put this in a loop. And so people can actually reinvest back into the next model as well.”
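Near has not published a technical specification for this loop. Purely as an illustration of the flywheel Polosukhin describes, a minimal sketch (with entirely hypothetical names, numbers and splits) might route inference fees pro rata to tokenholders while setting aside a share to fund the next model:

```python
# Hypothetical sketch of the funding loop Polosukhin describes: token sales
# fund training, inference fees repay tokenholders, and part of the revenue is
# reinvested into the next model. None of these names or splits come from
# Near; they are illustrative assumptions only.

def settle_inference_revenue(revenue, holdings, reinvest_rate=0.2):
    """Split inference revenue between tokenholders (pro rata) and a
    reinvestment pool earmarked for training the next model."""
    reinvested = revenue * reinvest_rate
    payable = revenue - reinvested
    total_tokens = sum(holdings.values())
    payouts = {holder: payable * tokens / total_tokens
               for holder, tokens in holdings.items()}
    return payouts, reinvested

# Example: $1,000 of inference fees split across three hypothetical holders.
payouts, next_model_pool = settle_inference_revenue(
    1_000.0, {"alice": 600, "bob": 300, "carol": 100})
print(payouts)          # {'alice': 480.0, 'bob': 240.0, 'carol': 80.0}
print(next_model_pool)  # 200.0 set aside for the next model
```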

Near is one of the few crypto projects with the ability to realize such an ambitious undertaking: Polosukhin was one of the authors of the groundbreaking transformer research paper that led to ChatGPT, and co-founder Alex Skidanov worked at OpenAI in the lead-up to the era-defining model’s release in late 2022.

Skidanov, who now heads up Near AI, conceded that it’s a massive undertaking with a big hurdle to overcome. 

Decentralized AI tackles privacy issues

To train such a large model, the project would need “tens of thousands of GPUs in one place,” which wouldn’t be ideal, Skidanov said. But to use a decentralized network of compute, “you’d need a new technology that doesn’t exist today because all the distributed training techniques we have require very fast interconnect.” However, he added that emerging research from DeepMind suggests it is possible.
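Skidanov did not name the research, but DeepMind work such as DiLoCo explores low-communication training in which each worker takes many local optimizer steps and only occasionally synchronizes, reducing how often a fast interconnect is needed. The toy sketch below illustrates that general idea on a synthetic regression problem; it is not Near’s or DeepMind’s actual training stack.

```python
# Toy illustration of low-communication distributed training: each "worker"
# runs many local SGD steps on its own data shard and only periodically
# synchronizes by averaging parameters, so bandwidth is needed far less often.
# Generic local-SGD sketch for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear-regression problem, sharded across 4 workers.
true_w = rng.normal(size=8)
shards = []
for _ in range(4):
    X = rng.normal(size=(256, 8))
    y = X @ true_w + 0.01 * rng.normal(size=256)
    shards.append((X, y))

w_global = np.zeros(8)
local_steps, rounds, lr = 50, 10, 0.01

for r in range(rounds):
    local_models = []
    for X, y in shards:                       # runs on each worker independently
        w = w_global.copy()
        for _ in range(local_steps):          # many cheap local updates...
            idx = rng.integers(0, len(X), size=32)
            grad = X[idx].T @ (X[idx] @ w - y[idx]) / len(idx)
            w -= lr * grad
        local_models.append(w)
    w_global = np.mean(local_models, axis=0)  # ...then one sync per round
    err = np.linalg.norm(w_global - true_w)
    print(f"round {r + 1}: distance to true weights = {err:.4f}")
```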


Near co-founder Alex Skidanov. Source: Cointelegraph

Polosukhin said he hasn’t spoken with existing projects like the Artificial Superintelligence Alliance but would be happy to see if there are synergies. Whatever happens, he said decentralized AI technology must win for all our sakes. Conference guest speaker Edward Snowden rammed the point home by painting a scary portrait of centralized AI turning the world into a giant surveillance state.

“This is probably the most important technology right now and probably in the future. And the reality is, if AI is controlled by one company, we effectively are going to do whatever that company says,” he explained. Snowden added:

“If all AI and effectively, all of the economy is being done by one company, there’s no decentralization at that point. So it is effectively like the only way that Web3 is still relevant, philosophically, if we have AI that also follows the same principles.”

Magazine: Asian crypto traders profit from Trump’s win, China’s 2025 CBDC deadline: Asia Express