As artificial intelligence shifts toward more efficient solutions, small language models (SLMs) are becoming a scalable and cost-effective alternative to the large, generalized models favored by Big Tech, particularly for task-specific applications.
Assisterr, a Cambridge-based network of SLMs, is developing a decentralized AI ecosystem where communities can own, manage and improve their AI models while participating in governance and reward systems through blockchain technology.
In this interview, Assisterr CEO Nick Havryliak discusses the advantages of SLMs, the potential of decentralized AI and how blockchain is empowering communities to shape the future of AI development.
Cointelegraph: What inspired the creation of SLMs, and how do they differ from the large AI models developed by Big Tech companies?
Nick Havryliak: My experience in tech consulting, particularly in managing R&D teams and developing machine learning (ML) and Internet of Things (IoT) solutions, highlighted a major limitation in ML: data set size and quality. That challenge led me to explore how Web3 incentives could be leveraged to aggregate high-quality, niche-specific data. This idea laid the foundation for SLMs, which address the inefficiencies of larger, general-purpose AI models.
We’ve all heard about large language models’ (LLMs’) hallucinations and high costs. That is largely because Big Tech chases broader adoption by building ever more generalized models. SLMs offer a different route: models that are highly specialized within narrower domains.
CT: SLMs promise to be more efficient, scalable and accessible. Could you explain how this efficiency translates into practical advantages for developers and users?
NH: SLMs are more resource-efficient, cost-effective and easier to deploy. They excel in task-specific optimization and low-resource environments, making them ideal for applications where speed and cost are critical.
Besides this, for most daily tasks and AI automation, you don’t need “an AI with absolute knowledge” if the result can be obtained with a simpler solution. Just as we don’t use rockets to kill mosquitoes, we don’t need to wait for some magical artificial general intelligence (AGI) to solve real-world problems across a variety of use cases. SLMs are here to empower such solutions now, not in the future.
CT: What are some real-world applications or industries that you believe will benefit most from the use of SLMs?
NH: SLMs can be integrated into any application, agentic framework or hardware we are using on a daily basis, including so-called edge devices such as mobile phones and laptops.
While it is still hard to estimate the full range of use cases, we believe that within the next decade, the SLM market and the number of SLMs will surpass mobile applications, a $255 billion market with over 5 million apps. SLMs will lower the barrier to entry by sitting very close to the end user’s domain while remaining highly cost-efficient.
CT: Assisterr emphasizes community ownership of AI models. How does your platform empower communities to create, manage and improve their own AI models?
NH: Through Assisterr’s tools, users set up smart contract-based treasuries that manage the models’ operations, revenue and governance via tokenized voting systems. Contributors and validators can collaborate by enhancing models, with rewards distributed based on their efforts.
CT: Blockchain plays a central role in your model. Could you walk us through how tokenization and blockchain technology enable decentralized AI governance and incentivize contributions?
NH: Each SLM operates within a MiniDAO — a treasury-based co-ownership model. The treasury is governed through voting by co-owners using Management Tokens (MTs). Key features of the MiniDAO include:
- Flexible voting structures for model and treasury management.
- Task lists for contributors and validators.
- Preset configurations for launching SLMs quickly and efficiently.
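To make the treasury-based co-ownership idea more concrete, here is a minimal, purely illustrative Python sketch of token-weighted voting and effort-based reward distribution. All names (MiniDAO, Proposal, management_tokens, the quorum rule) are assumptions made for illustration and do not reflect Assisterr’s actual smart contracts or APIs.

```python
# Hypothetical sketch of a MiniDAO: token-weighted voting over a shared
# treasury plus effort-based reward distribution. Names and rules are
# illustrative assumptions, not Assisterr's on-chain implementation.
from dataclasses import dataclass


@dataclass
class Proposal:
    description: str
    votes_for: float = 0.0
    votes_against: float = 0.0


@dataclass
class MiniDAO:
    treasury: float                      # pooled revenue of the SLM
    management_tokens: dict[str, float]  # co-owner -> MT balance
    quorum: float = 0.5                  # fraction of total MTs needed to pass

    def vote(self, proposal: Proposal, owner: str, approve: bool) -> None:
        """Each co-owner votes with weight equal to their MT balance."""
        weight = self.management_tokens.get(owner, 0.0)
        if approve:
            proposal.votes_for += weight
        else:
            proposal.votes_against += weight

    def passes(self, proposal: Proposal) -> bool:
        """A proposal passes when approving weight exceeds the quorum."""
        total = sum(self.management_tokens.values())
        return proposal.votes_for > self.quorum * total

    def distribute_rewards(self, effort: dict[str, float]) -> dict[str, float]:
        """Split the treasury among contributors/validators pro rata to effort."""
        total_effort = sum(effort.values()) or 1.0
        payouts = {who: self.treasury * share / total_effort
                   for who, share in effort.items()}
        self.treasury = 0.0
        return payouts


# Example usage: two co-owners vote on a model-improvement proposal,
# then rewards are split between a contributor and a validator.
dao = MiniDAO(treasury=1000.0,
              management_tokens={"alice": 60.0, "bob": 40.0})
proposal = Proposal("Fine-tune the SLM on new community data")
dao.vote(proposal, "alice", approve=True)
dao.vote(proposal, "bob", approve=False)
if dao.passes(proposal):
    print(dao.distribute_rewards({"contributor_1": 3.0, "validator_1": 1.0}))
```

The sketch only captures the shape of the described mechanism: voting weight tied to Management Token holdings and rewards distributed in proportion to contribution; the real system would live in smart contracts rather than application code.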
CT: How does Assisterr ensure that the community-driven development of SLMs maintains high standards of quality, security and ethical AI practices?
NH: While we provide all the tools to build SLMs securely and use state-of-the-art approaches, quality still heavily depends on many execution factors. That is why we want to make the process as accessible as possible, driving healthy competition among best-in-class SLMs and their communities!
CT: Do you foresee any challenges in transitioning from traditional centralized AI models to a decentralized AI ecosystem? If so, how is Assisterr addressing these obstacles?
NH: We see the main challenge as the difficulty non-technical people face in building real-world AI use cases: how to choose a problem, how to tell whether it’s too complex for AI and how to define instructions for the model to follow.
We want to transfer that knowledge from technical teams to everyone else, making the process smooth with our AI assistant. Rather than focusing only on the developer community, we are making AI creation accessible to creators, data owners and people from a variety of backgrounds.
CT: Edge devices like smartphones and laptops are increasingly powerful. How are SLMs poised to take advantage of this trend to deliver smarter, more cost-efficient AI applications?
NH: Smartphones and various wearable devices are the main initial drivers of SLM development. Such devices have tight computing constraints, and currently it’s simply impossible to run LLMs on them. At the same time, SLMs are a key force behind new gadget development because they unlock new, smarter ways of using these devices, from Meta glasses to wearable companions or assistants that are always within reach.
CT: Looking ahead, how do you envision the future of AI in consumer electronics as SLMs become more widespread? Could they reshape how we interact with everyday technology?
NH: SLMs will definitely reshape our lives, not only how we interact with technology. We believe that in the near future, every decision could be made by a specific SLM. Each use case will have its own small model, making it extremely easy to find a helpful “AI assistant friend” for any need. That’s where we see the power of the network effect: the more SLMs there are, the stronger and more valuable the network becomes for the end user.
Disclaimer. Cointelegraph does not endorse any content or product on this page. While we aim to provide you with all the important information we could obtain in this sponsored article, readers should do their own research before taking any action related to the company and carry full responsibility for their decisions. This article cannot be considered investment advice.