SambaNova Systems has announced a new artificial intelligence chip capable of running models with up to 5 trillion parameters.
The SN40L is designed for both training and inference, but is available only through the company's wider platform, the SambaNova Suite, a combined hardware and software offering.
Manufactured by TSMC, the SN40L supports sequence lengths of more than 256k tokens on a single node.
“Today, SambaNova offers the only purpose-built full stack [large language model] platform - the SambaNova Suite - now with an intelligent AI chip; it's a game changer for the Global 2000,” Rodrigo Liang, co-founder and CEO of SambaNova Systems, said.
“We’re now able to offer these two capabilities within one chip – the ability to address more memory, with the smartest compute core – enabling organizations to capitalize on the promise of pervasive AI, with their own LLMs to rival GPT4 and beyond.”
SambaNova pitches its Suite as an on-prem or cloud-based offering to allow companies to train their own generative AI models based on popular foundation models. In June, consulting giant Accenture deployed a three-rack Suite at its data center for generative AI work.
“We’ve started to see a trend towards smaller models, but bigger is still better and bigger models will start to become more modular,” Kunle Olukotun, co-founder of SambaNova Systems, said.
“Customers are requesting an LLM with the power of a trillion-parameter model like GPT-4, but they also want the benefits of owning a model fine-tuned on their data. With the new SN40L, our most advanced AI chip to date, integrated into a full stack LLM platform, we’re giving customers the key to running the largest LLMs with higher performance for training and inference, without sacrificing model accuracy.”
Founded in 2017, Palo Alto-based SambaNova has raised more than $1 billion from investors including Google Ventures, Intel, SoftBank, and Singaporean sovereign wealth fund GIC. Co-founder Christopher Ré previously founded data company Lattice, which was acquired by Apple in 2017.