Ex-Google researchers establish an innovative bio-inspired AI startup

August 19, 2023


Two ex-Google researchers have founded an intriguing new AI research lab to build ensembles of smaller AI models that work together.

The research lab, Sakana, named after the Japanese word for “fish,” envisions creating multiple compact AI models that collaborate harmoniously, much like natural systems in which individuals respond only to local information yet act as part of a whole.

Examples include shoals of fish, flocks of birds, and swarms of insects, which move independently but cohesively.

Starlings form cohesive “murmurations” which are mathematically complex. Source: Shutterstock.

The duo, David Ha and Llion Jones, believe a “swarm” of models could achieve more than the conventional approach of pouring all training effort into a single colossal model.

Elaborating on the philosophy, Ha commented, “Ants move around and dynamically form a bridge by themselves, which might not be the strongest bridge, but they can do it right away and adapt to the environments.” 

He further emphasized, “I think this sort of adaptation is one of the very powerful concepts that we see in natural algorithms.”

Both Ha and Jones are considered distinctive researchers in the field, with Jones being a co-author of the 2017 paper “Attention Is All You Need,” which introduced the transformer architecture.

Meanwhile, Ha previously headed research at Stability AI and, before that, led generative AI research at Google Brain in Japan.

Though still nascent, Sakana has grand plans, with a Tokyo office in the pipeline, as confirmed by Ha. The firm remains tight-lipped about its financial backing.

Sakana’s approach to AI development builds on existing research. Ha and former Google colleague Yujin Tang co-developed an approach described in the paper “The Sensory Neuron as a Transformer,” advocating the collaborative power of many small AI models over a single, massive one.
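To make the idea concrete, here is a minimal, generic sketch of many small models combining local decisions into a collective output via majority voting. This is an illustration of the ensemble principle only, not Sakana's actual method; the threshold "models" and function names are invented for the example.

```python
from collections import Counter

def make_threshold_model(threshold):
    """A tiny 'model': classifies a number as 1 if it exceeds its threshold."""
    return lambda x: 1 if x > threshold else 0

def ensemble_predict(models, x):
    """Combine each small model's local decision by majority vote."""
    votes = Counter(model(x) for model in models)
    return votes.most_common(1)[0][0]

# Three small models, each with a different local "view" (threshold)
models = [make_threshold_model(t) for t in (0.2, 0.5, 0.8)]

print(ensemble_predict(models, 0.6))  # two of three vote 1 -> 1
print(ensemble_predict(models, 0.1))  # all three vote 0 -> 0
```

No single model here is authoritative; the collective answer emerges from many simple, local judgments, loosely mirroring the swarm behavior the founders describe.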

Interest in bio-inspired AI is rising as researchers begin to decode how biological systems sustain incredible abilities with low power demands. 

Biological brains use a minuscule fraction of the power required by even lightweight AI models. IBM recently unveiled a brain-inspired analog AI chip capable of robust performance at lower power.

Jones observed, “The human brain still works better than our best AI,” continuing, “So, clearly, the human brain is doing something right that we haven’t quite caught on to yet.”

Leaving big tech research teams to form innovative startups is a well-trodden path, and Jones candidly remarked on his experience outside Google: “It’s unfortunately true to say I have so much more velocity outside of Google.” When Ha proposed the startup, he concurred: “It just made a lot of sense to me.”


Sam Jeans

Sam is a science and technology writer who has worked in various AI startups. When he’s not writing, he can be found reading medical journals or digging through boxes of vinyl records.

