Liquid AI

About Liquid AI

Liquid AI develops non-transformer generative AI models, known as Liquid Foundation Models (LFMs), which require less memory while maintaining high performance across various applications. These models address the inefficiencies and high computational costs associated with traditional large language models, enabling scalable AI solutions for diverse industries.


What does Liquid AI do?

Liquid AI builds Liquid Foundation Models (LFMs), non-transformer generative AI models that deliver high performance with a smaller memory footprint than traditional large language models, enabling scalable, cost-efficient AI solutions across industries.

Where is Liquid AI located?

Liquid AI is based in Cambridge, Massachusetts, United States.

When was Liquid AI founded?

Liquid AI was founded in 2023.

How much funding has Liquid AI raised?

Liquid AI has raised $250 million.

Who founded Liquid AI?

Liquid AI was co-founded by Ramin Hasani, who serves as CEO.

  • Ramin Hasani - CEO

Location: Cambridge, Massachusetts, United States
Founded: 2023
Funding: $250 million
Employees: 59
Major Investors: AMD Ventures

Liquid AI

Score: 100/100
AI-Generated Company Overview (experimental) – could contain errors

Executive Summary

Liquid AI develops non-transformer generative AI models, known as Liquid Foundation Models (LFMs), which require less memory while maintaining high performance across various applications. These models address the inefficiencies and high computational costs associated with traditional large language models, enabling scalable AI solutions for diverse industries.

liquid.ai · Crunchbase
Founded 2023 · Cambridge, Massachusetts, United States

Funding

Estimated Funding

$250M+

Major Investors

AMD Ventures

Team (50+)

Ramin Hasani

CEO

Company Description

Problem

Traditional large language models (LLMs) demand significant computational resources and memory, hindering their deployment in edge computing environments and increasing operational costs. This limits the accessibility and scalability of AI solutions across various industries.

Solution

Liquid AI develops Liquid Foundation Models (LFMs), a series of non-transformer generative AI models designed to deliver high performance with a significantly smaller memory footprint compared to traditional LLMs. These models enable the development and deployment of efficient AI solutions at various scales, making AI more accessible and cost-effective. By reducing the computational burden, Liquid AI facilitates the integration of advanced AI capabilities into resource-constrained environments.
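
To make the memory-footprint comparison concrete, here is a minimal back-of-envelope sketch in Python. It counts only weight storage (ignoring activations, KV caches, and runtime overhead), and the parameter counts and precisions are illustrative assumptions, not Liquid AI's published model sizes.

```python
# Illustrative estimate of the memory needed just to hold model weights.
# The model sizes and precisions below are hypothetical examples, not
# figures published by Liquid AI.

def weight_memory_gib(num_params: float, bytes_per_param: float) -> float:
    """Approximate weight memory in GiB: parameters x bytes per parameter."""
    return num_params * bytes_per_param / (1024 ** 3)

if __name__ == "__main__":
    # A 70B-parameter transformer in 16-bit precision vs. a hypothetical
    # compact 3B-parameter model quantized to 8-bit precision.
    large = weight_memory_gib(70e9, 2)  # ~130 GiB: multi-GPU server territory
    small = weight_memory_gib(3e9, 1)   # ~2.8 GiB: fits on phones and edge boards
    print(f"70B @ fp16: {large:6.1f} GiB")
    print(f" 3B @ int8: {small:6.1f} GiB")
```

Real deployments also need memory for activations, the KV cache, and framework overhead, so actual footprints are larger; the sketch is only meant to show the order-of-magnitude gap that makes smaller-footprint models practical on resource-constrained devices.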

Features

Non-transformer architecture for reduced memory requirements

High performance across a range of generative AI tasks

Scalable design suitable for both cloud and edge deployment

Liquid Engine platform for developing and deploying LFMs

Target Audience

The primary target audience includes enterprises and developers seeking efficient and scalable AI solutions, particularly those working with edge computing and resource-constrained environments.
