
App development using Meta's Llama AI models has become easier
What's the story
Meta has launched the Llama API at its first-ever LlamaCon AI developer conference.
The new application programming interface (API) will let developers build and experiment with products powered by different Llama models.
The limited preview of the Llama API is compatible with Meta's Software Development Kits (SDKs), enabling developers to create services, tools, and applications driven by these models.
Competitive edge
A strategic move by Meta
The launch of the Llama API is a strategic move by Meta to stay ahead in the highly competitive open model space.
Despite over a billion downloads of its Llama models, competitors like DeepSeek and Alibaba's Qwen pose a threat to Meta's ambition of building a massive ecosystem around Llama.
The new API offers tools for fine-tuning and evaluating the performance of these models, starting with Llama 3.3 8B.
Data privacy
Data privacy and model portability
Meta has promised that customer data from the Llama API won't be used to train its own models.
It also confirmed that models created with this new interface can be transferred to another host.
This focus on data privacy and model portability makes the Llama API even more attractive for developers looking for flexibility in their projects.
Collaboration
Llama API offers model-serving options for developers
For those building on Meta's newly launched Llama 4 models, the Llama API offers model-serving options through partnerships with Cerebras and Groq.
These "early experimental" options are available on request to help developers prototype their AI apps.
"By simply selecting the Cerebras or Groq model names in the API, developers can enjoy a streamlined experience with all usage tracked in one location," said Meta in a blog post shared with TechCrunch.
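Since Meta has said the Llama API follows the familiar chat-completions pattern, switching serving partners would amount to changing a single model string. The sketch below illustrates that idea; the model names, payload shape, and helper function are illustrative assumptions, not confirmed values from Meta's documentation.

```python
# Hypothetical sketch of a chat-completions request for the Llama API.
# Model names and payload fields here are assumptions for illustration.

def build_chat_request(model: str, prompt: str, max_tokens: int = 256) -> dict:
    """Assemble an OpenAI-style chat-completions payload.

    Per Meta's description, routing a request to a Cerebras- or
    Groq-served model is just a matter of picking a different
    model name; the rest of the request stays the same.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

# Default Llama-served model vs. a (hypothetical) partner-served variant:
default_req = build_chat_request("Llama-3.3-8B-Instruct", "Summarize LlamaCon.")
partner_req = build_chat_request("Llama-4-Cerebras", "Summarize LlamaCon.")
```

Because only the `model` field changes between the two requests, usage for both would be tracked in one place, as Meta's blog post describes.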