AI is everywhere right now. And as a Laravel developer, you might be wondering — can I add AI features to my Laravel app without switching frameworks or pulling in a dozen packages?
The answer is yes. And with the Laravel 13 AI SDK, it has never been easier.
In this guide, we will go through everything you need to know about the Laravel 13 AI SDK. We will cover what it is, how to set it up, and how to use it in real projects. By the end, you will be able to build AI-powered features directly inside your Laravel application.
So, let’s get started.
What is the Laravel 13 AI SDK?
The Laravel 13 AI SDK is a first-party package built by the Laravel team. It gives you a clean, native way to work with AI models directly from your Laravel application.
Before Laravel 13, if you wanted to add AI features, you had to wire up third-party packages yourself. You had to write custom wrappers. And you had to figure out authentication, error handling, and response formatting on your own.
That was frustrating.
But now, Laravel provides a simple, elegant API — just like everything else in the framework. You get access to AI capabilities without the boilerplate.
The Laravel AI SDK currently supports:
- Text generation — Ask a question, get a response
- AI agents — Build smart, tool-using AI workflows
- Embeddings — Convert text into vector representations for semantic search
- Streaming — Get real-time responses instead of waiting for the full output
- Content automation — Automate writing, summarization, classification, and more
And all of this works through a unified AI facade. Clean. Simple. Very Laravel.
Why Laravel Built a Native AI SDK
This is a fair question. Why build a native AI SDK when packages like Prism PHP already exist?
Here’s the thing — Laravel AI integration is no longer a nice-to-have. It is becoming a core requirement in modern web applications. Product managers want chatbots. Clients want smart search. Users expect AI-powered features as a default, not a luxury.
By building a native solution, Laravel gives you:
- Consistency — The same conventions you already know
- Official support — No more worrying if a third-party package will be maintained
- Framework-level integration — Works seamlessly with queues, logging, caching, and events
In short, the Laravel 13 AI SDK is Laravel taking AI seriously. And that is a big deal for the entire ecosystem.
Requirements Before You Begin
Before you can use the Laravel 13 AI SDK, make sure your environment is ready.
Here is what you need:
- Laravel 13 (obviously)
- PHP 8.3 or higher — Laravel 13 requires this
- Composer — For package installation
- An API key from a supported AI provider (OpenAI, Anthropic, Gemini, etc.)
If you are still on Laravel 12, check out our Laravel 13 upgrade guide first. Upgrading is simpler than you think.
Also, if you are new to Laravel 13, you might want to read the Laravel 13 new features overview before continuing.
Installing and Setting Up the Laravel AI SDK
The Laravel 13 AI SDK is installed via Composer, just like any other Laravel package. So let’s install it.
Open your terminal and run:
```shell
composer require laravel/ai
```
That’s it. One command. No extra configuration file needed at this point.
Next, publish the configuration file:
```shell
php artisan vendor:publish --provider="Laravel\AI\AIServiceProvider"
```
This will create a new file at config/ai.php. We will use this file in the next step to configure our AI provider.
Configuring Your AI Provider
The Laravel AI SDK supports multiple AI providers. This is one of its biggest strengths. You are not locked into one company.
Open config/ai.php. You will see something like this:
```php
return [
    'default' => env('AI_PROVIDER', 'openai'),

    'providers' => [
        'openai' => [
            'api_key' => env('OPENAI_API_KEY'),
            'model' => env('OPENAI_MODEL', 'gpt-4o'),
        ],

        'anthropic' => [
            'api_key' => env('ANTHROPIC_API_KEY'),
            'model' => env('ANTHROPIC_MODEL', 'claude-3-5-sonnet-20241022'),
        ],

        'gemini' => [
            'api_key' => env('GEMINI_API_KEY'),
            'model' => env('GEMINI_MODEL', 'gemini-1.5-pro'),
        ],
    ],
];
```
Now, add your credentials to your .env file:
```ini
AI_PROVIDER=openai
OPENAI_API_KEY=sk-your-openai-key-here
OPENAI_MODEL=gpt-4o
```
If you prefer Anthropic or Gemini, simply change AI_PROVIDER and add the corresponding key.
This multi-provider setup is a powerful feature of the Laravel AI integration. You can even switch providers per feature using runtime configuration, which we will cover shortly.
Text Generation with the Laravel AI SDK
Text generation is the most common use case. And with the Laravel 13 AI SDK, it is beautifully simple.
Basic Text Generation
Here is the most basic example:
```php
use Illuminate\Support\Facades\AI;

$response = AI::text('What is Laravel 13?');

echo $response->text();
```
Run this from a controller, an Artisan command, or a job. It works anywhere inside your Laravel app.
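For instance, here is a sketch of an Artisan command wrapping the same call (the `ai:ask` signature and `AskAI` class name are just illustrative choices, not part of the SDK):

```php
<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\AI;

class AskAI extends Command
{
    // Usage: php artisan ai:ask "What is Laravel 13?"
    protected $signature = 'ai:ask {prompt : The question to send to the AI}';

    protected $description = 'Send a prompt to the configured AI provider';

    public function handle(): int
    {
        $response = AI::text($this->argument('prompt'));

        $this->info($response->text());

        return self::SUCCESS;
    }
}
```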
Generating Text from a Controller
Let’s make it more realistic. Here is how you would use Laravel AI text generation inside a controller:
```php
<?php

namespace App\Http\Controllers;

use Illuminate\Http\Request;
use Illuminate\Support\Facades\AI;

class ContentController extends Controller
{
    public function generate(Request $request)
    {
        $request->validate([
            'prompt' => 'required|string|max:500',
        ]);

        $response = AI::text($request->prompt);

        return response()->json([
            'result' => $response->text(),
        ]);
    }
}
```
Clean and simple. Notice how it follows standard Laravel patterns. There is no learning curve if you already know Laravel.
Using a System Prompt
Sometimes, you want to give the AI context before asking a question. That is where a system prompt comes in.
```php
$response = AI::text(
    prompt: 'Write a short product description for a wireless keyboard.',
    system: 'You are a professional copywriter. Keep responses under 100 words. Use persuasive language.'
);

echo $response->text();
```
The system parameter sets the behavior of the AI. This is extremely useful when you are building AI-powered Laravel apps with specific personalities or personas.
Using a Different Provider at Runtime
What if you want to use Anthropic for one specific task but OpenAI for everything else?
```php
$response = AI::provider('anthropic')->text('Summarize this article: ...');

echo $response->text();
```
This is one of the cleanest features of the Laravel AI SDK. You switch providers with one method call — no config changes needed.
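Combined with error handling, this also enables a simple fallback pattern — a sketch, assuming `provider()` behaves as shown above:

```php
use Illuminate\Support\Facades\AI;

function generateWithFallback(string $prompt): string
{
    try {
        // Try the preferred provider first.
        return AI::provider('anthropic')->text($prompt)->text();
    } catch (\Throwable $e) {
        // Fall back to the default provider from config/ai.php.
        return AI::text($prompt)->text();
    }
}
```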
Building AI Agents in Laravel 13
AI agents are more powerful than basic text generation. An agent can use tools to take actions, fetch data, and make decisions based on results.
Think of an agent as an AI that can reason about a problem and decide what to do next.
Here is a simple example of an AI agent in Laravel 13:
```php
use Illuminate\Support\Facades\AI;
use Laravel\AI\Tools\Tool;

$agent = AI::agent()
    ->withTool(
        Tool::make('get_weather')
            ->description('Get current weather for a city')
            ->parameter('city', 'string', 'The city name')
            ->handler(function (string $city) {
                // Call your weather API here
                return "The weather in {$city} is 25°C and sunny.";
            })
    )
    ->run('What is the weather in Bangalore right now?');

echo $agent->text();
```
What happens here? The AI decides on its own that it needs to call the get_weather tool. It calls it, gets the result, and returns a final answer.
This is the real power of AI in Laravel 13. You are not just generating text. You are building systems that reason, act, and respond intelligently.
Practical Agent Use Cases
AI agents work great for:
- Customer support bots — Agents that check order status, fetch FAQs, and escalate tickets
- Internal tools — Agents that query your database and summarize results
- Smart assistants — Agents that book appointments, send emails, or create records
- Code helpers — Agents that review code and suggest improvements
The Laravel 13 AI SDK makes all of this possible with clean, readable PHP code.
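As a sketch of the first use case, a support agent might combine two tools using the same `Tool` API shown earlier (the `Order` model and tool names here are hypothetical):

```php
use Illuminate\Support\Facades\AI;
use Laravel\AI\Tools\Tool;

$agent = AI::agent()
    ->withTool(
        Tool::make('get_order_status')
            ->description('Look up the status of an order by its ID')
            ->parameter('order_id', 'string', 'The order ID')
            ->handler(function (string $orderId) {
                // Hypothetical Eloquent lookup against your orders table.
                return Order::findOrFail($orderId)->status;
            })
    )
    ->withTool(
        Tool::make('escalate_ticket')
            ->description('Escalate the conversation to a human agent')
            ->parameter('reason', 'string', 'Why escalation is needed')
            ->handler(fn (string $reason) => "Escalated to a human agent: {$reason}")
    )
    ->run('Where is my order #4821, and can I talk to a human if it is delayed?');
```

The agent decides which tool to call (if any) based on the question, so one endpoint can handle status checks and escalations alike.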
Working with Embeddings
Embeddings are a bit more advanced. But they are extremely powerful once you understand them.
An embedding is a numerical representation of text. It captures the meaning of words, not just the words themselves.
For example, the words “cat” and “kitten” are different strings. But their embeddings will be very similar because they mean similar things.
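“Similar” is usually measured with cosine similarity: a score close to 1 means the meanings are close. A plain-PHP version of the calculation:

```php
/**
 * Cosine similarity between two equal-length vectors.
 * Returns a value between -1 and 1; closer to 1 means more similar.
 */
function cosineSimilarity(array $a, array $b): float
{
    $dot = 0.0;
    $normA = 0.0;
    $normB = 0.0;

    foreach ($a as $i => $value) {
        $dot   += $value * $b[$i];
        $normA += $value ** 2;
        $normB += $b[$i] ** 2;
    }

    return $dot / (sqrt($normA) * sqrt($normB));
}
```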
Why Embeddings Matter
Embeddings power:
- Semantic search — Find results by meaning, not exact keywords
- Recommendation systems — Suggest similar articles, products, or content
- RAG (Retrieval Augmented Generation) — Give the AI relevant context from your own data
- Duplicate detection — Find similar content in your database
Generating Embeddings with the Laravel AI SDK
Here is how to generate an embedding in Laravel 13:
```php
use Illuminate\Support\Facades\AI;

$embedding = AI::embed('Laravel is a PHP framework for web artisans.');

// Returns an array of floats
$vector = $embedding->vector();
```
You can then store this vector in your database (PostgreSQL with pgvector works best) and query it later.
Storing and Searching Embeddings
```php
// Step 1: Store the embedding when creating a post
$post = Post::create(['title' => $title, 'content' => $content]);

$embedding = AI::embed($content)->vector();

$post->update(['embedding' => $embedding]);

// Step 2: Search by meaning later
$query = AI::embed('How to deploy a Laravel application?')->vector();

$results = Post::whereVectorSimilar('embedding', $query)->limit(5)->get();
```
This is Laravel artificial intelligence at its best. Your app now understands what users are looking for, not just what they typed.
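Note that a `whereVectorSimilar` helper may come from your vector-database integration rather than the SDK itself. With PostgreSQL and pgvector, the same search can be written as a raw query — a sketch, assuming an `embedding vector(...)` column on `posts` and pgvector's `<->` distance operator:

```php
use Illuminate\Support\Facades\DB;

// $query is the embedding array from AI::embed(...)->vector().
// pgvector expects a vector literal like '[0.12,0.34,...]'.
$literal = '[' . implode(',', $query) . ']';

// Order posts by distance to the query vector; nearest first.
$results = DB::select(
    'SELECT id, title FROM posts ORDER BY embedding <-> ?::vector LIMIT 5',
    [$literal]
);
```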
Streaming AI Responses
Nobody likes staring at a blank screen waiting for a full response to load. That is why streaming exists.
With streaming, the AI sends back words as it generates them — just like ChatGPT. This makes your app feel fast and responsive.
Here is how to stream a response in the Laravel 13 AI SDK:
```php
use Illuminate\Support\Facades\AI;

$stream = AI::stream('Write a detailed tutorial about Laravel queues.');

foreach ($stream->chunks() as $chunk) {
    echo $chunk->text();

    // Push each chunk to the browser immediately.
    if (ob_get_level() > 0) {
        ob_flush();
    }
    flush();
}
```
For a proper streaming API endpoint, you can use Laravel’s StreamedResponse:
```php
use Illuminate\Http\Request;
use Illuminate\Support\Facades\AI;
use Symfony\Component\HttpFoundation\StreamedResponse;

public function streamContent(Request $request): StreamedResponse
{
    return response()->stream(function () use ($request) {
        $stream = AI::stream($request->prompt);

        foreach ($stream->chunks() as $chunk) {
            echo "data: " . json_encode(['text' => $chunk->text()]) . "\n\n";

            if (ob_get_level() > 0) {
                ob_flush();
            }
            flush();
        }
    }, 200, [
        'Content-Type' => 'text/event-stream',
        'Cache-Control' => 'no-cache',
        'X-Accel-Buffering' => 'no',
    ]);
}
```
On the frontend, you can consume this with the EventSource API or a simple fetch stream. The result is a real-time AI response — smooth, fast, and professional.
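Here is a minimal sketch of the `EventSource` approach (the `/ai/stream` route and its GET exposure are assumptions — `EventSource` only supports GET, so the prompt goes in the query string):

```javascript
// Parse one SSE "data:" payload (the JSON we encoded server-side)
// and return the text chunk it carries.
function parseChunk(data) {
  const payload = JSON.parse(data);
  return payload.text ?? '';
}

// Browser usage: append chunks to the page as they arrive.
function streamInto(element, prompt) {
  const source = new EventSource(`/ai/stream?prompt=${encodeURIComponent(prompt)}`);

  source.onmessage = (event) => {
    element.textContent += parseChunk(event.data);
  };

  // Close the connection when the stream ends or errors out.
  source.onerror = () => source.close();
}
```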
Real-World Use Cases
Let’s look at how developers are actually using the Laravel AI SDK in production.
1. AI-Powered Blog Post Generator
```php
public function generatePost(Request $request)
{
    $response = AI::text(
        prompt: "Write a blog post about: {$request->topic}",
        system: 'You are a technical writer. Use clear language. Keep paragraphs short. Add code examples where relevant.'
    );

    return Post::create([
        'title' => $request->topic,
        'content' => $response->text(),
        'status' => 'draft',
    ]);
}
```
2. Smart Customer Support Auto-Reply
```php
public function autoReply(SupportTicket $ticket)
{
    $response = AI::text(
        prompt: "Customer says: {$ticket->message}. Suggest a helpful reply.",
        system: 'You are a friendly customer support agent for a SaaS company. Be concise and helpful.'
    );

    $ticket->update(['suggested_reply' => $response->text()]);
}
```
3. Product Description Generator for E-commerce
```php
public function generateDescription(Product $product)
{
    $response = AI::text(
        prompt: "Generate a product description for: {$product->name}. " .
            "Features: {$product->features}. Target audience: {$product->audience}.",
        system: 'Write compelling e-commerce product descriptions. Keep it under 150 words. Focus on benefits.'
    );

    $product->update(['description' => $response->text()]);
}
```
4. AI Code Review Assistant
```php
public function reviewCode(Request $request)
{
    $response = AI::text(
        prompt: "Review this PHP code and suggest improvements:\n\n{$request->code}",
        system: 'You are a senior PHP developer. Focus on security, performance, and Laravel best practices.'
    );

    return response()->json(['review' => $response->text()]);
}
```
These examples show just how versatile Laravel AI integration can be. Almost any feature you can imagine can be enhanced with AI.
Best Practices When Using Laravel AI
Before you ship AI features to production, here are some important things to keep in mind.
1. Always Validate User Input
Never pass raw user input directly to the AI without validation. This can lead to prompt injection attacks.
```php
$request->validate([
    'prompt' => 'required|string|max:1000',
]);
```
2. Cache Repeated Requests
AI API calls cost money. If the same prompt is being called repeatedly, cache the result.
```php
$result = Cache::remember('ai_faq_answer_' . md5($prompt), 3600, function () use ($prompt) {
    return AI::text($prompt)->text();
});
```
3. Run Heavy AI Tasks in the Background
For long-running generations, use Laravel queues. Never make the user wait.
```php
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Support\Facades\AI;

class GenerateReportJob implements ShouldQueue
{
    use Dispatchable, Queueable;

    public function handle(): void
    {
        $response = AI::text('Generate a monthly sales report summary...');

        // Save the result to the database here.
    }
}

// Dispatch it from anywhere in your app:
GenerateReportJob::dispatch();
```
4. Handle Errors Gracefully
AI APIs can fail. Always wrap your calls in try-catch.
```php
try {
    $response = AI::text($prompt);

    return $response->text();
} catch (\Exception $e) {
    Log::error('AI generation failed: ' . $e->getMessage());

    return 'Sorry, we could not generate a response. Please try again.';
}
```
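For transient failures like rate limits and timeouts, Laravel's built-in `retry()` helper pairs nicely with this pattern:

```php
use Illuminate\Support\Facades\AI;

// Retry up to 3 times, waiting 200ms between attempts.
// If all attempts fail, the last exception is thrown,
// so keep the try-catch above around this call.
$text = retry(3, function () use ($prompt) {
    return AI::text($prompt)->text();
}, 200);
```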
5. Set Spending Limits on Your API Account
This one is often forgotten. Go to your OpenAI or Anthropic dashboard and set monthly spending limits. This protects you from unexpected bills if traffic spikes.
Final Thoughts
The Laravel 13 AI SDK is genuinely exciting. And not because of hype — but because of what it makes possible.
Think about it. A few years ago, adding AI to a PHP application meant custom HTTP clients, manual JSON parsing, and fragile integrations. Today, you write AI::text() and you are done.
Moreover, this is just the beginning. The Laravel AI SDK will grow with every release. More providers, more tools, more capabilities — all under the same clean, familiar API you already love.
So, if you are starting a new Laravel project in 2026, there is really no reason not to consider AI features. The barrier to entry is now incredibly low.
And if you are upgrading an existing app, start small. Add a single AI feature — a description generator, a smart search, a support auto-reply. See how it performs. Then build from there.
The future of Laravel development is AI-assisted. And with Laravel 13, that future is already here.
Found this guide helpful? Share it with your fellow Laravel developers. And if you have questions, drop them in the comments below — we read every single one.
