In our previous guide, we introduced the Laravel 13 AI SDK and covered the basics of working with AI in Laravel. Now, let’s take things a step further. Today, we are diving deep into Laravel 13 AI agents.
So, what exactly is an agent?
Well, an agent is not just a simple text generator. Instead, it is a dedicated PHP class that encapsulates instructions, tools, memory, and output logic — all in one place. In other words, it acts as a fully structured AI-powered component inside your application.
To put it simply, think of it as a specialized AI assistant — whether that’s a sales coach, a support bot, or even a document analyzer. Once you configure it, you can reuse it anywhere in your application, which makes your code much cleaner and easier to maintain.
Even better, Laravel makes this process incredibly smooth. With the built-in make:agent Artisan command, you can generate agents instantly. That means no custom boilerplate and no manual wiring — just clean, structured PHP classes ready to use.
In this guide, we will walk through everything you need to know about Laravel 13 AI agents. Specifically, we will cover how to create your first agent, how streaming works, how to use queues, how to integrate tools, manage memory, and finally, how to test everything properly.
So, without further delay, let’s get started.
What Is a Laravel AI Agent
Before we jump into writing code, it’s important to clearly understand what an agent actually is.
At its core, agents are the fundamental building blocks for interacting with AI providers in the Laravel AI SDK. More importantly, each agent is a dedicated PHP class that encapsulates everything needed to interact with a large language model: instructions, conversation context, tools, and output schema. Because of this structure, agents let you organize your AI logic in a far more maintainable and reusable way.
Once again, think of an agent as a specialized assistant — whether it’s a sales coach, a document analyzer, or a support bot. You define it once, and then you can prompt it whenever needed throughout your application.
This is exactly what makes agents so powerful. Unlike a one-off API call, an agent is reusable, structured, and scalable. As a result, your application becomes easier to extend and maintain over time.
Finally, it’s worth mentioning that the Laravel AI SDK supports multiple providers. For example, you can work with OpenAI, Anthropic, Gemini, Azure, Groq, xAI, DeepSeek, Mistral, and more — all through a single, clean interface.
Installation and Setup
First, install the Laravel AI package using the command below:
composer require laravel/ai
Next, publish the configuration and migration files:
php artisan vendor:publish --provider="Laravel\Ai\AiServiceProvider"
Then run the migrations. This creates the agent_conversations and agent_conversation_messages tables needed for conversation storage:
php artisan migrate
Now add your API credentials to .env:
OPENAI_API_KEY=your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key
GEMINI_API_KEY=your-gemini-key
The SDK supports many providers. You only need to add keys for the ones you plan to use.
That’s it. You are ready to build Laravel 13 AI agents.
Creating Your First Agent with make:agent
Here is the command that makes everything easier. Laravel provides the make:agent Artisan command to generate your agent class:
php artisan make:agent SalesCoach
This generates a new file at app/Ai/Agents/SalesCoach.php. Open it and you will see:
<?php
namespace App\Ai\Agents;
use Laravel\Ai\Contracts\Agent;
use Laravel\Ai\Promptable;
use Stringable;
class SalesCoach implements Agent
{
use Promptable;
/**
* Get the instructions that the agent should follow.
*/
public function instructions(): Stringable|string
{
return 'You are a sales coach, analyzing transcripts and providing feedback.';
}
}
This is already a working Laravel AI agent. It implements the Agent contract, uses the Promptable trait, and defines a system prompt via instructions().
Notice how clean it is: no configuration files, no service providers, and no manual wiring. It is just a simple PHP class.
You can also generate an agent with structured output support right away:
php artisan make:agent SalesCoach --structured
This adds the HasStructuredOutput interface and a schema() method to the generated class. We will cover structured output in detail shortly.
Prompting an Agent
To interact with your Laravel AI agent, create an instance and call prompt():
$response = (new SalesCoach)
->prompt('Analyze this sales transcript...');
return (string) $response;
The make() static method resolves the agent from the service container. This enables automatic dependency injection:
$agent = SalesCoach::make(user: $user);
$response = $agent->prompt('Analyze this transcript...');
You can also override the provider, model, or timeout per prompt call:
use Laravel\Ai\Enums\Lab;
$response = (new SalesCoach)->prompt(
'Analyze this sales transcript...',
provider: Lab::Anthropic,
model: 'claude-opus-4-5-20251001',
timeout: 120,
);
This is very useful. You can use a cheap model by default, but switch to a more capable one for complex tasks — all at runtime without changing agent configuration.
Conversation Memory
By default, each prompt() call is stateless. The agent does not remember previous messages.
For chatbots, support systems, or any multi-turn conversation, you need memory. Laravel gives you two approaches.
Option 1 — Manual Conversation Context
Implement the Conversational interface and define a messages() method that loads your conversation history:
<?php
namespace App\Ai\Agents;
use App\Models\History;
use Laravel\Ai\Contracts\Agent;
use Laravel\Ai\Contracts\Conversational;
use Laravel\Ai\Messages\Message;
use Laravel\Ai\Promptable;
class SalesCoach implements Agent, Conversational
{
use Promptable;
public function __construct(public readonly int $userId) {}
public function instructions(): string
{
return 'You are a sales coach.';
}
public function messages(): iterable
{
return History::where('user_id', $this->userId)
->latest()
->limit(50)
->get()
->reverse()
->map(fn ($msg) => new Message($msg->role, $msg->content))
->all();
}
}
Option 2 — Automatic Memory with RemembersConversations
This is the simpler approach. Add the RemembersConversations trait. Laravel handles everything automatically:
<?php
namespace App\Ai\Agents;
use Laravel\Ai\Concerns\RemembersConversations;
use Laravel\Ai\Contracts\Agent;
use Laravel\Ai\Contracts\Conversational;
use Laravel\Ai\Promptable;
class SalesCoach implements Agent, Conversational
{
use Promptable, RemembersConversations;
public function instructions(): string
{
return 'You are a sales coach.';
}
}
To start a new conversation for a user:
$response = (new SalesCoach)->forUser($user)->prompt('Hello!');
// Store this ID for continuing the conversation later
$conversationId = $response->conversationId;
To continue an existing conversation:
$response = (new SalesCoach)
->continue($conversationId, as: $user)
->prompt('Tell me more about that.');
Laravel automatically loads previous messages and saves new ones after every interaction. Your Laravel AI agent remembers everything without you managing a single database query.
Structured Output
Sometimes plain text is not enough. You want structured data — a score, a list of items, a JSON object your application can process.
Implement the HasStructuredOutput interface and define a schema() method:
<?php
namespace App\Ai\Agents;
use Illuminate\Contracts\JsonSchema\JsonSchema;
use Laravel\Ai\Contracts\Agent;
use Laravel\Ai\Contracts\HasStructuredOutput;
use Laravel\Ai\Promptable;
class SalesCoach implements Agent, HasStructuredOutput
{
use Promptable;
public function instructions(): string
{
return 'You are a sales coach. Analyze transcripts and return structured feedback with a score.';
}
public function schema(JsonSchema $schema): array
{
return [
'feedback' => $schema->string()->required(),
'score' => $schema->integer()->min(1)->max(10)->required(),
];
}
}
Access the response like an array:
$response = (new SalesCoach)->prompt('Analyze this sales transcript...');
dump($response['feedback']); // "Great closing technique, but work on objection handling."
dd($response['score']); // 7
Nested Objects
public function schema(JsonSchema $schema): array
{
return [
'score' => $schema->integer()->required(),
'metadata' => $schema->object(fn ($schema) => [
'confidence' => $schema->string()->enum(['low', 'medium', 'high'])->required(),
'language' => $schema->string()->required(),
])->required(),
];
}
Arrays of Objects
public function schema(JsonSchema $schema): array
{
return [
'feedback' => $schema->array()
->items(
$schema->object(fn ($schema) => [
'comment' => $schema->string()->required(),
'score' => $schema->integer()->required(),
])
)
->required(),
];
}
Structured output is perfect for dashboards, automated reports, and feeding AI results directly into your database.
Attaching Files and Images
Your Laravel AI agent can also read and analyze documents and images. Pass them as attachments when prompting:
use App\Ai\Agents\SalesCoach;
use Laravel\Ai\Files;
$response = (new SalesCoach)->prompt(
'Analyze the attached sales transcript...',
attachments: [
Files\Document::fromStorage('transcript.pdf'), // From filesystem disk
Files\Document::fromPath('/home/laravel/doc.md'), // From local path
$request->file('transcript'), // From file upload
]
);
For images, use the Image class:
$response = (new ImageAnalyzer)->prompt(
'What is in this image?',
attachments: [
Files\Image::fromStorage('photo.jpg'),
Files\Image::fromPath('/home/laravel/photo.jpg'),
$request->file('photo'),
]
);
This opens powerful possibilities — invoice extraction, document summarization, image analysis — all within your Laravel agent.
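To make the invoice-extraction idea concrete, you can combine document attachments with the structured output support covered earlier. The InvoiceExtractor agent below is a hypothetical sketch built only from the SDK interfaces shown in this guide; the field names are assumptions for illustration:

```php
<?php

namespace App\Ai\Agents;

use Illuminate\Contracts\JsonSchema\JsonSchema;
use Laravel\Ai\Contracts\Agent;
use Laravel\Ai\Contracts\HasStructuredOutput;
use Laravel\Ai\Promptable;

// Hypothetical agent: extracts key fields from an attached invoice PDF.
class InvoiceExtractor implements Agent, HasStructuredOutput
{
    use Promptable;

    public function instructions(): string
    {
        return 'Extract the vendor name, total amount, and due date from the attached invoice.';
    }

    public function schema(JsonSchema $schema): array
    {
        return [
            'vendor' => $schema->string()->required(),
            'total' => $schema->string()->required(),
            'due_date' => $schema->string()->required(),
        ];
    }
}
```

You would then prompt it with a document attachment, exactly as shown above — for example, `(new InvoiceExtractor)->prompt('Extract the invoice fields.', attachments: [Files\Document::fromStorage('invoice.pdf')])` — and read `$response['vendor']` and friends like an array.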
Streaming Agent Responses
Streaming sends words back as they are generated — just like ChatGPT. It makes your application feel fast and responsive.
Call stream() instead of prompt():
use App\Ai\Agents\SalesCoach;
Route::get('/coach', function () {
return (new SalesCoach)->stream('Analyze this sales transcript...');
});
Returning the stream from a route automatically sends an SSE (Server-Sent Events) response to the client. No manual headers needed.
Use the then callback to run code after the full response is delivered:
use Laravel\Ai\Responses\StreamedAgentResponse;
Route::get('/coach', function () {
return (new SalesCoach)
->stream('Analyze this sales transcript...')
->then(function (StreamedAgentResponse $response) {
// $response->text, $response->events, $response->usage
});
});
Or iterate through events manually:
$stream = (new SalesCoach)->stream('Analyze this...');
foreach ($stream as $event) {
echo $event;
}
For frontend integration with the Vercel AI SDK, use the built-in protocol adapter:
return (new SalesCoach)
->stream('Analyze this...')
->usingVercelDataProtocol();
Queueing Agents in the Background
Some agent tasks are heavy: long analyses, bulk processing, complex multi-step reasoning. Never block the HTTP request for these.
Use the queue() method:
use Illuminate\Http\Request;
use Laravel\Ai\Responses\AgentResponse;
Route::post('/coach', function (Request $request) {
(new SalesCoach)
->queue($request->input('transcript'))
->then(function (AgentResponse $response) {
// Save result to database, send notification, etc.
})
->catch(function (Throwable $e) {
// Handle the failure gracefully
});
return back();
});
The user gets an instant response. The agent processes in the background. The result is handled in the then callback when ready.
This is the proper production pattern for Laravel 13 AI agents that do heavy work.
Creating Tools with make:tool
Tools give your agent the ability to take real actions — query your database, call an API, generate data. The agent decides on its own which tool to use and when.
Create a tool with the make:tool Artisan command:
php artisan make:tool GetOrderStatus
This generates app/Ai/Tools/GetOrderStatus.php. Here is a complete, working tool:
<?php
namespace App\Ai\Tools;
use App\Models\Order;
use Illuminate\Contracts\JsonSchema\JsonSchema;
use Laravel\Ai\Contracts\Tool;
use Laravel\Ai\Tools\Request;
use Stringable;
class GetOrderStatus implements Tool
{
/**
* Describe what this tool does.
* The agent reads this to decide when to use it.
*/
public function description(): Stringable|string
{
return 'Fetch the current status of a customer order using the order number.';
}
/**
* The logic that runs when the agent calls this tool.
*/
public function handle(Request $request): Stringable|string
{
$order = Order::where('order_number', $request['order_number'])->first();
if (! $order) {
return "No order found with number {$request['order_number']}.";
}
return "Order {$request['order_number']}: Status = {$order->status}, " .
"Total = \${$order->total}, Placed on {$order->created_at->format('M d, Y')}.";
}
/**
* Define the input parameters this tool accepts.
*/
public function schema(JsonSchema $schema): array
{
return [
'order_number' => $schema->string()->required(),
];
}
}
Notice how clean this structure is. The description() tells the agent when to use this tool. The schema() defines what input it accepts. The handle() contains the actual logic.
Registering Tools on Your Agent
Return tools from the tools() method on your agent. Also implement HasTools:
use App\Ai\Tools\GetOrderStatus;
use Laravel\Ai\Contracts\Agent;
use Laravel\Ai\Contracts\HasTools;
use Laravel\Ai\Promptable;
class SupportAgent implements Agent, HasTools
{
use Promptable;
public function instructions(): string
{
return 'You are a friendly customer support agent. Help customers with their orders.';
}
public function tools(): iterable
{
return [
new GetOrderStatus,
];
}
}
Now, when a user asks, "What is the status of my order ORD-12345?", the agent decides on its own to call GetOrderStatus, gets the result, and returns a natural-language answer.
Multiple Tools — Real Support Agent
php artisan make:tool GetOrderStatus
php artisan make:tool InitiateRefund
php artisan make:tool EscalateTicket
public function tools(): iterable
{
return [
new GetOrderStatus,
new InitiateRefund,
new EscalateTicket,
];
}
Each tool does one thing well. The agent handles the decision logic. Clean, maintainable, and very powerful.
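As a sketch of what one of the other tools might look like, here is a minimal InitiateRefund implementation mirroring the GetOrderStatus structure above. The `$order->refund()` call is an assumed domain method on your own Order model, not part of the SDK:

```php
<?php

namespace App\Ai\Tools;

use App\Models\Order;
use Illuminate\Contracts\JsonSchema\JsonSchema;
use Laravel\Ai\Contracts\Tool;
use Laravel\Ai\Tools\Request;

// Hypothetical refund tool, following the same shape as GetOrderStatus.
class InitiateRefund implements Tool
{
    public function description(): string
    {
        return 'Initiate a refund for an order. Only use this when the customer explicitly requests a refund.';
    }

    public function handle(Request $request): string
    {
        $order = Order::where('order_number', $request['order_number'])->first();

        if (! $order) {
            return "No order found with number {$request['order_number']}.";
        }

        // Assumed domain method — swap in your own refund logic here.
        $order->refund();

        return "Refund initiated for order {$request['order_number']}.";
    }

    public function schema(JsonSchema $schema): array
    {
        return [
            'order_number' => $schema->string()->required(),
        ];
    }
}
```

Note how the description explicitly tells the agent when not to use the tool; for destructive actions like refunds, that guardrail belongs in the description as much as in your own authorization checks.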
Built-in Similarity Search Tool
Laravel also provides a SimilaritySearch tool for RAG (Retrieval Augmented Generation). It lets your agent search vector embeddings stored in your database:
use App\Models\Document;
use Laravel\Ai\Tools\SimilaritySearch;
public function tools(): iterable
{
return [
SimilaritySearch::usingModel(Document::class, 'embedding'),
];
}
With more control over similarity threshold and filters:
SimilaritySearch::usingModel(
model: Document::class,
column: 'embedding',
minSimilarity: 0.7,
limit: 10,
query: fn ($query) => $query->where('published', true),
),
This gives your Laravel AI agent the ability to search your own knowledge base and answer questions based on your real content.
Built-in Provider Tools
Provider tools are special tools implemented natively by the AI provider. Your application does not execute them — the provider does.
Web Search
Give your agent real-time web access:
use Laravel\Ai\Providers\Tools\WebSearch;
public function tools(): iterable
{
return [
(new WebSearch)->max(5)->allow(['laravel.com', 'php.net']),
];
}
You can also set a user location to get region-specific results:
(new WebSearch)->location(city: 'Bangalore', region: 'KA', country: 'IN');
Supported by Anthropic, OpenAI, and Gemini.
Web Fetch
Let your agent read specific web pages:
use Laravel\Ai\Providers\Tools\WebFetch;
public function tools(): iterable
{
return [
(new WebFetch)->max(3)->allow(['docs.laravel.com']),
];
}
Perfect for agents that answer questions from your documentation or check external data.
File Search
Search documents stored in a vector store:
use Laravel\Ai\Providers\Tools\FileSearch;
public function tools(): iterable
{
return [
new FileSearch(stores: ['your-store-id']),
];
}
The agent can now search through uploaded PDFs and documents to find relevant information. This capability is supported by OpenAI and Gemini.
Agent Middleware
Agents support middleware — just like HTTP routes. This lets you intercept and modify prompts before they reach the provider.
Generate middleware with:
php artisan make:agent-middleware LogPrompts
This creates a file in app/Ai/Middleware/. Here is a logging middleware example:
<?php
namespace App\Ai\Middleware;
use Closure;
use Illuminate\Support\Facades\Log;
use Laravel\Ai\Prompts\AgentPrompt;
class LogPrompts
{
public function handle(AgentPrompt $prompt, Closure $next)
{
Log::info('AI Agent prompted', [
'prompt' => $prompt->prompt,
'user_id' => auth()->id(),
]);
return $next($prompt)->then(function ($response) {
Log::info('AI Agent responded', ['text' => $response->text]);
});
}
}
Register it on your agent by implementing HasMiddleware:
use App\Ai\Middleware\LogPrompts;
use Laravel\Ai\Contracts\Agent;
use Laravel\Ai\Contracts\HasMiddleware;
use Laravel\Ai\Promptable;
class SalesCoach implements Agent, HasMiddleware
{
use Promptable;
public function instructions(): string
{
return 'You are a sales coach.';
}
public function middleware(): array
{
return [
new LogPrompts,
];
}
}
You can use middleware for logging, rate limiting, content moderation, authentication checks, or prompt injection protection.
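For instance, a rate-limiting middleware can be sketched with the same handle() signature shown above. The ThrottlePrompts class name and limits are assumptions; the RateLimiter calls are standard Laravel:

```php
<?php

namespace App\Ai\Middleware;

use Closure;
use Illuminate\Support\Facades\RateLimiter;
use Laravel\Ai\Prompts\AgentPrompt;

// Sketch: throttle AI prompts per user with Laravel's built-in RateLimiter.
class ThrottlePrompts
{
    public function handle(AgentPrompt $prompt, Closure $next)
    {
        $key = 'ai-prompts:'.auth()->id();

        // Allow at most 20 prompts per user per minute.
        if (RateLimiter::tooManyAttempts($key, maxAttempts: 20)) {
            throw new \RuntimeException('Too many AI prompts. Please try again in a minute.');
        }

        RateLimiter::hit($key, decaySeconds: 60);

        return $next($prompt);
    }
}
```

Register it from your agent's middleware() method, exactly like LogPrompts above. Because AI calls cost real money per token, a throttle like this is often the first middleware worth adding in production.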
Anonymous Agents
Sometimes you need a quick, one-off AI interaction. You do not want to create a full agent class for it.
Laravel provides the agent() helper function:
use function Laravel\Ai\{agent};
$response = agent(
instructions: 'You are an expert at software development.',
messages: [],
tools: [],
)->prompt('Tell me about Laravel');
Anonymous agents also support structured output:
use Illuminate\Contracts\JsonSchema\JsonSchema;
use function Laravel\Ai\{agent};
$response = agent(
schema: fn (JsonSchema $schema) => [
'number' => $schema->integer()->required(),
],
)->prompt('Generate a random number less than 100');
echo $response['number'];
Use anonymous agents for prototyping, quick scripts, or Artisan command utilities.
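As one illustration, an Artisan command can use the agent() helper for a one-off task without defining an agent class. The command below is a hypothetical sketch; it assumes the agent() helper accepts just the instructions argument shown earlier:

```php
<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;

use function Laravel\Ai\agent;

// Hypothetical one-off command powered by an anonymous agent.
class SummarizeChangelog extends Command
{
    protected $signature = 'changelog:summarize {path}';

    protected $description = 'Summarize a changelog file using an anonymous AI agent';

    public function handle(): int
    {
        $contents = file_get_contents($this->argument('path'));

        $response = agent(
            instructions: 'Summarize changelogs into three short bullet points.',
        )->prompt($contents);

        $this->info((string) $response);

        return self::SUCCESS;
    }
}
```

For a throwaway utility like this, the anonymous agent keeps everything in one file; if the command grows tools or memory, promote it to a dedicated make:agent class.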
Configuring Agents with PHP Attributes
This is one of the cleanest features of Laravel 13 AI agents. Instead of configuration arrays or method calls, you use PHP Attributes directly on the class.
Here is a fully configured agent using attributes:
<?php
namespace App\Ai\Agents;
use Laravel\Ai\Attributes\MaxSteps;
use Laravel\Ai\Attributes\MaxTokens;
use Laravel\Ai\Attributes\Model;
use Laravel\Ai\Attributes\Provider;
use Laravel\Ai\Attributes\Temperature;
use Laravel\Ai\Attributes\Timeout;
use Laravel\Ai\Contracts\Agent;
use Laravel\Ai\Enums\Lab;
use Laravel\Ai\Promptable;
#[Provider(Lab::Anthropic)]
#[Model('claude-opus-4-5-20251001')]
#[MaxSteps(10)]
#[MaxTokens(4096)]
#[Temperature(0.7)]
#[Timeout(120)]
class SalesCoach implements Agent
{
use Promptable;
public function instructions(): string
{
return 'You are a sales coach.';
}
}
Each attribute is self-explanatory:
#[Provider] — Which AI provider to use
#[Model] — The specific model name
#[MaxSteps] — How many tool-calling steps the agent can take
#[MaxTokens] — Maximum number of tokens to generate
#[Temperature] — Creativity/randomness from 0.0 to 1.0
#[Timeout] — HTTP timeout in seconds
Smart Model Selection
You do not even need to specify a model name. Let Laravel choose automatically:
use Laravel\Ai\Attributes\UseCheapestModel;
use Laravel\Ai\Attributes\UseSmartestModel;
// Uses the cheapest available model — great for simple tasks
#[UseCheapestModel]
class SimpleSummarizer implements Agent
{
use Promptable;
}
// Uses the most capable model — great for complex reasoning
#[UseSmartestModel]
class ComplexAnalyzer implements Agent
{
use Promptable;
}
This is excellent for cost optimization. Simple tasks automatically use cheap models. Complex tasks automatically use smart models. No hard-coded model names anywhere.
Failover Between Providers
What happens if OpenAI goes down? Or you hit a rate limit?
Laravel 13 has built-in failover support. Simply pass an array of providers:
$response = (new SalesCoach)->prompt(
'Analyze this sales transcript...',
provider: [Lab::OpenAI, Lab::Anthropic],
);
If OpenAI fails, Laravel automatically retries with Anthropic. No extra code. No try-catch blocks. It just works.
You can also configure failover at the class level using the #[Provider] attribute with an array:
#[Provider([Lab::OpenAI, Lab::Anthropic, Lab::Gemini])]
class SalesCoach implements Agent
{
use Promptable;
}
This makes your Laravel 13 AI agents resilient in production environments where API availability cannot be guaranteed.
Testing Your Agents
Laravel provides a first-class testing API for Laravel AI agents. You never need to make real API calls during tests.
Fake an agent’s responses like this:
use App\Ai\Agents\SalesCoach;
use Laravel\Ai\Prompts\AgentPrompt;
// Return the same fixed response every time
SalesCoach::fake();
// Return different responses in sequence
SalesCoach::fake([
'First response',
'Second response',
]);
// Return responses dynamically based on the prompt
SalesCoach::fake(function (AgentPrompt $prompt) {
return 'Response for: ' . $prompt->prompt;
});
After prompting, make assertions:
SalesCoach::assertPrompted('Analyze this...');
SalesCoach::assertPrompted(function (AgentPrompt $prompt) {
return $prompt->contains('Analyze');
});
SalesCoach::assertNotPrompted('Something else');
SalesCoach::assertNeverPrompted();
For queued agents:
SalesCoach::assertQueued('Analyze this...');
SalesCoach::assertNeverQueued();
Prevent untested prompts from slipping through:
SalesCoach::fake()->preventStrayPrompts();
With this setup, your entire AI layer is fully testable. No API keys, no network calls, and no unexpected costs during CI/CD.
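Putting the pieces together, a complete test might look like the sketch below. It uses only the fake(), preventStrayPrompts(), and assertPrompted() calls shown above; casting the faked response to a string is an assumption based on the prompting examples earlier in this guide:

```php
<?php

namespace Tests\Feature;

use App\Ai\Agents\SalesCoach;
use Tests\TestCase;

// Sketch of a feature test for the SalesCoach agent — no real API calls.
class SalesCoachTest extends TestCase
{
    public function test_coach_analyzes_a_transcript(): void
    {
        // Queue one canned response and fail loudly on any unexpected prompt.
        SalesCoach::fake(['Focus on objection handling.'])->preventStrayPrompts();

        $response = (new SalesCoach)->prompt('Analyze: the rep skipped discovery questions.');

        $this->assertSame('Focus on objection handling.', (string) $response);

        // Verify the agent was prompted with the transcript content.
        SalesCoach::assertPrompted(fn ($prompt) => $prompt->contains('discovery'));
    }
}
```

The key habit is preventStrayPrompts(): with it in place, any code path that prompts an agent you forgot to fake fails the test instead of silently hitting a paid API.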
Real-World Example: A Sales Coach Agent
Let’s put everything together. Here is a complete Laravel 13 AI agent that analyzes sales transcripts, remembers conversations, uses a custom tool, and returns structured output.
Step 1 — Create the Tool
First, create the tool by running the command below in your terminal:
php artisan make:tool RetrievePreviousTranscripts
<?php
namespace App\Ai\Tools;
use App\Models\SalesTranscript;
use Illuminate\Contracts\JsonSchema\JsonSchema;
use Laravel\Ai\Contracts\Tool;
use Laravel\Ai\Tools\Request;
class RetrievePreviousTranscripts implements Tool
{
public function __construct(private readonly int $userId) {}
public function description(): string
{
return 'Retrieve previous sales transcripts for this user to compare performance over time.';
}
public function handle(Request $request): string
{
$transcripts = SalesTranscript::where('user_id', $this->userId)
->latest()
->limit(3)
->get();
if ($transcripts->isEmpty()) {
return 'No previous transcripts found for this user.';
}
return $transcripts->map(
fn ($t) => "Date: {$t->created_at->format('M d, Y')} | Score: {$t->score} | Summary: {$t->summary}"
)->join("\n");
}
public function schema(JsonSchema $schema): array
{
return []; // No input parameters needed
}
}
With the tool in place, we can create the agent.
Step 2 — Create the Agent
php artisan make:agent SalesCoach --structured
<?php
namespace App\Ai\Agents;
use App\Ai\Tools\RetrievePreviousTranscripts;
use App\Models\User;
use Illuminate\Contracts\JsonSchema\JsonSchema;
use Laravel\Ai\Attributes\MaxSteps;
use Laravel\Ai\Attributes\Provider;
use Laravel\Ai\Attributes\UseSmartestModel;
use Laravel\Ai\Concerns\RemembersConversations;
use Laravel\Ai\Contracts\Agent;
use Laravel\Ai\Contracts\Conversational;
use Laravel\Ai\Contracts\HasStructuredOutput;
use Laravel\Ai\Contracts\HasTools;
use Laravel\Ai\Enums\Lab;
use Laravel\Ai\Promptable;
#[Provider(Lab::Anthropic)]
#[UseSmartestModel]
#[MaxSteps(5)]
class SalesCoach implements Agent, Conversational, HasTools, HasStructuredOutput
{
use Promptable, RemembersConversations;
public function __construct(public readonly User $user) {}
public function instructions(): string
{
return "You are an expert sales coach analyzing call transcripts for {$this->user->name}.
Provide specific, actionable feedback. Reference previous transcripts when available.
Always give a score between 1 and 10.";
}
public function tools(): iterable
{
return [
new RetrievePreviousTranscripts($this->user->id),
];
}
public function schema(JsonSchema $schema): array
{
return [
'feedback' => $schema->string()->required(),
'score' => $schema->integer()->min(1)->max(10)->required(),
'strengths' => $schema->array()->items($schema->string())->required(),
'improvements' => $schema->array()->items($schema->string())->required(),
];
}
}
Finally, we will use the agent in a controller.
Step 3 — Use It in a Controller
<?php
namespace App\Http\Controllers;
use App\Ai\Agents\SalesCoach;
use App\Models\SalesTranscript;
use Illuminate\Http\Request;
class SalesCoachController extends Controller
{
public function analyze(Request $request)
{
$request->validate([
'transcript' => 'required|string',
'conversation_id' => 'nullable|string',
]);
$agent = SalesCoach::make(user: $request->user());
// Continue existing conversation or start a fresh one
if ($request->conversation_id) {
$agent = $agent->continue($request->conversation_id, as: $request->user());
} else {
$agent = $agent->forUser($request->user());
}
$response = $agent->prompt(
"Please analyze this sales call transcript:\n\n{$request->transcript}"
);
// Save the structured result to the database
SalesTranscript::create([
'user_id' => $request->user()->id,
'score' => $response['score'],
'summary' => $response['feedback'],
]);
return response()->json([
'feedback' => $response['feedback'],
'score' => $response['score'],
'strengths' => $response['strengths'],
'improvements' => $response['improvements'],
'conversation_id' => $response->conversationId,
]);
}
}
This is a production-ready Laravel 13 AI agent. It combines conversation memory, a custom tool, structured output, and PHP attribute-based configuration, while keeping everything clean and structured. On top of that, it leverages the smartest available model, which helps ensure reliable and consistent results.
As a result, everything works seamlessly together. This gives you a scalable and maintainable AI setup right inside your Laravel application.
Final Thoughts
Laravel 13 AI agents are genuinely one of the most exciting additions to the framework. And once you go through this guide, it becomes clear why.
To begin with, the make:agent command gives you a clean and structured starting point. At the same time, the Promptable trait takes care of much of the underlying complexity. In addition, tools are implemented as dedicated classes with a clear and consistent interface, while memory is handled automatically through RemembersConversations.
On top of that, features like streaming, queuing, and failover are all built in, which makes the overall developer experience much smoother.
Equally important, the testing support allows you to build with confidence. There are no real API calls during tests, no unexpected costs, and no unnecessary complications — just clean, testable PHP code.
This is Laravel at its best — where complex problems are simplified and powerful systems remain easy to maintain.
So, start small. Build one agent. Give it a single tool. Prompt it and observe how it works. Then, gradually expand — add memory, introduce structured output, and integrate more tools as needed.
Ultimately, the future of Laravel development is agentic. And with Laravel 13, that future is already here.
