Django AI: integrate LLM capabilities into your project

Flávio Juvenal
July 5, 2024

In the last few months, Large Language Models (LLMs) have matured and now empower autonomous AI agents with advanced capabilities like Function Calling, Tool Use, Retrieval-Augmented Generation (RAG), vector stores, and more. This impressive evolution turns AI from hype into a new paradigm for building modern software.

AI agents are already the backbone of new features in mature products and even power the entire backend of early-stage startups. Chances are you’ve recently interacted with an LLM behind the scenes in a web application without even realizing it.

Python has a strong AI ecosystem, which certainly helps when integrating AI into Django applications. But there’s still a lack of clear, production-focused tutorials for connecting Django with modern frameworks like LangChain or LlamaIndex.

Thanks to new models like GPT-4, Claude and Gemini, your agents can now call functions from your application. That means you can build assistants that help users edit their profiles, manage tasks, or trigger workflows — just by chatting.

But making this work requires a lot of boilerplate: parsing user intent, calling Django methods safely, handling structured outputs, and more.
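To make that concrete, here is a minimal sketch of the kind of dispatch code you would otherwise hand-roll: one JSON Schema per tool, plus parsing and validation of the model's tool-call payload before touching any Django method. All names and the payload shape here are illustrative, not any particular SDK's API.

```python
import json

# Illustrative boilerplate: declare a schema for each tool, then parse
# and validate the model's tool-call payload before dispatching it.
TOOL_SCHEMAS = {
    "assign_issue": {
        "type": "object",
        "properties": {
            "issue_id": {"type": "integer"},
            "user_email": {"type": "string"},
        },
        "required": ["issue_id", "user_email"],
    },
}

def assign_issue(issue_id: int, user_email: str) -> str:
    # In a real app this would check permissions and hit the database.
    return f"Issue {issue_id} assigned to {user_email}"

TOOL_REGISTRY = {"assign_issue": assign_issue}

def dispatch_tool_call(raw_call: str) -> str:
    """Parse one LLM tool call and route it to the matching function."""
    call = json.loads(raw_call)
    name, args = call["name"], call["arguments"]
    if name not in TOOL_REGISTRY:
        raise ValueError(f"Unknown tool: {name}")
    missing = [key for key in TOOL_SCHEMAS[name]["required"] if key not in args]
    if missing:
        raise ValueError(f"Missing arguments: {missing}")
    return TOOL_REGISTRY[name](**args)

print(dispatch_tool_call(
    '{"name": "assign_issue", '
    '"arguments": {"issue_id": 7, "user_email": "mary@example.org"}}'
))
```

Multiply that by every tool, every assistant, and every edge case, and the boilerplate adds up quickly.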

Curious about how LLMs can transform your product? Our team helps you seamlessly integrate AI into your Django app — from ideation to deployment.
Let’s talk AI for Django — book a free strategy call.

Introducing Django AI: seamless LLM integration for Django projects

To streamline the integration of LLMs in Django, we built Django AI, a third-party Django app to supercharge your project with AI assistants.

You can declare assistants as Python classes and expose Django methods as tools the LLM can call. Behind the scenes, Django AI handles the orchestration.

These tools can do anything a Django view can:

  • Access the database
  • Check user permissions
  • Send emails
  • Upload or download media
  • Call external APIs

You can effectively build LLM-powered features that feel native to your product — without changing your architecture.


import json

# AIAssistant and method_tool come from the Django AI package;
# Issue is this project's own model (import paths illustrative):
from django_ai_assistant import AIAssistant, method_tool

from issue_tracker.models import Issue


class ProjectManagementAIAssistant(AIAssistant):
    id = "project_assistant"
    name = "Project Management Assistant"
    instructions = (
        "You are a project management bot. "
        "Help the user manage projects using the provided tools."
    )
    model = "gpt-4o"

    @method_tool
    def get_current_user_email(self) -> str:
        """Get the current user's email"""
        return self._user.email

    @method_tool
    def get_user_assigned_issues(self, user_email: str) -> str:
        """Get the issues assigned to the provided user"""
        return json.dumps({
            "issues": list(
                Issue.objects.filter(user__email=user_email).values()
            )
        })

    @method_tool
    def assign_issue(self, issue_id: int, user_email: str) -> str:
        """Assign an issue to a user"""
        ...

    @method_tool
    def create_issue(self, title: str, description: str) -> str:
        """Create a new issue"""
        ...

Example: a custom AI assistant for task management

Here’s how you could build a project management assistant:

A user types:

"Please re-assign all my issues from project Foo to mary@example.org."

With Django AI:

  • The LLM parses the instruction
  • It calls the necessary tools (e.g. get_user_assigned_issues, assign_issue)
  • Your backend responds with updates, all via LLM tool calls

This is possible today using models like GPT-4 or Claude, and it’s powered by structured agent workflows, not just chat.
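As a rough illustration of that agent loop (a toy version, not the framework's internals), here is the flow with a scripted stand-in for the LLM: the model first requests a tool call, the backend executes it and appends the result to the conversation, and the model then answers in plain text.

```python
# Toy agent loop: the orchestration layer keeps calling the model,
# executing requested tools, until the model answers in plain text.

def fake_llm(history):
    """Scripted stand-in for the model: asks for a tool, then answers."""
    if not any(msg["role"] == "tool" for msg in history):
        return {"tool": "get_user_assigned_issues",
                "args": {"user_email": "me@example.org"}}
    return {"text": "Found 2 issues assigned to me@example.org."}

# Illustrative tool implementations keyed by name:
TOOLS = {
    "get_user_assigned_issues":
        lambda user_email: f"issues for {user_email}: [1, 2]",
}

def run_agent(user_message: str) -> str:
    """Loop until the model produces a final plain-text answer."""
    history = [{"role": "user", "content": user_message}]
    while True:
        step = fake_llm(history)
        if "text" in step:
            return step["text"]
        result = TOOLS[step["tool"]](**step["args"])
        history.append({"role": "tool", "content": result})

print(run_agent("What are my issues?"))
```

In production, the scripted function is a real LLM call and the loop is handled for you; the shape of the workflow is the same.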

Beyond chat interfaces: smarter ways to trigger AI agents in Django

Not every feature needs a full chat UI.

Let’s say your frontend team lacks the bandwidth to build a dedicated bulk reassignment UI. A simple workaround is a form that takes project_name and new_assignee_email and sends a pre-filled instruction to the AI assistant:

"Please re-assign all my issues from project {project_name} to {new_assignee_email}."

The LLM can handle the rest. No need to manually build a UI for every edge case.
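As a sketch, the form's view only needs a small helper to render the pre-filled instruction; the template text is the one above, and the function name is made up for illustration.

```python
# Hypothetical helper a Django view could call after form validation
# to build the instruction it sends to the assistant.
INSTRUCTION_TEMPLATE = (
    "Please re-assign all my issues from project {project_name} "
    "to {new_assignee_email}."
)

def build_reassign_instruction(project_name: str, new_assignee_email: str) -> str:
    """Render the pre-filled instruction for the AI assistant."""
    return INSTRUCTION_TEMPLATE.format(
        project_name=project_name,
        new_assignee_email=new_assignee_email,
    )

print(build_reassign_instruction("Foo", "mary@example.org"))
```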

Using JSON and structured data to power Django AI assistants

Modern LLMs work well with structured JSON inputs and outputs.

With Django AI, your frontend can:

  • POST a JSON instruction to the assistant
  • Receive structured results as JSON
  • Use assistants as if they were regular APIs
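A minimal sketch of that request/response shape, assuming a hypothetical payload format (the actual endpoint and field names depend on how you wire the assistant into your URLs):

```python
import json

# Hypothetical JSON payload the frontend would POST to the assistant
# endpoint, and a parser for its structured reply. Field names are
# assumptions, not a documented API.

def build_request(instruction: str) -> str:
    """Serialize the instruction the frontend POSTs to the assistant."""
    return json.dumps({
        "assistant_id": "project_assistant",
        "message": instruction,
    })

def parse_reply(raw: str) -> list:
    """Pull the structured issue list out of the assistant's JSON reply."""
    return json.loads(raw)["issues"]

payload = build_request("List my issues in project Foo")
reply = '{"issues": [{"id": 1, "title": "Fix login bug"}]}'
print(parse_reply(reply)[0]["title"])
```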

Multi-modal support (voice and image) is on our roadmap.

LangChain integration

Under the hood, Django AI uses LangChain to provide:

  • Support for vector stores
  • Retrieval-Augmented Generation (RAG)
  • Tool routing and memory
  • Integration with any LLM supported by LangChain

But don’t worry — you don’t need to learn LangChain to get started. Django AI abstracts that complexity.

The future of Django applications: AI agents, LLMs and real-world use cases

We’re not betting on AGI. But we do believe that AI assistants will become a standard way to access application functionality, especially when paired with your domain logic and internal data.

Your Django project already has the business rules and data models. AI just adds a smarter interface.

With Django AI, you can integrate production-grade LLMs into your workflows: cleanly, safely, and without reinventing your stack. We’re excited to hear your feedback.

Check out Django AI on GitHub, explore our examples, open a Discussion, or submit an issue. And if you’re building something with it — let us know!