Bring AI features to your Django project: Django AI Assistant

Flávio Juvenal
July 5, 2024

In the last few months, Large Language Models have matured and now empower autonomous AI agents with capabilities like Function / Tool Calling, Retrieval-Augmented Generation, vector stores, etc. This impressive evolution transforms AI from hype into a new paradigm for building software applications. AI agents and assistants are now the backbone of new features in existing products and are even the entire backend of new startups. Chances are you recently interacted with an LLM and didn't even notice, since it's possible to use them behind the scenes in any web application.

Python has a mature ecosystem of AI libraries, which certainly helps in adding AI features to Django apps. However, there's a shortage of tutorials on how to integrate Django with popular frameworks such as LangChain and LlamaIndex. While using AI frameworks to make a ChatGPT clone is trivial, things get complicated fast when you need more features.

New LLMs can call functions on your application's side, so you can, for example, build an AI assistant inside Django that helps your users edit their profiles via chat. It's not simple to implement that, though. A lot of boilerplate code is necessary to enable LLMs to perform queries and call functions on Django's side.

Introducing Django AI Assistant

To streamline the implementation of AI features inside Django, Vinta is launching Django AI Assistant, a third-party Django app to supercharge your Django project with LLM capabilities. You can declare AI assistants as Python classes and quickly implement method tools the AI can use.

These tools can do anything a Django view can, such as:

  • Accessing the database;
  • Checking permissions;
  • Sending emails;
  • Downloading and uploading media files;
  • Calling external APIs;
  • And more.

You can effectively build full applications with LLMs doing all the heavy lifting as long as the AI has access to the right functions! Here's a quick example of an AI assistant that can manage the user's tasks in project management software:

import json

from django.contrib.auth import get_user_model
from django_ai_assistant import AIAssistant, method_tool
from issues.models import Issue  # hypothetical Issue model from your app


class ProjectManagementAIAssistant(AIAssistant):
    id = "project_assistant"
    name = "Project Management Assistant"
    instructions = (
        "You are a project management bot. "
        "Help the user manage projects using the provided tools.")
    model = "gpt-4o"

    @method_tool
    def get_current_user_email(self) -> str:
        """Get the current user's email"""
        return self._user.email

    @method_tool
    def get_user_assigned_issues(self, user_email: str) -> str:
        """Get the issues assigned to the provided user"""
        return json.dumps({
            "issues": list(
                Issue.objects.filter(assignee__email=user_email)
                .values("id", "title", "description")),
        })

    @method_tool
    def assign_issue(self, issue_id: int, user_email: str) -> str:
        """Assign an issue to a user"""
        user = get_user_model().objects.get(email=user_email)
        Issue.objects.filter(id=issue_id).update(assignee=user)
        return f"Issue {issue_id} assigned to {user_email}"

    @method_tool
    def create_issue(self, title: str, description: str) -> str:
        """Create a new issue"""
        issue = Issue.objects.create(title=title, description=description)
        return f"Created issue {issue.id}"

The application user can chat with this AI assistant and give open-ended instructions like:

"Please re-assign all my issues from project Foo to bob@example.com"

Thanks to the power of new LLMs such as GPT-4, Claude 3, Gemini 1.5, and others, the AI assistant can turn such an instruction into a sequence of tool calls, correctly invoking all the right methods.
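To make that chaining concrete, here is a toy, LLM-free simulation of the tool-call sequence the model typically produces for that instruction. The in-memory ISSUES dict is a hypothetical stand-in for the Django ORM; no real assistant is involved:

```python
# Hypothetical in-memory data standing in for the Django ORM.
ISSUES = {
    1: {"project": "Foo", "assignee": "alice@example.com"},
    2: {"project": "Bar", "assignee": "alice@example.com"},
    3: {"project": "Foo", "assignee": "alice@example.com"},
}

def get_user_assigned_issues(user_email: str) -> list[int]:
    """Tool 1: list the IDs of issues assigned to a user."""
    return [i for i, d in ISSUES.items() if d["assignee"] == user_email]

def assign_issue(issue_id: int, user_email: str) -> None:
    """Tool 2: re-assign a single issue."""
    ISSUES[issue_id]["assignee"] = user_email

# The assistant first lists the user's issues, then re-assigns the ones
# in project Foo -- one assign_issue call per matching issue.
for issue_id in get_user_assigned_issues("alice@example.com"):
    if ISSUES[issue_id]["project"] == "Foo":
        assign_issue(issue_id, "bob@example.com")
```

The key point is that a single open-ended instruction fans out into several tool calls, with the LLM deciding the order and arguments.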

Beyond chat interfaces

We suggest you don't limit your AI assistant's user interface to textual conversations. Your users may not want to chat with an AI to do something like bulk re-assigning issues, but you may not have the design/dev resources to implement a dedicated UI for that feature just yet.

A quick-and-dirty solution is to build a pre-made chat message on the frontend and send it directly to the AI assistant backend at the click of a button. Just make a form where the user inputs a project_name and a new_assignee_email, then fill in a string like:

"Please re-assign all my issues from project {project_name} to {new_assignee_email}"

You can send that string straight from the frontend to the AI assistant after the user submits the form. The AI is smart enough to call assign_issue for every issue that needs it!
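As a minimal sketch of that pre-made message, assuming hypothetical form field names project_name and new_assignee_email:

```python
def build_reassign_message(project_name: str, new_assignee_email: str) -> str:
    """Build the pre-made chat message from the form's fields.

    The field names are hypothetical; the resulting string is sent to
    the assistant backend exactly as if the user had typed it in chat.
    """
    return (
        f"Please re-assign all my issues from project {project_name} "
        f"to {new_assignee_email}"
    )

# What gets sent when the user submits the form:
message = build_reassign_message("Foo", "bob@example.com")
```

The user gets a one-click bulk action, while the backend stays a plain chat endpoint.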

Beyond text input

New LLMs handle JSON very well, both as input and as output. Your frontend can even call your Django AI Assistants as if it were dealing with a traditional HTTP API by POSTing JSON. Multi-modal support with images and voice as inputs and outputs is coming soon to Django AI Assistant.
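As a sketch of the idea, assuming a hypothetical action schema (the action name and fields below are not part of Django AI Assistant's API), the backend can flatten a structured JSON request into a plain-text instruction for the assistant:

```python
import json

def json_request_to_instruction(body: str) -> str:
    """Translate a structured JSON request into the plain-text
    instruction the assistant acts on.

    The "reassign_project_issues" action and its fields are
    hypothetical, invented for this example.
    """
    payload = json.loads(body)
    if payload["action"] == "reassign_project_issues":
        return (
            f"Please re-assign all my issues from project "
            f"{payload['project']} to {payload['assignee']}"
        )
    raise ValueError(f"Unsupported action: {payload['action']}")

# Example of what the frontend might POST:
instruction = json_request_to_instruction(
    '{"action": "reassign_project_issues", "project": "Foo", '
    '"assignee": "bob@example.com"}'
)
```

This keeps the frontend contract looking like a conventional API while the LLM does the work behind it.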

LangChain framework support

Behind the scenes, Django AI Assistant uses LangChain. This allows you to use LangChain's pre-implemented tools, retrievers (RAG and vector store support are included), and any LLM LangChain supports. But you don't need to be familiar with LangChain to use Django AI Assistant.

Is AI the future for Django applications?

Regardless of the feasibility of AGI, we foresee a future where AI will power novel features of web applications in a no-frills manner. We're realists: AI won't be the main value proposition for most Django applications. But we're also optimistic: AI assistants will get better and better, with or without LLMs. Your data and functionalities are the core of your Django application. Why not let AI interface with that to build an excellent experience for your application's users? With Django AI Assistant, you can seamlessly integrate Django and AI assistants.

Please give it a try. We're excited to hear any feedback on our GitHub Issues and Discussions. Let us know what you've built, too!