LLM Context

LLM Context is a tool that helps developers quickly inject relevant content from code/text projects into Large Language Model chat interfaces. It leverages .gitignore patterns for smart file selection and provides both a streamlined clipboard workflow using the command line and direct LLM integration through the Model Context Protocol (MCP).

Note: This project was developed in collaboration with Claude-3.5-Sonnet, using LLM Context itself to share code during development. All code in the repository is human-curated (by me 😇, @restlessronin).

Why LLM Context?

For an in-depth exploration of the reasoning behind LLM Context and its approach to AI-assisted development, check out our article: LLM Context: Harnessing Vanilla AI Chats for Development.

Current Usage Patterns

  • Direct LLM Integration: Native integration with Claude Desktop via MCP protocol
  • Chat Interface Support: Works with any LLM chat interface via CLI/clipboard
    • Optimized for interfaces with persistent context like Claude Projects and Custom GPTs
    • Works equally well with standard chat interfaces
  • Project Types: Suitable for code repositories and collections of text/markdown/html documents
  • Project Size: Optimized for projects that fit within an LLM’s context window. Large project support is in development.

Installation

Install LLM Context using uv:

uv tool install llm-context

To upgrade to the latest version:

uv tool upgrade llm-context

Warning: LLM Context is under active development. Updates may overwrite configuration files prefixed with lc-. We recommend backing up any customized files before updating.

Quickstart

MCP with Claude Desktop

Add to ‘claude_desktop_config.json’:

{ "mcpServers": { "CyberChitta": { "command": "uvx", "args": ["--from", "llm-context", "lc-mcp"] } } }

Once configured, you can start working with your project in two simple ways:

  1. Say: “I would like to work with my project”. Claude will then ask you for the project root path.

  2. Or specify the path directly: “I would like to work with my project /path/to/your/project”. Claude will automatically load the project context.

CLI Quick Start and Typical Workflow

  1. Navigate to your project’s root directory
  2. Initialize repository: lc-init (only needed once)
  3. (Optional) Edit .llm-context/config.toml to customize ignore patterns
  4. Select files: lc-sel-files
  5. (Optional) Review selected files in .llm-context/curr_ctx.toml
  6. Generate context: lc-context
  7. Use with your preferred interface:
    • Project Knowledge (Claude Pro): Paste into knowledge section
    • GPT Knowledge (Custom GPTs): Paste into knowledge section
    • Regular chats: Use lc-set-profile code-prompt first to include instructions
  8. When the LLM requests additional files:
    • Copy the file list from the LLM
    • Run lc-read-cliplist
    • Paste the contents back to the LLM
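
Putting these steps together, a typical first session run from the project root might look like the following sketch (the lc-set-profile line is only needed for regular chats, as noted above):

cd /path/to/your/project
lc-init                      # one-time project configuration
# (optional) edit .llm-context/config.toml to customize ignore patterns
lc-set-profile code-prompt   # only for regular chats, to include instructions
lc-sel-files                 # select files for inclusion
lc-context                   # generate context and copy it to the clipboard
# paste the context into your chat interface; when the LLM asks for more files,
# copy its file list, then:
lc-read-cliplist             # process the request and copy the file contents
# paste the result back into the chat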

Core Commands

  • lc-init: Initialize project configuration
  • lc-set-profile <name>: Switch profiles
  • lc-sel-files: Select files for inclusion
  • lc-context: Generate and copy context
  • lc-prompt: Generate project instructions for LLMs
  • lc-read-cliplist: Process LLM file requests
  • lc-changed: List files modified since last context generation
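
For later sessions, lc-changed helps decide whether the context needs refreshing; a minimal sketch, assuming the project was initialized earlier:

lc-changed      # list files modified since the last context generation
lc-sel-files    # re-select files if anything relevant changed
lc-context      # regenerate and copy the updated context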

Features & Advanced Usage

LLM Context provides advanced features for customizing how project content is captured and presented:

  • Smart file selection using .gitignore patterns
  • Multiple profiles for different use cases
  • Code outline generation for supported languages
  • Customizable templates and prompts
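
As a purely hypothetical sketch of what such customization could look like (the key names below are illustrative only, not the tool’s actual schema; the User Guide documents the real .llm-context/config.toml options):

# hypothetical example — key names are illustrative, not the actual schema
[gitignores]
# extra gitignore-style patterns, in addition to the project's .gitignore
extra_patterns = ["*.lock", "dist/", "docs/generated/"]

[profiles.docs-only]
# a profile that narrows selection to text/markdown documents
include = ["**/*.md", "**/*.rst"]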

See our User Guide for detailed documentation of these features.

Similar Tools

Check out our comprehensive list of alternatives; the sheer number of tools tackling this problem demonstrates its importance to the developer community.

Acknowledgments

LLM Context evolves from a lineage of AI-assisted development tools:

  • This project succeeds LLM Code Highlighter, a TypeScript library I developed for IDE integration.
  • The concept originated from my work on RubberDuck and continued with later contributions to Continue.
  • LLM Code Highlighter was heavily inspired by Aider Chat. I worked with GPT-4 to translate several Aider Chat Python modules into TypeScript, maintaining functionality while restructuring the code.
  • This project uses tree-sitter tag query files from Aider Chat.
  • LLM Context exemplifies the power of AI-assisted development, transitioning from Python to TypeScript and back to Python with the help of GPT-4 and Claude-3.5-Sonnet.

I am grateful for the open-source community’s innovations and for the AI assistance of Claude-3.5-Sonnet, both of which have shaped this project’s evolution.

License

This project is licensed under the Apache License, Version 2.0. See the LICENSE file for details.
