A Go-based framework for creating AI agents using local models with Model Context Protocol (MCP) integration for connecting to external tools and resources.
Local AI Agents provides a foundation for building AI assistants that can:
- Connect to local language models (via Ollama)
- Integrate with external tools through the Model Context Protocol (MCP)
- Allow the AI model to build tools dynamically
- Control web browsers for automated tasks
- Maintain conversation history and context
- Support multiple agent personalities
- Go 1.18 or higher
- Git
- Local AI model (e.g., Ollama)
Clone this repository:

```sh
git clone https://github.com/bloxsome/local_ai_agents.git
cd Local_AI_Agents
```
Install Ollama:
- Visit the Ollama website and follow the installation instructions
- Download a model:

```sh
ollama run llama3.3:latest
```
Build the application:

```sh
go build
```

Run the application:

```sh
./local_ai_agents
```

Configuration is managed through environment variables and defaults in the config.go file.
- `/help`: Show available commands
- `/search <query>`: Perform an interactive web search
- `/context`: Show current context
- `/clear_context`: Clear the current context
- `/bullets`: Display current bullet points
- `/knowledge_tree`: Display the knowledge tree
- `/explain <concept>`: Get an explanation of a concept
- `/fact_check <statement>`: Perform a fact check on a statement
- `/profile`: Display your user profile
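Slash commands like these are typically routed through a dispatcher. A hedged sketch of how such routing might work (the handler map and function signature are hypothetical; see `internal/modules/commands/` for the real implementation):

```go
package main

import (
	"fmt"
	"strings"
)

// dispatch splits input into a command and its argument, then calls the
// matching handler. Unknown commands produce a fallback message.
func dispatch(input string, handlers map[string]func(arg string) string) string {
	parts := strings.SplitN(strings.TrimSpace(input), " ", 2)
	cmd, arg := parts[0], ""
	if len(parts) == 2 {
		arg = parts[1]
	}
	if h, ok := handlers[cmd]; ok {
		return h(arg)
	}
	return "unknown command: " + cmd
}

func main() {
	handlers := map[string]func(string) string{
		"/help":    func(string) string { return "available commands: /help, /explain" },
		"/explain": func(arg string) string { return "explaining: " + arg },
	}
	fmt.Println(dispatch("/explain goroutines", handlers))
}
```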
Run the integration tests:

```sh
cd tests
./run_tests.sh   # On Unix-like systems
run_tests.bat    # On Windows (not tested)
```

See tests/README.md for more details on testing.
Yes, AI models (like Claude) can write MCP servers for you; here we try to bring that local with Ollama. The MCP protocol follows a well-defined structure that makes it suitable for AI-assisted development:
- Start with an existing example (calculator.go or weather.go) as a template
- Define the tools and resources your server will provide
- Implement the required MCP methods (listTools, listResources, etc.)
- Add your business logic for handling tool calls
This approach allows you to rapidly prototype new MCP servers that extend your AI agent's capabilities with custom functionality or API integrations.
The repository includes an MCP Generator server that allows AI models to create their own MCP servers:
- Located in `examples/mcp_server/mcp_generator/`
- Provides tools for generating, validating, and building MCP servers
- Allows AI models to create custom MCP servers based on specifications
- Includes templates and examples for rapid development
See the MCP Generator README for details on how to use this server.
You can test if llama3.3 is able to use the MCP Generator with the provided test script:
```sh
# Install required Python package
pip install requests

# Run the test script
python examples/mcp_generator_test.py
```

This script:
- Starts the MCP Generator server
- Queries llama3.3 to create a specification for a greeting MCP server
- Uses the MCP Generator to validate, generate, and build the server
- Outputs the generated server to `./generated_mcp_server` (configurable)
You can customize the test with command-line options:
```sh
python examples/mcp_generator_test.py --model llama3.3:latest --output-dir ./my_server
```

The MCP integration allows you to extend your AI agent's capabilities by connecting to external tools and resources:
Create custom MCP servers for specialized functionality:
- See the fully functioning example in `examples/mcp_server/` (calculator server)
- Reference implementation in `internal/modules/mcp/`
Implement new MCP tools:
- Browser control: `examples/mcp_browser_server/`
- API integrations: `tests/mcp_server/weather.go` (complete weather server example)
Configure MCP settings:
- Example configuration: `examples/mcp_settings.json`
- Initialization example: `examples/mcp_initialization_example.go`
The framework uses a modular architecture:
- Create a new module in `internal/modules/`
- Implement the required interfaces
- Register your module in `main.go`
See existing modules for reference:
- Command handling: `internal/modules/commands/`
- Input processing: `internal/modules/input/`
- Logging: `internal/modules/logging/`
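A module in this style boils down to implementing an interface and registering it at startup. The `Module` interface and `register` helper below are hypothetical sketches of that pattern, assuming a shape like the one described above; check `internal/modules/` and `main.go` for the actual signatures:

```go
package main

import "fmt"

// Module is an assumed version of the interface framework modules
// implement; the real interface may carry more methods.
type Module interface {
	Name() string
	Init() error
}

// loggingModule is a toy module used to illustrate registration.
type loggingModule struct{}

func (loggingModule) Name() string { return "logging" }
func (loggingModule) Init() error  { fmt.Println("logging module ready"); return nil }

// registry mirrors the registration step that would live in main.go.
var registry []Module

func register(m Module) { registry = append(registry, m) }

func main() {
	register(loggingModule{})
	for _, m := range registry {
		if err := m.Init(); err != nil {
			panic(err)
		}
	}
}
```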
Modify agent behavior by:
- Updating prompt templates in the configuration
- Adding new command handlers
- Implementing custom memory/context management
Additional documentation can be found in the docs/ directory.
Contributions are welcome. Please submit pull requests or open issues to improve the project.
This project is licensed under the MIT License. See LICENSE for details.