Improve support for LLM usage#2437

Merged
gunthercox merged 14 commits into master from llm on Feb 7, 2026

Conversation

@gunthercox
Owner

This pull request removes the experimental model parameter and introduces a more consistent implementation that uses the LogicAdapter pattern.

Additionally, this introduces the concept that each logic adapter can identify itself as a "tool" that can be used by a defined LLM.
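As a rough illustration of that idea, an adapter acting as a "tool" needs to describe itself in the function-calling schema format that tool-capable LLM APIs (including Ollama's) accept. The sketch below is hypothetical, not ChatterBot's actual implementation; the class name and the `as_tool` method are invented for illustration:

```python
# Hypothetical sketch: a logic adapter describing itself as an LLM tool.
# The class name and as_tool() method are illustrative, not ChatterBot's API.

class TimeAdapterSketch:
    """Stands in for something like TimeLogicAdapter."""

    name = 'get_current_time'
    description = 'Return the current time as a string.'

    def as_tool(self) -> dict:
        # JSON-schema style function description, as used by most
        # LLM tool-calling APIs.
        return {
            'type': 'function',
            'function': {
                'name': self.name,
                'description': self.description,
                'parameters': {'type': 'object', 'properties': {}},
            },
        }

# A list of such schemas is what would ultimately be handed to the model.
tools = [TimeAdapterSketch().as_tool()]
```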

Examples

Basic LLM with tools:

bot = ChatBot('Bot',
    logic_adapters=[
        {
            'import_path': 'chatterbot.logic.OllamaLogicAdapter',
            'model': 'llama3.1',
            'logic_adapters_as_tools': [
                'chatterbot.logic.MathematicalEvaluation',
                'chatterbot.logic.TimeLogicAdapter',
                'chatterbot.logic.UnitConversion'
            ]
        }
    ]
)

Hybrid configuration (LLM + standalone adapters for consensus):

bot = ChatBot('Bot',
    logic_adapters=[
        {
            'import_path': 'chatterbot.logic.OllamaLogicAdapter',
            'model': 'llama3.1',
            'logic_adapters_as_tools': [
                'chatterbot.logic.MathematicalEvaluation',
                'chatterbot.logic.TimeLogicAdapter'
            ]
        },
        'chatterbot.logic.BestMatch',  # Participates in consensus voting
        'chatterbot.logic.SpecificResponseAdapter'
    ]
)
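For context, "consensus voting" here refers to ChatterBot's usual behavior of polling each top-level logic adapter and selecting the response with the highest confidence. A minimal sketch of that selection step (illustrative only; the candidate data is invented):

```python
# Illustrative sketch of confidence-based selection among adapter responses.
# The (text, confidence) pairs are invented example data, not real output.

def select_response(candidates):
    """Return the candidate (text, confidence) pair with the highest confidence."""
    return max(candidates, key=lambda pair: pair[1])

candidates = [
    ('It is 9 o\'clock.', 0.6),   # e.g. from TimeLogicAdapter
    ('The answer is 42.', 0.9),   # e.g. from the LLM adapter
    ('I am not sure.', 0.1),      # e.g. from BestMatch
]
best = select_response(candidates)  # → ('The answer is 42.', 0.9)
```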

Tool-capable adapter with parameters:

bot = ChatBot('Bot',
    logic_adapters=[
        {
            'import_path': 'chatterbot.logic.OllamaLogicAdapter',
            'model': 'mistral',
            'host': 'http://localhost:11434',
            'force_native_tools': True,
            'logic_adapters_as_tools': [
                {
                    'import_path': 'chatterbot.logic.MathematicalEvaluation',
                    'language': 'ENG'  # Adapter-specific parameters
                },
                'chatterbot.logic.TimeLogicAdapter'
            ]
        }
    ]
)
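The `force_native_tools` flag suggests there is a choice between the model's native tool-calling API and some fallback (such as describing tools in the prompt) for models without native support. How that choice might be made can be sketched as follows; the helper function and its semantics are assumptions for illustration, not ChatterBot's actual logic:

```python
# Hypothetical helper sketching a native-vs-fallback tool mode decision.
# The function name and return values are invented for illustration.

def choose_tool_mode(model_supports_tools: bool, force_native: bool) -> str:
    """Pick how tools are exposed to the model."""
    if force_native or model_supports_tools:
        # Pass tool schemas via the API's tools= parameter.
        return 'native'
    # Otherwise describe the available tools in the prompt text instead.
    return 'prompt'
```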

response = self.client.chat(
    model=self.model,
    messages=messages
)
Owner Author


This can also take a few other parameters depending on features the model supports. For example, thinking models:

response = chat(model="qwen3", messages=messages, tools=[...], think=True)

https://docs.ollama.com/capabilities/tool-calling#python

@gunthercox gunthercox merged commit 84403d5 into master Feb 7, 2026
8 checks passed
@gunthercox gunthercox deleted the llm branch February 7, 2026 19:04