Overview
The code demonstrates a practical example of how to inject user-defined functions into an LLM, allowing it to interpret natural language queries and execute the appropriate mathematical operations. The implementation uses Ollama, a framework for running large language models locally.
Key Components
1. Required Setup
A local Ollama server must be running (https://ollama.com). Refer to the Ollama documentation for setup instructions.
The code supports multiple LLM options, including:
- llama3.2
- llama3.1
- qwen2.5-coder (used in this example)
Refer to the Ollama model library for details on how to set up different LLMs: https://ollama.com/search
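As a quick sanity check (a minimal sketch using the ollama Python package; whether the original project includes such a step is not shown here), you can confirm the local server is reachable and pull the example model:

```python
import ollama

# List locally installed models; this fails if the Ollama server is not running
print(ollama.list())

# Pull the model used in this example if it is not already available locally
ollama.pull("qwen2.5-coder")
```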
2. Core Mathematical Functions
The code implements four basic mathematical operations. Starting from a simple addition function and following the same principle, the following functions can be created:
- add_two_numbers(a, b)
- subtract_two_numbers(a, b)
- multiply_two_numbers(a, b)
- divide_two_numbers(a, b)
Each function is well-documented with proper type hints and docstrings, making the code maintainable and self-explanatory.
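For illustration, here is a sketch of what these functions can look like, based on the names above; the exact implementations in the repository may differ in detail:

```python
def add_two_numbers(a: int, b: int) -> int:
    """Add two numbers and return the sum."""
    return a + b


def subtract_two_numbers(a: int, b: int) -> int:
    """Subtract the second number from the first and return the difference."""
    return a - b


def multiply_two_numbers(a: int, b: int) -> int:
    """Multiply two numbers and return the product."""
    return a * b


def divide_two_numbers(a: int, b: int) -> float:
    """Divide the first number by the second and return the quotient."""
    return a / b
```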
3. Function Registry
The available functions are stored in a dictionary for easy access:
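A minimal sketch of such a registry, reusing the functions from the previous sketch (the name available_functions is an assumption, not necessarily what the repository uses):

```python
# Map the tool names the model will refer to onto the actual callables
available_functions = {
    "add_two_numbers": add_two_numbers,
    "subtract_two_numbers": subtract_two_numbers,
    "multiply_two_numbers": multiply_two_numbers,
    "divide_two_numbers": divide_two_numbers,
}
```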
4. Interactive Chat Loop
The program runs in an interactive loop where:
- Users input mathematical questions in natural language
- The LLM interprets the question and selects the appropriate function
- The selected function is executed with the parsed parameters
- The result is displayed to the user
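A minimal sketch of that loop structure, assuming a hypothetical process_query helper that wraps the chat call shown later under Technical Implementation Details:

```python
def run_calculator(process_query):
    """Read questions from the user, delegate them to the LLM, and print results."""
    while True:
        prompt = input("Ask a math question (or 'quit' to exit): ")
        if prompt.strip().lower() in {"quit", "exit"}:
            break
        try:
            result = process_query(prompt)  # the LLM picks and runs a tool
            print(f"Result: {result}")
        except Exception as exc:  # keep the loop alive on unexpected errors
            print(f"Something went wrong: {exc}")
```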
Usage Examples
The calculator can understand various forms of input:
- "What is 1 + 1?"
- "Add 1 and 1"
- "1 plus 1"
- "one plus one"

This flexibility in input processing demonstrates the power of using LLMs for natural language understanding.
Technical Implementation Details
The core of the implementation uses Ollama's chat API with function injection:
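Here is a sketch of what that call can look like with the ollama Python package, which in recent versions accepts plain Python functions as tools; it reuses the functions and registry from the earlier sketches, and the exact wiring in the repository may differ:

```python
import ollama


def process_query(prompt: str):
    """Send the prompt to the model with the math functions injected as tools."""
    response = ollama.chat(
        model="qwen2.5-coder",
        messages=[{"role": "user", "content": prompt}],
        tools=[add_two_numbers, subtract_two_numbers,
               multiply_two_numbers, divide_two_numbers],
    )
    # Execute each tool call the model requested, using the registry
    for tool_call in response.message.tool_calls or []:
        fn = available_functions.get(tool_call.function.name)
        if fn is not None:
            return fn(**tool_call.function.arguments)
    # Fall back to the model's plain-text reply if no tool was called
    return response.message.content
```

Passing this process_query into the run_calculator sketch above gives a complete interactive session.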
The LLM processes the user's input and determines which function to call along with the appropriate arguments. The program then executes the function and displays the result.
Benefits and Applications
This implementation demonstrates several key concepts:
- Function Injection: How to extend LLM capabilities with custom functions
- Natural Language Processing: Converting human language to programmatic function calls
- Error Handling: Graceful handling of undefined functions and invalid inputs (see the sketch after this list)
- User Interface: A simple but effective interactive interface
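To illustrate the error-handling idea, here is a hedged sketch of a dispatch helper that tolerates unknown tool names and invalid arguments (the helper name and structure are assumptions rather than code from the repository):

```python
def call_tool(available_functions: dict, name: str, arguments: dict):
    """Look up a tool by name and run it, returning a message instead of crashing."""
    fn = available_functions.get(name)
    if fn is None:
        return f"The model requested an unknown function: {name}"
    try:
        return fn(**arguments)
    except (TypeError, ValueError, ZeroDivisionError) as exc:
        return f"Could not execute {name}: {exc}"
```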
Conclusion
This code serves as an excellent example of how to combine LLMs with custom functions to create practical applications. While this implementation focuses on basic arithmetic, the same pattern can be applied to more complex use cases, from data analysis to automation tasks. It is a useful starting point for anyone who wants to:
- Understand function injection in LLMs
- Build natural language interfaces for their applications
- Learn about integrating Ollama into their projects
This implementation showcases the power of combining traditional programming with LLMs to create more intuitive and flexible user interfaces for computational tasks.
The full code is available here:
https://github.com/crysanthus/llm-with-tools.git