LoLLMs is all about flexibility and power, and one of its most potent features is the ability to extend its capabilities with Function Calls. These are specialized tools the AI can learn to use, or hooks that can modify the conversation flow. While you can code these manually, LoLLMs includes built-in helper functions designed to guide the AI in generating the necessary code for you!
Recently, you might have noticed two distinct functions available for this purpose: one for building “Classic” function calls and another for “Context Update” function calls. While both share the goal of helping you create new functions, they build fundamentally different types of tools.
Confused about which one to ask for? Let’s break it down!
The Core Difference: What Kind of Function Do You Want to Create?
The key distinction lies not in the builder functions themselves, but in the type of function call they help you generate:
- The “Classic” Builder (build_classic_function_call)
  - Purpose: Helps you create standard, action-oriented functions (FunctionType.CLASSIC).
  - Think of these as: Tools the AI explicitly calls upon to do something specific during its thought process or when requested.
  - Key Method: These functions primarily rely on an execute(…) method (see the first sketch after this list).
  - Behavior: When the AI decides to use a Classic function, the execute method runs, performs its task (e.g., saving a file, searching the web, running a calculation, calling an API), and typically returns a result string to the AI or the user.
  - Example Use Cases:
    - A function to save the current conversation to a file.
    - A function to search Wikipedia for a specific term.
    - A function to execute a simple Python script.
    - A function to fetch the current weather.
  - How the Builder Works (Simplified): It injects detailed instructions and templates (a YAML config and Python code with an execute method) into the context. The main LLM then reads your request and these instructions to generate the necessary code blocks in its response. The builder’s process_output step then extracts and saves those code blocks.
- The “Context Update” Builder (build_context_update_function)
  - Purpose: Helps you create functions that modify the interaction before the AI generates its main response, or after it finishes (FunctionType.CONTEXT_UPDATE).
  - Think of these as: Pre-processors or post-processors for the AI’s input and output.
  - Key Methods: These functions rely on update_context(…) and/or process_output(…); they typically don’t have an execute method (see the second sketch after this list).
  - Behavior:
    - update_context: Runs before the AI starts generating its main response. It injects text, data, or instructions directly into the context the AI will see.
    - process_output: Runs after the AI has finished generating its response. It can modify the AI’s output, perform actions based on the output, or add follow-up information.
  - Example Use Cases:
    - Injecting today’s date and time into the context automatically.
    - Adding specific domain knowledge or rules before the AI answers a technical question.
    - Automatically formatting the AI’s code output after generation.
    - Logging specific types of AI responses to a separate file.
  - How the Builder Works (Simplified): This builder is more proactive. During its own update_context phase, it uses internal LLM calls to generate the YAML config first and saves it, then generates the Python code (with update_context/process_output methods) based on that YAML and saves it as well. Finally, it adds messages to the context telling the main LLM that the files were created successfully.
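To make the difference concrete, here is a minimal sketch of what a generated Classic function might look like. It is illustrative only: the FunctionType.CLASSIC marker and the execute method are taken from the description above, while the class layout, constructor, and exact execute signature are assumptions rather than the verified LoLLMs API, and the YAML config that normally accompanies the code is omitted.

```python
# Illustrative sketch only: the class layout, constructor, and execute() signature
# are assumptions, not the verified LoLLMs API.
from enum import Enum
from datetime import datetime
from pathlib import Path


class FunctionType(Enum):  # assumed stand-in for LoLLMs' FunctionType enum
    CLASSIC = 0
    CONTEXT_UPDATE = 1


class SaveConversationFunction:
    """A Classic (action) function: the AI calls it explicitly and gets a result string back."""

    def __init__(self):
        self.function_type = FunctionType.CLASSIC
        self.name = "save_conversation"  # metadata like this would normally live in the YAML config

    def execute(self, conversation_text: str, *args, **kwargs) -> str:
        """Perform the action and return a short result string for the AI/user."""
        out_path = Path("conversation_backup.txt")
        out_path.write_text(conversation_text, encoding="utf-8")
        return f"Conversation saved to {out_path} at {datetime.now():%Y-%m-%d %H:%M}."
```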
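And here is a matching sketch of a Context Update function under the same assumptions. Only the update_context/process_output method names and the FunctionType.CONTEXT_UPDATE role come from the description above; how the context text and the AI’s output are passed in and returned is a hypothetical simplification.

```python
# Illustrative sketch only: how the context and the AI's output are passed around
# here is an assumption; only the method names and the CONTEXT_UPDATE role come
# from the description above.
from enum import Enum
from datetime import datetime


class FunctionType(Enum):  # same assumed stand-in enum as in the previous sketch
    CLASSIC = 0
    CONTEXT_UPDATE = 1


class DateTimeInjectorFunction:
    """A Context Update function: it pre-/post-processes the exchange instead of being called as a tool."""

    def __init__(self):
        self.function_type = FunctionType.CONTEXT_UPDATE

    def update_context(self, context_text: str) -> str:
        """Runs BEFORE generation: inject today's date and time into the context the AI will see."""
        stamp = datetime.now().strftime("%A, %Y-%m-%d %H:%M")
        return f"Current date and time: {stamp}\n{context_text}"

    def process_output(self, ai_output: str) -> str:
        """Runs AFTER generation: append follow-up text (e.g., a disclaimer) to the AI's reply."""
        return ai_output + "\n\nNote: the date and time in this reply were injected automatically."
```

In practice, the builders generate both the Python file and its accompanying YAML config for you; these sketches are only meant to show which method each function type revolves around.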
Which Builder Should You Ask For?
- Need a tool for the AI to perform a specific action and get a result back?
  Ask LoLLMs to help you build a classic function call.
  Example: “Help me build a classic function call to search my local document folder.”
- Need to automatically add information to the context before the AI thinks, or process the AI’s output after it’s done?
  Ask LoLLMs to help you build a context update function call.
  Example: “Help me build a context update function call that adds a legal disclaimer to the end of every AI response.”
Empowering Your LoLLMs Experience
Understanding the difference between these two function types allows you to precisely tailor LoLLMs’ behavior. By using the appropriate builder, you can easily ask the AI to generate the scaffolding for your custom tools, significantly speeding up the development process.
So, next time you have an idea for extending LoLLMs, think about whether you need an “action tool” (Classic) or a “context/output modifier” (Context Update), and ask the corresponding builder for help. Happy customizing!