8  Tool Calling and Agents

These are additional techniques for utilizing local LLMs that I want to learn, explore, and share.

8.1 Tool Calling

[deepseek-r1:7b]

Tool calling in the context of Large Language Models (LLMs) refers to a method where the model delegates specific subtasks or data retrieval processes to external tools, databases, or services. This approach allows LLMs to handle complex tasks more efficiently by leveraging specialized resources while maintaining higher-level reasoning capabilities internally.

Key Points:

  1. Delegation of Tasks: Tool calling enables an LLM to break down intricate problems into manageable parts, delegating complex subtasks to external tools. For example, for a weather forecast request, the model might delegate data retrieval to a reliable meteorological service.

  2. Efficiency and Resource Management: By offloading tasks that require external knowledge or resources, tool calling allows the LLM to stay focused on higher-level reasoning without being bogged down by internal processing limitations.

  3. Access to External Resources: This method often involves accessing databases, APIs, or other systems, which can provide real-time data and specialized information crucial for accurate responses.

  4. Privacy Considerations: Safeguards are in place to ensure that access to external tools does not expose sensitive information, maintaining data security and anonymization.

  5. Task Limitations and Scalability: There is a boundary between tasks the LLM can handle natively and those it must outsource. Tool calling contributes to scalability but may be constrained by processing power and network capabilities when handling multiple complex tasks simultaneously.

In essence, tool calling enhances an LLM’s functionality by expanding its problem-solving capabilities beyond basic text generation, allowing it to address a broader range of tasks efficiently and effectively.

8.1.1 Models with Tool Calling

For Ollama, you can identify LLMs with tool-calling capability from the model descriptions on Ollama’s model page, or from the tools icon on that page. For some models, you can also find it on the Details > template page. For example, the template text for Qwen2.5 (https://ollama.com/library/qwen2.5:latest/blobs/eb4402837c78) is:

{{- if .Messages }}
{{- if or .System .Tools }}<|im_start|>system
{{- if .System }}
{{ .System }}
{{- end }}
{{- if .Tools }}

# Tools

You may call one or more functions to assist with the user query.

You are provided with function signatures within <tools></tools> XML tags:
<tools>
{{- range .Tools }}
{"type": "function", "function": {{ .Function }}}
{{- end }}
</tools>

For each function call, return a json object with function name and arguments within <tool_call></tool_call> XML tags:
<tool_call>
{"name": <function-name>, "arguments": <args-json-object>}
</tool_call>
{{- end }}<|im_end|>
{{ end }}
{{- range $i, $_ := .Messages }}
{{- $last := eq (len (slice $.Messages $i)) 1 -}}
{{- if eq .Role "user" }}<|im_start|>user
{{ .Content }}<|im_end|>
{{ else if eq .Role "assistant" }}<|im_start|>assistant
{{ if .Content }}{{ .Content }}
{{- else if .ToolCalls }}<tool_call>
{{ range .ToolCalls }}{"name": "{{ .Function.Name }}", "arguments": {{ .Function.Arguments }}}
{{ end }}</tool_call>
{{- end }}{{ if not $last }}<|im_end|>
{{ end }}
{{- else if eq .Role "tool" }}<|im_start|>user
<tool_response>
{{ .Content }}
</tool_response><|im_end|>
{{ end }}
{{- if and (ne .Role "assistant") $last }}<|im_start|>assistant
{{ end }}
{{- end }}
{{- else }}
{{- if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
{{ .Prompt }}<|im_end|>
{{ end }}<|im_start|>assistant
{{ end }}{{ .Response }}{{ if .Response }}<|im_end|>{{ end }}
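Alternatively, you can inspect a model’s chat template locally with Ollama’s CLI (assuming the model has already been pulled):

```shell
# Print the chat template of a locally pulled model
ollama show qwen2.5 --template
```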

8.1.2 Calling R Functions

For tool calling, we will use the ellmer package, which supports tool calling of R functions (see https://ellmer.tidyverse.org/reference/tool.html and https://ellmer.tidyverse.org/articles/tool-calling.html for details).

We start with the first example from https://ellmer.tidyverse.org/reference/tool.html. First, we define the metadata that the model will use to understand when it needs to call the tool,

library(ellmer)
# define the metadata
tool_rnorm = tool(
  stats::rnorm,
  name = "rnorm",
  description = "Draw numbers from a random normal distribution",
  arguments = list(
    n = type_integer("The number of observations. Must be a positive integer."),
    mean = type_number("The mean value of the distribution."),
    sd = type_number("The standard deviation of the distribution. Must be a non-negative number.")
  )
)
tool_rnorm
# <ellmer::ToolDef> rnorm(n, mean, sd)
# @name: rnorm
# @description: Draw numbers from a random normal distribution
# @convert: TRUE
#
function (n, mean = 0, sd = 1) 
.Call(C_rnorm, n, mean, sd)
<bytecode: 0x55c04d2f1cd0>
<environment: namespace:stats>
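Before wiring the tool into a chat, we can sanity-check the wrapped function directly; this is plain stats::rnorm, independent of ellmer, so it shows exactly what the model will trigger:

```r
# The tool wraps stats::rnorm; calling it directly previews the behaviour
set.seed(888)
x = stats::rnorm(n = 10, mean = 120, sd = 15)
length(x)       # 10 draws
round(mean(x))  # sample mean, near the requested 120
```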

Then, we specify the local LLM that we want to use, followed by registering the tool.

chat = chat_ollama(model = "qwen3.5:4b", params = params(seed = 888))
chat$register_tool(tool_rnorm)
chat$get_tools() # ensure it gets registered
$rnorm
# <ellmer::ToolDef> rnorm(n, mean, sd)
# @name: rnorm
# @description: Draw numbers from a random normal distribution
# @convert: TRUE
#
function (n, mean = 0, sd = 1) 
.Call(C_rnorm, n, mean, sd)
<bytecode: 0x55c04d2f1cd0>
<environment: namespace:stats>

Finally, we can now try out the tool calling ability of the LLM,

chat$chat("Give me ten numbers from a random normal distribution with mean of 120 and standard deviation of 15. Do not comment, do not elaborate")
[99.8169, 122.269, 120.8879, 126.3081, 132.7487, 119.0378, 97.7111, 123.7985, 
115.3897, 118.1016]

We compare that to the response without tool calling, using rollama’s query(),

rollama::query("Give me ten numbers from a random normal distribution with mean of 120 and standard deviation of 15.",
               model = "qwen3.5", output = "text", screen = FALSE,
               model_params = list(seed = 888)) |> cat()
Here are ten numbers sampled from a normal distribution with a mean of 120 and a standard deviation of 15:

1.  **137.85**
2.  **106.31**
3.  **122.09**
4.  **113.74**
5.  **131.28**
6.  **108.92**
7.  **125.63**
8.  **111.07**
9.  **134.46**
10. **119.42**

*(Note: These values are simulated for illustrative purposes based on the specified parameters and are rounded to two decimal places.)*
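As a quick check, the tool-generated draws behave like a genuine sample from N(120, 15), whereas the non-tool numbers are merely plausible-looking text. Using the values returned by the tool call above:

```r
# Values returned by the rnorm tool call above
x = c(99.8169, 122.269, 120.8879, 126.3081, 132.7487,
      119.0378, 97.7111, 123.7985, 115.3897, 118.1016)
mean(x)  # sample mean, near the requested 120
sd(x)    # sample sd; with n = 10, some deviation from 15 is expected
```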

8.1.3 Calling Custom Functions

If you were to ask any LLM this simple question, “What’s the time now?”, you will be greeted with a standard answer, i.e. “I don’t know” or one of its variants,

rollama::query("What's the time now?", "qwen3.5",
               output = "text", screen = FALSE,
               model_params = list(seed = 888)) |> cat()
I don't have access to real-time data like clocks or dates. To find out the current time, you can:  
- Check your device's clock (phone, computer, etc.).  
- Ask a voice assistant like Siri, Alexa, or Google Assistant.  
- Search "time" in your browser for a quick lookup!  

⏰ Stay synchronized with the moment! 😊

This necessitates giving our buddy access to the current date and time.

Now, we modify the function from https://ellmer.tidyverse.org/articles/tool-calling.html to output the current time.

# Get current date and time in %A, %e %B %Y, %-I:%M:%S%p strftime format
get_current_time <- function() {
  date_time = format(Sys.time(), "%A, %e %B %Y, %-I:%M:%S%p")
  print(date_time)
}
get_current_time()
[1] "Wednesday,  4 March 2026, 4:08:58PM"
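Note that print() returns its argument invisibly, so the function above both prints and returns the timestamp. A variant (a hypothetical alternative, not from the ellmer article) returns it without printing; also note that %-I (hour without a leading zero) is a GNU/BSD strftime extension and may not work on all platforms:

```r
# Variant that returns the formatted timestamp without printing it
get_current_time_quiet = function() {
  format(Sys.time(), "%A, %e %B %Y, %-I:%M:%S%p")
}
get_current_time_quiet()
```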

It’s too much of a hassle to write the tool definition by hand, so we ask our buddy for help with creating the tool definition using create_tool_def(),

chat_def = chat_ollama(model = "qwen2.5:14b", params = params(seed = 888))
create_tool_def(get_current_time, chat = chat_def)
It looks like the provided documentation is incomplete, but based on the 
function name `get_current_time`, we can infer that it likely returns the 
current time. Here's a reasonable guess at the tool call definition:

```r
tool(
  get_current_time,
  name = "get_current_time",
  description = "Get the current time",
  arguments = list(
    # Since no specific arguments are described, we assume it has no arguments.
  ),
)
```

If you have more details about the function, please provide them!

In fact, for our first example, we can also ask our buddy for help with creating the tool definition,

create_tool_def(stats::rnorm, chat = chat_def)
tool(
  stats::rnorm,
  name = "rnorm",
  description = "Generate random deviates from a normal distribution",
  arguments = list(
    n = type_integer("Number of observations to generate"),
    mean = type_number(
      "Mean of the distribution. Defaults to 0",
      required = FALSE
    ),
    sd = type_number(
      "Standard deviation of the distribution. Defaults to 1",
      required = FALSE
    )
  ),
)

Now, we define (with some additional information),

tool_get_current_time = tool(
  get_current_time,
  name = "get_current_time",
  description = "Returns current date and time in %A, %e %B %Y, %-I:%M:%S%p strftime format",
  arguments = list(
    # No arguments required here
  )
)
tool_get_current_time
# <ellmer::ToolDef> get_current_time()
# @name: get_current_time
# @description: Returns current date and time in %A, %e %B %Y, %-I:%M:%S%p 
strftime format
# @convert: TRUE
#
function () 
{
    date_time = format(Sys.time(), "%A, %e %B %Y, %-I:%M:%S%p")
    print(date_time)
}

and register the tool,

chat$register_tool(tool_get_current_time)
chat$get_tools() # ensure it gets registered
$rnorm
# <ellmer::ToolDef> rnorm(n, mean, sd)
# @name: rnorm
# @description: Draw numbers from a random normal distribution
# @convert: TRUE
#
function (n, mean = 0, sd = 1) 
.Call(C_rnorm, n, mean, sd)
<bytecode: 0x55c04d2f1cd0>
<environment: namespace:stats>

$get_current_time
# <ellmer::ToolDef> get_current_time()
# @name: get_current_time
# @description: Returns current date and time in %A, %e %B %Y, %-I:%M:%S%p 
strftime format
# @convert: TRUE
#
function () 
{
    date_time = format(Sys.time(), "%A, %e %B %Y, %-I:%M:%S%p")
    print(date_time)
}

Lastly, we ask several questions,

chat$chat("What's the time now?")
[1] "Wednesday,  4 March 2026, 4:09:09PM"
Wednesday, 4 March 2026, 4:09:09 PM
chat$chat("What day is it today?")
Wednesday
chat$chat("Is it morning, afternoon, or evening right now?")
Afternoon

which match the correct date and time on my PC at the time of writing. Note that, for this particular seed, the model replies with the full date and time for the first query rather than the time alone.

8.2 AI Agents

An AI agent is an “AI model capable of reasoning, planning, and interacting with its environment” (Hugging Face, 2025). The agent reasons and plans, then takes action by calling suitable tools, possibly via the tool calling we discussed above.
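This reason-act-observe cycle can be sketched as a loop: the model either answers or requests a tool, and the tool result is fed back as a new observation. Below is a minimal illustrative sketch; ask_llm and the reply format with $tool, $args, and $text fields are hypothetical stand-ins (ellmer runs an equivalent loop internally when tools are registered):

```r
# Minimal agent loop: reason (ask the model), act (call a tool), observe (append result)
run_agent = function(ask_llm, tools, question, max_steps = 5) {
  history = question
  for (i in seq_len(max_steps)) {
    reply = ask_llm(history)                             # reason/plan
    if (is.null(reply$tool)) return(reply$text)          # final answer, stop
    result = do.call(tools[[reply$tool]], reply$args)    # act: call the tool
    history = c(history, paste("tool result:", result))  # observe
  }
  "Stopped: step limit reached"
}
```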

Potential academic uses:

  • Web search
  • Literature review (LR) search and summary
  • and more …

In progress …

References

Hugging Face. (2025). What is an agent? Retrieved from https://huggingface.co/learn/agents-course/en/unit1/what-are-agents