Local LLM with Tool Calling Series: Python Cookbook
Running an Ollama model locally with web-search tool-calling capability
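
The pattern the title describes can be sketched as a small agent loop: send the conversation plus a tool schema to a locally running Ollama server, execute any `tool_calls` the model returns, feed the results back, and repeat until the model answers in plain text. The sketch below assumes Ollama is serving on its default port (11434) and uses a stub `web_search` function as a stand-in for a real search API; the model name `llama3.1` is also an assumption, so swap in whichever tool-capable model you have pulled.

```python
import json
import urllib.request

# Hypothetical web-search tool: a real cookbook entry would call an actual
# search API here; this stub just echoes the query with a canned result.
def web_search(query: str) -> str:
    return json.dumps({"query": query, "results": ["(stub) no live search in this sketch"]})

# Tool schema in the JSON format Ollama's /api/chat endpoint expects.
TOOLS = [{
    "type": "function",
    "function": {
        "name": "web_search",
        "description": "Search the web for up-to-date information.",
        "parameters": {
            "type": "object",
            "properties": {"query": {"type": "string", "description": "The search query"}},
            "required": ["query"],
        },
    },
}]

AVAILABLE = {"web_search": web_search}

def dispatch(tool_call: dict) -> dict:
    """Execute one tool call returned by the model; wrap the result as a tool message."""
    fn = tool_call["function"]
    result = AVAILABLE[fn["name"]](**fn["arguments"])
    return {"role": "tool", "content": result}

def chat(messages: list, model: str = "llama3.1") -> dict:
    """One round-trip to a locally running Ollama server (default port 11434)."""
    payload = json.dumps({"model": model, "messages": messages,
                          "tools": TOOLS, "stream": False}).encode()
    req = urllib.request.Request("http://localhost:11434/api/chat", data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]

def run(prompt: str, model: str = "llama3.1") -> str:
    """Agent loop: let the model call web_search until it gives a final text answer."""
    messages = [{"role": "user", "content": prompt}]
    while True:
        message = chat(messages, model)
        messages.append(message)
        if not message.get("tool_calls"):
            return message["content"]
        for call in message["tool_calls"]:
            messages.append(dispatch(call))
```

With a tool-capable model pulled (`ollama pull llama3.1`) and the server running, `run("What happened in the news today?")` lets the model decide when to invoke `web_search`; models without tool support simply return a plain answer on the first round-trip.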