LocalIntelligence, a fully native macOS client for Ollama

Ever wished you could fully explore the AI capabilities provided by Ollama from a native macOS app? That is exactly what LocalIntelligence, written in Swift with SwiftUI, provides.
 

Chat with all kinds of local AI models

LocalIntelligence lets you interact with all the AI models supported by Ollama, with no need to use the command line or write Python programs. It supports embedding and vision models such as IBM's Granite Vision: just drag and drop your pictures and ask anything you want.
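For the curious, here is roughly what a direct call to Ollama's HTTP API looks like without a client such as LocalIntelligence. This is a minimal Swift sketch, assuming Ollama is listening on its default port 11434; the model name and image path are just examples.

import Foundation

// Build a chat request against Ollama's /api/chat endpoint.
// Vision models accept base64-encoded pictures in the "images" field.
struct ChatMessage: Codable {
    let role: String
    let content: String
    let images: [String]?        // base64-encoded pictures, for vision models
}

struct ChatRequest: Codable {
    let model: String
    let messages: [ChatMessage]
    let stream: Bool
}

let imageData = try Data(contentsOf: URL(fileURLWithPath: "/tmp/photo.jpg"))  // example path
let chat = ChatRequest(
    model: "granite3.2-vision",  // example model name
    messages: [ChatMessage(role: "user",
                           content: "What is in this picture?",
                           images: [imageData.base64EncodedString()])],
    stream: false
)

var urlRequest = URLRequest(url: URL(string: "http://localhost:11434/api/chat")!)
urlRequest.httpMethod = "POST"
urlRequest.setValue("application/json", forHTTPHeaderField: "Content-Type")
urlRequest.httpBody = try JSONEncoder().encode(chat)

// Run as a Swift script on macOS 12 or later (top-level await).
let (data, _) = try await URLSession.shared.data(for: urlRequest)
print(String(data: data, encoding: .utf8) ?? "")

With LocalIntelligence, all of this boilerplate is replaced by a drag and drop.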


LocalIntelligence also fully supports local (STDIO) MCP servers; support for secure TCP/IP MCP servers will be added in a future release. Adding a new MCP server is trivial and can be done from the Settings menu.

Full MCP support

Unlike other Ollama clients, LocalIntelligence provides full MCP support. With MCP servers, your LLM models can use all kinds of external tools, allowing you, for example, to feed real-time data to your models or execute remote procedure calls.
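To give an idea of what happens under the hood when a STDIO MCP server is used, here is a hedged Swift sketch of how a client can launch such a server and ask it for its tools over newline-delimited JSON-RPC. The server path, protocol version, and client name are assumptions for illustration, not LocalIntelligence's actual code.

import Foundation

// Launch a local STDIO MCP server as a child process and talk to it
// over its stdin/stdout. The executable path below is hypothetical.
let server = Process()
server.executableURL = URL(fileURLWithPath: "/usr/local/bin/my-mcp-server")
let toServer = Pipe()
let fromServer = Pipe()
server.standardInput = toServer
server.standardOutput = fromServer
try server.run()

// Send one JSON-RPC request per line, as the MCP STDIO transport expects.
func sendRequest(_ method: String, id: Int, params: [String: Any] = [String: Any]()) {
    let message: [String: Any] = ["jsonrpc": "2.0", "id": id,
                                  "method": method, "params": params]
    var data = try! JSONSerialization.data(withJSONObject: message)
    data.append(0x0A)                    // newline-delimited messages
    toServer.fileHandleForWriting.write(data)
}

// Handshake, then ask the server which tools it exposes.
// (A real client also sends the notifications/initialized notification
// and matches replies to request ids; this is simplified.)
sendRequest("initialize", id: 1,
            params: ["protocolVersion": "2024-11-05",
                     "capabilities": [String: Any](),
                     "clientInfo": ["name": "example-client", "version": "0.1"]])
sendRequest("tools/list", id: 2)

// Print whatever the server has written back so far (one JSON object per line).
if let reply = String(data: fromServer.fileHandleForReading.availableData,
                      encoding: .utf8) {
    print(reply)
}

The tools returned by tools/list are then offered to the model, which can ask the client to invoke them with tools/call.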


That said, the best feature of LocalIntelligence is that it is always just a click away. There is no need to install bloated JavaScript-wrapped applications, web servers, or additional libraries. Enjoy!

Download LocalIntelligence 1.0 (notarized version)

If you do need a local STDIO MCP server for testing, you can download a very simple one over here.
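If you are curious what such a minimal test server looks like, here is a hedged Swift sketch: it reads newline-delimited JSON-RPC requests on stdin, answers the basic MCP handshake, and exposes a single echo tool. Method and field names follow the public MCP specification; this is an illustration, not the downloadable server mentioned above.

import Foundation

// Write one JSON message per line to stdout, as the MCP STDIO transport expects.
func send(_ object: [String: Any]) {
    guard let data = try? JSONSerialization.data(withJSONObject: object) else { return }
    FileHandle.standardOutput.write(data)
    FileHandle.standardOutput.write(Data([0x0A]))
}

// Read JSON-RPC requests line by line from stdin and answer them.
while let line = readLine(strippingNewline: true) {
    guard let data = line.data(using: .utf8),
          let request = try? JSONSerialization.jsonObject(with: data) as? [String: Any],
          let method = request["method"] as? String else { continue }
    let id: Any = request["id"] ?? NSNull()

    switch method {
    case "initialize":
        let result: [String: Any] = [
            "protocolVersion": "2024-11-05",
            "capabilities": ["tools": ["listChanged": false]],
            "serverInfo": ["name": "echo-server", "version": "0.1"]
        ]
        send(["jsonrpc": "2.0", "id": id, "result": result])
    case "tools/list":
        let echoTool: [String: Any] = [
            "name": "echo",
            "description": "Echo back the supplied text",
            "inputSchema": ["type": "object",
                            "properties": ["text": ["type": "string"]],
                            "required": ["text"]] as [String: Any]
        ]
        send(["jsonrpc": "2.0", "id": id, "result": ["tools": [echoTool]]])
    case "tools/call":
        let arguments = (request["params"] as? [String: Any])?["arguments"] as? [String: Any]
        let text = arguments?["text"] as? String ?? ""
        send(["jsonrpc": "2.0", "id": id,
              "result": ["content": [["type": "text", "text": text]]]])
    default:
        break   // ignore notifications such as notifications/initialized
    }
}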

LocalIntelligence is also available on the App Store as a sandboxed application. This extra security layer comes at a cost, though: because of the sandboxing required by Apple, that version cannot connect to STDIO MCP servers. That doesn't mean you cannot use MCP; it just means you will have to connect to an unsecured TCP/IP-based server instead.

Download on the App Store
