One Tip a Week: You should install OpenWebUI

Anyone heard anything about large language models or AI recently? 🤣

I’m a big fan of AI tooling. I use Claude, GitHub Copilot, Cursor, etc. These are great tools, but what if you can’t use them? Maybe there are restrictions at work, the Internet is down or unavailable on a flight, or you just don’t want to use cloud-based AI tooling.

Ollama is a good answer to that. It provides a local API endpoint you can build against, and you can also use it straight from the command line, e.g. ollama run llama2 "What is a Merkle tree?" .
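
If you want to go the endpoint route, here’s roughly what a request looks like — a minimal sketch, assuming Ollama is serving on its default port (11434) and you’ve already pulled llama2:

# Same question as above, but via the local HTTP API instead of the CLI
curl http://localhost:11434/api/generate -d '{"model": "llama2", "prompt": "What is a Merkle tree?", "stream": false}'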

If you don’t want to use Ollama at the command line, or write your own software against its endpoint, OpenWebUI is a great option.

It’s got a pretty slick interface and it all runs locally. Just another tool to add to your developer tool belt.

The easiest way to get up and running is with Docker, but I’ll leave it to you to explore the docs and decide how you want to install it.

docker run -d -p 3000:8080 -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama
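
Once the container is up, point your browser at http://localhost:3000 (the -p 3000:8080 flag maps the web UI onto port 3000 on your machine) and pick a model from the dropdown. The :ollama tag bundles Ollama inside the same container, and the two -v flags keep your models and chat data in Docker volumes so they survive restarts.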
[Screenshot: OpenWebUI running on my local machine, with a dropdown open showing all the available models.]

Give it a try and let me know what you think!

That’s it! Short and sweet. Until the next one!