You can self host that too ;)
OpenWebUI + Ollama + SearxNG. OpenWebUI can do LLM web search using the engine of your choice (even self-hosted SearxNG!). From there it’s easy to set the default prompt to always give you the top (10, 20, whatever) raw results, so you’re not confined to AI-summarized results. It’s not quite duck.ai slick, but I think I can get there with some more tinkering.
I mean, I could write one! I kind of just pieced it together from guides for the three individual tools.
Edit: the back-of-the-napkin guide below is basically in the OpenWebUI docs already! I use NixOS (btw), but docker/podman should work well.
OpenWebUI + Ollama setup – tl;dr
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

OpenWebUI SearXNG guide – a little more involved, but not difficult.
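For the SearxNG piece, here’s a rough compose sketch of how the two containers could be wired together. This is an assumption-laden sketch, not a verified config: the service names and port mappings are my own choices, and the OpenWebUI environment variable names are taken from its web-search docs and may differ in your version, so check them before relying on this.

```yaml
# Hypothetical docker-compose sketch: OpenWebUI + SearxNG.
# Ollama is assumed to be running on the host (reachable via host.docker.internal).
services:
  searxng:
    image: searxng/searxng:latest
    ports:
      - "8888:8080"
    volumes:
      # SearxNG's settings.yml must enable the "json" output format
      # for OpenWebUI to consume results.
      - ./searxng:/etc/searxng
    restart: always

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    extra_hosts:
      - "host.docker.internal:host-gateway"
    environment:
      # Variable names per the OpenWebUI docs at time of writing;
      # verify against the version you run.
      - ENABLE_RAG_WEB_SEARCH=true
      - RAG_WEB_SEARCH_ENGINE=searxng
      - SEARXNG_QUERY_URL=http://searxng:8080/search?q=<query>
    volumes:
      - open-webui:/app/backend/data
    restart: always

volumes:
  open-webui:
```

The main gotcha I hit: SearxNG ships with JSON output disabled by default, so the query URL returns nothing until you add it to the allowed formats in settings.yml.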