This interactive guide will check what kinds of AI tools are available on your device: browser-based AI (via WebGPU) and local models served by Ollama.
Scroll down to begin exploring.
Modern browsers can now run small AI models using your GPU directly. This uses WebGPU, the successor to WebGL.
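A minimal sketch of how a page can detect WebGPU support: `navigator.gpu` is the standard entry point and is simply undefined in browsers without WebGPU. The cast below is just a convenience to avoid requiring the `@webgpu/types` package.

```ts
// Detect WebGPU support in the current browser.
async function hasWebGPU(): Promise<boolean> {
  const gpu = (navigator as any).gpu; // undefined if WebGPU is unsupported
  if (!gpu) return false;
  try {
    // requestAdapter() resolves to null when no suitable GPU is available.
    const adapter = await gpu.requestAdapter();
    return adapter !== null;
  } catch {
    return false;
  }
}

hasWebGPU().then((ok) => console.log(ok ? "WebGPU available" : "No WebGPU"));
```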
If your browser supports it, you can run models with WebLLM entirely client-side: once the weights are downloaded and cached, no server or internet connection is required.
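For illustration, here is a minimal sketch using the `@mlc-ai/web-llm` package, following its OpenAI-style chat API. The model id is an example and may differ from what the package currently ships; check the WebLLM model list before running.

```ts
import { CreateMLCEngine } from "@mlc-ai/web-llm";

async function demo(): Promise<void> {
  // Downloads and compiles the model in the browser on first run;
  // later runs load it from the browser cache. Example model id only.
  const engine = await CreateMLCEngine("Llama-3.1-8B-Instruct-q4f32_1-MLC");

  // OpenAI-style chat completion, executed entirely client-side.
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Say hello in one sentence." }],
  });
  console.log(reply.choices[0].message.content);
}

demo();
```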
Ollama lets you run large language models locally using your CPU or GPU. It offers a clean CLI and a REST API on localhost:11434.
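As a taste of that REST API, here is a sketch of a one-shot generation request against Ollama's `/api/generate` endpoint. The model name `llama3` is an assumption; substitute any model you have pulled locally.

```ts
// One-shot text generation via Ollama's REST API.
// stream: false returns a single JSON object instead of streamed chunks.
async function generate(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama3", prompt, stream: false }),
  });
  const data: { response: string } = await res.json();
  return data.response;
}

generate("Why is the sky blue?").then(console.log);
```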
This check will look for a running Ollama server and list the installed models.
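The check itself amounts to a single request to `GET /api/tags`, Ollama's endpoint for listing locally installed models. Note that a browser page calling localhost may be blocked by CORS unless Ollama is started with `OLLAMA_ORIGINS` set to allow this page's origin.

```ts
// Probe the local Ollama server and list installed models.
// Returns null if the server is unreachable or the request is blocked.
async function listOllamaModels(): Promise<string[] | null> {
  try {
    const res = await fetch("http://localhost:11434/api/tags");
    if (!res.ok) return null;
    const data: { models: { name: string }[] } = await res.json();
    return data.models.map((m) => m.name);
  } catch {
    return null; // server not running or request blocked
  }
}

listOllamaModels().then((models) =>
  console.log(models ? `Installed: ${models.join(", ")}` : "Ollama not reachable")
);
```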