🧭 Welcome to Your Local AI Setup Guide

This interactive guide will check which kinds of AI tooling are available on your device: in-browser inference via WebGPU, or local models served by Ollama.

Scroll down to begin exploring.

🧠 Browser-Based AI (WebGPU)

Modern browsers can now run small AI models using your GPU directly. This uses WebGPU, the successor to WebGL.

If your browser supports it, you can run models through in-browser runtimes like WebLLM — no server round-trips needed once the model weights have been downloaded.

Checking WebGPU support...
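The support check above can be sketched in a few lines of TypeScript. This is a minimal feature probe, not the guide's actual detection code: it reads `navigator.gpu` (the entry point the WebGPU spec defines) and asks for an adapter. Outside a browser, `navigator.gpu` is absent, so the guard reports no support rather than throwing.

```typescript
// Minimal WebGPU feature probe.
// In a browser with WebGPU, requestAdapter() resolves to a GPUAdapter;
// in environments without navigator.gpu (e.g. Node), we short-circuit.
async function checkWebGPU(): Promise<string> {
  const nav = (globalThis as { navigator?: { gpu?: any } }).navigator;
  if (!nav?.gpu) {
    return "WebGPU not supported in this environment";
  }
  const adapter = await nav.gpu.requestAdapter();
  return adapter
    ? "WebGPU adapter available"
    : "WebGPU API present, but no suitable adapter";
}

checkWebGPU().then((msg) => console.log(msg));
```

Note that `requestAdapter()` can resolve to `null` even when the API exists (for example, when the GPU is blocklisted), so checking for the API alone is not enough.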

🦙 Ollama: Local LLMs on Your Machine

Ollama lets you run large language models locally on your CPU or GPU. It offers a clean CLI and a REST API, served by default at localhost:11434.

This check will look for a running Ollama server and list the models installed.

Checking Ollama status...
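A status check like the one above can be sketched against Ollama's REST API. This probe hits the `GET /api/tags` endpoint, which returns a `models` array of installed models; the default address `http://localhost:11434` comes from the text above. A connection failure simply means no server is running, so it is caught rather than rethrown.

```typescript
// Probe a local Ollama server and list installed models.
// GET /api/tags returns { models: [{ name: "...", ... }, ...] }.
async function listOllamaModels(): Promise<string[]> {
  try {
    const res = await fetch("http://localhost:11434/api/tags");
    if (!res.ok) return [];
    const data = await res.json();
    return (data.models ?? []).map((m: { name: string }) => m.name);
  } catch {
    // Server not running or unreachable — treat as "no models".
    return [];
  }
}

listOllamaModels().then((models) => {
  if (models.length === 0) {
    console.log("Ollama not reachable, or no models installed");
  } else {
    console.log("Installed models:", models.join(", "));
  }
});
```

From the CLI, `ollama list` reports the same information.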