Experiment with AI models locally without needing to set up a full-blown ML stack. Powered by a native app designed to simplify the whole process, from downloading models to starting an inference server. No GPU required!
Let's go!! 🎒🎒🎒 local.ai makes it easy to install and chat with a local model, and it exposes a server endpoint so you can plug it into apps like window.ai!
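To give a sense of how that server endpoint can be consumed, here is a minimal client sketch. The base URL, port, and `/completions` route below are assumptions (an OpenAI-style API shape), not the documented interface; check the app's server screen for the exact address and path your instance exposes.

```ts
// Minimal sketch: request a completion from a locally running inference server.
// BASE_URL and the /completions route are assumptions; adjust them to whatever
// address the app displays when you start the server.
const BASE_URL = "http://localhost:8000"; // assumed default port

async function complete(prompt: string): Promise<void> {
  const res = await fetch(`${BASE_URL}/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt, max_tokens: 128, stream: false }),
  });
  if (!res.ok) throw new Error(`Server responded with ${res.status}`);
  const data = await res.json();
  console.log(data); // exact response shape depends on the server's API
}

complete("Write a haiku about running models locally.").catch(console.error);
```

Because the endpoint speaks plain HTTP, any app that can make a fetch request (such as a window.ai-compatible extension) can point at it instead of a hosted API.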