Ujjwal

Run AI completely offline. BrainLoom v1.0.5 adds Local AI (Ollama) + Live Canvas Decks.


Hey Hunters! 👋

When I launched BrainLoom (The Local-First Learning OS) two months ago, the goal was simple: your second brain should live on your hard drive, not on a server.

Today I'm shipping v1.0.5, which takes that philosophy to its logical conclusion.

What’s New in v1.0.5:

  • Local AI (Ollama) Integration: You can now connect BrainLoom directly to your local Ollama instance and run powerful models like Llama 3 or Qwen entirely offline. Zero API costs, total privacy. You can feed your most confidential study materials to the AI, and the data never leaves your laptop.

  • Live Flashcard Decks on the Canvas: You can now drop full study decks directly onto the Infinite Canvas. Connect your flashcards visually to the source PDFs and notes they belong to, building a true spatial memory map.

  • Upgraded AI Chatbar: The AI assistant now natively supports generating Tables and Code Blocks, and can work with File Attachments right in the chat window.
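For the curious: this is roughly what talking to a local Ollama instance looks like under the hood. It's a sketch against Ollama's standard REST API (the default `http://localhost:11434/api/generate` endpoint) — BrainLoom's own integration code isn't shown here, and the function names are mine:

```python
import json
import urllib.request

# Ollama's default local endpoint -- nothing here touches the internet.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build a non-streaming request body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama daemon and return its reply."""
    body = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running Ollama daemon and `ollama pull llama3` first):
# print(generate("llama3", "Make three flashcards about photosynthesis."))
```

With `stream` set to `False`, Ollama returns the whole completion as a single JSON object instead of a stream of chunks — simpler for one-shot tasks like generating a flashcard deck.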

We just hit 50+ paid Founders, which means we are officially in the "Architects" pricing batch (the Lifetime License is currently $34).

Challenge for the Tech Crowd:
Download the free trial, turn your Wi-Fi completely OFF, and try generating a flashcard deck using a local Llama 3 model. It feels like magic.

Grab the update (Mac/Win/Linux): https://brainloom.app

Let me know which local model you end up hooking up to it!
