Amar Dahmani

LISA Core, an extension that captures, compresses, and preserves your AI conversations.

What I built: A Chrome extension that captures your ChatGPT conversations (and 8 other AI platforms) into portable JSON/JSONL files you can upload anywhere.

The Problem This Solves

You spend hours crafting the perfect conversation in ChatGPT. Then you want to:

  • Continue it in Claude (better for coding)

  • Share it with a teammate who uses Gemini

  • Keep a backup in case ChatGPT goes down

  • Switch to a local model for privacy

Right now? You're stuck. ChatGPT locks your context in their ecosystem.

How It Works

  1. One-click export — Click the LISA button while in ChatGPT

  2. Get a tiny JSON file — Your 60,000-word conversation becomes ~700 words (~86:1 compression)

  3. Upload anywhere — Drop it in Claude, Gemini, Grok, Mistral, DeepSeek, Copilot, or Perplexity

  4. Say: "Read this LISA JSON and continue our conversation"

  5. Done — Full context restored, conversation continues
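Step 4 can be sketched in a few lines. This is a hypothetical illustration, not LISA's actual code — the field names (`semantic_anchors`, `action_vectors`) are borrowed from the JSON example later in this post, and the real schema may differ:

```python
import json

# Hypothetical sketch: rebuild a continuation prompt from a LISA export.
# Field names (semantic_anchors, action_vectors) follow the example JSON
# in this post; the real schema may include more fields.
def restore_prompt(path: str) -> str:
    with open(path, encoding="utf-8") as f:
        lisa = json.load(f)
    lines = ["Read this LISA JSON and continue our conversation.", ""]
    for anchor in lisa.get("semantic_anchors", []):
        lines.append(f"- [{anchor['id']}] {anchor['topic']}"
                     f" (priority: {anchor.get('priority', 'normal')})")
    for action in lisa.get("action_vectors", []):
        lines.append(f"- Action: {action['action']} (status: {action['status']})")
    return "\n".join(lines)
```

The returned string is exactly what you'd paste into the target AI along with the file.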

Why Structured JSON?

When AI reads raw text, it wastes compute guessing:

  • "Fix it" → What's "it"?

  • "Do this later" → When?

  • "The file we discussed" → Which file?

LISA pre-resolves these ambiguities by translating your conversation into structured data:

Instead of this:

"Hey, can you fix the authentication bug we talked about earlier?"

AI gets this:

```json
{
  "semantic_anchors": [{
    "id": "SA001",
    "topic": "Authentication bug fix",
    "priority": "high",
    "context": "Previously discussed login issue"
  }],
  "action_vectors": [{
    "action": "Debug auth module",
    "status": "pending"
  }]
}
```

Result: the AI spends its compute solving your problem, not reconstructing what you meant.
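Because the format is plain JSON, you can sanity-check an export before uploading it. A hypothetical sketch — the required keys mirror the example above, not an official schema:

```python
# Hypothetical sketch: sanity-check a LISA-style export before uploading.
# Required keys mirror the example JSON above; the real schema may differ.
REQUIRED_KEYS = {
    "semantic_anchors": {"id", "topic"},
    "action_vectors": {"action", "status"},
}

def find_problems(lisa: dict) -> list:
    problems = []
    for section, keys in REQUIRED_KEYS.items():
        for i, item in enumerate(lisa.get(section, [])):
            missing = keys - set(item)
            if missing:
                problems.append(f"{section}[{i}] missing {sorted(missing)}")
    return problems
```

An empty list means every entry carries the keys the target AI needs to resolve references.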

Real Compression Examples

  • 302-message coding session: 645 KB → 6.45 KB (100:1)

  • 60,000-word discussion: ~700 words (86:1)

  • Tested across Claude, GPT-4, Gemini — >95% fidelity
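The ratios above are straight division over the numbers quoted in this post:

```python
# Quick arithmetic check of the compression ratios quoted above.
def ratio(original: float, compressed: float) -> float:
    return original / compressed

assert round(ratio(645, 6.45)) == 100   # 645 KB -> 6.45 KB
assert round(ratio(60_000, 700)) == 86  # 60,000 words -> ~700 words
```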

Supported Platforms (9 total)

✅ ChatGPT (obviously)
✅ Claude + Claude Code
✅ Gemini
✅ Grok
✅ Mistral AI
✅ DeepSeek
✅ Microsoft Copilot
✅ Perplexity
🔜 Ollama (local models)

Privacy

  • 100% local processing — Your conversations never touch our servers

  • No tracking — We don't see your data because we never receive it

  • Open JSON format — No vendor lock-in, you own your data

  • Optional cloud sync — Only if you explicitly enable it

Free Tier

  • 5 exports per day (across all 9 platforms)

  • Local library storage

  • Full semantic compression

  • All features except cryptographic verification

Technical Specs

  • Chrome extension (Manifest V3)

  • Works on chatgpt.com automatically

  • SHA-256 hashing for audit trails

  • Right-click selection export

  • Git-style version history
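The SHA-256 audit-trail item can be illustrated with Python's standard library. A sketch of the general idea, not LISA's actual implementation:

```python
import hashlib
import json

# Hypothetical sketch: fingerprint an export for an audit trail.
# Serializing with sorted keys makes the digest stable regardless of
# the order keys appear in the file.
def export_digest(lisa: dict) -> str:
    canonical = json.dumps(lisa, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()
```

The same export always hashes to the same 64-hex-character digest, so a conversation can later be verified as unmodified against the recorded value.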

Use Cases I've Seen

Developers:

  • Start debugging in ChatGPT → finish in Claude (better for code)

  • Keep coding session backups

  • Share project context with teammates using different AI

Students:

  • Study sessions portable across platforms

  • Keep personal knowledge library

  • Never lose important conversations

Anyone leaving ChatGPT:

  • Take your conversation history with you

  • Your data is yours

Why I Built This

I'm a professional translator (25 years). I learned that meaning is fragile — especially across boundaries.

When I started using AI heavily, I saw the same problem: your conversations are trapped in silos, and AI wastes compute re-parsing ambiguities every single time.

LISA treats context transfer as a translation problem, not just compression. Just like translating French to English isn't about swapping words 1:1 — it's about stabilizing meaning — LISA stabilizes meaning across the human-AI boundary.

Limitations

  • ChatGPT's HTML structure changes occasionally (we update the parser)

  • Free tier has daily limits (5 exports/day)

  • Works best on desktop Chrome (mobile support coming)

Chrome Web Store: Search "LISA Core" or check my profile for link

Happy to answer questions about how the semantic compression works, cross-platform compatibility, or why treating this as a translation problem (not just compression) matters. 🧠

Built this because I got tired of being locked into ChatGPT's ecosystem when Claude often gives better answers for coding. Figured others might have the same problem.

Replies
Amar Dahmani

Hi Product Hunt 👋 I'm Amar. I've spent 25 years as a professional translator (English, French, Arabic), and my whole career has been about one problem: meaning gets lost in translation.

Three years ago I started using AI heavily, and I kept noticing the same thing: the AI would forget. Switch platforms, lose context, start over. The knowledge I'd built wasn't being stored — it was being approximated. Every time I loaded a new conversation, the AI was guessing at what I'd meant before.

I knew exactly what was happening. It's what happens in bad translation: you preserve the words but lose the meaning. So I built LISA to fix it: not as a developer (I'm not one), but as someone who understood the problem at a philosophical level.

LISA doesn't compress conversations. It translates them into machine-executable semantics. The difference sounds subtle. In practice, the AI that reads a LISA file understands your context better than one reading the original transcript.

We're live today. Free tier, no card required. Try it on any AI conversation you care about. I'll be here all day, ask me anything. Especially: why this from Algeria? Why from a translator? I think those are the interesting questions. 🙏