marius ndini

Open Sourcing CORE-MNX: Durable Memory for LLMs

Today we’re open-sourcing the core memory engine behind @Mnexium AI: CORE-MNX.

GitHub

NPM

For us, this is a product decision and a philosophy decision. Memory infrastructure is becoming foundational for serious AI products, and we believe the core layer should be transparent, inspectable, and extensible by the teams building on top of it.

What we open-sourced

CORE-MNX is the backend layer that powers durable memory workflows:

memory storage and retrieval,

claim extraction and truth-state resolution,

memory lifecycle management,

and event streaming for real-time systems.

It’s Postgres-backed, API-first, and built to integrate into real production stacks.
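Since the engine is API-first, integrating it mostly means posting and fetching memory records over HTTP. As a rough illustration, a call against a self-hosted instance might look like the sketch below; the endpoint path, field names, and truth-state values here are assumptions for illustration, not CORE-MNX's documented interface.

```typescript
// Hypothetical payload shape for a CORE-MNX-style memory API.
// Field names and truth-state values are illustrative assumptions.
interface MemoryRecord {
  subject: string;
  claim: string;
  truthState: "asserted" | "retracted" | "superseded"; // assumed lifecycle states
  timestamp: string;
}

// Build the JSON body for a hypothetical POST /memories call.
function buildStoreRequest(subject: string, claim: string): MemoryRecord {
  return {
    subject,
    claim,
    truthState: "asserted",
    timestamp: new Date().toISOString(),
  };
}

// Sending it to a self-hosted instance could then look like:
//   await fetch("http://localhost:3000/memories", {
//     method: "POST",
//     headers: { "Content-Type": "application/json" },
//     body: JSON.stringify(buildStoreRequest("user:42", "prefers dark mode")),
//   });

const record = buildStoreRequest("user:42", "prefers dark mode");
console.log(record.truthState); // "asserted"
```

The point is only that the client side reduces to plain HTTP plus a small, stable payload schema; no engine internals leak into your application code.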

We tried to make this system as standalone as possible, but a fully self-contained build is difficult: the engine still depends on external LLM providers (Cerebras for fast token output, ChatGPT for higher-quality reasoning, and so on) and a database. We have intentionally kept the project API-interfaced so your own codebase can stay agnostic to those choices.
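One way to picture the "code-agnostic" goal is a thin provider abstraction: the engine talks to whichever LLM backend is configured, and the calling code never hardcodes a vendor. This is a minimal sketch of that idea; the interface, stubs, and routing rule are hypothetical, not taken from the CORE-MNX codebase.

```typescript
// Minimal provider abstraction: swap LLM backends without touching callers.
// All names here are illustrative assumptions.
interface LLMProvider {
  name: string;
  complete(prompt: string): Promise<string>;
}

// Stub standing in for a fast-inference backend (e.g. Cerebras).
const fastProvider: LLMProvider = {
  name: "fast",
  complete: async (prompt) => `fast:${prompt}`,
};

// Stub standing in for a higher-quality backend (e.g. ChatGPT).
const smartProvider: LLMProvider = {
  name: "smart",
  complete: async (prompt) => `smart:${prompt}`,
};

// Route cheap extraction-style work to the fast backend,
// and truth-state resolution to the smarter one.
function pickProvider(task: "extract" | "resolve"): LLMProvider {
  return task === "extract" ? fastProvider : smartProvider;
}

pickProvider("extract").complete("hello").then(console.log); // "fast:hello"
```

Swapping a provider then means changing one registration, not rewriting the pipeline.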

Why we did this

Open-sourcing CORE lets builders:

understand exactly how memory behavior works,

self-host or extend the engine for their own products,

and avoid reinventing the same memory infrastructure from scratch.

What stays on Mnexium.com

Mnexium’s long-term direction is still the same: make AI systems more useful over time through durable memory and reliable recall.

Open-sourcing CORE is how we make that foundation available to everyone building in this space, and we welcome opinions on improvements and on how to make this problem tractable.
