Open-source AI is the privacy- and budget-friendly way to build apps, but it can be difficult to deploy. EnergeticAI makes it as easy as `npm install`. It's optimized for serverless functions, with fast cold starts and a small bundle size, under a business-friendly license (Apache 2.0).
👋 Hi Product Hunt!
I'm Jonathan, maintainer of EnergeticAI.
I strongly believe in open-source AI as a way to build the products of the future in a privacy- and budget-friendly manner.
But these models can be tricky to use. There's no `npm install ai`.
And existing open-source AI libraries need a bunch of tuning to deploy in production, especially in serverless functions—which is some of the most accessible compute for early-stage projects.
EnergeticAI aims to fix this.
It's TensorFlow.js, optimized for the cold-start and bundle-size constraints of serverless functions, with a spectacular developer experience.
- Small module size (~3 MB vs. 146 MB - 513 MB for stock TensorFlow.js)
- Fast cold-start (~50 ms vs. 2000+ ms for stock TensorFlow.js)
- Incredible ease-of-use (pre-trained model, helpers for key tasks, docs)
It comes with libraries for text embeddings and few-shot text classification.
There are comprehensive docs, including a tutorial showing how to use embeddings to build product recommendations for a simple e-commerce website deployed to Netlify.
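To give a feel for the recommendations idea from the tutorial, here's a self-contained sketch of embedding-based ranking: given a vector for the product a shopper is viewing, rank the rest of the catalog by cosine similarity. (The toy vectors and helper names below are illustrative, not EnergeticAI's API; a real app would get vectors from the embeddings model.)

```typescript
// Cosine similarity between two equal-length vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank catalog items by similarity to the current product's embedding.
function recommend(
  current: number[],
  catalog: { name: string; vector: number[] }[],
  topK = 2
): string[] {
  return catalog
    .map((p) => ({ name: p.name, score: cosineSimilarity(current, p.vector) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, topK)
    .map((p) => p.name);
}

// Toy 3-dimensional vectors standing in for real embeddings:
const viewing = [1, 0, 0.5];
const catalog = [
  { name: "usb-c cable", vector: [0.9, 0.1, 0.4] },
  { name: "garden hose", vector: [0, 1, 0] },
  { name: "phone charger", vector: [1, 0, 0.6] },
];
console.log(recommend(viewing, catalog)); // → ["phone charger", "usb-c cable"]
```

Real embedding vectors have hundreds of dimensions, but the ranking step is exactly this simple.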
This is just the beginning of the project — looking forward to seeing how folks use it, and learning how to make it even better.
Jonathan
@taylorhughes There are a few tricks here. Two of my favorites:
1. Pre-bundling. Existing packages contain the kitchen sink for a long tail of use cases. A lot of serverless build systems aren't great about tree-shaking this, leaving you in an undeployable state. EnergeticAI uses esbuild to create an optimized bundle of TensorFlow.js (along with TypeScript declarations), and that bundle is what's uploaded to npm.
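The pre-bundling step looks roughly like this (the entry file and flags are assumptions for illustration, not EnergeticAI's actual build config):

```shell
# Bundle TensorFlow.js into one flat ESM file before publishing, so
# downstream serverless builds don't need to tree-shake anything.
npx esbuild src/index.ts \
  --bundle \
  --format=esm \
  --platform=node \
  --minify \
  --outfile=dist/index.mjs
```

The key point is that tree-shaking happens once, at publish time, instead of in every consumer's build.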
2. Dependency graph optimization. Existing packages have multiple definitions of which "core" package to depend on. For example, in TensorFlow.js there's `tfjs` as well as `tfjs-core`, and downstream packages depend on different ones without much rhyme or reason, making your project messy. EnergeticAI has a single, compact core module, installed as a peer dependency, that everything else uses.
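In `package.json` terms, each downstream package declares the shared core as a peer dependency, so the consumer's package manager installs exactly one copy (the package name matches EnergeticAI's naming; the version range is illustrative):

```json
{
  "name": "@energetic-ai/embeddings",
  "peerDependencies": {
    "@energetic-ai/core": "^0.2.0"
  }
}
```

A peer dependency, unlike a regular one, can't be duplicated at different versions across the dependency tree, which keeps the bundle small.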
For the uninitiated, @hellojwilde (the author) is a world-class privacy engineer. I didn't know what privacy engineering really meant until I witnessed some of the things he built during our time together as coworkers.
This project will be one to watch and use for anyone serious about securing their AI integrations.
@dagglas Great question! It comes back to bundling.
Existing libraries assume that you'll want to download the model weights from the internet on the first invocation. This works for small regression models, and for containerized deployments where you can reuse that downloaded model across requests.
But it's not a great fit for serverless functions with larger models, where you're paying the network cost on almost every invocation.
In EnergeticAI, we offer an npm module with the model weights, which you can package with your serverless function so they're available right there on the first invocation.
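A sketch of what consuming bundled weights looks like in a serverless handler. The package and function names follow EnergeticAI's docs as I recall them, but treat the exact API as an assumption:

```typescript
// Weights ship inside an npm package, so there's no network fetch on cold start.
// Package/function names may differ from the real API; illustrative only.
import { initModel } from "@energetic-ai/embeddings";
import { modelSource } from "@energetic-ai/model-embeddings-en";

export async function handler() {
  // Loads weights from the deployed bundle, not from the network.
  const model = await initModel(modelSource);
  const embeddings = await model.embed(["fast serverless inference"]);
  return { statusCode: 200, body: JSON.stringify(embeddings[0].length) };
}
```

Because the weights travel with the function's deployment artifact, every cold start pays only the (fast) local load cost rather than a download.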