Blake Peeling

Banana - Serverless GPUs for Machine Learning inference

Banana provides inference hosting for ML models in three easy steps and a single line of code.

Stop paying for idle GPU time and deploy models to production instantly with our serverless GPU infrastructure.

Use Banana for scale. 🍌
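To make the "single line of code" pitch concrete, here is a minimal sketch of what a call to a serverless GPU inference endpoint might look like. The client name, function, and parameter names below are illustrative assumptions for this example, not Banana's actual SDK.

```python
# Hypothetical sketch of a serverless inference call.
# The helper and field names here are assumptions, not Banana's real API.

def build_inference_request(api_key: str, model_key: str, model_inputs: dict) -> dict:
    """Assemble the JSON payload a serverless GPU endpoint might accept."""
    return {
        "apiKey": api_key,        # placeholder credential for your account
        "modelKey": model_key,    # identifies the deployed model
        "modelInputs": model_inputs,  # arbitrary inputs forwarded to the model
    }

# The deployed model then runs on a GPU spun up on demand, so you pay
# only for the time the inference actually takes, not for idle hardware.
request = build_inference_request(
    api_key="YOUR_API_KEY",
    model_key="YOUR_MODEL_KEY",
    model_inputs={"prompt": "Hello, Banana!"},
)
print(request["modelInputs"]["prompt"])
```

The payload-builder pattern above keeps credentials, model identity, and inputs in one place, which is the shape most hosted-inference SDKs reduce to a single call.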


Replies

Aaron LaRue
Great product! Wishing you good luck!
Arye Lipman
Thanks for sharing your product! @makers
Avi Rosenbaum
This is amazing! Fantastic work, team.
Wouter Swijgman
It's a great product so far; deploying your models is intuitive and inference works like a charm (I used it for large NLP models)! Above all, the support is fast and really helpful. Will definitely continue using it!
I have ideas that involve ML, and seeing products launched that make it easier for our dev community to deploy and run models encourages me to take those ideas seriously. Thank you! I shall give it a try for sure.