Blake Peeling

Banana - Serverless GPUs for Machine Learning inference

Banana provides inference hosting for ML models in three easy steps and a single line of code.

Stop paying for idle GPU time and deploy models to production instantly with our serverless GPU infrastructure.

Use Banana for scale. 🍌
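The "single line of code" pitch above presumably refers to a client SDK call for serverless inference. Below is a minimal sketch of what assembling and issuing such a call might look like; the payload field names, the `build_inference_request` helper, and the commented-out `banana.run(...)` call are illustrative assumptions, not Banana's confirmed API — check their docs for the real interface.

```python
# Sketch of a serverless GPU inference request. Field names and the
# helper below are assumptions for illustration, not Banana's actual API.

def build_inference_request(api_key, model_key, model_inputs):
    """Assemble the JSON payload a serverless inference client would POST."""
    return {
        "apiKey": api_key,        # authenticates the caller (hypothetical field)
        "modelKey": model_key,    # identifies the deployed model (hypothetical field)
        "modelInputs": model_inputs,
    }

payload = build_inference_request(
    "demo-api-key",
    "demo-model-key",
    {"prompt": "a photo of a banana"},
)

# With a real SDK, the deploy-and-call step would then be the advertised
# single line, e.g. (hypothetical):
# out = banana.run(api_key, model_key, model_inputs)
print(payload["modelInputs"]["prompt"])
```

The appeal of the serverless model is that the GPU only spins up for the duration of calls like this, so there is no charge for idle time between requests.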


Replies

Amit Mahapatra
Great product and a great team! Congratulations on the launch :)
Blake Peeling
@amit_mahapatra1 thank you very much Amit! :)
Pranav Teegavarapu
this is awesome!!
Erik Dunteman
@pranavnt means a lot from the Kobra founder! Give it a whirl
Blake Peeling
@pranavnt thanks! we appreciate the <3
David Banys
On-demand + autoscaling + minimal cold start + GPUs?? The dream!
Erik Dunteman
@david_banys all those things :) thanks David
Максим ШСвяков
That's useful!
Joe Speiser
This rocks, sharing with all my dev friends now!
Sahil Chaudhary
@joe_speiser1 Thanks for the share!
Arielle Lok
this is poggers
Derek Pankaew
Wish this existed when we were doing GPU-heavy stuff!
Erik Dunteman
@derekpankaew yes, I'm sure this would have been solid for the YOLO models you were running at Next Fitness! With models that small, they would have scaled up in a matter of a couple of seconds :)
Nader Khalil
This is awesome!!! Serverless GPUs make so much sense
Blake Peeling
@naderlikeladder heck yea! thanks Nader :)
Oren Leung
GPU cold starts are a tough problem to solve! Glad Banana is taking on this challenge — definitely a super cool product, especially for hobby projects that require GPUs, where you don't want to pay an arm and a leg to host a demo.
Roman Puliyan
Interesting