Frequently Asked Questions

Deployment FAQs

  • Deployment on Banana is done through GitHub repositories. Refer to our detailed deployment setup guide.
  • We support Git LFS.
  • Each model you deploy on Banana requires its own repository.
  • We do not currently support direct upload of Docker images. For now, we support Git repos that contain a Dockerfile, and we build the Docker image for you (see the sketch after this list).
  • We support pretty much everything that fits into Docker (TensorFlow, etc.). However, the Banana optimizations that automagically speed things up under the hood are currently available only for PyTorch.
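
To make the repo-plus-Dockerfile flow concrete, the sketch below shows what a minimal Dockerfile for a PyTorch model repository might look like. The base image, the app.py entrypoint, and requirements.txt are illustrative assumptions for this example, not names or files that Banana requires; follow the deployment setup guide for the exact repo structure.

```dockerfile
# Illustrative Dockerfile for a model repository; the entrypoint (app.py)
# and requirements.txt are assumed names for this sketch, not Banana requirements.
FROM pytorch/pytorch:latest

WORKDIR /app

# Install the Python dependencies the model server needs.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the model-serving code into the image.
COPY . .

# Start the inference server when the container boots.
CMD ["python", "app.py"]
```

You can check that the image builds locally with docker build . before pushing the repo.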

Pricing FAQs

  • Billed GPU seconds include model load time, inference time, and, by default, a 10-second timeout (fully configurable within your Banana Dashboard) that keeps the instance warm after inference so it does not cold boot on every call. As a rough example, a cold call that spends 2 seconds loading the model and 1 second on inference would bill about 13 GPU seconds with the default timeout.