What is RunPod?

RunPod is a cloud computing platform built for AI, machine learning, and general compute needs. Whether you're running deep learning models, training AI, or deploying cloud-based applications, RunPod provides scalable, high-performance GPU and CPU resources to power your workloads.

RunPod offers:

  • High-performance compute: Access powerful GPU and CPU resources on demand.
  • Flexible deployment: Deploy your code using Serverless for autoscaling, pay-per-second execution, or Pods for containerized GPU and CPU instances.
  • Command-line interface: Automate and script deployments with the RunPod CLI.
  • Transparent pricing: GPUs are billed by the minute, with no fees for data ingress or egress. See RunPod Pricing for details.

Get started today by signing up for an account.

Serverless

Serverless provides pay-per-second computing with built-in autoscaling for production workloads.

Use Serverless to:

  • Deploy AI workloads with low cold-start times and robust security.
  • Build and expose REST API endpoints with autoscaling.
  • Queue jobs efficiently without managing infrastructure.
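As a minimal sketch of what submitting a job to a Serverless endpoint involves: the snippet below composes an asynchronous request against the `https://api.runpod.ai/v2/<endpoint_id>/run` URL shape used by the Serverless API. The endpoint ID and API key are placeholders, and you should confirm the exact request format against the current Serverless documentation.

```python
import json

API_BASE = "https://api.runpod.ai/v2"  # Serverless API base URL (verify against current docs)

def build_run_request(endpoint_id: str, api_key: str, payload: dict):
    """Compose the URL, headers, and JSON body for an asynchronous /run call."""
    url = f"{API_BASE}/{endpoint_id}/run"
    headers = {
        "Authorization": f"Bearer {api_key}",  # your RunPod API key
        "Content-Type": "application/json",
    }
    body = json.dumps({"input": payload})  # job input is nested under "input"
    return url, headers, body

# Placeholder values; send the composed request with urllib.request or requests.
url, headers, body = build_run_request("my-endpoint-id", "MY_API_KEY", {"prompt": "Hello"})
print(url)
```

Jobs submitted to `/run` are queued and processed as workers scale up, which is what lets you avoid managing the underlying infrastructure yourself.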

Get started with Serverless

Pods

Pods allow you to run containerized workloads on dedicated GPU or CPU instances.

RunPod offers two types of Pods:

  • Secure Cloud: Operates in Tier 3/Tier 4 (T3/T4) data centers, providing high reliability and security.
  • Community Cloud: Connects individual compute providers to users through a vetted, secure peer-to-peer system.

For more info, see Secure Cloud vs. Community Cloud.

Get started with Pods

RunPod CLI

RunPod provides a command-line interface (CLI) for programmatically managing Pods and for quickly developing and deploying custom Serverless environments.
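As a rough sketch of a CLI session, assuming the binary is installed as `runpodctl`: the pod name and GPU type below are placeholders, and the subcommand and flag names shown are illustrative and may differ between versions, so confirm them with `runpodctl help`. The script only prints the composed command, and calls the CLI only if it is actually installed.

```shell
#!/bin/sh
# Placeholder values; substitute your own pod name and GPU type.
POD_NAME="my-dev-pod"
GPU_TYPE="NVIDIA GeForce RTX 4090"

# Compose a pod-creation command (subcommand/flag names are illustrative;
# confirm with `runpodctl help` for your installed version).
CREATE_CMD="runpodctl create pods --name ${POD_NAME} --gpuType '${GPU_TYPE}'"
echo "${CREATE_CMD}"

# Only invoke the CLI if it is installed and configured with an API key.
if command -v runpodctl >/dev/null 2>&1; then
    runpodctl get pod    # list the Pods on your account
fi
```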

For more information, see the RunPod CLI documentation.

Support

If you need help, reach out to us on Discord, via email, or submit a request using our contact page.

Next steps