beam-cloud/beta9

Secure, high-performance AI infrastructure in Python.

Run AI Workloads at Scale


Beam is a fast, open-source runtime for serverless AI workloads. It gives you a Pythonic interface to deploy and scale AI applications with zero infrastructure overhead.

Watch the demo

✨ Features

  • Fast Image Builds: Launch containers in under a second using a custom container runtime
  • Parallelization and Concurrency: Fan out workloads to 100s of containers (see the sketch after this list)
  • First-Class Developer Experience: Hot-reloading, webhooks, and scheduled jobs
  • Scale-to-Zero: Workloads are serverless by default
  • Volume Storage: Mount distributed storage volumes
  • GPU Support: Run on our cloud (4090s, H100s, and more) or bring your own GPUs
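
The fan-out bullet above is easiest to see with Beam's `function` decorator and its `.map()` helper. This is a minimal sketch in the spirit of the Beam docs; treat the exact names (`function`, `.map()`) as assumptions if your SDK version differs:

```python
from beam import function

# Each call to square() runs in its own remote container
@function(cpu=0.25)
def square(x: int) -> int:
    return x * x

def main():
    # .map() fans the inputs out across containers and streams results back
    for result in square.map(range(10)):
        print(result)

if __name__ == "__main__":
    main()
```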

📦 Installation

```shell
pip install beam-client
```

⚡️ Quickstart

  1. Create an account here
  2. Follow our Getting Started Guide

Creating a sandbox

Spin up isolated containers to run LLM-generated code:

```python
from beam import Image, Sandbox


# Spin up a fresh container and execute a code snippet inside it
sandbox = Sandbox(image=Image()).create()
response = sandbox.process.run_code("print('I am running remotely')")

print(response.result)
```
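
Remote containers hold resources until they are cleaned up, so it's worth pairing `create()` with an explicit teardown. The sketch below assumes the sandbox object exposes a `terminate()` method (check the sandbox docs for your SDK version); everything else uses only the API shown above:

```python
# terminate() is an assumption here; verify it against the sandbox docs
sandbox = Sandbox(image=Image()).create()
try:
    response = sandbox.process.run_code("print('hello from the sandbox')")
    print(response.result)
finally:
    sandbox.terminate()  # release the container when you're done
```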

Deploy a serverless inference endpoint

Create an autoscaling endpoint for your custom model:

```python
from beam import Image, endpoint
from beam import QueueDepthAutoscaler


# Autoscaling inference endpoint: one A10G GPU, 2 CPUs, and 16Gi of memory,
# scaling up to 5 containers based on the depth of the request queue
@endpoint(
    image=Image(python_version="python3.11"),
    gpu="A10G",
    cpu=2,
    memory="16Gi",
    autoscaler=QueueDepthAutoscaler(max_containers=5, tasks_per_container=30),
)
def handler():
    return {"label": "cat", "confidence": 0.97}
```
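
As a rough mental model (an illustration, not Beam's internal code), a queue-depth autoscaler like the one above sizes the fleet from the backlog: queue depth divided by `tasks_per_container`, rounded up and capped at `max_containers`:

```python
import math

def desired_containers(queue_depth: int, tasks_per_container: int = 30,
                       max_containers: int = 5) -> int:
    """Illustrative queue-depth scaling rule, not Beam's implementation."""
    if queue_depth <= 0:
        return 0  # scale-to-zero: no queued work, no containers
    return min(max_containers, math.ceil(queue_depth / tasks_per_container))

assert desired_containers(75) == 3       # ceil(75 / 30) = 3
assert desired_containers(0) == 0        # idle -> zero containers
assert desired_containers(10_000) == 5   # capped at max_containers
```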

Run background tasks

Schedule resilient background tasks (or replace your Celery queue) by adding a simple decorator:

```python
from beam import Image, TaskPolicy, schema, task_queue


class Input(schema.Schema):
    image_url = schema.String()


@task_queue(
    name="image-processor",
    image=Image(python_version="python3.11"),
    cpu=1,
    memory=1024,
    inputs=Input,
    task_policy=TaskPolicy(max_retries=3),
)
def my_background_task(input: Input, *, context):
    image_url = input.image_url
    print(f"Processing image: {image_url}")
    return {"image_url": image_url}


if __name__ == "__main__":
    # Invoke a background task from your app (without deploying it)
    my_background_task.put(image_url="https://example.com/image.jpg")

    # You can also deploy this behind a versioned endpoint with:
    # beam deploy app.py:my_background_task --name image-processor
```
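
Because `.put()` simply enqueues a task, fanning out a batch is a plain loop. The sketch below uses nothing beyond the `.put()` call already shown; the URLs are placeholders:

```python
# Each .put() becomes one queued task, retried per the TaskPolicy above
for i in range(100):
    my_background_task.put(image_url=f"https://example.com/image_{i}.jpg")
```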

Self-Hosting vs Cloud

Beta9 is the open-source engine powering Beam, our fully managed cloud platform. You can self-host Beta9 for free or choose managed cloud hosting through Beam.

👋 Contributing

We welcome contributions, big or small.

❤️ Thanks to Our Contributors
