Open Source Deployment

Run the full bytefreezer platform yourself under the Elastic License 2.0.

Overview

  • Compute: Self-hosted
  • Storage: Your S3-compatible storage
  • AI: Your API keys
  • Support: Community

Requirements

  • Linux server (bare metal, VM, or container)
  • S3-compatible storage (AWS S3, MinIO, Backblaze B2, etc.)
  • AI API key (OpenAI, Anthropic, Google, or local LLM)

Quick Start

1. Clone the repository

git clone https://github.com/bytefreezer/bytefreezer.git
cd bytefreezer

2. Configure storage

Edit config.yaml:

storage:
  type: s3
  endpoint: https://s3.amazonaws.com
  bucket: your-bucket-name
  region: us-east-1
  access_key: ${AWS_ACCESS_KEY_ID}
  secret_key: ${AWS_SECRET_ACCESS_KEY}
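The same section works with any S3-compatible backend by pointing `endpoint` at it. As an illustrative sketch (the field names are taken from the example above; the MinIO host, port, and credential variable names are assumptions for a typical self-hosted setup):

```yaml
storage:
  type: s3
  endpoint: http://minio.internal:9000   # self-hosted MinIO endpoint (assumed)
  bucket: bytefreezer-data
  region: us-east-1                      # MinIO accepts any region string
  access_key: ${MINIO_ACCESS_KEY}
  secret_key: ${MINIO_SECRET_KEY}
```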

3. Configure AI

ai:
  provider: openai
  api_key: ${OPENAI_API_KEY}
  model: gpt-4
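Other providers listed under Requirements should follow the same shape. A hedged sketch for Anthropic (the `provider` value and model name are assumptions; check the provider list shipped with your release):

```yaml
ai:
  provider: anthropic
  api_key: ${ANTHROPIC_API_KEY}
  model: claude-3-5-sonnet-latest   # example model name (assumed)
```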

4. Start services

docker-compose up -d
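The `${...}` references in config.yaml are read from the environment, so the credentials must be exported in the same shell before starting the stack. A minimal sketch with placeholder values (the variable names come from the config examples above):

```shell
# Export the credentials referenced by config.yaml before starting services.
# Replace the placeholder values with your real keys.
export AWS_ACCESS_KEY_ID="your-access-key"
export AWS_SECRET_ACCESS_KEY="your-secret-key"
export OPENAI_API_KEY="your-openai-key"
# then start the stack:
# docker-compose up -d
```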

5. Deploy proxy

Install the proxy on each system you want to monitor:

curl -sSL https://get.bytefreezer.com/proxy | bash

Components

Component   Purpose           Default Port
Control     Management API    8080
Receiver    Data ingestion    8081
Piper       Data processing   -
Packer      Data packaging    -
Proxy       Data collection   8082

Next Steps