
Complete Pipeline

This pipeline combines all fan-out techniques from this tutorial:

  • Multi-destination output - Send every event to multiple destinations simultaneously
  • Broker pattern - Use fan_out to duplicate messages to each output
  • Destination-specific config - Configure batching and compression per destination
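Conceptually, a broker with pattern fan_out delivers a full copy of every input message to each child output. A minimal skeleton (the two file paths here are placeholders, not part of the pipeline below):

```yaml
# Minimal fan_out sketch: every message is copied to each listed output.
output:
  broker:
    pattern: fan_out
    outputs:
      - file: { path: /tmp/copy-a.jsonl, codec: lines }
      - file: { path: /tmp/copy-b.jsonl, codec: lines }
```

The full pipeline below applies the same pattern to four real destinations.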

Full Configuration

fan-out-pattern.yaml

```yaml
name: fan-out-pattern
description: Multi-destination fan-out pipeline for sensor data
type: pipeline
namespace: default
labels:
  pattern: fan-out
  data-type: sensor-metrics

config:
  input:
    http_server:
      address: 0.0.0.0:8080
      path: /events

  output:
    broker:
      pattern: fan_out
      outputs:
        # Local file for debugging
        - file:
            path: /tmp/events.jsonl
            codec: lines

        # Kafka for real-time streaming
        - kafka:
            addresses: ["${KAFKA_BROKERS}"]
            topic: sensor-events
            key: "${!this.sensor_id}"

        # S3 for long-term archival
        - aws_s3:
            bucket: "${S3_BUCKET}"
            region: "${AWS_REGION}"
            path: "events/${!timestamp_unix()}/${!uuid_v4()}.json"
            batching:
              count: 10
              period: 10s

        # Elasticsearch for search and analytics
        - elasticsearch:
            urls: ["${ES_ENDPOINT}"]
            index: "sensor-events-${!timestamp_unix()}"
            id: "${!this.event_id.or(uuid_v4())}"
            batching:
              count: 10
              period: 10s
```
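With these settings, a batch flushes when either trigger fires first: 10 messages accumulate, or 10 seconds elapse (assuming Benthos-style batching semantics, where any satisfied condition flushes the batch). A count of 10 keeps the feedback loop short while testing; for production archival you would typically raise both values, for example:

```yaml
# Illustrative production-leaning batching (values are examples only):
batching:
  count: 1000   # flush after 1000 messages...
  period: 60s   # ...or after 60 seconds, whichever comes first
```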

Quick Test

```shell
# Send an event (goes to all configured destinations)
curl -X POST http://localhost:8080/events \
  -H "Content-Type: application/json" \
  -d '{
    "sensor_id": "temp-001",
    "temperature": 72.5,
    "humidity": 45,
    "timestamp": "2024-01-15T10:30:00Z"
  }'
```
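To exercise the count-based flush (count: 10 on the S3 and Elasticsearch outputs), you can send ten events in a loop. This assumes the pipeline is running locally on port 8080 as configured above; the `|| true` guards just keep the loop going if it is not:

```shell
# Send 10 events so count-based batching triggers a flush.
for i in $(seq -w 1 10); do
  curl -s -X POST http://localhost:8080/events \
    -H "Content-Type: application/json" \
    -d "{\"sensor_id\": \"temp-$i\", \"temperature\": 72.5, \"humidity\": 45}" \
    || true
done

# The file output is unbatched, so copies land immediately; inspect it:
tail -n 10 /tmp/events.jsonl 2>/dev/null || echo "no events written yet"
```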

Deploy

```shell
# Deploy to Expanso orchestrator
expanso-cli job deploy fan-out-pattern.yaml

# Or run locally with expanso-edge
expanso-edge run --config fan-out-pattern.yaml
```
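The config interpolates several environment variables; set them before deploying. The values below are placeholders, not real endpoints:

```shell
# Placeholder values — substitute your own endpoints and bucket.
export KAFKA_BROKERS="kafka-1:9092"
export S3_BUCKET="my-sensor-archive"
export AWS_REGION="us-east-1"
export ES_ENDPOINT="http://elasticsearch:9200"
```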

Download

Download fan-out-pattern.yaml

What's Next?