# Troubleshooting

## Quick Diagnosis

```bash
# Check container status
docker ps | grep enrich

# Check recent logs
docker logs enrich-export --tail 50 2>&1 | grep -i error

# Test enrichment
curl -X POST http://localhost:8080/logs \
  -H "Content-Type: application/json" \
  -d '{"level": "INFO", "message": "test", "service": "api"}'

# Check S3 output
aws s3 ls s3://$BUCKET/logs/$(date +%Y/%m/%d)/ --max-items 5
```
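If the pipeline is built on Benthos/Redpanda Connect, which the Bloblang mappings below suggest, its HTTP API also exposes health endpoints worth checking. A minimal sketch, assuming the default API port 4195 (configurable via `http.address`):

```bash
# Liveness: returns pong while the process is up
curl -s http://localhost:4195/ping

# Readiness: fails until all inputs and outputs are connected
curl -s http://localhost:4195/ready
```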

## Common Issues

### Metadata not being added

Cause: The mapping is not executing, or a later assignment overwrites the metadata object.

Fix: Set individual metadata fields so they merge into the existing object rather than replacing it:

```yaml
- mapping: |
    root = this
    root.metadata.processed_at = now()
    root.metadata.pipeline = "enrich-export"
```
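For contrast, a mapping shaped like the following is a common way to trigger this issue: assigning a whole object to `root.metadata` discards any fields set by earlier processors:

```yaml
# Anti-pattern: replaces the entire metadata object instead of merging
- mapping: |
    root = this
    root.metadata = {"processed_at": now()}
```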

### S3 uploads failing

Cause: Missing credentials or insufficient bucket permissions

```bash
# Test S3 access
aws s3 ls s3://$BUCKET/ --max-items 1
```

Fix: Verify AWS credentials and bucket policy:

```yaml
output:
  aws_s3:
    bucket: ${S3_BUCKET}
    region: ${AWS_REGION}
    credentials:
      id: ${AWS_ACCESS_KEY_ID}
      secret: ${AWS_SECRET_ACCESS_KEY}
```
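If the listing succeeds but uploads still fail, check which identity the credentials actually resolve to and whether it can write, since `s3:ListBucket` can be allowed while `s3:PutObject` is denied. A quick check with the AWS CLI:

```bash
# Show the IAM identity the credentials resolve to
aws sts get-caller-identity

# Test write access explicitly, then clean up
echo test | aws s3 cp - s3://$BUCKET/logs/_write_test
aws s3 rm s3://$BUCKET/logs/_write_test
```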

### Small files being created

Cause: Batches are flushing too early, producing one small object per flush

Fix: Raise the batch count and period. A batch flushes as soon as any one trigger (count, period, or byte_size) is hit, so the period also bounds worst-case delivery latency:

```yaml
batching:
  count: 10000
  period: 5m
  byte_size: 52428800 # 50 MB; byte_size takes an integer number of bytes
```

### High S3 costs

Cause: Too many small objects; S3 bills per PUT request, so frequent small uploads inflate request costs

Fix: Increase the batch size and compress each batch:

```yaml
output:
  aws_s3:
    bucket: ${BUCKET}
    path: logs/${!timestamp_unix()}.json.gz
    batching:
      count: 50000
      period: 10m
      processors:
        # Join the batch into one newline-delimited object, then gzip it
        - archive:
            format: lines
        - compress:
            algorithm: gzip
```
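To confirm the diagnosis, summarize what is actually landing in the bucket; a large object count with a small average size points back at batching:

```bash
# Prints Total Objects and Total Size under the logs/ prefix
aws s3 ls s3://$BUCKET/logs/ --recursive --summarize | tail -2
```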

## Still stuck?

  1. Add debug logging with `logger: {level: DEBUG}` (snippet below).
  2. Check the Complete Pipeline for a reference config.
  3. Review Filter Severity for ways to reduce log volume.
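A minimal logger block, assuming a Benthos/Redpanda Connect-style config:

```yaml
logger:
  level: DEBUG   # e.g. TRACE, DEBUG, INFO, WARN, ERROR
  format: logfmt # logfmt is grep-friendly; json is also supported
```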