## Troubleshooting

### Quick Diagnosis

```bash
# Check container status
docker ps | grep content-splitting

# Check recent logs
docker logs content-splitting --tail 50 2>&1 | grep -i error

# Test with sample array
curl -X POST http://localhost:8080/sensors/bulk \
  -H "Content-Type: application/json" \
  -d '{"device_id": "test", "readings": [{"value": 1}, {"value": 2}]}'
```
### Common Issues

#### Array not being split

**Cause:** The array field name in the unarchive config doesn't match the incoming data, or the array field is missing entirely.

```bash
# Check logs for the incoming data structure
docker logs content-splitting --tail 20 2>&1 | grep readings
```

**Fix:** Verify that the field name in your config matches your data:

```yaml
- unarchive:
    format: json_array
    field: readings # Must match the field name in your data
```
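If the field name looks correct but nothing is split, a temporary `log` processor placed just before the `unarchive` step can confirm what the pipeline actually receives. A minimal sketch, assuming a Benthos/Redpanda Connect-style processor list:

```yaml
# Temporary debugging step: print each incoming payload before the split.
- log:
    level: DEBUG
    message: 'Incoming payload: ${! content() }'
- unarchive:
    format: json_array
    field: readings
```

Remove the log step once the field name is confirmed, since it prints full payloads.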
#### Parent context lost after split

**Cause:** Parent metadata was not stored before the split.

**Fix:** Store the parent context as metadata before the unarchive step, then restore it afterwards (note the `root = this`, which keeps the reading's own fields intact):

```yaml
- mapping: |
    meta device_id = this.device_id
- unarchive:
    format: json_array
    field: readings
- mapping: |
    root = this
    root.device_id = meta("device_id")
```
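If several parent fields need to survive the split, one option is to stash the whole parent (minus the array) in a single metadata entry and merge it back afterwards. A rough sketch, assuming a recent Benthos/Redpanda Connect version where metadata values can hold structured data:

```yaml
- mapping: |
    # Keep everything except the array itself as parent context.
    meta parent = this.without("readings")
- unarchive:
    format: json_array
    field: readings
- mapping: |
    # Re-attach the parent fields to each individual reading.
    root = this.merge(meta("parent"))
```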
#### Memory errors on large arrays

**Cause:** The incoming array is too large for the available memory.

```bash
# Check container memory
docker stats content-splitting --no-stream
```

**Fix:** Add size validation before splitting, so oversized arrays fail fast instead of exhausting memory:

```yaml
- mapping: |
    if this.readings.length() > 10000 {
      root = throw("Array too large")
    }
```
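The `throw` marks the message as failed, so it can be handled downstream rather than crashing the pipeline. One way to log and drop such messages is sketched below, assuming a Benthos/Redpanda Connect-style `catch` processor:

```yaml
# Applied only to messages that failed an earlier processor (e.g. the size check above).
- catch:
    - log:
        level: WARN
        message: 'Dropping oversized payload: ${! error() }'
    - mapping: root = deleted()
```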
#### Split messages going to the wrong destination

**Cause:** The routing condition doesn't match the structure of the split messages.

**Fix:** Make sure routing conditions are written against the split message format (individual items), not the parent array.
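For example, with a switch-style output each `check` is evaluated once per split message, so `this` refers to a single reading rather than the original array. A minimal sketch, assuming a Benthos/Redpanda Connect switch output with placeholder destinations:

```yaml
output:
  switch:
    cases:
      # After the split, `this` is a single reading such as {"value": 1},
      # so the condition must reference reading fields, not `this.readings`.
      - check: 'this.value > 100'
        output:
          stdout: {} # placeholder destination for illustration
      # Catch-all case for everything else; replace with your real destination.
      - output:
          drop: {}
```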
### Still stuck?

- Add debug logging with `logger: {level: DEBUG}` (see the sketch below)
- Check the Complete Pipeline for a reference config
- Review the Fan-Out Pattern for multi-destination routing
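A minimal sketch of where the logger setting sits, assuming a Benthos/Redpanda Connect-style top-level config; revert to INFO once you're done debugging:

```yaml
logger:
  level: DEBUG
  format: logfmt # or json
```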