# Machine output between stages
`formae inventory`, `apply`, `extract`, and `destroy` all emit JSON when you pass `--output-consumer=machine`. That's how a GitOps pipeline hands a value from one stage to the next without parsing CLI tables.
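For example, the machine output of `formae inventory` is a JSON document with a `Resources` array. The sketch below shows the general shape; the `label` and FQDN values are illustrative, and the exact field set varies by resource type:

```json
{
  "Resources": [
    {
      "Properties": {
        "label": "pg-server"
      },
      "ReadOnlyProperties": {
        "fullyQualifiedDomainName": "pg.example.net"
      }
    }
  ]
}
```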
## Pipeline shape
A typical GitOps run chains three jobs - provision, deploy, verify - linked by `needs:` for ordering and `outputs:` to forward values:
```yaml
jobs:
  provision:
    # formae apply
    outputs:
      <key>: <value>  # one entry per value the next jobs need
  deploy:
    needs: provision
    # consumes needs.provision.outputs.<key>
  verify:
    needs: deploy
    # smoke-test against the deployed app
```
See the infra-to-app example for a concrete end-to-end pipeline.
## The bridge
After a provision step, capture a property from the freshly created resource:
```sh
DB_HOST=$(formae inventory resources \
  --query='label:pg-server' \
  --output-consumer=machine \
  | jq -r '.Resources[0].ReadOnlyProperties.fullyQualifiedDomainName')
echo "db_host=$DB_HOST" >> "$GITHUB_OUTPUT"
```
The next job reads `${{ needs.provision.outputs.db_host }}` and feeds it into a deploy, migration, or test step.
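Wired into a workflow, the consuming job might look like the sketch below. The runner, script name, and flag are placeholders; `db_host` is the key the provision step wrote to `$GITHUB_OUTPUT`:

```yaml
deploy:
  needs: provision
  runs-on: ubuntu-latest
  steps:
    - run: ./deploy.sh --db-host "${{ needs.provision.outputs.db_host }}"
```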
## Where outputs live
Pipeline outputs land in `.Resources[].ReadOnlyProperties` - computed values like FQDNs, ARNs, generated names. User-declared values are in `.Resources[].Properties`. The full schema is documented in `formae inventory`.
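To see the split in practice, the `jq` filters below run against an inlined sample document (the field names and values are illustrative, not real `formae` output):

```shell
# Sample machine output; field names and values are illustrative.
json='{"Resources":[{"Properties":{"label":"pg-server"},"ReadOnlyProperties":{"fullyQualifiedDomainName":"pg.example.net"}}]}'

# User-declared value, from .Properties:
printf '%s\n' "$json" | jq -r '.Resources[0].Properties.label'

# Computed value, from .ReadOnlyProperties:
printf '%s\n' "$json" | jq -r '.Resources[0].ReadOnlyProperties.fullyQualifiedDomainName'
```

The same pattern - index into `Resources`, then pick the map that matches where the value originates - covers both kinds of output.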
## What's next
- `formae inventory` for the query syntax
- Fleet fan-out for the same shape across many repos
- CI/CD integration for general CI patterns