Creating Reusable Workflows in GitHub Actions: Best Practices
In the world of CI/CD, efficiency is key. Reusable workflows in GitHub Actions exist to eliminate redundancy and promote maintainability. By allowing you to define a workflow once and call it from multiple places, you can save time and reduce errors in your automation processes.
Creating a reusable workflow involves a few key components. First, add the `workflow_call` trigger under the `on` key in your YAML file; this is what makes the workflow callable from elsewhere. You can define inputs to pass data from the calling workflow, and secrets for sensitive information. Inputs must match the data types declared by the called workflow, which can be boolean, number, or string. When you call a reusable workflow, use the `with` keyword to pass named inputs and the `secrets` keyword for sensitive data. This setup gives you a clean, efficient way to manage workflows across different repositories.
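As an illustration of how a called workflow declares and consumes an input (the filename and input name here are hypothetical, not from the official docs):

```yaml
# Called workflow, e.g. .github/workflows/deploy.yml (hypothetical name)
on:
  workflow_call:
    inputs:
      environment-name:
        required: true
        type: string   # must match the type the caller passes

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      # Inside the called workflow, inputs are read via the inputs context
      - run: echo "Deploying to ${{ inputs.environment-name }}"
```

A caller would then invoke it with `uses:` and supply `environment-name` under `with:`.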
In production, be aware of some critical nuances. Environment secrets cannot be passed from the caller workflow, because `on.workflow_call` does not support the `environment` keyword. If you inherit secrets using `secrets: inherit`, the called workflow can reference the caller's secrets even if they aren't explicitly defined. Additionally, if you're dealing with nested reusable workflows, ensure that every workflow in the chain is accessible to the caller. Permissions can only be maintained or reduced, never elevated, throughout the chain. These details can make or break your workflows, so pay close attention to them.
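A short sketch of how the permissions rule plays out in a caller (the repository path is illustrative, borrowed from the examples below):

```yaml
# Caller workflow: the called workflow receives at most the permissions
# granted here -- it can keep or reduce them, but never elevate them.
jobs:
  call-with-reduced-permissions:
    permissions:
      contents: read   # restrict GITHUB_TOKEN for the whole call
    uses: octo-org/example-repo/.github/workflows/reusable-workflow.yml@main
    secrets: inherit   # caller's secrets become referencable downstream
```

If the reusable workflow tried to perform a write that needs `contents: write`, it would fail here, since the caller only granted read access.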
Key takeaways
- Define inputs and secrets in your reusable workflow using the `inputs` and `secrets` keywords.
- Use the `with` keyword to pass named inputs and the `secrets` keyword for sensitive data.
- Be cautious that environment secrets cannot be passed from the caller workflow.
- Remember that nested reusable workflows require all workflows in the chain to be accessible to the caller.
- Use `secrets: inherit` to reference secrets not explicitly defined in the calling workflow.
Why it matters
Reusable workflows can significantly reduce duplication in your CI/CD pipelines, leading to faster development cycles and fewer errors. This efficiency is crucial in maintaining a robust and scalable automation strategy.
Code examples
Defining a reusable workflow with inputs and secrets:

```yaml
on:
  workflow_call:
    inputs:
      config-path:
        required: true
        type: string
    secrets:
      personal_access_token:
        required: true
```

Calling the reusable workflow and passing data:

```yaml
jobs:
  call-workflow-passing-data:
    uses: octo-org/example-repo/.github/workflows/reusable-workflow.yml@main
    with:
      config-path: .github/labeler.yml
    secrets:
      personal_access_token: ${{ secrets.token }}
```

Inheriting all of the caller's secrets:

```yaml
jobs:
  workflowA-calls-workflowB:
    uses: octo-org/example-repo/.github/workflows/B.yml@main
    secrets: inherit # pass all secrets
```

When NOT to use this
The official docs don't call out specific anti-patterns here. Use your judgment based on your scale and requirements.