Transferring state in an Ansible Workflow

Ansible is mostly a stateless system in comparison to tools like Terraform.
Fact caching alleviates only some of this. The problem with any
caching system is that the cached data may be stale. The Ansible modules
determine the current state and transform it to the desired state,
if applicable; however, the end state is not saved for a subsequent
playbook or job template to reference.
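For reference, fact caching is enabled in ansible.cfg; below is a sketch using the jsonfile cache plugin (the path and timeout are illustrative):

```ini
[defaults]
# Only gather facts when the cache is empty or stale
gathering = smart
# Cache gathered facts as JSON files on the controller
fact_caching = jsonfile
fact_caching_connection = /tmp/ansible_fact_cache
# Consider cached facts stale after 24 hours
fact_caching_timeout = 86400
```

Even with caching enabled, the cache only holds gathered facts, not the end state a playbook produced.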

So what options are available to transfer state between
two otherwise independent playbooks / job templates?

Heavy-handed options

One option is to write variables to a file on the Ansible controller.
Keep in mind that AWX/Tower enforces extra security measures, for
example blocking become on the cluster nodes.
Subsequent playbooks or job templates can then read the file containing the
variables and execute roles and tasks based on those variable values.
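A minimal sketch of this file-based approach, assuming an earlier task has already registered a downtime_id (the file path is illustrative):

```yaml
---
# Playbook 1: persist state to a file on the controller
- name: Save state for later playbooks
  hosts: all
  tasks:
    - name: Write variables to a YAML file on the controller
      copy:
        content: {% raw %}"{{ {'downtime_id': downtime_id} | to_nice_yaml }}"{% endraw %}
        dest: /tmp/wf_state.yml
      delegate_to: localhost
      run_once: true

# Playbook 2 (run later): read the state back
- name: Load state from the controller
  hosts: all
  tasks:
    - name: Read the previously written variables
      include_vars:
        file: /tmp/wf_state.yml
```

Note that in an AWX/Tower cluster each job may run on a different node, so a local file written by one job is not guaranteed to be visible to the next, which is what motivates the network share variant.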

Another option or a slight variation of the above is to persist the file
to a network share. Of course, this adds another dependency on the
availability of that network share.

{% include image name="Simple_WF_file.png" position="center" size="XXL" alt="file" %}
Simple Workflow writing state to a file

At this point, if your needs tend towards stateful tooling, Ansible is probably
not the right tool for the job as it is mostly stateless.

Let’s look at a lightweight option next, which is suitable for specific use cases.

Using the set_stats module in a Workflow

This section will explore a use case where a maintenance window in your
monitoring solution needs to be created before some Ansible activity.
If the activity was successful, then the downtime will be deleted at the end.
Any monitoring solution will set a unique identifier for each downtime, here downtime_id.

Remember, while Ansible can gather facts at the start of a play with gather_facts: True
(the default) or via a setup task, it does not save that state
once a playbook or job template finishes.
The same is true for a Workflow in AWX and Ansible Tower, the web-based
front ends for Ansible.
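To make the distinction concrete, facts can be re-gathered at any point within a run, but they never outlive it (a sketch):

```yaml
---
- name: Facts live only for the duration of a run
  hosts: all
  gather_facts: True  # the default; same data as a setup task
  tasks:
    - name: Re-gather facts mid-play, e.g. after making changes
      setup:

    - name: Inspect a gathered fact; it is gone once the run ends
      debug:
        var: ansible_facts['distribution']
```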

Below is an example of a simple linear Workflow. It sets a downtime in the
SetsDowntime job template, moves on to the middle job template to do
some other tasks, and finally deletes the downtime in the DeletesDowntime
job template.

{% include image name="Simple_WF.png" position="center" size="XXL" alt="simple" %}
Simple Workflow

Once a playbook completes and the next playbook starts, Ansible is unaware of
anything that has happened in the previous playbook. What if, for example, at
the beginning of a Workflow, you set a downtime in your monitoring solution and
want to keep track of the downtime’s unique identifier, so that at the successful
completion of the Workflow Ansible can delete the downtime?

This is where the set_stats module comes in handy.
The set_stats module allows setting a particular “fact” that gets gathered by Tower
via a callback plugin. Tower then stores it in its database and passes it to
subsequent job templates in the Workflow.
This way, it is possible to transfer some state or information between two
otherwise independent playbooks / job templates.

The following examples show how a playbook in a Workflow can use set_stats to transfer information to a playbook executed later in the Workflow.

This example playbook will be the first node in a Workflow:

---
# First playbook
- name: First playbook - Prep tasks and set a downtime in a monitoring solution
  hosts: all
  tasks:
  - name: Some prep tasks
    ...

  - name: Use an API call to set a downtime in a monitoring solution
    uri:
      url: "https://<monitoring_FQDN>/api/downtimes/"
      method: POST
      return_content: True
      ...
    register: r_downtime

  - name: Set fact for subsequent job templates
    set_stats:
      data:
        downtime_id: {% raw %}"{{ r_downtime.json.downtime_id }}"{% endraw %}

  - name: DEBUG - output downtime_id
    debug:
      var: downtime_id # 61bff285-3390-4211-bf6f-0b684586e22a

When the Workflow has reached the final job template, that playbook can use the
above variable downtime_id, set by set_stats, to delete the monitoring
downtime for the targeted application stack. Below is an example of such a playbook.

---
# Last playbook
- name: Last playbook - Delete downtime that was set at the start of the Workflow
  hosts: all
  tasks:
  - name: DEBUG - output downtime_id
    debug:
      var: downtime_id # 61bff285-3390-4211-bf6f-0b684586e22a

  - name: Delete downtime via monitoring solution API
    uri:
      url: {% raw %}"https://<monitoring_FQDN>/api/downtimes/{{ downtime_id }}"{% endraw %}
      method: DELETE
      ...

The Workflow utilizing these playbooks / job templates could look like the following example.

{% include image name="Simple_WF_set_stats.png" position="center" size="XXL" alt="stats" %}
Simple Workflow with three job templates using set_stats

After execution of the SetsDowntime job template, its job details list
the artifacts that were set by the set_stats module invocation.

{% include image name="artifacts1.png" position="center" size="XXL" alt="artifacts" %}
Artifacts from SetsDowntime job template

The middle and last job template in the Workflow receive the artifact
as an extra variable (see below).

{% include image name="extravars.png" position="center" size="XXL" alt="extravars" %}
Artifact as an extra variable on subsequent job templates
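
As a side note, the module's defaults already suit this pattern: Tower only collects set_stats data as artifacts when it is not set per host. A sketch making the defaults explicit (per the set_stats documentation, per_host defaults to no and aggregate to yes):

```yaml
  - name: Set fact for subsequent job templates (defaults made explicit)
    set_stats:
      data:
        downtime_id: {% raw %}"{{ r_downtime.json.downtime_id }}"{% endraw %}
      per_host: no    # keep stats global so Tower stores them as artifacts
      aggregate: yes  # merge with stats set earlier in the run
```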


As you can see, for simple use cases, the set_stats module can be a
convenient vehicle to transfer data between multiple job templates, while
avoiding the introduction of file dependencies.

You are now ready to apply this technique in your new and existing Workflows.

Arctiq Team
