Managing Jamf configuration with Terraform and GitOps workflows
How to get started with Terraform and begin pursuing Infrastructure as Code workflows in your own Jamf environment.
Some housekeeping rules
Before getting started with Terraform and testing workflows, there are a few things to be aware of:
- If you missed the first part of this blog series, you may want to read it first for the context this article builds on.
- All the up-to-date information regarding Terraform – including future updates – is located on the Jamf developer site.
- As a best practice, perform all testing against a test instance and not a production instance.
With that said, let’s get to work!
Installing Terraform
In keeping with simplicity, we recommend using Homebrew to install Terraform. After opening Terminal, proceed through the following steps:
- Type in the following command to access the official HashiCorp directory:
brew tap hashicorp/tap
- Next, to install Terraform, execute the following command:
brew install hashicorp/tap/terraform
- Last, to validate the installation was successful, enter the following command to view the version of Terraform installed:
terraform -version
Note: For reference, detailed installation steps – alongside videos of the installation process – are documented on the HashiCorp developer’s site.
Starting a Terraform project
A Terraform project contains a number of specific files that are required to get up and running and start building out your project. Most of the relevant files end with the file extension .tf or something similar, like .tfvars.
Getting started may feel a bit daunting. While there are many components to getting started with Terraform, this guide aims to make this process easier.
- Create a folder to store the Terraform project:
mkdir jamf-terraform
- Change the working directory to the newly created folder:
cd ~/jamf-terraform
- Next, clone the template branch from the main Terraform module repo:
git clone -b template https://github.com/Jamf-Concepts/terraform-jamf-platform
After this is done, you’ll have a fully functional, templated Terraform project ready to go for Jamf Pro and Jamf Security Cloud.
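For orientation, the cloned project will roughly follow the layout below. This is an illustrative sketch based on the files discussed later in this article; the actual repo may organize its child modules differently.

jamf-terraform/
├── main.tf (root module: calls each child module)
├── variables.tf (root-level variable declarations)
├── terraform.tfvars (your values and credentials, created in the next section)
└── modules/
    └── example-module/
        ├── main.tf (resources for this child module)
        └── variables.tf (the child module's local variable declarations)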
Adding a variable file
One thing you do not want ending up in your GitHub repo is your login credentials or client secrets.
Thankfully, Terraform has a path for this.
Begin by launching your text editor of choice and creating a new file to hold your variable values.
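Here is a minimal sketch of what that file might contain, assuming basic authentication for Jamf Pro and an API client for Jamf Protect. The variable names shown are illustrative; the authoritative list lives in the template repo's variables.tf.

# terraform.tfvars -- sketch only; variable names are illustrative

# Jamf Pro, using basic authentication for now
jamfpro_instance_fqdn = "https://yourinstance.jamfcloud.com"
jamfpro_username      = "your-username"
jamfpro_password      = "your-password"

# Jamf Protect, using an API client from your endpoint security tenant
jamfprotect_url           = "https://yourtenant.protect.jamfcloud.com"
jamfprotect_client_id     = "your-client-id"
jamfprotect_client_secret = "your-client-secret"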
Name this file terraform.tfvars and save it to the top level of your newly cloned template repo.
ProTip: To keep things simple, we’ll be using basic authentication, but you can switch to using OAuth in the future.
Now you can fill in the credential information, such as the jamfpro_username and jamfpro_password fields, in the newly created terraform.tfvars file. Additionally, you’ll need to create an API client within the endpoint security tenant to populate the Jamf Protect section.
Note: The General Settings knob in the template refers to the boolean variable titled include_categories, which controls the module included with the template. This simple variable lets admins declare whether to apply a specific module or let everything run each time.
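To make that concrete, a knob is simply a boolean variable declared at the root level. Here is a minimal sketch of such a declaration; the default value shown is an assumption, so check the template repo's variables.tf for the real definition.

# variables.tf (root) -- sketch of the knob declaration
variable "include_categories" {
  type        = bool
  description = "When true, run the template's categories child module"
  default     = false
}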
Terraform state file
Since resources haven’t been committed to the instance yet, now is the perfect time to cover what the Terraform state file is.
Every time resources are added, changed or destroyed through Terraform, those updates are saved in a file called terraform.tfstate and then backed up to a file called terraform.tfstate.backup.
This is how Terraform remembers what has occurred within your instance – and it’s critical to everything Terraform does from that point forward. Each time you run Terraform against an instance, this state file is consulted before anything is done. Each state file needs to be associated with its own instance, so it’s important to keep them separated to maintain functionality and reduce issues.
Running Terraform
The setup is now in a good place: Terraform is installed, the template repo is cloned and your credentials are defined in a variables file.
We’re now ready to run our template module against our test Jamf Pro instance. To do so, open Terminal and run the following commands:
terraform init -upgrade
This initializes Terraform in the project directory. If you’re entirely new to the topic, it performs the initial installation of the required Terraform providers; if you’ve used them before on your machine, the -upgrade flag brings them up to the latest allowed versions.
terraform fmt -recursive
This recursively formats the Terraform code in your local cloned branch into the canonical style, so everything stays consistent and easy to read.
terraform plan
This compares your test Jamf Pro instance, the Terraform state file and the module code to determine what needs to be added to (or removed from) your instance. It returns a full rundown of what will happen when you execute the module.
terraform apply
This first runs the plan (from the previous command) and reports back what will be added, changed or destroyed. It then asks you to confirm the action by typing yes; any other response cancels the run.
terraform apply -parallelism=1
This forces resources to be created one at a time. Since many SaaS apps have systems in place to detect potential attacks, this is sometimes the best way to approach running Terraform.
terraform destroy -parallelism=1
This destroys any Terraform-created resources that are referenced in your state file.
Each subsequent use of the plan, apply or destroy commands:
- Consults your state file
- Assesses what needs to be added, changed, or destroyed
- And then executes the proper action
Running a specific module
Running the included template module will create seven categories in your Jamf Pro test instance, named for each Apollo mission from 11 through 17, as a simple test. Once these are added, you can easily destroy them using the commands above. The goal here is to visually demonstrate Terraform’s structure so you can start building modules that are relevant to your own environment.
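For a sense of what that looks like in code, here is a sketch of how a child module could create those categories. It assumes the Jamf Pro provider exposes a jamfpro_category resource; treat the resource and attribute names as assumptions and check the provider documentation and the template repo for the exact schema.

# modules/categories/main.tf -- sketch only; resource and attribute names assumed
locals {
  apollo_missions = [
    "Apollo 11", "Apollo 12", "Apollo 13", "Apollo 14",
    "Apollo 15", "Apollo 16", "Apollo 17",
  ]
}

resource "jamfpro_category" "apollo" {
  # One category per mission; for_each keeps each one addressable in state.
  for_each = toset(local.apollo_missions)
  name     = each.value
}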
The anatomy of Terraform
The core of the functionality here is the hierarchy of .tf files. The root-level main.tf calls the child module’s main.tf, which in turn references the .tfvars and variables.tf files. Below is a full breakdown of the thread and order of communication:
When you run terraform apply:
- The state file is consulted.
- main.tf (root) is assessed for the child modules being requested.
- variables.tf (root) is scanned for the variables relevant to the child modules being applied.
- terraform.tfvars is scanned for all relevant variable values for running each child module.
- main.tf (child) files are then called and executed, referencing their own local variables.tf files for relevant local child module variables.
- Each resource is then created, and references are saved to terraform.tfstate.
- Terraform reports back success or failure, along with relevant codes and response messaging, to aid with troubleshooting issues, if necessary.
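As a sketch of that chain (the module path and variable name here are illustrative, not copied from the repo), the root main.tf is where everything is tied together:

# main.tf (root) -- sketch of the call chain described above
module "categories" {
  source = "./modules/categories"

  # Values originate in terraform.tfvars, are declared in the root
  # variables.tf, and are passed down here; the child module re-declares
  # them in its own variables.tf before its main.tf uses them.
  category_prefix = var.category_prefix
}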
Things to keep in mind
- When you start building your own modules, you can copy the included template module and change it to suit your needs.
- Each child module must have its own main.tf and variables.tf files.
- Each child module needs to be represented in the root-level main.tf.
- Carefully read through the module in the template repo and look for every reference point so you fully understand how each piece works together to achieve the goal.
- Boolean variables, referred to as knobs in your terraform.tfvars file, are used to call specific child modules. Incidentally, this section may be omitted so that all the modules are called when terraform apply is executed.
- Module calls may be set as dependent on a variable being populated, or simple inclusion or exclusion statements may be set to determine when modules run, based on your own criteria, as shown in the sketch below.
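As a sketch of those last two points (the module and variable names are illustrative), a knob or a populated-variable check can gate a module call through a conditional count:

# main.tf (root) -- two ways to decide whether a child module runs

# 1. Gate on a boolean knob from terraform.tfvars:
module "categories" {
  source = "./modules/categories"
  count  = var.include_categories ? 1 : 0
}

# 2. Gate on a variable being populated at all:
module "protect_config" {
  source = "./modules/protect_config"
  count  = var.jamfprotect_url != "" ? 1 : 0
}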
Terraform is a very powerful tool with many built-in functions that you can use. Some of these will be necessary as you build out specific functionality, while others you may never touch; rest assured, they are all very well documented by the developer.