Now that our fabric configurations are defined in Terraform, two important pieces remain before we can manage everything in IaC style: useful test cases that give us confidence in our code, and a pipeline definition.
Defining a thorough set of test cases that won't take forever to run is an art in itself. Properly designed test cases can dramatically reduce the chance of outages due to human error. Your test code will very likely be more involved than the production configuration it validates. The whole idea behind IaC is to validate the production code and configuration against a staging environment before applying it to the production network, in as timely a manner as feasible. The faster the pipeline runs, the more agile your organization becomes in accommodating change. As with most things, you'll face tradeoffs.
pyATS is a good complement to any automation infrastructure: it is an end-to-end DevOps automation ecosystem. Agnostic by design, pyATS enables network engineers to automate their day-to-day DevOps activities, perform stateful validation of their devices' operational status, build a safety net of scalable, data-driven, and reusable tests around their network, and visualize everything in a modern, easy-to-use dashboard.
In this workshop, we'll use the pyATS framework to create two test cases:

- verify that every configured VNI is in the `up` state
- verify that all desired VNIs are deployed on each device
The pyATS test script is already created. Open and review the file `tests/test_vni.py`:

```shell
code -r /home/cisco/CiscoLive/DEVWKS-3320/tests/test_vni.py
```
```python
import logging

from pyats import aetest
from genie.conf import Genie

log = logging.getLogger(__name__)


class CommonSetup(aetest.CommonSetup):

    @aetest.subsection
    def establish_connections(self, testbed):
        genie_testbed = Genie.init(testbed)
        self.parent.parameters['testbed'] = genie_testbed
        device_list = []
        for device in genie_testbed.devices.values():
            log.info(f"Connect to device {device.name}")
            try:
                device.connect()
            except Exception:
                self.failed(f"Failed to establish connection to '{device.name}'")
            device_list.append(device)
        self.parent.parameters.update(dev=device_list)
        self.parent.parameters.update(desire_vni=[50001, 30001])


class VniTest(aetest.Testcase):

    @aetest.test
    def get_vni(self):
        self.all_vni_status = {}
        for device in self.parent.parameters["dev"]:
            vni = device.parse("show nve vni")
            self.all_vni_status[device.name] = vni["nve1"]["vni"]

    @aetest.test
    def check_vni_status(self):
        result = []
        for device in self.parent.parameters["dev"]:
            log.info(f"Check if all VNI are UP for device {device.name}")
            for key, vni in self.all_vni_status[device.name].items():
                if vni["vni_state"] != "up":
                    result.append({
                        "vni": vni["vni"],
                        "status": vni["vni_state"]
                    })
        if result:
            log.error(result)
            self.failed()

    @aetest.test
    def desired_vni(self):
        result = []
        for device in self.parent.parameters["dev"]:
            log.info(f"Check if all desired VNI are deployed on device {device.name}")
            for vni in self.parent.parameters["desire_vni"]:
                if vni not in self.all_vni_status[device.name].keys():
                    result.append(vni)
        if result:
            log.error(result)
            self.failed("not all VNI are deployed")


class CommonCleanup(aetest.CommonCleanup):

    @aetest.subsection
    def clean_everything(self):
        """Common Cleanup Subsection"""
        log.info("Aetest Common Cleanup")


if __name__ == "__main__":
    aetest.main()
```
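The test logic hinges on the dictionary the Genie parser returns for `show nve vni`. Below is an abbreviated, hypothetical sketch of that structure (real parser output carries additional keys such as mode, type, and flags) together with the same "all VNIs up" check the test case performs:

```python
# Abbreviated, hypothetical shape of device.parse("show nve vni") output:
# interface name -> "vni" -> VNI id -> per-VNI attributes.
parsed = {
    "nve1": {
        "vni": {
            50001: {"vni": 50001, "vni_state": "up"},
            30001: {"vni": 30001, "vni_state": "down"},
        }
    }
}

# Same check as check_vni_status: collect every VNI not in the "up" state.
down = [
    {"vni": vni["vni"], "status": vni["vni_state"]}
    for vni in parsed["nve1"]["vni"].values()
    if vni["vni_state"] != "up"
]
print(down)  # [{'vni': 30001, 'status': 'down'}]
```

A non-empty `down` list would make the test case fail, and the offending VNIs end up in the log, which is what you'll see in the pipeline output when a VNI is misconfigured.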
The pipeline is defined in `.gitlab-ci.yml` in the project root folder. As with the test cases, we created this file prior to the workshop.

```shell
code -r /home/cisco/CiscoLive/DEVWKS-3320/.gitlab-ci.yml
```
Let's look at our pipeline in more depth.
```yaml
variables:
  TF_HTTP_ADDRESS: ${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/terraform/state/${ENV}
  TF_HTTP_LOCK_ADDRESS: ${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/terraform/state/${ENV}/lock
  TF_HTTP_UNLOCK_ADDRESS: ${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/terraform/state/${ENV}/lock
  TF_HTTP_LOCK_METHOD: POST
  TF_HTTP_UNLOCK_METHOD: DELETE
```
The YAML `variables` dictionary above includes all the environment variables needed by each job. In our case, they are used to access the Terraform state in the GitLab HTTP backend.
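Terraform picks these `TF_HTTP_*` environment variables up automatically when the configuration declares an `http` backend, so none of the URLs need to be hard-coded. A minimal sketch of what that backend declaration looks like (the exact block in this project's configuration may differ):

```hcl
terraform {
  backend "http" {
    # Address, lock, and unlock endpoints are supplied at init time
    # via the TF_HTTP_* environment variables defined in the pipeline.
  }
}
```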
```yaml
stages:
  - review
  - validate
  - plan
  - deploy
  - verify
```
The YAML `stages` list above defines the order in which the pipeline stages run. We have five stages: `review` sets the correct environment based on the source branch, `validate` verifies the syntax of the Terraform configurations, `plan` generates the Terraform plan and shows what will change, `deploy` applies the Terraform plan, and `verify` runs all the test cases. The source branch determines which configuration is applied to our different environments (staging and production).
```yaml
set_deploy_env:stage:
  stage: review
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
  script:
    - echo "ENV=stage" >> review.env
  artifacts:
    reports:
      dotenv: review.env
```
For example, the job above sets the environment variable `ENV` to `stage` when the pipeline is triggered by a merge request. It writes `ENV` to a dotenv file that is read by subsequent jobs, which use it to select the correct Terraform workspace and the corresponding input variables.
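The pipeline also contains a production counterpart, `set_deploy_env:prod` (referenced in the `needs` list of the `tf_plan` job). Assuming it mirrors the staging job, it would look roughly like this, selecting the production environment when the pipeline runs on the `main` branch:

```yaml
set_deploy_env:prod:
  stage: review
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'
  script:
    - echo "ENV=prod" >> review.env
  artifacts:
    reports:
      dotenv: review.env
```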
```yaml
tf_plan:
  stage: plan
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
  needs:
    - job: set_deploy_env:stage
      optional: true
    - job: set_deploy_env:prod
      optional: true
    - job: tf_validate
  image:
    name: "hashicorp/terraform:1.3.7"
    entrypoint: [""]
  script:
    - terraform init
    - terraform plan -var-file=common.tfvars -var-file=$ENV.env.tfvars -out plan
  artifacts:
    name: plan
    paths:
      - plan
```
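The `tf_plan` job saves the generated plan as a pipeline artifact, which the `deploy` stage then applies. The deploy job is not reproduced in full here; assuming it follows the same pattern as `tf_plan`, a minimal sketch would be:

```yaml
tf_deploy:
  stage: deploy
  needs:
    - job: tf_plan
      artifacts: true        # pulls in the saved plan file
  image:
    name: "hashicorp/terraform:1.3.7"
    entrypoint: [""]
  script:
    - terraform init
    - terraform apply plan   # applies exactly what tf_plan produced
```

Applying the saved plan file (rather than re-planning) guarantees that what gets deployed is exactly what was reviewed.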
Now that everything is in place, let's proceed to see the CI portion of our pipeline in action!