My First Pure-Python Network Automation with pyATS / Genie !

I am very proud of this next piece of infrastructure as code for a few reasons.

  1. It addresses the problems with performance and speed at scale I’ve had with my current methodology and tools (Ansible)
  2. I feel like I am ready to “graduate” from Ansible to Python
  3. I’m already using Genie
  4. I’m already using pyATS
    * Limited to the handful of Solution Examples
  5. I already have working automation solutions and I think I can translate, refactor, or at least draw inspiration from my previous Ansible-based solutions.

Where to start?

I’ve been down the road of learning network automation from scratch – this time let’s start with simple information gathering and transformation.

Speaking of inspiration – I am going to start with a “Just the Facts” approach and capture show interfaces status – my favourite command – in CSV and Markdown, and this time let’s spice it up and also throw in an HTML page. All from Genie-parsed JSON.

Only this time using pure Python – no Ansible training wheels (crutches ?)

How to attack this ?

One approach is to break it down in human language and then see if we can translate it to Python. Another is to find working examples and guides provided by the Cisco team. Using a mix of the two and some other online resources, here is how I did it.

The job folder is where I will keep the pyATS job file and the code file. Output will hold the 3 output files. I plan on using Jinja2 just like in Ansible, so we need a Templates folder. Finally, pyATS uses the concept of testbed files to set up connectivity and authentication. These are very similar to Ansible group_vars.

I’ve included a .gitignore file to keep the .pyc files out of the Git repository.

The Job file. This is a pyATS control file you can use to run the code. You can feed arguments in this way but I have not done that here.

The job file
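Here is a minimal sketch of what the job file can look like (the file and script names are my own examples, not necessarily what you will use):

# show_interfaces_status_job.py - a minimal pyATS job file sketch
import os
from pyats.easypy import run

def main(runtime):
    # Resolve the code file relative to this job file and hand it to Easypy
    testscript = os.path.join(os.path.dirname(__file__), 'show_interfaces_status.py')
    run(testscript=testscript, runtime=runtime)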

Pretty simple so far – import the os and run the code.

First thing in the Python code is to set up the Python environment you need. Make sure to import the json module as we need to work with the Genie-parsed data.

The actual code

Next we will setup Jinja2 and the File loader

Jinja2 setup
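A sketch of that setup, assuming the templates live in a sibling Templates folder:

from jinja2 import Environment, FileSystemLoader

# Point the Jinja2 environment at the Templates folder
env = Environment(loader=FileSystemLoader('../templates'))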

Now we import Genie and pyATS

Setup a logger

Ok so we need 3 source templates one for each file type
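Something like this – one template per output format (the template file names are my assumption):

csv_template = env.get_template('show_int_status_csv.j2')
md_template = env.get_template('show_int_status_md.j2')
html_template = env.get_template('show_int_status_html.j2')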

Turn on the logger

Let’s load up the testbed file

A testbed looks like this:
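Here is a representative sketch (the device name, IP, and encrypted string are placeholders):

testbed:
  name: Campus_Lab

devices:
  LabCore:
    os: iosxe
    type: switch
    credentials:
      default:
        username: automation
        password: "%ENC{gAAAAABxxxxexamplexxxx}"
    connections:
      cli:
        protocol: ssh
        ip: 10.0.0.1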

Note that yes! We CAN encrypt the string! %ENC{ } represents the pyATS encrypted string! Safe to store in Git repos!

Now some magic – we parse our command into a variable as JSON
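In code that can be as simple as this sketch (the device name matches my example testbed):

from genie.testbed import load

# Load the testbed, connect, and parse the command into structured data
testbed = load('../testbed/testbed.yaml')
device = testbed.devices['LabCore']
device.connect()
parsed = device.parse('show interfaces status')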

Run the results thru the templates
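Each template gets the same parsed dictionary – roughly:

# Assuming the Genie parser's top-level key for this command is 'interfaces'
csv_output = csv_template.render(interfaces=parsed['interfaces'])
md_output = md_template.render(interfaces=parsed['interfaces'])
html_output = html_template.render(interfaces=parsed['interfaces'])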

The templates look like this:

CSV
Markdown
HTML

Then we create the output files back in Python to finish the job
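Roughly like so (output paths are my examples):

with open('../output/show_int_status.csv', 'w') as f:
    f.write(csv_output)
with open('../output/show_int_status.md', 'w') as f:
    f.write(md_output)
with open('../output/show_int_status.html', 'w') as f:
    f.write(html_output)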

Which look like – ha! – we don’t know if this works yet! Let’s check it out!

The job in action

The command to run the job

Next it loads up the testbed

pyATS is very verbose – but a good verbose, with valuable information about your job

Next the actual SSH connection sets up using Unicon (this is different than Ansible, which uses paramiko)

Ok my device’s banner is displayed. My banner is left over from some CI/CD work but it’s the right banner – I’m in !

Some basic platform stuff gets dumped to the job log followed by my next job steps

It seems to be working so far
show interfaces status

Ok it’s fired the command! Milestone in the job reached – now it should register this result as JSON in a variable next.

Now during my development I added the following to confirm this step was working to dump the variable to the screen:

print(variable_name)

print() replaces Ansible’s debug: msg=”{{ }}” – good!

Similar to an Ansible recap we get a pyATS Easypy Report

Easypy Report > Ansible.log

The Git Add * test

I like to build suspense so I change directories up a folder and try to stage, hopefully, the 3 new files into Git

cd ..

git add *

git commit -am "did my first python code work?"


Amazing… but what do they look like?!?

They look incredible!

CSV output
Markdown Output
HTML Output – RAW
HTML Rendered

What does this mean ?

It means, seemingly, I’ve been mastering the wrong tool. That a faster, easier, and more elegant tool is available. This is ok – I feel like Ansible was primary school and I’m moving into the next stage of my life as a developer and moving up into high school with Python.

It also means I have a lot of code to refactor into Python – also fine – a good opportunity to teach my colleagues.

It also means I will be focusing less and less on Ansible, I think, and more and more on Python

20 years ago I was studying to become a computer programmer analyst in college writing C++, Java, Visual Basic 6, COBOL, CICS, JCL, HTML, CSS, SQL, and JavaScript and now, two decades later, I still have the magic touch and have figured out Python.

You can expect a lot more solutions like this – in fact I am going to see if I can work my #chatbot / #voicebot capabilities into Python.

Modern_Show_Interfaces_Status by automateyournetwork

A modern approach to the Cisco IOS-XE show interfaces status command using Python pyATS / Genie and Jinja2 templating to create business-ready CSV, Markdown, and HTML files

My Network Talks To Me – Literally!

This next post may seem like science fiction. I woke up this morning and checked that my output files – MP3 files – really did exist and that I actually made my Cisco network “talk” to me!

This post is right out of Star Trek so strap yourself in!

Can we make the network talk to us?

After my success with my #chatbot my brain decided to keep going further and further until I found myself thinking about how I could actually make this real. Like most problems let’s break it down and describe it. How much of this can I already achieve and what tools do I need to get the rest of the solution in place?

I know I can get network data back in the form of JSON (text) – so in theory all I need is a text-to-speech conversion tool.

Enter: Google Cloud !

That’s right Google Cloud offers exactly what I am looking for – a RESTful API that accepts text and returns “speech” ! With over 200 languages in both male and female voices I could even get this speech in French Canadian spoken in a dozen or so different voices!

I am getting ahead of myself but that is the vision:

  1. Go get data, automatically, from the network (easy)
  2. Convert to JSON (also easy)
  3. Feed the JSON text to the Google Cloud API (in theory, also easy)

The process – Google Cloud setup

There is some Google Cloud overhead involved here and this service is “free” – for up to 1 million processed text words or 3 months whichever comes first. It also looks like you get $300 in Google bucks and only 3 months of free access to Google Cloud.

Credit card warning: You need a credit card to sign up for this. They assured me, multiple times, that this does not automatically roll over to a paid subscription; after the trial expires you have to actually engage and click and accept a full registration. So I hope this turns out to be free for 3 months and no actual charges show up on my credit card. But in the name of science fiction I press on.

So go setup a Google Cloud account and then your first project and eventually you will land on a page that looks like this.

Enable an API and search for text

Enable this API and investigate the documentation and examples if you like.

Now Google Cloud APIs are very secure to the point of confusion. So I have not fully ironed out the whole automation pipeline yet – mainly because of how complex their OAuth2 requests seem to be – but for now I have a workaround I will show you to at least achieve the theoretical goal. We can circle back and mess with the authentication later. (Agile)

Setup OAuth2 credentials (or a Service Account if you want to use a JSON file they provide you)

Make sure it is a Web Application

This will provide you your clientID and secret.

For most OAuth2 that’s all you need – maybe I am missing the correct OAuth2 token URL to request a token but for now there is another tool you can use to get a token.

Google has an OAuth2 Developer Playground you can use to get a token while we figure out the OAuth stuff in the background

Follow the steps to select the API you want to develop (Cloud Text-To-Speech)

Then in the next step you can request and receive a development token

You can also refresh this token / request a new token. So copy that Access Token we will need it for Postman / Ansible in the next steps.

Moving to Postman

Normally under my collection I would setup my OAuth – here is the screenshot of the settings I’m just not sure of. And here is the missing link to full automation.

So far so good here

Again, this might be something trivial and I am 99% sure it’s because I have not set up this redirection or linked the two things, but it was getting late and I wanted to get this working and not get stuck in the OAuth2 weeds

First here is what I think I have right:

Token name: Google API
Auth URL: https://accounts.google.com/o/oauth2/auth
Access Token URL: https://accounts.google.com/o/oauth2/token
Client ID:
Client Secret:
Scope: https://www.googleapis.com/auth/cloud-platform

But what I’m not really sure what to do with is this Callback URL – I don’t have one of those ?

Callback URL: This is my problem – I am really not sure what I need to do here?

I believe I need to add it here:

But I have no cookie crumbs to follow – all of the ? help icons are sort of “This is where your Authorized redirect URLs go” and that’s it.

Open call to Google Cloud Development – I just need this last little step then the rest of this can be automated

Anyway moving along – pretending we have this OAuth2 working – we can cheat for now using the OAuth2 Playground method.

So here is the Postman request:

We want a new POST request in the Google Cloud API Postman Collection. The URL is:

https://texttospeech.googleapis.com/v1/text:synthesize

So cheat (for now) and grab the token from the OAuth2 Playground and then in your Postman Request Authorization tab – select Bearer Token and paste in your token. From my experience this will likely start with ya29 (so you know you have the right data here to paste in)

Tab over to Headers and double-check you have Bearer ya29.(your key here)

As far as the body goes – again we want RAW JSON

Now for the science fiction – in the body, in JSON, we set up what text we want converted to speech

The canned example they provide
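It looks roughly like this (reconstructed from the Google documentation – the text string is my own):

{
  "input": { "text": "Hello world! This is my network talking." },
  "voice": {
    "languageCode": "en-US",
    "name": "en-US-Wavenet-A",
    "ssmlGender": "MALE"
  },
  "audioConfig": { "audioEncoding": "MP3" }
}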

So we need the input (text) which is the text string we want converted.

We then select our voice – including the languageCode and name (and there are over 200 to choose from) – and the gender of the voice we want.

Lastly we want the audioConfig including the audioEncoding – and since MP3 was life for me in the mid to late 90s – let’s go with MP3 !

Hit Send and fire away!

Good start – a 200 OK with 52KB of data returned. It is returned as a JSON string:

Incredible stuff – this is the human voice pattern saying the text string – expressed as a base64-encoded audio string !

Curious, I found the Base64 Guru !

Ok – very cool stuff – but what am I supposed to do with it?

Fortunately Google Cloud has the insight we need

Hey – it’s exactly where we are at ! We have the audioContent and now we need to decode it!

Since I am still developing in my Windows world with Postman let’s decode our canned example

Carefully right-click copy (don’t actually click the link Postman will spawn a GET tab against the URL thinking you are trying to follow the URL along)

Now create a standard text (.txt) file in a C:\temp\decode folder and paste in the buffered response

I’ve highlighted the “audioContent”: “ prefix – you have to strip this out / delete it, as well as the trailing “ at the end of the string – we just want the base64 data itself, starting with // and beyond to the end of the string

Launch cmd and change to the C:\temp\decode folder and run the command

certutil -decode sample.txt sample.mp3

As you see if your text file was valid you should get a completed successfully response from the certutil utility. If not – check your string again for leading and trailing characters.

Otherwise – launch the file and have a listen!

How cool is that?!?!?

Enter: Network Automation

As I’ve said before – anything I can make work with Postman I can automate with the Ansible URI module! But instead of some static text, I plan on getting network information back and having my network talk to me!

The playbook:

First we will prompt for credentials to authenticate

Now – let’s start with something relatively simple – can I “ask” the Core what IOS version it’s running? Sure – let’s go get the Ansible Facts, of which the IOS version is one, and pass the results along to the API !

For now we will hard code our token – once I figure this out I will just have an earlier Ansible URI step to go get my token, with prompted Client ID / Client Secret at the start of the playbook along with the Cisco credentials. Again, a temporary workaround

Again because I have used body_format: json I can write the body in YAML.

Let’s mix up the voice a little bit too so hit the Voices Reference Guide

Ok so for our body let’s have some fun and try an English (US) WaveNet-A Male.

For the actual text let’s mix a static string “John the Lab Core is running version” and then the magic Ansible Facts variable {{ ansible_facts[‘net_version’] }}

And see if this works

We need to register the response from the Google Cloud API and delegate the task to the localhost:
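A sketch of that task – the token variable and registered variable names are my placeholders:

- name: Ask Google Cloud to say the Core version
  uri:
    url: https://texttospeech.googleapis.com/v1/text:synthesize
    method: POST
    headers:
      Authorization: "Bearer {{ google_token }}"
    body_format: json
    body:
      input:
        text: "John the Lab Core is running version {{ ansible_facts['net_version'] }}"
      voice:
        languageCode: "en-US"
        name: "en-US-Wavenet-A"
        ssmlGender: "MALE"
      audioConfig:
        audioEncoding: "MP3"
    status_code: 200
  register: speech_response
  delegate_to: localhost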

So now we need to parse and get just the base64-audio string into a text file. Just like in Postman this is contained in the json.audioContent key:
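Continuing the sketch (the output path is mine):

- name: Save the base64 audio string to a text file
  copy:
    content: "{{ speech_response.json.audioContent }}"
    dest: ../output/core_version.txt
  delegate_to: localhost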

Now we have to decode the file! But this time with a Linux utility not a Windows utility

We can call the shell from Ansible easily to do this task:
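For example, with the Linux base64 utility:

- name: Decode the base64 text file into an MP3
  shell: base64 -d ../output/core_version.txt > ../output/core_version.mp3
  delegate_to: localhost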

Now in theory this should all work and I should get a text file and an MP3 file. Let’s run the playbook!

Let’s check if Git picked up the new file!

Ok ! What does it sound like!?!

Ok this is incredible!

Let’s try some Genie / pyATS parsing and some different languages !

Copy and paste and rename the previous playbook and call the new file GoogleCloudTextToSpeech_Sh_Int_Status.yml

Replace the ios_facts task with the following tasks

So now for our actual API call we want to conditionally loop over each interface if it is DOWN / DOWN (meaning not UP / UP and not administratively DOWN)

Now as an experiment – let’s use French – Canadian in a Female Wavenet voice.

Does this also translate the English text to French? Or do I need to write the text en français? Let’s try it!

So this whole task now looks like this:
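(A sketch – the registered variable name, token, and message text are my stand-ins.)

- name: Speak each DOWN / DOWN interface in French Canadian
  uri:
    url: https://texttospeech.googleapis.com/v1/text:synthesize
    method: POST
    headers:
      Authorization: "Bearer {{ google_token }}"
    body_format: json
    body:
      input:
        text: "Attention! Interface {{ item.key }} on the Lab Core is down"
      voice:
        languageCode: "fr-CA"
        name: "fr-CA-Wavenet-A"
        ssmlGender: "FEMALE"
      audioConfig:
        audioEncoding: "MP3"
    status_code: 200
  loop: "{{ pyatsint_status_raw.interfaces | dict2items }}"
  when: item.value.status == 'notconnect'
  register: speech_results
  delegate_to: localhost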

Now we need another loop and another condition to register the text from the results. We loop over the results of the first loop and when there is audioContent send that content to the text file.

Caution! RegEx ahead! Don’t be alarmed! Because of the “slashes” in an interface (Gigabit10/0/5) the file path will get messed up so let’s regex them to underscores

Then we need to decode the text files!

So let’s run the playbook!

So far so good – now our conditionals should kick in – we should see some items skipped in light blue text then our match hits in green

Similarly, our next step should also have skipped items and then yellow text indicating the audioContent has been captured

Which is finally converted to audio

What does it sound like!??! Did it automatically translate the text as well as convert it to speech?

TenGig

Gig

A little more fun with languages

I won’t post all the code but I’ve had a lot of fun with this !

How about the total number of port-channels on the Core – in Japanese ?!

Summary

In my opinion this Google Cloud API and network automation integration could change everything! Imagine if you will:

  • Global, multilingual teams
  • Elimination of technical text -> Human constructed phrasing, context, and simplicity
  • Audio files in your source of truth
  • Integrated with #chatops and #chatbots
  • Accessibility for visually impaired staff or anyone otherwise physically challenged by text-based operations
  • A talking network!

This was a lot of fun and I hope you found it interesting! I would love to hear your feedback!

Creating Your Own Network Automation ChatBot with Discord, Ansible, and Genie/pyATS!

As you may know I love to play with new toys. I especially love connecting new toys with my old toys. What you may not know is that I am also an avid World of Warcraft fan and player! In order to run what are known as “raids”, group content designed for 10 – 30 players, I use a program called Discord.

My goal was simple – could I send myself messages in Discord from my Ansible playbooks with network state data? Could I create a #chatbot in this way ?

As it turns out not only could I achieve this – it is actually pretty straightforward and simple to do!

Setup

There are not a lot of steps in the setup.

  • Download and install Discord
  • Setup an account
  • Create a Server
  • Create a Channel
  • I named my channel AutomateYourNetwork and set it up as private with invite only RBAC to see the channel
  • Once we have a server and channel setup we need to setup the Integrations
  • Now setup a WebHook
  • Select the channel you want your chatbot to send messages to
  • We will need the Webhook URL

Postman Development

As with all new API development I always start in Postman to sort out the authentication, headers, and body to POST against this new Discord Webhook.

First let’s setup a new Postman Collection called Discord; add a new POST request to this collection

For the request itself make sure you change the default GET to a POST and then use the following URL:

https://discord.com/api/webhooks/<your URL copied from Discord here>

The body is flexible but to get started let’s just use a small set of values.

Set your body to RAW JSON

And add this body (change your username unless you want this message to look like it came from me!!)
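Something like this (the username and message are just examples):

{
  "username": "automateyournetwork",
  "content": "Hello Discord! This message came from Postman!"
}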

Now if you really want to see how fast / real-time / amazingly cool this is – make sure you have your Discord logged in but minimized to your system tray

Hit SEND in Postman

Your Discord should have notified you about a new message! In my system tray the icon has also changed!

Which is no surprise because back in Postman we see we received a valid 204 No Content response from the API

Let’s see what Discord looks like

How cool is this?!?

Integrating with Network Automation and Infrastructure as Code

Ok this is great – but can we now integrate this into our CI/CD pipeline Ansible playbooks?

Can I send myself network state data ? Can we create a #chatops bot ?

Yes we can!

Let’s start with Ansible Facts – and see if we can chat ourselves the current version of a device.

First let’s setup our credential prompts

Then, let’s use ios_facts

That’s more or less all I need to do – next let’s use the URI module to send ourselves a chat!

I will break down this next URI task; first setup the URL – again after /webhooks/ {{your URL here }}

This is a POST

I like to do this next step for two reasons; one, to set the body of the POST to JSON (obvious) but also to allow me to use YAML syntax in Ansible to write the body of the POST (not so obvious). Without this my body would need the JSON formatting (think moustaches and brackets) which is hard enough to write on its own, and very hard to write inside a YAML Ansible task

Meaning I can format the body as such (in YAML):
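For example (the message text is my own):

body:
  username: "automateyournetwork"
  content: "The Lab Core is running IOS version {{ ansible_facts['net_version'] }}"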

And, like we saw in Postman, we are expecting a 204 back and no content

Make sure you are delegating to the localhost (you don’t want this step to run on your Cisco switch)

Again back to the body we are accessing the Ansible magic variable ansible_facts and the key net_version

Let’s run the playbook!

Discord looks happy

And, with the power of Internet Magic, here is our message!

This is incredible – let me integrate Cisco Genie / pyATS now and send some parsed network state data next – to show the real power here

The playbook structure is more or less the same so save the Ansible Facts version playbook and copy / rename it to Show Int Status.

Keep the prompts; remove the ios_facts task and replace it with this task

Followed by the Genie parsing step

Then we need to adjust the Discord message – for my example I only want a message if, for example, an interface is configured to be UP / UP (meaning it is not administratively down) but is DOWN / DOWN (the notconnect state). I don’t care about UP / UP or administratively down interfaces.

Again, I will break down this

Most of this is the same

Here comes my magic with Genie. We want to loop over each interface Genie has parsed into the registered variable pyatsint_status_raw.interfaces. We need to convert this dictionary into a list of items so filter it | dict2items

Now we want a condition on this loop; only call the Discord API when the {{ item.value.status }} key (that is to say, each iteration in the loop’s status value) equals “notconnect”

Now we can reference the item.key for the per-interface name and the item.value.status for the notconnect status when it hits a match in the body of the message we are sending to Discord.

The task as a whole looks like this:
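(A sketch – the webhook variable and message text are placeholders.)

- name: Send notconnect interfaces to Discord
  uri:
    url: https://discord.com/api/webhooks/{{ discord_webhook }}
    method: POST
    body_format: json
    body:
      username: "automateyournetwork"
      content: "ALERT: {{ item.key }} on the Core is {{ item.value.status }}"
    status_code: 204
  loop: "{{ pyatsint_status_raw.interfaces | dict2items }}"
  when: item.value.status == 'notconnect'
  delegate_to: localhost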

So we run this new playbook which sort of looks like this. Remember we have a condition so the light blue text indicates non-matches / skipped interfaces (because they are connected or admin down); green indicates a hit.

Drumroll please

And now in Discord I have this wonderful, pager-like, real-time “alert”

Now go build one!

Here is the GitHub repository – go try to build one for your network!

DiscordNetworkChatBot by automateyournetwork

Ansible playbooks that chat with Discord using Ansible, Genie/pyATS, and the Discord webhooks that send network state information as a Discord message!

I would *love* to see your #chatbot in action – please hit me up on Twitter and show me what you got !

Is Your Network At Risk – Automating the Cisco PSIRT with Genie and Ansible

The security of my network keeps me up at night. Honestly it does. We live in a world where enterprise networks are defending themselves against state-sponsored attacks. But what do attackers look for? Typically, open, well-known, vulnerabilities.

Well, at least to the hackers, attackers, and even script-kiddies, these vulnerabilities are well-known. Those playing defense are often a step or two behind just identifying vulnerabilities – and often at the mercy of patch-management cycles or operational constraints that prevent them from addressing (patching) these waiting-to-be-exploited holes in the network.

What can we do about it?

The first thing we can do is to stay informed ! But this alone can be a difficult task with a fleet of various platforms running various software versions at scale. How many flavours of Cisco IOS, IOS-XE, and NXOS platforms make up your enterprise? What version are they running? And most importantly is that version compromised?

The old way might be to get e-mail notifications (hurray more e-mail!), maybe RSS-feeds, or go device-by-device, webpage-by-webpage, looking up the version and if it’s open to attack or not.

Do you see now how an enterprise becomes vulnerable? It’s tedious and time-intensive work. And the moment you are done the data is stale – what, are you going to wake up every day and review threats like this manually? Assign staff to do this? Just accept the risk and do the best you can trying to patch quarterly?

Enter: Automation

These types of tasks beg to be solved with automation !

So how can you do it?

Let’s just lay out a high level, human language, defined use-case / wish list.

  1. Can we, at scale, go get the current IOS / IOS-XE / NXOS software version from a device?
  2. Then can we send that particular version somewhere to find out if it has been compromised ?
  3. Can we generate a report from the data above?

Technically the above is all feasible; easy even!

  1. Yes, we can use Ansible and the Cisco Genie Parser to capture the current software version
  2. The Cisco Product Security Incident Response Team has incredible, secure, REST APIs available that we can automate with the Ansible URI module
  3. Using Jinja2 templates we can craft business-ready reports

Getting Started

First you need to setup your Cisco.com REST API suite access

Pre-Ansible Development

As with any new REST API I like to start with Postman and then transform working requests into Ansible playbooks.

First, under a new or existing Cisco.com Collection, add a new request called IOS Vulnerabilities

Cisco.com uses the OAuth2 authentication mechanism where you first must authenticate against one REST API (https://cloudsso.cisco.com/as/token.oauth2) which provides back an authorization Bearer token used to then authenticate and authorize against subsequent Cisco.com APIs

Your Client ID and Secret are found in the API portal after you register the OpenVuln API

Request and use a token from the API in Postman:

Now add your request for IOS

https://api.cisco.com/security/advisories/ios?version=

Test it out!

Let’s hard code a version and see what flaws it has

Ok so it has at least 1 open vulnerability!

Does it tell us what version fixes it?

Ok let’s check that version quickly while we are still in Postman

This version has no disclosed vulnerabilities!

One interesting thing of note – and our automation is going to need to handle it – is that if there are no flaws found we get a 404 back from the API not a 200 like our flawed response!

The Playbook

For the sake of the example I am using prompted inputs / response but these variables could easily be hardcoded and Ansible Vaulted.

So first prompt for your Cisco host username and password and your Cisco.com Client ID and Client Secret

Register the response

Then in the IOS.yml group_vars I have my Ansible network connections

I’ve put all IOS-platforms in this group in the hosts file to target them

Next step run the show version ios_command and register the response

Genie parse and register the JSON

Now we need, just like in Postman, to go get the OAuth2 token but instead of Postman we need to use the Ansible URI module
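A sketch of the token request (the variable names are mine):

- name: Get an OAuth2 token from Cisco.com
  uri:
    url: https://cloudsso.cisco.com/as/token.oauth2
    method: POST
    body_format: form-urlencoded
    body:
      grant_type: client_credentials
      client_id: "{{ client_id }}"
      client_secret: "{{ client_secret }}"
    status_code: 200
  register: token_raw
  delegate_to: localhost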

Then we have to parse this response and setup our Bearer and Token components from the response JSON.
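Continuing the sketch:

- name: Set the token type and access token facts
  set_fact:
    token_type: "{{ token_raw.json.token_type }}"
    access_token: "{{ token_raw.json.access_token }}"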

Now that we have our token we can authenticate against the IOS open Vulnerability API
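(A sketch – the Genie-parsed variable name is my placeholder.)

- name: Check each host's version against the openVuln IOS API
  uri:
    url: "https://api.cisco.com/security/advisories/ios?version={{ pyats_version.version.version }}"
    method: GET
    headers:
      Authorization: "{{ token_type }} {{ access_token }}"
    status_code:
      - 200
      - 404
  register: vulnerabilities_raw
  until: vulnerabilities_raw.status == 200 or vulnerabilities_raw.status == 404
  retries: 3
  delay: 2
  delegate_to: localhost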

There are a couple of things going on here:

  1. We are passing the Genie parsed .version.version key to the API for each IOS host in our list
  2. We are using the {{ token_type }} and {{ access_token }} to Authorize
  3. We have to expect two different status codes; 200 (flaws found on a host) and 404 (no flaws for the host software version)
  4. I’ve added until and delay to slow down / throttle the API requests as not to get a 406 back because I’ve overwhelmed the 10 requests per second upper limit
  5. We register the JSON response from the API

As always I like to create a “nice” (easy to read) version of the output in a .json file

Note we need to loop over each host in our playbook (using the Ansible magic variable ansible_play_hosts) so the JSON file has a data set for each host in the playbook.

Lastly we run the template module to pass the data into Jinja2 where we will create a business-ready CSV file

Which looks like this broken apart:

Like building the JSON file first we need to loop over the hosts in the playbook.

Then we check if the errorCode is defined. You could also look at the 404 status code here. Either way if you get the error it means there are no vulnerabilities.

So add a row of data with the hostname, “N/A” for most fields, and the json.errorMessage from the API (or just hardcode “No flaws” or whatever you want here; “compliant”)

Now if the errorCode is not defined it means there are open flaws and we will get back other data we need to loop into.

I am choosing to do a nested set of two loops – one for each advisory in the list of advisories per software version. Then inside that loop another loop, adding a row of data to the spreadsheet, for each BugID found as well (which is also a list).

There are a few more lists – CVEs for example – you can either loop over these as well (but we start to get into too much repetition / rows of data) or just use regex_replace(‘,’,’ ‘) to remove all commas inside a field. The result is the list spaced out inside the cell sorted alphabetically; if you do not do this it will throw off the number of cells in your CSV

The results

What can you do with “just” a CSV file?

With Excel or simply VS Code with the Excel Preview extension – some pretty awesome things!

I can, for example, pick the ID field and filter down a particular flaw in the list – which would then provide me all the hosts affected by that bug

Or pick a host out of the list and see what flaws it has, if any. Or any of the columns we have setup in the Jinja2


Included in the report is also the SEVERITY and BASE SCORE for quick decision making or the Detailed Publication URL to get a detailed report for closer analysis.

Which looks like this:

Automation is much more than configuration management. It can be used to manage, understand, and mitigate risk as well.

Together we can secure our enterprise networks and hopefully sleep a little more sound knowing they are safe.

Cisco Facts – A Collection of Ansible IOS / NXOS Facts and Genie Parsing Playbooks

Cisco DevNet Code Exchange has published my repository !

Cisco_Facts by automateyournetwork

Ansible playbooks that use the IOS / NXOS Facts modules and Genie parsed commands to transform RAW JSON into business-ready documentation

Here you can easily start capturing Ansible Facts for IOS and NXOS and transform the JSON into CSV and Markdown !

Also included are a bunch of valuable Genie parsed show commands which transform the response into JSON then again transforms the JSON into CSV and Markdown!

The playbooks use Prompts so you should be able to clone the repo and update the hosts file and start targeting your hosts! For full enterprise support I suggest you refactor the group_vars and remove the prompts moving to full Ansible Vault – but for portability and ease of start-up I’ve made them prompted playbooks for now.

I would love to hear how they work out for you – please comment below if you have success!

Cisco Services APIs Ansible Playbooks – Version 2.0

I am very pleased to release the Cisco Services API Ansible Playbooks Version 2.0 which has been approved and released on Cisco DevNet Code Exchange !

You can find the code here and here

This major revision basically shifts away from lineinfile to Jinja2 templates for scale, performance, readability, and general best practices.

Serial 2 Info

The Cisco Serial 2 Info API receives a valid serial number and then returns structured JSON with your Cisco Contractual information !

The playbook uses the Genie parser to parse the show inventory command

After authenticating against the OAuth 2 service to get a Bearer token

It provides the API the serial number for every part per device.

The API provides the following information back:

Which we first dump into JSON and YAML files

Then template into CSV and MD

Using Jinja2

Which gives us:

Recommended Release

The other, very similar, Ansible playbook uses the Cisco Recommended Release API to create a spreadsheet with the current image on a host and the Cisco recommended version for that host given the Part ID (PID)

Here we don’t even have to use Genie to parse we can use the Ansible Facts module

And we transform again with Jinja2

And get this great report!

Please reach out to me directly if you need any help implementing these playbooks but I believe the instructions and code to be easy enough any beginner, with a little bit of refactoring and thought, could use this code as a starting point in their automation journey.

Cisco_API_v2 by automateyournetwork

Ansible playbooks that capture serial number and PID and send them to the Cisco.com APIs transforming the response into business-ready documents. Version 2.0 uses Jinja2 templates.

Collaboration is the key to automation success – A Genie Success Story

I’ve been on about a three year journey with network automation and while I have had great personal and technical success – my organization and most of those outside my immediate day-to-day circle are still shackled to the ‘old’ way of doing things (primarily the CLI).

This year I decided to start a new program – a training session for those outside of my small development team – primarily targeted at the “operations” staff who can benefit the most from automation and infrastructure as code. This includes network operators, monitoring / NOC team members, IT Security staff, other developers, compute (server / storage) teams; everybody is welcome. We divided the calendar up into different teams with different recurring timeslots.

In advance I had written and tested a bunch of Code for the Campus Core – Ansible Facts and Genie parsed show commands – transforming the output into business-ready documentation. My plan was simple enough:

1. Ensure everybody was on the same page and had the same toolkit
* VS Code
* Git
* Azure DevOps repository bookmarked
* Various VS Code extensions
* A basic overview of main vs working branches in Git
* A basic outline on Ansible, YAML, Jinja2, and JSON

2. An operator would create a working development branch – in our case the Distribution Layer – so dist_facts Git branch.
* This operator ‘drives’ the whole session sharing their screen
* Step by step, line by line, refactoring (fancy way of saying copy-pasting) working code from the Core and updating it for the Distribution Layer as necessary
* Git clone, add, commit, push, and pull performed in both VS Code and Linux CLI
* Ansible playbook executed with a --limit against one building, then at scale after validating output
* Thorough tour of the JSON, YAML, CSV, MD, and HTML files after each run

3. Work through Ansible Facts and various Genie parsed commands to build up a source of truth

So far it has been a very successful approach with the two teams adopting Marvel superhero teams (TEAM: CAPTAIN AMERICA and TEAM: IRON MAN respectively) allowing me to create memes like this:


Anyway – back to the point – today we had the following exchange:

Me: “So – we just parsed the show etherchannel summary CLI command and transformed the output into CSV files – amazing right! Any questions?”

Operator catching on quickly: “We use the show interfaces trunk command often to track down what VLANs are being carried on which interfaces – can we transform that into a CSV?”

Me, excited and proud of the Operator: “Amazing question and I’m glad you brought up a practical example of a command you use in the field all the time we can maybe transform into something a little more useful than CLI output!

Launch the Genie parser search engine (under available parsers on the left menu) and let’s see if there is a parser available

Bingo! Let’s do it!”

The playbook

In this example we are targeting the Campus Core.

The playbook is simple enough

Use the Ansible ios_command module to issue and register the show interfaces trunk command

Next, using the Genie filter plugin

Filter the raw variable and register a new variable with the Genie parsed results

Note you need to pass the parse_genie filter two arguments the command itself and the appropriate Cisco operating system
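A sketch with my own variable names:

- name: Genie parse the show interfaces trunk output
  set_fact:
    pyats_trunk: "{{ trunk_raw.stdout[0] | parse_genie(command='show interfaces trunk', os='iosxe') }}"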

Next I like to create a Nice JSON and Nice YAML file with the parsed results as follows:

Which look like this:

And this:

I then use a loop to create a CSV and MD file from a Jinja2 template
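For example (paths are my assumption):

- name: Template the results into CSV and Markdown
  template:
    src: ../templates/show_interfaces_trunk.j2
    dest: "../output/show_interfaces_trunk.{{ item }}"
  loop:
    - csv
    - md
  delegate_to: localhost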

The Jinja2 Template looks like this
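(A simplified sketch – the exact key names come from the Genie parser schema, so treat them as assumptions.)

{% if item == 'csv' %}
Interface,Mode,Encapsulation,Status,Native VLAN,Allowed VLANs
{% for interface, details in pyats_trunk.interface.items() %}
{{ interface }},{{ details.mode }},{{ details.encapsulation }},{{ details.status }},{{ details.native_vlan }},{{ details.vlans_allowed_on_trunk | regex_replace(',', ' ') }}
{% endfor %}
{% else %}
| Interface | Mode | Encapsulation | Status | Native VLAN | Allowed VLANs |
|-----------|------|---------------|--------|-------------|---------------|
{% for interface, details in pyats_trunk.interface.items() %}
| {{ interface }} | {{ details.mode }} | {{ details.encapsulation }} | {{ details.status }} | {{ details.native_vlan }} | {{ details.vlans_allowed_on_trunk }} |
{% endfor %}
{% endif %}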

Couple things to highlight:

* We check the current iteration of the Ansible loop – when it is on “csv” we template a CSV file format, otherwise (else) it is “md” and we template a markdown file
* The For Loop is over each interface in the results
* Be careful with the CSV file – you need to regex_replace the commas out of the results (the VLAN lists are comma-separated) which will otherwise throw off your CSV file. The markdown does not require any regex.

Which results in this amazing sortable, searchable, filterable, version-controlled, source-controlled, truthful, fact-based CSV file:

Now this example is just the singular logical Core but we will quickly refactor the code next week in our next session together and the operator who wanted this playbook will get a chance to write it ! Then we will have the interface trunk information for the entire campus automatically in spreadsheets!

The moral of this story is to collaborate. Ask your front-line operators how automation can help them. Do they have any frequently used or highly valuable CLI Commands they want transformed into CSV format? Would Ansible facts help them? And then show them how to do it so they can start writing these playbooks for themselves.

Sign on the Dotted Line

Layer 9 issues – finance – are often some of the most challenging a network engineer faces. Contract management can be particularly difficult in any scale organization especially if you are not “sole source” purchasing. Serial numbers and contracts are also not typically things the “network people” want to deal with but when that P1 hits and you try to open a SEV 1 TAC CASE – only to find out you are not under contract – I’ve been in less terrifying car accidents than this nightmare scenario.

I have good news ! Using a mix of automation and developer-like tools the network engineer can now create a real source of truth that, along with routes and MAC address-tables and other technical information, can include inventory and contractual business documentation from stateful, truthful, real-time facts from Cisco.

Ok so let’s get into it!

As a rough outline for our logic here is the use case:

Can I automatically gather the serial numbers from Cisco device hostnames and then provide them to Cisco and get my contractual state for each part on each device?

Answer: YES !

What you will need:

* Linux host with Ansible, Genie parser
* Linux host requires both SSH access to the Cisco host and Internet Access to the OAuth2 and Cisco.com API HTTPS URLs
* Cisco SmartNet Total Care – I have written up instructions in this repo under the “OnBoarding Process” section

The Playbook

Step 1 – We will need to get the serial number for every part for a given hostname. For this we will use the standard show inventory command for IOS using the Ansible ios_command module. I will be using prompted methods for demonstration purposes or for on-demand multi-user (each with their own accounts) runtime, but we could easily Ansible Vault these credentials for fully hands-free run time or to containerize this playbook. I am also targeting a specific host – the Core – but I could easily change this to be every IOS device in the enterprise. This playbook is called CiscoCoreSerial2InfoFacts.yml

First prompt for username, enable secret, Cisco Customer ID, Cisco Customer Secret and register these variables:

Then run the ios_command show inventory and register the results in a variable.

Step 2 – Parse the raw output from the IOS command

Next, we use Genie to parse the raw results and register a new variable with the structured JSON. Genie requires, for show inventory, the command, the operating system, and the platform (in this case a Cisco 6500)

And here is what that structured JSON looks like:

So now we have a nice list of each part and their serial number we can feed the Cisco.com API to get back our contract information.

Step 3 – Get an OAuth 2 token from Cisco web services.

Cisco.com APIs use OAuth2 for authentication meaning you cannot go directly against the API with a username and password. First you must retrieve a Bearer Token and then use that limited-time token within its lifetime against the ultimate API.

Using the Ansible URI module go get a token and register the results as a variable. Provide the Customer ID and Client secret prompts to the API for authentication. This is an HTTP POST method.

With the new raw token setup the token type and access token from the raw response

Step 4 – Provide token to the Serial2Contract Cisco API to get back contractual information for each serial number.

Now that we have a valid token we can authenticate and authorize against the Cisco SmartNet Total Care Serial Number 2 Contract Information API.

In this step we are going to use an Ansible loop to loop over the Genie parsed structured JSON from the show inventory command, providing the sn key for each item in the list. We need to use the Ansible | dict2items filter to transform the dictionary into a list we can iterate over.

The loop is written as

loop: "{{ pyats_inventory.index | dict2items }}"

And each serial number is referenced in the URL each iteration through the loop:

url: https://api.cisco.com/sn2info/v2/coverage/summary/serial_numbers/{{ item.value.sn }}
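Putting it together, the whole task looks roughly like this:

- name: Get contract coverage for each serial number
  uri:
    url: "https://api.cisco.com/sn2info/v2/coverage/summary/serial_numbers/{{ item.value.sn }}"
    method: GET
    headers:
      Authorization: "{{ token_type }} {{ access_token }}"
    status_code: 200
  loop: "{{ pyats_inventory.index | dict2items }}"
  register: Serial2Info
  delegate_to: localhost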

We register the returned structured JSON from the API as Serial2Info which looks like this:

So now I have the JSON – let’s make it a business ready artifact – a CSV file / spreadsheet and a markdown file – using Jinja2

Step 5 – Using Jinja2 lets template the structured JSON into a CSV file for the business.

Create a matching Jinja2 template called CiscoCoreSerial2InfoFacts.j2 and add a task to Ansible that uses the template module to build both a CSV file and a markdown file from the JSON.

In the Jinja2 file we need a section for CSV (if item = “csv”) and a section for markdown (else) based on their respective syntax. Then we need to loop over each of the responses.

result in Serial2Info[‘results’] is the loop used. I also add a default value using a filter | default(‘N/A’) in case the value is not defined. SFPs for example do not have all of the fields that a supervisor module has, so to be safe it’s best to build in a default value for each variable.

The final Jinja2 looks something like this:

Which results in a CSV and Markdown file with a row for every serial number and their contractual facts from the API.

Summary

Large scale inventory and contract information can easily be automated into CSV spreadsheets that the business can easily consume. Ansible, Genie, Cisco.com APIs, Jinja2 templating and a little bit of logic come together into an automation pipeline that ensures contractual compliance and inventory fidelity at scale!

Dynamic Loops in Ansible

I’ve done many great things with Ansible but occasionally I come across a logical problem that may stretch the tool past its limitations. If you have been following the site I am on a big facts discovery and automated documentation movement right now using Ansible Facts and Genie parsers.

The latest parser I am trying to convert to documentation is the show access-lists command.

So we use the Ansible ios_command module and issue the command then parse the response. This is a playbook called CiscoACLFacts.yml

I always start with creating Nice JSON and Nice YAML from the structured JSON returned by the parsed command:
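For example (output paths are mine; the variable name matches the loop shown later):

- name: Create Nice JSON file of the parsed ACLs
  copy:
    content: "{{ pyats_all_access_lists | to_nice_json }}"
    dest: ../output/CiscoACLs.json
  delegate_to: localhost

- name: Create Nice YAML file of the parsed ACLs
  copy:
    content: "{{ pyats_all_access_lists | to_nice_yaml }}"
    dest: ../output/CiscoACLs.yaml
  delegate_to: localhost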

Then I examine the Nice JSON to identify “how” I can parse the parsed data into business documentation in the form of CSV, markdown, and HTML files.

And I have my CLI command converted into amazing structured human or machine readable JSON:

So here is where the logical challenge comes into play with Ansible.

As you know an Access Control List (ACL) is a list (hint: it’s right in the name) of Access Control Entries (ACEs) which are also a list. Both the ACL and the ACE are variable – they could be almost anything – and I need 2 loops: an outer loop to iterate over the ACLs (shown above is the “1”) and an inner loop to iterate over the ACEs (shown above as the “10”).

So I brush up on my Ansible loops. Looks like I need with_nested.

Problem: Ansible does not support dynamic loops as you would expect. Here is what I “wanted to do” and tried for a while before I figured out it wasn’t supported:

with_nested:
  - "{{ pyats_all_access_lists | dict2items }}"
  - "{{ item[0].value.aces }}"

No go. “Item not found” errors.

Here is the work around / solution:

Before I get into the loops a couple things to point out to anyone new to infrastructure to code or JSON specifically. The Genie parsed return data is not a list by default. Meaning it cannot be iterated over with a loop. We have to filter this from a dictionary – as indicated in JSON by the { } delimiters – into a list (which would be indicated by [ ] delimiters in the JSON if it was a list we could iterate over) – before we can loop over it.

| dict2items is this filter.

The loops:

You can define the outer loop key using loop_var as part of loop_control along with include to build a dynamic outer / inner loop connection.

In order to create my CSV file:

1 – Delete the file outside the loops / last Ansible task before entering the loops

2 – * Important new step here *

We need to perform the outer loop, register the key for this outer loop, and then include a separate YAML file that contains the inner loop task
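Here is a sketch of that outer loop task:

- name: Outer loop over each ACL, handing off to the inner loop file
  include_tasks: CiscoACLInnerLoop.yml
  loop: "{{ pyats_all_access_lists | dict2items }}"
  loop_control:
    loop_var: ACL_list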

3 – * Another important new step here *

Create the referenced file CiscoACLInnerLoop.yml with the inner loop task, in this case, the task to add the rows of data to the CSV file
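A sketch of that inner loop file (columns trimmed for brevity):

# CiscoACLInnerLoop.yml
- name: Add a CSV row for each ACE in this ACL
  lineinfile:
    path: ../output/CiscoACLs.csv
    line: "{{ ACL_list.key }},{{ item.key }}"
    create: yes
    insertafter: EOF
  loop: "{{ ACL_list.value.aces | dict2items }}"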

Things to identify in the above task:

The loop – it is using the outer loop (the loop_var) ACL_list as the primary key then we turn the .value.aces dictionary into another list with | dict2items giving us the inner list we can iterate over.

Important – the inner loop is what Ansible will reference from this point forward, meaning item now references the inner items. In order to reference the outer key you need to reference the loop_var again, as seen on the line: “{{ ACL_list.key }},{{ item.key }}”

This gives us the ACL then the individual ACE per row in the CSV file! Mixing the outer and inner loops!

Recommendation – you will notice the start of an {% if %} {% else %} {% endif %} statement – because almost everything in an ACL and ACE list is variable you should test if each item.value.X is defined first, use the value if it’s defined, otherwise use a hard coded value. As such:

{% if item.value.logging is defined %}

{{ item.value.logging }}

{% else %}

No Logging

{% endif %}

Next, back in the main playbook file, outside the loop, we finally add our CSV header row:

For the sake of keeping this short, there are likely some regular expression replacements we need to make to clean up any stray JSON or to remove unnecessary characters / strings left behind, but in essence we have the show access-lists command rendered into CSV:

Operations, management, compliance and standard, and most of all IT SECURITY is going to love this! All of this is in a central Git repository so all of these artifacts are Git-tracked / enabled. All of the CSV files are searchable, sortable, filterable, and EXCEL ready for the business!

Summary

Before you give up on any problem make sure you find and read the documentation!

I have to revisit some previous use cases and problems now with fresh eyes and new capabilities because I had given up on transforming some previous dictionaries within dictionaries because I didn’t know what I was doing!

One step closer! I hope this article helped show you how dynamic Ansible looping is done and you don’t have to fail and struggle with the concept like I did. I am out there on Twitter if you have any questions!

Excitement

And he’s right! I can’t stop making new GitHub repositories turning Genie parsed show commands into documentation! Like show ip interface brief as seen above!