My Network Talks To Me – Literally!

This next post may seem like science fiction. I woke up this morning and checked that my output files – MP3 files – really did exist and that I actually made my Cisco network “talk” to me!

This post is right out of Star Trek so strap yourself in!

Can we make the network talk to us?

After my success with my #chatbot, my brain decided to keep going further and further until I found myself thinking about how I could actually make this real. Like most problems, let's break it down and describe it. How much of this can I already achieve, and what tools do I need to get the rest of the solution in place?

I know I can get network data back in the form of JSON (text) – so in theory all I need is a text-to-speech conversion tool.

Enter: Google Cloud !

That's right – Google Cloud offers exactly what I am looking for: a RESTful API that accepts text and returns "speech"! With over 200 voices across dozens of languages, in both male and female varieties, I could even get this speech in Canadian French, spoken in a dozen or so different voices!

I am getting ahead of myself but that is the vision:

  1. Go get data, automatically, from the network (easy)
  2. Convert to JSON (also easy)
  3. Feed the JSON text to the Google Cloud API (in theory, also easy)

The process – Google Cloud setup

There is some Google Cloud overhead involved here, and the service is "free" – for up to 1 million processed characters, or 3 months, whichever comes first. It also looks like you get $300 in Google bucks and only 3 months of free access to Google Cloud.

Credit card warning: You need a credit card to sign up for this. They assured me, multiple times, that this does not automatically roll over to a paid subscription after the trial expires – you have to actually engage, click, and accept a full registration. So I hope this turns out to be free for 3 months and no actual charges show up on my credit card. But in the name of science fiction I press on.

So go set up a Google Cloud account, then your first project, and eventually you will land on a page that looks like this.

Enable an API and search for text

Enable this API and investigate the documentation and examples if you like.

Now, Google Cloud APIs are very secure – to the point of confusion. I have not fully ironed out the whole automation pipeline yet – mainly because of how complex their OAuth2 requests seem to be – but for now I have a workaround I will show you to at least achieve the theoretical goal. We can circle back and mess with the authentication later. (Agile)

Set up OAuth2 credentials (or a Service Account if you want to use a JSON file they provide you)

Make sure it is a Web Application

This will provide your client ID and secret.

For most OAuth2 flows that's all you need – maybe I am missing the correct OAuth2 token URL to request a token – but for now there is another tool you can use to get a token.

Google has an OAuth2 Developer Playground you can use to get a token while we figure out the OAuth stuff in the background.

Follow the steps to select the API you want to develop (Cloud Text-To-Speech)

Then in the next step you can request and receive a development token

You can also refresh this token / request a new token. So copy that Access Token – we will need it for Postman / Ansible in the next steps.

Moving to Postman

Normally under my collection I would set up my OAuth – here is a screenshot of the settings I'm just not sure of. This is the missing link to full automation.

So far so good here

Again, this might be something trivial, and I am 99% sure it's because I have not set up this redirection or linked the two things, but it was getting late and I wanted to get this working and not get stuck in the OAuth2 weeds.

First here is what I think I have right:

Token name: Google API
Auth URL: https://accounts.google.com/o/oauth2/auth
Access Token URL: https://accounts.google.com/o/oauth2/token
Client ID:
Client Secret:
Scope: https://www.googleapis.com/auth/cloud-platform

But the one I'm not really sure what to do with is this Callback URL – I don't have one of those!

Callback URL: This is my problem – I am really not sure what I need to do here.

I believe I need to add it here:

But I have no breadcrumbs to follow – all of the ? help icons just say something like "This is where your Authorized redirect URLs go" and that's it.

Open call to Google Cloud development – I just need this last little step, and then the rest of this can be automated.

Anyway, moving along – pretending we have this OAuth2 working – we can cheat for now using the OAuth2 Playground method.

So here is the Postman request:

We want a new POST request in the Google Cloud API Postman Collection. The URL is:

https://texttospeech.googleapis.com/v1/text:synthesize

So cheat (for now) and grab the token from the OAuth2 Playground and then in your Postman Request Authorization tab – select Bearer Token and paste in your token. From my experience this will likely start with ya29 (so you know you have the right data here to paste in)

Tab over to Headers and double-check you have Bearer ya29.(your key here)

As far as the body goes – again we want RAW JSON

Now for the science fiction – in the body, in JSON, we set up what text we want converted to what speech

The canned example they provide

So we need the input (text) which is the text string we want converted.

We then select our voice – including the languageCode and name (and there are over 200 to choose from) – and the gender of the voice we want.

Lastly we want the audioConfig including the audioEncoding – and since MP3 was life for me in the mid to late 90s – let’s go with MP3 !
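For reference, the documented request body looks like this (the voice name is illustrative – any voice from the reference guide works):

{
  "input": {
    "text": "Hello network engineers!"
  },
  "voice": {
    "languageCode": "en-US",
    "name": "en-US-Wavenet-A",
    "ssmlGender": "MALE"
  },
  "audioConfig": {
    "audioEncoding": "MP3"
  }
}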

Hit Send and fire away!

Good start – a 200 OK with 52KB of data returned. It is returned as a JSON string:

Incredible stuff – this is the human voice pattern saying the text string – expressed as a base64-encoded audio string !

Curious, I found the Base64 Guru !

Ok – very cool stuff – but what am I supposed to do with it?

Fortunately Google Cloud has the insight we need

Hey – it’s exactly where we are at ! We have the audioContent and now we need to decode it!

Since I am still developing in my Windows world with Postman let’s decode our canned example

Carefully right-click copy (don't actually click the link – Postman will spawn a GET tab against the URL, thinking you are trying to follow it)

Now create a standard text (.txt) file in a C:\temp\decode folder and paste in the buffered response

I've highlighted the "audioContent": " prefix – you have to strip this out / delete it, as well as the trailing quotation mark at the end of the string – we just want the base64 data itself, starting with // and running to the end of the string

Launch cmd, change to the C:\temp\decode folder, and run the command

certutil -decode sample.txt sample.mp3

As you can see, if your text file was valid you should get a "completed successfully" response from the certutil utility. If not – check your string again for leading and trailing characters.

Otherwise – launch the file and have a listen!

How cool is that?!?!?

Enter: Network Automation

As I've said before, anything I can make work with Postman I can automate with the Ansible URI module! But instead of some static text – I plan on getting network information back and having my network talk to me!

The playbook:

First we will prompt for credentials to authenticate

Now – let’s start with something relatively simple – can I “ask” the Core what IOS version it’s running? Sure – let’s go get the Ansible Facts, of which the IOS version is one of them, and pass the results along to the API !

For now we will hard-code our token – once I figure this out I will just have another Ansible URI step at the start of the playbook to go get my token with a prompted Client ID / Client Secret, along with the Cisco credentials. Again, a temporary workaround.

Again because I have used body_format: json I can write the body in YAML.

Let’s mix up the voice a little bit too so hit the Voices Reference Guide

Ok so for our body let’s have some fun and try an English (US) WaveNet-A Male.

For the actual text let's mix a static string – "John the Lab Core is running version" – with the magic Ansible Facts variable {{ ansible_facts['net_version'] }}

And see if this works

We need to register the response from the Google Cloud API and delegate the task to the localhost:
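A minimal sketch of that task – the token variable (google_token) is a placeholder for however you store the Playground token:

- name: Send the version text to the Google Cloud Text-To-Speech API
  ansible.builtin.uri:
    url: https://texttospeech.googleapis.com/v1/text:synthesize
    method: POST
    headers:
      Authorization: "Bearer {{ google_token }}"
    body_format: json
    # because body_format is json the body can be written in YAML
    body:
      input:
        text: "John the Lab Core is running version {{ ansible_facts['net_version'] }}"
      voice:
        languageCode: en-US
        name: en-US-Wavenet-A
        ssmlGender: MALE
      audioConfig:
        audioEncoding: MP3
  register: TTSResponse
  delegate_to: localhost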

So now we need to parse and get just the base64-audio string into a text file. Just like in Postman this is contained in the json.audioContent key:

Now we have to decode the file! But this time with a Linux utility not a Windows utility

We can call the shell from Ansible easily to do this task:
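Something along these lines – the file paths are illustrative:

- name: Save the base64 audio string to a text file
  ansible.builtin.copy:
    content: "{{ TTSResponse.json.audioContent }}"
    dest: ../outputs/core_version.txt
  delegate_to: localhost

- name: Decode the base64 text file into an MP3
  ansible.builtin.shell: base64 --decode ../outputs/core_version.txt > ../outputs/core_version.mp3
  delegate_to: localhost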

Now in theory this should all work and I should get a text file and an MP3 file. Let’s run the playbook!

Let's check if Git picked up a new file!

Ok ! What does it sound like!?!

Ok this is incredible!

Let’s try some Genie / pyATS parsing and some different languages !

Copy and paste and rename the previous playbook and call the new file GoogleCloudTextToSpeech_Sh_Int_Status.yml

Replace the ios_facts task with the following tasks

So now for our actual API call we want to conditionally loop over each interface if it is DOWN / DOWN (meaning not UP / UP and not administratively DOWN)

Now as an experiment – let’s use French – Canadian in a Female Wavenet voice.

Does this also translate the English text to French? Or do I need to write the text en français? Let's try it!

So this whole task now looks something like this:
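A minimal sketch of such a task – the parsed variable name and the Genie show interfaces status keys are assumptions, and the voice is one of the fr-CA WaveNet voices:

- name: Announce DOWN / DOWN interfaces in Canadian French
  ansible.builtin.uri:
    url: https://texttospeech.googleapis.com/v1/text:synthesize
    method: POST
    headers:
      Authorization: "Bearer {{ google_token }}"
    body_format: json
    body:
      input:
        text: "John the interface {{ item.key }} is down on the Lab Core"
      voice:
        languageCode: fr-CA
        name: fr-CA-Wavenet-A
        ssmlGender: FEMALE
      audioConfig:
        audioEncoding: MP3
  register: TTSInterfaceResults
  delegate_to: localhost
  loop: "{{ pyats_int_status.interfaces | dict2items }}"
  when: item.value.status == 'notconnect'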

Now we need another loop and another condition to register the text from the results. We loop over the results of the first loop and when there is audioContent send that content to the text file.

Caution! RegEx ahead! Don't be alarmed! Because of the "slashes" in an interface name (Gigabit10/0/5) the file path will get messed up, so let's regex them to underscores
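Putting the loop, the condition, and the regex together, a sketch of that file-writing task (variable names carried over from the sketch above):

- name: Write the audioContent for each DOWN interface to a text file
  ansible.builtin.copy:
    content: "{{ item.json.audioContent }}"
    # replace the slashes in the interface name so the file path is valid
    dest: "../outputs/{{ item.item.key | regex_replace('/', '_') }}.txt"
  delegate_to: localhost
  loop: "{{ TTSInterfaceResults.results }}"
  when: item.json is defined and item.json.audioContent is defined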

Then we need to decode the text files!

So let’s run the playbook!

So far so good – now our conditionals should kick in – we should see some items skipped in light blue text then our match hits in green

Similarly, our next step should also have skipped items and then yellow text indicating the audioContent has been captured

Which is finally converted to audio

What does it sound like!??! Did it automatically translate the text as well as convert it to speech?

TenGig

Gig

A little more fun with languages

I won’t post all the code but I’ve had a lot of fun with this !

How about the total number of port-channels on the Core – in Japanese ?!

Summary

In my opinion this Google Cloud API and network automation integration could change everything! Imagine if you will:

  • Global, multilingual teams
  • Elimination of technical text in favour of human-constructed phrasing, context, and simplicity
  • Audio files in your source of truth
  • Integrated with #chatops and #chatbots
  • Accessibility for the visually impaired or those otherwise physically challenged by text-based operations
  • A talking network!

This was a lot of fun and I hope you found it interesting! I would love to hear your feedback!

Is Your Network At Risk – Automating the Cisco PSIRT with Genie and Ansible

The security of my network keeps me up at night. Honestly it does. We live in a world where enterprise networks are defending themselves against state-sponsored attacks. But what do attackers look for? Typically, open, well-known, vulnerabilities.

Well, at least to the hackers, attackers, and even script-kiddies, these vulnerabilities are well-known. Those playing defense are often a step or two behind just identifying vulnerabilities – and often at the mercy of patch-management cycles or operational constraints that prevent them from addressing (patching) these waiting-to-be-exploited holes in the network.

What can we do about it?

The first thing we can do is to stay informed ! But this alone can be a difficult task with a fleet of various platforms running various software versions at scale. How many flavours of Cisco IOS, IOS-XE, and NXOS platforms make up your enterprise? What version are they running? And most importantly is that version compromised?

The old way might be to get e-mail notifications (hurray more e-mail!), maybe RSS-feeds, or go device-by-device, webpage-by-webpage, looking up the version and if it’s open to attack or not.

Do you see now how an enterprise becomes vulnerable? It's tedious and time-intensive work. And the moment you are done – the data is stale. What, are you going to wake up every day and review threats like this manually? Assign staff to do this? Just accept the risk and do the best you can trying to patch quarterly?

Enter: Automation

These types of tasks beg to be solved with automation !

So how can you do it?

Let's just lay out a high-level, human-language use-case / wish list.

  1. Can we, at scale, go get the current IOS / IOS-XE / NXOS software version from a device?
  2. Then can we send that particular version somewhere to find out if it has been compromised ?
  3. Can we generate a report from the data above?

Technically the above is all feasible; easy even!

  1. Yes, we can use Ansible and the Cisco Genie Parser to capture the current software version
  2. The Cisco Product Security Incident Response Team has incredible, secure, REST APIs available that we can automate with the Ansible URI module
  3. Using Jinja2 templates we can craft business-ready reports

Getting Started

First you need to set up your Cisco.com REST API suite access

Pre-Ansible Development

As with any new REST API I like to start with Postman and then transform working requests into Ansible playbooks.

First, under a new or existing Cisco.com Collection, add a new request called IOS Vulnerabilities

Cisco.com uses an OAuth2 authentication mechanism: you first authenticate against a token API (https://cloudsso.cisco.com/as/token.oauth2), which provides back an authorization Bearer token used to then authenticate and authorize against the subsequent Cisco.com APIs

Your Client ID and Secret are found in the API portal after you register the OpenVuln API

Request and use a token from the API in Postman:

Now add your request for IOS

https://api.cisco.com/security/advisories/ios?version=

Test it out!

Let’s hard code a version and see what flaws it has

Ok so it has at least 1 open vulnerability!

Does it tell us what version fixes it?

Ok let’s check that version quickly while we are still in Postman

This version has no disclosed vulnerabilities!

One interesting thing of note – and our automation is going to need to handle it – is that if there are no flaws found we get a 404 back from the API, not a 200 like our flawed response!

The Playbook

For the sake of the example I am using prompted inputs / response but these variables could easily be hardcoded and Ansible Vaulted.

So first prompt for your Cisco host username and password, and your Cisco.com Client ID and Client Secret

Register the response

Then in the IOS.yml group_vars I have my Ansible network connections

I’ve put all IOS-platforms in this group in the hosts file to target them

Next step: run the show version ios_command and register the response

Genie parse and register the JSON

Now we need, just like in Postman, to go get the OAuth2 token but instead of Postman we need to use the Ansible URI module

Then we have to parse this response and setup our Bearer and Token components from the response JSON.
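A sketch of those two tasks, assuming prompted client_id / client_secret variables:

- name: Get OAuth2 token from Cisco.com
  ansible.builtin.uri:
    url: https://cloudsso.cisco.com/as/token.oauth2
    method: POST
    body_format: form-urlencoded
    body:
      grant_type: client_credentials
      client_id: "{{ client_id }}"
      client_secret: "{{ client_secret }}"
  register: token
  delegate_to: localhost

- name: Set the token type and access token facts
  ansible.builtin.set_fact:
    token_type: "{{ token.json.token_type }}"
    access_token: "{{ token.json.access_token }}"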

Now that we have our token we can authenticate against the IOS open Vulnerability API

There are a couple of things going on here (a sketch of the full task follows the list):

  1. We are passing the Genie parsed .version.version key to the API for each IOS host in our list
  2. We are using the {{ token_type }} and {{ access_token }} to Authorize
  3. We have to expect two different status codes; 200 (flaws found on a host) and 404 (no flaws for the host software version)
  4. I’ve added until and delay to slow down / throttle the API requests as not to get a 406 back because I’ve overwhelmed the 10 requests per second upper limit
  5. We register the JSON response from the API
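A minimal sketch of that task – version_info stands in for whatever variable holds the Genie-parsed show version output:

- name: Check the running version against the openVuln API
  ansible.builtin.uri:
    url: "https://api.cisco.com/security/advisories/ios?version={{ version_info.version.version }}"
    method: GET
    headers:
      Authorization: "{{ token_type }} {{ access_token }}"
      Accept: application/json
    status_code: [200, 404]
  register: vulnerabilities
  delegate_to: localhost
  until: vulnerabilities.status == 200 or vulnerabilities.status == 404
  retries: 3
  delay: 1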

As always I like to create a “nice” (easy to read) version of the output in a .json file

Note we need to Loop over each host in our playbook (using the Ansible magic variable ansible_play_hosts) so the JSON file has a data set for each host in the playbook.

Lastly we run the template module to pass the data into Jinja2 where we will create a business-ready CSV file

Which looks like this broken apart:

Like building the JSON file first we need to loop over the hosts in the playbook.

Then we check if the errorCode is defined. You could also look at the 404 status code here. Either way if you get the error it means there are no vulnerabilities.

So add a row of data with the hostname, “N/A” for most fields, and the json.errorMessage from the API (or just hardcode “No flaws” or whatever you want here; “compliant”)

Now if the errorCode is not defined it means there are open flaws and we will get back other data we need to loop into.

I am choosing to do a nested set of two loops – one for each advisory in the list of advisories per software version. Then inside that loop another loop, adding a row of data to the spreadsheet, for each BugID found as well (which is also a list).

There are a few more lists – CVEs for example. You can either loop over these as well (but we start to get into too much repetition / rows of data) – or just use regex_replace(',',' ') to remove all commas inside a field. The result is the list spaced out inside the cell, sorted alphabetically; if you do not do this it will throw off the number of cells in your CSV
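Pulling those pieces together, a minimal sketch of the CSV half of such a template – the advisory keys (advisoryId, sir, cvssBaseScore, bugIDs) are assumptions based on the openVuln API documentation, and vulnerabilities matches the registered variable from the sketch above:

{% for host in ansible_play_hosts %}
{% if hostvars[host].vulnerabilities.json.errorCode is defined %}
{{ host }},N/A,N/A,N/A,{{ hostvars[host].vulnerabilities.json.errorMessage }}
{% else %}
{% for advisory in hostvars[host].vulnerabilities.json.advisories %}
{% for bug in advisory.bugIDs %}
{{ host }},{{ advisory.advisoryId }},{{ advisory.sir }},{{ advisory.cvssBaseScore }},{{ bug }}
{% endfor %}
{% endfor %}
{% endif %}
{% endfor %}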

The results

What can you do with “just” a CSV file?

With Excel or simply VS Code with the Excel Preview extension – some pretty awesome things!

I can, for example, pick the ID field and filter down a particular flaw in the list – which would then provide me all the hosts affected by that bug

Or pick a host out of the list and see what flaws it has, if any. Or any of the columns we have setup in the Jinja2

Included in the report is also the SEVERITY and BASE SCORE for quick decision making or the Detailed Publication URL to get a detailed report for closer analysis.

Which looks like this:

Automation is much more than configuration management. It can be used to manage, understand, and mitigate risk as well.

Together we can secure our enterprise networks and hopefully sleep a little more sound knowing they are safe.

By Popular Demand – Automating the Cisco Identity Services Engine (ISE) External RESTful Services (ERS) API Suite

My last post, which has turned into one of my more popular posts, covered automating the Cisco ISE MnT Nodes REST API with Postman and Ansible. We also discovered there is an XML limitation with the MnT REST API.

I’ve been asked by several people if a similar approach can be used with the Cisco ISE ERS. Not only can it be done – because ERS returns XML or JSON – it can be done much easier without the additional XML-to-JSON conversion step.

Cisco ISE ERS

Unlike the MnT, which has a very small subset (3 Session APIs for example) of available APIs, ERS has a lot of available APIs to work with!

First you have to enable ERS

You need a User with ERS permissions

To reach the ERS Software Developer Kit (SDK), visit:

https://ise.my.domain.com:9060/ers/sdk

Now this, which can be a bit misleading, will launch the Quick Reference tab

In the bottom left – there are other panels of information – namely the API References

Now go API-shopping!

This is a pretty incredible list of APIs available to us, each with a dedicated documentation guide, and a rich set of XML or JSON data sets for a single record or all records.

REST API Development

Let's start with the Authorization Profile API – partially to complement the MnT Session information API documentation we have – and enumerate our AuthZ profiles.

First let’s explore the documentation!

Side note – I am usually a do-it-first then read-the-docs-later type of developer but in this case I need the particular URL to actually get to the API.

It’s also good practice !

Various ways to interact with the API are listed – for this example we are interested in the Get-All API

Each API will provide the resource definition – which helps you understand the returned data structure and whether it is valuable to capture

For example, the vlan nameID here returns the name of the VLAN into which each of my AuthZ policies places authorized clients. This is very valuable to me.

Then check out the actual XML and JSON sample responses so you can see what to expect back in Postman and eventually your JSON files

And then, and this is what we are after, the JSON

Now the key here is the Get-All – we still need to go find the request details, which look like this:

So we have the method (GET) , the URI (https://ise.my.domain.com:9060/ers/config/authorizationprofile), and the required (Content-Type and Accept) headers.

API Tip: Before we get started I want to highlight that the ERS API returns 20 results by default. This might not seem important but what if you have 27 records? 95 records? 1,200 records?

The API uses pagination and returns groups of results (20 by default) per page of responses. This page size can be adjusted, up to a maximum of 100 records per response.

This is done by adding ?size=(1 – 100)

Some of my datasets are less than 100 records so I add ?size=100 to my string to get, effectively, all records back.

We will cycle back to how to handle paginated responses later in this post. For now let's just use some datasets that come back in a single response to ease into this concept.

Take it to Postman

Next step we build our Cisco ISE Postman Collection (or in my case add to my existing collection) and setup our request

The results are interesting – I have found 27 AuthZ Profiles and I get the basic internal ISE ID, name, and description of each.

Part of the pagination tip includes how ISE handles Get-All responses.

  1. Get-All
  2. List Total
  3. List Each Resource summary
  4. Include the Resource’s detailed href

The href attribute specifies the URL of the page the link goes to.

So we follow this link for a specific AuthZ profile and see what the individual resource details look like:

Ok here we go!

Move it into Ansible URI

Create a new YAML file – the Ansible playbook – and convert the working Postman string, with authentication and headers, and register the JSON response from the Get-All Authorization Profile with up to 100 records.

Now we have the first Postman response back – the list of AuthZ Profiles – registered in AuthorizationProfiles_List, which we can now loop over to go get each profile's href and get that JSON. Follow the trail.

So here our loop simply loops over the natural JSON list we get back. Recall that resources is a JSON list

So the actual URL we are visiting on each loop is the item.link.href !
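A sketch of that looped task – ers_user / ers_password are placeholders for your ERS credentials:

- name: Get each Authorization Profile's details
  ansible.builtin.uri:
    url: "{{ item.link.href }}"
    method: GET
    user: "{{ ers_user }}"
    password: "{{ ers_password }}"
    force_basic_auth: yes
    validate_certs: no
    headers:
      Accept: application/json
  register: AuthorizationProfiles_Details
  delegate_to: localhost
  loop: "{{ AuthorizationProfiles_List.json.SearchResult.resources }}"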

Again I like to have this JSON in a file using the copy module and the | to_nice_json filter.

Which looks just like the Postman body:

Which we then move into a Jinja2 template

Which is easy enough

We loop over each result in AuthorizationProfiles_Details.results which again is the natural JSON list, so each key is simply {{ result.json.AuthorizationProfile.xx }}; .name for example.

Which results in the business-ready CSV !

Refactoring working code – rinse and repeatability

One amazing thing about the ERS API suite is how uniform and standard they all seem to be.

Meaning you can copy the playbook above and make the following adjustments to reflect the API you are developing against

  1. Rename the playbook
  2. Update all comments and task names:
  3. Update the URL
  4. Update all registered variable names
  5. Update all output file names

In fact with a few Find / Replace operations you can copy and transform the first API playbook to any other API playbook.

As an example my CiscoISEAuthenticationProfiles.yml file and my CiscoISEdACLList.yml playbooks are both exactly 55 lines of code each! They are exactly the same minus the unique identifiers linking back to the targeted API.

Handling Pagination

So far I’ve avoided the paginated responses but let’s add pagination handling into the playbook.

Here is an example, my Network Device API

https://ise.my.domain.com:9060/ers/config/networkdevice?size=100

Returns over 1,200 items, with 100 items per page.

So how do we deal with this?

So we should have what we need to convert this to Ansible and automatically handle the pagination using some Ansible math filters.

So inside Page_count we have .json.SearchResult.total (1215) which we can use to establish the number of pages we need to loop over.

And speaking of loops Ansible has a with_sequence looping mechanism that almost seems designed to work with paginated URI responses.

Meaning we can set the start of our sequence (1) and the end of our sequence (the total number of responses, divided by the number of responses per page (100), rounded up to the next whole number).

So here we go

Pay attention to the page={{ item }} which is each sequence number in the with_sequence loop.

We loop starting at 1 and end at the Total pages divided by 100 (make sure to parenthesize this math) then we use the | round filter specifying to round up (ceil) to the largest next whole number (0) then finally set that value as an integer so the sequence can treat it as a number.
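A sketch of the paginated task, assuming the Get-All total was registered in Page_count:

- name: Get each page of Network Devices
  ansible.builtin.uri:
    url: "https://ise.my.domain.com:9060/ers/config/networkdevice?size=100&page={{ item }}"
    method: GET
    user: "{{ ers_user }}"
    password: "{{ ers_password }}"
    force_basic_auth: yes
    validate_certs: no
    headers:
      Accept: application/json
  register: NetworkDevices
  delegate_to: localhost
  with_sequence: "start=1 end={{ (Page_count.json.SearchResult.total / 100) | round(0, 'ceil') | int }}"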

This is pretty neat stuff – here is the task in Ansible; in my case I was expecting 13 pages

Now I need a json_query to loop over the results (outer) and results (inner)

So let's set a fact to do just that
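A sketch, assuming the paginated loop above was registered as NetworkDevices – the trailing [] in the JMESPath flattens the inner resources lists into one list:

- name: Flatten the paginated results into a single list of resources
  ansible.builtin.set_fact:
    resource_list: "{{ NetworkDevices.results | json_query('[].json.SearchResult.resources[]') }}"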

Now we should be able to iterate over this list of URIs

This seems to be working

I’m taking a coffee break while this runs – back in a few minutes!

Compiling

If you know me by now – my next step is always to put this output into a .json file

Which looks like this:

Which I then pass to the Jinja2 template

Which looks like this:

Which then looks like this!

Which I have filtered down to a sample – there are all 1,200+ records in this file!

So now I can handle any paginated responses – with only one or two extra steps!

I hope this series on the Cisco ISE REST APIs – both the ERS and MnT – has been valuable to you! As you can see, regardless of XML or JSON, paginated or not, we can easily use Ansible and a few other tools to automate the Cisco ISE REST APIs!

Automating the Cisco Identity Services Engine Monitoring and Troubleshooting Node (ISE MnT) REST API

In my opinion Network Access Control (NAC) using a mix of 802.1x and MAC Address Bypass (MAB) with dynamic Access Control Lists (dACL) is not just an extremely important foundation of a modern, highly secure enterprise network. It is actually a Software Defined Network (SDN) that completely changes operations and dynamically adjusts the very configuration of your devices.

Cisco Identity Services Engine (ISE) uses Policy Sets to achieve this, providing policy for Authentication – validating the identity a device claims to be – and Authorization – what, if anything, that identified device is permitted access to. It uses either certificates and the 802.1x protocol or, if certificates are unfeasible, a MAC-based permission that pushes a dynamic ACL to the switch, limiting the access that MAC address has on the network.

I've implemented this at scale – and now I am trying to use automation tools to help operate, monitor, troubleshoot, and generally understand the impact of my ISE policy sets at the Access layer. And while ISE provides some amazing GUI-based tools – the Live Logs, Live Sessions, Policy Sets, and Context Visibility – if I can avoid a GUI-driven system of sorting, filtering, clicking, and menus, I would like to just get right to the data!

Enter: Monitoring and Troubleshooting Nodes

In your ISE deployment options you can deploy Monitoring nodes / personas

These personas come with REST APIs !

But these are not to be confused with the ISE External REST Service (ERS) APIs!

(You can easily tell the difference in the API URL by noting the presence of the /mnt or the /ers path respectively)

High-Level Goals

In an Ansible Playbook

  1. Get a Total Session count using the ActiveCount REST API
  2. Get an Active Session details list using the ActiveList REST API
  3. Per-Access Layer send all MAC Addresses with an Authentication Session to the Session/MAC REST API
  4. Transform all returned data into business-ready CSV files

The Tools

A lot of different automation and infrastructure as code tools were used to create this solution including:

  • Cisco ISE MnT Node – REST API Source
  • VS Code – used to write the code
  • VS Code Extensions – used to help write the code
  • Git – used to version and source control the code
  • Azure DevOps Git Repository – used to host the Git repository
  • Postman – used to investigate and develop against the REST API
  • Ansible – automation and orchestration engine driving the solution
  • Ansible Vault – used to encrypt API credentials
  • Ansible Module: URI – module used to interface with REST API
  • Ansible Module: XML – module used to work with XML data
  • Ansible Module: Copy – move data from a variable into a file
  • Ansible Module: Template – call a Jinja2 template as a source to template another filetype as the destination output file
  • Ansible Module: ios_command – run a Cisco IOS CLI command
  • Ansible Module: set_fact – create your own variable
  • Cisco pyATS Genie Parser – transform Cisco IOS command output into structured JSON
  • Filter Plugin: parse_genie – a custom Python file used with Genie to parse CLI output
  • Ansible Utilities: CLI_Parse – parse “CLI” commands
  • CLI_Parser: XML – specify XML as the filetype to feed the CLI_Parser
  • Ansible Filter: to_nice_json – create human readable “pretty” JSON files
  • Ansible Filter: dict2items – transform a dictionary to structure items (list)
  • Ansible Filter: json_query – Use SQL-like structured Queries against JSON
  • Ansible Filter: flatten – flatten a nested list
  • Ansible Filter: hwaddr – manipulate MAC address formats
  • Ansible Filter: upper – change a string to UPPER CASE
  • Jinja2 Templates – Use Pythonic code to structure a template of another filetype using the JSON data as a source
  • CSV – The ultimate business-ready output file format

Wait – did you say XML ?

Yes. For some reason ISE MnT Nodes do not return structured JSON at all and you can only receive XML format.

https://twitter.com/densem0de/status/1359688237464367108

Unacceptable indeed!

Raiding the Lost REST API

Now that we've identified this limitation we have another step to consider in our automation orchestration – that is, to parse and transform the XML to JSON so we can work with it in Jinja2

Use Case #1 – Active Sessions

ISE MnT API requires some specific permissions under a user account before you get going:

As with any API development I like to start with Postman.

Setup a Postman Collection called Cisco ISE

Add the username and password under Authentication – Basic Auth

Your GET string to get to the Active Sessions API is as follows:

https://ise.domain.com/admin/API/mnt/Session/ActiveCount

With very simple headers:

Resulting in the following data set:

Now we have the components we need – the working credentials, URL string, headers, and expected output – we can migrate the code over to Ansible using the URI module.

First we need to set up some variables we can use to connect to the ISE MnT API. I use a file called Enterprise.yml for this in group_vars

Then I can proceed with my playbook called CiscoISEActiveSessionTotals.yml

The first task is to use the URI module and register the results from the ActiveCount MnT API into a variable ActiveCount_XML

Next, because it is only a single value, we can simply use the XML module to parse out that field as follows and register the new variable FilteredCount
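A sketch of that parse, assuming the URI task captured the response body with return_content – depending on your Ansible version the module is xml or community.general.xml:

- name: Parse the count value out of the ActiveCount XML
  community.general.xml:
    xmlstring: "{{ ActiveCount_XML.content }}"
    xpath: //count
    content: text
  register: FilteredCount
  delegate_to: localhost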

I like to always capture a .json file of the data as a RAW artifact of the unmanipulated data:

Which looks just like the Postman output:

It should be noted that the above file is the contents of FilteredCount exactly.

Now I can pass this along to the Jinja2 template:

Which looks like this:

And results in this:

Very nice!

Using this as a foundation can we transform the more advanced APIs that return more than a single XML value?

Let’s find out!

Use Case #2 – Active Session Details

Adding the next API as a Request to the Postman Cisco ISE collection

Which returns a data set like this:

We will need another way, beyond the XML module and xpath, to parse this output. Ideally we could find a pre-made, ready-to-go XML-to-JSON conversion utility.

I tried, and failed, several different parsing tools including the Ansible recommended XML filter parse_xml with a spec file – but I just couldn’t figure this out.

Fortunately Ganesh Nalawade, Principal Engineer at Ansible, reached out to me with a great utility

So first we go out to the ActiveList REST API and register the ActiveList_XML

Now, after installing the Utilities from Ansible Galaxy, we simply feed the text ActiveList_XML.content – into the XML parser – and register the parsed data into a new variable ParsedActiveList
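A sketch of that task – depending on your version the cli_parse module and xml parser live in ansible.utils or ansible.netcommon:

- name: Parse the ActiveList XML into structured JSON
  ansible.utils.cli_parse:
    text: "{{ ActiveList_XML.content }}"
    parser:
      name: ansible.utils.xml
  register: ParsedActiveList
  delegate_to: localhost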

So let's copy that over to a .json file and take a look

Which results in:

Alright we have the XML parsed over to JSON! Now we can template it!

Looking at the JSON above we now need to loop over the activeSession under ParsedActiveList.parsed.activeList

In Jinja2 it looks like this:

Where we pick and choose our fields and the order we want to comma-separate them. I like to include the | default("N/A") filter to add a value of "N/A" to any empty or non-present value.

The resulting CSV looks like this!

Now this is filtered against my user name but I have all 8,000+ authentication sessions, one per row, in this CSV file!

Easily filtered (as seen above) and sorted. Searchable. Simple.

What else can we do with this API?

Use Case #3 – Per-MAC Session Details

Now in my pièce de résistance we are going to add another parsing technology I love – the Cisco Genie Parser – to capture MAC addresses to feed the last Cisco ISE MnT REST API – the MAC Session.

Add this final Request to your Cisco ISE Postman Collection replacing the MAC with either a Postman variable or a MAC that has a session in ISE.

Which returns:

Ok so now that we know we can send any MAC with a session in ISE against the REST API – how can we dynamically find and feed those MAC addresses from the network?

Answer: Genie

In order to get a list of authentication sessions on a Cisco switch we use the show authentication sessions command at the CLI

So the first thing I do is check the Genie Parser library and see if they have an available parser for the command and for what Cisco platforms they support.

Sure enough – there is a parser for the command I need.

So back to the Ansible playbook – first we need to run the ios_command and run the show authentication sessions command registering the results into a variable. This playbook needs to be scoped for the Access layer hosts where the 802.1x / MAB boundary is enforced.

Then we use the Genie Parser to transform the RAW CLI standard output (stdout) to – you guessed it – JSON ! We only set this fact (the variable) when the output does NOT equal “No sessions currently exist” in case there are no authentication sessions present on the device.

Now we don’t need this output in a JSON file but we do need to parse it, or query it, in order to get the MAC addresses from the results in order to send them to the ISE MnT API.

JSON_Query is another extremely potent Ansible filter that works like an SQL query but against structured JSON. It can take some getting used to (painful laughing) but once you get the hang of the syntax it's incredibly fast and powerful.

I owe a big shout out to my pair-programming partner who eventually figured this out with (for?) me

To break this apart:

We set up a variable, jquery, to hold the actual query itself.

Then we set a fact which is the pyats_auth.interfaces value, which we:

  • filter from a dictionary to items (a list with keys and values)
  • JSON_Query with our jquery string
  • Flatten down the list (since we just need the nested value.client key)

Now we have another list, MACList, that contains a list of MAC addresses from the JSON_Query, which is querying the JSON we used Genie to convert from the IOS command!
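A sketch of that set_fact – the exact JMESPath depends on the parser's schema; this assumes Genie nests each session under interfaces.<interface>.client.<mac>.client:

- name: Build the list of MAC addresses from the parsed sessions
  ansible.builtin.set_fact:
    MACList: "{{ pyats_auth.interfaces | dict2items | json_query(jquery) | flatten }}"
  vars:
    jquery: "[].value.client.*.client"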

From here it’s a simple matter of feeding the API each MAC in that list in a loop and registering the results:

I want to draw your attention to additional filters I had to use. Primarily because I was stuck on “Why isn’t this working? It should work .. but it doesn’t work” for a long time at this step. Eventually using debug I printed myself the variable MACList when I spotted something – the format of the MAC address!

The Cisco IOS MAC format is different than the Cisco ISE MnT REST API expects!

Meaning I was feeding the API:

ab00.cd11.ef22

Instead of:

AB:00:CD:11:EF:22

So by adding the hwaddr('linux') filter it changed the structure of the MAC address.

Then by adding the upper filter it changed the lower case letters to upper case.

Once I figured this out – it all started to work.
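The looped task then looks something like this sketch – the MnT path and credential variables are assumptions:

- name: Get session details from the MnT API for each MAC
  ansible.builtin.uri:
    url: "https://ise.domain.com/admin/API/mnt/Session/MACAddress/{{ item | hwaddr('linux') | upper }}"
    method: GET
    user: "{{ mnt_user }}"
    password: "{{ mnt_password }}"
    force_basic_auth: yes
    validate_certs: no
    return_content: yes
  register: MACSessionDetails
  delegate_to: localhost
  loop: "{{ MACList }}"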

So now we have the XML in MACSessionDetails which we again need to parse with the Ansible Utility CLI_parse

Now we need another loop to iterate over each result, using the .content key of each {{ item }}

So now we can actually create our .json file from the parsed XML

Which looks like this:

Now, Jinja2 templating of JSON lists is a funny thing. Because we get a natural list back, as indicated by the square bracket after results [ , we can loop into this "directly".

Meaning

{% for result in ParsedActiveSessionMACList.results %}

{{ result.parsed.sessionParameters.user_name }}

{% endfor %}

NOT

{% for result in ParsedActiveSessionMACList.results %}

{{ ParsedActiveSessionMACList.results[result].parsed.sessionParameters.user_name }}

{% endfor %}

Which returns:

Success!

What I find neat:

  • Never need to sign into ISE except for deep audits – which I have the ID in the CSV to look up
  • At the L2 Access Layer I am actually, without DNS, getting the hostnames / usernames / IP addresses of the connected devices! Pretty cool!
  • At scale either in batch or on-demand I can get this data in seconds!
  • I wouldn’t classify anything I am doing as complicated – yes there are a lot of little pieces but they all fit together nicely.
  • I no longer fear XML returning from a REST API
  • I’ve fully automated the 3 key ISE 2.x (3.x has even more) MnT REST APIs!

Cisco Facts – A Collection of Ansible IOS / NXOS Facts and Genie Parsing Playbooks

Cisco DevNet Code Exchange has published my repository!

Cisco_Facts by automateyournetwork

Ansible playbooks that use the IOS / NXOS Facts modules and Genie parsed commands to transform RAW JSON into business-ready documentation

Here you can easily start capturing Ansible Facts for IOS and NXOS and transform the JSON into CSV and Markdown !

Also included are a bunch of valuable Genie parsed show commands which transform the response into JSON then again transforms the JSON into CSV and Markdown!

The playbooks use Prompts so you should be able to clone the repo and update the hosts file and start targeting your hosts! For full enterprise support I suggest you refactor the group_vars and remove the prompts moving to full Ansible Vault – but for portability and ease of start-up I’ve made them prompted playbooks for now.

I would love to hear how they work out for you – please comment below if you have success!

Cisco Services APIs Ansible Playbooks – Version 2.0

I am very pleased to release the Cisco Services API Ansible Playbooks Version 2.0 which has been approved and released on Cisco DevNet Code Exchange !

You can find the code here and here

This major revision basically shifts away from lineinfile to Jinja2 templates for scale, performance, readability, and general best practices.

Serial 2 Info

The Cisco Serial 2 Info API receives a valid serial number and then returns structured JSON with your Cisco Contractual information !

The playbook uses the Genie parser to parse the show inventory command

After authenticating against the OAuth 2 service to get a Bearer token

It provides the API the serial number for every part per device.

The API provides the following information back:

Which we first dump into JSON and YAML files

Then template into CSV and MD

Using Jinja2

Which gives us:

Recommended Release

The other, very similar, Ansible playbook uses the Cisco Recommended Release API to create a spreadsheet with the current image on a host and the Cisco recommended version for that host given the Part ID (PID)

Here we don't even have to use Genie to parse – we can use the Ansible Facts module

And we transform again with Jinja2

And get this great report!

Please reach out to me directly if you need any help implementing these playbooks but I believe the instructions and code to be easy enough any beginner, with a little bit of refactoring and thought, could use this code as a starting point in their automation journey.

Cisco_API_v2 by automateyournetwork

Ansible playbooks that capture serial number and PID and send them to the Cisco.com APIs transforming the response into business-ready documents. Version 2.0 uses Jinja2 templates.

A Recipe For Success – Using Ansible to Transform Cisco Data Centre Facts into Business-Ready Documentation

One of my favourite recipes is the Hakuna Frittata – both because I am a big fan of puns and because I enjoy this hearty vegetarian meal that even I can handle putting together.

Inspired by this simple recipe I have decided to try and document my highly successful Ansible Cisco NXOS Facts playbook that captures and transforms raw facts from the data centre into business-ready documentation – automatically.

Ansible Cisco NXOS Facts to Business-Ready Documentation
Prep: 60-90 Min
Cook: 2-3 Min
Serves: An entire enterprise

Ingredients

1 Preheated Visual Studio Code
1 Git repository and Git
1 stick of Linux (a host with Ansible installed and SSH connectivity to the network devices)
3 pinches of Python filters
1 Cup of Ansible playbook (a YAML file with the serially executed tasks Ansible will perform)
1 Cup of Ansible module – NXOS_Facts
2 Tablespoons of Jinja2 Template
1 Teaspoon of hosts file
1 Tablespoon of group_vars
2 Raw Eggs – Cisco NXOS 7000 Aggregation Switches

Helpful Tip

This is not magic but did not necessarily come easy to me. You can use debug and print msg to yourself at the CLI. At each step that I register or have data inside a new variable I like to print it to the screen (one to see what the data, in JSON format, looks like; and two, to confirm my variable is not empty!)

Directions

1. You will need to first setup a hosts file listing your targeted hosts. I like to have a hierarchy as such:

hosts
[DC:children]
DCAgg
DCAccess

[DCAgg]
N7K01
N7K02

[DCAccess]
N5KA01
N5KB01
N5KA02
N5KB02

Or whatever your logical topology resembles.

2. Next we need to be able to securely connect to the devices. Create a group_vars folder and inside create a file that matches your hosts group name – in this case DC.yml

DC.yml

3. Create all the various output folder structure you require to store the files the playbook creates. I like something hierarchical again:

4. Create a playbooks folder to store the YAML file format Ansible playbook and a file called CiscoDCAggFacts.yml

In this playbook, which runs serially, we first capture the facts then transform them into business-ready documentation.

First we scope our targeted hosts (hosts: DCAgg)

Then we use the NXOS_Facts module to go gather all of the data. I want all the data so I choose gather_subset: all, but I could pick a smaller subset of facts to collect.

Next, and this is an important step, we take the captured data, now stored in the magic Ansible variable – {{ ansible_facts }} and put that into output files.

Using the | to_nice_json and | to_nice_yaml Python filters we can make the “RAW JSON” inside the variable (one long string if you were to look at it) into human-readable documentation.
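A sketch of those two tasks – the destination paths are illustrative:

- name: Create RAW JSON facts file
  ansible.builtin.copy:
    content: "{{ ansible_facts | to_nice_json }}"
    dest: ../documentation/DC/DCAgg/{{ inventory_hostname }}_facts.json

- name: Create RAW YAML facts file
  ansible.builtin.copy:
    content: "{{ ansible_facts | to_nice_yaml }}"
    dest: ../documentation/DC/DCAgg/{{ inventory_hostname }}_facts.yml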

4b. Repeatable step

NXOS Facts provides facts that can be put into the following distinct reports:

Platform information (hostname, serial number, license, software version, disk and memory information)
A list of all of the installed Modules hosted on the platform
A list of all IP addresses hosted on the platform
A list of all VLANs hosted on the platform
A list of all of the enabled Features on the platform
A list of all of the Interfaces, physical and virtual, including Fabric Extenders (FEX)
A list of all connected Neighbors
Fan information
Power Supply information

For some of these files, if the JSON data is structured in a way that lends itself, I will create both a Comma-Separated Values (csv; a spreadsheet) file and a markdown (md; "html-light") file. Some of the reports are just the csv file (IPs, Features, VLANs specifically).

The following code can be copied 9 times and adjusted by updating the references – the task name, the template name, and the output file name – otherwise the basic structure is repeatable.

In order to create the HTML mind map you will also need markmap installed.

Another example of the code – this is the Interfaces section – notice only the name, src, and dest file names need to be updated as well as the MD and HTML file names in the shell command.
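A sketch of the repeatable pattern – the template and file names are illustrative, and markmap here is the markmap-cli command:

- name: Template Interface facts into CSV and markdown
  ansible.builtin.template:
    src: ../roles/dc/dc_agg/templates/CiscoDCAggInterfacesFacts.j2
    dest: ../documentation/DC/DCAgg/{{ inventory_hostname }}_interfaces.{{ item }}
  loop:
    - csv
    - md

- name: Render the markdown into an interactive HTML mind map
  ansible.builtin.shell: markmap ../documentation/DC/DCAgg/{{ inventory_hostname }}_interfaces.md -o ../documentation/DC/DCAgg/{{ inventory_hostname }}_interfaces.html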

5. The Jinja2 Templates

Now that we have finished our Ansible playbook we need to create the Jinja2 templates we reference in the Ansible template module (in the src line)

Create the following folder structure to store the templates:

roles\dc\dc_agg\templates

Then, for each of the 9 templating tasks, create a matching .j2 file – for example the “base facts” as I like to call them – CiscoDCAggFacts.j2

In this template we need an If Else End If structure to test if we are templating csv or markdown then some For Loops to iterate over the JSON lists and key value pairs.

Add a header row with columns for the various fields of data. Reference your Nice JSON file to find the key value pairs.

No "For Loop" is required here – just straight data from the JSON

Since it's not csv it must be md; so add the appropriate markdown header rows

Then add the data row using markdown pipes for delimiters instead of commas

Close out the If

An example with For Loops might be Interfaces or Neighbors but the rest of the syntax and structure is the same

Now because there are multiple interfaces I need to loop or iterate over each interface.

Now add the row of data

Note you can include “In-line” If statements to check if a variable is defined. Some interfaces might not have a Description for example. Test if it is defined first, and if not (else) use a default of “No Description”

Other fields are imperative and do not need to be tested.

Close the Loop

Now do the markdown headers for Interfaces

Then the For Loop again and data row again but using pipes

Then close out the If statement
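Assembled, the Interfaces template looks something like this sketch – the per-interface keys (description, state) are assumptions against the NXOS facts structure:

{% if item == 'csv' %}
Interface,Description,State
{% for interface, values in ansible_facts.net_interfaces.items() %}
{{ interface }},{{ values.description | default('No Description') }},{{ values.state | default('N/A') }}
{% endfor %}
{% else %}
# {{ inventory_hostname }} Interfaces
| Interface | Description | State |
| --------- | ----------- | ----- |
{% for interface, values in ansible_facts.net_interfaces.items() %}
| {{ interface }} | {{ values.description | default('No Description') }} | {{ values.state | default('N/A') }} |
{% endfor %}
{% endif %}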

Complete the remaining templates. Save everything and Git commit / push up to your repo.

Cooking Time

Let's run the playbook against two fully-loaded production Nexus 7000s using the Linux time command

Two minutes in the oven !

Results

Some samples of the output.

First the Nice JSON – note the lists have been collapsed to be brief but any of the lists can be expanded in VS Code for the details

Interfaces

Neighbors

Now some prefer YAML to JSON so we have the exact same data but in YAML format as well

Now the above is already incredible but I wouldn’t call JSON and YAML files “business-ready” – for that we need a spreadsheet!

The real tasty stuff are the CSV files!

The general facts

Interfaces

Note that you can filter these csv files directly in VS Code – here I have applied a filter on all interfaces without a description

This captures all types of interfaces

Including SVIs

The Markdown provides a quickly rendered VS Code or browser experience

And the Interactive HTML is pretty neat!

Now remember we have all of these file types for all of the various facts these are just a few samples I like to hand out to the audience – for the full blown experience you can hopefully follow this recipe and cook your own Cisco NXOS Ansible Facts playbook!

Please reach out if you need any additional tips or advice ! I can be reached here or on my social media platforms.

Prevent an Intrusion – Run the Recommended Version!

In the wake of some very high-profile IT security breaches and state-sponsored attacks using compromised software, today I wrote some infrastructure as code Ansible playbooks to create business-ready documentation that helps us understand our Cisco software version footprint against the release the vendor recommends. It is very important to run "Safe Harbor" code in the form of the Gold Star release. These releases are as close as it gets to bug-free, secure, tested, and supported in production environments.

The old way involved getting the Cisco Part ID (PID), or several PIDs, and looking up the recommended release on Cisco.com using an ever-deepening hierarchy of platforms, operating systems, and PIDs. At scale this is a day's worth of work to go gather all of this information and present it in a way the business can understand.

Building on my recent success with the Serial2Info Cisco.com API as well as Ansible Facts I thought this might be another nice use-case for business-centric, non-technical (not routes, IP addresses, mac addresses, etc), extremely important and critical insight.

Use Case

Can I automatically get the PID from a host or group of hosts and provide it to the Cisco.com Software Suggestion API building business-ready reports in CSV and markdown?

Answer: Yes!

The Playbook

Again you are going to need:

* A Linux Host with SSH access to your Cisco IOS devices and HTTPS access to the Cisco.com API
* Credentials for the host and for the OAuth2 API
* We are not using Genie parsers here so just “base” Ansible will work

Step 1. Setup credential handling

Create a playbook file called CiscoCoreRecommendedReleaseFacts.yml

Again I use the prompted methodology here, the same as with the Serial2Info API

Gather the username, enable secret, Cisco.com API ClientID, Client Secret

Step 2. Gather Ansible Facts

Using the ios_facts module gather just the hardware subset

Because we are using Ansible Facts we do not need to register anything – the JSON is stored in the Ansible magic variable ansible_facts

I need 2 keys from this JSON – the PID and ideally the current running version. These can be found as follows in the ansible_facts variable:

Which is accessed as ansible_facts.net_model

Which again is accessed as ansible_facts.net_version

With the information above – without going any further – I could already build a nice report about what platforms and running versions there are!

But let’s go a step further and find out what Cisco recommends I should be running!

Step 3. Get your OAuth2 token

First, using the Ansible URI module

We need to get our token using the registered prompted credentials.

The API requires the following headers and body formatting; register the response as a variable (token):

We have to break apart the RAW JSON token to pass it to the ultimate Recommended Release API:

Now we are ready to send PIDs to the API.

Step 4 – Send PID to Cisco.com API

Again using the URI module:

Here we pass the ansible_facts.net_model Fact to the API as an HTTP GET:

The headers and body requirements. Notice the authentication and how we pass the Bearer Token along. We also register the returned JSON:
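A sketch of that request – the URL path is from my reading of the v2 Software Suggestion API, so confirm it against the portal documentation:

- name: Get the recommended release for the platform PID
  ansible.builtin.uri:
    url: "https://api.cisco.com/software/suggestion/v2/suggestions/releases/productIds/{{ ansible_facts.net_model }}"
    method: GET
    headers:
      Authorization: "{{ token_type }} {{ access_token }}"
      Accept: application/json
  register: RecommendedRelease
  delegate_to: localhost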

Here is what the returned JSON looks like:

The highest level key is json or accessed via RecommendedRelease.json

There is a productlist

Which as you can see is a list as denoted by the [ ]

Inside this list is another product key with the values from the API about the product itself

A little further down we find the recommended software release

Step 5 – Transform technical documentation into business ready CSV / MD files

These JSON and YAML files (I also use the | to_nice_yaml filter to create a YAML file along with the JSON file) are created for technical purposes, but we can do a bit better, making the information more palatable using business formats like CSV and markdown.

It is just a matter of using Jinja2 to template the CSV and Markdown files from the structured JSON variables / key-value pairs.

Add a final task in the Ansible playbook that will loop over the CSV and MD file types using the template module to source a new .j2 file – CiscoCoreRecommendedReleaseFacts.j2 – where our logic will go to generate our artifacts.

The Jinja2 starts with an If Else EndIf statement that checks if the Ansible loop is on CSV or not. If it is it uses the CSV section of templated file format otherwise it uses markdown syntax.

First we want to add a CSV header row

Then we need a For Loop to loop over each product in the productList

Now we add our “data line” per product in the loop using the various keys

Hostname for example uses the Ansible magic variable inventory_hostname

Then we want the Base PID. We use the Ansible default filter to set a default value in case the variable happens to be empty.

We continue accessing our keys and then we close the loop.

Now we need to create the Markdown syntax

And the same logic for the “data row” but with pipes instead of commas. Make sure to close off the If statement
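Assembled, a minimal sketch of the template – the suggestion keys (basePID, releaseFormat1) are assumptions against the v2 response:

{% if item == 'csv' %}
Hostname,Base PID,Running Version,Suggested Release
{% for product in RecommendedRelease.json.productList %}
{{ inventory_hostname }},{{ product.product.basePID | default('N/A') }},{{ ansible_facts.net_version }},{{ product.suggestions[0].releaseFormat1 | default('N/A') }}
{% endfor %}
{% else %}
| Hostname | Base PID | Running Version | Suggested Release |
| -------- | -------- | --------------- | ----------------- |
{% for product in RecommendedRelease.json.productList %}
| {{ inventory_hostname }} | {{ product.product.basePID | default('N/A') }} | {{ ansible_facts.net_version }} | {{ product.suggestions[0].releaseFormat1 | default('N/A') }} |
{% endfor %}
{% endif %}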

Step 6 – Run playbook and check results

We run the playbook as ansible-playbook CiscoCoreRecommendedReleaseFacts.yml

Answer the prompts

Let the playbook run and check the results!

Summary

Again, with a few free, simple tools like Ansible and the Cisco.com API we can, at scale, gather and report on the current running version and the vendor-recommended version – quickly, easily, and fully automatically!

Now go and start protecting your enterprise network armed with these facts!

Sign on the Dotted Line

Layer 9 issues – finance – are often some of the most challenging a network engineer faces. Contract management can be particularly difficult in any scale of organization, especially if you are not "sole source" purchasing. Serial numbers and contracts are also not typically things the "network people" want to deal with, but when that P1 hits and you try to open a SEV 1 TAC case – only to find out you are not under contract – well, I've been in less terrifying car accidents than this nightmare scenario.

I have good news! Using a mix of automation and developer-like tools, the network engineer can now create a real source of truth that, along with routes and MAC address-tables and other technical information, can include inventory and contractual business documentation built from stateful, truthful, real-time facts from Cisco.

Ok so let’s get into it!

As a rough outline for our logic here is the use case:

Can I automatically gather the serial numbers from Cisco device hostnames and then provide them to Cisco and get my contractual state for each part on each device?

Answer: YES !

What you will need:

* Linux host with Ansible, Genie parser
* Linux host requires both SSH access to the Cisco host and Internet Access to the OAuth2 and Cisco.com API HTTPS URLs
* Cisco SmartNet Total Care – I have written up instructions in this repo under the “OnBoarding Process” section

The Playbook

Step 1 – We will need to get the serial number for every part for a given hostname. For this we will use the standard show inventory command for IOS using the Ansible ios_command module. I will be using prompted methods for demonstration purposes or for on-demand multi-user (each with their own accounts) runtime, but we could easily Ansible Vault these credentials for fully hands-free run time or to containerize this playbook. I am also targeting a specific host – the Core – but I could easily change this to be every IOS device in the enterprise. This playbook is called CiscoCoreSerial2InfoFacts.yml

First prompt for username, enable secret, Cisco Customer ID, Cisco Customer Secret and register these variables:

Then run the ios_command show inventory and register the results in a variable.

Step 2 – Parse the raw output from the IOS command

Next, we use Genie to parse the raw results and register a new variable with the structured JSON. Genie requires, for show inventory, the command, the operating system, and the platform (in this case a Cisco 6500)

And here is what that structured JSON looks like:

So now we have a nice list of each part and their serial number we can feed the Cisco.com API to get back our contract information.

Step 3 – Get an OAuth 2 token from Cisco web services.

Cisco.com APIs use OAuth2 for authentication, meaning you cannot go directly against the API with a username and password. First you must retrieve a Bearer Token and then use that token, within its limited lifetime, against the ultimate API.

Using the Ansible URI module go get a token and register the results as a variable. Provide the Customer ID and Client secret prompts to the API for authentication. This is an HTTP POST method.

With the new raw token, set up the token type and access token from the raw response

Step 4 – Provide token to the Serial2Contract Cisco API to get back contractual information for each serial number.

Now that we have a valid token we can authenticate and authorize against the Cisco SmartNet Total Care Serial Number 2 Contract Information API.

In this step we are going to use an Ansible loop to loop over the Genie parsed structured JSON from the show inventory command providing the sn key for each item in the list. We need to use the Python | dict2items Ansible filter to transform the dictionary into a list we can iterate over.

The loop is written as

loop: "{{ pyats_inventory.index | dict2items }}"

And each serial number is referenced in the URL each iteration through the loop:

url: https://api.cisco.com/sn2info/v2/coverage/summary/serial_numbers/{{ item.value.sn }}

We register the returned structured JSON from the API as Serial2Info which looks like this:

So now I have the JSON – let’s make it a business ready artifact – a CSV file / spreadsheet and a markdown file – using Jinja2

Step 5 – Using Jinja2 let's template the structured JSON into a CSV file for the business.

Create a matching Jinja2 template called CiscoCoreSerial2InfoFacts.j2 and add a task to Ansible that uses the template module to build both a CSV file and a markdown file from the JSON.

In the Jinja2 file we need a section for CSV (if item = “csv”) and a section for markdown (else) based on their respective syntax. Then we need to loop over each of the responses.

result in Serial2Info['results'] is the loop used. I also add a default value using a filter | default('N/A') in case a value is not defined. SFPs, for example, do not have all of the fields that a supervisor module has, so to be safe it's best to build in a default value for each variable.

The final Jinja2 looks something like this:
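A minimal sketch – the field names (sr_no, is_covered, service_contract_number, covered_product_line_end_date) are assumptions based on the coverage summary API:

{% if item == 'csv' %}
Hostname,Serial Number,Covered,Contract Number,Coverage End Date
{% for result in Serial2Info['results'] %}
{% for record in result.json.serial_numbers %}
{{ inventory_hostname }},{{ record.sr_no | default('N/A') }},{{ record.is_covered | default('N/A') }},{{ record.service_contract_number | default('N/A') }},{{ record.covered_product_line_end_date | default('N/A') }}
{% endfor %}
{% endfor %}
{% else %}
| Hostname | Serial Number | Covered | Contract Number | Coverage End Date |
| -------- | ------------- | ------- | --------------- | ----------------- |
{% for result in Serial2Info['results'] %}
{% for record in result.json.serial_numbers %}
| {{ inventory_hostname }} | {{ record.sr_no | default('N/A') }} | {{ record.is_covered | default('N/A') }} | {{ record.service_contract_number | default('N/A') }} | {{ record.covered_product_line_end_date | default('N/A') }} |
{% endfor %}
{% endfor %}
{% endif %}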

Which results in a CSV and Markdown file with a row for every serial number and their contractual facts from the API.

Summary

Large-scale inventory and contract information can easily be automated into CSV spreadsheets that the business can easily consume. Ansible, Genie, Cisco.com APIs, Jinja2 templating, and a little bit of logic come together into an automation pipeline that ensures contractual compliance and inventory fidelity at scale!