
The modern approach to enterprise network management
Thank you to Cisco DevNet and everybody in my community who has supported me over the years!
I won!
I am so very happy to announce that, after 8 months of development, Automate Your Network is now available on educative.io as an online course (early access)!
Automate Your Network – Learn Interactively (educative.io)
Check it out!
Infrastructure as Software – Applying A Software Design Pattern to Network Automation!
Applying A Software Design Pattern To Network Automation – Packet Pushers
Remember this CLI output?
Using real network data I’ve reimagined the Show IP Interface Brief command as an Interactive 3D World!
Complete with indicators making it easy to see if an Interface is Healthy or Not!
https://www.automateyournetwork.ca/wp-content/uploads/verge3d/1170/IP_Interface_Brief.html
I have been very silent on this blog lately (apologies!). Your best way to follow my development work now is likely Twitter or YouTube.
However! I have been using Blender to make 3D Animations from Network State Data
I've recently figured out how to make these animations Web-ready!
Check it out! This is the PSIRT Report for the Cisco DevNet Sandbox Nexus 9k, rendered as an interactive 3D Blender scene
Click here for the full page version – Verge3D Web Interactive (automateyournetwork.ca)
Much more to come!
With all the big WebEx news – including a new logo – I wanted to revisit the basic #chatbots I have working using pyATS, Python requests, and the WebEx API after the conversation came up in the #pyATS WebEx Community space today:
First, let's take a look at what this does. This is not limited to Merlin; any pyATS job has this capability.
If you create the pyats.conf file as Takashi suggests and add the [webex] information, the pyATS job will report its job summary into the WebEx space you specify in the config file.
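For reference, the relevant section of pyats.conf looks something like this (a minimal sketch; the token and space values are placeholders, and the key names follow the pyATS WebEx notification documentation):

```ini
# ~/.pyats/pyats.conf
# [webex] enables the end-of-job summary notification
[webex]
token = <YOUR_WEBEX_BOT_OR_12_HOUR_TOKEN>
space = <YOUR_WEBEX_ROOM_ID>
```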
This looks something like this inside of WebEx:
This in itself is pretty handy! And all you need to do is go to the Cisco WebEx for Developers portal and either make a Bot under My Apps
Or, right from the browser, grab one of the 12-hour tokens
The easiest way to get one of these is to go to the Documentation
Find the API Reference
Find Messages
Pick POST
COPY THIS BEARER TOKEN
Paste that into your pyats.conf
But how do I get the Room / Channel / Space ID?
If you browse to Rooms
You can GET your current Room list
This will give you the JSON list – here is the Merlin Room ID
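If you would rather script this step, a minimal sketch with Python requests against the public https://webexapis.com/v1/rooms endpoint looks like this (the token is a placeholder):

```python
import requests

token = "<YOUR_WEBEX_TOKEN>"  # bot token or 12-hour developer token

# List the rooms this account belongs to; each item's "id" is the room ID
response = requests.get(
    "https://webexapis.com/v1/rooms",
    headers={"Authorization": f"Bearer {token}"},
)
response.raise_for_status()

for room in response.json()["items"]:
    print(room["title"], room["id"])
```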
That's it! You are ready to connect your pyATS jobs and receive the job summary as a WebEx message!
With the foundational WebEx integration above, and given how simple the WebEx API is, I thought I would integrate a few sample commands into a Merlin pyATS job to show the community how you can send network state data to WebEx!
I want the message to be in Markdown, so I am going to use a Jinja2 template to craft the JSON we can POST with Python requests after pyATS has parsed or learned the data.
We don't need a lot to make this happen, either. Here is everything I import:
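A minimal sketch of those imports (reconstructed from the walkthrough; your exact list may differ):

```python
import json
import logging

import requests
from requests_toolbelt.multipart.encoder import MultipartEncoder  # for the CSV attachment
from jinja2 import Environment, FileSystemLoader  # for templating the messages

from pyats import aetest  # the pyATS test framework
```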
We set up our WebEx room and token (12-hour or bot) as variables we can call later.
The general_functionalities are important: this is object-oriented code that gets reused for every pyATS learn or parse library call.
Then, for this example, I will do 2 learn functions, platform and routing, and see if I can transform real network state data into meaningful WebEx messages.
I tell Python where to find the Jinja2 templates and set up a variable I can use later to load said templates.
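Assuming the templates live in a local templates/ folder (an illustrative path), that setup is essentially:

```python
from jinja2 import Environment, FileSystemLoader

# Point Jinja2 at the directory holding the .j2 templates
template_dir = Environment(loader=FileSystemLoader("templates/"))
```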
We then set up our pyATS framework and connect (testbed.connect) to our topology.
Again, the testbed file looks like this:
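A minimal sketch of such a testbed file (the device name, IP, and credentials are placeholders; substitute your own details):

```yaml
# testbed.yaml
devices:
  sandbox-nexus9k:
    os: nxos
    type: switch
    credentials:
      default:
        username: admin
        password: "<password>"
    connections:
      cli:
        protocol: ssh
        ip: "<management ip>"
```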
Now that we have connected, we can begin our Test Steps, ultimately looping (for) over each device in our topology (testbed).
Yes, in this Sandbox there is only 1 device, but this could scale to X devices; just add them to the testbed.
Now we can learn platform
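Inside the test method that is essentially one call per device (variable names here are illustrative):

```python
# Loop over every device in the testbed and learn the platform feature;
# .info returns the learned model as a plain Python dict
for device in testbed:
    self.learned_platform = device.learn("platform").info
```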
As of right now we have the following JavaScript Object Notation (JSON) data inside the self.learned_platform variable
Our goals:
Now we start our test steps
We will get a boolean pass/fail from the Create CSV and Send to WebEx step.
Next I set up a few variables: namely the Jinja2 references, the directory for the XLSX file, and the file name.
Also, for attachments, we will declare another variable: a MultipartEncoder with the information required to attach the Learned_Platform.csv file.
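A sketch of that encoder with requests_toolbelt (roomId and files are the WebEx messages API field names; the token, room ID, and file path are placeholders):

```python
import requests
from requests_toolbelt.multipart.encoder import MultipartEncoder

token = "<YOUR_WEBEX_TOKEN>"
room_id = "<YOUR_WEBEX_ROOM_ID>"

# Multipart body: the room to post into, a short text message, and the CSV attachment
multipart = MultipartEncoder(
    fields={
        "roomId": room_id,
        "text": "Learned Platform attached",
        "files": ("Learned_Platform.csv", open("Learned_Platform.csv", "rb"), "text/csv"),
    }
)

# POST using the encoder's own content type, which carries the multipart boundary
requests.post(
    "https://webexapis.com/v1/messages",
    data=multipart,
    headers={"Authorization": f"Bearer {token}", "Content-Type": multipart.content_type},
)
```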
Next we template the .xlsx file from the Jinja2 template
Which looks like this:
That renders a file that looks like this:
We will use 2 more Jinja2 templates for the actual message we will send. The JSON body we POST to WebEx is a single line, and in Markdown a header starts with a # symbol; to avoid turning the entire message into one giant header, we send the header line first as its own message.
Here is the line in Python
And the matching Jinja2 template
Remember, we are sending a long, single-line string as Markdown, so if we want multiple lines we need to add <br/>, the Markdown line-break tag.
Here is how we send the header
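A sketch of that POST (webexapis.com/v1/messages is the real endpoint; the template name, token, and room ID are illustrative):

```python
import requests
from jinja2 import Environment, FileSystemLoader

token = "<YOUR_WEBEX_TOKEN>"
room_id = "<YOUR_WEBEX_ROOM_ID>"
env = Environment(loader=FileSystemLoader("templates/"))

# Render the small header template and send it as its own Markdown message
header_markdown = env.get_template("webex_header.j2").render(device="sandbox-nexus9k")

requests.post(
    "https://webexapis.com/v1/messages",
    headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    json={"roomId": room_id, "markdown": header_markdown},
)
```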
Which looks like this in WebEx:
Now let’s go ahead and template the Markdown
Which looks like:
Important! I had to trim this down from the full Markdown, as there *is* a character limit, so watch out for that!
But that is also why we can attach the full CSV
So go get #chatbotting using real network state data!
Reach out if you hit any snags and watch for the full development video!
Imagine being able to use a keyword search engine against your network: a Google-like query for "VLAN 100", a MAC address, an IP address, even an ACL, or simply the keyword "Down", that returns real-time search results!
It sounds far-fetched, but that is exactly what I've been able to do in the latest addition to my open source project, Merlin!
merlin by automateyournetwork
Network Magic: Transforming the CLI and REST API using Infrastructure As Code automation
As you may know, Merlin already creates a no-SQL document database using TinyDB, a serverless database that is very easy to use. My only problem is that I haven't found a UI or frontend to consume and present the TinyDB data (a gap the TinyDB author confirmed).
Poking around the Internet, I found Elastic, a suite of tools that seems like a perfect fit for my goals: "Elastic – Free and Open Search".
I suggest you start here and read about the ELK Stack
I set up a 14-day trial in the Elastic Cloud for the purpose of getting going. Elastic can also be run in a local Docker container or hosted on Linux.
(If you try running it locally under WSL, watch out for the "System has not been booted with systemd as init system" error.)
Once you have logged into Elastic (you can use a Google account for this), you will want to set up a Deployment.
Here is Merlin as a Deployment
Which then opens up a full menu of amazing features and capabilities
Some key information:
When you first set up your Deployment you will get one-time displayed credentials. You *need* to make sure you capture this information!
Your Endpoint (the URL to your database) is available with a one-click copy here in the main dashboard. You can also launch Kibana and Enterprise Search and copy their unique endpoint URLs here.
Since we are using Elastic Cloud, make note of the Cloud ID, as we need it in Python to connect to our endpoints.
In order to set up the Search Engine, click Enterprise Search and then, when presented with the option, Elastic App Search.
Create an Engine
Name your engine (I would suggest whatever you named your deployment, plus -engine or -search).
Now the next screen will present you with four methods of populating the Search Engine with JSON
We are going to be Indexing by API. If you pay attention to the Example, it gives you everything you need to do this, along with a sample JSON body.
(You get your URL and Bearer Token; make note of both, as we need them in the Python.)
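As a sketch with plain Python requests (POST /api/as/v1/engines/&lt;engine&gt;/documents is the App Search documents endpoint; the endpoint hostname, engine name, key, and document body here are illustrative):

```python
import requests

app_search_url = "https://<your-deployment>.ent.<region>.cloud.es.io"  # your Enterprise Search endpoint
api_key = "private-xxxxxxxxxxxx"  # the Bearer token from the example

# Index one sample document into the engine; App Search accepts a JSON array of documents
response = requests.post(
    f"{app_search_url}/api/as/v1/engines/merlin-engine/documents",
    headers={"Authorization": f"Bearer {api_key}", "Content-Type": "application/json"},
    json=[{"id": "sample_doc", "interface": "mgmt0", "status": "up"}],
)
print(response.json())
```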
Here is the relevant Python / pyATS code you need to build your own Elastic index (Deployment) and then also the ElasticSearch search engine!
First you need to pip install pyATS, elasticsearch, and elastic-enterprise-search:
```bash
pip install "pyats[full]"
pip install elasticsearch
pip install elastic-enterprise-search
```
Next, in the actual Python, you will need to import the above libraries
As well as the pyATS framework
Next, in the Python, we need to set up a few things to interact with Elastic.
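A sketch of that setup (the Cloud ID, password, endpoint, and key are the values you captured earlier; the argument names follow the 8.x Python clients, while older 7.x clients use http_auth instead):

```python
from elasticsearch import Elasticsearch
from elastic_enterprise_search import AppSearch

# Elasticsearch client: connects to the Deployment via the Cloud ID
es = Elasticsearch(
    cloud_id="<your Cloud ID>",
    basic_auth=("elastic", "<the one-time password you captured>"),
)

# App Search client: connects to the Enterprise Search endpoint with the private key
app_search = AppSearch(
    "https://<your-deployment>.ent.<region>.cloud.es.io",
    bearer_auth="private-xxxxxxxxxxxx",
)
```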
Now we get into the actual pyATS job, first setting up the AEtest section and using testbed.connect to establish our SSH connection to the network device.
Next we set up our Test Case and define self, testbed, section, and steps. Each Step is a boolean test in pyATS.
For device in testbed kicks off the loop that runs the commands for each device in the testbed topology (the list of devices in the testbed file), as sketched below.
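A skeleton of that structure (class and method names here are illustrative):

```python
from pyats import aetest

class CommonSetup(aetest.CommonSetup):
    @aetest.subsection
    def connect(self, testbed):
        # Establish SSH to every device in the testbed topology
        testbed.connect()

class CollectNetworkState(aetest.Testcase):
    @aetest.test
    def gather_state(self, testbed, steps):
        # One pass per device in the testbed file
        for device in testbed:
            ...  # the learn / parse steps described next go here
```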
Now I have defined a reusable function that wraps a step, tries device.learn(function_name).info, and fails gracefully if the function could not be learned.
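A sketch of that helper (my reconstruction, not Merlin's exact code):

```python
def learn_feature(device, steps, feature):
    # One pyATS step per feature; the step fails gracefully if learning blows up
    with steps.start(f"Learning {feature}", continue_=True) as step:
        try:
            return device.learn(feature).info
        except Exception as err:
            step.failed(f"Could not learn {feature}: {err}")
```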
Now we simply feed this function the various features we want to learn
In this case it was written for the Cisco DevNet Sandbox NXOS Nexus 9k, which only supports a limited number of features. In a real environment we could learn even more!
Then we use a different function for the parsed show commands
And run a variety of show commands
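The parsed commands follow the same graceful pattern; a sketch (the helper name and command list are illustrative):

```python
def parse_command(device, steps, command):
    # One pyATS step per show command; fail gracefully if the parser errors out
    with steps.start(f"Parsing {command}", continue_=True) as step:
        try:
            return device.parse(command)
        except Exception as err:
            step.failed(f"Could not parse {command}: {err}")

# Examples of commands gathered in this walkthrough
parsed_ip_int_brief = parse_command(device, steps, "show ip interface brief")
parsed_version = parse_command(device, steps, "show version")
```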
Now Merlin has, to date, created business-ready documents (CSV, markdown, HTML) and experimental documents (Mind Maps, Network Graphs) from the JSON we have inside all of these variables.
Now here is how we send the JSON to be Indexed in Elastic
Let's take an example: learn BGP. As a second fail-safe check, in case it did parse correctly but for some reason was empty, I first check if it's not None.
If it's not None, we index it in our Deployment.
Then we index it in our Search Engine
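A sketch of that pair of indexing calls, assuming it sits inside the test method (the index name matches the one used later in this walkthrough; the document ID and engine name are illustrative, and the 8.x Elasticsearch client shown here takes document= where 7.x took body=):

```python
if self.learned_bgp is not None:
    # Index the raw JSON into the Elasticsearch Deployment
    es.index(
        index="devnet_sandbox_nexus9k",
        id="learned_bgp",
        document=self.learned_bgp,
    )

    # Index the same JSON into the App Search engine for keyword search
    app_search.index_documents(
        engine_name="merlin-engine",
        documents=[{"id": "learned_bgp", **self.learned_bgp}],
    )
```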
Here is show ip interface brief
It's easy and repetitive code, so much so that I will likely write another function for these 6 lines of code and just feed it the learn / show command.
In order to confirm your Elastic Deployment is up – you can use cURL or Postman or the Elastic API Console
Wait, what? I have just built a database that has an API I can query with Postman?!
Y E S !
Check this out
Launch Postman and set up a new Collection called Elastic
Add your username and password (the one-time displayed stuff I told you to write down!) under the Authorization – Type – Basic Auth
Add a Request called Deployment
Copy your Elastic endpoint URL
Paste it in as a GET in your Deployment Request
You should get a 200 Status back
And something like this
In Elastic – You can do the same thing!
Launch the API Console
If you leave the field empty and click Submit you get the same status data back
What about our Network Data?!
Now if you pay close attention to the pyATS and Python logs – you will see this call and URL (and status) when you send the data to your Deployment to be Indexed
The 200 means it was successful, but you can also take this string into Postman or the Elastic API Console!
Back in Postman
Which gives us:
And in the API Console we just GET /devnet_sandbox_nexus9k/_doc/show_ip_interface_brief
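Outside of Postman, the same document fetch is a single requests call (GET /&lt;index&gt;/_doc/&lt;id&gt; is the standard Elasticsearch document API; the endpoint hostname and credentials are the ones captured earlier):

```python
import requests

endpoint = "https://<your-deployment>.es.<region>.cloud.es.io:9243"  # illustrative
response = requests.get(
    f"{endpoint}/devnet_sandbox_nexus9k/_doc/show_ip_interface_brief",
    auth=("elastic", "<your password>"),
)
print(response.json()["_source"])  # the indexed network state JSON itself
```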
Now – check this out – make your next GET against just the base index
In the DevNet Sandbox there are almost 35,000 rows of data! WHAT!?
The full state as JSON
Over in API Console
Very cool! What about the Search Engine?
Well, the engine is now populated with Documents and Fields
Which look like this
We can filter on, say, VRF documents, and the search engine magic starts
Now let's check out keyword searches in the Query Tester
VRF
How about an IP Address
What about “Up”?
Visualizations
I want to be open: I have not fully developed any visualizations yet, but I want to show you Kibana and the absolutely incredible dashboards we can create using the Elastic Deployment data.
Launch Kibana and then select Kibana again
Now take a look at the incredible things we can do
As I said, I have barely scratched the surface, but let's look at what we could do in a Dashboard.
First thing we have to do is create an Index Pattern
I’ve selected the devnet_sandbox_nexus9k to be my index pattern
Now I have 6670 fields (!) to work with in Kibana Dashboards
Now it becomes, for a beginner like me, a little overwhelming, simply because of the vast number of choices we have for working with this data.
Kibana discovery and learning aside, my adventure into network search engines was fun and I learned a lot along the way. I've made a video of my development process here, if you would like to check it out before you try it yourself.
When I was tagged on Twitter about @ioshints' (Ivan Pepelnjak, CCIE #1354 Emeritus) latest blog post, I thought somebody was telling me I should read the latest blog.
I flagged this as “Hey what a coincidence I was just writing about #chatbots with Discord – I gotta read this later”
Turns out it was my article that was Worth Reading!
Why this is so special to me is that I started my automation journey with an ipSpace.net subscription, which was a key part of my early success with Ansible and cloud automation specifically. Ivan has also personally helped me write better code and has taken a personal interest in my success.
I am so incredibly humbled and thankful for Ivan’s recognition but even more by his commitment to be honest and open with his vast knowledge.
In a lot of ways I am trying to emulate Ivan’s approach and appreciate having a virtual mentor of such quality and capability.
Thanks!