Journey from JavaScripting to an Enterprise AppDynamics Utility


Here at Optimiz, one of our most frequently used tools is AppDynamics, a leader in the Application Performance Monitoring space. In this article, we walk through the utility we created to support our clients and consultants when they deploy and operate AppDynamics. The utility automates routine tasks that previously had to be performed by hand, saving our consultants' and users' valuable time and increasing productivity. It also reduces the risk of human error, which grows with the size of your configuration. Because of these factors, we believe the utility can aid further adoption of AppDynamics and streamline the process of working with it. It also provides easy access to the analytics AppDynamics generates, which can assist in making business decisions.

Key Capabilities

  • Export analytics data beyond the UI and REST API limits.
  • Import/export dashboards between different applications and controllers.
  • Migrate configurations from non-production to production.
  • Export events to CSV for external analysis, reducing noise and helping ensure effective, proactive alerting.
  • Create a CSV list of users, dashboards, or health rules.
  • Highly portable: written in Node.js and deployed with Docker.
  • Host it on-premises or in a public cloud. At Optimiz, we host it in our GCP environment.

Utility Architecture

AppDynamics exposes many API endpoints, which allow more flexibility than the supplied UI. We chose Node.js for our backend because JavaScript frameworks have become very popular in recent years, and the choice later helped us with some plugin development in Grafana. This backend let us expand and customize the existing AppDynamics API functionality and work around the CORS restrictions that prevent direct access from client-side JavaScript. We then used Express to serve the front end: a customized web interface giving easy access to tasks commonly performed by the consultants at Optimiz. The utility allows quick switching between environments, using your existing username/password combination for authentication. We finished by containerizing everything with Docker and deploying it with Cloud Run on Google Cloud Platform.
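
To give a sense of the shape of this setup, here is a minimal sketch (not our production code) of an Express server that serves a static front end and relays API calls to the controller, one common way to sidestep CORS. The /api/proxy route, CONTROLLER_URL variable, and port are illustrative placeholders:

    // Minimal sketch: serve the UI and proxy API calls to AppDynamics so the
    // browser never runs into CORS. Route and variable names are hypothetical.
    const express = require('express');
    const fetch = require('node-fetch');

    const app = express();
    app.use(express.static('public')); // the customized web interface

    // The browser calls our server; our server calls AppDynamics on its behalf.
    app.get('/api/proxy/*', async (req, res) => {
      const controllerUrl = process.env.CONTROLLER_URL;
      try {
        const upstream = await fetch(`${controllerUrl}/controller/${req.params[0]}`, {
          headers: { Authorization: req.headers.authorization || '' },
        });
        res.status(upstream.status).send(await upstream.text());
      } catch (err) {
        res.status(502).json({ error: err.message });
      }
    });

    app.listen(process.env.PORT || 8080);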

Utility Features

Due to the design of the current AppDynamics UI, certain actions can only be performed manually. Spending time on tedious tasks didn't sound very optimal to us, so we set out to create a utility that would assist our consultants with their day-to-day work. We have listed a few of those actions below.

  • Exporting dashboards or health rules 
  • Importing them into a separate account/environment 
  • Gathering large volumes of analytics 
  • Creating a CSV list of users, dashboards, or health rules 

With the standard UI, exporting is done one by one, and the same is true for importing into another account. Our consultants often use these features to transfer artifacts across multiple environments. To make this a more cohesive process, we created a way to export from one account and import into another with any number of artifacts in a single action, as sketched below. We also added functionality for creating CSV lists of the above-mentioned resources; in our use cases we found this helpful, though we recognize that not everyone needs it as a standard feature. There was also a limit of 5,000 records per analytics query, which we worked to overcome. This enables clients to store large amounts of analytical data in data warehouses such as Snowflake, where it can be correlated with other data sources to assist in making business decisions.
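
As a rough illustration of the batch export side, the sketch below pulls any number of dashboards from a source controller in one pass. The servlet path matches the documented AppDynamics dashboard export endpoint; the auth header, ID list, and output file are our own illustration:

    // Simplified batch export: fetch every dashboard definition in one pass
    // instead of clicking through the UI one at a time.
    const fetch = require('node-fetch');
    const fs = require('fs/promises');

    async function exportDashboards(controllerUrl, authHeader, dashboardIds) {
      const dashboards = [];
      for (const id of dashboardIds) {
        const res = await fetch(
          `${controllerUrl}/controller/CustomDashboardImportExportServlet?dashboardId=${id}`,
          { headers: { Authorization: authHeader } }
        );
        if (!res.ok) throw new Error(`Export of dashboard ${id} failed: ${res.status}`);
        dashboards.push(await res.json());
      }
      await fs.writeFile('dashboards.json', JSON.stringify(dashboards, null, 2));
      return dashboards;
    }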

Requirements

  1. Integrate the utility with the AppDynamics API 
  2. Allow users to authenticate with the API using their controller login
  3. Create a UI that is visually appealing and easy to use
  4. Make the project easily deployable 

With these requirements, we began work on the utility.

Our Undertaking

The first step we took was to learn more about JavaScript and the Node.js runtime. We had prior experience with the Flask framework in Python but little to no experience with JavaScript frameworks. As these have become quite popular in recent years, we decided to investigate this avenue for our utility.


We began by generating access tokens from the AppDynamics API using API clients, which required every user to have an API client created in the controller. We later found this inconvenient for users, so we switched to generating tokens with a username and password.
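
For reference, the original flow looked roughly like the sketch below, which exchanges API-client credentials for a short-lived bearer token via the controller's documented OAuth endpoint. Treat it as an illustration rather than our exact code; error handling is omitted:

    // Exchange API-client credentials for a short-lived access token.
    const fetch = require('node-fetch');

    async function getAccessToken(controllerUrl, clientName, accountName, clientSecret) {
      const res = await fetch(`${controllerUrl}/controller/api/oauth/access_token`, {
        method: 'POST',
        // Content type as shown in the AppDynamics API Clients documentation.
        headers: { 'Content-Type': 'application/vnd.appd.cntrl+protobuf;v=1' },
        body: `grant_type=client_credentials` +
              `&client_id=${clientName}@${accountName}` +
              `&client_secret=${clientSecret}`,
      });
      const { access_token } = await res.json();
      return access_token; // later sent as "Authorization: Bearer <token>"
    }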


Once we were able to get working tokens, we could start pulling and manipulating data. We started simple, pulling dashboard info, users, and health rules from the API and converting the data from JSON to CSV. Once we had completed this, we worked on functionality to export whole dashboards so they could be re-imported through the AppDynamics UI.
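
The JSON-to-CSV step itself is small. A stripped-down version of the idea is shown below; real dashboard, user, and health rule payloads carry many more fields, and the column names are illustrative:

    // Convert an array of objects into CSV text, quoting every value so
    // commas and quotes inside fields survive the round trip.
    function jsonToCsv(rows, columns) {
      const escape = (v) => `"${String(v ?? '').replace(/"/g, '""')}"`;
      const header = columns.map(escape).join(',');
      const lines = rows.map((row) => columns.map((c) => escape(row[c])).join(','));
      return [header, ...lines].join('\n');
    }

    // e.g. turning the controller's user list into a spreadsheet-friendly file:
    // const csv = jsonToCsv(users, ['id', 'name', 'email']);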


Now that we had export functionality, we could begin working on a way to import the dashboards from JSON. This proved to be a difficult task, but one we were able to overcome. It was an integral feature, as it helps both our consultants and customers quickly deploy artifacts across multiple environments.
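
The import side is trickier because the dashboard import servlet expects the JSON as a multipart file upload. Below is a simplified sketch using the form-data package alongside node-fetch; the servlet path matches the documented endpoint, and the rest is illustrative:

    // Push an exported dashboard definition into a target controller.
    const fetch = require('node-fetch');
    const FormData = require('form-data');

    async function importDashboard(controllerUrl, authHeader, dashboardJson) {
      const form = new FormData();
      form.append('file', JSON.stringify(dashboardJson), {
        filename: 'dashboard.json',
        contentType: 'application/json',
      });
      const res = await fetch(
        `${controllerUrl}/controller/CustomDashboardImportExportServlet`,
        {
          method: 'POST',
          headers: { Authorization: authHeader, ...form.getHeaders() },
          body: form,
        }
      );
      if (!res.ok) throw new Error(`Import failed: ${res.status}`);
    }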


While the import/export was being developed, we were also creating a way to access AppDynamics Analytics from the utility. Instead of using curl, you now have a clean UI for entering query parameters, which also bypasses the 5,000-record-per-query limit mentioned earlier. This analytics functionality was added to the utility's own API, so you can still make the same requests with curl if you prefer.
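
Conceptually, one way to step past a fixed per-query cap is to slice the overall time range into windows and concatenate the results. The sketch below shows that idea; runAnalyticsQuery is a hypothetical helper standing in for the actual Events Service call:

    // Collect more than 5,000 analytics records by querying one time window
    // at a time and concatenating the batches.
    async function queryAllEvents(runAnalyticsQuery, adql, startMs, endMs, windowMs) {
      const all = [];
      for (let from = startMs; from < endMs; from += windowMs) {
        const to = Math.min(from + windowMs, endMs);
        const batch = await runAnalyticsQuery(adql, from, to); // <= 5,000 rows per call
        all.push(...batch);
        if (batch.length >= 5000) {
          // The window itself hit the cap; a real implementation would shrink
          // the window (or page within it) and retry before moving on.
          console.warn(`Window ${from}-${to} may be truncated.`);
        }
      }
      return all;
    }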


Once we had finished all of the functionality, we containerized the project with Docker, and it was ready for deployment. We used the Cloud Run service on Google Cloud Platform, which was very straightforward to set up with the provided documentation.
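
For a single small Node.js service, the Dockerfile stays short. A typical example; the file names and port are illustrative rather than the exact file in our repository:

    # Package one Node.js application into a container image.
    FROM node:18-slim
    WORKDIR /app
    COPY package*.json ./
    RUN npm ci --omit=dev
    COPY . .
    ENV PORT=8080
    EXPOSE 8080
    CMD ["node", "server.js"]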


Throughout development, we followed an Agile methodology, holding daily scrum meetings and joining bi-weekly team meetings. We also organized our time into several sprints to stay on schedule.


Lessons Learned

Learning #1: Async and Await

We had some basic experience with JavaScript through our college work but quickly learned that there is much more to the language than listening for events and reading inputs. Neither of us had worked with a server-side JavaScript framework before, which taught us the important distinction between the front end and the back end, as well as how they interact with each other. Once we started working with fetch and APIs, it became clear that we needed to understand the asynchronous nature of JavaScript. This led us to async and await, a concept we used numerous times throughout our work.
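
The pattern we kept returning to looks like the sketch below: fetch returns a promise, so the result must be awaited before it can be used. The health rules path follows the AppDynamics alerting API; the parameters are placeholders:

    const fetch = require('node-fetch');

    // Without async/await, `res` would be a pending Promise, not a response.
    async function getHealthRules(controllerUrl, appId, authHeader) {
      const res = await fetch(
        `${controllerUrl}/controller/alerting/rest/v1/applications/${appId}/health-rules`,
        { headers: { Authorization: authHeader } }
      );
      return res.json(); // itself a promise, resolved by the caller's await
    }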


Learning #2: HTTP authentication

Throughout the development of the utility, we had to learn how authentication works when making HTTP requests; previously we had only made requests that required no tokens or logins. Part of this learning was experimenting with Postman and trying different methods of authentication, then mimicking those experiments using node-fetch.
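
The method we settled on for user logins boils down to a single header. A sketch, assuming the user@account:password convention that the controller's REST endpoints accept:

    // Build an HTTP Basic Authorization header from controller credentials.
    // AppDynamics expects the username qualified with the account name.
    function basicAuthHeader(username, accountName, password) {
      const raw = `${username}@${accountName}:${password}`;
      return 'Basic ' + Buffer.from(raw).toString('base64');
    }

    // Usage with node-fetch (the URL is a placeholder):
    // fetch('https://controller.example.com/controller/rest/applications',
    //       { headers: { Authorization: basicAuthHeader('user', 'acct', 'pw') } });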


Learning #3: Converting curls

We used the AppDynamics API and its documentation extensively throughout the development of the utility. We needed to learn to convert the curl commands in the documentation into something more usable in JavaScript. Before this, neither of us had any experience with curl, so we had to do a large amount of reading just to understand the commands before we could work with them. Once we had learned to read them, we were able to replicate the requests using node-fetch.
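
As an example of the exercise, a documentation-style curl maps onto node-fetch fairly directly; the controller URL below is a placeholder:

    // Documentation-style curl:
    //   curl -H "Authorization: Bearer <token>" \
    //        "https://controller.example.com/controller/rest/applications?output=JSON"
    //
    // Rough node-fetch equivalent:
    const fetch = require('node-fetch');

    async function listApplications(token) {
      const res = await fetch(
        'https://controller.example.com/controller/rest/applications?output=JSON',
        { headers: { Authorization: `Bearer ${token}` } }
      );
      return res.json();
    }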


Learning #4: Deploying Docker containers to the Google Cloud Platform (GCP)

Prior to this project we had some experience with Docker containers, but not in conjunction with a cloud platform. Setting up our container was fairly straightforward, as we only had to package one Node.js application. Following the GCP Cloud Run documentation showed us that deploying a basic container is very simple, requiring only a few commands to get up and running. We also learned that GCP offers other ways to run containers, such as Kubernetes Engine and App Engine, which provide different capabilities around containers, like load balancing, and different degrees of infrastructure control.
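
Those few commands look roughly like this; the project ID, service name, and region are placeholders, and the flags follow the standard gcloud CLI:

    # Build the image with Cloud Build, then deploy it to Cloud Run.
    gcloud builds submit --tag gcr.io/PROJECT_ID/appdtools
    gcloud run deploy appdtools \
      --image gcr.io/PROJECT_ID/appdtools \
      --platform managed \
      --region us-central1 \
      --allow-unauthenticated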

In our next post, we will talk more about how to use this utility. If you are interested in checking it out, please visit our GitHub repository at https://github.com/optimizca/appdtools.