Here at Optimiz, one of our most frequently used tools is AppDynamics, a leader in the Application Performance Monitoring space. In this article, we will walk through the utility we created to support our clients and consultants as they deploy and operate AppDynamics. The utility automates routine tasks that previously had to be done by hand, saving our consultants and users valuable time and increasing productivity. It also reduces the risk of human error, which grows with the size of your configuration. Because of these factors, we believe our utility can aid further adoption of AppDynamics and streamline the process of working with it. It also provides easy access to the analytics AppDynamics generates, which can help inform business decisions.
- Export analytics data beyond the UI and REST API limits.
- Import/export dashboards between different applications/controllers.
- Migrate configurations from non-prod to prod.
- Export events to CSV for external analysis, to reduce noise and ensure alerting is effective and proactive.
- Create a CSV list of users, dashboards, or health rules.
- Very portable: written in Node.js and deployed using Docker.
- Can be hosted on-prem or in a public cloud. At Optimiz, we host it in our GCP environment.
Due to limitations in the design of the current AppDynamics UI, certain actions must be performed manually. Spending time on tedious tasks didn't sound very optimal to us, so we set out to create a utility that would assist our consultants with their day-to-day work. A few of those tasks are listed below.
- Exporting dashboards or health rules
- Importing them into a separate account/environment
- Gathering large volumes of analytics
- Creating a CSV list of users, dashboards, or health rules
With the standard UI, exporting is done one by one, and the same is true for importing into another account. Our consultants often use these features to transfer artifacts across multiple environments. To make this a more cohesive process, we created a way to export from one account and import into another with any number of artifacts in a single action. We also added functionality for creating CSV lists of the above-mentioned resources. In our use cases we found this helpful, though we recognize that not everyone will need it as a standard feature. There was also a limit of 5,000 analytics records per query, which we worked to overcome. This enables clients to store large amounts of analytics data in data warehouses such as Snowflake, where it can be correlated with other data sources to support various business decisions.
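The batched transfer described above boils down to a small loop. The sketch below illustrates the idea; `exportFn` and `importFn` are hypothetical stand-ins for calls to the source and target controllers, not the utility's actual functions:

```javascript
// Copy any number of dashboards from a source account to a target account
// in one action. exportFn and importFn are injected, so the same loop works
// against any pair of environments (both signatures are illustrative).
async function transferDashboards(ids, exportFn, importFn) {
  const transferred = [];
  for (const id of ids) {
    const dashboardJson = await exportFn(id); // pull from the source controller
    await importFn(dashboardJson);            // push into the target controller
    transferred.push(id);
  }
  return transferred;
}
```

Injecting the export/import functions also makes the loop easy to test against fakes before pointing it at real environments.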
- Integrate the utility with the AppDynamics API
- Allow users to authenticate with the API using their controller login
- Create a UI that was visually appealing and easy to use
- Make the project easily deployable
With these requirements, we began work on the utility.
We began by generating access tokens from the AppDynamics API using API clients. This required every user to have an API client created in the controller. We later found this inconvenient for users, so we switched to generating tokens with a username and password.
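The two approaches can be sketched as below. The endpoint path and parameter shapes follow the AppDynamics API-client documentation as we understood it, and all names here are illustrative; verify against your controller version before relying on them:

```javascript
// 1) API-client flow: exchange a client id/secret for a temporary access
// token via the controller's OAuth endpoint (path per AppDynamics docs).
function buildClientTokenRequest(controllerUrl, clientName, account, clientSecret) {
  return {
    url: `${controllerUrl}/controller/api/oauth/access_token`,
    options: {
      method: 'POST',
      headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
      body: new URLSearchParams({
        grant_type: 'client_credentials',
        client_id: `${clientName}@${account}`,
        client_secret: clientSecret,
      }).toString(),
    },
  };
}

// 2) Username/password flow: plain HTTP basic auth built from the
// controller login, in the form username@account:password.
function buildBasicAuthHeader(username, account, password) {
  const credentials = `${username}@${account}:${password}`;
  return `Basic ${Buffer.from(credentials).toString('base64')}`;
}
```

The second flow is what made the utility usable without asking every user to create an API client first.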
Once we were able to get working tokens, we could start pulling and manipulating data. We started simple: pulling dashboard info, users, and health rules from the API and converting the data from JSON to CSV. Once that was complete, we worked on functionality to export whole dashboards that could be imported through the AppDynamics UI.
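The JSON-to-CSV step is the kind of conversion shown below, a minimal sketch rather than the utility's actual code: flatten an array of records into a header row plus one line per record, quoting fields that contain commas, quotes, or newlines.

```javascript
// Convert an array of flat JSON objects (users, dashboards, health rules)
// into CSV text. Column order follows the keys of the first record.
function toCsv(records) {
  if (records.length === 0) return '';
  const headers = Object.keys(records[0]);
  const escape = (value) => {
    const s = String(value ?? '');
    // Quote the field and double any embedded quotes when needed.
    return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
  };
  const rows = records.map((r) => headers.map((h) => escape(r[h])).join(','));
  return [headers.join(','), ...rows].join('\n');
}
```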
Now that we had export functionality, we could begin working on a way to import the dashboards from JSON. This proved to be a difficult task, but one we were able to overcome. It was an integral feature, as it helps both our consultants and customers quickly deploy artifacts across multiple environments.
While the import/export was being developed, we were also creating a way to access AppDynamics Analytics from the utility. Instead of using curl, there is now a clean UI for entering query parameters, which also bypasses the 5,000-records-per-query limit mentioned earlier. This analytics functionality was added to the utility's own API, so the same requests can still be made with curl if preferred.
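Getting past a per-query cap comes down to paging. The exact paging mechanism depends on the Analytics API version you are on; the sketch below assumes a simple offset/limit scheme with an injected query function, not the utility's actual implementation:

```javascript
// Work around a per-query result cap by paging: keep reissuing the same
// query with a moving offset until a page comes back smaller than the page
// size. queryFn(offset, limit) is an illustrative stand-in for the API call.
async function fetchAllAnalytics(queryFn, pageSize = 5000) {
  const all = [];
  let offset = 0;
  for (;;) {
    const page = await queryFn(offset, pageSize);
    all.push(...page);
    if (page.length < pageSize) break; // short page means we reached the end
    offset += pageSize;
  }
  return all;
}
```

The full result set can then be handed off to a warehouse such as Snowflake in one batch.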
Once all of the functionality was finished, we containerized the project with Docker, and it was ready for deployment. We used the Cloud Run service on Google Cloud Platform, which was very straightforward to set up with the provided documentation.
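Containerizing a single Node.js service like this typically takes only a few lines of Dockerfile. The sketch below is a generic example, not the project's actual Dockerfile; the entry-point file name is hypothetical:

```dockerfile
FROM node:18-slim
WORKDIR /app
# Install dependencies first so they cache independently of source changes
COPY package*.json ./
RUN npm ci --omit=dev
COPY . .
# Cloud Run supplies the port to listen on via the PORT environment variable
ENV PORT=8080
EXPOSE 8080
CMD ["node", "server.js"]
```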
Throughout the utility's development, we followed the Agile methodology, holding daily scrum meetings and joining bi-weekly team meetings. We also organized our time into sprints to stay on schedule.
Learning #1 Async and Await
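The core of this pattern in Node.js is deciding which calls must wait for each other and which can run concurrently. A generic sketch (not code from the utility; all function names are illustrative):

```javascript
// Async/await sketch: the token request must finish before the data
// requests, but the two independent data requests can run concurrently.
async function loadArtifacts(getToken, getDashboards, getHealthRules) {
  const token = await getToken(); // sequential: later calls need the token
  const [dashboards, healthRules] = await Promise.all([
    getDashboards(token),         // concurrent: independent of each other
    getHealthRules(token),
  ]);
  return { dashboards, healthRules };
}
```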
Learning #2 HTTP authentication
Throughout the development of the utility, we had to learn how authentication works when making HTTP requests. Previously, we had only made requests that did not require any tokens or logins. Part of this learning was experimenting with Postman and trying different authentication methods; we then mimicked those experiments using node-fetch.
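Reproducing a working Postman request in code mostly comes down to attaching the right `Authorization` header. A minimal sketch; the endpoint path follows the AppDynamics health-rules API as we understood it, and the fetch implementation is injected (node-fetch, or Node 18+'s built-in `fetch`), which also keeps the function easy to test:

```javascript
// The same authenticated request built in Postman, reproduced in code by
// setting a Bearer Authorization header on the request.
async function getHealthRules(fetchImpl, controllerUrl, applicationId, token) {
  const res = await fetchImpl(
    `${controllerUrl}/controller/alerting/rest/v1/applications/${applicationId}/health-rules`,
    { headers: { Authorization: `Bearer ${token}` } }
  );
  if (!res.ok) throw new Error(`Health rule request failed: ${res.status}`);
  return res.json();
}
```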
Learning #3 Converting curls
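The translation involved maps curl flags onto fetch options almost one to one. An illustrative example (the URL and payload are made up for the sketch):

```javascript
// The curl command:
//   curl -X POST "https://controller.example.com/controller/api/query" \
//        -H "Authorization: Bearer TOKEN" \
//        -H "Content-Type: application/json" \
//        -d '{"query": "SELECT * FROM transactions"}'
// maps flag-for-flag onto a fetch options object:
function curlToFetchOptions(method, token, bodyObject) {
  return {
    method,                                // curl -X
    headers: {
      Authorization: `Bearer ${token}`,    // curl -H
      'Content-Type': 'application/json',  // curl -H
    },
    body: JSON.stringify(bodyObject),      // curl -d
  };
}
```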
Learning #4 Deploying Docker containers to the Google Cloud Platform (GCP)
Prior to this project, we had some experience with Docker containers, but not in conjunction with a cloud platform. Setting up our container was fairly straightforward, as we only had to package one Node.js application. Following the documentation for GCP Cloud Run taught us that deploying a basic container is very simple, requiring only a few commands to get up and running. We also learned that there are several other ways to deploy containers on GCP, such as Kubernetes Engine or App Engine, which offer different container-related functionality like load balancing and varying degrees of infrastructure control.
In our next post, we will talk more about how to use this utility. If you are interested in checking it out, please visit our GitHub repository at https://github.com/optimizca/appdtools.