We compared Dynatrace and Azure Monitor across five categories based on real PeerSpot user reviews. After reviewing all of the collected data, our conclusion appears below.
Comparison Results: Dynatrace is the better option, offering more advanced monitoring features such as real user tracking, AIOps automation, a Kubernetes module, and session replay, along with a user-friendly interface, strong AI capabilities, and easy deployment. Azure Monitor, in contrast, is easy to set up and maintain but lacks visualization and third-party integrations, and needs more out-of-the-box functionality and artificial intelligence for event correlation. Dynatrace also delivers better ROI through cost savings from automation and reduced mean time to identification and repair.
"I am monitoring all of my Azure resources and getting good reports. I can customize the reports to get the information I need. I am also getting emails about which AAS instances are down and everything in the system related to my services. It is easy to use, scalable, and user-friendly. Microsoft has many guides and videos to help you understand how to create and use Azure Monitor."
"The solution has tons of valuable features."
"A product that is well-integrated for monitoring Microsoft Azure."
"The solution works well overall. It's easy to implement and simple to use."
"Azure Monitor gives us the observability to check everything that we have in the cloud."
"It has good troubleshooting features."
"The most valuable features of Azure Monitor are the Log Analytics workspace, where we can write any kind of custom queries to retrieve the data inserted into it, the diagnostic settings, et cetera."
"Azure Monitor is really just a source for Dynatrace. It's just collecting data and monitoring the environment and the infrastructure. It is fairly good at that."
"It provides a better understanding of what is going on."
"Dynatrace alerts are based on deviations from the reference metrics which are constantly collected."
"It is a platform that is very well-suited for marketers, but also for technology people. That is the key: the dashboard."
"We can be more productive and agile. It allows us to be more accurate when we need to work with bugs."
"It provides the whole perspective in a single place when trying to guide the right people to go to the right solution at any given point in time."
"Reduced MTTR, thanks to smart problem detection and automated root cause analysis."
"It has given us one simple dashboard to monitor all of our servers and web applications."
"It helps to show where the problem is and isolates the issue."
"The biggest one is probably just the user interface. There could be more advanced logging at the database level. They can also improve their query builder to allow you to search for things better, but I last used it about a year ago. They might have already changed a ton of things in the newer versions."
"The query builder could be better. In comparison to other monitoring tools, in order to use Azure Monitor, your engineers need to have KQL experience. If they don't, the system is not intuitive."
"We cannot use AI services with the solution."
"As a younger product, it still has room for feature improvement and enhancement."
"The length of latency is terrible and needs to be improved."
"This solution could be improved with more out-of-the-box functionalities and artificial intelligence to complete event correlation."
"The price could be lower but it is not a must."
"Currently, it seems it's complicated to get the correct information in terms of what to do and how things work."
"It still has a long way to go to reach that single pane of glass."
"The flexibility when it comes to integrating with other tools is very low."
"UEM (User Experience Management) works great for web clients and Android and iOS apps, but for other rich clients it's a lot more challenging."
"We have had to resolve a lot of things and had a lot of issues with the tool."
"The problem evaluation feature is an awesome idea, but a bit difficult to pick up initially."
"They've leveraged those security gateways and renamed them ActiveGates, and now there are different web plugins we can run on it... Sometimes the development of those seems to be running very fast and it's not complete. They don't yet function quite as easily as the OneAgents do. But I have hopes that that's going to get better. We have tried the MQ, the Citrix, and the Oracle ActiveGate plugins. They could be sharper. It's the right direction to go. It just seems like it could be smoother."
"Regarding features, it would be good to have some features for app security."
"We have a couple of one page apps that it has a problem with because it doesn't call to the server all the time. I believe part of that is taken care of in the next version."
Azure Monitor is ranked 4th in Application Performance Monitoring (APM) and Observability with 44 reviews while Dynatrace is ranked 2nd in Application Performance Monitoring (APM) and Observability with 340 reviews. Azure Monitor is rated 7.6, while Dynatrace is rated 8.8. The top reviewer of Azure Monitor writes "A powerful Kusto query language but the alerting mechanism needs improvement". On the other hand, the top reviewer of Dynatrace writes "AI identifies all the components of a response-time issue or failure, hugely benefiting our triage efforts". Azure Monitor is most compared with Datadog, Sentry, Prometheus, Grafana and New Relic, whereas Dynatrace is most compared with Datadog, New Relic, AppDynamics, Splunk Enterprise Security and Elastic Observability. See our Azure Monitor vs. Dynatrace report.
See our list of best Application Performance Monitoring (APM) and Observability vendors.
We monitor all Application Performance Monitoring (APM) and Observability reviews to prevent fraudulent reviews and keep review quality high. We do not post reviews by company employees or direct competitors. We validate each review for authenticity via cross-reference with LinkedIn, and personal follow-up with the reviewer when necessary.