
Real User Monitoring

2 authors have contributed to this article: anuradhac and arvindpdmn.
Created by anuradhac on 2020-05-13 10:47:42.
Last updated by arvindpdmn on 2020-07-05 09:38:41.

Summary

Real User Monitoring. Source: eGInnovations 2020.

Teams behind websites and mobile applications constantly look to improve the end user’s experience with their product, be it stability, performance or usability. By passively observing every user’s interaction with the application, product teams gain deep insights into the user experience and can take corrective action. This passive monitoring process is called Real User Monitoring (RUM) or End User Monitoring.

Traditional methods of Application Performance Management focus on downtime or response time. These measure from the product's perspective rather than the end user's. RUM gives IT managers a transparent, real-time view of every single user transaction, so they can pre-emptively detect and resolve user issues.

RUM tools typically insert small bits of JavaScript code into the client application for continuous data collection. Events such as DNS resolution, TCP connect time, SSL encryption negotiation, first-byte transmission, navigation display, page render time, TCP out-of-order segments, and user think-time are monitored.
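As a minimal sketch, several of the metrics above can be derived from the browser's Navigation Timing data. In a real browser the entry would come from `performance.getEntriesByType('navigation')[0]`; the sample object here is a hand-built stand-in with the same field names, since the exact numbers depend on the actual page load.

```javascript
// Derive common RUM metrics from a Navigation Timing entry.
// In a browser, the entry would come from:
//   performance.getEntriesByType('navigation')[0]
// Here `sample` is a hand-built stand-in with the same field names.
function rumMetrics(entry) {
  return {
    dnsMs: entry.domainLookupEnd - entry.domainLookupStart,  // DNS resolution
    tcpMs: entry.connectEnd - entry.connectStart,            // TCP connect time
    ttfbMs: entry.responseStart - entry.requestStart,        // first-byte transmission
    renderMs: entry.domComplete - entry.responseEnd          // page render time
  };
}

const sample = {
  domainLookupStart: 10, domainLookupEnd: 30,
  connectStart: 30, connectEnd: 75,
  requestStart: 80, responseStart: 200,
  responseEnd: 260, domComplete: 900
};
console.log(rumMetrics(sample));
// → { dnsMs: 20, tcpMs: 45, ttfbMs: 120, renderMs: 640 }
```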

Milestones

2005

Google launches its Web Analytics solution that tracks and reports website traffic.

2006

Dynatrace, Sematext and other software intelligence companies introduce their Application Performance Management products, of which RUM is a part. Over the next few years, these tools employ Big Data and cloud-based services to help internet companies monitor their user traffic.

2008

ISO 13407, establishing human-centred design processes for interactive systems, dates back to 1999. It is later revised as ISO 9241-210, human-centred design for interactive systems. The goal is to establish standards and metrics to measure user experience and engagement with web applications.

2016

Gartner defines end-user experience in three dimensions - (a) End User Experience Monitoring (RUM); (b) Application discovery, tracing and diagnostics (ADTD); (c) Application analytics (AA).

Feb
2017
RUM data helps identify bottlenecks in a web application. Source: Veeravalli 2017.

In a blog post, a LinkedIn engineer describes how they used RUM to improve the performance of a Single-Page Application (SPA). RUM data helped them identify bottlenecks. Traditional RUM libraries rely on the Navigation Timing API and Resource Timing API, but these fail to capture JavaScript execution times. To address this, they use the User Timing API. To improve performance, they lazy load data and do lazy rendering. Their web application becomes about 20% faster.
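The User Timing API approach mentioned above can be sketched as follows. The mark names and the busy loop are illustrative, not LinkedIn's actual instrumentation; the API itself is standard and also available in Node.js 16+, where `performance` is global.

```javascript
// Instrument a code path with the User Timing API (performance.mark/measure).
// Works in browsers and in Node.js 16+, where `performance` is global.
performance.mark('render-start');

let total = 0;                        // stand-in for real rendering work
for (let i = 0; i < 1e6; i++) total += i;

performance.mark('render-end');
performance.measure('render', 'render-start', 'render-end');

// A RUM library would beacon this duration back to its collector.
const [m] = performance.getEntriesByName('render');
console.log(`render took ${m.duration.toFixed(2)} ms`);
```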

Discussion

  • What are the key defining features of RUM?
    Data Visualization in RUM. Source: Hadžić 2020.
    • User Journey Mapping - Study of user interactions with the product at various stages of familiarity with it (user lifecycle). Intuitive analysis to understand each user action and the motivation behind it. Trace user movement within the product – what they like to do, where they falter, what features they avoid, why they exit.
    • Real-time Alerts - Using AI-based analysis, problems detected in the user session are compiled into actionable notification alerts, even while the session is in progress. Example alert: “Error rate has been up for over 10 hours on the check-out page of an e-commerce app”.
    • Individual User Session Monitoring - Collection of chronological events for a particular end user, from the start of a web session to a configurable period of inactivity. During each user session, pages/components visited and timing information are collected.
    • Data Visualization and Analysis - RUM collects data points from a high volume of users over a wide range of metrics. Visualizations such as bar graphs, time series charts and area graphs make these large volumes of data human-readable. Actionable insights can be derived from the data.
  • What are the objectives of performing RUM?

    RUM is generally used by product managers and owners to discover key bottlenecks to meeting their business objectives – ROI, trial to subscription conversion rates, app downloads/uninstall rates and so on. By closely observing how users navigate their way through the app/website, it is possible to unearth the specific reasons for poor user engagement.

    RUM helps meet the following objectives:

    • Debug a user’s navigation path in case of failed operations to surface hidden problems.
    • Identify why a user experiences slow page load times: whether the browser, network, server, or content download is taking more processing time.
    • Real-time measurement of key targets by tracking actual visits and delivering top-level data on actual use cases.
    • Collect UX metrics such as NPS (Net Promoter Score), Usability Scale, App ratings, Time on Task, Click to Action.
    • User segmentation and testing based on actual operating system, browser, user location, language, device or network settings. Helps in troubleshooting deployment specific issues.
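The segmentation objective above can be sketched by grouping collected session records on an environment attribute and computing per-segment error rates. The record shape is illustrative, not any particular tool's schema.

```javascript
// Group RUM session records by an attribute (e.g. browser) and compute
// each segment's error rate. Record fields are illustrative.
function errorRateBySegment(sessions, key) {
  const segments = {};
  for (const s of sessions) {
    const k = s[key];
    segments[k] = segments[k] || { total: 0, errors: 0 };
    segments[k].total += 1;
    if (s.hadError) segments[k].errors += 1;
  }
  for (const k of Object.keys(segments)) {
    segments[k].errorRate = segments[k].errors / segments[k].total;
  }
  return segments;
}

const sessions = [
  { browser: 'Chrome', hadError: false },
  { browser: 'Chrome', hadError: true },
  { browser: 'Safari', hadError: false },
  { browser: 'Chrome', hadError: false },
];
console.log(errorRateBySegment(sessions, 'browser'));
// Chrome: 1 error in 3 sessions; Safari: 0 errors in 1 session
```

A spike in one segment's error rate (say, only on a particular browser) points directly at a deployment-specific issue.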
  • How is RUM different from web analytics mechanisms such as Google Analytics?

    RUM and analytics tools like Google Analytics serve different purposes. Analytics tools collect high-level performance data from the product's perspective. They do not focus on individual user experience at all. These tools aggregate trends captured across web sessions and present them as performance statistics.

    For example, they collect traffic data across the span of a single day, a week or a month. For high-traffic scenarios, analytics tools use sampled traffic data for reporting purposes.

    The main purpose of Google Analytics is to collect and analyse web data for SEO, lead generation and conversion tracking, from a marketing perspective. RUM instead focuses more on the actual root causes of performance issues that users may experience.

    With analytics, it is impossible to debug issues faced by an individual user, or to verify feature implementation across product versions. RUM enables monitoring user sessions across deployed devices/browsers, or tracing unhandled exceptions in a particular crash scenario. RUM captures 100% of webpage hits, thus giving a very clear and close view of the user–product relationship.

    Analytics is meant to be a post-mortem of web activity, whereas RUM is about real-time analysis and rectification.

  • What is synthetic monitoring and how does it compare with RUM?

    Synthetic monitoring is the process of analysing a website's user experience and performance without actual users, by simulating them instead. It is especially effective in the pre-deployment phase, during in-house testing before going live. It's also essential for load testing of high-traffic websites.

    Synthetic monitoring can answer queries such as:

    • Is my website up and running?
    • Are response times acceptable, especially with high load?
    • Are all transactions working?
    • If there is a slowdown or failure, where is it in the infrastructure?
    • Are my third-party components operating correctly?
    • How is the performance versus cost?
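A synthetic check differs from RUM in that the "user" is a script. A minimal sketch of such a probe follows; the check function is injected, so any simulated transaction (an HTTP request, a scripted login) can be timed against a response-time budget. The stubbed check below stands in for a real network call.

```javascript
// Minimal synthetic-monitoring probe: run a simulated transaction,
// time it, and report up/down plus whether it met a latency budget.
async function probe(check, budgetMs) {
  const start = Date.now();
  try {
    await check();  // e.g. fetch a page or run a scripted transaction
    const elapsedMs = Date.now() - start;
    return { up: true, elapsedMs, withinBudget: elapsedMs <= budgetMs };
  } catch (err) {
    return { up: false, elapsedMs: Date.now() - start, error: String(err) };
  }
}

// Usage with a stub standing in for a real HTTP request:
probe(() => new Promise(res => setTimeout(res, 20)), 500)
  .then(r => console.log(r));
```

Running such probes on a schedule, from several geographic locations, is how synthetic tools answer the availability and response-time questions listed above.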

    Unlike RUM, synthetic monitoring cannot be used to find out what the real user is experiencing. In real deployment scenarios, synthetic test cases may all pass, but a real user might still face transaction outages or speed issues. These can be identified only using RUM.

    It's almost impossible to simulate practical failure cases such as those related to geographical location, specific user device settings, or unexpected network outages.

    The general practice is to run synthetic monitoring test cases before going live, and real user monitoring after live deployment, at regular intervals. The two methods complement each other.

  • How is an RUM solution deployed?
    RUM - Full Stack Deployment and Analytics. Source: Dynatrace 2017.

    RUM tool usage is divided into two phases – Deployment and Execution.

    During deployment, two steps are performed:

    • Integrate the RUM JavaScript code with client-side product code
    • Install the RUM agent on the target server on all deployed systems. It discovers all technologies and applications running there (JS, PHP, IIS, .NET)

    RUM tool deployment requires full-stack development skills.

    Now the tool is ready for usage. Its execution is navigated from the tool dashboard. User-specific access level permissions control data access.
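The first deployment step above, embedding the RUM JavaScript into client-side code, typically amounts to an async script loader so the agent doesn't block page load. A hypothetical sketch (the agent URL is made up; each vendor supplies its own snippet):

```javascript
// Inject a RUM agent script asynchronously so it doesn't block page load.
// The agent URL is hypothetical; each vendor supplies its own snippet.
function loadRumAgent(doc, src) {
  const s = doc.createElement('script');
  s.async = true;              // don't block HTML parsing
  s.src = src;
  doc.head.appendChild(s);
  return s;
}

// In a browser:
//   loadRumAgent(document, 'https://rum.example.com/agent.js');
```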

  • What is the typical execution process flow for RUM?
    • Data Collection - Captures data about requests for pages, JSON, and other resources from browser to web servers, even when the requested content is hosted on third-party sites.
    • User Session Analysis - Captured data is regrouped into a session-wise record of pages, components and timing information. The user journey is analysed for registered/anonymous and new/recurring users. JavaScript errors are checked, whether show stoppers or mild irritants.
    • User Performance Analysis - Various response times as perceived by the user are studied at the front/back end. Metrics are optimised as per user bandwidth availability.
    • User Behaviour Analysis - Frequent user activities, entry/exit actions, landing pages, last page viewed before exit, bounce rate, and users leaving the app after a first visit are studied.
    • Problem Detection - Undesirable behaviour such as slow response times, system problems, page load errors, transaction failures and web navigation errors is analysed for different pages, objects and sessions.
    • AI-powered Root Cause Analysis - DevOps teams perform RCA and map user issues to the underlying root cause. The analytics model understands the context of each metric.
    • Reporting and Notification - Various data visualization formats and problem reports can be accessed. Notification alerts can be configured to be sent when data points cross configured thresholds.
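The notification step above can be sketched as a simple threshold check over collected metrics. Metric names and thresholds are illustrative, not any tool's actual configuration schema.

```javascript
// Emit alert messages for every metric that crosses its configured threshold.
function checkThresholds(metrics, thresholds) {
  const alerts = [];
  for (const [name, limit] of Object.entries(thresholds)) {
    if (metrics[name] !== undefined && metrics[name] > limit) {
      alerts.push(`${name} is ${metrics[name]}, above threshold ${limit}`);
    }
  }
  return alerts;
}

const alerts = checkThresholds(
  { errorRatePct: 7.5, pageLoadMs: 1800 },   // latest aggregated RUM data
  { errorRatePct: 5, pageLoadMs: 3000 }      // configured alert thresholds
);
console.log(alerts);
// → [ 'errorRatePct is 7.5, above threshold 5' ]
```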
  • What are the popular RUM tools used?
    RUM Dashboard. Source: GeekFlare 2020.

    RUM tools are testing and analysis platforms that enable product teams to study whether applications are working as per their intended design.

    RUM tools are generally deployed within the product environment by embedding JavaScript snippets into product code. They are not plug-and-play tools; they require code-level integration with the product. Most tools support React, AJAX and AngularJS web applications. For RUM on mobile apps, code is embedded into the Android/iOS app source code.

    The RUM tool dashboard can then be used to monitor user sessions in real time and observe key metrics such as page rendering time, environment-specific performance, error rates, and bounce rates. Alert notifications can be set for key touch-points.

    Some tools also support Single-Page Application monitoring, where the web app dynamically rewrites the current page's content instead of loading a new page for every operation. Such apps mostly use AJAX requests to pull content dynamically and create a fluid user experience.

    Some of the popular RUM tools include Sematext, Monitis from TeamViewer, SmartBear, Dynatrace, SOASTA mPulse from Akamai, Raygun, AppDynamics from Cisco, and CXOptimize by Cognizant (Open Source).

  • What are the limitations and concerns in using RUM?

    RUM is useful in several scenarios, but has some limitations. There are also concerns of performance overhead and security. Some are listed below:

    • Not effective with limited traffic – Before product launch or just after launch, websites/apps don’t generate much traffic. RUM would not detect any meaningful issues at this stage. It works best when there is steady traffic volume.
    • Generates too much data - Ironically, during heavy usage, large volumes of data are collected. Small companies won't have sufficient DevOps resources to analyse all of it.
    • Trial and error visible to users - Synthetic monitoring is done in simulated environments without involving actual users, whereas a RUM tool runs in the actual production environment. If issues are found regularly, users would notice frequent product updates and the product may appear immature.
    • Exposure risk at code level - Product teams must be careful before integrating RUM script code into product source code. Critical product functionality should not get exposed to external threats.
    • Not useful for benchmarking - RUM data is inherently random because it depends entirely on user activity which is unpredictable. So fixed interval monitoring of data may not yield accurate benchmarking results.
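One common mitigation for the data-volume concern above is to sample sessions rather than record all of them (at the cost of the 100%-capture guarantee). A sketch, using a deterministic hash of the session ID so a given session is consistently in or out of the sample:

```javascript
// Decide whether to record a session. Hashing the session ID (rather than
// rolling a random number per page) keeps the decision stable across all
// pages of the same session. The hash is a simple 32-bit rolling hash.
function shouldSample(sessionId, sampleRatePct) {
  let h = 0;
  for (const ch of sessionId) {
    h = (h * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return (h % 100) < sampleRatePct;
}

// The same session always gets the same decision:
console.log(shouldSample('session-42', 10) === shouldSample('session-42', 10));
// → true
```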

References

  1. Akamai. 2020. "Real User Monitoring." Accessed 2020-05-15.
  2. Atlassian. 2020. "Browser RUM Sessions." AppDynamics, Cisco. Accessed 2020-05-15.
  3. DataDog. 2020. "RUM Analytics." Accessed 2020-05-15.
  4. DataDog. 2020b. "Actionable alerting." Accessed 2020-05-15.
  5. Dynatrace. 2017. "30 min Performance Demo : Real User Monitoring." April 19. Accessed 2020-05-15.
  6. Geekflare. 2020. "9 Best Real User Monitoring Tools to Improve User Experience." May 24. Accessed 2020-05-15.
  7. Gibbons, Sara. 2018. "Journey Mapping 101." December 9. Accessed 2020-05-15.
  8. Groske, Nate O. 2020. "Real User Monitoring (RUM) vs. Synthetic Monitoring Comparison." Stackify, November 20. Accessed 2020-05-15.
  9. Hadžić, Amir. 2020. "8 of The Best Real User Monitoring Tools and How to Choose One." May 21. Accessed 2020-05-15.
  10. Nick. 2017. "How Real User Monitoring differs from Google Analytics." Raygun, January 12. Accessed 2020-05-15.
  11. Reynolds, Justin. 2019. "The 7 best User Experience Monitoring tools for 2020." Raygun, February 14. Accessed 2020-05-15.
  12. Singer, Ryan. 2020. "6 key UX metrics to focus on." UX for the Masses, January 24. Accessed 2020-05-15.
  13. SmartBear. 2020. "What is Real-User Monitoring." Smart Bear Software. Accessed 2020-05-15.
  14. SmartBear. 2020b. "Synthetic vs Real User Monitoring." SmartBear Software. Accessed 2020-05-15.
  15. Stackify. 2020. "What Is Real User Monitoring? How It Works, Examples, Best Practices, and More." Accessed 2020-05-15.
  16. Veeravalli, Sreedhar. 2017. "Measuring and Optimizing Performance of Single-Page Applications (SPA) Using RUM." Blog, LinkedIn Engineering, February 2. Accessed 2020-07-05.
  17. Walker, Jeffry. 2019. "6 Ways Real User Monitoring Differs From Google Analytics." March 21. Accessed 2020-05-15.
  18. Wikipedia. 2020. "Application performance management." Accessed 2020-05-15.
  19. Wikipedia. 2020b. "ISO 9241." Accessed 2020-05-15.
  20. eGInnovations. 2020. "Monitoring Digital User Experience." eG Innovations, Inc. Accessed 2020-05-15.

See Also

  • User Journey Mapping
  • JavaScript
  • Application Performance Management
  • UX Design
  • Web Analytics

Further Reading

  1. Atlassian. 2020. "Browser RUM Sessions." AppDynamics, Cisco. Accessed 2020-05-15.
  2. DataDog. 2020. "RUM Analytics." Accessed 2020-05-15.
  3. Stackify. 2020. "What Is Real User Monitoring? How It Works, Examples, Best Practices, and More." Accessed 2020-05-15.


Cite As

Devopedia. 2020. "Real User Monitoring." Version 5, July 5. Accessed 2020-07-07. https://devopedia.org/real-user-monitoring