Why Uptime.com Chose Apdex as a Performance Monitoring Standard
Early Twitter was an adventure. Every day was an open question: would you be able to log in, or would the next big story crash the platform? It was taking off and crashing and flying and crashing again, all in real time. It was an exciting time for the internet, and while everything has changed since then, it got us thinking: why did we tolerate stuff just not working?
And why do we still tolerate stuff not working?
That’s part of the central question the Apdex standard set out to answer, and why it was an important metric for us to adopt as we looked toward improving our Real User Monitoring (RUM). Not only why we tolerate poor performance, but at what point it becomes unacceptable.
With Apdex, we can identify the moment the user decides they have had enough, so you can make sure that they rarely (if ever) get to that point in the future.
Why Apdex as a Performance Monitoring Standard
The interaction between humans and computers fundamentally comes down to response. Humans expect computers to react quickly to the inputs received, or at the very least to appear as though some kind of action is being taken.
When you add an item to your cart, you expect it will be there immediately, and there is ambiguity when that doesn’t happen. We’ve all experienced that one extra item making its way into our basket.
Failing at this very basic interaction can break trust and cause the kind of annoyance that makes a user bounce.
An application’s processing speed is just as important to the end user as its load time and uptime. When revenue is on the line, tracking response time becomes paramount to building a robust and trustworthy offering.
Apdex, or Application Performance Index, offers a standard to encapsulate all of those measurements we use to determine performance. The technical metrics involved in measuring response time ultimately boil down to the question: how satisfied is your end user?
How Does Apdex Measure Performance?
Apdex is intended to examine the different types of users that interact with a site – for example those who are browsing versus those who use and rely on the application. To that end, our RUM check allows you to monitor any URL on your site, from your marketing pages to the application itself. Our Apdex score will tell you how well each URL you are measuring performs based on this established standard.
Using Apdex, our reporting makes the significance of response time feel intuitive. Easily correlate dips in performance with increases in bounce or error rates. With Apdex, and our proprietary measuring standard Time to Interactive (TTI), you can be sure you’re getting the clearest picture of your end user’s actual experience with your site.
What do these thresholds really mean?
Let’s get a closer look at these thresholds and what they are measuring in terms of the actual user experience.
When response times fall in the Satisfied range, the user is able to interact with your site and concentrate fully. We use the Apdex standard of 4 seconds to define Satisfied, but this threshold is completely adjustable. Adjusting it does affect the calculation of the other Apdex values, because the Tolerating ceiling is defined as four times the Satisfied threshold: raise or lower that target time and the Tolerating and Frustrated boundaries shift with it.
Tolerating a website’s performance is more common than you might think. How many times have you tried to access an application you need for work, only to find performance dragging? It slows your productivity, but you stick around long enough to use the application.
That’s the definition of Tolerating: a zone where the user is thinking, “this is annoying, but I need this resource.” It’s not an ideal place to be.
If Tolerating is annoying, Frustrated is picking up your computer and throwing it out the window. At this point the user is thinking about abandoning your site, if they haven’t already. Casual users are gone and may never come back, while dedicated users are starting to consider alternatives to your service.
Each of these stages also has thresholds. A mildly annoyed user is not the same as a user nearing Hulk levels of rage. With Page Load Time Distribution, you can see that spread and make the fine distinctions between “I wish things were better” and “we need to be doing better.”
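The Apdex standard turns these zones into a single number using one target time T: responses at or under T count as Satisfied, responses up to 4T count as Tolerating (at half weight), and anything slower counts as Frustrated (zero weight). The score is (satisfied + tolerating / 2) / total samples, yielding a value between 0 and 1. Here is a minimal sketch in Python; the 4-second default matches the standard mentioned above, and the sample times are purely illustrative:

```python
def apdex(response_times, t=4.0):
    """Compute an Apdex score from response times in seconds.

    Satisfied:  time <= t          (counts fully)
    Tolerating: t < time <= 4 * t  (counts half)
    Frustrated: time > 4 * t       (counts zero)
    """
    if not response_times:
        raise ValueError("need at least one sample")
    satisfied = sum(1 for rt in response_times if rt <= t)
    tolerating = sum(1 for rt in response_times if t < rt <= 4 * t)
    return (satisfied + tolerating / 2) / len(response_times)

# Illustrative samples: three fast loads, two sluggish ones, one disaster.
samples = [1.2, 2.5, 3.8, 6.0, 9.5, 20.0]
print(round(apdex(samples), 2))  # → 0.67
```

With these samples, three Satisfied plus half credit for two Tolerating gives (3 + 1) / 6 ≈ 0.67, which the standard classifies as a fair-to-poor experience. Tightening t to match your own Satisfied threshold changes the score accordingly.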
How RUM Helps
What if you had the opportunity to sit down with all of your frustrated users and explain how dips in performance were related to X or Y failure in infrastructure? How would you handle an opportunity to make things right?
This is where RUM excels. When you use RUM, you essentially get feedback from those individual users. Each one of them tells you where they are on that spectrum from Satisfied to Frustrated, reporting the errors and problems they encountered along their pathway.
You already run monitoring to tell you whether your services are UP or DOWN. Take the plunge into experience monitoring and learn more about how to optimize and improve your services as you grow and deploy new ones.
RUM with Uptime.com and Apdex
Performance monitoring is all about measuring response time and speed, and there is no data more valuable than user-generated data. Uptime.com offers a simple, low-impact method of gathering a number of need-to-know data points measuring user experience.
Results are broken down by browser, URL, geographic location, operating system, and device. Color-coded reporting visualizes the user experience compared against a baseline. Whether you are meeting us for the first time, or a faithful reader and user, try RUM. Simple to set up and powered by real user data, RUM delivers vibrant and thorough reporting on your user experience.
Minute-by-minute Uptime checks.
Start your 21-day free trial with no credit card required at Uptime.com.