Why averages are not enough to report visibility for remote workers
Imagine you’re the CIO of a decent-sized multinational corporation. For the last few weeks you’ve been busier than ever, getting your entire business set up for home-working at record speed.
It’s been going well, but you’ve just been invited into a video conference with the rest of the C-Suite and it seems that they’re not so happy.
“The video link for the new home-based teams is unacceptable,” the CMO complains. “The picture is fuzzy, the voice quality cuts in and out and every now and then there’s a weird echo… We can barely get anything done. It’s holding up the Q2 campaign and it needs to be fixed.”
You point to your monthly SLA report from the network operations team, laid out with user-friendly graphics and accessible numbers. “The average load on our VPN is only 60%,” you argue. “That’s well within the acceptable limits.”
And therein lies the problem.
Every business consumes, produces, stores and uses data. The stats are mind-boggling. According to IBM, there are currently in excess of 2.7 zettabytes of data in the digital universe, and we’re creating an additional 35 zettabytes every year.
This data offers us huge potential across all aspects of our business. But it needs to be sifted and sorted, analysed and understood. Too much raw data and we’re overwhelmed. But the alternative - reports compiled from averages that don’t paint the whole picture - means we can miss the subtleties of what’s really happening.
This is particularly true when it comes to assessing the human experience of home workers.
The problem with averages
The problem with averages is that they’re just that: averages. They simplify data, flattening out the peaks and troughs inherent in most samples to give a middle-ground view that doesn’t always reflect the realities of human experience.
In this case, the marketing department is getting frustrated because although the averages summarised in the CIO’s data report say that the load on the VPN is around 60%, in reality it’s swinging wildly between 40% and 80%.
The home-working marketers may not know what these numbers mean but they can certainly feel the frustration rising as the quality of their video call is affected.
The point is that by using averages, you hide the true nature of the fluctuations in a system and can miss the most extreme cases.
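To make this concrete, here’s a minimal sketch. The load samples are invented for illustration, but they show how a reassuring 60% average can coexist with exactly the swings between 40% and 80% described above:

```python
import statistics

# Hypothetical VPN load samples (%), taken every few minutes.
# The values swing between roughly 40% and 80%.
samples = [42, 78, 55, 80, 41, 63, 79, 44, 60, 58]

mean = statistics.mean(samples)     # the number in the SLA report
peak = max(samples)                 # the load users actually feel
spread = statistics.pstdev(samples) # how wildly the load swings

print(f"average load: {mean:.0f}%")   # prints 60% - looks fine in isolation
print(f"peak load:    {peak}%")       # prints 80% - the hidden problem
print(f"std dev:      {spread:.1f}%") # large spread = unstable experience
```

The average alone says everything is fine; only the peak and the spread reveal why the video calls are suffering.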
As a result, IT professionals are unaware of the true nature of the problem that they’re facing - in this case a dip in productivity that might eventually lead to wider HR problems such as low morale, disengagement and increased staff turnover.
Averages and complex data
One simple answer for the CIO could be to scrap averages and instead look at all the data available from the digital supply chain. Only it isn’t really a simple answer, because this bottom-up approach would leave you with an awful lot of system performance data to process.
In reality, the quality of a video call is down to a range of factors including latency, jitter, packet loss and so on. And the effects can range from voice delay to echoes and feedback, fuzzy image quality or one person’s voice cutting out when someone else starts speaking.
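As a simple illustration of one of these factors, the sketch below estimates jitter from hypothetical packet arrival times. A real monitoring tool would use something like the RTP interarrival jitter calculation from RFC 3550; this is just the underlying idea:

```python
# Illustrative only: packets are sent every 20 ms, and jitter is the
# variation in how far apart they actually arrive. Timestamps (ms) are
# hypothetical.
arrival_times = [0.0, 21.0, 39.0, 62.0, 80.0, 105.0]

# Gaps between consecutive arrivals.
gaps = [b - a for a, b in zip(arrival_times, arrival_times[1:])]

# Average deviation from the expected 20 ms interval.
jitter = sum(abs(g - 20.0) for g in gaps) / len(gaps)

print(f"mean jitter: {jitter:.1f} ms")
```

Even a few milliseconds of jitter like this, sustained over a call, is enough to cause the choppy audio the marketing team is complaining about.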
The more complex the system, the more dangerous reporting averages can be, hiding variability and giving no real indication of how the users of the various systems are experiencing them. But at the other end of the spectrum you risk creating a data lake that needs a dedicated data scientist to wade through in order to gather actionable insights.
The time, effort and cost of manually looking through system performance data to resolve issues for every home worker is too great a challenge. For hundreds or thousands of home workers, quickly identifying and resolving issues across a complex system becomes impossible.
Measuring and reporting human experience
So how can our CIO actually measure and use human experience? If we can’t use averages and we can’t overload the IT team with endless amounts of raw data, is there another way?
One answer would be to actually ask the users themselves. We’ve all seen applications that invite us to rate our experience after ending a video call, or been interrupted by an annoying pop-up while browsing a website.
There is value in this but it can be difficult to collect the data - people don’t always want to fill out even the simplest of surveys, and one person’s 8/10 is very different to another’s.
Instead the CIO and his teams need tools that can collect the right data points from across the digital supply chain and make application-specific experience calculations to provide accurate and contextualised correlation analysis. They need to understand what’s really happening beyond the technical data, to see where in the supply chain the problem lies, which individual home worker is being affected and how to fix it.
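As a rough sketch of the kind of cause-and-effect correlation described here - the packet-loss figures and experience scores below are entirely hypothetical - a simple Pearson correlation already shows how a network metric can be tied to what users feel:

```python
# Hypothetical data: per-call packet loss (%) alongside a user
# experience score for the same call (1 = unusable, 10 = perfect).
loss  = [0.1, 0.5, 1.2, 2.0, 3.5, 5.0]
score = [9,   9,   7,   6,   4,   2]

# Pearson correlation coefficient, computed by hand for clarity.
n = len(loss)
mean_x = sum(loss) / n
mean_y = sum(score) / n
cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(loss, score)) / n
std_x = (sum((x - mean_x) ** 2 for x in loss) / n) ** 0.5
std_y = (sum((y - mean_y) ** 2 for y in score) / n) ** 0.5
r = cov / (std_x * std_y)

# A value near -1 means: as packet loss rises, experience falls.
print(f"correlation between packet loss and experience: {r:.2f}")
```

In practice the analytics has to do this across many metrics, applications and users at once, which is exactly why it can’t be done by hand.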
This is where our analytics does the hard work correlating network and application cause with human experience effect. Analytics provides the speed and scale necessary, taking the CIO straight to the answers and providing actionable insight.
Our remote working improvement process tells you what needs to be done to improve the experience for the individual home worker (WiFi, ISP, Internet) and across the business (application, VPN, Data Centers).
And now we have a happy and effective CIO
For our CIO, an actionable report like this would make his experience in that C-Suite meeting very different. In this new scenario, he presents the findings of a home worker improvement process.
“We’ve been monitoring the human experience of employees using the VPN”, he says. “What that shows us is that there is some variability affecting video conferences. We have already scheduled an upgrade to prevent any problems.”
The rest of the C-Suite are impressed. They hadn’t even noticed anything was wrong yet.