My research interests centre on the theory and practice of resource sharing under uncertainty in networked systems. I am particularly interested in exploring the complex behaviours of shared network and computer infrastructures.
Back in 2008, I had the research evidence, based on mathematical analysis and large-scale computer simulations, to demonstrate the value of viewing digital infrastructures through the lens of the end user. This had the potential to become the digital Voice of the Customer, a means of describing the output of the digital supply chain.
To put this idea into practice would require the development of an easy-to-deploy measurement service. The goal was friction-free adoption and deployment. Ideally, it would need minimal integration, have as little impact as possible on the digital infrastructure it was measuring, and not invade anyone's privacy. This represented a serious challenge.
The response to that challenge started almost immediately and led to the creation of what we now call the Digital User (DU).
The DU development was, and continues to be, based on three key measurement principles:
Firstly, it must use standard Internet protocols (such as HTTP, TCP and ICMP).
Secondly, it continuously, but gently, exercises the infrastructure.
Finally, it does this from the "outside-in": from user devices measuring all the way through to where digital content is supplied.
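To make the first and third principles concrete, here is a minimal Python sketch of an outside-in active probe built purely on standard protocols: a TCP connect followed by an HTTP request, timed from the user's side of the network. This is an illustration only, written for this article; it is not the DU's actual code, and the metric names are our own invention here.

```python
import socket
import time

def probe_http(host, port=80, timeout=5.0):
    """One outside-in active measurement using only standard protocols:
    TCP connect time plus HTTP time-to-first-byte to a content endpoint."""
    t0 = time.monotonic()
    sock = socket.create_connection((host, port), timeout=timeout)
    tcp_connect_ms = (time.monotonic() - t0) * 1000.0
    try:
        # A plain HTTP/1.1 request, exactly as a browser might issue it.
        request = f"GET / HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
        t1 = time.monotonic()
        sock.sendall(request.encode("ascii"))
        first_chunk = sock.recv(4096)  # block until the first response bytes
        first_byte_ms = (time.monotonic() - t1) * 1000.0
    finally:
        sock.close()
    return {
        "host": host,
        "tcp_connect_ms": round(tcp_connect_ms, 2),
        "http_first_byte_ms": round(first_byte_ms, 2),
        "responded": first_chunk.startswith(b"HTTP/"),
    }
```

Because the probe speaks the same protocols as ordinary applications, it measures what a real user's traffic would actually experience on that path.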
We also adopted a "separation of responsibilities" architectural principle. The DU takes measurements while a centralised service, our Analytics Cloud, crunches those measurements into user experience analytics.
Integration introduces friction. For some business models, integration is a source of professional services revenue – for our technological innovation it is unnecessary and would slow down deployment. From the beginning, we designed the technology to be analytics-as-a-service, a subscription business model where the customer pays for the analytics, the actionable data. There is no charge for the DUs themselves, nor for the number of staff accessing the results.
Reducing friction is also why the DU uses standard protocols and why the DU is responsible for contacting the Analytics Cloud (using https to receive instructions and to upload its measurements). There's no need for special firewall rules because, from a protocol perspective, the DU behaves in a very similar way to an application on a client device.
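The outbound-only pattern described above can be sketched in a few lines of Python using only the standard library. The endpoint URL, paths, and JSON shapes below are hypothetical stand-ins invented for this sketch; the point is simply that the DU always initiates the https connection, so it needs no inbound firewall rules.

```python
import json
import urllib.request

BASE = "https://analytics.example.com"  # hypothetical Analytics Cloud endpoint

def fetch_instructions(du_id, opener=urllib.request.urlopen):
    """Outbound-only: the DU initiates the https request, so from the
    network's point of view it looks like ordinary client traffic."""
    with opener(f"{BASE}/du/{du_id}/instructions") as resp:
        return json.loads(resp.read())

def upload_measurements(du_id, samples, opener=urllib.request.urlopen):
    """Upload a batch of measurements over the same outbound channel."""
    body = json.dumps({"du": du_id, "samples": samples}).encode("utf-8")
    req = urllib.request.Request(
        f"{BASE}/du/{du_id}/measurements",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with opener(req) as resp:
        return resp.status == 200
```

The `opener` parameter is injected only so the sketch can be exercised without a live service; a real agent would call the default `urllib.request.urlopen` directly.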
Even if the DU is deployed on a client device (it runs on all major operating systems) it only takes active measurements. It never looks at what the human user is doing on that device, and hence avoids the privacy issues associated with passive measurement techniques.
Where DU behaviour differs from an application on a client device is that it measures continuously but in a gentle way. This builds a picture of user experience 24/7, whether or not a human is using the device. Our need for continuous measurements ruled out using passive techniques because people don't use their devices 24/7 (well not if you want to sleep as well!).
So that gives us near-real-time digital quality, at the points where users experience that digital quality. And being continuously available, it captures the evidence of those difficult-to-locate, ephemeral behaviours that generate variability, the enemy of quality.
One privacy issue emerged that we weren't expecting. By measuring continuously, we could work out when a device is on during the working day. If the DU is on the desktop or laptop of a member of staff, that ability can be perceived as an invasion of privacy in some countries.
The solution is to put DUs on devices that are on 24/7 and connected to LANs where staff computers are located. This maintains the all-important sampling of digital quality where staff experience it. It also helped us confirm an important application of our service – the ability to evaluate digital experience prior to going live for an application.
By locating DUs on 24/7 devices in a variety of places, it is possible to evaluate a new digital product or service or an existing application that is to be upgraded and establish a digital quality score benchmark.
This helps build confidence ahead of new rollouts. One of our customers realised they would struggle to deliver a new business-critical service with consistent digital quality using just a single global platform – knowledge that was critical to have before go-live.
The customer was able to use our analytics to establish the optimal number of platforms to deliver acceptable levels of digital quality. They had certainty that their investment in upgrading their services would improve their business.
In some respects, the DU appears to be a dumb piece of software – it simply receives instructions, takes measurements, and sends these measurements to the Analytics Cloud.
In fact, it is a sophisticated state machine that provides a robust measurement platform. Coupled with our Analytics Cloud, it delivers the Digital Quality Score: an outside-in, real-time, data-driven view of what your customers and employees would say about the digital experience of your products or services, and why they would say it.
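The "sophisticated state machine" behind a seemingly dumb agent can be illustrated with a small transition table: each state knows which events it reacts to, and unknown events leave the state unchanged, which is part of what makes such a loop robust. The states and events below are hypothetical, chosen for this sketch; the DU's real internals are not public.

```python
from enum import Enum, auto

class State(Enum):
    FETCH_INSTRUCTIONS = auto()
    MEASURE = auto()
    UPLOAD = auto()
    BACKOFF = auto()

# Hypothetical transition table: (state, event) -> next state.
TRANSITIONS = {
    (State.FETCH_INSTRUCTIONS, "instructions_ok"): State.MEASURE,
    (State.FETCH_INSTRUCTIONS, "network_error"): State.BACKOFF,
    (State.MEASURE, "samples_ready"): State.UPLOAD,
    (State.UPLOAD, "upload_ok"): State.FETCH_INSTRUCTIONS,
    (State.UPLOAD, "network_error"): State.BACKOFF,
    (State.BACKOFF, "retry_timer"): State.FETCH_INSTRUCTIONS,
}

def step(state, event):
    """Advance the machine; events a state doesn't expect are ignored,
    so a stray or duplicated event can never derail the loop."""
    return TRANSITIONS.get((state, event), state)
```

Everything the agent does – receive instructions, measure, upload, retry – falls out of walking this table, which is why the DU can look simple from the outside while behaving dependably in messy network conditions.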