What are XLAs and how do they apply to Managed IT?
Something of a newcomer when it comes to customer expectations, Experience Level Agreements (XLAs), at least in the context of managed IT, are pre-defined target levels for end users’ experience with any given IT service.
In other words, XLAs define and measure the type of IT experience end users can expect while also tracking and documenting their satisfaction with this experience by evaluating the quality of the service and support they receive.
If this seems somewhat vague compared to, say, the more traditional Service Level Agreements (SLAs) usually put in place by organisations, that’s because XLAs – although becoming more and more popular – have tended to be used and defined differently depending on what experience and what organisation they pertain to.
For example, it’s difficult to ‘agree’ on the way end users experience a service, since everyone is unique, and everyone experiences things in their own way and from their own perspectives. After all, people require different levels and types of support to feel satisfied – and this may also change over time or circumstances.
One thing that can be said for XLAs, however, is that they focus on the perceived value of IT services to users. XLAs drive organisations to put end user experience at the centre of their service propositions and really pay attention to how customers feel about their treatment and the services they receive (rather than just observing the operational performance, for instance).
In this sense, XLAs are a naturally people-centric device; they work to achieve outcomes directly desired by the people who utilise certain services. It might be useful to consider an XLA in conjunction with other measures, such as Digital Employee Experience (DEX), for example.
Service vs Experience
It’s likely that most of us – at least those working in, or with, a managed IT service – know what a Service Level Agreement (SLA) is. For years, SLAs have been an important reference point when it comes to determining whether MSPs are meeting the expectations placed upon them by customers.
Indeed, ISO 20000 defines SLAs as a ‘mandatory requirement’ and as ‘one of the service delivery processes’. As a documented agreement between the organisation and its customer, an SLA clearly identifies services and their agreed performance.
SLAs have primarily been based on service metrics such as time to respond, first contact resolution (FCR) rate, number of support requests, and so on. However, these metrics can fall short when it comes to capturing the overall end-to-end experience of customers.
Time to respond is an important metric, but on its own it says nothing about whether the response actually helped the user, or about the quality of their interactions with support staff. Indeed, failure to capture the user-centric aspects of service delivery can give rise to the ‘watermelon effect’. This is where contractual SLA performance is ‘green’ (the outside of the melon) but lurking inside are ‘red’ issues that the measures have failed to identify – underlying problems impacting users’ experience.
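To make the watermelon effect concrete, here is a minimal sketch – with hypothetical thresholds and field names, since every contract defines its own – of how a provider might flag a service that is green on paper but red in practice:

```python
from dataclasses import dataclass

@dataclass
class ServiceScores:
    name: str
    sla_compliance: float  # % of tickets meeting contractual SLA targets
    csat: float            # average user satisfaction rating on a 1-5 scale

# Hypothetical thresholds - every organisation will agree its own.
SLA_GREEN = 95.0  # SLA reporting looks healthy at or above this level
CSAT_RED = 3.5    # experience is suffering below this average rating

def is_watermelon(service: ServiceScores) -> bool:
    """Green on the outside (SLA met), red on the inside (poor experience)."""
    return service.sla_compliance >= SLA_GREEN and service.csat < CSAT_RED

services = [
    ServiceScores("Service desk", sla_compliance=98.2, csat=3.1),
    ServiceScores("Device provisioning", sla_compliance=96.5, csat=4.4),
]

for s in services:
    if is_watermelon(s):
        print(f"{s.name}: SLA green ({s.sla_compliance}%) "
              f"but experience red (CSAT {s.csat})")
```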
Adding an XLA into the mix, then, means aligning the service provider and the customer/user on their perceptions of performance. This can open the door to honest and authentic communication between the two, as well as providing valuable input to an ongoing cycle of continual improvement.
Measuring XLAs
The good thing about XLAs is that, before an MSP enters into one with their customer, they encourage a clear understanding of:
- The outcomes the customer is trying to achieve
- The added value or benefits the MSP will provide
- The things that result in a positive feeling for the customer as they interact with the service touch points
Customer satisfaction is the primary measure used in XLAs and, of course, this can be measured in different ways.
It could be a one-to-five-star rating, a customer satisfaction survey, or even a Net Promoter Score (NPS). NPS is a globally recognised market research metric based on a single survey question – how likely the respondent is to recommend the service – and any NPS above 70 can be considered world-class.
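For those unfamiliar with the arithmetic: respondents answer the question on a 0–10 scale, scores of 9–10 count as promoters, 0–6 as detractors, and NPS is the percentage of promoters minus the percentage of detractors. A minimal sketch of the calculation (the sample responses are, of course, made up):

```python
def net_promoter_score(ratings: list[int]) -> float:
    """NPS = % promoters (scores 9-10) minus % detractors (scores 0-6)."""
    if not ratings:
        raise ValueError("no survey responses to score")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Ten made-up responses to the single NPS question (0-10 scale).
responses = [10, 9, 9, 8, 7, 9, 10, 6, 9, 10]
print(f"NPS: {net_promoter_score(responses):.0f}")  # 7 promoters, 1 detractor -> 60
```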
Baselining metrics in this way can be useful when onboarding a new supplier since figures can be tracked and measured along with any changes to the wider IT service provision. In doing so, service providers can measure these changes in terms of impact on user perception.
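As a simple illustration of how such a baseline might be used (a hypothetical sketch, not a prescribed method), each period’s average score can be compared against the figure captured at onboarding, so the impact of any service change shows up as a delta:

```python
# Hypothetical figures: a CSAT baseline (1-5 scale) captured at onboarding,
# followed by monthly averages measured after changes to the service.
baseline = 3.8
monthly_csat = {"Jan": 3.7, "Feb": 3.9, "Mar": 4.2}

for month, score in monthly_csat.items():
    delta = score - baseline
    print(f"{month}: {score:.1f} ({delta:+.1f} vs baseline)")
```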
What are the challenges of XLAs?
Like any new concept, XLAs come with a few challenges, an obvious one being the subjectivity of ‘experience’ itself. Since XLAs measure user experience, one of their main difficulties is how personal the measure can be. For example, a single negative experience – downtime during a stressful period, say, or an encounter with a trainee who isn’t yet as fast as their colleagues – may be enough to cement a negative perception permanently. If this happens, the organisation behind the XLA would need to expend considerable effort to shift that perception and regain a satisfactory score. In other words, XLAs can be tricky because each recorded measure is based on one solitary experience (and users can sway each other’s opinions too, say, by sharing a bad experience amongst colleagues).
Another challenge worth bearing in mind is the difficulty of collecting feedback. Although feedback is the primary source of XLA metrics, many users don’t like – or don’t have the time – to give it (unless the experience was negative, unfortunately). This can skew XLA feedback in favour of those who feel negatively about the service, as opposed to the silent majority who may be very satisfied with what they receive.
To surmount these challenges, it is important that service providers invest in strong business relationship capabilities that will:
- Ease and promote the process of receiving feedback.
- Expose the various touch points where customers interact with the service. This clarifies the customer journey and what is a priority from the customer perspective.
Regularly measuring XLAs (rather than just once a year) can also provide better data on trends related to different seasons and events. A key point here is generating enough volume to create a true picture of perception; this helps dilute anomalies such as the one-off bad experience described above. Service providers therefore need an easy, frictionless means of allowing users to rate the service they have received, ideally at the point of delivery while the experience is fresh in their minds.
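As one illustration of what ‘frictionless’ might look like in practice, the sketch below (hypothetical ticket identifiers and structure) records a one-tap rating at the moment a ticket is closed, timestamped so that volume and trends can be analysed later:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class TicketFeedback:
    ticket_id: str
    rating: int  # one-tap 1-5 score captured when the ticket closes
    captured_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

def capture_rating(ticket_id: str, rating: int) -> TicketFeedback:
    """Record a rating at the point of delivery, while the experience is fresh."""
    if not 1 <= rating <= 5:
        raise ValueError("rating must be between 1 and 5")
    return TicketFeedback(ticket_id, rating)

# Hypothetical ticket reference.
print(capture_rating("INC-10342", 4))
```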
Ongoing user satisfaction is the real value
The true value in understanding and measuring user experience is that doing so keeps channels of communication open throughout the service lifecycle. As user experiences evolve and change, a holistic approach to experience – one that is a truthful reflection of the customer’s day-to-day challenges and needs – is the only meaningful way of understanding and improving service value.
Done well, this can create what we here at Littlefish like to call ‘service excellence’ – a blend of enhanced user experience, improved customer satisfaction, and tangible business value.
If you would like to discuss how Littlefish’s award-winning and user-centric IT services can help you reduce costs, increase service quality, and support your end users’ experience, feel free to contact us through our get in touch button.