
The “Quality Question" ...

As you might have noticed, I am in the habit of using the term “good enough” with some regularity. In a sector dominated by conversations about quality, this may seem a somewhat pragmatic, if not heretical, attitude. The fact is that in the early days of streaming it was so exciting to manage to stream anything at all, of almost any quality, that my own experience as the sector emerged was shaped by learning to deliver the capability to stream in the first place. Image and audio quality took many years to mature and take center stage. That history makes me somewhat pragmatic.

While we do have high-resolution systems for testing our encoding platforms at id3as, I have become so used to watching video over dual-ISDN connections running at 128 Kbps that I would never, myself, claim to have the “golden eyes” that those who profess to analyze video quality do. Even to this day I can sit and quite happily bask in the beauty of a full-screen, scaled-up video that is natively 240x180 resolution and looks more like a cheap Christmas tree decoration than the 2002 concert it records. I am not one to spend money on hi-fi. While I realize that many people traditionally go to great lengths to ensure that their production workflows all exceed the quality of my home entertainment system, so as to deliver the “highest quality” by the time it reaches my front room, I help them out by keeping that system pretty much as low cost as it can be, and generally quite out of date too. Why? Well, it ensures that I have the best backward compatibility, so my archives are still accessible, and it generally means that I am in the “long tail” mass market for services whose operators have already debugged them so that they simply work. I tend to “push the boundaries” on my work computers with the latest cutting-edge technologies, but when it comes to just watching something with my family, I take almost the opposite approach. This also keeps me mindful of where the mass consumer market is. Those who work in the industry and indulge themselves in 8K UHD screens five years before the mass market even understands what UHD means are right to do so, don't get me wrong. However, they are also at risk of disconnecting from the mass market, and of unconsciously leaving their own customers with a sense of being second rate: something that one should never allow to happen.

There are many great books written about “quality.” Nontechnical discussions aside, Pirsig's famous book, Zen and the Art of Motorcycle Maintenance, tells the story of a man's eventual breakdown in the search for its meaning.

This story is a potent warning for my colleagues in the sector. Since the massive opening up of intercontinental telecoms routes in 2002, the CDNs in particular have had to alter their approach to market. When transoceanic IP capacity was scarce, supply-and-demand economics allowed CDNs to charge a huge premium on their IP delivery. This was because the CDNs' proxy capability offered significant savings over delivering multiple copies of the same content directly over each pipe. Premiums such as $1.50 per GB of data transferred were common, even up to 2005 (as contracts expired), but since then the CDNs have fought tooth and nail to maintain their prices. In fact it was only the consolidation caused by careful dropping of prices (to today's more stable $0.02 per GB “floor”) that stemmed the competition and allowed the CDN revenue models to stabilize.
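To put that price collapse in perspective, here is a minimal sketch of the arithmetic; the 50 TB/month traffic figure is a hypothetical workload of my own choosing, not drawn from any real contract:

```python
# Hypothetical illustration: a publisher's monthly CDN bill at the
# pre-2005 premium rate versus today's approximate per-GB floor.

PREMIUM_RATE = 1.50   # USD per GB, common up to ~2005
FLOOR_RATE = 0.02     # USD per GB, today's approximate "floor"

monthly_traffic_gb = 50_000  # 50 TB/month, assumed for illustration

premium_cost = monthly_traffic_gb * PREMIUM_RATE
floor_cost = monthly_traffic_gb * FLOOR_RATE

print(f"At ${PREMIUM_RATE:.2f}/GB: ${premium_cost:,.0f}/month")
print(f"At ${FLOOR_RATE:.2f}/GB:  ${floor_cost:,.0f}/month")
print(f"Price ratio: {PREMIUM_RATE / FLOOR_RATE:.0f}x")
```

A 75x collapse in unit price is exactly why pure delivery could no longer carry the business, and why the narrative had to move elsewhere.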

To add value, the CDNs changed the narrative of the delivery story from price to “quality,” and - as we have discussed multiple times elsewhere - they have produced a number of key performance indicators that form the typical core of CDN negotiations with publishers these days.

I will explore latency in the following section, but given that very few applications are truly latency sensitive, and given that most media services have to buffer a few seconds before they start playing (orders of magnitude longer than the milliseconds of KPI difference that CDNs differentiate with), we should also introduce the concept of “quality of experience” (QoE). QoE is a broader measure that takes a more holistic approach to the entire engagement of a customer with a service. While in a lab a particular bitrate may stream to a player, that is a relatively isolated way to view quality. In practice, when delivering commercial services, we must include a variety of factors, such as ease of discovery and access, starting up within expectation and consistently, and, probably one of the most important (at least according to major broadcasters such as Sky, who spoke on this very subject at Content Delivery World in London last week), “re-buffering.”
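As a rough illustration of what measuring QoE rather than lab bitrate might look like, the sketch below derives two common session-level KPIs - startup time and rebuffering ratio - from a player's event log. The event format, field names, and sample timings are all assumptions for the sake of the example, not any particular CDN's or player's actual API:

```python
from dataclasses import dataclass

@dataclass
class PlaybackEvent:
    """A simplified player event: kind is 'request', 'play', 'stall', or 'resume'."""
    kind: str
    timestamp: float  # seconds since session start

def qoe_summary(events: list[PlaybackEvent], session_end: float) -> dict:
    """Derive two session-level QoE KPIs from a playback event log:
    startup time (request -> first frame) and rebuffering ratio
    (total stalled time as a fraction of time spent in playback)."""
    request_t = next(e.timestamp for e in events if e.kind == "request")
    first_play_t = next(e.timestamp for e in events if e.kind == "play")

    stalled = 0.0
    stall_start = None
    for e in events:
        if e.kind == "stall":
            stall_start = e.timestamp
        elif e.kind == "resume" and stall_start is not None:
            stalled += e.timestamp - stall_start
            stall_start = None

    return {
        "startup_time_s": first_play_t - request_t,
        "rebuffer_ratio": stalled / (session_end - first_play_t),
    }

# A session that starts in 6 s and stalls once for 4 s during 10 minutes:
events = [
    PlaybackEvent("request", 0.0),
    PlaybackEvent("play", 6.0),
    PlaybackEvent("stall", 120.0),
    PlaybackEvent("resume", 124.0),
]
print(qoe_summary(events, session_end=606.0))
```

The point of framing it this way is that a mid-session stall of a few seconds dominates the viewer's perception far more than a small difference in startup time or peak bitrate ever will, which is precisely why broadcasters single out re-buffering.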

These factors all add up to how well the client engages with your services. If a service is unavailable, the user has a million other options, and they are unlikely to return to try the service again. That is, unless they are in the middle of a box set, or you have a monopoly on the content, in which case unavailability is instead likely to have a very strong and direct impact on the customer's view of your brand as a whole.

So, while in these circumstances it cannot be stressed enough how important the CDN is in underpinning the client's perception of the CDN's customer, the CDN may often take its availability for granted and instead invest in differentiating with microscopically faster loading times or support for higher bitrates. Any publisher who prioritizes encoding quality over availability must either be focused on a niche such as telemedicine or have an eye on the wrong KPI. While ISPs that offer poor connectivity - consistently providing speeds lower than advertised - may see a churn of customers, few publishers lose customers because they offer 1080p rather than 4K streaming, or because the stream took 6.02 seconds to start rather than 6.01 seconds.

 