In the latest move in its continuing crackdown on digital markets, the CMA has made clear that it believes more work needs to be done to understand and monitor the impact of algorithms on competition and consumers in the UK.
Spotlight on algorithms
Algorithms vary in form from simple rule-based systems to complex machine learning and artificial intelligence code. They can deliver numerous benefits to consumers: enabling personalised recommendations, saving time and allowing people to socialise, shop or consume news and entertainment online. They can also increase the efficiency and effectiveness of a company’s infrastructure and those pricing efficiencies can be passed directly to customers. So, what is the issue?
Algorithms have come under the spotlight of numerous regulators in recent years: financial services regulators have been focussing on transparency and justification of AI outcomes in financial services, while competition authorities around the world have examined the impact of the use of AI on competition and consumers. The CMA’s report highlights both these issues and concludes that more needs to be done to prevent harm to competition and consumers in the UK: a mantle that will no doubt be picked up by the CMA’s new Digital Markets Unit, expected to open in April this year.
Digital players of all sizes can expect more scrutiny over their use of algorithms from the CMA and regulators globally, and may increasingly be asked for explanations of the algorithms and AI they have employed. Companies should also expect to be held to account for the perceived harms associated with their algorithms.
CMA launches its ‘Analysing Algorithms’ programme
In January, the CMA launched its new ‘Analysing Algorithms’ programme to address the perceived harms caused by algorithms across the UK’s digital economy. In its first paper of the programme, the CMA looks at algorithms operated by businesses of all sizes - from Big Tech (e.g. Facebook’s News Feed or Google’s Search) to micro-businesses (e.g. those using machine learning systems) - and finds that, where not regulated sufficiently, algorithms have the potential to cause significant harm to consumers.
The CMA’s paper covers well-trodden ground in terms of the antitrust issues it identifies, building on a working paper it published in 2018 and similar publications by peer authorities (e.g. the joint Franco-German paper released in November 2019), but adding that advances in technology mean algorithms today operate on a scale and with an increasing level of sophistication that makes the harms more noticeable than previously.
The key issues identified are:
- Large platforms using algorithms to exclude competitors: The paper explores the ways in which algorithms can enable platform operators to opaquely self-preference or manipulate rankings to exclude competitors. Given the breadth of data held by major platform operators, the paper also outlines concerns that this data may be used, via algorithms, to intentionally target those customers at risk of switching and prevent them from moving to a competitor, making it easier for incumbent firms to predate successfully and further entrenching market positions. This echoes the concerns raised by the European Commission, which is currently investigating both Google and Amazon over allegations that their algorithms preference their own products or services. The EU’s proposed Digital Markets Act would bring in new rules directly addressing this issue.
- Facilitating collusion between businesses: The paper finds that numerous types of collusion can be facilitated by algorithms, including explicit coordination between competitors (who use algorithms as a tool to assist with collusion) and (potentially tacit) co-ordination through firms using the same third-party software. Explicit collusion has already been an area of enforcement by the CMA (e.g. in Online Posters in 2016, the CMA investigated and found that two competing sellers of posters and frames had agreed not to undercut each other’s prices on Amazon UK and had implemented their arrangement through automated repricing software). However, the paper also references more speculative potential harms through autonomous tacit collusion (where algorithms learn, without explicit instruction, to collude). The CMA adds that even where an algorithm’s behaviour is not perfectly anticipated, firms are responsible for the effective oversight of their systems, including robust governance and impact assessments (though competition rules currently require some form of collusion between parties).
The CMA also identifies broader consumer harms
The paper also explores the possibility that algorithms could cause broader consumer harm, through driving unfair/discriminatory outcomes. For example:
- Personalised pricing algorithms: Algorithms which assess a user’s willingness to pay a certain price based on personal data (including advertising different prices or offering discounts to certain consumers) can result in a loss of trust. The opaque nature of how these algorithms work risks exacerbating these perceived harms. The CMA also highlights that data used to power these algorithms is collected and used in ways customers cannot control.
- Personalised rankings: Similarly, the CMA finds that algorithms which personalise rankings for online searches can lead to an unfair ranking of search results, such as higher rankings being given to those offerings that derive higher commissions.
- Unfair design practices: The CMA also has concerns about other unfair design practices that may exploit consumers, such as firms misleading customers through scarcity messages that convey a sense of urgency – “X rooms left!” – some of which, the CMA states, are fabricated and do not, for example, accurately reflect how many people are viewing a particular offer at a given time.
While competition law can conceivably address unfair or discriminatory conduct through dominance enforcement where there is a harm to competition, the CMA considers these harms could arise even where the tools are used by businesses without market power. To take action here, the CMA would need to look outside its traditional competition toolkit and use its consumer powers, which it has asked the UK Government to strengthen in its Digital Taskforce Advice.
What’s happening next?
The CMA is clear that more work needs to be done to understand the impact of algorithms and has publicly called for information as a next step.
More broadly, the CMA is advocating close monitoring of the digital space and collaboration between regulators internationally, and is calling for increased auditing of the algorithms used by companies via several different routes, including opening formal investigations where it thinks a business’s algorithms may have infringed competition law, sandbox testing, and checks on algorithmic systems that have previously been investigated. This role will likely sit within the new Digital Markets Unit.
Whilst these are still early days, the CMA’s paper appears to be a precursor to future intervention and enforcement across digital markets by the CMA and other regulators in the UK.
“It is incumbent upon companies to keep records explaining their algorithmic systems, including ensuring that more complex algorithms are explainable. We believe companies should be ready to be held responsible for the outcomes of their algorithms”