Governments have long had an important role in maximising social welfare by regulating safety, where private-sector providers do not have adequate incentives to do so. The motor industry spent decades competing to decorate cars with chromium rather than fit them with seat belts, until governments took action. In the USA, Ralph Nader persuaded Congress to set up the National Highway Traffic Safety Administration and to mandate the provision of crashworthiness information; Europe followed with the Product Liability Directive and mandatory safety testing. The regulation of drugs has similarly moved from the Wild West of nineteenth-century patent medicines to modern standards of safety and efficacy assessed by randomised controlled trials. Toys can no longer have lead paint, and if you pull out a teddy bear’s eye it must not leave a sharp spike behind. Regulation plays a key role in consumer confidence.
Not all regulation comes from the state; the insurance industry plays its part, with insurance premiums translating safety test results into incentives, and insurance labs producing the needed results where government labs don’t. (In some cases, such as fire and burglar alarms, insurers’ labs set the standards too.) Indeed, governments sometimes justify intervention in protective cybersecurity on the grounds that they are the insurer of last resort. There is also some industry self-regulation, notably through standards bodies.
In short, safety is a large and complex ecosystem of both private and public sector regulation. Like many other sectors of the economy, it is being disrupted by technology. The dependability – the safety and security – of computer and communications systems is becoming ever more critical to the safety of vehicles, medical devices, and in fact the whole infrastructure on which our societies depend. Indeed, in many languages, ‘safety’ and ‘security’ are the same word (Sicherheit, sûreté, seguridad, sicurezza, trygghet, ...).
At present, the European Union has many regulatory agencies concerned with safety in a variety of industries. We were commissioned in 2016 by the European Commission’s Joint Research Centre to examine the effect of ‘The Internet of Things’ on the ecosystem of safety regulation: the core question was what the EU’s regulatory framework should look like a decade from now. Will cybersecurity require a powerful, cross-domain regulator; or will each sector regulator acquire a cell of cybersecurity expertise; or will it be some mixture of general and sectoral approaches; or will we need to develop something else entirely? And how do we get there from here? This paper draws broader lessons from our work, but we start with security regulation.
The social welfare goals of a cybersecurity regulator (whether free-standing or sectoral) will typically be some mix of safety and privacy. The former is likely to be dominant in transport while the latter may be more important with personal fitness monitors. Other goals may include national security, law enforcement, competition, the accurate recording of transactions and dispute resolution. An example where all are in play is smart electricity meters: we do not want the meters in our homes to cause fires, to leak personal data, to enable a foreign power to threaten to turn them off, to allow the utility to exploit market power, to make electricity theft easy, or to make it impossible to resolve disputes fairly.
Achieving these goals will depend on mechanisms such as cryptography and access control; assurance that these have been correctly implemented and are being effectively maintained; standards for interoperability; and liability rules that align incentives. There must be mechanisms to prevent actors externalising risks to third parties who are less able to cope with them. This all involves not just writing standards and testing equipment before installation, but monitoring systems for breaches and vulnerabilities, and compelling the updating of software to deal with both safety and security issues as they arise.
So the goals and mission of a cybersecurity regulator may be a mix of the following:
1. Ascertaining, agreeing, and harmonising protection goals
2. Setting standards
3. Certifying standards achievement and enforcing compliance
4. Reducing vulnerabilities
5. Reducing compromises
6. Reducing system externalities
The underlying principle of these individual goals is to maximise social welfare by reducing risk. However, the devil lives in the detail.
The regulators’ first task is policy: to determine what needs to be regulated, and why. There will be multiple regulators with different missions: for example, the data protection authorities are concerned with privacy and national electricity regulators with competition. Once the goals are set, these can be elaborated into technical standards in collaboration with industry groups and relevant international bodies. Standards can build on existing specialist work, such as the US National Institute of Standards and Technology (NIST) standards for cryptography.
The first goal appears self-evident. What bad outcomes are we seeking to prevent or to mitigate? But this question can be tricky. Exactly whose risk should the regulator be reducing – the risk to a dominant industrial player, or to its millions of customers?
The second goal often entails adapting or evolving existing standards, of which there are already many, from algorithms and protocols through software processes to how people should be managed in organisations. Again, it can be tricky: inappropriate standards have been adopted in the past for both political and commercial reasons.
As for the third goal, compliance with standards helps reduce the information asymmetry between vendors and their customers. Business wants to know what it must do in order not to be held liable, and wants predictable tests of compliance. This is also tricky, as it creates incentives for liability games and regulatory capture.
The focus of the fourth goal is also on reducing the asymmetry between the purchaser and the vendor, but it is dynamic rather than static. Cybersecurity issues are starting to migrate from software products that are updated monthly (to fix bugs and stop attacks) to durable consumer goods such as motor vehicles. Will type approval for cars and medical devices depend in future on regular software updates? A regular update cycle will be needed to minimise the amount of time the purchaser is exposed to attacks. Online software updates also cut the costs of doing product recalls to fix safety problems. The tricky bit here is which vulnerabilities get prioritised.
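To make the update mechanism concrete, the sketch below shows, in simplified form, the two checks a device must make before installing an update: that the image is authentic, and that it is not a rollback to an older, possibly vulnerable, version. This is an illustrative sketch only, not any vendor's actual scheme; the shared-key MAC and the `sign_update`/`accept_update` names are our assumptions, and real deployments would use public-key signatures (e.g. Ed25519) so that devices hold no signing secret.

```python
import hashlib
import hmac

# Hypothetical vendor key for illustration; a real scheme would use
# public-key signatures so the device stores only a verification key.
VENDOR_KEY = b"example-vendor-key"

def sign_update(firmware: bytes, version: int) -> bytes:
    """Vendor side: MAC over version || firmware binds the version
    number to the image, so neither can be swapped independently."""
    msg = version.to_bytes(4, "big") + firmware
    return hmac.new(VENDOR_KEY, msg, hashlib.sha256).digest()

def accept_update(firmware: bytes, version: int, tag: bytes,
                  installed_version: int) -> bool:
    """Device side: verify authenticity, then refuse rollback to an
    older firmware version that may contain known vulnerabilities."""
    expected = sign_update(firmware, version)
    if not hmac.compare_digest(expected, tag):
        return False  # tampered or mis-signed image
    if version <= installed_version:
        return False  # rollback: an old signed image being re-offered
    return True
```

The rollback check matters for the regulatory question above: without it, an attacker can reinstall a genuinely signed but vulnerable old release, which is why update policy, not just update cryptography, would fall within a regulator's remit.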
The fifth goal seems to be focussed on reducing the exposure of insurers. There are other defenders too: consumers; vendors; security service companies; computer emergency response teams (CERTs); and finally, government agencies charged with protecting critical infrastructure. But some accidents are more salient than others, and voters are not content for all fatal accidents to be valued equally. If insurers or regulators act as if they are, a political backlash may follow (as happened in the USA with the Ford Pinto and in the UK with train protection systems).
The sixth goal is also about reducing asymmetry between vendors and customers. The focus, however, is no longer on technical vulnerabilities but on the overall impact of externalities. If malware causes a business to lose money, the regulator might not focus on loss prevention, but ensure that the liability falls on the party most capable of mitigating the risk, such as the bank. Similarly, car companies should bear the cost of unsafe software causing accidents. If autonomous vehicles are bundled with insurance, then the incentives may be broadly in the right place, at least for the first purchaser; but the regulator may still have to consider how used vehicles will get security and safety patches, and the cost of their insurance. In fact, the EU has already created a ‘right to repair’ to open up after-markets for car parts; how might it adapt this to the need for security patches?
Read this entire piece at https://www.cl.cam.ac.uk/~rja14/Papers/weis2017.pdf