The development of connected and autonomous vehicles (CAVs) is technology-driven and data-centric. Zenzic's Roadmap to 2030 highlights that 'the intelligence of self-driving vehicles is driven by advanced features such as artificial intelligence (AI) or machine learning (ML) techniques'.[1] Developers of connected and automated mobility (CAM) technologies are engineering advances in machine learning and machine analysis techniques that can create valuable, potentially life-saving, insights from the massive well of data being generated.
Diego Black and Lucy Pegler take a look at the legal and regulatory issues involved in protecting data and innovations in CAVs.
The data of driving
It is predicted that the average driverless car will produce around 4TB of data per day, including data on traffic, route choices, passenger preferences, vehicle performance and many more data points.[2]
'Data is foundational to emerging CAM technologies, products and services driving their safety, operation and connectivity'.[3]
As Burges Salmon and AXA UK outlined in their joint report as part of FLOURISH, an Innovate UK-funded CAV project, the data produced by CAVs can be broadly divided into a number of categories based on its characteristics: for example, sensitive commercial data, commercial data and personal data. How data should be protected will depend on its characteristics and, importantly, the purposes for which it is used. The use of personal data (i.e. data from which an individual can be identified) attracts particular consideration.
The importance of data to the CAM industry and, in particular, the need to share data effectively to enable the deployment and operation of CAM, needs to be balanced against data protection considerations. In 2018, the Open Data Institute (ODI) published a report setting out its view that all journey data is personal data,[4] consequently bringing journey data within the scope of the General Data Protection Regulation (GDPR).[5]
Additionally, the European Data Protection Board (EDPB) has confirmed that the ePrivacy Directive (2002/58/EC, as revised by 2009/136/EC) applies to connected vehicles by virtue of 'the connected vehicle and every device connected to it [being] considered as a 'terminal equipment''.[6] In practice, this means that machine learning innovations deployed in CAVs will inevitably process vast amounts of personal data. The UK Information Commissioner's Office has issued guidance on how best to harness both big data and AI in relation to personal data, including emphasising the need for industry to apply ethical principles, create ethics boards to monitor new uses of data and ensure that machine learning algorithms are auditable.[7]
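The guidance does not prescribe how auditability should be achieved in practice, but one common building block is an audit trail around each model decision. The Python sketch below is purely illustrative (the model version, feature names and log format are our own assumptions, not a recommended standard) and shows one minimal way an inference could be recorded so that an individual decision can later be traced and reviewed.

```python
import hashlib
import json
import time

# Illustrative only: a minimal audit-logging wrapper around a model inference.
# The model version, feature names and log destination are hypothetical.

def audit_log(model_version, features, prediction, log_path="audit_log.jsonl"):
    """Append a record of a single inference to a JSON-lines audit log."""
    record = {
        "timestamp": time.time(),
        "model_version": model_version,
        # Hash the raw inputs so the log itself does not retain personal data,
        # while still allowing a specific decision to be matched and verified.
        "input_digest": hashlib.sha256(
            json.dumps(features, sort_keys=True).encode()
        ).hexdigest(),
        "prediction": prediction,
    }
    with open(log_path, "a") as f:
        f.write(json.dumps(record) + "\n")
    return record

# Example: log a hypothetical route-risk score produced by an on-board model.
features = {"speed_kmh": 48, "road_type": "urban", "weather": "rain"}
audit_log(model_version="route-risk-2.3.1", features=features, prediction=0.12)
```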
Navigating the legal frameworks that apply to the use of data is complex and, whilst the EDPB has confirmed its position in relation to connected vehicles, automated vehicles and their potential use cases raise an entirely different set of considerations. Whilst the market is developing rapidly, use cases for automated mobility will focus on how people consume services; demand-responsive transport and ride sharing are likely to play a huge role in the future of personal mobility.
The main issue policy makers now face is the ever-evolving nature of the technology. As new, potentially unforeseen, technologies are integrated into CAVs, the industry will require both a stringent data protection framework on the one hand, and flexibility and accessibility on the other. These two policy goals are necessarily at odds with one another, and the industry will need to take a realistic, privacy-by-design approach to future development, working with rather than against regulators.
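What privacy by design looks like will vary by use case, but a simple illustration is minimising and pseudonymising journey data before it leaves the vehicle or is shared more widely. The sketch below is a hypothetical example (the field names, salt handling and coarsening thresholds are assumptions, not a prescribed method); note that pseudonymised data generally remains personal data under the GDPR.

```python
import hashlib
import secrets

# Illustrative only: one way journey data might be minimised and pseudonymised
# before onward sharing. Field names and precision levels are assumptions.

SALT = secrets.token_hex(16)  # per-deployment salt, held separately from the data

def pseudonymise_journey(record: dict) -> dict:
    """Replace direct identifiers and coarsen location before sharing."""
    return {
        # Replace the vehicle identifier with a salted hash (pseudonymisation,
        # not anonymisation - the output is still personal data under the GDPR).
        "vehicle_ref": hashlib.sha256((SALT + record["vin"]).encode()).hexdigest(),
        # Coarsen GPS coordinates to roughly 1 km to reduce re-identification risk.
        "lat": round(record["lat"], 2),
        "lon": round(record["lon"], 2),
        # Keep only the fields genuinely needed for the stated purpose.
        "event": record["event"],
    }

raw = {"vin": "WVWZZZ1JZXW000001", "lat": 51.4545, "lon": -2.5879,
       "event": "harsh_braking", "driver_name": "A. Driver"}
print(pseudonymise_journey(raw))  # driver_name is dropped entirely
```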
Whilst the GDPR and ePrivacy Directive will likely form the building blocks of future regulation of CAV data, we anticipate the development of a complementary framework of regulation and standards that recognises the unique applications of CAM technologies and the use of data.
Cyber security
The prolific and regular nature of cyber-attacks poses risks to both public acceptance of CAV technology and to the underlying business interests of organisations involved in the CAV ecosystem.
New technologies can present threats to existing cyber security measures. Tarquin Folliss of Reliance acsn highlights this, noting that 'a CAV's mix of operational and information technology will produce systems complex to monitor, where intrusive endpoint monitoring might disrupt inadvertently the technology underpinning safety'. The threat is even more acute when thinking about CAVs in operation; as Tarquin notes, there is the potential for 'malign actors to target a CAV network in the same way they target other critical national infrastructure networks and utilities, in order to disrupt'.
In 2017, the government announced its 8 Key Principles of Cyber Security for Connected and Automated Vehicles. This, alongside the DCMS IoT code of practice, CCAV's CAV code of practice and BSI's PAS 1885, provides a good starting point for CAV manufacturers.
Work continues at pace on cyber security for CAM. In May this year, Zenzic published its Cyber Resilience in Connected and Automated Mobility (CAM) Cyber Feasibility Report, which sets out the findings of seven projects tasked with providing a clear picture of the challenges and potential solutions in ensuring digital resilience and cyber security within CAM.
Demonstrating the pace of work in the sector, in June 2020 the United Nations Economic Commission for Europe (UNECE) published two new UN Regulations focused on cyber security in the automotive sector. The Regulations represent another step-change in the approach to managing the significant cyber risk of an increasingly connected automotive sector.
Protecting innovation
As innovation in the CAV sector increases, issues regarding intellectual property and its protection and exploitation become more important. Companies that historically were not involved in the automotive sector are rapidly becoming key partners, providing expertise in technologies such as IT security, telecoms, blockchain and machine learning. Many of the biggest patent filers in the autonomous vehicle space have software and telecoms backgrounds.[8]
With increasing in-car and inter-car connectivity, and the growing volume of data that must be handled per second as levels of autonomy rise, innovators in the CAV space must address data security as well as determine how best to handle large data sets. Furthermore, the recent UK government call for evidence on automated lane keeping systems is seen by many as the first step towards standards being introduced for autonomous vehicles.
In view of these developments, companies looking to benefit from their innovations face new challenges. Unlike more traditional automotive innovation, where the advances lay in improvements to engineering and machinery, many of the innovations in the CAV space reside in electronics and software development. The ability to protect and exploit inventions in the software space has therefore become increasingly relevant to the automotive industry.
Multiple intellectual property rights exist that can be used to protect innovations in CAVs. Some rights can be particularly effective in areas of technology where standards exist, or are likely to exist. The two main approaches seen at present are patents and trade secrets. These can be used in combination, or separately, to provide an effective IP strategy; such an approach is seen in other industries, such as data security.
For companies that are developing or improving machine learning models, or training sets, the use of trade secrets is particularly common. Companies relying on trade secrets may often license access to, or sell the outputs of, their innovations. Advantageously, trade secrets involve no registration costs and can, in principle, last indefinitely.
An effective strategy in such fields is to obtain patents that cover the technological standard. By definition, if a third party adheres to the defined standard, it necessarily falls within the scope of the patent, providing the patent owner with a potential revenue stream through licensing agreements. If, as anticipated, standards are set for CAVs, any company that can obtain patents covering the likely standard will be at an advantage. Such licences are typically offered on a fair, reasonable and non-discriminatory (FRAND) basis, to ensure that patent holders cannot prevent other companies from entering the market.
A key consideration is that trade secrets may sit uneasily with the use of standards: if technology standards are introduced for autonomous vehicles, companies will have to demonstrate that their technology complies with them, and that need for transparency may be incompatible with keeping the underlying technology secret.
However, whilst a patent provides a stronger form of protection, in order to enforce it the owner must be able to demonstrate that a third party is performing the acts defined in the patent. In the case of machine learning and mathematical-based methods, such information is often kept hidden, making proving infringement difficult. As a result, patents in such areas are often directed towards a visible, or tangible, output; for example, in CAVs this may be the control of a vehicle based on the improvements in machine learning. Due to the difficulty in demonstrating infringement, many companies choose to protect their innovations with a mixture of trade secrets and patents.
Legal protections for innovations
For the innovations typically seen in the software side of CAVs, trade secrets and patents are the two main forms of protection.
Trade secrets are, as the name implies, where a company keeps all, or part of, its innovation secret. In software-based inventions this may take the form of a 'black box', where the workings and functionality of the software are kept hidden. However, steps must be taken to keep the innovation secret, and trade secrets do not prevent a third party from independently developing, or reverse engineering, the innovation. Furthermore, once a trade secret is made public, the value associated with it is gone.
Patents are territorial exclusive rights, lasting up to 20 years, which allow the holder to prevent a third party from utilising the technology covered by the patent in that territory, or to require the third party to take a licence. It is therefore not possible to enforce, say, a US patent in the UK. Unlike trade secrets, publication is an important part of the patent process.
In order to be patentable, an invention must be new (that is, it has not been disclosed anywhere in the world before), inventive (not a run-of-the-mill improvement) and concern non-excluded subject matter. The exclusions in the UK and Europe cover, amongst other things, software and mathematical methods 'as such'. In the case of CAVs, a large number of inventions could fall into the software and mathematical methods categories.
The test for whether an invention constitutes excluded subject matter varies between jurisdictions. In Europe, if an invention is seen to solve a technical problem, for example one relating to the control of vehicles, it will generally be deemed allowable. Many of the innovations in CAVs can be tied to technical problems relating to, for example, the control of vehicles or improvements in data security. As such, on the whole, CAV inventions may escape the exclusions.
What does the future hold?
Technology is advancing at a rapid rate. As industry develops ever more sophisticated software to harness data, bad actors gain access to more advanced tools. To combat these increased threats, CAV manufacturers need to put in place flexible frameworks to review and audit their uses of data now, looking toward the developments of tomorrow to assess the data security measures they have today. They should also look to protect their most valuable IP assets, including machine learning developments, from the outset, in a way that is secure and enforceable.