In-depth analysis: Automated decision making, connectivity and ethics - DAC Beachcroft


Published On: 1 September 2016

Insurers can have a major impact on the lives of individuals through the data they hold and how they use it – data regulators know this and, increasingly, so do consumers. In order to avoid becoming a casualty in the battle for data ownership, simple compliance with the latest set of regulations may not be enough, says Hans Allnutt, Partner at DAC Beachcroft: “In the big data era, the law is continuously playing catch-up with technological possibilities. Insurers must begin to think beyond mere legal compliance and evolve an ethical approach. They must determine the ethical boundaries of their data processing. That includes collection, usage, retention and transfers.”

The need for ethical boundaries is something that the UK government recently considered as part of a parliamentary inquiry into big data. On 26 April 2016, the government announced that a new Council of Data Ethics would be established as a means of addressing the growing legal and ethical challenges associated with balancing privacy, anonymisation, security and public benefit.

“Just like the UK government, insurers should consider establishing a clear ethical data policy,” continues Allnutt. “You could say Google’s corporate motto of ‘Don’t be evil’ is one such example. It is about developing clarity over what you are prepared to do and what you are not prepared to do with data and always being totally transparent with the customer.”

It might be said that Google needs a motto because it is so technologically advanced that it operates beyond the boundaries within which existing laws were drafted. It has faced the wrath of many privacy campaigners, as the case of Google Inc v Vidal-Hall and others demonstrates. A key feature of this case is the re-interpretation of existing law so that distress caused by misuse of personal data alone will be enough to trigger compensation, even in the absence of any financial loss. Google was caught out by the court’s willingness to re-interpret existing law to keep pace with consumers’ privacy expectations, and the case demonstrates why companies have to go beyond mere compliance with existing privacy laws.

Moving targets
The growing threat of legal action, regulatory investigations, fines and serious reputational harm has pushed this issue up the corporate agenda but it is a moving target. The Google case and the new generation of data protection regulations in Europe show just how fast the presumption of privacy – and by implication ownership – is moving in favour of the individual. This is where the challenges start to mount up and where mere compliance today may leave insurers exposed tomorrow.

Compliance in itself, however, is not easy or a given, says Rhiannon Webster, data protection specialist and Partner at DAC Beachcroft. “To the extent that insurers process personal data beyond what is strictly necessary for the insurance contract, they will generally need to obtain the consent of individuals. The threshold for consent is rising under the new General Data Protection Regulation, which comes into force in May 2018 [see our comments on the impact of Brexit on the GDPR in the Legislation section below], and the insurance industry will need to ensure that all uses and disclosures of personal data are clearly explained to individuals and that a record of the consent is kept. Individuals must also be able to withdraw consent freely.”

The data collected through connected and automated devices has great potential, and Webster explains from experience that insurers are already aggregating it into big data sets and undertaking analysis. “This analysis is not only aimed at increasing the accuracy of underwriting but also at identifying trends in behaviours and potential marketing channels, which has great value not only to insurers but outside the world of insurance too. If the data is fully anonymised, it is outside the scope of data protection rules; however, individuals get understandably concerned if their personal data is being used for purposes beyond what is necessary for the insurance contract. The public are quite happy to give a large amount of their data to loyalty card schemes because they see the value exchange, in the form of discounts. The insurance industry has some way to go to gain the public’s trust and to show the value insureds get from opening up their homes and lives to connected devices. The insurance industry needs to be aware that the world of connected devices is really a world of connected people, and the public are becoming increasingly privacy aware.”

As connectivity – cars, homes, offices, factories, devices – becomes the norm, insurers are going to face these challenges more frequently and will have to work hard to win the trust of consumers. That brings us back to ethics, says Webster.

“There will be a trade-off: better prices and better cover in return for handing over personal data. We are already seeing that with telematics. To go further than that and ask for more becomes a question of trust, and people don’t trust insurers. The key is transparency – being upfront about why insurers want the data and the benefits it will bring to the individual.”

Regulators want to encourage this, and the Information Commissioner is developing a Privacy Seal for industries or businesses that aspire to, and achieve, the highest data protection and privacy standards.

Concentrating biases
Running alongside these challenges is the arrival of artificial intelligence, which is ready to make a major impact in the insurance market with automated decision making, according to Allnutt.

“We assume people are making decisions about how the data is used but it is increasingly likely to be a computer and this is fraught with dangers,” he says. “When you put a huge amount of data into a machine you can often concentrate the natural biases of the world and this can create inherently discriminatory outcomes.”

A recent example is the Microsoft chatbot, an artificial intelligence pilot that went horribly wrong: a bot created to chat online with teenagers turned into a foul-mouthed, right-wing racist as it absorbed the data from the conversations it was having with people. Similarly, Google searches for ‘unprofessional haircuts at work’ generated images of black women, while a search for ‘professional styles’ showed only white models, provoking a storm of protest.
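The bias-concentration mechanism Allnutt describes can be sketched in a few lines of Python. This is a minimal, entirely hypothetical illustration – the data, the postcode proxy and the acceptance rule are all invented, not any insurer’s real model:

```python
# Fabricated history of underwriting decisions, skewed against postcode "B".
history = (
    [("A", "accept")] * 90 + [("A", "reject")] * 10
    + [("B", "accept")] * 30 + [("B", "reject")] * 70
)

def learn_rule(records):
    """Accept a postcode only if its past acceptance rate exceeds 50%."""
    totals = {}
    for postcode, outcome in records:
        accepted, seen = totals.get(postcode, (0, 0))
        totals[postcode] = (accepted + (outcome == "accept"), seen + 1)
    return {p: accepted / seen > 0.5 for p, (accepted, seen) in totals.items()}

rule = learn_rule(history)
print(rule)  # postcode "A" accepted, postcode "B" rejected outright
```

The rule never sees a protected characteristic, yet by learning from skewed historical outcomes it turns a 70% historical rejection rate for one group into a 100% automated exclusion – the concentration of bias the quote warns about.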

“This shows how careful you have to be not to become unwittingly evil by letting the machines make all the decisions. You can easily see this happening in insurance with people being excluded because of their digital profile. You have to focus on outcomes and program the computers to be ethical. That won’t be easy,” believes Allnutt.

Connected cars setting the pace

The pace and scale of adoption of automated driving technologies is bringing the era of the driverless vehicle – car, lorry or passenger transport – closer, says Craig Dickson, Chief Executive of DAC Beachcroft’s Claims Solutions Group: “There is a visible increase in government support to make it happen. It is now a case of when, not if, although it won’t be a neat straight line in terms of adoption. There will be significant increases in use as smart city centres and connected highways infrastructure arrive.”

The technology now going into cars is already having a major impact on insurers: “The way cameras are quickly superseding sensors is really changing the approach to claims handling as they produce evidence that is much harder to dispute. With telematics and connected cars we are also seeing that the ability to respond promptly to accidents has changed dramatically.”

Dickson says that the ability to connect cars through smart phone apps will soon have a major impact and that could bring insurers and vehicle manufacturers into competition for the data as the owner of the app will, potentially, own the data.

“Insurers will have to change their proposition significantly to win this battle. Motor insurers have really wrestled with how to sell the benefits of their product when it is seen by many people as a grudge purchase. The opportunity is there to strike a new deal with customers on price and cover but also by feeding back safer driving information.”

It will not be long before some major disruption to existing motor insurance business models emerges, says Dickson. “The true barriers for insurers and for the adoption of connectivity won’t be the technology, they will be the business models. Any heavily regulated industry is always vulnerable to disruption. For instance, the price comparison websites are very slick at what they do but how will they adapt to a world in which real-time exchange of data enables minute-by-minute changes and where today’s data sets tomorrow’s prices?”

Automation and connectivity are two key trends, but the biggest change will come with the arrival of driverless vehicles, which will create their own set of ethical dilemmas, not least when faced with life-or-death decisions.

“I do not think these will hold back the adoption of driverless cars once the infrastructure is in place,” says Dickson. “Of course these ethical dilemmas will need to be faced as they won’t go away but they will not deflect a sympathetic user group that really wants it to work. They will ensure that these vehicles get into the daily activity of the majority of people and then we will reach an inflexion point after which adoption will be widespread.”

Ethics and data
The danger of creating uninsurable sectors of society – either because their digital profile does not match insurers’ desirable customers or because people exercise their right not to share their data – could attract the attention of financial regulators, warns Webster. “Excluding people because they don’t match certain profiles could be seen as unfair and that brings us right back into the Financial Conduct Authority’s territory of treating customers fairly.”

“This just underlines the challenge facing insurers, which is to focus on what they think is ethical in terms of their use of their customers’ data,” says Allnutt. “Those that get this right could have an opportunity to differentiate themselves from their competition. Profiling is right at the limits of what insurers will currently use technology for, and we may soon have more rules to help with that. But technology will always outpace the rules, so, just as the UK government has realised with the establishment of its Council of Data Ethics, companies need an ethical approach to data usage now in order to deal with technology that is not foreseen today.”
