
The chair of the Financial Conduct Authority (FCA) has given a speech with the dramatic title “How can we ensure that Big Data does not make us prisoners of technology?” Charles Randell was speaking at the Reuters Newsmaker event in London in early July 2018.

He began by describing how algorithms play a big part in our lives. Mr Randell said:

“An algorithm decided the results of your internet searches today. If you used Google Maps to get here, an algorithm proposed your route. Algorithms decided the news you read on your news feed and the ads you saw.

“Algorithms decide which insurance and savings products you are offered on price comparison websites. Whether your job qualifies you for a mortgage. Perhaps, whether you are interviewed for your job in the first place.”

The FCA chair added that some people have questioned whether we still live in a truly democratic society, given the influence that algorithms have.

In support of this argument, he made three points:

  • Big Data – there are now “enormous and detailed datasets about many different aspects of our lives.”
  • Artificial intelligence and machine learning – this Big Data can be mined and analysed more extensively than ever before. Firms can predict our future behaviour and use the results of data mining to decide whether to offer us products and services, and on what terms.
  • Behavioural science – as firms understand more and more about human behaviour, they can target their marketing efforts accordingly.

Next, Mr Randell went as far as to question whether the reach of Big Data was now so great that the existing regulatory system was becoming inadequate, commenting:

“The power of Big Data corporations and their central place in providing services that are now essential in our everyday lives raise significant questions about the adequacy of global frameworks for competition and regulation. The ordinary consumer may in practice have no choice in whether to deal with these corporations on terms which are non-negotiable and are often too general to be well understood. And without access to the data which consumers have signed – or clicked – away, new businesses may find it very difficult to compete.”

He cited two examples of what might be considered less ethical uses of data by firms: increasing car insurance quotes on price comparison sites when an individual’s name suggested they were from an ethnic minority, and cutting cardholders’ credit limits when charges for marriage guidance counselling began to appear.

Mr Randell then highlighted that it is people who design technological innovations, and that on occasion those people may need to be held accountable, saying:

“People, not machines, need to understand and control the outcomes that the technology they are designing is producing; people, not machines, have to make the judgement as to whether these outcomes are ethically acceptable – and ensure that they don’t just automate and intensify unacceptable human biases that created the data of the past. A strong focus on checking outcomes will be essential as some forms of machine learning, such as neural networks, may produce results through processes which cannot be fully replicated and explained.”

The information shown in this article was correct at the time of publication. Articles are not routinely reviewed and as such are not updated. Please be aware that the facts, circumstances or legal position may change after publication of the article.