Ellie Dobson, Director of Data Science, quoted in Nature, the International Weekly Journal of Science.
Nature examines artificial intelligence and argues that before scientists can trust AI, they must first understand how the machine learns. These "black box" systems are becoming ever more difficult to decipher, yet the need to understand how they reach their conclusions is more urgent than ever. The technique of deep learning, in which networks are trained on massive amounts of data, is increasingly finding commercial applications such as self-driving cars and websites that recommend products based on user search history.
Having the machine explain the reasoning behind its conclusions is especially important in cases such as denying loan applications, where there is a legal obligation to provide an explanation. As Nature states: "Similar concerns apply to a wide range of institutions, points out Ellie Dobson, director of data science at the big-data firm Arundo Analytics in Oslo. If something were to go wrong as a result of setting the UK interest rates, she says, 'the Bank of England can't say, "the black box made me do it"'."
To read the article in full, please follow the link below.