Trust in Human-Machine Partnerships

Technology has changed our lives dramatically over the past decades. We have evolved from doing everything manually to performing ever more complex tasks by delegating them to machines. As Artificial Intelligence (AI) becomes widely deployed, the need for AI to support interaction with humans becomes ever more acute. In the not-too-distant future, devices will talk to each other, as well as to us, about their activities. Human-machine collaboration is a model in which humans co-work with AI systems and other machines rather than using them as tools, resulting in better decision making and increased efficiency. As in most successful collaborations, each partner brings abilities that the other lacks. Consequently, a human-machine collaboration can overcome the limits of individual human and AI capabilities, with the partners working alongside each other to achieve a shared goal. However, to achieve this level of collaboration effectively, it is crucial to develop new AI technologies that people can use, understand, and trust.

The goal of the THuMP project is to advance the state-of-the-art in trustworthy human-AI decision-support systems. The Trust in Human-Machine Partnership (THuMP) project will address the technical challenges involved in creating explainable AI (XAI) systems, so that people using such a system can understand the rationale behind, and therefore trust, the suggestions it makes. THuMP will also investigate the legal and ethical challenges involved in instantiating an XAI system for solving resource allocation problems in critical domains, based on varied data from multiple sources with different levels of reliability and completeness. The project is carried out in collaboration with two project partners, Save the Children and Schlumberger, which provide use cases for the project, and with the law firm Hogan Lovells, which will cooperate in investigating the legal implications of enhancing machines with transparency and explanations, and how this affects the liability, accountability, and shared responsibilities of machines.