Developing public appraisal of AI

The capacity to evaluate and audit AI should not be confined to government agencies; it should also be developed within civil society. This is a mission which a great many associations have already taken up. Public authorities have a duty to support this capacity, and to that end we need to anticipate the funding problems facing civil society organisations and journalists in their continuing role as the watchdogs of our digital era. As a point of comparison, ProPublica, the benchmark investigative media outlet for digital liberty, financed by the Soros Foundation to the tune of $20m, has at its disposal five highly qualified full-time experts — developers from technology firms and/or post-doctoral researchers from the best universities — along with development support teams and a wide range of academic backing. It would be difficult to find comparable resources among French associations or in French journalism, especially in the field of machine learning. Consequently, at the very least we need to facilitate communication between public authorities, research and civil society by maintaining ombudsmen committed to supporting initiatives which mobilize AI in efforts to understand discrimination.

One of the main obstacles to public auditing is access to data, which is frequently held by private stakeholders. There are currently voluntary initiatives by stakeholders such as Google to make data available for studying gender issues or for understanding phenomena such as the non-take-up of rights. In parallel with this voluntary approach, specific assistance could be put in place for organisations which are not equipped to access such data (for example, in terms of their capacity to secure it). To this end, funding to assist with the hosting and scaling-up of projects (scientific, engineering and legal support, etc.)
could be considered, under the auspices of an organisation whose independence is guaranteed. In this connection, the work of Upturn, ProPublica and the Electronic Frontier Foundation in the United States could serve as examples. In addition to assistance with access to data, support for testing procedures and reverse-engineering could be introduced; these auditing procedures should not be the preserve of official auditors. To support public auditing, incentives could be offered to members of the public to make their data available to researchers and to associations actively defending social rights and freedoms, in order to help build varied user profiles and pathways. Citizen data portability (see Part 1 of this report) could be one of the best means to this end.

Supporting Research into Accountability

In the digital sphere, the most significant scientific progress often results from close collaboration between public authorities, research laboratories and manufacturers; AI is no exception.

For a Meaningful AI - Report - Page 118