Part 6 — For Inclusive and Diverse Artificial Intelligence

particularly in regard to the most vulnerable members of society. The role of administrative staff in assisting the public will become all the more crucial. The development of artificial intelligence within the civil service will only be advantageous if working conditions for civil servants improve to the benefit of users. The optimization of administrative procedures must echo the aim of empowering civil servants (help with looking up information on procedural exceptions or on matters outside their own field of expertise, automated data entry and transmission, etc.) and of refocusing their responsibilities on providing human assistance to those in need, as well as on developing better institutional coordination between the actors involved in care (administrations, assistants, associations, etc.). The reception, guidance and assistance provided to users require a coordinated approach by actors on the ground, in contrast to the public kiosk- or device-based approaches that currently prevail. As such, it is necessary to train civil servants in public assistance on a massive scale, while also strengthening links with existing actors in digital mediation and with social policy professionals (in digital co-working spaces, associations, foundations, etc.).

Using EPNs (espace public numérique - digital public spaces) to raise awareness of, and report on, discriminatory biases in automated access to basic services (housing, employment, healthcare, etc.)

A number of studies have shown that tasks automated by machines do not necessarily smooth out the subjective biases of human procedures. Public authorities must therefore equip themselves with the skills required to better understand, identify and fight forms of algorithmic discrimination, particularly where it affects access to basic services such as housing and energy, healthcare, employment and training, and credit.
These skills could be of a technical nature (see the proposal on audit procedures for algorithms in the section on ethics) or institutional. Public authorities must develop new channels of communication with citizens in order to facilitate the reporting of experiences on the ground and to carry out testing in real conditions. To do this, they must call on the support of the digital mediation network and of associations for the protection of rights. In conjunction with anti-discrimination and human rights associations, digital public spaces (EPN) could:
- offer awareness-raising conferences on the risks posed by algorithmic discrimination;
- organize citizen panels to test systems and identify possible biases;
- launch action-research groups to better understand how certain forms of online exclusion or marginalization arise.

The quality and representativeness of datasets on population groups are correlated with their social status

In their publication 'Big Data's Disparate Impact' (2015), Solon Barocas and Andrew Selbst (Princeton University) show that the quality and representativeness of the datasets describing population groups are correlated with those groups' social status. They
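The kind of bias testing a citizen panel might perform can be sketched quantitatively. One common heuristic (not prescribed by this report, and used here purely as an illustrative assumption) is the "four-fifths rule": compare approval rates across groups and flag the system if the most disadvantaged group's rate falls below 80% of the most favoured group's. The data and group labels below are entirely hypothetical.

```python
# Minimal sketch of a disparate-impact check, as a citizen panel or auditor
# might run it over the outcomes of an automated decision system.
# The 0.8 threshold (the "four-fifths rule") and the sample data are
# illustrative assumptions, not requirements from the report.

def selection_rates(decisions):
    """decisions: iterable of (group, approved) pairs -> approval rate per group."""
    totals, approved = {}, {}
    for group, ok in decisions:
        totals[group] = totals.get(group, 0) + 1
        approved[group] = approved.get(group, 0) + (1 if ok else 0)
    return {g: approved[g] / totals[g] for g in totals}

def disparate_impact_ratio(decisions):
    """Ratio of the lowest group approval rate to the highest (1.0 = parity)."""
    rates = selection_rates(decisions)
    return min(rates.values()) / max(rates.values())

# Hypothetical audit data: (group, was the housing application approved?)
audit = ([("A", True)] * 80 + [("A", False)] * 20 +
         [("B", True)] * 50 + [("B", False)] * 50)

ratio = disparate_impact_ratio(audit)
print(ratio < 0.8)  # True: the gap between groups exceeds the rule of thumb
```

A real audit would of course need careful sampling (for instance through the testing in real conditions mentioned above) and legal interpretation; the ratio alone is only a screening signal, not proof of discrimination.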
For a Meaningful AI - Report