In the thirteenth of our interviews with key speakers presenting at SEC, 11th–12th October 2018, Babiche Veenendaal discusses the issue of bias in relation to artificial intelligence (AI) and how it can be countered.
Babiche Veenendaal, Managing Director at Accenture, started her career as an entrepreneur with a start-up called @Law. Aged 22, she wrote a book about the legal aspects of the Internet. When no law firm wanted to start an Internet practice, she began her own business. She sold it five years later to Clifford Chance, where she worked as a lawyer for the next five years. Thereafter, she moved to Accenture, one of the most ethical companies in the world.
Could you explain more about the recent study Accenture has been involved in, along with the Ministry of Foreign Affairs and Plan, and what the study has revealed in terms of gender equality?
The survey revealed, first of all, how much work there is still to do in relation to gender equality, even in a Western country like the Netherlands. A total of 607 companies participated in the survey on gender policies and practices. Several companies were either relatively unfamiliar with the topic as such or did not have the information requested. We used a “maturity model” with four categories: Unaware/Blind; Neutral; Sensitive; and Transformative. Participating companies assessed themselves against this gender continuum, and about 27 per cent described themselves as unaware or blind.
While the business case is compelling and many companies are convinced of the need to invest in gender equality, most of them need help in operationalising solutions to achieve it. We provided not only an overview of the issues and challenges but also tools for inspiration, such as the Gender Toolkit of IDH.
What is your advice to companies who wish to level the playing field in terms of gender inequality, particularly in relation to salaries, education, safety and union membership?
My advice would be to set clear goals and agree on targets to meet. Follow the example of your industry’s best performer in gender equality. Shoot for the moon, because we need serious ambition in this area to get somewhere. Make sure you have not only female role models but also male champions who really care about and fight for gender equality.
Where bias is concerned, we need to start as early as possible, removing bias in children. Many experiments show that children already have a view of what men or women can become in life. We often use the quote “you can’t be what you can’t see”. If a girl has never seen a female firefighter or president, she might think these are not things she can become when she grows up. We need to teach children at an early age that everyone can become anything they want in the 21st century. Change the school books, change children’s films and change the bedtime stories we tell our kids.
Could you expand on the issue of bias in relation to artificial intelligence (AI)?
Research shows that the AI world is almost entirely dominated by men. Machine learning and AI systems offer opportunities to fix the bias and build more gender inclusive societies. Countries lacking key quality data will be unable to make evidence-based policies and, in turn, will fail to adopt measures not only to mitigate potentially harmful effects, but also to enhance and take advantage of these corrective opportunities.
How do you suggest we counter or govern such bias?
The low involvement and marginal inclusion of women in the coding and design of AI and machine learning technologies is leading to a variety of problems, including the replication of stereotypes, such as the submissive role of voice-powered virtual assistants, which are overwhelmingly given female voices and personas. AI thus risks replicating the very gender roles that are being removed from the real world. We should take proactive steps towards including more women in the workforce that designs AI systems. For example, should governments require companies to proactively disclose the gender balance of their design teams?
In terms of the development of laws to avoid bias in our systems, how do you envisage enforcing such laws?
As stated, disclosure obligations are one way to enforce this. Another option is a code of conduct and ethics for AI developers, similar to those that lawyers and doctors must adhere to.
You can find out more about the STEMM Equality Congress here.
You can find out more about Babiche Veenendaal here.