AI Bias And Experiments: How Women In News Is Tackling Tech's Inbuilt Stereotypes
- Wednesday, 03 Jul, 2024
Women, migrants, precarious workers, and socioeconomic and racial minorities are disproportionately impacted by generative AI's technical limitations, such as hallucinations and negative stereotypes, writes AI journalist Madhumita Murgia in her new book.
"And it's because they rarely have a voice in the echo chambers in which AI is being built," says Murgia, the first artificial intelligence editor of the Financial Times.
We have seen the effects of bias play out during workshops run over the past year by WAN-IFRA Women In News, and in the subsequent pilot of a deep dive on the topic.
This is being done through a training programme designed to equip media professionals with the skills to navigate the digital landscape and drive change within their organisations.
Generating offensive avatars
A new module launched earlier this year focuses on AI. More than 100 participants in eastern Europe took part in the training, which has since been rolled out to more than 100 journalists in parts of Africa, the Middle East and Southeast Asia.
One participant in eastern Africa described how, when he asked a text-to-image tool to generate a picture of doctors in Africa, it produced the image of a witchdoctor. Many complained that text-to-audio AI tools, developed with British or American English, did not recognise local accents. As a result, a Zimbabwean avatar spoke with an American accent and lacked authenticity.
Earlier this year, Google's chief executive admitted that biased AI tools had offended users, after its image-generation tool portrayed German World War II soldiers as people of colour, in a variety of ethnicities and genders.
Fears of bias in the tools are not new.
An Ethiopian-born computer scientist made headlines in the US last year after pointing out the inequalities and bias built into AI systems.
At the core of these issues are accuracy, trust and the quality of the data, which companies will need to review and monitor regularly. Without that oversight, data will remain biased and will skew results. After all, AI tools are only as good as the data they are fed.
"Data is everything. You put garbage in, you get garbage out," says the head of applied research and development at Foster + Partners in the UK.
Stages of trial and error
Participants in WIN's Age of AI programme report that diverse teams are already experimenting with AI, from fact-checking tools to building staff skills in their use.
Some of the best projects that are being considered for further European Union funding to develop the ideas include:
- A video lab to amplify video content and automate the generation of videos for social networks; the company behind it aims to grow its young audience by 15% with this tool;
- A plan to develop a Synthesia AI avatar to protect journalists’ identities and enhance safety, as well as using AI tools for quick video production and voice generation;
- An AI tool that uses personalisation to re-promote recently archived content to increase traffic and engagement.
What is clear from course participants is that media companies will have to ensure that a diverse group of their staff, from data scientists and technologists to journalists, collaborates when testing these tools before they are put to use.
Quotas for women in AI research may need to be considered, too. And smaller media groups may well have to collaborate and build either national or cross-border partnerships to develop shared AI tools to compete in the sector.
For now: what journalists and newsrooms can do…
Journalists in the newsroom can take the following steps now to improve their content – and the data it generates – by looking beyond what the news is and considering how those stories are being told.
For example, the types of questions to consider include:
- Who are the main sources for quotes and comments?
- Who else is quoted?
- Is the protagonist a woman or someone from an ethnically or socially diverse background? Or are they just commenting on what a man has done?
- How are these stories told visually?
- Who is shown doing what?
- Are more men than women depicted? How do you measure this? (One simple way is sketched after this list.)
- Are you covering, quoting and representing all of your audiences?
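One practical way to start answering the measurement question, rather than relying on impressions, is to tally quoted sources by gender across a batch of stories. The sketch below is illustrative only: the field names (headline, desk, sources, gender) are assumptions, not a real newsroom schema, and in practice the data would come from a CMS export or a manual source audit.

```python
# Illustrative sketch: counting quoted sources by gender across a set of articles.
# Field names below are assumed for the example, not taken from any real system.
from collections import Counter

articles = [
    {"headline": "Budget vote passes", "desk": "politics",
     "sources": [{"name": "A. Banda", "gender": "male"},
                 {"name": "C. Dube", "gender": "female"}]},
    {"headline": "Clinic reopens", "desk": "health",
     "sources": [{"name": "E. Farai", "gender": "female"}]},
]

def quote_share_by_gender(articles):
    """Return each gender's share of quoted sources across the articles."""
    counts = Counter()
    for article in articles:
        for source in article["sources"]:
            counts[source["gender"]] += 1
    total = sum(counts.values())
    return {gender: round(n / total, 2) for gender, n in counts.items()} if total else {}

print(quote_share_by_gender(articles))
# e.g. {'male': 0.33, 'female': 0.67} -- a baseline to track over time
```

Tracking a figure like this per desk or per month turns the checklist above into something a newsroom can monitor, rather than a one-off judgement call.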
Look at the consistency of your data across the company and how it is collected in different departments, from finance and sales to editorial. Assess what biases are inherent in the data sets or programmes being created. And finally, consider how best to use these AI tools and systems to improve journalism, not just to create clickbait or misinformation.