AI and me: friendship chatbots are on the rise, but is there a gendered design flaw?
Oliver Balch
Most chatbots are designed by men and tend to replicate gender stereotypes. But as the few women involved in the industry can testify, getting AI to emote requires input from all genders. Ever wanted a friend who is always there for you?
Someone infinitely patient? Well, meet Replika.
Gender, voice, appearance: all are up for grabs. The product of a San Francisco-based startup, Replika is one of a growing number of bots using artificial intelligence (AI) to meet our need for companionship. As AI developers begin to explore, and exploit, the realm of human emotions, a host of gender-related issues comes to the fore.
Many centre on unconscious bias. The rise of racist robots is already well documented. Is there a danger our AI pals could turn out to be loutish, sexist pigs? In addition to curated content, most AI companions learn from a combination of existing conversational datasets (film and TV scripts are popular) and user-generated content.
Both present risks of gender stereotyping. Lauren Kunze, chief executive of California-based AI developer Pandorabots, says publicly available datasets should only ever be used in conjunction with rigorous filters. The same, regrettably, is true of inputs from users. With more than 3 million male users, an unchecked Mitsuku, Pandorabots' flagship chatbot, presents a truly ghastly prospect. Appearances matter as well, says Kunze, and the risk of gender prejudices affecting real-world attitudes should not be underestimated.
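To make the "rigorous filters" idea concrete, here is a minimal sketch in Python of the kind of pre-training screen Kunze describes. The blocklist and the toxicity_score helper are illustrative assumptions for this article, not Pandorabots' actual pipeline, which is not public; real systems rely on curated term lists and trained moderation classifiers rather than word counting.

```python
# Illustrative sketch only: screen conversational training data before a
# companion bot learns from it. BLOCKLIST and toxicity_score are
# hypothetical stand-ins for a production-grade moderation pipeline.

BLOCKLIST = {"sexistword", "slur"}  # placeholder terms; real lists are curated

def toxicity_score(utterance: str) -> float:
    """Fraction of words in the utterance that hit the blocklist."""
    words = utterance.lower().split()
    if not words:
        return 0.0
    return sum(w.strip(".,!?") in BLOCKLIST for w in words) / len(words)

def filter_dataset(utterances, threshold: float = 0.0):
    """Keep only utterances whose toxicity score is at or below threshold."""
    return [u for u in utterances if toxicity_score(u) <= threshold]

if __name__ == "__main__":
    raw = ["Hello there!", "You slur, be quiet", "How was your day?"]
    print(filter_dataset(raw))  # ['Hello there!', 'How was your day?']
```

The same gate would sit in front of live user inputs before they feed back into learning, which is the risk Kunze flags with unchecked user-generated content.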
Kunze gives the example of schoolchildren barking orders at girls called Alexa after Amazon launched its home assistant of the same name. Pandorabots has experimented with banning abusive teen users, for example, with readmission conditional on their writing a full apology to Mitsuku.
Alexa the AI, meanwhile, now comes with a politeness feature. While emotion AI products such as Replika and Mitsuku aim to act as surrogate friends, others are more akin to virtual doctors. Here, gender issues play out slightly differently, with the challenge shifting from vetting male speech to eliciting it. Alison Darcy is co-founder of Woebot, a therapy chatbot which, in a randomized controlled trial at Stanford University, was found to reduce symptoms of anxiety and depression. Users with disabilities or mental health issues are at particular risk here, says Kristina Barrick, head of digital influencing at the disability charity Scope.