Ep 326: Giving Voice To Our Digital Assistants

Michael | Gender/Sexuality

[Image: Genderless voice]

Why do digital assistants such as Alexa, Google Home, Siri, and Cortana have "feminized" voices, and what are the effects of this trend? That's what I explore in this episode. Are there negative effects of using female voices in the devices we talk to and that talk to us? Are there alternatives? It turns out there is one: a "genderless" voice. What does that sound like? Tune in to find out as we explore gender roles, expectations, and equality.

This episode sponsored by Wix. I use Wix to create lots of online educational activities. Check them out!

Using Wix for Online Educational Activities

The reason digital assistants acquiesce to harassment isn’t just sexism or gender inequality in the tech world, as disturbing and prevalent as those may be. No, the explanation lies elsewhere, I believe. These machines are meant to manipulate their users into staying connected to their devices, and that focus on manipulation must be laser-like. To clearly state that harassment toward digital assistants is unacceptable would mean having some standard, some line that can’t be crossed. And one line leads to another, and soon you’re distracted—the user is distracted—from selling/buying merchandise, collecting/sharing data, and allowing a device to become ensconced in their life.

The moral standard most compatible with engagement is absolute freedom of expression, the standard of having no standards.

– Noam Cohen, “Why Siri and Alexa Weren’t Built to Smack Down Harassment”

