Exposing the Bias Embedded in Tech

Women 2.0

“It was penalizing résumés that had the word ‘women’ in it, such as if you went to a women’s college,” she said. “Your résumé was spit out.”

Research has shown that facial recognition is far more accurate with lighter-skinned men than with women and, especially, with darker-skinned people.

Other examples: A Microsoft customer was testing a financial-services algorithm that did risk scoring for loans. “As they were training the data set, the data was of previously approved loans that largely were for men,” Ms. Johnson said. “The algorithm clearly said men are a better risk.” In the résumé case, the computer models had been trained on résumés submitted over the past 10 years, most of which came from men, so the system was “taught” that men were better job candidates.
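The pattern Ms. Johnson describes, a model inheriting bias from the historical decisions it is trained on, can be illustrated with a small, hypothetical sketch. The data, features, and numbers below are invented for illustration and are not drawn from the system she tested; the point is only that a classifier trained on past approvals that favored men will go on to score men as better risks.

    # A minimal, hypothetical sketch of how a model trained on skewed historical
    # decisions reproduces the skew. All data, features, and coefficients are
    # invented for illustration; this is not the system described in the article.
    # Assumes numpy and scikit-learn are installed.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 10_000

    # Synthetic applicants: income (in thousands) and a binary gender flag (1 = man).
    income = rng.normal(50, 15, n)
    is_man = rng.integers(0, 2, n)

    # Historical approvals that favored men independently of income, a stand-in
    # for "previously approved loans that largely were for men."
    logit = 0.05 * (income - 50) + 1.5 * is_man - 0.75
    approved = rng.random(n) < 1 / (1 + np.exp(-logit))

    # Train a classifier on those historical decisions, gender included as a feature.
    X = np.column_stack([income, is_man])
    model = LogisticRegression().fit(X, approved)

    # Two applicants identical except for gender: the model scores the man higher,
    # because the labels it learned from already encoded the old preference.
    p_man, p_woman = model.predict_proba([[50, 1], [50, 0]])[:, 1]
    print(f"P(approve | man)   = {p_man:.2f}")
    print(f"P(approve | woman) = {p_woman:.2f}")

Note that simply dropping the gender column would not necessarily fix this: other features correlated with gender can act as proxies, which is why the makeup of the training data itself matters.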

For example, she said, tenants in Brooklyn are fighting a landlord who wants to replace a lock-and-key entry system with facial recognition. In opposing that plan, the AI Now Institute supported the tenants, who fear increased surveillance and that the inaccuracy of facial recognition, especially for nonwhite people, would leave them locked out.

“The people at the top look more and more the same,” she said. And fewer, not more, women are getting bachelor’s degrees in computer science.


Similar News: You can also read news similar to this that we collected from other news sources.

San Francisco Will Use AI To Thwart Racial Bias When Charging Suspects: In July, the San Francisco District Attorney's office will implement a 'first-in-the-nation' AI tool used to prevent prosecutors from being racially biased when deciding criminal charges.
Read more »

'Rafiki' Is A Stunning Lesbian Love Story—In A Place Where That's Forbidden: On GoodTrouble, we saw a rare instance of a Black woman addressing Black men's unspoken preference for white women, without reducing her to the stereotype of being bitter.
Read more »

Genius Claims Google Stole Lyrics Embedded With Secret Morse Code: Lyrics site used two types of apostrophes that, when converted to dots and dashes, spelled out “Red Handed”.
Read more »



