Are We Automating Racism?

887,129 views · months ago

Many of us assume that tech is neutral, and we have turned to tech as a way to root out racism, sexism, or other “isms” plaguing human decision-making. But as data-driven systems become a bigger and bigger part of our lives, we also notice more and more when they fail, and, more importantly, that they don’t fail on everyone equally. Glad You Asked host Joss Fong wants to know: Why do we think tech is neutral? How do algorithms become biased? And how can we fix these algorithms before they cause harm?

Vox · months ago
On this season of Glad You Asked, we explore the impact of systemic racism on our communities and in our daily lives. Watch the full season here: Want updates on our new projects and series? Sign up for the Vox video newsletter: For more reading about bias in AI, which we covered in this episode, visit our post on
Rushil Kasetty · 12 days ago
So great to see Joss again! One of my favorite hosts
Vaibhav Patidar · 13 days ago
I have a crush on Joss❤️😍
Ser Winzzalot · 15 days ago
There she is!!!!!! Joss!
aznhoops 12 · 21 days ago
Yo ngl joss is cute af
Mike Jim · 21 days ago
Joss 🥰
jn18 · 25 days ago
I love Joss Fong ❤️
Ari Prabowo · months ago
The content of the video is thought-provoking and Joss is fabulous, as usual. But I feel the audio in this piece could be better. It's so strong in the mid-range frequencies that even with an EQ tweak on my end, the strident sounds are still too sharp on my ears.
Ivy Javagat · months ago
just want to say Joss Fong is so pretty doe
Aldo Perez · months ago
So far, out of all the episodes, this is my favorite and the topic I can relate to the most. The other episodes are also interesting, but I think there's something lacking in the production. Even though the episodes run for only 22 mins, I felt sleepy watching them. I prefer the way Joss Fong talked here; it just makes me listen well. Just my opinion.
Yafidy Muhammad · months ago
I'm here just to see Joss. Who else?
Diaz · months ago
joss too cuteeee
rohitanand14 · months ago
We missed you joss
Joseph Andreanus Wijaya · months ago
Great and informative video as always, and it's nice to see Joss as the main host of this video. And this is such a curious topic though; never thought even machines could have racial bias
lies of august · months ago
When you first see Joss's face on the thumbnail, for sure every topic is interesting 😅
Fanary · months ago
joss fong never disappoints me
fingerluck · months ago
I'm here to see Joss Fong, I mean, besides this knowledgeable documentary where we can learn about racism in tech. We are all human beings; differences make our lives colorful, so keep it that way.
Nomis · months ago
I see Joss Fong on the thumbnail... I CLICK
Atheral · months ago
Joss Fong. That is all.
Divine Bitter · months ago
I would like to see the contrast pic of the two men Joss took at the beginning and uploaded to Twitter be repeated, but with the black man on the bottom. The background at the top of the pic they took was quite dark, and the lack of contrast might have contributed, along with Twitter's weighting bias, to the white face being featured. I don't think Twitter would switch to picking the black face, but it would have helped control for an extra variable.
victor 91 · months ago
Yeah, it's not bad until it comes to crime. Black people to this day are still being arrested over mistaken identity at an alarming rate. So whether it's machine or human, something has to be done!
Huey Freeman · months ago
Best series yet! This channel never disappoints
sor3999 · months ago
I'm surprised the health risk model didn't factor in race, because everything else in health care does. You are likely to be screened for heart disease earlier as a black man because the data shows they tend to get it. Simpler still, women are more likely to get breast cancer, even if a tiny minority of men can actually get it. Lupus is more common in women as well. Health care shouldn't be color blind.
George Willcox · months ago
I feel like a lot of these things aren't the result of anything racist, but of other external factors that end up contributing to it. The example of the hospital algorithm looking at expensive patients, for instance, isn't inherently racist. The issue there should be with the factors that cause minority groups to cost less (i.e. worse access to insurance), not with the software.
CeeSaw · months ago
this was actually a really interesting video! definitely makes me think more deeply about how my biases may affect technology and how they affect me O_O
George Willcox · months ago
Why did you make tea in a saucepan and not a kettle?
Anonymous bub · months ago
7:45 me seeing a little hand wave on the edge of the screen
Vince B · months ago
Thought-provoking video! However, using AI-generated faces is probably the worst thing to do in this case, since whichever model generated the faces would presumably suffer the same systemic bias. There is a reason we bother collecting actual real-world data instead of using simulated data.
Kianoush Keykhosravi · months ago
Wait, the cameraman is in both rooms but they are FaceTiming each other???
dnyalslg · months ago
I just love their props lol
george so___s · months ago
Very good videos. Keep them coming
kithytom007 · months ago
♥ Joss
raquellochoa · months ago
Oh my goodness, the handwashing thing always happens to me and I am brown 🤷🏾‍♀️
Khaled K · months ago
Racism is taught. I would start by questioning a racist person's parents.
Mark Wallace · months ago
So, about the music. I see all music on videos as manipulative. WTF were they trying to make me feel with this stuff? Anyway, good stuff.
Firoj Ahmed · months ago
Let's not forget how light works in a camera. I am a dark-skinned person and I can confirm that lighter skin physically reflects a higher amount of photons, which results in a higher probability of the camera capturing that picture better than that of a darker counterpart. The same goes for computational photography and basic algorithms that are based on the photos we upload. It only makes sense that they would be biased towards white skin. Why does everything have to be taken as an offensive scenario? We are going too far with this political correctness bullshit. Again, I am a person of dark skin and even I think this is bullshit. Now if you frame it as an issue in identifying a person's face for security reasons or such, then yes, I am all for making it better at recognizing all faces. But please, please make this political correctness bullshit stop.
karthick v · months ago
Hello Vox team, I'm personally a big fan of your channel's videos. Thank you for the sensational presentation of various topics, and keep up the great work.
Shion Chosa · months ago
There was a Better Off Ted episode about this. Corporate head office decided to discontinue use of the energy-saving technology to save money.
Matthew Leos · months ago
Joss is my hero! I imagine her voice any time I think through the complex features of life. ☺️
CheesecakeLasagna · months ago
Love the production, especially on the set!
Riley Madison · months ago
This is a really interesting look into machine learning - great job, Glad You Asked team! It stands to reason that there would be bias no matter what, because even if the machine doesn't have any inherent bias or self-interest in focusing on one face over another, people are still feeding information into the machine, and the machine is basing its results on that information. And humans are still flawed beings who bring with them their own personalities, thought patterns, biases, childhood backgrounds, class backgrounds, et cetera. The only solution is to focus on what information we're feeding machines.
Anil Kaundal · months ago
Algorithms are our opinions written in code.
Seth Deegan · months ago
Data scientists and AI researchers, please do your job ethically, without bias.
Rodney Kelly · months ago
At work, I have an IR camera that automatically measures your temperature as you walk into my facility. It is supposed to do this by locking on to the face, then measuring the person's temperature. Needless to say, I want to take a sledgehammer to it. When it actually works, it's with a dark face. The type of face it has the most problems with is a light face. If you also have a bald head, it will never see you.
TheSTRVman · months ago
4:30. Why are you filming and driving!! No, don't read a quote!! JOSS NOOO. *boomp* *beep beep beep*
Pavan Yaragudi · months ago
Joss Fong!❤️🔥
Gustavo Medrano · months ago
Same with medicine. They used to test it on white males, later on white females too, because they found out men and women had different reactions to the same medicine. Now we know black people react differently to medicine compared to other races... so basically the same thing is happening with tech...
diane ridley · months ago
The machines are programmed by people. Racist and bigoted and biased people. We are not at sentient machines yet.
Vance H · months ago
Unity in Diversity
Maya PS · months ago
ooo this is so fascinating
Meredith White · months ago
Algorithms of Oppression is a really good book if you want to learn about racial and gender bias in big tech algorithms, like Google searches. It shows that machines and algorithms are only as objective as we are. It seems like machine learning and algorithms are more like groupthink than objective tools.
Atiqur Rahman · months ago
Machines aren't racist. The outcome feels racist due to bias in the training data. The model needs to be retrained.
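The point in the comment above can be put into a toy model (all numbers made up for illustration; this has nothing to do with any real system in the video): when one group supplies 90% of the training examples, a single cutoff fit to maximize overall accuracy ends up tuned to that group, and error concentrates on the underrepresented one.

```python
import numpy as np

rng = np.random.default_rng(42)

def make_group(pos_center, n):
    """n/2 positive and n/2 negative 1-D examples for one group;
    the two classes sit 2.0 apart, centered per group."""
    x = np.concatenate([rng.normal(pos_center, 0.5, n // 2),
                        rng.normal(pos_center - 2.0, 0.5, n // 2)])
    y = np.concatenate([np.ones(n // 2), np.zeros(n // 2)])
    return x, y

xa, ya = make_group(pos_center=1.0, n=900)  # 90% of the training data
xb, yb = make_group(pos_center=0.0, n=100)  # 10%, with a shifted distribution

x, y = np.concatenate([xa, xb]), np.concatenate([ya, yb])

def fit_threshold(x, y):
    """Pick the single cutoff that maximizes overall training accuracy."""
    candidates = np.linspace(x.min(), x.max(), 400)
    accs = [np.mean((x > t) == y) for t in candidates]
    return candidates[int(np.argmax(accs))]

t = fit_threshold(x, y)
acc_a = np.mean((xa > t) == ya)  # accuracy on the overrepresented group
acc_b = np.mean((xb > t) == yb)  # accuracy on the underrepresented group
# acc_a comes out well above acc_b: the "neutral" threshold learned the
# majority group's distribution.
```

Rebalancing or reweighting the training data (the "retraining" the commenter mentions) moves the cutoff toward something fairer for both groups.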
Jake from StarHeart · months ago
How did y'all de-age Steve Coogan?
alex d · months ago
also wtf shaking hands?? hello 'rona
Sam Lee · months ago
Weapons of Math Destruction by Cathy O'Neil touches hard on this subject: how the biases of the designers or the customers of an algorithm have big negative impacts on society. There seriously needs to be some kind of ethical standard for designing algorithms, but it's so damn hard... :/
Bersl · months ago
There's also the possible issue of "white balance" of the cameras themselves. My understanding is that it's difficult to set this parameter in such a way that it gives acceptable/optimal contrast to both light and dark skin at the same time.
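The contrast intuition in this comment can be put into toy numbers (the RGB values are picked purely for illustration, and gamma is ignored for simplicity): against the same dark backdrop, a lighter patch has far higher Michelson contrast than a darker one, so edge and feature detectors get a much stronger signal.

```python
def luminance(rgb):
    """Relative luminance from 8-bit RGB using Rec. 709 weights
    (gamma ignored for simplicity)."""
    r, g, b = (c / 255 for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def michelson(la, lb):
    """Michelson contrast: (Lmax - Lmin) / (Lmax + Lmin)."""
    return abs(la - lb) / (la + lb)

background = luminance((40, 35, 30))     # dark backdrop
light_tone = luminance((230, 190, 160))  # made-up lighter skin tone
dark_tone = luminance((90, 60, 45))      # made-up darker skin tone

print(round(michelson(light_tone, background), 2))  # 0.69
print(round(michelson(dark_tone, background), 2))   # 0.29
```

A fixed white-balance/exposure setting effectively chooses where on this contrast curve each skin tone lands, which is why one setting struggles to serve both at once.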
will_ press · months ago
How was the M.I.T. lady writing on the glass? Surely she was writing backwards so we can see it?
alex d · months ago
how are you still using skype?
joel Winchester · months ago
1:55 Dàmn 😐👀😐
Christian morales · months ago
What I got out of this is... we need more data?
wj35651 · months ago
18:36 why are they pretending they are talking on a video chat, when they had a crystal clear picture from another camera? Reality and perception, subtle differences.
Vox · months ago
We had a camera crew on each end of our zoom call, since we couldn't travel due to Covid. - Joss
C.L. Brown · months ago
Wow, people shouldn't be viewed as data/numbers. We are all uniquely individual. Software should not be used for calculating/predicting crime and health. We are all too dynamic, especially during different times and places in our lives.
Milena · months ago
Are we stealing Netflix ideas?
tatianazim1 · months ago
As a black person living in North America, this sucks...
Jaryd B · months ago
My range of emotions went from "this is so dumb lol cmon Vox" to "holy fuck Vox thank you for highlighting a story I never would've known about"
GreyOwul · months ago
People seem to be noticing how nicely the professor can write backwards... Fun fact: that's a camera trick! She is actually writing normally (so the original video shows the text backwards), but in editing the video was flipped, making the text appear normal. Notice that she is writing with her left hand, which should only be a 10% chance. Great video btw! I thought the visualization of the machine learning process was extremely clever.
Thus Spoke Moe · months ago
We need more black and brown engineers to work on these machines. And every single person in the US should own their data and have control over it.
Kuldeep Μaurya · months ago
Wait.. how did the soap dispenser differentiate between the two hands???
Noah Johnson · months ago
These videos are so solid! I'm about to graduate college with a degree in sociology, and so far these videos are hitting a ton of the main points that I've learned about over my four years of education.
Sasha Lee · months ago
Another great, interesting video!
Pranav Kakkar · months ago
I missed seeing Joss in videos. Glad she's back.
Zubin Siddharth · months ago
Wait, how was the professor from Princeton able to write in reverse on that glass, so that we could read it straight?
Killian Becker · months ago
This feels like a PBS Kids show, with the set and all!
Syazwan · months ago
16:15 that got me trippin for a second until I realized they probably just mirrored the video so that the writing comes out right and she's not actually writing backwards.
The_void_screams_back · months ago
the level of production on this show is just *chef's kiss*
Sofi · months ago
love this topic!
Shahrukh Shikalgar · months ago
Designers are biased, and so are their machines. Simple as that
Fandy Hadamu · months ago
Thank you
Danu Setia Nugraha · months ago
I never thought about it before. Thanks Vox!
dEcmircEd · months ago
Maybe it was more tech-focused, but it was way more interesting to me than the one about assessing one's own racism, which seemed a bit more frivolous in its sourcing and its overall process. Joss does really great stuff
Yellowsnow69420 · months ago
Wow. Really smart and great guests on this video.
daylightinsomniac · months ago
as always the editing is absolutely superior. keeps me hooked. Thanks for these, Joss
Thomas Mastrogiacomo · months ago
Omg Dawood Khan is really cute! Who else agrees?
Elogene Karl Gallos · months ago
It's been forever since I last saw Joss in a video. I'd almost forgotten how good and well-constructed her videos are.
TheAstronomyDude · months ago
Not enough black people in China. Most of the datasets every algorithm uses were built from CCTV data from Chinese streets and Chinese ID cards.
Michael Fadzai · months ago
So Twitter said they didn't find evidence of racial bias when testing the tool. My opinion is that they were not looking for it in the first place.
terrab1ter4 · months ago
This reminds me of that book, "Weapons of Math Destruction". Great read for anyone interested; it's about these large algorithms which take on a life of their own
Armando René Cantú · months ago
Me at first: "who even asks these questions, seriously?" Me after finishing the video: "Aight, fair point." The hand soap dispenser is a real thing. Straight up
Jordan JJ · months ago
What a thought-provoking episode! That young woman Inioluwa not only knew the underlying problem, she even offered a solution, when she said it should be devs' responsibility to proactively be conscious of those who could be targeted or singled out in a social situation, and do their best to prevent it in advance. She's intelligent, and understands just what needs to be stated in a conflict: a solution. Hats off to her.
Alya Nur Aeni · months ago
I am amazed by the look of the studio. I would love to work there; the atmosphere is just different, unique, and everyone has a place there 😍
ron xavier santos · months ago
Joss talking about Twitter at 10:09, then it went straight to an ad, and you guessed it: Twitter
- · months ago
• 2:58 - Lee was on the right track; it's about machine vision and face detection. One test is to try light and dark faces on light and dark backgrounds. It's a matter of contrast and edge and feature detection. Machines are limited in what they can do for now. Some things might never be improved, like the soap dispenser; if they increase the sensitivity, then it will be leaking soap.
• 8:13 - And what did the search results for "white girls" return? What about "chinese girls"? 🤨 A partial test is useless. ¬_¬
• 9:00 - This is just regular confirmation bias; there aren't many articles about Muslims who… sculpted a statue or made a film.
• 12:34 - Yikes! Hard to deny raw numbers. 🤦
• 12:41 - A.I.s are black boxes; you _can't_ know why they make the "decisions" they make.
• 13:33 - Most of the people who worked on developing these technologies were white (and mostly American). They may or may not have had an inherent bias, but at the very least they used their own data to test stuff at the beginning, while they were still just tinkering around on their own, before moving up to labs with teams and bigger datasets. And cats built the Internet. 🤷
• 14:44 - I can't believe you guys built this thing just for this video. What did you do with it afterwards? 🤔
Andy Salcedo · months ago
I'm sorry Joss, but how did the only two people in this video who actively work in the tech industry, who are building these automated systems, get only a combined 5 minutes on screen? You don't talk to the computer scientists about solutions or even the future of this tech, yet you talk to Dr. Benjamin and Dr. Noble (who don't code) about "implications" and examples in which tech was biased. Very frustrating, as a rising minority data scientist myself, to see this video focus on opinion instead of actually finding out how to fix these algorithms (like the description says). It missed an excellent opportunity to highlight minority data scientists and how they feel building these algorithms.
Robert N · months ago
A takeaway from this is also more proof that racism is taught... AI is neutral but then becomes inherently racist due to how it was taught. Learn to love instead of hate
Croissants & Coffee · months ago
As more and more governments (say China, India, and Middle Eastern countries) employ face recognition tools that use such AI for law enforcement and surveillance, and they're buying said software from Western countries, I wonder how accurate these systems are, seeing as the AI was trained primarily on white faces. Do these AIs then learn "locally", and if so, can this data then be fed back into the original AI to teach it to recognize those ethnicities in Western countries with an ethnically diverse population, like the USA, UK, etc.?
Hope Rock · months ago
While I do think that machines are biased, I think that saying they're racist is an overstatement.
LorentianElite · months ago
I'm a simple man. I see Joss, I click.
Ruchi Yadav · months ago
What do you have to say about Pinterest and Tumblr? 🙄