Siri, Sexism, and Silicon Valley

At this point, I think it’s fair to say that there is nothing intentional in the fact that Siri, Apple’s AI assistant for the iPhone 4S, has a hard time providing information for abortion clinics, birth control, and other reproductive health services. As both Amanda Marcotte and Jill Filipovic have pointed out, Apple relies on external databases for Siri, which often offer faulty or inconclusive information for reproductive health services. What’s more, because of the difficulties inherent to location-based search algorithms, it’s likely that Siri has a bias toward avoiding all but the most unambiguous results, which—given the extent to which abortion providers often demur about their identity—would explain the presence of crisis pregnancy centers in results for abortion-related queries.

As for the responses Siri gives when you say “I was raped” (“Really!”) or “Are you pro-life?” (“I am what I am”), it’s likely that those are canned responses to declarative statements or questions that Siri isn’t equipped to answer. Remember, Siri isn’t really displaying artificial intelligence: It is just a voice recognition front-end hooked to a massive data-combing operation.

I’m going to pause here and say that I am not accusing anyone of active malice toward women. I have no reason to think that Apple treats its female employees with any less respect than the men it employs. The simple fact is that sexism—like racism—is subtle, insidious, and manifests itself in many different ways. Apple might not be an explicitly sexist workplace, but it’s abundantly clear that it’s a place that doesn’t envision women as its “default” consumers.

That Siri gives responses for blowjobs and strippers—but can’t return a query about birth control—has everything to do with the fact that Apple (and Silicon Valley writ large) is a place dominated by men and their preferences. In all likelihood, Siri was developed and optimized by a team of all dudes or mostly dudes. And while they made sure to include things that were gender-neutral (like mental health services), there was no effort to approach Siri from the perspective of a woman user. Indeed, reproductive health is a classic male blind spot—it’s women who are “supposed” to carry the responsibility for contraceptives. Men, in general, get a pass. The problem with Siri isn’t that the programmers hate women; it’s that women weren’t even on the radar.

Given the extent to which women are underrepresented in the tech industry, you could almost say that this—or something like it—was bound to happen. What’s more, we can expect it to happen again. It might not be Apple, but as long as the background sexism of Silicon Valley remains undisturbed—and reinforced by the industry’s illusion of meritocracy—we can assume that some company will do something else to alienate women.

Comments

"At this point, I think it’s fair to say that there is nothing intentional in the fact that Siri, Apple’s AI assistant for the iPhone 4S, has a hard time providing information for abortion clinics, birth control, and other reproductive health services. As both Amanda Marcotte and Jill Filipovic have pointed out, Apple relies on external databases for Siri, which often offer faulty or inconclusive information for reproductive health services. What’s more, because of the difficulties inherent to location-based search algorithms, it’s likely that Siri has a bias toward avoiding all but the most unambiguous results, which — given the extent to which abortion providers often demur about their identity — would explain the presence of crisis pregnancy centers in results for abortion-related queries."

This is just an excuse, and it doesn't even begin to deal with the problem or any other evidence. On what basis can you say it's not intentional? Certainly not on the basis of the supposed skittishness of abortion clinics about labelling themselves accurately. It turns out real clinics do label themselves accurately, and fake clinics use the word abortion to label themselves inaccurately, to deceive women who are looking for abortions.

I just did a Google search for New York City abortion clinics and got unambiguous results that came up with real abortion clinics; my query was simply "New York City abortion clinics."
Actually, the word abortion is often right in their name or description. So to NOT return abortion clinics would mean ignoring the self-evident, which is the word abortion itself.

Actually, though, and very intriguingly, the first fake pregnancy center listed has as its heading "Abortion Options." That's on page 2. So the way fake pregnancy centers get found is by women who are searching using the word abortion. If Siri isn't recognizing the word abortion, how does it come up with a fake clinic with the heading "Abortion Options"? It is misdirection, but how accidental can that be?

