At this point, I think it’s fair to say that there is nothing intentional in the fact that Siri, Apple’s AI assistant for the iPhone 4S, has a hard time providing information for abortion clinics, birth control, and other reproductive health services. As both Amanda Marcotte and Jill Filipovic have pointed out, Apple relies on external databases for Siri, which often offer faulty or inconclusive information for reproductive health services. What’s more, because of the difficulties inherent to location-based search algorithms, it’s likely that Siri has a bias toward avoiding all but the most unambiguous results, which—given the extent to which abortion providers often demur about their identity—would explain the presence of crisis pregnancy centers in results for abortion-related queries.
As for the responses Siri gives when you say “I was raped” (“Really!”) or “Are you pro-life?” (“I am what I am”), it’s likely that those are canned responses to declarative statements or questions that Siri isn’t equipped to answer. Remember, Siri isn’t really displaying artificial intelligence: It is just a voice recognition front-end hooked to a massive data-combing operation.
I’m going to pause here and say that I am not accusing anyone of active malice toward women. I have no reason to think that Apple treats its female employees with any less respect than the men it employs. The simple fact is that sexism—like racism—is subtle, insidious, and manifests itself in many different ways. Apple might not be an explicitly sexist workplace, but it’s abundantly clear that it’s a place that doesn’t envision women as its “default” consumers.
That Siri gives responses for blowjobs and strippers—but can’t return a query about birth control—has everything to do with the fact that Apple (and Silicon Valley writ large) is a place dominated by men and their preferences. In all likelihood, Siri was developed and optimized by a team of all dudes or mostly dudes. And while they made sure to include things that were gender-neutral (like mental health services), there was no effort to approach Siri from the perspective of a woman user. Indeed, reproductive health is a classic male blind spot—it’s women who are “supposed” to carry the responsibility for contraceptives. Men, in general, get a pass. The problem with Siri isn’t that the programmers hate women; it’s that women weren’t even on the radar.
Given the extent to which women are underrepresented in the tech industry, you could almost say that this—or something like it—was bound to happen. What’s more, we can expect it to happen again. It might not be Apple, but as long as the background sexism of Silicon Valley remains undisturbed—and reinforced by the industry’s illusion of meritocracy—we can assume that some company will do something else to alienate women.