What Siri’s Blind Spot on Women’s Health Really Means


This week, blogs erupted with news that Siri had a “blind spot” when it comes to women’s health. Ask Siri, the voice-activated personal assistant on Apple’s newest iPhone, where to get an abortion or where to find emergency contraception, and it typically replies, “Sorry, I don’t see any places matching [your query].” Or worse, as some media outlets reported, Siri directed users only to “crisis pregnancy centers,” unlicensed clinics that don’t actually provide health care; instead, they target women with misinformation and propaganda.

No one owns the Internet, and we tend to assume that no one can control it. But this issue with Siri does show how easily a piece of code can shape our choices by limiting or controlling our options. Siri is amazingly adept at finding what you’re looking for. Ask it for the nearest hardware store, reservations for two at your favorite Italian restaurant, where to buy Viagra, and presto — you’ve got names, maps and phone numbers. Yet the trusty little wizard suddenly gets amnesia when asked about birth control or abortion care.

While this may be nothing more than a programming glitch, it is a modern-day example of the historic struggle women have always faced in getting access to health care and health information. The episode underscores the importance of being vigilant about the availability of information and services, especially critical health information, as new technologies emerge.

Apple’s oversight, however innocent, highlights a threat that none of us should take lightly. The Internet can be a liberating force: it has gone a long way toward dismantling the kind of censorship that once kept information about reproductive health out of people’s reach.

But even as the old barriers fall, technology is erecting new ones that are less visible and more insidious. When search engines shape our knowledge of the world, their blind spots become our blind spots. And when they anticipate our needs — by automatically narrowing search results to reflect our past preferences and interests — they can replace open access with the illusion of open access. Tools that could lead us to new information and insight serve mainly to reinforce our biases.

Apple should fix this immediately. And digital developers need to adopt a new ethic, and a new set of rules, to address this emerging hazard. Meanwhile, the Siri episode should remind us that search engines can hide the truth and propagate misinformation. Abortion care is still safe, legal and accessible. So is birth control, and so is emergency contraception.

So if the wizard on your smartphone is puzzled by questions about women’s health, use the phone’s browser to visit plannedparenthood.org. It’s fully accessible on mobile devices, and it won’t mislead you about where to find the health services you need. The humans who manage it make sure of that.

To schedule an interview, contact director of communications Rachel Perrone at rachel@rhrealitycheck.org.

  • halli620

    This whole uproar is dumb because Siri is entirely unnecessary and does not do anything that the map function or internet searches on Safari do not do. I didn’t even know about Siri when I got the iPhone 4s to replace my ancient flip phone, and I’ve never used it and see no need to ever, unless you’re driving and need to find something right away and can’t pull over to use the map function or Safari search. This is the only scenario where you could possibly need Siri. Otherwise, use the map function and do a search online! Therefore, Siri is in no way “limiting or controlling our options.” To claim this when the same iPhone can help you look up your needed information just by typing it in instead of talking to it (wouldn’t it be more private to type it so people don’t hear you anyway?) is making a mountain out of a molehill.