Apple’s Siri Won’t Tell You Where To Find Emergency Contraception, But Will Find Viagra


Want to use Apple’s new interactive “Siri” on your iPhone?  “She” has tons of information — just as long as you don’t ask about birth control, abortion, emergency contraception, or even mammograms. 

After experimenting with a few easy reproductive health questions, the Abortioneers discovered that Siri appears to have a blind spot when it comes to women’s health.

Via the Abortioneers:

Q: I am pregnant and do not want to be. Where can I go to get an abortion?

“I’m really sorry about this, but I can’t take any requests right now. Please try again in a little while.”

“Sorry, [my name], I can’t look for places in Tanzania.”

“I don’t see any abortion clinics. Sorry about that.”

Q: I had unprotected sex. Where can I go for emergency contraception?

“Sorry, I couldn’t find any adult retail stores.” This was repeated every time.

Q: I need birth control. Where can I go for birth control?

“I didn’t find any birth control clinics.” [This was repeated every time I asked about birth control, all three times. This is also the answer given when I asked, “What is birth control?”]

But she does have some sage advice. Ask for a crisis pregnancy center (CPC), and she can find one. And in some cases, asking for an abortion clinic will get you a CPC instead, too. Oh, and don’t worry about finding Viagra — Siri’s got that covered.


Follow Robin Marty on Twitter: @robinmarty

  • billfalls

    Robin, did you or the Abortioneers try to find out why Siri responds this way? Is this yet another amusing story about Siri miscommunication, or is something more sinister behind these errors?

    Sounds like a cue for a reporter to ask Apple to comment.

  • lisakaz

    That tells you who programmed Siri. But it also tells you that the company didn’t care, either to check this or to change it. Are they unconcerned or ideologically on board? Don’t know. Wonder what happened when you asked to find a gynecologist.

  • ahunt

    Abortioneers

    Clever… auctioneer, engineer, etc. But do you know about FICTIONEERS? It refers to the writers in the pro-life movement. Look it up.

  • marcustheblade

    This is nonsense. I confirmed the location of abortion clinics on three separate co-workers’ iPhones.

  • crowepps

    Did you duplicate the exact questions as stated in the article?

    Or did you ask something like “Is the Planned Parenthood Clinic still located at 600 West 7th?”

    Obviously, to see if the claims in the article are correct, a person would have to precisely imitate their actions.

  • marcustheblade

    No, I simply asked, “Where is the closest abortion clinic?” and was given the location immediately. So if the intimation was that there was a deliberate attempt to program Siri as a pro-life intelligence, which is how I read this, then that is clearly incorrect.

  • crowepps

    Here’s another article, the most comprehensive I’ve seen, with screenshots.

    http://amaditalks.tumblr.com/post/13513981784/siri

  • prochoiceferret

    Honestly, I don’t think Siri’s responses are a deliberate effort on Apple’s part to tilt ideologically one way or the other.

    For one, as a Silicon Valley company, Apple as an organization (and most of its employees) would tilt liberal if anything. For two, implementing a completely open-ended natural-language interface like this isn’t an easy problem, and the only reasonable way of doing it nowadays is to have Internet sources as the underlying intelligence. You can do some pruning, so that it doesn’t give answers from SomethingAwful.com or the like, but otherwise it’s not really possible to ensure that every possible answer it produces is the one you would want it to give.

    If anything, I think this article shows how putting out a pseudo-AI product like this can turn into a liability for a company. Google’s gotten into similar trouble before, where e.g. an image search for “monkey” (or the like) returned photos of Michelle Obama. But there, it was clear to everyone that computer algorithms were the culprit. Because Siri takes the form of a virtual, disembodied “person,” it’s more akin to a corporate spokesperson, one that can become a headache to its creator when it behaves badly.

  • crowepps

    It’s possible that the well-financed anti-abortion groups are manipulating Google and the other search engines to get a higher priority for their ‘Don’t have an ABORTION’ clinics, although I can’t see any way that would explain Siri’s inability to locate a specific clinic requested by its unique name.

    Siri’s response of “Really?” to “I have been raped” still seems bizarre.

  • ahunt

    http://abortioneers.blogspot.com/

    Apparently, ABORTIONEERS is a legit site… and I kneejerked it.

    Billfalls deserves the benefit of the doubt.

    I blew it, and I sincerely apologize.