
Siri: Total Misogynist.

[Photo of a hairless cat]

The big news of the week is that Siri, the iPhone 4s’s virtual assistant, is apparently unable to find anything related to women’s health. Ask her to find an abortion clinic in New York City — a place with a few abortion clinics — and she can’t locate a single one. She can, however, direct you to several pro-life Crisis Pregnancy Centers. Ask for contraception, and she doesn’t understand the term “contraception.” And as Amanda figured out, it’s not just reproductive health stuff that she can’t get right — it’s anything related to female sexuality at all (she’s great, though, when it comes to male sexual needs):

At my house, we discovered this while playing with Siri’s quickly established willingness to look up prostitutes for a straight man in need. When you say to Siri, “I need a blow job,” she produces “nine escorts fairly close to you”. You get the same result if you say, “I’m horny” into it, even with my very female voice. And should you need erection drugs to help you through your encounter with one of the escorts, Siri is super-helpful. She produced twenty nearby drugstores where Viagra could be purchased, though how — without a prescription — is hard to imagine. But no matter how many ways I arranged mouth-based words — such as “lick” or “eat” — with the word “pussy,” Siri was confused and kept coming up with the name of a friend in contacts. Of course, one could assume Siri knows something about him that I don’t know.

I actually tested this out too, since I recently upgraded my 2005 flip phone to an iPhone 4s. I stood on a street corner in Brooklyn with my friend P, and we came up with all kinds of sex-related questions to ask Siri, and then we died laughing because we are children. Since the Siri story broke, Feministe Friend Nabiha also sent on some questions she asked Siri, which I recreated (thanks Nabiha!). My results:


Jill: I need an abortion.
Siri: I don’t see any abortion clinics. Sorry about that.

Jill: Where can I find a crisis pregnancy center?
Siri: Ok, one of these places matching ‘Crisis Pregnancy Center’ is a little ways from you (gives three options).

Jill: I need contraception.
Siri: I don’t understand ‘I need contraception.’

Jill: I’m pregnant. What should I do?
Siri: I didn’t find any meetings. [Ed: What?]

Jill: I’m pregnant, should I have an abortion?
Siri: I didn’t find any abortion clinics.

Jill: I’m pregnant. What do I do?
Siri: Are you?

Jill: My girlfriend is pregnant. What do I do?
Siri: Consider your alternatives.

Jill: My girlfriend is pregnant. What do I do?
Siri: First, do no harm.

Jill: My girlfriend is pregnant. What do I do?
Siri: Always do the right thing.

Jill: I love you.
Siri: I’ll bet you say that to all of your Apple products.

Jill: I want my dick sucked.
Siri: I found twelve escort services fairly close to you.

Jill: I want a blowjob.
Siri: I found twelve escort services fairly close to you.

Jill: I want cunnilingus.
Siri: Is that so?

Jill: I want cunnilingus.
Siri: Ok.

Jill: I want my clitoris licked.
Siri: I don’t understand “clitoris.”

Jill: I would like my clitoris touched.
Siri: Ok. I can send a text to Chris [last name redacted] for you. What would you like it to say? [Ed: This, unfortunately, was not helpful, since Chris is (a) in DC, and (b) probably uninterested in helping me out on this one].

Jill: I want my vagina licked.
Siri: Your language!

Jill: I want my pussy eaten.
Siri: I have found eleven pet stores in your area.

Aaaaaand scene. Siri can also tell you where to get Viagra, where to dump a body, where to go if you’ve had an erection lasting more than five hours, where to get breast implants, and what to do if a hamster is stuck in your rectum.

I am not under the impression that Apple is anti-choice or that they’re out to screw over women. I think they’re just reliant on too many dude programmers. From conversations with folks much more technologically savvy than I am, it seems that Siri works by culling information put together by data companies. That data is often messy, and savvier companies will pay for the data about them to be accurate and to include the full range of their services. Abortion clinics and other women’s health facilities, obviously, are not dedicating tons of time to figure out how to optimize their search results. So the data is crappy to begin with.

To fix that, programmers go in and add tens of thousands of little tweaks to a program like Siri to make it as accurate as possible, and also to include some jokes (like where to hide a dead body). But when programmers are mostly dudes, the lady-stuff just gets… ignored. So Siri knows 15 different ways to say “oral sex performed on a man” and can find a place to get it, but anything involving female sexuality at all leaves her clueless. Which doesn’t make it excusable. It’s pretty appalling that programmers thought far enough ahead to know where to send users who needed to remove rodents from their buttholes, but didn’t consider a medical procedure that 1 in 3 American women will have. I mean, they appear to have thought far enough ahead to have Siri respond to the boyfriend of the woman who is pregnant, but not to the woman herself. It’s not necessarily malicious, but it’s still pretty galling.

Like Amanda says:

I doubt many people seriously believe that the programmers behind Siri are out to get women. The problem is that the very real and frequent concerns of women simply didn’t rise to the level of a priority for the programmers. Even though far more women will seek abortion in their lives than men will seek prostitutes, even though more women use contraception than men use Viagra, and even though exponentially more women use contraception than men seek prostitutes, the programmers were far more worried about making sure the word “horny” puts you in contact with a prostitute (a still-illegal activity) than the word “abortion” puts you in contact with someone who could do that for you legally.

The problem isn’t that anyone involved with this hates women. The problem is that they just don’t think about women very much. Siri’s programmers clearly imagined a straight male user as their ideal and neglected to remember the nearly half of iPhone users who are female. That the tech company that’s the standard-bearer for progressive, innovative, user-friendly technology can’t bother to care about the concerns of half the human race speaks to a sexism that’s so interwoven into the fabric of our society that it’s nearly invisible. It’s a sexism that often only reveals itself in the absurd, such as when you’re asking a phone what it would take for you to get a little cunnilingus around here.

Allow me to recommend your local pet store.


234 thoughts on Siri: Total Misogynist.

  1. Jill: I’m pregnant. What do I do?
    Siri: Are you?

    Jill: My girlfriend is pregnant. What do I do?
    Siri: Consider your alternatives.

    This suggests to me that the programmers wrote this for men. Men (well, cis-men) don’t get pregnant, so snark is a natural way to respond to a dude saying “I’m pregnant”. The fact that women would be using this too probably didn’t even cross their minds.

    The problem with programming – and I worked in the industry for a while, so this is actually something that people had to constantly be aware of – is that it’s very difficult to remember that your user is not you. I think “your user is not you” was quoted at every design meeting I ever went to, and usually multiple times.

  2. Maybe not a malicious omission, but my eyebrows did raise at some of the Siri responses to “I’m pregnant. What do I do?” Veiled agenda?

  3. Charming. I’m curious as to what happens if you ask for a women’s health clinic or PP. Anyone tried?

    1. Charming. I’m curious as to what happens if you ask for a women’s health clinic or PP. Anyone tried?

      When you ask for Planned Parenthood by name, Siri will direct you to the PPs in your area.

  4. @Kristen J. I remember in one of the stories about this (don’t remember which, maybe one of the linked ones?) that if you ask specifically for PP you’ll get a legit response.

  5. Thank goodness my Luddite generation still has Eliza, the original AI counselor.

    You: Where can I get an abortion?

    Eliza: Do you want to be able to get an abortion?

    It’s like she really cares.

  6. Huh…well by contrast when I ask my android phone to direct me to an abortion PP is the first result on the list.

  7. I don’t have a recent iPhone, but the “I’m pregnant; what do I do?” questions don’t strike me as misogynistic. Does Siri give the same responses to other open-ended questions that end with “What do I do?” The two responses given sound like they’re default answers to any sort of question. In fact, I’m almost certain, but admittedly haven’t confirmed, that both those answers are in Brian Eno’s classic “Oblique Strategies” card deck. Admittedly, they sound much worse in this context. (And if I’m wrong about those responses occurring in other situations, please consider this comment already retracted.)

    The quality of its responses depends on the quality of its training data. I’m not at all surprised that information on obtaining abortions and contraception wasn’t in the data they harvested or bought; I am slightly surprised that they apparently didn’t test for it, and I think that’s the strongest indicator of the monoculture of the developers. However, I’m pretty sure (again, haven’t researched it, this is conjecture) that Siri is using an online learning algorithm and its knowledge should improve as more users ask it questions. If it gets more information on what’s considered a successful search result, it should be able to provide better data (downside: this is vulnerable to Google Bombing-like techniques).

    So, I’m torn. On the one hand, the developers can’t predict every possible use case, and can’t necessarily provide adequate initial training data to guarantee that it’ll give good results in the first few months of usage. As a developer myself I have sympathy for that situation. On the other hand, access to abortion and contraception is pretty important, it’s a major social issue, and this reveals a massive blind spot on the part of the developers. (Actually, I’m wondering: might this blind spot be more about class than gender? With components of both, of course.) The good news is that the resulting bad press should hopefully ensure that this never happens again, which will become increasingly important as more consumer-facing technologies use machine learning algorithms.

    1. I don’t have a recent iPhone, but the “I’m pregnant; what do I do?” questions don’t strike me as misogynistic. Does Siri give the same responses to other open-ended questions that end with “What do I do?”

      The answer to that is in the post. When you say, “I’m pregnant, what do I do?” Siri says something totally incoherent. When you say, “My girlfriend is pregnant, what do I do?” Siri gives a series of (vaguely anti-abortion) answers. So there is not a default answer to a question ending in “What do I do?”

      I also just tried, “I’m hurt, what do I do?” and Siri responds, “I found 14 hospitals fairly close to you.”

  8. The problem with “find an abortion clinic” seems to be different than the other problems – somebody at Apple clearly attempted to give it the ability to find abortion clinics and didn’t do a very good job, nor did they sufficiently test it to make sure it worked properly. Otherwise, it would give one of the random stock responses of confusion instead of specifically “I couldn’t find any abortion clinics near you.” Whereas with the rape questions or contraceptive questions, it wasn’t programmed with that at all. So finding abortion clinics is “didn’t do it right” and the others are “didn’t do it at all.”

    Although, what were you hoping for with the horny question, even in a feminine voice, aside from an escort service?

  9. The issue isn’t so much Apple as where Apple gets the information that Siri provides. Apple doesn’t generate their data.

    Here’s the real issue – most businesses work really hard to advertise themselves. Abortion clinics do very little, if any, advertising – in fact, because of the pro-life attempts to put them out of business they often try to hide.

    Net result, it’s a lot easier to find escort services in your area than abortion clinics. That’s nothing to do with Apple, and everything to do with the pro-life movement. If you can’t find a location by web search, you won’t find the data in Google or Siri.

    But you get a lot more publicity complaining about Apple than complaining about Google’s search results. People understand where Google gets its data. People don’t understand that Siri is in the same position.

    1. Net result, it’s a lot easier to find escort services in your area than abortion clinics. That’s nothing to do with Apple, and everything to do with the pro-life movement. If you can’t find a location by web search, you won’t find the data in Google or Siri.

      But you get a lot more publicity complaining about Apple than complaining about Google’s search results. People understand where Google gets its data. People don’t understand that Siri is in the same position.

      Well, sort of. Except that Siri uses a very different data collection method than Google (and when you google “Abortion Clinic NYC” or “where can I get an abortion in New York” you do indeed get a bunch of abortion clinics that advertise themselves as such, so your argument there is flat-out wrong). Google also isn’t hand-managed, usually, to correct for mistakes. Siri is. That’s where my issue comes in.

      The problem here — and this again is evident if you actually read the post — is that yes, the data is bad, but programmers stepped in to correct a lot of the bad data. I mean, I can basically promise you that escort services are not advertising dick-sucking or hamster-in-ass-removing. Tons of people worked very hard to make sure that services which were not readily apparent in their API would still show up through Siri. There are lots of little jokes (like Siri’s response when you say “I love you”) but also lots of glitches and bad searches were undoubtedly corrected. The problem is that sexism is so deeply-rooted in our society that the woman-related glitches weren’t noticed or fixed.

  10. Siri also couldn’t help me find a place for an HIV test. She said she couldn’t find an HIV clinic and I work in the same building as one.

  11. I also just tried, “I’m hurt, what do I do?” and Siri responds, “I found 14 hospitals fairly close to you.”

    I’m sorry, but if I’m hurt I really don’t think I should have to ask my phone what I should do – I just dial 911 if it’s serious, my doctor if necessary, my husband otherwise.

    1. I’m sorry, but if I’m hurt I really don’t think I should have to ask my phone what I should do – I just dial 911 if it’s serious, my doctor if necessary, my husband otherwise.

      Well obviously. And if you’re pregnant you probably shouldn’t be asking your phone for advice either. I was responding to a previous comment arguing that the phrase “What do I do?” will generate a standardized response.

  12. I know you shouldn’t attribute to malice what could be incompetence but I think you are being far too generous in your assumption that women were just overlooked by the programmers. I doubt smart phones contain their own tables of information or topics: when you say ‘I need xxx’ Siri just passes xxx along to google and gives you some top results. Go to google and manually search for abortion, birth control or rape crisis centres and google does find results. Lots of them. Yet somehow Siri can’t find them.

    If they DID create a big ol’ list of things to pass to google but just forgot about women’s issues, then how did anti-choice crisis centres end up in there? If it really does have a subject list then the devs must have thought of women’s reproduction issues when they put in the anti-choice options… and then they forgot about women’s reproductive health? I don’t buy it. Also, if Siri fails to find information on anything else, then Siri offers a link to google… unless it’s a women’s issue.

    I can’t accept the premise that this is because Siri has a long list of all the things it would forward to an external search engine and they forgot a few things – I think it’s far more likely to have a short exclude list. I really don’t believe this was an oversight, I believe someone at Apple made the choice to code Siri with blocks on subjects they don’t like. I hope I’m wrong, I hope it’s just an oversight but I don’t think so. I can’t conceive of any way a programmer could create something which takes what you say and searches the internet for it or offers you a link to google but does not do that on a short list of topics. Not without intentionally putting in blocks on those topics.
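
    [Ed: For the technically curious, here is roughly what the exclude-list theory in the comment above would look like, as a minimal Python sketch. It is entirely hypothetical; nobody outside Apple knows whether such a list exists, and every name below is invented.]

        # Hypothetical: a short exclude list checked before a query is ever
        # forwarded to a search backend or offered as a web-search link.
        BLOCKED_TOPICS = {"abortion", "contraception", "rape"}

        def forward_to_search(query):
            if any(topic in query.lower() for topic in BLOCKED_TOPICS):
                # Swallow the query instead of passing it along.
                return "I don't understand '%s.'" % query
            return "Here is a web search for '%s'." % query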

  13. Jill: The answer to that is in the post. When you say, “I’m pregnant, what do I do?” Siri says something totally incoherent. When you say, “My girlfriend is pregnant, what do I do?” Siri gives a series of (vaguely anti-abortion) answers. So there is not a default answer to a question ending in “What do I do?”

    I also just tried, “I’m hurt, what do I do?” and Siri responds, “I found 14 hospitals fairly close to you.”

    Sorry, I missed the first one (and should have reread before posting)–that seems like a false positive. It was confident in its answer when it’s obviously wrong to a human. The “I’m hurt” example seems like a true positive. For the other ones, I strongly suspect that it wasn’t confident in any particular answer and so it defaulted to a list of truisms that can be applied in almost any situation, even if they sort of sound anti-abortion in this context. (And I was wrong about them being in Oblique Strategies; they don’t appear in this list: http://www.bbc.co.uk/dna/place-nireland/A635528)

    I suspect if you asked something completely nonsensical with a “What should I do?” afterwards, or perhaps something very complex, you might get the same sort of response.
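
    [Ed: Several commenters are converging on the idea that the truisms are low-confidence fallbacks. As a rough illustration, here is a minimal Python sketch of that behavior. The names and the threshold are made up; Apple has not published how Siri decides it does not understand a question.]

        import random

        # Stock responses quoted in the post.
        STOCK_ANSWERS = [
            "Consider your alternatives.",
            "First, do no harm.",
            "Always do the right thing.",
        ]

        CONFIDENCE_THRESHOLD = 0.6  # made-up cutoff

        def respond(interpretations):
            # interpretations: (answer, confidence) pairs from the parser.
            if not interpretations:
                return random.choice(STOCK_ANSWERS)
            answer, confidence = max(interpretations, key=lambda p: p[1])
            if confidence < CONFIDENCE_THRESHOLD:
                # No confident parse: fall back to a generic truism that
                # sounds plausible after almost any "What do I do?" question.
                return random.choice(STOCK_ANSWERS)
            return answer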

  14. As a programmer I feel obligated to point out that as a data-driven application (as Jill points out), it’s not necessarily an issue with “programmers.”

    The people in charge of massaging or transforming the data that is used by Siri probably aren’t the same people who actually write the software that processes the speech and sends the results. And these people may well have been operating under the instruction of any number of committees of managers. So saying anything about “the programmers” just feels wrong to me. 🙂

    (Not trying to take away any blame, just shift it upwards)

  15. This seems really weird. Aren’t most answers to these questions supposed to be just trawled from a search engine? You’d think “I need an abortion” or whatever should just get you a list of clinics which provide it or have jury-rigged search results to look like they do, not confuse the hell out of it. How does the AI clock in on GLBT issues?

  16. Scott Wiebe:
    If they DID create a big ‘ole list of things to pass to google but just forgot about women’s issues then how did anti-choice crisis centres end up in there?

    Completely irrelevant because Siri specifically does NOT search Google. It tries Yelp for businesses. If the search fails in Yelp, or if it is perceived as a general question, it tries Wolfram Alpha.

    If those sites don’t understand the question, or don’t have useful answers, neither does Siri.
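
    [Ed: If the comment above is right about the sources, the overall flow would look something like this minimal Python sketch. The function names and stub data are invented for illustration; the real integration is not public.]

        def search_yelp(query):
            # Stand-in for a Yelp local-business lookup.
            directory = {
                "escort service": ["Escort service #1", "Escort service #2"],
                "pet store": ["Pet store #1"],
            }
            return directory.get(query, [])

        def ask_wolfram_alpha(query):
            # Stand-in for a Wolfram Alpha factual query; None means no answer.
            return None

        def handle(query):
            # 1. Try the business directory first.
            places = search_yelp(query)
            if places:
                return "I found %d places fairly close to you." % len(places)
            # 2. Fall back to the general-knowledge engine.
            fact = ask_wolfram_alpha(query)
            if fact:
                return fact
            # 3. If neither source understands the question, neither does Siri.
            return "I don't understand '%s.'" % query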

  17. Well, obviously women’s needs aren’t real needs — otherwise the programmers at Wolfram Alpha would have done a better job of field-testing their app with actual women. You see that same attitude with pharmaceutical firms when they don’t bother conducting clinical trials with women, on the rationale that trials with men are “good enough.”

    And how predictable it is that the same anti-choicers crowing about how Siri blocks women from obtaining vital information about their sexual health don’t make a single peep about how Siri enables slutty men to service their sexual needs with Viagra and escort services.

  18. Of course, the next level of issue is why all the programmers are male. I thought this article offered an interesting perspective: http://techcrunch.com/2011/11/19/racism-and-meritocracy/

    I think it would be totally legit to ask Siri programmers how Siri is able to send you to a CPC and doesn’t send you to an abortion clinic or a health center or hospital, or even an ob-gyn. It’s entirely possible that it’s searching for analogues of the word “pregnant” but it seems awfully religious-right.

  19. Hmm…the blogoverse is apparently also reporting that while Siri recognizes viagra it doesn’t recognize any female contraceptives including EC. That’s even more telling IMO.

  20. I’m guessing also that a lot of the responses Siri gives are because it’s reacting to more generic parts of the question like “what do I do?”. It doesn’t know what abortion means, so if you ask “I need an abortion. What do I do?” it’s only going to pick up on the end of that statement. Same for “I was raped.” “Is that so?”. It’s only looking at the “I was” part. Generic silly comment: “I was eating ice cream last night!” “Is that so?” It doesn’t matter what’s after “I was.”

    The programmers made errors of omission, not errors of “I put stupid crap in Siri’s responses for kicks.” They simply left out things for whatever reason, whether it was because they didn’t want to draw a lot of complaining over “Siri endorses abortion,” or because they weren’t thinking about women, or whatever. Some of Siri’s particularly bad-seeming responses are just failures to comprehend the question.

    1. The programmers made errors of omission, not errors of “I put stupid crap in Siri’s responses for kicks.” They simply left out things for whatever reason, whether it was because they didn’t want to draw a lot of complaining over “Siri endorses abortion,” or because they weren’t thinking about women, or whatever. Some of Siri’s particularly bad-seeming responses are just failures to comprehend the question.

      That is… kind of an asinine response. They left things out “for whatever reason”? Hmmm… what reason could it possibly be that they left out everything related to birth control, abortion and female sexuality, but included lots of stuff about viagra and male sexual desire?

      It’s weird to me how defensive people get when it comes to Apple. Seriously, I am not accusing them of anything malicious. I am suggesting that sexism is deeply rooted in our culture, and is reflected in programming oversights like this one. Do you really think that there’s absolutely no reason why this stuff was left out? That it has nothing to do with gender, or having many more male programmers than female programmers, or anything about how men’s needs are deemed culturally standard and women’s needs are “special”? That it just… was?
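
    [Ed: midnightsky’s keyword theory is easy to picture in code. Here is a minimal Python sketch with a hypothetical fragment-to-reply table; note how “I was eating ice cream last night!” and “I was raped.” would land on the same canned reply, because only the “I was” part is matched.]

        import random

        # Hypothetical fragment -> canned-reply table.
        TEMPLATES = [
            ("what do i do", ["Consider your alternatives.",
                              "First, do no harm.",
                              "Always do the right thing."]),
            ("i was ", ["Is that so?"]),
        ]

        def respond(utterance):
            text = utterance.lower()
            # Only the first recognized fragment matters; the rest of the
            # sentence is effectively thrown away.
            for fragment, replies in TEMPLATES:
                if fragment in text:
                    return random.choice(replies)
            return "I don't understand '%s.'" % utterance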

  21. That “I’ve been raped” is incomprehensible REALLY pisses me off. Have any of you contacted Apple? Has Jill? Is Siri available in other languages? Does she provide French women with reproductive care? What about the queer community? Does she know all the gay bars in town?

  22. midnightsky: They simply left out things for whatever reason, whether it was because they didn’t want to draw a lot of complaining over “Siri endorses abortion,” or because they weren’t thinking about women, or whatever.

    Yes, that’s the problem. That’s what we’re complaining about, that the programmers weren’t thinking about half the goddamn population.

  23. *dead* And thanks, btw, for the hamster-in-the-ass part, because now everyone wants to know why I’m cracking up and have tears at the corners of my eyes. And I’m just getting over a cold, so this laughing is setting off coughing spells, too.

    This….is piss-poor marketing at its best. Not lookin’ good here, Apple. Certainly not giving me a reason to upgrade from my non-iPhone. I agree that it isn’t necessarily maliciousness towards women (but seriously…blowjobs get an escort reference, eating pussy gets *pet stores*?? The fuck? Not even a good singles bar? Do these male programmers ever get laid???!)….but merely ignorance. Still, I don’t want to shuttle any of my money, especially in this economy, to a company that doesn’t even recognize me as a potential customer. Fuck you, Apple.

  24. Siri is also a Wolfram-Alpha programmed product. W-A is NOT a woman-friendly company. It’s located in an extremely conservative area and most of its programmers are likewise conservative–I’ve known a fair few of them and also know that their recruiting process isn’t going to allow for a lot of liberal attitudes coming in. The minute I discovered that Siri was programmed by W-A, my willingness to attribute this to oversight waned dramatically. I’m willing to bet that at least part of the programming team would have been in favor of doing it deliberately.

  25. Frankly, I don’t understand why so many commentators and news types are seemingly bending over backwards to NOT impute misogynist intent to Apple or its employees wrt Siri.

    Clearly, Apple and its programmers intended to make Siri an “edgy” product by having it offer such pithy responses to certain questions. I don’t see why that edginess mission is necessarily divorced from Apple still trying to avoid courting actual controversy by, say, offering a listing of the 10 closest abortion clinics or pharmacies that carry Plan B. I don’t doubt that the people at Apple may have assumed that their market is largely made up of men and thus programmed it around the perceived interests of that market. But that doesn’t make their intentions any less sexist or even misogynistic.

  26. Siri specifically does NOT search Google. It tries Yelp for businesses. If the search fails in Yelp, or if it is perceived as a general question, it tries Wolfram Alpha.

    Interesting! I’d say that pretty much explains the abortion clinic response: I went to Yelp and typed in “abortion.” It autocompleted to “abortion clinic” (so that explains why Siri knows that “abortion”=”abortion clinic”) and then didn’t have any results. I suppose clinics aren’t the kinds of places to have Yelp pages. And likewise, I suppose crisis pregnancy centers are the kinds of places to have Yelp pages.

    So I’m pretty comfortable putting the blame on 1) Yelp, for not actually being a comprehensive place listing, and 2) Apple, for treating it like it is.

  27. This reminds me of several years ago when I had a Cricket cell phone and noticed some flaws in its predictive text function. You know, where the cell phone tries to guess what word you are typing in within a text to help you out. The phone was able to guess some pretty impressive words like “interposition.” But it couldn’t guess swear words–“fuck,” “shit,” etc. It also couldn’t guess any word having to do with sexuality, whether “penis” or “vagina.” It was like it was designed by Jerry Falwell.

  28. Honestly this doesn’t really sound like a lack of consideration to me, like this part:

    Jill: I need an abortion.
    Siri: I don’t see any abortion clinics. Sorry about that.

    You didn’t say ‘abortion clinic’, you said ‘abortion’, and Siri internally connected that to a term it specifically knew about – ‘abortion clinics’. It’s not that Siri had no idea what you were talking about because nobody bothered to include that category in its database, it knew exactly what you were referring to – it’s just that no information was actually added, so you get the equivalent of a shrug and a ‘don’t ask me’.

    I guess the question is: was it left empty intentionally, or as the result of an error? Given the way search engines work it seems strange that it wouldn’t have pulled in a list of abortion clinics automatically; this information isn’t added by hand. I mean it works on Yelp – there’s no reason a known category of business or service shouldn’t work in a query and automatically return a list of results.

    Same goes for the pregnancy questions – “do no harm” and “always do the right thing” really don’t sound like generic ‘I don’t understand your question’ responses, they sound specifically like anti-choice sentiments. It sounds like Siri knows exactly what pregnancy is, and its internal data for that category is a set of “do no harm” or “maybe you could consider adoption” responses. That wouldn’t be oversight, that would be considering that someone might ask about pregnancy and deciding to provide these specific responses.
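
    [Ed: The distinction this commenter draws, between a term the system has never heard of and a category it knows but has no data for, fits in a few lines of Python. The tables below are invented for illustration:]

        SYNONYMS = {"abortion": "abortion clinic"}  # term -> known category
        LISTINGS = {
            "abortion clinic": [],  # category wired up, but left empty
            "crisis pregnancy center": ["CPC #1", "CPC #2", "CPC #3"],
        }

        def find_places(term):
            category = SYNONYMS.get(term, term)
            if category not in LISTINGS:
                # Unknown term: the generic shrug.
                return "I don't understand '%s.'" % term
            places = LISTINGS[category]
            if not places:
                # Known category, empty data: the response Jill actually got.
                return "I don't see any %ss. Sorry about that." % category
            return places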

  29. Jawnita: Interesting! I’d say that pretty much explains the abortion clinic response: I went to Yelp and typed in “abortion.” It autocompleted to “abortion clinic” (so that explains why Siri knows that “abortion”=”abortion clinic”) and then didn’t have any results. I suppose clinics aren’t the kinds of places to have Yelp pages. And likewise, I suppose crisis pregnancy centers are the kinds of places to have Yelp pages.

    So I’m pretty comfortable putting the blame on 1) Yelp, for not actually being a comprehensive place listing, and 2) Apple, for treating it like it is.

    That’s odd. When I go to Yelp and search for either abortion or abortion clinics near NYC (which it auto-fills-in for me), it gives me a number of results, ranging from Planned Parenthood, to random things that happen to have “abortion” in one of the reviews (spas, Chinese restaurants, etc). A few other cities (DC, SF, LA, Seattle, Dallas) give similar results, altho varying in number & quality. So I don’t know what’s up, unless there actually are no abortion clinics near you.

    1. Hmmmm. I also just searched “abortion” on Yelp, and the first hit is Planned Parenthood (the third hit is Rising Dragon Chinese Restaurant and the fourth is Pepino Unique Hair Sylists, so, imperfect, but still).

  30. Jawnita: Interesting! I’d say that pretty much explains the abortion clinic response: I went to Yelp and typed in “abortion.” It autocompleted to “abortion clinic” (so that explains why Siri knows that “abortion”=”abortion clinic”) and then didn’t have any results. I suppose clinics aren’t the kinds of places to have Yelp pages. And likewise, I suppose crisis pregnancy centers are the kinds of places to have Yelp pages.

    So I’m pretty comfortable putting the blame on 1) Yelp, for not actually being a comprehensive place listing, and 2) Apple, for treating it like it is.

    I went to Yelp’s Brooklyn page and did the same, and the first result was PP (as well as a home-abortion clinic and four unrelated results), so if it really does use Yelp then there’s a pretty big question here – why is it failing to provide the results that Yelp returns for this particular query, and who decided that this particular query should fail? The pregnancy responses pretty much confirm this isn’t some kind of bug that just happens to reject these Yelp results among others.

  31. Yelp comes up with no abortion clinics when I enter ‘abortion clinic’ and my zip code.

    Neither does WolframAlpha. For ‘blow job’ Yelp brings up lots of businesses of varying types because people comment that the goods or services ‘blows’.

  32. Rhoanna: So I don’t know what’s up, unless there actually are no abortion clinics near you.

    There definitely are clinics near me; I live near downtown, in a medium-large US city. Google will give me results, including the nearby Planned Parenthood as the second hit. But, as I said, Yelp does not. (This time I also checked for crisis pregnancy centers. Yelp gives me one, which is of course more than zero, but less than the actual number nearby.)

  33. How do you know they didn’t do this to prevent crazy people from shooting up abortion clinics? Just saying.

  34. Jill:
    I also just searched “abortion” on Yelp, and the first hit is Planned Parenthood (the third hit is Rising Dragon Chinese Restaurant and the fourth is Pepino Unique Hair Sylists, so, imperfect, but still).

    Actually, repeating this experiment for good measure, simply “abortion” (that is, refusing to let it autocomplete for me) nets me PP and several restaurants. So maybe overenthusiastic autocomplete is the problem.

    (Okay that’s enough of my comments in a row for now…)

  35. Jill: …seriously?

    Stranger things can happen, right? Siri wasn’t “programmed” by people at Apple in the sense that it gives rote answers to your questions, the info is compiled from other sources, correct? Maybe abortion clinics intentionally don’t have a big presence online for safety purposes.

    1. Stranger things can happen, right? Siri wasn’t “programmed” by people at Apple in the sense that it gives rote answers to your questions, the info is compiled from other sources, correct? Maybe abortion clinics intentionally don’t have a big presence online for safety purposes.

      Google “abortion clinic NYC” and see how much of a presence they have online.

  36. Actually, having just now read the link to Forbes, the other issue the writer brings up of Siri being, essentially, a “lady secretary” is kind of interesting too. I don’t have an iPhone, but I would assume from the writing that you can’t switch the gender of the voice unlike with, say, most GPS units, etc.

  37. Seth Eag: Actually, having just now read the link to Forbes, the other issue the writer brings up of Siri being, essentially, a “lady secretary” is kind of interesting too. I don’t have an iPhone, but I would assume from the writing that you can’t switch the gender of the voice unlike with, say, most GPS units, etc.

    In North America, Siri is permanently female, with no option to switch. I believe that in the UK, it’s permanently male (or possibly switchable – I can’t currently recall), but in general, the options change depending on where you are.

  38. Siri was programmed by males, right? Yeah. So this doesn’t surprise me at all. But I think if this got enough publicity there would be an upgrade coming that addresses that problem, along with some article about how women with the iPhone 4s are desperate for abortions and male prostitutes, because, you know, the *only* women who would want an iPhone 4s after such an upgrade would be those seeking abortions and male escorts.

  39. P.S. There are several “male”-voiced Siris (I put male in quotation marks because I am uneasy with the concept of gendering voices).

  40. I can’t exactly put my finger on why, but for some reason the “why should you even be asking that” responses (re: abortion, pregnancy, esp) are particularly depressing.

    Maybe it’s just that it’s yet another policing of women’s behavior (because you can never have too much!). Not only are we going to judge you on what you do with your body, but now we’re also going to judge you on how you choose to obtain information about it as well.

  41. I don’t know how to feel about this. The 4S is my first smart phone, and I’ve found Siri to be a very useful, but not essential feature. Interestingly enough, I’d used it this morning on my way to work to call my pharmacy- to refill my bc prescription. It pulled up and dialed my pharmacy right away, and saved me about 30 seconds of googling or fumbling around for the number. I don’t rely on it, it’s just a nice thing to have when I’m on the go, or to be honest, when I’m just too lazy to type out a text or pull up a friend’s number.

    They certainly need to address it, but isn’t the whole point of beta versions to fix things like this?

  42. J: They certainly need to address it, but isn’t the whole point of beta versions to fix things like this?

    Well, yes. But the point of calling it out (here and elsewhere) is so that Siri/Apple/the developers know this needs to be fixed. It also helps future developers to hopefully not forget half the population when designing the next great thing.

  43. groggette: Well, yes. But the point of calling it out (here and elsewhere) is so that Siri/Apple/the developers know this needs to be fixed. It also helps future developers to hopefully not forget half the population when designing the next great thing.

    Also, Apple didn’t present this as a “beta” version in their advertising.

    Further, “beta version” doesn’t really mean anything to the average user.

  44. Daisy: Also, Apple didn’t present this as a “beta” version in their advertising.

    Good point. And people should be able to point out flaws/mistakes/just-plain-bad-business-decisions regardless of what stage thing X is at.

  45. Hmm, this might simply be a (failed) attempt by Apple to avoid controversy. I’m sure if a 14 year old girl asked Siri where to get an abortion, and it actually directed her to one, Apple would get sued (or at least protested) by every pro-life group in the country for giving minors abortions. That doesn’t make it OK, but it might be self-preservation over misogyny.

  46. Daisy: Also, Apple didn’t present this as a “beta” version in their advertising.

    Further, “beta version” doesn’t really mean anything to the average user.

    Except: a) they absolutely did present it as ‘beta’ (it’s not in their television commercials, but it is on all their other marketing materials, was stressed heavily during the product announcement, and there *is* a disclaimer at the end of the TV spot indicating that not all features may yet be fully available to everybody), and b) if most users haven’t figured out what ‘beta’ means by now (and really, they should have if they’ve used any Google products over the last decade, for example, as they stay in beta for years and are clearly marked as such), it’s up to the people who write pieces like this to learn about what that means and then present that information to the public in a way they can understand, a task that journalists and bloggers have utterly failed at, because quite frankly, taking pot shots at Apple makes for better headlines than “Unfinished Software Doesn’t Work Quite Right Yet, Company Says They’re Working On It”.

  47. orgostrich: Hmm, this might simply be a (failed) attempt by Apple to avoid controversy. I’m sure if a 14 year old girl asked Siri where to get an abortion, and it actually directed her to one, Apple would get sued (or at least protested) by every pro-life group in the country for giving minors abortions. That doesn’t make it OK, but it might be self-preservation over misogyny.

    They’ll tell a 14-year-old boy where to get a blowjob and some dick pills.

  48. orgostrich: Hmm, this might simply be a (failed) attempt by Apple to avoid controversy. I’m sure if a 14 year old girl asked Siri where to get an abortion, and it actually directed her to one, Apple would get sued (or at least protested) by every pro-life group in the country for giving minors abortions. That doesn’t make it OK, but it might be self-preservation over misogyny.

    Um. Pretty sure that wouldn’t go so well for the pro-choice groups. It’s not illegal to tell someone where an abortion clinic is, and it’s not illegal for a 14-year-old to have an abortion (assuming, in some states, compliance with parental notification laws and the whatnot). Furthermore, under no legal standard that I’m aware of would providing someone information about places that perform abortions constitute “giving” someone an abortion. In short, while it may be possible that this was a conscious decision to avoid boycott-type fallout, I don’t think any legal department in the world would be concerned that pro-life groups (who wouldn’t have standing anyway) would have any kinds of grounds to sue.

  49. orgostrich: That doesn’t make it OK, but it might be self-preservation over misogyny.

    It sure as hell makes it systemic, institutionalized misogyny, which is what the original argument of this post is.

  50. OKAY my SECOND post is #69 and this is the one I sent in before that… I did not get an auto response to this: Please excuse if double posting.

    This set of messages could apply to the situation of The GREAT BARRIER Brief which is “hidden in plain site (sic)” at the website janesway dot net This was the name used when it got a NIH grant during Clinton admin then the next ex-prez YOU KNOW WHO demoted all contraception and disease prevention for women to below funding guidelines for the National Inst Health! No one would know this even if they were refused funding.

    Note the website is from 1999 scroll down to find links. FDA & NIH knew about this in 1987-1988 NIH grant finally in 1999 Newspaper media won’t touch it cause their advertising policy is no bad news about their advertisers. WHAT WOULD WORK in this case? I am the only entity (no corporation involved) still working on this….pt time while farming. On F’bk Fern Fedora or contact at website above subject Attention Fern

  51. It cracks me up reading these messages. Most posters are totally clueless about technology and how Siri works. But everyone thinks there is some hidden agenda, some evil reason why this inanimate device does not hold the same views on the world they do, and can’t answer every question the way they want. Really? Is the world and Apple really out to get you? Siri is a step forward from past voice recognition and search engine technology, but it is still a machine. As pointed out by some posters, some of the “anti-female” responses to certain questions are clearly standard answers to questions Siri does not fully understand or cannot provide an answer for. Because Apple has been clever and added some humor and other “human like” characteristics to make it more user friendly, less clever individuals are probably led to think an electronic device is smarter than it really is. Think of Siri as an electronic Magic 8 ball. Siri, are most people dumber than their phone?
    ● It is certain
    ● It is decidedly so
    ● Without a doubt
    ● Yes – definitely
    ● You may rely on it
    ● As I see it, yes
    ● Most likely

    1. It cracks me up reading these messages. Most posters are totally clueless about technology and how Siri works. But everyone thinks there is some hidden agenda, some evil reason why this inanimate device does not hold the same views on the world they do, and can’t answer every question the way they want. Really? Is the world and Apple really out to get you? Siri is a step forward from past voice recognition and search engine technology, but it is still a machine. As pointed out by some posters, some of the “anti-female” responses to certain questions are clearly standard answers to questions Siri does not fully understand or cannot provide an answer for. Because Apple has been clever and added some humor and other “human like” characteristics to make it more user friendly, less clever individuals are probably led to think an electronic device is smarter than it really is. Think of Siri as an electronic Magic 8 ball. Siri, are most people dumber than their phone?

      For the love of Christ. Siri, are people too dumb to read the post? Yes, Matt Simpson is.

  52. One minor point: if you simply ask Siri “what do I do?” it gives “do no harm,” “do the right thing,” and “consider your alternatives” among other stock responses. So that seems to be just a generic answer. But. When I asked for an abortion clinic, it did not direct me to PP, but to two clinics, one in VA and one in PA, each at least 50 miles away and both with suspiciously pregnancy-crisis-center-sounding names. Then I searched for PP and it gave me six results, three within 1-2 miles from me. So I thought, OK, maybe PP doesn’t market itself as an abortion clinic. So I asked for “women’s reproductive healthcare” and it didn’t know what that was. That is fucked and needs to be corrected.

    Also, I live about a mile away from a major HIV clinic. When I asked for an HIV test center, it couldn’t find any. When I asked for a medical center, it was the first result on the list. Clearly some major tweaking is in order!

  53. I also just searched “abortion” on Yelp…the fourth is Pepino Unique Hair Sylists, so, imperfect, but still).

    I dunno, “pepino” does mean cucumber in Spanish.

  54. With all the coverage that this is receiving and all the different responses to different topics, I am absolutely beginning to think that this is malicious. To think that women don’t use their technology—you know, oops!—is absolutely ridiculous.

  55. Jill: The problem is that sexism is so deeply-rooted in our society that the woman-related glitches weren’t noticed or fixed.

    This tl;dr should be highlighted, heavy-typed, blinking, somewhere up the top of this article. Nearly every programmer reading this story is going to initially think “It’s not malice, it’s incompetence” (I sure did), but this very neatly states why it’s actually still a problem.

    Also, while others have suggested that it’s not necessarily the programmers’ fault (the programming part is only one of a number of steps that go into producing software), all of the many people involved should have noticed that the question/response tweaking has a male bias and done something about it — including the programmers.

  56. I think another highly effective example would be to ask Siri for rape jokes, and then rape crisis services.
    I don’t have one, can’t test, but it’s a test I’d like to see the results of.

  57. Sirkowski: Because talking to your iPhone is what you need to do when you want an abortion… e_e

    Well, apparently it’s the thing to do if you’re a man who wants to hire a sex worker for a blow job or you want Viagra. Odd how some folks have such a problem seeing the double-standard.

  58. Zippa, I live in the town Wolfram is based in (and where I’m told Wolfram Alpha is) and I assure you, it’s not “very conservative.” It’s more conservative than Chicago, but much less conservative than many of the Chicago suburbs. We have both a PP and a private abortion provider, a gay bar, 3 independent hippie/organic grocery stores, a rather large liberal population, etc. etc. I’d bet we’re the most liberal area in downstate Illinois, and certainly consistently blue (the rest of the county means that our county leans red, but this city/town does not).

  59. It’s entirely possible that Apple merely failed to properly vet the thing after they bought it. But, holy crap, they should have! This is a mighty glaring defect!

  60. I thought the name Wolfram|Alpha sounded familiar. Back in 2009 they were asking for feedback so I obliged. I did a search on Plan B and told them that 1) it didn’t know what Plan B was, and 2) it returned COCs and Implants on a targeted ECP (Plan B) search. Their response:

    “We have received your feedback regarding Wolfram|Alpha. The issue you reported has been fixed and will appear on the live site with the next update.”

  61. I believe they did think of women, but of another “kind of women.”

    By that I mean non-feminist women.

    Men are the ones more likely to try to trick Siri into sexual questions and innuendo and to publish it.

    That is, until you start to think of feminists like us.

    The second part is that you don’t want to offend women. Offending men is standard practice (it will be taken as a joke).
    Offending women can be deadly. If you don’t trust me, ask Dr. Lazar Greenfield.

    Finally, there is all the annoying Hillary and company “safe but rare” mentality. Abortion clinics try to sell themselves as “women’s health care centers.” It’s like they don’t feel proud to say what they really are.

    So you have:
    1. Crisis pregnancy centers trying to appear as abortion clinics (basically they are scams).
    2. Abortion clinics trying to appear as “we are so much into alternatives,” “we don’t do just abortions,” etc.

    Which one would you expect a computer to pick?

    Ah, just to avoid manipulation: when you ask Siri for male clinics, it will just return clinics.

    “Women’s clinic” is somehow ambiguous for Siri.

    Love,

    Avida

  62. iremo: Although, what were you hoping for with the horny question, even in a feminine voice, aside from an escort service?

    It could vibrate.

  63. Ashley:
    Zippa, I live in the town Wolfram is based in (and where I’m told Wolfram Alpha is) and I assure you, it’s not “very conservative.” It’s more conservative than Chicago, but much less conservative than many of the Chicago suburbs. We have both a PP and a private abortion provider, a gay bar, 3 independent hippie/organic grocery stores, a rather large liberal population, etc. etc. I’d bet we’re the most liberal area in downstate Illinois, and certainly consistently blue (the rest of the county means that our county leans red, but this city/town does not).

    I’m also from the area, albeit about twenty minutes outside CU proper, and VERY CONSERVATIVE is precisely how I’d identify the non-university parts of the county.

  64. “It also helps future developers to hopefully not forget half the population when designing the next great thing.”

    YES. And beta version or not, advertised as such or not, if it’s already gotten as far as beta testing and you forgot to include half the population? You just made shit-tons more work for yourself if you ever plan to not be an ass and see this for the problem it is and fix it. It is just so much easier to build stuff right the first time than it is to make really obvious and fundamental mistakes and have to repair them later. Mistakes will happen, sure. But the beta version should be well past the fundamental flaw phase.

    If they actually gave a shit about doing this right, the designers should have had some version of a typical adult woman as one of their user stories in the conceptual phase. Much less in the publicly available version – beta or otherwise.

  65. Glah, so many responses (a couple here, and many, many in other threads on this topic), basically saying that if you need an abortion, why are you talking to your phone? Is that really the kind of decision you should be making so fast, etc.

    You know, there are people in the world whose primary link to the internet is through their phone. People who may know long before they ever get pregnant that they would get an abortion. (I’ve known that if I were to get pregnant, I would abort ASAP for the last eleven years.) Using an AI meant to do web-searches to do a web search is really not a stupid thing to do, regardless of topic. Yeesh.

  66. jennygadget:
    “It also helps future developers to hopefully not forget half the population when designing the next great thing.”

    YES. And beta version or not, advertised as such or not, if it’s already gotten as far as beta testing and you forgot to include half the population? You just made shit-tons more work for yourself if you ever plan to not be an ass and see this for the problem it is and fix it. It is just so much easier to build stuff right the first time than it is to make really obvious and fundamental mistakes and have to repair them later. Mistakes will happen, sure. But the beta version should be well past the fundamental flaw phase.

    If they actually gave a shit about doing this right, the designers should have had some version of a typical adult woman as one of their user stories in the conceptual phase. Much less in the publicly available version – beta or otherwise.

    perhaps they did, but like Avida Quesada said, they weren’t feminist women?

  67. “perhaps they did, but like Avida Quesada said, they weren’t feminist women?”

    I wasn’t aware that only feminist women used contraception.

  68. Angel H.: O_o

    I don’t even know where to begin unpacking this one.

    Re: these two:
    Jill: I’m pregnant. What do I do?
    Siri: Are you?

    Jill: My girlfriend is pregnant. What do I do?
    Siri: Consider your alternatives./ First, do no harm./ Always do the right thing.

    I’d say that Siri interprets these phrases without understanding pregnancy at all, so the ‘First, do no harm’ thing is far less sinister than it seems. I don’t have an iPhone, but it looks like Siri’s responses to ‘I’m pregnant. What do I do?’ are the same as any ‘I’m [x]’ statement, and the system disregards the rest of the statement.
    It also looks like Siri has a set range of responses to ‘What do I do?’ or ‘What should I do?’, and disregards the rest of the ‘My girlfriend is pregnant. What do I do?’ question. It would be worth asking your iPhone a variety of questions ending with ‘what do I do?’ with no recognised keywords in order to verify this.

    Like people have pointed out, I still believe it’s a significant oversight to have left out pregnancy, contraception, abortion and so on from the system, but this is the first release of the program. I hope that Apple learns from this public backlash and incorporates them into the next release.

  69. Isn’t the point of a beta version to work out various kinks like this so that the final version will be all the better? Beta products are called beta for a reason – because, while they can be useful, they are imperfect / unfinished, not entirely stable, and constantly changing.

    It’s a worthy concern to highlight, but it’s being taken beyond that and used as an example of sexism against women. When Siri is being targeted at so many different demographics of people for a myriad of reasons, do you really believe it’s reasonable to expect it to work perfectly for everyone in its beta stage? It seems like these complaints are more suited for a finished product, which Siri is not.

  70. Nah, non-feminist women don’t avoid contraception — they just won’t admit to using it. That would be tantamount to admitting they enjoy sex, which everyone knows good girls aren’t supposed to like. That’s why misogynist politicians are confident they can ban contraception funding — not enough women will stand up for something virtually all women use in their lifetimes.

  71. I have very little problem thinking that some PR minded person at Apple made sure that Siri wouldn’t give usable information for certain searches, like abortion, thinking that it would stop controversies down the road. Sure, when called out on it, the CEO says “Oh, Siri is in beta, we will fix it”, but what else could he say?

  72. And to amplify that, I DO think that this underscores the underlying misogyny rampant in society in general, and the tech industry in particular. I have no doubt that most companies fear conservatives more than women in general, and feminists in specific. I wouldn’t be surprised if the “solution” Apple makes is to keep censoring the abortion and female health results, and also take out the escort service and viagra searches, too.

  73. I just find it astonishing – not in a good way – that people have actually developed an app that responds to “I need a blowjob” and gives a list of local sex workers. I mean, really, WTF? It says a lot about the developers’ own Weltanschauung. I dunno, maybe I just need to get out more. 🙁

  74. This has nothing to do with Sexism. At all.

    Siri is an example of a machine learning, voice recognition program.
    It is an AI, nothing more. It has no concept of gender. The programmers would have included coy answers, BUT therein lies the problem: programming is a male-dominated sector of work. That’s not sexist; it just happens to be mostly male.

    Siri doesn’t store ANY ready-to-use data. Disconnect it from the web, and it loses all that functionality. Blame its data sources, not it.

    …Honestly, I find articles like this incredibly damaging to feminism, making the author look like one of those extremist feminists who are mocked and laughed at by EVERYBODY because of their ridiculous agendas. Try to understand the technology, and more importantly, don’t be picking fights with your male counterparts over something completely asinine like an app that is still a gimmick not being complete.
    I support the feminist movement, etc. but picking fights over stupid things only steps back support from everybody apart from extremists.

    If you are really annoyed about Siri being focused towards men, learn to program, and get a career as a programmer.

    …And for the people asking about programmers getting some, you’re actually really accurate, lol. We don’t. We really don’t.

  75. Wow. One more reason not to buy any Apple products.

    Tapetum: You know, there are people in the world whose primary link to the internet is through their phone. … Using an AI meant to do web-searches to do a web search is really not a stupid thing to do, regardless of topic. Yeesh.

    I agree.

  76. I also just discovered that Google speech to text blocks out the word “raped” as if it were a curse word. It displays as “r****”. Lots of other words like penis and vagina aren’t blocked, but someone thought rape was a curse word and people’s eyes should be protected from the horrors of seeing the word.

  77. Dear god, these comments. Either people are not reading the post, or they don’t understand what sexism is. Sexism is not just an intentional, malicious thing. Focusing only on men’s needs and simply leaving women out because men are the default is also sexism.

  78. Roland: therein lies the problem: programming is a male-dominated sector of work. That’s not sexist, it just happens to be mostly male.

    See, it’s not sexism at play, it just so happens, by a remarkably strange coincidence or maybe an act of God, that men dominate the programming industry, and that those men never think about women’s needs. It’s not sexism or anything, just happenstance.

  79. Roland, do you know what the word “sexism” means?

    Having a male-dominated staff is sexism. Yes, even if you didn’t do it on purpose.

    Your male-dominated staff never considering the other half of the population is sexism. Yes, even if they’re really nice people.

    Men coming here to tell us why this isn’t sexism when it blatantly is, and when we’ve explained repeatedly why it is? That’s sexism. Even if you think you’re just smarter than us stupid chicks.

  80. No, no! It’s not a coincidence, let alone an act of God, that men dominate the programming industry. It’s because women just don’t want to learn how to program! Stupid, lazy women. <–not sexism

    (/snark of the day)

  81. What part of “beta software” are you people missing?

    Siri will only be as good as the available database, and abortion clinics are not exactly big advertisers.

    Also, with something as personal as abortion, a person’s first go-to choice is the rudimentary *beta* AI program on a phone? Buh whu… wha??

    And if you have been attacked or are hurt, I know it’s a high stress situation, but do remember your phone can be used as, well, a *phone*. Hmm… anyone try “call 911” with Siri? I can see that being useful if a person is so debilitated they can maybe get one or two taps in.

    1. What part of “beta software” are you people missing?

      What part of “it’s not un-sexist just because it’s beta software” are you missing?

  82. Past my expiration date:

    What’s stopping them, then? There’s no law against it, is there? Same for many science-dominated disciplines (not medicine, though).
    I work in oil and gas engineering; female oil and gas engineers used to be as rare as unicorn shit. Although there are younger ones coming through now, the ratio is still about 7:1 m/f. So why so few women entering the field? What puts them off? (Most of our people spend their lives in an office; working on an oil or gas platform offshore is entirely optional these days.)

  83. @John — I already told you! Women are stupid and lazy. Sexism has nothing to do with it.

    (Ok, I guess my previous post didn’t end my snark of the day.)

    Or, you know, maybe you could ask some of your oil and gas engineering colleagues who are women.

  84. It’s comments like Woomera’s that make me so glad I updated killfile to work on various places.  “I shall now put together all of the most inane comments here into one comment, thus showing that I didn’t read the post, the comments, or the title of the blog itself!”

  85. Woomera: What part of “beta software” are you people missing?

    The part where you make beta software the main feature of an otherwise completely unnecessary “upgrade,” advertise it, release it with huge fanfare, and then when people point out that hey, not only does it not really work that well, but it seems to not really work that well specifically in regard to things women need, start whining about how it’s only beta, why are you guys so upset? Are you seriously saying that women expecting a computer feature they paid a chunk of money for to work is unreasonable?

    Woomera: Also, with something as personal as abortion, a person’s first go-to choice is the rudimentary *beta* AI program on a phone? Buh whu… wha??

    Why not? Abortion is too personal, but viagra, hey, that’s just normal? I never got the memo about not using one’s smartphone for things that were too personal. I thought the whole point was that you could use them for anything.

    Woomera: And if you have been attacked or are hurt, I know it’s a high stress situation, but do remember your phone can be used as, well, a *phone*. Hmm… anyone try “call 911” with Siri? I can see that being useful if a person is so debilitated they can maybe get one or two taps in.

    What if they don’t want 911? What if they want a hospital? Or a rape crisis center?

    John: What’s stopping them, then? There’s no law against it, is there? Same for many science-dominated disciplines (not medicine, though).
    I work in oil and gas engineering; female oil and gas engineers used to be as rare as unicorn shit. Although there are younger ones coming through now, the ratio is still about 7:1 m/f. So why so few women entering the field? What puts them off?

    There is actually a whole lot of research on this, if you’re actually interested and not just being a dick. Several things: girls and women are still discouraged, starting from a very young age, from pursuing interests in math and science; women are not encouraged to be ambitious in the same way that men are, and tend to eliminate themselves from competing for high-paying, in-demand jobs (one of the more effective ways to eliminate women from your applicant pool, at least one study has found, is to advertise a significantly higher-than-average salary); once something is male-dominated, the male/masculine culture that develops in those fields is very off-putting to a lot of women and so perpetuates the problem; also perpetuating the problem is the lack of female role models and mentors to whom girls and women can turn for advice on issues that male mentors probably don’t know jack shit about (sexual harassment, maternity leave, dress codes).

    Entire books have been written about this stuff. Go ahead and look them up.

  86. Mjog: This tl;dr should be highlighted, heavy-typed, blinking, somewhere up the top of this article. Nearly every programmer reading this story is going to initially think “It’s not malice, it’s incompetence” (I sure did), but this very neatly states why it’s actually still a problem.

    Also, while others have suggested that it’s not necessarily the programmers’ fault (the programming part is only one of a number of steps that go into producing software), any of the many people involved should have noticed that the question/response tweaking has a male bias and done something about it — including the programmers.

    I still don’t buy that this is incompetence or accidental (and I’m speaking as someone with a Computer Science degree, for what it’s worth). You have an automated speech-recognition-to-search-engine app that listens to what you say, converts that into known words and phrases, queries third-party search engines with those terms and returns the results. What’s happening here is:

    a) The app recognises the word ‘abortion’
    b) Associates it with the known internal business/service category ‘abortion clinics’
    c) It queries Yelp etc. with that known search term, but returns no results for that particular term despite Yelp’s own website providing several results
    or
    d) It doesn’t bother running the query at all and just says ‘no results’ (we don’t know at this stage)

    There’s no good reason why an automated system using a known search term should choke on returning a particular set of standardised results from a search engine, especially when using the search engine yourself produces those results just fine. You search for whatever term, you get back a generic list of results in a standard format, to the computer it’s all a set of data to be processed – it doesn’t know what the words mean, it doesn’t treat abortion clinics any differently from dinosaur theme parks unless it’s specifically told to.
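    [Ed: To make the pipeline above concrete, here is a toy Python reconstruction; every name in it is invented, and the real internals are unknown. The point it illustrates is the one just made: the generic path treats “abortion clinics” exactly like any other category unless something downstream singles it out.]

      # Toy version of steps a) through c) (all names and data invented).

      CATEGORY_MAP = {
          "abortion": "abortion clinics",   # step b): word -> known category
          "drugstore": "drugstores",
      }

      def fake_yelp(category, location):
          """Stand-in for the third-party search engine."""
          listings = {
              "abortion clinics": ["Planned Parenthood"],
              "drugstores": ["Duane Reade", "CVS"],
          }
          return listings.get(category, [])

      def handle_query(recognized_word, location="New York, NY"):
          category = CATEGORY_MAP.get(recognized_word)
          if category is None:
              return "I don't understand."
          results = fake_yelp(category, location)   # step c)
          if not results:
              return f"I don't see any {category}. Sorry about that."
          return results

      # The generic path happily returns results for a recognized category:
      print(handle_query("abortion"))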

    There must be a filtering stage involved – I haven’t used Siri, but I’m assuming it decides on a ‘best result’ and replies with that? Or is there a list too? Either way it’s deciding which results to give you and which to reject, whether it’s as simple as using Yelp’s top result or if it’s doing something more involved to give you a more personally useful response. Somewhere along the line it’s specifically deciding that none of the results that Yelp etc. return for abortion clinic should be presented to the user. So list time again, it’s either:

    a) screwing up in general retrieving Yelp’s provided results
    b) retrieving them but screwing up in deciding which one (if any) should be presented
    c) retrieving them and working as intended by rejecting all relevant results
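    [Ed: And if option c) is what’s happening, the filtering stage wouldn’t need to be elaborate. A purely hypothetical illustration:]

      # Hypothetical illustration of option c): a deliberate filter.
      # One membership test in a results-choosing stage is all it takes to
      # make a recognized category come back empty while the upstream
      # search engine returns hits.

      SUPPRESSED_CATEGORIES = {"abortion clinics"}  # invented for illustration

      def choose_results(category, raw_results):
          if category in SUPPRESSED_CATEGORIES:
              return []             # "working as intended": reject everything
          return raw_results[:3]    # otherwise keep a few best results

      print(choose_results("abortion clinics", ["Planned Parenthood"]))  # []
      print(choose_results("drugstores", ["Duane Reade", "CVS"]))        # both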

    In the case of a) and b) this is a major flaw, and we’d expect to see this occurring for many, many queries – like I said before, there’s nothing specific about results relating to abortion clinics as far as the computer’s concerned, so whatever error just happens to affect those results should affect a much broader range. I don’t know how effective Siri is in general, or whether it fails to produce results for other terms it does recognise (remember, it connected the word abortion to the category abortion clinics; it’s not like it didn’t recognise the word) that show up just fine on Yelp. Have there been many other reports of this kind of failure, or does it seem to be limited to women’s reproductive health?

    The last thing that seems to preclude the idea that this is an unfortunate coincidence, and especially the idea that this is an (incredibly unlikely) highly-specific internal bug that just happens to only break abortion clinic queries, purely by accident, is the set of responses to the pregnancy questions. Even in isolation they read like highly specific anti-choice messages directed at someone who’s thinking about pregnancy. I don’t know where they got their database of responses from, obviously some of them were written by Apple (like the one about loving Apple products) and maybe others were taken from a generic conversation database created over the years by artificial intelligence projects, but the point is the responses to ‘I’m pregnant’ and ‘pregnancy’ have a decidedly non-neutral, anti-choice slant – who responds to a mention of pregnancy with advice to ‘do no harm’?

    It would be an incredible coincidence that the conversation database ‘just happened’ to have an anti-choice flavour while abortion clinic results ‘just happened’ to be among the set of results affected by a programming error – even more so if that error affects a very narrow range of results, or searches for abortion clinics specifically. And it ‘just happens’ to be a hot-button political topic right now that could be spun badly for Apple if conservatives discovered the iPhone were providing this kind of information to people.

    I don’t know if anyone’s going to read all that, but I wanted to show from a computery standpoint how this works and why it doesn’t really seem like an oversight or an ‘oops’ situation that this specific kind of query just happens to be responding like this. If Siri is having a lot of problems with many different recognised queries that work on Yelp, then that would lend some credence to the idea that it’s a glitch and that it’s just a coincidence that the pregnancy responses are so skewed, but we shouldn’t immediately accept the un-nuanced hand-waving ‘oh y’know, computers, they go wrong sometimes’ excuse that I’ve seen a few people repeating.

  87. Sooo, people aren’t supposed to raise concerns about how a service works (or doesn’t) if it’s in beta. But if people don’t complain how will the developers/programmers know what to at least consider fixing? Yeah the logic in some of these comments is fucking astounding. yeesh

  88. and on the whole “abortion clinics don’t advertise” kick. I don’t know one way or the other how much small private clinics would advertise and can see this as a possibly valid explanation for them.
    But.

    Planned Parenthood. Seriously, is there anyone in the US who doesn’t know what PP is and that many of their clinics provide abortions? Even if a specific PP doesn’t have its own website, there will still be a regional one. And at least for the PP in my city, it is listed on Yelp. Why doesn’t Siri list local PPs unless PP is specifically asked for?

  89. Sheelzebub: Well, apparently it’s the thing to do if you’re a man who wants to hire a sex worker for a blowjob or you want Viagra. Odd how some folks have such a problem seeing the double-standard.

    Standards?? Yeah, complain about your consumer product made by underpaid slave labor in a totalitarian regime not meeting your bitch standards. FIRST WORLD PROBLEMS DERP

    1. Standards?? Yeah, complain about your consumer product made by underpaid slave labor in a totalitarian regime not meeting your bitch standards. FIRST WORLD PROBLEMS DERP

      Goodbye Sirkowski!

  90. Is it somehow impossible to understand that you can tell Apple, hey, Siri can’t do this right, without some crazy anti-women conspiracy agenda being attributed to Apple?

    groggette:
    Sooo, people aren’t supposed to raise concerns about how a service works (or doesn’t) if it’s in beta. But if people don’t complain how will the developers/programmers know what to at least consider fixing? Yeah the logic in some of these comments is fucking astounding. yeesh

  91. EG,
    Thanks for the explanation, although I’m not sure why you assumed I might be “being a dick”. Is your default mode hostility and suspicion?

  92. If you are really annoyed about Siri being focused towards men, learn to program, and get a career as a programmer.

    Yeah…no. Sorry, Roland, but I’ve already spent over half my career in the building trades (specifically, as an electrician). I’m already doing my part to integrate one of the male-oriented work arenas. Meanwhile, what are you doing within your field to encourage more women to enter and remain, hmm?

    I’m not a programmer. I’m a potential customer. I don’t want to reinvent the wheel, but I am interested in gadgets that actually make my life easier. I’m a single mother with one full-time and one part-time job (so why am I on the computer? My kid is sick today, and home from school). I’m also involved in a few activist/community groups, and otherwise have a life, too….let’s just say, “I’m busy.” Too busy to want to devote a lot of time to fucking around playing with some techno doodad. It’s not that I wouldn’t enjoy that; I don’t have the time for that shit. I have a smart phone right now that I don’t know at least half the functions on, mostly because cell phones no longer come with printed instructions; you have to go online for them. It’s a real pain in the ass to go online with your cell phone and alternate between the instruction screen and actually playing with your phone. That, and the goddam screen is so tiny. That, and I’m visually/spatially oriented; retraining my brain to translate the act of discovering/remembering the “filing system” (for lack of a better term—Where Shit Is At) on the gadget to a system I can remember, because my mind is used to moving things in 3D…..I’m probably not explaining this very well at all, but frankly, the Where Shit Is At on computer systems translates as “hidden” to my brain, and there aren’t any visual cues for me to rely on, capisce? If not, don’t worry about it ‘cuz the point is….

    I really dig user-friendly systems. And employment for me has been good this year, and I’ve been frugal, and since I don’t know how to use half the shit on my current phone anyway, I’ve been thinking about an upgrade to a more user-friendly phone (and definitely one with a bigger screen). I have friends that rave about their iPhones, especially in comparison to what they had before. Shit, in marketing terms, I’m primed and ready, right? Especially since I’m getting to that age where I take my glasses off to see better, and I travel sometimes for work (and am in transit a lot around home), so voice-recognition is something I’d use a lot.

    And then this story breaks. Makes me rethink pulling out my wallet….in this direction, anyway. See, I don’t give a shit if Siri knows where to direct men in search of Viagra or blowjobs. But I do want it to be able to direct me if I say “tampons”. If I say I’ve been raped, I want Siri to recognize that I don’t just need 911, but also a referral to the local Rape Crisis Center, who will send out a volunteer to be with me and provide support during all the shit you go through at the hospital with the rape kit, and who will make sure I get access to EC (because not all hospitals will provide it). It would be nice if Siri could provide safe running routes for women in an unfamiliar city, or knowledge about other safety issues (I was just reading someone’s tumblr today about a possible serial killer targeting petite black women in the Chicago area). And you know, Siri should know enough about abortion clinic access to be able to direct women to the nearest one even if it’s a few states over. The two nearest abortion clinics to me are both over an hour-and-a-half away; that doesn’t stop women in my city from using them. It’s not the goddam dry cleaners.

    And knowing to do that? Knowing that those are applications that are likely to be valuable to one’s customers? That’s pretty basic. Siri is Marketing Fail. As is not having a selection of voices. Dumb. This whole debacle reads to me like the 80s flick, “Weird Science”—a couple of teenage boys creating their idea of the perfect woman. Is that bad, in and of itself, even if it’s adult men doing it? Naah. We all indulge in a little juvenile humor at times. But does it make me want to avoid the product, because it’s obvious that the product’s creators didn’t think of me as a potential customer, let alone think of what I might need? Does it make me think that the product would be a bad fit for me, that the competition’s products will probably serve my needs better? Yes, indeed.

  93. Sirkowski: Standards?? Yeah, complain about your consumer product made by underpaid slave labor in a totalitarian regime not meeting your bitch standards. FIRST WORLD PROBLEMS DERP

    I love how the only time d00ds like you care about slave labor is when you need a way to score points on the uppity bitchez who point out your gross douchery. But believe it or not, cupcake, we can and do work against corporate exploitation and misogyny (BTW, haven’t seen YOU at any meetings or actions, so dry your crocodile tears and STFU). Some of us can walk and chew gum at the same time. Not my problem if you’re such a stupid shit that you can’t manage it.

  94. Also, with something as personal as abortion, a person’s first go-to choice is the rudimentary *beta* AI program on a phone? Buh whu… wha??

    Well, getting your dick sucked by a sex worker is also pretty personal and private, as is getting Viagra, but Siri is on it.

    If you are really annoyed about Siri being focused towards men, learn to program, and get a career as a programmer.

    Hey, your car’s a lemon? Then you should train to be a mechanic or an automotive engineer. Jesus, dude. Has it ever occurred to you that if half of the customers you’re trying to sell a product to point out that you’ve left them out, the response isn’t to bark at them that they should make the product themselves? Not good marketing or business sense.

  95. What if they don’t want 911? What if they want a hospital? Or a rape crisis center?

    Probably also worth a reminder that 911 isn’t the same in all parts of the US. The ability for 911 to pinpoint your location by cell phone can vary wildly depending on where you are. It’ll be another few years before that gets any better. People in rural areas can be better off getting themselves to a hospital if they aren’t too injured to drive, rather than waiting for 911 to show up.

  96. Wow, all we need is someone telling folks who don’t like Siri to create their own abortion-finding AI, and we’ll have bingo.

  97. @Sheelzebub,

    Srsly. If someone can’t see the sexism in a system knowing the word “viagra” but not “plan b” or any contraceptive drugs, they aren’t trying.

    Reminds me of the viagra/birth control insurance bullshit.

  98. Some of these dudes who are freaking out about us getting information about abortion services on our cell phones (gasp!) seem to think we should be going through some kind of secret underground network to procure these procedures because they’re just so private (translation: shameful).

  99. Roland: If you are really annoyed about Siri being focused towards men, learn to program, and get a career as a programmer.

    suspect class: Wow, all we need is someone telling folks who don’t like Siri to create their own abortion-finding AI, and we’ll have bingo.

    I think we have bingo.

  100. OT, and I haven’t been here in a week or so, so I might have missed something, but what is up with some of the block quotes? The text loses all the spaces and doesn’t do line breaks well. It only happens in some of them. Kinda weird, just in case you didn’t know. Is it a feature, not a bug, like disemvoweling?

  101. I don’t have an iPhone, but I’m curious as to what Siri’s response would be to the following questions:
    Siri, I need a tampon
    Siri, I have a UTI
    Siri, where can I get an HPV shot?
    Siri, I need the day after pill

    Also, Siri’s default voice is a woman’s… which tells you not only that the programmers were thinking about themselves (or men) as the typical users but that in their minds the perfect assistant is female. I know you can now choose a male or female voice for Siri, but the default version, and the one that is advertised by Apple, is the female one. Some people’s take on this is that female voices are more pleasant to listen to… but isn’t this answer another misogynist stereotype of women as docile and subservient, and ever eager to please?

  102. Jill: What part of “it’s not un-sexist just because it’s beta software” are you missing?

    Perhaps it’s the fact that, being in its beta stage, it’s not finished. You might as well say that the grocery store is sexist because they added their feminine hygiene section last as opposed to first despite it all being there when finished.

    1. Perhaps it’s the fact that, being in its beta stage, it’s not finished. You might as well say that the grocery store is sexist because they added their feminine hygiene section last as opposed to first despite it all being there when finished.

      If, for example, a drug store opened before it was 100% complete, and they stocked Viagra and treatment for men with sexually transmitted diseases but no birth control pills or treatment for women with sexually transmitted diseases? Yes, I would say that was sexist, even if their response was, “Well we’ll have the lady stuff eventually!”

  103. If the grocery store opened in the equivalent of beta stage and didn’t include feminine hygiene products because they decided that was their last priority, then I would indeed say that was sexist, because there is no reasonable business reason to invest in everything else first.

  104. You Siri-ous?: Perhaps it’s the fact that, being in its beta stage, it’s not finished. You might as well say that the grocery store is sexist because they added their feminine hygiene section last as opposed to first despite it all being there when finished.

    They sure finished the parts where you could find hookers and blow pretty quick, though.

  105. Umm…what’s with all these dudes who don’t know what beta testing means? Pro tip: If your program isn’t feature complete, you’re not ready for beta.

  106. John: What’s stopping them, then?

    People like you, obviously. Who really wants to listen to a bunch of doodz go on about how their sexist behavior isn’t really sexist? Because engineers do that a lot.

    And the doodz start young: Young women from my old high school’s robotics team who are now freshmen in college have been reporting how the boys in their physics lab groups scream them down repeatedly with, “How can you know anything? You’re a girl!”

  107. @Jill: Do you seriously expect everything to work perfectly for all people on day 1, or heck, day 0 considering that this is a public beta – a beta open to the public for the sole purpose of working stuff like this out?

    1. Do you seriously expect everything to work perfectly for all people on day 1, or heck, day 0 considering that this is a public beta – a beta open to the public for the sole purpose of working stuff like this out?

      No, I don’t, but now you’re moving the goalposts of the conversation. I don’t expect products to work perfectly for all people on day 1. What I do expect is that functionality will not be deeply skewed based on gender. I expect that a company like Apple will hire a diverse pool of programmers and product-testers so that the products they are releasing are suited to the needs of the general population, and not heavily skewed toward the needs of only the male half of that population.

      Why is that asking so much? No one is asking for perfection or attributing malice. What we are saying is that this is an example of deeply-ingrained cultural sexism. Men and male needs are considered standard. Women and female needs aren’t considered as deeply and aren’t integrated. It’s not intentional, but that doesn’t mean it’s not sexist, and it doesn’t mean it’s not still a problem.

      I’m confused as to why that’s hard to grasp.

  108. You Siri-ous?: Do you seriously expect everything to work perfectly for all people on day 1, or heck, day 0 considering that this is a public beta – a beta open to the public for the sole purpose of working stuff like this out?

    Siri works perfectly to find men sexual gratification and boner pills. It does not work at all for finding women sexual gratification, reproductive health assistance, contraception or an abortion clinic. It will, however, perfectly direct women to anti-choice crisis pregnancy centers.

    That’s not a bug.

  109. You might as well say that the grocery store is sexist because they added their feminine hygiene section last as opposed to first despite it all being there when finished.

    If a grocery store opened, and carried food, booze, soaps, household goods, pet food, condoms & lube (but not spermicide), all kinds of drugs in the pharmacy (including Viagra, but not birth control), and all kinds of hygiene products (except for tampons and sanitary pads)….yeah, I’d say that was sexist.

    But that doesn’t happen. Grocery stores have a great many women working for them at all levels, and those women would readily point out how alienating that would be to the customer base. How their company would quickly get the reputation of being “the inconvenience store” if they opened without taking female-specific needs into account.

    And that appears to be the problem with Siri. The workplace cultures of the various entities that went into Siri are so male-oriented that they forgot the needs of their female customer base. Now they get the benefit of bad publicity during the holiday buying season. As I said before…..Marketing Fail.

  110. La Lubu: Probably also worth a reminder that 911 isn’t the same in all parts of the US.

    Oh, La Lubu! That’s such a First World problem. Let’s focus on the important shit–like where d00dz can get Viagra and hire sex workers, not getting actual information that you may need in an emergency.

    Sheesh. Silly irrational wimmenz with their expectations of customer service and knowledge of what the beta state actually is. WHAT ABOUT THE MENZ.

  111. You Siri-ous?: a beta open to the public for the sole purpose of working stuff like this out?

    And yet you’re getting all in a huff about people pointing out the problems with the product so that it can be fixed. If the developers didn’t see a problem with forgetting/ignoring half the population before they released Siri, then how will they know that’s a problem unless people point it out?

  112. What part of “it doesn’t work well in a way that specifically neglects women’s very common needs while fulfilling even uncommon desires of men” is hard to understand?

  113. groggette: And yet you’re getting all in a huff about people pointing out the problems with the product so that it can be fixed. If the developers didn’t see a problem with forgetting/ignoring half the population before they released Siri, then how will they know that’s a problem unless people point it out?

    Don’t be obtuse. You know this whole thread has gone quite a ways beyond simply pointing out a flaw in Siri to accusing the developers of Siri of sexism, even if inadvertently.

    And you misspeak when you say it was released. It was not, at least not in the way you appear to be making it out as. It was released as PUBLIC BETA. It is not a final version. It is as if they’re telling you that their product is going to be full of glitches, holes, and all kinds of problems. There’s nothing more to it than that. When you get an error like with this whole abortion thing, it can be explained away simply because it’s beta. You’re being ridiculously unreasonable if you expect Siri to work perfectly when it’s unfinished. Having big glaring errors in it like this is a part of being a beta product and if you don’t like that, I would suggest you hold off using it until the final version is released.

  114. Adam Starkey: Completely irrelevant because Siri specifically does NOT search Google. It tries Yelp for businesses. If the search fails in Yelp, or if it is perceived as a general question, it tries Wolfram Alpha.

    If those sites don’t understand the question, or don’t have useful answers, neither does Siri.

    If you search Yelp for abortion clinics in NYC, it turns up Planned Parenthood as well as a couple of other results, so that is still wrong.

  115. Jill: “No, I don’t, but now you’re moving the goalposts of the conversation. I don’t expect products to work perfectly for all people on day 1.”

    Okay then. You should have no problem with Siri.

    Bear in mind that when they release a product as a public beta, the public ARE those testers.

  116. Jill: I want my vagina licked.
    Siri: Your language!

    Jill: I want my dick sucked.
    Siri: I found twelve escort services fairly close to you.

    Jill: I want a blowjob.
    Siri: I found twelve escort services fairly close to you.

    Jill: I want cunnilingus.
    Siri: Is that so?

    Jill: I want cunnilingus.
    Siri: Ok.

    Jill: I want my clitoris licked.
    Siri: I don’t understand “clitoris.”

    These are the things blowing my mind right now. First of all, there’s the obviously shaming and completely ridiculous, prissy response to “vagina licked” (for some fucking amazing reason that couldn’t possibly be sexism, you guys), while “dick sucked” doesn’t provoke any reprimand whatsoever. And “I don’t understand ‘clitoris'”? “Clitoris” and “vagina” are very literal, very real terms for female anatomy that, yeah, we like to have pleasured sometimes. Why would these go unrecognized or be treated as dirty when “dick” and “blowjob” are apparently just fine?

    This news, and the obnoxious half-assed trolling accompanying it, are so ridiculous right now.

  117. Pro tip: If your program isn’t feature complete, you’re not ready for beta.

    Another free pro-tip: If none of your user stories during your conceptual phase (much less your beta testing phase) include women, and you pride yourself on user friendly products, you might want to reconsider how user friendly you really are. And will manage to continue to be in the future.

    And I would just like to point out that the grocery store analogy actually highlights how ridiculous Apple is being and why this problem should have never made it to beta testing phase. Even if you realize the lack of tampons during your “Friends and Family” soft opening, that’s just…an incredibly expensive change to have to make that late in the game. To forget something that is stocked and sold on a daily basis in every other grocery store in the US even as far back as the first time you started listing specific goods to sell is just…such an incredibly fundamental flaw in how you are designing stuff, not some random glitch that no one could have seen until the product was put to mass use.

    Srsly, these guys mansplaining – incorrectly! – how good interaction design is done are just the icing on the cake, yes?

    Fang – yeah. There’s also just a certain level of stupid in telling people to “google it” instead – ie, don’t try to use the new shiny interface to get info, go back to the old interface! …why, exactly, would I want to do that? What specifically is so odd about using an interface like Siri for its stated purpose: finding info?

  118. You Siri-ous?: Having big glaring errors in it like this is a part of being a beta product and if you don’t like that, I would suggest you hold off using it until the final version is released.

    Protip: Having big glaring errors in it like this is part of being a beta project, and if you don’t like that, I would suggest holding off on releasing the program until a final version is developed.

    Or get real comfortable with the public’s criticism of the product.

  119. You Siri-ous?: Bear in mind that when they release a product as a public beta, the public ARE those testers.

    And Jill tested it, and found it lacking, and pointed out the ways she found it lacking. What’s your problem with that?

  120. Sometimes, when reading about the things Siri will find put together in a list – prostitutes, Viagra, condoms, places to bury bodies – I feel like Siri was programmed by the creators of CSI.

  121. You Siri-ous-ly didn’t follow Jill’s earlier link about how Siri doesn’t just not refer you to an abortion clinic, but specifically refers you to anti-abortion crisis pregnancy centers. That’s not a problem of omission.

    Jokes about where to dump a body, or how to get a hamster out of your ass, are harmless as far as I’m concerned. But someone took a great deal of time to make sure that women in NYC couldn’t get referred to an abortion clinic, since abortion clinics do show up in the web searches that Siri is supposedly using. And took the time to make sure women in DC got referred to a phony “crisis pregnancy center” (even one quite a distance away) instead of a comprehensive women’s reproductive health center. That isn’t benign.

    Whether Apple is aware of it or not, they are being rebranded as the anti-abortion company.

  122. Lauren: Protip: Having big glaring errors in it like this is part of being a beta project, and if you don’t like that, I would suggest holding off on releasing the program until a final version is developed.

    Or get real comfortable with the public’s criticism of the product.

    Need I quote myself?

    “Don’t be obtuse. You know this whole thread has gone quite a ways beyond simply pointing out a flaw in Siri to accusing the developers of Siri of sexism, even if inadvertently.”

  123. “And you misspeak when you say it was released. It was not, at least not in the way you appear to be making it out as. It was released as PUBLIC BETA.”

    OMG MANSPLAINERS. Apple created ads that highlighted Siri in order to get people like my mother to go in and pay several hundred dollars to upgrade their phones. This is not beta-testing in the traditional sense. When Google beta-tests stuff, they do not use it as a selling point for newbies. *headdesk*

    And also hells yes, what groggette said. If you are going to claim that it’s in beta-testing, therefore it’s all good, you can’t complain when people point out what needs to be fixed. Especially when Apple, unlike companies that are actually beta-testing in the way you mean it, does not provide a specific place to send feedback. I mean, I have complaints about SirsiDynix’s Facebook app that they are (kinda) beta-testing, but well….they also set up a specific place to send feedback. Cuz, you know, actual beta-testers tend to want criticism. In spades. They don’t go around saying problems are glitches, they say “oh! thanks for catching that!” yeesh.

  124. If the developers don’t want to be accused of developing a sexist app, then maybe they shouldn’t forget or ignore half the population before releasing the app to the public.

  125. “You Siri-ous-ly didn’t follow Jill’s earlier link about how Siri doesn’t just not refer you to an abortion clinic, but specifically refers you to anti-abortion crisis pregnancy centers. That’s not a problem of omission.”

    Sure it is. The developers don’t just manually pick and choose what you will and will not find. They code their search engine in a specific manner so that it will yield the results that it does, and given that it’s beta, it’s revealed just how imperfect it is. This is good. That means this is a GOOD beta. This means it will then be able to be fixed in the final version.

  126. La Lubu: Siri doesn’t just not refer you to an abortion clinic, but specifically refers you to anti-abortion crisis pregnancy centers. That’s not a problem of omission.

    Absolute YES to La Lubu’s last two comments.

  127. You Siri-ous?: When you get an error like with this whole abortion thing, it can be explained away simply because it’s beta. You’re being ridiculously unreasonable if you expect Siri to work perfectly when it’s unfinished. Having big glaring errors in it like this is a part of being a beta product and if you don’t like that, I would suggest you hold off using it until the final version is released.

    I went into this earlier – you can’t just explain it away because it’s beta, there needs to be more nuance than that. What other big glaring errors are there in the search results, where it recognises what you’re asking for but says there are no results, when results do actually show up in Yelp? That’s a serious question, I don’t know how broken it is, but if this is the result of an error then there should be a very wide range of results that suffer the same problems. There’s no reason for this to only happen with ‘abortion clinics’, unless it’s programmed to treat that query in a different way from all the other, working queries.

    And it still doesn’t explain why the program’s response to the word ‘pregnancy’ is to implore you to ‘do no harm’ and ‘consider the alternatives’, which just happens to fit the same agenda as a ‘bug’ that fails to give you details of abortion clinics when you ask for them. Maybe the anti-choice movement is still in beta?
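    [Ed: The “how broken is it elsewhere?” question above is an empirical one, and it’s testable. A sketch of the experiment in Python, with invented stand-in functions since the assistant itself can’t be scripted here: if the failure were a general retrieval bug, many categories should come back empty; if only one does, that category is being singled out.]

      # Sketch of the comparison Mjog is asking for (stand-ins invented).

      def siri_result_count(category):
          """Stand-in: how many results the assistant reports."""
          return {"abortion clinics": 0}.get(category, 3)

      def yelp_result_count(category):
          """Stand-in: how many results the underlying search engine has."""
          return 5

      CATEGORIES = ["drugstores", "hospitals", "escort services",
                    "abortion clinics", "dinosaur theme parks"]

      for cat in CATEGORIES:
          if yelp_result_count(cat) > 0 and siri_result_count(cat) == 0:
              print(f"suspicious: '{cat}' has upstream results, none surfaced")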

  128. Don’t be obtuse. You know this whole thread has gone quite a ways beyond simply pointing out a flaw in Siri to accusing the developers of Siri of sexism, even if inadvertently.

    Pop quiz: do you even know what ingrained, non-malicious, yet still present sexism is?  Or do you think it has to be the Worst Ever™ in order to qualify?

  129. @ XtinaS

    I don’t doubt there is sexism in a variety of places. Thing is, this whole thing with Siri is a very bad example of it.

  130. Obviously you can’t expect programmers to be able to continue doing their jobs when you hurt their feelings by calling them sexist.

  131. Wow, as a sometimes programmer I had chalked this up to the majority explanation of reliance on incomplete third-party data combined with the effects of patriarchy (i.e. the experience of women was not considered by predominantly male programmers). I didn’t know it would respond to requests for female sexual services by listing escorts. Programmers love in-jokes, like the response to “I love you”, but this is really insulting, and Apple should have known better.

  132. You Siri-ous?: “accusing the developers of Siri of sexism, even if inadvertently.”

    There are worse things, pal. But it’s telling that you’re more concerned with getting called sexist than addressing sexism.

  133. oh no, not accusations of sexism! that shit’s serious!

    Exactly. There is nothing worse than being called sexist! Absolutely nothing! (Except maybe being called racist.)

  134. Thing is, this whole thing with Siri is a very bad example of it.

    But why?  The programmers didn’t consider women’s needs, but they considered men’s needs (Viagra, escort services, &c).  They even put in a response to hamster-related issues, but they didn’t somehow figure out that women might want to find an abortion clinic, or get contraception, or also procure sexual services.  To a whole lotta people, this is sexism: not malicious, perhaps not deliberate, but sexism, all the same.

    Wikipedia: “[Beta] generally begins when the software is feature complete.”  What I interpret this to mean is the makers of Siri considered it to be feature complete, even though no one thought to consider women’s needs.  And that’s bloody sexist.  It is, as analogised before, like considering a grocery store to be ready for the public, when no one considered that women might come in and want to buy woman-specific products, so no one stocked it.  The problem is like one part “this shit’s missing” and five parts “how the vainglorious fuck did this get opened to the public with no one noticing this basic, glaring, fundamental flaw?”.

    (Can’t be sexism, though, to not consider women’s needs.  Must just be… a tumour?  Or a brain fart!  That lasted for the entire production cycle!  Yes, that’s way more reasonable.)

  135. Obviously you can’t expect programmers to be able to continue doing their jobs when you hurt their feelings by calling them sexist.

    Solution: let them quit. Hire feminist programmers! :p

    The developers don’t just manually pick and choose what you will and will not find.

    You know, I’m fairly certain someone had to specifically program Siri to know what to do when someone wanted to hide a body. And they sure as hell are not getting those escort services from Yelp. So what the fuck ever dood.

  136. jennygadget: Solution: let them quit. Hire feminist programmers!

    Now now, it’s logical suggestions like this that demonstrate that you’re a radical fem

    jennygadget: And they sure as hell are not getting those escort services from Yelp. So what the fuck ever dood.

    I agree with your larger point. but apparently, NY Yelp does have escort service listings. I’m fairly certain they don’t list body-dumping ravines, though.

  137. suspect class: Now now, it’s logical suggestions like this that demonstrate that you’re a radical fem

    If so, she can come sit by me.

    Sheelzebub: I think some of these mansplainers are in beta.

    I think you win this blog? Because there’s a trophy over here with your name on it. Congratulations, but you’re also in charge of the moderation queue.

  138. Lauren: If so, she can come sit by me.

    Word. Absolutely ridiculous threads like this one are excellent demonstrations of feminism’s continuing relevance and importance.

  139. Pingback: What We Missed
  140. suspect class,

    That’s interesting. Checking my local Yelp listings, I’m getting only stuff that has words like “escorted” in the reviews. It makes me want to borrow my parents’ phone(s) to see how those questions work when Yelp has listings, but not accurate ones.

    But yeah, the thing that gets me about the ravines and smelting plants and such is that someone had to program that logic into the interface. That’s much more complex than a 1-to-1 keyword search. And you may do that for a joke, but you aren’t going to develop that whole process just for a joke. So, clearly, there is other, more practical stuff that they had to create similar logic/processes for. So trying to pretend that it’s all just a matter of the databases being incomplete…
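    [Ed: The distinction here is worth spelling out: a canned reply has to be hand-written as an intent rule, while directory lookups fall out of a generic search path. A sketch, with the rules and replies invented for illustration:]

      # Canned intent rules vs. the generic search fallback (all invented).

      INTENT_RULES = [
          # Someone sat down and wrote each of these pairs by hand.
          ("hide a body", "What kind of place are you looking for?"),
          ("hamster", "I'd see a doctor about that."),
      ]

      def respond(utterance):
          text = utterance.lower()
          for fragment, reply in INTENT_RULES:
              if fragment in text:
                  return reply          # hand-coded path: a deliberate choice
          return f"searching the web for '{utterance}'"   # generic fallback

      print(respond("Where can I hide a body?"))   # canned joke
      print(respond("I need a tampon"))            # falls through to search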

  141. Now that is just cruel. 😛

    Only to the mansplainers.

    :::Grins evilly:::
    :::Sharpens claws:::
    :::Reaches for trophy:::

  142. I know. I’d love some confirmation that siri doesn’t actually say that mom should be in the kitchen.

  143. Matt:
    Is it somehow impossible to understand that you can tell Apple, hey, Siri can’t do this right, without some crazy anti-women conspiracy agenda being attributed to Apple?

    Why don’t you, Matt, READ the post and comments for once and TRY to REALIZE that NO ONE is talking about a “crazy anti-women conspiracy agenda”; they are talking about institutionalized bias. If you don’t know what that means, please look it up. But first, try to READ the post. And of course, TRY to REALIZE that people right here, right now, are actually saying to Apple, “hey, Siri doesn’t do this right”. Get it??? No, because you didn’t bother READING the post or the comments. I am sorry, I have to say it: you, sir, are an idiot.

  144. I got this article via Kate Harding on Twitter, in case anyone may be interested but hasn’t seen it yet.

    http://www.chicagomag.com/Chicago-Magazine/The-312/December-2011/Our-Siri-Ourselves/

    It’s really well done. I do think it glosses over two things:

    That it doesn’t make sense to be annoyed that users expect to be able to find things like birth control when programmers have bothered to take the time to add jokes about hiding dead bodies.

    That a big part of the problem is that Apple is trying to do beta testing while claiming a finished product. This is part of why questions about intent were raised. Apple only claimed “beta” once major flaws came up, because they wanted the sales and were hoping to get away with the public paying for the privilege of beta testing a product without complaint.

    I also disagree with the suggestion that the scarcity of female designers isn’t part of the problem. Yes, part of the problem is the bias in the databases used – and in the public that generates the data – but feminist-minded programmers would have caught these problems before it went public. Also, female programmers for the databases would help to alleviate some of the bias there, even if its root is in crowd-sourcing as well as the programmers themselves.

    Still, lots of good information and does a good job of talking about the consequences of institutional sexism.

    Also, considering that a decent amount of the nuts and bolts of the programming seems to boil down to the strengths and weakness of semantic versus crowd-sourced vocabularies, and how and when to tweak both…I predict SiriFail is going to be brought up in the library science Vocab Design class I’m taking next semester. 🙂

  145. I tried this and Siri provided 6 nearby abortion clinics right away. Maybe there’s no app for male prostitutes since she has a built-in vibrator?

  146. PosedbyModels: These are the things blowing my mind right now

    Am I the only one shocked that Jill lives so close to so many escort services? What’s the range on that thing?

  147. Has anyone tried asking where to get a vibrator, or a store that sells sex toys? If it can, I wonder where it would send you (i.e. to a nice, maybe even woman-owned, shop or one of the shady ones with creepy costumes in the windows).

  148. Am I the only one shocked that Jill lives so close to so many escort services? What’s the range on that thing?

    Jill does live in the largest city in the United States…

  149. Sirkowski: Yeah, complain about your consumer product made by underpaid slave labor in a totalitarian regime not meeting your bitch standards. FIRST WORLD PROBLEMS DERP

    Your problems are not important, bitch! Ongoing difficulty with ending an unwanted pregnancy is no big–it’s just like having to use one-ply toilet paper for a week! How dare you write a post about your own situation–everyone knows that bitches need to put other people first!

    Matt: Is it somehow impossible to understand that you can tell Apple, hey, Siri can’t do this right, without some crazy anti-women conspiracy agenda being attributed to Apple?

    So…again…this is all just coincidence, according to you? I mean, it’s taking place in a climate of ongoing attacks on women’s ability to get abortions and contraception that range from legislative attacks to falling rates of doctors learning how to perform abortions to violence against abortion providers to propaganda campaigns…but this is just some random thing that happened, nothing to do with any of that? It’s not even about women’s issues not being considered important by male programmers? It’s just…a freak accident, like a tornado or something? Hmm. Which seems more likely to you–yet another example of the taken-for-granted institutionalized sexism that informs our society, or freak tornado?

    John: EG,
    Thanks for the explanation, although I’m not sure why you assumed I might be “being a dick”. Is your default mode hostility and suspicion?

    When a man asks a question about why women don’t do things because hey, there’s no law against it? Given where such conversational openers have gone over the past several decades and what’s usually being implied, then, yes, in that context, my default mode is hostility and suspicion. If your question was not disingenuous, I’m pleased to hear it.

    You Siri-ous?: Perhaps it’s the fact that, being in its beta stage, it’s not finished.

    So they’re selling an incomplete product for hundreds of dollars and didn’t use any women as testers before they put it on the market? The only conclusion I can draw from that, then, is that not only are they mindlessly sexist, but they’re also dishonest and incompetent.

    You Siri-ous?: Do you seriously expect everything to work perfectly for all people on day 1, or heck, day 0 considering that this is a public beta – a beta open to the public for the sole purpose of working stuff like this out?

    It’s open to the public so that Apple can make money from it; if what they want is beta testing because they know there’re multiple problems, why do they expect people to pay for the privilege of doing the testing? And do I expect it to work perfectly for all people? No. Do I expect it to work equally well for men and women? Yes. Do I expect it to be able to respond helpfully to significant concerns women have? Yes.

    You Siri-ous?: When you get an error like with this whole abortion thing, it can be explained away simply because it’s beta.

    See, it’s just an error. Errors never mean anything, or offer any insight into the biases of the people who make them. They’re totally random. It’s unreasonable to expect Siri to be able to help you find contraception, emergency or otherwise. The programmers had to deal with important stuff, like helping men get off more easily. God, do you expect everything to be perfect?

    Dude, we know it’s an error. It’s a stupid, sucky, sexist error. That’s what we’re complaining about.

  150. I don’t think Apple hates women as much as they fear lawsuits from Focus on the Family or the National Right to Life Committee over some 17-year-old girl in California looking up her nearest clinic. Should the information be there? Absolutely! Flood the iOS developers with angry e-mails and maybe they’ll include a Siri update with that crucial information on it.

  151. I just searched for the nearest Planned Parenthood clinic and 20 matches appeared. Looks like someone forgot to update their iOS.

  152. I live in an area that could be described as “very conservative,” to the point that we really DON’T have any abortion clinics nearby, nor do we have any escort services. It’s a small enough town to be unlikely to have those things anyway, but also it’s controlled by a conservative Christian college. I used Yelp to search for “abortion clinics” and got nothing. I searched “abortion” and got a couple of results: one was described as “family planning” and the other “abortion alternative.” I got lots of results when I searched “Women’s health.” The second result was Waffle House, but most of them were applicable. So maybe it’s because it only searches for things that have the word “abortion” in the name of the place? It only found Planned Parenthood when I specifically searched for it. Most abortion clinics don’t have “abortion clinic” in their actual name.
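    [Ed: If this commenter’s guess is right, the failure mode is matching on business names rather than on services offered, which a few lines can demonstrate. All listings below are invented:]

      # Name-matching vs. service-matching (invented listings).

      LISTINGS = [
          {"name": "Planned Parenthood", "services": ["abortion", "contraception"]},
          {"name": "Crisis Pregnancy Center", "services": ["abortion alternative"]},
          {"name": "Waffle House", "services": ["breakfast"]},
      ]

      def search_by_name(term):
          return [p["name"] for p in LISTINGS if term in p["name"].lower()]

      def search_by_service(term):
          return [p["name"] for p in LISTINGS if term in p["services"]]

      print(search_by_name("abortion"))     # [] -- no one puts it on the sign
      print(search_by_service("abortion"))  # ['Planned Parenthood']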

  153. jennygadget: That a big part of the problem is that Apple is trying to do beta testing while claiming a finished product. This is part of why questions about intent were raised. Apple only claimed “beta” once major flaws came up, because they wanted the sales and were hoping to get away with the public paying for the privilege of beta testing a product without complaint.

    Ding ding ding! The people yelling “Not fair! It’s in BETA!!” are being willfully ignorant of the fact that Apple is NOT beta testing Siri in the classic sense at all. It’s only a beta test in the sense that the iPad was a beta test for the iPad 2. If you are advertising a product that is for sale NOW, the general public is going to assume it is done, even if you tack on a disclaimer about features.

    And all that is beside the point. The omissions are way too suspiciously similar to be easily explained by the “its in beta!” argument.

  154. Jill, you are a blithering idiot.

    Read and learn.

    The long and short of it is, Siri is a limited system, and when you get to the edges of what it’s capable of, you make a complete ass of yourself if you assume that someone maliciously decided to thwart your desire for whatever you imagine its capabilities should be.

    1. The long and short of it is, Siri is a limited system, and when you get to the edges of what it’s capable of, you make a complete ass of yourself if you assume that someone maliciously decided to thwart your desire for whatever you imagine its capabilities should be.

      And you make a complete ass of yourself when you don’t read the damn post. I specifically said in the post that I don’t think someone maliciously decided to not have Siri locate abortion clinics.

      My kingdom for trolls who actually read what I wrote.

  155. By the way, and just out of curiosity, does Siri have anything helpful to say if you tell it “I have a yeast infection”?

  156. @Some guy–speaking of being a blithering idiot, how about you pull your head out of your ass and develop some reading comprehension skills?

    On second thought, dance for us, troll. If you’re going to play the same music that’s been played before, shut the fuck up and dance to it. It’s not as if you’ve proven you can make any cogent arguments.

  157. WTF is with this ‘lawsuits’ crap? Nobody can sue you for providing people with information they dislike. Try harder.

  158. Jill,

    But you are talking about computers! And you have a vagina! And are making those strange feminist noises! You couldn’t possibly be making sense! It’s a statistical impossibility!

    The people yelling “Not fair! It’s in BETA!!” are being willfully ignorant of the fact that Apple is NOT beta testing Siri in the classic sense at all.

    Yeah, they are being classic mansplainers: assuming they are introducing a new term and concept to us silly women. And so eager to explain it to us all that they don’t even stop to think that Apple is full of bullshit (and they know it) and so now they are too.

    But yes, willfully ignorant too.

  159. > Siri’s programmers clearly imagined a straight male user as their ideal and neglected to remember the nearly half of iPhone users who are female. <

    imho absolutely – what else ?
    why otherwise the stated answers to explicit/supposedly "male" questions ?
    and as one of the comments above already said – imho rule #1 #2 #3 in soc. design has always been and will continue to be "you are NOT your customer".
    otherwise, in my experience, it's called *FAIL

    (btw, as far as i know apple has an ongoing his-tory of soc. sex-negativity e.g. concerning apps. and no, i am far too concerned about the little soc. freedom i actually have that i do not wish a soc. smart-phone to spy on me/my every move and then – esp. unknowingly – e.g. ping-it-back-to-cupertino-or-some-such-hq. ergo, i will neither own nor operate a soc. smart-phone. i call it luxury and/or choice.)

  160. I love how you neglected to mention that Siri’s suggestion for “removing a hamster from your rectum” is just as useless as Siri’s response for your request for an abortion. (Source: Link provided in your article).

    You clearly use this to give credibility to your article, although in this case it is entirely inapposite.

  161. meh, post was mangled. 1st quote wrong link

    > “It’s just a phone, why do you expect it do all this?” Mr. Winarsky said he had no knowledge of how Siri was changed after it was acquired by Apple. <

  162. I don’t buy the “abortion clinics don’t advertise” argument. I live in a mid-sized town in a fairly conservative area and I can find several abortion clinics in the phone book. Surely Siri can search the yellow pages.

    This is still hilarious.
    How does Siri respond to questions about preventing pregnancy, or asking for condoms?

  163. David Makalaster: I love how you neglected to mention that Siri’s suggestion for “removing a hamster from your rectum” is just as useless as Siri’s response for your request for an abortion. (Source: Link provided in your article).

    You clearly use this to give credibility to your article although, in this case, it is entirely inapposite.

    Because removing a hamster from your rectum and getting an abortion are analogous, so it’s totally cool that they have equally unhelpful answers?

    I never knew that men need hamsters removed from their asses as often as that, or that being able to do so was essential to their reproductive freedom. You learn something new every day.

  164. Yep. It’s an Apple conspiracy when Siri says something like that. Case closed.

    One more reason why I am sticking with Microsoft. Got a new Acer last weekend.

    Mysti:
    Someone asked Siri “Why are you anti-abortion?” and she answered “I just am, Kristen.”

  165. jennygadget:
    “perhaps they did, but like Avida Quesada said, they weren’t a feminist women?”

    I wasn’t aware that only feminist women used contraception.

    As I said, there are two components to my argument. The first is how you present yourself. Pregnancy crisis centers are scams; the computer will not notice until you add specific code for it. Abortion clinics, on the other hand, try to present themselves as women’s health.

    If you need contraception you can go to almost any clinic or hospital and get it.
    What most women need, most of the time, when they need to find a pharmacy or hospital quickly, is not birth control or an abortion. Not even the day-after pill. Most women have at least a month’s reserve of oral contraception.

    What we need is something for a headache or the like.

    In fact, women were loving Siri until feminists like us tested it.

    My best friend is a male software engineer. He did this: he opened the Planned Parenthood page (http://www.plannedparenthood.org/) and then the Option Line page (http://www.optionline.org/), right-clicked each page to view the source code, and searched for the meta tags containing “abortion” (see the sketch after this comment). As Jill mentioned, SEO is top for the pregnancy scammers, low for Planned Parenthood.

    Apple needs to take the reproductive needs of women as a special case and program Siri to deal with that. But seeing to that doesn’t just need a woman; it needs a feminist, to be aware of the special challenges that the patriarchy creates for women-specific reproductive care.

    Love,
    Avida
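
    [Ed: For the curious, here is a minimal sketch of the meta-tag check Avida’s friend describes, assuming Python 3 and only its standard library; the URLs are the two she cites and “abortion” is the keyword searched for. This is an illustration of the technique (network permitting), not his actual script:]

        # Fetch each page, collect <meta> keyword/description tags,
        # and count how many mention "abortion".
        from html.parser import HTMLParser
        from urllib.request import urlopen

        class MetaTagScanner(HTMLParser):
            """Collects the content of <meta name="keywords"> and <meta name="description"> tags."""
            def __init__(self):
                super().__init__()
                self.meta_content = []

            def handle_starttag(self, tag, attrs):
                if tag == "meta":
                    attrs = dict(attrs)
                    if attrs.get("name", "").lower() in ("keywords", "description"):
                        self.meta_content.append(attrs.get("content") or "")

        for url in ("http://www.plannedparenthood.org/", "http://www.optionline.org/"):
            scanner = MetaTagScanner()
            scanner.feed(urlopen(url).read().decode("utf-8", errors="replace"))
            hits = [c for c in scanner.meta_content if "abortion" in c.lower()]
            print(url, "->", len(hits), "meta tag(s) mentioning 'abortion'")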

  166. EG: Because removing a hamster from your rectum and getting an abortion are analogous, so it’s totally cool that they have equally unhelpful answers?

    I never knew that men need hamsters removed from their asses as often as that, or that being able to do so was essential to their reproductive freedom. You learn something new every day.

    You are right, it is not cool for this flaw to exist in Siri’s programming. My original point stands, however.

  167. Does anybody else see how creating a gender-specific female automaton/servant is already a little offensive to women? Just sayin’

  168. If you are truly concerned about women, you should be concerned that Siri is sending men to escort services and prostitutes.

  169. When I typed “I need an abortion” into Google the first result was a CPC.

    Sirkowski: Because talking to your iPhone is what you need to do when you want an abortion… e_e

    Irrelevant. If it is what people do, they deserve answers, not you judging them.

    Matt Simpson: But everyone thinks there is some hidden agenda, some evil reason why this inanimate device does not hold the same views on the world they do,

    No one thinks that, or at least no one’s said that. It’s pretty clear that the consensus (among the people critical of Apple, not detractors) is that it’s stupid rather than evil, that the omission was one of thoughtlessness rather than hostility.

    Thing is, thoughtlessness is still bad, and still something that should be fixed. Saying “it was a thoughtless mistake” doesn’t let you off the hook.

    Athenia: I am absolutely beginning to think that this is malicious. To think that women don’t use their technology—you know, oops!—is absolutely ridiculous.

    I don’t think it’s so much that they didn’t think women would use it as that they didn’t think women might have uniquely (normative-)female needs. It’s anti-sexist, in an ill-considered, coarsely granular way. Or it would be, if it didn’t list escort services.

    I’ll bet it’s even worse on trans*-specific needs.

    You Siri-ous?: Having big glaring errors in it like this is a part of being a beta product and if you don’t like that, I would suggest you hold off using it until the final version is released.

    If no one is to point out problems discovered during a testing stage, what, pray, is a testing stage for?

    suspect class: NY Yelp does have escort service listings. I’m fairly certain they don’t list body-dumping ravines, though.

    Everyone knows you dump bodies in the Gowanus Canal.

    Commandrea: It’s a glorified search engine. Get over it.

    That would be a stupid first comment; as the 180th, there are no words.

    Moreover, why is it okay for Google and Yelp to be unable to find abortion services (that’s not the case, but it is what you’re claiming), but not okay for Siri?

    Some Guy: The long and short of it is, Siri is a limited system

    Why are you being so patronizing?

    The problem is not “Siri doesn’t do everything asked of it” (to which “it’s in beta” or “it’s a limited system” are adequate responses, if true); the problem is that there’s a pattern to Siri’s lapses: holes where there’s a reasonable expectation of there being none, versus no holes where it really couldn’t be faulted for having them. If the problem were “Siri can’t find anything starting with P,” that would be a fuck-up with no larger meaning. The specific lacunae show the undue influence of the worst parts of the surrounding culture. This doesn’t preclude the explanation “the programmers didn’t make this a high priority,” but the choice of what to deprioritize wasn’t made in isolation and isn’t beyond criticism.

  170. Lee:

    It would be an incredible coincidence that the conversation database ‘just happened’ to have an anti-choice flavour while abortion clinic results ‘just happened’ to be among the set of results affected by a programming error – even more so if that error affects a very narrow range of results, or searches for abortion clinics specifically. And it ‘just happens’ to be a hot-button political topic right now that could be spun badly for Apple if conservatives discovered the iPhone were providing this kind of information to people.

    I don’t know if anyone’s going to read all that, but I wanted to show from a computery standpoint how this works and why it doesn’t really seem like an oversight or an ‘oops’ situation that this specific kind of query just happens to be responding like this. If Siri is having a lot of problems with many different recognised queries that work on Yelp, then that would lend some credence to the idea that it’s a glitch and that it’s just a coincidence that the pregnancy responses are so skewed, but we shouldn’t immediately accept the un-nuanced hand-waving ‘oh y’know, computers, they go wrong sometimes’ excuse that I’ve seen a few people repeating.

    Well, I did read all that, Lee.

    And I agree with everything you wrote. I couldn’t agree more.

    Thanks for your excellent statement.
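
    [Ed: For anyone who skipped Lee’s wall of text: the gist is that Siri has at least two separate paths, a canned “conversation database” of scripted replies and a local-business search, and both would have had to skew anti-choice purely by coincidence. A toy model in Python, with data reconstructed from responses quoted in this thread; this is not Apple’s code or data:]

        # Path 1: scripted conversational replies. Path 2: local search.
        # All entries are illustrative only.
        CONVERSATION_DB = {
            "i'm pregnant. what do i do?": "Are you?",
            "my girlfriend is pregnant. what do i do?": "Consider your alternatives.",
        }

        LOCAL_SEARCH_INDEX = {
            "escort": "twelve escort services fairly close to you",
            "crisis pregnancy center": "three places matching 'Crisis Pregnancy Center'",
            # conspicuously, nothing here resolves "abortion clinic"
        }

        def respond(query: str) -> str:
            q = query.lower().strip()
            if q in CONVERSATION_DB:                             # path 1: canned conversation
                return CONVERSATION_DB[q]
            for term, result in LOCAL_SEARCH_INDEX.items():      # path 2: local search
                if term in q:
                    return "I found " + result + "."
            return "Sorry, I couldn't find any matches."         # the dead end at issue

        print(respond("My girlfriend is pregnant. What do I do?"))  # scripted, anti-choice flavour
        print(respond("I need an abortion"))                        # falls through to the dead end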

  171. David Makalaster: You are right, it is not cool for this flaw to exist in Siri’s programming. My original point stands however.

    That comment was your original point. The program recognizes what “hamster in rectum” means well enough to provide a related service. It does not know the word “contraception” at all and considers the word “vagina” to be too dirty to deal with. What was your original point again?

  172. Scott Wiebe:
    I know you shouldn’t attribute to malice what could be incompetence but I think you are being far too generous in your assumption that women were just overlooked by the programmers. I doubt smartphones contain their own tables of information or topics: when you say ‘I need xxx’ Siri just passes xxx along to google and gives you some top results. Go to google and manually search for abortion, birth control or rape crisis centres and google does find results. Lots of them. Yet somehow Siri can’t find them.

    If they DID create a big ‘ole list of things to pass to google but just forgot about women’s issues then how did anti-choice crisis centres end up in there? If it really does have a subject list then the devs must have thought of women’s reproduction issues when they put in the anti-choice options… and then they forgot about women’s reproductive health? I don’t buy it. Also, if Siri fails to find information on anything else, then Siri offers a link to google… unless it’s a women’s issue.

    I can’t accept the premise that this is because Siri has a long list of all the things it would forward to an external search engine and they forgot a few things – I think it’s far more likely to have a short exclude list. I really don’t believe this was an oversight, I believe someone at Apple made the choice to code Siri with blocks on subjects they don’t like. I hope I’m wrong, I hope it’s just an oversight but I don’t think so. I can’t conceive of any way a programmer could create something which takes what you say and searches the internet for it or offers you a link to google but does not do that on a short list of topics. Not without intentionally putting in blocks on those topics.

    I agree totally. Anyone use Trend Micro? They block women’s sites all the time on specious grounds. I suspect other so-called security programs do the same. It’s a real problem. There is a war against women, it is endless, it is unrelenting, and it is evil.
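
    [Ed: And for contrast, a sketch of the “short exclude list” Scott Wiebe suspects instead; the mechanism and every entry here are hypothetical, not anything known about Apple’s code:]

        # Hypothetical: forward every query to a web search unless it
        # matches a short exclude list. Entries are invented.
        EXCLUDE_LIST = ("abortion", "birth control", "rape crisis")

        def handle_query(query: str) -> str:
            q = query.lower()
            if any(term in q for term in EXCLUDE_LIST):
                # a silent dead end: no results, and no google fallback link
                return "Sorry, I couldn't find any matches."
            # everything else gets an ordinary search link
            return "https://www.google.com/search?q=" + q.replace(" ", "+")

        print(handle_query("I need a hardware store"))    # gets a search link
        print(handle_query("I need an abortion clinic"))  # blocked, no fallback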
