Ask any of your early-adopter, iPhone 4S-loving friends, and they’ll tell you: The new voice-activated personal assistant, Siri, is not perfect.
If you want help finding a pizza or a place to get your tires rotated, you’ll probably luck out. But some things seem to be out of her grasp. As WHYY’s Maiken Scott reported this week, civil-liberty-minded blogs are lambasting Apple for not making Siri smart enough to know when users are asking for directions to the nearest abortion clinic.
One writer says that if you just ask about “abortion,” Siri will tell you she can’t find any “abortion clinics,” and he speculates that Siri’s inclusion of the word “clinic” suggests a programmatic aversion to the A-word.
The Unofficial Apple Weblog makes a worthwhile attempt at debunking the criticism and explaining Siri’s logic.
Whatever you believe, it’s easy to overstate the truth here, as in a particularly incendiary video from The Young Turks, a progressive online newscast.
They state, incorrectly, that if you ask Siri to find an abortion clinic she will tell you where to find a right-wing pregnancy crisis center. But in a live demonstration moments later, Siri actually says she doesn’t understand the question.
When they ask Siri where to find a pregnancy crisis center, she comes up with a list of OB/GYNs, which the hosts speculate must be a bunch of fake doctors masquerading in white coats trying to scare women out of having abortions.
In short, they prove nothing, except maybe that Siri doesn’t know the word “abortion.” That in itself is a valid point, and worth Apple’s consideration. The theatrics beyond that are unnecessary. The Young Turks certainly do not prove that women seeking an abortion will be rendered helpless because of their iPhone 4S.
If you ask Siri where the nearest Planned Parenthood office is, she’ll tell you right away.
Siri knows a lot. And her programmers have endowed her with a sense of humor. You ask her about the meaning of life often enough, and she’ll eventually direct you to a list of area churches. If you ask where to dump a dead body, Siri will suggest swamps, mines and reservoirs.
But she doesn’t know everything. The extent of her knowledge is limited to the connections her programmers have made to various online directories and databases.
Apple says it can’t anticipate everything iPhone users will want to know. Fair enough. But it may still be worth asking whether the items the programmers thought to include, and the items they omitted, reveal any sort of unintentional bias.