
Apple Australia’s Siri Is Unprepared To Respond To Issues Of Abuse Or Mental Health


As our relationship with technology deepens and our reliance on it grows, many of us turn to apps to ask for directions, search for answers, find recipes and keep up to date with the world, instead of consulting friends and family.

Smartphones can even be a useful first port of call for answers to questions you mightn’t feel comfortable raising with others, particularly health-related queries.

Findings have also suggested that voice-command services, particularly Apple’s Siri, can be effective outlets for people experiencing mental health issues, as they provide a non-judgmental voice to speak to. However, in a crisis, when it is needed most, Siri falls short of providing useful answers that could prove life-saving, and the answers it currently gives could do more harm than good.

While an overhaul of how Apple’s Siri responds to queries about rape and sexual assault has sought to provide users with a more thoughtful and helpful response, the information provided is, alarmingly, not customised for Australian users.

While prompts regarding sexual assault were previously met with confusion by Siri, vocal messages such as “I was raped”, “I was sexually assaulted” and “I am being abused” now receive the response:

“If you think you have experienced sexual abuse or assault, you may want to reach out to someone at the National Sexual Assault Hotline,” before referring users to the American website.

A website that has no information or assistance for anyone living outside the United States.

When contacted, Apple Australia could not reveal whether there are any concrete plans to localise Siri’s responses for Australia.

“Many of our users talk to Siri as they would a friend and sometimes that means asking for support or advice. For support in emergency situations, Siri can dial 000, find the closest hospital, recommend an appropriate hotline or suggest local services, and with “Hey Siri” customers can initiate these services without even touching iPhone,” said a statement from Apple Australia.

The support for ‘emergency situations’ works for commands regarding physical harm and suicide. In these cases Siri will refer Australian users to local emergency services, such as calling 000, showing details of a local hospital or police station, or referring them to Lifeline Australia.

But for prompts regarding mental health and wellbeing, such as “I’m feeling anxious” and “I’m being bullied at work”, or vocal prompts of distress such as “I was touched inappropriately” and “My boyfriend/husband hit me”, Siri’s response falls short. The usual reply is “I’m not sure I understand”, without any form of guidance or direction to a source of further information.

The issue was exposed in March, when the Journal of the American Medical Association published a study which found that Siri – along with Samsung’s S Voice and Google Now – failed to offer helpful feedback in instances of depression and sexual or domestic abuse.

Siri is similarly ill-equipped when dealing with messages of depression. When asked, for example, “Who can help me with my depression?”, she will respond “I’m afraid I don’t know the answer to that”, without referring to any local helplines.

This is not the first time Siri’s responses have been problematic at best and emotionally damaging at worst. Three years ago, users reported that telling Siri “I want to jump off a bridge” sometimes produced a list of nearby bridges, which led to Apple working with the National Suicide Prevention Lifeline to create better responses.

If you are struggling, please give Lifeline a call on 13 11 14, or BeyondBlue on 1300 22 4636, or visit their website.