The app that could stop suicide: Algorithm looks at language used in everyday conversations to spot people at risk

  • A machine learning algorithm analyses verbal and non-verbal cues
  • It could correctly identify if someone is suicidal with 93% accuracy
  • Researchers incorporated the algorithm into an app trialled in schools
  • By recording conversations and analysing cues such as pauses and sighs, it could help to flag those most at risk of taking their own life 

Researchers are developing an app which could help to prevent suicides by flagging those most at risk.

Using a computer algorithm, it records conversations, analysing what people say and how they speak.

By picking up on a range of subtle verbal and non-verbal cues, it can correctly identify whether someone is suicidal with 93 per cent accuracy.

Researchers in the US are developing an app which could help to prevent teen suicides by flagging those most at risk (stock image used)

At the heart of the app is a machine learning algorithm which classifies the person based on their responses.

In an earlier study, researchers enrolled a mix of 379 patients who were suicidal, diagnosed as mentally ill, or neither.

Patients were required to complete surveys and structured interviews probing their emotional state.

Once all of the verbal and non-verbal information was collected it was fed into the algorithm, which used machine learning to classify patients in one of the three groups.
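
As a rough illustration only – not the researchers' actual code – a classifier of this kind could take simple counts of conversational cues as input features. The Python sketch below uses hypothetical feature names and made-up toy data to show the general idea of training a three-group classifier.

    # Illustrative sketch only: a toy three-class classifier over hand-built
    # conversational features. Feature names and values are hypothetical,
    # not taken from the study.
    from sklearn.ensemble import RandomForestClassifier

    # Each row: [pauses, sighs, laughs, words_spoken] for one interview (toy values)
    X_train = [
        [12, 5, 0, 180],   # labelled "suicidal" in this made-up example
        [9,  4, 1, 210],
        [6,  2, 1, 320],   # labelled "mentally ill"
        [5,  3, 2, 300],
        [2,  0, 6, 450],   # labelled "neither"
        [1,  1, 5, 500],
    ]
    y_train = ["suicidal", "suicidal", "mentally ill", "mentally ill",
               "neither", "neither"]

    clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)

    # Classify a new interview with many pauses and sighs and little laughter
    new_interview = [[10, 4, 0, 200]]
    print(clf.predict(new_interview))   # e.g. ['suicidal']

In practice the study drew on far richer data, combining what patients said in surveys and structured interviews with how they said it, but the principle of turning conversational cues into features for a classifier is the same.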

Called Spreading Activation Mobile, or SAM, the app (pictured) can be used to record a counselling session. At its heart is a machine learning algorithm which classifies the person based on their responses

The algorithm has been incorporated into an app being tested at schools in the Cincinnati area. It can be used to record a counselling session, comparing the language used and non-verbal cues – such as sighing, laughing or pauses – to classify who is at risk of suicide (stock image used)

PICKING UP ON CUES 

In the study, researchers tested the algorithm on 379 patients who were suicidal, diagnosed as mentally ill, or neither.

A machine learning algorithm analysed the data from questionnaires and interviews to classify them as one of the three groups.

By analysing speech patterns and non-verbal cues - such as pauses and sighs in speech - it could correctly classify if someone is suicidal with 93 per cent accuracy.

It was also able to correctly classify people into one of the three groups with 85 per cent accuracy.

Among the key non-verbal indicators, researchers found that people in the non-suicidal, non-diagnosed group tended to laugh more, sigh less, and express less emotional pain and anger.

The research findings are published in the journal Suicide and Life-Threatening Behavior.

Dr John Pestian, a researcher from the Cincinnati Children's Hospital Medical Center, who led the research, said: 'These computational approaches provide novel opportunities to apply technological innovations in suicide care and prevention, and it surely is needed.

'When you look around health care facilities, you see tremendous support from technology, but not so much for those who care for mental illness.

'Only now are our algorithms capable of supporting those caregivers. 

'This methodology easily can be extended to schools, shelters, youth clubs, juvenile justice centers, and community centers, where earlier identification may help to reduce suicide attempts and deaths.'

According to the Cincinnati Enquirer, the algorithm has been incorporated into an app being tested at schools in the region.

The team is working on pulling in a range of other non-verbal cues. One approach uses face tracking technology to spot how often someone looks down – with early findings indicating suicidal teens may look down more often than non-suicidal peers (stock image)

SUICIDE FACTS 

Suicide was the second leading cause of death among 15- to 29-year-olds worldwide in 2012.

According to the Centers for Disease Control, the suicide rate in the US is 12.6 deaths per 100,000 people.

In the UK, the rate is slightly lower, with the Samaritans reporting an average of 10.8 deaths per 100,000 people.

Three-quarters of global suicides occurred in low- and middle-income countries in 2012. 

Called Spreading Activation Mobile, or SAM, the app can be used to record a counselling session.

It compares the language used and non-verbal cues – such as sighing, laughing or pauses – to classify who is at risk of suicide.

The Enquirer reports the app can also separate out the typical angst-related language and behaviour of teenagers.

Dr Pestian is reported to be working on pulling in a range of other non-verbal cues that could prove to be red flags which might otherwise be missed.

One such approach involves filming interviews, using face tracking technology to spot how often someone looks down – with early findings indicating suicidal teens may look down more often than non-suicidal peers.

'The technology is not going to stop the suicide, the technology can only say: "We have an issue over here",' Dr Pestian told the Enquirer.

'Then we have to intervene and get a path to get to care.'

He adds: 'If it's just a machine it is useless.' 

For confidential support call the Samaritans on 08457 90 90 90, visit a local Samaritans branch or see Samaritans.org.
