Carmel-based Syra Health is looking to its new AI-powered app to help it reach profitability and establish itself as a provider of high-tech mental health services.
The app, called Syrenity, launched in December for public download with a goal of helping people cope with challenges like stress, anxiety and depression.
Syrenity, while still small in reach, is part of an explosion of new health care services using artificial intelligence to communicate with people about highly personal matters. Medical and technology experts say they see benefits in the services but also caution about clinical, legal and ethical considerations as AI tools become more humanlike in their ability to converse.
Syra said its app uses a scientifically based approach to provide cognitive behavior therapy, or CBT, and mindfulness-based stress reduction to guide users. Syrenity also features a journal where users can write their thoughts and worries and interact with an artificial-intelligence-guided companion named Echo, which appears as a happy penguin-like avatar.
Syrenity offers dedicated home-screen sections for depression, anxiety, substance use, anger and insomnia.
“We wanted to create an application … that could be like a therapist in the pocket because it’s hard to get psychologist appointments, and also there’s a stigma associated with receiving mental health care,” Syra CEO Deepika Vuppalanchi said. “So this is something that they can do in an environment that is secure. They feel comfortable.”
Syra is looking for Syrenity to provide a steady source of recurring revenue for the company, which started trading on NASDAQ in September 2023 and has endured financial losses. The company lost $3.3 million in the first nine months of last year, and its share price has languished under $1 since early September. Yet the company said its goal is to achieve profitability in 2025 by growing revenue and cutting costs.
Syra offers a wide range of health care services, although most of its revenue comes from contracts in 24 states for services that include recruitment, training and workforce development of nurses and nursing assistants. The company’s largest customer is the Indiana Family and Social Services Administration.
Fast-growing market
With the creation of Syrenity, Syra seeks to establish itself in the fast-growing digital mental health market. But it will be tough. Smartphone apps focused on wellness and mental health seem to be everywhere.
California-based Headspace, co-founded by former Buddhist monk Andy Puddicombe, has expanded beyond its core offering of meditation. In October, Headspace launched its own conversational AI tool, dubbed "Ebb," to support people through "the ebbs and flows of life." Headspace, which claims 40 million users, also is targeting businesses and health care plans with mental health services.
Woebot Health, also of California, touts its app as a “mental health ally” using “responsible AI.”
Dr. Shaun Grannis, vice president of data and analytics at the Indianapolis-based Regenstrief Institute, told IBJ that technology-driven mental health tools hold the potential to help but also need to be more closely studied and understood given the growing sophistication of AI.
“I was trained in family medicine and practiced for about 13 years. And managing mental health, depression, anxiety, etc., was a big part of my job,” said Grannis, who is also a professor of family medicine at the Indiana University School of Medicine.
He said it’s “OK to be cautiously optimistic about these tools because clearly patients want to talk about how they’re feeling.” But Grannis cautioned that mental health tools need additional study to understand how the chatbot was trained and constrained in what it might say to people. In addition, he said, users need to understand the apps’ privacy and security terms.
“I think that AI is going to revolutionize our world,” Grannis said. “It’s already doing that.”
But, he said, “We have a lot more to learn about how to use [AI services], understand where they’re effective, where they’re not [and] what are the risks.”
The global market for mental health apps was $6.25 billion in 2023 and is expected to grow to more than $17 billion by 2030, according to San Francisco-based market researcher Grand View Research. The report cited a surge in downloads during the pandemic as fueling the market expansion for mental health apps.
How it works
Syra’s Syrenity costs $124.99 a year or $12.99 a month. The company did not disclose how many people have downloaded the app so far.
It’s also working on a business-to-business version of the app—one that could be offered by universities, insurance companies or others—which has had a “soft launch” for Syra employees and two other companies.
Syra says the Syrenity app stands apart because it is more comprehensive and clinical in its design, with a focus on prevention and intervention. It is also compliant with federal HIPAA privacy rules and features an “emergency button” that links users to the 988 suicide and crisis line in high-risk cases. Syra said Syrenity uses screening tools such as the GAD-7 scale for anxiety, PHQ-9 for depression and NIDA for substance use.
New Syrenity users begin by filling out an extensive questionnaire and completing an assessment of their mental health struggles, as well as providing information like the hobbies they enjoy. With that baseline, Syrenity is designed to provide tailored interventions to help prevent mental health concerns from escalating.
Users can track their progress compared to the baseline.
The penguin-like AI tool is named Echo because it is designed to mirror back to users what they are saying and ask questions to help guide them to answers.
A user might write: “I am really anxious about an upcoming interview and have been having feelings of inadequacy.”
Echo answers: “I can understand how interview anxiety and feelings of inadequacy can be challenging. Would you like to share more about the specific concerns you have or the situation you are facing?”
Syrenity users also can view reports based on their assessments with a graph plotting, for example, whether their anxiety is minimal, mild, moderate or severe. Other features include videos on topics such as mindful breathing and embracing your values.
More humanlike
Modern chatbots are becoming more humanlike than earlier versions, which were more scripted, said Michael Mattioli, professor of law at Indiana University Maurer School of Law in Bloomington.
“These more modern machine-based learning chatbots are using natural language processing,” said Mattioli, who has studied AI. “Instead of using very rigid rules, they are speaking to you in a more fluid way.”
A large language model, or LLM, is a type of AI like ChatGPT that can recognize connections between words to answer questions and communicate in a conversational way. The models are trained on large amounts of data that help them learn how to provide appropriate answers.
The fast advancement of AI worries some.
“Remember that AI and large language models have no sense of causality or morality,” said Ross Koppel, professor of biomedical informatics at the Perelman School of Medicine at the University of Pennsylvania and at the University at Buffalo. “I worry that we’re so hyped and so enamored of the AI and LLMs that we’re failing to say, ‘What the hell good does this do?’”
Koppel said that because there is a shortage of therapists, chatbots could help people. But he said it’s too soon to know for sure.
“All of the data I’ve seen on chatbots is that we really don’t know if they are efficacious,” Koppel said. “A lot of the tests that are done don’t seem to be particularly rigorous.”
Raja Reddy, Syra digital health portfolio lead, said Syrenity features what the company calls Syra Guardrails, or constraints, on what Echo can do. “That’s a kind of an ethical AI framework that our AI team has built out to ensure that the AI doesn’t step out of bounds, one, and then, two, it doesn’t also provide any clinical recommendations,” he said.
Syra said the clinical recommendations Syrenity avoids would be, for example, suggesting to a depressed person that he or she take an antidepressant.
Syra said it used four paid subject-matter experts, all psychologists, to validate content in the app.
One of those experts, Lorenzo Lorenzo-Luaces, an Indiana University Bloomington associate professor of psychological and brain sciences, said he was attracted to the comprehensive approach Syra took in developing Syrenity.
“It’s meant to work with people across the range of symptoms,” Lorenzo-Luaces said. “The recommendations that we give in the app are things that have been found to be effective for depression, anxiety, and they’re kind of skills that you could use in your everyday life.”
Lorenzo-Luaces also is the principal investigator in a study of Syrenity’s effectiveness. In the study, people with moderate or more serious depression are being recruited from across the nation. Half the participants are given immediate access to the Syrenity app for six weeks, while the other half wait about three weeks before being able to use the Syrenity app. So far, 135 people have taken part in the study out of a planned 300 participants.
Lorenzo-Luaces said early results show people using Syrenity had lower depression scores after six weeks. But he added that mental health apps in general have a small effect overall. “The apps are effective on average, but they don’t work for everyone, and they’re not the [be-all, end-all],” he added. “They have a place in the health care system.”
Syra’s terms and conditions on the Syrenity app include a statement that Syrenity does not provide medical or health care advice and is not a substitute for a licensed health care professional.
IU’s Mattioli said users should be aware of the distinction between a licensed clinician and an app.
“I don’t think AI is a cure-all, but I think it could be helpful to some people who are dealing with barriers in accessing traditional therapy,” he said. “It’s important for disclaimers to be in place. If it’s not really designed or clear to give medical advice and it is, users should understand that.”•