Needed Innovation in Digital Health and Smartphone Applications for Mental Health: Transparency and Trust.

The promise of smartphone applications and connected technologies for mental health to advance diagnosis, augment treatment, and expand access has received much attention. Mental health disorders are a leading cause of years of life lost to disability and premature mortality, and they also contribute to employee absenteeism and lost productivity in economically developed countries such as the United States. The potential of smartphone applications to offer new, at-your-fingertips tools and resources for mental health care is frequently cited. But this potential is not the only reason why smartphone applications are hard to ignore: the reality of applications for clinical care is already here. More than 10 000 mental health–related applications are available to download, and that number increases daily. As smartphones become increasingly inexpensive and available to the entire population, including those with mental illness, the accessibility, immediacy, affordability, and bold marketing claims of applications will drive more patients to use them.

This new reality is worrisome: studies suggest that most mental health applications in commercial marketplaces do not conform to clinical guidelines. Some even offer dangerous recommendations, such as one application that advises people experiencing a bipolar manic episode to drink hard alcohol before bedtime to assist with sleeping.2 Most of these non-evidence-based applications are likely to distract patients and may cause them to delay seeking care. Many applications do not respect the privacy of personal health information, and the price of a free application is often buried in a complex privacy policy requiring college-level reading comprehension: that price is the right to market and sell your data.3 Certainly there are exceptions, as a handful of safe, evidence-based, and useful applications exist. Still, these helpful applications may be too difficult for both patients and clinicians to find among the hundreds of more problematic ones.

Mental health technologies such as smartphone applications have not been thoroughly investigated through clinical science or overseen through regulatory control. Instead, there is a void in which the potential and present reality of health applications are confusing, marred by a lack of transparency and trust. The situation exists partially because the US Food and Drug Administration (FDA) has taken a “hands-off” approach toward health applications, meaning that most mental health applications do not fall under federal regulation. The 21st Century Cures Act, Section 3060, “Clarifying Medical Software Regulations,” indicates that this hands-off approach will continue and become more lax. Astonishingly, the Apple iTunes and Google Play stores are the default arbiters and agents responsible for releasing (and on some occasions, withdrawing) applications, despite evidence that neither their well-known star ratings nor their number of downloads correlates well with health application quality.4 In early September 2016, Apple announced that it would no longer allow certain health applications in its marketplace.
This announcement was seen as doing more to protect public interests related to health applications than the FDA has done.5 One of Apple’s guidelines states, for instance, that drug dosage calculators offered in health applications “must come from the drug manufacturer, a hospital, university, health insurance company, or other approved entity, or receive approval by the FDA or one of its international counterparts.”5 Such a move is a first step on a long journey, but it raises the question of how health application offerings will be evaluated transparently if manufacturers, hospitals, universities, health insurance companies, and the FDA do not gather evidence and define appropriate standards.

Another recent first step is the greater engagement of professional societies. For example, the American Psychiatric Association recently released a smartphone application evaluation model that does not specifically recommend or endorse any one application but rather guides clinicians in considering the safety, evidence, usability, and interoperability of an application to make a more informed decision about its use.6 As mental health applications continue to mature, finding consensus and synergy among all stakeholder groups will be critical to creating transparency and trust.

While the potential of mental health applications and connected technologies has powered the paradigm of mobile health for the field, it is time for clinical science to assume greater leadership and bring greater transparency and trust. Application technology is not the limiting factor in adopting these digital tools; trust and transparency are. All of health care, and especially mental health care, revolves around expectations of confidentiality and respect for privacy when patients disclose their most intimate experiences and vulnerabilities. To have therapeutic value, we suggest that mental health applications must earn that same trust.