Just as preparing new doctors to properly diagnose patients requires carefully observing and correcting them on the job, developing new healthcare diagnostic technology requires closely monitoring and assessing it in real-world medical settings. 

However, the repeated interruptions of the typical regulatory paperwork regimen make this process time-consuming and expensive. These delays weigh especially heavily on some of the most exciting advances in health diagnostic aids: those involving artificial intelligence (AI), loosely defined here as machine learning.

The regulatory sandbox model has the potential to avoid many of these delays and costs while maintaining the protections that consumers expect in the healthcare field. It relies on collaboration among trial subjects, regulators, and researchers.

This type of program already gives fintech innovators and consumers in the United States the opportunity to take advantage of new technology earlier than would otherwise be possible. Its speedy adoption in healthcare applications such as medical imaging, health chatbots, and medication management has the potential to improve health and save lives.

However, health care involves even more serious concerns than finance, so deeper questions about applying the sandbox method in this space are natural and appropriate as these programs await congressional approval. This article addresses two common concerns about healthcare regulatory sandboxes: safety and privacy.

How Will Legislation Prevent Dangerous Human and Machine Errors?

Avoiding mistakes means that machines need to complement, not replace, humans. 

The U.S. Food and Drug Administration (FDA) has already partnered with the U.K. and Canada to create ten guiding principles by which to judge the safety and effectiveness of AI devices, one of which is that “focus is placed on the performance of the human-AI team … rather than just the performance of the model in isolation.” Strictly enforcing this must remain a top priority.

Another guiding principle emphasizes that clear communication with both providers and patients must take place at every step of the process. Subjects must receive and understand detailed information on every possible aspect of a diagnostic device, and users must also be “made aware of device modifications and updates from real-world performance monitoring, the basis for decision-making when available, and a means to communicate product concerns to the developer.”

How Will Legislation Protect Sensitive Patient Information?

AI analyzes data, makes predictions, and suggests solutions faster and (often) more accurately than humans. However, because it must learn from the same sensitive data a human would, it raises new concerns about patient privacy in medical applications.

The Health Insurance Portability and Accountability Act (HIPAA) of 1996 and related laws currently promise to prevent unnecessary sharing of health information. Unfortunately, not only does reality already fall short of that promise, but adhering to the current research and development laws takes so long that fast-moving tech advances can render creations obsolete by the time regulators authorize them.

Multiple methods can mitigate these issues. The United Kingdom (U.K.), the originator of the sandbox idea in 2014, provides templates in its AI Airlock for health care and the Genomics England Research Environment. In both, for example, humans still review requests for data, remove identifying information, and confirm compliance with current law. Multiple levels of checkpoints further lower the chance of accidental leaks.

As a perk of participating in a sandbox, creators can have these data requests expedited, moved to the top of the review schedule, preserving efficiency without sacrificing privacy.

Can We Preemptively Address Risks?

Yes. Despite its many prospective pitfalls, the combination of futuristic technology and sandbox adoption has immense potential for good. 

Truly informed consent, swift consequences for government or company abuses, and consistent updates to protocols will lower the likelihood of damaging errors, while still opening possibilities for patients and practitioners who want to be part of this process. As always, more freedom and less government overreach will drive better health care.