Amid fast-paced technological innovation in the pharmaceutical and biotechnology industries, including gene therapies and in vitro technologies, a parallel and unprecedented rise in healthcare data collection has occurred across multiple settings and platforms, through real-world data (RWD), genomic and metabolomic datasets, wearable devices, and health apps. In response, a joint task force of the European Medicines Agency (EMA) and the Heads of Medicines Agencies (HMA) was created in 2017. The Big Data Task Force (BDTF) was mandated to make recommendations on how big data should be used in the ongoing shift of the regulatory paradigm, moving risk-benefit decisions from relying largely on traditional clinical trials to drawing on a broader and complementary range of sources.

Big data, a widely used term without a single commonly accepted definition, was defined by the BDTF as:

“extremely large datasets which may be complex, multi-dimensional, unstructured and heterogeneous, which are accumulating rapidly, and which may be analysed computationally to reveal patterns, trends, and associations. In general, big data sets require advanced or specialised methods to provide an answer within reliable constraints.”

Having defined the subject, Phase I of the task force’s work had as its main goal identifying big data challenges and opportunities for medicines regulators in the European Economic Area (EEA). This was pursued by: identifying the sources and characteristics of data and defining the main formats in which they are expected to be collected; exploring the applicability and impact of big data on medicines regulation; developing recommendations on necessary changes to legislation, regulatory guidelines, and data security; and establishing guidance for the development of big data capabilities for the evaluation of marketing authorisation applications and clinical trials.

In February 2019, the Phase I report was published, presenting a review of the big data landscape and identifying opportunities for improvement. Among other areas, recommendations addressed data quality and standardisation, data sharing and access, and the regulatory acceptability of big data analyses. The task force took a collaborative approach with regulatory authorities and partners outside the EEA (e.g. the U.S. Food and Drug Administration and Health Canada) to consider their insights on big data initiatives.

Acknowledging that BDTF Phase I had addressed the “what” of the current big data landscape, Phase II was subsequently launched to address the “which”, the “how”, and the “when”. The priority now shifted to establishing concrete actions to move from high-level recommendations to implementation.

Advances in technology are enabling the digitisation of large volumes of unstructured data, increasing the need to clarify how the resulting insights and evidence can be accepted by decision-making authorities. One major barrier identified by the BDTF, alongside limited access to big data, is the EU regulatory network’s lack of resources and capability to analyse large volumes of unstructured data, which often require additional re-analysis to validate results. Another crucial element is the need to strengthen the EU regulatory network so it can guide and critically interpret analyses derived from big data generated by emerging technologies and novel analytical approaches, ensuring confidence in the conclusions.

In January 2020, the Phase II report was published, setting out actions and recommendations to deliver what the BDTF established as its vision:

“a strengthened regulatory system that can efficiently integrate data analysis into its assessment processes to improve decision-making. This will be supported by knowledge of data sources, their quality and their relevance for the European population, continual optimisation of data quality and analytical approaches and promotion of a secure and ethical data sharing culture. Training and external collaborations will be key in order to build expertise. Knowing when and how to rely on novel technologies, and the evidence generated from Big Data, will benefit public health by accelerating medicines development, improving treatment outcomes and facilitating earlier patient access to new treatments.”

The key recommendation, and the most ambitious one, is the creation of the Data Analysis and Real World Interrogation Network (DARWIN), expected to be launched by 2023. DARWIN will be a European network of databases of known quality and content, established at the highest level of data security, to be used to provide robust evidence to decision-making authorities. The BDTF made several other recommendations, including extending the scope and utility of the database of the European Network of Centres for Pharmacoepidemiology and Pharmacovigilance (ENCePP) and advising EU regulators to work closely with other EU and international initiatives (such as the Global Alliance for Genomics and Health) to increase the regulatory use of healthcare data sources. The minimum costs to the EU regulatory bodies (the EMA and national competent authorities) were also identified: in addition to initial funding of USD 27–45 million, implementation will require annual funding of approximately USD 9–18 million.

Further recommendations include guidance on data quality and metrics, developing guidelines on the acceptability of evidence, and increasing the evidence available by identifying relevant data sources. The task force also gave advice on how to improve learning on the utility of real-world evidence (RWE) and big data in drug development, modernise IT infrastructure, and ensure secure and ethical data sharing. It also proposed increasing the EU regulatory network’s capabilities through training on big data, as well as stronger efforts towards international regulatory alignment on data standards. So far, the BDTF recommendations mainly focus on how to support stakeholders in making better use of existing and future tools, without providing technical and operational recommendations.

With the clear understanding that the data landscape is evolving, regulatory systems around the world need to evolve and invest as well. Worldwide examples include the FDA’s investment of around USD 1 billion over the past 10 years in the Sentinel system, a national electronic system to monitor the safety of FDA-regulated medical products; the Canadian Network for Observational Drug Effect Studies (CNODES), a network of healthcare databases that collects safety and effectiveness data on drugs marketed in Canada, at an annual cost of roughly USD 5.5 million; and the Japanese Medical Information Database Network (MID-NET), a hospital data network established to assess drug safety.

However, big data cannot be seen as the solution to all the challenges regulators face in making correct and informed decisions. The current reference standard for regulatory decisions remains the randomised, double-blind, controlled clinical trial, which allows data collection under clear inclusion/exclusion criteria and protocols designed to reduce potential bias and confounding; evidence generated from big data must therefore be seen as complementary, add-on evidence. The large-scale approach, with its known limitations (e.g. weaker internal validity, lack of quality control in data collection), will not replace the gold standard but will be a valuable tool to facilitate and improve decisions, particularly for less common, severe, or long-term adverse effects that cannot be detected in clinical trials.

As of February 2020, the BDTF Phase II recommendations and proposals are being considered by the European Commission. If implemented, they will be a major step for EU regulatory authorities in building expertise and in generating, interpreting, and drawing conclusions from meaningfully larger data sets, improving EU public health and supporting the assessment of innovation.
