Whitepapers

AI in Drug Discovery: A Review of US & EU Policy Guidance



Artificial Intelligence (AI) is revolutionizing the life sciences industry, driving unprecedented transformation in drug discovery and development. Our latest whitepaper reviews the evolution and recent transformative impact of AI technologies in drug discovery and considers the latest policy guidance in the United States and European Union. Download today to learn more about:

Historical Context: Explore the evolution of AI in drug discovery, from the introduction of computer-aided drug design (CADD) to modern AI methodologies.

Current Applications: Understand how AI is being utilized to repurpose existing drugs, identify new therapeutic targets, and design novel molecules. Discover the promising future of AI in personalized medicine and predictive modeling.

Policy Guidance: Gain insights into the latest US and EU policies guiding the integration of AI in drug discovery, emphasizing responsible innovation, regulatory support, and ethical considerations.

This whitepaper provides a comprehensive analysis of global policy developments, including:

  • The FDA’s proactive guidance and upcoming draft regulations on AI/ML in drug development.
  • The European Union’s AI Act and EMA’s reflections on AI in regulatory decision-making.
  • The role of global AI Safety Institutes in ensuring safe and ethical AI applications in life sciences.

Unlock the full potential of AI in drug discovery by understanding the regulatory landscape and emerging opportunities. Download our whitepaper to explore the future of pharma and integrate cutting-edge AI strategies into your development pipeline.

Complete the form to access the full whitepaper

For more insights on drug development, target identification, and real-world data integration, visit our Lifebit Blog.


Key Trends for Precision Medicine in 2024


Enabling personalized medicine through tech innovation, secure diverse data access, newborn sequencing, public trust, and global collaboration

During the Precision Medicine Community Event held on December 18, 2023, in London, leaders shared the latest breakthroughs in genomics, personalized medicine, and drug discovery. Emphasizing the importance of secure access to diverse data and the continuous need to cultivate public and patient trust, the event brought together influential figures from healthcare, industry, and academia worldwide.

This gathering facilitated connections and established new partnerships, with the overarching objective of expediting the secure use of health and biomedical data in research to save lives.

To fully realize the potential of genomics in delivering personalized medicine to entire populations, this white paper outlines the five key themes and anticipated trends for 2024 that emerged from the event:

  1. Leveraging innovative technology for secure data linkage and access
  2. Utilizing diverse data in clinical trials for drug discovery
  3. Implementing genomic sequencing for newborns
  4. Involving the public to earn and strengthen trust
  5. Fostering global collaboration among biobanks, pharmaceutical companies, and healthcare providers

Designing clinical trials for optimal success




Clinical trials serve as the backbone of medical research, providing essential evidence for the safety and efficacy of new treatments and interventions.

However, designing clinical trials that yield meaningful results requires careful consideration of numerous factors, including study design, patient population, endpoints, and statistical analysis methods.

This white paper explores strategies for designing clinical trials for optimal success, drawing upon scientific research and best practices in the field.

Key white paper highlights include insights on: 

  • selecting representative patient populations,
  • choosing appropriate endpoints and outcomes,
  • employing robust statistical analysis methods, and
  • utilizing pre-existing real-world datasets.

Applying these strategies will enable researchers to maximize the likelihood of obtaining meaningful and reliable results.

How to use disease, real world & population-level data to train AI models


The availability and use of disease, real-world, and population-level datasets are essential for AI in research and healthcare. With such data, researchers can train AI models to derive useful insights, enhance clinical decision-making, and improve patient outcomes.

In this whitepaper, we examine the approaches and best practices for using disease, real-world, and population-level data to train AI models.

Key white paper highlights include: 

  • The importance of data preprocessing, standardization and feature engineering,
  • Aspects of AI model training and validation, and
  • Real-world applications and case studies from the health research and drug discovery fields.

Download the full white paper now and discover the future of training AI models using disease, real-world and population-level data with Lifebit.

Innovating Healthcare: RWD Strategies for Pharma Companies


Discover the transformative potential of real-world data (RWD) and real-world evidence (RWE) in clinical research and trials through our latest white paper. Here, we introduce pioneering approaches for pharmaceutical companies to effectively leverage RWD in healthcare.

The healthcare sector undergoes continuous evolution, marked by the regular emergence of new technologies, treatments, and regulations. For pharma companies, this dynamic environment calls for innovative strategies to stay ahead.


Explore the various applications of RWD and RWE across clinical research and learn about the benefits they bring. However, leveraging RWD to its fullest potential comes with challenges, including data quality, standardization, and regulatory considerations.

Discover innovative strategies for harnessing RWD in healthcare for pharmaceutical companies. Together, let’s navigate the realm of healthcare RWD to gain insights and drive innovation, excellence, and improved patient outcomes.

Download the full white paper now and discover the future of efficient, data-driven drug discovery with Lifebit.

Improving efficiency in drug discovery workflows




Discover the challenges facing drug discovery researchers today and explore the innovative solutions helping to overcome them in our white paper. We introduce the Lifebit drug discovery module, which seamlessly integrates genomics data processing through advanced Nextflow pipelines.

Key white paper highlights include: 

  • Drug discovery module overview

Uncover the features that set this module apart, from identifying causal variants to understanding associated genes and proteins. Discover the efficiency of a system designed for easy integration and adaptability to the ever-evolving field of genomics.

  • Drug discovery module in action: a use case

Walk through a real-world use case, and understand the module’s sensitivity and its ability to recover causal variants, offering insights into its potential impact on diverse healthcare research scenarios.

  • Key benefits of the module


Explore the tangible benefits of employing the module, including reduced costs, increased efficiency and accelerated time to results. Understand how our platform empowers researchers, streamlining workflows and providing valuable insights into potential therapeutic interventions.

Download the full white paper now and discover the future of efficient, data-driven drug discovery with Lifebit.

Forecasting the future of genomic management


The past, present and future of genomic data

Worldwide, large population genomics sequencing projects are being established to implement population-level genomic medicine. 

These projects generate huge amounts of genomic data, which require careful management to remain secure yet accessible and analysable to researchers worldwide.

In this whitepaper, we discuss how large-scale genomic data is managed by researchers and organisations today and the challenges these current methods face. This includes the sheer size and complexity of the datasets, changing regulatory landscapes and the distributed nature of the data.

We also consider the new approaches and technologies being adopted to ensure genomic data management is effectively future-proofed.

To understand how organisations can overcome these issues, we discuss the roles of:

  • trusted research environments (TREs) and federated data analysis in allowing secure data access 
  • data standardisation and cloud computing in enabling data to be safely combined
  • low/no-code tools and their ability to help democratise secure data access globally

Download our full white paper to discover in more detail Lifebit’s approach to future-proofing genomic data management.


Secure Data, Scalable Research


Regulating health data research platforms at a national scale

Industry Reflections on National-Level Trusted Research Environment Accreditation Policies


Linked health and multi-omics data generated by hospitals, healthcare providers, research organisations and clinical trials holds vast untapped potential, with the ability to advance research and innovation, improve patient care and accelerate drug development.


However, accessing health data is slow, inefficient and fragmented. Today, most of the world's health data resources are siloed across distributed projects, organisations and platforms. As a result, data is frequently duplicated or moved to enable research access, an inefficient and expensive means of sharing data that also increases data privacy risk.

“Access to linked de-identified data through Secure Data Environments maximises the opportunity for health and care research and innovation in the public interest whilst protecting the privacy of individuals”

Trusted Research Environments (TREs) offer a solution for enabling secure data access for research. Also known as “Secure Data Environments” (SDEs) or “Data Safe Havens”, TREs are highly secure, controlled computing environments that allow researchers to access data safely. They are typically deployed at the data custodian's or data controller's site, either on-premise or in the cloud. TREs can then be linked virtually in a process known as federated data analysis, enabling authorised researchers to securely access relevant health data across distributed sites.

To ensure the safe use of data in TREs as they propagate across the industry, work must be done to ensure that industry standards and best practices are met across cyber security, technical capability and information governance. Several accreditation frameworks that audit and certify TREs/SDEs to such standards have been published to support this work, including the NHS Secure Data Environment and Our Future Health TRE accreditation processes. In this whitepaper, Lifebit provides an industry perspective on the challenges and opportunities for using TREs/SDEs in the UK’s health data ecosystem, with a recap of updates in health data policy.

The paper also:

  • Summarises new frameworks for the safe use of data within TREs/SDEs, and highlights recent accreditation processes in health data programmes. 
  • Dissects the implications of TRE/SDE accreditation processes for public and private sector organisations across the health data ecosystem.
  • Highlights exemplary work supporting these frameworks as case studies in partnership with the Eastern Academic Health Science Network (AHSN), including a key example where multi-party federated data analysis was performed for the first time in the UK.

Download our full white paper to discover in more detail Lifebit’s and Eastern AHSN’s reflections on national-level TRE accreditation policies.

Lifebit's Approach to Open-Source Software


Lifebit’s Platform offers researchers and clinicians a data-driven view into the determinants of health and disease. This facilitates clinical impacts, accelerates drug discovery, and ultimately improves patient outcomes.

Challenge: incorporating open-source software into platforms while maintaining security, support, and longevity

Researchers are increasingly turning to open-source software to access the latest, best-in-class tools for data analysis and sharing. The benefits of open-source software are clear, so when organisations build data analytics platforms and trusted research environments, they increasingly turn to open-source solutions. There are three approaches an organisation can take to adopting open-source software: closed platforms, DIY solutions and open platforms. However, each approach carries its own risks.

Closed platform or DIY platform approaches to open-source software come with high risk in terms of:

  • Lack of security and compliance
  • Lack of ongoing maintenance of software and support
  • High costs and poor user experience/interface (UX/UI)

Solution: open platform

Open platform systems have multiple benefits to end users, whilst ensuring security is not compromised. They:

  • enable users to add features and functionality they could not add in closed systems,
  • help combat vendor lock-in and data silos, which are often significant issues with closed platforms and DIY solutions, and
  • extend support for a variety of open-source integrations and applications, minimising the security risks of adopting innovative, community-driven applications.

In this white paper we’ll explore the benefits and drawbacks of each approach and how Lifebit uses open-source software in our Platform while ensuring security, support and longevity. For access to the full white paper, fill out the form with your information.

Top 10 Best Practices for Building a Trusted Research Environment


Best Practice 1: Maintain ownership

Maintain total ownership of your data and TRE to maximise security and research outputs, while minimising cost

To maintain total security over the data, it must be kept exclusively in the Data Custodian's own TRE environment. This also reduces costs associated with data transfers and storage, setting the TRE on a sustainable path to fast-track research.

Best Practice 2: Federate to collaborate

Adopt a federated approach to open up secure collaboration and commercialisation opportunities

Federation is the future of big data analytics. With a federated approach, Data Custodians retain full control over their datasets: all data remains securely within the bounds and firewalls of the TRE, and only the analysis and computation are brought to the data. Researchers gain access to larger, more diverse cohorts, and organisations open up opportunities for data collaboration and commercialisation.
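The idea that only analysis travels, never the data, can be illustrated with a minimal sketch. This is not Lifebit's implementation; the function names and the allele-frequency example are purely illustrative: each site computes a local summary behind its own firewall, and only aggregate counts cross the boundary.

```python
# Illustrative sketch of federated analysis: each TRE computes a local
# summary, and only aggregate statistics (never row-level data) leave the site.

def local_summary(records):
    """Runs inside a TRE; returns only counts, not patient-level data."""
    carriers = sum(1 for r in records if r["has_variant"])
    return {"n": len(records), "carriers": carriers}

def federated_frequency(site_summaries):
    """Runs at the coordinator; combines per-site aggregates only."""
    total_n = sum(s["n"] for s in site_summaries)
    total_carriers = sum(s["carriers"] for s in site_summaries)
    return total_carriers / total_n

# Example: three sites, each holding its own cohort behind its firewall
site_a = [{"has_variant": True}, {"has_variant": False}]
site_b = [{"has_variant": True}, {"has_variant": True}]
site_c = [{"has_variant": False}, {"has_variant": False}]

summaries = [local_summary(cohort) for cohort in (site_a, site_b, site_c)]
print(federated_frequency(summaries))  # 0.5
```

The key property is in the return values: `local_summary` exposes two integers per site, so the coordinator can compute the pooled frequency without any individual record ever leaving a TRE.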

Best Practice 3: Follow the Five Safes for secure data

Implement the Five Safes to maximise security throughout the data lifecycle

The Five Safes framework lays out a set of principles for ensuring safe access to sensitive data, and has recently been applied to TRE management by the UK's national health data institute, HDR UK. Implement its five pillars for secure data management: Safe People, Safe Projects, Safe Settings, Safe Data and Safe Outputs.

Best Practice 4: Industry-standard security and compliance

Establish industry-level compliance standards and transparent security processes to maintain public trust

TREs must comply with industry-wide standards that go beyond GDPR and HIPAA, for example ISO 27001 and the UK government-backed scheme Cyber Essentials Plus. TREs need a systematic approach to managing and protecting health data, including regular external audits, meaningful Patient and Public Involvement and Engagement, and transparency on security procedures, to ensure the long-term success of population health initiatives.

Best Practice 5: Automate the transformation to research-ready data

Automation of upstream pipelines and harmonisation processes can guarantee rapid and standardised production of research-ready data

TREs need automated systems within the platform (e.g. ETL pipelines and APIs) to manage the large-scale data flowing into the environment and efficiently convert it to standardised analysis-ready data. Effective harmonisation and standardisation using industry-recognised standards (e.g. HL7 FHIR and OMOP CDM) means data resources can be queried more quickly and efficiently.
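The harmonisation step described above can be sketched as a tiny ETL mapping function. This is written in the spirit of OMOP-style standard-concept mapping, but the field names, concept IDs and vocabulary codes below are illustrative placeholders, not the real OMOP tables or mappings:

```python
# Illustrative ETL sketch: map heterogeneous source records onto a shared
# target schema so downstream queries run against one standardised shape.

RAW_TO_STANDARD = {        # source-vocabulary code -> standard concept id
    "icd10:E11": 201826,   # illustrative mapping, not an authoritative one
    "icd10:I10": 320128,   # illustrative mapping, not an authoritative one
}

def harmonise(record):
    """Map one raw EHR record to an analysis-ready row; drop unmappable codes."""
    concept = RAW_TO_STANDARD.get(record["code"])
    if concept is None:
        return None  # a real pipeline would route this to a data-quality queue
    return {
        "person_id": record["patient"],
        "concept_id": concept,
        "start_date": record["date"],
    }

raw = [
    {"patient": 1, "code": "icd10:E11", "date": "2023-04-01"},
    {"patient": 2, "code": "local:XYZ", "date": "2023-05-12"},  # unmappable
]
rows = [harmonise(r) for r in raw]
research_ready = [row for row in rows if row is not None]
print(len(research_ready))  # 1
```

Automating this mapping, rather than curating each dataset by hand, is what lets new data flowing into the TRE become queryable quickly and consistently.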

Best Practice 6: FAIR data

Create standardised metadata and use FAIR principles to make data findable and reusable

Aligning with industry standards like FAIR (Findable, Accessible, Interoperable, and Reusable) and HDR UK’s Data Utility Framework brings a number of benefits to Data Custodians – making data more findable and reusable, enabling integration with other datasets or public repositories and aiding efficient data interpretation.

Best Practice 7: Multi-layered security controls

Apply trusted data controls to maximise security at each layer of the platform 

Protecting data confidentiality and security within a TRE takes a multi-layered approach. Key controls to implement include de-identification, encryption, an output airlock, role-based access control, tiered access levels and workspace/dataset segregation.
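Two of these layers, role-based access control and tiered access levels, compose naturally: each role has a ceiling tier, and a request succeeds only if the dataset's tier sits at or below that ceiling. A minimal sketch, with made-up role and tier names that are not drawn from any real TRE configuration:

```python
# Illustrative sketch of layering role-based access control with tiered
# access levels: a role may only read datasets at or below its ceiling tier.

TIERS = {"open": 0, "de_identified": 1, "identifiable": 2}

ROLE_MAX_TIER = {                       # highest tier each role may read
    "analyst": "de_identified",
    "data_manager": "identifiable",
    "external_collaborator": "open",
}

def can_access(role, dataset_tier):
    """Deny by default; allow only when the role's ceiling covers the tier."""
    ceiling = ROLE_MAX_TIER.get(role)
    if ceiling is None:
        return False                    # unknown roles are denied outright
    return TIERS[dataset_tier] <= TIERS[ceiling]

print(can_access("analyst", "de_identified"))  # True
print(can_access("analyst", "identifiable"))   # False
```

The deny-by-default branch matters as much as the comparison: a control layer should fail closed, so misconfigured or unrecognised roles never gain access.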

Best Practice 8: Procure an all-in-one solution

Procure an all-in-one TRE solution for smooth operations and mitigation of delivery risk

Deploy an all-in-one solution, where billing, infrastructure and TRE are all centrally managed by your TRE provider. This simplifies configuration and operations, mitigating delivery risks.

Best Practice 9: Future-proof with an infrastructure-agnostic provider

An infrastructure- and cloud-agnostic TRE provider protects against vendor lock-in and project continuity risks

Selecting a TRE provider that is infrastructure- and cloud-agnostic is essential to future-proof a TRE against vendor lock-in and project continuity risk. The cloud/HPC environment account should also be created in your organisation’s name to mitigate dependency on infrastructure providers.

Best Practice 10: An open ecosystem extends TRE functionality

Build an open ecosystem platform to seamlessly integrate with community innovations and extend platform functionality 

Whether open, closed or DIY, the type of platform heavily influences how open-source software is managed and used. An open platform offers end-users the ability to customise the TRE with additional functionalities by integrating third-party applications, tools and data via APIs. 

Finally, closely evaluate computational and storage costs with each infrastructure provider considered. Any pricing evaluation must be accompanied by full disclosure of the official underlying costs, as well as any relevant reseller, government or public sector discounts achieved.

Read The Complete Guide to Trusted Research Environments in 2023

Questions on procuring or building a Trusted Research Environment? Contact us at hello@lifebit.ai