
Research

October 20, 2025

9 mins read

Designing trustworthy learning systems in precision medicine and the scalability gap

by Moniepoint R&D

Moniepoint welcomed Ernest E. Onuiri, Ph.D., to share his knowledge of precision medicine and his learnings from applying machine learning to cancer detection. He gave a well-rounded view of what is happening in academic research.


Introduction


The promise of healthcare often feels like a distant ideal, especially when faced with the "one-size-fits-all" approach to treatment. Imagine receiving medicine for malaria that works for some, but not for you, or a cancer treatment that cures one patient but harms another. This disparity highlights a fundamental problem that precision medicine aims to solve: tailoring medical interventions to the individual. But while the vision of customised care is compelling, its widespread adoption faces significant challenges, particularly around scalability, data complexity, and a collective reluctance to embrace change. We will explore the core concepts of precision medicine, delve into the obstacles hindering its scalability, and propose pathways for collaboration between industry and academia to build the trustworthy AI systems needed to revolutionise healthcare.


What is Precision Medicine?

At its heart, precision medicine is about moving beyond generic treatments to customise care for each individual. The goal is to minimise adverse drug reactions and autoimmune problems that can sometimes be more harmful than the disease itself. This personalised approach relies on a rich collection of data points (a minimal sketch of such a patient profile follows the list below), including:

  • Clinical Records: Observable features like temperature, weight, and responses to basic questions gathered during a hospital visit. These are generally easy to understand.

  • Genomic Landscape: The complex internal data within humans often explains why things go wrong internally before external manifestations appear. This data is not easily accessible, expensive to obtain, and requires specialised tools and expertise to interpret, often appearing like noise to the untrained eye.

  • Imaging Data: Various forms of visual data obtained through technologies like MRI, CT, ultrasound, or X-ray.
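
As an illustration only, here is one way such a multimodal patient profile might be represented in code. Everything here is hypothetical; the class and field names simply mirror the three modalities above.

    from dataclasses import dataclass, field
    from typing import Dict, Optional

    import numpy as np


    @dataclass
    class PatientProfile:
        """Illustrative container for the three data modalities above."""

        patient_id: str
        # Clinical records: observable, easy-to-interpret features.
        temperature_c: Optional[float] = None
        weight_kg: Optional[float] = None
        # Genomic landscape: e.g. expression levels keyed by gene symbol.
        gene_expression: Dict[str, float] = field(default_factory=dict)
        # Imaging data: e.g. a pixel array from an MRI, CT, or X-ray slice.
        mri_slice: Optional[np.ndarray] = None


    # A profile can be assembled incrementally as each modality arrives.
    profile = PatientProfile(
        patient_id="PT-0001",
        temperature_c=37.2,
        weight_kg=68.5,
        gene_expression={"BRCA1": 2.41, "TP53": 0.87},
    )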

Several key principles underpin precision medicine:

  • Personalisation: Customising medicine for a patient based on their specific needs.

  • Prediction: The ability to anticipate health events, which enables proactive prevention and treatment.

  • Participation: Empowering the patient to participate actively in their healthcare journey, asking questions, understanding the proposed treatments, and giving a sense of ownership.


The Scalability Gap: Challenges in Precision Medicine

Despite its promise, precision medicine encounters substantial hurdles in scaling beyond controlled research environments to widespread clinical adoption.

  • Data Complexity and Heterogeneity: Diseases like cancer are inherently heterogeneous, with various subtypes and severity levels. Genomic data, unlike simple clinical records, is incredibly complex and requires specialised tools and programming languages (such as R) to be unpacked and made intelligible. Combining different data modalities (clinical, genomic, imaging) compounds this complexity further.

  • Data Integration and Quality: Healthcare data is often siloed, stored in disparate locations (e.g., Lagos, London, Tokyo) and in formats shaped by local practices, making integration a hurdle for researchers. Furthermore, data is rarely clean: missing values, noise, and inconsistent structure waste research time and produce unreliable insights. Even long-standing institutional repositories may not be trustworthy for new applications, because their original workflows were never designed for broad utility.

  • Cost and Accessibility: While the technology exists for genomic sequencing, it remains prohibitively expensive and is not readily available in most hospitals, limiting its widespread application.

  • Reluctance to Change: A significant challenge is the human element – the resistance to adopting new, unconventional approaches. Experienced medical professionals may adhere to traditional methods, viewing new technologies like Electronic Medical Records (EMRs) with suspicion, fearing activity tracking rather than appreciating their potential for record-keeping and knowledge transfer.

  • Lack of Collaboration: There is often a disconnect between academia and industry. Academia may pursue theoretical solutions while industry focuses on immediately monetisable products, so opportunities for collaboration are missed. Academic studies, especially in certain parts of the world, often lack the funding and time to generate their own clean, fit-for-purpose data, forcing reliance on external sources.


Lessons from COVID-19 and the Importance of Validation

The COVID-19 pandemic offered crucial lessons, particularly regarding the need for rigorous validation and the potential of time compression in drug discovery. The rapid development of vaccines, facilitated by in silico (computational) methods, showcased how processes that once took years could be dramatically accelerated. However, this unprecedented speed also led to questions about the longevity and nature of these interventions, as evidenced by the need for multiple doses and boosters, unlike traditional vaccines.

This experience underscores that validation is a never-ending process. Engineers and researchers must cultivate an "inquisitive mind" that constantly asks "what if", even when a solution works perfectly. The speaker also emphasises that simplicity often trumps complexity in engineering solutions. While complex models seem more robust, simpler approaches can often yield equally good, if not better, outcomes with less computational burden and more efficient resource utilisation. Over-complicating solutions just for the sake of it produces systems that struggle and ultimately fail to deliver effectively.


Pathways to Bridging the Gap: The Power of Collaboration

Overcoming precision medicine's scalability gap demands a concerted effort and a fundamental shift towards synergistic collaboration between industry, academia, and regulatory bodies.

  • Industry-Academia Alignment: When industry (with its resources and real-world problems) and academia (with its research and innovative thinking) combine forces, they create something far greater than either could achieve alone. This symbiosis can translate research into monetisable solutions that address real industry needs, as seen in Silicon Valley's relationship with Stanford University.

  • Shared Data Repositories: Industry players, especially those generating vast amounts of data, should partner with academia to create shared data repositories. This would allow researchers to use relevant, homegrown data, leading to more tailored findings and solutions directly applicable to local contexts, rather than constantly adapting imported data and solutions.

  • Customisation in Education: Education itself needs to be customised to solve real problems, encouraging students to appreciate the practical application of their learning.

  • Leveraging Technology for Collaborative Workflows: Modern technology has eliminated geographical barriers, enabling remote collaboration. This capability should be fully leveraged to foster interdisciplinary teams that bring diverse perspectives and expertise to complex problems, preventing work in silos.


Trustworthy AI: Shared Principles Across Healthcare and Fintech

The need for robust, explainable, and scalable AI systems is not unique to healthcare. Industries like fintech face remarkably similar challenges and benefit from similar principles.

  • Risk Prediction: Just as healthcare professionals worry about clinical risk prediction, fintech teams deal with financial risk prediction.

  • Personalisation: Precision medicine aims for personalised patient treatment; fintech strives for personalised customer recommendations.

  • Classification: Disease subtype classification in medicine finds its parallel in customer segmentation and classification in fintech.

  • Fraud and Anomaly Detection: Protecting sensitive patient data and detecting anomalies in healthcare is mirrored by the need for financial fraud and anomaly detection, ensuring data protection and compliance.

In both sectors, trustworthy, explainable, and scalable systems are vital for adoption. The computational complexity often seen in precision medicine is a tall order, but it also presents a significant opportunity for innovation and scaling.
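
To make the shared fraud and anomaly detection principle concrete, here is a minimal sketch using scikit-learn's IsolationForest. The data and the contamination rate are invented for illustration; the same pattern could score card transactions in fintech or vital-sign readings in a clinic.

    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(42)

    # Synthetic "normal" records. In fintech the two columns might be
    # transaction amount and hour of day; in healthcare, two vital signs.
    normal = rng.normal(loc=[50.0, 12.0], scale=[10.0, 3.0], size=(1000, 2))

    # Two new records: one clear outlier and one ordinary-looking case.
    incoming = np.array([[500.0, 3.0], [48.0, 12.5]])

    detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)
    print(detector.predict(incoming))  # -1 flags an anomaly, 1 looks normal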



Technology: Turning Data into Medicine

Dr. Onuiri's approach to precision medicine involves a structured workflow to transform complex, multimodal data into actionable clinical insights. Building a trustworthy learning system requires more than a robust algorithm; it demands a rigorous, multi-step process from data collection to validation.


Step 1: Multimodal Data Integration and Preprocessing

The first major hurdle is aggregating and harmonising disparate data types into a cohesive dataset.

  • Explanation: Data for a single patient often exists in different formats and locations, such as clinical records from a hospital's EMR, genomic data from a specialised lab, and imaging data from a radiology department. These data silos and varying standards make integration a significant challenge.

  • Instruction: The initial step is to build a comprehensive patient profile. This involves addressing challenges like data integration, handling missing values, and cleaning noisy data to ensure the quality and interoperability needed for an effective model.
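
As a minimal sketch of this step (assuming pandas and purely illustrative column names), two extracts from separate hospital systems are merged on a shared patient identifier, and a missing vital sign is imputed:

    import pandas as pd

    # Hypothetical extracts from two separate hospital systems.
    clinical = pd.DataFrame({
        "patient_id": ["PT-1", "PT-2", "PT-3"],
        "temperature_c": [37.1, None, 38.4],  # a missing value to handle
        "weight_kg": [70.2, 65.0, 80.1],
    })
    genomic = pd.DataFrame({
        "patient_id": ["PT-1", "PT-2", "PT-3"],
        "BRCA1_expr": [2.41, 1.10, 3.02],
    })

    # Integrate on the shared identifier, then impute the gap with a median.
    profile = clinical.merge(genomic, on="patient_id", how="inner")
    profile["temperature_c"] = profile["temperature_c"].fillna(
        profile["temperature_c"].median()
    )
    print(profile)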


Step 2: Feature Engineering

With an integrated dataset, the next step is to extract meaningful signals that a machine learning model can interpret.

  • Explanation: Raw genomic or imaging data is often too complex and high-dimensional for a model to use directly. This critical step translates the data into a structured format, identifying key features or patterns—like specific gene expression levels or image textures—that correlate with particular health outcomes.

  • Instruction: Apply domain-specific techniques to convert the raw, multimodal data into numerical or categorical features. For example, specific gene expression levels might be used as features for a cancer classification model, while texture analysis could extract features from an MRI scan.
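
For the gene-expression case, one common and simple recipe (sketched here on a made-up matrix) is to log-transform the skewed raw values and then standardise each gene so no single one dominates the model:

    import numpy as np
    from sklearn.preprocessing import StandardScaler

    # Illustrative raw expression matrix: rows are patients, columns are genes.
    raw_expression = np.array([
        [12.0, 250.0, 3.1],
        [ 9.5, 310.0, 2.8],
        [15.2, 180.0, 4.0],
    ])

    # Expression values are heavily skewed, so log-transform first...
    log_expr = np.log1p(raw_expression)

    # ...then standardise each gene to zero mean and unit variance.
    features = StandardScaler().fit_transform(log_expr)
    print(features.round(2))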


Step 3: Model Development and Selection

This is where the "learning" happens. The goal is to choose and train a model that can perform a specific task, such as predicting clinical risk or classifying disease subtypes.

  • Explanation: While it can be tempting to use the most complex model available, Dr. Onuiri emphasises that simpler, more interpretable models often yield equally good, if not better, outcomes with less computational burden. The focus should be on creating an accurate, efficient, and explainable model.

  • Instruction: Train various models and evaluate them based on a balance of performance, computational cost, and explainability. In a clinical setting, a transparent model clinicians can understand and trust is often preferable to a "black box" algorithm.
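
The sketch below illustrates that evaluation habit on synthetic stand-in data: an interpretable logistic regression is compared against a more complex random forest under cross-validation, so the performance cost of choosing the transparent model is measured rather than assumed.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    # Stand-in for an engineered clinical/genomic feature matrix.
    X, y = make_classification(n_samples=500, n_features=20, random_state=0)

    models = {
        "logistic regression (interpretable)": LogisticRegression(max_iter=1000),
        "random forest (more complex)": RandomForestClassifier(random_state=0),
    }
    for name, model in models.items():
        scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
        print(f"{name}: mean AUC = {scores.mean():.3f}")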


Step 4: Rigorous and Continuous Validation

A model is only as valuable as it is trustworthy. Validation is not a one-time check before deployment but a continuous, iterative process.

  • Explanation: The COVID-19 pandemic provided a critical lesson in the importance of rigorous validation. A model that performs flawlessly on a historical training dataset might fail when exposed to new, real-world clinical scenarios. Engineers and data scientists must maintain an "inquisitive mind," constantly questioning the model's performance and limitations.

  • Instruction: Continuously test the model against new data in different clinical contexts. This ongoing validation is crucial for ensuring the system remains reliable, safe, and effective over time, forming the bedrock of a trustworthy AI system.
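
One way to operationalise this, sketched here with hypothetical names and an assumed alert threshold: score each fresh batch of labelled cases as it arrives and flag the model for review if its discrimination drops.

    from sklearn.metrics import roc_auc_score

    def monitor_batch(model, X_new, y_new, auc_floor=0.75):
        """Score a fresh labelled batch and flag degradation.

        `auc_floor` is an assumed threshold; a drop below it signals that
        the model should be re-examined before further clinical use.
        """
        auc = roc_auc_score(y_new, model.predict_proba(X_new)[:, 1])
        if auc < auc_floor:
            print(f"ALERT: AUC {auc:.3f} below {auc_floor} -- trigger review")
        else:
            print(f"OK: AUC {auc:.3f} on latest batch")
        return auc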


Conclusion

Precision medicine holds immense potential to transform healthcare from a generalised approach to a deeply personal and effective one. However, this future hinges on our ability to navigate the complex data landscape, overcome resistance to change, and foster unprecedented collaboration. By promoting a symbiotic relationship between industry, academia, and regulatory bodies, we can collectively plug the scalability gap, build trustworthy AI systems, and unlock the breakthroughs that have long eluded us, leading to better outcomes for everyone.
