Bigger isn’t always better: an exploration into hypertrophic cardiomyopathy genetic testing

The inherited cardiac condition team

As part of my DPhil I have been fortunate to work closely with a team of experts who diagnose and manage individuals and families affected by inherited cardiac conditions.

Hypertrophic cardiomyopathy

The most common inherited cardiac condition is hypertrophic cardiomyopathy. Hypertrophic cardiomyopathy, colloquially shortened to HCM, is an inherited condition (usually autosomal dominant) that affects ~1 in 500 individuals and is well recognised as a cause of sudden death in young adults. Genetics has helped to advance care for families affected by this disease, but for the majority of individuals diagnosed with HCM a genetic diagnosis is never provided, which prevents the screening of their first-degree relatives.

Is less more for HCM genetic testing?

Consequently, an increasingly large selection of genes is being scoured for genetic variants that may explain the cause of hypertrophic cardiomyopathy. Whilst intuitively this may seem reasonable, in the hope that a causal variant is detected, such practices are of limited value and, rather than providing definitive answers, often generate a list of variants of uncertain significance.

A case-control study using whole genome sequencing data

To further investigate whether there is any value in searching for variants within an expanded list of genes associated with hypertrophic cardiomyopathy, a case-control study was designed using genome sequencing data generated by the NIHR BioResource for Rare Diseases. This included 240 individuals diagnosed with HCM, who did not have a genetic diagnosis despite previous screening using a conventional gene panel, and 6,229 controls. 51 genes that had previously been proposed for HCM testing were interrogated using a combination of burden testing and manual variant interpretation.
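The intuition behind burden testing is to ask, per gene, whether rare-variant carriers are enriched among cases relative to controls. The published analysis used its own statistical framework; as a minimal sketch of the idea, a one-sided Fisher's exact test on carrier counts (the counts below are hypothetical, not from the study) can be computed with the standard library alone:

```python
from math import comb

def burden_test(case_carriers, n_cases, control_carriers, n_controls):
    """One-sided Fisher's exact test: are rare-variant carriers
    enriched among cases relative to controls?

    Returns P(X >= case_carriers) under the hypergeometric null,
    where X counts how many carriers fall among the cases."""
    total_carriers = case_carriers + control_carriers
    total = n_cases + n_controls
    upper = min(total_carriers, n_cases)
    p = sum(
        comb(total_carriers, x) * comb(total - total_carriers, n_cases - x)
        for x in range(case_carriers, upper + 1)
    ) / comb(total, n_cases)
    return p

# Hypothetical gene: 8 carriers among 240 cases vs 20 among 6,229 controls
p_value = burden_test(8, 240, 20, 6229)
```

In practice a real analysis must also correct for the number of genes tested and match cases and controls for sequencing platform and ancestry; the sketch shows only the per-gene comparison.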


Our results provided no support for the screening of a broad set of HCM-associated genes, with over 99% of the variants found classified as variants of uncertain significance (VUS).

The manuscript can be found within Genetics in Medicine.

Learning the basics of blockchain

What is the scope for blockchain technologies?

“There is technology becoming available … I don’t claim to be an expert on it but the most obvious technology is blockchain” boasted Chancellor Philip Hammond at the Conservative Party Conference, when discussing the Irish border challenges of Brexit.

What problems can blockchain tackle?

Now, I also do not claim to be an expert in blockchain technology, but from what I understand, I did not think blockchain technology was going to help solve the Brexit Irish border issue. Nevertheless, what this revealed to me was that I was poorly informed as to what blockchain technology truly represented beyond the hype of cryptocurrency, and what problems blockchain technology was reasonably equipped to tackle. Certainly, within healthcare, reports often appear stating that medical records or genetic test results will be “put on the blockchain”, but what this actually represents remained rather opaque to me.

Figure 1: My prior understanding of blockchain technology (source)

Training in blockchain fundamentals

To up-skill myself — and help determine whether the hype associated with blockchain technology is justified — I enrolled on Coursera’s blockchain foundations course, created by ConsenSys Academy.

Having recently completed the course, I certainly feel better informed regarding blockchain technology and can now sensibly engage in discussion regarding its implementation, beyond the superficial knowledge I have of Bitcoin and Ethereum. The course does a good job of outlining the basic concepts of what a blockchain is and how and why it was created, provides an overview of the technical components underpinning the technology and, most importantly for me, provides a framework to establish whether or not a blockchain should be considered as a possible solution for a given challenge.

Summarising blockchain utility

The first thing to appreciate is that a blockchain is only useful when a database is required. Furthermore, blockchains really only have an advantage over traditional databases when multiple users require “write” access to that database, and those users are unknown, untrusted, or hold conflicting interests. Even then, traditional databases can accommodate these scenarios, typically through a trusted third party. Only when such an entity cannot be relied upon does a blockchain become a possible option.
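The reasoning above is essentially a short decision checklist. As a toy illustration (the function name and return strings are my own phrasing, not the course's), it can be written down as:

```python
def database_recommendation(needs_database: bool,
                            multiple_writers: bool,
                            writers_trusted: bool,
                            trusted_third_party: bool) -> str:
    """Distils the framework above: a blockchain is only worth
    considering once every cheaper alternative is ruled out."""
    if not needs_database:
        return "no database needed"
    if not multiple_writers:
        return "conventional database (single writer)"
    if writers_trusted:
        return "shared conventional database"
    if trusted_third_party:
        return "conventional database mediated by the third party"
    return "consider a blockchain (then assess public vs private)"
```

Each branch removes a scenario a conventional database already handles, leaving blockchain as the option of last resort rather than the default.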

Figure 2: Framework outlined by Coursera

If a blockchain solution is required, there are a range of additional considerations that need to be addressed to determine whether a public or private blockchain is most suitable. Jeremy Millar, Chief of Staff at ConsenSys, has outlined an approach for this here.


Whilst the content of the course might not stimulate blockchain aficionados, it provided the necessary foundations to appreciate how blockchain technology can be applied and what limitations are currently present.

Delivering novel therapies in the 21st century

Over the last few days I was fortunate enough to attend the Royal Society’s conference on “Delivering novel therapies in the 21st century” and have summarised a few of the key themes:

Big Picture

Lots of high-level statements were presented at the meeting, but three in particular stuck with me.

  1. Rare disease is not as rare as you might think: over 25 million individuals have one of 7,000 rare diseases, and 30% of individuals with a rare disease die before they reach 5 years old.
  2. Multi-morbidity is a growing issue: simulation studies suggest diabetes with concomitant depression will be present in 35% of females and 21% of males by 2028. By 2035, 68% of over-65s will have two or more long-term conditions.
  3. By 2022, biologics will deliver 52% of the top 100 product sales, at the expense of traditional small molecule medicines.

NHS, academia and business

The United Kingdom is well positioned to be a leader within the life sciences sector. In part this is due to established partnerships between the NHS, academia, industry and regulators, but it is evident that, with uncertain political times looming, more is required to secure this leading position, particularly for the NHS and academia. Throughout the two-day event the NHS was praised for its electronic patient records, which offer unparalleled opportunities to follow up patients’ health over time. It was also stated that it is the responsibility of the NHS and academia to advance our understanding of disease pathophysiology. However, there is a real threat that our global advantage might be lost. The NHS is struggling with a supply and demand issue: the lowest average annual real growth rate ever recorded (1.2% extra per year), coupled with a huge increase in demand over the last 10 years, including a 40% increase in emergency admissions and a 79% increase in outpatient appointments since 2005. Furthermore, Brexit remains a considerable threat to academia, as outlined by 29 Nobel laureates.

Drug discovery

90% of drug discovery efforts fail; however, when genetic information is used, the success rate doubles. This well-known statistic was repeated throughout the conference, and several examples were given of how genetics can be leveraged and why drug discovery efforts fail. This included Prof Peter Donnelly discussing the efforts of Genomics PLC, where a matrix comparing ~14 million genotypes with ~7,000 phenotypes (and gene expression information) for over 3 million individuals, drawn from prior genome-wide association studies, has been assembled to help guide target discovery efforts. Of course, this is only one side of the coin. Whilst having genetic support for a target will be increasingly important, we know there is heterogeneity in efficacy, and designing a drug for mass adoption for a single phenotype may be an oversimplification. I suggested an experiment that could leverage individual-level data from the UK Biobank to generate a polygenic risk score for drug efficacy, something they have not yet performed but will look into.

Accompanying the high failure rate are the large costs associated with running a clinical trial, resulting in expensive drugs for patients. One view was that the high cost of running a clinical trial is pathognomonic of something far more unsettling: either the drug was a poor candidate, or the wrong patient population was being selected (as the underlying biology was relatively unknown).

The cost of therapeutics to payors was also highlighted, particularly for oncology drugs, which have steadily increased in price but are poorly correlated with overall survival.

Drug delivery

This considered not only the increased range of therapeutic modalities, such as PROTACs and bicyclic peptides, that unlock a broader spectrum of biology amenable to intervention, but also manufacturing considerations for the production of individualised therapeutics tailored to an individual’s immune system, with processes having to change from a manufacture-to-stock model to a manufacture-to-order model.

A range of presentations highlighted current advances in using various carrier materials to treat disease, both systemically and targeted towards a specific organ: exosomes for the treatment of pancreatic cancer, heat-sensitive liposomes that can be activated using ultrasound to treat liver tumours, viral delivery of a CRISPR system to treat tyrosinaemia, adeno-associated virus delivery of gene therapy for Duchenne muscular dystrophy, and antisense oligonucleotide therapy for Huntington’s disease.


Sir Michael Rawlins, from the Medicines and Healthcare products Regulatory Agency, outlined how data on the safety and efficacy of new therapeutics need not be derived solely from a randomised clinical trial. He discussed the advantages of taking a Bayesian approach to study design and walked through alternative study designs, including adaptive trials, Mendelian randomisation, umbrella trials, step-wedge trials and ring trials.

There was also further discussion of basket trials within the health economics section of the conference, with respect to histology-independent cancer drugs such as larotrectinib, and the challenges such a trial design poses for health technology assessment bodies such as NICE (the National Institute for Health and Care Excellence).

Assisted Intelligence: reflections from Nvidia’s GPU Technology Conference (GTC) 2018

This past week I attended Nvidia’s GPU Technology Conference (GTC) where I learnt more about the disruptive force of artificial intelligence within society.

Whilst it is easy to get distracted by the many exhibitions highlighting broad applications of artificial intelligence, I focussed on the intersection between artificial intelligence and healthcare, an area I have previously discussed at the Royal Society for Medicine’s meeting series (2015, 2016, 2017).

One thing I hadn’t appreciated before attending the event was that Nvidia has been innovating within the healthcare domain for over 10 years, including partnerships with Siemens and GE. Having now experienced GTC, I can better appreciate why healthcare is a key vertical for Nvidia, given that the FDA has approved over 11 indications relating to artificial intelligence in 2018.

Reflecting on my GTC experience, I came away with a few key messages:

  1. Assisted intelligence: an increasing number of peer-reviewed articles have been published outlining how artificial intelligence compares against clinicians for the diagnosis of disease. Whilst many of these studies were referenced throughout GTC, it was more common to hear how artificial intelligence can be used to assist the clinical decision process rather than replace it; this was discussed with reference to diabetic retinopathy.
  2. More-from-less: within the GTC keynote Jensen Huang, CEO of Nvidia, described how, with the latest real-time rendering technology, conspiracy theories regarding the Moon landing can be debunked. This was fascinating, and similar approaches are being applied within medical imaging, as outlined in the Project CLARA update. This included the possibility that adverse reactions to contrast agents could be minimised by enhancing images generated with a far lower concentration of contrast media, and that radiation exposure could be limited through shorter capture times.
  3. DNA sequencing: there is potential to improve the DNA sequencing pipeline with deep learning, but we are not there yet. For example, Oxford Nanopore gave an impressive demonstration explaining how GPU technology can be applied to accurately call the A, T, C and G bases inferred from the electrical signal generated as a strand of DNA passes through a nanopore, the process that enables the DNA sequence to be read by the sequencer. This is a remarkable use of deep learning and has resulted in dramatic improvements to their technology. Beyond this specific application, Oxford Nanopore also outlined their vision of a device that consumers can afford, potentially enabling anyone to sequence anything in the world around them. This vision stretches well beyond the exciting opportunities that reliable long-read sequencing affords researchers studying the human genome, such as the ability to detect large structural rearrangements and to capture regions of homology and highly repetitive regions that were challenging with established short-read methods.
    However, after attending one of the Deep Learning Institute’s hands-on sessions on variant classification for genomic variants, it was apparent that beyond base-calling, there are plenty of opportunities throughout the sequencing pipeline where deep learning may be applied to improve on current performance.
  4. Organisational benefit: beyond these clinical applications, GTC provided me with an opportunity to appreciate what artificial intelligence is, and what it is not. Understanding the scope of this (and any) technology is critical. Several speakers discussed the challenges data scientists encounter when trying to deploy artificial intelligence within an organisation; often these related to inappropriate expectations regarding the technology and the role of the data scientist within the process.

Drug discovery and genetics

A loss-of-function genetic variant that protects against chronic liver disease was reported in The New England Journal of Medicine on 22nd March 2018. This finding adds to a series of previously established protective variants and appears to have stimulated efforts to advance treatment options within chronic liver disease — Regeneron Pharmaceuticals, Inc. have subsequently announced a collaboration with Alnylam Pharmaceuticals, Inc. to develop RNAi therapeutics for non-alcoholic steatohepatitis.

Such reports tend to generate excitement. To understand why requires analysis of the underlying genetics, biology, drug discovery challenges and opportunities within clinical medicine. My goal is to provide context to this intersection between science, medicine and business.

Premortem clinical information is specific but not very sensitive for the postmortem diagnosis of heart valve disease

The OxVALVE (Oxford Valvular Heart Disease Population Study) group, led by Dr Bernard Prendergast, was established in 2009 to determine whether the early detection and treatment of valvular heart disease may improve long-term care pathways and the health of patients. The primary study was conducted using a community-based, prospective cohort that recruited individuals over the age of 65 years.

In February 2017, as part of the OxVALVE study group, we published in Heart the largest reported study (n=7,879) to evaluate the correlation between valvular heart disease findings pre- and post-mortem. Whilst there are some inherent limitations to our approach, we were able to demonstrate that clinical information available prior to death, whilst highly specific, was a relatively insensitive predictor of the cause of death established at autopsy*. High positive and negative predictive values were noted. Overall, we were able to demonstrate that undiagnosed valvular heart disease contributed towards death in ~1.7% of our population, most commonly due to aortic stenosis, infective endocarditis and rheumatic heart disease.

*Sensitivity measures how often a test correctly identifies a positive finding among all true positives. If a test is highly sensitive and the result is negative, you can be nearly certain the disease is absent. Sensitivity is calculated by dividing the number of true positives by the sum of true positives and false negatives.

Specificity is also known as the true negative rate, as it assesses how often people without a disease are correctly given a negative result. Therefore, if the result of a highly specific test is positive, you can be nearly certain that the person actually has the disease. Specificity is calculated by dividing the number of true negatives by the sum of true negatives and false positives.
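These definitions, together with the positive and negative predictive values noted in the study, fall straight out of the four cells of a confusion matrix. A small sketch with purely illustrative counts (not taken from the OxVALVE data):

```python
def sensitivity(tp: int, fn: int) -> float:
    """True positive rate: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True negative rate: TN / (TN + FP)."""
    return tn / (tn + fp)

def ppv(tp: int, fp: int) -> float:
    """Positive predictive value: TP / (TP + FP)."""
    return tp / (tp + fp)

def npv(tn: int, fn: int) -> float:
    """Negative predictive value: TN / (TN + FN)."""
    return tn / (tn + fn)

# Illustrative counts: 40 true positives, 10 false negatives,
# 930 true negatives, 20 false positives
print(sensitivity(40, 10))   # 0.8
print(specificity(930, 20))  # ~0.979
```

Note that, unlike sensitivity and specificity, the predictive values depend on disease prevalence in the population studied, which is why a highly specific test can still have a modest positive predictive value when a condition is rare.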

A future for RNA therapies? Inclisiran: a short interfering RNA for the lowering of LDL cholesterol

The New England Journal of Medicine published results of a phase 1 study investigating a novel method to reduce LDL cholesterol through a small interfering RNA (siRNA) targeting PCSK9: inclisiran (Alnylam Pharmaceuticals and the Medicines Company). As a phase 1 study, it assessed the safety, side-effect profile and pharmacodynamic effects of this novel therapeutic agent.

The reason I am reflecting on this study relates to the excitement within the field of cardiology around PCSK9 inhibition. PCSK9 is a well-validated target, recently identified as a critical regulator of LDL cholesterol. PCSK9 promotes the breakdown of LDL receptors within the liver; the more LDL receptors broken down, the higher the LDL level in the bloodstream, so PCSK9 inhibition results in lower LDL levels. Individuals with raised LDL cholesterol are at risk of major adverse cardiovascular events such as myocardial infarction, and consequently developing therapeutic agents to successfully lower LDL has been a goal for many years. This was achieved initially through the development of statins; more recently the FDA approved two monoclonal antibodies that inhibit PCSK9, alirocumab (Praluent, Sanofi/Regeneron) and evolocumab (Repatha, Amgen), to lower LDL cholesterol in patients refractory to statin therapy.

What differentiates inclisiran (an siRNA) from the currently FDA-approved PCSK9 inhibitors (monoclonal antibodies) is the mechanism through which PCSK9 inhibition is achieved. To briefly outline the differences: the monoclonal antibodies bind PCSK9 and restrict it from binding to LDL receptors across extracellular tissues in all organs, whilst inclisiran inhibits PCSK9 production specifically within the liver. This specificity relates to the design of inclisiran: carbohydrate residues bound to the siRNA bind a liver-specific receptor (the asialoglycoprotein receptor), enabling uptake into the liver. Once in the liver, the siRNA is loaded into the RNA-induced silencing complex, allowing it to bind and disrupt the mRNA required for PCSK9 protein production. This mechanism of action makes inclisiran a first-in-class therapeutic.

Inclisiran, administered as a subcutaneous injection, demonstrated no serious adverse events and was reported to provide a sustained reduction in LDL levels (~60%). It will be fascinating to track the progress of inclisiran in subsequent trials; a global phase III trial has been suggested. Some analysts are already suggesting that sales of inclisiran will reach ~1.3 billion USD in 2030, despite the absence of cardiovascular outcome trial data.

Protective alleles and modifier variants in human health and disease

My recent review article, co-authored with Shalini Nayee and Eric Topol, was published in Nature Reviews Genetics in late October 2015.

The manuscript considers how drug developers can leverage nature’s evolutionary mechanisms that protect individuals from developing disease, and how such biological processes can help promote health.