Assisted Intelligence: reflections from Nvidia’s GPU Technology Conference (GTC) 2018

This past week I attended Nvidia’s GPU Technology Conference (GTC), where I learnt more about the disruptive force of artificial intelligence within society.

Whilst it is easy to get distracted by the many exhibitions highlighting broad applications of artificial intelligence, I focussed on the intersection between artificial intelligence and healthcare, an area I have previously discussed at the Royal Society of Medicine’s meeting series (2015, 2016, 2017).

One thing I hadn’t appreciated before attending the event was that Nvidia have been innovating within the healthcare domain for over 10 years, including partnerships with Siemens and GE. Having now experienced GTC, I can better appreciate why healthcare is a key vertical for Nvidia, given that the FDA has approved over 11 indications relating to artificial intelligence in 2018.

Reflecting on my GTC experience, I came away with a few key messages:

  1. Assisted intelligence: an increasing number of peer-reviewed articles have been published outlining how artificial intelligence compares against clinicians in the diagnosis of disease. Whilst many of these studies were referenced throughout GTC, it was more common to hear how artificial intelligence can be used to assist the clinical decision process rather than replace it; this was discussed with reference to diabetic retinopathy.
  2. More-from-less: within the GTC keynote Jensen Huang, CEO of Nvidia, described how the latest ‘real-time rendering technology’ can be used to debunk conspiracy theories regarding the Moon landing. This was fascinating, and it appears that similar approaches are being applied within medical imaging, as outlined within the Project Clara update. This included the possibility that adverse reactions to contrast agents could be minimised by enhancing images generated using a far lower concentration of contrast media, and that radiation exposure could be limited through shorter capture times (a toy sketch of the image-enhancement idea appears after this list).
  3. DNA sequencing: there is potential to improve the DNA sequencing pipeline with deep learning, but we are not there yet. For example, Oxford Nanopore gave an impressive demonstration of how GPU technology can be applied to accurately call the A, T, C and G bases from the electrical signal generated as a strand of DNA passes through a nanopore, which is the process that allows the sequence to be read (a minimal base-calling sketch also follows this list). This is a remarkable use of deep learning, and has driven dramatic improvements in their platform. Beyond this specific application, Oxford Nanopore also outlined their vision to produce a device that consumers can afford, potentially enabling anyone to sequence anything in the world around them. This vision stretches well beyond the exciting opportunities that reliable long-read sequencing affords researchers studying the human genome, such as the ability to detect large structural rearrangements and to capture regions of homology and highly repetitive regions that are challenging for established short-read methods.
    However, after attending one of the Deep Learning Institute’s hands-on sessions on the classification of genomic variants, it was apparent that, beyond base-calling, there are plenty of opportunities throughout the sequencing pipeline where deep learning may be applied to improve on current performance (see the variant-classification sketch below).
  4. Organisational benefit: beyond these clinical applications, GTC provided me with an opportunity to appreciate what artificial intelligence is, and what it is not. Understanding the scope of this (and any) technology is critical. Several speakers discussed the challenges data scientists encounter when trying to deploy artificial intelligence within an organisation; often these related to inappropriate expectations regarding the technology and the role of the data scientist within the process.
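
To make the ‘more-from-less’ idea concrete, here is a minimal sketch of one way a network can be trained to enhance images acquired with a reduced contrast dose. This is not Project Clara’s actual model (those details were not shared); the architecture, image sizes and the synthetic training pair are all illustrative assumptions.

```python
# Toy sketch: a small residual CNN trained to map low-contrast-dose
# images onto full-dose targets. Everything here (layer sizes, image
# shapes, the random "scans") is an illustrative assumption.
import torch
import torch.nn as nn

class DoseEnhancer(nn.Module):
    """Predicts a correction to add to the low-dose image."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )

    def forward(self, x):
        # Learning the residual keeps the underlying anatomy intact and
        # lets the network focus on restoring contrast.
        return x + self.body(x)

model = DoseEnhancer()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Stand-in training pair: in practice these would be co-registered
# low-dose and full-dose acquisitions of the same anatomy.
low_dose = torch.rand(8, 1, 64, 64)
full_dose = torch.rand(8, 1, 64, 64)

for step in range(100):
    optimiser.zero_grad()
    loss = loss_fn(model(low_dose), full_dose)
    loss.backward()
    optimiser.step()
```

The residual formulation is a common design choice for enhancement tasks, since the network only has to learn what is missing rather than reproduce the whole image.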
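
The Oxford Nanopore demonstration described a network that calls bases directly from the raw electrical signal. The sketch below shows the general shape of that technique, assuming a recurrent network with a CTC loss to align per-timestep predictions to the base sequence; it is not Oxford Nanopore’s production basecaller, and the model size, signal length and labels are invented.

```python
# Toy base-calling sketch: an LSTM reads the raw current trace and a CTC
# loss aligns its per-timestep outputs to the A/C/G/T sequence. All
# dimensions and the random data are illustrative assumptions.
import torch
import torch.nn as nn

class ToyBasecaller(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        self.rnn = nn.LSTM(input_size=1, hidden_size=hidden,
                           batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, 5)  # CTC blank + A, C, G, T

    def forward(self, signal):                 # signal: (batch, time, 1)
        out, _ = self.rnn(signal)
        return self.head(out).log_softmax(-1)  # (batch, time, 5)

model = ToyBasecaller()
ctc = nn.CTCLoss(blank=0)

signal = torch.randn(2, 200, 1)            # fake current measurements
targets = torch.randint(1, 5, (2, 30))     # fake base labels (1..4)
log_probs = model(signal).transpose(0, 1)  # CTC expects (time, batch, classes)
input_lens = torch.full((2,), 200, dtype=torch.long)
target_lens = torch.full((2,), 30, dtype=torch.long)

loss = ctc(log_probs, targets, input_lens, target_lens)
loss.backward()
```

CTC is a natural fit here because the number of signal samples per base varies as the strand moves through the pore, so the alignment between signal and sequence has to be learnt rather than fixed in advance.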
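
Finally, on variant classification: one widely cited deep-learning approach (popularised by tools such as Google’s DeepVariant, though I cannot say whether the hands-on session used it) encodes the read pileup around each candidate variant as a small image and classifies the genotype with a CNN. The channel meanings and tensor shapes below are invented for illustration.

```python
# Toy variant-classification sketch in the spirit of pileup-image
# classifiers: a CNN labels each candidate site as hom-ref / het /
# hom-alt. Shapes and channel semantics are illustrative assumptions.
import torch
import torch.nn as nn

classifier = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 25 * 25, 3),  # three genotype classes
)

# Fake pileups: channels might encode base identity, base quality and
# read strand within a 100x100 window around the candidate variant.
pileups = torch.rand(4, 3, 100, 100)
genotypes = torch.randint(0, 3, (4,))

loss = nn.CrossEntropyLoss()(classifier(pileups), genotypes)
loss.backward()
```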
