In October 2020, the phrase ‘artificial intelligence’ was searched in the National Library of Medicine’s PubMed database for a previous editorial in this journal. At the time of that search, 110,855 publications were found.[1] It is noteworthy that repeating the same search a year later returned roughly 28,500 additional citations, bringing the total to 139,304 publications as of 2 October 2021. Even its most ardent skeptics have recognized the potential influence of artificial intelligence (AI), and the notion of a wondrous thinking machine, dubbed ‘the Master Algorithm’ by computer scientist Pedro Domingos, is no longer science fiction.[2]
Despite the recent surge in interest in artificial intelligence, the reality is that research has been progressing steadily for nearly 180 years. Ada Lovelace – Charles Babbage’s protégée and daughter of the great philhellene Lord Byron – published the first algorithm ever designed expressly for computer execution in 1843. Today, philosophers and futurists such as Nick Bostrom and Ray Kurzweil speak of superintelligence, a capacity far beyond that of the human brain.[3] After all, Garry Kasparov did succumb to IBM’s Deep Blue at chess.
The power of artificial intelligence has become equally clear to clinicians in Malaysia – particularly electrophysiologists and arrhythmia specialists. There is already accumulating evidence that AI can assist with electrophysiological diagnoses by automating routine clinical procedures or by supporting challenging tasks through deep neural networks that outperform currently applied computerized techniques.[4] Soon, AI simulations of the monomorphic ventricular tachycardia circuit may be used to guide catheter ablation, or perhaps stereotactic radioablation, for a large number of patients.[5] Combining data from many diagnostic modalities using AI may help clarify the pathophysiology of novel, unusual, or idiopathic cardiac diseases, aid in the early identification or targeted treatment of cardiovascular diseases, and enable screening for problems not currently associated with the ECG.[4]
Is all of this foreseeable, or is it merely wishful thinking? Erik Larson, a computer scientist and inventor, has made a compelling case against any supercomputer surpassing the human brain.[6] This is not simply a matter of Aristotelian deduction vs. induction vs. inference; it appears to be a logistical problem as well. Rebecca Goldin, writing for the Genetic Literacy Project in reaction to President Obama’s 2013 announcement of a new research project aimed at better understanding the human brain, provides context:[7]
“The human brain is believed to contain roughly 86 billion neurons (8.6 × 10¹⁰), with each neuron containing tens of thousands of synaptic connections; these little communication sites serve as the means through which neurons exchange information. There are estimated to be over a hundred trillion neural synapses in all, which means that a computer recording a simple binary piece of information about synapses, such as whether it fired inside a time window or not, would require 100 terabytes. The amount of storage required to store even this trivial quantity of data every second for one person would exceed 100,000 terabytes or 100 petabytes. Nowadays, supercomputers can store approximately ten petabytes. Furthermore, this rapid computation ignores the changes in connection and location of these synapses that occur throughout time. Counting the changes in these connections following a good night’s sleep or a maths class results in a stunning figure (and many more bytes than the estimated 10⁸⁰ atoms in the universe). The wiring issue appears to be insurmountable due to its enormity.”
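The storage arithmetic in the quote can be checked directly. A minimal sketch, assuming (as the quoted figures imply) roughly 10¹⁴ synapses, one byte recorded per synapse per snapshot, and decimal units (1 TB = 10¹² bytes, 1 PB = 10¹⁵ bytes):

```python
# Back-of-envelope check of the quoted storage estimates.
# Assumptions: ~1e14 synapses, one byte per synapse per snapshot,
# decimal units (1 TB = 1e12 bytes, 1 PB = 1e15 bytes).

SYNAPSES = 1e14          # ~100 trillion synaptic connections
BYTES_PER_SYNAPSE = 1    # one byte to record a binary fired/not-fired state

snapshot_bytes = SYNAPSES * BYTES_PER_SYNAPSE
snapshot_tb = snapshot_bytes / 1e12          # 100 TB per snapshot, as quoted

# Recording one snapshot per second, how long until storage passes 100 PB?
seconds_to_100_pb = 100e15 / snapshot_bytes  # 1000 s, i.e. under 17 minutes

print(f"{snapshot_tb:.0f} TB per snapshot; "
      f"100 PB exceeded after {seconds_to_100_pb:.0f} s")
```

On these assumptions, a single per-second recording regime passes the quoted 100-petabyte mark in under twenty minutes, which is the scale argument the quote is making.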