

Bench Talk for Design Engineers | The Official Blog of Mouser Electronics


Using Artificial Intelligence to Predict Pandemics

Clive Maxfield

(Source: Lightspring/Shutterstock.com)

Most countries around the world were hopelessly unprepared to deal with the effects of a pandemic. One notable exception is Iceland, which has spent years preparing for such an eventuality. As a result, it has not needed to implement a lockdown because it can quickly trace, test, and isolate anyone who is exposed to the virus.

It’s not like we weren’t all at least vaguely aware that a possibility like this could come to pass. Science-fiction stories have been postulating this sort of scenario since the genre began. My personal favorite is The Stand by Stephen King, but there are many, many more. In addition, multiple television series, including “Ten Ways the World Could End,” document potential futures involving climate change, an asteroid strike, one or more supervolcanoes erupting, nuclear war, and—you’ve guessed it—a pandemic.

Politicians worldwide have been made aware of pandemic possibilities. Futurists, academic institutions, and even government departments have been modeling pandemic scenarios and all but jumping up and down shouting, “The end is nigh!” for years, if not decades.

In 2014, thanks to the diligence of thousands of health workers, the world managed to contain an outbreak of Ebola, a rare but deadly disease that kills an average of 50 percent of those who become infected.

In 2015, Bill Gates gave a TED Talk titled The Next Outbreak? We're Not Ready. Even with all the futuristic technology at our disposal, the outbreak of COVID-19 amply demonstrated that we still aren't ready. If only we could be forewarned of the onset of a new pandemic. We hear talk in the news about the models that scientists use to predict the timing and magnitude of coronavirus peaks in different cities, states, and countries. These sophisticated mathematical models can be incredibly useful, and they apply to all sorts of tasks, such as predicting where diseases like Ebola might strike next based on climate change and poverty levels. Unfortunately, models of this type are only useful for mitigation once a crisis such as the coronavirus is already upon us, or for "best-guess" predictions of possible futures. What is really required is some form of early-warning system for potential pandemics.
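To give a flavor of the mathematical models mentioned above, here is a minimal sketch of the classic SIR (susceptible-infectious-recovered) epidemic model. This is a textbook illustration, not any agency's actual forecasting model, and all the parameter values are hypothetical:

```python
# Minimal SIR epidemic model with a simple daily Euler step.
# All parameters below are hypothetical, chosen only for illustration.
def simulate_sir(population, infected0, beta, gamma, days):
    """Return the daily count of infectious people.

    beta  -- transmission rate (effective contacts per person per day)
    gamma -- recovery rate (1 / infectious period in days)
    """
    s, i, r = population - infected0, float(infected0), 0.0
    history = []
    for _ in range(days):
        new_infections = beta * s * i / population
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append(i)
    return history

# Hypothetical parameters giving R0 = beta/gamma = 2.5
curve = simulate_sir(population=1_000_000, infected0=10,
                     beta=0.5, gamma=0.2, days=120)
peak_day = curve.index(max(curve))
print(f"Epidemic peaks on day {peak_day} "
      f"with about {max(curve):,.0f} infectious people")
```

Even this toy version reproduces the characteristic "peak" that forecasters talk about: cases grow roughly exponentially at first, then roll over as the susceptible pool is depleted.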

Predicting and Managing Pandemics Using AI and ML

Recent events show it is possible that artificial intelligence (AI) and machine learning (ML) could provide just such an advance warning for potential pandemics. AI and ML are already proving themselves to be useful in various medical applications. For example, AI can help diagnose skin cancer, while ML was recently efficacious in discovering Halicin, a powerful new antibiotic. Of particular interest is the possibility of using AI in the battle against pandemics. As one example, artificial intelligence platform BlueDot reportedly "picked up on a cluster of unusual pneumonia cases happening around a market in Wuhan, China, on Dec. 30, 2019, and flagged it. BlueDot had spotted what would come to be known as COVID-19, nine days before the World Health Organization released its statement alerting people to the emergence of a novel coronavirus."
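BlueDot's actual methods are proprietary, but the core idea of flagging an unusual cluster can be illustrated with a toy anomaly detector: compare each day's case count against a rolling baseline and raise an alert when it deviates by several standard deviations. The data and threshold below are invented for illustration:

```python
# Toy early-warning sketch: flag days where reported case counts
# deviate sharply from the recent baseline. This is an illustration
# of the general idea, not BlueDot's proprietary approach.
import statistics

def flag_anomalies(daily_counts, window=7, threshold=3.0):
    """Return indices where a count exceeds the rolling mean of the
    previous `window` days by more than `threshold` standard deviations."""
    alerts = []
    for day in range(window, len(daily_counts)):
        baseline = daily_counts[day - window:day]
        mean = statistics.mean(baseline)
        stdev = statistics.pstdev(baseline) or 1.0  # avoid divide-by-zero
        if (daily_counts[day] - mean) / stdev > threshold:
            alerts.append(day)
    return alerts

# Invented data: a stable baseline with a sudden cluster on the final day
counts = [4, 5, 4, 6, 5, 4, 5, 5, 4, 6, 27]
print(flag_anomalies(counts))  # → [10], i.e., only the spike is flagged
```

A real system would fuse many such signals (news reports, flight data, social media) rather than a single case-count series, but the statistical reflex is the same: notice when today looks nothing like the recent past.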

It’s important to remember that the types of AI we see today are in the very early stages of their development. The founding event in the field of AI is generally accepted to be the Dartmouth Workshop, which took place in 1956. However, AI largely remained in the realm of academia until the 2010s. At that time, a combination of algorithmic developments coupled with advances in processing technologies caused AI to shake loose its academic shackles and start to take its place in the real world.

Sophisticated AI Architectures and Processing Engines

More sophisticated AI architectures and processing engines are coming online all the time. For example, a new company called Perceive recently introduced a chip called Ergo that is a fraction of the size of a US penny, yet it can run artificial neural networks (ANNs) with more than 100 million weights and a model size exceeding 400MB. It does all this while delivering over 4 TOPS (tera, or trillion, operations per second) of peak performance at less than 1/10 of a watt of peak power.
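A quick back-of-the-envelope calculation puts those numbers in perspective. Assuming (as a rough rule of thumb, not a published Perceive benchmark) that a dense pass over N weights costs about 2N operations (one multiply plus one accumulate per weight):

```python
# Rough throughput estimate for a 100-million-weight network on a
# 4 TOPS engine. The 2-ops-per-weight figure is a common rule of thumb
# for dense inference, not a vendor-published number.
weights = 100_000_000            # 100 million weights
ops_per_inference = 2 * weights  # ~2.0e8 operations per forward pass
peak_ops = 4e12                  # 4 TOPS
inferences_per_second = peak_ops / ops_per_inference
print(f"~{inferences_per_second:,.0f} inferences/second at peak")  # ~20,000
```

In other words, at peak rates such a chip could in principle evaluate a very large network tens of thousands of times per second on a power budget smaller than an LED's.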

We’re going to need all the help we can get if COVID-19 is a taste of things to come. Look around you and consider how troubling things are at the moment. Consider that the mortality rate for COVID-19 is thought to be somewhere around 2.1 percent to 3.4 percent. Now consider what sort of state we would be in if the mortality rate for COVID-19 were akin to Ebola’s 50 percent and if the communicability of COVID-19 were similar to that of measles. Scientists define infectiousness using the reproduction number. Current evaluations give COVID-19 a value of 2 to 3, while measles tops the chart at 12 to 18.
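The reproduction number matters so much because, in the simplest textbook approximation (which ignores immunity and interventions), expected cases grow as R0 raised to the number of transmission generations. The generation count and seed value below are illustrative only:

```python
# Why R0 dominates: cases after g generations grow as seed * R0**g
# in the simplest branching-process approximation (no immunity,
# no interventions). Values chosen for illustration.
def cases_after(r0, generations, seed=1):
    return seed * r0 ** generations

for r0 in (2.5, 15):  # a COVID-19-like value vs a measles-like value
    print(f"R0={r0}: ~{cases_after(r0, 10):,.0f} cases after 10 generations")
```

After just ten generations, the measles-like value yields tens of millions of times more cases than the COVID-19-like value, which is why a measles-level R0 combined with an Ebola-level fatality rate is such a frightening hypothetical.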

Preparing for Pandemics Using AI Systems

Hopefully, by the time the causal agent for the next potential pandemic rears its ugly head, we will have powerful AI systems in place monitoring things around the world such as doctors’ reports, hospital admissions, and even social-media postings (“I’m feeling a little under the weather today”). We will use the awesome power of today’s—and tomorrow’s—cloud-based computing, data mining, and artificial intelligence solutions to alert us to the fact that something bad could be headed our way.

It would be even better if the world’s governments used the current coronavirus crisis as a learning opportunity to ensure we are all better prepared should “the big one” come. Hey, we can but hope. “You may say I’m a dreamer, but I’m not the only one,” as John Lennon wrote in his song Imagine. I’m with John on this one.





Clive "Max" Maxfield

Clive "Max" Maxfield is a freelance technical consultant and writer. Max received his BSc in Control Engineering in 1980 from Sheffield Hallam University, England and began his career as a designer of central processing units (CPUs) for mainframe computers. Over the years, Max has designed everything from silicon chips to circuit boards and from brainwave amplifiers to Steampunk Prognostication Engines (don't ask). He has also been at the forefront of Electronic Design Automation (EDA) for more than 35 years.

Well-known throughout the embedded, electronics, semiconductor, and EDA industries, Max has presented papers at numerous technical conferences around the world, including North and South America, Europe, India, China, Korea, and Taiwan. He has given keynote presentations at the PCB West conference in the USA and the FPGA Forum in Norway. He's also been invited to give guest lectures at several universities in the USA, Sheffield Hallam University in the UK, and Oslo University in Norway. In 2001, Max "shared the stage" at a conference in Hawaii with former Speaker of the House, "Newt" Gingrich.

Max is the author and/or co-author of a number of books, including Designus Maximus Unleashed (banned in Alabama), Bebop to the Boolean Boogie (An Unconventional Guide to Electronics), EDA: Where Electronics Begins, FPGAs: Instant Access, and How Computers Do Math.

