The following article appeared a few days ago.
Spread of computers creates new dangers, FDA officials warn
June 30, 2008
WASHINGTON - After a routine piece of medical equipment started mysteriously killing hospital patients a few years ago, the federal government turned to a small team of its software experts in suburban Maryland for help.
The team's discovery - a flaw in the computer code that caused a drug pump to administer heavy overdoses - led to a recall, warnings and rewriting of the equipment's software. The discovery also illustrated a new threat lurking behind some lifesaving medical devices.
Microprocessors run everything from patient monitors to artificial pancreases, and potential software flaws are a growing concern. A product might not malfunction because it was poorly designed or badly made - the traditional suspects - but because the computer code running it includes a mistake. The impact of that glitch can be increasingly serious because the latest automation is removing the doctors and nurses who watched for machine mix-ups.
"The world of technology is allowing us to do things we never thought possible, and it's largely a great advance," said Larry G. Kessler, who directs the Food and Drug Administration Office of Science and Engineering Laboratories, which oversees the team of software sleuths at White Oak in Montgomery County. "Where it gets to be scary is, we used to have more human intervention. With software doing more now, we need to have a lower tolerance for mistakes."
Of 23 recalls last year that the FDA classified as life-threatening, three involved faulty software.
Manufacturers test and inspect the software on their products, such as dialysis systems and patient monitors, before putting devices on the market. But they've been slow to follow the FDA in adopting new forensic technology because it is costly and still evolving, industry officials say. As a result, FDA software specialists are amassing evidence to show companies the value of the new testing. Meanwhile, traditional software checks, while good at detecting some flaws, are not thorough enough to find every mistake, according to computer scientists.
"If architects worked this way, they'd only be able to find flaws by building a building and then watching it fall down," said Paul Anderson, vice president of engineering at GrammaTech, which has sold forensic software technology to the FDA and medical device companies.
Finding a killer buried in a medical device's source code is not straightforward detective work. The directions for an implantable defibrillator might run over 100,000 lines - as long as War and Peace - and cover a multitude of possible actions that could take a decade for the device to run through. Fitzgerald's team of investigators doesn't have that kind of time, especially when patients are dying.
Much more here:
This is a really important article in my view, as it reminds us just how dangerous it can be to assume the operational software in devices – and indeed in clinical systems – is reliable, and how hard it can be to discover just what the problem is.
The application of techniques used to check the software that controls space rockets – static analysis – seems like just the right thing to do. It seems likely it will not be long before this and other techniques become mandatory for all new devices and systems. That may not be a bad thing!
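To make the idea of static analysis concrete, here is a minimal sketch in Python using the standard-library `ast` module. It is not the FDA's or GrammaTech's actual tooling, and the `dose_rate` function and its names are entirely hypothetical – the point is only that the checker inspects the code's structure and flags a risky division without ever running it, which is exactly how static analysis can catch a latent pump-software bug before a device reaches patients.

```python
import ast

# Hypothetical fragment of pump-controller code to be analysed.
# Note the code is never executed -- it is only parsed.
SOURCE = '''
def dose_rate(total_dose_mg, infusion_minutes):
    # Bug: no guard against infusion_minutes == 0,
    # which would fail at runtime on the device.
    return total_dose_mg / infusion_minutes
'''

def find_unguarded_divisions(source):
    """Return (line, denominator) pairs for every division whose
    denominator is a bare variable -- candidates for a missing
    zero-divisor check, found purely by walking the syntax tree."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Div):
            if isinstance(node.right, ast.Name):
                findings.append((node.lineno, node.right.id))
    return findings

print(find_unguarded_divisions(SOURCE))
```

Real analysers such as those mentioned in the article go far beyond this – tracking value ranges across 100,000-line programs – but the principle is the same: examine every possible path in the code on paper, rather than waiting for a failure in the field.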