FUNDING FOR NEW SOFTWARE PARADIGM

(Washington, DC, press release by IP Newswire, 1 April 1998) The Defense Advanced Research Projects Agency (DARPA) today announced a major new initiative in software engineering. F.P. Rivers, program manager for the initiative, said that it addresses a major problem facing the US military: much of current information technology is too "compute-intensive" to be deployed where it is most needed -- at the small-unit or even individual-soldier level.

The initiative has its origins in a fortuitous observation: Rivers and several colleagues noticed that users of the most widely used platform -- Windows 95 -- were routinely presented with messages that an unknown, unrecoverable error had occurred, and that these users just as routinely ignored those messages. "This occurred not just in casual use, but also in mission-critical operations," Rivers said. "Once we started thinking about these messages not as a help, but as a hindrance, several other observations came together."

In a typical program, 40% to 80% of the code is devoted to error detection and error handling. "Software bloat" -- the ever-increasing size of programs -- has been blamed on programmers adding more and more features, but it could just as well be blamed on all the error handling associated with those features. To make matters worse, multiple studies had shown that much, if not most, of the error-handling code was never tested. Sometimes this was because of time and budget pressures; sometimes the potential errors were so obscure and complex that the situations were too difficult to create "in the lab". This research was backed up by actual experience: error-handling code was often found to contain significant errors of its own.

Rivers summarized: "So, the typical program is overloaded with code that is rarely used, that may not work, and whose output is likely to be ignored anyway." He concluded: "With this code removed, programs will be dramatically smaller and will run somewhat to noticeably faster."

Many software developers, including several major vendors, have already taken tentative steps in this direction, having recognized pieces of the problem without grasping the "big picture". Rivers said he expects this new approach, dubbed "Fault-Oblivious Computing", to quickly become the dominant software-engineering paradigm. He acknowledged that there were small, highly specialized segments where fault-tolerant computing and program verification would still be of value.

A major component of this initiative will be to develop tools that automatically identify and remove unneeded error-handling code from existing applications. The success of this approach would be bad news for memory-chip manufacturers, who are already hard hit by decreased demand.
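
To make the premise concrete, here is a minimal illustrative sketch in C of the kind of before-and-after transformation such tools might perform. The routine read_config() and the file name settings.cfg are invented for this example and do not appear in the release.

    /* Conventional version: most of the lines exist only to detect and
     * handle errors -- the proportion the release is poking fun at. */
    #include <stdio.h>
    #include <stdlib.h>

    static char *read_config_checked(const char *path, long *out_len)
    {
        FILE *f = fopen(path, "rb");
        if (f == NULL) {
            fprintf(stderr, "cannot open %s\n", path);
            return NULL;
        }
        if (fseek(f, 0, SEEK_END) != 0) { fclose(f); return NULL; }
        long len = ftell(f);
        if (len < 0) { fclose(f); return NULL; }
        rewind(f);
        char *buf = malloc((size_t)len + 1);
        if (buf == NULL) { fclose(f); return NULL; }
        if (fread(buf, 1, (size_t)len, f) != (size_t)len) {
            free(buf);
            fclose(f);
            return NULL;
        }
        buf[len] = '\0';
        fclose(f);
        *out_len = len;
        return buf;
    }

    /* "Fault-oblivious" version: the same logic with every check stripped,
     * as the proposed tools would presumably emit. Much shorter -- and
     * cheerfully undefined the moment fopen() fails. */
    static char *read_config_oblivious(const char *path, long *out_len)
    {
        FILE *f = fopen(path, "rb");
        fseek(f, 0, SEEK_END);
        long len = ftell(f);
        rewind(f);
        char *buf = malloc((size_t)len + 1);
        fread(buf, 1, (size_t)len, f);
        buf[len] = '\0';
        fclose(f);
        *out_len = len;
        return buf;
    }

    int main(void)
    {
        long n = 0;
        char *text = read_config_checked("settings.cfg", &n);
        if (text != NULL) {
            printf("read %ld bytes\n", n);
            free(text);
        }
        (void)read_config_oblivious;   /* shown for comparison only */
        return 0;
    }

The checked version spends roughly two-thirds of its lines on error paths, squarely within the 40% to 80% range cited above; the oblivious version is smaller and faster, exactly as promised, right up until something goes wrong.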

Version Info: $Revision: 1.2 $ $Date: 2002/03/10 07:59:22 $
Copyright 1998 by Douglas B. Moran
Permission to make digital or hard copy of part or all of this work is granted without fee provided that copies are not made or distributed for profit or commercial advantage.