Interesting clip from a textbook my father-in-law is reading for his Masters in MIS
There are hundreds, if not thousands, of books on how to develop software. Books on C, C++, Java, CORBA, XML, databases, and the like abound. However, most software organizations don't have a very good track record with the software they develop. After studying thousands of software projects, the Standish Group observed that about 23% of software projects fail to deliver any working software at all [Standish01a]. Worse, these projects typically aren't cancelled until well after their original schedules and budgets have been exceeded.
The Standish study also showed that for projects that do deliver software, the average one is 45% over budget, 63% over schedule, and delivers only 67% of the originally planned features and functions. Based on our industry's track record, a software project that's estimated to take 12 months and cost $1 million can be reasonably expected to take closer to 20 months and cost about $1.5 million, while meeting only two thirds of its requirements.
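The book's 20-month, $1.5 million figures follow directly from applying the Standish overrun averages to the nominal estimate. A minimal sketch of that arithmetic (the variable names are mine, not the book's):

```python
# Nominal project estimate from the book's example.
est_months = 12
est_cost = 1_000_000  # dollars

# Standish averages for projects that do deliver software.
schedule_overrun = 0.63   # 63% over schedule
budget_overrun = 0.45     # 45% over budget
features_delivered = 0.67 # only 67% of planned features

actual_months = est_months * (1 + schedule_overrun)  # about 19.6 months
actual_cost = est_cost * (1 + budget_overrun)        # $1,450,000

print(round(actual_months, 1), round(actual_cost))
```

Rounding 19.6 months up gives the "closer to 20 months" in the text, and $1.45M is the "about $1.5 million."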
Tracy Kidder [Kidder81] reports that about 40% of the commercial applications of computers have proven uneconomical. These applications don't show a positive return on investment, in the sense that the job being automated ended up costing more to do after the system was installed than it did before. Return on investment is defined in Chapter 8, but, simply put, those organizations paid more to develop the software than the software ever earned back for them.
Assuming the Standish and Kidder data can be combined, the resulting statistics are rather grim. If 23% of all software projects are cancelled without delivering anything, and 40% of the projects that do deliver software are net money losers, then about 54% of all software projects are counterproductive in the business sense. Over half the time, the organizations that paid for software projects would actually have been better off financially had they never even started those projects.
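The 54% figure comes from treating the two rates as independent, as the paragraph assumes: the 40% loss rate applies only to the 77% of projects that actually deliver. A sketch of the combination:

```python
# Combining the Standish and Kidder rates, assuming (as the text does)
# that the two samples can be treated as independent.
p_cancelled = 0.23                      # Standish: delivered nothing
p_loser_given_delivered = 0.40          # Kidder: delivered but uneconomical

p_counterproductive = p_cancelled + (1 - p_cancelled) * p_loser_given_delivered
print(round(p_counterproductive, 3))    # 0.23 + 0.77 * 0.40 = 0.538
```

That 0.538 is the "about 54% of all software projects" in the text, and since it exceeds 0.5, it supports the "over half the time" conclusion.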