Alan Woodward looks at how 'software factories' are set to revolutionise software...
1991 saw the successful conclusion of one of the most remarkable enterprises in the history of invention.
A team of engineers at the London Science Museum managed to complete the construction of a cogwheel computer that had been designed, but never built, 170 years earlier by the Victorian scientist and mathematician, Charles Babbage (1791-1871).
During his lifetime, Babbage insisted that his cogwheel brain, which he named the Difference Engine, would be an ultra-reliable cogwheel calculator for printing mathematical tables devoid of any risk of human error. His efforts to build it, however, met largely with ridicule, and Babbage died a disappointed man in 1871.
Why couldn't Babbage build a Difference Engine in his lifetime?
Today, with the benefit of hindsight, we know that the biggest problem which stymied him was the lack of a precision metal industry.
The successful modern initiative to build a Difference Engine deliberately made use of components that were no more precisely engineered than Babbage himself could have produced back in the nineteenth century.
However, the 4,000 components - most of them cogwheels needing to be fashioned to be as alike as possible - were manufactured using modern industrial manufacturing processes, which ensured consistently high-quality components made to a standard pattern.
Today, despite his ultimate failure, Charles Babbage is regarded as the father of computing.
In the arcane world of cogwheel computing, the 170-year battle between the craft approach and the industrial approach to making components for Babbage's engines was decisively won by the industrial approach. Today, in the enormously important and influential business of computer software development - a business which, in a very real sense, has made the modern world possible - the craftsman-like way of doing things has long prevailed. But the day of the craftsman-like approach may be coming to an end, to be replaced by an industrial way of designing software that is faster, less risky, less expensive and more efficient.
There is widespread agreement throughout the computer software business today that software development as it is currently practised is in a bad way. As Jack Greenfield, an influential software architect at Microsoft, points out: "Software development is slow, expensive and error-prone, often yielding products with large numbers of defects, causing serious problems of usability, reliability, performance, security and other qualities of service."
Greenfield goes on to quote research from the leading computer industry research house, the Standish Group. This research states that businesses in the United States spend about $250 billion annually on software development, spread across about 175,000 projects.
The research also indicates that only about 16 per cent of these projects finish on schedule and within budget, while another 31 per cent are - on average - cancelled each year, mainly due to quality problems, with overall losses of about $81 billion. A further 53 per cent exceed their budget by an astonishing average of 189 per cent, incurring a total loss of about $59 billion.
The same research suggests that even those projects which do get completed deliver an average of only about 42 per cent of the originally planned features.
Some progress has been achieved in recent years through new kinds of tools which, rather than seeking to revolutionise software development, aim instead to make developers more productive. A particularly useful tool here has been the application development framework (ADF).
These are a form of open-to-all software package that facilitates rapid customisation of the software for a particular application to the precise requirements of the organisation in question. But even ADFs are really still sets of generic building blocks, intended to be used across a wide variety of applications.
Today, the reason why software development is such a laborious, potentially risky and fraught matter is that the way in which software is designed means that opportunities to re-use code (i.e. pre-written programs) are much rarer than one might imagine.
There has been little infrastructure, and few markets, to encourage developers to supply potentially re-usable programming components. But perhaps reuse of pre-packaged functionality was not the way forward after all, and a different perspective was required.
That different perspective appears to be a concept known as the domain-specific language (DSL). A DSL is a special kind of language designed to be used for a particular application or solution.
As one might imagine, a DSL becomes more useful the more tightly focused the particular application or solution it caters for, because a tight focus increases the likelihood that another programmer will be able to use the same features of the language to specify what he or she wants the computer to do.
In fact, some DSLs have relatively wide domain remits (e.g. retail banking) while others have narrow remits (e.g. operation of a retail banking ATM network). Incidentally, Microsoft has recently developed a special kind of DSL whose particular domain is helping developers to design other DSLs. In many ways this is the type of catalyst that will see DSLs enter the mainstream, as it will now be possible for organisations or their suppliers to define languages aimed specifically at providing solutions for problems within their business domain.
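To make the idea concrete, here is a minimal sketch of what a very narrow-remit DSL might look like, embedded in Python. Everything in it - the rule vocabulary, the `WithdrawalRules` class and the sample policy - is a hypothetical illustration invented for this article, not a real banking product or any specific vendor's DSL; it simply shows how a small domain vocabulary lets policy be stated in near-English terms rather than general-purpose code.

```python
# Hypothetical sketch: a tiny internal DSL for ATM withdrawal policy.
# All names and rules here are illustrative assumptions, not a real product.

class WithdrawalRules:
    """Collects policy rules expressed in a small domain vocabulary."""

    def __init__(self):
        self.rules = []

    def require(self, description, predicate):
        # 'require' is the single verb of this tiny language;
        # returning self allows rules to be chained fluently.
        self.rules.append((description, predicate))
        return self

    def evaluate(self, request):
        # Return the descriptions of every rule the request violates.
        return [desc for desc, pred in self.rules if not pred(request)]


# A domain expert could state ATM policy almost in plain English:
atm_policy = (
    WithdrawalRules()
    .require("amount is a multiple of 10", lambda r: r["amount"] % 10 == 0)
    .require("amount within daily limit", lambda r: r["amount"] <= r["daily_limit"])
    .require("sufficient funds", lambda r: r["amount"] <= r["balance"])
)

violations = atm_policy.evaluate(
    {"amount": 250, "daily_limit": 300, "balance": 200}
)
# 'violations' lists each rule the request breaks - here, "sufficient funds"
```

The point of such a sketch is that the narrow domain does the work: because every withdrawal request is judged against the same small vocabulary, another programmer - or even a non-programmer - can read and extend the policy without understanding the machinery underneath.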
I doubt even Babbage would have thought of that.
* Alan Woodward is chief technology officer at the business and information technology consultancy Charteris. Email: email@example.com