I must thank wamckee for a most excellent and thought-provoking node title.

Programming by approximation - what does it mean?

We tend to think of programming as an exact science. However, ongoing trends are making this less and less the case. This is part of a continuous process of cutting the human out of the loop. Whether the end goal of removing the human completely will ever be achieved is still a big subject of debate.

Getting it right first time

The first programs I wrote were entered into the computer on punched cards. Any errors in the code meant punching replacement cards. We also had a 24-hour batch turnaround cycle, so much of the time between runs was spent poring over listings, trying to imagine and predict what the computer was going to do.

The primary consideration in the philosophy of Getting it right first time is: what is the cost of a mistake? There is a tradeoff between how long we spend striving for perfection and how much time is taken fixing bugs. In the early days of punched cards, bugs were very expensive, in terms of both human time and elapsed time.

Technology is improving all the time, enabling us to use approximation and so sparing us from having to be perfectionists.

Abstraction

Each step - assembly language, high level languages, metadata, and so on - takes us further away from the ones and zeros of what is actually happening. Abstraction also underlies Object Oriented Programming, and the use of design patterns takes this a step further. Methodologies provide another form of abstraction, this time relating to the design process.

Abstraction saves the human from having to worry about fine detail, and hence reduces the likelihood of making mistakes. Greater degrees of automation also help.
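
As a toy illustration - Python throughout, and my own example rather than anything from the sources below - here is the same computation at two levels of abstraction. The low level version invites exactly the kind of fine-detail mistakes (off by one, forgotten initialisation) that the high level version makes impossible:

    # Low level: we manage the index and the accumulator ourselves,
    # and every piece of bookkeeping is a chance to get it wrong.
    def total_low_level(values):
        total = 0
        i = 0
        while i < len(values):
            total = total + values[i]
            i = i + 1
        return total

    # High level: the fine detail is abstracted away by the language.
    def total_high_level(values):
        return sum(values)

    assert total_low_level([1, 2, 3]) == total_high_level([1, 2, 3]) == 6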

Prototyping

In recent methodologies such as evolutionary delivery and extreme programming, the whole application exists right from the start of the development cycle. It exists, but it does not work yet. The application exists as requirements documents. These are broken down into modules as part of the design process, and at the same time unit tests are written for each module. If the tests are in place first, it is much easier to guarantee what needs to be delivered (a small sketch follows below).

This means that prototype applications can be delivered to the users incomplete - often before the users have properly worked out what they want.
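
Here is a minimal sketch of the tests-first idea, using Python's standard unittest module. The discount function and its requirements are invented for illustration, not taken from the sources; the point is that the test captures the requirement before any working code exists, so "done" is defined in advance:

    import unittest

    # The requirement, captured as a unit test before the code is written.
    # discount() is a hypothetical function that the design calls for.
    class TestDiscount(unittest.TestCase):
        def test_ten_percent_off_large_orders(self):
            self.assertEqual(discount(200.0), 180.0)

        def test_no_discount_on_small_orders(self):
            self.assertEqual(discount(50.0), 50.0)

    # The module then grows until the tests pass; the prototype "works"
    # exactly to the extent that its tests do.
    def discount(order_total):
        if order_total > 100.0:
            return order_total * 0.9
        return order_total

    if __name__ == "__main__":
        unittest.main()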

Fault Tolerance

Although fault tolerance is primarily applied to hardware design, the space program showed how majority voting algorithms could be applied to software. A majority of several machines have to agree on the answer for it to be accepted; the dissent is recorded and analyzed after the event.
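
A sketch of the voting idea - the three redundant "machines" here are simply Python functions I have invented, with a deliberate fault planted in one of them. Each replica computes independently, the majority answer is accepted, and any dissent is logged for later analysis:

    from collections import Counter

    def vote(replicas, *args):
        """Accept the majority answer; record any dissent."""
        results = [replica(*args) for replica in replicas]
        answer, count = Counter(results).most_common(1)[0]
        if count <= len(results) // 2:
            raise RuntimeError("no majority: %r" % results)
        dissent = [(r.__name__, out)
                   for r, out in zip(replicas, results) if out != answer]
        if dissent:
            print("dissent logged:", dissent)  # analyzed after the event
        return answer

    # Three independent implementations of the same computation.
    def unit_a(x): return x * x
    def unit_b(x): return x ** 2
    def unit_c(x): return x * x + 1  # the faulty replica

    print(vote([unit_a, unit_b, unit_c], 5))  # 25, unit_c's dissent logged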

Emergent Algorithms

The use of genetic algorithms, classifier systems and neural networks means that the human has even less to do. The human provides a training set - sets of data comprising inputs and their expected outputs - together with a fitness function to evaluate how well the software performs, that is, how far it is from the goal. The internal storage of the emergent algorithm is randomly seeded, and it is made to grow some code.
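
A minimal genetic algorithm sketch - the target string, the mutation rate and the population size are all invented for illustration. The population is randomly seeded, the fitness function measures distance from the goal, and better candidates are selected and mutated until the goal is reached:

    import random

    TARGET = "approximation"                  # the goal (expected output)
    ALPHABET = "abcdefghijklmnopqrstuvwxyz"

    def fitness(candidate):
        # How close to the goal: count of characters already correct.
        return sum(c == t for c, t in zip(candidate, TARGET))

    def mutate(candidate, rate=0.1):
        return "".join(random.choice(ALPHABET) if random.random() < rate else c
                       for c in candidate)

    # Randomly seeded internal storage.
    population = ["".join(random.choice(ALPHABET) for _ in TARGET)
                  for _ in range(100)]

    generation = 0
    while max(fitness(p) for p in population) < len(TARGET):
        # Keep the fittest half, refill the rest with mutated copies.
        population.sort(key=fitness, reverse=True)
        parents = population[:50]
        population = parents + [mutate(random.choice(parents)) for _ in range(50)]
        generation += 1

    print("reached the goal after", generation, "generations")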

Despite the apparent sci-fi feel of emergence, much use has been made of these techniques in the fields of pattern recognition and image processing.

Sources:

  • Artificial Life : Steven Levy. ISBN: 0140231056
  • Principles of Software Engineering Management : Tom Gilb. ISBN: 0201192462