The first computer programs that were in any way worthy of the name were input by connecting wires, flipping switches and typing in numbers. This was the first generation: raw machine code.
The next step was to get a front-end program together where you could enter (relatively) readable symbols such as ADD AX, @FOO or JNZ BAR, and the machine would translate these into the correct opcode numbers. This was the second generation: assembler. It was a big improvement and larger programs could be made.
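A minimal sketch of the idea, in Python, with an invented two-instruction machine (the mnemonics, opcode numbers and register codes here are made up for illustration, not any real instruction set):

    # Toy assembler: translate mnemonic lines into opcode numbers.
    # Real assemblers also handle labels, symbols and addressing modes.
    OPCODES = {"ADD": 0x01, "JNZ": 0x02}      # invented opcode numbers
    REGISTERS = {"AX": 0x00, "BX": 0x01}      # invented register codes

    def assemble(line):
        """Translate one 'MNEMONIC OPERAND' line into machine-code bytes."""
        mnemonic, operand = line.split()
        if operand in REGISTERS:
            arg = REGISTERS[operand]
        else:
            arg = int(operand, 16)            # treat anything else as a hex address
        return [OPCODES[mnemonic], arg]

    print(assemble("ADD AX"))    # -> [1, 0]
    print(assemble("JNZ 1F"))    # -> [2, 31]

The point is just the translation step: one readable line in, the right numbers out.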
The third generation contains procedural and object-oriented programming languages; C is the best known of the early ones. You could define routines with parameters and return values, give things names of any length, and build large programs out of a hierarchy of building-block routines. The code was compiled to machine code or interpreted. It was a big improvement and larger programs could be made.
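C was the canonical example, but the idea sketches just as well in Python (the routines and figures here are invented for illustration): named routines with parameters and return values, composed into bigger ones.

    # Building-block routines: named, parameterised, returning values,
    # and composable into a hierarchy.
    def net_price(gross, tax_rate):
        return gross / (1 + tax_rate)

    def order_total(gross_prices, tax_rate):
        return sum(net_price(p, tax_rate) for p in gross_prices)

    print(order_total([10.0, 21.0], 0.05))    # -> 29.523...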
And as always, we programmers are victims of our own success. We (or our bosses) always want more. We are always asking "OK, but how can we code more, better, faster?" Code can in theory do anything, but in practice it is limited to what we can get written, so demand always exceeds supply.
The Japanese spearheaded the search for the next generation with a project announced in 1981: the Fifth Generation Computer Systems project. They wanted to build a local industry and steal a march on the rest of the world. Everyone else wanted to keep up.
The next paradigm was going to be a step up from specifying what to do in step-by-step detail: just specify what you want, and let the machine find the best strategy. Declarative programming and functional programming, not imperative programming.
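To illustrate the contrast (a Python sketch; the sorting task is just a stand-in example):

    data = [3, 1, 4, 1, 5, 9, 2, 6]

    # Imperative: spell out *how*, step by step (a hand-rolled selection sort).
    result = list(data)
    for i in range(len(result)):
        smallest = min(range(i, len(result)), key=result.__getitem__)
        result[i], result[smallest] = result[smallest], result[i]

    # Declarative: state *what* you want; the machine picks the strategy.
    also_result = sorted(data)

    assert result == also_result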
SQL was an example of a declarative language: you didn't have to say how to get the result, just specify what results you wanted.
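For instance (run here through Python's sqlite3 module; the table and figures are invented for illustration), the query names the result wanted and the database engine decides how to scan, filter and aggregate:

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE staff (name TEXT, dept TEXT, salary REAL)")
    db.executemany("INSERT INTO staff VALUES (?, ?, ?)",
                   [("Alice", "HR", 40000), ("Bob", "IT", 50000),
                    ("Carol", "IT", 55000)])

    # No loops, no access paths: just a description of the result.
    for row in db.execute("SELECT dept, AVG(salary) FROM staff GROUP BY dept"):
        print(row)    # -> ('HR', 40000.0) then ('IT', 52500.0)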
Natural language processing was going to be big. You'd just tell the machine what you wanted in your own words.
Several spin-offs from AI looked interesting – we're not talking about any blue-sky, human-equivalent understanding in the machine here, just a few techniques from the lab that could take on work that was previously humans-only. Prolog was one interesting approach to programmatic logic. Expert systems were going to be big: they could answer your questions (and even ask you the right ones) given incomplete data. Neural networks could match patterns.
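In the expert-system spirit, here is a toy forward-chaining rule engine (a Python sketch; the rules and facts are invented, and a real system would add certainty factors, questioning of the user, and explanations):

    # Each rule: if all its conditions are known facts, conclude something new.
    rules = [
        ({"has_fever", "has_rash"}, "suspect_measles"),
        ({"suspect_measles"}, "recommend_specialist"),
    ]
    facts = {"has_fever", "has_rash"}

    # Forward chaining: keep firing rules until nothing new is learned.
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True

    print(facts)    # now includes 'suspect_measles' and 'recommend_specialist'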
We never got there. 4GL became just another one of yesterday's buzzwords. It didn't clear the backlog or produce an avalanche of production code compared to the old techniques.
Meanwhile, we've made incremental gains. Object-oriented programming went mainstream because it was a logical addition to existing techniques that offered more code reuse, and thus less new code. So did garbage collection, because it means less time spent writing and debugging housekeeping code. Our function libraries have become class libraries and are growing ever more extensive and usable.
But there hasn't been a paradigm shift. Prolog seems to have been a dead end, or at best of limited use. Expert systems, too, turned out to be good for only a few limited uses.
Tools that generate 3GL code have morphed into drag-and-drop Rapid Application Development environments such as Visual Basic and Delphi. Coders soon learned that code generation wasn't worth anything at all unless you could modify the generated code, so tools that can round-trip the altered code back into the model are greatly preferred – and that it is all best done on top of a strong 3GL. VB still has a reputation as a toy for being weak in this regard.
PowerBuilder, for instance, had kick-ass visual data-display generation tools, backed by a crappy little scripting language that made Visual Basic look powerful, consistent and flexible. Had. It has vanished.
As for the others: SQL may be declarative, but that doesn't stop it being quirky and involved, just like any other programming language. You won't find your HR director using it. Functional programming languages have continued to progress, but have never yet found a mass market. If anything, they are harder to get your head around than imperative languages.
There has been a lot of incremental change. We have better tools than in 1980, and a more varied toolchest. Perhaps the accumulation of incremental change, from C in a text editor all the way to a Java IDE or Visual Studio .NET with drag-and-drop GUI toolkits, and Perl and Python on the side, is enough to declare a new generation.
But in my opinion, if there is a fourth generation of programming languages, a paradigm shift, it hasn't broken out of the lab yet. Perhaps declarative, context-based, natural-language techniques will someday come to fruition.
Perhaps our programming language generations are counted 1, 2, 3 (4 was a dead end; backtrack), OO, OOGC. Perhaps a branching tree is a better metaphor than a linear sequence.
The term 4GL is sometimes now used to refer to declarative programming, and sometimes to refer to "application-specific" languages – vertical tools that will generate stuff very rapidly, provided you select from a limited menu of options. Often this means report writing and data entry, both database-bound. These tend to be great if what you want to do was foreseen by the system's designers. Otherwise, they are useless. The 80-20 rule applies: 80% of the job can be dead easy, the other 20% impossible or nearly so.