
news of the month
Programming Languages and their History
    Nov 09 2001 - 14:20 EST
RattleSnake writes:
When I decided to write this, I initially thought about giving an answer to the many "what's the best programming language" threads we always had on the board. First of all: there is no "best programming language", simply because it depends on which field you are especially interested in. Internet programming and writing applications, for example, are obviously two different things, and the languages that are suitable for them have to be fundamentally different. Thus, you will have to decide on your own what you want to do. Ok, so you already know exactly what you want to do; what is the right language? Well, answering this question objectively and in enough detail is difficult. I decided to write about the history of programming languages not only to give you an impression of what the different languages are good for, but also to show you how it all developed, so that you can understand the languages of today better.

I'm sure I cannot cover all the languages in this one article, but I think I have mentioned all the really important ones and given a reasonably objective account. However, perhaps others can follow up and write articles that each deal with one particular language.

Whenever a language is mentioned, you will find a link at the bottom of this article, where I have a little collection of links that should give you some hints about where to start if you are interested in that language. If they don't help you, just use a search engine or memo me; perhaps I can help. I hope this will help all of you who do not yet have much experience in programming but who are willing to learn. Ok, let's start ...

Computers need detailed instructions to perform a specific task, and the notation these instructions are written in is known as a programming language. These languages were first composed of a series of steps to wire a particular program; then they morphed into a series of steps keyed into the computer to be executed, and even later these languages acquired advanced features such as logical branching and object orientation.

The first computer, the "difference engine", was designed by Charles Babbage. It could only execute tasks by changing the gears which executed the calculations. Thus, the earliest form of a computer language was actually physical motion. Physical motion was first replaced by electrical signals when the US Government built the ENIAC in the 1940s.

In 1945, John Von Neumann was working at the Institute for Advanced Study. He developed two important concepts that directly affected the path of computer programming languages. The first was known as "shared-program technique". This technique stated that the actual computer hardware should be simple and not need to be hand-wired for each program. Instead, complex instructions should be used to control the simple hardware, allowing it to be reprogrammed much faster.

The second concept was also extremely important to the development of programming languages. Von Neumann called it "conditional control transfer". This idea gave rise to the notion of subroutines, or small blocks of code that could be jumped to in any order, instead of a single set of chronologically ordered steps for the computer to take. The second part of the idea stated that computer code should be able to branch based on logical statements such as "IF (expression) THEN", and to loop, as with a "FOR" statement. "Conditional control transfer" also gave rise to the idea of "libraries", which are blocks of code that can be reused over and over.
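These ideas still shape every language in use today. As a minimal sketch (names made up for illustration, not anyone's original code), here are all three of them at once in modern C++ syntax:

#include <iostream>

// A subroutine: a small block of code that can be jumped to from anywhere.
void report(int n) {
    std::cout << "call number " << n << "\n";
}

int main() {
    for (int i = 1; i <= 3; ++i) {   // looping, as with a "FOR" statement
        if (i % 2 == 1) {            // branching: "IF (expression) THEN"
            report(i);               // conditional control transfer into the subroutine
        }
    }
    return 0;
}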

In 1949, the language Short Code was invented. It was the first programming language for electronic devices, and it required the programmer to change its statements into 0's and 1's by hand. Anyhow, this was the first step towards the complex languages of today. In 1951, Grace Hopper wrote the first compiler, A-0. At this point I already want to tell you a bit more about the difference between compiled and interpreted languages. If you already know what this is or if you are not interested, just skip the following part:

A compiled language and an interpreted language don't have any remarkable differences concerning their syntax or features, or these differences are at least not caused by the fact that they are compiled or interpreted. In fact, there is only one single difference between them, and it concerns how the language's statements are converted into machine code, 1's and 0's. A compiler is a program that turns the language's (more or less high-level) statements into binary code and produces a new file that can be executed without any further compilation. Compilers also often detect errors and allow you to debug your code when you want to compile it. Compilers today produce machine code that is most probably as fast as any program that was written in Assembly or machine code. In contrast to that, an interpreted program or script is directly interpreted by an interpreter program that understands the language's syntax and translates the operations to machine code for you on the fly (and tells the computer to run this code). The interpreted program has to be interpreted each time it is run, not only once. This saves a lot of time during the development process of a program, since you do not have to compile every test program or beta version before debugging / testing it, but it also often results in code that is slower while being executed. Interpreted languages are often platform independent, since an interpreter can easily be written for different OSes. You can also compile a program for different OSes, but this also means that you might have to rewrite a lot of code in order to keep functions that include OS-specific features. Java is a strange example of ... well, of an interpreted language, I would say. Java source code is first compiled to bytecode, which is later on additionally interpreted by an interpreter, making it platform-independent and extremely useful for web applications.
In a Nutshell:
A compiled language is written and then run through a compiler, which checks its syntax and translates it into a binary executable. Since an interpreted language is not compiled, it must be checked for errors at run-time, which makes it quite a bit slower than a compiled language (like C). Perl is an example of an interpreted language. Remember, though, that just because a language is interpreted doesn't necessarily mean it is not full-featured, or that it is simplistic. Perl can get very complex and very cryptic, very quickly.
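To make the two workflows concrete, here is a minimal sketch; the file names are made up, but g++ and perl are real, commonly available tools:

// hello.cpp -- a compiled language: translated once, then run natively.
//   $ g++ hello.cpp -o hello    (compile: syntax errors are caught here)
//   $ ./hello                   (the binary runs without the compiler)
// An interpreted language skips the compile step; the interpreter re-reads
// and translates the source every single time the program is run:
//   $ perl hello.pl
#include <iostream>

int main() {
    std::cout << "Hello, world\n";
    return 0;
}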


In 1957, the first of the major languages appeared in the form of FORTRAN. Its name stands for FORmula TRANslating system. The language was designed at IBM. The components were very simple, and provided the programmer with low-level access to the computer's innards. Today, this language would be considered restrictive, as it only included IF, DO, and especially the GOTO statement. GOTO is used to jump around within the code of a program; here's a little example that would cause an endless loop:

label:
command1
command2
command3

GOTO label


"label" marks a position in the code (syntax can differ from language to language). After the three commands are executed, the program starts to process all the commands again from somewhere in the code, in this case from the position on that we marked as label. This obviously can cause an endless loop unless you use IF statements. It can also be used to leave out parts of the code that are not needed (if some condition does not allow it). Nowadays, GOTO still exists in hybrid languages like C++ (improved C), Visual Basic (improved Basic) and Delphi (improved Pascal). Hybrid languages are object-oriented versions of old Languages like Basic, Pascal and C that still support old statements like GOTO together with new object-oriented concepts. I personally think this makes a language more varied and can be still of use even today. I consider these early statements the greatest step forward in the development of programming languages. The basic types of data in use today also got their start in FORTRAN, these included logical (or boolean) variables (TRUE or FALSE), and integer, real, and double-precision numbers.

Although FORTRAN was good at handling numbers, it was not so good at handling input and output, which mattered most to business computing. Business computing started to take off in 1959, and because of this, COBOL was developed. It was designed from the ground up as the language for businessmen. Its only data types were numbers and strings of text. It also allowed for these to be grouped into arrays and records, so that data could be tracked and organized better. It is interesting to note that a COBOL program is built in a way similar to an essay, with four or five major sections that build into an elegant whole. COBOL statements also have an English-like syntax, making the language quite easy to learn. All of these features were designed to make it easier for the average business to learn and adopt it.

In 1958, Mr. John McCarthy from the famous MIT created the LISt Processing (or LISP) language. It was actually designed for Artificial Intelligence (AI) research. Because it was designed for such a highly specialized field, its syntax has rarely been seen before or since. The most obvious difference between this language and other languages is that the basic and only type of data is the list, denoted by a sequence of items enclosed by parentheses. LISP programs themselves are written as a set of lists, so that LISP has the unique ability to modify itself, and hence grow on its own. The LISP syntax was known as "Cambridge Polish," as it was very different from standard Boolean logic:

x V y - standard Boolean logic notation
OR(x,y) - prefix (function) notation
(OR x y) - "Cambridge Polish", the parenthesized prefix form actually used in LISP programs


Note: LISP remains in use today because of its highly specialized and abstract nature.

The Algol language was created by a committee for scientific use in 1958. Its major contribution is being the root of the tree that has led to such languages as Pascal, C, C++, and also Java. Algol also had a clearly defined grammar, specified in BNF (Backus-Naur Form). Furthermore, it included some new concepts, such as recursive calling of functions: a function within the program may call itself, repeating its own body until some condition stops it. However, history led to the adoption of "smaller and more compact languages" (quote from somewhere I don't remember), such as Pascal.
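To make recursion concrete, here is a minimal sketch in C++ (the classic factorial example, not tied to Algol's actual syntax):

#include <iostream>

// Recursion: the function calls itself with a smaller problem
// until the base case stops the chain of calls.
unsigned long long factorial(unsigned int n) {
    if (n <= 1) {
        return 1;                     // base case: no further self-call
    }
    return n * factorial(n - 1);      // recursive call
}

int main() {
    std::cout << "5! = " << factorial(5) << "\n";   // prints 5! = 120
    return 0;
}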

Pascal was begun in 1968 by Niklaus Wirth and somewhat still exists as a hybrid language called "Object-Pascal" or Delphi. Its development was mainly out of necessity for a good teaching tool. In the beginning, the language designers had no hopes for it to enjoy widespread adoption. Instead, they concentrated on developing good tools for teaching such as a debugger and editing system and support for common early microprocessor machines which were in use in teaching institutions.

Pascal was designed with a very orderly approach: it combined many of the best features of the languages in use at the time, COBOL, FORTRAN, and ALGOL. While doing so, many of the irregularities and oddball statements of these languages were cleaned up, which helped it gain users (Bergin, 100-101). Good mathematical features and effective I/O features made it a highly successful language. Pascal also improved the "pointer" data type, a very powerful feature of any language that implements it:

A pointer is a way to get at another object. Essentially, it is a way to grab an instance of an object and then either pass that instance a message or retrieve some data from it. A pointer is actually just the address of where an instance is held in memory. Some piece of your program can either possess an instance of an object, or merely know where an instance of an object is. An instance of an object is a chunk of memory that is big enough to store all the member data of that object; a pointer is an address that explains how to get to where that instance is actually held in memory.
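As a minimal C++ sketch of this idea (the struct and the numbers are made up for illustration; Pascal's NEW and DISPOSE, mentioned further below, correspond to C++'s new and delete):

#include <iostream>

struct Computer {
    int ram_mb;   // member data stored in each instance
};

int main() {
    Computer* pc = new Computer{512};   // dynamic variable: the instance is created at run time
    Computer* p  = pc;                  // a pointer: nothing but the address of that instance

    std::cout << p->ram_mb << " MB\n";  // reach the instance's data through the pointer
    std::cout << p << "\n";             // the pointer itself is only a memory address

    delete pc;                          // hand the memory back (Pascal: DISPOSE)
    return 0;
}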


Pascal also added the CASE statement, which allows instructions to branch like a tree, in such a manner:

CASE expression OF
    possible-expression-value-1:
        command1
        ...
    possible-expression-value-2:
        command5
        ...
END
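The same idea survives in C and C++ as the switch statement; here is a minimal sketch:

#include <iostream>

int main() {
    int expression = 2;
    switch (expression) {        // C/C++'s descendant of Pascal's CASE
        case 1:
            std::cout << "one\n";
            break;               // leave the switch; without break, execution falls through
        case 2:
            std::cout << "two\n";
            break;
        default:
            std::cout << "something else\n";
    }
    return 0;
}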


Pascal also helped the development of dynamic variables, which could be created while a program was being run, through the NEW and DISPOSE commands (see the pointer sketch above). However, Pascal did not implement dynamic arrays or groups of variables. Later on, a successor to Pascal was created: Modula-2. However, C appeared at about the same time and gained popularity and users with immense speed.

C was developed in 1972 by Dennis Ritchie while working at Bell Labs in New Jersey. The transition in usage from the first major languages to the major languages of today occurred with the transition between Pascal and C. Its direct ancestors are B and BCPL, but its similarities to Pascal are quite obvious. All of the features of Pascal, including the new ones such as the CASE statement are available in C. C uses pointers extensively and was built to be fast and powerful at the expense of being hard to read. But because it fixed most of the mistakes Pascal had, it won over former-Pascal users quite rapidly.

Ritchie developed C for the new Unix system being created at the same time. Because of this, C and Unix go hand in hand. Unix gives C such advanced features as dynamic variables, multitasking, interrupt handling, forking, and strong, low-level input/output. Because of this, C is often used to program OSes such as Unix, Windows, the MacOS, and Linux. Furthermore, C grew into the famous language it is today because of Unix, in my humble opinion. Being a Unix programmer forced you to learn C, and thus it became a commonly used language.

In the late 1970's and early 1980's, a new programming method was being developed. It was known as Object Oriented Programming, or OOP. You can skip the following brief introduction to OOP if you already know what that means or if you are not interested:

In the real world, you often have many objects of the same kind. For example, your computer is just one of many computers in the world. Using object-oriented terminology, we say that your computer object is an instance of the class that can be identified as "computers". Computers have some state (cpu speed, RAM size, hd space) and behavior (open CD-ROM, type letters with the keyboard) in common. However, each computer's state is independent of and can be different from that of other computers. When building computers, manufacturers take advantage of the fact that computers share characteristics, building many computers from the same blueprint. It would be very inefficient to produce a new blueprint for every individual computer manufactured. In object-oriented software, it's also possible to have many objects of the same kind that share characteristics: rectangles, employee records, video clips, and so on. Like the computer manufacturers, you can take advantage of the fact that objects of the same kind are similar, and you can create a blueprint for those objects. A software blueprint for objects is called a class. The class for our computer example would declare the instance variables necessary to contain the current cpu speed, the current RAM size and the current hd space ... for every single computer object. The class would also declare and provide implementations for the instance methods that allow the common geek to open his CD-ROM and type letters with the keyboard. Would be a shame if not, wouldn't it? After you've created the computer class, you can create any number of computer objects from the class (remember, only in the OO world). When you create an instance of a class, the system allocates enough memory for the object and all its instance variables. Each instance gets its own copy of all the instance variables defined in the class.
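Here is a minimal C++ sketch of that computer example (class name, members and numbers are all made up for illustration):

#include <iostream>

// The class is the blueprint; every object built from it is an instance.
class Computer {
public:
    Computer(int cpuMhz, int ramMb) : cpuMhz_(cpuMhz), ramMb_(ramMb) {}

    // behavior (instance methods)
    void openCdRom() const { std::cout << "CD-ROM tray opens\n"; }
    int ramMb() const { return ramMb_; }

private:
    // state (instance variables) -- each instance gets its own copy
    int cpuMhz_;
    int ramMb_;
};

int main() {
    Computer mine(800, 256);    // two instances of the same class,
    Computer yours(1200, 512);  // each with completely independent state

    mine.openCdRom();
    std::cout << yours.ramMb() << " MB\n";
    return 0;
}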

Generally speaking, objects are defined in terms of classes. You know a lot about an object by knowing its class. Even if you don't know what a Mac is, if I told you it was a computer, you would know that it had a cpu, RAM, a CD-ROM and so on. Object-oriented systems take this a step further and allow classes to be defined in terms of other classes. For example: Macs and IBM PCs are kinds of computers. In object-oriented terminology, Macs and IBM PCs are subclasses of the computer class. Similarly, the computer class is the superclass of Macs and IBM PCs. Each subclass inherits state (in the form of variable declarations) from the superclass. Macs and IBM PCs share some state: an existing CD-ROM, an existing keyboard. Also, each subclass inherits methods from the superclass. You can open a CD-ROM either on a Mac or an IBM PC; that doesn't matter. However, subclasses are not limited to the state and behaviors provided to them by their superclass. Subclasses can add variables and methods to the ones they inherit from the superclass. Well, I don't want to cause another Macs vs windoze computers fight, so try to imagine the differences on your own :). BTW: You are not limited to just one layer of inheritance. The inheritance tree, or class hierarchy, can be as deep as needed. Methods and variables are inherited down through the levels. In general, the farther down in the hierarchy a class appears, the more specialized its behavior.
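Continuing the sketch from above, inheritance looks like this in C++ (again, made-up names for illustration):

#include <iostream>

class Computer {                         // the superclass
public:
    void openCdRom() const { std::cout << "CD-ROM tray opens\n"; }
};

class Mac : public Computer {            // a subclass: inherits state and methods ...
public:
    void boot() const { std::cout << "happy Mac face\n"; }   // ... and adds its own
};

class IbmPc : public Computer {
public:
    void boot() const { std::cout << "BIOS beep\n"; }
};

int main() {
    Mac mac;
    IbmPc pc;
    mac.openCdRom();   // inherited from the superclass -- works on either subclass
    pc.openCdRom();
    mac.boot();        // subclass-specific behavior
    pc.boot();
    return 0;
}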


Bjarne Stroustrup liked this method and developed extensions to C known as "C With Classes". This set of extensions developed into the full-featured language C++, which was released in 1983. C++ was designed to organize the raw power of C using OOP, but maintain the speed of C and be able to run on many different types of computers. That's the official version; I personally think that they were in urgent need of an object-oriented language that every common programmer would be able to learn - upgrading C to C++ was the best thing Bjarne could do. No offense, C++ was and actually still is the very language for offline coding. It is the most common language for applications, especially simulations of any kind (games), today, I would say. Once again, a quote from somewhere I don't remember:

"C++ provides an elegant way to track and manipulate hundreds of instances of people in elevators, or armies filled with different types of soldiers. It is the language of choice in today's AP Computer Science courses."

In the early 1990's, interactive TV was meant to be the technology of the future. Sun Microsystems decided that interactive TV needed a special, portable language (one that can run on many types of machines). This language became Java, or rather some sort of ancestor of Java. In 1994, the Java project team changed their focus to the web after the interactive-TV idea had failed completely. Java was completely redesigned, and the next year, Netscape licensed Java for use in their internet browser, Navigator. At this point, Java became the language of the future and several companies announced applications which would be written in Java.

Though Java is a text-book example of a good, object oriented language, it may be the "language that wasn't". It has serious optimization problems, meaning that programs written in it run very slowly. It's a matter of fact. And Sun has hurt Java's acceptance by engaging in political battles over it with Microsoft. But Java may wind up as the instructional language of tomorrow as it is truly object-oriented and implements advanced techniques such as true portability of code and garbage collection. Did I hear somebody ask? Oh yes, garbage collection. Once again, just leave out the following paragraph if you already know what this is or if you are not interested:

Garbage Collection summed up:
In a garbage-collected language, it is the responsibility of each module to maintain its data structures and arrange for them to only contain those objects which that module needs. When an object is no longer needed locally, its local references are extinguished (deleted, copied over, popped off the stack, etc.). In a non-garbage-collected language, in addition to doing this, each module must also determine whether any other module (including ones which haven't been written yet) still accesses the object, and free it if not. Garbage collection thus reduces "the base problem of not having a handle on the useful life of your objects" from a global one to a local one.

I took this explanation from here; go there if you want to read more details.
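C++ has no garbage collector, but its std::shared_ptr illustrates the local-versus-global point nicely: reference counting is a simple form of automatic memory management, so each piece of code only has to manage its own references. A minimal sketch:

#include <iostream>
#include <memory>

struct Buffer {
    ~Buffer() { std::cout << "buffer freed\n"; }
};

int main() {
    std::shared_ptr<Buffer> a = std::make_shared<Buffer>();
    {
        std::shared_ptr<Buffer> b = a;  // a second, purely local reference
    }                                   // b is extinguished; the object lives on (a still holds it)

    a.reset();  // the last reference is extinguished -> the object is reclaimed
                // automatically; no module ever had to ask "does anyone else still need this?"
    std::cout << "done\n";
    return 0;
}

Note that this is reference counting, not a tracing garbage collector like Java's, but the division of responsibility it buys you is the same.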


Visual Basic is often taught as a first programming language today, as it is based on the BASIC language developed in 1964 by John Kemeny and Thomas Kurtz. BASIC is a very limited language and was in fact designed for non-computer-science people. Statements are chiefly run sequentially, but program control can change based on IF..THEN and GOSUB statements, which execute a certain block of code and then return to the original point in the program's flow. Microsoft has extended BASIC in its Visual Basic (VB) product. The heart of VB is the form, or blank window, on which you drag and drop components such as menus, pictures, and slider bars. These items have properties (such as their color) and events (such as clicks and double-clicks) and are central to building any user interface today. A common expression for these items is "widget". VB was most often used to create quick and simple interfaces to other Microsoft products such as Excel and Access without needing a lot of code, though it was always possible to create full applications with it.

Now the time has come for VB to become a fully integrated member of the great family of object-oriented programming languages: the next upgrade from VB 6.0 to VB.NET (or VB 7) will redesign VB completely and include an improved class system with constructors and destructors as well as inheritance and much more. If you are a VB programmer, read Microsoft's tutorials about upgrading from VB to VB.NET. Note that Visual Basic is a proprietary language owned by Microsoft, which does not allow you to create platform-independent applications with it. Nevertheless, I appreciate Visual Basic very much as a simple, easy-to-learn language for all the little jobs that have to be done on any Windows machine. Furthermore, if you are working on a Windows computer, learning Visual Basic in fact means more than learning only one language. The Visual Basic Scripting Edition uses the same syntax and language elements as VB and can be a great language for your offline scripts if you are an experienced Visual Basic coder. Active Server Pages, Microsoft's own server-side language, is also based on the same concepts. Thus, if you know that you will definitely work with a Windows computer, VB offers a great all-in-one solution. It is simple, yes. It is proprietary, yes. But it can help you solve problems, and that's what computing is supposed to do.
I don't think I have to tell you much more about Delphi, since it is nearly the same. Delphi works like VB, with a different syntax. Furthermore, Delphi is not owned by Microsoft but by Borland, which makes no difference in my opinion. I also think that Delphi can be as useful as VB, though I like Visual Basic more, simply because a language owned by Microsoft will always prevail on a Microsoft OS.

Perl is most often used as the engine for a web interface or in scripts that modify configuration files. It has very strong text-matching functions, which make it ideal for these tasks. Perl was developed by the famous Larry Wall in 1987, because the Unix sed and awk tools (used for text manipulation) were no longer strong enough to support his needs. Depending on whom you ask, Perl stands for Practical Extraction and Reporting Language or Pathologically Eclectic Rubbish Lister. Perl is a platform-independent and very flexible language; it is C-based and famous for internet programming / CGI.


The very first PHP parser was designed in 1994-1995 by Rasmus Lerdorf as a Perl CGI program called "Personal Homepage Tools", which is where the abbreviation PHP comes from. It is always capitalized when being referred to as a language or parsing engine. At first, the only purpose was to log visitors to a resume page easily by means of a simple server-side parsing language. While Lerdorf was an employee of Toronto University, he improved the program into a complete, parsed language. He ported the parsing engine to C and added database connectivity - from this point on, other people also contributed to the engine itself by submitting code. Zeev Suraski and Andi Gutmans rewrote the parsing engine to create PHP version 3; the presently available PHP version is 4, which is available for download. The PHP parsing engine is open source, and though it is compiled itself, PHP programs are interpreted. Some of the functions are similar to those of C and Perl. Programmers who understand these languages plus CGI and HTML can easily understand PHP. The PHP parsing engine runs either as a CGI program on a web server or as a web server module. A compatible web server is necessary to run PHP.
"PHP is a server-side, cross-platform, HTML embedded scripting language. PHP is a tool that lets you create dynamic web pages. PHP-enabled web pages are treated just like regular HTML pages and you can create and edit them the same way you normally create regular HTML pages." - This little quote from www.php.net illustrates why many people prefer PHP as their server-side programming language. An example of this embedded code looks like this:

<HTML>
<HEAD>
<TITLE>PHP Example</TITLE>
</HEAD>
<BODY>
<?php echo "Hello World"; ?>
</BODY>
</HTML>





Sorry guys ... I can't be objective this time:
Python is an interpreted, interactive, object-oriented programming language. It is often compared to Tcl, Perl, Scheme or Java (though Python is not that stiff). Python combines remarkable power with very clear syntax. It has modules, classes, exceptions, very high level dynamic data types, and dynamic typing. There are interfaces to many system calls and libraries, as well as to various windowing systems (X11, Motif, Tk, Mac, MFC). New built-in modules are easily written in C or C++, as the Python interpreter itself is written in C. Python is also usable as an extension language for applications that need a programmable interface. Python is bitchin' fast when it comes to developing new programs, being an interpreted language with a remarkably clean syntax. The Python implementation is portable: it runs on many brands of UNIX, on Windows, DOS, OS/2, Mac, Amiga... If your favorite system isn't listed here, it may still be supported, as long as there's a C compiler for it. So ... well, what else would I have to say ... just go and get the interpreter at www.python.org, since Python is copyrighted but freely usable and distributable, even for commercial use. However, I have to admit that Python has already existed for about 10 years, and it is still rarely used, for no obvious reason.



Some final words ... programming languages have been under development for years and will remain so for many years to come. They got their start with a list of steps to wire a computer to perform a task. These steps eventually found their way into software and began to acquire newer and better features. The first major languages were characterized by the simple fact that they were intended for one purpose and one purpose only, while the languages of today are differentiated by the way they are programmed in, as they can be used for almost any purpose. As object-oriented programming has become the standard and is certainly necessary to understand, I recommend you become familiar with these concepts. Once you have understood them, you will see how easy it is to learn just about any new language.
I can already hear some of you rioting against the lack of asm and machine code explanations in my article. You see, I have absolutely no idea about that stuff, and I didn't want to write about it just because of that. CD already wrote a great article about asm, and I think that will do for now. If not, feel free to write a new article about machine code and how it changed your life.


Appendix A: Recommendable languages (IMHO)

  • C and C++ are most commonly used for application and offline programming.
  • Python is a general-purpose language for rapid prototyping and systems programming.
  • Java is used for embedded applications, particularly interactive Web pages.
  • Lisp and Scheme are indispensable for serious students of programming.
  • The Perl language is the "Swiss Army Chainsaw" of web programming.
  • Visual Basic is a simple language for applications and scripts that works great with any Windows OS (leave this out if you don't like Windows or proprietary languages).
  • tcl and tk are tools for rapid prototyping of X-Window system applications.


Appendix B: History Diagram

I have found a very good, detailed diagram representing the history of programming languages and uploaded it here. It is quite interesting to see which language originated from which other language, and to see the actual amount of existing languages. I think it also helps a lot to understand the differences and similarities of certain languages.


Appendix C: Complete languages list:

I found a very complete list of programming languages that might be a bit much to read, but it could help you as well.


List of mentioned and other (more or less) popular programming languages:

Algol, AWK, B, Basic, BCPL, C, C++, C#, Cobol, Delphi, Eiffel, Flow-Matic, Forth, FORTRAN, Haskell, Icon, J, Java, Lisp, ML, Modula, Oberon, Objective-C, Pascal, Perl, PHP, PostScript, Prolog, Python, Rexx, Ruby, Sather, Scheme, Sh, Simula, Smalltalk, Snobol, Tcl/Tk, Visual Basic
