A long, long time ago, when computers were huge machines, made of tubes and big switches, computer programs were "written" by hand-wiring the different computer units. It was the "cable programming era".
I really like this photo, showing two programmers at work on ENIAC; it reminds me that good software, in spite of all the progress made since then, is still something done "by hand", like cabling: handcrafted and not suited to an automated industrial process.
That was in the 40s, but soon after, people began to develop easier programming methods, and assembler languages were introduced, in which the program is coded as symbolic instructions, written in an intelligible way, then translated and "assembled" by the computer itself.
Programming in assembler means managing the detailed location of each datum in the computer's memory and registers; it is long, cumbersome and error-prone work. There were people very good at this job, but it was hard and time-consuming.
In the following image, an assembler program computing the factorial of a number. Even if we don't know assembler, we can recognize some statements: "jmp", to go to a line in the program (to a label); "cmpl" and "jle", to compare values and jump to some instruction; "movl", to move data around. In assembler you have to explicitly manage the position of every value.
The FORTRAN language was introduced in the 50s; it was easier and closer to the language used to describe mathematical problems; here the positions of values are managed automatically and everything is simpler and shorter. The same program, in the FORTRAN language, is shown in the following picture.
The same program in the C language of the 70s: there are fewer statements and the syntax is more terse; all the computation is done in one line.
Nowadays (~2014) assembler is used only for very detailed optimization of some critical parts of complex programs; some years ago it was used for programming simple micro-controllers, such as those controlling lifts and washing machines. Today integrated development systems are available for most micro-controllers: the programmer writes a C-like program, and a compiler deals with the translation to binary code.
The future goes towards the "system on chip": systems in which a minimal computer is wired onto a single little board, able to run a real operating system, such as Android or a mini-Linux. We can see this evolution in phones, home routers, video recorders, NAS devices, etc. I don't know if tomorrow I will really find Linux in my washing machine (as I found it in my TV), but this is the direction in which technology is going (see this article on the arstechnica site).
I began programming around 1973, at Bologna University, where I was studying physics.
A consortium of Italian universities (CINECA) had a CDC 6600 computer, and some professors of the Physics Department were involved in the usage and management of that computer, so the physics students had a short computer laboratory course during the first year, and almost free access to the CDC 6600, at that time one of the most powerful computers in Italy [1].
We used FORTRAN IV; it was the period of punched cards and batch jobs: we used a card puncher machine to write each FORTRAN statement onto a single card. Then we took our deck of punched cards to a remote terminal, situated in the basement of the Physics Institute; the day after, we had the results printed on paper, most of the time signaling a syntax error in one of the cards. The use of cards dictated a rigid syntax for statements: short variable names (max 6 characters) and only 80 columns for a statement, with the first card columns reserved for numeric labels and the last columns for an optional card numbering. We could continue a long statement on the next card by putting a sign in column six; a sign in the first column marked a comment card.
Punching cards was a long job, prone to errors, so a new profession was born: the "keypunch operator", typists specialized in using the card puncher; but soon the technology changed and in a few years this profession faded away. In the following picture, an IBM advertisement for their card puncher. It was a big, slow and noisy machine, but, at that time, it was the best (maybe the only) way to program a computer.
We implemented simple mathematical algorithms, a practical way to learn by doing, so a good number of physics students of that time found jobs in the field of computer programming. Most mathematics departments, beginning to be interested in computers, had instead a theoretical approach, with no real programming practice.
FORTRAN programs had a simple syntax; in FORTRAN IV the data types were essentially integer, logical and float (real), in single or double precision. There was no support for characters. The type of a variable could be declared explicitly, but by the implicit convention an undeclared variable was a single-precision float, or an integer if its name began with one of the letters I, J, K, L, M, N.
We had no structures, only arrays: variables of the same type, stored in sequential memory locations and addressed by an integer index.
We had some input/output statements and a few control structures:
GOTO: to jump to a given statement
IF: a simple conditional statement; we had statements like:
IF (J-3) 100,101,102
where the expression in parentheses is computed, and the program goes to the statement with label 100 if the value is less than 0, to 101 if equal to 0, and to 102 if greater than 0.
There was also a logical test, followed by a single statement:
IF (A.AND.B) E=F**3/G
DO loops: to iterate, but only over an integer index, as in:
      DIMENSION A(4),B(4),C(4)
      ....
      DO 10 I=1,4,2
      A(I)=B(I)+C(I+1)
   10 CONTINUE
At the first iteration we have I=1, at the second I=1+2=3, and at the third the loop stops, since I>4.
This was a very limited set of instructions, but FORTRAN was defined as a high-level language, where "high" means nearer to the user's logic and "low" nearer to the circuitry. That is true if compared to assembler, but it was very poor compared to modern languages or to the modern evolutions of FORTRAN (Fortran 90, Fortran 2008). FORTRAN programs followed a very sequential flow, where the program logic was mainly implemented as a set of conditional jumps.
We could give a cleaner structure to the code only by a wise use of functions and subroutines (subroutines were functions returning no value). There was also a very useful feature, the "named commons": memory areas shared between subroutines; in this way it was possible to "segment" the problem by distributing different data to different parts of the program.
In the following picture we have an example of this usage: the variables contained in the memory area named BIGDATA are shared by the subroutines ONE and TWO, while THREE uses a different area, named OTHER.
      SUBROUTINE ONE(A,B)
      COMMON /BIGDATA/C,D,E
      ....
      RETURN
      END

      SUBROUTINE TWO(K)
      COMMON /BIGDATA/C,D,E
      ....
      RETURN
      END

      SUBROUTINE THREE(K)
      COMMON /OTHER/Z(100)
      ....
      RETURN
      END
I found this a very useful feature: an easy way to manage global or semi-global variables. Most modern languages have lost it; it is the only thing I regret from the old FORTRAN.
In FORTRAN, parameters were passed to functions by address, and functions could change their values. This was somewhat hidden by the simple syntax of function arguments, but we could do things like:
C     first: making a big array
      DIMENSION A(800000)
      MAXA=800000
C     then computing needed space for variables
      IDIM1=5
      L1=6
      IDIM2=3
      IDIM3=4
      L2=L1+IDIM2*IDIM3
C     passing different parts to subroutines
      CALL GELIB(IDIM1,IDIM2,IDIM3,A(1),A(L1),A(L2))
      STOP
      END
C     the subroutines see the parts as different arrays
      SUBROUTINE GELIB(M1,M2,M3,A,B,C)
      DIMENSION A(1),B(M2,1),C(1)
      B(2,3)=A(2)+C(100)
      RETURN
      END
This was a way to mitigate the lack of dynamic memory allocation.
Another trick was the use of an array of indices to implement a mapping between different arrays:
      DO 10 I=1,M
      J=I
      A(J)=B(N(I))
   10 CONTINUE
Here N(I) contains the index, in B, of the number to be put into A(J).
In the following figure, the listing of a little FORTRAN program I wrote in 1973 (it has some errors); the program computes the determinant of a square matrix.
This is short, but still a bit difficult to read, due to the many interrelated jumps. Variable names are short acronyms and there are no comments: each comment needed a card, and people didn't like punching too much. More complex programs were often an intricate mess of jumps, a programming style named "spaghetti code".
In the following figure, an example of a somewhat involved structure in a FORTRAN program. But "spaghetti code" did not die with old FORTRAN; I have seen perfect examples of this programming style in Java, PHP, and other modern languages. As always, the programmer is the key, not the language used.
When computer power increased, in the seventies, the "structured programming" style began to be used, in which the program is organized in blocks, delimited by conditional statements, instead of being driven by simple jumps. When the IF .. THEN .. ELSE .. ENDIF statements became available, in FORTRAN 77, I simply began to use those structures, which gave a clearer way to write programs.
But in academic environments, the adoption of structured programming caused long, polemical discussions. The purists of good programming style began an ideological war against the GOTO statement: we had an article by Dijkstra with the title "Go To Statement Considered Harmful" (Communications of the ACM, Vol. 11, Issue 3, March 1968, pages 147-148), and also Donald Knuth's "Structured Programming with go to Statements" (Computing Surveys 6 (4): 261-301, 1974). As a result there are today computer languages without the bad GOTO statement; the victory of the purists: they killed the guilty statement!
An echo of this academic discussion can be found also in Italian middle schools. When there is a programming course (not often), the selected language is Pascal: a truly structured language, created in 1970 by Niklaus Wirth to teach programming. This language was once popular in some US academic circles, but was never really used for big programs, and is today only a relic of old times. Yet it is still very popular (around 2010, after 40 years) in Italian schools.
I use structured programming, but with a pragmatic approach: when a GOTO statement is the best way, I use a GOTO statement. Trying to read a long program and finding an ENDIF or an END statement closing a block that began a thousand statements before is no different from finding a puzzling FORTRAN label, possibly the target of a far GOTO. The overall structure matters; statements are only tools: when I need them I use them; philosophical debates on single statements are worthless.
IBM introduced its VS FORTRAN, implementing all the FORTRAN 77 features, around 1981, but I had to struggle with a big heritage of old FORTRAN programs, so I moved completely to a more modern FORTRAN only around 1984, when I began to use minicomputers instead of mainframes.
At that time we began to use VAX computers, much less expensive than mainframes but able to do most of our jobs. Reduced costs made VAX computers available to single university departments, so minicomputers eroded much of the mainframe market; it was a big change in computer evolution.
In 1987 my department could buy a MicroVAX II. I began to do system management on that machine, because I needed that computer and nobody else was taking charge of it; but this is another story.
In the following figure, an example of a VAX FORTRAN function of the eighties; we have easier input/output statements and the IF..THEN..ELSE construct, with no indentation but still a column-limited syntax. I still used a GOTO when needed.
In the nineties FORTRAN evolved: Fortran 90 had support for matrix and vector operations, and Fortran 2003 and Fortran 2008 are object-oriented languages. But I didn't follow this evolution: around 1990 I began to use the C language.
In the following figure, an example of FORTRAN in the 90s; it is easier to read, with a clearer structure and indentation.
FORTRAN is still used; the structure of its arrays, a sequence of data all of the same type, makes the optimization done by compilers easier. Languages implementing arrays by pointers, such as C, make optimization difficult, because the compiler doesn't know in advance where a pointer will point, nor whether different arrays overlap.
FORTRAN can be more efficient than C when many vectors and matrices are used, as in some scientific computations. Moreover there is a huge heritage of old but very big FORTRAN programs nobody wants to rewrite, so FORTRAN is still alive and will be for years (see this article on the arstechnica site).
This text is released under the "Creative Commons" license.