The most “popular” programming languages since 1960


Some people will dismiss any “popular programming language” list as a meaningless horse race. That holds only as long as you are not looking at job prospects to figure out what will pay the bills and keep the lights on when you enter the work world. And the field is large enough that you have room to ask yourself: what kind of programming do you want to do? Systems programming? Applications? Servers? Clients? Scientific models? Statistical studies? Device drivers? Everyone hears about web programming, since it is the most visible and seems to get the most “airplay” in the media. It might even interest you. For others, it’s dull. There is so much more out there.

With that preamble, why am I bothering to do this at all? It is to show how popular languages follow the ebb and flow of computing history. Since World War II we have had ENIAC, a host of IBM and other mainframes, followed by networked computers, then personal computers, then the Internet, and so on. With each major shake-up, programming needs change.

Disk packs on an IBM 2314 storage facility.

By 1965, the things that changed preferences in computer languages were the same things that change them today: changes in hardware, and programming for mainframes versus “personal” computers (which in that decade amounted to machines like the PDP-1). In the 1960s, hard disk drives were relatively new (drum memory, an older related technology, was still in use), as was magnetic tape. Transistors had not yet fully displaced the old technology, and some of the most powerful computers still used vacuum tubes.

1960

COBOL. 1960 saw the introduction of supercomputers in the service of business, and by far the most popular language was COBOL (COmmon Business-Oriented Language). COBOL was designed to be machine-independent, which meant the same program could run on many different machines with a minimum number of changes. Even today, at the end of 2022, a large share of business code, by some estimates over 80%, is still written in COBOL.

1965

The Olivetti programmable calculator, about the size of a small modern digital cash register, and among the first of its kind.

ALGOL. ALGOL 60 saw the first implementation of the Quicksort algorithm, invented by C. A. R. Hoare while he was a visiting student in Moscow. He was later knighted by Queen Elizabeth II for his services to computing. ALGOL trailed COBOL in popularity, and both were dwarfed by the number of FORTRAN users.

Niklaus Wirth

FORTRAN. FORTRAN was far and away the most popular programming language by 1965, and stayed that way for decades. It was taught in many “service” computing courses taken by science students and most engineering students, and it was known for its strong mathematical capabilities (the name comes from FORmula TRANslation).

Other languages popular during that period: assembly, APL, BASIC, and Lisp. Around 1969, Niklaus Wirth first introduced Pascal.

1970

1970 saw the birth of UNIX, created by Ken Thompson and Dennis Ritchie at AT&T’s Bell Labs, and Pascal came on board as a teaching language for structured programming in many university freshman courses. Otherwise, the landscape of popular programming languages was much the same as before.

1975

By 1975, C had grown in popularity, but it was not a teaching language: BASIC, Pascal, and Lisp had all ascended in popularity as we sent men to the Moon and more students became interested in computer programming. FORTRAN and COBOL were still at the top of the heap, while ALGOL, APL, and assembly moved down. Assembly would fade from general popularity in future decades, but it would never truly go away.

1980

ENQUIRE was a hypertext system first proposed at CERN by Tim Berners-Lee in 1980. Ideas from ENQUIRE would later be used to design the World-Wide Web.

By 1980, Bjarne Stroustrup had begun work on “C with Classes,” the precursor to C++, bringing the concept of object-oriented programming to the C world. More and more people had mastered C, and it moved to the middle of the “top 10” programming languages used that year. Pascal became wildly more popular due to the arrival of household desktop PCs and the Turbo Pascal compiler offered by a software company called Borland. Microsoft offered BASIC and FORTRAN compilers that went beyond the stock BASIC interpreter that came with DOS. In addition, Tandy, Commodore, and Sinclair were offering their own machines, each with its own BASIC interpreter.

1985

While he didn’t invent the Internet (he never claimed to, according to Snopes.com), Al Gore tabled bills and secured funding that greatly expanded the Internet after 1989.

Bjarne Stroustrup published his seminal work, The C++ Programming Language, in 1985. With the later introduction of Windows and Windows NT, Microsoft expanded its programming offerings into what became Visual Studio, which included compilers for C and C++. C was rising to the top of the charts, competing with Borland’s Pascal products, and it would not leave the top 3 for another 15 years.

1990

MS Windows 3.0 first shipped in 1990, the same year Adobe shipped Photoshop and the World-Wide Web got its first exposure. By 1991, a computer science student named Linus Torvalds had uploaded his first kernel source code to an FTP site, where the site’s administrator named the directory “Linux”, a name which stuck.

Visual Basic was introduced by Microsoft. C++ rose to the top 5, while FORTRAN, BASIC, assembly, and COBOL all fell to the bottom half of the top 10. C had a wild surge in popularity as the Internet came onstream and the World-Wide Web got its start in the universities. By 1992, the top 2 positions were occupied by C and C++. Also by 1992, the fledgling World-Wide Web needed CGI scripting, and Perl became popular.

1995

By 1995, Netscape Navigator had been out for about a year. 1995 was the year that Microsoft first introduced Internet Explorer and gave it away for free, a move that eventually drove Netscape to open-source its browser code, which in time produced Firefox.

There were many scripting languages at the time aimed at web browsers, but no standard default had been settled on. By the end of the decade, that standard would go to JavaScript, a language first released in 1995. It and Perl were rising in popularity as the client-side and server-side web languages, respectively. But the following 5-year period brought another shake-up. Java (a very different language from JavaScript), a product of Sun Microsystems, came out of nowhere in 1995 to become the 3rd most popular language by 1996. By this time the web had arrived in people’s homes, and there was demand for richer Internet experiences.

Pascal was falling out of favour as computers moved away from DOS in the home and in business, and by the mid-1990s Borland had designed an object-oriented version of Pascal, called Delphi. It turned out to be a formidable competitor to Visual Basic. By 1998, PHP was providing even more server-side dynamic web programming.

2000

2000 was the year that USB flash drives first appeared on the market. In other news, Google made its IPO in 2004, and in the same year we first heard about “Web 2.0”.

PHP overtook Perl by 2000 to become the 5th most-used language that year. Java and JavaScript occupied 2nd and 3rd, pushing C++ to the #4 spot; C was still on top. That year, Microsoft offered the world C#. Apart from C and C++, the top 5 languages were all web-oriented: Java, JavaScript, and PHP. Perl was descending in popularity as a new scripting language with much cleaner syntax became ascendant: Python.

2005

In 2005, IBM sold its PC division to the Chinese firm Lenovo, which would go on to become the largest manufacturer of PCs in the world.

C was finally pushed out of the top spot by Java, and Delphi was starting to drop out of the picture as Borland ran into financial troubles after Kylix, its failed bid to make inroads into Linux. Borland sold Delphi to Embarcadero, which produces the product today. Perl’s decline was only a slow one, its popularity buoyed by a legacy of libraries and by its role in bioinformatics projects, such as the Human Genome Project, conducted by universities around the world.

In part due to bioinformatics and other informatics endeavours, math- and statistics-based languages such as MATLAB and R popped up. New web-oriented languages such as Ruby also appeared.

2010

At more than 1 petaflop (over 1 quadrillion calculations per second), the Tianhe-1 (released in 2010) is capable of running massive simulations and complex molecular studies. The following year, IBM’s Watson won a Jeopardy! tournament.

Perl had finally dropped out of the top 10, leaving a legacy of code on web servers all over the world. Objective-C became popular with Apple developers and Apple’s operating systems iOS and OS X (both descendants of NeXTSTEP). By 2011, the top 4 were Java, JavaScript, Python, and PHP. Apple’s new language, Swift, entered at #9 in 2014.

2015

C and C++ were pushed out of the top 5. R, primarily a statistical programming language, rose to #7, just behind C. By 2019, Python was the top language programmers were using. Kotlin also appeared on the chart in 2019, owing to Google’s support for the language on Android.

2020

Not much change, except for the rise of Go, touted as a more “reasonable” alternative to C++ with lighter syntax. Microsoft introduced TypeScript, a superset of JavaScript, and likely an attempt to “embrace and extend” it, much as Microsoft once attempted with Java (J++ never caught on) and with JavaScript itself through VBScript, which also never caught on over the long haul.

While that was happening, Rust, which had been around for some time, enjoyed some popularity as a back-end web language as well as a systems language. By the end of 2022, TypeScript had risen into the top 5. Of the 11 most popular languages, 7 are web-oriented: Python, JavaScript, TypeScript, PHP, Go, Rust, and Kotlin. The others are Java, C++, C, and C#.

Compiling The Linux Kernel Docs


In the last article, I said that compiling and installing software from source was akin to “going rogue”. I must confess that I have compiled and installed software that wasn’t in my distribution. The most recent, and one of the larger projects, was TeXstudio, which required tons of other libraries to be installed (or, quite often, compiled from source on the side), since it wasn’t part of the Linux distro I was using at the time. It also wasn’t part of Cygwin, so I compiled it for that too. It was a great way to kill an afternoon.

But there was a time when I compiled the kernel from source. It was necessary for me: speed was an issue, and I had slow hardware, a mixture of parts pulled from different computers at different times. I researched the specs of my sound card, network card, video card, and motherboard chipset, and knew which options to set in the kernel configuration dialogs so the kernel would do the right thing: be fast and recognize all my hardware. I was doing this before the days of modules, with the version 1.x kernels. It worked, and it was noticeably faster than the stock kernels. The X Window System on my 80486 PC ran quite well with these compiled kernels, but was sluggish to the point of unusable with a stock kernel. Every few versions of the kernel I would recompile a new one for my PC, and pretty soon the Tcl/Tk configuration dialogs made things easy enough that I could answer all the questions from memory.

But then that all ended with version 2. Yes, I compiled a version 2 kernel from source, and yes, it ran fine. But version 2 also brought modules. The precompiled kernels were now stripped down and lean, and modules were added only as needed, when the kernel auto-detected the presence of the appropriate hardware. After compiling a few more times, I no longer saw the point from a performance standpoint. Today we are well into kernel version 5.3, and I haven’t compiled my own kernel for a very long time.
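For anyone who never went through the ritual, the build procedure in the 2.x era went roughly like this. This is a sketch from memory, assuming an i386 machine and the LILO bootloader of that era; the exact image target (zImage versus bzImage) depended on how big your kernel was:

# Configure the kernel: "make config" asks questions one at a time;
# "make menuconfig" (ncurses) and "make xconfig" (Tcl/Tk) are friendlier.
make menuconfig

# Rebuild dependency information, then the compressed kernel image.
make dep && make clean
make bzImage

# Build and install the loadable modules (the new part in 2.x).
make modules
make modules_install

# Copy the image into place and update the bootloader.
cp arch/i386/boot/bzImage /boot/vmlinuz
lilo

The appeal of modules is visible right in that sequence: drivers built as modules are loaded on demand, so the stock image stays small without needing a custom configuration for every combination of hardware.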

For the heck of it, I downloaded the 5.3 kernel, which uncompressed into nearly 1 gigabyte of source code. I studied the config and Makefile options, and saw that I could run make with a documentation target (shown below) to build only the documentation. So that’s what I did.
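For reference, recent kernels build their documentation through Sphinx, and the top-level Makefile has dedicated targets for it. The PDF run described here corresponds to something like the following (Sphinx plus a LaTeX toolchain must be installed for the PDF target):

# from the top of the kernel source tree
make htmldocs     # render the documentation as HTML
make pdfdocs      # render the documentation as PDF, via LaTeX
make cleandocs    # remove the generated documentation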

It created over 8,500 pages of documentation across dozens of PDF files. Another 24 PDFs came out zero-length, presumably because they didn’t compile properly; otherwise the page count would easily have tipped the scales at 10,000. The 8,500-plus pages were generated, errors and all, in about 3 minutes, the errors manifesting themselves as those zero-length PDFs under the Documentation directory. I have a fast-ish processor, an Intel Core i7-4770K (a 4th-generation i7, never overclocked), running on what is now a fast-ish gaming motherboard (an ASUS Maximus VI Hero) with 32 GB of fast-ish RAM. The build, even though it was only documentation, went screamingly fast on this computer, much faster than I am accustomed to (although I suppose if I am using 80486s and early Pentiums as a comparison …). The output of the LaTeX runs to standard error was a veritable blur of underfull \hbox warnings and page numbers.
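Spotting the casualties is easy, since a failed document shows up as a zero-length file. A one-liner such as this counts them (the path is illustrative; the generated PDFs land under the Documentation tree):

# count the PDFs that came out empty
find Documentation -name '*.pdf' -size 0 | wc -l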

For the record, the page count was generated using the following script:

#! /bin/bash
# Tally the page counts of every non-empty PDF in the current directory.
# Requires pdfinfo (from the poppler-utils package).
tot=0
for i in *.pdf ; do
        # skip the zero-length PDFs that failed to build
        if [ -s "${i}" ] ; then
                # extract the "Pages:" field from the PDF metadata
                j=$(pdfinfo "${i}" | awk '/^Pages:/ {print $2}')
                # tally up the total so far
                tot=$((tot + j))
                # print pagecount / filename / running total
                printf '%s\t%s\t%s\n' "${j}" "${i}" "${tot}"
        fi
done

echo "Total page count: ${tot}"
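Saved as, say, pagecount.sh (the name is mine) and run from the directory holding the generated PDFs, it prints one line per file, pagecount, filename, and running total, followed by the grand total.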