The most “popular” programming languages since 1960


There will always be people who dismiss any “popular” programming list as a kind of meaningless horse race. That’s fine, as long as you are not looking to job prospects for an answer to what will pay the bills and keep the lights on when you enter the working world. But the field is large enough that you have room to ask yourself: what kind of programming do you want to do? Systems programming? Applications? Servers? Clients? Scientific models? Statistical studies? Device drivers? Everyone hears about web programming, since that is the most visible and seems to get the most “airplay” in the media. It might even interest you. For others, it’s dull. There is so much more out there.

With that preamble, why am I still bothering to do this? It is to show how popular languages follow the ebb and flow of computing history. Since World War II, we have had the ENIAC, a host of IBM and AT&T mainframes, followed by networked computers, then personal computers, then the internet, and so on. With each major shake-up, programming needs change.

Disk packs on an IBM 2314 storage facility.

By 1965, the things that changed preferences in computer languages were the same things that change them today: changes in hardware, and programming for mainframes versus “personal” computers (which in that decade amounted to computers like the PDP-1). In the 1960s, hard disk drives were relatively new, as was magnetic tape, and drum memory was still in use. Transistors hadn’t yet reached their heyday, with some of the most powerful computers still using vacuum tubes.

1960

COBOL. 1960 saw the introduction of mainframe computers in the service of business, and by far the most popular language was COBOL (COmmon Business-Oriented Language). COBOL was designed as a standardized, portable language, which meant the same program was capable of running on many different machines with a minimum number of changes. Even today, at the end of 2022, a large share of business code (by some estimates the majority) is still written in COBOL.

1965

The Olivetti programmable calculator, about the size of a small modern digital cash register, and among the first of its kind.

ALGOL. Algol-60 saw the first published implementation of the Quicksort algorithm, invented by C. A. R. Hoare while he was a visiting student in Moscow. He was later knighted by Queen Elizabeth II for his contributions to computer science. Algol was behind COBOL as the most popular programming language, but both were dwarfed by FORTRAN users.

Niklaus Wirth

FORTRAN. FORTRAN was far and away the most popular programming language by 1965, and stayed that way for some decades. It was taught in many “service” computer courses taken by science students and most engineering students. It was known for its rather elaborate mathematics capability.

Other languages popular during that period: Assembly, APL, BASIC and Lisp. The end of the decade also saw Niklaus Wirth design Pascal, which was first published in 1970.

1970

1970 saw the invention of UNIX by Ken Thompson and Dennis Ritchie at Bell Labs, and Pascal came on board as a teaching language for structured programming in many university freshman courses. Otherwise, the landscape of programming languages in popular use was much the same as before.

1975

By 1975, C had grown in popularity, but it was not a teaching language: BASIC, Pascal, and Lisp had all ascended in popularity after we had sent men to the Moon and more students became interested in computer programming. FORTRAN and COBOL were still at the top of the heap, while ALGOL, APL and Assembly moved down. Assembly would disappear from general popularity in future decades, but it would never truly go away.

1980

Enquire was a hypertext system first proposed at the CERN physics laboratory by Tim Berners-Lee in 1980. Ideas from Enquire would later be used to design the World-Wide Web.

By 1980, Bjarne Stroustrup had spent the past couple of years developing “C with Classes”, the precursor to C++, bringing the concept of object-oriented programming to the C world. More and more people had mastered C, and it moved to the middle of the “top 10” programming languages used that year. Pascal became a wildly popular language due to the introduction of household desktop PCs, and the offering of a Turbo Pascal compiler by a software company called Borland. Microsoft offered BASIC and FORTRAN compilers that went beyond the stock BASIC interpreter that came with DOS. In addition, Tandy, Commodore and Sinclair were offering their own machines, each with their own BASIC interpreters.

1985

While he didn’t invent the Internet (he never claimed that at all, according to Snopes.com), Al Gore tabled bills and sourced funding that greatly expanded the internet after 1989.

Bjarne Stroustrup published his seminal work The C++ Programming Language in 1985. With the introduction of Windows and later Windows NT, Microsoft expanded their programming offerings to include what would become Visual Studio, with compilers for C and C++. C was rising to the top of the charts, competing with Borland’s Pascal product. C would not leave the top 3 for another 15 years.

1990

MS Windows 3.0 first shipped in 1990, and Adobe shipped Photoshop the same year. The World-Wide Web also got its first exposure that year. By 1991, a computer science student named Linus Torvalds uploaded his first kernel source code to an FTP site, where a site maintainer filed it under the name “Linux”, a name which stuck.

Visual Basic was introduced by Microsoft. C++ rose to the top 5. FORTRAN, BASIC, Assembly, and COBOL all fell to the bottom 5 of the top 10 languages. C had a wild surge in popularity as the Internet was coming on stream and the World-Wide Web was just starting in the universities. By 1992, the top 2 positions were occupied by C and C++. Also by 1992, CGI scripting was needed for the fledgling World-Wide Web, and Perl became popular.

1995

By 1995, Netscape Navigator had been out for about a year. 1995 was the year that Microsoft first introduced Internet Explorer and gave it away for free, a move that eventually drove Netscape to open-source its browser code, which in time produced Firefox.

There were many scripting languages at the time aimed at web browsers, but no standard had been set for a default scripting language. By the end of the decade, that standard would go to JavaScript, a language introduced in 1995. It and Perl were rising in popularity as client-side and server-side web languages respectively. But in the following 5-year period there was another shake-up. Java (a very different language from JavaScript), a product of Sun Microsystems, came from out of nowhere in 1995 to become the 3rd most popular language by 1996. By this time, the web had arrived in people’s homes and there was a need to enhance people’s internet experiences.

Pascal was falling out of favour as computers moved away from DOS in the home and in business, and by 1995 Borland had designed an object-oriented version of Pascal, which was called Delphi. It turned out to be a formidable competitor to Visual Basic. By 1998, even more server-side dynamic web programming was provided by the language PHP.

2000

2000 was the year that USB flash drives grew in popularity. In other news, Google made its IPO in 2004; and in the same year we first heard about “web 2.0”.

PHP overtook Perl by 2000 as the 5th-most used language that year. Java and JavaScript occupied 2nd and 3rd, pushing C++ to the #4 spot. C was still on top. That year, Microsoft offered the world C#. Apart from C and C++, the rest of the top 5 languages were all web-based: Java, JavaScript and PHP. Perl was descending in popularity as a new scripted language with much cleaner syntax became ascendant: Python.

2005

In 2005, IBM sold its PC division to the Chinese firm Lenovo, helping make Lenovo one of the largest manufacturers of PC computers in the world.

C was finally pushed out of the top spot by Java, and Delphi was starting to drop out of the picture as Borland ran into financial troubles after a failed bid to make inroads into Linux with their introduction of Kylix. They sold off Delphi to Embarcadero, which produces that product today. Perl descended in popularity only slowly, buoyed up by a legacy of libraries and its role in various bioinformatics projects, such as the Human Genome Project, conducted by universities around the world.

In part due to bioinformatics and other informatics endeavours, math- and stats-based languages such as MATLAB and R popped up. There were also new web-based languages like Ruby.

2010

At more than 1 petaflop (over 1 quadrillion calculations per second), the Tianhe-1 (released in 2010) is capable of running massive simulations and complex molecular studies. The following year, IBM’s Watson won a Jeopardy! tournament.

Perl had finally dropped off the top 10, leaving a legacy of code on web servers all over the world. Objective-C became popular with Apple developers through the operating systems in the NeXTSTEP line: iOS and OS X. By 2011, the top 4 were: Java, JavaScript, Python, and PHP. Apple’s new language, Swift, was at #9 by 2014.

2015

C and C++ were pushed out of the top 5. R, primarily a statistical programming language, rose to #7, just behind C. By 2019, Python was the top language programmers were using. Kotlin also showed up in 2019, owing to Google’s support of the language on Android.

2020

Not much change, except for the rise of Go, touted as a more “reasonable” alternative to C++ with lighter syntax. Microsoft introduced TypeScript, a superset of JavaScript, and likely an attempt to “embrace and extend” it, as they once attempted with Java (J++ never caught on) and with JavaScript itself through their mildly successful VBScript, which never quite caught on over the long haul.

While that was happening, Rust, which had been around for some time, enjoyed some popularity as a back-end web language, as well as a systems language. By the end of 2022, TypeScript had risen to the top 5. Of the 11 most popular languages, 7 are web-based: Python, JavaScript, TypeScript, PHP, Go, Rust, and Kotlin. The others are Java, C++, C, and C#.

Ideology vs Getting Stuff Done


I’ve always disliked ideology as a personal philosophy. Nothing ever really applies in all cases. This is why, while I think the idea of open source software is awesome, it is not awesome if you have to sacrifice doing things you need to do in the name of philosophical purity.

I share the views stated by Evan Liebovich in a video he made last year. He is more of a proponent of open source than I will ever be, but he has had to reckon with the fact that Linux is falling behind on the desktop, is a low priority for developers, and if you actually need to get things done for many ordinary use cases, you need to install Windows. Evan’s video was shot from his Windows 10 desktop.

Linux will still be the low-cost desktop solution for developers, sporting a plethora of sophisticated programming tools. In that one area, it is way ahead of all other operating systems. In other respects, however, Windows has the edge: Microsoft can support the latest scanners and media cards, which Linux is often slow to adopt.

The reason for the slowness is that, as of 2021, according to Evan, Linux is installed on only 1.8% of all desktops. This ought to be regarded as a minor defeat, considering Linux’s unquestioned dominance in Android devices, Chromebooks, business infrastructure, and internet servers all over the world. Linux has scored dominance on nearly every platform imaginable. Just not on the desktop PC.

People writing software and drivers for the Windows PC are likely to keep programming in that domain, since that is where 98% of the market is. It is not very likely that most companies, especially small ones, will write Linux drivers for hardware or peripherals such as adapter cards, printers or scanners, since there is not enough money in it. And even when they do, the drivers are not as likely to take full advantage of the hardware.

Nowadays you can buy licences for Windows 10 which have been taken from old, discarded machines. This is legal if a legitimately purchased licence was on one machine, is no longer used there, and is transferred to a new machine. Microsoft will have a problem if one licence is on more than one machine; otherwise, there should be no problem. Some vendors on AliExpress sell licences for as low as CAN$3.50. Some sell with a DVD, others sell only the licence code. In the latter case, you can go to Microsoft.com and download your own image to a USB stick, 16GB minimum.

XEmacs, xcalc, xclock, and TeXstudio, all running on Windows 10 using an X server called “VcXsrv”, all thanks to WSL2.

Microsoft has, in the view of many, moved on from the “OS wars”, and has allowed users to incorporate WSL2 (Windows Subsystem for Linux version 2) into the base operating system. This has allowed users to run Linux applications, and even X11 applications, on their Windows desktop. Other gestures toward open source include their purchase of GitHub and their joining the Linux Foundation. MS Teams also has a version made for Linux.

Even without WSL2, there are many free and open source (FOSS) applications that have Windows versions. And chances are, they run better in Windows, since writers of video drivers, for example, likely have better support for graphics acceleration. The same can also be said for audio, printer, network card and scanner support. FOSS applications which do not need WSL2 can be found on sites such as TTCS OSSWIN Online (the Trinidad and Tobago Computer Society’s collection of free Open Source Software for WINdows), which has one of the most comprehensive FOSS archives for MS Windows.

It is still possible, by and large, to have a FOSS computer where the only “large” software expense is the Windows OS itself.

What is that sorting algorithm?!


Yes, for a while I was mystified. I had taught what I was sure was the Bubble Sort, the most basic sorting algorithm given to computer students. It was so simple that I felt little need to look it up (though I should have), and just wrote it from memory. Here was my algorithm, written in pseudocode:

You have an array of numbers A[]
for i <- 1 to N
   for j <- (i+1) to N
      if A[i] > A[j] then
         swap A[i] and A[j]

That’s the Bubble Sort, as I thought it was, and as I have been teaching it for 5 years. The algorithm works, is as inefficient as the Bubble Sort is advertised to be, and I never questioned it. It was also an easy one to teach, and students picked up on it easily.
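For concreteness, here is a direct Python translation of my pseudocode (a sketch only; the function name is mine, and the indices are shifted to Python’s 0-based convention):

```python
def exchange_sort(a):
    """Sort the list a in place by comparing a[i] against every later element."""
    n = len(a)
    for i in range(n - 1):
        for j in range(i + 1, n):
            if a[i] > a[j]:
                a[i], a[j] = a[j], a[i]  # swap the out-of-order pair
    return a

print(exchange_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```

Note that after each pass of the outer loop, the smallest remaining element has settled into position i.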

But from this or that source over the 5 years, I have found others attributing a wholly different algorithm to the Bubble Sort. I dove into my copy of Introduction to Algorithms, 2e (Cormen, Leiserson, Rivest and Stein, 2001), a text used in many of the better computer science programs in North America, and this was how they described it:

Let A be an array of data to sort
for i <- 1 to length[A] do
   for j <- length[A] downto (i + 1) do
      if A[j] < A[j - 1] then
         Swap A[j] and A[j - 1]

Yes, this is the source I should have cited rather than going on memory, and that would have been the end of it. But notice that in this bubble sort we only ever compare two adjacent array elements. At the time, I had only seen evidence of this cropping up in a YouTube video and some other nameless, faceless websites, and thought maybe there was more than one algorithm called a bubble sort. I tried this several times on paper with several short combinations of numbers, and convinced myself that their algorithm works also.
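In Python (again 0-based, with a name of my choosing), that adjacent-pair version looks like this; every comparison and swap involves only A[j] and A[j - 1]:

```python
def bubble_sort(a):
    """Textbook bubble sort: each outer pass walks j from the end of the
    list down to i + 1, bubbling the smallest remaining element into
    position i by swapping adjacent pairs only."""
    n = len(a)
    for i in range(n - 1):
        for j in range(n - 1, i, -1):  # j runs from the end down to i + 1
            if a[j] < a[j - 1]:
                a[j - 1], a[j] = a[j], a[j - 1]
    return a

print(bubble_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```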

There are other versions. Harvard CS50 adds some efficiency to the pseudocode. Since the list is often sorted well before both loops have completed, the modification is to have a conditional loop (rather than a counted one) which uses a swap counter, which is used to decide if the list really is sorted. If there are no swaps after one turn of the outermost loop, the list is declared sorted and the algorithm exits. It went something like this:

Let A be an array
Set swap_counter to a nonzero value
Repeat
|   reset swap_counter to 0
|   look at each adjacent pair in A
|   |  if A[n] and A[n-1] are not in order,
|   |     Swap A[n] and A[n-1]
|   |     Add 1 to swap_counter
|   v  end if
v   move to the next adjacent pair
Until swap_counter is 0

You have to play with some short lists on a sheet of paper to convince yourself that this will actually sort a list. It is still a nested loop, but a conditional one.
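Translated into Python (a sketch; the function name is mine), the swap-counter version might look like this:

```python
def bubble_sort_with_counter(a):
    """Bubble sort that quits as soon as one full pass makes no swaps."""
    swap_counter = -1  # any nonzero value, just to enter the loop
    while swap_counter != 0:
        swap_counter = 0
        for n in range(1, len(a)):      # visit each adjacent pair (a[n-1], a[n])
            if a[n - 1] > a[n]:
                a[n - 1], a[n] = a[n], a[n - 1]
                swap_counter += 1
    return a

print(bubble_sort_with_counter([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```

On a list that is already sorted, the first pass makes zero swaps and the loop exits immediately, which is the efficiency gain the CS50 version is after.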

Then there was Wikipedia. Their entry on Bubble Sort, apart from describing the proper bubble sort, had a footnote at the bottom pointing to a paper describing a sorting method called “I can’t believe it can sort”. Seriously, that was the name given and cited. Here is their algorithm, as elucidated by Dr. Stanley Fung from the University of Leicester (UK) (3 October, 2021) in a paper entitled “Is this the simplest (and most surprising) sorting algorithm ever?”, publicly archived at Cornell University’s arXiv:

Let A be an array of length n
for i <- 1 to n do
   for j <- 1 to n do
      if A[i] < A[j] then
         Swap A[i] and A[j]

This is where I began to question my sanity. This was very close to my version of Bubble Sort, except that I started j at (i+1) and tested A[i] > A[j]. It was similar in that it used a nested for loop, comparison, and swap.

But Fung wrote a paper giving this as a kind of Ripley’s Believe It or Not. He seemed incredulous that this could work at all. And yes, I can see his point. His test of A[i] < A[j] seems backward, yet the numbers end up in increasing order. This is because letting both nested loops run over the full array allows for redundant comparisons, and for additional swaps which can take place when j < i, which he explains. Is it efficient? Hell, no. But it still shares a similar efficiency with Bubble Sort, O(n^2). Would anyone in their right mind offer this to students as a sorting algorithm? Nope. And yes, he makes these points also. It’s as close as I have seen to a sorting algorithm that appears “brute force”. In addition, he gives a proof that the algorithm correctly sorts in ascending order.
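To see the effect for yourself, here is that algorithm in runnable Python (a sketch; the function name is mine). The comparison really is A[i] < A[j], and both loops really do run over the whole array:

```python
def cant_believe_it_can_sort(a):
    """Fung's 'I can't believe it can sort'. After the first outer pass,
    a[0] holds the maximum; each later pass then behaves like an insertion
    step, and the list ends up ascending despite the 'backward' test."""
    n = len(a)
    for i in range(n):
        for j in range(n):
            if a[i] < a[j]:
                a[i], a[j] = a[j], a[i]
    return a

print(cant_believe_it_can_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```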

But now I began to wonder what I was doing. I had been using something like this algorithm for 4 years before Fung wrote his paper. What was my algorithm then? Did I invent something new? Am I going to be famous? That feeling was short-lived as I read on. The next algorithm he offered was the one I taught. It was called an Exchange Sort.

I stand by Exchange Sort as a basic algorithm to teach and for students to grasp in high school, and it seems more straightforward than the Bubble Sort. I will probably end up teaching both in due course, and I have already changed my notes and slides, as I haven’t taught it yet this year.

This whole episode illustrates the importance of readily letting go of old ideas once they have been proven wrong, and then learning the right way. Learning takes humility, one of the virtues we are endowed with. People who feel no need of such reflection and change are, I feel, crippled with one of the worst learning disabilities, one borne not of biological barriers or mental illness, but merely of pride and hubris. Life is too short to hang on to old ideas that don’t work and actually never did in the past either.