The programming assignment about the Collatz Conjecture that I give to my students is getting old. The Collatz Conjecture is a prediction about an algorithm. To state it:
You have a number n, a positive whole number
If n is even, divide n by 2
Otherwise, apply the formula 3n + 1
With this new value, repeat the above steps
The conjecture predicts that whatever the value of n, the series will sooner or later decay down to 1, then 4, then 2, then 1. Once you reach 1, you enter a loop that repeats the sequence 1, 4, 2, 1 forever. The repeating “1, 4, 2, 1” sequence is called a cycle. I don’t want to go into the Collatz Conjecture in detail, but you can play with some numbers and satisfy yourself that this is the case for any number you can work through on paper or in your head. There are also many videos on YouTube about it, where people both explain it and demonstrate it for some numbers.
It is a conjecture because it has never been proven for all integers. It has been checked computationally for integers as high as $5.7 \times 10^{18}$. Such checking might be offered as “strong evidence” in support of Collatz, but it falls short of a convincing proof. It remains one of mathematics’ unsolved problems.
Meanwhile, there are other sequences that have their own conjectured cycles. Dr. N. J. Wildberger (2013) has found in his research that replacing the 2 and 3 in the algorithm with 2 and 5 yields 1, 6, 3, 16, 8, 4, 2, 1 as one of its cycles. But if you start with n = 5 under the latter scheme, you get another cycle immediately after the first term: 26, 13, 66, 33, 166, 83, 416, 208, 104, 52, 26, repeating forever. Things go haywire pretty soon, however: when n = 7, the numbers shoot up to the higher reaches of unsigned int in my C++ program, and after 18,613,331 iterations with no repetition or any other kind of end in sight, I hit Control+C and stopped execution.
I changed the integer type to int1024_t, and by the 20,000th term the numbers had been hovering around 308 digits for some time, likely reaching the limit of the widest integer type I could find. Early on, as the sequence proceeded into the thousands of terms, you could see it dip a little here and there, but overall the numbers swelled.
But was it a high-flying numerical cycle? I checked for repetition by running the program under UNIX’s script program, then examining the typescript file for the numbers it captured. I opened the file in vi and checked whether the last number in it was repeated anywhere. It wasn’t. The same appears to happen with n = 9.
We can change the algorithm slightly to:
You have a number n, a positive whole number
If n is even, divide n by 2
If 5 divides n, divide n by 5
Otherwise, apply the formula 7n + 1
With this new value, repeat the above steps
This produces a cycle when n = 6, pretty much from the second term: 3, 22, 11, 78, 39, 274, 137, 960, 480, 240, 120, 60, 30, 15, 3. I’ve observed the same cycle when n = 12, 15, 17, and 19, for example. The sequences for the doubles and quintuples of these numbers would also be infinite, and members of the aforementioned cycle would predictably produce the same cycle as well. By contrast, n = 13 goes as high as 26,374,440 before coming down to 1. When the sequence terminates in my program, it is because the old pattern of 4, 2, 1 was observed.
The listing for the latter code is shown below:
#include <iostream>

int main() {
    unsigned long n;
    unsigned long highest = 0;
    unsigned long terms = 1;

    std::cout << "Enter a number: ";
    std::cin >> n;
    while (n > 1) {
        std::cout << n << "\n";
        if (n > highest) {
            highest = n;
        }
        terms++;
        if (n % 2 == 0) {
            n = n / 2;
        } else if (n % 5 == 0) {
            n = n / 5;
        } else {
            n = 7 * n + 1;
        }
    }
    std::cout << "There are " << terms << " terms in the sequence.\n";
    std::cout << "The maximum term is: " << highest << ".\n";
}
ChatGPT is a chat service driven by an AI bot. It was fed many samples of text from all over the net, and can answer your questions in a conversational style. I have seen videos that uncharitably characterize ChatGPT as quite often giving wrong answers to questions. Academics even play down the bot’s ability to give coherent answers.
Well, I can tell you that despite it being tied to corpus linguistics and samples given to it, chatGPT seems to learn how to respond pretty well.
There is a famous sentence, due to the linguist Noam Chomsky, touted to be correct grammar but still completely nonsensical: “colorless green ideas sleep furiously.” So I asked ChatGPT a variation on it:
why do green dreams sleep furiously?
ChatGPT: I’m sorry, but I’m not sure what you mean by “green dreams sleep furiously.” This sentence appears to be a nonsensical combination of words that does not convey a clear or meaningful message. If you could provide me with more context or rephrase your question, I would be happy to try to help you to the best of my abilities.
So it wasn’t fooled by this standard phrase found in many texts on linguistics. I thought it entirely possible that the phrase was so well known to the designers that it was fed to ChatGPT as a “bad” example of a meaningful phrase.
But then I remembered a video of a prof asking chatGPT about blue wolves, and apparently, chatGPT took him seriously and attempted a serious answer using reasoning that was unsupportable on a casual Googling. There are no blue wolves. So I tried this:
can you tell me about green foxes?
ChatGPT: In the natural world, foxes come in a variety of colors, including red, gray, and even black. However, there are no known species of foxes that are naturally green.
That being said, in some fictional and mythological contexts, green foxes do appear. For example, in Japanese folklore, there is a creature called the “Kitsune” which is a fox spirit with the ability to shape-shift into a human form. Kitsune are often depicted as having red or white fur, but some stories describe them as having green fur.
In modern media, green foxes also appear in various forms of art and entertainment, such as video games and cartoons. For example, in the popular video game series “Star Fox,” the main character, Fox McCloud, is depicted with green fur.
Overall, while there are no known naturally occurring green foxes, they do have a presence in various fictional and artistic contexts.
I couldn’t fool it with that either. I am now starting to chalk the blue-wolves answer up to some peculiar phrasing the prof was using, which I didn’t have the knack for. Fooling the bot isn’t easy, then; you have to know its weaknesses in a way that a layperson like me is not likely to grasp. Clearly, it is not just riffing on “found text” in some haphazard manner, as claimed.
But as a programmer, common sense tells me that a computer isn’t going to know the difference between good and bad responses on its own. It also isn’t going to predict, or have stored answers to, your questions. And indeed, each response from ChatGPT allows me to rate the answer by clicking one of two icons: either a “thumbs up” or a “thumbs down”.
It is likely that after millions of questions from as many users, the sampling is far richer than what the designers alone could have given it.
I even tried to deliberately trip it up with a question like “Why was president Jello Biafra considered an anarchist?” The question is premised on the Dead Kennedys former frontman being “president”, which is untrue. Here is its response:
Why was president Jello Biafra considered an anarchist?
ChatGPT: Jello Biafra, whose real name is Eric Reed Boucher, is an American musician and political activist who is best known as the lead singer of the punk rock band “Dead Kennedys.” He is often associated with anarchist and leftist politics.
Biafra was heavily influenced by the punk rock movement of the late 1970s and early 1980s, which was characterized by a DIY ethos and a rejection of mainstream culture and politics. He was particularly influenced by the British punk band “Crass,” who espoused anarchist and pacifist ideas.
Biafra’s lyrics and political activism often expressed anarchist and leftist ideas, such as opposition to corporate greed, government corruption, and police brutality. He also supported a number of progressive causes, such as environmentalism, animal rights, and LGBTQ rights.
While Biafra himself has never identified as a strict anarchist, his political beliefs and activism have often aligned with anarchist principles of anti-authoritarianism, direct action, and solidarity with marginalized groups. His advocacy for alternative forms of social organization and direct democracy have also been associated with anarchist thought.
Overall, Biafra’s anarchist and leftist leanings are a result of his political and cultural influences, as well as his own experiences and observations of social and political issues in the United States.
It seems to know its counterculture well enough to fool me, a former college DJ (except on some minor points, like his being influenced by the British punk band “Crass”, which I never heard about in all my years following his biography since the 1980s). But it ignored my reference to him as “president”. Other things seem just a bit “off”: I don’t think he was ever vocal on animal or LGBTQ rights (although I am sure he didn’t oppose them), so these claims are also unsupported.
When I tried a more direct question, “What year was Jello Biafra elected president?”, it then rejected my premise that he was ever president or indeed had ever held political office except in a satirical scenario. Damn. Nailed him.
Then I asked it a Calculus question:
What is the tangent equation of the function x^2 at the point (2.4)?
ChatGPT: To find the equation of the tangent line to the function f(x) = x^2 at the point (2,4), we need to find the slope of the tangent line at that point, and then use the point-slope form of a line to write the equation.
The slope of the tangent line to f(x) at x = 2 is equal to the derivative of f(x) at x = 2. We can find this derivative using the power rule of differentiation:
f'(x) = 2x
So, at x = 2, we have:
f'(2) = 2(2) = 4
This means that the slope of the tangent line to f(x) at x = 2 is 4.
Now we can use the point-slope form of a line to write the equation of the tangent line. The point-slope form is:
y - y1 = m(x - x1)
where m is the slope of the line and (x1, y1) is a point on the line.
Plugging in the values we found, we have:
y - 4 = 4(x - 2)
Simplifying, we get:
y - 4 = 4x - 8
y = 4x - 4
So the equation of the tangent line to f(x) = x^2 at the point (2,4) is y = 4x - 4.
That was scary. Notice that the question contained a typographical error: (2.4) should have been (2,4), but ChatGPT took it all in stride, corrected my mistake, and answered the question. It didn’t even point out the mistake; it just dealt with it quietly, as if it were sparing my feelings or something.
The prospect of education and chatGPT
I can say quite succinctly that with ChatGPT’s ability to produce essays and to solve math problems with detailed descriptions of the steps, education is clearly under threat.
I also asked ChatGPT to produce Java code printing the first 10 Fibonacci numbers along with the ratios of successive pairs of terms. It produced the Java code, which printed the terms and ratios on the same line; I would consider that the only bug in the program, albeit a minor one. It also added comments in key places in the code. It took only a few changes of print() statements into printf() statements before the output became presentable. Code formatting was excellent. This is a grade 10/grade 11 programming problem, and it appears a call to ChatGPT can now solve such problems, and very likely more complex ones as well. And do it in seconds.
The prospect of love, relationships and chatGPT
I love my wife, I really do. And I’m a lousy poet. So can ChatGPT write romantic sonnets? Oh yeah. You pick the flavour: standard, Petrarchan, Shakespearean; those were the styles I tried. The first one was the one I finally gave to my wife Diane as a “found poem with her name, by an unknown author”. While that is true of its content (the AI bot just curates samples of text from all over the ’net and munges them together in interesting and new ways, so indeed attribution is impossible), it was the standard “white lie” of a husband who had almost forgotten it was Valentine’s Day, and who had not the heart to tell her that the sonnet was computer-generated through artificial intelligence.
But I meant every word of the poem.
Write a sonnet for my wife diane
ChatGPT: Of all the things that I have come to know, There’s none so fair or dear as Diane’s face. Her eyes, like pools of light, with love aglow, Her smile, a thing of wonder and of grace.
She moves with such a poise and gentle ease, A joy to see her dancing through the day. Her laughter, like a bird upon the breeze, Can lift me up and chase my cares away.
In times of need, she is my shining light, A beacon in the darkness of my soul. Her love has helped me through the darkest night, And made me once again feel whole.
Diane, my love, my heart’s own true delight, You are the sun that makes my world so bright.
There will be people who dismiss any “most popular programming languages” list as a kind of meaningless horse race. That holds only as long as you are not looking to job prospects for an answer to what will pay the bills and keep the lights on when you enter the work world. But the field is large enough that you have room to ask yourself: what kind of programming do you want to do? Systems programming? Applications? Servers? Clients? Scientific models? Statistical studies? Device drivers? Everyone hears about web programming, since it is the most visible and seems to get the most “airplay” in the media. It might even interest you. For others, it’s dull. There is so much more out there.
With that preamble, why am I bothering to still do this? It is to show how popular languages follow the ebb and flow of computing history. Since World War II, we have had the ENIAC, a host of IBM and AT&T mainframes, followed by networked computers, then personal computers, then the internet, and so on. With each major shake-up, programming needs change.
Disk packs on an IBM 2314.
By 1965, the things that changed preferences in computer languages were the same things that change them today: changes in hardware, and programming for mainframes versus smaller “personal” computers (which in that decade amounted to machines like the PDP-1). In the 1960s, disk and drum storage were relatively new, as was magnetic tape. Transistors hadn’t quite had their heyday yet, and some of the most powerful computers still used vacuum tubes.
1960
COBOL. 1960 saw the introduction of mainframe computers in the service of business, and by far the most popular language was COBOL (COmmon Business-Oriented Language). COBOL was designed for portability, capable of running on many different machines with a minimum number of changes. Even today, at the end of 2022, it is often claimed that over 80% of business code is still written in COBOL.
1965
The Olivetti programmable calculator, about the size of a small modern digital cash register, and among the first of its kind.
ALGOL. Algol-60 saw the first implementation of the Quicksort algorithm, invented by C. A. R. Hoare while he was a visiting student in Moscow. He was later knighted by Queen Elizabeth II for his contributions to computing. Algol was behind COBOL as the most popular programming language, but both were dwarfed by FORTRAN users.
Niklaus Wirth
FORTRAN. FORTRAN was far and away the most popular programming language by 1965, and stayed that way for some decades. It was taught in many “service” computer courses taken by science students and most engineering students. It was known for its rather elaborate mathematics capability.
Other languages popular during that period: Assembly, APL, BASIC and Lisp. 1969 was the year that Pascal was first introduced, by Niklaus Wirth.
1970
1970 saw the invention of UNIX by Ken Thompson and Dennis Ritchie at AT&T’s Bell Labs, and Pascal came on board as a teaching language for structured programming in many university freshman courses. Otherwise, the landscape of programming languages in popular use was much the same as before.
1975
By 1975, C had grown in popularity, but it was not a teaching language: BASIC, Pascal, and Lisp had all ascended in popularity as the moon landings drew more students into computer programming. FORTRAN and COBOL were still at the top of the heap, while ALGOL, APL and Assembly moved down. Assembly would in future decades disappear from general popularity, but it would never truly go away.
1980
Enquire was a hypertext processing system first proposed at the CERN physics labs by Tim Berners-Lee in 1980. Ideas from Enquire would later be used in the design of the World-Wide Web.
By 1980, Bjarne Stroustrup had begun work on “C with Classes”, the language that would become C++, bringing the concept of object-oriented programming to the world. More and more people had mastered C, and it moved to the middle of the “top 10” programming languages used that year. Pascal became wildly more popular with the arrival of household desktop PCs and the offering of the Turbo Pascal compiler by a software company called Borland. Microsoft offered BASIC and FORTRAN compilers that went beyond the stock BASIC interpreters of the day. In addition, Tandy, Commodore and Sinclair were offering their own machines, each with its own BASIC interpreter.
1985
While he didn’t invent the Internet (he never claimed that at all, according to Snopes.com), Al Gore sponsored bills and secured funding that greatly expanded the internet after 1989.
Bjarne Stroustrup published his seminal work The C++ Programming Language in 1985. Later, with the introduction of Windows and Windows NT, Microsoft expanded its programming offerings with Visual Studio, which included compilers for C and C++. C was rising to the top of the charts, competing with Borland’s Pascal products, and C would not leave the top 3 for another 15 years.
1990
MS Windows 3.0 first shipped in 1990, the same year Adobe shipped Photoshop. The World-Wide Web also got its first exposure that year. By 1991, a computer science student named Linus Torvalds had uploaded his first kernel source code to an FTP site, where the site’s administrator named it “Linux”, a name which stuck.
Visual Basic was introduced by Microsoft, and C++ rose to the top 5. FORTRAN, BASIC, Assembly, and COBOL all fell to the bottom half of the top 10 languages. C had a wild surge in popularity as the Internet came onstream and the World-Wide Web was just starting in the universities. By 1992, the top 2 positions were occupied by C and C++. Also by 1992, the fledgling World-Wide Web created a need for CGI scripting, and Perl became popular.
1995
By 1995, Netscape’s browser had been out for about a year. 1995 was also the year that Microsoft first introduced Internet Explorer and gave it away for free, a move that would eventually drive Netscape to open-source its browser code, the project that went on to produce Firefox.
There were many scripting languages at the time aimed at web browsers, but no standard default scripting language had been settled on. By the end of the decade, that standard would go to JavaScript, a language developed in 1995. It and Perl were rising in popularity as client-side and server-side web languages respectively. But the following 5-year period brought another shake-up. Java (a very different language from JavaScript), a product of Sun Microsystems, came from out of nowhere in 1995 to be the 3rd most popular language by 1996. By this time, the web had arrived in people’s homes and there was a need to enhance people’s internet experiences.
Pascal was falling out of favour as computers moved away from DOS in the home and in business, and in the mid-1990s Borland designed an object-oriented version of Pascal, marketed as Delphi. It turned out to be a formidable competitor to Visual Basic. By 1998, even more server-side dynamic web programming was being done in the language PHP.
2000
2000 was the year that USB flash drives grew in popularity. In other news, Google makes its IPO in 2004; and in the same year we are first hearing about “web 2.0”.
PHP overtook Perl by 2000 as the 5th-most used language that year. Java and JavaScript occupied 2nd and 3rd, pushing C++ to the #4 spot. C was still on top. That year, Microsoft offered the world C#. Apart from C and C++, the top 5 languages were all web-based: Java, JavaScript and PHP. Perl was descending in popularity as a new scripting language with much cleaner syntax became ascendant: Python.
2005
In 2005, IBM sold its PC division to the Chinese firm Lenovo, which would go on to become the largest manufacturer of PCs in the world.
C was finally pushed out of the top spot by Java, and Delphi was starting to drop out of the picture as Borland ran into financial troubles after a failed bid to make inroads into Linux with its Kylix product. Borland sold Delphi to Embarcadero, which produces the product today. Perl descended in popularity only slowly, buoyed up by a legacy of libraries and its role in various bioinformatics projects, such as the Human Genome Project, conducted by universities around the world.
In part due to bioinformatics and other informatics endeavours, math- and stats-based languages such as Matlab and R popped up. New web-based languages such as Ruby also appeared.
2010
At more than 1 petaflop (over 1 quadrillion calculations per second), the Tianhe-1 (released in 2010) is capable of running massive simulations and complex molecular studies. The following year, IBM’s Watson won a Jeopardy tournament.
Perl had finally dropped out of the top 10, leaving a legacy of code on web servers all over the world. Objective-C became popular with Apple developers and the operating systems NeXTSTEP, iOS and OS X. By 2011, the top 4 were: Java, JavaScript, Python, and PHP. Apple’s teaching language, Swift, was at #9 in 2014.
2015
C and C++ were pushed out of the top 5. R, primarily a statistical programming language, rose to #7, just behind C. By 2019, Python was the top language programmers were using. Kotlin showed up briefly in 2019, owing to Google’s support of the language on Android.
2020
Not much change, except for the arrival of Go, touted as a more “reasonable” alternative to C++ with lighter syntax. Microsoft’s TypeScript, a superset of JavaScript, also rose, likely an attempt to “embrace and extend” JavaScript, as Microsoft had attempted before with Java (J++ never caught on) and with JavaScript itself via the mildly successful VBScript, which never quite caught on over the long haul.
While that was happening, Rust, which had been around for some time, enjoyed some popularity as a back-end web language as well as a systems language. By the end of 2022, TypeScript had risen into the top 5. Of the 11 most popular languages, 7 are web-based: Python, JavaScript, TypeScript, PHP, Go, Rust, and Kotlin. The others are Java, C++, C, and C#.
I’ve always disliked ideology as a personal philosophy; nothing ever really applies in all cases. This is why, while I think the idea of open-source software is awesome, it is not awesome if you have to sacrifice doing the things you need to do in the name of philosophical purity.
I share the views stated by Evan Liebovich in a video he made last year. He is more of a proponent of open source than I will ever be, but he has had to reckon with the fact that Linux is falling behind on the desktop, is a low priority for developers, and if you actually need to get things done for many ordinary use cases, you need to install Windows. Evan’s video was shot from his Windows 10 desktop.
Linux will still be the low-cost desktop solution for developers, sporting a plethora of sophisticated programming tools. In that one area, it is way ahead of all other operating systems. In other respects, however, Microsoft can support the latest scanners and media cards, which Linux is often slow to adopt.
The reason developers are slow is that, as of 2021, according to Evan, Linux was installed on only 1.8% of all desktops. This ought to be regarded as minor, considering Linux’s unquestioned dominance in Android devices, Chromebooks, business infrastructure, and internet servers all over the world. Linux has achieved dominance on nearly every platform imaginable. Just not on the desktop PC.
People writing software and drivers for the Windows PC are likely to stay in that domain, since that is where 98% of the market is. It is not very likely that most companies, especially small ones, will write Linux drivers for hardware or peripherals such as adapter cards, printers or scanners, since there is not enough money in it. And even when they do, the drivers are not as likely to take full advantage of the hardware.
Nowadays you can buy licences for Windows 10 that have been taken from old, discarded machines. This is legal if a legitimately purchased licence was on one machine, is no longer used there, and is transferred to a new machine. Microsoft will have a problem only if one licence is used on more than one machine. Otherwise, there should be no problem. Some vendors on AliExpress sell licences for as low as CAN$3.50. Some sell a DVD with it; others sell only the licence code. In the latter case, you can go to Microsoft.com and download your own installation image to a USB stick, 16GB minimum.
Xemacs, xcalc, xclock, and TeXStudio, all running on Windows 10 using an X server called “vcxsrv”, all thanks to WSL2.
Microsoft has, in the view of many, moved on from the “OS wars”, and now allows users to incorporate WSL2 (Windows Subsystem for Linux version 2) into the base operating system. This lets users run Linux applications, and even X Window applications, on their Windows desktop. Other gestures to open source include the purchase of GitHub and joining the Linux Foundation. MS Teams also has a version made for Linux.
Even without WSL2, there are many open-source (FOSS) applications that have Windows versions. And chances are they run better on Windows, since writers of video drivers, for example, likely have better support for graphics acceleration. The same can be said for audio, printer, network card and scanner support. FOSS applications that do not need WSL2 can be found on sites such as TTCS OSSWIN Online (the Trinidad and Tobago Computer Society’s collection of free Open Source Software for Windows), one of the most comprehensive FOSS archives for MS Windows.
It is still possible, by and large, to have a FOSS computer where the only “large” software expense is the Windows OS itself.
Yes, for a while I was mystified. I had taught what I was sure was the Bubble Sort, the most basic sorting algorithm given to computer students. It was so simple that I felt little need to look it up (though I should have), and just wrote it from memory. Here is my algorithm, written in pseudocode:
You have an array of numbers A[]
for i <- 1 to N
for j <- (i+1) to N
if A[i] > A[j] then
swap A[i] and A[j]
That’s the Bubble Sort, as I thought it was, and as I had been teaching it for 5 years. The algorithm works, and is inefficient, as the Bubble Sort is advertised to be, and I never questioned it. It was also an easy one to teach, and students picked up on it easily.
But from this source or that over those 5 years, I kept finding others attributing a wholly different algorithm to the Bubble Sort. I dove into my copy of Introduction to Algorithms, 2nd edition (Cormen, Leiserson, Rivest and Stein, 2001), a text used in many of the better computer science programs in North America, and this is how they describe it:
Let A be an array of data to sort
for i <- 1 to length[A] do
for j <- length[A] downto (i + 1) do
if A[j] < A[j - 1] then
Swap A[j] and A[j - 1]
Yes, this is the source I should have cited rather than going from memory, and that would have been the end of it. Notice that in a bubble sort we only ever compare two adjacent array elements. At the time, though, I had only seen evidence of this on a YouTube video and some other nameless, faceless websites, and thought maybe there was more than one algorithm called a bubble sort. I tried their version several times on paper with several short combinations of numbers, and convinced myself that it works too.
There are other versions. Harvard’s CS50 adds some efficiency to the pseudocode. Since the list is often sorted well before both loops have completed, the modification is to use a conditional loop (rather than a counted one) together with a swap counter, which decides whether the list really is sorted: if there are no swaps after one pass of the outer loop, the list is declared sorted and the algorithm exits. It goes something like this:
Let A be an array
Set swap_counter to a nonzero value
Repeat
| reset swap_counter to 0
| look at each adjacent pair in A
| | if A[n] and A[n-1] are not in order,
| | Swap A[n] and A[n-1]
| | Add 1 to swap_counter
| v end if
v move to the next adjacent pair
Until swap_counter is 0
You have to play with some short lists on a sheet of paper to convince yourself that this will actually sort a list. It is still a nested loop, but a conditional one.
Then there was Wikipedia. Its entry on Bubble Sort, apart from describing the proper bubble sort, had a footnote at the bottom pointing to a paper describing a sorting method called I can’t believe it can sort. Seriously, that was the name given and cited. Here is the algorithm, as elucidated by Dr. Stanley Fung of the University of Leicester (UK) (3 October 2021) in a paper he entitled “Is this the simplest (and most surprising) sorting algorithm ever?”, publicly archived at Cornell University:
Let A be an array of length n
for i <- 1 to n do
for j <- 1 to n do
if A[i] < A[j] then
Swap A[i] and A[j]
This is where I began to question my sanity. This was very close to my version of the Bubble Sort, except that I started j at i + 1 and tested A[i] > A[j]. It was similar in that it used a nested for loop, a comparison, and a swap.
But Fung wrote a paper offering this as a kind of Ripley’s Believe It or Not. He seemed incredulous that this could work at all, and yes, I can see his point. His test of A[i] < A[j] seems backward, yet the numbers end up in increasing order. This is because letting both nested loops run over the whole array allows for redundant comparisons, and for additional swaps which take place whenever A[i] < A[j], which he explains. Is it efficient? Hell, no. But it still shares a similar efficiency with Bubble Sort, O(n^2). Would anyone in their right mind offer this to students as a sorting algorithm? Nope. And yes, he makes these points also. It’s as close as I have seen to a sorting algorithm that appears “brute force”. In addition, he gives a proof of correctness of the algorithm to sort in ascending order.
But now I began to wonder what I had been doing. I had been using something like this algorithm for 4 years before Fung wrote his paper. What was my algorithm, then? Did I invent something new? Am I going to be famous? That feeling was short-lived as I read on: the next algorithm he offered was the one I had been teaching. It is called the Exchange Sort.
I stand by the Exchange Sort as a basic algorithm to teach and for students to grasp in high school; it seems more straightforward than the Bubble Sort. I will probably end up teaching both in due course, and I have already changed my notes and slides, as I haven’t taught it yet this year.
This whole episode illustrates the importance of readily letting go of old ideas once they have been proven wrong, and then learning the right way. Learning takes humility, one of the virtues we humans are endowed with. People who feel no need for such reflection and change are, I feel, crippled by one of the worst learning disabilities, one not borne of biological barriers or mental illness, but merely of pride and hubris. Life is too short to hang on to old ideas that don’t work and actually never did in the past either.