Remarks on LaTeX editors


Nearly three years ago on another blog, I wrote a comparison of LaTeX editors. Soon after, I began to use a third editor which, if you are a LaTeX expert, you have almost certainly heard of and are probably using: TeXstudio. It has been around for close to a decade, but never seemed to show up in the package lists of the Linux distributions I installed. The editors that did show up, at least for me, were LyX and TeXmacs.

Once I discovered TeXstudio, I installed it everywhere I could: on my Windows 10 and 7 machines, on my Linux installations, and even under Cygwin, despite already having a native Windows installation. To this day I have not seen any difference in output or functionality between these versions. Every installation of TeXstudio takes considerable time and pulls in a long list of packages before you have a reasonably complete set of features.

[Screenshot: TeXstudio, with the horizontal toolbars shown, along with part of the workspace. Two vertical toolbars are also partially visible.]

First things first: the editor. In LyX and TeXmacs, I needed to bail out of the editor and export the code to LaTeX whenever I had any serious equation editing or table editing to do. In contrast, TeXstudio leaves me with no reason to ever leave the editor. The editor accepts native LaTeX code, and if there are pieces of LaTeX that you don’t know, or know only fuzzily, there is probably an icon or menu item that covers them. For document formatting, a menu item leads to a form dialog where you fill in sensible information pertaining to your particular document: default font size, paper size, margins, and so on. The output of this dialog is the preamble section of the LaTeX source file; to the rest of that file, you add your text and formatting codes.

In short, it is a kind of “notepad” for LaTeX, with syntax highlighting and shortcut buttons, menus and dialogs. It comes close to being WYSIWYG, in that pressing the green “play” button compiles the code and brings up a window with the formatted output of the document you are editing. It is not a live update, but it saves you the agony of saving, compiling on the command line, and viewing, in seemingly endless cycles. Now you can view the formatted document at the press of the play button.
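For a concrete idea of what that dialog produces, a generated preamble might look roughly like the following. This is only a representative sketch, not what TeXstudio will literally emit; the exact packages and options depend on what you fill into the form:

\documentclass[12pt,letterpaper]{article}
\usepackage[margin=1in]{geometry}  % paper size and margins from the dialog
\usepackage[utf8]{inputenc}
\usepackage{amsmath}               % one of the extras you might tick off
\begin{document}
% ... your text and formatting codes go here ...
\end{document}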

Compiling The Linux Kernel Docs


In the last article, I said that compiling and installing source versions of software was akin to “going rogue”. I must confess that I have compiled and installed software from source when it wasn’t in my distribution. The most recent was TeXstudio, one of the larger projects, which required tons of other libraries and whatnot to be installed as well (or, quite often, compiled from source on the side), since it wasn’t part of the Linux distro I was using at the time. It also wasn’t part of Cygwin, so I compiled it for that too. It was a great way to kill an afternoon.

But there was a time when I compiled the kernel itself from source. It was necessary for me, as speed was an issue and I had slow hardware at the time. What I also had was a mixture of hardware pulled from different computers at different times. I researched the specs on my sound card, network card, video card and motherboard chipset, and knew which options to tweak in the kernel configuration dialogs to get the kernel to do the right thing: namely, to be fast and to recognize all my hardware. I was doing this before the days of modules, with the version 1.x kernels. It worked, and the result was noticeably faster than the stock kernels. X Windows on my 80486 PC ran quite well with these compiled kernels, but was sluggish to the point of being unusable with a stock kernel. Every few versions of the kernel, I would compile a new one for my PC, and pretty soon the Tcl/Tk configuration dialogs they introduced made things easy enough that I could answer all the questions from memory.
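From memory, the build sequence for kernels of that era went something like this; treat it as a sketch rather than an exact recipe, since the details varied from version to version:

# configure the kernel; "make xconfig" brought up the Tcl/Tk dialogs
# ("make config" was the plain question-and-answer text mode)
make xconfig
# rebuild dependency information, then the compressed kernel image
make dep
make zImage
# copy the new kernel into place and re-run the boot loader (LILO, back then)
cp arch/i386/boot/zImage /boot/vmlinuz
lilo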

But then that all ended with version 2. Yes, I compiled a version 2 kernel from source, and yes, it ran OK. But it also had modules. The precompiled kernels were now stripped down and lean, and the modules were only added as needed, when the kernel auto-detected the presence of the appropriate hardware. After compiling a few more times, I no longer saw the point from a performance standpoint, and today we are well into kernel version 5.3, and I haven’t compiled my own kernel for a very long time.
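On a modern system you can see this modular scheme at work with a couple of read-only commands (e1000 below is just a common example of an Ethernet driver module; substitute anything from your own lsmod output):

# list the modules the kernel has auto-loaded for the detected hardware
lsmod | head

# show the description, author, and parameters of one such module
modinfo e1000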

For the heck of it, I downloaded the 5.3 kernel, which uncompressed into nearly 1 gigabyte of source code. I studied the config options and the Makefile options, and saw that I could just run “make” to create only the documentation. So that’s what I did.
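In the 5.x kernels the documentation is built through Sphinx, so the invocation was presumably along these lines (pdfdocs and htmldocs are the in-tree make targets; the exact output location may vary):

# from the top of the kernel source tree
cd linux-5.3

# build the documentation as PDFs (requires Sphinx and a LaTeX toolchain);
# the results land under Documentation/output/
make pdfdocs

# or build the HTML version instead
make htmldocs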

It created over 8,500 pages of documentation across dozens of PDF files. And 24 of them are zero-length PDFs, which presumably didn’t compile properly; otherwise the page count would have easily tipped the scales at 10,000. The pages were generated quickly: the 8,500 or more of them, errors and all, took about 3 minutes. The errors manifested themselves as the associated PDFs not showing up with any content under the Documentation directory. I have a fast-ish processor, an Intel Core i7-4770K (a 4th generation i7, which I never overclocked), running on what is now a fast-ish gaming motherboard (an ASUS Maximus VI Hero) with 32 GB of fast-ish RAM. The compilation, even though it was only documentation, went screamingly fast on this computer, much faster than I was accustomed to (although I guess if I am using 80486s and early Pentiums as a comparison …). The output of the LaTeX compilation to standard error was a veritable blur of underfull \hbox warnings and page numbers.

For the record, the page count was generated using the following script:

#!/bin/bash
# Tally the page counts of all PDFs in the current directory.
tot=0
for i in *.pdf ; do
        # only consider PDFs of non-zero length: the zero-length
        # ones are the documents that failed to compile
        if [ -s "${i}" ] ; then
                # extract the page count from pdfinfo's "Pages:" line
                j=$(pdfinfo "${i}" | awk '/^Pages:/ {print $2}')
                # give a pagecount / filename / running total so far
                echo "${j}    ${i}    ${tot}"
                # tally up the total
                tot=$((tot + j))
        fi
done

echo "Total page count: ${tot}"
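The 24 zero-length files mentioned above can be counted with a one-liner along these lines (assuming, as with the script above, that the PDFs were gathered under the current directory):

# count the zero-length PDFs left behind by failed LaTeX runs
find . -name '*.pdf' -size 0 | wc -l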

Relativistic Pedantry


I must say first off that I teach math and computer science, and was never qualified to teach physics. But I am interested in physics, and got drawn into a physics discussion about how time does not stretch or compress in the visible world, and how this is why, in most of science, time is always the independent variable, stuck for most practical purposes on the x axis.

In the macroscopic world, time and mass are pretty reliable, and the classical values lie so close to those given by Einstein’s formulas (those associated with the Special and General Theories of Relativity) that we prefer to stick to the simpler formulas of classical mechanics: they are great approximations, so long as things move well below the speed of light.

I am not sure (is anyone?) about how time is influenced by things like gravity and velocity (in particular, about the formulas stating how time behaves as a dependent variable with respect to these things), but I remember an equation for relative mass, one which doesn’t involve time, that provides some insight into relativity:

    \[ m(v) = \frac{m_0}{\sqrt{1 - \frac{v^2}{c^2}}}, \qquad \lim_{v \to c^-} m(v) = \infty \]

Here, the independent variable is velocity, and it is evident that even for bodies that appear to move fast (on the scale of 10,000 to 20,000 km/h), velocity doesn’t have much impact on this equation. Rest mass and relative mass are essentially the same, and a body would have to move at nearly the speed of light for its mass to change significantly. Indeed, as the velocity v approaches the speed of light c, the mass shoots up toward infinity. I understand that Einstein stated that nothing can move faster than light, and this is supported by the above equation, since v > c would make the quantity under the radical negative.
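To put a number on “not much impact”: take a representative satellite speed of 14,000 km/h (about 3,890 m/s) and plug it into the factor under the radical. The figures below are rounded, but the order of magnitude is the point:

    \[ \frac{v^2}{c^2} = \left( \frac{3.89 \times 10^3 \ \mathrm{m/s}}{3.00 \times 10^8 \ \mathrm{m/s}} \right)^{2} \approx 1.7 \times 10^{-10}, \qquad \frac{m(v)}{m_0} = \frac{1}{\sqrt{1 - 1.7 \times 10^{-10}}} \approx 1 + 8.4 \times 10^{-11} \]

So at satellite speeds, the relative mass exceeds the rest mass by less than one part in ten billion.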

It does not escape my notice that velocity is itself supposed to depend on time, making the function m(v(t)), but time warps under things like high velocity (as well as high gravity), so that time depends on … ? This is where I tell people to “go ask your physics prof” about anything more involved.

Satellites move in the range of 10,000 to 20,000 km/h, from hundreds to tens of thousands of kilometres above the Earth’s surface. My assertion was that there is not much change here in relativistic terms. But the effect is still large enough to keep the makers of cell phones up at night, since failing to account for Einstein’s equations in time calculations would cause GPS systems to register errors in a person’s position on the globe on the order of several kilometres, rendering the GPS functions on cell phones essentially useless.
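For a rough sense of the scale, a commonly quoted figure (which I take on faith from the physics literature) is that uncorrected GPS satellite clocks would drift by about 38 microseconds per day relative to clocks on the ground. Positions are computed from signal travel times at the speed of light, so that drift converts to a daily position error of roughly

    \[ c \, \Delta t \approx (3.0 \times 10^8 \ \mathrm{m/s}) \times (38 \times 10^{-6} \ \mathrm{s}) \approx 11 \ \mathrm{km}, \]

which is consistent with the “several kilometres” figure above, and it accumulates day after day.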

My companion was trying to make the latter point, whereas I was thinking much more generally. We stick to classical mechanics not because its equations are necessarily the correct ones, but because they are simple and lend a great deal of predictive power to the macroscopic world around us.

While you are quarantining and social distancing …


[Image: Sir Isaac Newton, along with some personal notes written in Greek.]

Other, greater people have done great things in quarantine, long before you were born. I already knew that Sir Isaac Newton made discoveries in optics and gravitation and worked out the rules of calculus, which he called the study of “fluxions”. What I didn’t know is that he did so in his early 20s, during the two years when England suffered an epidemic of the Bubonic Plague, known as The Great Plague, in 1665-1666, long before infectious diseases were known and understood. It is even worth remarking that so little was known of medicine generally that Sir Isaac believed in alchemy until the day he died.

Prior to his quarantine he was thought of as an unremarkable undergraduate student, according to Wikipedia. But two years cooped up at home, avoiding the Plague, gave him the time alone to come up with his brilliant theories on classical mechanics, and the calculus to explain them mathematically.