In another life, I was asked to create business cards for Odyssée Recherches Appliquées in Montréal (a long extinct subsidiary of Odyssey Research Associates in Ithaca). One of the difficulties was computing Bézier curve coefficients; I vaguely remember writing a Turbo Pascal helper for this. Shift-click on the image instead of clicking, at least in some Web browsers, to get to the PostScript source without launching a PostScript viewer. I have only a little experience with writing PostScript directly, and honestly, I do not wish to extend it ☺. So even though the diacritics do not show anymore, I do not intend to debug this.
Université de Montréal
I consider myself very lucky to have been appointed as a "systems analyst" for the very first job I got in my life (this might be worth the tale, one of these days!) The small team (five or six people) took care of the CDC mainframes of the computing centre, at Université de Montréal. Personal computers did not exist yet, so it was only natural that hundreds of users had to share the power of a single machine.
Those CDC mainframes, while among the biggest computers available in their time, are tiny by today's standards. So, having these machines handle job requests from hundreds of users was quite an achievement. Every second was precious, and charged as such: CPU time and IO time were sold. We did strict accounting. Moreover, there was absolutely no ethical problem, at the time, with our team routinely studying how various users were using the mainframe, and pointing them to possible improvements. So, users were educated to be good citizens and to spare resources as much as they could. Merely fooling around or random exploring was not very welcome; offenders could even be expelled and lose their computing privileges.
I immensely enjoyed interacting with hundreds of researchers, and I guess many liked me as well: the exchanges were surely diverse, fascinating and fruitful. Very often, people stood in my office, enthusiastically teaching me about their work, entertaining me, often with the hope that we could find ways to somehow optimize their work. No one really counted their time; we worked night and day, and weekends did not exist (surely weekends existed for many, yet the computing centre was so crowded at all times that, naive as I was then, this was the impression I got).
Everything was batch on such computers: input from punched cards, output on printed listings (or more rarely, on punched cards), with big magnetic tapes for long term storage (disk storage systems, despite being physically huge, were too small and too fragile for that). Interactive work required so many more precious resources than batch processing that it was introduced reluctantly, with circumspection and extreme care. Many computing centres wrote their own editors, meant to make appropriate compromises. Ours was called ED, and was part of TELUM, a locally developed full telecommunication system for SCOPE, which had none. ED was more an interactive programming shell with an embedded (powerful) editor and a batch submission interface than a mere editor. TELUM involved PP programming, of course, but also PDP-11 code. Two people created a hardware interface between a PP channel and the PDP. Besides participating in the design discussions, my main involvement was to create programs to produce hardware cabling diagrams and recipes, and also a full PDP-11 macro assembler, written in CPU COMPASS. Later, I wrote BONJOUR, which was an alternative shell for ED under TELUM, yet able to collaborate with ED. Hmph! This was one project among a lot of others. I realise that we were doing a lot of things at that time.
Optimization concerns
The nanosecond (which was really a short time interval in those eons) was deified. I remember having spent three full days optimising about 25 lines of CPU assembler (yet I also remember, many years later, that I was very tired). It was the core of a relocating loop for a LISP loader a friend and I were writing, and the idea was to be able to fully process the tables at least as fast as they were delivered from disk by the PPU. Some might remember all the attention that was given, at the time, to not losing disk revolutions.
The problems to ponder, for a good assembler coder, were: to keep as many arithmetic units busy simultaneously as possible; to resolve register allocation conflicts (three types of them) that would prevent the scoreboard from issuing instructions; to tighten loops enough that they fit in the small instruction stack; to minimise the number of no-ops needed to realign jump targets; to rebalance code between jumps and fall-through tests so RNI is more efficient; and to properly "wrap" code around the entry/exit point. Maniacs also considered memory bank conflicts, but that, I usually did not.
The FTN compiler (an optimizing FORTRAN) took care of many of these tasks automatically, and also generated clever code in many circumstances, often to minimise the number of conditional jump instructions and to normalise arithmetic. FTN was a wonder. Many assembler hand coders could not easily outperform FTN-generated code, speed-wise.
Very few people considered or praised the other compilers, but I guess I studied the code generated by everything available. Nobody knows how absolutely clever the COBOL compiler was at doing decimal arithmetic, on a machine not meant for that. So clever, in fact, that I wondered if Seymour Cray had not known it all in advance when he designed the machine. The Pascal compiler, later, was astonishingly sharp at generating code that avoided multiplications and divisions, especially while indexing packed structures.
When I see myself programming Python, nowadays, and caring so little about CPU cycles, I sometimes feel I lived many lives ☺.
PPU programming
As one of the very few allowed to do PP programming, I had to be very careful. Any PP bug would usually bring the whole system down, requiring a deadstart, and thus disturbing a lot of users. Deadstart was sometimes needed to load new versions of PP code as well (depending on the situation); it was always scheduled late in the night, so as to impact users less. And lastly, PP bugs could also physically damage the hardware (like PP core memory, or the display tubes of the console, for example). So, this was less fun, and I was always nervous while performing tests.
Hardware
We got the whole hardware schematics for the CDC machines, as well. I have a funny story about this, where I play the hero! ☺
After the CDC 6000 series, and after Seymour left the company, CDC created the Cyber 70 series. If I remember correctly, the designer was named Gary Tom, who was also a bit known for having an extraordinarily beautiful wife! ☺
Our Cyber 74 was the fourth or fifth of the series. Even though we had all the schematics, the series was new and the tutorial documentation for technicians was still sketchy and incomplete. The machine suddenly became quite unstable, deadstarting spontaneously many times a day (we had a bit more than 600 such deadstarts in one month, which makes 20 a day!), and the technicians were just unable to isolate the cause. This lasted for nearly half a year. CDC was a bit desperate, flying in new teams every week, on the premise that if a team does not quickly find what is wrong, new blood is needed to try other avenues of thinking. They had teams of specialists study ambient electro-magnetic radiation. All cabling was multiply shielded. Coincidence patterns were tried against the operation of the nuclear laboratory in the university, the usage of big machinery in nearby hospitals, etc. The machine was cooked a great deal (that is, the lengths of the cables were adjusted to improve the synchronicity of transmitted signals). I have a lot of almost incredible tales from that period. One of those nights, for a single example, I surprised the EIC, who had run out of avenues after months under sustained pressure, with a radiesthesist's device around the machine, in the hope that the pendulum would hint at where the problem could be…
At last, Gary Tom himself was flown in to defend the reputation of his design. This is how I met and got to know him, and we spent a lot of time together, working and eating on schedules as coincident as possible, and he explained good parts of his design to me, rather thoroughly. Being a software guy, I was fascinated by all this new knowledge. We surely had a lot of fun! After many days poking around with diagnostics programs and oscilloscopes, Gary went away from the machine, and started studying the details of how the machine had been cooked over the last months (any change in the length of any cable had to be reported in the cookbook). Gary found the problem by pure thinking, and I was much impressed. It turned out to be one of the transistors, serving as an amplifier for one of the phase clocks, which was very gradually becoming slow (how this triggered a deadstart sequence, which is a fairly complex process, is fascinating in itself, but would make this blog entry a bit too technical). This slowdown had been masked by all the cable cooking, which had the unrecognized effect of counterbalancing it.
A good while later, long after Gary's visit, a physicist came to us with a strange problem, telling us that one of his eigenvalues was wrong. It took us many days to figure out that at one step of the computation, x = y*z was in fact computed as x = y*z^2. It was not bad code, nor some compiler bug. The hardware was indeed spontaneously squaring one of the floating operands, without being instructed to do so. Something far from trivial, in everybody's opinion.
After many days, the technicians still had no clue about this astonishing mystery, so I decided to join them and study enough of the maintenance software to become useful. They were a bit reluctant at first, but finally accepted me. Using better testing tools, I finally found out that the double multiplication occurred when the multiplication instruction was located in a particular spot in the instruction stack (a small hardware cache for tight loops), and only when the first multiply unit was already busy (there were two such units). Then, remembering Gary, I went away from the computer, and started to follow the circuit diagrams carefully (not being used to them). It came to my intuition that the second multiply unit was probably receiving its trigger twice instead of once, maybe because of some pulse reflection somewhere. I made the hypothesis that one of the cables was probably not plugged in correctly. From the diagrams, there were three possibilities, which I submitted to the technicians (they would not allow me to handle the thick mass of cables myself). They found a floating cable, as predicted.
From that time on, and for many years, I enjoyed a lot of respect and collaboration from the technicians, and from all neighboring sites! ☺
Other anecdotes
Some users never entered a machine room, and for some of them, it was all really mysterious. On a CDC-6600 running SCOPE, the command REWIND(file) on punched cards could be used to bring a magnetic tape back to its beginning, and either UNLOAD(file) or EJECT(file) could be used to fully unwind the tape from the vacuum columns and open the protective door. Unbelievable but true, a user once asked me if using the EJECT command represented a potential hazard for the appointed operators…
CDC changed a convention in the naming of files, and a few specialised applications failed. My boss asked me to write the PSR (programming summary report), the euphemism for a bug report at the time. He told me the exercise would help me learn English and get acquainted with the problem submission mechanism, but said he wanted to review my text before it went out. So, I took a dictionary and looked up words one by one (the usual way: look up the French, take the English, look up that English to cross-check the French, iterating a few times until you think you have a stable meaning among all those on offer). In one introductory sentence, I just wanted to say that the strange file names could not be handled in various ways by the group of applications. I was quite proud to find that fiddle means handling, staying vague about the precise operations being performed. Trying to translate strange yielded queer. After reviewing the text, once typed on the proper forms by the local secretaries, my boss told me that he could not let my report go, as in English, one should never use those two words in the same sentence. A bit annoyed at starting the cycle over, I tried to assimilate yet another English rule. It was only later that I figured out the meaning of what I had written.
A few years ago, a guy said he found C, as a novice, easier to grasp than Python. Pondering his arguments threw me into the past and reminded me of Pierre Garneau, an architect (someone who designs houses) who was using these CDC machines. Pierre was then getting into FORTRAN programming, with a bit of assembler. Since he had a lot of energy, dynamism and enthusiasm, I took a lot of pleasure in diving into various discussions with him. To my surprise, he was pretty resistant to any form of higher abstraction, like those found in Pascal, LISP or Simula (all popular at the time). He explained to me that whenever he used these abstractions, he just could never stop his own mind from translating them into more machine oriented paradigms all along, and this constant translation was more burdening than helpful. So, for him at least, these abstractions were encumbrances. So yes, everything is possible, and it helps me to understand that some people are hardwired so strangely that they might, even today, prefer C to Python! ☺
Opening to the world
My main regret of that time is that it was mostly closed and local. All that invested work could have been pretty useful to the international networking community, and while some of it got through and became widespread, most of it stayed humble. The time for planetary exchanges, as we know them today, was still far from having come. People were mainly writing for themselves or for their neighbouring collaborators; it was fairly humble on average. And even when not so humble, the not invented here syndrome was still strong, which was another reason for wider exchanges not to occur frequently.
From my own little selfish viewpoint, I started what later became Recode in those times. It was successively written in CDC FORTRAN, COMPASS and Pascal, or a mix thereof. (Overall, Recode has been a long adventure. From Pascal, it was later ported to Turbo Pascal on the Apple ][, then the IBM-PC, and only later, to Microsoft C. When I learned Unix, I brought Recode there, and adapted it to the GNU standards, just for learning them. Someone from GNU discovered it, and liked it enough to make it available there. Only then did Recode become known outside my circle of friends and local CDC users.)