This machine was Columbia's prototype for open public timesharing. Previously all computing was done in batch on the central IBM mainframe (with the exception of the Business School, which had access to an interactive subsystem of the mainframe, CALL/OS, a BASIC interpreter).
Columbia's PDP-11/50 was housed in 208 Computer Center 1975-1982 and ran
DEC's RSTS/E operating system as a pilot project in open timesharing (we
would have liked to try Unix but at that point it had no security; also, since
RSTS/E was built around BASIC, this would be a migration path for the
Business School). 32 terminal cables came in at the back, initially
hardwired to public terminals downstairs, later to the PACX terminal switch,
and thus accessible from all over campus. The experiment was a big success,
with 1700 users signing up for accounts, quickly overloading the machine.
In 1977 we started buying the bigger, faster, and more general-purpose DECSYSTEM-20s. The PDP-11 was retired and sold off
in 1982. In fact we didn't actually sell it; we traded it for an RP06 disk drive to add to the DEC-20 disk farm.
The PDP-11 console (ours is shown above in a black-and-white photo by Ben Beecher) had switches and lights. The lights showed the address and contents (bits) of a selected location. When the CPU had nothing else to do (increasingly uncommon as the system became more popular) it played a "movie" on the light bank. Each operating system (RSTS/E, RT-11, RSX-11, etc) had its own movie -- groups of four lights moving left to right, right to left, or in a circle between the two light banks. MUMPS (the Massachusetts General Hospital Utility Multi-Programming System) had the lights throbbing in and out from the center like a bad case of mumps. This was the sort of thing that gave the DEC world an aura of "fun", compared to the "serious" IBM world ("this page intentionally left blank"). At right, another shot illustrating the cheerful color scheme.
Here are a few more photos taken (by me, circa 1976) in Columbia's PDP-11 room. Click on a photo to enlarge it.
I don't remember what the other stuff is — oscilloscope, etc. Apparently we were testing something... Background center: a DEC VT52 terminal. To its right: a rack full of PDP-11 and RSTS/E manuals. Foreground right center: a DEC LA36 DECwriter. Just above the right tractor sprocket of the LA36 you can see a pair of the airport earmuffs we had to wear in the constant 75dB racket (more when the printers were going). Left: behind the oscilloscope, the once-ubiquitous Hazeltine 2000 CRT with its keyboard on top.
Ben Beecher at his desk. Left: 9-track tape rack for backup tapes. Right: the left portion of the PDP-11/50, showing the tape drive and another cabinet containing an RS04 fixed-head disk and perhaps some memory. Foreground left: the rear of the Terminet, partially obscured by a couple of pairs of airport earmuffs.
By the way, I had had a job for a while in the Computer Science Laboratory of Mount Sinai Hospital in NYC in the early 1970s, working on a PDP-11/20 and an 11/45 with DOS-11/Batch. Many years later somebody asked what DOS/Batch was like; my recollections:
You punched cards, read 'em into the reader, got the results back on the line printer. Those of us who used the machine to develop software or do production runs (e.g. hunting through a tape to compile and correlate data on anal fissures) would take turns. No multiprocessing; each person had the machine all to her/himself, switches, lights, spinning-back-and-forth DECtapes, squeaky RK05s, and all. When you weren't using the machine you were writing code at your desk with pencil and paper. There was a Teletype console with a minimalistic "monitor" (shell), where you could activate jobs from the card reader, DECtape, paper tape reader, etc, and use PIP (Peripheral Interchange Program) to manage files (move, rename, delete, copy, etc).
Of course, a batch job could run a program that interacted with the Teletype or other devices. The one I remember best had to do with cervical cancer. One of the treatments was for the surgeon to manually insert radioactive needles into the tumor. As you can imagine, it took time and patience to get the needles in just right, to maximize the dose to the tumor and minimize the dose to healthy tissue. So doctors who performed this procedure a bunch of times began to suffer from the radiation.
Lee Lidofsky, a professor of Nuclear Engineering and Medical Physics at Columbia (whom I had worked for in the late 60s and early 70s, and who steered me towards a career in computing) came up with an ingenious solution, one that would not have been possible a few years earlier: let the doctor shove the needles in any old way, as fast as possible, then take a stereo X-ray of the area and dash over to the computer lab, where we would "scan" the X-rays. The software creates a 3D model and displays it on a storage tube; the doctor interacts with the graphic (using a light pen) to specify the tumor boundaries. Then the software figures out the 3D dose contours (by solving tons of simultaneous equations that would take weeks or months to do by hand), displays the isodoses on the tube, and prints out instructions stating the optimum time at which to remove each needle, thus maximizing the dose to the tumor and minimizing the dose to healthy tissue of both the patient AND the doctor. The doctor runs back to the operating room and follows the instructions.
All this in, what, 32K? And yes, the computer really, really had to be up when the doctor burst in with those X-rays! As far as I know, it always was.