According to Hugh T. Hoskins, there was an assembler called SOPAT, delivered by IBM, prior to the appearance of Autocoder. This was at the University of Southern California School of Business in 1962. Also see the later comments from Keith Williams below.
The 1401 was so popular that (according to legend) 1401 applications were still running in 2000 on 1401 simulators (which themselves might be 70x0 applications, therefore running on simulators of their own), and this presented a special challenge in the Year-2000 conversion. You can bet that 1960-era programmers with only a few thousand bytes of memory at their disposal didn't "waste core" on 4-digit years!
Pictured (left to right; follow links for bigger pictures):
The 1403 printer could print up to 1400 132-column lines per minute, sometimes more. The 1402 reader/punch could accommodate 1000 cards (half a box) in its hopper, read 800 cards per minute, and punch 250 cards per minute. Loren Wilton (of Burroughs/Unisys, who worked with the 1401 while in college) pointed out (31 Dec 2003) that:
... if you let a handful of cards fall down into the read feed (which was normally done when loading the tray, and would happen as soon as you started the reader anyway) the tray plus the read feeder would hold an entire box of cards easily, or 2000+ cards.
This was quite handy, since it reduced the amount of time you had to spend loading the cards into the reader, and you could devote your time to managing the punch, which had a much smaller hopper, and the read/punch stackers, which only held around 800 to 1000 cards at most in each stacker. Typically only the right stacker was used for the reader, and the left stacker used for the punch, so multiple stackers didn't help much.
If you were running a job that printed data onto preprinted forms, (especially with multipart paper or stiff paper) you would also have to devote a fair amount of time to monitoring the 1403 stacker to make sure that you didn't end up with forms spilling all over the floor rather than stacking neatly in the stacker. Thick forms tended to not stack well, especially if the printer were doing a lot of high-speed slews, as was typical in forms jobs.
Not shown: the 1406 storage unit containing core memory. The 1401 was equipped with up to 4K 8-bit characters of core memory; the 1406 increased its memory capacity to 8K, 12K, or 16K (thus the 1401 shown above has 4K).
The 1405 Disk Storage Unit Model 1 had a capacity of "10 million alphanumerical characters stored on 25 disks" (platters). The Model 2 held 20 million characters on 50 platters. Each platter is accessed by its own access arm (read/write head). This is one of the earliest production disk drives, the direct descendant of the original IBM 305 Disk File introduced with its RAMAC (RAndoM ACcess) computer in 1956. "The in-line method of data processing continually maintains the records of a business in up-to-date status. Any transaction affecting a business can be processed when it occurs, and all the records and accounts affected are updated immediately. The executives of an organization have available, at any time, information representing the status of any account at that moment."
I seem to remember (though I can't be certain) that we used the 1401 to process print tapes from the 7090 (or 7094). Since the 7090 was "so fast," it was considered a waste of resource to use it to print output. So print and punch output was written to tape (even parity for text, odd for binary) which was then processed on the 1401. I always liked the 1401; I remember it as being kind of elegant and economical in its use of storage (the words were only as long as you needed them to be).
This one's by me...
When I was in the Army in Germany in the mid-1960s, at 7th Army Headquarters in Stuttgart, Patch Barracks, we had a mobile 1401. It was in a BIG truck trailer. When we went out on maneuvers, it came with us. The trailer was hitched to a deuce and a half, and a gigantic gas-powered generator was hitched to the trailer. Once we were deployed in the woods or wherever, the sides of the trailer telescoped out and you had a fairly large machine room full of key punches, verifiers, the 1401, tape drives, desks, etc. Maybe the key punches and verifiers were in a separate trailer, it's all cloudy now. I wasn't exalted enough to have anything directly to do with the computer, I did the key punches and EAMs (407, sorter, etc). I had no idea what it was all used for, except that it was called a Command and Control Information System (CCIS). Anyway it did its job, whatever it was, for weeks on end in the depths of the Schwarzwald, no matter how much mud we tracked in.
In July 2006 I heard from Wade Harper, who was at CCIS at the same time, who mused:
It's hard to believe that we had 12 E6's and 12 E7's, 3 Lt's and 2 or 3 WO to program a computer with JUST 8K of memory. Yeah. Btw, E6 and E7 are enlisted pay grades. LT is Lieutenant. WO is Warrant Officer, which is in between Enlisted and Officer. Warrant Officers are usually helicopter pilots. The enlisted men (they were, indeed, all men), with one exception, were Specialists, not NCOs (noncommissioned officers, i.e. Sergeants), meaning they had the same pay as sergeants without having to boss people around. A good idea, I think: to promote people based on their skill and performance, letting them keep doing what they are good at without forcing them into management. (Apparently, the Army abandoned this practice some years ago.) Later, Wade explained what the 1401 was actually doing at CCIS:
The 1401 was programmed for MRS (Military Report System) in the field, which was a simple sequential database on tape, one block for each report. Each Hq office would submit info in card format which was put to tape as input to MRS. We could hardly program anything with just 8K of RAM. Every report had to be the same format. No individual calculations. We were brainstorming one day and Jodie Powers wondered if we could somehow put 1 or 2K of code on the tape with each block of data. Then we could individualize each report. So I finally got it programmed and it worked very well. We also programmed stuff for garrison work. I had all the conventional ammo in Europe. Spurling (because he spoke German) and, I think, Jerry Cook had the marching orders program (in case of war). I don't remember the other projects. We went around to a lot of Battalion headquarters begging for work. I stayed in the Army for 20 years. Then worked as a Systems Programmer on the IBM 360/370 and others until I retired for good in 1996. I was fortunate to learn computer programming in the Army.
_____________________________
* Written before 2011, when I was laid off. Within a couple more years, Bob too, after just about 50 years. Out with the old and in with the new!
by Gary H. Anthes
Gary Anthes contributed the following on 30 March 2005, "My own small contribution to the 1401 Appreciation Society and Autocoder Programmers Alumni Association is the attachment, a column I wrote for Computerworld a few years ago. Enjoy." [Computerworld, August 20, 1990, Manager's Journal, p.60. BYLINE: By Gary H. Anthes. Anthes, Computerworld's Washington, D.C., correspondent, is a former U.S. Navy Lt. j.g. and was assistant director of data processing at the Navy Supply Depot in Da Nang, Vietnam.]
The enemy rockets always came at night, but they were not well aimed and rarely did much damage. And when a buddy was bitten by a poisonous snake as he took cover in a bunker during a red alert, I decided to stay in bed whenever the rockets came in.
But just before dawn on Feb. 24, 1970, the Viet Cong got lucky, and I learned about disaster recovery.
A rocket launched from somewhere in the Vietnamese jungle hit the U.S. Navy Supply Depot near Da Nang, miraculously landing on a stack of 6,000 anti-tank mines. The exploding mines sent shock waves across the depot, flattening the data processing center where I worked. Secondary explosions continued for 13 hours.
When I heard the mammoth explosion at my camp several miles away, I immediately thought of the gray case holding the five tapes that were updated each day and taken off-site in case computer processing ever had to be moved to the Navy's emergency facility in the Philippines. But the case holding the backup data files wasn't in its familiar spot by my bed; I had forgotten to take it with me the previous evening. With visions of courts martial dancing through my head, I drove to the Supply Depot to help in the clean-up and recovery effort.
Although the building housing the computer center had collapsed, the IBM 1401 computer and its coterie of electro-mechanical punched card machines seemed more or less intact, although covered with tons of dust and debris. And the case holding the mag tapes was where I had left it, apparently unharmed.
Two civilian IBM engineers soon arrived on the scene, and if they slept at all over the next few days, it wasn't apparent. The computer was wheeled to an intact warehouse nearby, where Navy Sea Bees worked around the clock to install a raised floor and air conditioning. Thanks to these heroic efforts and to IBM's industrial strength vacuum cleaners, the equipment was cleaned up and working again within a week.
The IBM 1401 -- a predecessor to the System 360 -- had all the processing power and memory of today's arcade games, but it ran three shifts a day, seven days a week keeping track of an inventory of 105,000 items supporting requisitions worth $32 million a month. Although the computer and its inventory control applications were critical to the Navy's mission of supplying combat troops, disaster recovery was executed so quickly that Navy brass elected not to send me to the Philippines with the backup tapes. Thus, I escaped a court martial and never learned whether the explosions had jiggled the tapes' bits into alphabet soup.
There are some lessons in all of this for today's data center manager, none of them having to do with Viet Cong rockets, anti-tank mines or poisonous snakes. First, expect the unexpected. Second, have a gold-plated service contract backed up by dedicated, competent people. Last, if you're the one entrusted with the case of backup tapes, don't leave work without it.
When talking about memory capacity, most people understand 4K, 12K, and 16K as meaning 4096, 12288, and 16384. The IBM 1401 had memory capacities of 4000, 8000, 12000, and 16000 locations: decimal thousands, not powers of two. Might want to explain the difference. Also, each memory location was not simply an 8-bit character: one of the bits was not accessible because it was a parity bit, and another bit was the Word Mark (WM), which marked the beginning of an instruction (reading up through memory) and the end of a data field (reading down through memory).
The 1402's five output hoppers (stackers) were a very useful feature. By default the punch would drop the card into the left stacker, and optionally into the second-from-left or the middle stacker. The reader by default dropped into the right stacker, and optionally into the second-from-right or the middle stacker. I used many programs, and wrote a program or two, that would read data cards into the center stacker by default unless the card was going to be replaced, in which case it would be dropped into the second-from-right stacker and the card punch would punch a replacement and drop it into the center stacker.
The one bizarre thing about the 1401 was that the card reader would read into addresses 001 through 080. When the Load button was pressed the card reader would read in the first card, the I-Addr register was set to 001, and execution began at address 001, so the boot loader would begin (let me know if you want a more in-depth explanation of the boot loader). The card punch would punch from addresses 101 through 180, and the printer printed from 201 through 332 (132 positions), with address 200 used for carriage control (if that's the right phrase).
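The fixed buffer areas and the Load sequence described above can be sketched in a few lines of Python. This is a loose illustration rather than an emulator: the `Tiny1401` class, its method names, and the sample card image are inventions for this sketch, and only the address constants come from the text.

```python
# A minimal sketch (not an emulation) of the fixed I/O areas described
# above: the reader fills 001-080, the punch takes 101-180, the printer
# takes 201-332, and the Load key starts execution at address 001.

READ_AREA = (1, 80)      # card reader buffer
PUNCH_AREA = (101, 180)  # card punch buffer
PRINT_AREA = (201, 332)  # 132 print positions

class Tiny1401:
    def __init__(self, size=4000):
        # 1-indexed core, one character per location, blank-filled
        self.core = [' '] * (size + 1)
        self.i_addr = 0  # instruction address register

    def read_card(self, card):
        """Copy an 80-column card image into addresses 001-080."""
        card = card.ljust(80)[:80]
        lo, hi = READ_AREA
        for i in range(lo, hi + 1):
            self.core[i] = card[i - lo]

    def press_load(self, first_card):
        """Load key: read one card, then begin executing at address 001,
        so the card itself is the first stage of the bootstrap."""
        self.read_card(first_card)
        self.i_addr = 1

m = Tiny1401()
m.press_load("B008")  # a stand-in card image; contents arbitrary here
print(m.i_addr)       # execution would begin at address 001
```

The point of the sketch is the oddity the letter describes: there is no separate boot ROM, just a convention that card column 1 lands at address 001 and the machine starts executing there.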
Because the computer was variable word length, it was easy to write a program using variable-length arithmetic. A 1401 with the optional multiply/divide feature could multiply two 80-digit decimal numbers in approximately 15 minutes with a single instruction.
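In modern terms, the flavor of that digit-serial arithmetic can be sketched as schoolbook multiplication over decimal strings of any length, with operand length set by the data rather than by a fixed register width. This is illustrative only; it is not a reconstruction of how the actual multiply feature worked.

```python
# Schoolbook, digit-at-a-time multiplication over decimal strings: a
# rough software analogue of variable-word-length arithmetic, where
# operand length was delimited by word marks, not a register width.

def multiply_digits(a: str, b: str) -> str:
    """Multiply two unsigned decimal strings of arbitrary length."""
    result = [0] * (len(a) + len(b))
    for i, da in enumerate(reversed(a)):
        carry = 0
        for j, db in enumerate(reversed(b)):
            t = result[i + j] + int(da) * int(db) + carry
            result[i + j] = t % 10
            carry = t // 10
        result[i + len(b)] += carry
    s = ''.join(str(d) for d in reversed(result)).lstrip('0')
    return s or '0'

x = '9' * 80  # an 80-digit operand
y = '3' * 80
print(multiply_digits(x, y) == str(int(x) * int(y)))  # True
```

On modern hardware this finishes in microseconds, which gives some sense of how leisurely the 1401's digit-serial core cycle was by comparison.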
I have just found your pages on the 1401 at http://www.columbia.edu/cu/computinghistory/1401.html
It gave me great pleasure because I joined IBM in October 1959 just about the time of the announcement of that machine. The 1401 was the first computer that I knew and probably the only one that I understood in depth. I learned to program the machine in 1960 and by 1961 was teaching it to other IBM personnel and customer programmers.
I feel obliged to correct one small fact contained on your page. Autocoder was not the first programming language for the 1401. The instruction repertoire and memory addressing system were simple enough to allow you to code simple routines in machine language, but the first assembler program was known as SPS (Symbolic Programming System). This programming system was announced by IBM with the machine.
Many of the early 1401s (which replaced punched card accounting systems) consisted simply of the 1401 processing unit, a 1402 card reader/punch and a 1403 printer. They had no tape or disk units, and in fact these units did not figure in the first announcement. Autocoder required a tape or disk unit to process your symbolic program to produce the object code. Autocoder was made available first on the 1410, and a 1401 version did not appear until late 1961.
Until that time we programmed the 1401 in SPS (Symbolic Programming System). The SPS assembler program was held in a stack of punched cards. The programmer's symbolic program was also punched into cards and placed behind the SPS assembler in the reading stack of the 1402. On pressing the "Load" button the SPS assembler was loaded into the core memory of the 1401 and immediately read and processed the user's symbolic program. Translation was a two step process - first a partially translated deck was punched out on the punch side of the 1402. This partially translated program was then fed back into the read side of the 1402 and, during this second pass, a fully translated object program was punched out on the punch side.
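The two-pass idea Keith describes can be sketched in miniature. In the sketch below, pass 1 walks the symbolic deck assigning a core address to each label, and pass 2 substitutes those addresses into the operands. The mnemonics, instruction lengths, and origin address are invented for illustration and are not real SPS syntax; the real passes were physical trips through the 1402.

```python
# Toy two-pass translation in the spirit of SPS. Each deck entry is
# (label, opcode, symbolic_operand); lengths and origin are assumptions.

LENGTHS = {'MCW': 7, 'A': 7, 'B': 4, 'H': 1}  # assumed instruction lengths

def assemble(deck, origin=333):
    # Pass 1: assign a core address to every label
    symtab, addr = {}, origin
    for label, op, operand in deck:
        if label:
            symtab[label] = addr
        addr += LENGTHS[op]
    # Pass 2: replace symbolic operands with numeric addresses
    out, addr = [], origin
    for label, op, operand in deck:
        out.append((addr, op, symtab.get(operand, operand)))
        addr += LENGTHS[op]
    return out

deck = [
    ('START', 'MCW', 'TOTAL'),  # move a field to TOTAL
    (None,    'B',   'START'),  # branch back to START
    ('TOTAL', 'H',   ''),       # halt; doubles as a data label here
]
for addr, op, operand in assemble(deck):
    print(addr, op, operand)
```

Two passes are needed for the same reason they were on the 1402: a forward reference like `TOTAL` cannot be resolved until the whole deck has been seen once.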
The 1401 was supplied with a choice of 5 different core memory sizes. For practical "stand alone" computing the minimum memory size was 4K characters, but you could have 8, 12 or 16K memory configurations. It was also supplied with a minimum memory configuration of 1.4K for systems that were to be used as off-line printer systems for the much more powerful IBM 700/7000 series.
To clarify a point made by Bill Nugent, the smallest addressable unit of memory on the 1400 series was known as the "character" and consisted of eight binary bits (physically, eight ferrite cores). It was the equivalent of what we now call a "byte," but that term did not come into use until the announcement of the third-generation (System/360) machines. As Bill explains, six of the bits were used for character coding, using a system known as BCD, based on the code used in IBM punched cards. The seventh bit was used as a parity bit, and the eighth as a "Word Mark". A "word" on the 1400 series consisted of a variable number of consecutive character positions, the last one having the "Word Mark" bit on. It was therefore known as a variable-word-length machine, in contrast to the fixed word length of the 709 and 650, which had preceded it. Each machine language instruction constituted a "word" and could be 1, 4, 7, or 8 characters long, the last one carrying a word mark. Data words were, of course, totally variable in length, and were processed character by character in sequence until the word mark was encountered.
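Keith's description of the character layout translates naturally into a small model: six BCD data bits, a parity bit, and a word-mark bit per core position, with field length determined by scanning until a word mark is hit. Everything below (the class, the placeholder glyphs, scanning from the low-order end toward lower addresses, as data instructions did) is an illustrative assumption, not a faithful 1401 simulation.

```python
# A small model of the 1401 character: the glyph stands in for the six
# BCD data bits, word_mark is the eighth bit. The parity bit is omitted
# since it carries no program-visible information.

class Char:
    def __init__(self, glyph, word_mark=False):
        self.glyph = glyph          # stands in for the six BCD bits
        self.word_mark = word_mark  # marks the high-order end of a field

def read_field(core, addr):
    """Scan from addr toward lower addresses until the word-marked
    (high-order) character is reached; that character ends the field."""
    out = []
    while True:
        c = core[addr]
        out.append(c.glyph)
        if c.word_mark:
            return ''.join(reversed(out))
        addr -= 1

# A nine-character region holding two adjacent fields, "PAY" and "ROLL66",
# each delimited only by the word mark on its high-order character:
core = [Char('P', word_mark=True), Char('A'), Char('Y'),
        Char('R', word_mark=True), Char('O'), Char('L'), Char('L'),
        Char('6'), Char('6')]
print(read_field(core, 2))  # PAY
print(read_field(core, 8))  # ROLL66
```

Note that no length is stored anywhere: the word mark alone says where one field stops and the next begins, which is exactly what made the machine "variable word length."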
From Karen B. McLaughlin, 24 August 2006:
I was an early SPS programmer, starting at the Lawrence Radiation Laboratory (LRL) in 1961 fresh out of UC Berkeley. Keith Williams clarified a lot of specifics I couldn't recognize in the other reminiscences preceding his input. His descriptions brought back memories of what we programmers had to go through in order to test and debug, including loading/unloading punched cards and making sure the paper was stacking correctly. No one mentioned key punching, but that was another skill we all had to acquire because the keypunch staff was kept busy punching data cards since we had no tape drives. A significant characteristic about the 1401 (and others of its era) was that because there was no operating system, any error could be attributed to the program in core, making debugging a relatively simple case of problem solving. Once operating systems were involved, error correction became far more complicated and time consuming.
Another point about the 1401: the console had bit switches that allowed a programmer to change core dynamically, which enabled debugging on the fly--providing you knew Hollerith. Since the programming staff only got hands-on testing one hour a day, that was a useful feature.
Program design had to be elegant and frugal, utilizing overlays and structure before the term was invented. Today many people carry a PDA and cell phone each with more memory than the first computer we used to create payroll processing for over 5,000 employees at LRL Berkeley and Livermore in 1968.
I stayed in the computing field, encountering lots of different machines, languages, projects, and job titles, and eventually retired in 1999 after 25 years at the Jet Propulsion Laboratory. With a gleeful feeling of excess, possibly engendered by my early experience on the 1401, I just finished building a personal computer containing almost 500GB of storage, far more than I'll ever be able to use.
I had a wonderful and challenging ride over the years, but I have always thought programming and operating the 1401 was the most fun.
Robert N. Sammer, 21 May 2007:
After reading the articles on your web site, I would like to add the following.
In 1962, I joined IBM's New York Time/Life building computer operations department as a 1401 computer operator.
My responsibilities included putting all incoming computer jobs that programmers submitted to the center onto tape using the satellite 1401. This allowed the current mainframes (7090, 7040/44) to process jobs without the time-consuming input/output functions.
The New York computer center was a satellite of the main computing center in Poughkeepsie, N.Y., so the second shift in N.Y. would send Poughkeepsie, via telephone lines (this was called teleprocessing), the overflow of jobs which could not be processed in N.Y.
Since we had a deadline to meet, and there were many jobs to be teleprocessed to Poughkeepsie, I stacked the multitude of jobs onto the card read feed tray, which could hold up to 3000 cards, pressed start on the computer, and the card-to-tape process started. When there was a reader check (the reader detected an error between the read-in and verify brushes), the operator had to carefully remove the cards in the card read-in hopper so that card sequence integrity was preserved, and remove cards from the top of the read feed tray so that the remaining cards on the tray could be pushed upward and the operator could flush the cards inside the reader. Then the operator would check the card in error to see if there was a valid reason for the reader check. If none could be seen, the operator would replace all the cards in proper order and start the reader. If nothing happened, the card-to-tape processing would continue. If the same card had another reader check, the job would be eliminated from the job stream and the programmer was notified.
One night, the reader that was assigned to this task kept giving false reader checks. It would read five or six cards and reader check the next one. After performing the procedure described above, the reader would read nine or ten cards and reader check the next card.
I asked my supervisor to have the CEs (Customer Engineers) check the reader out. I was told the CEs had checked it during the first shift several times but could not find any reason for the reader checks. After about the twentieth reader check, with time running out, I did what any good American do-it-yourself repairman would do.
(To ease the stress of an eight-hour shift on my feet, I had purchased a pair of rubber-soled shoes. The sole design reminded me of a newly plowed field, with little valleys and sharp peaks which went from side to side on each shoe.)
If it doesn't work after checking it out and finding nothing wrong... kick it! That is exactly what I did. My shoe print was left on the door of the 1401 because of the normal dust found on the raised white tile floors in the operations room.
The reader did not malfunction during the remainder of my shift, reading perfectly. It worked through the third shift and the first shift, and when I returned to work my shoe print was still there, and I received many thanks from my fellow operators for fixing the reader.
Even the CEs were surprised at my solution. The print remained for a few days but was removed before the CE manager and guests came through on a tour of the operations department. Due to your web site articles, and wondering what, if anything, will be done for the 50th anniversary of the advent of the 1401, I retrained myself in programming the 1401 and have written a small utility program for it, and I am 95% sure it will work if entered into the 1401. Yes, I played 1401 computer to test the program out, and yes, it was hard to translate SPS coding into an object deck, as we called it in 1962, but I had FUN. And this after a 38-year career in IBM mainframe computer systems design, specification writing, coding, debugging, testing, etc.
From Edward G. Nilges (your editor responds, belatedly, in 2015, below):
13A 6F 1F Wang Long Village
Yung Shue Wan
As a former IBM 1401 programmer, who debugged an object code only compiler for Fortran after IBM removed support for the 1401 in January 1971, and who discovered extra precision math and new forms of "modified address" arithmetic on my own, I am actually quite angry to read how the Army had the resources to wastefully set up a "1401 data center" in the woods of Germany, while my university went begging for resources to teach its students.
This is because the 1401 data center probably did not do much of anything and was a boondoggle.
I admire the hard work and heroism of data processing techs who recovered the 1401 at Da Nang after a Viet Cong rocket attack. However, at the same time I was learning the 1401, I was marching against that crazy war, as crazy as the war in Iraq today, where, no doubt, the heroism, self-sacrifice, and hard work of lowly and unglamorous mil-specs is being wasted so officers at flag rank can get promoted, and the worst President in American history can pretend he's a man.
My direct experience as a data processing professional over the following thirty years was that in America, the civilian sector was systematically starved of time and resources to develop effective and reliable systems for human needs so that our military-industrial establishment could waste money in idiotic ventures from carting a mainframe around in a truck to "Star Wars".
I discovered in January 1972 the consequences of the macho big talk that IBM customer engineers had learned as draftees, because one of them had charged my university (a university starved of funds because Roosevelt University had the bad taste to educate working-class people and people of color) for "fixing" the IBM 1401 Fortran compiler to work on Roosevelt's minimal memory configuration... by using unavailable memory to branch to a subroutine which overlaid the runtime interpreter.
This "fix" had never been tested, but thrown contemptuously at my mathematics professor, who was merely trying to give working class students and students of color their first education in computer science so they could compete with wealthy children at the University of Chicago.
The customer engineer did not even know, and was apparently incurious to discover, that Roosevelt University was paying for and had had the extra hardware to perform multiply and divide in memory. I removed his "fix" and inserted the correct multiply instruction, and the compiler worked and was subsequently used in classes.
The intellectual incuriosity and sexist talk of the customer engineers (who liked working at Roosevelt, they said, because it was even then old-fashioned, with a glass window on Michigan Avenue suitable for girlwatching) was part and parcel of a military and corporate quasi-elite which then and now insists on "leading" America into a permanent war in Iraq (a war that resulted from a failure of intellectual curiosity as to whether Saddam Husayn had WMDs), and into increasing numbers of Americans tormented on the job and off by data systems constructed by ignorant men: systems which fail to provide them health insurance, which fail to provide veterans with benefits they earned, and which fail to record their credit history accurately, leaving their credit records prey to the criminal class.
Lyndon Baines Johnson's "Great Society" and its glimmer of hope had already disappeared by 1972, and as a result Roosevelt University and its students were even then increasingly unable to access funds for education or any true human needs, because in 1972, Nixon's insane bombing campaign and destabilization of the government of Chile took precedence.
For this reason, I pitched in, working 12 hours a day, keeping Roosevelt University's IBM 1401 system alive until it could upgrade. I developed a set of software and procedures for reliable computing that enabled the Registrar to accurately grade students and the Bursar to pay employees, in an era when this meant coding in SPS assembler language for the most part, with the Fortran compiler available for reports.
For this reason, I am appalled to see the military industrial complex and the aging men rejected by this complex celebrate the use of the 1401 to kill four million people in Vietnam. I am also on record (on comp.risks) as questioning the takeover of the Computer Museum in Mountain View by hardware types (and political conservatives who, inappropriately, insert pro-Bush rants in technical communications) who are reconstructing the 1401 in hardware, thereby wasting scarce resources, and, possibly, rebuilding toxic technologies, when the Computer Museum could present much more of the "deep" technology of the past in software simulation.
You can read more details of my early adventures with compilation on the 1401 in my book on a modern technology: "Build Your Own .NET Language and Compiler" (Apress, 2004). I remain convinced that MOST software and hardware efforts in America, then and now, are not serious technical and intellectual ventures, but boondoggles private and public meant to show that "everything is under control"... when the incidents in Manhattan in September 2001, and in Glasgow last week, show that this isn't the case at all, and that perverting technology for shows of force (if not downright murder, as when data systems are used to track our destruction of targets from the air) has created the anti-Americanism that today is the norm in other countries.
Edward G. Nilges
The US Army in Germany of the 1960s was relatively harmless. A great waste of money and resources for sure, as was the Cold War itself, but it didn't do much more harm than running over the occasional chicken. As I responded to Edward in 2007 (we had a lengthy correspondence), in 1965 “the American invasion of the Dominican Republic was a real eye-opener for me (still a teenager but already in the Army), and it came just when Johnson was starting to call up 50,000 kids a month for Viet Nam. I wondered, what Army am I in?” I applied for discharge as a conscientious objector and spent my final year in the Army waiting for them to figure out what to do with the application. In the end it was denied but by then I only had a few days left and when I got back to the States I was ready-made for the antiwar movement (and the next antiwar movement, and the next, and the next...)
Today we see the result of America's postwar priorities: much of the Middle East in full collapse, Mexico and Central America turned into killing fields; the US economy in ruins except for those at the very top; our political system hell-bent on undoing everything good that was accomplished since FDR took office; the very planet rapidly becoming a toxic waste dump sinking into a dead sea. And that's the short list. So I'm on Edward's side, but this is a computer nostalgia site :-) Nevertheless, it's always good to put things in perspective. (2021 addition: If you want to know all there is to know about my glorious Army career, you can read it here.)
From Dave Brown, 6 August 2015:
I enjoyed reading your page about the IBM 1401 computer. It brought back memories. One thing I recall is that the Autocoder instruction "Store B-register", coded as "SBR", caused the CPU to emit a very short radio burst that could be picked up by an FM transistor radio set on top of the CPU cabinet. So there were programs that would play Christmas carols by issuing SBRs at the right frequency for each note. The 1403 printer had its letters on a bicycle-like chain drive, with 132 hammers that would hit the characters as they flew by. One program to test the integrity of the chain was called a "Chain Breaker": it would print a line of characters that caused all 132 hammers to fire at the same time, which put considerable stress on the chain. I recall that there was a knob on the printer to advance or retard the hammer timing, depending on the number of carbon copies being printed; otherwise the left or right side of each letter would be missing. And like the SBR instruction, there were programs that would use the firing of the hammers to make notes to play Christmas carols. My favorite was "The Little Drummer Boy".
Photos and quoted text in the first section: From the IBM 1401 Data Processing System Reference Manual A24-1403-x, courtesy of Brent Radbourne, February 2003.
Offsite Links (all valid as of 26 March 2021):
|Columbia University Computing History||Frank da Cruz / firstname.lastname@example.org||This page created: 18 January 2001||Last update: 26 March 2021|