Technology knows no bounds

Not as "old as dirt" but I do go back a ways with computers. In the '70s, the foundry had an IBM 1200 as the machine controller. There was a 360 up in billing, but I never worked up that way. The spectrometer had a PDP-8 with a 33 teletype reading the program from paper tape. Later I worked for Wang Computers, in the Western Pacific area. The machines had 300 meg CDC drives. When we got our hands on a Winchester 8" hard drive, I fell in love with it. Not as fast as the CDC drives nor so easy to change the "pack", but so much technology in so small a package was the shape of things to come. Then IBM hit the market with a "personal" sized machine just a little bigger than an Apple2. Along with Bill Gates and going into hiding ever since.

I didn't go to high school; all I had was experience with "precision electronics" from the foundry. A 5-1/4" hard-sectored floppy held the bootstrap loader for the CPU on the Wangs. When I went to work for US Steel, we had a PDP-11/43 for interfacing. The system dated from around '80 and had an 8" single-sided floppy for bootstrapping. I never was a "customer engineer" as such, just an old-school electrician with an interest in electronics. It led to a number of interesting pursuits over the years.

Best I recall, the 5-bit "Baudot" code was what ran on the 19-type machines; it is where the term "baud" comes from, while the Model 33 Teletype spoke ASCII. I'm not too sure, but the Model 45 teletype may have gotten its designation from running at 45 baud. I do remember that 110 baud was considered to be fast at the time. So many memories lost...

 
The first computer I ever saw belonged to the University of Arizona. It was in 1959 or '60 and I was in seventh grade. It was in two wheeled cabinets just small enough to fit through a standard door, set up in front of the stage in the auditorium. Outside in the courtyard was a diesel generator set with a thick cable leading inside. When you entered you could smell ozone, and when you sat down in the front row you could feel the heat from the hundreds of vacuum tubes packed inside and hear the fans keeping them cool. The only features on the black cabinets were a few lights. The professor giving the presentation said that it could do math, and demonstrated it making a calculation. The answer was presented in a series of lights on the panel of one of the machines and was interpreted by the professor. He talked about the future of computing, and lamented that the only thing holding computers back was their size and power requirements. The next year I got a transistor radio for Christmas and everything changed.
A couple of years later I received an analog computer for Christmas. It came in kit form and consisted of an oscillator, four rheostats, and two program boards. The way it worked was this: you would set the rheostat on the left to a number, set the rheostat in the center to a number, then turn the rheostat on the right until you heard no tone in the earpiece (tuning to a null), and it would be pointing at the answer (2 + 2 = 4). It wasn't very accurate and I soon lost interest; I think one of my younger cousins ended up with it.
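Just to illustrate the idea (this is my sketch, not anything from the kit's manual): the tone's loudness tracks the imbalance between the answer dial and the two input dials, and sweeping the answer dial until the tone dies out lands you on the sum.

#include <stdio.h>
#include <math.h>

int main(void)
{
    double left = 2.0, center = 2.0;   /* the two input dials */
    double best = 0.0, quietest = 1e9;

    /* "turn" the answer dial across its range, listening for the null */
    for (double x = 0.0; x <= 10.0; x += 0.01) {
        double tone = fabs((left + center) - x);  /* loudness ~ imbalance */
        if (tone < quietest) {
            quietest = tone;
            best = x;
        }
    }
    printf("null found with the dial at %.2f\n", best);  /* prints 4.00 */
    return 0;
}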
 
My first 1 Million Instructions Per Second computer was a Motorola 32032 processor single-board computer. It was a marvel.

One little correction which will date me too... It was not a Motorola CPU; it was the National Semiconductor NS32032, the first 32-bit general-purpose microprocessor on the market. Motorola had their 68020 and 68030.

Ariel
 
Hey, I built a Motorola 68000 evaluation kit; now that was a wonderful processor. The instruction set was like a supercharged PDP-11 with some VAX-style macroinstructions. A delight to program in assembler!

I never got a chance to try the Nat Semi processor. A shame.
 
You are spot on, Ariel! Yes, I got my brands swapped. So many years, so many chips, so many different assemblers. ;)
I remember being glad I didn't have to write my own assembler, as National Semiconductor had a rather solid one.
There were a lot of good processors over the years, but some never seemed to get a toehold in the market.
Some had innovative designs, but introduced features the market was not ready for, and never saw real sales numbers.

Over the years, I learned a sad truth about the industry: there were fewer jobs in computer design than there were in networking,
and the pay in networking was frequently better as well. I morphed into the networking side of things for higher pay and easier work.
It always seemed a bit of a paradox that the networking job was less technical and paid as well or better. This is especially true
of network security. I always felt the years spent designing computers gave me insight (which looks like intuition to others) into
solving networking and application issues.

We live in a day when those who know what an X register, a Y register, and an accumulator are grow less common. If you say
"post-indexed indirect addressing", you draw strange looks. Code has become bloated and sloppy, with re-entrancy and ease
of understanding given higher importance than economy of memory and raw speed. This was brought on by companies being destroyed
by losing a lead programmer, with the project languishing because nobody else understood the code. Amazing things used to be done with a
single 1K of memory; now C compilers eat 10K (or more) just to do "hello world", once you include the stdio library. The whole industry has
become more "abstract" and less literal and direct. Modern programmers (by and large) are not as "math nimble" as some of the old
goats who used to code. If you were to mention using a single register to hold multiple flag bits, for different states of a process in your
code, you would draw strange looks; you would be viewed as speaking strange voodoo to the current generation of coders. Perhaps the
last bastion of old-school coders would be those who write drivers for hardware. They still have to live in the land of flag bits, registers,
and multiple hardware states.
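For anyone who never saw the trick, here is a minimal sketch (my own illustration, nobody's production code) of one byte-wide "register" holding several independent state flags:

#include <stdio.h>

#define FLAG_READY  (1u << 0)   /* bit 0: unit initialized  */
#define FLAG_BUSY   (1u << 1)   /* bit 1: transfer running  */
#define FLAG_ERROR  (1u << 2)   /* bit 2: fault latched     */
#define FLAG_DONE   (1u << 3)   /* bit 3: transfer complete */

int main(void)
{
    unsigned char state = 0;              /* all flags clear       */

    state |= FLAG_READY | FLAG_BUSY;      /* set two flags at once */
    state &= (unsigned char)~FLAG_BUSY;   /* clear one flag        */
    state |= FLAG_DONE;

    if ((state & FLAG_DONE) && !(state & FLAG_ERROR))
        printf("finished cleanly, state = 0x%02X\n", state);

    return 0;
}

Eight states of a process in one byte, set, tested, and cleared with single instructions on most machines; that economy is exactly what the old 1K programs lived on.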

I am in my last decade of working in the industry, and find myself building up a small shop to tinker with machining after I retire.
You have to keep the mind active, or it fades away. I have watched too many brilliant people retire and their minds slowly erode, not
from anything biological, but rather from disuse. As with the human body, anything not used will atrophy.
 
Don't knock the 56k!

15 years ago I set up a remote weather station and USB camera at a friend's mountain home so they could monitor the weather conditions and driveway. Broadband only became available a few years ago. The system has dialed into an FTP site every hour to upload a photo and weather data for the last 15 years without fail.
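As a rough sketch of what that hourly job amounts to (the host, login, and file names here are made up, and the real XP box surely did it its own way), the upload step in C with libcurl looks about like this:

#include <stdio.h>
#include <curl/curl.h>

/* push one local file to the FTP site; returns 0 on success */
static int upload(const char *local, const char *url)
{
    FILE *fp = fopen(local, "rb");
    if (!fp)
        return -1;

    CURL *curl = curl_easy_init();
    if (!curl) {
        fclose(fp);
        return -1;
    }

    curl_easy_setopt(curl, CURLOPT_URL, url);
    curl_easy_setopt(curl, CURLOPT_UPLOAD, 1L);    /* STOR, not RETR      */
    curl_easy_setopt(curl, CURLOPT_READDATA, fp);  /* read body from file */
    curl_easy_setopt(curl, CURLOPT_USERPWD, "station:secret");  /* made-up login */

    CURLcode rc = curl_easy_perform(curl);
    curl_easy_cleanup(curl);
    fclose(fp);
    return (rc == CURLE_OK) ? 0 : -1;
}

int main(void)
{
    curl_global_init(CURL_GLOBAL_DEFAULT);
    /* one pass; the station repeats this every hour, dial-up and all */
    upload("driveway.jpg", "ftp://ftp.example.com/driveway.jpg");
    upload("weather.csv",  "ftp://ftp.example.com/weather.csv");
    curl_global_cleanup();
    return 0;
}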

Even more surprising is that the system was built on a five-year-old used HP PC running Windows XP that has run 24/7 for almost 20 years.

I recently asked if they wanted to update the system to take advantage of broadband, but they declined, asking why mess with what's working.

Remote weather stations are interesting. A really cool "old school" technology still in use is the meteor burst communications system used by the USDA's SNOTEL network. I think the system was initiated in the late '60s or early '70s...
 
Y'all should take a look at a Raspberry Pi. It's a full computer system with general-purpose I/O running Linux on a board the size of a credit card. The current version has built-in WiFi, Ethernet, USB, and HDMI video. More computing power than we used to send men to the moon, all for about $20.
 