View Full Version : History of Computer Manufacturing - what's your favourite book?



stevesy
August 22nd, 2013, 11:02 PM
Mine is "Accidental Empires". Straight to the point, comprehensive and very funny! Well worth a read : ]

squakie
August 23rd, 2013, 05:19 AM
Try any of the 6 to 8 foot long racks of manuals from the old mainframe days. I had hours of "light" reading through those at the time.

stevesy
August 23rd, 2013, 09:26 AM
Try any of the 6 to 8 foot long racks of manuals from the old mainframe days. I had hours of "light" reading through those at the time.

Oh god. I'll pass ; ]

3rdalbum
August 23rd, 2013, 11:44 AM
Not quite a history of computer manufacturing, but Hackers by Steven Levy is awesome. You get the impression he was really there during everything he writes about (although he wasn't).

squakie
August 24th, 2013, 04:15 AM
Are we talking back in the days when hacking was a good thing - I was involved in that a lot - or the newer meaning of the word?

3rdalbum
August 24th, 2013, 02:40 PM
Are we talking back in the days when hacking was a good thing - I was involved in that a lot - or the newer meaning of the word?

Original meaning of the word. I'll elaborate on the book: The concept of hacking was really born at MIT, when some members of the Tech Model Railway Club came across a TX-0 minicomputer and started playing around on it (hacking) for fun. They did some pretty amazing things on that computer, and on future computers that were installed at MIT.

The second part is about hardware hackers of the 1970s who built their own computers, and then started to form businesses around personal computing.

The third part is about games programmers, programming for the Apple II and the other personal computers that were around in the 1970s and early 1980s.

The last part, almost an appendix, is about Richard Stallman.

A great read from start to finish, but the first part about the MIT hackers is my favourite part. It's really funny, and also amazing what they were able to do with such limited computing power.

stevesy
August 24th, 2013, 10:07 PM
squakie, how would you compare the computer industry in your day to nowadays - did you prefer it then or now? Biggest changes you've seen? Anything you miss? Would love to hear from you and others who've lived through the whole thing. I think I've just hijacked my own thread lol, oh well, hopefully that's allowed : ]

stevesy
August 24th, 2013, 10:32 PM
Not quite a history of computer manufacturing, but Hackers by Steven Levy is awesome. You get the impression he was really there during everything he writes about (although he wasn't).

Will definitely check that book out, 3rdalbum - sounds exactly like what I'm after. Thanks for the recommendation! : ] Have you read "Accidental Empires (http://kickass.to/accidental-empires-robert-cringely-ebook-t6271502.html)" by Cringely?

lisati
August 24th, 2013, 11:11 PM
Try any of the 6 to 8 foot long racks of manuals from the old mainframe days. I had hours of "light" reading through those at the time.

I still have one or two from my days using mainframes; they're gathering dust somewhere in a box. One of my favourites is "The Soul of a New Machine. (http://en.wikipedia.org/wiki/The_Soul_of_a_New_Machine)"

squakie
August 25th, 2013, 06:25 AM
I almost had every one of the dang manuals for the systems I worked on memorized - I say almost because of the following:

In the early 80's my girlfriend worked as a systems analyst/programmer in the IS department where I was the sys admin and systems programmer. She came around one day asking about a problem - I took her to the manuals, showed her the appropriate information and explained it to her. The next day - the very same thing. In those days I was an "ask me once, twice at most" kind of guy. She came back a third day with the same thing and I told her the answer. She replied, "That's not right - on page x, paragraph x it says..." and quoted exactly what it said - you see, she had one of those photographic (or whatever they call it) memories. When we got into arguments I would sometimes reverse my position the next time the same thing came up, and that would drive her nuts!

BTW - some of the hardware I worked with in the very early 1970's was mid-1960's "stuff", and on a visit to Boston many years ago now, my girlfriend and I found that same hardware in the computer museum. That's when you learn how antiquated you are as well.

Today versus then? They are 2 completely different worlds. We had limited CPU power and limited memory, even on the mainframes, compared to today. The main difference was that we didn't have GUIs eating up resources, and we had very tight operating systems. We would have 200+ users online, many batch jobs running, programmers programming and testing, etc., all on a CPU below the MHz rating we would use today and with memory measured in megabytes. But the internal architecture of the systems was very different as well. If you moved away from business systems toward the scientific side, faster processors and more memory were a must-have.

The other big difference I see: as sys admins and systems programmers we had to know everything inside and out. The internals of everything - even communications protocols, for example: putting scopes on lines and tracing the bit or byte flow to see what headers, what data, etc., actually went across a given line. Reading memory dumps, following machine-level code, etc. Most of the early programmers knew all of this as well. Before the advent of virtual memory, we used to write code to write out pieces of data, swap new data in, then execute code that was actually in that data. Today's operating systems separate memory better, so it's harder to just drop some code in some data and execute it.
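To make that last point concrete, here's a minimal Python sketch (Linux/Unix assumed, since the mmap PROT_* constants are Unix-only) of how a modern OS keeps data pages writable but not executable by default:

```python
import mmap

# Ask the kernel for one anonymous page that is readable and writable
# but NOT executable - the default posture for data on a modern OS.
page = mmap.mmap(-1, mmap.PAGESIZE,
                 prot=mmap.PROT_READ | mmap.PROT_WRITE)

page.write(b"\x90" * 16)   # writing "code" bytes (x86 NOPs) is fine...
page.seek(0)
machine_code = page.read(16)

# ...but without PROT_EXEC the CPU would fault on any attempt to jump
# into this page. Executable memory has to be requested explicitly
# (mmap/mprotect with PROT_EXEC), and hardened kernels may refuse
# writable-and-executable mappings ("W^X") altogether.
print(machine_code == b"\x90" * 16)
```

That explicit opt-in to executable memory is exactly what makes the old drop-code-into-data-and-jump trick hard today.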

Most important, though, is the perception the end-users had. After all, in a commercial or scientific setting, the computer is there to process and provide data to users - not just as a "toy" for people like I used to be. This has changed greatly - GUIs instead of screen-oriented programs. Instead of following the programmer's flow when working with the end result, now things are event driven (we were used to that from an OS-internals standpoint). Click a button - something happens. Click another button - something else happens, perhaps at the same time. True graphics - pictures and all - versus the line-level "drawing" we had (except on the scientific side).

I remember buying a bare board at our local electronics parts store during their once-a-month parking lot "come see the future" events. We had to locate the parts and solder them on ourselves. Had to make our own cables, write our own drivers for things like disks. Wanted another board to use in conjunction? We had to figure out how to make them "talk" to each other. Then things progressed and we had actual operating systems - things like CP/M and the like, and hardware with common buses (like the old S-100 bus, etc.). Then IBM entered the fray with their original PC, and things changed greatly. Bring in cloned hardware that was cheap, and things took off and eventually grew into what we have today. I also had the first personal hard disk out of our large group of "hackers". If I remember correctly it was something like 5 or 10 megabytes and sounded like a jet engine running right next to you.

Lots of big changes, huge differences from the past to now, yet each served the needs at the time.

stevesy
August 27th, 2013, 12:32 AM
I almost had every one of the dang manuals for the systems I worked on memorized - I say almost because of the following:

In the early 80's my girlfriend worked as a systems analyst/programmer in the IS department where I was the sys admin and systems programmer. She came around one day asking about a problem - I took her to the manuals, showed her the appropriate information and explained it to her. The next day - the very same thing. In those days I was an "ask me once, twice at most" kind of guy. She came back a third day with the same thing and I told her the answer. She replied, "That's not right - on page x, paragraph x it says..." and quoted exactly what it said - you see, she had one of those photographic (or whatever they call it) memories. When we got into arguments I would sometimes reverse my position the next time the same thing came up, and that would drive her nuts!

BTW - some of the hardware I worked with in the very early 1970's was mid-1960's "stuff", and on a visit to Boston many years ago now, my girlfriend and I found that same hardware in the computer museum. That's when you learn how antiquated you are as well.

Today versus then? They are 2 completely different worlds. We had limited CPU power and limited memory, even on the mainframes, compared to today. The main difference was that we didn't have GUIs eating up resources, and we had very tight operating systems. We would have 200+ users online, many batch jobs running, programmers programming and testing, etc., all on a CPU below the MHz rating we would use today and with memory measured in megabytes. But the internal architecture of the systems was very different as well. If you moved away from business systems toward the scientific side, faster processors and more memory were a must-have.

The other big difference I see: as sys admins and systems programmers we had to know everything inside and out. The internals of everything - even communications protocols, for example: putting scopes on lines and tracing the bit or byte flow to see what headers, what data, etc., actually went across a given line. Reading memory dumps, following machine-level code, etc. Most of the early programmers knew all of this as well. Before the advent of virtual memory, we used to write code to write out pieces of data, swap new data in, then execute code that was actually in that data. Today's operating systems separate memory better, so it's harder to just drop some code in some data and execute it.

Most important, though, is the perception the end-users had. After all, in a commercial or scientific setting, the computer is there to process and provide data to users - not just as a "toy" for people like I used to be. This has changed greatly - GUIs instead of screen-oriented programs. Instead of following the programmer's flow when working with the end result, now things are event driven (we were used to that from an OS-internals standpoint). Click a button - something happens. Click another button - something else happens, perhaps at the same time. True graphics - pictures and all - versus the line-level "drawing" we had (except on the scientific side).

I remember buying a bare board at our local electronics parts store during their once-a-month parking lot "come see the future" events. We had to locate the parts and solder them on ourselves. Had to make our own cables, write our own drivers for things like disks. Wanted another board to use in conjunction? We had to figure out how to make them "talk" to each other. Then things progressed and we had actual operating systems - things like CP/M and the like, and hardware with common buses (like the old S-100 bus, etc.). Then IBM entered the fray with their original PC, and things changed greatly. Bring in cloned hardware that was cheap, and things took off and eventually grew into what we have today. I also had the first personal hard disk out of our large group of "hackers". If I remember correctly it was something like 5 or 10 megabytes and sounded like a jet engine running right next to you.

Lots of big changes, huge differences from the past to now, yet each served the needs at the time.

Fantastic - thanks very much for sharing your experience, squakie, appreciate it. Better than a book : ]

Doug S
August 27th, 2013, 01:07 AM
One of my favourites is "The Soul of a New Machine. (http://en.wikipedia.org/wiki/The_Soul_of_a_New_Machine)"

That is my old favorite also.

lisati
August 27th, 2013, 01:46 AM
Lots of big changes, huge differences from the past to now, yet each served the needs at the time.
Strange, but while I read this, I recalled that the programming environment on the mainframes I used back in the 1980s was based around 31-bit addressing. :D

squakie
August 29th, 2013, 11:39 PM
Strange, but while I read this, I recalled that the programming environment on the mainframes I used back in the 1980s was based around 31-bit addressing. :D

There were a lot of strange things that came out/happened along the way, but 31-bit addressing is either a new one to me or my "recreational activities" have blurred things ;)

Do you remember what system it was?

EDIT: Jeez, the 31-bit OSes for the IBM mainframes - can't remember, but I think it was something to get around the old 24-bit addressing limitations (perhaps the old MVS?)
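For what it's worth, 31-bit addressing came in with IBM's S/370-XA architecture (MVS/XA, early 1980s), and it was indeed a way around the 24-bit limit: the top bit of a 32-bit address was kept as an addressing-mode flag, leaving 31 bits for the address itself. The arithmetic, sketched in Python:

```python
# The 32nd bit of an address served as the addressing-mode flag
# (AMODE 24 vs AMODE 31), so only 31 bits were left for the address.
MB = 2 ** 20
space_24 = (2 ** 24) // MB   # old 24-bit limit: 16 MB
space_31 = (2 ** 31) // MB   # 31-bit limit: 2048 MB (2 GB)
print(space_24, space_31)    # 16 2048
```

Reserving that one bit is why the number is the otherwise odd-looking 31 rather than 32.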

squakie
August 30th, 2013, 10:32 AM
I probably should add to my adventures: when what was called a mini-computer came out, we had to learn those as well, inside and out. Along with that came things like LANs instead of async or sync communications - things like TCP/IP, all the fun of servers, etc. Then of course PCs were added to the fray, along with their various networking and server products. Throw in things like token ring, etc., and things just grew.

Most everything I did know is now long forgotten, including most of my networking knowledge, although I bet I could still read a dump - segment offsets, machine code and all - from those days. Makes me feel like a complete idiot when I have to post asking for help on something, and yet there are times when I have to laugh when someone makes a mis-statement about when/where something originated - they're either too young to know or they jumped into computing after those times had passed, and therefore assume things.

There used to be multiple character sets and collating sequences - EBCDIC (you don't really hear that anymore), octal representations on many minis (DEC's and a lot of others running unix-like or unix-derived OSes), ASCII, etc. Anymore I *think* people mainly deal in ASCII now. I really don't hear much of anyone talking about escape sequences, etc., now either.

But like I said - 2 different worlds for 2 different times, each fitting the needs at the time. I personally wouldn't say one is better than the other - I've seen far more garbage code in today's world, and to me at least today's OSes, while open instead of proprietary, seem to need more resources to accomplish a similar job. However, for the end user, I would say that today's world is much more productive and user-friendly.
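As a footnote to the collating-sequence point, the EBCDIC-versus-ASCII difference is easy to demonstrate with Python's built-in EBCDIC codec (code page 037): in ASCII, digits sort before uppercase before lowercase, while in EBCDIC the lowercase letters come first and the digits last.

```python
words = ["Alpha", "alpha", "9lives"]

# ASCII byte order: digits (0x30-) < uppercase (0x41-) < lowercase (0x61-)
ascii_order = sorted(words)

# EBCDIC cp037 byte order: lowercase (0x81-) < uppercase (0xC1-) < digits (0xF0-)
ebcdic_order = sorted(words, key=lambda s: s.encode("cp037"))

print(ascii_order)    # ['9lives', 'Alpha', 'alpha']
print(ebcdic_order)   # ['alpha', 'Alpha', '9lives']
```

Same data, two different sort orders - the kind of thing that bit anyone moving files between a mainframe and anything else.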