So, I watched 'Triumph of the Nerds' earlier this week - a synoptic walk through the development of the personal computer, up through the Apple II. (It was kind of surprising that it stopped there, since I used the Apple IIe in 1984 and the film was made in 1996, but whatever.)
I thought it covered things fairly well from a hardware standpoint - from the original size of even the most basic computer to the 'impossible' miniaturizations that Steve Wozniak envisioned and brought about. It was interesting to get a glimpse of just how differently the people involved in the process view things.
On the other hand, it didn't get much into software development - it just touched briefly on what computers could be used for, what they were used for in the beginning, and how limited the applications were for the first PCs. Having grown up alongside the process of miniaturization coupled with increasing 'power', the hardware end of things just wasn't that fascinating to me. I'd much rather hear about how they decided what an OS should include, how they came up with new, viable applications, and what's involved in that process. And no, I'm not saying I want to learn source code (ugh!)...but the process sounds interesting.
All in all, the film was pretty basic - nothing earth-shatteringly new in it - and fairly narrow in its scope. If you're interested in PC history, you'll like it.