Programmable/useful digital computers predate the transistor: http://en.wikipedia.org/wiki/Z3_(computer)

For a small sample of computational history:
http://en.wikipedia.org/wiki/Charles_Babbage
http://en.wikipedia.org/wiki/Ada_Lovelace
http://en.wikipedia.org/wiki/George_Boole
http://en.wikipedia.org/wiki/Von_Neumann
http://en.wikipedia.org/wiki/Alonzo_Church
http://en.wikipedia.org/wiki/Alan_Turing

Analog computers were also in heavy use before digital computers were commonplace. Technically, Pong is a mixed-signal (analog-digital) game, and its predecessor Tennis for Two might have been entirely analog.
Intel is #1 because of backroom dealings with OEMs that leave AMD out of the deal, as it did with Cyrix and Transmeta before. Microsoft is #1 because of monopoly practices and mass licensing deals with governments and big corporations. Apple is #1 thanks to clever marketing and putting form way above function; case in point: the new Mac Pro.
Of course, why didn't I think of vacuum tubes and mechanical computers? I guess I was in the "personal computing" mood. Let me quote Popular Mechanics here:
We're heading back in that direction--people want dumb-ish terminals (mobile, tablets) running web apps, which over time will be hosted by fewer and fewer "cloud" companies.
AFAIK it was; op-amps and passive circuitry. Unsurprisingly, it was made by a physicist! Edit: now I wanna build it so much! (http://scienceblogs.com/brookhaven/...6c2364076df19e409648a-VideogameSchematic1.jpg)
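For anyone curious how op-amps "compute" a trajectory, here's a rough digital sketch of the idea (Python, purely illustrative; the step size, constants, and bounce factor are all made up): an integrator's output is the time-integral of its input, so chaining two of them turns a constant acceleration into velocity and then position, which is basically what the Tennis for Two circuit does before drawing the result on an oscilloscope.

# Toy model of how an analog circuit computes a ball's arc:
# an op-amp integrator outputs the time-integral of its input, so
# acceleration -> (integrator) -> velocity -> (integrator) -> position.
# All values here are invented; this is the idea, not the schematic.

DT = 0.001          # integration step, seconds (hypothetical)
G = -9.8            # constant "acceleration" input

def integrate(value, rate, dt=DT):
    """One step of an ideal integrator: output accumulates input over time."""
    return value + rate * dt

vy, y = 4.0, 0.0    # initial vertical velocity and position (arbitrary)
trajectory = []
for _ in range(1000):
    vy = integrate(vy, G)    # first integrator: acceleration -> velocity
    y = integrate(y, vy)     # second integrator: velocity -> position
    if y < 0:                # hitting the "ground"
        vy = -0.7 * vy       # lossy bounce, like the game's bounce circuit
        y = 0.0
    trajectory.append(y)

print(max(trajectory))       # peak height of the arc drawn on the scope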
Isn't it funny how history repeats itself? We're also working on going back to tube technology, although not in a vacuum and on a much smaller scale.

100% agreed, but I feel Apple has something unknown up their sleeve. I'm not sure exactly what it is, but I have a hunch that they are up to something. The already slow innovation slowed down even more after Jobs, and I can't imagine Steve Jobs picking Tim Cook to replace him if he didn't think he was 100% up to the task at that exact point, and I think he would have made sure his colleagues agreed as well. At least that's the impression I get from all of the documentaries and other reading I've done on him. It just seems weird that they are hiring so many extra people but nothing big is coming out of them other than hardware updates and fixes, with the exception of the Mac Pro.

All in all, I'm starting to like Microsoft less and less and Apple more and more. But then Apple ends up doing something to upset me (Lion Server, for example). I will say that Server 3 has come a long way, though, and is a fairly good product. Hell, I just installed Debian as my main OS on my desktop a few weeks ago; I never thought that would happen.
Oh yeah... I remember telephone engineers working on those huge cable panels in movies from the '30s and '40s... that was probably a computer. Thanks for this little explanation. I never knew much about what a byte and a bit are and such... it's just great how far the potential of electricity goes... it all derives from math, and you'd have to be a genius to code based on reality's physics.
I've been trying to make my own game (in Flash) and never actually got far... it kinda takes a lot of time to pick up coding and get used to it enough to write your own. But I'd like to create a little game one day, maybe, just out of curiosity... and yeah, these SGI computers seem so cool for their time tho. haha.
If you are talking about when you would see the telephone operator with the big panel of jacks, manually plugging wires from one point to another in order to connect the call, that is not a computer but a simple patchbay. Each input and output connection would be permanently wired to the back of the patchbay, with each getting a corresponding front jack that the operator would use to make connections, via patch cables, between the jacks. They are still very commonly used in recording studios. I won't get into things like normalling (default connections) and half-normalling (multing or splitting) here, but the information is widely available if you are interested.
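If it helps to picture the routing logic, here's a tiny toy model (Python; all the jack names are invented, and this only sketches full normalling, not half-normalling): every output jack has a default destination wired on the back, and inserting a patch cable breaks that default and re-routes the signal.

# Toy sketch of normalled patchbay routing (names are invented).
# An output jack is "normalled" to a default input jack; plugging a
# patch cable into the output breaks the normal and re-routes it.

normals = {                       # default back-panel wiring
    "preamp_1_out": "console_ch1_in",
    "preamp_2_out": "console_ch2_in",
}
patches = {}                      # front-panel patch cables

def destination(output_jack):
    """Follow a signal from an output jack to wherever it ends up."""
    return patches.get(output_jack, normals[output_jack])

print(destination("preamp_1_out"))          # console_ch1_in (normalled)
patches["preamp_1_out"] = "compressor_in"   # insert a patch cable
print(destination("preamp_1_out"))          # compressor_in (normal broken)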