Video games directly descend from computing, whether we like it or not. And computing itself pretty much grew out of mathematics: at first we wanted powerful tools that could create content digitally using real-life mathematical equations, so the results would be consistent. Yesterday I found an interesting video on YouTube about the beginnings of computers. As we might not all know, it all starts with bytes: a single byte is a piece of data, either empty or full. Imagine we want to build a full-scale picture; it would probably take millions of bytes, each with its own color. At first, people wanted powerful tools that could draw schematics for us instead of doing it by hand, to avoid precision mistakes. If a computer could render the schematic we had in mind perfectly, counting each bit precisely, we could build really high-performance devices; I guess that's why and how consoles came about. They were basically huge data files built like mathematical equations, with variables and such, and the console reads the file and interprets it. Something you might be interested in watching: http://www.youtube.com/watch?v=AmwdTRChAUU http://www.youtube.com/watch?v=VxLv6jDPpng
Yeah, the videos are a fun look back on history. I have an SGI Indigo2 stashed away somewhere in my house; I should crack it out some time and do a video on it.
During the early and mid 1980s, home computers were just becoming popular and were used as game machines in many homes. Parents would rather buy their kids a home computer than a ColecoVision, since it was actually usable for something besides Asteroids. Back when the NES was out (1986-89) I would play it at friends' houses, but I knew the Amiga was out there with far superior graphics (though superior games is really arguable) and that was what I really wanted. Ended up getting a Genesis at launch in '89, which thankfully got a lot of Amiga ports. Also popular back then were the Sierra On-Line adventure games (King's Quest etc.), which were never on consoles, only PCs and Macs. Love the early compact Macs by the way, and I collect them.
No, that's a bit. One byte can have 256 (0-255) different states, since it is made of 8 of those lil bits.
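If it helps to see it, here's a tiny Python sketch (my own illustration, not from the videos) of why 8 bits give 256 states:

```python
# A byte is 8 bits, so it can take 2**8 = 256 different values (0..255).
BITS_PER_BYTE = 8
print(2 ** BITS_PER_BYTE)        # -> 256

# Each value is just a pattern of 8 "empty or full" bits:
for value in (0, 1, 191, 255):
    print(value, format(value, "08b"))
# 0   00000000
# 1   00000001
# 191 10111111
# 255 11111111
```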
No. Computing doesn't know optimism or pessimism, only math, which includes rounding. So 0-127 = empty, 128-255 = full. Now floating point, this is where it gets complicated, since it can represent intermediate values - e.g. to represent '2.75', you need three bytes, since you have to have two 'full' (255) bytes and one 'three-quarter' (191) byte. To be exact, that last byte would actually need to be 191.25, which can't be represented - that's where the rounding errors come from. (The 'floating' means the decimal point 'floats' to wherever the last byte representing the fraction is located.) That's enough learning for today! :stupid:
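The rounding-error part is real, though: in the standard (IEEE 754) floating-point format, some decimal fractions simply have no exact binary representation, so they get rounded. A minimal Python sketch of my own to show it:

```python
# 2.75 is exactly representable: it's 2 + 1/2 + 1/4, a finite sum of powers of two.
print(2.75)              # 2.75, no rounding needed

# 0.1 is not: in binary it's an infinitely repeating fraction, so it gets rounded.
print(f"{0.1:.20f}")     # 0.10000000000000000555...
print(0.1 + 0.2 == 0.3)  # False - the classic rounding surprise
print(0.1 + 0.2)         # 0.30000000000000004
```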
I remember the good old days when a computer was much simpler and easier to understand. My father was a computer engineer and one of the top programmers on the Patriot system for Desert Storm. He got my grandparents an Amiga 1000. Still have it, and it's still working to this day. My grandfather had to replace a part or upgrade it back in the late 1980s and ripped several traces while removing the old part from the PCB. Since those systems were much simpler, with large traces and solder joints, he was able to run wires to repair the broken traces. I never did much with the system besides play classic games during the mid 1990s. Now we can't repair PC stuff due to the small traces and multilayer PCB designs. Sure, there are people out there who can, but it takes experience and special tools to do it by hand, not to mention a steady hand. I miss the old days when someone could easily understand their machine. It takes more studying today than it did back then to understand computing...
TBH most of the time you don't need to (because stuff has gotten very, very cheap - replacing an $80 32KB chip might've been a viable use of your time; replacing a $2 64MB one, not so much) or want to (because it gets outdated fast... though I feel that's slowed down a bit in recent years). Yes, today's systems have gotten very complex. But to get a grasp of the basics, you could learn the ins and outs of e.g. an Arduino (then work your way up from there if so desired). There'll always be a point where you go "nah, that's too much detail, I don't care any more" (did you know the layout of every transistor in your first Z80-based machine? No?), it's just that this point keeps shifting upwards.
What I was getting at is the e-waste factor. So many good electronics get trashed because they're either outdated or have a very small problem (like, say, a burned-out fuse).
This, my friend, is the real beginning of computing. The first transistor, by John Bardeen and Walter Brattain at Bell Laboratories, circa 1947!
Dude, look at that. Back then you had to be a real genius to even think about doing something like this.
Indeed, they got the Nobel Prize in Physics in 1956 for it: http://www.nobelprize.org/nobel_prizes/physics/laureates/1956/. They shared it with their boss Shockley, who later started a semiconductor company that didn't succeed, but whose employees went on to found Intel Corp...
Now THAT is some history right there. Intel is THE de facto #1 semiconductor producer in the world. Just look at their CPU line.
True. Any company takes some time to reach world-leader status, and the fact that they've been #1 for around 25 years is a testament to that. I'd take Intel over AMD. Funny thing is, the PC I'm using right now as well as my laptop both have AMD APUs in them. Too cheap to afford another Intel/Nvidia system after my Asus 1215N died... LOL