The beginning of Computering

Discussion in 'General Gaming' started by Franzh11, Oct 27, 2013.

  1. Franzh11

    Franzh11 Spirited Member

    Joined:
    Dec 11, 2011
    Messages:
    156
    Likes Received:
    2
    Video games directly descend from computering, whether we want it or not.

    And computering itself was kinda all created from mathematics.

    Yes, because at first we wanted to create powerful tools that would help us create content digitally, using real-life mathematical equations so the content would stay consistent.

    Yesterday I found an interesting video on YouTube about the beginning of computers; as we might not all know, it all starts with bytes.

    A single byte is like a piece of data, either empty or full.

    Imagine we then want to create a full-scale picture: it would probably take millions of bytes, each of them having its own color.
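    As a rough back-of-the-envelope sketch (assuming a hypothetical 1024x768 picture with one byte per red/green/blue channel):

```python
# Rough size of an uncompressed picture, assuming a hypothetical
# 1024x768 image with 3 bytes per pixel (one byte each for
# red, green, and blue).
width, height = 1024, 768
bytes_per_pixel = 3

total_bytes = width * height * bytes_per_pixel
print(total_bytes)                  # 2359296 bytes
print(total_bytes / (1024 * 1024))  # 2.25 MiB
```

    So even a modest screen-sized picture is already a couple of million bytes before any compression.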

    So at first, people wanted to create powerful tools that would help them draw schematics without doing it by hand, to avoid precision mistakes. That way a computer could create the schematic we had in mind perfectly, counting each bit precisely, and then we would be able to create really high-performance objects.

    I guess that's why and how the consoles came out. Games were huge data files built like mathematical equations, using variables and such; the console can then read the file and interpret it.

    Maybe something you'll be interested in watching:

    http://www.youtube.com/watch?v=AmwdTRChAUU

    http://www.youtube.com/watch?v=VxLv6jDPpng
     
    Last edited: Oct 27, 2013
  2. RetroSwim

    RetroSwim Site Supporter 2013

    Joined:
    Dec 10, 2012
    Messages:
    605
    Likes Received:
    26
    [image]
     
    Last edited: Oct 27, 2013
    MaxWar likes this.
  3. thequadehunter

    thequadehunter Active Member

    Joined:
    Sep 19, 2013
    Messages:
    32
    Likes Received:
    0
    "Silicon Graphics Super Workstation"

    Haha oh god.
     
  4. richterw

    richterw Rapidly Rising Member

    Joined:
    Aug 2, 2013
    Messages:
    80
    Likes Received:
    4
    Interesting stuff. Silicon Graphics were very good for the time. I miss those simpler times.
     
  5. RetroSwim

    RetroSwim Site Supporter 2013

    Joined:
    Dec 10, 2012
    Messages:
    605
    Likes Received:
    26
    Yeah, the videos are a fun look back on history.

    I have an SGI Indigo2 stashed away somewhere in my house, I should crack it out some time and do a video on it.
     
  6. DeckardBR

    DeckardBR Fiery Member

    Joined:
    Jul 3, 2008
    Messages:
    864
    Likes Received:
    2
    During the early and mid 1980s, home computers were just becoming popular and were used as game machines in many homes. Parents would rather buy their kids a home computer than a ColecoVision, since it was actually usable for something besides Asteroids. Back when the NES was out (1986-89) I would play it at friends' houses, but I knew the Amiga computer was out there with far superior graphics (though superior games is really arguable) and was what I really wanted.

    Ended up getting a Genesis at launch in '89, which thankfully ended up with a lot of Amiga ports. Also popular back then were the Sierra On-Line adventure games (King's Quest etc.), which were never on consoles, only PCs and Macs. Love the early compact Macs by the way, and I collect them.
     
  7. la-li-lu-le-lo

    la-li-lu-le-lo ラリルレロ

    Joined:
    Feb 8, 2006
    Messages:
    5,657
    Likes Received:
    238
    My thoughts exactly. Computering? Such a word does not exist in any dictionary I've seen.
     
  8. CrAzY

    CrAzY SNES4LIFE

    Joined:
    Nov 25, 2006
    Messages:
    1,737
    Likes Received:
    48
    Computering is a perfectly cromulent word.
     
  9. Tatsujin

    Tatsujin Officer at Arms

    Joined:
    Nov 24, 2005
    Messages:
    3,614
    Likes Received:
    6
    No, that's a bit. One byte can have 256 (0 ~ 255) different states, since it is made of 8 of those lil bits.
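    To put that in code, a quick Python sketch of how 8 bits give you 256 byte values:

```python
# A bit has 2 states; a byte is 8 bits, so it has 2**8 states.
BITS_PER_BYTE = 8
num_states = 2 ** BITS_PER_BYTE
print(num_states)  # 256

# Every value from 0 to 255 fits in one byte:
for value in (0, 127, 255):
    print(f"{value:3d} -> {value:08b}")  # value and its 8-bit binary form
```

    So "empty or full" is really a per-bit thing; the byte itself is one of 256 combinations of those bits.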
     
  10. rso

    rso Gone. See y'all elsewhere, maybe.

    Joined:
    Mar 26, 2010
    Messages:
    2,190
    Likes Received:
    447
    No. Computering doesn't know optimism or pessimism, only math, which includes rounding. So 0-127 = empty, 128-255 = full.

    Now floating point, this is where it gets complicated since it can represent intermediate values - e.g. to represent '2.75', you need three bytes, since you have to have two 'full' (255) and one 'three-quarter' (191) byte. To be exact, the last byte would actually need to be 191.25, which can't be represented - that's where the rounding errors come from. (The 'floating' means the decimal point 'floats' to wherever the last byte representing the fraction is located.)

    That's enough learnings for today now! :stupid:
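    (In all seriousness, for anyone curious, here's a minimal Python sketch of how 2.75 is actually stored as a 32-bit IEEE 754 float, using the standard struct module:)

```python
import struct

# Pack 2.75 into the 4 bytes of a 32-bit IEEE 754 float
# (1 sign bit, 8 exponent bits, 23 fraction bits).
raw = struct.pack('>f', 2.75)
print(raw.hex())  # 40300000

# 2.75 = 1.011 (binary) x 2^1, so it round-trips exactly:
print(struct.unpack('>f', raw)[0])  # 2.75

# 0.1 has no finite binary expansion, hence the rounding errors:
print(struct.unpack('>f', struct.pack('>f', 0.1))[0])  # ~0.10000000149
```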
     
    Last edited: Oct 28, 2013
  11. Evangelion-01

    Evangelion-01 Officer at Arms

    Joined:
    Mar 13, 2004
    Messages:
    3,114
    Likes Received:
    3
  12. sonicdude10

    sonicdude10 So long AG and thanks for all the fish!

    Joined:
    Jan 17, 2012
    Messages:
    2,573
    Likes Received:
    29
    I remember the good old days when a computer was much simpler and easier to understand. My father was a computer engineer and one of the top programmers on the Patriot system for Desert Storm. He got my grandparents an Amiga 1000. Still have it, and it's still working to this day. My grandfather had to replace a part or upgrade it back in the late 1980s and ripped several traces while removing the old part from the PCB. Since those systems were much simpler, with large traces and solder joints, he was able to run wires to repair the broken traces.

    I never did much with the system besides play classic games during the mid 1990s.

    Now we can't repair PC stuff due to the small traces and multilayer PCB designs. Sure, there are people out there who can, but it takes experience and special tools to do it by hand, not to mention a steady hand.

    I miss the old days when someone could easily understand their machine. Takes more studying today than it did back then to understand computing...
     
  13. rso

    rso Gone. See y'all elsewhere, maybe.

    Joined:
    Mar 26, 2010
    Messages:
    2,190
    Likes Received:
    447
    TBH most of the time you don't need to (because stuff has gotten very, very cheap - replacing an $80 32KB chip might've been a viable use of your time; replacing a $2 64MB one, not so much) or want to (because it gets outdated fast... though I feel that's slowed down a bit in recent years).

    Yes, today's systems have gotten very complex. But to get a grasp of the basics, you could learn the ins and outs of e.g. an Arduino (then work your way up from there if so desired). There'll always be a point where you go "nah, that's too much detail, I don't care any more" (did you know the layout of every transistor in your first Z80-based machine? No?), it's just that this point is always shifting upwards.
     
    Last edited: Oct 28, 2013
  14. sonicdude10

    sonicdude10 So long AG and thanks for all the fish!

    Joined:
    Jan 17, 2012
    Messages:
    2,573
    Likes Received:
    29
    What I was getting at is the e-waste factor. So many good electronics get trashed because they're either outdated or have a very small problem (like, say, a burned-out fuse).
     
  15. -=FamilyGuy=-

    -=FamilyGuy=- Site Supporter 2049

    Joined:
    Mar 3, 2007
    Messages:
    3,034
    Likes Received:
    891
    This, my friend, is the real beginning of computing.
    [image: transistor.gif]
    The first transistor, by John Bardeen and Walter Brattain at Bell Laboratories, circa 1947!
     
    BLUamnEsiac likes this.
  16. richterw

    richterw Rapidly Rising Member

    Joined:
    Aug 2, 2013
    Messages:
    80
    Likes Received:
    4
    Dude, look at that. At the time, you had to be a real genius to even think about doing something like this.
     
    Last edited: Oct 28, 2013
  17. -=FamilyGuy=-

    -=FamilyGuy=- Site Supporter 2049

    Joined:
    Mar 3, 2007
    Messages:
    3,034
    Likes Received:
    891
    Indeed, they got the Nobel Prize in Physics in 1956 for it: http://www.nobelprize.org/nobel_prizes/physics/laureates/1956/. They shared it with their boss Shockley, who later started a semiconductor company that didn't succeed, but whose employees went on to found Intel Corp...
     
    Last edited: Oct 28, 2013
  18. sonicdude10

    sonicdude10 So long AG and thanks for all the fish!

    Joined:
    Jan 17, 2012
    Messages:
    2,573
    Likes Received:
    29
    Now THAT is some history right there. Intel is THE de facto #1 semiconductor producer in the world. Just look at their CPU line.
     
  19. Tatsujin

    Tatsujin Officer at Arms

    Joined:
    Nov 24, 2005
    Messages:
    3,614
    Likes Received:
    6
    Yeah, but they weren't that big as a semiconductor manufacturer until the late '80s.
     
  20. sonicdude10

    sonicdude10 So long AG and thanks for all the fish!

    Joined:
    Jan 17, 2012
    Messages:
    2,573
    Likes Received:
    29
    True. Any company takes time to reach world-leader status. The fact that they've been #1 for around 25 years is a testament to this. I'd take Intel over AMD. Funny thing is, the PC I'm using right now as well as my laptop both have AMD APUs in them. Too cheap to afford another Intel/Nvidia system after my ASUS 1215N died... LOL
     