A long time ago, as most of us know, consoles were sold as bit systems. Obviously 32-bit was better than 16-bit, and the games and gameplay were what made a console. Fast forward to the PS2, GameCube, Dreamcast era: those were the last of the true "bit machines". Since the Xbox, all console manufacturers do is describe their console in computer terms: "oh, the Wiipsbox has this many CPUs and this graphics card with this much RAM". I do realise that RAM and CPU speed etc. were mentioned in the early days as well; it just wasn't pushed as much as it is nowadays. Could it have been that most people back then were too computer illiterate and didn't know their RAM from their CPU?
I'm pretty sure it was because at one stage they stopped throwing higher-bit processors into the mix and instead started using the bits they had more intelligently (parallel processing, etc.). Obviously there's more to it than that, but essentially, producing higher and higher bit machines becomes impractical versus squeezing power out of better system-building techniques.
Exactly as you say, OP. Most people didn't even know what bits were - but hell, they're offering more than Nintendo, that must be awesome! Nowadays people understand what RAM is and what its purpose is, and the same for the CPU, GPU, etc. It was just an effective way to market stuff.
I do remember thinking of the DC and PS2 as "128-bit", but the impact had passed by the time the GC/Xbox were out.
Probably because once the Xbox came around, marketing didn't want to look inferior by being only a 32-bit console.
Also, carts used to be marketed by size, such as "four mega". That kind of died with CD-based systems, since there isn't really a size limit; games can even span multiple CDs or DVDs.
I think part of it was that going from 8-bit to 16-bit, and then from 16 to 32, broke through some really significant limitations at the time. It's only relatively recently that we've had hardware and software that could push up against the limits of 32-bit, and 64-bit is probably enough for the rest of eternity.
The DC and PS2 are 32-bit though; they aren't technically 128-bit. There is no such thing as a true 128-bit CPU.
Unless x86 is the exception, the ability to address more than 4 GB of RAM is more than enough reason to jump from 32-bit to 64-bit. Reason enough right there. What became irrelevant was the marketing department bragging about bits.
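For anyone curious, here's the arithmetic behind that 4 GB ceiling as a quick sketch in plain C (nothing here is specific to any console or OS, it's just pointer-width math):

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* A 32-bit pointer can name 2^32 distinct byte addresses. */
        uint64_t bytes_32 = (uint64_t)1 << 32;
        printf("32-bit address space: %llu bytes = %llu GiB\n",
               (unsigned long long)bytes_32,
               (unsigned long long)(bytes_32 >> 30));  /* prints 4 GiB */

        /* A 64-bit pointer raises the ceiling to 2^64 bytes (~16 EiB),
           which is why the 4 GB RAM limit disappears on 64-bit systems. */
        printf("64-bit address space: 2^64 bytes (about 16 EiB)\n");
        return 0;
    }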
That is true, but Sega and (to a lesser extent) Sony marketed those machines as 128-bit. It's really an antiquated term for something relatively irrelevant. For example, the Genesis WASN'T the first 16-bit system; the Intellivision was. I think the idea of "bits = power" died out with the N64, because it wasn't really twice as powerful as the PS1, for other reasons (its carts had a max size of 64 MB, as opposed to 700-ish MB for a CD, just to name one).
The reason they talk about the number of CPUs and the amount of RAM is that computers are marketed like that, so it's familiar. In the old days, particularly with Sega's US marketing campaign, they had to take simple concepts and put them into commercials and ad material. Listing computer specifications doesn't stick like something simple does. Saying "Hey, SEGA GENESIS is 16-BIT! That Nintendo is only 8-bit!" is a good thing to say in marketing, because any consumer would assume the Sega Genesis is twice the product. Same with Blast Processing: a simple concept that sticks. That's why we had the whole "bits" thing back then. Never mind how irrelevant it was.
On the computer side it's a bit more convoluted than that. What's currently referred to as x64 (64-bit) is simply an enhanced instruction set; no current consumer-grade computer runs on a true 64-bit architecture. Like you mentioned, they do use 64-bit registers to allow memory addressing over 4 GB. Intel does have a handful of CPUs currently offering that, but they're dedicated to the server market. AMD was the first processor manufacturer to release pseudo-x64 CPUs, back in 2003, long after console manufacturers stopped naming their consoles after bits. Also, what I was referring to was this TV advertisement, and Atari mislabeling their console as 64-bit.
What are you talking about? I think most modern computers allow for more than 4 GB of RAM, not just servers; my computer has 16 GB, for example. And here's a question: regardless of whether it was relevant or not, what did the number of bits refer to, exactly? I've never really heard a succinct explanation.
Well, like he said, x64 computers do use 64-bit memory registers to allow more than 4 GB of memory, but they're not "true" 64-bit systems. As for the bit thing, it's probably easiest to explain by saying that the bits refer to the width of a register, i.e. how many distinct values one register can hold: an 8-bit register gives 2^8 = 256 values (0 through 255), a 16-bit register gives 2^16 = 65536, and so on.
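A tiny illustration of that, using C's fixed-width integer types as stand-ins for registers of each width (just a sketch, not tied to any particular console's hardware):

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* An N-bit register holds 2^N distinct values, 0 through 2^N - 1. */
        printf("8-bit:  %u values, max %u\n", 1u << 8, (unsigned)UINT8_MAX);
        printf("16-bit: %u values, max %u\n", 1u << 16, (unsigned)UINT16_MAX);
        printf("32-bit: 2^32 values, max %u\n", (unsigned)UINT32_MAX);
        printf("64-bit: 2^64 values, max %llu\n",
               (unsigned long long)UINT64_MAX);
        return 0;
    }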
What would make it a "true" 64-bit system, then? And is a Nintendo 64 a "true" 64-bit system? I should think not.
The Jaguar is also a bit misleading. The way it's set up, with five chips, one could code a game to run only on the 16-bit 68k CPU and leave the other four unused. That wouldn't make it a 64-bit game, even though it runs on a "64-bit" system.