For 2006: [-] IBM G5 processor with 4 cores at 2.5GHz, 128KB of level 1 cache and 512KB of level 2 cache [-] Dual-core ATI RN520 graphics processor. Wow... hope this is real.
A solid indication of the timeline for the launch of Nintendo's next home console has emerged from memory firm MoSys Inc, which has revealed that it will supply memory for the system, with "mid-2006" touted as the launch date.

MoSys previously provided the 1T-SRAM technology used by NEC for the GameCube's memory, and speaking in a live conference call following the announcement of the firm's Q1 earnings, CEO and CFO Mark Voll said that it would again be fulfilling this role for Revolution. "During the quarter we announced that NEC Electronics will now use our 1T-SRAM embedded memory technologies on their advanced 90nm process, and that the initial designs to be incorporated in SoCs will be used in Nintendo's next-generation game console, codenamed Revolution," he said.

The most interesting part came next, however, when Voll commented that: "We are excited to be a participating member of the Nintendo team once again as Nintendo will roll out its successor game console to the GameCube in mid-2006." This is the first solid evidence that the Revolution platform is still on track for a mid-2006 launch. The console is expected to debut at a pre-E3 conference next Tuesday, but it's still not known just how much will be on display, with sources close to Nintendo suggesting that only a pre-recorded video of "next-gen footage" may be shown.

MoSys didn't reveal how much RAM would be going into each Revolution console, but in an unrelated story also doing the rounds about Revolution today, Chinese website Unika.com claims to have seen an actual specification for the hardware. According to the site, the console will boast four 2.5GHz IBM G5 Custom cores, with 128KB of level 1 cache and a 512KB shared level 2 cache, while the graphics will be powered by a dual-core ATI RN520 chipset, with 16MB of on-board eDRAM for the frame buffer.

While both of those specifications seem eminently possible, not least because IBM and ATI are confirmed as Nintendo's hardware partners for the console, we've been unable to find any confirmation or denial of the figures, simply because no developers outside of Nintendo's tightly sewn-up inner circle actually have Revolution details, let alone devkits, as yet. Eurogamer
Using half guesswork, half piss-poor Spanish: IBM G5 processor with 4 cores at 2.5GHz, 128KB of level 1 cache and 512KB of level 2 cache (I'm not certain if this is per core or total), and a dual-core ATI RN520 graphics processor.
It's not too hard for Americans to get, but if you're from another country it can be hard, I guess. But that is why they invented http://world.altavista.com
Yeah, I don't quite understand this shift towards multicore processors in gaming. It sounds like the X360, the Revolution and, as far as I understand, the PS3 are all using multicore processors (or some kind of parallel processing). Isn't it very hard to code for multiple simultaneous threads in games, and, as the Saturn proved, rather hard to get maximum performance out of? Perhaps we're just reaching a point where one speedy processor alone isn't fast enough, and multiple fast processors are needed?
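To make the "hard to code for" point concrete, here's a tiny C++ sketch (purely illustrative, nothing to do with any actual console SDK) of the classic hazard: two threads touching the same piece of game state. Without the lock, the increments trample each other and the final total is unpredictable.

[code]
// Two threads both update the same shared score counter.
// The mutex makes the increments safe; remove the lock_guard
// line and the result becomes garbage on most runs.
#include <iostream>
#include <mutex>
#include <thread>

int score = 0;            // shared game state
std::mutex score_mutex;   // guards every access to 'score'

void award_points(int times) {
    for (int i = 0; i < times; ++i) {
        std::lock_guard<std::mutex> lock(score_mutex);
        ++score;
    }
}

int main() {
    std::thread a(award_points, 100000);
    std::thread b(award_points, 100000);
    a.join();
    b.join();
    std::cout << "score = " << score << "\n"; // 200000 only thanks to the lock
}
[/code]

Multiply that one counter by every object in a game world and you can see why single-threaded code was so much easier to get right.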
Offtopic: Aaaa, was that really a necessary post? I think it had something to do with cheese. Seriously though, there's a reason why everyone posts in English on this board, and that's because it's a common language for all of our denizens. News articles don't get posted in Japanese for the same reason they shouldn't be posted in Spanish: not everyone speaks or reads it. Yes, we could all get the general gist of it, or take it to Google or Babel Fish for translation, but that's a little ridiculous on an English-speaking board. It's not nearly as much an issue of laziness on our part as it is of courtesy on yours.
A 4-core CPU? In all honesty, what do we need that for? A dual-core version of ATI's upcoming graphics chip? Again, what do we need this stuff for? Are devs getting lazy and don't feel like really optimizing their stuff anymore, so they just demand extremely powerful consoles to run their junk? In the end, won't all the multithreading etc. make these consoles HARDER to develop for, rather than just giving some "UBER" speed boost? It's not as if consoles really multitask like a desktop. I thought Nintendo said their next console was going to be not so much about graphics and more about gameplay.
Personally, I think that with multi-core/multi-processor consoles, we will EVENTUALLY see programmers completely separate game aspects and assign them to individual units. Setting aside the actual difficulty of maintaining and synchronizing multiple threads, it is perfectly feasible to imagine one processor dedicated to AI, one to level control (i.e. positioning and determining visible components), one to I/O processing (controller input and networking), and possibly one for overall system control; see the sketch below. We have already seen graphics and sound processing segregated onto dedicated hardware, and this would just allow programmers to go a step further.
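As a rough sketch of that subsystem-per-thread idea (the loop names and timings here are entirely made up for illustration; no real console API is implied):

[code]
// Toy example: one thread per game subsystem, with the main
// thread acting as overall system control. Each loop would run
// its subsystem's work at its own rate on its own core.
#include <atomic>
#include <chrono>
#include <iostream>
#include <thread>

std::atomic<bool> running{true}; // shared shutdown flag

void aiLoop() {
    while (running) {
        // ... run pathfinding, enemy decisions, etc. ...
        std::this_thread::sleep_for(std::chrono::milliseconds(33));
    }
}

void ioLoop() {
    while (running) {
        // ... poll controllers, service the network socket ...
        std::this_thread::sleep_for(std::chrono::milliseconds(8));
    }
}

int main() {
    std::thread ai(aiLoop); // could be pinned to its own core
    std::thread io(ioLoop); // likewise
    std::this_thread::sleep_for(std::chrono::seconds(1)); // "game" runs briefly
    running = false;        // system control signals shutdown
    ai.join();
    io.join();
    std::cout << "shut down cleanly\n";
}
[/code]

The hard part, as the post above notes, isn't spawning the threads; it's synchronizing the data they share (AI needs positions, level control needs input) without the kind of race shown earlier in the thread.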
The big chip companies are going multi-core because they are having problems ramping clock speeds above their current levels without huge heat problems. A couple of years back Intel was making a lot of noise about how the P4 was going to reach 10GHz; you don't hear them talking about it any more. Check their roadmap: the future is multi-core. Anyone interested should read the Anandtech articles on why Prescott failed, and why multi-core is inevitable: http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2343 http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2377
Well, I do listen to a vast amount of Iron Maiden. Anywhom, back to the multi-core subject: I'm hoping it might weed out some of the shite in the market by being harder to code for. It is considerably easier to code for one processor than for multiple, so hopefully the piss-poor programmers (yes, I'm looking at you, Driver 3 et al.) will either: - look shite compared to decently coded games, due to not using the machine's full power, OR - learn to code for multiple cores, and hopefully learn to write nice code while they're at it.
While it might be a good way to weed out lazy and crappy programming, it could also have the opposite effect. The Saturn and Jaguar both used nonstandard architectures as far as processors and co-processors went, and the problem there was that few could really harness the power of those machines: in the Jaguar's case there were only a handful of excellent games, and the Saturn was largely untouched by developers outside of Japan because of its difficult nature (not to mention that everyone wanted 3D polygons, not the sprites that were the Saturn's strong suit).
It's a good way of forcing smaller devs out of the industry, too, which doesn't seem like a good thing to me.