Rumored Wii U Spec Sheet: True or False?
According to Wii U Daily, a specification sheet has surfaced from an “undisclosed Japanese developer” claiming that the Wii U is “50% more powerful” than the Xbox 360 and PS3:
- Quad Core, 3 GHz PowerPC-based 45nm CPU from IBM, very similar to the Xbox 360 chip.
- 768 MB of DRAM “embedded” with the CPU, and shared between CPU and GPU
- Unknown, 40nm ATI-based GPU
This is strange because IBM has suggested the Wii U will run on a Power7-based CPU, which would be significantly more powerful than the rumored chip.
Nintendo has reportedly been testing two versions of the console, one with 768 MB of RAM and one with 1 GB. The RAM is also made by IBM and is embedded with the processor on the same silicon die, which allows for higher bandwidth.
If accurate, these specs would place the Wii U squarely ahead of the PS3 and Xbox 360, each of which is over half a decade old, but not by much.
People are disappointed in how powerful the graphics are, but come on, we’re talking about Nintendo here; they’re not known for graphics comparable to Sony’s and Microsoft’s. Whether you think these specs are true or false is entirely up to you.
Most people are probably thinking, “there is no way a home console will have 768 MB of eDRAM.” If you ask me, Nintendo aren’t going to use eDRAM, certainly not 768 MB or 1 GB of it; the fact that they have been testing different configurations suggests as much. Nintendo are probably using stacked DRAM: three or four 256 MB DRAMs stacked on top of the CPU, possibly using TSVs (Through-Silicon Vias) for massive bandwidth and low power use. We could be talking about a 384/512-bit or even 768/1024-bit memory bus. More likely, it is simply stacked without TSVs and uses a 192/256-bit bus.

If the CPU is a POWER7 variant, there may be some additional eDRAM as cache on the CPU. If it’s a quad-core variant, there could be 16 MB of eDRAM cache (the octo-core POWER7 has 32 MB). With the memory on the CPU, there is likely a very high-bandwidth interconnect between the CPU and GPU as well, which would probably mean an MCM (Multi-Chip Module). Alternatively, the GPU could have its own graphics memory.
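Those bus-width figures translate directly into theoretical peak bandwidth: bytes per transfer times transfers per second. Here is a rough sketch of the arithmetic; note the rumor names no memory clock, so the 800 MHz effective rate below is purely a placeholder assumption, not a leaked figure.

```python
# Illustrative arithmetic only: theoretical peak bandwidth for the
# rumored bus widths. The 800 MHz effective clock is an assumption
# chosen as a placeholder, since the rumor specifies no memory clock.

def peak_bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Peak bandwidth = (bus width in bytes) * (transfers per second)."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

for width in (192, 256, 384, 512, 768, 1024):
    print(f"{width:4d}-bit @ 800 MHz: {peak_bandwidth_gb_s(width, 800):6.1f} GB/s")
```

At a fixed clock, doubling the bus width doubles peak bandwidth, which is exactly why a wide TSV-stacked interface would be attractive versus a conventional 128-bit external memory bus.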