I rather doubt it. The GPU the Wii U is using (going by the specs I've seen) looks like a relatively low-cost part as it is (the core config seems similar to a Radeon HD 5550 or 6530D, and the clock rate and process match the 5550 too). The CPU die is absolutely tiny as well; the GPU dwarfs it on the MCM.
I really doubt using an APU would have been cheaper overall . . . in fact, it would probably have been more expensive unless Nintendo dropped Wii compatibility and used a relatively gimped APU model that fell into a power consumption range similar to the Wii U's. (Otherwise: bigger PSU, bulkier case, larger heatsink/fan(s), etc., some of which are more aesthetic issues than actual cost.)
As it is, Nintendo seems to have cut the CPU clock rate for aesthetic (small case size) and/or cost reasons (same as above), so going with an APU shouldn't have been a deal breaker one way or the other.
From what I understand, the big added cost for the Wii U is that GamePad, and that's the real deal breaker here. Dropping it would have massively reduced the console's practical retail price, and perhaps allowed more leeway in actual hardware performance too (be it an AMD CPU/APU or a faster IBM one, and maybe a beefier GPU as well). For that matter, they could have added more modest features, like an upgraded Nunchuk with additional motion sensors, or just dual-wielding Wii Remotes: gimmicky, but that could be interesting for some things, including some hack-and-slash RPGs, adventure games, or even shooters, especially with dual pointers.
Perhaps they should have held off longer on the GamePad idea and initially pushed the MotionPlus-capable Wii Remote as the main controller to save costs (and maybe bumped up the internal hardware performance). They could have kept going with the features and gimmicks the Wii established, but with reasonably modern hardware and more refined use of them. That, and, again, they should have done this sooner. (Or, really, anyone should have done this sooner . . . seriously, a next-gen console in 2010 could have really cut into the market. If anyone wanted to push fantasies like "The Return of Sega," that would have been a pretty feasible time to do it.)
For 2012, though, they really needed beefier hardware than what they put out. With developers strangled by the aging, limited 360 and PS3, a half-decent new console (at least close to a good budget-build gaming PC) would have been something really interesting and compelling to a big chunk of developers. And with the Wii's established special features and gimmicks carried over, there'd be potential to work those in too.
If higher-end ARM/Android-based game platforms appear (perhaps Project Shield), that may change things somewhat too, not to mention a potential shift toward open-source platforms given the questionable direction MS's current OS platforms are heading (at least once Win7 starts becoming less mainstream). Albeit, OpenGL would probably be the main API on such platforms too (as on pretty much any major platform outside of MS's own, which have OpenGL support too, of course).

I see everybody jumping to one standard; pressure from most of the industry will be on AMD's driver support for DX11 and OpenGL. Developers will get used to having a powerful environment they can easily port their games around in, with the Xbox Durango, PlayStation 4, and PC/Steambox. Development on the Wii U will be more cumbersome and expensive than on any of its competitors. Nintendo will probably need to burden themselves with the costs of outside development, and on top of already eating the price cuts at retailers, it's no wonder the media is squawking about how bad things are.
There's no technical reason the Wii U couldn't support OpenGL too, but the actual performance bottlenecks would still be there.
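To put the porting argument in concrete terms, here's a minimal sketch of why GL-based code travels well between platforms. It assumes GLFW purely as a hypothetical stand-in for the window/context layer (a console SDK would supply its own equivalent); the actual draw calls are plain OpenGL and aren't tied to any one OS or box:

[code]
/* Minimal sketch: the GL calls below compile unchanged anywhere a
 * conforming OpenGL driver exists. GLFW is used here only as an
 * assumed, generic window/context layer; each platform (or console
 * SDK) would swap in its own equivalent. */
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit())
        return 1;

    GLFWwindow *win = glfwCreateWindow(640, 480, "portable GL", NULL, NULL);
    if (!win) {
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(win);

    while (!glfwWindowShouldClose(win)) {
        /* Nothing in this drawing code is Windows-, Linux-, or
         * console-specific; only the context setup above varies. */
        glClearColor(0.1f, 0.1f, 0.3f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);

        glfwSwapBuffers(win);
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}
[/code]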

