Again, it isn't the programming that's making development costs rise here; it's the asset creation. The reason the world map of Final Fantasy XIII is literally a straight line is that, after years in development, that was the most they could get done at that level of graphical quality. The reason we never got that FF7 remake is that it would cost too much to recreate a game world significantly larger than FF13's at the same graphical quality as FF13.
Look at games like Xenoblade. That game has huge open worlds with a pretty hefty amount of geometric detail, and its game world is significantly larger than a lot of HD JRPGs'. Putting a world like that on an HD console at the time it was being developed would have cost way too much; cuts would have had to be made to reduce the number of assets created. That's where the Wii helped with development costs.
Trekkies thinks it's all about HD asset creation and ballooning budget costs.
Completely oblivious, or still choosing to ignore the obvious point: if the hardware itself were standardized on something common like x86, there would be less time and effort spent porting and optimizing game engines, and more resources available to create the HD content the market demands and expects.
Originally Posted by Knuckle Duster
Trekkies thinks it's all about HD asset creation and ballooning budget costs.
Completely oblivious, or still choosing to ignore the obvious point: if the hardware itself were standardized on something common like x86, there would be less time and effort spent porting and optimizing game engines, and more resources available to create the HD content the market demands and expects.
There are tools out there that make this SIGNIFICANTLY less of an issue than people think it is.
You can make some phenomenal worlds, complete with textures and lighting, with literally the click of a button. Now, would you just accept that and use it as-is? No, but it's not like every single polygon in the entire game needs to be painstakingly placed.
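For anyone who hasn't touched these tools: the "one click" part is usually some flavor of procedural noise under the hood. Here's a rough, minimal sketch of the idea in C++ (the function names and constants are mine for illustration, not from any particular tool): a few octaves of value noise gives you a plausible terrain heightmap that an artist then refines instead of building from scratch.
```cpp
// Minimal value-noise heightmap: the kind of "one click" terrain base
// a world-building tool generates before an artist refines it.
#include <cmath>
#include <cstdio>
#include <vector>

// Deterministic pseudo-random value in [0,1] per lattice point.
static float latticeValue(int x, int y) {
    unsigned n = static_cast<unsigned>(x) * 374761393u
               + static_cast<unsigned>(y) * 668265263u;
    n = (n ^ (n >> 13)) * 1274126177u;
    return static_cast<float>(n & 0xFFFF) / 65535.0f;
}

static float fade(float t) { return t * t * (3.0f - 2.0f * t); }  // smoothstep

// Bilinearly interpolated value noise at a continuous coordinate.
static float valueNoise(float x, float y) {
    int   xi = static_cast<int>(std::floor(x)), yi = static_cast<int>(std::floor(y));
    float tx = fade(x - xi), ty = fade(y - yi);
    float a = latticeValue(xi, yi),     b = latticeValue(xi + 1, yi);
    float c = latticeValue(xi, yi + 1), d = latticeValue(xi + 1, yi + 1);
    float top = a + (b - a) * tx;   // interpolate along x, top edge
    float bot = c + (d - c) * tx;   // interpolate along x, bottom edge
    return top + (bot - top) * ty;  // then along y
}

// Sum several octaves: big hills plus progressively finer detail.
static float fractalNoise(float x, float y, int octaves) {
    float sum = 0.0f, amp = 1.0f, freq = 1.0f, norm = 0.0f;
    for (int i = 0; i < octaves; ++i) {
        sum  += amp * valueNoise(x * freq, y * freq);
        norm += amp;
        amp  *= 0.5f;   // each octave contributes half as much...
        freq *= 2.0f;   // ...at twice the spatial frequency
    }
    return sum / norm;
}

int main() {
    const int size = 256;
    std::vector<float> heightmap(size * size);
    for (int y = 0; y < size; ++y)
        for (int x = 0; x < size; ++x)
            heightmap[y * size + x] = fractalNoise(x * 0.02f, y * 0.02f, 6);
    std::printf("sample height at center: %f\n",
                heightmap[(size / 2) * size + size / 2]);
}
```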
Most of the issue with HD assets has always been an issue of space: either on the storage medium, the video/system RAM, or both. Without those bottlenecks the actual creation of HD assets isn't that big a deal. Some extra work? Absolutely, but it's NOT the huge issue people think it is.
Most 3D games have 2-3 grades of models specifically because they have resource issues. The big fancy cutscene models are almost never used in-game. If you could actually have one model/texture set that worked for your cutscenes AND gameplay, it would actually mean less work. That obviously doesn't hold true in every aspect, but there are tradeoffs in both directions.
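To make the "grades of models" point concrete, here's a minimal sketch of distance-based LOD selection (the thresholds, names, and triangle counts are made up for illustration, not from any real engine). The point is that each grade is a separate asset somebody had to author and keep in memory:
```cpp
// Sketch of the 2-3 model grades described above: a cutscene/hero model,
// a gameplay model, and a distant low-poly model, selected by distance.
#include <cstdio>
#include <initializer_list>

struct Model { const char* name; int triangles; };

struct LodSet {
    Model cutscene;   // highest detail, traditionally cutscene-only
    Model gameplay;   // mid detail, used during normal play
    Model distant;    // low detail, for far-away instances
};

// Pick a grade from camera distance; every grade returned here is a
// separate asset the art team had to produce.
const Model& selectLod(const LodSet& set, float distance) {
    if (distance < 5.0f)  return set.cutscene;
    if (distance < 50.0f) return set.gameplay;
    return set.distant;
}

int main() {
    LodSet hero = { {"hero_cutscene", 80000},
                    {"hero_gameplay", 15000},
                    {"hero_distant",   2000} };
    for (float d : {2.0f, 20.0f, 200.0f}) {
        const Model& m = selectLod(hero, d);
        std::printf("distance %6.1f -> %s (%d tris)\n", d, m.name, m.triangles);
    }
}
```
If one model/texture set could serve all three roles, two of those three authoring passes would disappear, which is exactly the savings being described.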
Originally Posted by Knuckle Duster
Trekkies thinks it's all about HD asset creation and ballooning budget costs.
Completely oblivious, or still choosing to ignore the obvious point: if the hardware itself were standardized on something common like x86, there would be less time and effort spent porting and optimizing game engines, and more resources available to create the HD content the market demands and expects.
I'm not saying it's all of it, but it is a pretty significant factor. If it weren't, then why are games like FF13 pretty much one giant straight line? The programming and engine work required wouldn't be much different if you wanted to add different paths and make the world larger. And it's certainly not a technical issue, as the Wii easily pulls off a huge world in Xenoblade Chronicles. What it boils down to is that larger worlds require more assets and more effort put into making sure things look good, and that matters even more on HD consoles. Yeah, the tools have gotten better for current-gen stuff, but next-gen stuff is going to be a whole new ball game.
The average development cost for a Wii game is $5-7 million. The average development cost for a 360/PS3 game is $15-30 million, with bigger titles getting closer to $60+ million. If programming were truly the issue, those numbers should be closer, but they're not. In fact, if programming were the issue the way you're saying it is, Wii development costs should be higher than 360/PS3 costs, since Wii games typically need to be reprogrammed and built with less advanced tools and engines. Even with easy-to-use tools and engines, 360/PS3 games still cost triple or more to develop compared to a Wii game. The only explanation left is that HD assets cost more time and resources to produce than the SD assets used for Wii games; that's really the only difference left.
Originally Posted by TrekkiesUnite118
I'm not saying it's all of it, but it is a pretty significant factor. If it weren't, then why are games like FF13 pretty much one giant straight line? The programming and engine work required wouldn't be much different if you wanted to add different paths and make the world larger. And it's certainly not a technical issue, as the Wii easily pulls off a huge world in Xenoblade Chronicles. What it boils down to is that larger worlds require more assets and more effort put into making sure things look good, and that matters even more on HD consoles. Yeah, the tools have gotten better for current-gen stuff, but next-gen stuff is going to be a whole new ball game.
The average development cost for a Wii game is $5-7 million. The average development cost for a 360/PS3 game is $15-30 million, with bigger titles getting closer to $60+ million. If programming were truly the issue, those numbers should be closer, but they're not. In fact, if programming were the issue the way you're saying it is, Wii development costs should be higher than 360/PS3 costs, since Wii games typically need to be reprogrammed and built with less advanced tools and engines. Even with easy-to-use tools and engines, 360/PS3 games still cost triple or more to develop compared to a Wii game. The only explanation left is that HD assets cost more time and resources to produce than the SD assets used for Wii games; that's really the only difference left.
The software industry has a bad habit of soaking up extra resources with bloated and inefficient code. It's the way Microsoft has done things for decades: bigger is always better, and refine and optimize only if there are performance issues.
Having more RAM for high-res content while simultaneously running background social media services is what's going to drive the industry standard forward.
Ballooning costs of that nature are bad enough, but when Nintendo's hardware is too limited to handle an easy port and needs a lot of optimizing, it becomes a costly secondary option to put the game on the Wii U platform at all. PC gaming is going to set the high standard, next-gen consoles will hold it back, but the Wii U won't even matter unless Nintendo kisses asses and subsidizes exclusives.
There's nothing noble or noteworthy about the shills at Monolith, the Xenoblade creators on Nintendo's payroll, spewing rhetoric about how much extra effort would be necessary if they were required to upscale their game and make sure it looks decent. It's bullshit.
Great job deflecting the point there, KnuckleDuster.
Costs may start low, or not much higher, next generation, but that's how this generation started out too. Then, as developers got a better grip on the hardware, costs ballooned out of control. I won't be surprised if the same thing happens next gen. I don't want to see it happen, but I'm not going to be shocked if it does.
Originally Posted by Knuckle Duster
There's nothing noble or noteworthy about the shills at Monolith, the Xenoblade creators on Nintendo's payroll, spewing rhetoric about how much extra effort would be necessary if they were required to upscale their game and make sure it looks decent. It's bullshit.
It's not just Monolith saying it; Square Enix said it too, both in response to FF13's linearity and in response to fans wanting an FF7 remake. If Xenoblade had been made on an HD console, it probably would have cost too much and required cuts. Remember, it started development around 2006.
Out of curiosity, have you actually played Xenoblade or seen it? I can't think of a single HD game that has environments as open and huge as this:
And that's just one very tiny portion of the game. In most games the scenery you see in the far background is just a 2D image or a skybox you can never get to. In Xenoblade it's part of the world, and you can actually go touch and explore it without having to load into a new area or anything.
Just Cause 2 is nice, but how much did it cost to make? I'm not saying huge open worlds don't exist on HD systems, but they are very cost-prohibitive to make, especially when it comes to JRPGs and the like.
I can't find a budget for it, unfortunately. All I know is that it had sold 3.5 million copies as of December 2011, and Square Enix was happy enough to greenlight a sequel.
Originally Posted by TrekkiesUnite118
Great job deflecting the point there, KnuckleDuster.
Costs may start low, or not much higher, next generation, but that's how this generation started out too. Then, as developers got a better grip on the hardware, costs ballooned out of control. I won't be surprised if the same thing happens next gen. I don't want to see it happen, but I'm not going to be shocked if it does.
Why does it even matter? Why should I care if it costs developers an extra million to work on their games? They'll somehow release inferior games or not create them at all? Well, I guess they won't be putting food on their table any time soon.
How does not being able to run the higher-cost games in any way help Nintendo? How does having a $300 machine that can only perform as well as a $199 Xbox 360 help them? They have no longevity to offer in the face of the obvious demand.
Originally Posted by TrekkiesUnite118
It's not just Monolith saying it; Square Enix said it too, both in response to FF13's linearity and in response to fans wanting an FF7 remake. If Xenoblade had been made on an HD console, it probably would have cost too much and required cuts. Remember, it started development around 2006.
Out of curiosity, have you actually played Xenoblade or seen it? I can't think of a single HD game that has environments as open and huge as this:
And that's just one very tiny portion of the game. In most games the scenery you see in the far background is just a 2D image or a skybox you can never get to. In Xenoblade it's part of the world, and you can actually go touch and explore it without having to load into a new area or anything.
I imported Xenoblade and played the entire game when it came out. It was one of the only Wii games worth playing in the past few years. The amount of detail you're describing is nonsense: the far-off new areas load when you walk into them, and the larger ones act like any other sandbox game, blurring out in the distance. Xenoblade has looked and played better in HD, on the Dolphin emulator, since day one.
Originally Posted by Knuckle Duster
Trekkies thinks it's all about HD asset creation and ballooning budget costs.
Completely oblivious, or still choosing to ignore the obvious point: if the hardware itself were standardized on something common like x86, there would be less time and effort spent porting and optimizing game engines, and more resources available to create the HD content the market demands and expects.
Being x86 doesn't really matter at all; just having a decently fast CPU of any architecture (with decent library support) would be fine. The WiiU CPU is simply too slow, and being PowerPC-based is not the problem.
In fact, a faster-clocked CPU otherwise similar to the WiiU's would be a pretty decent counterpart to the x86 and ARM competition, and without some of the more dramatic difficulties the Xenon and especially Cell faced for game performance (in-order execution imposing very different code optimization requirements on compilers, Xenon needing code optimized for heavy multithreading versus the largely single-threaded PC games of the time, and the much bigger clash of the SPE performance optimization needed for the PS3).
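To illustrate the in-order point with a hedged C++ sketch (my own example, not code from any of these consoles): the naive loop below has a loop-carried dependency on its accumulator. An out-of-order core can at least keep running loads and multiplies ahead of a stalled add or a cache miss, while an in-order core like Xenon's simply stalls the pipeline, so getting decent throughput there meant the compiler or programmer had to restructure the code themselves.
```cpp
#include <cstddef>

// Naive form: each iteration's add depends on the previous one, so an
// in-order pipeline pays the full FP-add latency every iteration.
float dotNaive(const float* a, const float* b, std::size_t n) {
    float sum = 0.0f;
    for (std::size_t i = 0; i < n; ++i)
        sum += a[i] * b[i];
    return sum;
}

// Unrolled with four independent accumulators: the dependency chain is
// split so several multiply-adds can be in flight at once even when the
// hardware issues strictly in order. This kind of manual scheduling was
// routine on the in-order console CPUs.
float dotUnrolled(const float* a, const float* b, std::size_t n) {
    float s0 = 0, s1 = 0, s2 = 0, s3 = 0;
    std::size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        s0 += a[i]     * b[i];
        s1 += a[i + 1] * b[i + 1];
        s2 += a[i + 2] * b[i + 2];
        s3 += a[i + 3] * b[i + 3];
    }
    for (; i < n; ++i) s0 += a[i] * b[i];   // handle the remainder
    return (s0 + s1) + (s2 + s3);
}
```
Note the unrolled version also reassociates the floating-point adds, which compilers won't do by default for strict FP semantics — another reason this burden fell on programmers rather than toolchains.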
If the PPC750 derivative Nintendo/IBM were working with could have managed double (or a little less than double) the clock rate it runs at in the WiiU, that would have been quite reasonably matched to the rest of the hardware (and to similarly specced low-end gaming PCs). If they couldn't push the clock up, they should have moved on to a different CPU entirely (either another PPC derivative, or ARM or x86, embedding the old Wii CPU for compatibility and coprocessing).
A fast ARM Cortex CPU could make a fine CPU for a mainstream home console, for that matter.
Originally Posted by EclecticGroove
There are tools out there that make this SIGNIFICANTLY less of an issue than people think it is.
You can make some phenomenal worlds, complete with textures and lighting, with literally the click of a button. Now, would you just accept that and use it as-is? No, but it's not like every single polygon in the entire game needs to be painstakingly placed.
Most of the issue with HD assets has always been an issue of space: either on the storage medium, the video/system RAM, or both. Without those bottlenecks the actual creation of HD assets isn't that big a deal. Some extra work? Absolutely, but it's NOT the huge issue people think it is.
Most 3D games have 2-3 grades of models specifically because they have resource issues. The big fancy cutscene models are almost never used in-game. If you could actually have one model/texture set that worked for your cutscenes AND gameplay, it would actually mean less work. That obviously doesn't hold true in every aspect, but there are tradeoffs in both directions.
Again, the CPU architecture difference isn't the problem here, but actual performance IS.
And yes, space is a problem too, and it's a big shame MS and Sony haven't been including provisions for RAM expansion (potentially cheap mid-gen add-ons, and standardized features of late models, that would open things up considerably and could improve PC versions as well through fewer compromises and more unified development effort).
This is going to be a problem with the WiiU as well. 2 GB is better than 512 MB, but it's still limiting, especially at this point in the game, and especially with only 1 GB actually usable by the programmer. (It's especially unfortunate that Nintendo offers no RAM expansion given they're using commodity DDR3 SDRAM for the WiiU, not GDDR, let alone XDR.)
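Some rough back-of-envelope numbers on how fast HD assets eat that 1 GB. This assumes uncompressed RGBA8 textures with full mip chains, and the texture counts are hypothetical; real games compress heavily, so treat these as upper bounds rather than exact figures:
```cpp
// Back-of-envelope check of how quickly HD textures consume ~1 GB of
// game-usable RAM (uncompressed RGBA8 with mips; an upper bound).
#include <cstdio>

// Bytes for a square RGBA8 texture plus its mip chain (mips add ~1/3).
static double textureBytes(int sideLen) {
    double base = static_cast<double>(sideLen) * sideLen * 4.0;
    return base * 4.0 / 3.0;
}

int main() {
    const double usable   = 1024.0 * 1024.0 * 1024.0;  // ~1 GB for the game
    const int    counts[] = { 256, 512, 1024 };        // hypothetical counts
    const int    sizes[]  = { 512, 1024, 2048 };       // SD-ish to HD-ish
    for (int size : sizes) {
        for (int count : counts) {
            double total = textureBytes(size) * count;
            std::printf("%4d textures at %4d^2: %8.1f MB (%6.1f%% of usable RAM)\n",
                        count, size, total / (1024.0 * 1024.0),
                        100.0 * total / usable);
        }
    }
}
```
The jump from 512^2 to 2048^2 is a 16x memory cost per texture, which is why the same asset count that fits comfortably in an SD budget blows straight past 1 GB in HD.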
Originally Posted by TVC 15
A big thing people seem to forget about the Wii and 3rd-party developers is the long-standing assumption at the time, one that still lingers on, that using older tech lowered development costs and benefited the industry. In actuality, the only company that benefited from the Wii's lacklustre microwaved-leftover hardware was Nintendo. I've actually read arguments from developers on more technically minded forums like Beyond3D that it actually increased development costs for third parties: most companies had geared up for the next generation of PS360 development, and the Wii's lack of feature parity meant re-tooling old engines and downgrading assets to get multi-plat titles and numerous other franchises working on it. The Wii also had its own specific development practices, such as its very outdated fixed-function hardware. Why bother pouring resources into it?
It put the Wii in a totally different game-development niche from contemporary high-end PC games and PS3/360 games, more in line with upgraded late-6th-gen titles, and made it a useful platform to keep down-porting to the PS2 from as that platform coasted along very late in life.
Hell, if Nintendo hadn't wanted to kill off the GameCube so soon, another great angle would have been moderately cut-down Wii games released on GC for the budget market (along with a cost-reduced GC model to compete directly with the PS2 Slim). Then again, the Xbox was, hardware-wise, the best platform of all to do that with... except the manufacturing arrangement MS had doesn't seem like it would have favored a super-integrated, low-cost, low-overhead (for outsourced hardware) derivative from Intel/Nvidia. (Had they gone AMD+ATi, it might have been a little different.)
Honestly, the Wii's weaknesses could have worked out much better in the long run if they'd launched a successor a couple of years earlier than they did. The existing WiiU (at least in terms of its internal chipset) would have been much better off back in 2010, even with its shortcomings. Leaving out the new gamepad initially would have cut costs hugely while still competing much more directly with the PS3/360 (with some advantages), and being able to expand on and refine the gimmicks introduced with the Wii itself would have been great (including Wii MotionPlus out of the box).
Imagine all those late-gen Wii titles that could have been of WiiU technical/aesthetic quality (and with more refined use of the pointer and motion controls), on top of all the decent versions of mid/late-7th-gen titles that could have been done.
Having the hardware out so much sooner would have meant many more multiplat games could have taken its specific limitations into account rather than being left high and dry after the fact. (As it is, developers now have to choose between porting over older games that have already lost their initial release appeal and considering severely cut-back versions of games targeting the PC and next-gen competitors.)
In general, though, the technical situation with the WiiU is a bit odd. Unlike with the Wii, they put the effort into making a generally new architecture with semi-vestigial old components, and they invested in a modified multi-core PPC 750 derivative capable of at least going above 1 GHz (considerably more than that, if the same core was used in the dev kits), and it ends up rather unbalanced compared to established hardware standards: much more CPU-bound than the 360, and more so than a low-end gaming PC with a similar GPU and a low-end CPU like a decently fast Celeron, Pentium, or Athlon II. At the very least they should have gotten nominal CPU performance on par with the 360's Xenon (which probably would have meant the WiiU's CPU running somewhere above 2 GHz).
Originally Posted by TVC 15
The whole argument that the next generation is going to spiral out of control with dev costs is overblown. If anything, most development hardware has matured; Sony's OpenGL-like API and MS's DirectX will make for easy cross-platform portability of code between PC, NextBox, and PS4, since they all share semi-custom PC hardware. As hardware becomes more powerful, coding becomes less low-level and more flexible, with mature middleware and compilers. Coders aren't dividing lines of RISC code into a 4 KB scratchpad to maximise matrix transforms anymore.
This only works if the hardware performance is at least roughly comparable to its low-end PC and console counterparts and it exhibits bottlenecks of a similar nature (and uses similar APIs and drivers of similar or better performance).
With a fast CPU, this would at least be close to the case for the WiiU, but that's exactly where the hole is.
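For what it's worth, the "easy portability" being claimed boils down to something like the sketch below: game code talks to one thin renderer interface, and each platform supplies a backend over its native API (the class and function names here are illustrative, not from any real engine or SDK). That layer only stays thin if every backend performs comparably, which is exactly the catch for the WiiU.
```cpp
// Sketch of a thin cross-platform rendering abstraction: identical
// game-side calls, per-platform backends. Names are hypothetical.
#include <cstdio>
#include <memory>

// The platform-neutral interface the game code is written against.
class Renderer {
public:
    virtual ~Renderer() = default;
    virtual void drawMesh(int meshId) = 0;
};

// One backend per platform; when APIs and performance profiles are
// close, these stay thin and the porting cost stays low.
class D3D11Renderer : public Renderer {
public:
    void drawMesh(int meshId) override { std::printf("D3D11 draw %d\n", meshId); }
};

class ConsoleRenderer : public Renderer {   // stand-in for a console API
public:
    void drawMesh(int meshId) override { std::printf("console draw %d\n", meshId); }
};

std::unique_ptr<Renderer> makeRenderer(bool onConsole) {
    if (onConsole) return std::make_unique<ConsoleRenderer>();
    return std::make_unique<D3D11Renderer>();
}

int main() {
    auto r = makeRenderer(/*onConsole=*/false);
    r->drawMesh(42);   // identical game-side call on every platform
}
```
The abstraction hides the API differences, but it can't hide a CPU that's a fraction of the speed the other backends assume, which is the point being made above.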
Being x86 doesn't really matter at all; just having a decently fast CPU of any architecture (with decent library support) would be fine. The WiiU CPU is simply too slow, and being PowerPC-based is not the problem.
I never said it was the problem, or that the architecture mattered. I've said that by switching to AMD APUs & GPUs, the losses on hardware would likely end quicker and costs would stay down, since AMD is tooled up and fabricating them on a massive scale for the PC market anyway. Trekkies thinks it's an argument all about programming, whereas it's really all about creating a high-performance, standard environment that pleases developers. That's all that matters, and that's what Nintendo didn't follow through on.
I see everybody jumping to one standard, and pressure from most of the industry will be on AMD's driver support for DX11 & OpenGL. Developers will get used to having a powerful environment they can easily port their games around in, with the Xbox Durango, PlayStation 4, and PC/Steambox. Development on the Wii U will be more cumbersome and expensive than on any of its competitors. Nintendo will probably need to burden themselves with the costs of outside development, and on top of already eating the price cuts at retailers, it's no wonder the media is squawking about how bad things are.
It's not like Nintendo is going to die. They're too big and have IP that prints money. They're not really giving anybody a reason to buy or develop for Wii U right now though.
Originally Posted by TrekkiesUnite118
Out of curiosity, have you actually played Xenoblade or seen it? I can't think of a single HD game that has environments as open and huge as this:
Oblivion, Skyrim, Fallout 3 and Fallout: New Vegas have huge maps to explore.
Why isn't GTA V coming to the Wii U?