Aaand all this technical talk is little more than a blip in the real, marketable world. I bet not even digitalfoundry would enjoy such verbose discussions.
If you are just going to drop frames completely, there's no advantage to having more than three buffers. Having more buffers allows the system to decide which buffers to drop, and which to show next based on things like the average frame rate the game is shooting for. Maybe it doesn't show the current newest frame generated, but rather newest - X instead to hit some kind of target FPS.
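To put that in code terms, here's a bare-bones sketch of that kind of frame selection (all the names and the pacing policy are my own illustration, not from any real console SDK):
Code:
/* Illustrative sketch only -- frame pacing from a queue of buffers.
 * With several completed back buffers queued, the system can pick
 * which one to present based on a target frame time instead of always
 * showing the newest one ("newest - X"). */
#include <stdint.h>

typedef struct {
    const void *pixels;     /* finished frame */
    uint64_t    done_usec;  /* GPU completion timestamp */
} Frame;

static uint64_t absdiff(uint64_t a, uint64_t b)
{
    return a > b ? a - b : b - a;
}

/* At vblank, pick the queued frame whose completion time best matches
 * the pacing target (e.g. 33333 usec per frame for a 30 FPS target).
 * Everything else in the queue becomes a candidate to drop.
 * Assumes count >= 1. */
const Frame *select_frame(const Frame *q, int count,
                          uint64_t vblank_usec, uint64_t target_usec)
{
    /* when the frame we show "should" have finished rendering */
    uint64_t ideal = vblank_usec - target_usec;
    const Frame *best = &q[0];
    for (int i = 1; i < count; i++)
        if (absdiff(q[i].done_usec, ideal) < absdiff(best->done_usec, ideal))
            best = &q[i];
    return best;
}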
I understand you're a proponent of giving Console users the same choices as far as 'Detail Settings' go... I agree, it would benefit the user to have access to that... even at the Console level. But that is NOT the core issue.
The fact remains that the P3&360 are both 'plenty' fast enough to render today's engines with V-Sync forced... there are ZERO excuses. Perhaps not AA on engines that aren't well optimized or are poorly ported, but V-Sync 100% of the time should be a non-issue.
The core issue is that the Pubs are not paying the Devs enough money (given that time = money) to ensure solid enough frame rates with V-Sync forced on, so they're taking the shortcut route, which is to automatically disable V-Sync whenever frame rates go subpar. This is a chicken shit move and nothing else. Games that utilize this BULLSHIT trickery need to be marked ON the Game Case. That way, the customer has no place to complain if they buy the game even after being warned on the box that V-Sync is not supported. I have a right as a consumer to know whether or not a "wannabe PC" title supports V-Sync, for Christ's sake (a helluva lot easier than calling a Dev or Pub and having them escalate my "does your game support V-Sync" question to second-tier tech support, only for me to either get no response or wait weeks for one). Putting that information on the actual box would greatly simplify things.
The excuses that so many of you are making for the Devs and Pubs in these cases are petty, nothing more.
First off (back in late '06), everyone was ranting and raving about how the 360 and P3 were going to bring PC gaming into everyone's living rooms at a fraction of the cost of a top-notch Gaming PC, and with much less setup hassle. The fact of the matter is that the engines of the Gaming World have not improved threefold since Q4 of '06. The engines of today look about the same, minus a couple extra layers of misc Post Processing FX.
Call of Duty: Modern Warfare 2 is a perfect example: the P3 version, from my experience, has V-Sync locked 100% of the time, and I seem to remember it having 2-4xAA. If an engine of that caliber/quality could be pulled off on the P3, there is NO excuse that any other engine of similar quality can't be done just as well.
The only excuses are when the Pub refuses to pay the Dev enough money to ensure they have enough time to nail the design, or when the Dev simply shortcuts the process.
The bottom line is that I'm sick of seeing "gamers" make excuses for these companies. This is not a thin line. This is black and white. Sony and Microsoft are not ensuring the proper level of quality control over the software that gets released on their hardware.
If I'm buying a game for the PC, then there is a world of possible issues I might encounter... I know this going into the purchase.
If I'm buying a game for one of the 3 Consoles, then I shouldn't have to worry about ANY OF THE ISSUES that a PC version could come with. Why is this? Because the game was designed on a system with ONE set of drivers, ONE OS installation, ONE kind of CPU, ONE kind of GPU/etc/etc/etfuckingetc. They're proprietary machines, and we were promised that all of the misc "there are too many possible combinations when it comes to building a PC to maintain support for them all" issues specific to PC Gaming would be averted when gaming on the Consoles, because troubleshooting/bugshooting on a Console should be much easier and more comprehensive thanks to the fixed, proprietary hardware configurations... and mainly because that's what was promised.
I'm tired of spineless excuses: I want my fellow gamers to stand up and complain when they've been screwed over. Quit supporting BULLSHIT practices and quit making EXCUSES when the Pubs and Devs step on their dicks.
I can't read all of this to find out. Are we considering interlaced resolutions in this discussion? You know, like most of the PS2 library and the likes of Halo 3 use?
Question for you... does the entire PS2 Library support 480p resolution?
Honestly, I've been expecting that it would. I've been thinking my 200+ collection of P2 games would look fantastic in 480p versus 480i (which is what I've been running them at).
Lemme guess... there's only a handful of games that support 480p while the majority support 480i only?
I think there's a trick to it... you're supposed to hold TRIANGLE as you boot the game or something like that to switch the system to 480p. Supposedly, most games work fine that way, but you don't get any extra resolution in the game - it's just scaled unless the game supports it specifically.
Yes it is, see below:
It's not a performance issue at all; that's why V-sync was forced on pretty much all older consoles (the 32X does it, for crying out loud, and most 3D MD games including Virtua Racing :p). It was only more recently that consoles could be set up to NOT use v-sync. (Same on computers, though even then most late-90s PC games tended to force v-sync with double buffering by default. Even most older games would do that, but if they hit bandwidth limits they might not be able to copy into video RAM within one vblank period, so it would tear. Double buffering in VRAM itself solves that, but I don't think VGA mode 13h (320x200x256) would allow it, and sometimes there wasn't enough VRAM to double buffer into; most systems had a render buffer in main RAM that would be copied into VRAM.)
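For anyone who never coded against that era of hardware, the main-RAM-to-VRAM pattern looked roughly like this (a sketch only: render_frame(), wait_vblank(), and the flat 0xA0000 pointer are stand-ins for real DOS/VGA access, not portable code):
Code:
/* Sketch of the late-90s software double buffer described above. */
#include <stdint.h>
#include <string.h>

#define SCREEN_W 320
#define SCREEN_H 200                     /* VGA mode 13h, 256 colors */
#define VGA_VRAM ((uint8_t *)0xA0000)    /* mode 13h framebuffer window */

static uint8_t back_buffer[SCREEN_W * SCREEN_H];  /* render target in main RAM */

extern void render_frame(uint8_t *dst);  /* the game's renderer */
extern void wait_vblank(void);           /* poll the VGA status register */

void present_loop(void)
{
    for (;;) {
        render_frame(back_buffer);       /* draw off screen -- no tearing here */
        wait_vblank();                   /* sync to the retrace... */
        /* ...then copy ~64 KB into VRAM.  If bandwidth limits keep the
         * copy from finishing inside the vblank period, the beam
         * catches up mid-copy and you see a tear -- the exact failure
         * described above.  A true VRAM-to-VRAM page flip avoids the
         * copy entirely, but mode 13h exposes only one 64 KB page. */
        memcpy(VGA_VRAM, back_buffer, sizeof back_buffer);
    }
}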
Quote:
The fact remains that the P3&360 are both 'plenty' fast enough to render today's engines with V-Sync forced... there are ZERO excuses. Perhaps not AA on engines that aren't well optimized or are poorly ported, but V-Sync 100% of the time should be a non-issue.
V-sync is not intensive at all; it's just frame skipping/limiting, be it with double or triple buffering. The only reason not to do it would be to avoid dropped frames at the expense of tearing. Note this is NOT the same issue as tearing with single buffering on some VERY old systems; that's an actual performance problem. This is tearing with double buffering, where the frame gets updated right away rather than waiting for vsync and throwing away some frames without ever displaying them.
That's exactly why it's important for fast-paced action games where every frame counts (i.e. some FPSes), and also the reason only user-defined settings would solve it.
Triple buffering doesn't fix that, especially if the machine renders FASTER than the screen refreshes (i.e. 100-160 FPS on a 60 Hz screen), but it's still an issue at slower rates, as the timing isn't always going to line up. (Better than double buffering, but still missing some frames.)
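Here's a rough sketch of what I mean (my own illustration, no real API, front buffer left out): the renderer free-runs across two back buffers, and at each vblank only the newest finished frame gets shown, so anything rendered in between is thrown away unseen.
Code:
/* Illustrative triple buffering: the renderer never stalls, the vblank
 * handler shows the newest completed frame and drops the older one. */
#include <stdbool.h>

typedef struct {
    int  id;                  /* frame number, for illustration */
    bool ready;               /* finished and waiting to be shown */
} BackBuffer;

static BackBuffer back[2];
static int draw_idx = 0;      /* buffer the renderer fills next */
static int frame_no = 0;

/* Renderer: never waits for vsync.  At 120 FPS on a 60 Hz screen this
 * runs twice per vblank, so every other frame is overwritten or
 * dropped without ever being displayed. */
void render_next(void)
{
    back[draw_idx].id    = frame_no++;
    back[draw_idx].ready = true;
    draw_idx ^= 1;            /* move straight on -- no stall, no tear */
}

/* Called once per vblank: flip to the newest finished frame. */
int on_vblank(void)
{
    int newest;
    if (back[0].ready && back[1].ready)
        newest = (back[0].id > back[1].id) ? 0 : 1;  /* older one dies */
    else if (back[0].ready)  newest = 0;
    else if (back[1].ready)  newest = 1;
    else return -1;           /* renderer too slow: repeat the last
                                 frame (judder, but still no tearing) */
    back[0].ready = back[1].ready = false;
    return back[newest].id;   /* the only frame anyone actually sees */
}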
Every single Xbox game renders in 480p and some offer higher res. (you can set the Xbox to 480p in the system settings and play it on a VGA monitor :p)
Can't do that with the PS2 or GC, as their games don't all support 480p... though I think the PS2 (not sure about the GC/Wii) may upscale to 480p to allow that as well for games rendering at lower res.
The Dreamcast and Xbox don't ever render in 480i; they're pure 480p that gets downconverted to 480i on the fly. (And the DC supports VGA but not component. Not sure if the Xbox supports VGA or not; I know it doesn't "officially", but the RGB connections might sync at 31 kHz just like component. The Wii does that even though there's no 1st-party VGA cable.)
interesting, thks
I noticed a fair number of PS1 titles have tearing (playing them on the P2, not sure if that's the reason or not), and I don't remember seeing any of the N64 games with tearing issues, so it seems that V-Sync was forced on that system.
It all boils down to Sony and Microsoft rushing their consoles out with the wrong video cards. They should have waited another 6-12 months and put 88GTs in there, and we wouldn't have ANY of this bullshit that we're seeing in today's "next generation" Consoles. But nooooooooooooo, they just had to have something out by Q4 of '06... nothing else would do, so what did they do? They went with 7800 equivalents, which was RETARDED, because the 88GT was the first Video Card milestone in regard to amazing speed, reliability, and price.
Even knowing that ^, I still don't care... they shouldn't be trying to put games out on their aging hardware if they can't ensure 100% working V-Sync.
Lol... here's a point for all you excuse-making sons of bitches: would ANY of you buy a slow machine and try to run Crysis on it? No, of course you wouldn't... so the exact same principle and logic applies here.
Yet when S.&M. do it and it fails (intermittent tearing because V-Sync is automatically dropped when framerates get too low), they get a pass from y'all because... why?
On the PS2, I recall significant developer interviews explaining the early PS2 library's aliasing issues as a 240i mode they used to save memory early on. All of the major sites in the US talked about the aliasing, and I think they all posted something to the effect of Sony having to "teach" developers how to use the mode without causing so much aliasing. Also, I've done significant comparisons between my late model PS1 and PS2, and am confident that the PS2 does not alter PS1 games unless you change the enhancement settings. It is running with PS1 hardware after all.
On Halo 3, I oversimplified, but it is running at 1152x640 progressive in two frame buffers on the 360 as some sort of means to enhance the lighting.
Interesting... thks for the information, I appreciate it. :D
That is interesting. I remember reading that the P2 had an extra filter or two that it could run on top of the P1 games. I looked at some comparison pics and wasn't impressed at all... some looked kinda blurry but a smidge cleaner. I'd rather just stay with the messy original cuz I'm weird like that.
And ^... in regard to the P3 doing something similar for P2 games... I also looked at comparison shots for that and I felt the same about those shots. Meh, pass... I'll take the original.
Honestly... I don't understand why Sony made the architectures of their three Consoles so vastly different that their latest can't offer backwards compatibility for all three systems. Another thing: I'd love to see Sony create a fixed-up version of the P2 to replace the P2 Slim. Something close enough to the original to have 100% compatibility with the entire P1/P2 catalogs, along with V-Sync and 2-4xAA for all P1&P2 titles.
I'd pay $250 for that machine if it existed.
Checked into 480p on the PS2 a bit more...
http://en.wikipedia.org/wiki/List_of...ith_HD_support
Quote:
Generally, progressive scan mode is activated by holding the Triangle and Cross, or "X," buttons down after the PlayStation 2 logo appears. When this is done, the game will typically load a screen with instructions on how to enable progressive scan. Many games only offer progressive scan through this method, offering no related options in the game's options menu.
It looks like most games support 480p and 16:9, but you need the component cable (obviously - you can't do progressive on a composite or RF connection).
Hah, neither Shinobi nor the Virtua Fighter games are listed in that wiki. Typical.
It's not that they don't offer progressive, it's that they don't offer 480p. Most games are probably 240p by default. You can do 16:9 with 240p just as easily as with 480p.
It doesn't say which games are 480i... that would be interesting to know, but I can't see a game supporting 480i and NOT supporting 480p. So I assume that when they say a game supports 16:9 but NOT 480p, they mean it's 16:9 and 240p instead.