

Thread: Fallout 3 New Vegas - THKS to all those who beta tested it!

  1. #61
    Banned by Administrators
    Join Date
    Jan 2010
    Location
    USA
    Posts
    2,317
    Rep Power
    0

    Default

    Quote Originally Posted by Phosis View Post
    What's wrong with 1080p? It's the new standard everyone has adopted, and is what defines "HD". This is the standard definition for blu-ray as well. It may be "wasteful" now, but this is how these industries continue to progress. Eventually, your dick will be able to run videos in 1080p without issues, let alone any given computer processes. (This barely makes sense.)

    Also, ATI vs. Nvidia is a matter of preference, my preference being cost. I have heard a lot of whining and bitching from ATI detractors, but I haven't had any issues personally.
    Everything... it looks no better than 1280x1024 on a decent 40-50" monitor.
    It's a wasteful resolution, and if 720p had been the sole aim for the PS3 & 360, there'd be a lot fewer variable V-sync games and more MSAA on PS3 titles, I imagine... not to mention the amount of time saved by programmers having to ensure steady framerates at 1080p, time they could spend on getting the games to run better, adding more post-processing effects, or just making better games with more content.

    Where you're wrong is that 1080p will only satisfy the needs of today's hardware manufacturers (CPUs & GPUs) for so long... eventually they'll want to kick out even newer hardware that'll "supposedly" be capable of even crazier resolutions, and once again the newer console generation will come out gimped in exactly the same manner that this generation has been.

    Nintendo will come out with something that outputs in 720p with their next console... and both Sony & MS will attempt resolutions above 1080p, I'm sure, but the games will be the same DX9-looking games (minus what a powerful PC can do in DX9, mind you: draw distances and same-quality textures, for instance), but yaaaay, in resolutions of 2500x2000 or whatever lame resolution they think of tackling next, 'all' at the expense of options that are way more important than a silly big resolution.



    That's what's wrong with 1080P.

    Case in point: id Tech 4 maxed out with 4xMSAA and V-sync at 1280x1024 easily looks as good as, if not better than, most console games in "glorious" 1080p... and you know what, that's not just sad, that's plain pathetic.

    How in the world is it that technology from 2004, at a resolution comparable to 720p, looks better than most of today's console games rendered in 1080p (taking into account that most console games don't actually render anything even close to true 1080p, but you get the point)?
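    For concreteness, the pixel counts behind the resolution comparisons above are easy to check. A quick sketch (the resolution figures are the standard ones, not measurements from any particular game):

    ```python
    # Raw pixel counts for the resolutions compared in the post above.
    resolutions = {
        "1280x1024": (1280, 1024),          # common 5:4 PC monitor mode
        "720p (1280x720)": (1280, 720),
        "1080p (1920x1080)": (1920, 1080),
    }

    pixels = {name: w * h for name, (w, h) in resolutions.items()}
    for name, count in pixels.items():
        print(f"{name}: {count:,} pixels")

    # 1080p pushes 2.25x the pixels of 720p and ~1.58x those of 1280x1024,
    # which is the fill-rate cost being traded against MSAA and framerate.
    print(round(pixels["1080p (1920x1080)"] / pixels["720p (1280x720)"], 2))   # 2.25
    print(round(pixels["1080p (1920x1080)"] / pixels["1280x1024"], 2))         # 1.58
    ```

    So 1280x1024 sits between 720p and 1080p in raw pixel load, which is why the poster treats the two lower modes as roughly comparable.
    
    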

  2. #62
    Phosis's Avatar
    Join Date
    Jul 2009
    Posts
    1,037
    Rep Power
    27

    Default

    Hmm alright. I really didn't know much about the whole situation, to be honest. Thank you for clarifying.

  3. #63
    Banned by Administrators
    Join Date
    Jan 2010
    Location
    USA
    Posts
    2,317
    Rep Power
    0

    Default

    Quote Originally Posted by Phosis View Post
    Hmm alright. I really didn't know much about the whole situation, to be honest. Thank you for clarifying.
    It's just my opinion... a strong one at that. I know I state it like it's fact, but that's simply how it comes out.


    I feel very strongly that the entire gaming industry would benefit hugely if it would simply build on the engines of today and quit worrying about making new engines every year and pushing resolutions above and beyond what was done in previous years.

  4. #64
    End of line.. Shining Hero gamevet's Avatar
    Join Date
    Jan 2008
    Location
    Dallas, Texas
    Posts
    10,401
    Rep Power
    143

    Default

    Quote Originally Posted by OldSchool View Post
    From what I heard, there were going to be plenty of mods that enabled everything Infinity Ward had turned off. IW have really turned anti-PC since COD 6, that's for sure.
    But that doesn't change the fact that a lot of developers are focusing on the console market, because it's far more profitable than the PC. It's pretty obvious when IW is forced to leave obvious features out of the PC game. Perhaps that is why they had a falling-out with Activision.


    I'd rather only companies that can properly make solid games be involved with making games... I was plenty clear.
    When you have a game that requires a team of over 100 people, the payroll will quickly drain a company's funds. The days of having 10-20 people working on a game are pretty rare, unless they're working on an indie game, the Wii, or the DS. You're delusional if you think any company can push a budgeted product beyond what they are paid to do. If you want to put the blame on someone, put it on the publishers that set the release dates and development timelines.


    Your numbers mean nothing to me... bah.
    Maybe once you put your pipe down and actually do a little homework yourself, you won't sound like a fool pulling facts out of your ass. My numbers are legit. This dedicated PC forum thread talks about the lack of quad-core support in Crysis.




    http://www.overclockers.com/forums/s...=553139&page=1

    Quote Originally Posted by deathman20
    Crysis was supposed to be optimized for quads, but in beta, at least, it was always dual-core optimized. It kept getting dragged out, and I don't even think they released a "quad-core optimized beta" before it went to retail. That sucked; I know Crysis was a game I was going to get a quad for, but after seeing no one getting any results from said game with quads, I said forget it.
    There were only 3 patches for Crysis. The latest patch addressed issues with the game, but did not add quad-core support.

    Need proof? Here are the 3 official updates for Crysis, by Crytek. There never was an update adding quad-core support to the game.

    http://www.crymod.com/filebase.php?fileid=1073



    Quote Originally Posted by OldSchool View Post
    The V-sync on the 360 version is variable, isn't it? Or is it locked?
    Bioshock on the 360 does support V-sync at 30fps. You can turn it off and get around 50fps, and in the 40 minutes I just played with it off, I only saw one instance where it was a problem: when you're still in the sub and the Splicer starts cutting into the roof, the lit ceiling starts to flicker horribly. Other than that, the image was clean, even when I spun my character around in circles in different areas.

    I did a side by side comparison of Bioshock on my PC, and on the 360. I used my LGW2453V 1080p monitor, so I could directly compare the 2 versions side by side, flipping from HDMI to DVI.

    PC: With V-sync enabled, the PC version does run at a brisk 60FPS+, but turning it off netted about the same frame rate. The textures on the walls and objects weren't any different from those of the 360 version (probably because the minimum video card requirement was only 128MB of RAM, and the recommended 7900GT was 512MB with DX9c). The black levels were a little deeper and the floors looked a little more vibrant. The character models looked pretty much identical to the 360 game.


    360: With V-sync enabled, the game runs at a smooth 30fps. The camera swing felt a little slower at this frame rate, but it wasn't a hindrance on the gameplay. Turning off V-sync provided a smooth frame rate around 50fps, and the sub was the only instance where I noticed having it off was an issue; otherwise there was no screen tearing, no matter how often I swung the view around. The black levels were decent, but I felt they could have been a little darker. Particle effects were clean and pretty much identical to those of the PC version.
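    The frame rates in this comparison map directly onto per-frame time budgets, which is what V-sync is really enforcing. A small sketch (the refresh figures are standard display math, not measurements from Bioshock):

    ```python
    # Per-frame time budget at a given target frame rate: with V-sync on,
    # a frame that misses its budget waits for the next display refresh.
    def frame_budget_ms(fps: float) -> float:
        return 1000.0 / fps

    for fps in (30, 50, 60):
        print(f"{fps} fps -> {frame_budget_ms(fps):.2f} ms per frame")
    # 30 fps -> 33.33 ms per frame
    # 50 fps -> 20.00 ms per frame
    # 60 fps -> 16.67 ms per frame
    ```

    This is why a locked 30fps can feel smoother than an uncapped rate that wanders between 40 and 60: the per-frame deadline is consistent.
    
    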

    The one thing that was clearly better about the PC version was the sound. I don't know if it was because I have a better sound card in my PC, or if the sound samples were compressed more on the 360 version. The speaker setup at my computer desk is 2.1 surround, so I'm not exactly getting the most out of the sound options for either on this setup. The 360 may sound better on my 5.1 surround in the living room, but that would probably be true of the PC as well.





    Quote Originally Posted by Phosis View Post
    I can run anything, right now, today. Flawlessly, for the most part. And it is awesome, better than my PS3. Crysis runs like butter. However, I also had to drop 1600 dollars to get this quality, and a year from now, my PC will be out of date, and I will probably drop half of that again at some point to upgrade.

    Not cheap, not for everyone. I can see why developers are far more eager to develop for consoles than PC's now. And there is a reason my console sees far more play time than my PC ever will.

    Quote Originally Posted by Phosis View Post
    I will post them when I get home. This is the sort of thing I should know off the top of my head, but do not. I have an Intel i5, (Or i7? The newest one.) Radeon HD 5770, 4 gig ram, and...other stuff. It was custom built at any rate. 1600 includes everything including my boss (not the brand...it's just like a boss.) HD monitor.
    Your processor will be fine for at least another 3 years. My Q9650 might be pretty dated in another 2 years, though. I have a feeling that the quad-cores will get tossed aside as developers start pushing the multi-threaded cores of the i7 processors and the multiple cores of the AMD Phenom II X6. It seems like QC was more of a quick idea for Intel and AMD to one-up each other, and it got overlooked by developers because of how many dual-core setups were still viable at the time.

    The video card market is a bit more brutal, though. In the past couple of years, ATI and NVidia have released well over 8 different upgrades to their own cards. It's hard enough to keep up with what NVidia is doing, but ATI's new card every 3 months is a mind-melting list of products and features to keep up with.
    Last edited by gamevet; 01-05-2011 at 11:37 PM.
    A Black Falcon: "no, computer games and video games are NOT the same thing. Video games are on consoles, computer games are on PC. The two kinds of games are different, and have significantly different design styles, distribution methods, and game genre selections. Computer gaming and console (video) gaming are NOT the same thing."



  5. #65
    Let's Go Away Master of Shinobi kokujin's Avatar
    Join Date
    Oct 2009
    Location
    Daaaaytonaaaaa
    Posts
    1,421
    Rep Power
    29

    Default

    Quote Originally Posted by OldSchool View Post
    Everything... it looks no better than 1280x1024 on a decent 40-50" monitor.
    It's a wasteful resolution, and if 720p had been the sole aim for the PS3 & 360, there'd be a lot fewer variable V-sync games and more MSAA on PS3 titles, I imagine... not to mention the amount of time saved by programmers having to ensure steady framerates at 1080p, time they could spend on getting the games to run better, adding more post-processing effects, or just making better games with more content.

    Where you're wrong is that 1080p will only satisfy the needs of today's hardware manufacturers (CPUs & GPUs) for so long... eventually they'll want to kick out even newer hardware that'll "supposedly" be capable of even crazier resolutions, and once again the newer console generation will come out gimped in exactly the same manner that this generation has been.

    Nintendo will come out with something that outputs in 720p with their next console... and both Sony & MS will attempt resolutions above 1080p, I'm sure, but the games will be the same DX9-looking games (minus what a powerful PC can do in DX9, mind you: draw distances and same-quality textures, for instance), but yaaaay, in resolutions of 2500x2000 or whatever lame resolution they think of tackling next, 'all' at the expense of options that are way more important than a silly big resolution.



    That's what's wrong with 1080P.

    Case in point: id Tech 4 maxed out with 4xMSAA and V-sync at 1280x1024 easily looks as good as, if not better than, most console games in "glorious" 1080p... and you know what, that's not just sad, that's plain pathetic.

    How in the world is it that technology from 2004, at a resolution comparable to 720p, looks better than most of today's console games rendered in 1080p (taking into account that most console games don't actually render anything even close to true 1080p, but you get the point)?
    Most games on the 360/PS3 are rendered in 720p and then upscaled to 1080p. In summation, they can barely do 1280x720, so 1280x1024 is definitely out of the picture, but I do agree they jumped on the HD wagon too early. The next consoles will be true HD, hopefully.
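    The 720p-to-1080p upscale described here is a uniform 1.5x stretch on each axis; a quick check of the numbers (standard resolution figures, nothing game-specific):

    ```python
    # A console rendering at 1280x720 but outputting 1920x1080 scales each
    # axis by 1.5, so the scaler interpolates: no new detail is added.
    src_w, src_h = 1280, 720     # internally rendered
    dst_w, dst_h = 1920, 1080    # displayed
    print(dst_w / src_w, dst_h / src_h)       # 1.5 1.5
    print(dst_w * dst_h / (src_w * src_h))    # 2.25 (on-screen vs. rendered pixels)
    ```

    The display shows 2.25x as many pixels as the GPU actually drew, which is why an upscaled "1080p" game doesn't look like a native one.
    
    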

    Less talk more action!
