My old IBM G78 CRT would run at 75Hz, with a maximum resolution of 1600 x 1200.
My LCD is a Samsung LN52A630. It's very similar to the
A650 in this review. It's a 120Hz (Auto Motion Plus) display that ranked very high in reviews, even being very comparable with what Pioneer had at the time. It supports 24Hz for movies, but I'm not quite sure if it would support the 70Hz and 75Hz refresh rates. I've only used the PC port (analog RGB) a couple of times with my laptop and my old Q9650 rig I'd built in 2010. It would be interesting to see what settings I could use with my i5-2500K/EVGA GTX 670 FTW! rig, but the last time I connected it to this display, I used the HDMI port.
I run my games with Vsync enabled, and with Nvidia's Adaptive Vsync, the games stay ultra-smooth even when the fps drops below 60. I get to keep my settings at 1080p, with most games set to maximum. It's a win-win!
What model is your television?
Final Fantasy XIII rendered at a higher resolution on the PS3, but it was still output at 720p on the display. I don't recall my Samsung LCD ever showing 360 games as 720p output, but with the PS3 they would show up at that resolution 99% of the time. PSN games and Wipeout HD show up as 1080p on the PS3, and man does Wipeout HD look excellent at that resolution.
Do you think the limited VRAM of the PS3 was the reason why most games were output at 720p, or does the 360 just have better upscaling hardware onboard?