I wonder why everyone wants to turn down sharpness... I set it to max to have lovely square sharp pixels :P
Because it creates artificial halos to increase contrast between dark and bright pixels to give the appearance of sharpness.
http://www.joeredifer.com/crap/sharp.jpg
I like sharp pixels too, but excessive sharpness filtering (technical term: edge enhancement) causes ringing artifacts (also known as edge halos) around sharp contrasts. It's an unavoidable consequence of the image processing filters that are used. At the worst, it makes objects look as if they are digitally superimposed on the background, instead of naturally blending into them.
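To illustrate: here's a minimal sketch of unsharp masking, one common edge-enhancement technique (the kernel and amount are illustrative, not any TV's actual filter), applied to a 1-D step edge. The over- and undershoots it produces are exactly those halos:

```python
import numpy as np

def unsharp_mask(signal, amount=0.5):
    # Blur with a simple 3-tap average (edge-padded so flat regions
    # stay flat), then add the difference back to boost edges.
    padded = np.pad(signal, 1, mode="edge")
    blurred = np.convolve(padded, np.ones(3) / 3, mode="valid")
    return signal + amount * (signal - blurred)

# A dark-to-bright step edge (8-bit pixel values).
edge = np.array([50.0] * 6 + [200.0] * 6)
print(unsharp_mask(edge))
# Flat areas are untouched, but next to the step the 50 dips to 25 and
# the 200 overshoots to 225: a dark halo on one side of the edge and a
# bright halo on the other, just like the picture linked above.
```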
Edge enhancement is also the bane of home theatre enthusiasts. Too often do movie studios boost the sharpness of a movie for a DVD/Blu-ray release to compensate for the softness that comes with analog film. It can make a movie look very ugly and unnatural, especially when in motion.
Here's a more detailed write-up about edge enhancement and its nasty effects: link
That kind of artificial sharpening does suck, so one would have to find a "sweet spot" where the original signal is neither smoothed nor edge-enhanced...
Exactly. And the default sharpness setting for pretty much any TV is way way way too high. That's why we crank it down, to find the sweet spot. A good calibration DVD helps a lot in this.
[edit]
700th post, yay!
I have this problem endlessly... as I prefer sharp pictures, but hate the artifacts.
That's why Dreamcast VGA is so great. The TV does not apply any noise or MPEG artifact filters that blur the image, and so it does not need to artificially sharpen the image either. The result is that the pixels on your TV are exactly as sharp as they are in the Dreamcast's VRAM.
^
On CRT sets I think having sharpness all the way up is preferable though (as they tend to have the opposite issue, at least SD sets do, with the image being softened by turning sharpness down).
A bit of a tangent (and I have mentioned it before), but I tried out my pre-TMSS Model 1 Genesis via RF (horrible RF noise in these models) on a friend's old Zenith Advanced System 3 and found it to look amazing. Close to S-Video or even some of the RGB screenshots I've seen. That set must have an amazing comb filter along with awesome automatic fine tuning. Turning sharpness all the way up made the pixels very sharp and visible, enough to make my friend ask me to turn it down and make it softer. Hell, the color didn't even start to bleed until close to 90% (by which point it was extremely oversaturated).
I mean, 480p on a native display (be it an HD CRT supporting 480p natively, or an LCD EDTV with that native resolution or one high enough to properly approximate it) isn't THAT much better than 480i on a proper SDTV or an LCD SDTV highly optimized for that purpose. By that latter point I mean: 480p content will look great on a digital display with 480 vertical pixels (so no scaling) but will look pretty blurry on a 1360x768 native display; see the sketch after this post. Even 640x480 scaled to my 1440x900 LCD screen is bothersome (tolerable, but a bit annoying compared to a proper VGA monitor).
That reminds me: a VGA monitor... that would be the ideal way to display it; a nice crisp CRT VGA monitor. As long as you don't need widescreen, a VGA CRT is a great option. (We've got a 20" VGA monitor, originally from a Macintosh workstation, that looked amazing with my friend's Xbox 360 and pretty good with the Wii as well, though not amazingly better than in 480i. Plus, in 480p on the Wii you disable the 240p functionality for VC games; that's with a proper SD CRT, of course.)
A 1080p LCD/Plasma set should have high enough resolution to adequately reproduce 480p, though (not as good as a CRT ED/HDTV, let alone a VGA monitor).
My other point is that 480/60p is the big difference. Proper 480/60i (with no combing, framerate capped at 30 Hz) on a nice TV will approximate 480/30p in general (things like dot pitch being other factors).
On an HD set (CRT or LCD), of course 480p will look better than 480i, in some cases far more dramatically than in others (some HDTVs have similar SD capabilities to dedicated SD LCDs though; especially good if they accept SD content via component video).
I'm getting off topic, but what really bothers me is people who buy these "nice" big HDTVs only to plug an SD satellite/cable converter/decoder box into them, in some cases via RF!!! (In fact, that's what installation guys tend to do; my dad had to redo my grandparents' TVs to hook up composite at least.) Even with composite you're getting crap for HDTVs, stuff that will look better on cheaper SD CRT sets. The biggest annoyance is people constantly having the display set to stretched (or zoom, for that matter), either deforming or cropping the 4x3 image to avoid any borders... on top of that, they often have it set such that even proper widescreen content gets cropped and/or deformed. (I personally play most modern SD content in anamorphic mode, so a smaller screen to have a 16x9 image and a tighter scan as well; pretty much any N64-and-up game that supports anamorphic display, I will opt for it.)
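A quick sketch putting numbers on the scaling point in the post above (the 960-line panel is a hypothetical integer-scale case; the others are the panel heights mentioned in this thread):

```python
# When the panel height is an integer multiple of 480, every source line
# maps to whole panel lines and stays sharp; any fractional factor forces
# interpolation between lines, which is the blur described above.
for panel_height in (480, 960, 768, 900):
    factor = panel_height / 480
    kind = ("integer -> pixel-perfect" if factor.is_integer()
            else "fractional -> interpolated, soft")
    print(f"480p on a {panel_height}-line panel: x{factor:.2f} ({kind})")
```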
How big the difference is between 480i and 480p depends a lot on the quality of the deinterlacer in the TV. My Toshiba LCD has a Faroudja DCDi, which is one of the best deinterlacer chips available on the market. I specifically chose my TV for that reason. The result is that 480i is only slightly less sharp than 480p, but otherwise interlaced video looks excellent.
My parents on the other hand have a Sony Bravia from 2006, which has a deinterlacer so incredibly bad it may as well not have a deinterlacer at all. Worst of all, every once in a while that deinterlacer feels like going out of whack and deinterlaces the fields in the wrong order, causing all sorts of combing artifacts and stuttering. It's mostly noticeable with scrolling news bars at the bottom of the screen, they constantly switch between smooth-stuttering-smooth-stuttering-etc. The difference between 480i and 480p is night-and-day on that TV.
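For anyone wondering what those combing artifacts actually are: a toy sketch, assuming fields stored as NumPy arrays, of the two naive deinterlacing strategies. Motion-adaptive chips like the DCDi roughly blend between these two per pixel, which is why their quality varies so much:

```python
import numpy as np

def weave(top_field, bottom_field):
    # Interleave two fields into one full-height frame. Perfect for
    # static content, but combs when anything moves between the fields.
    frame = np.empty((top_field.shape[0] * 2, top_field.shape[1]),
                     top_field.dtype)
    frame[0::2] = top_field
    frame[1::2] = bottom_field
    return frame

def bob(field):
    # Line-double one field: full frame rate, no combing, but only half
    # the vertical detail.
    return np.repeat(field, 2, axis=0)

# One-line fields of an object that moved two pixels between fields:
top    = np.array([[0, 9, 9, 0, 0]])
bottom = np.array([[0, 0, 0, 9, 9]])
print(weave(top, bottom))  # adjacent rows disagree -> classic comb teeth
print(bob(top))            # clean edges, taken from one moment in time
```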
I don't think any digital TVs handle 240p properly; that is, unless there's some awesome TV that has a single-field deinterlacing mode (either straight line doubling or even interpolation would be OK; keeping fields separate is what's important, to minimize loss in horizontal resolution and maintain the 60 Hz framerate). In many cases single-field would be preferred for 480i or 1080i content as well (if the frame rate ever exceeds 1/2 the field rate).
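The two single-field scalers mentioned there are trivial; here's a sketch, assuming a 240-line progressive image stored as a NumPy array (in a real TV the hard part would be detecting 240p and switching modes, not the scaling itself):

```python
import numpy as np

def line_double(field):
    # Straight line doubling: repeat each of the 240 lines -> 480 lines,
    # one output frame per input frame at the full 60 Hz rate.
    return np.repeat(field, 2, axis=0)

def line_interpolate(field):
    # Fill the missing lines with the average of their neighbours:
    # slightly softer vertically, but smoother diagonals. The last line
    # is simply repeated since it has no neighbour below it.
    field = field.astype(float)
    out = np.repeat(field, 2, axis=0)
    out[1:-1:2] = (field[:-1] + field[1:]) / 2
    return out
```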