^^^
That's pretty much what I meant. I'm not completely familiar with all the technical details of analog TV signals, so forgive me if my statements aren't 100% accurate. My point remains the same though: even the best HDTVs will try to deinterlace the signal coming from a Genesis/Mega Drive, even though the console outputs a progressive 240p picture, so there are no fields to recombine and the whole process is unnecessary and undesirable.
My HDTV has one of the best deinterlacers on the market (a Faroudja DCDi; I specifically chose it with that feature in mind), and it generally does a pretty good job with 2D graphics. Still images look pixelly as they should, while moving sprites get a sort of 2xSaI look, which really isn't that bad.
It starts to fall apart, though, with flickering effects like the ones often used for transparency or shadows. The deinterlacer tries to guess what goes in between the frame with the effect and the frame without it, and even the best algorithms can't stop the result from becoming a horrible horizontally-striped mess.
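To illustrate what I mean (purely a toy Python sketch, not the actual Faroudja algorithm or any real TV's pipeline): if a TV naively weaves two consecutive 240p frames together as if they were the two fields of a 480i frame, a sprite that flickers every other frame for "transparency" ends up on every other line of the output:

```python
# Toy sketch: why weaving two consecutive 240p frames as if they were
# 480i fields turns flicker-transparency into horizontal stripes.

WIDTH, HEIGHT = 8, 4          # tiny "screen" for illustration
BG, SPRITE = ".", "#"

def console_frame(n):
    """Progressive 240p frame n: the sprite is drawn only on even
    frames (the classic flicker-transparency trick)."""
    draw_sprite = (n % 2 == 0)
    return [[SPRITE if draw_sprite and 2 <= x < 6 else BG
             for x in range(WIDTH)]
            for _ in range(HEIGHT)]

def weave(field_even, field_odd):
    """Naive weave deinterlace: interleave two fields line by line,
    as if they were the halves of a single 480i frame."""
    out = []
    for even_line, odd_line in zip(field_even, field_odd):
        out.append(even_line)   # lines 0, 2, 4, ... from frame n
        out.append(odd_line)    # lines 1, 3, 5, ... from frame n+1
    return out

frame_a, frame_b = console_frame(0), console_frame(1)
for row in weave(frame_a, frame_b):
    print("".join(row))
# The output alternates sprite rows and background rows: the striped artifact.
```

Smarter motion-adaptive deinterlacers do better than this naive weave, but with content that deliberately changes every single frame there's simply no "correct" answer for them to find.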
For my classic consoles, I really don't want any of this. CRTs don't have any of these problems, and present the incoming images exactly as they are supposed to be.
QFT.
The fewer digital filters applied to the image, the better. Philips TVs in particular are notorious for piling on all sorts of redundant filters that are supposedly meant to improve the picture but usually just turn it into a blurry, unnatural-looking mess. For any kind of computer-generated imagery, you want as little tampering with the incoming image as possible before it hits the screen. Classic CRT TVs fit that bill perfectly.

