
Page 4 of 10
Results 46 to 60 of 148

Thread: Sega Mega Drive vs TG-16 Graphics chips

  1. #46 Nameless One (joined Jul 2006, 70 posts)

    I've never heard of shmups being called the least CPU-intensive genre; that privilege has always gone to turn-based RPGs or digital comics. If that's the case, the Genesis must have one crappy CPU, since its flagship shmup (Thunder Force IV) has some pretty bad slowdown.

    As far as non-shooters go, here is an example of three BG layers: the two ships in the BG are sprites, the ship in the foreground is a background layer (tiles), and the moon is a sprite as well. Throw in some good AI for the green sword skeleton = no slowdown.





  2. #47 Kamahl (Hero of Algol; Belgium; joined Jan 2011, 8,637 posts)

    Quote Originally Posted by Jorge Nuno View Post
    There's everything to make it 12-bit? You mean the nibble alignment of each color component, the 0000BBB0GGG0RRR0? Or isn't that what you're talking about?
    That too. I'm just saying that the VDP is internally working at 12-bit; there would be very little modification required to give the genny a 12-bit master palette. Heck, that's what they did for the Game Gear :S.

    Quote Originally Posted by Jorge Nuno View Post
    About the 128-color palette: could be more, could be a group of palettes for each plane if they really wanted.
    Yeah, but 128 is another thing that would only take slight tweaks (plus twice the CRAM) to get.

  3. #48 Outrunner (Azeitão, PT; joined Jul 2009, 723 posts)

    Quote Originally Posted by Kamahl View Post
    That too. I'm just saying that the VDP is internally working at 12-bit; there would be very little modification required to give the genny a 12-bit master palette. Heck, that's what they did for the Game Gear :S.



    Yeah, but 128 is another thing that would only take slight tweaks (plus twice the CRAM) to get.

    I think they just didn't want any more internal memory (too much die area?).
    The CRAM as it is now uses maybe 3456 transistors (assuming SRAM at 6 transistors per cell: 2 cross-coupled inverters plus read/write-line enablers).

    Pumping it up to 12 bits and 128 entries would result in 9216, excluding the little extra addressing logic: almost triple.

    On the DAC side, it would need some 6 extra resistors and transistors per added bit (and resistors are BIG).
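His arithmetic checks out; a quick sanity check of those figures, assuming the same 6-transistor SRAM cell he describes:

```python
# Sanity check of the CRAM transistor estimates above, assuming the
# 6-transistor SRAM cell described (2 cross-coupled inverters plus
# read/write-line access transistors).

def cram_transistors(entries, bits_per_entry, transistors_per_cell=6):
    """Rough transistor count for a color RAM of the given geometry."""
    return entries * bits_per_entry * transistors_per_cell

stock = cram_transistors(64, 9)       # stock MD: 64 entries of 9-bit color
upgraded = cram_transistors(128, 12)  # hypothetical: 128 entries of 12-bit
print(stock, upgraded, round(upgraded / stock, 2))  # 3456 9216 2.67
```

So "almost triple" is right: a 2.67x increase in CRAM cells alone, before the extra addressing logic and DAC resistors.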

  4. #49 Kamahl (Hero of Algol; Belgium; joined Jan 2011, 8,637 posts)

    Quote Originally Posted by Jorge Nuno View Post
    I think they just didn't want any more internal memory (too much die area?).
    The CRAM as it is now uses maybe 3456 transistors (assuming SRAM at 6 transistors per cell: 2 cross-coupled inverters plus read/write-line enablers).

    Pumping it up to 12 bits and 128 entries would result in 9216, excluding the little extra addressing logic: almost triple.

    On the DAC side, it would need some 6 extra resistors and transistors per added bit (and resistors are BIG).
    You're most likely right, but it still surprises me that no one at Sega realized that having only 4 palettes could turn out to be a big problem. Maybe they didn't realize there would be a CRAM bug, and figured that if more colors were needed it would only be a matter of swapping them mid-frame...

  5. #50 Chilly Willy (ESWAT Veteran; joined Feb 2009, 6,744 posts)

    It's amazing how non-programmers think they know so much about programming.

    Turn-based RPGs don't require a lot of CPU power because MOST of the time is spent waiting on the user input. Real-time simulations (a super-class of the RPG) take TONS of CPU power. Sports games are universally recognized as the most taxing on a system, and of the consoles of that time, the Genesis is universally recognized as the best at sports games.

    Shooters are NOTHING in the way of processing - the most work they require is collision detection, and good collision detection routines are fairly simple and quick. It's much like a good sort algorithm. The SNES being slow with shooters is more an indication of how little time it has to update the graphics than of how much CPU power it has. The TG16 is practically designed for shooters, and it shows. A shooter requires high video throughput and little game logic.
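For illustration, the core of that collision work is usually nothing more than an axis-aligned bounding-box overlap test; a minimal sketch (coordinates and sizes are made-up example values):

```python
# Minimal axis-aligned bounding-box (AABB) overlap test: the core of
# most 2D shooter collision detection. Values are illustrative.

def aabb_overlap(ax, ay, aw, ah, bx, by, bw, bh):
    """True if rectangle A (x, y, width, height) overlaps rectangle B."""
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

# A player shot at (100, 50), 8x8, against an enemy at (104, 52), 16x16:
print(aabb_overlap(100, 50, 8, 8, 104, 52, 16, 16))  # True
```

Four compares and four adds per pair: exactly the kind of simple, branch-light work he's describing.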

  6. #51 Kamahl (Hero of Algol; Belgium; joined Jan 2011, 8,637 posts)

    Quote Originally Posted by Chilly Willy View Post
    It's amazing how non-programmers think they know so much about programming.
    But I AM a programmer, just not a low-level one. That doesn't mean I can't understand how this shit works.

    As for your RPG comment, you could obviously tell I was joking, so no need to attack the point; there's nothing there for me to defend.

    "Shooters are NOTHING in the way of processing"

    I must ask once again... ARE YOU SERIOUS?

    "The SNES being slow with shooters is more an indication of how little time it has to update the graphics than in how much CPU power it has"

    No... the SNES being slow with shooters is more an indication of how much of a piece of garbage the processor is. Shooters have barely any animation other than moving a lot of sprites; heck, even the enemies that are animated are usually all on the same frame, and when they're not, you're not fighting different ones.

    And I'm not denying that sports games are the most taxing (since they require huge amounts of calculation for the AI), but your claim that shooters don't require a lot of processing is just ridiculous.

    EDIT: It's also worth pointing out that all those dynamic tile effects would be unnecessary in a sports game.
    Last edited by Kamahl; 01-26-2011 at 10:23 PM.

  7. #52 Nameless One (joined Jul 2006, 70 posts)

    Quote Originally Posted by Chilly Willy View Post
    Turn-based RPGs don't require a lot of CPU power because MOST of the time is spent waiting on the user input
    That's right; the same goes for digital comics and point-and-click games.

    Quote Originally Posted by Chilly Willy View Post
    Shooters are NOTHING in the way of processing - the most work they require is collision detection,
    There are typically many more projectile and enemy sprites on screen at once, all requiring collision detection; that is the point at which slowdown will most likely occur.
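That growth is easy to quantify: with no broad-phase filtering, the number of pairwise tests grows with the square of the sprite count. A rough sketch:

```python
# With no broad-phase filtering, every object pair must be tested for
# collision, so the work grows with the square of the sprite count.

def pair_checks(n):
    """Pairwise tests among n objects, checking every pair once."""
    return n * (n - 1) // 2

print(pair_checks(10))  # 45
print(pair_checks(50))  # 1225: 5x the sprites, about 27x the work
```

Which is why real shooters cull the pair list (bullets vs. enemies only, coarse grids, etc.) before doing any box tests.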

  8. #53 Chilly Willy (ESWAT Veteran; joined Feb 2009, 6,744 posts)

    Quote Originally Posted by Kamahl View Post
    But I AM a programmer, just not a low lvl one, doesn't mean I can't understand how this shit works.

    As for your RPG comment, you could obviously tell I was joking so no need to attack the point, as there is nothing there for me to defend.
    The games that use the most power are obviously those that compute pi on the fly.


    "Shooters are NOTHING in the way of processing"

    I must ask once again... ARE YOU SERIOUS?
    Yes. Other than collision detection, name one thing about a shooter that requires anything more than simple logic... which can be easily replaced with a table lookup.
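The table-lookup idea can be sketched simply: the classic shmup trick is a precomputed sine table, so an enemy's wave path costs one table read and a shift per frame instead of any runtime trig. A hedged sketch (table size and fixed-point scale are illustrative):

```python
# The classic shmup table-lookup trick: precompute a sine table once,
# then an enemy's wave path costs one table read and a shift per frame.
# The 256-entry table and 8.8-style fixed-point scale are illustrative.
import math

SINE_TABLE = [round(math.sin(2 * math.pi * i / 256) * 256) for i in range(256)]

def wave_offset(frame, amplitude=32):
    """Vertical offset for a sine-wave enemy path: no runtime trig."""
    return (SINE_TABLE[frame & 255] * amplitude) >> 8

print(wave_offset(0))    # 0: start of the wave
print(wave_offset(64))   # 32: quarter cycle, peak amplitude
print(wave_offset(192))  # -32: three-quarter cycle, trough
```

On real hardware the table lives in ROM, and the multiply-and-shift is often baked into per-amplitude tables too.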


    "The SNES being slow with shooters is more an indication of how little time it has to update the graphics than in how much CPU power it has"

    No... the SNES being slow with shooters is more of an indication of how much of a piece of garbage the processor is, shooters have barely any animation other than moving a lot of sprites, heck even the enemies that are animated usually are all on the same frame, and when they are not, you're not fighting different ones.
    I'm not saying I like the 65816, but it's not nearly as bad as many people think. It's particularly good at byte operations. There are threads over at SpritesMind where a few of us went back and forth over various types of operations on the 65816 and 68000, just trying to prove it was or was not an able processor. In any case, the kind of processing a shooter needs is the kind the 65816 is fastest at handling. Again, I have to say slow shooters on the SNES are almost certainly a matter of not enough time to update VRAM. Remember that the SNES can only update VRAM during vblank... which is also the only time it can do certain other tasks, like reading the controllers.
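The vblank-budget argument can be put in rough numbers. The transfer rate and vblank fraction below are ballpark assumptions for illustration, not exact SNES timings:

```python
# Ballpark vblank-budget arithmetic; the transfer rate and vblank
# fraction are illustrative assumptions, not exact SNES timings.

def vblank_budget(bytes_per_second, vblank_fraction, fps=60):
    """Bytes transferable to VRAM per frame if only vblank is usable."""
    return int(bytes_per_second * vblank_fraction / fps)

# Assume ~2.6 MB/s transfer rate and ~15% of each frame in vblank:
print(vblank_budget(2_600_000, 0.15))  # 6500 bytes/frame, ~200 32-byte tiles
```

A few kilobytes per frame is not much when a shooter wants lots of animated sprite frames streamed in, which is his point.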


    And I'm not denying that sports games are the most taxing (since they require huge amounts of calculations for the AI), but your claim that shooters don't require a lot of processing is just ridiculous.
    Again, please point to ANY part of a shooter that is taxing other than the collision detection.


    EDIT: It's also worth pointing out that all those dynamic tile effects would be unnecessary in a sports game.
    Well, that depends on the sport being simulated, and how good your sprite support on the console is.

  9. #54 roundwars (Outrunner; California, USA; joined Jan 2010, 574 posts)

    Quote Originally Posted by Chilly Willy View Post
    Wrong. GoT proves I'm right. There's a reason the best looking games on the TG16 are shooters: shooters require the least amount of cpu power to play.
    I think the real reason is that shooters don't really require a tile layer for anything other than eye candy. You can pull off cool looking parallax effects cheaply with only one tile layer, but not if you also want to have a foreground layer for the character to run and jump on, like you'd need in a platformer. Especially if this is a platformer with 360º scrolling, like Sonic.

  10. #55 kool kitty89 (Hero of Algol; San Jose, CA; joined Mar 2009, 9,724 posts)

    Quote Originally Posted by Jorge Nuno View Post
    The window plane doesn't count as a layer. It's either plane A or W; you can't show them together (one behind the other). Oh, plane W is unscrollable too.
    The window layer is just the solid color far BG layer, isn't it? (like the PCE and SNES also have)




    Quote Originally Posted by Kamahl View Post
    I've also heard about a planned NES compatibility that was later dropped for the SNES.
    I think that's just a rumor. The SNES has about as much similarity to the NES as the PC Engine does... (also a directly 6502 compatible CPU) or as much as the Saturn does to the Genesis. (with the exception that they re-used the serial I/O logic of the NES, but that's no indication either: just a smart cost saving design point -they used it again in the Virtual Boy)

    They might have considered BC at some point, but there's little if any indication that it had any significant impact on the SNES's final design.

    Use of the 65816 (actually a custom derivative with some modifications to the memory map and instruction set over the standard 65816 iirc) has many more possible reasons: the biggest would be the fact that it was cheap and easy to license, plus building on the common architecture used on the NES and the 2nd biggest home console in Japan. (PCE)

    I'm sure they considered using a 68k and maybe even some others (maybe even x86 -NEC had some pretty decent low-cost enhanced 8088/8086 derivatives -rather close to the 80186 in performance and up to 16 MHz but lacking the internal peripheral logic and adding Z80/8080 compatibility), but in any case none of those would have been as attractive in terms of licensed production and custom implementation. (they added the I/O logic as well as a fast multiplication unit on-chip with the CPU core)

    The odd memory map in the SNES has less to do with the CPU and more to do with Nintendo's own design philosophy.


    The CPU architecture they chose wasn't bad at all, but the fact that they clocked it at 2.68 MHz was a major drag. (Onboard RAM speed, and all earlier ROM chips in SNES games, were that slow.)
    I've gotten conflicting information on just how the 650x architecture makes bus accesses:
    tomaitheous mentioned you need RAM clocked 2x as fast as the CPU, but there seem to be many cases where that isn't implemented as such in hardware (the PCE's CPU is a modified derivative with modified bus logic for plain single-cycle accesses).
    This came up on AtariAge a while back for the 65816 and 6502/C02, and I got very different information: basically, that the 6502 needed memory clocked at its internal clock speed to have 0 wait states, but the '816's address latch required moderately faster memory than that (~15% faster, so at 2.68 MHz it would need 325 ns memory rather than 375).
    Maybe allowing single-cycle accesses also requires some sort of external latch or other logic to address the issue. (I know that's how memory can be clocked at 1/4 the 68k's bus speed: i.e. a 16 MHz 68k could have zero wait states with 4 MHz memory, 250 ns accesses; or maybe that's 5.33 MHz/187 ns, since you need the data on the 68k bus by the end of the 3rd cycle after a memory request, not the end of the 4th.)

    The SNES's CPU should have been closer to 7-10 MHz in internal RAM, with slower speed modes for accessing ROM.
    Even if they were forced to have DRAM running at 2x the CPU's clock speed, by 1990 100 ns DRAM should have been reasonable, which would have allowed the CPU to run at least at 1/5 the master clock with 0 wait states (21.48/5 is ~4.3 MHz): 60% faster than what the CPU normally runs at (2.68 MHz). Or, if they clocked it faster and added a wait-state mechanism, it could average closer to 5 MHz with 100 ns RAM. (Again, assuming it actually does need 2x the speed; otherwise they could have pushed it to 10 MHz like the SA-1.)

    The fact that it doesn't even run at 3.58 MHz in RAM is really odd, since even 120 ns FPM DRAM should have allowed that speed even if the CPU really did need 2x-speed RAM.

    But no, it runs at 2.68 MHz and I really don't know why. (A very bad thing to cheap out on if that was the issue, especially since any savings from using slow RAM would have been very short-lived as faster speeds got cheap quickly: at the very least, 3.58 or 4.3 MHz RAM would have been cheap for most of the console's life, and again that's assuming they actually needed 2x-speed memory for half-cycle accesses.)
    That would imply they dropped down to 160 ns DRAM AND didn't add any logic to avoid the 2x-speed necessity, and thus opted for a slow CPU. (So they ended up with a sound system that was way overbuilt and cost far more than what it was used for; something much cheaper and simpler, like the Ricoh PCM chip in the FM Towns/Arcade/MCD, would have been fine, more so since Nintendo had a good relationship with Ricoh as their primary chip vendor.)
    Hell, that wouldn't even have been a good cost-saving move, as it would have meant needing fast ROM just to sustain 2.68 MHz: a very bad move all around.

    There must have been some logic/mechanism (possibly a latch of some sort, like the one used for split DMA on the Amiga and ST) to allow 1-cycle access times and thus normal-speed RAM/ROM (maybe with that 15% overhead for the multiplexed address bus, assuming the chip used in the SNES didn't demultiplex it with 24 separate address pins). Then not only could cheaper ROM have been used, but even common 160 ns FPM DRAM would have allowed 5.37 MHz, and they very likely could have pushed for 7.16 MHz using 120 ns DRAM. (You'd probably need 80 ns DRAM for 10.74 MHz, which for 1990 would be pushing it a little; still, it could have been a very good investment given the added performance, and 80 ns RAM was getting cheaper quickly anyway: the Sega CD used 80 ns FPM DRAM in '91, for example. Note that RAM prices were pretty stagnant in the early/mid 90s in terms of capacity, so chips already in the lower-end mass-market bracket stayed at that price with little change until '96, but faster-speed RAM would still be falling in price toward that baseline; even in 1990, 80 ns FPM DRAM was already toward the lower-end category in the grand scheme of things. I'm only saying this in case anyone has read my previous posts mentioning the stagnating early/mid-90s RAM prices, to avoid contradiction/confusion.)
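The clock-rate and access-time figures in the last few paragraphs all come from the same relationship: a memory part's nanosecond rating is roughly the bus period it can serve with zero wait states. A quick sketch of that arithmetic (the 15% latch-overhead factor is the estimate quoted above, not a datasheet figure):

```python
# The nanosecond rating of a memory part maps directly to the fastest
# zero-wait-state bus clock it can serve: period (ns) = 1000 / MHz.

def cycle_ns(mhz):
    """One clock period in nanoseconds at the given frequency in MHz."""
    return 1000 / mhz

MASTER = 21.48  # master clock figure used in the posts above, MHz

print(round(cycle_ns(2.68)))         # 373 ns: the 2.68 MHz CPU clock's period
print(round(cycle_ns(2.68) / 1.15))  # 324 ns: with the ~15% latch overhead
print(round(MASTER / 5, 2))          # 4.3 MHz: master clock / 5
print(round(cycle_ns(MASTER / 5)))   # 233 ns: the RAM speed that clock implies
```

Which matches the figures in the posts: ~375 ns and ~325 ns at 2.68 MHz, and ~4.3 MHz as 1/5 of the master clock.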



    The processor is a 16-bit variant of the NES processor, that's for sure at least. It has a lot of weird memory access limitations, it's complete crap at working with negative numbers, and having half the clock speed of the genny or the TG-16 doesn't help at all...
    The clock speed is actually worse than that, as above... and it's arguably 8-bit (not that "bitness" is a useful metric as such): it's on an 8-bit bus, but with added internal logic for 16-bit operations and a 16-bit ALU (though some have argued they could have made do with an 8-bit ALU at the same performance; not chained 8-bit ALUs either, but a single one configured in a specific manner).
    In that respect it's sort of like the 8088 ("16-bit" internal and 8-bit external), except the 8088 was a derivative of the 8086 with an 8-bit bus, while there never was a version of the '816 with a 16-bit bus. (I guess it was too focused on the low-cost/embedded market to merit implementing a wider bus; no version with the address lines demultiplexed either, for that matter. The 8086/88 also used a multiplexed address bus.)

    Another odd case would be the 6809 and 6309, as those are technically 8-bit but with 16-bit registers and operations (but then the Z80 also does 16-bit operations as such, much more slowly, and the 68000 has all 32-bit registers internally and, aside from the data bus and ALU, could be considered a 32-bit CPU). The 6309 actually could have been a really attractive CPU for a game console... it was only ever produced for up to 3.58 MHz, but its clock-for-clock performance was very high, as was its cost/performance ratio, and it was relatively programmer-friendly on top of that. (Though I doubt Nintendo would have been able to license it like the 6502 or '816; MOS/WDC/Rockwell had extremely favorable licensing agreements.)

    Again, the whole "bitness" thing is a bunch of BS and pointless: comparing architectures with actual benchmarks and weighing strengths and weaknesses is far more complex than that. (And even benchmark programs can be misleading, especially compiled code, as some processors fare poorly with high-level languages; the 650x is among those, though the C02 and '816 are a lot better off than the plain 6502.)

    A 7 MHz version of the same chip would probably be too costly. The VDP and sound chips of the SNES were too expensive.
    I doubt that was ever the issue: the only possibility of an actual speed restriction on the CPU side would have been Ricoh not managing high enough yields at high clock speeds soon enough, but that would have been a last-minute hack. (Which doesn't seem to be the case.)

    As above, it seems more like Nintendo made some bad decisions in terms of the memory interface for the CPU as well as the RAM speed. (Even in the worst case it definitely should have been 3.58 MHz, or maybe 4.3 MHz, but it seems they were extremely short-sighted and cheaped out on RAM as well as on the logic to allow a slower bus rate without wait states. Moderately faster RAM was already cheap and common and got cheaper quickly, while allowing slower bus accesses would greatly facilitate lower-cost ROM. Again, that's assuming the worst case of 2x-speed accesses... otherwise I have no idea why they did what they did.)

    If it WAS cost related, it was probably in part due to expenses moved toward the SPC700 module. (A very bad move in hindsight, but something several others would repeat with overbuilt sound systems in later generations, sometimes both overbuilt AND underpowered due to unnecessary features being added and more important ones left out, like the Saturn's advanced FM synthesis of wave samples and its DSP, which were never, or almost never, used, while it lacked hardware ADPCM.) Part of it was the limited format used, too... with more flexible tools, developers might have been able to enable all 32 hardware channels (by disabling reverb/echo), maybe even on a per-channel basis (i.e. 4 channels without reverb in place of 1 with it, for 4+7, 8+6, 12+5, 16+4 channels, etc.).
    Again, for the average user, the older/cheaper/simpler Ricoh 8-channel stereo 8-bit PCM synth chip probably would have been very similar in performance (if not ahead in some areas). 128k of wave RAM and the Ricoh chip would meet or exceed most of what was done on the SNES. (It would be especially cost-effective if they added an interface to use DRAM, let alone sharing the main bus for RAM and ROM, boosting main RAM to 256 kB with flexible CPU/sound use; the only reason to use RAM rather than ROM is to let samples be compressed in ROM and decompressed into RAM, probably using 4-bit ADPCM with an encoder optimized for 8-bit samples.)
    Or, if they did stick with just 64k, they could have supplemented the PCM with an FM chip. (They could go the FM Towns route with the Ricoh chip plus a YM2612, similar to the MCD's configuration and Sega System 18/32, except the latter had dual YM2612s; the 2612 would be preferable to the 8-channel 2151, since the latter required an external DAC that would add a little to board space, more so for a stereo DAC, and the chip itself might have been more expensive.)

    But this is getting a bit off topic.
    6 days older than SEGA Genesis
    -------------
    Quote Originally Posted by evilevoix View Post
    Dude it’s the bios that marries the 16 bit and the 8 bit that makes it 24 bit. If SNK released their double speed bios revision SNK would have had the world’s first 48 bit machine, IDK how you keep ignoring this.
    Quote Originally Posted by evilevoix View Post
    the PCE, that system has no extra silicone for music, how many resources are used to make music and it has less sprites than the MD on screen at once but a larger sprite area?

  11. #56 kool kitty89 (Hero of Algol; San Jose, CA; joined Mar 2009, 9,724 posts)

    Quote Originally Posted by tomaitheous View Post
    That's true, but there's still a difference. The fake parallax on the TG16 doesn't change when you view it over composite or svideo or RGB, like the fake colors of the Genesis. To me, that doesn't particularly make them comparable.
    Plus, the PCE's higher-res modes would make for BETTER opportunities to use dithering tricks for translucency and false higher color, on top of its added palette capabilities. (It would be more a technique to get around the 9-bit color limitation than the palette count, as opposed to the MD, which has BOTH issues.)
    The only problem is that the sprite count doesn't increase, and sprite pixels have to be the same size as the screen resolution... otherwise faking an 8bpp cell BG in 344 (or especially 512) width mode could have worked well. If using 512 width, it would have been great if sprites had a double-wide stretch setting. (Like Atari consoles/computers and, I think, the SMS; that might force vertical scaling though.)

    Still useful for FMV though, but a fat lot of good that did them, given they didn't get nearly the sort of investment in optimized cell-based lossy compression formats that the Genesis got (hardware ADPCM with DMA to the CD data stream, high-res modes, more flexible DMA for graphics, more subpalettes in all modes, and the potential to quantize down to 2bpp tiles and more reasonably use the high-res mode to get around the 9-bit RGB limit, especially without losslessly compressing the tile data). On the other hand, 2-bit still eats up VRAM as much as 4-bit, so that would make it harder to double buffer; maybe smarter to opt for the low and medium resolutions. The medium res also has the advantage of almost-square pixels in PAL, and PAL's 2-line chroma resolution would blend vertically as well as composite smears horizontally, so closer to a 1-pass V/H blur rather than just H as in NTSC; H wouldn't blur as much due to the higher color res, but the 7.16 MHz dot mode would blur more in PAL than the 5.37 MHz mode does in NTSC.
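The dithering trick mentioned above is just a checkerboard of two palette entries that composite blur averages into an in-between color; a minimal sketch (palette indices are arbitrary example values):

```python
# A 50% checkerboard dither between two palette entries: over composite
# video the alternating pixels blur into a color that's in neither
# palette slot. Palette indices here are arbitrary example values.

def dither_row(y, width, color_a, color_b):
    """One scanline of a checkerboard between two palette indices."""
    return [(color_a if (x + y) & 1 == 0 else color_b) for x in range(width)]

print(dither_row(0, 8, 1, 2))  # [1, 2, 1, 2, 1, 2, 1, 2]
print(dither_row(1, 8, 1, 2))  # [2, 1, 2, 1, 2, 1, 2, 1]
```

The higher the dot clock, the finer the checkerboard relative to the display's chroma resolution, which is why the high-res modes blend better.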



    OTOH, I'll bet more developers wished Hudson had completed the dual-plane 2bpp BG mode. (More so if that 2bpp mode was more like the SNES's, with 2bpp graphics stored in VRAM rather than padded to 4-bit and taking up more space, let alone converting the CRAM entries for the BG from 16 15-color palettes to 64 3-color palettes.)



    I'm curious. What do you think the CD system added to the Core system, that the core system couldn't handle without it?
    I'd think the CD (even the Super CD, but especially the CD-ROM2) would be significantly worse off than HuCard for the dynamic tiling effects.
    They could have done things to stream data off the CD in realtime, but that's rather limited (like Bram Stoker's Dracula; and pure tilemap-oriented graphics are already inherently "compressed" by way of character/tile mapping, the main thing exploited in the better SCD FMV).
    Actually, I wonder if any games DID do that... one major problem would be no streaming audio (especially if you allowed scrolling in both directions; if the game was fully linear and auto-scrolling, like many shmups, you could have a sub-track for streaming ADPCM alongside the majority of the bandwidth going to streaming graphics).
    The MCD has that pretty nice FM+PSG+PCM sound system to supplant CD-DA (or even low quality streaming audio) when necessary (so long as it's well used -unlike some FMV games... like Sewer Shark). Silpheed is the prime example of that of course. (streaming audio is only used in cutscenes, though the in-game music had me thinking it was streaming PCM initially)

    That would have been pretty neat actually: a 2D game sort of like bram stoker's method, but for normal pixel art/hand drawn graphics designed to keep a relatively steady and limited bitrate to facilitate on the fly streaming without a huge amount of buffering and CPU overhead to manage variable bitrate. (even with variable bitrate, enough buffering would prevent the need for reseek)

    True. But it's still fun to discuss the strengths of the systems. I think it's safe to say that the Genesis pretty much showed ALL of its strengths in software. Giving you a pretty good idea what it's capable of (and sometimes making you think it's capable of more than what it's doing). By comparison, the TG/PCE never really approach this level. You see some effects and such here and there, but you rarely see them all together in a single game, across multiple games, to give any real impression of what the system can do.
    Yeah, the PCE didn't get pushed that hard (it would have been especially interesting to see what some of the European developers could have done with it), unlike some other systems that weren't the biggest things on the mass market. (The PS3 got very well tapped in spite of its crazy-tough architecture; it might not ever have been maxed out due to practical limitations, but damn close, all things considered.)

    That's sort of what makes the PCE demoscene interesting though.


    - Multicolor bitmap mode. Using the large tilemap width of 128x64, and a large set of tiles housing patterned pixels in horizontal rows, you can make a real 128x112 bitmap display (scaled in both X and Y). Bitmap display as in: you just write to the tilemap, a single element (word), nothing else. Great for rendering special bitmap effects or 3D. The color count, IIRC because of VRAM space, was something like 32 colors or more. It used the 512-pixel res mode and 4x horizontally stretched/repeated pixels.
    Would there have been a possible exploit like that in the SNES's 512-pixel mode? (Then again, if you're going to go for that, why not just render to a 2x2-scaled mode 7 plane like Wolf3D?)
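The pseudo-bitmap trick tomaitheous describes can be modeled abstractly: if every needed pixel pattern already exists as a pre-built tile, plotting one low-res "pixel" is a single tilemap write. This is an illustrative sketch, not real PCE register-level code:

```python
# Illustrative model of a tilemap used as a coarse bitmap: every
# possible "pixel color" exists as a pre-built tile, so plotting a
# low-res pixel is a single 16-bit tilemap write (no tile uploads).
# The 128x64 map size matches the mode described above; everything
# else is a sketch, not real PCE register-level code.

TILEMAP_WIDTH = 128  # virtual tilemap width in tiles

def plot(tilemap, x, y, color_tile):
    """Set fat pixel (x, y) by writing one tilemap entry."""
    tilemap[y * TILEMAP_WIDTH + x] = color_tile

screen = [0] * (TILEMAP_WIDTH * 64)    # 128x64 map, one word per cell
plot(screen, 10, 5, 42)                # pixel (10, 5) becomes tile #42
print(screen[5 * TILEMAP_WIDTH + 10])  # 42
```

One word written per pixel is what makes it fast enough for software 3D: the renderer never touches tile pattern data at all.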







    Quote Originally Posted by Kamahl View Post
    Well I thought so too but Fonzie proved me wrong... "Mode 7" (perspective and all) on a stock genesis.

    And even before that there were demos showing 60fps background scaling (although there was a lot of tile repetition... so it was "cheating" a bit).

    I think I read on your blog you found a fast way to do pixel doubling on the PCE? That could allow it to do background scaling in software too, any news on that?

    You really should do a "Everything the PCE could do but didn't" demo. Just to show people how powerful it really is.
    As in the other thread, I definitely want more info on that... or to see it in action. (haven't gotten to that point in Pier Solar though)

    It looks like it's pretty low-res, which makes more sense, and if tight coding on the SNES's 2.68 MHz 65816 allowed Wolf3D at 112x96 (using a mode 7 plane for faster/simpler rendering, with 8-bit packed pixels scaled 2x2 to 224x192) or Toy Story, it's not really that strange that the Genesis could do it. (Toy Story does it in the 16-color mode too, I believe, but I haven't checked with an emu; if it's mode 7 that would have helped a fair bit, especially at 1/2 resolution, definitely not 1/4 res though, and it's mirroring the top/bottom of the screen and only rendering half.)
    Now doing that (even at 1/4 res pixels) fullscreen and close to 30 FPS (let alone dual plane split screen) would be another story.


    There are other tricks to approximate a flat 3D plane with a combination of software scaling and line scroll (and I think that's what Panorama Cotton is doing).
    Some others may do that in realtime, or with animation buffered into VRAM combined with line scroll for perspective. (Burning Force, Lawnmower Man, and G-Loc on the MD may be using animation more like Space Harrier 2, along with line scroll.)
    Last edited by kool kitty89; 01-27-2011 at 02:53 AM.

  12. #57 TmEE (Mastering your Systems; Shining Hero; Norway, Horten; joined Oct 2007, 10,112 posts)

    Quote Originally Posted by Kamahl View Post
    Actually, hardware-wise, the TG-16 is the system that could, theoretically, have the best animation. It's not a processor limitation but a VRAM bus limitation, and although the TG-16 bus is weaker, it can be used while the screen is being drawn.
    It also requires a lot of cartridge space that the TG-16 didn't have (since it used small HuCards), or a lot of RAM (which it only had with the Arcade Card), so all its games have less animation, even though they could have much more.

    So that's another point for the genny. The SNES bus is particularly bad, it's about the same as the TG-16 bus but without the ability to use it while the screen is being drawn.

    You could also shut down the screen drawing a bit earlier (sacrificing a few scanlines) to get about twice the bandwidth, or even more. It's used in Soulstar, for example, and doing so the genny completely humiliates both the TG-16 and the SNES in animation.

    Animation like what you see in Gunstar Heroes, Alien Soldier, and other games that build bosses and such out of a bunch of sprites IS something only the Genesis was fast enough to do, and they do look awesome.
    Quote Originally Posted by tomaitheous View Post
    No. Ever play Sapphire? There's 30fps morphing animation and all kinds of other animation going on at the same time (frame pixel updates, not just SATB entry updates). Kamahl hit the nail directly on the head. While the transfer to vram is slower on the TG than the Genesis or SNES, it doesn't have restrictions on writing to vram during active display - unlike both the SNES and Genesis (and some other consoles). The TG16 is definitely capable of the animation that you stated as an example. Being 16bit has nothing to do with it. You could have a fast 8bit DMA controller that could run circles around the Genesis 3 times over. 'Bitness' means nothing, it's all about bandwidth. The Genesis cpu isn't updating those frames to vram, a dma controller is. If the Genesis cpu was manually updating the pixel data to vram, it would be considerably slower. The DMA was there to overcome the cpu's slowness (relative to this matter, doesn't mean the CPU is 'slow' per se).
    PCE can update VRAM during active display, but like on any other system you use a really good chunk of it for the game logic, and while the resource is there, you cannot really use it - though that's very dependent on the code itself and what is going on. Bosses are good for that part since you've only got one single enemy and it should not take much screen time, leaving a good chunk left for VRAM updates

    The effects in Sapphire are pretty awesome


    Regarding VRAM bandwidth, I once experimented with interlace mode... having one frame for showing GFX and doing game logic + screen drawing, and the other blanked frame for a massive DMA. It works awesome on a typical HDTV screen, which treats any analog input as 480i and thus blends the 2 frames together, eliminating the flickering (though colors get darker), but on a CRT you just get massive flicker... you can alleviate it a bit by choosing a suitable color for the blanked frame, but it does not give too optimal an effect...

    This would have allowed for silky smooth 30FPS Wolfenstein like game :3
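    The darkening TmEE mentions follows directly from the blending: a blanked (black) field averaged with a drawn field halves the apparent intensity. A trivial sketch (Python, purely illustrative, not console code):

    ```python
    # On a display that blends the two alternating fields, the perceived
    # intensity is the average of the drawn field and the blanked field.
    def blended(gfx_level, blank_level=0.0):
        """Apparent intensity when a drawn field alternates with a blanked one."""
        return (gfx_level + blank_level) / 2.0

    print(blended(7.0))       # 3.5 - full-intensity pixels drop to half brightness
    print(blended(7.0, 3.0))  # 5.0 - a mid-grey "blank" field alleviates the loss
    ```

    which also shows why picking a suitable color for the blanked frame helps a bit, as described above.
    
    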



    Quote Originally Posted by kool kitty89 View Post
    The window layer is just the solid color far BG layer, isn't it? (like the PCE and SNES also have)
    Where the Window is, there is no plane A. The Window cannot be scrolled. It's most useful as a status bar, which is what it's mostly used for. You can make it start from any side of the screen and cover a few tiles or all of the screen; it's quite flexible, and IMO a very, very useful thing.
    Death To MP3, :3
    Mida sa loed ? Nagunii aru ei saa "Gnirts test is a shit" New and growing website of total jawusumness !
    If any of my images in my posts no longer work you can find them in "FileDen Dump" on my site ^

  13. #58
    Wildside Expert
    Join Date
    Nov 2008
    Posts
    104
    Rep Power
    21

    Default

    Quote Originally Posted by Kamahl View Post
    The chip itself... It has a 12bit resolution internally, that's what allows Shadow/Highlight mode to exist.

    You can get 1536/4096 colors onscreen using a mix of shadow, highlight and palette swaps.
    there are far fewer colors because a lot of them are the same. I remember someone measuring the RGB levels in each mode for each index value (0-7, a 3-bit value):

    - color values 0,2,4,6 in Shadow mode are the same as values 0,1,2,3 in Normal Mode

    - color values 1,3,5,7 in Highlight mode are the same as values 4,5,6,7 in Normal Mode

    - color value 7 in Shadow mode is the same as value 0 in Highlight mode.

    So there are 15 possible levels for each RGB channel, which would fit a 12-bit (or more likely 3x 4-bit) DAC, but it has yet to be proven; they could have handled it completely differently, especially considering the full range cannot be used and the shadow/highlight/normal modes are exclusive (you can't have the RED channel using highlight and the BLUE channel using shadow), which limits the max number of different colors to 960 (8x8x8x2 - 8x8) instead of 3375 (15x15x15), which is very far from the 4096 allowed by a 12-bit range.
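    A quick sketch (Python, purely illustrative, not console code) of the level mapping described above: merging the per-channel outputs of the three modes, with shadow halving the values and highlight mapping into the upper half, gives exactly 15 distinct levels.

    ```python
    # Per-channel intensity of a 3-bit CRAM value (0..7) in each mode,
    # in "normal mode" units, per the equivalences listed above:
    #   shadow 0,2,4,6 == normal 0,1,2,3  ->  shadow halves the value
    #   highlight 1,3,5,7 == normal 4,5,6,7  ->  highlight maps to the upper half
    #   shadow 7 == highlight 0 == 3.5
    def normal_level(v):
        return float(v)

    def shadow_level(v):
        return v / 2.0            # 0.0, 0.5, ... 3.5

    def highlight_level(v):
        return (v + 7) / 2.0      # 3.5, 4.0, ... 7.0

    # Collect every distinct intensity a channel can output across all modes.
    levels = sorted({f(v) for v in range(8)
                     for f in (normal_level, shadow_level, highlight_level)})
    print(len(levels))   # 15 distinct levels per channel, as stated above
    ```
    
    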
    Last edited by Eke-Eke; 01-27-2011 at 05:20 AM.

  14. #59
    Mastering your Systems Shining Hero TmEE's Avatar
    Join Date
    Oct 2007
    Location
    Norway, Horten
    Age
    34
    Posts
    10,112
    Rep Power
    114

    Default

    I'm more willing to believe an analog approach to S/HL, especially since there are some odd spikes, things are not too linear, AND you get a 15, not 16, color greyscale gradient.

    You'll be getting around 1200 colors at most.

  15. #60
    Hero of Algol kool kitty89's Avatar
    Join Date
    Mar 2009
    Location
    San Jose, CA
    Age
    34
    Posts
    9,724
    Rep Power
    67

    Default

    Quote Originally Posted by Jorge Nuno View Post
    I can (or anyone else, this is piss easy) fake more color depth on the MD by doing checkerboard or vertical lines dithering AND alternating colors (alternating between 2 consecutive intensities) every frame, this will be unnoticeable on every video interface including RGB.
    Some FMV did that... Mortal Kombat did, along with some other things, but it looked a bit ugly.
    Sonic 3D Blast's Cinepak (not MD tilemap based, but 4x4 pixel, 16 color per frame Cinepak) used that with both flicker and interlocking dither in the dithered frames. (it seems like it might have pushed for pseudo 256 colors, so it might have been a derivative of 256 color Cinepak using 2 composite 4-bit planes -OTOH the palettes don't seem to change, or at least not drastically, for the intermittent flicker frames, so maybe it's plain 16 color formatting with special organization for flicker to blend dithering better -probably allowing better compression in general than pushing for an actual 8bpp format that was then interpreted as dual 4-bit flickered planes)
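    The checkerboard + frame-alternation trick Jorge Nuno describes boils down to averaging: a sketch, assuming a display (or eye) that blends consecutive frames at 60 Hz.

    ```python
    # Alternating a pixel between two consecutive palette intensities every
    # frame makes the viewer perceive the average - a "half step" the 3-bit
    # DAC can't output directly, doubling the apparent color depth.
    def perceived(frame_a_level, frame_b_level):
        """Intensity the viewer sees when two frames alternate."""
        return (frame_a_level + frame_b_level) / 2.0

    # Blending each pair of consecutive 3-bit levels fills in the gaps:
    half_steps = [perceived(v, v + 1) for v in range(7)]
    print(half_steps)   # [0.5, 1.5, 2.5, 3.5, 4.5, 5.5, 6.5]
    ```
    
    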

    Hey, maybe they could have even used such a flicker effect in combination with other existing FMV formats on the MCD (especially the rather nice tilemap based formats) for more pseudo/flicker colors but still keeping bandwidth/buffer size in VRAM lower by catering to the tilemap format (you'd have enough tiles in VRAM to apply to 2 separate tilemaps with different colors and graphics -a better encoder might be able to share some tiles between both frames even, on top of the per-tile color quantization and use of tile flips which I assume the better versions of the CD codecs did take advantage of already- and end up with well over 256 color output, reloading all 4 palettes between each flickered frame, and up to 256 pseudo colors per 8x8 cell with 16 different pseudo-256 color palettes -with some redundant colors due to the common BG color with all 16 color palettes- given they managed 30-46 color range for existing FMV with only 4 15 color palettes +BG color per frame, that would imply that it would be a lot more than 16x that count for the flicker method due to the fact you have 16 "palettes" to work with as such rather than 4, though probably still a fair bit less than the maximum possible 3,000 some pseudo-colors on-screen)

    Or you could try separate luma and chroma encoding (though you've only got 8 shades of true gray to work with, without shadow -but lower color count would also losslessly compress better) in combination with other schemes (esp tilemap based) and perhaps using the ASIC to quickly scale the chroma tiles. (if you used lower res chroma like H.261)

    And in any case, you could simplify things by only using 2 16 color palettes for a 256 color image as such, so catering much more simply with existing bitmap based schemes like true cinepak derivatives (though full frame bitmaps would put a bigger hit on DMA bandwidth -if that became the limiting factor) or other schemes using the tilemap but without the algorithms for optimizing multiple palettes per frame. (would also cut out data for CLUTs -more for the pseudo CLUTs you need to manage 16 pseudo-256 color palettes)
    And you could still do the separate chroma and luma encoding but with plain 8 shades of gray (without shadow -which would require another tilemap and tiles to apply for the added shades) and plain 16 colors (so 128 colors max), or use 4 palettes on the chroma layer but plain 3-bit grayscale for the luma layer for up to 488 colors or probably closer to 240-360 some colors given the 30-46 ish colors on existing FMV -probably on the lower side for the least clash. (and again, potentially use the ASIC to quickly scale chroma tiles if you went the H.261-like chroma resolution route)

    And dithering on top of that in all cases. (though perhaps specifically managed to maintain favorable lossless compression of tile data)


    This effect also works if I want to do a 50% alpha effect.
    Doesn't that break down for multiple translucent objects?



    One big issue with every such case of flicker blending is PAL: at 50 Hz the flicker is significantly worse than at NTSC's 60 Hz.
    Even so, it probably would have been reasonably tolerable (and probably better looking for FMV than plain dithered 16/4 palette FMV, or for some splash screens -and easier to compress than complex shadow+highlight methods).
    Actually, with the plain 16 color palette pair for pseudo 256 color output, that would probably be a lot easier to work with (on the encoding side) than carefully managing per-cell subpalette optimization, let alone the option to use plain bitmap methods and send a full 2 frames' worth of tile data over rather than tilemap based schemes. (if you kept the bitmap to tile output mapped to the same tiles for every pair of frames, you could cut out the bandwidth for tilemap updates, just swapping tilesets)

    I wonder why that wasn't explored more and used more often. (again, it definitely seems like it would be easier to manage on the encoding side than algorithms to efficiently use 4 15/16 color subpalettes along with dithering)



    EDIT:
    Quote Originally Posted by Jorge Nuno View Post
    Just because there are shadow and highlight effects doesn't mean the DAC/palette is 12bits. If I write odd values to the cram and read them back, the LSB of each component is not what I wrote (seems to be open bus on those bits 15-12, 8, 4, 0). Those effects are done by changing the upper/lower voltage references on the DAC itself, which is easy to do electronically (6-8 MOS transistors).
    So it's more like 11-bit 3-3-3-2 RGBI than actual 12-bit RGB output. (except the "I" only uses 3 of the 4 possible states)
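    A sketch of that word layout (the nibble alignment quoted earlier as 0000BBB0GGG0RRR0), with the unused LSBs Jorge Nuno describes as open bus left clear:

    ```python
    # MD CRAM word: 3 bits per channel, each channel occupying the top 3
    # bits of its nibble - red in bits 1-3, green in 5-7, blue in 9-11.
    def pack_cram(r, g, b):
        """Pack 3-bit R, G, B into a 9-bit-significant CRAM word."""
        assert all(0 <= c <= 7 for c in (r, g, b))
        return (b << 9) | (g << 5) | (r << 1)

    def unpack_cram(word):
        """Recover the 3-bit R, G, B components, ignoring the open-bus bits."""
        return (word >> 1) & 7, (word >> 5) & 7, (word >> 9) & 7

    print(hex(pack_cram(7, 7, 7)))   # 0xeee - white
    print(unpack_cram(0x0EEE))       # (7, 7, 7)
    ```
    
    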

    Hmm, that actually could have been a nice silicon saving hack for more colors/shades even with legitimate increases in CRAM. (instead of 3 more high speed resistors, just a toggle for double or 1/2 intensity, so well short of 12-bit RGB, but many more shades than 9-bit or 3-4-3 RGB, possibly with some advantages over 4-4-3 RGB even, given the human eye is more sensitive to luminance than color) So they could have tweaked the VDP to allow palettes to directly specify 9-bit RGB plus SH/HL colors as well with extended CRAM and slightly tweaked logic to allow the CRAM to directly make use of SH/HL. (they could retain the SH/HL effect too and use both simultaneously for that matter, just more redundant outputs when using that -in hindsight the HL/SH stuff was so little used, the logic to allow it was probably wasted -something like allowing 1/2 res 8bpp tiles via pixel accumulation would have been far more useful -not sure how much logic that would require, but probably not set on a per cell basis, more like per-layer, but per cell would have been great too)
    Hell, you wouldn't even necessarily need more CRAM for more palettes either (same thing for the 8bpp mode suggestion), you could have additional bits toggle RGB (or RGBI) values directly on top of the 4-bits selecting palette entries. As such they could have had 2 lines of CRAM dedicated to sprites and 2 to BGs with 1 bit for each tilemap or sprite index to toggle those 2 15 color palettes, and the additional bit (or 2 or 3 more bits if you did away with HL/SH effects) to control RGB levels and bright/dark toggling to provide the added "palettes." (even if HL/SH logic was removed, the DAC mechanism for the luma control could be useful in cutting chip space over 4-bit DACs) On top of that, they could have tweaked CRAM to include intensity control as well. (so 11 bits per line rather than 9 -there's a chance they used 16-bit SRAM words as it was, and in that case it wouldn't even take any more space -if it's true 9-bit SRAM for CRAM, that would use a bit more space though)

    So, assuming you've removed HL/SH flags from sprites and tiles and have 2 CRAM palettes for each set and then another 2 bits for tiles and 3 bits for sprites for additional palette control over RGBI applied to the indexed palette selected for a total of 8 BG "palettes" and 16 sprite "palettes". Perhaps even have different palette modes with different hardware ROM LUTs for different combinations of RGBI values for the final 2 or 3 bits of palette control. (or if it was fixed to 1 set, make that 1 set the most well balanced)
    If you kept it as plain 9-bit CRAM entries, you'd definitely want to have the intensity selecting bits in the additional palette selection options, so that would use up 3 of the states of each (leaving 1 more state for tiles and 5 more for sprites), the final state for BG might be for toggling green intensity instead of global brightness (so 1 state for normal RGB colors as indexed, 2 states for toggling bright/dark for all colors, and 1 state only increasing the intensity of green -or decreasing it depending on what was deemed more useful -definitely green though, as it's the color the human eye is most sensitive to), and for the remaining 5 sprite palette states (2 bits plus 1 added state) you could have 4 states for direct R-G level control (1 bit each) and the final state might again be used for additional green intensity control or maybe doing something with blue.


    So it would be a hybrid of indexed and direct colors. (I got some inspiration on that from a discussion about the ST's palette on Atariage -in the context of a hypothetical addition of an 8-bit color mode without added CRAM, but rather than just using direct 3-3-2 RGB, using the 16 entry palette and the final 4 bits for direct 1-2-1 RGB level control -and possibly options to use different configurations like 2-1-1, etc) Also sort of like the NES's additional RGB toggle to the palette, but embedded in the sprite tables and tilemap palette selection rather than PPU registers on a per frame basis. (or per line with a raster interrupt)

    Something like that might have been a good compromise even with the SMS VDP logic eating up precious silicon. (and of course, rather than bothering with the SH/HL type luminance control hack for an enhanced palette, they could have gone for 4-bit DACs, cut out all HL/SH logic to save space and instead add full 12-bit CRAM entries and allot 2 lines of CRAM for BG and another 2 for sprites with the additional palette control via direct RGB, like 1-1-0 for tiles and 1-1-1 for sprites, or VDP registers that could select different R-G-B values for such as part of loading the palettes, but hardwiring to 1-1-0 and 1-1-1 would probably be OK)

    In all those cases (with all HL/SH flags replaced by added palette options), you'd have up to 361 colors/shades on-screen with no added tricks (16 sprite palettes, 8 BG palettes, and 1 common BG color), so a pretty big step up and also a larger colorspace than the PCE. (even for the RGBI-ish thing using SH/HL luma control in the palettes -but removing all other SH/HL logic to save die space and free up the additional palette flags; plus if dropping the HL/SH logic freed up enough space to add 4+4 =>8-bit pixel accumulation on a per-layer basis -or maybe reserve 1 flag on sprites to allow it per-sprite-, that would definitely have been a good idea to have around -with extended RGBI control with the added 4 bits- but otherwise using the space to upgrade to full 12-bit RGB with 4 lines of CRAM would be preferable, unless you could do that AND the 8bpp mode)
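    The 361 figure above is easy to verify: 16 sprite "palettes" and 8 BG "palettes" of 15 colors each (entry 0 being transparent), plus the single shared backdrop color.

    ```python
    # On-screen color count for the hypothetical scheme described above.
    sprite_palettes, bg_palettes, colors_per_palette = 16, 8, 15
    total = (sprite_palettes + bg_palettes) * colors_per_palette + 1
    print(total)   # 361
    ```
    
    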

    Of course, if they'd dropped the SMS block (and made the PBC more of an active adapter), they could have practically crammed more into the VDP too. (make the 8bpp idea more feasible, or maybe even more CRAM lines in addition to that -I'd put 8-bit accumulation at a higher priority than more CRAM lines -the direct color flags would supplant that a fair bit: any added CRAM indexes would be displacing direct RGB control with the flags set for a true palette instead of an RGB bit as more CRAM was added)

    I wonder if Sega ever considered using direct RGB control like that.

    With no HL/SH, you'd be limited to dithered meshes/bars and flicker for translucency effects. (given how rarely SH/HL was used, it was probably a pretty big waste of logic)




    Quote Originally Posted by Jorge Nuno View Post
    I think they just didn't want any more internal memory (too much area?)
    The cram as it is now wastes maybe 3456 transistors (assuming S-RAM, 6 transistors per cell, 2 inverters + RW line enablers)
    That's also assuming the existing CRAM is 9 bits wide and not 16 bits wide. (I seem to recall some speculation on it possibly being 16 bits wide for ease of engineering/interfacing, or maybe that was in the context of addressing logic -ie use 9-bit SRAM lines but 16-bit address/interface logic with the upper 7 bits unused)
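    Making Jorge Nuno's transistor estimate explicit (64 CRAM entries x 9 bits x 6 transistors per SRAM cell), plus what a 16-bit-wide layout would cost instead:

    ```python
    # Transistor budget for the MD's CRAM under the 6T SRAM assumption.
    entries, cell_transistors = 64, 6
    print(entries * 9 * cell_transistors)    # 3456 for 9-bit-wide words
    print(entries * 16 * cell_transistors)   # 6144 if the words were 16 bits
    ```
    
    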


    Quote Originally Posted by Chilly Willy View Post
    The real difference is that faking colors on the Genesis usually takes no/very little extra processor time, and no/very little extra memory, while faking more layers/sprites takes an INCREDIBLE amount of extra CPU time, and a LOT more memory. That's why you don't normally see certain games on the TG16 - no CPU time left after faking what's missing; if you do see a game of that sort, it's either slow and decent looking, or fast and really piss-poor looking.
    Using BG tile animation takes a TON of CPU resources?

    As you said before, the PCE's strengths included DMA bandwidth, and a simulated BG layer (other than sprites) is most often done by manipulating tile data directly with partially animated tiles as well as modifying the tilemap. (the only software "blits" would be manipulating the tilemap, so part of the scroll would be cell-wise scrolling, and the rest would be animation to smooth that out -it wouldn't have to be full single-pixel scroll either, it could end up being 2 or 4 pixels at a time, or for a fast scrolling game, plain 8-pixel cell wise scrolling could be OK)

    Of course, that also takes careful art design to align everything to the tilemap and facilitate both the real hardware-scrolled graphics and the dynamic tiled graphics. (and probably also aiming at symmetrical patterns to allow flipping and heavier re-use of tiles to reduce the added data needed)

    So a lot of work for the game/art designer, but not necessarily a lot of CPU overhead. (most of the work would be done ahead of time)
    You could possibly do some software blits to cut-down on added graphics needed (ie modify some tiles on the fly and page them to VRAM for the "animation" shifting the tiles one or a few pixels over -making preshifted scrolling animation), and even if you did that for ALL the animation to avoid using any more ROM, that would be far less intensive than CPU driven bitmap rendering like the ST has to do as you've got the tilemap in hardware and one tile can be re-used many times with that very same scroll animation. (or use software blits as a form of decompression, allowing most dynamic tiling animation to be fully buffered in VRAM, but have the CPU pull data from ROM and shift it into the necessary tile animation a bit ahead of time -with whatever free CPU time there is- and load that into VRAM in chunks -you've only got 8k of work RAM and you wouldn't want to dedicate too much of that to buffering graphics)

    Some Genesis games did that too, like the far "background" in Hydrocity Zone in Sonic 3.
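    A hedged sketch of the technique described above: the tilemap itself only moves in whole 8-pixel cells, and preshifted tile sets (built ahead of time in ROM or blitted on the fly) fill in the in-between positions. Names here are illustrative, not from any real engine.

    ```python
    TILE_W = 8  # tile width in pixels

    def scroll_state(pixel_offset, frames_available=8):
        """Split a pixel scroll offset into a cell-wise tilemap shift plus
        which preshifted tile animation frame to display."""
        step = TILE_W // frames_available        # 8 frames -> 1px steps, 4 -> 2px
        coarse = pixel_offset // TILE_W          # tilemap cells to shift
        frame = (pixel_offset % TILE_W) // step  # preshifted tile set to show
        return coarse, frame

    print(scroll_state(13))     # (1, 5): one cell over, preshift frame 5
    print(scroll_state(13, 4))  # (1, 2): with 2px steps, only frame 2
    ```

    Fewer preshifted frames (4 or even 2) trade scroll smoothness for VRAM and ROM, matching the "2 or 4 pixels at a time" option mentioned above.
    
    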

    Quote Originally Posted by Chilly Willy View Post
    Shooters are NOTHING in the way of processing - the most work they require is collision detection, and good collision detection routines are fairly simple and quick. It's much like a good sort algorithm. The SNES being slow with shooters is more an indication of how little time it has to update the graphics than in how much CPU power it has. The TG16 is practically designed for shooters, and it shows. A shooter requires high video throughput, and little game logic.
    How is it that many shooters on the SNES, Genesis, and TG-16 (and others for that matter) tend to slow down more than other genres... then again, sports games do tend to slow down noticeably too at times. (and some platformers, including Sonic)

    Slowdown usually occurs with a LOT of sprites appearing on-screen, like rings flying out of Sonic, a bunch of players in a sports game, or ships in shooters. (the animation thing is moot there since it's quite often many of the very same objects being repeated over and over -in shooters, there's both projectiles and enemies that that applies to)

    In the case of Sonic, that's a physics/game logic issue iirc, but how about shooters? (there's no gravity, just the AI/logic moving the enemies around the screen as well as the projectiles moving in patterns or straight lines -sometimes arcs, and sometimes multiple arcs with seeker missiles and the like -the latter would have a bit of simple AI targeting the enemy ships)

    Vapor Trail on the Genesis has some pretty hefty slowdown when there's a lot of stuff going on. (especially lots of small sprites all over)
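    For reference, the "fairly simple and quick" collision detection Chilly Willy mentions is typically just axis-aligned bounding-box overlap, a handful of compares per object pair:

    ```python
    # Two sprite bounding boxes intersect iff they overlap on both axes.
    def boxes_overlap(ax, ay, aw, ah, bx, by, bw, bh):
        """True if box A (position ax,ay, size aw,ah) intersects box B."""
        return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

    # A player shot grazing an enemy ship, and two far-apart objects:
    print(boxes_overlap(10, 10, 8, 8, 15, 12, 16, 16))  # True
    print(boxes_overlap(0, 0, 8, 8, 100, 0, 8, 8))      # False
    ```

    The cost scales with the number of object pairs, which fits the observation that slowdown tracks the sprite count.
    
    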


    The slowdown in Super Mario World also tends to be a sprite management issue, namely in the underwater sections with lots of aquatic enemies floating around. The All Stars version of SMW supposedly bumped ROM access up to 3.58 MHz and thus gave the CPU a ~33% boost (most code runs from ROM), which eliminated the slowdown.
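    That ~33% figure checks out if the baseline was the SNES's 2.68 MHz SlowROM timing (an assumption on my part; the post above only gives the 3.58 MHz figure):

    ```python
    # SNES ROM access speeds: 2.68 MHz SlowROM (assumed baseline, not stated
    # in the post) vs 3.58 MHz FastROM.
    slowrom_mhz, fastrom_mhz = 2.68, 3.58
    boost = (fastrom_mhz / slowrom_mhz - 1) * 100
    print(round(boost, 1))   # 33.6, roughly the "33%" quoted above
    ```
    
    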


    For fighting games, I wouldn't be surprised if it's all DMA related, which would also explain why SFII SCE actually seems to have MORE slowdown than Turbo on the SNES at times. (seems to depend on the combination of characters and levels used, but in several cases I noticed obvious slowdown on the Genesis game with Chun Li's fireball -or other projectiles- where it was more subtle or absent on the SNES -Turbo, not WW, not Super)

    I also wouldn't be surprised if the slowdown is absent in PAL, as the DMA constraints are much looser there.

    Quote Originally Posted by Chilly Willy View Post
    Again, please point to ANY part of a shooter that is taxing other than the collision detection.
    Maybe that's the issue then, since the slowdown is directly tied to having "too many" sprites on-screen and occurs in some pretty small games too. (ie ones less likely to be doing any on-the-fly animation -though that's made more obvious by the lack of visible animation as such)
    Last edited by kool kitty89; 01-27-2011 at 06:47 AM.
