Quote:
Originally Posted by
sheath
From all of the other discussions it sounds to me like the Genesis design is more at fault than the Sega CDs. All the Sega CD needed was its own "Super" VDP, or the Genesis needed a VDP with even faster bandwidth and the color RAM expansion lines hooked up. Then the Sega CD itself would have needed an "Arcade Card" like RAM expansion over time to show its fullest potential. The real world Sega CD can already do things with its Graphics CoProcessor that the Arcade Scaler boards could not and it becomes a huge apples to oranges comparison of rendering types and sprites versus "objects".
Edit: moved reply to Sheath here:
http://www.sega-16.com/forum/showthr...ty-stuff/page4
Quote:
Originally Posted by
parallaxscroll
Well I don't know. The X-Board was 1986-87 tech. By late 1991 or Q3 1992, I do not see why such capability could not be replicated on a far smaller motherboard with far fewer pieces of silicon, ASICs and RAM chips, obviously a 2nd 68000 would've been a must.
Consolidation would have been possible, but it would still be expensive . . . and not particularly well suited to the overall nature of the Sega CD either, especially as an add-on. Then again, if they didn't care about compatibility, this might have been interesting to do as a standalone new console instead of a CD add-on at all. (except you could do something much cheaper and more flexible with the blitter-like approach used in the Sega CD . . . and there's a better argument for maintaining Genesis backwards compatibility at least, especially for a mid-generation release like that, particularly given Sega's overall trends with the SG-1000, SMS, and MD in that respect; so we're back to the context of the thread I linked to earlier)
Also, multiple CPUs are not a good idea . . . arcade boards often used them since cost was no object, and they made some design aspects easier even if the added CPUs didn't get much use. (that's not to say more specialized added processors wouldn't have made sense too, like a DSP) This gets addressed in that other thread too.
Quote:
S3 and ATi had terrible reputations during the early PC 3D graphics era.
Yes, but give that context, as I did above, and you've got a lot more to consider with respect to a console. Personally I'd favor S3 in general for cost/features/performance in the 1996 timeframe, as they had pretty nice low-res 32-bit color performance . . . the Rage II was a bit faster and much more DirectX compatible, but its perspective correction and texture filtering were worse, among a few other things. The main problem with both of those was driver support combined with modest peak performance by the standards of high-end PC gamers . . . but far more reasonable by console standards. (in a console setting, the ViRGE 325 should have fared pretty well next to the N64's RDP in real-world performance . . . and been a lot more open to programmers in terms of optimizing performance/detail)
The Rage II would have fared even better in terms of raw speed with all effects enabled, but the visual quality was worse and it likely cost more. It did have MPEG acceleration features though, and if you focused mostly on 16-bit color, the visual-quality gap versus the ViRGE is less obvious. (though low-res alpha textures will look bad due to the lack of alpha texture filtering on the Rage)
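To spell out that last parenthetical a bit, here's a generic texture-sampling sketch (illustrative only, not actual Rage or ViRGE behavior): with point-sampled alpha, a magnified low-res cutout texture keeps a hard, blocky transparency edge, while filtering the alpha channel gives a smooth ramp.
Code:
/* Generic illustration only (not Rage- or ViRGE-specific): why point-sampled
 * alpha looks blocky on a magnified low-res cutout texture. We sample a tiny
 * 1D alpha "texture" across an opaque-to-transparent edge, once with nearest
 * (unfiltered) sampling and once with linear filtering. */
#include <stdio.h>

static const float alpha_tex[4] = { 1.0f, 1.0f, 0.0f, 0.0f }; /* opaque -> transparent */

static float sample_nearest(float u)   /* u in [0,1) */
{
    int i = (int)(u * 4.0f);
    return alpha_tex[i];               /* hard step at the texel boundary */
}

static float sample_linear(float u)
{
    float t = u * 4.0f - 0.5f;         /* sample relative to texel centers */
    int i0 = t < 0.0f ? 0 : (int)t;
    int i1 = i0 + 1 < 4 ? i0 + 1 : 3;
    float f = t - (float)i0;
    if (f < 0.0f) f = 0.0f;
    return alpha_tex[i0] * (1.0f - f) + alpha_tex[i1] * f;  /* smooth ramp */
}

int main(void)
{
    for (int k = 0; k <= 4; k++) {
        float u = 0.42f + 0.05f * (float)k;   /* walk across the alpha edge */
        printf("u=%.2f  nearest=%.2f  linear=%.2f\n",
               u, sample_nearest(u), sample_linear(u));
    }
    return 0;
}
The nearest-sampled alpha snaps straight from 1.0 to 0.0 at the texel boundary (the blocky cutout edge), while the filtered version ramps down gradually.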
Also remember the generally bad reputation PowerVR cards/drivers had on PC, yet the architecture ended up exceptional in the Dreamcast (and even adequate for DirectX). The ViRGE, by comparison, was a much more "normal" architecture, and the main concern would just be optimizing around its performance and limitations, which is something every console developer had to deal with to some extent. (rather than having to optimize for a unique/exotic graphics pipeline with tile-based deferred rendering)
This is the same reason I'd argue against using the NV-1 . . . potentially really powerful, flexible, and interesting, but also very exotic and not conducive to established programming or 3D modeling, let alone multiplatform development.
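To make the "exotic" vs. "normal" distinction concrete, here's a rough sketch (illustrative C only, not actual ViRGE or PowerVR hardware/driver behavior) of how the two submission models differ: a conventional immediate-mode chip rasterizes each triangle into the external framebuffer as it arrives, while a tile-based deferred chip bins the whole scene per screen tile first and resolves each tile in on-chip memory.
Code:
/* Illustrative sketch only, not real ViRGE or PowerVR code: contrasting a
 * conventional immediate-mode pipeline with a tile-based deferred one. */
#include <stdio.h>
#include <stddef.h>

typedef struct { float x[3], y[3]; } Tri;

enum { SCREEN_W = 320, SCREEN_H = 240, TILE = 32 };

/* Conventional pipeline (ViRGE-style): every triangle is rasterized into the
 * external framebuffer the moment it is submitted, overdraw and all. */
static void draw_immediate(const Tri *tris, size_t n)
{
    for (size_t i = 0; i < n; i++)
        printf("immediate: triangle %zu -> framebuffer\n", i);
}

/* Tile-based deferred pipeline (PowerVR-style): the whole frame's geometry is
 * binned per screen tile first; each tile is then shaded in on-chip memory
 * and written out exactly once, so hidden pixels never cost VRAM bandwidth. */
static void draw_tiled(const Tri *tris, size_t n)
{
    for (int ty = 0; ty < SCREEN_H; ty += TILE)
        for (int tx = 0; tx < SCREEN_W; tx += TILE)
            for (size_t i = 0; i < n; i++) {
                /* crude bin test: triangle bounding box vs. this tile */
                float minx = tris[i].x[0], maxx = minx;
                float miny = tris[i].y[0], maxy = miny;
                for (int k = 1; k < 3; k++) {
                    if (tris[i].x[k] < minx) minx = tris[i].x[k];
                    if (tris[i].x[k] > maxx) maxx = tris[i].x[k];
                    if (tris[i].y[k] < miny) miny = tris[i].y[k];
                    if (tris[i].y[k] > maxy) maxy = tris[i].y[k];
                }
                if (maxx >= tx && minx < tx + TILE && maxy >= ty && miny < ty + TILE)
                    printf("tiled: triangle %zu binned to tile (%d,%d)\n", i, tx, ty);
            }
}

int main(void)
{
    Tri scene[1] = { { {10.0f, 60.0f, 30.0f}, {10.0f, 20.0f, 70.0f} } };
    draw_immediate(scene, 1);
    draw_tiled(scene, 1);
    return 0;
}
The practical difference is that on the deferred design you have to think in terms of whole-scene submission and per-tile costs, which is exactly the kind of pipeline-specific optimization a "normal" chip doesn't demand.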
Quote:
Rendition, I will give you though, as their Vérité V1000 was the first decent piece of 3D consumer silicon in the PC space. Even John Carmack endorsed it as being the standard for PC 3D graphics. That is, until 3Dfx dropped Voodoo Graphics on the scene like an atomic bomb. Perhaps something resembling the Vérité V2000 or V2100 could've been customized for Sega, offering Voodoo Graphics like performance. With that said, it probably would not surpass the 3DO M2 in performance or image quality, and Sega turned M2 down.
It's not just about performance, but practicality and cost. (this would apply to the M2 as well) If cost weren't a concern, then Voodoo would obviously be the winner in 1996, and the ViRGE and Rage wouldn't even enter the picture. Then again, the V1000 might have been a better compromise for the time than the ViRGE in terms of cost, performance, and features. (and the Rage II still falls somewhere in there too, mainly due to its speed advantage over either of those others and its MPEG acceleration, plus fuller features than the ViRGE 325 in some respects, but weaker visual quality for texture filtering and perspective correction)
The Rage I was pretty much a non-option though . . . slower than the ViRGE, with weaker features (and crap for DirectX, only OK for a few Rage-specific games).
There are also a few others worth mentioning, like the Laguna 3D, but that's in a similar performance vein to the Rage II. (out earlier though)
But again, all this is moot anyway given my actual argument regarding Sega waiting until '95/96 for their next-gen console. :p
That, and once the Saturn was out there, rushing out another new system wouldn't have made much sense, which is another reason the M2 would have been questionable.
Quote:
Sega could've had decent to good Japanese software support, providing the Sega CD had sold better in the U.S. Sega could've drawn on the support from the PC industry (not to mention Amiga and X68000 games), arcade support, from Sega themselves as well as third parties, as well as original home grown efforts. Phantasy Star V could've been the flagship RPG, even if not for sales. I think in Japan, Sega could've overtaken NEC PC-Engine+CD-ROM without all the different card upgrades NEC offered.
Sega would've had a single 16-Bit console (MD/Genesis) and a single CD-ROM upgrade. While there would've been little to no chance of Sega beating Nintendo in Japan, worldwide, yes. The MD/Genesis was beating Nintendo worldwide in late 1991 to around early 1994.
(correct me if I'm wrong).
Yes, it should have ended up better overall, and they at least could have competed more directly with NEC for 2nd place in Japan.
Having more JP support may very well have directly contributed to SoA handling things differently. (with solid JP support, they'd have had more breathing room, and maybe wouldn't have gone head first into FMV like they did . . . not avoided it entirely, mind you, but perhaps gone into multimedia in a broader and more varied manner, and looked more at what the computer game world was doing)
Quote:
I'm not suggesting MegaCD/SegaCD should've sold for $400 but more like $300 in late 1992. Then dropping to $200 in 1993 with combined cartridge+CD systems in 1993-1994 selling for $250 then $200.
The Sega CD already WAS $300 in 1992, and $230 in 1993 (MCD2). In Japan in 1991 it was roughly $380 US with the exchange rates of the time iirc.
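For reference, a rough sanity check on that Japanese figure (assuming the commonly cited ¥49,800 Mega-CD launch price in December 1991 and an exchange rate of roughly 130 yen to the US dollar at the time; both numbers are approximate):
\[
\frac{49{,}800\ \mathrm{JPY}}{\approx 130\ \mathrm{JPY/USD}} \approx 383\ \mathrm{USD}
\]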