Why was 24-bit color support introduced twice in GPUs?


I was doing research, trying to answer the question "Which was the first GPU to support 24-bit color?" I know color values have been specified in 24-bit RGB since around 1992, even in palettized games like Doom. But I mean 16 million simultaneous colors on the screen, not just 256 unique colors picked from a 24-bit palette.
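To be concrete about the distinction I mean, here's a quick sketch of the arithmetic (nothing hardware-specific, just the pixel-format difference):

```python
# Palettized (indexed) color: each pixel is an 8-bit index into a
# 256-entry palette; the palette entries themselves are 24-bit RGB.
palette_entries = 2 ** 8       # 256 simultaneous colors on screen
palette_color_space = 2 ** 24  # each entry chosen from ~16.7M colors

# Truecolor: each pixel stores its own 24-bit RGB value directly,
# so every pixel can be a different color.
truecolor_values = 2 ** 24     # 16,777,216 possible simultaneous colors

def pack_rgb(r, g, b):
    """Pack 8-bit R, G, B channels into one 24-bit truecolor pixel."""
    return (r << 16) | (g << 8) | b

print(palette_entries)             # 256
print(truecolor_values)            # 16777216
print(hex(pack_rgb(255, 128, 0)))  # 0xff8000
```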

I started digging, and naturally I came across the ATI Mach 32. Later I found out that the RIVA TNT also "added" support for truecolor. So I'm left wondering: is 24-bit color support some ancient technology that was forgotten after 1992 and rediscovered in 1998? Or are the two claims talking about different things?

I have two guesses, but I’d love to know the real explanation:

  1. Truecolor support on the RIVA TNT means it is hardware accelerated, i.e. sprites are stored in VRAM and drawn by the chip, whereas on the Mach 32 the VRAM is just a frame buffer, so any truecolor rendering would effectively be software.
  2. Nvidia meant 32-bit color textures, not frame-buffer pixel depth at all.

Does anyone know what Nvidia and ATI each really meant?