236 colors on the screen

Discussion in 'Digital Photography' started by BioColor, Feb 24, 2005.

  1. BioColor

    BioColor Guest

    Hi,

    From my work with graphics programming, AFAIK a video card only has
    256 entries in its active palette. 20 of these are reserved by
    Windows. This leaves only 236 simultaneous colors for dots on the
    screen at one time. Each of these colors has its own r, g, and b
    values. With 8 bits of intensity for each of those you get a
    theoretical palette of 16 million colors from which you can select any
    236 at a time.
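
    In code terms, the scheme looks roughly like this (a minimal C
    sketch of the idea, with illustrative names, not any real API):

        #include <stdint.h>

        /* One palette entry: 8 bits each of red, green, and blue,
         * giving the 16-million-color master palette. */
        typedef struct { uint8_t r, g, b; } PaletteEntry;

        #define PALETTE_SIZE     256
        #define RESERVED_ENTRIES  20   /* grabbed by Windows */
        #define USABLE_ENTRIES   (PALETTE_SIZE - RESERVED_ENTRIES) /* 236 */

        PaletteEntry palette[PALETTE_SIZE];

        /* In indexed mode a pixel is not a color but an index into
         * the palette; its on-screen color is resolved like this. */
        PaletteEntry pixel_color(const uint8_t *framebuffer, int i)
        {
            return palette[framebuffer[i]];
        }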

    However, I never see this mentioned in all the discussions of color in
    this NG. Surely it affects how images look on the screen. For example,
    if this is true, you can never display a grayscale with more than 236
    different levels of intensity.

    While this might be more than enough for a gray scale, when you put
    all the colors in a photo on the screen, I would think the 236 color
    limit would be important.

    What am I missing?

    TIA
    Duncan
     
    BioColor, Feb 24, 2005
    #1

  2. Today's video cards don't use palettes - they typically have 8 bits of
    red, 8 of green, and 8 of blue. How long ago was your graphics
    programming?

    David
     
    David J Taylor, Feb 24, 2005
    #2

  3. BioColor

    Owamanga Guest

    What you are missing is a video card that was made in the last 10
    years.

    Where the hell are you? Elbonia?

    Today's video cards (and even those of 5 years ago) are no longer
    limited by this palette method. They can still use it if you wish,
    but most people run their cards in high-color (16 bit) or
    true-color (24 bit) modes, which don't use a palette at all.
     
    Owamanga, Feb 24, 2005
    #3
  4. BioColor

    Chris Brown Guest

    Possibly the fact that for the last decade, consumer video cards have
    allowed high resolution "true-colour" displays which don't indirect via a
    256 entry LUT, but just allocate a 32 bit word for each pixel and store a
    full RGB triplet in each one, allowing each pixel on the screen to take any
    one of the 16,777,216 colours that exist in a 24 bit colour range?
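
    (In rough C terms, a sketch of that, assuming a 0x00RRGGBB layout,
    which in reality varies by platform:)

        #include <stdint.h>

        /* True-colour: each pixel stores its own RGB triplet packed
         * into a 32-bit word; no lookup table is involved. */
        static uint32_t pack_rgb(uint8_t r, uint8_t g, uint8_t b)
        {
            return ((uint32_t)r << 16) | ((uint32_t)g << 8) | b;
        }

        /* Any pixel can independently take any of the
         * 2^24 = 16,777,216 colours. */
        void set_pixel(uint32_t *fb, int width, int x, int y,
                       uint8_t r, uint8_t g, uint8_t b)
        {
            fb[y * width + x] = pack_rgb(r, g, b);
        }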
     
    Chris Brown, Feb 24, 2005
    #4
  5. BioColor

    RSD99 Guest

    256-color paletted color?

    As all of the other posters have mentioned, this hasn't been in use for
    something like at least a decade. In fact, I have an old computer across
    the room set up for Windows 95. It was upgraded from Windows for
    Workgroups, version 3.11. It is at least ten years old. It has **always**
    been set up for the so-called "high-color" (16 bit) video mode.

    What you are missing ... is about the last ten to fifteen years.
     
    RSD99, Feb 24, 2005
    #5
  6. BioColor

    Lionel Guest

    I don't think I've seen a PC with palette-based video in ten years or
    more. Modern machines are usually 24 bit colour, & often 32 bit colour.
    The traditional method was to grab most of the palette, assign a very
    carefully selected set of colours to it, & use error-diffusion dithering
    to simulate 24 bit colour. (It worked surprisingly well if you were a
    yard or more away from the screen.) Fortunately, we don't have to jump
    through those sorts of hoops any more.
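
    (For the curious: the usual error-diffusion method was some variant
    of Floyd-Steinberg. A rough single-channel C sketch, assuming evenly
    spaced intensity levels:)

        /* Quantize v (0..255) to the nearest of `levels` evenly
         * spaced intensities, e.g. the grays in a fixed palette. */
        static int quantize(int v, int levels)
        {
            int step = 255 / (levels - 1);
            int q = (v + step / 2) / step * step;
            return q > 255 ? 255 : q;
        }

        /* Floyd-Steinberg: push each pixel's rounding error onto its
         * not-yet-processed neighbours (weights 7, 3, 5, 1 over 16). */
        void dither(int *img, int w, int h, int levels)
        {
            for (int y = 0; y < h; y++)
                for (int x = 0; x < w; x++) {
                    int v = img[y * w + x];
                    if (v < 0) v = 0; else if (v > 255) v = 255;
                    int q = quantize(v, levels);
                    int e = v - q;
                    img[y * w + x] = q;
                    if (x + 1 < w)              img[y * w + x + 1]       += e * 7 / 16;
                    if (y + 1 < h && x > 0)     img[(y + 1) * w + x - 1] += e * 3 / 16;
                    if (y + 1 < h)              img[(y + 1) * w + x]     += e * 5 / 16;
                    if (y + 1 < h && x + 1 < w) img[(y + 1) * w + x + 1] += e * 1 / 16;
                }
        }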
     
    Lionel, Feb 24, 2005
    #6
  7. BioColor

    scott Guest

    > I don't think I've seen a PC with palette-based video in ten years

    32-bit colour? Not heard that one before. How many bits are for RGB,
    something like 10, 12, 10?
     
    scott, Feb 24, 2005
    #7
  8. BioColor

    BioColor Guest

    Ah. Silly me. It's very cold here in Elbonia, and my brain must have
    been frozen.

    I take files of floating point numbers and, for display, I convert
    them to those RGB intensities using a palette. With the image on the
    screen, I interactively modify the colors by modifying the palette (in
    my brand new Radeon card), instead of destructively modifying the rgb
    intensities of each dot on the screen in a non-palettized mode.
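
    (Roughly, the mapping is this; a minimal C sketch of the idea, not
    the actual code:)

        #include <stdint.h>

        #define USABLE 236  /* palette slots left after Windows' 20 */

        /* Map a data value into one of the usable palette slots.
         * Recoloring the display then means rewriting the palette
         * entries, never touching the stored pixel indices. */
        uint8_t value_to_index(double v, double lo, double hi)
        {
            double t = (v - lo) / (hi - lo);    /* normalize to [0,1] */
            if (t < 0.0) t = 0.0;
            if (t > 1.0) t = 1.0;
            return (uint8_t)(t * (USABLE - 1)); /* index 0..235 */
        }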

    Fortunately, all the new cards still support palettes.

    The original code and concept are from 1975. After a bunch of ports
    and rewrites, it finally ended up on a PC when you needed $8,000 worth
    of extra hardware to show anything beyond monochrome. A few years ago
    the 8-bit color other posters have recalled began to misbehave on the
    newer cards, and I rewrote it to work in truecolor. I still use the
    palettes, though, and they are still limited to 236 colors in VB.

    Duncan
     
    BioColor, Feb 24, 2005
    #8
  9. BioColor

    Pete Fenelon Guest

    This is actually correct (ish) for about, er, 1991, when my 1-megabyte
    Trident 8900 video card could do 1024x768x8-bits and you paid zillions
    of dollars for 24-bit graphics on an SGI.

    So what you're missing is about 15 years.

    pete
     
    Pete Fenelon, Feb 24, 2005
    #9
  10. BioColor

    Pete Fenelon Guest

    IIRC you use fewer bits for blue if you're doing 8-bit non-palette
    graphics; I used to use the Research Machines colour graphics card in
    the early 80s and that had 3 bits of red, 3 bits of green and 2 bits of
    blue in its palette.
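
    (That 3-3-2 split packs into a single byte like so; a rough C
    sketch:)

        #include <stdint.h>

        /* 3 bits red, 3 bits green, 2 bits blue in one byte. Blue
         * gets the fewest bits because the eye is least sensitive
         * to it. */
        static uint8_t pack_rgb332(uint8_t r, uint8_t g, uint8_t b)
        {
            return (uint8_t)((r & 0xE0) | ((g & 0xE0) >> 3) | (b >> 6));
        }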

    pete
     
    Pete Fenelon, Feb 24, 2005
    #10
  11. BioColor

    Lionel Guest

    IIRC, it'll be 8x3, plus an alpha channel, or 10x3, & I have no idea
    what the extra bits are used for, if anything.
     
    Lionel, Feb 24, 2005
    #11
  12. BioColor

    Owamanga Guest

    <g>

    I can't remember the exact number of bits involved, but modern video
    cards introduce the concept of transparency - the alpha channel. It
    needs some bits.

    I'm guessing 24 bits for color (8 bits per channel) and 8 bits for the
    alpha channel. Legend suggests you get better performance in 32-bit
    mode because each pixel then sits on a 32-bit word boundary.
     
    Owamanga, Feb 24, 2005
    #12
  13. Thanks, all, for being mostly gentle about my brain burp. Silly me.
    Maybe it was that full moon...

    Duncan
     
    Duncan Chesley, Feb 24, 2005
    #13
  14. BioColor

    Colin D Guest

    With a 32-bit operating system (Win 95 upwards) it is more efficient
    from the processor pov to shift data in 32-bit chunks. 32-bit graphics
    cards actually only use 24-bit color, but the extra byte goes along for
    efficiency. I don't know what's in it tho'.

    Colin
     
    Colin D, Feb 24, 2005
    #14
  15. BioColor

    Jim Townsend Guest

    Running Windows.. If you right-click on the desktop, then choose
    Properties -> Settings.

    In the Settings tab, you should see the option to choose 32 bit color.

    As others have mentioned, it provides the same number of colors
    as 24 bit.. The extra bits are for the alpha channel..

    http://www.answers.com/main/ntquery?method=4&dsid=1512&dekey=bit+depth&gwp=8&curtab=1512_1
     
    Jim Townsend, Feb 24, 2005
    #15
  16. BioColor

    JohnR66 Guest

    My computer is set to 32 bit; there is no 24-bit mode, only 16 below
    that. If this is true, I can see why there is no 24-bit mode.
    John
     
    JohnR66, Feb 25, 2005
    #16
  17. BioColor

    Alex Butcher Guest

    24 bits are used for the RGB colour triple, and sometimes the
    remaining 8 bits are used to implement an alpha channel (i.e. for
    blending and mixing). Frequently, the remaining 8 bits are left
    unused, purely to improve performance.
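
    (A rough C sketch of that blending, the classic "source over"
    composite applied per colour channel, with alpha 255 = opaque:)

        #include <stdint.h>

        /* Blend a source channel over a destination channel using
         * the source's 8-bit alpha. */
        static uint8_t blend(uint8_t src, uint8_t dst, uint8_t alpha)
        {
            return (uint8_t)((src * alpha + dst * (255 - alpha)) / 255);
        }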
    Best Regards,
    Alex.
     
    Alex Butcher, Mar 6, 2005
    #17