Problem With HDTV Images From Space Station

Discussion in 'Digital Photography' started by David Ruether, Dec 16, 2006.

  1. While waiting for the first live broadcast in HD from the international
    space station I caught the first half of a program on the Apollo 11 trip
    to the moon. One interesting aspect was the brief bright lights seen
    in the cabin, later attributed to mysterious "Z" rays passing through it
    (as I recall). They did not appear to do any damage, though. But
    I watched the HD broadcast and noticed what appeared to be many
    tiny white spots in the image which at first I thought were caused by
    dust. On seeing a repeat of the program it was obvious that these
    white spots could not be caused by dust in or on the lens, and their
    sharpness and lightness probably precluded them from being caused
    by dust on the sensors (and they did not change with light levels, but
    they were clearly visible on my particularly sharp HD display at a
    scale of about one pixel in two million). I wonder if "Z" rays (or
    something similar) make shooting digitally in space difficult without
    accepting some damage to the sensors (and the resultant images).
     
    David Ruether, Dec 16, 2006
    #1

  2. Ed Velez (Guest) replied:

    Now you have me thinking... I saw the same HD episode on Discovery
    HD while they were interviewing up in space, and I could see the same
    white spots. It was not pixelation or a weak signal, since they always
    stayed in the same spot on the screen.


     
    Ed Velez, Dec 16, 2006
    #2

  3. Yes. Aperture changes and light source position changes did not
    make any difference, and the spots were very sharp and small (and
    white), making dust on the sensors or lens unlikely. It seems that the
    only remaining possibility is that the sensors were damaged. If so,
    I wonder how - and whether this really means that digital photography
    in space has some basic problem associated with it...
     
    David Ruether, Dec 16, 2006
    #3
  4. ~~NoMad~~ (Guest) replied:

    Those spots were defects in the HD CCD - hot pixels, if you like. Most
    consumer and professional cameras have software built in to correct for
    defects in the CCDs, but NASA does not want this pixel-correction
    software on any of their cameras. They expect to do any correction they
    want on the ground. This way they have true raw images from the HD CCD,
    so that if they ever need to analyze the data very carefully at a later
    date, they will have the 'original' raw video.


    NM
     
    ~~NoMad~~, Dec 16, 2006
    #4
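The correction NoMad describes - keeping a map of known bad pixel locations and filling each one in from its neighbors - can be sketched in a few lines. This is a minimal illustration, not NASA's (or any camera maker's) actual pipeline; the median-of-neighbors rule and the bad-pixel map are assumptions for the example:

```python
# Hypothetical hot-pixel correction: replace each mapped bad pixel
# with the median of its valid 8-connected neighbors.
from statistics import median

def correct_hot_pixels(frame, bad_pixels):
    """frame: 2-D list of luma values; bad_pixels: set of (row, col)."""
    h, w = len(frame), len(frame[0])
    fixed = [row[:] for row in frame]  # work on a copy
    for r, c in bad_pixels:
        neighbors = [
            frame[rr][cc]
            for rr in range(r - 1, r + 2)
            for cc in range(c - 1, c + 2)
            if (rr, cc) != (r, c)
            and 0 <= rr < h and 0 <= cc < w
            and (rr, cc) not in bad_pixels  # don't sample other bad pixels
        ]
        fixed[r][c] = median(neighbors)
    return fixed

# A dark frame should read near-black; the stuck pixel reads full scale.
dark = [[2, 3, 2], [3, 255, 2], [2, 2, 3]]
print(correct_hot_pixels(dark, {(1, 1)})[1][1])  # -> 2.0, the 255 is gone
```

The map itself is typically built by shooting dark frames and flagging any site that reads far above the noise floor - which is exactly the analysis NASA can defer to the ground if the raw video is preserved.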
  5. ~~NoMad~~ (Guest) added:

    And BTW: this is an excellent example of what a typical CCD in a camera
    or camcorder would look like if it didn't have pixel-correction software.

    NM
     
    ~~NoMad~~, Dec 16, 2006
    #5
  6. Ah, your explanation seems entirely logical, especially given the apparent
    single-pixel size of the spots. Thanks.
     
    David Ruether, Dec 17, 2006
    #6
  7. John Turco (Guest), replying to ~~NoMad~~:

    Hello, NoMad:

    My Kodak P850 digicam has a few bad pixels. It's still under warranty,
    but if I decided to exchange it, what are the odds that I'd get a camera
    with a "perfect" CCD?

    Is there anything else I could do, perhaps?

    Thanks!


    Cordially,
    John Turco
     
    John Turco, Dec 18, 2006
    #7
  8. ~~NoMad~~ (Guest) replied:

    If you send it back to the factory, they may decide to run special
    software on it that maps out the bad pixels. I've heard of cases where
    CCDs actually deteriorate after manufacture and new bad pixels form,
    but this condition seems very rare. Usually, once the bad pixels are
    mapped out, they stay clear for the future.

    NM
     
    ~~NoMad~~, Dec 19, 2006
    #8
  9. It doesn't make all that much sense to me.

    In a 'normal' 3-CCD video camera, a stuck pixel is just one primary
    color. For some reason it is usually blue (I guess the blue signal is
    amplified more, but I am not sure).

    On a Bayer-pattern sensor it is not clear what would happen, but I
    don't expect a single stuck pixel to always be white.

    For a scanning back with a filter wheel, I would expect a stuck pixel
    to be white, but I doubt NASA uses that type of camera for HDTV.
     
    Philip Homburg, Dec 20, 2006
    #9
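Philip's Bayer-sensor point can be demonstrated: push a single mosaic sample to full scale and run even a crude neighborhood-average demosaic, and the result is a tinted spot, not a white one. The RGGB layout and box-average interpolation below are simplifying assumptions for illustration, not any specific camera's algorithm:

```python
# Illustrative: one stuck sample on a Bayer mosaic comes out tinted,
# not white, after a crude neighborhood-average demosaic.
def bayer_color(r, c):
    """Color filter at (r, c) in an assumed RGGB tiling."""
    return {(0, 0): "R", (0, 1): "G", (1, 0): "G", (1, 1): "B"}[(r % 2, c % 2)]

def demosaic_pixel(mosaic, r, c):
    """Estimate (R, G, B) at an interior pixel by averaging each
    channel's samples in the surrounding 3x3 window."""
    sums = {"R": 0, "G": 0, "B": 0}
    counts = {"R": 0, "G": 0, "B": 0}
    for rr in range(r - 1, r + 2):
        for cc in range(c - 1, c + 2):
            ch = bayer_color(rr, cc)
            sums[ch] += mosaic[rr][cc]
            counts[ch] += 1
    return tuple(sums[ch] / counts[ch] for ch in "RGB")

# Uniform mid-gray scene; one sample stuck at full scale on a green site.
mosaic = [[50] * 6 for _ in range(6)]
mosaic[1][2] = 255  # (1, 2) is a G site in the RGGB tiling
print(demosaic_pixel(mosaic, 1, 2))  # -> (50.0, 91.0, 50.0): green, not white
```

Only the green channel is inflated, so the artifact would show as a colored dot - which is why uniformly white spots suggest something other than a single stuck Bayer photosite.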
  10. ~~NoMad~~ (Guest) replied:

    I expect that if NASA wanted their HD camera to be a high-precision
    instrument capable of registering the most accurate image possible,
    they would use a single-CCD camera with a filter wheel. That way the
    camera could be carefully characterized, and they would know exactly
    where each good pixel is pointing.

    NM

    P.S. NASA has always been known for using high-precision,
    large-dynamic-range B&W sensors that use filters to extract color data.
     
    ~~NoMad~~, Dec 20, 2006
    #10
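The filter-wheel scenario also explains why the spots would come out white: a defect on a monochrome sensor appears identically in every filtered exposure, so the composited color at that site is neutral. A toy sketch (the values and helper names are illustrative, not any real acquisition pipeline):

```python
# Illustrative: a monochrome sensor behind a filter wheel takes three
# sequential exposures; a stuck pixel reads full scale in all three,
# so the composited color at that site is neutral white.
def composite(red, green, blue):
    """Stack three monochrome exposures into one (R, G, B) image."""
    return [
        [(red[r][c], green[r][c], blue[r][c]) for c in range(len(red[0]))]
        for r in range(len(red))
    ]

def stuck(frame, r, c, value=255):
    """Return a copy of frame with the pixel at (r, c) stuck at value."""
    out = [row[:] for row in frame]
    out[r][c] = value
    return out

gray = [[50] * 4 for _ in range(4)]
# The same physical pixel is stuck in every filtered exposure.
exposures = [stuck(gray, 1, 1) for _ in range(3)]
print(composite(*exposures)[1][1])  # -> (255, 255, 255): a white spot
```

Equal R, G, and B at the defect site is exactly the sharp white dot described at the top of the thread, which fits the single-sensor-plus-filters design NoMad attributes to NASA.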
