Announcements: After lecture, we will start with basic image processing using MaxIm DL. Dark Sky Night is scheduled for tomorrow night (Wednesday). The weather forecast looks bad: the rain may end, but it will still be cloudy and very humid. A cancellation notice will be posted tomorrow by 5:00 pm. If I do cancel, class will meet Thursday night. If it is clear (most forecasts say it will be), meet at the farm to do imaging; if not, meet here.
Digital Imaging
Basics: The Photoelectric Effect
Photoelectric Effect: The Theory
The MOS works on the photoelectric effect. By “catching” the liberated electrons in the MOS device, you lose the frequency information. Thus, the MOS simply measures the intensity (# of photons), not the color.
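To see what that means in practice, here is a minimal Python sketch (the quantum efficiency of 0.6 and the photon numbers are assumptions for illustration, not values from the lecture). Every detected photon frees one electron regardless of its wavelength, so the stored charge records only how many photons arrived, not their colors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical photons arriving at one pixel during an exposure:
# each has a wavelength (nm), drawn here purely for illustration.
wavelengths_nm = rng.uniform(400, 700, size=10_000)

QE = 0.6  # assumed quantum efficiency: fraction of photons that free an electron

# Each detected photon liberates one electron no matter what its wavelength is,
# so the color (frequency) information is lost at this step.
detected = rng.random(wavelengths_nm.size) < QE
electrons = np.count_nonzero(detected)

print(f"photons hitting the pixel : {wavelengths_nm.size}")
print(f"electrons stored in well  : {electrons}")   # intensity only, no color
```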
CCD Versus CMOS: the difference is in the on-pixel amplifier.
Each pixel is an MOS photodiode. The liberated electrons are held in the potential well until they are read out.
CMOS includes amplifiers on each pixel while CCDs do not. As a result, not all of the area on a CMOS pixel is available to collect light. The amplifiers also tend to be nonlinear, so CMOS sensors aren’t as useful for photometry.
A CCD array has millions of individual devices. The smaller the pixel, the higher the resolution. But the smaller the pixel, the fewer electrons it can store before its well fills up.
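A rough back-of-the-envelope sketch of that trade-off, assuming a full-well capacity of about 1,000 electrons per square micron (a typical order of magnitude, not a value from the lecture):

```python
# Rough sketch of the pixel-size trade-off.
# Assumption (not from the lecture): full-well capacity scales with pixel
# area at roughly 1,000 electrons per square micron.
ELECTRONS_PER_UM2 = 1_000

for pixel_um in (2.4, 3.8, 5.4, 9.0):
    full_well = ELECTRONS_PER_UM2 * pixel_um ** 2
    print(f"{pixel_um:4.1f} um pixel -> ~{full_well:8,.0f} e- before saturation")
```

Smaller pixels sample the image more finely, but they saturate sooner on bright objects.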
Downloading the array is like running a series of conveyor belts: the rows of charge are shifted down the columns one step at a time into a readout register, which then shifts the charges out pixel by pixel to the output amplifier.
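A minimal sketch of that conveyor-belt readout, using a tiny hypothetical 3×4 array of electron counts: each parallel shift drops one row into a serial register, and the serial register is then clocked out one pixel at a time.

```python
import numpy as np

# Tiny hypothetical array of electron counts (3 rows x 4 columns).
image = np.array([[10, 20, 30, 40],
                  [50, 60, 70, 80],
                  [15, 25, 35, 45]])

readout = []
frame = image.copy()
for _ in range(frame.shape[0]):
    # Parallel shift: every row moves one step toward the serial register,
    # and the row nearest it drops in.
    serial_register = frame[-1].copy()
    frame = np.roll(frame, 1, axis=0)
    frame[0] = 0                      # empty wells come in at the top
    # Serial shift: the register is clocked out pixel by pixel.
    for _ in range(serial_register.size):
        readout.append(serial_register[-1])
        serial_register = np.roll(serial_register, 1)
        serial_register[0] = 0

print(readout)  # every pixel value, delivered one charge packet at a time
```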
If a pixel gets overfilled (saturated), it “blooms” into pixels down the line
Anti-blooming architecture helps eliminate blooming.
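A small sketch of blooming, assuming a hypothetical full-well capacity of 50,000 electrons: any charge above the full well spills into the next pixel along the column, producing the familiar streak from a saturated star.

```python
import numpy as np

FULL_WELL = 50_000  # assumed full-well capacity (electrons)

def bloom(column):
    """Push charge above the full well into the next pixel down the column."""
    col = column.astype(float).copy()
    for i in range(col.size - 1):
        excess = col[i] - FULL_WELL
        if excess > 0:
            col[i] = FULL_WELL
            col[i + 1] += excess       # overflow spills "down the line"
    col[-1] = min(col[-1], FULL_WELL)  # last pixel just clips
    return col

# One column containing a badly saturated pixel (e.g. a bright star).
column = np.array([2_000, 180_000, 3_000, 1_500, 1_200])
print(bloom(column))   # the excess charge streaks down the column
```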
The download is a numerical array of how many photons hit each pixel. The number of electrons in each pixel is proportional to the number of photons that hit that pixel. The result is a spreadsheet of numbers that is easy to manipulate mathematically.
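For example, here is a minimal numpy sketch of the kind of manipulation that becomes trivial once the image is just an array of numbers: subtracting a dark frame and dividing by a normalized flat field. The arrays here are made up for illustration; real ones would come from the camera download.

```python
import numpy as np

# Hypothetical downloaded arrays of counts (real ones would come from FITS files).
rng = np.random.default_rng(1)
light = rng.poisson(1_200, size=(4, 4)).astype(float)  # image of the sky
dark  = rng.poisson(   90, size=(4, 4)).astype(float)  # same exposure, shutter closed
flat  = rng.normal(1.0, 0.02, size=(4, 4))             # normalized pixel sensitivities

# Because the image is just a spreadsheet of numbers,
# calibration is plain element-by-element arithmetic.
calibrated = (light - dark) / flat
print(np.round(calibrated, 1))
```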
The numbers in the array can be translated to a grayscale
Once each pixel is assigned a shade of gray, a black & white image is created from the spreadsheet.
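A minimal sketch of that translation, assuming a simple linear stretch of the counts onto 256 shades of gray (0 = black, 255 = white); real software such as MaxIm DL offers many other stretches.

```python
import numpy as np

def to_grayscale(counts, black=None, white=None):
    """Linearly map an array of counts onto 8-bit gray levels (0-255)."""
    black = counts.min() if black is None else black
    white = counts.max() if white is None else white
    scaled = (counts - black) / (white - black)
    return (np.clip(scaled, 0.0, 1.0) * 255).astype(np.uint8)

counts = np.array([[1_050,  1_200, 30_000],
                   [1_100,  9_500,  1_080],
                   [1_060,  1_150,  1_040]])
print(to_grayscale(counts))   # bright pixels map toward 255, sky toward 0
```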
Color imaging requires filters. For pretty pictures we use red, green, blue, and luminance. For photometry we use UBVRI (Johnson set) or ugriz (SDSS set). Other systems extend into the UV or IR ends of the spectrum.
It is possible to put filters directly on the CCD chip. All color digital cameras use microfilters. Notice there are more greens than reds or blues. Different chips have different wavelength sensitivities.
A color chip has a lower resolution since each color “pixel” is actually four physical pixels. Silicon chips are most sensitive in the infrared and red and least sensitive in the blue. An infrared-blocking filter covers the entire chip. The “white balance” adjusts the amplification of each of the colors.
A color image download has to account for the colors: each pixel’s value must be combined with those of its red, green, and blue neighbors to reconstruct a full-color image.
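A minimal sketch of that bookkeeping, assuming the common RGGB Bayer layout (one red, two green, one blue microfilter per 2×2 cell): each 2×2 group of raw counts is collapsed into a single color pixel, which is why the color resolution is lower than the raw pixel count.

```python
import numpy as np

# Hypothetical raw download from a color chip with an RGGB Bayer mosaic:
#   R G R G ...
#   G B G B ...
rng = np.random.default_rng(2)
raw = rng.poisson(1_000, size=(4, 4)).astype(float)

def simple_demosaic(raw):
    """Collapse each 2x2 RGGB cell into one (R, G, B) color pixel."""
    r = raw[0::2, 0::2]                             # red sites
    g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2.0   # average the two green sites
    b = raw[1::2, 1::2]                             # blue sites
    return np.stack([r, g, b], axis=-1)             # shape: (rows/2, cols/2, 3)

color = simple_demosaic(raw)
print(color.shape)   # (2, 2, 3): half the resolution in each direction, 3 colors
print(color)
```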