In 1984, an annular solar eclipse was visible from Texas, and every school-aged student made a pinhole projector that let them view a projected image of the sun and the shadow of the moon as it passed in front.
Now everyone has a smartphone, so in honor of the upcoming August eclipse, why not make “an app for that”?
Update: Now takes time-lapse photos
Warning
DO NOT stare directly into the sun. Viewing the eclipse directly can damage your eyes. JUST DON’T DO IT.
READ THE INSTRUCTIONS CAREFULLY.
Download
On your Android phone, navigate to this page, then click the link below to download the “View The Eclipse” Android application:
https://justinparrtech.com/JustinParr-Tech/wp-content/uploads/ViewTheEclipse.apk
Once downloaded, open the application using Android’s file manager.
You might be prompted to “allow installation from an unknown source”. If so, allow it, install this app, then go back to Settings and turn off “Allow installation from unknown sources”.
Update: Now takes time-lapse photos. Set the timer to an interval from 5 seconds to 10 minutes. Press the camera button to start recording, and the app will take a photo every timer interval.
Instructions
Preparing the Viewing Device
- Download the app from the link above.
- You will need a Google Cardboard viewing device, or a small cardboard box.
- Here is an Instructable with a template and instructions for making a Google Cardboard viewing device:
http://www.instructables.com/id/Google-Cardboard-20/
- MAKE SURE there is a hole about the size of a quarter in the viewing device, allowing the phone’s camera to “see through” the back of the viewing device.
- If you use a small cardboard box:
- You need a box about the size of your phone, such that the phone can lay flat (display facing up, camera facing down) on the BOTTOM of the box
- Cut a small hole about the size of a quarter in the BOTTOM of the box, corresponding to the location of your phone’s camera, such that your phone’s camera can see clearly through the hole
- Place your phone in the box, display facing up, camera facing down, and tape it in place.
- Hold the box up to your face, so that you are looking in to the box, and your forehead touches the top edge of the box (like a pair of binoculars).
- Look for any slits or holes where light is leaking in, and tape them using heavy-duty duct tape.
Note: Whether using Cardboard or a small box, there needs to be a hole big enough for your camera to see clearly through the bottom / back of the device. Note that this needs to be MUCH LARGER than a pinhole – the hole should be about the size of a quarter, such that you could easily take a picture using your phone, through the hole.
To view the eclipse:
- Do a dry run at least 1 day in advance, to make sure everything works as expected.
- Set your phone’s display to a comfortable viewing brightness level.
- Place your phone inside the device, and tape it in place if necessary
- Start the “View the Eclipse” application
- If using Google Cardboard, tap the screen in order to switch to STEREO viewing mode.
Tap the screen to switch between MONO and STEREO mode, and back again.
- Place the viewing device against your face, using it like a pair of binoculars.
- MAKE SURE there is no ambient light leaking in. If there is, block any holes or cracks with heavy duty duct tape or similar.
- Use the device like a pair of binoculars, orienting your view toward the sun.
- DO NOT STARE DIRECTLY AT THE SUN. You should be looking ONLY INSIDE THE BOX, and you should ONLY be able to see the phone’s display, with the “View the Eclipse” app running. The sun should APPEAR RED. If you can see any direct (white) sunlight, go back and review the instructions again.
How it Works
The app takes raw data from your phone’s camera’s preview function, converts it to RGB format, and manipulates the RGB data in order to reduce the image intensity and provide greater contrast.
RGB stands for “Red Green Blue”, and describes how a color display encodes color information.
Each dot, or “pixel” on your smart phone’s display is made up of a tiny red light, green light, and blue light. Assigning each pixel an RGB value sets it to a specific color.
Red, green, and blue are the primary colors of light, and can be combined to form any other color value. For example, if you shine a red light on a white surface, you will see that the surface turns red. If you add a green light, the surface now turns yellow, because red light combined with green light forms yellow. If you slowly adjust the intensity of the red light, turning it down very slowly, you will observe that the resulting light value shifts from yellow, to yellow-green, and eventually to green. Likewise, green and blue combine to form cyan, and red and blue combine to form magenta.
Unlike paint colors, black is the absence of any light, and white is formed by combining all three colors equally.
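The additive mixing described above can be sketched with 24-bit hex color values. This is an illustrative snippet, not code from the app; the class and method names are made up for the example:

```java
// Illustrative sketch: additive mixing of RGB light sources.
// OR-ing works here because each pure primary occupies its own byte.
public class AdditiveColor {
    // Combine two light sources (a simplification valid for full-intensity primaries)
    static int mix(int a, int b) {
        return a | b;
    }

    public static void main(String[] args) {
        int red   = 0xFF0000;
        int green = 0x00FF00;
        int blue  = 0x0000FF;
        System.out.printf("red + green  = %06X%n", mix(red, green));  // FFFF00 (yellow)
        System.out.printf("green + blue = %06X%n", mix(green, blue)); // 00FFFF (cyan)
        System.out.printf("red + blue   = %06X%n", mix(red, blue));   // FF00FF (magenta)
        System.out.printf("all three    = %06X%n", mix(mix(red, green), blue)); // FFFFFF (white)
    }
}
```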
RGB format assigns a value to each channel, for each pixel, so that a complete color image is formed.
However, your phone’s camera captures data in YUV format (sometimes called YCbCr), where the main Y channel provides a luminance (intensity) reference value, and the U and V channels provide “blueness” and “redness”, respectively.
YUV started as a simple way to transmit television signals so that older black-and-white televisions and newer color televisions could share the same signal: a black-and-white television could ignore the U and V channels and render only the Y (luminance) channel, while color televisions used the U (blue) and V (red) channels in addition to the Y data to determine what color to make each pixel.
Because the human eye is more sensitive to contrast and less sensitive to fine color detail, YUV is also widely used because encoding schemes can strip off most of the color data while leaving all of the contrast data intact, saving a tremendous amount of storage and increasing image data throughput.
So both by convention and due to the desire to improve image throughput, most cameras capture raw image data in YUV format.
Therefore, the first step is to convert from YUV to RGB, and fortunately, there is an internal library which handles that.
For more details, including formulas for converting YUV to RGB, check out this Wikipedia article: YUV
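As a sketch, one common full-range BT.601 YUV-to-RGB conversion looks like this in Java. The app relies on an internal Android library for this step, so treat the coefficients and names here as illustrative rather than the app's actual implementation:

```java
// Sketch of a full-range BT.601 YUV -> RGB conversion.
// y is in [0, 255]; u and v are chroma samples centered at 128.
public class YuvToRgb {
    // Clamp a value into the valid 8-bit channel range
    static int clamp(int x) {
        return Math.max(0, Math.min(255, x));
    }

    static int[] toRgb(int y, int u, int v) {
        int r = clamp((int) Math.round(y + 1.402 * (v - 128)));
        int g = clamp((int) Math.round(y - 0.344 * (u - 128) - 0.714 * (v - 128)));
        int b = clamp((int) Math.round(y + 1.772 * (u - 128)));
        return new int[] { r, g, b };
    }

    public static void main(String[] args) {
        // With chroma at the neutral value 128, R = G = B = Y (a gray pixel)
        int[] gray = toRgb(128, 128, 128);
        System.out.println(gray[0] + " " + gray[1] + " " + gray[2]); // 128 128 128
    }
}
```

Note how a black-and-white image falls out for free: with U = V = 128 the chroma terms vanish and every pixel is just its Y value, which is exactly why old monochrome TVs could ignore the color channels.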
Once the data is in RGB format, each pixel is a 32-bit unsigned integer containing 8 bits per channel (0 to 255 decimal) for the alpha (transparency), red, green, and blue channels, respectively:
AAAAAAAA RRRRRRRR GGGGGGGG BBBBBBBB
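In Java (the language most Android apps are written in), packing and unpacking that layout comes down to shifts and masks. This helper class is illustrative, not the app's actual code:

```java
// Sketch: packing four 8-bit channels into one 32-bit ARGB int,
// matching the AAAAAAAA RRRRRRRR GGGGGGGG BBBBBBBB layout above.
public class ArgbPixel {
    static int pack(int a, int r, int g, int b) {
        return (a << 24) | (r << 16) | (g << 8) | b;
    }
    static int alpha(int p) { return (p >>> 24) & 0xFF; }
    static int red(int p)   { return (p >>> 16) & 0xFF; }
    static int green(int p) { return (p >>> 8) & 0xFF; }
    static int blue(int p)  { return p & 0xFF; }

    public static void main(String[] args) {
        int opaqueWhite = pack(255, 255, 255, 255);
        System.out.printf("%08X%n", opaqueWhite); // FFFFFFFF
        System.out.println(red(opaqueWhite));     // 255
    }
}
```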
Once the data is in RGB format, we apply a filter, to modify the RGB data for each pixel:
ALPHA - no change
RED - AND 10101010 (170 decimal)
GREEN - AND 00111111 (63 decimal)
BLUE - AND 00111111 (63 decimal)
Because red light is easiest on the eyes, we use RED as the main component of our output. We AND it with 170 decimal (10101010 binary) to strip off all odd bits of color data, thereby increasing contrast so that more details are visible.
We want to retain some color data in all three channels, but we also want to make sure that white light becomes red; the easiest way to do this is to strip the two high bits from the GREEN and BLUE channels.
This will result in muddy blue-brown-green colors for darker intensities, while higher intensity white colors become one of several specific shades of red (due to the contrast filter).
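Since the pixel is already one packed ARGB int, the whole per-channel filter collapses to a single AND with the combined mask 0xFFAA3F3F (FF alpha, AA red, 3F green, 3F blue). This sketch shows the idea; it is not the app's actual source:

```java
// Sketch of the per-pixel filter described above: keep alpha (0xFF),
// AND red with 0xAA (10101010), AND green and blue with 0x3F (00111111).
public class EclipseFilter {
    static final int MASK = 0xFFAA3F3F;

    static int filter(int argb) {
        return argb & MASK;
    }

    public static void main(String[] args) {
        int white = 0xFFFFFFFF; // bright white sunlight
        // White collapses to a specific dim red, as the article describes
        System.out.printf("%08X%n", filter(white)); // FFAA3F3F
    }
}
```

In a real preview callback this AND would run once per pixel over the converted frame buffer, which is why a single-mask formulation matters for speed.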
Although early television tubes, photomultipliers, and early CCD circuits could be damaged by prolonged exposure to direct sunlight, the modern image sensors (typically CMOS) used in most smartphones are largely immune.
This allows the sensor to capture a small portion of the sun’s light (a subsample) within its sensitivity range, which is then filtered using the scheme above to produce an image that the human eye can view directly without strain or damage.
Can I Download This From Google Play?
No.
Google Play will ban anything they don’t understand, and a ban would result in a “strike” against my developer account.
Because Google Play is fundamentally flawed in this respect, I have chosen to self-publish.