IMAGE PROCESSING GALLERY
For those of you who have contributed – thank you! Your labors of love have illustrated articles about Juno, Jupiter and JunoCam. Your products show up in all sorts of places. I have used them to report to the scientific community. We are writing papers for scientific journals and using your contributions – always with appropriate attribution of course. Some creations are works of art and we are working out ways to showcase them as art.
If you have a favorite “artist” you can create your own gallery. Click on “Submitted by” on the left, select your favorite artist(s), and then click on “Filter”. For other tips about the gallery click on the “Gallery Organization” tab.
We have a methane filter, included for the polar science investigation, that is almost at the limits of our detector’s wavelength range. To get enough photons for an image we need to use a very long exposure. In some images this results in scattered light in the image. For science purposes we will simply crop out the portions of the image that include this artifact. Work is in progress to determine exactly what conditions cause stray light problems so that this can be minimized for future imaging.
The JunoCam images are identified by a small spacecraft icon. You will see both raw and processed versions of the images as they become available. The JunoCam movie posts have too many images to post individually, so we are making them available for download in batches as zip files.
You can filter the gallery by many different characteristics, including by Perijove Pass, Points of Interest and Mission Phase.
A special note about the Earth Flyby mission phase images: these were acquired in 2013 when Juno flew past Earth. Examples of processed images are shown; most contributions are from amateurs.
The spacecraft spin rate would cause more than a pixel's worth of image blurring for exposures longer than about 3.2 milliseconds. For the illumination conditions at Jupiter such short exposures would result in unacceptably low SNR, so the camera provides Time-Delayed-Integration (TDI). TDI vertically shifts the image one row every 3.2 milliseconds over the course of the exposure, cancelling the scene motion induced by rotation. Up to about 100 TDI steps can be used in the orbital timing case while still maintaining the frame rate needed for frame-to-frame overlap. For Earth Flyby the light levels are high enough that TDI is not needed except for the methane band and for nightside imaging.
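The row-shifting idea behind TDI can be sketched in a few lines. The simulation below is a hypothetical illustration, not flight code (the function name and toy scene are invented here): a scene scrolling one row per step stays sharp when the accumulated charge is shifted along with it, while signal grows with each step.

```python
import numpy as np

def expose(scene, n_steps, tdi=True):
    """Simulate an n-step exposure of a scene that scrolls one row
    per step (standing in for the 3.2 ms spin-induced motion).
    With TDI, the charge buffer is shifted one row per step in the
    same direction, so the accumulated signal stays registered
    with the moving scene instead of smearing."""
    charge = np.zeros(scene.shape, dtype=float)
    for t in range(n_steps):
        frame = np.roll(scene, t, axis=0)        # scene after t row-shifts
        if tdi:
            charge = np.roll(charge, 1, axis=0)  # shift charge with the motion
        charge += frame
    return charge
```

With TDI enabled, the result is a sharp, n-fold brighter copy of the scene; with it disabled, the same total signal smears over n rows, which is the blur TDI is there to cancel.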
JunoCam pixels are 12 bits deep as read from the camera but are converted to 8 bits inside the instrument using a lossless "companding" table, a process similar to gamma correction, to reduce data volume. All JunoCam products on the missionjuno website are in this 8-bit form as received on Earth. Scientific users interested in radiometric analysis should use the "RDR" data products archived with the Planetary Data System, which have been converted back to a linear 12-bit scale.
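The onboard table itself is mission-specific and documented with the PDS archive. As a rough illustration of the idea only, a generic square-root-style companding curve (the `compand`/`decompand` names and the exact curve are assumptions of this sketch, not the flight table) might look like:

```python
import numpy as np

def compand(dn12):
    """Illustrative square-root companding of 12-bit DN (0..4095)
    down to 8 bits (0..255), allotting more codes to dark pixels,
    much like a gamma curve. The real onboard table differs."""
    return np.round(np.sqrt(dn12.astype(float) / 4095.0) * 255).astype(np.uint8)

def decompand(dn8):
    """Approximate inverse, back to a linear 12-bit scale, as done
    for the PDS "RDR" products."""
    return np.round((dn8.astype(float) / 255.0) ** 2 * 4095).astype(np.uint16)
```

The point of such a curve is that quantization steps grow with signal level, roughly tracking shot noise, so few visually significant codes are lost in the 12-to-8-bit squeeze.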
Can a Cloud Displacement Field be Derived from a Pair of JunoCam Images?
Visuals from a Europlanet #RASJuno talk about JunoCam image processing, London, 2018-05-10.
The talk essentially investigated the feasibility of a first-order short-term weather forecast derived from a pair of JunoCam images taken within a few minutes of each other.
It starts with a Perijove-12 flyby movie to provide some context; Juno's Perijove-12 flyby took place on 2018-04-01.
It then compares cropped pairs of locally contrast-normalized JunoCam images reprojected to the same vantage point, and visualizes the corresponding band-pass-filtered displacement fields. Morphs extrapolate the motion between the two images of a JunoCam pair by a factor of 100 into the past, and also into the future, assuming a stable velocity field. The differential equation defined by the displacement field was integrated forward and backward with what is probably the simplest first-order numerical scheme, the Euler method; this numerical integration is the basis of the morphs. Changing velocity fields, such as moving storm systems, have not been modeled in this feasibility test.
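The forward/backward Euler idea can be sketched as follows. The `euler_advect` name and the `velocity` callable are assumptions of this sketch, standing in for the displacement field sampled from the image pair; extrapolating by a factor of 100 corresponds to integrating over ±100 times the inter-image interval.

```python
import numpy as np

def euler_advect(points, velocity, t, n_steps):
    """Integrate point positions through a velocity field with the
    explicit Euler method: p <- p + velocity(p) * dt, repeated
    n_steps times. A negative total time t integrates backward
    into the past; positive t extrapolates into the future."""
    dt = t / n_steps
    p = np.array(points, dtype=float)
    for _ in range(n_steps):
        p = p + velocity(p) * dt
    return p
```

Euler integration is first-order accurate, so errors accumulate over long extrapolations; that is acceptable for a feasibility test, while a higher-order scheme (e.g. Runge-Kutta) would be the natural upgrade.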
The second image pair is investigated in more detail. After the displacement field itself is visualized in various ways, first- and second-order derivatives such as curl, divergence, and the Laplacian are visualized.
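Such derivative fields can be computed from a sampled displacement field with finite differences; a minimal numpy sketch (the function name is assumed here) could look like:

```python
import numpy as np

def field_derivatives(u, v, dx=1.0):
    """First- and second-order derivatives of a 2-D displacement
    field with components u (x-direction) and v (y-direction),
    computed with finite differences via numpy.gradient."""
    du_dy, du_dx = np.gradient(u, dx)
    dv_dy, dv_dx = np.gradient(v, dx)
    div = du_dx + dv_dy   # divergence: local expansion/contraction
    curl = dv_dx - du_dy  # scalar curl: local rotation (vorticity)
    # Laplacian of each component, by differentiating again
    lap_u = np.gradient(du_dx, dx, axis=1) + np.gradient(du_dy, dx, axis=0)
    lap_v = np.gradient(dv_dx, dx, axis=1) + np.gradient(dv_dy, dx, axis=0)
    return div, curl, lap_u, lap_v
```

Divergence highlights sources and sinks in the flow, while the scalar curl highlights rotating structures such as vortices.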
Next, the statistical errors induced by the specific choice of tile sets for stereo correlation are visualized.
This raises the question of whether a larger sample of correlation tiles can smooth out the noise. This effect is visualized, too.
Another question is the feasibility of a higher-resolution velocity map obtained by working with smaller tiles and by changing the upper frequency bound of the band-pass filter applied to the raw displacement data. This effect is visualized as well.
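A toy version of tile-based displacement estimation illustrates the trade-off: smaller tiles give a denser velocity map but noisier individual matches, which averaging over many tiles can smooth. The brute-force search below, which picks the integer shift minimizing the sum of squared differences (with wraparound shifts via np.roll, a simplification), is a hypothetical stand-in for the actual stereo correlation, not the talk's method.

```python
import numpy as np

def tile_shift(a, b, max_shift=4):
    """Estimate the integer (dy, dx) displacement of tile b relative
    to tile a by brute-force search over all shifts within
    +/- max_shift, minimizing the sum of squared differences.
    Wraparound via np.roll keeps the sketch short; real code would
    restrict the comparison to the overlapping region."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(b, -dy, axis=0), -dx, axis=1)
            err = np.sum((a - shifted) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best
```

Applying this per tile yields one displacement vector per tile; smaller tiles mean less texture per match and hence more outliers, which is exactly the noise behavior the visualizations explore.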
A few morphs are added towards the end of the movie.
The design, implementation, debugging, and test runs of the analysis software, including movie and talk preparation, took about two weeks, so the result might not yet be entirely free of glitches.
This investigation was primarily about feasibility. Exploring the limitations of these methods, applying more sophisticated techniques, and reducing the results to physical quantities such as velocity or vorticity is ongoing work.
Credit for raw images: NASA / JPL / SwRI / MSSS
Navigation data: NAIF/SPICE
Image processing and data reduction: Gerald Eichstädt
Movie compilation made extensive use of ffmpeg.
MP4 version, about 630 MB:
http://junocam.pictures/gerald/talks/europlanet_london_20180510/versions/London2018_Eichstaedt_PJ12_v10.mp4