It's challenging to capture dark nebulae, such as those surrounding the Iris Nebula, even under a dark sky, let alone from my backyard. Yet another reason I'm glad I went remote!
For this project we are starting with 7h 40m of luminance and 2h 30m each of RGB, for a total integration of just over 15 hours. The data was captured using my RedCat 51 and ZWO 294MM Pro camera.
Take a look at my article on processing the Witch Head for details about calibrating and stacking. We will begin with the master files.
Lum
Red
Green
Blue
First, let's combine the R, G, and B mono images to create a color version, using PixInsight's Channel Combination.
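Under the hood, this step is nothing more than stacking the three registered mono masters into one three-channel image. Here's a minimal Python/astropy sketch of the idea; the file names are placeholders, and inside PixInsight the Channel Combination process handles all of this for you.

```python
# Conceptual sketch of RGB channel combination (not PixInsight code).
# File names below are placeholders for your own master frames.
import numpy as np
from astropy.io import fits

def combine_rgb(r_path, g_path, b_path, out_path="rgb_combined.fits"):
    """Stack three calibrated, registered mono masters into one RGB cube."""
    channels = [fits.getdata(p).astype(np.float32) for p in (r_path, g_path, b_path)]
    rgb = np.stack(channels, axis=0)          # shape: (3, height, width)
    fits.writeto(out_path, rgb, overwrite=True)
    return rgb

# Example: rgb = combine_rgb("master_R.fits", "master_G.fits", "master_B.fits")
```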
The combined RGB master looks red because the red channel is the brightest and the screen stretch is linked, meaning PI stretches all three channels with the same parameters.
We can get a better look at our data by unlinking the channels so PI stretches each one individually; we do that by clicking the chain icon off. This stretch gives a truer view of the color, but it also reveals an uneven background running diagonally, light to dark, from lower left to upper right.
Linked stretch.
Unlinked stretch.
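Why the two stretches look so different: a linked screen stretch derives one set of stretch parameters from the combined channel statistics, while an unlinked stretch derives them per channel. The sketch below approximates that autostretch with PixInsight's midtones transfer function; the constants are rough stand-ins for the defaults, not the exact STF implementation.

```python
# Simplified per-channel autostretch (illustrative; not PixInsight's exact STF code).
import numpy as np

def mtf(m, x):
    """Midtones transfer function: maps x in [0, 1] with midtones balance m."""
    return ((m - 1.0) * x) / (((2.0 * m - 1.0) * x) - m)

def autostretch(channel, shadow_sigma=-2.8, target_bg=0.25):
    """Return a display-stretched copy of one channel (pixel values assumed in [0, 1])."""
    median = np.median(channel)
    sigma = 1.4826 * np.median(np.abs(channel - median))   # robust sigma from the MAD
    shadows = max(0.0, median + shadow_sigma * sigma)       # shadow clipping point
    rescaled = np.clip((channel - shadows) / (1.0 - shadows), 0.0, 1.0)
    m = mtf(target_bg, np.median(rescaled))                 # midtones balance that puts the background at target_bg
    return mtf(m, rescaled)

# Unlinked: each channel gets its own parameters.
#   stretched = np.stack([autostretch(c) for c in rgb])
# Linked (roughly): derive the parameters once from the combined channels,
# then apply that single stretch to all three.
```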
Two steps are required to permanently fix the color: background extraction and color calibration. First, GraXpert removes the gradients, then SPCC calibrates the color by comparing the measured colors of known stars in the field against their catalog values.
After background extraction with GraXpert, shown with a linked stretch.
After color calibration, also linked.
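SPCC does the color work by measuring the stars in your field and comparing their colors against what the catalog predicts (Gaia spectra run through your filter curves), then scaling the channels to match. The toy sketch below only conveys that scaling idea; it is not SPCC's actual fit, and every input is hypothetical.

```python
# Toy illustration of photometric color calibration (not SPCC's real algorithm).
import numpy as np

def color_scale_factors(star_flux_rgb, catalog_rg, catalog_bg):
    """star_flux_rgb: (N, 3) measured star fluxes.
    catalog_rg, catalog_bg: the R/G and B/G flux ratios the catalog predicts for those stars."""
    measured_rg = star_flux_rgb[:, 0] / star_flux_rgb[:, 1]
    measured_bg = star_flux_rgb[:, 2] / star_flux_rgb[:, 1]
    r_scale = np.median(catalog_rg / measured_rg)
    b_scale = np.median(catalog_bg / measured_bg)
    return r_scale, 1.0, b_scale   # multiply the R, G, B channels by these; green is the reference
```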
GraXpert, again! The gradients in images from Starfront are usually simple, thanks to its dark skies. Images shot under heavier light pollution may require a lower smoothing setting.
This is what GraXpert removed from our image.
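For the curious, "removing a gradient" boils down to fitting a smooth surface to the dim sky background and subtracting it. GraXpert's interpolation models are far more capable than this, but a crude polynomial version shows the idea:

```python
# Crude background-gradient removal (illustrative; not GraXpert's method).
import numpy as np

def remove_gradient(channel, sample_pct=20):
    """Fit a second-order 2-D surface to the dimmest pixels and subtract it."""
    h, w = channel.shape
    yy, xx = np.mgrid[0:h, 0:w]
    x, y, z = xx.ravel() / w, yy.ravel() / h, channel.ravel()

    # Treat the dimmest N% of pixels as background samples
    # (a stand-in for GraXpert's grid of sample points).
    mask = z <= np.percentile(z, sample_pct)

    # Design matrix for the surface: 1, x, y, x^2, x*y, y^2
    A = np.column_stack([np.ones_like(x), x, y, x * x, x * y, y * y])
    coeffs, *_ = np.linalg.lstsq(A[mask], z[mask], rcond=None)

    background = (A @ coeffs).reshape(h, w)
    return channel - background + np.median(background)   # subtract, keep a small pedestal
```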
The next few steps are for the stars. Large stars tend to hide the faint detail we're after. First, we'll tighten them in both the luminance and RGB images using BlurXTerminator. Then we'll copy the RGB image and stretch the copy specifically for the stars. We'll eventually remove the stars from the luminance and RGB images, replacing them with the new stars.
Before Blur XT.
After Blur XT.
RGB with smaller stars.
Now, let's copy the RGB image and use the Star Stretch script to create really small stars. We'll use these in the final combination.
It looks odd, but we're only worried about the stars. They are small and have good saturation.
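Why bother with a separate star stretch at all? Because applying the same gain to all three channels of each pixel brightens the stars without washing out their color. The arcsinh-style sketch below illustrates that principle; it is a stand-in, not the Star Stretch script itself.

```python
# Color-preserving stretch, illustrated with an arcsinh curve
# (a stand-in for the Star Stretch script, not its actual code).
import numpy as np

def arcsinh_stretch(rgb, strength=50.0):
    """Boost faint signal while preserving R:G:B ratios (and therefore saturation)."""
    lum = rgb.mean(axis=0)                                    # crude luminance proxy
    stretched = np.arcsinh(strength * lum) / np.arcsinh(strength)
    gain = np.divide(stretched, lum, out=np.zeros_like(lum), where=lum > 0)
    return np.clip(rgb * gain, 0.0, 1.0)                      # identical gain for R, G, and B
```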
Now we have two color images: one for the nebula and one for the stars. We'll apply the screen transfer function to the original RGB file.
Let's stretch the luminance data by applying the screen stretch using Histogram Transformation.
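Making the screen stretch permanent just means applying the same transform to the pixel data instead of only to the display. As a sketch, reusing the autostretch() function from the earlier snippet on the luminance master (the file name is a placeholder):

```python
# Assumes autostretch() from the earlier sketch is defined in the same session.
import numpy as np
from astropy.io import fits

lum = fits.getdata("master_L.fits").astype(np.float32)   # placeholder file name
lum = lum / lum.max()                                     # normalize to [0, 1] as the sketch expects
lum_stretched = autostretch(lum)                          # now non-linear
fits.writeto("master_L_stretched.fits", lum_stretched, overwrite=True)
```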
I know some photographers use more complicated methods to stretch their images from linear to non-linear, such as the Generalized Hyperbolic Stretch process. To be honest, I've never had much luck with GHS.
I sometimes use a script called EZ Softstretch if I want a slightly more contrasty look than STF.
Either stretch is just a starting point. We'll continue to adjust the brightness and contrast in Photoshop, where it can be done more easily.
Next steps in PixInsight are noise reduction and star removal, coming soon.