I recently shot some luminance data of the Whirlpool Galaxy using my 1600MM Pro camera in my backyard. It's not the best camera, but since my 294MM Pro now lives at Starfront Observatories, it's the only one I have at home. As they say, the best camera is the one you have with you.
After processing the data, I remembered I shot some data with the 294 last spring, so I decided to see what would happen if I combined the master Lum files, which you can see below.
This is a stack of 780 30s images (6.5 hrs). It's an older camera, but the noise is fairly low and mostly high frequency, meaning it's only a couple of pixels in size.
The 294 has a more efficient sensor, as can be seen in this image, a stack of 110 75s images (2.3 hrs). Despite having roughly a third of the integration time, it shows more contrast.
As you can see, both cameras did a reasonable job, although the 294 did it in roughly a third of the time. The flat-field correction was better with the 1600, while the 294 shows signs of walking noise (the vertical streaks in the background) and a strange diagonal line that wasn't rejected during stacking. The walking noise appeared because this was one of the first objects I shot with an off-axis guider, and I didn't have the dither settings sorted out yet.
Interestingly, the 1600 has less contrast in the galaxy but does a bit better job of showing the faint tidal tail of the colliding galaxies.
The first step is to align one image to the other, so the stars line up. Since the cameras have different pixel sizes, the image scales (resolutions) are also different. Not to worry, the Star Alignment process takes care of that automatically. I used the 294 image as the reference.
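StarAlignment handles the rescaling internally, but the underlying idea is simple: the resampling factor is just the ratio of the two pixel scales. Here is a minimal NumPy sketch of that step using nearest-neighbour resampling (the scale values in the test below are made-up placeholders, not the actual scales of these cameras):

```python
import numpy as np

def resample_to_scale(img, src_scale, ref_scale):
    """Nearest-neighbour resample so img matches the reference pixel scale.

    src_scale, ref_scale: image scales in arcsec/pixel. A factor > 1
    means the source has coarser pixels and must be upsampled.
    """
    factor = src_scale / ref_scale
    h, w = img.shape
    new_h, new_w = int(round(h * factor)), int(round(w * factor))
    # Map each output pixel back to its nearest source pixel
    rows = np.clip((np.arange(new_h) / factor).astype(int), 0, h - 1)
    cols = np.clip((np.arange(new_w) / factor).astype(int), 0, w - 1)
    return img[np.ix_(rows, cols)]
```

A real registration also needs the rotation and translation solved from matched stars; this only illustrates why differing pixel sizes are not a problem.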
Now, it's time to integrate the two images and see what we get. Since the Image Integration process requires at least three images, we'll need a workaround: simply make a copy of each master and load all four. Problem solved!
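The duplication trick doesn't change the averaged result; it only satisfies the minimum image count. A quick NumPy check (random arrays stand in for the two masters):

```python
import numpy as np

rng = np.random.default_rng(0)
m1600 = rng.random((4, 4)).astype(np.float32)  # stand-in for the 1600 master
m294 = rng.random((4, 4)).astype(np.float32)   # stand-in for the 294 master

# Averaging each master twice gives the same mean as averaging them once each
stack4 = np.mean([m1600, m1600, m294, m294], axis=0)
stack2 = np.mean([m1600, m294], axis=0)
assert np.allclose(stack4, stack2)
```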
I used the default integration settings and Percentile Clipping as the pixel rejection algorithm, which is well suited to very small sets of images.
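For reference, percentile clipping rejects any pixel whose value falls outside median-relative bounds before averaging. The sketch below is a toy version of the idea, not PixInsight's implementation, and the `low`/`high` defaults are illustrative only:

```python
import numpy as np

def percentile_clip(stack, low=0.2, high=0.1):
    """Toy percentile clipping over a stack of shape (n_images, h, w).

    Assumes positive, normalized pixel values. Pixels deviating from
    the per-pixel median by more than low/high fractions of that
    median are rejected; the survivors are averaged.
    """
    med = np.median(stack, axis=0)
    keep = (stack >= med * (1 - low)) & (stack <= med * (1 + high))
    n_keep = keep.sum(axis=0)
    clipped = np.sum(stack * keep, axis=0) / np.maximum(n_keep, 1)
    # Fall back to a plain mean wherever everything was rejected
    return np.where(n_keep > 0, clipped, stack.mean(axis=0))
```

With four input images, a bright artifact present in only one pair of duplicates lands well outside the bounds and gets clipped, which is consistent with the diagonal line disappearing.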
The difference is subtle, but I do see an improvement.
The resulting image has the stronger contrast of the 294 without the uneven background. The faint tidal features also stand out more and the diagonal line was rejected.
According to PixInsight's Noise Evaluation script, the combined image has slightly more noise than the 294 master, although the noise is very fine-grained. Importantly, the walking noise has disappeared. Walking noise can be challenging to process out later.
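The Noise Evaluation script uses a multiscale estimator; a much cruder but instructive stand-in estimates sigma from adjacent-pixel differences, which suppress smooth signal, combined with a robust MAD statistic that ignores stars and other outliers:

```python
import numpy as np

def estimate_noise(img):
    """Rough Gaussian noise sigma for a 2-D image.

    Differencing horizontally adjacent pixels removes smooth
    background and galaxy signal; the median absolute deviation
    (scaled by 1.4826 to approximate sigma) is robust to stars.
    The sqrt(2) accounts for differencing two noisy pixels.
    """
    d = np.diff(img, axis=1).ravel()
    mad = np.median(np.abs(d - np.median(d)))
    return 1.4826 * mad / np.sqrt(2.0)
```

This is only a sanity-check tool, not the MRS algorithm the script actually runs, but it behaves the same way directionally: finer-grained noise yields a smaller sigma.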
Overall, combining the master frames was worth it, and it took much less time than loading all the subframes together, pre-processing them, and then stacking the resulting 890 images.
The final image. Combining the color data didn't lead to an improvement, so I used only the color data from the 294.