Camera sensors are really photon detectors. When a photon hits a pixel, it produces an electrical charge, which is then converted to a digital number. The more photons, the greater the charge and the higher the number. The photons being detected are the signal.
Cameras aren't perfect; they make small errors when measuring signals. When the signal is large, like a picture taken on your phone in bright sunlight, the errors are so small in comparison that you don't notice them. When there is very little signal, like when imaging a faint object in the night sky, the errors become very visible. These errors are the noise in the data.
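To make that concrete, here's a minimal numpy sketch of a simplified sensor model: photons arrive randomly (shot noise), the electronics add a bit of read noise, and the result gets converted to digital numbers. The gain and noise figures here are made up purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def measure(photons, read_noise_e=5.0, gain_e_per_dn=1.0):
    """One exposure of a flat 100x100 patch: Poisson shot noise on the
    photon count (assuming one electron per photon), plus Gaussian read
    noise, converted to digital numbers (DN)."""
    electrons = rng.poisson(photons, size=(100, 100)).astype(float)
    electrons += rng.normal(0.0, read_noise_e, size=electrons.shape)
    return electrons / gain_e_per_dn   # ADC step: electrons -> DN

bright = measure(50_000)   # bright-daylight signal levels
faint = measure(20)        # faint-nebula signal levels

# Noise as a fraction of the signal: tiny when the signal is large,
# huge when it is small.
print(f"bright: {bright.std() / bright.mean():.3f}")   # ~0.005
print(f"faint:  {faint.std() / faint.mean():.3f}")     # ~0.33
```

On the bright patch the noise is a fraction of a percent of the signal; on the faint patch it's roughly a third of it, which is exactly the graininess you see in a single night-sky frame.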
When the signal-to-noise ratio (SNR) is small, parts of an image that should look smooth appear grainy, which hides faint details and looks unpleasant. When the SNR is high, things smooth out, and we can brighten the image enough to make those faint details visible.
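Here's a quick sketch (again with made-up numbers) of why you can't simply brighten your way out of a noisy image: multiplying the pixel values scales the noise right along with the signal, so the SNR doesn't budge.

```python
import numpy as np

rng = np.random.default_rng(1)

# A faint, flat patch: small signal with noise of comparable size.
faint = 20.0 + rng.normal(0.0, 6.0, size=(100, 100))

print(f"SNR ~ {faint.mean() / faint.std():.1f}")   # ~3.3: visibly grainy

# Brightening multiplies signal and noise alike, so the SNR (and the
# grain) is unchanged.
stretched = faint * 50.0
print(f"SNR after stretch ~ {stretched.mean() / stretched.std():.1f}")   # still ~3.3
```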
A single picture of a faint object will be grainy because the SNR is small. However, if we average a large number of images together, the random noise partially cancels out: averaging n images shrinks the noise by a factor of the square root of n, which raises the SNR and smooths out the grain. Astrophotography is all about getting a stacked image with a high SNR. The more images you stack, the better your final image will be.
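A small simulation makes that math visible. The signal and noise levels below are arbitrary; what matters is that the SNR grows by the square root of the number of frames averaged.

```python
import numpy as np

rng = np.random.default_rng(2)

signal = 20.0        # "true" brightness of a faint, flat patch
noise = 6.0          # made-up per-frame noise level

def one_frame():
    """One simulated exposure of a flat 100x100 patch."""
    return signal + rng.normal(0.0, noise, size=(100, 100))

for n in (1, 4, 16, 64):
    stack = np.mean([one_frame() for _ in range(n)], axis=0)
    print(f"{n:3d} frames: SNR ~ {stack.mean() / stack.std():5.1f}")
# Noise shrinks by sqrt(n), so SNR grows by sqrt(n):
# expect roughly 3.3, 6.7, 13.3, 26.7.
```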
Since deep-space objects change extremely slowly, we can take many images of them over hours, days, months, or even years and successfully stack them into a final image with a high SNR.
Now, we can take those final images, or "masters," assign them to specific colors, and enhance them to make a beautiful image, much like you might edit a photo on your phone to make it brighter, add more color, or sharpen the details.
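In code terms, assigning masters to colors just means building an RGB image out of the stacked channels and then stretching it. Here's a rough sketch with placeholder masters and a toy gamma stretch standing in for a real editing workflow.

```python
import numpy as np

# Hypothetical stacked masters, normalized to 0..1. Real masters come
# out of your stacking software; these flat arrays are just stand-ins.
red_master = np.full((100, 100), 0.02)
green_master = np.full((100, 100), 0.03)
blue_master = np.full((100, 100), 0.05)

# "Assign them to specific colors": each master becomes one channel
# of an RGB image.
rgb = np.stack([red_master, green_master, blue_master], axis=-1)

# A toy nonlinear stretch: gamma < 1 lifts dark values far more than
# bright ones, making faint detail visible. Real editors use fancier
# curves, but this is the core idea.
stretched = np.clip(rgb, 0.0, 1.0) ** 0.3
print(stretched[0, 0])   # the brightened RGB value of one pixel
```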
It's that simple!