- cross-posted to:
  - science@beehaw.org
Question: why are satellite trails still a problem for digital sensors? I can see how a long exposure on photographic film can be marred by a stray line, but it was my understanding that digital sensors in telescopes are read out continuously during a long exposure, more like a video than a photograph, and the frames are then added together by a computer to achieve the same noise reduction as long-exposure film. Why can’t you just leave out the frames with satellites in them (roughly the stacking-and-rejection idea I sketch below)?

Is there a type of sensor in use that accumulates light for minutes or hours at a time before being read out, and if so, why? How specifically is that different from combining multiple video frames? Or is the article talking specifically about software that automatically detects frames with satellite trails by doing that every-line-at-every-angle math?

The only problem described in the article that I do understand is the search for transient events in particular:
> For example, a glint of sunlight off a small piece of insulation shed by a satellite could appear in a telescope image like a flaring star.
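
To be concrete about the “just leave out the bad frames” idea above, here is roughly what I picture: stack many short exposures and clip away pixel values that only misbehave in one or two frames, which is where a satellite trail would live, while stars appear in every frame and survive. This is only a rough numpy sketch of my mental model, not anything from the article; the function name `stack_with_rejection` and the 5-sigma threshold are made up, and the median is used simply because it is robust to a single bad frame.

```python
import numpy as np

def stack_with_rejection(frames, sigma=5.0):
    """Co-add short exposures while rejecting satellite-trail pixels.

    frames: array of shape (n_frames, height, width), one short exposure each.
    A per-pixel sigma clip against the median drops values that are wildly
    brighter in only one or two frames (a passing satellite), while stars,
    which appear in every frame, are kept.
    """
    frames = np.asarray(frames, dtype=float)
    center = np.median(frames, axis=0)             # robust per-pixel estimate
    scatter = np.std(frames, axis=0) + 1e-9        # avoid division by zero
    outliers = np.abs(frames - center) > sigma * scatter
    cleaned = np.where(outliers, center, frames)   # replace streak pixels only
    return cleaned.mean(axis=0)                    # averaged stack, trail clipped out
```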
This is a really good question!
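
On the last part: the “every-line-at-every-angle math” you describe sounds like a Hough (or Radon) transform, where every bright pixel votes for all the lines that could pass through it, so a straight streak piles its votes onto one sharp peak while stars and noise spread theirs out. A rough sketch of that idea using scikit-image; the function name `find_trails` and both thresholds here are my own placeholders, not anything from the article:

```python
import numpy as np
from skimage.transform import hough_line, hough_line_peaks

def find_trails(image, n_sigma=5.0):
    """Flag straight bright streaks (candidate satellite trails) in one frame.

    Thresholds the frame, then accumulates votes over (angle, offset) bins;
    a satellite trail concentrates its votes into one peak of the accumulator.
    """
    bright = image > image.mean() + n_sigma * image.std()
    angles = np.linspace(-np.pi / 2, np.pi / 2, 360, endpoint=False)
    accumulator, thetas, offsets = hough_line(bright, theta=angles)
    peaks = hough_line_peaks(accumulator, thetas, offsets,
                             threshold=0.5 * accumulator.max())
    # Each (votes, angle, offset) triple describes one detected line.
    return list(zip(*peaks))
```

Once a trail is located this way, a pipeline can in principle mask just the pixels under that line rather than throwing away the whole frame.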