I am processing long-slit data from GMOS-South (Hamamatsu CCDs) with the gsreduce task, setting the parameter fl_vardq=yes. I have found that good pixels are marked in the DQ extension with a value of 4, and bad pixels (and chip gaps) with 16. I would have expected good pixels to have a value of 0 and bad pixels values >= 1. As a result, every pixel in the DQ extension of the extracted spectrum is labeled as bad, which is wrong. Does anyone know of a solution for this?
I am running the Gemini IRAF package v1.13 on IRAF v2.16 through Ureka.
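For context, the DQ plane is a bit mask rather than a simple good/bad flag, so individual bits encode different conditions. As I understand the Gemini convention (worth confirming against the GMOS documentation for your package version), the bits are: 1 = bad pixel, 2 = non-linear regime, 4 = saturated, 8 = cosmic ray, 16 = no data (e.g. chip gap). A minimal pure-Python sketch for decoding a DQ value:

```python
# Gemini DQ bit meanings -- my understanding of the convention;
# verify against the GMOS documentation for your package version.
DQ_BITS = {
    1: "bad pixel (from BPM)",
    2: "non-linear regime",
    4: "saturated",
    8: "cosmic ray",
    16: "no data (e.g. chip gap)",
}

def decode_dq(value):
    """Return the list of DQ flags whose bits are set in `value`."""
    return [name for bit, name in DQ_BITS.items() if value & bit]

print(decode_dq(0))   # a truly good pixel: no flags at all
print(decode_dq(4))   # the value appearing on the "good" pixels here
print(decode_dq(16))  # the chip gaps
```

Under this reading, the chip gaps being flagged 16 ("no data") is expected behavior; it is the blanket value of 4 ("saturated") on apparently good pixels that looks wrong.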
DQ=4 is supposed to flag saturated pixels. Are these flats? Are they definitely not saturated? I'm not sure offhand what would cause that flag to be set erroneously (perhaps some bad header keyword values), but the first thing to do is verify the actual raw pixel values.
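One way to run that sanity check is to count how many raw pixel values actually reach the saturation level. This is only a sketch: the threshold below is a hypothetical placeholder, and the real limit should come from your image headers or the instrument documentation (a bad saturation-level keyword is exactly the kind of header problem suggested above).

```python
# SATURATION_ADU is a hypothetical placeholder value; take the real
# saturation limit from the image headers or instrument documentation.
SATURATION_ADU = 65535

def count_saturated(pixels, sat_level=SATURATION_ADU):
    """Return how many pixel values reach or exceed the saturation level."""
    return sum(1 for v in pixels if v >= sat_level)

# Toy data standing in for one row of a flat-field image:
row = [41000, 52000, 65535, 30000, 65535]
print(count_saturated(row))  # 2
```

If very few pixels are anywhere near the saturation level yet the whole DQ plane is flagged with 4, that points to the flag being set from a wrong header value rather than from the data themselves.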