[PIPE2D-427] reduceArc.py fails with HgAr data taken May 06, 2019 Created: 22/May/19  Updated: 23/May/19  Resolved: 23/May/19

Status: Done
Project: DRP 2-D Pipeline
Component/s: None
Affects Version/s: None
Fix Version/s: None

Type: Task Priority: Normal
Reporter: ncaplar Assignee: price
Resolution: Done Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Issue Links:
Blocks
blocks PIPE2D-414 reduceExposure.doSubtractContinuum=Tr... Done
Relates
relates to PIPE2D-430 Interpolate over negative variances Open
relates to PIPE2D-428 reduceArc.py fails with Xenon data ta... Done
Story Points: 2
Sprint: 2DDRP-2019 E
Reviewers: hassan

 Description   

I am unable to run reduceArc.py on the HgAr data taken at LAM on May 06, 2019.

The script that I have used to ingest data and prepare calibrations is at: /home/ncaplar/ReductionScripts/May21/pfs_preparation_May.sh. It uses the suggested pfs_build_calibs.sh, which can also be found in the same folder.

The script I used, pfs_May22_Neon_May_defocus.sh, which contains the reduceArc.py command, is in the same folder.

Note that with the same preparation, and using equivalent scripts, the Neon and Krypton data pass the reduceArc.py command.

The output from running the final script is at:
https://gist.github.com/nevencaplar/4131a30fd406f9f4c0d55aea8ccb348a



 Comments   
Comment by hassan [ 22/May/19 ]

An excerpt from the script output (line 485), showing the error:

reduceArc FATAL: Failed on dataId={'visit': 17075, 'arm': 'r', 'dateObs': '2019-05-06', 'site': 'L', 'category': 'A', 'expId': 17075, 'spectrograph': 1, 'field': 'ARC', 'ccd': 1, 'filter': 'r', 'expTime': 15.0, 'dataType': 'arc', 'taiObs': '2019-05-06', 'pfiDesignId': 1099528409104, 'slitOffset': 0.0}: Exception: 
  File "src/math/CurveFitting.cc", line 330, in std::tuple<float, ImageT, ImageT, ImageT> pfs::drp::stella::math::{anonymous}::fitProfile1d(const ndarray::ArrayRef<const ImageT, 1, 1>&, const ndarray::ArrayRef<const ImageT, 1, 1>&, const ndarray::ArrayRef<bool, 1, 1>&, const ndarray::ArrayRef<ImageT, 1, 1>&, float, bool) [with ImageT = float]
    fitProfile1d:: i = 2: ERROR: dataVar(i) == 0. {0}
lsst::pex::exceptions::Exception: 'fitProfile1d:: i = 2: ERROR: dataVar(i) == 0.'
Comment by price [ 23/May/19 ]

The error is due to a negative variance (at pixel 1605,469). It appears to come from the dark: the dark value is -0.01 counts per second, and after bias subtraction the pixel is already negative (-20). Since the variance is set after dark subtraction, it ends up negative. I'm going to post-process the image after ISR to remove negative variances.
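A minimal sketch of how this failure mode arises. This is not the actual ISR code; it just reproduces the arithmetic described above, with an assumed gain and the Poisson-only variance model (var = counts/gain), using the exposure time from the dataId in the traceback:

```python
# Hypothetical illustration of a negative variance after bias+dark subtraction.
gain = 1.0        # e-/ADU (assumed for illustration)
read_noise = 0.0  # omitted here to isolate the Poisson term

pixel_after_bias = -20.0  # pixel already negative after bias subtraction
dark_rate = -0.01         # dark value per second (from the comment above)
exptime = 15.0            # seconds (expTime in the failing dataId)

pixel_after_dark = pixel_after_bias - dark_rate * exptime
variance = pixel_after_dark / gain + read_noise ** 2

print(variance)  # negative, which downstream fits such as fitProfile1d cannot handle
```

With low count levels, the Poisson term is dominated by the (negative) pixel value, so the computed variance goes negative.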

Comment by price [ 23/May/19 ]

hassan, could you please review these changes?

price@MacBook:~/pfs/drp_stella (tickets/PIPE2D-427=) $ git sub
commit 5bd04dcf79fadefaada86df2e09577309f7c9145 (HEAD -> tickets/PIPE2D-427, origin/tickets/PIPE2D-427)
Author: Paul Price <price@astro.princeton.edu>
Date:   Wed May 22 15:08:24 2019 -0400

    reduceExposure: mask negative variance pixels
    
    We have low count levels compared to imaging, so we sometimes have
    pixels that are negative after bias+dark, which means the variance
    gets set to a negative number, which causes problems downstream
    (e.g., CurveFitting.cc:330). Find these pixels, set the variance
    to infinity and mask them as BAD.

 python/pfs/drp/stella/reduceExposure.py | 25 ++++++++++++++++++++++++-
 1 file changed, 24 insertions(+), 1 deletion(-)
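A sketch of the fix described in the commit message, using plain numpy arrays in place of the afw MaskedImage API (the real change lives in reduceExposure.py; the BAD bit value and function name here are assumptions for illustration):

```python
import numpy as np

BAD = 1 << 0  # assumed bit value for the BAD mask plane


def mask_negative_variance(variance, mask, bad=BAD):
    """Set non-positive variances to infinity and flag those pixels BAD.

    Zero variance is included because fitProfile1d also rejects
    dataVar(i) == 0, as seen in the traceback above.
    """
    suspect = variance <= 0
    variance[suspect] = np.inf
    mask[suspect] |= bad
    return int(suspect.sum())


variance = np.array([1.0, -19.85, 2.5, 0.0])
mask = np.zeros(4, dtype=np.int32)
n = mask_negative_variance(variance, mask)
# n == 2; those pixels now carry infinite variance and the BAD bit,
# so downstream fitting ignores them instead of raising an exception.
```

Infinite variance effectively gives the pixel zero weight in any inverse-variance-weighted fit, which is why it is a safe sentinel alongside the BAD mask bit.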
Comment by ncaplar [ 23/May/19 ]

Reduction passed successfully after these changes.

Comment by price [ 23/May/19 ]

Merged to master.

Generated at Sat Feb 10 15:53:03 JST 2024 using Jira 8.3.4#803005-sha1:1f96e09b3c60279a408a2ae47be3c745f571388b.