[PIPE2D-523] Demonstrate R, B and NIR arm merging Created: 14/Mar/20  Updated: 16/Nov/22  Resolved: 09/May/20

Status: Done
Project: DRP 2-D Pipeline
Component/s: None
Affects Version/s: None
Fix Version/s: None

Type: Story Priority: Normal
Reporter: hassan Assignee: price
Resolution: Done Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Attachments: PNG File bmn.png     PNG File brn.png    
Issue Links:
Relates
relates to PIPE2D-1114 m-arm master biases are ingested to r... Won't Fix
relates to SIM2D-123 Model the NIR based on a CCD Done
relates to PIPE2D-316 Demonstrate arm merging Done
Story Points: 4
Sprint: 2DDRP-2021 A
Reviewers: hassan

 Description   

Following the implementation of SIM2D-123, which now provides a CCD-based NIR simulation, check that all three simulated arms can be merged successfully with the 2D DRP.



 Comments   
Comment by price [ 03/Apr/20 ]

May as well include M also, since we now have sims working for that.

It would be helpful to make a larger dataset than the integration test (which is size-limited by the Travis runtime limit).

Comment by price [ 23/Apr/20 ]

The work accomplished on this ticket is a bit more extensive than what's in the description above, but it's all related.

  • Created a new test dataset that lives in /projects/HSC/PFS/weekly; the script to create it lives in pfs_pipe2d. The test dataset includes BRMN data, and will be used to demonstrate BRN and BMN merging.
  • Put together a script for processing this data, and testing the results.
  • Set up Jenkins so that it will run this script weekly (this may need some tweaking, but we're pretty close).
  • Fixed some bugs and made some improvements after poking at the results of processing.
  • Generated a new integration test dataset with 10 fibers instead of 600, so the integration test will go faster (full fiber density can be tested with the weekly).
Comment by rhl [ 23/Apr/20 ]

Do you have plans to use Mineo-kun's scripting system for running the code, replacing your script?

Comment by price [ 23/Apr/20 ]

Once Mineo-san's system is working, it would be good to see it replace both the integration test script and the weekly script.

Comment by naoki.yasuda [ 06/May/20 ]

I have processed arc data taken at LAM in July 2019 with this branch. It seems to work fine up to reduceArc.py. Should I check mergeArms.py as well? If so, what is the command line for that?

Comment by hassan [ 06/May/20 ]

This would be useful, but only if you have an hour to spare. The command line is:

mergeArms.py /path/to/dataRepo --calib /path/to/calibRepo --rerun pipeline --id field=OBJECT

Note that before running the above, you need to run reduceExposure first to create the pfsArm files:

 reduceExposure.py /path/to/dataRepo --calib /path/to/calibRepo --rerun pipeline --id field=OBJECT
Comment by naoki.yasuda [ 06/May/20 ]

mergeArms.py fails with the following error. Any suggestions?

(lsst-scipipe-1172c30) [yasuda@fe PIPE2D-523]$ mergeArms.py $REPO_DIR --calib $REPO_DIR/CALIB --rerun pipeline --id visit=21400                                
CameraMapper INFO: Loading exposure registry from /gpfs02/work/yasuda/PFS/calibs_Jul_PIPE2D-523/registry.sqlite3                                               
CameraMapper INFO: Loading calib registry from /gpfs02/work/yasuda/PFS/calibs_Jul_PIPE2D-523/CALIB/calibRegistry.sqlite3                                       
CameraMapper INFO: Loading calib registry from /gpfs02/work/yasuda/PFS/calibs_Jul_PIPE2D-523/CALIB/calibRegistry.sqlite3                                       
root INFO: Running: /gpfs02/work/yasuda/PFS/work/PIPE2D-523/drp_stella/bin/mergeArms.py /gpfs02/work/yasuda/PFS/calibs_Jul_PIPE2D-523 --calib /gpfs02/work/yasuda/PFS/calibs_Jul_PIPE2D-523/CALIB --rerun pipeline --id visit=21400                                                                                                     
WARNING: You are using OpenBLAS with multiple threads (20), but have not                                                                                            
specified the number of threads using one of the OpenBLAS environment variables:                                                                                    
OPENBLAS_NUM_THREADS, GOTO_NUM_THREADS, OMP_NUM_THREADS.                                                                                                            
This may indicate that you are unintentionally using multiple threads, which may                                                                                    
cause problems. WE HAVE THEREFORE DISABLED OpenBLAS THREADING. If you know                                                                                          
what you are doing and want threads enabled implicitly, set the environment                                                                                         
variable LSST_ALLOW_IMPLICIT_THREADS.                                                                                                                               
Traceback (most recent call last):                                                                                                                                  
  File "/opt/local/pfs2.1/stack/miniconda3-4.5.12-1172c30/Linux64/pipe_base/18.1.0/python/lsst/pipe/base/cmdLineTask.py", line 388, in __call__                     
    result = self.runTask(task, dataRef, kwargs)                                                                                                                    
  File "/opt/local/pfs2.1/stack/miniconda3-4.5.12-1172c30/Linux64/pipe_base/18.1.0/python/lsst/pipe/base/cmdLineTask.py", line 447, in runTask                      
    return task.runDataRef(dataRef, **kwargs)                                                                                                                       
  File "/gpfs02/work/yasuda/PFS/work/PIPE2D-523/drp_stella/python/pfs/drp/stella/mergeArms.py", line 96, in runDataRef                                              
    sky1d = self.subtractSky1d.run(sum(spectra, []), pfsConfig, sum(lsf, []))                                                                                       
  File "/gpfs02/work/yasuda/PFS/work/PIPE2D-523/drp_stella/python/pfs/drp/stella/subtractSky1d.py", line 56, in run                                                 
    resampledList = self.resampleSpectra(spectraList, pfsConfig)                                                                                                    
  File "/gpfs02/work/yasuda/PFS/work/PIPE2D-523/drp_stella/python/pfs/drp/stella/subtractSky1d.py", line 84, in resampleSpectra                                     
    return [spectra.resample(wavelength, pfsConfig.fiberId[index]) for spectra in spectraList]                                                                      
  File "/gpfs02/work/yasuda/PFS/work/PIPE2D-523/drp_stella/python/pfs/drp/stella/subtractSky1d.py", line 84, in <listcomp>                                          
    return [spectra.resample(wavelength, pfsConfig.fiberId[index]) for spectra in spectraList]                                                                      
  File "/gpfs02/work/yasuda/PFS/work/PIPE2D-523/drp_stella/python/pfs/drp/stella/datamodel/pfsFiberArraySet.py", line 110, in resample                              
    return type(self)(self.identity, fiberId, np.concatenate([[wavelength]]*numSpectra),                                                                            
ValueError: need at least one array to concatenate                                                                                                                  

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/gpfs02/work/yasuda/PFS/work/PIPE2D-523/drp_stella/bin/mergeArms.py", line 3, in <module>
    MergeArmsTask.parseAndRun()                                                                  
  File "/opt/local/pfs2.1/stack/miniconda3-4.5.12-1172c30/Linux64/pipe_base/18.1.0/python/lsst/pipe/base/cmdLineTask.py", line 603, in parseAndRun
    resultList = taskRunner.run(parsedCmd)                                                                                                        
  File "/opt/local/pfs2.1/stack/miniconda3-4.5.12-1172c30/Linux64/pipe_base/18.1.0/python/lsst/pipe/base/cmdLineTask.py", line 221, in run        
    resultList = list(mapFunc(self, targetList))                                                                                                  
  File "/opt/local/pfs2.1/stack/miniconda3-4.5.12-1172c30/Linux64/pipe_base/18.1.0/python/lsst/pipe/base/cmdLineTask.py", line 400, in __call__   
    ", ".join(str(ref.dataId) for ref in dataRef), eName, e)                                                                                      
  File "/opt/local/pfs2.1/stack/miniconda3-4.5.12-1172c30/Linux64/pipe_base/18.1.0/python/lsst/pipe/base/cmdLineTask.py", line 400, in <genexpr>  
    ", ".join(str(ref.dataId) for ref in dataRef), eName, e)                                                                                      
AttributeError: 'list' object has no attribute 'dataId'                                                                                           
Comment by price [ 07/May/20 ]

There are a few problems here. Firstly, the root cause (the ValueError) is due to attempting to do 1d sky subtraction with no sky fibers. I've put in some code to catch that case and give a more informative error message.
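
Roughly, the guard is of this shape (a minimal sketch only, not the actual drp_stella code; it assumes pfsConfig exposes fiberId and targetType as numpy arrays, per the PFS datamodel):

import numpy as np

def checkSkyFibers(pfsConfig, skyTargetType):
    # Select the fibers marked as sky; fail early with a clear message
    # instead of letting numpy raise "need at least one array to concatenate"
    # deep inside the resampling step.
    skySelection = pfsConfig.targetType == skyTargetType
    if not np.any(skySelection):
        raise RuntimeError("No sky fibers in pfsConfig; cannot do 1d sky subtraction. "
                           "Re-run with -c doSubtractSky1d=False or supply sky fibers.")
    return pfsConfig.fiberId[skySelection]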

The extra exception (the AttributeError) is a common problem in the LSST Gen2 middleware, where the input to the runDataRef method doesn't match one of the types it expects you might give it. Fixing it would require copying 50 lines of code just to make a small change, which doesn't seem worth it to me: the spurious exception doesn't hide the true problem, and we're going to update to the Gen3 middleware anyway.
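
For illustration only (not a proposed patch), the type mismatch looks roughly like this: the Gen2 error formatter assumes a flat list of dataRefs, but MergeArmsTask appears to receive a nested list (one per spectrograph), so the inner lists have no dataId:

def describeDataRefs(dataRef):
    # cmdLineTask effectively does
    #     ", ".join(str(ref.dataId) for ref in dataRef)
    # which fails when dataRef is a list of lists. A tolerant version would
    # flatten first; this is a sketch of the idea, not the 50-line fix above.
    def flatten(obj):
        if isinstance(obj, (list, tuple)):
            for item in obj:
                yield from flatten(item)
        else:
            yield obj
    return ", ".join(str(ref.dataId) for ref in flatten(dataRef))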

Finally, if we disable the 1d sky subtraction (-c doSubtractSky1d=False), we encounter a different problem: the fiberIds for the red and blue arms differ. This is likely because the detectorMaps are not accurate (I don't see any bootstrap products in /gpfs02/work/yasuda/PFS/calibs_Jul_PIPE2D-523/rerun), but this problem has been lurking for a while. I've added some new code to constructFiberTrace to catch this case, along with a brute force workaround that will be used if you set -c forceFiberIds=True.
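
The consistency check is along these lines (a sketch assuming each pfsArm spectrum set carries a fiberId array; not the actual constructFiberTrace/mergeArms code):

import numpy as np

def checkFiberIdsMatch(spectraList):
    # Spectra from the different arms of a spectrograph must carry identical
    # fiberId arrays before they can be merged fiber-by-fiber.
    reference = spectraList[0].fiberId
    for spectra in spectraList[1:]:
        if not np.array_equal(spectra.fiberId, reference):
            raise RuntimeError("fiberId mismatch between arms; the detectorMaps may be "
                               "inaccurate. Consider re-running constructFiberTrace "
                               "with -c forceFiberIds=True.")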

With those fixes, I've got mergeArms working on Yasuda-san's data.

(lsst-scipipe-1172c30) price@fe:/gpfs02/work/price/pipe2d-523 $ constructFiberTrace.py /gpfs02/work/yasuda/PFS/calibs_Jul_PIPE2D-523 --calib /gpfs02/work/yasuda/PFS/calibs_Jul_PIPE2D-523/CALIB --output fiberTrace  --id visit=21122..21126 --cores 8 -c forceFiberIds=True
(lsst-scipipe-1172c30) price@fe:/gpfs02/work/price/pipe2d-523 $ cp -r /gpfs02/work/yasuda/PFS/calibs_Jul_PIPE2D-523/CALIB .
(lsst-scipipe-1172c30) price@fe:/gpfs02/work/price/pipe2d-523 $ cp fiberTrace/FIBERTRACE/pfsFiberTrace-2019-07-25-021122-* CALIB/FIBERTRACE/
(lsst-scipipe-1172c30) price@fe:/gpfs02/work/price/pipe2d-523 $ reduceExposure.py /gpfs02/work/yasuda/PFS/calibs_Jul_PIPE2D-523 --calib CALIB --output=demo --id visit=21400
(lsst-scipipe-1172c30) price@fe:/gpfs02/work/price/pipe2d-523 $ mergeArms.py demo --calib CALIB --rerun merge --id visit=21400 -c doSubtractSky1d=False
Comment by price [ 09/May/20 ]

Merged to master.
