[PIPE2D-224] Add support for DM-4141 to reduceArc*.py Created: 22/Jul/17  Updated: 26/Jul/17  Resolved: 26/Jul/17

Status: Done
Project: DRP 2-D Pipeline
Component/s: None
Affects Version/s: None
Fix Version/s: None

Type: Story Priority: Normal
Reporter: rhl Assignee: rhl
Resolution: Done Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Reviewers: price

 Description   

The new support for proper Unix exit codes (LSST's DM-4141) requires some boilerplate to be added to TaskRunner; please add it.
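
For reference, here is a minimal sketch of the kind of boilerplate DM-4141 asks for in a custom TaskRunner.__call__: it should return a Struct carrying an exitStatus field that is propagated to the shell. This paraphrases the Gen2 lsst.pipe.base convention; the exact run signature in the PFS code may differ.

from lsst.pipe.base import Struct, TaskRunner

class ReduceArcTaskRunner(TaskRunner):
    def __call__(self, args):
        dataRefList, kwargs = args          # as produced by getTargetList
        task = self.makeTask(args=args)
        exitStatus = 0                      # 0 = success; non-zero is reported to the shell
        try:
            result = task.run(dataRefList, **kwargs)
        except Exception as e:
            exitStatus = 1                  # DM-4141: signal the failure via the exit status
            task.log.fatal("Failed: %s" % e)
            result = None                   # (real code would also honour self.doRaise)
        if self.doReturnResults:
            return Struct(exitStatus=exitStatus, args=args,
                          metadata=task.metadata, result=result)
        return Struct(exitStatus=exitStatus)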



 Comments   
Comment by price [ 26/Jul/17 ]

I think you could get rid of ReduceArcTaskRunner.__call__ completely (along with the associated maintenance) if you changed the getTargetList:

class ReduceArcTaskRunner(TaskRunner):
    """Get parsed values into the ReduceArcTask.run"""
    @staticmethod
    def getTargetList(parsedCmd, **kwargs):
        return [(parsedCmd.id.refList,
                 dict(butler=parsedCmd.butler,
                      wLenFile=parsedCmd.wLenFile,
                      lineList=parsedCmd.lineList))]
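
For context, the stock TaskRunner hands each element of getTargetList to its default __call__, which roughly does the following (paraphrasing Gen2 lsst.pipe.base; details may differ), so returning a single (refList, kwargs) tuple means one task.run call over the whole list and no custom __call__ to maintain:

# inside the default TaskRunner.__call__ (paraphrased)
dataRef, kwargs = args                 # one target from getTargetList
task = self.makeTask(args=args)
result = task.run(dataRef, **kwargs)   # here dataRef would be the whole refList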
Comment by rhl [ 26/Jul/17 ]

That fails with

Traceback (most recent call last):
  File "bin/reduceArcRefSpec.py", line 4, in <module>
    ReduceArcRefSpecTask.parseAndRun()
  File "/Users/rhl/LSST/pipe/base/python/lsst/pipe/base/cmdLineTask.py", line 532, in parseAndRun
    resultList = taskRunner.run(parsedCmd)
  File "/Users/rhl/LSST/pipe/base/python/lsst/pipe/base/cmdLineTask.py", line 227, in run
    resultList = list(mapFunc(self, targetList))
  File "/Users/rhl/LSST/pipe/base/python/lsst/pipe/base/cmdLineTask.py", line 368, in __call__
    dataRef, kwargs = args
ValueError: too many values to unpack
(Pdb) w
  /Users/rhl/PFS/drp/stella/bin/reduceArcRefSpec.py(4)<module>()
-> ReduceArcRefSpecTask.parseAndRun()
  /Users/rhl/LSST/pipe/base/python/lsst/pipe/base/cmdLineTask.py(532)parseAndRun()
-> resultList = taskRunner.run(parsedCmd)
  /Users/rhl/LSST/pipe/base/python/lsst/pipe/base/cmdLineTask.py(227)run()
-> resultList = list(mapFunc(self, targetList))
> /Users/rhl/LSST/pipe/base/python/lsst/pipe/base/cmdLineTask.py(368)__call__()
-> dataRef, kwargs = args
(Pdb) p type(args)
<type 'dict'>
(Pdb) p args.keys()
['butler', 'lineList', 'expRefList', 'refSpec']
Comment by price [ 26/Jul/17 ]

If type(args) is dict, then I don't think you can be running the code I provided; with that version, wouldn't type(args) be a tuple?

Comment by rhl [ 26/Jul/17 ]

Sorry, you're right (I cut and pasted something foolish from pdb). But there is still a problem; dataRef is a list, and it's passed to task.writeMetadata.
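
In the default __call__ the same args element is reused for metadata, which is where the list ends up; roughly (paraphrased):

# later in the default TaskRunner.__call__ (paraphrased)
task.writeMetadata(dataRef)   # with the new getTargetList, dataRef is the whole refList,
                              # which writeMetadata does not expect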

Comment by price [ 26/Jul/17 ]

You could add to ReduceArcTask:

    def _getMetadataName(self):
        return None
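
For reference, that works because CmdLineTask.writeMetadata only persists metadata when a dataset name is configured; roughly (paraphrasing lsst.pipe.base, exact code may differ):

# CmdLineTask.writeMetadata (paraphrased)
def writeMetadata(self, dataRef):
    metadataName = self._getMetadataName()
    if metadataName is not None:          # returning None skips the write entirely
        dataRef.put(self.getFullMetadata(), metadataName)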

Why does reduceArc*.py need a list of input dataRefs?

Comment by rhl [ 26/Jul/17 ]

Paul and I discussed this.

reduceArc.py was designed to add sets of exposures together to improve the S/N and then write a calibration product for later reduction. It currently processes individual images, but I'd rather not address that problem in this ticket.

Paul's proposal to simply turn off metadata is a workaround, but since metadata dumping doesn't work sensibly for any pipe_drivers scripts that process multiple {{dataRef}}s, it's acceptable here.

Comment by rhl [ 26/Jul/17 ]

rebased, merged, pushed
