Showing posts with label DR. Show all posts

2014-07-25

A new Starlink release contains notable updates to the SCUBA-2 configuration files.


The latest Starlink release - 2014A - has been made public. For details please read the release notes provided at: http://starlink.jach.hawaii.edu/starlink/2014A

As part of this new release we want to highlight one significant update and a couple of new additions to the arsenal of SCUBA-2 reduction config files.

Updates to bright_extended config file

This config file 'dimmconfig_bright_extended.lis' has always been intended for reducing data containing bright extended sources. It has remained untouched for a couple of years now despite advances in our understanding of SCUBA-2 reduction of bright regions. The config file now contains the following parameters/links:

   ^$STARLINK_DIR/share/smurf/dimmconfig.lis

   numiter=-40
   flt.filt_edge_largescale=480
   ast.zero_snr = 3
   ast.zero_snrlo = 2

   ast.skip = 5
   flt.zero_snr = 5
   flt.zero_snrlo = 3


In previous Starlink releases (e.g. Hikianalia), the bright_extended configuration file contained only the following:

   numiter = -40
   ast.zero_snr = 5

   flt.filt_edge_largescale = 600
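The ast.zero_snr / ast.zero_snrlo pair works as a two-level source mask: pixels above the higher SNR threshold seed the mask, which is then grown into neighbouring pixels down to the lower threshold. The sketch below is a 1-D toy illustration of that idea only, and is not SMURF's actual implementation.

```python
import numpy as np

def snr_mask(snr, snr_hi=3.0, snr_lo=2.0):
    """Hysteresis-style source mask: seed where SNR exceeds snr_hi,
    then grow the mask into adjacent pixels that exceed snr_lo.
    Toy 1-D illustration of ast.zero_snr / ast.zero_snrlo."""
    mask = snr >= snr_hi
    grew = True
    while grew:
        # Find pixels adjacent to the current mask.
        neigh = np.zeros_like(mask)
        neigh[1:] |= mask[:-1]
        neigh[:-1] |= mask[1:]
        add = neigh & (snr >= snr_lo) & ~mask
        grew = bool(add.any())
        mask |= add
    return mask

snr = np.array([0.5, 2.5, 2.2, 3.5, 2.4, 1.0, 2.6])
# The 2.5/2.2/2.4 pixels join the mask because they connect to the
# 3.5 seed; the isolated 2.6 pixel does not.
mask = snr_mask(snr)
```

The point of the lower threshold is that real emission usually extends below the detection threshold of its peak, so growing the mask captures more of the source without seeding it on noise.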


New 'FIX' config file

Two new config parameter files have been added. These are intended to be used with one of the existing dimmconfig files. They provide new values for selected parameters, aimed at solving a particular problem ("blobs" in the final map, or very slow convergence).

  • dimmconfig_fix_blobs.lis 
    • These parameters attempt to prevent smooth bright blobs of emission appearing in the final map. They do this by 1) identifying and flagging samples that appear to suffer from ringing, 2) using a soft-edged Butterworth filter in place of the normal hard-edged filter, and 3) rejecting samples for which the separate sub-arrays see a markedly different common-mode signal.
  • dimmconfig_fix_convergence.lis
    • The parameters defined by this file attempt to aid the convergence process, and should be used for maps that will not converge within a reasonable number of iterations.
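The difference between the hard-edged and soft-edged (Butterworth) filters mentioned for dimmconfig_fix_blobs.lis can be pictured by comparing their frequency responses. This is a generic sketch with an arbitrary edge frequency and filter order, not SMURF's implementation:

```python
import numpy as np

def hard_edge_highpass(freqs, f_edge):
    """Hard-edged high-pass: unity above the edge frequency, zero below."""
    return (freqs >= f_edge).astype(float)

def butterworth_highpass(freqs, f_edge, order=4):
    """Soft-edged Butterworth high-pass: the gain rolls off smoothly
    instead of stepping, which reduces ringing in the time series."""
    with np.errstate(divide="ignore"):
        gain = 1.0 / np.sqrt(1.0 + (f_edge / freqs) ** (2 * order))
    return np.where(freqs == 0.0, 0.0, gain)

freqs = np.linspace(0.0, 2.0, 201)   # Hz, arbitrary grid
f_edge = 0.5                         # Hz, arbitrary edge frequency
hard = hard_edge_highpass(freqs, f_edge)
soft = butterworth_highpass(freqs, f_edge)
# At the edge frequency the Butterworth gain is 1/sqrt(2), not a
# step from 0 to 1.
```

The sharp discontinuity of the hard-edged filter is what produces ringing (Gibbs oscillations) around strong features in the filtered time series; the smooth roll-off trades a little frequency selectivity for much better-behaved residuals.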

2010-08-05

SMURF Update (August 5th 2010)

It's been a couple of months since the last SMURF news entry so I thought I'd bring people up to date.

  • All the configuration files for the iterative map-maker have been tweaked and some have been renamed. For example the '_faint' config is now called "dimmconfig_blank_field.lis". We also have a new config file for bright calibrators called "dimmconfig_bright_compact.lis".
  • The SMO (time series smoothing) model has been improved. A few bugs have been fixed and it's been parallelized and so is much faster. SMO is not enabled in any of the default configuration files.
  • The size of the apodization and padding for the FFTs can now be calculated dynamically based on the requested filter. This is now the default behaviour. A new Fourier filter has also been written that does not require apodization (which may be important for very short maps) but is still being tested.
  • Quality handling has been revamped inside the map-maker to allow us to report more than 8 different types of flagging. The report at the end of each iteration is now more compact and if you use the SHOWQUAL command to look at exported models you may see that the bit numbers assigned to a particular quality are no longer fixed. Additionally if more than 8 flags are used the exported model will combine related flags (for example PAD and APOD will be merged into ENDS).
  • Very noisy bolometers will now be discarded before the iterative map-maker starts. This can help with convergence. See the "noiseclip" config parameter to adjust this.
  • The map-maker now compares flatfield ramps taken at the start and end of each observation and disables bolometers whose calibration has varied too much. This will not help data taken prior to 20100223, when flatfield ramps were not available.
  • The step correction algorithm continues to be improved.
  • SC2CLEAN will now report quality statistics in verbose mode.
  • SC2FFT can now be given a bad bolometer mask.
The cookbook has also been updated and can be read online.
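The noise-clipping step mentioned in the list above can be sketched as follows. This is an illustration of the idea only (compare each bolometer's noise to the array median and discard outliers), not SMURF's actual algorithm; the threshold name simply mirrors the "noiseclip" config parameter.

```python
import numpy as np

def clip_noisy_bolometers(timeseries, noiseclip=3.0):
    """Return a boolean mask of bolometers to keep: those whose noise
    is within noiseclip times the median noise across the array.

    timeseries: (n_bolometers, n_samples) array. Illustration only.
    """
    noise = np.std(timeseries, axis=1)   # crude per-bolometer noise
    median_noise = np.median(noise)
    return noise <= noiseclip * median_noise

rng = np.random.default_rng(42)
data = rng.normal(0.0, 1.0, size=(100, 500))
data[7] *= 50.0                          # one very noisy bolometer
keep = clip_noisy_bolometers(data, noiseclip=3.0)
# Bolometer 7 is rejected; the quiet ones are kept.
```

Removing such bolometers up front helps convergence because the iterative solution no longer has to absorb their noise into the common-mode and map estimates.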

2010-08-04

New version of the SCUBA-2 cookbook

Version 1.1 of the SMURF SCUBA-2 Data Reduction Cookbook (or SC21, as it is fondly known) is out. We recommend that everybody who works with SCUBA-2 data read through it.
                                                                                           
In order to be able to follow the cookbook, you will need to update your local Starlink release to the latest version.

You can always get to SC21 from the sidebar of this blog.

Updated to change reference from SC19 to SC21

2010-03-17

Flatfielding updates

Summary: Flatfield ramps work really well and SMURF can now automatically handle them in the map-maker.

SCUBA-2 bolometers need to be calibrated so that we understand how they respond to varying signal from the sky and the astronomical object. The original plan was to calibrate in the dark (shutter closed). The sequence goes something like this:

  1. Select a reference heater value, take a dark frame
  2. Choose a new heater setting, take a dark frame
  3. Take a dark frame at the reference heater value
  4. Choose a different heater setting, take a dark frame
  5. Take a dark frame at the reference heater value
and continue until you have covered a reasonable range of heater settings. As the heater is changed the bolometers read out a different current. Any drifts in the instrument are compensated by averaging the surrounding reference frames and subtracting. This means that you end up with a curve that goes through zero power at the reference heater value. In order to convert this to a flatfield you either fit a polynomial as a function of measured current (so that you can look up the power) or else use "TABLE" mode and do a linear interpolation between measurements either side of the measured current. The gradient of the curve (how the bolometer responds to changes in power) is the "responsivity" and is measured in amps per watt. The responsivity image can be calculated using the SMURF calcflat command.
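As a sketch of the idea (not the calcflat implementation): simulate a bolometer whose measured current varies with heater power, reference the measurements to the current at the reference heater setting so the curve passes through zero there, fit the curve, and take its gradient as the responsivity. All numbers below are made up and the units are arbitrary.

```python
import numpy as np

# Made-up heater power settings (arbitrary units) around a reference.
ref_power = 5.0
powers = np.array([2.0, 3.0, 4.0, 6.0, 7.0, 8.0])

true_resp = -1.5e-6   # toy responsivity (current per unit power)

def current(p):
    """Toy linear bolometer response: measured current vs power."""
    return true_resp * p + 1e-4

# Averaging and subtracting the bracketing reference measurements
# removes slow drift and leaves a curve that passes through zero
# at the reference heater value.
measured = current(powers) - current(ref_power)

# Fit current as a function of power; the gradient of the fitted
# curve is the responsivity (amps per watt for real data).
coeffs = np.polyfit(powers, measured, deg=1)
responsivity = coeffs[0]
```

For real data the response is not linear, which is why SMURF offers either a polynomial fit or "TABLE" mode interpolation; the toy linear case just makes the role of the gradient obvious.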

When you open the shutter the idea is that you "heater track" to the sky. This involves adjusting the power to the heater such that the sky power detected by the bolometer results in the same current being measured as was measured in the dark. We do this by looking at the signal from a set of tracking bolometers, assuming that those bolometers are representative of the others on the array. In reality, about 80% of the bolometers do read more or less the same signal before and after opening the shutter, but the other 20% end up in a completely different place. This would not be a problem if the responsivity didn't change for those 20%, but unfortunately it does. We have verified this by doing finely spaced pong maps of Mars covering a 6x6 arcmin area. This takes about 15 minutes but gives us a beam map from every single bolometer. Analysing the Mars images showed that the bolometers with the lowest responsivity also measure a very low integrated flux for Mars, and so the calibration does change when the shutter is opened.

The solution for this was to change flatfielding to work on the sky rather than in the dark. This works in just the same way as previously, using reference sky measurements to compensate for drift, and the top plot in the figure shows a sky flatfield that is working pretty much perfectly. Finely-spaced maps of Mars confirm that all the bolometers are calibrated to within 10% with no drop off for the low responsivity bolometers.

At this point things were looking good, but we still had the issue that a sky flat takes a few minutes and really has to be done every time you do a new setup, and probably at least once an hour. Sky flats are also very dependent on observing conditions, as could be seen on 20100310 and a few days beforehand, when the sky was terribly unstable despite brilliantly low opacity (0.03 CSO tau). The middle plot below shows a sky flat from 20100310, and it is immediately obvious that the sky is varying very fast, over a much larger range than the heater is adjusting for. This flatfield failed to calibrate any bolometers at all, and we had to resort to dark flatfields to get a baseline calibration (with the associated worries described above).

We had known this was going to be an issue, so in the early part of the year we had been modifying the acquisition system to do fast flatfield ramps. Rather than setting the heater, doing an observation, changing the heater and doing another observation, we can now change the heater value at 200 Hz (currently we take 3 measurements at each setting). On 20100223 we enabled sky flatfield ramps at the start and end of every single mapping observation, and a few days later we added them to focus, pointing and sky noise observations. The bottom plot shows the flatfield ramp for the observation that immediately followed the discrete sky flatfield shown in the middle plot. There is an issue with the very last ramp, but the flatfielding software in SMURF had no problem calculating a flatfield for 850 bolometers (SMURF does compensate for drift in the reference heater values). The flatfield ramps are going to help enormously with calibration.
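The ramp readout can be sketched like this: with the heater stepped rapidly and three samples taken at each setting, group the data stream into consecutive triples and average each one to recover a single measurement per heater setting. This is an illustration with fake numbers, not the SMURF ramp-processing code.

```python
import numpy as np

samples_per_setting = 3
heater_settings = np.array([1.0, 1.2, 1.4, 1.6, 1.8])  # arbitrary units

# Fake data stream: each heater setting is read out three times,
# with a little noise on each sample.
rng = np.random.default_rng(0)
stream = np.repeat(heater_settings * 2.0, samples_per_setting)
stream = stream + rng.normal(0.0, 0.01, size=stream.size)

# Average each consecutive triple to recover one measurement
# per heater setting, ready for the flatfield fit.
per_setting = stream.reshape(-1, samples_per_setting).mean(axis=1)
```

Averaging the triples suppresses sample noise before the responsivity fit, which matters when the whole ramp only lasts a fraction of a second.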

Actually using these flatfields in the map-maker took some work but yesterday I committed changes to SMURF so that flatfield ramps will be calculated and used when flatfielding data in the map-maker (and other SMURF commands). All you need to do is give all the files from an observation to SMURF and it will sort everything out.

I have updated the /stardev and /star rsync servers in Hilo (64-bit and 32-bit). There is also a new nightly build available for OS X Snow Leopard (64-bit) in the usual place.

One final caveat: we have not yet calibrated the resistance of each bolometer relative to the nominal 2 ohms. We have taken data looking at a blackbody source, which should give us a way of tweaking the resistances. When this happens the flatfielding will change slightly and maps will need to be remade (although how critical that is will depend on how much we tweak the bolometers).


2009-09-02

JLS DR telecon - 1st meeting

Attendance: A. Chrysostomou, R. Tilanus, T. Jenness, R. Plume, M. van der Wiel, J. Di Francesco, G. Fuller, B. Cavanagh, H. Thomas, D. Johnstone, H. Roberts, D. Nutter, J. Hatchell, F. Economou

- initial discussion on whether we will have a SCUBA-2 pipeline ready. There will be something basic in place for shared-risk observing. More development will have to wait until we have all the arrays in place, as it is not worth sinking effort into this at the moment.

- some people are having issues installing the pipeline, and there is a lack of documentation. If people/institutes are having issues installing (any) Starlink software, please inform the JAC (stardev@jach.hawaii.edu), providing the relevant details.

ACTION 1: JAC will provide information on how to rsync the starlink releases to get latest patches/fixes. Information will also include for which operating system these patches/fixes are available.

DONE(!): Instructions are available on the starlink web site (http://starlink.jach.hawaii.edu/).
To download the most recent release go to: http://starlink.jach.hawaii.edu/starlink/Releases
To keep up to date with the latest fixes and patches go to:
http://starlink.jach.hawaii.edu/starlink/rsyncStarlink


- GAF requested that more statistics be made available from the QA. GAF will follow up with a specific request to Brad (see Action 3 below).

- it was clarified that the summit pipeline (during normal night-time observing) only runs basic QA on calibrations. After the end of observing, all data taken that night is re-reduced by a “nightly pipeline” which executes the full QA and advanced processing. The reduced data products which result from this are shipped to CADC and can be downloaded with (or without) the raw data in the normal way.

ACTION 2a: JAC to make QA log available to observers/co-Is following nightly reduction via the OMP (as a downloadable file).
ACTION 2b: JAC to make a more compact and readable QA report format.

ACTION 3: For SLS to provide JAC (i.e. Brad) with a list of statistics and requirements for their QA, and also what they want their reduction recipes to do.

- JH raised some existing issues from the GBS: flatfielding (striping) of early HARP data; some bad baselines are not being picked up by QA; spikes, although not as prevalent as in older data, are not trapped by the QA; an investigation is needed into how the gridding should best be done.

+++ the flatfielding problem is on Brad’s worklist

+++ we need more feedback from the teams on which bad baselines are not being filtered out

+++ de-spiking data is not a problem that the JAC has been able to tackle as yet. Part of the issue is that spikes no longer seem to be as prevalent in the data, and observers (PI as well as JLS) are not reporting the issue any longer. GAF reported that spikes are still present, but at a small level; this is an issue for the SLS, who are looking for weak, narrow lines.

ACTION 4: For JLS teams to provide JAC with images/data/log of spikes when they come across them in their data.

- RPT raised a few issues from the NGLS:

+++ need ability to baseline fit both wide and narrow lines in same data set

+++ need ability to restrict e.g. moments analysis to known velocity range.

+++ QA generally fails for (at least) early NGLS data. We will need to investigate this more, but need an easy means of switching it off in the recipes. This is easy in the main recipe, but less so in the advanced iterative part.

- there is a blog available for data reduction and pipeline activities (you're probably looking at it right now!): http://pipelinesandarchives.blogspot.com/

- the issue of making the pipeline more controllable through a config file was discussed. TJ announced that he is developing infrastructure so that the pipeline can be parameterised.

- ACC received several emails prior to the meeting. A common theme was the lack of documentation explaining what the pipeline does to data, and how to use the pipeline. JH repeated this concern at the meeting.

ACTION 5: ACC took an action following the close of the meeting to organise the production of pipeline documentation. These will probably take the form of a detailed account of what the pipeline does, and a separate cookbook which explains how to run the pipeline with the different options available.

ACTION 6: ACC to poll for a date and time for next telecon and make these meeting notes available.

2008-06-27

SCUBA-2 DR pipeline

A belated announcement that the SCUBA-2 data reduction pipeline passed its "lab acceptance" earlier this month. Full report at http://docs.jach.hawaii.edu/JCMT/SC2/SOF/PM210/04/sc2_sof_pm210_04.pdf