2010-05-28

Dark Squids and Magnetic Field Pickup

Over the last couple of weeks I have been experimenting with some data that seem to exhibit the effects of magnetic field pickup in the detectors. The SCUBA-2 readout system uses arrays of SQUIDs to amplify the small changing magnetic fields produced by changing currents in the detectors (currents that are themselves driven by changing optical and thermal loads on the bolometers). Although there is substantial shielding, some external changing magnetic fields may still leak into the raw data, such as those from the motion of the telescope through the Earth's magnetic field, or fields produced by motors in the dome. The readouts are multiplexed, and the signals from columns of detectors pass through a common amplifier chain. For this reason, each column has a bare SQUID with no bolometer, a so-called "dark SQUID" (DKS henceforth), to track this non-thermal/optical noise term. The time-series for these data are stored in raw SCUBA-2 data files in the ".more.scuba2.dksquid" extension. They can be plotted, for example, using KAPPA linplot (this is for the DKS in column 1 of the following raw 30 s data file):

$linplot 's4a20100223_00061_0002.more.scuba2.dksquid(1,)'



Just like regular bolometer data, the DKS can have spikes (as in this example). Generally speaking the DKS haven't shown much structure, so we weren't concentrating on using them to clean the bolometer data.

However, on some of the wider-area data sets the DKS signals can be substantially larger, and seem to introduce a significant noise term. One additional feature of the magnetic field pickup signal that makes it easy to distinguish from other noise sources is that its sign is essentially random, and independent of the detector response to optical/thermal loads; i.e. the signal could be rising in one bolometer and falling in the next. The following movie shows a data cube from the s8d array after the common-mode signal has been removed. The residual signal oscillates in time, with two clear detector populations that are out of phase with each other:



These data were taken as part of a 0.5 deg diameter scan, and comparing the dark squid signals with the change in telescope azimuth shows that the two are highly correlated. Combined with the arbitrary sign of the residual noise, the culprit is almost certainly magnetic field pickup.

A dark squid signal (red dots), and the same signal smoothed with a 200-sample boxcar (white line), over 100 s:


Azimuth offset from the map centre over the same period of time:


The next plot shows the dark squids for each working column over ~25 min. with arbitrary vertical offsets for clarity. Note that while similar, each column has a slightly different response:


The iterative map-maker has a partially-implemented model, called "DKS", for iteratively fitting and removing the DKS signals from working detectors along each column. The relevant settings for the configuration file are:

modelorder = (dks,com,gai,ext,ast,noi)
dks.boxcar = 100

Here we have added DKS cleaning as the first step in the iterative model fitting (note also that we have dropped FLT for now; more on that below). The second option smooths the DKS data with a 100-sample boxcar prior to fitting and removing it from each bolometer.
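
For completeness, a typical makemap invocation that picks up these settings from a config file might look like this (the file names are hypothetical; in, out and config are the usual makemap parameters):

$makemap in=^myfiles.lis out=mymap config=^dimmconfig_dks.lis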

The next two images show a map solution using only COM to remove low-frequency noise, followed by a solution that uses both DKS and COM:





While not perfect, the DKS model has clearly removed much of the noise that caused streaking in the first image. Remember, there is no additional high-pass filtering, as FLT has been turned off in this example.

If "dks" is specified in "exportndf", the fitted DKS model will be written to a file with a suffix "_dks.sdf". However, this file is not particularly easy to view and make any sense of because the dark squids and the fit coefficients are all bundled together in one large array. For this reason, we've added a new SMURF task called "sc2expandmodel" that will evaluate the model and produce a full 3d data cube with the same dimensions/units as the residual data, e.g.

$sc2expandmodel s4a20100228_00016_0002_con_dks expanded_dks

will produce an easily-viewable file called "expanded_dks". All detectors along a column will have the same shape, but the scale/offset will differ from detector to detector. Examining this file and comparing it to the "_res" output from makemap is useful for getting an idea of how strong the magnetic field pickup is for a given data set.
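
For reference, the model is exported by listing it in "exportndf" in the config file; I believe this takes the same bracketed-list syntax as modelorder:

exportndf = (dks)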

All of this code is still experimental. It currently has several problems:
  • boundaries with padding/apodization will introduce steps, so padstart/padend/apod should probably be set to zero (see the config sketch after this list); this means it is not optimal for combining with FLT at this stage.
  • the DKS are not pre-processed in the same way as the bolometer data to remove spikes / steps etc. This capability will be added shortly.
  • Some DKS are clearly dead (e.g. DKS 1 in the multi-DKS plot above). Currently we don't have an automatic system for flagging/removing them. Note also that if a column's DKS is bad, it will be impossible to DKS-clean the working detectors along that column, so we need to decide whether to simply ignore those detectors when making a map.
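
Putting the pieces together, here is a sketch of the relevant config with DKS cleaning enabled and padding/apodization disabled, per the first caveat above (illustrative, not a recommended default):

modelorder = (dks,com,gai,ext,ast,noi)
dks.boxcar = 100
padstart = 0
padend = 0
apod = 0
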
Once we have worked through some of these issues, we will update the default dimmconfig files in SMURF accordingly.

2010-05-18

SCUBA-2 filter information

At our last SCUBA-2/SRO telecon, people asked to see the measured profiles for the SCUBA-2 filters.

Dan has put up some plots here.

2010-05-17

PICARD web page

I've created a new static location for all things PICARD:
http://www.oracdr.org/oracdr/PICARD

The page includes an introduction to PICARD, a list of all available recipes (with more detailed documentation) and a few hints, tips and potential gotchas. I'll keep this page up-to-date, adding new blog entries and/or sending messages to the scuba2dr mailing list as necessary.

2010-05-11

Post-processing SCUBA-2 data with PICARD

Processing raw SCUBA-2 data is done with SMURF or the pipeline (ORAC-DR). What happens after that depends on the user and their level of familiarity with particular software packages. Fortunately, the SCUBA-2 software team is here to help, and has come up with a series of standardized tools for performing a number of basic post-processing tasks.

Introduction to PICARD

Our tool of choice is PICARD, which makes use of the existing ORAC-DR infrastructure, as well as our experience in writing primitives and recipes for the SCUBA-2 pipeline. PICARD is run from the command line as follows (I'll use % as the prompt):

% picard [options] [RECIPE_NAME] [list of files to process]

For example,

% picard -log sf -recpars mypar.lis CROP_JCMT_IMAGES myfiles*.sdf

The most commonly used options are -log and -recpars (the full list of available options can be seen by running picard -h). Both of these options take additional arguments.

The "-log" option controls where the messages from PICARD are printed: "-log sf" will write messages to the terminal window and to a file called .picard_PID.log (where PID is the process ID for picard) in the output directory. To avoid creating the .picard_PID.log files, just specify "-log s".

The "-recpars" option allows the user to pass in a text file containing parameters which can be used in the given recipe. The permitted parameters are listed with the various recipes below. The format of this text file is a list of `parameter = value' entries, with the recipe name given in square brackets:

[RECIPE_NAME]
PARAM1 = VALUE1
PARAM2 = VALUE2
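
For example, a hypothetical parameter file for the CROP_JCMT_IMAGES recipe described below might contain:

[CROP_JCMT_IMAGES]
MAP_WIDTH = 600
MAP_HEIGHT = 600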

PICARD writes its output files to the current directory (unless the environment variable ORAC_DATA_OUT is defined in which case that location will be used).

There are currently four recipes which may be of interest:
  • MOSAIC_JCMT_IMAGES
  • CROP_JCMT_IMAGES
  • REMOVE_BACKGROUND
  • SCUBA2_MATCHED_FILTER
These recipes and their parameters are described in more detail below. More recipes will be added as the need arises and as we gain more experience in analyzing SCUBA-2 data. Interested users should update their Starlink installations to get access to these recipes.

MOSAIC_JCMT_IMAGES

Coadd the given files into a single map, taking into account the EXP_TIME and WEIGHTS NDF components. The images are combined using variance weighting and the output variance is derived from the input variances. Currently the recipe uses the KAPPA wcsmosaic task for coadding the images.

The same pixel-spreading method (and any associated parameters) is used for the data and the EXP_TIME and WEIGHTS.

Creates a single output file based on the name of the last file in the list, with the suffix "_mos" (e.g. mylastfile_mos.sdf).

Available recipe parameters:
[MOSAIC_JCMT_IMAGES]
WCSMOSAIC_METHOD = wcsmosaic pixel-spreading method: see wcsmosaic documentation for available options (default is "nearest")
WCSMOSAIC_PARAMS = additional parameters which may be required for the chosen method
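
As a sketch, here is a parameter file that selects a different pixel-spreading method (bilinear is one of the standard wcsmosaic options; the file names are hypothetical), along with the corresponding command:

[MOSAIC_JCMT_IMAGES]
WCSMOSAIC_METHOD = bilinear

% picard -log s -recpars mypar.lis MOSAIC_JCMT_IMAGES myfiles*.sdf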

CROP_JCMT_IMAGES

Crop images to the map size in the data header (as specified in the Observing Tool), though this size can be overridden using the recipe parameters below.

Creates an output file for each input file, with the suffix "_crop".

Available recipe parameters:
[CROP_JCMT_IMAGES]
MAP_WIDTH = map width in arcsec
MAP_HEIGHT = map height in arcsec

REMOVE_BACKGROUND

Fit and remove large-scale background variations from images using either KAPPA fitsurface or CUPID findback. See the Starlink documentation on both tasks for more information on the parameters shown below. The option exists to mask out a circular region centred on the source before removing the background.

Be aware that the subtraction of the background will add noise proportional to the RMS deviation between the image and the background fit.

Creates an output file for each input file with the suffix "_back".

Available recipe parameters:
[REMOVE_BACKGROUND]
MASK_SOURCE = flag to mask out a circular region on the source before fitting a background (1 = mask out source; 0 = do not mask out source - the default)
APERTURE_RADIUS = radius of aperture (in arcsec) for masking out the source (default is 30 arcsec)
BACKGROUND_FITMETHOD = the method for fitting the background, either fitsurface (default) or findback
FITSURFACE_FITTYPE = fittype parameter for fitsurface: polynomial (default) or spline
FITSURFACE_FITPAR = parameters for fit, up to 2 numbers corresponding to NXPAR/NYPAR for fitsurface or KNOTS for spline fit
FINDBACK_BOX = size of box in pixels used by findback for smoothing the image
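
As an illustrative example, a parameter file that masks out a 60-arcsec radius around the source and fits a low-order polynomial surface:

[REMOVE_BACKGROUND]
MASK_SOURCE = 1
APERTURE_RADIUS = 60
BACKGROUND_FITMETHOD = fitsurface
FITSURFACE_FITTYPE = polynomial
FITSURFACE_FITPAR = 2,2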

SCUBA2_MATCHED_FILTER

Apply a matched filter to the data to improve point-source detectability. The images and PSFs are smoothed with a broad Gaussian (default 30 arcsec, but this can be varied using the recipe parameter below) and the smoothed versions are subtracted from the originals. The images are then convolved with the modified PSFs. The PSF created by the recipe is a Gaussian with FWHM equal to the JCMT beam size at the appropriate wavelength (i.e. 7.5 or 14 arcsec).

Creates an output file for each input file with the suffix "_mf", and a PSF file "_psf" if the PSF was not specified as a recipe parameter.

Available recipe parameters:
[SCUBA2_MATCHED_FILTER]
PSF_MATCHFILTER = name of a PSF file (NDF format, will be used for all images)
PSF_NORM = switch to determine whether a PSF is normalized to a peak of unity ("peak" - the default) or a sum of unity ("sum")
SMOOTH_FWHM = FWHM in arcsec of Gaussian to smooth image (and PSF)
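
For example, to use a narrower smoothing Gaussian than the default (the values and file names are illustrative):

[SCUBA2_MATCHED_FILTER]
SMOOTH_FWHM = 20

% picard -log s -recpars mypar.lis SCUBA2_MATCHED_FILTER mymap.sdf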

2010-05-07

Typos in config files

This week I've made a couple of minor changes to the way the iterative map-maker handles configuration files. On too many occasions I've made a change to a config parameter and seen no effect because of a typo, or else I've left a parameter unspecified and had no idea what that really meant.

We've now fixed these problems:

  • Defaults are now read from a file called $SMURF_DIR/smurf_makemap.def
  • When user-supplied parameters are read in, they are merged with the defaults, and if you use any keyword that is not present in the default file you will get an error immediately.
  • When the history is written out (use KAPPA HISLIST to see it), it contains the values that were actually used rather than just the values you supplied. This is much more useful when trying to work out exactly what is different between two reductions.
I did have to change the way config files work in one small way. Previously, if you wanted your config file to include overrides based on the wavelength, you had to say "flt_850.something" or "flt_450.something". To allow the above changes to work we've had to reorganize that slightly: you now say "450.flt.something", and you can override all the normal parameters in this way for a particular wavelength.
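
For example, to set an FLT parameter differently at each wavelength you would now write something like this (I'm using flt.filt_edgehigh purely as an illustration, with made-up values):

450.flt.filt_edgehigh = 0.6
850.flt.filt_edgehigh = 0.3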

2010-04-29

How Should I Mosaic My SCUBA-2 Data (Redux)?

Mosaicking your SCUBA-2 data with wcsmosaic or makemos is fine if all you're interested in is the data. But what if you want to know the exposure time per pixel in your map, to determine whether the noise is reasonable? That's easy for each map written by makemap: the exposure-time image is stored in the .MORE.SMURF.EXP_TIME NDF extension (which you can view in Gaia). But wcsmosaic doesn't know about this extension, so the EXP_TIME data in a mosaic are only those of the first file.

There are two ways to get properly mosaicked data:
  • Use the ORAC-DR pipeline and process from raw;
  • Use the PICARD recipe MOSAIC_JCMT_IMAGES on processed data.

I'll write a post about using the pipeline in the near future but for now I'll highlight the second method.

PICARD comes for free with ORAC-DR (which comes for free with Starlink). It's basically a tool for processing and analyzing reduced data which takes advantage of the same infrastructure used by the pipeline.

At the command line type:

picard -log s MOSAIC_JCMT_IMAGES myfiles*.sdf

It'll tell you what it's doing and exit. This will create a file called "mylastfile_mos.sdf" which has the correct EXP_TIME extension (where "mylastfile" is the name of the last file in the list given to MOSAIC_JCMT_IMAGES). Try it and see.

Note that all the files must be of the same source.
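
If you want to convince yourself that the exposure times have been handled properly, you can run KAPPA stats directly on the extension (the file name is hypothetical; the extension-path syntax is the same as for any NDF component):

stats 'mylastfile_mos.more.smurf.exp_time'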

At the moment, it only supports wcsmosaic as the mosaicking task but will support makemos in the future. Keep rsync'ing and one day it'll just be there...

[This post was updated on 20100510 to reflect the change in the recipe name and the file-naming behaviour]

How Should I Mosaic My SCUBA-2 Data?

I've been asked this a number of times so I thought I'd write something down. Currently the SMURF iterative map-maker calculates models for each chunk of data independently (a chunk is a continuous sequence, so often that will be a single observation, but it may be smaller than an observation if a flatfield is inserted mid-way through). It then combines the chunks using a weighted coadd to give you the final map. This means that there is nothing to be gained by throwing all your observations at MAKEMAP at once, apart from it taking much longer to make a map before you can see the result.

In terms of flexibility it is better to make a map from each observation separately. You can then inspect each map before combining them, and remove any low-frequency structure at that point as well. To combine these maps into a single map you have two choices using Starlink software (example commands follow the list):

  1. Use KAPPA WCSMOSAIC with an interpolation or rebinning scheme of your choice. Make sure that you set the VARIANCE parameter to true to enable variance weighting of your mosaic.
  2. If you have already ensured that the maps are made on the same grid (you can use the REF argument in MAKEMAP to ensure this) then you can try CCDPACK MAKEMOS. This can be used to do median or clipped mean stacking and does no interpolation or smoothing. If your images are not aligned on the same grid it is probably better to remake them so that they are, but if that is difficult you can use the KAPPA WCSALIGN command to match up the grids first. Use the GENVAR and USEVAR parameters in MAKEMOS.
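
For example, sketches of both approaches (the file names are hypothetical; the parameters are those mentioned above):

wcsmosaic in=^mymaps.lis out=mosaic method=bilinear variance=true
makemos in=^mymaps.lis out=mosaic method=median genvar=true usevar=true
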
In the future the map-maker may be able to do a better job handling all the data at once itself, rather than relying on external mosaicking, but at the moment this is not the case.

If you want to keep track of exposure times you need to track them separately. We will cover that in a follow-up post.