2 The really easy way

Given that ORAC-DR is what you will use at the telescope, and that it provides the easiest introduction to the various steps necessary to reduce your data, we will deal with it first. If you are at the telescope, log into mamo (if you are working on another machine, just ssh to mamo) and then type:

  % oracdr_scuba
  % oracdr


Figure 1: The Xoracdr GUI for ORAC-DR. All the information needed to start the data reduction (i.e. where the data is, which instrument is being used, which UT date) is prompted for.


If you are at your home institution or at the JAC, you should use the new X-windows GUI. Simply type:

  % xoracdr

and the interface shown in Figure 1 should pop up. Fill in the appropriate date and the location of the raw data files (see below if you are at the JAC and unsure), as well as where you want the reduced data to go, and press Start!

Having done one or other of the previous set-ups, you should find that a dialogue window pops up with explanatory messages about what the pipeline is doing, followed shortly by an X window which displays all the key stages of the reduction. ORAC-DR then works through the observations one by one, ignoring some (such as the focus and align measurements) but reducing most. For each distinct type of observation (pointing, skydip, jiggle map) ORAC-DR has a recipe it uses to reduce the data. If you miss a stage (maybe you went out to make a coffee), or you want to examine something in detail, don’t despair: look in the output directory for the reduction, where you will find many of the final reduced files and intermediate stages eating into your disk space. From the name of a file you should be able to work out at what stage of the reduction it was produced – the final ‘re-binned images’ should end in “_reb”.
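To make the suffix convention concrete, here is a small sketch of a helper that guesses the reduction stage from a file name. Only the “_reb” suffix is documented above; the other suffixes and their meanings are assumptions inferred from the primitive names and the KEEP lists in the recipes below, and the helper itself is illustrative, not part of ORAC-DR.

```python
# Hypothetical helper: guess the reduction stage of an intermediate
# file from its suffix.  Only "_reb" (re-binned image) is documented
# in the text; the other entries are assumptions based on the
# primitive names in the ORAC-DR recipes.
STAGE_BY_SUFFIX = {
    "_ext": "extinction corrected",   # assumed: _EXTINCTION_CORRECT_
    "_sky": "sky noise removed",      # assumed: _REMOVE_SKY_NOISE_*
    "_rlb": "scan baseline removed",  # assumed: _REMOVE_SCAN_BASELINE_
    "_cal": "calibrated",             # assumed: _CALIBRATE_DATA_
    "_reb": "re-binned image (final)",
}

def reduction_stage(filename: str) -> str:
    """Return a human-readable stage for an intermediate file name."""
    stem = filename.rsplit(".", 1)[0]        # drop any .sdf extension
    for suffix, stage in STAGE_BY_SUFFIX.items():
        if stem.endswith(suffix):
            return stage
    return "unknown stage"

print(reduction_stage("o37_lon_reb.sdf"))    # -> re-binned image (final)
```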

Given that this is a cookbook for reducing map data, let’s look at the two recipes used to reduce maps: one for jiggle maps and one for ‘emerson2’ scan maps. You can find these by typing “ls $ORAC_DIR/recipes/SCUBA” (if the environment variable is not set, type “oracdr_scuba” first). The jiggle-map recipe reads,

  =head1 NAME
  
  SCUBA_JIGMAP - Standard reduction for jiggle map data
  
  =head1 SYNOPSIS
  
  
  =head1 DESCRIPTION
  
  This is the standard recipe to use for reduction of SCUBA
  jiggle map data.
  
  
  =cut
  
  _PRE_PROCESS_
  
  _FLAT_FIELD_
  
  _SET_BAD_PIXELS_
  
  _EXTINCTION_CORRECT_
  
  _CLIP_BOLOMETERS_ NSIGMA=5.0
  
  _REMOVE_SKY_NOISE_JIGGLE_  BOLOMETERS=r3 MODE=median
  
  _REBIN_FRAME_ PIXEL_SIZE=3.0 REBIN_METHOD=GAUSSIAN
  
  _FIND_CALIBRATION_MAP_
  
  _CALIBRATE_DATA_
  
  _REBIN_GROUP_ PIXEL_SIZE=1.0 REBIN_METHOD=LINEAR
  
  _DELETE_TEMP_FILES_ KEEP=_reb,_ext,_sky,_cal

while the scan map recipe reads (omitting the verbose header information),

  _PRE_PROCESS_
  
  _FLAT_FIELD_
  
  _SET_BAD_PIXELS_
  
  _DESPIKE_SCAN_
  
  _EXTINCTION_CORRECT_
  
  _REMOVE_SCAN_BASELINE_
  
  _REMOVE_SKY_NOISE_SCAN_
  
  # Comment this if the processing of the individual frame is
  # not required.
  _REBIN_FRAME_ PIXEL_SIZE=3.0 REBIN_METHOD=LINEAR
  
  _REBIN_EM2_GROUP_ PIXEL_SIZE=3.0 REBIN_METHOD=GAUSSIAN
  
  # Tidy up
  # Need to make sure that the _rlb file is kept for the
  # sky removal and that the _sky file is kept for the group processing.
  _DELETE_TEMP_FILES_ KEEP=_rlb,_sky,_reb
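The recipe format shown above is simple enough to parse mechanically: each non-blank, non-comment line outside the POD header (the =head1 … =cut block) names a primitive, optionally followed by KEY=VALUE arguments. As an illustration only (this is not ORAC-DR’s own parser), a minimal sketch:

```python
# Minimal sketch of a parser for the recipe format shown above:
# one primitive per line, optional KEY=VALUE arguments, '#' comments,
# and a POD header delimited by '=head1' ... '=cut'.
# Illustration only -- not ORAC-DR's actual recipe parser.
def parse_recipe(text: str):
    steps, in_pod = [], False
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("=head"):       # enter POD documentation
            in_pod = True
            continue
        if line == "=cut":                 # leave POD documentation
            in_pod = False
            continue
        if in_pod or not line or line.startswith("#"):
            continue
        primitive, *args = line.split()
        steps.append((primitive,
                      dict(a.split("=", 1) for a in args)))
    return steps

recipe = """
_CLIP_BOLOMETERS_ NSIGMA=5.0
_REBIN_FRAME_ PIXEL_SIZE=3.0 REBIN_METHOD=GAUSSIAN
"""
for primitive, args in parse_recipe(recipe):
    print(primitive, args)
```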

Both recipes read surprisingly like English: each line is a step to be done in the data processing (or, in ORAC-DR parlance, a call to a ‘primitive’). It is also worth noting that the first few steps are nearly identical. The pre-processing, flat-fielding, bad-pixel masking and extinction correction are the same for both jiggle and scan map data; the recipes differ only in the despiking, the removal of baselines and sky noise, and the rebinning. Note that many of the steps have variables which can be set to customize the recipe, e.g. NSIGMA in the clipping and PIXEL_SIZE in the rebinning. It is possible to reduce your data accurately using ORAC-DR alone: by using the supplied variables in the recipe, by customizing the recipes to use the wide range of stock primitives issued with ORAC-DR, or even by altering primitives or writing new ones (though this requires you to become acquainted with object-oriented Perl). At the very least ORAC-DR should be run twice, once at the summit when the data is being taken, and once at home to give you something to compare to. However, the next sections describe how to reduce the data at the ‘bare-bones’ level using Surf and the standard Starlink data-reduction packages.
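To make two of these parameters concrete: _CLIP_BOLOMETERS_ NSIGMA=5.0 discards samples more than 5 standard deviations from a bolometer’s mean, while _REMOVE_SKY_NOISE_JIGGLE_ MODE=median subtracts, at each time sample, the median signal across the chosen bolometers. The following is a rough numerical sketch of both ideas, not SURF’s actual algorithms:

```python
# Rough sketches of two recipe steps -- NOT Surf's actual algorithms.
import statistics

def clip(samples, nsigma=5.0):
    """Drop samples more than nsigma standard deviations from the
    mean, in the spirit of _CLIP_BOLOMETERS_ NSIGMA=5.0."""
    mean = statistics.mean(samples)
    sd = statistics.stdev(samples)
    return [s for s in samples if abs(s - mean) <= nsigma * sd]

def remove_median_sky(bolometers):
    """At each time sample, subtract the median signal across
    bolometers, in the spirit of _REMOVE_SKY_NOISE_JIGGLE_ MODE=median.
    'bolometers' is a list of equal-length time series, one per bolometer."""
    cleaned = []
    for t in range(len(bolometers[0])):
        sky = statistics.median(b[t] for b in bolometers)
        cleaned.append([b[t] - sky for b in bolometers])
    return cleaned

# A strong outlier among 100 flat samples is rejected by the 5-sigma cut:
print(len(clip([1.0] * 100 + [100.0])))   # -> 100
```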