### 5 Data-cube manipulation

While the initial data-reduction software for IFUs is highly instrument dependent, the data analysis of the final science data product for all these instruments should be fairly generic. The end product of the data reduction for IFS is, almost naturally, an $x$,$y$,$\lambda$ data cube. For instruments not working in the optical and infrared regimes, the third axis may also be in some other spectral co-ordinate system, such as frequency. Once assembled, with associated variance and quality arrays, scientifically interesting information can be extracted from the cube.

#### 5.1 Existing software

For long-slit paradigm instruments a data cube is the only data product available, as overlapping point-spread functions mean that the spectral data must be resampled. For MOS paradigm data, while the individual spectra are available, data visualisation is often intrinsically more intuitive if done on resampled data cubes. As such, this cookbook will (in the main) only deal with applications that handle the data-cube format. If you have already sourced the Starlink /star/etc/login and /star/etc/cshrc files then the commands kappa, figaro, and ccdpack will set up access to the Kappa (SUN/95), FIGARO, and Ccdpack tasks respectively, including most of the following applications.

##### 5.1.1 Arithmetic Operations

Some of the most fundamental operations you might wish to perform on a data cube are the arithmetic operations of addition, subtraction, multiplication and division, both by scalars and by other data cubes. All these basic operations, and additional more complicated ones, can be carried out using tasks from the Kappa package. Detailed documentation for all of these tasks can be found in SUN/95.

• cadd
Adds a scalar to an NDF data structure
• calc
Evaluates a mathematical expression
• cdiv
Divides an NDF by a scalar
• cmult
Multiplies an NDF by a scalar
• csub
Subtracts a scalar from an NDF data structure
• div
Divides one NDF data structure by another
• maths
Evaluates mathematical expressions applied to NDF data structures
• mult
Multiplies two NDF data structures
• normalize
Normalises one NDF to a similar NDF by calculating a scale factor and zero-point difference
• pow
Takes the specified power of each pixel of a data array
• rift
Adds a scalar to a section of an NDF data structure to correct rift-valley defects
• sub
Subtracts one NDF data structure from another
• thresh
Edits an NDF such that array values below and above two thresholds take constant values
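For instance, two of these tasks can be combined in a script to sky-subtract and rescale a cube. The following is a sketch only: it assumes a Starlink environment with Kappa initialised, and the NDF names and scale factor are illustrative.

```shell
# Subtract a sky cube from an object cube (hypothetical NDF names).
sub in1=object_cube in2=sky_cube out=skysub_cube

# Scale the result by a constant flux-calibration factor.
cmult in=skysub_cube scalar=1.27 out=calib_cube
```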
##### 5.1.2 Cube manipulation

Slightly more complex is manipulation and resampling of the data cube itself. The most important utilities available to do this within Kappa are the collapse, chanmap, ndfcopy, and pluck applications. collapse can produce white-light and passband images (see Section 5.8) and velocity maps by the appropriate selection of statistic. chanmap creates a grid of passbands. ndfcopy can extract a single spectrum (along pixel axes) or arbitrary cube sections (again see Section 5.8 for details), while pluck uses interpolation to extract at arbitrary positions, such as a spectrum at nominated equatorial co-ordinates, or an image at a given wavelength or frequency. Some applications require the spectral axis to be the first or the third axis; permaxes allows shuffling of the pixel axes.

• chanmap
Creates a channel map from a cube NDF by compressing slices along a nominated axis
• collapse
Reduces the number of axes in an $N$-dimensional NDF by compressing it along a nominated axis
• compadd
Reduces the size of an NDF by adding values in rectangular boxes
• compave
Reduces the size of an NDF by averaging values in rectangular boxes
• compick
Reduces the size of an NDF by picking equally spaced pixels
• flip
Reverses an NDF’s pixels along a specified dimension
• ndfcopy
Copies an NDF, or an NDF section, to a new location
• permaxes
Permutes an NDF’s pixel axes
• pixdupe
Expands an NDF by pixel duplication
• pluck
Plucks slices from an NDF at arbitrary positions
• regrid
Applies a geometrical transformation to an NDF
• segment
Copies polygonal segments from one NDF into another
• slide
Realigns an NDF using a translation.
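The most common of the operations described above can be sketched as shell commands. This assumes a Starlink environment with Kappa initialised; the NDF names, pixel indices, and axis order are illustrative.

```shell
# Collapse along the third (spectral) axis to form a white-light image.
collapse in=cube out=white axis=3 estimator=mean

# Copy out the spectrum at spatial pixel (10,12) as a one-dimensional NDF.
ndfcopy in="cube(10,12,)" out=spectrum trim trimwcs

# Re-order the pixel axes so that the spectral axis comes first.
permaxes in=cube out=cubeperm perm=[3,1,2]
```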
##### 5.1.3 Two-dimensional manipulation

While IFU data is intrinsically three-dimensional there are times when it is necessary to deal with two-dimensional images during analysis (e.g. velocity maps). Some potentially useful Kappa applications which are restricted to handling two-dimensional files are

• look
Outputs the values of a sub-array of a two-dimensional data array to the screen or a text file
• median
Smooths a two-dimensional data array using a weighted median filter

Some tasks operate in two dimensions, but apply the processing to a series of planes in a cube. They let you, for example, smooth spatially while retaining the spectral resolution.

• block
Smooths an NDF using a one- or two-dimensional square or rectangular box filter
• gausmooth
Smooths a one- or two-dimensional NDF using a Gaussian filter
• rotate
Rotates a two-dimensional NDF about its centre through any angle
##### 5.1.4 Pixel Operations

There are several applications which can carry out operations on individual pixels. The most general of these is chpix, which can replace the value of a single pixel, or a region of pixels, with a user-defined value (including the bad ‘magic’ value).

• chpix
Replaces the values of selected pixels in an NDF
• errclip
Removes pixels with large errors from an NDF
• fillbad
Removes regions of bad values from an NDF
• nomagic
Replaces all occurrences of magic-value pixels (see SUN/95 for details) in an NDF array with a new value
• numb
Counts the number of elements of an NDF with values or absolute values above or below a threshold
• substitute
Replaces all occurrences of a given value in an NDF array with another value
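As an example of the most general of these tasks, chpix can flag a defective spaxel as bad. This is a sketch assuming Kappa is initialised; the NDF name and pixel section are illustrative.

```shell
# Replace the whole spectrum at spatial pixel (7,3) with the bad value.
chpix in=cube out=cube_clean section='7,3,' newval=bad
```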
##### 5.1.5 Other tools and file manipulation

Building processing scripts from the Starlink applications can involve manipulating or querying information about structures within the NDF file itself. Three useful Kappa (SUN/95) commands to do this are listed below.

• ndftrace
Displays the attributes of an NDF data structure
• parget
Obtains the value or values of an application parameter
• stats
Computes simple statistics for an NDF’s pixels
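A common scripting pattern is to run stats and then recover one of its output parameters with parget. The following is a sketch, assuming a csh script with Kappa initialised; the NDF name is illustrative.

```shell
# Compute simple statistics for the cube's pixels.
stats ndf=cube

# Recover the MEAN output parameter of stats into a shell variable.
set mean = `parget mean stats`
echo "Mean value: $mean"
```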

In some cases NDF information must be modified after a processing step (e.g. the NDF title) or, in the case of a newly created NDF (see Section 6.6), we must generate initial values. Kappa provides various tools to manipulate NDF components and extensions.

• axlabel
Sets a new label value for an axis within an NDF data structure
• axunits
Sets a new units value for an axis within an NDF data structure
• setaxis
Sets values for an axis array component within an NDF data structure
• setbad
Sets new bad-pixel flag values for an NDF
• setbb
Sets a new value for the quality bad-bits mask of an NDF
• setbound
Sets new bounds for an NDF
• setext
Manipulates the contents of a specified NDF extension
• setlabel
Sets a new label for an NDF data structure
• setmagic
Replaces all occurrences of a given value in an NDF array with the bad value
• setnorm
Sets a new value for one or all of an NDF’s axis-normalisation flags
• setorigin
Sets a new pixel origin for an NDF
• settitle
Sets a new title for an NDF data structure
• settype
Sets a new numeric type for the data and variance components of an NDF
• setunits
Sets a new units value for an NDF data structure
• setvar
Sets new values for the variance component of an NDF data structure
##### 5.1.6 Visualisation

Combined with the graphics-device commands (see Section 5.3), the following applications, especially clinplot, display, linplot, and contour, act as the backbone for image display, and some complex effects can be generated using these elemental tasks.

• clinplot
Draws a spatial grid of line plots for an axis of a cube NDF
• contour
Contours a two-dimensional NDF
• cursor
Reports the co-ordinates of positions selected using the cursor
• display
Displays a one- or two-dimensional NDF
• drawsig
Draws +/-n standard-deviation lines on a line plot
• linplot
Draws a line plot of the data values in a one-dimensional NDF
• profile
Creates a one-dimensional profile through an $N$-dimensional NDF
##### 5.1.7 Mosaics

Mosaicking multiple IFU data cubes together is something you may well wish to consider. Unfortunately, there are problems involved in doing so (see Section 5.9). However, the following tasks from Kappa may be useful

• wcsalign
Aligns a group of NDFs using World Co-ordinate System information
• wcsmosaic
Tiles a group of NDFs using World Co-ordinate System information

along with the following tasks from Ccdpack.

• makemos
Makes a mosaic by combining and (optionally) normalising a set of images
• drizzle
Resamples and mosaics using the drizzling algorithm
##### 5.1.8 Spectral fitting

Some of the most useful applications available for spectral fitting and manipulation live inside FIGARO as part of SPECDRE. Some of these applications work on individual spectra, however, others read a whole cube at once and work on each row in turn. A possible problem at this stage is that many of these applications expect the spectroscopic axis to be the first in the cube, whereas for the current generation of IFU data cubes the spectral axis is typically the third axis in the cube. Kappa permaxes can reconfigure the cube as needed by the various packages. A full list of SPECDRE applications can be found in SUN/86.

One of the fundamental building blocks of spectral analysis is Gaussian fitting. Amongst other tools, FIGARO provides the fitgauss application (part of SPECDRE) to carry out this task.

FITGAUSS is especially well suited to automation inside a script. For example,

fitgauss \
   in=${spectrum} mask1=${low_mask} mask2=${upp_mask} \
   cont=${cont} peak=${peak} fwhm=${fwhm} reguess=no remask=no \
   ncomp=1 cf=0 pf=0 wf=0 comp=${component} fitgood=yes \
   centre=${position} logfil=${fitfile} device=xwin \
   dialog=f

Here we call the fitgauss routine specifying all the necessary parameters, and suppressing user interaction, to allow us to fit a spectrum automatically from inside a shell script. We have specified an input file, ${spectrum}, and the lower and upper boundaries of the fitting region, ${low_mask} and ${upp_mask}, respectively. Various initial guesses for the fitting parameters have also been specified: the continuum level ${cont}, peak height ${peak}, full-width half-maximum ${fwhm}, and the line centre ${position}. By specifying ncomp=1 cf=0 pf=0 wf=0 we have told the application that we want it to fit a single Gaussian with the central line position, peak height, and FWHM free to vary.

In addition we have turned off user interaction with the application by setting the following parameters: reguess=no, remask=no, dialog=f, and fitgood=yes.

This allows fitgauss to go about the fit without further user intervention, displaying its resulting fit in an X-display, logging the fit characteristics to a file (${fitfile}), and saving the fit in the SPECDRE extension (see SUN/86 for details) of the NDF, where it is available for future reference and manipulation by other SPECDRE applications. The velmap and peakmap scripts are based around the SPECDRE fitgauss application. For data with a significantly varying continuum, mfittrend is available in Kappa to fit and remove the continuum signal, thereby making fitgauss’s job easier.

#### 5.2 Locating Features

The ability to identify and measure the properties of features can play an important role in spectral-cube analysis. For example, you may wish to locate emission lines and obtain their widths as initial guesses for spectral fitting, or to mask the lines in order to determine the baseline or continuum. Extending to three dimensions, you may want to identify and measure the properties of clumps of comoving emission in a velocity cube. The CUPID package addresses these needs. findclumps is the main command. It offers a choice of clump-finding algorithms with detailed configuration control, and generates a catalogue in FITS format storing, for each peak, its centre and centroid co-ordinates, its width, peak value, total flux, and number of contributing pixels. Information from the catalogue can be extracted into scripts using STILTS, in particular its powerful tpipe command.

findclumps in=cube out=clumps outcat=cubeclump perspectrum \
           config="^ClumpFind.par" accept

set peak = `stilts tpipe in=cubeclump.FIT cmd="select index==1" \
            cmd='keepcols Cen3' omode=out ofmt=csv-nohead`

Here we find the clumps in the three-dimensional NDF called cube using the configuration stored in the text file ClumpFind.par. That file will specify things like the minimum number of pixels in a clump, the instrument’s beam FWHM, and tuning of the particular clump-finding algorithm chosen.

The PERSPECTRUM parameter requests that the spectra be analysed independently. It would be absent if you were looking for features in a velocity cube. The contents of the output NDF clumps are algorithm dependent, but in most cases its data array stores the index of the clump in which each pixel resides; all algorithms store clump information, and cut-outs of the original data about each clump, in an extension called CUPID; and its QUALITY array has flags to indicate whether each pixel is part of a clump or the background. The last is useful for masking and inspecting the located clumps. Some experimentation with findclumps algorithms and configuration, to obtain a suitable segmentation for your data’s characteristics, is expected. That is why we have not listed any configuration parameters, or even chosen a clump-finding algorithm, in the example.

Continuing with the example, the catalogue of clumps detected and passing any threshold criteria is stored in cubeclump.FIT. (See the CUPID manual for details of all the column names.) We then use tpipe to select the third-axis centroid co-ordinate (Cen3) of the first clump (index==1) and store the numerical value in shell variable peak. The cmd= commands are executed in the order they appear. This could be extended to keep other columns with keepcols and return them in an array, here for the line with the most flux as given in column Sum.

set parline = `stilts tpipe in=cubeclump.FIT cmd='sort -down Sum' \
               cmd="select index==1" cmd='keepcols "Cen3 Width3"' \
               omode=out ofmt=ascii`
set centre = $parline[3]
set width = $parline[4]

Here tpipe sorts by flux, picks the clump with the most flux, and writes an ASCII catalogue containing the centroid and width along the third axis, storing the one-line catalogue in shell variable parline. In this format the names of the columns appear first, hence the required values are in the third and fourth elements. We could have used the csv-nohead output format to give a comma-separated list of values, as in the previous example, and split these with awk.

tpipe has many command options that, for example, permit the selection, addition, and deletion of columns, and the calculation of column statistics. The selections can be complex boolean expressions involving many columns. There are even functions to compute angular distances on the sky, say to select a spatial region. See the STILTS manual for many examples.

CUPID also includes a background-tracing and -subtraction application, findback. You may need to run this or mfittrend to remove the background before feature detection. Note that if you are applying findback to the spectra independently, the spectral axis must be first, even if this demands a re-orientation of the cube.

$KAPPA_DIR/permaxes in=scube out=scubeperm perm=[3,1,2]
$CUPID_DIR/findback in=scubeperm sub=no out=backperm box=[$box,1,1] \
                    ilevel=0 rms=$noise
$KAPPA_DIR/permaxes in=backperm out=back perm=[2,3,1]
$KAPPA_DIR/sub in1=cube in2=back out=cube_bs

In this example, the spectral axis is the third axis of NDF cube. findback removes structure smaller than $box pixels along each spectrum independently. The resulting estimated backgrounds for each spectrum are stored in NDF backperm, which is re-oriented to the original axis permutation to allow subtraction from cube.
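As mentioned above, the csv-nohead output format returns a comma-separated list that can be split with awk. A minimal sketch in Bourne shell (the input line is made up for illustration, not real stilts output):

```shell
# One line of csv-nohead output: centroid and width along axis 3 (illustrative).
line="345.7623,0.052"
centre=`echo "$line" | awk -F, '{print $1}'`
width=`echo "$line" | awk -F, '{print $2}'`
echo "centre=$centre width=$width"
```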

#### 5.3 Dealing with Graphical Devices

##### 5.3.1 Devices and Globals

Kappa (SUN/95) and other Starlink applications using PGPLOT graphics share a single graphics device for both line and image graphics. Devices can be specified using either PGPLOT or Starlink device names. The current device may be set using the gdset command, as in the example below.

% gdset xwindows

You can use the gdnames command to query which graphics devices are available. Your choice of device will remain in force, and can be inspected using the globals command, e.g.

% globals
The current data file is             : /tmp/ifu_data
The current graphics device is       : xwindows
The current lookup table file is     : <undefined>
The current transformation is        : <undefined>
The current interaction mode is      : <undefined>

unless unset using the noglobals command, or overridden using the DEVICE parameter in a specific application. Predictably, the gdclear command can be used to clear the graphics device.

More information on these and other graphics topics can be found in SUN/95.

##### 5.3.2 The Graphics Database

Each Starlink application which makes use of the standard Starlink graphics calls (which is most of them) creates an entry in the graphics database. This allows the applications to interact: for instance, you can display an image on an X-Window display device using the display command, and later query a pixel position using the cursor command.

The graphics database is referred to as the AGI database, after the name of the subroutine library used to access its contents, and exists as a file stored in your home directory. In most circumstances it will be named for the machine you are working on.

% ls ~/*.sdf
-rw-r--r--  1 aa   users  2083328 Jan 02 12:50  /home/aa/agi_pc10.sdf

An extensive introduction, sprinkled with tutorial examples, to making full use of the graphics database can be found in the Kappa manual (SUN/95) in the sections entitled The Graphics Database in Action and Other Graphics Database Facilities. There is little point in repeating that information here; however, learning to manipulate the graphics database provides you with powerful tools for visualising your IFU data, as well as letting you produce publication-quality plots. For instance, the compare shell script makes fairly trivial use of the graphics database to produce multiple image and line plots on a single GWM graphics device.

##### 5.3.3 Pseudo Colour and LUTs

The different display types, such as pseudo colour and true colour, are explained in detail in the Graphics Cookbook (SC/15).

On pseudo-colour displays Kappa uses a number of look-up tables (commonly referred to as LUTs) to manipulate the colours of your display. For instance, you may want to have your images displayed in grey scale (lutgrey) or using a false-colour ‘heat’ scale (lutheat). Kappa has many applications to deal with LUTs (see SUN/95); these can easily be identified as they all start with “lut”, e.g. lutable, lutcol.

#### 5.4 Dealing with WCS Information

World Co-ordinate System (i.e. real world co-ordinates) information is a complex topic, and one that many people, including the author, find confusing at times.

Starlink applications usually deal with WCS information using the AST subroutine library (see SUN/210 for Fortran and SUN/211 for C language bindings), although notably some parts of FIGARO (such as SPECDRE) have legacy and totally independent methods of dealing with WCS information. See Section 6.6 for an example of how to overcome this interoperability issue.

This general approach has an underlying effect on how Starlink applications look at co-ordinate systems and your data in general.

Starlink applications therefore tend to deal with co-ordinate ‘Frames’. For instance, the PIXEL Frame is the Frame in which your data are considered as physical pixels with a specified origin; for a simple two-dimensional example, your data may have an $x$ size of 100 pixels and a $y$ size of 150 pixels with the origin of the Frame at the co-ordinates (20,30). Another Frame is the SKY Frame, which positions your image on the real sky, normally in right ascension and declination, although other sky co-ordinate systems are available and easily transformed. A ‘mapping’ between these two Frames will exist, and will be described, inside the WCS extension of your NDF. The Kappa wcscopy application can be used to copy the WCS component from one NDF to another, optionally introducing a linear transformation of pixel co-ordinates in the process. This can be used to add WCS information back into an NDF which has been stripped of it by non-WCS-aware applications. Further details about WCS Frames are available in SUN/95, in the section Using World Co-ordinate Systems.

Why is this important? Well, for instance, the display command will automatically plot your data with axes annotated with co-ordinates described by the current WCS Frame, so if your data contains a SKY Frame it can (and much of the time will) automatically be plotted annotated with the real sky co-ordinates, usually right ascension and declination, of the observation. It is also critical for mosaicking of data cubes, as explained later in Section 5.9.

Both Kappa (SUN/95) and Ccdpack contain commands to handle WCS NDF extensions. In Kappa we have the following applications.

• wcsadd
Adds a new co-ordinate Frame into the WCS component of an NDF
• wcsattrib
Manages attribute values associated with the WCS component of an NDF
• wcscopy
Copies WCS information from one NDF to another
• wcsframe
Changes the current co-ordinate Frame in the WCS component of an NDF
• wcsremove
Removes co-ordinate Frames from the WCS component of an NDF
• wcstran
Transforms a position from one NDF co-ordinate Frame to another

while Ccdpack has

• wcsedit
Modifies or examines image co-ordinate system information

which is a very useful utility for handling frames within the extension. For instance,

% wcsedit ifu_file

WCSEDIT
=======
1 NDF accessed using parameter IN
MODE - Action to perform (CURRENT,ADD,REMOVE,SET,SHOW) /’SHOW’/ >

Index Cur  Domain            Title
----- ---  ------            -----

ifu_file:
1        GRID              Data grid indices; first pixel at (1,1,1)
2        PIXEL             Pixel coordinates; first pixel at (0.5,0...
3    *   AXIS              Axis coordinates; first pixel at (-1.812...

%

Here we see that ifu_file.sdf has three WCS Frames: the base GRID Frame with origin (1,1,1), a PIXEL Frame with origin (0.5,0.5,0.5), and an AXIS Frame with real-world co-ordinates mapped on to the PIXEL Frame.

ndftrace also contains a useful option at this point: FULLFRAME. In the following example, most of the ndftrace output is excised for clarity, denoted by a vertical ellipsis.

% ndftrace mms6_co fullframe

NDF structure /data/mjc/datacube/mms6_co:
.
.
.

Shape:
No. of dimensions:  3
Dimension size(s):  5 x 5 x 83
Pixel bounds     :  -2:2, -2:2, -54:28
Total pixels     :  2075
.
.
.

World Coordinate Systems:
Number of coordinate Frames: 4

Current coordinate Frame (Frame 4):
Index               : 4

Frame title         : "Compound coordinates describing celes..."
Domain              : SKY-SPECTRUM
First pixel centre  : 5:36:52.8, -7:26:11, 345.7623

Axis 1:
Label: Right ascension
Units: hh:mm:ss.s

Axis 2:
Label: Declination
Units: ddd:mm:ss

Axis 3:
Label: Frequency (LSB)
Units: GHz
.
.
.

NDF Bounding Box:

Axis 1: 5:36:51.0 -> 5:36:53.0
Axis 2: -7:26:14 -> -7:25:44
Axis 3: 345.762 -> 345.8139

The important information here is the boundary of the image in the PIXEL and CURRENT Frames of the NDF, the CURRENT Frame being the ‘default’ Frame from which applications accessing the NDF will report co-ordinates (the CURRENT Frame of the NDF can be changed using the Kappa wcsframe command and is usually the last-accessed Frame in the extension). In this case we can see that the current Frame is the SKY-SPECTRUM Frame, with extent 5:36:51.0 to 5:36:53.0 in right ascension, $-$7:26:14 to $-$7:25:44 in declination, and 345.762 to 345.8139 GHz in frequency.
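For instance, a sketch of switching the current Frame with wcsframe before and after an operation, assuming Kappa is initialised and using the NDF from the example above:

```shell
# Make the PIXEL Frame current, so applications report pixel co-ordinates.
wcsframe ndf=mms6_co frame=pixel

# ...run applications that should report PIXEL co-ordinates here...

# Restore the compound sky/spectral Frame as the current Frame.
wcsframe ndf=mms6_co frame=sky-spectrum
```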

#### 5.5 GAIA data visualisation

Gaia is a display and analysis tool widely used for two-dimensional data. For many years the Gaia tool had little to offer for cube visualisation. At the appropriately named Version 3.0 came a toolbox for cubes to permit inspection of planes individually, or as an animation, or as a passband image formed by collapsing. Now, at the time of writing, Version 4.2 of Gaia offers further facilities, such as rendering, for cube analysis. Gaia is also under active development. Check the \$STARLINK_DIR/news/gaia.news file for the latest features extending upon those summarised below.

##### 5.5.1 Cube toolbox

If you start up Gaia with a cube

% gaia orion_mos_msk.sdf &

the cube toolbox appears (see Figure 2). In the upper portion you may choose the axis perpendicular to the axes visible in the regular two-dimensional display; this will normally be the spectral axis, and that is what is assumed in this manual. You can select a plane to view in the main display by giving its index or adjusting the slider.

The lower portion of the toolbox has a tabbed interface to a selection of methods. The controls change for each method. Of particular note are the following methods.

Animation tab
This steps through a range of planes automatically at a controlled rate. The animation can be captured to a GIF.
Spectrum tab
This extends the basic behaviour of the spectrum plot (see Section 5.5.2), including defining the spectral limits. You can also define spatial regions graphically, and thereby form a composite spectrum for an object.
Collapse tab
This allows you to collapse along the spectral axis over a defined range, and offers a wide selection of combination options. Most will form a ‘white-light’ image in some form, but Iwc gives a form of velocity map, and Iwd estimates the line dispersions.
Chanmap tab
This forms a channel map comprising a grid of passband tiles, each tile being the result of collapsing a range of planes using one of the Collapse methods. You can inspect the spectral co-ordinate of each plane and with the cursor mark equivalent positions in every tile.
Rebin tab
This allows you to increase the signal-to-noise by rebinning along one or more dimensions.
Filter tab
This controls smoothing of image planes.

If you need to access the Cube toolbox, go to the Open cube... option in the File menu of the main display.

Once you have an image displayed, be it a plane, passband image, or channel map, you can then use Gaia’s wide selection of image-display facilities described in SUN/214 (also see SC/17) to enhance the elements under investigation. There are also many analysis capabilities, mostly through the Image-Analysis menu, including masking and flux measurement.

##### 5.5.2 Spectral plot

One of the most useful facilities for cube analysis is a dynamic spectrum display. Once you have a representative image displayed, click with the mouse over the image and a line plot along the spectral axis appears. As you drag the cursor holding down the first mouse button, the spectrum display updates dynamically to reflect the current spatial location, while the data limits are unchanged. If you click again the range will reset to the current spectrum. You can also enforce autoscaling as you drag, by enabling Options$\to$Autoscale in the spectral plot’s control bar. This is slower, although it allows an intensity-independent comparison of the spectra. The Spectrum tab mentioned earlier also allows control of the plotting range along both axes. The Reference: Set button lets you nominate the current spectrum as a reference spectrum. Then, as you subsequently drag the mouse, you can compare any of the spectra with the reference spectrum. See Figure 3.

The vertical red line shows the plane being displayed in the main viewer. You can drag that line with the mouse to adjust the plane on view, say to inspect where emission at a chosen velocity lies spatially.

Further features of the spectral viewer can be found in the online help.

##### 5.5.3 Volume Visualisation

The View menu in the cube toolbox offers two interactive rendering functions that let you explore your cube in three dimensions.

The first of these is iso-surface. It’s akin to contouring of an image extended to three dimensions. Each coloured surface corresponds to a constant data value. In order to see inside a surface, each surface should have an opacity that is less than 1.0. An example using the same Orion dataset is in Figure 4.

Just as with the contouring in Gaia, there are different methods to generate iso-surface levels automatically, as well as manually.

You can adjust the viewing angle and zoom factor either with the mouse, or with the keyboard for finer control. The online help lists the various controls. There are many options to control the appearance, such as directional and annotated axes. Gaia can also display the current slice and displayed spectrum. It is also possible to display two cubes simultaneously, say to compare data from different wavelengths measuring different molecular species. Gaia provides a number of alignment options in this regard.

The second function is volume rendering. It displays all the data within two data limits as a single volume. You assign a colour and opacity to each limit. The controls and options are the same as for the iso-surfaces. See Figure 5.

These visualisation functions place heavy demands on computer memory and CPU. Also a modern graphics card with hardware support for OpenGL makes a huge difference in interactive performance. So you will need a modern machine to get the best out of these tools. Of the two tools, iso-surface is quicker to render and uses less memory.

#### 5.6 IDL and data visualisation

IDL has extensive visualisation capabilities and has many of the tools needed to analyse IFU data cubes available ‘off the shelf’. Unfortunately, due to the large file sizes involved, some of the more useful tools available in IDL can be very slow on machines with small amounts ($<512$ Mb) of memory.

##### 5.6.1 Display problems

Like many modern UNIX applications, IDL suffers problems coping with pseudo-colour and true-colour displays. When writing IDL scripts it is important to bear in mind the display type you are using. For pseudo-colour (8-bpp) displays you should set the X device type as follows

device, pseudo_color = 8

while for true-colour displays, commonly found on modern machines running Linux, you should set the X device type to have the appropriate display depth, e.g. for a 24-bpp display.

device, true_color = 24

It should be noted that while the IDL Development Environment (IDLDE) will run in UNIX on a 16-bpp display, only 8-bpp and 24-bpp displays are supported for graphics output. If your IDL script or procedure involves graphics display you must run IDL either under an 8-bpp pseudo-colour display, or a 24-bpp true-colour display. Ask your system administrator if you are in any doubt as to whether your machine is capable of producing a 24-bpp true-colour display.

Using a true-colour display, if you wish to make use of the colour look-up tables (LUTs) and the loadct procedure, you should also set the decomposed keyword to $0$, e.g.

device, true_color = 24
device, decomposed = 0

or alternatively, if you wish to make use of 24-bit colour rather than the LUTs, then you should set the decomposed keyword to $1$, e.g.

device, true_color = 24
device, decomposed = 1

##### 5.6.2 Slicer3

Slicer3 is a GUI widget-based application for visualising three-dimensional data, which comes with the IDL environment. A simple script to read an IFU data cube into the GUI is shown below.

pro display_slicer

; Read in TEIFU data cube

; create a pointer to the data array
ifu_ptr = ptr_new(ifu_data)

; run slicer3
slicer3, ifu_ptr
end

Here we create a pointer to our data array and pass this pointer to the slicer3 procedure. The Slicer3 GUI is shown in Figures 6 and 7; these display a projection of an IFU data cube, and the user operating on it with the cut and probe tools.

One of the interesting things about the Slicer3 GUI is that it is entirely implemented as an IDL procedure and the code can therefore be modified by the user for specific tasks. More information on Slicer3 can be found in the online help in IDL.

##### 5.6.3 The IDL Astronomy Library

No discussion of astronomy data visualisation in IDL can be complete without reference to the IDL Astronomy Library. This IDL procedure library, which is maintained by the GSFC, provides most of the necessary tools to handle your data inside IDL. The library is quite extensive, and fairly well documented. A list of the library routines, broken down by task, can be found at http://idlastro.gsfc.nasa.gov/contents.html.

##### 5.6.4 ATV Image Viewer

ATV is a front end for the IDL tv procedure. Like the Slicer3 GUI discussed previously, ATV is entirely implemented as an IDL procedure, so it is simple to add new routines, buttons, or menus if additional functionality is needed. The interface deliberately resembles SAOimage so that users can quickly start using it. Detailed usage instructions are available online at http://www.physics.uci.edu/~barth/atv/instructions.html. However, to display a plane in a data cube you can pass an array directly to atv as follows.

; display plane i in the data cube
atv, array(*,*,i)

#### 5.7 IRAF and the Starlink software

Most of the packages we have discussed, e.g. Kappa (SUN/95), FIGARO, and Ccdpack, are available from the IRAF command-line interface (up to the 2004 Spring release) and can be used just like normal IRAF applications (see SUN/217 for details). IRAF CL scripts can be built around them, as you would expect, allowing you to analyse your IFU data cubes using their capabilities.

However, it should be noted that Starlink and IRAF applications use intrinsically different data formats. When a Starlink application is run from the IRAF CL, the application will automatically convert to and from the IRAF .imh format on input and output. This process should be transparent, and you will only see native IRAF files. However, you should be aware, if you are used to using the Starlink software, that the native NDF format is more capable than the IRAF format and some information (such as quality and variance arrays) may be lost when running the Starlink software from IRAF.

#### 5.8 Visualisation using the DATACUBE scripts

The scripts shipped within the DATACUBE package are described in SUN/237. The example dataset has a spectral axis in the wavelength system with units of Ångstrom, however DATACUBE can handle other spectral systems and units, as supported by the FITS standard.

##### 5.8.1 How do I create a white-light image?

You can make use of the DATACUBE squash shell script, a user-friendly interface to the Kappa collapse application, allowing you to create both white-light and passband images, e.g.

% squash -p
NDF input file: ifu_file
Shape:
No. of dimensions: 3
Dimension size(s): 59 x 110 x 961
Pixel bounds     : 1:59, 1:110, 1:961
Total pixels     : 6236890
Wavelength bounds : 4526:10089.8

Lower Wavelength-axis bound : 4526
Upper Wavelength-axis bound : 10089.8

Collapsing:
White-light image: 59 x 110
Wavelength range: 4526--10089.8 Angstrom

NDF output file: out

Output NDF:
File: out.sdf
%

Here we make a white-light image from the input data file ifu_file.sdf, saving it as a two-dimensional NDF file out.sdf as well as plotting it in a GWM window (see Figure 9). Alternatively, we can make use of the script's command-line options and specify the input and output files, along with the wavelength bounds, on the command line, as in this example.

% squash -i ifu_file -o out -l 4526 -u 10089.8 -p

Alternatively we can make direct use of the collapse application.

% collapse
IN - Input NDF /@/tmp/aa_squash_collapse/ > ifu_file
AXIS - The axis to be collapsed /’1’/ > 3
Collapsing pixel axis 3 from pixel 1 to pixel 961 inclusive...
OUT - Output NDF > out
%

Here we collapse the cube along the third ($\lambda$) axis.
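
In essence, what collapse is doing here is summing the cube along its third axis. If your cube were held as a simple array, the same operation is one line of NumPy; the dimensions below are made up for illustration:

```python
import numpy as np

# Hypothetical (x, y, lambda) cube: 4 x 3 spatial pixels, 5 spectral pixels
cube = np.arange(4 * 3 * 5, dtype=float).reshape(4, 3, 5)

# Collapse along the third (spectral) axis to form a white-light image
white_light = cube.sum(axis=2)

print(white_light.shape)  # (4, 3)
```

The real collapse task offers other combination estimators (e.g. the mean), but the principle is the same.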

##### 5.8.2 How do I create a passband image?

The DATACUBE package offers two ways to create passband images. First, we may use (as before) the squash shell script, this time specifying more-restrictive $\lambda$ limits, e.g.

% squash -i ifu_file -o out -l 5490 -u 5690 -p

would create a two-dimensional passband image, collapsing a 200 Å-wide section of the spectral axis.
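
The underlying operation is the same collapse, restricted to a wavelength range. A NumPy sketch, assuming a linear wavelength axis (all values illustrative):

```python
import numpy as np

# Hypothetical cube and linear wavelength axis (Angstrom)
cube = np.ones((4, 3, 100))
wavelength = np.linspace(4526.0, 10089.8, 100)

def passband_image(cube, wavelength, lo, hi):
    """Sum the cube over spectral pixels whose wavelength lies in [lo, hi]."""
    mask = (wavelength >= lo) & (wavelength <= hi)
    return cube[:, :, mask].sum(axis=2)

img = passband_image(cube, wavelength, 5490.0, 5690.0)
print(img.shape)  # (4, 3)
```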

Alternatively, we may choose to generate our passband image interactively using the passband shell script.

% passband
NDF input file: ifu_file

Input NDF:
File: ifu_file.sdf
Shape:
No. of dimensions: 3
Dimension size(s): 59 x 110 x 961
Pixel bounds     : 1:59, 1:110, 1:961
Total pixels     : 6236890
Collapsing:
White-light image: 59 x 110

Left click to extract spectrum.

Extracting:
(X,Y) pixel             : 32,71
NDF array analysed        : DATA

Pixel sum              : 47734.279694
Pixel mean             : 49.671466903226
Standard deviation     : 62.703311643991
Minimum pixel value    : -21.781915
At pixel            : (31)
Co-ordinate         : (4697.297)
Maximum pixel value    : 916.4472266
At pixel            : (174)
Co-ordinate         : (5526.027)
Total number of pixels : 961
Number of pixels used  : 961 (100.0%)

Zoom in (yes/no): yes

Left click on lower zoom boundary.
Left click on upper zoom boundary.

Zooming:
Lower Boundary: 5237.92
Upper Boundary: 5756.03

Left click on lower boundary.
Left click on upper boundary.

Passband:
Lower Boundary: 5498.83
Upper Boundary: 5561.82
Collapsing:
White-light image: 59 x 110
Wavelength range: 5498.83--5561.82 Angstrom
Plotting:
Left: White-light image.
Right: Passband image (5498.83--5561.82 Angstrom)
%

Here the script presents us with a white-light image and prompts us to click on it to select a spectrum with good signal-to-noise. It then asks us whether or not we want to zoom in on a certain part of the spectrum. If we zoom, the script then allows us to interactively select a region to extract to create a passband image. It then plots this next to the white-light image for comparison.

Alternatively, we can again make direct use of the collapse application, upon which both the squash and passband scripts have been built, as shown below.

% collapse in=ifu_file out=out axis=3 low=5490 high=5560

##### 5.8.3 How do I step through passband images?

The DATACUBE package provides the step shell script to carry out this task.

% step
NDF input file: ifu_file
Input NDF:
File: ifu_file.sdf
Shape:
No. of dimensions: 3
Dimension size(s): 21 x 21 x 961
Pixel bounds     : 20:40, 60:80, 1:961
Total pixels     : 423801
Wavelength bounds : 4526:10089.8 Angstrom

Lower Wavelength-axis bound: 5000
Upper Wavelength-axis bound: 8000
Wavelength step size: 1000

Stepping:
Range: 5000--8000 Angstrom
Step: 1000

Collapsing:
White-light image: 21 x 21
Wavelength range: 5000--6000 Angstrom
Output NDF:
File: chunk_1.sdf
Title: Setting to 5000--6000

Collapsing:
White-light image: 21 x 21
Wavelength range: 6000--7000 Angstrom
Output NDF:
File: chunk_2.sdf
Title: Setting to 6000--7000 Angstrom

Collapsing:
White-light image: 21 x 21
Wavelength range: 7000--8000 Angstrom
Output NDF:
File: chunk_3.sdf
Title: Setting to 7000--8000 Angstrom
%

Here we are asked for the lower and upper bounds of the desired wavelength range, and a step size. The script then generates a series of two-dimensional NDF passband images named chunk_*.sdf.
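
The stepping itself is just a loop of such collapses over abutting wavelength chunks. A NumPy sketch with made-up dimensions (note that this simple version excludes the upper bound of the final chunk):

```python
import numpy as np

# Hypothetical cube and linear wavelength axis
cube = np.ones((4, 3, 30))
wavelength = np.linspace(5000.0, 8000.0, 30)

# Step through the spectral axis in fixed-width chunks, collapsing each
step = 1000.0
chunks = []
lo = 5000.0
while lo < 8000.0:
    mask = (wavelength >= lo) & (wavelength < lo + step)
    chunks.append(cube[:, :, mask].sum(axis=2))
    lo += step

print(len(chunks))  # 3 chunk images, as in the transcript above
```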

Alternatively, a very simplistic IDL script to step through a TEIFU data cube (stored in an NDF called ifu_file.sdf) is shown below.

pro step

; Read in the TEIFU data cube with the Starlink READ_NDF procedure
ifu_data = read_ndf('ifu_file')

; Create a window of the right size
window, xsize=236, ysize=440

; Step through the data in the lambda direction
for i = 0, 960 do begin
   tvscl, congrid(ifu_data(*,*,i), 236, 440)
endfor

end

The script reads the NDF file in using the READ_NDF procedure, creates an IDL graphics window, and steps through the data cube in the wavelength direction a pixel at a time.

There is now a Kappa chanmap task that forms an image of abutting channels or passbands of equal depth. It is also available from the cube toolbox in Gaia. Once created, you may view the passband image with Gaia or display. To read off the three-dimensional co-ordinates of features, in Gaia use Image-Analysis$\to$Change Coordinates$\to$Show All Coordinates…, and in Kappa run cursor.

##### 5.8.4 How do I extract individual spectra?

The ripper shell script in the DATACUBE package was designed as a user-friendly interface over the Kappa ndfcopy application.

% ripper -p
NDF input file: ifu_file
Input NDF:
File: ifu_file.sdf
Shape:
No. of dimensions: 3
Dimension size(s): 59 x 110 x 961
Pixel bounds     : 1:59, 1:110, 1:961
Total pixels     : 6236890
Collapsing:
White-light image: 59 x 110

Left click to extract spectrum.

Extracting:
(X,Y) pixel: 31,73

NDF output file: out

Output NDF:
File: out.sdf

Here we read in the data cube, ifu_file.sdf, and are prompted to click on a pixel to extract the spectrum, see Figure 12.

Alternatively we can use ndfcopy directly as underneath.

% ndfcopy in="ifu_file(31,73,)" out=out trim trimwcs

Here we extract the same spectrum by using an NDF section to specify the region of interest, and the TRIM and TRIMWCS parameters to reduce the dimensionality of the file to one dimension.
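
In array terms the extraction is simply an index into the spatial plane. A NumPy sketch (dimensions made up; note that NDF pixel indices are 1-based while NumPy's are 0-based):

```python
import numpy as np

# Hypothetical (x, y, lambda) cube
cube = np.arange(4 * 3 * 5, dtype=float).reshape(4, 3, 5)

# Extract the spectrum at NDF pixel (3,2); the -1 converts to 0-based indices
x, y = 3, 2
spectrum = cube[x - 1, y - 1, :]

print(spectrum.shape)  # (5,) -- a one-dimensional spectrum
```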

##### 5.8.5 How do I compare spectra?

The compare script was written to give you this capability. It allows you to continually select spectra from different parts of the cube, plotting the two most recent to the right of a white-light image of the cube. See Figure 13.

For instance,

% compare
NDF input file: ifu_file

Input NDF:
File: ifu_file.sdf
Shape:
No. of dimensions: 3
Dimension size(s): 59 x 110 x 961
Pixel bounds     : 1:59, 1:110, 1:961
Total pixels     : 6236890
Collapsing:
White-light image: 59 x 110

Left click to extract a spectrum.
Right click to exit program.

Extracting:
(X,Y) pixel             : 31,77
NDF array analysed        : DATA

Pixel sum              : 37022.3614004
Pixel mean             : 38.524829761082
Standard deviation     : 51.712134738114
Minimum pixel value    : -11.8406626
At pixel            : (7)
Co-ordinate         : (4558.209)
Maximum pixel value    : 927.2638094
At pixel            : (174)
Co-ordinate         : (5526.027)
Total number of pixels : 961
Number of pixels used  : 961 (100.0%)

Left click to extract a spectrum.
Right click to exit program.

Extracting:
(X,Y) pixel            : 40,36
NDF array analysed        : DATA

Pixel sum              : 7506.1577888
Pixel mean             : 7.8107781361082
Standard deviation     : 8.9202431481454
Minimum pixel value    : -38.7883342
At pixel            : (133)
Co-ordinate         : (5288.419)
Maximum pixel value    : 47.7443282
At pixel            : (670)
Co-ordinate         : (8400.505)
Total number of pixels : 961
Number of pixels used  : 961 (100.0%)

Left click to extract a spectrum.
Right click to exit program.
%

Here the script presents us with a white-light image and asks us to click on a pixel. It then extracts and displays the spectral axis associated with that pixel in the upper right of the display window. We then have the opportunity to select another pixel, the corresponding spectrum being displayed in the lower right of the display window. We extract only two spectra during this run of the script (Figure 13), pressing the right-hand mouse button to terminate. However, if we carried on and selected a further spectrum it would replace our original in the upper-right panel of the display window. Selecting a further spectrum would replace the lower-right panel. The location of each new spectrum plot alternates.

Also see how Gaia compares spectra in Section 5.5.2.

##### 5.8.6 How do I plot stacked spectra?

The DATACUBE package provides two tasks to carry out this process, the stacker (see Figure 14) and multistack shell scripts.

A run of the stacker script is shown below.

% stacker
NDF input file: ifu_file

Input NDF:
File: ifu_file.sdf
Shape:
No. of dimensions: 3
Dimension size(s): 59 x 110 x 961
Pixel bounds     : 1:59, 1:110, 1:961
Total pixels     : 6236890
Collapsing:
White-light image: 59 x 110

Number of spectra: 3

Left click on pixel to be extracted.

Extracting:
(X,Y) pixel: 24,80

Left click on pixel to be extracted.

Extracting:
(X,Y) pixel: 30,71

Left click on pixel to be extracted.

Extracting:
(X,Y) pixel: 30,55

Offset: 200

Plotting:
Spectrum: 3
Spectrum: 2
Spectrum: 1

Zoom in (yes/no): yes

Left click on lower zoom boundary.
Left click on upper zoom boundary.

Zooming:
Lower Boundary: 6792.26
Upper Boundary: 7745.58

Plotting:
Spectrum: 3
Spectrum: 2
Spectrum: 1
%

Here we extract three spectra by clicking on a white-light image of the data cube, and these are then plotted with an offset of 200 counts between each spectrum. We then get the opportunity to zoom into a region of interest, and the three spectra are then re-plotted.
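
The offsetting that stacker applies before plotting amounts to adding a constant multiple of the offset to each spectrum. In NumPy notation (values illustrative):

```python
import numpy as np

# Three hypothetical spectra (one per row) and a constant offset between them
spectra = np.zeros((3, 100))
offset = 200.0

# Add i * offset to spectrum i so they plot as a stack without overlapping
stacked = spectra + offset * np.arange(3)[:, np.newaxis]

print(stacked[:, 0])  # offsets of 0, 200, and 400 counts
```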

The multistack script operates in a similar manner; however, here we are prompted for the number of spectral groups required, and the number of spectra in each group. The mean spectrum for each ‘group’ of spectra is calculated, and then all the mean spectra are plotted in a stack as before, as seen in the following example.

% multistacker
NDF input file: ifu_file

Input NDF:
File: ifu_file.sdf
Shape:
No. of dimensions: 3
Dimension size(s): 59 x 110 x 961
Pixel bounds     : 1:59, 1:110, 1:961
Total pixels     : 6236890
Collapsing:
White-light image: 59 x 110

Number of groups: 3

Number of spectra in group: 4

Left click on pixel to be extracted.
.
.
.

Here we request three groups of four spectra each, i.e. we’ll get three mean spectra plotted as a stack in the final display.
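
The averaging that multistack performs can be sketched as a reshape-and-mean over the groups (dimensions made up):

```python
import numpy as np

# Twelve hypothetical spectra: three groups of four, as in the example run
spectra = np.arange(12 * 50, dtype=float).reshape(12, 50)
groups = spectra.reshape(3, 4, 50)

# Mean spectrum of each group, ready to be stacked and plotted
mean_spectra = groups.mean(axis=1)

print(mean_spectra.shape)  # (3, 50)
```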

##### 5.8.7 How do I create a grid of spectra?

For an overview visual inspection of the spectra in a cube, it is useful to plot many spectra simultaneously, albeit at a lower resolution, in their respective spatial locations. While Kappa provides the clinplot command to make such a grid, sometimes the sheer number of spatial pixels can make the spectra unreadable, and the plot will take some time to draw. Therefore the DATACUBE package offers the gridspec shell script. It has an option to average the spectra in the spatial domain, thereby reducing the number of spectra plotted severalfold and generating more practical graphics more quickly. See Figure 15. Below is an example. The -z option requests that the white-light image be shown and a subset of the cube selected with the cursor. The -b option sets the spatial blocking factor. Different factors may be given for $x$ and $y$, the two values separated by a comma.

% gridspec -b 4 -i ifu_file -z

Input NDF:
File: ifu_file.sdf
Shape:
No. of dimensions: 3
Dimension size(s): 59 x 110 x 961
Pixel bounds     : 1:59, 1:110, 1:961
Total pixels     : 6236890

Collapsing:
White-light image: 59 x 110

Left click on lower zoom boundary.
Left click on upper zoom boundary.

Zooming:
Extracting:
Lower (X,Y): 20,61
Upper (X,Y): 42,82

Plotting:
Clinplot: Grid of spectra, averaged 4 x 4

A useful strategy is to select a blocking factor that gives no more than ten plots2 along an axis. Then focus on regions of interest—you either supply an NDF section or select from the white-light image—by decreasing the blocking, and thus increasing both the spatial and spectral plotting resolutions.
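
The spatial blocking that gridspec applies is a block average over the two spatial axes, leaving the spectral axis untouched. A NumPy sketch with a blocking factor of 4 (dimensions made up):

```python
import numpy as np

# Hypothetical cube whose spatial axes are divisible by the blocking factors
cube = np.ones((8, 8, 20))
bx = by = 4  # spatial blocking factors

nx, ny, nl = cube.shape
# Average over bx x by spatial blocks; the spectral axis is untouched
blocked = cube.reshape(nx // bx, bx, ny // by, by, nl).mean(axis=(1, 3))

print(blocked.shape)  # (2, 2, 20)
```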

##### 5.8.8 How do I create a velocity map?

The DATACUBE package provides the velmap and velmoment shell scripts to manage this fairly complex task. velmap fits Gaussians to a selected line, and involves some graphical interaction to select the template spectrum and initial fitting parameters. For data well characterised by a Gaussian, velmap can produce excellent results. However, it is an expensive algorithm to apply to each spatial pixel, and Gaussian fitting is not appropriate for all data.

The alternative offered, velmoment, collapses along the spectral axis by deriving the intensity-weighted mean co-ordinate, and converts this to a velocity. This is turbo-charged compared with fitting. The downside is that the results will not be as accurate as line fitting, and care must be taken to select regions of the spectra populated by a single significant emission line.

###### By fitting

velmap allows you to select the highest signal-to-noise spectrum from a white-light image. You can then interactively fit a line in this spectrum. The script will attempt to automatically fit the same line in all the remaining spectra in the cube, calculate the Doppler velocity of this line from a rest-frame co-ordinate that you provide or that is read from the NDF WCS, and create a velocity map of that line in the cube. See Figure 16. Below is an example.

% velmap -v -p
NDF input file: section

Input NDF:
File: section.sdf
Shape:
No. of dimensions: 3
Dimension size(s): 21 x 21 x 961
Pixel bounds     : 20:40, 60:80, 1:961
Total pixels     : 423801
Collapsing:
White-light image: 21 x 21

Left click on pixel to be extracted.

Extracting:
(X,Y) pixel: 31,72
Variances: Extension present.

Zoom in (yes/no): yes

Left click on lower zoom boundary.
Left click on upper zoom boundary.

Zooming:
Lower Boundary: 5289.73
Upper Boundary: 5714.58

Left click on the lower limit of the fitting region.
Left click on the upper limit of the fitting region.

Left click on your first estimate of the continuum.
Left click on your second estimate of the continuum.

Continuum:
First Estimate: 34.2915
Second Estimate: 10.2744
Average Value: 22.5659

Left click on the line peak.

Line Position:
Peak Position: 5526.29
Peak Height: 979.966

Left click on the left-hand edge of the FWHM.
Left click on the right-hand edge of the FWHM.

FWHM:
Lower Bound: 5518.38
Upper Bound: 5535.79
FWHM: 17.41

Rest Wavelength: 5007

Rest Wavelength:
Wavelength: 5007 Angstrom
Fitting:
Centre Position: 5527.768 +- 0.7394548E-01
Peak Height: 1020.167 +- 11.59994
FWHM: 13.28973 +- 0.1748629
Line integral: 14431.77 +- 163.9841

Fit okay (yes/no): yes

NDF output file: output
Fitting:
Spectrum at (20,60): 5530.940 +- 0.5387596
31392.451 +- 32.26783 km/s
Spectrum at (21,60): 5531.210 +- 0.5329667
31408.628 +- 31.94727 km/s
.
.
.
Spectrum at (39,80): 5526.000 +- 0.7465050E-01
31096.465 +- 4.475734 km/s
Spectrum at (40,80): 5526.596 +- 0.8262261E-01
31132.175 +- 4.943080 km/s

Output NDF:
Converting: Creating NDF from data.
Origin: Attaching origin (20,60).
Converting: Attaching VARIANCE array.
Axes: Attaching AXIS extensions.
WCS: Attaching WCS information.
Title: Setting title.

Plotting:
Display: Velocity map using percentile scaling.
Contour: White-light image with equally spaced contours.

Refit points (yes/no): no
%

As no automatic process is ever perfect, the script allows you to manually refit spectra where it had difficulties in obtaining a fit. If the script was unable to fit an ($x$,$y$) position, this value will be marked as VAL__BADD in the final velocity map. Should you choose to Refit points, clicking on these (normally black) data points in the final output image will drop you into an interactive fitting routine. However, you are not restricted to refitting just those points where the script was unable to obtain a fit; you may manually refit any data point in the velocity map.

There is a -a option that lets you review the fit at each spatial pixel. The fit parameters can be logged to a Small Text List with the -l option.

###### By moments

The second script for generating a velocity map is velmoment. This first allows you to select a region of interest. For large regions or noisy spectra you can also request spatial averaging (-b option) to reduce the spatial dimensions by integer factors. If your dataset has a WCS SPECTRUM or DSBSPECTRUM Domain, velmoment then switches co-ordinate system to one of four velocities, such as optical or radio. It can also cope with NDFs in the UK data-cube format. Then comes the heart of the script: the collapse task acting upon the spectral axis to derive the intensity-weighted velocity moment. For simple spectra, i.e. a single or dominant emission line, collapse finds a representative velocity for the line. The final stage is to display the map of the velocities, using a colour table that runs from blue to red to give a visual clue to the relative motions. You may choose to superimpose optional contours of the white-light image upon the velocity map.

Here is an example. Rather than supply a spatial subset as in the velmap example, we choose to select the spatial region for analysis by interacting with a 2$\times$2 spatially averaged (-b 2) ‘white-light’ image that the script presents. The wavelength bounds in the NDF section 5300.:5750. restrict the spectral compression to the range 5300 to 5750 Ångstrom and bracket the [OIII] line. This means that the ‘white-light’ image is effectively a map of the [OIII] emission. The script converts the intensity-weighted wavelengths into velocities, which it displays with a key and a suitable colour map. Finally, it overlays a ten-level contour plot of the [OIII] map on the velocity map. See Figure 17.

% velmoment -b 2 -i ifu_file"(,,5300.:5750.)" -r 5007 -p -c 10

Input NDF:
File: ifu_file.sdf
Shape:
No. of dimensions: 3
Dimension size(s): 59 x 110 x 79
Pixel bounds     : 1:59, 1:110, 135:213
Total pixels     : 512710

Display white-light image to select subset (yes/no): y
Collapsing:
White-light image: 59 x 110

Left click on lower zoom boundary.
Left click on upper zoom boundary.

Zooming:
Extracting:
Lower (X,Y): 14,56
Upper (X,Y): 46,97

Rest Wavelength :
Wavelength  : 5007 Angstrom

NDF output file: moment

Collapsing:
Intensity-weighted co-ordinate image: 16 x 21
Plotting:
Display: Velocity map using percentile scaling.
Contour: White-light image with equally spaced contours.
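
The intensity-weighted moment that collapse computes here can be sketched in a few lines of NumPy. The spectrum below is made up, chosen only to mimic the [OIII] example above:

```python
import numpy as np

c = 299792.458   # speed of light in km/s
rest = 5007.0    # [OIII] rest wavelength in Angstrom

# One made-up spectrum over the 5300--5750 Angstrom range used above
wavelength = np.linspace(5300.0, 5750.0, 80)
flux = np.exp(-0.5 * ((wavelength - 5527.0) / 6.0) ** 2)

# First moment: the intensity-weighted mean wavelength of the line
centroid = (flux * wavelength).sum() / flux.sum()

# Optical-convention Doppler velocity relative to the rest wavelength
velocity = c * (centroid - rest) / rest

print(velocity)  # roughly 31 000 km/s, of the same order as the velmap fits
```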

##### 5.8.9 How do I create a line-strength map?

The DATACUBE peakmap script will generate a line-strength map. The interface to this script is very similar to that of the velmap script discussed in Section 5.8.8, and it also generates final output in a similar form, see Figure 18. Much like the velmap script the peakmap script allows you to manually refit any spectrum that you think may have been poorly fitted by the automatic process.

It should be noted that generating a passband image of a line region, and a line-strength map using the peakmap script, should yield similar results. If you are worried about how well the automatic fitting of Gaussians is performing on a particularly noisy image, then generating a line-strength map and making this comparison is an easy way of deciding a level of trust in your velocity maps, as these are generated using the same fitting algorithms.

##### 5.8.10 But they don’t handle blended lines!

No, neither peakmap nor velmap handles multiple Gaussians or blended lines. While the FIGARO fitgauss application on which these scripts are based can fit blended lines (up to six components, through the NCOMP parameter), automating this process reliably proved to be extremely difficult and made the fitting routine very fragile to signal-to-noise problems.

##### 5.8.11 How do I create a line-ratio map?

Make a line-strength map of both lines using peakmap, or passband images using squash, and then use the Kappa div task to divide one by the other to create a ratio map. Here is an example.

% div out=ratio_map in1=image1 in2=image2 title="Ratio Map"

#### 5.9 Mosaicking

Mosaicking IFU data cubes poses unique problems. Firstly, the field of view of all the current generation of instruments can be measured in arcseconds, far too small for the traditional approach of image registration to allow the cubes to be matched up in the $x$,$y$ plane. Additionally, the wavelength calibration of the two cubes you wish to mosaic may be entirely different, which is certainly the case for cubes coming from different instruments.

Unfortunately, mosaicking therefore relies critically on the WCS information provided in the cube FITS headers. Currently the form this information takes varies between cubes from different instruments, and sometimes, where active development work is ongoing, between different cubes produced by the same instrument. It is therefore very difficult to provide a ‘catch-all’ script, or even a recipe, to allow you to mosaic two cubes together as yet. The agreement of a standard for spectroscopic world co-ordinates promulgated in FITS (Greisen et al., 2006, Representations of spectral coordinates in FITS, Astronomy & Astrophysics, 446, 747) should diminish the problem. At the time of writing, the Starlink AST library already supports spectral frames (these are used to compute the velocities in velmap) and most features of this FITS standard.

If the data cubes to be combined have valid WCS information, you should try the wcsmosaic task. If your spectral co-ordinates are only present in an AXIS component, see the section Converting an AXIS structure to a SpecFrame in SUN/95.

Without valid WCS information we offer a possible approach to the problem. If the two data cubes have identical spectral-axis (e.g. wavelength) calibrations and, rather critically, the same number of pixels along the spectral axis (i.e. they are from the same instrument), then the approach we take is to determine the right ascension and declination of the centre of the $x$,$y$ plane of each cube and work out the offset between the two frames in pixels. You can probably use the AXIS frame to determine the arcsecond-to-pixel conversion factor, or this may be present in the FITS headers.
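
The offset arithmetic is straightforward. A sketch with made-up field centres and plate scale (remember the cos(Dec) factor on the right-ascension term):

```python
import numpy as np

# Hypothetical field centres (RA, Dec in degrees) and plate scale
ra1, dec1 = 150.00000, 2.00000
ra2, dec2 = 150.00030, 2.00020
arcsec_per_pixel = 0.25

# Offsets in arcseconds; the RA term is scaled by cos(Dec)
dra = (ra2 - ra1) * 3600.0 * np.cos(np.radians(dec1))
ddec = (dec2 - dec1) * 3600.0

# Shift, in pixels, to apply to the second cube's PIXEL-frame origin
xoff = dra / arcsec_per_pixel
yoff = ddec / arcsec_per_pixel

print(xoff, yoff)  # about 4.3 and 2.9 pixels for these made-up values
```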

Then make use of the Ccdpack wcsedit application to modify the origin of the PIXEL frame of one of the cubes such that the two cubes are aligned in the PIXEL frame. Next we suggest you change the current frame to the PIXEL frame (with wcsframe) and use makemos to mosaic the cubes together (see Figure 19). It should be noted that makemos pays no attention to the WCS information in the third axis (being designed for two-dimensional CCD frames) which is why having an identical wavelength calibration over the same number of pixels is rather crucial.

Alternatively, use can be made of the Ccdpack wcsreg application to align the cubes spatially.

Due to the differences in WCS content between instruments, if you want to mosaic cubes from two different instruments together, the only additional advice we can currently offer you is that you should carefully inspect the WCS information provided by the two cubes using (for instance) wcsedit and try to find a way to map a frame in the first cube to a frame in the second. It may then prove necessary to re-sample one of the cubes to provide a similar wavelength scale. This may involve using Kappa tasks wcsadd to define a mapping between frames, and regrid to resample.

1/star/etc/ is for a standard Starlink installation, but the Starlink software may be in a different directory tree on your system.

2This is a guide figure. The limit will depend on your hardware and the size of your plotting window. It will be more for a higher-resolution hardcopy.