NWP SAF

AAPP Installation Guide

  Document ID:  NWPSAF-MO-UD-005
  Version:      7.3
  Date:         01 June 2015


4 Testing

After successfully installing AAPP on your workstation you can start testing it. Several different test cases are provided; you should run the ones that are relevant to your application. The test cases supplied with AAPP v7.1 are listed below. Other test cases may be available on the AAPP ftp server.

NOAA-16 HRPT
  File:    noaa16_test.tgz
  Purpose: NOAA-16 HRPT from 2001 using TBUS. Processes raw HRPT to level 1d for AMSU-A, AMSU-B and HIRS, and AVHRR to level 1b.
  Scripts: NOAA16_RUN.sh, NOAA16_Compare.sh, clean.sh
  Notes:   Same data as were provided with AAPP v6, but updated reference output. No external libraries needed.

NOAA-18 HRPT
  File:    noaa18_test.tgz
  Purpose: NOAA-18 HRPT from 2005 using TLE. Processes raw HRPT to level 1d for AMSU-A, MHS and HIRS, and AVHRR to level 1b.
  Scripts: NOAA18_RUN.sh, NOAA18_Compare.sh, clean.sh
  Notes:   Same data as were provided with AAPP v6, but updated reference output. No external libraries needed.

ATMS and CrIS processing
  File:    ATMS_CrIS.tgz
  Purpose: Ingest of ATMS/CrIS Sensor Data Records in HDF5 and/or BUFR format. Pre-processing for ATMS/CrIS (ATMS spatial filtering, CrIS channel selection, mapping, etc.). BUFR encode facility.
  Scripts: atms_cris_bufr.sh, atms_cris_sdr.sh, process_atms_cris_1c.sh, clean.sh
  Notes:   atms_cris_bufr.sh requires the BUFR library; atms_cris_sdr.sh requires the HDF5 library. Reference 1c files are provided in case you only want to run process_atms_cris_1c.sh.

FY-3A MWTS and MWHS
  File:    FY3A.tgz
  Purpose: MWTS and MWHS ingest (HDF5) and pre-processing (map MWHS to MWTS).
  Scripts: MWTS_MWHS.sh, clean.sh
  Notes:   Requires the HDF5 library.

MAIA
  File:    MAIA_noaa19.tgz
  Purpose: MAIA2.1 and MAIA3, using forecast files.
  Scripts: run_atovs_avhrr.sh, run_maia.sh, clean.sh
  Notes:   Requires the MAIA3 data file to be installed. Use of the forecast files requires GRIB-API, but the processing reverts to climatology if GRIB-API is not available.

MetOp-A AHRPT
  File:    metopa_20100128.tgz
  Purpose: MetOp-A AHRPT, with the ability to exercise the following:
    • IASI OPS-LRS and pre-processing
    • IASI PC code
    • ATOVS and AVHRR processing from level 0 to level 1
    • MAIA3 (using climatology)
    • BUFR encode/decode
  Scripts: run_atovs_avhrr.sh, run_atovs_bufr.sh, run_iasi_OPS.sh, run_iasi_PC_bufr.sh, run_maia3.sh, clean.sh
  Notes:   run_atovs_avhrr.sh should be run first. run_iasi_PC_bufr.sh requires the HDF5 and BUFR libraries. run_atovs_bufr.sh requires the BUFR library. run_iasi_OPS.sh requires $PATH_OPS to point to the "perl" directory of an OPS_LRS installation, and $DIR_IASICONFIG to point to a set of IASI config files (available from the AAPP ftp server).

MAIA4
  File:    MAIA4_test_1gran.tgz
  Purpose: MAIA4 and viirs_paste.
  Scripts: run_maia4.sh, run_viirs_paste.sh
  Notes:   Introduced with AAPP v7.5 (June 2013). Requires the HDF5 Fortran library and the GRIB_API library. Forecast files and VIIRS SDR files are supplied. A single granule is used here; cases with more granules can be supplied on request. Note that the HDF5 library tool "nagg" is an alternative to "viirs_paste", and can run faster, especially when concatenating large input files.

VIIRS to CrIS
  File:    viirs_to_cris.tgz
  Purpose: VIIRS to CrIS mapping, optionally including MAIA4.
  Scripts: viirs_to_cris_run.sh, viirs_to_cris_withMAIA4.sh
  Notes:   Introduced with AAPP v7.10 (June 2015). Requires the HDF5 Fortran library and the GRIB_API library. VIIRS and CrIS SDR files are supplied; forecast files are downloaded from the Internet. A single granule of VIIRS data is supplied, but the scripts will work with more granules.


Each test case is contained within a gzipped tar file. To unpack a test case, copy the file to a suitable directory and type

tar -xzf name.tgz

or, if your system's tar does not support the "-z" option,

gunzip -c name.tgz | tar -xf -

Before running any of the test cases, you should set the environment variable AAPP_PREFIX to the installation directory for AAPP v7 (i.e. the directory containing your ATOVS_ENV7 file):
export AAPP_PREFIX=.....

Specific instructions for running a test case are provided in the corresponding README.txt files.

Where reference output files are provided, they were generated at the Met Office on Red Hat Enterprise Linux 6 with the ifort 12.0.4 Fortran compiler.

4.1 Test case example: NOAA data test cases

These test cases make use of the AAPP chain script AAPP_RUN_NOAA (installed to directory AAPP/bin). (For AAPP v1 to v5 this script was called AAPP_RUN and was located in directory AAPP.)

To run the NOAA-18 test case (for example), unpack the noaa18_test.tgz file, set the environment variable AAPP_PREFIX, then enter the following commands:

cd noaa18_test
NOAA18_RUN.sh

Check the output files NOAA18.OUT and NOAA18.ERR in directory work to see whether the run completed successfully. You can compare them with the reference output in directory work_ref. If all looks OK, you can compare the level 1c and 1d output files by typing

NOAA18_Compare.sh

then examine the results in directory compare. Some differences are to be expected due to compiler differences, but the maximum differences should be only a few hundredths of a kelvin for AMSU and HIRS brightness temperatures, and somewhat larger for some of the AVHRR-derived quantities.

Note that TBUS and TLE data for the NOAA test cases are stored within the directory structure of the test case, not within the main AAPP installation tree. This ensures the test case is self-contained.

4.2 Possible Problem Areas

Some portability problems may still require manual adaptation. Recommendations are provided in the list of known bugs and problems on the AAPP web page. The AAPP_7 package has been tested on Linux, AIX and SUN.

Problems can be reported through the Feedback Form available from the NWP SAF.

4.3 Preparation for Processing Your Own HRPT Read-Out Data

Before trying to use AAPP to process NOAA HRPT read-out data from your own station, first check whether:

i) your station delivers the data in the format expected by AAPP, i.e. the 10-bit words of the down-linked HRPT data are stored right-justified in 16-bit words.
If this is not the case you must reformat the data. If your data are in packed 10-bit words you can use the tool unpack_noaa_hrpt to unpack them, but you will probably need to change the values of the parameters bytes_in and words_out in unpack_noaa_hrpt.F, since different stations use different conventions.

ii) your station position is contained in
(dest.dir.)/AAPP/src/navigation/libnavtool/stations.txt
If it is not, insert it, copy the updated file additionally to
(dest.dir.)/AAPP/data/navigation/
and set the variable "STATION" in the environment variable file ATOVS_ENV7 to the name of your station. Note: if "STATION" is empty then AAPP_RUN_NOAA will fail!
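Step (ii) can be scripted. The sketch below uses a temporary tree and a placeholder station name ("mystation") purely for illustration; in a real installation the two paths sit under your AAPP destination directory.

```shell
# Sketch of step (ii): mirror the updated stations.txt into the data
# directory that AAPP reads at run time. The tree and the "mystation"
# entry are made up here for illustration.
AAPP_PREFIX=$(mktemp -d)
src=$AAPP_PREFIX/AAPP/src/navigation/libnavtool
dst=$AAPP_PREFIX/AAPP/data/navigation
mkdir -p "$src" "$dst"
# ... insert your station's entry (following the file's own format):
echo "mystation <station entry goes here>" >> "$src/stations.txt"
cp "$src/stations.txt" "$dst/stations.txt"
# Finally, in ATOVS_ENV7:  STATION=mystation ; export STATION
```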

To process METOP AHRPT data you must make sure that your station produces PFS Level 0 data files, as defined by EUMETSAT. If the reception station does not deliver files in L0 format, consider using the METOPizer software to pre-process the raw AHRPT (go to www.eumetsat.int and navigate to Data -> Data Delivery -> Support Software & Tools). The METOPizer programs tvcdu_to_ccsds and ccsds_to_l0 are likely to be needed. Please note the following advice from EUMETSAT:

If you are reading CADU packets, note that the installation procedure has changed: the Metopizer used to include a Reed-Solomon library licensed under the GPL, which now has to be installed separately. There is an INSTALL file inside the package with instructions.

To process NPP and JPSS data you will need an external processing package to generate Sensor Data Record (SDR) files suitable for input to AAPP. Suitable packages are supplied by NASA ("IPOPP") and by the University of Wisconsin ("CSPP"). Details are available elsewhere.

4.4 Processing scripts

Direct readout processing
The AAPP_RUN_NOAA chain script processes NOAA HRPT data through to level 1d and is controlled by command-line options.
In early versions of AAPP it was suggested that users customise the script to suit their own requirements, but now that the functionality is provided by command-line options this should be unnecessary. If you do wish to customise the script, please give it a different name to avoid confusion.

If your HRPT reception system automatically delivers the orbit number, time, date, etc. (e.g. in the file name) then you may prefer to use a simplified script, as in AAPP_RUN_NOAA_simplified. In this example we use environment variables "ATOVS" and "AVHRR" to control which parts of AAPP are run.

To process METOP AHRPT data you may use the AAPP_RUN_METOP script. It allows the user to take two passes through the data: the first generates traditional HIRS, AMSU-A and MHS 1d files, and the second, optional, pass runs OPS-LRS and generates products on the IASI grid. For full details, please see the description in the source.

If your station does not generate one file per instrument per pass (e.g. it uses 3 minute granules) then you will need to concatenate the granules, otherwise the HIRS calibration will fail. Level 0 granules can be concatenated using the Unix "cat" command.
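For example (a sketch using dummy granule files; real granule names come from your reception system):

```shell
# Sketch: concatenate short level 0 granules into a single pass file
# before processing. File names are placeholders; two dummy granules
# are created here for illustration.
printf 'granule1' > AVHR_xxx_001.l0
printf 'granule2' > AVHR_xxx_002.l0
# The shell expands the wildcard in lexical order, so the names must
# sort in time order:
cat AVHR_xxx_*.l0 > full_pass.l0
```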

Processing externally supplied level 1b files
Many users also use AAPP to process global or regional ATOVS data, which are typically supplied at level 1b or 1c. To do this you need to call atovin and/or atovpp directly from your script. For example, to convert ATOVS data from level 1b to level 1d on the HIRS grid you would typically do the following:

ln -sf amsua_1bfile aman.l1b
ln -sf amsub_1bfile ambn.l1b
ln -sf hirs_1bfile hrsn.l1b
atovin AMSU-A AMSU-B HIRS
atovpp -i "AMSU-A AMSU-B HIRS" -g "HIRS"

Note that if you are ordering data from NOAA CLASS, you should untick the box that says "include archive header" (or do this in your user preferences). Otherwise you will have to strip off the first 512 bytes from each 1b file before running atovin (e.g. dd bs=512 skip=1).
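As an illustration of the dd command (a sketch with a dummy input file; real 1b file names will differ):

```shell
# Sketch: strip the 512-byte CLASS archive header from a level 1b file.
# A dummy 3-block (1536-byte) input file is created here for illustration.
dd if=/dev/zero of=hrsn_with_header.l1b bs=512 count=3 2>/dev/null
# Skip the first 512-byte block, keeping the rest:
dd if=hrsn_with_header.l1b of=hrsn.l1b bs=512 skip=1 2>/dev/null
```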

AAPP does not currently ingest TOVS level 1b data from CLASS (i.e. NOAA-14 and earlier). This capability is planned for a future AAPP upgrade.

In the case of METOP data distributed in BUFR, you need to run aapp_decodebufr_1c and then atovpp. Note, however, that global METOP data are distributed in short (typically 3-minute) granules, and there is no guarantee that the granules for the different instruments will be aligned in time. A utility called combine_1c is therefore provided to concatenate level 1c files, ensuring that the instrument to be mapped covers a wider time interval than the instrument used as the grid. It is up to you to write your script so that the appropriate input files are concatenated.

If difficulties are encountered, please seek advice using the Feedback Form available from the NWP SAF.

Processing ATMS and CrIS data
ATMS and CrIS data may be received as near-real-time global BUFR files or as SDR files. You will need to run some or all of the AAPP tools for these instruments (such as atms_sdr, cris_sdr, atms_beamwidth, atovpp and combine_1c, mentioned below).
For more information on the processing, see AAPP document NWPSAF-MO-UD-027 "Pre-processing of ATMS and CrIS".

In some cases you will need to prepare the data before running the AAPP utilities. Data often arrive with one file per 32-second granule, which is too short for effective use of atms_beamwidth and atovpp. Also, in some cases the granules may not arrive in time order. One solution that has been employed at the Met Office for ATMS/CrIS BUFR data is to aggregate files as follows:
  1. Use the time stamp in the file name to compute a granule number, defined as the second of day divided by 32. Append this to the file name.
  2. Also compute an aggregation number, equal to the granule number divided by the number of granules you wish to aggregate (e.g. 10). Append this to the file name also.
  3. When all 10 granules have arrived for a given aggregation number, concatenate the individual granules (unix cat will work for BUFR data) and proceed to the AAPP processing.
A similar approach could be used for processing HDF5 granules, except that atms_sdr or cris_sdr would be run first, and the *.l1c granules aggregated using combine_1c. This problem does not arise with HDF5 data from NOAA/CLASS because files can be delivered already aggregated into chunks of 8 minutes (15 granules).
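The arithmetic in steps 1-2 can be sketched as follows (the time stamp 12:34:08 and the aggregation size of 10 are illustrative):

```shell
# Sketch: derive granule and aggregation numbers from a file's time stamp.
# hh/mm/ss would normally be parsed from the file name; values are examples.
# The 10# prefix forces base-10 so that "08" is not read as octal.
hh=12; mm=34; ss=08
second_of_day=$(( 10#$hh * 3600 + 10#$mm * 60 + 10#$ss ))
granule=$(( second_of_day / 32 ))        # ATMS/CrIS granules are 32 s long
aggregation=$(( granule / 10 ))          # aggregating 10 granules per chunk
echo "granule=$granule aggregation=$aggregation"
# prints: granule=1414 aggregation=141
```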

Processing archived data from EUMETSAT Data Centre
AAPP can ingest several types of data from the EUMETSAT Data Centre, including AVHRR and IASI.
Note that unless you specify a regional subset, the input files will be full-orbit. For AVHRR and IASI this results in quite large input files. If you wish to process AVHRR files that contain more than 33 minutes of data, the following changes are needed in AAPP:
  1. For level 0 processing, increase the value of mx_hrpscn in hrptdc.h. Normally set to 12000; increase to 32767 which is the maximum possible, since the AVHRR 1b format holds avh_h_scnlin as a 2-byte integer. Note that a full-orbit would be 36400 scans.
  2. Increase the value of avh_mxscn in avhrcl.h. Also normally set to 12000.
It is preferable to specify a geographical subset (defined by lat/lon limits) when you order the data.

The applicable AAPP scripts are:


4.5 Satellite attitude

Some users may wish to manually set the values of roll, pitch and yaw ("attitude") for a particular satellite. The default attitude for each satellite is set in the file ${DIR_NAVIGATION}/satid.txt. Look for the lines that say "number of orbital bulletin dependant attitude biases". Underneath are listed the default yaw, roll and pitch (in milliradians) for up to four types of bulletin (in the order TBUS, Argos, 2-line, Spot). These values may be changed by the user if required, or attitude biases inserted if there are no values already present (i.e. for older satellites). This file also specifies misalignment parameters for individual instruments.

If you wish to change the attitude for specific orbits then you need to create a file ${DIR_NAVIGATION}/ana/estatt_${BUL}_${SATIMG}.txt, where ${BUL} is "tbus", "tle" or "spm" and ${SATIMG} is the name of the satellite, e.g. "noaa16". This is a text file with the following values on each line:
yaw roll pitch orbit 0
where yaw, roll and pitch are in milliradians and "0" means the data are good. If this file exists, it is read by the scripts avhrcl, amsuacl, amsubcl, etc. If the orbit being processed (or the previous orbit) matches a line in the file then the corresponding attitude values are used; if not then the default (from satid.txt) is used.
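For example, a hypothetical line applying yaw = 0.5, roll = -0.2 and pitch = 1.1 milliradians to orbit 12345 (all values invented for illustration) would read:

```text
0.5 -0.2 1.1 12345 0
```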


4.6 Satellite manoeuvres

Some satellites (notably MetOp) undergo pre-planned manoeuvres in order to maintain the correct orbit. For MetOp these may be either "in-plane" (no change to attitude) or "out-of-plane" (in which the satellite is rotated so that the station-keeping rockets point at 90 degrees to the direction of motion). For AAPP users the in-plane manoeuvres are usually of little concern; however, the out-of-plane manoeuvres (conducted typically every 1-2 years) can cause problems with the quality control of the orbital elements files (TBUS, 2-line or Spot).

When an out-of-plane manoeuvre is announced, it is recommended that users temporarily disable the quality-control mechanisms by setting an environment variable as follows:
PAR_NAVIGATION_EXTRAP_LIMIT=100
export PAR_NAVIGATION_EXTRAP_LIMIT
This may be either inserted into your ATOVS_ENV7 file or into your calling script. When the manoeuvre is finished, and new orbital elements have been ingested, then you can set PAR_NAVIGATION_EXTRAP_LIMIT="". This facility was introduced in AAPP version 6.7.

To check whether a bulletin has been successfully ingested, look at your index file, e.g. $DIR_NAVIGATION/tle_db/tle_M02.index.  The second number in each line should be "0" if the corresponding bulletin was OK. To ingest a specific TLE file by hand (e.g. published in the MetOp Admin message) then you can run tleing by hand with the "-f" option (type tleing -h for usage instructions), e.g.
tleing -s M02 -f ./2011-10/tle_20111005_1130.txt
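To list any failed bulletins you could use a small awk filter. This is a sketch assuming only the index layout described above (second field is the status); a temporary illustrative index is built here, so point the awk command at your real index file instead.

```shell
# Sketch: print index lines whose second field (ingestion status) is
# non-zero, i.e. bulletins that were not ingested cleanly.
DIR_NAVIGATION=$(mktemp -d)       # illustrative; use your real $DIR_NAVIGATION
mkdir -p "$DIR_NAVIGATION/tle_db"
printf 'tle_20111005 0\ntle_20111006 1\n' > "$DIR_NAVIGATION/tle_db/tle_M02.index"
awk '$2 != 0' "$DIR_NAVIGATION/tle_db/tle_M02.index"   # prints: tle_20111006 1
```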

If you manually ingest a bulletin in order to correct a geolocation error, remember to delete the old satpos file, otherwise AAPP will continue to use that satpos file for the remainder of the day and you will still get the error.



4.7 On the use of working directories

The ATOVS_ENV7 setup defines an environment variable $WRK (default $HOME/tmp), which is used by several of the AAPP scripts. For example, AAPP_RUN_NOAA and AAPP_RUN_METOP do a "cd" to this directory during the script execution. Other scripts assume you are already in a suitable working directory (e.g. atovpp). You can set up your working directory before running the script, e.g. to use the current directory do

WRK=$PWD && export WRK

If you are running multiple instances of AAPP on the same machine (e.g. processing global and local data at the same time), it is very important that they do not attempt to use the same working directory, otherwise you will get unpredictable results.

One way to do this would be to define temporary working directories before running a script, e.g.

WRK=$PWD/tmp_$$_$(date +%H%M%S_%N) && export WRK
mkdir -p $WRK
cd $WRK


where the temporary directory is named according to the shell process number ($$) and the system time. Remember to delete the temporary directory when you have finished with it.

Most users are unlikely to need this level of complexity, but the issue has been raised in the past via the NWP SAF Helpdesk, therefore we mention it as something to be aware of when you are designing your processing systems.
 
