WXP version 5
Program Reference

HURRICANE

NAME

hurricane - Parses hurricane products and reformats for a web page.

SYNOPSIS

hurricane [parameters...] filename

PARAMETERS

Command Line Resource Default Description
-h help No Lists basic help information.
-df=filename default .wxpdef Sets the name of the resource file.
-na=name name hurricane Specifies the name used in resource file parsing.
-ba batch No Runs the program in batch mode.
-me=level message out2 Specifies the level of messages to be displayed:
  • file information - mess
  • table of storm data - out1
  • product level information - out3
  • parsed information from product - out4
-fp=filepath file_path current directory Specifies location of database files.  
-dp=datapath data_path current directory Specifies the location (path) of the input raw tropical data files. This is the location where the ingest program has saved the data files. This may be modified in the name convention file.
-cp=conpath con_path current directory Specifies the location (path) of the output data files. This will be a root directory for hurricane data such as /data/hurricane.
-nc=name_conv name_conv name_conv The name convention file specifies how files are named in WXP. This sets which name convention file to use.
-if=in_file in_file trp_dat Specifies the input file name tag. The default is trp_dat, but it can be modified to any value in the filename convention file. A full name convention can be specified as well. There are two file types that can also be specified:
  • raw_wp - western Pacific data files
  • raw_io - Indian Ocean data files

If neither of these is specified, standard TPC or JTWC format is assumed.

-cu=[hour|la] current None Specifies that current data files be used. The current filename is based on the name convention. An optional hour can be specified for older data. If la is specified, the program searches back to find the most recent available file.
-ho=hour hour None This resource specifies the exact hour that a data file is valid for. This locks in the start hour for a multi-file sequence.
-nh=num_hour num_hour 0 This specifies the number of hours that will be searched for hourly data.
-id=identifier identifier None Used to decode a specific storm (reserved for future use).
-pa=param[,param...] parameter None Extra parameters:
filename[#seq] filename None The name of the raw data file to be parsed. An optional sequence number can be added to designate the time for non-WXP files. If no filename is given, the user is prompted for one; in batch mode, the default is current=la.

DESCRIPTION

The hurricane program parses hurricane products in order to generate a wide range of data files containing information on specific hurricanes. The input to the program is a raw tropical data file from the TPC (Tropical Prediction Center) or the Joint Typhoon Warning Center (JTWC), which can contain any number of products. To simplify the process, it is highly recommended to save the hurricane products (WTxx) in a separate file type as specified by the trp_dat file name tag.

The program parses three types of TPC products:

The output of the program is the following set of files for each active storm:

The program puts these files in directories sorted by region, year, and storm. The above files are put in the directories for each storm. The directory structure looks like:

con_path
  region1
    year1
      storm1
      storm2...
    year2...
      storm1
      storm2...
  region2...
    year1
      storm1
      storm2...
    year2...
      storm1
      storm2...
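Under this layout, the files for a given storm can be located by joining the path components. A minimal shell sketch, where the region and storm names are hypothetical examples and not WXP defaults:

```shell
# Build the path to one storm's output directory under the tree above.
# The con_path, region, year, and storm values are hypothetical.
con_path=/data/hurricane
region=atlantic
year=1997
storm=erika

storm_dir="$con_path/$region/$year/$storm"
echo "$storm_dir"
```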

The regions include:

Finally, the program produces a listing file, storm.list, which is put in each year directory.

This file lists only those storms that were active within the period parsed. The list can be used to determine which storms need to be updated on a regular basis. It can also be parsed so that new storms are added to an overall names file containing all the named storms for a particular year.
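That merge of newly active storms into a yearly names file can be sketched in shell. The one-name-per-line formats of storm.list and the names file, and the names.dat filename itself, are assumptions for illustration only:

```shell
# Sketch: add newly active storms to a yearly names file.
# File formats and names here are assumptions, not documented WXP behavior.
year_dir=$(mktemp -d)                                    # stand-in for a year directory
printf 'bonnie\ndanny\nerika\n' > "$year_dir/storm.list" # currently active storms
printf 'bonnie\ndanny\n' > "$year_dir/names.dat"         # hypothetical names file

# Merge the two lists, dropping duplicate names.
sort -u "$year_dir/storm.list" "$year_dir/names.dat" > "$year_dir/names.new"
mv "$year_dir/names.new" "$year_dir/names.dat"

cat "$year_dir/names.dat"
```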

Automating the Hurricane Processing

The hurricane program will process each storm that has advisories in the raw tropical files within the range specified by the num_hour resource. When reprocessing the data, hurricane reads in the track.dat file for every storm that it sees in the raw files to obtain any older advisory information, then appends the new advisories to the end of the file.

There are a couple of scripts that go with the hurricane program. The first is hur_update which runs all the tasks to generate images and HTML pages based on the information generated by hurricane. This script should be run in cron once every 2-4 hours or roughly 10-15 minutes after each advisory in order to get the latest advisories. The script does the following tasks:

  1. it runs hurricane to process all the latest hurricane information.
  2. it uses the storm.list file to generate a list of the currently active storms. It updates a names file with each new storm.
  3. it generates satellite images for each active storm using xsat with the last.dom file. The visible file is sat_vis.gif and the IR file is sat_ir.gif. It also generates a small IR image named sat_ir_s.gif for inclusion into the storm.html file.
  4. it generates full Atlantic visible, IR and water vapor images on the same projection as the yearly tracking chart.
  5. it runs the hur_plt script.
  1. generates a yearly tracking chart based on the names in the names file. When a new storm develops, hur_update will add its name to this file. You can delete names from this file when depressions become tropical storms.
    2. generates storm based tracking charts using mapplt and the track.dom file.
    3. it generates small track charts for inclusion into the storm.html file.
  6. it runs hur_html_setup which is a Perl script that generates a large HTML file for each region and year named index.html.
    1. it inserts each of the storm.line files for a general storm summary
    2. it generates an HTML table using all the storm.html files generated by the hurricane program.

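To keep the pages current, hur_update can be scheduled from cron as the text above suggests. The crontab fragment below is a hypothetical example running the script every four hours, 15 minutes past the hour; the install path is an assumption, and the hours should be adjusted to fall shortly after the advisory schedule at your site:

```shell
# Hypothetical crontab entry for hur_update; path and times are examples.
15 0,4,8,12,16,20 * * * /usr/local/wxp/bin/hur_update
```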
When all this is completed, there is an up-to-date HTML hurricane page with all the tracking charts updated and current satellite images.

EXAMPLES

hurricane -cu=la -nh=-6

This updates the hurricane files for the last 6 hours. This should be run from cron, either by invoking the hurricane program directly or via the hur_update script.

FILES

SEE ALSO


Last updated Aug 27, 1997