Overview: 
  The files, programs, and demos in this package provide a utility
  suite and instructions to enable further automation of the hotspot
  pipeline.  These tools are designed to be general and flexible, and
  can be used for a variety of purposes in addition to the hotspot
  pipeline.

Programs:
  The Script-Tokenizer:
  This program applies a set of tokens to one or more files, and then
  (optionally) executes the files.  The input tokens are required, and
  follow a simple NAME=VALUE syntax.  When the program executes, every
  instance of NAME is replaced with VALUE.  The new 'tokenized'
  script is written to a file named $FILE.tok, where $FILE is the
  name of the original input file being tokenized.  Newly created
  files inherit the permissions of the original file.  For example,
  if a file to be tokenized has permissions '-rw-r--r--', the new
  file will also be '-rw-r--r--'.
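  The behavior described above can be sketched in a few lines of
  Python.  This is an illustrative outline only, not the program's
  actual implementation; the function name `tokenize_file` and the
  `tokens` dict are assumptions for the example.

  ```python
  import os
  import stat


  def tokenize_file(path, tokens):
      """Replace each NAME with its VALUE in `path`, writing `path + '.tok'`.

      `tokens` is a dict built from NAME=VALUE arguments.  The new file
      inherits the permission bits of the original, as the tokenizer does.
      """
      with open(path) as src:
          text = src.read()
      # Substitute every occurrence of each token NAME with its VALUE.
      for name, value in tokens.items():
          text = text.replace(name, value)
      out_path = path + '.tok'
      with open(out_path, 'w') as dst:
          dst.write(text)
      # Copy the original file's permission bits (e.g. -rw-r--r--).
      mode = stat.S_IMODE(os.stat(path).st_mode)
      os.chmod(out_path, mode)
      return out_path
  ```

  For instance, running this on a script containing 'echo NAME_HOST'
  with the token NAME_HOST=db01 would produce a '.tok' file containing
  'echo db01' alongside the original.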

  Several other program options exist.  Notably, the '--clobber'
  option allows existing files to be overwritten, and the
  '--execute-scripts' option causes each tokenized script to be run.
  If the '--break-on-error' option is present, the tokenizer
  terminates immediately when a script exits with a non-zero status.
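  The execution options above can be sketched as follows.  This is a
  simplified outline under stated assumptions, not the tokenizer's
  actual code: the function name `run_tokenized` is hypothetical, and
  scripts are assumed to be runnable with /bin/sh.

  ```python
  import subprocess


  def run_tokenized(paths, break_on_error=False):
      """Run each tokenized script in order, collecting failures.

      With break_on_error=True, stop at the first non-zero exit
      status, mirroring the '--break-on-error' behavior.
      """
      failures = []
      for path in paths:
          # Run the tokenized script and capture its exit status.
          status = subprocess.call(['/bin/sh', path])
          if status != 0:
              failures.append((path, status))
              if break_on_error:
                  break
      return failures
  ```

  Without break_on_error, every script runs and all failures are
  reported; with it, execution halts at the first failing script.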

Critical Files:
  src/script-tokenizer.py:  A program for applying tokens to
  a set of files, and then (optionally) executing the files. Run
  without arguments to see an overview of program options.

  test/demo/*: A basic example of script-tokenizer usage

Dependencies:
  Python 2.6+

Scott Kuehn
Sat Oct 24 17:29:15 PDT 2009
