Version 2 (modified by wja, 4 years ago)

--

Thermal Data Delivery

Once the data has been processed, it needs to be put into a delivery directory. This is the final file structure in which it will be sent to the customer.

Delivery Script

From the base directory of the project, generate the structure using:

make_arsf_delivery.py --projectlocation $PWD \
                      --deliverytype owl --steps STRUCTURE

If everything looks OK, run again with --final

Once the structure has been generated, run the other steps using:

make_arsf_delivery.py --projectlocation $PWD \
                      --deliverytype owl --notsteps STRUCTURE

Again, pass in --final if all the output looks OK.
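The two stages above can be sketched end to end. The run_step helper below is purely illustrative (not part of the delivery tools): it only echoes each make_arsf_delivery.py invocation, so the full sequence can be reviewed without touching any data.

```shell
# Illustrative sketch only: run_step is a hypothetical helper that prints
# each make_arsf_delivery.py command instead of executing it.
run_step() {
    echo "make_arsf_delivery.py --projectlocation $PWD --deliverytype owl $*"
}

run_step --steps STRUCTURE              # check the proposed structure
run_step --steps STRUCTURE --final      # structure looks OK: create it
run_step --notsteps STRUCTURE           # check output of the other steps
run_step --notsteps STRUCTURE --final   # output looks OK: run for real
```

To execute the commands for real, drop the echo from the helper.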

See the delivery library guide for more details on advanced options.

Readme Generation

To make the readme, first generate the readme config file using:

generate_readme_config.py -d <delivery directory> -r hyper -c <config_file>

If you receive the error ImportError: No module named pyhdf.HDF, it can usually be resolved by activating the following virtual environment and running the script again:

source ~utils/python_venvs/pyhdf/bin/activate
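A minimal sketch of that check, assuming python is on the PATH (the venv path is site-specific and is only echoed here as a reminder, not sourced):

```shell
# Try the import that fails with "ImportError: No module named pyhdf.HDF";
# print a reminder about the venv if it is unavailable.
if python -c "import pyhdf.HDF" 2>/dev/null; then
    pyhdf_status="available"
else
    pyhdf_status="missing"
    echo "pyhdf missing; activate the venv first:"
    echo "  source ~utils/python_venvs/pyhdf/bin/activate"
fi
```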

The readme config file will be saved as hyp_genreadme-airbone_owl.cfg in the processing directory. Do not delete this file, as it is required for delivery checking. Check that all the information in the readme config file is correct and amend it if not.

Then create a readme tex file using:

create_latex_hyperspectral_apl_readme.py -f hyp_genreadme-airbone_owl.cfg

Finally, run pdflatex <readme_tex_file> to create the PDF readme (you will probably need to run it twice so that references resolve). Place the readme in the main delivery folder.
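The whole readme build can be summarised as below. print_readme_steps is a hypothetical helper that just echoes the commands in order; delivery_dir, config_file and readme.tex are placeholder arguments standing in for the real delivery directory, config file and tex file.

```shell
# Illustrative summary of the readme build; nothing here is executed,
# the helper only prints the commands in the order they are run.
print_readme_steps() {
    local delivery_dir="$1" readme_tex="$2"
    echo "generate_readme_config.py -d $delivery_dir -r hyper -c config_file"
    echo "create_latex_hyperspectral_apl_readme.py -f hyp_genreadme-airbone_owl.cfg"
    echo "pdflatex $readme_tex"   # first pass
    echo "pdflatex $readme_tex"   # second pass resolves references
}

print_readme_steps delivery_dir readme.tex
```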