Changes between Version 25 and Version 26 of Procedures/OwlProcessing


Timestamp: Nov 2, 2016, 2:34:43 PM (8 years ago)
Author: lah
Comment: --

  • Procedures/OwlProcessing

    v25 v26  
17 17 grep -rnw thermal/owl/*/capture/*.hdr -e "GPS"
18 18
    19 If there are no GPS times in the data header, the data cannot be geocorrected. Add timestamps from corresponding Fenix flightlines or estimate from the logsheet or file creation time.
19 If there are no GPS times in the data header, the data cannot be geocorrected. Use owl_hdr_fix.py to add the times from the file modification time and check that the result is reasonable.
20 20
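As a hedged illustration of that fallback (this is not the actual owl_hdr_fix.py implementation, and the header keyword and value format below are assumptions to be checked against a real Owl .hdr), the idea is simply to take the file modification time and write it into the header as a UTC time:

import os
from datetime import datetime, timezone

def mtime_as_utc(raw_file):
    # Fall back to the file modification time when no GPS times were logged.
    return datetime.fromtimestamp(os.path.getmtime(raw_file), tz=timezone.utc)

def append_time_to_header(hdr_file, raw_file):
    # NOTE: "GPS Start Time" and the value format are assumptions, not a confirmed Specim key.
    stamp = mtime_as_utc(raw_file).strftime("%H:%M:%S")
    with open(hdr_file, "a") as hdr:
        hdr.write("GPS Start Time = UTC TIME: %s\n" % stamp)

Whatever the script writes, compare the result against the logsheet or the corresponding Fenix line to confirm it is reasonable.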
21 21 3) Check for calibration data:
     
24 24 ls thermal/owl/*/capture/T2*
25 25
26 Make sure that there is at least 1 T1 and 1 T2 file present for the flight and that it is located in the same directory as the flight data. If there are double the number of expected flightlines, then the calibration data is probably stored in separate directories from the flight data. It will need moving to be processed, e.g. if line 1 only has T1 and T2 files and line 2 has data but no T1 and T2, move/copy line 1 data into line 2.
26 Make sure that there is at least 1 T1 and 1 T2 file present for the flight and that they are located in the same directory as the flight data. If there are double the number of expected flightlines, the calibration data is probably stored in separate directories from the flight data. The unpacking scripts should move the raw data into numbered flightline directories and the T1 and T2 files into directories labelled by time.
     27
28 Note that you will need to generate a config file to determine which calibration data is used for each flightline during radiometric calibration.
27 29
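A minimal sketch of this check (using the directory layout from the commands above; it is not an official ARSF script):

import glob
import os

# Flag capture directories that are missing T1 or T2 calibration files.
for capture in sorted(glob.glob("thermal/owl/*/capture")):
    t1 = glob.glob(os.path.join(capture, "T1*"))
    t2 = glob.glob(os.path.join(capture, "T2*"))
    if not (t1 and t2):
        print("Missing calibration data in %s (T1: %d, T2: %d)" % (capture, len(t1), len(t2)))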
28 30 4) Check for dropped frames in the .log files. If there are more than a few, note this on the ticket. Make sure the log file is in the same layout as 2014 288 before starting processing.
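The log layout varies between years, so the following is only a rough sketch that counts lines mentioning dropped or missing frames; the search terms are an assumption and should be adjusted to the wording actually used in the capture .log files:

import glob

# Count log lines that look like dropped-frame reports (search terms are a guess).
for logfile in sorted(glob.glob("thermal/owl/*/capture/*.log")):
    with open(logfile, errors="ignore") as log:
        hits = [line.strip() for line in log if "drop" in line.lower() or "missing" in line.lower()]
    if hits:
        print("%s: %d possible dropped-frame entries" % (logfile, len(hits)))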
     
32 34 == File Naming ==
33 35
    34 Our scripts require that data be named according to the standard convention e.g. OWL219b-14-1, where 219 is the julian day, b the sortie, 14 the year and 1 the flightline. The files should have been renamed during unpacking, but if this is not the case, run proj_tidy.sh -e -p <project directory> to generate the move commands. Check that every move command is present before executing, as partial renaming cannot be completed in a second run. Taking a backup of the data files is recommended before renaming.
36 Our scripts require that data be named according to the standard convention e.g. OWL219b-14-1, where 219 is the julian day, b the sortie, 14 the year and 1 the flightline. The files should have been renamed during unpacking, but if this is not the case, run owl_rename.py to rename everything.
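For reference, the convention can be written as a small helper (illustrative only; owl_rename.py remains the tool that actually renames the files):

def owl_name(julian_day, sortie, year, flightline):
    # e.g. owl_name(219, "b", 14, 1) -> "OWL219b-14-1"
    return "OWL%03d%s-%02d-%d" % (julian_day, sortie, year, flightline)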
35 37
36 38 == Dark Frames ==
37 39
    38 The radiometric calibration will fail if the dark frames are saved in a separate file to the data (Ops sometimes do this). Check the capture directory for any extra files labelled like they might be dark frames and append them to the end of the data file. Remember to add the autodarkstartline key to the header file. Alternatively use the script stitchOwl.py.
40 The radiometric calibration will fail if the dark frames are saved in a separate file to the data (Ops sometimes do this). Check the capture directory for any extra files labelled like they might be dark frames and append them to the end of the data file. Remember to add the autodarkstartline key to the header file.
39 41
    40 e.g. stitchOwl.py -c .../thermal/owl/OWL241a-15-8
42 Original Owl data should not be modified, nor should scripts be run by ARSF in the raw directories. Use the batch script stitch_all_owl.py with a config file to automatically stitch desired files into the processing/owl/flightlines/stitched directory. The config file should be a text file with each flight line and the file to be stitched on a new line, separated by a comma, e.g.:
     43
44 1, T1_OWL274-15_0844.raw
45 2, T1_OWL274-15_0844.raw
46 3, T1_OWL274-15_0853.raw
     47
48 The script stitchOwl.py may be used for individual lines.
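A rough sketch of what the batch stitching amounts to is shown below; it is not the stitch_all_owl.py implementation, the config format is the one given above, and reading the 'lines' value needed for autodarkstartline from the ENVI header is left out:

import shutil

def read_stitch_config(path):
    # Each non-empty line is "<flightline>, <file to stitch>", as in the example above.
    pairs = []
    with open(path) as cfg:
        for line in cfg:
            if line.strip():
                flightline, fname = [part.strip() for part in line.split(",", 1)]
                pairs.append((int(flightline), fname))
    return pairs

def stitch_line(raw_file, extra_file, out_file):
    # Copy the raw flight data, then append the file holding the dark/calibration frames.
    shutil.copyfile(raw_file, out_file)
    with open(out_file, "ab") as out, open(extra_file, "rb") as extra:
        shutil.copyfileobj(extra, out)
    # autodarkstartline in the stitched header should equal the number of lines
    # in the original raw file (the 'lines' value from its header).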
41 49
42 50 Specim's calibration tool actually only uses dark frames to detect blinkers; it does not use them to radiometrically calibrate the data. Therefore, if dark frames are completely missing, a calibration file (e.g. T1) may be used instead.
51
52 When creating level 1 files, remember to use the -s flag with processOwl.py to use the "raw" files in the processing/owl/flightlines/stitched directory rather than the original raw files.
43 53
44 54 == Radiometric Calibration ==
     
50 60 To process individual lines use: process_OWL_line.py -p <project directory> -f <flightline>
51 61
52 Each flightline should have its own calibration data in the form of additional files prefixed T1 and T2 in the capture directory. If these are not present, the script will produce a warning. Run the script again with the option -c auto once the submitted flightlines have produced the required calibration files (check that the log has passed the "Writing calibration file..." step), or manually specify a calibration file with -c <flightline with calibration data_calibration.rad>.
62 Use the -s flag if the raw files did not have dark frames and have been stitched into the processing/owl/flightlines/stitched directory.
     63
64 If the T1 and T2 files are in different directories from the raw data, a config file should be used to specify which files to use. It is formatted like this:
65 [owl_-2]
66 black_body = 0853
67 [owl_-3]
68 black_body = 0853
     69
70 Currently this is generated manually, but will soon be automated based on GPS timestamps. The config file is specified using the -t flag to processOWL.py.
     71
72 The config file should be used when T1 and T2 files are missing from the flightline directory; an obsolete method used the -c auto option to automatically select calibration data from an adjacent flightline.
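Until that automation exists, a hedged sketch of generating the config is given below. It assumes the flightline numbers and the time labels of the black-body (T1/T2) directories have already been collected (how they are obtained is not shown), and it mirrors the section naming in the example above:

def nearest_black_body(line_time, black_body_times):
    # Pick the black-body time label closest to the flightline's own time.
    # Crude comparison of HHMM values; good enough for choosing the nearest run.
    return min(black_body_times, key=lambda t: abs(int(t) - int(line_time)))

def write_calibration_config(path, line_times, black_body_times):
    # line_times: {flightline number: "HHMM"}; black_body_times: ["0844", "0853", ...]
    with open(path, "w") as cfg:
        for line, time in sorted(line_times.items()):
            cfg.write("[owl_-%d]\n" % line)
            cfg.write("black_body = %s\n" % nearest_black_body(time, black_body_times))

For example, write_calibration_config("owl_bb.cfg", {2: "0850", 3: "0900"}, ["0844", "0853"]) reproduces the sections shown above; the resulting file is then passed to processOWL.py with -t.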
53 73
54 74 All output files are written to the flight line subdirectory of /processing/owl/flightlines/level1b and logs are written to /processing/owl/logfiles. The output files consist of the data (*_proc.bil) and, if not specified as inputs, the calibration (*_calibration.rad) and blinker (*blinkers.dat) files, each with their own header file. During processing, a symlink appears in the output flight line folder to prevent simultaneous processing over the same line. You may need to remove this if processing is aborted before it is automatically removed.
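The lock is just a marker in the output flight line folder; a minimal sketch of the idea (the lock file name here is hypothetical, not the one the processing scripts actually create) looks like:

import errno
import os

def acquire_line_lock(line_dir, lock_name="PROCESSING_LOCK"):
    # Symlink creation is atomic, so it works as a simple per-line lock.
    lock = os.path.join(line_dir, lock_name)
    try:
        os.symlink("pid_%d" % os.getpid(), lock)
        return lock
    except OSError as err:
        if err.errno == errno.EEXIST:
            raise RuntimeError("%s exists - line already being processed (remove it if stale)" % lock)
        raise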
     
82 102 If there appears to be striping across the mapped files (bright and dark lines), or sections of a flight line don't match up, the sensor has probably dropped frames. These can be checked in the .log file in the capture directory. The missing (blank) lines need to be inserted into the level 1b data file before geocorrection to map the flight line correctly.
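A hedged numpy sketch of that repair is shown below; the interleave (BIL), data type and the dropped-frame indices are assumptions that must be checked against the level 1b header and the capture .log:

import numpy as np

def insert_dropped_frames(bil_in, bil_out, lines, bands, samples, dropped, dtype=np.uint16):
    # Read the level 1b BIL file as (lines, bands, samples), insert a blank frame
    # at each dropped position, then write the data back out.
    data = np.fromfile(bil_in, dtype=dtype).reshape(lines, bands, samples)
    for idx in sorted(dropped):
        data = np.insert(data, idx, 0, axis=0)
    data.tofile(bil_out)
    # Remember to update the 'lines' value in the output header afterwards.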
83 103
    84 == Create mask files ==
    85 
86 Specim's tool produces a blinker file locating each of the blinking pixels. These are automatically converted to mask files by APL, but can also be converted using the script blinker2mask.py. Make sure mask files are present before making a delivery so the files can be correctly moved into the delivery directory.
    87 
88 104 == Making a delivery ==
89 105
     
93 109
94 110
    95 == Readme ===
111 == Readme ==
96 112
97 113 There is no separate Owl option for making a readme. Generate a config file for the Fenix and manually edit it for the Owl. If you just edit the tex file, you will have to run autoqc on the command line, but if there are no overflows and underflows, just remove the table from the readme and state this in a sentence instead.
     
110 126 * The boresight needs entering into the config file / needs adding to the table. This should not be a problem for project processing as we will always have a boresight.
111 127 * Bad pixels remain - we will eventually look into a masking routine and/or get a better blinker detection routine from Specim.
112 * Calibration files saved separately from the data will not be processed - asm is working on fixing this as part of the new unpacking scripts.