Version 34 (modified by asm, 11 days ago)
LIDAR data delivery
Once the data has been processed, it needs to be put into a delivery directory. This is the final file structure in which it will be sent to the customer.
What should be included
- ASCII LIDAR point cloud data
- LAS point cloud data
- PDF version of the flight logsheet
- Readme file describing the data set, plus a copyright notice
- Screenshot of the mosaic of all lines (full resolution) and a zoomed image with vector overlay
- Data quality report and further processing scripts
- DEM of the LIDAR data, usually gridded to 2m resolution
- Screenshot of the DEM
For full waveform deliveries the following should also be included:
- Discrete LAS files
- Full waveform LAS files
- Navigation data (.sol file and .trj file)
- ASCII full waveform extractions - if requested by the PI
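The items above map onto a delivery directory tree. A minimal sketch of that layout is below; the flightlines/las1.0, flightlines/ascii, dem, docs and screenshot directory names come from the procedure on this page, while `delivery_example` and `logsheet` are assumptions for illustration:

```shell
# Sketch of a LIDAR delivery layout; exact top-level name varies per project.
# 'delivery_example' and 'logsheet' are invented for this example.
DELIVERY=delivery_example
mkdir -p "$DELIVERY/flightlines/las1.0" \
         "$DELIVERY/flightlines/ascii" \
         "$DELIVERY/dem" \
         "$DELIVERY/docs" \
         "$DELIVERY/screenshots" \
         "$DELIVERY/logsheet"
```

In practice this structure is normally created by copying the template directory described below rather than by hand.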
Procedure for creating a LIDAR data set delivery
- See the guide on the delivery library page
- If you did not create the DEM and screenshots using the above script (-m option), create them manually using lidar_intensity.sh.
You will also need to make a lat/long version of the DEM (for use with aplcorr) using convert_uk_dem.sh or convert_nonuk_dem.sh as appropriate.
- Generate the readme file.
- Create a config file for the readme using the generate_readme_config.py script, with the -d option to specify the delivery directory and the -r option to specify the type of config file to generate.
- Edit the config file and check all the items are filled in:
- Any remarks about the data should be entered as a sentence in the "data_quality_remarks" section.
- If vectors have been used then the accuracy should be entered in "vectors" (e.g. '5-10' if they're within 5m to 10m)
- line_numbering should contain a space separated list of line names linking the logsheet to the processed files.
- las_files should contain the path to the processed LAS file in order to extract the per flightline statistics
- elevation_difference should contain a list of the elevation differences between overlapping flightlines. Enter the line numbers and the difference in cm, followed by a semicolon, e.g. 001 002 5; 002 003 4.5; etc.
- All "compulsory" items should contain data
- Create a TeX file using the script: create_latex_lidar_readme.py -f <config filename>. Add the -w option to create a full waveform readme.
- This file can be reviewed and edited in any text editor if needed. Note that this script may fail with the error [Errno 21] Is a directory: '/tmp/'. If this occurs, fill in the intensity and DEM images in the "Optional" section of the config file generated by the steps above.
- Create a PDF file by running pdflatex <tex_filename>
- Review the readme carefully and check that all relevant information is present
- Copy it to the delivery directory and remove any temporary files. It is recommended to keep the TeX file until after delivery checking in case any edits are required
- Copy the template directory over to the project directory. The template directory is at ~arsf/arsf_data/2011/delivery/lidar/template
- Move the processed data to the delivery directory
- Move the LAS binary files into delivery/flightlines/las1.0
- REMEMBER THAT THESE LAS FILES SHOULD HAVE BEEN QC'ED AND CLASSIFIED FOR NOISY POINTS
- Rename the files in the convention "LDR-PPPP_PP-yyyydddfnn.LAS" (details in readme).
- Run las2txt.sh <delivery/flightlines/las1.0> <delivery/flightlines/ascii>. Note: it is important to ensure that the correct options are used with this script, otherwise it will output the wrong format.
- OR run las2txt --parse txyzicrna <lidarfilename> <outputfilename> for each file, outputting to the ascii_laser directory (may not work with LAS 1.1).
- You need to create a DEM from the LIDAR data to include with the delivery. Use lidar_aster_dem.sh and put the output file in a directory named 'dem'. Noisy points (those with classification 7) should not be included in the DEM (this is the default behaviour). You should also create a second DEM that is suitable for use in aplcorr: use either convert_uk_dem.sh or convert_nonuk_dem.sh depending on where your project is located, or run the script again specifying a different output projection. This DEM should be named *_wgs84_latlong.dem
- Include a PDF version of the flight logsheet with the delivery
- Make sure the correct, up-to-date data quality report (PDF version) is included in the docs directory
- Create full resolution JPGs of the mosaic of all the LIDAR lines by intensity, a separate one of the intensity with vectors overlaid (if vectors are available), and one of the DEM, and put them in the screenshot directory (using lidar_intensity.sh).
- Generate the readme as per the "Generate the readme file" step above
*Note: Be sure that all the files output in the above steps conform to the file name formats specified here
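As a quick sanity check on the naming convention, LAS filenames can be matched against the LDR-PPPP_PP-yyyydddfnn.LAS pattern with a regular expression. This is an illustrative sketch, not an official ARSF tool; the regex (four alphanumeric characters for the project code, a lowercase flight letter) and the example names are assumptions:

```shell
# Hedged sketch: check LAS filenames against LDR-PPPP_PP-yyyydddfnn.LAS,
# where PPPP_PP = project code, yyyy = year, ddd = julian day,
# f = flight letter, nn = line number. Example names are invented.
pattern='^LDR-[A-Z0-9]{4}_[0-9]{2}-[0-9]{4}[0-9]{3}[a-z][0-9]{2}\.LAS$'
check_name() {
    if echo "$1" | grep -Eq "$pattern"; then
        echo "OK:  $1"
    else
        echo "BAD: $1"
    fi
}
check_name "LDR-GB12_01-2012170a01.LAS"
check_name "line1.las"
```

Running a check like this over delivery/flightlines/las1.0 before delivery checking catches misnamed files early.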
A problem that occasionally occurs when there is no hyperspectral data for a particular flight is that a very large DEM is created during the DEMGENERATION stage of the delivery script. You can overcome this by using the aplbuffer=False argument for the DEM generation step.
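The rename step above can be scripted rather than done by hand. The following is a minimal hedged sketch, assuming the processed files carry a two-digit line number in their names; the project code GB12_01, the date 2012170a, and the input names are placeholders:

```shell
# Hedged sketch: rename processed LAS files to the delivery convention
# LDR-PPPP_PP-yyyydddfnn.LAS. Project code, year/day and input names
# are placeholders; adapt the line-number extraction to your real files.
PROJ=GB12_01      # placeholder project code (PPPP_PP)
DATE=2012170a     # placeholder year + julian day + flight letter (yyyydddf)
mkdir -p las_demo
touch las_demo/line01.LAS las_demo/line02.LAS   # stand-in input files
for f in las_demo/line??.LAS; do
    base=$(basename "$f")
    nn=$(echo "$base" | sed 's/[^0-9]//g')   # keep only the line digits
    mv "$f" "las_demo/LDR-${PROJ}-${DATE}${nn}.LAS"
done
```

A filename check like the one in the previous section can then confirm the results conform to the convention.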
Additional Steps for Full Waveform Deliveries
Create the following folders in the delivery directory:
- discrete_laser - containing two folders: ascii_files & LAS_files
- discrete_laser/ascii_files - move the ascii_laser folder to discrete_laser/ascii_files
- discrete_laser/(las1.2|las1.0) - move the discrete LAS files to here - naming convention LDR-PPPP_PP-yyyydddfnn.LAS
- fw_laser - move the full waveform LAS files to here - naming convention LDR-FW-PPPP_PP-yyyydddfnn.LAS
- navigation - copy the .sol file and .trj file to here - naming convention PPPP_PP-yyyy-dddf.*
- fw_extractions - this should contain the following information (This step is only required if an area is specified by the PI)
- A folder for each requested area containing the relevant ASCII extractions
- A text file describing the information given in the ASCII files. Template in ~arsf/arsf_data/2010/delivery/lidar/template/fw_extractions/extractions.txt. Use summarise_area.py to extract the extents for each area to enter into readme. Naming convention PPPP_PP-yyyy-dddf_extractions.txt
- A jpg showing the location of the areas on an intensity image. Naming convention PPPP_PP-yyyy-dddf_extractions.jpg
The readme will need to be edited to include the above information. A template of the full waveform readme can be found at ~arsf/arsf_data/2010/delivery/lidar/template/fw_readme.txt
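The full waveform folder layout above can be created in one go. A minimal sketch, assuming the delivery directory is called `delivery_fw` (an invented name) and choosing las1.2 from the (las1.2|las1.0) alternative:

```shell
# Sketch: create the full waveform delivery folders described above.
# 'delivery_fw' and the choice of las1.2 over las1.0 are assumptions.
FW=delivery_fw
mkdir -p "$FW/discrete_laser/ascii_files" \
         "$FW/discrete_laser/las1.2" \
         "$FW/fw_laser" \
         "$FW/navigation" \
         "$FW/fw_extractions"
```

The discrete and full waveform LAS files, navigation files and extractions are then moved into these folders following the naming conventions listed above.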
Semi-scripted full waveform delivery
The delivery library can be used to perform a full waveform delivery. This is still under testing but should work. To get the script to perform a full waveform delivery, include the --lidarfw argument to the make_arsf_delivery.py script. The example below will create the full waveform directory structure.
make_arsf_delivery.py --lidarfw --solfile posatt/ipas_honeywell/proc/20140621_082249.sol --lidardeminfo resolution=2 inprojection=UKBNG --projectlocation . --deliverytype lidar --steps STRUCTURE --final
When the script runs in full waveform mode, the discrete classified lidar data needs to be in the las-classified folder, the classified full waveform data needs to be in las-fw-classified, and the trj files should be placed in trj; all these folders should be under PROJ_DIR/processing/als50 (see below for a clipping of the tree command output). Also ensure the logsheet PDF is in the admin folder, as the script will complain otherwise. I would suggest not running the whole delivery at once; what I found works best is running the steps before the rename stage, then running the rest. A few things still need to be done manually: the intensity image is named mosaic_image.jpg rather than PROJ_CODE-year_day-intensity.jpg, and the navigation SUP file is not copied across yet. The script that creates images has a bug somewhere that puts .txt.jpg as the extension for the screenshots; running rename.sh -f .txt.jpg .jpg fixes that.
processing
├── als50
│   ├── 2014172.cfg
│   ├── 2014172.reg
│   ├── las
│   ├── las-classified
│   ├── las-fw
│   ├── las-fw-classified
│   ├── logfiles
│   └── trj
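If rename.sh is not to hand, the .txt.jpg screenshot fix mentioned above can be done with a portable shell loop using parameter expansion; the screenshot names below are invented for illustration:

```shell
# Portable equivalent of `rename.sh -f .txt.jpg .jpg`: strip the stray
# .txt from screenshot extensions. Directory and file names are invented.
mkdir -p screenshots_demo
touch screenshots_demo/intensity.txt.jpg screenshots_demo/dem.txt.jpg
for f in screenshots_demo/*.txt.jpg; do
    mv "$f" "${f%.txt.jpg}.jpg"   # drop the .txt.jpg suffix, re-add .jpg
done
```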
If you have hyperspectral data to make into a delivery, go to the hyperspectral delivery page.
If not, or if you've done that already, the delivery is ready for checking.