
RTK Point Cloud Data Post-Processing Workflow | 7 Steps to Maintain Processing Accuracy

By LRTK Team (Lefixea Inc.)


Table of Contents

Introduction

Step 1: Raw Data Inspection and Quality Assessment

Step 2: Extraction of Fixed Solution Data and Policy for Handling Float Solutions

Step 3: Coordinate System Transformation and Unification

Step 4: Integration of Multi-Session Data

Step 5: Noise Removal and Filtering

Step 6: Final Accuracy Verification and Quality Check

Step 7: Optimization of Output Formats and Delivery Preparation

Practical Post-Processing Workflow Examples

Improving Efficiency and Automating Post-Processing

Common Post-Processing Issues and Solutions

Post-Processing Quality Control and Continuous Improvement


Introduction

RTK point cloud surveying is only half complete when the on-site measurement work is finished. The raw data collected in the field must undergo various processing steps before it can be shaped into a usable final deliverable. The quality of this post-processing determines the reliability and usability of the resulting data.


No matter how high-precision the data obtained during measurement may be, if errors are introduced during post-processing, its value will be compromised. Conversely, even if the measurement errors are somewhat large, appropriate post-processing can minimize their impact and produce data with sufficient reliability.


In this article, we divide the post-processing workflow for RTK point cloud data into seven steps and explain in detail what should be performed at each step and what kind of quality control should be carried out. By mastering this workflow, you will be able to maintain measurement accuracy and reliably deliver high-quality outputs. Post-processing is an extension of the measurement work, and the quality control performed here affects the reliability of the project's final outcomes. Implementing the content of this article will greatly improve operational efficiency.


Step 1: Raw Data Inspection and Quality Assessment

Immediately after returning from the measurement site, the first thing to do is to verify the acquired raw data and assess its quality. This step is extremely important because it determines the direction of all subsequent processing. If the data quality is problematic, even the most careful subsequent processing cannot restore the reliability of the final results.


When inspecting raw data, first verify the completeness of the measurement session: do the recorded data cover the entire scheduled measurement period, and were there any interruptions in recording? Next, confirm whether a fixed solution was maintained at each epoch. Data measured in a float-solution state have reduced accuracy, so the handling policy for those intervals should be decided in advance.


Satellite reception status is also an important evaluation metric. Check indicators such as the number of satellites and DOP values at each time, and assess the quality of the measurement environment. If there are intervals with few satellites or poor DOP values, record that the data for those intervals are relatively less reliable.
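
As an illustration, the fixed-solution ratio, DOP flags, and recording gaps can be summarized with a short script. The sketch below assumes the receiver log has been exported to a CSV with hypothetical columns time, fix_quality, num_sats, and pdop (fix_quality following the NMEA GGA convention, where 4 is RTK fixed and 5 is RTK float); thresholds are illustrative.

```python
# Sketch: per-epoch quality summary from a hypothetical CSV export.
import pandas as pd

epochs = pd.read_csv("session_epochs.csv", parse_dates=["time"])

# Proportion of epochs recorded with an RTK fixed solution (GGA quality 4).
fix_ratio = (epochs["fix_quality"] == 4).mean()
print(f"Fixed-solution ratio: {fix_ratio:.1%}")

# Flag epochs where the reception environment was weak so they can be
# marked as less reliable in later steps.
weak = epochs[(epochs["num_sats"] < 8) | (epochs["pdop"] > 3.0)]
print(f"Epochs with few satellites or poor DOP: {len(weak)} / {len(epochs)}")

# Check for recording gaps: intervals between consecutive epochs well beyond
# the nominal logging rate (assumed 1 s here) indicate interruptions.
gaps = epochs["time"].diff().dt.total_seconds()
print(f"Largest gap between epochs: {gaps.max():.1f} s")
```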


It is also important to visualize the entire dataset to confirm there are no clearly abnormal values, such as jumps in measured positions or outliers. Use 3D visualization software to display the trajectory of the measured positions and check for any unnatural points or jumps.
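
A simple numerical check can complement the visual inspection. The sketch below assumes the per-epoch positions have been exported to a hypothetical text file in a projected (metric) coordinate system, and flags implausibly large steps between consecutive epochs.

```python
# Sketch: flag implausible jumps in the measured trajectory.
import numpy as np

xyz = np.loadtxt("trajectory_xyz.txt")          # (n_epochs, 3), hypothetical file
step = np.linalg.norm(np.diff(xyz, axis=0), axis=1)

# With a 1 Hz logging rate and walking-speed acquisition, steps of several
# metres between consecutive epochs are almost certainly jumps or outliers.
jump_idx = np.where(step > 5.0)[0]
print(f"Suspected jumps at epoch indices: {jump_idx}")
```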


Confirming accurate temporal alignment between the previously recorded reference-station data and the rover data is also an important element of quality assessment. A time offset makes the correspondence of correction values inaccurate and degrades precision.


Step 2: Extraction of Fixed Solution Data and Policy for Handling Float Solutions

In RTK positioning, fixed-solution and float-solution states can alternate within a single session. Ideally only fixed-solution data would be used, but in practice float-solution data is often included as well.


In Step 2, we extract from the entire measurement the data that were recorded in a fixed solution state and assess their proportion and temporal distribution. A higher proportion of fixed solutions indicates higher measurement quality.


A processing policy must be decided for float solution data. Options include completely excluding it or including it in processing with a low weight. If excluded, areas of missing data will occur, but high reliability is ensured. If included with a low weight, data continuity is preserved, but the accuracy in those areas is relatively reduced.
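
The two policies can be expressed compactly in code. The sketch below reuses the hypothetical fix_quality column from Step 1; the weight assigned to float solutions is purely illustrative and should be tuned per project.

```python
# Sketch: split epochs into fixed and float solutions and apply a policy.
import pandas as pd

epochs = pd.read_csv("session_epochs.csv")       # hypothetical export, as in Step 1

fixed = epochs[epochs["fix_quality"] == 4].copy()    # 4 = RTK fixed (GGA convention)
floats = epochs[epochs["fix_quality"] == 5].copy()   # 5 = RTK float

# Policy A: exclude float solutions entirely (gaps, but high reliability).
clean = fixed.copy()

# Policy B: keep float solutions but down-weight them so later averaging or
# adjustment steps rely mostly on fixed-solution epochs.
fixed["weight"] = 1.0
floats["weight"] = 0.1                           # illustrative value
weighted = pd.concat([fixed, floats]).sort_values("time")
```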


If there are periods during which float solutions persist, it is important to analyze the reasons. Multiple causes may be considered, such as deterioration of the satellite reception environment, communication failures, or temporary malfunctions of the receiver. By identifying the cause, you can improve future measurement plans and determine whether it is necessary to repeat the measurement.


For measurements spanning multiple sessions, it is important to determine the proportion of fixed solutions for each session and evaluate the variability in quality between sessions. If the variability is large, it may be worth considering re-measurement of sessions with low quality.


Step 3: Coordinate System Transformation and Unification

Coordinates obtained from measurements are typically represented in a geographic coordinate system (latitude, longitude, altitude). If processing in a construction-site or project-specific coordinate system is required, coordinate system transformation is essential.


The accuracy of a coordinate system transformation depends on the transformation parameters. If the parameters are imprecise, systematic errors will appear in the transformed data, so it is important to determine them from multiple existing control points.


When determining transformation parameters, it is recommended to use multiple known points (points whose positions are known in both coordinate systems). Using at least 3 points, and preferably 5 or more known points, increases the reliability of the parameters. It is also desirable that the known points be distributed across the entire measurement area.


Verification of coordinate system transformations is also important. Using the determined parameters, transform the coordinates of known points and check how closely the results match the known coordinates. Usually, if the error is within a few centimeters, the parameters are considered usable.
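
For planar coordinates, a 4-parameter (2D Helmert) similarity transformation can be estimated from the known points by least squares and then verified against those same points. The sketch below uses illustrative coordinates; in practice the arrays would hold your actual control-point coordinates in both systems.

```python
# Sketch: estimate a 2D similarity (4-parameter Helmert) transformation from
# known points and check the residuals. Coordinates are illustrative (metres).
import numpy as np

src = np.array([[0.0, 0.0], [100.0, 0.0], [100.0, 80.0], [0.0, 80.0], [50.0, 40.0]])
dst = np.array([[500.012, 200.008], [599.985, 201.760], [598.574, 281.747],
                [498.601, 279.995], [549.293, 240.877]])

# Model: x' = a*x - b*y + tx,  y' = b*x + a*y + ty   (a = s*cos(t), b = s*sin(t))
n = len(src)
A = np.zeros((2 * n, 4))
A[0::2, 0] = src[:, 0]; A[0::2, 1] = -src[:, 1]; A[0::2, 2] = 1.0
A[1::2, 0] = src[:, 1]; A[1::2, 1] =  src[:, 0]; A[1::2, 3] = 1.0
L = dst.reshape(-1)

params, *_ = np.linalg.lstsq(A, L, rcond=None)
a, b, tx, ty = params

# Verify: transform the known points and inspect the residuals; values beyond
# a few centimetres suggest a problem with the control points or the model.
pred = np.column_stack([a * src[:, 0] - b * src[:, 1] + tx,
                        b * src[:, 0] + a * src[:, 1] + ty])
resid = np.linalg.norm(pred - dst, axis=1)
print("Residuals at known points (m):", np.round(resid, 3))
```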


When working with multiple coordinate systems, it is particularly important to record the transformation process. By documenting in detail which coordinate system was converted to which, which parameters were used, and so on, you will be able to provide a clear explanation later if questions arise about the reliability of the data.


Step 4: Integration of Multi-Session Data

In cases such as construction site management where measurements are taken across multiple dates and times, the data from each session must be integrated to form a consistent dataset as a whole.


Ensure that the data from each session are expressed in the same coordinate system. If different sessions used different coordinate systems, you should complete the conversion to a unified coordinate system in a prior step.


An effective method for detecting shifts between sessions is to make use of overlapping measurement areas. If multiple sessions measure a common region, comparing the data from that region lets you assess whether there is any shift between sessions and how large it is.
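
One way to quantify such a shift is a nearest-neighbour comparison over the overlap. The sketch below assumes both sessions are already in the same coordinate system and that the overlapping extracts are available as hypothetical XYZ text files; it uses SciPy's KD-tree.

```python
# Sketch: quantify the shift between two sessions in their overlapping area.
import numpy as np
from scipy.spatial import cKDTree

cloud_a = np.loadtxt("session_a_overlap.xyz")    # (n, 3), hypothetical extract
cloud_b = np.loadtxt("session_b_overlap.xyz")    # (m, 3), hypothetical extract

tree = cKDTree(cloud_a)
dist, idx = tree.query(cloud_b, k=1)

# Vector from each point in session B to its nearest neighbour in session A.
offsets = cloud_a[idx] - cloud_b
print("Mean offset (m):", offsets.mean(axis=0))  # dominant/systematic shift
print("Offset std  (m):", offsets.std(axis=0))   # random scatter
print("Median nearest-neighbour distance (m):", np.median(dist))
```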


If systematic offsets are detected between multiple sessions, it is important to analyze their causes. The causes can be varied, such as movement of reference points, errors in coordinate transformation parameters, or misconfiguration of measurement instruments. It may be necessary to identify and correct the causes and then reintegrate the data.


Integrated multi-session data provides the foundation for tracking changes over time. For example, it makes it possible to quantitatively assess construction progress, terrain changes, and structural deformations using data from multiple measurement sessions.


Step 5: Noise Removal and Filtering

The acquired RTK point cloud data may contain noise from various causes. The main causes of noise are receiver measurement errors, signal disturbance due to multipath, and temporary interruptions in signal reception.


The first step in noise removal is to detect and exclude points that contain clearly anomalous values. In three-dimensional space, values that are statistically distant from surrounding points can be regarded as outliers. It is possible to automatically detect outliers using statistical methods (for example, the 3-sigma criterion).
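
A minimal version of this check computes each point's mean distance to its k nearest neighbours and discards points beyond the 3-sigma threshold. The file name and neighbourhood size below are illustrative.

```python
# Sketch: statistical outlier removal using the 3-sigma criterion on each
# point's mean distance to its k nearest neighbours.
import numpy as np
from scipy.spatial import cKDTree

points = np.loadtxt("cloud.xyz")                 # (n, 3), hypothetical file
tree = cKDTree(points)

k = 8
dist, _ = tree.query(points, k=k + 1)            # first neighbour is the point itself
mean_dist = dist[:, 1:].mean(axis=1)

mu, sigma = mean_dist.mean(), mean_dist.std()
keep = mean_dist < mu + 3.0 * sigma              # 3-sigma criterion
filtered = points[keep]
print(f"Removed {len(points) - keep.sum()} of {len(points)} points as outliers")
```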


The next step is adaptive filtering tailored to the measurement environment. On construction sites, surrounding objects that are not the measurement target (temporary structures, construction machinery, etc.) may also be captured. To exclude these, algorithms that leverage prior knowledge of the target’s 3D shape and automatically remove regions that do not match the measurement objectives are effective.


Smoothing is also effective for noise reduction. By averaging the values of adjacent measurement points, random noise can be reduced. However, excessive smoothing can distort the true shape of the object being measured, so the strength of the smoothing should be set carefully.
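
A simple form of smoothing is neighbourhood averaging, where the neighbourhood size controls the smoothing strength. The sketch below is illustrative; larger k means stronger smoothing and a higher risk of flattening genuine detail.

```python
# Sketch: light smoothing by averaging each point with its nearest neighbours.
import numpy as np
from scipy.spatial import cKDTree

points = np.loadtxt("cloud_filtered.xyz")        # (n, 3), hypothetical file
tree = cKDTree(points)

k = 5                                            # small k = gentle smoothing
_, idx = tree.query(points, k=k)                 # neighbourhood includes the point itself
smoothed = points[idx].mean(axis=1)              # mean position of each neighbourhood
```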


It is also important to verify the effectiveness of noise removal. You should compare the data before and after processing to confirm that no unintended information has been lost. Fine details of complex terrain and structures are easily lost with overly aggressive noise removal, so a balanced processing approach is essential.


Step 6: Final Accuracy Verification and Quality Check

After completing the post-processing steps, it is essential to comprehensively verify whether the final data obtained meets the required accuracy specifications.


The main method for final accuracy verification is checking accuracy at known points. At multiple known points within the measurement area (such as existing survey control points), the processed data are checked to see how well they match the known coordinates. Generally, if the errors are on the order of a few centimeters, the accuracy can be considered sufficient.
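
A straightforward way to script this check is to compare each known control point with the nearest point in the processed cloud and report the per-point error and RMSE, as in the sketch below (file names are hypothetical).

```python
# Sketch: accuracy check of the processed cloud against known control points.
import numpy as np
from scipy.spatial import cKDTree

cloud = np.loadtxt("final_cloud.xyz")            # (n, 3), processed point cloud
controls = np.loadtxt("control_points.xyz")      # (m, 3), known coordinates

tree = cKDTree(cloud)
dist, _ = tree.query(controls, k=1)              # distance to nearest measured point

rmse = np.sqrt(np.mean(dist ** 2))
print("Error at each control point (m):", np.round(dist, 3))
print(f"RMSE over control points: {rmse:.3f} m")
```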


Measurement repeatability is also an item that should be verified. When the same location is measured multiple times, confirm the extent to which the results are reproducible. If repeatability is low, there may be a problem with the measurement system or the processing method.


Measurement uniformity is also an important indicator. By understanding whether accuracy is uniform across the entire measurement area or whether accuracy degrades in specific regions, you can assess the reliability of the data on a per-region basis. For regions with low reliability, it is worth considering the need for additional measurements.


Visual inspection of the data is also important. Using 3D visualization software, review the measurement data visually and subjectively assess whether there are any unnatural structures or shapes and whether the information required to meet the measurement objectives has been obtained.


Creating a quality-assurance report is also an important task at this stage. By recording in detail the quality of the measurement data, the processing that has been applied, and any potential limitations, data users will be able to make appropriate use of the information.


Step 7: Optimization of Output Formats and Delivery Preparation

Processed data must be converted into a final usable format. It is important to select the optimal output format according to the measurement objectives and the user's requirements.


Common output formats include the LAS format (the standard format for 3D point clouds), the XYZ format (a simple text format), and others. Additionally, the Shapefile format may be used for GIS analysis, and the DWG format for architectural and construction CAD.


Including metadata in the output file is recommended. By including information such as the measurement date and time, the coordinates of the reference point, the definition of the coordinate system, measurement accuracy, and the type of reference station used, data users will find it easier to understand the background of the data.
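
As one possible arrangement, the point cloud can be written to LAS with laspy (assuming laspy 2.x is available) and the metadata recorded in a sidecar JSON file; all metadata values below are illustrative.

```python
# Sketch: write the final cloud to LAS and record metadata in a sidecar JSON.
import json
import numpy as np
import laspy

points = np.loadtxt("final_cloud.xyz")           # (n, 3) in the delivery CRS

header = laspy.LasHeader(point_format=3, version="1.2")
header.offsets = points.min(axis=0)
header.scales = np.array([0.001, 0.001, 0.001])  # millimetre coordinate resolution

las = laspy.LasData(header)
las.x, las.y, las.z = points[:, 0], points[:, 1], points[:, 2]
las.write("deliverable.las")

metadata = {                                     # illustrative values only
    "survey_date": "2024-05-14",
    "coordinate_system": "EPSG:6677",
    "reference_station": "VRS (network RTK)",
    "fixed_solution_ratio": 0.97,
    "control_point_rmse_m": 0.021,
}
with open("deliverable_metadata.json", "w", encoding="utf-8") as f:
    json.dump(metadata, f, indent=2, ensure_ascii=False)
```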


Checks to ensure data integrity are also important. Verify that the output files were generated correctly, that the file sizes match the expected values, and that there are no issues with data loading.
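
A minimal integrity check might record a checksum and confirm that the delivered file reloads with the expected point count, as sketched below (the expected count and file name are illustrative, continuing from the LAS example above).

```python
# Sketch: basic pre-delivery integrity checks on the output file.
import hashlib
import laspy

with open("deliverable.las", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print("SHA-256:", digest)                        # record alongside the deliverable

expected_points = 1_250_000                      # illustrative expected count
reloaded = laspy.read("deliverable.las")
print("Points in file:", len(reloaded.points))
assert len(reloaded.points) == expected_points, "Point count mismatch"
```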


In the final pre-delivery check, it is important to reconfirm that all of the user's requirement specifications have been met. Based on a checklist, verify whether the measurements for the requested scope have been completed, whether the accuracy specifications are satisfied, and whether the file formats and naming conventions comply with the requirements.


Organizing delivery documentation is also important. Clearly compiling and providing the information necessary for users to appropriately utilize the data—such as explanations of measurement methods, descriptions of processing procedures, results of accuracy evaluations, and notes—is a professional approach.


Practical Post-Processing Workflow Examples

By examining examples of post-processing workflows from real projects, you can bridge the gap between theory and practice.


In a large-scale land development project, the following workflow is implemented for point cloud data obtained from weekly drone RTK surveys. First, automated quality assessment of the raw data: on the survey server, metrics such as the ratio of fixed solutions and DOP values are automatically calculated, and a quality report is generated. Second, comparison with the previous week’s data: differences from the design surface are automatically calculated to visualize construction progress. Third, reporting the results to the field: GIS-based 3D visualization presents the information in a format that is easy for non-technical personnel to understand. This automated workflow has greatly shortened the time from surveying to reporting.


In a bridge deformation monitoring project, RTK measurement data collected over multiple years are continuously integrated and analyzed. Newly measured data are compared with data from all previous sessions and stored in a form that allows detailed review. Statistical analysis reveals deformation trends, and a mechanism has been established to automatically generate alerts when significant changes occur.


These examples demonstrate that automating and integrating post-processing enables efficient management and utilization of even large, complex datasets.


Improving Efficiency and Automating Post-Processing

Post-processing RTK point cloud data takes a considerable amount of time when done manually. To improve efficiency, it is worth considering automating any steps that can be automated.


Processes for which algorithms are well established—such as coordinate system transformations, noise removal, and integration of multiple sessions—can be automated via batch processing. By using scripts or macros to process data from multiple projects in a single run with unified parameters, efficiency and consistency are improved.
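
The looping pattern itself is simple. The sketch below applies one unified parameter set to every session file in a project directory, with process_session standing in as a hypothetical wrapper around the steps described above.

```python
# Sketch: batch processing of all sessions in a project with unified parameters.
from pathlib import Path

PARAMS = {"outlier_sigma": 3.0, "smoothing_k": 5, "target_crs": "EPSG:6677"}

def process_session(raw_file: Path, out_dir: Path, params: dict) -> None:
    """Hypothetical wrapper around Steps 1-7 for a single session."""
    ...

project_dir = Path("projects/site_a")            # hypothetical layout
out_dir = project_dir / "processed"
out_dir.mkdir(parents=True, exist_ok=True)

for raw_file in sorted(project_dir.glob("sessions/*.xyz")):
    process_session(raw_file, out_dir, PARAMS)
    print(f"Processed {raw_file.name} with unified parameters")
```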


Automating quality inspection is also important. Tasks that lend themselves to automation include accuracy verification, anomaly detection, and metadata extraction. A final human review is still necessary, but preliminary automated checks can efficiently pinpoint problem areas.


However, care must be taken in the automation process to ensure that the unique requirements of individual projects are not overlooked. It is important for a human to identify in advance cases that cannot be handled by general algorithms or that require project-specific processing, and to reflect that determination in the script.


Common Post-Processing Issues and Solutions

In practical work, various problems can occur during the post-processing stage. A typical problem is coordinate shifts between multiple sessions. Possible causes include using different reference points between sessions or inaccurate coordinate system transformation parameters. As a remedy, you can perform a detailed inspection of the overlapping measurement areas between sessions and analyze the magnitude and spatial pattern of the shifts to identify and correct the cause.


Differential statistical analysis can determine whether the shifts occur systematically in a particular direction or are random. Systematic shifts are likely due to errors in the coordinate system transformation parameters, while random shifts are more likely due to measurement quality issues.
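
A rough way to make this distinction in code is to compare the mean offset vector with the scatter of the per-point offsets from the overlap analysis in Step 4; the threshold below is illustrative, not a fixed rule.

```python
# Sketch: classify an inter-session shift as systematic or random from the
# per-point nearest-neighbour offsets over the overlap area (see Step 4).
import numpy as np

offsets = np.load("overlap_offsets.npy")         # (n, 3), hypothetical saved array

mean_shift = offsets.mean(axis=0)
scatter = offsets.std(axis=0)

# Heuristic: a mean shift well above the scatter suggests a systematic offset.
if np.linalg.norm(mean_shift) > 2.0 * np.linalg.norm(scatter):
    print("Shift is predominantly systematic -> re-check transformation parameters")
else:
    print("Shift is predominantly random -> re-check measurement quality")
print("Mean shift (m):", np.round(mean_shift, 3), " scatter (m):", np.round(scatter, 3))
```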


If noise removal is excessive, important details of the measurement target may be lost. In such cases, it is effective to reprocess using weaker smoothing conditions, or to return to the raw data and process it with a different approach. By performing test processing with several different filtering parameters and visually comparing the results, you can determine the optimal parameters.


Inaccurate coordinate system transformation parameters are also a common problem. If the positions of the control points used for the transformation are inaccurate, the parameters themselves will be inaccurate. It is necessary to re-determine the parameters using multiple reliable control points. Ideally, the number of control points should be far greater than the degrees of freedom of the transformation parameters, and the optimal parameters should be determined by a statistical least-squares method.


The best way to prevent these issues is to perform rigorous checks and keep records at each stage of the process. To enable rapid identification of causes when problems occur, it is important to clearly document the input and output data for each step of the process. Leveraging a version control system to make the processing steps traceable is also an effective measure for large-scale projects.


Post-Processing Quality Control and Continuous Improvement

To continuously improve the post-processing workflow for RTK point clouds, it is important to establish a quality control framework and a review cycle. Not only technical procedures, but also organizational initiatives support the overall quality of the workflow.


Recording and analyzing processing logs provides the foundation for continuous improvement. By logging the input and output data at each step, the parameters used, processing times, and any errors and countermeasures, you can trace the history of processing afterward. By accumulating and analyzing this data across multiple projects, you can gain insights such as patterns of recurring problems and optimal parameter settings for specific environmental conditions.
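
One lightweight way to keep such records is to append one JSON line per processing step, as sketched below with illustrative field names.

```python
# Sketch: append one JSON record per processing step to a traceable log file.
import json
import time
from datetime import datetime, timezone
from typing import Optional

def log_step(logfile: str, step: str, inputs: list, outputs: list,
             params: dict, seconds: float, error: Optional[str] = None) -> None:
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "step": step,
        "inputs": inputs,
        "outputs": outputs,
        "params": params,
        "duration_s": round(seconds, 2),
        "error": error,
    }
    with open(logfile, "a", encoding="utf-8") as f:
        f.write(json.dumps(record, ensure_ascii=False) + "\n")

start = time.perf_counter()
# ... run a processing step here ...
log_step("processing_log.jsonl", "noise_removal",
         ["session_a.xyz"], ["session_a_filtered.xyz"],
         {"outlier_sigma": 3.0}, time.perf_counter() - start)
```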


Knowledge sharing within the team is also an important means of improvement. By documenting and sharing with the team the methods that succeeded in specific data processing tasks and the ways to handle special situations, individual experiential knowledge is accumulated as an organizational asset. Presenting case examples at regular review meetings and reflecting them in processing manuals is effective.


It is important to reaffirm that the high-precision reference point coordinates measured with LRTK (an iPhone-mounted GNSS high-precision positioning device) form the quality foundation for post-processing. Ensuring accuracy during the measurement phase raises the accuracy of each step in post-processing. Improvements to the post-processing workflow achieve maximum effect when the measurement and post-processing phases are optimized together as an integrated whole.


Next Steps:
Explore LRTK Products & Workflows

LRTK helps professionals capture absolute coordinates, create georeferenced point clouds, and streamline surveying and construction workflows. Explore the products below, or contact us for a demo, pricing, or implementation support.

LRTK supercharges field accuracy and efficiency

The LRTK series delivers high-precision GNSS positioning for construction, civil engineering, and surveying, enabling significant reductions in work time and major gains in productivity. It makes it easy to handle everything from design surveys and point-cloud scanning to AR, 3D construction, as-built management, and infrastructure inspection.

bottom of page