Article

Development of a Lightweight Single-Band Bathymetric LiDAR

by Guoqing Zhou 1,2,*, Xiang Zhou 1,2, Weihao Li 2, Dawei Zhao 2, Bo Song 2, Chao Xu 2, Haotian Zhang 2, Zhexian Liu 2, Jiasheng Xu 2, Gangchao Lin 2, Ronghua Deng 2, Haocheng Hu 2, Yizhi Tan 2, Jinchun Lin 2, Jiazhi Yang 2, Xueqin Nong 3, Chenyang Li 4, Yiqiang Zhao 1, Cheng Wang 5, Lieping Zhang 2 and Liping Zou 6
1 School of Microelectronics, Tianjin University, Tianjin 300072, China
2 Guangxi Key Laboratory of Spatial Information and Geomatics, Guilin University of Technology, Guilin 541004, China
3 The 34th Research Institute of China Electronics Technology Group Corporation, Guilin 541004, China
4 School of Marine Science and Technology, Tianjin University, Tianjin 300072, China
5 Institute of Aerospace Information Innovation, Chinese Academy of Sciences, Beijing 100864, China
6 Lide Information Technology Co., Ltd., Wuhan 430000, China
* Author to whom correspondence should be addressed.
Remote Sens. 2022, 14(22), 5880; https://0-doi-org.brum.beds.ac.uk/10.3390/rs14225880
Submission received: 30 September 2022 / Revised: 6 November 2022 / Accepted: 15 November 2022 / Published: 20 November 2022

Abstract

Traditional bathymetric LiDAR (light detection and ranging) systems onboard manned and/or unmanned airborne platforms cannot operate over narrow rivers in urban areas with high-rise buildings or in mountainous areas with high peaks. Therefore, this study presents a prototype of a lightweight bathymetric LiDAR onboard an unmanned shipborne vehicle (called “GQ-Cor 19”). The GQ-Cor 19 system primarily includes an emitting optical module, a receiving optical module, a control module, a detection module, a high-speed A/D sampling module, and a data-processing system. Because GQ-Cor 19 operates extremely close to the water surface, new technical challenges arise; in particular, the laser energy scattered from the water surface is strong enough to saturate the signal received by the photomultiplier tube detector. This study therefore presents several new technical solutions, including (1) an improved Bresenham algorithm, (2) a small and lightweight receiving optical system with a split-field method, and (3) a data acquisition module with a high-speed A/D collector. A series of experimental verifications demonstrates that this new generation of single-band LiDAR onboard an unmanned shipborne vehicle can swiftly measure underwater depth, with a maximum measurement depth of more than 25 m, a measurement accuracy better than 30 cm, and a weight of less than 12 kg.


1. Introduction

1.1. Background

Airborne bathymetric LiDAR (light detection and ranging), hereafter referred to as ABL, is typically mounted on a manned aircraft or helicopter and operates with dual bands (1064 nm and 532 nm). This type of ABL can be applied in terrestrial mapping of both land and water bottoms; however, it is expensive, bulky, and difficult to operate. Moreover, its operating conditions are demanding, requiring airplane runways and broad airspace [1,2,3,4]. Therefore, single-band ABL systems onboard unmanned aerial vehicles (UAVs) have been developed for the bathymetry of shallow islands and small lakes down to a water depth of about 20 m [5,6,7,8]. However, all the ABL systems mentioned above require broad airspace, which is difficult to secure over riverways, city ponds, lakes, and canals in urban areas with high-rise buildings, barrier lakes in narrow high mountains, cave lakes, and other unique regions. Therefore, the research group at the Guilin University of Technology, Guilin, Guangxi, China developed a prototype of a single-band bathymetric LiDAR onboard a shipborne vehicle [9,10,11,12,13,14,15,16], named “GQ-Cormorant 19” and shortened to “GQ-Cor 19”. The specifications that GQ-Cor 19 needs to achieve are listed in Table 1.

1.2. Related Work

Thus far, no single-band bathymetric LiDAR system onboard an unmanned shipborne vehicle has been made available on the global market, so the related work is based solely on airborne LiDAR. In early 1968, Hickman et al. [17] developed the first airborne bathymetry LiDAR (ABL) system. Subsequently, with increasing developments in electronics, photonics, optics, and information technology, numerous ABL systems were developed and marketed. For example, in 2005, Leica AHAB produced the Hawk Eye II ABL system and subsequently produced the Hawk Eye III ABL system, which is commonly used worldwide [18]. In 2011, RIEGL and Innsbruck University collaboratively developed the VQ-820-G ABL system. In 2015, RIEGL developed the VQ-880-G ABL system based on the VQ-820-G; the VQ-880-G improves on its predecessor in several respects, such as the detection capability for echo signals, the maximum water depth, and the measurement accuracy [19]. In 2020, Old Dominion University, Norfolk, Virginia, USA, in collaboration with the Bigelow Marine Science Laboratory, studied algae and other related indices measured with a LiDAR bathymetry system in the ocean; the results demonstrated that the accuracy of the LiDAR bathymetry measurements was three times higher than that of satellite remote sensing [20].
The ABL systems currently in common use on the international market include the U.S. military SHOALS 3000T system [21,22,23,24], the Swedish Leica AHAB Hawk Eye III system, the Canadian Optech CZMIL system [25,26,27], and the Austrian RIEGL VQ-880G [28,29], all of which are mature. For example, the SHOALS 3000T system can be easily installed on an aircraft and provides functions such as route planning and automatic data processing; it has therefore been used in several countries worldwide [30]. An overview of representative ABL systems is presented in Table 2.
Based on the above analysis, it can be concluded that the existing ABL systems are airborne, carried by aircraft or UAVs, and can operate only under certain conditions, such as the availability of a runway and broad airspace. Therefore, this study presents a prototype of a new generation of lightweight, single-band bathymetric LiDAR onboard an unmanned boat, called GQ-Cor 19.

2. GQ-Cor 19 Bathymetric LiDAR System

2.1. Architecture of GQ-Cor 19

The basic architecture of GQ-Cor 19 is shown in Figure 1. It comprises an emitting optical system, a receiving optical system, a detection module, a high-speed A/D sampling module, a POS system, a control system, and a LiDAR data-processing system.
The emitting optical system consists of a laser, a laser beam collimator, and a laser scanning system. The laser scanning system comprises motors, motor drivers, optical wedges, and the corresponding scanning algorithms. By changing the scanning angle and applying the Bresenham scanning algorithm, the system achieves different scanning methods, such as linear and circumferential modes, and it can adjust the scanning point density according to the ship speed and the actual demand to prevent sparse or missed points.
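For intuition on how the point density depends on platform speed and scan rate, the following sketch (a simplified illustration under our own assumptions, not the authors' firmware; the function and parameter names are ours) estimates the minimum scan-line rate needed to keep the along-track point spacing below a target value.

```c
#include <stdio.h>

/* Along-track spacing between scan lines for a moving platform is
 * spacing = ship_speed / scan_line_rate. To avoid sparse or missed
 * points, the spacing must stay below a target value, which fixes
 * the minimum scan-line rate. */
static double min_scan_rate_hz(double ship_speed_mps, double max_spacing_m)
{
    return ship_speed_mps / max_spacing_m;
}

int main(void)
{
    double speed = 3.0;      /* operating speed of the unmanned boat, m/s */
    double spacing = 0.10;   /* desired along-track point spacing, m (illustrative) */
    printf("required scan-line rate: %.1f Hz\n", min_scan_rate_hz(speed, spacing));
    return 0;
}
```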
The receiving optical system, a generalized Keplerian design, includes an objective lens set, a split-field lens, and an eyepiece set. The objective lens set uses an objective lens, a reflector, and a focusing lens, forming a Kirk three-piece objective lens set; the split-field mirror is an aluminum open-hole reflector; and the eyepiece set uses a convex lens, a filter, and a plano-convex lens to form a modified Känel eyepiece set.
The detection module uses a photomultiplier tube (PMT) detector, which receives the echo signals.
The high-speed A/D sampling module uses an FPGA and a high-speed A/D data acquisition card. The acquisition card digitizes the LiDAR echo signal and transfers the data to the FPGA through its interface, and the FPGA stores the data on a hard disk. The module achieves three-channel parallel sampling at a sampling rate of 2 GSPS (giga-samples per second) with a sampling accuracy of 16 bits.
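As a rough sanity check on the acquisition design (our own back-of-the-envelope calculation, not taken from the paper), the sketch below estimates how many samples at 2 GSPS fall inside the round-trip time of the 25 m design water column, assuming a refractive index of water of about 1.33.

```c
#include <stdio.h>

#define C_VACUUM 3.0e8   /* speed of light in vacuum, m/s */
#define N_WATER  1.33    /* approximate refractive index of water (assumption) */

int main(void)
{
    double depth_m = 25.0;   /* design maximum water depth */
    double fs_hz   = 2.0e9;  /* 2 GSPS sampling rate */

    /* Round-trip time of the laser pulse through the water column. */
    double t_round_trip = 2.0 * depth_m * N_WATER / C_VACUUM;

    /* Number of samples recorded per channel within that window. */
    double samples = t_round_trip * fs_hz;

    printf("round-trip time: %.1f ns, samples per echo window: %.0f\n",
           t_round_trip * 1e9, samples);
    return 0;
}
```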
The POS system, which consists of a global positioning system (GPS) receiver and an inertial measurement unit (IMU), provides high-precision dynamic position and attitude. The system synchronizes the POS data with the waveform data so that both share a common time base.
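A common way to realize such synchronization (a minimal sketch of one possible approach, not necessarily the authors' implementation; the structure and function names are ours) is to timestamp every laser pulse with the same clock used for the POS records and then linearly interpolate the POS solution to each pulse time.

```c
#include <stdio.h>

/* One POS record: GPS time plus position (attitude omitted for brevity). */
typedef struct { double t, lat, lon, alt; } pos_record;

/* Linearly interpolate the POS solution to the laser-pulse timestamp t.
 * Assumes records are sorted by time and t lies within the record span. */
static pos_record pos_at(const pos_record *rec, int n, double t)
{
    int i = 0;
    while (i < n - 2 && rec[i + 1].t < t) i++;
    double w = (t - rec[i].t) / (rec[i + 1].t - rec[i].t);
    pos_record out = { t,
        rec[i].lat + w * (rec[i + 1].lat - rec[i].lat),
        rec[i].lon + w * (rec[i + 1].lon - rec[i].lon),
        rec[i].alt + w * (rec[i + 1].alt - rec[i].alt) };
    return out;
}

int main(void)
{
    pos_record pos[] = { {100.00, 25.0000, 110.0000, 150.0},
                         {100.10, 25.0001, 110.0002, 150.1} };
    pos_record p = pos_at(pos, 2, 100.05);   /* pulse fired halfway between fixes */
    printf("lat %.5f lon %.5f alt %.2f\n", p.lat, p.lon, p.alt);
    return 0;
}
```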
The control system is implemented with an STM32 microcontroller, which controls the operation of the emitting optical system, the detection module, the high-speed A/D sampling module, and the POS system, and monitors the working status of each module while controlling it.

2.2. Design and Implementation of GQ-Cor 19 Emitting Optical System

2.2.1. Laser Selection

One of the most important criteria for selecting a laser is its ability to penetrate water; other factors include the stability of the laser beam, the beam quality, and the divergence angle. Considering the availability of lasers on the market, a solid-state laser, HQP, manufactured by Beijing Xinglin Ruiguang Company, was selected (see Figure 2), and its specifications are listed in Table 3.

2.2.2. Laser Beam Collimation

The laser beam emitted by the laser has a certain divergence angle, which creates a large spot after propagation [30,31,32,33]. Therefore, the emitted beam must be collimated to achieve a reasonable spot size. For this reason, a Galileo-type collimated beam expander consisting of a negative lens and a positive lens was designed and implemented (see Figure 3). The laser is first expanded by the negative lens and then transformed into a parallel beam by the positive lens. The magnification of the Galileo collimated beam expander is calculated as:
\( \dfrac{f_1}{f_2} = \dfrac{R_1}{R_2} = -\Gamma_0 \)  (1)
where \(f_1\) and \(f_2\) represent the focal lengths of the positive and negative lenses, respectively; \(R_1\) and \(R_2\) denote the radii of the positive and negative lenses, respectively; \(\Gamma_0\) indicates the magnification; and the negative sign indicates that the focus of the negative lens is a virtual focus.
For a thick lens (d >> 0), the effective focal length f and back focal length (BFL) are expressed as follows:
\( f = \dfrac{n r_1 r_2}{(n-1)\left[ n (r_2 - r_1) + (n-1) d \right]} \)  (2)

\( BFL = \dfrac{n r_1 r_2 - d\, r_2 (n-1)}{(n-1)\left[ n (r_2 - r_1) + (n-1) d \right]} \)  (3)
where \(r_1\) and \(r_2\) indicate the radii of curvature of the two surfaces of the thick lens, \(d_i\) (\(i\) = 1, 2) represents the thickness between the two vertices of lens \(i\) (Figure 3), and \(n\) denotes the refractive index of the lens. The parameters of the Galileo collimated beam expander are presented in Table 4.
With the parameters in Table 4 and the focal length of the first lens set to −5 mm (giving a beam-expansion ratio of \(\Gamma_0 = 5\)), we obtain:
\( r_1 = -r_2, \qquad \dfrac{1.5195\, r_1^2}{0.5195\,(2 \times 1.5195\, r_1 + 2.5975)} = 25, \qquad \dfrac{1.5195\, r_1^2}{0.5195\,(2 \times 1.5195\, r_1 + 3 \times 0.5195)} = -5 \)  (4)
Solving Equation (4) gives \(r_1 = 5.6654\) mm for the negative lens and \(r_2 = 25.0909\) mm for the positive lens. The distance \(D\) between the two lenses can be calculated as
\( D = BFL_2 - (d_1 + BFL_1) \)  (5)
where \(BFL_1\) and \(BFL_2\) indicate the back focal lengths of the negative and positive lenses, respectively, and \(d_1\) indicates the thickness of the negative lens. Substituting these parameters into Equations (3) and (5), we obtain \(BFL_1 = 3.8903\) mm, \(BFL_2 = 23.2969\) mm, and \(D = 16.4066\) mm.
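The following sketch (our own illustrative check, with the parameter values from Table 4 and the back focal lengths quoted above) implements Equations (2) and (5) directly, so a reader can reproduce the focal lengths and the lens separation; the signs chosen for the surface radii are our reading of the sign convention and are labeled as such in the comments.

```c
#include <stdio.h>

/* Effective focal length of a thick lens, Equation (2).
 * r1, r2 are the surface radii of curvature, d the center thickness,
 * n the refractive index. */
static double thick_lens_f(double n, double r1, double r2, double d)
{
    return n * r1 * r2 / ((n - 1.0) * (n * (r2 - r1) + (n - 1.0) * d));
}

int main(void)
{
    double n = 1.5195;  /* BK7 at 532 nm (Table 4) */

    /* Radii magnitudes from Section 2.2.2; the signs below assume symmetric
     * biconcave/biconvex lenses and are our interpretation, not the paper's. */
    double f_neg = thick_lens_f(n, -5.6654, 5.6654, 3.0);   /* expect about -5 mm */
    double f_pos = thick_lens_f(n, 25.0909, -25.0909, 5.0); /* expect about 25 mm */

    /* Lens separation from Equation (5), using the back focal lengths
     * reported in the text. */
    double BFL1 = 3.8903, BFL2 = 23.2969, d1 = 3.0;
    double D = BFL2 - (d1 + BFL1);

    printf("f_neg = %.3f mm, f_pos = %.3f mm, D = %.4f mm\n", f_neg, f_pos, D);
    return 0;
}
```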

2.2.3. Laser Scanning System

The laser scanning system consists of a motor, a motor driver, and an optical wedge (see Figure 4). The workflow is as follows. The initial scan center position, scan speed, scan type, and scan angle are input to the STM32 via Wi-Fi communication. The STM32 stores the position data and converts them into trigger control signals, and the control signals and protocol for the stepper motors are sent to the stepper motor drivers to control the swing angle, position, and speed of the X- and Y-axis mirrors. The STM32 sends the pulse signals generated by the Bresenham algorithm to the stepper motors at Positions 1 and 2: the stepper motor at Position 1 drives its reflector to swing by the corresponding angle in the Y direction so that the laser beam is translated a certain distance along the vertical (plumb) direction, and the stepper motor at Position 2 drives its reflector to swing by the corresponding angle in the X direction so that the laser beam is translated a certain distance along the horizontal direction. By controlling the swings of the reflectors at Positions 1 and 2 in the Y and X directions, respectively, a specific scanning trajectory is implemented. Angle compensation is fed back to the angle control module by an angle sensor to achieve high-precision control of the rotation speed, rotation direction, and rotation angle of the two mirrors.
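The paper does not give the details of the improved Bresenham algorithm. As a rough illustration of the underlying idea (coordinating two stepper axes so the beam follows a straight scan line without gaps), the sketch below uses the classic Bresenham line-stepping rule to decide, at each step, whether to pulse the X mirror, the Y mirror, or both; the functions step_x()/step_y() are placeholders we invented for the STM32 pulse outputs.

```c
#include <stdlib.h>
#include <stdio.h>

/* Placeholder pulse outputs: on the real controller these would toggle the
 * stepper-driver STEP pins for the X and Y mirrors. */
static void step_x(int dir) { printf("X%+d ", dir); }
static void step_y(int dir) { printf("Y%+d ", dir); }

/* Classic Bresenham line stepping between two positions given in motor steps.
 * At every iteration at least one axis advances by one step, so the beam
 * follows the straight segment without leaving gaps. */
static void scan_line(int x0, int y0, int x1, int y1)
{
    int dx = abs(x1 - x0), sx = (x0 < x1) ? 1 : -1;
    int dy = -abs(y1 - y0), sy = (y0 < y1) ? 1 : -1;
    int err = dx + dy;

    while (x0 != x1 || y0 != y1) {
        int e2 = 2 * err;
        if (e2 >= dy) { err += dy; x0 += sx; step_x(sx); }
        if (e2 <= dx) { err += dx; y0 += sy; step_y(sy); }
    }
    printf("\n");
}

int main(void)
{
    scan_line(0, 0, 12, 5);   /* one linear scan segment, in motor steps */
    return 0;
}
```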
In the GQ-Cor 19, the optical wedge was purchased from Changchun Boxin Company, China, with customized specifications: an inclination angle of 5°, a diameter of 120 mm, and a magnesium-aluminum alloy. The motor is a 60-series servo motor purchased from Shanghai Tongyi Automation Technology Limited Company, and the motor bracket was manufactured by Guilin Huatong Machinery Limited Company. The scanning control uses the improved Bresenham algorithm, which can switch between linear and circular scanning modes. Finally, the optical wedge was fixed on the motor rotor with glass glue, and the motor was fixed to the motor bracket and then installed on the GQ-Cor 19 (Figure 5).

2.3. Receiving Optical System

Traditional dual-band bathymetric LiDAR adopts a Cassegrain optical system, which is large and heavy and therefore not suitable for GQ-Cor 19. Moreover, GQ-Cor 19 operates much closer to the water surface, so the echo signal from the water surface is much stronger than that from the bottom, making it difficult to distinguish the two. Therefore, this study presents the design of a small, lightweight receiving optical system that meets these requirements.
The architecture of the receiving optical system is illustrated in Figure 6; it consists of an objective lens set, a split-field mirror, an APD eyepiece set, and a PMT eyepiece set. The objective lens set, APD eyepiece set, and PMT eyepiece set are each composed of three lenses, and the corresponding parameters are listed in Table 5. The receiving optical system is divided into a large field of view and a small field of view: the PMT eyepiece set receives the echo signal from the large field of view, and the APD eyepiece set receives the echo signal from the small field of view.
The receiving optical system is shown in Figure 7. With the parameters given in Table 5, the receiving optical system was designed as shown in Figure 8. The objective set uses a Kirk three-piece objective lens with a focal length of 505 mm, in which the first lens is a positive-focus ZF14 lens, the second a negative-focus F2HT lens, and the third a positive-focus ZF14 lens. The split-field mirror, with a diameter of 70 mm and a center opening of 1.5 mm, separates the water-surface echo signal from the bottom echo signal. The eyepiece set is a modified Känel eyepiece consisting of a first SF66 plano-convex lens, a second SF66 positive meniscus lens, and a third negative-focal-length doublet of LASF14A cemented to SF66; the central positive meniscus lens is used to expand the usable field of view of the eyepiece.
Using the abovementioned design, a total of 9 lenses, whose specifications are listed in Table 6, were purchased from Atmont Company. Finally, the 3D structural model and the implemented receiving optical system are displayed in Figure 8.

2.4. Design and Implementation of Control System

The control system is primarily responsible for initialization, functional control, signal synchronization, information acquisition, information fusion, information transmission, error detection, human-machine interaction, status display, and generating remote control commands for GQ-Cor 19. The architecture of the control system is displayed in Figure 9.
Upon receiving an operation command, the main controller controls a module to work independently or the entire system to work collaboratively, as required. In the normal working mode, the laser and the scanning motor operate according to the preset parameters. The laser generates a TTL trigger signal each time it fires a pulse; this trigger signal is sent to the main controller for counting and is then forwarded to the high-speed data acquisition recorder as the trigger for data conversion and storage. The main controller reads the POS data, including attitude, position, and time, as well as the motor rotation angle, the number of turns, and the laser trigger count. The main controller must also process the communication data and control commands from the LORA module.
Based on the abovementioned functional requirements and operating principles, the STM32F103ZET6 MCU from the ST company was selected as the main control chip. The circuit scheme is depicted in Figure 10, the top view of the PCB is shown in Figure 11a, and its implementation is shown in Figure 11b [34,35,36,37,38,39,40].
The architecture of the corresponding software for the main controller is illustrated in Figure 12. The software is written in C under the Keil MDK (Microcontroller Development Kit) environment and implements multi-task scheduling on the uC/OS-III embedded real-time operating system. The graphical interface is implemented with STemWin.
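As a simplified illustration of the multi-task structure described above (not the authors' actual firmware; the task and function names are invented for this sketch, and the real code runs as uC/OS-III tasks on the STM32 with real peripheral drivers), each subsystem can be thought of as a task-like function, while the laser TTL trigger is counted in an interrupt handler.

```c
#include <stdint.h>

/* Simplified stand-in for the GQ-Cor 19 main-controller firmware structure.
 * The real implementation runs as uC/OS-III tasks on an STM32F103; here the
 * tasks are plain functions called from a superloop, and the hardware-access
 * calls are placeholders shown only as comments. */

static volatile uint32_t laser_pulse_count = 0;

/* Interrupt on the laser TTL trigger: count pulses and re-arm the
 * high-speed acquisition for the next echo window. */
void EXTI_LaserTrigger_IRQHandler(void)
{
    laser_pulse_count++;
    /* acquisition_arm();  -- placeholder: start A/D capture of this echo */
}

static void task_pos(void)     { /* read GPS/IMU frame and timestamp it        */ }
static void task_scanner(void) { /* update Bresenham scan position, drive motors */ }
static void task_status(void)  { /* refresh status display, service LORA commands */ }

int main(void)
{
    /* system_init(); -- clocks, GPIO, UART/SPI, EXTI for the TTL trigger */
    for (;;) {
        task_pos();
        task_scanner();
        task_status();
    }
}
```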

2.5. Design and Implementation of High-Speed A/D Sampling System

The architecture of the FPGA-based high-speed A/D sampling system is shown in Figure 13; it consists of a high-speed data acquisition card, an FPGA chip, and a solid-state drive.
The system uses an ADC that supports the JESD204B protocol. Its major function is to acquire analog data, convert them into digital signals, and transfer them to the FPGA via the FMC interface. The data acquisition timing is implemented in the FPGA according to the logic control requirements of the selected A/D chip. An acquisition trigger signal is provided for multichannel A/D acquisition, and data acquisition and storage for each channel commence on command from the trigger signal.
Data storage and output are implemented with a FIFO module built from an IP core inside the FPGA. The FIFO module writes the data collected by each channel into the FIFO and records them in real time. The FPGA board is designed to support an SSD storage interface using the SATA III protocol and a 100 Gigabit Ethernet interface for subsequent data export from the SSD. The instructions for data storage and output are encapsulated into IP cores, and MicroBlaze soft-core calls are designed to implement these functions in an embedded development kit (EDK) environment.
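To make the data path concrete, the following C sketch models, at a conceptual level only, what the acquisition logic does per laser trigger: each trigger pushes a fixed-length window of samples from every channel into a FIFO, and a consumer drains the FIFO to storage. The actual implementation is FPGA logic with Xilinx FIFO IP cores and a MicroBlaze soft core; this is a software analogy with invented names and sizes.

```c
#include <stdint.h>
#include <string.h>
#include <stdio.h>

#define CHANNELS       3     /* parallel A/D channels                         */
#define WINDOW_SAMPLES 512   /* samples captured per channel per laser trigger (assumed) */
#define FIFO_DEPTH     64    /* echo windows buffered before storage (assumed) */

typedef struct {
    uint32_t pulse_id;                         /* laser trigger counter      */
    int16_t  data[CHANNELS][WINDOW_SAMPLES];   /* 16-bit samples per channel */
} echo_window;

static echo_window fifo[FIFO_DEPTH];
static int head = 0, tail = 0;

/* Producer: called once per laser trigger with the freshly captured window. */
static int fifo_push(const echo_window *w)
{
    int next = (head + 1) % FIFO_DEPTH;
    if (next == tail) return -1;          /* overflow: window dropped */
    fifo[head] = *w;
    head = next;
    return 0;
}

/* Consumer: drains buffered windows to the storage back end (stubbed). */
static void drain_to_storage(void)
{
    while (tail != head) {
        /* write_to_ssd(&fifo[tail]); -- placeholder for the SATA/SSD path */
        tail = (tail + 1) % FIFO_DEPTH;
    }
}

int main(void)
{
    echo_window w;
    memset(&w, 0, sizeof w);
    w.pulse_id = 1;
    fifo_push(&w);          /* one trigger's worth of data */
    drain_to_storage();
    printf("buffered windows remaining: %d\n", (head - tail + FIFO_DEPTH) % FIFO_DEPTH);
    return 0;
}
```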
Using the abovementioned design, the following hardware was selected: an ADI four-channel high-speed data acquisition card with a 2 GSPS sampling rate and 16-bit sampling accuracy; a Xilinx Kintex-7 family FPGA backplane (200 M, 1.3 GHz, 490 RAM resources); and four Samsung 870 EVO 2.5-inch plug-in SSDs (500 GB each, 530 MB/s read/write speed). The high-speed data acquisition card and the SSDs were then mounted on the FPGA backplane, plugged into a common interface board, and fixed in an aluminum enclosure with a cooling fan (Figure 14).

2.6. Design and Implementation of System Assembly

Following the implementation of each module, the next step involves assembling each part into an entire GQ-Cor 19 system on an optical platform (Figure 15). The assembly steps include fixing the chassis base, installing the receiving optical system, laser, PMT detectors, laser scanning system, control system, high-speed A/D sampling module, wiring, capping, communication module, and POS system. A wiring diagram for assembling the entire GQ-Cor 19 is illustrated in Figure 16. Finally, the entire GQ-Cor 19 system was assembled (see Figure 17).

2.7. Design and Implementation of Software for Data Processing

The architecture of the completely self-developed 3D point-cloud data post-processing software for the GQ-Cor 19 system is shown in Figure 18. The software contains the following modules: project engineering, point cloud generation, point cloud processing, tools, data visualization, and product creation. The primary functions of the project engineering module include project management and data addition and deletion. The main functions of point cloud generation include waveform decomposition, distance inversion, coordinate inversion, and 3D point cloud accuracy evaluation. The main functions of point cloud processing include strip adjustment, ground point extraction, and point cloud classification. The main functions of the tools module include data conversion, statistical analysis, measurement (coordinates, area, and angle), and coloring. The chief functions of data visualization include the addition and management of windows, visualization from different orientations, and the setting of point cloud categories. The key functions of product creation include DEM generation, DSM generation, and contour generation. A screenshot of the software is shown in Figure 19.

3. Validations through Indoor and Outdoor Experiments

3.1. Verification through Indoor Tank

For validation through indoor experiments, an acrylic tank was filled with water of different qualities. First, a certain amount of water was added to the tank, and a plane mirror was placed at a specific location on the tank to reflect the laser into the water; another plane mirror was placed at a specific location in the tank to reflect the laser toward the tank wall, and a black baffle was placed on the tank wall to simulate the bottom of the water. Different water depths were simulated by changing the position of the black baffle in the tank, yielding multiple water-depth data points (see Figure 20).
The experimental procedure was as follows. First, the equipment was placed in the corresponding position, the GQ-Cor 19 system was turned on, and an oscilloscope was used to collect the waveform data. Second, the position of the black baffle was changed after each acquisition, and the oscilloscope was used again to collect the waveform data. This operation was repeated three times, yielding waveform data at three different water depths, as shown in Figure 21a–c.
From the above waveforms, it is evident that the system performs well in laboratory experiments: it receives both the surface and bottom signals with high signal quality and no jitter noise, and the water depth can be derived from the time difference between the surface signal and the bottom signal. The waveforms at the three typical locations are good, which demonstrates that the GQ-Cor 19 system functions effectively.
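The depth retrieval mentioned here reduces to a simple relation: the time difference Δt between the surface and bottom echo peaks corresponds to a two-way path through the water, so depth ≈ c·Δt/(2·n_water). The sketch below is our own minimal illustration of that step (it ignores incidence-angle and refraction corrections a real retrieval would apply, and all names are ours): it locates the two strongest peaks in a sampled waveform and converts their separation into a water depth.

```c
#include <stdio.h>

#define C_VACUUM 3.0e8   /* m/s */
#define N_WATER  1.33    /* approximate refractive index of water (assumption) */

/* Convert the sample separation between the surface and bottom echo peaks
 * into a water depth, assuming a vertical (nadir) beam. */
static double depth_from_samples(int surf_idx, int bot_idx, double fs_hz)
{
    double dt = (bot_idx - surf_idx) / fs_hz;   /* two-way travel time in water */
    return C_VACUUM * dt / (2.0 * N_WATER);     /* one-way depth */
}

/* Naive peak picking: the strongest sample is taken as the surface return and
 * the strongest sample after a short blanking window as the bottom return. */
static double simple_depth(const double *wave, int n, double fs_hz, int blank)
{
    int surf = 0, bot = -1;
    for (int i = 1; i < n; i++)
        if (wave[i] > wave[surf]) surf = i;
    for (int i = surf + blank; i < n; i++)
        if (bot < 0 || wave[i] > wave[bot]) bot = i;
    return (bot < 0) ? -1.0 : depth_from_samples(surf, bot, fs_hz);
}

int main(void)
{
    /* Toy waveform sampled at 2 GSPS: surface peak at sample 50,
     * bottom peak at sample 227 (about 10 m of water). */
    double w[400] = {0};
    w[50] = 1.0;
    w[227] = 0.4;
    printf("estimated depth: %.2f m\n", simple_depth(w, 400, 2.0e9, 20));
    return 0;
}
```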

3.2. Verification through Indoor Swimming Pool

The maximum measurable water depth was then tested through verification in an indoor swimming pool (Figure 22). The experimental steps were as follows. First, the GQ-Cor 19 was fixed on a stand 2 m above the water surface, and an underwater reflector was placed to redirect the laser so that it propagated laterally in the pool. Second, a baffle was used as the underwater target, and different water depths were measured by changing the position of the baffle to verify the maximum bathymetric capability of the GQ-Cor 19 system. Third, once the system was turned on, the waveform data were collected using an oscilloscope, and the baffle position in the pool was changed several times to obtain a large amount of water-depth data. The experimental results at six water depths (six baffle positions) are shown in Figure 23.
As shown in Figure 23a–f, the first channel of the oscilloscope receives the main waveform signal, the second channel receives the laser synchronization signal, and the third channel receives the bottom signal. In this experiment, the PMT gating function was enabled: the laser synchronization signal was used as the gating signal, and the main waveform signal was used as the reference signal. Accordingly, the system received only the bottom signal and not the surface signal. The measured water depth was calculated from the time difference between the main and bottom waves. It can be concluded that the GQ-Cor 19 system can measure a maximum water depth of more than 26.7 m.

3.3. Validation through the Outdoor Water Well

To validate the impact of different reflectors at the bottom of the water on the echo signal strength and echo signal reception, the GQ-Cor 19 system was tested in an outdoor water well (Figure 24). As shown in Figure 24, the well extends 10 m above the ground and 10 m below the ground, giving a total water depth of 20 m. The basic experimental steps were as follows: water was injected into the well, the GQ-Cor 19 was set up above the well, and the waveform data were collected with the GQ-Cor 19, yielding the waveform shown in Figure 25.
As shown in Figure 25, channel 1 is the reference signal, channel 2 is the synchronization signal, and channel 3 is the bottom signal. Water depth was calculated according to the time interval between the reference signal and the bottom signal. It can be concluded that the GQ-Cor 19 system functioned effectively.

3.4. Validation through the Outdoor Pond

To verify the detection capability of GQ-Cor 19 in different water environments, the experiments were conducted in an outdoor pond at the campus of Guilin University of Technology (Figure 26). In this experiment, the polarization technique was used to weaken the surface signal strength while enhancing the bottom signal strength. The waveform is shown in Figure 27.
In Figure 27a–c, channel 1 is the signal received in the large FOV and channel 2 is the signal received in the small FOV. Owing to the limited water depth, the laser power setting was low and the small-FOV signal was weak. In channel 2, which does not use polarization compression, the water surface signal is clearly uncompressed, whereas in channel 1, which uses polarization compression, the water surface signal is clearly smaller than the bottom signal, indicating that the polarization technique is effective. In addition, from the different distances between the two peaks of the waveforms obtained at three different positions, it can be seen that the GQ-Cor 19 system works normally in the pond and functions effectively.

3.5. Validation through the Outdoor Reservoir

To test the GQ-Cor 19 system in outdoor environments, particularly to verify whether the GQ-Cor 19 system is feasible in a poorer water quality environment, experiments were conducted at Qingshitan Reservoir, located in Qingshitan Town, Lingchuan County, Guilin, Guangxi (Figure 28). The specific process of the experiment was as follows: (1) The GQ-Cor 19 system was placed onto the unmanned boat, (2) the unmanned boat was moved to the reservoir, (3) power was supplied to the GQ-Cor 19 system, (4) the unmanned boat course was remotely controlled in the reservoir, and (5) echo signals were collected and stored using an onboard high-speed A/D sampling module and storage chip. During the reservoir experiment, polarization and gating technologies were not used; therefore, the water surface signal easily saturated the PMT. The specific bottom waveform is illustrated in Figure 29.
As shown in Figure 29, channel 1 is a small FOV signal with a weak bottom signal and channel 2 is a large FOV with a strong bottom signal and an oversaturated surface signal. The water depth is calculated to be 1.69 m, and the actual water depth is 1.74 m. Using this verification, it can be concluded that the GQ-Cor 19 system can work accurately in the reservoir environment.
For illustration, the measured and actual distances in the different experimental environments are shown in Table 7.
Based on Table 7, the following conclusions can be drawn: (1) The GQ-Cor 19 system works effectively in a variety of water environments, and all of its parts function effectively (the average errors of the bathymetric measurements in the five environments are 0.06 m, 0.157 m, 0.19 m, 0.05 m, and 0.05 m, all of which meet the designed measurement accuracy); (2) polarization technology effectively reduces the strength of the water surface signal and thereby enhances the relative strength of the bottom signal, demonstrating that it can solve the problem of excessive PMT saturation by the water surface signal; (3) gating gain technology effectively avoids receiving the water surface signal, which enhances the bottom signal.
The GQ-Cor 19 system is aimed at bathymetry of rivers, lakes, and reservoirs with water depths of up to about 20 m, and it avoids the requirements of airborne bathymetric LiDAR for broad airspace and airplane runways. Moreover, compared with traditional bathymetric LiDAR systems, the GQ-Cor 19 system has many advantages: it is low-cost, low-power, lightweight, easy to carry, and flexible.

4. Discussion

The GQ-Cor 19 system can work effectively in a variety of water environments, and each of its parts functions effectively. The key parameters of the GQ-Cor 19 system are as follows. (1) Emitting optical system: wavelength 532 nm; peak power 100 kW; repetition frequency 1 kHz; pulse width 3 ns; material/refractive index of material at 532 nm: BK7/1.5195. (2) Receiving optical system: receiving angle of FOV 95 mrad; bandwidth ±1 nm; PMT objective group focal length 49.27 mm; APD objective group focal length 12.01 mm. (3) Control system: implemented with an STM32 microcontroller, which controls the emitting optical system, detection module, high-speed A/D sampling module, and POS system and monitors the working status of each module. (4) High-speed A/D sampling system: an ADI four-channel data acquisition card with a 2 GSPS sampling rate and 16-bit sampling accuracy; a Xilinx Kintex-7 family FPGA backplane (200 M, 1.3 GHz, 490 RAM resources); and four Samsung 870 EVO 2.5-inch plug-in SSDs (500 GB each, 530 MB/s read/write speed).
A limitation of the GQ-Cor 19 system is that, when the laser scanning trajectory relies solely on the reflecting optical wedges, the laser point density can be sparse or points can be missed, so high-density point cloud data for bathymetry cannot be obtained.

5. Results

The average errors of the bathymetric data from the indoor laboratory tank, indoor swimming pool, outdoor artificial well, outdoor pond, and outdoor reservoir were 0.06 m, 0.157 m, 0.19 m, 0.05 m, and 0.05 m, respectively, which demonstrates that the GQ-Cor 19 can work effectively in various environments with the pre-designed accuracy. The experimental results demonstrate that GQ-Cor 19 is effective, reliable, and capable of quickly measuring water depth, with a maximum measurement depth of more than 26.7 m, an accuracy better than 30 cm, a weight of less than 12 kg, a communication distance of 5 km, an operating speed of 3 m/s, and an endurance of more than 2 h.

6. Conclusions

This study presented a prototype of a lightweight bathymetric LiDAR system onboard an unmanned shipborne vehicle, named the GQ-Cor 19 system, which consists of hardware (e.g., the scanning optical system, receiving optical system, control system, high-speed A/D sampling system, and laser) and software. Each part was independently developed and finally assembled into a complete bathymetric LiDAR system. The main innovations of the GQ-Cor 19 system are as follows.
(1) An improved Bresenham algorithm was proposed to solve the problem of sparse or missed laser points that arises when the laser scanning trajectory relies solely on the reflecting optical wedges. In particular, the method supports two scanning modes, linear and circular, and can obtain high-density point cloud data for bathymetry.
(2) A small and lightweight receiving optical system with a split-field method was designed and implemented for the effective detection of echo signals from both the water surface and the bottom. A polarization technique was adopted to compress the excessively strong echo signals from the water surface.
(3) A data acquisition module with a high-speed A/D collector was used to collect the echo signals, transfer the collected data to the FPGA chip, implement data storage via the FPGA at a 2 GSPS sampling rate, and achieve 16-bit sampling accuracy for three-channel parallel sampling. Gating technology was adopted, with a single PMT corresponding to a single channel, to avoid PMT saturation when receiving the strong echo signal from the water surface.

Author Contributions

Methodology, G.L. and J.Y.; Validation, D.Z.; Formal analysis, H.H.; Investigation, C.X., J.X., R.D., Y.T., C.L. and L.Z. (Liping Zou); Resources, B.S., H.Z., Z.L., J.L., X.N., Y.Z., C.W. and L.Z. (Lieping Zhang); Writing—original draft, W.L.; Writing—review & editing, G.Z.; Supervision, X.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Guangxi Innovative Development Grand Program (the grant #: Guike AD19254002), the Guangxi Natural Science Foundation for Innovation Research Team (the grant #: 2019GXNSFGA245001), the BaGuiScholars program of Guangxi, and the Open Fund of Guangxi Key Laboratory of Spatial Information and Geomatics (the grant #: 19-050-11-13).

Data Availability Statement

Not applicable.

Acknowledgments

The authors would like to thank the reviewers for their constructive comments and suggestions.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Finkl, C.; Benedet, L.; Andrews, J.; Finld, C. Interpretation of Seabed Geomorphology Based on Spatial Analysis of High-Density Airborne Laser Bathymetry. J. Coast. Res. 2005, 21, 501–514. [Google Scholar] [CrossRef] [Green Version]
  2. Legleiter, C. Remote measurement of river morphology via fusion of LiDAR topography and spectrally based bathymetry. Earth Surf. Process. Landf. 2012, 37, 499–518. [Google Scholar] [CrossRef]
  3. Costa, B.; Battista, T.; Pittman, S. Comparative evaluation of airborne LiDAR and ship-based multibeam SoNAR bathymetry and intensity for mapping coral reef ecosystems. Remote Sens. Environ. 2009, 113, 1082–1100. [Google Scholar] [CrossRef]
  4. Westfeld, P.; Maas, H.; Richter, K.; Weiß, R. Analysis and correction of ocean wave pattern induced systematic coordinate errors in airborne LiDAR bathymetry. ISPRS J. Photogramm. Remote Sens. 2017, 128, 314–325. [Google Scholar] [CrossRef]
  5. Miller, H.; Cotterill, C.; Bradwell, T. Glacial and paraglacial history of the Troutbeck Valley, Cumbria, UK: Integrating airborne LiDAR, multibeam bathymetry, and geological field mapping. Proc. Geol. Assoc. 2014, 125, 31–40. [Google Scholar] [CrossRef] [Green Version]
  6. Mandlburger, G.; Hauer, C.; Wieser, M.; Pfeifer, N. Topo-Bathymetric LiDAR for Monitoring River Morphodynamics and Instream Habitats—A Case Study at the Pielach River. Remote Sens. 2015, 7, 6160–6195. [Google Scholar] [CrossRef] [Green Version]
  7. Kinzel, P.; Legleiter, C. sUAS-Based Remote Sensing of River Discharge Using Thermal Particle Image Velocimetry and Bathymetric Lidar. Remote Sens. 2019, 11, 2317. [Google Scholar] [CrossRef] [Green Version]
  8. Khrimenko, M.; Hopkinson, C. A Simplified End-User Approach to Lidar Very Shallow Water Bathymetric Correction. IEEE Geosci. Remote Sens. Lett. 2019, 17, 3–7. [Google Scholar] [CrossRef]
  9. Zhou, G. Urban High-Resolution Remote Sensing: Algorithms and Modelling; CRC Press: Boca Raton, FL, USA, 2020; ISBN 978-03-67-857509. [Google Scholar]
  10. Zhou, G.; Zhou, X. Seamless Fusion of LiDAR and Aerial Imagery for Building Extraction. IEEE Trans. Geosci. Remote Sens. 2014, 52, 7393–7407. [Google Scholar] [CrossRef]
  11. Zhou, G.; Zhou, X.; Yang, J.; Yue, T.; Nong, X.; Baysal, O. Flash LiDAR Sensor using Fiber Coupled APDs. IEEE Sens. J. 2015, 15, 4758–4768. [Google Scholar] [CrossRef]
  12. Zhou, G.; Li, C.; Zhang, D.; Liu, D.; Zhou, X.; Zhan, J. Overview of Underwater Transmission Characteristics of Oceanic LiDAR. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 8144–8159. [Google Scholar] [CrossRef]
  13. Zhou, G.; Long, S.; Xu, J.; Zhou, X.; Song, B.; Deng, R.; Wang, C. Comparison analysis of five waveform decomposition algorithms for the airborne LiDAR echo signal. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 7869–7880. [Google Scholar] [CrossRef]
  14. Zhou, G.; Deng, R.; Zhou, X.; Long, S.; Li, W.; Lin, G.; Li, X. Gaussian Inflection Point Selection for LiDAR Hidden Echo Signal Decomposition. IEEE Geosci. Remote Sens. Lett. 2022, 19, 1–5. [Google Scholar] [CrossRef]
  15. Zhou, G.; Li, W.; Zhou, X.; Tan, Y.; Lin, G.; Li, X.; Deng, R. An Innovative Echo Detection System with STM32 Gated and PMT Adjustable Gain for Airborne LiDAR. Int. J. Remote Sens. 2021, 42, 9187–9211. [Google Scholar] [CrossRef]
  16. Zhou, G.; Zhou, X.; Song, Y.; Xie, D.; Wang, L.; Yan, G.; Hu, M.; Liu, B.; Shang, W.; Gong, C.; et al. Design of supercontinuum laser hyperspectral light detection and ranging (LiDAR) (SCLaHS LiDAR). Int. J. Remote Sens. 2021, 42, 3731–3755. [Google Scholar] [CrossRef]
  17. Nayegandhi, A.; Brock, J.; Wright, C. Small-Footprint, waveform-resolving lidar estimation of submerged and sub-canopy topography in coastal environments. Int. J. Remote Sens. 2009, 30, 861–878. [Google Scholar] [CrossRef]
  18. Liu, Y.; Guo, K.; He, X.; Xu, W.; Feng, Y. Research Progress of Airborne Laser Bathymetry Technology. Geomat. Inf. Sci. Wuhan Univ. 2017, 42, 1185–1194. [Google Scholar] [CrossRef]
  19. Collin, A.; Ramambason, C.; Pastol, Y.; Casella, E.; Rovere, A.; Thiault, L.; Espiau, B.; Siu, G.; Lerouvreur, F.; Nakamura, N.; et al. Very high-resolution mapping of coral reef state using airborne bathymetric LiDAR surface-intensity and drone imagery. Int. J. Remote Sens. 2018, 39, 5676–5688. [Google Scholar] [CrossRef] [Green Version]
  20. Shen, X.; Liu, Z.; Zhou, Y.; Liu, Q.; Xu, P.; Mao, Z.; Liu, C.; Tang, L.; Ying, N.; Hu, M.; et al. Instrument response effects on the retrieval of oceanic lidar. Appl. Opt. 2020, 59, C21–C30. [Google Scholar] [CrossRef]
  21. Lucas, K.; Carter, G. Change in distribution and composition of vegetated habitats on Horn Island, Mississippi, northern Gulf of Mexico, in the initial five years following Hurricane Katrina. Geomorphology 2013, 199, 129–137. [Google Scholar] [CrossRef]
  22. Pe’eri, S.; Long, B. LIDAR Technology Applied in Coastal Studies and Management. J. Coast. Res. 2011, 62, 1–5. [Google Scholar] [CrossRef]
  23. Zhou, G.; Xie, M. GIS-based Three-dimensional Morphologic Analysis of Assateague Island National Seashore from LIDAR Series Datasets. J. Coast. Res. 2009, 25, 435–447. [Google Scholar] [CrossRef]
  24. Collin, A.; Archambault, P.; Long, B. Mapping the Shallow Water Seabed Habitat With the SHOALS. IEEE Trans. Geosci. Remote Sens. 2008, 46, 2947–2955. [Google Scholar] [CrossRef]
  25. Zhao, X.; Liang, G.; Liang, Y.; Zhao, J.; Zhou, F. Background noise reduction for airborne bathymetric full waveforms by creating trend models using optech czmil in the yellow sea of china. Appl. Opt. 2020, 59, 11019–11026. [Google Scholar] [CrossRef] [PubMed]
  26. Ding, K.; Li, Q.; Zhu, J.; Wang, C.; Xu, T. Evaluation of Airborne LiDAR Bathymetric Parameters on the Northern South China Sea Based on MODIS Data. Acta Geod. Cartogr. Sin. 2018, 47, 180. [Google Scholar] [CrossRef]
  27. Li, Q.; Wang, J.; Han, Y.; Gao, Z.; Zhang, Y.; Jin, D. Potential evaluation of China’s coastal airborne LiDAR bathymetry based on CZMIL Nova. Remote Sens. Land Resour. 2020, 32, 184–190. [Google Scholar] [CrossRef]
  28. Dee, S.; Cuttler, M.; O’Leary, M.; Hacker, J.; Browne, N. The complexity of calculating an accurate carbonate budge. Coral Reefs 2020, 39, 1525–1534. [Google Scholar] [CrossRef]
  29. Tonina, D.; Mckean, J.; Benjankar, R.; Yager, E.; Carmichael, R.; Chen, Q.; Carpenter, A.; Kelsey, L.G.; Edmondson, M.R. Evaluating the performance of topobathymetric lidar to support multi-dimensional flow modelling in a gravel-bed mountain stream. Earth Surf. Process. Landf. 2020, 45, 2850–2868. [Google Scholar] [CrossRef]
  30. Ding, K. Research on the Single-Wavelength Airborne LiDAR Bathymetry Full-Waveform Data Processing Algorithm and Its Application. Ph.D. Thesis, Information and Communication Engineering, Shenzhen University, Guangdong, China, 2018. [Google Scholar]
  31. Zhou, G.; Zhao, D.; Zhou, X.; Xu, C.; Liu, Z.; Wu, G.; Lin, J.; Zhang, H.; Yang, J.; Nong, X.; et al. An RF Amplifier Circuit for Enhancement of Echo Signal Detection in Bathymetric LiDAR. IEEE Sens. J. 2022, 22, 20612–20625. [Google Scholar] [CrossRef]
  32. Zhou, G.; Song, C.; Schickler, W. Urban 3D GIS from LIDAR and digital aerial images. Comput. Geosci. 2004, 30, 345–353. [Google Scholar] [CrossRef]
  33. Zhou, G.; Baysal, O.; Kaye, J. Concept design of future intelligent earth observing satellites. Int. J. Remote Sens. 2004, 25, 2667–2685. [Google Scholar] [CrossRef]
  34. Xu, Y.; Boone, C.; Pileggi, L. Metal-mask configurable RF front-end circuits. IEEE J. Solid-State Circuits 2004, 39, 1347–1351. [Google Scholar] [CrossRef]
  35. Harada, M.; Tsukahara, T.; Kodate, J.; Yamagishi, A.; Yamada, J. 2-GHz RF front-end circuits in CMOS/SIMOX operating at an extremely low voltage of 0.5 V. IEEE J. Solid-State Circuits 2000, 35, 2000–2004. [Google Scholar] [CrossRef]
  36. Nguyen, X.; Kim, H.; Lee, H. An Efficient Sampling Algorithm With a K-NN Expanding Operator for Depth Data Acquisition in a LiDAR System. IEEE Trans. Circuits Syst. Video Technol. 2020, 30, 4700–4714. [Google Scholar] [CrossRef]
  37. Zheng, H.; Ma, R.; Liu, M.; Zhu, Z. A Linear-Array Receiver Analog Front-End Circuit for Rotating Scanner LiDAR Application. IEEE Sens. J. 2019, 19, 5053–5061. [Google Scholar] [CrossRef]
  38. Liang, Y.; Xu, B.; Fei, Q.; Wu, W.; Shan, X.; Huang, K.; Zeng, H. Low-Timing-Jitter GHz-Gated InGaAs/InP Single-Photon Avalanche Photodiode for LIDAR. IEEE J. Sel. Top. Quantum Electron. 2022, 28, 3801807. [Google Scholar] [CrossRef]
  39. Hong, C.; Kim, S.H.; Kim, J.H.; Park, S.M. A Linear-Mode LiDAR Sensor Using a Multi-Channel CMOS Transimpedance Amplifier Array. IEEE Sens. J. 2018, 18, 7032–7040. [Google Scholar] [CrossRef]
  40. Kurtti, S.; Nissinen, J.; Kostamovaara, J. A Wide Dynamic Range CMOS Laser Radar Receiver with a Time-Domain Walk Error Compensation Scheme. IEEE Trans. Circuits Syst. I Regul. Pap. 2017, 64, 550–561. [Google Scholar] [CrossRef]
Figure 1. Architecture of GQ-Cor 19.
Figure 2. Laser purchased from market.
Figure 3. Galileo-type collimated beam expander.
Figure 4. Architecture of the scanning optical system.
Figure 5. Implementation of a scanning optical system.
Figure 6. Architecture of the receiving optical system.
Figure 7. The receiving optical system, where p represents the first lens of the objective group, q the second lens of the objective group, r the third lens of the objective group, s the split-field lens, t the first lens of the APD eyepiece group, u the second lens of the APD eyepiece group, v the third lens of the APD eyepiece group, w the first lens of the PMT eyepiece group, x the second lens of the PMT eyepiece group, and y the third lens of the PMT eyepiece group.
Figure 8. 3D model and implementation of the receiving optical system.
Figure 9. Architecture of the control components in the GQ-Cor 19 system.
Figure 10. Circuit scheme of the main control section.
Figure 11. (a) Top view of the PCB and (b) the physical diagram of the control board.
Figure 12. Main control part of the operating system task scheduling flowchart.
Figure 13. Architecture of the high-speed A/D sampling system for GQ-Cor 19.
Figure 14. High-speed A/D sampling module.
Figure 15. LiDAR system assembly.
Figure 16. Wiring diagram of the shipboard LiDAR system.
Figure 17. LiDAR overall physical diagram.
Figure 18. Architecture of LiDAR data post-processing.
Figure 19. A screenshot of the GQ-Cor 19 data post-processing software with Chinese interface.
Figure 20. The overall physical diagram of the laboratory validation.
Figure 21. (a–c) The waveforms shown by the oscilloscope in the indoor experiments; the first peak of each waveform is the surface signal and the second peak is the bottom signal.
Figure 22. The overall physical diagram of the pool experiment.
Figure 23. (a–f) Waveform diagram of the water pool experiment.
Figure 24. The experiments under the artificial water well.
Figure 25. The waveform from the experiment under the artificial water well.
Figure 26. The overall physical diagram of the water pond experiment.
Figure 27. The waveform received under the water pond environment.
Figure 28. The experiments under the reservoir environment.
Figure 29. Waveform under the reservoir experimental environment.
Table 1. Technical specifications of the GQ-Cor 19.

Parameters | Values
Laser wavelength | 532 nm
Weight | 12 kg
Size | 470 mm × 352 mm × 204 mm
Maximum measured water depth | >25 m
Measurement accuracy | 30 cm
Platform | Unmanned shipborne
Endurance time | 2 h
Full angle of beam divergence | ≤2 mrad
Scanning angle | 10°
Table 2. Five representative ABL systems.

LiDAR | Laser Frequency (kHz) | Minimum/Maximum Detection Depth (m) | Bathymetric Accuracy (m) | Flight Height (m) | Carrier | Country
SHOALS 3000T [21,22,23,24] | 3 | 0.2/50 | 0.25 | 300–400 | Aircraft | Canada
Hawk Eye III [25,26] | Shallow water: 35; Deep water: 10 | 0.4/Shallow water: 15; Deep water: 50 | 0.3 | 400–600 | Aircraft | Sweden
CZMIL [27] | 10 | 0.15/50 | 0.3 | 400–1000 | Aircraft | Canada
VQ-880G [28] | 550 | −/1.5 Secchi | 0.3 | 600 | Aircraft | Austria
LADS MK-III [29] | 1.5 | 0.4/80 | 0.2 | 360–900 | Aircraft | Australia
Table 3. Specifications of the laser selected from the market.

Parameters | Values
Wavelength | 532 nm
Peak power | 100 kW
Pulse width | 3 ns
Repetition frequency | 1 kHz
Divergence angle | 0.2 mrad
Table 4. Galileo collimated beam expander parameters.

Parameters | Values
Thickness of the first lens | 3 mm
Thickness of second lens | 5 mm
Effective focal length of second lens | 25 mm
Material/refractive index of material (532 nm) | BK7/1.5195
Table 5. The specifications of the receiving optical system in GQ-Cor 19.

Parameters | Values
Receiving angle of FOV | 95 mrad
Entrance pupil diameter | 82 mm
Exit pupil diameter | 8 mm
Magnification | 10.25× and 42×
Bandwidth | ±1 nm
Main aperture | 80 mm
Eyepiece group focal length | 505 mm
PMT objective group focal length | 49.27 mm
APD objective group focal length | 12.01 mm
Split-field mirror diameter | 70 mm
Eyepiece group aperture | 64 mm
Table 6. Lens selection.

Symbol | Parameters | Position
p | Customized, K9 glass | Objective lens set, lens 1
q | Φ12.5/thickness 2.3, ordinary aluminum film + protection | Objective lens set, lens 2
r | Transmittance 95%, bandwidth 10 nm | Objective lens set, lens 3
s | Customized, silver-plated film | Split-field mirror
t | Diameter Φ25, focal length 25, back focus 20.28, visible-light enhancement coating | APD eyepiece set, lens 1
u | Diameter Φ10 mm, center thickness 2.6 mm, effective pore size 19 | APD eyepiece set, lens 2
v | Transmittance 95%, bandwidth 10 nm | APD eyepiece set, lens 3
w | Diameter Φ20, focal length 25, back focus 22.76, visible-light enhancement coating | PMT eyepiece set, lens 1
x | Flat convex mirror; diameter Φ20 mm, center thickness 4.6 mm, effective pore size 19 | PMT eyepiece set, lens 2
y | Filter; transmittance 95%, bandwidth 10 nm | PMT eyepiece set, lens 3
Table 7. The waveform data in different experimental environments.

Experimental Environment | Measuring Distance (m) | Actual Distance (m) | Error (m)
Indoor tank | 0.768 | 0.700 | 0.068
 | 1.661 | 1.600 | 0.061
 | 2.248 | 2.300 | 0.052
Average error (m) |  |  | 0.060
Indoor swimming pool | 10.15 | 9.88 | 0.27
 | 13.42 | 13.24 | 0.18
 | 14.96 | 15.22 | 0.26
 | 16.66 | 16.50 | 0.16
 | 19.72 | 19.75 | 0.03
 | 25.62 | 25.58 | 0.04
Average error (m) |  |  | 0.157
Outdoor water wells | 15.19 | 15 | 0.19
Outdoor water pond | 0.70 | 0.75 | 0.05
 | 0.88 | 0.90 | 0.02
 | 0.92 | 1.00 | 0.08
Average error (m) |  |  | 0.05
Outdoor reservoir | 1.69 | 1.74 | 0.05
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

