Article

Ground-Based 3D Radar Imaging of Trees Using a 2D Synthetic Aperture

Department of Electrical and Computer Engineering, Brigham Young University, Provo, UT 84602, USA
*
Author to whom correspondence should be addressed.
Submission received: 6 December 2016 / Revised: 2 January 2017 / Accepted: 13 January 2017 / Published: 23 January 2017
(This article belongs to the Special Issue Radio and Radar Signal Processing)

Abstract:
Motivated by the desire to gain insight into the details of conventional airborne synthetic aperture radar (SAR) imaging of trees, a ground-based SAR system designed for short-range three-dimensional (3D) radar imaging is developed using a two-dimensional (2D) synthetic aperture. The heart of the system is a compact linear frequency modulation-continuous wave (LFM-CW) radar, a custom two-dimensional scan mechanism, and a three-dimensional time-domain backprojection algorithm that generates three-dimensional backscatter images at an over-sampled resolution of 10 cm by 10 cm by 10 cm. The backprojection algorithm is formulated directly in spatial coordinates. A new method for estimating and compensating for signal attenuation within the canopy is used that exploits the backprojection image formation approach. Several three-dimensional C-band backscatter images of different individual trees of multiple species are generated from data collected for trees both in isolation and near buildings. The trees imaged in this study are about 10 m in height. The transformation of the three-dimensional images to airborne SAR images is described and a sample result provided.


1. Introduction

Airborne synthetic aperture radar (SAR) systems have been widely used for imaging natural terrain and features [1]. Some natural features of interest are isolated trees, for which extraction of the height, location, and shape of the tree foliage is desired. Due to the nature of SAR imaging, the height of the tree causes the SAR image of the tree to “lay over” so that the top of the tree appears closer to the ground track of the radar than does the bottom of the tree (Figure 1 and Figure 2). In this SAR image, scattering from the leaves creates a bright feature on the side of the tree facing the radar, while the rest of the tree is in the radar shadow. The precise shape of the tree image depends on the shape and height of the tree (see Figure 2) as well as the foliage penetration. However, this simple picture does not convey the complexity of the scattering from the tree structure and leaves; compare the radar images of the trees in Figure 1 and Figure 3. Further complicating the situation are the (generally) relatively coarse resolution of the radar system and changes in the radar frequency and incidence angle. As radars increase in resolution, there is interest in better understanding the interaction of the radar signal with tree foliage and how trees appear in SAR images.
In an effort to better understand how trees appear in SAR images and to collect tree scattering and size data, a ground-based, short-range radar imaging system has been constructed that uses a two-dimensional (2D) synthetic aperture to create high resolution three-dimensional (3D) images of radar scattering from trees. Using 3D backscatter images collected at high resolution by the ground radar, the projected 2D radar image of the tree in a conventional SAR can be inferred. Operating at C-band, our ground-based radar system incorporates a 2D scanner with a scanning area that is 1.7 m by 1.7 m. The core radio frequency (RF) system is a C-band microASAR [2] that uses linear frequency modulation-continuous wave (LFM-CW) and records dechirped data to a compact flash card.
After background theory, this paper briefly describes the radar system and scanner, discusses the processing algorithm, and provides images collected of several trees. A discussion on transforming the 3D images into the expected view from a conventionally collected SAR image is then provided.

2. Methods

Traditionally, SAR imaging is conducted from moving aircraft or spacecraft. However, the key to synthetic aperture imaging is the collection of data with different antenna displacements and signal processing to construct an image from the displaced antenna radar observations. The motion creating the antenna displacement need not be continuous as evident in the derivation of the commonly used “stop and hop” approximation in radar signal modeling [1]. Radar platform motion during LFM-CW operation has some effect but can be compensated for [3,4]. The data must be collected with antenna displacements spaced no farther apart than one-half of the real-aperture antenna size in order to avoid grating lobes and aliasing. The synthetic aperture processing enables image formation at much higher resolution than possible with a real-aperture antenna [1].
Conventional SAR with its one-dimensional (1D) synthetic aperture generates 2D images of the surface backscatter. Interferometric SAR uses an additional antenna that is vertically and/or horizontally displaced by a baseline to enable estimation of the surface vertical displacement [1]. Since the SAR signal penetrates vegetation canopies, multiple baselines can enable tomographic reconstruction of scattering within the vegetation canopy [5,6,7,8,9,10,11,12]. Multi-baseline SAR is a special case of a 2D synthetic aperture where the elevation dimension is sparsely sampled [10].
A 2D synthetic aperture is formed by coherently collecting radar echoes over a 2D grid of antenna displacements where the antenna is displaced in both azimuth and elevation. With a uniform rectangular sampling grid (that is, observations using real-aperture antennas are collected at each point of an evenly spaced grid, i.e., a fully sampled aperture) within a rectangular 2D synthetic aperture, some of the complexities that arise in multi-baseline SAR are ameliorated. While fully sampling a 2D synthetic aperture in space is probably impractical from orbit, ground-based 2D SAR is both practical and useful for vegetation studies, and is the subject of this paper.
For a full 2D synthetic aperture, the synthetic azimuth beamwidth $\phi_A$ and elevation beamwidth $\phi_E$ are given by

$$\phi_A = \lambda / m_A, \qquad (1)$$
$$\phi_E = \lambda / m_E, \qquad (2)$$
where $m_A$ and $m_E$ are the projected synthetic aperture dimensions in the azimuth and elevation directions and $\lambda$ is the radar wavelength. However, it should be noted that the synthetic aperture length can be arbitrarily truncated, with the penalty of loss in synthetic aperture resolution. For a truncated synthetic aperture, the effective imaging resolution in azimuth and elevation is given by the slant range $R$ times the beamwidth, i.e.,

$$r_A = R\,\phi_A = R\lambda / m_A, \qquad (3)$$
$$r_E = R\,\phi_E = R\lambda / m_E. \qquad (4)$$
These expressions differ from SAR with a full synthetic aperture where resolution is independent of range, i.e., the azimuth resolution is r = l/2 where l is the real-aperture antenna size [1].
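As a concrete check, the beamwidth and resolution expressions above can be evaluated for parameters similar to those reported later in this paper (C-band at 5.4 GHz, a 1.7 m aperture, and a 15 m standoff). The function below is only an illustrative sketch, not part of the authors' processing chain:

```python
# Illustrative evaluation of phi = lambda/m (Eqs. 1-2) and
# r = R*lambda/m (Eqs. 3-4) for a truncated synthetic aperture.
C = 299_792_458.0  # speed of light, m/s

def synthetic_resolution(freq_hz, aperture_m, slant_range_m):
    """Return (beamwidth_rad, resolution_m) for a truncated aperture."""
    wavelength = C / freq_hz
    beamwidth = wavelength / aperture_m          # phi = lambda / m
    resolution = slant_range_m * beamwidth       # r = R * phi
    return beamwidth, resolution

phi, r = synthetic_resolution(5.4e9, 1.7, 15.0)
print(f"beamwidth = {phi*1e3:.1f} mrad, resolution = {r*100:.0f} cm")
```

At a 15 m standoff these formulas give roughly 0.5 m azimuth/elevation resolution, which the 10 cm voxel spacing used later in the paper oversamples.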
From the 2D synthetic aperture radar data, a 3D matched filter that accounts for range walk in the azimuth and elevation directions can be employed to generate a backscatter image of the target. Having both azimuth and elevation displacements, as well as range resolution, means that the resulting backscatter image can be created in 3D. The picture elements of a 3D image are termed volume elements or “voxels” in contrast to the picture elements (“pixels”) of 2D images. With a fully filled 2D synthetic aperture, phase unwrapping is not required nor is a target model needed.
While a variety of approximate methods (e.g., modified range-Doppler filtering) can be used to create the voxel images, for this study, we prefer the use of true 3D matched filtering via time-domain backprojection. The 3D matched filtering is aided by formulating the SAR backprojection algorithm in strictly spatial variables [4,13]. Ignoring apodization windows and the $1/R^4$ slant range dependence, 3D backprojection formulated in spatial variables can be expressed as [4]

$$v(x, y, z) = \sum_{n \in L_A} \sum_{m \in L_E} p_{n,m}(d_{n,m}) \exp\!\left( i\,\frac{4\pi}{\lambda}\, d_{n,m} \right), \qquad (5)$$

where $i = \sqrt{-1}$, $v(x, y, z)$ is the complex-valued backscatter at the voxel centered at $(x, y, z)$, and $p_{n,m}(d_{n,m})$ is the interpolated range-compressed radar signal for the $(n, m)$th synthetic aperture sampling location at the slant range distance $d_{n,m}$, computed as

$$d_{n,m} = \sqrt{(x - x_{n,m})^2 + (y - y_{n,m})^2 + (z - z_{n,m})^2}, \qquad (6)$$

where $(x_{n,m}, y_{n,m}, z_{n,m})$ is the location of the radar for the $(n, m)$th sampling location. Backprojection implicitly compensates for range migration, which is particularly significant for short-range SAR. The voxels are evaluated at a 3D grid of $(x, y, z)$ values to form a 3D backscatter image. The voxel spacing is typically set to be consistent with the resolution values given in Equations (3) and (4) and with the range resolution associated with the range pulse compression. In this paper, the voxel size is set to be 10 cm in azimuth by 10 cm in elevation by 10 cm in range. The range dimension oversamples the range resolution, although, in this application, the range migration, coupled with the short-range operation, results in an improved effective range resolution in the voxel image over that obtained by range compression alone. The apodization window trades sidelobe suppression for mainlobe resolution. In this work, simple 1D Hamming apodization windows are used for range compression, as well as for azimuth and elevation compression.
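The backprojection sum of Equations (5) and (6) can be sketched for a single voxel as follows. This is an illustrative reconstruction under stated assumptions, not the authors' Matlab implementation: the nearest-bin lookup stands in for the interpolation of the range-compressed signal, and all names and data shapes are assumptions.

```python
import numpy as np

def backproject_voxel(x, y, z, ant_pos, rc_data, rng_axis, wavelength):
    """Sum p_{n,m}(d_{n,m}) * exp(i*4*pi*d_{n,m}/lambda) over all
    aperture positions for one voxel at (x, y, z) -- Eqs. (5)-(6).

    ant_pos  : (N, 3) antenna positions (x_nm, y_nm, z_nm)
    rc_data  : (N, R) complex range-compressed samples per position
    rng_axis : (R,) slant range of each range bin, metres (sorted)
    """
    # Eq. (6): voxel-to-antenna distance for every aperture position.
    d = np.sqrt((x - ant_pos[:, 0])**2 +
                (y - ant_pos[:, 1])**2 +
                (z - ant_pos[:, 2])**2)
    # Nearest-bin lookup as a simple stand-in for interpolation.
    bins = np.clip(np.searchsorted(rng_axis, d), 0, len(rng_axis) - 1)
    p = rc_data[np.arange(len(d)), bins]
    # Eq. (5): phase-compensated coherent sum.
    return np.sum(p * np.exp(1j * 4 * np.pi * d / wavelength))
```

For a simulated point target whose echoes carry phase $\exp(-i4\pi d/\lambda)$, the compensated sum at the target voxel adds coherently to the number of aperture positions.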
Note that the simple time-domain backprojection image formation algorithm in Equation (5) does not account for the attenuation of the signal through the vegetation. Accounting for this attenuation is required to estimate the vegetation backscatter at each voxel.
At C-band, the tree vegetation (leaves, twigs, and branches) can be treated as a 3D volume scattering region [14,15]. The radar signal penetrates the vegetation, and some of it is scattered back to the radar. Additional signal is scattered in other directions and is lost to the radar. This scattering loss attenuates the signal as it passes through the vegetation [1]. Nevertheless, at C-band the attenuation through the vegetation of a single tree is low enough that backscatter from vegetation elements at the back side of the tree can still be observed; this is not the case at higher radar frequencies.
As illustrated in Figure 4, the two-way attenuation of the signal reflecting from backscattering elements at a given voxel location depends on the characteristics of the intervening vegetation that vary throughout the tree volume. For simplicity in this paper, the vegetation path is treated as composed of discrete volume elements, and the effects of multipath within the volume scatterer are neglected when they result in path length extension longer than a single radar range bin.
Since the radar does not resolve individual leaves and branches, within each voxel the volume scattering is modeled as a backscattering cross-section $\sigma$ and an extinction cross-section $\kappa_e$. The cross-sections are assumed to be azimuthally fixed [14] over the angles associated with the synthetic aperture observations. The attenuation $L(z)$ along the signal propagation path is given by

$$L(z) = \exp\!\left\{ -\int_0^z \kappa_e(\tau)\, d\tau \right\}. \qquad (7)$$
For 3D imaging consistent with the backprojection algorithm given in Equation (5), the polar form of the attenuation equation in Equation (7) is converted to a rectangular discrete formulation

$$L(x, y, z) = \exp\!\left\{ -\frac{1}{\cos(\tan^{-1} x')\,\cos(\tan^{-1} y')} \sum_{z'=0}^{z} \kappa_e(x' r_E,\; y' r_A,\; z') \right\}, \qquad (8)$$
where $x' = x/z'$ and $y' = y/z'$. For convenience in this expression, $\kappa_e$ has been scaled by the voxel volume. If the assumption is made that $\kappa_e$ is proportional to $\sigma$, i.e., $\kappa_e = A\sigma$, a rough approximation for the attenuation is

$$L(x, y, z) \approx \exp\!\left\{ -\frac{A}{\cos(\tan^{-1} x')\,\cos(\tan^{-1} y')} \sum_{z'=0}^{z} v(x' r_E,\; y' r_A,\; z') \right\}. \qquad (9)$$
In this paper, the constant $A$ is estimated from the actual data. Once the attenuation is known, the attenuation-corrected image $I(x, y, z)$ is computed as

$$I(x, y, z) = \frac{v(x, y, z)}{L^2(x, y, z)} = v(x, y, z)\, \exp\!\left\{ \frac{2A}{\cos(\tan^{-1} x')\,\cos(\tan^{-1} y')} \sum_{z'=0}^{z} v(x' r_E,\; y' r_A,\; z') \right\}. \qquad (10)$$
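The two-way correction of Equation (10) can be sketched for the boresight case, where the obliquity factors $\cos(\tan^{-1} x')$ and $\cos(\tan^{-1} y')$ are unity. This sketch makes two labeled assumptions: voxel magnitudes stand in for the complex image, and the sum over intervening voxels is taken as exclusive of the current voxel.

```python
import numpy as np

def attenuation_correct(v, A):
    """Apply the two-way attenuation correction of Eq. (10) along the
    range axis, for the boresight case (obliquity factors = 1).

    v : (..., Z) real backscatter magnitudes, front of the tree first
    A : scalar relating extinction to backscatter, kappa_e = A * sigma
    """
    # Cumulative backscatter in front of each voxel (exclusive sum),
    # standing in for the sum over z' in Eqs. (9)-(10).
    cum = np.cumsum(v, axis=-1) - v
    L = np.exp(-A * cum)          # one-way attenuation, Eq. (9)
    return v / L**2               # I = v / L^2, Eq. (10)
```

As expected, voxels deeper in the canopy receive a larger upward correction because more vegetation lies between them and the radar.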
In practice, the image is computed from the front of the tree to the back. The attenuation term is squared due to the two-way propagation of the signal. Once voxel images of I and L are available, this model can be reversed to synthesize the appearance of the tree as viewed from other incidence angles, for example, as observed from an airborne SAR. In this case, L(x, y, z) is computed using Equation (9) and I(x, y, z) using Equation (10). The synthesized 2D SAR image S(x′′, y′′) is computed by summing the integrated path-attenuated backscatter for the voxels at the same slant range and azimuth. Ignoring the $1/R^4$ slant range dependence and a constant scale factor, the projected backscatter image can be approximately computed as
$$S(x'', y'') = \sum_{(x, y, z)\, \in\, r(x'', y'')} I(x, y, z)\, \exp\!\left\{ -\frac{1}{\cos(\tan^{-1} x'')\,\cos(\tan^{-1} y'')} \sum_{z'=0}^{z} L(x'' r_E,\; y'' r_A,\; z') \right\}, \qquad (11)$$

where $r(x'', y'')$ denotes the set of voxels at the appropriate azimuth and a fixed slant range.
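The core of the projection in Equation (11), i.e., binning voxels by slant range and azimuth as seen from a synthetic airborne viewpoint and summing them, can be sketched as follows. The re-attenuation term along the airborne viewing path is omitted here for brevity, and all names and the binning scheme are illustrative assumptions:

```python
import numpy as np

def project_to_sar(I, voxel_xyz, radar_pos, rng_res, az_res):
    """Project corrected voxels I onto a 2D slant-range/azimuth grid by
    summing voxels that fall in the same (range, azimuth) bin -- the
    simplest form of Eq. (11), without the re-attenuation factor.

    I         : (V,) voxel backscatter values
    voxel_xyz : (V, 3) voxel centres
    radar_pos : (3,) synthetic airborne radar position
    """
    rel = voxel_xyz - radar_pos
    slant = np.linalg.norm(rel, axis=1)       # slant range per voxel
    az = voxel_xyz[:, 0]                      # along-track coordinate
    ri = np.round(slant / rng_res).astype(int)
    ai = np.round(az / az_res).astype(int)
    out = {}                                  # (range_bin, az_bin) -> sum
    for v, r, a in zip(I, ri, ai):
        out[(r, a)] = out.get((r, a), 0.0) + v
    return out
```

Voxels at the same slant range but different heights add into the same image cell, which is exactly the layover mechanism described in the Introduction.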
The value of A is estimated by first computing the unattenuated v using Equation (9). The vegetation density is assumed to be roughly uniform when averaged over azimuth and elevation, i.e., the average attenuation versus range is roughly constant. With this assumption, the A estimate is the value of A that minimizes the slope of the root-mean-square attenuated image over the extent of the foliage. This attenuation model is only approximate but has the advantage of simplicity and requires no a priori knowledge about the tree.
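The estimation of A described above can be sketched as a one-dimensional search: for each trial A, apply the correction, form the root-mean-square profile versus range averaged over azimuth and elevation, fit a line, and keep the A with the smallest slope magnitude. This is an illustrative reconstruction of the procedure (boresight simplification, exclusive intervening-voxel sum), not the authors' code:

```python
import numpy as np

def estimate_A(v, candidates):
    """Pick the A that minimizes |slope| of the RMS attenuation-corrected
    profile versus range, assuming on-average-uniform vegetation density.

    v : (azimuth, elevation, range) backscatter magnitudes, front first
    """
    best_A, best_slope = None, np.inf
    z = np.arange(v.shape[-1])
    for A in candidates:
        cum = np.cumsum(v, axis=-1) - v          # intervening backscatter
        corrected = v * np.exp(2 * A * cum)      # Eq. (10), boresight case
        profile = np.sqrt(np.mean(corrected**2, axis=(0, 1)))  # RMS vs range
        slope = abs(np.polyfit(z, profile, 1)[0])
        if slope < best_slope:
            best_A, best_slope = A, slope
    return best_A
```

A coarse grid of candidate values suffices here because the slope varies smoothly with A.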
A weakness of the computation shown in Equation (11) is that it does not account for changes in scattering behavior with elevation incidence angle, i.e., changes in $\sigma$ can be expected as a function of incidence angle. For example, the trunk presents a less normal-facing surface when viewed from overhead, and thus reflects less specularly than when viewed from the side. In addition, since the core branches have a relatively vertical orientation, they scatter more strongly upward than sideward toward the ground-based radar. Adjustment for incidence-angle effects is not addressed in this paper, but is reserved for a future paper.
In summary, controlled displacement and data collection in multiple dimensions using small real-aperture antennas enables formation of large 2D synthetic apertures, which, when coupled with the radar range resolution, can enable true 3D radar imaging for a distributed target such as a tree when coupled with backprojection image formation and a simple attenuation model.

3. 3D Imaging Sensor

The major elements of the ground-based radar system are a high-precision 2D scan mechanism, the radar electronics, and the image formation processing. These components are briefly described below. The synthetic aperture approach to imaging is particularly effective for short-range applications because the processing algorithm accounts for the fact that the target is in the near field of the synthetic aperture while remaining in the far field of the small real-aperture antennas. To achieve similar resolution with a single real-aperture antenna, a much larger antenna would be required, placing the target in the near field of the antenna and making focusing more difficult.
A photograph and design drawings of the student-built scan mechanism are shown in Figure 5 [16]. The scanner is capable of precise motion over approximately 1.7 m in the vertical and horizontal dimensions. It supports the transmit and receive antennas and the radar electronics, stopping in 5 cm increments to support Nyquist spatial sampling with dual 13.5 cm by 10 cm horns. Separate transmit and receive horns, spaced 25 cm apart, are used in a pseudo-monostatic mode; the separation distance is included in the image formation processing. For this experiment, two Waveline model 5999 standard gain horns (Waveline Inc., Fairfield, NJ, USA, 2011) with 30° beamwidth in both axes were used.
For tree imaging, the scanner is typically placed 15 m to 30 m from the tree with the scanner plane orthogonal to the trunk. The scanner takes approximately 1 h to complete a full scan with short stops during radar data collection at each position. The scanner control system provides a serial data stream to the radar to report the position of the antenna assembly, which is recorded with the data. The position information is used in the image formation.
Radar data collection is accomplished using a microASAR programmed for very short-range operation. The microASAR is a compact, flexible, self-contained SAR system designed for operation in a small aircraft; it is described in detail in [2,3]. A block diagram of the RF subsystem is shown in Figure 6. An LFM-CW SAR operating at C-band (5.4 GHz) with 120 MHz of bandwidth, it has a range resolution of 1.25 m. The self-contained data collection system records raw SAR data on a compact flash disk. For the short-range operation in this application, the transmit power is set to 10 mW. Because of its small size and weight, the self-contained microASAR can be mounted adjacent to the antennas on the scanner to reduce RF losses. The bistatic microASAR continuously transmits and receives signals using two separate standard C-band horn antennas (see Figure 5a). The two antennas have about 40 dB of isolation. Feedthrough leakage is further suppressed by the bandpass filter in the dechirping stage (see Figure 6) of the microASAR hardware [2]. The locations of the antennas are recorded in the data. Only data collected while the scanner is stopped at one of the scan positions on the 5 cm by 5 cm position measurement grid are used in the image formation. Multiple pulses at each location are averaged to improve the signal-to-noise ratio. In this experiment, measurements were collected at vertical (VV) polarization.
In processing the data, the raw microASAR data is first range-compressed by windowing and computing the fast Fourier transform (FFT). The range-compressed data is used in the 3D backprojection algorithm to create 3D voxel images of radar backscatter from the target over a 3D grid centered on the target tree. As indicated earlier, in this study, the voxel size is set to 10 cm horizontal by 10 cm vertical by 10 cm in ground range. Note that, due to the large relative range migration involved in short-range operation, the effective range resolution in the synthetic aperture image is finer than the radar bandwidth would normally support. The positions and separation distance between the transmit and the receive antennas are accounted for in the backprojection processing geometry. While a graphics processing unit (GPU)-based processing algorithm could have been used to form the images [17], for this study, the processing is done in Matlab (MathWorks, Natick, MA, USA, 2011) without attempting formal calibration of A or σ, i.e., the images are uncalibrated [16]. Unfortunately, radar calibration was hindered by the loss of critical calibration data due to human error. Calibration was to be based on observations of an isolated 20 cm diameter aluminum sphere hanging from a thin nylon line.
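The range-compression step described above (window the dechirped LFM-CW data, then FFT) can be sketched as follows. In dechirped (stretch) processing, each beat frequency maps to a slant range, so the FFT bins are range bins; the data layout here is an illustrative assumption, not the microASAR's actual format:

```python
import numpy as np

def range_compress(dechirped, window=np.hamming):
    """Range-compress dechirped LFM-CW data: apodize each pulse with a
    Hamming window (the paper's choice), then FFT along fast time.

    dechirped : (pulses, samples) real or complex beat-signal samples
    """
    w = window(dechirped.shape[-1])   # 1D Hamming apodization
    return np.fft.fft(dechirped * w, axis=-1)
```

A single beat tone at an integer bin frequency compresses to a peak at that range bin, with the Hamming window trading some mainlobe width for lower sidelobes.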
Note that a similar 2D synthetic aperture setup has been successfully used for observing volume and layer scattering within a snowpack [18,19]. The resulting 3D imaging capability enables tracking the evolution of the backscatter within the key snow layers associated with the risk of avalanches.

4. Results

A number of 3D SAR images of different tree types have been collected. Data collections were done on calm days to minimize leaf motion. Figure 7 shows an optical image of a small Norway Maple tree, and corresponding radar views of it are provided in Figure 8 where the radar operated at a distance of about 30 m from the tree base. Unfortunately, the multipath from the ground limits the accuracy of the near-ground imaging of the trunk. Because of the difficulties of visualizing the 3D voxel image, only integrated 2D slices through the 3D image are presented [16]. The arbitrarily-scaled linear grayscale has been selected to eliminate background noise. Images from two additional trees are shown in Figure 9 and Figure 10. The features and their sizes in the radar images are validated from manual measurements of the tree [16].
The tree in Figure 7 and Figure 8 has dense foliage on the left, but is sparse on the right and is missing foliage at its rear due to storm damage. As expected, at 10 cm by 10 cm by 10 cm, the resolution of the 3D voxel images is too coarse to resolve individual leaves or small branches. However, the images clearly reveal the large-scale structure of the tree canopy. Though attenuated by its passage through the foliage, the radar signal successfully passes through the forward part of the canopy to enable imaging of the rear part of the tree canopy. This is more evident in later tree images in Figure 9 and Figure 10 where the trees have full rear foliage. Note that there is a close resemblance of the optical and radar images, though the radar image includes backscatter values from within the canopy, while the optical image is unable to see significantly into the foliage.
The radar images of trees shown in Figure 9 and Figure 10 help provide insight into the 3D structure of the trees, including observation of a wayward branch on the right side of the tree in Figure 9. While it is clear from the optical image that the branch is extending far to the right, the side view from the radar image also shows that the branch is extending forward toward the radar. This side view is, in effect, a synthesized depth image since the radar only observes from the front. Note that smaller patches of leaves held by unresolved branches extend forward and backward from the trunk. From the top view, which is also a synthesized image, the tree foliage extends closer to the radar than it does to the rear, something validated by direct manual measurement of the tree [16].
Figure 10 shows a tree next to a large building. By imaging only the volume corresponding to the location of the tree, the building is excluded from the image. As expected, no volume imaging from within the building was observed. This tree is a healthy, symmetric specimen. The total attenuation through the tree was appreciable, leading to some loss in image quality of the dense foliage at the rear of the tree.
From the 3D voxel images of I and the estimated A value, the appearance of this tree in a conventional C-band SAR image with the same range resolution is computed using Equation (11) and shown in Figure 11. Unfortunately, airborne SAR images of these trees during the growing season are not available for comparison. Instead, the image is compared with a conventional airborne SAR image of a different tree shown in Figure 3. The synthetic SAR image uses an incidence angle of 50°, which corresponds to the incidence angle of the SAR image of the tree in Figure 3. Though the tree shadow is not included in the synthetic image, the modeled backscatter from the synthetic tree image resembles the backscattering in the SAR image, confirming the viability of the approach.

5. Conclusions

This paper illustrates how ground-based measurements of an isolated tree can be used to synthesize an airborne SAR image of the tree. The theory of 3D ground-based radar imaging of a tree using a 2D aperture is discussed, including a 2D mechanical scan mechanism. There are several advantages to using the 2D synthetic aperture approach. First, a very large aperture that provides fine resolution can be placed close to the tree: if a single antenna were used, the tree would be in the near field of such a large antenna, whereas the target is in the far field of the small real-aperture antennas used by the radar. The time-domain 3D backprojection processing implicitly accounts for range and phase migration to ensure full focusing of the synthetic aperture. Second, the short-range operation provides high signal-to-noise observations in the presence of attenuation while operating at low power.
This work is only an incremental step toward ground-based fine-resolution imaging of trees and mapping the resulting 3D images to equivalent 2D conventional SAR images. Much work remains. For example, the attenuation model employed in this paper is oversimplified, and better models are needed to more accurately handle how the attenuation path changes from the ground-based radar observation geometry to the airborne radar observation geometry. For synthesizing 2D SAR images, additional work is required to include the incidence-angle dependence of the scattering from major branches and the trunk, and possibly the leaves [15]. This paper has not attempted to synthesize the tree's radar shadow, as this requires information about the surrounding terrain, which is a subject for future research.

Acknowledgments

This project, completed in 2011, was funded by the Brigham Young University Microwave Earth Remote Sensing Laboratory, which is part of the Brigham Young University Center for Remote Sensing.

Author Contributions

David G. Long conceived and designed the experiments; Justin F. Penner performed the experiments and analyzed the data; Justin F. Penner and David G. Long wrote the paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ulaby, F.T.; Long, D.G. Microwave Radar and Radiometric Remote Sensing; University of Michigan Press: Ann Arbor, MI, USA, 2013.
  2. Edwards, M.; Madsen, D.; Stringham, C.; Margulis, A.; Wicks, B.; Long, D.G. MicroASAR: A small, robust LFM-CW SAR for operation on UAVs and small aircraft. In Proceedings of the 2008 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Boston, MA, USA, 6–11 July 2008; Volume 5, pp. 514–517.
  3. Zaugg, E.; Edwards, M.; Long, D.G.; Stringham, C. Developments in compact high-performance synthetic aperture radar systems for use on small unmanned aircraft. In Proceedings of the 2011 IEEE Aerospace Conference, Big Sky, MT, USA, 5–12 March 2011; pp. 1–14.
  4. Duersch, M.D.; Long, D.G. Analysis of time-domain back-projection for stripmap SAR. Int. J. Remote Sens. 2015, 36, 2010–2036.
  5. Reigber, A.; Moreira, A. First demonstration of airborne SAR tomography using multibaseline L-band data. IEEE Trans. Geosci. Remote Sens. 2000, 38, 2142–2152.
  6. Tebaldini, S. Algebraic synthesis of forest scenarios from multi-baseline PolInSAR data. IEEE Trans. Geosci. Remote Sens. 2009, 47, 4132–4142.
  7. Pardini, M.; Torano-Caicoya, A.; Kugler, F.; Lee, S.K.; Hajnsek, I.; Papathanassiou, K. On the estimation of forest vertical structure from multibaseline polarimetric SAR data. In Proceedings of the 2012 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Munich, Germany, 22–27 July 2012; pp. 3443–3446.
  8. Nannini, M.; Scheiber, R.; Horn, R.; Moreira, A. First 3D reconstructions of targets hidden beneath foliage by means of polarimetric SAR tomography. IEEE Geosci. Remote Sens. Lett. 2012, 9, 60–64.
  9. Fornaro, G.; Pauciullo, A.; Reale, D.; Zhu, X.; Bamler, R. SAR Tomography: An Advanced Tool for 4D Spaceborne Radar Scanning with Application to Imaging and Monitoring of Cities and Single Buildings. Available online: http://elib.dlr.de/81576/1/Fornaro-et-al-dec12.pdf (accessed on 18 January 2017).
  10. Reale, D.; Fornaro, G. SAR tomography for 3D reconstruction and monitoring. Encycl. Earthq. Eng. 2014, 1–16.
  11. Reigber, A.; Lombardini, F.; Viviani, F.; Nannini, M.; Martinez del Hoyo, A. Three-dimensional and higher-order imaging with tomographic SAR: Techniques, applications, issues. In Proceedings of the 2015 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Milan, Italy, 26–31 July 2015; pp. 2915–2918.
  12. Tong Minh, D.H.; Le Toan, T.; Rocca, F.; Tebaldini, S.; Villard, L.; Réjou-Méchain, M.; Phillips, O.L.; Feldpausch, T.R.; Dubois-Fernandez, P.; Scipal, K.; et al. SAR tomography for the retrieval of forest biomass and height: Cross-validation at two tropical forest sites in French Guiana. Remote Sens. Environ. 2016, 175, 138–147.
  13. Zaugg, E.C.; Long, D.G. Generalized frequency scaling and backprojection for LFM-CW SAR processing. IEEE Trans. Geosci. Remote Sens. 2015, 53, 3600–3614.
  14. Savage, N.; Ndzi, D.; Seville, A.; Vilar, E.; Austin, J. Radio wave propagation through vegetation: Factors influencing signal attenuation. Radio Sci. 2003, 30.
  15. Johannesson, P. Wave Propagation through Vegetation at 3.1 GHz and 5.8 GHz; Report LUTEDX/(TEAT-5039)/1-109/(2001); Department of Electroscience, Electromagnetic Theory, Lund Institute of Technology: Lund, Sweden, 2001.
  16. Penner, J.F. Development of a Ground-Based High-Resolution 3D-SAR System for Studying the Microwave Scattering Characteristics of Trees. Master's Thesis, Brigham Young University, Provo, UT, USA, 2011.
  17. Stringham, C.; Long, D.G. GPU processing for UAS-based LFM-CW stripmap SAR. Photogramm. Eng. Remote Sens. 2014, 80, 1107–1115.
  18. Preston, S.J. Design and Feasibility Testing for a Ground-Based, Three-Dimensional, Ultra-High-Resolution Synthetic Aperture Radar to Image Snowpacks. Master's Thesis, Brigham Young University, Provo, UT, USA, 2010.
  19. Long, D.G.; Preston, S.J. Method, Apparatus, and System to Remotely Acquire Information from Volumes in a Snowpack. U.S. Patent 8,581,772, 12 November 2012.
Figure 1. Comparison of an optical image (a) and a C-band single look radar image (b) of a tall tree. The red star is at the center of the tree base. In the optical image, the solar illumination is from the lower left, resulting in a shadow that extends to the upper right. The image of the tree canopy is centered over the base. In the radar image, the radar illumination is from the right, so the tree’s radar shadow extends to the left, while layover causes the canopy to appear to the right of the base. The radar incidence angle is approximately 45°.
Figure 2. Layover effect on two trees of differing geometry observed at a similar incidence angle near 45°. The radar images backscatter at fixed slant ranges, schematically illustrated with the slanted lines. Two tree shapes are shown: (a) a typically-shaped conifer; (b) a broadleaf tree with a spherical canopy. Note that the cross-track location of the leading edge of the tree canopy in the image (point P2) depends on both the tree height P1 as well as the tree shape. When measuring the height of isolated trees, both shape and height must be considered. Analysis of case (b) is complicated by variations in the density of the canopy.
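The layover geometry of Figure 2 can be quantified with a short sketch. For flat terrain viewed at incidence angle θ, a scatterer at height h above the ground shortens the slant range by h cos θ, which projects to a ground-range shift of h/tan θ toward the radar. The function below is an illustrative first-order model, not part of the paper's processing chain:

```python
import math

def layover_shift(height_m: float, incidence_deg: float) -> float:
    """Ground-range shift toward the radar of a scatterer at the given
    height, assuming flat terrain and a plane-wave approximation.

    The scatterer shortens the slant range by h*cos(theta); converting
    that slant-range change back to ground range gives h/tan(theta)."""
    theta = math.radians(incidence_deg)
    return height_m / math.tan(theta)

# At 45 degrees incidence the shift equals the scatterer height, so the
# top of a 10 m tree appears about 10 m closer to the radar than its base.
print(layover_shift(10.0, 45.0))
```

This is why, in Figures 1-3, the canopy appears displaced toward the illumination direction by roughly the tree height; the exact leading-edge position also depends on canopy shape, as Figure 2 notes.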
Figure 3. Layover observed for a tree with a rounded canopy in an X-band synthetic aperture radar (SAR) image with 0.5 m pixel resolution. Radar illumination is from the right. The incidence angle is approximately 50°. As in Figure 1, the tree’s radar shadow extends to the left, while layover causes the canopy to appear to the right of the base. The arbitrary grayscale extends from black for low backscatter to white for high backscatter.
Figure 4. Signal path geometry for a 1D synthetic aperture. The path length through the vegetation (the volume scatterer region to the right of the boundary line) varies with real-aperture antenna positions within the synthetic aperture, resulting in different attenuation values even when the attenuation source is considered to be uniform. For tree imaging, the attenuation varies with the path through the spatially-varying vegetation density [1].
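The path-length effect in Figure 4 can be illustrated numerically. The sketch below assumes a uniform extinction coefficient and a vertical canopy boundary, both deliberate simplifications (the paper's method instead estimates attenuation per path from the backprojected image); all names and parameter values here are hypothetical:

```python
import math

def two_way_attenuation_db(antenna_pos, voxel, boundary_x, alpha_db_per_m):
    """Two-way attenuation (dB) along a straight ray from an antenna
    position to a voxel, for a uniform lossy canopy occupying the region
    x > boundary_x. Points are (x, y) in metres, with the antenna on the
    radar side of the boundary (antenna x < voxel x)."""
    ax, ay = antenna_pos
    vx, vy = voxel
    if vx <= boundary_x:
        return 0.0                                 # voxel outside the canopy
    total = math.hypot(vx - ax, vy - ay)           # full ray length
    frac_inside = (vx - boundary_x) / (vx - ax)    # fraction of ray in canopy
    return 2.0 * alpha_db_per_m * total * frac_inside

# Different positions within the synthetic aperture see different
# in-canopy path lengths to the same voxel, hence different attenuation:
for ay in (0.0, 1.0):
    print(two_way_attenuation_db((0.0, ay), (6.0, 0.0), 4.0, 0.5))
```

Even in this uniform-canopy case the attenuation differs across the aperture; with the spatially-varying vegetation density of a real tree, per-path compensation becomes essential.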
Figure 5. (a) Photo of the 2D scan mechanism. The C-band transmit and receive horns are visible at the top of the moving platform on which the microASAR [2] is mounted during operation. The scanner provides approximately 1.7 m of computer-controlled vertical and horizontal travel; (b) drawings detailing the mechanical structure of the scanner. Stepper motors are computer controlled from a laptop computer. Digitized radar data are time-tagged with the scanner position.
Figure 6. Radio frequency (RF) system signal flow block diagram of the microASAR [2,3]. The signal begins as a repetitive linear frequency modulation chirp generated by a direct digital synthesizer (DDS), which is mixed up in frequency and filtered by a bandpass filter (BPF). The received signal is homodyne-mixed down to an offset video signal before sampling with an analog-to-digital converter (ADC), which is followed by digital processing. The offset-homodyne mixing simplifies processing and minimizes ADC quantization noise [2].
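In the homodyne-mixed (dechirped) architecture of Figure 6, a target at range R produces a beat tone at f_b = 2R(B/T)/c, where B/T is the chirp rate, plus any intentional frequency offset used for offset-video sampling. A minimal sketch of this relation follows; the parameter values are illustrative, not the microASAR's actual settings:

```python
def beat_frequency_hz(range_m, bandwidth_hz, sweep_s, offset_hz=0.0):
    """Beat (dechirped) frequency of an LFM-CW return from a target at
    the given range: f_b = 2*R*(B/T)/c, plus an optional offset used for
    offset-video sampling."""
    c = 299_792_458.0                   # speed of light, m/s
    chirp_rate = bandwidth_hz / sweep_s # Hz per second
    return 2.0 * range_m * chirp_rate / c + offset_hz

# For example, with a 200 MHz chirp swept in 1 ms, a target at 30 m
# produces a beat tone of roughly 40 kHz.
print(beat_frequency_hz(30.0, 200e6, 1e-3))
```

Because range maps linearly to beat frequency, the short ranges of a ground-based scanner keep the video bandwidth (and hence the ADC rate) modest.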
Figure 7. Image of a small Norway Maple tree along a sidewalk on the Brigham Young University campus. As a result of the loss of a major limb on the backside due to storm damage, the tree canopy is concentrated on the forward side of the trunk in the direction of the camera. See Figure 8.
Figure 8. Slices through the 3D backscatter image of the tree shown in Figure 7. (a) Front view; (b) side view. Because a major limb on the backside of the tree was lost to storm damage, the tree canopy is concentrated on the forward side of the trunk. Multipath from the ground between the radar and tree distorts the image of the trunk. The arbitrary linear grayscale is reversed, extending from white for low backscatter to black for high backscatter.
Figure 9. Optical and radar slices through 3D backscattering image of another Norway Maple tree. (a) optical image; (b) front view backscatter image; (c) top view image; (d) side view image. Multipath from the ground between the radar and tree distorts the image of the trunk. Note that the nearby light pole and trees in the background are excluded from the 3D imaging volume. The linear grayscale is arbitrary.
Figure 10. Optical and radar slices through 3D backscattering image of a healthy Honey Locust tree. (a) optical image; (b) front view backscatter image; (c) top view image; (d) side view image. Multipath from the ground between the radar and tree distorts the image of the trunk. The nearby building is excluded from the 3D imaging volume. The linear grayscale is arbitrary.
Figure 11. Projected tree backscatter image from Figure 10 as would be observed by a high resolution C-band SAR at an incidence angle of 50°. Shadowing is not included. This synthesized SAR image can be compared to the actual SAR image of a different tree in Figure 3. The arbitrary linear grayscale extends from black for low backscatter to white for high backscatter.
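The transformation used to synthesize a SAR view such as Figure 11 amounts to projecting the 3D backscatter voxels onto the slant-range/azimuth plane of a hypothetical airborne sensor. The sketch below shows only the core projection idea under a plane-wave assumption; the paper's transformation also handles resampling, and shadowing is omitted here as in Figure 11. All names and coordinates are illustrative:

```python
import math

def project_to_sar(voxels, incidence_deg):
    """Project 3D backscatter voxels (x = ground range toward the radar,
    y = azimuth, z = height, power) into (azimuth, slant-range) bins of a
    simulated SAR image, summing power within each 1 m bin."""
    theta = math.radians(incidence_deg)
    image = {}
    for x, y, z, power in voxels:
        # Slant-range coordinate: ground range projected onto the look
        # direction, minus the height contribution (this produces layover).
        r = x * math.sin(theta) - z * math.cos(theta)
        key = (round(y), round(r))
        image[key] = image.get(key, 0.0) + power
    return image

# Two voxels above the same ground position but at different heights land
# in different slant-range bins; the elevated one lays over toward the radar:
img = project_to_sar([(10.0, 0.0, 0.0, 1.0), (10.0, 0.0, 8.0, 1.0)], 50.0)
print(sorted(img))
```

Summing voxel power along each slant-range line reproduces the layover and canopy-brightness behavior seen when comparing the synthesized image to the actual airborne SAR image of Figure 3.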

Penner, J.F.; Long, D.G. Ground-Based 3D Radar Imaging of Trees Using a 2D Synthetic Aperture. Electronics 2017, 6, 11. https://0-doi-org.brum.beds.ac.uk/10.3390/electronics6010011
