DEVELOPMENT OF A LOW-COST SMALL DRONE-BASED LASER-SCANNER
SYSTEM FOR RICE MONITORING
Kazuyoshi Takahashi1, Phan Thi Anh Thu2, Hiroki Irie3 and Takahiro Yamada4
1 Nagaoka University of Technology, 1603-1 Kami-Tomioka, Nagaoka, Niigata 940-2188, Japan
Email: ktakaha@nagaokaut.ac.jp
2 Graduate School of Nagaoka University of Technology, 1603-1 Kami-Tomioka, Nagaoka, Niigata 940-2188, Japan
Email: phanthu.bk@gmail.com
3 National Institute of Technology, Kumamoto College, 2627 Hirayamashin-Machi, Yatsushiro, Kumamoto 866-8501, Japan
Email: irie@kumamoto-nct.ac.jp
4 National Institute of Technology, Fukushima College, 30 Kami-Arakawa Nagao, Iwaki, Fukushima 970-8034, Japan
Email: yamada@fukushima-nct.ac.jp
KEY WORDS: UAV, LiDAR, 3D point-cloud, GNSS
ABSTRACT: This paper presents a small drone-based laser-scanner system for rice monitoring at low altitudes. The
system uses a DJI S800 as the flight platform, with a 3-axis gimbal system to stabilize the laser-scanner attitude. The
laser scanner obtains 3D point-cloud data of rice plants. A single-frequency GPS receiver and a small motion-logger
are also installed to record the platform position and nose direction. The laser scanner and the GPS receiver are
controlled by a Raspberry Pi, which also records the relevant data. The GPS receiver uses a u-blox module that can
output raw GPS data, so its positioning error (in the horizontal coordinates) is about 1 m when applying
post-kinematic processing. The mapping accuracy of the 3D point-cloud data from the proposed system is found to be
1.56 ± 0.12 m without ground control point (GCP) correction and 0.23 ± 0.10 m using only one GCP. The results show
that this system has the potential to map 3D point-cloud data with sub-meter accuracy. In addition, observations of
paddy rice indicate that the proposed system produces high reproducibility of relative height.
1. INTRODUCTION
Recent progress in the technology of microelectromechanical systems (MEMS) has enabled the development of
low-cost remote sensing systems using small unmanned aerial vehicles (UAVs, also known as drones). The main
advantage of these systems is the flexibility of observations. Many UAV systems have been proposed and developed,
and they mainly carry digital cameras for the generation of digital surface models (DSM) or for crop monitoring
(Hunt et al., 2010; Zhou, 2009; Sugiura et al., 2005). Crop monitoring using a passive optical sensor, such as a digital
camera, usually needs additional processes such as observations of standard reflector intensity or skylight intensity to
remove or reduce the illumination effects (Uto et al., 2013). The monitoring results are affected by the quality of the
additional data. The authors have been developing a growth monitoring method for paddy rice using measurements
from a laser scanner (one of the active optical sensors) (Takahashi et al., 2010). This previous study found that it is
possible to estimate the vegetation coverage of a paddy field using laser scanner measurements from above the paddy.
To extend this monitoring to widespread paddy areas, it is necessary to develop a system that can obtain laser-scanner
measurements from low altitude. Several laser-scanner measurement systems onboard UAVs have been developed.
Wallace et al. (2012) developed a multicopter-borne light detection and ranging (LiDAR) system for forest inventory,
and Nagai et al. (2009) developed a LiDAR system for use onboard an industrial helicopter. The costs of
crop-monitoring systems, such as the initial and running costs, are ultimately reflected in the crop's price. Therefore, within acceptable observation tolerances, such a system should be made as inexpensive as possible. However, the cost of the existing systems is too high to be absorbed into the crop's price. Accordingly, in this study, a low-cost drone-based
laser-scanner measurement system was developed by assembling inexpensive commercial products. The positioning
accuracy of the 3D point-cloud data from this new system was also evaluated for the case of a small, flat observation area.
2. SYSTEM DESCRIPTION
Our drone-based system consists of a DJI S800 airframe and a gimbal mounted with a line laser scanner. To reduce
the airframe fluctuations during observation flights, a 3-axis gimbal (TAROT 5D) was used to maintain the horizontal
attitude of the line laser scanner. The line laser scanner installed on the gimbal was a Hokuyo UTM30LX, which has
a weight of about 0.3 kg. Although the DJI S800 has a GPS unit, its data is only suitable for flight control purposes.
Therefore, a single-frequency GPS receiver was installed to position the airframe. The single-frequency GPS receiver
(Sensor-Com Kinematic GPS Evaluation Kit) outputs raw data for post-kinematic positioning. Takahashi et al. (2014)
reported that the trajectory repeatability of a low-speed object calculated from this GPS receiver’s output data had an
accuracy of approximately 1.0 m in horizontal coordinates. A small GPS/IMU integrated data logger (Ninja Scan
Light) was used to obtain the Earth’s magnetic field intensity for calculating the nose direction of the airframe. A
single-board PC (Raspberry Pi) was used to control both the line laser scanner and the GPS receiver, as well as to
record their output data. The line laser scanner, GPS receiver, and Earth’s magnetic field sensor data were
synchronized using the GPS time. The configuration of our drone-based system is shown in Figure 1, and the
purchase prices (acquisition time was 2010–2014) of the sensors and the airframe are listed in Table 1.
Table 1. Purchase prices (US$) of the main components.
Figure 1. Developed drone-based laser-scanner measurement system.
3. METHODS
3.1 Coordinate transformation of 3D point-cloud data
Assuming the attitude of the line laser scanner remains horizontal, the 3D point-cloud data can be transformed to the
map coordinates using the airframe position and its nose direction. The local coordinate system shown in Figure 2a is
a right-hand coordinate system whose origin is set to the mechanical origin of the line laser scanner. Its Y-axis corresponds to the rotation axis of the scanning mirror. Japan's orthogonal coordinate system was used as the map coordinate system for evaluating the positioning accuracy. According to equation (1), the local coordinates of an observed point, $(x, y, z) = (d \sin\theta,\ 0,\ -d \cos\theta)$, can be converted to the map coordinates, $(X, Y, Z)$, where $d$ is the observed range, $\theta$ is the scanning angle, and $\psi$ is the nose direction angle. The airframe position, $(X_G, Y_G, Z_G)$, is obtained from the GPS observations with an antenna offset $(\Delta x, \Delta y, \Delta z)$. The relation between the local- and map-coordinate systems is shown in Figure 2b. Positive nose direction angles are generated by counterclockwise rotation.
Figure 2. Coordinate systems. (a) Local coordinate system. (b) Map coordinate system.

$$
\begin{pmatrix} X \\ Y \\ Z \end{pmatrix}
=
\begin{pmatrix}
\cos\psi & -\sin\psi & 0 \\
\sin\psi & \cos\psi & 0 \\
0 & 0 & 1
\end{pmatrix}
\begin{pmatrix} d \sin\theta \\ 0 \\ -d \cos\theta \end{pmatrix}
+
\begin{pmatrix} X_G \\ Y_G \\ Z_G \end{pmatrix}
-
\begin{pmatrix} \Delta x \\ \Delta y \\ \Delta z \end{pmatrix}
\qquad (1)
$$
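As a minimal sketch of equation (1), the transformation can be written as follows; the function name and the numerical values are illustrative placeholders, not values from the experiment, and only the notation above is assumed.

```python
import numpy as np

def scan_to_map(d, theta, psi, gps_pos, antenna_offset):
    """Transform one scan line to map coordinates following equation (1).

    d              : array of observed ranges [m]
    theta          : array of scanning angles [rad]
    psi            : nose direction angle of the airframe [rad]
    gps_pos        : airframe position from GPS, (X_G, Y_G, Z_G) [m]
    antenna_offset : offset between the GPS antenna and the scanner origin [m]
    """
    # Local coordinates: the gimbal keeps the scanner horizontal, the scan
    # plane is the local X-Z plane, and the beam points downward.
    local = np.column_stack([d * np.sin(theta),
                             np.zeros_like(d),
                             -d * np.cos(theta)])

    # Rotation about the vertical axis by the nose direction angle
    # (positive psi corresponds to counterclockwise rotation).
    c, s = np.cos(psi), np.sin(psi)
    rot = np.array([[c, -s, 0.0],
                    [s,  c, 0.0],
                    [0.0, 0.0, 1.0]])

    # Map coordinates = rotated local coordinates + GPS position - antenna offset.
    return local @ rot.T + np.asarray(gps_pos) - np.asarray(antenna_offset)

# Example with placeholder values (the UTM-30LX covers 270 deg in 1081 steps).
theta = np.deg2rad(np.linspace(-135.0, 135.0, 1081))
d = np.full_like(theta, 7.5)                    # dummy ranges [m]
xyz = scan_to_map(d, theta, np.deg2rad(145.1),
                  gps_pos=(0.0, 0.0, 10.85), antenna_offset=(0.0, 0.0, 0.3))
print(xyz.shape)                                # (1081, 3)
```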
3.2 Experiment to evaluate the positioning accuracy
To evaluate the positioning accuracy of 3D point-cloud data from our drone-based system, nine markers (0.2 m × 0.2
m) were placed on the ground. The laser-scanner measurement using this system was performed on Dec. 19, 2014, in
the playground of the National Institute of Technology, Kumamoto College. Figure 3 shows a photo of the
experimental area. The 3D point-cloud data were converted to the map coordinates using the method described above.
During the measurements, the airframe was operated manually under the GPS control mode. In addition, calibration
data for the magnetism sensor were acquired by rotating the nose direction 360° while hovering just after takeoff. The
trajectory of the airframe was calculated using RTKLIB (Takasu, 2013) with data from the nearest permanent GPS station in Japan's GEONET.
Figure 3. Photo showing the experimental area.
3.2.1 Measurement of the marker positions
To determine the map coordinates of each marker, Global Navigation Satellite System (GNSS) measurement and
Total Station (TS) measurement were combined. The map coordinates of the total-station setup location and of a point used as the azimuth-angle reference were determined by GNSS measurements with the post-static positioning process.
The measurement for each marker was then carried out using the total station. Based on the map coordinates of the
two points defined above, the position of each marker in the total station local coordinates (the reference position) was
also converted to map coordinates. The reference position of marker $i$, $P_{ref,i} = (X_{ref,i}, Y_{ref,i}, Z_{ref,i})$, is shown in Figure 4, where $i$ is the marker ID indicated in Figure 4. The mean elevation of the markers was 3.31 m with a standard deviation of 0.07 m.
Figure 4. Positions of markers.
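As an illustration, the conversion from total-station local coordinates to map coordinates can be sketched as a rotation about the vertical axis fixed by the azimuth reference point, followed by a translation to the station position; all coordinates below are hypothetical placeholders, not the surveyed values.

```python
import numpy as np

# Hypothetical map coordinates of the total-station setup point and of the
# azimuth reference point, both determined by post-static GNSS positioning.
station_map = np.array([0.0, 0.0, 3.30])        # (X, Y, Z) [m]
reference_map = np.array([25.0, 40.0, 3.35])    # (X, Y, Z) [m]

# The same azimuth reference expressed in the total-station local frame,
# whose origin is the setup point (hypothetical values).
reference_local = np.array([47.17, 0.0, 0.05])

def ts_local_to_map(p_local):
    """Rotate a TS-local point so that the local direction to the reference
    point matches its map azimuth, then translate by the station position."""
    az_map = np.arctan2(reference_map[1] - station_map[1],
                        reference_map[0] - station_map[0])
    az_local = np.arctan2(reference_local[1], reference_local[0])
    a = az_map - az_local
    c, s = np.cos(a), np.sin(a)
    rot = np.array([[c, -s, 0.0],
                    [s,  c, 0.0],
                    [0.0, 0.0, 1.0]])
    return rot @ p_local + station_map

# A marker measured by the total station (hypothetical local coordinates).
print(ts_local_to_map(np.array([12.4, -3.1, 0.02])))
```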
3.3 Tentative observation of paddy rice
To examine the reproducibility of the relative height distribution of 3D point-cloud data from our drone-based system,
laser-scanner measurements of paddy rice were taken on Jun. 20, 2014. The system configuration was the same as for
the above positioning accuracy evaluation, except that the small GPS/IMU integrated data logger was not installed.
Therefore, in this experiment, the nose direction of the drone was estimated from the trajectory data measured by the
GPS post-kinematic positioning process.
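A minimal sketch of this heading estimate, assuming the nose stays aligned with the direction of travel (the function name and trajectory values are hypothetical):

```python
import numpy as np

def heading_from_trajectory(xy):
    """Estimate the nose direction angle at each epoch from consecutive
    post-processed GPS positions, assuming the nose points along the track.

    xy : (N, 2) array of horizontal map coordinates (X, Y) [m]
    """
    dxy = np.diff(xy, axis=0)
    psi = np.arctan2(dxy[:, 1], dxy[:, 0])   # counterclockwise positive
    return np.append(psi, psi[-1])           # repeat the last value for the final epoch

# Hypothetical straight-line trajectory sampled at a constant rate.
x = np.arange(0.0, 10.0, 0.32)
track = np.column_stack([x, 0.5 * x])
print(np.rad2deg(heading_from_trajectory(track))[:3])
```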
4. RESULTS AND DISCUSSION
4.1 Transformation of the observed 3D point-cloud data
The region of interest (ROI) for mapping 3D point-cloud data was set according to the intensity image of the line laser
scanner (Figure 5). In Figure 5, there is an area that lacks laser data (highlighted by the red arrows), although the
markers were clearly identified. The position of the airframe fluctuates according to the processing parameters of
RTKLIB. In particular, the elevation mask and the signal-to-noise (SN) mask are important, as they determine which GPS signals are used in the solution. In this experiment, an elevation mask of 15° and an SN mask of 35 dBHz were adopted to obtain the
trajectory of the airframe (Figure 6). Figure 6 shows that the airframe maintained a constant altitude and flew linearly.
According to the trajectory data, the mean flight speed was 1.6 m/s and the height above the ground was 7.54 m (the
altitude was 10.85 m). However, the observed range at 0° scan angle, which appears to be the height above the ground,
was 5.83 m. This discrepancy suggests a vertical GPS positioning error of approximately 1.7 m. The nose direction
angle calculated from the Earth's magnetic field sensor data was 145.1° throughout the ROI. In accordance with the above method, the 3D point-cloud data were transformed to the map coordinates. Figure 7 shows the georeferenced images of
range and intensity. In Figure 7b, the observed terrain does not appear to be flat land, but the standard deviation of
elevation, 0.09 m, was close to that for the markers.
Figure 5. Intensity image.
Figure 6. Trajectory of the airframe. (a) Horizontal plane (X-Y). (b) Vertical plane (X-Z).
Figure 7. Georeferenced images. (a) Intensity image. (b) Range image.
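The screening performed by the elevation and SN masks can be illustrated with the following conceptual sketch; this is not the internal RTKLIB implementation, and the per-satellite values are invented for illustration.

```python
ELEV_MASK_DEG = 15.0   # elevation mask adopted in this experiment
SNR_MASK_DBHZ = 35.0   # signal-to-noise (SN) mask adopted in this experiment

# Hypothetical per-satellite observations at one epoch:
# (satellite ID, elevation angle [deg], C/N0 [dBHz]).
observations = [
    ("G05", 62.3, 48.0),
    ("G12",  9.8, 41.5),   # rejected: below the elevation mask
    ("G17", 34.1, 31.0),   # rejected: below the SN mask
    ("G24", 21.7, 39.2),
]

# Only satellites passing both masks contribute to the kinematic solution.
usable = [sat for sat, elev, snr in observations
          if elev >= ELEV_MASK_DEG and snr >= SNR_MASK_DBHZ]
print(usable)   # ['G05', 'G24']
```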
4.1.1 Positioning accuracy
To collect the data for evaluating the positioning accuracy, the marker positions observed by our drone-based system, $P_{obs,i} = (X_{obs,i}, Y_{obs,i}, Z_{obs,i})$, were identified on the intensity image, where $i$ is the marker ID. The residual between the reference and observed positions of each marker, $P_{ref,i} - P_{obs,i}$, was then calculated. The mean absolute residual (MAE) and the standard deviation of the residuals were 1.56 m and 0.12 m, respectively. The
residuals in the X, Y, and Z axes were 0.88, 0.22, and 1.27 m, respectively. This indicates that the positioning
accuracy of our drone-based system is 1–2 m.
4.1.2 Positioning accuracy after applying a simple correction using GCP data
Because the residual standard deviation was less than 0.1 m, we attempted to improve the positioning accuracy of our
drone-based system using ground control point (GCP) data. A simple correction method was adopted. The corrected
map coordinates of marker $j$, $P'_{obs,j}$, were obtained by equation (2), where $j$ is the marker ID and $k$ ($k \neq j$) is the ID of the marker selected as the GCP. The mean absolute residual after applying the simple correction, MAE', was calculated by equation (3), where $m$ is the number of markers. The value of MAE' was found to be 0.23 m with a standard deviation of 0.10 m. The residuals in the X, Y, and Z directions were 0.12, 0.13, and 0.11 m, respectively. This result shows that our drone-based system can achieve sub-meter positioning accuracy by employing a simple correction using a GCP.

$$
P'_{obs,j} = P_{obs,j} + \left( P_{ref,k} - P_{obs,k} \right) \qquad (2)
$$

$$
\mathrm{MAE}' = \frac{1}{m(m-1)} \sum_{k=1}^{m} \sum_{\substack{j=1 \\ j \neq k}}^{m} \left| P_{ref,j} - P'_{obs,j} \right| \qquad (3)
$$
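Equations (2) and (3) translate directly into the following sketch, in which the marker coordinates are hypothetical stand-ins for P_ref and P_obs and the residual is taken as the 3D distance.

```python
import numpy as np

def gcp_corrected_mae(p_ref, p_obs):
    """Mean absolute residual after the simple one-GCP correction,
    averaged over every choice of GCP k and every remaining marker j != k
    (equations (2) and (3))."""
    m = len(p_ref)
    total = 0.0
    for k in range(m):
        shift = p_ref[k] - p_obs[k]            # equation (2): shift defined by GCP k
        corrected = p_obs + shift
        for j in range(m):
            if j != k:
                total += np.linalg.norm(p_ref[j] - corrected[j])
    return total / (m * (m - 1))               # equation (3)

# Hypothetical reference and observed marker positions (m = 4 markers).
p_ref = np.array([[0.0, 0.0, 3.3], [10.0, 0.0, 3.3],
                  [0.0, 10.0, 3.4], [10.0, 10.0, 3.2]])
offset = np.array([0.9, 0.2, -1.3])            # a systematic positioning error
noise = 0.1 * np.random.default_rng(0).normal(size=(4, 3))
p_obs = p_ref + offset + noise
print(round(gcp_corrected_mae(p_ref, p_obs), 3))
```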
4.2 Tentative observation of paddy rice
Four laser-scanner measurements were carried out over the small paddy area shown in Figure 8a. The flight direction
of our drone-based system corresponded to the longitudinal direction of the target paddy area, as shown by the red
arrow in Figure 8a. This small paddy area was divided into five plots of size 3 m × 2 m by a series of pipes. After
applying a simple correction using GCP data, relative height distributions were obtained (Figures 8b–8e). The relative
heights in the same plot appear to be uniform, although the relative heights vary along the longitudinal direction of the paddy area.
Figure 8. Ortho photo and georeferenced range images in the target paddy. (a) Ortho photo. The red arrow and rectangle show the flight direction of our drone-based system and the sampling area for comparison. (b)–(e) correspond to the 1st–4th measurement data, respectively.
Figure 9. Histograms of relative height (relative height [m] versus relative frequency). (a)–(d) correspond to the 1st–4th measurement data, respectively.
Relative height distributions were compared in the area of the red rectangle in Figure 8. The size of this
comparison area is 0.8 m × 1.0 m. Histograms of relative height are shown in Figure 9. Figures 9a–9d contained 2733,
1928, 1707, and 1242 3D data points, respectively. The histogram distributions in Figures 9a–9c are broadly similar.
The observed height of the vegetation layer was calculated by the difference between the 5th and 95th percentile
heights; this corresponds to the interval between the nearest part of the ground and the top part of the paddy rice. The layer heights from the four measurements ranged from 0.17 m to 0.20 m, and the differences among them were about 0.01 m, although the plant height of this plot was 0.40 m. This
plot had vegetation coverage of approximately 20%. Accordingly, there was some difference between the plant height
and the layer height. This result shows that the proposed drone-based system achieves high reproducibility of relative
height.
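For each sampling area, the layer-height calculation described above reduces to the following sketch; the relative heights are synthetic values, not the measured point cloud.

```python
import numpy as np

def vegetation_layer_height(relative_heights):
    """Observed height of the vegetation layer: the difference between the
    95th percentile (near the top of the paddy rice) and the 5th percentile
    (near the ground) of the relative heights in the sampling area."""
    p5, p95 = np.percentile(relative_heights, [5, 95])
    return p95 - p5

# Synthetic relative heights [m] for one sampling area: mostly ground returns
# with roughly 20% vegetation returns, similar to the observed coverage.
rng = np.random.default_rng(1)
ground = rng.normal(-0.50, 0.01, size=2200)
plants = rng.uniform(-0.50, -0.30, size=530)
print(round(vegetation_layer_height(np.concatenate([ground, plants])), 2))
```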
5. SUMMARY
A small drone-based laser-scanner system for rice monitoring at low altitudes has been described. This new system
was created by assembling inexpensive commercial products. To simplify the coordinate transformation process, a
3-axis gimbal system was used to stabilize the laser-scanner attitude. A single-frequency GPS receiver that outputs
raw data was also installed to obtain the trajectory data of the airframe. The mapping accuracy of the 3D point-cloud
data from this system was examined and found to be 1.56 ± 0.12 m without GCP correction and 0.23 ± 0.10 m using
only one GCP. This result shows that the proposed system has the potential to map 3D point-cloud data with
sub-meter accuracy. An observation of paddy rice using this system was also carried out, and four laser-scanner
measurements were obtained. A comparison of relative heights demonstrated that this system exhibits high
reproducibility of the relative height.
Acknowledgement
We thank Mr. Yasuhiro Higuchi of the Niigata Crop Research Center for his support. Part of this study was supported by
JSPS KAKENHI Grant Numbers 23580361 and 26450362.
References
E. Raymond Hunt, Jr., W. Dean Hively, Stephen J. Fujikawa, David S. Linden, Craig S. T. Daughtry and Greg W. McCarty, 2010, Acquisition of NIR-Green-Blue Digital Photographs from Unmanned Aircraft for Crop Monitoring, Remote Sensing, 2, pp.290-305.
Masahiko Nagai, Tianen Chen, Ryosuke Shibasaki, Hideo Kumagai, and Afzal Ahmed, 2009, UAV-Borne 3-D Mapping System by Multisensor Integration, IEEE Trans. Geosci. Remote Sens., 47(3), pp.701-708.
Ryo Sugiura, Noboru Noguchi, Kazunobu Ishii, 2005, Remote-sensing Technology for Vegetation Monitoring using an Unmanned Helicopter, Biosystems Engineering, 90(4), pp.369-379.
Kazuyoshi Takahashi, Atsushi Rikimaru, Kenta Sakata, 2010, A Study on the Growth Monitoring Method of Rice Plants by Laser Scanner from Above, Retrieved Aug. 17, 2015, from http://www.a-a-r-s.org/acrs/administrator/components/com_jresearch/files/publications/PS01-13.pdf
Kazuyoshi Takahashi, Hiroki Irie and Takahiro Yamada, 2014, Study of Positioning Capabilities of Post Interferometric Processing between Single Frequency GPS Receiver Observation Data and Permanent GPS Station Data, Journal of the Japan Society of Photogrammetry and Remote Sensing, 53(1), pp.39-43.
Tomoji Takasu, 2013, RTKLIB: An Open Source Program Package for GNSS Positioning, Retrieved Oct. 10, 2013, from http://www.rtklib.com/
Kuniaki Uto, Haruyuki Seki, Genya Saito, Yukio Kosugi, 2013, Characterization of Rice Paddies by a UAV-Mounted Miniature Hyperspectral Sensor System, IEEE J. Sel. Topics Appl. Earth Observ. Remote Sens., 6(2), pp.851-860.
Luke Wallace, Arko Lucieer, Christopher Watson and Darren Turner, 2012, Development of a UAV-LiDAR System with Application to Forest Inventory, Remote Sensing, 4, pp.1519-1543.
Guoqing Zhou, 2009, Near Real-Time Orthorectification and Mosaic of Small UAV Video Flow for Time-Critical Event Response, IEEE Trans. Geosci. Remote Sens., 47(3), pp.739-747.