
Research on motion planning for an indoor spray arm based on an improved potential field method


Authors: Dongjie Zhao aff001;  Bin Zhang aff002;  Ying Zhao aff001;  Qun Sun aff001;  Chuanjun Li aff002;  Chong Wang aff001
Authors place of work: School of Mechanical & Automotive Engineering, Liaocheng University, Liaocheng, China aff001;  College of Engineering, China Agricultural University, Beijing, China aff002
Published in the journal: PLoS ONE 15(1)
Category: Research Article
doi: https://doi.org/10.1371/journal.pone.0226912

Summary

The target spraying effect of a spray robot mainly depends on the control performance of the spraying arm during the processes of aiming and tracking. To further improve the robustness of endpoint control and the positioning accuracy of the spray arm, an improved potential field algorithm for motion planning and control of the spray arm is proposed on the basis of our previous research. The algorithm introduces a velocity potential field, field-of-view constraints and joint position limit constraints into the traditional artificial potential field method. The velocity potential field ensures that the target state of the spraying arm has the same velocity as the target crop (relative velocity), so that stable target tracking is achieved. The field-of-view constraints and joint position limit constraints ensure the validity of the visual servo control and of the movement of the spray arm. The algorithm plans a feasible trajectory for the spraying arm in Cartesian space and image space, and a speed controller moves the spraying arm along the trajectory for aiming and tracking. Simulation analysis shows that the algorithm plans better motion trajectories than the servo controller based on image moments used in previous studies. In addition, the experimental results show that the algorithm effectively improves the robustness of targeting and tracking control for the spray robot.

Keywords:

Simulation and modeling – Imaging techniques – Cameras – Crops – Image processing – Target detection – Robots – Prototypes

Introduction

For crops planted in a greenhouse environment with large plant spacing, such as cucurbit seedlings and lettuce seedlings in the early growing season, it is important to use target spraying technology to apply fertilizers and pesticides according to their position and size. Doing so also improves the utilization rate of liquid chemicals and alleviates soil pollution. Introducing visual servo control into the spraying robot system, so that the visual information obtained from the visual sensor guides the spraying arm to complete the targeting movement, can satisfy the accuracy requirements of target operation and improve the adaptability of the spraying robot in unstructured environments (e.g., greenhouses) [1–5].

It was found in reference [6] that spray robots can achieve a good targeting effect by using a visual tracking method based on image moments and a hybrid vision system that includes a monocular scene camera and one (or several) eye-in-hand cameras. However, for the servo control of the spray arm, when the initial pose and the desired pose differ greatly, the motion trajectory planned by the speed controller alone is poor, which limits the speed of the spraying robot. Furthermore, vibration of the spray robot, caused by occasional large undulations or obstacles on otherwise smooth greenhouse pavement, can make the target crop leave the field of view of the eye-in-hand camera, and the servo process then sometimes fails.

Meanwhile, the articulated manipulator has a faster response speed than the Cartesian-coordinate manipulator but a more irregular workspace, and this irregular workspace places restrictions on the motion trajectory of the manipulator [7, 8]. Therefore, it is necessary to optimize the visual servo process by adding various constraints and to apply a path planning method to control the motion of the manipulator reasonably.

Common methods for trajectory planning include the genetic algorithm [9, 10], simulated annealing [11, 12], artificial neural networks [13, 14], the A* algorithm [15], the vector field method [16], adaptive algorithms [17, 18], particle swarm optimization [19, 20] and the artificial potential field method [21, 22]. Among these algorithms, the artificial potential field method has a simple structure, is convenient for real-time control on hardware, and can usually plan smoother and safer paths. It has been widely used in real-time obstacle avoidance and smooth trajectory control [23–25]. However, the algorithm also has shortcomings, such as the goal being nonreachable when an obstacle is nearby (GNRON) and an insufficient dynamic path planning ability [26], which can be remedied for specific problems.

In this paper, the traditional artificial potential field method is improved according to the visual tracking and targeting requirements of the spray arm, and the improved algorithm is used to plan the motion trajectory of the spray arm under field-of-view constraints and joint limit constraints. The effect is verified by simulation and prototype tests.

Target spray arm and target operation

System scheme for target spray robot

A hybrid vision structure similar to that in reference [6], including one scene camera and one (or several) eye-in-hand cameras, is adopted for the target spray robot (Fig 1). The scene camera, mounted at the front of the robot body, obtains location information about the object crops. The eye-in-hand camera, mounted at the end of the spray arm together with the nozzle, provides precise localization of the target crop. The control system consists of one personal computer (PC, the host computer) and one (or several) digital signal processors (DSP, the slave computer). The PC generates and transmits control instructions to the DSP by analyzing the scene camera images and the real-time state information of the working parts (spray arm, nozzle, etc.) provided by the DSP. The DSP controls the movement of the spray arm and nozzle according to the PC instructions, the eye-in-hand camera images and the other sensors.

Fig. 1. Target spraying robot system.
1. Nozzle 2. Eye-in-hand camera 3. Spray arm 4. Scene camera 5. Slave computer (DSP, not shown) 6. Host computer (PC) 7. Movement platform 8. Target crop 9. Object crop 10. Robot’s motion direction.

A ZUTO460 4-DOF serial manipulator is selected as the spray arm. The manipulator can translate the end-effector along the X-, Y- and Z-axes and rotate it about the Z-axis, which meets the requirements for the translational motion of the nozzle in vertical spray mode and is suitable for fertilization and pesticide application for many greenhouse crops. Furthermore, compared with the Cartesian-coordinate manipulator in reference [6], this manipulator has a larger workspace and a faster moving speed, and thus more easily meets the targeting requirements when the robot moves faster.

The process by which the spray arm guides the nozzle to spray a target while the spray robot moves forward can be described as follows. First, according to the position information about the object crops provided by the scene camera, the control system moves the spray arm so that the crops enter smoothly into the eye-in-hand camera's view. Then, according to the precise information about the target crop provided by the eye-in-hand camera, the control system moves the nozzle quickly into the spray position (targeting) and keeps the relative position between the nozzle and the target crop unchanged (tracking spray). The effect of target spraying is therefore mainly determined by the visual servo control accuracy of the spray arm during the targeting and tracking stages.

Description of target operation issue

The spray arm needs to guide the nozzle to fulfill targeting and tracking spray operations during the forward movement of the spray robot. Regarding path planning, the motion of the nozzle during the targeting stage can be regarded as point-to-point movement, and it is necessary to ensure that the trajectory of the nozzle is smooth and the path is short. The motion of the nozzle during the tracking spray stage can be regarded as continuous-path movement, so the nozzle must stably and accurately track the motion trajectory of the target crop relative to the spray robot. Moreover, to avoid failure of the visual servo, the target crop should always remain in the field of view of the eye-in-hand camera, and the planned targeting and tracking trajectories must lie within the spray arm's workspace, which is an irregular polyhedron due to structural constraints.

To solve the above problems, a path planning and visual tracking method based on an improved artificial potential field is proposed. The basic idea is that, on the basis of the traditional artificial potential field, a velocity potential field is introduced to meet the need for stable tracking, and a field-of-view constraint and joint position limit constraints are introduced to ensure the validity of the servo control and of the motion of the spray arm. Thus, a feasible trajectory is planned in Cartesian space and image space, and the image-based visual servo method is used to track the trajectory.

Trajectory planning and target control based on the improved potential field method

Traditional artificial potential field method

The traditional artificial potential field method regards the motion of a robot in the planned space as motion in a virtual force field. Target points attract the robot, while obstacles or threat areas repel it, and the robot moves toward the target points under the action of the resultant force. The traditional gravitational potential field Uatt and repulsive potential field Urep are usually defined as
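In their standard form (a sketch consistent with the symbols listed below; the paper's exact gains may be arranged differently), with ρ(P, Po) denoting the distance between the robot and the closest obstacle point, they are

$$
U_{att}(P)=\frac{1}{2}\alpha\left\|P-P_t\right\|^{2},\qquad
U_{rep}(P)=\begin{cases}
\dfrac{1}{2}\beta\left(\dfrac{1}{\rho(P,P_o)}-\dfrac{1}{\rho_0}\right)^{2}, & \rho(P,P_o)\le\rho_0\\
0, & \rho(P,P_o)>\rho_0
\end{cases}
$$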

where

P——current position vector of the robot

Pt——position vector of a target point

Po——position vector of the closest point between an obstacle and the robot

ρ0——influence distance of the obstacle repulsion field, mm

α, β——positive proportional gain coefficient of gravity and repulsion

The gravity Fatt and repulsion Frep are equal to the negative gradients of Uatt and Urep, respectively, while the virtual force F on the robot is the vector sum of the two.
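Written explicitly, this corresponds to

$$
F_{att}(P)=-\nabla U_{att}(P),\qquad
F_{rep}(P)=-\nabla U_{rep}(P),\qquad
F(P)=F_{att}(P)+F_{rep}(P)
$$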

where

∇——gradient operator

The artificial potential field method can iteratively plan the discrete trajectory of the robot. The virtual force F determines the motion direction of the robot in the next time step:
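A typical update of this kind (a sketch; the paper's own equation may scale the step differently) is

$$
P_{k+1}=P_k+\varepsilon\,\frac{F(P_k)}{\left\|F(P_k)\right\|}
$$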

where

Pk——current position vector of the robot

Pk+1——position vector of the robot in the next time step

ε——iterative step size

Improved method

During the targeting and tracking spray process, the spray robot keeps moving forward, which is equivalent to the situation in which the robot remains still and the target crop keeps moving backward, that is, a moving-target situation. Since the traditional artificial potential field model does not consider velocity, the planned path is prone to oscillation when tracking a moving target: the method can approach the moving target quickly, but it has difficulty tracking it stably. Therefore, a velocity term is introduced into the gravitational field function so that the target state of the spray arm moves at the same speed as the target crop.

Although greenhouse pavement is relatively flat, there are still occasional large undulations or obstacles, which may cause the target crop to escape the field of view of the eye-in-hand camera and the servo process to fail in some cases. Therefore, field-of-view constraints are introduced; that is, the four boundaries of the field of view are regarded as obstacles. In addition, due to structural constraints, the waist and arm joints of the spray arm can rotate only within specific ranges, and the limit positions on both sides of these ranges are also regarded as obstacles.

The above gravitational and repulsive terms are defined in different description spaces. In particular, the four boundaries of the field of view are defined in image space, the limit positions of the joints are defined in joint space, the velocity term of the gravitational field is defined in Cartesian space, and its position term is defined in image space (according to reference [6], the expected spray position of the target crop is determined by its image moments). If the artificial potential field method is used to plan the trajectory, the virtual forces must be transformed into the same description space. In this paper, the method described in reference [27] is applied to transform the virtual forces between different description spaces. Thus, if the potential field function is Uf = U(f(P)) and f(P) is continuously differentiable in the feasible region of P, then the virtual force Ff can be expressed as
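A form consistent with this description and with the later use of Ls+ (a sketch; here ∇f denotes the gradient with respect to f, and the paper's exact expression may differ) is

$$
F_f(P)=-\left(\frac{\partial f}{\partial P}\right)^{+}\nabla_{f}U\big(f(P)\big)
$$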

where

(∂f/∂P)+——the inverse (or pseudo-inverse) of the partial derivative ∂f/∂P

On the premise of satisfying the field-of-view and joint limit constraints, the improved artificial potential field method is applied to plan the discrete trajectory of the camera in Cartesian space, which is then mapped into image space and tracked with the image-based visual servo controller.

Gravitational potential field and gravitation

To stably track the target crop and avoid oscillation, a velocity term is introduced into the gravitational potential field, and the gravitational potential field function is defined as
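A quadratic form consistent with the symbols defined below (a sketch; the paper's exact weighting may differ) is

$$
U_{att}(s,v)=\frac{1}{2}\alpha_{1}\left\|s-s^{*}\right\|^{2}+\frac{1}{2}\alpha_{2}\left\|v_t-v\right\|^{2}
$$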

where

s——image moment vector of the target crop (m10/m00, m01/m00, k(μ20+μ02)/m00), (pixel, pixel, pixel)

s*——expected value of s, (pixel, pixel, pixel)

vt——relative velocity of the target crop, mm/s

v——velocity of the end of the spray arm (or the eye-in-hand camera), mm/s

α1, α2——positive proportional gain coefficients

The negative gradients of Uatt(s, v) with respect to s and v are called the relative position gravity Fatts and the relative velocity gravity Fattv, respectively, and
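Consistent with the descriptions that follow and with the mapping of image-space terms through Ls+ (a sketch; the paper's exact form may differ), these can be written as

$$
F_{atts}(P)=\alpha_{1}L_s^{+}\left(s^{*}-s\right),\qquad
F_{attv}(P)=\alpha_{2}\left(v_t-v\right),\qquad
F_{att}(P)=F_{atts}(P)+F_{attv}(P)
$$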

where

Ls+——inverse of the image Jacobian matrix related to s

Fatt(P)—virtual resultant gravity

Fatts(P) makes the spray arm move along the shortest path to the desired spray position: its direction is from the current position to the desired position, and its magnitude is proportional to the relative position error. Fattv(P) makes the target state of the spray arm move at the same speed as the target crop: its direction is the direction of motion of the target crop relative to the spray arm, and its magnitude is proportional to the relative velocity. When the spray arm is aligned with the desired spray position and the relative velocity between the spray arm and the target crop is zero, Fatt(P) = 0.

Repulsion field and repulsion force of field of view constraint

Suppose the crop image remains in the camera's view as long as the centroid k0(u0, v0) of the crop image lies in region M of the camera's view (the boundaries of M are umin, umax, vmin and vmax). Let d be the influence distance of the boundary of the field of view (Fig 2).

Fig. 2. The visual field of the eye-in-hand camera.

To prevent k0 from escaping from M, k0 should be repelled back into the safe region M0 when it moves within Ml, Mr, Mt or Mb. By introducing logarithmic terms into the repulsive field function, the repulsive force tends to infinity as k0 approaches the boundary of the camera's view. The repulsion field function is (Fig 3):
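As an illustrative sketch of such a logarithmic barrier, the term for the left boundary alone can be written as follows (the other three boundaries take the analogous form; the gain η is a placeholder introduced here, not taken from the paper):

$$
U_{rep,l}(u_0)=\begin{cases}
-\eta\,\ln\dfrac{u_0-u_{min}}{d}, & u_{min}<u_0\le u_{min}+d\\
0, & u_0>u_{min}+d
\end{cases}
$$

The term vanishes at the edge of the influence band (u0 = umin + d) and grows without bound as u0 approaches umin.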

where

Fig. 3. The repulsion field of the visual field constraint.

The repulsive force Frepv of the field-of-view constraint (as shown in Fig 4, where the length and direction of each arrow represent the magnitude and direction of the repulsive force, respectively) can be expressed as

Fig. 4. The repulsive force of the visual field constraint.

Repulsion field and repulsion force of joint limits

The three joint variables of the spray arm each have a limited rotation range. Suppose qi ∈ (qimin, qimax), i = 1, 2, 3, and let θ be the influence distance of the joint limit position. Analogous to the field-of-view constraint, the repulsion field of the joint limit is defined as follows.
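An illustrative sketch of such a joint-limit barrier for a single joint, written for the lower limit side only (the upper side is analogous; the gain κ is a placeholder introduced here, not taken from the paper), is

$$
U_{rep,q_i}(q_i)=\begin{cases}
-\kappa\,\ln\dfrac{q_i-q_{i\,min}}{\theta}, & q_{i\,min}<q_i\le q_{i\,min}+\theta\\
0, & q_i>q_{i\,min}+\theta
\end{cases}
$$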

where

Suppose Γ = {qi ∈ [qimin + θ, qimax − θ], i = 1, 2, 3}. The repulsive force Frepq of the spray arm's joint limits can be expressed as
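A sketch consistent with carrying the joint-space repulsion into Cartesian space through the arm Jacobian Lr (the paper's exact expression may differ) is

$$
F_{repq}(P)=\begin{cases}
0, & q\in\Gamma\\
-L_r\,\nabla_{q}U_{repq}(q), & q\notin\Gamma
\end{cases}
$$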

where

Lr——Jacobian matrix of the spray arm

Trajectory planning method

According to the gravitational and repulsive forces defined above, the resultant force on the spray arm in Cartesian space is
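Consistent with β and γ acting as weights on the two repulsive components (a sketch), the resultant is

$$
F(P)=F_{att}(P)+\beta\,F_{repv}(P)+\gamma\,F_{repq}(P)
$$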

where

β, γ——scaling factors.

The proportion of each component in F(P) can be changed by adjusting the values of β and γ.

A feasible discrete trajectory Γ = {Pk | k = 0, 1, ⋯, n} for the eye-in-hand camera can be planned in Cartesian space by using F(P) and Eq (15).
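The update takes the same form as in the traditional method, now with a per-cycle step size (a sketch; the paper's Eq (15) may differ in detail):

$$
P_{k+1}=P_k+\varepsilon_k\,\frac{F(P_k)}{\left\|F(P_k)\right\|}
$$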

where

εk——step size of the kth control cycle

Further, combined with the eye-in-hand camera model, the values sk* of the image moment vector s = [m10/m00, m01/m00, k(μ20+μ02)/m00]T corresponding to each trajectory point Pk can be calculated. That is, a corresponding discrete trajectory X = {sk* | k = 0, 1, ⋯, n} in image space is obtained.
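To make the planning loop concrete, the sketch below (with the helper functions attractive_force, fov_repulsion, joint_repulsion and project_to_image standing in for the terms defined above; all names are placeholders, not the paper's implementation) iterates the Cartesian trajectory and maps each point into image space:

```python
import numpy as np

def plan_trajectory(P0, n_steps, eps, beta, gamma,
                    attractive_force, fov_repulsion, joint_repulsion,
                    project_to_image):
    """Plan a discrete camera trajectory with the improved potential field.

    The force helpers are assumed to return their terms already expressed
    in Cartesian space (see the equations above); they are placeholders.
    """
    P = np.asarray(P0, dtype=float)
    cartesian_traj = [P.copy()]
    image_traj = [project_to_image(P)]
    for _ in range(n_steps):
        # Resultant virtual force: attraction plus weighted repulsions.
        F = (attractive_force(P)
             + beta * fov_repulsion(P)
             + gamma * joint_repulsion(P))
        norm = np.linalg.norm(F)
        if norm < 1e-9:              # target state reached, stop planning
            break
        P = P + eps * F / norm       # step along the force direction (eps ~ εk)
        cartesian_traj.append(P.copy())
        image_traj.append(project_to_image(P))   # map the point into image space
    return cartesian_traj, image_traj
```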

Controller design

The image-based visual servo controller is used to track this trajectory. Suppose that in the kth control cycle, the error of the image features is defined as:
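With sk denoting the measured feature vector in cycle k (a sketch consistent with the trajectory notation above):

$$
e_k=s_k-s_k^{*}
$$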

A speed controller is selected to exponentially decrease the error. According to reference [28], the controller can be selected as
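The classical image-based velocity law of this kind, with λ a positive control gain (an assumption here; the paper's controller may also include a feedforward term for the moving reference sk*), is

$$
v=-\lambda\,\hat{L}_s^{+}\,e_k
$$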

where

L̂s+——pseudo-inverse of the estimated image Jacobian matrix Ls

The visual system block diagram is shown in Fig 5. According to reference [16], the controller can make the system robust to model error and noise disturbance.

Fig. 5. Block diagram of the proposed visual control system.

Simulation analysis

The visual servo motion planning method based on the improved potential field described above is validated by simulation in MATLAB. The camera adopts a perspective projection model with a resolution of 1000×1000 pixels and a focal length of 10 mm. Suppose that the spray robot moves at vhx = 0.02 m/s, vhy = 0.3 m/s and vhz = 0 m/s. A crop image parallel to the eye-in-hand camera plane, obtained after background segmentation and binarization, is selected as the target crop. The projections of the target crop at the initial position and the desired spray position are shown in Fig 6. At the desired position, the canopy depth of the target crop is Z* = 0.49 m, and S = [493 582 262]T. Equivalently, the simulation assumes that the spray robot stands still and the target crop moves toward it.

Fig. 6. Simulation image.
(A) Initial position. (B) Desired position.

(1) Effect of adding gravitational field

Fig 7 compares the trajectories planned by the potential field method with only the gravitational potential term and by the image moment method of reference [6]. The nozzle's trajectory in image space with the former is obviously better than with the latter, and its trajectory in Cartesian space is shorter and more reasonable.

Fig. 7. Results of introducing gravitational field.
(A) Motion trajectory in image space. (B) Motion trajectory in Cartesian space. The simulation result of the potential field method is shown with a blue solid line, and that of the image moment method with a black dash-dotted line.

(2) Effect of adding field of view constraints

The field-of-view boundary corresponding to the centroid k0(u0, v0) of the crop image is set to u0 ∈ [200, 800] and v0 ∈ [200, 800], and the influence distance of the boundary is set to 100 pixels. Fig 8 compares the trajectories planned by the potential field method with only the field-of-view constraints and by the image moment method of reference [6]. The latter cannot ensure that the target crop always remains in the field of view, which may lead to failure of the servo task, whereas the former avoids this problem.

Fig. 8. Results of introducing visual field constraints.
(A) Motion trajectory in image space. (B) Motion trajectory in Cartesian space. The simulation result of the potential field method is shown with a blue solid line, and that of the image moment method with a black dash-dotted line.

(3) Effect of adding joint limit constraints

The joint angle ranges of the waist, upper arm and lower arm joints are set to q1 ∈ (−1.7, 0.5) rad, q2 ∈ (−2.0, −0.2) rad and q3 ∈ (0.9, 2.0) rad, respectively, and the influence distance of the joint limit position is set to 0.3 rad. Fig 9 compares the trajectories planned by the potential field method with only the joint limit constraints and by the image moment method of reference [6]. In this case, the trajectory planned by the latter drives both the waist and the lower arm joints beyond the set ranges, leading to failure of the servo task, whereas the trajectory planned by the former avoids this problem.

Fig. 9. Results of introducing joint limit position constraints.
(A) Image moment method. (B) Potential field method. The trajectory of the waist is shown with a blue dotted line, the trajectory of the upper arm with a red dash-dotted line, and the trajectory of the lower arm with a black dashed line.

(4) Comprehensive effect of the improved potential field method

The simulation results of the improved potential field method considering all the gravitational and repulsive terms are shown in Fig 10. Fig 10A shows the error tracking curves of the three moment features. All the moment features converge steadily, and the targeting process lasts approximately 13 visual cycles. Influenced by the translational velocity of the robot in the X and Y directions, the errors of the three moment features during tracking are approximately −3 pixels, −4 pixels and 1 pixel, corresponding to position errors at the desired position of −1.5 mm, −2 mm and 0.5 mm, respectively. Fig 10B shows the velocity of the eye-in-hand camera (or nozzle) in the X, Y and Z directions, which converges to −0.02 m/s, −0.3 m/s and 0 m/s, respectively, ensuring targeting accuracy in the tracking spray stage. Fig 10C and 10D compare the trajectories planned for the eye-in-hand camera by the potential field method and by the image moment method of reference [6] in Cartesian space and image space, respectively. The trajectories of the former in both spaces are obviously better than those of the latter.

Fig. 10. Results of the improved potential field method.
(A) Error tracking curves. The curves of moment features 1, 2 and 3 are shown with a blue dotted line, a red dash-dotted line and a black dashed line, respectively. (B) Velocity of the eye-in-hand camera. The velocities in the X, Y and Z directions are shown with a blue dotted line, a red dash-dotted line and a black dashed line, respectively. (C) Motion trajectory in Cartesian space. (D) Motion trajectory in image space. In (C) and (D), the result of the potential field method is shown with a blue solid line and that of the image moment method with a black dash-dotted line.

Experimental verification

Based on the preceding studies (reference [6]), an improved prototype (Fig 11) is built to test the effectiveness of the above motion planning method. The prototype uses a ZUTO460-BRG 4-axis manipulator as the spray arm, a Basler acA1300 industrial camera (resolution 1296×966 pixels) as the scene camera and an OV7725-based network camera (resolution 640×480 pixels) as the eye-in-hand camera. Moreover, a KYD650N5 controllable laser lamp is used instead of a nozzle so that the target error can be measured conveniently in the experiments.

Fig. 11. Experimental prototype.
1. Movement platform 2. Controller, driver, etc. 3. PC 4. Scene camera. 5. Spray arm 6. Eye-in-hand camera 7. Laser.

The pavement of the greenhouse is relatively flat, and random vibration caused by unevenness is generally small. To facilitate the measurement of the target error in the experiment process, prototype experiments are carried out in a controllable laboratory environment.

Ten hawthorn leaves are selected as experimental objects. Before the experiment, the centroid of each leaf is marked, and the expected off-ground height of the laser lamp when targeting is estimated as h = h1 + 2.8d (where h1 is the off-ground height of the leaf and d is the diameter of the circle surrounding the leaf, centered at its centroid), as shown in Fig 12. In the experiment, the laser lamp is lit during the simulated tracking spray process.

Fig. 12. Schematic diagram of target and tracking effects.
1. Blade 2. Centroid (horizontal expected position) 3. Vertical desired position. 4,5. Horizontal and vertical range of effective target tracking.

The deviation of the laser lamp’s spot relative to the centroid mark during the tracking spray process is taken as the horizontal error err_h, and the maximum deviation of the laser lamp’s height relative to its desired height is taken as the vertical error err_v.

To quantitatively evaluate the tracking spray effect, the horizontal relative error is defined as Re_h = |err_h|/d × 100% and the vertical relative error as Re_v = |err_v|/h × 100%, and the tracking spray effect is divided into two kinds: effective tracking spray (Re_h ≤ 10% and Re_v ≤ 5%) and ineffective tracking spray (Re_h > 10% or Re_v > 5%).

Image processing and feature extraction are carried out using the method of reference [6]. The expected value of the image moment vector is determined as follows: the spray arm is manually controlled to move the nozzle to its desired spray position, and the value of S = {m10/m00, m01/m00, k(μ20+μ02)/m00} at that moment is recorded. The process is repeated five times and the average is taken as the expected value; the experiments give an expected value of S of {237, 189, 202}.

During the experiments, the leaves are randomly placed with spacings of approximately 30~50 mm in the X direction and 400~600 mm in the Y direction, and several obstacles are placed in the path of the prototype to simulate the unevenness of greenhouse pavement. The prototype is controlled to pass over the obstacles and simulate target spraying on the leaves at two speeds, 100~150 mm/s and 250~300 mm/s, with each spray time set to 0.5 s, as shown in Fig 13.

Fig. 13. Simulated target experiment.

To compare the control effect of the improved potential field method with that of the image moment method of reference [6], ten experiments were carried out for each of the two methods. During the experiments, it was found that Re_v was usually less than 4%, except in a few special cases such as incorrect identification of leaves, in which Re_h was always greater than 10%. Therefore, only whether Re_h exceeded 10% was used to judge the tracking spray effectiveness in the experiments. For ease of judgment, a circle was drawn on each leaf with the centroid as its center and a radius of 0.1d. If the laser lamp's spot remained within the marked circle throughout the simulated spraying, the tracking spray was judged to be effective; otherwise, it was ineffective.

The statistical results of effective and ineffective tracking sprays in the various cases are shown in Table 1. In the table, the effective tracking spray rate = (number of effective tracking sprays / total number of tracking sprays) × 100%, and the ineffective tracking spray rate = (number of ineffective tracking sprays / total number of tracking sprays) × 100%.

Tab. 1. Statistical table of target and tracking effect.

The prototype tests show that the improved potential field method has a better tracking control effect than the image moment method of reference [6], and its advantage becomes more obvious as the velocity of the prototype increases. However, the former has a higher computational cost and increases the sampling period by approximately 21%, which is a disadvantage that restricts further improvement of the prototype's velocity and target tracking accuracy. In addition, factors such as the model error of the prototype system and the deviation between the actual and ideal attitudes of the leaves remain important influences on the target tracking accuracy.

Conclusion

To further improve the motion control of the spray arm during the targeting and tracking processes, and based on previous research on visual tracking methods using image moments and a hybrid vision structure with a single scene camera and one (or several) eye-in-hand cameras, a novel algorithm for motion planning and target control is proposed. The algorithm is based on an improved potential field method and introduces a velocity potential field, field-of-view constraints and joint position limit constraints into the traditional artificial potential field. Specifically, the velocity potential field achieves stable tracking by making the target state of the spray arm move at the same velocity as the crop (relative velocity). The field-of-view constraint prevents the target crop from escaping from the field of view of the eye-in-hand camera and ensures the validity of the visual servo control. The joint position limit constraints ensure that the planned trajectories of the joints always remain within the allowable ranges. The simulation and experimental results show that the proposed method has higher tracking accuracy, a better planned path and higher robustness than the servo controller based on image moments. Future work will extend the motion planning algorithm of the manipulator to other platforms, such as intelligent logistics sorting systems.

Supporting information

S1 Fig [tif]
Flow chart of simulation analysis.

S2 Fig [tif]
Structure diagram of the prototype software system.

S3 Fig [tif]
Flowchart diagram of the prototype software system.

S1 File [docx]
Estimation method of crop position and platform speed based on scene camera image.

S2 File [docx]
Pictures of experiment scene.


References

1. Qiu BJ, Yan R, Ma J, Guan XP, Ou MX. Research progress analysis of variable rate sprayer technology. Transactions of the Chinese Society for Agricultural Machinery. 2015; 46(3):59–72. doi: 10.6041/j.issn.1000-1298.2015.03.009

2. Hong SW, Zhao LY, Zhu HP. CFD simulation of pesticide spray from air-assisted sprayers in an apple orchard: Tree deposition and off-target losses. Atmospheric Environment. 2018;175:109–119. doi: 10.1016/j.atmosenv.2017.12.001

3. Liu XM, Li Y, Li M, Yuan J, Fang QZ, Hou JL. Design and test of smart-targeting spraying system on boom Sprayer. Transactions of the Chinese Society for Agricultural Machinery. 2016. 47(3):37–44. doi: 10.6041/j.issn.1000-1298.2016.03.006

4. Garcia-Santillan I, Peluffo-Ordonez D, Caranqui V, Pusdá M, Garrido F, Granda P. Computer vision-based method for automatic detection of crop rows in potato fields. Advances in Intelligent Systems and Computing, 2018,721:355–366. doi: 10.1007/978-3-319-73450-7_34

5. Garcia-Santillan I, Montalvo M, Guerrero J, Pajares G. Automatic detection of curved and straight crop rows from images in maize fields. Biosystems Engineering. 2017;156:61–79. doi: 10.1016/j.biosystemseng.2017.01.013

6. Zhao DJ, Zhang B, Wang XL, Guo HH, Xu SB. Automatic target research of indoor spray robot based on image moments. Transactions of the Chinese Society for Agricultural Machinery. 2016;47(12): 22–29. doi: 10.6041/j.issn.1000-1298.2016.12.004

7. Chang WC, Shao CK. Hybrid eye-to-hand and eye-in-hand visual servoing for autonomous robotic manipulation. SICE Annual Conference. IEEE; 2010: 415–422.

8. Hua CC, Wang YQ, Guan XP. Visual tracking control for an uncalibrated robot system with unknown camera parameters. Robotics and Computer-Integrated Manufacturing. 2014; 30(1): 19–24. doi: 10.1016/j.rcim.2013.06.002

9. Baizid K, Yousnadj A, Meddahi A, Chellali R, Iqbal J. Time scheduling and optimization of industrial robotized tasks based on genetic algorithms. Robotics and Computer-integrated Manufacturing, 2015, 34:140–150. doi: 10.1016/j.rcim.2014.12.003

10. Zhao SP, Zhu ZX, Luo JJ. Multitask-based trajectory planning for redundant space robotics using improved genetic algorithm. Applied Sciences, 2019,9(11):1–22. doi: 10.3390/app9112226

11. Ning Q, Tao GP, Chen BC, Lei YJ, Yan H, Zhao CP. Multi-UAVs trajectory and mission cooperative planning based on the Markov model. Physical communication, 2019,35:1–10. doi: 10.1016/j.phycom.2019.100717

12. Zhang Q, Wang L, Zhou DS. Trajectory planning of 7-DOF space manipulator for minimizing base disturbance. International Journal of Advanced Robotic System, 2016,13: 1–10. doi: 10.5772/62123

13. De Momi E, Kranendonk L, Valenti M, Enayati N, Ferrigno G. A neural network-based approach for trajectory planning in robot-human handover tasks. Frontiers in robotics AI, 2016,3:1–10. doi: 10.3389/frobt.2016.00034

14. Abe A.Trajectory planning for flexible Cartesian robot manipulator by using artificial neural network: numerical simulation and experimental verification. Robotica, 2011,29:797–804. doi: 10.1017/S0263574710000767

15. Xie L, Xue SF, Zhang JF, Zhang MY, Tian WL, Haugen S. A path planning approach based on multi-direction A* algorithm for ships navigating within wind farm waters. Ocean Engineering, 2019,184:311–322. doi: 10.1016/j.oceaneng.2019.04.055

16. Xu X, Hu Y, Zhai JM, Li LZ, Guo PS. A novel non-collision trajectory planning algorithm based on velocity potential field for robotic manipulator. International Journal of Advanced Robotic Systems. 2018;15(4):1–10. doi: 10.1177/1729881418787075

17. Liu T, Gong GF, Yang HY, Chen YX, Zhu Y. Trajectory control of tunnel boring machine based on adaptive rectification trajectory planning and multi-cylinders coordinated control. International Journal of Precision Engineering and Manufacturing, 2019,20(10): 1721–1733. doi: 10.1007/s12541-019-00073-5

18. Yang CG, Li ZJ, Li J. Trajectory planning and optimized adaptive control for a class of wheeled inverted pendulum vehicle models. IEEE Transactions on Cybernetics, 2012,43(1):24–36. doi: 10.1109/TSMCB.2012.2198813 22695357

19. Wang MM, Luo JJ, Walter U. Trajectory planning of free-floating space robot using Particle Swarm Optimization (PSO). ACTA Astronautica, 2015,112: 77–88. doi: 10.1016/j.actaastro.2015.03.008

20. Chen CT, Liao TT. A hybrid strategy for the time- and energy-efficient trajectory planning of parallel platform manipulators. Robotics and Computer-integrated Manufacturing, 2011,27(1):72–81. doi: 10.1016/j.rcim.2010.06.012

21. Ameni A, Khaled N. An advanced potential field method proposed for mobile robot path planning. Transactions of the Institute of Measurement and Control. 2019; 11:3132–3144. doi: 10.1177/0142331218824393

22. Kumar PB, Rawat H, Parhi DR. Path planning of humanoids based on artificial potential field method in unknown environments. Expert Systems. 2018;36(2):1–11. doi: 10.1111/exsy.12360

23. Chen GL, Liu J. Mobile Robot Path Planning Using Ant Colony Algorithm and Improved Potential Field Method. Computational Intelligence and Neuroscience. 2019:1–10. doi: 10.1155/2019/1932812 31198416

24. Wang WR, Zhu MC, Wang XM, He S, He JP, Xu ZB. An improved artificial potential field method of trajectory planning and obstacle avoidance for redundant manipulators. International Journal of Advanced Robotic Systems. 2018;15(5):1–13. doi: 10.1177/1729881418799562

25. Li WH, Yang CG, Jiang YM, Liu XF. Motion Planning for Omnidirectional Wheeled Mobile Robot by Potential Field Method. Journal of Advanced Transportation. 2017;1–11. doi: 10.1155/2017/4961383

26. Kovács B, Szayer G, Tajti F, Burdelis M, Korondi P. A novel potential field method for path planning of mobile robots by adapting animal motion attributes. Robotics and Autonomous Systems. 2016;82:24–34. doi: 10.1016/j.robot.2016.04.007

27. Mezouar Y, Chaumette F. Path planning for robust image-based control. IEEE Transactions on Robotics and Automation. 2002; 18(4): 534–549. doi: 10.1109/TRA.2002.802218

28. Lu X, Liu JT, Li HF, Li Y, Sun L. Attack and defense planning for competitive networked robots based on artificial potential field method. ROBOT. 2013; 35(2): 218–226. doi: 10.3724/SP.J.1218.2013.00218

