On-Line Terrain Estimation using Internal Sensors

Debangshu Sadhukhan and Carl A. Moore
Department of Mechanical Engineering
Florida A&M University – Florida State University College of Engineering
2525 Pottsdamer Street, Tallahassee, Florida 32310

Abstract - The eXperimental Unmanned Vehicle (XUV) [1] is designed to autonomously navigate over different types of terrain. The performance of autonomous navigation improves when the vehicle's control system takes into account the type of terrain on which the vehicle is traveling. For example, if the ground is covered with snow, a reduction in acceleration is necessary to avoid wheel slip. Previous researchers have developed algorithms that use vision to categorize the traversability of terrain; others have used classical terramechanics equations to identify the key terrain parameters. In this paper we present the foundation of a novel algorithm that uses data from the vehicle's internal sensors to categorize the type of terrain being traversed. We believe the algorithm's method of "sensing by feeling" will enable us to qualitatively determine the terrain in real time without the use of a vision system. Vision-based algorithms have difficulty analyzing homogeneous terrains because corresponding artifacts in homogeneous images are hard to detect; also, since vision depends on illumination, low-light conditions increase the possibility of incorrect terrain classification.

Fig. 1. The Army's XUV

I. INTRODUCTION AND RELATED WORK

The XUV, shown in Fig. 1, is a semi-autonomous Unmanned Ground Vehicle (UGV) that uses high-fidelity sensors for reconnaissance, surveillance, and target acquisition. The goal of current XUV research is to develop autonomous mobility that enables a UGV to maneuver over rugged terrain as part of a mixed manned and unmanned vehicle group. As part of this goal the XUV must be able to maneuver at speeds higher than traditional UGVs. High-speed maneuvers necessitate knowledge of the terrain's character. Even during low-speed maneuvers, knowledge of the terrain decreases the vehicle's likelihood of becoming stuck. Thus the goal of our research is to provide real-time qualitative knowledge of the terrain.

Other researchers have studied terrain detection for UGVs. For example, Howard, Seraji, and Tunstel [2,3,4,5] investigated the use of vision to classify the traversability of terrain. They identified roughness, slope, discontinuity, and hardness as the key terrain traversability characteristics. Their vision algorithms determine these characteristics from the brightness levels of corresponding pixels in subsequent camera images of the terrain. The algorithm combines the four terrain characteristics to form a Fuzzy Traversability Index representing the ease of robot travel over the terrain.

Karl Iagnemma and Steven Dubowsky [6,7] at the Massachusetts Institute of Technology have used classical terramechanics equations to perform on-line estimation of terrain parameters. The authors identified cohesion c and internal friction angle φ as the key terrain parameters. They developed equations relating these two parameters of interest to physically measurable quantities k1 and k2, which are functions of the vehicle's vertical loading, torque, wheel angular speed, and wheel linear speed. The equation relating these physical quantities to the cohesion c and the internal friction angle φ has the form

$$k_1 = c + k_2 \tan\phi$$

After measuring the physical quantities over a number of time steps, the above equation is solved for cohesion and internal friction using the least-squares method. Coulomb's equation is then used to compute the terrain shear strength, which the authors state is a good measure of terrain traversability.
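To make the estimation step concrete, the following is a minimal sketch of the least-squares fit described above, assuming the linear relation reconstructed above and that the per-time-step quantities k1 and k2 have already been computed from the vertical load, torque, and wheel speeds. The function names and the stacked-regression formulation are ours for illustration, not Iagnemma and Dubowsky's implementation.

```python
import numpy as np

def estimate_terrain_parameters(k1, k2):
    """Least-squares estimate of cohesion c and internal friction angle
    phi from per-time-step measurements k1, k2, assuming the relation
    k1 = c + k2 * tan(phi) holds at every time step."""
    k1 = np.asarray(k1, dtype=float)
    k2 = np.asarray(k2, dtype=float)
    # Stack one equation per time step: [1, k2_t] @ [c, tan(phi)] = k1_t
    A = np.column_stack([np.ones_like(k2), k2])
    x, *_ = np.linalg.lstsq(A, k1, rcond=None)
    c, tan_phi = x
    return c, np.arctan(tan_phi)

def shear_strength(c, phi, sigma):
    """Coulomb's equation: tau = c + sigma * tan(phi), where sigma is
    the normal stress on the shear plane."""
    return c + sigma * np.tan(phi)
```

Because both c and tan φ enter the relation linearly, each new (k1, k2) pair simply adds a row to the regression matrix, so the estimate can be refreshed on-line as measurements accumulate.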
II. PROPOSED ALGORITHM

Our goal is to develop a terrain detection algorithm that relies on data gathered from the vehicle's internal sensors. The algorithm will function in the event of damage to the robot's vision system or during low-light conditions. Furthermore, the results from our algorithm can be used as a crosscheck for the results of vision-based terrain detection. In our algorithm we use measurements of wheel slip, vehicle acceleration, and terrain-induced wheel noise as indicators of the terrain type.

As the first step in the algorithm's development we performed driving experiments over various terrains using a consumer SUV. We wanted to observe wheel slip and other dynamic effects induced by the vehicle-terrain interaction. As expected, while driving on asphalt we observed negligible slip; we were able to make sharp turns at high speeds without any gross tire slip. There was considerable slip while driving on gravel, and we experienced a good deal of vertical and lateral acceleration as well. We also heard the distinctive noise of the tires crunching through the rocks. On grass we observed less slip than on gravel but more lateral and vertical acceleration. We believe this was because the grassy surfaces we chose had not been leveled to the extent typical of surfaces meant for driving, such as gravel lots, asphalt, or concrete. We also identified notable terrain characteristics while driving over hard-packed dirt and loose pebbles. Our driving experiments confirmed that, besides wheel slip, wheel noise and vehicle acceleration are indicative of the terrain type. We are currently considering whether a neural network that fuses measurements of these three reactions can accomplish the terrain detection; a sketch of such a classifier follows the slip example below.

We will carry out UGV experimentation on an ATRV-Jr mobile robot manufactured by iRobot (Fig. 2). It comes equipped with a full array of sensors, including an inertial navigation sensor (INS), wheel encoders, sonars, a laser range finder, bump panels, differential GPS, and a compass.

Fig. 2. iRobot's ATRV-Jr

We will first consider terrain-induced wheel slip. The longitudinal slip i of each wheel is defined as one minus the ratio of the vehicle translational velocity v (calculated from INS-measured acceleration) to the wheel translational velocity rω (calculated from the wheel encoder signal):

$$i = 1 - \frac{v}{r\omega}$$

The four slip values (one for each wheel) are averaged to obtain a single slip value for the vehicle at each time step.
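As a concrete illustration of this computation, the following is a minimal sketch assuming v has already been obtained by integrating the INS acceleration and ω by differencing the encoder counts; the wheel radius and function names are illustrative, not the ATRV-Jr's actual parameters or driver interface.

```python
import numpy as np

WHEEL_RADIUS = 0.2  # meters; illustrative value, not the ATRV-Jr spec

def longitudinal_slip(v, omega, r=WHEEL_RADIUS, eps=1e-6):
    """Slip of one wheel: i = 1 - v / (r * omega).

    v     : vehicle translational velocity from the INS [m/s]
    omega : wheel angular velocity from the encoder [rad/s]
    Guards against a near-zero denominator when the wheel is stopped.
    """
    if abs(r * omega) < eps:
        return 0.0
    return 1.0 - v / (r * omega)

def vehicle_slip(v, wheel_omegas):
    """Average the four per-wheel slip values into one vehicle-level
    slip estimate for the current time step."""
    return float(np.mean([longitudinal_slip(v, w) for w in wheel_omegas]))

# Example: vehicle moving at 1.0 m/s while each wheel spins at 5.5 rad/s
# (r * omega = 1.1 m/s), giving roughly 9% slip.
print(vehicle_slip(1.0, [5.5, 5.5, 5.5, 5.5]))
```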
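The neural-network fusion mentioned above is still under consideration; the sketch below only illustrates the kind of small classifier we have in mind. The three summary features (mean slip, acceleration variance, audio RMS energy), the toy training rows, and the network size are all hypothetical choices, not measured data.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

TERRAINS = ["asphalt", "grass", "gravel", "packed dirt"]

# Hypothetical training set: each row summarizes one time window as
# [mean wheel slip, variance of lateral/vertical acceleration, audio RMS].
X_train = np.array([
    [0.02, 0.05, 0.1],   # asphalt: little slip, smooth, quiet
    [0.10, 0.60, 0.3],   # grass: moderate slip, bumpy ride
    [0.25, 0.40, 0.8],   # gravel: high slip, loud crunching
    [0.08, 0.20, 0.2],   # packed dirt
])
y_train = [0, 1, 2, 3]

# A small feed-forward network suffices for three fused features.
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)

# Classify a new window; this example sits nearest the gravel row above.
print(TERRAINS[clf.predict([[0.22, 0.45, 0.7]])[0]])
```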
III. SIMULATION RESULTS

A prototype robot was built and tested using the ADAMS/View vehicle modeling and simulation software [8]. We conducted a simulation of the robot driving over four types of terrain: asphalt, dirt, grass, and gravel. Each type of terrain was represented by different coefficients of static and kinetic friction. The values chosen were 1.0 and 0.65 for asphalt, 0.5 and 0.3 for grass, 0.6 and 0.4 for dirt, and 0.8 and 0.4 for gravel [9].

Fig. 3. ADAMS/View simulation results: percentage wheel slip on various terrains.

Fig. 4. Test bed to measure vehicle-terrain interaction over various terrains (30 m strips of grass, sand, packed dirt, and gravel).

Shown in Fig. 3 is a plot of the longitudinal slip versus time on the simulated surfaces. The prototype model was accelerated from rest to a velocity of 1.25 m/s in 1.2 seconds. Initially, under high acceleration, the percentage slip is high. Gradually, as the final speed is approached, the rate of acceleration decreases and the corresponding percentage slip values decrease until they are near zero at constant vehicle velocity.

IV. CONCLUSION

We have observed in simulation and experimentation that wheel slip differs from terrain to terrain. We have also found that wheel noise and lateral and vertical vehicle acceleration can be characteristic of the terrain type. Our immediate future work will be to develop computer programs to store data from the relevant ATRV-Jr sensors. We are in the process of building a test bed for outdoor experimentation (see the schematic in Fig. 4). One thing we hope to learn is the relationship between robot-terrain interaction and robot speed. After using slip to classify terrains we will attempt to make the classification algorithm more robust by incorporating data from audio sensors and accelerometers.

ACKNOWLEDGMENT

We would like to thank E. Collins and P. Hollis of the FAMU-FSU College of Engineering for their support and assistance. This research was sponsored by General Dynamics Robotic Systems (GDRS) under the Collaborative Technology Alliance (CTA) project.

REFERENCES

[1] http://www.arl.army.mil/wmrd/Tech/ugv-both.pdf
[2] A. Howard and H. Seraji, "Vision-based terrain characterization and traversability assessment", Journal of Robotic Systems, vol. 18, no. 10, pp. 577-587, 2001.
[3] H. Seraji and A. Howard, "Behavior-based robot navigation on challenging terrain: A fuzzy logic approach", IEEE Transactions on Robotics and Automation, vol. 18, no. 3, pp. 308-321, 2002.
[4] A. Howard, H. Seraji, and E. Tunstel, "A rule-based fuzzy traversability index for mobile robot navigation", Proceedings of the 2001 IEEE International Conference on Robotics and Automation, Seoul, South Korea, vol. 3, pp. 3067-3071.
[5] E. Tunstel, A. Howard, and H. Seraji, "Fuzzy rule-based reasoning for rover safety and survivability", Proceedings of the 2001 IEEE International Conference on Robotics and Automation, Seoul, South Korea, vol. 2, pp. 1413-1420.
[6] K. Iagnemma and S. Dubowsky, "Terrain estimation for high-speed rough-terrain autonomous vehicle navigation", Proceedings of the SPIE Conference on Unmanned Ground Vehicle Technology IV, pp. 256-266, 2002.
[7] K. Iagnemma, H. Shibly, and S. Dubowsky, "On-line terrain parameter estimation for planetary rovers", Proceedings of the 2002 IEEE International Conference on Robotics and Automation, Washington, DC, vol. 3, pp. 3142-3147.
[8] http://www.adams.com/
[9] D. J. Parkka, Equation Directory for the Reconstructionist, Institute of Police Technology and Management, 2nd ed., 1996.
[10] J. Y. Wong, Theory of Ground Vehicles, John Wiley & Sons, 3rd ed., 2001.