Without GPS, autonomous vehicles can easily become lost. A new algorithm developed at the California Institute of Technology (Caltech) now allows autonomous vehicles to determine where they are simply by looking at the terrain around them, and for the first time the technology works regardless of seasonal changes to that terrain.
Details of the process were published June 23 in Science Robotics, a journal of the American Association for the Advancement of Science (AAAS).
The general technique, known as visual terrain-relative navigation (VTRN), was first developed in the 1960s. Autonomous vehicles keep track of their position by comparing the terrain near them with high-resolution satellite imagery.
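The article does not describe the team's actual matching method, but the core idea of comparing a ground view against a satellite map can be sketched as template matching. The snippet below is a hypothetical, simplified illustration using normalized cross-correlation over grayscale arrays; the function name and toy data are assumptions, not part of the published work.

```python
import numpy as np

def locate(ground_patch, satellite_map):
    """Hypothetical VTRN-style localization sketch: slide the ground
    patch over the satellite map and return the offset where the
    normalized cross-correlation score is highest."""
    ph, pw = ground_patch.shape
    mh, mw = satellite_map.shape
    centered = ground_patch - ground_patch.mean()
    best_score, best_pos = -np.inf, None
    for y in range(mh - ph + 1):
        for x in range(mw - pw + 1):
            window = satellite_map[y:y + ph, x:x + pw]
            w = window - window.mean()
            denom = np.sqrt((centered ** 2).sum() * (w ** 2).sum())
            score = (centered * w).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score

# Toy satellite map; the "vehicle" sees a crop taken at offset (12, 25).
rng = np.random.default_rng(0)
sat = rng.random((40, 40))
patch = sat[12:20, 25:33]
pos, score = locate(patch, sat)
print(pos)  # (12, 25)
```

In this idealized case the patch matches the map exactly, so the correlation peaks at the true offset; the real difficulty, as the article explains next, is that seasonal changes break exactly this kind of appearance match.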
The problem is that, in order to work, the current generation of VTRN requires the terrain it examines to closely match the images in its database. Anything that alters or obscures the terrain, such as snow cover or fallen leaves, causes the images not to match. Unless the database contains terrain images under every possible condition, VTRN systems are easily confused.
To overcome this challenge, the Caltech team turned to deep learning and artificial intelligence (AI) to remove the seasonal content that hinders current VTRN systems.
The process, developed by Soon-Jo Chung and Anthony Fragoso in collaboration with graduate students Connor Lee and Austin McCoy, uses what is known as "self-supervised learning". While most computer-vision strategies rely on people carefully annotating large data sets to teach an algorithm how to recognize what it sees, this approach instead lets the algorithm teach itself. The AI looks for patterns in images by teasing out details and features that humans would likely miss.
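The article does not specify the team's network or training objective, but self-supervised learning in general means the training signal is derived from the data itself rather than from human labels. A minimal sketch, assuming a tiny denoising autoencoder in plain NumPy, where additive noise stands in for seasonal clutter such as snow or leaves; all names, sizes, and hyperparameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "terrain patches": 200 samples of 16 pixel values each.
data = rng.random((200, 16))

def corrupt(x):
    """Self-supervision signal: the target is the clean patch itself;
    the input is a corrupted copy. Noise stands in for seasonal
    clutter. No human labels are involved."""
    return x + 0.3 * rng.standard_normal(x.shape)

# One-hidden-layer autoencoder trained with plain gradient descent.
W1 = 0.1 * rng.standard_normal((16, 8))
W2 = 0.1 * rng.standard_normal((8, 16))
lr = 0.05

losses = []
for epoch in range(300):
    noisy = corrupt(data)
    h = np.tanh(noisy @ W1)          # encode the corrupted patch
    recon = h @ W2                   # decode back to pixel space
    err = recon - data               # compare against the CLEAN patch
    losses.append((err ** 2).mean())
    # Backpropagation through the two layers.
    gW2 = h.T @ err / len(data)
    gh = err @ W2.T * (1 - h ** 2)   # tanh derivative
    gW1 = noisy.T @ gh / len(data)
    W1 -= lr * gW1
    W2 -= lr * gW2

print(round(losses[-1], 3))
```

The reconstruction loss falls as training proceeds: to reconstruct a clean patch from a corrupted one, the hidden layer must capture features of the underlying terrain rather than of the noise, which is the intuition behind learning season-invariant representations without labeled data.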
In addition to being useful for autonomous vehicles on Earth, the system also has applications for space missions. The entry, descent, and landing (EDL) system on JPL's Mars 2020 Perseverance rover, for example, used VTRN for the first time on the Red Planet to land at Jezero Crater, a site previously considered too hazardous for a safe landing. The team notes that regions of Mars with dramatic seasonal changes, conditions similar to Earth's, are of strong scientific interest, and the new system could allow for improved navigation to support scientific goals, including the search for water.
Next, Fragoso, Lee, and Chung will expand the technology to account for changes in the weather: fog, rain, snow, and so on. If successful, their work could help improve navigation systems for driverless cars.
Source of information: caltech.edu