Introduction

In almost every planetary surface investigation, visual characterization of the surface from camera imagery is a common initial step [1]. Mission Control is developing a science autonomy system called Autonomous Soil Assessment System: Contextualizing Rocks, Anomalies and Terrains in Exploratory Robotic Science (ASAS-CRATERS). It enables automated surface characterization on planetary missions, benefiting a wide range of science investigations as well as rover navigation. It performs terrain classification and novelty detection using convolutional neural networks, and aggregates the results into data products that support science operations. Built on state-of-the-art algorithms and off-the-shelf computing components, it offers a low-cost means of accelerating tactical decision-making on next-generation commercial lunar missions.

Background and Motivation

Autonomy in Science Operations

Several factors are increasingly driving the need for autonomy in science operations. In traditional Mars rover operations, visual surface characterization and the subsequent analysis and decision-making take place in day-long tactical cycles [2]. Upcoming commercial lunar rover missions will have reduced latency, short lifetimes, and constrained bandwidth shared across several payloads. These constraints will demand rapid tactical decision-making with limited data, leaving little time for analysis, target identification, and decisions. Payload operators may not receive data in a timely fashion or, worse, may not receive some data at all. Autonomous onboard terrain classification offers a way to downlink lightweight data products and reduce the bottleneck in scientific terrain assessment. Autonomous classification and novelty detection also increase the chances of detecting novel or sparse features (e.g., lunar outcrops or pyroclasts) that may otherwise be missed, or not downlinked, when driving and other mission needs are prioritized.

Application to Lunar Geology

While dedicated science instruments that reveal mineralogical and elemental composition improve our understanding of geological processes, a rover's navigation sensors can document the morphology, morphometry, and composition of surface materials, regardless of the primary investigation goals. High-resolution colour images and 3D data from stereo cameras provide information such as the size-frequency distribution and physical characteristics of craters and rocks, as well as regolith properties, all of which offers valuable insight into the geologic setting. To provide a practical output as a science support tool for several types of missions, a classification scheme is being developed that segments a surface image into geological features that are visually distinct based on morphology, tone, and texture; it will be adapted for specific missions. See Figure 1 for a hand-labelled example.

Figure 1: Hand-labelled lunar terrain classification example. Letters indicate crater degradation state (P: Pristine, S: Semi-Degraded, G: Ghost). Right: original Yutu-1 image. Credit: CNSA.

Technology

Algorithms and Software

ASAS-CRATERS comprises three algorithms. First, the terrain classifier is a deep-learning encoder-decoder network that assigns a semantic terrain label to each image pixel. Second, the novelty detector uses a semi-supervised convolutional neural network architecture with an autoencoder module and a binary classifier that work in series. Third, a data aggregator will combine these outputs onto map tiles that are usable by onboard algorithms, lightweight enough for efficient downlink, and suited to faster backroom analysis and integration into GIS tools. See Figure 2 for a conceptual illustration of how ASAS-CRATERS outputs would be aggregated onto map tiles.

Figure 2: Hand-drawn illustration of aggregating terrain classifier data onto orthorectified map tiles.
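To make the intended data flow concrete, the following is a minimal sketch, in Python with PyTorch, of how a small encoder-decoder network could produce per-pixel terrain labels and how those labels could then be summarized into per-tile class fractions in the spirit of Figure 2. The framework, layer sizes, class count, and tile size are illustrative assumptions, not the flight implementation.

# Minimal sketch (PyTorch assumed): an encoder-decoder network that assigns a
# terrain class to every pixel, plus a helper that aggregates the resulting
# class map into per-tile class fractions, as illustrated in Figure 2.
# Layer sizes, class count, and tile size are illustrative placeholders only.
import torch
import torch.nn as nn

class TerrainSegmenter(nn.Module):
    """Toy encoder-decoder that outputs per-pixel class logits."""
    def __init__(self, num_classes: int = 6):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                                    # H/2 x W/2
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                                    # H/4 x W/4
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(16, num_classes, 2, stride=2),   # back to H x W
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))                    # (N, num_classes, H, W)

def aggregate_to_tiles(class_map: torch.Tensor, num_classes: int, tile: int = 64) -> torch.Tensor:
    """Summarize an (H, W) per-pixel class map into per-tile class fractions.

    Returns an (H//tile, W//tile, num_classes) tensor whose last axis holds the
    fraction of pixels in each tile assigned to each class: a lightweight
    product suitable for downlink or onboard use.
    """
    h, w = class_map.shape
    rows, cols = h // tile, w // tile
    fractions = torch.zeros(rows, cols, num_classes)
    for r in range(rows):
        for c in range(cols):
            patch = class_map[r * tile:(r + 1) * tile, c * tile:(c + 1) * tile]
            counts = torch.bincount(patch.flatten(), minlength=num_classes)
            fractions[r, c] = counts.float() / patch.numel()
    return fractions

if __name__ == "__main__":
    model = TerrainSegmenter(num_classes=6)
    image = torch.rand(1, 3, 256, 256)                       # stand-in camera frame
    logits = model(image)                                     # (1, 6, 256, 256)
    class_map = logits.argmax(dim=1)[0]                       # (256, 256) per-pixel labels
    tiles = aggregate_to_tiles(class_map, num_classes=6)      # (4, 4, 6) class fractions
    print(tiles.shape)

In practice, the per-tile fractions (and any novelty flags) would be attached to georeferenced map tiles rather than image-space patches, but the reduction from full-resolution imagery to a small per-tile summary is the same idea.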
In the near future, additional algorithms will be developed to extend the capabilities of this autonomy system, primarily an autonomous instrument targeting capability so that lunar science instruments, particularly those on rovers, can use ASAS-CRATERS to identify and select features for targeting.

Sensors and Computing Hardware

ASAS-CRATERS will be embedded on the Q8S, a high-performance, low-power processor designed by Xiphos Technologies around a Xilinx Zynq UltraScale+ FPGA system-on-chip, which will fly in 2020. The Q8S consumes a minimum of 3 W, has a mass of 90 g, and measures 80 x 80 x 22.3 mm. Embedded on a flight-ready COTS system, ASAS-CRATERS can be rapidly integrated onto payload suites, rovers, and landers, offering low-cost advanced computing capabilities.

Concept of Operations

Prior to deployment on a mission, ASAS-CRATERS' machine learning algorithms will require training on relevant expert-labelled images from lunar and analogue datasets. Once a mission begins, the algorithms will be updated with images collected in situ. ASAS-CRATERS can then be used to classify terrain and detect features during nominal rover operations. The science team can integrate ASAS-CRATERS data products into their terrain analysis and decide whether specific features merit deeper investigation with onboard instruments.
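As an illustration of the pre-mission training and in-situ update steps described above, the following is a minimal sketch (again Python with PyTorch, assumed) of fine-tuning a pretrained per-pixel classifier on a small set of newly labelled images. The dataset handling, hyperparameters, and checkpoint name are hypothetical and stand in for the actual ASAS-CRATERS training pipeline.

# Minimal sketch (PyTorch assumed) of the in-situ update step: a classifier
# pretrained on expert-labelled analogue/lunar images is fine-tuned on a small
# batch of newly labelled in-situ images. Hyperparameters and file names are
# illustrative assumptions, not the actual training pipeline.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

def finetune(model: nn.Module, images: torch.Tensor, labels: torch.Tensor,
             epochs: int = 5, lr: float = 1e-4) -> nn.Module:
    """Fine-tune a pretrained per-pixel classifier on a small labelled set.

    images: (N, 3, H, W) float tensor; labels: (N, H, W) long tensor of class ids.
    """
    loader = DataLoader(TensorDataset(images, labels), batch_size=4, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)   # small lr to preserve pretraining
    criterion = nn.CrossEntropyLoss()                         # per-pixel classification loss
    model.train()
    for _ in range(epochs):
        for batch_images, batch_labels in loader:
            optimizer.zero_grad()
            logits = model(batch_images)                      # (B, num_classes, H, W)
            loss = criterion(logits, batch_labels)
            loss.backward()
            optimizer.step()
    return model

# Usage (assuming the hypothetical TerrainSegmenter sketch from the Algorithms section):
# model = TerrainSegmenter(num_classes=6)
# model.load_state_dict(torch.load("pretrained_analogue.pt"))  # hypothetical checkpoint
# model = finetune(model, in_situ_images, in_situ_labels)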
Use Cases and Benefits

ASAS-CRATERS can benefit science missions in several ways. First, it can support science operations in tactical cycles. Novelty detection can aid scientists who might otherwise miss features or spend valuable time searching for them. The terrain classifier data products are low-dimensional representations that optimize downlink usage, and the classification itself can speed up scientific analysis in rapid tactical cycles, becoming increasingly useful in complex scenes with diverse mineralogy and lithology. Second, for high-priority features, onboard algorithms can perform instrument targeting and data triage for downlink prioritization. Third, as a semantically useful terrain representation, the classifier output can be used by advanced robotics algorithms to enable autonomous and intelligent navigation. Fourth, in human exploration architectures, ASAS-CRATERS can be embedded in crewed systems to provide autonomous capabilities that support astronauts during geological field excursions.

Field Tests and Demonstrations

The terrain classifier was first developed by Mission Control under the Autonomous Soil Assessment System project [3]. In 2019, it was used to classify eight Mars-relevant terrain types in real time at ~15 FPS (see Figure 3 for an example). This was part of field tests in Iceland under SAND-E (Semi-Autonomous Navigation for Detrital Environments), a NASA PSTAR-funded project led by Dr. Ryan Ewing at Texas A&M University.

Figure 3: Classifier output overlaid on a camera image during a SAND-E traverse in the Iceland field tests.

While ASAS-CRATERS is a multi-mission payload, near-term demonstrations are targeted for upcoming lunar missions in 2022 and 2023.

Acknowledgements

The authors would like to thank the Canadian Space Agency (CSA) for funding the development of ASAS-CRATERS and previous technology developments.

References

[1] Francis R. et al. (2014) SpaceOps. DOI: 10.2514/6.2014-1798.
[2] Gaines D. et al. (2016) PlanRob, 115-125.
[3] Faragalli M. et al. (2018) i-SAIRAS.