
Tracking Sensor Location by Video Analysis in Double-Shell Tank Inspections.

Authors :
Price, Jacob
Aaberg, Ethan
Mo, Changki
Miller, John
Source :
Applied Sciences (2076-3417); Aug 2023, Vol. 13 Issue 15, p8708, 10p
Publication Year :
2023

Abstract

Featured Application: Defects in the bottom of the primary tank are mapped from data collected by an ultrasonic sensor pulled through narrow air slots in the space between the primary and secondary tanks. The location of the data collection points is critical to map accuracy. Laboratory experiments suggest that analyzing video from cameras on the front and back crawlers moving the sensor can track the sensor's location with sufficient accuracy for map creation.

Double-shell tanks (DSTs) are a critical part of the infrastructure for nuclear waste management at the U.S. Department of Energy's Hanford site. They are expected to be used for the interim storage of partially liquid nuclear waste until 2050, the target date for completing the immobilization of all Hanford nuclear waste. At that time, the DSTs will have been in service about 15 years beyond their original projected lifetime. Consequently, for roughly the next 30 years, Hanford DSTs will undergo periodic nondestructive evaluation (NDE) to ensure their integrity. One approach to performing NDE uses ultrasonic data from a robot moving through air slots, originally designed for cooling, in the confined space between the primary and secondary tanks. Interpreting the ultrasonic sensor output requires knowing where measurements were taken to a precision of approximately one inch. Analyzing video acquired during inspection is one approach to tracking sensor location. The top edge of an air slot is easily detected because of the difference in color and texture between the primary tank bottom and the air slot walls. A line fit to this edge is used in a model that calculates the apparent width of the air slot, in pixels, at recognizable targets near the top edge. The apparent width of the air slot at the chosen target in a later video frame then determines how far the robot has moved between the two frames. Algorithms have been developed that automate target selection and matching in later frames. Tests in a laboratory mockup demonstrated that the method tracks the location of the ultrasonic sensor with the required precision. [ABSTRACT FROM AUTHOR]
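The width-based tracking idea summarized above lends itself to a simple geometric illustration. The Python sketch below is not the authors' code: it assumes a pinhole-camera model in which the apparent pixel width of the air slot at a fixed target is inversely proportional to the camera's range to that target, and the physical slot width (SLOT_WIDTH_IN), focal length (FOCAL_LENGTH_PX), and the segmentation that produces the binary slot mask are placeholder assumptions.

    # Illustrative sketch only (not the paper's implementation): estimate crawler
    # displacement between two video frames from the apparent pixel width of the
    # air slot at a fixed target, assuming a pinhole-camera relation
    #   width_px = FOCAL_LENGTH_PX * SLOT_WIDTH_IN / range_to_target
    import numpy as np

    SLOT_WIDTH_IN = 3.0        # assumed physical air-slot width (inches), placeholder
    FOCAL_LENGTH_PX = 1200.0   # assumed camera focal length (pixels), placeholder

    def apparent_width_px(mask: np.ndarray, target_col: int) -> float:
        """Apparent slot width in pixels at the target's image column.
        `mask` is a binary image in which slot pixels are True, e.g. from a
        color/texture segmentation of the frame."""
        rows = np.flatnonzero(mask[:, target_col])
        if rows.size == 0:
            raise ValueError("no slot pixels found in this column")
        return float(rows[-1] - rows[0] + 1)

    def displacement_in(width_px_then: float, width_px_now: float) -> float:
        """Distance the crawler has moved toward the target between two frames.
        A positive result means the camera closed the range to the target."""
        range_then = FOCAL_LENGTH_PX * SLOT_WIDTH_IN / width_px_then
        range_now = FOCAL_LENGTH_PX * SLOT_WIDTH_IN / width_px_now
        return range_then - range_now

    if __name__ == "__main__":
        # Toy frame: a synthetic slot region 90 px tall in the earlier frame.
        mask = np.zeros((240, 320), dtype=bool)
        mask[60:150, :] = True
        w_then = apparent_width_px(mask, 160)          # 90 px
        # If the same target appears 120 px wide in a later frame, the crawler
        # has moved roughly 10 inches closer under the assumed parameters.
        print(f"moved {displacement_in(w_then, 120.0):.1f} in")

In the method described in the abstract, the width measurement is tied to the line fitted to the slot's top edge and to calibrated camera parameters; the constant of proportionality above simply stands in for that calibration.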

Details

Language :
English
ISSN :
20763417
Volume :
13
Issue :
15
Database :
Complementary Index
Journal :
Applied Sciences (2076-3417)
Publication Type :
Academic Journal
Accession Number :
169910214
Full Text :
https://doi.org/10.3390/app13158708