Parallel Processing and Immersive Visualization of Sonar Point Clouds

[Image: Environmental scientists analyzing sonar data from West Lake Bonney, Antarctica, using the visualization tool inside the CAVE2™ - L. Long, EVL]
Authors: Febretti, A., Richmond, K., Doran, P., Johnson, A.
Publication: IEEE Symposium on Large Data Analysis and Visualization (LDAV) 2014, Paris, France

The investigation of underwater structures and natural features through Autonomous Underwater Vehicles (AUVs) is an expanding field with applications in archaeology, engineering, environmental sciences, and astrobiology. Processing and analyzing the raw sonar data generated by automated surveys is challenging due to complex error sources such as water chemistry, zero-depth variations, inertial navigation errors, and multipath reflections. Furthermore, the complexity of the collected data makes it difficult to perform effective analysis on a standard display; point clouds made up of hundreds of millions to billions of points are not uncommon. Highly interactive, immersive visualization is a desirable tool that researchers can use to improve the quality of a final sonar-based data product. In this paper we present a scalable toolkit for the processing and visualization of sonar point clouds on a cluster-based, large-scale immersive visualization environment. The cluster is used simultaneously as a parallel processing platform that performs sonar beam-tracing of the source raw data and as the rendering driver of the immersive display.

Date: November 9 - November 10, 2014
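To illustrate the kind of cluster-parallel beam-tracing the abstract refers to, the following is a minimal, hypothetical sketch in which each MPI rank traces its own share of raw sonar pings into 3D points. It is not the toolkit described in the paper: the helper names (load_pings, trace_beam), the constant sound-speed assumption, the synthetic data, and the use of mpi4py are all illustrative assumptions.

```python
# Illustrative sketch only: distributing sonar beam-tracing across cluster
# ranks with MPI. Helper names and the constant sound-speed model are
# hypothetical, not taken from the paper's toolkit.
from mpi4py import MPI
import numpy as np


def trace_beam(ping, sound_speed=1450.0):
    """Convert one ping (beam angle, two-way travel time) into an x/y/z point,
    assuming a constant sound-speed profile for simplicity."""
    angle, travel_time = ping
    r = sound_speed * travel_time / 2.0            # one-way slant range
    return np.array([r * np.sin(angle), 0.0, -r * np.cos(angle)])


def load_pings(rank, size):
    """Stand-in for reading this rank's slice of the raw sonar survey."""
    rng = np.random.default_rng(rank)              # synthetic data per rank
    angles = rng.uniform(-0.5, 0.5, 10_000)
    times = rng.uniform(0.01, 0.2, 10_000)
    return np.stack([angles, times], axis=1)


if __name__ == "__main__":
    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    pings = load_pings(rank, size)                 # each node reads its slice
    points = np.array([trace_beam(p) for p in pings])

    # Gather the partial point clouds on rank 0; an in-situ setup could
    # instead hand each rank's points straight to its local renderer.
    clouds = comm.gather(points, root=0)
    if rank == 0:
        cloud = np.concatenate(clouds)
        print(f"traced {cloud.shape[0]} points across {size} ranks")
```

Run, for example, with `mpirun -n 4 python beamtrace_sketch.py`; the same rank-per-node decomposition is what lets a display cluster do the tracing and the rendering on the same machines.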