27 results on '"Zurowietz, Martin"'
Search Results
2. Making marine image data FAIR
- Author
-
Schoening, Timm, Durden, Jennifer M., Faber, Claas, Felden, Janine, Heger, Karl, Hoving, Henk-Jan T., Kiko, Rainer, Köser, Kevin, Krämmer, Christopher, Kwasnitschka, Tom, Möller, Klas Ove, Nakath, David, Naß, Andrea, Nattkemper, Tim W., Purser, Autun, and Zurowietz, Martin
- Published
- 2022
- Full Text
- View/download PDF
3. Deep learning–assisted biodiversity assessment in deep-sea benthic megafauna communities: a case study in the context of polymetallic nodule mining
- Author
-
Cuvelier, Daphne, primary, Zurowietz, Martin, additional, and Nattkemper, Tim W., additional
- Published
- 2024
- Full Text
- View/download PDF
4. Report on the Marine Imaging Workshop 2022
- Author
-
Borremans, Catherine, primary, Durden, Jennifer, additional, Schoening, Timm, additional, Curtis, Emma, additional, Adams, Luther, additional, Branzan Albu, Alexandra, additional, Arnaubec, Aurélien, additional, Ayata, Sakina-Dorothée, additional, Baburaj, Reshma, additional, Bassin, Corinne, additional, Beck, Miriam, additional, Bigham, Katharine, additional, Boschen-Rose, Rachel, additional, Collett, Chad, additional, Contini, Matteo, additional, Correa, Paulo, additional, Dominguez-Carrió, Carlos, additional, Dreyfus, Gautier, additional, Duncan, Graeme, additional, Ferrera, Maxime, additional, Foulon, Valentin, additional, Friedman, Ariell, additional, Gaikwad, Santosh, additional, Game, Chloe, additional, Gaytán-Caballero, Adriana, additional, Girard, Fanny, additional, Giusti, Michela, additional, Hanafi-Portier, Mélissa, additional, Howell, Kerry, additional, Hulevata, Iryna, additional, Itiowe, Kiamuke, additional, Jackett, Chris, additional, Jansen, Jan, additional, Karthäuser, Clarissa, additional, Katija, Kakani, additional, Kernec, Maxime, additional, Kim, Gabriel, additional, Kitahara, Marcelo, additional, Langenkämper, Daniel, additional, Langlois, Tim, additional, Lanteri, Nadine, additional, Jianping Li, Claude, additional, Li, Qi-Ran, additional, Liabot, Pierre-Olivier, additional, Lindsay, Dhugal, additional, Loulidi, Ali, additional, Marcon, Yann, additional, Marini, Simone, additional, Marranzino, Ashley, additional, Massot-Campos, Miquel, additional, Matabos, Marjolaine, additional, Menot, Lenaick, additional, Moreno, Bernabé, additional, Morrissey, Marcus, additional, Nakath, David, additional, Nattkemper, Tim, additional, Neufeld, Monika, additional, Obst, Matthias, additional, Olu, Karine, additional, Parimbelli, Alexa, additional, Pasotti, Francesca, additional, Pelletier, Dominique, additional, Perhirin, Margaux, additional, Piechaud, Nils, additional, Pizarro, Oscar, additional, Purser, Autun, additional, Rodrigues, Clara, additional, 
Ceballos Romero, Elena, additional, Schlining, Brian, additional, Song, Yifan, additional, Sosik, Heidi, additional, Sourisseau, Marc, additional, Taormina, Bastien, additional, Taucher, Jan, additional, Thornton, Blair, additional, Van Audenhaege, Loïc, additional, von der Meden, Charles, additional, Wacquet, Guillaume, additional, Williams, Jack, additional, Witting, Kea, additional, and Zurowietz, Martin, additional
- Published
- 2024
- Full Text
- View/download PDF
5. Report on the Marine Imaging Workshop 2022
- Author
-
Borremans, Catherine, Durden, Jennifer, Schoening, Timm, Curtis, Emma, Adams, Luther, Branzan Albu, Alexandra, Arnaubec, Aurélien, Ayata, Sakina-Dorothée, Baburaj, Reshma, Bassin, Corinne, Beck, Miriam, Bigham, Katharine, Boschen-Rose, Rachel, Collett, Chad, Contini, Matteo, Correa, Paulo, Dominguez-Carrió, Carlos, Dreyfus, Gautier, Duncan, Graeme, Ferrera, Maxime, Foulon, Valentin, Friedman, Ariell, Gaikwad, Santosh, Game, Chloe, Gaytán-Caballero, Adriana, Girard, Fanny, Giusti, Michela, Hanafi-Portier, Mélissa, Howell, Kerry, Hulevata, Iryna, Itiowe, Kiamuke, Jackett, Chris, Jansen, Jan, Karthäuser, Clarissa, Katija, Kakani, Kernec, Maxime, Kim, Gabriel, Kitahara, Marcelo, Langenkämper, Daniel, Langlois, Tim, Lanteri, Nadine, Jianping Li, Claude, Li, Qi-Ran, Liabot, Pierre-Olivier, Lindsay, Dhugal, Loulidi, Ali, Marcon, Yann, Marini, Simone, Marranzino, Ashley, Massot-Campos, Miquel, Matabos, Marjolaine, Menot, Lenaick, Moreno, Bernabé, Morrissey, Marcus, Nakath, David, Nattkemper, Tim, Neufeld, Monika, Obst, Matthias, Olu, Karine, Parimbelli, Alexa, Pasotti, Francesca, Pelletier, Dominique, Perhirin, Margaux, Piechaud, Nils, Pizarro, Oscar, Purser, Autun, Rodrigues, Clara, Ceballos Romero, Elena, Schlining, Brian, Song, Yifan, Sosik, Heidi, Sourisseau, Marc, Taormina, Bastien, Taucher, Jan, Thornton, Blair, Van Audenhaege, Loïc, von der Meden, Charles, Wacquet, Guillaume, Williams, Jack, Witting, Kea, and Zurowietz, Martin
- Abstract
Imaging is increasingly used to capture information on the marine environment, thanks to recent improvements in imaging equipment, camera-carrying platforms and data storage. In that context, biologists, geologists, computer specialists and end-users must come together to discuss methods and procedures for optimising the quality and quantity of data collected from images. The 4th Marine Imaging Workshop was organised from 3 to 6 October 2022 in Brest (France) in a hybrid mode. More than a hundred participants were welcomed in person and about 80 people attended the online sessions. The workshop was organised as a single plenary session of presentations followed by discussion sessions. These were based on dynamic polls and open questions that allowed the imaging community's current and future ideas to be recorded. In addition, a whole day was dedicated to practical sessions on image analysis, data standardisation and communication tools. The format of this edition allowed the participation of a wider community, including lower-income countries and early career scientists, all working on laboratory, benthic and pelagic imaging. This article summarises the topics addressed during the workshop, particularly the outcomes of the discussion sessions, for future reference and to make the workshop results available to the public.
- Published
- 2024
6. Deep learning–assisted biodiversity assessment in deep-sea benthic megafauna communities: a case study in the context of polymetallic nodule mining
- Author
-
Cuvelier, Daphne, Zurowietz, Martin, and Nattkemper, Tim W.
- Abstract
Technological developments have facilitated the collection of large amounts of imagery from isolated deep-sea ecosystems such as abyssal nodule fields. Imagery is an extremely valuable monitoring tool in these areas of interest for deep-sea exploitation. However, in order to collect a comprehensive number of species observations, thousands of images need to be analysed, especially where high diversity is combined with low abundances, as is the case in the abyssal nodule fields. As the visual interpretation of large volumes of imagery and the manual extraction of quantitative information are time-consuming and error-prone, computational detection tools may play a key role in lessening this burden. Yet, there is still no established workflow for efficient marine image analysis using deep learning–based computer vision systems for the task of fauna detection and classification. Methods: In this case study, a dataset of 2,100 images from the deep-sea polymetallic nodule fields of the eastern Clarion-Clipperton Fracture Zone from the SO268 expedition (2019) was selected to investigate the potential of machine learning–assisted marine image annotation workflows. The Machine Learning Assisted Image Annotation method (MAIA), provided by the BIIGLE system, was applied to different set-ups trained with manually annotated fauna data. The results computed with the different set-ups were compared to those obtained by trained marine biologists regarding accuracy (i.e. recall and precision) and time. Results: Our results show that MAIA can be applied for general object (i.e. species) detection with satisfactory accuracy (90.1% recall and 13.4% precision) when considered as one intermediate step in a comprehensive annotation workflow. We also investigated the performance for different volumes of training data, MAIA performance tuned for individual morphological groups and the impact of sediment coverage in the training data. Discussion: We conclude that: a) steps must b
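The recall and precision figures quoted above are the standard detection metrics; a minimal sketch of how they are computed, with hypothetical detection counts (not taken from the paper) chosen only to reproduce the reported 90.1%/13.4%:

```python
def recall_precision(true_positives, false_negatives, false_positives):
    """Recall: share of real fauna the detector found.
    Precision: share of the detector's proposals that are real fauna."""
    recall = true_positives / (true_positives + false_negatives)
    precision = true_positives / (true_positives + false_positives)
    return recall, precision

# Hypothetical counts that yield the values reported in the abstract:
r, p = recall_precision(true_positives=901, false_negatives=99, false_positives=5824)
print(f"recall={r:.1%} precision={p:.1%}")  # → recall=90.1% precision=13.4%
```

High recall with low precision is tolerable in this setting because the detector is only an intermediate step: a human annotator reviews and discards the false proposals afterwards.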
- Published
- 2024
- Full Text
- View/download PDF
7. Fast visual exploration of mass spectrometry images with interactive dynamic spectral similarity pseudocoloring
- Author
-
Wüllems, Karsten, Zurowietz, Annika, Zurowietz, Martin, Schneider, Roland, Bednarz, Hanna, Niehaus, Karsten, and Nattkemper, Tim W.
- Published
- 2021
- Full Text
- View/download PDF
8. Deep learning-based diatom taxonomy on virtual slides
- Author
-
Kloster, Michael, Langenkämper, Daniel, Zurowietz, Martin, Beszteri, Bánk, and Nattkemper, Tim W.
- Published
- 2020
- Full Text
- View/download PDF
9. BIIGLE User Meeting 2023 Summary
- Author
-
Zurowietz, Martin, Langenkämper, Daniel, and Nattkemper, Tim W.
- Subjects
BIIGLE
- Abstract
This is the summary of the BIIGLE User Meeting that took place on Feb. 8, 2023 from 2 pm to 5 pm CET.
- Published
- 2023
- Full Text
- View/download PDF
10. A Digital Light Microscopic Method for Diatom Surveys Using Embedded Acid-Cleaned Samples
- Author
-
Burfeid-Castellanos, Andrea M., primary, Kloster, Michael, additional, Beszteri, Sára, additional, Postel, Ute, additional, Spyra, Marzena, additional, Zurowietz, Martin, additional, Nattkemper, Tim W., additional, and Beszteri, Bánk, additional
- Published
- 2022
- Full Text
- View/download PDF
11. First insights on Vazella pourtalesii assemblage dynamics
- Author
-
European Commission, Fisheries and Oceans Canada, Grinyó, Jordi, Aguzzi, Jacopo, García, Rafael, Gracias, Nuno, Nattkemper, Tim W., Zurowietz, Martin, Hanz, Ulrike, Alrashed, Ahmad, Rusconi, Fabio, and Mienis, Furu
- Abstract
In certain areas of the Nova Scotian shelf, Vazella pourtalesii (Hexactinellid) forms dense monospecific aggregations. Vazella creates complex three-dimensional structures, which provide habitat to a wide range of associated species, enhancing local diversity. In 2013, the Canadian authorities established the Sambro Bank Sponge Conservation Area, banning bottom fishing activities. However, this protection zone has not yet been incorporated into a management plan requiring the implementation of monitoring strategies that account for natural variability. To elucidate assemblage dynamics and the response to environmental variability, a NIOZ-designed lander equipped with an HD video camera, ADCP, CT and oxygen sensors was deployed at 154 m depth for 10 months. A total of 5,151 still images were obtained, of which 1,157 were manually analysed with the BIIGLE 2 software and subsequently used to train a mask convolutional neural network. Over 30,000 organisms belonging to 8 different species were identified. An unidentified actinian and the redfish Sebastes fasciatus were the most abundant species, accounting for 93% and 4% of all observed organisms, respectively. Non-metric multidimensional scaling analysis revealed that assemblage composition did not significantly differ between seasons. However, it was observed that sessile taxa abundances decreased during benthic storms that influence the near-bed hydrodynamics. These storms could last for several days, causing high sediment resuspension and the dislodgment or partial burial of sponge and actinian individuals. We also observed that sponge individuals attached to small cobbles moved across the field of view. Moving individuals intermittently appeared lying horizontally on the seafloor or standing in an upright position, within short intervals. Bottom currents may cause Vazella individuals attached to small cobbles to tip and rise progressively, dislodging them from their original position depending on current intensity and direction. Ou
- Published
- 2022
12. Large-Scale Marine Image Annotation in the Age of the Web and Deep Learning
- Author
-
Zurowietz, Martin
- Abstract
Digital imaging has become one of the most important techniques to non-invasively collect data in the field of marine benthic environmental monitoring and exploration. Traditionally, marine imaging data is analyzed by manual image annotation where domain experts mark objects of interest in the images and assign class labels to the marked objects. With technological advances of underwater carrier systems, digital cameras and digital storage technology, the acquisition rate of marine imaging data is rapidly increasing. Traditional purely manual image annotation cannot keep up with the volume of newly acquired data, as the availability of domain experts who can annotate the images is very limited. Hence, new (computational) approaches that increase both the efficiency and effectiveness of marine image annotation are required. In this thesis, BIIGLE 2.0 is presented, which is a web-based application for image annotation with a special focus on marine imaging. BIIGLE 2.0 offers several novel concepts and annotation tools that enable highly efficient manual image annotation. Furthermore, the application architecture of BIIGLE 2.0 allows for a versatile deployment from a mobile single-chip computer in the field up to a large cloud-based stationary setup. The possibility to synchronize annotation data between multiple BIIGLE 2.0 instances and a federated search pave the way for the creation of a powerful collaborative network of annotation systems across research ships, monitoring stations or research institutes. In addition, the Machine learning Assisted Image Annotation method (MAIA) and its extension through Unsupervised Knowledge Transfer (UnKnoT) are presented. MAIA introduces a four-stage image annotation workflow that includes machine learning methods for computer vision to automate the time-consuming task of object detection. This allows human observers to annotate large marine image collections much faster than before.
With UnKnoT, the first two MAIA stages for unsupervised object detection can be skipped for datasets with special properties that are often given in the benthic marine imaging context, accelerating the workflow even more. The combination of BIIGLE 2.0, MAIA and UnKnoT presents an advancement for marine image annotation that integrates manual annotation with specialized software, automated computer assistance and a sophisticated user interface for a highly efficient and effective annotation process. In addition, the method and tool Interactive Feature Localization in Deep neural networks (IFeaLiD) is presented, which offers a novel way for the inspection of convolutional neural networks for computer vision. IFeaLiD can be used, among other objectives, to judge the suitability of a particular trained network for a specific task such as object detection in the marine imaging context.
- Published
- 2022
- Full Text
- View/download PDF
13. Current Trends and Future Directions of Large Scale Image and Video Annotation: Observations From Four Years of BIIGLE 2.0
- Author
-
Zurowietz, Martin, primary and Nattkemper, Tim W., additional
- Published
- 2021
- Full Text
- View/download PDF
14. An Interactive Visualization for Feature Localization in Deep Neural Networks
- Author
-
Zurowietz, Martin and Nattkemper, Tim W.
- Subjects
Multidisciplinary, explainable deep learning, deep neural network visualization, lcsh:Electronic computers. Computer science, visual analytics, web application, interactive visualization, computer vision, lcsh:QA75.5-76.95
- Abstract
Deep artificial neural networks have become the go-to method for many machine learning tasks. In the field of computer vision, deep convolutional neural networks achieve state-of-the-art performance for tasks such as classification, object detection, or instance segmentation. As deep neural networks become more and more complex, their inner workings become more and more opaque, rendering them a “black box” whose decision making process is no longer comprehensible. In recent years, various methods have been presented that attempt to peek inside the black box and to visualize the inner workings of deep neural networks, with a focus on deep convolutional neural networks for computer vision. These methods can serve as a toolbox to facilitate the design and inspection of neural networks for computer vision and the interpretation of the decision making process of the network. Here, we present the new tool Interactive Feature Localization in Deep neural networks (IFeaLiD) which provides a novel visualization approach to convolutional neural network layers. The tool interprets neural network layers as multivariate feature maps and visualizes the similarity between the feature vectors of individual pixels of an input image in a heat map display. The similarity display can reveal how the input image is perceived by different layers of the network and how the perception of one particular image region compares to the perception of the remaining image. IFeaLiD runs interactively in a web browser and can process even high resolution feature maps in real time by using GPU acceleration with WebGL 2. We present examples from four computer vision datasets with feature maps from different layers of a pre-trained ResNet101. IFeaLiD is open source and available online at https://ifealid.cebitec.uni-bielefeld.de.
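The per-pixel similarity display described above can be sketched in a few lines. This is a simplified illustration, assuming cosine similarity (the abstract says "similarity between feature vectors" without naming the exact metric) over a layer activation of shape (H, W, C):

```python
import numpy as np

def similarity_heatmap(feature_map, y, x):
    """Cosine similarity between the feature vector at pixel (y, x)
    and the feature vectors of all pixels in a (H, W, C) layer
    activation, returned as a (H, W) heat map."""
    H, W, C = feature_map.shape
    flat = feature_map.reshape(-1, C).astype(np.float64)
    ref = flat[y * W + x]
    norms = np.linalg.norm(flat, axis=1) * np.linalg.norm(ref)
    sims = (flat @ ref) / np.maximum(norms, 1e-12)  # avoid division by zero
    return sims.reshape(H, W)

# Toy 4x4 feature map with 8 channels; the reference pixel is maximally
# similar to itself, so the heat map peaks at (1, 2).
fm = np.random.default_rng(0).normal(size=(4, 4, 8))
heat = similarity_heatmap(fm, 1, 2)
print(round(float(heat[1, 2]), 6))  # → 1.0
```

IFeaLiD itself performs this computation on the GPU via WebGL 2, which is what makes the display interactive even for high-resolution feature maps; the NumPy version above only illustrates the underlying arithmetic.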
- Published
- 2020
- Full Text
- View/download PDF
15. Current Trends and Future Directions of Large Scale Image and Video Annotation: Observations From Four Years of BIIGLE 2.0
- Author
-
Zurowietz, Martin and Nattkemper, Tim W.
- Abstract
Marine imaging has evolved from small, narrowly focussed applications to large-scale applications covering areas of several hundred square kilometers or time series covering observation periods of several months. The analysis and interpretation of the accumulating large volume of digital images or videos will continue to challenge the marine science community to keep this process efficient and effective. It is safe to say that any strategy will rely on some software platform supporting manual image and video annotation, either for a direct manual annotation-based analysis or for collecting training data to deploy a machine learning–based approach for (semi-)automatic annotation. This paper describes how computer-assisted manual full-frame image and video annotation is currently performed in marine science and how it can evolve to keep up with the increasing demand for image and video annotation and the growing volume of imaging data. As an example, observations are presented on how the image and video annotation tool BIIGLE 2.0 has been used by an international community of more than one thousand users in the last four years. In addition, new features and tools are presented to show how BIIGLE 2.0 has evolved over the same time period: video annotation, support for large images in the gigapixel range, machine learning assisted image annotation, improved mobility and affordability, application instance federation and enhanced label tree collaboration. The observations indicate that, despite novel concepts and tools introduced by BIIGLE 2.0, full-frame image and video annotation is still mostly done in the same way as two decades ago, when single users annotated subsets of image collections or single video frames with limited computational support. We encourage researchers to review their protocols for education and annotation, making use of newer technologies and tools to improve the efficiency and effectiveness of image and video annotation in marine science.
- Published
- 2021
- Full Text
- View/download PDF
16. Corrigendum: BIIGLE 2.0 - Browsing and Annotating Large Marine Image Collections
- Author
-
Langenkämper, Daniel, primary, Zurowietz, Martin, additional, Schoening, Timm, additional, and Nattkemper, Tim W., additional
- Published
- 2020
- Full Text
- View/download PDF
17. Megafauna community assessment of polymetallic-nodule fields with cameras: platform and methodology comparison
- Author
-
Schoening, Timm, primary, Purser, Autun, additional, Langenkämper, Daniel, additional, Suck, Inken, additional, Taylor, James, additional, Cuvelier, Daphne, additional, Lins, Lidia, additional, Simon-Lledó, Erik, additional, Marcon, Yann, additional, Jones, Daniel O. B., additional, Nattkemper, Tim, additional, Köser, Kevin, additional, Zurowietz, Martin, additional, Greinert, Jens, additional, and Gomes-Pereira, Jose, additional
- Published
- 2020
- Full Text
- View/download PDF
18. Unsupervised Knowledge Transfer for Object Detection in Marine Environmental Monitoring and Exploration
- Author
-
Zurowietz, Martin and Nattkemper, Tim Wilhelm
- Abstract
The volume of digital image data collected in the field of marine environmental monitoring and exploration has been growing at rapidly increasing rates in recent years. Computational support is essential for the timely evaluation of the high volume of marine imaging data, but often modern techniques such as deep learning cannot be applied due to the lack of training data. In this paper, we present Unsupervised Knowledge Transfer (UnKnoT), a new method to use the limited amount of training data more efficiently. In order to avoid time-consuming annotation, it employs a technique we call "scale transfer" and enhanced data augmentation to reuse existing training data for object detection of the same object classes in new image datasets. We introduce four fully annotated marine image datasets acquired in the same geographical area but with different gear and distance to the sea floor. We evaluate the new method on the four datasets and show that it can greatly improve the object detection performance in the relevant cases compared to object detection without knowledge transfer. We conclude with a recommendation for an image acquisition and annotation scheme that ensures a good applicability of modern machine learning methods in the field of marine environmental monitoring and exploration.
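The abstract does not spell out the "scale transfer" procedure. As a loose illustration of the underlying idea (reusing annotations across datasets acquired at different camera altitudes, where the apparent pixel size of the same fauna changes), one might rescale annotations by the altitude ratio. The function names and the linear altitude-to-resolution model below are assumptions, not UnKnoT's actual implementation:

```python
def scale_factor(source_altitude_m, target_altitude_m):
    """Ground footprint per pixel grows roughly linearly with camera
    altitude (hypothetical simplification), so matching the target
    dataset's apparent object size means resizing by the altitude ratio."""
    return source_altitude_m / target_altitude_m

def transfer_annotation(x, y, w, h, factor):
    """Rescale a source bounding box (and, implicitly, its image) so
    that reused training data matches object sizes in the new dataset."""
    return (x * factor, y * factor, w * factor, h * factor)

# A box annotated at 8 m altitude, reused for imagery taken at 4 m:
f = scale_factor(source_altitude_m=8.0, target_altitude_m=4.0)
print(transfer_annotation(100, 50, 20, 10, f))  # → (200.0, 100.0, 40.0, 20.0)
```

The point of such a step is that a detector trained on the rescaled data sees objects at roughly the pixel scale it will encounter in the new dataset, without any new manual annotation.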
- Published
- 2020
19. Megafauna community assessment with cameras: Platform, annotator and methodology comparison
- Author
-
Schoening, Timm, Purser, Autun, Langenkämper, Daniel, Suck, Inken, Taylor, James, Cuvelier, Daphne, Lins, Lidia, Simon-Lledó, Erik, Marcon, Yann, Jones, Daniel O. B., Nattkemper, Tim W., Köser, Kevin, Zurowietz, Martin, Nuno Gomez-Pereira, Jose, and Greinert, Jens
- Published
- 2020
20. Megafauna community assessment of polymetallic-nodule fields with cameras: platform and methodology comparison
- Author
-
Schoening, Timm, Purser, Autun, Langenkämper, Daniel, Suck, Inken, Taylor, James, Cuvelier, Daphne, Lins, Lidia, Simon-Lledo, Erik, Marcon, Yann, Jones, Daniel O. B., Nattkemper, Tim, Köser, Kevin, Zurowietz, Martin, Greinert, Jens, and Gomes-Pereira, Jose
- Abstract
With the mining of polymetallic nodules from the deep-sea seafloor once more evoking commercial interest, decisions must be taken on how to most efficiently regulate and monitor physical and community disturbance in these remote ecosystems. Image-based approaches allow non-destructive assessment of the abundance of larger fauna to be derived from survey data, with repeat surveys of areas possible to allow time series data collection. At the time of writing, key underwater imaging platforms commonly used to map seafloor fauna abundances are autonomous underwater vehicles (AUVs), remotely operated vehicles (ROVs) and towed camera “ocean floor observation systems” (OFOSs). These systems are highly customisable, with cameras, illumination sources and deployment protocols changing rapidly, even during a survey cruise. In this study, eight image datasets were collected from a discrete area of polymetallic-nodule-rich seafloor by an AUV and several OFOSs deployed at various altitudes above the seafloor. A fauna identification catalogue was used by five annotators to estimate the abundances of 20 fauna categories from the different datasets. Results show that, for many categories of megafauna, differences in image resolution greatly influenced the estimations of fauna abundance determined by the annotators. This is an important finding for the development of future monitoring legislation for these areas. When and if commercial exploitation of these marine resources commences, robust and verifiable standards which incorporate developing technological advances in camera-based monitoring surveys should be key to developing appropriate management regulations for these regions.
- Published
- 2020
21. Unsupervised Knowledge Transfer for Object Detection in Marine Environmental Monitoring and Exploration
- Author
-
Zurowietz, Martin, primary and Nattkemper, Tim W., additional
- Published
- 2020
- Full Text
- View/download PDF
22. MAIA—A machine learning assisted image annotation method for environmental monitoring and exploration
- Author
-
Zurowietz, Martin, Langenkämper, Daniel, Hosking, Brett, Ruhl, Henry, and Nattkemper, Tim Wilhelm
- Subjects
Computer and Information Sciences, Neural Networks, Databases, Factual, Imaging Techniques, Image Processing, Oceans and Seas, Computer Vision, Marine and Aquatic Sciences, Equipment, lcsh:Medicine, Marine Biology, Research and Analysis Methods, Remote Sensing, Machine Learning, Deep Learning, Artificial Intelligence, Marine Monitoring, Image Processing, Computer-Assisted, Computer Imaging, lcsh:Science, Data Curation, Imaging Equipment, Ecology and Environmental Sciences, lcsh:R, Aquatic Environments, Biology and Life Sciences, Marine Environments, Signal Processing, Earth Sciences, Engineering and Technology, lcsh:Q, Neural Networks, Computer, Research Article, Neuroscience, Environmental Monitoring
- Abstract
Digital imaging has become one of the most important techniques in environmental monitoring and exploration. In the case of the marine environment, mobile platforms such as autonomous underwater vehicles (AUVs) are now equipped with high-resolution cameras to capture huge collections of images from the seabed. However, the timely evaluation of all these images presents a bottleneck problem as tens of thousands or more images can be collected during a single dive. This makes computational support for marine image analysis essential. Computer-aided analysis of environmental images (and marine images in particular) with machine learning algorithms is promising, but challenging and different to other imaging domains because training data and class labels cannot be collected as efficiently and comprehensively as in other areas. In this paper, we present Machine learning Assisted Image Annotation (MAIA), a new image annotation method for environmental monitoring and exploration that overcomes the obstacle of missing training data. The method uses a combination of autoencoder networks and Mask Region-based Convolutional Neural Network (Mask R-CNN), which allows human observers to annotate large image collections much faster than before. We evaluated the method with three marine image datasets featuring different types of background, imaging equipment and object classes. Using MAIA, we were able to annotate objects of interest with an average recall of 84.1%, more than twice as fast as with "traditional" annotation methods, which are purely based on software-supported direct visual inspection and manual annotation. The speed gain increases proportionally with the size of a dataset. The MAIA approach represents a substantial improvement on the path to greater efficiency in the annotation of large benthic image collections.
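The autoencoder stage of MAIA proposes candidate regions before any labels exist. A common way to turn an autoencoder into such a novelty detector is per-patch reconstruction error: patches the network reconstructs poorly are unusual against the seabed background and become object proposals. The sketch below shows only this general scoring technique, not MAIA's exact criterion:

```python
import numpy as np

def novelty_scores(patches, reconstructions):
    """Mean squared reconstruction error per image patch. An autoencoder
    trained mostly on background reconstructs background well, so a high
    score flags a patch as a potential object of interest."""
    diff = patches - reconstructions
    return (diff ** 2).reshape(len(patches), -1).mean(axis=1)

# Toy example with 3 "patches" of 2 pixels each: the second patch
# deviates most from its reconstruction and would be proposed first.
patches = np.array([[0.1, 0.1], [0.9, 0.1], [0.2, 0.2]])
recon   = np.array([[0.1, 0.1], [0.1, 0.1], [0.2, 0.3]])
scores = novelty_scores(patches, recon)
print(int(scores.argmax()))  # → 1
```

In the full MAIA workflow these proposals are then filtered and refined (ultimately with a Mask R-CNN trained on confirmed examples), so the unsupervised stage only needs to be sensitive, not precise.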
- Published
- 2018
23. MAIA—A machine learning assisted image annotation method for environmental monitoring and exploration
- Author
-
Zurowietz, Martin, primary, Langenkämper, Daniel, additional, Hosking, Brett, additional, Ruhl, Henry A., additional, and Nattkemper, Tim W., additional
- Published
- 2018
- Full Text
- View/download PDF
24. MAIA—A machine learning assisted image annotation method for environmental monitoring and exploration
- Author
-
Sarder, Pinaki, Zurowietz, Martin, Langenkämper, Daniel, Hosking, Brett, Ruhl, Henry A., and Nattkemper, Tim W.
- Abstract
Digital imaging has become one of the most important techniques in environmental monitoring and exploration. In the case of the marine environment, mobile platforms such as autonomous underwater vehicles (AUVs) are now equipped with high-resolution cameras to capture huge collections of images from the seabed. However, the timely evaluation of all these images presents a bottleneck problem as tens of thousands or more images can be collected during a single dive. This makes computational support for marine image analysis essential. Computer-aided analysis of environmental images (and marine images in particular) with machine learning algorithms is promising, but challenging and different to other imaging domains because training data and class labels cannot be collected as efficiently and comprehensively as in other areas. In this paper, we present Machine learning Assisted Image Annotation (MAIA), a new image annotation method for environmental monitoring and exploration that overcomes the obstacle of missing training data. The method uses a combination of autoencoder networks and Mask Region-based Convolutional Neural Network (Mask R-CNN), which allows human observers to annotate large image collections much faster than before. We evaluated the method with three marine image datasets featuring different types of background, imaging equipment and object classes. Using MAIA, we were able to annotate objects of interest with an average recall of 84.1%, more than twice as fast as with “traditional” annotation methods, which are purely based on software-supported direct visual inspection and manual annotation. The speed gain increases proportionally with the size of a dataset. The MAIA approach represents a substantial improvement on the path to greater efficiency in the annotation of large benthic image collections.
- Published
- 2018
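The unsupervised candidate-proposal stage described in the MAIA abstract above can be illustrated with a minimal sketch. A truncated SVD (a linear autoencoder) stands in for MAIA's deep autoencoder networks, and the model is fitted on a sample of object-free background patches for simplicity; the function names, patch shapes and parameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fit_background_model(background, n_components=8):
    """Fit a linear 'autoencoder' (truncated SVD) to background patches."""
    X = background.reshape(len(background), -1).astype(float)
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:n_components]

def propose_candidates(patches, mu, V, percentile=95.0):
    """Flag patches the background model reconstructs poorly.

    Patches with unusually high reconstruction error are likely to
    contain objects of interest rather than uniform seabed, so they
    are proposed to the human annotator for review.
    """
    X = patches.reshape(len(patches), -1).astype(float)
    recon = (X - mu) @ V.T @ V + mu          # encode, then decode
    err = np.linalg.norm(X - recon, axis=1)  # per-patch residual
    return np.where(err > np.percentile(err, percentile))[0]

# Synthetic demo: 95 flat seabed patches plus 5 with a bright object.
rng = np.random.default_rng(0)
background = rng.normal(0.5, 0.01, size=(200, 8, 8))
patches = rng.normal(0.5, 0.01, size=(100, 8, 8))
patches[95:, 2:6, 2:6] += 0.8               # the "objects"
mu, V = fit_background_model(background)
candidates = propose_candidates(patches, mu, V)
```

In this synthetic demo the five object patches exceed the 95th-percentile error threshold and are the only ones proposed; in MAIA, such proposals would then seed training of the Mask R-CNN stage.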
25. BIIGLE 2.0 - Browsing and Annotating Large Marine Image Collections
- Author
-
Langenkämper, Daniel, Zurowietz, Martin, Schoening, Timm, and Nattkemper, Tim Wilhelm
- Subjects
Global and Planetary Change ,lcsh:QH1-199.5 ,human computer interaction (HCI) ,underwater image analysis system ,Ocean Engineering ,marine biology ,Aquatic Science ,lcsh:General. Including nature conservation, geographical distribution ,Oceanography ,image annotation ,megafauna ,Marine Science ,data bases ,lcsh:Q ,marine imaging ,environmental sciences ,lcsh:Science ,Water Science and Technology - Abstract
Combining state-of-the-art digital imaging technology with different kinds of marine exploration techniques, such as modern autonomous underwater vehicles (AUVs), remotely operated vehicles (ROVs) or other monitoring platforms, enables marine imaging on new spatial and/or temporal scales. A comprehensive interpretation of such image collections requires the detection, classification and quantification of objects of interest (OOI) in the images, usually performed by domain experts. However, the data volume and the rich content of the images make support by software tools inevitable. We define some requirements for marine image annotation and present our new online tool BIIGLE 2.0. It is developed with a special focus on annotating benthic fauna in marine image collections, with tools customized to increase efficiency and effectiveness in the manual annotation process. The software architecture of the system is described, the special features of BIIGLE 2.0 are illustrated with different use cases, and future developments are discussed.
- Published
- 2017
- Full Text
- View/download PDF
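Annotation tools like BIIGLE 2.0 typically let users export their manual annotations as tabular reports for downstream analysis. A minimal sketch of aggregating such a report into per-label annotation counts; the column names (`image_filename`, `label_name`) are hypothetical placeholders, not the tool's actual export schema.

```python
import csv
import io
from collections import Counter

def label_abundances(report_csv):
    """Count annotations per label in a CSV annotation report.

    Assumes one row per annotation with a 'label_name' column; this
    schema is hypothetical and must be adapted to the real export.
    """
    counts = Counter()
    for row in csv.DictReader(io.StringIO(report_csv)):
        counts[row["label_name"]] += 1
    return counts

# Tiny inline example report (contents invented for illustration).
report = """image_filename,label_name
img_001.jpg,Holothuroidea
img_001.jpg,Ophiuroidea
img_002.jpg,Holothuroidea
"""
counts = label_abundances(report)
```

Counting per label rather than per image is the first step toward the OOI quantification the abstract describes; grouping by `image_filename` instead would give per-image occurrence data.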
26. BIIGLE 2.0 - Browsing and Annotating Large Marine Image Collections
- Author
-
Langenkämper, Daniel, primary, Zurowietz, Martin, additional, Schoening, Timm, additional, and Nattkemper, Tim W., additional
- Published
- 2017
- Full Text
- View/download PDF
27. Megafauna community assessment of polymetallic nodule fields with cameras: Platform and methodology comparison.
- Author
-
Schoening, Timm, Purser, Autun, Langenkämper, Daniel, Suck, Inken, Taylor, James, Cuvelier, Daphne, Lins, Lidia, Simon-Lledó, Erik, Marcon, Yann, Jones, Daniel O. B., Nattkemper, Tim, Köser, Kevin, Zurowietz, Martin, Gomes-Pereira, Jose, and Greinert, Jens
- Subjects
ECOLOGICAL disturbances ,MARINE resources ,OCEAN bottom ,CAMERAS ,SUBMERSIBLES - Abstract
With the mining of polymetallic nodules from the deep-sea seafloor again approaching commercial viability, decisions must be taken on how to most efficiently regulate and monitor physical and community disturbance in these remote ecosystems. Image-based approaches allow non-destructive assessment of larger fauna abundances to be derived from survey data, with repeat surveys of areas possible to allow time series data collection. At the time of writing, the key underwater imaging platforms commonly used to map seafloor fauna abundances are Autonomous Underwater Vehicles (AUVs), Remotely Operated Vehicles (ROVs) and towed camera Ocean Floor Observation Systems (OFOSs). These systems are highly customisable, with mounted cameras, illumination systems and deployment protocols rapidly changing over time, and even within survey cruises. In this study, 8 image datasets were collected from a discrete area of polymetallic nodule-rich seafloor by an AUV and several OFOSs deployed at various altitudes above the seafloor. A fauna identification catalogue was used by 5 annotators to estimate the abundances of 20 fauna categories from the different datasets. Results show that, for many categories of megafauna, differences in image resolution greatly influenced the estimations of fauna abundance determined by the annotators. This is an important finding for the development of future monitoring legislation for these areas. When and if commercial exploitation of these marine resources commences, unambiguous rules on how camera-based monitoring surveys should be conducted, and with what equipment, must be put in place to ensure best monitoring practice. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
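The abstract above notes that deployment altitude changes what each image covers, which matters when converting annotation counts into abundance estimates. A minimal sketch of that conversion, assuming a flat seabed, a downward-facing pinhole camera with no tilt or lens distortion, and illustrative field-of-view values; none of this reproduces the study's actual survey geometry.

```python
import math

def image_footprint_m2(altitude_m, fov_h_deg, fov_v_deg):
    """Seafloor area covered by one image, assuming a flat seabed and
    a downward-facing pinhole camera (no lens distortion or tilt)."""
    width = 2 * altitude_m * math.tan(math.radians(fov_h_deg) / 2)
    height = 2 * altitude_m * math.tan(math.radians(fov_v_deg) / 2)
    return width * height

def abundance_per_m2(counts, altitude_m, fov_h_deg=60.0, fov_v_deg=45.0):
    """Convert per-image annotation counts to individuals per m^2.

    Field-of-view defaults are illustrative; real surveys calibrate
    the footprint per camera, or measure it with laser scale points.
    """
    area = image_footprint_m2(altitude_m, fov_h_deg, fov_v_deg)
    return {taxon: n / area for taxon, n in counts.items()}

# Example: 4 holothurians annotated in one image taken at 2 m altitude.
density = abundance_per_m2({"Holothuroidea": 4}, altitude_m=2.0)
```

Because the footprint grows with the square of altitude, the same annotation count at a higher altitude yields a much lower density estimate, one reason the abstract stresses standardised survey protocols.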