11 results for "Zurowietz, Martin"
Search Results
2. Making marine image data FAIR
- Author
- Schoening, Timm, Durden, Jennifer M., Faber, Claas, Felden, Janine, Heger, Karl, Hoving, Henk-Jan T., Kiko, Rainer, Köser, Kevin, Krämmer, Christopher, Kwasnitschka, Tom, Möller, Klas Ove, Nakath, David, Naß, Andrea, Nattkemper, Tim W., Purser, Autun, and Zurowietz, Martin
- Published
- 2022
3. Fast visual exploration of mass spectrometry images with interactive dynamic spectral similarity pseudocoloring
- Author
- Wüllems, Karsten, Zurowietz, Annika, Zurowietz, Martin, Schneider, Roland, Bednarz, Hanna, Niehaus, Karsten, and Nattkemper, Tim W.
- Published
- 2021
4. Deep learning-based diatom taxonomy on virtual slides
- Author
- Kloster, Michael, Langenkämper, Daniel, Zurowietz, Martin, Beszteri, Bánk, and Nattkemper, Tim W.
- Published
- 2020
5. A Digital Light Microscopic Method for Diatom Surveys Using Embedded Acid-Cleaned Samples.
- Author
- Burfeid-Castellanos, Andrea M., Kloster, Michael, Beszteri, Sára, Postel, Ute, Spyra, Marzena, Zurowietz, Martin, Nattkemper, Tim W., and Beszteri, Bánk
- Subjects
- DIATOMS, MICROSCOPY, MICROSCOPES, SPECIES diversity, WATER quality, DOMOIC acid, WORKFLOW management
- Abstract
Diatom identification and counting by light microscopy of permanently embedded, acid-cleaned silicate shells (frustules) is a fundamental method in ecological and water quality investigations. Here we present a new variant of this method based on "digital virtual slides" and compare it to the traditional, non-digitized light microscopy workflow on freshwater samples. We analysed three replicate slides taken from each of six benthic samples using two methods: (1) working directly on a light microscope (the "traditional" counting method), and (2) preparing "virtual digital slides" by high-resolution slide scanning and subsequently identifying and labelling individual valves or frustules in a web browser-based image annotation platform (the digital method). Both methods led to comparable results in terms of species richness, diatom indices and diatom community composition. Although counting by digital microscopy was slightly more time-consuming, our experience indicates that the digital workflow can not only improve the transparency and reusability of diatom counts but also increase taxonomic precision. The digital workflow can further be applied to taxonomic inter-expert calibration over the web and to producing training image sets for deep learning-based diatom identification, making it a promising and versatile alternative or extension to traditional light microscopic diatom analyses.
- Published
- 2022
6. An Interactive Visualization for Feature Localization in Deep Neural Networks
- Author
- Zurowietz, Martin and Nattkemper, Tim W.
- Subjects
- Multidisciplinary, explainable deep learning, deep neural network visualization, lcsh:Electronic computers. Computer science, visual analytics, web application, interactive visualization, computer vision, lcsh:QA75.5-76.95
- Abstract
Deep artificial neural networks have become the go-to method for many machine learning tasks. In the field of computer vision, deep convolutional neural networks achieve state-of-the-art performance for tasks such as classification, object detection, or instance segmentation. As deep neural networks become more and more complex, their inner workings become more and more opaque, rendering them a “black box” whose decision making process is no longer comprehensible. In recent years, various methods have been presented that attempt to peek inside the black box and to visualize the inner workings of deep neural networks, with a focus on deep convolutional neural networks for computer vision. These methods can serve as a toolbox to facilitate the design and inspection of neural networks for computer vision and the interpretation of the decision making process of the network. Here, we present the new tool Interactive Feature Localization in Deep neural networks (IFeaLiD) which provides a novel visualization approach to convolutional neural network layers. The tool interprets neural network layers as multivariate feature maps and visualizes the similarity between the feature vectors of individual pixels of an input image in a heat map display. The similarity display can reveal how the input image is perceived by different layers of the network and how the perception of one particular image region compares to the perception of the remaining image. IFeaLiD runs interactively in a web browser and can process even high resolution feature maps in real time by using GPU acceleration with WebGL 2. We present examples from four computer vision datasets with feature maps from different layers of a pre-trained ResNet101. IFeaLiD is open source and available online at https://ifealid.cebitec.uni-bielefeld.de.
- Published
- 2020
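The pixel-wise similarity display described in the IFeaLiD abstract above can be sketched in a few lines of NumPy: pick one pixel of a layer's feature map and compare its feature vector against every other pixel. This is a minimal illustrative sketch only; the similarity measure (cosine here), the `similarity_heatmap` name and the toy feature map are assumptions, not the tool's actual GPU/WebGL 2 implementation.

```python
import numpy as np

def similarity_heatmap(features: np.ndarray, y: int, x: int) -> np.ndarray:
    """Cosine similarity between the feature vector at pixel (y, x) and
    every pixel of an (H, W, C) convolutional feature map."""
    ref = features[y, x]                       # (C,) reference vector
    norms = np.linalg.norm(features, axis=-1)  # (H, W) per-pixel norms
    dots = features @ ref                      # (H, W) dot products
    return dots / (norms * np.linalg.norm(ref) + 1e-12)

# toy stand-in for one network layer: 4x4 pixels, 8 feature channels
rng = np.random.default_rng(0)
fmap = rng.normal(size=(4, 4, 8))
heat = similarity_heatmap(fmap, 1, 2)
print(heat.shape)  # (4, 4): one similarity value per pixel, in [-1, 1]
```

Rendered as a heat map, such a display shows which image regions a given layer "perceives" as similar to the selected region.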
7. Unsupervised Knowledge Transfer for Object Detection in Marine Environmental Monitoring and Exploration
- Author
- Zurowietz, Martin and Nattkemper, Tim Wilhelm
- Abstract
The volume of digital image data collected in marine environmental monitoring and exploration has been growing at rapidly increasing rates in recent years. Computational support is essential for the timely evaluation of this high volume of marine imaging data, but modern techniques such as deep learning often cannot be applied due to the lack of training data. In this paper, we present Unsupervised Knowledge Transfer (UnKnoT), a new method to use the limited amount of training data more efficiently. To avoid time-consuming annotation, it employs a technique we call "scale transfer" together with enhanced data augmentation to reuse existing training data for object detection of the same object classes in new image datasets. We introduce four fully annotated marine image datasets acquired in the same geographical area but with different gear and distances to the sea floor. We evaluate the new method on the four datasets and show that it can greatly improve object detection performance in the relevant cases compared to object detection without knowledge transfer. We conclude with a recommendation for an image acquisition and annotation scheme that ensures good applicability of modern machine learning methods in the field of marine environmental monitoring and exploration.
- Published
- 2020
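The "scale transfer" idea from the UnKnoT abstract above can be illustrated with a minimal sketch: assuming a fixed camera looking straight down at a roughly flat sea floor, the footprint of one pixel grows about linearly with altitude, so source training images can be resized so that objects appear at the target dataset's scale. The function names, the linear-footprint assumption and the nearest-neighbour resize are illustrative; the paper's actual procedure is not reproduced here.

```python
import numpy as np

def scale_transfer_factor(src_altitude_m: float, tgt_altitude_m: float) -> float:
    """With a fixed downward camera, the sea-floor footprint of a pixel
    grows roughly linearly with altitude, so matching object scale means
    resizing source images by altitude_src / altitude_tgt."""
    return src_altitude_m / tgt_altitude_m

def rescale_nearest(img: np.ndarray, factor: float) -> np.ndarray:
    """Nearest-neighbour resize of an (H, W) or (H, W, C) image."""
    h, w = img.shape[:2]
    new_h = max(1, int(round(h * factor)))
    new_w = max(1, int(round(w * factor)))
    rows = (np.arange(new_h) / factor).astype(int).clip(0, h - 1)
    cols = (np.arange(new_w) / factor).astype(int).clip(0, w - 1)
    return img[rows][:, cols]

# source images taken at 4 m altitude, target dataset at 2 m: objects in
# the target appear twice as large, so upscale the source images by 2x
factor = scale_transfer_factor(4.0, 2.0)
patch = np.zeros((100, 120))
print(factor, rescale_nearest(patch, factor).shape)  # 2.0 (200, 240)
```

After rescaling, the existing annotations can be reused (scaled by the same factor) to train a detector for the new dataset without further manual labelling.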
8. MAIA—A machine learning assisted image annotation method for environmental monitoring and exploration
- Author
- Zurowietz, Martin, Langenkämper, Daniel, Hosking, Brett, Ruhl, Henry A., Nattkemper, Tim W., and Sarder, Pinaki
- Abstract
Digital imaging has become one of the most important techniques in environmental monitoring and exploration. In the marine environment, mobile platforms such as autonomous underwater vehicles (AUVs) are now equipped with high-resolution cameras to capture huge collections of images from the seabed. However, the timely evaluation of all these images presents a bottleneck, as tens of thousands or more images can be collected during a single dive. This makes computational support for marine image analysis essential. Computer-aided analysis of environmental images (and marine images in particular) with machine learning algorithms is promising, but challenging and different from other imaging domains because training data and class labels cannot be collected as efficiently and comprehensively as in other areas. In this paper, we present Machine learning Assisted Image Annotation (MAIA), a new image annotation method for environmental monitoring and exploration that overcomes the obstacle of missing training data. The method uses a combination of autoencoder networks and Mask Region-based Convolutional Neural Network (Mask R-CNN), which allows human observers to annotate large image collections much faster than before. We evaluated the method with three marine image datasets featuring different types of background, imaging equipment and object classes. Using MAIA, we were able to annotate objects of interest with an average recall of 84.1%, more than twice as fast as with "traditional" annotation methods, which are purely based on software-supported direct visual inspection and manual annotation. The speed gain increases proportionally with the size of a dataset. The MAIA approach represents a substantial improvement on the path to greater efficiency in the annotation of large benthic image collections.
- Published
- 2018
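MAIA's first stage, as described in the abstract above, uses autoencoder-based novelty detection to propose annotation candidates without any labels: patches the model cannot reconstruct well are unusual and get shown to a human. The sketch below is a linear stand-in for that idea, using PCA reconstruction error instead of a trained autoencoder; the patch sizes, component count and function names are illustrative assumptions, not MAIA's implementation.

```python
import numpy as np

def fit_background_model(patches: np.ndarray, n_components: int = 4):
    """Fit a linear 'autoencoder' (PCA) to flattened image patches."""
    X = patches.reshape(len(patches), -1).astype(float)
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:n_components]  # mean and top principal directions

def novelty_scores(patches: np.ndarray, mu, W) -> np.ndarray:
    """Reconstruction error per patch: what the background model cannot
    reconstruct is unusual and becomes an annotation candidate."""
    X = patches.reshape(len(patches), -1).astype(float)
    recon = (X - mu) @ W.T @ W + mu
    return np.linalg.norm(X - recon, axis=1)

rng = np.random.default_rng(1)
seabed = rng.normal(0.0, 0.05, size=(63, 8, 8))  # near-uniform background
mu, W = fit_background_model(seabed)
candidate = np.ones((1, 8, 8))                   # one bright object patch
scores = novelty_scores(np.concatenate([seabed, candidate]), mu, W)
print(int(np.argmax(scores)))  # 63: the object patch stands out
```

In MAIA, such candidate regions are filtered by human observers and then used to train a Mask R-CNN that proposes further, more precise detections.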
9. Megafauna community assessment of polymetallic-nodule fields with cameras: platform and methodology comparison.
- Author
- Schoening, Timm, Purser, Autun, Langenkämper, Daniel, Suck, Inken, Taylor, James, Cuvelier, Daphne, Lins, Lidia, Simon-Lledó, Erik, Marcon, Yann, Jones, Daniel O. B., Nattkemper, Tim, Köser, Kevin, Zurowietz, Martin, Greinert, Jens, and Gomes-Pereira, Jose
- Subjects
- ECOLOGICAL disturbances, AUTONOMOUS underwater vehicles, MARINE resources, OCEAN bottom, CAMERAS
- Abstract
With the mining of polymetallic nodules from the deep-sea seafloor once more evoking commercial interest, decisions must be taken on how to most efficiently regulate and monitor physical and community disturbance in these remote ecosystems. Image-based approaches allow non-destructive assessment of the abundance of larger fauna to be derived from survey data, with repeat surveys of areas possible to allow time series data collection. At the time of writing, key underwater imaging platforms commonly used to map seafloor fauna abundances are autonomous underwater vehicles (AUVs), remotely operated vehicles (ROVs) and towed camera "ocean floor observation systems" (OFOSs). These systems are highly customisable, with cameras, illumination sources and deployment protocols changing rapidly, even during a survey cruise. In this study, eight image datasets were collected from a discrete area of polymetallic-nodule-rich seafloor by an AUV and several OFOSs deployed at various altitudes above the seafloor. A fauna identification catalogue was used by five annotators to estimate the abundances of 20 fauna categories from the different datasets. Results show that, for many categories of megafauna, differences in image resolution greatly influenced the estimations of fauna abundance determined by the annotators. This is an important finding for the development of future monitoring legislation for these areas. When and if commercial exploitation of these marine resources commences, robust and verifiable standards which incorporate developing technological advances in camera-based monitoring surveys should be key to developing appropriate management regulations for these regions.
- Published
- 2020
10. BIIGLE 2.0 - Browsing and Annotating Large Marine Image Collections
- Author
- Langenkämper, Daniel, Zurowietz, Martin, Schoening, Timm, and Nattkemper, Tim Wilhelm
- Subjects
- Global and Planetary Change, lcsh:QH1-199.5, human computer interaction (HCI), underwater image analysis system, Ocean Engineering, marine biology, Aquatic Science, lcsh:General. Including nature conservation, geographical distribution, Oceanography, image annotation, megafauna, Marine Science, data bases, lcsh:Q, marine imaging, environmental sciences, lcsh:Science, Water Science and Technology
- Abstract
Combining state-of-the-art digital imaging technology with different kinds of marine exploration techniques such as modern autonomous underwater vehicles (AUVs), remotely operated vehicles (ROVs) or other monitoring platforms enables marine imaging on new spatial and/or temporal scales. A comprehensive interpretation of such image collections requires the detection, classification and quantification of objects of interest (OOI) in the images, usually performed by domain experts. However, the data volume and the rich content of the images make support by software tools inevitable. We define requirements for marine image annotation and present our new online tool BIIGLE 2.0. It is developed with a special focus on annotating benthic fauna in marine image collections, with tools customized to increase efficiency and effectiveness in the manual annotation process. The software architecture of the system is described, the special features of BIIGLE 2.0 are illustrated with different use cases, and future developments are discussed.
- Published
- 2017
11. Megafauna community assessment of polymetallic nodule fields with cameras: Platform and methodology comparison.
- Author
- Schoening, Timm, Purser, Autun, Langenkämper, Daniel, Suck, Inken, Taylor, James, Cuvelier, Daphne, Lins, Lidia, Simon-Lledó, Erik, Marcon, Yann, Jones, Daniel O. B., Nattkemper, Tim, Köser, Kevin, Zurowietz, Martin, Gomes-Pereira, Jose, and Greinert, Jens
- Subjects
- ECOLOGICAL disturbances, MARINE resources, OCEAN bottom, CAMERAS, SUBMERSIBLES
- Abstract
With the mining of polymetallic nodules from the deep-sea seafloor again approaching commercial viability, decisions must be taken on how to most efficiently regulate and monitor physical and community disturbance in these remote ecosystems. Image-based approaches allow non-destructive assessment of larger fauna abundances to be derived from survey data, with repeat surveys of areas possible to allow time series data collection. At the time of writing, key underwater imaging platforms commonly used to map seafloor fauna abundances are autonomous underwater vehicles (AUVs), remotely operated vehicles (ROVs) and towed camera ocean floor observation systems (OFOSs). These systems are highly customisable, with mounted cameras, illumination systems and deployment protocols rapidly changing over time, even within survey cruises. In this study, eight image datasets were collected from a discrete area of polymetallic-nodule-rich seafloor by an AUV and several OFOSs deployed at various altitudes above the seafloor. A fauna identification catalogue was used by five annotators to estimate the abundances of 20 fauna categories from the different datasets. Results show that, for many categories of megafauna, differences in image resolution greatly influenced the estimations of fauna abundance determined by the annotators. This is an important finding for the development of future monitoring legislation for these areas. When and if commercial exploitation of these marine resources commences, to ensure best monitoring practice, unambiguous rules on how camera-based monitoring surveys should be conducted, and with what equipment, must be put in place.
- Published
- 2019
Discovery Service for Jio Institute Digital Library