
Constrained Active Classification Using Partially Observable Markov Decision Processes

Authors:
Wu, Bo
Lauffer, Niklas
Ahmadi, Mohamadreza
Bharadwaj, Suda
Xu, Zhe
Topcu, Ufuk
Publication Year:
2020

Abstract

In this work, we study the problem of actively classifying the attributes of dynamical systems characterized as a finite set of Markov decision process (MDP) models. We are interested in finding strategies that actively interact with the dynamical system and observe its reactions so that the attribute of interest is classified efficiently with high confidence. We present a decision-theoretic framework based on partially observable Markov decision processes (POMDPs). The proposed framework relies on assigning a classification belief (a probability distribution) to the attributes of interest. Given an initial belief, a confidence level over which a classification decision can be made, a cost bound, safe belief sets, and a finite time horizon, we compute POMDP strategies leading to classification decisions. We present three different algorithms to compute such strategies. The first algorithm computes the optimal strategy exactly by value iteration. To overcome the computational complexity of computing the exact solutions, we propose a second algorithm based on adaptive sampling and a third based on Monte Carlo tree search to approximate the optimal probability of reaching a classification decision. We illustrate the proposed methodology using examples from medical diagnosis, security surveillance, and wildlife classification.

Comment: arXiv admin note: substantial text overlap with arXiv:1810.00097
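The core decision rule described in the abstract, maintaining a classification belief over candidate attributes and committing to a decision once the belief crosses a confidence level, can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the two-attribute setup, the observation likelihood values, and the 0.95 confidence threshold are hypothetical, and the sketch omits the cost bound, safe belief sets, and the strategy computation via value iteration, adaptive sampling, or Monte Carlo tree search.

import numpy as np

def update_belief(belief, likelihoods):
    # Bayes update: likelihoods[i] = P(observation | attribute i) for the action taken.
    posterior = belief * likelihoods
    return posterior / posterior.sum()

def classify(belief, confidence=0.95):
    # Declare attribute i once its belief exceeds the confidence level; otherwise keep probing.
    best = int(np.argmax(belief))
    return best if belief[best] >= confidence else None

# Two candidate attributes with a uniform initial belief (hypothetical setup).
belief = np.array([0.5, 0.5])

# Hypothetical observation likelihoods gathered over three interactions with the system.
for likelihoods in [np.array([0.8, 0.3]), np.array([0.7, 0.2]), np.array([0.9, 0.4])]:
    belief = update_belief(belief, likelihoods)
    decision = classify(belief)
    if decision is not None:
        print(f"Classified as attribute {decision} with belief {belief[decision]:.3f}")
        break

In this toy run the belief in the first attribute rises past the threshold after three observations, at which point a classification decision is declared; in the paper's setting, the sequence of interactions would instead be chosen by a POMDP strategy subject to the cost and safety constraints.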

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2008.04768
Document Type:
Working Paper