Handling Sequential Observations in Intelligent Surveillance

Jianbing Ma, Weiru Liu, Paul Miller

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

10 Citations (Scopus) · 294 Downloads (Pure)


Demand for intelligent surveillance in public transport systems is growing due to the increased threat of terrorist attacks, vandalism and litigation. The aim of intelligent surveillance is timely reaction to information received from various monitoring devices, especially CCTV systems. However, video analytic algorithms can only provide static assertions, whilst in reality many related events happen in sequence and hence should be modelled sequentially. Moreover, analytic algorithms are error-prone, so correcting sequential analytic results in the light of new evidence (external information or later sensing discoveries) becomes an important issue. In this paper, we introduce a high-level sequential observation modelling framework which supports revision and update on new evidence. This framework adapts the situation calculus to deal with uncertainty in analytic results. The output of the framework can serve as a foundation for event composition. We demonstrate the significance and usefulness of our framework with a case study from a bus surveillance project.
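The distinction the abstract draws between revising past observations and updating with new ones can be illustrated with a minimal sketch. This is not the paper's situation-calculus formalism; the `Observation` type, the event labels, and the confidence-based override rule are all illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Observation:
    time: int          # discrete time step of the sensing event
    label: str         # assertion from the video analytics, e.g. "person_boards"
    confidence: float  # analytic confidence in [0, 1]

def update(history, obs):
    """Update: the world has moved on, so append the newly sensed observation."""
    return history + [obs]

def revise(history, evidence):
    """Revise: new evidence about an earlier time step overrides any
    lower-confidence assertion previously recorded for that step."""
    return [
        evidence if obs.time == evidence.time and evidence.confidence > obs.confidence
        else obs
        for obs in history
    ]

history = []
history = update(history, Observation(1, "person_boards", 0.9))
history = update(history, Observation(2, "bag_left", 0.6))
# Later, stronger external evidence contradicts the time-2 assertion.
history = revise(history, Observation(2, "bag_carried", 0.8))
```

After the revision, the time-2 entry reflects the stronger evidence (`bag_carried`) while the time-1 observation is untouched, capturing the idea that sequential analytic results can be corrected without discarding the whole history.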
Original language: English
Title of host publication: International Conference on Scalable Uncertainty Management, SUM 2011
Number of pages: 14
Publication status: Published - Oct 2011
Event: Scalable Uncertainty Management - 5th International Conference, SUM 2011 - Dayton, OH, United States
Duration: 01 Oct 2011 - 01 Oct 2011

Publication series

Name: Lecture Notes in Computer Science
Publisher: Springer Berlin Heidelberg
ISSN (Print): 0302-9743


Conference: Scalable Uncertainty Management - 5th International Conference, SUM 2011
Country/Territory: United States
City: Dayton, OH

Bibliographical note

ISBN: 978-3-642-23962-5


