Chinese Academy of Sciences Institutional Repositories Grid
Detecting Anomaly Based on Time Dependence for Large Scenes

Document Type: Conference Paper

Authors: Lixin Chen; Huiwen Guo; Xinyu Wu; Wei Feng; Xiaoxin Lin
Publication Date: 2016
Conference: IEEE International Conference on Information and Automation (ICIA)
Venue: Ningbo, China
Abstract: We propose a novel approach to crowd anomaly detection across multiple cameras with non-overlapping, visible views. Useful information is often hidden in the non-overlapping regions between camera fields of view. In this paper, we mine time-dependence data so that crowd anomalies can be analyzed along the time dimension. First, we preprocess the real scene using optical flow. Second, we build a model of crowd movement from random data of both a simulated scene and the real scene, based on the neighborhood-weighted fuzzy c-means (NW-FCM) algorithm. Third, we analyze local and global paths using the time-dependence data: we estimate the probability of a trajectory passing through a given local path segment, and then study the global path using the Bayesian Information Criterion (BIC) and Markov Chain Monte Carlo (MCMC). Finally, we detect crowd anomalies, which fall into two kinds of events: abnormal retention and abnormal movement. We set an empirical probability threshold P_e; if the probability of a detected model is less than P_e, that model is marked as a crowd anomaly. The detection system is evaluated with a confusion matrix, and the global comprehensive assessment criterion for the real scene reaches 95.6%. Experimental results show that the anomaly detection is precise.
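The abstract's final decision rule reduces to a threshold test followed by a confusion-matrix evaluation. The Python sketch below illustrates only that step; the probability values, the threshold value `p_e`, and the helper names are hypothetical stand-ins, since the paper derives trajectory probabilities from its BIC/MCMC path analysis, which is not reproduced here.

```python
import numpy as np

def flag_anomalies(model_probs: np.ndarray, p_e: float) -> np.ndarray:
    """Mark a detected model as anomalous when its probability falls
    below the empirical threshold P_e, as described in the abstract."""
    return model_probs < p_e

def confusion_counts(y_true: np.ndarray, y_pred: np.ndarray):
    """Return (TP, FP, FN, TN) for binary anomaly labels."""
    tp = int(np.sum(y_true & y_pred))
    fp = int(np.sum(~y_true & y_pred))
    fn = int(np.sum(y_true & ~y_pred))
    tn = int(np.sum(~y_true & ~y_pred))
    return tp, fp, fn, tn

# Hypothetical example: probabilities of five detected trajectory models
probs = np.array([0.82, 0.04, 0.55, 0.01, 0.73])
truth = np.array([False, True, False, True, False])  # ground-truth anomalies

pred = flag_anomalies(probs, p_e=0.10)  # p_e = 0.10 is an assumed value
tp, fp, fn, tn = confusion_counts(truth, pred)
accuracy = (tp + tn) / len(truth)  # one possible global assessment metric
print(f"TP={tp} FP={fp} FN={fn} TN={tn} accuracy={accuracy:.1%}")
```

The paper's "global comprehensive assessment criterion" is computed from the confusion matrix; plain accuracy is used above only as an illustrative placeholder.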
Indexed by: EI
Language: English
Source URL: [http://ir.siat.ac.cn:8080/handle/172644/10134]
Collection: Shenzhen Institute of Advanced Technology, Institute of Advanced Integration Technology
Recommended Citation
GB/T 7714
Lixin Chen, Huiwen Guo, Xinyu Wu, et al. Detecting Anomaly Based on Time Dependence for Large Scenes[C]. In: IEEE International Conference on Information and Automation (ICIA). Ningbo, China, 2016.

Deposit Method: OAI Harvesting

Source: Shenzhen Institute of Advanced Technology

