Group Abnormal Behaviour Detection Algorithm Based on Global Optical Flow

Yu Hao, Ying Liu, Jiulun Fan, Zhijie Xu

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)


Abnormal behaviour detection algorithms must analyse behaviour on the basis of continuous target tracking in video, and their robustness degrades under occlusion of moving targets, occlusion by the environment, and the motion of similarly coloured targets. To address this, for group behaviour the optical flow information between RGB (red, green, and blue) video frames is used as the input to the network. The direction, velocity, acceleration, and energy of the crowd are then weighted and fused into a global optical flow descriptor, while the crowd trajectory map is extracted from each original single-frame image. Next, to detect moving targets with large displacements and to overcome the limitation that traditional optical flow algorithms are suitable only for small-displacement targets, a video abnormal behaviour detection algorithm based on a two-stream convolutional neural network is proposed. The network uses two branches to learn spatial and temporal information, respectively, and employs a short- and long-term memory neural network to model the dependencies between temporally distant video frames, yielding the final behaviour classification results. Simulation results show that the proposed method achieves good recognition performance on multiple datasets, and that incorporating interframe motion information significantly improves abnormal behaviour detection.
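The weighted fusion of crowd direction, velocity, acceleration, and energy described in the abstract can be sketched as follows. This is a minimal illustrative implementation, not the paper's exact method: the equal weights, the eight-bin direction histogram, and the synthetic flow fields standing in for real optical flow output (e.g. from a Farneback-style estimator) are all assumptions.

```python
import numpy as np

def global_flow_descriptor(flow_prev, flow_curr, n_dir_bins=8,
                           weights=(0.25, 0.25, 0.25, 0.25)):
    """Fuse direction, velocity, acceleration, and energy of a dense
    optical-flow field into one global descriptor vector.
    Weights and bin count are illustrative, not from the paper."""
    u, v = flow_curr[..., 0], flow_curr[..., 1]
    speed = np.hypot(u, v)                      # per-pixel flow magnitude
    direction = np.arctan2(v, u)                # per-pixel flow angle [-pi, pi]
    accel = np.linalg.norm(flow_curr - flow_prev, axis=-1)  # temporal change
    energy = 0.5 * speed ** 2                   # kinetic-energy analogue

    # Direction histogram weighted by speed, so dominant motion stands out.
    hist, _ = np.histogram(direction, bins=n_dir_bins,
                           range=(-np.pi, np.pi), weights=speed)
    hist = hist / (hist.sum() + 1e-8)

    w_dir, w_spd, w_acc, w_eng = weights
    return np.concatenate([
        w_dir * hist,
        [w_spd * speed.mean(), w_acc * accel.mean(), w_eng * energy.mean()],
    ])

# Usage: two synthetic HxWx2 flow fields in place of real estimator output.
rng = np.random.default_rng(0)
f_prev = rng.normal(size=(64, 64, 2)).astype(np.float32)
f_curr = f_prev + 0.1 * rng.normal(size=(64, 64, 2)).astype(np.float32)
desc = global_flow_descriptor(f_prev, f_curr)
print(desc.shape)  # 8 direction bins + mean speed, acceleration, energy
```

In a full pipeline such a per-frame-pair descriptor would be stacked over time and fed to the temporal branch of the two-stream network.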

Original language: English
Article number: 5543204
Number of pages: 12
Early online date: 5 May 2021
Publication status: Published - 1 Sep 2021


