SEEM: A Sequence Entropy Energy-Based Model for Pedestrian Trajectory All-Then-One Prediction

Dafeng Wang, Hongbo Liu, Naiyao Wang, Yiyang Wang, Hua Wang, Seán McLoone

Research output: Contribution to journal › Article › peer-review



Predicting the future trajectories of pedestrians is of increasing importance for many applications such as autonomous driving and social robots. Nevertheless, current trajectory prediction models suffer from limitations such as lack of diversity in candidate trajectories, poor accuracy, and instability. In this paper, we propose a novel Sequence Entropy Energy-based Model, named SEEM, which consists of a generator network and an energy network. Within SEEM we optimize the sequence entropy by exploiting local variational inference for f-divergence estimation to maximize the mutual information across the generator, so as to cover all modes of the trajectory distribution and thereby ensure that SEEM achieves full diversity in candidate trajectory generation. We then introduce a probability distribution clipping mechanism to draw samples towards regions of high probability in the trajectory latent space, while our energy network determines which trajectory is most representative of the ground truth. This dual approach is our so-called all-then-one strategy. Finally, a zero-centered potential energy regularization is proposed to ensure stability and convergence of the training process. Through experiments on both synthetic and public benchmark datasets, SEEM is shown to substantially outperform current state-of-the-art approaches in terms of the diversity, accuracy and stability of pedestrian trajectory prediction.
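The all-then-one strategy described in the abstract can be illustrated with a minimal sketch: first generate *all* diverse candidate trajectories, then select the *one* whose energy is lowest. Note that both the generator (a straight-line rollout plus latent noise) and the energy function (a hand-crafted smoothness score) below are hypothetical stand-ins for the paper's learned networks, used only to show the two-stage structure.

```python
import numpy as np

rng = np.random.default_rng(0)

def generate_candidates(history, n_candidates=20, horizon=12):
    """Stand-in generator: extrapolate the last observed velocity and add
    accumulated latent noise to produce diverse candidate futures.
    Returns an array of shape (n_candidates, horizon, 2)."""
    last_pos = history[-1]
    velocity = history[-1] - history[-2]
    steps = np.arange(1, horizon + 1)[:, None]      # (horizon, 1)
    base = last_pos + steps * velocity              # straight-line rollout
    noise = rng.normal(scale=0.3, size=(n_candidates, horizon, 2)).cumsum(axis=1)
    return base[None] + noise

def energy(candidate):
    """Stand-in energy: penalize jerky motion, so smoother candidates
    receive lower energy (replacing the paper's learned energy network)."""
    accel = np.diff(candidate, n=2, axis=0)         # second differences
    return float((accel ** 2).sum())

# All-then-one: generate ALL candidates, THEN pick the ONE with lowest energy.
history = np.array([[0.0, 0.0], [0.4, 0.1], [0.8, 0.2]])
candidates = generate_candidates(history)
best = min(range(len(candidates)), key=lambda i: energy(candidates[i]))
prediction = candidates[best]
```

In the actual model the candidate set is drawn from a latent space (shaped further by the probability distribution clipping mechanism), and the energy network is trained so that low energy corresponds to trajectories representative of the ground truth.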
Original language: English
Number of pages: 18
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Early online date: 01 Feb 2022
Publication status: Early online date - 01 Feb 2022


