Learning Quantile Functions for Temporal Point Processes with Recurrent Neural Splines

Souhaib Ben Taieb
(2021)

pdf

We can build flexible predictive models for rich continuous-time event data by combining temporal point processes (TPPs) with recurrent neural networks. Many prediction tasks require characterizing the distribution of the next arrival time conditional on the observed history. We propose a new neural parametrization for TPPs based on the conditional quantile function. Specifically, we use a flexible monotonic rational-quadratic spline to learn a smooth, continuous quantile function, and condition on historical events through a recurrent neural network. This parametrization yields a flexible yet tractable TPP model with several advantages, including analytical sampling and closed-form expressions for quantiles and prediction intervals. While neural TPP models are typically trained by maximum likelihood estimation, we instead consider the more robust continuous ranked probability score (CRPS), and derive a closed-form expression for the CRPS of our model. Finally, we demonstrate that the proposed model achieves state-of-the-art performance on standard prediction tasks for both synthetic and real-world event data.
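To illustrate why parametrizing the quantile function is convenient, the sketch below uses a toy closed-form quantile function (that of an Exp(1) arrival time) as a stand-in for the paper's learned monotonic rational-quadratic spline. It shows the two properties the abstract highlights: sampling reduces to evaluating Q at a uniform draw, and the CRPS can be computed from Q alone via its pinball-loss (quantile score) representation, CRPS(F, y) = ∫₀¹ 2 ρ_τ(y − Q(τ)) dτ. The function names and the midpoint-rule approximation are illustrative choices, not the paper's implementation (which has a closed-form CRPS for the spline).

```python
import math
import random

def Q(tau):
    """Toy monotone quantile function: Exp(1), Q(tau) = -log(1 - tau).

    In the paper, this role is played by a monotonic rational-quadratic
    spline whose parameters are produced by an RNN from the event history;
    this closed-form stand-in is purely illustrative.
    """
    return -math.log1p(-tau)

def sample(rng):
    """Analytical (inverse-transform) sampling: one evaluation of Q
    at a uniform draw, with no rejection or numerical root-finding."""
    return Q(rng.random())

def pinball(tau, u):
    """Pinball (quantile) loss rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (1.0 if u < 0 else 0.0))

def crps(y, n=20000):
    """CRPS via its quantile representation,
    CRPS(F, y) = int_0^1 2 * rho_tau(y - Q(tau)) dtau,
    approximated with a midpoint rule on a tau grid.
    (The paper derives this integral in closed form for its spline.)"""
    total = 0.0
    for i in range(n):
        tau = (i + 0.5) / n
        total += 2.0 * pinball(tau, y - Q(tau))
    return total / n
```

For the Exp(1) toy case the CRPS is known exactly, CRPS(y) = y + 2e^{-y} - 3/2, so the numerical quantile-integral can be checked against it; with a learned spline the same integral is what admits a closed-form expression.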