You have now covered the full landscape of point process theory and simulation — from the memoryless Poisson process to self-exciting Hawkes cascades, doubly stochastic Cox processes, and marked ETAS models. This final chapter gives you a decision guide for choosing the right model, pointers to the research frontier, and a reading list for going deeper.
The following questions help narrow down which model to use:
Step 1: Is the rate constant? If yes, an HPP suffices; if not, continue.
Step 2: Is the non-stationarity deterministic (known driver) or random? A known driver calls for an NHPP with a specified λ(t); a random environment calls for a Cox process.
Step 3: Is the process overdispersed but without self-excitation? If yes, use a Cox / LGCP model.
Step 4: Do events have meaningful attributes (size, type)? If yes, use a marked Hawkes or ETAS model.
Step 5: Are there multiple interacting streams? If yes, use a multivariate Hawkes model.
| Observation | Recommended Model |
|---|---|
| Constant rate, no memory | HPP |
| Time-varying rate, no memory | NHPP |
| Refractory period, no clustering | Renewal |
| Bursts triggered by events | Hawkes |
| Multiple interacting streams | Multivariate Hawkes |
| Overdispersion from latent environment | Cox / LGCP |
| Magnitude-dependent cascades | ETAS / Marked Hawkes |
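A quick way to act on the first rows of this table is to test for overdispersion: binned counts of a Poisson process have variance approximately equal to their mean, while Cox or clustered processes inflate the variance (the Var[N] = E[Λ] + Var[Λ] identity below). A minimal sketch — the `fano_factor` helper and the choice of 50 bins are illustrative, not from any library:

```python
import numpy as np

rng = np.random.default_rng(0)

def fano_factor(event_times, t_end, n_bins=50):
    """Ratio Var[N]/E[N] of bin counts: typically near 1 for a Poisson
    process, noticeably above 1 for overdispersed (Cox / clustered) data."""
    counts, _ = np.histogram(event_times, bins=n_bins, range=(0.0, t_end))
    return counts.var() / counts.mean()

# Homogeneous Poisson events at rate 5 on [0, 100]: conditioned on the
# count, HPP event times are i.i.d. uniform on the interval.
t_end = 100.0
n = rng.poisson(5 * t_end)
hpp_times = np.sort(rng.uniform(0.0, t_end, n))
print(round(fano_factor(hpp_times, t_end), 2))
```

A value far above 1 does not by itself distinguish a Cox process from a Hawkes process — that requires the residual diagnostics discussed later — but it rules out the memoryless models in the first two rows.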
| Formula | Meaning |
|---|---|
| λ*(t) = λ | HPP |
| λ*(t) = μ + Σᵢ α·exp(−β(t−tᵢ)) | Hawkes with exponential kernel |
| n* = α/β < 1 | Stationarity condition (branching ratio) |
| E[λ*] = μ/(1−n*) | Mean Hawkes rate |
| ℓ = Σᵢ log λ*(tᵢ) − ∫ λ*(t)dt | Log-likelihood (universal) |
| τᵢ = ∫₀^{tᵢ} λ*(s)ds, with Δτᵢ ~ Exp(1) | Time-rescaling residuals |
| ρ(A) < 1 | Multivariate stationarity (spectral radius of the branching matrix) |
| Var[N] = E[Λ] + Var[Λ] | Cox overdispersion |
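These formulas translate directly into code. A minimal sketch for the exponential-kernel case, with illustrative parameter values (the helper names are ours, not from a library); note that for the exponential kernel the compensator integral in the log-likelihood has a closed form:

```python
import numpy as np

def hawkes_intensity(t, events, mu, alpha, beta):
    """λ*(t) = μ + Σ_{t_i < t} α·exp(−β(t − t_i))."""
    past = events[events < t]
    return mu + np.sum(alpha * np.exp(-beta * (t - past)))

def hawkes_loglik(events, t_end, mu, alpha, beta):
    """ℓ = Σ log λ*(t_i) − ∫₀^T λ*(t) dt, with the integral evaluated in
    closed form for the exponential kernel."""
    ll = sum(np.log(hawkes_intensity(t, events, mu, alpha, beta)) for t in events)
    compensator = mu * t_end + (alpha / beta) * np.sum(1 - np.exp(-beta * (t_end - events)))
    return ll - compensator

mu, alpha, beta = 0.5, 0.8, 1.2   # illustrative values
n_star = alpha / beta             # branching ratio, must be < 1
mean_rate = mu / (1 - n_star)     # E[λ*] = μ/(1−n*)
print(n_star < 1, round(mean_rate, 2))
```

With these values n* = 0.8/1.2 ≈ 0.67, so the process is stationary with mean rate 0.5/(1 − 0.67) = 1.5 events per unit time.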
| Library | Strengths | Install |
|---|---|---|
| tick (Inria) | Fast Hawkes simulation and MLE, multi-dimensional | `pip install tick` |
| hawkeslib | Simple Python Hawkes library | `pip install hawkeslib` |
| pyHawkes | Research-oriented, various kernels | GitHub |
| PINT (R) | Point process inference toolkit in R | CRAN |
| PySEF | Stochastic event forecasting (seismology) | GitHub |
| lifelines | Survival analysis, renewal processes | `pip install lifelines` |
For most applications, the from-scratch implementations in this book’s code/ directory are sufficient and educational. Use tick when you need speed for large datasets.
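As an example of the from-scratch style, Ogata's thinning algorithm for an exponential-kernel Hawkes process fits in a dozen lines (parameter values are illustrative; the O(n²) intensity evaluation is fine at this scale, and is what `tick` speeds up):

```python
import numpy as np

def simulate_hawkes_thinning(mu, alpha, beta, t_end, rng=None):
    """Simulate an exp-kernel Hawkes process by Ogata's thinning.
    Between events the intensity only decays, so its value at the current
    time is a valid upper bound for proposing the next candidate."""
    if rng is None:
        rng = np.random.default_rng()
    events, t = [], 0.0
    while t < t_end:
        # Upper bound: intensity just after the current time.
        lam_bar = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        t += rng.exponential(1.0 / lam_bar)   # candidate from rate-λ̄ Poisson
        if t >= t_end:
            break
        lam_t = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        if rng.uniform() <= lam_t / lam_bar:  # accept with prob λ*(t)/λ̄
            events.append(t)
    return np.array(events)

evts = simulate_hawkes_thinning(mu=0.5, alpha=0.8, beta=1.2, t_end=200.0,
                                rng=np.random.default_rng(7))
print(len(evts))  # on average E[λ*]·T = 0.5/(1 − 0.8/1.2) · 200 = 300 events
```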
Foundational textbooks:
Seminal papers:
Recent research:
Neural point processes: Replace the parametric intensity with a neural network (RNN, Transformer). Can capture complex non-Markovian dependencies but loses interpretability and requires more data.
Non-parametric kernel estimation: Instead of fixing φ(t) = α·exp(−βt), estimate the kernel from data using EM algorithms (Bacry & Muzy, 2016) or regularization.
Spatio-temporal processes: Extend to 2D space: earthquake locations, crime events, disease spread. The conditional intensity becomes λ*(t, x, y).
Hawkes processes for graphs: Events on nodes of a network; edges define cross-excitation structure (social networks, neural connectomes).
Uncertainty quantification: Bayesian approaches (MCMC for Hawkes, variational inference for LGCP) give full posterior distributions over parameters.
Before declaring a model fit, validate it:

- Compute the time-rescaled residuals τᵢ = ∫₀^{tᵢ} λ̂*(s)ds and check that their increments are consistent with Exp(1) (e.g. a Q–Q plot or a KS test).
- For Hawkes models, verify the stationarity condition n* = α/β < 1 (spectral radius ρ(A) < 1 in the multivariate case).
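The time-rescaling check takes only a few lines. In this sketch the data come from a unit-rate Poisson process and are checked against the (correct) model Λ̂(t) = t, so the rescaled increments should look Exp(1); the `rescaled_increments` helper is ours:

```python
import numpy as np
from scipy.stats import kstest

def rescaled_increments(events, intensity_integral):
    """Map event times through the compensator Λ̂(t); under a correct model
    the increments of τ_i = Λ̂(t_i) are i.i.d. Exp(1)."""
    taus = np.array([intensity_integral(t) for t in events])
    return np.diff(np.concatenate(([0.0], taus)))

# Events from a unit-rate Poisson process; the fitted model λ̂*(t) = 1
# gives Λ̂(t) = t, so the increments are the original inter-event times.
rng = np.random.default_rng(42)
events = np.cumsum(rng.exponential(1.0, size=500))
dtaus = rescaled_increments(events, lambda t: t)
stat, pvalue = kstest(dtaus, "expon")  # one-sample KS vs Exp(1)
print(round(float(pvalue), 3))
```

A small p-value here would indicate that the fitted intensity does not account for the observed timing structure; for a misspecified model (say, an HPP fit to Hawkes data) the test rejects sharply.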
Point processes are a rich and practically essential class of stochastic models. The journey from the memoryless Poisson process (chapter 2) to the full ETAS model (chapter 10) represents increasing realism: more complex dynamics, more parameters, more powerful inference machinery.
The key insight that unifies the entire book: the conditional intensity λ*(t) is everything. Choose λ*(t) to match your domain knowledge, estimate it from data using the universal log-likelihood, and validate it with the time-rescaling theorem. Everything else is details.
| ← Chapter 14 | Table of Contents |