Real event streams rarely arrive at a constant rate. Web traffic peaks during business hours, neural firing rates track a stimulus, and earthquake occurrence rates vary with tectonic stress. The inhomogeneous (non-homogeneous) Poisson process (NHPP) extends the HPP by allowing the rate to vary deterministically with time, while preserving the independence-of-increments property that makes Poisson processes analytically tractable.
An NHPP with intensity function λ(t) ≥ 0 is a counting process such that:
1. N(0) = 0;
2. N has independent increments;
3. for all a < b, N(a, b] ~ Poisson(Λ(a, b]), where Λ(a, b] = ∫ₐᵇ λ(s) ds.
The function Λ(t) = ∫₀ᵗ λ(s) ds is called the mean measure or cumulative intensity. It counts the expected number of events up to time t.
The conditional intensity is simply λ*(t) = λ(t) — there is no memory, just a time-varying rate.
Distribution of counts:
P(N(a, b] = k) = e^{-Λ(a,b]} · Λ(a,b]^k / k!
E[N(a, b]] = Λ(a, b]
Var[N(a, b]] = Λ(a, b]
Equidispersion (Var = Mean) still holds, but now in terms of Λ(a, b], not a fixed rate times the length.
Conditional uniformity: Given N(T) = n, the event times are i.i.d. with density proportional to λ(t) (not uniform, unlike the HPP). This is the weighted analog: events concentrate where λ(t) is large.
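The conditional-uniformity property gives a direct way to place event times once the count is known. A minimal sketch using rejection sampling against an upper bound λ_bar ≥ λ(t) (the function name `sample_times_given_count` is illustrative, not from the chapter's code file):

```python
import numpy as np

def sample_times_given_count(intensity_fn, lambda_bar, T, n, rng=None):
    """Draw n event times i.i.d. with density proportional to lambda(t) on [0, T],
    by rejection sampling against a bound lambda_bar >= lambda(t)."""
    rng = np.random.default_rng() if rng is None else rng
    times = []
    while len(times) < n:
        t = rng.uniform(0.0, T)                    # propose uniformly on [0, T]
        if rng.uniform() < intensity_fn(t) / lambda_bar:
            times.append(t)                        # accept w.p. lambda(t) / lambda_bar
    return np.sort(np.array(times))
```

For λ(t) = t on [0, 1] the normalized density is 2t, so sampled times should concentrate near 1 rather than spread uniformly.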
If Λ(t) is invertible in closed form, we can use the inverse transform method:
1. Generate a unit-rate HPP on [0, Λ(T)] to get times τ₁ < τ₂ < ….
2. Set tᵢ = Λ⁻¹(τᵢ).

This works because the time change t → Λ(t) transforms an NHPP into a unit-rate HPP. It is exact and efficient when Λ⁻¹ is available analytically.
Example: For the linear rate λ(t) = a + bt:
Λ(t) = at + bt²/2
Λ⁻¹(u) = (-a + sqrt(a² + 2bu)) / b
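Putting the two steps together for this linear rate gives a short exact simulator (a sketch; the name `simulate_nhpp_linear` is illustrative and not from the chapter's code file):

```python
import numpy as np

def simulate_nhpp_linear(a, b, T, rng=None):
    """Exact simulation of an NHPP with lambda(t) = a + b*t on [0, T]
    via the inverse-transform (time-change) method."""
    rng = np.random.default_rng() if rng is None else rng
    Lambda_T = a * T + b * T**2 / 2          # cumulative intensity at T
    # Unit-rate HPP on [0, Lambda(T)]: cumulative sums of Exp(1) gaps
    taus = []
    tau = rng.exponential(1.0)
    while tau <= Lambda_T:
        taus.append(tau)
        tau += rng.exponential(1.0)
    taus = np.array(taus)
    # Map back through Lambda^{-1}(u) = (-a + sqrt(a^2 + 2*b*u)) / b
    return (-a + np.sqrt(a**2 + 2 * b * taus)) / b
```

For a = 2, b = 1, T = 10, the expected count is Λ(10) = 20 + 50 = 70.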
When Λ⁻¹ is not available in closed form, the thinning (rejection) algorithm of Lewis and Shedler (1979) is the standard approach:
Algorithm:
1. Choose a constant λ_bar such that λ_bar ≥ λ(t) for all t ∈ [0, T].
2. Generate an HPP at rate λ_bar (proposals).
3. Accept each proposal t with probability λ(t) / λ_bar.

```python
import numpy as np

def simulate_nhpp_thinning(intensity_fn, lambda_bar, T):
    """Lewis-Shedler thinning: simulate an NHPP with intensity intensity_fn
    on [0, T] by thinning an HPP at rate lambda_bar >= max lambda(t)."""
    events = []
    t = 0.0
    while True:
        t += np.random.exponential(1.0 / lambda_bar)   # next proposal
        if t > T:
            break
        if np.random.uniform() < intensity_fn(t) / lambda_bar:
            events.append(t)                           # accept
    return np.array(events)
```
Efficiency: The expected fraction of proposals accepted is Λ(T) / (λ_bar · T). A tighter upper bound λ_bar means fewer wasted proposals.
For slowly varying λ(t), we can use a time-varying upper bound (piecewise constant approximation) to improve efficiency.
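The piecewise-constant bound idea can be sketched as follows: run thinning separately on each interval with a local bound, which wastes fewer proposals where λ(t) is small. (A sketch under the assumption that `local_bars[k]` dominates λ(t) on its interval; the function name is illustrative.)

```python
import numpy as np

def simulate_nhpp_piecewise_bound(intensity_fn, breakpoints, local_bars, rng=None):
    """Thinning with a piecewise-constant upper bound: on each interval
    [breakpoints[k], breakpoints[k+1]) proposals use local_bars[k] >= lambda(t)."""
    rng = np.random.default_rng() if rng is None else rng
    events = []
    for k in range(len(local_bars)):
        t, t_end = breakpoints[k], breakpoints[k + 1]
        while True:
            # Fresh HPP at the local rate; valid because disjoint
            # intervals of a Poisson process are independent.
            t += rng.exponential(1.0 / local_bars[k])
            if t >= t_end:
                break
            if rng.uniform() < intensity_fn(t) / local_bars[k]:
                events.append(t)
    return np.array(events)
```

With λ(t) = t on [0, 10] and bounds 5 and 10 on the two halves, the acceptance rate roughly doubles compared with the single global bound λ_bar = 10.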
A natural model for daily-periodic event streams:
λ(t) = λ₀ · (1 + A · sin(2πt / P))
where λ₀ is the baseline rate, A ∈ (0, 1) is the amplitude, and P is the period.
The tight upper bound is λ_bar = λ₀ · (1 + A). For λ₀ = 5, A = 0.8, P = 24 (hours), about 1 / (1 + A) ≈ 55.6% of proposals are accepted (averaging over whole periods).
```python
lambda_0, A, P = 5.0, 0.8, 24.0
intensity = lambda t: lambda_0 * (1 + A * np.sin(2 * np.pi * t / P))
lambda_bar = lambda_0 * (1 + A)   # tight bound: sin <= 1
events = simulate_nhpp_thinning(intensity, lambda_bar, T=168.0)  # one week
```
See code/03_inhomogeneous_poisson.py for the full implementation with efficiency analysis.
A flexible non-parametric approach is to model λ(t) as piecewise constant over bins of width Δ:
λ(t) = λₖ for t ∈ [(k-1)Δ, kΔ)
Simulation is exact: for each bin, draw Nₖ ~ Poisson(λₖ · Δ) events and place them uniformly within the bin.
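The binned simulation scheme can be sketched in a few lines (the name `simulate_piecewise_constant` is illustrative, not from the chapter's code file):

```python
import numpy as np

def simulate_piecewise_constant(rates, delta, rng=None):
    """Exact simulation of an NHPP with piecewise-constant intensity:
    bin k covers [k*delta, (k+1)*delta) with rate rates[k]."""
    rng = np.random.default_rng() if rng is None else rng
    events = []
    for k, lam in enumerate(rates):
        n_k = rng.poisson(lam * delta)                       # count in bin k
        events.extend(k * delta + rng.uniform(0, delta, size=n_k))
    return np.sort(np.array(events))
```

This is exact because counts in disjoint bins are independent Poisson variables, and given the count in a bin the event times are uniform there (λ is constant on the bin).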
This is the baseline model for many practical applications (e.g., server log analysis) and is the starting point for non-parametric intensity estimation via kernel smoothing.
Key takeaways:
- The NHPP replaces the constant rate λ with a deterministic function λ(t).
- The cumulative intensity Λ(t) = ∫₀ᵗ λ(s) ds replaces λ · t in all Poisson formulas.
- Inverse-transform simulation is exact when Λ⁻¹ is tractable; otherwise use Lewis-Shedler thinning with an upper bound λ_bar.
- The conditional intensity depends only on t, not on past events.

| ← Chapter 2 | Table of Contents | Chapter 4: Poisson Process Properties → |