Time Normalization Using the Integral Length Scale¶
When comparing time series from a simulation against wind tunnel data or another simulation, raw seconds are not a useful axis. The natural time scale of an ABL flow is the eddy turnover time \(T_u = L_u / U\), set by the streamwise integral length scale \(L_u\) and a representative mean velocity \(U\). Normalising time by \(T_u\) removes the dependence on the specific reference velocity and length, so that one full unit of \(t^* = t / T_u\) corresponds to the passage of one large-eddy structure past the probe.
This notebook shows how to compute \(L_u\) from a probe time series following Taylor’s frozen-turbulence hypothesis, and how to use it to non-dimensionalise time for plotting, statistics windowing, and cross-case comparison.
Note. The streamwise integral length scale \(L_u\) is also a target quantity for the inlet itself. When validating an ABL case against a standard, it should be checked alongside \(U(z)\) and \(I_u(z)\). See the ABL Velocity and Turbulence Intensity notebook for those two profiles.
Definition¶
For a stationary, ergodic streamwise velocity signal \(u(t)\) at a fixed height, decompose it as \(u(t) = U + u'(t)\) with \(U = \overline{u(t)}\). The temporal autocorrelation of the fluctuating component is

\[R_{uu}(\tau) = \frac{\overline{u'(t)\, u'(t+\tau)}}{\overline{u'^2}}.\]
The integral time scale is

\[T_{int} = \int_0^{\tau^*} R_{uu}(\tau)\, d\tau,\]

where \(\tau^*\) is the first zero crossing of \(R_{uu}\). This bounded integral avoids the slow-decay tail that contaminates the open-ended definition.
Under Taylor’s frozen-turbulence hypothesis (turbulent eddies advect past the probe faster than they evolve), the spatial integral length scale is

\[L_u = U \, T_{int},\]

and the corresponding eddy turnover time is \(T_u = L_u / U = T_{int}\).
Step 1: Compute the Autocorrelation via FFT¶
Direct evaluation of \(R_{uu}(\tau)\) scales as \(O(N^2)\). The FFT-based form below is \(O(N \log N)\) and produces an unbiased estimator after dividing by the lag count.
[ ]:
import numpy as np

def autocorrelation_fft(signal: np.ndarray) -> np.ndarray:
    """Normalised autocorrelation R_uu(tau) for tau >= 0 via FFT."""
    s = signal - signal.mean()
    n = s.size
    nfft = 1 << (2 * n - 1).bit_length()
    F = np.fft.rfft(s, n=nfft)
    acf_full = np.fft.irfft(F * np.conj(F), n=nfft)[:n]
    norm = np.arange(n, 0, -1)  # unbiased lag count
    acf = acf_full / norm
    return acf / acf[0]
Tip. The zero-padding to nfft = 1 << (2 * n - 1).bit_length() avoids circular-correlation wrap-around. The division by np.arange(n, 0, -1) corrects for the shrinking number of overlapping samples at large lag.
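As a sanity check, the FFT estimator should match a direct \(O(N^2)\) evaluation of the unbiased autocorrelation to machine precision. The sketch below restates the function so the cell runs standalone:

```python
import numpy as np

def autocorrelation_fft(signal):
    # Same estimator as the cell above, restated so this check is self-contained.
    s = signal - signal.mean()
    n = s.size
    nfft = 1 << (2 * n - 1).bit_length()
    F = np.fft.rfft(s, n=nfft)
    acf = np.fft.irfft(F * np.conj(F), n=nfft)[:n] / np.arange(n, 0, -1)
    return acf / acf[0]

def autocorrelation_direct(signal):
    # Unbiased direct estimator: mean of the overlapping products at each lag.
    s = signal - signal.mean()
    n = s.size
    acf = np.array([np.mean(s[: n - k] * s[k:]) for k in range(n)])
    return acf / acf[0]

rng = np.random.default_rng(42)
u = rng.normal(size=512)
assert np.allclose(autocorrelation_fft(u), autocorrelation_direct(u))
print("FFT and direct estimators agree")
```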
Step 2: Integrate to the First Zero Crossing¶
[ ]:
def integral_length_scale(signal: np.ndarray, dt: float, u_mean: float) -> float:
    """Lu = u_mean * integral of R_uu(tau) up to the first zero crossing."""
    acf = autocorrelation_fft(signal)
    sign_change = np.where(np.diff(np.sign(acf)))[0]
    if sign_change.size == 0:
        idx_zero = acf.size
    else:
        idx_zero = sign_change[0] + 1
    T_int = np.trapz(acf[:idx_zero], dx=dt)
    return u_mean * T_int
If \(R_{uu}\) does not cross zero within the signal length, the integration extends to the end of the available record; this typically means the signal is too short for a converged estimate, so flag it rather than trust the value.
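One way to build confidence in the estimator is to feed it a synthetic signal whose integral time scale is known in closed form. A first-order autoregressive process with coefficient \(a\) has \(R_{uu}(k) = a^k\), so \(T_{int} \approx \Delta t / (1 - a)\). The sketch below (functions restated so the cell runs standalone) should recover a value near the target for a sufficiently long record:

```python
import numpy as np

def autocorrelation_fft(signal):
    # Restated from the cells above so this check is self-contained.
    s = signal - signal.mean()
    n = s.size
    nfft = 1 << (2 * n - 1).bit_length()
    F = np.fft.rfft(s, n=nfft)
    acf = np.fft.irfft(F * np.conj(F), n=nfft)[:n] / np.arange(n, 0, -1)
    return acf / acf[0]

def integral_time_scale(signal, dt):
    # Trapezoidal integral of R_uu up to its first zero crossing.
    acf = autocorrelation_fft(signal)
    sign_change = np.where(np.diff(np.sign(acf)))[0]
    idx_zero = acf.size if sign_change.size == 0 else sign_change[0] + 1
    return dt * (np.sum(acf[:idx_zero]) - 0.5 * (acf[0] + acf[idx_zero - 1]))

# AR(1) signal: u[i] = a*u[i-1] + noise, with R_uu(k) = a**k,
# so T_int ~ dt * sum(a**k) = dt / (1 - a) = 0.2 s for a = 0.95, dt = 0.01 s.
rng = np.random.default_rng(0)
a, dt, n = 0.95, 0.01, 200_000
u = np.empty(n)
u[0] = 0.0
for i in range(1, n):
    u[i] = a * u[i - 1] + rng.normal()

T_est = integral_time_scale(u, dt)
print(f"estimated T_int = {T_est:.3f} s (target {dt / (1 - a):.3f} s)")
```

Shortening the record pushes the noise floor of the autocorrelation up, moves the first zero crossing earlier, and biases \(T_{int}\) low, which is exactly the failure mode the paragraph above warns about.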
Step 3: Apply to a Vertical Line Probe¶
Given the probe CSV format produced by AeroSim (see the ABL Velocity and Turbulence Intensity notebook for the columns), compute \(L_u\) at every probe point.
[ ]:
import pandas as pd

df_pts = pd.read_csv("line.line_profile line.points.csv", index_col="idx")
df_ux = pd.read_csv("line.line_profile line.ux.csv")

time_steps = df_ux["time_step"].to_numpy(dtype=np.float64)
df_ux.drop(columns=["time_step"], inplace=True)
df_ux.columns = df_ux.columns.astype(int)

z = df_pts["z"].to_numpy()
dt = time_steps[1] - time_steps[0]

Lu = np.zeros(df_pts.shape[0])
for i, point_idx in enumerate(df_pts.index):
    u = df_ux[point_idx].to_numpy(dtype=float)
    if u.std() < 1e-9:  # constant signal (e.g. wall-adjacent point): no meaningful scale
        Lu[i] = np.nan
        continue
    Lu[i] = integral_length_scale(u, dt=dt, u_mean=u.mean())

for z_target in (10, 20, 50, 100):
    j = int((np.abs(z - z_target)).argmin())
    print(f"z = {z[j]:6.2f} m -> Lu = {Lu[j]:7.2f} m")
The EN 1991-1-4 Annex B.1 reference profile for \(L_u(z)\) is

\[L(z) = L_t \left( \frac{z}{z_t} \right)^{\alpha}, \qquad \alpha = 0.67 + 0.05 \ln(z_0),\]

with \(L_t = 300\) m, \(z_t = 200\) m, and \(z_0\) the roughness length in metres, clamped to \(L(z_{min})\) for \(z < z_{min}\). Use it to overlay a target curve on the simulated \(L_u(z)\), the same way \(U(z)\) and \(I_u(z)\) are validated.
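A sketch of the target curve as a function, assuming \(z_0\) and \(z_{min}\) follow the case's terrain category (the defaults below are the EN 1991-1-4 category II values, \(z_0 = 0.05\) m and \(z_{min} = 2\) m):

```python
import numpy as np

# EN 1991-1-4 Annex B.1 turbulence length scale. z0 and z_min depend on the
# terrain category; category II values are used here as an example.
L_T, Z_T = 300.0, 200.0  # reference length [m] and reference height [m]

def eurocode_Lu(z, z0=0.05, z_min=2.0):
    """Target L(z) profile, clamped to L(z_min) below z_min."""
    alpha = 0.67 + 0.05 * np.log(z0)  # z0 in metres
    z_eff = np.maximum(np.asarray(z, dtype=float), z_min)
    return L_T * (z_eff / Z_T) ** alpha

print(eurocode_Lu([2.0, 10.0, 50.0, 200.0]))
```

Evaluating this on the probe heights `z` gives the target curve to plot alongside the simulated `Lu` array from the cell above.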
Step 4: Pick a Reference Pair (Lu_ref, U_ref)¶
For time normalisation, a single \(T_u\) value must be chosen, not a profile. Common conventions:
| Convention | \(L_u\) | \(U\) | When to use |
|---|---|---|---|
| Reference height | \(L_u(z_{ref})\) | \(U(z_{ref})\) | Building studies; \(z_{ref}\) is the building reference height (e.g., \(H\) or \(2H/3\)) |
| Pedestrian level | \(L_u(z = 10\ \text{m})\) | \(U(z = 10\ \text{m})\) | Pedestrian comfort or terrain-only studies, matching the standard 10 m reporting height |
| Domain-mean | \(\overline{L_u}\) over the engineering height range | \(\overline{U}\) over the same range | Quick comparison; less reproducible across studies |
State the convention explicitly when reporting results: the same dataset gives different non-dimensional times under different conventions.
Step 5: Normalise Time¶
Once a reference pair is chosen, the eddy turnover time and the non-dimensional time are

\[T_u = \frac{L_{u,ref}}{U_{ref}}, \qquad t^* = \frac{t}{T_u}.\]
[ ]:
z_ref = 30.0 # building reference height [m]
j_ref = int(np.abs(z - z_ref).argmin())
Lu_ref = Lu[j_ref]
U_ref = df_ux[df_pts.index[j_ref]].mean()
T_u = Lu_ref / U_ref
t = time_steps - time_steps[0]
t_star = t / T_u
print(f"Lu_ref = {Lu_ref:.2f} m | U_ref = {U_ref:.2f} m/s | T_u = {T_u:.3f} s")
print(f"Total non-dimensional time covered: t* = {t_star[-1]:.1f}")
Important. For converged turbulence statistics on a high-rise study, the recommended sampling window is at least \(t^* \geq 100\). If the simulation duration falls short, autocorrelation tails will not flatten and quantities like \(L_u\) itself will be underestimated.
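The \(t^* \geq 100\) target translates directly into a minimum physical sampling duration. A quick check, using hypothetical reference values:

```python
# Minimum physical sampling time for t* >= 100 (example reference pair).
Lu_ex, U_ex = 45.0, 8.0        # hypothetical reference values [m], [m/s]

T_u_ex = Lu_ex / U_ex          # eddy turnover time [s]
t_min = 100.0 * T_u_ex         # required sampled duration after the transient [s]

print(f"T_u = {T_u_ex:.3f} s -> sample at least {t_min:.1f} s after the transient")
```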
Practical Uses¶
Plotting time series. Replace the seconds axis with \(t^*\). Two simulations with different \(U_{ref}\) or different reference heights become directly comparable.
Spectral analysis. When plotting power spectral densities, use the reduced frequency \(f^* = f \cdot L_u / U\) rather than \(f\) in Hz. This collapses the inertial range of independent runs onto a single \(-5/3\) slope and makes Eurocode and von Karman target spectra applicable to any case.
Statistics windowing. Discard the initial transient by trimming the first \(\sim 5\) to \(10\) units of \(t^*\) before computing means and standard deviations.
Comparing against wind tunnel data. Tunnel records are reported at model scale with their own \(L_u\) and \(U_{ref}\). Plotting both datasets in \(t^*\) removes the scale factor and exposes whether spectra and integral statistics actually agree.
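The reduced-frequency axis for spectra can be sketched as below, using `scipy.signal.welch` and a synthetic signal in place of a real probe record (`Lu_ex` and `U_ex` are hypothetical reference values):

```python
import numpy as np
from scipy.signal import welch

# Synthetic stand-in for a fluctuating velocity record; replace with the
# real u(t) series at the reference point.
rng = np.random.default_rng(0)
dt = 0.05                       # sampling interval [s]
u = rng.normal(size=16_384)     # fluctuating velocity [m/s]
Lu_ex, U_ex = 45.0, 8.0         # hypothetical reference pair [m], [m/s]

f, Suu = welch(u, fs=1.0 / dt, nperseg=2048)
f_star = f * Lu_ex / U_ex       # reduced frequency f* = f Lu / U

# Plotting f_star against f * Suu / var(u) gives a dimensionless spectrum;
# independent runs should collapse onto one curve in the inertial range.
print(f_star.shape, f_star[1])
```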
Next Steps¶
ABL Velocity and Turbulence Intensity - mean profile and turbulence intensity from the same probe.
ABL guided case post-processing - full validation workflow against a target terrain category.
Matching Custom Inlet - reproducing a measured wind tunnel profile, where \(L_u\) is also a calibration target.