A systems-thinking essay that explains why failure rarely happens suddenly. It shows how slow drift, accumulating pressure, and weakening buffers push systems toward collapse long before outcomes change, and why prediction-focused analytics miss the most important phase of failure.
An analytical essay on why prediction-based models fail in reflexive, unstable systems. It argues that accuracy collapses when models influence the behavior they are meant to predict, and proposes equilibrium- and force-based modeling as a more robust framework for understanding pressure, instability, and transitions in AI-shaped systems.
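As a rough illustration of the equilibrium- and force-based view, the sketch below treats a system state as driven by a destabilizing pressure and a restoring force toward an equilibrium point; all parameter names and values are hypothetical assumptions for illustration, not taken from the essay.

```python
def net_force(state, pressure, restoring_strength, equilibrium=0.0):
    """Force-based view: the system moves when destabilizing pressure
    exceeds the restoring pull toward its current equilibrium."""
    restoring = restoring_strength * (equilibrium - state)
    return pressure + restoring

def simulate(steps, pressure, restoring_strength, dt=0.1):
    """Integrate the state forward under a constant pressure (illustrative only)."""
    state = 0.0
    for _ in range(steps):
        state += dt * net_force(state, pressure, restoring_strength)
    return round(state, 3)

# With a weakened restoring force (eroded buffers), the same pressure
# displaces the system far further before any new equilibrium is reached.
print(simulate(50, pressure=1.0, restoring_strength=2.0))   # settles near 0.5
print(simulate(50, pressure=1.0, restoring_strength=0.25))  # still drifting toward 4.0
```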
An early-warning system that models disasters as instability transitions rather than isolated events. It combines force-based instability modeling with an interpretable ML escalation-risk layer to detect when hazards become disasters due to exposure growth, response delays, and buffer collapse.
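A minimal sketch of how such a pipeline could be wired together, assuming hypothetical feature names (exposure growth, response delay, buffer collapse rate) and toy data; the instability index and the scikit-learn logistic layer here are illustrative stand-ins, not the project's actual model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def instability_index(hazard_pressure, response_capacity, buffer_remaining):
    """Force-balance style index: values above 1 mean destabilizing pressure
    exceeds the system's absorptive capacity (illustrative formulation only)."""
    absorptive = response_capacity + buffer_remaining
    return hazard_pressure / max(absorptive, 1e-9)

print("instability index:",
      round(instability_index(hazard_pressure=0.8,
                              response_capacity=0.3,
                              buffer_remaining=0.2), 2))

# Hypothetical feature matrix: [exposure_growth, response_delay, buffer_collapse_rate]
X = np.array([
    [0.02, 1.0, 0.01],   # stable region
    [0.15, 6.0, 0.20],   # escalating
    [0.30, 12.0, 0.45],  # near transition
    [0.01, 0.5, 0.00],
])
y = np.array([0, 1, 1, 0])  # 1 = hazard escalated into a disaster

# Interpretable escalation-risk layer: coefficients expose which driver
# (exposure growth, response delay, buffer loss) carries the risk.
risk_model = LogisticRegression().fit(X, y)
print(dict(zip(["exposure_growth", "response_delay", "buffer_collapse"],
               risk_model.coef_[0].round(2))))
print("escalation risk:",
      risk_model.predict_proba([[0.25, 10.0, 0.35]])[0, 1].round(2))
```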
A systems-thinking essay that reframes failure as a gradual transition rather than a discrete outcome. It explains how pressure accumulation, weakening buffers, and hidden instability precede visible collapse, and why prediction-based models arrive too late to prevent failure in human-centered systems.
An interpretable early-warning engine that detects academic instability before grades collapse. Instead of predicting performance, it models pressure accumulation, buffer strength, and transition risk using attendance, engagement, and study load to explain fragility and identify high-leverage interventions.
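A minimal sketch of the pressure/buffer accounting such an engine might perform, using hypothetical weekly signals for attendance, engagement, and study load; the update rules, weights, and threshold below are illustrative assumptions, not the project's actual formulas.

```python
from dataclasses import dataclass

@dataclass
class WeeklySignal:
    attendance: float   # 0..1 share of sessions attended
    engagement: float   # 0..1 normalized engagement score
    study_load: float   # 0..1 normalized assigned workload

def fragility_trace(weeks, decay=0.8):
    """Accumulate pressure (load not absorbed by engaged attendance), erode a
    buffer replenished by attendance, and flag weeks whose transition risk
    (pressure relative to remaining buffer) crosses an illustrative threshold."""
    pressure, buffer = 0.0, 1.0
    trace = []
    for w in weeks:
        absorbed = w.engagement * w.attendance
        pressure = decay * pressure + max(w.study_load - absorbed, 0.0)
        buffer = max(min(1.0, buffer + 0.1 * w.attendance) - 0.2 * pressure, 0.0)
        risk = pressure / (buffer + 0.1)
        trace.append({"pressure": round(pressure, 2),
                      "buffer": round(buffer, 2),
                      "transition_risk": round(risk, 2),
                      "at_risk": risk > 1.0})
    return trace

weeks = [WeeklySignal(0.9, 0.8, 0.5),
         WeeklySignal(0.7, 0.5, 0.8),
         WeeklySignal(0.4, 0.3, 0.9)]
for row in fragility_trace(weeks):
    print(row)
```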