Abstract:
Rutting is one of the critical distresses affecting the safety and serviceability of flexible pavements. Modeling the progression of rutting remains a challenge due to its numerous interacting factors. The literature offers many empirical and probabilistic models for predicting rutting propagation. However, these models are limited by their inability to accurately simulate local conditions, their high input requirements, and their need for local calibration. Given the significance of predicting rutting for ensuring timely and strategic maintenance interventions, this study aims to develop a framework that achieves accurate rut depth predictions and quantifies the relative contribution of the different factors. The framework is characterized by low input requirements, allowing it to accommodate the data scarcity and resource limitations of local road agencies, particularly in developing countries, that are initiating their pavement management systems.
For the scope of this research, historical rutting time series, climate, traffic, and pavement design and materials data are acquired from the Long-Term Pavement Performance (LTPP) database and used to train a Deep Neural Network (DNN). A model requiring twenty-nine inputs was ultimately selected. The findings show that the developed DNN model significantly outperforms a multiple linear regression model developed using the same dataset, the mechanistic-empirical rutting prediction model provided in Pavement-ME, and the World Bank's HDM-4 model. The model estimates were further used to capture and rank the relative importance of the different variables, confirming the strong influence of traffic and climatic conditions. Generic family performance curves corresponding to specific traffic, climate, and mix design combinations are developed to further simplify the problem and assist road agencies that cannot acquire the data required to use the DNN. Because family curves introduce additional inaccuracy through mathematical simplification, an Ensemble Kalman Filter (EnKF) framework is proposed to probabilistically calibrate the family models as new measurements become available.