Gaussian process latent class choice models
| Field | Value | |
| --- | --- | --- |
| dc.contributor.author | Sfeir, Georges | |
| dc.contributor.author | Rodrigues, Filipe | |
| dc.contributor.author | Abou-Zeid, Maya | |
| dc.contributor.department | Department of Civil and Environmental Engineering | |
| dc.contributor.faculty | Maroun Semaan Faculty of Engineering and Architecture (MSFEA) | |
| dc.contributor.institution | American University of Beirut | |
| dc.date.accessioned | 2025-01-24T11:28:20Z | |
| dc.date.available | 2025-01-24T11:28:20Z | |
| dc.date.issued | 2022 | |
| dc.description.abstract | We present a Gaussian Process – Latent Class Choice Model (GP-LCCM) to integrate a non-parametric class of probabilistic machine learning within discrete choice models (DCMs). Gaussian Processes (GPs) are kernel-based algorithms that incorporate expert knowledge by assuming priors over latent functions rather than priors over parameters, which makes them more flexible in addressing nonlinear problems. By integrating a Gaussian Process within an LCCM structure, we aim to improve discrete representations of unobserved heterogeneity. The proposed model assigns individuals probabilistically to behaviorally homogeneous clusters (latent classes) using GPs and simultaneously estimates class-specific choice models by relying on random utility models. Furthermore, we derive and implement an Expectation-Maximization (EM) algorithm to jointly estimate/infer the hyperparameters of the GP kernel function and the class-specific choice parameters by relying on a Laplace approximation and gradient-based numerical optimization methods, respectively. The model is tested on two different mode choice applications and compared against different LCCM benchmarks. Results show that the GP-LCCM allows for a more complex and flexible representation of heterogeneity and improves both in-sample fit and out-of-sample predictive power. Moreover, behavioral and economic interpretability is maintained at the class-specific choice model level, and local interpretation of the latent classes can still be achieved, although the non-parametric nature of GPs reduces the transparency of the model. © 2022 The Authors | |
| dc.identifier.doi | https://doi.org/10.1016/j.trc.2022.103552 | |
| dc.identifier.eid | 2-s2.0-85122829515 | |
| dc.identifier.uri | http://hdl.handle.net/10938/27038 | |
| dc.language.iso | en | |
| dc.publisher | Elsevier Ltd | |
| dc.relation.ispartof | Transportation Research Part C: Emerging Technologies | |
| dc.source | Scopus | |
| dc.subject | Discrete choice models | |
| dc.subject | EM algorithm | |
| dc.subject | Gaussian process | |
| dc.subject | Latent class choice models | |
| dc.subject | Machine learning | |
| dc.subject | Approximation algorithms | |
| dc.subject | Benchmarking | |
| dc.subject | Gaussian distribution | |
| dc.subject | Gaussian noise (electronic) | |
| dc.subject | Maximum principle | |
| dc.subject | Numerical methods | |
| dc.subject | Optimization | |
| dc.subject | Parameter estimation | |
| dc.subject | Choice model | |
| dc.subject | Expectation-maximization algorithms | |
| dc.subject | Expert knowledge | |
| dc.subject | Gaussian processes | |
| dc.subject | Latent class | |
| dc.subject | Latent class choice model | |
| dc.subject | Latent function | |
| dc.subject | Nonparametrics | |
| dc.subject | Probabilistics | |
| dc.subject | Algorithm | |
| dc.subject | Discrete choice analysis | |
| dc.subject | Economic analysis | |
| dc.subject | Gaussian method | |
| dc.title | Gaussian process latent class choice models | |
| dc.type | Article |
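The abstract describes an EM algorithm in which a GP-based membership model and class-specific random utility models are estimated jointly. The following is a minimal, hypothetical sketch of that EM structure on synthetic data, not the authors' implementation: for simplicity, the membership model is scikit-learn's Laplace-approximation GP classifier with fixed RBF hyperparameters, refit on hard class assignments, and the class-specific choice models are responsibility-weighted binary logit models; the actual GP-LCCM jointly optimizes the kernel hyperparameters and choice parameters against the soft class memberships.

```python
# Toy EM sketch loosely following the GP-LCCM structure (all data synthetic).
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, K = 150, 2                           # individuals, latent classes
Z = rng.normal(size=(n, 2))             # membership covariates (e.g. socio-demographics)
X = rng.normal(size=(n, 3))             # choice attributes (e.g. cost, time)
true_class = (Z[:, 0] > 0).astype(int)  # ground-truth latent class (toy)
beta = np.array([[1.0, -1.0, 0.5],      # class-specific taste parameters
                 [-0.5, 1.0, -1.0]])
util = np.einsum("ij,ij->i", X, beta[true_class])
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-util))).astype(int)  # binary choice

# Initialize responsibilities (posterior class-membership probabilities)
p0 = rng.uniform(0.3, 0.7, n)
resp = np.column_stack([p0, 1.0 - p0])

for _ in range(5):
    # M-step (choice models): one logit per class, weighted by responsibilities
    choice_models = [LogisticRegression().fit(X, y, sample_weight=resp[:, k])
                     for k in range(K)]
    # M-step (membership model): GP classifier on current hard assignments;
    # optimizer=None keeps the RBF hyperparameters fixed for speed
    hard = resp.argmax(axis=1)
    if np.unique(hard).size < 2:        # degenerate: all mass in one class
        break
    gpc = GaussianProcessClassifier(kernel=RBF(1.0), optimizer=None).fit(Z, hard)
    # E-step: responsibilities ∝ GP class prior × class-specific choice likelihood
    prior = gpc.predict_proba(Z)
    lik = np.stack([m.predict_proba(X)[np.arange(n), y] for m in choice_models],
                   axis=1)
    resp = prior * lik
    resp /= resp.sum(axis=1, keepdims=True)

print(resp[:3].round(3))  # posterior class memberships of the first 3 individuals
```

The key design point mirrored here is the factorization in the E-step: each individual's responsibility is proportional to the GP-based class prior times the likelihood of their observed choice under that class's utility model, which is what ties the two model components together during estimation.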