Stochastic modeling and algorithms for the Dynamic Vehicle Routing Problem with Urgent Tasks.
We consider the problem of routing a fleet of vehicles to serve both regular, plannable tasks and dynamic, urgent tasks that arrive according to a random process. Our objective is to minimize the response time to urgent tasks while maintaining efficient operation on the regular tasks. We formulate the problem as a Markov Decision Process that can be solved to optimality for small instances. For larger instances, we develop an approximate solution approach based on Approximate Dynamic Programming and Reinforcement Learning. Our approach fully exploits the available information on the random nature of urgent tasks, which allows us to further reduce response times compared to models in the literature, among other reasons due to an improved spatial and temporal spread of vehicles over the service area. Such routing problems arise, for example, in cleaning, maintenance, and security applications.
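To give a flavor of the MDP perspective described above, the following is a minimal toy sketch (not the authors' model): a single vehicle chooses a waiting position on a line of cells, urgent tasks appear at a uniformly random cell with some probability each period, and the response cost is the travel distance. Value iteration then recovers the intuition that good anticipatory positioning (the "spatial spread" idea) minimizes expected response time. All parameters (`n`, `p`, `gamma`) are illustrative assumptions.

```python
def value_iteration(n=9, p=0.3, gamma=0.9, iters=500):
    """Toy MDP: a vehicle waits at one of n cells on a line.
    Each period, with probability p an urgent task appears at a
    uniformly random cell and incurs cost |v - t| (response distance);
    the vehicle may then move one cell left, stay, or move right."""
    # Expected per-period cost of waiting at position v.
    exp_cost = [p * sum(abs(v - t) for t in range(n)) / n for v in range(n)]
    V = [0.0] * n
    for _ in range(iters):
        # Bellman update: immediate expected cost plus discounted
        # value of the best reachable neighboring position.
        V = [exp_cost[v] + gamma * min(V[max(v - 1, 0)], V[v], V[min(v + 1, n - 1)])
             for v in range(n)]
    return V

V = value_iteration()
best = min(range(len(V)), key=V.__getitem__)
# The cost-minimizing waiting position is the central cell (the median
# of the uniform task distribution), illustrating anticipatory placement.
```

In this sketch the optimal policy drifts toward the median of the task-arrival distribution; the talk's full model generalizes this idea to a fleet, regular tasks, and learned (ADP/RL) value approximations for instances too large to solve exactly.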