Traffic Routing in the Cellphone/GPS Age

The equilibrium-based analysis of traffic routing that we studied earlier in the semester (in connection with Braess’ Paradox) assumes that the traffic delay functions are known in advance by all drivers. Drivers are assumed to be taking the same trip day after day, to experiment with alternative routes, and to develop an instinct about the associated delay functions.

In reality, drivers not only do not know the traffic delay functions but may not know all of the route alternatives available to them. The popularity of GPS devices is an indication that drivers are interested in better guidance. But what happens to traffic flows as this guidance becomes available?

An Intelligent Transportation System (ITS) is “a combination of electronics, telecommunications and information technology to the transportation sector for improving safety and travel times on the transportation system,” according to the Michigan Department of Transportation. While in the past such systems have consisted of speed detectors in the roadway, central computers, and electronic signage to tell drivers which way to go, the advent of cellphone-based GPS systems suggests a different, less expensive, and possibly more efficient infrastructure. Drivers can use the GPS feature to compute shortest routes, while cell-based communication lets the impact of traffic on route times be reported back to the GPS route computation, potentially resulting in different routes that take these real-time traffic reports into account. However, the traffic volume can change as the route is traversed (partly as a result of multiple travelers getting the same local route directions), so a predictive mechanism is also required.

On a recent trip I experienced the current, limited state of the art. I needed to travel from a point in northern New Jersey to a northern suburb of Philadelphia. Google Maps, which I can access from my cell phone, showed the fastest route to be via the Garden State Parkway, the New Jersey Turnpike, and the Pennsylvania Turnpike. However, I knew from experience that at some times of day various segments of this route become congested, which called into question the optimality of the route Google recommended.

In fact, as my departure time neared, Google Maps reported increasing delays on the route it recommended, but was not able to recommend an alternative route. I knew of a “Shunpike” route via state highways and forced Google Maps to compute the travel time for this alternative (by specifying an intermediate point). By departure time, the Shunpike travel time was already shorter than that of the Turnpike route Google continued to recommend! Clearly, Google Maps does not take traffic delays into account in making its recommendations. In this respect it is no better than a standalone GPS system: the communications capability and the central availability of data are not being fully exploited.

One might expect that Google will eventually integrate its route calculations with its traffic measurement capabilities. It probably gets its traffic measurement data today (through intermediaries) from real-time monitoring by inductive loops in the road, but it seems increasingly likely that the best traffic delay information will eventually be deduced from real travel speeds recorded by individual GPS units and reported to a central computer (it is uncanny how reliably GPS units report current speed). Such reports would cover a much greater fraction of all roads and would be very current. This idea is currently being tested by Nokia, not Google; see http://www.latimes.com/news/local/la-me-gpscars9feb09,0,4765729.story. The article reports that some transportation officials are skeptical, one claiming that people will find uncongested alternatives on their own.

Note that if individual GPS units are to compute traffic-based fastest routes, data on travel speeds for all pertinent links must also be downloaded or broadcast to the unit in the car. This suggests that route computations might better be carried out centrally, based on the most current traffic speed estimates garnered from GPS-equipped cellphones.

Even with current link speed data, however, the system described so far might not produce the best routes. In the example given above, I was able to confirm that delays continued to build up on the Turnpike route during the course of the day. Had I taken that route, I would have encountered delays even greater than those predicted at my starting time. And delays might have been building up on the Shunpike route as well. Clearly, a predictive mechanism is needed to estimate future travel times accurately and compute optimal routes. (Contrast this with the fact that my primitive standalone GPS is not able to learn and predict from my own speeds on the routes that I take even in the absence of traffic: it greatly underestimates the typical speeds on non-Interstate roads, nearly always recommending that I take the Interstate when non-Interstates get me there much sooner.)
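To see why a snapshot of current delays is not enough, consider a link whose delay is trending upward: by the time you reach it, the delay will exceed what was reported at departure. The simplest conceivable predictor just extrapolates the recent trend. This is only a toy sketch; the sample values, timestamps, and linear extrapolation are my own illustration, not any deployed system's method.

```python
# Sketch: why a current-delay snapshot underestimates a worsening link.
# Samples are (minute, observed delay in minutes); the last sample is "now".
# All numbers here are invented for illustration.

recent = [(0, 20.0), (10, 24.0), (20, 28.0)]

def predicted_delay(samples, minutes_ahead):
    """Linearly extrapolate the per-minute trend from the two newest samples."""
    (t0, d0), (t1, d1) = samples[-2], samples[-1]
    slope = (d1 - d0) / (t1 - t0)  # delay change per minute (0.4 here)
    return d1 + slope * minutes_ahead

snapshot = recent[-1][1]                # delay reported right now
forecast = predicted_delay(recent, 30)  # expected delay when we arrive in 30 min
print(snapshot, forecast)
```

With the delay climbing 0.4 minutes per minute, the 28-minute snapshot becomes a 40-minute forecast half an hour out — exactly the gap I would have experienced on the Turnpike.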

Link travel time prediction is the subject of a paper from the University of Michigan ITS Research Center of Excellence, published in IEEE Transactions on Intelligent Transportation Systems (see http://ieeexplore.ieee.org/Xplore/login.jsp?url=/iel5/6979/18819/00869017.pdf?temp=x via the Cornell library web site for the full paper). This paper proposes an approach that simulates traffic flows based on reported link travel times, computes optimal routes based on these travel times, updates the link travel times, and iterates to convergence, which the paper argues will occur. The approach includes a backdating of simulated link travel times that is claimed to overcome the inaccuracy resulting from the fact that link times are inherently out of date by the time they are used.
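The iterate-to-convergence idea can be illustrated on a toy network. The sketch below is not the Michigan paper's algorithm (which involves full traffic simulation and backdating); it is the classic method of successive averages on a made-up two-route network with invented delay functions, showing how repeated reassignment and damping settle onto an equilibrium where the two route times are equal.

```python
# Toy sketch of iterating route assignment to convergence, using the method
# of successive averages on two routes with invented linear delay functions.

def delay_a(x):  # travel time on route A as a function of its flow
    return 10 + 2 * x

def delay_b(x):  # travel time on route B
    return 15 + 1 * x

TOTAL = 10.0    # total flow between the origin and destination
flow_a = TOTAL  # start with everyone on route A

for k in range(1, 200):
    # all-or-nothing step: assign everyone to the currently faster route
    target_a = TOTAL if delay_a(flow_a) < delay_b(TOTAL - flow_a) else 0.0
    # successive averages: the shrinking step size 1/k damps the oscillation
    flow_a += (target_a - flow_a) / k

# At equilibrium the two route times are (approximately) equal: ~5 on each
# route, ~20 minutes either way.
print(flow_a, delay_a(flow_a), delay_b(TOTAL - flow_a))
```

The equilibrium here is flow 5 on each route (10 + 2·5 = 15 + 5 = 20); without the damping step, the assignment would flip everyone back and forth forever — the instability discussed next.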

The ITS literature, and the Michigan paper in particular, distinguishes between centralized and decentralized route guidance systems. A key problem that must be overcome with any intelligent routing system is the ‘all or nothing’ nature of routing advice. Think of a widely heeded helicopter traffic report that broadcasts delay reports on one Hudson River crossing into Manhattan, thereby directing traffic to another crossing that quickly becomes congested, while travel actually becomes faster on the crossing that was reported to be congested.
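The helicopter-report instability is easy to simulate. In the toy model below (my own illustration, with an invented delay function), every driver hears yesterday's report and takes whichever crossing was reported faster, so congestion flip-flops between the two crossings day after day.

```python
# Toy simulation of 'all or nothing' routing advice: each day, all drivers
# pile onto whichever crossing was reported faster yesterday.
# The delay function and numbers are invented for illustration.

def delay(flow):
    return 10 + flow  # minutes; the same function for both crossings

TOTAL = 100
flow_a = 100  # day 0: everyone is on crossing A
history = []
for day in range(6):
    report_a, report_b = delay(flow_a), delay(TOTAL - flow_a)
    history.append((flow_a, report_a, report_b))
    # the broadcast report sends everyone to the crossing reported faster
    flow_a = TOTAL if report_a < report_b else 0

for flow, a, b in history:
    print(f"A carries {flow:3d}: A reports {a} min, B reports {b} min")
```

The flow on crossing A alternates between 100 and 0 indefinitely; the report is always wrong by the time drivers act on it. A 50/50 split would give both crossings a 60-minute delay and stay put.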

According to the University of Michigan article, “Allocation of routes under high market penetration is projected to be more stable under a centralized architecture because the route guidance service provider can control more precisely the number of vehicles routed onto a specific route.” For example, a system in which routes are calculated centrally can randomly direct different motorists to take different routes between the same pairs of locations, thereby coming closer to network equilibrium. The Michigan authors claim that broadcasting link delays based on their simulation would bring about equilibrium in a decentralized way, with individual GPS units computing optimal routes based on the predicted link delays (essentially replicating what the central simulation has accomplished in advance).
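Randomized central allocation is simple to sketch: if the provider knows the equilibrium split for an O-D pair, it can recommend routes probabilistically so that the aggregate flow matches the split, rather than sending every requester to the momentarily fastest route. The split fractions below are illustrative assumptions, not figures from the paper.

```python
# Sketch of centralized randomized routing: recommend each route with
# probability equal to its (assumed, illustrative) equilibrium share.

import random

EQUILIBRIUM_SPLIT = {"A": 0.5, "B": 0.5}  # fraction of O-D flow per route

def recommend(rng):
    """Pick a route for one requesting driver in proportion to the split."""
    routes = list(EQUILIBRIUM_SPLIT)
    weights = [EQUILIBRIUM_SPLIT[r] for r in routes]
    return rng.choices(routes, weights=weights)[0]

rng = random.Random(42)  # fixed seed so the sketch is reproducible
recommendations = [recommend(rng) for _ in range(10_000)]
share_a = recommendations.count("A") / len(recommendations)
print(share_a)  # close to the 0.5 equilibrium share
```

No individual driver can be told “split yourself,” but over many requests the aggregate lands on the equilibrium split — the precision the centralized architecture is claimed to provide.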

If a routing system that uses real time data compiled from cellphone-based GPS units to recommend routes becomes widely used, a centralized approach does have an additional advantage that I have not seen mentioned in the literature: if a large number of drivers request routes, the centralized system will have a basis for estimating intended travel between pairs of points, usually called origin-destination (O-D) information. Getting such O-D information is ordinarily quite difficult, because measured link traffic volumes may not be sufficient to accurately compute it (given O-D information, link volumes can be computed by matrix multiplication, but it may not be possible to invert the matrix to retrieve O-D information from link volumes). Although some drivers might end up cancelling or delaying their travel plans once they see the optimal time predictions, the raw data can probably be adjusted to yield good estimates of traffic loading when combined with the simulation.
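The inversion difficulty is concrete: the map from O-D demands to link volumes is linear, but when there are more O-D pairs than monitored links it has no inverse, so different demand patterns can produce identical link counts. The tiny network below (three O-D pairs, two links) is made up to illustrate this.

```python
# Why O-D flows cannot generally be recovered from link counts: two different
# O-D demand vectors below yield identical volumes on both monitored links.
# The network (3 O-D pairs, 2 links) is invented for the example.

# link_use[i][j] = 1 if O-D pair j's route traverses link i
link_use = [
    [1, 0, 1],  # link 1 carries pairs 1 and 3
    [0, 1, 1],  # link 2 carries pairs 2 and 3
]

def link_volumes(od):
    """Matrix-vector product: O-D demands -> observed link volumes."""
    return [sum(row[j] * od[j] for j in range(len(od))) for row in link_use]

od_1 = [3, 4, 2]  # 3 drivers on pair 1, 4 on pair 2, 2 on pair 3
od_2 = [5, 6, 0]  # a different demand pattern...

print(link_volumes(od_1))  # [5, 6]
print(link_volumes(od_2))  # [5, 6] -- same counts, different O-D flows
```

Loop detectors see only the `[5, 6]` column of counts and cannot tell these two demand patterns apart; route requests sent to a central server reveal the O-D pairs directly.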

It remains to be seen whether drivers would be willing to report their travel plans, in the form of requests to Google Maps or the equivalent. Here again, the issue of privacy that permeates our course can be raised. However if enough drivers are willing to sacrifice privacy in return for excellent routing recommendations (and confidentiality promises from Google), traffic flow through existing networks and steps to eliminate bottlenecks might well be improved. Here in the 10 square miles of Ithaca we are fortunate that we do not need to worry much about traffic, but when driving in the surrounding reality, technology to beat traffic congestion would be highly welcome.

STOP PRESS: While this post is speculative and attempts to predict future developments, shortly after it was posted the Wall Street Journal published an article about a GPS system from Dash Express (http://www.dash.net/) that has many of the characteristics discussed here, though it may lack the predictive capability. The Wall Street Journal article is linked from the company’s web site.

Posted in Topics: Education, Mathematics, Technology
