Good afternoon or good morning to those of you to the West. Welcome to the Talking Freight Seminar Series. My name is Jennifer Symoun and I will moderate today's seminar.
Today's topic is Non-Traditional Freight Data Sources.
Please be advised that today's seminar is being recorded.
Before I go any further, I do want to let those of you who are calling into the teleconference for the audio know that you need to mute your computer speakers or else you will be hearing your audio over the computer as well.
Today we'll have three presenters, Crystal Jones of the FHWA Office of Freight Management and Operations, Dale Tabat of the Washington State Department of Transportation, and Alon Bassok of the Puget Sound Regional Council.
Crystal Jones joined FHWA's Office of Freight Management and Operations in October 2003. Prior to joining FHWA, Crystal worked for the Department of the Army for 11 years, where she held several positions in transportation and logistics, including an assignment with the Office of the Deputy Chief of Staff for Logistics at the Pentagon.
Crystal's primary area of expertise is with freight technology and operations. Crystal also has extensive experience in the areas of programming and budgeting and strategic and performance planning.
Crystal holds a Bachelor's Degree in Industrial Technology with a concentration in Computer Science from Elizabeth City State University, and a Master of Science in Administration from Central Michigan University. She is the program manager for FHWA's Freight Performance Measurement initiative. Dale A. Tabat is the Truck Freight Program and Policy Manager for the Washington State Department of Transportation. His areas of focus are integrating truck freight services within the department, truck parking, truck freight data, research, and policy.
He also serves as the division's liaison with the Washington Trucking Association and legislative staff. Dale has over 30 years of management experience in the transportation sector, working in operations, sales, planning, and pricing positions for numerous companies including U.S. Xpress, Federal Express, Roadway, Rollins Dedicated Services, and Pacific Intermountain Express.
He also served on the boards of directors of the California and Utah trucking associations.
Alon Bassok works for the Puget Sound Regional Council, where he is a freight economics analyst. He is responsible for coordinating region-wide traffic data collection and has extensive experience working with truck-related GPS data. He has taught courses on sustainability in transportation for the University of Washington's Department of Urban Planning,
and received his Ph.D. from the University of Washington in 2009.
I would like to go over a few logistical details.
The seminar will last 90 minutes, with the final 30 minutes reserved for questions and answers. If during the presentations you think of a question, you can type it into the chat area. Please make sure you send your question to everyone,
and indicate which presenter the question is for. Presenters are unable to answer questions during the seminar.
The operator will give instructions on how to ask questions over the phone. You can use the freight planning listserv or the chat box.
The session is being recorded, and a file containing the audio and visual portions of the seminar will be posted to the Talking Freight website in the next several weeks.
The presentations will also be available online, along with the recording and transcript; I will notify all attendees once they are available.
This seminar, as well as all Talking Freight seminars, is eligible for continuing education credits. To obtain credit you must have logged in with your first and last name, or type your name into the chat box.
More instructions on obtaining credits are in the file share box.
Download the evaluation form, fill it out, and submit it to me.
In addition, even if you are not applying for credits, it would be great if you would fill out the evaluation form. I notice it's not in the box; I will get it uploaded once we get started.
Today's topic is Non-traditional Freight Data Sources.
Crystal Jones is our first presenter.
If you have questions, type them into the chat box and they will be answered at the end of the seminar.
With that, Crystal, I will bring up your presentation.
Crystal Jones: Good morning everyone. Thank you for joining.
I think by the end of the presentation you will find it was well worthwhile participating.
I will go over a national initiative we have here at Federal Highway. It's not focused so much on the data itself, but on the process and institutional challenges we faced getting the initiative started,
and the lessons learned with regard to establishing a partnership with the private sector to obtain a data source with long-lasting value for the transportation community.
I am from Federal Highway's Office of Freight Management and Operations. We have a wide range of activities and mission areas. Primarily, the work we do focuses on understanding the magnitude of freight moving on the system, strategies,
analytical tools, institutional arrangements and understanding, promoting freight movement, encouraging innovation, and housing the truck size and weight enforcement program within the Department of Transportation.
There have been several reauthorization proposals and positions put forth by organizations such as AASHTO, and the department had a reform strategy it put forth during the last administration. While they differ in some aspects,
there are common themes that come forth in most proposals with regard to freight. Listed here are some of those key themes: defining the federal role in goods and freight movement; incorporating freight performance and accountability,
which is the subject of the presentation I'll be giving today, getting better data; promoting better management of existing assets;
employing multiple funding sources for transportation projects and programs; and linking policy and funding to the environment and energy sectors.
What kind of freight data do we typically need as transportation decision makers?
Obviously there's a myriad: from infrastructure data, to knowing the key corridors and gateways we should invest in to prepare for the future, which requires forecasting; to understanding transportation and freight movement today
and being able to think about investments as a longer-term strategy.
How does freight movement affect congestion, and how does congestion affect freight movement? We need to understand how the system performs and operates, as well as the policy and regulation side.
We have competing priorities: what are the best investments to make, and how do we allocate scarce funding across different areas and programs?
The freight data challenge: I am sure most of us on the teleconference today who work in the freight area have been challenged with coming up with the right type of data and with harmonization among definitions.
There may be a data source, but not at the level needed for a particular modeling effort. Harmonization is one challenge; another is using available data as a proxy for the needed data, for example using the flow of trade for geographical flows,
or export ratios to come up with values for weight ratios, et cetera.
There's also the lack of an authoritative data source, meaning there is no national freight data source we can point to and say, this is the authoritative source on which to make decisions.
The initiative started in 2003. We consider it to be a public/private partnership; most of the data and information we need on freight movement resides in the private sector.
When you talk about getting to the most realistic and accurate data, it becomes necessary to get the source data from where the data originated.
This leads us to think that maybe public/private partnerships offer an opportunity for better freight data.
What are the challenges? In the effort we set up, there were going to be challenges associated with the legal, confidentiality, and proprietary nature of the data, particularly if you get to the point of negotiating a contract.
In the next slides I will discuss the challenges and opportunities we discovered in our initiative.
A lot of the work in the freight performance measurement initiative started out because at Federal Highway we have a strategic goal of system performance, saying we want to be able to provide safe, reliable,
effective, sustainable mobility for all users.
From a freight perspective, for the freight users of the system, we wanted measures and data to depict how effectively freight is moving on the highway.
That was our initial goal and purpose for pursuing this new freight data we have now acquired through a public/private data partnership.
The compelling reason for private industry to participate: we feel the private sector wanted to share data purely because they share the common interests we have in improving operations, increasing capacity,
and garnering funding to support programs. A lot of their reasoning for participating in this initiative was that they believe we will use the data to make better transportation decisions.
What is the freight performance measurement initiative? It started in 2003 as a contractual relationship between FHWA and the American Transportation Research Institute, which holds nondisclosure agreements with the GPS and satellite data providers,
to measure travel time and speed for the highway system. In a typical month we have 500,000 trucks equipped with GPS and satellite equipment nationwide.
We use that data, the location and date-time stamps, to measure speed and reliability for 25 interstate highways with significant freight movement.
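The basic derivation described here, turning location reads and date-time stamps into speeds, can be sketched as below. This is a minimal illustration only, not the actual FPM processing; the function names and the simple pairing of successive pings are assumptions for the example.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def segment_speeds(pings):
    """pings: time-ordered list of (timestamp_seconds, lat, lon) for one truck.
    Returns the average speed in mph implied by each pair of successive reads."""
    speeds = []
    for (t1, la1, lo1), (t2, la2, lo2) in zip(pings, pings[1:]):
        hours = (t2 - t1) / 3600.0
        if hours > 0:
            speeds.append(haversine_miles(la1, lo1, la2, lo2) / hours)
    return speeds

# Two reads 30 minutes apart, roughly 25 miles of travel along a highway
pings = [(0, 41.88, -87.63), (1800, 41.88, -87.14)]
```

Aggregating these per-pair speeds over many trucks and many days on a given corridor is what yields corridor-level travel time and reliability measures.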
In addition to doing the work at the national level, we worked with several universities at the state and local levels on five case studies throughout the U.S.: Portland State, the University of Minnesota,
the Texas Transportation Institute (two different work groups within that institute), and the University of Wisconsin.
The purpose was to see how you could use the data in other transportation areas.
What Portland State did was look at recurring and non-recurring congestion and their impact on the metro area. The point of our research at this stage is to go beyond using the data at the national level to derive measures of speed, travel time,
and reliability, as shown in the map, and figure out how states and MPOs could use this data to support their freight business areas.
The University of Minnesota used archived truck data for freight performance between Chicago and the Minneapolis-St. Paul area.
The Texas Transportation Institute looked at national measures. Through the research with the universities we have shown the data has applicability below the national level.
Another application we explored for this data is measuring border crossing performance.
The Texas Transportation Institute used the same GPS data we use to measure travel time reliability on the interstate system to derive measures of border crossing time and performance in Texas, specifically El Paso and Laredo.
They overlaid two months of GPS data on a busy crossing in the Laredo World Trade area and showed you could use the same type of data to understand the border crossing process, not only from a total crossing time perspective,
but for segments within a border crossing; for instance, how long it takes from the Mexican export lot to primary customs.
One of the most interesting uses of this data has been by the University of Wisconsin, which is looking at measures having to do with network resiliency. They used the travel time data, average speed data,
and the number of events observed before and after two severe weather incidents to come up with measures of robustness, the ability of a highway to withstand an incident, and rapidity, the ability to restore service after an incident.
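As a rough illustration of what such resiliency measures might look like, here is a sketch using hypothetical definitions of robustness and rapidity. The talk does not give the University of Wisconsin's actual formulas, so the thresholds and the function below are assumptions for illustration only.

```python
def resiliency_measures(speeds, baseline, incident_start, recovery_frac=0.9):
    """speeds: hourly average truck speeds (mph) on a corridor.
    baseline: the pre-incident average speed.
    Illustrative definitions:
      robustness = worst observed speed after the incident, as a share of baseline
      rapidity   = hours from incident start until speed recovers to
                   recovery_frac * baseline (None if it never does)."""
    after = speeds[incident_start:]
    robustness = min(after) / baseline
    rapidity = None
    for hours, s in enumerate(after):
        if s >= recovery_frac * baseline:
            rapidity = hours
            break
    return robustness, rapidity

# Speeds drop from ~60 mph to 20 mph at hour 3 and recover a few hours later
speeds = [60, 61, 59, 20, 35, 48, 57, 60]
r, t = resiliency_measures(speeds, baseline=60.0, incident_start=3)
```

The appeal of the GPS data here is that the same archived speed series supports both before-and-after comparisons and these derived resiliency statistics.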
Now, what are the lessons we learned through this effort?
Some of the key lessons regard getting started. If you want to pursue this path of the public sector getting access to private sector data for use in freight planning or operations,
the first thing you should do when you get started is be knowledgeable of the regulations, on both the public side and the private side, that restrict or facilitate access to the data. For instance,
when we started, one of the main issues was how to keep the data anonymous, so that it doesn't disclose particular information about a trucking company. There will be time and patience required.
We started with a select few trucking companies and a few interstate highways.
Through the positive feedback from the trucking companies that participated, we were able to work out agreements giving us access not only on a per-company basis, but to entire fleets. The data obtained through these arrangements
is used to calculate speed, travel time, and reliability. Now,
instead of having individual arrangements with individual trucking companies, we have arrangements with large technology vendors that provide access to the entire fleets of the companies that subscribe to their particular product.
Probably one of the big lessons we had to learn in getting started was that this type of effort produces massive amounts of data, terabytes and terabytes. We had to carefully think through how to handle, collect, store, and process the data,
and collect what we need as opposed to asking for the world and not having the ability to collect, process, or analyze it.
There are reasons a vendor would say no; I will list some of the top concerns trucking companies and participants had in the beginning.
The primary thing we heard is that they are afraid of being burned by bad publicity or regulation. There's a thought process that preparing the data would put a burden on already overworked staff. They are worried about improper use
and mishandling of the data.
For instance, could GPS data be used to set up speed traps? Even though that's a far-fetched one, those are the types of concerns we had to deal with,
and we had to assure them these issues could be addressed in a way that didn't harm their business.
There was also concern that the source data would be modified by the government to tell the story it wants to tell, not necessarily what the data says. Because there is handling and processing that takes the raw data and uses it to derive secondary measures,
the concern was: does modifying the data really tell the right story? Those were the types of issues we had to address to get them to want to participate.
Some of the things we used to get them to say yes: the first was the quid pro quo, meaning they get something in return for allowing us to use the data. As I said at the start,
our first story was that we will use this data to hopefully come up with better freight data, allowing us to make better investment decisions. That was the original going-in proposition.
For this particular project there is financial reimbursement.
A lot of times the quid pro quo could just be financial reimbursement.
There is a cost, but considering the amount of data, we think the benefits far outweigh the price we pay for access to the data.
Have examples ready of how it could benefit them. For instance, as I showed in an earlier slide, there are dedicated freight programs and freight funding that could come out of access to this data.
If there's not a good story to tell on how it could benefit them, there should probably be a good story on how they couldn't be harmed, meaning there's nothing proprietary about the data and it's anonymous.
If you don't have a good story line behind how it could benefit them, at least have a good story behind how it won't harm them.
To counter the "no" about processing the data being a burden: for our program, a lot of the processing comes after we receive the raw data, and assurances are put in place that there is an agreed-upon process,
and what the processes will be, so they have assurance the data will be anonymous, with no proprietary information.
We take on a lot of the burden of processing the data and minimize the impact on them in terms of packaging the data to provide us access to it.
There's also the concept of peer pressure, whether in other countries or in a state partnership you try to arrange:
always putting forth the notion that other people are doing it, and showing examples of how it has been successful.
For our initiative, these were the main concerns, the reasons they should say yes, and how we went about addressing them.
The main concerns the trucking industry specifically had with regard to giving access to GPS data were civil litigation, with trial lawyers using the data through subpoenas, and the competitive, proprietary nature of the data:
making sure you don't pinpoint a particular trucking company moving out of a particular city, or give data on their market share to their competitors.
Then there's the whole idea of government access. A lot of that is addressed because we at Federal Highway don't take possession of the data; it's managed by a third party.
Access issues are handled by having a third party, and we get secondary data products as opposed to raw data.
I will close by saying one thing I pointed out was having examples of success stories, things that worked in the past. Federal Highway is completing a research project on a freight data compendium:
a compendium of examples of public/private data partnerships, instances where the private sector shared data with the public sector. That research initiative is wrapping up now.
The compendium should be available on our website and will give examples of who the partners were, the arrangements, the processes to get into the arrangements, some of the lessons learned, et cetera.
It will be available probably in the next month or so on our website.
Lastly, I will put this link up. I thought I would be able to do a short demo of a tool we developed called FPM, for freight performance measurement. This web link
gives access to a web tool based on the travel time data for the 25 interstates shown in the map earlier. Jennifer will make this web link available in a chat pod before the end of the presentation,
and we invite you to log on to the website and explore the data to see how it might be useful to your organization.
This is my vision of what the end state of quality freight data would be. I am sure most of you will agree, but I won't read it to you.
I think that is it for me.
Thank you, Crystal. I put the link in the chat pod for anybody interested. I will now turn it over to Dale Tabat.
Dale Tabat: Good morning from rainy Seattle.
I am the truck freight policy manager for the Washington State DOT. On my title slide you see the name of Ed McCormack. Ed is my partner in this venture at the University of Washington; he's our technical expert. Fortunately for him,
he's on vacation this week, but if you have technical questions I can't answer we will send them over to Ed for an answer.
The truck freight performance measure research project was a joint partnership between TransNow and Wash DOT, with a lot of cooperation from the Washington Trucking Association.
They were essential in getting our funding in order to move forward with this project.
Just some quick background here: in 2007 the Washington State Legislature appropriated money for the project, and we were able to secure federal funding via Washington TransNow, along with some in-kind money from them.
It was put together in a bill in the budget that directed us to track truck movements in the Puget Sound area and see if we could take that data and develop it into a GIS map-type format.
The legislature appropriated $324,000 of the $428,000, which was reappropriated. The reason for that is that the original program was supposed to end in July of 2009,
but because of the inability to get GPS data we had to change the direction we were going in. Originally the thought process was that we could get the data directly from trucking companies in the area,
but for some of the reasons that Crystal talked about earlier, we wound up going to GPS third-party providers and buying data directly from them.
The providers we have been buying data from are getting data from [indiscernible] Turnpike, owned by Zeta; ATRI has also supplied data, as well as some Qualcomm data from a fleet outfitted with Qualcomm instrumentation.
One reason we wanted to put this program together was to look at how it benefits the state of Washington.
There were three top reasons why we needed to do this program. Number one is future federal freight funding requests.
All the bills we're seeing and hearing about in the reauthorization talks include a freight component, and within that component is a request for performance measurement data.
Number two is to increase the department's accountability to our citizens, or actually to our customers, as we look at them.
It gives us the ability to look at a project before, during, and after construction, and be able to say yes, it's doing what we told you it was going to do.
It also gives us the ability to look at where we have existing bottlenecks, choke points, or speed slow-downs, so that we can put the amount of money, which is now limited, where it gives the biggest bang for the buck.
Number one, future federal freight funding requests: as we said, everything we hear on reauthorization is about performance measurement. So we have looked at three issues: travel time, reliability, and access.
The project has shown we can accurately track truck travel times and look at the reliability of the network with onboard GPS. This map gives us a zonal look at the number of trips in and between zones.
We chose zones for another reason that Crystal gave: so we aren't drilling down to a location site. Once again, carriers get very nervous if that data is being put out in the public realm
where their competition can see it, or where it could possibly be used to punish them or change route movements.
On increased public accountability to our citizens, our customers: we decided to do a proof of concept.
We had a construction project last year in May on the I-90 floating bridge over Lake Washington.
We were replacing the expansion joints on the center roadway in the May time period. So we had the idea: let's track trucks on that.
We were able to actually look at differences before, during, and after construction, and see that there were changes in the traffic. We had the ability to look at both directions, the eastbound
and westbound sides. We haven't analyzed what this means; it was just a proof of concept that we could look at changes in truck speed based on a construction project.
Another thing: we were able to take that road segment, as you see here on the map, and actually look at the spot speeds that exist on that road.
You see a slow-down here of 25 miles per hour or less on this curve and on the bridge; this was information for a week in January.
Now, people in Seattle will tell you we already know there's a slow-down on this portion of I-90, which is great. But if you are using this to get money for a project from the federal government,
presenting this to a representative from Illinois, well, they don't know what happens at the junction of I-90 and I-5, this roadway here. We have a slow-down, a choke point, at that particular part of the roadway.
To go further, we looked at a year's worth of data and were able to see there's a variance in average truck speed westbound, and a larger variance eastbound, and asked why that happens.
This piece of road has a distinct curve that causes trucks to slow down when they go through it, but the other thing is that as they come down at this point there's a set of express lane on-ramps;
when they are open, they cause a slow-down in traffic going into the Mt. Baker tunnel, which is one of the reasons we are seeing a slower speed there. The average of all vehicle speeds comes off of our loops in the roadway.
Another study the university did with Ed McCormack used GPS data a few years back. They noticed that when the loop system showed free flow,
truck speed from the GPS was 10 to 15 miles per hour less than the loop speeds. They sent out students with their cars to follow the vehicles and found the trucks were actually traveling slower than the surrounding traffic. The reasons for that are hills
and congestion on the ramps: trucks are over in the right-hand lane, so they are slowing down for traffic entering or leaving the roadway.
That does cause a variance in speed.
This is just another presentation we can do of the data.
We can break it down in a pie graph, which gives a better visual. You can see that in the bottleneck area the slower speeds are more dominant than on the straighter areas of the system.
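The kind of breakdown behind a pie graph like this, bucketing spot speeds into ranges, can be sketched as below. The bin edges and labels here are illustrative, not WSDOT's actual categories.

```python
from collections import Counter

def speed_bins(spot_speeds, edges=(25, 45, 60)):
    """Bucket spot speeds (mph) into ranges suitable for a pie graph,
    e.g. <=25, 26-45, 46-60, and >60 mph. Bin edges are illustrative.
    Returns each bin's share of the observations."""
    labels = ["<=25 mph", "26-45 mph", "46-60 mph", ">60 mph"]
    counts = Counter()
    for s in spot_speeds:
        if s <= edges[0]:
            counts[labels[0]] += 1
        elif s <= edges[1]:
            counts[labels[1]] += 1
        elif s <= edges[2]:
            counts[labels[2]] += 1
        else:
            counts[labels[3]] += 1
    total = len(spot_speeds)
    return {label: counts[label] / total for label in labels}

# A bottleneck segment shows a larger share of observations in the slow bins
shares = speed_bins([22, 24, 40, 55, 58, 62, 63, 65])
```

Comparing the share of slow-bin observations across segments is one simple way to make the "slower speeds are more dominant at the bottleneck" comparison quantitative.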
One of the things we were asked to look at: we can look at free-flowing interstates, but can we actually start looking at ramps? This is another proof of concept. We had the ability to go out there using this GPS data,
plot the movements of trucks on the ramps, and hopefully get spot speeds to look at ramp and interchange performance on the network.
So, there are benefits, and costs, to monitoring truck performance on the state-wide network. The advantage is it actually gives us truck speeds that we really have no other way to obtain.
We do not have a system that differentiates a truck from a car on the freeway system. We can actually look at those trucks and provide that data to our trucking companies and shippers, so they have a better idea of the delays and stops
on specific routes, which helps them operationally hold a truck back or move it out earlier in the day to avoid the congestion spots.
The big advantage is that the data is out there right now. We are buying it from third parties. I will tell you, in a lot of instances the third parties didn't even think of this. If you want to look at it this way,
this is data waste. This is data they are storing for their customers after it's generated; unless the customer comes back to request it, it's really not used.
The limitation on this is that you need to be researching it at all times. You need the analysis going on. It's not a performance program that you can put out there and automatically let run; it needs to be managed.
One of the other problems we've seen is that while it works very well on heavily traveled road segments,
you need a lot of GPS data on local roads to analyze their performance.
You also need reads that are short in the time between them. Our average reads right now are around 15 minutes apart; ideally we would like to see two to three minutes.
Currently in the central Puget Sound we are utilizing information from about 2,500 trucks.
While that sounds like a lot, spread out over a couple hundred square miles it gets a little spotty in places.
The more vehicles' information you can capture, the better off you will be.
As for our next steps, we went back to the state legislature in January and made this presentation to them,
telling them and showing them what we had done, what we're able to do, and why we need to continue this, especially to justify federal funding and support our state freight investment decisions.
We gave them three options: one was to simply maintain the program in the central Puget Sound; the second was to increase to state-wide coverage of high-volume highways; and the third was to cover both high-volume highways and local truck corridors.
In the supplemental transportation budget that came out this year, the state legislature gave us an additional $122,000 to continue the program through June of 2011, and also changed the language, expanding the area.
We are now able to look at the whole state of Washington rather than just the central Puget Sound.
Getting the state-wide coverage will help us develop benchmarks and performance measures and measure freight corridors in areas outside the Puget Sound, down to the metropolitan areas of Spokane and Vancouver, Washington.
We will look at urban segments, and it gives us the ability to look at border crossings, which are very important to the economy of the state of Washington.
Then finally, on our GPS data, we have talked to the vendors about getting additional data and will have them expand it out 100 miles around the state. We will have a buffer zone of 100 miles into Idaho and Oregon
to see trucks originating outside of the state of Washington and where they are coming from.
Finally, on more of the state-wide performance measurements: we will be able to take this program and tailor it to different organizations and companies, and look at particular segments of the network.
The more data we can collect over a period of time, the better our measurements will be. It will also help us support and validate a state-wide forecasting model. One thing we're working on at Wash DOT is funding a commodity flow survey
and a map so we can understand where the freight is moving within the state of Washington and what routes it's taking, and better inform our analytical tools.
Finally, one aspect the university is working on is putting this into an Internet framework where organizations can go online and look at the data between two zones: the truck travel
and performance measurements on the routes between those zones at different times of the day and days of the week. And there is my contact information,
and Ed's, so you can send us questions.
Also, Jennifer mentioned I am the chair of Panel 31.
That is a panel of the National Cooperative Freight Research Program, and we are in the process of getting proposals in to build a guidebook to transportation data management.
It basically goes hand in hand with what Crystal was talking about: using it as a guidebook on getting data from the private sector and overcoming those obstacles.
Thank you for your time.
Thank you, Dale. We're now moving to our final presentation, from Alon Bassok.
Alon Bassok: Thank you. I would like to talk a little more about some of the things Dale and Crystal have spoken about:
two projects we worked on at the Puget Sound Regional Council, and how excited we are about the naturally great interest building over GPS data, and truck GPS data in particular. It allows us to do things we couldn't do before,
and to do things that we have been able to do a little better.
I will talk about two projects. The first, on trucks and cars, was completed about a year ago through our congestion management process, supported by a STEP research grant; Dale alluded to truck speeds being slower on the road.
The second is a project being completed now, and I'll discuss where we're going with it, on trip generation: being able to use the GPS data to better understand the relationship between how many trucks come out of specific land uses and employment types.
We used grocery stores as a proof of concept in one case study, but it could easily be applied to any other type of commercial, industrial, or manufacturing activity.
Then I want to discuss where we're going next, how exciting the GPS data is for us and others as well.
We know trucks travel slower than cars, in particular on our freeways, and there are a variety of reasons for this. Most notably, they accelerate more slowly, take more time to decelerate, take more time to get up to speed on hills,
and can't keep their speed as they climb hills. They stay in the right-hand lane on the freeways and generally travel slower than cars. Just knowing this isn't quite enough; the question becomes how much slower, and on which facilities.
That has wide implications for the travel demand model this work was meant to support, but also for any other model, such as an air quality model, that takes truck speed information as an input, and all the models feed actual planning applications:
any analysis done with the results of the models can later be used for investment decisions.
A better understanding of truck performance ultimately leads to the things that Crystal and Dale were already suggesting are important: being able to say which facilities are suffering, which need improvement,
where we need new facilities, and so on.
In our model, as things stood at the time, on a regional level we did a fairly good job of estimating how many trucks were on the roadways, combined between freeways and arterials.
The over- and underestimation showed up at the facility level. We wanted to understand by how much truck speeds differ, what the exact relationship of truck speeds is, in order to better represent
what is going on with trucks not at the aggregate level, but at the facility level.
As Dale was mentioning, there's no better way to come up with spot speeds, and to pull trucks specifically out from other vehicles, than to use GPS data. We were quite excited to be able to use this for the first time.
Unfortunately, serendipity didn't work in our favor; we didn't get to use the GPS data set Dale was describing for this effort. But Ed McCormack and Mark Hallenbeck at the University of Washington were kind enough to let us use an earlier data set they collected. It included about half a million points; if you see it on the map it looks like straight lines. With this particular data set and the one Dale described, if you look at a year's set of data there's hardly a street that doesn't have trucks on it: lots of data, and all the ensuing maintenance issues that come with that. The data here came from eight trucking firms, with 25 GPS units in all. No matter how wonderful this data was, we didn't have a comparably wonderful GPS data set to use for passenger vehicles for the same time frame.
This became problematic. As you see here, the most comparable data set we had was slightly different in time frame and included all traffic. We ended up with the result that, for the most part, trucks travel faster than cars, which, well, unfortunately, simply isn't true. So we had to do something else.
For the most part GPS data is a wonderful tool, and it would be the ideal tool, but you have to plan these things far enough in advance to be able to collect all the data you need: not only truck data, but car data.
While this is the best possible type of data we could have had, we had to step back and say well, what's second best? I want to talk briefly on what we did to resolve the issue and complete this analysis.
In lieu of the GPS data we took speed trap data from the state's loop detectors on the highways and separated out vehicles by length classes. The shorter vehicles we called cars, and anything longer, in the latter three bins, we considered to be trucks. There are other things in there too, buses, RVs, and so on, but not in sufficient numbers to give us concern about wanting to separate them out.
Then we looked only at observations in 20-second intervals that contained just cars or just trucks. It later turned out that this becomes problematic, because we're not looking at time intervals with both cars and trucks in the same 20-second interval.
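A minimal sketch of that length-based separation and the homogeneous-interval filter, assuming hypothetical record fields (`interval_id`, `length_ft`, `speed_mph`) and an assumed length cutoff, since the actual bin boundaries aren't stated in the talk:

```python
from dataclasses import dataclass

# Hypothetical record from a dual-loop speed trap.
@dataclass
class Observation:
    interval_id: int   # which 20-second interval the reading fell in
    length_ft: float   # measured vehicle length
    speed_mph: float

CAR_MAX_LENGTH_FT = 26.0  # assumed cutoff; the real bin boundaries were not given

def classify(obs):
    return "car" if obs.length_ft <= CAR_MAX_LENGTH_FT else "truck"

def homogeneous_interval_speeds(observations):
    """Average speed per 20-second interval, keeping only intervals that
    contain exclusively cars or exclusively trucks; mixed intervals are
    dropped, which is exactly the bias discussed above (slowest trucks,
    fastest cars, nothing in the middle)."""
    by_interval = {}
    for obs in observations:
        by_interval.setdefault(obs.interval_id, []).append(obs)
    speeds = {"car": [], "truck": []}
    for recs in by_interval.values():
        classes = {classify(r) for r in recs}
        if len(classes) == 1:  # homogeneous interval only
            cls = classes.pop()
            speeds[cls].append(sum(r.speed_mph for r in recs) / len(recs))
    return speeds
```

The mixed-interval drop is what later proved problematic; the sketch makes that filtering step explicit.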
This is one example, at one location, of how the GPS data compares to the long-vehicle estimation for the a.m. peak. Within a very short distance of the actual speed traps there were many more GPS observations, including GPS observations immediately adjacent to the speed trap. Across the board, all the speed traps fell within a reasonable range of the GPS and long-vehicle speeds, such that we felt comfortable using the analysis.
The GPS data told us the best case scenario, and here it is validating this other procedure.
We came up with the difference between trucks and cars, which could be applied within the model's framework to even out what happens in the freeway and arterial assignment.
We wanted to see how things turned out over time. You see something quite strange happened, in that the 2006 speeds for cars and trucks out-performed the speeds of the year 2000.
While it's always great to sit in your office and with the click of a mouse be able to fix regional congestion issues, this simply isn't the case.
We had to do a little analysis on what went wrong here and on the discrepancy over time.
We only considered the observations that had just cars or just trucks, and what happens, essentially, is we are looking at the slowest trucks and the fastest cars and missing everything in the middle. But we don't have the ability to pull out from the speed trap data which vehicles are specifically cars and which are specifically trucks.
We decided we had to move to the least desirable approach and do things by proxy. A lot of the time in the freight world, as you all know, we use whatever data is available, not necessarily the best data we would like to use.
In this case we would have preferred to use GPS data but had to make one more simplification.
For this last version we used five-minute speed trap data and didn't bother trying to distinguish cars from trucks. Instead, we compared two different lanes on the freeways: the innermost non-HOV lane for car speeds, assuming not many trucks make their way all the way to the left, and the second outermost lane for trucks. We didn't want to use the right-hand lane, with its many merging issues, because we thought it would bring the trucks to an artificially lower speed. So we used the second outermost lane to get a gauge of what truck performance would be.
What we see here using this method are slightly smaller differences overall between the peaks, and about a 10% difference between cars and trucks on the roadways, and ultimately this is what we ended up implementing. Looking over time, things are somewhat more reasonable: 2006 performs worse than 2000, trucks are moving slower than cars, and we feel relatively confident this 10% difference is a reasonable estimate.
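The lane-proxy calculation just described can be sketched as follows; the lane labels and record shape are assumptions for illustration, not the actual field names in the speed trap feed:

```python
def lane_proxy_speed_gap(five_min_records):
    """five_min_records: iterable of (lane, speed_mph) tuples, where lane is
    a label such as 'innermost_non_hov' (car proxy) or 'second_outermost'
    (truck proxy). Returns mean car-proxy speed, mean truck-proxy speed,
    and the fractional truck slowdown."""
    cars = [s for lane, s in five_min_records if lane == "innermost_non_hov"]
    trucks = [s for lane, s in five_min_records if lane == "second_outermost"]
    car_mean = sum(cars) / len(cars)
    truck_mean = sum(trucks) / len(trucks)
    # e.g. a value near 0.10 corresponds to the roughly 10% difference found
    gap = (car_mean - truck_mean) / car_mean
    return car_mean, truck_mean, gap
```

With cars averaging 60 mph in the left lane and 54 mph in the truck-proxy lane, the gap comes out to 10%, matching the estimate the study settled on.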
Then, across the methods, here is a quick comparison between the two: inner and outer lanes versus short and long vehicles. Even the most simplistic method leads to good results. We would like to have been able to use better data, but the GPS data still validated some of these methods and was useful in the context of validation. In the future it would be great to have more GPS data, along with comparable car data, to get a complete picture of spot speeds and compare directly on highway segments.
Our second project does use the data set Dale described: 2,500 trucks a day, with 15-minute reads. You get richer information with smaller read intervals, but even at 15 minutes the data set is incredibly large, and for the purposes of what we are doing here it isn't necessary to get to a finer level of detail. It's something in the magnitude of more than 3 million records a month; we now have data back to early 2008, and the data will continue to be collected at least through the summer of 2011.
We take all these data and figure out how to put them on the road network; Dale showed images of what that looks like through the University of Washington work.
We have to figure out where trucks start and stop, and we want to know the intentional stops. Any stop of less than three minutes doesn't get considered as an actual stop, since it could be due to traffic. We also want to validate through GIS analysis that these are legitimate stops at a point of interest. Say someone is using a not very well utilized side street as truck parking: that shouldn't become the origin. You want the trip to start at the place they pick up goods and end where they deliver.
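The three-minute rule can be expressed as a simple filter. This is a sketch only: it assumes stop events (start, end, position) have already been detected from the GPS reads, and the tuple layout is hypothetical:

```python
from datetime import timedelta

MIN_DWELL = timedelta(minutes=3)  # stated threshold: shorter stops are ignored

def significant_stops(stop_events):
    """stop_events: list of (start, end, lat, lon) tuples for detected
    zero-movement periods. Keeps only dwells of three minutes or more,
    which are treated as intentional trip ends rather than, say, a wait
    in traffic. GIS checks against points of interest would follow."""
    return [(s, e, lat, lon)
            for (s, e, lat, lon) in stop_events
            if e - s >= MIN_DWELL]
```

Stops passing this filter would then be validated against land-use or point-of-interest layers, as described above, before being treated as trip origins or destinations.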
For this case study we used a month of data, 3 million reads, giving us 358 million truck trips. That gives us a great summary; you can pick out the statistics, and the GPS data, even with this subset of trucks, gives us an understanding.
For grocery stores we were able to take data over a three-month period and get a better understanding. Overall it was about 10 trips per tour, and some are to larger grocers. You might find a really large grocery chain sends one truck from a distribution center to a store, and that's all that truck does, versus some of the smaller vendors, say beer distributors, that make many small trips within a tour to restaurants, grocery stores, and so on.
Since this is a sub-set of all trucks, we wanted to develop an understanding of how many trucks would really come to the grocery stores. We had a rate very comparable to the information we got out of interviews with stores in a different research project Ed McCormack was involved in, which said that in the Puget Sound, more or less 10 to 12 trips happen daily to each grocery store. This turned out to be half the observations from manual counts. Trusting our manual counters more than the interviews, in this instance we are able to factor the truck trips up by two: the GPS data represent half the travel to grocery stores. This could be done for any industry, not just grocery distribution.
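The factoring step is just a ratio of ground-truth counts to GPS-observed trips; a minimal sketch, with illustrative numbers only (the real counts per store are not given here):

```python
def expansion_factor(manual_daily_trips, gps_daily_trips):
    """Factor to scale GPS-observed trips up to the ground truth from
    manual counts. In the grocery-store case study the GPS data showed
    roughly half the manually counted trips, giving a factor near 2."""
    return manual_daily_trips / gps_daily_trips

# Illustrative: manual counts roughly double the GPS observations.
factor = expansion_factor(manual_daily_trips=22, gps_daily_trips=11)
# Apply `factor` to GPS trip totals to estimate total truck travel
# to stores; the same approach works for any industry.
```

The same expansion logic applies to any land use where an independent count (manual or interview-based) is available to anchor the GPS sample.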
We can further separate this out by trips to different types of areas, from the largest metropolitan cities down to rural areas, looking at the number of daily truck trips. This allows us to do something we weren't able to do before. When you consider trip generation for trucks you think about employment: you assume each employee generates so many trips and factor that up. We see here that even for a narrowly defined industry, grocery stores, employment in fact has no correlation to the number of truck trips; it has more to do with the type of land use area they're in. That allowed us to think more strategically about how to apply trip generation rates.
Lots of things can be done with the data. We can certainly continue the work with trip generation for any employment type, any time of day, and so on. There are lots of other useful metrics that come out of this: speeds, travel times, reliability issues, and trip and tour lengths for activity-based models, whose calibration is helped by such data, as opposed to calibrating to observed truck counts wherever you can find data for that.
Beyond the trip generation models, one thing we're quite excited about now is trying to develop commodity flow estimates with the GPS data. We have quite a long way to go in developing a model that allows us to do that, but we're starting to think about it, because on a sub-regional level, thinking corridor-specific, we don't have a good data source. Certainly the national sources, as good or not as they might be, don't give us that level of detail. Instead of trying to develop a model parsing out a national data set to a corridor or specific facility, we are trying to do some estimation based on the GPS data set.
We're not only excited about this ourselves; this effort is gaining excitement from folks around the country. The vendors collect this as a by-product, essentially a waste stream for them, and they are getting a better understanding that there's a purpose for this data. There's more of it available everywhere around the country, and others could easily implement this in similar ways and continue to do research on other uses for the data.
I have my contact information up here; if you have questions beyond today, I am happy to chat with anyone anytime about what we are doing with the data and how you might apply it yourself.
Thank you, and thank you to those who posted questions. I will start with those posted online; I encourage you to keep typing them in. Once we have completed the ones online we will open the phone lines.
So I am actually going backwards, starting at the end, since we just finished Alon's presentation. I will direct the first questions to you, Alon. Did you limit the data to roadways that allow truck movement or that trucks are expected to take, or did you keep all roadway data?
Alon: We used all the roadway data, but for the most part there aren't limitations on where trucks can go. For our purposes, we weren't really interested in the route selection as much as the origins and destinations.
We had the stop locations of where the trucks were; regardless of route, we knew where they started and finished.
Didn't need to worry about routing.
If you want to do routing, the 15-minute intervals make that somewhat prohibitive.
Trucks on the freeway can travel a long distance over that time frame.
Question: Could this information be used to model a change that directs trucks to the innermost lane in urban areas? Would this reduce congestion for all vehicle types?
Alon: You could certainly model it, absolutely. I am not sure it would be a desirable thing to do, just because of the conflicts as trucks move from the outermost to the innermost lanes.
Generally you don't want to have those conflicts in lane changes, especially because of the speed differences.
You could certainly model what would happen if you did it. My guess, without having done the analysis, is that if anything it would have a worse impact on congestion: you would be slowing down lanes that otherwise travel faster. So it might cause slightly more severe congestion than currently experienced.
That's just my hunch, I haven't tried to model that.
We will go to questions for Dale.
Question: Have you coupled your data with train freight data to see the flux moving between the two parallel modes? Dale: No, we haven't at this time. The data we are looking at now, until the legislature allowed us to go statewide, which we will be doing in the future, was within the central Puget Sound, a very compact metropolitan area where trains aren't competing with trucks in that marketplace. So at this time we are just using Puget Sound data. It would be difficult, if the data even exists, to look at data from trains, because for the most part they are all running long hauls out of the Seattle market.
Question: Did you consider using the data available through FHWA such as the velocity data Crystal described?
Dale: Not at this time. We will look at that data as we expand statewide, but once again, this project was in a metropolitan area; we were limited by the area we worked in.
On getting data from third parties: one thing Ed did was talk to the providers, indicating to them what we were doing and trying to accomplish and how we wanted to utilize the data. With those that agreed, of course, we signed agreements with them, had a contract, and put together a pay schedule for the invoicing process. We also laid out what type of data and what type of information we were going to get.
There were privacy issues, and that was handled by separating the company from the vehicle. What we were getting was positioning data, from which we could derive speeds, but we had absolutely no idea who the truck belonged to or what type of truck it was. In the future we hope to be able to start getting that type of data so we can actually look at different vehicle combinations, sizes, weights, and so forth.
All of the vendors we did agreements with had nondisclosures with us that limited what we could do with the data and who we could share it with.
So, for example, when Alon and the PSRC approached us to utilize our data sets, we had to go to each of our vendors and get their permission in order to share the data.
That goes back to the privacy issues and how people keep data from being used in ways they may not be happy with.
Now, a question for Crystal: can the GPS data be used to analyze truck flows and network utilization?
Crystal: We have done that as a proof of concept. For instance, taking a city, Dallas, and looking at truck flows from a particular time: after eight hours, after 24 hours, where do the trucks that originate in a particular area flow? So from that perspective we have done a proof of concept, but not any widespread analysis of truck flows and network utilization.
It could be done, but the caution would be that the GPS data set is a sub-set of the population, so there's a question of how representative it would be of all movements; there may be underrepresentation of some types of movement, like local deliveries and service vehicles. The nationwide data set I described is largely Class 8 vehicles, typically longer-haul truck movement as opposed to localized movement.
Okay, next question --
Question: Regarding the thought of financial reimbursement for data, is there a funding source at the state level?
Crystal: Dale's model of how they did it is one model. I personally, unfortunately, do not know whether SPR or state funds could be used. Dale described the Washington State experience: they convinced their legislators, and showed them, how this type of data could better inform decisions on their freight mobility improvement program. That's one model, but certainly I would think some of the regular traffic data collection funds available in a state, whatever they use for their traffic data collection, should be available for this type of data also. I am not an authoritative subject matter expert on this, but I can take the question and see if I can find a better answer. I would say a model like that used in Washington State is one model, but I would speculate that any funds currently used for traffic data collection could be used to support this type of collection.
I believe the next question is for you as well. Does this include freight modes other than trucks: rail, domestic water, or other commerce? Discuss the needs and/or existence of comparable data for all modes to make national freight intermodal decisions.
Crystal: The initiative I described was started originally as a highway performance measurement initiative. We did not endeavor in our initiative to collect the data for all modes; Dale talked about the national cooperative freight research program. The work to date focused on highway trucks. The concept of travel time and travel time reliability for all modes is a prominent measure of system performance, but our data collection effort only focused on highways.
Question: How do we get the velocity data for Wyoming?
Crystal: I put up the web link, freightperformancemeasures.org. As a state DOT, you can request a user ID and password. For Wyoming it would have data on I-80 and others; it is only historical data, available for 2009 right now. I just pulled up a query from that website: I-25, I-80, and I-90 are there for the entire year for those three interstates; I can't think if there's another. Any state can request access to the data for the 25 interstates. That web address is up again, freightperformancemeasures.org, for anyone interested.
The next question I will put out to all the presenters: did anyone look at rural versus urban movements on interstates or arterials?
Crystal: The network file we used for the nationwide effort on the 25 interstates, which is also available at that web link you just put up, does not differentiate between which parts are rural and which are urban, but the entire length of each interstate is available. You would have to assign a rural or urban designation yourself; it's not limited, it's for the entire interstate.
And the data we have is largely Class 8, so you wouldn't be able to differentiate between smaller and larger trucks.
Dale: In our study we are looking at the central Puget Sound, a mostly metropolitan area, so there weren't really a lot of rural movements to look at. As we expand across the state, we will have the ability to look at rural segments versus urban segments and get a better idea of speeds and movements in those areas.
Alon: This is Alon. That would be great to see when that happens, but I suspect trucks in rural areas would still travel slower than cars, just because of operations: climbing hills, a car would take them faster than a large truck, and similarly the deceleration issues. The gap should be smaller, with fewer merging issues, unless people are coming on and off different facilities, but I would expect you would see some sort of difference between trucks and cars.
Question: Do you have a sense of how many MPOs are collecting this type of speed data? Alon: I could take that. I don't think very many. I think there's a handful that explicitly consider this in their modeling framework and their own performance measurement products, and probably a handful more that have the data or gather it from somewhere, but I don't think many more than a dozen are actually collecting speed data.
Any other presenters have thoughts?
Dale: I agree with Alon. I have not heard of a lot of MPOs doing this type of speed collection.
Crystal: I have not either, but one thing I have heard from MPOs when we present, particularly when focusing the data presentation on the interstate aspect of our initiative, is feedback that it's not going to be enough for an MPO to make decisions or do sufficient modeling, which includes many more facilities than just the interstates. That's one thing we attempted to address: incorporating, beyond the interstate data in the collection tool, data for off-interstate routes. Just having data on the interstates, while they carry most of the freight, is not enough to build into the MPO planning process; you have to have data for all the facilities in a region, not just the interstates. So that doesn't necessarily answer the question of whether they are collecting it, but they feel it's a data need to have data on more facilities than just the interstates, and they obviously aren't collecting that data, I guess.
Okay. We will move to a question for Dale. How did you determine your zone boundaries and what were your considerations?
Dale: Well, our zone boundaries were limited by what the state legislature told us we could do: the central Puget Sound, a defined area. We set our latitude-longitude box to include the counties within the central Puget Sound area; if you look at a map, Snohomish, [indiscernible], that's the particular area we wanted to study, and that was also what we sent to our vendors: we wanted data within this box. Since everything in GPS is based on latitude and longitude, they have the ability to give us data only for vehicles that enter that box. Once that was determined, the vendors automated the system: they would go in with a computer program, pull out the data we needed, and send it to the university in a file format on a weekly basis.
Alon: I can add to that; I interpreted the question a little differently. If this isn't what you meant, the specific areas Dale was showing on the map, in terms of travel times between locations, are the traffic analysis zones for the region, if you were asking about the smaller areas.
If you want to clarify the question, feel free to type in.
Dale, another question for you. Has there been any integration of the GPS data with [indiscernible] or [indiscernible] data? Dale: Not at this time. That's an interesting question, something I am going to present to Ed McCormack to see if there's a way to combine that data.
The next question was sent just privately to me, but I will read it out; it's intended for everyone. Many permanent classification counters, ATRs, weigh-in-motion stations, and portable classifiers (Wavetronix, RTMS) can get speed-by-vehicle-type data. Might this supplement the GPS data you've gathered? Alon: Absolutely.
That would be great data to have, if you can get your hands on the raw data. Certainly from a speed perspective, if you can get the classification data, the GPS data can work hand in hand with it, letting you know what sort of sub-set of all trucks the GPS data represents. You can get comparisons: assuming all the classification equipment is operating well, you can see how the roadway is performing, then make the GPS-data-to-speed-trap comparison, and perhaps use a sub-set of the GPS data and factor the other locations based on the speed traps. But you definitely need the raw data from the classification counters to be able to do that.
Okay. The next question: what does the third-party data cost? Dale: Well, it depends on how much you are getting and what you are looking for. For all the vendors we were getting data from, it was about $10,000 a month for the data we were collecting. So once again, the amount of data, how that data is sent, and what you are looking for will determine the cost.
Along that line, one thing we have been hearing, especially from vendors, is that as technology improves, the vendors put new equipment -- [lost audio]
[Audio on webcast is garbled]
500 to 600 vehicles nationwide, and we pay less than 60 cents per vehicle per year. That valuation is based on a research program more than a programmatic one.
As we have said, remember that to the third-party providers this is a waste product. They are getting paid up front by their customer to monitor the truck and collect the data. This is the back stream of that business transaction, which they're holding in storage in the event the customer may need it down the road.
Basically, not only are you not creating a cost center for them, you are creating a profit center. That's the key story line. As long as it doesn't burden them or cost them, that's going to be the driving factor in how the arrangement is negotiated. I think the national cooperative research project Dale is leading, that guidebook, will talk about those data valuation issues, and the compendium I described, which FHWA is releasing this month, will talk about the cost of arrangements and some of the processes by which they were made. I think there will be at least 31 different data arrangements, not only U.S. but Canadian. I mentioned the sort of peer-pressure thinking: we're not the only ones doing it; this is happening in Canada and Europe, and the compendium will include not only experience here, but from at least Canada.
Okay. The final question: did the desire to have aggregated data versus point data help with privacy concerns? Dale: Yes. Because, once again, as I said, the vendors' ability to separate customer identities from the data they sent to us took care of some of those privacy issues that are out there, especially within the carrier ranks.
Okay, one more question about privacy issues. Are there privacy issues with a vendor selling telemetry without the fleet's knowledge or approval?
Dale: From my standpoint, that would be best directed to the vendor. We told them our needs, what we are doing with the data, and how we will handle it, and that was a decision the vendor made. Whether there's something in their contracts that allows them to utilize the data with other parties under certain conditions, I have, to be frank, no idea. I know there are carriers in some other studies that raised issues about their data being used, but it relates back to what Crystal had in her presentation: the concern that it would be used as a means of punishment. A lot has to do with who is getting the data and what they are using it for.
Crystal: I will add to that.
I think Dale is, again, correct. Certainly we don't necessarily go tell every fleet a particular vendor has that their data is included, because we have worked with the lawyers of those companies and of our contractor, and I will assure you, whatever we are paying them is not going to be enough for them to not consider the privacy issues of their subscribers. From that perspective, I think they negotiated agreements based on the best interests of their customers. There's no way they will sacrifice or jeopardize the relationships they have with their fleets for the small profit they may gain from participating in such an initiative. At the federal level, in the initiative we're leading, there were plenty of lawyers involved and any privacy issues were addressed. But again, it is a case-by-case basis, and that model has been in use at least six years. I am not going to ever say there's never going to be an issue: if, say, a non-attainment area is set up or truck restrictions are put on certain routes based on the data, that may cause a carrier to say, "I don't want to play anymore." The nondisclosure agreements, in the case of the federal effort, are written carefully; we have had plenty of outreach, and I can't point to any one fleet to say whether they are included, but having objective, better data far outweighs the concerns that linger with regard to privacy. The technology vendors have negotiated agreements in their best interest.
I want to thank our presenters for the great information shared, and thank everybody in attendance. The recorded version of the seminar and the presentations will all be posted online in the next few weeks, and I'll send out an e-mail once everything is up there.
If you are an AICP member and want to receive 1.5 Certification Maintenance credits for today's webinar, make sure you signed in with your first and last name, or type it into the chat box. Download the evaluation form and send it to me once you have completed it.
We want to get an idea of how many people are in the room with you if you are participating in a group setting, so before you sign off, please fill in this poll; it would be very helpful.
The next seminar, on May 19, will be about data for state and local freight planning. I encourage you to register for it through the Talking Freight website. All future Talking Freight seminars are announced through the listserv.
Thank you, everybody. Enjoy the rest of your day.