Self-driving big rigs
Contributors to this thread:
HA/KS 13-Dec-17
Woods Walker 13-Dec-17
Wayne Helmick 13-Dec-17
Woods Walker 13-Dec-17
Wayne Helmick 13-Dec-17
Woods Walker 13-Dec-17
bigeasygator 13-Dec-17
Woods Walker 13-Dec-17
bigeasygator 13-Dec-17
Woods Walker 13-Dec-17
Grey Ghost 14-Dec-17
Owl 14-Dec-17
Michael 14-Dec-17
Woods Walker 14-Dec-17
Amoebus 14-Dec-17
woodguy65 14-Dec-17
Amoebus 14-Dec-17
bigeasygator 14-Dec-17
Amoebus 14-Dec-17
woodguy65 14-Dec-17
TGbow 14-Dec-17
Franzen 14-Dec-17
bigeasygator 14-Dec-17
Franzen 14-Dec-17
bigeasygator 14-Dec-17
Fivers 14-Dec-17
TGbow 14-Dec-17
bigeasygator 14-Dec-17
TGbow 14-Dec-17
IdyllwildArcher 14-Dec-17
Franzen 14-Dec-17
Woods Walker 14-Dec-17
bigeasygator 14-Dec-17
bigeasygator 14-Dec-17
TGbow 14-Dec-17
Coyote 65 14-Dec-17
Woods Walker 15-Dec-17
HA/KS 15-Dec-17
Woods Walker 15-Dec-17
Mike B 15-Dec-17
Amoebus 15-Dec-17
bigeasygator 15-Dec-17
Mike B 15-Dec-17
bigeasygator 15-Dec-17
Franzen 15-Dec-17
HA/KS 15-Dec-17
bigeasygator 15-Dec-17
Franzen 15-Dec-17
South Farm 15-Dec-17
foxbo 15-Dec-17
HA/KS 17-Dec-17
Mike B 18-Dec-17
Amoebus 19-Dec-17
foxbo 19-Dec-17
Woods Walker 19-Dec-17
bigeasygator 19-Dec-17
foxbo 19-Dec-17
bad karma 19-Dec-17
Amoebus 19-Dec-17
From: HA/KS
13-Dec-17

HA/KS's Link
I recently heard a discussion on the radio while I was driving. The claim was that self-driving trucks would reduce truck-caused crashes by as much as 99%.

One objection to the idea is that truck drivers would be unemployed, but the interviewee said that there would still need to be a "driver."

He did say that there would be a lot of unemployed attorneys as the number of tort cases would plummet.

From: Woods Walker
13-Dec-17
Weeeell.........maybe. The overall number of court cases would theoretically decline, but when there's a mechanical failure (note I said WHEN and not IF) and someone is injured/killed, then the award will make a typical vehicle accident case pale by comparison. The lawyer would now be going after GM or the like and not some poor slob's State Farm coverage. Talk about deep pockets!!!

This could be the start of an entirely new field of law.

I'm glad you brought this up, Henry, because I saw an ad on TV this week and it made me think of this very topic. Ford is now selling cars that will parallel park themselves in REALLY tight spaces. Now if you have one of these and the sensors or whatever that control the car while it's doing this eventually fail and you damage someone else's car, who pays for it??? Initially your insurance would pay for any damage, but would the insurance company then go after Ford? I mean it's certainly NOT the driver's fault if they weren't in actual control of the car at the time, right?

From: Wayne Helmick
13-Dec-17
WW, to add to your point, what happens when one backs over a kid running into school?

From: Woods Walker
13-Dec-17
Exactly. Who pays? You were NOT in control of the car. It was, or more accurately the manufacturer of the car was. Before I bought one of these "smart" cars I'd want a lawyer to research it and maybe even have the car dealer sign a document stating that they are liable when the car drives itself.

From: Wayne Helmick
13-Dec-17
Yeah, this crap scares me. These backup cameras are amazing on these new $50,000 pickups, but when you drive dirt roads all day you can't see crap. Who knows if all these sensors are any different.

From: Woods Walker
13-Dec-17
Bottom line....if it's mechanical and/or electrical it's not a matter of "IF", but "WHEN" it will fail.

Never forget the old saying......... "If it has t*ts or tires you WILL have problems with it!"

From: bigeasygator
13-Dec-17

bigeasygator's Link
It’s not a matter of if but when self driving cars become the norm IMO, and their use brings LOTS of questions and issues. In addition to the legality/liability questions raised, there’s also a big ethical challenge around programming self driving cars. It’s called the trolley problem and to summarize it, when confronted with a situation where the car might have to choose between the death of the driver or the death of others, what should the car be programmed to do? Would you buy a car that would, in certain situations, be programmed to put you at more risk?

From: Woods Walker
13-Dec-17
HOLY CRAP! I never thought of that angle! Scary for sure! And in that situation, HOW or even CAN the car know if the object you're about to hit is another human being or an animal?

Can you imagine if the programmer for that feature was a PETA type? YIKES!!!!

From: bigeasygator
13-Dec-17

bigeasygator's Link
Here’s another issue with self-driving cars that I read about years ago and thought it was interesting to bring up. Clearly a big driver for the technology is safety. However, increased safety will have a huge impact on organ donations as auto accidents are one of the main sources of organs. How to address this shortage will likely lead to innovation in artificial organ technology, but it certainly will take some time to get there.

From: Woods Walker
13-Dec-17
Another "watch what you wish for" scenario I guess.

From: Grey Ghost
14-Dec-17
I do enjoy the backup camera on my pickup. It makes hitching up to a boat or trailer a breeze.

I don't think I'll ever own a vehicle that drives itself.

Matt

From: Owl
14-Dec-17
As someone who deals with a lot of freight delivery, I can't imagine the automation that can accommodate the variables and vagaries of depot logistics. What happens when the truck arrives and no one is there to offload? Happens a ton. Remote operation combined with semi-automation (no pun intended) may be plausible but not full automation. Not until depot sites and protocols are changed considerably.

And who takes liability for safety and system checks on unmanned vehicles? My company won't assume liability for a gashed tire that is going to blow 100 miles later just because the road tractor is automated.

From: Michael
14-Dec-17
The career I chose early on in life requires a lot of road miles. There has been many a long, boring drive to get home. On occasion I've wished my truck had auto pilot.

As for back up cameras, they are very helpful. Since I spend a great deal of time on gravel or muddy roads I am always cleaning the camera lens. It takes longer to get out of the truck and walk back there than to actually clean the lens.

14-Dec-17
I think it's going to happen for both trucks and cars. My wife's Infiniti damn near works on auto-pilot now. I mostly turn all that crap off unless I'm driving cross country. I find it irritating.

From: Woods Walker
14-Dec-17
But to my first question.......who's liable when it fails?

From: Amoebus
14-Dec-17

Amoebus's Link
Peloton offers an automated way for semis to platoon, offering fuel savings to both vehicles (4.5% on the front semi, 10% on the rear). The main advantage of this (besides fuel savings) is that the rear vehicle brakes automatically when the front vehicle does - eliminating the driver's reaction delay.

Nascar fans have been seeing this effect for decades.
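
As a rough back-of-the-envelope illustration of why removing the rear driver's braking delay matters, the sketch below compares how far a truck travels before the brakes even begin to apply. The delay values are assumptions chosen for illustration, not Peloton's published figures.

```python
def reaction_distance_ft(speed_mph: float, reaction_time_s: float) -> float:
    """Distance covered during the reaction delay, in feet."""
    speed_fps = speed_mph * 5280 / 3600  # convert mph to feet per second
    return speed_fps * reaction_time_s

speed = 65  # a typical interstate speed for a loaded semi

# Assumed delays, for illustration only:
human_delay = 1.5          # seconds; a commonly cited perception-reaction time
platoon_link_delay = 0.05  # seconds; a radio-linked brake command is near-instant

print(f"Human driver:   {reaction_distance_ft(speed, human_delay):.0f} ft traveled before braking starts")
print(f"Linked platoon: {reaction_distance_ft(speed, platoon_link_delay):.0f} ft traveled before braking starts")
```

Under those assumed numbers the trailing truck covers roughly 140 fewer feet before its brakes engage, which is the margin that lets platooned trucks follow closely without the human-delay penalty.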

From: woodguy65
14-Dec-17
I'm sure ISIS is licking their chops - this just made terrorism like an Xbox game. How easy do you think it would be for an expert on the bad guys' side to hack one of those and then remotely control it to do whatever!

From: Amoebus
14-Dec-17

Amoebus's Link
Woods - "But to my first question.......who's liable when it fails? "

Lots of talk about the subject on the interweb. Included one from slate.

Short answer: Since there aren't fully automated cars/trucks yet (beyond test environments), the rules for liability haven't been invented.

They point to drunk driving as one area where an automated vehicle can save a lot of lives. Possibly make the drunk driving laws tricky also (i.e. were you in control of the vehicle at the time of the crash or was it in automatic mode?).

These same type of questions probably were asked when cruise control first came out.

From: bigeasygator
14-Dec-17
As Amoebus said, it's still early days. Some of it will certainly depend on the level of autonomy, as self-driving means a lot of things to a lot of folks. There's actually a system to describe this autonomy with levels. They are as follows (taken from the web):

"Level 0: This one is pretty basic. The driver (human) controls it all: steering, brakes, throttle, power. It's what you've been doing all along.

Level 1: This driver-assistance level means that most functions are still controlled by the driver, but a specific function (like steering or accelerating) can be done automatically by the car.

Level 2: In level 2, at least one driver assistance system of "both steering and acceleration/ deceleration using information about the driving environment" is automated, like cruise control and lane-centering. It means that the "driver is disengaged from physically operating the vehicle by having his or her hands off the steering wheel AND foot off pedal at the same time," according to the SAE. The driver must still always be ready to take control of the vehicle, however.

Level 3: Drivers are still necessary in level 3 cars, but are able to completely shift "safety-critical functions" to the vehicle, under certain traffic or environmental conditions. It means that the driver is still present and will intervene if necessary, but is not required to monitor the situation in the same way it does for the previous levels.

Level 4: This is what is meant by "fully autonomous." Level 4 vehicles are "designed to perform all safety-critical driving functions and monitor roadway conditions for an entire trip." However, it's important to note that this is limited to the "operational design domain (ODD)" of the vehicle—meaning it does not cover every driving scenario.

Level 5: This refers to a fully-autonomous system that expects the vehicle's performance to equal that of a human driver, in every driving scenario—including extreme environments like dirt roads that are unlikely to be navigated by driverless vehicles in the near future."

For levels 0-2 it seems clear that the driver is still in control. For levels 3 and above, liability is a lot less certain. I don't believe there's a lawsuit associated with the Tesla crash in Florida that killed the driver while on autopilot last year, but the NTSB laid some of the blame at Tesla's feet.

It goes beyond just finding fault in the crash. Are you still liable for a DUI if you get behind the wheel of a fully autonomous car? What about other traffic tickets? Lots of questions that no doubt will be answered at some point but clearly not yet.

And all these questions aside, there is no doubt the technology will make the roads safer. Costs of vehicle ownership are also likely to decrease, both because cars will be operated far more efficiently and because insurance costs will drop significantly.

I do a lot of my hunting out west and the trip always involves a drive of at least 20 hours. Having a self-driving vehicle to get me there would be incredible. That said, even without full autonomy, the other safety features associated with the Level 2 cars are well worth it IMO. The last two cars I purchased had lane assist, adaptive cruise control, and automated emergency braking and I absolutely love them. IMO, full autonomy shouldn't become a requirement, but these types of safety features should become mandatory on cars (especially the automated emergency braking).
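
To make the Level 0-2 versus Level 3-and-up split concrete, here is a toy sketch of the taxonomy described above. The level names are paraphrased and the "who must monitor" rule is a simplification for illustration, not a legal standard for liability.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    NO_AUTOMATION = 0       # human does all steering, braking, throttle
    DRIVER_ASSISTANCE = 1   # one function (steering OR speed) can be automated
    PARTIAL_AUTOMATION = 2  # steering AND speed automated; driver must monitor
    CONDITIONAL = 3         # car monitors; driver must be ready to intervene
    HIGH_AUTOMATION = 4     # no driver needed within the design domain (ODD)
    FULL_AUTOMATION = 5     # no driver needed in any scenario

def human_must_monitor(level: SAELevel) -> bool:
    """Rule of thumb: at Level 2 and below the human is still expected to be
    watching the road the entire time."""
    return level <= SAELevel.PARTIAL_AUTOMATION

for lvl in SAELevel:
    print(f"Level {int(lvl)}: human must monitor = {human_must_monitor(lvl)}")
```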

From: Amoebus
14-Dec-17
Woodsguy65

I was at a computer conference 3-4 years ago and the universities had done that already. At that time, the car they 'hacked' had 26 computers all attached to the same bus (communication line in the vehicle). Once they found an opening in one of the computers, they could get control of all 26 - remote-control braking, accelerating, airbag deployment, etc.

They were working with the manufacturers to close up the entry points.

From: woodguy65
14-Dec-17
Exactly. While I'm sure they could close up the entry points for the vast majority of hackers - what about the elite guys???

I bet there are some folks that would be able to hack that "foolproof" system in minutes, not months.

But what the hell do I know... I can barely Control my TV!

From: TGbow
14-Dec-17
I don't see that ever happening. I drove for 22 years and I can't see how it would work other than on some type of contained road/track, etc. Then you would be looking at a lot more expense to build the system for them to travel on. Throw in snow/ice on the roads and it wouldn't work.

BTW, we have way more regulations for the trucking industry than we did in 1985 when I started. Federal rules get passed without Congress ever voting on them. Yet the highways are a lot more unsafe than 30 years ago. Truck drivers are a whole new breed overall. Not all, but drivers drive worse than ever today in spite of all the regulations. Problem is they keep passing stupid laws that will not affect the safety aspect of the highway, like more logbook regs, etc., which will never work because you will never regulate whether a driver actually sleeps or the quality of sleep they get...ain't gonna happen.

They should focus on writing tickets for high speed, tailgating, erratic driving. After so many of those kinds of tickets they would lose their license. That would get rid of a lot of the bad drivers, not all but a lot. It is sad what they've done to the trucking industry.

From: Franzen
14-Dec-17
"Cost of vehicle ownership are also likely to decrease..."

I would be curious to know how you came to that conclusion?

"...but these types of safety features should become mandatory on cars (especially the automated emergency breaking)."

This appears to simply be another vote for government control of our lives. You might feel safe with machines doing everything for you, while I feel much safer being in control of my own actions.

From: bigeasygator
14-Dec-17
It's pretty straightforward Franzen. For one, computers would make far better drivers than humans. They don't get distracted. They don't text. They can be programmed to not speed. They can also be programmed to drive in ways that lead to less wear-and-tear on vehicles. They can be programmed to drive in ways that are far more fuel efficient than humans. They will be far better than humans at avoiding obstacles that can lead to car repairs. In the world of self driving cars, car insurance costs will decrease tremendously as the risk associated with those cars is much lower.

Call it what you want. There are plenty of people who have no problem with the government stepping up and controlling who enters this country in the name of safety. Car accidents are a far bigger threat to our daily existence.

From: Franzen
14-Dec-17
Actually it's not straightforward. Insurance costs for "drivers" may or may not go down. As was discussed above, it hasn't been determined who shoulders liability. The liability still exists, so someone will have to pay for it. If accident rates significantly decline over time, I tend to agree there would likely be some insurance savings. Vehicle wear-and-tear is primarily a result of roadway type/condition, although the driver does play some role as well. Fuel efficiency I will give you, but the gains would be pretty marginal, and for some drivers non-existent. Even IF all of the aforementioned were supporting the case of reduced ownership cost, what about the astronomical cost to develop and implement the type of technology that is being referred to?

Many other things to consider as well: traffic violations, municipal revenue, police, etc, etc. Things such as computers keeping speed within legal limits could actually increase costs.

I am not sure what controlling aliens has to do with government control over its own citizens?

From: bigeasygator
14-Dec-17
"the astronomical cost to develop and implement the type of technology that is being referred to?"

It's a lot cheaper than you think. Level 2 autonomy is already there and does not add significant costs to base models. Full level 4 autonomy is pretty expensive today, but forecasts show these costs will come down drastically as the technology is adopted (like any technology).

"IHS Automotive forecasts that the price for the self-driving technology will add between $7,000 and $10,000 to a car’s sticker price in 2025, a figure that will drop to around $5,000 in 2030 and about $3,000 in 2035"

"I am not sure what controlling aliens has to do with government control over it's own citizens"

Of course you don't. Wouldn't expect you to. Regulations are all fine and good when you think they benefit you, right?

From: Fivers
14-Dec-17
As the technology evolves, some "drivers" could possibly be held more responsible than others - if, say, Brand X is programmed a certain way and Brand Y is programmed a different way, and a "driver" buys Brand X and kills someone in a crash that would have been avoided by buying Brand Y. I could see lawsuits against the "driver" because they bought the lesser vehicle; in turn, this would eventually eliminate all but one make of vehicle, or each brand would have to use the exact same technology.

Once they can get a PC to work effectively for more than 5 years, they might then be able to look at producing computers that will be controlling 1-ton, or larger, missiles down the highways. The only way to make it work in every state would be to bury sensors and readers in every road across the U.S.; lane recognition will not work on snow-covered roads, poorly painted or unpainted back roads, or gravel roads.....not to mention construction zones.

From: TGbow
14-Dec-17
There's a big difference in checking out people that enter our country vs tractor trailer regulations. Tractor trailers don't conspire to commit terrorism, though they could be used for terrorism as a tool. I would think every American would want to prevent potential terrorists from entering our country...we have enough already sitting on Capitol Hill.

From: bigeasygator
14-Dec-17
There is a big difference between big rigs and foreign terrorists...tractor trailers kill about 500 times more people each year in this country than foreign terrorists do.

From: TGbow
14-Dec-17
You are missing the point. More regulations will not reduce accidents. Screening people that enter our country is common sense. More regulations for trucks will not solve any problems because the regs they pass don't deal with real-life safety...only revenue. Stop the high-speed drivers and tailgaters and then it will actually have an effect on highway safety. More logbook regs and petty laws will not do anything but make more revenue. 30 yrs ago, rarely did I see drivers drive like some of them do now...we had fewer regulations then. I'm all for real safety but ridiculous laws do nothing to make the highways safer.

Example: 2 drivers leave point A, both in separate trucks; they drive 11 hrs and both stop their trucks for their break. One driver gets in his sleeper berth and sleeps for 9 hrs. The other driver gets in his sleeper for 9 hrs, but he watches his TV for 5 hrs and sleeps for 4. Both drivers are legal according to the law. The log book law served no real purpose to ensure the drivers actually slept.

I haven't driven in 10 yrs, but there were many times I could not sleep when my break was due...I would lay there, or get up and drive and then stop and sleep when I was actually sleepy.

From: IdyllwildArcher
14-Dec-17
I would love to own a car that drove itself. I could text, read, surf Bowsite and Google Earth while my car took me to hunts... win win

From: Franzen
14-Dec-17
Again, the two are not related just because you want them to be. How we as a country decide to deal with aliens has zero to do with regulation upon U.S. citizens. Whether or not any of this benefits me I do not know, and it is of course irrelevant. You only bring it up to try to defend your poor argument for government control.

You telling me that something is going to be cheap in the future does not really mean anything either. Is IHS Automotive an NFP? Do you know for a fact that all associated costs were incorporated into those figures? As with nearly all technology, yes, you are correct that it gets cheaper with time, but in many cases it is still more costly than not having the technology at all. Implementing the technology in the vehicle is likely the easiest and cheapest part. It gets even cheaper if you think the government has an open checkbook to fund R&D.

From: Woods Walker
14-Dec-17
So we're going to rely on a computer for our safety on the road. GREAT! Computers are never wrong, or fail to perform! What a relief!

Now if we could just get the weatherman that type of computer.

What I'd rather see than a car that drives itself is one that FIXES itself!

From: bigeasygator
14-Dec-17
Sorry TGbow, you are missing the point. That, or you don't understand the capabilities of autonomous driving. Logbooks would be a moot point. Trucks and cars would have the ability to drive themselves to varying degrees, so that it takes the responsibility out of human hands and eliminates the risk logbooks are trying to mitigate. It won't matter whether someone logged the appropriate amount of hours in their logbook or slept the requisite amount of time, because the technology will be there to prevent the consequence of a tired driver. Machines don't get tired. They don't get distracted.

As an example, my car has adaptive cruise control and emergency braking. It is damn near impossible for me to drive into the back of someone on a highway. In a world where the majority of the cars on the highway have this technology, we would reduce a lot of accidents (and this is about the minimal capability associated with autonomous vehicles).

Picture a world where every vehicle could talk to one another (almost the same way our air traffic control systems work). Where cars weren't just emergency braking but were adjusting positions to provide the prerequisite space to react to some unforeseen event. Where you could program out speeding and tailgating. You don't think this would prevent accidents?
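
For anyone curious what "damn near impossible to drive into the back of someone" boils down to, the sketch below shows the simplest version of an automated emergency braking decision: brake when the time-to-collision with the vehicle ahead drops below a threshold. Real systems are far more sophisticated; the 2-second threshold and the function name here are assumptions for illustration only.

```python
def should_emergency_brake(gap_m: float, own_speed_mps: float,
                           lead_speed_mps: float, ttc_threshold_s: float = 2.0) -> bool:
    """Brake if the time-to-collision with the vehicle ahead drops below a threshold.

    gap_m          -- current distance to the vehicle ahead, in meters
    own_speed_mps  -- our speed, meters per second
    lead_speed_mps -- lead vehicle's speed, meters per second
    """
    closing_speed = own_speed_mps - lead_speed_mps
    if closing_speed <= 0:  # not closing the gap, nothing to do
        return False
    time_to_collision = gap_m / closing_speed
    return time_to_collision < ttc_threshold_s

# Example: 30 m behind a stopped car while doing 25 m/s (~56 mph)
print(should_emergency_brake(gap_m=30, own_speed_mps=25, lead_speed_mps=0))  # True: 1.2 s to impact
```

A computer can evaluate a rule like this many times per second with no lapse in attention, which is the point being made above.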

From: bigeasygator
14-Dec-17
"Again, the two are not related just because you want them to be."

They most certainly are. ALL regulation comes with a cost, be it regulation on immigration or regulation on vehicle safety requirements. That cost can be at the expense of our wallets, or at the expense of our freedoms (usually it's both). Because you fail to see the cost of restrictive immigration policies doesn't mean it isn't there. And again, if I'm going to put regulations in place in the name of safety, I'd prefer they actually make citizens of this country safer.

From: TGbow
14-Dec-17
Now I understand more how these Big gov politicians get reelected.

From: Coyote 65
14-Dec-17
Never going to happen; unions will insist on a driver onboard. Safety, you know. And think of what happens in a sudden ice storm when a school bus full of Muslim kindergartners gets T-boned by a big rig.

Terry

From: Woods Walker
15-Dec-17
"And think of what happens in a sudden ice storm when a school bus full of muslim kindergardners get tboned by a big rig."

Yes. And when something like this DOES eventually happen, WHO'S liable???

Or will it be that since, as the driver of the vehicle, you CHOSE to have it on auto-drive, it's in fact YOUR fault?

From: HA/KS
15-Dec-17
If you read the article, there are already rigs driving themselves from TX to CA delivering refrigerators. There is an observing driver on board.

They also have these rigs on the road in FL.

15-Dec-17
This is the future, and I'm thinking it won't be long until the "observing" human driver is no longer needed.

From: Woods Walker
15-Dec-17
If I were starting college now I'd go to law school!!!! All you need is to win ONE case against GM or Ford, etc. when the robot truck eventually malfunctions and kills someone and you could RETIRE!!!!

From: Mike B
15-Dec-17
This *may* work for some local distributors, etc., but I can't see it working in long-haul applications.

There are just too many factors involved with running longer distances, including climbing grades and going down 'em, dealing with snow, wind and a jillion cars going every which way to get around the slower semi.

It does take skill to get that rig down the road safely every time, not just the ability to operate the truck, but watching for issues before they become a problem. An experienced driver feels the road with his ass..it tells you what's going on underneath you...a computer hasn't got that instinct.

So, what does an 80,000 lb. driverless vehicle do when it's traveling 70 mph down the interstate and there's an elk standing in the road 100 yds in front of it? Mash the brakes? Try to go around? Plow it down like it wasn't there? If you can't safely avoid it, the best bet is to take it right down the middle of the hood, hoping it goes under. It'll rip out your crossover line, so first thing after stopping you get under there and shut off the valves before all your fuel runs out. Then you clean the rest of it out of your grill (if it's not trashed), and carry on. Doubt an autonomous vehicle can do that, or make the correct decision every time.

First time one of them has a ginormous wreck that people die in, the cost for insurance will go so high it will not be profitable to run them.

IMO it's bad, bad idea.

From: Amoebus
15-Dec-17

Amoebus's Link
Mike - all those things are now done by computers. The only difference is the computers make those decisions in nano-seconds. Depending on the attentiveness of the human driver, those decisions are made in seconds or never (if they have fallen asleep).

I don't doubt there will be issues. The big question is the number of issues that can be solved with the automation. If you can get rid of 90% of drunk driving deaths (remember that some are the drunks and the rest are the innocents) by putting them into an automatic car, then the overall effect will be acceptable - as long as you don't cause just as many deaths by having the automatically driven semi plow into elementary schools.

I really like the Google Maps concept that is available now. You put in your ending address and it determines the fastest route - all of it based on other drivers who are using Google Maps at the same time on your route. You combine an automated trucking fleet with an automatic mapping utility and you avoid semi traffic during rush hours in the big cities.

All of this was tested/preceded by the DARPA Grand Challenge (see link).

From: bigeasygator
15-Dec-17
“An experienced driver feels the road with his ass..it tells you what's going on underneath you...a computer hasn't got that instinct.”

I hear what you’re saying, but have to disagree. Traction control systems were one of the first forms of automation in vehicles, and they are all about feeling the road. Computers are far more effective at feeling and reacting to the road than humans are - they can detect and correct slipping tires WAY faster and WAY more consistently than humans, and they can be programmed to make the right decision with this information far faster and more effectively than humans can just about every time. A computer is far better equipped to deal with challenging road conditions - ice, rain, grade, etc - than humans are.
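
A deliberately simplified sketch of the slip-ratio idea behind traction control: compare how fast the driven wheel is spinning with how fast the vehicle is actually moving, and cut torque when the difference gets too large. The slip limit and scaling rule are illustrative assumptions; production systems use per-wheel sensors, brake intervention, and much faster control loops (and, as a later post notes, some situations call for different behavior).

```python
def traction_control_torque(requested_torque: float, wheel_speed_mps: float,
                            vehicle_speed_mps: float, slip_limit: float = 0.15) -> float:
    """Reduce engine torque when drive-wheel slip exceeds a limit."""
    if vehicle_speed_mps < 1.0:
        vehicle_speed_mps = 1.0  # floor at crawl speeds to avoid dividing by ~zero
    slip = (wheel_speed_mps - vehicle_speed_mps) / vehicle_speed_mps
    if slip > slip_limit:
        # scale torque back in proportion to how far over the limit we are
        return requested_torque * max(0.0, 1.0 - (slip - slip_limit))
    return requested_torque

# Example: wheel spinning at 12 m/s while the truck is only moving 8 m/s (slip = 0.5)
print(traction_control_torque(400.0, 12.0, 8.0))  # torque cut from 400 to 260
```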

From: Mike B
15-Dec-17
BEG/Amoebus: Y'all are welcome to your opinions, but for me I'll stick by mine. I've got almost a million miles driving big trucks, and there's just too much odd stuff that happens fast that a computer can neither anticipate nor manage as well as an experienced driver.

In perfect road conditions it might get away with it; unfortunately, road conditions and weather are never perfect. It's a disaster waiting to happen.

From: bigeasygator
15-Dec-17
“there's just too much odd stuff that happens fast that a computer can neither anticipate nor manage as well as an experienced driver.”

I understand that you have a lot of experience to form that opinion, but you’re also only approaching it from your perspective. When it comes to reacting to road conditions quickly and effectively, it’s really no contest. A computer will win just about every time. Our brains just aren’t that good, even though we’d like to think so.

Plus there’s a much smaller learning curve with computers. Sure, the gap between someone with millions of miles under their belt and a computer might be smaller, but compare how long it takes to bring a person up to that level of competence versus a computer.

A bigger question might be how autonomous vehicles deal with mechanical failure, but I could see a scenario where the truck is programmed to pull over and call out a mechanic.

From: Franzen
15-Dec-17
"A computer is far better equipped to deal with challenging road conditions - ice, rain, grade, etc - than humans are. "

Based on my experience with traction control systems, that is not necessarily a correct statement in all cases (in some instances it may be). One example: when driving through deeper snow and all your wheels are losing traction, a decent driver knows you need more power, whereas the systems I have seen all cut power. I regularly turn off the system in those situations. I'm not saying they couldn't be programmed differently or just shut the vehicle down in bad conditions, but there are going to be problems regardless of programming.

I say all this because there are a lot more hurdles to overcome prior to this technology becoming widespread, and it isn't really all that close. Of course those that are developing the technology are going to set themselves up for success with basic examples of how the technology could be used. Even in the OP link, the robotruck was only taking on the most direct, easy portion of the route, with human drivers taking over near the terminals. I sat in a short presentation for autonomous vehicles not too long ago, and from my recollection having autonomous and non-autonomous vehicles simultaneously on the roads was an enormous hurdle that the industry was nowhere near overcoming. I have no idea whether or not he was a leading expert in the industry though.

The DARPA Urban Challenge was somewhat intriguing, but only 6 vehicles even finished the course and from what little reading I did it appeared that the vehicles were very slow and likely accrued lots of error penalties. I didn't dig too deep to figure out the difficulty of the course in comparison with a true urban environment. Of course that was 10 years ago as well (from link above), thus technology has certainly improved. Of course the 'ol govt. blank check was covering the tab.

From: HA/KS
15-Dec-17
I remember when bank tellers said that ATMs would never work.

From: bigeasygator
15-Dec-17

bigeasygator's embedded Photo
I saw this chart from McKinsey one time that I thought summarized automation potential fairly well.

From: Franzen
15-Dec-17
I'm not sure if that was directed at me or not, but that's not what I'm trying to get at in any shape or fashion. Simply saying it's further off than some are suggesting. Automating a vehicle to take on a programmed route isn't all that difficult with today's technology. Integrating it with all the unknown conditions throughout our country's system of roadways is.

From: South Farm
15-Dec-17
The way people drive here in Minnesota I think self-driving cars would be a HUGE improvement. How some of you ever got a license is beyond me because you drive like old people scr...

From: foxbo
15-Dec-17
I worked in the heavy truck business most of my life. I don't ever see a truck doing a good job without an operator. Makes zero sense to me.

From: HA/KS
17-Dec-17
Foxbo, we used to have to talk to an operator to make a phone call.

For many years, every automobile had a driver and a mechanic whenever they went on a lengthy trip.

In former days, farm tractors required the driver to drive; now he is just an observer most of the time.

From: Mike B
18-Dec-17

Mike B's embedded Photo

From: Amoebus
19-Dec-17
Good timing with this thread and the recent Amtrak crash. Turns out there is a similar mechanism for trains called PTC (Positive Train Control), whose aim is to prevent some or all of the roughly 40% of train accidents caused by human error.

It wasn't fully implemented on that track yet, but if speed was the only factor in the crash, it sounds like it could really have been useful.
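
As a conceptual sketch, the speed-enforcement piece of PTC works roughly like this: warn the crew when the train exceeds the track's limit, and apply the brakes automatically if it stays overspeed. The function below is an illustration of that idea only, not the actual PTC specification or its warning sequence.

```python
def ptc_overspeed_action(track_limit_mph: float, train_speed_mph: float,
                         warning_already_given: bool) -> str:
    """If the train is over the track's speed limit, warn the crew first;
    if it is still overspeed after a warning, apply the brakes automatically."""
    if train_speed_mph <= track_limit_mph:
        return "no action"
    if not warning_already_given:
        return "warn crew"
    return "apply penalty brake"

# The reported scenario: 80 mph through territory limited to 30 mph
print(ptc_overspeed_action(30, 80, warning_already_given=False))  # warn crew
print(ptc_overspeed_action(30, 80, warning_already_given=True))   # apply penalty brake
```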

From: foxbo
19-Dec-17
I'd like to see that computer back up to a roll off container and get out and hook the cable. I'd also like to see it back up at the dump and open the back door. It ain't happening.

From: Woods Walker
19-Dec-17
The Amtrak train was just reported to have been doing 80 MPH in an area zoned for 30. They have yet to determine WHY. Should be interesting.

From: bigeasygator
19-Dec-17
Foxbo, there’s no question a computer will be able to back up a truck more efficiently than a human. The other tasks you described are a bigger challenge, but some of them might not require more automation and can still lead to efficiency. For example, when looking at operations at a dump, rather than rely on every driver to perform that task, the role can be shifted to someone manning the dump. If that is the only hurdle to complete automation associated with that operation, you can shift the work to one person instead of dozens of drivers.

There are also clearly specific operations that aren’t going to lend themselves to automation. That said, it seems a lot of transport services are.

From: foxbo
19-Dec-17
A roll off driver will pull anywhere from eight to a dozen loads a day. Each pull requires the driver to get out of the truck, hook the cable, and secure the ratchet straps. Open tops need to be covered also. No computer will ever accomplish these tasks.

19-Dec-17
"Each pull requires the driver to get out of the truck, hook the cable, and secure the ratchet straps. Open tops need to be covered also. No computer will ever accomplish these tasks. "

That's not driving. We are talking about trucks that drive themselves not robots that do the human part....but that's coming too.

From: bad karma
19-Dec-17
Yes, indeed. Think of the market for sex robots at truck stops.

From: Amoebus
19-Dec-17
foxbo - "Each pull requires the driver to get out of the truck, hook the cable, and secure the ratchet straps. Open tops need to be covered also. No computer will ever accomplish these tasks. "

Repeated tasks in a controlled environment are going to be easier to automate with robots/computers than getting a semi cross-country with all the unknowns pointed out by people above. Just look at an automotive assembly line in 1917 versus one in 2017.
