NEWS HEADLINE DISCUSSIONS

Self-Driving Uber Car Racks Up First Kill

POSTED BY: JEWELSTAITEFAN
UPDATED: Thursday, September 19, 2024 05:05
VIEWED: 15536
PAGE 1 of 2

Monday, March 19, 2018 7:33 PM

JEWELSTAITEFAN


How many points was it worth?
IIRC, video games gave 10 points for cars, but pedestrians were worth 40 points.


Killed a woman pedestrian in Tempe, AZ.


https://ca.finance.yahoo.com/news/self-driving-uber-car-hit-055237750.html



But hey, they are safe!!!
This is not the Self-Driving Kill you are looking for, move along now.
Never mind that man behind the curtain.

There was even a dolt in the "Driver's" seat, but the car was in self-driving mode.

They think this was the first kill, not sure if this is completely accurate.
Idiots who use Self-Driving: 1, Humans: -1.



NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Wednesday, March 21, 2018 1:33 PM

JEWELSTAITEFAN


Maybe cities which allow self-driving vehicles should be required to post warning signs at every entrance to the city.

If the dead woman was a supporter of self-driving, she might be nominated for a Darwin Award.


Wednesday, March 21, 2018 2:10 PM

WISHIMAY


One thing I learned implicitly last year is that you are not actually in control no matter what. You can be awake, aware, have good reaction timing, and STILL get squashed like a bug. You are at the mercy of other humans' actions, after all.

Eventually, automated driving will be safer than driving yourself. It's coming no matter what you think about it.

There is too much profit to be had from it NOT to...


Wednesday, March 21, 2018 11:37 PM

WISHIMAY


I think I would still trust A.I. driving over people. AI doesn't have ego, road rage, grudges, bad vision, can't be drunk, stoned, sick or stupid.

And I saw the video of this today. The woman who was hit was being a complete idiot and she paid the price; I'm not sure I could've stopped in time either.


Sunday, March 25, 2018 9:27 AM

SECOND

The Joss Whedon script for Serenity, where Wash lives, is Serenity-190pages.pdf at https://www.mediafire.com/two


Quote:

Originally posted by Wishimay:
I think I would still trust A.I. driving over people. AI doesn't have ego, road rage, grudges, bad vision, can't be drunk, stoned, sick or stupid.

It depends on who designed the A.I.
Not all designs are better than a drunk driver. For example:

The New York Times reported that Uber's self-driving technology worked on average for just 13 miles in Arizona before requiring human correction to avoid a crash, compared to an average of 5,600 miles for one of Uber's top competitors in the market, Waymo, in nearby California. Uber has not released data for its self-driving car tests in California.
http://thehill.com/policy/technology/380038-ubers-self-driving-cars-in-arizona-averaged-only-13-miles-without
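The gap between those two reported intervention rates can be made concrete with a quick back-of-the-envelope calculation (a sketch using only the 13-mile and 5,600-mile figures quoted above):

```python
# Rough comparison of reported autonomous miles per human intervention,
# using the figures quoted from the New York Times report above.
uber_miles_per_intervention = 13       # Uber in Arizona (reported average)
waymo_miles_per_intervention = 5_600   # Waymo in California (reported average)

ratio = waymo_miles_per_intervention / uber_miles_per_intervention
print(f"Waymo went roughly {ratio:.0f}x farther between interventions")
# roughly 431x
```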


Some A.I. designers don't worry about killing people. It is all just the price of progress and the lawyers will handle it for the engineers. Don't bother the engineers about those legal details that can be solved by throwing money at the relatives of the deceased. For another example, again at Uber:

New York Magazine once quoted Levandowski, the engineer in charge at Uber, as saying, “I’m pissed we didn’t have the first death,” to a group of Uber engineers after a driver died in a competitor's car, a Tesla on Autopilot, in 2016. (Levandowski has denied ever saying it.)
www.theverge.com/2018/3/20/17144090/uber-car-accident-arizona-safety-anthony-levandowski-waymo



The Joss Whedon script for Serenity, where Wash lives, is Serenity-190pages.pdf at www.mediafire.com/folder/1uwh75oa407q8/Firefly


Sunday, March 25, 2018 12:00 PM

WISHIMAY



Quote:

Originally posted by Wishimay:
I think I would



I said "would". I don't trust them NOW. I think all designs are still too primitive. Give these things the ability to learn from previous mistakes and in 20 yrs I'll think about it.


Sunday, March 25, 2018 12:48 PM

JEWELSTAITEFAN


Quote:

Originally posted by Wishimay:
Quote:

Originally posted by Wishimay:
I think I would

I said "would". I don't trust them NOW. I think all designs are still too primitive. Give these things the ability to learn from previous mistakes and in 20 yrs I'll think about it

Previous mistakes = future dead humans, totally preventable deaths.

Wouldn't it be more beneficial if all these "previous mistakes" occurred in the lab, where the future dead could be the programmers?


Sunday, March 25, 2018 3:28 PM

WISHIMAY


Quote:

Originally posted by JEWELSTAITEFAN:


Wouldn't it be more beneficial if all these "previous mistakes" occurred in the lab, where the future dead could be the programmers?



Every time you put tire to road, you risk your life...why would you expect perfection from these things without being out on an actual road?? You can't program every scenario in a closed course...

They ALREADY have better accident per mile records than PEOPLE do, as Google started testing its first self-driving car in 2009.

ONE fatality from then till NINE years later?? It's impressive. Not enough to make me go near it yet, but still impressive.



Sunday, March 25, 2018 4:47 PM

SECOND



Quote:

Originally posted by Wishimay:

They ALREADY have better accident per mile records than PEOPLE do, as Google started testing its first self-driving car in 2009.

ONE fatality from then till NINE years later?? It's impressive. Not enough to make me go near it yet, but still impressive.

People driving had 1.02 fatalities per 100 million vehicle miles traveled in 2014. Uber has 1.00 fatalities per thousand vehicle miles. There is much room for improvement at Uber. Maybe next time Uber could apply the brakes? Or steer around the pedestrian? Or beep the horn? Flash the high beams on the headlights? Or do something other than drive straight toward the pedestrian at a constant speed in the same lane like a railroad train on tracks.

https://en.wikipedia.org/wiki/Transportation_safety_in_the_United_States#Road_safety
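Taking the two rates above at face value, the disparity works out to roughly five orders of magnitude (a sketch using the poster's own numbers; the per-thousand-miles figure for Uber is the post's assertion, not an independently verified rate):

```python
# Compare the two fatality rates quoted above, in fatalities per vehicle mile.
human_rate = 1.02 / 100_000_000   # 1.02 deaths per 100 million miles (2014)
uber_rate = 1.00 / 1_000          # the rate asserted in the post above

print(f"Uber's asserted rate is {uber_rate / human_rate:,.0f}x the human rate")
# about 98,039x, i.e. roughly five orders of magnitude
```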


Maybe the Uber Computer was not paying sufficient attention because it was watching youtube when it killed the pedestrian. Who knows?




Sunday, March 25, 2018 8:55 PM

WISHIMAY


I'm talking ALL self driving vehicles, obviously.
If a person doesn't have time to react, a computer system doesn't really either. A computer itself is faster, but this is an integrated system.

Crackhead shouldn't have been WALKING in a busy road at night with a bike. PERIOD. She had a death wish. She didn't even LOOK at oncoming traffic.

You play in traffic on a dark night with NOTHING reflective or lit up, you are GOING to get hit!






Sunday, March 25, 2018 9:48 PM

JEWELSTAITEFAN


Quote:

Originally posted by Wishimay:
Quote:

Originally posted by JEWELSTAITEFAN:
Wouldn't it be more beneficial if all these "previous mistakes" occurred in the lab, where the future dead could be the programmers?

Every time you put tire to road, you risk your life...why would you expect perfection from these things without being out on an actual road?? You can't program every scenario in a closed course...

They ALREADY have better accident per mile records than PEOPLE do, as Google started testing its first self-driving car in 2009.

ONE fatality from then till NINE years later?? It's impressive. Not enough to make me go near it yet, but still impressive.

Did you forget your medication today, or are you just dumb?
Putting tire to road risks your life? You must also fantasize collisions are accidents. You never risk your life when not putting tire to road? One wonders what planet you are on.
Only the most insipid programmer would attempt to program "every scenario" - but maybe you actually hit on the moronic reasoning that the faulty program resulted in death.

You can't comprehend math today either? With self-driving generating a 100,000 times worse fatality rate, you proclaim it safer?
And then to delude yourself into considering it impressive?

I've only logged about 1.9 million miles, but I haven't had a fatality yet, so either I'm smarter than a computer and its programmer, or more competent than a team of each. Which would be more impressive? Although I expect many others, nay the majority of non-BungBetty drivers, have evaded a fatality, I assume most have fewer miles logged than I.


Sunday, March 25, 2018 10:54 PM

JEWELSTAITEFAN


Quote:

Originally posted by Wishimay:
I'm talking ALL self driving vehicles, obviously.
If a person doesn't have time to react, a computer system doesn't really either. A computer itself is faster, but this is an integrated system.

Crackhead shouldn't have been WALKING in busy road at night with a bike. PERIOD. She had a death wish. She didn't even LOOK at oncoming traffic.

You play in traffic on a dark night with NOTHING reflective or lit up, you are GOING to get hit!

A computer cannot react as fast as a person? How did you even master the use of a keyboard?
I wonder if ANYBODY has a clue where to start in deciphering your level of stupid.
Did you think Anti-Lock Braking Systems are controlled, actuated, and processed by chipmunks or nanobots? ABS was employed on the first 747 Jumbos, and that technology is now 50 years old.
For talented and skilled humans to even view or interact with computer systems, sensors, and actuators of even 30-year vintage requires oscilloscopes, spectrum analyzers, high-speed cameras, and other time-slowing, time-sampling, time-filtering instruments and devices.

You think that computers, or integrated systems, can see fully, and solely, in the human visual spectrum? There exist no sensors in all of technology capable of detecting a wisp of a 5-foot, 100-pound pedestrian traversing at such lightning velocity as human walking speed with a bicycle?

What does time of day matter to a computer detection?
What does reflectivity matter to a computer detection?
What does lit up matter to a computer detection?
Of all the vehicles and human drivers that she managed to not get hit by in her life, her luck ran out when she first met a non-human driven car.


Monday, March 26, 2018 3:17 AM

WISHIMAY


Quote:

Originally posted by JEWELSTAITEFAN:

Of all the vehicles and human drivers that she managed to not get hit by in her life, her luck ran out when she first met a non-human driven car.


Drinking it up tonight, eh? I see your shit for brains mode has come out to play.

THERE WAS A HUMAN BEHIND THE WHEEL.


Monday, March 26, 2018 3:21 AM

WISHIMAY


Quote:

Originally posted by JEWELSTAITEFAN:

And then to delude yourself into considering it impressive?




Not going to mention your motorcycle accident? As big a jerk as you are, I'm sure you've caused a few, too.

ONE fatality in 9 years, with ALL the self driving cars out there...

I'd still take IT over YOU any day and twice on Sundays.


Monday, March 26, 2018 3:58 AM

JEWELSTAITEFAN


Quote:

Originally posted by Wishimay:
Quote:

Originally posted by JEWELSTAITEFAN:

Of all the vehicles and human drivers that she managed to not get hit by in her life, her luck ran out when she first met a non-human driven car.

Drinking it up tonight, eh?

THERE WAS A HUMAN BEHIND THE WHEEL.

Still can't find your meds? Your script run out?

The car was not being driven by a human. The human droid occupying the seat behind the wheel was far too busy staring at their lap, or the floor, to prevent the sacrifice of a human life in the pursuit of technological regression.
Ever heard of Don't Text And Drive, BungBetty?
In some states, the Driving Under the Influence statute is worded such that it includes the influence of distraction.


Monday, March 26, 2018 12:25 PM

MOOSE


So News Headlines is now an extension of RWED and its hateful bullshit?
One more place to ignore. Hopefully Haken will remove it from the latest discussion listing, just like RWED.


Monday, March 26, 2018 2:37 PM

JEWELSTAITEFAN


Quote:

Originally posted by Moose:
So News Headlines is now an extension of RWED and its hateful bullshit?
One more place to ignore. Hopefully Haken will remove it from the latest discussion listing, just like RWED.

Sorry you feel that way.
RWED seems to have become nonstop politics. Posting nonpolitical Real World Events there only ensures the thread will get buried.

I posted this here because the subject is not fundamentally political. The subject was a News Headline. There has been some Discussion on topic, which I am glad for. RWED and here are absolutely not the only 2 subforums where stupid comments are posted.

Some people here would want to call you Forum Nazi or Gestapo or Police for suggesting threads were not in their proper forum.

Where do you think this thread should have been placed?
After 6 days in RWED it would have been buried, without ensuing Discussion.
We haven't been able to prevent stupid comments from getting posted in any subforum that I've seen here.

Do you disagree with this thread being posted here? Not trying to argue, just wondering your view.


Monday, March 26, 2018 3:24 PM

MOOSE


If you look back at the history of this forum, you’ll see the vast majority of the threads are about News Headlines that concern the show. But since there has been a lack of Firefly in the headlines the past few years, I kinda understand you all hijacking it for your own use.

But that’s not even what I’m complaining about. Reread this thread and if you can’t figure it out, oh well.
It ultimately doesn’t matter. This place has been on life support for a long time.

Oh, I’m not a fan of self driving vehicles but I do wonder if a human driver could have avoided that accident.
Maybe, maybe not.


Monday, March 26, 2018 10:38 PM

SECOND



Quote:

Originally posted by Moose:

Oh, I’m not a fan of self driving vehicles but I do wonder if a human driver could have avoided that accident.
Maybe, maybe not.

A human driver who wasn't half asleep while driving that dark road would have used the high beams because there wasn't any oncoming traffic. The pedestrian would have shown up much better. The Arizona governor has expressed himself forcefully:

Updated March 26, 2018, 10:14 p.m. ET

Arizona Gov. Doug Ducey on Monday ordered Uber Technologies Inc. to suspend testing autonomous vehicles on public roadways in the state, a blow to the company’s development efforts after one of its self-driving cars struck and killed a pedestrian in Tempe.

Mr. Ducey, a Republican who welcomed Uber’s self-driving technology with open arms to Arizona in 2016, said in a letter to Uber’s chief executive that he had directed the state’s department of transportation to suspend the company’s ability to test the cars.




Monday, March 26, 2018 11:20 PM

SECOND



Safety drivers are supposed to be our bridge to a self-driving world. They might not have the attention spans for it. “The bottom line is that humans are terrible babysitters of automation, which is why we are in such a dangerous period of time with good, but not great, autonomous cars,” Cummings told Slate in an email. This observation could foretell problems with semi-autonomous cars and trucks.

https://slate.com/technology/2018/03/safety-drivers-attention-spans-might-slow-self-driving-car-progress.html



Monday, March 26, 2018 11:31 PM

JEWELSTAITEFAN


Quote:

Originally posted by Moose:
If you look back at the history of this forum, you’ll see the vast majority of the threads are about News Headlines that concern the show. But since there has been lack of Firefly in the headlines the past few years, I kinda understand you all hijacking it for your own use.

But that’s not even what I’m complaining about. Reread this thread and if you can’t figure it out, oh well.
It ultimately doesn’t matter. This place has been on life support for a long time.

Oh, I’m not a fan of self driving vehicles but I do wonder if a human driver could have avoided that accident.
Maybe, maybe not.

I was using the descriptive subtitle on the "To The Boards" page, which I try to do with each of the subforums. I was not attempting hijacking.


Monday, March 26, 2018 11:35 PM

JEWELSTAITEFAN


Quote:

Originally posted by second:
Quote:

Originally posted by Moose:

Oh, I’m not a fan of self driving vehicles but I do wonder if a human driver could have avoided that accident.
Maybe, maybe not.

A human driver who wasn't half asleep and driving that dark road would have used high beams on the headlights because there wasn't any oncoming traffic. The pedestrian would have shown much better. The Arizona governor has expressed himself forcefully:

Updated March 26,2018, 10:14 p.m. ET

Arizona Gov. Doug Ducey on Monday ordered Uber Technologies Inc. to suspend testing autonomous vehicles on public roadways in the state, a blow to the company’s development efforts after one of its self-driving cars struck and killed a pedestrian in Tempe.

Mr. Ducey, a Republican who welcomed Uber’s self-driving technology with open arms to Arizona in 2016, said in a letter to Uber’s chief executive that he had directed the state’s department of transportation to suspend the company’s ability to test the cars.

Well, that is heartening, and refreshing.


Tuesday, March 27, 2018 9:06 PM

SECOND



March 27, 2018 / 5:13 PM

The new Uber driverless vehicle is armed with only one roof-mounted lidar sensor compared with seven lidar units on the older Ford Fusion models Uber employed, according to diagrams prepared by Uber.

In scaling back to a single lidar on the Volvo, Uber introduced a blind zone around the perimeter of the SUV that cannot fully detect pedestrians, according to interviews with former employees and Raj Rajkumar, the head of Carnegie Mellon University's transportation center who has been working on self-driving technology for over a decade.

The lidar system made by Velodyne - one of the top suppliers of sensors for self-driving vehicles - sees objects in a 360-degree circle around the car, but has a narrow vertical range that prevents it from detecting obstacles low to the ground, according to information on Velodyne’s website as well as former employees who operated the Uber SUVs.

Autonomous vehicles operated by rivals Waymo, Alphabet Inc's self-driving vehicle unit, have six lidar sensors, while General Motors Co's vehicle contains five, according to information from the companies.

Uber declined to comment on its decision to reduce its lidar count, and referred questions on the blind spot to Velodyne. Velodyne acknowledged that with the rooftop lidar there is a roughly three meter blind spot around a vehicle, saying that more sensors are necessary.

"If you're going to avoid pedestrians, you're going to need to have a side lidar to see those pedestrians and avoid them, especially at night," Marta Hall, president and chief business development officer at Velodyne, told Reuters.

www.reuters.com/article/us-uber-selfdriving-sensors-insight/ubers-use-of-fewer-safety-sensors-prompts-questions-after-arizona-crash-idUSKBN1H337Q
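The roughly three-meter blind spot Velodyne describes follows from simple geometry: a single roof-mounted lidar with a limited downward field of view cannot see the ground close to the car. A sketch, assuming a mount height of about 1.8 m and a lower field-of-view edge about 25 degrees below horizontal (illustrative numbers, not Uber's actual configuration):

```python
import math

# Ground-level blind radius around a single roof-mounted lidar:
# the lowest beam leaves the sensor at `down_angle_deg` below horizontal,
# so it first reaches the ground at horizontal distance h / tan(angle).
mount_height_m = 1.8    # assumed sensor height on an SUV roof
down_angle_deg = 25.0   # assumed lower edge of the vertical field of view

blind_radius_m = mount_height_m / math.tan(math.radians(down_angle_deg))
print(f"Blind radius at ground level: {blind_radius_m:.1f} m")
# about 3.9 m with these assumed numbers
```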



Wednesday, March 28, 2018 1:54 AM

JEWELSTAITEFAN


Quote:

Originally posted by second:
March 27, 2018 / 5:13 PM

The new Uber driverless vehicle is armed with only one roof-mounted lidar sensor compared with seven lidar units on the older Ford Fusion models Uber employed, according to diagrams prepared by Uber.

In scaling back to a single lidar on the Volvo, Uber introduced a blind zone around the perimeter of the SUV that cannot fully detect pedestrians, according to interviews with former employees and Raj Rajkumar, the head of Carnegie Mellon University's transportation center who has been working on self-driving technology for over a decade.

The lidar system made by Velodyne - one of the top suppliers of sensors for self-driving vehicles - sees objects in a 360-degree circle around the car, but has a narrow vertical range that prevents it from detecting obstacles low to the ground, according to information on Velodyne’s website as well as former employees who operated the Uber SUVs.

Autonomous vehicles operated by rivals Waymo, Alphabet Inc's self-driving vehicle unit, have six lidar sensors, while General Motors Co's vehicle contains five, according to information from the companies.

Uber declined to comment on its decision to reduce its lidar count, and referred questions on the blind spot to Velodyne. Velodyne acknowledged that with the rooftop lidar there is a roughly three meter blind spot around a vehicle, saying that more sensors are necessary.

"If you're going to avoid pedestrians, you're going to need to have a side lidar to see those pedestrians and avoid them, especially at night," Marta Hall, president and chief business development officer at Velodyne, told Reuters.

www.reuters.com/article/us-uber-selfdriving-sensors-insight/ubers-use-of-fewer-safety-sensors-prompts-questions-after-arizona-crash-idUSKBN1H337Q


What kind of retard thinks this is OK? At least now we know they are targeting small kids.


Wednesday, March 28, 2018 7:25 AM

SECOND



Quote:

Originally posted by JEWELSTAITEFAN:

What kind of retard thinks this is OK? At least now we know they are targeting small kids.

In America, humans have only about 1 fatality every 100,000,000 miles, and they do not have 7 lidars watching for pedestrians, don't have 10 radars watching for other cars, and do not have 20 video cameras for eyes watching the road. Either the Uber video cameras are far worse than the average human eye or the Uber software is far worse than the average human's reflexes.

www.iihs.org/iihs/topics/t/general-statistics/fatalityfacts/state-by-state-overview


The earlier Uber's Ford Fusion test cars used seven lidars, seven radars and 20 cameras. The newer Volvo test vehicles use a single lidar, 10 radars and seven cameras, Uber said.

www.reuters.com/article/us-uber-selfdriving-sensors-insight/ubers-use-of-fewer-safety-sensors-prompts-questions-after-arizona-crash-idUSKBN1H337Q



Wednesday, March 28, 2018 10:42 AM

ZEEK


Honestly we don't know enough details right now to determine what went wrong. Self driving systems are not so much programmed as they are trained. The only way for the systems to learn is to continue to experience more and more different situations. One theory I've heard floated around is that they were potentially training the lidar sensors by using the other sensors to control the driving. Given that this happened at night that seems like a bad idea. Lidar is supposed to be better at detecting things that are difficult to see.

It could also be that the lidar sensor they were using was insufficient. That sounds negligent to me and could potentially turn into an expensive lawsuit.

We also have to remember that Uber is a company on the fast track to bankruptcy. They aren't making money. Instead they've been hemorrhaging it recently at a billion-dollar-per-quarter rate. Their self-driving platform is probably not something they can afford. However, Uber has never appeared to have a good corporate culture. It wouldn't surprise me if they're playing fast and loose with their self-driving project and are cutting corners to save money.

Overall, I wouldn't put this incident down as a knock on self driving cars. I'd say it's more a knock on Uber.


Wednesday, March 28, 2018 12:01 PM

MOOSE


Quote:

Originally posted by JEWELSTAITEFAN:
Quote:

Originally posted by Moose:
If you look back at the history of this forum, you’ll see the vast majority of the threads are about News Headlines that concern the show. But since there has been lack of Firefly in the headlines the past few years, I kinda understand you all hijacking it for your own use.

But that’s not even what I’m complaining about. Reread this thread and if you can’t figure it out, oh well.
It ultimately doesn’t matter. This place has been on life support for a long time.

Oh, I’m not a fan of self driving vehicles but I do wonder if a human driver could have avoided that accident.
Maybe, maybe not.

I was using the descriptive subtitle on the "To The Boards" page, which I try to do with each of the subforums. I was not attempting hijacking.



Fair enough.
Sorry I’ve been bitchy in this thread, I’ve had a f’ed up couple of weeks and let it spill into here.




Wednesday, March 28, 2018 1:24 PM

SECOND



Quote:

Originally posted by Zeek:
Honestly we don't know enough details right now to determine what went wrong. Self driving systems are not so much programmed as they are trained. The only way for the systems to learn is to continue to experience more and more different situations. One theory I've heard floated around is that they were potentially training the lidar sensors by using the other sensors to control the driving. Given that this happened at night that seems like a bad idea. Lidar is supposed to be better at detecting things that are difficult to see.

It could also be that the lidar sensor they were using was insufficient. That sounds negligent to me and could potentially turn into an expensive lawsuit.

Humans are ready to drive a car after a few weeks and a few hundred miles of driver's education. And the humans cause only 1 fatality per 100,000,000 vehicle miles. Whatever algorithm Uber is training, it is learning extremely slowly. If Uber was creating a super-humanly safe driving system, I'd expect it to take millions of miles of practice. But I'd also expect it from the very first year to be as good as a human student driver with no more training than a student driver gets. Uber should have never let their cars off a closed test track until it was as good as some dorky 15 year old qualifying for his license.


Wednesday, March 28, 2018 4:29 PM

JEWELSTAITEFAN


Quote:

Originally posted by Moose:
Quote:

Originally posted by JEWELSTAITEFAN:
Quote:

Originally posted by Moose:
If you look back at the history of this forum, you’ll see the vast majority of the threads are about News Headlines that concern the show. But since there has been lack of Firefly in the headlines the past few years, I kinda understand you all hijacking it for your own use.

But that’s not even what I’m complaining about. Reread this thread and if you can’t figure it out, oh well.
It ultimately doesn’t matter. This place has been on life support for a long time.

Oh, I’m not a fan of self driving vehicles but I do wonder if a human driver could have avoided that accident.
Maybe, maybe not.

I was using the descriptive subtitle on the "To The Boards" page, which I try to do with each of the subforums. I was not attempting hijacking.

Fair enough.
Sorry I’ve been bitchy in this thread, I’ve had a f’ed up couple of weeks and let it spill into here.

Thanks for the reply.
I suspect I understood what you were referring to, but that was beyond my control, and I couldn't conjure a way for me to satisfy the points you were making.

I do hope you will again call me out if I post a thread here which is truly not a News Headline for Discussion.


Thursday, March 29, 2018 10:50 AM

ZEEK


Quote:

Originally posted by second:
Humans are ready to drive a car after a few weeks and a few hundred miles of driver's education. And the humans cause only 1 fatality per 100,000,000 vehicle miles. Whatever algorithm Uber is training, it is learning extremely slowly. If Uber was creating a super-humanly safe driving system, I'd expect it to take millions of miles of practice. But I'd also expect it from the very first year to be as good as a human student driver with no more training than a student driver gets. Uber should have never let their cars off a closed test track until it was as good as some dorky 15 year old qualifying for his license.



That's not a fair comparison. The self driving system is learning everything. Not just driving. So, if we are following the same timeline then the system needs 15 years to be equal to a human. It's learning the equivalent of hand eye coordination. It's learning object permanence. It's learning what a dog is, what a cat is, what a deer is. It's learning the world just like we do as infants. We even get a head start because some of that is ingrained in our DNA and we just react on instincts. There's no evolution process to give a machine instincts. They have to learn it all.

However, being a specialized system it doesn't really need to learn all the things humans learn. It doesn't need to learn history, science, math, etc. It's not going to spend time on entertainment. It'll never understand pop culture, but we don't need or want it to learn those things. So, maybe 15 years is too generous. I'm fairly certain we'll see full autonomous cars commercially available in under 5 years. So, I think it will beat a human even in learning.

The other fun advantage is that they're talking about letting the system learn in a virtual environment. Sort of a souped-up car-driving videogame that allows the car to learn in the matrix. So, maybe we'll never hook a human up to the matrix to teach them kung fu, but for artificial intelligence it might just be a possibility. And then no human lives are on the line while it learns.
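That "learn in the matrix" idea is essentially reinforcement learning against a simulator. A toy sketch of the concept, with everything (the road model, rewards, parameters) invented for illustration:

```python
import random

# Toy sketch of "learning in a virtual environment": a 1-D road
# simulator where an agent must learn to brake before an obstacle.
# Everything here is hypothetical; real AV simulators model far more.

OBSTACLE = 5                 # obstacle sits at position 5
ACTIONS = ("drive", "brake")

def step(pos, action):
    """Advance the simulated car one tick; return (new_pos, reward, done)."""
    if action == "brake":
        return pos, 1.0, True        # stopped safely: reward, episode ends
    pos += 1
    if pos >= OBSTACLE:
        return pos, -10.0, True      # crashed into the obstacle
    return pos, 0.1, False           # made progress, keep driving

def train(episodes=2000, alpha=0.5, gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning: thousands of crashes happen only in simulation."""
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in range(OBSTACLE + 1) for a in ACTIONS}
    for _ in range(episodes):
        pos, done = 0, False
        while not done:
            if rng.random() < eps:                      # explore occasionally
                a = rng.choice(ACTIONS)
            else:                                       # otherwise act greedily
                a = max(ACTIONS, key=lambda x: q[(pos, x)])
            new_pos, r, done = step(pos, a)
            best_next = 0.0 if done else max(q[(new_pos, x)] for x in ACTIONS)
            q[(pos, a)] += alpha * (r + gamma * best_next - q[(pos, a)])
            pos = new_pos
    return q

q = train()
# The learned policy should prefer braking at the position just before
# the obstacle, having "died" many times in the virtual world to learn it.
```

No human is at risk while the agent racks up simulated crashes, which is exactly the selling point of virtual training.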

NOTIFY: N   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Thursday, March 29, 2018 1:01 PM

JEWELSTAITEFAN


Quote:

Originally posted by Zeek:
Quote:

Originally posted by second:
Humans are ready to drive a car after a few weeks and a few hundred miles of driver's education. And the humans cause only 1 fatality per 100,000,000 vehicle miles. Whatever algorithm Uber is training, it is learning extremely slowly. If Uber was creating a super-humanly safe driving system, I'd expect it to take millions of miles of practice. But I'd also expect it from the very first year to be as good as a human student driver with no more training than a student driver gets. Uber should have never let their cars off a closed test track until it was as good as some dorky 15 year old qualifying for his license.


That's not a fair comparison. The self driving system is learning everything. Not just driving. So, if we are following the same timeline then the system needs 15 years to be equal to a human. It's learning the equivalent of hand eye coordination. It's learning object permanence. It's learning what a dog is, what a cat is, what a deer is. It's learning the world just like we do as infants. We even get a head start because some of that is ingrained in our DNA and we just react on instincts. There's no evolution process to give a machine instincts. They have to learn it all.

However, being a specialized system it doesn't really need to learn all the things humans learn. It doesn't need to learn history, science, math, etc. It's not going to spend time on entertainment. It'll never understand pop culture, but we don't need or want it to learn those things. So, maybe 15 years is too generous. I'm fairly certain we'll see full autonomous cars commercially available in under 5 years. So, I think it will beat a human even in learning.

The other fun advantage is that they're talking about letting the system learn in a virtual environment. Sort of a souped-up car-driving videogame that allows the car to learn in the matrix. So, maybe we'll never hook a human up to the matrix to teach them kung fu, but for artificial intelligence it might just be a possibility. And then no human lives are on the line while it learns.

So they've only been on the road for 9 years so far, right?
The 5 year mark was 4 years ago.

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Thursday, March 29, 2018 1:08 PM

SECOND

The Joss Whedon script for Serenity, where Wash lives, is Serenity-190pages.pdf at https://www.mediafire.com/two


Quote:

Originally posted by Zeek:
Quote:

Originally posted by second:
Humans are ready to drive a car after a few weeks and a few hundred miles of driver's education. And the humans cause only 1 fatality per 100,000,000 vehicle miles. Whatever algorithm Uber is training, it is learning extremely slowly. If Uber was creating a super-humanly safe driving system, I'd expect it to take millions of miles of practice. But I'd also expect it from the very first year to be as good as a human student driver with no more training than a student driver gets. Uber should have never let their cars off a closed test track until it was as good as some dorky 15 year old qualifying for his license.

The Joss Whedon script for Serenity, where Wash lives, is Serenity-190pages.pdf at www.mediafire.com/folder/1uwh75oa407q8/Firefly


That's not a fair comparison. The self driving system is learning everything. Not just driving. So, if we are following the same timeline then the system needs 15 years to be equal to a human. It's learning the equivalent of hand eye coordination. It's learning object permanence. It's learning what a dog is, what a cat is, what a deer is. It's learning the world just like we do as infants. We even get a head start because some of that is ingrained in our DNA and we just react on instincts. There's no evolution process to give a machine instincts. They have to learn it all.

However, being a specialized system it doesn't really need to learn all the things humans learn. It doesn't need to learn history, science, math, etc. It's not going to spend time on entertainment. It'll never understand pop culture, but we don't need or want it to learn those things. So, maybe 15 years is too generous. I'm fairly certain we'll see full autonomous cars commercially available in under 5 years. So, I think it will beat a human even in learning.

The other fun advantage is that they're talking about letting the system learn in a virtual environment. Sort of a souped-up car-driving videogame that allows the car to learn in the matrix. So, maybe we'll never hook a human up to the matrix to teach them kung fu, but for artificial intelligence it might just be a possibility. And then no human lives are on the line while it learns.

It’s still an open question how the video, lidar, and radar sensors all managed to miss seeing a pedestrian in front of the car. I’ll bet Uber already has a pretty good idea of what happened, but so far they aren’t telling.

If I had to guess, the sensors were NOT working properly and the Uber Computer did NOT react by stopping on the side of the road with its hazard blinkers flashing while it waited for a technician to fix whatever knocked out the sensors. Maybe Uber did not program the computer to handle that unusual circumstance. I've seen human drivers not programmed for unusual circumstances. For one example: a teenage driver with a flat tire kept driving down the road at slow speed as the tire tore itself apart and ruined the aluminum rim. What was the reason the teenager gave as I replaced the wheel with the spare tire? Home was only a few more blocks away. This was a learning experience on what not to do on a front-wheel-drive car when the rear tire on the passenger side goes flat.
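The fallback described here, stop safely when the sensors go quiet instead of driving blind, amounts to a simple watchdog. A hypothetical sketch (the sensor names and staleness threshold are invented, not Uber's design):

```python
# Hypothetical sensor-health watchdog: if any critical sensor has not
# delivered a good reading recently, drop into a "minimal risk" mode
# (pull over, hazards on, wait for a technician) rather than keep driving.

def choose_mode(sensor_health, stale_after=0.5, now=10.0):
    """sensor_health maps sensor name -> timestamp of its last good reading.

    stale_after is how long (seconds) a sensor may be silent before the
    vehicle must stop trusting its world model.
    """
    for sensor, last_ok in sensor_health.items():
        if now - last_ok > stale_after:
            return "minimal_risk_stop"   # pull over, hazards, request service
    return "autonomous_drive"
```

The design point is that "keep going and hope" is never a valid state: any stale sensor forces the conservative branch.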

The Joss Whedon script for Serenity, where Wash lives, is Serenity-190pages.pdf at www.mediafire.com/folder/1uwh75oa407q8/Firefly

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Thursday, March 29, 2018 2:35 PM

JEWELSTAITEFAN


Quote:

Originally posted by JEWELSTAITEFAN:
Quote:

Originally posted by Moose:
Quote:

Originally posted by JEWELSTAITEFAN:
Quote:

Originally posted by Moose:
If you look back at the history of this forum, you’ll see the vast majority of the threads are about News Headlines that concern the show. But since there has been lack of Firefly in the headlines the past few years, I kinda understand you all hijacking it for your own use.

But that’s not even what I’m complaining about. Reread this thread and if you can’t figure it out, oh well.
It ultimately doesn’t matter. This place has been on life support for a long time.

Oh, I’m not a fan of self driving vehicles but I do wonder if a human driver could have avoided that accident.
Maybe, maybe not.

I was using the descriptive subtitle on the "To The Boards" page, which I try to do with each of the subforums. I was not attempting hijacking.

Fair enough.
Sorry I’ve been bitchy in this thread, I’ve had a f’ed up couple of weeks and let it spill into here.

Thanks for the reply.
I suspect I understood what you were referring to, but that was beyond my control, and I couldn't conjure a way for me to satisfy the points you were making.

I do hope you will again call me out if I post a thread here which is truly not a News Headline for Discussion.

Also, the history of the Forum is not always intuitive. Once, before Cinema, most Film topics were put in Talk Story or General Discussion. Same with Other Science Fiction Series. Sometimes the difference between Talk Story and News Headline Discussion escapes notice.

I see your point about the History of News Headline - and I wish more Headlines were about Firefly, such as Capt Mal on American Housewife, which is not in News Headline Forum.

When hunting for the most appropriate forum to place a thread, one sees that Talk Story is grouped with Cinema and RWED, while News Headline is grouped with Other Science Fiction Series and General Discussion. So I check other thread titles within those 2 to see where my new thread best fits, or matches.

Although some people choose to post their sexual exploits and sodomy adventures in General Discussion and Cinema, I do at least attempt some modicum of organization and conformity.

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Friday, March 30, 2018 10:46 AM

SECOND

The Joss Whedon script for Serenity, where Wash lives, is Serenity-190pages.pdf at https://www.mediafire.com/two


Prevent accidents like Uber’s self-driving car crash
https://qz.com/1241384

“Imagine if this camera [in Uber’s car] was actually analyzing the driver’s head pose, eye closure rate, eyes on the road or not, various emotional and cognitive states, and in real time was able to alert if the safety driver was not paying attention,” says Rana el Kaliouby, CEO of Affectiva, an AI startup working with auto manufacturers like BMW and Daimler on this problem.

Since the Uber car already had a driver-facing camera, el Kaliouby notes, the technology and infrastructure for this solution already exists. All that’s needed is to implement software that actually monitors the driver.

Cadillacs with the “Super Cruise” semi-autonomous mode already have a form of this technology. These cars have a small camera located in the steering wheel that tracks a person’s head and eyes. If the camera detects that a person is not looking at the road, the steering wheel begins to flash and the car makes a warning noise, prompting them to pay attention. If the driver still isn’t watching the road, the car automatically slows down to a stop, puts on the hazard lights, and calls the emergency vehicle-assistance service OnStar.

“How long it takes before the system notices a driver is not paying attention depends on your speed,” Robb Bolio, a lead engineer for GM’s autonomous vehicles unit, told CNBC. “If you are going 75 miles per hour, it’s three or four seconds, depending on the traffic around you. If you are in bumper-to-bumper traffic going 10 miles per hour, it’s a little longer.”

Euro NCAP, an automotive safety organization backed by the European Commission and five European governments, has already said that driver monitoring systems like this will be a key factor in how it rates cars for safety.

“The idea is that this is a symbiotic relationship between human and machine,” el Kaliouby says. “We need to leverage that relationship until we’re comfortable that these vehicles can drive in an autonomous mode, I don’t think anyone would say we’re at a place where we’re comfortable doing that.”
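The escalation sequence described above (flashing steering wheel, audible warning, then a controlled stop with hazards and an OnStar call) is essentially a timeout ladder keyed to how long the driver has looked away. A sketch with hypothetical thresholds, not GM's actual values:

```python
# Minimal sketch of escalating driver-monitoring alerts. The timeout
# and step sizes here are invented for illustration; in the real system
# the initial grace period depends on speed and surrounding traffic.

def escalation_action(seconds_inattentive, attention_timeout=4.0):
    """Map time-since-eyes-on-road to an escalating response."""
    if seconds_inattentive < attention_timeout:
        return "none"
    if seconds_inattentive < attention_timeout + 3:
        return "flash_steering_wheel"      # visual prompt first
    if seconds_inattentive < attention_timeout + 6:
        return "audible_warning"           # then a warning noise
    # Driver presumed unresponsive: controlled stop, hazards, call for help.
    return "stop_vehicle_hazards_contact_onstar"
```

Each rung gives the human a chance to recover before the car takes the drastic step itself.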

The Joss Whedon script for Serenity, where Wash lives, is Serenity-190pages.pdf at www.mediafire.com/folder/1uwh75oa407q8/Firefly

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Friday, March 30, 2018 11:04 AM

ZEEK


Quote:

Originally posted by JEWELSTAITEFAN:
So they've only been on the road for 9 years so far, right?
The 5 year mark was 4 years ago.


Yeah, I mean 5 years from the time of posting. I'm not sure they were on the road 9 years ago, though.

Quote:

Originally posted by second:
It’s still an open question how the video, lidar, and radar sensors all managed to miss seeing a pedestrian in front of the car. I’ll bet Uber already has a pretty good idea of what happened, but so far they aren’t telling.

If I had to guess, the sensors were NOT working properly and the Uber Computer did NOT react by stopping on the side of the road with its hazard blinkers flashing while it waited for a technician to fix whatever knocked out the sensors. Maybe Uber did not program the computer to handle that unusual circumstance. I've seen human drivers not programmed for unusual circumstances. For one example: a teenage driver with a flat tire kept driving down the road at slow speed as the tire tore itself apart and ruined the aluminum rim. What was the reason the teenager gave as I replaced the wheel with the spare tire? Home was only a few more blocks away. This was a learning experience on what not to do on a front-wheel-drive car when the rear tire on the passenger side goes flat.

The Joss Whedon script for Serenity, where Wash lives, is Serenity-190pages.pdf at www.mediafire.com/folder/1uwh75oa407q8/Firefly


I haven't seen anything in the news about whether Uber has recovered the vehicle or at least the data yet. I thought I read the police had confiscated it. My guess is they're trying to make heads or tails of what happened first. Kinda a tall order if they don't know how the system works in the first place. If they could even find the system logs I'd be impressed. I still wouldn't suspect they'd know what any of it means.

A friend of mine brought up that the data is probably worth millions to Uber. It is a learning experience for their systems. Does that then put into question whether it's in the company's best interest to create a situation where a car kills a person just for the data? A company's purpose is to make money and if they estimate that killing a person is profitable (after fines, PR nightmare, court cases, etc.) then that's a scary world. I'm hoping that's not already the case.

NOTIFY: N   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Friday, March 30, 2018 2:18 PM

JEWELSTAITEFAN


Quote:

Originally posted by second:
Prevent accidents like Uber’s self-driving car crash
https://qz.com/1241384

“Imagine if this camera [in Uber’s car] was actually analyzing the driver’s head pose, eye closure rate, eyes on the road or not, various emotional and cognitive states, and in real time was able to alert if the safety driver was not paying attention,” says Rana el Kaliouby, CEO of Affectiva, an AI startup working with auto manufacturers like BMW and Daimler on this problem.

Since the Uber car already had a driver-facing camera, el Kaliouby notes, the technology and infrastructure for this solution already exists. All that’s needed is to implement software that actually monitors the driver.

Cadillacs with the “Super Cruise” semi-autonomous mode already have a form of this technology. These cars have a small camera located in the steering wheel that tracks a person’s head and eyes. If the camera detects that a person is not looking at the road, the steering wheel begins to flash and the car makes a warning noise, prompting them to pay attention. If the driver still isn’t watching the road, the car automatically slows down to a stop, puts on the hazard lights, and calls the emergency vehicle-assistance service OnStar.

“How long it takes before the system notices a driver is not paying attention depends on your speed,” Robb Bolio, a lead engineer for GM’s autonomous vehicles unit, told CNBC. “If you are going 75 miles per hour, it’s three or four seconds, depending on the traffic around you. If you are in bumper-to-bumper traffic going 10 miles per hour, it’s a little longer.”

Euro NCAP, an automotive safety organization backed by the European Commission and five European governments, has already said that driver monitoring systems like this will be a key factor in how it rates cars for safety.

“The idea is that this is a symbiotic relationship between human and machine,” el Kaliouby says. “We need to leverage that relationship until we’re comfortable that these vehicles can drive in an autonomous mode, I don’t think anyone would say we’re at a place where we’re comfortable doing that.”

Does the Cadillac pull to the side of the road? Stopping in the middle lane of Interstate could be unhealthy.

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Friday, March 30, 2018 4:42 PM

SECOND

The Joss Whedon script for Serenity, where Wash lives, is Serenity-190pages.pdf at https://www.mediafire.com/two


Quote:

Originally posted by JEWELSTAITEFAN:

Does the Cadillac pull to the side of the road? Stopping in the middle lane of Interstate could be unhealthy.

You won’t like what the instructions say:

The system is designed to maintain the current lane. You need to take control to change lanes, steer around a traffic situation or object, merge into traffic, or exit the freeway. Super Cruise does not detect construction zones.

3RD ALERT
If the steering wheel light bar flashes red for too long, a voice prompt will be heard. You should take over steering immediately; otherwise, the vehicle will slow in your lane of travel and eventually brake to a stop. Super Cruise and Adaptive Cruise Control will disengage. In the event of an unresponsive driver, the vehicle will come to a controlled stop, activate the hazard lights, and contact OnStar Emergency Services.
www.cadillac.com/content/dam/cadillac/na/us/english/index/ownership/technology/supercruise/pdfs/2018-cad-ct6-supercruise-personalization.pdf


If you drop dead at 70mph or fall asleep, the Cadillac won’t drive your body to the hospital or to the edge of the road. It will simply stop moving a few seconds after your heart stops or you begin snoring. It won’t cruise west on I-10 until it runs out of gas.



The Joss Whedon script for Serenity, where Wash lives, is Serenity-190pages.pdf at www.mediafire.com/folder/1uwh75oa407q8/Firefly

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Friday, March 30, 2018 4:56 PM

JEWELSTAITEFAN


Quote:

Originally posted by second:
Quote:

Originally posted by JEWELSTAITEFAN:

Does the Cadillac pull to the side of the road? Stopping in the middle lane of Interstate could be unhealthy.

You won’t like what the instructions say:

The system is designed to maintain the current lane. You need to take control to change lanes, steer around a traffic situation or object, merge into traffic, or exit the freeway. Super Cruise does not detect construction zones.

3RD ALERT
If the steering wheel light bar flashes red for too long, a voice prompt will be heard. You should take over steering immediately; otherwise, the vehicle will slow in your lane of travel and eventually brake to a stop. Super Cruise and Adaptive Cruise Control will disengage. In the event of an unresponsive driver, the vehicle will come to a controlled stop, activate the hazard lights, and contact OnStar Emergency Services.
www.cadillac.com/content/dam/cadillac/na/us/english/index/ownership/technology/supercruise/pdfs/2018-cad-ct6-supercruise-personalization.pdf


If you drop dead at 70mph or fall asleep, the Cadillac won’t drive your body to the hospital or to the edge of the road. It will simply stop moving a few seconds after your heart stops or you begin snoring. It won’t cruise west on I-10 until it runs out of gas.




That sounds hazardous. Think of the middle lane of I-10.

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Saturday, March 31, 2018 9:03 AM

SECOND

The Joss Whedon script for Serenity, where Wash lives, is Serenity-190pages.pdf at https://www.mediafire.com/two


Quote:

Originally posted by JEWELSTAITEFAN:

That sounds hazardous. Think of the middle lane of I-10.

There is an article titled "Path to Autonomy: Self-Driving Car Levels 0 to 5 Explained". The Cadillac Super Cruise is only at level 2. The ultimate goal is level 5. No company except Waymo is anywhere close to a level 5 prototype. There was only one car that ever made it up to level 4, but it has been discontinued.
www.caranddriver.com/features/path-to-autonomy-self-driving-car-levels-0-to-5-explained-feature


Funny scene where Russell Crowe says, "You don't have to pull over. The car can drive itself." That is only true in dreams.
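For reference, the levels that article walks through come from the SAE J3016 standard. Paraphrased as a lookup table:

```python
# SAE J3016 driving-automation levels, paraphrased.
SAE_LEVELS = {
    0: "No automation: the human does everything",
    1: "Driver assistance: steering OR speed support (e.g. adaptive cruise)",
    2: "Partial automation: steering AND speed, human must supervise (Super Cruise)",
    3: "Conditional automation: system drives, human must take over on request",
    4: "High automation: no human needed, but only within a limited domain",
    5: "Full automation: drives anywhere a human could, no driver at all",
}
```

The Russell Crowe line only becomes true at level 4 or 5; a level 2 car that "drives itself" is the dangerous misunderstanding this thread is about.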



The Joss Whedon script for Serenity, where Wash lives, is Serenity-190pages.pdf at www.mediafire.com/folder/1uwh75oa407q8/Firefly

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Saturday, March 31, 2018 3:06 PM

JEWELSTAITEFAN


I just heard an MSM report that the driver was killed in the accident. That's not correct, is it?

The report also said the driver had not touched the wheel for more than 6 seconds, despite repeated warnings, alarms, and alerts.

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Saturday, March 31, 2018 3:24 PM

JEWELSTAITEFAN


And Kim Komando listed the criminal and driving convictions of that driver.

Plus, she reported that Waymo requires intervention by the safety driver an average of once every 5,600 miles. For Uber, the comparable figure is 13 miles.

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Monday, May 7, 2018 4:11 PM

SECOND

The Joss Whedon script for Serenity, where Wash lives, is Serenity-190pages.pdf at https://www.mediafire.com/two


Uber has reportedly discovered that the fatal crash involving one of its prototype self-driving cars was probably caused by software faultily set up to ignore objects in the road, sources told The Information.

Specifically, the autonomous programming detects items around the vehicle, and operators fine-tune its sensitivity to make sure it only reacts to true threats (solid objects instead of bags, for example). Unfortunately, the car's software was supposedly set too far in the other direction, and didn't stop in time to avoid hitting bicyclist Elaine Herzberg.

www.engadget.com/2018/05/07/uber-crash-reportedly-caused-by-software-that-ignored-objects-in/
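The "sensitivity" tuning the report describes is essentially a confidence threshold on the object detector: the planner only reacts to detections that score above it. A hypothetical sketch (labels and scores invented) of how setting the threshold too high filters out real obstacles along with the plastic bags:

```python
# Sketch of a tunable detection threshold. A detector emits candidate
# objects with confidence scores; only those above the threshold reach
# the planner. Tune it too high (to suppress false alarms on bags and
# debris) and real obstacles get filtered out too.

def threats(detections, threshold):
    """Keep only detections confident enough to trigger braking/steering."""
    return [d["label"] for d in detections if d["confidence"] >= threshold]

frame = [
    {"label": "plastic_bag", "confidence": 0.30},
    {"label": "pedestrian_with_bicycle", "confidence": 0.55},
]

# A cautious threshold (0.5) reacts to the pedestrian; an over-permissive
# one (0.8) ignores both detections entirely.
```

This is the trade-off in one line: every notch of false-alarm suppression also raises the bar a genuine pedestrian has to clear.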

The Joss Whedon script for Serenity, where Wash lives, is Serenity-190pages.pdf at www.mediafire.com/folder/1uwh75oa407q8/Firefly

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Monday, May 7, 2018 7:34 PM

JEWELSTAITEFAN


Quote:

Originally posted by second:
Uber has reportedly discovered that the fatal crash involving one of its prototype self-driving cars was probably caused by software faultily set up to ignore objects in the road, sources told The Information.

Specifically, the autonomous programming detects items around the vehicle, and operators fine-tune its sensitivity to make sure it only reacts to true threats (solid objects instead of bags, for example). Unfortunately, the car's software was supposedly set too far in the other direction, and didn't stop in time to avoid hitting bicyclist Elaine Herzberg.

www.engadget.com/2018/05/07/uber-crash-reportedly-caused-by-software-that-ignored-objects-in/


She was not riding her bike, right?
A poorly designed system, one which allows sensitivity to be "adjusted" off spectrum, could treat a walked bicycle in motion as if it were transparent.
A poorly designed system, yet one which the state authorized to run loose on the roads without a qualified driver. Whoever authorized that should also be charged, along with the company.

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Tuesday, May 8, 2018 10:27 AM

ZEEK


Quote:

Originally posted by second:
Uber has reportedly discovered that the fatal crash involving one of its prototype self-driving cars was probably caused by software faultily set up to ignore objects in the road, sources told The Information.

Specifically, the autonomous programming detects items around the vehicle, and operators fine-tune its sensitivity to make sure it only reacts to true threats (solid objects instead of bags, for example). Unfortunately, the car's software was supposedly set too far in the other direction, and didn't stop in time to avoid hitting bicyclist Elaine Herzberg.

www.engadget.com/2018/05/07/uber-crash-reportedly-caused-by-software-that-ignored-objects-in/



I really hope Uber was trying to give a simplistic explanation of their software, because that does not sound like the proper way to build a machine-learning algorithm. You don't have a "sensitivity" to objects. You get the system to learn how to identify what an object is. Besides, something the size of a person and a bike should definitely not be assumed to be a non-solid object. That "sensitivity" would have to be almost entirely disabled at that point. I mean, by size that's extremely similar to a motorcycle. You can't just plow through those like they're a bag.

Uber probably should not be in the self driving business. They just don't have the resources to develop a good system. They're burning cash like crazy and that is not a good situation to be investing in a lot of research and development.

NOTIFY: N   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Tuesday, May 8, 2018 4:17 PM

JEWELSTAITEFAN


Quote:

Originally posted by Zeek:
Quote:

Originally posted by second:
Uber has reportedly discovered that the fatal crash involving one of its prototype self-driving cars was probably caused by software faultily set up to ignore objects in the road, sources told The Information.

Specifically, the autonomous programming detects items around the vehicle, and operators fine-tune its sensitivity to make sure it only reacts to true threats (solid objects instead of bags, for example). Unfortunately, the car's software was supposedly set too far in the other direction, and didn't stop in time to avoid hitting bicyclist Elaine Herzberg.

www.engadget.com/2018/05/07/uber-crash-reportedly-caused-by-software-that-ignored-objects-in/



I really hope Uber was trying to give a simplistic explanation of their software, because that does not sound like the proper way to build a machine-learning algorithm. You don't have a "sensitivity" to objects. You get the system to learn how to identify what an object is. Besides, something the size of a person and a bike should definitely not be assumed to be a non-solid object. That "sensitivity" would have to be almost entirely disabled at that point. I mean, by size that's extremely similar to a motorcycle. You can't just plow through those like they're a bag.

Uber probably should not be in the self driving business. They just don't have the resources to develop a good system. They're burning cash like crazy and that is not a good situation to be investing in a lot of research and development.

And wasn't she a pedestrian anyhow, ignoring their Spin?

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Tuesday, May 8, 2018 4:22 PM

JEWELSTAITEFAN


Quote:

Originally posted by second:
Quote:

Originally posted by Wishimay:
They ALREADY have better accident per mile records than PEOPLE do, as Google started testing its first self-driving car in 2009.

ONE fatality from then till NINE years later?? It's impressive. Not enough to make me go near it yet, but still impressive.

People driving have 1.02 fatalities per 100 million vehicle miles traveled in 2014. Uber has 1.00 fatalities per thousand vehicle miles. There is much room for improvement at Uber. Maybe next time Uber could apply the brakes? Or steer around the pedestrian? Or beep the horn? Flash the high beams on the headlights? Or do something other than drive straight toward the pedestrian at a constant speed in the same lane like a railroad train on tracks.

https://en.wikipedia.org/wiki/Transportation_safety_in_the_United_States#Road_safety


Maybe the Uber Computer was not paying sufficient attention because it was watching YouTube when it killed the pedestrian. Who knows?



Looks like a pedestrian to me.

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Wednesday, May 30, 2018 12:26 AM

JEWELSTAITEFAN


Heard a news item today.
A statement by some driverless-car maker, maybe Uber.
It said their self-driving cars are not intended to avoid crashes, or to avoid killing humans.
What kind of retards are in charge of these places?

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Wednesday, May 30, 2018 4:23 PM

SECOND

The Joss Whedon script for Serenity, where Wash lives, is Serenity-190pages.pdf at https://www.mediafire.com/two


The most difficult to build is the perception module, says Sebastian Thrun, a Stanford professor who used to lead Google’s autonomous-vehicle effort. The hardest things to identify, he says, are rarely-seen items such as debris on the road, or plastic bags blowing across a highway. In the early days of Google’s AV project, he recalls, “our perception module could not distinguish a plastic bag from a flying child.”

According to the NTSB report, the Uber vehicle struggled to identify Elaine Herzberg as she wheeled her bicycle across a four-lane road. Although it was dark, the car’s radar and LIDAR detected her six seconds before the crash. But the perception system got confused: it classified her as an unknown object, then as a vehicle and finally as a bicycle, whose path it could not predict. Just 1.3 seconds before impact, the self-driving system realised that emergency braking was needed. But the car’s built-in emergency braking system had been disabled, to prevent conflict with the self-driving system; instead a human safety operator in the vehicle is expected to brake when needed. But the safety operator, who had been looking down at the self-driving system’s display screen, failed to brake in time.

The cause of the accident therefore has many elements, but is ultimately a system-design failure. When its perception module gets confused, an AV should slow down. But unexpected braking can cause problems of its own: confused AVs have in the past been rear-ended (by human drivers) after slowing suddenly. Hence the delegation of responsibility for braking to human safety drivers, who are there to catch the system when an accident seems imminent. In theory adding a safety driver to supervise an imperfect system ensures that the system is safe overall. But that only works if they are paying attention to the road at all times.

www.economist.com/the-economist-explains/2018/05/29/why-ubers-self-driving-car-killed-a-pedestrian
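Some back-of-envelope arithmetic on that timeline, assuming an illustrative speed of about 40 mph and hard braking at 7 m/s² (both figures are assumptions for the sketch, not numbers from the report):

```python
# Back-of-envelope distances for the NTSB timeline: detection 6 seconds
# out, braking decision 1.3 seconds out. Speed (~40 mph) and deceleration
# (7 m/s^2, hard braking on dry pavement) are illustrative assumptions.

MPH_TO_MS = 0.44704  # miles per hour -> meters per second

def distance_covered(speed_mph, seconds):
    """How far the car travels in the given time at constant speed."""
    return speed_mph * MPH_TO_MS * seconds

def stopping_distance(speed_mph, decel=7.0):
    """Distance to brake from speed to zero: v^2 / (2a)."""
    v = speed_mph * MPH_TO_MS
    return v * v / (2 * decel)

detect_gap = distance_covered(40, 6.0)   # gap at first detection: ~107 m
brake_gap = distance_covered(40, 1.3)    # gap at braking decision: ~23 m
needed = stopping_distance(40)           # distance needed to stop: ~23 m
```

Under these assumptions, six seconds of warning corresponds to roughly a hundred meters of road, ample room to slow down; a decision made 1.3 seconds out leaves only about the bare stopping distance, so with the emergency braking disabled and the safety driver looking away, the collision was effectively unavoidable at that point.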

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Wednesday, May 30, 2018 4:28 PM

JEWELSTAITEFAN


Quote:

Originally posted by second:
The most difficult to build is the perception module, says Sebastian Thrun, a Stanford professor who used to lead Google’s autonomous-vehicle effort. The hardest things to identify, he says, are rarely-seen items such as debris on the road, or plastic bags blowing across a highway. In the early days of Google’s AV project, he recalls, “our perception module could not distinguish a plastic bag from a flying child.”

According to the NTSB report, the Uber vehicle struggled to identify Elaine Herzberg as she wheeled her bicycle across a four-lane road. Although it was dark, the car’s radar and LIDAR detected her six seconds before the crash. But the perception system got confused: it classified her as an unknown object, then as a vehicle and finally as a bicycle, whose path it could not predict. Just 1.3 seconds before impact, the self-driving system realised that emergency braking was needed. But the car’s built-in emergency braking system had been disabled, to prevent conflict with the self-driving system; instead a human safety operator in the vehicle is expected to brake when needed. But the safety operator, who had been looking down at the self-driving system’s display screen, failed to brake in time.

The cause of the accident therefore has many elements, but is ultimately a system-design failure. When its perception module gets confused, an AV should slow down. But unexpected braking can cause problems of its own: confused AVs have in the past been rear-ended (by human drivers) after slowing suddenly. Hence the delegation of responsibility for braking to human safety drivers, who are there to catch the system when an accident seems imminent. In theory adding a safety driver to supervise an imperfect system ensures that the system is safe overall. But that only works if they are paying attention to the road at all times.

www.economist.com/the-economist-explains/2018/05/29/why-ubers-self-driving-car-killed-a-pedestrian

This was the Safety Driver without a Driver's License?

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Wednesday, May 30, 2018 10:43 PM

JEWELSTAITEFAN


News said a Tesla self-driving car crashed into a PARKED police cruiser, totaling the cop car.

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  
