Automobile Enthusiasts
Tesla drivers say new self-driving update is repeatedly running red lights
https://futurism.com/the-byte/tesla-fsd-update-red-lights
"Thankfully I stopped it before it ran the light," wrote one Reddit user in a thread with several other Tesla drivers experiencing the same thing, "but hopefully Tesla is aware their software is at a very dangerous level right now."
"I've also had that happen once or twice at below 30 mph," another person wrote. "Having trust issues because of it [not gonna lie]."
A YouTuber posted a video clearly showing the red-light problem Redditors described in their thread. "Don't do that! That's a red light," the YouTuber exclaimed as the vehicle tried to drive through a red light at an intersection in downtown Newark, New Jersey.
TL;DR: It's not FSD. And robotaxis won't have a human to hit the brakes.
There's another article that goes into limitations in the all-camera system.
https://democraticunderground.com/12055147
Joinfortmill
(16,361 posts)
usonian
(13,743 posts)
But we do quote reliable sources. A lot of problems are either unique to Teslas, common to EVs in general (like lithium-battery issues) and magnified by their popularity, or due to Tesla's inexperience as an automaker.
Of course, a flaw imperils a lot of others.
RockRaven
(16,251 posts)
and are using it as if what he said (or what you thought he said) were really true.
Where did you go wrong? 'Tis a mystery!!!
regnaD kciN
(26,590 posts)
it stands for Fatal Self-Driving.
usonian
(13,743 posts)
Laziness is the mother of invention.
And acronyms.
brush
(57,459 posts)
Last edited Sun Sep 15, 2024, 08:24 PM - Edit history (1)
How the hell could they trust that idiot, especially after failure after failure after failure on so-called self-driving cars...not to mention that horrendous stainless-steel Cybertruck that rusts.
NBachers
(18,124 posts)
You'll notice stop signs, stop lights, bicycles, difficult intersections, delivery vans, narrow streets and hills and bends with two-way traffic. I felt completely confident and at-ease.
usonian
(13,743 posts)
Having driven through fog more times than I'd like, I sure would like to have radar on board as well.
I was planning to install a noisemaker to scare away deer, but I figured that it might just spook them to jump INTO the road instead of OFF.
I read somewhere that they make seemingly random moves in response to threats, to fake out predators. Not good for autos.
stopdiggin
(12,801 posts)
Seems awfully one-sided - given the horrendous job of driving that many of our human operators demonstrate every day, are we seriously claiming that self-driving is significantly worse? I find that somewhat difficult to believe - but if so, let's trot out the figures to make exactly that argument. Meantime, I have a strong hunch that a lot of this opposition is driven at gut level.
So - appreciate that someone would try to add a little balance to the discussion. I think it's useful.
added: In relation to the OP - I'm sure that most people are already aware that the Waymo and Tesla situations are actually about different systems (and even different parameters?) - hardware, software, programming, the whole ball of wax.
NBachers
(18,124 posts)
prefer to always remain in control. I was driving in night-time Pacific shore fog recently and paying attention to how it was doing. It seemed to be OK, but my hands never left the wheel.
Kablooie
(18,764 posts)
I've been using Tesla FSD on every drive for the past 3 years. I also had it try to run a red light in the past, but every few weeks, a new update improves its performance so that now I just tell it where to go and sit back while it takes me there. My very few interventions are almost always for routing errors, not safety corrections. The software is not complete yet; it still needs work on navigating parking lots, dead-end streets, pulling aside for emergency vehicles, etc., but these will be incorporated in future updates.
Don't get me wrong. I'm disgusted with Musk's political ambitions, and his Robotaxi event was empty fluff, but his self-driving car is very close to being a reality.
usonian
(13,743 posts)
But most LLMs are "language" models, after all, designed to predict language, nothing more.
https://www.democraticunderground.com/100219719722
Despite the model's uncanny ability to navigate effectively, when the researchers closed some streets and added detours, its performance plummeted.
When they dug deeper, the researchers found that the New York maps the model implicitly generated had many nonexistent streets curving between the grid and connecting far away intersections.
This could have serious implications for generative AI models deployed in the real world, since a model that seems to be performing well in one context might break down if the task or environment slightly changes.
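Purely as an illustration of the failure mode the researchers describe (this sketch is not from the article, and the street names and graphs are made up), here is how a navigator working from an internally generated map with "nonexistent streets connecting faraway intersections" can return routes that look fine but are undrivable in the real street grid:

```python
from collections import deque

def shortest_path(graph, start, goal):
    """BFS shortest path over an adjacency-list graph; None if unreachable."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

def valid_in_reality(path, real_graph):
    """A route is drivable only if every hop is a real street segment."""
    return all(b in real_graph[a] for a, b in zip(path, path[1:]))

# The real street grid: two rows of three intersections, A-B-C over D-E-F.
real = {
    "A": ["B", "D"], "B": ["A", "C", "E"], "C": ["B", "F"],
    "D": ["A", "E"], "E": ["B", "D", "F"], "F": ["C", "E"],
}

# The model's implicit map: the same grid plus a phantom shortcut A-F,
# standing in for a "nonexistent street connecting faraway intersections".
model = {k: list(v) for k, v in real.items()}
model["A"].append("F")
model["F"].append("A")

print(shortest_path(real, "A", "F"))   # a genuine 3-block route
route = shortest_path(model, "A", "F")
print(route, valid_in_reality(route, real))  # phantom shortcut, undrivable
```

The model "performs well" in the benchmark sense (it always returns a route), yet the moment its output is checked against the real grid, or a real street is closed and it must reroute through streets it has mapped wrongly, the hidden incoherence of its internal map surfaces.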
Stay alert!
I personally lag software updates by at least two major releases.
I won't be an alpha or beta tester.
More at my link.
Kablooie
(18,764 posts)
I've wondered if the routing issues come from faulty maps.
In the latest iteration, though, I found it prioritizes street rules over its calculated route or map data.
Just yesterday, it was supposed to turn right into a parking lot but instead got into the left-turn lane by mistake. It made the left turn safely and then rerouted to get itself into the correct parking lot. It used to make this kind of mistake fairly often, but it's become pretty rare recently.