OK, it's just a deer, but the future is clear. These things are going to start killing people left and right.
How many kids is Elon going to kill before we shut him down? What's the number of children we're going to allow Elon to murder every year?
The poster, who pays Tesla CEO Elon Musk for a subscription to the increasingly far-right social media site, claimed that the FSD software “works awesome” and that a deer in the road is an “edge case.” One might argue that edge cases are actually very important parts of any claimed autonomy suite, given how drivers check out when they feel the car is doing the work, but this owner remains “insanely grateful” to Tesla regardless.
How are these people always such pathetic suckers?
I grew up in Maine. Deer in the road isn’t an edge case there. It’s more like a nightly occurrence.
Same in Kansas. Was in a car that hit one in the 80s, and I see them often enough that I had to avoid one that was crossing a busy interstate highway last week.
Deer are the opposite of an edge case in the majority of the US.
Putting these valid points aside, we're also all just taking for granted that the software would have properly identified a human under the same circumstances… This could very easily have been a much more chilling outcome.
I'm not taking that for granted. If it can't tell a solid object is in the road, I would guess that would be true for a human that is balled up or facing away as well.
Same, hit one just south of Lyndon at night.
It’s no different in Southern Ontario where I live. Saw a semi truck plow into one, it really wasn’t pretty. Another left a huge dent on my mom’s car when she hit one driving at night.
I drove through rural Arkansas at sundown once. I’ve never seen so many deer in my life.
Same in northern Michigan in mid summer. And most of New England as well.
I grew up in upstate NY so I’m no stranger to deer. This was something else. We were driving through the Winding Staircase mountain and there were hundreds of them. My wife kept screaming and grabbing my arm while I was driving until I had to stop in the middle of the (empty except for us and the deer) road to calmly explain that she was making the situation significantly worse.
Fences alongside the road and special animal crossings are infeasible given the total length of US roads, yes?
I’ve read that they do that … somewhere.
Being a run-of-the-mill fascist (rather than one of those in power) is actually an incredibly submissive position: they just want strong daddies to take care of them and make the bad people go away. It takes courage to be a “snowflake liberal” by comparison.
Not really, a good fascist should be always ready to fight for their place in the sun, on all levels, their collective included. There’s no rightful domination there, or right per se, but there is fighting and the resulting domination of the strongest. So if you disobey and lose, you have contributed to fascism to the best of your ability. If you disobey and win, you are the most virtuous fascist. Apathy is the worst crime there. It’s the “jungle” ideology in some sense.
It would be fine if not for the fact that it doesn’t contribute anything to the human, just describes the basic level and how to succeed there, but there are better levels.
Still I think it’s important to deeply understand fascism and how it’s not all evil, because we must understand why and when it’s in demand. It’s an ideology of chaotic life and violent evolution, and the demand for it arises when more gracious alternatives erode, and nothing around is certain other than one’s will to fight.
Umberto Eco’s “Foucault’s Pendulum” is a wonderful book deeply exploring fascist aesthetic, by the way.
The issue with fascist followers (an important word) is that it doesn’t take anything to pretend to be a fascist, while being a submissive slave in fact.
I actually find it funny how if you remove NAP from anarcho-capitalism, it can become both classical fascism and classical anarchism, with the difference being in what people of these ideologies want from the future, not the rules these ideologies impose.
Edge cases (NOT features) are the thing that keeps them from reaching higher levels of autonomy. These level differences are like “most circumstances”, “nearly all circumstances”, “really all circumstances”.
Since Tesla cares so much more about features, they will remain on level 2 for another very long time.
I'd go even farther and say most driving is an edge case. I used the 30-day trial of Full Self-Driving and the results were eye-opening. Not how it did, which was pretty much as expected, but where it went wrong.
Full self driving did very well in “normal” cases, but I never realized just how much of driving was an “edge” case. Lane markers faded? No road edge but the ditch? Construction? Pothole? Debris? Other car does something they shouldn’t have? Traffic lights not aligned in front of you so it’s not clear what lane? Intersection not aligned so you can’t just go straight across? People intruding? Contradictory signs? Signs covered by tree branches? No sight line when turning?
After that experiment, it seems like “edge” cases are more common than “normal” cases when driving. Humans just handle them without thinking about it, but the car needs more work here.
Deer on the road is an edge case that humans cannot handle well. In general every option other than hitting the deer is overall worse - which is why most insurance companies won’t increase your rates if you hit a deer and file a claim for repairs.
The only way to not hit/kill hundreds of deer (thousands? I don't know the number) every year is to reduce rural speed limits to unreasonably slow speeds. Deer jump out of dark places right in front of cars all the time. The only options that might avoid a hit are to swerve into the other lane (which sometimes means into an oncoming car) or into the ditch (you have no clue what might be there; if you are lucky the car just rolls, but there could be large rocks or strong fence posts and the car stops instantly). Note that this all happens fast; you can't think, you only get to react. Drivers in rural areas are taught to hit the brakes and maintain their lane.
Drivers in rural areas are taught to hit the brakes and maintain their lane.
Which the Tesla didn’t do. It plowed full speed into the deer, which arguably made the collision much much worse than it could have been. I doubt the thing was programmed to maintain speed into a deer. The more likely alternative is that the FSD couldn’t tell there was a deer there in the first place.
Braking dips the hood, making it easier for the deer to go into the windshield. You should actually speed up right before hitting to make your hood go up, so the deer hopefully goes under or, better, stays in the grille.
Doesn't this all depend on the height of your car and the condition of your shocks? Doesn't seem like a hard and fast rule. Also, you're assuming rear-wheel drive. FWD does not “raise the hood” like you're playing Cruis'n USA.
Please show me that guideline, anywhere.
/Swede living in the deer countryside
Wear gloves when they hand you that guideline because they might be pulling it out of their ass.
Maybe, but it’s still the case that slowing down will impart less energy to the collision. Let up on the brake before impact if you want, but you should have been braking once you first saw the deer in the road.
Sometimes those fuckers just jump out at you at the last minute. They’re not smart. But if you click the link, this one was right in the middle of the road, with that “Deer in the headlights” look. There was plenty of time to slow down before impact.
Conditions matter, and your reaction should always be for the worst possible scenario (moose and snow). Braking removes your ability to maneuver as well, and locking the brakes up, which will almost always happen when you panic brake, would be the worst scenario. If there's snow or rain, braking again is right out.
If it jumps out and you can't do anything but brake, you shouldn't do that: grip the wheel and maintain speed, and if you can, punch the gas to raise the hood. But people panic and can't think. So maintain speed, don't panic, and don't lock your brakes up.
You should know how to brake without causing maneuvering problems (including not locking up the wheels); it's a basic skill needed for many situations. Just keep slowing down. The “accelerate just before impact” move is something that can only be done in movies; any real-world attempt will be worse. Remember, if you keep braking you lose momentum, so the acceleration needs to be perfectly timed or it makes things worse.
You know cars have had ABS for a long time, right?
Speeding up instead of braking is fucking stupid. You're just increasing the impact force (F = Δ(mv)/Δt, so higher speed means more momentum shed in the same brief impact), and increasing the likelihood of the deer going through your windscreen and killing people.
In this case, the deer just stood there in the road.
Any driver and any AI should be able to stop before the obstacle in that case.
Cause it could be a human, or a fallen tree instead of a deer.
This sounds made up
If you think physics is made up, sure…
I don’t think hitting more gas is going to gently slide the 300 pound buck under my car. It’s just going to increase the impact force.
Sliding the deer under your car is also really bad for you. It's going to do a lot of damage under there, such as ripping brake lines, destroying ball joints, or fragging your differentials. You need to safely shed as much speed as possible while maintaining your lane when about to hit a deer.
Considering suspension, if you accelerate there’s a lowering of the back of the car/raising of the front.
Conversely, braking has the opposite effect, increasing the chances of the deer rolling over your hood and through your windshield.
You’ll want to minimize that, hence the acceleration.
The physics is F = Δ(mv)/Δt
I.e. the greater the velocity, the greater the average force of impact for the same collision duration.
A moving vehicle in real life is a bit more complicated of an equation; factor in the car's angle toward the horizontal as you accelerate or brake, which was the original point, but whatever.
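For what it's worth, the impulse-momentum claim can be put into rough numbers. A minimal Python sketch; the 70 kg deer mass and 0.1 s impact duration are invented for illustration, not measured values:

```python
# Rough numbers for the impulse-momentum argument: the average force
# on impact is the momentum change divided by the impact duration,
# and kinetic energy grows with the square of speed.
# The 70 kg deer mass and 0.1 s impact duration are assumptions.

def impact_numbers(mass_kg: float, speed_kmh: float, impact_s: float = 0.1):
    """Average force (N) and kinetic energy (J) when `mass_kg` is
    accelerated from rest to `speed_kmh` over `impact_s` seconds."""
    v = speed_kmh / 3.6                 # km/h -> m/s
    force = mass_kg * v / impact_s      # F = delta(mv) / delta(t)
    energy = 0.5 * mass_kg * v ** 2     # KE = 1/2 m v^2
    return force, energy

for kmh in (60, 80, 100):
    f, e = impact_numbers(70, kmh)
    print(f"{kmh} km/h: avg force ~{f/1000:.0f} kN, energy ~{e/1000:.0f} kJ")
```

Force scales linearly with speed and energy quadratically, so every bit of speed shed before impact helps; that's the argument for braking as long as you safely can.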
“Right before hitting” being the keyword. If you can stop before hitting, yes, that's ideal, but there are situations where it jumps out and you can't react. Braking during impact is the worst thing you can do.
If you think I'm saying to line it up and accelerate for 200 meters, I don't know what to say about that.
Braking during impact is the worst thing you can do.
This is not correct, where are you getting this from?
Dude, the article just said to hit the brakes “if you can’t avoid hitting a deer”, the exact scenario you described… Did you even open it?
I can see it in the headline, without opening the article
Life is not an arcade game.
aight what’s your strategy for hitting a giraffe, then?
I don’t know, where I live giraffes are only in the zoo and thus never on the road. I’m not aware of any escaping the zoo.
I'm sure if I lived around wild deer, my training would have included that, but since I don't, I was able to save some time by not learning it.
What if you’re driving through a zoo though?
I've never been in a zoo I'm allowed to drive more than a wheelchair through. They may require extra training; I would not know.
Same for a moose? Speed up so you clear it before gravity caves your car roof.
You maintain speed, you can’t maneuver well if braking, and as stated your hood dips while braking too which can cause worse issues.
Troll comment.
You do that - you die.
No, for moose you are actually supposed to swerve and risk the ditch.
That's a good strategy to ensure you die: a moose's torso is already higher than the hood of a lot of SUVs, so you're taking a moose to the face.
The whole premise of ABS brakes, which all cars made in North America since 2012 will have, is specifically to allow you to maintain control when you fully apply the brakes. Unless you are a professional driver or have a car without ABS, you should just fully apply the brakes in an emergency stop. Please stop telling people that fully applying the brakes will reduce maneuverability when it won't for the majority of drivers in the developed world.
And if someone’s vehicle doesn’t have ABS, they should know how to properly brake without locking their tires, and when it won’t be appropriate to use them.
The problem is not that the deer was hit, a human driver may have done so as well. The actual issue is that the car didn’t do anything to avoid hitting it. It didn’t even register that the deer was there and, what’s even worse, that there was an accident. It just continued on as if nothing happened.
Yeah, the automated system should be better than a human. That is the whole point of collision detection systems!
Right. I was trying to decide whether to mention that deer can be hard to spot in time. Even in the middle of the road like this, they're non-reflective and there may be no movement to catch the eye. It's very possible for a human to be zoning out and not notice this deer in time.
But yeah, this is where we need the car to help. This is what the car should be better than a human at. This is what would make AI a good tool to improve safety… if it saw the deer.
Deer jump out of dark places
that one was just standing there, yo
If Tesla also used radar or other sensing systems instead of limiting themselves to only cameras, then being in the dark wouldn't be an issue.
Deer on the road is an edge case that humans cannot handle well.
If I'm driving at dawn or dusk, when they're moving around in low light, I'm extra careful. I'm scanning the treeline, the sides of the road, the median, etc., because I know there's a decent chance I'll see them and I can slow down in case they make a run across the road. So far I've seen several hundred deer and I haven't hit any of them.
Tesla makes absolutely no provision in this regard.
This whole FSD thing is a massive failure of oversight. No car should be doing self-driving without using cameras and radar, and Tesla should be forced to refund the ~~suckers~~ customers who paid for this feature.
Sure, I do that too. I also have had damage because a deer I didn't see jumped out of the trees onto the road. (Though as others pointed out, in this case the deer was on the road with plenty of time to stop, or at least greatly slow down, but the Tesla did nothing.)
Damn right. Stomp the brakes and take it to the face.
In general every option other than hitting the deer is overall worse
You’re wrong. The clear solution here is to open suicide-prevention clinics for the depressed deer.
Yeah this Tesla owner is dumb. wdym “we just need to train the AI to know what deer butts look like”? Tesla had radar and sonar, it didn’t need to know what a deer’s butt looks like because radar would’ve told it something was there! But they took it away because Musk had the genius idea of only using cameras for whatever reason.
Sunk cost? Tech worship?
I’m so jaded, I question my wife when she says the sun will rise tomorrow so I really don’t get it either.
Only keeping the regular cameras was a genius move to hold back their full autonomy plans
The day he said that “ReGULAr CAmErAs aRe ALl YoU NeEd” was the day I lost all trust in their implementation. And I’m someone who’s completely ready to turn over all my driving to an autopilot lol
You can't understand his ironman levels of genius because of your below-billionaire mind
I believe we can make a self-driving car with only optical sensors that performs as well as a human someday. I don't think today is that day, and in any case we should aim for self-driving to be far better than human drivers.
Hardware 4 models have a radar on the front as well.
Deer aren’t edge cases. If you are in a rural community or the suburbs, deer are a daily way of life.
As more and more of their forests are destroyed, deer are a daily part of city life. I live in the middle of a large midwestern city, in a neighborhood with houses crowded together. I see deer in my lawn regularly.
The deer are actually the ones doing much of the deforestation.
But I agree with your point that the overpopulation is impossible to miss. I’m also in the suburbs of a major Midwestern city and the deer are everywhere. My city tags them so, oddly, you kind of get to know them.
Last year #100 and #161 both had fawns in my back yard (for a total of 3 babies). This year, #161 dropped 2 more back there. I still see #100 around, but I don’t think she had offspring this year. She might have been sterilized, but I heard that the city stopped doing that because some of our tagged deer were tracked to 2 states away. Now we just cull them.
Two days ago I saw a buck (rare for the 'burbs) chasing a few of this year’s fawns around. I thought “you dummy, those girls are too young to breed,” but then I looked it up, and apparently sexual maturity in deer is determined by weight, not age. Does can participate in their first-year rut if they’ve had enough to eat. And those little shits have had plenty of flowers out of my garden.
The deer are building subdivisions and stroads?
I see bucks all the time in my neighborhood! I was on a walk earlier this summer and turned a corner to be face to face with a small herd that was hopping fences to graze. The buck was across the street and just stared at me.
At first I was afraid because they can get big, but now I’ve seen them a few times and I’m thinking they are used to people. I’m still not getting close if I can help it. They are much bigger than you would expect.
I like seeing them but I feel bad that they are stuck in the city.
People are acting like drivers don't hit deer at full speed while they're in control of the car. Unless we get numbers comparing self-driving vs human-driven cars, this is just a non-story whose only goal is discrediting Musk, when there's so much other shit that can be used to discredit him.
To quote OP “How many kids will we let Elon kill before we shut him down?”, by this logic, how many kids will we let drivers kill before we take all cars off the road then?
People are acting like drivers don't hit deer at full speed while they're in control of the car.
I should be very surprised if people don’t generally try to brake or avoid hitting an animal (with some exceptions), if only so that they don’t break the car. Whether they succeed at that is another question entirely.
People drive drunk, people drive while checking their phone, people panic and freeze, deer often just jump in front of you from out of nowhere.
People hit fucking humans without braking because they’re not paying attention to what the fuck they’re doing!
But for some reason if it’s a car with assistance well now that’s scandalous! No idea if they’re safer in general and cause less accidents, one is too many! Unless it’s a human behind the wheel then who gives a fuck how many accidents they cause?
… and that’s the kind of driving Tesla is trying to emulate? awesome.
No, I’m saying that one video of a Tesla hitting a deer doesn’t prove that they’re less safe or just as likely as human to hit things when using assisted driving.
Show actual stats of accidents per miles driven compared to cars without assisted driving and then we’ll be able to talk.
If we had videos of every Toyota or Hyundai or Ford that hit deer while being driven by a human, this video of a Tesla doing it would just be a drop in a pool of water, but because it happened with an assistant behind the wheel, people are acting like it means assisted driving doesn't make cars safer.
TL;DR: It’s an anecdote, without actual stats it’s just noise to influence people’s opinion
Do you own Tesla stock?
Nope and I’ll be the first to say that Musk is a fucking moron, but there’s tons of shit to attack him on, pretending that Tesla cars are more deadly than human driven cars with anecdotal evidence is just stupid.
Problem is, the data is rigged. It's road miles that Autopilot deigned to activate for, in cars less than 10 years old that rarely need their friction brakes, versus the total population of cars with more age and more brake wear. And when Autopilot says “nope, too dangerous for me,” the human still drives.
The other problem is people are thinking they can ignore their car's operation, because of all the rhetoric. A human might have still hit the deer, but he would have at least applied brakes.
Finally, we shouldn’t settle for ‘no worse than human’ when we have more advanced sensors available, and we should call out Tesla for explicitly declaring ‘vision only’ when we already know other sensors can see things cameras cannot.
A human might have still hit the deer, but he would have at least applied brakes.
You’re making quite the assumption there.
I’m not saying we need to settle, I’m saying it’s useless to share that example if we don’t have actual numbers to compare the stats between human driven miles and miles in cars with assistance available and insurance companies would have that.
Bravo, you’re the first person to bring actual fucking statistics to the discussion! Per driven miles would be better than per driver but hey, at least it’s not just a clickbait article.
People drive drunk, people drive while checking their phone,
And those people are breaking the law.
people panic and freeze
I don’t think I’ve ever seen someone panic so much they just act as if they didn’t even hit a deer.
deer often just jump in front of you from out of nowhere.
In this case, the deer was just sitting there, so not applicable.
People hit fucking humans without braking because they’re not paying attention to what the fuck they’re doing!
If it was this much negligence, they’d be facing vehicular manslaughter charges.
But for some reason if it’s a car with assistance well now that’s scandalous!
It’s scandalous when a human does it too. We should do better than human anyway, and we can identify a number of deliberate decisions that exacerbate this problem that could be addressed, e.g. mitigation through LIDAR, which Tesla has famously rejected.
oh my fucking god
Driving is full of edge cases. Humans are also bad drivers who get edge cases wrong all the time.
The real question isn't whether Tesla is better or worse in any particular situation, but how Tesla compares overall. If a Tesla is better in some situations and worse in others, and so overall just as bad as a human, I can accept it. If Tesla is overall worse, then they shouldn't be driving at all (if they can identify those situations, they can stop and make a human take over). If a Tesla is overall better, then I'll accept a few edge cases where they are worse.
Tesla claims overall they are better, but they may not be telling the truth. One would think regulators have data for the above - but they are not talking about it.
Tesla claims overall they are better, but they may not be telling the truth. One would think regulators have data for the above - but they are not talking about it.
The agency is asking if other similar FSD crashes have occurred in reduced roadway visibility conditions, and if Tesla has updated or modified the FSD system in a way that may affect it in such conditions.
It sure seems like they aren’t being very forthcoming with their data between this and being threatened with fines last year for not providing the data. That makes me suspect they still aren’t telling the truth.
It sure seems like they aren’t being very forthcoming with their data between this and being threatened with fines last year for not providing the data. That makes me suspect they still aren’t telling the truth.
I think their silence is very telling, just like their alleged crash test data on Cybertrucks. If your vehicles are that safe, why wouldn’t you be shoving that into every single selling point you have? Why wouldn’t that fact be plastered across every Gigafactory and blaring from every Tesla that drives past on the road? If Tesla’s FSD is that good, and Cybertrucks are that safe, why are they hiding those facts?
If the cybertruck is so safe in crashes they would be begging third parties to test it so they could smugly lord their 3rd party verified crash test data over everyone else.
But they don't, because they know it would be a repeat of smashing the bulletproof window on stage.
deleted by creator
Being safer than humans is a decent starting point, but safety should be maximized to the best of a machine’s capability, even if it means adding a sensor or two. Keeping screws loose on a Boeing airplane still makes the plane safer than driving, so Boeing should not be made to take responsibility.
deleted by creator
Humans are also bad drivers who get edge cases wrong all the time.
It would be so awesome if humans only got the edge cases wrong.
I’ve been able to get demos of autopilot in one of my friend’s cars, and I’ll always remember autopilot correctly stopping at a red light, followed by someone in the next lane over blowing right through it several seconds later at full speed.
Unfortunately “better than the worst human driver” is a bar we passed a long time ago. From recent demos I’d say we’re getting close to the “average driver”, at least for clear visibility conditions, but I don’t think even that’s enough to have actually driverless cars driving around.
There were over 9M car crashes with almost 40k deaths in the US in 2020, and it would be insane to just decide that's acceptable for self-driving cars as well. No company is going to want that blood on their hands.
If a Tesla is better in some situations and worse in others and so overall just as bad as a human I can accept it.
This idea has a serious problem: THE BUG.
We hear this idea very often, but you are disregarding the problem of a programmed solution: it makes its mistakes all the time. Infinitely.
Humans are also bad drivers who get edge cases wrong all the time.
So this is not exactly true.
Humans can learn, and humans can tell when they made an error, and try to do it differently next time. And all humans are different. They make different mistakes. This tiny fact is very important. It secures our survival.
The car does not know when it made a mistake, for example when it killed a deer, or a person, and smashed its windshield and bent lots of its metal. It does not learn from it.
It would do it again and again.
And all the others would do exactly the same, because they run the same software with the same bug.
Now imagine 250 million people having 250 million Teslas, and then comes the day when each one of them decides to kill a person…
Tesla can detect a crash and send the last minute of data back so all cars learn from it. I don't know if they do but they can.
I don’t know if they do but they can.
"Today on Oct 30 I ran into a deer but I was too dumb to see it, not even see any obstacle at all. I just did nothing. My driver had to do it all.
Grrrrrr.
Everybody please learn from that, wise up and get yourself some LIDAR!"
Yeah there are edge cases in all directions.
When people want to say that something is very rare they should say “corner case,” but this doesn't seem to have made it out of QA lingo and into the popular lexicon.
Given that they market it as “supervised”, the question only has to be “are humans safer when using this tool than when not using it?”
One of the cool things I’ve noticed since recent updates, is the car giving a nudge to help me keep centered, even when I’m not using autopilot
All cars are death machines
In the day, we sweat it out on the streets
Of a runaway American dream
At night we ride through the mansions of glory
In the suicide machines
“Born to Run” - Bruce Springsteen
I notice nobody has commented on the fact that the driver should’ve reacted to the deer. It’s not Tesla’s responsibility to emergency brake, even if that is a feature in the system. Drivers are responsible for their vehicle’s movements at the end of the day.
Then it’s not “Full self driving”. It’s at best lane assistance, but I wouldn’t trust that either.
Elon needs to shut the fuck up about self driving and maybe issue a full recall, because he’s going to get people killed.
deleted by creator
It’s self-driving but you need to supervise it because you are both responsible and because it’s not perfect.
but you need to supervise it because you are both responsible and because it’s not perfect
Not self-driving then. Words have meanings.
Right. Wikipedia defines it as such
A self-driving car, also known as an autonomous car (AC), driverless car, robotaxi, robotic car or robo-car,[1][2][3] is a car that is capable of operating with reduced or no human input.
But also
Organizations such as SAE have proposed terminology standards. However, most terms have no standard definition and are employed variously by vendors and others. Proposals to adopt aviation automation terminology for cars have not prevailed.
So there's no one definition. It is driving by itself; you don't have to do any driving, but you should keep alert so that if something happens you can take over. Seems like it fits with the general use imo but doesn't fulfill the more stringent definitions.
The definition is that Tesla is shit.
They’re selling a spotty lane assist as Self Driving when it is not.
Other companies are selling actual self-driving cars, (even if those companies are fucking up as well) but Tesla is nowhere near that level of autonomy. All because Musk cheaped out on the sensor package.
Teslas will never be self-driving, because they literally cannot detect the road and obstacles with just their little camera setup.
They should not be allowed to call it self-driving, or autopilot, or anything else that implies that you can take your hands off the steering wheel.
True but if Tesla keeps acting like they’re on the verge of an unsupervised, steering wheel-free system…this is more evidence that they’re not. I doubt we’ll see a cybercab with no controls for the next 10 years if the current tech is still ignoring large, highly predictable objects in the road.
That would be lovely if it wasn’t called and marketed as Full Self-Driving.
You sell vaporware/incomplete functionality software and release it into the wild, then you are responsible for all the chaos it brings.
I watched the whole video… Mowed down like 90 deer in a row.
Dude. Just angle the truck upwards! It’ll go right over that!
Why does this read like an ad for cybertrucks for people who would want to run over deer
What’s Kennedy driving these days?
Full speed in the dark, I think most people would have failed to avoid that. What's concerning is that it did not stop afterwards.
Note that part of the discussion is we shouldn’t settle for human limitations when we don’t have to. Notably things like LIDAR are considered to give these systems superhuman vision. However, Tesla said ‘eyes are good enough for folks, so just cameras’.
The rest of the industry said LIDAR is important and focused on trying to make it more practical.
The rest of the industry said LIDAR is important and focused on trying to make it more practical.
Volvo is using LIDAR. I trust them way more than Tesla when it comes to something pertaining to safety.
Hell, even not having lidar, the thing was pretty clearly a large road obstacle a second and a half out. They had a whole left lane open and at least enough time to do a significant speed reduction.
Isn't Elon advertising AI as having orders of magnitude better reaction time and being much less error-prone than a human, though…
Remember when they removed ultrasonic and radar sensors in favor of “Tesla Vision”? That decision demonstrably cost people their lives and yet older, proven tech continues to be eschewed in favor of the cutting edge new shiny.
I’m all for pushing the envelope when it comes to advancements in technology and AI in its many forms, but those of us that don’t buy Teslas never signed up to volunteer our lives as training data for FSD.
I think the LIDAR and other sensors are supposed to be IR and see in the dark.
Sensors that the Tesla famously doesn’t have (afaik, didn’t check) because Elon is a dumbass.
Too bad Teslas don’t have that. Just cameras and machine learning.
The cameras alone should be able to see IR. There are filters over most digital camera sensors to block it, but no reason to do that here.
Tesla is just advertising technology that isn’t ready, and people are dying as a result.
And it’s also always good to have multiple layers of defence. It’s straight up stupid to remove the redundancy in safety measures because you trust your tech.
Not only redundancy, but different types of sensors actually serve different purposes because they excel at different tasks.
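To put that in concrete terms, here’s a toy sketch (entirely hypothetical, not any vendor’s actual logic) of why independent sensor modalities beat one sensor: a conservative fusion rule brakes if any modality flags an obstacle, so all sensors have to fail at once before the car misses something.

```python
# Toy sensor-fusion sketch (hypothetical, not any real vendor's logic):
# each modality independently reports whether it detects an obstacle,
# and a conservative voter brakes on ANY positive detection. With
# independent failure modes, the chance that every sensor misses at
# the same time is far smaller than any single sensor's miss rate.

def should_brake(camera_sees: bool, radar_sees: bool, lidar_sees: bool) -> bool:
    """Conservative OR-fusion: brake on any positive detection."""
    return camera_sees or radar_sees or lidar_sees

# A deer at night: the camera misses it in the dark, but radar and
# lidar (which emit their own signal) still get a return.
print(should_brake(camera_sees=False, radar_sees=True, lidar_sees=True))  # True
```

Drop the radar and lidar inputs and the whole decision rides on the one sensor that is worst in the dark.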
For any camera to see IR, there must be IR light there to be seen. LIDAR and proximity sensors emit their own light, but TIL Tesla doesn’t have any… Great tech… my 300€ vacuum bot has LIDAR… Of course it doesn’t go 130 km/h in the dark, but I was 99.99% sure any self-driving car had the bare minimum of sensors. I guess Tesla isn’t one of them.
Yeah, lidar and radar don’t need an external light source to work
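That’s the core of how lidar ranging works: the sensor fires its own pulse and computes distance from the round-trip time, so ambient light is irrelevant. A minimal sketch, with illustrative numbers:

```python
# Minimal lidar time-of-flight sketch (simplified, illustrative):
# distance = speed_of_light * round_trip_time / 2. Because the sensor
# supplies its own illumination, it ranges targets in total darkness.

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_range_m(round_trip_s: float) -> float:
    """Distance to a target from a laser pulse's round-trip time."""
    return C * round_trip_s / 2.0

# A pulse returning after ~400 nanoseconds means a target roughly 60 m out.
print(round(tof_range_m(400e-9), 1))  # 60.0
```

At highway speed that ~60 m is only about two seconds of travel time, which is why the “a second and a half out” obstacle above is exactly the case active sensors are meant to catch.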
Reading this, I’m scared by how dulled I’ve become to the danger posed by my 45-minute daily commute home from work: 65 kilometers driving into the black at 100 km/h.
Didn’t stop afterwards, didn’t even attempt to brake
Why is anyone going full speed in the dark? Let alone an unsafe self driving car.
- Vehicle needed lidar
- Vehicle should have a collision detection indicator for anomalous collisions and random mechanical problems
The autopilot knows deers can’t sue
What if it kills the deer out of season?
Right, most animals can only zoo!
I guess that’s the big game …
For the 1000th time Tesla: don’t call it “autopilot” when it’s nothing more than a cruise control that needs constant attention.
It is autopilot (a poor one but still one) that legally calls itself cruise control so Tesla wouldn’t have to take responsibility when it inevitably breaks the law.
Real Autopilot also needs constant attention, the term comes from aviation and it’s not fully autonomous. It maintains heading, altitude, and can do minor course correction.
It’s the “full self driving” wording they use that needs shit on.
Real Autopilot also needs constant attention
Newer “real” autopilot systems absolutely do not need constant attention. Many of them can do full landing sequences now. The definition should match what people commonly use the word for, not what it “originally” meant. Most people believe autopilot means the plane pilots itself automatically; most normal people have zero intuition about what a pilot actually does in the cockpit. And technology bears out that thought process, as autopilot in its modern form can actually do 99% of flying, where take-off and landing aren’t exempted anymore.
Looked it up some: in ideal conditions, and with supervision. The pilot can’t just take a nap and forget about it. Which, to Tesla’s credit, when you activate the feature for the first time it does make you read a large unskippable warning that you need to be paying attention at all times. I still don’t mind the name Autopilot; I just hate that they market it as fully autonomous self-driving, because that’s the part that implies you don’t need to be watching over it (to me).
Is there video that actually shows it “keeps going”? The way that video loops I know I can’t tell what happens immediately after.
The driver’s tweet says it kept going, but I didn’t find the full video.
Inb4 it actually stopped with hazards like I’ve seen in other videos. Fuck elon and fuck teslas marketing of self driving but I’ve seen people reach far for karma hate posts on tesla sooooooo ¯\_(ツ)_/¯
The poster, who pays Tesla CEO Elon Musk for a subscription to the increasingly far-right social media site, claimed that the FSD software “works awesome” and that a deer in the road is an “edge case.” One might argue that edge cases are actually very important parts of any claimed autonomy suite, given how drivers check out when they feel the car is doing the work, but this owner remains “insanely grateful” to Tesla regardless.
Yeah nah. This person is the absolute opposite of a Tesla or Musk hater. They’ve had this experience and are expressing fucking gratitude to Tesla. Some people really are crazy.
The way that video loops I know I can’t tell what happens immediately after.
SRSLY?
Have you ever been in a car, going fast?
You can see in the video that the car does NOT brake hard before the crash. Not even in the very last second.
What did YOU think happens in the next second?
What I think doesn’t matter. I’d like to actually see the whole video, though. Then neither I nor you would need to hypothesize about it.
Tesla’s approach to automotive autonomy is a unique one: Rather than using pesky sensors, which cost money, the company has instead decided to rely only on the output from the car’s cameras. Its computers analyze every pixel, crunch through tons of data, and then apparently decide to just plow into deer and keep on trucking.
I mean, to be honest…if you are about to hit a deer on the road anyway, speed up. Higher chance the scrawny fucker will get yeeted over you after meeting your car, rather than get juuuuust perfectly booped into air to crash through windshield and into your face.
Official advice I’ve heard many times. Prolly doesn’t apply if you’re going slow.
Edit: Read further down. This advice is effing outdated, disregard. -_- God I am happy I’ve never had to put it i to test.
Haven’t read down yet, but I bet odds are a bit better if you let go of the brake just before impact, to raise the front up a bit.
Friendly reminder that Tesla Autopilot is an AI training on live data. If it hasn’t seen something enough times, it won’t know to stop. This is how you get a Tesla running full speed into an overturned semi, and many, many other accidents.
I wonder how well it recognizes non-white people. We’ve seen these models not having enough people of color in their samples before.
Color doesn’t matter to Lidar… Oh wait… Elon nixed that.