Elon Musk’s FSD v12 demo includes a near miss at a red light and doxxing Mark Zuckerberg

Elon Musk posted a 45-minute live demonstration of v12 of Tesla’s Full Self-Driving feature. During the video, Musk has to take control of the vehicle after it nearly runs a red light. He also doxxes Mark Zuckerberg.

  • Thorny_Thicket@sopuli.xyz · 1 year ago

    AI DRIVR posted an interesting analysis of v12 on YouTube. Apparently it’s completely different from the previous versions: instead of being taught traffic rules, it learns from videos of people driving, which means it does things like not fully stopping at stop signs and driving over the speed limit - like people do too.

    It’s interesting because by strictly following traffic rules you might in fact be a danger to others, but by driving like humans you’re also breaking the law. It’s a good example of a situation where the “right” thing to do might not be the most intuitive one, though in this case it’s still up for debate.

    • meseek #2982@lemmy.ca · 1 year ago

      That’s what we were all clamoring for: a self-driving machine that operates like a mouth breather late for work.

      Elon is a masterclass in stupid.

      • EpsilonVonVehron@lemmy.world · 1 year ago

        Mush doesn’t care about laws. As mentioned in another article, he appears to be operating his phone by hand from the driver’s seat, which is both a driving violation and against Tesla’s own owner’s manual.

        • meseek #2982@lemmy.ca · 1 year ago

          Same guy who parades around in his private jet calling everyone who doesn’t return to the office amoral and selfish.

          So yeah. All that tracks. The entire “it’s different because it’s me” stench wafting in.

      • morriscox@lemmy.world · 1 year ago

        WTH is wrong with mouth breathers? What ass grasped for some new insult and came up with that? It’s a lame, stupid insult.

      • Thorny_Thicket@sopuli.xyz · 1 year ago

        Perhaps you should put your hatred towards Elon aside for a while and objectively consider what actually is the better solution here.

        One could argue that strictly following the rules is the right approach, and perhaps it would be if everyone actually drove that way. However, in reality, that’s not usually the case. What truly increases traffic safety is predictability. If most drivers are rolling through stop signs and you’re the only one stopping completely, while you might technically be in the right, your behaviour could lead to accidents due to the unpredictability. The same applies to speeding. Driving significantly slower than the flow of traffic might disrupt it, leading to unsafe overtaking and such. While you might be legally correct here too, in practice a slight increase in speed could lead to increased road safety.

        These are complex issues. A dose of humility might go a long way instead of acting like the answer is obvious.

        • SatanicNotMessianic@lemmy.ml · 1 year ago

          I do this kind of thing for a living, and have done so for going on 30 years. I study complex systems and how they use learning and adaptation.

          Musk’s approach to these systems is idiotic and shows no understanding of, or appreciation for, how complex systems - animals, in particular - actually work. He wanted to avoid giving his vehicles lidar, for instance, because animals can navigate the world without it. Yet he didn’t give them the perceptual or cognitive capabilities that animals have, nor did he take into account that the problems of animal locomotion solved by evolution are very different from the problems solved by people driving vehicles. It, of course, didn’t work, and now Tesla is trailing the pack on self-driving capability, with the big three German carmakers and others prepping Level 3 vehicles for shipping.

          If he is trying to ChatGPT his way out of the corner he’s painted himself into, he’s just going to make it worse - and, amusingly, for the same reasons. Vision is just one dimension of sensation, and cars are not people, or antelopes, or fish, or whatever his current analogy is.

          This is just Elon Eloning again. No one predicts that a car coming towards them is going to do a California stop at a stop sign. If I’m pulling into an intersection and I see someone rolling through a stop sign, I’m hitting the brakes, because obviously a) they didn’t see me and b) they don’t know the rules of the road. Elon’s cars have a problem with cross traffic and emergency vehicles as it is; making the logic fuzzier is not going to improve the situation. If he thinks throwing video and telemetry data at a large model is going to overcome his under-engineered autonomous system, I suspect he’s in for a rude discovery.

          If there’s anything kids today can learn from Elon (or from Trump for that matter), it’s how to be so confidently wrong that people throw money at you. The problem is that if you’re not already born into wealth and privilege, you’re likely to merely become the owner of the most successful line of car dealerships in a suburban county in Pennsylvania, or else in prison for fraud.

          • Thorny_Thicket@sopuli.xyz · 1 year ago

            If FSD is trained on billions of hours of video data, then by definition it drives like an average driver and is thus highly predictable.

            • SatanicNotMessianic@lemmy.ml · 1 year ago

              That’s not how it works, unfortunately. That’s how people want it to work, but it’s not how it works.

              This is just more of Elon’s pie in the sky.

              • Thorny_Thicket@sopuli.xyz · 1 year ago

                If you’ve done this kind of stuff for a living for the past 30 years, then I’m sure you can give me a better explanation than “that’s not how it works”.

        • Honytawk@lemmy.zip · 1 year ago

          What better predictability is there than actually following the law?

          Self driving cars should be better than us, not be just like us.

          • Thorny_Thicket@sopuli.xyz · 1 year ago

            Even if a self-driving car behaves like a human driver, it still exceeds humans a thousandfold in processing and reaction speed. For a truly advanced self-driving system, plowing through stop signs and speeding should be a non-issue, because unlike humans it can pay 100% attention to its surroundings 100% of the time and react instantly when needed.

        • meseek #2982@lemmy.ca · 1 year ago

          The better solution is to not program your machine to act like a clown behind the wheel, committing all manner of offences because ThAt’s HoW ReGulAr PeoPlE DrIve!

          We aren’t trying to make Autopilot act like a real, bona fide driver; we’re just removing the inconvenience of having to do the driving.

          • Thorny_Thicket@sopuli.xyz · 1 year ago

            That depends on what you value.

            If you want self-driving cars that follow traffic rules to the letter, even if that means more people are going to die, then that’s fine. I don’t agree, but I can see why someone would think that. Personally, I would prioritize human life, so if it turns out this is one of the cases where bending the rules does in fact lead to fewer accidents, then that’s what I’m voting for.

            I’m not claiming either is true. Just asking to consider the fact that the right thing to do is not always intuitive.

            • meseek #2982@lemmy.ca · 1 year ago

              Oh we all know what Elon values 🤪

              Let’s pluck out this forced-choice fallacy first off. I’m going to opt for c) I want self-driving cars that obey the rules of the road “to the letter” and keep people safe. If not, why do they even make traffic rules?

              You and Elon want the cool self driving car that cruises 60 in a 50 with traffic and occasionally doesn’t check its blind spot but quickly recovers and gives a quick wave like sorry bro my bad.

              I mean, okay Jerry.

              • Thorny_Thicket@sopuli.xyz · 1 year ago

                It’s a thought experiment. I’m not making any statements about which is the correct thing to do, just asking people to consider the possibility that what actually leads to the safest possible roads may not be what’s intuitive. If, for the sake of argument, you can’t imagine a scenario where a self-driving car is able to bend the rules from time to time to navigate different situations while still staying out of accidents, then you frankly just haven’t thought about it very much.

              • Thorny_Thicket@sopuli.xyz · 1 year ago

                That third option is the first option in my view.

                For the sake of argument, let’s imagine that most people drive 10 km/h over the speed limit on highways, and that statistically a significant number of accidents happen when people overtake someone driving slower.

                Now, by driving faster, these dangerous overtakes happen far less often, resulting in an overall increase in safety - but it’s also against the rules. So how does your “third option” solve this issue?

            • batmaniam@lemmy.world · 1 year ago

              When someone is driving, if they misjudge and bend the rules at the wrong time and kill someone, they go to court. They can potentially be convicted of all sorts of things.

              Who’s going to court when a car does it? Who serves the jail time?

              • Thorny_Thicket@sopuli.xyz · 1 year ago

                With the current systems, the driver, obviously. These systems are not yet advanced enough to be blindly relied on.

                • batmaniam@lemmy.world · 1 year ago

                  I should have been more clear: I meant an AI trained to break the rules the way we’re talking about. Having the ability to make a judgement also means responsibility for that judgement. If I cross a double yellow to get around farm equipment on a back country road, and I misjudge and kill someone, it’s on me. It doesn’t matter if 999 times out of 1,000 I could have broken the rules responsibly.

                  So who goes to jail when a car does it?

        • OnionQuest@lemmy.ml · 1 year ago

          It’s simply solved by the fact that I, as a human driver, can recognize when a robo-taxi is driving and adjust my expectations of the car’s behavior. Right now it’s clearly evident what an autonomous car looks like, and a reasonable person will expect it to follow the letter of the law.

          I interact with these vehicles on a daily basis in San Francisco and it would be weird if they weren’t driving perfectly.

        • NotYourSocialWorker@feddit.nu · 1 year ago

          > If most drivers are rolling through stop signs and you’re the only one stopping completely, while you might technically be in the right, your behaviour could lead to accidents due to the unpredictability.

          Simply no. If you as a driver aren’t prepared for the car in front of you to actually stop where there’s a sign that says stop, and if you aren’t keeping enough distance to be able to brake, then it isn’t the car in front that is the problem or the one causing the accident; it’s you and only you.

          > The same applies to speeding. Driving significantly slower than the flow of traffic might disrupt it, leading to unsafe overtaking and such.

          Again, no. If they are driving at the posted speed, keeping that speed, and driving predictably, then the ones driving “significantly” faster are the ones decreasing road safety. No-one is forcing them to perform “unsafe overtaking and such”. Also, just because you, from your vantage point, can’t see a reason for the car in front of you driving slowly doesn’t mean there isn’t one.

          While a dose of humility is good, a dose of personal responsibility is also great.

          • Thorny_Thicket@sopuli.xyz · 1 year ago

            > then the ones driving “significantly” faster are the ones decreasing road safety. No-one is forcing them to perform “unsafe overtaking and such”.

            I’m not claiming it is so, but I’m saying it’s conceivable that if the autonomous vehicle drives slightly over the speed limit, with the flow of traffic, it may actually lead to a statistically significant drop in accidents compared to the scenario where it follows the speed limit. Yes, no one is forcing other drivers to behave in such a way, but they do, and because of that, people die. In this case, forcing self-driving cars to follow traffic rules to the letter would paradoxically mean you’re choosing to kill and injure more people.

            I don’t think the answer to this kind of moral question is obvious. Traffic is such a complex system, and there are probably many other examples where the actually safer thing to do is not what you’d intuitively think.

        • trashgirlfriend@lemmy.world · 1 year ago

          The answer is clear and easy.

          Don’t let computers have full control over freely moving several ton death machines.

          • The King@lemmy.world · 1 year ago

            This is such a cop-out. “No computers!”, but it’s okay to let someone drive who isn’t paying attention because they’re deep in their phone? I drive a motorcycle and I’ve had people stare me straight in the eye, only to pull out in front of me and nearly kill me.

            People are notoriously bad at driving. The computer doesn’t have to be perfect, just better than the soccer moms or distracted dummies.

          • Honytawk@lemmy.zip · 1 year ago

            After a while, the human will be the bottleneck in preventing accidents.

            Computers are a lot better at following the law.

    • variaatio@sopuli.xyz · 1 year ago

      > It’s interesting because by strictly following traffic rules you might in fact be a danger to others, but by driving like humans you’re also breaking the law.

      Well, the others should also stop breaking the law; then things are safe again. One doesn’t solve the murder problem by making murder legal. If someone driving legally is a danger to someone else, then the source of the problem is the other person’s behaviour, since the rules don’t include things like “be obnoxious and hinder others”.

      Other drivers must drive expecting that those around them may actually follow the rules: leave enough room in case the car in front does stop at the stop sign, since it might have to make an emergency stop anyway. If you aren’t far enough back to allow for a stop at a stop sign, you certainly don’t have the safe distance to handle the car in front braking hard in a suddenly developing situation. One must always leave avoidance distance.

      Drive at the speed limit and not a little over? If a speeding overtaker attempts a dangerous overtake because they’re “annoyed” by someone driving at the speed limit and causes a crash, the fault is the overtaker’s.

      There are very, very few cases where driving by the rules is the cause of danger. Other drivers being foolhardy, emotional idiots is the source of danger, and the fault will, and should, land with the foolhardy idiot.

      As the NHTSA said when making Tesla remove the “California stop” - rolling through a stop sign without stopping - others breaking the law doesn’t make it legal for you. In fact, an arbitrary cultural behaviour that some follow and some don’t is a source of danger because of the uncertainty it causes.

      edit: So in the long term the car is safer by following the rules, since that induces others to drive legally and predictably - especially since machines don’t pick up on human non-verbal hints and so on. The only sensible route for a driving machine, as opposed to a driving human, is to strictly follow the traffic rules, since that makes it a predictable player. Other humans have no cultural way to gauge how a “driving machine” would behave except by the one publicly known precedent it could be expected to follow: driving by the rules to the letter. And the rules do include the basic obligation that every driver must try to avoid a collision, or minimize it when it can’t be avoided. No amount of “but the rules say” overrules that, so there will be no cyborg car bowling down a pedestrian or another car because the other person was technically breaking the law and the car had right of way.

      • Thorny_Thicket@sopuli.xyz · 1 year ago

        I obviously don’t know for sure, but it’s at least conceivable that erratic behavior by other drivers, triggered by someone driving slower than them, leads to a significant number of accidents every year that would not have happened had everyone been driving at the same speed.

        In this case, forcing the self-driving vehicle to never go over the speed limit literally means knowingly choosing an option that leads to more people dying instead of fewer.

        I think there’s a pretty clear moral dilemma here. I’m not claiming to know the right way forward, but I just want to point out that strictly following the rules without an exception is not always what leads to the best results. Of course, allowing self-driving cars to break the rules comes with its own issues, but this just further points to the complexity of this issue.

        • variaatio@sopuli.xyz · 1 year ago

          Then again, if that “follow the others” behaviour means driving faster, that also leads to accidents - not so much with the other, frustrated drivers, but with, say, wildlife. People fail to stop in time more often due to the increased speed and thus increased braking distance.

          That is why narrow, bendy roads have slower speed limits. It is a function of the predicted reaction time and the sight distance one has.

          You can’t cheat physics: the more speeding there is, the longer the braking distances, and the more often it isn’t a near miss thanks to braking in time, but a full-on collision.

          So sure, everyone is more in sync, but everyone is in sync with less reaction time available when the unavoidable chaos factor raises its head - chaos factors like wildlife (which is not obliged to follow traffic rules) or, say, a burst tire leading to a sudden change in speed and control.

          • Thorny_Thicket@sopuli.xyz · 1 year ago

            When a self-driving car drives at or below the speed limit on a fast-moving highway, it can disrupt the natural flow of traffic. This can lead to a higher chance of accidents when other human drivers resort to aggressive maneuvers like tailgating, risky overtaking, or sudden lane changes. I’m not claiming that it does so for a fact, but it is conceivable, and that’s the point of my argument.

            Now, contrast this with a self-driving car that adjusts its speed to match the prevailing traffic conditions, even if it means slightly exceeding the speed limit. By doing so, it can blend with the surrounding traffic and reduce the chances of accidents. It’s not about encouraging speeding but rather adapting to the behavior of other human drivers.

            Of course, we should prioritize safety and adhere to traffic rules whenever possible. However, sometimes the safest thing to do might be temporarily going with the flow, even if that means bending the speed-limit rules slightly. The paradox lies in the fact that by mimicking human behavior to a certain extent, self-driving cars can contribute to overall road safety. It’s a nuanced issue, but it underscores the complexity of integrating autonomous vehicles into a world where human drivers are far from perfect. This would not be an issue if every car were driven by a competent AI and there were no human drivers.

  • DampSquid · 1 year ago

    I assume this is all just bullshit and lies, like last time?

    • Honytawk@lemmy.zip · 1 year ago

      When was it ever different from old musky?

      How long before the Cybertruck is released, again?

      • EpsilonVonVehron@lemmy.world · 1 year ago

        Cybertruck, hyperloop, etc., etc. Mush said in 2016: “I really would consider autonomous driving to be basically a solved problem. I think we’re basically less than two years away from complete autonomy.”

  • AutoTL;DR@lemmings.world (bot) · 1 year ago

    This is the best summary I could come up with:


    “That’s why we’ve not released this to the public yet.” (FSD is technically beta software, though Musk has said that v12 will be the first time Tesla removes that label.)

    But the moment when Musk was forced to intervene at the traffic light has already been seized upon by critics who say Tesla’s approach to autonomous driving is insufficient and reckless.

    Musk has said that FSD is being tested as beta software to emphasize the need for drivers to pay attention to the road while using the driver-assist feature.

    (Remember, Musk has banned the @ElonJet account that tracks his private jet from X/Twitter, claiming it was a “direct personal safety risk” to him.)

    The broader context here is that the federal government’s two-year investigation into Tesla’s highway driver-assist feature, Autopilot, is nearing its end, which may have prompted Musk to post the video as provocation.

    The government could force a recall of Autopilot and, by extension, FSD, which could affect Tesla’s valuation, much of which hinges on the company’s promise that it will offer full autonomy to its customers in the near future.


    The original article contains 656 words, the summary contains 184 words. Saved 72%. I’m a bot and I’m open source!

  • EdibleFriend@lemmy.world · 1 year ago

    Eh, I hate the dipshit, but he has a point. It’s not really doxxing when he literally just Googled it live.

    • EyesEyesBaby@lemmy.world · 1 year ago

      If we apply that same theory to @ElonJet, then that wouldn’t be doxxing either. Obviously Elon thinks otherwise. If @ElonJet is doxxing, then so is this.

      • EdibleFriend@lemmy.world · 1 year ago

        lol, already talked about that elsewhere. Yep, it’s exactly the same thing. I’m not saying he isn’t a hypocrite.

        • Aurenkin@sh.itjust.works · 1 year ago

          Excuse me, this is the internet. You have to form a view on someone and then either agree or disagree with everything they do consistently otherwise it’s illegal.

          • EdibleFriend@lemmy.world · 1 year ago

            Lol yep. People are talking to me like I’m some kind of fanboy when all I really said was maybe, this time, he didn’t actually strangle a puppy.

    • lazyvar@programming.dev · 1 year ago

      Isn’t that a little bit of circular reasoning?

      If I doxx someone online, then it gets indexed by Google; if someone then Googles the information, does it stop being doxxing?

      I’d assume most doxxing isn’t done by someone with unique firsthand knowledge (e.g. “Oh, I know John, he lives on so-and-so road”) but by finding the information online, whether via Google or another public source.

      At least in the US, where a ridiculous amount of private information is deemed “public”.

      • timkenhan@sopuli.xyz · 1 year ago

        Releasing the information and acquiring the released information are two different things.

        • lazyvar@programming.dev · 1 year ago

          Most doxxers don’t technically release the information; rather, they’ve acquired it and point others to where they found it, or simply disseminate it further.

      • EdibleFriend@lemmy.world · 1 year ago

        Not really? Because, in your scenario, Musk would have to be the person who originally posted his info. He didn’t even have to go drop a few bucks on Spokeo or something.

        • lazyvar@programming.dev · 1 year ago

          That’s what I’m saying. In most cases the doxxer isn’t the one who originally provided the info, but rather someone who has found the information online via a Google search or something similar.

          • EdibleFriend@lemmy.world · 1 year ago

            And in most situations the person releasing the information isn’t doxxing someone who’s world-famous and takes all of 4 seconds to find information on. Usually when you hear about doxxing it’s… well… someone like you: hidden behind an anon nickname, while some weird-ass neckbeard digs and digs to find enough information to get your name and go from there.

            It’s quite a bit different when the person is one of the wealthiest and most powerful men on the planet and so much about him is just basic, widely known information.

  • Vub@lemmy.world · 1 year ago

    I don’t understand much of this stuff, but does this mean they (he…) threw a decade of research out the window and instead fed an AI loads of video data to start over from scratch?

    • ours@lemmy.film · 1 year ago

      I’m guessing it’s just more complicated than that. Training an AI model with loads of video data is how they got this far, but they seem to be hitting the limits of the current process/sensors.