- cross-posted to:
- futurology@futurology.today
Trust in AI technology and the companies that develop it is dropping, in both the U.S. and around the world, according to new data from Edelman shared first with Axios.
Why it matters: The drop comes as regulators around the world are deciding what rules should apply to the fast-growing industry. “Trust is the currency of the AI era, yet, as it stands, our innovation account is dangerously overdrawn,” Edelman global technology chair Justin Westcott told Axios in an email. “Companies must move beyond the mere mechanics of AI to address its true cost and value — the ‘why’ and ‘for whom.’”
Have you actually watched any videos of the new entirely AI based version 12 in action? It’s pretty damn good.
Not that that has much to do with my actual point, which is that it still doesn’t have LiDAR and it still doesn’t really work.
I’m not really talking about self-driving; I’m just pointing out that it’s a bad analogy.
I don’t know what LiDAR has to do with any of it, or why autonomous driving is a bad example. It’s an AI system, and that’s what we’re talking about here.
LIDAR is crucial for self-driving systems to accurately map their surroundings, including things like “how close is this thing to my car” and “is there something behind this obstruction.” Early Teslas paired their cameras with radar and ultrasonic sensors (and most other self-driving programs rely on LIDAR), but Tesla switched to a camera-only FSD implementation as a cost-saving measure, which is way less accurate: it’s insanely difficult to accurately map your immediate surroundings based solely on 2D images.
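For a rough sense of the difference, here’s a toy sketch (purely illustrative, not anything a real FSD stack runs, and the numbers are made up): LIDAR gets distance directly from time of flight, while a stereo camera pair has to back it out from pixel disparity (depth = focal length × baseline / disparity).

```python
# Toy comparison (illustrative only, not any real FSD pipeline): LIDAR
# measures distance directly from time of flight, while a stereo camera
# pair has to infer it from pixel disparity between two images.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def lidar_distance(round_trip_time_s: float) -> float:
    """Distance from a time-of-flight measurement: d = c * t / 2."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2

def stereo_distance(focal_length_px: float, baseline_m: float,
                    disparity_px: float) -> float:
    """Distance from stereo disparity: d = f * B / disparity."""
    return focal_length_px * baseline_m / disparity_px

# An object ~30 m away: LIDAR sees a ~200 ns round trip; a stereo rig with
# a 1000 px focal length and 0.5 m baseline sees ~16.7 px of disparity.
print(lidar_distance(200e-9))             # ~30.0 m
print(stereo_distance(1000, 0.5, 16.67))  # ~30.0 m
```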
I disagree. Humans are living proof that you can operate a vehicle with just two cameras. Teslas have way more than two, and unlike a human driver, the system is monitoring its surroundings 100% of the time. Being able to perfectly map your surroundings is not the issue. It’s understanding what you see and knowing what to do with that information.
Humans also have the benefit of literally hundreds of millions of years of evolution spent perfecting binocular perception of our surroundings, and we’re still shit at judging things like distance and size.
Against that, is it any surprise that when computers don’t have the benefit of LIDAR they are also pretty fucking shit at judging size and distance?
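To put rough numbers on that (purely illustrative, with a made-up stereo rig): because camera depth is inferred from disparity, the same half-pixel matching error that’s harmless at 10 m turns into metres of error at 80 m, growing roughly with the square of the distance. A time-of-flight measurement doesn’t degrade with range in that way.

```python
# Illustrative numbers only (made-up rig: f = 1000 px, baseline = 0.5 m):
# how a fixed half-pixel disparity error maps to depth error at range.

FOCAL_PX = 1000.0
BASELINE_M = 0.5
DISPARITY_ERR_PX = 0.5  # assumed stereo-matching error

for true_depth_m in (10, 30, 80):
    disparity_px = FOCAL_PX * BASELINE_M / true_depth_m
    # Depth you'd recover if the measured disparity is off by half a pixel
    est_depth_m = FOCAL_PX * BASELINE_M / (disparity_px - DISPARITY_ERR_PX)
    print(f"{true_depth_m:3d} m true -> {est_depth_m:6.1f} m estimated "
          f"({est_depth_m - true_depth_m:+.1f} m error)")
```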
Reality just doesn’t seem to agree with you. Did you see the video I linked above? I feel like most people have no real understanding of how damn good FSD V12 is, despite being 100% AI and camera-based.
Hey… fuck elon and fuck tesla