• PresidentCamacho@lemm.ee · ↑25 ↓33 · 7 months ago

    This is the actual logical way to think about self-driving cars. Stop downvoting him because “Tesla bad,” you fuckin goons.

    • gallopingsnail@lemmy.sdf.org · ↑18 ↓5 · 7 months ago

      Tesla’s self-driving appears to be less safe and to cause more accidents than its competitors’.

      “NHTSA’s Office of Defects Investigation said in documents released Friday that it completed ‘an extensive body of work’ which turned up evidence that ‘Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.’”

      Tesla bad.

      • TypicalHog@lemm.ee · ↑8 ↓3 · 7 months ago

        Can you link me the data that says Tesla’s competitors’ self-driving is safer and causes fewer accidents, and WHICH ONES? I would really like to know who else has this level of self-driving while also having fewer accidents.

      • Socsa@sh.itjust.works · ↑3 ↓1 · 7 months ago

        I don’t quite understand what they mean by this. It tracks drivers with a camera and the steering-wheel sensor, and it literally turns itself off if you stop paying attention. What more can they do?
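
        As described, the logic is roughly this (a minimal sketch with hypothetical signal names and thresholds, not Tesla’s actual code):

        ```python
        # Sketch of a camera + steering-torque driver-engagement check.
        # Signal names and thresholds here are hypothetical, not Tesla's.
        def should_disengage(eyes_on_road: bool,
                             seconds_since_torque: float,
                             warnings_issued: int) -> bool:
            """Shut the assist off if the driver ignores repeated warnings."""
            attentive = eyes_on_road or seconds_since_torque < 10.0
            if attentive:
                return False
            return warnings_issued >= 3  # warn first, then disengage

        # Driver looking away, no wheel input for 30 s, 3 warnings issued:
        print(should_disengage(False, 30.0, 3))  # True -> system turns off
        ```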

        • nxdefiant@startrek.website · ↑2 · edited · 7 months ago

          The NHTSA hasn’t issued rules for these things either.

          The U.S. government has issued general guidelines for the technology/industry here:

          https://www.transportation.gov/av/4

          They have an article on it discussing levels of automation here:

          https://www.nhtsa.gov/vehicle-safety/automated-vehicles-safety

          By the definitions laid out in that article:

          BlueCruise, Super Cruise, and Mercedes’ offering are Level 3 systems (you must be alert to re-engage when the conditions for their operation no longer apply).

          Tesla’s FSD is a Level 3 system (the system will warn you when you must re-engage, for any reason).

          Waymo and Cruise are Level 4 systems (geolocked).

          Level 5 systems don’t exist.

          What we don’t have is any kind of federal law:

          https://www.ncsl.org/transportation/autonomous-vehicles

          Separated into two sections – voluntary guidance and technical assistance to states – the new guidance focuses on SAE international levels of automation 3-5, clarifies that entities do not need to wait to test or deploy their ADS, revises design elements from the safety self-assessment, aligns federal guidance with the latest developments and terminology, and clarifies the role of federal and state governments.

          The guidance reinforces the **voluntary** nature of the guidelines and does not come with a compliance requirement or enforcement mechanism.

          (emphasis mine)

          The U.S. has operated on a “states are laboratories for laws” principle since its founding. The current situation is in line with that principle.

          These are not my opinions, these are all facts.

      • nxdefiant@startrek.website · ↑4 ↓8 · 7 months ago

        No one else has the same capability across as wide a geographic range. Waymo, Cruise, BlueCruise, Mercedes, etc. are all geolocked to certain areas or certain stretches of road.

        • GiveMemes · ↑5 ↓2 · 7 months ago

          Ok? Nobody else is being as wildly irresponsible, therefore Tesla should be… rewarded?

          • nxdefiant@startrek.website · ↑2 ↓1 · 7 months ago

            I’m saying a larger sample size == larger raw numbers.

            Tesla announced 300 million miles on FSD v12 in just the last month.

            https://www.notateslaapp.com/news/2001/tesla-on-fsd-close-to-license-deal-with-major-automaker-announces-miles-driven-on-fsd-v12

            Geographically, that’s all over the U.S., not just in hyper-specific metro areas or stretches of road.

            The sample size is orders of magnitude bigger than everyone else’s, by almost every metric.

            If you include the most basic Autopilot, Tesla surpassed 1 billion miles back in 2018.
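
            To put numbers on the counting point (a toy calculation with an invented incident rate, not real crash data): at the same underlying rate, a fleet driving 10x the miles reports 10x the incidents.

            ```python
            # Toy example: identical incident rate, very different raw counts.
            # The 0.5-per-million-miles rate is invented for illustration.
            RATE_PER_MILLION_MILES = 0.5

            for fleet, miles in [("small fleet", 30e6), ("large fleet", 300e6)]:
                expected = RATE_PER_MILLION_MILES * miles / 1e6
                print(f"{fleet}: {miles / 1e6:,.0f}M miles -> ~{expected:,.0f} incidents")
            # small fleet: 30M miles -> ~15 incidents
            # large fleet: 300M miles -> ~150 incidents (same safety, 10x the count)
            ```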

            These are not opinions, just facts. Take them into account when you decide how to interpret the opinions of others.

            • GiveMemes · ↑1 ↓1 · edited · 7 months ago

              That’s not how rates work, though. A larger sample size doesn’t correlate with a higher rate of accidents, and a rate is what any such study implies, not just raw numbers. Your bullshit rationalization is funny. In fact, a larger sample tends to produce a more stable rate estimate, since there is less chance that a single error/fault makes an outsized impact on the data.
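
              A quick simulation shows what I mean (the true rate here is invented; the point is how the estimate behaves, not the number itself): counts grow with miles, while the observed rate settles toward the true one.

              ```python
              # Fixed true incident rate, growing sample sizes: counts rise,
              # but the observed rate converges. The rate itself is invented.
              import random

              random.seed(42)
              TRUE_RATE = 0.002  # incidents per mile, purely illustrative

              for miles in (1_000, 100_000, 1_000_000):
                  incidents = sum(random.random() < TRUE_RATE for _ in range(miles))
                  print(f"{miles:>9,} miles: {incidents:>5} incidents, "
                        f"observed rate {incidents / miles:.5f}")
              # Counts keep rising, but the observed rate settles near 0.002.
              ```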

              • nxdefiant@startrek.website · ↑1 · edited · 7 months ago

                No one’s talking about rates. The article itself, and all the articles linked in these comments, talk about counts: numbers of incidents. I’m not justifying anything, because I’m not injecting my opinion here. I’m only pointing out that, without context, counts don’t give you enough information to draw a conclusion. That’s just math. You can’t even derive a rate without that context!
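
                The arithmetic is simple (both mileage figures below are hypothetical): the same incident count implies wildly different rates depending on miles driven, and without the miles there is no rate at all.

                ```python
                # Same count, different exposure -> very different rates.
                # Both mileage figures are hypothetical.
                def rate_per_million_miles(incidents: int, miles: float) -> float:
                    return incidents / (miles / 1e6)

                print(rate_per_million_miles(100, 10e6))   # 10.0 incidents per M miles
                print(rate_per_million_miles(100, 500e6))  # 0.2 incidents per M miles
                # A bare "100 incidents" can't distinguish these two fleets.
                ```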

                • GiveMemes · ↑1 · 7 months ago

                  That’s not my point, though. We both know that the government agency doing this work is primarily interested in the rates, whether or not media reports talk about total numbers. Yes, the only reason they started the investigation was individual incidents, but they’re not looking for a few cases; they’re looking for a pattern.

                  (Like this one: https://www.ranzlaw.com/why-are-tesla-car-accident-rates-so-high/)

    • doubtingtammy@lemmy.ml · ↑12 ↓1 · 7 months ago

      It’s not logical, it’s ideological. It’s the ideology that allows corporations to run a dangerous experiment on the public without their consent.

      And where’s the LIDAR again?

      • PresidentCamacho@lemm.ee · ↑5 ↓4 · edited · 7 months ago

        My argument is that self-driving car fatalities have to be compared against human-driven car fatalities. If self-driving cars kill 500 people a year, but humans kill 1,000 people a year, which one is better? Logic clearly isn’t your strong suit, maybe sit this one out…
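
        (For that comparison to mean anything, the counts also have to be put over miles driven; a sketch with invented mileage figures:)

        ```python
        # Fatality *counts* only compare fairly when normalized by exposure.
        # All numbers below are invented for illustration.
        def per_100m_miles(fatalities: float, total_miles: float) -> float:
            return fatalities / (total_miles / 100e6)

        print(per_100m_miles(1000, 3.0e12))  # humans, ~3 trillion miles: ~0.03
        print(per_100m_miles(500, 1.0e9))    # self-driving, 1 billion miles: 50.0
        # Fewer total deaths over far fewer miles can still be the worse rate.
        ```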