Aston sought medical help after her symptoms—which included severe migraines, abdominal pain, joint dislocations, easy bruising, iron deficiency, fainting, tachycardia, and multiple injuries—began in 2015, per the New Zealand Herald. She was referred to Auckland Hospital, where a doctor accused her of causing her own illness. Because of his accusations, Aston was placed on psychiatric watch. 

Research suggests women are considerably more likely than men to be misdiagnosed. A 2009 study of patients with heart disease symptoms found 31.3 per cent of middle-aged women “received a mental health condition as the most certain diagnosis”, compared to just 15.6 per cent of their male counterparts. Additionally, a 2020 study found that as many as 75.2 per cent of patients with endometriosis (a painful disorder in which tissue similar to the uterine lining grows outside the uterus) had been misdiagnosed after their symptoms began. Among those women, nearly 50 per cent were told they had a “mental health problem”.

  • DavidGarcia@feddit.nl
    10 months ago

    I’m gonna cherish the day when all these doctors that suck at diagnosing are replaced with AI. It’s so stupid that you have to go to 100 doctors to find one that takes you seriously and actually gives you the right diagnosis.

    • pezhore@lemmy.ml
      10 months ago

Unfortunately, AI is only as good as its training data. If there are biases in the training data, those biases shine through later.

      AI is interesting but not a silver bullet.
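The point about biased training data can be made concrete with a toy sketch. Everything below is invented for illustration (the records, the labels, and the "model", which is just a majority-label lookup); it is not any real diagnostic system, but it shows how a model trained on historically biased labels reproduces that bias for identical symptoms:

```python
from collections import Counter

# Hypothetical historical records: identical symptoms, but the labels
# assigned by past clinicians differ by the patient's recorded sex.
training_data = [
    ("female", "chest pain", "anxiety"),
    ("female", "chest pain", "anxiety"),
    ("female", "chest pain", "heart disease"),
    ("male",   "chest pain", "heart disease"),
    ("male",   "chest pain", "heart disease"),
    ("male",   "chest pain", "anxiety"),
]

def train(rows):
    """Toy 'model': memorise the majority label per (sex, symptom) pair."""
    counts = {}
    for sex, symptom, label in rows:
        counts.setdefault((sex, symptom), Counter())[label] += 1
    return {key: c.most_common(1)[0][0] for key, c in counts.items()}

model = train(training_data)
# Same symptom, different prediction: the training bias shines through.
print(model[("female", "chest pain")])  # anxiety
print(model[("male", "chest pain")])    # heart disease
```

Real systems are vastly more complex, but the failure mode is the same: nothing in the optimisation objective distinguishes a genuine clinical pattern from a historical prejudice baked into the labels.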

      • SinAdjetivos@beehaw.org
        10 months ago

Also, most AI systems are not only susceptible to existing biases; they have a nasty habit of inventing wild new, often very creative biases of their own, due to their reliance on random sampling and statistical modeling.

    • Fr❄stb☃️te@lemmy.world
      10 months ago

      It’s so stupid that you have to go to 100 doctors to find one that takes you seriously

      YES!!! Especially those ones where if you want your tubes tied or balls clipped…fuck sake doc, if you won’t lop my cords, I will. With rusty garden shears and a whole lot of vicodin!

      • Saraphim@lemmy.world
        10 months ago

        Oh she begged for them to tie her tubes after her one and only massively traumatic birth experience that ended in a c-section. Her uterine issues have been hell, and they repeatedly refused to even discuss a hysterectomy because “she might change her mind”. Fuck that, she has ptsd from that birth and he’s 19 damned years old now. She’s not having more kids.

        I guess it’s better to have four month long periods and clots the size of your foot.

        Totally normal.

    • teruma@lemmy.world
      10 months ago

      The problem is that they’re all trained with previous human diagnoses and then doctors will have the excuse of “Well the AI says you’re a hysterical attention seeking female…”

    • Myrhial@discuss.online
      10 months ago

We’ll need to ensure that this bias against female (and also male) patients isn’t adopted by the AI. We’re already not properly testing medicine on both sexes, and medical textbooks often list conditions as more or less common in one sex. Fixing this is entirely possible, but if the data isn’t properly screened we’re just moving the problem. Data can exist and be wrong for many reasons. We should address that urgently; it is bad for everyone. I think it is plausible an AI could have reached the same conclusion here, because of all the mental health problems considered far more common in women. Did anyone ever even check the source of that data? Because some of it really hasn’t been rechecked in the last 50 years, I’m sure.

    • _number8_@lemmy.world
      10 months ago

      it’s also so stupid that AI constantly chides you for asking it medical advice. yeah god forbid i want to use the free, ostensibly super intelligent tool instead of making an appointment, going there, talking to the guy who may also be wrong anyway, and paying hundreds

      • merridewOP
        10 months ago

        If you are talking about ChatGPT: please do not do this.

        ChatGPT guesses which words should come next in a sequence, based on its training data and your prompt. It is not intelligent, and it has no concept of what is real and what is not. It will cheerfully make things up.

        I sympathise with your frustration at having to pay money to see a doctor. I can’t imagine how stressful life would be without the NHS.

      • themusicman@lemmy.world
        10 months ago

        No current AI is super intelligent. As a software developer who has been keeping up to date with AI progress, I can say with some certainty that AI is far more biased in its diagnoses than human doctors and will often be misled by subtle changes in wording. I strongly urge you to not rely on medical advice from any current AI.

      • Altan1903@kbin.social
        10 months ago

        It will not be free, and it will misdiagnose exactly the same way as a human doctor would.
I’d argue that what Saraphim described (their friend dying from being constantly misdiagnosed due to weight) is the perfect example of what we will be able to expect from an AI doctor. These machine learning algorithms lack the fidelity that is most needed to understand a complex problem.
        Furthermore they have no concept of ethics or morals, and the data they train on reflects the imperfections of our society.
        So for example if all the doctors are biased towards overweight women, the AI trained on their diagnostic data will be too.

AI doesn’t exist but will ruin everything anyways

        One day AI will be a useful tool but to get there, one of the things we must do is be very critical of it.

      • Laticauda@lemmy.ca
        10 months ago

There isn’t a publicly available AI for medical issues (and even assuming there ever will be, it won’t be for a long, long time), so if you’re asking any of the publicly accessible AIs for medical advice then I’m sorry, but that is clinically stupid. And any medical AI that currently exists is going to be very flawed in various ways, and no better than a human (probably worse in a lot of ways), hence why we don’t use them on a widespread basis.

Also, current “AI” isn’t super intelligent by any metric; these are just specialized algorithms. We don’t have the kind of sci-fi AI you see in movies yet. We only have narrow AI at the moment, which is just pattern-recognition learning algorithms; they don’t mimic a human thought process.