• flatbield@beehaw.org · 10 months ago

    This is the reason I balk at personifying these things with human terms. It sounds cool, but it is both inaccurate and misleading, especially in the hands of the media and the general public.

    • eveninghere@beehaw.org · 10 months ago

      The dilemma is that ChatGPT can write better reports than most graduate students in my country, because what the problematic majority of these students do is memorize, not analyze.

      Specifically in this context, students are not trained to analyze what they are asked (the input query, in ChatGPT terms). When I put a unique question in their assignment, they can’t even form a response. They just write generic text that doesn’t try to answer my question.

      They seem to copy and paste what’s in their brain. And when it comes to copying and pasting, i.e. mimicking what people do, ChatGPT is the champion in some sense. Hell, OpenAI even tuned it to produce a balanced stance, which is also something these students can’t do.

      Finally, 90% of the population actually performs worse than these graduate students.

      • flatbield@beehaw.org · 10 months ago

        It is sad, but most people seem to go to school for certification, not learning. I used to grade when I was in grad school… the lazy, sloppy work was nuts. And working for a company… the terrible writing some people produce, even with advanced degrees.