ChatGPT has a style-over-substance trick that seems to dupe people into thinking it's smart, researchers found

Developers often prefer ChatGPT's responses about code to those submitted by humans, despite the bot frequently being wrong, the researchers found.

  • GenderNeutralBro@lemmy.sdf.org · 11 months ago

    If you need a correct answer, you’re doing it wrong!

    I’m joking, of course, but there’s a seed of truth: I’ve found ChatGPT’s wrong or incomplete answers to be incredibly helpful as a starting point. Sometimes it suggests a Python module I didn’t even know about that does half my work for me. Other times the response is mostly nonsense, but the one line I actually need is correct (or close enough for me to understand).

    Nobody should be copying code off Stack Overflow without understanding it, either.