• BluesF
    1 year ago

    Yeah, it’s back to exactly the problem the article points out: refined bullshit is still bullshit. You still need to teach your LLM how to talk, so you still have to cast that bullshit input into its “base” before you feed it the “grounding” or whatever… And since it doesn’t actually understand any of that grounding, it’s just yet more bullshit.