One of the first things I noticed when I asked ChatGPT to write some terraform for me a year ago was that it uses modules that don’t exist.
The same goes for Ruby. It just totally made up language features and gems that seemed to actually be from Python.
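One cheap defense is to ask the Ruby runtime itself whether a suggested method or gem really exists before building on it. A minimal sketch (the specific method and gem names here are just illustrative examples, not ones from the thread):

```ruby
# Sanity-check an AI suggestion against the actual Ruby runtime.
# Names below are illustrative assumptions, not anything ChatGPT said.

# Does this method really exist on String?
puts String.instance_methods.include?(:each_char)  # real Ruby method
puts String.instance_methods.include?(:splitlines) # Python-ism, not Ruby

# Is this gem actually installed?
require "rubygems"
begin
  Gem::Specification.find_by_name("json")
  puts "json gem is installed"
rescue Gem::LoadError
  puts "no such gem"
end
```

Thirty seconds of this beats a reinstall loop when the "feature" never existed in the first place.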
Not that it’s a programming language, but it also makes up rules for 5e D&D if you ask to play a game.
Could you give an example? I really want to know what silly rules it came up with.
It wasn’t an extensive session, and “making up rules” is perhaps a bit strong as an expression. “Ignoring rules” would be more apt. It just replied with something that a DM might say in a given scenario, without understanding why.
Like, it kept asking me what to do even after I had told it, in specific terms, that I use my action and my bonus action. It basically let me sit there as a sorcerer spamming endless spells. It didn’t really understand spell slots or actions, but if you reminded it about them, it pretended it had understood them all along.
I’m sure it’s somewhere in my history, but also, just go ask one to DM you an impromptu battle and check for yourself.
They really are just like us.
It seems to shortcut implementations that require more than one block, and mimics parameters from other functions.
I have this problem with ChatGPT and PowerShell. It keeps referencing functions that don’t exist inside of modules, and when I’m like “that function doesn’t exist,” it’s like “try reinstalling the module.” Then I do, the function still isn’t there, so I ask it for maybe another way to do it, and it just goes back to the first suggestion, around in circles. ChatGPT works great sometimes, but honestly I still have more success with Stack Overflow.