more llms

I used another LLM chat to get some quick input about how I messed up the point at which I mixed oil into a spice cake recipe.

(I was supposed to whisk it with eggs & buttermilk etc. but I totally forgot, and mixed it with the entire batter at the end while it was in the pan.)

It told me it would probably end up more like a bread than a cake, and also scolded me for not following recipe instructions exactly.

Anyway, it turned out fine, exactly as it did last time I made it. But I have this strange compulsion to go back to the LLM and report that it worked out. Do I want to prove it wrong? Or reassure the bot that it all turned out okay in the end?

I feel like I'm probably not the only person who does this.

For what it's worth, I do report back that something worked when I'm trying to fix a coding issue or the like. In that context it feels relevant.

In this situation, though, I think I'm just being weird.