I hate having to justify using AI as a tool

This metaphor might not work, but it's what came to mind. I use AI as a tool: sometimes I get a little help tweaking a sentence, suggesting alt text, or drafting meta descriptions for a blog post. Sometimes I use it as a sounding board to work through things on my mind. Lately, though, what I use it for most is coding personal projects.

One such use is creating shell scripts to automate things in Obsidian for me; I'll also get some CSS help. Honestly, most of the coding I get help with is NOT public-facing. But I've also used it to troubleshoot technical issues, whether with my websites or my Linux box at home.

People call this vibe coding, and many think it's the absolute worst (maybe second only to generated images). I understand some of the reasons behind this, security risks among them. In my experience, ChatGPT/Claude/Copilot will tell me when a certain option poses inherent security risks, something like "this is the easiest route, but it also has some security problems, etc." I understand the risks, and I can choose what to do from there.

And yet if I say that, maybe I get vilified by people who think everything related to AI comes from Robot Hell.

What if I compared this to cooking? People find recipes online all the time and follow them blindly when they can't figure out ratios themselves. Is this "vibe cooking"? If I create a great-tasting dish from a recipe book, I'm still praised for making a great dish. But it's not truly my creation: I mixed the ingredients and did the cooking, but someone else put in the work of testing the recipe to get the optimal flavour.

Except I don't need to justify using a cookbook. People just understand. Why does it have to be so different with coding?