Packer and GCP
Hey all,
I'm just getting into a full-time Infrastructure Automation role. I'm working on standardized base images for all of our environments (AWS, VMware, and GCP). AWS I'm familiar with, having worked with it for the last five years, but the Packer-plus-GCP process is throwing me for a loop.
According to the Google Cloud docs, you use a packer.json file to submit a build job (through their build service) for GCP. But, per Packer's own recommendation, I'm using HCL2 files. And if you look at the docs for the googlecompute builder, it has you running the build with Packer itself. So... what the hell?
And is GCP the special snowflake where you have to leverage their build method? AWS seems much simpler: you just run Packer against the account and let it do its provisioning, building, and publishing. Same with VMware.
So am I looking at the GCP problem wrong? Or is this pretty typical amongst the various environments?
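For reference, here's roughly what I've been trying: a minimal HCL2 sketch using the googlecompute builder, run locally with a plain `packer build` (no Cloud Build involved). The project ID, image family, zone, and names below are all placeholders, not anything from our actual setup:

```hcl
packer {
  required_plugins {
    googlecompute = {
      # Official Google Compute plugin for Packer
      source  = "github.com/hashicorp/googlecompute"
      version = ">= 1.0.0"
    }
  }
}

source "googlecompute" "base" {
  project_id          = "my-project"    # placeholder GCP project
  source_image_family = "debian-12"     # base OS family to build from
  zone                = "us-central1-a"
  ssh_username        = "packer"
  # Unique-ish image name; GCE image names must be lowercase
  image_name          = "base-${formatdate("YYYYMMDD-hhmmss", timestamp())}"
}

build {
  sources = ["source.googlecompute.base"]

  # Standard provisioning step, same pattern as with the AWS builder
  provisioner "shell" {
    inline = ["sudo apt-get update -y"]
  }
}
```

Run with `packer init .` then `packer build .` using credentials from `gcloud auth application-default login` or a service account, which seems equivalent to how I'd run it against AWS.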

