Abusing AI to make Marketing smile, part 2
October 9, 2025
Picking up from yesterday's csv export experiment, I have a big csv file with a bunch of gaps I don't want to fill by hand if I don't have to. This morning, I sat down with my first cup of coffee and gave myself permission to spend 20 minutes tinkering before changing tack.
trying with claude.ai
So, I asked Claude:
Hi there! I need to craft better seo stuff for many fields. Can you take a stab at it? the canonical URL should be the slug value prepended with https://docs.payabli.com I don't think I'll be using the image fields so you can ignore those. Any questions before you start?
Claude asked a few questions about voice and tone and whether I wanted to adhere to best practices for tag length.
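(For what it's worth, the canonical URL rule from my prompt is purely mechanical. In Python terms it's just this, assuming the slug values are root-relative paths -- the example slug is made up:)

```python
def canonical_url(slug: str) -> str:
    # Canonical URL rule from the prompt: prepend the docs domain to the slug.
    return "https://docs.payabli.com" + slug

print(canonical_url("/developers/api-overview"))
# https://docs.payabli.com/developers/api-overview
```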
It took a few tries to handle the YAML keys with array values because Claude kept removing the double quotes around the comma-separated arrays, but I eventually got a csv file with all the previously empty fields filled in. The text was ehhhh -- on top of being AI-generated, it was also the worst of both worlds between marcomm and techcomm. I have 157 pages to handle here, so I decided the problem could bear a little more experimentation.
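If you haven't hit this quoting failure before, here it is in miniature (the column names are my guesses at the file's layout, not the real ones):

```python
import csv
import io

# A keywords field holding a comma-separated array has to stay quoted,
# or every csv parser reads its commas as column separators.
good = 'slug,keywords\n/pay-in/overview,"payments, embedded payments, payouts"\n'
bad = 'slug,keywords\n/pay-in/overview,payments, embedded payments, payouts\n'

print(next(csv.DictReader(io.StringIO(good))))
# {'slug': '/pay-in/overview', 'keywords': 'payments, embedded payments, payouts'}
print(list(csv.reader(io.StringIO(bad)))[1])
# ['/pay-in/overview', 'payments', ' embedded payments', ' payouts']
```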
I decided to take the original csv, dump it in a Claude Project with our llms-full.txt, and tell Claude to generate better, marketing-y description fields using the page content. It did a poor job overall with this. I was pretty stunned, so I investigated. It turns out that llms-full.txt strips all the page frontmatter, so there was nothing unique for Claude to index on between llms-full.txt and the csv file. It couldn't figure out which metadata from the csv belonged to which chunk of content in llms-full.txt.
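A quick sanity check would have caught this before I burned the tokens. Something like the sketch below, assuming the csv has a slug column (a guess on my part):

```python
import csv

# Do any of the csv's slugs appear in llms-full.txt at all? If not,
# there's no anchor tying each row's metadata to a chunk of content.
with open("llms-full.txt", encoding="utf-8") as f:
    corpus = f.read()

with open("partiallycleared.csv", newline="", encoding="utf-8") as f:
    missing = [row["slug"] for row in csv.DictReader(f) if row["slug"] not in corpus]

print(f"{len(missing)} slugs have no anchor in llms-full.txt")
```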
On to the next thing. I had only spent about 15 minutes on this lark.
using Claude Code
I don't have unlimited tokens in Claude Code, so I typically only use it for things that I really need to save time on, or things that need access to the docs repo.
I opened a Claude Code session and got to work:
Hi claude! I need to look at partiallycleared.csv and for any row where the Reviewed column isn't yes | YES, I need you to go find the referenced page in fern/pages and write better description and og:description text. I want you to lean SLIGHTLY more toward marketing style language here, but don't go overboard. I could also use help cleaning up the keyword suggestions.
Claude Code did its "thinking" thing and started offering up edits to my existing pages (not the csv file) for me to approve. At this point, I realized that I'd left ambiguity in my prompt. Mentally, I'd wanted Claude to update my spreadsheet so I could later script the page updates. Claude's approach was way better than pre-coffee CT's approach, so I let it ride.
Claude's approach gave me an opportunity to review and revise language as it worked through the list of updates, which ended up saving my precious human time. Claude updated 10 pages and asked me for feedback, and I decided to let it keep editing pages directly because it made so much more sense than what I'd planned initially.
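For comparison's sake, the csv-then-script plan I'd abandoned would have looked roughly like the sketch below. The column names, frontmatter keys, and fern/pages layout are my assumptions, not the repo's actual shape:

```python
import csv
from pathlib import Path

import yaml  # pip install pyyaml

DOCS_ROOT = Path("fern/pages")

with open("partiallycleared.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        # Map each csv row to its page file (layout assumed).
        page = DOCS_ROOT / f"{row['slug'].lstrip('/')}.mdx"
        if not page.exists():
            continue
        # Split off the YAML frontmatter and splice in the new metadata.
        _, frontmatter, body = page.read_text(encoding="utf-8").split("---", 2)
        meta = yaml.safe_load(frontmatter)
        meta["description"] = row["description"]
        meta["og:description"] = row["og:description"]
        meta["keywords"] = [k.strip() for k in row["keywords"].split(",")]
        page.write_text(f"---\n{yaml.safe_dump(meta)}---{body}", encoding="utf-8")
```

Claude's direct-edit approach skipped that whole read-modify-write dance and let me review the actual prose inline instead.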
I started this process at 9:18 AM. Claude Code worked in batches of anywhere from 3-10 pages at a time. At 10:54 AM, I was opening the PR. I'd even taken a 20-minute break to shape some loaves of bread, put them in to proof, and clean up.
The total Claude Code spend for this was $15.65. That seems a little hefty for what amounts to an hour and a half of dealing with words. However, adding meaningful meta descriptions and keywords to 157 pages about various embedded-payments features would probably have taken me a week on my own, given all my other work responsibilities.
I ended up only needing to edit 4 descriptions out of 157. I think the key here was making Claude take the time to read each page and craft the description. Expensive, but so am I.
why bother with all this stuff?
Everyone keeps saying SEO is dead. But then other people are like, "LLM crawlers do ingest SEO-related meta tags, so make sure you're using them."
I don't know if anyone really knows what's going on, because mostly everything AI right now is just grifters standing on locked black boxes, hawking whatever magic is inside.
I think it's entirely possible that traditional SEO still matters for AEO (answer engine optimization) and GEO (generative engine optimization). I'm in a hypercompetitive space and I have equity in my company -- if I spend a few hours and $15 of the company's money on Claude Code tokens, and it puts us in front of a few more users as a result, then I win in the long run. If SEO doesn't matter for AEO and GEO, then I still optimized for search engines, so who cares? It was like 3 hours of work.
So that's it. The audit and fix is over and done. All in -- 3ish hours of my human labor and a little over $15 in Claude Code credits, plus whatever the Claude.ai spend is for my work account.
Postscript: I totally forgot to have Claude Code generate the canonical URLs, but it turns out Fern does that automatically, so I am actually done with the project.