ct smith

docs goblin

Abusing AI to make Marketing smile, part 1

October 8, 2025

Sometimes, as doc owners, it's good for us to try to make Marketing happy. To that end, I set out this week to do a full SEO audit of our doc content to make sure we're using all of the SEO tags that Fern supports. We have a lot of doc content and I'm lazy, so this morning I gave myself however long it took to drink my first coffee to figure out how to make the process as fast as possible.

15 minutes later, I had a beautiful spreadsheet with a full accounting of all our existing (and missing) page metadata/frontmatter.

the plan

I figured scripting something to iterate over my docs repo would be the lightest lift, so I used Claude (just regular ole Claude.ai) to write a script that scans all the .mdx files in our Fern directories and dumps the metadata into a CSV file.

the execution

I've got a pretty good prompting relationship with Claude at this point, so getting a script that would do exactly what I needed took only two prompts from my end.

Prompt 1:

Hi, I need to script a way to pull all the frontmatter from our Fern docs into a CSV for an SEO audit. We have a lot of missing meta tags and here's the platonic ideal of our frontmatter:

---
title: Manage Customers with the API
headline: "Manage Customers with Payabli API | Developer Documentation"
subtitle: Learn how to add and manage customers with the Payabli API
description: Complete guide for managing customer entities in Payabli. Covers custom identifiers, creating customer records via API, bulk importing customers from CSV, customer status management, and associating customers with transactions. Includes code examples for the /api/Customer endpoints.
keywords: [keywords]
og:site_name: Payabli Developer Documentation
og:title: "Manage Customers with the Payabli API"
og:description: "Complete guide for creating, importing, and managing customer records using the Payabli API. Includes custom identifiers and code examples."
slug: developers/developer-guides/entities-customers
icon: person
internalKeys: # I listed all our proprietary metadata we use
---

Let me know if you have any more questions before you get started.

Claude did the first iteration of the script without asking clarifying questions, and I ran it. It worked, but it scanned every file in our repo.

Prompt 2:

"Sorry, meant to only scan MDX files in the fern/ directory."

It spat out a new command for me to run, which did what I wanted it to. So the original script had everything I needed; I just hadn't bothered to read it before running it with no arguments. RTFM, right?

the solution

It's a ~300-line script that uses gray-matter to parse frontmatter. It even handles the conversion from yaml arrays to csv format without an issue.

We use gray-matter in a few other utilities and I've always found it incredibly useful and easy to use. I'm a huge fan of building docs preprocessing, automations, and tooling on top of page frontmatter (it's the underpinning of my knowledge graph visualizer).

These are the columns I ended up with:

filepath,priority,title,subtitle,description,headline,keywords,canonical_url,
og_site_name,og_title,og_description,og_image,og_image_width,og_image_height,
twitter_card,twitter_site,noindex,nofollow,image,slug,icon,ImportantInternalKey,
has_description,has_keywords,has_headline,has_og_tags,has_twitter_tags,
word_count,sidebarTitle,hide-nav-links,layout,max-toc-depth,subarea,mode

Some of them are booleans, some contain literal values, some are blank. If you try something like this, you can figure out what matters to you and tweak your own prompts and script.
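The has_* columns are just presence checks over the parsed frontmatter. A hedged sketch of how they might be derived (hypothetical helper and key names; I haven't pulled this from the actual script):

```python
def presence_flags(fm):
    """Derive boolean audit columns from a parsed frontmatter dict.

    Assumes fm maps frontmatter keys to strings (blank or missing
    keys both count as absent).
    """
    og_keys = ("og:site_name", "og:title", "og:description", "og:image")
    tw_keys = ("twitter:card", "twitter:site")
    return {
        "has_description": bool(fm.get("description", "").strip()),
        "has_keywords": bool(fm.get("keywords", "").strip()),
        "has_headline": bool(fm.get("headline", "").strip()),
        "has_og_tags": any(fm.get(k, "").strip() for k in og_keys),
        "has_twitter_tags": any(fm.get(k, "").strip() for k in tw_keys),
    }
```

Having these precomputed as booleans means the spreadsheet can be sorted or filtered straight to the gaps without eyeballing every literal value.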

The entire process from ideation to having my csv file in hand was under 15 minutes.

what now

The next stage of the project is filling in all those terrifying blanks, and I think I may tinker with using AI there somehow. After that, I may further tinker with having AI actually place the fixes in my frontmatter.

I will report back.

Postscript: the more time I spent with the csv file, the more I realized how useful that content actually is. The script has been promoted to a permanent tool in the Docs CLI, which I will also end up writing about at some point.

Update: I'm all done now, just one day later. See Abusing AI to make Marketing smile, part 2 for the conclusion.