Martin Hähnel

Continuing With The Version Script

In yesterday's Back At It post, I made it to the first minimal version of the script. A lot of things are still missing. Here's what it can and can't do:

From those can't-do-yet items, I will try to tackle the first one now.

I wonder how people would normally implement this. In PHP, particularly in the Symfony universe, there is the Finder component, which is pretty handy when you have to find files by glob and the like; with it, handling paths is trivial. But I guess we'll do it by hand for now.

And then I noticed that I got ahead of myself: I hadn't even implemented feeding a (single) path to the script instead of hard-coding it! Slower is better. Since this starts a new blog post, here's the current situation in its entirety:

import { execSync } from "child_process";
import fs from "fs";
import matter from "gray-matter";

const TMP_FILE = "tmp-post-versions.md";

const args = process.argv.slice(2);
const isTesting = args.includes("--testing");
const path = args.filter((arg) => !arg.startsWith("--"));

function execute() {
	if (!isTesting) {
		console.log("This script is only for testing purposes.");
		return;
	}
	const gitLog = getCommitHashes(path);
	console.log(`Commits for post with path ${path}:\n`, gitLog);

	writeHashesToFile(gitLog);

	console.log("Bye...");
}

function getCommitHashes(path) {
	return execSync(`git log --pretty=oneline --follow -- "${path}"`).toString();
}

function writeHashesToFile(gitLog) {
	const commitHashes = gitLog
		.split("\n")
		.map((line) => line.split(" ")[0]) // get first part (hash)
		.filter((hash) => hash); // remove empty lines

	console.log("Commit hashes for the post:\n", commitHashes.join("\n"));

	//parse markdown file and write hashes to hashes key (add if not exists) of frontmatter
	const markdownContent = fs.readFileSync(TMP_FILE, "utf-8");
	const parsed = matter(markdownContent);
	parsed.data.hashes = [...(parsed.data.hashes || []), ...commitHashes];

	fs.writeFileSync(TMP_FILE, matter.stringify(parsed.content, parsed.data));
}

execute();

Alright. This works if I call it with a file. To my surprise, it also works with a folder, despite what man git-log says:

--follow
           Continue listing the history of a file beyond renames (works only for a single file).

Trying to read this answer to a question about how --follow works makes me realize that it's probably best to really only let --follow handle a single file, because what it's trying to do is pretty complicated. That means we have to make sure we don't hand folders over to git log --follow.

function isDir(path) {
	return fs.statSync(path).isDirectory();
}
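
How exactly to handle directories is still open; for now, here's just a sketch of the kind of guard I have in mind, with assertIsFile being a hypothetical helper that doesn't exist in the script yet:

// Hypothetical guard: refuse directories outright, since git log --follow
// is only defined for a single file.
function assertIsFile(p) {
	if (isDir(p)) {
		throw new Error(`"${p}" is a directory; git log --follow only works on single files.`);
	}
}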

And I just noticed an error. Or rather a bad naming choice leading to an error! I wrote:

const args = process.argv.slice(2);
//...
const path = args.filter((arg) => !arg.startsWith("--"));

But args is an array, and so is the result of that filter. Naming it path implies a single path string, not an array of them. Things only worked thanks to type coercion (a one-element array stringifies to its single element when interpolated). So it is now called paths, and you access the first (and only expected) path via paths[0]. Great.
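
In code, the fix boils down to roughly this:

const args = process.argv.slice(2);
const isTesting = args.includes("--testing");
// plural now: this is an array of paths, not a single path string;
// inside execute() the single expected path is accessed as paths[0]
const paths = args.filter((arg) => !arg.startsWith("--"));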

Next, I'll allow multiple file paths by introducing a loop... but maybe I should actually do a couple of things first:

  1. switch to eleventy's fork of gray-matter
  2. write hashes to their original files

Task 1 is super easy and just means switching the package in the package.json and updating. But then I see:

pnpm update
 WARN  2 deprecated subdependencies found: rollup-plugin-inject@3.0.2, sourcemap-codec@1.4.8

TIL that you can find out who is depending on these deprecated subdependencies by running:

pnpm why rollup-plugin-inject

But in my case it doesn't show anything. Weird. I had to run it inside one of my two "apps" dirs. That seems kind of counterproductive to using a monorepo (as I do) with pnpm. Regardless, I found the perps:

pnpm why rollup-plugin-inject
Legend: production dependency, optional only, dev only

d1-template /<redacted>/blog-monorepo/apps/cloudflare-blog-api

devDependencies:
@cloudflare/vitest-pool-workers 0.7.8
└─┬ wrangler 3.114.1
  └─┬ @esbuild-plugins/node-modules-polyfill 0.2.2
    └─┬ rollup-plugin-node-polyfills 0.2.1
      └── rollup-plugin-inject 3.0.2

So I guess as I am working on the blog and not its API I'll leave this for another time.
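
Side note: unless I'm misreading the pnpm docs, pnpm why also takes a --recursive (-r) flag, which should answer this from the workspace root instead of having to cd into each app:

pnpm why -r rollup-plugin-inject

Something to try next time.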

Next problem! The README of the gray-matter fork is wrong about how to use the package. It claims:

Usage

Using CommonJS:

const matter = require('gray-matter');

Or ESM:

import matter = require('gray-matter');
// OR
import * as matter from 'gray-matter';

But you have to use @11ty/gray-matter for it to work. Smallest PR in the universe.
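
Concretely, the only thing that changes in the script is the import; the default export and the API stay the same:

import matter from "@11ty/gray-matter";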

Alright. Phew.

Now to the main attraction: Writing hashes to their original files. Easy peasy.

//...
	writeHashesToFile(gitLog, paths[0]);

	console.log("Bye...");
}

//...

function writeHashesToFile(gitLog, path) {
	//...
	const markdownContent = fs.readFileSync(path, "utf-8");
	//...

	fs.writeFileSync(path, matter.stringify(parsed.content, parsed.data));
}

execute();

Alright. This went better than expected. I guess we just do the whole multiple file paths thing as well then? This means looping over the paths array instead of just handling the first one.
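
A sketch of that, reusing the existing functions as they are:

function execute() {
	if (!isTesting) {
		console.log("This script is only for testing purposes.");
		return;
	}
	// handle every path passed on the command line, not just paths[0]
	for (const path of paths) {
		const gitLog = getCommitHashes(path);
		console.log(`Commits for post with path ${path}:\n`, gitLog);
		writeHashesToFile(gitLog, path);
	}
	console.log("Bye...");
}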

Well... it was going too smoothly. I misunderstood how array merging in JS works: spreading two arrays just concatenates them, so duplicate hashes are not filtered out.

hashes:
  - 9402b41142602cd4a5f658646cf26944d7c7c04e
  - 10e0905f451c6a6e68ef6e0ea8146256bfb00e88
  - 9402b41142602cd4a5f658646cf26944d7c7c04e
  - 10e0905f451c6a6e68ef6e0ea8146256bfb00e88

Example:

const args = ['--notMe', 'yes', 'shared'];
const newArgs = ['alsoThis', '--meNotAlso', 'shared'];
const merged = [...args, ...newArgs];
const concat = args.concat(newArgs);
console.log(merged)
console.log(concat)

Both merged and concat print ['--notMe', 'yes', 'shared', 'alsoThis', '--meNotAlso', 'shared'], which is bad. But using a Set works beautifully:

const seted = [...new Set([...args, ...newArgs])]
console.log(seted) // [ '--notMe', 'yes', 'shared', 'alsoThis', '--meNotAlso']
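
Applied to the script, the merge line in writeHashesToFile becomes roughly:

	// merge existing and freshly collected hashes, deduplicated via a Set
	parsed.data.hashes = [...new Set([...(parsed.data.hashes || []), ...commitHashes])];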

Great. This way we can already feed the script an arbitrary number of file paths, and the corresponding files are updated with their commit hashes in a deduplicated way.

Next, I want to work on the --worktree feature, which is to say the feature that will check the current worktree for changed (markdown) files and add hashes to them.

...Except that I don't. What I want to do is write some tests, as this script actually makes sense to test. Any time I have to interact with the ecosystem, questions arise. For example, I was unsure how pnpm actually handles the setup I have: what if I don't want to manually maintain all the versions in the individual package.jsons? (I had done exactly that until now.) It turns out it's as easy as this:

Root package.json

	//...
	"devDependencies": {
		"vitest": "3.0.8",
	//...

Sub package.json

	//...
	"devDependencies": {
		"vitest": "workspace:*",
	//...

Except that didn't work! I guess it only works for packages within the monorepo itself. But thankfully the other option, using what they call a catalog, is not much harder to use.
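
If I understand the pnpm docs correctly, workspace:* is the workspace protocol: it resolves to a package that lives inside the same workspace, e.g. (my-shared-lib being a made-up internal package):

	"dependencies": {
		"my-shared-lib": "workspace:*"
	}

It simply has nothing to say about third-party packages like vitest.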

So instead of doing things directly in the package.json, we do it in the pnpm-workspace.yaml:

# ...
catalog:
  vitest: "3.0.8"
  "@11ty/gray-matter": "^2.0.0"
# ...

And then we use this in the package.json instead of an explicit version range:

	//...
	"devDependencies": {
		"vitest": "catalog:",
		"@11ty/gray-matter": "catalog:",
	//...

If I want to update my versions, I just do it in the catalog instead of in each package.json. Works for me.[1]

Anyway. We now have vitest available to us. Time to set it up! If I recall correctly, JS tests live next to the actual file so as to avoid a duplicated folder hierarchy. So there isn't really anything to set up per se, except adding a script to the package.json to run the tests.

//...
	"scripts": {
		//...
		"test:script": "vitest scripts/post-versions.test.js"
	},
//...
And here is the test file itself, scripts/post-versions.test.js:

import { test, expect } from "vitest";
import { isDir } from "./post-versions";

test("it return true", () => {
	expect(true).toBe(true);
});

test("isDir function works", () => {
	const testDir = "scripts/test";
	const testFile = "scripts/test/test-post-versions.md";

	expect(isDir(testDir)).toBe(true);
	expect(isDir(testFile)).toBe(false);
});
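
One detail that is easy to miss: for the import { isDir } at the top of the test to work, the script has to export the function. As far as the test is concerned, that just means putting export in front of the existing helper:

export function isDir(path) {
	return fs.statSync(path).isDirectory();
}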
Running it:

pnpm run test:script

> blog-monorepo@0.1.0 test:script /Users/martinhahnel/Local/code/blog-monorepo
> vitest scripts/post-versions.test.js


 DEV  v3.0.8 /Users/martinhahnel/Local/code/blog-monorepo

stdout | scripts/post-versions.test.js
Script is running in debug mode. More verbose output enabled.

stderr | scripts/post-versions.test.js
Please provide the path to the post file as an argument.

 ✓ scripts/post-versions.test.js (2 tests) 1ms
   ✓ it return true
   ✓ isDir function works

 Test Files  1 passed (1)
      Tests  2 passed (2)
   Start at  20:42:56
   Duration  281ms (transform 17ms, setup 0ms, collect 58ms, tests 1ms, environment 0ms, prepare 45ms)

 PASS  Waiting for file changes...
       press h to show help, press q to quit

Works! Alright, that's it for today!


  1. They also mentioned this thing in their documentation; I'll have to try it sometime to move things to the catalog in a semi-automated fashion. ↩︎