Currently, I can’t find a way to build a site that contains badly formed external links. I thought `skip_prefixes` would help, but the prefixes are applied after the URLs are parsed, and URL parsing errors cause a panic (via anyhow’s `bail!`).
The context is that my content set is rather large (over 130,000 pages) and contains over 600 badly formed external links (for example: `[link](TESTSITE)`). Fixing all the bad links is important, but not urgent, so I’m looking for a solution that lets me successfully build a site with badly formed external links, so I can continue to make progress on templates, taxonomies, etc.
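For context, this is roughly the configuration I would like to be able to lean on while the content gets cleaned up; the `TESTSITE` prefix here is just a stand-in for whatever the malformed targets actually start with:

```toml
[link_checker]
# What I'm hoping for: links whose target starts with one of these prefixes
# are skipped before Zola ever tries to parse them as URLs.
skip_prefixes = [
    "TESTSITE",
]
```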
So far I have two branches exploring two different solutions.
- print an error for badly formed URLs, but don’t panic (i.e., change `bail!` to `anyhow!` in `fn get_link_domain`)
- apply `skip_prefixes` before attempting to parse external URLs (in `fn check_external_links`); a rough sketch of both options follows below
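To make the comparison concrete, here is a minimal, self-contained sketch of both options. It is not Zola’s actual code; it just assumes a `get_link_domain`-style helper built on the `url` and `anyhow` crates, which is roughly the area the two branches touch:

```rust
use anyhow::{anyhow, Result};
use url::Url;

// Solution 1: parsing failures become recoverable errors that the caller can
// log as warnings instead of aborting the build.
fn get_link_domain(link: &str) -> Result<String> {
    let parsed = Url::parse(link).map_err(|e| anyhow!("could not parse `{link}`: {e}"))?;
    parsed
        .domain()
        .map(str::to_string)
        .ok_or_else(|| anyhow!("`{link}` has no domain"))
}

// Solution 2: honour `skip_prefixes` before any parsing happens, so a skipped
// link never reaches `Url::parse` at all. `None` means "skipped on purpose".
fn check_external_link(link: &str, skip_prefixes: &[String]) -> Option<Result<String>> {
    if let Some(prefix) = skip_prefixes.iter().find(|p| link.starts_with(p.as_str())) {
        println!(" ➡️ skipping: matched {prefix}");
        return None;
    }
    Some(get_link_domain(link))
}

fn main() {
    let skip_prefixes = vec!["TESTSITE".to_string()];
    for link in ["https://example.com/page", "TESTSITE", "not a url"] {
        match check_external_link(link, &skip_prefixes) {
            None => {}                                 // skipped on purpose
            Some(Ok(domain)) => println!("ok: {domain}"),
            Some(Err(e)) => eprintln!("warning: {e}"), // logged, build continues
        }
    }
}
```

The `Option<Result<_>>` return is just one way to separate “skipped on purpose” from “parsed fine” and “badly formed”; the open question is which of those last two buckets a matching prefix should really fall into.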
It may come down to how opinionated Zola wants to be about badly formed external links. Solution 1 would repeatedly confront the user with the presence of badly formed links, whereas solution 2 would take the user’s `skip_prefixes` preference to heart, printing `" ➡️ skipping: matched {prefix}"` but not flagging the link as badly formed.
I’m wondering if anyone has an immediate reaction or preference to either of those solutions, or ideas for a different solution. Thanks!