Could Zola generate static pages at unique URLs based on data loaded through load_data?
I’d love to use Zola to build a website based on the output from my BookShelf app, but I don’t want to create a Markdown file by hand for every entry. I can export the data as CSV/TSV and parse it with load_data, but that’s only useful for a landing page showing all the entries. It would be amazing if Zola could also generate a page for every item by looping through the results of load_data.
I realise this is unusual and likely not possible with how Zola works, but just thought I’d throw it out there… Thanks for your consideration.
As the requirements may change, I don’t think this should be included in Zola.
What you could use instead is a custom script that generates the Markdown files and runs before you build the site.
I don’t want to share mine, but I’ve been doing exactly that for one of my sites.
Here is an example of how you could do this in Python:
```python
import csv

# Open the CSV file
with open('data.csv', 'r') as file:
    # Create a CSV reader
    reader = csv.reader(file)
    # Iterate over the rows in the CSV file
    for row in reader:
        # Create a Markdown file for each row
        with open(f'{row[0]}.md', 'w') as markdown_file:
            # Write the Markdown file using the row data and a template
            markdown_file.write(f'# {row[0]}\n\nDescription: {row[1]}')
```
This script opens data.csv, reads it row by row, and creates a Markdown file for each row. Each file is named after the first column of the row, and its content is generated from the row data and a simple template.
You can customize the template and the data that you include in the Markdown file by modifying the markdown_file.write line. For example, you could include additional data from the row or use a more complex template with multiple placeholders.
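For instance, here is a rough sketch of a variant that writes the TOML front matter Zola expects and uses csv.DictReader; the column names (title, description) and the content/books/ output directory are just assumptions about what your export might look like:

```python
import csv

# Variant of the script above: writes Zola-style TOML front matter.
# The column names ("title", "description") and the content/books/
# output directory are assumptions -- adjust them to your CSV export.
with open('data.csv', newline='') as file:
    reader = csv.DictReader(file)
    for row in reader:
        # Derive a simple slug from the title for the file name
        slug = row['title'].lower().replace(' ', '-')
        with open(f"content/books/{slug}.md", 'w') as markdown_file:
            markdown_file.write(
                '+++\n'
                f'title = "{row["title"]}"\n'
                f'description = "{row["description"]}"\n'
                '+++\n\n'
                f'{row["description"]}\n'
            )
```

Run it right before `zola build` and the generated files are picked up like any hand-written content.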
You could use another language and/or add complexity to the script based on your requirements.
This would be incredibly useful, because it would allow “big” sites (in terms of number of pages) to move to a static site generator such as Zola. Here is a use case I have now for which I’m considering Zola vs Hugo:
- generate individual product pages (or PDFs) combining data fetched from external sources (e.g. API calls)
- generate the translations of each of those documents in different languages
Currently we have a thousand items in 10 languages, which is 10,000 pages, just for one type of content among others.
Conceptually it is not very different from the ability to create a list of taxonomy items (an index of all terms, an index page for each term’s usage, etc.).
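For what it’s worth, the kind of pre-build script we would otherwise have to maintain looks roughly like this; the API URL, JSON field names, and language list below are all made up for the sake of the sketch:

```python
import json
import urllib.request

# Rough sketch of a pre-build script for the product-pages use case.
# The API URL, JSON field names and language list are placeholders.
LANGUAGES = ['en', 'fr', 'de']  # ... up to 10 languages in practice

with urllib.request.urlopen('https://example.com/api/products') as response:
    products = json.load(response)

for product in products:
    for lang in LANGUAGES:
        # Zola's multilingual convention: page.md for the default language,
        # page.fr.md, page.de.md, ... for the other languages.
        suffix = '' if lang == 'en' else f'.{lang}'
        path = f"content/products/{product['id']}{suffix}.md"
        with open(path, 'w') as page:
            page.write(
                '+++\n'
                f"title = \"{product['name'][lang]}\"\n"
                '+++\n\n'
                f"{product['description'][lang]}\n"
            )
```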