:blue_book: A static-site generator made in Node.js.
static-site-express is a simple Node.js based static-site generator that uses EJS and Markdown. Deploy your static site to Netlify or any platform of your liking. It is suited for landing pages, portfolios, blogs, documentation, and hobby projects.
I created a “Barebone” theme (previously on the `starter/barebone` branch) without Tailwind CSS and Flowbite UI, but with SASS support and some basic styling. It was a huge mistake to depend on any CSS framework, so this theme became the default on the `master` branch.
The old master branch is now available as `deprecated-tailwind`, and I have discontinued its development.
You can create your own repository from the `master` branch with the GitHub CLI: `gh repo create your-username/new-repo -p webandras/static-site-express`
Snipcart: in `src/layouts/partials/scripts.ejs`, add your public API key to the `publicApiKey` property value:

```html
<div id="snipcart" data-config-modal-style="side" data-api-key="YOUR_PUBLIC_TEST_API_KEY" hidden></div>
```

Note: this API key is public and can be committed to version control. There is also a private key, but that should never be committed.
Snipcart is more than a simple cart: enjoy a full back-office management dashboard to track
abandoned carts, sales, orders, customers and more.
Disclaimer: I am not affiliated with Snipcart in any way.
Note: Netlify builds your site from the default branch (usually `master`). You can use a branch other than the default, but in that case Decap CMS (previously: Netlify CMS) will not work properly. For example, the images uploaded through the CMS will be pushed to the default branch, not the one you set up in Netlify!
Test website: use the ‘Deploy to Netlify’ button at the project's website to create a test website.
First, install or update the npm packages. Second, create a `.env` file (see `.env.example`) and set the variables.
If you want to use Algolia Search, you need to register and generate your API credentials. If you don't want to use Algolia, set `enableSearch` to false in `config/site.config.js`.
Check out all the settings in `site.config.js`; the comments there provide additional information.
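For orientation, here is a minimal sketch of how these settings can fit together in `config/site.config.js`. Only `enableSearch` and the Algolia environment variable names appear in this README; the surrounding structure is an assumption.

```js
// config/site.config.js (illustrative sketch, not the actual file)
require('dotenv').config()

module.exports = {
  site: {
    title: 'My static site',  // placeholder values
    author: 'Your Name',
    lang: 'en',
    enableSearch: false,      // set to true only if you have Algolia credentials
  },
  // Credentials are read from the .env file (see .env.example); the nesting
  // under `algolia` is an assumption, the variable names are from this README.
  algolia: {
    appId: process.env.ALGOLIA_APP_ID,
    adminKey: process.env.ALGOLIA_ADMIN_KEY,
    index: process.env.ALGOLIA_INDEX,
  },
}
```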
Use the npm scripts defined in `package.json`.
Note: on Windows, you can't use the bash scripts located in the `bin` folder; use the corresponding npm scripts in `package.json` instead.
Build the site from `./content` into the `./public` folder (in watch mode):
`bin/watch`, or `npm run watch-chokidar`, or `npm run watch-nodemon`
The `bin/watch` bash script calls `npm run watch-chokidar`; alternatively, you can use `npm run watch-nodemon`.
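To give a rough idea of what the chokidar-based watch does (the real script ships with the repository; this sketch only demonstrates the chokidar API and is not the actual implementation):

```js
// Illustrative sketch: rebuild whenever something under ./content changes
const chokidar = require('chokidar')
const { exec } = require('child_process')

chokidar
  .watch('./content', { ignoreInitial: true })
  .on('all', (event, filePath) => {
    console.log(`${event}: ${filePath}, rebuilding...`)
    exec('npm run build', (err, stdout, stderr) => {
      if (err) console.error(stderr)
      else console.log(stdout)
    })
  })
```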
If you modify `site.config.js`, restart `bin/watch` (or the corresponding npm script in `package.json`) to apply the changes you have made.
For local development, make sure you set the mode to “development”!
Generate the JS and CSS bundles as well (in `--watch` mode): `bin/webpack` (`npm run webpack-watch`).
Serve the site at `localhost:4000` (or the port you set in `.env`; the default port is 4000) (legacy): `bin/serve`, or `npm run serve`.
TODO: the Express dev server occasionally crashes when it cannot find a file generated by the builder. The files and folders in the `public` folder are deleted and re-copied, so for a brief moment a requested `.html` file may not be available to the Express server. However, the site builder generates everything within a few hundred milliseconds (generally less than 300 ms), so this error happens rarely.
It is recommended to switch to browser-sync for live reloading in the browser when files change. The issue above disappears if you use it:
`bin/liveserver`, or `npm run liveserver`, or run:
`browser-sync start --server 'public' --files 'public'`
Run the `bin/webpack` (`npm run webpack`) watcher script to make sure the JS and CSS bundles are recreated after file changes.
If you don't see your changes, restart:
- `bin/watch` (`npm run watch-chokidar`, or `npm run watch-nodemon`),
- `bin/webpack` (`npm run webpack`),
- `bin/liveserver` (`npm run liveserver`).
Make sure to build the live bundle in production mode.
Check out the `bin` folder and the `package.json` file to see the available scripts.
The JavaScript source is in the `app/` folder. Generally, you only need to modify the `core/generator.js` and `core/methods.js` files.
- `methods.js` contains most of the methods for the generator.
- In `generator.js`, you can modify the pages you want to generate in the switch statements starting from line 280.
- To add a page, create a new page file (`.ejs`) in the `pages/` folder and a template (in `layouts/`) to be used for that page (the default template is `default.ejs`), then add the page to the `templateConfig` object literal (`generator.js`); see the sketch after this list.
- After the changes, restart the build/watch scripts. This process is suboptimal, but currently this is the workflow.
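Purely as an illustration of that workflow (the real shapes of the switch statement and the `templateConfig` object live in `app/core/generator.js` and are not shown in this README, so every name below is an assumption):

```js
// Hypothetical sketch, NOT the actual generator.js code.
const templateConfig = {
  about: {
    template: 'default.ejs',    // layout from layouts/
    source: 'pages/about.ejs',  // page source in pages/
    destination: 'about/index.html',
  },
}

// Stub standing in for the real EJS rendering method (see core/methods.js)
function buildPage(entry) {
  console.log(`render ${entry.source} with ${entry.template} -> ${entry.destination}`)
}

// A new page would get its own case in the page-generating switch statement
function renderPage(pageName) {
  switch (pageName) {
    case 'about':
      buildPage(templateConfig.about)
      break
    default:
      console.warn(`unknown page: ${pageName}`)
  }
}

renderPage('about')
```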
The site content is in the `content/` folder. Posts are Markdown files (in `posts/`) where the front matter block contains the post properties (you can extend these in the `templateConfig` object literal (`generator.js`) as well). Pages (`pages/`) use templates and partials defined in the `layouts/` folder.
The `config/site.config.js` file contains some of the global site properties (like the site title, author, description, and so on).
The `netlify.toml` configuration file contains important properties:

```toml
[build]
base = "/"
publish = "public"
command = "npm run build"
```
These are the base path, the build command, and the “publish” directory. You can keep those settings unchanged.
You can also define some post-processing actions here, to be run in Netlify's post-processing stages, for example as part of its CI/CD pipeline.
In the optional `_headers` file you can specify the HTTP headers and set Content Security Policy (CSP) rules for the Netlify server. Currently, the CSP rules are commented out. You can also specify these in `netlify.toml`.
The `_redirects` file is currently empty. When you have a custom domain, you can add a redirect there from the .netlify.com subdomain to your custom domain.
The `robots.txt` default settings:

```txt
# Disallow admin page
User-agent: *
Disallow: /admin/
# Disallow message-sent page
User-agent: *
Disallow: /message-sent/
# Rule 3
User-agent: *
Allow: /
```
For Google Search Console verification, you should have an HTML file from Google included in the root of your Netlify publish folder (in our case, `public`). The build script copies this file from `./content` to `./public`.
Add the filename to the `filesToCopy` array at line 100 in `./app/core/generator.js` and restart the watch script!
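For orientation only (the actual contents of the array are not shown in this README, so every entry except the verification file is a guess, and the filename itself is a placeholder):

```js
// ./app/core/generator.js (sketch): files copied as-is from ./content to ./public
const filesToCopy = [
  'robots.txt',                  // assumed entry
  '_headers',                    // assumed entry
  '_redirects',                  // assumed entry
  'google1234567890abcdef.html', // your Search Console verification file (placeholder)
]
```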
Netlify builds your website with its buildbot. It starts a Docker container running the Netlify build image.
TL;DR: Netlify installs a lot of packages (copies files over) to be able to run your favorite tool to build your static website, and this is done in a Docker container. Read the overview section of the Docker docs: https://docs.docker.com/get-started/
A Docker container is basically a writable OverlayFS (FS = filesystem) layer created on top of the numerous read-only OverlayFS layers of the Docker image (files copied on top of each other: each layer represents a command in the Dockerfile). The container layer is destroyed after the build has completed; however, data can be made permanent using volumes, which are kept.
Images are based on base images (the FROM statement on the first line of a Dockerfile), which are special distributions that “think they are operating systems” but are more lightweight than a complete OS. Alpine Linux is the most lightweight of them (around 5 MB). It is interesting to note that images can be built from scratch as well (scratch is a reserved image that is empty and thus does nothing); the base images are built this way (“FROM scratch”).
Docker containers use the kernel and, obviously, the resources of the host (which are shared), and are meant for process isolation only. Containers are more lightweight and don't have the overhead that virtual machines do. More about this topic.
VMs are used for full isolation, including resources (for example, to subdivide server resources for shared hosting: each hosting account gets X CPUs of a given type, X GB of memory, and X GB of storage space), and they have a separate (full) OS installed alongside the host OS, so they do not share the kernel.
If you use Windows, you need to install Windows Subsystem for Linux 2 (WSL2) to have a (not fully featured) Linux-kernel-based distro installed (Ubuntu is used most of the time) that runs as a regular application. There are also container base images available for Windows, so Docker can even use the Windows kernel now (for specific images).
Lots of images are pre-built for us (like the `netlify/build` image) and stored in the Docker registry (not Docker Hub, since that is just a user interface). You don't need to build them from a Dockerfile; you just download them from the registry.
If you know the Docker basics, you can understand some things about Netlify as well.
Check these shell scripts out:
When the Docker container fires up, this script runs:
https://github.com/netlify/build-image/blob/focal/run-build.sh
This is the Dockerfile from which the Netlify image is built (currently based on `ubuntu:20.04`; older Ubuntu base images, like 16.04, are deprecated now):
https://github.com/netlify/build-image/blob/focal/Dockerfile
Netlify automatically discovers the contact form via custom netlify attributes added to the form. A bot field is present
in the form to protect against spam bots. Netlify has a first-class spam filter.
These are the key parts in the code for Algolia:
```js
const algoliasearch = require("algoliasearch");
const client = algoliasearch(process.env.ALGOLIA_APP_ID, process.env.ALGOLIA_ADMIN_KEY);
const index = client.initIndex(process.env.ALGOLIA_INDEX);
```
Here, I use the AlgoliaSearch client library to send a request to update and/or create records for the posts:
```js
index.partialUpdateObjects(searchIndexData, {
  createIfNotExists: true,
});
```
This is currently the structure of the search index (as a default example):
```js
searchIndexData.push({
/**
* The object's unique identifier
*/
objectID: postData.attributes.date,
/**
* The URL where the Algolia Crawler found the record
*/
url: canonicalUrl,
/**
* The lang of the page
* - html[attr=lang]
*/
lang: config.site.lang,
/**
* The title of the page
* - og:title
* - head > title
*/
title: postData.attributes.title,
/**
* The description of the page
* - meta[name=description]
* - meta[property="og:description"]
*/
description: postData.attributes.excerpt,
/**
* The image of the page
* - meta[property="og:image"]
*/
image: config.site.seoUrl + "/assets/images/uploads/" + postData.attributes.coverImage,
/**
* The authors of the page
* - `author` field of JSON-LD Article object: https://schema.org/Article
* - meta[property="article:author"]
*/
authors: [config.site.author],
/**
* The publish date of the page
* - `datePublished` field of JSON-LD Article object: https://schema.org/Article
* - meta[property="article:published_time"]
*/
datePublished: postData.attributes.date,
/**
* The category of the page
* - meta[property="article:section"
* - meta[property="product:category"]
*/
category: postData.attributes.topic || "",
/**
* The content of your page
*/
content: postContents,
});
```
Note: currently the `objectID` is the post publish date (like “2022-08-17”). It might be better to change it to the whole slug so that it is completely unique; right now you can't have two posts on the same day.
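One possible direction, sketched under the assumption that only the front-matter fields used above (`date`, `title`) are available; the `slugify` helper is hypothetical:

```js
// Hypothetical sketch: derive a unique objectID from the date plus a slugified title
const slugify = (text) =>
  text.toLowerCase().trim().replace(/[^a-z0-9]+/g, '-').replace(/(^-|-$)/g, '')

const postData = {
  attributes: { date: '2022-08-17', title: 'Hello World, Again!' }, // sample data
}

const objectID = `${postData.attributes.date}-${slugify(postData.attributes.title)}`
console.log(objectID) // "2022-08-17-hello-world-again"
```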
Translations are provided by the i18next package; see `assets/js/main.js`. Remove this feature if you don't need it. Some texts (like the ones coming from the config) cannot be made translatable, so it is probably not a good solution; the code is left there mainly for reference.
The translations come from `content/lang/translations.json`.
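For context, a minimal sketch of how i18next can consume such a JSON file (this is not the code from `assets/js/main.js`; the resource shape and the `data-i18n` attribute are assumptions):

```js
// Hypothetical browser-side sketch, not the project's actual main.js
import i18next from 'i18next'
import translations from '../../content/lang/translations.json'

i18next
  .init({
    lng: document.documentElement.lang || 'en',
    // Expected shape (assumption): { en: { translation: { key: 'text' } }, ... }
    resources: translations,
  })
  .then((t) => {
    // Swap out the text of every element that declares a data-i18n key
    document.querySelectorAll('[data-i18n]').forEach((el) => {
      el.textContent = t(el.dataset.i18n)
    })
  })
```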
Credits: © Michael Lee (GitHub, his website/blog).
When I started my journey as a web developer, I began using Jekyll for my simple websites after reading some of Michael Lee's articles about it. He has a great starter for Jekyll, the Jekyll ⍺.
The data comes from `content/data/opening-hours.yml`. It can be edited from Decap CMS as well.
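As an illustration of how such a YAML data file can be read at build time (the generator's real loading code is not shown in this README, so the js-yaml approach below is only an assumption):

```js
// Hypothetical sketch: read the opening-hours data during the build
const fs = require('fs')
const yaml = require('js-yaml')

const openingHours = yaml.load(
  fs.readFileSync('./content/data/opening-hours.yml', 'utf8')
)
console.log(openingHours) // a plain JS object/array, ready to pass to an EJS template
```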
I created the “Barebone” theme (branch: `starter/barebone`) without Tailwind CSS, with SASS support and some basic styling (nothing has changed in the `app` folder). It was a mistake to be dependent on one CSS framework; choose whatever you like for Barebone.
Add browser-sync (`bin/livereload`) with a local server to refresh the page after file changes. You can use `bin/livereload` instead of `bin/serve`; the old Express local server is not removed.
The project is now in a mature state: there will be no more refactoring, only bug fixes and occasional improvements of features. No breaking changes.
I don't want to be part of the “rewrite culture”. Also, I am not a fan of npm any more; I use as few packages as possible. Having thousands of interdependent packages (with all the regular rewrites, and security issues as well) is dependency hell.
Delete: the `site-generator` folder, `bin/generate`, and the related npm scripts are removed.
Update: the `app` folder is now the source that you should edit; there is no need to rebuild all the time with Babel.
New feature:
Fix:
Security:
Update:
This is intended to be the last major version release.
New: `.env.example`, remove `.env`, change `.env` vars; `netlify.toml` configuration; `netlify.toml` with whitelisting Netlify-specific domains.
Update: `v16.14.0`; `v16.14.0`; `content/` folder.
Delete:
New/Update/Delete:
Fix incorrect configuration in `docker-compose.yml`.
Correct EJS syntax error after the EJS version update: `<%- include ('partial/element-name') %>`. This is a breaking change; you should update your partials/templates!
Update build and watch scripts (using chokidar): changes in `generator.js` to be used in a build and in the chokidar-based watch. In 2019, chokidar was not watching file changes properly, so the npm script was named “watch-exp”; the default watch script was using nodemon.
Add Flow types support and re-structure folders: the source moves to `src/`, and Babel transpiles it into the `lib/` folder; the website content moves to the `website/` folder, and the necessary code changes are applied.
Refactor site generator, code improvements, config changes: in `package.json`, add the dotenv package and update the npm scripts.
Dockerize the project: add a `.env` file and a `.dockerignore`.
Bug: problem with an existing folder.
The build script should always delete the folders inside the `public` folder. However, the `assets` folder is sometimes not deleted, so an exception occurs:
`[Error: EEXIST: file already exists, mkdir './public/assets']`
nodemon does not trigger a re-build on Linux on file changes (this behavior was experienced on Ubuntu 18.04 LTS Bionic); the `npm run watch-exp` command, which uses chokidar, can be used instead.
If you have a problem or a question about static-site-express, open an issue here.
The idea of using a Node.js static site generator came from this good article by Douglas Matoso (not accessible any more): Build a static site generator in 40 lines with Node.js.
This package uses some modified code parts from doug2k1/nanogen (mainly from the `legacy` branch, and some ideas from the `master` branch; MIT © Douglas Matoso 2018).
MIT licence - Copyright © 2018-2024 András Gulácsi.