Docker does not solve the abandoned-template / dependency problem; it does not solve any of our problems here, and it only adds unnecessary steps to the deployment process.
You deploy a statically generated site by building the site with the generator ahead of time and serving the files it produces. This is literally the simplest deployment you could ever have; there is nowhere for Docker in this process. `hugo serve` or `npm run start` is only for development: it is shorthand for build-and-host, and it depends only on the Node package configuration, whose versions and checksums (if the lockfile is checked in) are well defined in the package management config. It does not depend on any system environment other than the Node version.
See: Static site generation (SSG) | Docusaurus
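To make "build ahead of time and serve the files" concrete, here is a minimal sketch of a deploy workflow, assuming GitHub Pages as the host and an npm-based generator (e.g. Docusaurus) whose `npm run build` writes to `./build`. The branch name, output path, Node version, and action versions are assumptions for illustration, not a prescription:

```yaml
# Sketch: build the static site in CI and publish the generated files.
# Assumes GitHub Pages and an npm-based generator writing to ./build.
name: deploy-site
on:
  push:
    branches: [main]
permissions:
  pages: write
  id-token: write
jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    environment:
      name: github-pages
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20        # the only environment pin we actually need
      - run: npm ci               # exact versions come from the checked-in lockfile
      - run: npm run build        # generate the static files ahead of time
      - uses: actions/upload-pages-artifact@v3
        with:
          path: build             # the static output is the whole deliverable
      - uses: actions/deploy-pages@v4
```

Note that the only things pinned here are the Node version and the lockfile; there is no container image anywhere in the picture.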
The only other place you would be using Docker would be to build, which is part of development and shares the same environment as `hugo serve`.
The purpose of your proposal, as I understand it, is that if we freeze the build environment, we will not need to worry about dependencies breaking when we don't update them.
That doesn't make sense: all dependency versions and their checksums are already defined in the package management config, and if someone pulls a transitive dependency off npm, Docker won't save us without some really ugly magic. So I will presume you mean freezing Node versions + Node dependencies here.
Regardless, this is the wrong approach. The npm ecosystem being what it is, there are good reasons packages and Node versions stop being supported after a few years and projects stop building. npm projects, through transitive dependencies, pull in A LOT of packages; their dependency philosophy is almost the polar opposite of C++'s (check out is-even). Some of these dependencies have security issues, some of them depend on libraries with security issues, and some of them depend on libraries that depend on libraries that depend on libraries with security issues.

Simply putting the build process into a Docker container and freezing everything does not resolve any of that. If we cannot build our website, not being able to build should be the least of our concerns: it probably means our dependency chain is way out of date, which for npm might only mean a year or two, but there may still be holes somewhere in that chain. Packages with security issues may end up served to our end users, and there is a well-documented history of people slipping nasty stuff like crypto miners into widely used transitive dependencies. In npm's ecosystem, the only effective way to mitigate these dependency-chain issues is to stay up to date, and to find alternatives when something goes unmaintained.
This is why I advocate depending on well-maintained libraries with a good history or backing: ideally they keep up with their sub-dependencies indefinitely, and the only thing we need to do is track their latest releases. For that we can simply set up a CI job that checks whether a change breaks the build, and a Dependabot config that pushes upstream updates to us (sketched below).
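For illustration, this is roughly what that setup could look like on a GitHub-hosted repo; the file paths, schedule, and Node version are assumptions:

```yaml
# .github/dependabot.yml — sketch: have Dependabot open periodic PRs
# for npm dependency updates (and for the CI actions themselves).
version: 2
updates:
  - package-ecosystem: "npm"
    directory: "/"
    schedule:
      interval: "weekly"
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"
```

```yaml
# .github/workflows/build-check.yml — sketch: every PR (including
# Dependabot's) must still build before it can be merged.
name: build-check
on: [pull_request]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci          # install exactly what the lockfile pins
      - run: npm run build   # fail the PR if the update breaks the build
```

With branch protection requiring that check, an update that breaks the build simply cannot be merged, and the fix happens while the breakage is still small.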
I don't see or recall anyone in web dev building static websites inside Docker containers "for development". Fancy static-site hosts don't even let you configure the underlying build image (see Vercel, Netlify), and they do a fresh install before every build by default.
This Docker approach might be a passable patch for a personal blog, but as an org I don't think we should dismiss dependency updates.
In C++ terms, you are proposing to run the CMake configure and build steps for a deliverable inside a Docker container, so that the build won't fail when gcc-3 is no longer included by default in everyone's Ubuntu environment. The solution is not to freeze the environment at Ubuntu 12.04 + gcc-3 so that it still builds; it is to keep your toolchain reasonably updated.
Again, I want to remind everyone: we are not shipping a dynamic web app. We are shipping static files. There is no good reason we will ever need a dynamic website, and none of our options are based on dynamic sites or even Next.js. There are established tools and platforms for static-site building that are basically free of charge at our usage level; this is very much a solved problem, and we are getting the simplest deployment we could have. Whatever comes with the example repos or gets generated by the CLIs is usually good enough and the simplest option. We just need to make sure we rely on a well-maintained, long-running generator and template.
Docker is also not lightweight on the developer's side (it requires pulling an entire OS image and, on macOS and Windows, spinning up a virtual machine, all for putting a blog out?), and version pinning + a consistent environment is already provided by npm. Docker is neither the simplest nor the right solution. There are already well-established tools for the right alternative: Dependabot to track dependency updates for us with periodic PRs, CI to check mergeability and ensure updates do not break the build, and a reliable, well-maintained upstream engine. When something does break, the quicker we resolve the breaking change, either by fixing it or by finding an alternative, the less friction there is in keeping the website up to date.