+
+### Does this issue reproduce with the latest release?
diff --git a/.github/ISSUE_TEMPLATE/config.yml b/.github/ISSUE_TEMPLATE/config.yml
new file mode 100644
index 000000000..c84d3276b
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/config.yml
@@ -0,0 +1,5 @@
+blank_issues_enabled: false
+contact_links:
+ - name: SUPPORT, ISSUES and TROUBLESHOOTING
+ url: https://discourse.gohugo.io/
+ about: Please DO NOT use GitHub for support requests. Please visit https://discourse.gohugo.io for support! You will be helped much faster there. If you ignore this request, your issue might be closed with a discourse label.
diff --git a/.github/ISSUE_TEMPLATE/feature_request.md b/.github/ISSUE_TEMPLATE/feature_request.md
new file mode 100644
index 000000000..c114b3d7f
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/feature_request.md
@@ -0,0 +1,11 @@
+---
+name: Proposal
+about: Propose a new feature for Hugo
+title: ''
+labels: 'Proposal, NeedsTriage'
+assignees: ''
+
+---
+
+
+
\ No newline at end of file
diff --git a/.github/dependabot.yml b/.github/dependabot.yml
new file mode 100644
index 000000000..1801e72d9
--- /dev/null
+++ b/.github/dependabot.yml
@@ -0,0 +1,7 @@
+# See https://docs.github.com/en/github/administering-a-repository/configuration-options-for-dependency-updates#package-ecosystem
+version: 2
+updates:
+ - package-ecosystem: "gomod"
+ directory: "/"
+ schedule:
+ interval: "daily"
diff --git a/.github/stale.yml b/.github/stale.yml
deleted file mode 100644
index 692c59659..000000000
--- a/.github/stale.yml
+++ /dev/null
@@ -1,23 +0,0 @@
-# Number of days of inactivity before an issue becomes stale
-daysUntilStale: 120
-# Number of days of inactivity before a stale issue is closed
-daysUntilClose: 30
-# Issues with these labels will never be considered stale
-exemptLabels:
- - Keep
- - Security
-# Label to use when marking an issue as stale
-staleLabel: Stale
-# Comment to post when marking an issue as stale. Set to `false` to disable
-markComment: >
- This issue has been automatically marked as stale because it has not had
- recent activity. The resources of the Hugo team are limited, and so we are asking for your help.
-
- If this is a **bug** and you can still reproduce this error on the master branch, please reply with all of the information you have about it in order to keep the issue open.
-
- If this is a **feature request**, and you feel that it is still relevant and valuable, please tell us why.
-
- This issue will automatically be closed in the near future if no further activity occurs. Thank you for all your contributions.
-
-# Comment to post when closing a stale issue. Set to `false` to disable
-closeComment: false
diff --git a/.github/workflows/image.yml b/.github/workflows/image.yml
new file mode 100644
index 000000000..c4f3c34c3
--- /dev/null
+++ b/.github/workflows/image.yml
@@ -0,0 +1,49 @@
+name: Build Docker image
+
+on:
+ release:
+ types: [published]
+ pull_request:
+permissions:
+ packages: write
+
+env:
+ REGISTRY_IMAGE: ghcr.io/gohugoio/hugo
+
+jobs:
+ build:
+ runs-on: ubuntu-latest
+
+ steps:
+ - name: Checkout
+ uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332 # v4.1.7
+
+ - name: Docker meta
+ id: meta
+ uses: docker/metadata-action@8e5442c4ef9f78752691e2d8f8d19755c6f78e81 # v5.5.1
+ with:
+ images: ${{ env.REGISTRY_IMAGE }}
+
+ - name: Set up Docker Buildx
+ uses: docker/setup-buildx-action@988b5a0280414f521da01fcc63a27aeeb4b104db # v3.6.1
+
+ - name: Login to GHCR
+ # Login is only needed when the image is pushed
+ uses: docker/login-action@9780b0c442fbb1117ed29e0efdff1e18412f7567 # v3.3.0
+ with:
+ registry: ghcr.io
+ username: ${{ github.repository_owner }}
+ password: ${{ secrets.GITHUB_TOKEN }}
+
+ - name: Build and push
+ id: build
+ uses: docker/build-push-action@16ebe778df0e7752d2cfcbd924afdbbd89c1a755 # v6.6.1
+ with:
+ context: .
+ provenance: mode=max
+ sbom: true
+ push: ${{ github.event_name != 'pull_request' }}
+ platforms: linux/amd64,linux/arm64
+ tags: ${{ steps.meta.outputs.tags }}
+ labels: ${{ steps.meta.outputs.labels }}
+ build-args: HUGO_BUILD_TAGS=extended,withdeploy
\ No newline at end of file
diff --git a/.github/workflows/stale.yml b/.github/workflows/stale.yml
new file mode 100644
index 000000000..249c1ab54
--- /dev/null
+++ b/.github/workflows/stale.yml
@@ -0,0 +1,52 @@
+name: 'Close stale and lock closed issues and PRs'
+on:
+ workflow_dispatch:
+ schedule:
+ - cron: '30 1 * * *'
+permissions:
+ contents: read
+jobs:
+ stale:
+ permissions:
+ issues: write
+ pull-requests: write
+ runs-on: ubuntu-latest
+ steps:
+ - uses: dessant/lock-threads@7de207be1d3ce97a9abe6ff1306222982d1ca9f9 # v5.0.1
+ with:
+ issue-inactive-days: 21
+ add-issue-labels: 'Outdated'
+ issue-comment: >
+ This issue has been automatically locked since there
+ has not been any recent activity after it was closed.
+ Please open a new issue for related bugs.
+ pr-comment: >
+ This pull request has been automatically locked since there
+ has not been any recent activity after it was closed.
+ Please open a new issue for related bugs.
+ - uses: actions/stale@28ca1036281a5e5922ead5184a1bbf96e5fc984e # v9.0.0
+ with:
+ operations-per-run: 999
+ days-before-issue-stale: 365
+ days-before-pr-stale: 365
+ days-before-issue-close: 56
+ days-before-pr-close: 56
+ stale-issue-message: >
+ This issue has been automatically marked as stale because it has not had
+ recent activity. The resources of the Hugo team are limited, and so we are asking for your help.
+
+ If this is a **bug** and you can still reproduce this error on the master branch, please reply with all of the information you have about it in order to keep the issue open.
+
+ If this is a **feature request**, and you feel that it is still relevant and valuable, please tell us why.
+
+ This issue will automatically be closed in the near future if no further activity occurs. Thank you for all your contributions.
+ stale-pr-message: >
+ This PR has been automatically marked as stale because it has not had
+ recent activity. The resources of the Hugo team are limited, and so we are asking for your help.
+
+ Please check https://github.com/gohugoio/hugo/blob/master/CONTRIBUTING.md#code-contribution and verify that this code contribution fits with the description. If yes, tell us in a comment.
+
+ This PR will automatically be closed in the near future if no further activity occurs. Thank you for all your contributions.
+ stale-issue-label: 'Stale'
+ exempt-issue-labels: 'Keep,Security'
+ stale-pr-label: 'Stale'
+ exempt-pr-labels: 'Keep,Security'
diff --git a/.github/workflows/test.yml b/.github/workflows/test.yml
new file mode 100644
index 000000000..c49c12371
--- /dev/null
+++ b/.github/workflows/test.yml
@@ -0,0 +1,132 @@
+on:
+ push:
+ branches: [master]
+ pull_request:
+name: Test
+env:
+ GOPROXY: https://proxy.golang.org
+ GO111MODULE: on
+ SASS_VERSION: 1.80.3
+ DART_SASS_SHA_LINUX: 7c933edbad0a7d389192c5b79393485c088bd2c4398e32f5754c32af006a9ffd
+ DART_SASS_SHA_MACOS: 79e060b0e131c3bb3c16926bafc371dc33feab122bfa8c01aa337a072097967b
+ DART_SASS_SHA_WINDOWS: 0bc4708b37cd1bac4740e83ac5e3176e66b774f77fd5dd364da5b5cfc9bfb469
+permissions:
+ contents: read
+jobs:
+ test:
+ strategy:
+ matrix:
+ go-version: [1.23.x, 1.24.x]
+ os: [ubuntu-latest, windows-latest] # macos disabled for now because of disk space issues.
+ runs-on: ${{ matrix.os }}
+ steps:
+ - if: matrix.os == 'ubuntu-latest'
+ name: Free Disk Space (Ubuntu)
+ uses: jlumbroso/free-disk-space@54081f138730dfa15788a46383842cd2f914a1be # v1.3.1
+ with:
+ # Setting this to "true" frees about 6 GB,
+ # but it might remove tools that are actually needed.
+ tool-cache: false
+ android: true
+ dotnet: true
+ haskell: true
+ large-packages: true
+ docker-images: true
+ swap-storage: true
+ - name: Checkout code
+ uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332 # v4.1.7
+ - name: Install Go
+ uses: actions/setup-go@0a12ed9d6a96ab950c8f026ed9f722fe0da7ef32 # v5.0.2
+ with:
+ go-version: ${{ matrix.go-version }}
+ check-latest: true
+ cache: true
+ cache-dependency-path: |
+ **/go.sum
+ **/go.mod
+ - name: Install Ruby
+ uses: ruby/setup-ruby@a6e6f86333f0a2523ece813039b8b4be04560854 # v1.190.0
+ with:
+ ruby-version: "2.7"
+ bundler-cache: true #
+ - name: Install Python
+ uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f # v5.1.1
+ with:
+ python-version: "3.x"
+ - name: Install Mage
+ run: go install github.com/magefile/mage@v1.15.0
+ - name: Install asciidoctor
+ uses: reitzig/actions-asciidoctor@c642db5eedd1d729bb8c92034770d0b2f769eda6 # v2.0.2
+ - name: Install docutils
+ run: |
+ pip install docutils
+ rst2html --version
+ - if: matrix.os == 'ubuntu-latest'
+ name: Install pandoc on Linux
+ run: |
+ sudo apt-get update -y
+ sudo apt-get install -y pandoc
+ - if: matrix.os == 'macos-latest'
+ run: |
+ brew install pandoc
+ - if: matrix.os == 'windows-latest'
+ run: |
+ choco install pandoc
+ - run: pandoc -v
+ - if: matrix.os == 'windows-latest'
+ run: |
+ choco install mingw
+ - if: matrix.os == 'ubuntu-latest'
+ name: Install dart-sass Linux
+ run: |
+ echo "Install Dart Sass version ${SASS_VERSION} ..."
+ curl -LJO "https://github.com/sass/dart-sass/releases/download/${SASS_VERSION}/dart-sass-${SASS_VERSION}-linux-x64.tar.gz";
+ echo "${DART_SASS_SHA_LINUX} dart-sass-${SASS_VERSION}-linux-x64.tar.gz" | sha256sum -c;
+ tar -xvf "dart-sass-${SASS_VERSION}-linux-x64.tar.gz";
+ echo "$GOBIN"
+ echo "$GITHUB_WORKSPACE/dart-sass/" >> $GITHUB_PATH
+ - if: matrix.os == 'macos-latest'
+ name: Install dart-sass MacOS
+ run: |
+ echo "Install Dart Sass version ${SASS_VERSION} ..."
+ curl -LJO "https://github.com/sass/dart-sass/releases/download/${SASS_VERSION}/dart-sass-${SASS_VERSION}-macos-x64.tar.gz";
+ echo "${DART_SASS_SHA_MACOS} dart-sass-${SASS_VERSION}-macos-x64.tar.gz" | shasum -a 256 -c;
+ tar -xvf "dart-sass-${SASS_VERSION}-macos-x64.tar.gz";
+ echo "$GITHUB_WORKSPACE/dart-sass/" >> $GITHUB_PATH
+ - if: matrix.os == 'windows-latest'
+ name: Install dart-sass Windows
+ run: |
+ echo "Install Dart Sass version ${env:SASS_VERSION} ..."
+ curl -LJO "https://github.com/sass/dart-sass/releases/download/${env:SASS_VERSION}/dart-sass-${env:SASS_VERSION}-windows-x64.zip";
+ Expand-Archive -Path "dart-sass-${env:SASS_VERSION}-windows-x64.zip" -DestinationPath .;
+ echo "$env:GITHUB_WORKSPACE/dart-sass/" | Out-File -FilePath $Env:GITHUB_PATH -Encoding utf-8 -Append
+ - if: matrix.os == 'ubuntu-latest'
+ name: Install staticcheck
+ run: go install honnef.co/go/tools/cmd/staticcheck@latest
+ - if: matrix.os == 'ubuntu-latest'
+ name: Run staticcheck
+ run: staticcheck ./...
+ - if: matrix.os != 'windows-latest'
+ name: Check
+ run: |
+ sass --version;
+ mage -v check;
+ env:
+ HUGO_BUILD_TAGS: extended,withdeploy
+ - if: matrix.os == 'windows-latest'
+ # See issue #11052. We limit the build to regular test (no -race flag) on Windows for now.
+ name: Test
+ run: |
+ mage -v test;
+ env:
+ HUGO_BUILD_TAGS: extended,withdeploy
+ - name: Build tags
+ run: |
+ go install -tags extended
+ - if: matrix.os == 'ubuntu-latest'
+ name: Build for dragonfly
+ run: |
+ go install
+ env:
+ GOARCH: amd64
+ GOOS: dragonfly
diff --git a/.gitignore b/.gitignore
index e71fe6c00..ddad69611 100644
--- a/.gitignore
+++ b/.gitignore
@@ -1,25 +1,6 @@
-/hugo
-docs/public*
-/.idea
-hugo.exe
+
*.test
-*.prof
-nohup.out
-cover.out
-*.swp
-*.swo
-.DS_Store
-*~
-vendor/*/
-*.bench
-*.debug
-coverage*.out
-
-dock.sh
-
-GoBuilds
-dist
-
-resources/sunset.jpg
-
-vendor
\ No newline at end of file
+imports.*
+dist/
+public/
+.DS_Store
\ No newline at end of file
diff --git a/.travis.yml b/.travis.yml
deleted file mode 100644
index 2eaad6b5e..000000000
--- a/.travis.yml
+++ /dev/null
@@ -1,46 +0,0 @@
-language: go
-sudo: false
-dist: xenial
-env:
- global:
- - GOPROXY="https://proxy.golang.org"
- - HUGO_BUILD_TAGS="extended"
-git:
- depth: false
-go:
- - "1.11.10"
- - "1.12.5"
- - tip
-os:
- - linux
- - osx
- - windows
-matrix:
- allow_failures:
- - go: tip
- fast_finish: true
- exclude:
- - os: windows
- go: tip
-
-install:
- - mkdir -p $HOME/src
- - mv $TRAVIS_BUILD_DIR $HOME/src
- - export TRAVIS_BUILD_DIR=$HOME/src/hugo
- - cd $HOME/src/hugo
- - go get github.com/magefile/mage
-script:
- - go mod download || true
- - mage -v test
- - mage -v check
- - mage -v hugo
- - ./hugo -s docs/
- - ./hugo --renderToMemory -s docs/
- - df -h
-
-before_install:
- - df -h
- # https://travis-ci.community/t/go-cant-find-gcc-with-go1-11-1-on-windows/293/5
- - if [[ "$TRAVIS_OS_NAME" == "windows" ]]; then choco install mingw -y; export PATH=/c/tools/mingw64/bin:"$PATH"; fi
- - gem install asciidoctor
- - type asciidoctor
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index 124e5b754..ddd3efcf2 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -1,3 +1,5 @@
+>**Note:** We would appreciate it if you hold off on any big refactoring (like renaming deprecated Go packages), mainly because of the potential for extra merge work with changes coming in the near future.
+
# Contributing to Hugo
We welcome contributions to Hugo of any kind including documentation, themes,
@@ -29,12 +31,16 @@ Please don't use the GitHub issue tracker to ask questions.
## Reporting Issues
If you believe you have found a defect in Hugo or its documentation, use
-the GitHub [issue tracker](https://github.com/gohugoio/hugo/issues) to report
+the GitHub issue tracker to report
the problem to the Hugo maintainers. If you're not sure if it's a bug or not,
start by asking in the [discussion forum](https://discourse.gohugo.io).
When reporting the issue, please provide the version of Hugo in use (`hugo
version`) and your operating system.
+- [Hugo Issues · gohugoio/hugo](https://github.com/gohugoio/hugo/issues)
+- [Hugo Documentation Issues · gohugoio/hugoDocs](https://github.com/gohugoio/hugoDocs/issues)
+- [Hugo Website Theme Issues · gohugoio/hugoThemesSite](https://github.com/gohugoio/hugoThemesSite/issues)
+
## Code Contribution
Hugo has become a fully featured static site generator, so any new functionality must:
@@ -44,15 +50,15 @@ Hugo has become a fully featured static site generator, so any new functionality
* strive not to break existing sites.
* close or update an open [Hugo issue](https://github.com/gohugoio/hugo/issues)
-If it is of some complexity, the contributor is expected to maintain and support the new future (answer questions on the forum, fix any bugs etc.).
+If it is of some complexity, the contributor is expected to maintain and support the new feature in the future (answer questions on the forum, fix any bugs etc.).
-It is recommended to open up a discussion on the [Hugo Forum](https://discourse.gohugo.io/) to get feedback on your idea before you begin. If you are submitting a complex feature, create a small design proposal on the [Hugo issue tracker](https://github.com/gohugoio/hugo/issues) before you start.
+Any non-trivial code change needs to update an open [issue](https://github.com/gohugoio/hugo/issues). A non-trivial code change without a reference to an open issue labeled `bug` or `enhancement` will not be merged.
+Note that we do not accept new features that require [CGO](https://github.com/golang/go/wiki/cgo).
+We have one exception to this rule, which is LibSASS.
**Bug fixes are, of course, always welcome.**
-
-
## Submitting Patches
The Hugo project welcomes all contributors and contributions regardless of skill or experience level. If you are interested in helping with the project, we will help you with your contribution.
@@ -75,19 +81,23 @@ To make the contribution process as seamless as possible, we ask for the followi
### Git Commit Message Guidelines
-This [blog article](http://chris.beams.io/posts/git-commit/) is a good resource for learning how to write good commit messages,
+This [blog article](https://cbea.ms/git-commit/) is a good resource for learning how to write good commit messages,
the most important part being that each commit message should have a title/subject in imperative mood starting with a capital letter and no trailing period:
-*"Return error on wrong use of the Paginator"*, **NOT** *"returning some error."*
+*"js: Return error when option x is not set"*, **NOT** *"returning some error."*
+
+Most title/subjects should have a lower-cased prefix with a colon and one whitespace. The prefix can be:
+
+* The name of the package where (most of) the changes are made (e.g. `media: Add text/calendar`)
+* If the package name is deeply nested/long, try to shorten it from the left side, e.g. `markup/goldmark` is OK, `resources/resource_transformers/js` can be shortened to `js`.
+* If this commit touches several packages with a common functional topic, use that as a prefix (e.g. `errors: Resolve correct line numbers`)
+* If this commit touches many packages without a common functional topic, prefix with `all:` (e.g. `all: Reformat Go code`)
+* If this is a documentation update, prefix with `docs:`.
+* If nothing of the above applies, just leave the prefix out.
+* Note that the above excludes nouns seen in other repositories, e.g. "chore:".
Also, if your commit references one or more GitHub issues, always end your commit message body with *See #1234* or *Fixes #1234*.
Replace *1234* with the GitHub issue ID. The last example will close the issue when the commit is merged into *master*.
-Sometimes it makes sense to prefix the commit message with the package name (or docs folder) all lowercased ending with a colon.
-That is fine, but the rest of the rules above apply.
-So it is "tpl: Add emojify template func", not "tpl: add emojify template func.", and "docs: Document emoji", not "doc: document emoji."
-
-Please use a short and descriptive branch name, e.g. **NOT** "patch-1". It's very common but creates a naming conflict each time when a submission is pulled for a review.
-
An example:
```text
@@ -113,12 +123,10 @@ cd hugo
go install
```
->Note: Some Go tools may not be fully updated to support Go Modules yet. One example would be LiteIDE. Follow [this workaround](https://github.com/visualfc/liteide/issues/986#issuecomment-428117702) for how to continue to work with Hugo below `GOPATH`.
-
For some convenient build and test targets, you also will want to install Mage:
```bash
-go get github.com/magefile/mage
+go install github.com/magefile/mage@latest
```
Now, to make a change to Hugo's source:
@@ -140,7 +148,7 @@ Now, to make a change to Hugo's source:
1. Add your fork as a new remote (the remote name, "fork" in this example, is arbitrary):
```bash
- git remote add fork git://github.com/USERNAME/hugo.git
+ git remote add fork git@github.com:USERNAME/hugo.git
```
1. Push the changes to your new remote:
diff --git a/Dockerfile b/Dockerfile
index 4728a0f2e..a0e34353f 100755
--- a/Dockerfile
+++ b/Dockerfile
@@ -2,32 +2,98 @@
# Twitter: https://twitter.com/gohugoio
# Website: https://gohugo.io/
-FROM golang:1.11-stretch AS build
+ARG GO_VERSION="1.24"
+ARG ALPINE_VERSION="3.22"
+ARG DART_SASS_VERSION="1.79.3"
+FROM --platform=$BUILDPLATFORM tonistiigi/xx:1.5.0 AS xx
+FROM --platform=$BUILDPLATFORM golang:${GO_VERSION}-alpine${ALPINE_VERSION} AS gobuild
+FROM golang:${GO_VERSION}-alpine${ALPINE_VERSION} AS gorun
+
+
+FROM gobuild AS build
+
+RUN apk add clang lld
+
+# Set up cross-compilation helpers
+COPY --from=xx / /
+
+ARG TARGETPLATFORM
+RUN xx-apk add musl-dev gcc g++
+
+# Optionally set HUGO_BUILD_TAGS to "none" or "withdeploy" when building like so:
+# docker build --build-arg HUGO_BUILD_TAGS=withdeploy .
+#
+# We build the extended version by default.
+ARG HUGO_BUILD_TAGS="extended"
+ENV CGO_ENABLED=1
+ENV GOPROXY=https://proxy.golang.org
+ENV GOCACHE=/root/.cache/go-build
+ENV GOMODCACHE=/go/pkg/mod
+ARG TARGETPLATFORM
WORKDIR /go/src/github.com/gohugoio/hugo
-RUN apt-get install \
- git gcc g++ binutils
-COPY . /go/src/github.com/gohugoio/hugo/
-ENV GO111MODULE=on
-RUN go get -d .
-ARG CGO=0
-ENV CGO_ENABLED=${CGO}
-ENV GOOS=linux
+# For --mount=type=cache the value of target is the default cache id, so
+# for the go mod cache it would be good if we could share it with other Go images using the same setup,
+# but the go build cache needs to be per platform.
+# See this comment: https://github.com/moby/buildkit/issues/1706#issuecomment-702238282
+RUN --mount=target=. \
+ --mount=type=cache,target=/go/pkg/mod \
+ --mount=type=cache,target=/root/.cache/go-build,id=go-build-$TARGETPLATFORM <
diff --git a/README.md b/README.md
+[bep]: https://github.com/bep
+[bugs]: https://github.com/gohugoio/hugo/issues?q=is%3Aopen+is%3Aissue+label%3ABug
+[contributing]: CONTRIBUTING.md
+[create a proposal]: https://github.com/gohugoio/hugo/issues/new?labels=Proposal%2C+NeedsTriage&template=feature_request.md
+[documentation repository]: https://github.com/gohugoio/hugoDocs
+[documentation]: https://gohugo.io/documentation
+[dragonfly bsd, freebsd, netbsd, and openbsd]: https://gohugo.io/installation/bsd
+[features]: https://gohugo.io/about/features/
+[forum]: https://discourse.gohugo.io
+[friends]: https://github.com/gohugoio/hugo/graphs/contributors
+[go]: https://go.dev/
+[hugo modules]: https://gohugo.io/hugo-modules/
+[installation]: https://gohugo.io/installation
+[issue queue]: https://github.com/gohugoio/hugo/issues
+[linux]: https://gohugo.io/installation/linux
+[macos]: https://gohugo.io/installation/macos
+[prebuilt binary]: https://github.com/gohugoio/hugo/releases/latest
+[requesting help]: https://discourse.gohugo.io/t/requesting-help/9132
+[spf13]: https://github.com/spf13
+[static site generator]: https://en.wikipedia.org/wiki/Static_site_generator
+[support]: https://discourse.gohugo.io
+[themes]: https://themes.gohugo.io/
+[website]: https://gohugo.io
+[windows]: https://gohugo.io/installation/windows
-A Fast and Flexible Static Site Generator built with love by [bep](https://github.com/bep), [spf13](http://spf13.com/) and [friends](https://github.com/gohugoio/hugo/graphs/contributors) in [Go][].
+
-[Website](https://gohugo.io) |
-[Forum](https://discourse.gohugo.io) |
-[Documentation](https://gohugo.io/getting-started/) |
-[Installation Guide](https://gohugo.io/getting-started/installing/) |
-[Contribution Guide](CONTRIBUTING.md) |
-[Twitter](https://twitter.com/gohugoio)
+A fast and flexible static site generator built with love by [bep], [spf13], and [friends] in [Go].
+
+---
[](https://godoc.org/github.com/gohugoio/hugo)
-[](https://travis-ci.org/gohugoio/hugo)
+[](https://github.com/gohugoio/hugo/actions?query=workflow%3ATest)
[](https://goreportcard.com/report/github.com/gohugoio/hugo)
+[Website] | [Installation] | [Documentation] | [Support] | [Contributing] | Mastodon
+
## Overview
-Hugo is a static HTML and CSS website generator written in [Go][].
-It is optimized for speed, ease of use, and configurability.
-Hugo takes a directory with content and templates and renders them into a full HTML website.
+Hugo is a [static site generator] written in [Go], optimized for speed and designed for flexibility. With its advanced templating system and fast asset pipelines, Hugo renders a complete site in seconds, often less.
-Hugo relies on Markdown files with front matter for metadata, and you can run Hugo from any directory.
-This works well for shared hosts and other systems where you don’t have a privileged account.
+Due to its flexible framework, multilingual support, and powerful taxonomy system, Hugo is widely used to create:
-Hugo renders a typical website of moderate size in a fraction of a second.
-A good rule of thumb is that each piece of content renders in around 1 millisecond.
+- Corporate, government, nonprofit, education, news, event, and project sites
+- Documentation sites
+- Image portfolios
+- Landing pages
+- Business, professional, and personal blogs
+- Resumes and CVs
-Hugo is designed to work well for any kind of website including blogs, tumbles, and docs.
+Use Hugo's embedded web server during development to instantly see changes to content, structure, behavior, and presentation. Then deploy the site to your host, or push changes to your Git provider for automated builds and deployment.
-#### Supported Architectures
+Hugo's fast asset pipelines include:
-Currently, we provide pre-built Hugo binaries for Windows, Linux, FreeBSD, NetBSD, macOS (Darwin), and [Android](https://gist.github.com/bep/a0d8a26cf6b4f8bc992729b8e50b480b) for x64, i386 and ARM architectures.
+- Image processing – Convert, resize, crop, rotate, adjust colors, apply filters, overlay text and images, and extract EXIF data
+- JavaScript bundling – Transpile TypeScript and JSX to JavaScript, bundle, tree shake, minify, create source maps, and perform SRI hashing
+- Sass processing – Transpile Sass to CSS, bundle, tree shake, minify, create source maps, perform SRI hashing, and integrate with PostCSS
+- Tailwind CSS processing – Compile Tailwind CSS utility classes into standard CSS, bundle, tree shake, optimize, minify, perform SRI hashing, and integrate with PostCSS
-Hugo may also be compiled from source wherever the Go compiler tool chain can run, e.g. for other operating systems including DragonFly BSD, OpenBSD, Plan 9, and Solaris.
+And with [Hugo Modules], you can share content, assets, data, translations, themes, templates, and configuration with other projects via public or private Git repositories.
-**Complete documentation is available at [Hugo Documentation](https://gohugo.io/getting-started/).**
+See the [features] section of the documentation for a comprehensive summary of Hugo's capabilities.
-## Choose How to Install
+## Sponsors
-If you want to use Hugo as your site generator, simply install the Hugo binaries.
-The Hugo binaries have no external dependencies.
+
+
+
+
+
+
+
+
-To contribute to the Hugo source code or documentation, you should [fork the Hugo GitHub project](https://github.com/gohugoio/hugo#fork-destination-box) and clone it to your local machine.
+## Editions
-Finally, you can install the Hugo source code with `go`, build the binaries yourself, and run Hugo that way.
-Building the binaries is an easy task for an experienced `go` getter.
+Hugo is available in three editions: standard, extended, and extended/deploy. While the standard edition provides core functionality, the extended and extended/deploy editions offer advanced features.
-### Install Hugo as Your Site Generator (Binary Install)
+Feature|extended edition|extended/deploy edition
+:--|:-:|:-:
+Encode to the WebP format when [processing images]. You can decode WebP images with any edition.|:heavy_check_mark:|:heavy_check_mark:
+[Transpile Sass to CSS] using the embedded LibSass transpiler. You can use the [Dart Sass] transpiler with any edition.|:heavy_check_mark:|:heavy_check_mark:
+Deploy your site directly to a Google Cloud Storage bucket, an AWS S3 bucket, or an Azure Storage container. See [details].|:x:|:heavy_check_mark:
-Use the [installation instructions in the Hugo documentation](https://gohugo.io/getting-started/installing/).
+[dart sass]: https://gohugo.io/functions/css/sass/#dart-sass
+[processing images]: https://gohugo.io/content-management/image-processing/
+[transpile sass to css]: https://gohugo.io/functions/css/sass/
+[details]: https://gohugo.io/hosting-and-deployment/hugo-deploy/
-### Build and Install the Binaries from Source (Advanced Install)
+Unless your specific deployment needs require the extended/deploy edition, we recommend the extended edition.
-#### Prerequisite Tools
+## Installation
-* [Git](https://git-scm.com/)
-* [Go (at least Go 1.11)](https://golang.org/dl/)
+Install Hugo from a [prebuilt binary], package manager, or package repository. Please see the installation instructions for your operating system:
-#### Fetch from GitHub
+- [macOS]
+- [Linux]
+- [Windows]
+- [DragonFly BSD, FreeBSD, NetBSD, and OpenBSD]
-Since Hugo 0.48, Hugo uses the Go Modules support built into Go 1.11 to build. The easiest is to clone Hugo in a directory outside of `GOPATH`, as in the following example:
+## Build from source
-```bash
-mkdir $HOME/src
-cd $HOME/src
-git clone https://github.com/gohugoio/hugo.git
-cd hugo
-go install
+Prerequisites to build Hugo from source:
+
+- Standard edition: Go 1.23.0 or later
+- Extended edition: Go 1.23.0 or later, and GCC
+- Extended/deploy edition: Go 1.23.0 or later, and GCC
+
+Build the standard edition:
+
+```text
+go install github.com/gohugoio/hugo@latest
```
-**If you are a Windows user, substitute the `$HOME` environment variable above with `%USERPROFILE%`.**
-
-## The Hugo Documentation
+Build the extended edition:
-The Hugo documentation now lives in its own repository, see https://github.com/gohugoio/hugoDocs. But we do keep a version of that documentation as a `git subtree` in this repository. To build the sub folder `/docs` as a Hugo site, you need to clone this repo:
-
-```bash
-git clone git@github.com:gohugoio/hugo.git
+```text
+CGO_ENABLED=1 go install -tags extended github.com/gohugoio/hugo@latest
```
-## Contributing to Hugo
+
+Build the extended/deploy edition:
+
+```text
+CGO_ENABLED=1 go install -tags extended,withdeploy github.com/gohugoio/hugo@latest
+```
+
+## Star History
+
+[](https://star-history.com/#gohugoio/hugo&Timeline)
+
+## Documentation
+
+Hugo's [documentation] includes installation instructions, a quick start guide, conceptual explanations, reference information, and examples.
+
+Please submit documentation issues and pull requests to the [documentation repository].
+
+## Support
+
+Please **do not use the issue queue** for questions or troubleshooting. Unless you are certain that your issue is a software defect, use the [forum].
+
+Hugo’s [forum] is an active community of users and developers who answer questions, share knowledge, and provide examples. A quick search of over 20,000 topics will often answer your question. Please be sure to read about [requesting help] before asking your first question.
+
+## Contributing
+
+You can contribute to the Hugo project by:
+
+- Answering questions on the [forum]
+- Improving the [documentation]
+- Monitoring the [issue queue]
+- Creating or improving [themes]
+- Squashing [bugs]
+
+Please submit documentation issues and pull requests to the [documentation repository].
+
+If you have an idea for an enhancement or new feature, create a new topic on the [forum] in the "Feature" category. This will help you to:
+
+- Determine if the capability already exists
+- Measure interest
+- Refine the concept
+
+If there is sufficient interest, [create a proposal]. Do not submit a pull request until the project lead accepts the proposal.
For a complete guide to contributing to Hugo, see the [Contribution Guide](CONTRIBUTING.md).
-We welcome contributions to Hugo of any kind including documentation, themes,
-organization, tutorials, blog posts, bug reports, issues, feature requests,
-feature implementations, pull requests, answering questions on the forum,
-helping to manage issues, etc.
-
-The Hugo community and maintainers are [very active](https://github.com/gohugoio/hugo/pulse/monthly) and helpful, and the project benefits greatly from this activity.
-
-### Asking Support Questions
-
-We have an active [discussion forum](https://discourse.gohugo.io) where users and developers can ask questions.
-Please don't use the GitHub issue tracker to ask questions.
-
-### Reporting Issues
-
-If you believe you have found a defect in Hugo or its documentation, use
-the GitHub issue tracker to report the problem to the Hugo maintainers.
-If you're not sure if it's a bug or not, start by asking in the [discussion forum](https://discourse.gohugo.io).
-When reporting the issue, please provide the version of Hugo in use (`hugo version`).
-
-### Submitting Patches
-
-The Hugo project welcomes all contributors and contributions regardless of skill or experience level.
-If you are interested in helping with the project, we will help you with your contribution.
-Hugo is a very active project with many contributions happening daily.
-
-Because we want to create the best possible product for our users and the best contribution experience for our developers,
-we have a set of guidelines which ensure that all contributions are acceptable.
-The guidelines are not intended as a filter or barrier to participation.
-If you are unfamiliar with the contribution process, the Hugo team will help you and teach you how to bring your contribution in accordance with the guidelines.
-
-For a complete guide to contributing code to Hugo, see the [Contribution Guide](CONTRIBUTING.md).
-
-[](https://github.com/igrigorik/ga-beacon)
-
-[Go]: https://golang.org/
-[Hugo Documentation]: https://gohugo.io/overview/introduction/
-
## Dependencies
-Hugo stands on the shoulder of many great open source libraries, in lexical order:
+Hugo stands on the shoulders of great open source libraries. Run `hugo env --logLevel info` to display a list of dependencies.
- | Dependency | License |
- | :------------- | :------------- |
- | [github.com/BurntSushi/locker](https://github.com/BurntSushi/locker) | The Unlicense |
- | [github.com/BurntSushi/toml](https://github.com/BurntSushi/toml) | MIT License |
- | [github.com/PuerkitoBio/purell](https://github.com/PuerkitoBio/purell) | BSD 3-Clause "New" or "Revised" License |
- | [github.com/PuerkitoBio/urlesc](https://github.com/PuerkitoBio/urlesc) | BSD 3-Clause "New" or "Revised" License |
- | [github.com/alecthomas/chroma](https://github.com/alecthomas/chroma) | MIT License |
- | [github.com/bep/debounce](https://github.com/bep/debounce) | MIT License |
- | [github.com/bep/gitmap](https://github.com/bep/gitmap) | MIT License |
- | [github.com/bep/go-tocss](https://github.com/bep/go-tocss) | MIT License |
- | [github.com/niklasfasching/go-org](https://github.com/niklasfasching/go-org) | MIT License |
- | [github.com/cpuguy83/go-md2man](https://github.com/cpuguy83/go-md2man) | MIT License |
- | [github.com/danwakefield/fnmatch](https://github.com/danwakefield/fnmatch) | BSD 2-Clause "Simplified" License |
- | [github.com/disintegration/imaging](https://github.com/disintegration/imaging) | MIT License |
- | [github.com/dlclark/regexp2](https://github.com/dlclark/regexp2) | MIT License |
- | [github.com/eknkc/amber](https://github.com/eknkc/amber) | MIT License |
- | [github.com/fsnotify/fsnotify](https://github.com/fsnotify/fsnotify) | BSD 3-Clause "New" or "Revised" License |
- | [github.com/gobwas/glob](https://github.com/gobwas/glob) | MIT License |
- | [github.com/gorilla/websocket](https://github.com/gorilla/websocket) | BSD 2-Clause "Simplified" License |
- | [github.com/hashicorp/go-immutable-radix](https://github.com/hashicorp/go-immutable-radix) | Mozilla Public License 2.0 |
- | [github.com/hashicorp/golang-lru](https://github.com/hashicorp/golang-lru) | Mozilla Public License 2.0 |
- | [github.com/hashicorp/hcl](https://github.com/hashicorp/hcl) | Mozilla Public License 2.0 |
- | [github.com/jdkato/prose](https://github.com/jdkato/prose) | MIT License |
- | [github.com/kyokomi/emoji](https://github.com/kyokomi/emoji) | MIT License |
- | [github.com/magiconair/properties](https://github.com/magiconair/properties) | BSD 2-Clause "Simplified" License |
- | [github.com/markbates/inflect](https://github.com/markbates/inflect) | MIT License |
- | [github.com/mattn/go-isatty](https://github.com/mattn/go-isatty) | MIT License |
- | [github.com/mattn/go-runewidth](https://github.com/mattn/go-runewidth) | MIT License |
- | [github.com/miekg/mmark](https://github.com/miekg/mmark) | Simplified BSD License |
- | [github.com/mitchellh/hashstructure](https://github.com/mitchellh/hashstructure) | MIT License |
- | [github.com/mitchellh/mapstructure](https://github.com/mitchellh/mapstructure) | MIT License |
- | [github.com/muesli/smartcrop](https://github.com/muesli/smartcrop) | MIT License |
- | [github.com/nicksnyder/go-i18n](https://github.com/nicksnyder/go-i18n) | MIT License |
- | [github.com/olekukonko/tablewriter](https://github.com/olekukonko/tablewriter) | MIT License |
- | [github.com/pelletier/go-toml](https://github.com/pelletier/go-toml) | MIT License |
- | [github.com/pkg/errors](https://github.com/pkg/errors) | BSD 2-Clause "Simplified" License |
- | [github.com/russross/blackfriday](https://github.com/russross/blackfriday) | Simplified BSD License |
- | [github.com/shurcooL/sanitized_anchor_name](https://github.com/shurcooL/sanitized_anchor_name) | MIT License |
- | [github.com/spf13/afero](https://github.com/spf13/afero) | Apache License 2.0 |
- | [github.com/spf13/cast](https://github.com/spf13/cast) | MIT License |
- | [github.com/spf13/cobra](https://github.com/spf13/cobra) | Apache License 2.0 |
- | [github.com/spf13/fsync](https://github.com/spf13/fsync) | MIT License |
- | [github.com/spf13/jwalterweatherman](https://github.com/spf13/jwalterweatherman) | MIT License |
- | [github.com/spf13/nitro](https://github.com/spf13/nitro) | Apache License 2.0 |
- | [github.com/spf13/pflag](https://github.com/spf13/pflag) | BSD 3-Clause "New" or "Revised" License |
- | [github.com/spf13/viper](https://github.com/spf13/viper) | MIT License |
- | [github.com/tdewolff/minify](https://github.com/tdewolff/minify) | MIT License |
- | [github.com/tdewolff/parse](https://github.com/tdewolff/parse) | MIT License |
- | [github.com/wellington/go-libsass](https://github.com/wellington/go-libsass) | Apache License 2.0 |
- | [github.com/yosssi/ace](https://github.com/yosssi/ace) | MIT License |
- | [golang.org/x/image](https://golang.org/x/image) | BSD 3-Clause "New" or "Revised" License |
- | [golang.org/x/net](https://golang.org/x/net) | BSD 3-Clause "New" or "Revised" License |
- | [golang.org/x/sync](https://golang.org/x/sync) | BSD 3-Clause "New" or "Revised" License |
- | [golang.org/x/sys](https://golang.org/x/sys) | BSD 3-Clause "New" or "Revised" License |
- | [golang.org/x/text](https://golang.org/x/text) | BSD 3-Clause "New" or "Revised" License
- | [gopkg.in/yaml.v2](https://gopkg.in/yaml.v2) | Apache License 2.0 |
+
+See current dependencies
-
-
-
-
-
-
+```text
+github.com/BurntSushi/locker="v0.0.0-20171006230638-a6e239ea1c69"
+github.com/PuerkitoBio/goquery="v1.10.1"
+github.com/alecthomas/chroma/v2="v2.15.0"
+github.com/andybalholm/cascadia="v1.3.3"
+github.com/armon/go-radix="v1.0.1-0.20221118154546-54df44f2176c"
+github.com/bep/clocks="v0.5.0"
+github.com/bep/debounce="v1.2.0"
+github.com/bep/gitmap="v1.6.0"
+github.com/bep/goat="v0.5.0"
+github.com/bep/godartsass/v2="v2.3.2"
+github.com/bep/golibsass="v1.2.0"
+github.com/bep/gowebp="v0.3.0"
+github.com/bep/imagemeta="v0.8.4"
+github.com/bep/lazycache="v0.7.0"
+github.com/bep/logg="v0.4.0"
+github.com/bep/mclib="v1.20400.20402"
+github.com/bep/overlayfs="v0.9.2"
+github.com/bep/simplecobra="v0.5.0"
+github.com/bep/tmc="v0.5.1"
+github.com/cespare/xxhash/v2="v2.3.0"
+github.com/clbanning/mxj/v2="v2.7.0"
+github.com/cpuguy83/go-md2man/v2="v2.0.4"
+github.com/disintegration/gift="v1.2.1"
+github.com/dlclark/regexp2="v1.11.5"
+github.com/dop251/goja="v0.0.0-20250125213203-5ef83b82af17"
+github.com/evanw/esbuild="v0.24.2"
+github.com/fatih/color="v1.18.0"
+github.com/frankban/quicktest="v1.14.6"
+github.com/fsnotify/fsnotify="v1.8.0"
+github.com/getkin/kin-openapi="v0.129.0"
+github.com/ghodss/yaml="v1.0.0"
+github.com/go-openapi/jsonpointer="v0.21.0"
+github.com/go-openapi/swag="v0.23.0"
+github.com/go-sourcemap/sourcemap="v2.1.4+incompatible"
+github.com/gobuffalo/flect="v1.0.3"
+github.com/gobwas/glob="v0.2.3"
+github.com/gohugoio/go-i18n/v2="v2.1.3-0.20230805085216-e63c13218d0e"
+github.com/gohugoio/hashstructure="v0.5.0"
+github.com/gohugoio/httpcache="v0.7.0"
+github.com/gohugoio/hugo-goldmark-extensions/extras="v0.2.0"
+github.com/gohugoio/hugo-goldmark-extensions/passthrough="v0.3.0"
+github.com/gohugoio/locales="v0.14.0"
+github.com/gohugoio/localescompressed="v1.0.1"
+github.com/golang/freetype="v0.0.0-20170609003504-e2365dfdc4a0"
+github.com/google/go-cmp="v0.6.0"
+github.com/google/pprof="v0.0.0-20250208200701-d0013a598941"
+github.com/gorilla/websocket="v1.5.3"
+github.com/hairyhenderson/go-codeowners="v0.7.0"
+github.com/hashicorp/golang-lru/v2="v2.0.7"
+github.com/jdkato/prose="v1.2.1"
+github.com/josharian/intern="v1.0.0"
+github.com/kr/pretty="v0.3.1"
+github.com/kr/text="v0.2.0"
+github.com/kyokomi/emoji/v2="v2.2.13"
+github.com/lucasb-eyer/go-colorful="v1.2.0"
+github.com/mailru/easyjson="v0.7.7"
+github.com/makeworld-the-better-one/dither/v2="v2.4.0"
+github.com/marekm4/color-extractor="v1.2.1"
+github.com/mattn/go-colorable="v0.1.13"
+github.com/mattn/go-isatty="v0.0.20"
+github.com/mattn/go-runewidth="v0.0.9"
+github.com/mazznoer/csscolorparser="v0.1.5"
+github.com/mitchellh/mapstructure="v1.5.1-0.20231216201459-8508981c8b6c"
+github.com/mohae/deepcopy="v0.0.0-20170929034955-c48cc78d4826"
+github.com/muesli/smartcrop="v0.3.0"
+github.com/niklasfasching/go-org="v1.7.0"
+github.com/oasdiff/yaml3="v0.0.0-20241210130736-a94c01f36349"
+github.com/oasdiff/yaml="v0.0.0-20241210131133-6b86fb107d80"
+github.com/olekukonko/tablewriter="v0.0.5"
+github.com/pbnjay/memory="v0.0.0-20210728143218-7b4eea64cf58"
+github.com/pelletier/go-toml/v2="v2.2.3"
+github.com/perimeterx/marshmallow="v1.1.5"
+github.com/pkg/browser="v0.0.0-20240102092130-5ac0b6a4141c"
+github.com/pkg/errors="v0.9.1"
+github.com/rivo/uniseg="v0.4.7"
+github.com/rogpeppe/go-internal="v1.13.1"
+github.com/russross/blackfriday/v2="v2.1.0"
+github.com/sass/libsass="3.6.6"
+github.com/spf13/afero="v1.11.0"
+github.com/spf13/cast="v1.7.1"
+github.com/spf13/cobra="v1.8.1"
+github.com/spf13/fsync="v0.10.1"
+github.com/spf13/pflag="v1.0.6"
+github.com/tdewolff/minify/v2="v2.20.37"
+github.com/tdewolff/parse/v2="v2.7.15"
+github.com/tetratelabs/wazero="v1.8.2"
+github.com/webmproject/libwebp="v1.3.2"
+github.com/yuin/goldmark-emoji="v1.0.4"
+github.com/yuin/goldmark="v1.7.8"
+go.uber.org/automaxprocs="v1.5.3"
+golang.org/x/crypto="v0.33.0"
+golang.org/x/exp="v0.0.0-20250210185358-939b2ce775ac"
+golang.org/x/image="v0.24.0"
+golang.org/x/mod="v0.23.0"
+golang.org/x/net="v0.35.0"
+golang.org/x/sync="v0.11.0"
+golang.org/x/sys="v0.30.0"
+golang.org/x/text="v0.22.0"
+golang.org/x/tools="v0.30.0"
+golang.org/x/xerrors="v0.0.0-20240903120638-7835f813f4da"
+gonum.org/v1/plot="v0.15.0"
+google.golang.org/protobuf="v1.36.5"
+gopkg.in/yaml.v2="v2.4.0"
+gopkg.in/yaml.v3="v3.0.1"
+oss.terrastruct.com/d2="v0.6.9"
+oss.terrastruct.com/util-go="v0.0.0-20241005222610-44c011a04896"
+rsc.io/qr="v0.2.0"
+software.sslmate.com/src/go-pkcs12="v0.2.0"
+```
+
diff --git a/SECURITY.md b/SECURITY.md
new file mode 100644
index 000000000..6ac90f072
--- /dev/null
+++ b/SECURITY.md
@@ -0,0 +1,7 @@
+## Security Policy
+
+### Reporting a Vulnerability
+
+Please report (suspected) security vulnerabilities to **[bjorn.erik.pedersen@gmail.com](mailto:bjorn.erik.pedersen@gmail.com)**. You will receive a response from us within 48 hours. If we can confirm the issue, we will release a patch as soon as possible, depending on the complexity of the issue, but historically within days.
+
+Also see [Hugo's Security Model](https://gohugo.io/about/security/).
diff --git a/bench.sh b/bench.sh
deleted file mode 100755
index c6a20a7e3..000000000
--- a/bench.sh
+++ /dev/null
@@ -1,37 +0,0 @@
-#!/usr/bin/env bash
-
-# allow user to override go executable by running as GOEXE=xxx make ...
-GOEXE="${GOEXE-go}"
-
-# Convenience script to
-# - For a given branch
-# - Run benchmark tests for a given package
-# - Do the same for master
-# - then compare the two runs with benchcmp
-
-benchFilter=".*"
-
-if (( $# < 2 ));
- then
- echo "USAGE: ./bench.sh (and (regexp, optional))"
- exit 1
-fi
-
-
-
-if [ $# -eq 3 ]; then
- benchFilter=$3
-fi
-
-
-BRANCH=$1
-PACKAGE=$2
-
-git checkout $BRANCH
-"${GOEXE}" test -test.run=NONE -bench="$benchFilter" -test.benchmem=true ./$PACKAGE > /tmp/bench-$PACKAGE-$BRANCH.txt
-
-git checkout master
-"${GOEXE}" test -test.run=NONE -bench="$benchFilter" -test.benchmem=true ./$PACKAGE > /tmp/bench-$PACKAGE-master.txt
-
-
-benchcmp /tmp/bench-$PACKAGE-master.txt /tmp/bench-$PACKAGE-$BRANCH.txt
diff --git a/benchSite.sh b/benchSite.sh
deleted file mode 100755
index 623086708..000000000
--- a/benchSite.sh
+++ /dev/null
@@ -1,12 +0,0 @@
-#!/bin/bash
-
-# allow user to override go executable by running as GOEXE=xxx make ...
-GOEXE="${GOEXE-go}"
-
-# Send in a regexp mathing the benchmarks you want to run, i.e. './benchSite.sh "YAML"'.
-# Note the quotes, which will be needed for more complex expressions.
-# The above will run all variations, but only for front matter YAML.
-
-echo "Running with BenchmarkSiteBuilding/${1}"
-
-"${GOEXE}" test -run="NONE" -bench="BenchmarkSiteBuilding/${1}" -test.benchmem=true ./hugolib -memprofile mem.prof -count 3 -cpuprofile cpu.prof
diff --git a/benchbep.sh b/benchbep.sh
deleted file mode 100755
index efd616c88..000000000
--- a/benchbep.sh
+++ /dev/null
@@ -1 +0,0 @@
-gobench -package=./hugolib -bench="BenchmarkSiteNew/Deep_content_tree"
\ No newline at end of file
diff --git a/bepdock.sh b/bepdock.sh
deleted file mode 100755
index a7ac0c639..000000000
--- a/bepdock.sh
+++ /dev/null
@@ -1 +0,0 @@
-docker run --rm --mount type=bind,source="$(pwd)",target=/hugo -w /hugo -i -t bepsays/ci-goreleaser:1.11-2 /bin/bash
\ No newline at end of file
diff --git a/bufferpool/bufpool.go b/bufferpool/bufpool.go
index c1e4105d0..f05675e3e 100644
--- a/bufferpool/bufpool.go
+++ b/bufferpool/bufpool.go
@@ -20,7 +20,7 @@ import (
)
var bufferPool = &sync.Pool{
- New: func() interface{} {
+ New: func() any {
return &bytes.Buffer{}
},
}
diff --git a/cache/docs.go b/cache/docs.go
new file mode 100644
index 000000000..b9c49840f
--- /dev/null
+++ b/cache/docs.go
@@ -0,0 +1,2 @@
+// Package cache contains the different cache implementations.
+package cache
diff --git a/cache/dynacache/dynacache.go b/cache/dynacache/dynacache.go
new file mode 100644
index 000000000..25d0f9b29
--- /dev/null
+++ b/cache/dynacache/dynacache.go
@@ -0,0 +1,647 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package dynacache
+
+import (
+ "context"
+ "fmt"
+ "math"
+ "path"
+ "regexp"
+ "runtime"
+ "sync"
+ "time"
+
+ "github.com/bep/lazycache"
+ "github.com/bep/logg"
+ "github.com/gohugoio/hugo/common/collections"
+ "github.com/gohugoio/hugo/common/herrors"
+ "github.com/gohugoio/hugo/common/loggers"
+ "github.com/gohugoio/hugo/common/paths"
+ "github.com/gohugoio/hugo/common/rungroup"
+ "github.com/gohugoio/hugo/config"
+ "github.com/gohugoio/hugo/helpers"
+ "github.com/gohugoio/hugo/identity"
+ "github.com/gohugoio/hugo/resources/resource"
+)
+
+const minMaxSize = 10
+
+type KeyIdentity struct {
+ Key any
+ Identity identity.Identity
+}
+
+// New creates a new cache.
+func New(opts Options) *Cache {
+ if opts.CheckInterval == 0 {
+ opts.CheckInterval = time.Second * 2
+ }
+
+ if opts.MaxSize == 0 {
+ opts.MaxSize = 100000
+ }
+ if opts.Log == nil {
+ panic("nil Log")
+ }
+
+ if opts.MinMaxSize == 0 {
+ opts.MinMaxSize = 30
+ }
+
+ stats := &stats{
+ opts: opts,
+ adjustmentFactor: 1.0,
+ currentMaxSize: opts.MaxSize,
+ availableMemory: config.GetMemoryLimit(),
+ }
+
+ infol := opts.Log.InfoCommand("dynacache")
+
+ evictedIdentities := collections.NewStack[KeyIdentity]()
+
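+ // When watching for changes, onEvict records the evicted value's identities so
+ // dependent entries can be invalidated later, and marks the value as stale;
+ // otherwise it is a no-op.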
+ onEvict := func(k, v any) {
+ if !opts.Watching {
+ return
+ }
+ identity.WalkIdentitiesShallow(v, func(level int, id identity.Identity) bool {
+ evictedIdentities.Push(KeyIdentity{Key: k, Identity: id})
+ return false
+ })
+ resource.MarkStale(v)
+ }
+
+ c := &Cache{
+ partitions: make(map[string]PartitionManager),
+ onEvict: onEvict,
+ evictedIdentities: evictedIdentities,
+ opts: opts,
+ stats: stats,
+ infol: infol,
+ }
+
+ c.stop = c.start()
+
+ return c
+}
+
+// Options for the cache.
+type Options struct {
+ Log loggers.Logger
+ CheckInterval time.Duration
+ MaxSize int
+ MinMaxSize int
+ Watching bool
+}
+
+// Options for a partition.
+type OptionsPartition struct {
+ // When to clear this partition.
+ ClearWhen ClearWhen
+
+ // Weight is a number between 1 and 100 that indicates, in general, how big this partition may get.
+ Weight int
+}
+
+func (o OptionsPartition) WeightFraction() float64 {
+ return float64(o.Weight) / 100
+}
+
+func (o OptionsPartition) CalculateMaxSize(maxSizePerPartition int) int {
+ return int(math.Floor(float64(maxSizePerPartition) * o.WeightFraction()))
+}
+
+// A dynamic partitioned cache.
+type Cache struct {
+ mu sync.RWMutex
+
+ partitions map[string]PartitionManager
+
+ onEvict func(k, v any)
+ evictedIdentities *collections.Stack[KeyIdentity]
+
+ opts Options
+ infol logg.LevelLogger
+
+ stats *stats
+ stopOnce sync.Once
+ stop func()
+}
+
+// DrainEvictedIdentities drains the evicted identities from the cache.
+func (c *Cache) DrainEvictedIdentities() []KeyIdentity {
+ return c.evictedIdentities.Drain()
+}
+
+// DrainEvictedIdentitiesMatching drains the evicted identities from the cache that match the given predicate.
+func (c *Cache) DrainEvictedIdentitiesMatching(predicate func(KeyIdentity) bool) []KeyIdentity {
+ return c.evictedIdentities.DrainMatching(predicate)
+}
+
+// ClearMatching clears all partition for which the predicate returns true.
+func (c *Cache) ClearMatching(predicatePartition func(k string, p PartitionManager) bool, predicateValue func(k, v any) bool) {
+ if predicatePartition == nil {
+ predicatePartition = func(k string, p PartitionManager) bool { return true }
+ }
+ if predicateValue == nil {
+ panic("nil predicateValue")
+ }
+ g := rungroup.Run[PartitionManager](context.Background(), rungroup.Config[PartitionManager]{
+ NumWorkers: len(c.partitions),
+ Handle: func(ctx context.Context, partition PartitionManager) error {
+ partition.clearMatching(predicateValue)
+ return nil
+ },
+ })
+
+ for k, p := range c.partitions {
+ if !predicatePartition(k, p) {
+ continue
+ }
+ g.Enqueue(p)
+ }
+
+ g.Wait()
+}
+
+// ClearOnRebuild prepares the cache for a new rebuild taking the given changeset into account.
+// predicate is optional and will clear any entry for which it returns true.
+func (c *Cache) ClearOnRebuild(predicate func(k, v any) bool, changeset ...identity.Identity) {
+ g := rungroup.Run[PartitionManager](context.Background(), rungroup.Config[PartitionManager]{
+ NumWorkers: len(c.partitions),
+ Handle: func(ctx context.Context, partition PartitionManager) error {
+ partition.clearOnRebuild(predicate, changeset...)
+ return nil
+ },
+ })
+
+ for _, p := range c.partitions {
+ g.Enqueue(p)
+ }
+
+ g.Wait()
+
+ // Clear any entries marked as stale above.
+ g = rungroup.Run[PartitionManager](context.Background(), rungroup.Config[PartitionManager]{
+ NumWorkers: len(c.partitions),
+ Handle: func(ctx context.Context, partition PartitionManager) error {
+ partition.clearStale()
+ return nil
+ },
+ })
+
+ for _, p := range c.partitions {
+ g.Enqueue(p)
+ }
+
+ g.Wait()
+}
+
+type keysProvider interface {
+ Keys() []string
+}
+
+// Keys returns a list of keys in all partitions.
+func (c *Cache) Keys(predicate func(s string) bool) []string {
+ if predicate == nil {
+ predicate = func(s string) bool { return true }
+ }
+ var keys []string
+ for pn, g := range c.partitions {
+ pkeys := g.(keysProvider).Keys()
+ for _, k := range pkeys {
+ p := path.Join(pn, k)
+ if predicate(p) {
+ keys = append(keys, p)
+ }
+ }
+
+ }
+ return keys
+}
+
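+// calculateMaxSizePerPartition derives a per-partition item budget from the total
+// budget, scaled by the inverse of the average partition weight. A worked example
+// with hypothetical numbers: maxItemsTotal=100000, totalWeightQuantity=200 and
+// numPartitions=10 give avgWeight=20, so each partition gets
+// floor(100000/10 * 100/20) = 50000 before its own weight fraction is applied via
+// OptionsPartition.CalculateMaxSize.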
+func calculateMaxSizePerPartition(maxItemsTotal, totalWeightQuantity, numPartitions int) int {
+ if numPartitions == 0 {
+ panic("numPartitions must be > 0")
+ }
+ if totalWeightQuantity == 0 {
+ panic("totalWeightQuantity must be > 0")
+ }
+
+ avgWeight := float64(totalWeightQuantity) / float64(numPartitions)
+ return int(math.Floor(float64(maxItemsTotal) / float64(numPartitions) * (100.0 / avgWeight)))
+}
+
+// Stop stops the cache.
+func (c *Cache) Stop() {
+ c.stopOnce.Do(func() {
+ c.stop()
+ })
+}
+
+func (c *Cache) adjustCurrentMaxSize() {
+ c.mu.RLock()
+ defer c.mu.RUnlock()
+
+ if len(c.partitions) == 0 {
+ return
+ }
+ var m runtime.MemStats
+ runtime.ReadMemStats(&m)
+ s := c.stats
+ s.memstatsCurrent = m
+ // fmt.Printf("\n\nAvailable = %v\nAlloc = %v\nTotalAlloc = %v\nSys = %v\nNumGC = %v\nMaxSize = %d\nAdjustmentFactor=%f\n\n", helpers.FormatByteCount(s.availableMemory), helpers.FormatByteCount(m.Alloc), helpers.FormatByteCount(m.TotalAlloc), helpers.FormatByteCount(m.Sys), m.NumGC, c.stats.currentMaxSize, s.adjustmentFactor)
+
+ if s.availableMemory >= s.memstatsCurrent.Alloc {
+ if s.adjustmentFactor <= 1.0 {
+ s.adjustmentFactor += 0.2
+ }
+ } else {
+ // We're low on memory.
+ s.adjustmentFactor -= 0.4
+ }
+
+ if s.adjustmentFactor <= 0 {
+ s.adjustmentFactor = 0.05
+ }
+
+ if !s.adjustCurrentMaxSize() {
+ return
+ }
+
+ totalWeight := 0
+ for _, pm := range c.partitions {
+ totalWeight += pm.getOptions().Weight
+ }
+
+ maxSizePerPartition := calculateMaxSizePerPartition(c.stats.currentMaxSize, totalWeight, len(c.partitions))
+
+ evicted := 0
+ for _, p := range c.partitions {
+ evicted += p.adjustMaxSize(p.getOptions().CalculateMaxSize(maxSizePerPartition))
+ }
+
+ if evicted > 0 {
+ c.infol.
+ WithFields(
+ logg.Fields{
+ {Name: "evicted", Value: evicted},
+ {Name: "numGC", Value: m.NumGC},
+ {Name: "limit", Value: helpers.FormatByteCount(c.stats.availableMemory)},
+ {Name: "alloc", Value: helpers.FormatByteCount(m.Alloc)},
+ {Name: "totalAlloc", Value: helpers.FormatByteCount(m.TotalAlloc)},
+ },
+ ).Logf("adjusted partitions' max size")
+ }
+}
+
+func (c *Cache) start() func() {
+ ticker := time.NewTicker(c.opts.CheckInterval)
+ quit := make(chan struct{})
+
+ go func() {
+ for {
+ select {
+ case <-ticker.C:
+ c.adjustCurrentMaxSize()
+ // Reset the ticker to avoid drift.
+ ticker.Reset(c.opts.CheckInterval)
+ case <-quit:
+ ticker.Stop()
+ return
+ }
+ }
+ }()
+
+ return func() {
+ close(quit)
+ }
+}
+
+var partitionNameRe = regexp.MustCompile(`^\/[a-zA-Z0-9]{4}(\/[a-zA-Z0-9]+)?(\/[a-zA-Z0-9]+)?`)
+
+// GetOrCreatePartition gets or creates a partition with the given name.
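+// A minimal usage sketch (the partition name, key/value types, and weight below are
+// hypothetical):
+//
+//	p := dynacache.GetOrCreatePartition[string, int](c, "/misc/demo",
+//		dynacache.OptionsPartition{Weight: 30, ClearWhen: dynacache.ClearOnRebuild})
+//	v, err := p.GetOrCreate("answer", func(key string) (int, error) { return 42, nil })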
+func GetOrCreatePartition[K comparable, V any](c *Cache, name string, opts OptionsPartition) *Partition[K, V] {
+ if c == nil {
+ panic("nil Cache")
+ }
+ if opts.Weight < 1 || opts.Weight > 100 {
+ panic("invalid Weight, must be between 1 and 100")
+ }
+
+ if partitionNameRe.FindString(name) != name {
+ panic(fmt.Sprintf("invalid partition name %q", name))
+ }
+
+ c.mu.RLock()
+ p, found := c.partitions[name]
+ c.mu.RUnlock()
+ if found {
+ return p.(*Partition[K, V])
+ }
+
+ c.mu.Lock()
+ defer c.mu.Unlock()
+
+ // Double check.
+ p, found = c.partitions[name]
+ if found {
+ return p.(*Partition[K, V])
+ }
+
+ // At this point, we don't know the number of partitions or their configuration, but
+ // this will be re-adjusted later.
+ const numberOfPartitionsEstimate = 10
+ maxSize := opts.CalculateMaxSize(c.opts.MaxSize / numberOfPartitionsEstimate)
+
+ onEvict := func(k K, v V) {
+ c.onEvict(k, v)
+ }
+
+ // Create a new partition and cache it.
+ partition := &Partition[K, V]{
+ c: lazycache.New(lazycache.Options[K, V]{MaxEntries: maxSize, OnEvict: onEvict}),
+ maxSize: maxSize,
+ trace: c.opts.Log.Logger().WithLevel(logg.LevelTrace).WithField("partition", name),
+ opts: opts,
+ }
+
+ c.partitions[name] = partition
+
+ return partition
+}
+
+// Partition is a partition in the cache.
+type Partition[K comparable, V any] struct {
+ c *lazycache.Cache[K, V]
+
+ zero V
+
+ trace logg.LevelLogger
+ opts OptionsPartition
+
+ maxSize int
+}
+
+// GetOrCreate gets or creates a value for the given key.
+func (p *Partition[K, V]) GetOrCreate(key K, create func(key K) (V, error)) (V, error) {
+ v, err := p.doGetOrCreate(key, create)
+ if err != nil {
+ return p.zero, err
+ }
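+ // If the cached value has been marked stale, evict it and create a fresh one.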
+ if resource.StaleVersion(v) > 0 {
+ p.c.Delete(key)
+ return p.doGetOrCreate(key, create)
+ }
+ return v, err
+}
+
+func (p *Partition[K, V]) doGetOrCreate(key K, create func(key K) (V, error)) (V, error) {
+ v, _, err := p.c.GetOrCreate(key, create)
+ return v, err
+}
+
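+// GetOrCreateWitTimeout is like GetOrCreate, but gives up with a timeout error if the
+// create function takes longer than the given duration.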
+func (p *Partition[K, V]) GetOrCreateWitTimeout(key K, duration time.Duration, create func(key K) (V, error)) (V, error) {
+ v, err := p.doGetOrCreateWitTimeout(key, duration, create)
+ if err != nil {
+ return p.zero, err
+ }
+ if resource.StaleVersion(v) > 0 {
+ p.c.Delete(key)
+ return p.doGetOrCreateWitTimeout(key, duration, create)
+ }
+ return v, err
+}
+
+// doGetOrCreateWitTimeout gets or creates a value for the given key and times out
+// if the create function takes too long.
+func (p *Partition[K, V]) doGetOrCreateWitTimeout(key K, duration time.Duration, create func(key K) (V, error)) (V, error) {
+ resultch := make(chan V, 1)
+ errch := make(chan error, 1)
+
+ go func() {
+ var (
+ v V
+ err error
+ )
+ defer func() {
+ if r := recover(); r != nil {
+ if rerr, ok := r.(error); ok {
+ err = rerr
+ } else {
+ err = fmt.Errorf("panic: %v", r)
+ }
+ }
+ if err != nil {
+ errch <- err
+ } else {
+ resultch <- v
+ }
+ }()
+ v, _, err = p.c.GetOrCreate(key, create)
+ }()
+
+ select {
+ case v := <-resultch:
+ return v, nil
+ case err := <-errch:
+ return p.zero, err
+ case <-time.After(duration):
+ return p.zero, &herrors.TimeoutError{
+ Duration: duration,
+ }
+ }
+}
+
+func (p *Partition[K, V]) clearMatching(predicate func(k, v any) bool) {
+ p.c.DeleteFunc(func(key K, v V) bool {
+ if predicate(key, v) {
+ p.trace.Log(
+ logg.StringFunc(
+ func() string {
+ return fmt.Sprintf("clearing cache key %v", key)
+ },
+ ),
+ )
+ return true
+ }
+ return false
+ })
+}
+
+func (p *Partition[K, V]) clearOnRebuild(predicate func(k, v any) bool, changeset ...identity.Identity) {
+ if predicate == nil {
+ predicate = func(k, v any) bool {
+ return false
+ }
+ }
+ opts := p.getOptions()
+ if opts.ClearWhen == ClearNever {
+ return
+ }
+
+ if opts.ClearWhen == ClearOnRebuild {
+ // Clear all.
+ p.Clear()
+ return
+ }
+
+ depsFinder := identity.NewFinder(identity.FinderConfig{})
+
+ shouldDelete := func(key K, v V) bool {
+ // We always clear elements marked as stale.
+ if resource.StaleVersion(v) > 0 {
+ return true
+ }
+
+ // Now check if this entry has changed based on the changeset
+ // of filesystem events.
+ if len(changeset) == 0 {
+ // Nothing changed.
+ return false
+ }
+
+ var probablyDependent bool
+ identity.WalkIdentitiesShallow(v, func(level int, id2 identity.Identity) bool {
+ for _, id := range changeset {
+ if r := depsFinder.Contains(id, id2, -1); r > 0 {
+ // It's probably dependent, evict from cache.
+ probablyDependent = true
+ return true
+ }
+ }
+ return false
+ })
+
+ return probablyDependent
+ }
+
+ // First pass.
+ // Second pass needs to be done in a separate loop to catch any
+ // elements marked as stale in the other partitions.
+ p.c.DeleteFunc(func(key K, v V) bool {
+ if predicate(key, v) || shouldDelete(key, v) {
+ p.trace.Log(
+ logg.StringFunc(
+ func() string {
+ return fmt.Sprintf("first pass: clearing cache key %v", key)
+ },
+ ),
+ )
+ return true
+ }
+ return false
+ })
+}
+
+func (p *Partition[K, V]) Keys() []K {
+ var keys []K
+ p.c.DeleteFunc(func(key K, v V) bool {
+ keys = append(keys, key)
+ return false
+ })
+ return keys
+}
+
+func (p *Partition[K, V]) clearStale() {
+ p.c.DeleteFunc(func(key K, v V) bool {
+ staleVersion := resource.StaleVersion(v)
+ if staleVersion > 0 {
+ p.trace.Log(
+ logg.StringFunc(
+ func() string {
+ return fmt.Sprintf("second pass: clearing cache key %v", key)
+ },
+ ),
+ )
+ }
+
+ return staleVersion > 0
+ })
+}
+
+// adjustMaxSize adjusts the max size of the partition and returns the number of items evicted.
+func (p *Partition[K, V]) adjustMaxSize(newMaxSize int) int {
+ if newMaxSize < minMaxSize {
+ newMaxSize = minMaxSize
+ }
+ oldMaxSize := p.maxSize
+ if newMaxSize == oldMaxSize {
+ return 0
+ }
+ p.maxSize = newMaxSize
+ // fmt.Println("Adjusting max size of partition from", oldMaxSize, "to", newMaxSize)
+ return p.c.Resize(newMaxSize)
+}
+
+func (p *Partition[K, V]) getMaxSize() int {
+ return p.maxSize
+}
+
+func (p *Partition[K, V]) getOptions() OptionsPartition {
+ return p.opts
+}
+
+func (p *Partition[K, V]) Clear() {
+ p.c.DeleteFunc(func(key K, v V) bool {
+ return true
+ })
+}
+
+func (p *Partition[K, V]) Get(ctx context.Context, key K) (V, bool) {
+ return p.c.Get(key)
+}
+
+type PartitionManager interface {
+ adjustMaxSize(addend int) int
+ getMaxSize() int
+ getOptions() OptionsPartition
+ clearOnRebuild(predicate func(k, v any) bool, changeset ...identity.Identity)
+ clearMatching(predicate func(k, v any) bool)
+ clearStale()
+}
+
+const (
+ ClearOnRebuild ClearWhen = iota + 1
+ ClearOnChange
+ ClearNever
+)
+
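+// ClearWhen controls when entries in a partition are cleared on a rebuild:
+// ClearOnRebuild clears the whole partition, ClearOnChange clears only entries
+// that are stale or depend on the change set, and ClearNever leaves the
+// partition untouched.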
+type ClearWhen int
+
+type stats struct {
+ opts Options
+ memstatsCurrent runtime.MemStats
+ currentMaxSize int
+ availableMemory uint64
+
+ adjustmentFactor float64
+}
+
+func (s *stats) adjustCurrentMaxSize() bool {
+ newCurrentMaxSize := int(math.Floor(float64(s.opts.MaxSize) * s.adjustmentFactor))
+
+ if newCurrentMaxSize < s.opts.MinMaxSize {
+ newCurrentMaxSize = int(s.opts.MinMaxSize)
+ }
+ changed := newCurrentMaxSize != s.currentMaxSize
+ s.currentMaxSize = newCurrentMaxSize
+ return changed
+}
+
+// CleanKey turns s into a format suitable for a cache key for this package.
+// The key will be a Unix-styled path with a leading slash but no trailing slash.
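+// For example, CleanKey("a/b/c"), CleanKey("/a/b/c") and CleanKey("a/b/c/") all
+// return "/a/b/c" (see TestCleanKey).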
+func CleanKey(s string) string {
+ return path.Clean(paths.ToSlashPreserveLeading(s))
+}
diff --git a/cache/dynacache/dynacache_test.go b/cache/dynacache/dynacache_test.go
new file mode 100644
index 000000000..78b2fc82e
--- /dev/null
+++ b/cache/dynacache/dynacache_test.go
@@ -0,0 +1,230 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package dynacache
+
+import (
+ "errors"
+ "fmt"
+ "path/filepath"
+ "testing"
+ "time"
+
+ qt "github.com/frankban/quicktest"
+ "github.com/gohugoio/hugo/common/loggers"
+ "github.com/gohugoio/hugo/identity"
+ "github.com/gohugoio/hugo/resources/resource"
+)
+
+var (
+ _ resource.StaleInfo = (*testItem)(nil)
+ _ identity.Identity = (*testItem)(nil)
+)
+
+type testItem struct {
+ name string
+ staleVersion uint32
+}
+
+func (t testItem) StaleVersion() uint32 {
+ return t.staleVersion
+}
+
+func (t testItem) IdentifierBase() string {
+ return t.name
+}
+
+func TestCache(t *testing.T) {
+ t.Parallel()
+ c := qt.New(t)
+
+ cache := New(Options{
+ Log: loggers.NewDefault(),
+ })
+
+ c.Cleanup(func() {
+ cache.Stop()
+ })
+
+ opts := OptionsPartition{Weight: 30}
+
+ c.Assert(cache, qt.Not(qt.IsNil))
+
+ p1 := GetOrCreatePartition[string, testItem](cache, "/aaaa/bbbb", opts)
+ c.Assert(p1, qt.Not(qt.IsNil))
+
+ p2 := GetOrCreatePartition[string, testItem](cache, "/aaaa/bbbb", opts)
+
+ c.Assert(func() { GetOrCreatePartition[string, testItem](cache, "foo bar", opts) }, qt.PanicMatches, ".*invalid partition name.*")
+ c.Assert(func() { GetOrCreatePartition[string, testItem](cache, "/aaaa/cccc", OptionsPartition{Weight: 1234}) }, qt.PanicMatches, ".*invalid Weight.*")
+
+ c.Assert(p2, qt.Equals, p1)
+
+ p3 := GetOrCreatePartition[string, testItem](cache, "/aaaa/cccc", opts)
+ c.Assert(p3, qt.Not(qt.IsNil))
+ c.Assert(p3, qt.Not(qt.Equals), p1)
+
+ c.Assert(func() { New(Options{}) }, qt.PanicMatches, ".*nil Log.*")
+}
+
+func TestCalculateMaxSizePerPartition(t *testing.T) {
+ t.Parallel()
+ c := qt.New(t)
+
+ c.Assert(calculateMaxSizePerPartition(1000, 500, 5), qt.Equals, 200)
+ c.Assert(calculateMaxSizePerPartition(1000, 250, 5), qt.Equals, 400)
+ c.Assert(func() { calculateMaxSizePerPartition(1000, 250, 0) }, qt.PanicMatches, ".*must be > 0.*")
+ c.Assert(func() { calculateMaxSizePerPartition(1000, 0, 1) }, qt.PanicMatches, ".*must be > 0.*")
+}
+
+func TestCleanKey(t *testing.T) {
+ c := qt.New(t)
+
+ c.Assert(CleanKey("a/b/c"), qt.Equals, "/a/b/c")
+ c.Assert(CleanKey("/a/b/c"), qt.Equals, "/a/b/c")
+ c.Assert(CleanKey("a/b/c/"), qt.Equals, "/a/b/c")
+ c.Assert(CleanKey(filepath.FromSlash("/a/b/c/")), qt.Equals, "/a/b/c")
+}
+
+func newTestCache(t *testing.T) *Cache {
+ cache := New(
+ Options{
+ Log: loggers.NewDefault(),
+ },
+ )
+
+ p1 := GetOrCreatePartition[string, testItem](cache, "/aaaa/bbbb", OptionsPartition{Weight: 30, ClearWhen: ClearOnRebuild})
+ p2 := GetOrCreatePartition[string, testItem](cache, "/aaaa/cccc", OptionsPartition{Weight: 30, ClearWhen: ClearOnChange})
+
+ p1.GetOrCreate("clearOnRebuild", func(string) (testItem, error) {
+ return testItem{}, nil
+ })
+
+ p2.GetOrCreate("clearBecauseStale", func(string) (testItem, error) {
+ return testItem{
+ staleVersion: 32,
+ }, nil
+ })
+
+ p2.GetOrCreate("clearBecauseIdentityChanged", func(string) (testItem, error) {
+ return testItem{
+ name: "changed",
+ }, nil
+ })
+
+ p2.GetOrCreate("clearNever", func(string) (testItem, error) {
+ return testItem{
+ staleVersion: 0,
+ }, nil
+ })
+
+ t.Cleanup(func() {
+ cache.Stop()
+ })
+
+ return cache
+}
+
+func TestClear(t *testing.T) {
+ t.Parallel()
+ c := qt.New(t)
+
+ predicateAll := func(string) bool {
+ return true
+ }
+
+ cache := newTestCache(t)
+
+ c.Assert(cache.Keys(predicateAll), qt.HasLen, 4)
+
+ cache.ClearOnRebuild(nil)
+
+ // Stale items are always cleared.
+ c.Assert(cache.Keys(predicateAll), qt.HasLen, 2)
+
+ cache = newTestCache(t)
+ cache.ClearOnRebuild(nil, identity.StringIdentity("changed"))
+
+ c.Assert(cache.Keys(nil), qt.HasLen, 1)
+
+ cache = newTestCache(t)
+
+ cache.ClearMatching(nil, func(k, v any) bool {
+ return k.(string) == "clearOnRebuild"
+ })
+
+ c.Assert(cache.Keys(predicateAll), qt.HasLen, 3)
+
+ cache.adjustCurrentMaxSize()
+}
+
+func TestPanicInCreate(t *testing.T) {
+ t.Parallel()
+ c := qt.New(t)
+ cache := newTestCache(t)
+
+ p1 := GetOrCreatePartition[string, testItem](cache, "/aaaa/bbbb", OptionsPartition{Weight: 30, ClearWhen: ClearOnRebuild})
+
+ willPanic := func(i int) func() {
+ return func() {
+ p1.GetOrCreate(fmt.Sprintf("panic-%d", i), func(key string) (testItem, error) {
+ panic(errors.New(key))
+ })
+ }
+ }
+
+ // GetOrCreateWitTimeout needs to recover from panics in the create func.
+ willErr := func(i int) error {
+ _, err := p1.GetOrCreateWitTimeout(fmt.Sprintf("error-%d", i), 10*time.Second, func(key string) (testItem, error) {
+ return testItem{}, errors.New(key)
+ })
+ return err
+ }
+
+ for i := range 3 {
+ for range 3 {
+ c.Assert(willPanic(i), qt.PanicMatches, fmt.Sprintf("panic-%d", i))
+ c.Assert(willErr(i), qt.ErrorMatches, fmt.Sprintf("error-%d", i))
+ }
+ }
+
+ // Test the same keys again without the panic.
+ for i := range 3 {
+ for range 3 {
+ v, err := p1.GetOrCreate(fmt.Sprintf("panic-%d", i), func(key string) (testItem, error) {
+ return testItem{
+ name: key,
+ }, nil
+ })
+ c.Assert(err, qt.IsNil)
+ c.Assert(v.name, qt.Equals, fmt.Sprintf("panic-%d", i))
+
+ v, err = p1.GetOrCreateWitTimeout(fmt.Sprintf("error-%d", i), 10*time.Second, func(key string) (testItem, error) {
+ return testItem{
+ name: key,
+ }, nil
+ })
+ c.Assert(err, qt.IsNil)
+ c.Assert(v.name, qt.Equals, fmt.Sprintf("error-%d", i))
+ }
+ }
+}
+
+func TestAdjustCurrentMaxSize(t *testing.T) {
+ t.Parallel()
+ c := qt.New(t)
+ cache := newTestCache(t)
+ alloc := cache.stats.memstatsCurrent.Alloc
+ cache.adjustCurrentMaxSize()
+ c.Assert(cache.stats.memstatsCurrent.Alloc, qt.Not(qt.Equals), alloc)
+}
diff --git a/cache/filecache/filecache.go b/cache/filecache/filecache.go
index bc0573d52..01c466ca6 100644
--- a/cache/filecache/filecache.go
+++ b/cache/filecache/filecache.go
@@ -1,4 +1,4 @@
-// Copyright 2018 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -15,15 +15,17 @@ package filecache
import (
"bytes"
+ "errors"
"io"
- "io/ioutil"
"os"
"path/filepath"
"strings"
"sync"
"time"
+ "github.com/gohugoio/httpcache"
"github.com/gohugoio/hugo/common/hugio"
+ "github.com/gohugoio/hugo/hugofs"
"github.com/gohugoio/hugo/helpers"
@@ -31,8 +33,11 @@ import (
"github.com/spf13/afero"
)
+// ErrFatal can be used to signal an unrecoverable error.
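+// If the read callback passed to ReadOrCreate returns ErrFatal, the error is
+// propagated to the caller instead of being treated as a cache miss.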
+var ErrFatal = errors.New("fatal filecache error")
+
const (
- filecacheRootDirname = "filecache"
+ FilecacheRootDirname = "filecache"
)
// Cache caches a set of files in a directory. This is usually a file on
@@ -48,6 +53,9 @@ type Cache struct {
pruneAllRootDir string
nlocker *lockTracker
+
+ initOnce sync.Once
+ initErr error
}
type lockTracker struct {
@@ -100,9 +108,23 @@ func (l *lockedFile) Close() error {
return l.File.Close()
}
+func (c *Cache) init() error {
+ c.initOnce.Do(func() {
+ // Create the base dir if it does not exist.
+ if err := c.Fs.MkdirAll("", 0o777); err != nil && !os.IsExist(err) {
+ c.initErr = err
+ }
+ })
+ return c.initErr
+}
+
// WriteCloser returns a transactional writer into the cache.
// It's important that it's closed when done.
func (c *Cache) WriteCloser(id string) (ItemInfo, io.WriteCloser, error) {
+ if err := c.init(); err != nil {
+ return ItemInfo{}, nil, err
+ }
+
id = cleanID(id)
c.nlocker.Lock(id)
@@ -125,8 +147,13 @@ func (c *Cache) WriteCloser(id string) (ItemInfo, io.WriteCloser, error) {
// If not found a new file is created and passed to create, which should close
// it when done.
func (c *Cache) ReadOrCreate(id string,
- read func(info ItemInfo, r io.Reader) error,
- create func(info ItemInfo, w io.WriteCloser) error) (info ItemInfo, err error) {
+ read func(info ItemInfo, r io.ReadSeeker) error,
+ create func(info ItemInfo, w io.WriteCloser) error,
+) (info ItemInfo, err error) {
+ if err := c.init(); err != nil {
+ return ItemInfo{}, err
+ }
+
id = cleanID(id)
c.nlocker.Lock(id)
@@ -137,7 +164,13 @@ func (c *Cache) ReadOrCreate(id string,
if r := c.getOrRemove(id); r != nil {
err = read(info, r)
defer r.Close()
- return
+ if err == nil || err == ErrFatal {
+ // See https://github.com/gohugoio/hugo/issues/6401
+ // To recover from file corruption we treat read errors
+ // as if the cache item was not found.
+ // Any file permission issue will also fail in the next step.
+ return
+ }
}
f, err := helpers.OpenFileForWriting(c.Fs, id)
@@ -148,13 +181,24 @@ func (c *Cache) ReadOrCreate(id string,
err = create(info, f)
return
+}
+
+// NamedLock locks the given id. The lock is released when the returned function is called.
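+//
+// A typical usage sketch:
+//
+//	unlock := c.NamedLock("myid")
+//	defer unlock()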
+func (c *Cache) NamedLock(id string) func() {
+ id = cleanID(id)
+ c.nlocker.Lock(id)
+ return func() {
+ c.nlocker.Unlock(id)
+ }
}
// GetOrCreate tries to get the file with the given id from cache. If not found or expired, create will
// be invoked and the result cached.
// This method is protected by a named lock using the given id as identifier.
func (c *Cache) GetOrCreate(id string, create func() (io.ReadCloser, error)) (ItemInfo, io.ReadCloser, error) {
+ if err := c.init(); err != nil {
+ return ItemInfo{}, nil, err
+ }
id = cleanID(id)
c.nlocker.Lock(id)
@@ -166,7 +210,12 @@ func (c *Cache) GetOrCreate(id string, create func() (io.ReadCloser, error)) (It
return info, r, nil
}
- r, err := create()
+ var (
+ r io.ReadCloser
+ err error
+ )
+
+ r, err = create()
if err != nil {
return info, nil, err
}
@@ -179,11 +228,30 @@ func (c *Cache) GetOrCreate(id string, create func() (io.ReadCloser, error)) (It
var buff bytes.Buffer
return info,
hugio.ToReadCloser(&buff),
- afero.WriteReader(c.Fs, id, io.TeeReader(r, &buff))
+ c.writeReader(id, io.TeeReader(r, &buff))
+}
+
+func (c *Cache) writeReader(id string, r io.Reader) error {
+ dir := filepath.Dir(id)
+ if dir != "" {
+ _ = c.Fs.MkdirAll(dir, 0o777)
+ }
+ f, err := c.Fs.Create(id)
+ if err != nil {
+ return err
+ }
+ defer f.Close()
+
+ _, _ = io.Copy(f, r)
+
+ return nil
}
// GetOrCreateBytes is the same as GetOrCreate, but produces a byte slice.
func (c *Cache) GetOrCreateBytes(id string, create func() ([]byte, error)) (ItemInfo, []byte, error) {
+ if err := c.init(); err != nil {
+ return ItemInfo{}, nil, err
+ }
id = cleanID(id)
c.nlocker.Lock(id)
@@ -193,11 +261,16 @@ func (c *Cache) GetOrCreateBytes(id string, create func() ([]byte, error)) (Item
if r := c.getOrRemove(id); r != nil {
defer r.Close()
- b, err := ioutil.ReadAll(r)
+ b, err := io.ReadAll(r)
return info, b, err
}
- b, err := create()
+ var (
+ b []byte
+ err error
+ )
+
+ b, err = create()
if err != nil {
return info, nil, err
}
@@ -206,15 +279,18 @@ func (c *Cache) GetOrCreateBytes(id string, create func() ([]byte, error)) (Item
return info, b, nil
}
- if err := afero.WriteReader(c.Fs, id, bytes.NewReader(b)); err != nil {
+ if err := c.writeReader(id, bytes.NewReader(b)); err != nil {
return info, nil, err
}
- return info, b, nil
+ return info, b, nil
}
-// GetBytes gets the file content with the given id from the cahce, nil if none found.
+// GetBytes gets the file content with the given id from the cache, nil if none found.
func (c *Cache) GetBytes(id string) (ItemInfo, []byte, error) {
+ if err := c.init(); err != nil {
+ return ItemInfo{}, nil, err
+ }
id = cleanID(id)
c.nlocker.Lock(id)
@@ -224,15 +300,18 @@ func (c *Cache) GetBytes(id string) (ItemInfo, []byte, error) {
if r := c.getOrRemove(id); r != nil {
defer r.Close()
- b, err := ioutil.ReadAll(r)
+ b, err := io.ReadAll(r)
return info, b, err
}
return info, nil, nil
}
-// Get gets the file with the given id from the cahce, nil if none found.
+// Get gets the file with the given id from the cache, nil if none found.
func (c *Cache) Get(id string) (ItemInfo, io.ReadCloser, error) {
+ if err := c.init(); err != nil {
+ return ItemInfo{}, nil, err
+ }
id = cleanID(id)
c.nlocker.Lock(id)
@@ -253,20 +332,11 @@ func (c *Cache) getOrRemove(id string) hugio.ReadSeekCloser {
return nil
}
- if c.maxAge > 0 {
- fi, err := c.Fs.Stat(id)
- if err != nil {
- return nil
- }
-
- if c.isExpired(fi.ModTime()) {
- c.Fs.Remove(id)
- return nil
- }
+ if removed, err := c.removeIfExpired(id); err != nil || removed {
+ return nil
}
f, err := c.Fs.Open(id)
-
if err != nil {
return nil
}
@@ -274,30 +344,74 @@ func (c *Cache) getOrRemove(id string) hugio.ReadSeekCloser {
return f
}
+func (c *Cache) getBytesAndRemoveIfExpired(id string) ([]byte, bool) {
+ if c.maxAge == 0 {
+ // No caching.
+ return nil, false
+ }
+
+ f, err := c.Fs.Open(id)
+ if err != nil {
+ return nil, false
+ }
+ defer f.Close()
+
+ b, err := io.ReadAll(f)
+ if err != nil {
+ return nil, false
+ }
+
+ removed, err := c.removeIfExpired(id)
+ if err != nil {
+ return nil, false
+ }
+
+ return b, removed
+}
+
+func (c *Cache) removeIfExpired(id string) (bool, error) {
+ if c.maxAge <= 0 {
+ return false, nil
+ }
+
+ fi, err := c.Fs.Stat(id)
+ if err != nil {
+ return false, err
+ }
+
+ if c.isExpired(fi.ModTime()) {
+ c.Fs.Remove(id)
+ return true, nil
+ }
+
+ return false, nil
+}
+
func (c *Cache) isExpired(modTime time.Time) bool {
if c.maxAge < 0 {
return false
}
+
+ // Note the use of time.Since here.
+ // We cannot use Hugo's global Clock for this.
return c.maxAge == 0 || time.Since(modTime) > c.maxAge
}
// For testing
-func (c *Cache) getString(id string) string {
+func (c *Cache) GetString(id string) string {
id = cleanID(id)
c.nlocker.Lock(id)
defer c.nlocker.Unlock(id)
f, err := c.Fs.Open(id)
-
if err != nil {
return ""
}
defer f.Close()
- b, _ := ioutil.ReadAll(f)
+ b, _ := io.ReadAll(f)
return string(b)
-
}
// Caches is a named set of caches.
@@ -311,47 +425,29 @@ func (f Caches) Get(name string) *Cache {
// NewCaches creates a new set of file caches from the given
// configuration.
func NewCaches(p *helpers.PathSpec) (Caches, error) {
- var dcfg Configs
- if c, ok := p.Cfg.Get("filecacheConfigs").(Configs); ok {
- dcfg = c
- } else {
- var err error
- dcfg, err = DecodeConfig(p.Fs.Source, p.Cfg)
- if err != nil {
- return nil, err
- }
- }
-
+ dcfg := p.Cfg.GetConfigSection("caches").(Configs)
fs := p.Fs.Source
m := make(Caches)
for k, v := range dcfg {
var cfs afero.Fs
- if v.isResourceDir {
+ if v.IsResourceDir {
cfs = p.BaseFs.ResourcesCache
} else {
cfs = fs
}
if cfs == nil {
- // TODO(bep) we still have some places that do not initialize the
- // full dependencies of a site, e.g. the import Jekyll command.
- // That command does not need these caches, so let us just continue
- // for now.
- continue
+ panic("nil fs")
}
- baseDir := v.Dir
+ baseDir := v.DirCompiled
- if err := cfs.MkdirAll(baseDir, 0777); err != nil && !os.IsExist(err) {
- return nil, err
- }
-
- bfs := afero.NewBasePathFs(cfs, baseDir)
+ bfs := hugofs.NewBasePathFs(cfs, baseDir)
var pruneAllRootDir string
- if k == cacheKeyModules {
+ if k == CacheKeyModules {
pruneAllRootDir = "pkg"
}
@@ -364,3 +460,37 @@ func NewCaches(p *helpers.PathSpec) (Caches, error) {
func cleanID(name string) string {
return strings.TrimPrefix(filepath.Clean(name), helpers.FilePathSeparator)
}
+
+// AsHTTPCache returns an httpcache.Cache implementation for this file cache.
+// Note that none of the methods are protected by named locks, so you need to make sure
+// to do that in your own code.
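+//
+// A wiring sketch, assuming the gohugoio/httpcache fork keeps the upstream
+// NewTransport API:
+//
+//	transport := httpcache.NewTransport(c.AsHTTPCache())
+//	client := &http.Client{Transport: transport}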
+func (c *Cache) AsHTTPCache() httpcache.Cache {
+ return &httpCache{c: c}
+}
+
+type httpCache struct {
+ c *Cache
+}
+
+func (h *httpCache) Get(id string) (resp []byte, ok bool) {
+ id = cleanID(id)
+ b, removed := h.c.getBytesAndRemoveIfExpired(id)
+
+ return b, !removed
+}
+
+func (h *httpCache) Set(id string, resp []byte) {
+ if h.c.maxAge == 0 {
+ return
+ }
+
+ id = cleanID(id)
+
+ if err := h.c.writeReader(id, bytes.NewReader(resp)); err != nil {
+ panic(err)
+ }
+}
+
+func (h *httpCache) Delete(key string) {
+ h.c.Fs.Remove(key)
+}
diff --git a/cache/filecache/filecache_config.go b/cache/filecache/filecache_config.go
index 0c6b569c1..a71ddb474 100644
--- a/cache/filecache/filecache_config.go
+++ b/cache/filecache/filecache_config.go
@@ -11,105 +11,131 @@
// See the License for the specific language governing permissions and
// limitations under the License.
+// Package filecache provides a file based cache for Hugo.
package filecache
import (
+ "errors"
+ "fmt"
"path"
"path/filepath"
"strings"
"time"
+ "github.com/gohugoio/hugo/common/maps"
"github.com/gohugoio/hugo/config"
- "github.com/gohugoio/hugo/helpers"
-
"github.com/mitchellh/mapstructure"
- "github.com/pkg/errors"
"github.com/spf13/afero"
)
const (
- cachesConfigKey = "caches"
-
resourcesGenDir = ":resourceDir/_gen"
+ cacheDirProject = ":cacheDir/:project"
)
-var defaultCacheConfig = Config{
+var defaultCacheConfig = FileCacheConfig{
MaxAge: -1, // Never expire
- Dir: ":cacheDir/:project",
+ Dir: cacheDirProject,
}
const (
- cacheKeyGetJSON = "getjson"
- cacheKeyGetCSV = "getcsv"
- cacheKeyImages = "images"
- cacheKeyAssets = "assets"
- cacheKeyModules = "modules"
+ CacheKeyGetJSON = "getjson"
+ CacheKeyGetCSV = "getcsv"
+ CacheKeyImages = "images"
+ CacheKeyAssets = "assets"
+ CacheKeyModules = "modules"
+ CacheKeyGetResource = "getresource"
+ CacheKeyMisc = "misc"
)
-type Configs map[string]Config
+type Configs map[string]FileCacheConfig
+// For internal use.
func (c Configs) CacheDirModules() string {
- return c[cacheKeyModules].Dir
+ return c[CacheKeyModules].DirCompiled
}
var defaultCacheConfigs = Configs{
- cacheKeyModules: {
+ CacheKeyModules: {
MaxAge: -1,
Dir: ":cacheDir/modules",
},
- cacheKeyGetJSON: defaultCacheConfig,
- cacheKeyGetCSV: defaultCacheConfig,
- cacheKeyImages: {
+ CacheKeyGetJSON: defaultCacheConfig,
+ CacheKeyGetCSV: defaultCacheConfig,
+ CacheKeyImages: {
MaxAge: -1,
Dir: resourcesGenDir,
},
- cacheKeyAssets: {
+ CacheKeyAssets: {
MaxAge: -1,
Dir: resourcesGenDir,
},
+ CacheKeyGetResource: {
+ MaxAge: -1, // Never expire
+ Dir: cacheDirProject,
+ },
+ CacheKeyMisc: {
+ MaxAge: -1,
+ Dir: cacheDirProject,
+ },
}
-type Config struct {
+type FileCacheConfig struct {
// Max age of cache entries in this cache. Any items older than this will
// be removed and not returned from the cache.
- // a negative value means forever, 0 means cache is disabled.
+ // A negative value means forever, 0 means cache is disabled.
+ // Hugo is lenient with what types it accepts here, but we recommend using
+ // a duration string, a sequence of decimal numbers, each with optional fraction and a unit suffix,
+ // such as "300ms", "1.5h" or "2h45m".
+ // Valid time units are "ns", "us" (or "µs"), "ms", "s", "m", "h".
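+ //
+ // For example, in the site configuration (a sketch; the values are illustrative):
+ //
+ //	[caches.getjson]
+ //	maxAge = "24h"
+ //	dir = ":cacheDir/:project"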
MaxAge time.Duration
// The directory where files are stored.
- Dir string
+ Dir string
+ DirCompiled string `json:"-"`
// Whether resources/_gen will get its own composite filesystem that
// also checks any theme.
- isResourceDir bool
+ IsResourceDir bool `json:"-"`
}
// GetJSONCache gets the file cache for getJSON.
func (f Caches) GetJSONCache() *Cache {
- return f[cacheKeyGetJSON]
+ return f[CacheKeyGetJSON]
}
// GetCSVCache gets the file cache for getCSV.
func (f Caches) GetCSVCache() *Cache {
- return f[cacheKeyGetCSV]
+ return f[CacheKeyGetCSV]
}
// ImageCache gets the file cache for processed images.
func (f Caches) ImageCache() *Cache {
- return f[cacheKeyImages]
+ return f[CacheKeyImages]
}
// ModulesCache gets the file cache for Hugo Modules.
func (f Caches) ModulesCache() *Cache {
- return f[cacheKeyModules]
+ return f[CacheKeyModules]
}
// AssetsCache gets the file cache for assets (processed resources, SCSS etc.).
func (f Caches) AssetsCache() *Cache {
- return f[cacheKeyAssets]
+ return f[CacheKeyAssets]
}
-func DecodeConfig(fs afero.Fs, cfg config.Provider) (Configs, error) {
+// MiscCache gets the file cache for miscellaneous stuff.
+func (f Caches) MiscCache() *Cache {
+ return f[CacheKeyMisc]
+}
+
+// GetResourceCache gets the file cache for remote resources.
+func (f Caches) GetResourceCache() *Cache {
+ return f[CacheKeyGetResource]
+}
+
+func DecodeConfig(fs afero.Fs, bcfg config.BaseConfig, m map[string]any) (Configs, error) {
c := make(Configs)
valid := make(map[string]bool)
// Add defaults
@@ -118,11 +144,12 @@ func DecodeConfig(fs afero.Fs, cfg config.Provider) (Configs, error) {
valid[k] = true
}
- m := cfg.GetStringMap(cachesConfigKey)
-
_, isOsFs := fs.(*afero.OsFs)
for k, v := range m {
+ if _, ok := v.(maps.Params); !ok {
+ continue
+ }
cc := defaultCacheConfig
dc := &mapstructure.DecoderConfig{
@@ -137,7 +164,7 @@ func DecodeConfig(fs afero.Fs, cfg config.Provider) (Configs, error) {
}
if err := decoder.Decode(v); err != nil {
- return nil, err
+ return nil, fmt.Errorf("failed to decode filecache config: %w", err)
}
if cc.Dir == "" {
@@ -146,15 +173,12 @@ func DecodeConfig(fs afero.Fs, cfg config.Provider) (Configs, error) {
name := strings.ToLower(k)
if !valid[name] {
- return nil, errors.Errorf("%q is not a valid cache name", name)
+ return nil, fmt.Errorf("%q is not a valid cache name", name)
}
c[name] = cc
}
- // This is a very old flag in Hugo, but we need to respect it.
- disabled := cfg.GetBool("ignoreCache")
-
for k, v := range c {
dir := filepath.ToSlash(filepath.Clean(v.Dir))
hadSlash := strings.HasPrefix(dir, "/")
@@ -162,12 +186,12 @@ func DecodeConfig(fs afero.Fs, cfg config.Provider) (Configs, error) {
for i, part := range parts {
if strings.HasPrefix(part, ":") {
- resolved, isResource, err := resolveDirPlaceholder(fs, cfg, part)
+ resolved, isResource, err := resolveDirPlaceholder(fs, bcfg, part)
if err != nil {
return c, err
}
if isResource {
- v.isResourceDir = true
+ v.IsResourceDir = true
}
parts[i] = resolved
}
@@ -177,33 +201,29 @@ func DecodeConfig(fs afero.Fs, cfg config.Provider) (Configs, error) {
if hadSlash {
dir = "/" + dir
}
- v.Dir = filepath.Clean(filepath.FromSlash(dir))
+ v.DirCompiled = filepath.Clean(filepath.FromSlash(dir))
- if !v.isResourceDir {
- if isOsFs && !filepath.IsAbs(v.Dir) {
- return c, errors.Errorf("%q must resolve to an absolute directory", v.Dir)
+ if !v.IsResourceDir {
+ if isOsFs && !filepath.IsAbs(v.DirCompiled) {
+ return c, fmt.Errorf("%q must resolve to an absolute directory", v.DirCompiled)
}
// Avoid cache in root, e.g. / (Unix) or c:\ (Windows)
- if len(strings.TrimPrefix(v.Dir, filepath.VolumeName(v.Dir))) == 1 {
- return c, errors.Errorf("%q is a root folder and not allowed as cache dir", v.Dir)
+ if len(strings.TrimPrefix(v.DirCompiled, filepath.VolumeName(v.DirCompiled))) == 1 {
+ return c, fmt.Errorf("%q is a root folder and not allowed as cache dir", v.DirCompiled)
}
}
- if !strings.HasPrefix(v.Dir, "_gen") {
+ if !strings.HasPrefix(v.DirCompiled, "_gen") {
// We do cache eviction (file removes) and since the user can set
// their own cache directory, we really want to make sure
// we do not delete any files that do not belong to this cache.
// We do add the cache name as the root, but this is an extra safe
// guard. We skip the files inside /resources/_gen/ because
// that would be breaking.
- v.Dir = filepath.Join(v.Dir, filecacheRootDirname, k)
+ v.DirCompiled = filepath.Join(v.DirCompiled, FilecacheRootDirname, k)
} else {
- v.Dir = filepath.Join(v.Dir, k)
- }
-
- if disabled {
- v.MaxAge = 0
+ v.DirCompiled = filepath.Join(v.DirCompiled, k)
}
c[k] = v
@@ -213,18 +233,15 @@ func DecodeConfig(fs afero.Fs, cfg config.Provider) (Configs, error) {
}
// Resolves :resourceDir => /myproject/resources etc., :cacheDir => ...
-func resolveDirPlaceholder(fs afero.Fs, cfg config.Provider, placeholder string) (cacheDir string, isResource bool, err error) {
- workingDir := cfg.GetString("workingDir")
-
+func resolveDirPlaceholder(fs afero.Fs, bcfg config.BaseConfig, placeholder string) (cacheDir string, isResource bool, err error) {
switch strings.ToLower(placeholder) {
case ":resourcedir":
return "", true, nil
case ":cachedir":
- d, err := helpers.GetCacheDir(fs, cfg)
- return d, false, err
+ return bcfg.CacheDir, false, nil
case ":project":
- return filepath.Base(workingDir), false, nil
+ return filepath.Base(bcfg.WorkingDir), false, nil
}
- return "", false, errors.Errorf("%q is not a valid placeholder (valid values are :cacheDir or :resourceDir)", placeholder)
+ return "", false, fmt.Errorf("%q is not a valid placeholder (valid values are :cacheDir or :resourceDir)", placeholder)
}
diff --git a/cache/filecache/filecache_config_test.go b/cache/filecache/filecache_config_test.go
index 9f80a4f90..c6d346dfc 100644
--- a/cache/filecache/filecache_config_test.go
+++ b/cache/filecache/filecache_config_test.go
@@ -11,21 +11,21 @@
// See the License for the specific language governing permissions and
// limitations under the License.
-package filecache
+package filecache_test
import (
"path/filepath"
"runtime"
- "strings"
"testing"
"time"
"github.com/spf13/afero"
+ "github.com/gohugoio/hugo/cache/filecache"
"github.com/gohugoio/hugo/config"
+ "github.com/gohugoio/hugo/config/testconfig"
qt "github.com/frankban/quicktest"
- "github.com/spf13/viper"
)
func TestDecodeConfig(t *testing.T) {
@@ -51,25 +51,27 @@ maxAge = "11h"
dir = "/path/to/c2"
[caches.images]
dir = "/path/to/c3"
-
+[caches.getResource]
+dir = "/path/to/c4"
`
cfg, err := config.FromConfigString(configStr, "toml")
c.Assert(err, qt.IsNil)
fs := afero.NewMemMapFs()
- decoded, err := DecodeConfig(fs, cfg)
- c.Assert(err, qt.IsNil)
-
- c.Assert(len(decoded), qt.Equals, 5)
+ decoded := testconfig.GetTestConfigs(fs, cfg).Base.Caches
+ c.Assert(len(decoded), qt.Equals, 7)
c2 := decoded["getcsv"]
c.Assert(c2.MaxAge.String(), qt.Equals, "11h0m0s")
- c.Assert(c2.Dir, qt.Equals, filepath.FromSlash("/path/to/c2/filecache/getcsv"))
+ c.Assert(c2.DirCompiled, qt.Equals, filepath.FromSlash("/path/to/c2/filecache/getcsv"))
c3 := decoded["images"]
c.Assert(c3.MaxAge, qt.Equals, time.Duration(-1))
- c.Assert(c3.Dir, qt.Equals, filepath.FromSlash("/path/to/c3/filecache/images"))
+ c.Assert(c3.DirCompiled, qt.Equals, filepath.FromSlash("/path/to/c3/filecache/images"))
+ c4 := decoded["getresource"]
+ c.Assert(c4.MaxAge, qt.Equals, time.Duration(-1))
+ c.Assert(c4.DirCompiled, qt.Equals, filepath.FromSlash("/path/to/c4/filecache/getresource"))
}
func TestDecodeConfigIgnoreCache(t *testing.T) {
@@ -96,26 +98,24 @@ maxAge = 3456
dir = "/path/to/c2"
[caches.images]
dir = "/path/to/c3"
-
+[caches.getResource]
+dir = "/path/to/c4"
`
cfg, err := config.FromConfigString(configStr, "toml")
c.Assert(err, qt.IsNil)
fs := afero.NewMemMapFs()
- decoded, err := DecodeConfig(fs, cfg)
- c.Assert(err, qt.IsNil)
-
- c.Assert(len(decoded), qt.Equals, 5)
+ decoded := testconfig.GetTestConfigs(fs, cfg).Base.Caches
+ c.Assert(len(decoded), qt.Equals, 7)
for _, v := range decoded {
c.Assert(v.MaxAge, qt.Equals, time.Duration(0))
}
-
}
func TestDecodeConfigDefault(t *testing.T) {
c := qt.New(t)
- cfg := newTestConfig()
+ cfg := config.New()
if runtime.GOOS == "windows" {
cfg.Set("resourceDir", "c:\\cache\\resources")
@@ -125,72 +125,22 @@ func TestDecodeConfigDefault(t *testing.T) {
cfg.Set("resourceDir", "/cache/resources")
cfg.Set("cacheDir", "/cache/thecache")
}
-
- fs := afero.NewMemMapFs()
-
- decoded, err := DecodeConfig(fs, cfg)
-
- c.Assert(err, qt.IsNil)
-
- c.Assert(len(decoded), qt.Equals, 5)
-
- imgConfig := decoded[cacheKeyImages]
- jsonConfig := decoded[cacheKeyGetJSON]
-
- if runtime.GOOS == "windows" {
- c.Assert(imgConfig.Dir, qt.Equals, filepath.FromSlash("_gen/images"))
- } else {
- c.Assert(imgConfig.Dir, qt.Equals, "_gen/images")
- c.Assert(jsonConfig.Dir, qt.Equals, "/cache/thecache/hugoproject/filecache/getjson")
- }
-
- c.Assert(imgConfig.isResourceDir, qt.Equals, true)
- c.Assert(jsonConfig.isResourceDir, qt.Equals, false)
-}
-
-func TestDecodeConfigInvalidDir(t *testing.T) {
- t.Parallel()
-
- c := qt.New(t)
-
- configStr := `
-resourceDir = "myresources"
-contentDir = "content"
-dataDir = "data"
-i18nDir = "i18n"
-layoutDir = "layouts"
-assetDir = "assets"
-archeTypedir = "archetypes"
-
-[caches]
-[caches.getJSON]
-maxAge = "10m"
-dir = "/"
-
-`
- if runtime.GOOS == "windows" {
- configStr = strings.Replace(configStr, "/", "c:\\\\", 1)
- }
-
- cfg, err := config.FromConfigString(configStr, "toml")
- c.Assert(err, qt.IsNil)
- fs := afero.NewMemMapFs()
-
- _, err = DecodeConfig(fs, cfg)
- c.Assert(err, qt.Not(qt.IsNil))
-
-}
-
-func newTestConfig() *viper.Viper {
- cfg := viper.New()
cfg.Set("workingDir", filepath.FromSlash("/my/cool/hugoproject"))
- cfg.Set("contentDir", "content")
- cfg.Set("dataDir", "data")
- cfg.Set("resourceDir", "resources")
- cfg.Set("i18nDir", "i18n")
- cfg.Set("layoutDir", "layouts")
- cfg.Set("archetypeDir", "archetypes")
- cfg.Set("assetDir", "assets")
- return cfg
+ fs := afero.NewMemMapFs()
+ decoded := testconfig.GetTestConfigs(fs, cfg).Base.Caches
+ c.Assert(len(decoded), qt.Equals, 7)
+
+ imgConfig := decoded[filecache.CacheKeyImages]
+ jsonConfig := decoded[filecache.CacheKeyGetJSON]
+
+ if runtime.GOOS == "windows" {
+ c.Assert(imgConfig.DirCompiled, qt.Equals, filepath.FromSlash("_gen/images"))
+ } else {
+ c.Assert(imgConfig.DirCompiled, qt.Equals, "_gen/images")
+ c.Assert(jsonConfig.DirCompiled, qt.Equals, "/cache/thecache/hugoproject/filecache/getjson")
+ }
+
+ c.Assert(imgConfig.IsResourceDir, qt.Equals, true)
+ c.Assert(jsonConfig.IsResourceDir, qt.Equals, false)
}
diff --git a/cache/filecache/filecache_integration_test.go b/cache/filecache/filecache_integration_test.go
new file mode 100644
index 000000000..1e920c29f
--- /dev/null
+++ b/cache/filecache/filecache_integration_test.go
@@ -0,0 +1,106 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package filecache_test
+
+import (
+ "path/filepath"
+ "testing"
+ "time"
+
+ "github.com/bep/logg"
+ qt "github.com/frankban/quicktest"
+ "github.com/gohugoio/hugo/htesting"
+ "github.com/gohugoio/hugo/hugolib"
+)
+
+// See issue #10781. That issue wouldn't have been triggered if we had kept
+// the empty root directories (e.g. resources/_gen/images).
+// It's still an upstream Go issue that we also need to handle, but
+// this is a test for the first part.
+func TestPruneShouldPreserveEmptyCacheRoots(t *testing.T) {
+ files := `
+-- hugo.toml --
+baseURL = "https://example.com"
+-- content/_index.md --
+---
+title: "Home"
+---
+
+`
+
+ b := hugolib.NewIntegrationTestBuilder(
+ hugolib.IntegrationTestConfig{T: t, TxtarString: files, RunGC: true, NeedsOsFS: true},
+ ).Build()
+
+ _, err := b.H.BaseFs.ResourcesCache.Stat(filepath.Join("_gen", "images"))
+
+ b.Assert(err, qt.IsNil)
+}
+
+func TestPruneImages(t *testing.T) {
+ if htesting.IsCI() {
+ // TODO(bep)
+ t.Skip("skip flaky test on CI server")
+ }
+ t.Skip("skip flaky test")
+ files := `
+-- hugo.toml --
+baseURL = "https://example.com"
+[caches]
+[caches.images]
+maxAge = "200ms"
+dir = ":resourceDir/_gen"
+-- content/_index.md --
+---
+title: "Home"
+---
+-- assets/a/pixel.png --
+iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAADUlEQVR42mNkYPhfDwAChwGA60e6kgAAAABJRU5ErkJggg==
+-- layouts/index.html --
+{{ warnf "HOME!" }}
+{{ $img := resources.GetMatch "**.png" }}
+{{ $img = $img.Resize "3x3" }}
+{{ $img.RelPermalink }}
+
+
+
+`
+
+ b := hugolib.NewIntegrationTestBuilder(
+ hugolib.IntegrationTestConfig{T: t, TxtarString: files, Running: true, RunGC: true, NeedsOsFS: true, LogLevel: logg.LevelInfo},
+ ).Build()
+
+ b.Assert(b.GCCount, qt.Equals, 0)
+ b.Assert(b.H, qt.IsNotNil)
+
+ imagesCacheDir := filepath.Join("_gen", "images")
+ _, err := b.H.BaseFs.ResourcesCache.Stat(imagesCacheDir)
+
+ b.Assert(err, qt.IsNil)
+
+ // TODO(bep) we need a way to test full rebuilds.
+ // For now, just sleep a little so the cache elements expire.
+ time.Sleep(500 * time.Millisecond)
+
+ b.RenameFile("assets/a/pixel.png", "assets/b/pixel2.png").Build()
+
+ b.Assert(b.GCCount, qt.Equals, 1)
+ // Build it again to GC the empty a dir.
+ b.Build()
+
+ _, err = b.H.BaseFs.ResourcesCache.Stat(filepath.Join(imagesCacheDir, "a"))
+ b.Assert(err, qt.Not(qt.IsNil))
+ _, err = b.H.BaseFs.ResourcesCache.Stat(imagesCacheDir)
+ b.Assert(err, qt.IsNil)
+}
diff --git a/cache/filecache/filecache_pruner.go b/cache/filecache/filecache_pruner.go
index c6fd4497e..6f224cef4 100644
--- a/cache/filecache/filecache_pruner.go
+++ b/cache/filecache/filecache_pruner.go
@@ -14,10 +14,13 @@
package filecache
import (
+ "fmt"
"io"
"os"
- "github.com/pkg/errors"
+ "github.com/gohugoio/hugo/common/herrors"
+ "github.com/gohugoio/hugo/hugofs"
+
"github.com/spf13/afero"
)
@@ -28,15 +31,17 @@ import (
func (c Caches) Prune() (int, error) {
counter := 0
for k, cache := range c {
-
count, err := cache.Prune(false)
- if err != nil {
- return counter, errors.Wrapf(err, "failed to prune cache %q", k)
- }
-
counter += count
+ if err != nil {
+ if herrors.IsNotExist(err) {
+ continue
+ }
+ return counter, fmt.Errorf("failed to prune cache %q: %w", k, err)
+ }
+
}
return counter, nil
@@ -48,6 +53,9 @@ func (c *Cache) Prune(force bool) (int, error) {
if c.pruneAllRootDir != "" {
return c.pruneRootDir(force)
}
+ if err := c.init(); err != nil {
+ return 0, err
+ }
counter := 0
@@ -64,11 +72,20 @@ func (c *Cache) Prune(force bool) (int, error) {
// This cache dir may not exist.
return nil
}
- defer f.Close()
_, err = f.Readdirnames(1)
+ f.Close()
if err == io.EOF {
// Empty dir.
- return c.Fs.Remove(name)
+ if name == "." {
+ // e.g. /_gen/images -- keep it even if empty.
+ err = nil
+ } else {
+ err = c.Fs.Remove(name)
+ }
+ }
+
+ if err != nil && !herrors.IsNotExist(err) {
+ return err
}
return nil
@@ -87,7 +104,11 @@ func (c *Cache) Prune(force bool) (int, error) {
if err == nil {
counter++
}
- return err
+
+ if err != nil && !herrors.IsNotExist(err) {
+ return err
+ }
+
}
return nil
@@ -97,10 +118,12 @@ func (c *Cache) Prune(force bool) (int, error) {
}
func (c *Cache) pruneRootDir(force bool) (int, error) {
-
+ if err := c.init(); err != nil {
+ return 0, err
+ }
info, err := c.Fs.Stat(c.pruneAllRootDir)
if err != nil {
- if os.IsNotExist(err) {
+ if herrors.IsNotExist(err) {
return 0, nil
}
return 0, err
@@ -110,18 +133,5 @@ func (c *Cache) pruneRootDir(force bool) (int, error) {
return 0, nil
}
- counter := 0
- // Module cache has 0555 directories; make them writable in order to remove content.
- afero.Walk(c.Fs, c.pruneAllRootDir, func(path string, info os.FileInfo, err error) error {
- if err != nil {
- return nil
- }
- if info.IsDir() {
- counter++
- c.Fs.Chmod(path, 0777)
- }
- return nil
- })
- return 1, c.Fs.RemoveAll(c.pruneAllRootDir)
-
+ return hugofs.MakeReadableAndRemoveAllModulePkgDir(c.Fs, c.pruneAllRootDir)
}
diff --git a/cache/filecache/filecache_pruner_test.go b/cache/filecache/filecache_pruner_test.go
index 48bce723e..b49ba7645 100644
--- a/cache/filecache/filecache_pruner_test.go
+++ b/cache/filecache/filecache_pruner_test.go
@@ -11,13 +11,14 @@
// See the License for the specific language governing permissions and
// limitations under the License.
-package filecache
+package filecache_test
import (
"fmt"
"testing"
"time"
+ "github.com/gohugoio/hugo/cache/filecache"
"github.com/spf13/afero"
qt "github.com/frankban/quicktest"
@@ -52,13 +53,13 @@ maxAge = "200ms"
dir = ":resourceDir/_gen"
`
- for _, name := range []string{cacheKeyGetCSV, cacheKeyGetJSON, cacheKeyAssets, cacheKeyImages} {
+ for _, name := range []string{filecache.CacheKeyGetCSV, filecache.CacheKeyGetJSON, filecache.CacheKeyAssets, filecache.CacheKeyImages} {
msg := qt.Commentf("cache: %s", name)
p := newPathsSpec(t, afero.NewMemMapFs(), configStr)
- caches, err := NewCaches(p)
+ caches, err := filecache.NewCaches(p)
c.Assert(err, qt.IsNil)
cache := caches[name]
- for i := 0; i < 10; i++ {
+ for i := range 10 {
id := fmt.Sprintf("i%d", i)
cache.GetOrCreateBytes(id, func() ([]byte, error) {
return []byte("abc"), nil
@@ -73,9 +74,9 @@ dir = ":resourceDir/_gen"
c.Assert(err, qt.IsNil)
c.Assert(count, qt.Equals, 5, msg)
- for i := 0; i < 10; i++ {
+ for i := range 10 {
id := fmt.Sprintf("i%d", i)
- v := cache.getString(id)
+ v := cache.GetString(id)
if i < 5 {
c.Assert(v, qt.Equals, "")
} else {
@@ -83,7 +84,7 @@ dir = ":resourceDir/_gen"
}
}
- caches, err = NewCaches(p)
+ caches, err = filecache.NewCaches(p)
c.Assert(err, qt.IsNil)
cache = caches[name]
// Touch one and then prune.
@@ -96,9 +97,9 @@ dir = ":resourceDir/_gen"
c.Assert(count, qt.Equals, 4)
// Now only the i5 should be left.
- for i := 0; i < 10; i++ {
+ for i := range 10 {
id := fmt.Sprintf("i%d", i)
- v := cache.getString(id)
+ v := cache.GetString(id)
if i != 5 {
c.Assert(v, qt.Equals, "")
} else {
@@ -107,5 +108,4 @@ dir = ":resourceDir/_gen"
}
}
-
}
diff --git a/cache/filecache/filecache_test.go b/cache/filecache/filecache_test.go
index 6d3ea6289..a30aaa50b 100644
--- a/cache/filecache/filecache_test.go
+++ b/cache/filecache/filecache_test.go
@@ -1,4 +1,4 @@
-// Copyright 2018 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -11,24 +11,21 @@
// See the License for the specific language governing permissions and
// limitations under the License.
-package filecache
+package filecache_test
import (
+ "errors"
"fmt"
"io"
- "io/ioutil"
- "os"
- "path/filepath"
"strings"
"sync"
"testing"
"time"
- "github.com/gohugoio/hugo/langs"
- "github.com/gohugoio/hugo/modules"
-
+ "github.com/gohugoio/hugo/cache/filecache"
"github.com/gohugoio/hugo/common/hugio"
"github.com/gohugoio/hugo/config"
+ "github.com/gohugoio/hugo/config/testconfig"
"github.com/gohugoio/hugo/helpers"
"github.com/gohugoio/hugo/hugofs"
@@ -41,13 +38,8 @@ func TestFileCache(t *testing.T) {
t.Parallel()
c := qt.New(t)
- tempWorkingDir, err := ioutil.TempDir("", "hugo_filecache_test_work")
- c.Assert(err, qt.IsNil)
- defer os.Remove(tempWorkingDir)
-
- tempCacheDir, err := ioutil.TempDir("", "hugo_filecache_test_cache")
- c.Assert(err, qt.IsNil)
- defer os.Remove(tempCacheDir)
+ tempWorkingDir := t.TempDir()
+ tempCacheDir := t.TempDir()
osfs := afero.NewOsFs()
@@ -87,31 +79,14 @@ dir = ":cacheDir/c"
p := newPathsSpec(t, osfs, configStr)
- caches, err := NewCaches(p)
+ caches, err := filecache.NewCaches(p)
c.Assert(err, qt.IsNil)
cache := caches.Get("GetJSON")
c.Assert(cache, qt.Not(qt.IsNil))
- c.Assert(cache.maxAge.String(), qt.Equals, "10h0m0s")
-
- bfs, ok := cache.Fs.(*afero.BasePathFs)
- c.Assert(ok, qt.Equals, true)
- filename, err := bfs.RealPath("key")
- c.Assert(err, qt.IsNil)
- if test.cacheDir != "" {
- c.Assert(filename, qt.Equals, filepath.Join(test.cacheDir, "c/"+filecacheRootDirname+"/getjson/key"))
- } else {
- // Temp dir.
- c.Assert(filename, qt.Matches, ".*hugo_cache.*"+filecacheRootDirname+".*key")
- }
cache = caches.Get("Images")
c.Assert(cache, qt.Not(qt.IsNil))
- c.Assert(cache.maxAge, qt.Equals, time.Duration(-1))
- bfs, ok = cache.Fs.(*afero.BasePathFs)
- c.Assert(ok, qt.Equals, true)
- filename, _ = bfs.RealPath("key")
- c.Assert(filename, qt.Equals, filepath.FromSlash("_gen/images/key"))
rf := func(s string) func() (io.ReadCloser, error) {
return func() (io.ReadCloser, error) {
@@ -120,7 +95,7 @@ dir = ":cacheDir/c"
io.Closer
}{
strings.NewReader(s),
- ioutil.NopCloser(nil),
+ io.NopCloser(nil),
}, nil
}
}
@@ -129,13 +104,13 @@ dir = ":cacheDir/c"
return []byte("bcd"), nil
}
- for _, ca := range []*Cache{caches.ImageCache(), caches.AssetsCache(), caches.GetJSONCache(), caches.GetCSVCache()} {
- for i := 0; i < 2; i++ {
+ for _, ca := range []*filecache.Cache{caches.ImageCache(), caches.AssetsCache(), caches.GetJSONCache(), caches.GetCSVCache()} {
+ for range 2 {
info, r, err := ca.GetOrCreate("a", rf("abc"))
c.Assert(err, qt.IsNil)
c.Assert(r, qt.Not(qt.IsNil))
c.Assert(info.Name, qt.Equals, "a")
- b, _ := ioutil.ReadAll(r)
+ b, _ := io.ReadAll(r)
r.Close()
c.Assert(string(b), qt.Equals, "abc")
@@ -151,7 +126,7 @@ dir = ":cacheDir/c"
_, r, err = ca.GetOrCreate("a", rf("bcd"))
c.Assert(err, qt.IsNil)
- b, _ = ioutil.ReadAll(r)
+ b, _ = io.ReadAll(r)
r.Close()
c.Assert(string(b), qt.Equals, "abc")
}
@@ -164,13 +139,13 @@ dir = ":cacheDir/c"
c.Assert(info.Name, qt.Equals, "mykey")
io.WriteString(w, "Hugo is great!")
w.Close()
- c.Assert(caches.ImageCache().getString("mykey"), qt.Equals, "Hugo is great!")
+ c.Assert(caches.ImageCache().GetString("mykey"), qt.Equals, "Hugo is great!")
info, r, err := caches.ImageCache().Get("mykey")
c.Assert(err, qt.IsNil)
c.Assert(r, qt.Not(qt.IsNil))
c.Assert(info.Name, qt.Equals, "mykey")
- b, _ := ioutil.ReadAll(r)
+ b, _ := io.ReadAll(r)
r.Close()
c.Assert(string(b), qt.Equals, "Hugo is great!")
@@ -180,7 +155,6 @@ dir = ":cacheDir/c"
c.Assert(string(b), qt.Equals, "Hugo is great!")
}
-
}
func TestFileCacheConcurrent(t *testing.T) {
@@ -206,7 +180,7 @@ dir = "/cache/c"
p := newPathsSpec(t, afero.NewMemMapFs(), configStr)
- caches, err := NewCaches(p)
+ caches, err := filecache.NewCaches(p)
c.Assert(err, qt.IsNil)
const cacheName = "getjson"
@@ -219,11 +193,11 @@ dir = "/cache/c"
var wg sync.WaitGroup
- for i := 0; i < 50; i++ {
+ for i := range 50 {
wg.Add(1)
go func(i int) {
defer wg.Done()
- for j := 0; j < 20; j++ {
+ for range 20 {
ca := caches.Get(cacheName)
c.Assert(ca, qt.Not(qt.IsNil))
filename, data := filenameData(i)
@@ -231,7 +205,7 @@ dir = "/cache/c"
return hugio.ToReadCloser(strings.NewReader(data)), nil
})
c.Assert(err, qt.IsNil)
- b, _ := ioutil.ReadAll(r)
+ b, _ := io.ReadAll(r)
r.Close()
c.Assert(string(b), qt.Equals, data)
// Trigger some expiration.
@@ -243,56 +217,60 @@ dir = "/cache/c"
wg.Wait()
}
-func TestCleanID(t *testing.T) {
+func TestFileCacheReadOrCreateErrorInRead(t *testing.T) {
+ t.Parallel()
c := qt.New(t)
- c.Assert(cleanID(filepath.FromSlash("/a/b//c.txt")), qt.Equals, filepath.FromSlash("a/b/c.txt"))
- c.Assert(cleanID(filepath.FromSlash("a/b//c.txt")), qt.Equals, filepath.FromSlash("a/b/c.txt"))
-}
-func initConfig(fs afero.Fs, cfg config.Provider) error {
- if _, err := langs.LoadLanguageSettings(cfg, nil); err != nil {
- return err
+ var result string
+
+ rf := func(failLevel int) func(info filecache.ItemInfo, r io.ReadSeeker) error {
+ return func(info filecache.ItemInfo, r io.ReadSeeker) error {
+ if failLevel > 0 {
+ if failLevel > 1 {
+ return filecache.ErrFatal
+ }
+ return errors.New("fail")
+ }
+
+ b, _ := io.ReadAll(r)
+ result = string(b)
+
+ return nil
+ }
}
- modConfig, err := modules.DecodeConfig(cfg)
- if err != nil {
- return err
+ bf := func(s string) func(info filecache.ItemInfo, w io.WriteCloser) error {
+ return func(info filecache.ItemInfo, w io.WriteCloser) error {
+ defer w.Close()
+ result = s
+ _, err := w.Write([]byte(s))
+ return err
+ }
}
- workingDir := cfg.GetString("workingDir")
- themesDir := cfg.GetString("themesDir")
- if !filepath.IsAbs(themesDir) {
- themesDir = filepath.Join(workingDir, themesDir)
- }
- modulesClient := modules.NewClient(modules.ClientConfig{
- Fs: fs,
- WorkingDir: workingDir,
- ThemesDir: themesDir,
- ModuleConfig: modConfig,
- IgnoreVendor: true,
- })
+ cache := filecache.NewCache(afero.NewMemMapFs(), 100*time.Hour, "")
- moduleConfig, err := modulesClient.Collect()
- if err != nil {
- return err
- }
+ const id = "a32"
- if err := modules.ApplyProjectConfigDefaults(cfg, moduleConfig.ActiveModules[len(moduleConfig.ActiveModules)-1]); err != nil {
- return err
- }
-
- cfg.Set("allModules", moduleConfig.ActiveModules)
-
- return nil
+ _, err := cache.ReadOrCreate(id, rf(0), bf("v1"))
+ c.Assert(err, qt.IsNil)
+ c.Assert(result, qt.Equals, "v1")
+ _, err = cache.ReadOrCreate(id, rf(0), bf("v2"))
+ c.Assert(err, qt.IsNil)
+ c.Assert(result, qt.Equals, "v1")
+ _, err = cache.ReadOrCreate(id, rf(1), bf("v3"))
+ c.Assert(err, qt.IsNil)
+ c.Assert(result, qt.Equals, "v3")
+ _, err = cache.ReadOrCreate(id, rf(2), bf("v3"))
+ c.Assert(err, qt.Equals, filecache.ErrFatal)
}
func newPathsSpec(t *testing.T, fs afero.Fs, configStr string) *helpers.PathSpec {
c := qt.New(t)
cfg, err := config.FromConfigString(configStr, "toml")
c.Assert(err, qt.IsNil)
- initConfig(fs, cfg)
- p, err := helpers.NewPathSpec(hugofs.NewFrom(fs, cfg), cfg, nil)
+ acfg := testconfig.GetTestConfig(fs, cfg)
+ p, err := helpers.NewPathSpec(hugofs.NewFrom(fs, acfg.BaseConfig()), acfg, nil)
c.Assert(err, qt.IsNil)
return p
-
}
diff --git a/cache/httpcache/httpcache.go b/cache/httpcache/httpcache.go
new file mode 100644
index 000000000..bd6d4bf7d
--- /dev/null
+++ b/cache/httpcache/httpcache.go
@@ -0,0 +1,229 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package httpcache
+
+import (
+ "encoding/json"
+ "time"
+
+ "github.com/gobwas/glob"
+ "github.com/gohugoio/hugo/common/predicate"
+ "github.com/gohugoio/hugo/config"
+ "github.com/mitchellh/mapstructure"
+)
+
+// DefaultConfig holds the default configuration for the HTTP cache.
+var DefaultConfig = Config{
+ Cache: Cache{
+ For: GlobMatcher{
+ Excludes: []string{"**"},
+ },
+ },
+ Polls: []PollConfig{
+ {
+ For: GlobMatcher{
+ Includes: []string{"**"},
+ },
+ Disable: true,
+ },
+ },
+}
+
+// Config holds the configuration for the HTTP cache.
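+//
+// An example site configuration (a sketch; the patterns and durations are
+// illustrative and mirror the integration tests):
+//
+//	[httpcache]
+//	[httpcache.cache.for]
+//	includes = ["**gohugo.io**"]
+//	[[httpcache.polls]]
+//	low = "5s"
+//	high = "32s"
+//	[httpcache.polls.for]
+//	includes = ["**gohugo.io**"]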
+type Config struct {
+ // Configures the HTTP cache behavior (RFC 9111).
+ // When this is not enabled for a resource, Hugo will go straight to the file cache.
+ Cache Cache
+
+ // Polls holds a list of configurations for polling remote resources to detect changes in watch mode.
+ // This can be disabled for some resources, typically if they are known to not change.
+ Polls []PollConfig
+}
+
+type Cache struct {
+ // Enable HTTP cache behavior (RFC 9111) for these resources.
+ For GlobMatcher
+}
+
+func (c *Config) Compile() (ConfigCompiled, error) {
+ var cc ConfigCompiled
+
+ p, err := c.Cache.For.CompilePredicate()
+ if err != nil {
+ return cc, err
+ }
+
+ cc.For = p
+
+ for _, pc := range c.Polls {
+
+ p, err := pc.For.CompilePredicate()
+ if err != nil {
+ return cc, err
+ }
+
+ cc.PollConfigs = append(cc.PollConfigs, PollConfigCompiled{
+ For: p,
+ Config: pc,
+ })
+ }
+
+ return cc, nil
+}
+
+// PollConfig holds the configuration for polling remote resources to detect changes in watch mode.
+type PollConfig struct {
+ // What remote resources to apply this configuration to.
+ For GlobMatcher
+
+ // Disable polling for this configuration.
+ Disable bool
+
+ // Low is the lower bound for the polling interval.
+ // This is the starting point when the resource has recently changed;
+ // if that resource stops changing, the polling interval will gradually increase towards High.
+ Low time.Duration
+
+ // High is the upper bound for the polling interval.
+ // This is the interval used when the resource is stable.
+ High time.Duration
+}
+
+func (c PollConfig) MarshalJSON() (b []byte, err error) {
+ // Marshal the durations as strings.
+ type Alias PollConfig
+ return json.Marshal(&struct {
+ Low string
+ High string
+ Alias
+ }{
+ Low: c.Low.String(),
+ High: c.High.String(),
+ Alias: (Alias)(c),
+ })
+}
+
+type GlobMatcher struct {
+ // Excludes holds a list of glob patterns that will be excluded.
+ Excludes []string
+
+ // Includes holds a list of glob patterns that will be included.
+ Includes []string
+}
+
+func (gm GlobMatcher) IsZero() bool {
+ return len(gm.Includes) == 0 && len(gm.Excludes) == 0
+}
+
+type ConfigCompiled struct {
+ For predicate.P[string]
+ PollConfigs []PollConfigCompiled
+}
+
+func (c *ConfigCompiled) PollConfigFor(s string) PollConfigCompiled {
+ for _, pc := range c.PollConfigs {
+ if pc.For(s) {
+ return pc
+ }
+ }
+ return PollConfigCompiled{}
+}
+
+func (c *ConfigCompiled) IsPollingDisabled() bool {
+ for _, pc := range c.PollConfigs {
+ if !pc.Config.Disable {
+ return false
+ }
+ }
+ return true
+}
+
+type PollConfigCompiled struct {
+ For predicate.P[string]
+ Config PollConfig
+}
+
+func (p PollConfigCompiled) IsZero() bool {
+ return p.For == nil
+}
+
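+// CompilePredicate compiles the matcher into a single predicate. When both
+// lists are set, a string must match at least one of Includes and none of the
+// Excludes. For example (see TestGlobMatcher), with Includes = ["**/*.jpg"] and
+// Excludes = ["**/foo.jpg"], the predicate is true for "foo/bar.jpg" but false
+// for "foo/bar/foo.jpg".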
+func (gm *GlobMatcher) CompilePredicate() (func(string) bool, error) {
+ if gm.IsZero() {
+ panic("no includes or excludes")
+ }
+ var p predicate.P[string]
+ for _, include := range gm.Includes {
+ g, err := glob.Compile(include, '/')
+ if err != nil {
+ return nil, err
+ }
+ fn := func(s string) bool {
+ return g.Match(s)
+ }
+ p = p.Or(fn)
+ }
+
+ for _, exclude := range gm.Excludes {
+ g, err := glob.Compile(exclude, '/')
+ if err != nil {
+ return nil, err
+ }
+ fn := func(s string) bool {
+ return !g.Match(s)
+ }
+ p = p.And(fn)
+ }
+
+ return p, nil
+}
+
+func DecodeConfig(_ config.BaseConfig, m map[string]any) (Config, error) {
+ if len(m) == 0 {
+ return DefaultConfig, nil
+ }
+
+ var c Config
+
+ dc := &mapstructure.DecoderConfig{
+ Result: &c,
+ DecodeHook: mapstructure.StringToTimeDurationHookFunc(),
+ WeaklyTypedInput: true,
+ }
+
+ decoder, err := mapstructure.NewDecoder(dc)
+ if err != nil {
+ return c, err
+ }
+
+ if err := decoder.Decode(m); err != nil {
+ return c, err
+ }
+
+ if c.Cache.For.IsZero() {
+ c.Cache.For = DefaultConfig.Cache.For
+ }
+
+ for pci := range c.Polls {
+ if c.Polls[pci].For.IsZero() {
+ c.Polls[pci].For = DefaultConfig.Cache.For
+ c.Polls[pci].Disable = true
+ }
+ }
+
+ if len(c.Polls) == 0 {
+ c.Polls = DefaultConfig.Polls
+ }
+
+ return c, nil
+}
diff --git a/cache/httpcache/httpcache_integration_test.go b/cache/httpcache/httpcache_integration_test.go
new file mode 100644
index 000000000..4d6a5f718
--- /dev/null
+++ b/cache/httpcache/httpcache_integration_test.go
@@ -0,0 +1,95 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package httpcache_test
+
+import (
+ "testing"
+ "time"
+
+ qt "github.com/frankban/quicktest"
+ "github.com/gohugoio/hugo/hugolib"
+)
+
+func TestConfigCustom(t *testing.T) {
+ t.Parallel()
+
+ files := `
+-- hugo.toml --
+[httpcache]
+[httpcache.cache.for]
+includes = ["**gohugo.io**"]
+[[httpcache.polls]]
+low = "5s"
+high = "32s"
+[httpcache.polls.for]
+includes = ["**gohugo.io**"]
+
+
+`
+
+ b := hugolib.Test(t, files)
+
+ httpcacheConf := b.H.Configs.Base.HTTPCache
+ compiled := b.H.Configs.Base.C.HTTPCache
+
+ b.Assert(httpcacheConf.Cache.For.Includes, qt.DeepEquals, []string{"**gohugo.io**"})
+ b.Assert(httpcacheConf.Cache.For.Excludes, qt.IsNil)
+
+ pc := compiled.PollConfigFor("https://gohugo.io/foo.jpg")
+ b.Assert(pc.Config.Low, qt.Equals, 5*time.Second)
+ b.Assert(pc.Config.High, qt.Equals, 32*time.Second)
+ b.Assert(compiled.PollConfigFor("https://example.com/foo.jpg").IsZero(), qt.IsTrue)
+}
+
+func TestConfigDefault(t *testing.T) {
+ t.Parallel()
+
+ files := `
+-- hugo.toml --
+`
+ b := hugolib.Test(t, files)
+
+ compiled := b.H.Configs.Base.C.HTTPCache
+
+ b.Assert(compiled.For("https://gohugo.io/posts.json"), qt.IsFalse)
+ b.Assert(compiled.For("https://gohugo.io/foo.jpg"), qt.IsFalse)
+ b.Assert(compiled.PollConfigFor("https://gohugo.io/foo.jpg").Config.Disable, qt.IsTrue)
+}
+
+func TestConfigPollsOnly(t *testing.T) {
+ t.Parallel()
+ files := `
+-- hugo.toml --
+[httpcache]
+[[httpcache.polls]]
+low = "5s"
+high = "32s"
+[httpcache.polls.for]
+includes = ["**gohugo.io**"]
+
+
+`
+
+ b := hugolib.Test(t, files)
+
+ compiled := b.H.Configs.Base.C.HTTPCache
+
+ b.Assert(compiled.For("https://gohugo.io/posts.json"), qt.IsFalse)
+ b.Assert(compiled.For("https://gohugo.io/foo.jpg"), qt.IsFalse)
+
+ pc := compiled.PollConfigFor("https://gohugo.io/foo.jpg")
+ b.Assert(pc.Config.Low, qt.Equals, 5*time.Second)
+ b.Assert(pc.Config.High, qt.Equals, 32*time.Second)
+ b.Assert(compiled.PollConfigFor("https://example.com/foo.jpg").IsZero(), qt.IsTrue)
+}
diff --git a/cache/httpcache/httpcache_test.go b/cache/httpcache/httpcache_test.go
new file mode 100644
index 000000000..60c07d056
--- /dev/null
+++ b/cache/httpcache/httpcache_test.go
@@ -0,0 +1,73 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package httpcache
+
+import (
+ "testing"
+
+ qt "github.com/frankban/quicktest"
+ "github.com/gohugoio/hugo/config"
+)
+
+func TestGlobMatcher(t *testing.T) {
+ c := qt.New(t)
+
+ g := GlobMatcher{
+ Includes: []string{"**/*.jpg", "**.png", "**/bar/**"},
+ Excludes: []string{"**/foo.jpg", "**.css"},
+ }
+
+ p, err := g.CompilePredicate()
+ c.Assert(err, qt.IsNil)
+
+ c.Assert(p("foo.jpg"), qt.IsFalse)
+ c.Assert(p("foo.png"), qt.IsTrue)
+ c.Assert(p("foo/bar.jpg"), qt.IsTrue)
+ c.Assert(p("foo/bar.png"), qt.IsTrue)
+ c.Assert(p("foo/bar/foo.jpg"), qt.IsFalse)
+ c.Assert(p("foo/bar/foo.css"), qt.IsFalse)
+ c.Assert(p("foo.css"), qt.IsFalse)
+ c.Assert(p("foo/bar/foo.css"), qt.IsFalse)
+ c.Assert(p("foo/bar/foo.xml"), qt.IsTrue)
+}
+
+func TestDefaultConfig(t *testing.T) {
+ c := qt.New(t)
+
+ _, err := DefaultConfig.Compile()
+ c.Assert(err, qt.IsNil)
+}
+
+func TestDecodeConfigInjectsDefaultAndCompiles(t *testing.T) {
+ c := qt.New(t)
+
+ cfg, err := DecodeConfig(config.BaseConfig{}, map[string]interface{}{})
+ c.Assert(err, qt.IsNil)
+ c.Assert(cfg, qt.DeepEquals, DefaultConfig)
+
+ _, err = cfg.Compile()
+ c.Assert(err, qt.IsNil)
+
+ cfg, err = DecodeConfig(config.BaseConfig{}, map[string]any{
+ "cache": map[string]any{
+ "polls": []map[string]any{
+ {"disable": true},
+ },
+ },
+ })
+ c.Assert(err, qt.IsNil)
+
+ _, err = cfg.Compile()
+ c.Assert(err, qt.IsNil)
+}
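
The DecodeConfig tests above depend on the decode hook configured earlier, which turns duration strings such as "5s" and "32s" into time.Duration values. A standalone sketch of that conversion, assuming the upstream mitchellh/mapstructure API (the same calls appear in the diff):

```go
package main

import (
	"fmt"
	"time"

	"github.com/mitchellh/mapstructure"
)

// pollConfig mirrors the Low/High shape of the httpcache poll config.
type pollConfig struct {
	Low  time.Duration
	High time.Duration
}

func main() {
	var p pollConfig
	dec, err := mapstructure.NewDecoder(&mapstructure.DecoderConfig{
		Result:           &p,
		DecodeHook:       mapstructure.StringToTimeDurationHookFunc(),
		WeaklyTypedInput: true,
	})
	if err != nil {
		panic(err)
	}
	// String durations from TOML/YAML decode straight into time.Duration.
	if err := dec.Decode(map[string]any{"low": "5s", "high": "32s"}); err != nil {
		panic(err)
	}
	fmt.Println(p.Low, p.High) // 5s 32s
}
```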
diff --git a/cache/namedmemcache/named_cache.go b/cache/namedmemcache/named_cache.go
deleted file mode 100644
index d8c229a01..000000000
--- a/cache/namedmemcache/named_cache.go
+++ /dev/null
@@ -1,79 +0,0 @@
-// Copyright 2018 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-// Package namedmemcache provides a memory cache with a named lock. This is suitable
-// for situations where creating the cached resource can be time consuming or otherwise
-// resource hungry, or in situations where a "once only per key" is a requirement.
-package namedmemcache
-
-import (
- "sync"
-
- "github.com/BurntSushi/locker"
-)
-
-// Cache holds the cached values.
-type Cache struct {
- nlocker *locker.Locker
- cache map[string]cacheEntry
- mu sync.RWMutex
-}
-
-type cacheEntry struct {
- value interface{}
- err error
-}
-
-// New creates a new cache.
-func New() *Cache {
- return &Cache{
- nlocker: locker.NewLocker(),
- cache: make(map[string]cacheEntry),
- }
-}
-
-// Clear clears the cache state.
-func (c *Cache) Clear() {
- c.mu.Lock()
- defer c.mu.Unlock()
-
- c.cache = make(map[string]cacheEntry)
- c.nlocker = locker.NewLocker()
-
-}
-
-// GetOrCreate tries to get the value with the given cache key, if not found
-// create will be called and cached.
-// This method is thread safe. It also guarantees that the create func for a given
-// key is invoced only once for this cache.
-func (c *Cache) GetOrCreate(key string, create func() (interface{}, error)) (interface{}, error) {
- c.mu.RLock()
- entry, found := c.cache[key]
- c.mu.RUnlock()
-
- if found {
- return entry.value, entry.err
- }
-
- c.nlocker.Lock(key)
- defer c.nlocker.Unlock(key)
-
- // Create it.
- value, err := create()
-
- c.mu.Lock()
- c.cache[key] = cacheEntry{value: value, err: err}
- c.mu.Unlock()
-
- return value, err
-}
diff --git a/cache/namedmemcache/named_cache_test.go b/cache/namedmemcache/named_cache_test.go
deleted file mode 100644
index 9feddb11f..000000000
--- a/cache/namedmemcache/named_cache_test.go
+++ /dev/null
@@ -1,80 +0,0 @@
-// Copyright 2018 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package namedmemcache
-
-import (
- "fmt"
- "sync"
- "testing"
-
- qt "github.com/frankban/quicktest"
-)
-
-func TestNamedCache(t *testing.T) {
- t.Parallel()
- c := qt.New(t)
-
- cache := New()
-
- counter := 0
- create := func() (interface{}, error) {
- counter++
- return counter, nil
- }
-
- for i := 0; i < 5; i++ {
- v1, err := cache.GetOrCreate("a1", create)
- c.Assert(err, qt.IsNil)
- c.Assert(v1, qt.Equals, 1)
- v2, err := cache.GetOrCreate("a2", create)
- c.Assert(err, qt.IsNil)
- c.Assert(v2, qt.Equals, 2)
- }
-
- cache.Clear()
-
- v3, err := cache.GetOrCreate("a2", create)
- c.Assert(err, qt.IsNil)
- c.Assert(v3, qt.Equals, 3)
-}
-
-func TestNamedCacheConcurrent(t *testing.T) {
- t.Parallel()
-
- c := qt.New(t)
-
- var wg sync.WaitGroup
-
- cache := New()
-
- create := func(i int) func() (interface{}, error) {
- return func() (interface{}, error) {
- return i, nil
- }
- }
-
- for i := 0; i < 10; i++ {
- wg.Add(1)
- go func() {
- defer wg.Done()
- for j := 0; j < 100; j++ {
- id := fmt.Sprintf("id%d", j)
- v, err := cache.GetOrCreate(id, create(j))
- c.Assert(err, qt.IsNil)
- c.Assert(v, qt.Equals, j)
- }
- }()
- }
- wg.Wait()
-}
diff --git a/cache/partitioned_lazy_cache.go b/cache/partitioned_lazy_cache.go
deleted file mode 100644
index 31e66e127..000000000
--- a/cache/partitioned_lazy_cache.go
+++ /dev/null
@@ -1,99 +0,0 @@
-// Copyright 2017-present The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package cache
-
-import (
- "sync"
-)
-
-// Partition represents a cache partition where Load is the callback
-// for when the partition is needed.
-type Partition struct {
- Key string
- Load func() (map[string]interface{}, error)
-}
-
-// Lazy represents a lazily loaded cache.
-type Lazy struct {
- initSync sync.Once
- initErr error
- cache map[string]interface{}
- load func() (map[string]interface{}, error)
-}
-
-// NewLazy creates a lazy cache with the given load func.
-func NewLazy(load func() (map[string]interface{}, error)) *Lazy {
- return &Lazy{load: load}
-}
-
-func (l *Lazy) init() error {
- l.initSync.Do(func() {
- c, err := l.load()
- l.cache = c
- l.initErr = err
-
- })
-
- return l.initErr
-}
-
-// Get initializes the cache if not already initialized, then looks up the
-// given key.
-func (l *Lazy) Get(key string) (interface{}, bool, error) {
- l.init()
- if l.initErr != nil {
- return nil, false, l.initErr
- }
- v, found := l.cache[key]
- return v, found, nil
-}
-
-// PartitionedLazyCache is a lazily loaded cache paritioned by a supplied string key.
-type PartitionedLazyCache struct {
- partitions map[string]*Lazy
-}
-
-// NewPartitionedLazyCache creates a new NewPartitionedLazyCache with the supplied
-// partitions.
-func NewPartitionedLazyCache(partitions ...Partition) *PartitionedLazyCache {
- lazyPartitions := make(map[string]*Lazy, len(partitions))
- for _, partition := range partitions {
- lazyPartitions[partition.Key] = NewLazy(partition.Load)
- }
- cache := &PartitionedLazyCache{partitions: lazyPartitions}
-
- return cache
-}
-
-// Get initializes the partition if not already done so, then looks up the given
-// key in the given partition, returns nil if no value found.
-func (c *PartitionedLazyCache) Get(partition, key string) (interface{}, error) {
- p, found := c.partitions[partition]
-
- if !found {
- return nil, nil
- }
-
- v, found, err := p.Get(key)
- if err != nil {
- return nil, err
- }
-
- if found {
- return v, nil
- }
-
- return nil, nil
-
-}
diff --git a/cache/partitioned_lazy_cache_test.go b/cache/partitioned_lazy_cache_test.go
deleted file mode 100644
index 2c61a6560..000000000
--- a/cache/partitioned_lazy_cache_test.go
+++ /dev/null
@@ -1,138 +0,0 @@
-// Copyright 2017-present The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package cache
-
-import (
- "errors"
- "sync"
- "testing"
-
- qt "github.com/frankban/quicktest"
-)
-
-func TestNewPartitionedLazyCache(t *testing.T) {
- t.Parallel()
-
- c := qt.New(t)
-
- p1 := Partition{
- Key: "p1",
- Load: func() (map[string]interface{}, error) {
- return map[string]interface{}{
- "p1_1": "p1v1",
- "p1_2": "p1v2",
- "p1_nil": nil,
- }, nil
- },
- }
-
- p2 := Partition{
- Key: "p2",
- Load: func() (map[string]interface{}, error) {
- return map[string]interface{}{
- "p2_1": "p2v1",
- "p2_2": "p2v2",
- "p2_3": "p2v3",
- }, nil
- },
- }
-
- cache := NewPartitionedLazyCache(p1, p2)
-
- v, err := cache.Get("p1", "p1_1")
- c.Assert(err, qt.IsNil)
- c.Assert(v, qt.Equals, "p1v1")
-
- v, err = cache.Get("p1", "p2_1")
- c.Assert(err, qt.IsNil)
- c.Assert(v, qt.IsNil)
-
- v, err = cache.Get("p1", "p1_nil")
- c.Assert(err, qt.IsNil)
- c.Assert(v, qt.IsNil)
-
- v, err = cache.Get("p2", "p2_3")
- c.Assert(err, qt.IsNil)
- c.Assert(v, qt.Equals, "p2v3")
-
- v, err = cache.Get("doesnotexist", "p1_1")
- c.Assert(err, qt.IsNil)
- c.Assert(v, qt.IsNil)
-
- v, err = cache.Get("p1", "doesnotexist")
- c.Assert(err, qt.IsNil)
- c.Assert(v, qt.IsNil)
-
- errorP := Partition{
- Key: "p3",
- Load: func() (map[string]interface{}, error) {
- return nil, errors.New("Failed")
- },
- }
-
- cache = NewPartitionedLazyCache(errorP)
-
- v, err = cache.Get("p1", "doesnotexist")
- c.Assert(err, qt.IsNil)
- c.Assert(v, qt.IsNil)
-
- _, err = cache.Get("p3", "doesnotexist")
- c.Assert(err, qt.Not(qt.IsNil))
-
-}
-
-func TestConcurrentPartitionedLazyCache(t *testing.T) {
- t.Parallel()
-
- c := qt.New(t)
-
- var wg sync.WaitGroup
-
- p1 := Partition{
- Key: "p1",
- Load: func() (map[string]interface{}, error) {
- return map[string]interface{}{
- "p1_1": "p1v1",
- "p1_2": "p1v2",
- "p1_nil": nil,
- }, nil
- },
- }
-
- p2 := Partition{
- Key: "p2",
- Load: func() (map[string]interface{}, error) {
- return map[string]interface{}{
- "p2_1": "p2v1",
- "p2_2": "p2v2",
- "p2_3": "p2v3",
- }, nil
- },
- }
-
- cache := NewPartitionedLazyCache(p1, p2)
-
- for i := 0; i < 100; i++ {
- wg.Add(1)
- go func() {
- defer wg.Done()
- for j := 0; j < 10; j++ {
- v, err := cache.Get("p1", "p1_1")
- c.Assert(err, qt.IsNil)
- c.Assert(v, qt.Equals, "p1v1")
- }
- }()
- }
- wg.Wait()
-}
diff --git a/check_gofmt.sh b/check_gofmt.sh
new file mode 100755
index 000000000..c77517d3f
--- /dev/null
+++ b/check_gofmt.sh
@@ -0,0 +1,2 @@
+#!/usr/bin/env bash
+diff <(gofmt -d .) <(printf '')
\ No newline at end of file
diff --git a/codegen/methods.go b/codegen/methods.go
index ed8dba923..08ac97b00 100644
--- a/codegen/methods.go
+++ b/codegen/methods.go
@@ -26,6 +26,7 @@ import (
"path/filepath"
"reflect"
"regexp"
+ "slices"
"sort"
"strings"
"sync"
@@ -58,7 +59,7 @@ func (c *Inspector) MethodsFromTypes(include []reflect.Type, exclude []reflect.T
var methods Methods
- var excludes = make(map[string]bool)
+ excludes := make(map[string]bool)
if len(exclude) > 0 {
for _, m := range c.MethodsFromTypes(exclude, nil) {
@@ -99,12 +100,10 @@ func (c *Inspector) MethodsFromTypes(include []reflect.Type, exclude []reflect.T
name = pkgPrefix + name
return name, pkg
-
}
for _, t := range include {
-
- for i := 0; i < t.NumMethod(); i++ {
+ for i := range t.NumMethod() {
m := t.Method(i)
if excludes[m.Name] || seen[m.Name] {
@@ -124,7 +123,7 @@ func (c *Inspector) MethodsFromTypes(include []reflect.Type, exclude []reflect.T
method := Method{Owner: t, OwnerName: ownerName, Name: m.Name}
- for i := 0; i < numIn; i++ {
+ for i := range numIn {
in := m.Type.In(i)
name, pkg := nameAndPackage(in)
@@ -139,7 +138,7 @@ func (c *Inspector) MethodsFromTypes(include []reflect.Type, exclude []reflect.T
numOut := m.Type.NumOut()
if numOut > 0 {
- for i := 0; i < numOut; i++ {
+ for i := range numOut {
out := m.Type.Out(i)
name, pkg := nameAndPackage(out)
@@ -153,7 +152,6 @@ func (c *Inspector) MethodsFromTypes(include []reflect.Type, exclude []reflect.T
methods = append(methods, method)
}
-
}
sort.SliceStable(methods, func(i, j int) bool {
@@ -167,16 +165,13 @@ func (c *Inspector) MethodsFromTypes(include []reflect.Type, exclude []reflect.T
}
return wi < wj
-
})
return methods
-
}
func (c *Inspector) parseSource() {
c.init.Do(func() {
-
if !strings.Contains(c.ProjectRootDir, "hugo") {
panic("dir must be set to the Hugo root")
}
@@ -200,7 +195,6 @@ func (c *Inspector) parseSource() {
filenames = append(filenames, path)
return nil
-
})
for _, filename := range filenames {
@@ -230,7 +224,6 @@ func (c *Inspector) parseSource() {
c.methodWeight[iface] = weights
}
}
-
}
return true
})
@@ -247,7 +240,6 @@ func (c *Inspector) parseSource() {
}
}
}
-
})
}
@@ -313,7 +305,7 @@ func (m Method) inOutStr() string {
}
args := make([]string, len(m.In))
- for i := 0; i < len(args); i++ {
+ for i := range args {
args[i] = fmt.Sprintf("arg%d", i)
}
return "(" + strings.Join(args, ", ") + ")"
@@ -325,7 +317,7 @@ func (m Method) inStr() string {
}
args := make([]string, len(m.In))
- for i := 0; i < len(args); i++ {
+ for i := range args {
args[i] = fmt.Sprintf("arg%d %s", i, m.In[i])
}
return "(" + strings.Join(args, ", ") + ")"
@@ -348,7 +340,7 @@ func (m Method) outStrNamed() string {
}
outs := make([]string, len(m.Out))
- for i := 0; i < len(outs); i++ {
+ for i := range outs {
outs[i] = fmt.Sprintf("o%d %s", i, m.Out[i])
}
@@ -374,7 +366,7 @@ func (m Methods) Imports() []string {
}
// ToMarshalJSON creates a MarshalJSON method for these methods. Any method name
-// matchin any of the regexps in excludes will be ignored.
+// matching any of the regexps in excludes will be ignored.
func (m Methods) ToMarshalJSON(receiver, pkgPath string, excludes ...string) (string, []string) {
var sb strings.Builder
@@ -385,7 +377,7 @@ func (m Methods) ToMarshalJSON(receiver, pkgPath string, excludes ...string) (st
fmt.Fprintf(&sb, "func Marshal%sToJSON(%s %s) ([]byte, error) {\n", what, r, receiver)
var methods Methods
- var excludeRes = make([]*regexp.Regexp, len(excludes))
+ excludeRes := make([]*regexp.Regexp, len(excludes))
for i, exclude := range excludes {
excludeRes[i] = regexp.MustCompile(exclude)
@@ -444,13 +436,12 @@ func (m Methods) ToMarshalJSON(receiver, pkgPath string, excludes ...string) (st
// Exclude self
for i, pkgImp := range pkgImports {
if pkgImp == pkgPath {
- pkgImports = append(pkgImports[:i], pkgImports[i+1:]...)
+ pkgImports = slices.Delete(pkgImports, i, i+1)
}
}
}
return sb.String(), pkgImports
-
}
func collectMethodsRecursive(pkg string, f []*ast.Field) []string {
@@ -462,12 +453,15 @@ func collectMethodsRecursive(pkg string, f []*ast.Field) []string {
}
if ident, ok := m.Type.(*ast.Ident); ok && ident.Obj != nil {
- // Embedded interface
- methodNames = append(
- methodNames,
- collectMethodsRecursive(
- pkg,
- ident.Obj.Decl.(*ast.TypeSpec).Type.(*ast.InterfaceType).Methods.List)...)
+ switch tt := ident.Obj.Decl.(*ast.TypeSpec).Type.(type) {
+ case *ast.InterfaceType:
+ // Embedded interface
+ methodNames = append(
+ methodNames,
+ collectMethodsRecursive(
+ pkg,
+ tt.Methods.List)...)
+ }
} else {
// Embedded, but in a different file/package. Return the
// package.Name and deal with that later.
@@ -481,7 +475,6 @@ func collectMethodsRecursive(pkg string, f []*ast.Field) []string {
}
return methodNames
-
}
func firstToLower(name string) string {
@@ -516,7 +509,7 @@ func typeName(name, pkg string) string {
func uniqueNonEmptyStrings(s []string) []string {
var unique []string
- set := map[string]interface{}{}
+ set := map[string]any{}
for _, val := range s {
if val == "" {
continue
@@ -544,5 +537,4 @@ func varName(name string) string {
}
return name
-
}
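
Two of the mechanical changes above come from newer Go features: integer range loops (Go 1.22) and the slices package (Go 1.21). A small, self-contained illustration of both, not tied to the codegen code:

```go
package main

import (
	"fmt"
	"slices"
)

func main() {
	// for i := range n iterates i = 0..n-1, replacing for i := 0; i < n; i++.
	for i := range 3 {
		fmt.Println(i)
	}

	// slices.Delete removes s[i:j] in place and returns the shortened slice.
	imports := []string{"fmt", "self/pkg", "slices"}
	imports = slices.Delete(imports, 1, 2)
	fmt.Println(imports) // [fmt slices]
}
```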
diff --git a/codegen/methods_test.go b/codegen/methods_test.go
index 77399f4e4..0aff43d0e 100644
--- a/codegen/methods_test.go
+++ b/codegen/methods_test.go
@@ -25,7 +25,6 @@ import (
)
func TestMethods(t *testing.T) {
-
var (
zeroIE = reflect.TypeOf((*IEmbed)(nil)).Elem()
zeroIEOnly = reflect.TypeOf((*IEOnly)(nil)).Elem()
@@ -58,7 +57,6 @@ func TestMethods(t *testing.T) {
methodsStr := fmt.Sprint(methods)
c.Assert(methodsStr, qt.Contains, "MethodEmbed3(arg0 string) string")
-
})
t.Run("ToMarshalJSON", func(t *testing.T) {
@@ -76,9 +74,7 @@ func TestMethods(t *testing.T) {
c.Assert(pkg, qt.Contains, "encoding/json")
fmt.Println(pkg)
-
})
-
}
type I interface {
@@ -89,7 +85,7 @@ type I interface {
Method3(myint int, mystring string)
Method5() (string, error)
Method6() *net.IP
- Method7() interface{}
+ Method7() any
Method8() herrors.ErrorContext
method2()
method9() os.FileInfo
diff --git a/commands/check.go b/commands/check.go
deleted file mode 100644
index f36f23969..000000000
--- a/commands/check.go
+++ /dev/null
@@ -1,34 +0,0 @@
-// Copyright 2018 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-// +build !darwin
-
-package commands
-
-import (
- "github.com/spf13/cobra"
-)
-
-var _ cmder = (*checkCmd)(nil)
-
-type checkCmd struct {
- *baseCmd
-}
-
-func newCheckCmd() *checkCmd {
- return &checkCmd{baseCmd: &baseCmd{cmd: &cobra.Command{
- Use: "check",
- Short: "Contains some verification checks",
- },
- }}
-}
diff --git a/commands/check_darwin.go b/commands/check_darwin.go
deleted file mode 100644
index 9291be84c..000000000
--- a/commands/check_darwin.go
+++ /dev/null
@@ -1,36 +0,0 @@
-// Copyright 2018 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package commands
-
-import (
- "github.com/spf13/cobra"
-)
-
-var _ cmder = (*checkCmd)(nil)
-
-type checkCmd struct {
- *baseCmd
-}
-
-func newCheckCmd() *checkCmd {
- cc := &checkCmd{baseCmd: &baseCmd{cmd: &cobra.Command{
- Use: "check",
- Short: "Contains some verification checks",
- },
- }}
-
- cc.cmd.AddCommand(newLimitCmd().getCommand())
-
- return cc
-}
diff --git a/commands/commandeer.go b/commands/commandeer.go
index 55526a857..bf9655637 100644
--- a/commands/commandeer.go
+++ b/commands/commandeer.go
@@ -1,4 +1,4 @@
-// Copyright 2019 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -14,401 +14,666 @@
package commands
import (
- "bytes"
+ "context"
"errors"
- "sync"
-
- "golang.org/x/sync/semaphore"
-
- "github.com/gohugoio/hugo/modules"
-
- "io/ioutil"
-
- "github.com/gohugoio/hugo/common/herrors"
- "github.com/gohugoio/hugo/common/hugo"
-
- jww "github.com/spf13/jwalterweatherman"
-
+ "fmt"
+ "io"
+ "log"
"os"
+ "os/signal"
"path/filepath"
- "regexp"
+ "runtime"
+ "strings"
+ "sync"
+ "sync/atomic"
+ "syscall"
"time"
+ "go.uber.org/automaxprocs/maxprocs"
+
+ "github.com/bep/clocks"
+ "github.com/bep/lazycache"
+ "github.com/bep/logg"
+ "github.com/bep/overlayfs"
+ "github.com/bep/simplecobra"
+
+ "github.com/gohugoio/hugo/common/hstrings"
+ "github.com/gohugoio/hugo/common/htime"
"github.com/gohugoio/hugo/common/loggers"
- "github.com/gohugoio/hugo/config"
-
- "github.com/spf13/cobra"
-
- "github.com/gohugoio/hugo/hugolib"
- "github.com/spf13/afero"
-
- "github.com/bep/debounce"
+ "github.com/gohugoio/hugo/common/paths"
"github.com/gohugoio/hugo/common/types"
+ "github.com/gohugoio/hugo/config"
+ "github.com/gohugoio/hugo/config/allconfig"
"github.com/gohugoio/hugo/deps"
"github.com/gohugoio/hugo/helpers"
"github.com/gohugoio/hugo/hugofs"
- "github.com/gohugoio/hugo/langs"
+ "github.com/gohugoio/hugo/hugolib"
+ "github.com/gohugoio/hugo/identity"
+ "github.com/gohugoio/hugo/resources/kinds"
+ "github.com/spf13/afero"
+ "github.com/spf13/cobra"
)
-type commandeerHugoState struct {
- *deps.DepsCfg
- hugoSites *hugolib.HugoSites
- fsCreate sync.Once
- created chan struct{}
-}
+var errHelp = errors.New("help requested")
-type commandeer struct {
- *commandeerHugoState
-
- logger *loggers.Logger
-
- // Currently only set when in "fast render mode". But it seems to
- // be fast enough that we could maybe just add it for all server modes.
- changeDetector *fileChangeDetector
-
- // We need to reuse this on server rebuilds.
- destinationFs afero.Fs
-
- h *hugoBuilderCommon
- ftch flagsToConfigHandler
-
- visitedURLs *types.EvictingStringQueue
-
- doWithCommandeer func(c *commandeer) error
-
- // We watch these for changes.
- configFiles []string
-
- // Used in cases where we get flooded with events in server mode.
- debounce func(f func())
-
- serverPorts []int
- languagesConfigured bool
- languages langs.Languages
- doLiveReload bool
- fastRenderMode bool
- showErrorInBrowser bool
-
- configured bool
- paused bool
-
- fullRebuildSem *semaphore.Weighted
-
- // Any error from the last build.
- buildErr error
-}
-
-func newCommandeerHugoState() *commandeerHugoState {
- return &commandeerHugoState{
- created: make(chan struct{}),
+// Execute executes a command.
+func Execute(args []string) error {
+ // Default GOMAXPROCS to be CPU limit aware, still respecting GOMAXPROCS env.
+ maxprocs.Set()
+ x, err := newExec()
+ if err != nil {
+ return err
}
-}
-
-func (c *commandeerHugoState) hugo() *hugolib.HugoSites {
- <-c.created
- return c.hugoSites
-}
-
-func (c *commandeer) errCount() int {
- return int(c.logger.ErrorCounter.Count())
-}
-
-func (c *commandeer) getErrorWithContext() interface{} {
- errCount := c.errCount()
-
- if errCount == 0 {
- return nil
- }
-
- m := make(map[string]interface{})
-
- m["Error"] = errors.New(removeErrorPrefixFromLog(c.logger.Errors()))
- m["Version"] = hugo.BuildVersionString()
-
- fe := herrors.UnwrapErrorWithFileContext(c.buildErr)
- if fe != nil {
- m["File"] = fe
- }
-
- if c.h.verbose {
- var b bytes.Buffer
- herrors.FprintStackTrace(&b, c.buildErr)
- m["StackTrace"] = b.String()
- }
-
- return m
-}
-
-func (c *commandeer) Set(key string, value interface{}) {
- if c.configured {
- panic("commandeer cannot be changed")
- }
- c.Cfg.Set(key, value)
-}
-
-func (c *commandeer) initFs(fs *hugofs.Fs) error {
- c.destinationFs = fs.Destination
- c.DepsCfg.Fs = fs
-
- return nil
-}
-
-func newCommandeer(mustHaveConfigFile, running bool, h *hugoBuilderCommon, f flagsToConfigHandler, doWithCommandeer func(c *commandeer) error, subCmdVs ...*cobra.Command) (*commandeer, error) {
-
- var rebuildDebouncer func(f func())
- if running {
- // The time value used is tested with mass content replacements in a fairly big Hugo site.
- // It is better to wait for some seconds in those cases rather than get flooded
- // with rebuilds.
- rebuildDebouncer = debounce.New(4 * time.Second)
- }
-
- c := &commandeer{
- h: h,
- ftch: f,
- commandeerHugoState: newCommandeerHugoState(),
- doWithCommandeer: doWithCommandeer,
- visitedURLs: types.NewEvictingStringQueue(10),
- debounce: rebuildDebouncer,
- fullRebuildSem: semaphore.NewWeighted(1),
- // This will be replaced later, but we need something to log to before the configuration is read.
- logger: loggers.NewLogger(jww.LevelError, jww.LevelError, os.Stdout, ioutil.Discard, running),
- }
-
- return c, c.loadConfig(mustHaveConfigFile, running)
-}
-
-type fileChangeDetector struct {
- sync.Mutex
- current map[string]string
- prev map[string]string
-
- irrelevantRe *regexp.Regexp
-}
-
-func (f *fileChangeDetector) OnFileClose(name, md5sum string) {
- f.Lock()
- defer f.Unlock()
- f.current[name] = md5sum
-}
-
-func (f *fileChangeDetector) changed() []string {
- if f == nil {
- return nil
- }
- f.Lock()
- defer f.Unlock()
- var c []string
- for k, v := range f.current {
- vv, found := f.prev[k]
- if !found || v != vv {
- c = append(c, k)
+ args = mapLegacyArgs(args)
+ cd, err := x.Execute(context.Background(), args)
+ if cd != nil {
+ if closer, ok := cd.Root.Command.(types.Closer); ok {
+ closer.Close()
}
}
- return f.filterIrrelevant(c)
-}
-
-func (f *fileChangeDetector) filterIrrelevant(in []string) []string {
- var filtered []string
- for _, v := range in {
- if !f.irrelevantRe.MatchString(v) {
- filtered = append(filtered, v)
- }
- }
- return filtered
-}
-
-func (f *fileChangeDetector) PrepareNew() {
- if f == nil {
- return
- }
-
- f.Lock()
- defer f.Unlock()
-
- if f.current == nil {
- f.current = make(map[string]string)
- f.prev = make(map[string]string)
- return
- }
-
- f.prev = make(map[string]string)
- for k, v := range f.current {
- f.prev[k] = v
- }
- f.current = make(map[string]string)
-}
-
-func (c *commandeer) loadConfig(mustHaveConfigFile, running bool) error {
-
- if c.DepsCfg == nil {
- c.DepsCfg = &deps.DepsCfg{}
- }
-
- if c.logger != nil {
- // Truncate the error log if this is a reload.
- c.logger.Reset()
- }
-
- cfg := c.DepsCfg
- c.configured = false
- cfg.Running = running
-
- var dir string
- if c.h.source != "" {
- dir, _ = filepath.Abs(c.h.source)
- } else {
- dir, _ = os.Getwd()
- }
-
- var sourceFs afero.Fs = hugofs.Os
- if c.DepsCfg.Fs != nil {
- sourceFs = c.DepsCfg.Fs.Source
- }
-
- environment := c.h.getEnvironment(running)
-
- doWithConfig := func(cfg config.Provider) error {
-
- if c.ftch != nil {
- c.ftch.flagsToConfig(cfg)
- }
-
- cfg.Set("workingDir", dir)
- cfg.Set("environment", environment)
- return nil
- }
-
- doWithCommandeer := func(cfg config.Provider) error {
- c.Cfg = cfg
- if c.doWithCommandeer == nil {
+ if err != nil {
+ if err == errHelp {
+ cd.CobraCommand.Help()
+ fmt.Println()
return nil
}
- err := c.doWithCommandeer(c)
- return err
- }
-
- configPath := c.h.source
- if configPath == "" {
- configPath = dir
- }
- config, configFiles, err := hugolib.LoadConfig(
- hugolib.ConfigSourceDescriptor{
- Fs: sourceFs,
- Path: configPath,
- WorkingDir: dir,
- Filename: c.h.cfgFile,
- AbsConfigDir: c.h.getConfigDir(dir),
- Environ: os.Environ(),
- Environment: environment},
- doWithCommandeer,
- doWithConfig)
-
- if err != nil {
- if mustHaveConfigFile {
- return err
- }
- if err != hugolib.ErrNoConfigFile && !modules.IsNotExist(err) {
- return err
- }
-
- }
-
- c.configFiles = configFiles
-
- if l, ok := c.Cfg.Get("languagesSorted").(langs.Languages); ok {
- c.languagesConfigured = true
- c.languages = l
- }
-
- // Set some commonly used flags
- c.doLiveReload = running && !c.Cfg.GetBool("disableLiveReload")
- c.fastRenderMode = c.doLiveReload && !c.Cfg.GetBool("disableFastRender")
- c.showErrorInBrowser = c.doLiveReload && !c.Cfg.GetBool("disableBrowserError")
-
- // This is potentially double work, but we need to do this one more time now
- // that all the languages have been configured.
- if c.doWithCommandeer != nil {
- if err := c.doWithCommandeer(c); err != nil {
- return err
+ if simplecobra.IsCommandError(err) {
+ // Print the help, but also return the error to fail the command.
+ cd.CobraCommand.Help()
+ fmt.Println()
}
}
+ return err
+}
- logger, err := c.createLogger(config, running)
- if err != nil {
- return err
- }
+type commonConfig struct {
+ mu *sync.Mutex
+ configs *allconfig.Configs
+ cfg config.Provider
+ fs *hugofs.Fs
+}
- cfg.Logger = logger
- c.logger = logger
+type configKey struct {
+ counter int32
+ ignoreModulesDoesNotExists bool
+}
- createMemFs := config.GetBool("renderToMemory")
+// This is the root command.
+type rootCommand struct {
+ Printf func(format string, v ...any)
+ Println func(a ...any)
+ StdOut io.Writer
+ StdErr io.Writer
- if createMemFs {
- // Rendering to memoryFS, publish to Root regardless of publishDir.
- config.Set("publishDir", "/")
- }
+ logger loggers.Logger
- c.fsCreate.Do(func() {
- fs := hugofs.NewFrom(sourceFs, config)
+ // The main cache busting key for the caches below.
+ configVersionID atomic.Int32
- if c.destinationFs != nil {
- // Need to reuse the destination on server rebuilds.
- fs.Destination = c.destinationFs
- } else if createMemFs {
- // Hugo writes the output to memory instead of the disk.
- fs.Destination = new(afero.MemMapFs)
- }
+ // Some, but not all, commands need access to these.
+ // Some need more than one, so keep them in a small cache.
+ commonConfigs *lazycache.Cache[configKey, *commonConfig]
+ hugoSites *lazycache.Cache[configKey, *hugolib.HugoSites]
- if c.fastRenderMode {
- // For now, fast render mode only. It should, however, be fast enough
- // for the full variant, too.
- changeDetector := &fileChangeDetector{
- // We use this detector to decide to do a Hot reload of a single path or not.
- // We need to filter out source maps and possibly some other to be able
- // to make that decision.
- irrelevantRe: regexp.MustCompile(`\.map$`),
+ // changesFromBuild receives changes from Hugo in watch mode.
+ changesFromBuild chan []identity.Identity
+
+ commands []simplecobra.Commander
+
+ // Flags
+ source string
+ buildWatch bool
+ environment string
+
+ // Common build flags.
+ baseURL string
+ gc bool
+ poll string
+ forceSyncStatic bool
+
+ // Profile flags (for debugging of performance problems)
+ cpuprofile string
+ memprofile string
+ mutexprofile string
+ traceprofile string
+ printm bool
+
+ logLevel string
+
+ quiet bool
+ devMode bool // Hidden flag.
+
+ renderToMemory bool
+
+ cfgFile string
+ cfgDir string
+}
+
+func (r *rootCommand) isVerbose() bool {
+ return r.logger.Level() <= logg.LevelInfo
+}
+
+func (r *rootCommand) Close() error {
+ if r.hugoSites != nil {
+ r.hugoSites.DeleteFunc(func(key configKey, value *hugolib.HugoSites) bool {
+ if value != nil {
+ value.Close()
}
+ return false
+ })
+ }
+ return nil
+}
- changeDetector.PrepareNew()
- fs.Destination = hugofs.NewHashingFs(fs.Destination, changeDetector)
- c.changeDetector = changeDetector
- }
+func (r *rootCommand) Build(cd *simplecobra.Commandeer, bcfg hugolib.BuildCfg, cfg config.Provider) (*hugolib.HugoSites, error) {
+ h, err := r.Hugo(cfg)
+ if err != nil {
+ return nil, err
+ }
+ if err := h.Build(bcfg); err != nil {
+ return nil, err
+ }
- if c.Cfg.GetBool("logPathWarnings") {
- fs.Destination = hugofs.NewCreateCountingFs(fs.Destination)
- }
+ return h, nil
+}
- // To debug hard-to-find path issues.
- //fs.Destination = hugofs.NewStacktracerFs(fs.Destination, `fr/fr`)
+func (r *rootCommand) Commands() []simplecobra.Commander {
+ return r.commands
+}
- err = c.initFs(fs)
+func (r *rootCommand) ConfigFromConfig(key configKey, oldConf *commonConfig) (*commonConfig, error) {
+ cc, _, err := r.commonConfigs.GetOrCreate(key, func(key configKey) (*commonConfig, error) {
+ fs := oldConf.fs
+ configs, err := allconfig.LoadConfig(
+ allconfig.ConfigSourceDescriptor{
+ Flags: oldConf.cfg,
+ Fs: fs.Source,
+ Filename: r.cfgFile,
+ ConfigDir: r.cfgDir,
+ Logger: r.logger,
+ Environment: r.environment,
+ IgnoreModuleDoesNotExist: key.ignoreModulesDoesNotExists,
+ },
+ )
if err != nil {
- close(c.created)
- return
+ return nil, err
}
- var h *hugolib.HugoSites
-
- h, err = hugolib.NewHugoSites(*c.DepsCfg)
- c.hugoSites = h
- close(c.created)
+ if !configs.Base.C.Clock.IsZero() {
+ // TODO(bep) find a better place for this.
+ htime.Clock = clocks.Start(configs.Base.C.Clock)
+ }
+ return &commonConfig{
+ mu: oldConf.mu,
+ configs: configs,
+ cfg: oldConf.cfg,
+ fs: fs,
+ }, nil
})
+ return cc, err
+}
+
+func (r *rootCommand) ConfigFromProvider(key configKey, cfg config.Provider) (*commonConfig, error) {
+ if cfg == nil {
+ panic("cfg must be set")
+ }
+ cc, _, err := r.commonConfigs.GetOrCreate(key, func(key configKey) (*commonConfig, error) {
+ var dir string
+ if r.source != "" {
+ dir, _ = filepath.Abs(r.source)
+ } else {
+ dir, _ = os.Getwd()
+ }
+
+ if cfg == nil {
+ cfg = config.New()
+ }
+
+ if !cfg.IsSet("workingDir") {
+ cfg.Set("workingDir", dir)
+ } else {
+ if err := os.MkdirAll(cfg.GetString("workingDir"), 0o777); err != nil {
+ return nil, fmt.Errorf("failed to create workingDir: %w", err)
+ }
+ }
+
+ // Load the config first to allow publishDir to be configured in config file.
+ configs, err := allconfig.LoadConfig(
+ allconfig.ConfigSourceDescriptor{
+ Flags: cfg,
+ Fs: hugofs.Os,
+ Filename: r.cfgFile,
+ ConfigDir: r.cfgDir,
+ Environment: r.environment,
+ Logger: r.logger,
+ IgnoreModuleDoesNotExist: key.ignoreModulesDoesNotExists,
+ },
+ )
+ if err != nil {
+ return nil, err
+ }
+
+ base := configs.Base
+
+ cfg.Set("publishDir", base.PublishDir)
+ cfg.Set("publishDirStatic", base.PublishDir)
+ cfg.Set("publishDirDynamic", base.PublishDir)
+
+ renderStaticToDisk := cfg.GetBool("renderStaticToDisk")
+
+ sourceFs := hugofs.Os
+ var destinationFs afero.Fs
+ if cfg.GetBool("renderToMemory") {
+ destinationFs = afero.NewMemMapFs()
+ if renderStaticToDisk {
+ // Hybrid, render dynamic content to Root.
+ cfg.Set("publishDirDynamic", "/")
+ } else {
+ // Rendering to memoryFS, publish to Root regardless of publishDir.
+ cfg.Set("publishDirDynamic", "/")
+ cfg.Set("publishDirStatic", "/")
+ }
+ } else {
+ destinationFs = hugofs.Os
+ }
+
+ fs := hugofs.NewFromSourceAndDestination(sourceFs, destinationFs, cfg)
+
+ if renderStaticToDisk {
+ dynamicFs := fs.PublishDir
+ publishDirStatic := cfg.GetString("publishDirStatic")
+ workingDir := cfg.GetString("workingDir")
+ absPublishDirStatic := paths.AbsPathify(workingDir, publishDirStatic)
+ staticFs := hugofs.NewBasePathFs(afero.NewOsFs(), absPublishDirStatic)
+
+ // Serve from both the static and dynamic fs,
+ // the first will take priority.
+ // This is a read-only filesystem,
+ // we do all the writes to
+ // fs.Destination and fs.DestinationStatic.
+ fs.PublishDirServer = overlayfs.New(
+ overlayfs.Options{
+ Fss: []afero.Fs{
+ dynamicFs,
+ staticFs,
+ },
+ },
+ )
+ fs.PublishDirStatic = staticFs
+
+ }
+
+ if !base.C.Clock.IsZero() {
+ // TODO(bep) find a better place for this.
+ htime.Clock = clocks.Start(configs.Base.C.Clock)
+ }
+
+ if base.PrintPathWarnings {
+ // Note that we only care about the "dynamic creates" here,
+ // so skip the static fs.
+ fs.PublishDir = hugofs.NewCreateCountingFs(fs.PublishDir)
+ }
+
+ commonConfig := &commonConfig{
+ mu: &sync.Mutex{},
+ configs: configs,
+ cfg: cfg,
+ fs: fs,
+ }
+
+ return commonConfig, nil
+ })
+
+ return cc, err
+}
+
+func (r *rootCommand) HugFromConfig(conf *commonConfig) (*hugolib.HugoSites, error) {
+ k := configKey{counter: r.configVersionID.Load()}
+ h, _, err := r.hugoSites.GetOrCreate(k, func(key configKey) (*hugolib.HugoSites, error) {
+ depsCfg := r.newDepsConfig(conf)
+ return hugolib.NewHugoSites(depsCfg)
+ })
+ return h, err
+}
+
+func (r *rootCommand) Hugo(cfg config.Provider) (*hugolib.HugoSites, error) {
+ return r.getOrCreateHugo(cfg, false)
+}
+
+func (r *rootCommand) getOrCreateHugo(cfg config.Provider, ignoreModuleDoesNotExist bool) (*hugolib.HugoSites, error) {
+ k := configKey{counter: r.configVersionID.Load(), ignoreModulesDoesNotExists: ignoreModuleDoesNotExist}
+ h, _, err := r.hugoSites.GetOrCreate(k, func(key configKey) (*hugolib.HugoSites, error) {
+ conf, err := r.ConfigFromProvider(key, cfg)
+ if err != nil {
+ return nil, err
+ }
+ depsCfg := r.newDepsConfig(conf)
+ return hugolib.NewHugoSites(depsCfg)
+ })
+ return h, err
+}
+
+func (r *rootCommand) newDepsConfig(conf *commonConfig) deps.DepsCfg {
+ return deps.DepsCfg{Configs: conf.configs, Fs: conf.fs, StdOut: r.logger.StdOut(), StdErr: r.logger.StdErr(), LogLevel: r.logger.Level(), ChangesFromBuild: r.changesFromBuild}
+}
+
+func (r *rootCommand) Name() string {
+ return "hugo"
+}
+
+func (r *rootCommand) Run(ctx context.Context, cd *simplecobra.Commandeer, args []string) error {
+ b := newHugoBuilder(r, nil)
+
+ if !r.buildWatch {
+ defer b.postBuild("Total", time.Now())
+ }
+
+ if err := b.loadConfig(cd, false); err != nil {
+ return err
+ }
+
+ err := func() error {
+ if r.buildWatch {
+ defer r.timeTrack(time.Now(), "Built")
+ }
+ err := b.build()
+ if err != nil {
+ return err
+ }
+ return nil
+ }()
if err != nil {
return err
}
- cacheDir, err := helpers.GetCacheDir(sourceFs, config)
+ if !r.buildWatch {
+ // Done.
+ return nil
+ }
+
+ watchDirs, err := b.getDirList()
if err != nil {
return err
}
- config.Set("cacheDir", cacheDir)
- cfg.Logger.INFO.Println("Using config file:", config.ConfigFileUsed())
+ watchGroups := helpers.ExtractAndGroupRootPaths(watchDirs)
+
+ for _, group := range watchGroups {
+ r.Printf("Watching for changes in %s\n", group)
+ }
+ watcher, err := b.newWatcher(r.poll, watchDirs...)
+ if err != nil {
+ return err
+ }
+
+ defer watcher.Close()
+
+ r.Println("Press Ctrl+C to stop")
+
+ sigs := make(chan os.Signal, 1)
+ signal.Notify(sigs, syscall.SIGINT, syscall.SIGTERM)
+
+ <-sigs
return nil
-
+}
+
+func (r *rootCommand) PreRun(cd, runner *simplecobra.Commandeer) error {
+ r.StdOut = os.Stdout
+ r.StdErr = os.Stderr
+ if r.quiet {
+ r.StdOut = io.Discard
+ r.StdErr = io.Discard
+ }
+ // Used by mkcert (server).
+ log.SetOutput(r.StdOut)
+
+ r.Printf = func(format string, v ...any) {
+ if !r.quiet {
+ fmt.Fprintf(r.StdOut, format, v...)
+ }
+ }
+ r.Println = func(a ...any) {
+ if !r.quiet {
+ fmt.Fprintln(r.StdOut, a...)
+ }
+ }
+ _, running := runner.Command.(*serverCommand)
+ var err error
+ r.logger, err = r.createLogger(running)
+ if err != nil {
+ return err
+ }
+ // Set up the global logger early to allow info deprecations during config load.
+ loggers.SetGlobalLogger(r.logger)
+
+ r.changesFromBuild = make(chan []identity.Identity, 10)
+
+ r.commonConfigs = lazycache.New(lazycache.Options[configKey, *commonConfig]{MaxEntries: 5})
+ // We don't want to keep stale HugoSites in memory longer than needed.
+ r.hugoSites = lazycache.New(lazycache.Options[configKey, *hugolib.HugoSites]{
+ MaxEntries: 1,
+ OnEvict: func(key configKey, value *hugolib.HugoSites) {
+ value.Close()
+ runtime.GC()
+ },
+ })
+
+ return nil
+}
+
+func (r *rootCommand) createLogger(running bool) (loggers.Logger, error) {
+ level := logg.LevelWarn
+
+ if r.devMode {
+ level = logg.LevelTrace
+ } else {
+ if r.logLevel != "" {
+ switch strings.ToLower(r.logLevel) {
+ case "debug":
+ level = logg.LevelDebug
+ case "info":
+ level = logg.LevelInfo
+ case "warn", "warning":
+ level = logg.LevelWarn
+ case "error":
+ level = logg.LevelError
+ default:
+ return nil, fmt.Errorf("invalid log level: %q, must be one of debug, warn, info or error", r.logLevel)
+ }
+ }
+ }
+
+ optsLogger := loggers.Options{
+ DistinctLevel: logg.LevelWarn,
+ Level: level,
+ StdOut: r.StdOut,
+ StdErr: r.StdErr,
+ StoreErrors: running,
+ }
+
+ return loggers.New(optsLogger), nil
+}
+
+func (r *rootCommand) resetLogs() {
+ r.logger.Reset()
+ loggers.Log().Reset()
+}
+
+// IsTestRun reports whether the command is running as a test.
+func (r *rootCommand) IsTestRun() bool {
+ return os.Getenv("HUGO_TESTRUN") != ""
+}
+
+func (r *rootCommand) Init(cd *simplecobra.Commandeer) error {
+ return r.initRootCommand("", cd)
+}
+
+func (r *rootCommand) initRootCommand(subCommandName string, cd *simplecobra.Commandeer) error {
+ cmd := cd.CobraCommand
+ commandName := "hugo"
+ if subCommandName != "" {
+ commandName = subCommandName
+ }
+ cmd.Use = fmt.Sprintf("%s [flags]", commandName)
+ cmd.Short = "Build your site"
+ cmd.Long = `COMMAND_NAME is the main command, used to build your Hugo site.
+
+Hugo is a Fast and Flexible Static Site Generator
+built with love by spf13 and friends in Go.
+
+Complete documentation is available at https://gohugo.io/.`
+
+ cmd.Long = strings.ReplaceAll(cmd.Long, "COMMAND_NAME", commandName)
+
+ // Configure persistent flags
+ cmd.PersistentFlags().StringVarP(&r.source, "source", "s", "", "filesystem path to read files relative from")
+ _ = cmd.MarkFlagDirname("source")
+ cmd.PersistentFlags().StringP("destination", "d", "", "filesystem path to write files to")
+ _ = cmd.MarkFlagDirname("destination")
+
+ cmd.PersistentFlags().StringVarP(&r.environment, "environment", "e", "", "build environment")
+ _ = cmd.RegisterFlagCompletionFunc("environment", cobra.NoFileCompletions)
+ cmd.PersistentFlags().StringP("themesDir", "", "", "filesystem path to themes directory")
+ _ = cmd.MarkFlagDirname("themesDir")
+ cmd.PersistentFlags().StringP("ignoreVendorPaths", "", "", "ignores any _vendor for module paths matching the given Glob pattern")
+ cmd.PersistentFlags().BoolP("noBuildLock", "", false, "don't create .hugo_build.lock file")
+ _ = cmd.RegisterFlagCompletionFunc("ignoreVendorPaths", cobra.NoFileCompletions)
+ cmd.PersistentFlags().String("clock", "", "set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00")
+ _ = cmd.RegisterFlagCompletionFunc("clock", cobra.NoFileCompletions)
+
+ cmd.PersistentFlags().StringVar(&r.cfgFile, "config", "", "config file (default is hugo.yaml|json|toml)")
+ _ = cmd.MarkFlagFilename("config", config.ValidConfigFileExtensions...)
+ cmd.PersistentFlags().StringVar(&r.cfgDir, "configDir", "config", "config dir")
+ _ = cmd.MarkFlagDirname("configDir")
+ cmd.PersistentFlags().BoolVar(&r.quiet, "quiet", false, "build in quiet mode")
+ cmd.PersistentFlags().BoolVarP(&r.renderToMemory, "renderToMemory", "M", false, "render to memory (mostly useful when running the server)")
+
+ cmd.PersistentFlags().BoolVarP(&r.devMode, "devMode", "", false, "only used for internal testing, flag hidden.")
+ cmd.PersistentFlags().StringVar(&r.logLevel, "logLevel", "", "log level (debug|info|warn|error)")
+ _ = cmd.RegisterFlagCompletionFunc("logLevel", cobra.FixedCompletions([]string{"debug", "info", "warn", "error"}, cobra.ShellCompDirectiveNoFileComp))
+ cmd.Flags().BoolVarP(&r.buildWatch, "watch", "w", false, "watch filesystem for changes and recreate as needed")
+
+ cmd.PersistentFlags().MarkHidden("devMode")
+
+ // Configure local flags
+ applyLocalFlagsBuild(cmd, r)
+
+ return nil
+}
+
+// A subset of the complete build flags. These flags are used by new and mod.
+func applyLocalFlagsBuildConfig(cmd *cobra.Command, r *rootCommand) {
+ cmd.Flags().StringSliceP("theme", "t", []string{}, "themes to use (located in /themes/THEMENAME/)")
+ _ = cmd.MarkFlagDirname("theme")
+ cmd.Flags().StringVarP(&r.baseURL, "baseURL", "b", "", "hostname (and path) to the root, e.g. https://spf13.com/")
+ cmd.Flags().StringP("cacheDir", "", "", "filesystem path to cache directory")
+ _ = cmd.MarkFlagDirname("cacheDir")
+ cmd.Flags().StringP("contentDir", "c", "", "filesystem path to content directory")
+ cmd.Flags().StringSliceP("renderSegments", "", []string{}, "named segments to render (configured in the segments config)")
+}
+
+// Flags needed to do a build (used by hugo and hugo server commands)
+func applyLocalFlagsBuild(cmd *cobra.Command, r *rootCommand) {
+ applyLocalFlagsBuildConfig(cmd, r)
+ cmd.Flags().Bool("cleanDestinationDir", false, "remove files from destination not found in static directories")
+ cmd.Flags().BoolP("buildDrafts", "D", false, "include content marked as draft")
+ cmd.Flags().BoolP("buildFuture", "F", false, "include content with publishdate in the future")
+ cmd.Flags().BoolP("buildExpired", "E", false, "include expired content")
+ cmd.Flags().BoolP("ignoreCache", "", false, "ignores the cache directory")
+ cmd.Flags().Bool("enableGitInfo", false, "add Git revision, date, author, and CODEOWNERS info to the pages")
+ cmd.Flags().StringP("layoutDir", "l", "", "filesystem path to layout directory")
+ _ = cmd.MarkFlagDirname("layoutDir")
+ cmd.Flags().BoolVar(&r.gc, "gc", false, "enable to run some cleanup tasks (remove unused cache files) after the build")
+ cmd.Flags().StringVar(&r.poll, "poll", "", "set this to a poll interval, e.g --poll 700ms, to use a poll based approach to watch for file system changes")
+ _ = cmd.RegisterFlagCompletionFunc("poll", cobra.NoFileCompletions)
+ cmd.Flags().Bool("panicOnWarning", false, "panic on first WARNING log")
+ cmd.Flags().Bool("templateMetrics", false, "display metrics about template executions")
+ cmd.Flags().Bool("templateMetricsHints", false, "calculate some improvement hints when combined with --templateMetrics")
+ cmd.Flags().BoolVar(&r.forceSyncStatic, "forceSyncStatic", false, "copy all files when static is changed.")
+ cmd.Flags().BoolP("noTimes", "", false, "don't sync modification time of files")
+ cmd.Flags().BoolP("noChmod", "", false, "don't sync permission mode of files")
+ cmd.Flags().BoolP("printI18nWarnings", "", false, "print missing translations")
+ cmd.Flags().BoolP("printPathWarnings", "", false, "print warnings on duplicate target paths etc.")
+ cmd.Flags().BoolP("printUnusedTemplates", "", false, "print warnings on unused templates.")
+ cmd.Flags().StringVarP(&r.cpuprofile, "profile-cpu", "", "", "write cpu profile to `file`")
+ cmd.Flags().StringVarP(&r.memprofile, "profile-mem", "", "", "write memory profile to `file`")
+ cmd.Flags().BoolVarP(&r.printm, "printMemoryUsage", "", false, "print memory usage to screen at intervals")
+ cmd.Flags().StringVarP(&r.mutexprofile, "profile-mutex", "", "", "write Mutex profile to `file`")
+ cmd.Flags().StringVarP(&r.traceprofile, "trace", "", "", "write trace to `file` (not useful in general)")
+
+ // Hide these for now.
+ cmd.Flags().MarkHidden("profile-cpu")
+ cmd.Flags().MarkHidden("profile-mem")
+ cmd.Flags().MarkHidden("profile-mutex")
+
+ cmd.Flags().StringSlice("disableKinds", []string{}, "disable different kind of pages (home, RSS etc.)")
+ _ = cmd.RegisterFlagCompletionFunc("disableKinds", cobra.FixedCompletions(kinds.AllKinds, cobra.ShellCompDirectiveNoFileComp))
+ cmd.Flags().Bool("minify", false, "minify any supported output format (HTML, XML etc.)")
+}
+
+func (r *rootCommand) timeTrack(start time.Time, name string) {
+ elapsed := time.Since(start)
+ r.Printf("%s in %v ms\n", name, int(1000*elapsed.Seconds()))
+}
+
+type simpleCommand struct {
+ use string
+ name string
+ short string
+ long string
+ run func(ctx context.Context, cd *simplecobra.Commandeer, rootCmd *rootCommand, args []string) error
+ withc func(cmd *cobra.Command, r *rootCommand)
+ initc func(cd *simplecobra.Commandeer) error
+
+ commands []simplecobra.Commander
+
+ rootCmd *rootCommand
+}
+
+func (c *simpleCommand) Commands() []simplecobra.Commander {
+ return c.commands
+}
+
+func (c *simpleCommand) Name() string {
+ return c.name
+}
+
+func (c *simpleCommand) Run(ctx context.Context, cd *simplecobra.Commandeer, args []string) error {
+ if c.run == nil {
+ return nil
+ }
+ return c.run(ctx, cd, c.rootCmd, args)
+}
+
+func (c *simpleCommand) Init(cd *simplecobra.Commandeer) error {
+ c.rootCmd = cd.Root.Command.(*rootCommand)
+ cmd := cd.CobraCommand
+ cmd.Short = c.short
+ cmd.Long = c.long
+ if c.use != "" {
+ cmd.Use = c.use
+ }
+ if c.withc != nil {
+ c.withc(cmd, c.rootCmd)
+ }
+ return nil
+}
+
+func (c *simpleCommand) PreRun(cd, runner *simplecobra.Commandeer) error {
+ if c.initc != nil {
+ return c.initc(cd)
+ }
+ return nil
+}
+
+func mapLegacyArgs(args []string) []string {
+ if len(args) > 1 && args[0] == "new" && !hstrings.EqualAny(args[1], "site", "theme", "content") {
+ // Insert "content" as the second argument
+ args = append(args[:1], append([]string{"content"}, args[1:]...)...)
+ }
+ return args
}
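
The rootCommand above keeps its configs and HugoSites in bep/lazycache instances keyed by a configKey, so bumping configVersionID naturally invalidates entries built for the old key. A minimal sketch of that pattern (hypothetical key and value types; the Options/GetOrCreate shapes match the calls in the diff):

```go
package main

import (
	"fmt"

	"github.com/bep/lazycache"
)

// versionKey plays the role of configKey: changing the counter yields a new cache entry.
type versionKey struct{ counter int32 }

func main() {
	cache := lazycache.New(lazycache.Options[versionKey, string]{
		MaxEntries: 1, // like the hugoSites cache above, keep only the latest entry
	})

	build := func(k versionKey) (string, error) {
		fmt.Println("building for version", k.counter)
		return fmt.Sprintf("sites-v%d", k.counter), nil
	}

	v, _, _ := cache.GetOrCreate(versionKey{1}, build) // builds
	v, _, _ = cache.GetOrCreate(versionKey{1}, build)  // served from cache, no rebuild
	v, _, _ = cache.GetOrCreate(versionKey{2}, build)  // new key: rebuild, old entry evicted
	fmt.Println(v)
}
```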
diff --git a/commands/commands.go b/commands/commands.go
index 3096e1fe0..10ab106e2 100644
--- a/commands/commands.go
+++ b/commands/commands.go
@@ -1,4 +1,4 @@
-// Copyright 2019 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -14,299 +14,60 @@
package commands
import (
- "os"
+ "context"
- "github.com/gohugoio/hugo/hugolib/paths"
-
- "github.com/gohugoio/hugo/common/hugo"
- "github.com/gohugoio/hugo/common/loggers"
- "github.com/gohugoio/hugo/config"
- "github.com/gohugoio/hugo/helpers"
- "github.com/spf13/cobra"
+ "github.com/bep/simplecobra"
)
-type commandsBuilder struct {
- hugoBuilderCommon
-
- commands []cmder
-}
-
-func newCommandsBuilder() *commandsBuilder {
- return &commandsBuilder{}
-}
-
-func (b *commandsBuilder) addCommands(commands ...cmder) *commandsBuilder {
- b.commands = append(b.commands, commands...)
- return b
-}
-
-func (b *commandsBuilder) addAll() *commandsBuilder {
- b.addCommands(
- b.newServerCmd(),
- newVersionCmd(),
- newEnvCmd(),
- newConfigCmd(),
- newCheckCmd(),
- newDeployCmd(),
- newConvertCmd(),
- b.newNewCmd(),
- newListCmd(),
- newImportCmd(),
- newGenCmd(),
- createReleaser(),
- b.newModCmd(),
- )
-
- return b
-}
-
-func (b *commandsBuilder) build() *hugoCmd {
- h := b.newHugoCmd()
- addCommands(h.getCommand(), b.commands...)
- return h
-}
-
-func addCommands(root *cobra.Command, commands ...cmder) {
- for _, command := range commands {
- cmd := command.getCommand()
- if cmd == nil {
- continue
- }
- root.AddCommand(cmd)
+// newExec wires up all of Hugo's CLI.
+func newExec() (*simplecobra.Exec, error) {
+ rootCmd := &rootCommand{
+ commands: []simplecobra.Commander{
+ newHugoBuildCmd(),
+ newVersionCmd(),
+ newEnvCommand(),
+ newServerCommand(),
+ newDeployCommand(),
+ newConfigCommand(),
+ newNewCommand(),
+ newConvertCommand(),
+ newImportCommand(),
+ newListCommand(),
+ newModCommands(),
+ newGenCommand(),
+ newReleaseCommand(),
+ },
}
+
+ return simplecobra.New(rootCmd)
}
-type baseCmd struct {
- cmd *cobra.Command
+func newHugoBuildCmd() simplecobra.Commander {
+ return &hugoBuildCommand{}
}
-var _ commandsBuilderGetter = (*baseBuilderCmd)(nil)
-
-// Used in tests.
-type commandsBuilderGetter interface {
- getCommandsBuilder() *commandsBuilder
-}
-type baseBuilderCmd struct {
- *baseCmd
- *commandsBuilder
+// hugoBuildCommand just delegates to the rootCommand.
+type hugoBuildCommand struct {
+ rootCmd *rootCommand
}
-func (b *baseBuilderCmd) getCommandsBuilder() *commandsBuilder {
- return b.commandsBuilder
-}
-
-func (c *baseCmd) getCommand() *cobra.Command {
- return c.cmd
-}
-
-func newBaseCmd(cmd *cobra.Command) *baseCmd {
- return &baseCmd{cmd: cmd}
-}
-
-func (b *commandsBuilder) newBuilderCmd(cmd *cobra.Command) *baseBuilderCmd {
- bcmd := &baseBuilderCmd{commandsBuilder: b, baseCmd: &baseCmd{cmd: cmd}}
- bcmd.hugoBuilderCommon.handleFlags(cmd)
- return bcmd
-}
-
-func (c *baseCmd) flagsToConfig(cfg config.Provider) {
- initializeFlags(c.cmd, cfg)
-}
-
-type hugoCmd struct {
- *baseBuilderCmd
-
- // Need to get the sites once built.
- c *commandeer
-}
-
-var _ cmder = (*nilCommand)(nil)
-
-type nilCommand struct {
-}
-
-func (c *nilCommand) getCommand() *cobra.Command {
+func (c *hugoBuildCommand) Commands() []simplecobra.Commander {
return nil
}
-func (c *nilCommand) flagsToConfig(cfg config.Provider) {
-
+func (c *hugoBuildCommand) Name() string {
+ return "build"
}
-func (b *commandsBuilder) newHugoCmd() *hugoCmd {
- cc := &hugoCmd{}
-
- cc.baseBuilderCmd = b.newBuilderCmd(&cobra.Command{
- Use: "hugo",
- Short: "hugo builds your site",
- Long: `hugo is the main command, used to build your Hugo site.
-
-Hugo is a Fast and Flexible Static Site Generator
-built with love by spf13 and friends in Go.
-
-Complete documentation is available at http://gohugo.io/.`,
- RunE: func(cmd *cobra.Command, args []string) error {
- cfgInit := func(c *commandeer) error {
- if cc.buildWatch {
- c.Set("disableLiveReload", true)
- }
- return nil
- }
-
- c, err := initializeConfig(true, cc.buildWatch, &cc.hugoBuilderCommon, cc, cfgInit)
- if err != nil {
- return err
- }
- cc.c = c
-
- return c.build()
- },
- })
-
- cc.cmd.PersistentFlags().StringVar(&cc.cfgFile, "config", "", "config file (default is path/config.yaml|json|toml)")
- cc.cmd.PersistentFlags().StringVar(&cc.cfgDir, "configDir", "config", "config dir")
- cc.cmd.PersistentFlags().BoolVar(&cc.quiet, "quiet", false, "build in quiet mode")
-
- // Set bash-completion
- _ = cc.cmd.PersistentFlags().SetAnnotation("config", cobra.BashCompFilenameExt, config.ValidConfigFileExtensions)
-
- cc.cmd.PersistentFlags().BoolVarP(&cc.verbose, "verbose", "v", false, "verbose output")
- cc.cmd.PersistentFlags().BoolVarP(&cc.debug, "debug", "", false, "debug output")
- cc.cmd.PersistentFlags().BoolVar(&cc.logging, "log", false, "enable Logging")
- cc.cmd.PersistentFlags().StringVar(&cc.logFile, "logFile", "", "log File path (if set, logging enabled automatically)")
- cc.cmd.PersistentFlags().BoolVar(&cc.verboseLog, "verboseLog", false, "verbose logging")
-
- cc.cmd.Flags().BoolVarP(&cc.buildWatch, "watch", "w", false, "watch filesystem for changes and recreate as needed")
-
- cc.cmd.Flags().Bool("renderToMemory", false, "render to memory (only useful for benchmark testing)")
-
- // Set bash-completion
- _ = cc.cmd.PersistentFlags().SetAnnotation("logFile", cobra.BashCompFilenameExt, []string{})
-
- cc.cmd.SetGlobalNormalizationFunc(helpers.NormalizeHugoFlags)
- cc.cmd.SilenceUsage = true
-
- return cc
+func (c *hugoBuildCommand) Init(cd *simplecobra.Commandeer) error {
+ c.rootCmd = cd.Root.Command.(*rootCommand)
+ return c.rootCmd.initRootCommand("build", cd)
}
-type hugoBuilderCommon struct {
- source string
- baseURL string
- environment string
-
- buildWatch bool
-
- gc bool
-
- // Profile flags (for debugging of performance problems)
- cpuprofile string
- memprofile string
- mutexprofile string
- traceprofile string
-
- // TODO(bep) var vs string
- logging bool
- verbose bool
- verboseLog bool
- debug bool
- quiet bool
-
- cfgFile string
- cfgDir string
- logFile string
+func (c *hugoBuildCommand) PreRun(cd, runner *simplecobra.Commandeer) error {
+ return c.rootCmd.PreRun(cd, runner)
}
-func (cc *hugoBuilderCommon) getConfigDir(baseDir string) string {
- if cc.cfgDir != "" {
- return paths.AbsPathify(baseDir, cc.cfgDir)
- }
-
- if v, found := os.LookupEnv("HUGO_CONFIGDIR"); found {
- return paths.AbsPathify(baseDir, v)
- }
-
- return paths.AbsPathify(baseDir, "config")
-}
-
-func (cc *hugoBuilderCommon) getEnvironment(isServer bool) string {
- if cc.environment != "" {
- return cc.environment
- }
-
- if v, found := os.LookupEnv("HUGO_ENVIRONMENT"); found {
- return v
- }
-
- if isServer {
- return hugo.EnvironmentDevelopment
- }
-
- return hugo.EnvironmentProduction
-}
-
-func (cc *hugoBuilderCommon) handleCommonBuilderFlags(cmd *cobra.Command) {
- cmd.PersistentFlags().StringVarP(&cc.source, "source", "s", "", "filesystem path to read files relative from")
- cmd.PersistentFlags().SetAnnotation("source", cobra.BashCompSubdirsInDir, []string{})
- cmd.PersistentFlags().StringVarP(&cc.environment, "environment", "e", "", "build environment")
- cmd.PersistentFlags().StringP("themesDir", "", "", "filesystem path to themes directory")
- cmd.PersistentFlags().BoolP("ignoreVendor", "", false, "ignores any _vendor directory")
-}
-
-func (cc *hugoBuilderCommon) handleFlags(cmd *cobra.Command) {
- cc.handleCommonBuilderFlags(cmd)
- cmd.Flags().Bool("cleanDestinationDir", false, "remove files from destination not found in static directories")
- cmd.Flags().BoolP("buildDrafts", "D", false, "include content marked as draft")
- cmd.Flags().BoolP("buildFuture", "F", false, "include content with publishdate in the future")
- cmd.Flags().BoolP("buildExpired", "E", false, "include expired content")
- cmd.Flags().StringP("contentDir", "c", "", "filesystem path to content directory")
- cmd.Flags().StringP("layoutDir", "l", "", "filesystem path to layout directory")
- cmd.Flags().StringP("cacheDir", "", "", "filesystem path to cache directory. Defaults: $TMPDIR/hugo_cache/")
- cmd.Flags().BoolP("ignoreCache", "", false, "ignores the cache directory")
- cmd.Flags().StringP("destination", "d", "", "filesystem path to write files to")
- cmd.Flags().StringSliceP("theme", "t", []string{}, "themes to use (located in /themes/THEMENAME/)")
- cmd.Flags().StringVarP(&cc.baseURL, "baseURL", "b", "", "hostname (and path) to the root, e.g. http://spf13.com/")
- cmd.Flags().Bool("enableGitInfo", false, "add Git revision, date and author info to the pages")
- cmd.Flags().BoolVar(&cc.gc, "gc", false, "enable to run some cleanup tasks (remove unused cache files) after the build")
-
- cmd.Flags().Bool("templateMetrics", false, "display metrics about template executions")
- cmd.Flags().Bool("templateMetricsHints", false, "calculate some improvement hints when combined with --templateMetrics")
- cmd.Flags().BoolP("forceSyncStatic", "", false, "copy all files when static is changed.")
- cmd.Flags().BoolP("noTimes", "", false, "don't sync modification time of files")
- cmd.Flags().BoolP("noChmod", "", false, "don't sync permission mode of files")
- cmd.Flags().BoolP("i18n-warnings", "", false, "print missing translations")
- cmd.Flags().BoolP("path-warnings", "", false, "print warnings on duplicate target paths etc.")
- cmd.Flags().StringVarP(&cc.cpuprofile, "profile-cpu", "", "", "write cpu profile to `file`")
- cmd.Flags().StringVarP(&cc.memprofile, "profile-mem", "", "", "write memory profile to `file`")
- cmd.Flags().StringVarP(&cc.mutexprofile, "profile-mutex", "", "", "write Mutex profile to `file`")
- cmd.Flags().StringVarP(&cc.traceprofile, "trace", "", "", "write trace to `file` (not useful in general)")
-
- // Hide these for now.
- cmd.Flags().MarkHidden("profile-cpu")
- cmd.Flags().MarkHidden("profile-mem")
- cmd.Flags().MarkHidden("profile-mutex")
-
- cmd.Flags().StringSlice("disableKinds", []string{}, "disable different kind of pages (home, RSS etc.)")
-
- cmd.Flags().Bool("minify", false, "minify any supported output format (HTML, XML etc.)")
-
- // Set bash-completion.
- // Each flag must first be defined before using the SetAnnotation() call.
- _ = cmd.Flags().SetAnnotation("source", cobra.BashCompSubdirsInDir, []string{})
- _ = cmd.Flags().SetAnnotation("cacheDir", cobra.BashCompSubdirsInDir, []string{})
- _ = cmd.Flags().SetAnnotation("destination", cobra.BashCompSubdirsInDir, []string{})
- _ = cmd.Flags().SetAnnotation("theme", cobra.BashCompSubdirsInDir, []string{"themes"})
-}
-
-func checkErr(logger *loggers.Logger, err error, s ...string) {
- if err == nil {
- return
- }
- if len(s) == 0 {
- logger.CRITICAL.Println(err)
- return
- }
- for _, message := range s {
- logger.ERROR.Println(message)
- }
- logger.ERROR.Println(err)
+func (c *hugoBuildCommand) Run(ctx context.Context, cd *simplecobra.Commandeer, args []string) error {
+ return c.rootCmd.Run(ctx, cd, args)
}
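
Between the builder code removed above and the new Commander implementations, the shape of the refactor is easiest to see in isolation. Below is a minimal, hypothetical sketch (a toy "greet" command, not part of this changeset) of the simplecobra.Commander method set that hugoBuildCommand implements above; the Exec.Execute call at the end is assumed from the simplecobra library rather than shown in this diff.

// Minimal sketch of the simplecobra.Commander pattern adopted in this changeset.
// The method set (Name, Commands, Init, PreRun, Run) mirrors hugoBuildCommand above;
// the "greet" command itself is hypothetical and only for illustration.
package main

import (
	"context"
	"fmt"

	"github.com/bep/simplecobra"
	"github.com/spf13/cobra"
)

type greetCommand struct{}

func (c *greetCommand) Name() string                                      { return "greet" }
func (c *greetCommand) Commands() []simplecobra.Commander                 { return nil }
func (c *greetCommand) PreRun(this, runner *simplecobra.Commandeer) error { return nil }

func (c *greetCommand) Init(cd *simplecobra.Commandeer) error {
	// Cobra-level configuration happens here, against cd.CobraCommand.
	cmd := cd.CobraCommand
	cmd.Short = "Print a greeting"
	cmd.ValidArgsFunction = cobra.NoFileCompletions
	return nil
}

func (c *greetCommand) Run(ctx context.Context, cd *simplecobra.Commandeer, args []string) error {
	fmt.Println("hello", args)
	return nil
}

func main() {
	// simplecobra.New builds the cobra command tree from the Commander tree,
	// as done for rootCmd near the top of this file.
	x, err := simplecobra.New(&greetCommand{})
	if err != nil {
		panic(err)
	}
	// Execute's signature is assumed from the library; it is not shown in this diff.
	if _, err := x.Execute(context.Background(), []string{"world"}); err != nil {
		panic(err)
	}
}
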
diff --git a/commands/commands_test.go b/commands/commands_test.go
deleted file mode 100644
index 565736793..000000000
--- a/commands/commands_test.go
+++ /dev/null
@@ -1,286 +0,0 @@
-// Copyright 2019 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package commands
-
-import (
- "fmt"
- "io/ioutil"
- "os"
- "path/filepath"
- "testing"
-
- "github.com/gohugoio/hugo/common/types"
-
- "github.com/spf13/cobra"
- "github.com/spf13/viper"
-
- qt "github.com/frankban/quicktest"
-)
-
-func TestExecute(t *testing.T) {
-
- c := qt.New(t)
-
- dir, err := createSimpleTestSite(t, testSiteConfig{})
- c.Assert(err, qt.IsNil)
-
- defer func() {
- os.RemoveAll(dir)
- }()
-
- resp := Execute([]string{"-s=" + dir})
- c.Assert(resp.Err, qt.IsNil)
- result := resp.Result
- c.Assert(len(result.Sites) == 1, qt.Equals, true)
- c.Assert(len(result.Sites[0].RegularPages()) == 1, qt.Equals, true)
-}
-
-func TestCommandsPersistentFlags(t *testing.T) {
- c := qt.New(t)
-
- noOpRunE := func(cmd *cobra.Command, args []string) error {
- return nil
- }
-
- tests := []struct {
- args []string
- check func(command []cmder)
- }{{[]string{"server",
- "--config=myconfig.toml",
- "--configDir=myconfigdir",
- "--contentDir=mycontent",
- "--disableKinds=page,home",
- "--environment=testing",
- "--configDir=myconfigdir",
- "--layoutDir=mylayouts",
- "--theme=mytheme",
- "--gc",
- "--themesDir=mythemes",
- "--cleanDestinationDir",
- "--navigateToChanged",
- "--disableLiveReload",
- "--noHTTPCache",
- "--i18n-warnings",
- "--destination=/tmp/mydestination",
- "-b=https://example.com/b/",
- "--port=1366",
- "--renderToDisk",
- "--source=mysource",
- "--path-warnings",
- }, func(commands []cmder) {
- var sc *serverCmd
- for _, command := range commands {
- if b, ok := command.(commandsBuilderGetter); ok {
- v := b.getCommandsBuilder().hugoBuilderCommon
- c.Assert(v.cfgFile, qt.Equals, "myconfig.toml")
- c.Assert(v.cfgDir, qt.Equals, "myconfigdir")
- c.Assert(v.source, qt.Equals, "mysource")
- c.Assert(v.baseURL, qt.Equals, "https://example.com/b/")
- }
-
- if srvCmd, ok := command.(*serverCmd); ok {
- sc = srvCmd
- }
- }
-
- c.Assert(sc, qt.Not(qt.IsNil))
- c.Assert(sc.navigateToChanged, qt.Equals, true)
- c.Assert(sc.disableLiveReload, qt.Equals, true)
- c.Assert(sc.noHTTPCache, qt.Equals, true)
- c.Assert(sc.renderToDisk, qt.Equals, true)
- c.Assert(sc.serverPort, qt.Equals, 1366)
- c.Assert(sc.environment, qt.Equals, "testing")
-
- cfg := viper.New()
- sc.flagsToConfig(cfg)
- c.Assert(cfg.GetString("publishDir"), qt.Equals, "/tmp/mydestination")
- c.Assert(cfg.GetString("contentDir"), qt.Equals, "mycontent")
- c.Assert(cfg.GetString("layoutDir"), qt.Equals, "mylayouts")
- c.Assert(cfg.GetStringSlice("theme"), qt.DeepEquals, []string{"mytheme"})
- c.Assert(cfg.GetString("themesDir"), qt.Equals, "mythemes")
- c.Assert(cfg.GetString("baseURL"), qt.Equals, "https://example.com/b/")
-
- c.Assert(cfg.Get("disableKinds"), qt.DeepEquals, []string{"page", "home"})
-
- c.Assert(cfg.GetBool("gc"), qt.Equals, true)
-
- // The flag is named path-warnings
- c.Assert(cfg.GetBool("logPathWarnings"), qt.Equals, true)
-
- // The flag is named i18n-warnings
- c.Assert(cfg.GetBool("logI18nWarnings"), qt.Equals, true)
-
- }}}
-
- for _, test := range tests {
- b := newCommandsBuilder()
- root := b.addAll().build()
-
- for _, c := range b.commands {
- if c.getCommand() == nil {
- continue
- }
- // We are only intereseted in the flag handling here.
- c.getCommand().RunE = noOpRunE
- }
- rootCmd := root.getCommand()
- rootCmd.SetArgs(test.args)
- c.Assert(rootCmd.Execute(), qt.IsNil)
- test.check(b.commands)
- }
-
-}
-
-func TestCommandsExecute(t *testing.T) {
-
- c := qt.New(t)
-
- dir, err := createSimpleTestSite(t, testSiteConfig{})
- c.Assert(err, qt.IsNil)
-
- dirOut, err := ioutil.TempDir("", "hugo-cli-out")
- c.Assert(err, qt.IsNil)
-
- defer func() {
- os.RemoveAll(dir)
- os.RemoveAll(dirOut)
- }()
-
- sourceFlag := fmt.Sprintf("-s=%s", dir)
-
- tests := []struct {
- commands []string
- flags []string
- expectErrToContain string
- }{
- // TODO(bep) permission issue on my OSX? "operation not permitted" {[]string{"check", "ulimit"}, nil, false},
- {[]string{"env"}, nil, ""},
- {[]string{"version"}, nil, ""},
- // no args = hugo build
- {nil, []string{sourceFlag}, ""},
- {nil, []string{sourceFlag, "--renderToMemory"}, ""},
- {[]string{"config"}, []string{sourceFlag}, ""},
- {[]string{"convert", "toTOML"}, []string{sourceFlag, "-o=" + filepath.Join(dirOut, "toml")}, ""},
- {[]string{"convert", "toYAML"}, []string{sourceFlag, "-o=" + filepath.Join(dirOut, "yaml")}, ""},
- {[]string{"convert", "toJSON"}, []string{sourceFlag, "-o=" + filepath.Join(dirOut, "json")}, ""},
- {[]string{"gen", "autocomplete"}, []string{"--completionfile=" + filepath.Join(dirOut, "autocomplete.txt")}, ""},
- {[]string{"gen", "chromastyles"}, []string{"--style=manni"}, ""},
- {[]string{"gen", "doc"}, []string{"--dir=" + filepath.Join(dirOut, "doc")}, ""},
- {[]string{"gen", "man"}, []string{"--dir=" + filepath.Join(dirOut, "man")}, ""},
- {[]string{"list", "drafts"}, []string{sourceFlag}, ""},
- {[]string{"list", "expired"}, []string{sourceFlag}, ""},
- {[]string{"list", "future"}, []string{sourceFlag}, ""},
- {[]string{"new", "new-page.md"}, []string{sourceFlag}, ""},
- {[]string{"new", "site", filepath.Join(dirOut, "new-site")}, nil, ""},
- {[]string{"unknowncommand"}, nil, "unknown command"},
- // TODO(bep) cli refactor fix https://github.com/gohugoio/hugo/issues/4450
- //{[]string{"new", "theme", filepath.Join(dirOut, "new-theme")}, nil,false},
- }
-
- for _, test := range tests {
- b := newCommandsBuilder().addAll().build()
- hugoCmd := b.getCommand()
- test.flags = append(test.flags, "--quiet")
- hugoCmd.SetArgs(append(test.commands, test.flags...))
-
- // TODO(bep) capture output and add some simple asserts
- // TODO(bep) misspelled subcommands does not return an error. We should investigate this
- // but before that, check for "Error: unknown command".
-
- _, err := hugoCmd.ExecuteC()
- if test.expectErrToContain != "" {
- c.Assert(err, qt.Not(qt.IsNil))
- c.Assert(err.Error(), qt.Contains, test.expectErrToContain)
- } else {
- c.Assert(err, qt.IsNil)
- }
-
- // Assert that we have not left any development debug artifacts in
- // the code.
- if b.c != nil {
- _, ok := b.c.destinationFs.(types.DevMarker)
- c.Assert(ok, qt.Equals, false)
- }
-
- }
-
-}
-
-type testSiteConfig struct {
- configTOML string
- contentDir string
-}
-
-func createSimpleTestSite(t *testing.T, cfg testSiteConfig) (string, error) {
- d, e := ioutil.TempDir("", "hugo-cli")
- if e != nil {
- return "", e
- }
-
- cfgStr := `
-
-baseURL = "https://example.org"
-title = "Hugo Commands"
-
-`
-
- contentDir := "content"
-
- if cfg.configTOML != "" {
- cfgStr = cfg.configTOML
- }
- if cfg.contentDir != "" {
- contentDir = cfg.contentDir
- }
-
- // Just the basic. These are for CLI tests, not site testing.
- writeFile(t, filepath.Join(d, "config.toml"), cfgStr)
-
- writeFile(t, filepath.Join(d, contentDir, "p1.md"), `
----
-title: "P1"
-weight: 1
----
-
-Content
-
-`)
-
- writeFile(t, filepath.Join(d, "layouts", "_default", "single.html"), `
-
-Single: {{ .Title }}
-
-`)
-
- writeFile(t, filepath.Join(d, "layouts", "_default", "list.html"), `
-
-List: {{ .Title }}
-Environment: {{ hugo.Environment }}
-
-`)
-
- return d, nil
-
-}
-
-func writeFile(t *testing.T, filename, content string) {
- must(t, os.MkdirAll(filepath.Dir(filename), os.FileMode(0755)))
- must(t, ioutil.WriteFile(filename, []byte(content), os.FileMode(0755)))
-}
-
-func must(t *testing.T, err error) {
- if err != nil {
- t.Fatal(err)
- }
-}
diff --git a/commands/config.go b/commands/config.go
index 72c2a0d97..7d166b9b8 100644
--- a/commands/config.go
+++ b/commands/config.go
@@ -1,4 +1,4 @@
-// Copyright 2015 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -9,139 +9,231 @@
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
-// limitations under the License.Print the version number of Hug
+// limitations under the License.
package commands
import (
+ "bytes"
+ "context"
"encoding/json"
+ "fmt"
"os"
- "reflect"
- "regexp"
- "sort"
"strings"
+ "time"
+ "github.com/bep/simplecobra"
+ "github.com/gohugoio/hugo/common/maps"
+ "github.com/gohugoio/hugo/config/allconfig"
+ "github.com/gohugoio/hugo/modules"
"github.com/gohugoio/hugo/parser"
"github.com/gohugoio/hugo/parser/metadecoders"
-
- "github.com/gohugoio/hugo/modules"
-
"github.com/spf13/cobra"
- jww "github.com/spf13/jwalterweatherman"
- "github.com/spf13/viper"
)
-var _ cmder = (*configCmd)(nil)
-
-type configCmd struct {
- hugoBuilderCommon
- *baseCmd
-}
-
-func newConfigCmd() *configCmd {
- cc := &configCmd{}
- cc.baseCmd = newBaseCmd(&cobra.Command{
- Use: "config",
- Short: "Print the site configuration",
- Long: `Print the site configuration, both default and custom settings.`,
- RunE: cc.printConfig,
- })
-
- cc.cmd.PersistentFlags().StringVarP(&cc.source, "source", "s", "", "filesystem path to read files relative from")
-
- printMountsCmd := &cobra.Command{
- Use: "mounts",
- Short: "Print the configured file mounts",
- RunE: cc.printMounts,
+// newConfigCommand creates a new config command and its subcommands.
+func newConfigCommand() *configCommand {
+ return &configCommand{
+ commands: []simplecobra.Commander{
+ &configMountsCommand{},
+ },
}
-
- cc.cmd.AddCommand(printMountsCmd)
-
- return cc
}
-func (c *configCmd) printMounts(cmd *cobra.Command, args []string) error {
- cfg, err := initializeConfig(true, false, &c.hugoBuilderCommon, c, nil)
+type configCommand struct {
+ r *rootCommand
+
+ format string
+ lang string
+ printZero bool
+
+ commands []simplecobra.Commander
+}
+
+func (c *configCommand) Commands() []simplecobra.Commander {
+ return c.commands
+}
+
+func (c *configCommand) Name() string {
+ return "config"
+}
+
+func (c *configCommand) Run(ctx context.Context, cd *simplecobra.Commandeer, args []string) error {
+ conf, err := c.r.ConfigFromProvider(configKey{counter: c.r.configVersionID.Load()}, flagsToCfg(cd, nil))
if err != nil {
return err
}
+ var config *allconfig.Config
+ if c.lang != "" {
+ var found bool
+ config, found = conf.configs.LanguageConfigMap[c.lang]
+ if !found {
+ return fmt.Errorf("language %q not found", c.lang)
+ }
+ } else {
+ config = conf.configs.LanguageConfigSlice[0]
+ }
- allModules := cfg.Cfg.Get("allmodules").(modules.Modules)
+ var buf bytes.Buffer
+ dec := json.NewEncoder(&buf)
+ dec.SetIndent("", " ")
+ dec.SetEscapeHTML(false)
- for _, m := range allModules {
- if err := parser.InterfaceToConfig(&modMounts{m: m}, metadecoders.JSON, os.Stdout); err != nil {
+ if err := dec.Encode(parser.ReplacingJSONMarshaller{Value: config, KeysToLower: true, OmitEmpty: !c.printZero}); err != nil {
+ return err
+ }
+
+ format := strings.ToLower(c.format)
+
+ switch format {
+ case "json":
+ os.Stdout.Write(buf.Bytes())
+ default:
+ // Decode the JSON into a map[string]interface{} and then re-encode it in the requested format.
+ var m map[string]any
+ if err := json.Unmarshal(buf.Bytes(), &m); err != nil {
return err
}
- }
- return nil
-}
-
-func (c *configCmd) printConfig(cmd *cobra.Command, args []string) error {
- cfg, err := initializeConfig(true, false, &c.hugoBuilderCommon, c, nil)
- if err != nil {
- return err
- }
-
- allSettings := cfg.Cfg.(*viper.Viper).AllSettings()
-
- // We need to clean up this, but we store objects in the config that
- // isn't really interesting to the end user, so filter these.
- ignoreKeysRe := regexp.MustCompile("client|sorted|filecacheconfigs|allmodules|multilingual")
-
- separator := ": "
-
- if len(cfg.configFiles) > 0 && strings.HasSuffix(cfg.configFiles[0], ".toml") {
- separator = " = "
- }
-
- var keys []string
- for k := range allSettings {
- if ignoreKeysRe.MatchString(k) {
- continue
- }
- keys = append(keys, k)
- }
- sort.Strings(keys)
- for _, k := range keys {
- kv := reflect.ValueOf(allSettings[k])
- if kv.Kind() == reflect.String {
- jww.FEEDBACK.Printf("%s%s\"%+v\"\n", k, separator, allSettings[k])
- } else {
- jww.FEEDBACK.Printf("%s%s%+v\n", k, separator, allSettings[k])
+ maps.ConvertFloat64WithNoDecimalsToInt(m)
+ switch format {
+ case "yaml":
+ return parser.InterfaceToConfig(m, metadecoders.YAML, os.Stdout)
+ case "toml":
+ return parser.InterfaceToConfig(m, metadecoders.TOML, os.Stdout)
+ default:
+ return fmt.Errorf("unsupported format: %q", format)
}
}
return nil
}
-type modMounts struct {
- m modules.Module
+func (c *configCommand) Init(cd *simplecobra.Commandeer) error {
+ c.r = cd.Root.Command.(*rootCommand)
+ cmd := cd.CobraCommand
+ cmd.Short = "Display site configuration"
+ cmd.Long = `Display site configuration, both default and custom settings.`
+ cmd.Flags().StringVar(&c.format, "format", "toml", "preferred file format (toml, yaml or json)")
+ _ = cmd.RegisterFlagCompletionFunc("format", cobra.FixedCompletions([]string{"toml", "yaml", "json"}, cobra.ShellCompDirectiveNoFileComp))
+ cmd.Flags().StringVar(&c.lang, "lang", "", "the language to display config for. Defaults to the first language defined.")
+ cmd.Flags().BoolVar(&c.printZero, "printZero", false, `include config options with zero values (e.g. false, 0, "") in the output`)
+ _ = cmd.RegisterFlagCompletionFunc("lang", cobra.NoFileCompletions)
+ applyLocalFlagsBuildConfig(cmd, c.r)
+
+ return nil
}
-type modMount struct {
+func (c *configCommand) PreRun(cd, runner *simplecobra.Commandeer) error {
+ return nil
+}
+
+type configModMount struct {
Source string `json:"source"`
Target string `json:"target"`
Lang string `json:"lang,omitempty"`
}
-func (m *modMounts) MarshalJSON() ([]byte, error) {
- var mounts []modMount
+type configModMounts struct {
+ verbose bool
+ m modules.Module
+}
+
+// MarshalJSON is for internal use only.
+func (m *configModMounts) MarshalJSON() ([]byte, error) {
+ var mounts []configModMount
for _, mount := range m.m.Mounts() {
- mounts = append(mounts, modMount{
+ mounts = append(mounts, configModMount{
Source: mount.Source,
Target: mount.Target,
Lang: mount.Lang,
})
}
+ var ownerPath string
+ if m.m.Owner() != nil {
+ ownerPath = m.m.Owner().Path()
+ }
+
+ if m.verbose {
+ config := m.m.Config()
+ return json.Marshal(&struct {
+ Path string `json:"path"`
+ Version string `json:"version"`
+ Time time.Time `json:"time"`
+ Owner string `json:"owner"`
+ Dir string `json:"dir"`
+ Meta map[string]any `json:"meta"`
+ HugoVersion modules.HugoVersion `json:"hugoVersion"`
+
+ Mounts []configModMount `json:"mounts"`
+ }{
+ Path: m.m.Path(),
+ Version: m.m.Version(),
+ Time: m.m.Time(),
+ Owner: ownerPath,
+ Dir: m.m.Dir(),
+ Meta: config.Params,
+ HugoVersion: config.HugoVersion,
+ Mounts: mounts,
+ })
+ }
+
return json.Marshal(&struct {
- Path string `json:"path"`
- Dir string `json:"dir"`
- Mounts []modMount `json:"mounts"`
+ Path string `json:"path"`
+ Version string `json:"version"`
+ Time time.Time `json:"time"`
+ Owner string `json:"owner"`
+ Dir string `json:"dir"`
+ Mounts []configModMount `json:"mounts"`
}{
- Path: m.m.Path(),
- Dir: m.m.Dir(),
- Mounts: mounts,
+ Path: m.m.Path(),
+ Version: m.m.Version(),
+ Time: m.m.Time(),
+ Owner: ownerPath,
+ Dir: m.m.Dir(),
+ Mounts: mounts,
})
}
+
+type configMountsCommand struct {
+ r *rootCommand
+ configCmd *configCommand
+}
+
+func (c *configMountsCommand) Commands() []simplecobra.Commander {
+ return nil
+}
+
+func (c *configMountsCommand) Name() string {
+ return "mounts"
+}
+
+func (c *configMountsCommand) Run(ctx context.Context, cd *simplecobra.Commandeer, args []string) error {
+ r := c.configCmd.r
+ conf, err := r.ConfigFromProvider(configKey{counter: c.r.configVersionID.Load()}, flagsToCfg(cd, nil))
+ if err != nil {
+ return err
+ }
+
+ for _, m := range conf.configs.Modules {
+ if err := parser.InterfaceToConfig(&configModMounts{m: m, verbose: r.isVerbose()}, metadecoders.JSON, os.Stdout); err != nil {
+ return err
+ }
+ }
+ return nil
+}
+
+func (c *configMountsCommand) Init(cd *simplecobra.Commandeer) error {
+ c.r = cd.Root.Command.(*rootCommand)
+ cmd := cd.CobraCommand
+ cmd.Short = "Print the configured file mounts"
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ applyLocalFlagsBuildConfig(cmd, c.r)
+ return nil
+}
+
+func (c *configMountsCommand) PreRun(cd, runner *simplecobra.Commandeer) error {
+ c.configCmd = cd.Parent.Command.(*configCommand)
+ return nil
+}
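
The new config command prints TOML and YAML by first encoding the resolved configuration to JSON, decoding that JSON into a generic map, and re-encoding the map in the requested format; the ConvertFloat64WithNoDecimalsToInt step exists because encoding/json decodes every number into a float64. A standard-library-only sketch of that round trip (with a made-up siteConfig struct rather than Hugo's allconfig.Config):

// Stdlib-only sketch of the JSON round trip used by the config command above:
// encode to JSON (no HTML escaping), decode into a generic map, then hand the
// map to a target encoder. Note that encoding/json turns every number into a
// float64, which is why the real code calls maps.ConvertFloat64WithNoDecimalsToInt
// before re-encoding as TOML or YAML.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
)

type siteConfig struct {
	BaseURL       string `json:"baseURL"`
	Paginate      int    `json:"paginate"`
	EnableGitInfo bool   `json:"enableGitInfo"`
}

func main() {
	conf := siteConfig{BaseURL: "https://example.org/?a=b&c=d", Paginate: 10}

	var buf bytes.Buffer
	enc := json.NewEncoder(&buf)
	enc.SetIndent("", "  ")
	enc.SetEscapeHTML(false) // keep & and friends readable, as in the command above
	if err := enc.Encode(conf); err != nil {
		panic(err)
	}

	var m map[string]any
	if err := json.Unmarshal(buf.Bytes(), &m); err != nil {
		panic(err)
	}

	// paginate arrives here as float64(10); a real implementation would
	// normalize it back to an int before re-encoding in the target format.
	fmt.Printf("%T %v\n", m["paginate"], m["paginate"])
}
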
diff --git a/commands/convert.go b/commands/convert.go
index e4ff1ac61..ebf81cfb3 100644
--- a/commands/convert.go
+++ b/commands/convert.go
@@ -1,4 +1,4 @@
-// Copyright 2019 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -15,153 +15,149 @@ package commands
import (
"bytes"
+ "context"
"fmt"
- "io"
+ "path/filepath"
"strings"
"time"
- "github.com/gohugoio/hugo/resources/page"
-
- "github.com/gohugoio/hugo/hugofs"
-
+ "github.com/bep/simplecobra"
+ "github.com/gohugoio/hugo/config"
"github.com/gohugoio/hugo/helpers"
-
+ "github.com/gohugoio/hugo/hugofs"
+ "github.com/gohugoio/hugo/hugolib"
"github.com/gohugoio/hugo/parser"
"github.com/gohugoio/hugo/parser/metadecoders"
"github.com/gohugoio/hugo/parser/pageparser"
-
- "github.com/pkg/errors"
-
- "github.com/gohugoio/hugo/hugolib"
-
- "path/filepath"
-
+ "github.com/gohugoio/hugo/resources/page"
"github.com/spf13/cobra"
)
-var (
- _ cmder = (*convertCmd)(nil)
-)
-
-type convertCmd struct {
- hugoBuilderCommon
+func newConvertCommand() *convertCommand {
+ var c *convertCommand
+ c = &convertCommand{
+ commands: []simplecobra.Commander{
+ &simpleCommand{
+ name: "toJSON",
+ short: "Convert front matter to JSON",
+ long: `toJSON converts all front matter in the content directory
+to use JSON for the front matter.`,
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ return c.convertContents(metadecoders.JSON)
+ },
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ },
+ },
+ &simpleCommand{
+ name: "toTOML",
+ short: "Convert front matter to TOML",
+ long: `toTOML converts all front matter in the content directory
+to use TOML for the front matter.`,
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ return c.convertContents(metadecoders.TOML)
+ },
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ },
+ },
+ &simpleCommand{
+ name: "toYAML",
+ short: "Convert front matter to YAML",
+ long: `toYAML converts all front matter in the content directory
+to use YAML for the front matter.`,
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ return c.convertContents(metadecoders.YAML)
+ },
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ },
+ },
+ },
+ }
+ return c
+}
+type convertCommand struct {
+ // Flags.
outputDir string
unsafe bool
- *baseCmd
+ // Deps.
+ r *rootCommand
+ h *hugolib.HugoSites
+
+ // Commands.
+ commands []simplecobra.Commander
}
-func newConvertCmd() *convertCmd {
- cc := &convertCmd{}
-
- cc.baseCmd = newBaseCmd(&cobra.Command{
- Use: "convert",
- Short: "Convert your content to different formats",
- Long: `Convert your content (e.g. front matter) to different formats.
-
-See convert's subcommands toJSON, toTOML and toYAML for more information.`,
- RunE: nil,
- })
-
- cc.cmd.AddCommand(
- &cobra.Command{
- Use: "toJSON",
- Short: "Convert front matter to JSON",
- Long: `toJSON converts all front matter in the content directory
-to use JSON for the front matter.`,
- RunE: func(cmd *cobra.Command, args []string) error {
- return cc.convertContents(metadecoders.JSON)
- },
- },
- &cobra.Command{
- Use: "toTOML",
- Short: "Convert front matter to TOML",
- Long: `toTOML converts all front matter in the content directory
-to use TOML for the front matter.`,
- RunE: func(cmd *cobra.Command, args []string) error {
- return cc.convertContents(metadecoders.TOML)
- },
- },
- &cobra.Command{
- Use: "toYAML",
- Short: "Convert front matter to YAML",
- Long: `toYAML converts all front matter in the content directory
-to use YAML for the front matter.`,
- RunE: func(cmd *cobra.Command, args []string) error {
- return cc.convertContents(metadecoders.YAML)
- },
- },
- )
-
- cc.cmd.PersistentFlags().StringVarP(&cc.outputDir, "output", "o", "", "filesystem path to write files to")
- cc.cmd.PersistentFlags().StringVarP(&cc.source, "source", "s", "", "filesystem path to read files relative from")
- cc.cmd.PersistentFlags().BoolVar(&cc.unsafe, "unsafe", false, "enable less safe operations, please backup first")
- cc.cmd.PersistentFlags().SetAnnotation("source", cobra.BashCompSubdirsInDir, []string{})
-
- return cc
+func (c *convertCommand) Commands() []simplecobra.Commander {
+ return c.commands
}
-func (cc *convertCmd) convertContents(format metadecoders.Format) error {
- if cc.outputDir == "" && !cc.unsafe {
- return newUserError("Unsafe operation not allowed, use --unsafe or set a different output path")
- }
+func (c *convertCommand) Name() string {
+ return "convert"
+}
- c, err := initializeConfig(true, false, &cc.hugoBuilderCommon, cc, nil)
- if err != nil {
- return err
- }
-
- c.Cfg.Set("buildDrafts", true)
-
- h, err := hugolib.NewHugoSites(*c.DepsCfg)
- if err != nil {
- return err
- }
-
- if err := h.Build(hugolib.BuildCfg{SkipRender: true}); err != nil {
- return err
- }
-
- site := h.Sites[0]
-
- site.Log.FEEDBACK.Println("processing", len(site.AllPages()), "content files")
- for _, p := range site.AllPages() {
- if err := cc.convertAndSavePage(p, site, format); err != nil {
- return err
- }
- }
+func (c *convertCommand) Run(ctx context.Context, cd *simplecobra.Commandeer, args []string) error {
return nil
}
-func (cc *convertCmd) convertAndSavePage(p page.Page, site *hugolib.Site, targetFormat metadecoders.Format) error {
+func (c *convertCommand) Init(cd *simplecobra.Commandeer) error {
+ cmd := cd.CobraCommand
+ cmd.Short = "Convert front matter to another format"
+ cmd.Long = `Convert front matter to another format.
+
+See convert's subcommands toJSON, toTOML and toYAML for more information.`
+
+ cmd.PersistentFlags().StringVarP(&c.outputDir, "output", "o", "", "filesystem path to write files to")
+ _ = cmd.MarkFlagDirname("output")
+ cmd.PersistentFlags().BoolVar(&c.unsafe, "unsafe", false, "enable less safe operations, please backup first")
+
+ cmd.RunE = nil
+ return nil
+}
+
+func (c *convertCommand) PreRun(cd, runner *simplecobra.Commandeer) error {
+ c.r = cd.Root.Command.(*rootCommand)
+ cfg := config.New()
+ cfg.Set("buildDrafts", true)
+ h, err := c.r.Hugo(flagsToCfg(cd, cfg))
+ if err != nil {
+ return err
+ }
+ c.h = h
+ return nil
+}
+
+func (c *convertCommand) convertAndSavePage(p page.Page, site *hugolib.Site, targetFormat metadecoders.Format) error {
// The resources are not in .Site.AllPages.
for _, r := range p.Resources().ByType("page") {
- if err := cc.convertAndSavePage(r.(page.Page), site, targetFormat); err != nil {
+ if err := c.convertAndSavePage(r.(page.Page), site, targetFormat); err != nil {
return err
}
}
- if p.File().IsZero() {
+ if p.File() == nil {
// No content file.
return nil
}
- errMsg := fmt.Errorf("Error processing file %q", p.Path())
+ errMsg := fmt.Errorf("error processing file %q", p.File().Path())
- site.Log.INFO.Println("Attempting to convert", p.File().Filename())
+ site.Log.Infoln("attempting to convert", p.File().Filename())
f := p.File()
file, err := f.FileInfo().Meta().Open()
if err != nil {
- site.Log.ERROR.Println(errMsg)
+ site.Log.Errorln(errMsg)
file.Close()
return nil
}
- pf, err := parseContentFile(file)
+ pf, err := pageparser.ParseFrontMatterAndContent(file)
if err != nil {
- site.Log.ERROR.Println(errMsg)
+ site.Log.Errorln(errMsg)
file.Close()
return err
}
@@ -169,82 +165,65 @@ func (cc *convertCmd) convertAndSavePage(p page.Page, site *hugolib.Site, target
file.Close()
// better handling of dates in formats that don't have support for them
- if pf.frontMatterFormat == metadecoders.JSON || pf.frontMatterFormat == metadecoders.YAML || pf.frontMatterFormat == metadecoders.TOML {
- for k, v := range pf.frontMatter {
+ if pf.FrontMatterFormat == metadecoders.JSON || pf.FrontMatterFormat == metadecoders.YAML || pf.FrontMatterFormat == metadecoders.TOML {
+ for k, v := range pf.FrontMatter {
switch vv := v.(type) {
case time.Time:
- pf.frontMatter[k] = vv.Format(time.RFC3339)
+ pf.FrontMatter[k] = vv.Format(time.RFC3339)
}
}
}
var newContent bytes.Buffer
- err = parser.InterfaceToFrontMatter(pf.frontMatter, targetFormat, &newContent)
+ err = parser.InterfaceToFrontMatter(pf.FrontMatter, targetFormat, &newContent)
if err != nil {
- site.Log.ERROR.Println(errMsg)
+ site.Log.Errorln(errMsg)
return err
}
- newContent.Write(pf.content)
+ newContent.Write(pf.Content)
newFilename := p.File().Filename()
- if cc.outputDir != "" {
- contentDir := strings.TrimSuffix(newFilename, p.Path())
+ if c.outputDir != "" {
+ contentDir := strings.TrimSuffix(newFilename, p.File().Path())
contentDir = filepath.Base(contentDir)
- newFilename = filepath.Join(cc.outputDir, contentDir, p.Path())
+ newFilename = filepath.Join(c.outputDir, contentDir, p.File().Path())
}
fs := hugofs.Os
if err := helpers.WriteToDisk(newFilename, &newContent, fs); err != nil {
- return errors.Wrapf(err, "Failed to save file %q:", newFilename)
+ return fmt.Errorf("failed to save file %q: %w", newFilename, err)
}
return nil
}
-type parsedFile struct {
- frontMatterFormat metadecoders.Format
- frontMatterSource []byte
- frontMatter map[string]interface{}
-
- // Everything after Front Matter
- content []byte
-}
-
-func parseContentFile(r io.Reader) (parsedFile, error) {
- var pf parsedFile
-
- psr, err := pageparser.Parse(r, pageparser.Config{})
- if err != nil {
- return pf, err
+func (c *convertCommand) convertContents(format metadecoders.Format) error {
+ if c.outputDir == "" && !c.unsafe {
+ return newUserError("Unsafe operation not allowed, use --unsafe or set a different output path")
}
- iter := psr.Iterator()
+ if err := c.h.Build(hugolib.BuildCfg{SkipRender: true}); err != nil {
+ return err
+ }
- walkFn := func(item pageparser.Item) bool {
- if pf.frontMatterSource != nil {
- // The rest is content.
- pf.content = psr.Input()[item.Pos:]
- // Done
- return false
- } else if item.IsFrontMatter() {
- pf.frontMatterFormat = metadecoders.FormatFromFrontMatterType(item.Type)
- pf.frontMatterSource = item.Val
+ site := c.h.Sites[0]
+
+ var pagesBackedByFile page.Pages
+ for _, p := range site.AllPages() {
+ if p.File() == nil {
+ continue
}
- return true
-
+ pagesBackedByFile = append(pagesBackedByFile, p)
}
- iter.PeekWalk(walkFn)
-
- metadata, err := metadecoders.Default.UnmarshalToMap(pf.frontMatterSource, pf.frontMatterFormat)
- if err != nil {
- return pf, err
+ site.Log.Println("processing", len(pagesBackedByFile), "content files")
+ for _, p := range site.AllPages() {
+ if err := c.convertAndSavePage(p, site, format); err != nil {
+ return err
+ }
}
- pf.frontMatter = metadata
-
- return pf, nil
-
+ return nil
}
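
The rewritten convert command drops the local parseContentFile helper in favour of pageparser.ParseFrontMatterAndContent. A short sketch of that round trip, converting YAML front matter to TOML for an in-memory document; the signatures are assumed to match their use in convertAndSavePage above.

// Sketch of the front matter round trip used by convertAndSavePage above:
// parse a content file into front matter plus body, then re-emit the front
// matter in another format and append the untouched body.
package main

import (
	"bytes"
	"fmt"
	"strings"

	"github.com/gohugoio/hugo/parser"
	"github.com/gohugoio/hugo/parser/metadecoders"
	"github.com/gohugoio/hugo/parser/pageparser"
)

const src = `---
title: "P1"
weight: 1
---

Content
`

func main() {
	pf, err := pageparser.ParseFrontMatterAndContent(strings.NewReader(src))
	if err != nil {
		panic(err)
	}

	var out bytes.Buffer
	// Re-encode the parsed front matter as TOML, then write the body back out.
	if err := parser.InterfaceToFrontMatter(pf.FrontMatter, metadecoders.TOML, &out); err != nil {
		panic(err)
	}
	out.Write(pf.Content)

	fmt.Println(out.String())
}
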
diff --git a/commands/deploy.go b/commands/deploy.go
index d4b04ab78..3e9d3df20 100644
--- a/commands/deploy.go
+++ b/commands/deploy.go
@@ -1,4 +1,4 @@
-// Copyright 2019 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -11,67 +11,41 @@
// See the License for the specific language governing permissions and
// limitations under the License.
+//go:build withdeploy
+
package commands
import (
"context"
"github.com/gohugoio/hugo/deploy"
+
+ "github.com/bep/simplecobra"
"github.com/spf13/cobra"
)
-var _ cmder = (*deployCmd)(nil)
-
-// deployCmd supports deploying sites to Cloud providers.
-type deployCmd struct {
- hugoBuilderCommon
- *baseCmd
-}
-
-// TODO: In addition to the "deploy" command, consider adding a "--deploy"
-// flag for the default command; this would build the site and then deploy it.
-// It's not obvious how to do this; would all of the deploy-specific flags
-// have to exist at the top level as well?
-
-// TODO: The output files change every time "hugo" is executed, it looks
-// like because of map order randomization. This means that you can
-// run "hugo && hugo deploy" again and again and upload new stuff every time. Is
-// this intended?
-
-func newDeployCmd() *deployCmd {
- cc := &deployCmd{}
-
- cc.baseCmd = newBaseCmd(&cobra.Command{
- Use: "deploy",
- Short: "Deploy your site to a Cloud provider.",
- Long: `Deploy your site to a Cloud provider.
+func newDeployCommand() simplecobra.Commander {
+ return &simpleCommand{
+ name: "deploy",
+ short: "Deploy your site to a cloud provider",
+ long: `Deploy your site to a cloud provider
See https://gohugo.io/hosting-and-deployment/hugo-deploy/ for detailed
documentation.
`,
-
- RunE: func(cmd *cobra.Command, args []string) error {
- cfgInit := func(c *commandeer) error {
- return nil
- }
- comm, err := initializeConfig(true, false, &cc.hugoBuilderCommon, cc, cfgInit)
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ h, err := r.Hugo(flagsToCfgWithAdditionalConfigBase(cd, nil, "deployment"))
if err != nil {
return err
}
- deployer, err := deploy.New(comm.Cfg, comm.hugo().PathSpec.PublishFs)
+ deployer, err := deploy.New(h.Configs.GetFirstLanguageConfig(), h.Log, h.PathSpec.PublishFs)
if err != nil {
return err
}
- return deployer.Deploy(context.Background())
+ return deployer.Deploy(ctx)
},
- })
-
- cc.cmd.Flags().String("target", "", "target deployment from deployments section in config file; defaults to the first one")
- cc.cmd.Flags().Bool("confirm", false, "ask for confirmation before making changes to the target")
- cc.cmd.Flags().Bool("dryRun", false, "dry run")
- cc.cmd.Flags().Bool("force", false, "force upload of all files")
- cc.cmd.Flags().Bool("invalidateCDN", true, "invalidate the CDN cache via the cloudFrontDistributionID listed in the deployment target")
- cc.cmd.Flags().Int("maxDeletes", 256, "maximum # of files to delete, or -1 to disable")
-
- return cc
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ applyDeployFlags(cmd, r)
+ },
+ }
}
diff --git a/commands/deploy_flags.go b/commands/deploy_flags.go
new file mode 100644
index 000000000..d4326547a
--- /dev/null
+++ b/commands/deploy_flags.go
@@ -0,0 +1,33 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package commands
+
+import (
+ "github.com/gohugoio/hugo/deploy/deployconfig"
+ "github.com/spf13/cobra"
+)
+
+func applyDeployFlags(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ cmd.Flags().String("target", "", "target deployment from deployments section in config file; defaults to the first one")
+ _ = cmd.RegisterFlagCompletionFunc("target", cobra.NoFileCompletions)
+ cmd.Flags().Bool("confirm", false, "ask for confirmation before making changes to the target")
+ cmd.Flags().Bool("dryRun", false, "dry run")
+ cmd.Flags().Bool("force", false, "force upload of all files")
+ cmd.Flags().Bool("invalidateCDN", deployconfig.DefaultConfig.InvalidateCDN, "invalidate the CDN cache listed in the deployment target")
+ cmd.Flags().Int("maxDeletes", deployconfig.DefaultConfig.MaxDeletes, "maximum # of files to delete, or -1 to disable")
+ _ = cmd.RegisterFlagCompletionFunc("maxDeletes", cobra.NoFileCompletions)
+ cmd.Flags().Int("workers", deployconfig.DefaultConfig.Workers, "number of workers to transfer files. defaults to 10")
+ _ = cmd.RegisterFlagCompletionFunc("workers", cobra.NoFileCompletions)
+}
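
applyDeployFlags also shows the shell-completion wiring used throughout this refactor: every flag that should not complete file names gets an explicit completion function. A small, self-contained cobra sketch of the same pattern, with a hypothetical demo command and target values:

// Minimal sketch of the flag plus completion registration pattern used in
// applyDeployFlags above. The "demo" command and the target values are hypothetical.
package main

import (
	"fmt"

	"github.com/spf13/cobra"
)

func main() {
	cmd := &cobra.Command{
		Use: "demo",
		RunE: func(cmd *cobra.Command, args []string) error {
			target, _ := cmd.Flags().GetString("target")
			fmt.Println("target:", target)
			return nil
		},
	}
	// Positional args take no file completion, mirroring cobra.NoFileCompletions above.
	cmd.ValidArgsFunction = cobra.NoFileCompletions

	cmd.Flags().String("target", "", "deployment target")
	// Offer a fixed set of completions for --target; deploy uses NoFileCompletions,
	// while the config command above uses FixedCompletions for --format.
	_ = cmd.RegisterFlagCompletionFunc("target",
		cobra.FixedCompletions([]string{"staging", "production"}, cobra.ShellCompDirectiveNoFileComp))

	_ = cmd.Execute()
}
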
diff --git a/commands/deploy_off.go b/commands/deploy_off.go
new file mode 100644
index 000000000..8f5eaa2de
--- /dev/null
+++ b/commands/deploy_off.go
@@ -0,0 +1,50 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+//go:build !withdeploy
+
+package commands
+
+import (
+ "context"
+ "errors"
+
+ "github.com/bep/simplecobra"
+ "github.com/spf13/cobra"
+)
+
+func newDeployCommand() simplecobra.Commander {
+ return &simpleCommand{
+ name: "deploy",
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ return errors.New("deploy not supported in this version of Hugo; install a release with 'withdeploy' in the archive filename or build yourself with the 'withdeploy' build tag. Also see https://github.com/gohugoio/hugo/pull/12995")
+ },
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ applyDeployFlags(cmd, r)
+ cmd.Hidden = true
+ },
+ }
+}
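
deploy.go and deploy_off.go above demonstrate the complementary build-constraint pattern that makes hugo deploy optional: two files declare the same constructor under opposite tags, and only one is compiled into a given binary (the withdeploy variant when building with -tags withdeploy). A hypothetical, minimal version of the same pattern using a made-up "extras" tag:

// File extras_on.go, compiled only with: go build -tags extras
//go:build extras

package main

func greet() string { return "extras build" }

// File extras_off.go, compiled by default
//go:build !extras

package main

func greet() string { return "default build" }

// File main.go, shared by both builds
package main

import "fmt"

func main() { fmt.Println(greet()) }
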
diff --git a/commands/env.go b/commands/env.go
index 76c16b93b..753522560 100644
--- a/commands/env.go
+++ b/commands/env.go
@@ -1,4 +1,4 @@
-// Copyright 2016 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -14,31 +14,57 @@
package commands
import (
+ "context"
"runtime"
+ "github.com/bep/simplecobra"
+ "github.com/gohugoio/hugo/common/hugo"
"github.com/spf13/cobra"
- jww "github.com/spf13/jwalterweatherman"
)
-var _ cmder = (*envCmd)(nil)
-
-type envCmd struct {
- *baseCmd
-}
-
-func newEnvCmd() *envCmd {
- return &envCmd{baseCmd: newBaseCmd(&cobra.Command{
- Use: "env",
- Short: "Print Hugo version and environment info",
- Long: `Print Hugo version and environment info. This is useful in Hugo bug reports.`,
- RunE: func(cmd *cobra.Command, args []string) error {
- printHugoVersion()
- jww.FEEDBACK.Printf("GOOS=%q\n", runtime.GOOS)
- jww.FEEDBACK.Printf("GOARCH=%q\n", runtime.GOARCH)
- jww.FEEDBACK.Printf("GOVERSION=%q\n", runtime.Version())
+func newEnvCommand() simplecobra.Commander {
+ return &simpleCommand{
+ name: "env",
+ short: "Display version and environment info",
+ long: "Display version and environment info. This is useful in Hugo bug reports",
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ r.Printf("%s\n", hugo.BuildVersionString())
+ r.Printf("GOOS=%q\n", runtime.GOOS)
+ r.Printf("GOARCH=%q\n", runtime.GOARCH)
+ r.Printf("GOVERSION=%q\n", runtime.Version())
+ if r.isVerbose() {
+ deps := hugo.GetDependencyList()
+ for _, dep := range deps {
+ r.Printf("%s\n", dep)
+ }
+ } else {
+ // These are also included in the GetDependencyList above;
+ // print them regardless, as they are usually the most useful to know about.
+ deps := hugo.GetDependencyListNonGo()
+ for _, dep := range deps {
+ r.Printf("%s\n", dep)
+ }
+ }
return nil
},
- }),
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ },
+ }
+}
+
+func newVersionCmd() simplecobra.Commander {
+ return &simpleCommand{
+ name: "version",
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ r.Println(hugo.BuildVersionString())
+ return nil
+ },
+ short: "Display version",
+ long: "Display version and environment info. This is useful in Hugo bug reports.",
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ },
}
}
diff --git a/commands/gen.go b/commands/gen.go
index 6878cfe70..1c5361840 100644
--- a/commands/gen.go
+++ b/commands/gen.go
@@ -1,4 +1,4 @@
-// Copyright 2015 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -14,28 +14,290 @@
package commands
import (
+ "bytes"
+ "context"
+ "encoding/json"
+ "fmt"
+ "os"
+ "path"
+ "path/filepath"
+ "slices"
+ "strings"
+
+ "github.com/alecthomas/chroma/v2"
+ "github.com/alecthomas/chroma/v2/formatters/html"
+ "github.com/alecthomas/chroma/v2/styles"
+ "github.com/bep/simplecobra"
+ "github.com/gohugoio/hugo/common/hugo"
+ "github.com/gohugoio/hugo/docshelper"
+ "github.com/gohugoio/hugo/helpers"
+ "github.com/gohugoio/hugo/hugofs"
+ "github.com/gohugoio/hugo/hugolib"
+ "github.com/gohugoio/hugo/parser"
"github.com/spf13/cobra"
+ "github.com/spf13/cobra/doc"
+ "gopkg.in/yaml.v2"
)
-var _ cmder = (*genCmd)(nil)
+func newGenCommand() *genCommand {
+ var (
+ // Flags.
+ gendocdir string
+ genmandir string
-type genCmd struct {
- *baseCmd
+ // Chroma flags.
+ style string
+ highlightStyle string
+ lineNumbersInlineStyle string
+ lineNumbersTableStyle string
+ omitEmpty bool
+ )
+
+ newChromaStyles := func() simplecobra.Commander {
+ return &simpleCommand{
+ name: "chromastyles",
+ short: "Generate CSS stylesheet for the Chroma code highlighter",
+ long: `Generate CSS stylesheet for the Chroma code highlighter for a given style. This stylesheet is needed if markup.highlight.noClasses is disabled in config.
+
+See https://xyproto.github.io/splash/docs/all.html for a preview of the available styles`,
+
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ style = strings.ToLower(style)
+ if !slices.Contains(styles.Names(), style) {
+ return fmt.Errorf("invalid style: %s", style)
+ }
+ builder := styles.Get(style).Builder()
+ if highlightStyle != "" {
+ builder.Add(chroma.LineHighlight, highlightStyle)
+ }
+ if lineNumbersInlineStyle != "" {
+ builder.Add(chroma.LineNumbers, lineNumbersInlineStyle)
+ }
+ if lineNumbersTableStyle != "" {
+ builder.Add(chroma.LineNumbersTable, lineNumbersTableStyle)
+ }
+ style, err := builder.Build()
+ if err != nil {
+ return err
+ }
+
+ var formatter *html.Formatter
+ if omitEmpty {
+ formatter = html.New(html.WithClasses(true))
+ } else {
+ formatter = html.New(html.WithAllClasses(true))
+ }
+
+ w := os.Stdout
+ fmt.Fprintf(w, "/* Generated using: hugo %s */\n\n", strings.Join(os.Args[1:], " "))
+ formatter.WriteCSS(w, style)
+ return nil
+ },
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ cmd.PersistentFlags().StringVar(&style, "style", "friendly", "highlighter style (see https://xyproto.github.io/splash/docs/)")
+ _ = cmd.RegisterFlagCompletionFunc("style", cobra.NoFileCompletions)
+ cmd.PersistentFlags().StringVar(&highlightStyle, "highlightStyle", "", `foreground and background colors for highlighted lines, e.g. --highlightStyle "#fff000 bg:#000fff"`)
+ _ = cmd.RegisterFlagCompletionFunc("highlightStyle", cobra.NoFileCompletions)
+ cmd.PersistentFlags().StringVar(&lineNumbersInlineStyle, "lineNumbersInlineStyle", "", `foreground and background colors for inline line numbers, e.g. --lineNumbersInlineStyle "#fff000 bg:#000fff"`)
+ _ = cmd.RegisterFlagCompletionFunc("lineNumbersInlineStyle", cobra.NoFileCompletions)
+ cmd.PersistentFlags().StringVar(&lineNumbersTableStyle, "lineNumbersTableStyle", "", `foreground and background colors for table line numbers, e.g. --lineNumbersTableStyle "#fff000 bg:#000fff"`)
+ _ = cmd.RegisterFlagCompletionFunc("lineNumbersTableStyle", cobra.NoFileCompletions)
+ cmd.PersistentFlags().BoolVar(&omitEmpty, "omitEmpty", false, `omit empty CSS rules`)
+ _ = cmd.RegisterFlagCompletionFunc("omitEmpty", cobra.NoFileCompletions)
+ },
+ }
+ }
+
+ newMan := func() simplecobra.Commander {
+ return &simpleCommand{
+ name: "man",
+ short: "Generate man pages for the Hugo CLI",
+ long: `This command automatically generates up-to-date man pages of Hugo's
+ command-line interface. By default, it creates the man page files
+ in the "man" directory under the current directory.`,
+
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ header := &doc.GenManHeader{
+ Section: "1",
+ Manual: "Hugo Manual",
+ Source: fmt.Sprintf("Hugo %s", hugo.CurrentVersion),
+ }
+ if !strings.HasSuffix(genmandir, helpers.FilePathSeparator) {
+ genmandir += helpers.FilePathSeparator
+ }
+ if found, _ := helpers.Exists(genmandir, hugofs.Os); !found {
+ r.Println("Directory", genmandir, "does not exist, creating...")
+ if err := hugofs.Os.MkdirAll(genmandir, 0o777); err != nil {
+ return err
+ }
+ }
+ cd.CobraCommand.Root().DisableAutoGenTag = true
+
+ r.Println("Generating Hugo man pages in", genmandir, "...")
+ doc.GenManTree(cd.CobraCommand.Root(), header, genmandir)
+
+ r.Println("Done.")
+
+ return nil
+ },
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ cmd.PersistentFlags().StringVar(&genmandir, "dir", "man/", "the directory to write the man pages.")
+ _ = cmd.MarkFlagDirname("dir")
+ },
+ }
+ }
+
+ newGen := func() simplecobra.Commander {
+ const gendocFrontmatterTemplate = `---
+title: "%s"
+slug: %s
+url: %s
+---
+`
+
+ return &simpleCommand{
+ name: "doc",
+ short: "Generate Markdown documentation for the Hugo CLI",
+ long: `Generate Markdown documentation for the Hugo CLI.
+ This command is, mostly, used to create up-to-date documentation
+ of Hugo's command-line interface for https://gohugo.io/.
+
+ It creates one Markdown file per command with front matter suitable
+ for rendering in Hugo.`,
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ cd.CobraCommand.VisitParents(func(c *cobra.Command) {
+ // Disable the "Auto generated by spf13/cobra on DATE"
+ // as it creates a lot of diffs.
+ c.DisableAutoGenTag = true
+ })
+ if !strings.HasSuffix(gendocdir, helpers.FilePathSeparator) {
+ gendocdir += helpers.FilePathSeparator
+ }
+ if found, _ := helpers.Exists(gendocdir, hugofs.Os); !found {
+ r.Println("Directory", gendocdir, "does not exist, creating...")
+ if err := hugofs.Os.MkdirAll(gendocdir, 0o777); err != nil {
+ return err
+ }
+ }
+ prepender := func(filename string) string {
+ name := filepath.Base(filename)
+ base := strings.TrimSuffix(name, path.Ext(name))
+ url := "/docs/reference/commands/" + strings.ToLower(base) + "/"
+ return fmt.Sprintf(gendocFrontmatterTemplate, strings.Replace(base, "_", " ", -1), base, url)
+ }
+
+ linkHandler := func(name string) string {
+ base := strings.TrimSuffix(name, path.Ext(name))
+ return "/docs/reference/commands/" + strings.ToLower(base) + "/"
+ }
+ r.Println("Generating Hugo command-line documentation in", gendocdir, "...")
+ doc.GenMarkdownTreeCustom(cd.CobraCommand.Root(), gendocdir, prepender, linkHandler)
+ r.Println("Done.")
+
+ return nil
+ },
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ cmd.PersistentFlags().StringVar(&gendocdir, "dir", "/tmp/hugodoc/", "the directory to write the doc.")
+ _ = cmd.MarkFlagDirname("dir")
+ },
+ }
+ }
+
+ var docsHelperTarget string
+
+ newDocsHelper := func() simplecobra.Commander {
+ return &simpleCommand{
+ name: "docshelper",
+ short: "Generate some data files for the Hugo docs",
+
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ r.Println("Generate docs data to", docsHelperTarget)
+
+ var buf bytes.Buffer
+ jsonEnc := json.NewEncoder(&buf)
+
+ configProvider := func() docshelper.DocProvider {
+ conf := hugolib.DefaultConfig()
+ conf.CacheDir = "" // The default value does not make sense in the docs.
+ defaultConfig := parser.NullBoolJSONMarshaller{Wrapped: parser.LowerCaseCamelJSONMarshaller{Value: conf}}
+ return docshelper.DocProvider{"config": defaultConfig}
+ }
+
+ docshelper.AddDocProviderFunc(configProvider)
+ if err := jsonEnc.Encode(docshelper.GetDocProvider()); err != nil {
+ return err
+ }
+
+ // Decode the JSON into a map[string]interface{} and then re-encode it as YAML.
+ var m map[string]any
+ if err := json.Unmarshal(buf.Bytes(), &m); err != nil {
+ return err
+ }
+
+ targetFile := filepath.Join(docsHelperTarget, "docs.yaml")
+
+ f, err := os.Create(targetFile)
+ if err != nil {
+ return err
+ }
+ defer f.Close()
+ yamlEnc := yaml.NewEncoder(f)
+ if err := yamlEnc.Encode(m); err != nil {
+ return err
+ }
+
+ r.Println("Done!")
+ return nil
+ },
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.Hidden = true
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ cmd.PersistentFlags().StringVarP(&docsHelperTarget, "dir", "", "docs/data", "data dir")
+ },
+ }
+ }
+
+ return &genCommand{
+ commands: []simplecobra.Commander{
+ newChromaStyles(),
+ newGen(),
+ newMan(),
+ newDocsHelper(),
+ },
+ }
}
-func newGenCmd() *genCmd {
- cc := &genCmd{}
- cc.baseCmd = newBaseCmd(&cobra.Command{
- Use: "gen",
- Short: "A collection of several useful generators.",
- })
+type genCommand struct {
+ rootCmd *rootCommand
- cc.cmd.AddCommand(
- newGenautocompleteCmd().getCommand(),
- newGenDocCmd().getCommand(),
- newGenManCmd().getCommand(),
- createGenDocsHelper().getCommand(),
- createGenChromaStyles().getCommand())
-
- return cc
+ commands []simplecobra.Commander
+}
+
+func (c *genCommand) Commands() []simplecobra.Commander {
+ return c.commands
+}
+
+func (c *genCommand) Name() string {
+ return "gen"
+}
+
+func (c *genCommand) Run(ctx context.Context, cd *simplecobra.Commandeer, args []string) error {
+ return nil
+}
+
+func (c *genCommand) Init(cd *simplecobra.Commandeer) error {
+ cmd := cd.CobraCommand
+ cmd.Short = "Generate documentation and syntax highlighting styles"
+ cmd.Long = "Generate man pages and Markdown documentation for the Hugo CLI, and CSS stylesheets for the Chroma code highlighter."
+
+ cmd.RunE = nil
+ return nil
+}
+
+func (c *genCommand) PreRun(cd, runner *simplecobra.Commandeer) error {
+ c.rootCmd = cd.Root.Command.(*rootCommand)
+ return nil
}
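
The chromastyles generator now lives inline in gen.go and uses Chroma v2 directly. A self-contained sketch of the same CSS generation path, using only calls that appear in the run function above (style lookup, builder overlay, class-based HTML formatter); the chosen style name and highlight colors are illustrative.

// Self-contained sketch of the Chroma CSS generation done by "gen chromastyles"
// above: fetch a base style, overlay line-highlight colors, and emit a
// class-based stylesheet.
package main

import (
	"os"

	"github.com/alecthomas/chroma/v2"
	"github.com/alecthomas/chroma/v2/formatters/html"
	"github.com/alecthomas/chroma/v2/styles"
)

func main() {
	builder := styles.Get("friendly").Builder()
	// Optional overlay, equivalent to --highlightStyle "#fff000 bg:#000fff".
	builder.Add(chroma.LineHighlight, "#fff000 bg:#000fff")

	style, err := builder.Build()
	if err != nil {
		panic(err)
	}

	// WithAllClasses(true) matches the default (non --omitEmpty) path above.
	formatter := html.New(html.WithAllClasses(true))
	if err := formatter.WriteCSS(os.Stdout, style); err != nil {
		panic(err)
	}
}
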
diff --git a/commands/genautocomplete.go b/commands/genautocomplete.go
deleted file mode 100644
index b0b98abb4..000000000
--- a/commands/genautocomplete.go
+++ /dev/null
@@ -1,80 +0,0 @@
-// Copyright 2015 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package commands
-
-import (
- "github.com/spf13/cobra"
- jww "github.com/spf13/jwalterweatherman"
-)
-
-var _ cmder = (*genautocompleteCmd)(nil)
-
-type genautocompleteCmd struct {
- autocompleteTarget string
-
- // bash for now (zsh and others will come)
- autocompleteType string
-
- *baseCmd
-}
-
-func newGenautocompleteCmd() *genautocompleteCmd {
- cc := &genautocompleteCmd{}
-
- cc.baseCmd = newBaseCmd(&cobra.Command{
- Use: "autocomplete",
- Short: "Generate shell autocompletion script for Hugo",
- Long: `Generates a shell autocompletion script for Hugo.
-
-NOTE: The current version supports Bash only.
- This should work for *nix systems with Bash installed.
-
-By default, the file is written directly to /etc/bash_completion.d
-for convenience, and the command may need superuser rights, e.g.:
-
- $ sudo hugo gen autocomplete
-
-Add ` + "`--completionfile=/path/to/file`" + ` flag to set alternative
-file-path and name.
-
-Logout and in again to reload the completion scripts,
-or just source them in directly:
-
- $ . /etc/bash_completion`,
-
- RunE: func(cmd *cobra.Command, args []string) error {
- if cc.autocompleteType != "bash" {
- return newUserError("Only Bash is supported for now")
- }
-
- err := cmd.Root().GenBashCompletionFile(cc.autocompleteTarget)
-
- if err != nil {
- return err
- }
-
- jww.FEEDBACK.Println("Bash completion file for Hugo saved to", cc.autocompleteTarget)
-
- return nil
- },
- })
-
- cc.cmd.PersistentFlags().StringVarP(&cc.autocompleteTarget, "completionfile", "", "/etc/bash_completion.d/hugo.sh", "autocompletion file")
- cc.cmd.PersistentFlags().StringVarP(&cc.autocompleteType, "type", "", "bash", "autocompletion type (currently only bash supported)")
-
- // For bash-completion
- cc.cmd.PersistentFlags().SetAnnotation("completionfile", cobra.BashCompFilenameExt, []string{})
-
- return cc
-}
diff --git a/commands/genchromastyles.go b/commands/genchromastyles.go
deleted file mode 100644
index a2231e56e..000000000
--- a/commands/genchromastyles.go
+++ /dev/null
@@ -1,74 +0,0 @@
-// Copyright 2017-present The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package commands
-
-import (
- "os"
-
- "github.com/alecthomas/chroma"
- "github.com/alecthomas/chroma/formatters/html"
- "github.com/alecthomas/chroma/styles"
- "github.com/spf13/cobra"
-)
-
-var (
- _ cmder = (*genChromaStyles)(nil)
-)
-
-type genChromaStyles struct {
- style string
- highlightStyle string
- linesStyle string
- *baseCmd
-}
-
-// TODO(bep) highlight
-func createGenChromaStyles() *genChromaStyles {
- g := &genChromaStyles{
- baseCmd: newBaseCmd(&cobra.Command{
- Use: "chromastyles",
- Short: "Generate CSS stylesheet for the Chroma code highlighter",
- Long: `Generate CSS stylesheet for the Chroma code highlighter for a given style. This stylesheet is needed if pygmentsUseClasses is enabled in config.
-
-See https://help.farbox.com/pygments.html for preview of available styles`,
- }),
- }
-
- g.cmd.RunE = func(cmd *cobra.Command, args []string) error {
- return g.generate()
- }
-
- g.cmd.PersistentFlags().StringVar(&g.style, "style", "friendly", "highlighter style (see https://help.farbox.com/pygments.html)")
- g.cmd.PersistentFlags().StringVar(&g.highlightStyle, "highlightStyle", "bg:#ffffcc", "style used for highlighting lines (see https://github.com/alecthomas/chroma)")
- g.cmd.PersistentFlags().StringVar(&g.linesStyle, "linesStyle", "", "style used for line numbers (see https://github.com/alecthomas/chroma)")
-
- return g
-}
-
-func (g *genChromaStyles) generate() error {
- builder := styles.Get(g.style).Builder()
- if g.highlightStyle != "" {
- builder.Add(chroma.LineHighlight, g.highlightStyle)
- }
- if g.linesStyle != "" {
- builder.Add(chroma.LineNumbers, g.linesStyle)
- }
- style, err := builder.Build()
- if err != nil {
- return err
- }
- formatter := html.New(html.WithClasses())
- formatter.WriteCSS(os.Stdout, style)
- return nil
-}
diff --git a/commands/gendoc.go b/commands/gendoc.go
deleted file mode 100644
index 8312191f2..000000000
--- a/commands/gendoc.go
+++ /dev/null
@@ -1,96 +0,0 @@
-// Copyright 2016 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package commands
-
-import (
- "fmt"
- "path"
- "path/filepath"
- "strings"
- "time"
-
- "github.com/gohugoio/hugo/helpers"
- "github.com/gohugoio/hugo/hugofs"
- "github.com/spf13/cobra"
- "github.com/spf13/cobra/doc"
- jww "github.com/spf13/jwalterweatherman"
-)
-
-var _ cmder = (*genDocCmd)(nil)
-
-type genDocCmd struct {
- gendocdir string
- *baseCmd
-}
-
-func newGenDocCmd() *genDocCmd {
- const gendocFrontmatterTemplate = `---
-date: %s
-title: "%s"
-slug: %s
-url: %s
----
-`
-
- cc := &genDocCmd{}
-
- cc.baseCmd = newBaseCmd(&cobra.Command{
- Use: "doc",
- Short: "Generate Markdown documentation for the Hugo CLI.",
- Long: `Generate Markdown documentation for the Hugo CLI.
-
-This command is, mostly, used to create up-to-date documentation
-of Hugo's command-line interface for http://gohugo.io/.
-
-It creates one Markdown file per command with front matter suitable
-for rendering in Hugo.`,
-
- RunE: func(cmd *cobra.Command, args []string) error {
- if !strings.HasSuffix(cc.gendocdir, helpers.FilePathSeparator) {
- cc.gendocdir += helpers.FilePathSeparator
- }
- if found, _ := helpers.Exists(cc.gendocdir, hugofs.Os); !found {
- jww.FEEDBACK.Println("Directory", cc.gendocdir, "does not exist, creating...")
- if err := hugofs.Os.MkdirAll(cc.gendocdir, 0777); err != nil {
- return err
- }
- }
- now := time.Now().Format("2006-01-02")
- prepender := func(filename string) string {
- name := filepath.Base(filename)
- base := strings.TrimSuffix(name, path.Ext(name))
- url := "/commands/" + strings.ToLower(base) + "/"
- return fmt.Sprintf(gendocFrontmatterTemplate, now, strings.Replace(base, "_", " ", -1), base, url)
- }
-
- linkHandler := func(name string) string {
- base := strings.TrimSuffix(name, path.Ext(name))
- return "/commands/" + strings.ToLower(base) + "/"
- }
-
- jww.FEEDBACK.Println("Generating Hugo command-line documentation in", cc.gendocdir, "...")
- doc.GenMarkdownTreeCustom(cmd.Root(), cc.gendocdir, prepender, linkHandler)
- jww.FEEDBACK.Println("Done.")
-
- return nil
- },
- })
-
- cc.cmd.PersistentFlags().StringVar(&cc.gendocdir, "dir", "/tmp/hugodoc/", "the directory to write the doc.")
-
- // For bash-completion
- cc.cmd.PersistentFlags().SetAnnotation("dir", cobra.BashCompSubdirsInDir, []string{})
-
- return cc
-}
diff --git a/commands/gendocshelper.go b/commands/gendocshelper.go
deleted file mode 100644
index c243581f6..000000000
--- a/commands/gendocshelper.go
+++ /dev/null
@@ -1,74 +0,0 @@
-// Copyright 2017-present The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package commands
-
-import (
- "encoding/json"
- "fmt"
- "os"
- "path/filepath"
-
- "github.com/gohugoio/hugo/docshelper"
- "github.com/spf13/cobra"
-)
-
-var (
- _ cmder = (*genDocsHelper)(nil)
-)
-
-type genDocsHelper struct {
- target string
- *baseCmd
-}
-
-func createGenDocsHelper() *genDocsHelper {
- g := &genDocsHelper{
- baseCmd: newBaseCmd(&cobra.Command{
- Use: "docshelper",
- Short: "Generate some data files for the Hugo docs.",
- Hidden: true,
- }),
- }
-
- g.cmd.RunE = func(cmd *cobra.Command, args []string) error {
- return g.generate()
- }
-
- g.cmd.PersistentFlags().StringVarP(&g.target, "dir", "", "docs/data", "data dir")
-
- return g
-}
-
-func (g *genDocsHelper) generate() error {
- fmt.Println("Generate docs data to", g.target)
-
- targetFile := filepath.Join(g.target, "docs.json")
-
- f, err := os.Create(targetFile)
- if err != nil {
- return err
- }
- defer f.Close()
-
- enc := json.NewEncoder(f)
- enc.SetIndent("", " ")
-
- if err := enc.Encode(docshelper.DocProviders); err != nil {
- return err
- }
-
- fmt.Println("Done!")
- return nil
-
-}
diff --git a/commands/genman.go b/commands/genman.go
deleted file mode 100644
index 720046289..000000000
--- a/commands/genman.go
+++ /dev/null
@@ -1,77 +0,0 @@
-// Copyright 2016 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package commands
-
-import (
- "fmt"
- "strings"
-
- "github.com/gohugoio/hugo/common/hugo"
- "github.com/gohugoio/hugo/helpers"
- "github.com/gohugoio/hugo/hugofs"
- "github.com/spf13/cobra"
- "github.com/spf13/cobra/doc"
- jww "github.com/spf13/jwalterweatherman"
-)
-
-var _ cmder = (*genManCmd)(nil)
-
-type genManCmd struct {
- genmandir string
- *baseCmd
-}
-
-func newGenManCmd() *genManCmd {
- cc := &genManCmd{}
-
- cc.baseCmd = newBaseCmd(&cobra.Command{
- Use: "man",
- Short: "Generate man pages for the Hugo CLI",
- Long: `This command automatically generates up-to-date man pages of Hugo's
-command-line interface. By default, it creates the man page files
-in the "man" directory under the current directory.`,
-
- RunE: func(cmd *cobra.Command, args []string) error {
- header := &doc.GenManHeader{
- Section: "1",
- Manual: "Hugo Manual",
- Source: fmt.Sprintf("Hugo %s", hugo.CurrentVersion),
- }
- if !strings.HasSuffix(cc.genmandir, helpers.FilePathSeparator) {
- cc.genmandir += helpers.FilePathSeparator
- }
- if found, _ := helpers.Exists(cc.genmandir, hugofs.Os); !found {
- jww.FEEDBACK.Println("Directory", cc.genmandir, "does not exist, creating...")
- if err := hugofs.Os.MkdirAll(cc.genmandir, 0777); err != nil {
- return err
- }
- }
- cmd.Root().DisableAutoGenTag = true
-
- jww.FEEDBACK.Println("Generating Hugo man pages in", cc.genmandir, "...")
- doc.GenManTree(cmd.Root(), header, cc.genmandir)
-
- jww.FEEDBACK.Println("Done.")
-
- return nil
- },
- })
-
- cc.cmd.PersistentFlags().StringVar(&cc.genmandir, "dir", "man/", "the directory to write the man pages.")
-
- // For bash-completion
- cc.cmd.PersistentFlags().SetAnnotation("dir", cobra.BashCompSubdirsInDir, []string{})
-
- return cc
-}
diff --git a/commands/helpers.go b/commands/helpers.go
index 1386e425f..a13bdebc2 100644
--- a/commands/helpers.go
+++ b/commands/helpers.go
@@ -1,4 +1,4 @@
-// Copyright 2018 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -11,16 +11,19 @@
// See the License for the specific language governing permissions and
// limitations under the License.
-// Package commands defines and implements command-line commands and flags
-// used by Hugo. Commands and flags are implemented using Cobra.
package commands
import (
+ "errors"
"fmt"
- "regexp"
+ "log"
+ "os"
+ "path/filepath"
+ "strings"
+ "github.com/bep/simplecobra"
"github.com/gohugoio/hugo/config"
- "github.com/spf13/cobra"
+ "github.com/spf13/pflag"
)
const (
@@ -30,50 +33,89 @@ const (
showCursor = ansiEsc + "[?25h"
)
-type flagsToConfigHandler interface {
- flagsToConfig(cfg config.Provider)
+func newUserError(a ...any) *simplecobra.CommandError {
+ return &simplecobra.CommandError{Err: errors.New(fmt.Sprint(a...))}
}
-type cmder interface {
- flagsToConfigHandler
- getCommand() *cobra.Command
+func setValueFromFlag(flags *pflag.FlagSet, key string, cfg config.Provider, targetKey string, force bool) {
+ key = strings.TrimSpace(key)
+ if (force && flags.Lookup(key) != nil) || flags.Changed(key) {
+ f := flags.Lookup(key)
+ configKey := key
+ if targetKey != "" {
+ configKey = targetKey
+ }
+ // Gotta love this API.
+ switch f.Value.Type() {
+ case "bool":
+ bv, _ := flags.GetBool(key)
+ cfg.Set(configKey, bv)
+ case "string":
+ cfg.Set(configKey, f.Value.String())
+ case "stringSlice":
+ bv, _ := flags.GetStringSlice(key)
+ cfg.Set(configKey, bv)
+ case "int":
+ iv, _ := flags.GetInt(key)
+ cfg.Set(configKey, iv)
+ default:
+ panic(fmt.Sprintf("update switch with %s", f.Value.Type()))
+ }
+
+ }
}
-// commandError is an error used to signal different error situations in command handling.
-type commandError struct {
- s string
- userError bool
+func flagsToCfg(cd *simplecobra.Commandeer, cfg config.Provider) config.Provider {
+ return flagsToCfgWithAdditionalConfigBase(cd, cfg, "")
}
-func (c commandError) Error() string {
- return c.s
-}
-
-func (c commandError) isUserError() bool {
- return c.userError
-}
-
-func newUserError(a ...interface{}) commandError {
- return commandError{s: fmt.Sprintln(a...), userError: true}
-}
-
-func newSystemError(a ...interface{}) commandError {
- return commandError{s: fmt.Sprintln(a...), userError: false}
-}
-
-func newSystemErrorF(format string, a ...interface{}) commandError {
- return commandError{s: fmt.Sprintf(format, a...), userError: false}
-}
-
-// Catch some of the obvious user errors from Cobra.
-// We don't want to show the usage message for every error.
-// The below may be to generic. Time will show.
-var userErrorRegexp = regexp.MustCompile("argument|flag|shorthand")
-
-func isUserError(err error) bool {
- if cErr, ok := err.(commandError); ok && cErr.isUserError() {
- return true
+func flagsToCfgWithAdditionalConfigBase(cd *simplecobra.Commandeer, cfg config.Provider, additionalConfigBase string) config.Provider {
+ if cfg == nil {
+ cfg = config.New()
}
- return userErrorRegexp.MatchString(err.Error())
+ // Flags with a different name in the config.
+ keyMap := map[string]string{
+ "minify": "minifyOutput",
+ "destination": "publishDir",
+ "editor": "newContentEditor",
+ }
+
+ // Flags that we for some reason don't want to expose in the site config.
+ internalKeySet := map[string]bool{
+ "quiet": true,
+ "verbose": true,
+ "watch": true,
+ "liveReloadPort": true,
+ "renderToMemory": true,
+ "clock": true,
+ }
+
+ cmd := cd.CobraCommand
+ flags := cmd.Flags()
+
+ flags.VisitAll(func(f *pflag.Flag) {
+ if f.Changed {
+ targetKey := f.Name
+ if internalKeySet[targetKey] {
+ targetKey = "internal." + targetKey
+ } else if mapped, ok := keyMap[targetKey]; ok {
+ targetKey = mapped
+ }
+ setValueFromFlag(flags, f.Name, cfg, targetKey, false)
+ if additionalConfigBase != "" {
+ setValueFromFlag(flags, f.Name, cfg, additionalConfigBase+"."+targetKey, true)
+ }
+ }
+ })
+
+ return cfg
+}
+
+func mkdir(x ...string) {
+ p := filepath.Join(x...)
+ err := os.MkdirAll(p, 0o777) // before umask
+ if err != nil {
+ log.Fatal(err)
+ }
}
diff --git a/commands/hugo.go b/commands/hugo.go
deleted file mode 100644
index 3da059cc5..000000000
--- a/commands/hugo.go
+++ /dev/null
@@ -1,1169 +0,0 @@
-// Copyright 2019 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-// Package commands defines and implements command-line commands and flags
-// used by Hugo. Commands and flags are implemented using Cobra.
-package commands
-
-import (
- "context"
- "fmt"
- "io/ioutil"
- "os/signal"
- "runtime/pprof"
- "runtime/trace"
- "sync/atomic"
-
- "github.com/gohugoio/hugo/hugofs"
-
- "github.com/gohugoio/hugo/resources/page"
-
- "github.com/pkg/errors"
-
- "github.com/gohugoio/hugo/common/herrors"
- "github.com/gohugoio/hugo/common/loggers"
- "github.com/gohugoio/hugo/common/terminal"
-
- "syscall"
-
- "github.com/gohugoio/hugo/hugolib/filesystems"
-
- "golang.org/x/sync/errgroup"
-
- "os"
- "path/filepath"
- "runtime"
- "strings"
- "time"
-
- "github.com/gohugoio/hugo/config"
-
- flag "github.com/spf13/pflag"
-
- "github.com/fsnotify/fsnotify"
- "github.com/gohugoio/hugo/helpers"
- "github.com/gohugoio/hugo/hugolib"
- "github.com/gohugoio/hugo/livereload"
- "github.com/gohugoio/hugo/watcher"
- "github.com/spf13/afero"
- "github.com/spf13/cobra"
- "github.com/spf13/fsync"
- jww "github.com/spf13/jwalterweatherman"
-)
-
-// The Response value from Execute.
-type Response struct {
- // The build Result will only be set in the hugo build command.
- Result *hugolib.HugoSites
-
- // Err is set when the command failed to execute.
- Err error
-
- // The command that was executed.
- Cmd *cobra.Command
-}
-
-// IsUserError returns true is the Response error is a user error rather than a
-// system error.
-func (r Response) IsUserError() bool {
- return r.Err != nil && isUserError(r.Err)
-}
-
-// Execute adds all child commands to the root command HugoCmd and sets flags appropriately.
-// The args are usually filled with os.Args[1:].
-func Execute(args []string) Response {
- hugoCmd := newCommandsBuilder().addAll().build()
- cmd := hugoCmd.getCommand()
- cmd.SetArgs(args)
-
- c, err := cmd.ExecuteC()
-
- var resp Response
-
- if c == cmd && hugoCmd.c != nil {
- // Root command executed
- resp.Result = hugoCmd.c.hugo()
- }
-
- if err == nil {
- errCount := int(loggers.GlobalErrorCounter.Count())
- if errCount > 0 {
- err = fmt.Errorf("logged %d errors", errCount)
- } else if resp.Result != nil {
- errCount = resp.Result.NumLogErrors()
- if errCount > 0 {
- err = fmt.Errorf("logged %d errors", errCount)
- }
- }
-
- }
-
- resp.Err = err
- resp.Cmd = c
-
- return resp
-}
-
-// InitializeConfig initializes a config file with sensible default configuration flags.
-func initializeConfig(mustHaveConfigFile, running bool,
- h *hugoBuilderCommon,
- f flagsToConfigHandler,
- doWithCommandeer func(c *commandeer) error) (*commandeer, error) {
-
- c, err := newCommandeer(mustHaveConfigFile, running, h, f, doWithCommandeer)
- if err != nil {
- return nil, err
- }
-
- return c, nil
-
-}
-
-func (c *commandeer) createLogger(cfg config.Provider, running bool) (*loggers.Logger, error) {
- var (
- logHandle = ioutil.Discard
- logThreshold = jww.LevelWarn
- logFile = cfg.GetString("logFile")
- outHandle = os.Stdout
- stdoutThreshold = jww.LevelWarn
- )
-
- if c.h.verboseLog || c.h.logging || (c.h.logFile != "") {
- var err error
- if logFile != "" {
- logHandle, err = os.OpenFile(logFile, os.O_RDWR|os.O_APPEND|os.O_CREATE, 0666)
- if err != nil {
- return nil, newSystemError("Failed to open log file:", logFile, err)
- }
- } else {
- logHandle, err = ioutil.TempFile("", "hugo")
- if err != nil {
- return nil, newSystemError(err)
- }
- }
- } else if !c.h.quiet && cfg.GetBool("verbose") {
- stdoutThreshold = jww.LevelInfo
- }
-
- if cfg.GetBool("debug") {
- stdoutThreshold = jww.LevelDebug
- }
-
- if c.h.verboseLog {
- logThreshold = jww.LevelInfo
- if cfg.GetBool("debug") {
- logThreshold = jww.LevelDebug
- }
- }
-
- loggers.InitGlobalLogger(stdoutThreshold, logThreshold, outHandle, logHandle)
- helpers.InitLoggers()
-
- return loggers.NewLogger(stdoutThreshold, logThreshold, outHandle, logHandle, running), nil
-}
-
-func initializeFlags(cmd *cobra.Command, cfg config.Provider) {
- persFlagKeys := []string{
- "debug",
- "verbose",
- "logFile",
- // Moved from vars
- }
- flagKeys := []string{
- "cleanDestinationDir",
- "buildDrafts",
- "buildFuture",
- "buildExpired",
- "uglyURLs",
- "canonifyURLs",
- "enableRobotsTXT",
- "enableGitInfo",
- "pluralizeListTitles",
- "preserveTaxonomyNames",
- "ignoreCache",
- "forceSyncStatic",
- "noTimes",
- "noChmod",
- "ignoreVendor",
- "templateMetrics",
- "templateMetricsHints",
-
- // Moved from vars.
- "baseURL",
- "buildWatch",
- "cacheDir",
- "cfgFile",
- "confirm",
- "contentDir",
- "debug",
- "destination",
- "disableKinds",
- "dryRun",
- "force",
- "gc",
- "i18n-warnings",
- "invalidateCDN",
- "layoutDir",
- "logFile",
- "maxDeletes",
- "quiet",
- "renderToMemory",
- "source",
- "target",
- "theme",
- "themesDir",
- "verbose",
- "verboseLog",
- "duplicateTargetPaths",
- }
-
- // Will set a value even if it is the default.
- flagKeysForced := []string{
- "minify",
- }
-
- for _, key := range persFlagKeys {
- setValueFromFlag(cmd.PersistentFlags(), key, cfg, "", false)
- }
- for _, key := range flagKeys {
- setValueFromFlag(cmd.Flags(), key, cfg, "", false)
- }
-
- for _, key := range flagKeysForced {
- setValueFromFlag(cmd.Flags(), key, cfg, "", true)
- }
-
- // Set some "config aliases"
- setValueFromFlag(cmd.Flags(), "destination", cfg, "publishDir", false)
- setValueFromFlag(cmd.Flags(), "i18n-warnings", cfg, "logI18nWarnings", false)
- setValueFromFlag(cmd.Flags(), "path-warnings", cfg, "logPathWarnings", false)
-
-}
-
-func setValueFromFlag(flags *flag.FlagSet, key string, cfg config.Provider, targetKey string, force bool) {
- key = strings.TrimSpace(key)
- if (force && flags.Lookup(key) != nil) || flags.Changed(key) {
- f := flags.Lookup(key)
- configKey := key
- if targetKey != "" {
- configKey = targetKey
- }
- // Gotta love this API.
- switch f.Value.Type() {
- case "bool":
- bv, _ := flags.GetBool(key)
- cfg.Set(configKey, bv)
- case "string":
- cfg.Set(configKey, f.Value.String())
- case "stringSlice":
- bv, _ := flags.GetStringSlice(key)
- cfg.Set(configKey, bv)
- case "int":
- iv, _ := flags.GetInt(key)
- cfg.Set(configKey, iv)
- default:
- panic(fmt.Sprintf("update switch with %s", f.Value.Type()))
- }
-
- }
-}
-
-func isTerminal() bool {
- return terminal.IsTerminal(os.Stdout)
-
-}
-func ifTerminal(s string) string {
- if !isTerminal() {
- return ""
- }
- return s
-}
-
-func (c *commandeer) fullBuild() error {
-
- var (
- g errgroup.Group
- langCount map[string]uint64
- )
-
- if !c.h.quiet {
- fmt.Print(ifTerminal(hideCursor) + "Building sites … ")
- if isTerminal() {
- defer func() {
- fmt.Print(showCursor + clearLine)
- }()
- }
- }
-
- copyStaticFunc := func() error {
-
- cnt, err := c.copyStatic()
- if err != nil {
- return errors.Wrap(err, "Error copying static files")
- }
- langCount = cnt
- return nil
- }
- buildSitesFunc := func() error {
- if err := c.buildSites(); err != nil {
- return errors.Wrap(err, "Error building site")
- }
- return nil
- }
- // Do not copy static files and build sites in parallel if cleanDestinationDir is enabled.
- // This flag deletes all static resources in /public folder that are missing in /static,
- // and it does so at the end of copyStatic() call.
- if c.Cfg.GetBool("cleanDestinationDir") {
- if err := copyStaticFunc(); err != nil {
- return err
- }
- if err := buildSitesFunc(); err != nil {
- return err
- }
- } else {
- g.Go(copyStaticFunc)
- g.Go(buildSitesFunc)
- if err := g.Wait(); err != nil {
- return err
- }
- }
-
- for _, s := range c.hugo().Sites {
- s.ProcessingStats.Static = langCount[s.Language().Lang]
- }
-
- if c.h.gc {
- count, err := c.hugo().GC()
- if err != nil {
- return err
- }
- for _, s := range c.hugo().Sites {
- // We have no way of knowing what site the garbage belonged to.
- s.ProcessingStats.Cleaned = uint64(count)
- }
- }
-
- return nil
-
-}
-
-func (c *commandeer) initCPUProfile() (func(), error) {
- if c.h.cpuprofile == "" {
- return nil, nil
- }
-
- f, err := os.Create(c.h.cpuprofile)
- if err != nil {
- return nil, errors.Wrap(err, "failed to create CPU profile")
- }
- if err := pprof.StartCPUProfile(f); err != nil {
- return nil, errors.Wrap(err, "failed to start CPU profile")
- }
- return func() {
- pprof.StopCPUProfile()
- f.Close()
- }, nil
-}
-
-func (c *commandeer) initMemProfile() {
- if c.h.memprofile == "" {
- return
- }
-
- f, err := os.Create(c.h.memprofile)
- if err != nil {
- c.logger.ERROR.Println("could not create memory profile: ", err)
- }
- defer f.Close()
- runtime.GC() // get up-to-date statistics
- if err := pprof.WriteHeapProfile(f); err != nil {
- c.logger.ERROR.Println("could not write memory profile: ", err)
- }
-}
-
-func (c *commandeer) initTraceProfile() (func(), error) {
- if c.h.traceprofile == "" {
- return nil, nil
- }
-
- f, err := os.Create(c.h.traceprofile)
- if err != nil {
- return nil, errors.Wrap(err, "failed to create trace file")
- }
-
- if err := trace.Start(f); err != nil {
- return nil, errors.Wrap(err, "failed to start trace")
- }
-
- return func() {
- trace.Stop()
- f.Close()
- }, nil
-}
-
-func (c *commandeer) initMutexProfile() (func(), error) {
- if c.h.mutexprofile == "" {
- return nil, nil
- }
-
- f, err := os.Create(c.h.mutexprofile)
- if err != nil {
- return nil, err
- }
-
- runtime.SetMutexProfileFraction(1)
-
- return func() {
- pprof.Lookup("mutex").WriteTo(f, 0)
- f.Close()
- }, nil
-
-}
-
-func (c *commandeer) initProfiling() (func(), error) {
- stopCPUProf, err := c.initCPUProfile()
- if err != nil {
- return nil, err
- }
-
- stopMutexProf, err := c.initMutexProfile()
- if err != nil {
- return nil, err
- }
-
- stopTraceProf, err := c.initTraceProfile()
- if err != nil {
- return nil, err
- }
-
- return func() {
- c.initMemProfile()
-
- if stopCPUProf != nil {
- stopCPUProf()
- }
- if stopMutexProf != nil {
- stopMutexProf()
- }
-
- if stopTraceProf != nil {
- stopTraceProf()
- }
- }, nil
-}
-
-func (c *commandeer) build() error {
- defer c.timeTrack(time.Now(), "Total")
-
- stopProfiling, err := c.initProfiling()
- if err != nil {
- return err
- }
-
- defer func() {
- if stopProfiling != nil {
- stopProfiling()
- }
- }()
-
- if err := c.fullBuild(); err != nil {
- return err
- }
-
- // TODO(bep) Feedback?
- if !c.h.quiet {
- fmt.Println()
- c.hugo().PrintProcessingStats(os.Stdout)
- fmt.Println()
-
- if createCounter, ok := c.destinationFs.(hugofs.DuplicatesReporter); ok {
- dupes := createCounter.ReportDuplicates()
- if dupes != "" {
- c.logger.WARN.Println("Duplicate target paths:", dupes)
- }
- }
- }
-
- if c.h.buildWatch {
- watchDirs, err := c.getDirList()
- if err != nil {
- return err
- }
-
- baseWatchDir := c.Cfg.GetString("workingDir")
- rootWatchDirs := getRootWatchDirsStr(baseWatchDir, watchDirs)
-
- c.logger.FEEDBACK.Printf("Watching for changes in %s%s{%s}\n", baseWatchDir, helpers.FilePathSeparator, rootWatchDirs)
- c.logger.FEEDBACK.Println("Press Ctrl+C to stop")
- watcher, err := c.newWatcher(watchDirs...)
- checkErr(c.Logger, err)
- defer watcher.Close()
-
- var sigs = make(chan os.Signal, 1)
- signal.Notify(sigs, syscall.SIGINT, syscall.SIGTERM)
-
- <-sigs
- }
-
- return nil
-}
-
-func (c *commandeer) serverBuild() error {
- defer c.timeTrack(time.Now(), "Total")
-
- stopProfiling, err := c.initProfiling()
- if err != nil {
- return err
- }
-
- defer func() {
- if stopProfiling != nil {
- stopProfiling()
- }
- }()
-
- if err := c.fullBuild(); err != nil {
- return err
- }
-
- // TODO(bep) Feedback?
- if !c.h.quiet {
- fmt.Println()
- c.hugo().PrintProcessingStats(os.Stdout)
- fmt.Println()
- }
-
- return nil
-}
-
-func (c *commandeer) copyStatic() (map[string]uint64, error) {
- m, err := c.doWithPublishDirs(c.copyStaticTo)
- if err == nil || os.IsNotExist(err) {
- return m, nil
- }
- return m, err
-}
-
-func (c *commandeer) doWithPublishDirs(f func(sourceFs *filesystems.SourceFilesystem) (uint64, error)) (map[string]uint64, error) {
-
- langCount := make(map[string]uint64)
-
- staticFilesystems := c.hugo().BaseFs.SourceFilesystems.Static
-
- if len(staticFilesystems) == 0 {
- c.logger.INFO.Println("No static directories found to sync")
- return langCount, nil
- }
-
- for lang, fs := range staticFilesystems {
- cnt, err := f(fs)
- if err != nil {
- return langCount, err
- }
-
- if lang == "" {
- // Not multihost
- for _, l := range c.languages {
- langCount[l.Lang] = cnt
- }
- } else {
- langCount[lang] = cnt
- }
- }
-
- return langCount, nil
-}
-
-type countingStatFs struct {
- afero.Fs
- statCounter uint64
-}
-
-func (fs *countingStatFs) Stat(name string) (os.FileInfo, error) {
- f, err := fs.Fs.Stat(name)
- if err == nil {
- if !f.IsDir() {
- atomic.AddUint64(&fs.statCounter, 1)
- }
- }
- return f, err
-}
-
-func chmodFilter(dst, src os.FileInfo) bool {
- // Hugo publishes data from multiple sources, potentially
- // with overlapping directory structures. We cannot sync permissions
- // for directories as that would mean that we might end up with write-protected
- // directories inside /public.
- // One example of this would be syncing from the Go Module cache,
- // which have 0555 directories.
- return src.IsDir()
-}
-
-func (c *commandeer) copyStaticTo(sourceFs *filesystems.SourceFilesystem) (uint64, error) {
- publishDir := c.hugo().PathSpec.PublishDir
- // If root, remove the second '/'
- if publishDir == "//" {
- publishDir = helpers.FilePathSeparator
- }
-
- if sourceFs.PublishFolder != "" {
- publishDir = filepath.Join(publishDir, sourceFs.PublishFolder)
- }
-
- fs := &countingStatFs{Fs: sourceFs.Fs}
-
- syncer := fsync.NewSyncer()
- syncer.NoTimes = c.Cfg.GetBool("noTimes")
- syncer.NoChmod = c.Cfg.GetBool("noChmod")
- syncer.ChmodFilter = chmodFilter
- syncer.SrcFs = fs
- syncer.DestFs = c.Fs.Destination
- // Now that we are using a unionFs for the static directories
- // We can effectively clean the publishDir on initial sync
- syncer.Delete = c.Cfg.GetBool("cleanDestinationDir")
-
- if syncer.Delete {
- c.logger.INFO.Println("removing all files from destination that don't exist in static dirs")
-
- syncer.DeleteFilter = func(f os.FileInfo) bool {
- return f.IsDir() && strings.HasPrefix(f.Name(), ".")
- }
- }
- c.logger.INFO.Println("syncing static files to", publishDir)
-
- // because we are using a baseFs (to get the union right).
- // set sync src to root
- err := syncer.Sync(publishDir, helpers.FilePathSeparator)
- if err != nil {
- return 0, err
- }
-
- // Sync runs Stat 3 times for every source file (which sounds much)
- numFiles := fs.statCounter / 3
-
- return numFiles, err
-}
-
-func (c *commandeer) firstPathSpec() *helpers.PathSpec {
- return c.hugo().Sites[0].PathSpec
-}
-
-func (c *commandeer) timeTrack(start time.Time, name string) {
- if c.h.quiet {
- return
- }
- elapsed := time.Since(start)
- c.logger.FEEDBACK.Printf("%s in %v ms", name, int(1000*elapsed.Seconds()))
-}
-
-// getDirList provides NewWatcher() with a list of directories to watch for changes.
-func (c *commandeer) getDirList() ([]string, error) {
- var dirnames []string
-
- walkFn := func(path string, fi hugofs.FileMetaInfo, err error) error {
- if err != nil {
- c.logger.ERROR.Println("walker: ", err)
- return nil
- }
-
- if fi.IsDir() {
- if fi.Name() == ".git" ||
- fi.Name() == "node_modules" || fi.Name() == "bower_components" {
- return filepath.SkipDir
- }
-
- dirnames = append(dirnames, fi.Meta().Filename())
- }
-
- return nil
-
- }
-
- watchDirs := c.hugo().PathSpec.BaseFs.WatchDirs()
- for _, watchDir := range watchDirs {
-
- w := hugofs.NewWalkway(hugofs.WalkwayConfig{Logger: c.logger, Info: watchDir, WalkFn: walkFn})
- if err := w.Walk(); err != nil {
- c.logger.ERROR.Println("walker: ", err)
- }
- }
-
- dirnames = helpers.UniqueStringsSorted(dirnames)
-
- return dirnames, nil
-}
-
-func (c *commandeer) buildSites() (err error) {
- return c.hugo().Build(hugolib.BuildCfg{})
-}
-
-func (c *commandeer) handleBuildErr(err error, msg string) {
- c.buildErr = err
-
- c.logger.ERROR.Print(msg + ":\n\n")
- c.logger.ERROR.Println(helpers.FirstUpper(err.Error()))
- if !c.h.quiet && c.h.verbose {
- herrors.PrintStackTrace(err)
- }
-}
-
-func (c *commandeer) rebuildSites(events []fsnotify.Event) error {
- defer c.timeTrack(time.Now(), "Total")
-
- c.buildErr = nil
- visited := c.visitedURLs.PeekAllSet()
- if c.fastRenderMode {
-
- // Make sure we always render the home pages
- for _, l := range c.languages {
- langPath := c.hugo().PathSpec.GetLangSubDir(l.Lang)
- if langPath != "" {
- langPath = langPath + "/"
- }
- home := c.hugo().PathSpec.PrependBasePath("/"+langPath, false)
- visited[home] = true
- }
-
- }
- return c.hugo().Build(hugolib.BuildCfg{RecentlyVisited: visited}, events...)
-}
-
-func (c *commandeer) partialReRender(urls ...string) error {
- c.buildErr = nil
- visited := make(map[string]bool)
- for _, url := range urls {
- visited[url] = true
- }
- return c.hugo().Build(hugolib.BuildCfg{RecentlyVisited: visited, PartialReRender: true})
-}
-
-func (c *commandeer) fullRebuild(changeType string) {
- if changeType == configChangeGoMod {
- // go.mod may be changed during the build itself, and
- // we really want to prevent superfluous builds.
- if !c.fullRebuildSem.TryAcquire(1) {
- return
- }
- c.fullRebuildSem.Release(1)
- }
-
- c.fullRebuildSem.Acquire(context.Background(), 1)
-
- go func() {
-
- defer c.fullRebuildSem.Release(1)
-
- c.printChangeDetected(changeType)
-
- defer func() {
-
- // Allow any file system events to arrive back.
- // This will block any rebuild on config changes for the
- // duration of the sleep.
- time.Sleep(2 * time.Second)
- }()
-
- defer c.timeTrack(time.Now(), "Total")
-
- c.commandeerHugoState = newCommandeerHugoState()
- err := c.loadConfig(true, true)
- if err != nil {
- // Set the processing on pause until the state is recovered.
- c.paused = true
- c.handleBuildErr(err, "Failed to reload config")
-
- } else {
- c.paused = false
- }
-
- if !c.paused {
- _, err := c.copyStatic()
- if err != nil {
- c.logger.ERROR.Println(err)
- return
- }
-
- err = c.buildSites()
- if err != nil {
- c.logger.ERROR.Println(err)
- } else if !c.h.buildWatch && !c.Cfg.GetBool("disableLiveReload") {
- livereload.ForceRefresh()
- }
- }
- }()
-}
-
-// newWatcher creates a new watcher to watch filesystem events.
-func (c *commandeer) newWatcher(dirList ...string) (*watcher.Batcher, error) {
- if runtime.GOOS == "darwin" {
- tweakLimit()
- }
-
- staticSyncer, err := newStaticSyncer(c)
- if err != nil {
- return nil, err
- }
-
- watcher, err := watcher.New(1 * time.Second)
-
- if err != nil {
- return nil, err
- }
-
- for _, d := range dirList {
- if d != "" {
- _ = watcher.Add(d)
- }
- }
-
- // Identifies changes to config (config.toml) files.
- configSet := make(map[string]bool)
-
- c.logger.FEEDBACK.Println("Watching for config changes in", strings.Join(c.configFiles, ", "))
- for _, configFile := range c.configFiles {
- watcher.Add(configFile)
- configSet[configFile] = true
- }
-
- go func() {
- for {
- select {
- case evs := <-watcher.Events:
- c.handleEvents(watcher, staticSyncer, evs, configSet)
- if c.showErrorInBrowser && c.errCount() > 0 {
- // Need to reload browser to show the error
- livereload.ForceRefresh()
- }
- case err := <-watcher.Errors:
- if err != nil {
- c.logger.ERROR.Println("Error while watching:", err)
- }
- }
- }
- }()
-
- return watcher, nil
-}
-
-func (c *commandeer) printChangeDetected(typ string) {
- msg := "\nChange"
- if typ != "" {
- msg += " of " + typ
- }
- msg += " detected, rebuilding site."
-
- c.logger.FEEDBACK.Println(msg)
- const layout = "2006-01-02 15:04:05.000 -0700"
- c.logger.FEEDBACK.Println(time.Now().Format(layout))
-}
-
-const (
- configChangeConfig = "config file"
- configChangeGoMod = "go.mod file"
-)
-
-func (c *commandeer) handleEvents(watcher *watcher.Batcher,
- staticSyncer *staticSyncer,
- evs []fsnotify.Event,
- configSet map[string]bool) {
-
- var isHandled bool
-
- for _, ev := range evs {
- isConfig := configSet[ev.Name]
- configChangeType := configChangeConfig
- if isConfig {
- if strings.Contains(ev.Name, "go.mod") {
- configChangeType = configChangeGoMod
- }
- }
- if !isConfig {
- // It may be one of the /config folders
- dirname := filepath.Dir(ev.Name)
- if dirname != "." && configSet[dirname] {
- isConfig = true
- }
- }
-
- if isConfig {
- isHandled = true
-
- if ev.Op&fsnotify.Chmod == fsnotify.Chmod {
- continue
- }
-
- if ev.Op&fsnotify.Remove == fsnotify.Remove || ev.Op&fsnotify.Rename == fsnotify.Rename {
- for _, configFile := range c.configFiles {
- counter := 0
- for watcher.Add(configFile) != nil {
- counter++
- if counter >= 100 {
- break
- }
- time.Sleep(100 * time.Millisecond)
- }
- }
-
- }
-
- // Config file(s) changed. Need full rebuild.
- c.fullRebuild(configChangeType)
-
- return
- }
- }
-
- if isHandled {
- return
- }
-
- if c.paused {
- // Wait for the server to get into a consistent state before
- // we continue with processing.
- return
- }
-
- if len(evs) > 50 {
- // This is probably a mass edit of the content dir.
- // Schedule a full rebuild for when it slows down.
- c.debounce(func() {
- c.fullRebuild("")
- })
- return
- }
-
- c.logger.INFO.Println("Received System Events:", evs)
-
- staticEvents := []fsnotify.Event{}
- dynamicEvents := []fsnotify.Event{}
-
- // Special handling for symbolic links inside /content.
- filtered := []fsnotify.Event{}
- for _, ev := range evs {
- // Check the most specific first, i.e. files.
- contentMapped := c.hugo().ContentChanges.GetSymbolicLinkMappings(ev.Name)
- if len(contentMapped) > 0 {
- for _, mapped := range contentMapped {
- filtered = append(filtered, fsnotify.Event{Name: mapped, Op: ev.Op})
- }
- continue
- }
-
- // Check for any symbolic directory mapping.
-
- dir, name := filepath.Split(ev.Name)
-
- contentMapped = c.hugo().ContentChanges.GetSymbolicLinkMappings(dir)
-
- if len(contentMapped) == 0 {
- filtered = append(filtered, ev)
- continue
- }
-
- for _, mapped := range contentMapped {
- mappedFilename := filepath.Join(mapped, name)
- filtered = append(filtered, fsnotify.Event{Name: mappedFilename, Op: ev.Op})
- }
- }
-
- evs = filtered
-
- for _, ev := range evs {
- ext := filepath.Ext(ev.Name)
- baseName := filepath.Base(ev.Name)
- istemp := strings.HasSuffix(ext, "~") ||
- (ext == ".swp") || // vim
- (ext == ".swx") || // vim
- (ext == ".tmp") || // generic temp file
- (ext == ".DS_Store") || // OSX Thumbnail
- baseName == "4913" || // vim
- strings.HasPrefix(ext, ".goutputstream") || // gnome
- strings.HasSuffix(ext, "jb_old___") || // intelliJ
- strings.HasSuffix(ext, "jb_tmp___") || // intelliJ
- strings.HasSuffix(ext, "jb_bak___") || // intelliJ
- strings.HasPrefix(ext, ".sb-") || // byword
- strings.HasPrefix(baseName, ".#") || // emacs
- strings.HasPrefix(baseName, "#") // emacs
- if istemp {
- continue
- }
- if c.hugo().Deps.SourceSpec.IgnoreFile(ev.Name) {
- continue
- }
- // Sometimes during rm -rf operations a '"": REMOVE' is triggered. Just ignore these
- if ev.Name == "" {
- continue
- }
-
- // Write and rename operations are often followed by CHMOD.
- // There may be valid use cases for rebuilding the site on CHMOD,
- // but that will require more complex logic than this simple conditional.
- // On OS X this seems to be related to Spotlight, see:
- // https://github.com/go-fsnotify/fsnotify/issues/15
- // A workaround is to put your site(s) on the Spotlight exception list,
- // but that may be a little mysterious for most end users.
- // So, for now, we skip reload on CHMOD.
- // We do have to check for WRITE though. On slower laptops a Chmod
- // could be aggregated with other important events, and we still want
- // to rebuild on those
- if ev.Op&(fsnotify.Chmod|fsnotify.Write|fsnotify.Create) == fsnotify.Chmod {
- continue
- }
-
- walkAdder := func(path string, f hugofs.FileMetaInfo, err error) error {
- if f.IsDir() {
- c.logger.FEEDBACK.Println("adding created directory to watchlist", path)
- if err := watcher.Add(path); err != nil {
- return err
- }
- } else if !staticSyncer.isStatic(path) {
- // Hugo's rebuilding logic is entirely file based. When you drop a new folder into
- // /content on OSX, the above logic will handle future watching of those files,
- // but the initial CREATE is lost.
- dynamicEvents = append(dynamicEvents, fsnotify.Event{Name: path, Op: fsnotify.Create})
- }
- return nil
- }
-
- // recursively add new directories to watch list
- // When mkdir -p is used, only the top directory triggers an event (at least on OSX)
- if ev.Op&fsnotify.Create == fsnotify.Create {
- if s, err := c.Fs.Source.Stat(ev.Name); err == nil && s.Mode().IsDir() {
- _ = helpers.SymbolicWalk(c.Fs.Source, ev.Name, walkAdder)
- }
- }
-
- if staticSyncer.isStatic(ev.Name) {
- staticEvents = append(staticEvents, ev)
- } else {
- dynamicEvents = append(dynamicEvents, ev)
- }
- }
-
- if len(staticEvents) > 0 {
- c.printChangeDetected("Static files")
-
- if c.Cfg.GetBool("forceSyncStatic") {
- c.logger.FEEDBACK.Printf("Syncing all static files\n")
- _, err := c.copyStatic()
- if err != nil {
- c.logger.ERROR.Println("Error copying static files to publish dir:", err)
- return
- }
- } else {
- if err := staticSyncer.syncsStaticEvents(staticEvents); err != nil {
- c.logger.ERROR.Println("Error syncing static files to publish dir:", err)
- return
- }
- }
-
- if !c.h.buildWatch && !c.Cfg.GetBool("disableLiveReload") {
- // Will block forever trying to write to a channel that nobody is reading if livereload isn't initialized
-
- // force refresh when more than one file
- if len(staticEvents) == 1 {
- ev := staticEvents[0]
- path := c.hugo().BaseFs.SourceFilesystems.MakeStaticPathRelative(ev.Name)
- path = c.firstPathSpec().RelURL(helpers.ToSlashTrimLeading(path), false)
- livereload.RefreshPath(path)
- } else {
- livereload.ForceRefresh()
- }
- }
- }
-
- if len(dynamicEvents) > 0 {
- partitionedEvents := partitionDynamicEvents(
- c.firstPathSpec().BaseFs.SourceFilesystems,
- dynamicEvents)
-
- doLiveReload := !c.h.buildWatch && !c.Cfg.GetBool("disableLiveReload")
- onePageName := pickOneWriteOrCreatePath(partitionedEvents.ContentEvents)
-
- c.printChangeDetected("")
- c.changeDetector.PrepareNew()
- if err := c.rebuildSites(dynamicEvents); err != nil {
- c.handleBuildErr(err, "Rebuild failed")
- }
-
- if doLiveReload {
- if len(partitionedEvents.ContentEvents) == 0 && len(partitionedEvents.AssetEvents) > 0 {
- changed := c.changeDetector.changed()
- if c.changeDetector != nil && len(changed) == 0 {
- // Nothing has changed.
- return
- } else if len(changed) == 1 {
- pathToRefresh := c.firstPathSpec().RelURL(helpers.ToSlashTrimLeading(changed[0]), false)
- livereload.RefreshPath(pathToRefresh)
- } else {
- livereload.ForceRefresh()
- }
- }
-
- if len(partitionedEvents.ContentEvents) > 0 {
-
- navigate := c.Cfg.GetBool("navigateToChanged")
- // We have fetched the same page above, but it may have
- // changed.
- var p page.Page
-
- if navigate {
- if onePageName != "" {
- p = c.hugo().GetContentPage(onePageName)
- }
- }
-
- if p != nil {
- livereload.NavigateToPathForPort(p.RelPermalink(), p.Site().ServerPort())
- } else {
- livereload.ForceRefresh()
- }
- }
- }
- }
-}
-
-// dynamicEvents contains events that is considered dynamic, as in "not static".
-// Both of these categories will trigger a new build, but the asset events
-// does not fit into the "navigate to changed" logic.
-type dynamicEvents struct {
- ContentEvents []fsnotify.Event
- AssetEvents []fsnotify.Event
-}
-
-func partitionDynamicEvents(sourceFs *filesystems.SourceFilesystems, events []fsnotify.Event) (de dynamicEvents) {
- for _, e := range events {
- if sourceFs.IsAsset(e.Name) {
- de.AssetEvents = append(de.AssetEvents, e)
- } else {
- de.ContentEvents = append(de.ContentEvents, e)
- }
- }
- return
-
-}
-
-func pickOneWriteOrCreatePath(events []fsnotify.Event) string {
- name := ""
-
- // Some editors (for example notepad.exe on Windows) triggers a change
- // both for directory and file. So we pick the longest path, which should
- // be the file itself.
- for _, ev := range events {
- if (ev.Op&fsnotify.Write == fsnotify.Write || ev.Op&fsnotify.Create == fsnotify.Create) && len(ev.Name) > len(name) {
- name = ev.Name
- }
- }
-
- return name
-}
diff --git a/commands/hugo_test.go b/commands/hugo_test.go
deleted file mode 100644
index 6a666ff4a..000000000
--- a/commands/hugo_test.go
+++ /dev/null
@@ -1,52 +0,0 @@
-// Copyright 2019 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package commands
-
-import (
- "os"
- "testing"
-
- qt "github.com/frankban/quicktest"
-)
-
-// Issue #5662
-func TestHugoWithContentDirOverride(t *testing.T) {
- c := qt.New(t)
-
- hugoCmd := newCommandsBuilder().addAll().build()
- cmd := hugoCmd.getCommand()
-
- contentDir := "contentOverride"
-
- cfgStr := `
-
-baseURL = "https://example.org"
-title = "Hugo Commands"
-
-contentDir = "thisdoesnotexist"
-
-`
- dir, err := createSimpleTestSite(t, testSiteConfig{configTOML: cfgStr, contentDir: contentDir})
- c.Assert(err, qt.IsNil)
-
- defer func() {
- os.RemoveAll(dir)
- }()
-
- cmd.SetArgs([]string{"-s=" + dir, "-c=" + contentDir})
-
- _, err = cmd.ExecuteC()
- c.Assert(err, qt.IsNil)
-
-}
diff --git a/commands/hugo_windows.go b/commands/hugo_windows.go
index 106c0cc71..c354e889d 100644
--- a/commands/hugo_windows.go
+++ b/commands/hugo_windows.go
@@ -1,4 +1,4 @@
-// Copyright 2015 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -13,15 +13,21 @@
package commands
-import "github.com/spf13/cobra"
+import (
+ // For time zone lookups on Windows without Go installed.
+ // See #8892
+ _ "time/tzdata"
+
+ "github.com/spf13/cobra"
+)
func init() {
// This message to show to Windows users if Hugo is opened from explorer.exe
cobra.MousetrapHelpText = `
- Hugo is a command-line tool for generating static website.
+ Hugo is a command-line tool for generating static websites.
+
+ You need to open PowerShell and run Hugo from there.
- You need to open cmd.exe and run Hugo from there.
-
Visit https://gohugo.io/ for more information.`
}
diff --git a/commands/hugobuilder.go b/commands/hugobuilder.go
new file mode 100644
index 000000000..3b57ac5e9
--- /dev/null
+++ b/commands/hugobuilder.go
@@ -0,0 +1,1157 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package commands
+
+import (
+ "context"
+ "errors"
+ "fmt"
+ "os"
+ "path/filepath"
+ "runtime"
+ "runtime/pprof"
+ "runtime/trace"
+ "strings"
+ "sync"
+ "sync/atomic"
+ "time"
+
+ "github.com/bep/simplecobra"
+ "github.com/fsnotify/fsnotify"
+ "github.com/gohugoio/hugo/common/herrors"
+ "github.com/gohugoio/hugo/common/htime"
+ "github.com/gohugoio/hugo/common/hugo"
+ "github.com/gohugoio/hugo/common/loggers"
+ "github.com/gohugoio/hugo/common/maps"
+ "github.com/gohugoio/hugo/common/paths"
+ "github.com/gohugoio/hugo/common/terminal"
+ "github.com/gohugoio/hugo/common/types"
+ "github.com/gohugoio/hugo/config"
+ "github.com/gohugoio/hugo/helpers"
+ "github.com/gohugoio/hugo/hugofs"
+ "github.com/gohugoio/hugo/hugolib"
+ "github.com/gohugoio/hugo/hugolib/filesystems"
+ "github.com/gohugoio/hugo/identity"
+ "github.com/gohugoio/hugo/livereload"
+ "github.com/gohugoio/hugo/resources/page"
+ "github.com/gohugoio/hugo/watcher"
+ "github.com/spf13/fsync"
+ "golang.org/x/sync/errgroup"
+ "golang.org/x/sync/semaphore"
+)
+
+type hugoBuilder struct {
+ r *rootCommand
+
+ confmu sync.Mutex
+ conf *commonConfig
+
+ // May be nil.
+ s *serverCommand
+
+ // Currently only set when in "fast render mode".
+ changeDetector *fileChangeDetector
+ visitedURLs *types.EvictingQueue[string]
+
+ fullRebuildSem *semaphore.Weighted
+ debounce func(f func())
+
+ onConfigLoaded func(reloaded bool) error
+
+ fastRenderMode bool
+ showErrorInBrowser bool
+
+ errState hugoBuilderErrState
+}
+
+var errConfigNotSet = errors.New("config not set")
+
+func (c *hugoBuilder) withConfE(fn func(conf *commonConfig) error) error {
+ c.confmu.Lock()
+ defer c.confmu.Unlock()
+ if c.conf == nil {
+ return errConfigNotSet
+ }
+ return fn(c.conf)
+}
+
+func (c *hugoBuilder) withConf(fn func(conf *commonConfig)) {
+ c.confmu.Lock()
+ defer c.confmu.Unlock()
+ fn(c.conf)
+}
+
+type hugoBuilderErrState struct {
+ mu sync.Mutex
+ paused bool
+ builderr error
+ waserr bool
+}
+
+func (e *hugoBuilderErrState) setPaused(p bool) {
+ e.mu.Lock()
+ defer e.mu.Unlock()
+ e.paused = p
+}
+
+func (e *hugoBuilderErrState) isPaused() bool {
+ e.mu.Lock()
+ defer e.mu.Unlock()
+ return e.paused
+}
+
+func (e *hugoBuilderErrState) setBuildErr(err error) {
+ e.mu.Lock()
+ defer e.mu.Unlock()
+ e.builderr = err
+}
+
+func (e *hugoBuilderErrState) buildErr() error {
+ e.mu.Lock()
+ defer e.mu.Unlock()
+ return e.builderr
+}
+
+func (e *hugoBuilderErrState) setWasErr(w bool) {
+ e.mu.Lock()
+ defer e.mu.Unlock()
+ e.waserr = w
+}
+
+func (e *hugoBuilderErrState) wasErr() bool {
+ e.mu.Lock()
+ defer e.mu.Unlock()
+ return e.waserr
+}
+
+// getDirList provides NewWatcher() with a list of directories to watch for changes.
+func (c *hugoBuilder) getDirList() ([]string, error) {
+ h, err := c.hugo()
+ if err != nil {
+ return nil, err
+ }
+
+ return helpers.UniqueStringsSorted(h.PathSpec.BaseFs.WatchFilenames()), nil
+}
+
+func (c *hugoBuilder) initCPUProfile() (func(), error) {
+ if c.r.cpuprofile == "" {
+ return nil, nil
+ }
+
+ f, err := os.Create(c.r.cpuprofile)
+ if err != nil {
+ return nil, fmt.Errorf("failed to create CPU profile: %w", err)
+ }
+ if err := pprof.StartCPUProfile(f); err != nil {
+ return nil, fmt.Errorf("failed to start CPU profile: %w", err)
+ }
+ return func() {
+ pprof.StopCPUProfile()
+ f.Close()
+ }, nil
+}
+
+func (c *hugoBuilder) initMemProfile() {
+ if c.r.memprofile == "" {
+ return
+ }
+
+ f, err := os.Create(c.r.memprofile)
+ if err != nil {
+ c.r.logger.Errorf("could not create memory profile: ", err)
+ }
+ defer f.Close()
+ runtime.GC() // get up-to-date statistics
+ if err := pprof.WriteHeapProfile(f); err != nil {
+ c.r.logger.Errorf("could not write memory profile: ", err)
+ }
+}
+
+func (c *hugoBuilder) initMemTicker() func() {
+ memticker := time.NewTicker(5 * time.Second)
+ quit := make(chan struct{})
+ printMem := func() {
+ var m runtime.MemStats
+ runtime.ReadMemStats(&m)
+ fmt.Printf("\n\nAlloc = %v\nTotalAlloc = %v\nSys = %v\nNumGC = %v\n\n", formatByteCount(m.Alloc), formatByteCount(m.TotalAlloc), formatByteCount(m.Sys), m.NumGC)
+ }
+
+ go func() {
+ for {
+ select {
+ case <-memticker.C:
+ printMem()
+ case <-quit:
+ memticker.Stop()
+ printMem()
+ return
+ }
+ }
+ }()
+
+ return func() {
+ close(quit)
+ }
+}
+
+func (c *hugoBuilder) initMutexProfile() (func(), error) {
+ if c.r.mutexprofile == "" {
+ return nil, nil
+ }
+
+ f, err := os.Create(c.r.mutexprofile)
+ if err != nil {
+ return nil, err
+ }
+
+ runtime.SetMutexProfileFraction(1)
+
+ return func() {
+ pprof.Lookup("mutex").WriteTo(f, 0)
+ f.Close()
+ }, nil
+}
+
+func (c *hugoBuilder) initProfiling() (func(), error) {
+ stopCPUProf, err := c.initCPUProfile()
+ if err != nil {
+ return nil, err
+ }
+
+ stopMutexProf, err := c.initMutexProfile()
+ if err != nil {
+ return nil, err
+ }
+
+ stopTraceProf, err := c.initTraceProfile()
+ if err != nil {
+ return nil, err
+ }
+
+ var stopMemTicker func()
+ if c.r.printm {
+ stopMemTicker = c.initMemTicker()
+ }
+
+ return func() {
+ c.initMemProfile()
+
+ if stopCPUProf != nil {
+ stopCPUProf()
+ }
+ if stopMutexProf != nil {
+ stopMutexProf()
+ }
+
+ if stopTraceProf != nil {
+ stopTraceProf()
+ }
+
+ if stopMemTicker != nil {
+ stopMemTicker()
+ }
+ }, nil
+}
+
+func (c *hugoBuilder) initTraceProfile() (func(), error) {
+ if c.r.traceprofile == "" {
+ return nil, nil
+ }
+
+ f, err := os.Create(c.r.traceprofile)
+ if err != nil {
+ return nil, fmt.Errorf("failed to create trace file: %w", err)
+ }
+
+ if err := trace.Start(f); err != nil {
+ return nil, fmt.Errorf("failed to start trace: %w", err)
+ }
+
+ return func() {
+ trace.Stop()
+ f.Close()
+ }, nil
+}
+
+// newWatcher creates a new watcher to watch filesystem events.
+func (c *hugoBuilder) newWatcher(pollIntervalStr string, dirList ...string) (*watcher.Batcher, error) {
+ staticSyncer := &staticSyncer{c: c}
+
+ var pollInterval time.Duration
+ poll := pollIntervalStr != ""
+ if poll {
+ var err error
+ pollInterval, err = types.ToDurationE(pollIntervalStr)
+ if err != nil {
+ return nil, fmt.Errorf("invalid value for flag poll: %s", err)
+ }
+ c.r.logger.Printf("Use watcher with poll interval %v", pollInterval)
+ }
+
+ if pollInterval == 0 {
+ pollInterval = 500 * time.Millisecond
+ }
+
+ watcher, err := watcher.New(500*time.Millisecond, pollInterval, poll)
+ if err != nil {
+ return nil, err
+ }
+
+ h, err := c.hugo()
+ if err != nil {
+ return nil, err
+ }
+ spec := h.Deps.SourceSpec
+
+ for _, d := range dirList {
+ if d != "" {
+ if spec.IgnoreFile(d) {
+ continue
+ }
+ _ = watcher.Add(d)
+ }
+ }
+
+ // Identifies changes to config (config.toml) files.
+ configSet := make(map[string]bool)
+ var configFiles []string
+ c.withConf(func(conf *commonConfig) {
+ configFiles = conf.configs.LoadingInfo.ConfigFiles
+ })
+
+ c.r.Println("Watching for config changes in", strings.Join(configFiles, ", "))
+ for _, configFile := range configFiles {
+ watcher.Add(configFile)
+ configSet[configFile] = true
+ }
+
+ go func() {
+ for {
+ select {
+ case changes := <-c.r.changesFromBuild:
+ unlock, err := h.LockBuild()
+ if err != nil {
+ c.r.logger.Errorln("Failed to acquire a build lock: %s", err)
+ return
+ }
+ c.changeDetector.PrepareNew()
+ err = c.rebuildSitesForChanges(changes)
+ if err != nil {
+ c.r.logger.Errorln("Error while watching:", err)
+ }
+ if c.s != nil && c.s.doLiveReload {
+ doReload := c.changeDetector == nil || len(c.changeDetector.changed()) > 0
+ doReload = doReload || c.showErrorInBrowser && c.errState.buildErr() != nil
+ if doReload {
+ livereload.ForceRefresh()
+ }
+ }
+ unlock()
+
+ case evs := <-watcher.Events:
+ unlock, err := h.LockBuild()
+ if err != nil {
+ c.r.logger.Errorln("Failed to acquire a build lock: %s", err)
+ return
+ }
+ c.handleEvents(watcher, staticSyncer, evs, configSet)
+ if c.showErrorInBrowser && c.errState.buildErr() != nil {
+ // Need to reload browser to show the error
+ livereload.ForceRefresh()
+ }
+ unlock()
+ case err := <-watcher.Errors():
+ if err != nil && !herrors.IsNotExist(err) {
+ c.r.logger.Errorln("Error while watching:", err)
+ }
+ }
+ }
+ }()
+
+ return watcher, nil
+}
+
+func (c *hugoBuilder) build() error {
+ stopProfiling, err := c.initProfiling()
+ if err != nil {
+ return err
+ }
+
+ defer func() {
+ if stopProfiling != nil {
+ stopProfiling()
+ }
+ }()
+
+ if err := c.fullBuild(false); err != nil {
+ return err
+ }
+
+ if !c.r.quiet {
+ c.r.Println()
+ h, err := c.hugo()
+ if err != nil {
+ return err
+ }
+
+ h.PrintProcessingStats(os.Stdout)
+ c.r.Println()
+ }
+
+ return nil
+}
+
+func (c *hugoBuilder) buildSites(noBuildLock bool) (err error) {
+ defer func() {
+ c.errState.setBuildErr(err)
+ }()
+
+ var h *hugolib.HugoSites
+ h, err = c.hugo()
+ if err != nil {
+ return
+ }
+ err = h.Build(hugolib.BuildCfg{NoBuildLock: noBuildLock})
+ return
+}
+
+func (c *hugoBuilder) copyStatic() (map[string]uint64, error) {
+ m, err := c.doWithPublishDirs(c.copyStaticTo)
+ if err == nil || herrors.IsNotExist(err) {
+ return m, nil
+ }
+ return m, err
+}
+
+func (c *hugoBuilder) copyStaticTo(sourceFs *filesystems.SourceFilesystem) (uint64, error) {
+ infol := c.r.logger.InfoCommand("static")
+ publishDir := helpers.FilePathSeparator
+
+ if sourceFs.PublishFolder != "" {
+ publishDir = filepath.Join(publishDir, sourceFs.PublishFolder)
+ }
+
+ fs := &countingStatFs{Fs: sourceFs.Fs}
+
+ syncer := fsync.NewSyncer()
+ c.withConf(func(conf *commonConfig) {
+ syncer.NoTimes = conf.configs.Base.NoTimes
+ syncer.NoChmod = conf.configs.Base.NoChmod
+ syncer.ChmodFilter = chmodFilter
+
+ syncer.DestFs = conf.fs.PublishDirStatic
+ // Now that we are using a unionFs for the static directories
+ // We can effectively clean the publishDir on initial sync
+ syncer.Delete = conf.configs.Base.CleanDestinationDir
+ })
+
+ syncer.SrcFs = fs
+
+ if syncer.Delete {
+ infol.Logf("removing all files from destination that don't exist in static dirs")
+
+ syncer.DeleteFilter = func(f fsync.FileInfo) bool {
+ return f.IsDir() && strings.HasPrefix(f.Name(), ".")
+ }
+ }
+ start := time.Now()
+
+ // because we are using a baseFs (to get the union right).
+ // set sync src to root
+ err := syncer.Sync(publishDir, helpers.FilePathSeparator)
+ if err != nil {
+ return 0, err
+ }
+ loggers.TimeTrackf(infol, start, nil, "syncing static files to %s", publishDir)
+
+ // Sync runs Stat 2 times for every source file.
+ numFiles := fs.statCounter / 2
+
+ return numFiles, err
+}
+
+func (c *hugoBuilder) doWithPublishDirs(f func(sourceFs *filesystems.SourceFilesystem) (uint64, error)) (map[string]uint64, error) {
+ langCount := make(map[string]uint64)
+
+ h, err := c.hugo()
+ if err != nil {
+ return nil, err
+ }
+ staticFilesystems := h.BaseFs.SourceFilesystems.Static
+
+ if len(staticFilesystems) == 0 {
+ c.r.logger.Infoln("No static directories found to sync")
+ return langCount, nil
+ }
+
+ for lang, fs := range staticFilesystems {
+ cnt, err := f(fs)
+ if err != nil {
+ return langCount, err
+ }
+ if lang == "" {
+ // Not multihost
+ c.withConf(func(conf *commonConfig) {
+ for _, l := range conf.configs.Languages {
+ langCount[l.Lang] = cnt
+ }
+ })
+ } else {
+ langCount[lang] = cnt
+ }
+ }
+
+ return langCount, nil
+}
+
+func (c *hugoBuilder) fullBuild(noBuildLock bool) error {
+ var (
+ g errgroup.Group
+ langCount map[string]uint64
+ )
+
+ c.r.logger.Println("Start building sites … ")
+ c.r.logger.Println(hugo.BuildVersionString())
+ c.r.logger.Println()
+ if terminal.IsTerminal(os.Stdout) {
+ defer func() {
+ fmt.Print(showCursor + clearLine)
+ }()
+ }
+
+ copyStaticFunc := func() error {
+ cnt, err := c.copyStatic()
+ if err != nil {
+ return fmt.Errorf("error copying static files: %w", err)
+ }
+ langCount = cnt
+ return nil
+ }
+ buildSitesFunc := func() error {
+ if err := c.buildSites(noBuildLock); err != nil {
+ return fmt.Errorf("error building site: %w", err)
+ }
+ return nil
+ }
+ // Do not copy static files and build sites in parallel if cleanDestinationDir is enabled.
+ // This flag deletes all static resources in /public folder that are missing in /static,
+ // and it does so at the end of copyStatic() call.
+ var cleanDestinationDir bool
+ c.withConf(func(conf *commonConfig) {
+ cleanDestinationDir = conf.configs.Base.CleanDestinationDir
+ })
+ if cleanDestinationDir {
+ if err := copyStaticFunc(); err != nil {
+ return err
+ }
+ if err := buildSitesFunc(); err != nil {
+ return err
+ }
+ } else {
+ g.Go(copyStaticFunc)
+ g.Go(buildSitesFunc)
+ if err := g.Wait(); err != nil {
+ return err
+ }
+ }
+
+ h, err := c.hugo()
+ if err != nil {
+ return err
+ }
+ for _, s := range h.Sites {
+ s.ProcessingStats.Static = langCount[s.Language().Lang]
+ }
+
+ if c.r.gc {
+ count, err := h.GC()
+ if err != nil {
+ return err
+ }
+ for _, s := range h.Sites {
+ // We have no way of knowing what site the garbage belonged to.
+ s.ProcessingStats.Cleaned = uint64(count)
+ }
+ }
+
+ return nil
+}
+
+func (c *hugoBuilder) fullRebuild(changeType string) {
+ if changeType == configChangeGoMod {
+ // go.mod may be changed during the build itself, and
+ // we really want to prevent superfluous builds.
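+ // TryAcquire followed by an immediate Release is only a probe: if the
+ // semaphore is already held, a full rebuild is in flight and this
+ // event can be dropped.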
+ if !c.fullRebuildSem.TryAcquire(1) {
+ return
+ }
+ c.fullRebuildSem.Release(1)
+ }
+
+ c.fullRebuildSem.Acquire(context.Background(), 1)
+
+ go func() {
+ defer c.fullRebuildSem.Release(1)
+
+ c.printChangeDetected(changeType)
+
+ defer func() {
+ // Allow any file system events to arrive back.
+ // This will block any rebuild on config changes for the
+ // duration of the sleep.
+ time.Sleep(2 * time.Second)
+ }()
+
+ defer c.postBuild("Rebuilt", time.Now())
+
+ err := c.reloadConfig()
+ if err != nil {
+ // Set the processing on pause until the state is recovered.
+ c.errState.setPaused(true)
+ c.handleBuildErr(err, "Failed to reload config")
+ if c.s.doLiveReload {
+ livereload.ForceRefresh()
+ }
+ } else {
+ c.errState.setPaused(false)
+ }
+
+ if !c.errState.isPaused() {
+ _, err := c.copyStatic()
+ if err != nil {
+ c.r.logger.Errorln(err)
+ return
+ }
+ err = c.buildSites(false)
+ if err != nil {
+ c.r.logger.Errorln(err)
+ } else if c.s != nil && c.s.doLiveReload {
+ livereload.ForceRefresh()
+ }
+ }
+ }()
+}
+
+func (c *hugoBuilder) handleBuildErr(err error, msg string) {
+ c.errState.setBuildErr(err)
+ c.r.logger.Errorln(msg + ": " + cleanErrorLog(err.Error()))
+}
+
+func (c *hugoBuilder) handleEvents(watcher *watcher.Batcher,
+ staticSyncer *staticSyncer,
+ evs []fsnotify.Event,
+ configSet map[string]bool,
+) {
+ defer func() {
+ c.errState.setWasErr(false)
+ }()
+
+ var isHandled bool
+
+ // Filter out ghost events (from deleted, renamed directories).
+ // This seems to be a bug in fsnotify, or possibly MacOS.
+ var n int
+ for _, ev := range evs {
+ keep := true
+ // Write and rename operations are often followed by CHMOD.
+ // There may be valid use cases for rebuilding the site on CHMOD,
+ // but that will require more complex logic than this simple conditional.
+ // On OS X this seems to be related to Spotlight, see:
+ // https://github.com/go-fsnotify/fsnotify/issues/15
+ // A workaround is to put your site(s) on the Spotlight exception list,
+ // but that may be a little mysterious for most end users.
+ // So, for now, we skip reload on CHMOD.
+ // We do have to check for WRITE, though. On slower laptops a Chmod
+ // could be aggregated with other important events, and we still want
+ // to rebuild on those.
+ if ev.Op == fsnotify.Chmod {
+ keep = false
+ } else if ev.Has(fsnotify.Create) || ev.Has(fsnotify.Write) {
+ if _, err := os.Stat(ev.Name); err != nil {
+ keep = false
+ }
+ }
+ if keep {
+ evs[n] = ev
+ n++
+ }
+ }
+ evs = evs[:n]
+
+ for _, ev := range evs {
+ isConfig := configSet[ev.Name]
+ configChangeType := configChangeConfig
+ if isConfig {
+ if strings.Contains(ev.Name, "go.mod") {
+ configChangeType = configChangeGoMod
+ }
+ if strings.Contains(ev.Name, ".work") {
+ configChangeType = configChangeGoWork
+ }
+ }
+ if !isConfig {
+ // It may be one of the /config folders
+ dirname := filepath.Dir(ev.Name)
+ if dirname != "." && configSet[dirname] {
+ isConfig = true
+ }
+ }
+
+ if isConfig {
+ isHandled = true
+
+ if ev.Op&fsnotify.Chmod == fsnotify.Chmod {
+ continue
+ }
+
+ if ev.Op&fsnotify.Remove == fsnotify.Remove || ev.Op&fsnotify.Rename == fsnotify.Rename {
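+ // Editors often save a config file by removing or renaming it first,
+ // which drops it from the watch list; retry re-adding it for up to
+ // ~10 seconds.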
+ c.withConf(func(conf *commonConfig) {
+ for _, configFile := range conf.configs.LoadingInfo.ConfigFiles {
+ counter := 0
+ for watcher.Add(configFile) != nil {
+ counter++
+ if counter >= 100 {
+ break
+ }
+ time.Sleep(100 * time.Millisecond)
+ }
+ }
+ })
+ }
+
+ // Config file(s) changed. Need full rebuild.
+ c.fullRebuild(configChangeType)
+
+ return
+ }
+ }
+
+ if isHandled {
+ return
+ }
+
+ if c.errState.isPaused() {
+ // Wait for the server to get into a consistent state before
+ // we continue with processing.
+ return
+ }
+
+ if len(evs) > 50 {
+ // This is probably a mass edit of the content dir.
+ // Schedule a full rebuild for when it slows down.
+ c.debounce(func() {
+ c.fullRebuild("")
+ })
+ return
+ }
+
+ c.r.logger.Debugln("Received System Events:", evs)
+
+ staticEvents := []fsnotify.Event{}
+ dynamicEvents := []fsnotify.Event{}
+
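+ // Keep only the first event seen for each file name, preserving order.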
+ filterDuplicateEvents := func(evs []fsnotify.Event) []fsnotify.Event {
+ seen := make(map[string]bool)
+ var n int
+ for _, ev := range evs {
+ if seen[ev.Name] {
+ continue
+ }
+ seen[ev.Name] = true
+ evs[n] = ev
+ n++
+ }
+ return evs[:n]
+ }
+
+ h, err := c.hugo()
+ if err != nil {
+ c.r.logger.Errorln("Error getting the Hugo object:", err)
+ return
+ }
+ n = 0
+ for _, ev := range evs {
+ if h.ShouldSkipFileChangeEvent(ev) {
+ continue
+ }
+ evs[n] = ev
+ n++
+ }
+ evs = evs[:n]
+
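+ // Classify the remaining events: skip temp files from editors and the OS,
+ // watch any newly created directories, and split the rest into static
+ // and dynamic events.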
+ for _, ev := range evs {
+ ext := filepath.Ext(ev.Name)
+ baseName := filepath.Base(ev.Name)
+ istemp := strings.HasSuffix(ext, "~") ||
+ (ext == ".swp") || // vim
+ (ext == ".swx") || // vim
+ (ext == ".bck") || // helix
+ (ext == ".tmp") || // generic temp file
+ (ext == ".DS_Store") || // OSX Thumbnail
+ baseName == "4913" || // vim
+ strings.HasPrefix(ext, ".goutputstream") || // gnome
+ strings.HasSuffix(ext, "jb_old___") || // intelliJ
+ strings.HasSuffix(ext, "jb_tmp___") || // intelliJ
+ strings.HasSuffix(ext, "jb_bak___") || // intelliJ
+ strings.HasPrefix(ext, ".sb-") || // byword
+ strings.HasPrefix(baseName, ".#") || // emacs
+ strings.HasPrefix(baseName, "#") // emacs
+ if istemp {
+ continue
+ }
+
+ if h.Deps.SourceSpec.IgnoreFile(ev.Name) {
+ continue
+ }
+ // Sometimes during rm -rf operations a '"": REMOVE' is triggered. Just ignore these
+ if ev.Name == "" {
+ continue
+ }
+
+ walkAdder := func(path string, f hugofs.FileMetaInfo) error {
+ if f.IsDir() {
+ c.r.logger.Println("adding created directory to watchlist", path)
+ if err := watcher.Add(path); err != nil {
+ return err
+ }
+ } else if !staticSyncer.isStatic(h, path) {
+ // Hugo's rebuilding logic is entirely file based. When you drop a new folder into
+ // /content on OSX, the above logic will handle future watching of those files,
+ // but the initial CREATE is lost.
+ dynamicEvents = append(dynamicEvents, fsnotify.Event{Name: path, Op: fsnotify.Create})
+ }
+ return nil
+ }
+
+ // recursively add new directories to watch list
+ if ev.Has(fsnotify.Create) || ev.Has(fsnotify.Rename) {
+ c.withConf(func(conf *commonConfig) {
+ if s, err := conf.fs.Source.Stat(ev.Name); err == nil && s.Mode().IsDir() {
+ _ = helpers.Walk(conf.fs.Source, ev.Name, walkAdder)
+ }
+ })
+ }
+
+ if staticSyncer.isStatic(h, ev.Name) {
+ staticEvents = append(staticEvents, ev)
+ } else {
+ dynamicEvents = append(dynamicEvents, ev)
+ }
+ }
+
+ lrl := c.r.logger.InfoCommand("livereload")
+
+ staticEvents = filterDuplicateEvents(staticEvents)
+ dynamicEvents = filterDuplicateEvents(dynamicEvents)
+
+ if len(staticEvents) > 0 {
+ c.printChangeDetected("Static files")
+
+ if c.r.forceSyncStatic {
+ c.r.logger.Printf("Syncing all static files\n")
+ _, err := c.copyStatic()
+ if err != nil {
+ c.r.logger.Errorln("Error copying static files to publish dir:", err)
+ return
+ }
+ } else {
+ if err := staticSyncer.syncsStaticEvents(staticEvents); err != nil {
+ c.r.logger.Errorln("Error syncing static files to publish dir:", err)
+ return
+ }
+ }
+
+ if c.s != nil && c.s.doLiveReload {
+ // Will block forever trying to write to a channel that nobody is reading if livereload isn't initialized
+
+ if !c.errState.wasErr() && len(staticEvents) == 1 {
+ h, err := c.hugo()
+ if err != nil {
+ c.r.logger.Errorln("Error getting the Hugo object:", err)
+ return
+ }
+
+ path := h.BaseFs.SourceFilesystems.MakeStaticPathRelative(staticEvents[0].Name)
+ path = h.RelURL(paths.ToSlashTrimLeading(path), false)
+
+ lrl.Logf("refreshing static file %q", path)
+ livereload.RefreshPath(path)
+ } else {
+ lrl.Logf("got %d static file change events, force refresh", len(staticEvents))
+ livereload.ForceRefresh()
+ }
+ }
+ }
+
+ if len(dynamicEvents) > 0 {
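+ // Split the dynamic events into content and non-content changes;
+ // a single content change may let us navigate the browser straight
+ // to the changed page below.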
+ partitionedEvents := partitionDynamicEvents(
+ h.BaseFs.SourceFilesystems,
+ dynamicEvents)
+
+ onePageName := pickOneWriteOrCreatePath(h.Conf.ContentTypes(), partitionedEvents.ContentEvents)
+
+ c.printChangeDetected("")
+ c.changeDetector.PrepareNew()
+
+ func() {
+ defer c.postBuild("Total", time.Now())
+ if err := c.rebuildSites(dynamicEvents); err != nil {
+ c.handleBuildErr(err, "Rebuild failed")
+ }
+ }()
+
+ if c.s != nil && c.s.doLiveReload {
+ if c.errState.wasErr() {
+ livereload.ForceRefresh()
+ return
+ }
+
+ changed := c.changeDetector.changed()
+ if c.changeDetector != nil {
+ if len(changed) >= 10 {
+ lrl.Logf("build changed %d files", len(changed))
+ } else {
+ lrl.Logf("build changed %d files: %q", len(changed), changed)
+ }
+ if len(changed) == 0 {
+ // Nothing has changed.
+ return
+ }
+ }
+
+ // If this change set also contains one or more CSS files, we need to
+ // refresh these as well.
+ var cssChanges []string
+ var otherChanges []string
+
+ for _, ev := range changed {
+ if strings.HasSuffix(ev, ".css") {
+ cssChanges = append(cssChanges, ev)
+ } else {
+ otherChanges = append(otherChanges, ev)
+ }
+ }
+
+ if len(partitionedEvents.ContentEvents) > 0 {
+ navigate := c.s != nil && c.s.navigateToChanged
+ // We have fetched the same page above, but it may have
+ // changed.
+ var p page.Page
+
+ if navigate {
+ if onePageName != "" {
+ p = h.GetContentPage(onePageName)
+ }
+ }
+
+ if p != nil && p.RelPermalink() != "" {
+ link, port := p.RelPermalink(), p.Site().ServerPort()
+ lrl.Logf("navigating to %q using port %d", link, port)
+ livereload.NavigateToPathForPort(link, port)
+ } else {
+ lrl.Logf("no page to navigate to, force refresh")
+ livereload.ForceRefresh()
+ }
+ } else if len(otherChanges) > 0 {
+ if len(otherChanges) == 1 {
+ // Allow single changes to be refreshed without a full page reload.
+ pathToRefresh := h.PathSpec.RelURL(paths.ToSlashTrimLeading(otherChanges[0]), false)
+ lrl.Logf("refreshing %q", pathToRefresh)
+ livereload.RefreshPath(pathToRefresh)
+ } else if len(cssChanges) == 0 || len(otherChanges) > 1 {
+ lrl.Logf("force refresh")
+ livereload.ForceRefresh()
+ }
+ } else {
+ lrl.Logf("force refresh")
+ livereload.ForceRefresh()
+ }
+
+ if len(cssChanges) > 0 {
+ // Allow some time for the live reload script to get reconnected.
+ if len(otherChanges) > 0 {
+ time.Sleep(200 * time.Millisecond)
+ }
+ for _, ev := range cssChanges {
+ pathToRefresh := h.PathSpec.RelURL(paths.ToSlashTrimLeading(ev), false)
+ lrl.Logf("refreshing CSS %q", pathToRefresh)
+ livereload.RefreshPath(pathToRefresh)
+ }
+ }
+ }
+ }
+}
+
+func (c *hugoBuilder) postBuild(what string, start time.Time) {
+ if h, err := c.hugo(); err == nil && h.Conf.Running() {
+ h.LogServerAddresses()
+ }
+ c.r.timeTrack(start, what)
+}
+
+func (c *hugoBuilder) hugo() (*hugolib.HugoSites, error) {
+ var h *hugolib.HugoSites
+ if err := c.withConfE(func(conf *commonConfig) error {
+ var err error
+ h, err = c.r.HugFromConfig(conf)
+ return err
+ }); err != nil {
+ return nil, err
+ }
+
+ if c.s != nil {
+ // A running server, register the media types.
+ for _, s := range h.Sites {
+ s.RegisterMediaTypes()
+ }
+ }
+ return h, nil
+}
+
+func (c *hugoBuilder) hugoTry() *hugolib.HugoSites {
+ var h *hugolib.HugoSites
+ c.withConf(func(conf *commonConfig) {
+ h, _ = c.r.HugFromConfig(conf)
+ })
+ return h
+}
+
+func (c *hugoBuilder) loadConfig(cd *simplecobra.Commandeer, running bool) error {
+ cfg := config.New()
+ cfg.Set("renderToMemory", c.r.renderToMemory)
+ watch := c.r.buildWatch || (c.s != nil && c.s.serverWatch)
+ if c.r.environment == "" {
+ // We need to set the environment as early as possible because we need it to load the correct config.
+ // Check if the user has set it in env.
+ if env := os.Getenv("HUGO_ENVIRONMENT"); env != "" {
+ c.r.environment = env
+ } else if env := os.Getenv("HUGO_ENV"); env != "" {
+ c.r.environment = env
+ } else {
+ if c.s != nil {
+ // The server defaults to development.
+ c.r.environment = hugo.EnvironmentDevelopment
+ } else {
+ c.r.environment = hugo.EnvironmentProduction
+ }
+ }
+ }
+ cfg.Set("environment", c.r.environment)
+
+ cfg.Set("internal", maps.Params{
+ "running": running,
+ "watch": watch,
+ "verbose": c.r.isVerbose(),
+ "fastRenderMode": c.fastRenderMode,
+ })
+
+ conf, err := c.r.ConfigFromProvider(configKey{counter: c.r.configVersionID.Load()}, flagsToCfg(cd, cfg))
+ if err != nil {
+ return err
+ }
+
+ if len(conf.configs.LoadingInfo.ConfigFiles) == 0 {
+ //lint:ignore ST1005 end user message.
+ return errors.New("Unable to locate config file or config directory. Perhaps you need to create a new site.\nRun `hugo help new` for details.")
+ }
+
+ c.conf = conf
+ if c.onConfigLoaded != nil {
+ if err := c.onConfigLoaded(false); err != nil {
+ return err
+ }
+ }
+
+ return nil
+}
+
+var rebuildCounter atomic.Uint64
+
+func (c *hugoBuilder) printChangeDetected(typ string) {
+ msg := "\nChange"
+ if typ != "" {
+ msg += " of " + typ
+ }
+ msg += fmt.Sprintf(" detected, rebuilding site (#%d).", rebuildCounter.Add(1))
+
+ c.r.logger.Println(msg)
+ const layout = "2006-01-02 15:04:05.000 -0700"
+ c.r.logger.Println(htime.Now().Format(layout))
+}
+
+func (c *hugoBuilder) rebuildSites(events []fsnotify.Event) (err error) {
+ defer func() {
+ c.errState.setBuildErr(err)
+ }()
+ if err := c.errState.buildErr(); err != nil {
+ ferrs := herrors.UnwrapFileErrorsWithErrorContext(err)
+ for _, err := range ferrs {
+ events = append(events, fsnotify.Event{Name: err.Position().Filename, Op: fsnotify.Write})
+ }
+ }
+ var h *hugolib.HugoSites
+ h, err = c.hugo()
+ if err != nil {
+ return
+ }
+ err = h.Build(hugolib.BuildCfg{NoBuildLock: true, RecentlyTouched: c.visitedURLs, ErrRecovery: c.errState.wasErr()}, events...)
+ return
+}
+
+func (c *hugoBuilder) rebuildSitesForChanges(ids []identity.Identity) (err error) {
+ defer func() {
+ c.errState.setBuildErr(err)
+ }()
+
+ var h *hugolib.HugoSites
+ h, err = c.hugo()
+ if err != nil {
+ return
+ }
+ whatChanged := &hugolib.WhatChanged{}
+ whatChanged.Add(ids...)
+ err = h.Build(hugolib.BuildCfg{NoBuildLock: true, WhatChanged: whatChanged, RecentlyTouched: c.visitedURLs, ErrRecovery: c.errState.wasErr()})
+
+ return
+}
+
+func (c *hugoBuilder) reloadConfig() error {
+ c.r.resetLogs()
+ c.r.configVersionID.Add(1)
+
+ if err := c.withConfE(func(conf *commonConfig) error {
+ oldConf := conf
+ newConf, err := c.r.ConfigFromConfig(configKey{counter: c.r.configVersionID.Load()}, conf)
+ if err != nil {
+ return err
+ }
+ sameLen := len(oldConf.configs.Languages) == len(newConf.configs.Languages)
+ if !sameLen {
+ if oldConf.configs.IsMultihost || newConf.configs.IsMultihost {
+ return errors.New("multihost change detected, please restart server")
+ }
+ }
+ c.conf = newConf
+ return nil
+ }); err != nil {
+ return err
+ }
+
+ if c.onConfigLoaded != nil {
+ if err := c.onConfigLoaded(true); err != nil {
+ return err
+ }
+ }
+
+ return nil
+}
diff --git a/commands/import.go b/commands/import.go
new file mode 100644
index 000000000..37a6b0dbf
--- /dev/null
+++ b/commands/import.go
@@ -0,0 +1,618 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package commands
+
+import (
+ "bytes"
+ "context"
+ "errors"
+ "fmt"
+ "io"
+ "log"
+ "os"
+ "path/filepath"
+ "regexp"
+ "strconv"
+ "strings"
+ "time"
+ "unicode"
+
+ "github.com/bep/simplecobra"
+ "github.com/gohugoio/hugo/common/htime"
+ "github.com/gohugoio/hugo/common/hugio"
+ "github.com/gohugoio/hugo/common/maps"
+ "github.com/gohugoio/hugo/helpers"
+ "github.com/gohugoio/hugo/hugofs"
+ "github.com/gohugoio/hugo/parser"
+ "github.com/gohugoio/hugo/parser/metadecoders"
+ "github.com/gohugoio/hugo/parser/pageparser"
+ "github.com/spf13/afero"
+ "github.com/spf13/cobra"
+)
+
+func newImportCommand() *importCommand {
+ var c *importCommand
+ c = &importCommand{
+ commands: []simplecobra.Commander{
+ &simpleCommand{
+ name: "jekyll",
+ short: "hugo import from Jekyll",
+ long: `hugo import from Jekyll.
+
+Import from Jekyll requires two paths, e.g. ` + "`hugo import jekyll jekyll_root_path target_path`.",
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ if len(args) < 2 {
+ return newUserError(`import from jekyll requires two paths, e.g. ` + "`hugo import jekyll jekyll_root_path target_path`.")
+ }
+ return c.importFromJekyll(args)
+ },
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ cmd.Flags().BoolVar(&c.force, "force", false, "allow import into non-empty target directory")
+ },
+ },
+ },
+ }
+
+ return c
+}
+
+type importCommand struct {
+ r *rootCommand
+
+ force bool
+
+ commands []simplecobra.Commander
+}
+
+func (c *importCommand) Commands() []simplecobra.Commander {
+ return c.commands
+}
+
+func (c *importCommand) Name() string {
+ return "import"
+}
+
+func (c *importCommand) Run(ctx context.Context, cd *simplecobra.Commandeer, args []string) error {
+ return nil
+}
+
+func (c *importCommand) Init(cd *simplecobra.Commandeer) error {
+ cmd := cd.CobraCommand
+ cmd.Short = "Import a site from another system"
+ cmd.Long = `Import a site from another system.
+
+Import requires a subcommand, e.g. ` + "`hugo import jekyll jekyll_root_path target_path`."
+
+ cmd.RunE = nil
+ return nil
+}
+
+func (c *importCommand) PreRun(cd, runner *simplecobra.Commandeer) error {
+ c.r = cd.Root.Command.(*rootCommand)
+ return nil
+}
+
+func (i *importCommand) createConfigFromJekyll(fs afero.Fs, inpath string, kind metadecoders.Format, jekyllConfig map[string]any) (err error) {
+ title := "My New Hugo Site"
+ baseURL := "http://example.org/"
+
+ for key, value := range jekyllConfig {
+ lowerKey := strings.ToLower(key)
+
+ switch lowerKey {
+ case "title":
+ if str, ok := value.(string); ok {
+ title = str
+ }
+
+ case "url":
+ if str, ok := value.(string); ok {
+ baseURL = str
+ }
+ }
+ }
+
+ in := map[string]any{
+ "baseURL": baseURL,
+ "title": title,
+ "languageCode": "en-us",
+ "disablePathToLower": true,
+ }
+
+ var buf bytes.Buffer
+ err = parser.InterfaceToConfig(in, kind, &buf)
+ if err != nil {
+ return err
+ }
+
+ return helpers.WriteToDisk(filepath.Join(inpath, "hugo."+string(kind)), &buf, fs)
+}
+
+func (c *importCommand) getJekyllDirInfo(fs afero.Fs, jekyllRoot string) (map[string]bool, bool) {
+ postDirs := make(map[string]bool)
+ hasAnyPost := false
+ if entries, err := os.ReadDir(jekyllRoot); err == nil {
+ for _, entry := range entries {
+ if entry.IsDir() {
+ subDir := filepath.Join(jekyllRoot, entry.Name())
+ if isPostDir, hasAnyPostInDir := c.retrieveJekyllPostDir(fs, subDir); isPostDir {
+ postDirs[entry.Name()] = hasAnyPostInDir
+ if hasAnyPostInDir {
+ hasAnyPost = true
+ }
+ }
+ }
+ }
+ }
+ return postDirs, hasAnyPost
+}
+
+func (c *importCommand) createSiteFromJekyll(jekyllRoot, targetDir string, jekyllPostDirs map[string]bool) error {
+ fs := &afero.OsFs{}
+ if exists, _ := helpers.Exists(targetDir, fs); exists {
+ if isDir, _ := helpers.IsDir(targetDir, fs); !isDir {
+ return errors.New("target path \"" + targetDir + "\" exists but is not a directory")
+ }
+
+ isEmpty, _ := helpers.IsEmpty(targetDir, fs)
+
+ if !isEmpty && !c.force {
+ return errors.New("target path \"" + targetDir + "\" exists and is not empty")
+ }
+ }
+
+ jekyllConfig := c.loadJekyllConfig(fs, jekyllRoot)
+
+ mkdir(targetDir, "layouts")
+ mkdir(targetDir, "content")
+ mkdir(targetDir, "archetypes")
+ mkdir(targetDir, "static")
+ mkdir(targetDir, "data")
+ mkdir(targetDir, "themes")
+
+ c.createConfigFromJekyll(fs, targetDir, "yaml", jekyllConfig)
+
+ c.copyJekyllFilesAndFolders(jekyllRoot, filepath.Join(targetDir, "static"), jekyllPostDirs)
+
+ return nil
+}
+
+func (c *importCommand) convertJekyllContent(m any, content string) (string, error) {
+ metadata, _ := maps.ToStringMapE(m)
+
+ lines := strings.Split(content, "\n")
+ var resultLines []string
+ for _, line := range lines {
+ resultLines = append(resultLines, strings.Trim(line, "\r\n"))
+ }
+
+ content = strings.Join(resultLines, "\n")
+
+ excerptSep := "<!--more-->"
+ if value, ok := metadata["excerpt_separator"]; ok {
+ if str, strOk := value.(string); strOk {
+ content = strings.Replace(content, strings.TrimSpace(str), excerptSep, -1)
+ }
+ }
+
+ replaceList := []struct {
+ re *regexp.Regexp
+ replace string
+ }{
+ {regexp.MustCompile("(?i)<!-- more -->"), "<!--more-->"},
+ {regexp.MustCompile(`\{%\s*raw\s*%\}\s*(.*?)\s*\{%\s*endraw\s*%\}`), "$1"},
+ {regexp.MustCompile(`{%\s*endhighlight\s*%}`), "{{< / highlight >}}"},
+ }
+
+ for _, replace := range replaceList {
+ content = replace.re.ReplaceAllString(content, replace.replace)
+ }
+
+ replaceListFunc := []struct {
+ re *regexp.Regexp
+ replace func(string) string
+ }{
+ // Octopress image tag: http://octopress.org/docs/plugins/image-tag/
+ {regexp.MustCompile(`{%\s+img\s*(.*?)\s*%}`), c.replaceImageTag},
+ {regexp.MustCompile(`{%\s*highlight\s*(.*?)\s*%}`), c.replaceHighlightTag},
+ }
+
+ for _, replace := range replaceListFunc {
+ content = replace.re.ReplaceAllStringFunc(content, replace.replace)
+ }
+
+ var buf bytes.Buffer
+ if len(metadata) != 0 {
+ err := parser.InterfaceToFrontMatter(m, metadecoders.YAML, &buf)
+ if err != nil {
+ return "", err
+ }
+ }
+ buf.WriteString(content)
+
+ return buf.String(), nil
+}
+
+func (c *importCommand) convertJekyllMetaData(m any, postName string, postDate time.Time, draft bool) (any, error) {
+ metadata, err := maps.ToStringMapE(m)
+ if err != nil {
+ return nil, err
+ }
+
+ if draft {
+ metadata["draft"] = true
+ }
+
+ for key, value := range metadata {
+ lowerKey := strings.ToLower(key)
+
+ switch lowerKey {
+ case "layout":
+ delete(metadata, key)
+ case "permalink":
+ if str, ok := value.(string); ok {
+ metadata["url"] = str
+ }
+ delete(metadata, key)
+ case "category":
+ if str, ok := value.(string); ok {
+ metadata["categories"] = []string{str}
+ }
+ delete(metadata, key)
+ case "excerpt_separator":
+ if key != lowerKey {
+ delete(metadata, key)
+ metadata[lowerKey] = value
+ }
+ case "date":
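+ // The front matter date may carry a time of day; merge it into the
+ // date parsed from the filename.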
+ if str, ok := value.(string); ok {
+ re := regexp.MustCompile(`(\d+):(\d+):(\d+)`)
+ r := re.FindAllStringSubmatch(str, -1)
+ if len(r) > 0 {
+ hour, _ := strconv.Atoi(r[0][1])
+ minute, _ := strconv.Atoi(r[0][2])
+ second, _ := strconv.Atoi(r[0][3])
+ postDate = time.Date(postDate.Year(), postDate.Month(), postDate.Day(), hour, minute, second, 0, time.UTC)
+ }
+ }
+ delete(metadata, key)
+ }
+
+ }
+
+ metadata["date"] = postDate.Format(time.RFC3339)
+
+ return metadata, nil
+}
+
+func (c *importCommand) convertJekyllPost(path, relPath, targetDir string, draft bool) error {
+ log.Println("Converting", path)
+
+ filename := filepath.Base(path)
+ postDate, postName, err := c.parseJekyllFilename(filename)
+ if err != nil {
+ c.r.Printf("Failed to parse filename '%s': %s. Skipping.", filename, err)
+ return nil
+ }
+
+ log.Println(filename, postDate, postName)
+
+ targetFile := filepath.Join(targetDir, relPath)
+ targetParentDir := filepath.Dir(targetFile)
+ os.MkdirAll(targetParentDir, 0o777)
+
+ contentBytes, err := os.ReadFile(path)
+ if err != nil {
+ c.r.logger.Errorln("Read file error:", path)
+ return err
+ }
+ pf, err := pageparser.ParseFrontMatterAndContent(bytes.NewReader(contentBytes))
+ if err != nil {
+ return fmt.Errorf("failed to parse file %q: %s", filename, err)
+ }
+ newmetadata, err := c.convertJekyllMetaData(pf.FrontMatter, postName, postDate, draft)
+ if err != nil {
+ return fmt.Errorf("failed to convert metadata for file %q: %s", filename, err)
+ }
+
+ content, err := c.convertJekyllContent(newmetadata, string(pf.Content))
+ if err != nil {
+ return fmt.Errorf("failed to convert content for file %q: %s", filename, err)
+ }
+
+ fs := hugofs.Os
+ if err := helpers.WriteToDisk(targetFile, strings.NewReader(content), fs); err != nil {
+ return fmt.Errorf("failed to save file %q: %s", filename, err)
+ }
+ return nil
+}
+
+func (c *importCommand) copyJekyllFilesAndFolders(jekyllRoot, dest string, jekyllPostDirs map[string]bool) (err error) {
+ fs := hugofs.Os
+
+ fi, err := fs.Stat(jekyllRoot)
+ if err != nil {
+ return err
+ }
+ if !fi.IsDir() {
+ return errors.New(jekyllRoot + " is not a directory")
+ }
+ err = os.MkdirAll(dest, fi.Mode())
+ if err != nil {
+ return err
+ }
+ entries, err := os.ReadDir(jekyllRoot)
+ if err != nil {
+ return err
+ }
+
+ for _, entry := range entries {
+ sfp := filepath.Join(jekyllRoot, entry.Name())
+ dfp := filepath.Join(dest, entry.Name())
+ if entry.IsDir() {
+ if entry.Name()[0] != '_' && entry.Name()[0] != '.' {
+ if _, ok := jekyllPostDirs[entry.Name()]; !ok {
+ err = hugio.CopyDir(fs, sfp, dfp, nil)
+ if err != nil {
+ c.r.logger.Errorln(err)
+ }
+ }
+ }
+ } else {
+ lowerEntryName := strings.ToLower(entry.Name())
+ exceptSuffix := []string{
+ ".md", ".markdown", ".html", ".htm",
+ ".xml", ".textile", "rakefile", "gemfile", ".lock",
+ }
+ isExcept := false
+ for _, suffix := range exceptSuffix {
+ if strings.HasSuffix(lowerEntryName, suffix) {
+ isExcept = true
+ break
+ }
+ }
+
+ if !isExcept && entry.Name()[0] != '.' && entry.Name()[0] != '_' {
+ err = hugio.CopyFile(fs, sfp, dfp)
+ if err != nil {
+ c.r.logger.Errorln(err)
+ }
+ }
+ }
+
+ }
+ return nil
+}
+
+func (c *importCommand) importFromJekyll(args []string) error {
+ jekyllRoot, err := filepath.Abs(filepath.Clean(args[0]))
+ if err != nil {
+ return newUserError("path error:", args[0])
+ }
+
+ targetDir, err := filepath.Abs(filepath.Clean(args[1]))
+ if err != nil {
+ return newUserError("path error:", args[1])
+ }
+
+ c.r.Println("Import Jekyll from:", jekyllRoot, "to:", targetDir)
+
+ if strings.HasPrefix(filepath.Dir(targetDir), jekyllRoot) {
+ return newUserError("abort: target path should not be inside the Jekyll root")
+ }
+
+ fs := afero.NewOsFs()
+ jekyllPostDirs, hasAnyPost := c.getJekyllDirInfo(fs, jekyllRoot)
+ if !hasAnyPost {
+ return errors.New("abort: jekyll root contains neither posts nor drafts")
+ }
+
+ err = c.createSiteFromJekyll(jekyllRoot, targetDir, jekyllPostDirs)
+ if err != nil {
+ return newUserError(err)
+ }
+
+ c.r.Println("Importing...")
+
+ fileCount := 0
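+ // Convert every file found under the Jekyll post directories: files in
+ // _posts/ go to content/post, files in _drafts/ go to content/draft and
+ // are marked as drafts.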
+ callback := func(path string, fi hugofs.FileMetaInfo) error {
+ if fi.IsDir() {
+ return nil
+ }
+
+ relPath, err := filepath.Rel(jekyllRoot, path)
+ if err != nil {
+ return newUserError("get rel path error:", path)
+ }
+
+ relPath = filepath.ToSlash(relPath)
+ draft := false
+
+ switch {
+ case strings.Contains(relPath, "_posts/"):
+ relPath = filepath.Join("content/post", strings.Replace(relPath, "_posts/", "", -1))
+ case strings.Contains(relPath, "_drafts/"):
+ relPath = filepath.Join("content/draft", strings.Replace(relPath, "_drafts/", "", -1))
+ draft = true
+ default:
+ return nil
+ }
+
+ fileCount++
+ return c.convertJekyllPost(path, relPath, targetDir, draft)
+ }
+
+ for jekyllPostDir, hasAnyPostInDir := range jekyllPostDirs {
+ if hasAnyPostInDir {
+ if err = helpers.Walk(hugofs.Os, filepath.Join(jekyllRoot, jekyllPostDir), callback); err != nil {
+ return err
+ }
+ }
+ }
+
+ c.r.Println("Congratulations!", fileCount, "post(s) imported!")
+ c.r.Println("Now, start Hugo by yourself:\n")
+ c.r.Println("cd " + args[1])
+ c.r.Println("git init")
+ c.r.Println("git submodule add https://github.com/theNewDynamic/gohugo-theme-ananke themes/ananke")
+ c.r.Println("echo \"theme = 'ananke'\" > hugo.toml")
+ c.r.Println("hugo server")
+
+ return nil
+}
+
+func (c *importCommand) loadJekyllConfig(fs afero.Fs, jekyllRoot string) map[string]any {
+ path := filepath.Join(jekyllRoot, "_config.yml")
+
+ exists, err := helpers.Exists(path, fs)
+
+ if err != nil || !exists {
+ c.r.Println("_config.yml not found: Is the specified Jekyll root correct?")
+ return nil
+ }
+
+ f, err := fs.Open(path)
+ if err != nil {
+ return nil
+ }
+
+ defer f.Close()
+
+ b, err := io.ReadAll(f)
+ if err != nil {
+ return nil
+ }
+
+ m, err := metadecoders.Default.UnmarshalToMap(b, metadecoders.YAML)
+ if err != nil {
+ return nil
+ }
+
+ return m
+}
+
+func (c *importCommand) parseJekyllFilename(filename string) (time.Time, string, error) {
+ re := regexp.MustCompile(`(\d+-\d+-\d+)-(.+)\..*`)
+ r := re.FindAllStringSubmatch(filename, -1)
+ if len(r) == 0 {
+ return htime.Now(), "", errors.New("filename did not match the expected Jekyll post format")
+ }
+
+ postDate, err := time.Parse("2006-1-2", r[0][1])
+ if err != nil {
+ return htime.Now(), "", err
+ }
+
+ postName := r[0][2]
+
+ return postDate, postName, nil
+}
+
+func (c *importCommand) replaceHighlightTag(match string) string {
+ r := regexp.MustCompile(`{%\s*highlight\s*(.*?)\s*%}`)
+ parts := r.FindStringSubmatch(match)
+ lastQuote := rune(0)
+ f := func(c rune) bool {
+ switch {
+ case c == lastQuote:
+ lastQuote = rune(0)
+ return false
+ case lastQuote != rune(0):
+ return false
+ case unicode.In(c, unicode.Quotation_Mark):
+ lastQuote = c
+ return false
+ default:
+ return unicode.IsSpace(c)
+ }
+ }
+ // splitting string by space but considering quoted section
+ items := strings.FieldsFunc(parts[1], f)
+
+ result := bytes.NewBufferString("{{< highlight ")
+ result.WriteString(items[0]) // language
+ options := items[1:]
+ for i, opt := range options {
+ opt = strings.Replace(opt, "\"", "", -1)
+ if opt == "linenos" {
+ opt = "linenos=table"
+ }
+ if i == 0 {
+ opt = " \"" + opt
+ }
+ if i < len(options)-1 {
+ opt += ","
+ } else if i == len(options)-1 {
+ opt += "\""
+ }
+ result.WriteString(opt)
+ }
+
+ result.WriteString(" >}}")
+ return result.String()
+}
+
+func (c *importCommand) replaceImageTag(match string) string {
+ r := regexp.MustCompile(`{%\s+img\s*(\p{L}*)\s+([\S]*/[\S]+)\s+(\d*)\s*(\d*)\s*(.*?)\s*%}`)
+ result := bytes.NewBufferString("{{< figure ")
+ parts := r.FindStringSubmatch(match)
+ // Index 0 is the entire string, ignore
+ c.replaceOptionalPart(result, "class", parts[1])
+ c.replaceOptionalPart(result, "src", parts[2])
+ c.replaceOptionalPart(result, "width", parts[3])
+ c.replaceOptionalPart(result, "height", parts[4])
+ // title + alt
+ part := parts[5]
+ if len(part) > 0 {
+ splits := strings.Split(part, "'")
+ lenSplits := len(splits)
+ if lenSplits == 1 {
+ c.replaceOptionalPart(result, "title", splits[0])
+ } else if lenSplits == 3 {
+ c.replaceOptionalPart(result, "title", splits[1])
+ } else if lenSplits == 5 {
+ c.replaceOptionalPart(result, "title", splits[1])
+ c.replaceOptionalPart(result, "alt", splits[3])
+ }
+ }
+ result.WriteString(">}}")
+ return result.String()
+}
+
+func (c *importCommand) replaceOptionalPart(buffer *bytes.Buffer, partName string, part string) {
+ if len(part) > 0 {
+ buffer.WriteString(partName + "=\"" + part + "\" ")
+ }
+}
+
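+// retrieveJekyllPostDir reports whether dir is (or contains) a Jekyll _posts
+// or _drafts directory, and whether any posts were found in it.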
+func (c *importCommand) retrieveJekyllPostDir(fs afero.Fs, dir string) (bool, bool) {
+ if strings.HasSuffix(dir, "_posts") || strings.HasSuffix(dir, "_drafts") {
+ isEmpty, _ := helpers.IsEmpty(dir, fs)
+ return true, !isEmpty
+ }
+
+ if entries, err := os.ReadDir(dir); err == nil {
+ for _, entry := range entries {
+ if entry.IsDir() {
+ subDir := filepath.Join(dir, entry.Name())
+ if isPostDir, hasAnyPost := c.retrieveJekyllPostDir(fs, subDir); isPostDir {
+ return isPostDir, hasAnyPost
+ }
+ }
+ }
+ }
+
+ return false, true
+}
diff --git a/commands/import_jekyll.go b/commands/import_jekyll.go
deleted file mode 100644
index e5c39dc34..000000000
--- a/commands/import_jekyll.go
+++ /dev/null
@@ -1,596 +0,0 @@
-// Copyright 2019 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package commands
-
-import (
- "bytes"
- "errors"
- "fmt"
- "io/ioutil"
- "os"
- "path/filepath"
- "regexp"
- "strconv"
- "strings"
- "time"
- "unicode"
-
- "github.com/gohugoio/hugo/common/hugio"
-
- "github.com/gohugoio/hugo/parser/metadecoders"
-
- "github.com/gohugoio/hugo/helpers"
- "github.com/gohugoio/hugo/hugofs"
- "github.com/gohugoio/hugo/hugolib"
- "github.com/gohugoio/hugo/parser"
- "github.com/spf13/afero"
- "github.com/spf13/cast"
- "github.com/spf13/cobra"
- jww "github.com/spf13/jwalterweatherman"
-)
-
-var _ cmder = (*importCmd)(nil)
-
-type importCmd struct {
- *baseCmd
-}
-
-func newImportCmd() *importCmd {
- cc := &importCmd{}
-
- cc.baseCmd = newBaseCmd(&cobra.Command{
- Use: "import",
- Short: "Import your site from others.",
- Long: `Import your site from other web site generators like Jekyll.
-
-Import requires a subcommand, e.g. ` + "`hugo import jekyll jekyll_root_path target_path`.",
- RunE: nil,
- })
-
- importJekyllCmd := &cobra.Command{
- Use: "jekyll",
- Short: "hugo import from Jekyll",
- Long: `hugo import from Jekyll.
-
-Import from Jekyll requires two paths, e.g. ` + "`hugo import jekyll jekyll_root_path target_path`.",
- RunE: cc.importFromJekyll,
- }
-
- importJekyllCmd.Flags().Bool("force", false, "allow import into non-empty target directory")
-
- cc.cmd.AddCommand(importJekyllCmd)
-
- return cc
-
-}
-
-func (i *importCmd) importFromJekyll(cmd *cobra.Command, args []string) error {
-
- if len(args) < 2 {
- return newUserError(`import from jekyll requires two paths, e.g. ` + "`hugo import jekyll jekyll_root_path target_path`.")
- }
-
- jekyllRoot, err := filepath.Abs(filepath.Clean(args[0]))
- if err != nil {
- return newUserError("path error:", args[0])
- }
-
- targetDir, err := filepath.Abs(filepath.Clean(args[1]))
- if err != nil {
- return newUserError("path error:", args[1])
- }
-
- jww.INFO.Println("Import Jekyll from:", jekyllRoot, "to:", targetDir)
-
- if strings.HasPrefix(filepath.Dir(targetDir), jekyllRoot) {
- return newUserError("abort: target path should not be inside the Jekyll root")
- }
-
- forceImport, _ := cmd.Flags().GetBool("force")
-
- fs := afero.NewOsFs()
- jekyllPostDirs, hasAnyPost := i.getJekyllDirInfo(fs, jekyllRoot)
- if !hasAnyPost {
- return errors.New("abort: jekyll root contains neither posts nor drafts")
- }
-
- site, err := i.createSiteFromJekyll(jekyllRoot, targetDir, jekyllPostDirs, forceImport)
-
- if err != nil {
- return newUserError(err)
- }
-
- jww.FEEDBACK.Println("Importing...")
-
- fileCount := 0
- callback := func(path string, fi hugofs.FileMetaInfo, err error) error {
- if err != nil {
- return err
- }
-
- if fi.IsDir() {
- return nil
- }
-
- relPath, err := filepath.Rel(jekyllRoot, path)
- if err != nil {
- return newUserError("get rel path error:", path)
- }
-
- relPath = filepath.ToSlash(relPath)
- draft := false
-
- switch {
- case strings.Contains(relPath, "_posts/"):
- relPath = filepath.Join("content/post", strings.Replace(relPath, "_posts/", "", -1))
- case strings.Contains(relPath, "_drafts/"):
- relPath = filepath.Join("content/draft", strings.Replace(relPath, "_drafts/", "", -1))
- draft = true
- default:
- return nil
- }
-
- fileCount++
- return convertJekyllPost(site, path, relPath, targetDir, draft)
- }
-
- for jekyllPostDir, hasAnyPostInDir := range jekyllPostDirs {
- if hasAnyPostInDir {
- if err = helpers.SymbolicWalk(hugofs.Os, filepath.Join(jekyllRoot, jekyllPostDir), callback); err != nil {
- return err
- }
- }
- }
-
- jww.FEEDBACK.Println("Congratulations!", fileCount, "post(s) imported!")
- jww.FEEDBACK.Println("Now, start Hugo by yourself:\n" +
- "$ git clone https://github.com/spf13/herring-cove.git " + args[1] + "/themes/herring-cove")
- jww.FEEDBACK.Println("$ cd " + args[1] + "\n$ hugo server --theme=herring-cove")
-
- return nil
-}
-
-func (i *importCmd) getJekyllDirInfo(fs afero.Fs, jekyllRoot string) (map[string]bool, bool) {
- postDirs := make(map[string]bool)
- hasAnyPost := false
- if entries, err := ioutil.ReadDir(jekyllRoot); err == nil {
- for _, entry := range entries {
- if entry.IsDir() {
- subDir := filepath.Join(jekyllRoot, entry.Name())
- if isPostDir, hasAnyPostInDir := i.retrieveJekyllPostDir(fs, subDir); isPostDir {
- postDirs[entry.Name()] = hasAnyPostInDir
- if hasAnyPostInDir {
- hasAnyPost = true
- }
- }
- }
- }
- }
- return postDirs, hasAnyPost
-}
-
-func (i *importCmd) retrieveJekyllPostDir(fs afero.Fs, dir string) (bool, bool) {
- if strings.HasSuffix(dir, "_posts") || strings.HasSuffix(dir, "_drafts") {
- isEmpty, _ := helpers.IsEmpty(dir, fs)
- return true, !isEmpty
- }
-
- if entries, err := ioutil.ReadDir(dir); err == nil {
- for _, entry := range entries {
- if entry.IsDir() {
- subDir := filepath.Join(dir, entry.Name())
- if isPostDir, hasAnyPost := i.retrieveJekyllPostDir(fs, subDir); isPostDir {
- return isPostDir, hasAnyPost
- }
- }
- }
- }
-
- return false, true
-}
-
-func (i *importCmd) createSiteFromJekyll(jekyllRoot, targetDir string, jekyllPostDirs map[string]bool, force bool) (*hugolib.Site, error) {
- s, err := hugolib.NewSiteDefaultLang()
- if err != nil {
- return nil, err
- }
-
- fs := s.Fs.Source
- if exists, _ := helpers.Exists(targetDir, fs); exists {
- if isDir, _ := helpers.IsDir(targetDir, fs); !isDir {
- return nil, errors.New("target path \"" + targetDir + "\" exists but is not a directory")
- }
-
- isEmpty, _ := helpers.IsEmpty(targetDir, fs)
-
- if !isEmpty && !force {
- return nil, errors.New("target path \"" + targetDir + "\" exists and is not empty")
- }
- }
-
- jekyllConfig := i.loadJekyllConfig(fs, jekyllRoot)
-
- mkdir(targetDir, "layouts")
- mkdir(targetDir, "content")
- mkdir(targetDir, "archetypes")
- mkdir(targetDir, "static")
- mkdir(targetDir, "data")
- mkdir(targetDir, "themes")
-
- i.createConfigFromJekyll(fs, targetDir, "yaml", jekyllConfig)
-
- i.copyJekyllFilesAndFolders(jekyllRoot, filepath.Join(targetDir, "static"), jekyllPostDirs)
-
- return s, nil
-}
-
-func (i *importCmd) loadJekyllConfig(fs afero.Fs, jekyllRoot string) map[string]interface{} {
- path := filepath.Join(jekyllRoot, "_config.yml")
-
- exists, err := helpers.Exists(path, fs)
-
- if err != nil || !exists {
- jww.WARN.Println("_config.yaml not found: Is the specified Jekyll root correct?")
- return nil
- }
-
- f, err := fs.Open(path)
- if err != nil {
- return nil
- }
-
- defer f.Close()
-
- b, err := ioutil.ReadAll(f)
-
- if err != nil {
- return nil
- }
-
- c, err := metadecoders.Default.UnmarshalToMap(b, metadecoders.YAML)
-
- if err != nil {
- return nil
- }
-
- return c
-}
-
-func (i *importCmd) createConfigFromJekyll(fs afero.Fs, inpath string, kind metadecoders.Format, jekyllConfig map[string]interface{}) (err error) {
- title := "My New Hugo Site"
- baseURL := "http://example.org/"
-
- for key, value := range jekyllConfig {
- lowerKey := strings.ToLower(key)
-
- switch lowerKey {
- case "title":
- if str, ok := value.(string); ok {
- title = str
- }
-
- case "url":
- if str, ok := value.(string); ok {
- baseURL = str
- }
- }
- }
-
- in := map[string]interface{}{
- "baseURL": baseURL,
- "title": title,
- "languageCode": "en-us",
- "disablePathToLower": true,
- }
-
- var buf bytes.Buffer
- err = parser.InterfaceToConfig(in, kind, &buf)
- if err != nil {
- return err
- }
-
- return helpers.WriteToDisk(filepath.Join(inpath, "config."+string(kind)), &buf, fs)
-}
-
-func (i *importCmd) copyJekyllFilesAndFolders(jekyllRoot, dest string, jekyllPostDirs map[string]bool) (err error) {
- fs := hugofs.Os
-
- fi, err := fs.Stat(jekyllRoot)
- if err != nil {
- return err
- }
- if !fi.IsDir() {
- return errors.New(jekyllRoot + " is not a directory")
- }
- err = os.MkdirAll(dest, fi.Mode())
- if err != nil {
- return err
- }
- entries, err := ioutil.ReadDir(jekyllRoot)
- if err != nil {
- return err
- }
-
- for _, entry := range entries {
- sfp := filepath.Join(jekyllRoot, entry.Name())
- dfp := filepath.Join(dest, entry.Name())
- if entry.IsDir() {
- if entry.Name()[0] != '_' && entry.Name()[0] != '.' {
- if _, ok := jekyllPostDirs[entry.Name()]; !ok {
- err = hugio.CopyDir(fs, sfp, dfp, nil)
- if err != nil {
- jww.ERROR.Println(err)
- }
- }
- }
- } else {
- lowerEntryName := strings.ToLower(entry.Name())
- exceptSuffix := []string{".md", ".markdown", ".html", ".htm",
- ".xml", ".textile", "rakefile", "gemfile", ".lock"}
- isExcept := false
- for _, suffix := range exceptSuffix {
- if strings.HasSuffix(lowerEntryName, suffix) {
- isExcept = true
- break
- }
- }
-
- if !isExcept && entry.Name()[0] != '.' && entry.Name()[0] != '_' {
- err = hugio.CopyFile(fs, sfp, dfp)
- if err != nil {
- jww.ERROR.Println(err)
- }
- }
- }
-
- }
- return nil
-}
-
-func parseJekyllFilename(filename string) (time.Time, string, error) {
- re := regexp.MustCompile(`(\d+-\d+-\d+)-(.+)\..*`)
- r := re.FindAllStringSubmatch(filename, -1)
- if len(r) == 0 {
- return time.Now(), "", errors.New("filename not match")
- }
-
- postDate, err := time.Parse("2006-1-2", r[0][1])
- if err != nil {
- return time.Now(), "", err
- }
-
- postName := r[0][2]
-
- return postDate, postName, nil
-}
-
-func convertJekyllPost(s *hugolib.Site, path, relPath, targetDir string, draft bool) error {
- jww.TRACE.Println("Converting", path)
-
- filename := filepath.Base(path)
- postDate, postName, err := parseJekyllFilename(filename)
- if err != nil {
- jww.WARN.Printf("Failed to parse filename '%s': %s. Skipping.", filename, err)
- return nil
- }
-
- jww.TRACE.Println(filename, postDate, postName)
-
- targetFile := filepath.Join(targetDir, relPath)
- targetParentDir := filepath.Dir(targetFile)
- os.MkdirAll(targetParentDir, 0777)
-
- contentBytes, err := ioutil.ReadFile(path)
- if err != nil {
- jww.ERROR.Println("Read file error:", path)
- return err
- }
-
- pf, err := parseContentFile(bytes.NewReader(contentBytes))
- if err != nil {
- jww.ERROR.Println("Parse file error:", path)
- return err
- }
-
- newmetadata, err := convertJekyllMetaData(pf.frontMatter, postName, postDate, draft)
- if err != nil {
- jww.ERROR.Println("Convert metadata error:", path)
- return err
- }
-
- content := convertJekyllContent(newmetadata, string(pf.content))
-
- fs := hugofs.Os
- if err := helpers.WriteToDisk(targetFile, strings.NewReader(content), fs); err != nil {
- return fmt.Errorf("failed to save file %q: %s", filename, err)
- }
-
- return nil
-}
-
-func convertJekyllMetaData(m interface{}, postName string, postDate time.Time, draft bool) (interface{}, error) {
- metadata, err := cast.ToStringMapE(m)
- if err != nil {
- return nil, err
- }
-
- if draft {
- metadata["draft"] = true
- }
-
- for key, value := range metadata {
- lowerKey := strings.ToLower(key)
-
- switch lowerKey {
- case "layout":
- delete(metadata, key)
- case "permalink":
- if str, ok := value.(string); ok {
- metadata["url"] = str
- }
- delete(metadata, key)
- case "category":
- if str, ok := value.(string); ok {
- metadata["categories"] = []string{str}
- }
- delete(metadata, key)
- case "excerpt_separator":
- if key != lowerKey {
- delete(metadata, key)
- metadata[lowerKey] = value
- }
- case "date":
- if str, ok := value.(string); ok {
- re := regexp.MustCompile(`(\d+):(\d+):(\d+)`)
- r := re.FindAllStringSubmatch(str, -1)
- if len(r) > 0 {
- hour, _ := strconv.Atoi(r[0][1])
- minute, _ := strconv.Atoi(r[0][2])
- second, _ := strconv.Atoi(r[0][3])
- postDate = time.Date(postDate.Year(), postDate.Month(), postDate.Day(), hour, minute, second, 0, time.UTC)
- }
- }
- delete(metadata, key)
- }
-
- }
-
- metadata["date"] = postDate.Format(time.RFC3339)
-
- return metadata, nil
-}
-
-func convertJekyllContent(m interface{}, content string) string {
- metadata, _ := cast.ToStringMapE(m)
-
- lines := strings.Split(content, "\n")
- var resultLines []string
- for _, line := range lines {
- resultLines = append(resultLines, strings.Trim(line, "\r\n"))
- }
-
- content = strings.Join(resultLines, "\n")
-
- excerptSep := "<!--more-->"
- if value, ok := metadata["excerpt_separator"]; ok {
- if str, strOk := value.(string); strOk {
- content = strings.Replace(content, strings.TrimSpace(str), excerptSep, -1)
- }
- }
-
- replaceList := []struct {
- re *regexp.Regexp
- replace string
- }{
- {regexp.MustCompile("(?i)<!-- more -->"), "<!--more-->"},
- {regexp.MustCompile(`\{%\s*raw\s*%\}\s*(.*?)\s*\{%\s*endraw\s*%\}`), "$1"},
- {regexp.MustCompile(`{%\s*endhighlight\s*%}`), "{{< / highlight >}}"},
- }
-
- for _, replace := range replaceList {
- content = replace.re.ReplaceAllString(content, replace.replace)
- }
-
- replaceListFunc := []struct {
- re *regexp.Regexp
- replace func(string) string
- }{
- // Octopress image tag: http://octopress.org/docs/plugins/image-tag/
- {regexp.MustCompile(`{%\s+img\s*(.*?)\s*%}`), replaceImageTag},
- {regexp.MustCompile(`{%\s*highlight\s*(.*?)\s*%}`), replaceHighlightTag},
- }
-
- for _, replace := range replaceListFunc {
- content = replace.re.ReplaceAllStringFunc(content, replace.replace)
- }
-
- return content
-}
-
-func replaceHighlightTag(match string) string {
- r := regexp.MustCompile(`{%\s*highlight\s*(.*?)\s*%}`)
- parts := r.FindStringSubmatch(match)
- lastQuote := rune(0)
- f := func(c rune) bool {
- switch {
- case c == lastQuote:
- lastQuote = rune(0)
- return false
- case lastQuote != rune(0):
- return false
- case unicode.In(c, unicode.Quotation_Mark):
- lastQuote = c
- return false
- default:
- return unicode.IsSpace(c)
- }
- }
- // splitting string by space but considering quoted section
- items := strings.FieldsFunc(parts[1], f)
-
- result := bytes.NewBufferString("{{< highlight ")
- result.WriteString(items[0]) // language
- options := items[1:]
- for i, opt := range options {
- opt = strings.Replace(opt, "\"", "", -1)
- if opt == "linenos" {
- opt = "linenos=table"
- }
- if i == 0 {
- opt = " \"" + opt
- }
- if i < len(options)-1 {
- opt += ","
- } else if i == len(options)-1 {
- opt += "\""
- }
- result.WriteString(opt)
- }
-
- result.WriteString(" >}}")
- return result.String()
-}
-
-func replaceImageTag(match string) string {
- r := regexp.MustCompile(`{%\s+img\s*(\p{L}*)\s+([\S]*/[\S]+)\s+(\d*)\s*(\d*)\s*(.*?)\s*%}`)
- result := bytes.NewBufferString("{{< figure ")
- parts := r.FindStringSubmatch(match)
- // Index 0 is the entire string, ignore
- replaceOptionalPart(result, "class", parts[1])
- replaceOptionalPart(result, "src", parts[2])
- replaceOptionalPart(result, "width", parts[3])
- replaceOptionalPart(result, "height", parts[4])
- // title + alt
- part := parts[5]
- if len(part) > 0 {
- splits := strings.Split(part, "'")
- lenSplits := len(splits)
- if lenSplits == 1 {
- replaceOptionalPart(result, "title", splits[0])
- } else if lenSplits == 3 {
- replaceOptionalPart(result, "title", splits[1])
- } else if lenSplits == 5 {
- replaceOptionalPart(result, "title", splits[1])
- replaceOptionalPart(result, "alt", splits[3])
- }
- }
- result.WriteString(">}}")
- return result.String()
-
-}
-func replaceOptionalPart(buffer *bytes.Buffer, partName string, part string) {
- if len(part) > 0 {
- buffer.WriteString(partName + "=\"" + part + "\" ")
- }
-}
diff --git a/commands/import_jekyll_test.go b/commands/import_jekyll_test.go
deleted file mode 100644
index 4ae26b95c..000000000
--- a/commands/import_jekyll_test.go
+++ /dev/null
@@ -1,133 +0,0 @@
-// Copyright 2015 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package commands
-
-import (
- "encoding/json"
- "testing"
- "time"
-
- qt "github.com/frankban/quicktest"
-)
-
-func TestParseJekyllFilename(t *testing.T) {
- c := qt.New(t)
- filenameArray := []string{
- "2015-01-02-test.md",
- "2012-03-15-中文.markup",
- }
-
- expectResult := []struct {
- postDate time.Time
- postName string
- }{
- {time.Date(2015, time.January, 2, 0, 0, 0, 0, time.UTC), "test"},
- {time.Date(2012, time.March, 15, 0, 0, 0, 0, time.UTC), "中文"},
- }
-
- for i, filename := range filenameArray {
- postDate, postName, err := parseJekyllFilename(filename)
- c.Assert(err, qt.IsNil)
- c.Assert(expectResult[i].postDate.Format("2006-01-02"), qt.Equals, postDate.Format("2006-01-02"))
- c.Assert(expectResult[i].postName, qt.Equals, postName)
- }
-}
-
-func TestConvertJekyllMetadata(t *testing.T) {
- c := qt.New(t)
- testDataList := []struct {
- metadata interface{}
- postName string
- postDate time.Time
- draft bool
- expect string
- }{
- {map[interface{}]interface{}{}, "testPost", time.Date(2015, 10, 1, 0, 0, 0, 0, time.UTC), false,
- `{"date":"2015-10-01T00:00:00Z"}`},
- {map[interface{}]interface{}{}, "testPost", time.Date(2015, 10, 1, 0, 0, 0, 0, time.UTC), true,
- `{"date":"2015-10-01T00:00:00Z","draft":true}`},
- {map[interface{}]interface{}{"Permalink": "/permalink.html", "layout": "post"},
- "testPost", time.Date(2015, 10, 1, 0, 0, 0, 0, time.UTC), false,
- `{"date":"2015-10-01T00:00:00Z","url":"/permalink.html"}`},
- {map[interface{}]interface{}{"permalink": "/permalink.html"},
- "testPost", time.Date(2015, 10, 1, 0, 0, 0, 0, time.UTC), false,
- `{"date":"2015-10-01T00:00:00Z","url":"/permalink.html"}`},
- {map[interface{}]interface{}{"category": nil, "permalink": 123},
- "testPost", time.Date(2015, 10, 1, 0, 0, 0, 0, time.UTC), false,
- `{"date":"2015-10-01T00:00:00Z"}`},
- {map[interface{}]interface{}{"Excerpt_Separator": "sep"},
- "testPost", time.Date(2015, 10, 1, 0, 0, 0, 0, time.UTC), false,
- `{"date":"2015-10-01T00:00:00Z","excerpt_separator":"sep"}`},
- {map[interface{}]interface{}{"category": "book", "layout": "post", "Others": "Goods", "Date": "2015-10-01 12:13:11"},
- "testPost", time.Date(2015, 10, 1, 0, 0, 0, 0, time.UTC), false,
- `{"Others":"Goods","categories":["book"],"date":"2015-10-01T12:13:11Z"}`},
- }
-
- for _, data := range testDataList {
- result, err := convertJekyllMetaData(data.metadata, data.postName, data.postDate, data.draft)
- c.Assert(err, qt.IsNil)
- jsonResult, err := json.Marshal(result)
- c.Assert(err, qt.IsNil)
- c.Assert(string(jsonResult), qt.Equals, data.expect)
- }
-}
-
-func TestConvertJekyllContent(t *testing.T) {
- c := qt.New(t)
- testDataList := []struct {
- metadata interface{}
- content string
- expect string
- }{
- {map[interface{}]interface{}{},
- `Test content\n\npart2 content`, `Test content\n\npart2 content`},
- {map[interface{}]interface{}{},
- `Test content\n\npart2 content`, `Test content\n\npart2 content`},
- {map[interface{}]interface{}{"excerpt_separator": ""},
- `Test content\n\npart2 content`, `Test content\n\npart2 content`},
- {map[interface{}]interface{}{}, "{% raw %}text{% endraw %}", "text"},
- {map[interface{}]interface{}{}, "{%raw%} text2 {%endraw %}", "text2"},
- {map[interface{}]interface{}{},
- "{% highlight go %}\nvar s int\n{% endhighlight %}",
- "{{< highlight go >}}\nvar s int\n{{< / highlight >}}"},
- {map[interface{}]interface{}{},
- "{% highlight go linenos hl_lines=\"1 2\" %}\nvar s string\nvar i int\n{% endhighlight %}",
- "{{< highlight go \"linenos=table,hl_lines=1 2\" >}}\nvar s string\nvar i int\n{{< / highlight >}}"},
-
- // Octopress image tag
- {map[interface{}]interface{}{},
- "{% img http://placekitten.com/890/280 %}",
- "{{< figure src=\"http://placekitten.com/890/280\" >}}"},
- {map[interface{}]interface{}{},
- "{% img left http://placekitten.com/320/250 Place Kitten #2 %}",
- "{{< figure class=\"left\" src=\"http://placekitten.com/320/250\" title=\"Place Kitten #2\" >}}"},
- {map[interface{}]interface{}{},
- "{% img right http://placekitten.com/300/500 150 250 'Place Kitten #3' %}",
- "{{< figure class=\"right\" src=\"http://placekitten.com/300/500\" width=\"150\" height=\"250\" title=\"Place Kitten #3\" >}}"},
- {map[interface{}]interface{}{},
- "{% img right http://placekitten.com/300/500 150 250 'Place Kitten #4' 'An image of a very cute kitten' %}",
- "{{< figure class=\"right\" src=\"http://placekitten.com/300/500\" width=\"150\" height=\"250\" title=\"Place Kitten #4\" alt=\"An image of a very cute kitten\" >}}"},
- {map[interface{}]interface{}{},
- "{% img http://placekitten.com/300/500 150 250 'Place Kitten #4' 'An image of a very cute kitten' %}",
- "{{< figure src=\"http://placekitten.com/300/500\" width=\"150\" height=\"250\" title=\"Place Kitten #4\" alt=\"An image of a very cute kitten\" >}}"},
- {map[interface{}]interface{}{},
- "{% img right /placekitten/300/500 'Place Kitten #4' 'An image of a very cute kitten' %}",
- "{{< figure class=\"right\" src=\"/placekitten/300/500\" title=\"Place Kitten #4\" alt=\"An image of a very cute kitten\" >}}"},
- }
-
- for _, data := range testDataList {
- result := convertJekyllContent(data.metadata, data.content)
- c.Assert(data.expect, qt.Equals, result)
- }
-}
diff --git a/commands/limit_darwin.go b/commands/limit_darwin.go
deleted file mode 100644
index 6799f37b1..000000000
--- a/commands/limit_darwin.go
+++ /dev/null
@@ -1,84 +0,0 @@
-// Copyright 2018 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package commands
-
-import (
- "syscall"
-
- "github.com/spf13/cobra"
- jww "github.com/spf13/jwalterweatherman"
-)
-
-var _ cmder = (*limitCmd)(nil)
-
-type limitCmd struct {
- *baseCmd
-}
-
-func newLimitCmd() *limitCmd {
- ccmd := &cobra.Command{
- Use: "ulimit",
- Short: "Check system ulimit settings",
- Long: `Hugo will inspect the current ulimit settings on the system.
-This is primarily to ensure that Hugo can watch enough files on some OSs`,
- RunE: func(cmd *cobra.Command, args []string) error {
- var rLimit syscall.Rlimit
- err := syscall.Getrlimit(syscall.RLIMIT_NOFILE, &rLimit)
- if err != nil {
- return newSystemError("Error Getting rlimit ", err)
- }
-
- jww.FEEDBACK.Println("Current rLimit:", rLimit)
-
- if rLimit.Cur >= newRlimit {
- return nil
- }
-
- jww.FEEDBACK.Println("Attempting to increase limit")
- rLimit.Cur = newRlimit
- err = syscall.Setrlimit(syscall.RLIMIT_NOFILE, &rLimit)
- if err != nil {
- return newSystemError("Error Setting rLimit ", err)
- }
- err = syscall.Getrlimit(syscall.RLIMIT_NOFILE, &rLimit)
- if err != nil {
- return newSystemError("Error Getting rLimit ", err)
- }
- jww.FEEDBACK.Println("rLimit after change:", rLimit)
-
- return nil
- },
- }
-
- return &limitCmd{baseCmd: newBaseCmd(ccmd)}
-}
-
-const newRlimit = 10240
-
-func tweakLimit() {
- var rLimit syscall.Rlimit
- err := syscall.Getrlimit(syscall.RLIMIT_NOFILE, &rLimit)
- if err != nil {
- jww.WARN.Println("Unable to get rlimit:", err)
- return
- }
- if rLimit.Cur < newRlimit {
- rLimit.Cur = newRlimit
- err = syscall.Setrlimit(syscall.RLIMIT_NOFILE, &rLimit)
- if err != nil {
- // This may not succeed, see https://github.com/golang/go/issues/30401
- jww.INFO.Println("Unable to increase number of open files limit:", err)
- }
- }
-}
diff --git a/commands/limit_others.go b/commands/limit_others.go
deleted file mode 100644
index 8d3e6ad70..000000000
--- a/commands/limit_others.go
+++ /dev/null
@@ -1,20 +0,0 @@
-// Copyright 2018 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-// +build !darwin
-
-package commands
-
-func tweakLimit() {
- // nothing to do
-}
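
The two deleted files above paired a darwin-specific rlimit tweak with a no-op stub selected by build constraints. For reference, the same per-OS stub pattern written with the current `//go:build` syntax looks roughly like this (an illustrative sketch, not part of this change):

```go
//go:build !darwin

// Illustrative stub showing the per-OS build-constraint pattern the deleted
// limit_darwin.go / limit_others.go pair relied on; the file layout and the
// no-op body are placeholders, not code from this commit.
package commands

// tweakLimit is a no-op on platforms that do not need the darwin
// open-files (RLIMIT_NOFILE) workaround.
func tweakLimit() {
	// nothing to do
}
```
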
diff --git a/commands/list.go b/commands/list.go
index f233ce62c..42f3408ba 100644
--- a/commands/list.go
+++ b/commands/list.go
@@ -1,4 +1,4 @@
-// Copyright 2019 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -14,196 +14,200 @@
package commands
import (
+ "context"
"encoding/csv"
"os"
+ "path/filepath"
"strconv"
"strings"
"time"
+ "github.com/bep/simplecobra"
"github.com/gohugoio/hugo/hugolib"
+ "github.com/gohugoio/hugo/resources/page"
"github.com/gohugoio/hugo/resources/resource"
"github.com/spf13/cobra"
- jww "github.com/spf13/jwalterweatherman"
)
-var _ cmder = (*listCmd)(nil)
-
-type listCmd struct {
- hugoBuilderCommon
- *baseCmd
-}
-
-func (lc *listCmd) buildSites(config map[string]interface{}) (*hugolib.HugoSites, error) {
- cfgInit := func(c *commandeer) error {
- for key, value := range config {
- c.Set(key, value)
+// newListCommand creates a new list command and its subcommands.
+func newListCommand() *listCommand {
+ createRecord := func(workingDir string, p page.Page) []string {
+ return []string{
+ filepath.ToSlash(strings.TrimPrefix(p.File().Filename(), workingDir+string(os.PathSeparator))),
+ p.Slug(),
+ p.Title(),
+ p.Date().Format(time.RFC3339),
+ p.ExpiryDate().Format(time.RFC3339),
+ p.PublishDate().Format(time.RFC3339),
+ strconv.FormatBool(p.Draft()),
+ p.Permalink(),
+ p.Kind(),
+ p.Section(),
}
+ }
+
+ list := func(cd *simplecobra.Commandeer, r *rootCommand, shouldInclude func(page.Page) bool, opts ...any) error {
+ bcfg := hugolib.BuildCfg{SkipRender: true}
+ cfg := flagsToCfg(cd, nil)
+ for i := 0; i < len(opts); i += 2 {
+ cfg.Set(opts[i].(string), opts[i+1])
+ }
+ h, err := r.Build(cd, bcfg, cfg)
+ if err != nil {
+ return err
+ }
+
+ writer := csv.NewWriter(r.StdOut)
+ defer writer.Flush()
+
+ writer.Write([]string{
+ "path",
+ "slug",
+ "title",
+ "date",
+ "expiryDate",
+ "publishDate",
+ "draft",
+ "permalink",
+ "kind",
+ "section",
+ })
+
+ for _, p := range h.Pages() {
+ if shouldInclude(p) {
+ record := createRecord(h.Conf.BaseConfig().WorkingDir, p)
+ if err := writer.Write(record); err != nil {
+ return err
+ }
+ }
+ }
+
return nil
}
- c, err := initializeConfig(true, false, &lc.hugoBuilderCommon, lc, cfgInit)
- if err != nil {
- return nil, err
+ return &listCommand{
+ commands: []simplecobra.Commander{
+ &simpleCommand{
+ name: "drafts",
+ short: "List draft content",
+ long: `List draft content.`,
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ shouldInclude := func(p page.Page) bool {
+ if !p.Draft() || p.File() == nil {
+ return false
+ }
+ return true
+ }
+ return list(cd, r, shouldInclude,
+ "buildDrafts", true,
+ "buildFuture", true,
+ "buildExpired", true,
+ )
+ },
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ },
+ },
+ &simpleCommand{
+ name: "future",
+ short: "List future content",
+ long: `List content with a future publication date.`,
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ shouldInclude := func(p page.Page) bool {
+ if !resource.IsFuture(p) || p.File() == nil {
+ return false
+ }
+ return true
+ }
+ return list(cd, r, shouldInclude,
+ "buildFuture", true,
+ "buildDrafts", true,
+ )
+ },
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ },
+ },
+ &simpleCommand{
+ name: "expired",
+ short: "List expired content",
+ long: `List content with a past expiration date.`,
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ shouldInclude := func(p page.Page) bool {
+ if !resource.IsExpired(p) || p.File() == nil {
+ return false
+ }
+ return true
+ }
+ return list(cd, r, shouldInclude,
+ "buildExpired", true,
+ "buildDrafts", true,
+ )
+ },
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ },
+ },
+ &simpleCommand{
+ name: "all",
+ short: "List all content",
+ long: `List all content including draft, future, and expired.`,
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ shouldInclude := func(p page.Page) bool {
+ return p.File() != nil
+ }
+ return list(cd, r, shouldInclude, "buildDrafts", true, "buildFuture", true, "buildExpired", true)
+ },
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ },
+ },
+ &simpleCommand{
+ name: "published",
+ short: "List published content",
+ long: `List content that is not draft, future, or expired.`,
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ shouldInclude := func(p page.Page) bool {
+ return !p.Draft() && !resource.IsFuture(p) && !resource.IsExpired(p) && p.File() != nil
+ }
+ return list(cd, r, shouldInclude)
+ },
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ },
+ },
+ },
}
-
- sites, err := hugolib.NewHugoSites(*c.DepsCfg)
-
- if err != nil {
- return nil, newSystemError("Error creating sites", err)
- }
-
- if err := sites.Build(hugolib.BuildCfg{SkipRender: true}); err != nil {
- return nil, newSystemError("Error Processing Source Content", err)
- }
-
- return sites, nil
}
-func newListCmd() *listCmd {
- cc := &listCmd{}
-
- cc.baseCmd = newBaseCmd(&cobra.Command{
- Use: "list",
- Short: "Listing out various types of content",
- Long: `Listing out various types of content.
-
-List requires a subcommand, e.g. ` + "`hugo list drafts`.",
- RunE: nil,
- })
-
- cc.cmd.AddCommand(
- &cobra.Command{
- Use: "drafts",
- Short: "List all drafts",
- Long: `List all of the drafts in your content directory.`,
- RunE: func(cmd *cobra.Command, args []string) error {
- sites, err := cc.buildSites(map[string]interface{}{"buildDrafts": true})
-
- if err != nil {
- return newSystemError("Error building sites", err)
- }
-
- for _, p := range sites.Pages() {
- if p.Draft() {
- jww.FEEDBACK.Println(strings.TrimPrefix(p.File().Filename(), sites.WorkingDir+string(os.PathSeparator)))
- }
- }
-
- return nil
- },
- },
- &cobra.Command{
- Use: "future",
- Short: "List all posts dated in the future",
- Long: `List all of the posts in your content directory which will be posted in the future.`,
- RunE: func(cmd *cobra.Command, args []string) error {
- sites, err := cc.buildSites(map[string]interface{}{"buildFuture": true})
-
- if err != nil {
- return newSystemError("Error building sites", err)
- }
-
- writer := csv.NewWriter(os.Stdout)
- defer writer.Flush()
-
- for _, p := range sites.Pages() {
- if resource.IsFuture(p) {
- err := writer.Write([]string{
- strings.TrimPrefix(p.File().Filename(), sites.WorkingDir+string(os.PathSeparator)),
- p.PublishDate().Format(time.RFC3339),
- })
- if err != nil {
- return newSystemError("Error writing future posts to stdout", err)
- }
- }
- }
-
- return nil
- },
- },
- &cobra.Command{
- Use: "expired",
- Short: "List all posts already expired",
- Long: `List all of the posts in your content directory which has already expired.`,
- RunE: func(cmd *cobra.Command, args []string) error {
- sites, err := cc.buildSites(map[string]interface{}{"buildExpired": true})
-
- if err != nil {
- return newSystemError("Error building sites", err)
- }
-
- writer := csv.NewWriter(os.Stdout)
- defer writer.Flush()
-
- for _, p := range sites.Pages() {
- if resource.IsExpired(p) {
- err := writer.Write([]string{
- strings.TrimPrefix(p.File().Filename(), sites.WorkingDir+string(os.PathSeparator)),
- p.ExpiryDate().Format(time.RFC3339),
- })
- if err != nil {
- return newSystemError("Error writing expired posts to stdout", err)
- }
- }
- }
-
- return nil
- },
- },
- &cobra.Command{
- Use: "all",
- Short: "List all posts",
- Long: `List all of the posts in your content directory, include drafts, future and expired pages.`,
- RunE: func(cmd *cobra.Command, args []string) error {
- sites, err := cc.buildSites(map[string]interface{}{
- "buildExpired": true,
- "buildDrafts": true,
- "buildFuture": true,
- })
-
- if err != nil {
- return newSystemError("Error building sites", err)
- }
-
- writer := csv.NewWriter(os.Stdout)
- defer writer.Flush()
-
- writer.Write([]string{
- "path",
- "slug",
- "title",
- "date",
- "expiryDate",
- "publishDate",
- "draft",
- "permalink",
- })
- for _, p := range sites.Pages() {
- if !p.IsPage() {
- continue
- }
- err := writer.Write([]string{
- strings.TrimPrefix(p.File().Filename(), sites.WorkingDir+string(os.PathSeparator)),
- p.Slug(),
- p.Title(),
- p.Date().Format(time.RFC3339),
- p.ExpiryDate().Format(time.RFC3339),
- p.PublishDate().Format(time.RFC3339),
- strconv.FormatBool(p.Draft()),
- p.Permalink(),
- })
- if err != nil {
- return newSystemError("Error writing posts to stdout", err)
- }
- }
-
- return nil
- },
- },
- )
-
- cc.cmd.PersistentFlags().StringVarP(&cc.source, "source", "s", "", "filesystem path to read files relative from")
- cc.cmd.PersistentFlags().SetAnnotation("source", cobra.BashCompSubdirsInDir, []string{})
-
- return cc
+type listCommand struct {
+ commands []simplecobra.Commander
+}
+
+func (c *listCommand) Commands() []simplecobra.Commander {
+ return c.commands
+}
+
+func (c *listCommand) Name() string {
+ return "list"
+}
+
+func (c *listCommand) Run(ctx context.Context, cd *simplecobra.Commandeer, args []string) error {
+ // Do nothing.
+ return nil
+}
+
+func (c *listCommand) Init(cd *simplecobra.Commandeer) error {
+ cmd := cd.CobraCommand
+ cmd.Short = "List content"
+ cmd.Long = `List content.
+
+List requires a subcommand, e.g. hugo list drafts`
+
+ cmd.RunE = nil
+ return nil
+}
+
+func (c *listCommand) PreRun(cd, runner *simplecobra.Commandeer) error {
+ return nil
}
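
The rewritten `list` subcommands all emit CSV with the header written above (path, slug, title, date, expiryDate, publishDate, draft, permalink, kind, section), so the output is easy to post-process. A minimal standalone sketch that shells out to the CLI and decodes that CSV, assuming `hugo` is on PATH, the working directory is a site root, and nothing else is written to stdout:

```go
// Standalone sketch: run "hugo list drafts" and decode the CSV the new
// list command writes. Assumes hugo is on PATH and the current directory
// is a Hugo site; not part of this commit.
package main

import (
	"encoding/csv"
	"fmt"
	"log"
	"os/exec"
	"strings"
)

func main() {
	out, err := exec.Command("hugo", "list", "drafts").Output()
	if err != nil {
		log.Fatal(err)
	}
	rows, err := csv.NewReader(strings.NewReader(string(out))).ReadAll()
	if err != nil {
		log.Fatal(err)
	}
	for i, row := range rows {
		if i == 0 {
			// Header row: path, slug, title, date, expiryDate,
			// publishDate, draft, permalink, kind, section.
			continue
		}
		fmt.Printf("%s (title %q, publishDate %s)\n", row[0], row[2], row[5])
	}
}
```
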
diff --git a/commands/list_test.go b/commands/list_test.go
deleted file mode 100644
index bfc280679..000000000
--- a/commands/list_test.go
+++ /dev/null
@@ -1,72 +0,0 @@
-package commands
-
-import (
- "bytes"
- "encoding/csv"
- "io"
- "os"
- "path/filepath"
- "strings"
- "testing"
-
- qt "github.com/frankban/quicktest"
- "github.com/spf13/cobra"
-)
-
-func captureStdout(f func() (*cobra.Command, error)) (string, error) {
- old := os.Stdout
- r, w, _ := os.Pipe()
- os.Stdout = w
-
- _, err := f()
-
- if err != nil {
- return "", err
- }
-
- w.Close()
- os.Stdout = old
-
- var buf bytes.Buffer
- io.Copy(&buf, r)
- return buf.String(), nil
-}
-
-func TestListAll(t *testing.T) {
- c := qt.New(t)
- dir, err := createSimpleTestSite(t, testSiteConfig{})
-
- c.Assert(err, qt.IsNil)
-
- hugoCmd := newCommandsBuilder().addAll().build()
- cmd := hugoCmd.getCommand()
-
- defer func() {
- os.RemoveAll(dir)
- }()
-
- cmd.SetArgs([]string{"-s=" + dir, "list", "all"})
-
- out, err := captureStdout(cmd.ExecuteC)
- c.Assert(err, qt.IsNil)
-
- r := csv.NewReader(strings.NewReader(out))
-
- header, err := r.Read()
-
- c.Assert(err, qt.IsNil)
- c.Assert(header, qt.DeepEquals, []string{
- "path", "slug", "title",
- "date", "expiryDate", "publishDate",
- "draft", "permalink",
- })
-
- record, err := r.Read()
-
- c.Assert(err, qt.IsNil)
- c.Assert(record, qt.DeepEquals, []string{
- filepath.Join("content", "p1.md"), "", "P1",
- "0001-01-01T00:00:00Z", "0001-01-01T00:00:00Z", "0001-01-01T00:00:00Z",
- "false", "https://example.org/p1/",
- })
-}
diff --git a/commands/mod.go b/commands/mod.go
index 5fbd93ecb..58155f9be 100644
--- a/commands/mod.go
+++ b/commands/mod.go
@@ -1,4 +1,4 @@
-// Copyright 2019 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -14,175 +14,331 @@
package commands
import (
+ "context"
+ "errors"
"os"
+ "path/filepath"
- "github.com/gohugoio/hugo/modules"
+ "github.com/bep/simplecobra"
+ "github.com/gohugoio/hugo/config"
+ "github.com/gohugoio/hugo/modules/npm"
"github.com/spf13/cobra"
)
-var _ cmder = (*modCmd)(nil)
-
-type modCmd struct {
- *baseBuilderCmd
-}
-
-func (b *commandsBuilder) newModCmd() *modCmd {
- c := &modCmd{}
-
- const commonUsage = `
+const commonUsageMod = `
Note that Hugo will always start out by resolving the components defined in the site
-configuration, provided by a _vendor directory (if no --ignoreVendor flag provided),
+configuration, provided by a _vendor directory (if no --ignoreVendorPaths flag provided),
Go Modules, or a folder inside the themes directory, in that order.
See https://gohugo.io/hugo-modules/ for more information.
`
- cmd := &cobra.Command{
- Use: "mod",
- Short: "Various Hugo Modules helpers.",
- Long: `Various helpers to help manage the modules in your project's dependency graph.
+// buildConfigCommands creates a new config command and its subcommands.
+func newModCommands() *modCommands {
+ var (
+ clean bool
+ pattern string
+ all bool
+ )
-Most operations here requires a Go version installed on your system (>= Go 1.12) and the relevant VCS client (typically Git).
-This is not needed if you only operate on modules inside /themes or if you have vendored them via "hugo mod vendor".
+ npmCommand := &simpleCommand{
+ name: "npm",
+ short: "Various npm helpers",
+ long: `Various npm (Node package manager) helpers.`,
+ commands: []simplecobra.Commander{
+ &simpleCommand{
+ name: "pack",
+ short: "Experimental: Prepares and writes a composite package.json file for your project",
+ long: `Prepares and writes a composite package.json file for your project.
-` + commonUsage,
+On first run it creates a "package.hugo.json" in the project root if not already there. This file will be used as a template file
+with the base dependency set.
- RunE: nil,
+This set will be merged with all "package.hugo.json" files found in the dependency tree, picking the version closest to the project.
+
+This command is marked as 'Experimental'. We think it's a great idea, so it's not likely to be
+removed from Hugo, but we need to test this out in "real life" to get a feel for it,
+so this may/will change in future versions of Hugo.
+`,
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ applyLocalFlagsBuildConfig(cmd, r)
+ },
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ h, err := r.Hugo(flagsToCfg(cd, nil))
+ if err != nil {
+ return err
+ }
+ return npm.Pack(h.BaseFs.ProjectSourceFs, h.BaseFs.AssetsWithDuplicatesPreserved.Fs)
+ },
+ },
+ },
}
- cmd.AddCommand(
- &cobra.Command{
- Use: "get",
- DisableFlagParsing: true,
- Short: "Resolves dependencies in your current Hugo Project.",
- Long: `
-Resolves dependencies in your current Hugo Project.
+ return &modCommands{
+ commands: []simplecobra.Commander{
+ &simpleCommand{
+ name: "init",
+ short: "Initialize this project as a Hugo Module",
+ long: `Initialize this project as a Hugo Module.
+ It will try to guess the module path, but you may help by passing it as an argument, e.g.:
+
+ hugo mod init github.com/gohugoio/testshortcodes
+
+ Note that Hugo Modules supports multi-module projects, so you can initialize a Hugo Module
+ inside a subfolder on GitHub, as one example.
+ `,
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ applyLocalFlagsBuildConfig(cmd, r)
+ },
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ h, err := r.getOrCreateHugo(flagsToCfg(cd, nil), true)
+ if err != nil {
+ return err
+ }
+ var initPath string
+ if len(args) >= 1 {
+ initPath = args[0]
+ }
+ c := h.Configs.ModulesClient
+ if err := c.Init(initPath); err != nil {
+ return err
+ }
+ return nil
+ },
+ },
+ &simpleCommand{
+ name: "verify",
+ short: "Verify dependencies",
+ long: `Verify checks that the dependencies of the current module, which are stored in a local downloaded source cache, have not been modified since being downloaded.`,
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ applyLocalFlagsBuildConfig(cmd, r)
+ cmd.Flags().BoolVarP(&clean, "clean", "", false, "delete module cache for dependencies that fail verification")
+ },
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ conf, err := r.ConfigFromProvider(configKey{counter: r.configVersionID.Load()}, flagsToCfg(cd, nil))
+ if err != nil {
+ return err
+ }
+ client := conf.configs.ModulesClient
+ return client.Verify(clean)
+ },
+ },
+ &simpleCommand{
+ name: "graph",
+ short: "Print a module dependency graph",
+ long: `Print a module dependency graph with information about module status (disabled, vendored).
+Note that for vendored modules, that is the version listed and not the one from go.mod.
+`,
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ applyLocalFlagsBuildConfig(cmd, r)
+ cmd.Flags().BoolVarP(&clean, "clean", "", false, "delete module cache for dependencies that fail verification")
+ },
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ conf, err := r.ConfigFromProvider(configKey{counter: r.configVersionID.Load()}, flagsToCfg(cd, nil))
+ if err != nil {
+ return err
+ }
+ client := conf.configs.ModulesClient
+ return client.Graph(os.Stdout)
+ },
+ },
+ &simpleCommand{
+ name: "clean",
+ short: "Delete the Hugo Module cache for the current project",
+ long: `Delete the Hugo Module cache for the current project.`,
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ applyLocalFlagsBuildConfig(cmd, r)
+ cmd.Flags().StringVarP(&pattern, "pattern", "", "", `pattern matching module paths to clean (all if not set), e.g. "**hugo*"`)
+ _ = cmd.RegisterFlagCompletionFunc("pattern", cobra.NoFileCompletions)
+ cmd.Flags().BoolVarP(&all, "all", "", false, "clean entire module cache")
+ },
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ h, err := r.Hugo(flagsToCfg(cd, nil))
+ if err != nil {
+ return err
+ }
+ if all {
+ modCache := h.ResourceSpec.FileCaches.ModulesCache()
+ count, err := modCache.Prune(true)
+ r.Printf("Deleted %d files from module cache.", count)
+ return err
+ }
+
+ return h.Configs.ModulesClient.Clean(pattern)
+ },
+ },
+ &simpleCommand{
+ name: "tidy",
+ short: "Remove unused entries in go.mod and go.sum",
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ applyLocalFlagsBuildConfig(cmd, r)
+ },
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ h, err := r.Hugo(flagsToCfg(cd, nil))
+ if err != nil {
+ return err
+ }
+ return h.Configs.ModulesClient.Tidy()
+ },
+ },
+ &simpleCommand{
+ name: "vendor",
+ short: "Vendor all module dependencies into the _vendor directory",
+ long: `Vendor all module dependencies into the _vendor directory.
+ If a module is vendored, that is where Hugo will look for its dependencies.
+ `,
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ applyLocalFlagsBuildConfig(cmd, r)
+ },
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ h, err := r.Hugo(flagsToCfg(cd, nil))
+ if err != nil {
+ return err
+ }
+ return h.Configs.ModulesClient.Vendor()
+ },
+ },
+
+ &simpleCommand{
+ name: "get",
+ short: "Resolves dependencies in your current Hugo project",
+ long: `
+Resolves dependencies in your current Hugo project.
Some examples:
Install the latest version possible for a given module:
hugo mod get github.com/gohugoio/testshortcodes
-
+
Install a specific version:
hugo mod get github.com/gohugoio/testshortcodes@v0.3.0
-Install the latest versions of all module dependencies:
+Install the latest versions of all direct module dependencies:
+
+ hugo mod get
+ hugo mod get ./... (recursive)
+
+Install the latest versions of all module dependencies (direct and indirect):
hugo mod get -u
+ hugo mod get -u ./... (recursive)
Run "go help get" for more information. All flags available for "go get" is also relevant here.
-` + commonUsage,
- RunE: func(cmd *cobra.Command, args []string) error {
- return c.withModsClient(false, func(c *modules.Client) error {
+` + commonUsageMod,
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.DisableFlagParsing = true
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ },
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
// We currently just pass on the flags we get to Go and
// need to do the flag handling manually.
- if len(args) == 1 && args[0] == "-h" {
- return cmd.Help()
+ if len(args) == 1 && (args[0] == "-h" || args[0] == "--help") {
+ return errHelp
}
- return c.Get(args...)
- })
+
+ var lastArg string
+ if len(args) != 0 {
+ lastArg = args[len(args)-1]
+ }
+
+ if lastArg == "./..." {
+ args = args[:len(args)-1]
+ // Do a recursive update.
+ dirname, err := os.Getwd()
+ if err != nil {
+ return err
+ }
+
+ // Sanity check. We do recursive walking and want to avoid
+ // accidents.
+ if len(dirname) < 5 {
+ return errors.New("must not be run from the file system root")
+ }
+
+ filepath.Walk(dirname, func(path string, info os.FileInfo, err error) error {
+ if info.IsDir() {
+ return nil
+ }
+ if info.Name() == "go.mod" {
+ // Found a module.
+ dir := filepath.Dir(path)
+
+ cfg := config.New()
+ cfg.Set("workingDir", dir)
+ conf, err := r.ConfigFromProvider(configKey{counter: r.configVersionID.Add(1)}, flagsToCfg(cd, cfg))
+ if err != nil {
+ return err
+ }
+ r.Println("Update module in", conf.configs.Base.WorkingDir)
+ client := conf.configs.ModulesClient
+ return client.Get(args...)
+
+ }
+ return nil
+ })
+ return nil
+ } else {
+ conf, err := r.ConfigFromProvider(configKey{counter: r.configVersionID.Load()}, flagsToCfg(cd, nil))
+ if err != nil {
+ return err
+ }
+ client := conf.configs.ModulesClient
+ return client.Get(args...)
+ }
+ },
},
+ npmCommand,
},
- &cobra.Command{
- Use: "graph",
- Short: "Print a module dependency graph.",
- Long: `Print a module dependency graph with information about module status (disabled, vendored).
-Note that for vendored modules, that is the version listed and not the one from go.mod.
-`,
- RunE: func(cmd *cobra.Command, args []string) error {
- return c.withModsClient(true, func(c *modules.Client) error {
- return c.Graph(os.Stdout)
- })
- },
- },
- &cobra.Command{
- Use: "init",
- Short: "Initialize this project as a Hugo Module.",
- Long: `Initialize this project as a Hugo Module.
-It will try to guess the module path, but you may help by passing it as an argument, e.g:
-
- hugo mod init github.com/gohugoio/testshortcodes
-
-Note that Hugo Modules supports multi-module projects, so you can initialize a Hugo Module
-inside a subfolder on GitHub, as one example.
-`,
- RunE: func(cmd *cobra.Command, args []string) error {
- var path string
- if len(args) >= 1 {
- path = args[0]
- }
- return c.withModsClient(false, func(c *modules.Client) error {
- return c.Init(path)
- })
- },
- },
- &cobra.Command{
- Use: "vendor",
- Short: "Vendor all module dependencies into the _vendor directory.",
- Long: `Vendor all module dependencies into the _vendor directory.
-
-If a module is vendored, that is where Hugo will look for it's dependencies.
-`,
- RunE: func(cmd *cobra.Command, args []string) error {
- return c.withModsClient(true, func(c *modules.Client) error {
- return c.Vendor()
- })
- },
- },
- &cobra.Command{
- Use: "tidy",
- Short: "Remove unused entries in go.mod and go.sum.",
- RunE: func(cmd *cobra.Command, args []string) error {
- return c.withModsClient(true, func(c *modules.Client) error {
- return c.Tidy()
- })
- },
- },
- &cobra.Command{
- Use: "clean",
- Short: "Delete the entire Hugo Module cache.",
- Long: `Delete the entire Hugo Module cache.
-
-Note that after you run this command, all of your dependencies will be re-downloaded next time you run "hugo".
-
-Also note that if you configure a positive maxAge for the "modules" file cache, it will also be cleaned as part of "hugo --gc".
-
-`,
- RunE: func(cmd *cobra.Command, args []string) error {
- com, err := c.initConfig(true)
- if err != nil {
- return err
- }
-
- _, err = com.hugo().FileCaches.ModulesCache().Prune(true)
- return err
-
- },
- },
- )
-
- c.baseBuilderCmd = b.newBuilderCmd(cmd)
-
- return c
-
+ }
}
-func (c *modCmd) withModsClient(failOnMissingConfig bool, f func(*modules.Client) error) error {
- com, err := c.initConfig(failOnMissingConfig)
+type modCommands struct {
+ r *rootCommand
+
+ commands []simplecobra.Commander
+}
+
+func (c *modCommands) Commands() []simplecobra.Commander {
+ return c.commands
+}
+
+func (c *modCommands) Name() string {
+ return "mod"
+}
+
+func (c *modCommands) Run(ctx context.Context, cd *simplecobra.Commandeer, args []string) error {
+ _, err := c.r.ConfigFromProvider(configKey{counter: c.r.configVersionID.Load()}, nil)
if err != nil {
return err
}
+ // config := conf.configs.Base
- return f(com.hugo().ModulesClient)
+ return nil
}
-func (c *modCmd) initConfig(failOnNoConfig bool) (*commandeer, error) {
- com, err := initializeConfig(failOnNoConfig, false, &c.hugoBuilderCommon, c, nil)
- if err != nil {
- return nil, err
- }
- return com, nil
+func (c *modCommands) Init(cd *simplecobra.Commandeer) error {
+ cmd := cd.CobraCommand
+ cmd.Short = "Manage modules"
+ cmd.Long = `Various helpers to help manage the modules in your project's dependency graph.
+Most operations here require a Go version installed on your system (>= Go 1.12) and the relevant VCS client (typically Git).
+This is not needed if you only operate on modules inside /themes or if you have vendored them via "hugo mod vendor".
+
+` + commonUsageMod
+ cmd.RunE = nil
+ return nil
+}
+
+func (c *modCommands) PreRun(cd, runner *simplecobra.Commandeer) error {
+ c.r = cd.Root.Command.(*rootCommand)
+ return nil
}
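
The recursive form of `hugo mod get ./...` above walks the tree for `go.mod` files and updates each module it finds. The discovery step on its own, written against the standard library with `filepath.WalkDir`, looks roughly like this (a sketch for illustration; Hugo's own walk and its working-directory handling are in the hunk above):

```go
// Standalone sketch of the go.mod discovery behind "hugo mod get ./...".
// It only prints the module roots it finds; Hugo itself re-runs the get
// with each module's directory as the working directory.
package main

import (
	"fmt"
	"io/fs"
	"log"
	"os"
	"path/filepath"
)

func main() {
	root, err := os.Getwd()
	if err != nil {
		log.Fatal(err)
	}
	err = filepath.WalkDir(root, func(path string, d fs.DirEntry, err error) error {
		if err != nil {
			return err
		}
		if !d.IsDir() && d.Name() == "go.mod" {
			// Each hit marks a module root.
			fmt.Println("module found in", filepath.Dir(path))
		}
		return nil
	})
	if err != nil {
		log.Fatal(err)
	}
}
```
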
diff --git a/commands/new.go b/commands/new.go
index 4fc0d4ed4..81e1c65a4 100644
--- a/commands/new.go
+++ b/commands/new.go
@@ -1,4 +1,4 @@
-// Copyright 2018 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -15,32 +15,33 @@ package commands
import (
"bytes"
- "os"
+ "context"
"path/filepath"
"strings"
+ "github.com/bep/simplecobra"
+ "github.com/gohugoio/hugo/common/paths"
+ "github.com/gohugoio/hugo/config"
"github.com/gohugoio/hugo/create"
- "github.com/gohugoio/hugo/helpers"
- "github.com/gohugoio/hugo/hugolib"
- "github.com/spf13/afero"
+ "github.com/gohugoio/hugo/create/skeletons"
"github.com/spf13/cobra"
- jww "github.com/spf13/jwalterweatherman"
)
-var _ cmder = (*newCmd)(nil)
+func newNewCommand() *newCommand {
+ var (
+ force bool
+ contentType string
+ format string
+ )
-type newCmd struct {
- contentEditor string
- contentType string
-
- *baseBuilderCmd
-}
-
-func (b *commandsBuilder) newNewCmd() *newCmd {
- cmd := &cobra.Command{
- Use: "new [path]",
- Short: "Create new content for your site",
- Long: `Create a new content file and automatically set the date and title.
+ var c *newCommand
+ c = &newCommand{
+ commands: []simplecobra.Commander{
+ &simpleCommand{
+ name: "content",
+ use: "content [path]",
+ short: "Create new content",
+ long: `Create a new content file and automatically set the date and title.
It will guess which kind of file to create based on the path provided.
You can also specify the kind with ` + "`-k KIND`" + `.
@@ -48,90 +49,179 @@ You can also specify the kind with ` + "`-k KIND`" + `.
If archetypes are provided in your theme or site, they will be used.
Ensure you run this within the root directory of your site.`,
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ if len(args) < 1 {
+ return newUserError("path needs to be provided")
+ }
+ h, err := r.Hugo(flagsToCfg(cd, nil))
+ if err != nil {
+ return err
+ }
+ return create.NewContent(h, contentType, args[0], force)
+ },
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = func(cmd *cobra.Command, args []string, toComplete string) ([]string, cobra.ShellCompDirective) {
+ if len(args) != 0 {
+ return []string{}, cobra.ShellCompDirectiveNoFileComp
+ }
+ return []string{}, cobra.ShellCompDirectiveNoFileComp | cobra.ShellCompDirectiveFilterDirs
+ }
+ cmd.Flags().StringVarP(&contentType, "kind", "k", "", "content type to create")
+ cmd.Flags().String("editor", "", "edit new content with this editor, if provided")
+ _ = cmd.RegisterFlagCompletionFunc("editor", cobra.NoFileCompletions)
+ cmd.Flags().BoolVarP(&force, "force", "f", false, "overwrite file if it already exists")
+ applyLocalFlagsBuildConfig(cmd, r)
+ },
+ },
+ &simpleCommand{
+ name: "site",
+ use: "site [path]",
+ short: "Create a new site",
+ long: `Create a new site at the specified path.`,
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ if len(args) < 1 {
+ return newUserError("path needs to be provided")
+ }
+ createpath, err := filepath.Abs(filepath.Clean(args[0]))
+ if err != nil {
+ return err
+ }
+
+ cfg := config.New()
+ cfg.Set("workingDir", createpath)
+ cfg.Set("publishDir", "public")
+
+ conf, err := r.ConfigFromProvider(configKey{counter: r.configVersionID.Load()}, flagsToCfg(cd, cfg))
+ if err != nil {
+ return err
+ }
+ sourceFs := conf.fs.Source
+
+ err = skeletons.CreateSite(createpath, sourceFs, force, format)
+ if err != nil {
+ return err
+ }
+
+ r.Printf("Congratulations! Your new Hugo site was created in %s.\n\n", createpath)
+ r.Println(c.newSiteNextStepsText(createpath, format))
+
+ return nil
+ },
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = func(cmd *cobra.Command, args []string, toComplete string) ([]string, cobra.ShellCompDirective) {
+ if len(args) != 0 {
+ return []string{}, cobra.ShellCompDirectiveNoFileComp
+ }
+ return []string{}, cobra.ShellCompDirectiveNoFileComp | cobra.ShellCompDirectiveFilterDirs
+ }
+ cmd.Flags().BoolVarP(&force, "force", "f", false, "init inside non-empty directory")
+ cmd.Flags().StringVar(&format, "format", "toml", "preferred file format (toml, yaml or json)")
+ _ = cmd.RegisterFlagCompletionFunc("format", cobra.FixedCompletions([]string{"toml", "yaml", "json"}, cobra.ShellCompDirectiveNoFileComp))
+ },
+ },
+ &simpleCommand{
+ name: "theme",
+ use: "theme [name]",
+ short: "Create a new theme",
+ long: `Create a new theme with the specified name in the ./themes directory.
+This generates a functional theme including template examples and sample content.`,
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ if len(args) < 1 {
+ return newUserError("theme name needs to be provided")
+ }
+ cfg := config.New()
+ cfg.Set("publishDir", "public")
+
+ conf, err := r.ConfigFromProvider(configKey{counter: r.configVersionID.Load()}, flagsToCfg(cd, cfg))
+ if err != nil {
+ return err
+ }
+ sourceFs := conf.fs.Source
+ createpath := paths.AbsPathify(conf.configs.Base.WorkingDir, filepath.Join(conf.configs.Base.ThemesDir, args[0]))
+ r.Println("Creating new theme in", createpath)
+
+ err = skeletons.CreateTheme(createpath, sourceFs, format)
+ if err != nil {
+ return err
+ }
+
+ return nil
+ },
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = func(cmd *cobra.Command, args []string, toComplete string) ([]string, cobra.ShellCompDirective) {
+ if len(args) != 0 {
+ return []string{}, cobra.ShellCompDirectiveNoFileComp
+ }
+ return []string{}, cobra.ShellCompDirectiveNoFileComp | cobra.ShellCompDirectiveFilterDirs
+ }
+ cmd.Flags().StringVar(&format, "format", "toml", "preferred file format (toml, yaml or json)")
+ _ = cmd.RegisterFlagCompletionFunc("format", cobra.FixedCompletions([]string{"toml", "yaml", "json"}, cobra.ShellCompDirectiveNoFileComp))
+ },
+ },
+ },
}
- cc := &newCmd{baseBuilderCmd: b.newBuilderCmd(cmd)}
-
- cmd.Flags().StringVarP(&cc.contentType, "kind", "k", "", "content type to create")
- cmd.Flags().StringVar(&cc.contentEditor, "editor", "", "edit new content with this editor, if provided")
-
- cmd.AddCommand(newNewSiteCmd().getCommand())
- cmd.AddCommand(newNewThemeCmd().getCommand())
-
- cmd.RunE = cc.newContent
-
- return cc
+ return c
}
-func (n *newCmd) newContent(cmd *cobra.Command, args []string) error {
- cfgInit := func(c *commandeer) error {
- if cmd.Flags().Changed("editor") {
- c.Set("newContentEditor", n.contentEditor)
- }
- return nil
- }
+type newCommand struct {
+ rootCmd *rootCommand
- c, err := initializeConfig(true, false, &n.hugoBuilderCommon, n, cfgInit)
-
- if err != nil {
- return err
- }
-
- if len(args) < 1 {
- return newUserError("path needs to be provided")
- }
-
- createPath := args[0]
-
- var kind string
-
- createPath, kind = newContentPathSection(c.hugo(), createPath)
-
- if n.contentType != "" {
- kind = n.contentType
- }
-
- return create.NewContent(c.hugo(), kind, createPath)
+ commands []simplecobra.Commander
}
-func mkdir(x ...string) {
- p := filepath.Join(x...)
-
- err := os.MkdirAll(p, 0777) // before umask
- if err != nil {
- jww.FATAL.Fatalln(err)
- }
+func (c *newCommand) Commands() []simplecobra.Commander {
+ return c.commands
}
-func touchFile(fs afero.Fs, x ...string) {
- inpath := filepath.Join(x...)
- mkdir(filepath.Dir(inpath))
- err := helpers.WriteToDisk(inpath, bytes.NewReader([]byte{}), fs)
- if err != nil {
- jww.FATAL.Fatalln(err)
- }
+func (c *newCommand) Name() string {
+ return "new"
}
-func newContentPathSection(h *hugolib.HugoSites, path string) (string, string) {
- // Forward slashes is used in all examples. Convert if needed.
- // Issue #1133
- createpath := filepath.FromSlash(path)
-
- if h != nil {
- for _, dir := range h.BaseFs.Content.Dirs {
- createpath = strings.TrimPrefix(createpath, dir.Meta().Filename())
- }
- }
-
- var section string
- // assume the first directory is the section (kind)
- if strings.Contains(createpath[1:], helpers.FilePathSeparator) {
- parts := strings.Split(strings.TrimPrefix(createpath, helpers.FilePathSeparator), helpers.FilePathSeparator)
- if len(parts) > 0 {
- section = parts[0]
- }
-
- }
-
- return createpath, section
+func (c *newCommand) Run(ctx context.Context, cd *simplecobra.Commandeer, args []string) error {
+ return nil
+}
+
+func (c *newCommand) Init(cd *simplecobra.Commandeer) error {
+ cmd := cd.CobraCommand
+ cmd.Short = "Create new content"
+ cmd.Long = `Create a new content file and automatically set the date and title.
+It will guess which kind of file to create based on the path provided.
+
+You can also specify the kind with ` + "`-k KIND`" + `.
+
+If archetypes are provided in your theme or site, they will be used.
+
+Ensure you run this within the root directory of your site.`
+
+ cmd.RunE = nil
+ return nil
+}
+
+func (c *newCommand) PreRun(cd, runner *simplecobra.Commandeer) error {
+ c.rootCmd = cd.Root.Command.(*rootCommand)
+ return nil
+}
+
+func (c *newCommand) newSiteNextStepsText(path string, format string) string {
+ format = strings.ToLower(format)
+ var nextStepsText bytes.Buffer
+
+ nextStepsText.WriteString(`Just a few more steps...
+
+1. Change the current directory to ` + path + `.
+2. Create or install a theme:
+ - Create a new theme with the command "hugo new theme <THEMENAME>"
+ - Or, install a theme from https://themes.gohugo.io/
+3. Edit hugo.` + format + `, setting the "theme" property to the theme name.
+4. Create new content with the command "hugo new content `)
+
+ nextStepsText.WriteString(filepath.Join("<SECTIONNAME>", "<FILENAME>.<FORMAT>"))
+
+ nextStepsText.WriteString(`".
+5. Start the embedded web server with the command "hugo server --buildDrafts".
+
+See documentation at https://gohugo.io/.`)
+
+ return nextStepsText.String()
}
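
The shell-completion wiring repeated across these subcommands (directory completion for the first positional argument, nothing afterwards) is plain cobra; a freestanding sketch of the same idea, with an illustrative command name:

```go
// Freestanding sketch of the cobra completion pattern used by the new
// subcommands: complete directories for the first positional argument,
// suggest nothing after that. The command name and body are illustrative.
package main

import (
	"fmt"

	"github.com/spf13/cobra"
)

func main() {
	cmd := &cobra.Command{
		Use:   "site [path]",
		Short: "Create a new site",
		Args:  cobra.ExactArgs(1),
		RunE: func(cmd *cobra.Command, args []string) error {
			fmt.Println("would create a site at", args[0])
			return nil
		},
	}
	cmd.ValidArgsFunction = func(cmd *cobra.Command, args []string, toComplete string) ([]string, cobra.ShellCompDirective) {
		if len(args) != 0 {
			// Only the first argument takes a value.
			return nil, cobra.ShellCompDirectiveNoFileComp
		}
		// Offer directories only, as the site/theme path arguments do.
		return nil, cobra.ShellCompDirectiveNoFileComp | cobra.ShellCompDirectiveFilterDirs
	}
	cobra.CheckErr(cmd.Execute())
}
```
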
diff --git a/commands/new_content_test.go b/commands/new_content_test.go
deleted file mode 100644
index 36726e37a..000000000
--- a/commands/new_content_test.go
+++ /dev/null
@@ -1,134 +0,0 @@
-// Copyright 2019 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package commands
-
-import (
- "path/filepath"
- "testing"
-
- "github.com/gohugoio/hugo/hugofs"
- "github.com/spf13/viper"
-
- qt "github.com/frankban/quicktest"
-)
-
-// Issue #1133
-func TestNewContentPathSectionWithForwardSlashes(t *testing.T) {
- c := qt.New(t)
- p, s := newContentPathSection(nil, "/post/new.md")
- c.Assert(p, qt.Equals, filepath.FromSlash("/post/new.md"))
- c.Assert(s, qt.Equals, "post")
-}
-
-func checkNewSiteInited(fs *hugofs.Fs, basepath string, t *testing.T) {
- c := qt.New(t)
- paths := []string{
- filepath.Join(basepath, "layouts"),
- filepath.Join(basepath, "content"),
- filepath.Join(basepath, "archetypes"),
- filepath.Join(basepath, "static"),
- filepath.Join(basepath, "data"),
- filepath.Join(basepath, "config.toml"),
- }
-
- for _, path := range paths {
- _, err := fs.Source.Stat(path)
- c.Assert(err, qt.IsNil)
- }
-}
-
-func TestDoNewSite(t *testing.T) {
- c := qt.New(t)
- n := newNewSiteCmd()
- basepath := filepath.Join("base", "blog")
- _, fs := newTestCfg()
-
- c.Assert(n.doNewSite(fs, basepath, false), qt.IsNil)
-
- checkNewSiteInited(fs, basepath, t)
-}
-
-func TestDoNewSite_noerror_base_exists_but_empty(t *testing.T) {
- c := qt.New(t)
- basepath := filepath.Join("base", "blog")
- _, fs := newTestCfg()
- n := newNewSiteCmd()
-
- c.Assert(fs.Source.MkdirAll(basepath, 0777), qt.IsNil)
-
- c.Assert(n.doNewSite(fs, basepath, false), qt.IsNil)
-}
-
-func TestDoNewSite_error_base_exists(t *testing.T) {
- c := qt.New(t)
- basepath := filepath.Join("base", "blog")
- _, fs := newTestCfg()
- n := newNewSiteCmd()
-
- c.Assert(fs.Source.MkdirAll(basepath, 0777), qt.IsNil)
- _, err := fs.Source.Create(filepath.Join(basepath, "foo"))
- c.Assert(err, qt.IsNil)
- // Since the directory already exists and isn't empty, expect an error
- c.Assert(n.doNewSite(fs, basepath, false), qt.Not(qt.IsNil))
-
-}
-
-func TestDoNewSite_force_empty_dir(t *testing.T) {
- c := qt.New(t)
- basepath := filepath.Join("base", "blog")
- _, fs := newTestCfg()
- n := newNewSiteCmd()
-
- c.Assert(fs.Source.MkdirAll(basepath, 0777), qt.IsNil)
- c.Assert(n.doNewSite(fs, basepath, true), qt.IsNil)
-
- checkNewSiteInited(fs, basepath, t)
-}
-
-func TestDoNewSite_error_force_dir_inside_exists(t *testing.T) {
- c := qt.New(t)
- basepath := filepath.Join("base", "blog")
- _, fs := newTestCfg()
- n := newNewSiteCmd()
-
- contentPath := filepath.Join(basepath, "content")
-
- c.Assert(fs.Source.MkdirAll(contentPath, 0777), qt.IsNil)
- c.Assert(n.doNewSite(fs, basepath, true), qt.Not(qt.IsNil))
-}
-
-func TestDoNewSite_error_force_config_inside_exists(t *testing.T) {
- c := qt.New(t)
- basepath := filepath.Join("base", "blog")
- _, fs := newTestCfg()
- n := newNewSiteCmd()
-
- configPath := filepath.Join(basepath, "config.toml")
- c.Assert(fs.Source.MkdirAll(basepath, 0777), qt.IsNil)
- _, err := fs.Source.Create(configPath)
- c.Assert(err, qt.IsNil)
-
- c.Assert(n.doNewSite(fs, basepath, true), qt.Not(qt.IsNil))
-}
-
-func newTestCfg() (*viper.Viper, *hugofs.Fs) {
-
- v := viper.New()
- fs := hugofs.NewMem(v)
-
- v.SetFs(fs.Source)
-
- return v, fs
-
-}
diff --git a/commands/new_site.go b/commands/new_site.go
deleted file mode 100644
index 6d1677e22..000000000
--- a/commands/new_site.go
+++ /dev/null
@@ -1,165 +0,0 @@
-// Copyright 2018 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package commands
-
-import (
- "bytes"
- "errors"
- "path/filepath"
- "strings"
-
- "github.com/gohugoio/hugo/parser/metadecoders"
-
- _errors "github.com/pkg/errors"
-
- "github.com/gohugoio/hugo/create"
- "github.com/gohugoio/hugo/helpers"
- "github.com/gohugoio/hugo/hugofs"
- "github.com/gohugoio/hugo/parser"
- "github.com/spf13/cobra"
- jww "github.com/spf13/jwalterweatherman"
- "github.com/spf13/viper"
-)
-
-var _ cmder = (*newSiteCmd)(nil)
-
-type newSiteCmd struct {
- configFormat string
-
- *baseCmd
-}
-
-func newNewSiteCmd() *newSiteCmd {
- ccmd := &newSiteCmd{}
-
- cmd := &cobra.Command{
- Use: "site [path]",
- Short: "Create a new site (skeleton)",
- Long: `Create a new site in the provided directory.
-The new site will have the correct structure, but no content or theme yet.
-Use ` + "`hugo new [contentPath]`" + ` to create new content.`,
- RunE: ccmd.newSite,
- }
-
- cmd.Flags().StringVarP(&ccmd.configFormat, "format", "f", "toml", "config & frontmatter format")
- cmd.Flags().Bool("force", false, "init inside non-empty directory")
-
- ccmd.baseCmd = newBaseCmd(cmd)
-
- return ccmd
-
-}
-
-func (n *newSiteCmd) doNewSite(fs *hugofs.Fs, basepath string, force bool) error {
- archeTypePath := filepath.Join(basepath, "archetypes")
- dirs := []string{
- filepath.Join(basepath, "layouts"),
- filepath.Join(basepath, "content"),
- archeTypePath,
- filepath.Join(basepath, "static"),
- filepath.Join(basepath, "data"),
- filepath.Join(basepath, "themes"),
- }
-
- if exists, _ := helpers.Exists(basepath, fs.Source); exists {
- if isDir, _ := helpers.IsDir(basepath, fs.Source); !isDir {
- return errors.New(basepath + " already exists but not a directory")
- }
-
- isEmpty, _ := helpers.IsEmpty(basepath, fs.Source)
-
- switch {
- case !isEmpty && !force:
- return errors.New(basepath + " already exists and is not empty")
-
- case !isEmpty && force:
- all := append(dirs, filepath.Join(basepath, "config."+n.configFormat))
- for _, path := range all {
- if exists, _ := helpers.Exists(path, fs.Source); exists {
- return errors.New(path + " already exists")
- }
- }
- }
- }
-
- for _, dir := range dirs {
- if err := fs.Source.MkdirAll(dir, 0777); err != nil {
- return _errors.Wrap(err, "Failed to create dir")
- }
- }
-
- createConfig(fs, basepath, n.configFormat)
-
- // Create a default archetype file.
- helpers.SafeWriteToDisk(filepath.Join(archeTypePath, "default.md"),
- strings.NewReader(create.ArchetypeTemplateTemplate), fs.Source)
-
- jww.FEEDBACK.Printf("Congratulations! Your new Hugo site is created in %s.\n\n", basepath)
- jww.FEEDBACK.Println(nextStepsText())
-
- return nil
-}
-
-// newSite creates a new Hugo site and initializes a structured Hugo directory.
-func (n *newSiteCmd) newSite(cmd *cobra.Command, args []string) error {
- if len(args) < 1 {
- return newUserError("path needs to be provided")
- }
-
- createpath, err := filepath.Abs(filepath.Clean(args[0]))
- if err != nil {
- return newUserError(err)
- }
-
- forceNew, _ := cmd.Flags().GetBool("force")
-
- return n.doNewSite(hugofs.NewDefault(viper.New()), createpath, forceNew)
-}
-
-func createConfig(fs *hugofs.Fs, inpath string, kind string) (err error) {
- in := map[string]string{
- "baseURL": "http://example.org/",
- "title": "My New Hugo Site",
- "languageCode": "en-us",
- }
-
- var buf bytes.Buffer
- err = parser.InterfaceToConfig(in, metadecoders.FormatFromString(kind), &buf)
- if err != nil {
- return err
- }
-
- return helpers.WriteToDisk(filepath.Join(inpath, "config."+kind), &buf, fs.Source)
-}
-
-func nextStepsText() string {
- var nextStepsText bytes.Buffer
-
- nextStepsText.WriteString(`Just a few more steps and you're ready to go:
-
-1. Download a theme into the same-named folder.
- Choose a theme from https://themes.gohugo.io/ or
- create your own with the "hugo new theme <THEMENAME>" command.
-2. Perhaps you want to add some content. You can add single files
- with "hugo new `)
-
- nextStepsText.WriteString(filepath.Join("<SECTIONNAME>", "<FILENAME>.<FORMAT>"))
-
- nextStepsText.WriteString(`".
-3. Start the built-in live server via "hugo server".
-
-Visit https://gohugo.io/ for quickstart guide and full documentation.`)
-
- return nextStepsText.String()
-}
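
The deleted createConfig above seeded a new site with three settings (baseURL, title, languageCode) serialized through parser.InterfaceToConfig. A stdlib-only sketch that writes the same map in the JSON format, one of the formats the new `--format` flag accepts, might look like this; the use of encoding/json and the output file name are choices made here for illustration, not Hugo's implementation:

```go
// Stdlib-only sketch: write the three settings the old createConfig seeded,
// in the JSON config format. Hugo's own code goes through
// parser.InterfaceToConfig and also supports TOML and YAML.
package main

import (
	"encoding/json"
	"log"
	"os"
)

func main() {
	in := map[string]string{
		"baseURL":      "http://example.org/",
		"title":        "My New Hugo Site",
		"languageCode": "en-us",
	}
	b, err := json.MarshalIndent(in, "", "  ")
	if err != nil {
		log.Fatal(err)
	}
	// The new skeleton names the file hugo.<format>; the old code wrote config.<format>.
	if err := os.WriteFile("hugo.json", append(b, '\n'), 0o644); err != nil {
		log.Fatal(err)
	}
}
```
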
diff --git a/commands/new_theme.go b/commands/new_theme.go
deleted file mode 100644
index a0a4e89e3..000000000
--- a/commands/new_theme.go
+++ /dev/null
@@ -1,179 +0,0 @@
-// Copyright 2018 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package commands
-
-import (
- "bytes"
- "errors"
- "path/filepath"
- "strings"
- "time"
-
- "github.com/gohugoio/hugo/helpers"
- "github.com/gohugoio/hugo/hugofs"
- "github.com/spf13/cobra"
- jww "github.com/spf13/jwalterweatherman"
-)
-
-var _ cmder = (*newThemeCmd)(nil)
-
-type newThemeCmd struct {
- *baseCmd
- hugoBuilderCommon
-}
-
-func newNewThemeCmd() *newThemeCmd {
- ccmd := &newThemeCmd{baseCmd: newBaseCmd(nil)}
-
- cmd := &cobra.Command{
- Use: "theme [name]",
- Short: "Create a new theme",
- Long: `Create a new theme (skeleton) called [name] in the current directory.
-New theme is a skeleton. Please add content to the touched files. Add your
-name to the copyright line in the license and adjust the theme.toml file
-as you see fit.`,
- RunE: ccmd.newTheme,
- }
-
- ccmd.cmd = cmd
-
- return ccmd
-}
-
-// newTheme creates a new Hugo theme template
-func (n *newThemeCmd) newTheme(cmd *cobra.Command, args []string) error {
- c, err := initializeConfig(false, false, &n.hugoBuilderCommon, n, nil)
-
- if err != nil {
- return err
- }
-
- if len(args) < 1 {
- return newUserError("theme name needs to be provided")
- }
-
- createpath := c.hugo().PathSpec.AbsPathify(filepath.Join(c.Cfg.GetString("themesDir"), args[0]))
- jww.FEEDBACK.Println("Creating theme at", createpath)
-
- cfg := c.DepsCfg
-
- if x, _ := helpers.Exists(createpath, cfg.Fs.Source); x {
- return errors.New(createpath + " already exists")
- }
-
- mkdir(createpath, "layouts", "_default")
- mkdir(createpath, "layouts", "partials")
-
- touchFile(cfg.Fs.Source, createpath, "layouts", "index.html")
- touchFile(cfg.Fs.Source, createpath, "layouts", "404.html")
- touchFile(cfg.Fs.Source, createpath, "layouts", "_default", "list.html")
- touchFile(cfg.Fs.Source, createpath, "layouts", "_default", "single.html")
-
- baseofDefault := []byte(`<!DOCTYPE html>
-<html>
-  <head>
-    {{- partial "head.html" . -}}
-  </head>
-  <body>
-    {{- partial "header.html" . -}}
-    <div id="content">
-    {{- block "main" . }}{{- end }}
-    </div>
-    {{- partial "footer.html" . -}}
-  </body>
-</html>
-`)
- err = helpers.WriteToDisk(filepath.Join(createpath, "layouts", "_default", "baseof.html"), bytes.NewReader(baseofDefault), cfg.Fs.Source)
- if err != nil {
- return err
- }
-
- touchFile(cfg.Fs.Source, createpath, "layouts", "partials", "head.html")
- touchFile(cfg.Fs.Source, createpath, "layouts", "partials", "header.html")
- touchFile(cfg.Fs.Source, createpath, "layouts", "partials", "footer.html")
-
- mkdir(createpath, "archetypes")
-
- archDefault := []byte("+++\n+++\n")
-
- err = helpers.WriteToDisk(filepath.Join(createpath, "archetypes", "default.md"), bytes.NewReader(archDefault), cfg.Fs.Source)
- if err != nil {
- return err
- }
-
- mkdir(createpath, "static", "js")
- mkdir(createpath, "static", "css")
-
- by := []byte(`The MIT License (MIT)
-
-Copyright (c) ` + time.Now().Format("2006") + ` YOUR_NAME_HERE
-
-Permission is hereby granted, free of charge, to any person obtaining a copy of
-this software and associated documentation files (the "Software"), to deal in
-the Software without restriction, including without limitation the rights to
-use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
-the Software, and to permit persons to whom the Software is furnished to do so,
-subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in all
-copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
-FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
-COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
-IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
-CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
-`)
-
- err = helpers.WriteToDisk(filepath.Join(createpath, "LICENSE"), bytes.NewReader(by), cfg.Fs.Source)
- if err != nil {
- return err
- }
-
- n.createThemeMD(cfg.Fs, createpath)
-
- return nil
-}
-
-func (n *newThemeCmd) createThemeMD(fs *hugofs.Fs, inpath string) (err error) {
-
- by := []byte(`# theme.toml template for a Hugo theme
-# See https://github.com/gohugoio/hugoThemes#themetoml for an example
-
-name = "` + strings.Title(helpers.MakeTitle(filepath.Base(inpath))) + `"
-license = "MIT"
-licenselink = "https://github.com/yourname/yourtheme/blob/master/LICENSE"
-description = ""
-homepage = "http://example.com/"
-tags = []
-features = []
-min_version = "0.41"
-
-[author]
- name = ""
- homepage = ""
-
-# If porting an existing theme
-[original]
- name = ""
- homepage = ""
- repo = ""
-`)
-
- err = helpers.WriteToDisk(filepath.Join(inpath, "theme.toml"), bytes.NewReader(by), fs.Source)
- if err != nil {
- return
- }
-
- return nil
-}
diff --git a/commands/release.go b/commands/release.go
index 4de165f35..059f04eb8 100644
--- a/commands/release.go
+++ b/commands/release.go
@@ -1,6 +1,4 @@
-// +build release
-
-// Copyright 2017-present The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -16,57 +14,40 @@
package commands
import (
- "errors"
+ "context"
- "github.com/gohugoio/hugo/config"
+ "github.com/bep/simplecobra"
"github.com/gohugoio/hugo/releaser"
"github.com/spf13/cobra"
)
-var _ cmder = (*releaseCommandeer)(nil)
+// Note: This is a command only meant for internal use and must be run
+// via "go run -tags release main.go release" on the actual code base that is in the release.
+func newReleaseCommand() simplecobra.Commander {
+ var (
+ step int
+ skipPush bool
+ try bool
+ )
-type releaseCommandeer struct {
- cmd *cobra.Command
+ return &simpleCommand{
+ name: "release",
+ short: "Release a new version of Hugo",
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ rel, err := releaser.New(skipPush, try, step)
+ if err != nil {
+ return err
+ }
- version string
-
- skipPublish bool
- try bool
-}
-
-func createReleaser() cmder {
- // Note: This is a command only meant for internal use and must be run
- // via "go run -tags release main.go release" on the actual code base that is in the release.
- r := &releaseCommandeer{
- cmd: &cobra.Command{
- Use: "release",
- Short: "Release a new version of Hugo.",
- Hidden: true,
+ return rel.Run()
+ },
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.Hidden = true
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ cmd.PersistentFlags().BoolVarP(&skipPush, "skip-push", "", false, "skip pushing to remote")
+ cmd.PersistentFlags().BoolVarP(&try, "try", "", false, "no changes")
+ cmd.PersistentFlags().IntVarP(&step, "step", "", 0, "step to run (1: set new version 2: prepare next dev version)")
+ _ = cmd.RegisterFlagCompletionFunc("step", cobra.FixedCompletions([]string{"1", "2"}, cobra.ShellCompDirectiveNoFileComp))
},
}
-
- r.cmd.RunE = func(cmd *cobra.Command, args []string) error {
- return r.release()
- }
-
- r.cmd.PersistentFlags().StringVarP(&r.version, "rel", "r", "", "new release version, i.e. 0.25.1")
- r.cmd.PersistentFlags().BoolVarP(&r.skipPublish, "skip-publish", "", false, "skip all publishing pipes of the release")
- r.cmd.PersistentFlags().BoolVarP(&r.try, "try", "", false, "simulate a release, i.e. no changes")
-
- return r
-}
-
-func (c *releaseCommandeer) getCommand() *cobra.Command {
- return c.cmd
-}
-
-func (c *releaseCommandeer) flagsToConfig(cfg config.Provider) {
-
-}
-
-func (r *releaseCommandeer) release() error {
- if r.version == "" {
- return errors.New("must set the --rel flag to the relevant version number")
- }
- return releaser.New(r.version, r.skipPublish, r.try).Run()
}
diff --git a/commands/release_noop.go b/commands/release_noop.go
deleted file mode 100644
index ccf34b68e..000000000
--- a/commands/release_noop.go
+++ /dev/null
@@ -1,20 +0,0 @@
-// +build !release
-
-// Copyright 2018 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package commands
-
-func createReleaser() cmder {
- return &nilCommand{}
-}
diff --git a/commands/server.go b/commands/server.go
index 709181507..c8895b9a1 100644
--- a/commands/server.go
+++ b/commands/server.go
@@ -1,4 +1,4 @@
-// Copyright 2019 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -15,105 +15,416 @@ package commands
import (
"bytes"
+ "context"
+ "crypto/tls"
+ "crypto/x509"
+ "encoding/json"
+ "encoding/pem"
+ "errors"
"fmt"
+ "io"
+ "maps"
"net"
"net/http"
+ _ "net/http/pprof"
"net/url"
"os"
"os/signal"
+ "path"
"path/filepath"
"regexp"
- "runtime"
+ "sort"
"strconv"
"strings"
"sync"
+ "sync/atomic"
"syscall"
"time"
- "github.com/pkg/errors"
+ "github.com/bep/mclib"
+ "github.com/pkg/browser"
- "github.com/gohugoio/hugo/livereload"
- "github.com/gohugoio/hugo/tpl"
+ "github.com/bep/debounce"
+ "github.com/bep/simplecobra"
+ "github.com/fsnotify/fsnotify"
+ "github.com/gohugoio/hugo/common/herrors"
+ "github.com/gohugoio/hugo/common/hugo"
+ "github.com/gohugoio/hugo/tpl/tplimpl"
+ "github.com/gohugoio/hugo/common/types"
+ "github.com/gohugoio/hugo/common/urls"
"github.com/gohugoio/hugo/config"
"github.com/gohugoio/hugo/helpers"
+ "github.com/gohugoio/hugo/hugofs"
+ "github.com/gohugoio/hugo/hugolib"
+ "github.com/gohugoio/hugo/hugolib/filesystems"
+ "github.com/gohugoio/hugo/livereload"
+ "github.com/gohugoio/hugo/transform"
+ "github.com/gohugoio/hugo/transform/livereloadinject"
"github.com/spf13/afero"
"github.com/spf13/cobra"
- jww "github.com/spf13/jwalterweatherman"
+ "github.com/spf13/fsync"
+ "golang.org/x/sync/errgroup"
+ "golang.org/x/sync/semaphore"
)
-type serverCmd struct {
- // Can be used to stop the server. Useful in tests
- stop <-chan bool
+var (
+ logDuplicateTemplateExecuteRe = regexp.MustCompile(`: template: .*?:\d+:\d+: executing ".*?"`)
+ logDuplicateTemplateParseRe = regexp.MustCompile(`: template: .*?:\d+:\d*`)
+)
- disableLiveReload bool
- navigateToChanged bool
- renderToDisk bool
- serverAppend bool
- serverInterface string
- serverPort int
- liveReloadPort int
- serverWatch bool
- noHTTPCache bool
+var logReplacer = strings.NewReplacer(
+ "can't", "can’t", // Chroma lexer doesn't do well with "can't"
+ "*hugolib.pageState", "page.Page", // Page is the public interface.
+ "Rebuild failed:", "",
+)
- disableFastRender bool
- disableBrowserError bool
+const (
+ configChangeConfig = "config file"
+ configChangeGoMod = "go.mod file"
+ configChangeGoWork = "go work file"
+)
- *baseBuilderCmd
+const (
+ hugoHeaderRedirect = "X-Hugo-Redirect"
+)
+
+func newHugoBuilder(r *rootCommand, s *serverCommand, onConfigLoaded ...func(reloaded bool) error) *hugoBuilder {
+ var visitedURLs *types.EvictingQueue[string]
+ if s != nil && !s.disableFastRender {
+ visitedURLs = types.NewEvictingQueue[string](20)
+ }
+ return &hugoBuilder{
+ r: r,
+ s: s,
+ visitedURLs: visitedURLs,
+ fullRebuildSem: semaphore.NewWeighted(1),
+ debounce: debounce.New(4 * time.Second),
+ onConfigLoaded: func(reloaded bool) error {
+ for _, wc := range onConfigLoaded {
+ if err := wc(reloaded); err != nil {
+ return err
+ }
+ }
+ return nil
+ },
+ }
}
-func (b *commandsBuilder) newServerCmd() *serverCmd {
- return b.newServerCmdSignaled(nil)
+func newServerCommand() *serverCommand {
+ // Flags.
+ var uninstall bool
+
+ c := &serverCommand{
+ quit: make(chan bool),
+ commands: []simplecobra.Commander{
+ &simpleCommand{
+ name: "trust",
+ short: "Install the local CA in the system trust store",
+ run: func(ctx context.Context, cd *simplecobra.Commandeer, r *rootCommand, args []string) error {
+ action := "-install"
+ if uninstall {
+ action = "-uninstall"
+ }
+ os.Args = []string{action}
+ return mclib.RunMain()
+ },
+ withc: func(cmd *cobra.Command, r *rootCommand) {
+ cmd.ValidArgsFunction = cobra.NoFileCompletions
+ cmd.Flags().BoolVar(&uninstall, "uninstall", false, "Uninstall the local CA (but do not delete it).")
+ },
+ },
+ },
+ }
+
+ return c
}
-func (b *commandsBuilder) newServerCmdSignaled(stop <-chan bool) *serverCmd {
- cc := &serverCmd{stop: stop}
+func (c *serverCommand) Commands() []simplecobra.Commander {
+ return c.commands
+}
- cc.baseBuilderCmd = b.newBuilderCmd(&cobra.Command{
- Use: "server",
- Aliases: []string{"serve"},
- Short: "A high performance webserver",
- Long: `Hugo provides its own webserver which builds and serves the site.
-While hugo server is high performance, it is a webserver with limited options.
-Many run it in production, but the standard behavior is for people to use it
-in development and use a more full featured server such as Nginx or Caddy.
+type countingStatFs struct {
+ afero.Fs
+ statCounter uint64
+}
-'hugo server' will avoid writing the rendered and served content to disk,
-preferring to store it in memory.
+func (fs *countingStatFs) Stat(name string) (os.FileInfo, error) {
+ f, err := fs.Fs.Stat(name)
+ if err == nil {
+ if !f.IsDir() {
+ atomic.AddUint64(&fs.statCounter, 1)
+ }
+ }
+ return f, err
+}
-By default hugo will also watch your files for any changes you make and
-automatically rebuild the site. It will then live reload any open browser pages
-and push the latest content to them. As most Hugo sites are built in a fraction
-of a second, you will be able to save and see your changes nearly instantly.`,
- RunE: cc.server,
+// dynamicEvents contains events that are considered dynamic, as in "not static".
+// Both of these categories will trigger a new build, but the asset events
+// do not fit into the "navigate to changed" logic.
+type dynamicEvents struct {
+ ContentEvents []fsnotify.Event
+ AssetEvents []fsnotify.Event
+}
+
+type fileChangeDetector struct {
+ sync.Mutex
+ current map[string]uint64
+ prev map[string]uint64
+
+ irrelevantRe *regexp.Regexp
+}
+
+func (f *fileChangeDetector) OnFileClose(name string, checksum uint64) {
+ f.Lock()
+ defer f.Unlock()
+ f.current[name] = checksum
+}
+
+func (f *fileChangeDetector) PrepareNew() {
+ if f == nil {
+ return
+ }
+
+ f.Lock()
+ defer f.Unlock()
+
+ if f.current == nil {
+ f.current = make(map[string]uint64)
+ f.prev = make(map[string]uint64)
+ return
+ }
+
+ f.prev = make(map[string]uint64)
+ maps.Copy(f.prev, f.current)
+ f.current = make(map[string]uint64)
+}
+
+func (f *fileChangeDetector) changed() []string {
+ if f == nil {
+ return nil
+ }
+ f.Lock()
+ defer f.Unlock()
+ var c []string
+ for k, v := range f.current {
+ vv, found := f.prev[k]
+ if !found || v != vv {
+ c = append(c, k)
+ }
+ }
+
+ return f.filterIrrelevantAndSort(c)
+}
+
+func (f *fileChangeDetector) filterIrrelevantAndSort(in []string) []string {
+ var filtered []string
+ for _, v := range in {
+ if !f.irrelevantRe.MatchString(v) {
+ filtered = append(filtered, v)
+ }
+ }
+ sort.Strings(filtered)
+ return filtered
+}
+
+type fileServer struct {
+ baseURLs []urls.BaseURL
+ roots []string
+ errorTemplate func(err any) (io.Reader, error)
+ c *serverCommand
+}
+
+func (f *fileServer) createEndpoint(i int) (*http.ServeMux, net.Listener, string, string, error) {
+ r := f.c.r
+ baseURL := f.baseURLs[i]
+ root := f.roots[i]
+ port := f.c.serverPorts[i].p
+ listener := f.c.serverPorts[i].ln
+ logger := f.c.r.logger
+
+ if i == 0 {
+ r.Printf("Environment: %q\n", f.c.hugoTry().Deps.Site.Hugo().Environment)
+ mainTarget := "disk"
+ if f.c.r.renderToMemory {
+ mainTarget = "memory"
+ }
+ if f.c.renderStaticToDisk {
+ r.Printf("Serving pages from %s and static files from disk\n", mainTarget)
+ } else {
+ r.Printf("Serving pages from %s\n", mainTarget)
+ }
+ }
+
+ var httpFs *afero.HttpFs
+ f.c.withConf(func(conf *commonConfig) {
+ httpFs = afero.NewHttpFs(conf.fs.PublishDirServer)
})
- cc.cmd.Flags().IntVarP(&cc.serverPort, "port", "p", 1313, "port on which the server will listen")
- cc.cmd.Flags().IntVar(&cc.liveReloadPort, "liveReloadPort", -1, "port for live reloading (i.e. 443 in HTTPS proxy situations)")
- cc.cmd.Flags().StringVarP(&cc.serverInterface, "bind", "", "127.0.0.1", "interface to which the server will bind")
- cc.cmd.Flags().BoolVarP(&cc.serverWatch, "watch", "w", true, "watch filesystem for changes and recreate as needed")
- cc.cmd.Flags().BoolVar(&cc.noHTTPCache, "noHTTPCache", false, "prevent HTTP caching")
- cc.cmd.Flags().BoolVarP(&cc.serverAppend, "appendPort", "", true, "append port to baseURL")
- cc.cmd.Flags().BoolVar(&cc.disableLiveReload, "disableLiveReload", false, "watch without enabling live browser reload on rebuild")
- cc.cmd.Flags().BoolVar(&cc.navigateToChanged, "navigateToChanged", false, "navigate to changed content file on live browser reload")
- cc.cmd.Flags().BoolVar(&cc.renderToDisk, "renderToDisk", false, "render to Destination path (default is render to memory & serve from there)")
- cc.cmd.Flags().BoolVar(&cc.disableFastRender, "disableFastRender", false, "enables full re-renders on changes")
- cc.cmd.Flags().BoolVar(&cc.disableBrowserError, "disableBrowserError", false, "do not show build errors in the browser")
+ fs := filesOnlyFs{httpFs.Dir(path.Join("/", root))}
+ if i == 0 && f.c.fastRenderMode {
+ r.Println("Running in Fast Render Mode. For full rebuilds on change: hugo server --disableFastRender")
+ }
- cc.cmd.Flags().String("memstats", "", "log memory usage to this file")
- cc.cmd.Flags().String("meminterval", "100ms", "interval to poll memory usage (requires --memstats), valid time units are \"ns\", \"us\" (or \"µs\"), \"ms\", \"s\", \"m\", \"h\".")
+ decorate := func(h http.Handler) http.Handler {
+ return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+ if f.c.showErrorInBrowser {
+ // First check the error state
+ err := f.c.getErrorWithContext()
+ if err != nil {
+ f.c.errState.setWasErr(true)
+ w.WriteHeader(500)
+ r, err := f.errorTemplate(err)
+ if err != nil {
+ logger.Errorln(err)
+ }
- return cc
+ port = 1313
+ f.c.withConf(func(conf *commonConfig) {
+ if lrport := conf.configs.GetFirstLanguageConfig().BaseURLLiveReload().Port(); lrport != 0 {
+ port = lrport
+ }
+ })
+ lr := baseURL.URL()
+ lr.Host = fmt.Sprintf("%s:%d", lr.Hostname(), port)
+ fmt.Fprint(w, injectLiveReloadScript(r, lr))
+
+ return
+ }
+ }
+
+ if f.c.noHTTPCache {
+ w.Header().Set("Cache-Control", "no-store, no-cache, must-revalidate, max-age=0")
+ w.Header().Set("Pragma", "no-cache")
+ }
+
+ var serverConfig config.Server
+ f.c.withConf(func(conf *commonConfig) {
+ serverConfig = conf.configs.Base.Server
+ })
+
+ // Ignore any query params for the operations below.
+ requestURI, _ := url.PathUnescape(strings.TrimSuffix(r.RequestURI, "?"+r.URL.RawQuery))
+
+ for _, header := range serverConfig.MatchHeaders(requestURI) {
+ w.Header().Set(header.Key, header.Value)
+ }
+
+ if canRedirect(requestURI, r) {
+ if redirect := serverConfig.MatchRedirect(requestURI, r.Header); !redirect.IsZero() {
+ doRedirect := true
+ // This matches Netlify's behavior and is needed for SPA behavior.
+ // See https://docs.netlify.com/routing/redirects/rewrites-proxies/
+ if !redirect.Force {
+ path := filepath.Clean(strings.TrimPrefix(requestURI, baseURL.Path()))
+ if root != "" {
+ path = filepath.Join(root, path)
+ }
+ var fs afero.Fs
+ f.c.withConf(func(conf *commonConfig) {
+ fs = conf.fs.PublishDirServer
+ })
+
+ fi, err := fs.Stat(path)
+
+ if err == nil {
+ if fi.IsDir() {
+ // There will be overlapping directories, so we
+ // need to check for a file.
+ _, err = fs.Stat(filepath.Join(path, "index.html"))
+ doRedirect = err != nil
+ } else {
+ doRedirect = false
+ }
+ }
+ }
+
+ if doRedirect {
+ w.Header().Set(hugoHeaderRedirect, "true")
+ switch redirect.Status {
+ case 404:
+ w.WriteHeader(404)
+ file, err := fs.Open(strings.TrimPrefix(redirect.To, baseURL.Path()))
+ if err == nil {
+ defer file.Close()
+ io.Copy(w, file)
+ } else {
+ fmt.Fprintln(w, "<h1>Page Not Found</h1>")
+ }
+ return
+ case 200:
+ if r2 := f.rewriteRequest(r, strings.TrimPrefix(redirect.To, baseURL.Path())); r2 != nil {
+ requestURI = redirect.To
+ r = r2
+ }
+ default:
+ w.Header().Set("Content-Type", "")
+ http.Redirect(w, r, redirect.To, redirect.Status)
+ return
+
+ }
+ }
+ }
+ }
+
+ if f.c.fastRenderMode && f.c.errState.buildErr() == nil {
+ if isNavigation(requestURI, r) {
+ if !f.c.visitedURLs.Contains(requestURI) {
+ // If not already on stack, re-render that single page.
+ if err := f.c.partialReRender(requestURI); err != nil {
+ f.c.handleBuildErr(err, fmt.Sprintf("Failed to render %q", requestURI))
+ if f.c.showErrorInBrowser {
+ http.Redirect(w, r, requestURI, http.StatusMovedPermanently)
+ return
+ }
+ }
+ }
+
+ f.c.visitedURLs.Add(requestURI)
+
+ }
+ }
+
+ h.ServeHTTP(w, r)
+ })
+ }
+
+ fileserver := decorate(http.FileServer(fs))
+ mu := http.NewServeMux()
+ if baseURL.Path() == "" || baseURL.Path() == "/" {
+ mu.Handle("/", fileserver)
+ } else {
+ mu.Handle(baseURL.Path(), http.StripPrefix(baseURL.Path(), fileserver))
+ }
+ if r.IsTestRun() {
+ var shutDownOnce sync.Once
+ mu.HandleFunc("/__stop", func(w http.ResponseWriter, r *http.Request) {
+ shutDownOnce.Do(func() {
+ close(f.c.quit)
+ })
+ })
+ }
+
+ endpoint := net.JoinHostPort(f.c.serverInterface, strconv.Itoa(port))
+
+ return mu, listener, baseURL.String(), endpoint, nil
+}
+
+func (f *fileServer) rewriteRequest(r *http.Request, toPath string) *http.Request {
+ r2 := new(http.Request)
+ *r2 = *r
+ r2.URL = new(url.URL)
+ *r2.URL = *r.URL
+ r2.URL.Path = toPath
+ r2.Header.Set("X-Rewrite-Original-URI", r.URL.RequestURI())
+
+ return r2
}
type filesOnlyFs struct {
fs http.FileSystem
}
-type noDirFile struct {
- http.File
-}
-
func (fs filesOnlyFs) Open(name string) (http.File, error) {
f, err := fs.fs.Open(name)
if err != nil {
@@ -122,134 +433,56 @@ func (fs filesOnlyFs) Open(name string) (http.File, error) {
return noDirFile{f}, nil
}
+type noDirFile struct {
+ http.File
+}
+
func (f noDirFile) Readdir(count int) ([]os.FileInfo, error) {
return nil, nil
}
-var serverPorts []int
+type serverCommand struct {
+ r *rootCommand
-func (sc *serverCmd) server(cmd *cobra.Command, args []string) error {
- // If a Destination is provided via flag write to disk
- destination, _ := cmd.Flags().GetString("destination")
- if destination != "" {
- sc.renderToDisk = true
+ commands []simplecobra.Commander
+
+ *hugoBuilder
+
+ quit chan bool // Closed when the server should shut down. Used in tests only.
+ serverPorts []serverPortListener
+ doLiveReload bool
+
+ // Flags.
+ renderStaticToDisk bool
+ navigateToChanged bool
+ openBrowser bool
+ serverAppend bool
+ serverInterface string
+ tlsCertFile string
+ tlsKeyFile string
+ tlsAuto bool
+ pprof bool
+ serverPort int
+ liveReloadPort int
+ serverWatch bool
+ noHTTPCache bool
+ disableLiveReload bool
+ disableFastRender bool
+ disableBrowserError bool
+}
+
+func (c *serverCommand) Name() string {
+ return "server"
+}
+
+func (c *serverCommand) Run(ctx context.Context, cd *simplecobra.Commandeer, args []string) error {
+ if c.pprof {
+ go func() {
+ http.ListenAndServe("localhost:8080", nil)
+ }()
}
-
- var serverCfgInit sync.Once
-
- cfgInit := func(c *commandeer) error {
- c.Set("renderToMemory", !sc.renderToDisk)
- if cmd.Flags().Changed("navigateToChanged") {
- c.Set("navigateToChanged", sc.navigateToChanged)
- }
- if cmd.Flags().Changed("disableLiveReload") {
- c.Set("disableLiveReload", sc.disableLiveReload)
- }
- if cmd.Flags().Changed("disableFastRender") {
- c.Set("disableFastRender", sc.disableFastRender)
- }
- if cmd.Flags().Changed("disableBrowserError") {
- c.Set("disableBrowserError", sc.disableBrowserError)
- }
- if sc.serverWatch {
- c.Set("watch", true)
- }
-
- // TODO(bep) yes, we should fix.
- if !c.languagesConfigured {
- return nil
- }
-
- var err error
-
- // We can only do this once.
- serverCfgInit.Do(func() {
- serverPorts = make([]int, 1)
-
- if c.languages.IsMultihost() {
- if !sc.serverAppend {
- err = newSystemError("--appendPort=false not supported when in multihost mode")
- }
- serverPorts = make([]int, len(c.languages))
- }
-
- currentServerPort := sc.serverPort
-
- for i := 0; i < len(serverPorts); i++ {
- l, err := net.Listen("tcp", net.JoinHostPort(sc.serverInterface, strconv.Itoa(currentServerPort)))
- if err == nil {
- l.Close()
- serverPorts[i] = currentServerPort
- } else {
- if i == 0 && sc.cmd.Flags().Changed("port") {
- // port set explicitly by user -- he/she probably meant it!
- err = newSystemErrorF("Server startup failed: %s", err)
- }
- c.logger.FEEDBACK.Println("port", sc.serverPort, "already in use, attempting to use an available port")
- sp, err := helpers.FindAvailablePort()
- if err != nil {
- err = newSystemError("Unable to find alternative port to use:", err)
- }
- serverPorts[i] = sp.Port
- }
-
- currentServerPort = serverPorts[i] + 1
- }
- })
-
- c.serverPorts = serverPorts
-
- c.Set("port", sc.serverPort)
- if sc.liveReloadPort != -1 {
- c.Set("liveReloadPort", sc.liveReloadPort)
- } else {
- c.Set("liveReloadPort", serverPorts[0])
- }
-
- isMultiHost := c.languages.IsMultihost()
- for i, language := range c.languages {
- var serverPort int
- if isMultiHost {
- serverPort = serverPorts[i]
- } else {
- serverPort = serverPorts[0]
- }
-
- baseURL, err := sc.fixURL(language, sc.baseURL, serverPort)
- if err != nil {
- return nil
- }
- if isMultiHost {
- language.Set("baseURL", baseURL)
- }
- if i == 0 {
- c.Set("baseURL", baseURL)
- }
- }
-
- return err
-
- }
-
- if err := memStats(); err != nil {
- jww.WARN.Println("memstats error:", err)
- }
-
- c, err := initializeConfig(true, true, &sc.hugoBuilderCommon, sc, cfgInit)
- if err != nil {
- return err
- }
-
- if err := c.serverBuild(); err != nil {
- return err
- }
-
- for _, s := range c.hugo().Sites {
- s.RegisterMediaTypes()
- }
-
// Watch runs its own server as part of the routine
- if sc.serverWatch {
+ if c.serverWatch {
watchDirs, err := c.getDirList()
if err != nil {
@@ -259,10 +492,9 @@ func (sc *serverCmd) server(cmd *cobra.Command, args []string) error {
watchGroups := helpers.ExtractAndGroupRootPaths(watchDirs)
for _, group := range watchGroups {
- jww.FEEDBACK.Printf("Watching for changes in %s\n", group)
+ c.r.Printf("Watching for changes in %s\n", group)
}
- watcher, err := c.newWatcher(watchDirs...)
-
+ watcher, err := c.newWatcher(c.r.poll, watchDirs...)
if err != nil {
return err
}
@@ -271,246 +503,333 @@ func (sc *serverCmd) server(cmd *cobra.Command, args []string) error {
}
- return c.serve(sc)
-
-}
-
-func getRootWatchDirsStr(baseDir string, watchDirs []string) string {
- relWatchDirs := make([]string, len(watchDirs))
- for i, dir := range watchDirs {
- relWatchDirs[i], _ = helpers.GetRelativePath(dir, baseDir)
- }
-
- return strings.Join(helpers.UniqueStringsSorted(helpers.ExtractRootPaths(relWatchDirs)), ",")
-}
-
-type fileServer struct {
- baseURLs []string
- roots []string
- errorTemplate tpl.Template
- c *commandeer
- s *serverCmd
-}
-
-func (f *fileServer) createEndpoint(i int) (*http.ServeMux, string, string, error) {
- baseURL := f.baseURLs[i]
- root := f.roots[i]
- port := f.c.serverPorts[i]
-
- publishDir := f.c.Cfg.GetString("publishDir")
-
- if root != "" {
- publishDir = filepath.Join(publishDir, root)
- }
-
- absPublishDir := f.c.hugo().PathSpec.AbsPathify(publishDir)
-
- jww.FEEDBACK.Printf("Environment: %q", f.c.hugo().Deps.Site.Hugo().Environment)
-
- if i == 0 {
- if f.s.renderToDisk {
- jww.FEEDBACK.Println("Serving pages from " + absPublishDir)
- } else {
- jww.FEEDBACK.Println("Serving pages from memory")
- }
- }
-
- httpFs := afero.NewHttpFs(f.c.destinationFs)
- fs := filesOnlyFs{httpFs.Dir(absPublishDir)}
-
- if i == 0 && f.c.fastRenderMode {
- jww.FEEDBACK.Println("Running in Fast Render Mode. For full rebuilds on change: hugo server --disableFastRender")
- }
-
- // We're only interested in the path
- u, err := url.Parse(baseURL)
- if err != nil {
- return nil, "", "", errors.Wrap(err, "Invalid baseURL")
- }
-
- decorate := func(h http.Handler) http.Handler {
- return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
- if f.c.showErrorInBrowser {
- // First check the error state
- err := f.c.getErrorWithContext()
- if err != nil {
- w.WriteHeader(500)
- var b bytes.Buffer
- err := f.errorTemplate.Execute(&b, err)
- if err != nil {
- f.c.logger.ERROR.Println(err)
- }
- port = 1313
- if !f.c.paused {
- port = f.c.Cfg.GetInt("liveReloadPort")
- }
- fmt.Fprint(w, injectLiveReloadScript(&b, port))
-
- return
- }
- }
-
- if f.s.noHTTPCache {
- w.Header().Set("Cache-Control", "no-store, no-cache, must-revalidate, max-age=0")
- w.Header().Set("Pragma", "no-cache")
- }
-
- if f.c.fastRenderMode && f.c.buildErr == nil {
- p := r.RequestURI
- if strings.HasSuffix(p, "/") || strings.HasSuffix(p, "html") || strings.HasSuffix(p, "htm") {
- if !f.c.visitedURLs.Contains(p) {
- // If not already on stack, re-render that single page.
- if err := f.c.partialReRender(p); err != nil {
- f.c.handleBuildErr(err, fmt.Sprintf("Failed to render %q", p))
- if f.c.showErrorInBrowser {
- http.Redirect(w, r, p, http.StatusMovedPermanently)
- return
- }
- }
- }
-
- f.c.visitedURLs.Add(p)
-
- }
- }
- h.ServeHTTP(w, r)
- })
- }
-
- fileserver := decorate(http.FileServer(fs))
- mu := http.NewServeMux()
-
- if u.Path == "" || u.Path == "/" {
- mu.Handle("/", fileserver)
- } else {
- mu.Handle(u.Path, http.StripPrefix(u.Path, fileserver))
- }
-
- endpoint := net.JoinHostPort(f.s.serverInterface, strconv.Itoa(port))
-
- return mu, u.String(), endpoint, nil
-}
-
-var logErrorRe = regexp.MustCompile(`(?s)ERROR \d{4}/\d{2}/\d{2} \d{2}:\d{2}:\d{2} `)
-
-func removeErrorPrefixFromLog(content string) string {
- return logErrorRe.ReplaceAllLiteralString(content, "")
-}
-func (c *commandeer) serve(s *serverCmd) error {
-
- isMultiHost := c.hugo().IsMultihost()
-
- var (
- baseURLs []string
- roots []string
- )
-
- if isMultiHost {
- for _, s := range c.hugo().Sites {
- baseURLs = append(baseURLs, s.BaseURL.String())
- roots = append(roots, s.Language().Lang)
- }
- } else {
- s := c.hugo().Sites[0]
- baseURLs = []string{s.BaseURL.String()}
- roots = []string{""}
- }
-
- templ, err := c.hugo().TextTmpl.Parse("__default_server_error", buildErrorTemplate)
+ err := func() error {
+ defer c.r.timeTrack(time.Now(), "Built")
+ return c.build()
+ }()
if err != nil {
return err
}
- srv := &fileServer{
- baseURLs: baseURLs,
- roots: roots,
- c: c,
- s: s,
- errorTemplate: templ,
- }
+ return c.serve()
+}
- doLiveReload := !c.Cfg.GetBool("disableLiveReload")
+func (c *serverCommand) Init(cd *simplecobra.Commandeer) error {
+ cmd := cd.CobraCommand
+ cmd.Short = "Start the embedded web server"
+ cmd.Long = `Hugo provides its own webserver which builds and serves the site.
+While hugo server is high performance, it is a webserver with limited options.
- if doLiveReload {
- livereload.Initialize()
- }
+The ` + "`" + `hugo server` + "`" + ` command will by default write and serve files from disk, but
+you can render to memory by using the ` + "`" + `--renderToMemory` + "`" + ` flag. This can be
+faster in some cases, but it will consume more memory.
- var sigs = make(chan os.Signal, 1)
- signal.Notify(sigs, syscall.SIGINT, syscall.SIGTERM)
+By default hugo will also watch your files for any changes you make and
+automatically rebuild the site. It will then live reload any open browser pages
+and push the latest content to them. As most Hugo sites are built in a fraction
+of a second, you will be able to save and see your changes nearly instantly.`
+ cmd.Aliases = []string{"serve"}
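+
+ // Illustrative invocations only (hypothetical, using the flags registered below;
+ // defaults are documented on each flag):
+ //
+ //	hugo server                     -> watch, rebuild and live reload on http://localhost:1313/
+ //	hugo server --tlsAuto           -> same, but over https with a locally trusted certificate
+ //	hugo server -p 1414 --bind 0.0.0.0 --disableFastRender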
- for i := range baseURLs {
- mu, serverURL, endpoint, err := srv.createEndpoint(i)
+ cmd.Flags().IntVarP(&c.serverPort, "port", "p", 1313, "port on which the server will listen")
+ _ = cmd.RegisterFlagCompletionFunc("port", cobra.NoFileCompletions)
+ cmd.Flags().IntVar(&c.liveReloadPort, "liveReloadPort", -1, "port for live reloading (i.e. 443 in HTTPS proxy situations)")
+ _ = cmd.RegisterFlagCompletionFunc("liveReloadPort", cobra.NoFileCompletions)
+ cmd.Flags().StringVarP(&c.serverInterface, "bind", "", "127.0.0.1", "interface to which the server will bind")
+ _ = cmd.RegisterFlagCompletionFunc("bind", cobra.NoFileCompletions)
+ cmd.Flags().StringVarP(&c.tlsCertFile, "tlsCertFile", "", "", "path to TLS certificate file")
+ _ = cmd.MarkFlagFilename("tlsCertFile", "pem")
+ cmd.Flags().StringVarP(&c.tlsKeyFile, "tlsKeyFile", "", "", "path to TLS key file")
+ _ = cmd.MarkFlagFilename("tlsKeyFile", "pem")
+ cmd.Flags().BoolVar(&c.tlsAuto, "tlsAuto", false, "generate and use locally-trusted certificates.")
+ cmd.Flags().BoolVar(&c.pprof, "pprof", false, "enable the pprof server (port 8080)")
+ cmd.Flags().BoolVarP(&c.serverWatch, "watch", "w", true, "watch filesystem for changes and recreate as needed")
+ cmd.Flags().BoolVar(&c.noHTTPCache, "noHTTPCache", false, "prevent HTTP caching")
+ cmd.Flags().BoolVarP(&c.serverAppend, "appendPort", "", true, "append port to baseURL")
+ cmd.Flags().BoolVar(&c.disableLiveReload, "disableLiveReload", false, "watch without enabling live browser reload on rebuild")
+ cmd.Flags().BoolVarP(&c.navigateToChanged, "navigateToChanged", "N", false, "navigate to changed content file on live browser reload")
+ cmd.Flags().BoolVarP(&c.openBrowser, "openBrowser", "O", false, "open the site in a browser after server startup")
+ cmd.Flags().BoolVar(&c.renderStaticToDisk, "renderStaticToDisk", false, "serve static files from disk and dynamic files from memory")
+ cmd.Flags().BoolVar(&c.disableFastRender, "disableFastRender", false, "enables full re-renders on changes")
+ cmd.Flags().BoolVar(&c.disableBrowserError, "disableBrowserError", false, "do not show build errors in the browser")
- if doLiveReload {
- mu.HandleFunc("/livereload.js", livereload.ServeJS)
- mu.HandleFunc("/livereload", livereload.Handler)
- }
- jww.FEEDBACK.Printf("Web Server is available at %s (bind address %s)\n", serverURL, s.serverInterface)
- go func() {
- err = http.ListenAndServe(endpoint, mu)
- if err != nil {
- c.logger.ERROR.Printf("Error: %s\n", err.Error())
- os.Exit(1)
+ r := cd.Root.Command.(*rootCommand)
+ applyLocalFlagsBuild(cmd, r)
+
+ return nil
+}
+
+func (c *serverCommand) PreRun(cd, runner *simplecobra.Commandeer) error {
+ c.r = cd.Root.Command.(*rootCommand)
+
+ c.hugoBuilder = newHugoBuilder(
+ c.r,
+ c,
+ func(reloaded bool) error {
+ if !reloaded {
+ if err := c.createServerPorts(cd); err != nil {
+ return err
+ }
+
+ if (c.tlsCertFile == "" || c.tlsKeyFile == "") && c.tlsAuto {
+ c.withConfE(func(conf *commonConfig) error {
+ return c.createCertificates(conf)
+ })
+ }
}
- }()
+
+ if err := c.setServerInfoInConfig(); err != nil {
+ return err
+ }
+
+ if !reloaded && c.fastRenderMode {
+ c.withConf(func(conf *commonConfig) {
+ conf.fs.PublishDir = hugofs.NewHashingFs(conf.fs.PublishDir, c.changeDetector)
+ conf.fs.PublishDirStatic = hugofs.NewHashingFs(conf.fs.PublishDirStatic, c.changeDetector)
+ })
+ }
+
+ return nil
+ },
+ )
+
+ destinationFlag := cd.CobraCommand.Flags().Lookup("destination")
+ if c.r.renderToMemory && (destinationFlag != nil && destinationFlag.Changed) {
+ return fmt.Errorf("cannot use --renderToMemory with --destination")
+ }
+ c.doLiveReload = !c.disableLiveReload
+ c.fastRenderMode = !c.disableFastRender
+ c.showErrorInBrowser = c.doLiveReload && !c.disableBrowserError
+
+ if c.fastRenderMode {
+ // For now, fast render mode only. It should, however, be fast enough
+ // for the full variant, too.
+ c.changeDetector = &fileChangeDetector{
+ // We use this detector to decide whether to do a hot reload of a single path or not.
+ // We need to filter out source maps and possibly some others to be able
+ // to make that decision.
+ irrelevantRe: regexp.MustCompile(`\.map$`),
+ }
+
+ c.changeDetector.PrepareNew()
+
}
- jww.FEEDBACK.Println("Press Ctrl+C to stop")
-
- if s.stop != nil {
- select {
- case <-sigs:
- case <-s.stop:
- }
- } else {
- <-sigs
+ err := c.loadConfig(cd, true)
+ if err != nil {
+ return err
}
return nil
}
+func (c *serverCommand) setServerInfoInConfig() error {
+ if len(c.serverPorts) == 0 {
+ panic("no server ports set")
+ }
+ return c.withConfE(func(conf *commonConfig) error {
+ for i, language := range conf.configs.LanguagesDefaultFirst {
+ isMultihost := conf.configs.IsMultihost
+ var serverPort int
+ if isMultihost {
+ serverPort = c.serverPorts[i].p
+ } else {
+ serverPort = c.serverPorts[0].p
+ }
+ langConfig := conf.configs.LanguageConfigMap[language.Lang]
+ baseURLStr, err := c.fixURL(langConfig.BaseURL, c.r.baseURL, serverPort)
+ if err != nil {
+ return err
+ }
+ baseURL, err := urls.NewBaseURLFromString(baseURLStr)
+ if err != nil {
+ return fmt.Errorf("failed to create baseURL from %q: %s", baseURLStr, err)
+ }
+
+ baseURLLiveReload := baseURL
+ if c.liveReloadPort != -1 {
+ baseURLLiveReload, _ = baseURLLiveReload.WithPort(c.liveReloadPort)
+ }
+ langConfig.C.SetServerInfo(baseURL, baseURLLiveReload, c.serverInterface)
+
+ }
+ return nil
+ })
+}
+
+func (c *serverCommand) getErrorWithContext() any {
+ buildErr := c.errState.buildErr()
+ if buildErr == nil {
+ return nil
+ }
+
+ m := make(map[string]any)
+
+ m["Error"] = cleanErrorLog(c.r.logger.Errors())
+
+ m["Version"] = hugo.BuildVersionString()
+ ferrors := herrors.UnwrapFileErrorsWithErrorContext(buildErr)
+ m["Files"] = ferrors
+
+ return m
+}
+
+func (c *serverCommand) createCertificates(conf *commonConfig) error {
+ hostname := "localhost"
+ if c.r.baseURL != "" {
+ u, err := url.Parse(c.r.baseURL)
+ if err != nil {
+ return err
+ }
+ hostname = u.Hostname()
+ }
+
+ // For now, store these in the Hugo cache dir.
+ // Hugo should probably introduce some concept of a less temporary application directory.
+ keyDir := filepath.Join(conf.configs.LoadingInfo.BaseConfig.CacheDir, "_mkcerts")
+
+ // Create the directory if it doesn't exist.
+ if _, err := os.Stat(keyDir); os.IsNotExist(err) {
+ if err := os.MkdirAll(keyDir, 0o777); err != nil {
+ return err
+ }
+ }
+
+ c.tlsCertFile = filepath.Join(keyDir, fmt.Sprintf("%s.pem", hostname))
+ c.tlsKeyFile = filepath.Join(keyDir, fmt.Sprintf("%s-key.pem", hostname))
+
+ // Check if the certificate already exists and is valid.
+ certPEM, err := os.ReadFile(c.tlsCertFile)
+ if err == nil {
+ rootPem, err := os.ReadFile(filepath.Join(mclib.GetCAROOT(), "rootCA.pem"))
+ if err == nil {
+ if err := c.verifyCert(rootPem, certPEM, hostname); err == nil {
+ c.r.Println("Using existing", c.tlsCertFile, "and", c.tlsKeyFile)
+ return nil
+ }
+ }
+ }
+
+ c.r.Println("Creating TLS certificates in", keyDir)
+
+ // Yes, this is unfortunate, but it's currently the only way to use Mkcert as a library.
+ os.Args = []string{"-cert-file", c.tlsCertFile, "-key-file", c.tlsKeyFile, hostname}
+ return mclib.RunMain()
+}
+
+func (c *serverCommand) verifyCert(rootPEM, certPEM []byte, name string) error {
+ roots := x509.NewCertPool()
+ ok := roots.AppendCertsFromPEM(rootPEM)
+ if !ok {
+ return fmt.Errorf("failed to parse root certificate")
+ }
+
+ block, _ := pem.Decode(certPEM)
+ if block == nil {
+ return fmt.Errorf("failed to parse certificate PEM")
+ }
+ cert, err := x509.ParseCertificate(block.Bytes)
+ if err != nil {
+ return fmt.Errorf("failed to parse certificate: %v", err.Error())
+ }
+
+ opts := x509.VerifyOptions{
+ DNSName: name,
+ Roots: roots,
+ }
+
+ if _, err := cert.Verify(opts); err != nil {
+ return fmt.Errorf("failed to verify certificate: %v", err.Error())
+ }
+
+ return nil
+}
+
+func (c *serverCommand) createServerPorts(cd *simplecobra.Commandeer) error {
+ flags := cd.CobraCommand.Flags()
+ var cerr error
+ c.withConf(func(conf *commonConfig) {
+ isMultihost := conf.configs.IsMultihost
+ c.serverPorts = make([]serverPortListener, 1)
+ if isMultihost {
+ if !c.serverAppend {
+ cerr = errors.New("--appendPort=false not supported when in multihost mode")
+ return
+ }
+ c.serverPorts = make([]serverPortListener, len(conf.configs.Languages))
+ }
+ currentServerPort := c.serverPort
+ for i := range c.serverPorts {
+ l, err := net.Listen("tcp", net.JoinHostPort(c.serverInterface, strconv.Itoa(currentServerPort)))
+ if err == nil {
+ c.serverPorts[i] = serverPortListener{ln: l, p: currentServerPort}
+ } else {
+ if i == 0 && flags.Changed("port") {
+ // port set explicitly by user -- he/she probably meant it!
+ cerr = fmt.Errorf("server startup failed: %s", err)
+ return
+ }
+ c.r.Println("port", currentServerPort, "already in use, attempting to use an available port")
+ l, sp, err := helpers.TCPListen()
+ if err != nil {
+ cerr = fmt.Errorf("unable to find alternative port to use: %s", err)
+ return
+ }
+ c.serverPorts[i] = serverPortListener{ln: l, p: sp.Port}
+ }
+
+ currentServerPort = c.serverPorts[i].p + 1
+ }
+ })
+
+ return cerr
+}
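+
+// As a rough illustration of the port selection above (hypothetical setup): with
+// three languages in multihost mode and the default port 1313, the server tries
+// 1313, 1314 and 1315 in order; any port that is already taken falls back to a
+// randomly chosen free port, while an explicitly set --port that is busy is
+// treated as a hard error instead.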
+
// fixURL massages the baseURL into a form needed for serving
// all pages correctly.
-func (sc *serverCmd) fixURL(cfg config.Provider, s string, port int) (string, error) {
+func (c *serverCommand) fixURL(baseURLFromConfig, baseURLFromFlag string, port int) (string, error) {
+ certsSet := (c.tlsCertFile != "" && c.tlsKeyFile != "") || c.tlsAuto
useLocalhost := false
- if s == "" {
- s = cfg.GetString("baseURL")
+ baseURL := baseURLFromFlag
+ if baseURL == "" {
+ baseURL = baseURLFromConfig
useLocalhost = true
}
- if !strings.HasSuffix(s, "/") {
- s = s + "/"
+ if !strings.HasSuffix(baseURL, "/") {
+ baseURL = baseURL + "/"
}
// do an initial parse of the input string
- u, err := url.Parse(s)
+ u, err := url.Parse(baseURL)
if err != nil {
return "", err
}
// if no Host is defined, then assume that no schema or double-slash were
// present in the url. Add a double-slash and make a best effort attempt.
- if u.Host == "" && s != "/" {
- s = "//" + s
+ if u.Host == "" && baseURL != "/" {
+ baseURL = "//" + baseURL
- u, err = url.Parse(s)
+ u, err = url.Parse(baseURL)
if err != nil {
return "", err
}
}
if useLocalhost {
- if u.Scheme == "https" {
+ if certsSet {
+ u.Scheme = "https"
+ } else if u.Scheme == "https" {
u.Scheme = "http"
}
u.Host = "localhost"
}
- if sc.serverAppend {
+ if c.serverAppend {
if strings.Contains(u.Host, ":") {
u.Host, _, err = net.SplitHostPort(u.Host)
if err != nil {
- return "", errors.Wrap(err, "Failed to split baseURL hostpost")
+ return "", fmt.Errorf("failed to split baseURL hostport: %w", err)
}
}
u.Host += fmt.Sprintf(":%d", port)
@@ -519,39 +838,420 @@ func (sc *serverCmd) fixURL(cfg config.Provider, s string, port int) (string, er
return u.String(), nil
}
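+
+// Example for fixURL above (hypothetical values, default --appendPort=true and no
+// TLS flags set): a config baseURL of "https://example.org/docs/" with no --baseURL
+// flag yields "http://localhost:1313/docs/" for port 1313, i.e. the host becomes
+// localhost, https is downgraded to http, and the port is appended.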
-func memStats() error {
- b := newCommandsBuilder()
- sc := b.newServerCmd().getCommand()
- memstats := sc.Flags().Lookup("memstats").Value.String()
- if memstats != "" {
- interval, err := time.ParseDuration(sc.Flags().Lookup("meminterval").Value.String())
- if err != nil {
- interval, _ = time.ParseDuration("100ms")
- }
+func (c *serverCommand) partialReRender(urls ...string) (err error) {
+ defer func() {
+ c.errState.setWasErr(false)
+ }()
+ visited := types.NewEvictingQueue[string](len(urls))
+ for _, url := range urls {
+ visited.Add(url)
+ }
- fileMemStats, err := os.Create(memstats)
+ var h *hugolib.HugoSites
+ h, err = c.hugo()
+ if err != nil {
+ return
+ }
+
+ // Note: We do not set NoBuildLock as the file lock is not acquired at this stage.
+ err = h.Build(hugolib.BuildCfg{NoBuildLock: false, RecentlyTouched: visited, PartialReRender: true, ErrRecovery: c.errState.wasErr()})
+
+ return
+}
+
+func (c *serverCommand) serve() error {
+ var (
+ baseURLs []urls.BaseURL
+ roots []string
+ h *hugolib.HugoSites
+ )
+ err := c.withConfE(func(conf *commonConfig) error {
+ isMultihost := conf.configs.IsMultihost
+ var err error
+ h, err = c.r.HugFromConfig(conf)
if err != nil {
return err
}
- fileMemStats.WriteString("# Time\tHeapSys\tHeapAlloc\tHeapIdle\tHeapReleased\n")
+ // We need the server to share the same logger as the Hugo build (for error counts etc.)
+ c.r.logger = h.Log
- go func() {
- var stats runtime.MemStats
-
- start := time.Now().UnixNano()
-
- for {
- runtime.ReadMemStats(&stats)
- if fileMemStats != nil {
- fileMemStats.WriteString(fmt.Sprintf("%d\t%d\t%d\t%d\t%d\n",
- (time.Now().UnixNano()-start)/1000000, stats.HeapSys, stats.HeapAlloc, stats.HeapIdle, stats.HeapReleased))
- time.Sleep(interval)
- } else {
- break
- }
+ if isMultihost {
+ for _, l := range conf.configs.ConfigLangs() {
+ baseURLs = append(baseURLs, l.BaseURL())
+ roots = append(roots, l.Language().Lang)
}
- }()
+ } else {
+ l := conf.configs.GetFirstLanguageConfig()
+ baseURLs = []urls.BaseURL{l.BaseURL()}
+ roots = []string{""}
+ }
+
+ return nil
+ })
+ if err != nil {
+ return err
}
- return nil
+
+ // Cache it here. The HugoSites object may be unavailable later on due to intermittent configuration errors.
+ // To allow the end user to change the error template while the server is running, we use
+ // the freshest template we can provide.
+ var (
+ errTempl *tplimpl.TemplInfo
+ templHandler *tplimpl.TemplateStore
+ )
+ getErrorTemplateAndHandler := func(h *hugolib.HugoSites) (*tplimpl.TemplInfo, *tplimpl.TemplateStore) {
+ if h == nil {
+ return errTempl, templHandler
+ }
+ templHandler := h.GetTemplateStore()
+ errTempl := templHandler.LookupByPath("/_server/error.html")
+ if errTempl == nil {
+ panic("template server/error.html not found")
+ }
+ return errTempl, templHandler
+ }
+ errTempl, templHandler = getErrorTemplateAndHandler(h)
+
+ srv := &fileServer{
+ baseURLs: baseURLs,
+ roots: roots,
+ c: c,
+ errorTemplate: func(ctx any) (io.Reader, error) {
+ // hugoTry does not block, getErrorTemplateAndHandler will fall back
+ // to cached values if nil.
+ templ, handler := getErrorTemplateAndHandler(c.hugoTry())
+ b := &bytes.Buffer{}
+ err := handler.ExecuteWithContext(context.Background(), templ, b, ctx)
+ return b, err
+ },
+ }
+
+ doLiveReload := !c.disableLiveReload
+
+ if doLiveReload {
+ livereload.Initialize()
+ }
+
+ sigs := make(chan os.Signal, 1)
+ signal.Notify(sigs, syscall.SIGINT, syscall.SIGTERM)
+ var servers []*http.Server
+
+ wg1, ctx := errgroup.WithContext(context.Background())
+
+ for i := range baseURLs {
+ mu, listener, serverURL, endpoint, err := srv.createEndpoint(i)
+ var srv *http.Server
+ if c.tlsCertFile != "" && c.tlsKeyFile != "" {
+ srv = &http.Server{
+ Addr: endpoint,
+ Handler: mu,
+ TLSConfig: &tls.Config{
+ MinVersion: tls.VersionTLS12,
+ },
+ }
+ } else {
+ srv = &http.Server{
+ Addr: endpoint,
+ Handler: mu,
+ }
+ }
+
+ servers = append(servers, srv)
+
+ if doLiveReload {
+ baseURL := baseURLs[i]
+ mu.HandleFunc(baseURL.Path()+"livereload.js", livereload.ServeJS)
+ mu.HandleFunc(baseURL.Path()+"livereload", livereload.Handler)
+ }
+ c.r.Printf("Web Server is available at %s (bind address %s) %s\n", serverURL, c.serverInterface, roots[i])
+ wg1.Go(func() error {
+ if c.tlsCertFile != "" && c.tlsKeyFile != "" {
+ err = srv.ServeTLS(listener, c.tlsCertFile, c.tlsKeyFile)
+ } else {
+ err = srv.Serve(listener)
+ }
+ if err != nil && err != http.ErrServerClosed {
+ return err
+ }
+ return nil
+ })
+ }
+
+ if c.r.IsTestRun() {
+ // Write a .ready file to disk to signal ready status.
+ // This is where the test is run from.
+ var baseURLs []string
+ for _, baseURL := range srv.baseURLs {
+ baseURLs = append(baseURLs, baseURL.String())
+ }
+ testInfo := map[string]any{
+ "baseURLs": baseURLs,
+ }
+
+ dir := os.Getenv("WORK")
+ if dir != "" {
+ readyFile := filepath.Join(dir, ".ready")
+ // encode the test info as JSON into the .ready file.
+ b, err := json.Marshal(testInfo)
+ if err != nil {
+ return err
+ }
+ err = os.WriteFile(readyFile, b, 0o777)
+ if err != nil {
+ return err
+ }
+ }
+
+ }
+
+ c.r.Println("Press Ctrl+C to stop")
+
+ if c.openBrowser {
+ // There may be more than one baseURL in multihost mode; open the first.
+ if err := browser.OpenURL(baseURLs[0].String()); err != nil {
+ c.r.logger.Warnf("Failed to open browser: %s", err)
+ }
+ }
+
+ err = func() error {
+ for {
+ select {
+ case <-c.quit:
+ return nil
+ case <-sigs:
+ return nil
+ case <-ctx.Done():
+ return ctx.Err()
+ }
+ }
+ }()
+ if err != nil {
+ c.r.Println("Error:", err)
+ }
+
+ ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
+ defer cancel()
+ wg2, ctx := errgroup.WithContext(ctx)
+ for _, srv := range servers {
+ srv := srv
+ wg2.Go(func() error {
+ return srv.Shutdown(ctx)
+ })
+ }
+
+ err1, err2 := wg1.Wait(), wg2.Wait()
+ if err1 != nil {
+ return err1
+ }
+ return err2
+}
+
+type serverPortListener struct {
+ p int
+ ln net.Listener
+}
+
+type staticSyncer struct {
+ c *hugoBuilder
+}
+
+func (s *staticSyncer) isStatic(h *hugolib.HugoSites, filename string) bool {
+ return h.BaseFs.SourceFilesystems.IsStatic(filename)
+}
+
+func (s *staticSyncer) syncsStaticEvents(staticEvents []fsnotify.Event) error {
+ c := s.c
+
+ syncFn := func(sourceFs *filesystems.SourceFilesystem) (uint64, error) {
+ publishDir := helpers.FilePathSeparator
+
+ if sourceFs.PublishFolder != "" {
+ publishDir = filepath.Join(publishDir, sourceFs.PublishFolder)
+ }
+
+ syncer := fsync.NewSyncer()
+ c.withConf(func(conf *commonConfig) {
+ syncer.NoTimes = conf.configs.Base.NoTimes
+ syncer.NoChmod = conf.configs.Base.NoChmod
+ syncer.ChmodFilter = chmodFilter
+ syncer.SrcFs = sourceFs.Fs
+ syncer.DestFs = conf.fs.PublishDir
+ if c.s != nil && c.s.renderStaticToDisk {
+ syncer.DestFs = conf.fs.PublishDirStatic
+ }
+ })
+
+ logger := s.c.r.logger
+
+ for _, ev := range staticEvents {
+ // Due to our approach of layering both directories and the content's rendered output
+ // into one, we can't accurately remove a file that is not in one of the source directories.
+ // If a file is in the local static dir and also in the theme static dir and we remove
+ // it from one of those locations, we expect it to still exist in the destination.
+ //
+ // If Hugo generates a file (from the content dir) over a static file,
+ // the content-generated file should take precedence.
+ //
+ // Because we are now watching and handling individual events, it is possible that a static
+ // event that occupies the same path as a content-generated file will take precedence
+ // until a regeneration of the content takes place.
+ //
+ // Hugo assumes that these cases are very rare and will permit this bad behavior.
+ // The alternative is to track every single file and which pipeline rendered it,
+ // and then to handle conflict resolution on every event.
+
+ fromPath := ev.Name
+
+ relPath, found := sourceFs.MakePathRelative(fromPath, true)
+
+ if !found {
+ // Not member of this virtual host.
+ continue
+ }
+
+ // Remove || rename is harder and will require an assumption.
+ // Hugo takes the following approach:
+ // If the static file exists in any of the static source directories after this event
+ // Hugo will re-sync it.
+ // If it does not exist in all of the static directories Hugo will remove it.
+ //
+ // This assumes that Hugo has not generated content on top of a static file and then removed
+ // the source of that static file. In this case Hugo will incorrectly remove that file
+ // from the published directory.
+ if ev.Op&fsnotify.Rename == fsnotify.Rename || ev.Op&fsnotify.Remove == fsnotify.Remove {
+ if _, err := sourceFs.Fs.Stat(relPath); herrors.IsNotExist(err) {
+ // If file doesn't exist in any static dir, remove it
+ logger.Println("File no longer exists in static dir, removing", relPath)
+ c.withConf(func(conf *commonConfig) {
+ _ = conf.fs.PublishDirStatic.RemoveAll(relPath)
+ })
+
+ } else if err == nil {
+ // If file still exists, sync it
+ logger.Println("Syncing", relPath, "to", publishDir)
+
+ if err := syncer.Sync(relPath, relPath); err != nil {
+ c.r.logger.Errorln(err)
+ }
+ } else {
+ c.r.logger.Errorln(err)
+ }
+
+ continue
+ }
+
+ // For all other event operations Hugo will sync static.
+ logger.Println("Syncing", relPath, "to", publishDir)
+ if err := syncer.Sync(filepath.Join(publishDir, relPath), relPath); err != nil {
+ c.r.logger.Errorln(err)
+ }
+ }
+
+ return 0, nil
+ }
+
+ _, err := c.doWithPublishDirs(syncFn)
+ return err
+}
+
+func chmodFilter(dst, src os.FileInfo) bool {
+ // Hugo publishes data from multiple sources, potentially
+ // with overlapping directory structures. We cannot sync permissions
+ // for directories as that would mean that we might end up with write-protected
+ // directories inside /public.
+ // One example of this would be syncing from the Go Module cache,
+ // which has 0555 directories.
+ return src.IsDir()
+}
+
+func cleanErrorLog(content string) string {
+ content = strings.ReplaceAll(content, "\n", " ")
+ content = logReplacer.Replace(content)
+ content = logDuplicateTemplateExecuteRe.ReplaceAllString(content, "")
+ content = logDuplicateTemplateParseRe.ReplaceAllString(content, "")
+ seen := make(map[string]bool)
+ parts := strings.Split(content, ": ")
+ keep := make([]string, 0, len(parts))
+ for _, part := range parts {
+ if seen[part] {
+ continue
+ }
+ seen[part] = true
+ keep = append(keep, part)
+ }
+ return strings.Join(keep, ": ")
+}
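+
+// For example (hypothetical input), "error: something went wrong: something went wrong"
+// collapses to "error: something went wrong": the log is flattened to a single line
+// and duplicate ": "-separated segments are dropped.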
+
+func injectLiveReloadScript(src io.Reader, baseURL *url.URL) string {
+ var b bytes.Buffer
+ chain := transform.Chain{livereloadinject.New(baseURL)}
+ chain.Apply(&b, src)
+
+ return b.String()
+}
+
+func partitionDynamicEvents(sourceFs *filesystems.SourceFilesystems, events []fsnotify.Event) (de dynamicEvents) {
+ for _, e := range events {
+ if !sourceFs.IsContent(e.Name) {
+ de.AssetEvents = append(de.AssetEvents, e)
+ } else {
+ de.ContentEvents = append(de.ContentEvents, e)
+ }
+ }
+ return
+}
+
+func pickOneWriteOrCreatePath(contentTypes config.ContentTypesProvider, events []fsnotify.Event) string {
+ name := ""
+
+ for _, ev := range events {
+ if ev.Op&fsnotify.Write == fsnotify.Write || ev.Op&fsnotify.Create == fsnotify.Create {
+ if contentTypes.IsIndexContentFile(ev.Name) {
+ return ev.Name
+ }
+
+ if contentTypes.IsContentFile(ev.Name) {
+ name = ev.Name
+ }
+
+ }
+ }
+
+ return name
+}
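+
+// For example (hypothetical paths), given write events for both
+// "content/post/_index.md" and "content/post/other.md", the index content file
+// wins immediately; if only regular content files changed, the last one seen is
+// returned, and non-content files yield "".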
+
+func formatByteCount(b uint64) string {
+ const unit = 1000
+ if b < unit {
+ return fmt.Sprintf("%d B", b)
+ }
+ div, exp := int64(unit), 0
+ for n := b / unit; n >= unit; n /= unit {
+ div *= unit
+ exp++
+ }
+ return fmt.Sprintf("%.1f %cB",
+ float64(b)/float64(div), "kMGTPE"[exp])
+}
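+
+// For example, formatByteCount(512) returns "512 B" and formatByteCount(1_500_000)
+// returns "1.5 MB" (decimal units, so 1000-based rather than 1024-based).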
+
+func canRedirect(requestURIWithoutQuery string, r *http.Request) bool {
+ if r.Header.Get(hugoHeaderRedirect) != "" {
+ return false
+ }
+ return isNavigation(requestURIWithoutQuery, r)
+}
+
+// Sec-Fetch-Mode should be sent by all recent browser versions, see https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Sec-Fetch-Mode#navigate
+// Fall back to the file extension if not set.
+// The point is that we don't want CSS/JS files etc. to take part in this logic.
+func isNavigation(requestURIWithoutQuery string, r *http.Request) bool {
+ return r.Header.Get("Sec-Fetch-Mode") == "navigate" || isPropablyHTMLRequest(requestURIWithoutQuery)
+}
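+
+// For example (hypothetical URIs): "/blog/", "/about.html" and the extension-less
+// "/docs/intro" count as navigations, while "/css/main.css" does not unless the
+// browser explicitly sent "Sec-Fetch-Mode: navigate".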
+
+func isPropablyHTMLRequest(requestURIWithoutQuery string) bool {
+ if strings.HasSuffix(requestURIWithoutQuery, "/") || strings.HasSuffix(requestURIWithoutQuery, "html") || strings.HasSuffix(requestURIWithoutQuery, "htm") {
+ return true
+ }
+ return !strings.Contains(requestURIWithoutQuery, ".")
}
diff --git a/commands/server_errors.go b/commands/server_errors.go
deleted file mode 100644
index 9f13c9d8c..000000000
--- a/commands/server_errors.go
+++ /dev/null
@@ -1,95 +0,0 @@
-// Copyright 2018 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package commands
-
-import (
- "bytes"
- "io"
-
- "github.com/gohugoio/hugo/transform"
- "github.com/gohugoio/hugo/transform/livereloadinject"
-)
-
-var buildErrorTemplate = `
-
-
-
- Hugo Server: Error
-
-
-
-
- {{ highlight .Error "apl" "noclasses=true,style=monokai" }}
- {{ with .File }}
- {{ $params := printf "noclasses=true,style=monokai,linenos=table,hl_lines=%d,linenostart=%d" (add .LinesPos 1) (sub .Position.LineNumber .LinesPos) }}
- {{ $lexer := .ChromaLexer | default "go-html-template" }}
- {{ highlight (delimit .Lines "\n") $lexer $params }}
- {{ end }}
- {{ with .StackTrace }}
- {{ highlight . "apl" "noclasses=true,style=monokai" }}
- {{ end }}
-
- {{ .Version }}
- Reload Page
-
-
-
-`
-
-func injectLiveReloadScript(src io.Reader, port int) string {
- var b bytes.Buffer
- chain := transform.Chain{livereloadinject.New(port)}
- chain.Apply(&b, src)
-
- return b.String()
-}
diff --git a/commands/server_test.go b/commands/server_test.go
deleted file mode 100644
index 8bd96c6ec..000000000
--- a/commands/server_test.go
+++ /dev/null
@@ -1,134 +0,0 @@
-// Copyright 2015 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package commands
-
-import (
- "fmt"
- "net/http"
- "os"
- "runtime"
- "strings"
- "testing"
- "time"
-
- "github.com/gohugoio/hugo/helpers"
-
- qt "github.com/frankban/quicktest"
- "github.com/spf13/viper"
-)
-
-func TestServer(t *testing.T) {
- if isWindowsCI() {
- // TODO(bep) not sure why server tests have started to fail on the Windows CI server.
- t.Skip("Skip server test on appveyor")
- }
- c := qt.New(t)
- dir, err := createSimpleTestSite(t, testSiteConfig{})
- c.Assert(err, qt.IsNil)
-
- // Let us hope that this port is available on all systems ...
- port := 1331
-
- defer func() {
- os.RemoveAll(dir)
- }()
-
- stop := make(chan bool)
-
- b := newCommandsBuilder()
- scmd := b.newServerCmdSignaled(stop)
-
- cmd := scmd.getCommand()
- cmd.SetArgs([]string{"-s=" + dir, fmt.Sprintf("-p=%d", port)})
-
- go func() {
- _, err = cmd.ExecuteC()
- c.Assert(err, qt.IsNil)
- }()
-
- // There is no way to know exactly when the server is ready for connections.
- // We could improve by something like https://golang.org/pkg/net/http/httptest/#Server
- // But for now, let us sleep and pray!
- time.Sleep(2 * time.Second)
-
- resp, err := http.Get("http://localhost:1331/")
- c.Assert(err, qt.IsNil)
- defer resp.Body.Close()
- homeContent := helpers.ReaderToString(resp.Body)
-
- c.Assert(homeContent, qt.Contains, "List: Hugo Commands")
- c.Assert(homeContent, qt.Contains, "Environment: development")
-
- // Stop the server.
- stop <- true
-
-}
-
-func TestFixURL(t *testing.T) {
- type data struct {
- TestName string
- CLIBaseURL string
- CfgBaseURL string
- AppendPort bool
- Port int
- Result string
- }
- tests := []data{
- {"Basic http localhost", "", "http://foo.com", true, 1313, "http://localhost:1313/"},
- {"Basic https production, http localhost", "", "https://foo.com", true, 1313, "http://localhost:1313/"},
- {"Basic subdir", "", "http://foo.com/bar", true, 1313, "http://localhost:1313/bar/"},
- {"Basic production", "http://foo.com", "http://foo.com", false, 80, "http://foo.com/"},
- {"Production subdir", "http://foo.com/bar", "http://foo.com/bar", false, 80, "http://foo.com/bar/"},
- {"No http", "", "foo.com", true, 1313, "//localhost:1313/"},
- {"Override configured port", "", "foo.com:2020", true, 1313, "//localhost:1313/"},
- {"No http production", "foo.com", "foo.com", false, 80, "//foo.com/"},
- {"No http production with port", "foo.com", "foo.com", true, 2020, "//foo.com:2020/"},
- {"No config", "", "", true, 1313, "//localhost:1313/"},
- }
-
- for _, test := range tests {
- t.Run(test.TestName, func(t *testing.T) {
- b := newCommandsBuilder()
- s := b.newServerCmd()
- v := viper.New()
- baseURL := test.CLIBaseURL
- v.Set("baseURL", test.CfgBaseURL)
- s.serverAppend = test.AppendPort
- s.serverPort = test.Port
- result, err := s.fixURL(v, baseURL, s.serverPort)
- if err != nil {
- t.Errorf("Unexpected error %s", err)
- }
- if result != test.Result {
- t.Errorf("Expected %q, got %q", test.Result, result)
- }
- })
- }
-}
-
-func TestRemoveErrorPrefixFromLog(t *testing.T) {
- c := qt.New(t)
- content := `ERROR 2018/10/07 13:11:12 Error while rendering "home": template: _default/baseof.html:4:3: executing "main" at : error calling partial: template: partials/logo.html:5:84: executing "partials/logo.html" at <$resized.AHeight>: can't evaluate field AHeight in type *resource.Image
-ERROR 2018/10/07 13:11:12 Rebuild failed: logged 1 error(s)
-`
-
- withoutError := removeErrorPrefixFromLog(content)
-
- c.Assert(strings.Contains(withoutError, "ERROR"), qt.Equals, false)
-
-}
-
-func isWindowsCI() bool {
- return runtime.GOOS == "windows" && os.Getenv("CI") != ""
-}
diff --git a/commands/static_syncer.go b/commands/static_syncer.go
deleted file mode 100644
index 62ef28b2c..000000000
--- a/commands/static_syncer.go
+++ /dev/null
@@ -1,132 +0,0 @@
-// Copyright 2017 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package commands
-
-import (
- "os"
- "path/filepath"
-
- "github.com/gohugoio/hugo/hugolib/filesystems"
-
- "github.com/fsnotify/fsnotify"
- "github.com/gohugoio/hugo/helpers"
- "github.com/spf13/fsync"
-)
-
-type staticSyncer struct {
- c *commandeer
-}
-
-func newStaticSyncer(c *commandeer) (*staticSyncer, error) {
- return &staticSyncer{c: c}, nil
-}
-
-func (s *staticSyncer) isStatic(filename string) bool {
- return s.c.hugo().BaseFs.SourceFilesystems.IsStatic(filename)
-}
-
-func (s *staticSyncer) syncsStaticEvents(staticEvents []fsnotify.Event) error {
- c := s.c
-
- syncFn := func(sourceFs *filesystems.SourceFilesystem) (uint64, error) {
- publishDir := c.hugo().PathSpec.PublishDir
- // If root, remove the second '/'
- if publishDir == "//" {
- publishDir = helpers.FilePathSeparator
- }
-
- if sourceFs.PublishFolder != "" {
- publishDir = filepath.Join(publishDir, sourceFs.PublishFolder)
- }
-
- syncer := fsync.NewSyncer()
- syncer.NoTimes = c.Cfg.GetBool("noTimes")
- syncer.NoChmod = c.Cfg.GetBool("noChmod")
- syncer.ChmodFilter = chmodFilter
- syncer.SrcFs = sourceFs.Fs
- syncer.DestFs = c.Fs.Destination
-
- // prevent spamming the log on changes
- logger := helpers.NewDistinctFeedbackLogger()
-
- for _, ev := range staticEvents {
- // Due to our approach of layering both directories and the content's rendered output
- // into one we can't accurately remove a file not in one of the source directories.
- // If a file is in the local static dir and also in the theme static dir and we remove
- // it from one of those locations we expect it to still exist in the destination
- //
- // If Hugo generates a file (from the content dir) over a static file
- // the content generated file should take precedence.
- //
- // Because we are now watching and handling individual events it is possible that a static
- // event that occupies the same path as a content generated file will take precedence
- // until a regeneration of the content takes places.
- //
- // Hugo assumes that these cases are very rare and will permit this bad behavior
- // The alternative is to track every single file and which pipeline rendered it
- // and then to handle conflict resolution on every event.
-
- fromPath := ev.Name
-
- relPath := sourceFs.MakePathRelative(fromPath)
-
- if relPath == "" {
- // Not member of this virtual host.
- continue
- }
-
- // Remove || rename is harder and will require an assumption.
- // Hugo takes the following approach:
- // If the static file exists in any of the static source directories after this event
- // Hugo will re-sync it.
- // If it does not exist in all of the static directories Hugo will remove it.
- //
- // This assumes that Hugo has not generated content on top of a static file and then removed
- // the source of that static file. In this case Hugo will incorrectly remove that file
- // from the published directory.
- if ev.Op&fsnotify.Rename == fsnotify.Rename || ev.Op&fsnotify.Remove == fsnotify.Remove {
- if _, err := sourceFs.Fs.Stat(relPath); os.IsNotExist(err) {
- // If file doesn't exist in any static dir, remove it
- toRemove := filepath.Join(publishDir, relPath)
-
- logger.Println("File no longer exists in static dir, removing", toRemove)
- _ = c.Fs.Destination.RemoveAll(toRemove)
- } else if err == nil {
- // If file still exists, sync it
- logger.Println("Syncing", relPath, "to", publishDir)
-
- if err := syncer.Sync(filepath.Join(publishDir, relPath), relPath); err != nil {
- c.logger.ERROR.Println(err)
- }
- } else {
- c.logger.ERROR.Println(err)
- }
-
- continue
- }
-
- // For all other event operations Hugo will sync static.
- logger.Println("Syncing", relPath, "to", publishDir)
- if err := syncer.Sync(filepath.Join(publishDir, relPath), relPath); err != nil {
- c.logger.ERROR.Println(err)
- }
- }
-
- return 0, nil
- }
-
- _, err := c.doWithPublishDirs(syncFn)
- return err
-
-}
diff --git a/commands/version.go b/commands/version.go
deleted file mode 100644
index 287950a2d..000000000
--- a/commands/version.go
+++ /dev/null
@@ -1,44 +0,0 @@
-// Copyright 2015 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package commands
-
-import (
- "github.com/gohugoio/hugo/common/hugo"
- "github.com/spf13/cobra"
- jww "github.com/spf13/jwalterweatherman"
-)
-
-var _ cmder = (*versionCmd)(nil)
-
-type versionCmd struct {
- *baseCmd
-}
-
-func newVersionCmd() *versionCmd {
- return &versionCmd{
- newBaseCmd(&cobra.Command{
- Use: "version",
- Short: "Print the version number of Hugo",
- Long: `All software has versions. This is Hugo's.`,
- RunE: func(cmd *cobra.Command, args []string) error {
- printHugoVersion()
- return nil
- },
- }),
- }
-}
-
-func printHugoVersion() {
- jww.FEEDBACK.Println(hugo.BuildVersionString())
-}
diff --git a/common/collections/append.go b/common/collections/append.go
index ee15fef7d..db9db8bf3 100644
--- a/common/collections/append.go
+++ b/common/collections/append.go
@@ -21,38 +21,73 @@ import (
// Append appends from to a slice to and returns the resulting slice.
// If length of from is one and the only element is a slice of same type as to,
// it will be appended.
-func Append(to interface{}, from ...interface{}) (interface{}, error) {
+func Append(to any, from ...any) (any, error) {
+ if len(from) == 0 {
+ return to, nil
+ }
tov, toIsNil := indirect(reflect.ValueOf(to))
toIsNil = toIsNil || to == nil
var tot reflect.Type
if !toIsNil {
+ if tov.Kind() == reflect.Slice {
+ // Create a copy of tov, so we don't modify the original.
+ c := reflect.MakeSlice(tov.Type(), tov.Len(), tov.Len()+len(from))
+ reflect.Copy(c, tov)
+ tov = c
+ }
+
if tov.Kind() != reflect.Slice {
return nil, fmt.Errorf("expected a slice, got %T", to)
}
tot = tov.Type().Elem()
+ if tot.Kind() == reflect.Slice {
+ totvt := tot.Elem()
+ fromvs := make([]reflect.Value, len(from))
+ for i, f := range from {
+ fromv := reflect.ValueOf(f)
+ fromt := fromv.Type()
+ if fromt.Kind() == reflect.Slice {
+ fromt = fromt.Elem()
+ }
+ if totvt != fromt {
+ return nil, fmt.Errorf("cannot append slice of %s to slice of %s", fromt, totvt)
+ } else {
+ fromvs[i] = fromv
+ }
+ }
+ return reflect.Append(tov, fromvs...).Interface(), nil
+
+ }
+
toIsNil = tov.Len() == 0
if len(from) == 1 {
fromv := reflect.ValueOf(from[0])
+ if !fromv.IsValid() {
+ // from[0] is nil
+ return appendToInterfaceSliceFromValues(tov, fromv)
+ }
+ fromt := fromv.Type()
+ if fromt.Kind() == reflect.Slice {
+ fromt = fromt.Elem()
+ }
if fromv.Kind() == reflect.Slice {
if toIsNil {
// If we get nil []string, we just return the []string
return from[0], nil
}
- fromt := reflect.TypeOf(from[0]).Elem()
-
// If we get []string []string, we append the from slice to to
if tot == fromt {
return reflect.AppendSlice(tov, fromv).Interface(), nil
} else if !fromt.AssignableTo(tot) {
// Fall back to a []interface{} slice.
return appendToInterfaceSliceFromValues(tov, fromv)
-
}
+
}
}
}
@@ -63,8 +98,9 @@ func Append(to interface{}, from ...interface{}) (interface{}, error) {
for _, f := range from {
fv := reflect.ValueOf(f)
- if !fv.Type().AssignableTo(tot) {
+ if !fv.IsValid() || !fv.Type().AssignableTo(tot) {
// Fall back to a []interface{} slice.
+ tov, _ := indirect(reflect.ValueOf(to))
return appendToInterfaceSlice(tov, from...)
}
tov = reflect.Append(tov, fv)
@@ -73,11 +109,15 @@ func Append(to interface{}, from ...interface{}) (interface{}, error) {
return tov.Interface(), nil
}
-func appendToInterfaceSliceFromValues(slice1, slice2 reflect.Value) ([]interface{}, error) {
- var tos []interface{}
+func appendToInterfaceSliceFromValues(slice1, slice2 reflect.Value) ([]any, error) {
+ var tos []any
for _, slice := range []reflect.Value{slice1, slice2} {
- for i := 0; i < slice.Len(); i++ {
+ if !slice.IsValid() {
+ tos = append(tos, nil)
+ continue
+ }
+ for i := range slice.Len() {
tos = append(tos, slice.Index(i).Interface())
}
}
@@ -85,10 +125,10 @@ func appendToInterfaceSliceFromValues(slice1, slice2 reflect.Value) ([]interface
return tos, nil
}
-func appendToInterfaceSlice(tov reflect.Value, from ...interface{}) ([]interface{}, error) {
- var tos []interface{}
+func appendToInterfaceSlice(tov reflect.Value, from ...any) ([]any, error) {
+ var tos []any
- for i := 0; i < tov.Len(); i++ {
+ for i := range tov.Len() {
tos = append(tos, tov.Index(i).Interface())
}
diff --git a/common/collections/append_test.go b/common/collections/append_test.go
index 8c9a6e73f..62d9015ce 100644
--- a/common/collections/append_test.go
+++ b/common/collections/append_test.go
@@ -14,6 +14,8 @@
package collections
import (
+ "html/template"
+ "reflect"
"testing"
qt "github.com/frankban/quicktest"
@@ -23,39 +25,60 @@ func TestAppend(t *testing.T) {
t.Parallel()
c := qt.New(t)
- for _, test := range []struct {
- start interface{}
- addend []interface{}
- expected interface{}
+ for i, test := range []struct {
+ start any
+ addend []any
+ expected any
}{
- {[]string{"a", "b"}, []interface{}{"c"}, []string{"a", "b", "c"}},
- {[]string{"a", "b"}, []interface{}{"c", "d", "e"}, []string{"a", "b", "c", "d", "e"}},
- {[]string{"a", "b"}, []interface{}{[]string{"c", "d", "e"}}, []string{"a", "b", "c", "d", "e"}},
- {nil, []interface{}{"a", "b"}, []string{"a", "b"}},
- {nil, []interface{}{nil}, []interface{}{nil}},
- {[]interface{}{}, []interface{}{[]string{"c", "d", "e"}}, []string{"c", "d", "e"}},
- {tstSlicers{&tstSlicer{"a"}, &tstSlicer{"b"}},
- []interface{}{&tstSlicer{"c"}},
- tstSlicers{&tstSlicer{"a"}, &tstSlicer{"b"}, &tstSlicer{"c"}}},
- {&tstSlicers{&tstSlicer{"a"}, &tstSlicer{"b"}},
- []interface{}{&tstSlicer{"c"}},
- tstSlicers{&tstSlicer{"a"},
+ {[]string{"a", "b"}, []any{"c"}, []string{"a", "b", "c"}},
+ {[]string{"a", "b"}, []any{"c", "d", "e"}, []string{"a", "b", "c", "d", "e"}},
+ {[]string{"a", "b"}, []any{[]string{"c", "d", "e"}}, []string{"a", "b", "c", "d", "e"}},
+ {[]string{"a"}, []any{"b", template.HTML("c")}, []any{"a", "b", template.HTML("c")}},
+ {nil, []any{"a", "b"}, []string{"a", "b"}},
+ {nil, []any{nil}, []any{nil}},
+ {[]any{}, []any{[]string{"c", "d", "e"}}, []string{"c", "d", "e"}},
+ {
+ tstSlicers{&tstSlicer{"a"}, &tstSlicer{"b"}},
+ []any{&tstSlicer{"c"}},
+ tstSlicers{&tstSlicer{"a"}, &tstSlicer{"b"}, &tstSlicer{"c"}},
+ },
+ {
+ &tstSlicers{&tstSlicer{"a"}, &tstSlicer{"b"}},
+ []any{&tstSlicer{"c"}},
+ tstSlicers{
+ &tstSlicer{"a"},
&tstSlicer{"b"},
- &tstSlicer{"c"}}},
- {testSlicerInterfaces{&tstSlicerIn1{"a"}, &tstSlicerIn1{"b"}},
- []interface{}{&tstSlicerIn1{"c"}},
- testSlicerInterfaces{&tstSlicerIn1{"a"}, &tstSlicerIn1{"b"}, &tstSlicerIn1{"c"}}},
- //https://github.com/gohugoio/hugo/issues/5361
- {[]string{"a", "b"}, []interface{}{tstSlicers{&tstSlicer{"a"}, &tstSlicer{"b"}}},
- []interface{}{"a", "b", &tstSlicer{"a"}, &tstSlicer{"b"}}},
- {[]string{"a", "b"}, []interface{}{&tstSlicer{"a"}},
- []interface{}{"a", "b", &tstSlicer{"a"}}},
+ &tstSlicer{"c"},
+ },
+ },
+ {
+ testSlicerInterfaces{&tstSlicerIn1{"a"}, &tstSlicerIn1{"b"}},
+ []any{&tstSlicerIn1{"c"}},
+ testSlicerInterfaces{&tstSlicerIn1{"a"}, &tstSlicerIn1{"b"}, &tstSlicerIn1{"c"}},
+ },
+ // https://github.com/gohugoio/hugo/issues/5361
+ {
+ []string{"a", "b"},
+ []any{tstSlicers{&tstSlicer{"a"}, &tstSlicer{"b"}}},
+ []any{"a", "b", &tstSlicer{"a"}, &tstSlicer{"b"}},
+ },
+ {
+ []string{"a", "b"},
+ []any{&tstSlicer{"a"}},
+ []any{"a", "b", &tstSlicer{"a"}},
+ },
// Errors
- {"", []interface{}{[]string{"a", "b"}}, false},
+ {"", []any{[]string{"a", "b"}}, false},
// No string concatenation.
- {"ab",
- []interface{}{"c"},
- false},
+ {
+ "ab",
+ []any{"c"},
+ false,
+ },
+ {[]string{"a", "b"}, []any{nil}, []any{"a", "b", nil}},
+ {[]string{"a", "b"}, []any{nil, "d", nil}, []any{"a", "b", nil, "d", nil}},
+ {[]any{"a", nil, "c"}, []any{"d", nil, "f"}, []any{"a", nil, "c", "d", nil, "f"}},
+ {[]string{"a", "b"}, []any{}, []string{"a", "b"}},
} {
result, err := Append(test.start, test.addend...)
@@ -67,7 +90,124 @@ func TestAppend(t *testing.T) {
}
c.Assert(err, qt.IsNil)
- c.Assert(result, qt.DeepEquals, test.expected)
+ c.Assert(result, qt.DeepEquals, test.expected, qt.Commentf("test: [%d] %v", i, test))
+ }
+}
+
+// #11093
+func TestAppendToMultiDimensionalSlice(t *testing.T) {
+ t.Parallel()
+ c := qt.New(t)
+
+ for _, test := range []struct {
+ to any
+ from []any
+ expected any
+ }{
+ {
+ [][]string{{"a", "b"}},
+ []any{[]string{"c", "d"}},
+ [][]string{
+ {"a", "b"},
+ {"c", "d"},
+ },
+ },
+ {
+ [][]string{{"a", "b"}},
+ []any{[]string{"c", "d"}, []string{"e", "f"}},
+ [][]string{
+ {"a", "b"},
+ {"c", "d"},
+ {"e", "f"},
+ },
+ },
+ {
+ [][]string{{"a", "b"}},
+ []any{[]int{1, 2}},
+ false,
+ },
+ } {
+ result, err := Append(test.to, test.from...)
+ if b, ok := test.expected.(bool); ok && !b {
+ c.Assert(err, qt.Not(qt.IsNil))
+ } else {
+ c.Assert(err, qt.IsNil)
+ c.Assert(result, qt.DeepEquals, test.expected)
+ }
+ }
+}
+
+func TestAppendShouldMakeACopyOfTheInputSlice(t *testing.T) {
+ t.Parallel()
+ c := qt.New(t)
+ slice := make([]string, 0, 100)
+ slice = append(slice, "a", "b")
+ result, err := Append(slice, "c")
+ c.Assert(err, qt.IsNil)
+ slice[0] = "d"
+ c.Assert(result, qt.DeepEquals, []string{"a", "b", "c"})
+ c.Assert(slice, qt.DeepEquals, []string{"d", "b"})
+}
+
+func TestIndirect(t *testing.T) {
+ t.Parallel()
+ c := qt.New(t)
+
+ type testStruct struct {
+ Field string
}
+ var (
+ nilPtr *testStruct
+ nilIface interface{} = nil
+ nonNilIface interface{} = &testStruct{Field: "hello"}
+ )
+
+ tests := []struct {
+ name string
+ input any
+ wantKind reflect.Kind
+ wantNil bool
+ }{
+ {
+ name: "nil pointer",
+ input: nilPtr,
+ wantKind: reflect.Ptr,
+ wantNil: true,
+ },
+ {
+ name: "nil interface",
+ input: nilIface,
+ wantKind: reflect.Invalid,
+ wantNil: false,
+ },
+ {
+ name: "non-nil pointer to struct",
+ input: &testStruct{Field: "abc"},
+ wantKind: reflect.Struct,
+ wantNil: false,
+ },
+ {
+ name: "non-nil interface holding pointer",
+ input: nonNilIface,
+ wantKind: reflect.Struct,
+ wantNil: false,
+ },
+ {
+ name: "plain value",
+ input: testStruct{Field: "xyz"},
+ wantKind: reflect.Struct,
+ wantNil: false,
+ },
+ }
+
+ for _, tt := range tests {
+ t.Run(tt.name, func(t *testing.T) {
+ v := reflect.ValueOf(tt.input)
+ got, isNil := indirect(v)
+
+ c.Assert(got.Kind(), qt.Equals, tt.wantKind)
+ c.Assert(isNil, qt.Equals, tt.wantNil)
+ })
+ }
}
diff --git a/common/collections/collections.go b/common/collections/collections.go
index bb47c8acc..0b46abee9 100644
--- a/common/collections/collections.go
+++ b/common/collections/collections.go
@@ -17,5 +17,5 @@ package collections
// Grouper defines a very generic way to group items by a given key.
type Grouper interface {
- Group(key interface{}, items interface{}) (interface{}, error)
+ Group(key any, items any) (any, error)
}
diff --git a/common/collections/order.go b/common/collections/order.go
new file mode 100644
index 000000000..4bdc3b4ac
--- /dev/null
+++ b/common/collections/order.go
@@ -0,0 +1,20 @@
+// Copyright 2020 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package collections
+
+type Order interface {
+ // Ordinal is a zero-based ordinal that represents the order of an object
+ // in a collection.
+ Ordinal() int
+}
diff --git a/common/collections/slice.go b/common/collections/slice.go
index 38ca86b08..731f489f9 100644
--- a/common/collections/slice.go
+++ b/common/collections/slice.go
@@ -15,17 +15,18 @@ package collections
import (
"reflect"
+ "sort"
)
// Slicer defines a very generic way to create a typed slice. This is used
// in collections.Slice template func to get types such as Pages, PageGroups etc.
// instead of the less useful []interface{}.
type Slicer interface {
- Slice(items interface{}) (interface{}, error)
+ Slice(items any) (any, error)
}
// Slice returns a slice of all passed arguments.
-func Slice(args ...interface{}) interface{} {
+func Slice(args ...any) any {
if len(args) == 0 {
return args
}
@@ -64,3 +65,31 @@ func Slice(args ...interface{}) interface{} {
}
return slice.Interface()
}
+
+// StringSliceToInterfaceSlice converts ss to []interface{}.
+func StringSliceToInterfaceSlice(ss []string) []any {
+ result := make([]any, len(ss))
+ for i, s := range ss {
+ result[i] = s
+ }
+ return result
+}
+
+type SortedStringSlice []string
+
+// Contains returns true if s is in ss.
+func (ss SortedStringSlice) Contains(s string) bool {
+ i := sort.SearchStrings(ss, s)
+ return i < len(ss) && ss[i] == s
+}
+
+// Count returns the number of times s is in ss.
+func (ss SortedStringSlice) Count(s string) int {
+ var count int
+ i := sort.SearchStrings(ss, s)
+ for i < len(ss) && ss[i] == s {
+ count++
+ i++
+ }
+ return count
+}
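
Not part of the patch: a short sketch of the new SortedStringSlice helpers. Contains and Count use sort.SearchStrings, so they assume the underlying slice is already sorted; the sample values are illustrative.

package main

import (
	"fmt"
	"sort"

	"github.com/gohugoio/hugo/common/collections"
)

func main() {
	ss := []string{"b", "a", "c", "b"}
	sort.Strings(ss) // required: Contains/Count do a binary search

	sorted := collections.SortedStringSlice(ss)
	fmt.Println(sorted.Contains("b")) // true
	fmt.Println(sorted.Count("b"))    // 2
	fmt.Println(sorted.Count("z"))    // 0

	// StringSliceToInterfaceSlice is a small convenience for APIs that take []any.
	fmt.Println(collections.StringSliceToInterfaceSlice([]string{"x", "y"})) // [x y]
}
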
diff --git a/common/collections/slice_test.go b/common/collections/slice_test.go
index 3ebfe6d11..4008a5e6c 100644
--- a/common/collections/slice_test.go
+++ b/common/collections/slice_test.go
@@ -20,11 +20,13 @@ import (
qt "github.com/frankban/quicktest"
)
-var _ Slicer = (*tstSlicer)(nil)
-var _ Slicer = (*tstSlicerIn1)(nil)
-var _ Slicer = (*tstSlicerIn2)(nil)
-var _ testSlicerInterface = (*tstSlicerIn1)(nil)
-var _ testSlicerInterface = (*tstSlicerIn1)(nil)
+var (
+ _ Slicer = (*tstSlicer)(nil)
+ _ Slicer = (*tstSlicerIn1)(nil)
+ _ Slicer = (*tstSlicerIn2)(nil)
+ _ testSlicerInterface = (*tstSlicerIn1)(nil)
+ _ testSlicerInterface = (*tstSlicerIn1)(nil)
+)
type testSlicerInterface interface {
Name() string
@@ -44,8 +46,8 @@ type tstSlicer struct {
TheName string
}
-func (p *tstSlicerIn1) Slice(in interface{}) (interface{}, error) {
- items := in.([]interface{})
+func (p *tstSlicerIn1) Slice(in any) (any, error) {
+ items := in.([]any)
result := make(testSlicerInterfaces, len(items))
for i, v := range items {
switch vv := v.(type) {
@@ -54,13 +56,12 @@ func (p *tstSlicerIn1) Slice(in interface{}) (interface{}, error) {
default:
return nil, errors.New("invalid type")
}
-
}
return result, nil
}
-func (p *tstSlicerIn2) Slice(in interface{}) (interface{}, error) {
- items := in.([]interface{})
+func (p *tstSlicerIn2) Slice(in any) (any, error) {
+ items := in.([]any)
result := make(testSlicerInterfaces, len(items))
for i, v := range items {
switch vv := v.(type) {
@@ -81,8 +82,8 @@ func (p *tstSlicerIn2) Name() string {
return p.TheName
}
-func (p *tstSlicer) Slice(in interface{}) (interface{}, error) {
- items := in.([]interface{})
+func (p *tstSlicer) Slice(in any) (any, error) {
+ items := in.([]any)
result := make(tstSlicers, len(items))
for i, v := range items {
switch vv := v.(type) {
@@ -102,17 +103,17 @@ func TestSlice(t *testing.T) {
c := qt.New(t)
for i, test := range []struct {
- args []interface{}
- expected interface{}
+ args []any
+ expected any
}{
- {[]interface{}{"a", "b"}, []string{"a", "b"}},
- {[]interface{}{&tstSlicer{"a"}, &tstSlicer{"b"}}, tstSlicers{&tstSlicer{"a"}, &tstSlicer{"b"}}},
- {[]interface{}{&tstSlicer{"a"}, "b"}, []interface{}{&tstSlicer{"a"}, "b"}},
- {[]interface{}{}, []interface{}{}},
- {[]interface{}{nil}, []interface{}{nil}},
- {[]interface{}{5, "b"}, []interface{}{5, "b"}},
- {[]interface{}{&tstSlicerIn1{"a"}, &tstSlicerIn2{"b"}}, testSlicerInterfaces{&tstSlicerIn1{"a"}, &tstSlicerIn2{"b"}}},
- {[]interface{}{&tstSlicerIn1{"a"}, &tstSlicer{"b"}}, []interface{}{&tstSlicerIn1{"a"}, &tstSlicer{"b"}}},
+ {[]any{"a", "b"}, []string{"a", "b"}},
+ {[]any{&tstSlicer{"a"}, &tstSlicer{"b"}}, tstSlicers{&tstSlicer{"a"}, &tstSlicer{"b"}}},
+ {[]any{&tstSlicer{"a"}, "b"}, []any{&tstSlicer{"a"}, "b"}},
+ {[]any{}, []any{}},
+ {[]any{nil}, []any{nil}},
+ {[]any{5, "b"}, []any{5, "b"}},
+ {[]any{&tstSlicerIn1{"a"}, &tstSlicerIn2{"b"}}, testSlicerInterfaces{&tstSlicerIn1{"a"}, &tstSlicerIn2{"b"}}},
+ {[]any{&tstSlicerIn1{"a"}, &tstSlicer{"b"}}, []any{&tstSlicerIn1{"a"}, &tstSlicer{"b"}}},
} {
errMsg := qt.Commentf("[%d] %v", i, test.args)
@@ -120,5 +121,52 @@ func TestSlice(t *testing.T) {
c.Assert(test.expected, qt.DeepEquals, result, errMsg)
}
-
+}
+
+func TestSortedStringSlice(t *testing.T) {
+ t.Parallel()
+ c := qt.New(t)
+
+ var s SortedStringSlice = []string{"a", "b", "b", "b", "c", "d"}
+
+ c.Assert(s.Contains("a"), qt.IsTrue)
+ c.Assert(s.Contains("b"), qt.IsTrue)
+ c.Assert(s.Contains("z"), qt.IsFalse)
+ c.Assert(s.Count("b"), qt.Equals, 3)
+ c.Assert(s.Count("z"), qt.Equals, 0)
+ c.Assert(s.Count("a"), qt.Equals, 1)
+}
+
+func TestStringSliceToInterfaceSlice(t *testing.T) {
+ t.Parallel()
+ c := qt.New(t)
+
+ tests := []struct {
+ name string
+ in []string
+ want []any
+ }{
+ {
+ name: "empty slice",
+ in: []string{},
+ want: []any{},
+ },
+ {
+ name: "single element",
+ in: []string{"hello"},
+ want: []any{"hello"},
+ },
+ {
+ name: "multiple elements",
+ in: []string{"a", "b", "c"},
+ want: []any{"a", "b", "c"},
+ },
+ }
+
+ for _, tt := range tests {
+ t.Run(tt.name, func(t *testing.T) {
+ got := StringSliceToInterfaceSlice(tt.in)
+ c.Assert(got, qt.DeepEquals, tt.want)
+ })
+ }
}
diff --git a/common/collections/stack.go b/common/collections/stack.go
new file mode 100644
index 000000000..ff0db2f02
--- /dev/null
+++ b/common/collections/stack.go
@@ -0,0 +1,82 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package collections
+
+import "slices"
+
+import "sync"
+
+// Stack is a simple LIFO stack that is safe for concurrent use.
+type Stack[T any] struct {
+ items []T
+ zero T
+ mu sync.RWMutex
+}
+
+func NewStack[T any]() *Stack[T] {
+ return &Stack[T]{}
+}
+
+func (s *Stack[T]) Push(item T) {
+ s.mu.Lock()
+ defer s.mu.Unlock()
+ s.items = append(s.items, item)
+}
+
+func (s *Stack[T]) Pop() (T, bool) {
+ s.mu.Lock()
+ defer s.mu.Unlock()
+ if len(s.items) == 0 {
+ return s.zero, false
+ }
+ item := s.items[len(s.items)-1]
+ s.items = s.items[:len(s.items)-1]
+ return item, true
+}
+
+func (s *Stack[T]) Peek() (T, bool) {
+ s.mu.RLock()
+ defer s.mu.RUnlock()
+ if len(s.items) == 0 {
+ return s.zero, false
+ }
+ return s.items[len(s.items)-1], true
+}
+
+func (s *Stack[T]) Len() int {
+ s.mu.RLock()
+ defer s.mu.RUnlock()
+ return len(s.items)
+}
+
+func (s *Stack[T]) Drain() []T {
+ s.mu.Lock()
+ defer s.mu.Unlock()
+ items := s.items
+ s.items = nil
+ return items
+}
+
+func (s *Stack[T]) DrainMatching(predicate func(T) bool) []T {
+ s.mu.Lock()
+ defer s.mu.Unlock()
+ var items []T
+ for i := len(s.items) - 1; i >= 0; i-- {
+ if predicate(s.items[i]) {
+ items = append(items, s.items[i])
+ s.items = slices.Delete(s.items, i, i+1)
+ }
+ }
+ return items
+}
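
Not part of the patch: a sketch of concurrent use of the new Stack, which can be shared across goroutines because every method takes the mutex; the values are illustrative.

package main

import (
	"fmt"
	"sync"

	"github.com/gohugoio/hugo/common/collections"
)

func main() {
	s := collections.NewStack[int]()

	// Push from several goroutines; the internal mutex serializes access.
	var wg sync.WaitGroup
	for i := 1; i <= 10; i++ {
		wg.Add(1)
		go func(v int) {
			defer wg.Done()
			s.Push(v)
		}(i)
	}
	wg.Wait()

	fmt.Println(s.Len()) // 10

	// DrainMatching removes matching items, newest first.
	evens := s.DrainMatching(func(v int) bool { return v%2 == 0 })
	fmt.Println(len(evens), s.Len()) // 5 5
}
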
diff --git a/common/collections/stack_test.go b/common/collections/stack_test.go
new file mode 100644
index 000000000..965d4dbc8
--- /dev/null
+++ b/common/collections/stack_test.go
@@ -0,0 +1,77 @@
+package collections
+
+import (
+ "testing"
+
+ qt "github.com/frankban/quicktest"
+)
+
+func TestNewStack(t *testing.T) {
+ t.Parallel()
+ c := qt.New(t)
+
+ s := NewStack[int]()
+
+ c.Assert(s, qt.IsNotNil)
+}
+
+func TestStackBasic(t *testing.T) {
+ t.Parallel()
+ c := qt.New(t)
+
+ s := NewStack[int]()
+
+ c.Assert(s.Len(), qt.Equals, 0)
+
+ s.Push(1)
+ s.Push(2)
+ s.Push(3)
+
+ c.Assert(s.Len(), qt.Equals, 3)
+
+ top, ok := s.Peek()
+ c.Assert(ok, qt.Equals, true)
+ c.Assert(top, qt.Equals, 3)
+
+ popped, ok := s.Pop()
+ c.Assert(ok, qt.Equals, true)
+ c.Assert(popped, qt.Equals, 3)
+
+ c.Assert(s.Len(), qt.Equals, 2)
+
+ _, _ = s.Pop()
+ _, _ = s.Pop()
+ _, ok = s.Pop()
+
+ c.Assert(ok, qt.Equals, false)
+}
+
+func TestStackDrain(t *testing.T) {
+ t.Parallel()
+ c := qt.New(t)
+
+ s := NewStack[string]()
+ s.Push("a")
+ s.Push("b")
+
+ got := s.Drain()
+
+ c.Assert(got, qt.DeepEquals, []string{"a", "b"})
+ c.Assert(s.Len(), qt.Equals, 0)
+}
+
+func TestStackDrainMatching(t *testing.T) {
+ t.Parallel()
+ c := qt.New(t)
+
+ s := NewStack[int]()
+ s.Push(1)
+ s.Push(2)
+ s.Push(3)
+ s.Push(4)
+
+ got := s.DrainMatching(func(v int) bool { return v%2 == 0 })
+
+ c.Assert(got, qt.DeepEquals, []int{4, 2})
+ c.Assert(s.Drain(), qt.DeepEquals, []int{1, 3})
+}
diff --git a/common/constants/constants.go b/common/constants/constants.go
new file mode 100644
index 000000000..c7bbaa541
--- /dev/null
+++ b/common/constants/constants.go
@@ -0,0 +1,49 @@
+// Copyright 2020 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package constants
+
+// Error/Warning IDs.
+// Do not change these values.
+const (
+ // IDs for remote errors in tpl/data.
+ ErrRemoteGetJSON = "error-remote-getjson"
+ ErrRemoteGetCSV = "error-remote-getcsv"
+
+ WarnFrontMatterParamsOverrides = "warning-frontmatter-params-overrides"
+ WarnRenderShortcodesInHTML = "warning-rendershortcodes-in-html"
+ WarnGoldmarkRawHTML = "warning-goldmark-raw-html"
+ WarnPartialSuperfluousPrefix = "warning-partial-superfluous-prefix"
+ WarnHomePageIsLeafBundle = "warning-home-page-is-leaf-bundle"
+)
+
+// Field/method names with special meaning.
+const (
+ FieldRelPermalink = "RelPermalink"
+ FieldPermalink = "Permalink"
+)
+
+// IsFieldRelOrPermalink returns whether the given name is a RelPermalink or Permalink.
+func IsFieldRelOrPermalink(name string) bool {
+ return name == FieldRelPermalink || name == FieldPermalink
+}
+
+// Resource transformations.
+const (
+ ResourceTransformationFingerprint = "fingerprint"
+)
+
+// IsResourceTransformationPermalinkHash returns whether the given name is a resource transformation that changes the permalink based on the content.
+func IsResourceTransformationPermalinkHash(name string) bool {
+ return name == ResourceTransformationFingerprint
+}
diff --git a/common/docs.go b/common/docs.go
new file mode 100644
index 000000000..041a62a01
--- /dev/null
+++ b/common/docs.go
@@ -0,0 +1,2 @@
+// Package common provides common helper functionality for Hugo.
+package common
diff --git a/common/hashing/hashing.go b/common/hashing/hashing.go
new file mode 100644
index 000000000..e45356758
--- /dev/null
+++ b/common/hashing/hashing.go
@@ -0,0 +1,194 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+// Package hashing provides common hashing utilities.
+package hashing
+
+import (
+ "crypto/md5"
+ "encoding/hex"
+ "io"
+ "strconv"
+ "sync"
+
+ "github.com/cespare/xxhash/v2"
+ "github.com/gohugoio/hashstructure"
+ "github.com/gohugoio/hugo/identity"
+)
+
+// XXHashFromReader calculates the xxHash for the given reader.
+func XXHashFromReader(r io.Reader) (uint64, int64, error) {
+ h := getXxHashReadFrom()
+ defer putXxHashReadFrom(h)
+
+ size, err := io.Copy(h, r)
+ if err != nil {
+ return 0, 0, err
+ }
+ return h.Sum64(), size, nil
+}
+
+// XxHashFromReaderHexEncoded calculates the xxHash for the given reader
+// and returns the hash as a hex encoded string.
+func XxHashFromReaderHexEncoded(r io.Reader) (string, error) {
+ h := getXxHashReadFrom()
+ defer putXxHashReadFrom(h)
+ _, err := io.Copy(h, r)
+ if err != nil {
+ return "", err
+ }
+ hash := h.Sum(nil)
+ return hex.EncodeToString(hash), nil
+}
+
+// XXHashFromString calculates the xxHash for the given string.
+func XXHashFromString(s string) (uint64, error) {
+ h := xxhash.New()
+ h.WriteString(s)
+ return h.Sum64(), nil
+}
+
+// XxHashFromStringHexEncoded calculates the xxHash for the given string
+// and returns the hash as a hex encoded string.
+func XxHashFromStringHexEncoded(f string) string {
+ h := xxhash.New()
+ h.WriteString(f)
+ hash := h.Sum(nil)
+ return hex.EncodeToString(hash)
+}
+
+// MD5FromStringHexEncoded returns the MD5 hash of the given string.
+func MD5FromStringHexEncoded(f string) string {
+ h := md5.New()
+ h.Write([]byte(f))
+ return hex.EncodeToString(h.Sum(nil))
+}
+
+// HashString returns a hash from the given elements.
+// It will panic if the hash cannot be calculated.
+// Note that this hash should be used primarily for identity, not for change detection, as
+// for the more complex values (e.g. Page) it will not hash the full content.
+func HashString(vs ...any) string {
+ hash := HashUint64(vs...)
+ return strconv.FormatUint(hash, 10)
+}
+
+// HashStringHex returns a hash from the given elements as a hex encoded string.
+// See HashString for more information.
+func HashStringHex(vs ...any) string {
+ hash := HashUint64(vs...)
+ return strconv.FormatUint(hash, 16)
+}
+
+var hashOptsPool = sync.Pool{
+ New: func() any {
+ return &hashstructure.HashOptions{
+ Hasher: xxhash.New(),
+ }
+ },
+}
+
+func getHashOpts() *hashstructure.HashOptions {
+ return hashOptsPool.Get().(*hashstructure.HashOptions)
+}
+
+func putHashOpts(opts *hashstructure.HashOptions) {
+ opts.Hasher.Reset()
+ hashOptsPool.Put(opts)
+}
+
+// HashUint64 returns a hash from the given elements.
+// It will panic if the hash cannot be calculated.
+// Note that this hash should be used primarily for identity, not for change detection, as
+// for the more complex values (e.g. Page) it will not hash the full content.
+func HashUint64(vs ...any) uint64 {
+ var o any
+ if len(vs) == 1 {
+ o = toHashable(vs[0])
+ } else {
+ elements := make([]any, len(vs))
+ for i, e := range vs {
+ elements[i] = toHashable(e)
+ }
+ o = elements
+ }
+
+ hash, err := Hash(o)
+ if err != nil {
+ panic(err)
+ }
+ return hash
+}
+
+// Hash returns a hash from vs.
+func Hash(vs ...any) (uint64, error) {
+ hashOpts := getHashOpts()
+ defer putHashOpts(hashOpts)
+ var v any = vs
+ if len(vs) == 1 {
+ v = vs[0]
+ }
+ return hashstructure.Hash(v, hashOpts)
+}
+
+type keyer interface {
+ Key() string
+}
+
+// For structs, hashstructure.Hash only works on the exported fields,
+// so rewrite the input slice for known identity types.
+func toHashable(v any) any {
+ switch t := v.(type) {
+ case keyer:
+ return t.Key()
+ case identity.IdentityProvider:
+ return t.GetIdentity()
+ default:
+ return v
+ }
+}
+
+type xxhashReadFrom struct {
+ buff []byte
+ *xxhash.Digest
+}
+
+func (x *xxhashReadFrom) ReadFrom(r io.Reader) (int64, error) {
+ // Track the total number of bytes read, as the io.ReaderFrom contract requires.
+ var size int64
+ for {
+ n, err := r.Read(x.buff)
+ if n > 0 {
+ x.Digest.Write(x.buff[:n])
+ size += int64(n)
+ }
+ if err != nil {
+ if err == io.EOF {
+ err = nil
+ }
+ return size, err
+ }
+ }
+}
+
+var xXhashReadFromPool = sync.Pool{
+ New: func() any {
+ return &xxhashReadFrom{Digest: xxhash.New(), buff: make([]byte, 48*1024)}
+ },
+}
+
+func getXxHashReadFrom() *xxhashReadFrom {
+ return xXhashReadFromPool.Get().(*xxhashReadFrom)
+}
+
+func putXxHashReadFrom(h *xxhashReadFrom) {
+ h.Reset()
+ xXhashReadFromPool.Put(h)
+}
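
Not part of the patch: a sketch of the main hashing entry points. The site type and its Key method are invented to show how identity types (the keyer interface above) are hashed by their key rather than by exported struct fields; the XXHashFromReader result matches the test below.

package main

import (
	"fmt"
	"strings"

	"github.com/gohugoio/hugo/common/hashing"
)

// site is a hypothetical identity type: HashString hashes it via Key(),
// not via its (unexported) fields.
type site struct{ key string }

func (s site) Key() string { return s.key }

func main() {
	fmt.Println(hashing.HashString("a", "b"))    // stable identity hash of the two values
	fmt.Println(hashing.HashStringHex("a", "b")) // same hash, hex encoded
	fmt.Println(hashing.HashString(site{key: "home"}))

	sum, size, err := hashing.XXHashFromReader(strings.NewReader("Hello World"))
	fmt.Println(sum, size, err) // 7148569436472236994 11 <nil>
}
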
diff --git a/common/hashing/hashing_test.go b/common/hashing/hashing_test.go
new file mode 100644
index 000000000..105b6d8b5
--- /dev/null
+++ b/common/hashing/hashing_test.go
@@ -0,0 +1,157 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package hashing
+
+import (
+ "fmt"
+ "math"
+ "strings"
+ "sync"
+ "testing"
+
+ qt "github.com/frankban/quicktest"
+)
+
+func TestXxHashFromReader(t *testing.T) {
+ c := qt.New(t)
+ s := "Hello World"
+ r := strings.NewReader(s)
+ got, size, err := XXHashFromReader(r)
+ c.Assert(err, qt.IsNil)
+ c.Assert(size, qt.Equals, int64(len(s)))
+ c.Assert(got, qt.Equals, uint64(7148569436472236994))
+}
+
+func TestXxHashFromReaderPara(t *testing.T) {
+ c := qt.New(t)
+
+ var wg sync.WaitGroup
+ for i := range 10 {
+ i := i
+ wg.Add(1)
+ go func() {
+ defer wg.Done()
+ for j := range 100 {
+ s := strings.Repeat("Hello ", i+j+1*42)
+ r := strings.NewReader(s)
+ got, size, err := XXHashFromReader(r)
+ c.Assert(size, qt.Equals, int64(len(s)))
+ c.Assert(err, qt.IsNil)
+ expect, _ := XXHashFromString(s)
+ c.Assert(got, qt.Equals, expect)
+ }
+ }()
+ }
+
+ wg.Wait()
+}
+
+func TestXxHashFromString(t *testing.T) {
+ c := qt.New(t)
+ s := "Hello World"
+ got, err := XXHashFromString(s)
+ c.Assert(err, qt.IsNil)
+ c.Assert(got, qt.Equals, uint64(7148569436472236994))
+}
+
+func TestXxHashFromStringHexEncoded(t *testing.T) {
+ c := qt.New(t)
+ s := "The quick brown fox jumps over the lazy dog"
+ got := XxHashFromStringHexEncoded(s)
+ // Expected value from: https://asecuritysite.com/encryption/xxhash?val=The%20quick%20brown%20fox%20jumps%20over%20the%20lazy%20dog
+ c.Assert(got, qt.Equals, "0b242d361fda71bc")
+}
+
+func BenchmarkXXHashFromReader(b *testing.B) {
+ r := strings.NewReader("Hello World")
+ b.ResetTimer()
+ for i := 0; i < b.N; i++ {
+ XXHashFromReader(r)
+ r.Seek(0, 0)
+ }
+}
+
+func BenchmarkXXHashFromString(b *testing.B) {
+ s := "Hello World"
+ b.ResetTimer()
+ for i := 0; i < b.N; i++ {
+ XXHashFromString(s)
+ }
+}
+
+func BenchmarkXXHashFromStringHexEncoded(b *testing.B) {
+ s := "The quick brown fox jumps over the lazy dog"
+ b.ResetTimer()
+ for i := 0; i < b.N; i++ {
+ XxHashFromStringHexEncoded(s)
+ }
+}
+
+func TestHashString(t *testing.T) {
+ c := qt.New(t)
+
+ c.Assert(HashString("a", "b"), qt.Equals, "3176555414984061461")
+ c.Assert(HashString("ab"), qt.Equals, "7347350983217793633")
+
+ var vals []any = []any{"a", "b", tstKeyer{"c"}}
+
+ c.Assert(HashString(vals...), qt.Equals, "4438730547989914315")
+ c.Assert(vals[2], qt.Equals, tstKeyer{"c"})
+}
+
+type tstKeyer struct {
+ key string
+}
+
+func (t tstKeyer) Key() string {
+ return t.key
+}
+
+func (t tstKeyer) String() string {
+ return "key: " + t.key
+}
+
+func BenchmarkHashString(b *testing.B) {
+ word := " hello "
+
+ var tests []string
+
+ for i := 1; i <= 5; i++ {
+ sentence := strings.Repeat(word, int(math.Pow(4, float64(i))))
+ tests = append(tests, sentence)
+ }
+
+ b.ResetTimer()
+
+ for _, test := range tests {
+ b.Run(fmt.Sprintf("n%d", len(test)), func(b *testing.B) {
+ for i := 0; i < b.N; i++ {
+ HashString(test)
+ }
+ })
+ }
+}
+
+func BenchmarkHashMap(b *testing.B) {
+ m := map[string]any{}
+ for i := range 1000 {
+ m[fmt.Sprintf("key%d", i)] = i
+ }
+
+ b.ResetTimer()
+
+ for i := 0; i < b.N; i++ {
+ HashString(m)
+ }
+}
diff --git a/common/hcontext/context.go b/common/hcontext/context.go
new file mode 100644
index 000000000..9524ef284
--- /dev/null
+++ b/common/hcontext/context.go
@@ -0,0 +1,46 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package hcontext
+
+import "context"
+
+// ContextDispatcher is a generic interface for setting and getting values from a context.
+type ContextDispatcher[T any] interface {
+ Set(ctx context.Context, value T) context.Context
+ Get(ctx context.Context) T
+}
+
+// NewContextDispatcher creates a new ContextDispatcher with the given key.
+func NewContextDispatcher[T any, R comparable](key R) ContextDispatcher[T] {
+ return keyInContext[T, R]{
+ id: key,
+ }
+}
+
+type keyInContext[T any, R comparable] struct {
+ zero T
+ id R
+}
+
+func (f keyInContext[T, R]) Get(ctx context.Context) T {
+ v := ctx.Value(f.id)
+ if v == nil {
+ return f.zero
+ }
+ return v.(T)
+}
+
+func (f keyInContext[T, R]) Set(ctx context.Context, value T) context.Context {
+ return context.WithValue(ctx, f.id, value)
+}
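
Not part of the patch: a sketch of the new ContextDispatcher. The requestIDKey type and values are invented for illustration; a private key type is simply the usual way to avoid context-key collisions.

package main

import (
	"context"
	"fmt"

	"github.com/gohugoio/hugo/common/hcontext"
)

// requestIDKey is a private key type so the context key cannot collide
// with keys from other packages.
type requestIDKey uint8

var requestID = hcontext.NewContextDispatcher[string](requestIDKey(1))

func main() {
	ctx := context.Background()
	fmt.Printf("%q\n", requestID.Get(ctx)) // "" (zero value when nothing is set)

	ctx = requestID.Set(ctx, "req-42")
	fmt.Println(requestID.Get(ctx)) // req-42
}
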
diff --git a/common/herrors/error_locator.go b/common/herrors/error_locator.go
index 3778a3729..acaebb4bc 100644
--- a/common/herrors/error_locator.go
+++ b/common/herrors/error_locator.go
@@ -1,4 +1,4 @@
-// Copyright 2018 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -11,18 +11,15 @@
// See the License for the specific language governing permissions and
// limitations under the License.
-// Package errors contains common Hugo errors and error related utilities.
+// Package herrors contains common Hugo errors and error related utilities.
package herrors
import (
"io"
- "io/ioutil"
"path/filepath"
"strings"
"github.com/gohugoio/hugo/common/text"
-
- "github.com/spf13/afero"
)
// LineMatcher contains the elements used to match an error to a line
@@ -36,19 +33,47 @@ type LineMatcher struct {
}
// LineMatcherFn is used to match a line with an error.
-type LineMatcherFn func(m LineMatcher) bool
+// It returns the column number, or 0 if the line was found but the column could not be determined. It returns -1 if no line matched.
+type LineMatcherFn func(m LineMatcher) int
// SimpleLineMatcher simply matches by line number.
-var SimpleLineMatcher = func(m LineMatcher) bool {
- return m.Position.LineNumber == m.LineNumber
+var SimpleLineMatcher = func(m LineMatcher) int {
+ if m.Position.LineNumber == m.LineNumber {
+ // We found the line, but don't know the column.
+ return 0
+ }
+ return -1
}
-var _ text.Positioner = ErrorContext{}
+// NopLineMatcher is a matcher that always returns 1.
+// This will effectively give line 1, column 1.
+var NopLineMatcher = func(m LineMatcher) int {
+ return 1
+}
+
+// OffsetMatcher is a line matcher that matches by offset.
+var OffsetMatcher = func(m LineMatcher) int {
+ if m.Offset+len(m.Line) >= m.Position.Offset {
+ // We found the line, but return 0 to signal that we want to determine
+ // the column from the error.
+ return 0
+ }
+ return -1
+}
+
+// ContainsMatcher is a line matcher that matches by line content.
+func ContainsMatcher(text string) func(m LineMatcher) int {
+ return func(m LineMatcher) int {
+ if idx := strings.Index(m.Line, text); idx != -1 {
+ return idx + 1
+ }
+ return -1
+ }
+}
// ErrorContext contains contextual information about an error. This will
// typically be the lines surrounding some problem in a file.
type ErrorContext struct {
-
// If a match will contain the matched line and up to 2 lines before and after.
// Will be empty if no match.
Lines []string
@@ -56,114 +81,15 @@ type ErrorContext struct {
// The position of the error in the Lines above. 0 based.
LinesPos int
- position text.Position
+ // The position of the content in the file. Note that this may be different from the error's position set
+ // in FileError.
+ Position text.Position
// The lexer to use for syntax highlighting.
// https://gohugo.io/content-management/syntax-highlighting/#list-of-chroma-highlighting-languages
ChromaLexer string
}
-// Position returns the text position of this error.
-func (e ErrorContext) Position() text.Position {
- return e.position
-}
-
-var _ causer = (*ErrorWithFileContext)(nil)
-
-// ErrorWithFileContext is an error with some additional file context related
-// to that error.
-type ErrorWithFileContext struct {
- cause error
- ErrorContext
-}
-
-func (e *ErrorWithFileContext) Error() string {
- pos := e.Position()
- if pos.IsValid() {
- return pos.String() + ": " + e.cause.Error()
- }
- return e.cause.Error()
-}
-
-func (e *ErrorWithFileContext) Cause() error {
- return e.cause
-}
-
-// WithFileContextForFile will try to add a file context with lines matching the given matcher.
-// If no match could be found, the original error is returned with false as the second return value.
-func WithFileContextForFile(e error, realFilename, filename string, fs afero.Fs, matcher LineMatcherFn) (error, bool) {
- f, err := fs.Open(filename)
- if err != nil {
- return e, false
- }
- defer f.Close()
- return WithFileContext(e, realFilename, f, matcher)
-}
-
-// WithFileContextForFile will try to add a file context with lines matching the given matcher.
-// If no match could be found, the original error is returned with false as the second return value.
-func WithFileContext(e error, realFilename string, r io.Reader, matcher LineMatcherFn) (error, bool) {
- if e == nil {
- panic("error missing")
- }
- le := UnwrapFileError(e)
-
- if le == nil {
- var ok bool
- if le, ok = ToFileError("", e).(FileError); !ok {
- return e, false
- }
- }
-
- var errCtx ErrorContext
-
- posle := le.Position()
-
- if posle.Offset != -1 {
- errCtx = locateError(r, le, func(m LineMatcher) bool {
- if posle.Offset >= m.Offset && posle.Offset < m.Offset+len(m.Line) {
- lno := posle.LineNumber - m.Position.LineNumber + m.LineNumber
- m.Position = text.Position{LineNumber: lno}
- }
- return matcher(m)
- })
- } else {
- errCtx = locateError(r, le, matcher)
- }
-
- pos := &errCtx.position
-
- if pos.LineNumber == -1 {
- return e, false
- }
-
- pos.Filename = realFilename
-
- if le.Type() != "" {
- errCtx.ChromaLexer = chromaLexerFromType(le.Type())
- } else {
- errCtx.ChromaLexer = chromaLexerFromFilename(realFilename)
- }
-
- return &ErrorWithFileContext{cause: e, ErrorContext: errCtx}, true
-}
-
-// UnwrapErrorWithFileContext tries to unwrap an ErrorWithFileContext from err.
-// It returns nil if this is not possible.
-func UnwrapErrorWithFileContext(err error) *ErrorWithFileContext {
- for err != nil {
- switch v := err.(type) {
- case *ErrorWithFileContext:
- return v
- case causer:
- err = v.Cause()
- default:
- return nil
- }
- }
- return nil
-}
-
func chromaLexerFromType(fileType string) string {
switch fileType {
case "html", "htm":
@@ -185,31 +111,24 @@ func chromaLexerFromFilename(filename string) string {
return chromaLexerFromType(ext)
}
-func locateErrorInString(src string, matcher LineMatcherFn) ErrorContext {
+func locateErrorInString(src string, matcher LineMatcherFn) *ErrorContext {
return locateError(strings.NewReader(src), &fileError{}, matcher)
}
-func locateError(r io.Reader, le FileError, matches LineMatcherFn) ErrorContext {
+func locateError(r io.Reader, le FileError, matches LineMatcherFn) *ErrorContext {
if le == nil {
panic("must provide an error")
}
- errCtx := ErrorContext{position: text.Position{LineNumber: -1, ColumnNumber: 1, Offset: -1}, LinesPos: -1}
+ ectx := &ErrorContext{LinesPos: -1, Position: text.Position{Offset: -1}}
- b, err := ioutil.ReadAll(r)
+ b, err := io.ReadAll(r)
if err != nil {
- return errCtx
+ return ectx
}
- pos := &errCtx.position
- lepos := le.Position()
-
lines := strings.Split(string(b), "\n")
- if lepos.ColumnNumber >= 0 {
- pos.ColumnNumber = lepos.ColumnNumber
- }
-
lineNo := 0
posBytes := 0
@@ -222,34 +141,30 @@ func locateError(r io.Reader, le FileError, matches LineMatcherFn) ErrorContext
Offset: posBytes,
Line: line,
}
- if errCtx.LinesPos == -1 && matches(m) {
- pos.LineNumber = lineNo
+ v := matches(m)
+ if ectx.LinesPos == -1 && v != -1 {
+ ectx.Position.LineNumber = lineNo
+ ectx.Position.ColumnNumber = v
break
}
posBytes += len(line)
}
- if pos.LineNumber != -1 {
- low := pos.LineNumber - 3
- if low < 0 {
- low = 0
- }
+ if ectx.Position.LineNumber > 0 {
+ low := max(ectx.Position.LineNumber-3, 0)
- if pos.LineNumber > 2 {
- errCtx.LinesPos = 2
+ if ectx.Position.LineNumber > 2 {
+ ectx.LinesPos = 2
} else {
- errCtx.LinesPos = pos.LineNumber - 1
+ ectx.LinesPos = ectx.Position.LineNumber - 1
}
- high := pos.LineNumber + 2
- if high > len(lines) {
- high = len(lines)
- }
+ high := min(ectx.Position.LineNumber+2, len(lines))
- errCtx.Lines = lines[low:high]
+ ectx.Lines = lines[low:high]
}
- return errCtx
+ return ectx
}
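
Not part of the patch: a sketch of the new LineMatcherFn contract (column number on a match, 0 for a line-only match, -1 for no match), using the exported matchers and the LineMatcher struct; the sample lines are invented.

package main

import (
	"fmt"
	"strings"

	"github.com/gohugoio/hugo/common/herrors"
)

func main() {
	// A custom matcher that reports the 1-based column of the match.
	theOne := func(m herrors.LineMatcher) int {
		if idx := strings.Index(m.Line, "THEONE"); idx != -1 {
			return idx + 1
		}
		return -1
	}

	fmt.Println(theOne(herrors.LineMatcher{Line: "This is THEONE"}))                  // 9
	fmt.Println(herrors.ContainsMatcher("THEONE")(herrors.LineMatcher{Line: "nope"})) // -1
	fmt.Println(herrors.SimpleLineMatcher(herrors.LineMatcher{LineNumber: 3}))        // -1 (line numbers differ)
}
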
diff --git a/common/herrors/error_locator_test.go b/common/herrors/error_locator_test.go
index cce94166f..62f15213d 100644
--- a/common/herrors/error_locator_test.go
+++ b/common/herrors/error_locator_test.go
@@ -1,4 +1,4 @@
-// Copyright 2018 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -11,7 +11,7 @@
// See the License for the specific language governing permissions and
// limitations under the License.
-// Package errors contains common Hugo errors and error related utilities.
+// Package herrors contains common Hugo errors and error related utilities.
package herrors
import (
@@ -24,8 +24,11 @@ import (
func TestErrorLocator(t *testing.T) {
c := qt.New(t)
- lineMatcher := func(m LineMatcher) bool {
- return strings.Contains(m.Line, "THEONE")
+ lineMatcher := func(m LineMatcher) int {
+ if strings.Contains(m.Line, "THEONE") {
+ return 1
+ }
+ return -1
}
lines := `LINE 1
@@ -39,35 +42,41 @@ LINE 8
`
location := locateErrorInString(lines, lineMatcher)
+ pos := location.Position
c.Assert(location.Lines, qt.DeepEquals, []string{"LINE 3", "LINE 4", "This is THEONE", "LINE 6", "LINE 7"})
- pos := location.Position()
c.Assert(pos.LineNumber, qt.Equals, 5)
c.Assert(location.LinesPos, qt.Equals, 2)
- c.Assert(locateErrorInString(`This is THEONE`, lineMatcher).Lines, qt.DeepEquals, []string{"This is THEONE"})
+ locate := func(s string, m LineMatcherFn) *ErrorContext {
+ ctx := locateErrorInString(s, m)
+ return ctx
+ }
+
+ c.Assert(locate(`This is THEONE`, lineMatcher).Lines, qt.DeepEquals, []string{"This is THEONE"})
location = locateErrorInString(`L1
This is THEONE
L2
`, lineMatcher)
- c.Assert(location.Position().LineNumber, qt.Equals, 2)
+ pos = location.Position
+ c.Assert(pos.LineNumber, qt.Equals, 2)
c.Assert(location.LinesPos, qt.Equals, 1)
c.Assert(location.Lines, qt.DeepEquals, []string{"L1", "This is THEONE", "L2", ""})
- location = locateErrorInString(`This is THEONE
+ location = locate(`This is THEONE
L2
`, lineMatcher)
c.Assert(location.LinesPos, qt.Equals, 0)
c.Assert(location.Lines, qt.DeepEquals, []string{"This is THEONE", "L2", ""})
- location = locateErrorInString(`L1
+ location = locate(`L1
This THEONE
`, lineMatcher)
c.Assert(location.Lines, qt.DeepEquals, []string{"L1", "This THEONE", ""})
c.Assert(location.LinesPos, qt.Equals, 1)
- location = locateErrorInString(`L1
+ location = locate(`L1
L2
This THEONE
`, lineMatcher)
@@ -75,12 +84,16 @@ This THEONE
c.Assert(location.LinesPos, qt.Equals, 2)
location = locateErrorInString("NO MATCH", lineMatcher)
- c.Assert(location.Position().LineNumber, qt.Equals, -1)
+ pos = location.Position
+ c.Assert(pos.LineNumber, qt.Equals, 0)
c.Assert(location.LinesPos, qt.Equals, -1)
c.Assert(len(location.Lines), qt.Equals, 0)
- lineMatcher = func(m LineMatcher) bool {
- return m.LineNumber == 6
+ lineMatcher = func(m LineMatcher) int {
+ if m.LineNumber == 6 {
+ return 1
+ }
+ return -1
}
location = locateErrorInString(`A
@@ -93,14 +106,18 @@ G
H
I
J`, lineMatcher)
+ pos = location.Position
c.Assert(location.Lines, qt.DeepEquals, []string{"D", "E", "F", "G", "H"})
- c.Assert(location.Position().LineNumber, qt.Equals, 6)
+ c.Assert(pos.LineNumber, qt.Equals, 6)
c.Assert(location.LinesPos, qt.Equals, 2)
// Test match EOF
- lineMatcher = func(m LineMatcher) bool {
- return m.LineNumber == 4
+ lineMatcher = func(m LineMatcher) int {
+ if m.LineNumber == 4 {
+ return 1
+ }
+ return -1
}
location = locateErrorInString(`A
@@ -108,12 +125,17 @@ B
C
`, lineMatcher)
+ pos = location.Position
+
c.Assert(location.Lines, qt.DeepEquals, []string{"B", "C", ""})
- c.Assert(location.Position().LineNumber, qt.Equals, 4)
+ c.Assert(pos.LineNumber, qt.Equals, 4)
c.Assert(location.LinesPos, qt.Equals, 2)
- offsetMatcher := func(m LineMatcher) bool {
- return m.Offset == 1
+ offsetMatcher := func(m LineMatcher) int {
+ if m.Offset == 1 {
+ return 1
+ }
+ return -1
}
location = locateErrorInString(`A
@@ -122,8 +144,9 @@ C
D
E`, offsetMatcher)
- c.Assert(location.Lines, qt.DeepEquals, []string{"A", "B", "C", "D"})
- c.Assert(location.Position().LineNumber, qt.Equals, 2)
- c.Assert(location.LinesPos, qt.Equals, 1)
+ pos = location.Position
+ c.Assert(location.Lines, qt.DeepEquals, []string{"A", "B", "C", "D"})
+ c.Assert(pos.LineNumber, qt.Equals, 2)
+ c.Assert(location.LinesPos, qt.Equals, 1)
}
diff --git a/common/herrors/errors.go b/common/herrors/errors.go
index e484ecb80..c7ee90dd0 100644
--- a/common/herrors/errors.go
+++ b/common/herrors/errors.go
@@ -1,4 +1,4 @@
-// Copyright 2018 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -19,47 +19,169 @@ import (
"fmt"
"io"
"os"
+ "regexp"
+ "runtime"
"runtime/debug"
-
- _errors "github.com/pkg/errors"
+ "strings"
+ "time"
)
-// As defined in https://godoc.org/github.com/pkg/errors
-type causer interface {
- Cause() error
+// PrintStackTrace prints the current stacktrace to w.
+func PrintStackTrace(w io.Writer) {
+ buf := make([]byte, 1<<16)
+ runtime.Stack(buf, true)
+ fmt.Fprintf(w, "%s", buf)
}
-type stackTracer interface {
- StackTrace() _errors.StackTrace
-}
-
-// PrintStackTrace prints the error's stack trace to stdoud.
-func PrintStackTrace(err error) {
- FprintStackTrace(os.Stdout, err)
-}
-
-// FprintStackTrace prints the error's stack trace to w.
-func FprintStackTrace(w io.Writer, err error) {
- if err, ok := err.(stackTracer); ok {
- for _, f := range err.StackTrace() {
- fmt.Fprintf(w, "%+s:%d\n", f, f)
- }
- }
+// ErrorSender is a typically non-blocking error handler.
+type ErrorSender interface {
+ SendError(err error)
}
// Recover is a helper function that can be used to capture panics.
// Put this at the top of a method/function that crashes in a template:
-// defer herrors.Recover()
-func Recover(args ...interface{}) {
+//
+// defer herrors.Recover()
+func Recover(args ...any) {
if r := recover(); r != nil {
+ fmt.Println("ERR:", r)
args = append(args, "stacktrace from panic: \n"+string(debug.Stack()), "\n")
fmt.Println(args...)
}
+}
+// IsTimeoutError returns true if the given error is or contains a TimeoutError.
+func IsTimeoutError(err error) bool {
+ return errors.Is(err, &TimeoutError{})
+}
+
+type TimeoutError struct {
+ Duration time.Duration
+}
+
+func (e *TimeoutError) Error() string {
+ return fmt.Sprintf("timeout after %s", e.Duration)
+}
+
+func (e *TimeoutError) Is(target error) bool {
+ _, ok := target.(*TimeoutError)
+ return ok
+}
+
+// errMessage wraps an error with a message.
+type errMessage struct {
+ msg string
+ err error
+}
+
+func (e *errMessage) Error() string {
+ return e.msg
+}
+
+func (e *errMessage) Unwrap() error {
+ return e.err
+}
+
+// IsFeatureNotAvailableError returns true if the given error is or contains a FeatureNotAvailableError.
+func IsFeatureNotAvailableError(err error) bool {
+ return errors.Is(err, &FeatureNotAvailableError{})
}
// ErrFeatureNotAvailable denotes that a feature is unavailable.
//
// We will, at least to begin with, make some Hugo features (SCSS with libsass) optional,
// and this error is used to signal those situations.
-var ErrFeatureNotAvailable = errors.New("this feature is not available in your current Hugo version, see https://goo.gl/YMrWcn for more information")
+var ErrFeatureNotAvailable = &FeatureNotAvailableError{Cause: errors.New("this feature is not available in your current Hugo version, see https://goo.gl/YMrWcn for more information")}
+
+// FeatureNotAvailableError is an error type used to signal that a feature is not available.
+type FeatureNotAvailableError struct {
+ Cause error
+}
+
+func (e *FeatureNotAvailableError) Unwrap() error {
+ return e.Cause
+}
+
+func (e *FeatureNotAvailableError) Error() string {
+ return e.Cause.Error()
+}
+
+func (e *FeatureNotAvailableError) Is(target error) bool {
+ _, ok := target.(*FeatureNotAvailableError)
+ return ok
+}
+
+// Must panics if err != nil.
+func Must(err error) {
+ if err != nil {
+ panic(err)
+ }
+}
+
+// IsNotExist returns true if the error is a file not found error.
+// Unlike os.IsNotExist, this also considers wrapped errors.
+func IsNotExist(err error) bool {
+ if os.IsNotExist(err) {
+ return true
+ }
+
+ // os.IsNotExist does not consider wrapped errors.
+ if os.IsNotExist(errors.Unwrap(err)) {
+ return true
+ }
+
+ return false
+}
+
+// IsExist returns true if the error is a file exists error.
+// Unlike os.IsExist, this also considers wrapped errors.
+func IsExist(err error) bool {
+ if os.IsExist(err) {
+ return true
+ }
+
+ // os.IsExist does not consider wrapped errors.
+ if os.IsExist(errors.Unwrap(err)) {
+ return true
+ }
+
+ return false
+}
+
+var nilPointerErrRe = regexp.MustCompile(`at <(.*)>: error calling (.*?): runtime error: invalid memory address or nil pointer dereference`)
+
+const deferredPrefix = "__hdeferred/"
+
+var deferredStringToRemove = regexp.MustCompile(`executing "__hdeferred/.*?" `)
+
+// ImproveRenderErr improves the error message for rendering errors.
+func ImproveRenderErr(inErr error) (outErr error) {
+ outErr = inErr
+ msg := improveIfNilPointerMsg(inErr)
+ if msg != "" {
+ outErr = &errMessage{msg: msg, err: outErr}
+ }
+
+ if strings.Contains(inErr.Error(), deferredPrefix) {
+ msg := deferredStringToRemove.ReplaceAllString(inErr.Error(), "executing ")
+ outErr = &errMessage{msg: msg, err: outErr}
+ }
+ return
+}
+
+func improveIfNilPointerMsg(inErr error) string {
+ m := nilPointerErrRe.FindStringSubmatch(inErr.Error())
+ if len(m) == 0 {
+ return ""
+ }
+ call := m[1]
+ field := m[2]
+ parts := strings.Split(call, ".")
+ if len(parts) < 2 {
+ return ""
+ }
+ receiverName := parts[len(parts)-2]
+ receiver := strings.Join(parts[:len(parts)-1], ".")
+ s := fmt.Sprintf("– %s is nil; wrap it in if or with: {{ with %s }}{{ .%s }}{{ end }}", receiverName, receiver, field)
+ return nilPointerErrRe.ReplaceAllString(inErr.Error(), s)
+}
diff --git a/common/herrors/errors_test.go b/common/herrors/errors_test.go
new file mode 100644
index 000000000..2f53a1e89
--- /dev/null
+++ b/common/herrors/errors_test.go
@@ -0,0 +1,45 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package herrors
+
+import (
+ "errors"
+ "fmt"
+ "testing"
+
+ qt "github.com/frankban/quicktest"
+ "github.com/spf13/afero"
+)
+
+func TestIsNotExist(t *testing.T) {
+ c := qt.New(t)
+
+ c.Assert(IsNotExist(afero.ErrFileNotFound), qt.Equals, true)
+ c.Assert(IsNotExist(afero.ErrFileExists), qt.Equals, false)
+ c.Assert(IsNotExist(afero.ErrDestinationExists), qt.Equals, false)
+ c.Assert(IsNotExist(nil), qt.Equals, false)
+
+ c.Assert(IsNotExist(fmt.Errorf("foo")), qt.Equals, false)
+
+ // os.IsNotExist returns false for wrapped errors.
+ c.Assert(IsNotExist(fmt.Errorf("foo: %w", afero.ErrFileNotFound)), qt.Equals, true)
+}
+
+func TestIsFeatureNotAvailableError(t *testing.T) {
+ c := qt.New(t)
+
+ c.Assert(IsFeatureNotAvailableError(ErrFeatureNotAvailable), qt.Equals, true)
+ c.Assert(IsFeatureNotAvailableError(&FeatureNotAvailableError{}), qt.Equals, true)
+ c.Assert(IsFeatureNotAvailableError(errors.New("asdf")), qt.Equals, false)
+}
diff --git a/common/herrors/file_error.go b/common/herrors/file_error.go
index 5af84adf5..38b198656 100644
--- a/common/herrors/file_error.go
+++ b/common/herrors/file_error.go
@@ -1,28 +1,32 @@
-// Copyright 2018 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
// http://www.apache.org/licenses/LICENSE-2.0
//
-// Unless required by applicable law or agreed to in writing, software
+// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
-// limitatio ns under the License.
+// limitations under the License.
package herrors
import (
"encoding/json"
+ "errors"
+ "fmt"
+ "io"
+ "path/filepath"
+ "github.com/bep/godartsass/v2"
+ "github.com/bep/golibsass/libsass/libsasserrors"
+ "github.com/gohugoio/hugo/common/paths"
"github.com/gohugoio/hugo/common/text"
-
- "github.com/pkg/errors"
-)
-
-var (
- _ causer = (*fileError)(nil)
+ "github.com/pelletier/go-toml/v2"
+ "github.com/spf13/afero"
+ "github.com/tdewolff/parse/v2"
)
// FileError represents an error when handling a file: Parsing a config file,
@@ -30,48 +34,309 @@ var (
type FileError interface {
error
+ // ErrorContext holds some context information about the error.
+ ErrorContext() *ErrorContext
+
text.Positioner
- // A string identifying the type of file, e.g. JSON, TOML, markdown etc.
- Type() string
+ // UpdatePosition updates the position of the error.
+ UpdatePosition(pos text.Position) FileError
+
+ // UpdateContent updates the error with a new ErrorContext from the content of the file.
+ UpdateContent(r io.Reader, linematcher LineMatcherFn) FileError
+
+ // SetFilename sets the filename of the error.
+ SetFilename(filename string) FileError
}
-var _ FileError = (*fileError)(nil)
+// Unwrapper can unwrap errors created with fmt.Errorf.
+type Unwrapper interface {
+ Unwrap() error
+}
+
+var (
+ _ FileError = (*fileError)(nil)
+ _ Unwrapper = (*fileError)(nil)
+)
+
+func (fe *fileError) SetFilename(filename string) FileError {
+ fe.position.Filename = filename
+ return fe
+}
+
+func (fe *fileError) UpdatePosition(pos text.Position) FileError {
+ oldFilename := fe.Position().Filename
+ if pos.Filename != "" && fe.fileType == "" {
+ _, fe.fileType = paths.FileAndExtNoDelimiter(filepath.Clean(pos.Filename))
+ }
+ if pos.Filename == "" {
+ pos.Filename = oldFilename
+ }
+ fe.position = pos
+ return fe
+}
+
+func (fe *fileError) UpdateContent(r io.Reader, linematcher LineMatcherFn) FileError {
+ if linematcher == nil {
+ linematcher = SimpleLineMatcher
+ }
+
+ var (
+ posle = fe.position
+ ectx *ErrorContext
+ )
+
+ if posle.LineNumber <= 1 && posle.Offset > 0 {
+ // Try to locate the line number from the content if offset is set.
+ ectx = locateError(r, fe, func(m LineMatcher) int {
+ if posle.Offset >= m.Offset && posle.Offset < m.Offset+len(m.Line) {
+ lno := posle.LineNumber - m.Position.LineNumber + m.LineNumber
+ m.Position = text.Position{LineNumber: lno}
+ return linematcher(m)
+ }
+ return -1
+ })
+ } else {
+ ectx = locateError(r, fe, linematcher)
+ }
+
+ if ectx.ChromaLexer == "" {
+ if fe.fileType != "" {
+ ectx.ChromaLexer = chromaLexerFromType(fe.fileType)
+ } else {
+ ectx.ChromaLexer = chromaLexerFromFilename(fe.Position().Filename)
+ }
+ }
+
+ fe.errorContext = ectx
+
+ if ectx.Position.LineNumber > 0 {
+ fe.position.LineNumber = ectx.Position.LineNumber
+ }
+
+ if ectx.Position.ColumnNumber > 0 {
+ fe.position.ColumnNumber = ectx.Position.ColumnNumber
+ }
+
+ return fe
+}
type fileError struct {
- position text.Position
+ position text.Position
+ errorContext *ErrorContext
fileType string
cause error
}
+func (e *fileError) ErrorContext() *ErrorContext {
+ return e.errorContext
+}
+
// Position returns the text position of this error.
func (e fileError) Position() text.Position {
return e.position
}
-func (e *fileError) Type() string {
- return e.fileType
+func (e *fileError) Error() string {
+ return fmt.Sprintf("%s: %s", e.position, e.causeString())
}
-func (e *fileError) Error() string {
+func (e *fileError) causeString() string {
if e.cause == nil {
return ""
}
- return e.cause.Error()
+ switch v := e.cause.(type) {
+ // Avoid repeating the file info in the error message.
+ case godartsass.SassError:
+ return v.Message
+ case libsasserrors.Error:
+ return v.Message
+ default:
+ return v.Error()
+ }
}
-func (f *fileError) Cause() error {
- return f.cause
+func (e *fileError) Unwrap() error {
+ return e.cause
}
-// NewFileError creates a new FileError.
-func NewFileError(fileType string, offset, lineNumber, columnNumber int, err error) FileError {
- pos := text.Position{Offset: offset, LineNumber: lineNumber, ColumnNumber: columnNumber}
+// NewFileError creates a new FileError that wraps err.
+// It will try to extract the filename and line number from err.
+func NewFileError(err error) FileError {
+ // Filetype is used to determine the Chroma lexer to use.
+ fileType, pos := extractFileTypePos(err)
return &fileError{cause: err, fileType: fileType, position: pos}
}
+// NewFileErrorFromName creates a new FileError that wraps err.
+// The value for name should identify the file, the best
+// being the full filename to the file on disk.
+func NewFileErrorFromName(err error, name string) FileError {
+ // Filetype is used to determine the Chroma lexer to use.
+ fileType, pos := extractFileTypePos(err)
+ pos.Filename = name
+ if fileType == "" {
+ _, fileType = paths.FileAndExtNoDelimiter(filepath.Clean(name))
+ }
+
+ return &fileError{cause: err, fileType: fileType, position: pos}
+}
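+
+// Illustrative sketch (values mirror the test in file_error_test.go): wrap an
+// error, name the file, then refine the position:
+//
+//	fe := NewFileErrorFromName(errors.New("bar"), "foo.html")
+//	fe.UpdatePosition(text.Position{LineNumber: 32, ColumnNumber: 2})
+//	// fe.Error() == `"foo.html:32:2": bar`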
+
+// NewFileErrorFromPos will use the filename and line number from pos to create a new FileError, wrapping err.
+func NewFileErrorFromPos(err error, pos text.Position) FileError {
+ // Filetype is used to determine the Chroma lexer to use.
+ fileType, _ := extractFileTypePos(err)
+ if fileType == "" {
+ _, fileType = paths.FileAndExtNoDelimiter(filepath.Clean(pos.Filename))
+ }
+ return &fileError{cause: err, fileType: fileType, position: pos}
+}
+
+func NewFileErrorFromFileInErr(err error, fs afero.Fs, linematcher LineMatcherFn) FileError {
+ fe := NewFileError(err)
+ pos := fe.Position()
+ if pos.Filename == "" {
+ return fe
+ }
+
+ f, realFilename, err2 := openFile(pos.Filename, fs)
+ if err2 != nil {
+ return fe
+ }
+
+ pos.Filename = realFilename
+ defer f.Close()
+ return fe.UpdateContent(f, linematcher)
+}
+
+func NewFileErrorFromFileInPos(err error, pos text.Position, fs afero.Fs, linematcher LineMatcherFn) FileError {
+ if err == nil {
+ panic("err is nil")
+ }
+ f, realFilename, err2 := openFile(pos.Filename, fs)
+ if err2 != nil {
+ return NewFileErrorFromPos(err, pos)
+ }
+ pos.Filename = realFilename
+ defer f.Close()
+ return NewFileErrorFromPos(err, pos).UpdateContent(f, linematcher)
+}
+
+// NewFileErrorFromFile is a convenience method to create a new FileError from a file.
+func NewFileErrorFromFile(err error, filename string, fs afero.Fs, linematcher LineMatcherFn) FileError {
+ if err == nil {
+ panic("err is nil")
+ }
+ f, realFilename, err2 := openFile(filename, fs)
+ if err2 != nil {
+ return NewFileErrorFromName(err, realFilename)
+ }
+ defer f.Close()
+ return NewFileErrorFromName(err, realFilename).UpdateContent(f, linematcher)
+}
+
+func openFile(filename string, fs afero.Fs) (afero.File, string, error) {
+ realFilename := filename
+
+ // We want the most specific filename possible in the error message.
+ fi, err2 := fs.Stat(filename)
+ if err2 == nil {
+ if s, ok := fi.(interface {
+ Filename() string
+ }); ok {
+ realFilename = s.Filename()
+ }
+ }
+
+ f, err2 := fs.Open(filename)
+ if err2 != nil {
+ return nil, realFilename, err2
+ }
+
+ return f, realFilename, nil
+}
+
+// Cause returns the underlying error, that is,
+// it unwraps errors until it finds one that does not implement
+// the Unwrap method.
+// For a shallow variant, see Unwrap.
+func Cause(err error) error {
+ type unwrapper interface {
+ Unwrap() error
+ }
+
+ for err != nil {
+ cause, ok := err.(unwrapper)
+ if !ok {
+ break
+ }
+ err = cause.Unwrap()
+ }
+ return err
+}
+
+// Unwrap returns the underlying error or itself if it does not implement Unwrap.
+func Unwrap(err error) error {
+ if u := errors.Unwrap(err); u != nil {
+ return u
+ }
+ return err
+}
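+
+// Sketch of the difference between the two helpers above:
+//
+//	wrapped := fmt.Errorf("outer: %w", fmt.Errorf("inner: %w", io.EOF))
+//	Unwrap(wrapped) // the "inner: ..." error (one level)
+//	Cause(wrapped)  // io.EOF (the deepest cause)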
+
+func extractFileTypePos(err error) (string, text.Position) {
+ err = Unwrap(err)
+
+ var fileType string
+
+ // LibSass, DartSass
+ if pos := extractPosition(err); pos.LineNumber > 0 || pos.Offset > 0 {
+ _, fileType = paths.FileAndExtNoDelimiter(pos.Filename)
+ return fileType, pos
+ }
+
+ // Default to line 1 col 1 if we don't find any better.
+ pos := text.Position{
+ Offset: -1,
+ LineNumber: 1,
+ ColumnNumber: 1,
+ }
+
+ // JSON errors.
+ offset, typ := extractOffsetAndType(err)
+ if fileType == "" {
+ fileType = typ
+ }
+
+ if offset >= 0 {
+ pos.Offset = offset
+ }
+
+ // The error type from the minifier contains line number and column number.
+ if line, col := extractLineNumberAndColumnNumber(err); line >= 0 {
+ pos.LineNumber = line
+ pos.ColumnNumber = col
+ return fileType, pos
+ }
+
+ // Look in the error message for the line number.
+ for _, handle := range lineNumberExtractors {
+ lno, col := handle(err)
+ if lno > 0 {
+ pos.ColumnNumber = col
+ pos.LineNumber = lno
+ break
+ }
+ }
+
+ if fileType == "" && pos.Filename != "" {
+ _, fileType = paths.FileAndExtNoDelimiter(pos.Filename)
+ }
+
+ return fileType, pos
+}
+
// UnwrapFileError tries to unwrap a FileError from err.
// It returns nil if this is not possible.
func UnwrapFileError(err error) FileError {
@@ -79,49 +344,38 @@ func UnwrapFileError(err error) FileError {
switch v := err.(type) {
case FileError:
return v
- case causer:
- err = v.Cause()
default:
- return nil
+ err = errors.Unwrap(err)
}
}
return nil
}
-// ToFileErrorWithOffset will return a new FileError with a line number
-// with the given offset from the original.
-func ToFileErrorWithOffset(fe FileError, offset int) FileError {
- pos := fe.Position()
- return ToFileErrorWithLineNumber(fe, pos.LineNumber+offset)
-}
-
-// ToFileErrorWithOffset will return a new FileError with the given line number.
-func ToFileErrorWithLineNumber(fe FileError, lineNumber int) FileError {
- pos := fe.Position()
- pos.LineNumber = lineNumber
- return &fileError{cause: fe, fileType: fe.Type(), position: pos}
-}
-
-// ToFileError will convert the given error to an error supporting
-// the FileError interface.
-func ToFileError(fileType string, err error) FileError {
- for _, handle := range lineNumberExtractors {
- lno, col := handle(err)
- offset, typ := extractOffsetAndType(err)
- if fileType == "" {
- fileType = typ
- }
-
- if lno > 0 || offset != -1 {
- return NewFileError(fileType, offset, lno, col, err)
+// UnwrapFileErrors tries to unwrap all FileError.
+func UnwrapFileErrors(err error) []FileError {
+ var errs []FileError
+ for err != nil {
+ if v, ok := err.(FileError); ok {
+ errs = append(errs, v)
}
+ err = errors.Unwrap(err)
}
- // Fall back to the pointing to line number 1.
- return NewFileError(fileType, -1, 1, 1, err)
+ return errs
+}
+
+// UnwrapFileErrorsWithErrorContext tries to unwrap all FileError in err that has an ErrorContext.
+func UnwrapFileErrorsWithErrorContext(err error) []FileError {
+ var errs []FileError
+ for err != nil {
+ if v, ok := err.(FileError); ok && v.ErrorContext() != nil {
+ errs = append(errs, v)
+ }
+ err = errors.Unwrap(err)
+ }
+ return errs
}
func extractOffsetAndType(e error) (int, string) {
- e = errors.Cause(e)
switch v := e.(type) {
case *json.UnmarshalTypeError:
return int(v.Offset), "json"
@@ -131,3 +385,46 @@ func extractOffsetAndType(e error) (int, string) {
return -1, ""
}
}
+
+func extractLineNumberAndColumnNumber(e error) (int, int) {
+ switch v := e.(type) {
+ case *parse.Error:
+ return v.Line, v.Column
+ case *toml.DecodeError:
+ return v.Position()
+
+ }
+
+ return -1, -1
+}
+
+func extractPosition(e error) (pos text.Position) {
+ switch v := e.(type) {
+ case godartsass.SassError:
+ span := v.Span
+ start := span.Start
+ filename, _ := paths.UrlStringToFilename(span.Url)
+ pos.Filename = filename
+ pos.Offset = start.Offset
+ pos.ColumnNumber = start.Column
+ case libsasserrors.Error:
+ pos.Filename = v.File
+ pos.LineNumber = v.Line
+ pos.ColumnNumber = v.Column
+ }
+ return
+}
+
+// TextSegmentError is an error with a text segment attached.
+type TextSegmentError struct {
+ Segment string
+ Err error
+}
+
+func (e TextSegmentError) Unwrap() error {
+ return e.Err
+}
+
+func (e TextSegmentError) Error() string {
+ return e.Err.Error()
+}
diff --git a/common/herrors/file_error_test.go b/common/herrors/file_error_test.go
index b1b5c5a02..7aca08405 100644
--- a/common/herrors/file_error_test.go
+++ b/common/herrors/file_error_test.go
@@ -1,4 +1,4 @@
-// Copyright 2018 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -14,14 +14,42 @@
package herrors
import (
+ "errors"
+ "fmt"
+ "strings"
"testing"
- "github.com/pkg/errors"
+ "github.com/gohugoio/hugo/common/text"
qt "github.com/frankban/quicktest"
)
-func TestToLineNumberError(t *testing.T) {
+func TestNewFileError(t *testing.T) {
+ t.Parallel()
+
+ c := qt.New(t)
+
+ fe := NewFileErrorFromName(errors.New("bar"), "foo.html")
+ c.Assert(fe.Error(), qt.Equals, `"foo.html:1:1": bar`)
+
+ lines := ""
+ for i := 1; i <= 100; i++ {
+ lines += fmt.Sprintf("line %d\n", i)
+ }
+
+ fe.UpdatePosition(text.Position{LineNumber: 32, ColumnNumber: 2})
+ c.Assert(fe.Error(), qt.Equals, `"foo.html:32:2": bar`)
+ fe.UpdatePosition(text.Position{LineNumber: 0, ColumnNumber: 0, Offset: 212})
+ fe.UpdateContent(strings.NewReader(lines), nil)
+ c.Assert(fe.Error(), qt.Equals, `"foo.html:32:0": bar`)
+ errorContext := fe.ErrorContext()
+ c.Assert(errorContext, qt.IsNotNil)
+ c.Assert(errorContext.Lines, qt.DeepEquals, []string{"line 30", "line 31", "line 32", "line 33", "line 34"})
+ c.Assert(errorContext.LinesPos, qt.Equals, 2)
+ c.Assert(errorContext.ChromaLexer, qt.Equals, "go-html-template")
+}
+
+func TestNewFileErrorExtractFromMessage(t *testing.T) {
t.Parallel()
c := qt.New(t)
@@ -36,21 +64,17 @@ func TestToLineNumberError(t *testing.T) {
{errors.New(`template: _default/single.html:4:15: executing "_default/single.html" at <.Titles>: can't evaluate field Titles in type *hugolib.PageOutput`), 0, 4, 15},
{errors.New("parse failed: template: _default/bundle-resource-meta.html:11: unexpected in operand"), 0, 11, 1},
{errors.New(`failed:: template: _default/bundle-resource-meta.html:2:7: executing "main" at <.Titles>`), 0, 2, 7},
- {errors.New("error in front matter: Near line 32 (last key parsed 'title')"), 0, 32, 1},
{errors.New(`failed to load translations: (6, 7): was expecting token =, but got "g" instead`), 0, 6, 7},
+ {errors.New(`execute of template failed: template: index.html:2:5: executing "index.html" at : error calling partial: "/layouts/partials/foo.html:3:6": execute of template failed: template: partials/foo.html:3:6: executing "partials/foo.html" at <.ThisDoesNotExist>: can't evaluate field ThisDoesNotExist in type *hugolib.pageStat`), 0, 2, 5},
} {
- got := ToFileError("template", test.in)
+ got := NewFileErrorFromName(test.in, "test.txt")
errMsg := qt.Commentf("[%d][%T]", i, got)
- le, ok := got.(FileError)
- c.Assert(ok, qt.Equals, true)
- c.Assert(ok, qt.Equals, true, errMsg)
- pos := le.Position()
+ pos := got.Position()
c.Assert(pos.LineNumber, qt.Equals, test.lineNumber, errMsg)
c.Assert(pos.ColumnNumber, qt.Equals, test.columnNumber, errMsg)
- c.Assert(errors.Cause(got), qt.Not(qt.IsNil))
+ c.Assert(errors.Unwrap(got), qt.Not(qt.IsNil))
}
-
}
diff --git a/common/herrors/line_number_extractors.go b/common/herrors/line_number_extractors.go
index 93969b967..f70a2691f 100644
--- a/common/herrors/line_number_extractors.go
+++ b/common/herrors/line_number_extractors.go
@@ -9,7 +9,7 @@
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
-// limitatio ns under the License.
+// limitations under the License.
package herrors
@@ -20,17 +20,14 @@ import (
var lineNumberExtractors = []lineNumberExtractor{
// Template/shortcode parse errors
- newLineNumberErrHandlerFromRegexp(".*:(\\d+):(\\d*):"),
- newLineNumberErrHandlerFromRegexp(".*:(\\d+):"),
-
- // TOML parse errors
- newLineNumberErrHandlerFromRegexp(".*Near line (\\d+)(\\s.*)"),
+ newLineNumberErrHandlerFromRegexp(`:(\d+):(\d*):`),
+ newLineNumberErrHandlerFromRegexp(`:(\d+):`),
// YAML parse errors
- newLineNumberErrHandlerFromRegexp("line (\\d+):"),
+ newLineNumberErrHandlerFromRegexp(`line (\d+):`),
// i18n bundle errors
- newLineNumberErrHandlerFromRegexp("\\((\\d+),\\s(\\d*)"),
+ newLineNumberErrHandlerFromRegexp(`\((\d+),\s(\d*)`),
}
type lineNumberExtractor func(e error) (int, int)
@@ -61,6 +58,6 @@ func extractLineNo(re *regexp.Regexp) lineNumberExtractor {
return lno, col
}
- return -1, col
+ return 0, col
}
}
diff --git a/common/hexec/exec.go b/common/hexec/exec.go
new file mode 100644
index 000000000..c3a6ebf57
--- /dev/null
+++ b/common/hexec/exec.go
@@ -0,0 +1,389 @@
+// Copyright 2020 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package hexec
+
+import (
+ "bytes"
+ "context"
+ "errors"
+ "fmt"
+ "io"
+ "os"
+ "os/exec"
+ "path/filepath"
+ "regexp"
+ "strings"
+ "sync"
+
+ "github.com/bep/logg"
+ "github.com/gohugoio/hugo/common/loggers"
+ "github.com/gohugoio/hugo/common/maps"
+ "github.com/gohugoio/hugo/config"
+ "github.com/gohugoio/hugo/config/security"
+)
+
+var WithDir = func(dir string) func(c *commandeer) {
+ return func(c *commandeer) {
+ c.dir = dir
+ }
+}
+
+var WithContext = func(ctx context.Context) func(c *commandeer) {
+ return func(c *commandeer) {
+ c.ctx = ctx
+ }
+}
+
+var WithStdout = func(w io.Writer) func(c *commandeer) {
+ return func(c *commandeer) {
+ c.stdout = w
+ }
+}
+
+var WithStderr = func(w io.Writer) func(c *commandeer) {
+ return func(c *commandeer) {
+ c.stderr = w
+ }
+}
+
+var WithStdin = func(r io.Reader) func(c *commandeer) {
+ return func(c *commandeer) {
+ c.stdin = r
+ }
+}
+
+var WithEnviron = func(env []string) func(c *commandeer) {
+ return func(c *commandeer) {
+ setOrAppend := func(s string) {
+ k1, _ := config.SplitEnvVar(s)
+ var found bool
+ for i, v := range c.env {
+ k2, _ := config.SplitEnvVar(v)
+ if k1 == k2 {
+ found = true
+ c.env[i] = s
+ }
+ }
+
+ if !found {
+ c.env = append(c.env, s)
+ }
+ }
+
+ for _, s := range env {
+ setOrAppend(s)
+ }
+ }
+}
+
+// New creates a new Exec using the provided security config.
+func New(cfg security.Config, workingDir string, log loggers.Logger) *Exec {
+ var baseEnviron []string
+ for _, v := range os.Environ() {
+ k, _ := config.SplitEnvVar(v)
+ if cfg.Exec.OsEnv.Accept(k) {
+ baseEnviron = append(baseEnviron, v)
+ }
+ }
+
+ return &Exec{
+ sc: cfg,
+ workingDir: workingDir,
+ infol: log.InfoCommand("exec"),
+ baseEnviron: baseEnviron,
+ newNPXRunnerCache: maps.NewCache[string, func(arg ...any) (Runner, error)](),
+ }
+}
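+
+// Usage sketch (cfg, workingDir and logger are placeholders; assumes the
+// security policy allows the binary and that the returned Runner exposes Run,
+// defined further down in this file):
+//
+//	ex := New(cfg, workingDir, logger)
+//	cmd, err := ex.New("git", "status", WithDir(workingDir))
+//	if err == nil {
+//		err = cmd.Run()
+//	}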
+
+// IsNotFound reports whether this is an error about a binary not found.
+func IsNotFound(err error) bool {
+ var notFoundErr *NotFoundError
+ return errors.As(err, ¬FoundErr)
+}
+
+// Exec enforces a security policy for commands run via os/exec.
+type Exec struct {
+ sc security.Config
+ workingDir string
+ infol logg.LevelLogger
+
+ // os.Environ filtered by the Exec.OsEnviron whitelist filter.
+ baseEnviron []string
+
+ newNPXRunnerCache *maps.Cache[string, func(arg ...any) (Runner, error)]
+ npxInit sync.Once
+ npxAvailable bool
+}
+
+func (e *Exec) New(name string, arg ...any) (Runner, error) {
+ return e.new(name, "", arg...)
+}
+
+// New will fail if name is not allowed according to the configured security policy.
+// Else a configured Runner will be returned ready to be Run.
+func (e *Exec) new(name string, fullyQualifiedName string, arg ...any) (Runner, error) {
+ if err := e.sc.CheckAllowedExec(name); err != nil {
+ return nil, err
+ }
+
+ env := make([]string, len(e.baseEnviron))
+ copy(env, e.baseEnviron)
+
+ cm := &commandeer{
+ name: name,
+ fullyQualifiedName: fullyQualifiedName,
+ env: env,
+ }
+
+ return cm.command(arg...)
+}
+
+type binaryLocation int
+
+func (b binaryLocation) String() string {
+ switch b {
+ case binaryLocationNodeModules:
+ return "node_modules/.bin"
+ case binaryLocationNpx:
+ return "npx"
+ case binaryLocationPath:
+ return "PATH"
+ }
+ return "unknown"
+}
+
+const (
+ binaryLocationNodeModules binaryLocation = iota + 1
+ binaryLocationNpx
+ binaryLocationPath
+)
+
+// Npx will in order:
+// 1. Try to find the binary in the WORKINGDIR/node_modules/.bin directory.
+// 2. If not found, and npx is available, run npx --no-install with the given binary name.
+// 3. Fall back to the PATH.
+// If name is "tailwindcss", we will try the PATH as the second option.
+func (e *Exec) Npx(name string, arg ...any) (Runner, error) {
+ if err := e.sc.CheckAllowedExec(name); err != nil {
+ return nil, err
+ }
+
+ newRunner, err := e.newNPXRunnerCache.GetOrCreate(name, func() (func(...any) (Runner, error), error) {
+ type tryFunc func() func(...any) (Runner, error)
+ tryFuncs := map[binaryLocation]tryFunc{
+ binaryLocationNodeModules: func() func(...any) (Runner, error) {
+ nodeBinFilename := filepath.Join(e.workingDir, nodeModulesBinPath, name)
+ _, err := exec.LookPath(nodeBinFilename)
+ if err != nil {
+ return nil
+ }
+ return func(arg2 ...any) (Runner, error) {
+ return e.new(name, nodeBinFilename, arg2...)
+ }
+ },
+ binaryLocationNpx: func() func(...any) (Runner, error) {
+ e.checkNpx()
+ if !e.npxAvailable {
+ return nil
+ }
+ return func(arg2 ...any) (Runner, error) {
+ return e.npx(name, arg2...)
+ }
+ },
+ binaryLocationPath: func() func(...any) (Runner, error) {
+ if _, err := exec.LookPath(name); err != nil {
+ return nil
+ }
+ return func(arg2 ...any) (Runner, error) {
+ return e.New(name, arg2...)
+ }
+ },
+ }
+
+ locations := []binaryLocation{binaryLocationNodeModules, binaryLocationNpx, binaryLocationPath}
+ if name == "tailwindcss" {
+ // See https://github.com/gohugoio/hugo/issues/13221#issuecomment-2574801253
+ locations = []binaryLocation{binaryLocationNodeModules, binaryLocationPath, binaryLocationNpx}
+ }
+ for _, loc := range locations {
+ if f := tryFuncs[loc](); f != nil {
+ e.infol.Logf("resolve %q using %s", name, loc)
+ return f, nil
+ }
+ }
+ return nil, &NotFoundError{name: name, method: fmt.Sprintf("in %s", locations[len(locations)-1])}
+ })
+ if err != nil {
+ return nil, err
+ }
+
+ return newRunner(arg...)
+}
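+
+// Resolution sketch (ex is an *Exec as returned by New): for a binary
+// installed under node_modules/.bin,
+//
+//	r, err := ex.Npx("tailwindcss", "-i", "input.css", "-o", "output.css")
+//
+// resolves from node_modules/.bin first, then (for tailwindcss) PATH, then npx,
+// per the order described above.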
+
+const (
+ npxNoInstall = "--no-install"
+ npxBinary = "npx"
+ nodeModulesBinPath = "node_modules/.bin"
+)
+
+func (e *Exec) checkNpx() {
+ e.npxInit.Do(func() {
+ e.npxAvailable = InPath(npxBinary)
+ })
+}
+
+// npx is a convenience method to create a Runner running npx --no-install with the given binary name.
+ if tp.NumIn() > 0 {
+ if tp.NumIn() > 1 {
+ panic("not supported")
+ }
+ first := tp.In(0)
+ if IsContextType(first) {
+ args = append(args, reflect.ValueOf(cxt))
+ }
+ }
+
+ return fn.Call(args)
+}
+
// Based on: https://github.com/golang/go/blob/178a2c42254166cffed1b25fb1d3c7a5727cada6/src/text/template/exec.go#L931
func indirectInterface(v reflect.Value) reflect.Value {
if v.Kind() != reflect.Interface {
@@ -89,3 +270,26 @@ func indirectInterface(v reflect.Value) reflect.Value {
}
return v.Elem()
}
+
+var contextInterface = reflect.TypeOf((*context.Context)(nil)).Elem()
+
+var isContextCache = maps.NewCache[reflect.Type, bool]()
+
+type k string
+
+var contextTypeValue = reflect.TypeOf(context.WithValue(context.Background(), k("key"), 32))
+
+// IsContextType returns whether tp is a context.Context type.
+func IsContextType(tp reflect.Type) bool {
+ if tp == contextTypeValue {
+ return true
+ }
+ if tp == contextInterface {
+ return true
+ }
+
+ isContext, _ := isContextCache.GetOrCreate(tp, func() (bool, error) {
+ return tp.Implements(contextInterface), nil
+ })
+ return isContext
+}
diff --git a/common/hreflect/helpers_test.go b/common/hreflect/helpers_test.go
index 480ccb27a..cbcad0f22 100644
--- a/common/hreflect/helpers_test.go
+++ b/common/hreflect/helpers_test.go
@@ -14,6 +14,7 @@
package hreflect
import (
+ "context"
"reflect"
"testing"
"time"
@@ -30,6 +31,65 @@ func TestIsTruthful(t *testing.T) {
c.Assert(IsTruthful(time.Time{}), qt.Equals, false)
}
+func TestGetMethodByName(t *testing.T) {
+ c := qt.New(t)
+ v := reflect.ValueOf(&testStruct{})
+ tp := v.Type()
+
+ c.Assert(GetMethodIndexByName(tp, "Method1"), qt.Equals, 0)
+ c.Assert(GetMethodIndexByName(tp, "Method3"), qt.Equals, 2)
+ c.Assert(GetMethodIndexByName(tp, "Foo"), qt.Equals, -1)
+}
+
+func TestIsContextType(t *testing.T) {
+ c := qt.New(t)
+ type k string
+ ctx := context.Background()
+ valueCtx := context.WithValue(ctx, k("key"), 32)
+ c.Assert(IsContextType(reflect.TypeOf(ctx)), qt.IsTrue)
+ c.Assert(IsContextType(reflect.TypeOf(valueCtx)), qt.IsTrue)
+}
+
+func TestToSliceAny(t *testing.T) {
+ c := qt.New(t)
+
+ checkOK := func(in any, expected []any) {
+ out, ok := ToSliceAny(in)
+ c.Assert(ok, qt.Equals, true)
+ c.Assert(out, qt.DeepEquals, expected)
+ }
+
+ checkOK([]any{1, 2, 3}, []any{1, 2, 3})
+ checkOK([]int{1, 2, 3}, []any{1, 2, 3})
+}
+
+func BenchmarkIsContextType(b *testing.B) {
+ type k string
+ b.Run("value", func(b *testing.B) {
+ ctx := context.Background()
+ ctxs := make([]reflect.Type, b.N)
+ for i := 0; i < b.N; i++ {
+ ctxs[i] = reflect.TypeOf(context.WithValue(ctx, k("key"), i))
+ }
+
+ b.ResetTimer()
+ for i := 0; i < b.N; i++ {
+ if !IsContextType(ctxs[i]) {
+ b.Fatal("not context")
+ }
+ }
+ })
+
+ b.Run("background", func(b *testing.B) {
+ var ctxt reflect.Type = reflect.TypeOf(context.Background())
+ for i := 0; i < b.N; i++ {
+ if !IsContextType(ctxt) {
+ b.Fatal("not context")
+ }
+ }
+ })
+}
+
func BenchmarkIsTruthFul(b *testing.B) {
v := reflect.ValueOf("Hugo")
@@ -40,3 +100,51 @@ func BenchmarkIsTruthFul(b *testing.B) {
}
}
}
+
+type testStruct struct{}
+
+func (t *testStruct) Method1() string {
+ return "Hugo"
+}
+
+func (t *testStruct) Method2() string {
+ return "Hugo"
+}
+
+func (t *testStruct) Method3() string {
+ return "Hugo"
+}
+
+func (t *testStruct) Method4() string {
+ return "Hugo"
+}
+
+func (t *testStruct) Method5() string {
+ return "Hugo"
+}
+
+func BenchmarkGetMethodByName(b *testing.B) {
+ v := reflect.ValueOf(&testStruct{})
+ methods := []string{"Method1", "Method2", "Method3", "Method4", "Method5"}
+
+ b.ResetTimer()
+ for i := 0; i < b.N; i++ {
+ for _, method := range methods {
+ _ = GetMethodByName(v, method)
+ }
+ }
+}
+
+func BenchmarkGetMethodByNamePara(b *testing.B) {
+ v := reflect.ValueOf(&testStruct{})
+ methods := []string{"Method1", "Method2", "Method3", "Method4", "Method5"}
+
+ b.ResetTimer()
+ b.RunParallel(func(pb *testing.PB) {
+ for pb.Next() {
+ for _, method := range methods {
+ _ = GetMethodByName(v, method)
+ }
+ }
+ })
+}
diff --git a/common/hstrings/strings.go b/common/hstrings/strings.go
new file mode 100644
index 000000000..1de38678f
--- /dev/null
+++ b/common/hstrings/strings.go
@@ -0,0 +1,134 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package hstrings
+
+import (
+ "fmt"
+ "regexp"
+ "slices"
+ "strings"
+ "sync"
+
+ "github.com/gohugoio/hugo/compare"
+)
+
+var _ compare.Eqer = StringEqualFold("")
+
+// StringEqualFold is a string that implements the compare.Eqer interface and considers
+// two strings equal if they are equal when folded to lower case.
+// The compare.Eqer interface is used in Hugo to compare values in templates (e.g. using the eq template function).
+type StringEqualFold string
+
+func (s StringEqualFold) EqualFold(s2 string) bool {
+ return strings.EqualFold(string(s), s2)
+}
+
+func (s StringEqualFold) String() string {
+ return string(s)
+}
+
+func (s StringEqualFold) Eq(s2 any) bool {
+ switch ss := s2.(type) {
+ case string:
+ return s.EqualFold(ss)
+ case fmt.Stringer:
+ return s.EqualFold(ss.String())
+ }
+
+ return false
+}
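+
+// Illustrative, mirroring the tests: comparison is case-insensitive whether it
+// goes through EqualFold or the template-facing Eq:
+//
+//	StringEqualFold("A").Eq("a")        // true
+//	StringEqualFold("A").EqualFold("b") // false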
+
+// EqualAny returns whether a string is equal to any of the given strings.
+func EqualAny(a string, b ...string) bool {
+ return slices.Contains(b, a)
+}
+
+// regexpCache represents a cache of regexp objects protected by a mutex.
+type regexpCache struct {
+ mu sync.RWMutex
+ re map[string]*regexp.Regexp
+}
+
+func (rc *regexpCache) getOrCompileRegexp(pattern string) (re *regexp.Regexp, err error) {
+ var ok bool
+
+ if re, ok = rc.get(pattern); !ok {
+ re, err = regexp.Compile(pattern)
+ if err != nil {
+ return nil, err
+ }
+ rc.set(pattern, re)
+ }
+
+ return re, nil
+}
+
+func (rc *regexpCache) get(key string) (re *regexp.Regexp, ok bool) {
+ rc.mu.RLock()
+ re, ok = rc.re[key]
+ rc.mu.RUnlock()
+ return
+}
+
+func (rc *regexpCache) set(key string, re *regexp.Regexp) {
+ rc.mu.Lock()
+ rc.re[key] = re
+ rc.mu.Unlock()
+}
+
+var reCache = regexpCache{re: make(map[string]*regexp.Regexp)}
+
+// GetOrCompileRegexp retrieves a regexp object from the cache based upon the pattern.
+// If the pattern is not found in the cache, the pattern is compiled and added to
+// the cache.
+func GetOrCompileRegexp(pattern string) (re *regexp.Regexp, err error) {
+ return reCache.getOrCompileRegexp(pattern)
+}
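+
+// Usage sketch (see the test and benchmark): repeated calls with the same
+// pattern hit the cache instead of recompiling:
+//
+//	re, err := GetOrCompileRegexp(`\d+`)
+//	if err == nil {
+//		re.MatchString("123") // true
+//	}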
+
+// InSlice checks if a string is an element of a slice of strings
+// and returns a boolean value.
+func InSlice(arr []string, el string) bool {
+ return slices.Contains(arr, el)
+}
+
+// InSlicEqualFold checks if a string is an element of a slice of strings
+// and returns a boolean value.
+// It uses strings.EqualFold to compare.
+func InSlicEqualFold(arr []string, el string) bool {
+ for _, v := range arr {
+ if strings.EqualFold(v, el) {
+ return true
+ }
+ }
+ return false
+}
+
+// ToString converts the given value to a string.
+// Note that this is a more strict version compared to cast.ToString,
+// as it will not try to convert numeric values to strings,
+// but only accept strings or fmt.Stringer.
+func ToString(v any) (string, bool) {
+ switch vv := v.(type) {
+ case string:
+ return vv, true
+ case fmt.Stringer:
+ return vv.String(), true
+ }
+ return "", false
+}
+
+type (
+ Strings2 [2]string
+ Strings3 [3]string
+)
diff --git a/common/hstrings/strings_test.go b/common/hstrings/strings_test.go
new file mode 100644
index 000000000..d8e9e204a
--- /dev/null
+++ b/common/hstrings/strings_test.go
@@ -0,0 +1,56 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package hstrings
+
+import (
+ "regexp"
+ "testing"
+
+ qt "github.com/frankban/quicktest"
+)
+
+func TestStringEqualFold(t *testing.T) {
+ c := qt.New(t)
+
+ s1 := "A"
+ s2 := "a"
+
+ c.Assert(StringEqualFold(s1).EqualFold(s2), qt.Equals, true)
+ c.Assert(StringEqualFold(s1).EqualFold(s1), qt.Equals, true)
+ c.Assert(StringEqualFold(s2).EqualFold(s1), qt.Equals, true)
+ c.Assert(StringEqualFold(s2).EqualFold(s2), qt.Equals, true)
+ c.Assert(StringEqualFold(s1).EqualFold("b"), qt.Equals, false)
+ c.Assert(StringEqualFold(s1).Eq(s2), qt.Equals, true)
+ c.Assert(StringEqualFold(s1).Eq("b"), qt.Equals, false)
+}
+
+func TestGetOrCompileRegexp(t *testing.T) {
+ c := qt.New(t)
+
+ re, err := GetOrCompileRegexp(`\d+`)
+ c.Assert(err, qt.IsNil)
+ c.Assert(re.MatchString("123"), qt.Equals, true)
+}
+
+func BenchmarkGetOrCompileRegexp(b *testing.B) {
+ for i := 0; i < b.N; i++ {
+ GetOrCompileRegexp(`\d+`)
+ }
+}
+
+func BenchmarkCompileRegexp(b *testing.B) {
+ for i := 0; i < b.N; i++ {
+ regexp.MustCompile(`\d+`)
+ }
+}
diff --git a/common/htime/htime_integration_test.go b/common/htime/htime_integration_test.go
new file mode 100644
index 000000000..8090add12
--- /dev/null
+++ b/common/htime/htime_integration_test.go
@@ -0,0 +1,78 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package htime_test
+
+import (
+ "testing"
+
+ "github.com/gohugoio/hugo/hugolib"
+)
+
+// Issue #11267
+func TestApplyWithContext(t *testing.T) {
+ t.Parallel()
+
+ files := `
+-- config.toml --
+defaultContentLanguage = 'it'
+-- layouts/index.html --
+{{ $dates := slice
+ "2022-01-03"
+ "2022-02-01"
+ "2022-03-02"
+ "2022-04-07"
+ "2022-05-06"
+ "2022-06-04"
+ "2022-07-03"
+ "2022-08-01"
+ "2022-09-06"
+ "2022-10-05"
+ "2022-11-03"
+ "2022-12-02"
+}}
+{{ range $dates }}
+ {{ . | time.Format "month: _January_ weekday: _Monday_" }}
+ {{ . | time.Format "month: _Jan_ weekday: _Mon_" }}
+{{ end }}
+ `
+
+ b := hugolib.Test(t, files)
+
+ b.AssertFileContent("public/index.html", `
+month: _gennaio_ weekday: _lunedì_
+month: _gen_ weekday: _lun_
+month: _febbraio_ weekday: _martedì_
+month: _feb_ weekday: _mar_
+month: _marzo_ weekday: _mercoledì_
+month: _mar_ weekday: _mer_
+month: _aprile_ weekday: _giovedì_
+month: _apr_ weekday: _gio_
+month: _maggio_ weekday: _venerdì_
+month: _mag_ weekday: _ven_
+month: _giugno_ weekday: _sabato_
+month: _giu_ weekday: _sab_
+month: _luglio_ weekday: _domenica_
+month: _lug_ weekday: _dom_
+month: _agosto_ weekday: _lunedì_
+month: _ago_ weekday: _lun_
+month: _settembre_ weekday: _martedì_
+month: _set_ weekday: _mar_
+month: _ottobre_ weekday: _mercoledì_
+month: _ott_ weekday: _mer_
+month: _novembre_ weekday: _giovedì_
+month: _nov_ weekday: _gio_
+month: _dicembre_ weekday: _venerdì_
+month: _dic_ weekday: _ven_
+`)
+}
diff --git a/common/htime/time.go b/common/htime/time.go
new file mode 100644
index 000000000..c71e39ee4
--- /dev/null
+++ b/common/htime/time.go
@@ -0,0 +1,177 @@
+// Copyright 2021 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package htime
+
+import (
+ "log"
+ "strings"
+ "time"
+
+ "github.com/bep/clocks"
+ "github.com/spf13/cast"
+
+ "github.com/gohugoio/locales"
+)
+
+var (
+ longDayNames = []string{
+ "Sunday",
+ "Monday",
+ "Tuesday",
+ "Wednesday",
+ "Thursday",
+ "Friday",
+ "Saturday",
+ }
+
+ shortDayNames = []string{
+ "Sun",
+ "Mon",
+ "Tue",
+ "Wed",
+ "Thu",
+ "Fri",
+ "Sat",
+ }
+
+ shortMonthNames = []string{
+ "Jan",
+ "Feb",
+ "Mar",
+ "Apr",
+ "May",
+ "Jun",
+ "Jul",
+ "Aug",
+ "Sep",
+ "Oct",
+ "Nov",
+ "Dec",
+ }
+
+ longMonthNames = []string{
+ "January",
+ "February",
+ "March",
+ "April",
+ "May",
+ "June",
+ "July",
+ "August",
+ "September",
+ "October",
+ "November",
+ "December",
+ }
+
+ Clock = clocks.System()
+)
+
+func NewTimeFormatter(ltr locales.Translator) TimeFormatter {
+ if ltr == nil {
+ panic("must provide a locales.Translator")
+ }
+ return TimeFormatter{
+ ltr: ltr,
+ }
+}
+
+// TimeFormatter is locale aware.
+type TimeFormatter struct {
+ ltr locales.Translator
+}
+
+func (f TimeFormatter) Format(t time.Time, layout string) string {
+ if layout == "" {
+ return ""
+ }
+
+ if layout[0] == ':' {
+ // It may be one of Hugo's custom layouts.
+ switch strings.ToLower(layout[1:]) {
+ case "date_full":
+ return f.ltr.FmtDateFull(t)
+ case "date_long":
+ return f.ltr.FmtDateLong(t)
+ case "date_medium":
+ return f.ltr.FmtDateMedium(t)
+ case "date_short":
+ return f.ltr.FmtDateShort(t)
+ case "time_full":
+ return f.ltr.FmtTimeFull(t)
+ case "time_long":
+ return f.ltr.FmtTimeLong(t)
+ case "time_medium":
+ return f.ltr.FmtTimeMedium(t)
+ case "time_short":
+ return f.ltr.FmtTimeShort(t)
+ }
+ }
+
+ s := t.Format(layout)
+
+ monthIdx := t.Month() - 1 // Month() starts at 1.
+ dayIdx := t.Weekday()
+
+ if strings.Contains(layout, "January") {
+ s = strings.ReplaceAll(s, longMonthNames[monthIdx], f.ltr.MonthWide(t.Month()))
+ } else if strings.Contains(layout, "Jan") {
+ s = strings.ReplaceAll(s, shortMonthNames[monthIdx], f.ltr.MonthAbbreviated(t.Month()))
+ }
+
+ if strings.Contains(layout, "Monday") {
+ s = strings.ReplaceAll(s, longDayNames[dayIdx], f.ltr.WeekdayWide(t.Weekday()))
+ } else if strings.Contains(layout, "Mon") {
+ s = strings.ReplaceAll(s, shortDayNames[dayIdx], f.ltr.WeekdayAbbreviated(t.Weekday()))
+ }
+
+ return s
+}
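+
+// Illustrative, with values taken from the tests: using the "nn" translator
+// from github.com/gohugoio/localescompressed on 2018-06-06,
+//
+//	f := NewTimeFormatter(translators.GetTranslator("nn"))
+//	f.Format(june06, "Monday Jan 2 2006") // "onsdag juni 6 2018"
+//	f.Format(june06, ":date_short")       // "06.06.2018"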
+
+func ToTimeInDefaultLocationE(i any, location *time.Location) (tim time.Time, err error) {
+ switch vv := i.(type) {
+ case AsTimeProvider:
+ return vv.AsTime(location), nil
+ // issue #8895
+ // datetimes parsed by `go-toml` have empty zone name
+ // convert back them into string and use `cast`
+ // TODO(bep) add tests, make sure we really need this.
+ case time.Time:
+ i = vv.Format(time.RFC3339)
+ }
+ return cast.ToTimeInDefaultLocationE(i, location)
+}
+
+// Now returns time.Now() or time value based on the `clock` flag.
+// Use this function to fake time inside hugo.
+func Now() time.Time {
+ return Clock.Now()
+}
+
+func Since(t time.Time) time.Duration {
+ return Clock.Since(t)
+}
+
+// AsTimeProvider is implemented by go-toml's LocalDate and LocalDateTime.
+type AsTimeProvider interface {
+ AsTime(zone *time.Location) time.Time
+}
+
+// StopWatch is a simple helper to measure time during development.
+func StopWatch(name string) func() {
+ start := time.Now()
+ return func() {
+ log.Printf("StopWatch %q took %s", name, time.Since(start))
+ }
+}
diff --git a/common/htime/time_test.go b/common/htime/time_test.go
new file mode 100644
index 000000000..78954887e
--- /dev/null
+++ b/common/htime/time_test.go
@@ -0,0 +1,144 @@
+// Copyright 2021 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package htime
+
+import (
+ "testing"
+ "time"
+
+ qt "github.com/frankban/quicktest"
+ translators "github.com/gohugoio/localescompressed"
+)
+
+func TestTimeFormatter(t *testing.T) {
+ c := qt.New(t)
+
+ june06, _ := time.Parse("2006-Jan-02", "2018-Jun-06")
+ june06 = june06.Add(7777 * time.Second)
+
+ jan06, _ := time.Parse("2006-Jan-02", "2018-Jan-06")
+ jan06 = jan06.Add(32 * time.Second)
+
+ mondayNovemberFirst, _ := time.Parse("2006-Jan-02", "2021-11-01")
+ mondayNovemberFirst = mondayNovemberFirst.Add(33 * time.Second)
+
+ c.Run("Norsk nynorsk", func(c *qt.C) {
+ f := NewTimeFormatter(translators.GetTranslator("nn"))
+
+ c.Assert(f.Format(june06, "Monday Jan 2 2006"), qt.Equals, "onsdag juni 6 2018")
+ c.Assert(f.Format(june06, "Mon January 2 2006"), qt.Equals, "on. juni 6 2018")
+ c.Assert(f.Format(june06, "Mon Mon"), qt.Equals, "on. on.")
+ })
+
+ c.Run("Custom layouts Norsk nynorsk", func(c *qt.C) {
+ f := NewTimeFormatter(translators.GetTranslator("nn"))
+
+ c.Assert(f.Format(june06, ":date_full"), qt.Equals, "onsdag 6. juni 2018")
+ c.Assert(f.Format(june06, ":date_long"), qt.Equals, "6. juni 2018")
+ c.Assert(f.Format(june06, ":date_medium"), qt.Equals, "6. juni 2018")
+ c.Assert(f.Format(june06, ":date_short"), qt.Equals, "06.06.2018")
+
+ c.Assert(f.Format(june06, ":time_full"), qt.Equals, "kl. 02:09:37 UTC")
+ c.Assert(f.Format(june06, ":time_long"), qt.Equals, "02:09:37 UTC")
+ c.Assert(f.Format(june06, ":time_medium"), qt.Equals, "02:09:37")
+ c.Assert(f.Format(june06, ":time_short"), qt.Equals, "02:09")
+ })
+
+ c.Run("Custom layouts English", func(c *qt.C) {
+ f := NewTimeFormatter(translators.GetTranslator("en"))
+
+ c.Assert(f.Format(june06, ":date_full"), qt.Equals, "Wednesday, June 6, 2018")
+ c.Assert(f.Format(june06, ":date_long"), qt.Equals, "June 6, 2018")
+ c.Assert(f.Format(june06, ":date_medium"), qt.Equals, "Jun 6, 2018")
+ c.Assert(f.Format(june06, ":date_short"), qt.Equals, "6/6/18")
+
+ c.Assert(f.Format(june06, ":time_full"), qt.Equals, "2:09:37 am UTC")
+ c.Assert(f.Format(june06, ":time_long"), qt.Equals, "2:09:37 am UTC")
+ c.Assert(f.Format(june06, ":time_medium"), qt.Equals, "2:09:37 am")
+ c.Assert(f.Format(june06, ":time_short"), qt.Equals, "2:09 am")
+ })
+
+ c.Run("English", func(c *qt.C) {
+ f := NewTimeFormatter(translators.GetTranslator("en"))
+
+ c.Assert(f.Format(june06, "Monday Jan 2 2006"), qt.Equals, "Wednesday Jun 6 2018")
+ c.Assert(f.Format(june06, "Mon January 2 2006"), qt.Equals, "Wed June 6 2018")
+ c.Assert(f.Format(june06, "Mon Mon"), qt.Equals, "Wed Wed")
+ })
+
+ c.Run("Weekdays German", func(c *qt.C) {
+ tr := translators.GetTranslator("de")
+ f := NewTimeFormatter(tr)
+
+ // Issue #9107
+ for i, weekDayWideGerman := range []string{"Montag", "Dienstag", "Mittwoch", "Donnerstag", "Freitag", "Samstag", "Sonntag"} {
+ date := mondayNovemberFirst.Add(time.Duration(i*24) * time.Hour)
+ c.Assert(tr.WeekdayWide(date.Weekday()), qt.Equals, weekDayWideGerman)
+ c.Assert(f.Format(date, "Monday"), qt.Equals, weekDayWideGerman)
+ }
+
+ for i, weekDayAbbreviatedGerman := range []string{"Mo.", "Di.", "Mi.", "Do.", "Fr.", "Sa.", "So."} {
+ date := mondayNovemberFirst.Add(time.Duration(i*24) * time.Hour)
+ c.Assert(tr.WeekdayAbbreviated(date.Weekday()), qt.Equals, weekDayAbbreviatedGerman)
+ c.Assert(f.Format(date, "Mon"), qt.Equals, weekDayAbbreviatedGerman)
+ }
+ })
+
+ c.Run("Months German", func(c *qt.C) {
+ tr := translators.GetTranslator("de")
+ f := NewTimeFormatter(tr)
+
+ // Issue #9107
+ for i, monthWideNorway := range []string{"Januar", "Februar", "März", "April", "Mai", "Juni", "Juli"} {
+ date := jan06.Add(time.Duration(i*24*31) * time.Hour)
+ c.Assert(tr.MonthWide(date.Month()), qt.Equals, monthWideNorway)
+ c.Assert(f.Format(date, "January"), qt.Equals, monthWideNorway)
+ }
+ })
+}
+
+func BenchmarkTimeFormatter(b *testing.B) {
+ june06, _ := time.Parse("2006-Jan-02", "2018-Jun-06")
+
+ b.Run("Native", func(b *testing.B) {
+ for i := 0; i < b.N; i++ {
+ got := june06.Format("Monday Jan 2 2006")
+ if got != "Wednesday Jun 6 2018" {
+ b.Fatalf("invalid format, got %q", got)
+ }
+ }
+ })
+
+ b.Run("Localized", func(b *testing.B) {
+ f := NewTimeFormatter(translators.GetTranslator("nn"))
+ b.ResetTimer()
+ for i := 0; i < b.N; i++ {
+ got := f.Format(june06, "Monday Jan 2 2006")
+ if got != "onsdag juni 6 2018" {
+ b.Fatalf("invalid format, got %q", got)
+ }
+ }
+ })
+
+ b.Run("Localized Custom", func(b *testing.B) {
+ f := NewTimeFormatter(translators.GetTranslator("nn"))
+ b.ResetTimer()
+ for i := 0; i < b.N; i++ {
+ got := f.Format(june06, ":date_medium")
+ if got != "6. juni 2018" {
+ b.Fatalf("invalid format, got %q", got)
+ }
+ }
+ })
+}
diff --git a/common/hugio/copy.go b/common/hugio/copy.go
index 2b756cb44..31d679dfc 100644
--- a/common/hugio/copy.go
+++ b/common/hugio/copy.go
@@ -14,60 +14,63 @@
package hugio
import (
+ "fmt"
"io"
- "io/ioutil"
- "os"
+ iofs "io/fs"
"path/filepath"
- "github.com/pkg/errors"
-
"github.com/spf13/afero"
)
// CopyFile copies a file.
func CopyFile(fs afero.Fs, from, to string) error {
- sf, err := os.Open(from)
+ sf, err := fs.Open(from)
if err != nil {
return err
}
defer sf.Close()
- df, err := os.Create(to)
+ df, err := fs.Create(to)
if err != nil {
return err
}
defer df.Close()
_, err = io.Copy(df, sf)
- if err == nil {
- si, err := os.Stat(from)
- if err != nil {
- err = os.Chmod(to, si.Mode())
-
- if err != nil {
- return err
- }
- }
-
+ if err != nil {
+ return err
}
+ si, err := fs.Stat(from)
+ if err == nil {
+ err = fs.Chmod(to, si.Mode())
+
+ if err != nil {
+ return err
+ }
+ }
+
return nil
}
// CopyDir copies a directory.
func CopyDir(fs afero.Fs, from, to string, shouldCopy func(filename string) bool) error {
- fi, err := os.Stat(from)
+ fi, err := fs.Stat(from)
if err != nil {
return err
}
if !fi.IsDir() {
- return errors.Errorf("%q is not a directory", from)
+ return fmt.Errorf("%q is not a directory", from)
}
- err = fs.MkdirAll(to, 0777) // before umask
+ err = fs.MkdirAll(to, 0o777) // before umask
if err != nil {
return err
}
- entries, _ := ioutil.ReadDir(from)
+ d, err := fs.Open(from)
+ if err != nil {
+ return err
+ }
+ entries, _ := d.(iofs.ReadDirFile).ReadDir(-1)
for _, entry := range entries {
fromFilename := filepath.Join(from, entry.Name())
toFilename := filepath.Join(to, entry.Name())
diff --git a/common/hugio/hasBytesWriter.go b/common/hugio/hasBytesWriter.go
new file mode 100644
index 000000000..d2bcd1bb4
--- /dev/null
+++ b/common/hugio/hasBytesWriter.go
@@ -0,0 +1,80 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package hugio
+
+import (
+ "bytes"
+)
+
+// HasBytesWriter is a writer that will match its writes against a slice of patterns.
+type HasBytesWriter struct {
+ Patterns []*HasBytesPattern
+
+ i int
+ done bool
+ buff []byte
+}
+
+type HasBytesPattern struct {
+ Match bool
+ Pattern []byte
+}
+
+func (h *HasBytesWriter) patternLen() int {
+ l := 0
+ for _, p := range h.Patterns {
+ l += len(p.Pattern)
+ }
+ return l
+}
+
+func (h *HasBytesWriter) Write(p []byte) (n int, err error) {
+ if h.done {
+ return len(p), nil
+ }
+
+ if len(h.buff) == 0 {
+ h.buff = make([]byte, h.patternLen()*2)
+ }
+
+ for i := range p {
+ h.buff[h.i] = p[i]
+ h.i++
+ if h.i == len(h.buff) {
+ // Shift left.
+ copy(h.buff, h.buff[len(h.buff)/2:])
+ h.i = len(h.buff) / 2
+ }
+
+ for _, pp := range h.Patterns {
+ if bytes.Contains(h.buff, pp.Pattern) {
+ pp.Match = true
+ done := true
+ for _, ppp := range h.Patterns {
+ if !ppp.Match {
+ done = false
+ break
+ }
+ }
+ if done {
+ h.done = true
+ }
+ return len(p), nil
+ }
+ }
+
+ }
+
+ return len(p), nil
+}
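+
+// Typical wiring, as in the test: tee writes through io.MultiWriter so the real
+// destination still receives every byte while matches are tracked across writes:
+//
+//	h := &HasBytesWriter{Patterns: []*HasBytesPattern{{Pattern: []byte("__foo")}}}
+//	w := io.MultiWriter(&buf, h)
+//	// After writing, h.Patterns[0].Match reports whether "__foo" was seen.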
diff --git a/common/hugio/hasBytesWriter_test.go b/common/hugio/hasBytesWriter_test.go
new file mode 100644
index 000000000..9e689a112
--- /dev/null
+++ b/common/hugio/hasBytesWriter_test.go
@@ -0,0 +1,67 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package hugio
+
+import (
+ "bytes"
+ "fmt"
+ "io"
+ "math/rand"
+ "strings"
+ "testing"
+ "time"
+
+ qt "github.com/frankban/quicktest"
+)
+
+func TestHasBytesWriter(t *testing.T) {
+ r := rand.New(rand.NewSource(time.Now().UnixNano()))
+
+ c := qt.New(t)
+
+ neww := func() (*HasBytesWriter, io.Writer) {
+ var b bytes.Buffer
+
+ h := &HasBytesWriter{
+ Patterns: []*HasBytesPattern{
+ {Pattern: []byte("__foo")},
+ },
+ }
+
+ return h, io.MultiWriter(&b, h)
+ }
+
+ rndStr := func() string {
+ return strings.Repeat("ab cfo", r.Intn(33))
+ }
+
+ for range 22 {
+ h, w := neww()
+ fmt.Fprint(w, rndStr()+"abc __foobar"+rndStr())
+ c.Assert(h.Patterns[0].Match, qt.Equals, true)
+
+ h, w = neww()
+ fmt.Fprint(w, rndStr()+"abc __f")
+ fmt.Fprint(w, "oo bar"+rndStr())
+ c.Assert(h.Patterns[0].Match, qt.Equals, true)
+
+ h, w = neww()
+ fmt.Fprint(w, rndStr()+"abc __moo bar")
+ c.Assert(h.Patterns[0].Match, qt.Equals, false)
+ }
+
+ h, w := neww()
+ fmt.Fprintf(w, "__foo")
+ c.Assert(h.Patterns[0].Match, qt.Equals, true)
+}
diff --git a/common/hugio/readers.go b/common/hugio/readers.go
index 8c901dd24..c4304c84e 100644
--- a/common/hugio/readers.go
+++ b/common/hugio/readers.go
@@ -14,6 +14,7 @@
package hugio
import (
+ "bytes"
"io"
"strings"
)
@@ -31,24 +32,75 @@ type ReadSeekCloser interface {
io.Closer
}
-// ReadSeekerNoOpCloser implements ReadSeekCloser by doing nothing in Close.
-// TODO(bep) rename this and simila to ReadSeekerNopCloser, naming used in stdlib, which kind of makes sense.
-type ReadSeekerNoOpCloser struct {
+// ReadSeekCloserProvider provides a ReadSeekCloser.
+type ReadSeekCloserProvider interface {
+ ReadSeekCloser() (ReadSeekCloser, error)
+}
+
+// readSeekerNopCloser implements ReadSeekCloser by doing nothing in Close.
+type readSeekerNopCloser struct {
ReadSeeker
}
// Close does nothing.
-func (r ReadSeekerNoOpCloser) Close() error {
+func (r readSeekerNopCloser) Close() error {
return nil
}
// NewReadSeekerNoOpCloser creates a new ReadSeekerNoOpCloser with the given ReadSeeker.
-func NewReadSeekerNoOpCloser(r ReadSeeker) ReadSeekerNoOpCloser {
- return ReadSeekerNoOpCloser{r}
+func NewReadSeekerNoOpCloser(r ReadSeeker) ReadSeekCloser {
+ return readSeekerNopCloser{r}
}
// NewReadSeekerNoOpCloserFromString uses strings.NewReader to create a new ReadSeekerNoOpCloser
// from the given string.
-func NewReadSeekerNoOpCloserFromString(content string) ReadSeekerNoOpCloser {
- return ReadSeekerNoOpCloser{strings.NewReader(content)}
+func NewReadSeekerNoOpCloserFromString(content string) ReadSeekCloser {
+ return stringReadSeeker{s: content, readSeekerNopCloser: readSeekerNopCloser{strings.NewReader(content)}}
+}
+
+var _ StringReader = (*stringReadSeeker)(nil)
+
+type stringReadSeeker struct {
+ s string
+ readSeekerNopCloser
+}
+
+func (s *stringReadSeeker) ReadString() string {
+ return s.s
+}
+
+// StringReader provides a way to read a string.
+type StringReader interface {
+ ReadString() string
+}
+
+// NewReadSeekerNoOpCloserFromBytes uses bytes.NewReader to create a new ReadSeekerNoOpCloser
+// from the given bytes slice.
+func NewReadSeekerNoOpCloserFromBytes(content []byte) readSeekerNopCloser {
+ return readSeekerNopCloser{bytes.NewReader(content)}
+}
+
+// NewOpenReadSeekCloser creates a new ReadSeekCloser from the given ReadSeeker.
+// The ReadSeeker will be seeked to the beginning before returned.
+func NewOpenReadSeekCloser(r ReadSeekCloser) OpenReadSeekCloser {
+ return func() (ReadSeekCloser, error) {
+ r.Seek(0, io.SeekStart)
+ return r, nil
+ }
+}
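+
+// Sketch: hand out an in-memory reader that can be re-read from the start on
+// every open:
+//
+//	open := NewOpenReadSeekCloser(NewReadSeekerNoOpCloserFromString("hello"))
+//	r, _ := open() // seeked back to offset 0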
+
+// OpenReadSeekCloser allows setting some other way (than reading from a filesystem)
+// to open or create a ReadSeekCloser.
+type OpenReadSeekCloser func() (ReadSeekCloser, error)
+
+// ReadString reads from the given reader and returns the content as a string.
+func ReadString(r io.Reader) (string, error) {
+ if sr, ok := r.(StringReader); ok {
+ return sr.ReadString(), nil
+ }
+ b, err := io.ReadAll(r)
+ if err != nil {
+ return "", err
+ }
+ return string(b), nil
}
diff --git a/common/hugio/writers.go b/common/hugio/writers.go
index 82c4dca52..6f439cc8b 100644
--- a/common/hugio/writers.go
+++ b/common/hugio/writers.go
@@ -15,9 +15,16 @@ package hugio
import (
"io"
- "io/ioutil"
)
+// As implemented by strings.Builder.
+type FlexiWriter interface {
+ io.Writer
+ io.ByteWriter
+ WriteString(s string) (int, error)
+ WriteRune(r rune) (int, error)
+}
+
type multiWriteCloser struct {
io.Writer
closers []io.WriteCloser
@@ -26,7 +33,7 @@ type multiWriteCloser struct {
func (m multiWriteCloser) Close() error {
var err error
for _, c := range m.closers {
- if closeErr := c.Close(); err != nil {
+ if closeErr := c.Close(); closeErr != nil {
err = closeErr
}
}
@@ -55,7 +62,7 @@ func ToWriteCloser(w io.Writer) io.WriteCloser {
io.Closer
}{
w,
- ioutil.NopCloser(nil),
+ io.NopCloser(nil),
}
}
@@ -71,6 +78,36 @@ func ToReadCloser(r io.Reader) io.ReadCloser {
io.Closer
}{
r,
- ioutil.NopCloser(nil),
+ io.NopCloser(nil),
}
}
+
+type ReadWriteCloser interface {
+ io.Reader
+ io.Writer
+ io.Closer
+}
+
+// PipeReadWriteCloser is a convenience type to create a pipe with a ReadCloser and a WriteCloser.
+type PipeReadWriteCloser struct {
+ *io.PipeReader
+ *io.PipeWriter
+}
+
+// NewPipeReadWriteCloser creates a new PipeReadWriteCloser.
+func NewPipeReadWriteCloser() PipeReadWriteCloser {
+ pr, pw := io.Pipe()
+ return PipeReadWriteCloser{pr, pw}
+}
+
+func (c PipeReadWriteCloser) Close() (err error) {
+ if err = c.PipeReader.Close(); err != nil {
+ return
+ }
+ err = c.PipeWriter.Close()
+ return
+}
+
+func (c PipeReadWriteCloser) WriteString(s string) (int, error) {
+ return c.PipeWriter.Write([]byte(s))
+}
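+
+// Sketch: write on one goroutine, read on the other; closing only the writer
+// side signals EOF to the reader:
+//
+//	p := NewPipeReadWriteCloser()
+//	go func() { p.WriteString("hi"); p.PipeWriter.Close() }()
+//	b, _ := io.ReadAll(p) // "hi"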
diff --git a/common/hugo/hugo.go b/common/hugo/hugo.go
index 62d923bf0..764a86a97 100644
--- a/common/hugo/hugo.go
+++ b/common/hugo/hugo.go
@@ -14,8 +14,32 @@
package hugo
import (
+ "context"
"fmt"
"html/template"
+ "os"
+ "path/filepath"
+ "runtime/debug"
+ "sort"
+ "strings"
+ "sync"
+ "time"
+
+ "github.com/bep/logg"
+
+ "github.com/bep/godartsass/v2"
+ "github.com/gohugoio/hugo/common/hcontext"
+ "github.com/gohugoio/hugo/common/hexec"
+ "github.com/gohugoio/hugo/common/loggers"
+ "github.com/gohugoio/hugo/common/maps"
+ "github.com/gohugoio/hugo/hugofs/files"
+
+ "github.com/spf13/afero"
+
+ iofs "io/fs"
+
+ "github.com/gohugoio/hugo/config"
+ "github.com/gohugoio/hugo/hugofs"
)
const (
@@ -24,16 +48,16 @@ const (
)
var (
- // commitHash contains the current Git revision. Use make to build to make
- // sure this gets set.
- commitHash string
-
- // buildDate contains the date of the current build.
+ // buildDate allows vendor-specified build date when .git/ is unavailable.
buildDate string
+ // vendorInfo contains vendor notes about the current build.
+ vendorInfo string
)
-// Info contains information about the current Hugo environment
-type Info struct {
+var _ maps.StoreProvider = (*HugoInfo)(nil)
+
+// HugoInfo contains information about the current Hugo environment
+type HugoInfo struct {
CommitHash string
BuildDate string
@@ -42,26 +66,402 @@ type Info struct {
// This can also be set by the user.
// It can be any string, but it will be all lower case.
Environment string
+
+ // version of go that the Hugo binary was built with
+ GoVersion string
+
+ conf ConfigProvider
+ deps []*Dependency
+
+ store *maps.Scratch
+
+ // Context gives access to some of the context scoped variables.
+ Context Context
}
// Version returns the current version as a comparable version string.
-func (i Info) Version() VersionString {
+func (i HugoInfo) Version() VersionString {
return CurrentVersion.Version()
}
// Generator a Hugo meta generator HTML tag.
-func (i Info) Generator() template.HTML {
- return template.HTML(fmt.Sprintf(`<meta name="generator" content="Hugo %s">`, CurrentVersion.String()))
+func (i HugoInfo) Generator() template.HTML {
+ return template.HTML(fmt.Sprintf(`<meta name="generator" content="Hugo %s">`, CurrentVersion.String()))
+}
+
+// IsDevelopment reports whether the current running environment is "development".
+func (i HugoInfo) IsDevelopment() bool {
+ return i.Environment == EnvironmentDevelopment
+}
+
+// IsProduction reports whether the current running environment is "production".
+func (i HugoInfo) IsProduction() bool {
+ return i.Environment == EnvironmentProduction
+}
+
+// IsServer reports whether the built-in server is running.
+func (i HugoInfo) IsServer() bool {
+ return i.conf.Running()
+}
+
+// IsExtended reports whether the Hugo binary is the extended version.
+func (i HugoInfo) IsExtended() bool {
+ return IsExtended
+}
+
+// WorkingDir returns the project working directory.
+func (i HugoInfo) WorkingDir() string {
+ return i.conf.WorkingDir()
+}
+
+// Deps gets a list of dependencies for this Hugo build.
+func (i HugoInfo) Deps() []*Dependency {
+ return i.deps
+}
+
+func (i HugoInfo) Store() *maps.Scratch {
+ return i.store
+}
+
+// Deprecated: Use hugo.IsMultihost instead.
+func (i HugoInfo) IsMultiHost() bool {
+ Deprecate("hugo.IsMultiHost", "Use hugo.IsMultihost instead.", "v0.124.0")
+ return i.conf.IsMultihost()
+}
+
+// IsMultihost reports whether each configured language has a unique baseURL.
+func (i HugoInfo) IsMultihost() bool {
+ return i.conf.IsMultihost()
+}
+
+// IsMultilingual reports whether there are two or more configured languages.
+func (i HugoInfo) IsMultilingual() bool {
+ return i.conf.IsMultilingual()
+}
+
+type contextKey uint8
+
+const (
+ contextKeyMarkupScope contextKey = iota
+)
+
+var markupScope = hcontext.NewContextDispatcher[string](contextKeyMarkupScope)
+
+type Context struct{}
+
+func (c Context) MarkupScope(ctx context.Context) string {
+ return GetMarkupScope(ctx)
+}
+
+// SetMarkupScope sets the markup scope in the context.
+func SetMarkupScope(ctx context.Context, s string) context.Context {
+ return markupScope.Set(ctx, s)
+}
+
+// GetMarkupScope gets the markup scope from the context.
+func GetMarkupScope(ctx context.Context) string {
+ return markupScope.Get(ctx)
+}
+
+// ConfigProvider represents the config options that are relevant for HugoInfo.
+type ConfigProvider interface {
+ Environment() string
+ Running() bool
+ WorkingDir() string
+ IsMultihost() bool
+ IsMultilingual() bool
}
// NewInfo creates a new Hugo Info object.
-func NewInfo(environment string) Info {
- if environment == "" {
- environment = EnvironmentProduction
+func NewInfo(conf ConfigProvider, deps []*Dependency) HugoInfo {
+ if conf.Environment() == "" {
+ panic("environment not set")
}
- return Info{
+ var (
+ commitHash string
+ buildDate string
+ goVersion string
+ )
+
+ bi := getBuildInfo()
+ if bi != nil {
+ commitHash = bi.Revision
+ buildDate = bi.RevisionTime
+ goVersion = bi.GoVersion
+ }
+
+ return HugoInfo{
CommitHash: commitHash,
BuildDate: buildDate,
- Environment: environment,
+ Environment: conf.Environment(),
+ conf: conf,
+ deps: deps,
+ store: maps.NewScratch(),
+ GoVersion: goVersion,
+ }
+}
+
+// GetExecEnviron creates and gets the common os/exec environment used in the
+// external programs we interact with via os/exec, e.g. postcss.
+func GetExecEnviron(workDir string, cfg config.AllProvider, fs afero.Fs) []string {
+ var env []string
+ nodepath := filepath.Join(workDir, "node_modules")
+ if np := os.Getenv("NODE_PATH"); np != "" {
+ nodepath = workDir + string(os.PathListSeparator) + np
+ }
+ config.SetEnvVars(&env, "NODE_PATH", nodepath)
+ config.SetEnvVars(&env, "PWD", workDir)
+ config.SetEnvVars(&env, "HUGO_ENVIRONMENT", cfg.Environment())
+ config.SetEnvVars(&env, "HUGO_ENV", cfg.Environment())
+ config.SetEnvVars(&env, "HUGO_PUBLISHDIR", filepath.Join(workDir, cfg.BaseConfig().PublishDir))
+
+ if fs != nil {
+ var fis []iofs.DirEntry
+ d, err := fs.Open(files.FolderJSConfig)
+ if err == nil {
+ fis, err = d.(iofs.ReadDirFile).ReadDir(-1)
+ }
+
+ if err == nil {
+ for _, fi := range fis {
+ key := fmt.Sprintf("HUGO_FILE_%s", strings.ReplaceAll(strings.ToUpper(fi.Name()), ".", "_"))
+ value := fi.(hugofs.FileMetaInfo).Meta().Filename
+ config.SetEnvVars(&env, key, value)
+ }
+ }
+ }
+
+ return env
+}
+
+type buildInfo struct {
+ VersionControlSystem string
+ Revision string
+ RevisionTime string
+ Modified bool
+
+ GoOS string
+ GoArch string
+
+ *debug.BuildInfo
+}
+
+var (
+ bInfo *buildInfo
+ bInfoInit sync.Once
+)
+
+func getBuildInfo() *buildInfo {
+ bInfoInit.Do(func() {
+ bi, ok := debug.ReadBuildInfo()
+ if !ok {
+ return
+ }
+
+ bInfo = &buildInfo{BuildInfo: bi}
+
+ for _, s := range bInfo.Settings {
+ switch s.Key {
+ case "vcs":
+ bInfo.VersionControlSystem = s.Value
+ case "vcs.revision":
+ bInfo.Revision = s.Value
+ case "vcs.time":
+ bInfo.RevisionTime = s.Value
+ case "vcs.modified":
+ bInfo.Modified = s.Value == "true"
+ case "GOOS":
+ bInfo.GoOS = s.Value
+ case "GOARCH":
+ bInfo.GoArch = s.Value
+ }
+ }
+ })
+
+ return bInfo
+}
+
+func formatDep(path, version string) string {
+ return fmt.Sprintf("%s=%q", path, version)
+}
+
+// GetDependencyList returns a sorted dependency list in the format package="version".
+// It includes both Go dependencies and a (manually maintained) list of C(++) dependencies.
+func GetDependencyList() []string {
+ var deps []string
+
+ bi := getBuildInfo()
+ if bi == nil {
+ return deps
+ }
+
+ for _, dep := range bi.Deps {
+ deps = append(deps, formatDep(dep.Path, dep.Version))
+ }
+
+ deps = append(deps, GetDependencyListNonGo()...)
+
+ sort.Strings(deps)
+
+ return deps
+}
+
+// GetDependencyListNonGo returns a list of non-Go dependencies.
+func GetDependencyListNonGo() []string {
+ var deps []string
+
+ if IsExtended {
+ deps = append(
+ deps,
+ formatDep("github.com/sass/libsass", "3.6.6"),
+ formatDep("github.com/webmproject/libwebp", "v1.3.2"),
+ )
+ }
+
+ if dartSass := dartSassVersion(); dartSass.ProtocolVersion != "" {
+ dartSassPath := "github.com/sass/dart-sass-embedded"
+ if IsDartSassGeV2() {
+ dartSassPath = "github.com/sass/dart-sass"
+ }
+ deps = append(deps,
+ formatDep(dartSassPath+"/protocol", dartSass.ProtocolVersion),
+ formatDep(dartSassPath+"/compiler", dartSass.CompilerVersion),
+ formatDep(dartSassPath+"/implementation", dartSass.ImplementationVersion),
+ )
+ }
+ return deps
+}
+
+// IsRunningAsTest reports whether we are running as a test.
+func IsRunningAsTest() bool {
+ for _, arg := range os.Args {
+ if strings.HasPrefix(arg, "-test") {
+ return true
+ }
+ }
+ return false
+}
+
+// Dependency is a single dependency, which can be either a Hugo Module or a local theme.
+type Dependency struct {
+ // Returns the path to this module.
+ // This will either be the module path, e.g. "github.com/gohugoio/myshortcodes",
+ // or the path below your /theme folder, e.g. "mytheme".
+ Path string
+
+ // The module version.
+ Version string
+
+ // Whether this dependency is vendored.
+ Vendor bool
+
+ // Time version was created.
+ Time time.Time
+
+ // In the dependency tree, this is the first module that defines this module
+ // as a dependency.
+ Owner *Dependency
+
+ // Replaced by this dependency.
+ Replace *Dependency
+}
+
+func dartSassVersion() godartsass.DartSassVersion {
+ if DartSassBinaryName == "" || !IsDartSassGeV2() {
+ return godartsass.DartSassVersion{}
+ }
+ v, _ := godartsass.Version(DartSassBinaryName)
+ return v
+}
+
+// DartSassBinaryName is the name of the Dart Sass binary to use.
+// TODO(bep) find a better place for this.
+var DartSassBinaryName string
+
+func init() {
+ DartSassBinaryName = os.Getenv("DART_SASS_BINARY")
+ if DartSassBinaryName == "" {
+ for _, name := range dartSassBinaryNamesV2 {
+ if hexec.InPath(name) {
+ DartSassBinaryName = name
+ break
+ }
+ }
+ if DartSassBinaryName == "" {
+ if hexec.InPath(dartSassBinaryNameV1) {
+ DartSassBinaryName = dartSassBinaryNameV1
+ }
+ }
+ }
+}
+
+var (
+ dartSassBinaryNameV1 = "dart-sass-embedded"
+ dartSassBinaryNamesV2 = []string{"dart-sass", "sass"}
+)
+
+// TODO(bep) we eventually want to remove this, but keep it for a while to throw an informative error.
+// We stopped supporting the old binary in Hugo 0.139.0.
+func IsDartSassGeV2() bool {
+ // dart-sass-embedded was the first version of the embedded Dart Sass before it was moved into the main project.
+ return !strings.Contains(DartSassBinaryName, "embedded")
+}
+
+// Deprecate informs about a deprecation starting at the given version.
+//
+// A deprecation typically needs a simple change in the template, but doing so will make the template incompatible with older versions.
+// Theme maintainers generally want
+// 1. No warnings or errors in the console when building a Hugo site.
+// 2. Their theme to work for at least the last few Hugo versions.
+func Deprecate(item, alternative string, version string) {
+ level := deprecationLogLevelFromVersion(version)
+ deprecateLevel(item, alternative, version, level)
+}
+
+// See Deprecate for details.
+func DeprecateWithLogger(item, alternative string, version string, log logg.Logger) {
+ level := deprecationLogLevelFromVersion(version)
+ deprecateLevelWithLogger(item, alternative, version, level, log)
+}
+
+// DeprecateLevelMin informs about a deprecation starting at the given version, but with a minimum log level.
+func DeprecateLevelMin(item, alternative string, version string, minLevel logg.Level) {
+ level := max(deprecationLogLevelFromVersion(version), minLevel)
+ deprecateLevel(item, alternative, version, level)
+}
+
+// deprecateLevel informs about a deprecation logging at the given level.
+func deprecateLevel(item, alternative, version string, level logg.Level) {
+ deprecateLevelWithLogger(item, alternative, version, level, loggers.Log().Logger())
+}
+
+// deprecateLevelWithLogger informs about a deprecation, logging at the given level to the given logger.
+func deprecateLevelWithLogger(item, alternative, version string, level logg.Level, log logg.Logger) {
+ var msg string
+ if level == logg.LevelError {
+ msg = fmt.Sprintf("%s was deprecated in Hugo %s and subsequently removed. %s", item, version, alternative)
+ } else {
+ msg = fmt.Sprintf("%s was deprecated in Hugo %s and will be removed in a future release. %s", item, version, alternative)
+ }
+
+ log.WithLevel(level).WithField(loggers.FieldNameCmd, "deprecated").Logf("%s", msg)
+}
+
+// We usually do about one minor version a month.
+// We want people to run at least the current and previous version without any warnings.
+// We want people who don't update Hugo that often to see the warnings and errors before we remove the feature.
+func deprecationLogLevelFromVersion(ver string) logg.Level {
+ from := MustParseVersion(ver)
+ to := CurrentVersion
+ minorDiff := to.Minor - from.Minor
+ switch {
+ case minorDiff >= 15:
+ // Start failing the build after about 15 months.
+ return logg.LevelError
+ case minorDiff >= 3:
+ // Start printing warnings after about 3 months.
+ return logg.LevelWarn
+ default:
+ return logg.LevelInfo
}
}
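
A minimal sketch of how the markup-scope context helpers above are meant to be used; the "summary" scope value is only an illustration:

package main

import (
    "context"
    "fmt"

    "github.com/gohugoio/hugo/common/hugo"
)

func main() {
    // A caller higher up the render chain tags the context with a scope ...
    ctx := hugo.SetMarkupScope(context.Background(), "summary")

    // ... and anything further down can read it back, either directly or
    // via HugoInfo.Context.
    fmt.Println(hugo.GetMarkupScope(ctx))        // summary
    fmt.Println(hugo.Context{}.MarkupScope(ctx)) // summary
}
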
diff --git a/common/hugo/hugo_integration_test.go b/common/hugo/hugo_integration_test.go
new file mode 100644
index 000000000..77dbb5c91
--- /dev/null
+++ b/common/hugo/hugo_integration_test.go
@@ -0,0 +1,77 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package hugo_test
+
+import (
+ "strings"
+ "testing"
+
+ "github.com/gohugoio/hugo/hugolib"
+)
+
+func TestIsMultilingualAndIsMultihost(t *testing.T) {
+ t.Parallel()
+
+ files := `
+-- hugo.toml --
+disableKinds = ['page','rss','section','sitemap','taxonomy','term']
+defaultContentLanguageInSubdir = true
+[languages.de]
+baseURL = 'https://de.example.org/'
+[languages.en]
+baseURL = 'https://en.example.org/'
+-- content/_index.md --
+---
+title: home
+---
+-- layouts/index.html --
+multilingual={{ hugo.IsMultilingual }}
+multihost={{ hugo.IsMultihost }}
+ `
+
+ b := hugolib.Test(t, files)
+
+ b.AssertFileContent("public/de/index.html",
+ "multilingual=true",
+ "multihost=true",
+ )
+ b.AssertFileContent("public/en/index.html",
+ "multilingual=true",
+ "multihost=true",
+ )
+
+ files = strings.ReplaceAll(files, "baseURL = 'https://de.example.org/'", "")
+ files = strings.ReplaceAll(files, "baseURL = 'https://en.example.org/'", "")
+
+ b = hugolib.Test(t, files)
+
+ b.AssertFileContent("public/de/index.html",
+ "multilingual=true",
+ "multihost=false",
+ )
+ b.AssertFileContent("public/en/index.html",
+ "multilingual=true",
+ "multihost=false",
+ )
+
+ files = strings.ReplaceAll(files, "[languages.de]", "")
+ files = strings.ReplaceAll(files, "[languages.en]", "")
+
+ b = hugolib.Test(t, files)
+
+ b.AssertFileContent("public/en/index.html",
+ "multilingual=false",
+ "multihost=false",
+ )
+}
diff --git a/common/hugo/hugo_test.go b/common/hugo/hugo_test.go
index 5be575b62..f938073da 100644
--- a/common/hugo/hugo_test.go
+++ b/common/hugo/hugo_test.go
@@ -14,22 +14,98 @@
package hugo
import (
+ "context"
"fmt"
"testing"
+ "github.com/bep/logg"
qt "github.com/frankban/quicktest"
)
func TestHugoInfo(t *testing.T) {
c := qt.New(t)
- hugoInfo := NewInfo("")
+ conf := testConfig{environment: "production", workingDir: "/mywork", running: false}
+ hugoInfo := NewInfo(conf, nil)
c.Assert(hugoInfo.Version(), qt.Equals, CurrentVersion.Version())
c.Assert(fmt.Sprintf("%T", VersionString("")), qt.Equals, fmt.Sprintf("%T", hugoInfo.Version()))
- c.Assert(hugoInfo.CommitHash, qt.Equals, commitHash)
- c.Assert(hugoInfo.BuildDate, qt.Equals, buildDate)
+ c.Assert(hugoInfo.WorkingDir(), qt.Equals, "/mywork")
+
+ bi := getBuildInfo()
+ if bi != nil {
+ c.Assert(hugoInfo.CommitHash, qt.Equals, bi.Revision)
+ c.Assert(hugoInfo.BuildDate, qt.Equals, bi.RevisionTime)
+ c.Assert(hugoInfo.GoVersion, qt.Equals, bi.GoVersion)
+ }
c.Assert(hugoInfo.Environment, qt.Equals, "production")
c.Assert(string(hugoInfo.Generator()), qt.Contains, fmt.Sprintf("Hugo %s", hugoInfo.Version()))
+ c.Assert(hugoInfo.IsDevelopment(), qt.Equals, false)
+ c.Assert(hugoInfo.IsProduction(), qt.Equals, true)
+ c.Assert(hugoInfo.IsExtended(), qt.Equals, IsExtended)
+ c.Assert(hugoInfo.IsServer(), qt.Equals, false)
+ devHugoInfo := NewInfo(testConfig{environment: "development", running: true}, nil)
+ c.Assert(devHugoInfo.IsDevelopment(), qt.Equals, true)
+ c.Assert(devHugoInfo.IsProduction(), qt.Equals, false)
+ c.Assert(devHugoInfo.IsServer(), qt.Equals, true)
+}
+
+func TestDeprecationLogLevelFromVersion(t *testing.T) {
+ c := qt.New(t)
+
+ c.Assert(deprecationLogLevelFromVersion("0.55.0"), qt.Equals, logg.LevelError)
+ ver := CurrentVersion
+ c.Assert(deprecationLogLevelFromVersion(ver.String()), qt.Equals, logg.LevelInfo)
+ ver.Minor -= 3
+ c.Assert(deprecationLogLevelFromVersion(ver.String()), qt.Equals, logg.LevelWarn)
+ ver.Minor -= 4
+ c.Assert(deprecationLogLevelFromVersion(ver.String()), qt.Equals, logg.LevelWarn)
+ ver.Minor -= 13
+ c.Assert(deprecationLogLevelFromVersion(ver.String()), qt.Equals, logg.LevelError)
+
+ // Added just to find the threshold for where we can remove deprecated items.
+ // Subtract 5 from the minor version of the first ERRORed version => 0.122.0.
+ c.Assert(deprecationLogLevelFromVersion("0.127.0"), qt.Equals, logg.LevelError)
+}
+
+func TestMarkupScope(t *testing.T) {
+ c := qt.New(t)
+
+ conf := testConfig{environment: "production", workingDir: "/mywork", running: false}
+ info := NewInfo(conf, nil)
+
+ ctx := context.Background()
+
+ ctx = SetMarkupScope(ctx, "foo")
+
+ c.Assert(info.Context.MarkupScope(ctx), qt.Equals, "foo")
+}
+
+type testConfig struct {
+ environment string
+ running bool
+ workingDir string
+ multihost bool
+ multilingual bool
+}
+
+func (c testConfig) Environment() string {
+ return c.environment
+}
+
+func (c testConfig) Running() bool {
+ return c.running
+}
+
+func (c testConfig) WorkingDir() string {
+ return c.workingDir
+}
+
+func (c testConfig) IsMultihost() bool {
+ return c.multihost
+}
+
+func (c testConfig) IsMultilingual() bool {
+ return c.multilingual
}
diff --git a/common/hugo/vars_extended.go b/common/hugo/vars_extended.go
index bb96bade6..ab01e2647 100644
--- a/common/hugo/vars_extended.go
+++ b/common/hugo/vars_extended.go
@@ -11,7 +11,7 @@
// See the License for the specific language governing permissions and
// limitations under the License.
-// +build extended
+//go:build extended
package hugo
diff --git a/common/hugo/vars_regular.go b/common/hugo/vars_regular.go
index fae18df14..a78aeb0b6 100644
--- a/common/hugo/vars_regular.go
+++ b/common/hugo/vars_regular.go
@@ -11,7 +11,7 @@
// See the License for the specific language governing permissions and
// limitations under the License.
-// +build !extended
+//go:build !extended
package hugo
diff --git a/common/hugo/vars_withdeploy.go b/common/hugo/vars_withdeploy.go
new file mode 100644
index 000000000..4e0c3efbb
--- /dev/null
+++ b/common/hugo/vars_withdeploy.go
@@ -0,0 +1,18 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+//go:build withdeploy
+
+package hugo
+
+var IsWithdeploy = true
diff --git a/common/hugo/vars_withdeploy_off.go b/common/hugo/vars_withdeploy_off.go
new file mode 100644
index 000000000..36e9bd874
--- /dev/null
+++ b/common/hugo/vars_withdeploy_off.go
@@ -0,0 +1,18 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+//go:build !withdeploy
+
+package hugo
+
+var IsWithdeploy = false
diff --git a/common/hugo/version.go b/common/hugo/version.go
index 848393f97..cf5988840 100644
--- a/common/hugo/version.go
+++ b/common/hugo/version.go
@@ -15,9 +15,10 @@ package hugo
import (
"fmt"
- "strconv"
-
+ "io"
+ "math"
"runtime"
+ "strconv"
"strings"
"github.com/gohugoio/hugo/compare"
@@ -26,8 +27,9 @@ import (
// Version represents the Hugo build version.
type Version struct {
- // Major and minor version.
- Number float32
+ Major int
+
+ Minor int
// Increment this for bug releases
PatchLevel int
@@ -43,7 +45,7 @@ var (
)
func (v Version) String() string {
- return version(v.Number, v.PatchLevel, v.Suffix)
+ return version(v.Major, v.Minor, v.PatchLevel, v.Suffix)
}
// Version returns the Hugo version.
@@ -51,6 +53,11 @@ func (v Version) Version() VersionString {
return VersionString(v.String())
}
+// Compare implements the compare.Comparer interface.
+func (h Version) Compare(other any) int {
+ return compareVersions(h, other)
+}
+
// VersionString represents a Hugo version string.
type VersionString string
@@ -59,13 +66,16 @@ func (h VersionString) String() string {
}
// Compare implements the compare.Comparer interface.
-func (h VersionString) Compare(other interface{}) int {
- v := MustParseVersion(h.String())
- return compareVersionsWithSuffix(v.Number, v.PatchLevel, v.Suffix, other)
+func (h VersionString) Compare(other any) int {
+ return compareVersions(h.Version(), other)
+}
+
+func (h VersionString) Version() Version {
+ return MustParseVersion(h.String())
}
// Eq implements the compare.Eqer interface.
-func (h VersionString) Eq(other interface{}) bool {
+func (h VersionString) Eq(other any) bool {
s, err := cast.ToStringE(other)
if err != nil {
return false
@@ -85,10 +95,7 @@ func ParseVersion(s string) (Version, error) {
}
}
- v, p := parseVersion(s)
-
- vv.Number = v
- vv.PatchLevel = p
+ vv.Major, vv.Minor, vv.PatchLevel = parseVersion(s)
return vv, nil
}
@@ -111,76 +118,112 @@ func (v Version) ReleaseVersion() Version {
// Next returns the next Hugo release version.
func (v Version) Next() Version {
- return Version{Number: v.Number + 0.01}
+ return Version{Major: v.Major, Minor: v.Minor + 1}
}
// Prev returns the previous Hugo release version.
func (v Version) Prev() Version {
- return Version{Number: v.Number - 0.01}
+ return Version{Major: v.Major, Minor: v.Minor - 1}
}
// NextPatchLevel returns the next patch/bugfix Hugo version.
// This will be a patch increment on the previous Hugo version.
func (v Version) NextPatchLevel(level int) Version {
- return Version{Number: v.Number - 0.01, PatchLevel: level}
+ prev := v.Prev()
+ prev.PatchLevel = level
+ return prev
}
// BuildVersionString creates a version string. This is what you see when
// running "hugo version".
func BuildVersionString() string {
- program := "Hugo Static Site Generator"
+ // program := "Hugo Static Site Generator" // replaced by the short name below
+ program := "hugo"
version := "v" + CurrentVersion.String()
- if commitHash != "" {
- version += "-" + strings.ToUpper(commitHash)
+
+ bi := getBuildInfo()
+ if bi == nil {
+ return version
+ }
+ if bi.Revision != "" {
+ version += "-" + bi.Revision
}
if IsExtended {
- version += "/extended"
+ version += "+extended"
+ }
+ if IsWithdeploy {
+ version += "+withdeploy"
}
- osArch := runtime.GOOS + "/" + runtime.GOARCH
+ osArch := bi.GoOS + "/" + bi.GoArch
- date := buildDate
+ date := bi.RevisionTime
+ if date == "" {
+ // Accept vendor-specified build date if .git/ is unavailable.
+ date = buildDate
+ }
if date == "" {
date = "unknown"
}
- return fmt.Sprintf("%s %s %s BuildDate: %s", program, version, osArch, date)
+ versionString := fmt.Sprintf("%s %s %s BuildDate=%s",
+ program, version, osArch, date)
+ if vendorInfo != "" {
+ versionString += " VendorInfo=" + vendorInfo
+ }
+
+ return versionString
}
-func version(version float32, patchVersion int, suffix string) string {
- if patchVersion > 0 || version > 0.53 {
- return fmt.Sprintf("%.2f.%d%s", version, patchVersion, suffix)
+func version(major, minor, patch int, suffix string) string {
+ if patch > 0 || minor > 53 {
+ return fmt.Sprintf("%d.%d.%d%s", major, minor, patch, suffix)
}
- return fmt.Sprintf("%.2f%s", version, suffix)
+ return fmt.Sprintf("%d.%d%s", major, minor, suffix)
}
// CompareVersion compares the given version string or number against the
// running Hugo version.
// It returns -1 if the given version is less than, 0 if equal and 1 if greater than
// the running version.
-func CompareVersion(version interface{}) int {
- return compareVersionsWithSuffix(CurrentVersion.Number, CurrentVersion.PatchLevel, CurrentVersion.Suffix, version)
+func CompareVersion(version any) int {
+ return compareVersions(CurrentVersion, version)
}
-func compareVersions(inVersion float32, inPatchVersion int, in interface{}) int {
- return compareVersionsWithSuffix(inVersion, inPatchVersion, "", in)
-}
-
-func compareVersionsWithSuffix(inVersion float32, inPatchVersion int, suffix string, in interface{}) int {
+func compareVersions(inVersion Version, in any) int {
var c int
switch d := in.(type) {
case float64:
- c = compareFloatVersions(inVersion, float32(d))
+ c = compareFloatWithVersion(d, inVersion)
case float32:
- c = compareFloatVersions(inVersion, d)
+ c = compareFloatWithVersion(float64(d), inVersion)
case int:
- c = compareFloatVersions(inVersion, float32(d))
+ c = compareFloatWithVersion(float64(d), inVersion)
case int32:
- c = compareFloatVersions(inVersion, float32(d))
+ c = compareFloatWithVersion(float64(d), inVersion)
case int64:
- c = compareFloatVersions(inVersion, float32(d))
+ c = compareFloatWithVersion(float64(d), inVersion)
+ case Version:
+ if d.Major == inVersion.Major && d.Minor == inVersion.Minor && d.PatchLevel == inVersion.PatchLevel {
+ return strings.Compare(inVersion.Suffix, d.Suffix)
+ }
+ if d.Major > inVersion.Major {
+ return 1
+ } else if d.Major < inVersion.Major {
+ return -1
+ }
+ if d.Minor > inVersion.Minor {
+ return 1
+ } else if d.Minor < inVersion.Minor {
+ return -1
+ }
+ if d.PatchLevel > inVersion.PatchLevel {
+ return 1
+ } else if d.PatchLevel < inVersion.PatchLevel {
+ return -1
+ }
default:
s, err := cast.ToStringE(in)
if err != nil {
@@ -191,50 +234,53 @@ func compareVersionsWithSuffix(inVersion float32, inPatchVersion int, suffix str
if err != nil {
return -1
}
+ return inVersion.Compare(v)
- if v.Number == inVersion && v.PatchLevel == inPatchVersion {
- return strings.Compare(suffix, v.Suffix)
- }
-
- if v.Number < inVersion || (v.Number == inVersion && v.PatchLevel < inPatchVersion) {
- return -1
- }
-
- return 1
- }
-
- if c == 0 && suffix != "" {
- return 1
}
return c
}
-func parseVersion(s string) (float32, int) {
- var (
- v float32
- p int
- )
-
- if strings.Count(s, ".") == 2 {
- li := strings.LastIndex(s, ".")
- p = cast.ToInt(s[li+1:])
- s = s[:li]
+func parseVersion(s string) (int, int, int) {
+ var major, minor, patch int
+ parts := strings.Split(s, ".")
+ if len(parts) > 0 {
+ major, _ = strconv.Atoi(parts[0])
+ }
+ if len(parts) > 1 {
+ minor, _ = strconv.Atoi(parts[1])
+ }
+ if len(parts) > 2 {
+ patch, _ = strconv.Atoi(parts[2])
}
- v = float32(cast.ToFloat64(s))
-
- return v, p
+ return major, minor, patch
}
-func compareFloatVersions(version float32, v float32) int {
- if v == version {
+// compareFloatWithVersion compares v1 with v2.
+// It returns -1 if v1 is less than v2, 0 if v1 is equal to v2 and 1 if v1 is greater than v2.
+func compareFloatWithVersion(v1 float64, v2 Version) int {
+ mf, minf := math.Modf(v1)
+ v1maj := int(mf)
+ v1min := int(minf * 100)
+
+ if v2.Major == v1maj && v2.Minor == v1min {
return 0
}
- if v < version {
+
+ if v1maj > v2.Major {
+ return 1
+ }
+
+ if v1maj < v2.Major {
return -1
}
- return 1
+
+ if v1min > v2.Minor {
+ return 1
+ }
+
+ return -1
}
func GoMinorVersion() int {
@@ -245,7 +291,15 @@ func goMinorVersion(version string) int {
if strings.HasPrefix(version, "devel") {
return 9999 // magic
}
- i, _ := strconv.Atoi(strings.Split(version, ".")[1])
- return i
-
+ var major, minor int
+ var trailing string
+ n, err := fmt.Sscanf(version, "go%d.%d%s", &major, &minor, &trailing)
+ if n == 2 && err == io.EOF {
+ // Means there were no trailing characters (i.e., not an alpha/beta)
+ err = nil
+ }
+ if err != nil {
+ return 0
+ }
+ return minor
}
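
A short usage sketch of the reworked Version type above; the return values mirror the updated tests that follow (negative when the argument is older than the receiver, positive when newer):

package main

import (
    "fmt"

    "github.com/gohugoio/hugo/common/hugo"
)

func main() {
    v := hugo.MustParseVersion("0.148.0") // Major=0, Minor=148, PatchLevel=0
    fmt.Println(v.String())               // 0.148.0

    fmt.Println(v.Compare("0.147.0")) // -1 (argument is older)
    fmt.Println(v.Compare("0.149.0")) // 1  (argument is newer)

    // Numeric input is interpreted as major.minor, so 0.20 means v0.20.
    fmt.Println(hugo.MustParseVersion("0.20.0").Compare(0.20)) // 0
}
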
diff --git a/common/hugo/version_current.go b/common/hugo/version_current.go
index f6bfedac3..ba367ceb5 100644
--- a/common/hugo/version_current.go
+++ b/common/hugo/version_current.go
@@ -16,7 +16,8 @@ package hugo
// CurrentVersion represents the current build version.
// This should be the only one.
var CurrentVersion = Version{
- Number: 0.57,
- PatchLevel: 2,
- Suffix: "",
+ Major: 0,
+ Minor: 148,
+ PatchLevel: 0,
+ Suffix: "-DEV",
}
diff --git a/common/hugo/version_test.go b/common/hugo/version_test.go
index e0cd0e6e8..33e50ebf5 100644
--- a/common/hugo/version_test.go
+++ b/common/hugo/version_test.go
@@ -22,10 +22,10 @@ import (
func TestHugoVersion(t *testing.T) {
c := qt.New(t)
- c.Assert(version(0.15, 0, "-DEV"), qt.Equals, "0.15-DEV")
- c.Assert(version(0.15, 2, "-DEV"), qt.Equals, "0.15.2-DEV")
+ c.Assert(version(0, 15, 0, "-DEV"), qt.Equals, "0.15-DEV")
+ c.Assert(version(0, 15, 2, "-DEV"), qt.Equals, "0.15.2-DEV")
- v := Version{Number: 0.21, PatchLevel: 0, Suffix: "-DEV"}
+ v := Version{Minor: 21, Suffix: "-DEV"}
c.Assert(v.ReleaseVersion().String(), qt.Equals, "0.21")
c.Assert(v.String(), qt.Equals, "0.21-DEV")
@@ -39,37 +39,36 @@ func TestHugoVersion(t *testing.T) {
// We started to use full semver versions even for main
// releases in v0.54.0
- v = Version{Number: 0.53, PatchLevel: 0}
+ v = Version{Minor: 53, PatchLevel: 0}
c.Assert(v.String(), qt.Equals, "0.53")
c.Assert(v.Next().String(), qt.Equals, "0.54.0")
c.Assert(v.Next().Next().String(), qt.Equals, "0.55.0")
- v = Version{Number: 0.54, PatchLevel: 0, Suffix: "-DEV"}
+ v = Version{Minor: 54, PatchLevel: 0, Suffix: "-DEV"}
c.Assert(v.String(), qt.Equals, "0.54.0-DEV")
}
func TestCompareVersions(t *testing.T) {
c := qt.New(t)
- c.Assert(compareVersions(0.20, 0, 0.20), qt.Equals, 0)
- c.Assert(compareVersions(0.20, 0, float32(0.20)), qt.Equals, 0)
- c.Assert(compareVersions(0.20, 0, float64(0.20)), qt.Equals, 0)
- c.Assert(compareVersions(0.19, 1, 0.20), qt.Equals, 1)
- c.Assert(compareVersions(0.19, 3, "0.20.2"), qt.Equals, 1)
- c.Assert(compareVersions(0.19, 1, 0.01), qt.Equals, -1)
- c.Assert(compareVersions(0, 1, 3), qt.Equals, 1)
- c.Assert(compareVersions(0, 1, int32(3)), qt.Equals, 1)
- c.Assert(compareVersions(0, 1, int64(3)), qt.Equals, 1)
- c.Assert(compareVersions(0.20, 0, "0.20"), qt.Equals, 0)
- c.Assert(compareVersions(0.20, 1, "0.20.1"), qt.Equals, 0)
- c.Assert(compareVersions(0.20, 1, "0.20"), qt.Equals, -1)
- c.Assert(compareVersions(0.20, 0, "0.20.1"), qt.Equals, 1)
- c.Assert(compareVersions(0.20, 1, "0.20.2"), qt.Equals, 1)
- c.Assert(compareVersions(0.21, 1, "0.22.1"), qt.Equals, 1)
- c.Assert(compareVersions(0.22, 0, "0.22-DEV"), qt.Equals, -1)
- c.Assert(compareVersions(0.22, 0, "0.22.1-DEV"), qt.Equals, 1)
- c.Assert(compareVersionsWithSuffix(0.22, 0, "-DEV", "0.22"), qt.Equals, 1)
- c.Assert(compareVersionsWithSuffix(0.22, 1, "-DEV", "0.22"), qt.Equals, -1)
- c.Assert(compareVersionsWithSuffix(0.22, 1, "-DEV", "0.22.1-DEV"), qt.Equals, 0)
+ c.Assert(compareVersions(MustParseVersion("0.20.0"), 0.20), qt.Equals, 0)
+ c.Assert(compareVersions(MustParseVersion("0.20.0"), float32(0.20)), qt.Equals, 0)
+ c.Assert(compareVersions(MustParseVersion("0.20.0"), float64(0.20)), qt.Equals, 0)
+ c.Assert(compareVersions(MustParseVersion("0.19.1"), 0.20), qt.Equals, 1)
+ c.Assert(compareVersions(MustParseVersion("0.19.3"), "0.20.2"), qt.Equals, 1)
+ c.Assert(compareVersions(MustParseVersion("0.1"), 3), qt.Equals, 1)
+ c.Assert(compareVersions(MustParseVersion("0.1"), int32(3)), qt.Equals, 1)
+ c.Assert(compareVersions(MustParseVersion("0.1"), int64(3)), qt.Equals, 1)
+ c.Assert(compareVersions(MustParseVersion("0.20"), "0.20"), qt.Equals, 0)
+ c.Assert(compareVersions(MustParseVersion("0.20.1"), "0.20.1"), qt.Equals, 0)
+ c.Assert(compareVersions(MustParseVersion("0.20.1"), "0.20"), qt.Equals, -1)
+ c.Assert(compareVersions(MustParseVersion("0.20.0"), "0.20.1"), qt.Equals, 1)
+ c.Assert(compareVersions(MustParseVersion("0.20.1"), "0.20.2"), qt.Equals, 1)
+ c.Assert(compareVersions(MustParseVersion("0.21.1"), "0.22.1"), qt.Equals, 1)
+ c.Assert(compareVersions(MustParseVersion("0.22.0"), "0.22-DEV"), qt.Equals, -1)
+ c.Assert(compareVersions(MustParseVersion("0.22.0"), "0.22.1-DEV"), qt.Equals, 1)
+ c.Assert(compareVersions(MustParseVersion("0.22.0-DEV"), "0.22"), qt.Equals, 1)
+ c.Assert(compareVersions(MustParseVersion("0.22.1-DEV"), "0.22"), qt.Equals, -1)
+ c.Assert(compareVersions(MustParseVersion("0.22.1-DEV"), "0.22.1-DEV"), qt.Equals, 0)
}
func TestParseHugoVersion(t *testing.T) {
@@ -84,5 +83,6 @@ func TestParseHugoVersion(t *testing.T) {
func TestGoMinorVersion(t *testing.T) {
c := qt.New(t)
c.Assert(goMinorVersion("go1.12.5"), qt.Equals, 12)
+ c.Assert(goMinorVersion("go1.14rc1"), qt.Equals, 14)
c.Assert(GoMinorVersion() >= 11, qt.Equals, true)
}
diff --git a/common/loggers/handlerdefault.go b/common/loggers/handlerdefault.go
new file mode 100644
index 000000000..bc3c7eec2
--- /dev/null
+++ b/common/loggers/handlerdefault.go
@@ -0,0 +1,106 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+// Some functions in this file (see comments) are based on the Go source code,
+// copyright The Go Authors and governed by a BSD-style license.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+// Package loggers contains some basic logging setup.
+package loggers
+
+import (
+ "fmt"
+ "io"
+ "strings"
+ "sync"
+
+ "github.com/bep/logg"
+
+ "github.com/fatih/color"
+)
+
+// levelColor mapping.
+var levelColor = [...]*color.Color{
+ logg.LevelTrace: color.New(color.FgWhite),
+ logg.LevelDebug: color.New(color.FgWhite),
+ logg.LevelInfo: color.New(color.FgBlue),
+ logg.LevelWarn: color.New(color.FgYellow),
+ logg.LevelError: color.New(color.FgRed),
+}
+
+// levelString mapping.
+var levelString = [...]string{
+ logg.LevelTrace: "TRACE",
+ logg.LevelDebug: "DEBUG",
+ logg.LevelInfo: "INFO ",
+ logg.LevelWarn: "WARN ",
+ logg.LevelError: "ERROR",
+}
+
+// newDefaultHandler creates a new default log handler.
+func newDefaultHandler(outWriter, errWriter io.Writer) logg.Handler {
+ return &defaultHandler{
+ outWriter: outWriter,
+ errWriter: errWriter,
+ Padding: 0,
+ }
+}
+
+// Default Handler implementation.
+// Based on https://github.com/apex/log/blob/master/handlers/cli/cli.go
+type defaultHandler struct {
+ mu sync.Mutex
+ outWriter io.Writer // Defaults to os.Stdout.
+ errWriter io.Writer // Defaults to os.Stderr.
+
+ Padding int
+}
+
+// HandleLog implements logg.Handler.
+func (h *defaultHandler) HandleLog(e *logg.Entry) error {
+ color := levelColor[e.Level]
+ level := levelString[e.Level]
+
+ h.mu.Lock()
+ defer h.mu.Unlock()
+
+ var w io.Writer
+ if e.Level > logg.LevelInfo {
+ w = h.errWriter
+ } else {
+ w = h.outWriter
+ }
+
+ var prefix string
+ for _, field := range e.Fields {
+ if field.Name == FieldNameCmd {
+ prefix = fmt.Sprint(field.Value)
+ break
+ }
+ }
+
+ if prefix != "" {
+ prefix = prefix + ": "
+ }
+
+ color.Fprintf(w, "%s %s%s", fmt.Sprintf("%*s", h.Padding+1, level), color.Sprint(prefix), e.Message)
+
+ for _, field := range e.Fields {
+ if strings.HasPrefix(field.Name, reservedFieldNamePrefix) {
+ continue
+ }
+ fmt.Fprintf(w, " %s %v", color.Sprint(field.Name), field.Value)
+ }
+
+ fmt.Fprintln(w)
+
+ return nil
+}
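
A rough sketch of how the FieldNameCmd prefix handled above is produced through the public logger API added later in this patch; the "deploy" command name is just an illustration:

package main

import (
    "github.com/gohugoio/hugo/common/loggers"
)

func main() {
    l := loggers.NewDefault() // warn level; log output goes to os.Stderr

    // WarnCommand attaches the reserved command field; the handlers above
    // render it as a "deploy:" prefix in front of the message.
    l.WarnCommand("deploy").Logf("nothing to upload")
}
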
diff --git a/common/loggers/handlersmisc.go b/common/loggers/handlersmisc.go
new file mode 100644
index 000000000..2ae6300f7
--- /dev/null
+++ b/common/loggers/handlersmisc.go
@@ -0,0 +1,145 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+// Some functions in this file (see comments) are based on the Go source code,
+// copyright The Go Authors and governed by a BSD-style license.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package loggers
+
+import (
+ "fmt"
+ "strings"
+ "sync"
+
+ "github.com/bep/logg"
+ "github.com/gohugoio/hugo/common/hashing"
+)
+
+// PanicOnWarningHook panics on warnings.
+var PanicOnWarningHook = func(e *logg.Entry) error {
+ if e.Level != logg.LevelWarn {
+ return nil
+ }
+ panic(e.Message)
+}
+
+func newLogLevelCounter() *logLevelCounter {
+ return &logLevelCounter{
+ counters: make(map[logg.Level]int),
+ }
+}
+
+func newLogOnceHandler(threshold logg.Level) *logOnceHandler {
+ return &logOnceHandler{
+ threshold: threshold,
+ seen: make(map[uint64]bool),
+ }
+}
+
+func newStopHandler(h ...logg.Handler) *stopHandler {
+ return &stopHandler{
+ handlers: h,
+ }
+}
+
+func newSuppressStatementsHandler(statements map[string]bool) *suppressStatementsHandler {
+ return &suppressStatementsHandler{
+ statements: statements,
+ }
+}
+
+type logLevelCounter struct {
+ mu sync.RWMutex
+ counters map[logg.Level]int
+}
+
+func (h *logLevelCounter) HandleLog(e *logg.Entry) error {
+ h.mu.Lock()
+ defer h.mu.Unlock()
+ h.counters[e.Level]++
+ return nil
+}
+
+var errStop = fmt.Errorf("stop")
+
+type logOnceHandler struct {
+ threshold logg.Level
+ mu sync.Mutex
+ seen map[uint64]bool
+}
+
+func (h *logOnceHandler) HandleLog(e *logg.Entry) error {
+ if e.Level < h.threshold {
+ // We typically only want to enable this for warnings and above.
+ // The common use case is that many goroutines may log the same error.
+ return nil
+ }
+ h.mu.Lock()
+ defer h.mu.Unlock()
+ hash := hashing.HashUint64(e.Level, e.Message, e.Fields)
+ if h.seen[hash] {
+ return errStop
+ }
+ h.seen[hash] = true
+ return nil
+}
+
+func (h *logOnceHandler) reset() {
+ h.mu.Lock()
+ defer h.mu.Unlock()
+ h.seen = make(map[uint64]bool)
+}
+
+type stopHandler struct {
+ handlers []logg.Handler
+}
+
+// HandleLog implements logg.Handler.
+func (h *stopHandler) HandleLog(e *logg.Entry) error {
+ for _, handler := range h.handlers {
+ if err := handler.HandleLog(e); err != nil {
+ if err == errStop {
+ return nil
+ }
+ return err
+ }
+ }
+ return nil
+}
+
+type suppressStatementsHandler struct {
+ statements map[string]bool
+}
+
+func (h *suppressStatementsHandler) HandleLog(e *logg.Entry) error {
+ for _, field := range e.Fields {
+ if field.Name == FieldNameStatementID {
+ if h.statements[field.Value.(string)] {
+ return errStop
+ }
+ }
+ }
+ return nil
+}
+
+// whiteSpaceTrimmer creates a new log handler that trims whitespace from log messages and string fields.
+func whiteSpaceTrimmer() logg.Handler {
+ return logg.HandlerFunc(func(e *logg.Entry) error {
+ e.Message = strings.TrimSpace(e.Message)
+ for i, field := range e.Fields {
+ if s, ok := field.Value.(string); ok {
+ e.Fields[i].Value = strings.TrimSpace(s)
+ }
+ }
+ return nil
+ })
+}
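
PanicOnWarningHook is the only exported piece of this file; a minimal sketch of wiring it in via Options.HandlerPost, mirroring initGlobalLogger and the logger tests further down:

package main

import (
    "github.com/bep/logg"

    "github.com/gohugoio/hugo/common/loggers"
)

func main() {
    // Turn warnings into panics, e.g. to get a stack trace pointing at the
    // code that produced an unexpected warning.
    l := loggers.New(loggers.Options{
        Level:       logg.LevelWarn,
        HandlerPost: loggers.PanicOnWarningHook,
    })

    l.Infoln("dropped: below the warning threshold")
    l.Warnln("boom") // panics with "boom"
}
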
diff --git a/common/loggers/handlerterminal.go b/common/loggers/handlerterminal.go
new file mode 100644
index 000000000..c6a86d3a2
--- /dev/null
+++ b/common/loggers/handlerterminal.go
@@ -0,0 +1,100 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+// Some functions in this file (see comments) are based on the Go source code,
+// copyright The Go Authors and governed by a BSD-style license.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package loggers
+
+import (
+ "fmt"
+ "io"
+ "regexp"
+ "strings"
+ "sync"
+
+ "github.com/bep/logg"
+)
+
+// newNoAnsiEscapeHandler creates a new noAnsiEscapeHandler.
+func newNoAnsiEscapeHandler(outWriter, errWriter io.Writer, noLevelPrefix bool, predicate func(*logg.Entry) bool) *noAnsiEscapeHandler {
+ if predicate == nil {
+ predicate = func(e *logg.Entry) bool { return true }
+ }
+ return &noAnsiEscapeHandler{
+ noLevelPrefix: noLevelPrefix,
+ outWriter: outWriter,
+ errWriter: errWriter,
+ predicate: predicate,
+ }
+}
+
+type noAnsiEscapeHandler struct {
+ mu sync.Mutex
+ outWriter io.Writer
+ errWriter io.Writer
+ predicate func(*logg.Entry) bool
+ noLevelPrefix bool
+}
+
+func (h *noAnsiEscapeHandler) HandleLog(e *logg.Entry) error {
+ if !h.predicate(e) {
+ return nil
+ }
+ h.mu.Lock()
+ defer h.mu.Unlock()
+
+ var w io.Writer
+ if e.Level > logg.LevelInfo {
+ w = h.errWriter
+ } else {
+ w = h.outWriter
+ }
+
+ var prefix string
+ for _, field := range e.Fields {
+ if field.Name == FieldNameCmd {
+ prefix = fmt.Sprint(field.Value)
+ break
+ }
+ }
+
+ if prefix != "" {
+ prefix = prefix + ": "
+ }
+
+ msg := stripANSI(e.Message)
+
+ if h.noLevelPrefix {
+ fmt.Fprintf(w, "%s%s", prefix, msg)
+ } else {
+ fmt.Fprintf(w, "%s %s%s", levelString[e.Level], prefix, msg)
+ }
+
+ for _, field := range e.Fields {
+ if strings.HasPrefix(field.Name, reservedFieldNamePrefix) {
+ continue
+ }
+ fmt.Fprintf(w, " %s %v", field.Name, field.Value)
+
+ }
+ fmt.Fprintln(w)
+
+ return nil
+}
+
+var ansiRe = regexp.MustCompile(`\x1b\[[0-9;]*m`)
+
+// stripANSI removes ANSI escape codes from s.
+func stripANSI(s string) string {
+ return ansiRe.ReplaceAllString(s, "")
+}
diff --git a/common/loggers/handlerterminal_test.go b/common/loggers/handlerterminal_test.go
new file mode 100644
index 000000000..f45ce80df
--- /dev/null
+++ b/common/loggers/handlerterminal_test.go
@@ -0,0 +1,40 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+// Some functions in this file (see comments) are based on the Go source code,
+// copyright The Go Authors and governed by a BSD-style license.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package loggers
+
+import (
+ "bytes"
+ "testing"
+
+ "github.com/bep/logg"
+ qt "github.com/frankban/quicktest"
+ "github.com/gohugoio/hugo/common/terminal"
+)
+
+func TestNoAnsiEscapeHandler(t *testing.T) {
+ c := qt.New(t)
+
+ test := func(s string) {
+ c.Assert(stripANSI(terminal.Notice(s)), qt.Equals, s)
+ }
+ test(`error in "file.md:1:2"`)
+
+ var buf bytes.Buffer
+ h := newNoAnsiEscapeHandler(&buf, &buf, false, nil)
+ h.HandleLog(&logg.Entry{Message: terminal.Notice(`error in "file.md:1:2"`), Level: logg.LevelInfo})
+
+ c.Assert(buf.String(), qt.Equals, "INFO error in \"file.md:1:2\"\n")
+}
diff --git a/common/loggers/logger.go b/common/loggers/logger.go
new file mode 100644
index 000000000..a013049f7
--- /dev/null
+++ b/common/loggers/logger.go
@@ -0,0 +1,385 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+// Some functions in this file (see comments) are based on the Go source code,
+// copyright The Go Authors and governed by a BSD-style license.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package loggers
+
+import (
+ "fmt"
+ "io"
+ "os"
+ "strings"
+ "time"
+
+ "github.com/bep/logg"
+ "github.com/bep/logg/handlers/multi"
+ "github.com/gohugoio/hugo/common/terminal"
+)
+
+var (
+ reservedFieldNamePrefix = "__h_field_"
+ // FieldNameCmd is the name of the field that holds the command name.
+ FieldNameCmd = reservedFieldNamePrefix + "_cmd"
+ // Used to suppress statements.
+ FieldNameStatementID = reservedFieldNamePrefix + "__h_field_statement_id"
+)
+
+// Options defines options for the logger.
+type Options struct {
+ Level logg.Level
+ StdOut io.Writer
+ StdErr io.Writer
+ DistinctLevel logg.Level
+ StoreErrors bool
+ HandlerPost func(e *logg.Entry) error
+ SuppressStatements map[string]bool
+}
+
+// New creates a new logger with the given options.
+func New(opts Options) Logger {
+ if opts.StdOut == nil {
+ opts.StdOut = os.Stdout
+ }
+ if opts.StdErr == nil {
+ opts.StdErr = os.Stderr
+ }
+
+ if opts.Level == 0 {
+ opts.Level = logg.LevelWarn
+ }
+
+ var logHandler logg.Handler
+ if terminal.PrintANSIColors(os.Stderr) {
+ logHandler = newDefaultHandler(opts.StdErr, opts.StdErr)
+ } else {
+ logHandler = newNoAnsiEscapeHandler(opts.StdErr, opts.StdErr, false, nil)
+ }
+
+ errorsw := &strings.Builder{}
+ logCounters := newLogLevelCounter()
+ handlers := []logg.Handler{
+ logCounters,
+ }
+
+ if opts.Level == logg.LevelTrace {
+ // Trace is used during development only, and it's useful to
+ // only see the trace messages.
+ handlers = append(handlers,
+ logg.HandlerFunc(func(e *logg.Entry) error {
+ if e.Level != logg.LevelTrace {
+ return logg.ErrStopLogEntry
+ }
+ return nil
+ }),
+ )
+ }
+
+ handlers = append(handlers, whiteSpaceTrimmer(), logHandler)
+
+ if opts.HandlerPost != nil {
+ var hookHandler logg.HandlerFunc = func(e *logg.Entry) error {
+ opts.HandlerPost(e)
+ return nil
+ }
+ handlers = append(handlers, hookHandler)
+ }
+
+ if opts.StoreErrors {
+ h := newNoAnsiEscapeHandler(io.Discard, errorsw, true, func(e *logg.Entry) bool {
+ return e.Level >= logg.LevelError
+ })
+
+ handlers = append(handlers, h)
+ }
+
+ logHandler = multi.New(handlers...)
+
+ var logOnce *logOnceHandler
+ if opts.DistinctLevel != 0 {
+ logOnce = newLogOnceHandler(opts.DistinctLevel)
+ logHandler = newStopHandler(logOnce, logHandler)
+ }
+
+ if len(opts.SuppressStatements) > 0 {
+ logHandler = newStopHandler(newSuppressStatementsHandler(opts.SuppressStatements), logHandler)
+ }
+
+ logger := logg.New(
+ logg.Options{
+ Level: opts.Level,
+ Handler: logHandler,
+ },
+ )
+
+ l := logger.WithLevel(opts.Level)
+
+ reset := func() {
+ logCounters.mu.Lock()
+ defer logCounters.mu.Unlock()
+ logCounters.counters = make(map[logg.Level]int)
+ errorsw.Reset()
+ if logOnce != nil {
+ logOnce.reset()
+ }
+ }
+
+ return &logAdapter{
+ logCounters: logCounters,
+ errors: errorsw,
+ reset: reset,
+ stdOut: opts.StdOut,
+ stdErr: opts.StdErr,
+ level: opts.Level,
+ logger: logger,
+ tracel: l.WithLevel(logg.LevelTrace),
+ debugl: l.WithLevel(logg.LevelDebug),
+ infol: l.WithLevel(logg.LevelInfo),
+ warnl: l.WithLevel(logg.LevelWarn),
+ errorl: l.WithLevel(logg.LevelError),
+ }
+}
+
+// NewDefault creates a new logger with the default options.
+func NewDefault() Logger {
+ opts := Options{
+ DistinctLevel: logg.LevelWarn,
+ Level: logg.LevelWarn,
+ }
+ return New(opts)
+}
+
+func NewTrace() Logger {
+ opts := Options{
+ DistinctLevel: logg.LevelWarn,
+ Level: logg.LevelTrace,
+ }
+ return New(opts)
+}
+
+func LevelLoggerToWriter(l logg.LevelLogger) io.Writer {
+ return logWriter{l: l}
+}
+
+type Logger interface {
+ Debug() logg.LevelLogger
+ Debugf(format string, v ...any)
+ Debugln(v ...any)
+ Error() logg.LevelLogger
+ Errorf(format string, v ...any)
+ Erroridf(id, format string, v ...any)
+ Errorln(v ...any)
+ Errors() string
+ Info() logg.LevelLogger
+ InfoCommand(command string) logg.LevelLogger
+ Infof(format string, v ...any)
+ Infoln(v ...any)
+ Level() logg.Level
+ LoggCount(logg.Level) int
+ Logger() logg.Logger
+ StdOut() io.Writer
+ StdErr() io.Writer
+ Printf(format string, v ...any)
+ Println(v ...any)
+ PrintTimerIfDelayed(start time.Time, name string)
+ Reset()
+ Warn() logg.LevelLogger
+ WarnCommand(command string) logg.LevelLogger
+ Warnf(format string, v ...any)
+ Warnidf(id, format string, v ...any)
+ Warnln(v ...any)
+ Deprecatef(fail bool, format string, v ...any)
+ Trace(s logg.StringFunc)
+}
+
+type logAdapter struct {
+ logCounters *logLevelCounter
+ errors *strings.Builder
+ reset func()
+ stdOut io.Writer
+ stdErr io.Writer
+ level logg.Level
+ logger logg.Logger
+ tracel logg.LevelLogger
+ debugl logg.LevelLogger
+ infol logg.LevelLogger
+ warnl logg.LevelLogger
+ errorl logg.LevelLogger
+}
+
+func (l *logAdapter) Debug() logg.LevelLogger {
+ return l.debugl
+}
+
+func (l *logAdapter) Debugf(format string, v ...any) {
+ l.debugl.Logf(format, v...)
+}
+
+func (l *logAdapter) Debugln(v ...any) {
+ l.debugl.Logf(l.sprint(v...))
+}
+
+func (l *logAdapter) Info() logg.LevelLogger {
+ return l.infol
+}
+
+func (l *logAdapter) InfoCommand(command string) logg.LevelLogger {
+ return l.infol.WithField(FieldNameCmd, command)
+}
+
+func (l *logAdapter) Infof(format string, v ...any) {
+ l.infol.Logf(format, v...)
+}
+
+func (l *logAdapter) Infoln(v ...any) {
+ l.infol.Logf(l.sprint(v...))
+}
+
+func (l *logAdapter) Level() logg.Level {
+ return l.level
+}
+
+func (l *logAdapter) LoggCount(level logg.Level) int {
+ l.logCounters.mu.RLock()
+ defer l.logCounters.mu.RUnlock()
+ return l.logCounters.counters[level]
+}
+
+func (l *logAdapter) Logger() logg.Logger {
+ return l.logger
+}
+
+func (l *logAdapter) StdOut() io.Writer {
+ return l.stdOut
+}
+
+func (l *logAdapter) StdErr() io.Writer {
+ return l.stdErr
+}
+
+// PrintTimerIfDelayed prints a timing statement to stderr
+// if considerable time is spent.
+func (l *logAdapter) PrintTimerIfDelayed(start time.Time, name string) {
+ elapsed := time.Since(start)
+ milli := int(1000 * elapsed.Seconds())
+ if milli < 500 {
+ return
+ }
+ fmt.Fprintf(l.stdErr, "%s in %v ms", name, milli)
+}
+
+func (l *logAdapter) Printf(format string, v ...any) {
+ // Add trailing newline if not present.
+ if !strings.HasSuffix(format, "\n") {
+ format += "\n"
+ }
+ fmt.Fprintf(l.stdOut, format, v...)
+}
+
+func (l *logAdapter) Println(v ...any) {
+ fmt.Fprintln(l.stdOut, v...)
+}
+
+func (l *logAdapter) Reset() {
+ l.reset()
+}
+
+func (l *logAdapter) Warn() logg.LevelLogger {
+ return l.warnl
+}
+
+func (l *logAdapter) Warnf(format string, v ...any) {
+ l.warnl.Logf(format, v...)
+}
+
+func (l *logAdapter) WarnCommand(command string) logg.LevelLogger {
+ return l.warnl.WithField(FieldNameCmd, command)
+}
+
+func (l *logAdapter) Warnln(v ...any) {
+ l.warnl.Logf(l.sprint(v...))
+}
+
+func (l *logAdapter) Error() logg.LevelLogger {
+ return l.errorl
+}
+
+func (l *logAdapter) Errorf(format string, v ...any) {
+ l.errorl.Logf(format, v...)
+}
+
+func (l *logAdapter) Errorln(v ...any) {
+ l.errorl.Logf(l.sprint(v...))
+}
+
+func (l *logAdapter) Errors() string {
+ return l.errors.String()
+}
+
+func (l *logAdapter) Erroridf(id, format string, v ...any) {
+ id = strings.ToLower(id)
+ format += l.idfInfoStatement("error", id, format)
+ l.errorl.WithField(FieldNameStatementID, id).Logf(format, v...)
+}
+
+func (l *logAdapter) Warnidf(id, format string, v ...any) {
+ id = strings.ToLower(id)
+ format += l.idfInfoStatement("warning", id, format)
+ l.warnl.WithField(FieldNameStatementID, id).Logf(format, v...)
+}
+
+func (l *logAdapter) idfInfoStatement(what, id, format string) string {
+ return fmt.Sprintf("\nYou can suppress this %s by adding the following to your site configuration:\nignoreLogs = ['%s']", what, id)
+}
+
+func (l *logAdapter) Trace(s logg.StringFunc) {
+ l.tracel.Log(s)
+}
+
+func (l *logAdapter) sprint(v ...any) string {
+ return strings.TrimRight(fmt.Sprintln(v...), "\n")
+}
+
+func (l *logAdapter) Deprecatef(fail bool, format string, v ...any) {
+ format = "DEPRECATED: " + format
+ if fail {
+ l.errorl.Logf(format, v...)
+ } else {
+ l.warnl.Logf(format, v...)
+ }
+}
+
+type logWriter struct {
+ l logg.LevelLogger
+}
+
+func (w logWriter) Write(p []byte) (n int, err error) {
+ w.l.Log(logg.String(string(p)))
+ return len(p), nil
+}
+
+func TimeTrackf(l logg.LevelLogger, start time.Time, fields logg.Fields, format string, a ...any) {
+ elapsed := time.Since(start)
+ if fields != nil {
+ l = l.WithFields(fields)
+ }
+ l.WithField("duration", elapsed).Logf(format, a...)
+}
+
+func TimeTrackfn(fn func() (logg.LevelLogger, error)) error {
+ start := time.Now()
+ l, err := fn()
+ elapsed := time.Since(start)
+ l.WithField("duration", elapsed).Logf("")
+ return err
+}
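
A sketch of the Warnidf/SuppressStatements pairing defined above (the statement IDs are illustrative); suppressed IDs are dropped before they are printed or counted, as the test file below exercises:

package main

import (
    "os"

    "github.com/gohugoio/hugo/common/loggers"
)

func main() {
    l := loggers.New(loggers.Options{
        StdOut:             os.Stdout,
        StdErr:             os.Stderr,
        SuppressStatements: map[string]bool{"deprecated-config": true},
    })

    // Dropped by the suppress handler: nothing is printed or counted.
    l.Warnidf("deprecated-config", "config key %q is no longer used", "someKey")

    // Printed, followed by the ignoreLogs suppression hint.
    l.Warnidf("another-warning", "config key %q is no longer used", "someKey")
}
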
diff --git a/common/loggers/logger_test.go b/common/loggers/logger_test.go
new file mode 100644
index 000000000..bc8975b06
--- /dev/null
+++ b/common/loggers/logger_test.go
@@ -0,0 +1,154 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+// Some functions in this file (see comments) are based on the Go source code,
+// copyright The Go Authors and governed by a BSD-style license.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package loggers_test
+
+import (
+ "io"
+ "strings"
+ "testing"
+
+ "github.com/bep/logg"
+ qt "github.com/frankban/quicktest"
+ "github.com/gohugoio/hugo/common/loggers"
+)
+
+func TestLogDistinct(t *testing.T) {
+ c := qt.New(t)
+
+ opts := loggers.Options{
+ DistinctLevel: logg.LevelWarn,
+ StoreErrors: true,
+ StdOut: io.Discard,
+ StdErr: io.Discard,
+ }
+
+ l := loggers.New(opts)
+
+ for range 10 {
+ l.Errorln("error 1")
+ l.Errorln("error 2")
+ l.Warnln("warn 1")
+ }
+ c.Assert(strings.Count(l.Errors(), "error 1"), qt.Equals, 1)
+ c.Assert(l.LoggCount(logg.LevelError), qt.Equals, 2)
+ c.Assert(l.LoggCount(logg.LevelWarn), qt.Equals, 1)
+}
+
+func TestHookLast(t *testing.T) {
+ c := qt.New(t)
+
+ opts := loggers.Options{
+ HandlerPost: func(e *logg.Entry) error {
+ panic(e.Message)
+ },
+ StdOut: io.Discard,
+ StdErr: io.Discard,
+ }
+
+ l := loggers.New(opts)
+
+ c.Assert(func() { l.Warnln("warn 1") }, qt.PanicMatches, "warn 1")
+}
+
+func TestOptionStoreErrors(t *testing.T) {
+ c := qt.New(t)
+
+ var sb strings.Builder
+
+ opts := loggers.Options{
+ StoreErrors: true,
+ StdErr: &sb,
+ StdOut: &sb,
+ }
+
+ l := loggers.New(opts)
+ l.Errorln("error 1")
+ l.Errorln("error 2")
+
+ errorsStr := l.Errors()
+
+ c.Assert(errorsStr, qt.Contains, "error 1")
+ c.Assert(errorsStr, qt.Not(qt.Contains), "ERROR")
+
+ c.Assert(sb.String(), qt.Contains, "error 1")
+ c.Assert(sb.String(), qt.Contains, "ERROR")
+}
+
+func TestLogCount(t *testing.T) {
+ c := qt.New(t)
+
+ opts := loggers.Options{
+ StoreErrors: true,
+ }
+
+ l := loggers.New(opts)
+ l.Errorln("error 1")
+ l.Errorln("error 2")
+ l.Warnln("warn 1")
+
+ c.Assert(l.LoggCount(logg.LevelError), qt.Equals, 2)
+ c.Assert(l.LoggCount(logg.LevelWarn), qt.Equals, 1)
+ c.Assert(l.LoggCount(logg.LevelInfo), qt.Equals, 0)
+}
+
+func TestSuppressStatements(t *testing.T) {
+ c := qt.New(t)
+
+ opts := loggers.Options{
+ StoreErrors: true,
+ SuppressStatements: map[string]bool{
+ "error-1": true,
+ },
+ }
+
+ l := loggers.New(opts)
+ l.Error().WithField(loggers.FieldNameStatementID, "error-1").Logf("error 1")
+ l.Errorln("error 2")
+
+ errorsStr := l.Errors()
+
+ c.Assert(errorsStr, qt.Not(qt.Contains), "error 1")
+ c.Assert(errorsStr, qt.Contains, "error 2")
+ c.Assert(l.LoggCount(logg.LevelError), qt.Equals, 1)
+}
+
+func TestReset(t *testing.T) {
+ c := qt.New(t)
+
+ opts := loggers.Options{
+ StoreErrors: true,
+ DistinctLevel: logg.LevelWarn,
+ StdOut: io.Discard,
+ StdErr: io.Discard,
+ }
+
+ l := loggers.New(opts)
+
+ for range 3 {
+ l.Errorln("error 1")
+ l.Errorln("error 2")
+ l.Errorln("error 1")
+ c.Assert(l.LoggCount(logg.LevelError), qt.Equals, 2)
+
+ l.Reset()
+
+ errorsStr := l.Errors()
+
+ c.Assert(errorsStr, qt.Equals, "")
+ c.Assert(l.LoggCount(logg.LevelError), qt.Equals, 0)
+
+ }
+}
diff --git a/common/loggers/loggerglobal.go b/common/loggers/loggerglobal.go
new file mode 100644
index 000000000..b8c9a6931
--- /dev/null
+++ b/common/loggers/loggerglobal.go
@@ -0,0 +1,62 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+// Some functions in this file (see comments) are based on the Go source code,
+// copyright The Go Authors and governed by a BSD-style license.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package loggers
+
+import (
+ "sync"
+
+ "github.com/bep/logg"
+)
+
+// SetGlobalLogger sets the global logger.
+// This is used in a few places in Hugo, e.g. deprecated functions.
+func SetGlobalLogger(logger Logger) {
+ logMu.Lock()
+ defer logMu.Unlock()
+ log = logger
+}
+
+func initGlobalLogger(level logg.Level, panicOnWarnings bool) {
+ logMu.Lock()
+ defer logMu.Unlock()
+ var logHookLast func(e *logg.Entry) error
+ if panicOnWarnings {
+ logHookLast = PanicOnWarningHook
+ }
+
+ log = New(
+ Options{
+ Level: level,
+ DistinctLevel: logg.LevelInfo,
+ HandlerPost: logHookLast,
+ },
+ )
+}
+
+var logMu sync.Mutex
+
+func Log() Logger {
+ logMu.Lock()
+ defer logMu.Unlock()
+ return log
+}
+
+// The global logger.
+var log Logger
+
+func init() {
+ initGlobalLogger(logg.LevelWarn, false)
+}
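
A sketch of how the global logger above ties into the deprecation helpers in common/hugo/hugo.go; the item and alternative strings are made up for illustration:

package main

import (
    "github.com/bep/logg"

    "github.com/gohugoio/hugo/common/hugo"
    "github.com/gohugoio/hugo/common/loggers"
)

func main() {
    // Swap the package-global logger, e.g. to surface INFO-level deprecations
    // that the default warn-level global logger would drop.
    loggers.SetGlobalLogger(loggers.New(loggers.Options{Level: logg.LevelInfo}))

    // hugo.Deprecate logs through loggers.Log(); the level escalates from
    // info to warning to error as the deprecating version ages relative to
    // CurrentVersion (see deprecationLogLevelFromVersion).
    hugo.Deprecate("someTemplateFunc", "Use newTemplateFunc instead.", "0.147.0")
}
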
diff --git a/common/loggers/loggers.go b/common/loggers/loggers.go
deleted file mode 100644
index 082fd1487..000000000
--- a/common/loggers/loggers.go
+++ /dev/null
@@ -1,176 +0,0 @@
-// Copyright 2018 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package loggers
-
-import (
- "bytes"
- "io"
- "io/ioutil"
- "log"
- "os"
- "regexp"
-
- "github.com/gohugoio/hugo/common/terminal"
-
- jww "github.com/spf13/jwalterweatherman"
-)
-
-var (
- // Counts ERROR logs to the global jww logger.
- GlobalErrorCounter *jww.Counter
-)
-
-func init() {
- GlobalErrorCounter = &jww.Counter{}
- jww.SetLogListeners(jww.LogCounter(GlobalErrorCounter, jww.LevelError))
-}
-
-// Logger wraps a *loggers.Logger and some other related logging state.
-type Logger struct {
- *jww.Notepad
- ErrorCounter *jww.Counter
- WarnCounter *jww.Counter
-
- // This is only set in server mode.
- errors *bytes.Buffer
-}
-
-func (l *Logger) Errors() string {
- if l.errors == nil {
- return ""
- }
- return ansiColorRe.ReplaceAllString(l.errors.String(), "")
-}
-
-// Reset resets the logger's internal state.
-func (l *Logger) Reset() {
- l.ErrorCounter.Reset()
- if l.errors != nil {
- l.errors.Reset()
- }
-}
-
-// NewLogger creates a new Logger for the given thresholds
-func NewLogger(stdoutThreshold, logThreshold jww.Threshold, outHandle, logHandle io.Writer, saveErrors bool) *Logger {
- return newLogger(stdoutThreshold, logThreshold, outHandle, logHandle, saveErrors)
-}
-
-// NewDebugLogger is a convenience function to create a debug logger.
-func NewDebugLogger() *Logger {
- return newBasicLogger(jww.LevelDebug)
-}
-
-// NewWarningLogger is a convenience function to create a warning logger.
-func NewWarningLogger() *Logger {
- return newBasicLogger(jww.LevelWarn)
-}
-
-// NewErrorLogger is a convenience function to create an error logger.
-func NewErrorLogger() *Logger {
- return newBasicLogger(jww.LevelError)
-}
-
-var (
- ansiColorRe = regexp.MustCompile("(?s)\\033\\[\\d*(;\\d*)*m")
- errorRe = regexp.MustCompile("^(ERROR|FATAL|WARN)")
-)
-
-type ansiCleaner struct {
- w io.Writer
-}
-
-func (a ansiCleaner) Write(p []byte) (n int, err error) {
- return a.w.Write(ansiColorRe.ReplaceAll(p, []byte("")))
-}
-
-type labelColorizer struct {
- w io.Writer
-}
-
-func (a labelColorizer) Write(p []byte) (n int, err error) {
- replaced := errorRe.ReplaceAllStringFunc(string(p), func(m string) string {
- switch m {
- case "ERROR", "FATAL":
- return terminal.Error(m)
- case "WARN":
- return terminal.Warning(m)
- default:
- return m
- }
- })
- // io.MultiWriter will abort if we return a bigger write count than input
- // bytes, so we lie a little.
- _, err = a.w.Write([]byte(replaced))
- return len(p), err
-
-}
-
-// InitGlobalLogger initializes the global logger, used in some rare cases.
-func InitGlobalLogger(stdoutThreshold, logThreshold jww.Threshold, outHandle, logHandle io.Writer) {
- outHandle, logHandle = getLogWriters(outHandle, logHandle)
-
- jww.SetStdoutOutput(outHandle)
- jww.SetLogOutput(logHandle)
- jww.SetLogThreshold(logThreshold)
- jww.SetStdoutThreshold(stdoutThreshold)
-
-}
-
-func getLogWriters(outHandle, logHandle io.Writer) (io.Writer, io.Writer) {
- isTerm := terminal.IsTerminal(os.Stdout)
- if logHandle != ioutil.Discard && isTerm {
- // Remove any Ansi coloring from log output
- logHandle = ansiCleaner{w: logHandle}
- }
-
- if isTerm {
- outHandle = labelColorizer{w: outHandle}
- }
-
- return outHandle, logHandle
-
-}
-
-func newLogger(stdoutThreshold, logThreshold jww.Threshold, outHandle, logHandle io.Writer, saveErrors bool) *Logger {
- errorCounter := &jww.Counter{}
- warnCounter := &jww.Counter{}
- outHandle, logHandle = getLogWriters(outHandle, logHandle)
-
- listeners := []jww.LogListener{jww.LogCounter(errorCounter, jww.LevelError), jww.LogCounter(warnCounter, jww.LevelWarn)}
- var errorBuff *bytes.Buffer
- if saveErrors {
- errorBuff = new(bytes.Buffer)
- errorCapture := func(t jww.Threshold) io.Writer {
- if t != jww.LevelError {
- // Only interested in ERROR
- return nil
- }
-
- return errorBuff
- }
-
- listeners = append(listeners, errorCapture)
- }
-
- return &Logger{
- Notepad: jww.NewNotepad(stdoutThreshold, logThreshold, outHandle, logHandle, "", log.Ldate|log.Ltime, listeners...),
- ErrorCounter: errorCounter,
- WarnCounter: warnCounter,
- errors: errorBuff,
- }
-}
-
-func newBasicLogger(t jww.Threshold) *Logger {
- return newLogger(t, jww.LevelError, os.Stdout, ioutil.Discard, false)
-}
diff --git a/common/loggers/loggers_test.go b/common/loggers/loggers_test.go
deleted file mode 100644
index f572ba170..000000000
--- a/common/loggers/loggers_test.go
+++ /dev/null
@@ -1,32 +0,0 @@
-// Copyright 2018 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package loggers
-
-import (
- "testing"
-
- qt "github.com/frankban/quicktest"
-)
-
-func TestLogger(t *testing.T) {
- c := qt.New(t)
- l := NewWarningLogger()
-
- l.ERROR.Println("One error")
- l.ERROR.Println("Two error")
- l.WARN.Println("A warning")
-
- c.Assert(l.ErrorCounter.Count(), qt.Equals, uint64(2))
-
-}
diff --git a/common/maps/cache.go b/common/maps/cache.go
new file mode 100644
index 000000000..de1535994
--- /dev/null
+++ b/common/maps/cache.go
@@ -0,0 +1,195 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package maps
+
+import (
+ "sync"
+)
+
+// Cache is a simple thread safe cache backed by a map.
+type Cache[K comparable, T any] struct {
+ m map[K]T
+ hasBeenInitialized bool
+ sync.RWMutex
+}
+
+// NewCache creates a new Cache.
+func NewCache[K comparable, T any]() *Cache[K, T] {
+ return &Cache[K, T]{m: make(map[K]T)}
+}
+
+// Get returns the value for the given key and whether it was found.
+// If c is nil, it returns the zero value and false.
+func (c *Cache[K, T]) Get(key K) (T, bool) {
+ if c == nil {
+ var zero T
+ return zero, false
+ }
+ c.RLock()
+ v, found := c.get(key)
+ c.RUnlock()
+ return v, found
+}
+
+func (c *Cache[K, T]) get(key K) (T, bool) {
+ v, found := c.m[key]
+ return v, found
+}
+
+// GetOrCreate gets the value for the given key if it exists, or creates it if not.
+func (c *Cache[K, T]) GetOrCreate(key K, create func() (T, error)) (T, error) {
+ c.RLock()
+ v, found := c.m[key]
+ c.RUnlock()
+ if found {
+ return v, nil
+ }
+ c.Lock()
+ defer c.Unlock()
+ v, found = c.m[key]
+ if found {
+ return v, nil
+ }
+ v, err := create()
+ if err != nil {
+ return v, err
+ }
+ c.m[key] = v
+ return v, nil
+}
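+
+// Illustrative usage, a minimal sketch using only the API defined above:
+//
+//	c := NewCache[string, int]()
+//	v, err := c.GetOrCreate("answer", func() (int, error) { return 42, nil })
+//	// v == 42, err == nil; later calls return the cached value without
+//	// invoking the create function again.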
+
+// Contains returns whether the given key exists in the cache.
+func (c *Cache[K, T]) Contains(key K) bool {
+ c.RLock()
+ _, found := c.m[key]
+ c.RUnlock()
+ return found
+}
+
+// InitAndGet initializes the cache if not already done and returns the value for the given key.
+// The init state will be reset on Reset or Drain.
+func (c *Cache[K, T]) InitAndGet(key K, init func(get func(key K) (T, bool), set func(key K, value T)) error) (T, error) {
+ var v T
+ c.RLock()
+ if !c.hasBeenInitialized {
+ c.RUnlock()
+ if err := func() error {
+ c.Lock()
+ defer c.Unlock()
+ // Double check in case another goroutine has initialized it in the meantime.
+ if !c.hasBeenInitialized {
+ err := init(c.get, c.set)
+ if err != nil {
+ return err
+ }
+ c.hasBeenInitialized = true
+ }
+ return nil
+ }(); err != nil {
+ return v, err
+ }
+ // Reacquire the read lock.
+ c.RLock()
+ }
+
+ v = c.m[key]
+ c.RUnlock()
+
+ return v, nil
+}
+
+// Set sets the given key to the given value.
+func (c *Cache[K, T]) Set(key K, value T) {
+ c.Lock()
+ c.set(key, value)
+ c.Unlock()
+}
+
+// SetIfAbsent sets the given key to the given value if the key does not already exist in the cache.
+func (c *Cache[K, T]) SetIfAbsent(key K, value T) {
+ c.RLock()
+ if _, found := c.get(key); !found {
+ c.RUnlock()
+ c.Set(key, value)
+ } else {
+ c.RUnlock()
+ }
+}
+
+func (c *Cache[K, T]) set(key K, value T) {
+ c.m[key] = value
+}
+
+// ForEeach calls the given function for each key/value pair in the cache.
+// If the function returns false, the iteration stops.
+func (c *Cache[K, T]) ForEeach(f func(K, T) bool) {
+ c.RLock()
+ defer c.RUnlock()
+ for k, v := range c.m {
+ if !f(k, v) {
+ return
+ }
+ }
+}
+
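+// Drain returns the backing map and replaces it with a new, empty one,
+// resetting the init state used by InitAndGet.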
+func (c *Cache[K, T]) Drain() map[K]T {
+ c.Lock()
+ m := c.m
+ c.m = make(map[K]T)
+ c.hasBeenInitialized = false
+ c.Unlock()
+ return m
+}
+
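+// Len returns the number of items in the cache.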
+func (c *Cache[K, T]) Len() int {
+ c.RLock()
+ defer c.RUnlock()
+ return len(c.m)
+}
+
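+// Reset clears the cache and resets the init state used by InitAndGet.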
+func (c *Cache[K, T]) Reset() {
+ c.Lock()
+ clear(c.m)
+ c.hasBeenInitialized = false
+ c.Unlock()
+}
+
+// SliceCache is a simple thread safe cache backed by a map.
+type SliceCache[T any] struct {
+ m map[string][]T
+ sync.RWMutex
+}
+
+func NewSliceCache[T any]() *SliceCache[T] {
+ return &SliceCache[T]{m: make(map[string][]T)}
+}
+
+func (c *SliceCache[T]) Get(key string) ([]T, bool) {
+ c.RLock()
+ v, found := c.m[key]
+ c.RUnlock()
+ return v, found
+}
+
+func (c *SliceCache[T]) Append(key string, values ...T) {
+ c.Lock()
+ c.m[key] = append(c.m[key], values...)
+ c.Unlock()
+}
+
+func (c *SliceCache[T]) Reset() {
+ c.Lock()
+ c.m = make(map[string][]T)
+ c.Unlock()
+}
diff --git a/common/maps/maps.go b/common/maps/maps.go
index e0d4f964d..f9171ebf2 100644
--- a/common/maps/maps.go
+++ b/common/maps/maps.go
@@ -14,34 +14,131 @@
package maps
import (
+ "fmt"
"strings"
- "github.com/gobwas/glob"
+ "github.com/gohugoio/hugo/common/types"
+ "github.com/gobwas/glob"
"github.com/spf13/cast"
)
-// ToLower makes all the keys in the given map lower cased and will do so
-// recursively.
-// Notes:
-// * This will modify the map given.
-// * Any nested map[interface{}]interface{} will be converted to map[string]interface{}.
-func ToLower(m map[string]interface{}) {
+// ToStringMapE converts in to map[string]interface{}.
+func ToStringMapE(in any) (map[string]any, error) {
+ switch vv := in.(type) {
+ case Params:
+ return vv, nil
+ case map[string]string:
+ m := map[string]any{}
+ for k, v := range vv {
+ m[k] = v
+ }
+ return m, nil
+
+ default:
+ return cast.ToStringMapE(in)
+ }
+}
+
+// ToParamsAndPrepare converts in to Params and prepares it for use.
+// If in is nil, an empty map is returned.
+// See PrepareParams.
+func ToParamsAndPrepare(in any) (Params, error) {
+ if types.IsNil(in) {
+ return Params{}, nil
+ }
+ m, err := ToStringMapE(in)
+ if err != nil {
+ return nil, err
+ }
+ PrepareParams(m)
+ return m, nil
+}
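+
+// A minimal usage sketch, based only on the API defined above:
+//
+//	p, err := ToParamsAndPrepare(map[string]any{"Title": "Hello"})
+//	// err == nil and p == Params{"title": "Hello"}; all keys are lower cased.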
+
+// MustToParamsAndPrepare calls ToParamsAndPrepare and panics if it fails.
+func MustToParamsAndPrepare(in any) Params {
+ p, err := ToParamsAndPrepare(in)
+ if err != nil {
+ panic(fmt.Sprintf("cannot convert %T to maps.Params: %s", in, err))
+ }
+ return p
+}
+
+// ToStringMap converts in to map[string]interface{}.
+func ToStringMap(in any) map[string]any {
+ m, _ := ToStringMapE(in)
+ return m
+}
+
+// ToStringMapStringE converts in to map[string]string.
+func ToStringMapStringE(in any) (map[string]string, error) {
+ m, err := ToStringMapE(in)
+ if err != nil {
+ return nil, err
+ }
+ return cast.ToStringMapStringE(m)
+}
+
+// ToStringMapString converts in to map[string]string.
+func ToStringMapString(in any) map[string]string {
+ m, _ := ToStringMapStringE(in)
+ return m
+}
+
+// ToStringMapBool converts in to map[string]bool.
+func ToStringMapBool(in any) map[string]bool {
+ m, _ := ToStringMapE(in)
+ return cast.ToStringMapBool(m)
+}
+
+// ToSliceStringMap converts in to []map[string]interface{}.
+func ToSliceStringMap(in any) ([]map[string]any, error) {
+ switch v := in.(type) {
+ case []map[string]any:
+ return v, nil
+ case Params:
+ return []map[string]any{v}, nil
+ case []any:
+ var s []map[string]any
+ for _, entry := range v {
+ if vv, ok := entry.(map[string]any); ok {
+ s = append(s, vv)
+ }
+ }
+ return s, nil
+ default:
+ return nil, fmt.Errorf("unable to cast %#v of type %T to []map[string]interface{}", in, in)
+ }
+}
+
+// LookupEqualFold finds key in m with case insensitive equality checks.
+func LookupEqualFold[T any | string](m map[string]T, key string) (T, string, bool) {
+ if v, found := m[key]; found {
+ return v, key, true
+ }
for k, v := range m {
- switch v.(type) {
- case map[interface{}]interface{}:
- v = cast.ToStringMap(v)
- ToLower(v.(map[string]interface{}))
- case map[string]interface{}:
- ToLower(v.(map[string]interface{}))
+ if strings.EqualFold(k, key) {
+ return v, k, true
}
+ }
+ var s T
+ return s, "", false
+}
- lKey := strings.ToLower(k)
- if k != lKey {
- delete(m, k)
- m[lKey] = v
+// MergeShallow merges src into dst, but only for keys that do not already exist in dst.
+// The keys are compared case insensitively.
+func MergeShallow(dst, src map[string]any) {
+ for k, v := range src {
+ found := false
+ for dk := range dst {
+ if strings.EqualFold(dk, k) {
+ found = true
+ break
+ }
+ }
+ if !found {
+ dst[k] = v
}
-
}
}
@@ -82,7 +179,7 @@ func (r KeyRenamer) getNewKey(keyPath string) string {
// Rename renames the keys in the given map according
// to the patterns in the current KeyRenamer.
-func (r KeyRenamer) Rename(m map[string]interface{}) {
+func (r KeyRenamer) Rename(m map[string]any) {
r.renamePath("", m)
}
@@ -90,27 +187,50 @@ func (KeyRenamer) keyPath(k1, k2 string) string {
k1, k2 = strings.ToLower(k1), strings.ToLower(k2)
if k1 == "" {
return k2
- } else {
- return k1 + "/" + k2
}
+ return k1 + "/" + k2
}
-func (r KeyRenamer) renamePath(parentKeyPath string, m map[string]interface{}) {
- for key, val := range m {
- keyPath := r.keyPath(parentKeyPath, key)
- switch val.(type) {
- case map[interface{}]interface{}:
- val = cast.ToStringMap(val)
- r.renamePath(keyPath, val.(map[string]interface{}))
- case map[string]interface{}:
- r.renamePath(keyPath, val.(map[string]interface{}))
+func (r KeyRenamer) renamePath(parentKeyPath string, m map[string]any) {
+ for k, v := range m {
+ keyPath := r.keyPath(parentKeyPath, k)
+ switch vv := v.(type) {
+ case map[any]any:
+ r.renamePath(keyPath, cast.ToStringMap(vv))
+ case map[string]any:
+ r.renamePath(keyPath, vv)
}
newKey := r.getNewKey(keyPath)
if newKey != "" {
- delete(m, key)
- m[newKey] = val
+ delete(m, k)
+ m[newKey] = v
+ }
+ }
+}
+
+// ConvertFloat64WithNoDecimalsToInt converts float64 values with no decimals to int64, recursively.
+func ConvertFloat64WithNoDecimalsToInt(m map[string]any) {
+ for k, v := range m {
+ switch vv := v.(type) {
+ case float64:
+ if v == float64(int64(vv)) {
+ m[k] = int64(vv)
+ }
+ case map[string]any:
+ ConvertFloat64WithNoDecimalsToInt(vv)
+ case []any:
+ for i, vvv := range vv {
+ switch vvvv := vvv.(type) {
+ case float64:
+ if vvv == float64(int64(vvvv)) {
+ vv[i] = int64(vvvv)
+ }
+ case map[string]any:
+ ConvertFloat64WithNoDecimalsToInt(vvvv)
+ }
+ }
}
}
}
diff --git a/common/maps/maps_get.go b/common/maps/maps_get.go
deleted file mode 100644
index 9289991ae..000000000
--- a/common/maps/maps_get.go
+++ /dev/null
@@ -1,31 +0,0 @@
-// Copyright 2019 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package maps
-
-import (
- "github.com/spf13/cast"
-)
-
-// GetString tries to get a value with key from map m and convert it to a string.
-// It will return an empty string if not found or if it cannot be convertd to a string.
-func GetString(m map[string]interface{}, key string) string {
- if m == nil {
- return ""
- }
- v, found := m[key]
- if !found {
- return ""
- }
- return cast.ToString(v)
-}
diff --git a/common/maps/maps_test.go b/common/maps/maps_test.go
index 8b0aa5eb9..40c8ac824 100644
--- a/common/maps/maps_test.go
+++ b/common/maps/maps_test.go
@@ -14,95 +14,150 @@
package maps
import (
+ "fmt"
"reflect"
"testing"
qt "github.com/frankban/quicktest"
)
-func TestToLower(t *testing.T) {
-
+func TestPrepareParams(t *testing.T) {
tests := []struct {
- input map[string]interface{}
- expected map[string]interface{}
+ input Params
+ expected Params
}{
{
- map[string]interface{}{
+ map[string]any{
"abC": 32,
},
- map[string]interface{}{
+ Params{
"abc": 32,
},
},
{
- map[string]interface{}{
+ map[string]any{
"abC": 32,
- "deF": map[interface{}]interface{}{
+ "deF": map[any]any{
23: "A value",
- 24: map[string]interface{}{
+ 24: map[string]any{
"AbCDe": "A value",
"eFgHi": "Another value",
},
},
- "gHi": map[string]interface{}{
+ "gHi": map[string]any{
"J": 25,
},
+ "jKl": map[string]string{
+ "M": "26",
+ },
},
- map[string]interface{}{
+ Params{
"abc": 32,
- "def": map[string]interface{}{
+ "def": Params{
"23": "A value",
- "24": map[string]interface{}{
+ "24": Params{
"abcde": "A value",
"efghi": "Another value",
},
},
- "ghi": map[string]interface{}{
+ "ghi": Params{
"j": 25,
},
+ "jkl": Params{
+ "m": "26",
+ },
},
},
}
for i, test := range tests {
- // ToLower modifies input.
- ToLower(test.input)
- if !reflect.DeepEqual(test.expected, test.input) {
- t.Errorf("[%d] Expected\n%#v, got\n%#v\n", i, test.expected, test.input)
- }
+ t.Run(fmt.Sprint(i), func(t *testing.T) {
+ // PrepareParams modifies input.
+ prepareClone := PrepareParamsClone(test.input)
+ PrepareParams(test.input)
+ if !reflect.DeepEqual(test.expected, test.input) {
+ t.Errorf("[%d] Expected\n%#v, got\n%#v\n", i, test.expected, test.input)
+ }
+ if !reflect.DeepEqual(test.expected, prepareClone) {
+ t.Errorf("[%d] Expected\n%#v, got\n%#v\n", i, test.expected, prepareClone)
+ }
+ })
}
}
+func TestToSliceStringMap(t *testing.T) {
+ c := qt.New(t)
+
+ tests := []struct {
+ input any
+ expected []map[string]any
+ }{
+ {
+ input: []map[string]any{
+ {"abc": 123},
+ },
+ expected: []map[string]any{
+ {"abc": 123},
+ },
+ }, {
+ input: []any{
+ map[string]any{
+ "def": 456,
+ },
+ },
+ expected: []map[string]any{
+ {"def": 456},
+ },
+ },
+ }
+
+ for _, test := range tests {
+ v, err := ToSliceStringMap(test.input)
+ c.Assert(err, qt.IsNil)
+ c.Assert(v, qt.DeepEquals, test.expected)
+ }
+}
+
+func TestToParamsAndPrepare(t *testing.T) {
+ c := qt.New(t)
+ _, err := ToParamsAndPrepare(map[string]any{"A": "av"})
+ c.Assert(err, qt.IsNil)
+
+ params, err := ToParamsAndPrepare(nil)
+ c.Assert(err, qt.IsNil)
+ c.Assert(params, qt.DeepEquals, Params{})
+}
+
func TestRenameKeys(t *testing.T) {
c := qt.New(t)
- m := map[string]interface{}{
+ m := map[string]any{
"a": 32,
"ren1": "m1",
"ren2": "m1_2",
- "sub": map[string]interface{}{
- "subsub": map[string]interface{}{
+ "sub": map[string]any{
+ "subsub": map[string]any{
"REN1": "m2",
"ren2": "m2_2",
},
},
- "no": map[string]interface{}{
+ "no": map[string]any{
"ren1": "m2",
"ren2": "m2_2",
},
}
- expected := map[string]interface{}{
+ expected := map[string]any{
"a": 32,
"new1": "m1",
"new2": "m1_2",
- "sub": map[string]interface{}{
- "subsub": map[string]interface{}{
+ "sub": map[string]any{
+ "subsub": map[string]any{
"new1": "m2",
"ren2": "m2_2",
},
},
- "no": map[string]interface{}{
+ "no": map[string]any{
"ren1": "m2",
"ren2": "m2_2",
},
@@ -119,5 +174,28 @@ func TestRenameKeys(t *testing.T) {
if !reflect.DeepEqual(expected, m) {
t.Errorf("Expected\n%#v, got\n%#v\n", expected, m)
}
-
+}
+
+func TestLookupEqualFold(t *testing.T) {
+ c := qt.New(t)
+
+ m1 := map[string]any{
+ "a": "av",
+ "B": "bv",
+ }
+
+ v, k, found := LookupEqualFold(m1, "b")
+ c.Assert(found, qt.IsTrue)
+ c.Assert(v, qt.Equals, "bv")
+ c.Assert(k, qt.Equals, "B")
+
+ m2 := map[string]string{
+ "a": "av",
+ "B": "bv",
+ }
+
+ v, k, found = LookupEqualFold(m2, "b")
+ c.Assert(found, qt.IsTrue)
+ c.Assert(k, qt.Equals, "B")
+ c.Assert(v, qt.Equals, "bv")
}
diff --git a/common/maps/ordered.go b/common/maps/ordered.go
new file mode 100644
index 000000000..0da9d239d
--- /dev/null
+++ b/common/maps/ordered.go
@@ -0,0 +1,144 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package maps
+
+import (
+ "slices"
+
+ "github.com/gohugoio/hugo/common/hashing"
+)
+
+// Ordered is a map that can be iterated in the order of insertion.
+// Note that insertion order is not affected if a key is re-inserted into the map.
+// On a nil map, all operations are no-ops.
+// This is not thread safe.
+type Ordered[K comparable, T any] struct {
+ // The keys in the order they were added.
+ keys []K
+ // The values.
+ values map[K]T
+}
+
+// NewOrdered creates a new Ordered map.
+func NewOrdered[K comparable, T any]() *Ordered[K, T] {
+ return &Ordered[K, T]{values: make(map[K]T)}
+}
+
+// Set sets the value for the given key.
+// Note that insertion order is not affected if a key is re-inserted into the map.
+func (m *Ordered[K, T]) Set(key K, value T) {
+ if m == nil {
+ return
+ }
+ // Check if key already exists.
+ if _, found := m.values[key]; !found {
+ m.keys = append(m.keys, key)
+ }
+ m.values[key] = value
+}
+
+// Get gets the value for the given key.
+func (m *Ordered[K, T]) Get(key K) (T, bool) {
+ if m == nil {
+ var v T
+ return v, false
+ }
+ value, found := m.values[key]
+ return value, found
+}
+
+// Has returns whether the given key exists in the map.
+func (m *Ordered[K, T]) Has(key K) bool {
+ if m == nil {
+ return false
+ }
+ _, found := m.values[key]
+ return found
+}
+
+// Delete deletes the value for the given key.
+func (m *Ordered[K, T]) Delete(key K) {
+ if m == nil {
+ return
+ }
+ delete(m.values, key)
+ for i, k := range m.keys {
+ if k == key {
+ m.keys = slices.Delete(m.keys, i, i+1)
+ break
+ }
+ }
+}
+
+// Clone creates a shallow copy of the map.
+func (m *Ordered[K, T]) Clone() *Ordered[K, T] {
+ if m == nil {
+ return nil
+ }
+ clone := NewOrdered[K, T]()
+ for _, k := range m.keys {
+ clone.Set(k, m.values[k])
+ }
+ return clone
+}
+
+// Keys returns the keys in the order they were added.
+func (m *Ordered[K, T]) Keys() []K {
+ if m == nil {
+ return nil
+ }
+ return m.keys
+}
+
+// Values returns the values in the order they were added.
+func (m *Ordered[K, T]) Values() []T {
+ if m == nil {
+ return nil
+ }
+ var values []T
+ for _, k := range m.keys {
+ values = append(values, m.values[k])
+ }
+ return values
+}
+
+// Len returns the number of items in the map.
+func (m *Ordered[K, T]) Len() int {
+ if m == nil {
+ return 0
+ }
+ return len(m.keys)
+}
+
+// Range calls f sequentially for each key and value present in the map.
+// If f returns false, range stops the iteration.
+// TODO(bep) replace with iter.Seq2 when we bump to Go 1.24.
+func (m *Ordered[K, T]) Range(f func(key K, value T) bool) {
+ if m == nil {
+ return
+ }
+ for _, k := range m.keys {
+ if !f(k, m.values[k]) {
+ return
+ }
+ }
+}
+
+// Hash calculates a hash from the values.
+func (m *Ordered[K, T]) Hash() (uint64, error) {
+ if m == nil {
+ return 0, nil
+ }
+ return hashing.Hash(m.values)
+}
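+
+// Illustrative usage, a minimal sketch using only the API defined above:
+//
+//	m := NewOrdered[string, int]()
+//	m.Set("b", 2)
+//	m.Set("a", 1)
+//	m.Range(func(k string, v int) bool {
+//		// Visits "b" then "a", in insertion order.
+//		return true
+//	})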
diff --git a/common/maps/ordered_test.go b/common/maps/ordered_test.go
new file mode 100644
index 000000000..65a827810
--- /dev/null
+++ b/common/maps/ordered_test.go
@@ -0,0 +1,99 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package maps
+
+import (
+ "testing"
+
+ qt "github.com/frankban/quicktest"
+)
+
+func TestOrdered(t *testing.T) {
+ c := qt.New(t)
+
+ m := NewOrdered[string, int]()
+ m.Set("a", 1)
+ m.Set("b", 2)
+ m.Set("c", 3)
+
+ c.Assert(m.Keys(), qt.DeepEquals, []string{"a", "b", "c"})
+ c.Assert(m.Values(), qt.DeepEquals, []int{1, 2, 3})
+
+ v, found := m.Get("b")
+ c.Assert(found, qt.Equals, true)
+ c.Assert(v, qt.Equals, 2)
+
+ m.Set("b", 22)
+ c.Assert(m.Keys(), qt.DeepEquals, []string{"a", "b", "c"})
+ c.Assert(m.Values(), qt.DeepEquals, []int{1, 22, 3})
+
+ m.Delete("b")
+
+ c.Assert(m.Keys(), qt.DeepEquals, []string{"a", "c"})
+ c.Assert(m.Values(), qt.DeepEquals, []int{1, 3})
+}
+
+func TestOrderedHash(t *testing.T) {
+ c := qt.New(t)
+
+ m := NewOrdered[string, int]()
+ m.Set("a", 1)
+ m.Set("b", 2)
+ m.Set("c", 3)
+
+ h1, err := m.Hash()
+ c.Assert(err, qt.IsNil)
+
+ m.Set("d", 4)
+
+ h2, err := m.Hash()
+ c.Assert(err, qt.IsNil)
+
+ c.Assert(h1, qt.Not(qt.Equals), h2)
+
+ m = NewOrdered[string, int]()
+ m.Set("b", 2)
+ m.Set("a", 1)
+ m.Set("c", 3)
+
+ h3, err := m.Hash()
+ c.Assert(err, qt.IsNil)
+ // Order does not matter.
+ c.Assert(h1, qt.Equals, h3)
+}
+
+func TestOrderedNil(t *testing.T) {
+ c := qt.New(t)
+
+ var m *Ordered[string, int]
+
+ m.Set("a", 1)
+ c.Assert(m.Keys(), qt.IsNil)
+ c.Assert(m.Values(), qt.IsNil)
+ v, found := m.Get("a")
+ c.Assert(found, qt.Equals, false)
+ c.Assert(v, qt.Equals, 0)
+ m.Delete("a")
+ var b bool
+ m.Range(func(k string, v int) bool {
+ b = true
+ return true
+ })
+ c.Assert(b, qt.Equals, false)
+ c.Assert(m.Len(), qt.Equals, 0)
+ c.Assert(m.Clone(), qt.IsNil)
+ h, err := m.Hash()
+ c.Assert(err, qt.IsNil)
+ c.Assert(h, qt.Equals, uint64(0))
+}
diff --git a/common/maps/params.go b/common/maps/params.go
index 2d62ad752..819f796e4 100644
--- a/common/maps/params.go
+++ b/common/maps/params.go
@@ -14,81 +14,371 @@
package maps
import (
+ "fmt"
"strings"
"github.com/spf13/cast"
)
+// Params is a map where all keys are lower case.
+type Params map[string]any
+
+// KeyParams is a utility struct for the WalkParams method.
+type KeyParams struct {
+ Key string
+ Params Params
+}
+
+// GetNested does a lower case and nested search in this map.
+// It returns nil if nothing is found.
+// TODO(bep): make all of these methods internal somehow.
+func (p Params) GetNested(indices ...string) any {
+ v, _, _ := getNested(p, indices)
+ return v
+}
+
+// SetParams overwrites values in dst with values in src for common or new keys.
+// This is done recursively.
+func SetParams(dst, src Params) {
+ for k, v := range src {
+ vv, found := dst[k]
+ if !found {
+ dst[k] = v
+ } else {
+ switch vvv := vv.(type) {
+ case Params:
+ if pv, ok := v.(Params); ok {
+ SetParams(vvv, pv)
+ } else {
+ dst[k] = v
+ }
+ default:
+ dst[k] = v
+ }
+ }
+ }
+}
+
+// IsZero returns true if p is considered empty.
+func (p Params) IsZero() bool {
+ if len(p) == 0 {
+ return true
+ }
+
+ if len(p) > 1 {
+ return false
+ }
+
+ for k := range p {
+ return k == MergeStrategyKey
+ }
+
+ return false
+}
+
+// MergeParamsWithStrategy transfers values from src to dst for new keys using the merge strategy given.
+// This is done recursively.
+func MergeParamsWithStrategy(strategy string, dst, src Params) {
+ dst.merge(ParamsMergeStrategy(strategy), src)
+}
+
+// MergeParams transfers values from src to dst for new keys using the merge strategy encoded in dst.
+// This is done recursively.
+func MergeParams(dst, src Params) {
+ ms, _ := dst.GetMergeStrategy()
+ dst.merge(ms, src)
+}
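+
+// A minimal usage sketch, based only on the API defined above:
+//
+//	dst := Params{"a": "av", "nested": Params{"x": 1}}
+//	src := Params{"b": "bv", "nested": Params{"y": 2}}
+//	MergeParamsWithStrategy("deep", dst, src)
+//	// dst now also has "b" and the nested "y"; existing keys are left untouched.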
+
+func (p Params) merge(ps ParamsMergeStrategy, pp Params) {
+ ns, found := p.GetMergeStrategy()
+
+ ms := ns
+ if !found && ps != "" {
+ ms = ps
+ }
+
+ noUpdate := ms == ParamsMergeStrategyNone
+ noUpdate = noUpdate || (ps != "" && ps == ParamsMergeStrategyShallow)
+
+ for k, v := range pp {
+
+ if k == MergeStrategyKey {
+ continue
+ }
+ vv, found := p[k]
+
+ if found {
+ // Key matches, if both sides are Params, we try to merge.
+ if vvv, ok := vv.(Params); ok {
+ if pv, ok := v.(Params); ok {
+ vvv.merge(ms, pv)
+ }
+ }
+ } else if !noUpdate {
+ p[k] = v
+ }
+
+ }
+}
+
+// For internal use.
+func (p Params) GetMergeStrategy() (ParamsMergeStrategy, bool) {
+ if v, found := p[MergeStrategyKey]; found {
+ if s, ok := v.(ParamsMergeStrategy); ok {
+ return s, true
+ }
+ }
+ return ParamsMergeStrategyShallow, false
+}
+
+// For internal use.
+func (p Params) DeleteMergeStrategy() bool {
+ if _, found := p[MergeStrategyKey]; found {
+ delete(p, MergeStrategyKey)
+ return true
+ }
+ return false
+}
+
+// For internal use.
+func (p Params) SetMergeStrategy(s ParamsMergeStrategy) {
+ switch s {
+ case ParamsMergeStrategyDeep, ParamsMergeStrategyNone, ParamsMergeStrategyShallow:
+ default:
+ panic(fmt.Sprintf("invalid merge strategy %q", s))
+ }
+ p[MergeStrategyKey] = s
+}
+
+func getNested(m map[string]any, indices []string) (any, string, map[string]any) {
+ if len(indices) == 0 {
+ return nil, "", nil
+ }
+
+ first := indices[0]
+ v, found := m[strings.ToLower(cast.ToString(first))]
+ if !found {
+ if len(indices) == 1 {
+ return nil, first, m
+ }
+ return nil, "", nil
+
+ }
+
+ if len(indices) == 1 {
+ return v, first, m
+ }
+
+ switch m2 := v.(type) {
+ case Params:
+ return getNested(m2, indices[1:])
+ case map[string]any:
+ return getNested(m2, indices[1:])
+ default:
+ return nil, "", nil
+ }
+}
+
// GetNestedParam gets the first match of the keyStr in the candidates given.
// It will first try the exact match and then try to find it as a nested map value,
// using the given separator, e.g. "mymap.name".
// It assumes that all the maps given have lower cased keys.
-func GetNestedParam(keyStr, separator string, candidates ...map[string]interface{}) (interface{}, error) {
+func GetNestedParam(keyStr, separator string, candidates ...Params) (any, error) {
keyStr = strings.ToLower(keyStr)
- lookupFn := func(key string) interface{} {
- for _, m := range candidates {
- if v, ok := m[key]; ok {
- return v
- }
+ // Try exact match first
+ for _, m := range candidates {
+ if v, ok := m[keyStr]; ok {
+ return v, nil
}
-
- return nil
- }
-
- v, _, _, err := GetNestedParamFn(keyStr, separator, lookupFn)
- return v, err
-}
-
-func GetNestedParamFn(keyStr, separator string, lookupFn func(key string) interface{}) (interface{}, string, map[string]interface{}, error) {
- result, _ := traverseDirectParams(keyStr, lookupFn)
- if result != nil {
- return result, keyStr, nil, nil
}
keySegments := strings.Split(keyStr, separator)
- if len(keySegments) == 1 {
- return nil, keyStr, nil, nil
+ for _, m := range candidates {
+ if v := m.GetNested(keySegments...); v != nil {
+ return v, nil
+ }
}
- return traverseNestedParams(keySegments, lookupFn)
+ return nil, nil
}
-func traverseDirectParams(keyStr string, lookupFn func(key string) interface{}) (interface{}, error) {
- return lookupFn(keyStr), nil
-}
-
-func traverseNestedParams(keySegments []string, lookupFn func(key string) interface{}) (interface{}, string, map[string]interface{}, error) {
- firstKey, rest := keySegments[0], keySegments[1:]
- result := lookupFn(firstKey)
- if result == nil || len(rest) == 0 {
- return result, firstKey, nil, nil
- }
-
- switch m := result.(type) {
- case map[string]interface{}:
- v, key, owner := traverseParams(rest, m)
- return v, key, owner, nil
- default:
+func GetNestedParamFn(keyStr, separator string, lookupFn func(key string) any) (any, string, map[string]any, error) {
+ keySegments := strings.Split(keyStr, separator)
+ if len(keySegments) == 0 {
return nil, "", nil, nil
}
-}
-func traverseParams(keys []string, m map[string]interface{}) (interface{}, string, map[string]interface{}) {
- // Shift first element off.
- firstKey, rest := keys[0], keys[1:]
- result := m[firstKey]
-
- // No point in continuing here.
- if result == nil {
- return result, "", nil
+ first := lookupFn(keySegments[0])
+ if first == nil {
+ return nil, "", nil, nil
}
- if len(rest) == 0 {
- // That was the last key.
- return result, firstKey, m
+ if len(keySegments) == 1 {
+ return first, keySegments[0], nil, nil
}
- // That was not the last key.
- return traverseParams(rest, cast.ToStringMap(result))
+ switch m := first.(type) {
+ case map[string]any:
+ v, key, owner := getNested(m, keySegments[1:])
+ return v, key, owner, nil
+ case Params:
+ v, key, owner := getNested(m, keySegments[1:])
+ return v, key, owner, nil
+ }
+
+ return nil, "", nil, nil
+}
+
+// ParamsMergeStrategy tells what strategy to use in Params.Merge.
+type ParamsMergeStrategy string
+
+const (
+ // Do not merge.
+ ParamsMergeStrategyNone ParamsMergeStrategy = "none"
+ // Only add new keys.
+ ParamsMergeStrategyShallow ParamsMergeStrategy = "shallow"
+ // Add new keys, merge existing.
+ ParamsMergeStrategyDeep ParamsMergeStrategy = "deep"
+
+ MergeStrategyKey = "_merge"
+)
+
+// CleanConfigStringMapString removes any processing instructions from m;
+// m will never be modified.
+func CleanConfigStringMapString(m map[string]string) map[string]string {
+ if len(m) == 0 {
+ return m
+ }
+ if _, found := m[MergeStrategyKey]; !found {
+ return m
+ }
+ // Create a new map and copy all the keys except the merge strategy key.
+ m2 := make(map[string]string, len(m)-1)
+ for k, v := range m {
+ if k != MergeStrategyKey {
+ m2[k] = v
+ }
+ }
+ return m2
+}
+
+// CleanConfigStringMap is the same as CleanConfigStringMapString but for
+// map[string]any.
+func CleanConfigStringMap(m map[string]any) map[string]any {
+ if len(m) == 0 {
+ return m
+ }
+ if _, found := m[MergeStrategyKey]; !found {
+ return m
+ }
+ // Create a new map and copy all the keys except the merge strategy key.
+ m2 := make(map[string]any, len(m)-1)
+ for k, v := range m {
+ if k != MergeStrategyKey {
+ m2[k] = v
+ }
+ switch v2 := v.(type) {
+ case map[string]any:
+ m2[k] = CleanConfigStringMap(v2)
+ case Params:
+ var p Params = CleanConfigStringMap(v2)
+ m2[k] = p
+ case map[string]string:
+ m2[k] = CleanConfigStringMapString(v2)
+ }
+
+ }
+ return m2
+}
+
+func toMergeStrategy(v any) ParamsMergeStrategy {
+ s := ParamsMergeStrategy(cast.ToString(v))
+ switch s {
+ case ParamsMergeStrategyDeep, ParamsMergeStrategyNone, ParamsMergeStrategyShallow:
+ return s
+ default:
+ return ParamsMergeStrategyDeep
+ }
+}
+
+// PrepareParams
+// * makes all the keys in the given map lower cased, and does so recursively.
+// * This will modify the map given.
+// * Any nested map[interface{}]interface{}, map[string]interface{} or map[string]string will be converted to Params.
+// * Any _merge value will be converted to the proper type and value.
+func PrepareParams(m Params) {
+ for k, v := range m {
+ var retyped bool
+ lKey := strings.ToLower(k)
+ if lKey == MergeStrategyKey {
+ v = toMergeStrategy(v)
+ retyped = true
+ } else {
+ switch vv := v.(type) {
+ case map[any]any:
+ var p Params = cast.ToStringMap(v)
+ v = p
+ PrepareParams(p)
+ retyped = true
+ case map[string]any:
+ var p Params = v.(map[string]any)
+ v = p
+ PrepareParams(p)
+ retyped = true
+ case map[string]string:
+ p := make(Params)
+ for k, v := range vv {
+ p[k] = v
+ }
+ v = p
+ PrepareParams(p)
+ retyped = true
+ }
+ }
+
+ if retyped || k != lKey {
+ delete(m, k)
+ m[lKey] = v
+ }
+ }
+}
+
+// PrepareParamsClone is like PrepareParams, but it does not modify the input.
+func PrepareParamsClone(m Params) Params {
+ m2 := make(Params)
+ for k, v := range m {
+ var retyped bool
+ lKey := strings.ToLower(k)
+ if lKey == MergeStrategyKey {
+ v = toMergeStrategy(v)
+ retyped = true
+ } else {
+ switch vv := v.(type) {
+ case map[any]any:
+ var p Params = cast.ToStringMap(v)
+ v = PrepareParamsClone(p)
+ retyped = true
+ case map[string]any:
+ var p Params = v.(map[string]any)
+ v = PrepareParamsClone(p)
+ retyped = true
+ case map[string]string:
+ p := make(Params)
+ for k, v := range vv {
+ p[k] = v
+ }
+ v = p
+ PrepareParams(p)
+ retyped = true
+ }
+ }
+
+ if retyped || k != lKey {
+ m2[lKey] = v
+ } else {
+ m2[k] = v
+ }
+ }
+ return m2
}
diff --git a/common/maps/params_test.go b/common/maps/params_test.go
index 6477de6f4..892c77175 100644
--- a/common/maps/params_test.go
+++ b/common/maps/params_test.go
@@ -20,14 +20,13 @@ import (
)
func TestGetNestedParam(t *testing.T) {
-
- m := map[string]interface{}{
+ m := map[string]any{
"string": "value",
"first": 1,
"with_underscore": 2,
- "nested": map[string]interface{}{
+ "nested": map[string]any{
"color": "blue",
- "nestednested": map[string]interface{}{
+ "nestednested": map[string]any{
"color": "green",
},
},
@@ -35,7 +34,7 @@ func TestGetNestedParam(t *testing.T) {
c := qt.New(t)
- must := func(keyStr, separator string, candidates ...map[string]interface{}) interface{} {
+ must := func(keyStr, separator string, candidates ...Params) any {
v, err := GetNestedParam(keyStr, separator, candidates...)
c.Assert(err, qt.IsNil)
return v
@@ -47,5 +46,124 @@ func TestGetNestedParam(t *testing.T) {
c.Assert(must("nested_color", "_", m), qt.Equals, "blue")
c.Assert(must("nested.nestednested.color", ".", m), qt.Equals, "green")
c.Assert(must("string.name", ".", m), qt.IsNil)
-
+ c.Assert(must("nested.foo", ".", m), qt.IsNil)
+}
+
+// https://github.com/gohugoio/hugo/issues/7903
+func TestGetNestedParamFnNestedNewKey(t *testing.T) {
+ c := qt.New(t)
+
+ nested := map[string]any{
+ "color": "blue",
+ }
+ m := map[string]any{
+ "nested": nested,
+ }
+
+ existing, nestedKey, owner, err := GetNestedParamFn("nested.new", ".", func(key string) any {
+ return m[key]
+ })
+
+ c.Assert(err, qt.IsNil)
+ c.Assert(existing, qt.IsNil)
+ c.Assert(nestedKey, qt.Equals, "new")
+ c.Assert(owner, qt.DeepEquals, nested)
+}
+
+func TestParamsSetAndMerge(t *testing.T) {
+ c := qt.New(t)
+
+ createParamsPair := func() (Params, Params) {
+ p1 := Params{"a": "av", "c": "cv", "nested": Params{"al2": "al2v", "cl2": "cl2v"}}
+ p2 := Params{"b": "bv", "a": "abv", "nested": Params{"bl2": "bl2v", "al2": "al2bv"}, MergeStrategyKey: ParamsMergeStrategyDeep}
+ return p1, p2
+ }
+
+ p1, p2 := createParamsPair()
+
+ SetParams(p1, p2)
+
+ c.Assert(p1, qt.DeepEquals, Params{
+ "a": "abv",
+ "c": "cv",
+ "nested": Params{
+ "al2": "al2bv",
+ "cl2": "cl2v",
+ "bl2": "bl2v",
+ },
+ "b": "bv",
+ MergeStrategyKey: ParamsMergeStrategyDeep,
+ })
+
+ p1, p2 = createParamsPair()
+
+ MergeParamsWithStrategy("", p1, p2)
+
+ // Default is to do a shallow merge.
+ c.Assert(p1, qt.DeepEquals, Params{
+ "c": "cv",
+ "nested": Params{
+ "al2": "al2v",
+ "cl2": "cl2v",
+ },
+ "b": "bv",
+ "a": "av",
+ })
+
+ p1, p2 = createParamsPair()
+ p1.SetMergeStrategy(ParamsMergeStrategyNone)
+ MergeParamsWithStrategy("", p1, p2)
+ p1.DeleteMergeStrategy()
+
+ c.Assert(p1, qt.DeepEquals, Params{
+ "a": "av",
+ "c": "cv",
+ "nested": Params{
+ "al2": "al2v",
+ "cl2": "cl2v",
+ },
+ })
+
+ p1, p2 = createParamsPair()
+ p1.SetMergeStrategy(ParamsMergeStrategyShallow)
+ MergeParamsWithStrategy("", p1, p2)
+ p1.DeleteMergeStrategy()
+
+ c.Assert(p1, qt.DeepEquals, Params{
+ "a": "av",
+ "c": "cv",
+ "nested": Params{
+ "al2": "al2v",
+ "cl2": "cl2v",
+ },
+ "b": "bv",
+ })
+
+ p1, p2 = createParamsPair()
+ p1.SetMergeStrategy(ParamsMergeStrategyDeep)
+ MergeParamsWithStrategy("", p1, p2)
+ p1.DeleteMergeStrategy()
+
+ c.Assert(p1, qt.DeepEquals, Params{
+ "nested": Params{
+ "al2": "al2v",
+ "cl2": "cl2v",
+ "bl2": "bl2v",
+ },
+ "b": "bv",
+ "a": "av",
+ "c": "cv",
+ })
+}
+
+func TestParamsIsZero(t *testing.T) {
+ c := qt.New(t)
+
+ var nilParams Params
+
+ c.Assert(Params{}.IsZero(), qt.IsTrue)
+ c.Assert(nilParams.IsZero(), qt.IsTrue)
+ c.Assert(Params{"foo": "bar"}.IsZero(), qt.IsFalse)
+ c.Assert(Params{"_merge": "foo", "foo": "bar"}.IsZero(), qt.IsFalse)
+ c.Assert(Params{"_merge": "foo"}.IsZero(), qt.IsTrue)
}
diff --git a/common/maps/scratch.go b/common/maps/scratch.go
index 4acd10c6c..cf5231783 100644
--- a/common/maps/scratch.go
+++ b/common/maps/scratch.go
@@ -22,37 +22,24 @@ import (
"github.com/gohugoio/hugo/common/math"
)
-// Scratch is a writable context used for stateful operations in Page/Node rendering.
+type StoreProvider interface {
+ // Store returns a Scratch that can be used to store temporary state.
+ // Store is not reset on server rebuilds.
+ Store() *Scratch
+}
+
+// Scratch is a writable context used for stateful build operations.
type Scratch struct {
- values map[string]interface{}
+ values map[string]any
mu sync.RWMutex
}
-// Scratcher provides a scratching service.
-type Scratcher interface {
- Scratch() *Scratch
-}
-
-type scratcher struct {
- s *Scratch
-}
-
-func (s scratcher) Scratch() *Scratch {
- return s.s
-}
-
-// NewScratcher creates a new Scratcher.
-func NewScratcher() Scratcher {
- return scratcher{s: NewScratch()}
-}
-
// Add will, for single values, add (using the + operator) the addend to the existing addend (if found).
// Supports numeric values and strings.
//
// If the first add for a key is an array or slice, then the next value(s) will be appended.
-func (c *Scratch) Add(key string, newAddend interface{}) (string, error) {
-
- var newVal interface{}
+func (c *Scratch) Add(key string, newAddend any) (string, error) {
+ var newVal any
c.mu.RLock()
existingAddend, found := c.values[key]
c.mu.RUnlock()
@@ -83,7 +70,7 @@ func (c *Scratch) Add(key string, newAddend interface{}) (string, error) {
// Set stores a value with the given key in the Node context.
// This value can later be retrieved with Get.
-func (c *Scratch) Set(key string, value interface{}) string {
+func (c *Scratch) Set(key string, value any) string {
c.mu.Lock()
c.values[key] = value
c.mu.Unlock()
@@ -99,7 +86,7 @@ func (c *Scratch) Delete(key string) string {
}
// Get returns a value previously set by Add or Set.
-func (c *Scratch) Get(key string) interface{} {
+func (c *Scratch) Get(key string) any {
c.mu.RLock()
val := c.values[key]
c.mu.RUnlock()
@@ -107,22 +94,42 @@ func (c *Scratch) Get(key string) interface{} {
return val
}
+// Values returns the raw backing map. Note that you should only use
+// this method on locally scoped Scratch instances obtained via newScratch, not
+// .Page.Scratch etc., as that will lead to concurrency issues.
+func (c *Scratch) Values() map[string]any {
+ c.mu.RLock()
+ defer c.mu.RUnlock()
+ return c.values
+}
+
// SetInMap stores a value to a map with the given key in the Node context.
// This map can later be retrieved with GetSortedMapValues.
-func (c *Scratch) SetInMap(key string, mapKey string, value interface{}) string {
+func (c *Scratch) SetInMap(key string, mapKey string, value any) string {
c.mu.Lock()
_, found := c.values[key]
if !found {
- c.values[key] = make(map[string]interface{})
+ c.values[key] = make(map[string]any)
}
- c.values[key].(map[string]interface{})[mapKey] = value
+ c.values[key].(map[string]any)[mapKey] = value
+ c.mu.Unlock()
+ return ""
+}
+
+// DeleteInMap deletes a value from the map with the given key in the Node context.
+func (c *Scratch) DeleteInMap(key string, mapKey string) string {
+ c.mu.Lock()
+ _, found := c.values[key]
+ if found {
+ delete(c.values[key].(map[string]any), mapKey)
+ }
c.mu.Unlock()
return ""
}
// GetSortedMapValues returns a sorted map previously filled with SetInMap.
-func (c *Scratch) GetSortedMapValues(key string) interface{} {
+func (c *Scratch) GetSortedMapValues(key string) any {
c.mu.RLock()
if c.values[key] == nil {
@@ -130,7 +137,7 @@ func (c *Scratch) GetSortedMapValues(key string) interface{} {
return nil
}
- unsortedMap := c.values[key].(map[string]interface{})
+ unsortedMap := c.values[key].(map[string]any)
c.mu.RUnlock()
var keys []string
for mapKey := range unsortedMap {
@@ -139,7 +146,7 @@ func (c *Scratch) GetSortedMapValues(key string) interface{} {
sort.Strings(keys)
- sortedArray := make([]interface{}, len(unsortedMap))
+ sortedArray := make([]any, len(unsortedMap))
for i, mapKey := range keys {
sortedArray[i] = unsortedMap[mapKey]
}
@@ -147,7 +154,7 @@ func (c *Scratch) GetSortedMapValues(key string) interface{} {
return sortedArray
}
-// NewScratch returns a new instance Scratch.
+// NewScratch returns a new instance of Scratch.
func NewScratch() *Scratch {
- return &Scratch{values: make(map[string]interface{})}
+ return &Scratch{values: make(map[string]any)}
}
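+
+// Illustrative usage, a minimal sketch using only the API defined above:
+//
+//	s := NewScratch()
+//	s.SetInMap("greetings", "b", "bonjour")
+//	s.SetInMap("greetings", "a", "hello")
+//	s.GetSortedMapValues("greetings") // []any{"hello", "bonjour"}, sorted by map key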
diff --git a/common/maps/scratch_test.go b/common/maps/scratch_test.go
index c2c436e40..f07169e61 100644
--- a/common/maps/scratch_test.go
+++ b/common/maps/scratch_test.go
@@ -47,10 +47,12 @@ func TestScratchAdd(t *testing.T) {
scratch.Add("scratch", scratch)
_, err := scratch.Add("scratch", scratch)
+ m := scratch.Values()
+ c.Assert(m, qt.HasLen, 5)
+
if err == nil {
t.Errorf("Expected error from invalid arithmetic")
}
-
}
func TestScratchAddSlice(t *testing.T) {
@@ -88,12 +90,11 @@ func TestScratchAddTypedSliceToInterfaceSlice(t *testing.T) {
c := qt.New(t)
scratch := NewScratch()
- scratch.Set("slice", []interface{}{})
+ scratch.Set("slice", []any{})
_, err := scratch.Add("slice", []int{1, 2})
c.Assert(err, qt.IsNil)
c.Assert(scratch.Get("slice"), qt.DeepEquals, []int{1, 2})
-
}
// https://github.com/gohugoio/hugo/issues/5361
@@ -106,8 +107,7 @@ func TestScratchAddDifferentTypedSliceToInterfaceSlice(t *testing.T) {
_, err := scratch.Add("slice", []int{1, 2})
c.Assert(err, qt.IsNil)
- c.Assert(scratch.Get("slice"), qt.DeepEquals, []interface{}{"foo", 1, 2})
-
+ c.Assert(scratch.Get("slice"), qt.DeepEquals, []any{"foo", 1, 2})
}
func TestScratchSet(t *testing.T) {
@@ -140,7 +140,7 @@ func TestScratchInParallel(t *testing.T) {
for i := 1; i <= 10; i++ {
wg.Add(1)
go func(j int) {
- for k := 0; k < 10; k++ {
+ for k := range 10 {
newVal := int64(k + j)
_, err := scratch.Add(key, newVal)
@@ -185,7 +185,21 @@ func TestScratchSetInMap(t *testing.T) {
scratch.SetInMap("key", "zyx", "Zyx")
scratch.SetInMap("key", "abc", "Abc (updated)")
scratch.SetInMap("key", "def", "Def")
- c.Assert(scratch.GetSortedMapValues("key"), qt.DeepEquals, []interface{}{0: "Abc (updated)", 1: "Def", 2: "Lux", 3: "Zyx"})
+ c.Assert(scratch.GetSortedMapValues("key"), qt.DeepEquals, any([]any{"Abc (updated)", "Def", "Lux", "Zyx"}))
+}
+
+func TestScratchDeleteInMap(t *testing.T) {
+ t.Parallel()
+ c := qt.New(t)
+
+ scratch := NewScratch()
+ scratch.SetInMap("key", "lux", "Lux")
+ scratch.SetInMap("key", "abc", "Abc")
+ scratch.SetInMap("key", "zyx", "Zyx")
+ scratch.DeleteInMap("key", "abc")
+ scratch.SetInMap("key", "def", "Def")
+ scratch.DeleteInMap("key", "lmn") // Do nothing
+ c.Assert(scratch.GetSortedMapValues("key"), qt.DeepEquals, any([]any{"Def", "Lux", "Zyx"}))
}
func TestScratchGetSortedMapValues(t *testing.T) {
diff --git a/common/math/math.go b/common/math/math.go
index cd06379aa..f88fbcd9c 100644
--- a/common/math/math.go
+++ b/common/math/math.go
@@ -20,35 +20,38 @@ import (
// DoArithmetic performs arithmetic operations (+,-,*,/) using reflection to
// determine the type of the two terms.
-func DoArithmetic(a, b interface{}, op rune) (interface{}, error) {
+func DoArithmetic(a, b any, op rune) (any, error) {
av := reflect.ValueOf(a)
bv := reflect.ValueOf(b)
var ai, bi int64
var af, bf float64
var au, bu uint64
+ var isInt, isFloat, isUint bool
switch av.Kind() {
case reflect.Int, reflect.Int8, reflect.Int16, reflect.Int32, reflect.Int64:
ai = av.Int()
switch bv.Kind() {
case reflect.Int, reflect.Int8, reflect.Int16, reflect.Int32, reflect.Int64:
+ isInt = true
bi = bv.Int()
case reflect.Float32, reflect.Float64:
+ isFloat = true
af = float64(ai) // may overflow
- ai = 0
bf = bv.Float()
case reflect.Uint, reflect.Uint8, reflect.Uint16, reflect.Uint32, reflect.Uint64:
bu = bv.Uint()
if ai >= 0 {
+ isUint = true
au = uint64(ai)
- ai = 0
} else {
+ isInt = true
bi = int64(bu) // may overflow
- bu = 0
}
default:
return nil, errors.New("can't apply the operator to the values")
}
case reflect.Float32, reflect.Float64:
+ isFloat = true
af = av.Float()
switch bv.Kind() {
case reflect.Int, reflect.Int8, reflect.Int16, reflect.Int32, reflect.Int64:
@@ -66,17 +69,18 @@ func DoArithmetic(a, b interface{}, op rune) (interface{}, error) {
case reflect.Int, reflect.Int8, reflect.Int16, reflect.Int32, reflect.Int64:
bi = bv.Int()
if bi >= 0 {
+ isUint = true
bu = uint64(bi)
- bi = 0
} else {
+ isInt = true
ai = int64(au) // may overflow
- au = 0
}
case reflect.Float32, reflect.Float64:
+ isFloat = true
af = float64(au) // may overflow
- au = 0
bf = bv.Float()
case reflect.Uint, reflect.Uint8, reflect.Uint16, reflect.Uint32, reflect.Uint64:
+ isUint = true
bu = bv.Uint()
default:
return nil, errors.New("can't apply the operator to the values")
@@ -94,38 +98,32 @@ func DoArithmetic(a, b interface{}, op rune) (interface{}, error) {
switch op {
case '+':
- if ai != 0 || bi != 0 {
+ if isInt {
return ai + bi, nil
- } else if af != 0 || bf != 0 {
+ } else if isFloat {
return af + bf, nil
- } else if au != 0 || bu != 0 {
- return au + bu, nil
}
- return 0, nil
+ return au + bu, nil
case '-':
- if ai != 0 || bi != 0 {
+ if isInt {
return ai - bi, nil
- } else if af != 0 || bf != 0 {
+ } else if isFloat {
return af - bf, nil
- } else if au != 0 || bu != 0 {
- return au - bu, nil
}
- return 0, nil
+ return au - bu, nil
case '*':
- if ai != 0 || bi != 0 {
+ if isInt {
return ai * bi, nil
- } else if af != 0 || bf != 0 {
+ } else if isFloat {
return af * bf, nil
- } else if au != 0 || bu != 0 {
- return au * bu, nil
}
- return 0, nil
+ return au * bu, nil
case '/':
- if bi != 0 {
+ if isInt && bi != 0 {
return ai / bi, nil
- } else if bf != 0 {
+ } else if isFloat && bf != 0 {
return af / bf, nil
- } else if bu != 0 {
+ } else if isUint && bu != 0 {
return au / bu, nil
}
return nil, errors.New("can't divide the value by 0")
diff --git a/common/math/math_test.go b/common/math/math_test.go
index a11701862..d75d30a69 100644
--- a/common/math/math_test.go
+++ b/common/math/math_test.go
@@ -24,16 +24,18 @@ func TestDoArithmetic(t *testing.T) {
c := qt.New(t)
for _, test := range []struct {
- a interface{}
- b interface{}
+ a any
+ b any
op rune
- expect interface{}
+ expect any
}{
{3, 2, '+', int64(5)},
+ {0, 0, '+', int64(0)},
{3, 2, '-', int64(1)},
{3, 2, '*', int64(6)},
{3, 2, '/', int64(1)},
{3.0, 2, '+', float64(5)},
+ {0.0, 0, '+', float64(0.0)},
{3.0, 2, '-', float64(1)},
{3.0, 2, '*', float64(6)},
{3.0, 2, '/', float64(1.5)},
@@ -42,18 +44,22 @@ func TestDoArithmetic(t *testing.T) {
{3, 2.0, '*', float64(6)},
{3, 2.0, '/', float64(1.5)},
{3.0, 2.0, '+', float64(5)},
+ {0.0, 0.0, '+', float64(0.0)},
{3.0, 2.0, '-', float64(1)},
{3.0, 2.0, '*', float64(6)},
{3.0, 2.0, '/', float64(1.5)},
{uint(3), uint(2), '+', uint64(5)},
+ {uint(0), uint(0), '+', uint64(0)},
{uint(3), uint(2), '-', uint64(1)},
{uint(3), uint(2), '*', uint64(6)},
{uint(3), uint(2), '/', uint64(1)},
{uint(3), 2, '+', uint64(5)},
+ {uint(0), 0, '+', uint64(0)},
{uint(3), 2, '-', uint64(1)},
{uint(3), 2, '*', uint64(6)},
{uint(3), 2, '/', uint64(1)},
{3, uint(2), '+', uint64(5)},
+ {0, uint(0), '+', uint64(0)},
{3, uint(2), '-', uint64(1)},
{3, uint(2), '*', uint64(6)},
{3, uint(2), '/', uint64(1)},
@@ -66,16 +72,15 @@ func TestDoArithmetic(t *testing.T) {
{-3, uint(2), '*', int64(-6)},
{-3, uint(2), '/', int64(-1)},
{uint(3), 2.0, '+', float64(5)},
+ {uint(0), 0.0, '+', float64(0)},
{uint(3), 2.0, '-', float64(1)},
{uint(3), 2.0, '*', float64(6)},
{uint(3), 2.0, '/', float64(1.5)},
{3.0, uint(2), '+', float64(5)},
+ {0.0, uint(0), '+', float64(0)},
{3.0, uint(2), '-', float64(1)},
{3.0, uint(2), '*', float64(6)},
{3.0, uint(2), '/', float64(1.5)},
- {0, 0, '+', 0},
- {0, 0, '-', 0},
- {0, 0, '*', 0},
{"foo", "bar", '+', "foobar"},
{3, 0, '/', false},
{3.0, 0, '/', false},
diff --git a/common/para/para.go b/common/para/para.go
new file mode 100644
index 000000000..c323a3073
--- /dev/null
+++ b/common/para/para.go
@@ -0,0 +1,73 @@
+// Copyright 2019 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+// Package para implements parallel execution helpers.
+package para
+
+import (
+ "context"
+
+ "golang.org/x/sync/errgroup"
+)
+
+// Workers configures a task executor with the maximum number of tasks to be executed in parallel.
+type Workers struct {
+ sem chan struct{}
+}
+
+// Runner wraps the lifecycle methods of a new task set.
+//
+// Run will block until a worker is available or the context is cancelled,
+// and then run the given func in a new goroutine.
+// Wait will wait for all the running goroutines to finish.
+type Runner interface {
+ Run(func() error)
+ Wait() error
+}
+
+type errGroupRunner struct {
+ *errgroup.Group
+ w *Workers
+ ctx context.Context
+}
+
+func (g *errGroupRunner) Run(fn func() error) {
+ select {
+ case g.w.sem <- struct{}{}:
+ case <-g.ctx.Done():
+ return
+ }
+
+ g.Go(func() error {
+ err := fn()
+ <-g.w.sem
+ return err
+ })
+}
+
+// New creates a new Workers with the given number of workers.
+func New(numWorkers int) *Workers {
+ return &Workers{
+ sem: make(chan struct{}, numWorkers),
+ }
+}
+
+// Start starts a new Runner.
+func (w *Workers) Start(ctx context.Context) (Runner, context.Context) {
+ g, ctx := errgroup.WithContext(ctx)
+ return &errGroupRunner{
+ Group: g,
+ ctx: ctx,
+ w: w,
+ }, ctx
+}
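+
+// A minimal usage sketch, based only on the API defined in this file:
+//
+//	workers := New(4)
+//	r, _ := workers.Start(context.Background())
+//	for i := 0; i < 10; i++ {
+//		r.Run(func() error {
+//			return nil // at most 4 tasks run concurrently
+//		})
+//	}
+//	err := r.Wait() // first non-nil error returned by any task, if any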
diff --git a/common/para/para_test.go b/common/para/para_test.go
new file mode 100644
index 000000000..cf24a4e37
--- /dev/null
+++ b/common/para/para_test.go
@@ -0,0 +1,96 @@
+// Copyright 2019 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package para
+
+import (
+ "context"
+ "runtime"
+ "sort"
+ "sync"
+ "sync/atomic"
+ "testing"
+ "time"
+
+ "github.com/gohugoio/hugo/htesting"
+
+ qt "github.com/frankban/quicktest"
+)
+
+func TestPara(t *testing.T) {
+ if runtime.NumCPU() < 4 {
+ t.Skipf("skip para test, CPU count is %d", runtime.NumCPU())
+ }
+
+ // TODO(bep)
+ if htesting.IsCI() {
+ t.Skip("skip para test when running on CI")
+ }
+
+ c := qt.New(t)
+
+ c.Run("Order", func(c *qt.C) {
+ n := 500
+ ints := make([]int, n)
+ for i := range n {
+ ints[i] = i
+ }
+
+ p := New(4)
+ r, _ := p.Start(context.Background())
+
+ var result []int
+ var mu sync.Mutex
+ for i := range n {
+ i := i
+ r.Run(func() error {
+ mu.Lock()
+ defer mu.Unlock()
+ result = append(result, i)
+ return nil
+ })
+ }
+
+ c.Assert(r.Wait(), qt.IsNil)
+ c.Assert(result, qt.HasLen, len(ints))
+ c.Assert(sort.IntsAreSorted(result), qt.Equals, false, qt.Commentf("Para does not seem to be parallel"))
+ sort.Ints(result)
+ c.Assert(result, qt.DeepEquals, ints)
+ })
+
+ c.Run("Time", func(c *qt.C) {
+ const n = 100
+
+ p := New(5)
+ r, _ := p.Start(context.Background())
+
+ start := time.Now()
+
+ var counter int64
+
+ for range n {
+ r.Run(func() error {
+ atomic.AddInt64(&counter, 1)
+ time.Sleep(1 * time.Millisecond)
+ return nil
+ })
+ }
+
+ c.Assert(r.Wait(), qt.IsNil)
+ c.Assert(counter, qt.Equals, int64(n))
+
+ since := time.Since(start)
+ limit := n / 2 * time.Millisecond
+ c.Assert(since < limit, qt.Equals, true, qt.Commentf("%s >= %s", since, limit))
+ })
+}
diff --git a/common/paths/path.go b/common/paths/path.go
new file mode 100644
index 000000000..de91d6a2f
--- /dev/null
+++ b/common/paths/path.go
@@ -0,0 +1,430 @@
+// Copyright 2021 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package paths
+
+import (
+ "errors"
+ "fmt"
+ "net/url"
+ "path"
+ "path/filepath"
+ "strings"
+ "unicode"
+)
+
+// FilePathSeparator as defined by os.Separator.
+const (
+ FilePathSeparator = string(filepath.Separator)
+ slash = "/"
+)
+
+// filepathPathBridge is a bridge for common functionality in filepath vs path
+type filepathPathBridge interface {
+ Base(in string) string
+ Clean(in string) string
+ Dir(in string) string
+ Ext(in string) string
+ Join(elem ...string) string
+ Separator() string
+}
+
+type filepathBridge struct{}
+
+func (filepathBridge) Base(in string) string {
+ return filepath.Base(in)
+}
+
+func (filepathBridge) Clean(in string) string {
+ return filepath.Clean(in)
+}
+
+func (filepathBridge) Dir(in string) string {
+ return filepath.Dir(in)
+}
+
+func (filepathBridge) Ext(in string) string {
+ return filepath.Ext(in)
+}
+
+func (filepathBridge) Join(elem ...string) string {
+ return filepath.Join(elem...)
+}
+
+func (filepathBridge) Separator() string {
+ return FilePathSeparator
+}
+
+var fpb filepathBridge
+
+// AbsPathify creates an absolute path if given a working dir and a relative path.
+// If already absolute, the path is just cleaned.
+func AbsPathify(workingDir, inPath string) string {
+ if filepath.IsAbs(inPath) {
+ return filepath.Clean(inPath)
+ }
+ return filepath.Join(workingDir, inPath)
+}
+
+// AddTrailingSlash adds a trailing Unix styled slash (/) if not already
+// there.
+func AddTrailingSlash(path string) string {
+ if !strings.HasSuffix(path, "/") {
+ path += "/"
+ }
+ return path
+}
+
+// AddLeadingSlash adds a leading Unix styled slash (/) if not already
+// there.
+func AddLeadingSlash(path string) string {
+ if !strings.HasPrefix(path, "/") {
+ path = "/" + path
+ }
+ return path
+}
+
+// AddLeadingAndTrailingSlash adds a leading and a trailing Unix styled slash (/)
+// if not already there.
+func AddLeadingAndTrailingSlash(path string) string {
+ return AddTrailingSlash(AddLeadingSlash(path))
+}
+
+// MakeTitle converts the path given to a suitable title, trimming whitespace
+// and replacing hyphens with whitespace.
+func MakeTitle(inpath string) string {
+ return strings.Replace(strings.TrimSpace(inpath), "-", " ", -1)
+}
+
+// ReplaceExtension takes a path and a new extension, strips the directory and
+// the old extension, and returns the file name with the new extension.
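+//
+// For example (mirroring the package tests):
+//
+//	ReplaceExtension("/some/random/path/file.xml", "html") // "file.html"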
+func ReplaceExtension(path string, newExt string) string {
+ f, _ := fileAndExt(path, fpb)
+ return f + "." + newExt
+}
+
+func makePathRelative(inPath string, possibleDirectories ...string) (string, error) {
+ for _, currentPath := range possibleDirectories {
+ if strings.HasPrefix(inPath, currentPath) {
+ return strings.TrimPrefix(inPath, currentPath), nil
+ }
+ }
+ return inPath, errors.New("can't extract relative path, unknown prefix")
+}
+
+// ExtNoDelimiter takes a path and returns the extension, excluding the delimiter, i.e. "md".
+func ExtNoDelimiter(in string) string {
+ return strings.TrimPrefix(Ext(in), ".")
+}
+
+// Ext takes a path and returns the extension, including the delimiter, i.e. ".md".
+func Ext(in string) string {
+ _, ext := fileAndExt(in, fpb)
+ return ext
+}
+
+// PathAndExt is the same as FileAndExt, but it uses the path package.
+func PathAndExt(in string) (string, string) {
+ return fileAndExt(in, pb)
+}
+
+// FileAndExt takes a path and returns the file and extension separated,
+// the extension including the delimiter, i.e. ".md".
+func FileAndExt(in string) (string, string) {
+ return fileAndExt(in, fpb)
+}
+
+// FileAndExtNoDelimiter takes a path and returns the file and extension separated,
+// the extension excluding the delimiter, e.g "md".
+func FileAndExtNoDelimiter(in string) (string, string) {
+ file, ext := fileAndExt(in, fpb)
+ return file, strings.TrimPrefix(ext, ".")
+}
+
+// Filename takes a file path, strips out the extension,
+// and returns the name of the file.
+func Filename(in string) (name string) {
+ name, _ = fileAndExt(in, fpb)
+ return
+}
+
+// fileAndExt returns the filename and any extension of a file path as
+// two separate strings.
+//
+// If the path, in, contains a directory name ending in a slash,
+// then both name and ext will be empty strings.
+//
+// If the path, in, is either the current directory, the parent
+// directory or the root directory, or an empty string,
+// then both name and ext will be empty strings.
+//
+// If the path, in, represents the path of a file without an extension,
+// then name will be the name of the file and ext will be an empty string.
+//
+// If the path, in, represents a filename with an extension,
+// then name will be the filename minus the extension (and the dot),
+// and ext will contain the extension, including the dot.
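+//
+// For example (from the package tests):
+//
+//	fileAndExt("/tmp/index.html", fpb) // "index", ".html"
+//	fileAndExt("directory/", fpb)      // "", ""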
+func fileAndExt(in string, b filepathPathBridge) (name string, ext string) {
+ ext = b.Ext(in)
+ base := b.Base(in)
+
+ return extractFilename(in, ext, base, b.Separator()), ext
+}
+
+func extractFilename(in, ext, base, pathSeparator string) (name string) {
+ // No file name cases. These are defined as:
+ // 1. any "in" path that ends in a pathSeparator
+	// 2. any "base" consisting of just a pathSeparator
+ // 3. any "base" consisting of just an empty string
+ // 4. any "base" consisting of just the current directory i.e. "."
+ // 5. any "base" consisting of just the parent directory i.e. ".."
+ if (strings.LastIndex(in, pathSeparator) == len(in)-1) || base == "" || base == "." || base == ".." || base == pathSeparator {
+ name = "" // there is NO filename
+ } else if ext != "" { // there was an Extension
+ // return the filename minus the extension (and the ".")
+ name = base[:strings.LastIndex(base, ".")]
+ } else {
+ // no extension case so just return base, which will
+ // be the filename
+ name = base
+ }
+ return
+}
+
+// GetRelativePath returns the given path relative to the given base directory.
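+//
+// For example (from the package tests, using forward slashes):
+//
+//	GetRelativePath("/a/b", "/a") // "b", nil
+//	GetRelativePath("/c", "/a/b") // "../../c", nil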
+func GetRelativePath(path, base string) (final string, err error) {
+ if filepath.IsAbs(path) && base == "" {
+ return "", errors.New("source: missing base directory")
+ }
+ name := filepath.Clean(path)
+ base = filepath.Clean(base)
+
+ name, err = filepath.Rel(base, name)
+ if err != nil {
+ return "", err
+ }
+
+ if strings.HasSuffix(filepath.FromSlash(path), FilePathSeparator) && !strings.HasSuffix(name, FilePathSeparator) {
+ name += FilePathSeparator
+ }
+ return name, nil
+}
+
+func prettifyPath(in string, b filepathPathBridge) string {
+ if filepath.Ext(in) == "" {
+ // /section/name/ -> /section/name/index.html
+ if len(in) < 2 {
+ return b.Separator()
+ }
+ return b.Join(in, "index.html")
+ }
+ name, ext := fileAndExt(in, b)
+ if name == "index" {
+ // /section/name/index.html -> /section/name/index.html
+ return b.Clean(in)
+ }
+ // /section/name.html -> /section/name/index.html
+ return b.Join(b.Dir(in), name, "index"+ext)
+}
+
+// CommonDirPath returns the common directory of the given paths.
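+//
+// For example (from the package tests):
+//
+//	CommonDirPath("/a/b/c", "/a/b/d") // "/a/b"
+//	CommonDirPath("/a/b/c", "/d/e/f") // ""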
+func CommonDirPath(path1, path2 string) string {
+ if path1 == "" || path2 == "" {
+ return ""
+ }
+
+ hadLeadingSlash := strings.HasPrefix(path1, "/") || strings.HasPrefix(path2, "/")
+
+ path1 = TrimLeading(path1)
+ path2 = TrimLeading(path2)
+
+ p1 := strings.Split(path1, "/")
+ p2 := strings.Split(path2, "/")
+
+ var common []string
+
+ for i := 0; i < len(p1) && i < len(p2); i++ {
+ if p1[i] == p2[i] {
+ common = append(common, p1[i])
+ } else {
+ break
+ }
+ }
+
+ s := strings.Join(common, "/")
+
+ if hadLeadingSlash && s != "" {
+ s = "/" + s
+ }
+
+ return s
+}
+
+// Sanitize sanitizes string to be used in Hugo's file paths and URLs, allowing only
+// a predefined set of special Unicode characters.
+//
+// Spaces will be replaced with a single hyphen.
+//
+// This function is the core function used to normalize paths in Hugo.
+//
+// Note that this is the first common step of URL/path sanitization;
+// the final URL/path may end up looking different if the user has stricter rules defined (e.g. removePathAccents=true).
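+//
+// A few illustrative inputs and outputs (mirrored from the package tests):
+//
+//	Sanitize(" Foo bar ")      // "Foo-bar"
+//	Sanitize("fOO,bar:foobAR") // "fOObarfoobAR"
+//	Sanitize("this+is+a+test") // "this+is+a+test"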
+func Sanitize(s string) string {
+ var willChange bool
+ for i, r := range s {
+ willChange = !isAllowedPathCharacter(s, i, r)
+ if willChange {
+ break
+ }
+ }
+
+ if !willChange {
+ // Prevent allocation when nothing changes.
+ return s
+ }
+
+ target := make([]rune, 0, len(s))
+ var (
+ prependHyphen bool
+ wasHyphen bool
+ )
+
+ for i, r := range s {
+ isAllowed := isAllowedPathCharacter(s, i, r)
+
+ if isAllowed {
+ // track explicit hyphen in input; no need to add a new hyphen if
+ // we just saw one.
+ wasHyphen = r == '-'
+
+ if prependHyphen {
+ // if currently have a hyphen, don't prepend an extra one
+ if !wasHyphen {
+ target = append(target, '-')
+ }
+ prependHyphen = false
+ }
+ target = append(target, r)
+ } else if len(target) > 0 && !wasHyphen && unicode.IsSpace(r) {
+ prependHyphen = true
+ }
+ }
+
+ return string(target)
+}
+
+func isAllowedPathCharacter(s string, i int, r rune) bool {
+ if r == ' ' {
+ return false
+ }
+ // Check for the most likely first (faster).
+ isAllowed := unicode.IsLetter(r) || unicode.IsDigit(r)
+ isAllowed = isAllowed || r == '.' || r == '/' || r == '\\' || r == '_' || r == '#' || r == '+' || r == '~' || r == '-' || r == '@'
+ isAllowed = isAllowed || unicode.IsMark(r)
+ isAllowed = isAllowed || (r == '%' && i+2 < len(s) && ishex(s[i+1]) && ishex(s[i+2]))
+ return isAllowed
+}
+
+// From https://golang.org/src/net/url/url.go
+func ishex(c byte) bool {
+ switch {
+ case '0' <= c && c <= '9':
+ return true
+ case 'a' <= c && c <= 'f':
+ return true
+ case 'A' <= c && c <= 'F':
+ return true
+ }
+ return false
+}
+
+var slashFunc = func(r rune) bool {
+ return r == '/'
+}
+
+// Dir behaves like path.Dir without the path.Clean step.
+//
+// The returned path ends in a slash only if it is the root "/".
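+//
+// For example (from the package tests):
+//
+//	Dir("/a/b/c/d") // "/a/b/c"
+//	Dir("/a")       // "/"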
+func Dir(s string) string {
+ dir, _ := path.Split(s)
+ if len(dir) > 1 && dir[len(dir)-1] == '/' {
+ return dir[:len(dir)-1]
+ }
+ return dir
+}
+
+// FieldsSlash cuts s into fields separated with '/'.
+func FieldsSlash(s string) []string {
+ f := strings.FieldsFunc(s, slashFunc)
+ return f
+}
+
+// DirFile holds the result from path.Split.
+type DirFile struct {
+ Dir string
+ File string
+}
+
+// Used in test.
+func (df DirFile) String() string {
+ return fmt.Sprintf("%s|%s", df.Dir, df.File)
+}
+
+// PathEscape escapes unicode letters in pth.
+// Use URLEscape to escape full URLs including scheme, query etc.
+// This is slightly faster for the common case.
+// Note, there is a url.PathEscape function, but that also
+// escapes /.
+func PathEscape(pth string) string {
+ u, err := url.Parse(pth)
+ if err != nil {
+ panic(err)
+ }
+ return u.EscapedPath()
+}
+
+// ToSlashTrimLeading is just a filepath.ToSlash with an added / prefix trimmer.
+func ToSlashTrimLeading(s string) string {
+ return TrimLeading(filepath.ToSlash(s))
+}
+
+// TrimLeading trims the leading slash from the given string.
+func TrimLeading(s string) string {
+ return strings.TrimPrefix(s, "/")
+}
+
+// ToSlashTrimTrailing is just a filepath.ToSlash with an added / suffix trimmer.
+func ToSlashTrimTrailing(s string) string {
+ return TrimTrailing(filepath.ToSlash(s))
+}
+
+// TrimTrailing trims the trailing slash from the given string.
+func TrimTrailing(s string) string {
+ return strings.TrimSuffix(s, "/")
+}
+
+// ToSlashTrim trims any leading and trailing slashes from the given string and converts it to a forward slash separated path.
+func ToSlashTrim(s string) string {
+ return strings.Trim(filepath.ToSlash(s), "/")
+}
+
+// ToSlashPreserveLeading converts the path given to a forward slash separated path
+// and preserves the leading slash if present trimming any trailing slash.
+func ToSlashPreserveLeading(s string) string {
+ return "/" + strings.Trim(filepath.ToSlash(s), "/")
+}
+
+// IsSameFilePath checks if s1 and s2 are the same file path.
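+//
+// For example (from the package tests):
+//
+//	IsSameFilePath("/a/b/c", "/a/b/c/") // true
+//	IsSameFilePath("/a/b/c", "/a/b/d")  // false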
+func IsSameFilePath(s1, s2 string) bool {
+ return path.Clean(ToSlashTrim(s1)) == path.Clean(ToSlashTrim(s2))
+}
diff --git a/common/paths/path_test.go b/common/paths/path_test.go
new file mode 100644
index 000000000..bc27df6c6
--- /dev/null
+++ b/common/paths/path_test.go
@@ -0,0 +1,313 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package paths
+
+import (
+ "path/filepath"
+ "testing"
+
+ qt "github.com/frankban/quicktest"
+)
+
+func TestGetRelativePath(t *testing.T) {
+ tests := []struct {
+ path string
+ base string
+ expect any
+ }{
+ {filepath.FromSlash("/a/b"), filepath.FromSlash("/a"), filepath.FromSlash("b")},
+ {filepath.FromSlash("/a/b/c/"), filepath.FromSlash("/a"), filepath.FromSlash("b/c/")},
+ {filepath.FromSlash("/c"), filepath.FromSlash("/a/b"), filepath.FromSlash("../../c")},
+ {filepath.FromSlash("/c"), "", false},
+ }
+ for i, this := range tests {
+ // ultimately a fancy wrapper around filepath.Rel
+ result, err := GetRelativePath(this.path, this.base)
+
+ if b, ok := this.expect.(bool); ok && !b {
+ if err == nil {
+ t.Errorf("[%d] GetRelativePath didn't return an expected error", i)
+ }
+ } else {
+ if err != nil {
+ t.Errorf("[%d] GetRelativePath failed: %s", i, err)
+ continue
+ }
+ if result != this.expect {
+ t.Errorf("[%d] GetRelativePath got %v but expected %v", i, result, this.expect)
+ }
+ }
+
+ }
+}
+
+func TestMakePathRelative(t *testing.T) {
+ type test struct {
+ inPath, path1, path2, output string
+ }
+
+ data := []test{
+ {"/abc/bcd/ab.css", "/abc/bcd", "/bbc/bcd", "/ab.css"},
+ {"/abc/bcd/ab.css", "/abcd/bcd", "/abc/bcd", "/ab.css"},
+ }
+
+ for i, d := range data {
+ output, _ := makePathRelative(d.inPath, d.path1, d.path2)
+ if d.output != output {
+ t.Errorf("Test #%d failed. Expected %q got %q", i, d.output, output)
+ }
+ }
+ _, error := makePathRelative("a/b/c.ss", "/a/c", "/d/c", "/e/f")
+
+ if error == nil {
+ t.Errorf("Test failed, expected error")
+ }
+}
+
+func TestMakeTitle(t *testing.T) {
+ type test struct {
+ input, expected string
+ }
+ data := []test{
+ {"Make-Title", "Make Title"},
+ {"MakeTitle", "MakeTitle"},
+ {"make_title", "make_title"},
+ }
+ for i, d := range data {
+ output := MakeTitle(d.input)
+ if d.expected != output {
+ t.Errorf("Test %d failed. Expected %q got %q", i, d.expected, output)
+ }
+ }
+}
+
+// Replace Extension is probably poorly named, but the intent of the
+// function is to accept a path and return only the file name with a
+// new extension. It's intentionally designed to strip out the path
+// and only provide the name. We should probably rename the function to
+// be more explicit at some point.
+func TestReplaceExtension(t *testing.T) {
+ type test struct {
+ input, newext, expected string
+ }
+ data := []test{
+ // These work according to the above definition
+ {"/some/random/path/file.xml", "html", "file.html"},
+ {"/banana.html", "xml", "banana.xml"},
+ {"./banana.html", "xml", "banana.xml"},
+ {"banana/pie/index.html", "xml", "index.xml"},
+ {"../pies/fish/index.html", "xml", "index.xml"},
+ // but these all fail
+ {"filename-without-an-ext", "ext", "filename-without-an-ext.ext"},
+ {"/filename-without-an-ext", "ext", "filename-without-an-ext.ext"},
+ {"/directory/mydir/", "ext", ".ext"},
+ {"mydir/", "ext", ".ext"},
+ }
+
+ for i, d := range data {
+ output := ReplaceExtension(filepath.FromSlash(d.input), d.newext)
+ if d.expected != output {
+ t.Errorf("Test %d failed. Expected %q got %q", i, d.expected, output)
+ }
+ }
+}
+
+func TestExtNoDelimiter(t *testing.T) {
+ c := qt.New(t)
+ c.Assert(ExtNoDelimiter(filepath.FromSlash("/my/data.json")), qt.Equals, "json")
+}
+
+func TestFilename(t *testing.T) {
+ type test struct {
+ input, expected string
+ }
+ data := []test{
+ {"index.html", "index"},
+ {"./index.html", "index"},
+ {"/index.html", "index"},
+ {"index", "index"},
+ {"/tmp/index.html", "index"},
+ {"./filename-no-ext", "filename-no-ext"},
+ {"/filename-no-ext", "filename-no-ext"},
+ {"filename-no-ext", "filename-no-ext"},
+ {"directory/", ""}, // no filename case??
+ {"directory/.hidden.ext", ".hidden"},
+ {"./directory/../~/banana/gold.fish", "gold"},
+ {"../directory/banana.man", "banana"},
+ {"~/mydir/filename.ext", "filename"},
+ {"./directory//tmp/filename.ext", "filename"},
+ }
+
+ for i, d := range data {
+ output := Filename(filepath.FromSlash(d.input))
+ if d.expected != output {
+ t.Errorf("Test %d failed. Expected %q got %q", i, d.expected, output)
+ }
+ }
+}
+
+func TestFileAndExt(t *testing.T) {
+ type test struct {
+ input, expectedFile, expectedExt string
+ }
+ data := []test{
+ {"index.html", "index", ".html"},
+ {"./index.html", "index", ".html"},
+ {"/index.html", "index", ".html"},
+ {"index", "index", ""},
+ {"/tmp/index.html", "index", ".html"},
+ {"./filename-no-ext", "filename-no-ext", ""},
+ {"/filename-no-ext", "filename-no-ext", ""},
+ {"filename-no-ext", "filename-no-ext", ""},
+ {"directory/", "", ""}, // no filename case??
+ {"directory/.hidden.ext", ".hidden", ".ext"},
+ {"./directory/../~/banana/gold.fish", "gold", ".fish"},
+ {"../directory/banana.man", "banana", ".man"},
+ {"~/mydir/filename.ext", "filename", ".ext"},
+ {"./directory//tmp/filename.ext", "filename", ".ext"},
+ }
+
+ for i, d := range data {
+ file, ext := fileAndExt(filepath.FromSlash(d.input), fpb)
+ if d.expectedFile != file {
+ t.Errorf("Test %d failed. Expected filename %q got %q.", i, d.expectedFile, file)
+ }
+ if d.expectedExt != ext {
+ t.Errorf("Test %d failed. Expected extension %q got %q.", i, d.expectedExt, ext)
+ }
+ }
+}
+
+func TestSanitize(t *testing.T) {
+ c := qt.New(t)
+ tests := []struct {
+ input string
+ expected string
+ }{
+ {" Foo bar ", "Foo-bar"},
+ {"Foo.Bar/foo_Bar-Foo", "Foo.Bar/foo_Bar-Foo"},
+ {"fOO,bar:foobAR", "fOObarfoobAR"},
+ {"FOo/BaR.html", "FOo/BaR.html"},
+		{"FOo/Ba---R.html", "FOo/Ba---R.html"}, // See #10104
+ {"FOo/Ba R.html", "FOo/Ba-R.html"},
+ {"трям/трям", "трям/трям"},
+ {"은행", "은행"},
+ {"Банковский кассир", "Банковский-кассир"},
+ // Issue #1488
+ {"संस्कृत", "संस्कृत"},
+ {"a%C3%B1ame", "a%C3%B1ame"}, // Issue #1292
+ {"this+is+a+test", "this+is+a+test"}, // Issue #1290
+ {"~foo", "~foo"}, // Issue #2177
+
+ }
+
+ for _, test := range tests {
+ c.Assert(Sanitize(test.input), qt.Equals, test.expected)
+ }
+}
+
+func BenchmarkSanitize(b *testing.B) {
+ const (
+		allAllowedPath = "foo/bar"
+ spacePath = "foo bar"
+ )
+
+ // This should not allocate any memory.
+ b.Run("All allowed", func(b *testing.B) {
+ for i := 0; i < b.N; i++ {
+			got := Sanitize(allAllowedPath)
+			if got != allAllowedPath {
+ b.Fatal(got)
+ }
+ }
+ })
+
+ // This will allocate some memory.
+ b.Run("Spaces", func(b *testing.B) {
+ for i := 0; i < b.N; i++ {
+ got := Sanitize(spacePath)
+ if got != "foo-bar" {
+ b.Fatal(got)
+ }
+ }
+ })
+}
+
+func TestDir(t *testing.T) {
+ c := qt.New(t)
+ c.Assert(Dir("/a/b/c/d"), qt.Equals, "/a/b/c")
+ c.Assert(Dir("/a"), qt.Equals, "/")
+ c.Assert(Dir("/"), qt.Equals, "/")
+ c.Assert(Dir(""), qt.Equals, "")
+}
+
+func TestFieldsSlash(t *testing.T) {
+ c := qt.New(t)
+
+ c.Assert(FieldsSlash("a/b/c"), qt.DeepEquals, []string{"a", "b", "c"})
+ c.Assert(FieldsSlash("/a/b/c"), qt.DeepEquals, []string{"a", "b", "c"})
+ c.Assert(FieldsSlash("/a/b/c/"), qt.DeepEquals, []string{"a", "b", "c"})
+ c.Assert(FieldsSlash("a/b/c/"), qt.DeepEquals, []string{"a", "b", "c"})
+ c.Assert(FieldsSlash("/"), qt.DeepEquals, []string{})
+ c.Assert(FieldsSlash(""), qt.DeepEquals, []string{})
+}
+
+func TestCommonDirPath(t *testing.T) {
+ c := qt.New(t)
+
+ for _, this := range []struct {
+ a, b, expected string
+ }{
+ {"/a/b/c", "/a/b/d", "/a/b"},
+ {"/a/b/c", "a/b/d", "/a/b"},
+ {"a/b/c", "/a/b/d", "/a/b"},
+ {"a/b/c", "a/b/d", "a/b"},
+ {"/a/b/c", "/a/b/c", "/a/b/c"},
+ {"/a/b/c", "/a/b/c/d", "/a/b/c"},
+ {"/a/b/c", "/a/b", "/a/b"},
+ {"/a/b/c", "/a", "/a"},
+ {"/a/b/c", "/d/e/f", ""},
+ } {
+ c.Assert(CommonDirPath(this.a, this.b), qt.Equals, this.expected, qt.Commentf("a: %s b: %s", this.a, this.b))
+ }
+}
+
+func TestIsSameFilePath(t *testing.T) {
+ c := qt.New(t)
+
+ for _, this := range []struct {
+ a, b string
+ expected bool
+ }{
+ {"/a/b/c", "/a/b/c", true},
+ {"/a/b/c", "/a/b/c/", true},
+ {"/a/b/c", "/a/b/d", false},
+ {"/a/b/c", "/a/b", false},
+ {"/a/b/c", "/a/b/c/d", false},
+ {"/a/b/c", "/a/b/cd", false},
+ {"/a/b/c", "/a/b/cc", false},
+ {"/a/b/c", "/a/b/c/", true},
+ {"/a/b/c", "/a/b/c//", true},
+ {"/a/b/c", "/a/b/c/.", true},
+ {"/a/b/c", "/a/b/c/./", true},
+ {"/a/b/c", "/a/b/c/./.", true},
+ {"/a/b/c", "/a/b/c/././", true},
+ {"/a/b/c", "/a/b/c/././.", true},
+ {"/a/b/c", "/a/b/c/./././", true},
+ {"/a/b/c", "/a/b/c/./././.", true},
+ {"/a/b/c", "/a/b/c/././././", true},
+ } {
+ c.Assert(IsSameFilePath(filepath.FromSlash(this.a), filepath.FromSlash(this.b)), qt.Equals, this.expected, qt.Commentf("a: %s b: %s", this.a, this.b))
+ }
+}
diff --git a/common/paths/pathparser.go b/common/paths/pathparser.go
new file mode 100644
index 000000000..8b9259bf7
--- /dev/null
+++ b/common/paths/pathparser.go
@@ -0,0 +1,788 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package paths
+
+import (
+ "path"
+ "path/filepath"
+ "runtime"
+ "strings"
+ "sync"
+
+ "github.com/gohugoio/hugo/common/types"
+ "github.com/gohugoio/hugo/hugofs/files"
+ "github.com/gohugoio/hugo/identity"
+ "github.com/gohugoio/hugo/resources/kinds"
+)
+
+const (
+ identifierBaseof = "baseof"
+)
+
+// PathParser parses a path into a Path.
+type PathParser struct {
+ // Maps the language code to its index in the languages/sites slice.
+ LanguageIndex map[string]int
+
+ // Reports whether the given language is disabled.
+ IsLangDisabled func(string) bool
+
+ // IsOutputFormat reports whether the given name is a valid output format.
+ // The second argument is optional.
+ IsOutputFormat func(name, ext string) bool
+
+ // Reports whether the given ext is a content file.
+ IsContentExt func(string) bool
+}
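+
+// A minimal configuration sketch (mirroring the test setup in
+// pathparser_test.go; the callbacks and values shown are illustrative):
+//
+//	pp := &PathParser{
+//		LanguageIndex:  map[string]int{"en": 0, "no": 1},
+//		IsContentExt:   func(ext string) bool { return ext == "md" },
+//		IsOutputFormat: func(name, ext string) bool { return name == "html" },
+//	}
+//	p := pp.Parse(files.ComponentFolderContent, "/blog/my-post/index.no.md")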
+
+// NormalizePathStringBasic returns a normalized path string using the most basic
+// Hugo rules: lower case, with spaces replaced by hyphens.
+func NormalizePathStringBasic(s string) string {
+ // All lower case.
+ s = strings.ToLower(s)
+
+ // Replace spaces with hyphens.
+ s = strings.ReplaceAll(s, " ", "-")
+
+ return s
+}
+
+// ParseIdentity parses component c with path s into a StringIdentity.
+func (pp *PathParser) ParseIdentity(c, s string) identity.StringIdentity {
+ p := pp.parsePooled(c, s)
+ defer putPath(p)
+ return identity.StringIdentity(p.IdentifierBase())
+}
+
+// ParseBaseAndBaseNameNoIdentifier parses component c with path s into a base and a base name without any identifier.
+func (pp *PathParser) ParseBaseAndBaseNameNoIdentifier(c, s string) (string, string) {
+ p := pp.parsePooled(c, s)
+ defer putPath(p)
+ return p.Base(), p.BaseNameNoIdentifier()
+}
+
+func (pp *PathParser) parsePooled(c, s string) *Path {
+ s = NormalizePathStringBasic(s)
+ p := getPath()
+ p.component = c
+ p, err := pp.doParse(c, s, p)
+ if err != nil {
+ panic(err)
+ }
+ return p
+}
+
+// Parse parses component c with path s into Path using Hugo's content path rules.
+func (pp *PathParser) Parse(c, s string) *Path {
+ p, err := pp.parse(c, s)
+ if err != nil {
+ panic(err)
+ }
+ return p
+}
+
+func (pp *PathParser) newPath(component string) *Path {
+ p := &Path{}
+ p.reset()
+ p.component = component
+ return p
+}
+
+func (pp *PathParser) parse(component, s string) (*Path, error) {
+ ss := NormalizePathStringBasic(s)
+
+ p, err := pp.doParse(component, ss, pp.newPath(component))
+ if err != nil {
+ return nil, err
+ }
+
+ if s != ss {
+ var err error
+ // Preserve the original case for titles etc.
+ p.unnormalized, err = pp.doParse(component, s, pp.newPath(component))
+ if err != nil {
+ return nil, err
+ }
+ } else {
+ p.unnormalized = p
+ }
+
+ return p, nil
+}
+
+func (pp *PathParser) parseIdentifier(component, s string, p *Path, i, lastDot, numDots int, isLast bool) {
+ if p.posContainerHigh != -1 {
+ return
+ }
+ mayHaveLang := numDots > 1 && p.posIdentifierLanguage == -1 && pp.LanguageIndex != nil
+ mayHaveLang = mayHaveLang && (component == files.ComponentFolderContent || component == files.ComponentFolderLayouts)
+ mayHaveOutputFormat := component == files.ComponentFolderLayouts
+ mayHaveKind := p.posIdentifierKind == -1 && mayHaveOutputFormat
+ var mayHaveLayout bool
+ if p.pathType == TypeShortcode {
+ mayHaveLayout = !isLast && component == files.ComponentFolderLayouts
+ } else {
+ mayHaveLayout = component == files.ComponentFolderLayouts
+ }
+
+ var found bool
+ var high int
+ if len(p.identifiersKnown) > 0 {
+ high = lastDot
+ } else {
+ high = len(p.s)
+ }
+ id := types.LowHigh[string]{Low: i + 1, High: high}
+ sid := p.s[id.Low:id.High]
+
+ if len(p.identifiersKnown) == 0 {
+ // The first is always the extension.
+ p.identifiersKnown = append(p.identifiersKnown, id)
+ found = true
+
+ // May also be the output format.
+ if mayHaveOutputFormat && pp.IsOutputFormat(sid, "") {
+ p.posIdentifierOutputFormat = 0
+ }
+ } else {
+
+ var langFound bool
+
+ if mayHaveLang {
+ var disabled bool
+ _, langFound = pp.LanguageIndex[sid]
+ if !langFound {
+ disabled = pp.IsLangDisabled != nil && pp.IsLangDisabled(sid)
+ if disabled {
+ p.disabled = true
+ langFound = true
+ }
+ }
+ found = langFound
+ if langFound {
+ p.identifiersKnown = append(p.identifiersKnown, id)
+ p.posIdentifierLanguage = len(p.identifiersKnown) - 1
+ }
+ }
+
+ if !found && mayHaveOutputFormat {
+ // At this point we may already have resolved an output format,
+ // but we need to keep looking for a more specific one, e.g. amp before html.
+ // Use both name and extension to prevent
+			// false positives of the form css.html.
+ if pp.IsOutputFormat(sid, p.Ext()) {
+ found = true
+ p.identifiersKnown = append(p.identifiersKnown, id)
+ p.posIdentifierOutputFormat = len(p.identifiersKnown) - 1
+ }
+ }
+
+ if !found && mayHaveKind {
+ if kinds.GetKindMain(sid) != "" {
+ found = true
+ p.identifiersKnown = append(p.identifiersKnown, id)
+ p.posIdentifierKind = len(p.identifiersKnown) - 1
+ }
+ }
+
+ if !found && sid == identifierBaseof {
+ found = true
+ p.identifiersKnown = append(p.identifiersKnown, id)
+ p.posIdentifierBaseof = len(p.identifiersKnown) - 1
+ }
+
+ if !found && mayHaveLayout {
+ p.identifiersKnown = append(p.identifiersKnown, id)
+ p.posIdentifierLayout = len(p.identifiersKnown) - 1
+ found = true
+ }
+
+ if !found {
+ p.identifiersUnknown = append(p.identifiersUnknown, id)
+ }
+
+ }
+}
+
+func (pp *PathParser) doParse(component, s string, p *Path) (*Path, error) {
+ if runtime.GOOS == "windows" {
+ s = path.Clean(filepath.ToSlash(s))
+ if s == "." {
+ s = ""
+ }
+ }
+
+ if s == "" {
+ s = "/"
+ }
+
+ // Leading slash, no trailing slash.
+ if !strings.HasPrefix(s, "/") {
+ s = "/" + s
+ }
+
+ if s != "/" && s[len(s)-1] == '/' {
+ s = s[:len(s)-1]
+ }
+
+ p.s = s
+ slashCount := 0
+ lastDot := 0
+ lastSlashIdx := strings.LastIndex(s, "/")
+ numDots := strings.Count(s[lastSlashIdx+1:], ".")
+ if strings.Contains(s, "/_shortcodes/") {
+ p.pathType = TypeShortcode
+ }
+
+ for i := len(s) - 1; i >= 0; i-- {
+ c := s[i]
+
+ switch c {
+ case '.':
+ pp.parseIdentifier(component, s, p, i, lastDot, numDots, false)
+ lastDot = i
+ case '/':
+ slashCount++
+ if p.posContainerHigh == -1 {
+ if lastDot > 0 {
+ pp.parseIdentifier(component, s, p, i, lastDot, numDots, true)
+ }
+ p.posContainerHigh = i + 1
+ } else if p.posContainerLow == -1 {
+ p.posContainerLow = i + 1
+ }
+ if i > 0 {
+ p.posSectionHigh = i
+ }
+ }
+ }
+
+ if len(p.identifiersKnown) > 0 {
+ isContentComponent := p.component == files.ComponentFolderContent || p.component == files.ComponentFolderArchetypes
+ isContent := isContentComponent && pp.IsContentExt(p.Ext())
+ id := p.identifiersKnown[len(p.identifiersKnown)-1]
+
+ if id.Low > p.posContainerHigh {
+ b := p.s[p.posContainerHigh : id.Low-1]
+ if isContent {
+ switch b {
+ case "index":
+ p.pathType = TypeLeaf
+ case "_index":
+ p.pathType = TypeBranch
+ default:
+ p.pathType = TypeContentSingle
+ }
+
+ if slashCount == 2 && p.IsLeafBundle() {
+ p.posSectionHigh = 0
+ }
+ } else if b == files.NameContentData && files.IsContentDataExt(p.Ext()) {
+ p.pathType = TypeContentData
+ }
+ }
+ }
+
+ if p.pathType < TypeMarkup && component == files.ComponentFolderLayouts {
+ if p.posIdentifierBaseof != -1 {
+ p.pathType = TypeBaseof
+ } else {
+ pth := p.Path()
+ if strings.Contains(pth, "/_shortcodes/") {
+ p.pathType = TypeShortcode
+ } else if strings.Contains(pth, "/_markup/") {
+ p.pathType = TypeMarkup
+ } else if strings.HasPrefix(pth, "/_partials/") {
+ p.pathType = TypePartial
+ }
+ }
+ }
+
+ if p.pathType == TypeShortcode && p.posIdentifierLayout != -1 {
+ id := p.identifiersKnown[p.posIdentifierLayout]
+ if id.Low == p.posContainerHigh {
+ // First identifier is shortcode name.
+ p.posIdentifierLayout = -1
+ }
+ }
+
+ return p, nil
+}
+
+// ModifyPathBundleTypeResource reclassifies p as a bundled resource:
+// content paths become TypeContentResource, all others TypeFile.
+func ModifyPathBundleTypeResource(p *Path) {
+ if p.IsContent() {
+ p.pathType = TypeContentResource
+ } else {
+ p.pathType = TypeFile
+ }
+}
+
+//go:generate stringer -type Type
+
+type Type int
+
+const (
+
+ // A generic resource, e.g. a JSON file.
+ TypeFile Type = iota
+
+ // All below are content files.
+ // A resource of a content type with front matter.
+ TypeContentResource
+
+ // E.g. /blog/my-post.md
+ TypeContentSingle
+
+ // All below are bundled content files.
+
+ // Leaf bundles, e.g. /blog/my-post/index.md
+ TypeLeaf
+
+ // Branch bundles, e.g. /blog/_index.md
+ TypeBranch
+
+ // Content data file, _content.gotmpl.
+ TypeContentData
+
+ // Layout types.
+ TypeMarkup
+ TypeShortcode
+ TypePartial
+ TypeBaseof
+)
+
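+// Path holds the result of parsing a single path with PathParser.
+//
+// As an illustration (values taken from the accompanying tests), parsing the
+// content path "/a/b/index.no.md" yields a Path where:
+//
+//	p.Base()         // "/a/b"
+//	p.Ext()          // "md"
+//	p.Lang()         // "no"
+//	p.IsLeafBundle() // true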
+type Path struct {
+ // Note: Any additions to this struct should also be added to the pathPool.
+ s string
+
+ posContainerLow int
+ posContainerHigh int
+ posSectionHigh int
+
+ component string
+ pathType Type
+
+ identifiersKnown []types.LowHigh[string]
+ identifiersUnknown []types.LowHigh[string]
+
+ posIdentifierLanguage int
+ posIdentifierOutputFormat int
+ posIdentifierKind int
+ posIdentifierLayout int
+ posIdentifierBaseof int
+ disabled bool
+
+ trimLeadingSlash bool
+
+ unnormalized *Path
+}
+
+var pathPool = &sync.Pool{
+ New: func() any {
+ p := &Path{}
+ p.reset()
+ return p
+ },
+}
+
+func getPath() *Path {
+ return pathPool.Get().(*Path)
+}
+
+func putPath(p *Path) {
+ p.reset()
+ pathPool.Put(p)
+}
+
+func (p *Path) reset() {
+ p.s = ""
+ p.posContainerLow = -1
+ p.posContainerHigh = -1
+ p.posSectionHigh = -1
+ p.component = ""
+ p.pathType = 0
+ p.identifiersKnown = p.identifiersKnown[:0]
+ p.posIdentifierLanguage = -1
+ p.posIdentifierOutputFormat = -1
+ p.posIdentifierKind = -1
+ p.posIdentifierLayout = -1
+ p.posIdentifierBaseof = -1
+ p.disabled = false
+ p.trimLeadingSlash = false
+ p.unnormalized = nil
+}
+
+// TrimLeadingSlash returns a copy of the Path with the leading slash removed.
+func (p Path) TrimLeadingSlash() *Path {
+ p.trimLeadingSlash = true
+ return &p
+}
+
+func (p *Path) norm(s string) string {
+ if p.trimLeadingSlash {
+ s = strings.TrimPrefix(s, "/")
+ }
+ return s
+}
+
+// IdentifierBase satisfies identity.Identity.
+func (p *Path) IdentifierBase() string {
+ if p.Component() == files.ComponentFolderLayouts {
+ return p.Path()
+ }
+ return p.Base()
+}
+
+// Component returns the component for this path (e.g. "content").
+func (p *Path) Component() string {
+ return p.component
+}
+
+// Container returns the base name of the container directory for this path.
+func (p *Path) Container() string {
+ if p.posContainerLow == -1 {
+ return ""
+ }
+ return p.norm(p.s[p.posContainerLow : p.posContainerHigh-1])
+}
+
+func (p *Path) String() string {
+ if p == nil {
+ return ""
+ }
+ return p.Path()
+}
+
+// ContainerDir returns the container directory for this path.
+// For content bundles this will be the parent directory.
+func (p *Path) ContainerDir() string {
+ if p.posContainerLow == -1 || !p.IsBundle() {
+ return p.Dir()
+ }
+ return p.norm(p.s[:p.posContainerLow-1])
+}
+
+// Section returns the first path element (section).
+func (p *Path) Section() string {
+ if p.posSectionHigh <= 0 {
+ return ""
+ }
+ return p.norm(p.s[1:p.posSectionHigh])
+}
+
+// IsContent returns true if the path is a content file (e.g. mypost.md).
+// Note that this will also return true for content files in a bundle.
+func (p *Path) IsContent() bool {
+ return p.Type() >= TypeContentResource && p.Type() <= TypeContentData
+}
+
+// isContentPage returns true if the path is a content file (e.g. mypost.md),
+// but not if inside a leaf bundle.
+func (p *Path) isContentPage() bool {
+ return p.Type() >= TypeContentSingle && p.Type() <= TypeContentData
+}
+
+// Name returns the last element of path.
+func (p *Path) Name() string {
+ if p.posContainerHigh > 0 {
+ return p.s[p.posContainerHigh:]
+ }
+ return p.s
+}
+
+// NameNoExt returns the last element of path without any extension.
+func (p *Path) NameNoExt() string {
+ if i := p.identifierIndex(0); i != -1 {
+ return p.s[p.posContainerHigh : p.identifiersKnown[i].Low-1]
+ }
+ return p.s[p.posContainerHigh:]
+}
+
+// NameNoLang returns the last element of path without any language identifier.
+func (p *Path) NameNoLang() string {
+ i := p.identifierIndex(p.posIdentifierLanguage)
+ if i == -1 {
+ return p.Name()
+ }
+
+ return p.s[p.posContainerHigh:p.identifiersKnown[i].Low-1] + p.s[p.identifiersKnown[i].High:]
+}
+
+// BaseNameNoIdentifier returns the logical base name for a resource without any identifier (e.g. no extension).
+// For bundles this will be the containing directory's name, e.g. "blog".
+func (p *Path) BaseNameNoIdentifier() string {
+ if p.IsBundle() {
+ return p.Container()
+ }
+ return p.NameNoIdentifier()
+}
+
+// NameNoIdentifier returns the last element of path without any identifier (e.g. no extension).
+func (p *Path) NameNoIdentifier() string {
+ lowHigh := p.nameLowHigh()
+ return p.s[lowHigh.Low:lowHigh.High]
+}
+
+func (p *Path) nameLowHigh() types.LowHigh[string] {
+ if len(p.identifiersKnown) > 0 {
+ lastID := p.identifiersKnown[len(p.identifiersKnown)-1]
+ if p.posContainerHigh == lastID.Low {
+ // The last identifier is the name.
+ return lastID
+ }
+ return types.LowHigh[string]{
+ Low: p.posContainerHigh,
+ High: p.identifiersKnown[len(p.identifiersKnown)-1].Low - 1,
+ }
+ }
+ return types.LowHigh[string]{
+ Low: p.posContainerHigh,
+ High: len(p.s),
+ }
+}
+
+// Dir returns all but the last element of path, typically the path's directory.
+func (p *Path) Dir() (d string) {
+ if p.posContainerHigh > 0 {
+ d = p.s[:p.posContainerHigh-1]
+ }
+ if d == "" {
+ d = "/"
+ }
+ d = p.norm(d)
+ return
+}
+
+// Path returns the full path.
+func (p *Path) Path() (d string) {
+ return p.norm(p.s)
+}
+
+// PathNoLeadingSlash returns the full path without the leading slash.
+func (p *Path) PathNoLeadingSlash() string {
+ return p.Path()[1:]
+}
+
+// Unnormalized returns the Path with the original case preserved.
+func (p *Path) Unnormalized() *Path {
+ return p.unnormalized
+}
+
+// PathNoLang returns the Path but with any language identifier removed.
+func (p *Path) PathNoLang() string {
+ return p.base(true, false)
+}
+
+// PathNoIdentifier returns the Path but with any identifier (ext, lang) removed.
+func (p *Path) PathNoIdentifier() string {
+ return p.base(false, false)
+}
+
+// PathBeforeLangAndOutputFormatAndExt returns the path up to the first identifier that is not a language or output format.
+func (p *Path) PathBeforeLangAndOutputFormatAndExt() string {
+ if len(p.identifiersKnown) == 0 {
+ return p.norm(p.s)
+ }
+ i := p.identifierIndex(0)
+
+ if j := p.posIdentifierOutputFormat; i == -1 || (j != -1 && j < i) {
+ i = j
+ }
+ if j := p.posIdentifierLanguage; i == -1 || (j != -1 && j < i) {
+ i = j
+ }
+
+ if i == -1 {
+ return p.norm(p.s)
+ }
+
+ id := p.identifiersKnown[i]
+ return p.norm(p.s[:id.Low-1])
+}
+
+// PathRel returns the path relative to the given owner.
+func (p *Path) PathRel(owner *Path) string {
+ ob := owner.Base()
+ if !strings.HasSuffix(ob, "/") {
+ ob += "/"
+ }
+ return strings.TrimPrefix(p.Path(), ob)
+}
+
+// BaseRel returns the base path relative to the given owner.
+func (p *Path) BaseRel(owner *Path) string {
+ ob := owner.Base()
+ if ob == "/" {
+ ob = ""
+ }
+ return p.Base()[len(ob)+1:]
+}
+
+// For content files, Base returns the path without any identifiers (extension, language code etc.).
+// Any 'index' as the last path element is ignored.
+//
+// For other files (Resources), any extension is kept.
+func (p *Path) Base() string {
+ return p.base(!p.isContentPage(), p.IsBundle())
+}
+
+// Used in template lookups.
+// For pages with Type set, we treat that as the section.
+func (p *Path) BaseReTyped(typ string) (d string) {
+ base := p.Base()
+ if p.Section() == typ {
+ return base
+ }
+ d = "/" + typ
+ if p.posSectionHigh != -1 {
+ d += base[p.posSectionHigh:]
+ }
+ d = p.norm(d)
+ return
+}
+
+// BaseNoLeadingSlash returns the base path without the leading slash.
+func (p *Path) BaseNoLeadingSlash() string {
+ return p.Base()[1:]
+}
+
+func (p *Path) base(preserveExt, isBundle bool) string {
+ if len(p.identifiersKnown) == 0 {
+ return p.norm(p.s)
+ }
+
+ if preserveExt && len(p.identifiersKnown) == 1 {
+ // Preserve extension.
+ return p.norm(p.s)
+ }
+
+ var high int
+
+ if isBundle {
+ high = p.posContainerHigh - 1
+ } else {
+ high = p.nameLowHigh().High
+ }
+
+ if high == 0 {
+ high++
+ }
+
+ if !preserveExt {
+ return p.norm(p.s[:high])
+ }
+
+ // For txt files etc. we want to preserve the extension.
+ id := p.identifiersKnown[0]
+
+ return p.norm(p.s[:high] + p.s[id.Low-1:id.High])
+}
+
+func (p *Path) Ext() string {
+ return p.identifierAsString(0)
+}
+
+func (p *Path) OutputFormat() string {
+ return p.identifierAsString(p.posIdentifierOutputFormat)
+}
+
+func (p *Path) Kind() string {
+ return p.identifierAsString(p.posIdentifierKind)
+}
+
+func (p *Path) Layout() string {
+ return p.identifierAsString(p.posIdentifierLayout)
+}
+
+func (p *Path) Lang() string {
+ return p.identifierAsString(p.posIdentifierLanguage)
+}
+
+func (p *Path) Identifier(i int) string {
+ return p.identifierAsString(i)
+}
+
+func (p *Path) Disabled() bool {
+ return p.disabled
+}
+
+func (p *Path) Identifiers() []string {
+ ids := make([]string, len(p.identifiersKnown))
+ for i, id := range p.identifiersKnown {
+ ids[i] = p.s[id.Low:id.High]
+ }
+ return ids
+}
+
+func (p *Path) IdentifiersUnknown() []string {
+ ids := make([]string, len(p.identifiersUnknown))
+ for i, id := range p.identifiersUnknown {
+ ids[i] = p.s[id.Low:id.High]
+ }
+ return ids
+}
+
+func (p *Path) Type() Type {
+ return p.pathType
+}
+
+func (p *Path) IsBundle() bool {
+ return p.pathType >= TypeLeaf && p.pathType <= TypeContentData
+}
+
+func (p *Path) IsBranchBundle() bool {
+ return p.pathType == TypeBranch
+}
+
+func (p *Path) IsLeafBundle() bool {
+ return p.pathType == TypeLeaf
+}
+
+func (p *Path) IsContentData() bool {
+ return p.pathType == TypeContentData
+}
+
+func (p Path) ForType(t Type) *Path {
+ p.pathType = t
+ return &p
+}
+
+func (p *Path) identifierAsString(i int) string {
+ i = p.identifierIndex(i)
+ if i == -1 {
+ return ""
+ }
+
+ id := p.identifiersKnown[i]
+ return p.s[id.Low:id.High]
+}
+
+func (p *Path) identifierIndex(i int) int {
+ if i < 0 || i >= len(p.identifiersKnown) {
+ return -1
+ }
+ return i
+}
+
+// HasExt returns true if the Unix styled path has an extension.
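+//
+// For example (from the package tests):
+//
+//	HasExt("/a/b/c.txt") // true
+//	HasExt("/a/b.c/d")   // false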
+func HasExt(p string) bool {
+ for i := len(p) - 1; i >= 0; i-- {
+ if p[i] == '.' {
+ return true
+ }
+ if p[i] == '/' {
+ return false
+ }
+ }
+ return false
+}
diff --git a/common/paths/pathparser_test.go b/common/paths/pathparser_test.go
new file mode 100644
index 000000000..b1734aef2
--- /dev/null
+++ b/common/paths/pathparser_test.go
@@ -0,0 +1,611 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package paths
+
+import (
+ "path/filepath"
+ "testing"
+
+ "github.com/gohugoio/hugo/hugofs/files"
+ "github.com/gohugoio/hugo/resources/kinds"
+
+ qt "github.com/frankban/quicktest"
+)
+
+var testParser = &PathParser{
+ LanguageIndex: map[string]int{
+ "no": 0,
+ "en": 1,
+ "fr": 2,
+ },
+ IsContentExt: func(ext string) bool {
+ return ext == "md"
+ },
+ IsOutputFormat: func(name, ext string) bool {
+ switch name {
+ case "html", "amp", "csv", "rss":
+ return true
+ }
+ return false
+ },
+}
+
+func TestParse(t *testing.T) {
+ c := qt.New(t)
+
+ tests := []struct {
+ name string
+ path string
+ assert func(c *qt.C, p *Path)
+ }{
+ {
+ "Basic text file",
+ "/a/b.txt",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Name(), qt.Equals, "b.txt")
+ c.Assert(p.Base(), qt.Equals, "/a/b.txt")
+ c.Assert(p.Container(), qt.Equals, "a")
+ c.Assert(p.Dir(), qt.Equals, "/a")
+ c.Assert(p.Ext(), qt.Equals, "txt")
+ c.Assert(p.IsContent(), qt.IsFalse)
+ },
+ },
+ {
+ "Basic text file, upper case",
+ "/A/B.txt",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Name(), qt.Equals, "b.txt")
+ c.Assert(p.NameNoExt(), qt.Equals, "b")
+ c.Assert(p.NameNoIdentifier(), qt.Equals, "b")
+ c.Assert(p.BaseNameNoIdentifier(), qt.Equals, "b")
+ c.Assert(p.Base(), qt.Equals, "/a/b.txt")
+ c.Assert(p.Ext(), qt.Equals, "txt")
+ },
+ },
+ {
+ "Basic text file, 1 space in dir",
+ "/a b/c.txt",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Base(), qt.Equals, "/a-b/c.txt")
+ },
+ },
+ {
+ "Basic text file, 2 spaces in dir",
+ "/a b/c.txt",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Base(), qt.Equals, "/a--b/c.txt")
+ },
+ },
+ {
+ "Basic text file, 1 space in filename",
+ "/a/b c.txt",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Base(), qt.Equals, "/a/b-c.txt")
+ },
+ },
+ {
+ "Basic text file, 2 spaces in filename",
+ "/a/b c.txt",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Base(), qt.Equals, "/a/b--c.txt")
+ },
+ },
+ {
+ "Basic text file, mixed case and spaces, unnormalized",
+ "/a/Foo BAR.txt",
+ func(c *qt.C, p *Path) {
+ pp := p.Unnormalized()
+ c.Assert(pp, qt.IsNotNil)
+ c.Assert(pp.BaseNameNoIdentifier(), qt.Equals, "Foo BAR")
+ },
+ },
+ {
+ "Basic Markdown file",
+ "/a/b/c.md",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Ext(), qt.Equals, "md")
+ c.Assert(p.Type(), qt.Equals, TypeContentSingle)
+ c.Assert(p.IsContent(), qt.IsTrue)
+ c.Assert(p.IsLeafBundle(), qt.IsFalse)
+ c.Assert(p.Name(), qt.Equals, "c.md")
+ c.Assert(p.Base(), qt.Equals, "/a/b/c")
+ c.Assert(p.BaseReTyped("foo"), qt.Equals, "/foo/b/c")
+ c.Assert(p.Section(), qt.Equals, "a")
+ c.Assert(p.BaseNameNoIdentifier(), qt.Equals, "c")
+ c.Assert(p.Path(), qt.Equals, "/a/b/c.md")
+ c.Assert(p.Dir(), qt.Equals, "/a/b")
+ c.Assert(p.Container(), qt.Equals, "b")
+ c.Assert(p.ContainerDir(), qt.Equals, "/a/b")
+ },
+ },
+ {
+ "Content resource",
+ "/a/b.md",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Name(), qt.Equals, "b.md")
+ c.Assert(p.Base(), qt.Equals, "/a/b")
+ c.Assert(p.BaseNoLeadingSlash(), qt.Equals, "a/b")
+ c.Assert(p.Section(), qt.Equals, "a")
+ c.Assert(p.BaseNameNoIdentifier(), qt.Equals, "b")
+
+ // Reclassify it as a content resource.
+ ModifyPathBundleTypeResource(p)
+ c.Assert(p.Type(), qt.Equals, TypeContentResource)
+ c.Assert(p.IsContent(), qt.IsTrue)
+ c.Assert(p.Name(), qt.Equals, "b.md")
+ c.Assert(p.Base(), qt.Equals, "/a/b.md")
+ },
+ },
+ {
+ "No ext",
+ "/a/b",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Name(), qt.Equals, "b")
+ c.Assert(p.NameNoExt(), qt.Equals, "b")
+ c.Assert(p.Base(), qt.Equals, "/a/b")
+ c.Assert(p.Ext(), qt.Equals, "")
+ },
+ },
+ {
+ "No ext, trailing slash",
+ "/a/b/",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Name(), qt.Equals, "b")
+ c.Assert(p.Base(), qt.Equals, "/a/b")
+ c.Assert(p.Ext(), qt.Equals, "")
+ },
+ },
+ {
+ "Identifiers",
+ "/a/b.a.b.no.txt",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Name(), qt.Equals, "b.a.b.no.txt")
+ c.Assert(p.NameNoIdentifier(), qt.Equals, "b.a.b")
+ c.Assert(p.NameNoLang(), qt.Equals, "b.a.b.txt")
+ c.Assert(p.Identifiers(), qt.DeepEquals, []string{"txt", "no"})
+ c.Assert(p.IdentifiersUnknown(), qt.DeepEquals, []string{"b", "a", "b"})
+ c.Assert(p.Base(), qt.Equals, "/a/b.a.b.txt")
+ c.Assert(p.BaseNoLeadingSlash(), qt.Equals, "a/b.a.b.txt")
+ c.Assert(p.Path(), qt.Equals, "/a/b.a.b.no.txt")
+ c.Assert(p.PathNoLang(), qt.Equals, "/a/b.a.b.txt")
+ c.Assert(p.Ext(), qt.Equals, "txt")
+ c.Assert(p.PathNoIdentifier(), qt.Equals, "/a/b.a.b")
+ },
+ },
+ {
+			"Home branch bundle",
+ "/_index.md",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Identifiers(), qt.DeepEquals, []string{"md"})
+ c.Assert(p.IsBranchBundle(), qt.IsTrue)
+ c.Assert(p.IsBundle(), qt.IsTrue)
+ c.Assert(p.Base(), qt.Equals, "/")
+ c.Assert(p.BaseReTyped("foo"), qt.Equals, "/foo")
+ c.Assert(p.Path(), qt.Equals, "/_index.md")
+ c.Assert(p.Container(), qt.Equals, "")
+ c.Assert(p.ContainerDir(), qt.Equals, "/")
+ },
+ },
+ {
+ "Index content file in root",
+ "/a/index.md",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Base(), qt.Equals, "/a")
+ c.Assert(p.BaseReTyped("foo"), qt.Equals, "/foo/a")
+ c.Assert(p.BaseNameNoIdentifier(), qt.Equals, "a")
+ c.Assert(p.Container(), qt.Equals, "a")
+ c.Assert(p.Container(), qt.Equals, "a")
+ c.Assert(p.ContainerDir(), qt.Equals, "")
+ c.Assert(p.Dir(), qt.Equals, "/a")
+ c.Assert(p.Ext(), qt.Equals, "md")
+ c.Assert(p.IdentifiersUnknown(), qt.DeepEquals, []string{"index"})
+ c.Assert(p.Identifiers(), qt.DeepEquals, []string{"md"})
+ c.Assert(p.IsBranchBundle(), qt.IsFalse)
+ c.Assert(p.IsBundle(), qt.IsTrue)
+ c.Assert(p.IsLeafBundle(), qt.IsTrue)
+ c.Assert(p.Lang(), qt.Equals, "")
+ c.Assert(p.NameNoExt(), qt.Equals, "index")
+ c.Assert(p.NameNoIdentifier(), qt.Equals, "index")
+ c.Assert(p.NameNoLang(), qt.Equals, "index.md")
+ c.Assert(p.Section(), qt.Equals, "")
+ },
+ },
+ {
+ "Index content file with lang",
+ "/a/b/index.no.md",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Base(), qt.Equals, "/a/b")
+ c.Assert(p.BaseNameNoIdentifier(), qt.Equals, "b")
+ c.Assert(p.BaseReTyped("foo"), qt.Equals, "/foo/b")
+ c.Assert(p.Container(), qt.Equals, "b")
+ c.Assert(p.ContainerDir(), qt.Equals, "/a")
+ c.Assert(p.Dir(), qt.Equals, "/a/b")
+ c.Assert(p.Ext(), qt.Equals, "md")
+ c.Assert(p.Identifiers(), qt.DeepEquals, []string{"md", "no"})
+ c.Assert(p.IsBranchBundle(), qt.IsFalse)
+ c.Assert(p.IsBundle(), qt.IsTrue)
+ c.Assert(p.IsLeafBundle(), qt.IsTrue)
+ c.Assert(p.Lang(), qt.Equals, "no")
+ c.Assert(p.NameNoExt(), qt.Equals, "index.no")
+ c.Assert(p.NameNoIdentifier(), qt.Equals, "index")
+ c.Assert(p.NameNoLang(), qt.Equals, "index.md")
+ c.Assert(p.Path(), qt.Equals, "/a/b/index.no.md")
+ c.Assert(p.PathNoLang(), qt.Equals, "/a/b/index.md")
+ c.Assert(p.Section(), qt.Equals, "a")
+ },
+ },
+ {
+ "Index branch content file",
+ "/a/b/_index.no.md",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Base(), qt.Equals, "/a/b")
+ c.Assert(p.BaseNameNoIdentifier(), qt.Equals, "b")
+ c.Assert(p.Container(), qt.Equals, "b")
+ c.Assert(p.ContainerDir(), qt.Equals, "/a")
+ c.Assert(p.Ext(), qt.Equals, "md")
+ c.Assert(p.Identifiers(), qt.DeepEquals, []string{"md", "no"})
+ c.Assert(p.IsBranchBundle(), qt.IsTrue)
+ c.Assert(p.IsBundle(), qt.IsTrue)
+ c.Assert(p.IsLeafBundle(), qt.IsFalse)
+ c.Assert(p.NameNoExt(), qt.Equals, "_index.no")
+ c.Assert(p.NameNoLang(), qt.Equals, "_index.md")
+ },
+ },
+ {
+ "Index root no slash",
+ "_index.md",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Base(), qt.Equals, "/")
+ c.Assert(p.Ext(), qt.Equals, "md")
+ c.Assert(p.Name(), qt.Equals, "_index.md")
+ },
+ },
+ {
+ "Index root",
+ "/_index.md",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Base(), qt.Equals, "/")
+ c.Assert(p.Ext(), qt.Equals, "md")
+ c.Assert(p.Name(), qt.Equals, "_index.md")
+ },
+ },
+ {
+ "Index first",
+ "/a/_index.md",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Section(), qt.Equals, "a")
+ },
+ },
+ {
+ "Index text file",
+ "/a/b/index.no.txt",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Base(), qt.Equals, "/a/b/index.txt")
+ c.Assert(p.Ext(), qt.Equals, "txt")
+ c.Assert(p.Identifiers(), qt.DeepEquals, []string{"txt", "no"})
+ c.Assert(p.IsLeafBundle(), qt.IsFalse)
+ c.Assert(p.PathNoIdentifier(), qt.Equals, "/a/b/index")
+ },
+ },
+ {
+ "Empty",
+ "",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Base(), qt.Equals, "/")
+ c.Assert(p.Ext(), qt.Equals, "")
+ c.Assert(p.Name(), qt.Equals, "")
+ c.Assert(p.Path(), qt.Equals, "/")
+ },
+ },
+ {
+ "Slash",
+ "/",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Base(), qt.Equals, "/")
+ c.Assert(p.Ext(), qt.Equals, "")
+ c.Assert(p.Name(), qt.Equals, "")
+ },
+ },
+ {
+ "Trim Leading Slash bundle",
+ "foo/bar/index.no.md",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Path(), qt.Equals, "/foo/bar/index.no.md")
+ pp := p.TrimLeadingSlash()
+ c.Assert(pp.Path(), qt.Equals, "foo/bar/index.no.md")
+ c.Assert(pp.PathNoLang(), qt.Equals, "foo/bar/index.md")
+ c.Assert(pp.Base(), qt.Equals, "foo/bar")
+ c.Assert(pp.Dir(), qt.Equals, "foo/bar")
+ c.Assert(pp.ContainerDir(), qt.Equals, "foo")
+ c.Assert(pp.Container(), qt.Equals, "bar")
+ c.Assert(pp.BaseNameNoIdentifier(), qt.Equals, "bar")
+ },
+ },
+ {
+ "Trim Leading Slash file",
+ "foo/bar.txt",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Path(), qt.Equals, "/foo/bar.txt")
+ pp := p.TrimLeadingSlash()
+ c.Assert(pp.Path(), qt.Equals, "foo/bar.txt")
+ c.Assert(pp.PathNoLang(), qt.Equals, "foo/bar.txt")
+ c.Assert(pp.Base(), qt.Equals, "foo/bar.txt")
+ c.Assert(pp.Dir(), qt.Equals, "foo")
+ c.Assert(pp.ContainerDir(), qt.Equals, "foo")
+ c.Assert(pp.Container(), qt.Equals, "foo")
+ c.Assert(pp.BaseNameNoIdentifier(), qt.Equals, "bar")
+ },
+ },
+ {
+ "File separator",
+ filepath.FromSlash("/a/b/c.txt"),
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Base(), qt.Equals, "/a/b/c.txt")
+ c.Assert(p.Ext(), qt.Equals, "txt")
+ c.Assert(p.Name(), qt.Equals, "c.txt")
+ c.Assert(p.Path(), qt.Equals, "/a/b/c.txt")
+ },
+ },
+ {
+ "Content data file gotmpl",
+ "/a/b/_content.gotmpl",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Path(), qt.Equals, "/a/b/_content.gotmpl")
+ c.Assert(p.Ext(), qt.Equals, "gotmpl")
+ c.Assert(p.IsContentData(), qt.IsTrue)
+ },
+ },
+ {
+ "Content data file yaml",
+ "/a/b/_content.yaml",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.IsContentData(), qt.IsFalse)
+ },
+ },
+ }
+ for _, test := range tests {
+ c.Run(test.name, func(c *qt.C) {
+			if test.name != "Home branch bundle" {
+ // return
+ }
+ test.assert(c, testParser.Parse(files.ComponentFolderContent, test.path))
+ })
+ }
+}
+
+func TestParseLayouts(t *testing.T) {
+ c := qt.New(t)
+
+ tests := []struct {
+ name string
+ path string
+ assert func(c *qt.C, p *Path)
+ }{
+ {
+ "Basic",
+ "/list.html",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Base(), qt.Equals, "/list.html")
+ c.Assert(p.OutputFormat(), qt.Equals, "html")
+ },
+ },
+ {
+ "Lang",
+ "/list.no.html",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Identifiers(), qt.DeepEquals, []string{"html", "no", "list"})
+ c.Assert(p.IdentifiersUnknown(), qt.DeepEquals, []string{})
+ c.Assert(p.Base(), qt.Equals, "/list.html")
+ c.Assert(p.Lang(), qt.Equals, "no")
+ },
+ },
+ {
+ "Kind",
+ "/section.no.html",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Kind(), qt.Equals, kinds.KindSection)
+ c.Assert(p.Identifiers(), qt.DeepEquals, []string{"html", "no", "section"})
+ c.Assert(p.IdentifiersUnknown(), qt.DeepEquals, []string{})
+ c.Assert(p.Base(), qt.Equals, "/section.html")
+ c.Assert(p.Lang(), qt.Equals, "no")
+ },
+ },
+ {
+ "Layout",
+ "/list.section.no.html",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Layout(), qt.Equals, "list")
+ c.Assert(p.Identifiers(), qt.DeepEquals, []string{"html", "no", "section", "list"})
+ c.Assert(p.IdentifiersUnknown(), qt.DeepEquals, []string{})
+ c.Assert(p.Base(), qt.Equals, "/list.html")
+ c.Assert(p.Lang(), qt.Equals, "no")
+ },
+ },
+ {
+ "Layout multiple",
+ "/mylayout.list.section.no.html",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Layout(), qt.Equals, "mylayout")
+ c.Assert(p.Identifiers(), qt.DeepEquals, []string{"html", "no", "section", "list", "mylayout"})
+ c.Assert(p.IdentifiersUnknown(), qt.DeepEquals, []string{})
+ c.Assert(p.Base(), qt.Equals, "/mylayout.html")
+ c.Assert(p.Lang(), qt.Equals, "no")
+ },
+ },
+ {
+ "Layout shortcode",
+ "/_shortcodes/myshort.list.no.html",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Layout(), qt.Equals, "list")
+ },
+ },
+ {
+ "Layout baseof",
+ "/baseof.list.no.html",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Layout(), qt.Equals, "list")
+ },
+ },
+ {
+ "Lang and output format",
+ "/list.no.amp.not.html",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Identifiers(), qt.DeepEquals, []string{"html", "not", "amp", "no", "list"})
+ c.Assert(p.OutputFormat(), qt.Equals, "amp")
+ c.Assert(p.Ext(), qt.Equals, "html")
+ c.Assert(p.Lang(), qt.Equals, "no")
+ c.Assert(p.Base(), qt.Equals, "/list.html")
+ },
+ },
+ {
+ "Term",
+ "/term.html",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Base(), qt.Equals, "/term.html")
+ c.Assert(p.Identifiers(), qt.DeepEquals, []string{"html", "term"})
+ c.Assert(p.PathNoIdentifier(), qt.Equals, "/term")
+ c.Assert(p.PathBeforeLangAndOutputFormatAndExt(), qt.Equals, "/term")
+ c.Assert(p.Lang(), qt.Equals, "")
+ c.Assert(p.Kind(), qt.Equals, "term")
+ c.Assert(p.OutputFormat(), qt.Equals, "html")
+ },
+ },
+ {
+ "Shortcode with layout",
+ "/_shortcodes/myshortcode.list.html",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Base(), qt.Equals, "/_shortcodes/myshortcode.html")
+ c.Assert(p.Type(), qt.Equals, TypeShortcode)
+ c.Assert(p.Identifiers(), qt.DeepEquals, []string{"html", "list"})
+ c.Assert(p.Layout(), qt.Equals, "list")
+ c.Assert(p.PathNoIdentifier(), qt.Equals, "/_shortcodes/myshortcode")
+ c.Assert(p.PathBeforeLangAndOutputFormatAndExt(), qt.Equals, "/_shortcodes/myshortcode.list")
+ c.Assert(p.Lang(), qt.Equals, "")
+ c.Assert(p.Kind(), qt.Equals, "")
+ c.Assert(p.OutputFormat(), qt.Equals, "html")
+ },
+ },
+ {
+ "Sub dir",
+ "/pages/home.html",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Identifiers(), qt.DeepEquals, []string{"html", "home"})
+ c.Assert(p.Lang(), qt.Equals, "")
+ c.Assert(p.Kind(), qt.Equals, "home")
+ c.Assert(p.OutputFormat(), qt.Equals, "html")
+ c.Assert(p.Dir(), qt.Equals, "/pages")
+ },
+ },
+ {
+ "Baseof",
+ "/pages/baseof.list.section.fr.amp.html",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Identifiers(), qt.DeepEquals, []string{"html", "amp", "fr", "section", "list", "baseof"})
+ c.Assert(p.IdentifiersUnknown(), qt.DeepEquals, []string{})
+ c.Assert(p.Kind(), qt.Equals, kinds.KindSection)
+ c.Assert(p.Lang(), qt.Equals, "fr")
+ c.Assert(p.OutputFormat(), qt.Equals, "amp")
+ c.Assert(p.Dir(), qt.Equals, "/pages")
+ c.Assert(p.NameNoIdentifier(), qt.Equals, "baseof")
+ c.Assert(p.Type(), qt.Equals, TypeBaseof)
+ c.Assert(p.IdentifierBase(), qt.Equals, "/pages/baseof.list.section.fr.amp.html")
+ },
+ },
+ {
+ "Markup",
+ "/_markup/render-link.html",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Type(), qt.Equals, TypeMarkup)
+ },
+ },
+ {
+ "Markup nested",
+ "/foo/_markup/render-link.html",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Type(), qt.Equals, TypeMarkup)
+ },
+ },
+ {
+ "Shortcode",
+ "/_shortcodes/myshortcode.html",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Type(), qt.Equals, TypeShortcode)
+ },
+ },
+ {
+ "Shortcode nested",
+ "/foo/_shortcodes/myshortcode.html",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Type(), qt.Equals, TypeShortcode)
+ },
+ },
+ {
+ "Shortcode nested sub",
+ "/foo/_shortcodes/foo/myshortcode.html",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Type(), qt.Equals, TypeShortcode)
+ },
+ },
+ {
+ "Partials",
+ "/_partials/foo.bar",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Type(), qt.Equals, TypePartial)
+ },
+ },
+ {
+ "Shortcode lang in root",
+ "/_shortcodes/no.html",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Type(), qt.Equals, TypeShortcode)
+ c.Assert(p.Lang(), qt.Equals, "")
+ c.Assert(p.NameNoIdentifier(), qt.Equals, "no")
+ },
+ },
+ {
+ "Shortcode lang layout",
+ "/_shortcodes/myshortcode.no.html",
+ func(c *qt.C, p *Path) {
+ c.Assert(p.Type(), qt.Equals, TypeShortcode)
+ c.Assert(p.Lang(), qt.Equals, "no")
+ c.Assert(p.Layout(), qt.Equals, "")
+ c.Assert(p.NameNoIdentifier(), qt.Equals, "myshortcode")
+ },
+ },
+ }
+
+ for _, test := range tests {
+ c.Run(test.name, func(c *qt.C) {
+ test.assert(c, testParser.Parse(files.ComponentFolderLayouts, test.path))
+ })
+ }
+}
+
+func TestHasExt(t *testing.T) {
+ c := qt.New(t)
+
+ c.Assert(HasExt("/a/b/c.txt"), qt.IsTrue)
+ c.Assert(HasExt("/a/b.c/d.txt"), qt.IsTrue)
+ c.Assert(HasExt("/a/b/c"), qt.IsFalse)
+ c.Assert(HasExt("/a/b.c/d"), qt.IsFalse)
+}
+
+func BenchmarkParseIdentity(b *testing.B) {
+ for i := 0; i < b.N; i++ {
+ testParser.ParseIdentity(files.ComponentFolderAssets, "/a/b.css")
+ }
+}
diff --git a/common/paths/paths_integration_test.go b/common/paths/paths_integration_test.go
new file mode 100644
index 000000000..f5ea3066a
--- /dev/null
+++ b/common/paths/paths_integration_test.go
@@ -0,0 +1,103 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package paths_test
+
+import (
+ "testing"
+
+ "github.com/gohugoio/hugo/hugolib"
+)
+
+func TestRemovePathAccents(t *testing.T) {
+ t.Parallel()
+
+ files := `
+-- hugo.toml --
+disableKinds = ["taxonomy", "term"]
+defaultContentLanguage = "en"
+defaultContentLanguageInSubdir = true
+[languages]
+[languages.en]
+weight = 1
+[languages.fr]
+weight = 2
+removePathAccents = true
+-- content/διακριτικός.md --
+-- content/διακριτικός.fr.md --
+-- layouts/_default/single.html --
+{{ .Language.Lang }}|Single.
+-- layouts/_default/list.html --
+List
+`
+ b := hugolib.Test(t, files)
+
+ b.AssertFileContent("public/en/διακριτικός/index.html", "en|Single")
+ b.AssertFileContent("public/fr/διακριτικος/index.html", "fr|Single")
+}
+
+func TestDisablePathToLower(t *testing.T) {
+ t.Parallel()
+
+ files := `
+-- hugo.toml --
+disableKinds = ["taxonomy", "term"]
+defaultContentLanguage = "en"
+defaultContentLanguageInSubdir = true
+[languages]
+[languages.en]
+weight = 1
+[languages.fr]
+weight = 2
+disablePathToLower = true
+-- content/MySection/MyPage.md --
+-- content/MySection/MyPage.fr.md --
+-- content/MySection/MyBundle/index.md --
+-- content/MySection/MyBundle/index.fr.md --
+-- layouts/_default/single.html --
+{{ .Language.Lang }}|Single.
+-- layouts/_default/list.html --
+{{ .Language.Lang }}|List.
+`
+ b := hugolib.Test(t, files)
+
+ b.AssertFileContent("public/en/mysection/index.html", "en|List")
+ b.AssertFileContent("public/en/mysection/mypage/index.html", "en|Single")
+ b.AssertFileContent("public/fr/MySection/index.html", "fr|List")
+ b.AssertFileContent("public/fr/MySection/MyPage/index.html", "fr|Single")
+ b.AssertFileContent("public/en/mysection/mybundle/index.html", "en|Single")
+ b.AssertFileContent("public/fr/MySection/MyBundle/index.html", "fr|Single")
+}
+
+func TestIssue13596(t *testing.T) {
+ t.Parallel()
+
+ files := `
+-- hugo.toml --
+disableKinds = ['home','rss','section','sitemap','taxonomy','term']
+-- content/p1/index.md --
+---
+title: p1
+---
+-- content/p1/a.1.txt --
+-- content/p1/a.2.txt --
+-- layouts/all.html --
+{{ range .Resources.Match "*" }}{{ .Name }}|{{ end }}
+`
+
+ b := hugolib.Test(t, files)
+
+ b.AssertFileContent("public/p1/index.html", "a.1.txt|a.2.txt|")
+ b.AssertFileExists("public/p1/a.1.txt", true)
+ b.AssertFileExists("public/p1/a.2.txt", true) // fails
+}
diff --git a/common/paths/type_string.go b/common/paths/type_string.go
new file mode 100644
index 000000000..08fbcc835
--- /dev/null
+++ b/common/paths/type_string.go
@@ -0,0 +1,32 @@
+// Code generated by "stringer -type Type"; DO NOT EDIT.
+
+package paths
+
+import "strconv"
+
+func _() {
+ // An "invalid array index" compiler error signifies that the constant values have changed.
+ // Re-run the stringer command to generate them again.
+ var x [1]struct{}
+ _ = x[TypeFile-0]
+ _ = x[TypeContentResource-1]
+ _ = x[TypeContentSingle-2]
+ _ = x[TypeLeaf-3]
+ _ = x[TypeBranch-4]
+ _ = x[TypeContentData-5]
+ _ = x[TypeMarkup-6]
+ _ = x[TypeShortcode-7]
+ _ = x[TypePartial-8]
+ _ = x[TypeBaseof-9]
+}
+
+const _Type_name = "TypeFileTypeContentResourceTypeContentSingleTypeLeafTypeBranchTypeContentDataTypeMarkupTypeShortcodeTypePartialTypeBaseof"
+
+var _Type_index = [...]uint8{0, 8, 27, 44, 52, 62, 77, 87, 100, 111, 121}
+
+func (i Type) String() string {
+ if i < 0 || i >= Type(len(_Type_index)-1) {
+ return "Type(" + strconv.FormatInt(int64(i), 10) + ")"
+ }
+ return _Type_name[_Type_index[i]:_Type_index[i+1]]
+}
diff --git a/common/paths/url.go b/common/paths/url.go
new file mode 100644
index 000000000..1d1408b51
--- /dev/null
+++ b/common/paths/url.go
@@ -0,0 +1,273 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package paths
+
+import (
+ "fmt"
+ "net/url"
+ "path"
+ "path/filepath"
+ "runtime"
+ "strings"
+)
+
+type pathBridge struct{}
+
+func (pathBridge) Base(in string) string {
+ return path.Base(in)
+}
+
+func (pathBridge) Clean(in string) string {
+ return path.Clean(in)
+}
+
+func (pathBridge) Dir(in string) string {
+ return path.Dir(in)
+}
+
+func (pathBridge) Ext(in string) string {
+ return path.Ext(in)
+}
+
+func (pathBridge) Join(elem ...string) string {
+ return path.Join(elem...)
+}
+
+func (pathBridge) Separator() string {
+ return "/"
+}
+
+var pb pathBridge
+
+// MakePermalink combines a base URL with a content path to create a full URL.
+// Example
+//
+// base: http://spf13.com/
+// path: post/how-i-blog
+// result: http://spf13.com/post/how-i-blog
+func MakePermalink(host, plink string) *url.URL {
+ base, err := url.Parse(host)
+ if err != nil {
+ panic(err)
+ }
+
+ p, err := url.Parse(plink)
+ if err != nil {
+ panic(err)
+ }
+
+ if p.Host != "" {
+ panic(fmt.Errorf("can't make permalink from absolute link %q", plink))
+ }
+
+ base.Path = path.Join(base.Path, p.Path)
+ base.Fragment = p.Fragment
+ base.RawQuery = p.RawQuery
+
+ // path.Join will strip off the last /, so put it back if it was there.
+ hadTrailingSlash := (plink == "" && strings.HasSuffix(host, "/")) || strings.HasSuffix(p.Path, "/")
+ if hadTrailingSlash && !strings.HasSuffix(base.Path, "/") {
+ base.Path = base.Path + "/"
+ }
+
+ return base
+}
+
+// AddContextRoot adds the context root to a URL if it's not already set.
+// For relative URL entries on sites with a base URL that includes a context root (e.g. http://example.com/mysite),
+// relative URLs must not include the context root if canonifyURLs is enabled. But if it's disabled, it must be set.
+func AddContextRoot(baseURL, relativePath string) string {
+ url, err := url.Parse(baseURL)
+ if err != nil {
+ panic(err)
+ }
+
+ newPath := path.Join(url.Path, relativePath)
+
+ // path.Join strips the trailing slash; add it back unless this is the root path.
+ if newPath != "/" && strings.HasSuffix(relativePath, "/") {
+ newPath += "/"
+ }
+ return newPath
+}
+
+// PrettifyURL takes a URL string and returns a semantic, clean URL.
+func PrettifyURL(in string) string {
+ x := PrettifyURLPath(in)
+
+ if path.Base(x) == "index.html" {
+ return path.Dir(x)
+ }
+
+ if in == "" {
+ return "/"
+ }
+
+ return x
+}
+
+// PrettifyURLPath takes a URL path to a piece of content and converts it
+// to a pretty URL path.
+//
+// /section/name.html becomes /section/name/index.html
+// /section/name/ becomes /section/name/index.html
+// /section/name/index.html becomes /section/name/index.html
+func PrettifyURLPath(in string) string {
+ return prettifyPath(in, pb)
+}
+
+// Uglify does the opposite of PrettifyURLPath().
+//
+// /section/name/index.html becomes /section/name.html
+// /section/name/ becomes /section/name.html
+// /section/name.html becomes /section/name.html
+func Uglify(in string) string {
+ if path.Ext(in) == "" {
+ if len(in) < 2 {
+ return "/"
+ }
+ // /section/name/ -> /section/name.html
+ return path.Clean(in) + ".html"
+ }
+
+ name, ext := fileAndExt(in, pb)
+ if name == "index" {
+ // /section/name/index.html -> /section/name.html
+ d := path.Dir(in)
+ if len(d) > 1 {
+ return d + ext
+ }
+ return in
+ }
+ // /.xml -> /index.xml
+ if name == "" {
+ return path.Dir(in) + "index" + ext
+ }
+ // /section/name.html -> /section/name.html
+ return path.Clean(in)
+}
+
+// URLEscape escapes unicode letters.
+func URLEscape(uri string) string {
+ // escape unicode letters
+ u, err := url.Parse(uri)
+ if err != nil {
+ panic(err)
+ }
+ return u.String()
+}
+
+// TrimExt trims the extension from a path.
+func TrimExt(in string) string {
+ return strings.TrimSuffix(in, path.Ext(in))
+}
+
+// From https://github.com/golang/go/blob/e0c76d95abfc1621259864adb3d101cf6f1f90fc/src/cmd/go/internal/web/url.go#L45
+func UrlFromFilename(filename string) (*url.URL, error) {
+ if !filepath.IsAbs(filename) {
+ return nil, fmt.Errorf("filepath must be absolute")
+ }
+
+ // If filename has a Windows volume name, convert the volume to a host and prefix
+ // per https://blogs.msdn.microsoft.com/ie/2006/12/06/file-uris-in-windows/.
+ if vol := filepath.VolumeName(filename); vol != "" {
+ if strings.HasPrefix(vol, `\\`) {
+ filename = filepath.ToSlash(filename[2:])
+ i := strings.IndexByte(filename, '/')
+
+ if i < 0 {
+ // A degenerate case.
+ // \\host.example.com (without a share name)
+ // becomes
+ // file://host.example.com/
+ return &url.URL{
+ Scheme: "file",
+ Host: filename,
+ Path: "/",
+ }, nil
+ }
+
+ // \\host.example.com\Share\path\to\file
+ // becomes
+ // file://host.example.com/Share/path/to/file
+ return &url.URL{
+ Scheme: "file",
+ Host: filename[:i],
+ Path: filepath.ToSlash(filename[i:]),
+ }, nil
+ }
+
+ // C:\path\to\file
+ // becomes
+ // file:///C:/path/to/file
+ return &url.URL{
+ Scheme: "file",
+ Path: "/" + filepath.ToSlash(filename),
+ }, nil
+ }
+
+ // /path/to/file
+ // becomes
+ // file:///path/to/file
+ return &url.URL{
+ Scheme: "file",
+ Path: filepath.ToSlash(filename),
+ }, nil
+}
+
+// UrlStringToFilename converts the URL s to a filename.
+// If ParseRequestURI fails, the input is just converted to OS-specific slashes and returned.
+func UrlStringToFilename(s string) (string, bool) {
+ u, err := url.ParseRequestURI(s)
+ if err != nil {
+ return filepath.FromSlash(s), false
+ }
+
+ p := u.Path
+
+ if p == "" {
+ p, _ = url.QueryUnescape(u.Opaque)
+ return filepath.FromSlash(p), false
+ }
+
+ if runtime.GOOS != "windows" {
+ return p, true
+ }
+
+ if len(p) == 0 || p[0] != '/' {
+ return filepath.FromSlash(p), false
+ }
+
+ p = filepath.FromSlash(p)
+
+ if len(u.Host) == 1 {
+ // file://c/Users/...
+ return strings.ToUpper(u.Host) + ":" + p, true
+ }
+
+ if u.Host != "" && u.Host != "localhost" {
+ if filepath.VolumeName(u.Host) != "" {
+ return "", false
+ }
+ return `\\` + u.Host + p, true
+ }
+
+ if vol := filepath.VolumeName(p[1:]); vol == "" || strings.HasPrefix(vol, `\\`) {
+ return "", false
+ }
+
+ return p[1:], true
+}
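
As a quick orientation for readers of this diff, here is a minimal sketch (not part of the change) of how the two file-URL helpers above behave on a non-Windows system; the path is hypothetical:

```go
package main

import (
	"fmt"

	"github.com/gohugoio/hugo/common/paths"
)

func main() {
	// An absolute filename becomes a file:// URL.
	u, err := paths.UrlFromFilename("/var/www/site/index.html")
	if err != nil {
		panic(err)
	}
	fmt.Println(u.String()) // file:///var/www/site/index.html

	// A file URL string converts back to a filename.
	name, ok := paths.UrlStringToFilename("file:///var/www/site/index.html")
	fmt.Println(name, ok) // /var/www/site/index.html true
}
```

On Windows, the volume and UNC handling shown in the diff applies instead.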
diff --git a/common/paths/url_test.go b/common/paths/url_test.go
new file mode 100644
index 000000000..5a9233c26
--- /dev/null
+++ b/common/paths/url_test.go
@@ -0,0 +1,100 @@
+// Copyright 2021 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package paths
+
+import (
+ "testing"
+
+ qt "github.com/frankban/quicktest"
+)
+
+func TestMakePermalink(t *testing.T) {
+ type test struct {
+ host, link, output string
+ }
+
+ data := []test{
+ {"http://abc.com/foo", "post/bar", "http://abc.com/foo/post/bar"},
+ {"http://abc.com/foo/", "post/bar", "http://abc.com/foo/post/bar"},
+ {"http://abc.com", "post/bar", "http://abc.com/post/bar"},
+ {"http://abc.com", "bar", "http://abc.com/bar"},
+ {"http://abc.com/foo/bar", "post/bar", "http://abc.com/foo/bar/post/bar"},
+ {"http://abc.com/foo/bar", "post/bar/", "http://abc.com/foo/bar/post/bar/"},
+ {"http://abc.com/foo", "post/bar?a=b#c", "http://abc.com/foo/post/bar?a=b#c"},
+ }
+
+ for i, d := range data {
+ output := MakePermalink(d.host, d.link).String()
+ if d.output != output {
+ t.Errorf("Test #%d failed. Expected %q got %q", i, d.output, output)
+ }
+ }
+}
+
+func TestAddContextRoot(t *testing.T) {
+ tests := []struct {
+ baseURL string
+ url string
+ expected string
+ }{
+ {"http://example.com/sub/", "/foo", "/sub/foo"},
+ {"http://example.com/sub/", "/foo/index.html", "/sub/foo/index.html"},
+ {"http://example.com/sub1/sub2", "/foo", "/sub1/sub2/foo"},
+ {"http://example.com", "/foo", "/foo"},
+ // cannot guess that the context root is already added in the example below
+ {"http://example.com/sub/", "/sub/foo", "/sub/sub/foo"},
+ {"http://example.com/тря", "/трям/", "/тря/трям/"},
+ {"http://example.com", "/", "/"},
+ {"http://example.com/bar", "//", "/bar/"},
+ }
+
+ for _, test := range tests {
+ output := AddContextRoot(test.baseURL, test.url)
+ if output != test.expected {
+ t.Errorf("Expected %#v, got %#v\n", test.expected, output)
+ }
+ }
+}
+
+func TestPretty(t *testing.T) {
+ c := qt.New(t)
+ c.Assert("/section/name/index.html", qt.Equals, PrettifyURLPath("/section/name.html"))
+ c.Assert("/section/sub/name/index.html", qt.Equals, PrettifyURLPath("/section/sub/name.html"))
+ c.Assert("/section/name/index.html", qt.Equals, PrettifyURLPath("/section/name/"))
+ c.Assert("/section/name/index.html", qt.Equals, PrettifyURLPath("/section/name/index.html"))
+ c.Assert("/index.html", qt.Equals, PrettifyURLPath("/index.html"))
+ c.Assert("/name/index.xml", qt.Equals, PrettifyURLPath("/name.xml"))
+ c.Assert("/", qt.Equals, PrettifyURLPath("/"))
+ c.Assert("/", qt.Equals, PrettifyURLPath(""))
+ c.Assert("/section/name", qt.Equals, PrettifyURL("/section/name.html"))
+ c.Assert("/section/sub/name", qt.Equals, PrettifyURL("/section/sub/name.html"))
+ c.Assert("/section/name", qt.Equals, PrettifyURL("/section/name/"))
+ c.Assert("/section/name", qt.Equals, PrettifyURL("/section/name/index.html"))
+ c.Assert("/", qt.Equals, PrettifyURL("/index.html"))
+ c.Assert("/name/index.xml", qt.Equals, PrettifyURL("/name.xml"))
+ c.Assert("/", qt.Equals, PrettifyURL("/"))
+ c.Assert("/", qt.Equals, PrettifyURL(""))
+}
+
+func TestUgly(t *testing.T) {
+ c := qt.New(t)
+ c.Assert("/section/name.html", qt.Equals, Uglify("/section/name.html"))
+ c.Assert("/section/sub/name.html", qt.Equals, Uglify("/section/sub/name.html"))
+ c.Assert("/section/name.html", qt.Equals, Uglify("/section/name/"))
+ c.Assert("/section/name.html", qt.Equals, Uglify("/section/name/index.html"))
+ c.Assert("/index.html", qt.Equals, Uglify("/index.html"))
+ c.Assert("/name.xml", qt.Equals, Uglify("/name.xml"))
+ c.Assert("/", qt.Equals, Uglify("/"))
+ c.Assert("/", qt.Equals, Uglify(""))
+}
diff --git a/common/predicate/predicate.go b/common/predicate/predicate.go
new file mode 100644
index 000000000..f71536474
--- /dev/null
+++ b/common/predicate/predicate.go
@@ -0,0 +1,78 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package predicate
+
+// P is a predicate function that tests whether a value of type T satisfies some condition.
+type P[T any] func(T) bool
+
+// And returns a predicate that is a short-circuiting logical AND of this and the given predicates.
+func (p P[T]) And(ps ...P[T]) P[T] {
+ return func(v T) bool {
+ for _, pp := range ps {
+ if !pp(v) {
+ return false
+ }
+ }
+ if p == nil {
+ return true
+ }
+ return p(v)
+ }
+}
+
+// Or returns a predicate that is a short-circuiting logical OR of this and the given predicates.
+func (p P[T]) Or(ps ...P[T]) P[T] {
+ return func(v T) bool {
+ for _, pp := range ps {
+ if pp(v) {
+ return true
+ }
+ }
+ if p == nil {
+ return false
+ }
+ return p(v)
+ }
+}
+
+// Negate returns a predicate that is a logical negation of this predicate.
+func (p P[T]) Negate() P[T] {
+ return func(v T) bool {
+ return !p(v)
+ }
+}
+
+// Filter returns a slice holding only the elements of s that satisfy p.
+// It filters s in place and returns a slice of the same backing array, which may have a smaller length.
+func (p P[T]) Filter(s []T) []T {
+ var n int
+ for _, v := range s {
+ if p(v) {
+ s[n] = v
+ n++
+ }
+ }
+ return s[:n]
+}
+
+// FilterCopy returns a new slice holding only the elements of s that satisfy p.
+func (p P[T]) FilterCopy(s []T) []T {
+ var result []T
+ for _, v := range s {
+ if p(v) {
+ result = append(result, v)
+ }
+ }
+ return result
+}
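
A rough usage sketch (not part of the diff; the predicates are made up) showing how the combinators compose and how FilterCopy differs from the in-place Filter:

```go
package main

import (
	"fmt"

	"github.com/gohugoio/hugo/common/predicate"
)

func main() {
	var isPositive predicate.P[int] = func(i int) bool { return i > 0 }
	var isEven predicate.P[int] = func(i int) bool { return i%2 == 0 }

	positiveAndEven := isPositive.And(isEven)
	fmt.Println(positiveAndEven(4), positiveAndEven(3), positiveAndEven(-2)) // true false false

	nums := []int{-3, -2, 1, 2, 4, 5}
	fmt.Println(positiveAndEven.FilterCopy(nums)) // [2 4]
	fmt.Println(nums)                             // input untouched: [-3 -2 1 2 4 5]
}
```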
diff --git a/common/predicate/predicate_test.go b/common/predicate/predicate_test.go
new file mode 100644
index 000000000..1e1ec004b
--- /dev/null
+++ b/common/predicate/predicate_test.go
@@ -0,0 +1,83 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package predicate_test
+
+import (
+ "testing"
+
+ qt "github.com/frankban/quicktest"
+ "github.com/gohugoio/hugo/common/predicate"
+)
+
+func TestAdd(t *testing.T) {
+ c := qt.New(t)
+
+ var p predicate.P[int] = intP1
+
+ c.Assert(p(1), qt.IsTrue)
+ c.Assert(p(2), qt.IsFalse)
+
+ neg := p.Negate()
+ c.Assert(neg(1), qt.IsFalse)
+ c.Assert(neg(2), qt.IsTrue)
+
+ and := p.And(intP2)
+ c.Assert(and(1), qt.IsFalse)
+ c.Assert(and(2), qt.IsFalse)
+ c.Assert(and(10), qt.IsTrue)
+
+ or := p.Or(intP2)
+ c.Assert(or(1), qt.IsTrue)
+ c.Assert(or(2), qt.IsTrue)
+ c.Assert(or(10), qt.IsTrue)
+ c.Assert(or(11), qt.IsFalse)
+}
+
+func TestFilter(t *testing.T) {
+ c := qt.New(t)
+
+ var p predicate.P[int] = intP1
+ p = p.Or(intP2)
+
+ ints := []int{1, 2, 3, 4, 1, 6, 7, 8, 2}
+
+ c.Assert(p.Filter(ints), qt.DeepEquals, []int{1, 2, 1, 2})
+ c.Assert(ints, qt.DeepEquals, []int{1, 2, 1, 2, 1, 6, 7, 8, 2})
+}
+
+func TestFilterCopy(t *testing.T) {
+ c := qt.New(t)
+
+ var p predicate.P[int] = intP1
+ p = p.Or(intP2)
+
+ ints := []int{1, 2, 3, 4, 1, 6, 7, 8, 2}
+
+ c.Assert(p.FilterCopy(ints), qt.DeepEquals, []int{1, 2, 1, 2})
+ c.Assert(ints, qt.DeepEquals, []int{1, 2, 3, 4, 1, 6, 7, 8, 2})
+}
+
+var intP1 = func(i int) bool {
+ if i == 10 {
+ return true
+ }
+ return i == 1
+}
+
+var intP2 = func(i int) bool {
+ if i == 10 {
+ return true
+ }
+ return i == 2
+}
diff --git a/common/rungroup/rungroup.go b/common/rungroup/rungroup.go
new file mode 100644
index 000000000..80a730ca9
--- /dev/null
+++ b/common/rungroup/rungroup.go
@@ -0,0 +1,93 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package rungroup
+
+import (
+ "context"
+
+ "golang.org/x/sync/errgroup"
+)
+
+// Group is a group of workers that can be used to enqueue work and wait for
+// them to finish.
+type Group[T any] interface {
+ Enqueue(T) error
+ Wait() error
+}
+
+type runGroup[T any] struct {
+ ctx context.Context
+ g *errgroup.Group
+ ch chan T
+}
+
+// Config is the configuration for a new Group.
+type Config[T any] struct {
+ NumWorkers int
+ Handle func(context.Context, T) error
+}
+
+// Run creates a new Group with the given configuration.
+func Run[T any](ctx context.Context, cfg Config[T]) Group[T] {
+ if cfg.NumWorkers <= 0 {
+ cfg.NumWorkers = 1
+ }
+ if cfg.Handle == nil {
+ panic("Handle must be set")
+ }
+
+ g, ctx := errgroup.WithContext(ctx)
+ // Buffered for performance.
+ ch := make(chan T, cfg.NumWorkers)
+
+ for range cfg.NumWorkers {
+ g.Go(func() error {
+ for {
+ select {
+ case <-ctx.Done():
+ return nil
+ case v, ok := <-ch:
+ if !ok {
+ return nil
+ }
+ if err := cfg.Handle(ctx, v); err != nil {
+ return err
+ }
+ }
+ }
+ })
+ }
+
+ return &runGroup[T]{
+ ctx: ctx,
+ g: g,
+ ch: ch,
+ }
+}
+
+// Enqueue enqueues a new item to be handled by the workers.
+func (r *runGroup[T]) Enqueue(t T) error {
+ select {
+ case <-r.ctx.Done():
+ return nil
+ case r.ch <- t:
+ }
+ return nil
+}
+
+// Wait waits for all workers to finish and returns the first error.
+func (r *runGroup[T]) Wait() error {
+ close(r.ch)
+ return r.g.Wait()
+}
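
Beyond the test below, a sketch of how the run group might be used with several workers; the summing handler and the mutex protecting the shared counter are illustrative only:

```go
package main

import (
	"context"
	"fmt"
	"sync"

	"github.com/gohugoio/hugo/common/rungroup"
)

func main() {
	var (
		mu  sync.Mutex
		sum int
	)

	g := rungroup.Run[int](context.Background(), rungroup.Config[int]{
		NumWorkers: 4,
		Handle: func(ctx context.Context, i int) error {
			mu.Lock()
			sum += i
			mu.Unlock()
			return nil
		},
	})

	for i := 1; i <= 10; i++ {
		if err := g.Enqueue(i); err != nil {
			panic(err)
		}
	}
	if err := g.Wait(); err != nil {
		panic(err)
	}
	fmt.Println(sum) // 55
}
```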
diff --git a/common/rungroup/rungroup_test.go b/common/rungroup/rungroup_test.go
new file mode 100644
index 000000000..ac902079e
--- /dev/null
+++ b/common/rungroup/rungroup_test.go
@@ -0,0 +1,44 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package rungroup
+
+import (
+ "context"
+ "testing"
+
+ qt "github.com/frankban/quicktest"
+)
+
+func TestNew(t *testing.T) {
+ c := qt.New(t)
+
+ var result int
+ adder := func(ctx context.Context, i int) error {
+ result += i
+ return nil
+ }
+
+ g := Run[int](
+ context.Background(),
+ Config[int]{
+ Handle: adder,
+ },
+ )
+
+ c.Assert(g, qt.IsNotNil)
+ g.Enqueue(32)
+ g.Enqueue(33)
+ c.Assert(g.Wait(), qt.IsNil)
+ c.Assert(result, qt.Equals, 65)
+}
diff --git a/common/tasks/tasks.go b/common/tasks/tasks.go
new file mode 100644
index 000000000..3f8a754e9
--- /dev/null
+++ b/common/tasks/tasks.go
@@ -0,0 +1,150 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package tasks
+
+import (
+ "sync"
+ "time"
+)
+
+// RunEvery runs a set of functions at intervals defined by the functions themselves.
+// Functions can be added and removed while running.
+type RunEvery struct {
+ // Any error returned from the function will be passed to this function.
+ HandleError func(string, error)
+
+ // If set, the functions are run immediately when Start is called.
+ RunImmediately bool
+
+ // The named functions to run.
+ funcs map[string]*Func
+
+ mu sync.Mutex
+ started bool
+ closed bool
+ quit chan struct{}
+}
+
+type Func struct {
+ // The shortest interval between each run.
+ IntervalLow time.Duration
+
+ // The longest interval between each run.
+ IntervalHigh time.Duration
+
+ // The function to run.
+ F func(interval time.Duration) (time.Duration, error)
+
+ interval time.Duration
+ last time.Time
+}
+
+func (r *RunEvery) Start() error {
+ if r.started {
+ return nil
+ }
+
+ r.started = true
+ r.quit = make(chan struct{})
+
+ go func() {
+ if r.RunImmediately {
+ r.run()
+ }
+ ticker := time.NewTicker(500 * time.Millisecond)
+ defer ticker.Stop()
+ for {
+ select {
+ case <-r.quit:
+ return
+ case <-ticker.C:
+ r.run()
+ }
+ }
+ }()
+
+ return nil
+}
+
+// Close stops the RunEvery from running.
+func (r *RunEvery) Close() error {
+ if r.closed {
+ return nil
+ }
+ r.closed = true
+ if r.quit != nil {
+ close(r.quit)
+ }
+ return nil
+}
+
+// Add adds a function to the RunEvery.
+func (r *RunEvery) Add(name string, f Func) {
+ r.mu.Lock()
+ defer r.mu.Unlock()
+ if r.funcs == nil {
+ r.funcs = make(map[string]*Func)
+ }
+ if f.IntervalLow == 0 {
+ f.IntervalLow = 500 * time.Millisecond
+ }
+ if f.IntervalHigh <= f.IntervalLow {
+ f.IntervalHigh = 20 * time.Second
+ }
+
+ start := max(f.IntervalHigh/3, f.IntervalLow)
+ f.interval = start
+ f.last = time.Now()
+
+ r.funcs[name] = &f
+}
+
+// Remove removes a function from the RunEvery.
+func (r *RunEvery) Remove(name string) {
+ r.mu.Lock()
+ defer r.mu.Unlock()
+ delete(r.funcs, name)
+}
+
+// Has returns whether the RunEvery has a function with the given name.
+func (r *RunEvery) Has(name string) bool {
+ r.mu.Lock()
+ defer r.mu.Unlock()
+ _, found := r.funcs[name]
+ return found
+}
+
+func (r *RunEvery) run() {
+ r.mu.Lock()
+ defer r.mu.Unlock()
+ for name, f := range r.funcs {
+ if time.Now().Before(f.last.Add(f.interval)) {
+ continue
+ }
+ f.last = time.Now()
+ interval, err := f.F(f.interval)
+ if err != nil && r.HandleError != nil {
+ r.HandleError(name, err)
+ }
+
+ if interval < f.IntervalLow {
+ interval = f.IntervalLow
+ }
+
+ if interval > f.IntervalHigh {
+ interval = f.IntervalHigh
+ }
+ f.interval = interval
+ }
+}
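
There is no test for RunEvery in this diff, so here is a hedged usage sketch; the task name "ping" and the intervals are made up, and the runner clamps whatever interval the function returns to [IntervalLow, IntervalHigh]:

```go
package main

import (
	"fmt"
	"time"

	"github.com/gohugoio/hugo/common/tasks"
)

func main() {
	r := &tasks.RunEvery{
		RunImmediately: true,
		HandleError: func(name string, err error) {
			fmt.Println("task", name, "failed:", err)
		},
	}

	r.Add("ping", tasks.Func{
		IntervalLow:  1 * time.Second,
		IntervalHigh: 10 * time.Second,
		// The function reports how long it wants to wait before the next run.
		F: func(interval time.Duration) (time.Duration, error) {
			fmt.Println("ping after", interval)
			return interval * 2, nil // back off gradually
		},
	})

	if err := r.Start(); err != nil {
		panic(err)
	}
	defer r.Close()

	time.Sleep(5 * time.Second)
}
```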
diff --git a/common/terminal/colors.go b/common/terminal/colors.go
index 334b82fae..fef6efce8 100644
--- a/common/terminal/colors.go
+++ b/common/terminal/colors.go
@@ -1,4 +1,4 @@
-// Copyright 2018 The Hugo Authors. All rights reserved.
+// Copyright 2024 The Hugo Authors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -17,7 +17,6 @@ package terminal
import (
"fmt"
"os"
- "runtime"
"strings"
isatty "github.com/mattn/go-isatty"
@@ -29,13 +28,18 @@ const (
noticeColor = "\033[1;36m%s\033[0m"
)
+// PrintANSIColors returns false if the NO_COLOR environment variable is set,
+// otherwise the result of IsTerminal(f).
+func PrintANSIColors(f *os.File) bool {
+ if os.Getenv("NO_COLOR") != "" {
+ return false
+ }
+ return IsTerminal(f)
+}
+
// IsTerminal return true if the file descriptor is terminal and the TERM
// environment variable isn't a dumb one.
func IsTerminal(f *os.File) bool {
- if runtime.GOOS == "windows" {
- return false
- }
-
fd := f.Fd()
return os.Getenv("TERM") != "dumb" && (isatty.IsTerminal(fd) || isatty.IsCygwinTerminal(fd))
}
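
A small sketch (not part of the diff) of how PrintANSIColors gates colored output; Notice is the existing helper used by position.go below:

```go
package main

import (
	"fmt"
	"os"

	"github.com/gohugoio/hugo/common/terminal"
)

func main() {
	msg := "building site"
	// Only colorize when stdout is a terminal and NO_COLOR is not set.
	if terminal.PrintANSIColors(os.Stdout) {
		msg = terminal.Notice(msg)
	}
	fmt.Println(msg)
}
```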
diff --git a/common/text/position.go b/common/text/position.go
index 0c43c5ae7..eb9de5624 100644
--- a/common/text/position.go
+++ b/common/text/position.go
@@ -24,6 +24,8 @@ import (
// Positioner represents a thing that knows its position in a text file or stream,
// typically an error.
type Positioner interface {
+ // Position returns the current position.
+ // Useful in error logging, e.g. {{ errorf "error in code block: %s" .Position }}.
Position() Position
}
@@ -50,12 +52,11 @@ func (pos Position) IsValid() bool {
var positionStringFormatfunc func(p Position) string
func createPositionStringFormatter(formatStr string) func(p Position) string {
-
if formatStr == "" {
formatStr = "\":file::line::col\""
}
- var identifiers = []string{":file", ":line", ":col"}
+ identifiers := []string{":file", ":line", ":col"}
var identifiersFound []string
for i := range formatStr {
@@ -70,7 +71,7 @@ func createPositionStringFormatter(formatStr string) func(p Position) string {
format := replacer.Replace(formatStr)
f := func(pos Position) string {
- args := make([]interface{}, len(identifiersFound))
+ args := make([]any, len(identifiersFound))
for i, id := range identifiersFound {
switch id {
case ":file":
@@ -84,7 +85,7 @@ func createPositionStringFormatter(formatStr string) func(p Position) string {
msg := fmt.Sprintf(format, args...)
- if terminal.IsTerminal(os.Stdout) {
+ if terminal.PrintANSIColors(os.Stdout) {
return terminal.Notice(msg)
}
diff --git a/common/text/position_test.go b/common/text/position_test.go
index ba4824344..a1f43c5d4 100644
--- a/common/text/position_test.go
+++ b/common/text/position_test.go
@@ -29,5 +29,4 @@ func TestPositionStringFormatter(t *testing.T) {
c.Assert(createPositionStringFormatter("好::col")(pos), qt.Equals, "好:13")
c.Assert(createPositionStringFormatter("")(pos), qt.Equals, "\"/my/file.txt:12:13\"")
c.Assert(pos.String(), qt.Equals, "\"/my/file.txt:12:13\"")
-
}
diff --git a/common/text/transform.go b/common/text/transform.go
new file mode 100644
index 000000000..de093af0d
--- /dev/null
+++ b/common/text/transform.go
@@ -0,0 +1,78 @@
+// Copyright 2019 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package text
+
+import (
+ "strings"
+ "sync"
+ "unicode"
+
+ "golang.org/x/text/runes"
+ "golang.org/x/text/transform"
+ "golang.org/x/text/unicode/norm"
+)
+
+var accentTransformerPool = &sync.Pool{
+ New: func() any {
+ return transform.Chain(norm.NFD, runes.Remove(runes.In(unicode.Mn)), norm.NFC)
+ },
+}
+
+// RemoveAccents removes all accents from b.
+func RemoveAccents(b []byte) []byte {
+ t := accentTransformerPool.Get().(transform.Transformer)
+ b, _, _ = transform.Bytes(t, b)
+ t.Reset()
+ accentTransformerPool.Put(t)
+ return b
+}
+
+// RemoveAccentsString removes all accents from s.
+func RemoveAccentsString(s string) string {
+ t := accentTransformerPool.Get().(transform.Transformer)
+ s, _, _ = transform.String(t, s)
+ t.Reset()
+ accentTransformerPool.Put(t)
+ return s
+}
+
+// Chomp removes trailing newline characters from s.
+func Chomp(s string) string {
+ return strings.TrimRightFunc(s, func(r rune) bool {
+ return r == '\n' || r == '\r'
+ })
+}
+
+// Puts adds a trailing \n if none is found.
+func Puts(s string) string {
+ if s == "" || s[len(s)-1] == '\n' {
+ return s
+ }
+ return s + "\n"
+}
+
+// VisitLinesAfter calls the given function for each line in the given string, with any trailing newline included.
+func VisitLinesAfter(s string, fn func(line string)) {
+ high := strings.IndexRune(s, '\n')
+ for high != -1 {
+ fn(s[:high+1])
+ s = s[high+1:]
+
+ high = strings.IndexRune(s, '\n')
+ }
+
+ if s != "" {
+ fn(s)
+ }
+}
diff --git a/common/text/transform_test.go b/common/text/transform_test.go
new file mode 100644
index 000000000..74bb37783
--- /dev/null
+++ b/common/text/transform_test.go
@@ -0,0 +1,72 @@
+// Copyright 2019 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package text
+
+import (
+ "testing"
+
+ qt "github.com/frankban/quicktest"
+)
+
+func TestRemoveAccents(t *testing.T) {
+ c := qt.New(t)
+
+ c.Assert(string(RemoveAccents([]byte("Resumé"))), qt.Equals, "Resume")
+ c.Assert(string(RemoveAccents([]byte("Hugo Rocks!"))), qt.Equals, "Hugo Rocks!")
+ c.Assert(string(RemoveAccentsString("Resumé")), qt.Equals, "Resume")
+}
+
+func TestChomp(t *testing.T) {
+ c := qt.New(t)
+
+ c.Assert(Chomp("\nA\n"), qt.Equals, "\nA")
+ c.Assert(Chomp("A\r\n"), qt.Equals, "A")
+}
+
+func TestPuts(t *testing.T) {
+ c := qt.New(t)
+
+ c.Assert(Puts("A"), qt.Equals, "A\n")
+ c.Assert(Puts("\nA\n"), qt.Equals, "\nA\n")
+ c.Assert(Puts(""), qt.Equals, "")
+}
+
+func TestVisitLinesAfter(t *testing.T) {
+ const lines = `line 1
+line 2
+
+line 3`
+
+ var collected []string
+
+ VisitLinesAfter(lines, func(s string) {
+ collected = append(collected, s)
+ })
+
+ c := qt.New(t)
+
+ c.Assert(collected, qt.DeepEquals, []string{"line 1\n", "line 2\n", "\n", "line 3"})
+}
+
+func BenchmarkVisitLinesAfter(b *testing.B) {
+ const lines = `line 1
+ line 2
+
+ line 3`
+
+ for i := 0; i < b.N; i++ {
+ VisitLinesAfter(lines, func(s string) {
+ })
+ }
+}
diff --git a/common/types/closer.go b/common/types/closer.go
new file mode 100644
index 000000000..9f8875a8a
--- /dev/null
+++ b/common/types/closer.go
@@ -0,0 +1,54 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package types
+
+import "sync"
+
+type Closer interface {
+ Close() error
+}
+
+// CloserFunc is a convenience type to create a Closer from a function.
+type CloserFunc func() error
+
+func (f CloserFunc) Close() error {
+ return f()
+}
+
+type CloseAdder interface {
+ Add(Closer)
+}
+
+type Closers struct {
+ mu sync.Mutex
+ cs []Closer
+}
+
+func (cs *Closers) Add(c Closer) {
+ cs.mu.Lock()
+ defer cs.mu.Unlock()
+ cs.cs = append(cs.cs, c)
+}
+
+func (cs *Closers) Close() error {
+ cs.mu.Lock()
+ defer cs.mu.Unlock()
+ for _, c := range cs.cs {
+ c.Close()
+ }
+
+ cs.cs = cs.cs[:0]
+
+ return nil
+}
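
A brief illustrative sketch of the Closer helpers above; the resource names are hypothetical:

```go
package main

import (
	"fmt"

	"github.com/gohugoio/hugo/common/types"
)

func main() {
	var cs types.Closers

	// CloserFunc adapts a plain func() error to the Closer interface.
	cs.Add(types.CloserFunc(func() error {
		fmt.Println("closing resource A")
		return nil
	}))
	cs.Add(types.CloserFunc(func() error {
		fmt.Println("closing resource B")
		return nil
	}))

	// Close runs every registered closer and resets the list.
	_ = cs.Close()
}
```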
diff --git a/common/types/convert.go b/common/types/convert.go
new file mode 100644
index 000000000..6b1750376
--- /dev/null
+++ b/common/types/convert.go
@@ -0,0 +1,129 @@
+// Copyright 2019 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package types
+
+import (
+ "encoding/json"
+ "fmt"
+ "html/template"
+ "reflect"
+ "time"
+
+ "github.com/spf13/cast"
+)
+
+// ToDuration converts v to time.Duration.
+// See ToDurationE if you need to handle errors.
+func ToDuration(v any) time.Duration {
+ d, _ := ToDurationE(v)
+ return d
+}
+
+// ToDurationE converts v to time.Duration. Numeric values are interpreted as milliseconds.
+func ToDurationE(v any) (time.Duration, error) {
+ if n := cast.ToInt(v); n > 0 {
+ return time.Duration(n) * time.Millisecond, nil
+ }
+ d, err := time.ParseDuration(cast.ToString(v))
+ if err != nil {
+ return 0, fmt.Errorf("cannot convert %v to time.Duration", v)
+ }
+ return d, nil
+}
+
+// ToStringSlicePreserveString is the same as ToStringSlicePreserveStringE,
+// but it never fails.
+func ToStringSlicePreserveString(v any) []string {
+ vv, _ := ToStringSlicePreserveStringE(v)
+ return vv
+}
+
+// ToStringSlicePreserveStringE converts v to a string slice.
+// If v is a string, it will be wrapped in a string slice.
+func ToStringSlicePreserveStringE(v any) ([]string, error) {
+ if v == nil {
+ return nil, nil
+ }
+ if sds, ok := v.(string); ok {
+ return []string{sds}, nil
+ }
+ result, err := cast.ToStringSliceE(v)
+ if err == nil {
+ return result, nil
+ }
+
+ // Probably []int or similar. Fall back to reflect.
+ vv := reflect.ValueOf(v)
+
+ switch vv.Kind() {
+ case reflect.Slice, reflect.Array:
+ result = make([]string, vv.Len())
+ for i := range vv.Len() {
+ s, err := cast.ToStringE(vv.Index(i).Interface())
+ if err != nil {
+ return nil, err
+ }
+ result[i] = s
+ }
+ return result, nil
+ default:
+ return nil, fmt.Errorf("failed to convert %T to a string slice", v)
+ }
+}
+
+// TypeToString converts v to a string if it's a valid string type.
+// Note that this will not try to convert numeric values etc.;
+// use ToString for that.
+func TypeToString(v any) (string, bool) {
+ switch s := v.(type) {
+ case string:
+ return s, true
+ case template.HTML:
+ return string(s), true
+ case template.CSS:
+ return string(s), true
+ case template.HTMLAttr:
+ return string(s), true
+ case template.JS:
+ return string(s), true
+ case template.JSStr:
+ return string(s), true
+ case template.URL:
+ return string(s), true
+ case template.Srcset:
+ return string(s), true
+ }
+
+ return "", false
+}
+
+// ToString converts v to a string.
+func ToString(v any) string {
+ s, _ := ToStringE(v)
+ return s
+}
+
+// ToStringE converts v to a string.
+func ToStringE(v any) (string, error) {
+ if s, ok := TypeToString(v); ok {
+ return s, nil
+ }
+
+ switch s := v.(type) {
+ case json.RawMessage:
+ return string(s), nil
+ default:
+ return cast.ToStringE(v)
+ }
+}
diff --git a/common/types/convert_test.go b/common/types/convert_test.go
new file mode 100644
index 000000000..13059285d
--- /dev/null
+++ b/common/types/convert_test.go
@@ -0,0 +1,48 @@
+// Copyright 2019 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package types
+
+import (
+ "encoding/json"
+ "testing"
+ "time"
+
+ qt "github.com/frankban/quicktest"
+)
+
+func TestToStringSlicePreserveString(t *testing.T) {
+ c := qt.New(t)
+
+ c.Assert(ToStringSlicePreserveString("Hugo"), qt.DeepEquals, []string{"Hugo"})
+ c.Assert(ToStringSlicePreserveString(qt.Commentf("Hugo")), qt.DeepEquals, []string{"Hugo"})
+ c.Assert(ToStringSlicePreserveString([]any{"A", "B"}), qt.DeepEquals, []string{"A", "B"})
+ c.Assert(ToStringSlicePreserveString([]int{1, 3}), qt.DeepEquals, []string{"1", "3"})
+ c.Assert(ToStringSlicePreserveString(nil), qt.IsNil)
+}
+
+func TestToString(t *testing.T) {
+ c := qt.New(t)
+
+ c.Assert(ToString([]byte("Hugo")), qt.Equals, "Hugo")
+ c.Assert(ToString(json.RawMessage("Hugo")), qt.Equals, "Hugo")
+}
+
+func TestToDuration(t *testing.T) {
+ c := qt.New(t)
+
+ c.Assert(ToDuration("200ms"), qt.Equals, 200*time.Millisecond)
+ c.Assert(ToDuration("200"), qt.Equals, 200*time.Millisecond)
+ c.Assert(ToDuration("4m"), qt.Equals, 4*time.Minute)
+ c.Assert(ToDuration("asdfadf"), qt.Equals, time.Duration(0))
+}
diff --git a/common/types/css/csstypes.go b/common/types/css/csstypes.go
new file mode 100644
index 000000000..061acfe64
--- /dev/null
+++ b/common/types/css/csstypes.go
@@ -0,0 +1,20 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package css
+
+// QuotedString is a string that needs to be quoted in CSS.
+type QuotedString string
+
+// UnquotedString is a string that does not need to be quoted in CSS.
+type UnquotedString string
diff --git a/common/types/evictingqueue.go b/common/types/evictingqueue.go
index 884762426..a335be3b2 100644
--- a/common/types/evictingqueue.go
+++ b/common/types/evictingqueue.go
@@ -15,57 +15,72 @@
package types
import (
+ "slices"
"sync"
)
-// EvictingStringQueue is a queue which automatically evicts elements from the head of
+// EvictingQueue is a queue which automatically evicts elements from the head of
// the queue when attempting to add new elements onto the queue and it is full.
// This queue orders elements LIFO (last-in-first-out). It throws away duplicates.
-// Note: This queue currently does not contain any remove (poll etc.) methods.
-type EvictingStringQueue struct {
+type EvictingQueue[T comparable] struct {
size int
- vals []string
- set map[string]bool
+ vals []T
+ set map[T]bool
mu sync.Mutex
+ zero T
}
-// NewEvictingStringQueue creates a new queue with the given size.
-func NewEvictingStringQueue(size int) *EvictingStringQueue {
- return &EvictingStringQueue{size: size, set: make(map[string]bool)}
+// NewEvictingQueue creates a new queue with the given size.
+func NewEvictingQueue[T comparable](size int) *EvictingQueue[T] {
+ return &EvictingQueue[T]{size: size, set: make(map[T]bool)}
}
// Add adds a new string to the tail of the queue if it's not already there.
-func (q *EvictingStringQueue) Add(v string) {
+func (q *EvictingQueue[T]) Add(v T) *EvictingQueue[T] {
q.mu.Lock()
if q.set[v] {
q.mu.Unlock()
- return
+ return q
}
if len(q.set) == q.size {
// Full
delete(q.set, q.vals[0])
- q.vals = append(q.vals[:0], q.vals[1:]...)
+ q.vals = slices.Delete(q.vals, 0, 1)
}
q.set[v] = true
q.vals = append(q.vals, v)
q.mu.Unlock()
+
+ return q
+}
+
+func (q *EvictingQueue[T]) Len() int {
+ if q == nil {
+ return 0
+ }
+ q.mu.Lock()
+ defer q.mu.Unlock()
+ return len(q.vals)
}
// Contains returns whether the queue contains v.
-func (q *EvictingStringQueue) Contains(v string) bool {
+func (q *EvictingQueue[T]) Contains(v T) bool {
+ if q == nil {
+ return false
+ }
q.mu.Lock()
defer q.mu.Unlock()
return q.set[v]
}
// Peek looks at the last element added to the queue.
-func (q *EvictingStringQueue) Peek() string {
+func (q *EvictingQueue[T]) Peek() T {
q.mu.Lock()
l := len(q.vals)
if l == 0 {
q.mu.Unlock()
- return ""
+ return q.zero
}
elem := q.vals[l-1]
q.mu.Unlock()
@@ -73,9 +88,12 @@ func (q *EvictingStringQueue) Peek() string {
}
// PeekAll looks at all the elements in the queue, with the newest first.
-func (q *EvictingStringQueue) PeekAll() []string {
+func (q *EvictingQueue[T]) PeekAll() []T {
+ if q == nil {
+ return nil
+ }
q.mu.Lock()
- vals := make([]string, len(q.vals))
+ vals := make([]T, len(q.vals))
copy(vals, q.vals)
q.mu.Unlock()
for i, j := 0, len(vals)-1; i < j; i, j = i+1, j-1 {
@@ -85,9 +103,9 @@ func (q *EvictingStringQueue) PeekAll() []string {
}
// PeekAllSet returns PeekAll as a set.
-func (q *EvictingStringQueue) PeekAllSet() map[string]bool {
+func (q *EvictingQueue[T]) PeekAllSet() map[T]bool {
all := q.PeekAll()
- set := make(map[string]bool)
+ set := make(map[T]bool)
for _, v := range all {
set[v] = true
}
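
Since the queue is now generic over any comparable type, a quick sketch with ints (values are arbitrary) shows the eviction and LIFO peek behavior:

```go
package main

import (
	"fmt"

	"github.com/gohugoio/hugo/common/types"
)

func main() {
	q := types.NewEvictingQueue[int](3)
	q.Add(1).Add(2).Add(3).Add(4) // 1 is evicted when 4 arrives

	fmt.Println(q.Len())       // 3
	fmt.Println(q.Peek())      // 4, the most recently added element
	fmt.Println(q.PeekAll())   // [4 3 2], newest first
	fmt.Println(q.Contains(1)) // false
}
```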
diff --git a/common/types/evictingqueue_test.go b/common/types/evictingqueue_test.go
index 7489ba88d..b93243f3c 100644
--- a/common/types/evictingqueue_test.go
+++ b/common/types/evictingqueue_test.go
@@ -23,7 +23,7 @@ import (
func TestEvictingStringQueue(t *testing.T) {
c := qt.New(t)
- queue := NewEvictingStringQueue(3)
+ queue := NewEvictingQueue[string](3)
c.Assert(queue.Peek(), qt.Equals, "")
queue.Add("a")
@@ -53,9 +53,9 @@ func TestEvictingStringQueueConcurrent(t *testing.T) {
var wg sync.WaitGroup
val := "someval"
- queue := NewEvictingStringQueue(3)
+ queue := NewEvictingQueue[string](3)
- for j := 0; j < 100; j++ {
+ for range 100 {
wg.Add(1)
go func() {
defer wg.Done()
diff --git a/common/types/hstring/stringtypes.go b/common/types/hstring/stringtypes.go
new file mode 100644
index 000000000..53ce2068f
--- /dev/null
+++ b/common/types/hstring/stringtypes.go
@@ -0,0 +1,36 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package hstring
+
+import (
+ "html/template"
+
+ "github.com/gohugoio/hugo/common/types"
+)
+
+var _ types.PrintableValueProvider = HTML("")
+
+// HTML is a string that represents rendered HTML.
+// When printed in templates it will be rendered as template.HTML and considered safe, so there is no need to pipe it into `safeHTML`.
+// This type was introduced as a way to prevent a common case of infinite recursion in the template rendering
+// when the `linkify` option is enabled with a common (wrong) construct like `{{ .Text | .Page.RenderString }}` in a hook template.
+type HTML string
+
+func (s HTML) String() string {
+ return string(s)
+}
+
+func (s HTML) PrintableValue() any {
+ return template.HTML(s)
+}
diff --git a/common/types/hstring/stringtypes_test.go b/common/types/hstring/stringtypes_test.go
new file mode 100644
index 000000000..05e2c22b9
--- /dev/null
+++ b/common/types/hstring/stringtypes_test.go
@@ -0,0 +1,30 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package hstring
+
+import (
+ "html/template"
+ "testing"
+
+ qt "github.com/frankban/quicktest"
+ "github.com/spf13/cast"
+)
+
+func TestRenderedString(t *testing.T) {
+ c := qt.New(t)
+
+ // Validate that it will behave like a string in Hugo settings.
+ c.Assert(cast.ToString(HTML("Hugo")), qt.Equals, "Hugo")
+ c.Assert(template.HTML(HTML("Hugo")), qt.Equals, template.HTML("Hugo"))
+}
diff --git a/common/types/types.go b/common/types/types.go
index f03031439..7e94c1eea 100644
--- a/common/types/types.go
+++ b/common/types/types.go
@@ -17,10 +17,33 @@ package types
import (
"fmt"
"reflect"
+ "sync/atomic"
"github.com/spf13/cast"
)
+// RLocker represents the read-lock methods of sync.RWMutex.
+type RLocker interface {
+ RLock()
+ RUnlock()
+}
+
+type Locker interface {
+ Lock()
+ Unlock()
+}
+
+type RWLocker interface {
+ RLocker
+ Locker
+}
+
+// KeyValue is an interface{} tuple.
+type KeyValue struct {
+ Key any
+ Value any
+}
+
// KeyValueStr is a string tuple.
type KeyValueStr struct {
Key string
@@ -29,8 +52,8 @@ type KeyValueStr struct {
// KeyValues holds an key and a slice of values.
type KeyValues struct {
- Key interface{}
- Values []interface{}
+ Key any
+ Values []any
}
// KeyString returns the key as a string, an empty string if conversion fails.
@@ -45,8 +68,8 @@ func (k KeyValues) String() string {
// NewKeyValuesStrings takes a given key and slice of values and returns a new
// KeyValues struct.
func NewKeyValuesStrings(key string, values ...string) KeyValues {
- iv := make([]interface{}, len(values))
- for i := 0; i < len(values); i++ {
+ iv := make([]any, len(values))
+ for i := range values {
iv[i] = values[i]
}
return KeyValues{Key: key, Values: iv}
@@ -59,7 +82,7 @@ type Zeroer interface {
}
// IsNil reports whether v is nil.
-func IsNil(v interface{}) bool {
+func IsNil(v any) bool {
if v == nil {
return true
}
@@ -78,3 +101,45 @@ func IsNil(v interface{}) bool {
type DevMarker interface {
DevOnly()
}
+
+// Unwrapper is implemented by types that can unwrap themselves.
+type Unwrapper interface {
+ // Unwrapv is for internal use only.
+ // It got its slightly odd name to prevent collisions with user types.
+ Unwrapv() any
+}
+
+// Unwrapv returns the underlying value of v if it implements Unwrapper; otherwise v is returned.
+func Unwrapv(v any) any {
+ if u, ok := v.(Unwrapper); ok {
+ return u.Unwrapv()
+ }
+ return v
+}
+
+// LowHigh represents a low/high index boundary into a string or byte slice.
+type LowHigh[S ~[]byte | string] struct {
+ Low int
+ High int
+}
+
+func (l LowHigh[S]) IsZero() bool {
+ return l.Low < 0 || (l.Low == 0 && l.High == 0)
+}
+
+func (l LowHigh[S]) Value(source S) S {
+ return source[l.Low:l.High]
+}
+
+// InvocationCounter is only used for debugging purposes.
+var InvocationCounter atomic.Int64
+
+// NewBool returns a pointer to b.
+func NewBool(b bool) *bool {
+ return &b
+}
+
+// PrintableValueProvider is implemented by types that can provide a printable value.
+type PrintableValueProvider interface {
+ PrintableValue() any
+}
diff --git a/common/types/types_test.go b/common/types/types_test.go
index 7c5cba659..795733047 100644
--- a/common/types/types_test.go
+++ b/common/types/types_test.go
@@ -25,5 +25,27 @@ func TestKeyValues(t *testing.T) {
kv := NewKeyValuesStrings("key", "a1", "a2")
c.Assert(kv.KeyString(), qt.Equals, "key")
- c.Assert(kv.Values, qt.DeepEquals, []interface{}{"a1", "a2"})
+ c.Assert(kv.Values, qt.DeepEquals, []any{"a1", "a2"})
+}
+
+func TestLowHigh(t *testing.T) {
+ c := qt.New(t)
+
+ lh := LowHigh[string]{
+ Low: 2,
+ High: 10,
+ }
+
+ s := "abcdefghijklmnopqrstuvwxyz"
+ c.Assert(lh.IsZero(), qt.IsFalse)
+ c.Assert(lh.Value(s), qt.Equals, "cdefghij")
+
+ lhb := LowHigh[[]byte]{
+ Low: 2,
+ High: 10,
+ }
+
+ sb := []byte(s)
+ c.Assert(lhb.IsZero(), qt.IsFalse)
+ c.Assert(lhb.Value(sb), qt.DeepEquals, []byte("cdefghij"))
}
diff --git a/common/urls/baseURL.go b/common/urls/baseURL.go
new file mode 100644
index 000000000..2958a2a04
--- /dev/null
+++ b/common/urls/baseURL.go
@@ -0,0 +1,112 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package urls
+
+import (
+ "fmt"
+ "net/url"
+ "strconv"
+ "strings"
+)
+
+// A BaseURL in Hugo is normally of the form scheme://path, but the
+// form scheme: is also valid (mailto:hugo@rules.com).
+type BaseURL struct {
+ url *url.URL
+ WithPath string
+ WithPathNoTrailingSlash string
+ WithoutPath string
+ BasePath string
+ BasePathNoTrailingSlash string
+}
+
+func (b BaseURL) String() string {
+ return b.WithPath
+}
+
+func (b BaseURL) Path() string {
+ return b.url.Path
+}
+
+func (b BaseURL) Port() int {
+ p, _ := strconv.Atoi(b.url.Port())
+ return p
+}
+
+// HostURL returns the URL to the host root without any path elements.
+func (b BaseURL) HostURL() string {
+ return strings.TrimSuffix(b.String(), b.Path())
+}
+
+// WithProtocol returns the BaseURL prefixed with the given protocol.
+// The protocol is normally of the form "scheme://", e.g. "webcal://".
+func (b BaseURL) WithProtocol(protocol string) (BaseURL, error) {
+ u := b.URL()
+
+ scheme := protocol
+ isFullProtocol := strings.HasSuffix(scheme, "://")
+ isOpaqueProtocol := strings.HasSuffix(scheme, ":")
+
+ if isFullProtocol {
+ scheme = strings.TrimSuffix(scheme, "://")
+ } else if isOpaqueProtocol {
+ scheme = strings.TrimSuffix(scheme, ":")
+ }
+
+ u.Scheme = scheme
+
+ if isFullProtocol && u.Opaque != "" {
+ u.Opaque = "//" + u.Opaque
+ } else if isOpaqueProtocol && u.Opaque == "" {
+ return BaseURL{}, fmt.Errorf("cannot determine BaseURL for protocol %q", protocol)
+ }
+
+ return newBaseURLFromURL(u)
+}
+
+func (b BaseURL) WithPort(port int) (BaseURL, error) {
+ u := b.URL()
+ u.Host = u.Hostname() + ":" + strconv.Itoa(port)
+ return newBaseURLFromURL(u)
+}
+
+// URL returns a copy of the internal URL.
+// The copy can be safely used and modified.
+func (b BaseURL) URL() *url.URL {
+ c := *b.url
+ return &c
+}
+
+func NewBaseURLFromString(b string) (BaseURL, error) {
+ u, err := url.Parse(b)
+ if err != nil {
+ return BaseURL{}, err
+ }
+ return newBaseURLFromURL(u)
+}
+
+func newBaseURLFromURL(u *url.URL) (BaseURL, error) {
+ // A baseURL should always have a trailing slash, see #11669.
+ if !strings.HasSuffix(u.Path, "/") {
+ u.Path += "/"
+ }
+ baseURL := BaseURL{url: u, WithPath: u.String(), WithPathNoTrailingSlash: strings.TrimSuffix(u.String(), "/")}
+ baseURLNoPath := baseURL.URL()
+ baseURLNoPath.Path = ""
+ baseURL.WithoutPath = baseURLNoPath.String()
+ baseURL.BasePath = u.Path
+ baseURL.BasePathNoTrailingSlash = strings.TrimSuffix(u.Path, "/")
+
+ return baseURL, nil
+}
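
For readers skimming the new baseURL.go above, here is a minimal, hedged sketch of how the derived fields relate for a URL with a sub path. The example URL and printed values are illustrative only; they follow from newBaseURLFromURL, which always normalizes the path to end with a slash.

package main

import (
	"fmt"

	"github.com/gohugoio/hugo/common/urls"
)

func main() {
	b, err := urls.NewBaseURLFromString("https://example.org/docs")
	if err != nil {
		panic(err)
	}
	fmt.Println(b.WithPath)                // https://example.org/docs/
	fmt.Println(b.WithPathNoTrailingSlash) // https://example.org/docs
	fmt.Println(b.WithoutPath)             // https://example.org
	fmt.Println(b.BasePath)                // /docs/
	fmt.Println(b.BasePathNoTrailingSlash) // /docs
	fmt.Println(b.HostURL())               // https://example.org
}
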
diff --git a/common/urls/baseURL_test.go b/common/urls/baseURL_test.go
new file mode 100644
index 000000000..ba337aac8
--- /dev/null
+++ b/common/urls/baseURL_test.go
@@ -0,0 +1,81 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package urls
+
+import (
+ "testing"
+
+ qt "github.com/frankban/quicktest"
+)
+
+func TestBaseURL(t *testing.T) {
+ c := qt.New(t)
+
+ b, err := NewBaseURLFromString("http://example.com/")
+ c.Assert(err, qt.IsNil)
+ c.Assert(b.String(), qt.Equals, "http://example.com/")
+
+ b, err = NewBaseURLFromString("http://example.com")
+ c.Assert(err, qt.IsNil)
+ c.Assert(b.String(), qt.Equals, "http://example.com/")
+ c.Assert(b.WithPathNoTrailingSlash, qt.Equals, "http://example.com")
+ c.Assert(b.BasePath, qt.Equals, "/")
+
+ p, err := b.WithProtocol("webcal://")
+ c.Assert(err, qt.IsNil)
+ c.Assert(p.String(), qt.Equals, "webcal://example.com/")
+
+ p, err = b.WithProtocol("webcal")
+ c.Assert(err, qt.IsNil)
+ c.Assert(p.String(), qt.Equals, "webcal://example.com/")
+
+ _, err = b.WithProtocol("mailto:")
+ c.Assert(err, qt.Not(qt.IsNil))
+
+ b, err = NewBaseURLFromString("mailto:hugo@rules.com")
+ c.Assert(err, qt.IsNil)
+ c.Assert(b.String(), qt.Equals, "mailto:hugo@rules.com")
+
+	// These are pretty contrived examples.
+ p, err = b.WithProtocol("webcal")
+ c.Assert(err, qt.IsNil)
+ c.Assert(p.String(), qt.Equals, "webcal:hugo@rules.com")
+
+ p, err = b.WithProtocol("webcal://")
+ c.Assert(err, qt.IsNil)
+ c.Assert(p.String(), qt.Equals, "webcal://hugo@rules.com")
+
+ // Test with "non-URLs". Some people will try to use these as a way to get
+ // relative URLs working etc.
+ b, err = NewBaseURLFromString("/")
+ c.Assert(err, qt.IsNil)
+ c.Assert(b.String(), qt.Equals, "/")
+
+ b, err = NewBaseURLFromString("")
+ c.Assert(err, qt.IsNil)
+ c.Assert(b.String(), qt.Equals, "/")
+
+ // BaseURL with sub path
+ b, err = NewBaseURLFromString("http://example.com/sub")
+ c.Assert(err, qt.IsNil)
+ c.Assert(b.String(), qt.Equals, "http://example.com/sub/")
+ c.Assert(b.WithPathNoTrailingSlash, qt.Equals, "http://example.com/sub")
+ c.Assert(b.BasePath, qt.Equals, "/sub/")
+ c.Assert(b.BasePathNoTrailingSlash, qt.Equals, "/sub")
+
+ b, err = NewBaseURLFromString("http://example.com/sub/")
+ c.Assert(err, qt.IsNil)
+ c.Assert(b.String(), qt.Equals, "http://example.com/sub/")
+ c.Assert(b.HostURL(), qt.Equals, "http://example.com")
+}
diff --git a/common/urls/ref.go b/common/urls/ref.go
index 71b00b71d..e5804a279 100644
--- a/common/urls/ref.go
+++ b/common/urls/ref.go
@@ -17,6 +17,6 @@ package urls
// args must contain a path, but can also point to the target
// language or output format.
type RefLinker interface {
- Ref(args map[string]interface{}) (string, error)
- RelRef(args map[string]interface{}) (string, error)
+ Ref(args map[string]any) (string, error)
+ RelRef(args map[string]any) (string, error)
}
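
The RefLinker change above is purely mechanical (interface{} to any), but as a sketch of how the interface is driven: the caller passes arguments as a plain map, where a path is required and a target language or output format is optional. The key names below are assumptions based on the doc comment, not an exhaustive or authoritative list.

package refexample

import "github.com/gohugoio/hugo/common/urls"

// Resolve sketches a typical RelRef call through the interface. A concrete
// implementation (for example Hugo's site or page types) would be supplied
// by the caller.
func Resolve(linker urls.RefLinker) (string, error) {
	args := map[string]any{
		"path":         "/blog/my-post.md",
		"lang":         "en",  // optional
		"outputFormat": "rss", // optional
	}
	return linker.RelRef(args)
}
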
diff --git a/compare/compare.go b/compare/compare.go
index 537a66676..fd15bd087 100644
--- a/compare/compare.go
+++ b/compare/compare.go
@@ -17,13 +17,16 @@ package compare
 // The semantics of equals is that the two values are interchangeable
// in the Hugo templates.
type Eqer interface {
- Eq(other interface{}) bool
+ // Eq returns whether this value is equal to the other.
+ // This is for internal use.
+ Eq(other any) bool
}
// ProbablyEqer is an equal check that may return false positives, but never
// a false negative.
type ProbablyEqer interface {
- ProbablyEq(other interface{}) bool
+ // For internal use.
+ ProbablyEq(other any) bool
}
// Comparer can be used to compare two values.
@@ -31,5 +34,34 @@ type ProbablyEqer interface {
// Compare returns -1 if the given version is less than, 0 if equal and 1 if greater than
// the running version.
type Comparer interface {
- Compare(other interface{}) int
+ Compare(other any) int
+}
+
+// Eq returns whether v1 is equal to v2.
+// It will use the Eqer interface if implemented, which
+// defines equals when two values are interchangeable
+// in the Hugo templates.
+func Eq(v1, v2 any) bool {
+ if v1 == nil || v2 == nil {
+ return v1 == v2
+ }
+
+ if eqer, ok := v1.(Eqer); ok {
+ return eqer.Eq(v2)
+ }
+
+ return v1 == v2
+}
+
+// ProbablyEq returns whether v1 is probably equal to v2.
+func ProbablyEq(v1, v2 any) bool {
+ if Eq(v1, v2) {
+ return true
+ }
+
+ if peqer, ok := v1.(ProbablyEqer); ok {
+ return peqer.ProbablyEq(v2)
+ }
+
+ return false
}
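
The new Eq helper above delegates to the Eqer interface when the first operand implements it, and otherwise falls back to ==. A small illustration follows; the caseInsensitiveString type is hypothetical and not part of the patch.

package main

import (
	"fmt"
	"strings"

	"github.com/gohugoio/hugo/compare"
)

// caseInsensitiveString treats values differing only in case as
// interchangeable, which is the kind of equality the Eqer contract models.
type caseInsensitiveString string

func (s caseInsensitiveString) Eq(other any) bool {
	o, ok := other.(caseInsensitiveString)
	return ok && strings.EqualFold(string(s), string(o))
}

func main() {
	fmt.Println(compare.Eq(caseInsensitiveString("Hugo"), caseInsensitiveString("hugo"))) // true, via Eqer
	fmt.Println(compare.Eq("Hugo", "hugo"))                                               // false, plain ==
	fmt.Println(compare.Eq(nil, nil))                                                     // true
}
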
diff --git a/compare/compare_strings_test.go b/compare/compare_strings_test.go
index db286c2c5..1a5bb0b1a 100644
--- a/compare/compare_strings_test.go
+++ b/compare/compare_strings_test.go
@@ -61,5 +61,22 @@ func TestLexicographicSort(t *testing.T) {
})
c.Assert(s, qt.DeepEquals, []string{"A", "b", "Ba", "ba", "ba", "Bz"})
-
+}
+
+func BenchmarkStringSort(b *testing.B) {
+ prototype := []string{"b", "Bz", "zz", "ba", "αβδ αβδ αβδ", "A", "Ba", "ba", "nnnnasdfnnn", "AAgæåz", "αβδC"}
+ b.Run("LessStrings", func(b *testing.B) {
+ ss := make([][]string, b.N)
+ for i := 0; i < b.N; i++ {
+ ss[i] = make([]string, len(prototype))
+ copy(ss[i], prototype)
+ }
+ b.ResetTimer()
+ for i := 0; i < b.N; i++ {
+ sss := ss[i]
+ sort.Slice(sss, func(i, j int) bool {
+ return LessStrings(sss[i], sss[j])
+ })
+ }
+ })
}
diff --git a/config/allconfig/allconfig.go b/config/allconfig/allconfig.go
new file mode 100644
index 000000000..0db0be1d8
--- /dev/null
+++ b/config/allconfig/allconfig.go
@@ -0,0 +1,1182 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+// Package allconfig contains the full configuration for Hugo.
+// { "name": "Configuration", "description": "This section holds all configuration options in Hugo." }
+package allconfig
+
+import (
+ "errors"
+ "fmt"
+ "reflect"
+ "regexp"
+ "sort"
+ "strconv"
+ "strings"
+ "sync"
+ "time"
+
+ "github.com/gohugoio/hugo/cache/filecache"
+ "github.com/gohugoio/hugo/cache/httpcache"
+ "github.com/gohugoio/hugo/common/hugo"
+ "github.com/gohugoio/hugo/common/loggers"
+ "github.com/gohugoio/hugo/common/maps"
+ "github.com/gohugoio/hugo/common/paths"
+ "github.com/gohugoio/hugo/common/types"
+ "github.com/gohugoio/hugo/common/urls"
+ "github.com/gohugoio/hugo/config"
+ "github.com/gohugoio/hugo/config/privacy"
+ "github.com/gohugoio/hugo/config/security"
+ "github.com/gohugoio/hugo/config/services"
+ "github.com/gohugoio/hugo/deploy/deployconfig"
+ "github.com/gohugoio/hugo/helpers"
+ "github.com/gohugoio/hugo/hugolib/segments"
+ "github.com/gohugoio/hugo/langs"
+ "github.com/gohugoio/hugo/markup/markup_config"
+ "github.com/gohugoio/hugo/media"
+ "github.com/gohugoio/hugo/minifiers"
+ "github.com/gohugoio/hugo/modules"
+ "github.com/gohugoio/hugo/navigation"
+ "github.com/gohugoio/hugo/output"
+ "github.com/gohugoio/hugo/related"
+ "github.com/gohugoio/hugo/resources/images"
+ "github.com/gohugoio/hugo/resources/kinds"
+ "github.com/gohugoio/hugo/resources/page"
+ "github.com/gohugoio/hugo/resources/page/pagemeta"
+ "github.com/spf13/afero"
+
+ xmaps "golang.org/x/exp/maps"
+)
+
+// InternalConfig is the internal configuration for Hugo, not read from any user provided config file.
+type InternalConfig struct {
+ // Server mode?
+ Running bool
+
+ Quiet bool
+ Verbose bool
+ Clock string
+ Watch bool
+ FastRenderMode bool
+ LiveReloadPort int
+}
+
+// All non-params config keys for language.
+var configLanguageKeys map[string]bool
+
+func init() {
+ skip := map[string]bool{
+ "internal": true,
+ "c": true,
+ "rootconfig": true,
+ }
+ configLanguageKeys = make(map[string]bool)
+ addKeys := func(v reflect.Value) {
+ for i := range v.NumField() {
+ name := strings.ToLower(v.Type().Field(i).Name)
+ if skip[name] {
+ continue
+ }
+ configLanguageKeys[name] = true
+ }
+ }
+ addKeys(reflect.ValueOf(Config{}))
+ addKeys(reflect.ValueOf(RootConfig{}))
+ addKeys(reflect.ValueOf(config.CommonDirs{}))
+ addKeys(reflect.ValueOf(langs.LanguageConfig{}))
+}
+
+type Config struct {
+ // For internal use only.
+ Internal InternalConfig `mapstructure:"-" json:"-"`
+ // For internal use only.
+ C *ConfigCompiled `mapstructure:"-" json:"-"`
+
+ RootConfig
+
+ // Author information.
+ // Deprecated: Use taxonomies instead.
+ Author map[string]any
+
+ // Social links.
+ // Deprecated: Use .Site.Params instead.
+ Social map[string]string
+
+ // The build configuration section contains build-related configuration options.
+ // {"identifiers": ["build"] }
+ Build config.BuildConfig `mapstructure:"-"`
+
+ // The caches configuration section contains cache-related configuration options.
+ // {"identifiers": ["caches"] }
+ Caches filecache.Configs `mapstructure:"-"`
+
+ // The httpcache configuration section contains HTTP-cache-related configuration options.
+ // {"identifiers": ["httpcache"] }
+ HTTPCache httpcache.Config `mapstructure:"-"`
+
+ // The markup configuration section contains markup-related configuration options.
+ // {"identifiers": ["markup"] }
+ Markup markup_config.Config `mapstructure:"-"`
+
+	// ContentTypes are the media types that are considered content in Hugo.
+ ContentTypes *config.ConfigNamespace[map[string]media.ContentTypeConfig, media.ContentTypes] `mapstructure:"-"`
+
+ // The mediatypes configuration section maps the MIME type (a string) to a configuration object for that type.
+ // {"identifiers": ["mediatypes"], "refs": ["types:media:type"] }
+ MediaTypes *config.ConfigNamespace[map[string]media.MediaTypeConfig, media.Types] `mapstructure:"-"`
+
+ Imaging *config.ConfigNamespace[images.ImagingConfig, images.ImagingConfigInternal] `mapstructure:"-"`
+
+	// The outputformats configuration section maps a format name (a string) to a configuration object for that format.
+ OutputFormats *config.ConfigNamespace[map[string]output.OutputFormatConfig, output.Formats] `mapstructure:"-"`
+
+ // The outputs configuration section maps a Page Kind (a string) to a slice of output formats.
+ // This can be overridden in the front matter.
+ Outputs map[string][]string `mapstructure:"-"`
+
+ // The cascade configuration section contains the top level front matter cascade configuration options,
+	// a slice of page matchers and params to apply to those pages.
+ Cascade *config.ConfigNamespace[[]page.PageMatcherParamsConfig, *maps.Ordered[page.PageMatcher, page.PageMatcherParamsConfig]] `mapstructure:"-"`
+
+	// The segments configuration section defines segments for the site, used for partial/segmented builds.
+ Segments *config.ConfigNamespace[map[string]segments.SegmentConfig, segments.Segments] `mapstructure:"-"`
+
+ // Menu configuration.
+ // {"refs": ["config:languages:menus"] }
+ Menus *config.ConfigNamespace[map[string]navigation.MenuConfig, navigation.Menus] `mapstructure:"-"`
+
+	// The deployment configuration section contains the configuration for hugo deploy.
+ Deployment deployconfig.DeployConfig `mapstructure:"-"`
+
+ // Module configuration.
+ Module modules.Config `mapstructure:"-"`
+
+ // Front matter configuration.
+ Frontmatter pagemeta.FrontmatterConfig `mapstructure:"-"`
+
+ // Minification configuration.
+ Minify minifiers.MinifyConfig `mapstructure:"-"`
+
+ // Permalink configuration.
+ Permalinks map[string]map[string]string `mapstructure:"-"`
+
+ // Taxonomy configuration.
+ Taxonomies map[string]string `mapstructure:"-"`
+
+ // Sitemap configuration.
+ Sitemap config.SitemapConfig `mapstructure:"-"`
+
+ // Related content configuration.
+ Related related.Config `mapstructure:"-"`
+
+ // Server configuration.
+ Server config.Server `mapstructure:"-"`
+
+ // Pagination configuration.
+ Pagination config.Pagination `mapstructure:"-"`
+
+ // Page configuration.
+ Page config.PageConfig `mapstructure:"-"`
+
+ // Privacy configuration.
+ Privacy privacy.Config `mapstructure:"-"`
+
+ // Security configuration.
+ Security security.Config `mapstructure:"-"`
+
+ // Services configuration.
+ Services services.Config `mapstructure:"-"`
+
+ // User provided parameters.
+ // {"refs": ["config:languages:params"] }
+ Params maps.Params `mapstructure:"-"`
+
+	// The languages configuration section maps a language code (a string) to a configuration object for that language.
+ Languages map[string]langs.LanguageConfig `mapstructure:"-"`
+
+ // UglyURLs configuration. Either a boolean or a sections map.
+ UglyURLs any `mapstructure:"-"`
+}
+
+type configCompiler interface {
+ CompileConfig(logger loggers.Logger) error
+}
+
+func (c Config) cloneForLang() *Config {
+ x := c
+ x.C = nil
+ copyStringSlice := func(in []string) []string {
+ if in == nil {
+ return nil
+ }
+ out := make([]string, len(in))
+ copy(out, in)
+ return out
+ }
+
+ // Copy all the slices to avoid sharing.
+ x.DisableKinds = copyStringSlice(x.DisableKinds)
+ x.DisableLanguages = copyStringSlice(x.DisableLanguages)
+ x.MainSections = copyStringSlice(x.MainSections)
+ x.IgnoreLogs = copyStringSlice(x.IgnoreLogs)
+ x.IgnoreFiles = copyStringSlice(x.IgnoreFiles)
+ x.Theme = copyStringSlice(x.Theme)
+
+ // Collapse all static dirs to one.
+ x.StaticDir = x.staticDirs()
+ // These will go away soon ...
+ x.StaticDir0 = nil
+ x.StaticDir1 = nil
+ x.StaticDir2 = nil
+ x.StaticDir3 = nil
+ x.StaticDir4 = nil
+ x.StaticDir5 = nil
+ x.StaticDir6 = nil
+ x.StaticDir7 = nil
+ x.StaticDir8 = nil
+ x.StaticDir9 = nil
+ x.StaticDir10 = nil
+
+ return &x
+}
+
+func (c *Config) CompileConfig(logger loggers.Logger) error {
+ var transientErr error
+ s := c.Timeout
+ if _, err := strconv.Atoi(s); err == nil {
+ // A number, assume seconds.
+ s = s + "s"
+ }
+ timeout, err := time.ParseDuration(s)
+ if err != nil {
+ return fmt.Errorf("failed to parse timeout: %s", err)
+ }
+ disabledKinds := make(map[string]bool)
+ for _, kind := range c.DisableKinds {
+ kind = strings.ToLower(kind)
+ if newKind := kinds.IsDeprecatedAndReplacedWith(kind); newKind != "" {
+ logger.Deprecatef(false, "Kind %q used in disableKinds is deprecated, use %q instead.", kind, newKind)
+ // Legacy config.
+ kind = newKind
+ }
+ if kinds.GetKindAny(kind) == "" {
+ logger.Warnf("Unknown kind %q in disableKinds configuration.", kind)
+ continue
+ }
+ disabledKinds[kind] = true
+ }
+ kindOutputFormats := make(map[string]output.Formats)
+ isRssDisabled := disabledKinds["rss"]
+ outputFormats := c.OutputFormats.Config
+ for kind, formats := range c.Outputs {
+ if newKind := kinds.IsDeprecatedAndReplacedWith(kind); newKind != "" {
+ logger.Deprecatef(false, "Kind %q used in outputs configuration is deprecated, use %q instead.", kind, newKind)
+ kind = newKind
+ }
+ if disabledKinds[kind] {
+ continue
+ }
+ if kinds.GetKindAny(kind) == "" {
+ logger.Warnf("Unknown kind %q in outputs configuration.", kind)
+ continue
+ }
+ for _, format := range formats {
+ if isRssDisabled && format == "rss" {
+ // Legacy config.
+ continue
+ }
+ f, found := outputFormats.GetByName(format)
+ if !found {
+ transientErr = fmt.Errorf("unknown output format %q for kind %q", format, kind)
+ continue
+ }
+ kindOutputFormats[kind] = append(kindOutputFormats[kind], f)
+ }
+ }
+
+ defaultOutputFormat := outputFormats[0]
+ c.DefaultOutputFormat = strings.ToLower(c.DefaultOutputFormat)
+ if c.DefaultOutputFormat != "" {
+ f, found := outputFormats.GetByName(c.DefaultOutputFormat)
+ if !found {
+ return fmt.Errorf("unknown default output format %q", c.DefaultOutputFormat)
+ }
+ defaultOutputFormat = f
+ } else {
+ c.DefaultOutputFormat = defaultOutputFormat.Name
+ }
+
+ disabledLangs := make(map[string]bool)
+ for _, lang := range c.DisableLanguages {
+ disabledLangs[lang] = true
+ }
+ for lang, language := range c.Languages {
+ if !language.Disabled && disabledLangs[lang] {
+ language.Disabled = true
+ c.Languages[lang] = language
+ }
+ if language.Disabled {
+ disabledLangs[lang] = true
+ if lang == c.DefaultContentLanguage {
+ return fmt.Errorf("cannot disable default content language %q", lang)
+ }
+ }
+ }
+
+ for i, s := range c.IgnoreLogs {
+ c.IgnoreLogs[i] = strings.ToLower(s)
+ }
+
+ ignoredLogIDs := make(map[string]bool)
+ for _, err := range c.IgnoreLogs {
+ ignoredLogIDs[err] = true
+ }
+
+ baseURL, err := urls.NewBaseURLFromString(c.BaseURL)
+ if err != nil {
+ return err
+ }
+
+ isUglyURL := func(section string) bool {
+ switch v := c.UglyURLs.(type) {
+ case bool:
+ return v
+ case map[string]bool:
+ return v[section]
+ default:
+ return false
+ }
+ }
+
+ ignoreFile := func(s string) bool {
+ return false
+ }
+ if len(c.IgnoreFiles) > 0 {
+ regexps := make([]*regexp.Regexp, len(c.IgnoreFiles))
+ for i, pattern := range c.IgnoreFiles {
+ var err error
+ regexps[i], err = regexp.Compile(pattern)
+ if err != nil {
+ return fmt.Errorf("failed to compile ignoreFiles pattern %q: %s", pattern, err)
+ }
+ }
+ ignoreFile = func(s string) bool {
+ for _, r := range regexps {
+ if r.MatchString(s) {
+ return true
+ }
+ }
+ return false
+ }
+ }
+
+ var clock time.Time
+ if c.Internal.Clock != "" {
+ var err error
+ clock, err = time.Parse(time.RFC3339, c.Internal.Clock)
+ if err != nil {
+ return fmt.Errorf("failed to parse clock: %s", err)
+ }
+ }
+
+ httpCache, err := c.HTTPCache.Compile()
+ if err != nil {
+ return err
+ }
+
+ // Legacy paginate values.
+ if c.Paginate != 0 {
+ hugo.DeprecateWithLogger("site config key paginate", "Use pagination.pagerSize instead.", "v0.128.0", logger.Logger())
+ c.Pagination.PagerSize = c.Paginate
+ }
+
+ if c.PaginatePath != "" {
+ hugo.DeprecateWithLogger("site config key paginatePath", "Use pagination.path instead.", "v0.128.0", logger.Logger())
+ c.Pagination.Path = c.PaginatePath
+ }
+
+ // Legacy privacy values.
+ if c.Privacy.Twitter.Disable {
+ hugo.DeprecateWithLogger("site config key privacy.twitter.disable", "Use privacy.x.disable instead.", "v0.141.0", logger.Logger())
+ c.Privacy.X.Disable = c.Privacy.Twitter.Disable
+ }
+
+ if c.Privacy.Twitter.EnableDNT {
+ hugo.DeprecateWithLogger("site config key privacy.twitter.enableDNT", "Use privacy.x.enableDNT instead.", "v0.141.0", logger.Logger())
+ c.Privacy.X.EnableDNT = c.Privacy.Twitter.EnableDNT
+ }
+
+ if c.Privacy.Twitter.Simple {
+ hugo.DeprecateWithLogger("site config key privacy.twitter.simple", "Use privacy.x.simple instead.", "v0.141.0", logger.Logger())
+ c.Privacy.X.Simple = c.Privacy.Twitter.Simple
+ }
+
+ // Legacy services values.
+ if c.Services.Twitter.DisableInlineCSS {
+ hugo.DeprecateWithLogger("site config key services.twitter.disableInlineCSS", "Use services.x.disableInlineCSS instead.", "v0.141.0", logger.Logger())
+ c.Services.X.DisableInlineCSS = c.Services.Twitter.DisableInlineCSS
+ }
+
+ // Legacy permalink tokens
+ vs := fmt.Sprintf("%v", c.Permalinks)
+ if strings.Contains(vs, ":filename") {
+ hugo.DeprecateWithLogger("the \":filename\" permalink token", "Use \":contentbasename\" instead.", "0.144.0", logger.Logger())
+ }
+ if strings.Contains(vs, ":slugorfilename") {
+ hugo.DeprecateWithLogger("the \":slugorfilename\" permalink token", "Use \":slugorcontentbasename\" instead.", "0.144.0", logger.Logger())
+ }
+
+ c.C = &ConfigCompiled{
+ Timeout: timeout,
+ BaseURL: baseURL,
+ BaseURLLiveReload: baseURL,
+ DisabledKinds: disabledKinds,
+ DisabledLanguages: disabledLangs,
+ IgnoredLogs: ignoredLogIDs,
+ KindOutputFormats: kindOutputFormats,
+ DefaultOutputFormat: defaultOutputFormat,
+ CreateTitle: helpers.GetTitleFunc(c.TitleCaseStyle),
+ IsUglyURLSection: isUglyURL,
+ IgnoreFile: ignoreFile,
+ SegmentFilter: c.Segments.Config.Get(func(s string) { logger.Warnf("Render segment %q not found in configuration", s) }, c.RootConfig.RenderSegments...),
+ MainSections: c.MainSections,
+ Clock: clock,
+ HTTPCache: httpCache,
+ transientErr: transientErr,
+ }
+
+ for _, s := range allDecoderSetups {
+ if getCompiler := s.getCompiler; getCompiler != nil {
+ if err := getCompiler(c).CompileConfig(logger); err != nil {
+ return err
+ }
+ }
+ }
+
+ return nil
+}
+
+func (c *Config) IsKindEnabled(kind string) bool {
+ return !c.C.DisabledKinds[kind]
+}
+
+func (c *Config) IsLangDisabled(lang string) bool {
+ return c.C.DisabledLanguages[lang]
+}
+
+// ConfigCompiled holds values and functions that are derived from the config.
+type ConfigCompiled struct {
+ Timeout time.Duration
+ BaseURL urls.BaseURL
+ BaseURLLiveReload urls.BaseURL
+ ServerInterface string
+ KindOutputFormats map[string]output.Formats
+ DefaultOutputFormat output.Format
+ DisabledKinds map[string]bool
+ DisabledLanguages map[string]bool
+ IgnoredLogs map[string]bool
+ CreateTitle func(s string) string
+ IsUglyURLSection func(section string) bool
+ IgnoreFile func(filename string) bool
+ SegmentFilter segments.SegmentFilter
+ MainSections []string
+ Clock time.Time
+ HTTPCache httpcache.ConfigCompiled
+
+ // This is set to the last transient error found during config compilation.
+ // With themes/modules we compute the configuration in multiple passes, and
+	// errors with missing output format definitions may resolve themselves.
+ transientErr error
+
+ mu sync.Mutex
+}
+
+// This may be set after the config is compiled.
+func (c *ConfigCompiled) SetMainSections(sections []string) {
+ c.mu.Lock()
+ defer c.mu.Unlock()
+ c.MainSections = sections
+}
+
+// IsMainSectionsSet returns whether the main sections have been set.
+func (c *ConfigCompiled) IsMainSectionsSet() bool {
+ c.mu.Lock()
+ defer c.mu.Unlock()
+ return c.MainSections != nil
+}
+
+// This is set after the config is compiled by the server command.
+func (c *ConfigCompiled) SetServerInfo(baseURL, baseURLLiveReload urls.BaseURL, serverInterface string) {
+ c.BaseURL = baseURL
+ c.BaseURLLiveReload = baseURLLiveReload
+ c.ServerInterface = serverInterface
+}
+
+// RootConfig holds all the top-level configuration options in Hugo
+type RootConfig struct {
+ // The base URL of the site.
+ // Note that the default value is empty, but Hugo requires a valid URL (e.g. "https://example.com/") to work properly.
+ // {"identifiers": ["URL"] }
+ BaseURL string
+
+	// Whether to build content marked as draft.
+ // {"identifiers": ["draft"] }
+ BuildDrafts bool
+
+ // Whether to build content with expiryDate in the past.
+ // {"identifiers": ["expiryDate"] }
+ BuildExpired bool
+
+ // Whether to build content with publishDate in the future.
+ // {"identifiers": ["publishDate"] }
+ BuildFuture bool
+
+ // Copyright information.
+ Copyright string
+
+ // The language to apply to content without any language indicator.
+ DefaultContentLanguage string
+
+ // By default, we put the default content language in the root and the others below their language ID, e.g. /no/.
+ // Set this to true to put all languages below their language ID.
+ DefaultContentLanguageInSubdir bool
+
+ // The default output format to use for the site.
+ // If not set, we will use the first output format.
+ DefaultOutputFormat string
+
+ // Disable generation of redirect to the default language when DefaultContentLanguageInSubdir is enabled.
+ DisableDefaultLanguageRedirect bool
+
+ // Disable creation of alias redirect pages.
+ DisableAliases bool
+
+ // Disable lower casing of path segments.
+ DisablePathToLower bool
+
+ // Disable page kinds from build.
+ DisableKinds []string
+
+ // A list of languages to disable.
+ DisableLanguages []string
+
+ // The named segments to render.
+ // This needs to match the name of the segment in the segments configuration.
+ RenderSegments []string
+
+ // Disable the injection of the Hugo generator tag on the home page.
+ DisableHugoGeneratorInject bool
+
+ // Disable live reloading in server mode.
+ DisableLiveReload bool
+
+ // Enable replacement in Pages' Content of Emoji shortcodes with their equivalent Unicode characters.
+ // {"identifiers": ["Content", "Unicode"] }
+ EnableEmoji bool
+
+	// The main section(s) of the site.
+ // If not set, Hugo will try to guess this from the content.
+ MainSections []string
+
+ // Enable robots.txt generation.
+ EnableRobotsTXT bool
+
+ // When enabled, Hugo will apply Git version information to each Page if possible, which
+	// can be used to keep lastUpdated in sync and to print version information.
+ // {"identifiers": ["Page"] }
+ EnableGitInfo bool
+
+ // Enable to track, calculate and print metrics.
+ TemplateMetrics bool
+
+ // Enable to track, print and calculate metric hints.
+ TemplateMetricsHints bool
+
+ // Enable to disable the build lock file.
+ NoBuildLock bool
+
+ // A list of log IDs to ignore.
+ IgnoreLogs []string
+
+ // A list of regexps that match paths to ignore.
+ // Deprecated: Use the settings on module imports.
+ IgnoreFiles []string
+
+ // Ignore cache.
+ IgnoreCache bool
+
+	// Enable to print greppable placeholders (of the form "[i18n] TRANSLATIONID") for missing translation strings.
+ EnableMissingTranslationPlaceholders bool
+
+ // Enable to panic on warning log entries. This may make it easier to detect the source.
+ PanicOnWarning bool
+
+ // The configured environment. Default is "development" for server and "production" for build.
+ Environment string
+
+ // The default language code.
+ LanguageCode string
+
+ // Enable if the site content has CJK language (Chinese, Japanese, or Korean). This affects how Hugo counts words.
+ HasCJKLanguage bool
+
+ // The default number of pages per page when paginating.
+ // Deprecated: Use the Pagination struct.
+ Paginate int
+
+ // The path to use when creating pagination URLs, e.g. "page" in /page/2/.
+ // Deprecated: Use the Pagination struct.
+ PaginatePath string
+
+ // Whether to pluralize default list titles.
+ // Note that this currently only works for English, but you can provide your own title in the content file's front matter.
+ PluralizeListTitles bool
+
+ // Whether to capitalize automatic page titles, applicable to section, taxonomy, and term pages.
+ CapitalizeListTitles bool
+
+ // Make all relative URLs absolute using the baseURL.
+ // {"identifiers": ["baseURL"] }
+ CanonifyURLs bool
+
+ // Enable this to make all relative URLs relative to content root. Note that this does not affect absolute URLs.
+ RelativeURLs bool
+
+ // Removes non-spacing marks from composite characters in content paths.
+ RemovePathAccents bool
+
+ // Whether to track and print unused templates during the build.
+ PrintUnusedTemplates bool
+
+ // Enable to print warnings for missing translation strings.
+ PrintI18nWarnings bool
+
+	// Enable to print warnings for multiple files published to the same destination.
+ PrintPathWarnings bool
+
+ // URL to be used as a placeholder when a page reference cannot be found in ref or relref. Is used as-is.
+ RefLinksNotFoundURL string
+
+ // When using ref or relref to resolve page links and a link cannot be resolved, it will be logged with this log level.
+ // Valid values are ERROR (default) or WARNING. Any ERROR will fail the build (exit -1).
+ RefLinksErrorLevel string
+
+ // This will create a menu with all the sections as menu items and all the sections’ pages as “shadow-members”.
+ SectionPagesMenu string
+
+ // The length of text in words to show in a .Summary.
+ SummaryLength int
+
+ // The site title.
+ Title string
+
+ // The theme(s) to use.
+	// See Modules for a more flexible way to load themes.
+ Theme []string
+
+ // Timeout for generating page contents, specified as a duration or in seconds.
+ Timeout string
+
+ // The time zone (or location), e.g. Europe/Oslo, used to parse front matter dates without such information and in the time function.
+ TimeZone string
+
+ // Set titleCaseStyle to specify the title style used by the title template function and the automatic section titles in Hugo.
+ // It defaults to AP Stylebook for title casing, but you can also set it to Chicago or Go (every word starts with a capital letter).
+ TitleCaseStyle string
+
+ // The editor used for opening up new content.
+ NewContentEditor string
+
+ // Don't sync modification time of files for the static mounts.
+ NoTimes bool
+
+	// Don't sync permission mode of files for the static mounts.
+ NoChmod bool
+
+ // Clean the destination folder before a new build.
+ // This currently only handles static files.
+ CleanDestinationDir bool
+
+ // A Glob pattern of module paths to ignore in the _vendor folder.
+ IgnoreVendorPaths string
+
+ config.CommonDirs `mapstructure:",squash"`
+
+ // The odd constructs below are kept for backwards compatibility.
+ // Deprecated: Use module mount config instead.
+ StaticDir []string
+ // Deprecated: Use module mount config instead.
+ StaticDir0 []string
+ // Deprecated: Use module mount config instead.
+ StaticDir1 []string
+ // Deprecated: Use module mount config instead.
+ StaticDir2 []string
+ // Deprecated: Use module mount config instead.
+ StaticDir3 []string
+ // Deprecated: Use module mount config instead.
+ StaticDir4 []string
+ // Deprecated: Use module mount config instead.
+ StaticDir5 []string
+ // Deprecated: Use module mount config instead.
+ StaticDir6 []string
+ // Deprecated: Use module mount config instead.
+ StaticDir7 []string
+ // Deprecated: Use module mount config instead.
+ StaticDir8 []string
+ // Deprecated: Use module mount config instead.
+ StaticDir9 []string
+ // Deprecated: Use module mount config instead.
+ StaticDir10 []string
+}
+
+func (c RootConfig) staticDirs() []string {
+ var dirs []string
+ dirs = append(dirs, c.StaticDir...)
+ dirs = append(dirs, c.StaticDir0...)
+ dirs = append(dirs, c.StaticDir1...)
+ dirs = append(dirs, c.StaticDir2...)
+ dirs = append(dirs, c.StaticDir3...)
+ dirs = append(dirs, c.StaticDir4...)
+ dirs = append(dirs, c.StaticDir5...)
+ dirs = append(dirs, c.StaticDir6...)
+ dirs = append(dirs, c.StaticDir7...)
+ dirs = append(dirs, c.StaticDir8...)
+ dirs = append(dirs, c.StaticDir9...)
+ dirs = append(dirs, c.StaticDir10...)
+ return helpers.UniqueStringsReuse(dirs)
+}
+
+type Configs struct {
+ Base *Config
+ LoadingInfo config.LoadConfigResult
+ LanguageConfigMap map[string]*Config
+ LanguageConfigSlice []*Config
+
+ IsMultihost bool
+
+ Modules modules.Modules
+ ModulesClient *modules.Client
+
+ // All below is set in Init.
+ Languages langs.Languages
+ LanguagesDefaultFirst langs.Languages
+ ContentPathParser *paths.PathParser
+
+ configLangs []config.AllProvider
+}
+
+func (c *Configs) Validate(logger loggers.Logger) error {
+ c.Base.Cascade.Config.Range(func(p page.PageMatcher, cfg page.PageMatcherParamsConfig) bool {
+ page.CheckCascadePattern(logger, p)
+ return true
+ })
+ return nil
+}
+
+// transientErr returns the last transient error found during config compilation.
+func (c *Configs) transientErr() error {
+ for _, l := range c.LanguageConfigMap {
+ if l.C.transientErr != nil {
+ return l.C.transientErr
+ }
+ }
+ return nil
+}
+
+func (c *Configs) IsZero() bool {
+ // A config always has at least one language.
+ return c == nil || len(c.Languages) == 0
+}
+
+func (c *Configs) Init() error {
+ var languages langs.Languages
+
+ var langKeys []string
+ var hasEn bool
+
+ const en = "en"
+
+ for k := range c.LanguageConfigMap {
+ langKeys = append(langKeys, k)
+ if k == en {
+ hasEn = true
+ }
+ }
+
+ // Sort the LanguageConfigSlice by language weight (if set) or lang.
+ sort.Slice(langKeys, func(i, j int) bool {
+ ki := langKeys[i]
+ kj := langKeys[j]
+ lki := c.LanguageConfigMap[ki]
+ lkj := c.LanguageConfigMap[kj]
+ li := lki.Languages[ki]
+ lj := lkj.Languages[kj]
+ if li.Weight != lj.Weight {
+ return li.Weight < lj.Weight
+ }
+ return ki < kj
+ })
+
+ // See issue #13646.
+ defaultConfigLanguageFallback := en
+ if !hasEn {
+ // Pick the first one.
+ defaultConfigLanguageFallback = langKeys[0]
+ }
+
+ if c.Base.DefaultContentLanguage == "" {
+ c.Base.DefaultContentLanguage = defaultConfigLanguageFallback
+ }
+
+ for _, k := range langKeys {
+ v := c.LanguageConfigMap[k]
+ if v.DefaultContentLanguage == "" {
+ v.DefaultContentLanguage = defaultConfigLanguageFallback
+ }
+ c.LanguageConfigSlice = append(c.LanguageConfigSlice, v)
+ languageConf := v.Languages[k]
+ language, err := langs.NewLanguage(k, c.Base.DefaultContentLanguage, v.TimeZone, languageConf)
+ if err != nil {
+ return err
+ }
+ languages = append(languages, language)
+ }
+
+ // Filter out disabled languages.
+ var n int
+ for _, l := range languages {
+ if !l.Disabled {
+ languages[n] = l
+ n++
+ }
+ }
+ languages = languages[:n]
+
+ var languagesDefaultFirst langs.Languages
+ for _, l := range languages {
+ if l.Lang == c.Base.DefaultContentLanguage {
+ languagesDefaultFirst = append(languagesDefaultFirst, l)
+ }
+ }
+ for _, l := range languages {
+ if l.Lang != c.Base.DefaultContentLanguage {
+ languagesDefaultFirst = append(languagesDefaultFirst, l)
+ }
+ }
+
+ c.Languages = languages
+ c.LanguagesDefaultFirst = languagesDefaultFirst
+
+ c.ContentPathParser = &paths.PathParser{
+ LanguageIndex: languagesDefaultFirst.AsIndexSet(),
+ IsLangDisabled: c.Base.IsLangDisabled,
+ IsContentExt: c.Base.ContentTypes.Config.IsContentSuffix,
+ IsOutputFormat: func(name, ext string) bool {
+ if name == "" {
+ return false
+ }
+
+ if of, ok := c.Base.OutputFormats.Config.GetByName(name); ok {
+ if ext != "" && !of.MediaType.HasSuffix(ext) {
+ return false
+ }
+ return true
+ }
+ return false
+ },
+ }
+
+ c.configLangs = make([]config.AllProvider, len(c.Languages))
+ for i, l := range c.LanguagesDefaultFirst {
+ c.configLangs[i] = ConfigLanguage{
+ m: c,
+ config: c.LanguageConfigMap[l.Lang],
+ baseConfig: c.LoadingInfo.BaseConfig,
+ language: l,
+ }
+ }
+
+ if len(c.Modules) == 0 {
+ return errors.New("no modules loaded (need at least the main module)")
+ }
+
+ // Apply default project mounts.
+ if err := modules.ApplyProjectConfigDefaults(c.Modules[0], c.configLangs...); err != nil {
+ return err
+ }
+
+ // We should consolidate this, but to get a full view of the mounts in e.g. "hugo config" we need to
+ // transfer any default mounts added above to the config used to print the config.
+ for _, m := range c.Modules[0].Mounts() {
+ var found bool
+ for _, cm := range c.Base.Module.Mounts {
+ if cm.Source == m.Source && cm.Target == m.Target && cm.Lang == m.Lang {
+ found = true
+ break
+ }
+ }
+ if !found {
+ c.Base.Module.Mounts = append(c.Base.Module.Mounts, m)
+ }
+ }
+
+ // Transfer the changed mounts to the language versions (all share the same mount set, but can be displayed in different languages).
+ for _, l := range c.LanguageConfigSlice {
+ l.Module.Mounts = c.Base.Module.Mounts
+ }
+
+ return nil
+}
+
+func (c Configs) ConfigLangs() []config.AllProvider {
+ return c.configLangs
+}
+
+func (c Configs) GetFirstLanguageConfig() config.AllProvider {
+ return c.configLangs[0]
+}
+
+func (c Configs) GetByLang(lang string) config.AllProvider {
+ for _, l := range c.configLangs {
+ if l.Language().Lang == lang {
+ return l
+ }
+ }
+ return nil
+}
+
+func newDefaultConfig() *Config {
+ return &Config{
+ Taxonomies: map[string]string{"tag": "tags", "category": "categories"},
+ Sitemap: config.SitemapConfig{Priority: -1, Filename: "sitemap.xml"},
+ RootConfig: RootConfig{
+ Environment: hugo.EnvironmentProduction,
+ TitleCaseStyle: "AP",
+ PluralizeListTitles: true,
+ CapitalizeListTitles: true,
+ StaticDir: []string{"static"},
+ SummaryLength: 70,
+ Timeout: "60s",
+
+ CommonDirs: config.CommonDirs{
+ ArcheTypeDir: "archetypes",
+ ContentDir: "content",
+ ResourceDir: "resources",
+ PublishDir: "public",
+ ThemesDir: "themes",
+ AssetDir: "assets",
+ LayoutDir: "layouts",
+ I18nDir: "i18n",
+ DataDir: "data",
+ },
+ },
+ }
+}
+
+// fromLoadConfigResult creates a new Config from res.
+func fromLoadConfigResult(fs afero.Fs, logger loggers.Logger, res config.LoadConfigResult) (*Configs, error) {
+ if !res.Cfg.IsSet("languages") {
+ // We need at least one
+ lang := res.Cfg.GetString("defaultContentLanguage")
+ if lang == "" {
+ lang = "en"
+ }
+ res.Cfg.Set("languages", maps.Params{lang: maps.Params{}})
+ }
+ bcfg := res.BaseConfig
+ cfg := res.Cfg
+
+ all := newDefaultConfig()
+
+ err := decodeConfigFromParams(fs, logger, bcfg, cfg, all, nil)
+ if err != nil {
+ return nil, err
+ }
+
+ langConfigMap := make(map[string]*Config)
+
+ languagesConfig := cfg.GetStringMap("languages")
+
+ var isMultihost bool
+
+ if err := all.CompileConfig(logger); err != nil {
+ return nil, err
+ }
+
+ for k, v := range languagesConfig {
+ mergedConfig := config.New()
+ var differentRootKeys []string
+ switch x := v.(type) {
+ case maps.Params:
+ _, found := x["params"]
+ if !found {
+ x["params"] = maps.Params{
+ maps.MergeStrategyKey: maps.ParamsMergeStrategyDeep,
+ }
+ }
+
+ for kk, vv := range x {
+ if kk == "_merge" {
+ continue
+ }
+ if kk == "baseurl" {
+					// A baseURL configured on the language level means a multihost setup.
+ isMultihost = true
+ }
+ mergedConfig.Set(kk, vv)
+ rootv := cfg.Get(kk)
+ if rootv != nil && cfg.IsSet(kk) {
+ // This overrides a root key and potentially needs a merge.
+ if !reflect.DeepEqual(rootv, vv) {
+ switch vvv := vv.(type) {
+ case maps.Params:
+ differentRootKeys = append(differentRootKeys, kk)
+
+ // Use the language value as base.
+ mergedConfigEntry := xmaps.Clone(vvv)
+ // Merge in the root value.
+ maps.MergeParams(mergedConfigEntry, rootv.(maps.Params))
+
+ mergedConfig.Set(kk, mergedConfigEntry)
+ default:
+ // Apply new values to the root.
+ differentRootKeys = append(differentRootKeys, "")
+ }
+ }
+ } else {
+ switch vv.(type) {
+ case maps.Params:
+ differentRootKeys = append(differentRootKeys, kk)
+ default:
+ // Apply new values to the root.
+ differentRootKeys = append(differentRootKeys, "")
+ }
+ }
+ }
+ differentRootKeys = helpers.UniqueStringsSorted(differentRootKeys)
+
+ if len(differentRootKeys) == 0 {
+ langConfigMap[k] = all
+ continue
+ }
+
+ // Create a copy of the complete config and replace the root keys with the language specific ones.
+ clone := all.cloneForLang()
+
+ if err := decodeConfigFromParams(fs, logger, bcfg, mergedConfig, clone, differentRootKeys); err != nil {
+ return nil, fmt.Errorf("failed to decode config for language %q: %w", k, err)
+ }
+ if err := clone.CompileConfig(logger); err != nil {
+ return nil, err
+ }
+
+ // Adjust Goldmark config defaults for multilingual, single-host sites.
+ if len(languagesConfig) > 1 && !isMultihost && !clone.Markup.Goldmark.DuplicateResourceFiles {
+ if !clone.Markup.Goldmark.DuplicateResourceFiles {
+ if clone.Markup.Goldmark.RenderHooks.Link.EnableDefault == nil {
+ clone.Markup.Goldmark.RenderHooks.Link.EnableDefault = types.NewBool(true)
+ }
+ if clone.Markup.Goldmark.RenderHooks.Image.EnableDefault == nil {
+ clone.Markup.Goldmark.RenderHooks.Image.EnableDefault = types.NewBool(true)
+ }
+ }
+ }
+
+ langConfigMap[k] = clone
+ case maps.ParamsMergeStrategy:
+ default:
+ panic(fmt.Sprintf("unknown type in languages config: %T", v))
+
+ }
+ }
+
+ bcfg.PublishDir = all.PublishDir
+ res.BaseConfig = bcfg
+ all.CommonDirs.CacheDir = bcfg.CacheDir
+ for _, l := range langConfigMap {
+ l.CommonDirs.CacheDir = bcfg.CacheDir
+ }
+
+ cm := &Configs{
+ Base: all,
+ LanguageConfigMap: langConfigMap,
+ LoadingInfo: res,
+ IsMultihost: isMultihost,
+ }
+
+ return cm, nil
+}
+
+func decodeConfigFromParams(fs afero.Fs, logger loggers.Logger, bcfg config.BaseConfig, p config.Provider, target *Config, keys []string) error {
+ var decoderSetups []decodeWeight
+
+ if len(keys) == 0 {
+ for _, v := range allDecoderSetups {
+ decoderSetups = append(decoderSetups, v)
+ }
+ } else {
+ for _, key := range keys {
+ if v, found := allDecoderSetups[key]; found {
+ decoderSetups = append(decoderSetups, v)
+ } else {
+ logger.Warnf("Skip unknown config key %q", key)
+ }
+ }
+ }
+
+ // Sort them to get the dependency order right.
+ sort.Slice(decoderSetups, func(i, j int) bool {
+ ki, kj := decoderSetups[i], decoderSetups[j]
+ if ki.weight == kj.weight {
+ return ki.key < kj.key
+ }
+ return ki.weight < kj.weight
+ })
+
+ for _, v := range decoderSetups {
+ p := decodeConfig{p: p, c: target, fs: fs, bcfg: bcfg}
+ if err := v.decode(v, p); err != nil {
+ return fmt.Errorf("failed to decode %q: %w", v.key, err)
+ }
+ }
+
+ return nil
+}
+
+func createDefaultOutputFormats(allFormats output.Formats) map[string][]string {
+ if len(allFormats) == 0 {
+ panic("no output formats")
+ }
+ rssOut, rssFound := allFormats.GetByName(output.RSSFormat.Name)
+ htmlOut, _ := allFormats.GetByName(output.HTMLFormat.Name)
+
+ defaultListTypes := []string{htmlOut.Name}
+ if rssFound {
+ defaultListTypes = append(defaultListTypes, rssOut.Name)
+ }
+
+ m := map[string][]string{
+ kinds.KindPage: {htmlOut.Name},
+ kinds.KindHome: defaultListTypes,
+ kinds.KindSection: defaultListTypes,
+ kinds.KindTerm: defaultListTypes,
+ kinds.KindTaxonomy: defaultListTypes,
+ }
+
+ // May be disabled
+ if rssFound {
+ m["rss"] = []string{rssOut.Name}
+ }
+
+ return m
+}
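
Two details of CompileConfig above are easy to miss when reading the full file: a bare numeric timeout is interpreted as seconds, and uglyURLs may be either a single bool or a per-section map. The standalone sketch below mirrors that logic with illustrative values.

package main

import (
	"fmt"
	"strconv"
	"time"
)

// normalizeTimeout mirrors the timeout handling in CompileConfig: a bare
// number is treated as seconds, anything else must be a valid Go duration.
func normalizeTimeout(s string) (time.Duration, error) {
	if _, err := strconv.Atoi(s); err == nil {
		s += "s"
	}
	return time.ParseDuration(s)
}

// isUglyURL mirrors the uglyURLs handling: the setting is either a global
// bool or a map keyed by section name.
func isUglyURL(setting any, section string) bool {
	switch v := setting.(type) {
	case bool:
		return v
	case map[string]bool:
		return v[section]
	default:
		return false
	}
}

func main() {
	d, _ := normalizeTimeout("60")                              // interpreted as 60 seconds
	fmt.Println(d)                                              // 1m0s
	fmt.Println(isUglyURL(true, "posts"))                       // true
	fmt.Println(isUglyURL(map[string]bool{"posts": true}, "x")) // false
}
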
diff --git a/config/allconfig/allconfig_integration_test.go b/config/allconfig/allconfig_integration_test.go
new file mode 100644
index 000000000..8f6cacf84
--- /dev/null
+++ b/config/allconfig/allconfig_integration_test.go
@@ -0,0 +1,381 @@
+package allconfig_test
+
+import (
+ "path/filepath"
+ "testing"
+
+ qt "github.com/frankban/quicktest"
+ "github.com/gohugoio/hugo/common/hugo"
+ "github.com/gohugoio/hugo/config/allconfig"
+ "github.com/gohugoio/hugo/hugolib"
+ "github.com/gohugoio/hugo/media"
+)
+
+func TestDirsMount(t *testing.T) {
+ files := `
+-- hugo.toml --
+baseURL = "https://example.com"
+disableKinds = ["taxonomy", "term"]
+[languages]
+[languages.en]
+weight = 1
+[languages.sv]
+weight = 2
+[[module.mounts]]
+source = 'content/en'
+target = 'content'
+lang = 'en'
+[[module.mounts]]
+source = 'content/sv'
+target = 'content'
+lang = 'sv'
+-- content/en/p1.md --
+---
+title: "p1"
+---
+-- content/sv/p1.md --
+---
+title: "p1"
+---
+-- layouts/_default/single.html --
+Title: {{ .Title }}
+ `
+
+ b := hugolib.NewIntegrationTestBuilder(
+ hugolib.IntegrationTestConfig{T: t, TxtarString: files},
+ ).Build()
+
+ // b.AssertFileContent("public/p1/index.html", "Title: p1")
+
+ sites := b.H.Sites
+ b.Assert(len(sites), qt.Equals, 2)
+
+ configs := b.H.Configs
+ mods := configs.Modules
+ b.Assert(len(mods), qt.Equals, 1)
+ mod := mods[0]
+ b.Assert(mod.Mounts(), qt.HasLen, 8)
+
+ enConcp := sites[0].Conf
+ enConf := enConcp.GetConfig().(*allconfig.Config)
+
+ b.Assert(enConcp.BaseURL().String(), qt.Equals, "https://example.com/")
+ modConf := enConf.Module
+ b.Assert(modConf.Mounts, qt.HasLen, 8)
+ b.Assert(modConf.Mounts[0].Source, qt.Equals, filepath.FromSlash("content/en"))
+ b.Assert(modConf.Mounts[0].Target, qt.Equals, "content")
+ b.Assert(modConf.Mounts[0].Lang, qt.Equals, "en")
+ b.Assert(modConf.Mounts[1].Source, qt.Equals, filepath.FromSlash("content/sv"))
+ b.Assert(modConf.Mounts[1].Target, qt.Equals, "content")
+ b.Assert(modConf.Mounts[1].Lang, qt.Equals, "sv")
+}
+
+func TestConfigAliases(t *testing.T) {
+ files := `
+-- hugo.toml --
+baseURL = "https://example.com"
+logI18nWarnings = true
+logPathWarnings = true
+`
+ b := hugolib.NewIntegrationTestBuilder(
+ hugolib.IntegrationTestConfig{T: t, TxtarString: files},
+ ).Build()
+
+ conf := b.H.Configs.Base
+
+ b.Assert(conf.PrintI18nWarnings, qt.Equals, true)
+ b.Assert(conf.PrintPathWarnings, qt.Equals, true)
+}
+
+func TestRedefineContentTypes(t *testing.T) {
+ files := `
+-- hugo.toml --
+baseURL = "https://example.com"
+[mediaTypes]
+[mediaTypes."text/html"]
+suffixes = ["html", "xhtml"]
+`
+
+ b := hugolib.Test(t, files)
+
+ conf := b.H.Configs.Base
+ contentTypes := conf.ContentTypes.Config
+
+ b.Assert(contentTypes.HTML.Suffixes(), qt.DeepEquals, []string{"html", "xhtml"})
+ b.Assert(contentTypes.Markdown.Suffixes(), qt.DeepEquals, []string{"md", "mdown", "markdown"})
+}
+
+func TestPaginationConfig(t *testing.T) {
+ files := `
+-- hugo.toml --
+ [languages.en]
+ weight = 1
+ [languages.en.pagination]
+ pagerSize = 20
+ [languages.de]
+ weight = 2
+ [languages.de.pagination]
+ path = "page-de"
+
+`
+
+ b := hugolib.Test(t, files)
+
+ confEn := b.H.Sites[0].Conf.Pagination()
+ confDe := b.H.Sites[1].Conf.Pagination()
+
+ b.Assert(confEn.Path, qt.Equals, "page")
+ b.Assert(confEn.PagerSize, qt.Equals, 20)
+ b.Assert(confDe.Path, qt.Equals, "page-de")
+ b.Assert(confDe.PagerSize, qt.Equals, 10)
+}
+
+func TestPaginationConfigDisableAliases(t *testing.T) {
+ files := `
+-- hugo.toml --
+disableKinds = ["taxonomy", "term"]
+[pagination]
+disableAliases = true
+pagerSize = 2
+-- layouts/_default/list.html --
+{{ $paginator := .Paginate site.RegularPages }}
+{{ template "_internal/pagination.html" . }}
+{{ range $paginator.Pages }}
+ {{ .Title }}
+{{ end }}
+-- content/p1.md --
+---
+title: "p1"
+---
+-- content/p2.md --
+---
+title: "p2"
+---
+-- content/p3.md --
+---
+title: "p3"
+---
+`
+
+ b := hugolib.Test(t, files)
+
+ b.AssertFileExists("public/page/1/index.html", false)
+ b.AssertFileContent("public/page/2/index.html", "pagination-default")
+}
+
+func TestMapUglyURLs(t *testing.T) {
+ files := `
+-- hugo.toml --
+[uglyurls]
+ posts = true
+`
+
+ b := hugolib.Test(t, files)
+
+ c := b.H.Configs.Base
+
+ b.Assert(c.C.IsUglyURLSection("posts"), qt.IsTrue)
+ b.Assert(c.C.IsUglyURLSection("blog"), qt.IsFalse)
+}
+
+// Issue 13199
+func TestInvalidOutputFormat(t *testing.T) {
+ t.Parallel()
+
+ files := `
+-- hugo.toml --
+disableKinds = ['page','rss','section','sitemap','taxonomy','term']
+[outputs]
+home = ['html','foo']
+-- layouts/index.html --
+x
+`
+
+ b, err := hugolib.TestE(t, files)
+ b.Assert(err, qt.IsNotNil)
+ b.Assert(err.Error(), qt.Contains, `failed to create config: unknown output format "foo" for kind "home"`)
+}
+
+// Issue 13201
+func TestLanguageConfigSlice(t *testing.T) {
+ t.Parallel()
+
+ files := `
+-- hugo.toml --
+disableKinds = ['page','rss','section','sitemap','taxonomy','term']
+[languages.en]
+title = 'TITLE_EN'
+weight = 2
+[languages.de]
+title = 'TITLE_DE'
+weight = 1
+[languages.fr]
+title = 'TITLE_FR'
+weight = 3
+`
+
+ b := hugolib.Test(t, files)
+ b.Assert(b.H.Configs.LanguageConfigSlice[0].Title, qt.Equals, `TITLE_DE`)
+}
+
+func TestContentTypesDefault(t *testing.T) {
+ files := `
+-- hugo.toml --
+baseURL = "https://example.com"
+
+
+`
+
+ b := hugolib.Test(t, files)
+
+ ct := b.H.Configs.Base.ContentTypes
+ c := ct.Config
+ s := ct.SourceStructure.(map[string]media.ContentTypeConfig)
+
+ b.Assert(c.IsContentFile("foo.md"), qt.Equals, true)
+ b.Assert(len(s), qt.Equals, 6)
+}
+
+func TestMergeDeep(t *testing.T) {
+ t.Parallel()
+
+ files := `
+-- hugo.toml --
+baseURL = "https://example.com"
+theme = ["theme1", "theme2"]
+_merge = "deep"
+-- themes/theme1/hugo.toml --
+[sitemap]
+filename = 'mysitemap.xml'
+[services]
+[services.googleAnalytics]
+id = 'foo bar'
+[taxonomies]
+ foo = 'bars'
+-- themes/theme2/config/_default/hugo.toml --
+[taxonomies]
+ bar = 'baz'
+-- layouts/home.html --
+GA ID: {{ site.Config.Services.GoogleAnalytics.ID }}.
+
+`
+
+ b := hugolib.Test(t, files)
+
+ conf := b.H.Configs
+ base := conf.Base
+
+ b.Assert(base.Environment, qt.Equals, hugo.EnvironmentProduction)
+ b.Assert(base.BaseURL, qt.Equals, "https://example.com")
+ b.Assert(base.Sitemap.Filename, qt.Equals, "mysitemap.xml")
+ b.Assert(base.Taxonomies, qt.DeepEquals, map[string]string{"bar": "baz", "foo": "bars"})
+
+ b.AssertFileContent("public/index.html", "GA ID: foo bar.")
+}
+
+func TestMergeDeepBuildStats(t *testing.T) {
+ t.Parallel()
+
+ files := `
+-- hugo.toml --
+baseURL = "https://example.com"
+title = "Theme 1"
+_merge = "deep"
+[module]
+[module.hugoVersion]
+[[module.imports]]
+path = "theme1"
+-- themes/theme1/hugo.toml --
+[build]
+[build.buildStats]
+disableIDs = true
+enable = true
+-- layouts/home.html --
+Home.
+
+`
+
+ b := hugolib.Test(t, files, hugolib.TestOptOsFs())
+
+ conf := b.H.Configs
+ base := conf.Base
+
+ b.Assert(base.Title, qt.Equals, "Theme 1")
+ b.Assert(len(base.Module.Imports), qt.Equals, 1)
+ b.Assert(base.Build.BuildStats.Enable, qt.Equals, true)
+ b.AssertFileExists("/hugo_stats.json", true)
+}
+
+func TestMergeDeepBuildStatsTheme(t *testing.T) {
+ t.Parallel()
+
+ files := `
+-- hugo.toml --
+baseURL = "https://example.com"
+_merge = "deep"
+theme = ["theme1"]
+-- themes/theme1/hugo.toml --
+title = "Theme 1"
+[build]
+[build.buildStats]
+disableIDs = true
+enable = true
+-- layouts/home.html --
+Home.
+
+`
+
+ b := hugolib.Test(t, files, hugolib.TestOptOsFs())
+
+ conf := b.H.Configs
+ base := conf.Base
+
+ b.Assert(base.Title, qt.Equals, "Theme 1")
+ b.Assert(len(base.Module.Imports), qt.Equals, 1)
+ b.Assert(base.Build.BuildStats.Enable, qt.Equals, true)
+ b.AssertFileExists("/hugo_stats.json", true)
+}
+
+func TestDefaultConfigLanguageBlankWhenNoEnglishExists(t *testing.T) {
+ t.Parallel()
+
+ files := `
+-- hugo.toml --
+baseURL = "https://example.com"
+[languages]
+[languages.nn]
+weight = 20
+[languages.sv]
+weight = 10
+[languages.sv.taxonomies]
+ tag = "taggar"
+-- layouts/all.html --
+All.
+`
+
+ b := hugolib.Test(t, files)
+
+ b.Assert(b.H.Conf.DefaultContentLanguage(), qt.Equals, "sv")
+}
+
+func TestDefaultConfigEnvDisableLanguagesIssue13707(t *testing.T) {
+ t.Parallel()
+
+ files := `
+-- hugo.toml --
+disableLanguages = []
+[languages]
+[languages.en]
+weight = 1
+[languages.nn]
+weight = 2
+[languages.sv]
+weight = 3
+`
+
+ b := hugolib.Test(t, files, hugolib.TestOptWithConfig(func(conf *hugolib.IntegrationTestConfig) {
+ conf.Environ = []string{`HUGO_DISABLELANGUAGES=sv nn`}
+ }))
+
+ b.Assert(len(b.H.Sites), qt.Equals, 1)
+}
diff --git a/config/allconfig/alldecoders.go b/config/allconfig/alldecoders.go
new file mode 100644
index 000000000..035349790
--- /dev/null
+++ b/config/allconfig/alldecoders.go
@@ -0,0 +1,469 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package allconfig
+
+import (
+ "fmt"
+ "strings"
+
+ "github.com/gohugoio/hugo/cache/filecache"
+
+ "github.com/gohugoio/hugo/cache/httpcache"
+ "github.com/gohugoio/hugo/common/maps"
+ "github.com/gohugoio/hugo/common/types"
+ "github.com/gohugoio/hugo/config"
+ "github.com/gohugoio/hugo/config/privacy"
+ "github.com/gohugoio/hugo/config/security"
+ "github.com/gohugoio/hugo/config/services"
+ "github.com/gohugoio/hugo/deploy/deployconfig"
+ "github.com/gohugoio/hugo/hugolib/segments"
+ "github.com/gohugoio/hugo/langs"
+ "github.com/gohugoio/hugo/markup/markup_config"
+ "github.com/gohugoio/hugo/media"
+ "github.com/gohugoio/hugo/minifiers"
+ "github.com/gohugoio/hugo/modules"
+
+ "github.com/gohugoio/hugo/navigation"
+ "github.com/gohugoio/hugo/output"
+ "github.com/gohugoio/hugo/related"
+ "github.com/gohugoio/hugo/resources/images"
+ "github.com/gohugoio/hugo/resources/page"
+ "github.com/gohugoio/hugo/resources/page/pagemeta"
+ "github.com/mitchellh/mapstructure"
+ "github.com/spf13/afero"
+ "github.com/spf13/cast"
+)
+
+type decodeConfig struct {
+ p config.Provider
+ c *Config
+ fs afero.Fs
+ bcfg config.BaseConfig
+}
+
+type decodeWeight struct {
+ key string
+ decode func(decodeWeight, decodeConfig) error
+ getCompiler func(c *Config) configCompiler
+ weight int
+ internalOrDeprecated bool // Hide it from the docs.
+}
+
+var allDecoderSetups = map[string]decodeWeight{
+ "": {
+ key: "",
+ weight: -100, // Always first.
+ decode: func(d decodeWeight, p decodeConfig) error {
+ if err := mapstructure.WeakDecode(p.p.Get(""), &p.c.RootConfig); err != nil {
+ return err
+ }
+
+			// This needs to match Lang, which is always lower case.
+ p.c.RootConfig.DefaultContentLanguage = strings.ToLower(p.c.RootConfig.DefaultContentLanguage)
+
+ return nil
+ },
+ },
+ "imaging": {
+ key: "imaging",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ var err error
+ p.c.Imaging, err = images.DecodeConfig(p.p.GetStringMap(d.key))
+ return err
+ },
+ },
+ "caches": {
+ key: "caches",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ var err error
+ p.c.Caches, err = filecache.DecodeConfig(p.fs, p.bcfg, p.p.GetStringMap(d.key))
+ if p.c.IgnoreCache {
+ // Set MaxAge in all caches to 0.
+ for k, cache := range p.c.Caches {
+ cache.MaxAge = 0
+ p.c.Caches[k] = cache
+ }
+ }
+ return err
+ },
+ },
+ "httpcache": {
+ key: "httpcache",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ var err error
+ p.c.HTTPCache, err = httpcache.DecodeConfig(p.bcfg, p.p.GetStringMap(d.key))
+ if p.c.IgnoreCache {
+ p.c.HTTPCache.Cache.For.Excludes = []string{"**"}
+ p.c.HTTPCache.Cache.For.Includes = []string{}
+ }
+ return err
+ },
+ },
+ "build": {
+ key: "build",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ p.c.Build = config.DecodeBuildConfig(p.p)
+ return nil
+ },
+ getCompiler: func(c *Config) configCompiler {
+ return &c.Build
+ },
+ },
+ "frontmatter": {
+ key: "frontmatter",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ var err error
+ p.c.Frontmatter, err = pagemeta.DecodeFrontMatterConfig(p.p)
+ return err
+ },
+ },
+ "markup": {
+ key: "markup",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ var err error
+ p.c.Markup, err = markup_config.Decode(p.p)
+ return err
+ },
+ },
+ "segments": {
+ key: "segments",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ var err error
+ p.c.Segments, err = segments.DecodeSegments(p.p.GetStringMap(d.key))
+ return err
+ },
+ },
+ "server": {
+ key: "server",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ var err error
+ p.c.Server, err = config.DecodeServer(p.p)
+ return err
+ },
+ getCompiler: func(c *Config) configCompiler {
+ return &c.Server
+ },
+ },
+ "minify": {
+ key: "minify",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ var err error
+ p.c.Minify, err = minifiers.DecodeConfig(p.p.Get(d.key))
+ return err
+ },
+ },
+ "contenttypes": {
+ key: "contenttypes",
+ weight: 100, // This needs to be decoded after media types.
+ decode: func(d decodeWeight, p decodeConfig) error {
+ var err error
+ p.c.ContentTypes, err = media.DecodeContentTypes(p.p.GetStringMap(d.key), p.c.MediaTypes.Config)
+ return err
+ },
+ },
+ "mediatypes": {
+ key: "mediatypes",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ var err error
+ p.c.MediaTypes, err = media.DecodeTypes(p.p.GetStringMap(d.key))
+ return err
+ },
+ },
+ "outputs": {
+ key: "outputs",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ defaults := createDefaultOutputFormats(p.c.OutputFormats.Config)
+ m := maps.CleanConfigStringMap(p.p.GetStringMap("outputs"))
+ p.c.Outputs = make(map[string][]string)
+ for k, v := range m {
+ s := types.ToStringSlicePreserveString(v)
+ for i, v := range s {
+ s[i] = strings.ToLower(v)
+ }
+ p.c.Outputs[k] = s
+ }
+ // Apply defaults.
+ for k, v := range defaults {
+ if _, found := p.c.Outputs[k]; !found {
+ p.c.Outputs[k] = v
+ }
+ }
+ return nil
+ },
+ },
+ "outputformats": {
+ key: "outputformats",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ var err error
+ p.c.OutputFormats, err = output.DecodeConfig(p.c.MediaTypes.Config, p.p.Get(d.key))
+ return err
+ },
+ },
+ "params": {
+ key: "params",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ p.c.Params = maps.CleanConfigStringMap(p.p.GetStringMap("params"))
+ if p.c.Params == nil {
+ p.c.Params = make(map[string]any)
+ }
+
+ // Before Hugo 0.112.0 this was configured via site Params.
+ if mainSections, found := p.c.Params["mainsections"]; found {
+ p.c.MainSections = types.ToStringSlicePreserveString(mainSections)
+ if p.c.MainSections == nil {
+ p.c.MainSections = []string{}
+ }
+ }
+
+ return nil
+ },
+ },
+ "module": {
+ key: "module",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ var err error
+ p.c.Module, err = modules.DecodeConfig(p.p)
+ return err
+ },
+ },
+ "permalinks": {
+ key: "permalinks",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ var err error
+ p.c.Permalinks, err = page.DecodePermalinksConfig(p.p.GetStringMap(d.key))
+ return err
+ },
+ },
+ "sitemap": {
+ key: "sitemap",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ var err error
+ if p.p.IsSet(d.key) {
+ p.c.Sitemap, err = config.DecodeSitemap(p.c.Sitemap, p.p.GetStringMap(d.key))
+ }
+ return err
+ },
+ },
+ "taxonomies": {
+ key: "taxonomies",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ if p.p.IsSet(d.key) {
+ p.c.Taxonomies = maps.CleanConfigStringMapString(p.p.GetStringMapString(d.key))
+ }
+ return nil
+ },
+ },
+ "related": {
+ key: "related",
+ weight: 100, // This needs to be decoded after taxonomies.
+ decode: func(d decodeWeight, p decodeConfig) error {
+ if p.p.IsSet(d.key) {
+ var err error
+ p.c.Related, err = related.DecodeConfig(p.p.GetParams(d.key))
+ if err != nil {
+ return fmt.Errorf("failed to decode related config: %w", err)
+ }
+ } else {
+ p.c.Related = related.DefaultConfig
+ if _, found := p.c.Taxonomies["tag"]; found {
+ p.c.Related.Add(related.IndexConfig{Name: "tags", Weight: 80, Type: related.TypeBasic})
+ }
+ }
+ return nil
+ },
+ },
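+ // For a single-language site it is common (illustrative example) to have:
+ //
+ //	languageCode = "en-us"
+ //	[languages.en]
+ //	  title = "My Site"
+ //
+ // The decoder below copies the top level languageCode into that language
+ // if it isn't set there.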
+ "languages": {
+ key: "languages",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ var err error
+ m := p.p.GetStringMap(d.key)
+ if len(m) == 1 {
+ // In v0.112.4 we moved this to the language config, but it's very common for mono-language sites to have this at the top level.
+ var first maps.Params
+ var ok bool
+ for _, v := range m {
+ first, ok = v.(maps.Params)
+ if ok {
+ break
+ }
+ }
+ if first != nil {
+ if _, found := first["languagecode"]; !found {
+ first["languagecode"] = p.p.GetString("languagecode")
+ }
+ }
+ }
+ p.c.Languages, err = langs.DecodeConfig(m)
+ if err != nil {
+ return err
+ }
+
+ // Validate defaultContentLanguage.
+ if p.c.DefaultContentLanguage != "" {
+ var found bool
+ for lang := range p.c.Languages {
+ if lang == p.c.DefaultContentLanguage {
+ found = true
+ break
+ }
+ }
+ if !found {
+ return fmt.Errorf("config value %q for defaultContentLanguage does not match any language definition", p.c.DefaultContentLanguage)
+ }
+ }
+
+ return nil
+ },
+ },
+ "cascade": {
+ key: "cascade",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ var err error
+ p.c.Cascade, err = page.DecodeCascadeConfig(nil, true, p.p.Get(d.key))
+ return err
+ },
+ },
+ "menus": {
+ key: "menus",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ var err error
+ p.c.Menus, err = navigation.DecodeConfig(p.p.Get(d.key))
+ return err
+ },
+ },
+ "page": {
+ key: "page",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ p.c.Page = config.PageConfig{
+ NextPrevSortOrder: "desc",
+ NextPrevInSectionSortOrder: "desc",
+ }
+ if p.p.IsSet(d.key) {
+ if err := mapstructure.WeakDecode(p.p.Get(d.key), &p.c.Page); err != nil {
+ return err
+ }
+ }
+
+ return nil
+ },
+ getCompiler: func(c *Config) configCompiler {
+ return &c.Page
+ },
+ },
+ "pagination": {
+ key: "pagination",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ p.c.Pagination = config.Pagination{
+ PagerSize: 10,
+ Path: "page",
+ }
+ if p.p.IsSet(d.key) {
+ if err := mapstructure.WeakDecode(p.p.Get(d.key), &p.c.Pagination); err != nil {
+ return err
+ }
+ }
+
+ return nil
+ },
+ },
+ "privacy": {
+ key: "privacy",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ var err error
+ p.c.Privacy, err = privacy.DecodeConfig(p.p)
+ return err
+ },
+ },
+ "security": {
+ key: "security",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ var err error
+ p.c.Security, err = security.DecodeConfig(p.p)
+ return err
+ },
+ },
+ "services": {
+ key: "services",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ var err error
+ p.c.Services, err = services.DecodeConfig(p.p)
+ return err
+ },
+ },
+ "deployment": {
+ key: "deployment",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ var err error
+ p.c.Deployment, err = deployconfig.DecodeConfig(p.p)
+ return err
+ },
+ },
+ "author": {
+ key: "author",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ p.c.Author = maps.CleanConfigStringMap(p.p.GetStringMap(d.key))
+ return nil
+ },
+ internalOrDeprecated: true,
+ },
+ "social": {
+ key: "social",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ p.c.Social = maps.CleanConfigStringMapString(p.p.GetStringMapString(d.key))
+ return nil
+ },
+ internalOrDeprecated: true,
+ },
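+ // uglyURLs accepts either a single bool or a map of section => bool,
+ // e.g. (illustrative):
+ //
+ //	uglyURLs = true
+ //
+ // or
+ //
+ //	[uglyURLs]
+ //	  posts = true
+ //	  about = false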
+ "uglyurls": {
+ key: "uglyurls",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ v := p.p.Get(d.key)
+ switch vv := v.(type) {
+ case bool:
+ p.c.UglyURLs = vv
+ case string:
+ p.c.UglyURLs = vv == "true"
+ case maps.Params:
+ p.c.UglyURLs = cast.ToStringMapBool(maps.CleanConfigStringMap(vv))
+ default:
+ p.c.UglyURLs = cast.ToStringMapBool(v)
+ }
+ return nil
+ },
+ internalOrDeprecated: true,
+ },
+ "internal": {
+ key: "internal",
+ decode: func(d decodeWeight, p decodeConfig) error {
+ return mapstructure.WeakDecode(p.p.GetStringMap(d.key), &p.c.Internal)
+ },
+ internalOrDeprecated: true,
+ },
+}
+
+func init() {
+ for k, v := range allDecoderSetups {
+ // Verify that k and v.key are all lower case.
+ if k != strings.ToLower(k) {
+ panic(fmt.Sprintf("key %q is not lower case", k))
+ }
+ if v.key != strings.ToLower(v.key) {
+ panic(fmt.Sprintf("key %q is not lower case", v.key))
+ }
+
+ if k != v.key {
+ panic(fmt.Sprintf("key %q is not the same as the map key %q", k, v.key))
+ }
+ }
+}
diff --git a/config/allconfig/configlanguage.go b/config/allconfig/configlanguage.go
new file mode 100644
index 000000000..6990a3590
--- /dev/null
+++ b/config/allconfig/configlanguage.go
@@ -0,0 +1,261 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package allconfig
+
+import (
+ "time"
+
+ "github.com/gohugoio/hugo/common/paths"
+ "github.com/gohugoio/hugo/common/urls"
+ "github.com/gohugoio/hugo/config"
+ "github.com/gohugoio/hugo/identity"
+ "github.com/gohugoio/hugo/langs"
+)
+
+type ConfigLanguage struct {
+ config *Config
+ baseConfig config.BaseConfig
+
+ m *Configs
+ language *langs.Language
+}
+
+func (c ConfigLanguage) Language() *langs.Language {
+ return c.language
+}
+
+func (c ConfigLanguage) Languages() langs.Languages {
+ return c.m.Languages
+}
+
+func (c ConfigLanguage) LanguagesDefaultFirst() langs.Languages {
+ return c.m.LanguagesDefaultFirst
+}
+
+func (c ConfigLanguage) PathParser() *paths.PathParser {
+ return c.m.ContentPathParser
+}
+
+func (c ConfigLanguage) LanguagePrefix() string {
+ if c.DefaultContentLanguageInSubdir() && c.DefaultContentLanguage() == c.Language().Lang {
+ return c.Language().Lang
+ }
+
+ if !c.IsMultilingual() || c.DefaultContentLanguage() == c.Language().Lang {
+ return ""
+ }
+ return c.Language().Lang
+}
+
+func (c ConfigLanguage) BaseURL() urls.BaseURL {
+ return c.config.C.BaseURL
+}
+
+func (c ConfigLanguage) BaseURLLiveReload() urls.BaseURL {
+ return c.config.C.BaseURLLiveReload
+}
+
+func (c ConfigLanguage) Environment() string {
+ return c.config.Environment
+}
+
+func (c ConfigLanguage) IsMultihost() bool {
+ if len(c.m.Languages)-len(c.config.C.DisabledLanguages) <= 1 {
+ return false
+ }
+ return c.m.IsMultihost
+}
+
+func (c ConfigLanguage) FastRenderMode() bool {
+ return c.config.Internal.FastRenderMode
+}
+
+func (c ConfigLanguage) IsMultilingual() bool {
+ return len(c.m.Languages) > 1
+}
+
+func (c ConfigLanguage) TemplateMetrics() bool {
+ return c.config.TemplateMetrics
+}
+
+func (c ConfigLanguage) TemplateMetricsHints() bool {
+ return c.config.TemplateMetricsHints
+}
+
+func (c ConfigLanguage) IsLangDisabled(lang string) bool {
+ return c.config.C.DisabledLanguages[lang]
+}
+
+func (c ConfigLanguage) IgnoredLogs() map[string]bool {
+ return c.config.C.IgnoredLogs
+}
+
+func (c ConfigLanguage) NoBuildLock() bool {
+ return c.config.NoBuildLock
+}
+
+func (c ConfigLanguage) NewContentEditor() string {
+ return c.config.NewContentEditor
+}
+
+func (c ConfigLanguage) Timeout() time.Duration {
+ return c.config.C.Timeout
+}
+
+func (c ConfigLanguage) BaseConfig() config.BaseConfig {
+ return c.baseConfig
+}
+
+func (c ConfigLanguage) Dirs() config.CommonDirs {
+ return c.config.CommonDirs
+}
+
+func (c ConfigLanguage) DirsBase() config.CommonDirs {
+ return c.m.Base.CommonDirs
+}
+
+func (c ConfigLanguage) WorkingDir() string {
+ return c.m.Base.WorkingDir
+}
+
+func (c ConfigLanguage) Quiet() bool {
+ return c.m.Base.Internal.Quiet
+}
+
+func (c ConfigLanguage) Watching() bool {
+ return c.m.Base.Internal.Watch
+}
+
+func (c ConfigLanguage) NewIdentityManager(name string, opts ...identity.ManagerOption) identity.Manager {
+ if !c.Watching() {
+ return identity.NopManager
+ }
+ return identity.NewManager(name, opts...)
+}
+
+func (c ConfigLanguage) ContentTypes() config.ContentTypesProvider {
+ return c.config.ContentTypes.Config
+}
+
+// GetConfigSection is mostly used in tests. The switch statement isn't complete, but it covers what's in use.
+func (c ConfigLanguage) GetConfigSection(s string) any {
+ switch s {
+ case "security":
+ return c.config.Security
+ case "build":
+ return c.config.Build
+ case "frontmatter":
+ return c.config.Frontmatter
+ case "caches":
+ return c.config.Caches
+ case "markup":
+ return c.config.Markup
+ case "mediaTypes":
+ return c.config.MediaTypes.Config
+ case "outputFormats":
+ return c.config.OutputFormats.Config
+ case "permalinks":
+ return c.config.Permalinks
+ case "minify":
+ return c.config.Minify
+ case "allModules":
+ return c.m.Modules
+ case "deployment":
+ return c.config.Deployment
+ case "httpCacheCompiled":
+ return c.config.C.HTTPCache
+ default:
+ panic("not implemented: " + s)
+ }
+}
+
+func (c ConfigLanguage) GetConfig() any {
+ return c.config
+}
+
+func (c ConfigLanguage) CanonifyURLs() bool {
+ return c.config.CanonifyURLs
+}
+
+func (c ConfigLanguage) IsUglyURLs(section string) bool {
+ return c.config.C.IsUglyURLSection(section)
+}
+
+func (c ConfigLanguage) IgnoreFile(s string) bool {
+ return c.config.C.IgnoreFile(s)
+}
+
+func (c ConfigLanguage) DisablePathToLower() bool {
+ return c.config.DisablePathToLower
+}
+
+func (c ConfigLanguage) RemovePathAccents() bool {
+ return c.config.RemovePathAccents
+}
+
+func (c ConfigLanguage) DefaultContentLanguage() string {
+ return c.config.DefaultContentLanguage
+}
+
+func (c ConfigLanguage) DefaultContentLanguageInSubdir() bool {
+ return c.config.DefaultContentLanguageInSubdir
+}
+
+func (c ConfigLanguage) SummaryLength() int {
+ return c.config.SummaryLength
+}
+
+func (c ConfigLanguage) BuildExpired() bool {
+ return c.config.BuildExpired
+}
+
+func (c ConfigLanguage) BuildFuture() bool {
+ return c.config.BuildFuture
+}
+
+func (c ConfigLanguage) BuildDrafts() bool {
+ return c.config.BuildDrafts
+}
+
+func (c ConfigLanguage) Running() bool {
+ return c.config.Internal.Running
+}
+
+func (c ConfigLanguage) PrintUnusedTemplates() bool {
+ return c.config.PrintUnusedTemplates
+}
+
+func (c ConfigLanguage) EnableMissingTranslationPlaceholders() bool {
+ return c.config.EnableMissingTranslationPlaceholders
+}
+
+func (c ConfigLanguage) PrintI18nWarnings() bool {
+ return c.config.PrintI18nWarnings
+}
+
+func (c ConfigLanguage) CreateTitle(s string) string {
+ return c.config.C.CreateTitle(s)
+}
+
+func (c ConfigLanguage) Pagination() config.Pagination {
+ return c.config.Pagination
+}
+
+func (c ConfigLanguage) StaticDirs() []string {
+ return c.config.staticDirs()
+}
+
+func (c ConfigLanguage) EnableEmoji() bool {
+ return c.config.EnableEmoji
+}
diff --git a/config/allconfig/docshelper.go b/config/allconfig/docshelper.go
new file mode 100644
index 000000000..1a5fb6153
--- /dev/null
+++ b/config/allconfig/docshelper.go
@@ -0,0 +1,48 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package allconfig
+
+import (
+ "github.com/gohugoio/hugo/common/maps"
+ "github.com/gohugoio/hugo/config"
+ "github.com/gohugoio/hugo/docshelper"
+)
+
+// This is just a helper used to create some JSON used in the Hugo docs.
+func init() {
+ docsProvider := func() docshelper.DocProvider {
+ cfg := config.New()
+ for configRoot, v := range allDecoderSetups {
+ if v.internalOrDeprecated {
+ continue
+ }
+ cfg.Set(configRoot, make(maps.Params))
+ }
+ lang := maps.Params{
+ "en": maps.Params{
+ "menus": maps.Params{},
+ "params": maps.Params{},
+ },
+ }
+ cfg.Set("languages", lang)
+ cfg.SetDefaultMergeStrategy()
+
+ configHelpers := map[string]any{
+ "mergeStrategy": cfg.Get(""),
+ }
+ return docshelper.DocProvider{"config_helpers": configHelpers}
+ }
+
+ docshelper.AddDocProviderFunc(docsProvider)
+}
diff --git a/config/allconfig/load.go b/config/allconfig/load.go
new file mode 100644
index 000000000..4fb8bbaef
--- /dev/null
+++ b/config/allconfig/load.go
@@ -0,0 +1,544 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+// Package allconfig contains the full configuration for Hugo.
+package allconfig
+
+import (
+ "errors"
+ "fmt"
+ "os"
+ "path/filepath"
+ "strings"
+
+ "github.com/gobwas/glob"
+ "github.com/gohugoio/hugo/common/herrors"
+ "github.com/gohugoio/hugo/common/hexec"
+ "github.com/gohugoio/hugo/common/hugo"
+ "github.com/gohugoio/hugo/common/loggers"
+ "github.com/gohugoio/hugo/common/maps"
+ "github.com/gohugoio/hugo/common/paths"
+ "github.com/gohugoio/hugo/common/types"
+ "github.com/gohugoio/hugo/config"
+ "github.com/gohugoio/hugo/helpers"
+ hglob "github.com/gohugoio/hugo/hugofs/glob"
+ "github.com/gohugoio/hugo/modules"
+ "github.com/gohugoio/hugo/parser/metadecoders"
+ "github.com/spf13/afero"
+)
+
+//lint:ignore ST1005 end user message.
+var ErrNoConfigFile = errors.New("Unable to locate config file or config directory. Perhaps you need to create a new site.\n Run `hugo help new` for details.\n")
+
+func LoadConfig(d ConfigSourceDescriptor) (*Configs, error) {
+ if len(d.Environ) == 0 && !hugo.IsRunningAsTest() {
+ d.Environ = os.Environ()
+ }
+
+ if d.Logger == nil {
+ d.Logger = loggers.NewDefault()
+ }
+
+ l := &configLoader{ConfigSourceDescriptor: d, cfg: config.New()}
+ // Make sure we always do this, even in error situations,
+ // as we have commands (e.g. "hugo mod init") that will
+ // use a partial configuration to do their job.
+ defer l.deleteMergeStrategies()
+ res, _, err := l.loadConfigMain(d)
+ if err != nil {
+ return nil, fmt.Errorf("failed to load config: %w", err)
+ }
+
+ configs, err := fromLoadConfigResult(d.Fs, d.Logger, res)
+ if err != nil {
+ return nil, fmt.Errorf("failed to create config from result: %w", err)
+ }
+
+ moduleConfig, modulesClient, err := l.loadModules(configs, d.IgnoreModuleDoesNotExist)
+ if err != nil {
+ return nil, fmt.Errorf("failed to load modules: %w", err)
+ }
+
+ if len(l.ModulesConfigFiles) > 0 {
+ // Config merged in from modules.
+ // Re-read the config.
+ configs, err = fromLoadConfigResult(d.Fs, d.Logger, res)
+ if err != nil {
+ return nil, fmt.Errorf("failed to create config from modules config: %w", err)
+ }
+ if err := configs.transientErr(); err != nil {
+ return nil, fmt.Errorf("failed to create config from modules config: %w", err)
+ }
+ configs.LoadingInfo.ConfigFiles = append(configs.LoadingInfo.ConfigFiles, l.ModulesConfigFiles...)
+ } else if err := configs.transientErr(); err != nil {
+ return nil, fmt.Errorf("failed to create config: %w", err)
+ }
+
+ configs.Modules = moduleConfig.AllModules
+ configs.ModulesClient = modulesClient
+
+ if err := configs.Init(); err != nil {
+ return nil, fmt.Errorf("failed to init config: %w", err)
+ }
+
+ loggers.SetGlobalLogger(d.Logger)
+
+ return configs, nil
+}
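+
+// Minimal usage sketch (illustrative; mirrors the benchmark in load_test.go):
+//
+//	configs, err := LoadConfig(ConfigSourceDescriptor{
+//		Fs:       afero.NewOsFs(),
+//		Filename: "/my/project/hugo.toml",
+//	})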
+
+// ConfigSourceDescriptor describes where to find the config (e.g. config.toml etc.).
+type ConfigSourceDescriptor struct {
+ Fs afero.Fs
+ Logger loggers.Logger
+
+ // Config received from the command line.
+ // These will override any config file settings.
+ Flags config.Provider
+
+ // Path to the config file to use, e.g. /my/project/config.toml
+ Filename string
+
+ // The (optional) directory for additional configuration files.
+ ConfigDir string
+
+ // production, development
+ Environment string
+
+ // Defaults to os.Environ if not set.
+ Environ []string
+
+ // If set, the "module does not exist" error will be ignored.
+ IgnoreModuleDoesNotExist bool
+}
+
+func (d ConfigSourceDescriptor) configFilenames() []string {
+ if d.Filename == "" {
+ return nil
+ }
+ return strings.Split(d.Filename, ",")
+}
+
+type configLoader struct {
+ cfg config.Provider
+ BaseConfig config.BaseConfig
+ ConfigSourceDescriptor
+
+ // collected
+ ModulesConfig modules.ModulesConfig
+ ModulesConfigFiles []string
+}
+
+// Handle some legacy values.
+func (l configLoader) applyConfigAliases() error {
+ aliases := []types.KeyValueStr{
+ {Key: "indexes", Value: "taxonomies"},
+ {Key: "logI18nWarnings", Value: "printI18nWarnings"},
+ {Key: "logPathWarnings", Value: "printPathWarnings"},
+ {Key: "ignoreErrors", Value: "ignoreLogs"},
+ }
+
+ for _, alias := range aliases {
+ if l.cfg.IsSet(alias.Key) {
+ vv := l.cfg.Get(alias.Key)
+ l.cfg.Set(alias.Value, vv)
+ }
+ }
+
+ return nil
+}
+
+func (l configLoader) applyDefaultConfig() error {
+ defaultSettings := maps.Params{
+ // These dirs are used early/before we build the config struct.
+ "themesDir": "themes",
+ "configDir": "config",
+ }
+
+ l.cfg.SetDefaults(defaultSettings)
+
+ return nil
+}
+
+func (l configLoader) normalizeCfg(cfg config.Provider) error {
+ if b, ok := cfg.Get("minifyOutput").(bool); ok && b {
+ cfg.Set("minify.minifyOutput", true)
+ } else if b, ok := cfg.Get("minify").(bool); ok && b {
+ cfg.Set("minify", maps.Params{"minifyOutput": true})
+ }
+
+ return nil
+}
+
+func (l configLoader) cleanExternalConfig(cfg config.Provider) error {
+ if cfg.IsSet("internal") {
+ cfg.Set("internal", nil)
+ }
+ return nil
+}
+
+func (l configLoader) applyFlagsOverrides(cfg config.Provider) error {
+ for _, k := range cfg.Keys() {
+ l.cfg.Set(k, cfg.Get(k))
+ }
+ return nil
+}
+
+func (l configLoader) applyOsEnvOverrides(environ []string) error {
+ if len(environ) == 0 {
+ return nil
+ }
+
+ const delim = "__env__delim"
+
+ // Extract all environment variables that start with the HUGO prefix.
+ // The delimiter is the following rune, usually "_".
+ const hugoEnvPrefix = "HUGO"
+ var hugoEnv []types.KeyValueStr
+ for _, v := range environ {
+ key, val := config.SplitEnvVar(v)
+ if strings.HasPrefix(key, hugoEnvPrefix) {
+ delimiterAndKey := strings.TrimPrefix(key, hugoEnvPrefix)
+ if len(delimiterAndKey) < 2 {
+ continue
+ }
+ // Allow delimiters to be case sensitive.
+ // It turns out there aren't that many allowed special
+ // chars in environment variables when used in Bash and similar,
+ // so variables of the form HUGOxPARAMSxFOO=bar are one option.
+ key := strings.ReplaceAll(delimiterAndKey[1:], delimiterAndKey[:1], delim)
+ key = strings.ToLower(key)
+ hugoEnv = append(hugoEnv, types.KeyValueStr{
+ Key: key,
+ Value: val,
+ })
+
+ }
+ }
+
+ for _, env := range hugoEnv {
+ existing, nestedKey, owner, err := maps.GetNestedParamFn(env.Key, delim, l.cfg.Get)
+ if err != nil {
+ return err
+ }
+
+ if existing != nil {
+ val, err := metadecoders.Default.UnmarshalStringTo(env.Value, existing)
+ if err == nil {
+ val = l.envValToVal(env.Key, val)
+ if owner != nil {
+ owner[nestedKey] = val
+ } else {
+ l.cfg.Set(env.Key, val)
+ }
+ continue
+ }
+ }
+
+ if owner != nil && nestedKey != "" {
+ owner[nestedKey] = env.Value
+ } else {
+ var val any
+ key := strings.ReplaceAll(env.Key, delim, ".")
+ _, ok := allDecoderSetups[key]
+ if ok {
+ // A map.
+ if v, err := metadecoders.Default.UnmarshalStringTo(env.Value, map[string]any{}); err == nil {
+ val = v
+ }
+ }
+
+ if val == nil {
+ // A string.
+ val = l.envStringToVal(key, env.Value)
+ }
+ l.cfg.Set(key, val)
+ }
+
+ }
+
+ return nil
+}
+
+func (l *configLoader) envValToVal(k string, v any) any {
+ switch v := v.(type) {
+ case string:
+ return l.envStringToVal(k, v)
+ default:
+ return v
+ }
+}
+
+func (l *configLoader) envStringToVal(k, v string) any {
+ switch k {
+ case "disablekinds", "disablelanguages":
+ if strings.Contains(v, ",") {
+ return strings.Split(v, ",")
+ } else {
+ return strings.Fields(v)
+ }
+ default:
+ return v
+ }
+}
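+
+// Illustrative examples of how OS env overrides map to config keys:
+//
+//	HUGO_TITLE=Hello                  -> title = "Hello"
+//	HUGO_PARAMS_SUBTITLE=World        -> params.subtitle = "World"
+//	HUGOxPARAMSxSUBTITLE=World        -> params.subtitle = "World" (custom "x" delimiter)
+//	HUGO_DISABLEKINDS="taxonomy term" -> disableKinds = ["taxonomy", "term"]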
+
+func (l *configLoader) loadConfigMain(d ConfigSourceDescriptor) (config.LoadConfigResult, modules.ModulesConfig, error) {
+ var res config.LoadConfigResult
+
+ if d.Flags != nil {
+ if err := l.normalizeCfg(d.Flags); err != nil {
+ return res, l.ModulesConfig, err
+ }
+ }
+
+ if d.Fs == nil {
+ return res, l.ModulesConfig, errors.New("no filesystem provided")
+ }
+
+ if d.Flags != nil {
+ if err := l.applyFlagsOverrides(d.Flags); err != nil {
+ return res, l.ModulesConfig, err
+ }
+ workingDir := filepath.Clean(l.cfg.GetString("workingDir"))
+
+ l.BaseConfig = config.BaseConfig{
+ WorkingDir: workingDir,
+ ThemesDir: paths.AbsPathify(workingDir, l.cfg.GetString("themesDir")),
+ }
+
+ }
+
+ names := d.configFilenames()
+
+ if names != nil {
+ for _, name := range names {
+ var filename string
+ filename, err := l.loadConfig(name)
+ if err == nil {
+ res.ConfigFiles = append(res.ConfigFiles, filename)
+ } else if err != ErrNoConfigFile {
+ return res, l.ModulesConfig, l.wrapFileError(err, filename)
+ }
+ }
+ } else {
+ for _, name := range config.DefaultConfigNames {
+ var filename string
+ filename, err := l.loadConfig(name)
+ if err == nil {
+ res.ConfigFiles = append(res.ConfigFiles, filename)
+ break
+ } else if err != ErrNoConfigFile {
+ return res, l.ModulesConfig, l.wrapFileError(err, filename)
+ }
+ }
+ }
+
+ if d.ConfigDir != "" {
+ absConfigDir := paths.AbsPathify(l.BaseConfig.WorkingDir, d.ConfigDir)
+ dcfg, dirnames, err := config.LoadConfigFromDir(l.Fs, absConfigDir, l.Environment)
+ if err == nil {
+ if len(dirnames) > 0 {
+ if err := l.normalizeCfg(dcfg); err != nil {
+ return res, l.ModulesConfig, err
+ }
+ if err := l.cleanExternalConfig(dcfg); err != nil {
+ return res, l.ModulesConfig, err
+ }
+ l.cfg.Set("", dcfg.Get(""))
+ res.ConfigFiles = append(res.ConfigFiles, dirnames...)
+ }
+ } else if err != ErrNoConfigFile {
+ if len(dirnames) > 0 {
+ return res, l.ModulesConfig, l.wrapFileError(err, dirnames[0])
+ }
+ return res, l.ModulesConfig, err
+ }
+ }
+
+ res.Cfg = l.cfg
+
+ if err := l.applyDefaultConfig(); err != nil {
+ return res, l.ModulesConfig, err
+ }
+
+ // Some settings are used before we're done collecting all settings,
+ // so apply OS environment both before and after.
+ if err := l.applyOsEnvOverrides(d.Environ); err != nil {
+ return res, l.ModulesConfig, err
+ }
+
+ workingDir := filepath.Clean(l.cfg.GetString("workingDir"))
+
+ l.BaseConfig = config.BaseConfig{
+ WorkingDir: workingDir,
+ CacheDir: l.cfg.GetString("cacheDir"),
+ ThemesDir: paths.AbsPathify(workingDir, l.cfg.GetString("themesDir")),
+ }
+
+ var err error
+ l.BaseConfig.CacheDir, err = helpers.GetCacheDir(l.Fs, l.BaseConfig.CacheDir)
+ if err != nil {
+ return res, l.ModulesConfig, err
+ }
+
+ res.BaseConfig = l.BaseConfig
+
+ l.cfg.SetDefaultMergeStrategy()
+
+ res.ConfigFiles = append(res.ConfigFiles, l.ModulesConfigFiles...)
+
+ if d.Flags != nil {
+ if err := l.applyFlagsOverrides(d.Flags); err != nil {
+ return res, l.ModulesConfig, err
+ }
+ }
+
+ if err := l.applyOsEnvOverrides(d.Environ); err != nil {
+ return res, l.ModulesConfig, err
+ }
+
+ if err = l.applyConfigAliases(); err != nil {
+ return res, l.ModulesConfig, err
+ }
+
+ return res, l.ModulesConfig, err
+}
+
+func (l *configLoader) loadModules(configs *Configs, ignoreModuleDoesNotExist bool) (modules.ModulesConfig, *modules.Client, error) {
+ bcfg := configs.LoadingInfo.BaseConfig
+ conf := configs.Base
+ workingDir := bcfg.WorkingDir
+ themesDir := bcfg.ThemesDir
+ publishDir := bcfg.PublishDir
+
+ cfg := configs.LoadingInfo.Cfg
+
+ var ignoreVendor glob.Glob
+ if s := conf.IgnoreVendorPaths; s != "" {
+ ignoreVendor, _ = hglob.GetGlob(hglob.NormalizePath(s))
+ }
+
+ ex := hexec.New(conf.Security, workingDir, l.Logger)
+
+ hook := func(m *modules.ModulesConfig) error {
+ for _, tc := range m.AllModules {
+ if len(tc.ConfigFilenames()) > 0 {
+ if tc.Watch() {
+ l.ModulesConfigFiles = append(l.ModulesConfigFiles, tc.ConfigFilenames()...)
+ }
+
+ // Merge in the theme config using the configured
+ // merge strategy.
+ cfg.Merge("", tc.Cfg().Get(""))
+
+ }
+ }
+
+ return nil
+ }
+
+ modulesClient := modules.NewClient(modules.ClientConfig{
+ Fs: l.Fs,
+ Logger: l.Logger,
+ Exec: ex,
+ HookBeforeFinalize: hook,
+ WorkingDir: workingDir,
+ ThemesDir: themesDir,
+ PublishDir: publishDir,
+ Environment: l.Environment,
+ CacheDir: conf.Caches.CacheDirModules(),
+ ModuleConfig: conf.Module,
+ IgnoreVendor: ignoreVendor,
+ IgnoreModuleDoesNotExist: ignoreModuleDoesNotExist,
+ })
+
+ moduleConfig, err := modulesClient.Collect()
+
+ // We want to watch these for changes and trigger rebuild on version
+ // changes etc.
+ if moduleConfig.GoModulesFilename != "" {
+ l.ModulesConfigFiles = append(l.ModulesConfigFiles, moduleConfig.GoModulesFilename)
+ }
+
+ if moduleConfig.GoWorkspaceFilename != "" {
+ l.ModulesConfigFiles = append(l.ModulesConfigFiles, moduleConfig.GoWorkspaceFilename)
+ }
+
+ return moduleConfig, modulesClient, err
+}
+
+func (l configLoader) loadConfig(configName string) (string, error) {
+ baseDir := l.BaseConfig.WorkingDir
+ var baseFilename string
+ if filepath.IsAbs(configName) {
+ baseFilename = configName
+ } else {
+ baseFilename = filepath.Join(baseDir, configName)
+ }
+
+ var filename string
+ if paths.ExtNoDelimiter(configName) != "" {
+ exists, _ := helpers.Exists(baseFilename, l.Fs)
+ if exists {
+ filename = baseFilename
+ }
+ } else {
+ for _, ext := range config.ValidConfigFileExtensions {
+ filenameToCheck := baseFilename + "." + ext
+ exists, _ := helpers.Exists(filenameToCheck, l.Fs)
+ if exists {
+ filename = filenameToCheck
+ break
+ }
+ }
+ }
+
+ if filename == "" {
+ return "", ErrNoConfigFile
+ }
+
+ m, err := config.FromFileToMap(l.Fs, filename)
+ if err != nil {
+ return filename, err
+ }
+
+ // Set overwrites keys of the same name, recursively.
+ l.cfg.Set("", m)
+
+ if err := l.normalizeCfg(l.cfg); err != nil {
+ return filename, err
+ }
+
+ if err := l.cleanExternalConfig(l.cfg); err != nil {
+ return filename, err
+ }
+
+ return filename, nil
+}
+
+func (l configLoader) deleteMergeStrategies() {
+ l.cfg.WalkParams(func(params ...maps.KeyParams) bool {
+ params[len(params)-1].Params.DeleteMergeStrategy()
+ return false
+ })
+}
+
+func (l configLoader) wrapFileError(err error, filename string) error {
+ fe := herrors.UnwrapFileError(err)
+ if fe != nil {
+ pos := fe.Position()
+ pos.Filename = filename
+ fe.UpdatePosition(pos)
+ return err
+ }
+ return herrors.NewFileErrorFromFile(err, filename, l.Fs, nil)
+}
diff --git a/config/allconfig/load_test.go b/config/allconfig/load_test.go
new file mode 100644
index 000000000..3c16e71e9
--- /dev/null
+++ b/config/allconfig/load_test.go
@@ -0,0 +1,67 @@
+package allconfig
+
+import (
+ "os"
+ "path/filepath"
+ "testing"
+
+ "github.com/spf13/afero"
+)
+
+func BenchmarkLoad(b *testing.B) {
+ tempDir := b.TempDir()
+ configFilename := filepath.Join(tempDir, "hugo.toml")
+ config := `
+baseURL = "https://example.com"
+defaultContentLanguage = 'en'
+
+[module]
+[[module.mounts]]
+source = 'content/en'
+target = 'content/en'
+lang = 'en'
+[[module.mounts]]
+source = 'content/nn'
+target = 'content/nn'
+lang = 'nn'
+[[module.mounts]]
+source = 'content/no'
+target = 'content/no'
+lang = 'no'
+[[module.mounts]]
+source = 'content/sv'
+target = 'content/sv'
+lang = 'sv'
+[[module.mounts]]
+source = 'layouts'
+target = 'layouts'
+
+[languages]
+[languages.en]
+title = "English"
+weight = 1
+[languages.nn]
+title = "Nynorsk"
+weight = 2
+[languages.no]
+title = "Norsk"
+weight = 3
+[languages.sv]
+title = "Svenska"
+weight = 4
+`
+ if err := os.WriteFile(configFilename, []byte(config), 0o666); err != nil {
+ b.Fatal(err)
+ }
+ d := ConfigSourceDescriptor{
+ Fs: afero.NewOsFs(),
+ Filename: configFilename,
+ }
+
+ for i := 0; i < b.N; i++ {
+ _, err := LoadConfig(d)
+ if err != nil {
+ b.Fatal(err)
+ }
+ }
+}
diff --git a/config/commonConfig.go b/config/commonConfig.go
new file mode 100644
index 000000000..947078672
--- /dev/null
+++ b/config/commonConfig.go
@@ -0,0 +1,511 @@
+// Copyright 2019 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package config
+
+import (
+ "fmt"
+ "net/http"
+ "regexp"
+ "slices"
+ "sort"
+ "strings"
+
+ "github.com/bep/logg"
+ "github.com/gobwas/glob"
+ "github.com/gohugoio/hugo/common/loggers"
+ "github.com/gohugoio/hugo/common/types"
+
+ "github.com/gohugoio/hugo/common/herrors"
+ "github.com/mitchellh/mapstructure"
+ "github.com/spf13/cast"
+)
+
+type BaseConfig struct {
+ WorkingDir string
+ CacheDir string
+ ThemesDir string
+ PublishDir string
+}
+
+type CommonDirs struct {
+ // The directory where Hugo will look for themes.
+ ThemesDir string
+
+ // Where to put the generated files.
+ PublishDir string
+
+ // The directory to put the generated resource files. This directory should in most situations be considered temporary
+ // and not be committed to version control. But there may be cached content in here that you want to keep,
+ // e.g. resources/_gen/images for performance reasons or CSS built from SASS when your CI server doesn't have the full setup.
+ ResourceDir string
+
+ // The project root directory.
+ WorkingDir string
+
+ // The root directory for all cache files.
+ CacheDir string
+
+ // The content source directory.
+ // Deprecated: Use module mounts.
+ ContentDir string
+ // Deprecated: Use module mounts.
+ // The data source directory.
+ DataDir string
+ // Deprecated: Use module mounts.
+ // The layout source directory.
+ LayoutDir string
+ // Deprecated: Use module mounts.
+ // The i18n source directory.
+ I18nDir string
+ // Deprecated: Use module mounts.
+ // The archetypes source directory.
+ ArcheTypeDir string
+ // Deprecated: Use module mounts.
+ // The assets source directory.
+ AssetDir string
+}
+
+type LoadConfigResult struct {
+ Cfg Provider
+ ConfigFiles []string
+ BaseConfig BaseConfig
+}
+
+var defaultBuild = BuildConfig{
+ UseResourceCacheWhen: "fallback",
+ BuildStats: BuildStats{},
+
+ CacheBusters: []CacheBuster{
+ {
+ Source: `(postcss|tailwind)\.config\.js`,
+ Target: cssTargetCachebusterRe,
+ },
+ },
+}
+
+// BuildConfig holds some build related configuration.
+type BuildConfig struct {
+ // When to use the resource file cache.
+ // One of never, fallback, always. Default is fallback
+ UseResourceCacheWhen string
+
+ // When enabled, will collect and write a hugo_stats.json with some build
+ // related aggregated data (e.g. CSS class names).
+ // Note that this was a bool <= v0.115.0.
+ BuildStats BuildStats
+
+ // Can be used to toggle off writing of the IntelliSense /assets/jsconfig.js
+ // file.
+ NoJSConfigInAssets bool
+
+ // Can be used to control how the resource cache gets evicted on rebuilds.
+ CacheBusters []CacheBuster
+}
+
+// BuildStats configures if and what to write to the hugo_stats.json file.
+type BuildStats struct {
+ Enable bool
+ DisableTags bool
+ DisableClasses bool
+ DisableIDs bool
+}
+
+func (w BuildStats) Enabled() bool {
+ if !w.Enable {
+ return false
+ }
+ return !w.DisableTags || !w.DisableClasses || !w.DisableIDs
+}
+
+func (b BuildConfig) clone() BuildConfig {
+ b.CacheBusters = slices.Clone(b.CacheBusters)
+ return b
+}
+
+func (b BuildConfig) UseResourceCache(err error) bool {
+ if b.UseResourceCacheWhen == "never" {
+ return false
+ }
+
+ if b.UseResourceCacheWhen == "fallback" {
+ return herrors.IsFeatureNotAvailableError(err)
+ }
+
+ return true
+}
+
+// MatchCacheBuster returns the cache buster for the given path p, nil if none.
+func (s BuildConfig) MatchCacheBuster(logger loggers.Logger, p string) (func(string) bool, error) {
+ var matchers []func(string) bool
+ for _, cb := range s.CacheBusters {
+ if matcher := cb.compiledSource(p); matcher != nil {
+ matchers = append(matchers, matcher)
+ }
+ }
+ if len(matchers) > 0 {
+ return (func(cacheKey string) bool {
+ for _, m := range matchers {
+ if m(cacheKey) {
+ return true
+ }
+ }
+ return false
+ }), nil
+ }
+ return nil, nil
+}
+
+func (b *BuildConfig) CompileConfig(logger loggers.Logger) error {
+ for i, cb := range b.CacheBusters {
+ if err := cb.CompileConfig(logger); err != nil {
+ return fmt.Errorf("failed to compile cache buster %q: %w", cb.Source, err)
+ }
+ b.CacheBusters[i] = cb
+ }
+ return nil
+}
+
+func DecodeBuildConfig(cfg Provider) BuildConfig {
+ m := cfg.GetStringMap("build")
+
+ b := defaultBuild.clone()
+ if m == nil {
+ return b
+ }
+
+ // writeStats was a bool <= v0.115.0.
+ if writeStats, ok := m["writestats"]; ok {
+ if bb, ok := writeStats.(bool); ok {
+ m["buildstats"] = BuildStats{Enable: bb}
+ }
+ }
+
+ err := mapstructure.WeakDecode(m, &b)
+ if err != nil {
+ return b
+ }
+
+ b.UseResourceCacheWhen = strings.ToLower(b.UseResourceCacheWhen)
+ when := b.UseResourceCacheWhen
+ if when != "never" && when != "always" && when != "fallback" {
+ b.UseResourceCacheWhen = "fallback"
+ }
+
+ return b
+}
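+
+// For illustration, a legacy config such as:
+//
+//	[build]
+//	  writeStats = true
+//
+// is mapped to buildStats = { enable = true } by the decoder above.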
+
+// SitemapConfig configures the sitemap to be generated.
+type SitemapConfig struct {
+ // The page change frequency.
+ ChangeFreq string
+ // The priority of the page.
+ Priority float64
+ // The sitemap filename.
+ Filename string
+ // Whether to disable page inclusion.
+ Disable bool
+}
+
+func DecodeSitemap(prototype SitemapConfig, input map[string]any) (SitemapConfig, error) {
+ err := mapstructure.WeakDecode(input, &prototype)
+ return prototype, err
+}
+
+// Config for the dev server.
+type Server struct {
+ Headers []Headers
+ Redirects []Redirect
+
+ compiledHeaders []glob.Glob
+ compiledRedirects []redirect
+}
+
+type redirect struct {
+ from glob.Glob
+ fromRe *regexp.Regexp
+ headers map[string]glob.Glob
+}
+
+func (r redirect) matchHeader(header http.Header) bool {
+ for k, v := range r.headers {
+ if !v.Match(header.Get(k)) {
+ return false
+ }
+ }
+ return true
+}
+
+func (s *Server) CompileConfig(logger loggers.Logger) error {
+ if s.compiledHeaders != nil {
+ return nil
+ }
+ for _, h := range s.Headers {
+ g, err := glob.Compile(h.For)
+ if err != nil {
+ return fmt.Errorf("failed to compile Headers glob %q: %w", h.For, err)
+ }
+ s.compiledHeaders = append(s.compiledHeaders, g)
+ }
+ for _, r := range s.Redirects {
+ if r.From == "" && r.FromRe == "" {
+ return fmt.Errorf("redirects must have either From or FromRe set")
+ }
+ rd := redirect{
+ headers: make(map[string]glob.Glob),
+ }
+ if r.From != "" {
+ g, err := glob.Compile(r.From)
+ if err != nil {
+ return fmt.Errorf("failed to compile Redirect glob %q: %w", r.From, err)
+ }
+ rd.from = g
+ }
+ if r.FromRe != "" {
+ re, err := regexp.Compile(r.FromRe)
+ if err != nil {
+ return fmt.Errorf("failed to compile Redirect regexp %q: %w", r.FromRe, err)
+ }
+ rd.fromRe = re
+ }
+ for k, v := range r.FromHeaders {
+ g, err := glob.Compile(v)
+ if err != nil {
+ return fmt.Errorf("failed to compile Redirect header glob %q: %w", v, err)
+ }
+ rd.headers[k] = g
+ }
+ s.compiledRedirects = append(s.compiledRedirects, rd)
+ }
+
+ return nil
+}
+
+func (s *Server) MatchHeaders(pattern string) []types.KeyValueStr {
+ if s.compiledHeaders == nil {
+ return nil
+ }
+
+ var matches []types.KeyValueStr
+
+ for i, g := range s.compiledHeaders {
+ if g.Match(pattern) {
+ h := s.Headers[i]
+ for k, v := range h.Values {
+ matches = append(matches, types.KeyValueStr{Key: k, Value: cast.ToString(v)})
+ }
+ }
+ }
+
+ sort.Slice(matches, func(i, j int) bool {
+ return matches[i].Key < matches[j].Key
+ })
+
+ return matches
+}
+
+func (s *Server) MatchRedirect(pattern string, header http.Header) Redirect {
+ if s.compiledRedirects == nil {
+ return Redirect{}
+ }
+
+ pattern = strings.TrimSuffix(pattern, "index.html")
+
+ for i, r := range s.compiledRedirects {
+ redir := s.Redirects[i]
+
+ var found bool
+
+ if r.from != nil {
+ if r.from.Match(pattern) {
+ found = header == nil || r.matchHeader(header)
+ // Regexp group replacements, if any, are handled in the FromRe branch below.
+ }
+ }
+
+ if r.fromRe != nil {
+ m := r.fromRe.FindStringSubmatch(pattern)
+ if m != nil {
+ if !found {
+ found = header == nil || r.matchHeader(header)
+ }
+
+ if found {
+ // Replace $1, $2 etc. in To.
+ for i, g := range m[1:] {
+ redir.To = strings.ReplaceAll(redir.To, fmt.Sprintf("$%d", i+1), g)
+ }
+ }
+ }
+ }
+
+ if found {
+ return redir
+ }
+ }
+
+ return Redirect{}
+}
+
+type Headers struct {
+ For string
+ Values map[string]any
+}
+
+type Redirect struct {
+ // From is the Glob pattern to match.
+ // One of From or FromRe must be set.
+ From string
+
+ // FromRe is the regexp to match.
+ // This regexp can contain group matches (e.g. $1) that can be used in the To field.
+ // One of From or FromRe must be set.
+ FromRe string
+
+ // To is the target URL.
+ To string
+
+ // Headers to match for the redirect.
+ // This maps the HTTP header name to a Glob pattern with values to match.
+ // If the map is empty, the redirect will always be triggered.
+ FromHeaders map[string]string
+
+ // HTTP status code to use for the redirect.
+ // A status code of 200 will trigger a URL rewrite.
+ Status int
+
+ // Force the redirect, even if the original request path exists.
+ Force bool
+}
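+
+// An illustrative server redirect config (see also the tests):
+//
+//	[[server.redirects]]
+//	from = "/foo/**"
+//	to = "/baz/index.html"
+//	status = 200
+//
+// Note that DecodeServer trims a trailing "index.html" from To.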
+
+// CacheBuster configures cache busting for assets.
+type CacheBuster struct {
+ // Trigger for files matching this regexp.
+ Source string
+
+ // Cache bust targets matching this regexp.
+ // This regexp can contain group matches (e.g. $1) from the source regexp.
+ Target string
+
+ compiledSource func(string) func(string) bool
+}
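+
+// An illustrative cacheBusters entry (this mirrors the default above):
+//
+//	[[build.cacheBusters]]
+//	source = '(postcss|tailwind)\.config\.js'
+//	target = '(css|styles|scss|sass)'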
+
+func (c *CacheBuster) CompileConfig(logger loggers.Logger) error {
+ if c.compiledSource != nil {
+ return nil
+ }
+
+ source := c.Source
+ sourceRe, err := regexp.Compile(source)
+ if err != nil {
+ return fmt.Errorf("failed to compile cache buster source %q: %w", c.Source, err)
+ }
+ target := c.Target
+ var compileErr error
+ debugl := logger.Logger().WithLevel(logg.LevelDebug).WithField(loggers.FieldNameCmd, "cachebuster")
+
+ c.compiledSource = func(s string) func(string) bool {
+ m := sourceRe.FindStringSubmatch(s)
+ matchString := "no match"
+ match := m != nil
+ if match {
+ matchString = "match!"
+ }
+ debugl.Logf("Matching %q with source %q: %s", s, source, matchString)
+ if !match {
+ return nil
+ }
+ groups := m[1:]
+ currentTarget := target
+ // Replace $1, $2 etc. in target.
+ for i, g := range groups {
+ currentTarget = strings.ReplaceAll(currentTarget, fmt.Sprintf("$%d", i+1), g)
+ }
+ targetRe, err := regexp.Compile(currentTarget)
+ if err != nil {
+ compileErr = fmt.Errorf("failed to compile cache buster target %q: %w", currentTarget, err)
+ return nil
+ }
+ return func(ss string) bool {
+ match = targetRe.MatchString(ss)
+ matchString := "no match"
+ if match {
+ matchString = "match!"
+ }
+ logger.Debugf("Matching %q with target %q: %s", ss, currentTarget, matchString)
+
+ return match
+ }
+ }
+ return compileErr
+}
+
+func (r Redirect) IsZero() bool {
+ return r.From == "" && r.FromRe == ""
+}
+
+const (
+ // Keep this a little coarse-grained; some false positives are OK.
+ cssTargetCachebusterRe = `(css|styles|scss|sass)`
+)
+
+func DecodeServer(cfg Provider) (Server, error) {
+ s := &Server{}
+
+ _ = mapstructure.WeakDecode(cfg.GetStringMap("server"), s)
+
+ for i, redir := range s.Redirects {
+ redir.To = strings.TrimSuffix(redir.To, "index.html")
+ s.Redirects[i] = redir
+ }
+
+ if len(s.Redirects) == 0 {
+ // Set up a default redirect for 404s.
+ s.Redirects = []Redirect{
+ {
+ From: "/**",
+ To: "/404.html",
+ Status: 404,
+ },
+ }
+ }
+
+ return *s, nil
+}
+
+// Pagination configures the pagination behavior.
+type Pagination struct {
+ // Default number of elements per pager in pagination.
+ PagerSize int
+
+ // The path element used during pagination.
+ Path string
+
+ // Whether to disable generation of alias for the first pagination page.
+ DisableAliases bool
+}
+
+// PageConfig configures the behavior of pages.
+type PageConfig struct {
+ // Sort order for Page.Next and Page.Prev. Default "desc" (the default page sort order in Hugo).
+ NextPrevSortOrder string
+
+ // Sort order for Page.NextInSection and Page.PrevInSection. Default "desc".
+ NextPrevInSectionSortOrder string
+}
+
+func (c *PageConfig) CompileConfig(loggers.Logger) error {
+ c.NextPrevInSectionSortOrder = strings.ToLower(c.NextPrevInSectionSortOrder)
+ c.NextPrevSortOrder = strings.ToLower(c.NextPrevSortOrder)
+ return nil
+}
diff --git a/config/commonConfig_test.go b/config/commonConfig_test.go
new file mode 100644
index 000000000..05ba185e3
--- /dev/null
+++ b/config/commonConfig_test.go
@@ -0,0 +1,197 @@
+// Copyright 2020 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package config
+
+import (
+ "errors"
+ "testing"
+
+ "github.com/gohugoio/hugo/common/herrors"
+ "github.com/gohugoio/hugo/common/loggers"
+ "github.com/gohugoio/hugo/common/types"
+
+ qt "github.com/frankban/quicktest"
+)
+
+func TestBuild(t *testing.T) {
+ c := qt.New(t)
+
+ v := New()
+ v.Set("build", map[string]any{
+ "useResourceCacheWhen": "always",
+ })
+
+ b := DecodeBuildConfig(v)
+
+ c.Assert(b.UseResourceCacheWhen, qt.Equals, "always")
+
+ v.Set("build", map[string]any{
+ "useResourceCacheWhen": "foo",
+ })
+
+ b = DecodeBuildConfig(v)
+
+ c.Assert(b.UseResourceCacheWhen, qt.Equals, "fallback")
+
+ c.Assert(b.UseResourceCache(herrors.ErrFeatureNotAvailable), qt.Equals, true)
+ c.Assert(b.UseResourceCache(errors.New("err")), qt.Equals, false)
+
+ b.UseResourceCacheWhen = "always"
+ c.Assert(b.UseResourceCache(herrors.ErrFeatureNotAvailable), qt.Equals, true)
+ c.Assert(b.UseResourceCache(errors.New("err")), qt.Equals, true)
+ c.Assert(b.UseResourceCache(nil), qt.Equals, true)
+
+ b.UseResourceCacheWhen = "never"
+ c.Assert(b.UseResourceCache(herrors.ErrFeatureNotAvailable), qt.Equals, false)
+ c.Assert(b.UseResourceCache(errors.New("err")), qt.Equals, false)
+ c.Assert(b.UseResourceCache(nil), qt.Equals, false)
+}
+
+func TestServer(t *testing.T) {
+ c := qt.New(t)
+
+ cfg, err := FromConfigString(`[[server.headers]]
+for = "/*.jpg"
+
+[server.headers.values]
+X-Frame-Options = "DENY"
+X-XSS-Protection = "1; mode=block"
+X-Content-Type-Options = "nosniff"
+
+[[server.redirects]]
+from = "/foo/**"
+to = "/baz/index.html"
+status = 200
+
+[[server.redirects]]
+from = "/loop/**"
+to = "/loop/foo/"
+status = 200
+
+[[server.redirects]]
+from = "/b/**"
+fromRe = "/b/(.*)/"
+to = "/baz/$1/"
+status = 200
+
+[[server.redirects]]
+fromRe = "/c/(.*)/"
+to = "/boo/$1/"
+status = 200
+
+[[server.redirects]]
+fromRe = "/d/(.*)/"
+to = "/boo/$1/"
+status = 200
+
+[[server.redirects]]
+from = "/google/**"
+to = "https://google.com/"
+status = 301
+
+
+
+`, "toml")
+
+ c.Assert(err, qt.IsNil)
+
+ s, err := DecodeServer(cfg)
+ c.Assert(err, qt.IsNil)
+ c.Assert(s.CompileConfig(loggers.NewDefault()), qt.IsNil)
+
+ c.Assert(s.MatchHeaders("/foo.jpg"), qt.DeepEquals, []types.KeyValueStr{
+ {Key: "X-Content-Type-Options", Value: "nosniff"},
+ {Key: "X-Frame-Options", Value: "DENY"},
+ {Key: "X-XSS-Protection", Value: "1; mode=block"},
+ })
+
+ c.Assert(s.MatchRedirect("/foo/bar/baz", nil), qt.DeepEquals, Redirect{
+ From: "/foo/**",
+ To: "/baz/",
+ Status: 200,
+ })
+
+ c.Assert(s.MatchRedirect("/foo/bar/", nil), qt.DeepEquals, Redirect{
+ From: "/foo/**",
+ To: "/baz/",
+ Status: 200,
+ })
+
+ c.Assert(s.MatchRedirect("/b/c/", nil), qt.DeepEquals, Redirect{
+ From: "/b/**",
+ FromRe: "/b/(.*)/",
+ To: "/baz/c/",
+ Status: 200,
+ })
+
+ c.Assert(s.MatchRedirect("/c/d/", nil).To, qt.Equals, "/boo/d/")
+ c.Assert(s.MatchRedirect("/c/d/e/", nil).To, qt.Equals, "/boo/d/e/")
+
+ c.Assert(s.MatchRedirect("/someother", nil), qt.DeepEquals, Redirect{})
+
+ c.Assert(s.MatchRedirect("/google/foo", nil), qt.DeepEquals, Redirect{
+ From: "/google/**",
+ To: "https://google.com/",
+ Status: 301,
+ })
+}
+
+func TestBuildConfigCacheBusters(t *testing.T) {
+ c := qt.New(t)
+ cfg := New()
+ conf := DecodeBuildConfig(cfg)
+ l := loggers.NewDefault()
+ c.Assert(conf.CompileConfig(l), qt.IsNil)
+
+ m, _ := conf.MatchCacheBuster(l, "tailwind.config.js")
+ c.Assert(m, qt.IsNotNil)
+ c.Assert(m("css"), qt.IsTrue)
+ c.Assert(m("js"), qt.IsFalse)
+
+ m, _ = conf.MatchCacheBuster(l, "foo.bar")
+ c.Assert(m, qt.IsNil)
+}
+
+func TestBuildConfigCacheBusterstTailwindSetup(t *testing.T) {
+ c := qt.New(t)
+ cfg := New()
+ cfg.Set("build", map[string]any{
+ "cacheBusters": []map[string]string{
+ {
+ "source": "assets/watching/hugo_stats\\.json",
+ "target": "css",
+ },
+ {
+ "source": "(postcss|tailwind)\\.config\\.js",
+ "target": "css",
+ },
+ {
+ "source": "assets/.*\\.(js|ts|jsx|tsx)",
+ "target": "js",
+ },
+ {
+ "source": "assets/.*\\.(.*)$",
+ "target": "$1",
+ },
+ },
+ })
+
+ conf := DecodeBuildConfig(cfg)
+ l := loggers.NewDefault()
+ c.Assert(conf.CompileConfig(l), qt.IsNil)
+
+ m, err := conf.MatchCacheBuster(l, "assets/watching/hugo_stats.json")
+ c.Assert(err, qt.IsNil)
+ c.Assert(m("css"), qt.IsTrue)
+}
diff --git a/config/configLoader.go b/config/configLoader.go
index 2e37a5b35..dd103f27b 100644
--- a/config/configLoader.go
+++ b/config/configLoader.go
@@ -14,21 +14,37 @@
package config
import (
+ "fmt"
+ "os"
"path/filepath"
"strings"
+ "github.com/gohugoio/hugo/common/herrors"
+
+ "github.com/gohugoio/hugo/common/paths"
+
"github.com/gohugoio/hugo/common/maps"
"github.com/gohugoio/hugo/parser/metadecoders"
"github.com/spf13/afero"
- "github.com/spf13/viper"
)
var (
+ // See issue #8979 for context.
+ // Hugo has always used config.toml etc. as the default config file name.
+ // But hugo.toml is a more descriptive name, but we need to check for both.
+ DefaultConfigNames = []string{"hugo", "config"}
+
+ DefaultConfigNamesSet = make(map[string]bool)
+
ValidConfigFileExtensions = []string{"toml", "yaml", "yml", "json"}
validConfigFileExtensionsMap map[string]bool = make(map[string]bool)
)
func init() {
+ for _, name := range DefaultConfigNames {
+ DefaultConfigNamesSet[name] = true
+ }
+
for _, ext := range ValidConfigFileExtensions {
validConfigFileExtensionsMap[ext] = true
}
@@ -41,43 +57,46 @@ func IsValidConfigFilename(filename string) bool {
return validConfigFileExtensionsMap[ext]
}
+func FromTOMLConfigString(config string) Provider {
+ cfg, err := FromConfigString(config, "toml")
+ if err != nil {
+ panic(err)
+ }
+ return cfg
+}
+
// FromConfigString creates a config from the given YAML, JSON or TOML config. This is useful in tests.
func FromConfigString(config, configType string) (Provider, error) {
- v := newViper()
m, err := readConfig(metadecoders.FormatFromString(configType), []byte(config))
if err != nil {
return nil, err
}
-
- v.MergeConfigMap(m)
-
- return v, nil
+ return NewFrom(m), nil
}
// FromFile loads the configuration from the given filename.
func FromFile(fs afero.Fs, filename string) (Provider, error) {
m, err := loadConfigFromFile(fs, filename)
if err != nil {
- return nil, err
+ fe := herrors.UnwrapFileError(err)
+ if fe != nil {
+ pos := fe.Position()
+ pos.Filename = filename
+ fe.UpdatePosition(pos)
+ return nil, err
+ }
+ return nil, herrors.NewFileErrorFromFile(err, filename, fs, nil)
}
-
- v := newViper()
-
- err = v.MergeConfigMap(m)
- if err != nil {
- return nil, err
- }
-
- return v, nil
+ return NewFrom(m), nil
}
// FromFileToMap is the same as FromFile, but it returns the config values
// as a simple map.
-func FromFileToMap(fs afero.Fs, filename string) (map[string]interface{}, error) {
+func FromFileToMap(fs afero.Fs, filename string) (map[string]any, error) {
return loadConfigFromFile(fs, filename)
}
-func readConfig(format metadecoders.Format, data []byte) (map[string]interface{}, error) {
+func readConfig(format metadecoders.Format, data []byte) (map[string]any, error) {
m, err := metadecoders.Default.UnmarshalToMap(data, format)
if err != nil {
return nil, err
@@ -86,10 +105,9 @@ func readConfig(format metadecoders.Format, data []byte) (map[string]interface{}
RenameKeys(m)
return m, nil
-
}
-func loadConfigFromFile(fs afero.Fs, filename string) (map[string]interface{}, error) {
+func loadConfigFromFile(fs afero.Fs, filename string) (map[string]any, error) {
m, err := metadecoders.Default.UnmarshalFileToMap(fs, filename)
if err != nil {
return nil, err
@@ -98,6 +116,100 @@ func loadConfigFromFile(fs afero.Fs, filename string) (map[string]interface{}, e
return m, nil
}
+func LoadConfigFromDir(sourceFs afero.Fs, configDir, environment string) (Provider, []string, error) {
+ defaultConfigDir := filepath.Join(configDir, "_default")
+ environmentConfigDir := filepath.Join(configDir, environment)
+ cfg := New()
+
+ var configDirs []string
+ // Merge from least to most specific.
+ for _, dir := range []string{defaultConfigDir, environmentConfigDir} {
+ if _, err := sourceFs.Stat(dir); err == nil {
+ configDirs = append(configDirs, dir)
+ }
+ }
+
+ if len(configDirs) == 0 {
+ return nil, nil, nil
+ }
+
+ // Keep track of these so we can watch them for changes.
+ var dirnames []string
+
+ for _, configDir := range configDirs {
+ err := afero.Walk(sourceFs, configDir, func(path string, fi os.FileInfo, err error) error {
+ if fi == nil || err != nil {
+ return nil
+ }
+
+ if fi.IsDir() {
+ dirnames = append(dirnames, path)
+ return nil
+ }
+
+ if !IsValidConfigFilename(path) {
+ return nil
+ }
+
+ name := paths.Filename(filepath.Base(path))
+
+ item, err := metadecoders.Default.UnmarshalFileToMap(sourceFs, path)
+ if err != nil {
+ // This will be used in error reporting; use the most specific value.
+ dirnames = []string{path}
+ return fmt.Errorf("failed to unmarshal config for path %q: %w", path, err)
+ }
+
+ var keyPath []string
+ if !DefaultConfigNamesSet[name] {
+ // Can be params.jp, menus.en etc.
+ name, lang := paths.FileAndExtNoDelimiter(name)
+
+ keyPath = []string{name}
+
+ if lang != "" {
+ keyPath = []string{"languages", lang}
+ switch name {
+ case "menu", "menus":
+ keyPath = append(keyPath, "menus")
+ case "params":
+ keyPath = append(keyPath, "params")
+ }
+ }
+ }
+
+ root := item
+ if len(keyPath) > 0 {
+ root = make(map[string]any)
+ m := root
+ for i, key := range keyPath {
+ if i >= len(keyPath)-1 {
+ m[key] = item
+ } else {
+ nm := make(map[string]any)
+ m[key] = nm
+ m = nm
+ }
+ }
+ }
+
+ // Migrate menu => menus etc.
+ RenameKeys(root)
+
+ // Set will overwrite keys with the same name, recursively.
+ cfg.Set("", root)
+
+ return nil
+ })
+ if err != nil {
+ return nil, dirnames, err
+ }
+
+ }
+
+ return cfg, dirnames, nil
+}
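+
+// For illustration, a config directory handled above might look like:
+//
+//	config/
+//	  _default/
+//	    hugo.toml
+//	    params.toml
+//	    menus.en.toml
+//	  production/
+//	    params.toml
+//
+// params.toml is merged at the params key, and menus.en.toml ends up under
+// languages.en.menus.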
+
var keyAliases maps.KeyRenamer
func init() {
@@ -114,12 +226,6 @@ func init() {
// RenameKeys renames config keys in m recursively according to a global Hugo
// alias definition.
-func RenameKeys(m map[string]interface{}) {
+func RenameKeys(m map[string]any) {
keyAliases.Rename(m)
}
-
-func newViper() *viper.Viper {
- v := viper.New()
-
- return v
-}
diff --git a/config/configProvider.go b/config/configProvider.go
index 187fb7b10..c21342dce 100644
--- a/config/configProvider.go
+++ b/config/configProvider.go
@@ -14,19 +14,92 @@
package config
import (
- "github.com/spf13/cast"
+ "time"
+
+ "github.com/gohugoio/hugo/common/maps"
+ "github.com/gohugoio/hugo/common/paths"
+ "github.com/gohugoio/hugo/common/types"
+ "github.com/gohugoio/hugo/common/urls"
+ "github.com/gohugoio/hugo/identity"
+ "github.com/gohugoio/hugo/langs"
)
+// AllProvider is a subset of all config settings.
+type AllProvider interface {
+ Language() *langs.Language
+ Languages() langs.Languages
+ LanguagesDefaultFirst() langs.Languages
+ LanguagePrefix() string
+ BaseURL() urls.BaseURL
+ BaseURLLiveReload() urls.BaseURL
+ PathParser() *paths.PathParser
+ Environment() string
+ IsMultihost() bool
+ IsMultilingual() bool
+ NoBuildLock() bool
+ BaseConfig() BaseConfig
+ Dirs() CommonDirs
+ Quiet() bool
+ DirsBase() CommonDirs
+ ContentTypes() ContentTypesProvider
+ GetConfigSection(string) any
+ GetConfig() any
+ CanonifyURLs() bool
+ DisablePathToLower() bool
+ RemovePathAccents() bool
+ IsUglyURLs(section string) bool
+ DefaultContentLanguage() string
+ DefaultContentLanguageInSubdir() bool
+ IsLangDisabled(string) bool
+ SummaryLength() int
+ Pagination() Pagination
+ BuildExpired() bool
+ BuildFuture() bool
+ BuildDrafts() bool
+ Running() bool
+ Watching() bool
+ NewIdentityManager(name string, opts ...identity.ManagerOption) identity.Manager
+ FastRenderMode() bool
+ PrintUnusedTemplates() bool
+ EnableMissingTranslationPlaceholders() bool
+ TemplateMetrics() bool
+ TemplateMetricsHints() bool
+ PrintI18nWarnings() bool
+ CreateTitle(s string) string
+ IgnoreFile(s string) bool
+ NewContentEditor() string
+ Timeout() time.Duration
+ StaticDirs() []string
+ IgnoredLogs() map[string]bool
+ WorkingDir() string
+ EnableEmoji() bool
+}
+
+// We cannot import the media package as that would create a circular dependency.
+// This interface defines a subset of what media.ContentTypes provides.
+type ContentTypesProvider interface {
+ IsContentSuffix(suffix string) bool
+ IsContentFile(filename string) bool
+ IsIndexContentFile(filename string) bool
+ IsHTMLSuffix(suffix string) bool
+}
+
// Provider provides the configuration settings for Hugo.
type Provider interface {
GetString(key string) string
GetInt(key string) int
GetBool(key string) bool
- GetStringMap(key string) map[string]interface{}
+ GetParams(key string) maps.Params
+ GetStringMap(key string) map[string]any
GetStringMapString(key string) map[string]string
GetStringSlice(key string) []string
- Get(key string) interface{}
- Set(key string, value interface{})
+ Get(key string) any
+ Set(key string, value any)
+ Keys() []string
+ Merge(key string, value any)
+ SetDefaults(params maps.Params)
+ SetDefaultMergeStrategy()
+ WalkParams(walkFn func(params ...maps.KeyParams) bool)
IsSet(key string) bool
}
@@ -35,24 +108,5 @@ type Provider interface {
// we do not attempt to split it into fields.
func GetStringSlicePreserveString(cfg Provider, key string) []string {
sd := cfg.Get(key)
- return toStringSlicePreserveString(sd)
-}
-
-func toStringSlicePreserveString(v interface{}) []string {
- if sds, ok := v.(string); ok {
- return []string{sds}
- }
- return cast.ToStringSlice(v)
-}
-
-// SetBaseTestDefaults provides some common config defaults used in tests.
-func SetBaseTestDefaults(cfg Provider) {
- cfg.Set("resourceDir", "resources")
- cfg.Set("contentDir", "content")
- cfg.Set("dataDir", "data")
- cfg.Set("i18nDir", "i18n")
- cfg.Set("layoutDir", "layouts")
- cfg.Set("assetDir", "assets")
- cfg.Set("archetypeDir", "archetypes")
- cfg.Set("publishDir", "public")
+ return types.ToStringSlicePreserveString(sd)
}
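
A minimal usage sketch of GetStringSlicePreserveString (config.New comes from defaultConfigProvider.go below, and the key name "theme" is arbitrary): a bare string is kept as a single element instead of being split into fields.

    package main

    import (
        "fmt"

        "github.com/gohugoio/hugo/config"
    )

    func main() {
        cfg := config.New()

        cfg.Set("theme", "dark blue")
        s := config.GetStringSlicePreserveString(cfg, "theme")
        fmt.Println(len(s), s) // 1 [dark blue]: the string is not split on whitespace

        cfg.Set("theme", []string{"dark", "blue"})
        s = config.GetStringSlicePreserveString(cfg, "theme")
        fmt.Println(len(s), s) // 2 [dark blue]
    }
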
diff --git a/config/configProvider_test.go b/config/configProvider_test.go
index d9fff56b6..0afba1e58 100644
--- a/config/configProvider_test.go
+++ b/config/configProvider_test.go
@@ -17,12 +17,11 @@ import (
"testing"
qt "github.com/frankban/quicktest"
- "github.com/spf13/viper"
)
func TestGetStringSlicePreserveString(t *testing.T) {
c := qt.New(t)
- cfg := viper.New()
+ cfg := New()
s := "This is a string"
sSlice := []string{"This", "is", "a", "slice"}
diff --git a/config/defaultConfigProvider.go b/config/defaultConfigProvider.go
new file mode 100644
index 000000000..8c1d63851
--- /dev/null
+++ b/config/defaultConfigProvider.go
@@ -0,0 +1,366 @@
+// Copyright 2021 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package config
+
+import (
+ "fmt"
+ "strings"
+ "sync"
+
+ xmaps "golang.org/x/exp/maps"
+
+ "github.com/spf13/cast"
+
+ "github.com/gohugoio/hugo/common/maps"
+)
+
+// New creates a Provider backed by an empty maps.Params.
+func New() Provider {
+ return &defaultConfigProvider{
+ root: make(maps.Params),
+ }
+}
+
+// NewFrom creates a Provider backed by params.
+func NewFrom(params maps.Params) Provider {
+ maps.PrepareParams(params)
+ return &defaultConfigProvider{
+ root: params,
+ }
+}
+
+// defaultConfigProvider is a Provider backed by a map where all keys are lower case.
+// All methods are thread safe.
+type defaultConfigProvider struct {
+ mu sync.RWMutex
+ root maps.Params
+
+ keyCache sync.Map
+}
+
+func (c *defaultConfigProvider) Get(k string) any {
+ if k == "" {
+ return c.root
+ }
+ c.mu.RLock()
+ key, m := c.getNestedKeyAndMap(strings.ToLower(k), false)
+ if m == nil {
+ c.mu.RUnlock()
+ return nil
+ }
+ v := m[key]
+ c.mu.RUnlock()
+ return v
+}
+
+func (c *defaultConfigProvider) GetBool(k string) bool {
+ v := c.Get(k)
+ return cast.ToBool(v)
+}
+
+func (c *defaultConfigProvider) GetInt(k string) int {
+ v := c.Get(k)
+ return cast.ToInt(v)
+}
+
+func (c *defaultConfigProvider) IsSet(k string) bool {
+ var found bool
+ c.mu.RLock()
+ key, m := c.getNestedKeyAndMap(strings.ToLower(k), false)
+ if m != nil {
+ _, found = m[key]
+ }
+ c.mu.RUnlock()
+ return found
+}
+
+func (c *defaultConfigProvider) GetString(k string) string {
+ v := c.Get(k)
+ return cast.ToString(v)
+}
+
+func (c *defaultConfigProvider) GetParams(k string) maps.Params {
+ v := c.Get(k)
+ if v == nil {
+ return nil
+ }
+ return v.(maps.Params)
+}
+
+func (c *defaultConfigProvider) GetStringMap(k string) map[string]any {
+ v := c.Get(k)
+ return maps.ToStringMap(v)
+}
+
+func (c *defaultConfigProvider) GetStringMapString(k string) map[string]string {
+ v := c.Get(k)
+ return maps.ToStringMapString(v)
+}
+
+func (c *defaultConfigProvider) GetStringSlice(k string) []string {
+ v := c.Get(k)
+ return cast.ToStringSlice(v)
+}
+
+func (c *defaultConfigProvider) Set(k string, v any) {
+ c.mu.Lock()
+ defer c.mu.Unlock()
+
+ k = strings.ToLower(k)
+
+ if k == "" {
+ if p, err := maps.ToParamsAndPrepare(v); err == nil {
+ // Set the values directly in root.
+ maps.SetParams(c.root, p)
+ } else {
+ c.root[k] = v
+ }
+
+ return
+ }
+
+ switch vv := v.(type) {
+ case map[string]any, map[any]any, map[string]string:
+ p := maps.MustToParamsAndPrepare(vv)
+ v = p
+ }
+
+ key, m := c.getNestedKeyAndMap(k, true)
+ if m == nil {
+ return
+ }
+
+ if existing, found := m[key]; found {
+ if p1, ok := existing.(maps.Params); ok {
+ if p2, ok := v.(maps.Params); ok {
+ maps.SetParams(p1, p2)
+ return
+ }
+ }
+ }
+
+ m[key] = v
+}
+
+// SetDefaults will set values from params if not already set.
+func (c *defaultConfigProvider) SetDefaults(params maps.Params) {
+ maps.PrepareParams(params)
+ for k, v := range params {
+ if _, found := c.root[k]; !found {
+ c.root[k] = v
+ }
+ }
+}
+
+func (c *defaultConfigProvider) Merge(k string, v any) {
+ c.mu.Lock()
+ defer c.mu.Unlock()
+ k = strings.ToLower(k)
+
+ if k == "" {
+ rs, f := c.root.GetMergeStrategy()
+ if f && rs == maps.ParamsMergeStrategyNone {
+ // The user has set a "no merge" strategy on this,
+ // nothing more to do.
+ return
+ }
+
+ if p, err := maps.ToParamsAndPrepare(v); err == nil {
+ // As there may be keys in p not in root, we need to handle
+ // those as a special case.
+ var keysToDelete []string
+ for kk, vv := range p {
+ if pp, ok := vv.(maps.Params); ok {
+ if pppi, ok := c.root[kk]; ok {
+ ppp := pppi.(maps.Params)
+ maps.MergeParamsWithStrategy("", ppp, pp)
+ } else {
+ // We need to use the default merge strategy for
+ // this key.
+ np := make(maps.Params)
+ strategy := c.determineMergeStrategy(maps.KeyParams{Key: "", Params: c.root}, maps.KeyParams{Key: kk, Params: np})
+ np.SetMergeStrategy(strategy)
+ maps.MergeParamsWithStrategy("", np, pp)
+ c.root[kk] = np
+ if np.IsZero() {
+ // Just keep it until merge is done.
+ keysToDelete = append(keysToDelete, kk)
+ }
+ }
+ }
+ }
+ // Merge the rest.
+ maps.MergeParams(c.root, p)
+ for _, k := range keysToDelete {
+ delete(c.root, k)
+ }
+ } else {
+ panic(fmt.Sprintf("unsupported type %T received in Merge", v))
+ }
+
+ return
+ }
+
+ switch vv := v.(type) {
+ case map[string]any, map[any]any, map[string]string:
+ p := maps.MustToParamsAndPrepare(vv)
+ v = p
+ }
+
+ key, m := c.getNestedKeyAndMap(k, true)
+ if m == nil {
+ return
+ }
+
+ if existing, found := m[key]; found {
+ if p1, ok := existing.(maps.Params); ok {
+ if p2, ok := v.(maps.Params); ok {
+ maps.MergeParamsWithStrategy("", p1, p2)
+ }
+ }
+ } else {
+ m[key] = v
+ }
+}
+
+func (c *defaultConfigProvider) Keys() []string {
+ c.mu.RLock()
+ defer c.mu.RUnlock()
+ return xmaps.Keys(c.root)
+}
+
+func (c *defaultConfigProvider) WalkParams(walkFn func(params ...maps.KeyParams) bool) {
+ var walk func(params ...maps.KeyParams)
+ walk = func(params ...maps.KeyParams) {
+ if walkFn(params...) {
+ return
+ }
+ p1 := params[len(params)-1]
+ i := len(params)
+ for k, v := range p1.Params {
+ if p2, ok := v.(maps.Params); ok {
+ paramsplus1 := make([]maps.KeyParams, i+1)
+ copy(paramsplus1, params)
+ paramsplus1[i] = maps.KeyParams{Key: k, Params: p2}
+ walk(paramsplus1...)
+ }
+ }
+ }
+ walk(maps.KeyParams{Key: "", Params: c.root})
+}
+
+func (c *defaultConfigProvider) determineMergeStrategy(params ...maps.KeyParams) maps.ParamsMergeStrategy {
+ if len(params) == 0 {
+ return maps.ParamsMergeStrategyNone
+ }
+
+ var (
+ strategy maps.ParamsMergeStrategy
+ prevIsRoot bool
+ curr = params[len(params)-1]
+ )
+
+ if len(params) > 1 {
+ prev := params[len(params)-2]
+ prevIsRoot = prev.Key == ""
+
+ // Inherit from parent (but not from the root unless it's set by user).
+ s, found := prev.Params.GetMergeStrategy()
+ if !prevIsRoot && !found {
+ panic("invalid state, merge strategy not set on parent")
+ }
+ if found || !prevIsRoot {
+ strategy = s
+ }
+ }
+
+ switch curr.Key {
+ case "":
+ // Don't set a merge strategy on the root unless set by user.
+ // This will be handled as a special case.
+ case "params":
+ strategy = maps.ParamsMergeStrategyDeep
+ case "outputformats", "mediatypes":
+ if prevIsRoot {
+ strategy = maps.ParamsMergeStrategyShallow
+ }
+ case "menus":
+ isMenuKey := prevIsRoot
+ if !isMenuKey {
+ // Can also be set below languages.
+ // root > languages > en > menus
+ if len(params) == 4 && params[1].Key == "languages" {
+ isMenuKey = true
+ }
+ }
+ if isMenuKey {
+ strategy = maps.ParamsMergeStrategyShallow
+ }
+ default:
+ if strategy == "" {
+ strategy = maps.ParamsMergeStrategyNone
+ }
+ }
+
+ return strategy
+}
+
+func (c *defaultConfigProvider) SetDefaultMergeStrategy() {
+ c.WalkParams(func(params ...maps.KeyParams) bool {
+ if len(params) == 0 {
+ return false
+ }
+ p := params[len(params)-1].Params
+ var found bool
+ if _, found = p.GetMergeStrategy(); found {
+ // Set by user.
+ return false
+ }
+ strategy := c.determineMergeStrategy(params...)
+ if strategy != "" {
+ p.SetMergeStrategy(strategy)
+ }
+ return false
+ })
+}
+
+func (c *defaultConfigProvider) getNestedKeyAndMap(key string, create bool) (string, maps.Params) {
+ var parts []string
+ v, ok := c.keyCache.Load(key)
+ if ok {
+ parts = v.([]string)
+ } else {
+ parts = strings.Split(key, ".")
+ c.keyCache.Store(key, parts)
+ }
+ current := c.root
+ for i := range len(parts) - 1 {
+ next, found := current[parts[i]]
+ if !found {
+ if create {
+ next = make(maps.Params)
+ current[parts[i]] = next
+ } else {
+ return "", nil
+ }
+ }
+ var ok bool
+ current, ok = next.(maps.Params)
+ if !ok {
+ // E.g. a string, not a map that we can store values in.
+ return "", nil
+ }
+ }
+ return parts[len(parts)-1], current
+}
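
A small sketch of the dotted-key handling in getNestedKeyAndMap: Set creates intermediate maps on demand, keys are lower-cased, and a lookup that runs into a non-map value returns nil. The key names are arbitrary.

    package main

    import (
        "fmt"

        "github.com/gohugoio/hugo/config"
    )

    func main() {
        cfg := config.New()

        cfg.Set("server.port", 1313) // creates the nested "server" map on demand

        fmt.Println(cfg.GetInt("server.port")) // 1313
        fmt.Println(cfg.IsSet("SERVER.Port"))  // true; keys are lower-cased
        fmt.Println(cfg.Get("server.port.x"))  // <nil>; "port" holds an int, not a map
    }
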
diff --git a/config/defaultConfigProvider_test.go b/config/defaultConfigProvider_test.go
new file mode 100644
index 000000000..cd6247e60
--- /dev/null
+++ b/config/defaultConfigProvider_test.go
@@ -0,0 +1,400 @@
+// Copyright 2021 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package config
+
+import (
+ "context"
+ "errors"
+ "fmt"
+ "strconv"
+ "strings"
+ "testing"
+
+ "github.com/gohugoio/hugo/common/para"
+
+ "github.com/gohugoio/hugo/common/maps"
+
+ qt "github.com/frankban/quicktest"
+)
+
+func TestDefaultConfigProvider(t *testing.T) {
+ c := qt.New(t)
+
+ c.Run("Set and get", func(c *qt.C) {
+ cfg := New()
+ var k string
+ var v any
+
+ k, v = "foo", "bar"
+ cfg.Set(k, v)
+ c.Assert(cfg.Get(k), qt.Equals, v)
+ c.Assert(cfg.Get(strings.ToUpper(k)), qt.Equals, v)
+ c.Assert(cfg.GetString(k), qt.Equals, v)
+
+ k, v = "foo", 42
+ cfg.Set(k, v)
+ c.Assert(cfg.Get(k), qt.Equals, v)
+ c.Assert(cfg.GetInt(k), qt.Equals, v)
+
+ c.Assert(cfg.Get(""), qt.DeepEquals, maps.Params{
+ "foo": 42,
+ })
+ })
+
+ c.Run("Set and get map", func(c *qt.C) {
+ cfg := New()
+
+ cfg.Set("foo", map[string]any{
+ "bar": "baz",
+ })
+
+ c.Assert(cfg.Get("foo"), qt.DeepEquals, maps.Params{
+ "bar": "baz",
+ })
+
+ c.Assert(cfg.GetStringMap("foo"), qt.DeepEquals, map[string]any{"bar": string("baz")})
+ c.Assert(cfg.GetStringMapString("foo"), qt.DeepEquals, map[string]string{"bar": string("baz")})
+ })
+
+ c.Run("Set and get nested", func(c *qt.C) {
+ cfg := New()
+
+ cfg.Set("a", map[string]any{
+ "B": "bv",
+ })
+ cfg.Set("a.c", "cv")
+
+ c.Assert(cfg.Get("a"), qt.DeepEquals, maps.Params{
+ "b": "bv",
+ "c": "cv",
+ })
+ c.Assert(cfg.Get("a.c"), qt.Equals, "cv")
+
+ cfg.Set("b.a", "av")
+ c.Assert(cfg.Get("b"), qt.DeepEquals, maps.Params{
+ "a": "av",
+ })
+
+ cfg.Set("b", map[string]any{
+ "b": "bv",
+ })
+
+ c.Assert(cfg.Get("b"), qt.DeepEquals, maps.Params{
+ "a": "av",
+ "b": "bv",
+ })
+
+ cfg = New()
+
+ cfg.Set("a", "av")
+
+ cfg.Set("", map[string]any{
+ "a": "av2",
+ "b": "bv2",
+ })
+
+ c.Assert(cfg.Get(""), qt.DeepEquals, maps.Params{
+ "a": "av2",
+ "b": "bv2",
+ })
+
+ cfg = New()
+
+ cfg.Set("a", "av")
+
+ cfg.Set("", map[string]any{
+ "b": "bv2",
+ })
+
+ c.Assert(cfg.Get(""), qt.DeepEquals, maps.Params{
+ "a": "av",
+ "b": "bv2",
+ })
+
+ cfg = New()
+
+ cfg.Set("", map[string]any{
+ "foo": map[string]any{
+ "a": "av",
+ },
+ })
+
+ cfg.Set("", map[string]any{
+ "foo": map[string]any{
+ "b": "bv2",
+ },
+ })
+
+ c.Assert(cfg.Get("foo"), qt.DeepEquals, maps.Params{
+ "a": "av",
+ "b": "bv2",
+ })
+ })
+
+ c.Run("Merge default strategy", func(c *qt.C) {
+ cfg := New()
+
+ cfg.Set("a", map[string]any{
+ "B": "bv",
+ })
+
+ cfg.Merge("a", map[string]any{
+ "B": "bv2",
+ "c": "cv2",
+ })
+
+ c.Assert(cfg.Get("a"), qt.DeepEquals, maps.Params{
+ "b": "bv",
+ "c": "cv2",
+ })
+
+ cfg = New()
+
+ cfg.Set("a", "av")
+
+ cfg.Merge("", map[string]any{
+ "a": "av2",
+ "b": "bv2",
+ })
+
+ c.Assert(cfg.Get(""), qt.DeepEquals, maps.Params{
+ "a": "av",
+ })
+ })
+
+ c.Run("Merge shallow", func(c *qt.C) {
+ cfg := New()
+
+ cfg.Set("a", map[string]any{
+ "_merge": "shallow",
+ "B": "bv",
+ "c": map[string]any{
+ "b": "bv",
+ },
+ })
+
+ cfg.Merge("a", map[string]any{
+ "c": map[string]any{
+ "d": "dv2",
+ },
+ "e": "ev2",
+ })
+
+ c.Assert(cfg.Get("a"), qt.DeepEquals, maps.Params{
+ "e": "ev2",
+ "_merge": maps.ParamsMergeStrategyShallow,
+ "b": "bv",
+ "c": maps.Params{
+ "b": "bv",
+ },
+ })
+ })
+
+ // Issue #8679
+ c.Run("Merge typed maps", func(c *qt.C) {
+ for _, left := range []any{
+ map[string]string{
+ "c": "cv1",
+ },
+ map[string]any{
+ "c": "cv1",
+ },
+ map[any]any{
+ "c": "cv1",
+ },
+ } {
+ cfg := New()
+
+ cfg.Set("", map[string]any{
+ "b": left,
+ })
+
+ cfg.Merge("", maps.Params{
+ "b": maps.Params{
+ "c": "cv2",
+ "d": "dv2",
+ },
+ })
+
+ c.Assert(cfg.Get(""), qt.DeepEquals, maps.Params{
+ "b": maps.Params{
+ "c": "cv1",
+ "d": "dv2",
+ },
+ })
+ }
+
+ for _, left := range []any{
+ map[string]string{
+ "b": "bv1",
+ },
+ map[string]any{
+ "b": "bv1",
+ },
+ map[any]any{
+ "b": "bv1",
+ },
+ } {
+ for _, right := range []any{
+ map[string]string{
+ "b": "bv2",
+ "c": "cv2",
+ },
+ map[string]any{
+ "b": "bv2",
+ "c": "cv2",
+ },
+ map[any]any{
+ "b": "bv2",
+ "c": "cv2",
+ },
+ } {
+ cfg := New()
+
+ cfg.Set("a", left)
+
+ cfg.Merge("a", right)
+
+ c.Assert(cfg.Get(""), qt.DeepEquals, maps.Params{
+ "a": maps.Params{
+ "b": "bv1",
+ "c": "cv2",
+ },
+ })
+ }
+ }
+ })
+
+ // Issue #8701
+ c.Run("Prevent _merge only maps", func(c *qt.C) {
+ cfg := New()
+
+ cfg.Set("", map[string]any{
+ "B": "bv",
+ })
+
+ cfg.Merge("", map[string]any{
+ "c": map[string]any{
+ "_merge": "shallow",
+ "d": "dv2",
+ },
+ })
+
+ c.Assert(cfg.Get(""), qt.DeepEquals, maps.Params{
+ "b": "bv",
+ })
+ })
+
+ c.Run("IsSet", func(c *qt.C) {
+ cfg := New()
+
+ cfg.Set("a", map[string]any{
+ "B": "bv",
+ })
+
+ c.Assert(cfg.IsSet("A"), qt.IsTrue)
+ c.Assert(cfg.IsSet("a.b"), qt.IsTrue)
+ c.Assert(cfg.IsSet("z"), qt.IsFalse)
+ })
+
+ c.Run("Para", func(c *qt.C) {
+ cfg := New()
+ p := para.New(4)
+ r, _ := p.Start(context.Background())
+
+ setAndGet := func(k string, v int) error {
+ vs := strconv.Itoa(v)
+ cfg.Set(k, v)
+ err := errors.New("get failed")
+ if cfg.Get(k) != v {
+ return err
+ }
+ if cfg.GetInt(k) != v {
+ return err
+ }
+ if cfg.GetString(k) != vs {
+ return err
+ }
+ if !cfg.IsSet(k) {
+ return err
+ }
+ return nil
+ }
+
+ for i := range 20 {
+ i := i
+ r.Run(func() error {
+ const v = 42
+ k := fmt.Sprintf("k%d", i)
+ if err := setAndGet(k, v); err != nil {
+ return err
+ }
+
+ m := maps.Params{
+ "new": 42,
+ }
+
+ cfg.Merge("", m)
+
+ return nil
+ })
+ }
+
+ c.Assert(r.Wait(), qt.IsNil)
+ })
+}
+
+func BenchmarkDefaultConfigProvider(b *testing.B) {
+ type cfger interface {
+ Get(key string) any
+ Set(key string, value any)
+ IsSet(key string) bool
+ }
+
+ newMap := func() map[string]any {
+ return map[string]any{
+ "a": map[string]any{
+ "b": map[string]any{
+ "c": 32,
+ "d": 43,
+ },
+ },
+ "b": 62,
+ }
+ }
+
+ runMethods := func(b *testing.B, cfg cfger) {
+ m := newMap()
+ cfg.Set("mymap", m)
+ cfg.Set("num", 32)
+ if !(cfg.IsSet("mymap") && cfg.IsSet("mymap.a") && cfg.IsSet("mymap.a.b") && cfg.IsSet("mymap.a.b.c")) {
+ b.Fatal("IsSet failed")
+ }
+
+ if cfg.Get("num") != 32 {
+ b.Fatal("Get failed")
+ }
+
+ if cfg.Get("mymap.a.b.c") != 32 {
+ b.Fatal("Get failed")
+ }
+ }
+
+ b.Run("Custom", func(b *testing.B) {
+ cfg := New()
+ for i := 0; i < b.N; i++ {
+ runMethods(b, cfg)
+ }
+ })
+}
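
To make the defaults from determineMergeStrategy visible, a sketch that assigns them with SetDefaultMergeStrategy and walks the tree; the section values are arbitrary and the output order follows map iteration:

    package main

    import (
        "fmt"

        "github.com/gohugoio/hugo/common/maps"
        "github.com/gohugoio/hugo/config"
    )

    func main() {
        cfg := config.New()
        cfg.Set("params", map[string]any{"color": "blue"})
        cfg.Set("menus", map[string]any{"main": map[string]any{"name": "Home"}})
        cfg.Set("outputformats", map[string]any{"html": map[string]any{}})

        cfg.SetDefaultMergeStrategy()

        cfg.WalkParams(func(params ...maps.KeyParams) bool {
            p := params[len(params)-1]
            if s, found := p.Params.GetMergeStrategy(); found {
                fmt.Println(p.Key, "=>", s) // e.g. params => deep, menus => shallow
            }
            return false // keep walking
        })
    }
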
diff --git a/config/env.go b/config/env.go
index f482cd247..4dcd63653 100644
--- a/config/env.go
+++ b/config/env.go
@@ -18,6 +18,12 @@ import (
"runtime"
"strconv"
"strings"
+
+ "github.com/pbnjay/memory"
+)
+
+const (
+ gigabyte = 1 << 30
)
// GetNumWorkerMultiplier returns the base value used to calculate the number
@@ -33,6 +39,36 @@ func GetNumWorkerMultiplier() int {
return runtime.NumCPU()
}
+// GetMemoryLimit returns the upper memory limit in bytes for Hugo's in-memory caches.
+// Note that this does not represent "all of the memory" that Hugo will use,
+// so it needs to be set to a lower number than the available system memory.
+// It will read from the HUGO_MEMORYLIMIT (in Gigabytes) environment variable.
+// If that is not set, it will set aside a quarter of the total system memory.
+func GetMemoryLimit() uint64 {
+ if mem := os.Getenv("HUGO_MEMORYLIMIT"); mem != "" {
+ if v := stringToGigabyte(mem); v > 0 {
+ return v
+ }
+ }
+
+ // There is a FreeMemory function, but as the kernel in most situations
+ // will take whatever memory is left and use it for caching etc.,
+ // that value is not something that we can use.
+ m := memory.TotalMemory()
+ if m != 0 {
+ return uint64(m / 4)
+ }
+
+ return 2 * gigabyte
+}
+
+func stringToGigabyte(f string) uint64 {
+ if v, err := strconv.ParseFloat(f, 32); err == nil && v > 0 {
+ return uint64(v * gigabyte)
+ }
+ return 0
+}
+
// SetEnvVars sets vars on the form key=value in the oldVars slice.
func SetEnvVars(oldVars *[]string, keyValues ...string) {
for i := 0; i < len(keyValues); i += 2 {
@@ -41,8 +77,8 @@ func SetEnvVars(oldVars *[]string, keyValues ...string) {
}
func SplitEnvVar(v string) (string, string) {
- parts := strings.Split(v, "=")
- return parts[0], parts[1]
+ name, value, _ := strings.Cut(v, "=")
+ return name, value
}
func setEnvVar(vars *[]string, key, value string) {
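
A short sketch of how GetMemoryLimit resolves its value: HUGO_MEMORYLIMIT is read as a (possibly fractional) number of gigabytes and converted to bytes, with a quarter of total system memory as the fallback.

    package main

    import (
        "fmt"
        "os"

        "github.com/gohugoio/hugo/config"
    )

    func main() {
        // Explicit limit: 1.5 GB expressed in bytes.
        os.Setenv("HUGO_MEMORYLIMIT", "1.5")
        fmt.Println(config.GetMemoryLimit()) // 1610612736 (1.5 * 2^30)

        // Unset: a quarter of total memory, or 2 GB if that cannot be determined.
        os.Unsetenv("HUGO_MEMORYLIMIT")
        fmt.Println(config.GetMemoryLimit())
    }
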
diff --git a/config/namespace.go b/config/namespace.go
new file mode 100644
index 000000000..e41b56e2d
--- /dev/null
+++ b/config/namespace.go
@@ -0,0 +1,75 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package config
+
+import (
+ "encoding/json"
+
+ "github.com/gohugoio/hugo/common/hashing"
+)
+
+func DecodeNamespace[S, C any](configSource any, buildConfig func(any) (C, any, error)) (*ConfigNamespace[S, C], error) {
+ // Calculate the hash of the input (not including any defaults applied later).
+ // This allows us to introduce new config options without breaking the hash.
+ h := hashing.HashStringHex(configSource)
+
+ // Build the config
+ c, ext, err := buildConfig(configSource)
+ if err != nil {
+ return nil, err
+ }
+
+ if ext == nil {
+ ext = configSource
+ }
+
+ if ext == nil {
+ panic("ext is nil")
+ }
+
+ ns := &ConfigNamespace[S, C]{
+ SourceStructure: ext,
+ SourceHash: h,
+ Config: c,
+ }
+
+ return ns, nil
+}
+
+// ConfigNamespace holds a Hugo configuration namespace.
+// The construct looks a little odd, but it's built to make the configuration elements
+// both self-documenting and contained in a common structure.
+type ConfigNamespace[S, C any] struct {
+ // SourceStructure represents the source configuration with any defaults applied.
+ // This is used for documentation and printing of the configuration setup to the user.
+ SourceStructure any
+
+ // SourceHash is a hash of the source configuration before any defaults gets applied.
+ SourceHash string
+
+ // Config is the final configuration as used by Hugo.
+ Config C
+}
+
+// MarshalJSON marshals the source structure.
+func (ns *ConfigNamespace[S, C]) MarshalJSON() ([]byte, error) {
+ return json.Marshal(ns.SourceStructure)
+}
+
+// Signature returns the signature of the source structure.
+// Note that this is for documentation purposes only and SourceStructure may not always be cast to S (it's usually just a map).
+func (ns *ConfigNamespace[S, C]) Signature() S {
+ var s S
+ return s
+}
diff --git a/config/namespace_test.go b/config/namespace_test.go
new file mode 100644
index 000000000..f443523a4
--- /dev/null
+++ b/config/namespace_test.go
@@ -0,0 +1,60 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package config
+
+import (
+ "strings"
+ "testing"
+
+ qt "github.com/frankban/quicktest"
+ "github.com/gohugoio/hugo/common/maps"
+ "github.com/mitchellh/mapstructure"
+)
+
+func TestNamespace(t *testing.T) {
+ c := qt.New(t)
+ c.Assert(true, qt.Equals, true)
+
+ // ns, err := config.DecodeNamespace[map[string]DocsMediaTypeConfig](in, defaultMediaTypesConfig, buildConfig)
+
+ ns, err := DecodeNamespace[[]*tstNsExt](
+ map[string]any{"foo": "bar"},
+ func(v any) (*tstNsExt, any, error) {
+ t := &tstNsExt{}
+ m, err := maps.ToStringMapE(v)
+ if err != nil {
+ return nil, nil, err
+ }
+ return t, nil, mapstructure.WeakDecode(m, t)
+ },
+ )
+
+ c.Assert(err, qt.IsNil)
+ c.Assert(ns, qt.Not(qt.IsNil))
+ c.Assert(ns.SourceStructure, qt.DeepEquals, map[string]any{"foo": "bar"})
+ c.Assert(ns.SourceHash, qt.Equals, "1420f6c7782f7459")
+ c.Assert(ns.Config, qt.DeepEquals, &tstNsExt{Foo: "bar"})
+ c.Assert(ns.Signature(), qt.DeepEquals, []*tstNsExt(nil))
+}
+
+type (
+ tstNsExt struct {
+ Foo string
+ }
+)
+
+func (t *tstNsExt) Init() error {
+ t.Foo = strings.ToUpper(t.Foo)
+ return nil
+}
diff --git a/config/privacy/privacyConfig.go b/config/privacy/privacyConfig.go
index ea34563eb..900f73540 100644
--- a/config/privacy/privacyConfig.go
+++ b/config/privacy/privacyConfig.go
@@ -30,9 +30,10 @@ type Config struct {
Disqus Disqus
GoogleAnalytics GoogleAnalytics
Instagram Instagram
- Twitter Twitter
+ Twitter Twitter // deprecated in favor of X in v0.141.0
Vimeo Vimeo
YouTube YouTube
+ X X
}
// Disqus holds the privacy configuration settings related to the Disqus template.
@@ -44,15 +45,9 @@ type Disqus struct {
type GoogleAnalytics struct {
Service `mapstructure:",squash"`
- // Enabling this will disable the use of Cookies and use Session Storage to Store the GA Client ID.
- UseSessionStorage bool
-
// Enabling this will make the GA templates respect the
// "Do Not Track" HTTP header. See https://www.paulfurley.com/google-analytics-dnt/.
RespectDoNotTrack bool
-
- // Enabling this will make it so the users' IP addresses are anonymized within Google Analytics.
- AnonymizeIP bool
}
// Instagram holds the privacy configuration settings related to the Instagram shortcode.
@@ -64,7 +59,8 @@ type Instagram struct {
Simple bool
}
-// Twitter holds the privacy configuration settingsrelated to the Twitter shortcode.
+// Twitter holds the privacy configuration settings related to the Twitter shortcode.
+// Deprecated in favor of X in v0.141.0.
type Twitter struct {
Service `mapstructure:",squash"`
@@ -76,17 +72,21 @@ type Twitter struct {
Simple bool
}
-// Vimeo holds the privacy configuration settingsrelated to the Vimeo shortcode.
+// Vimeo holds the privacy configuration settings related to the Vimeo shortcode.
type Vimeo struct {
Service `mapstructure:",squash"`
+ // When set to true, the Vimeo player will be blocked from tracking any session data,
+ // including all cookies and stats.
+ EnableDNT bool
+
// If simple mode is enabled, only a thumbnail is fetched from i.vimeocdn.com and
// shown with a play button overlaid. If a user clicks the button, he/she will
// be taken to the video page on vimeo.com in a new browser tab.
Simple bool
}
-// YouTube holds the privacy configuration settingsrelated to the YouTube shortcode.
+// YouTube holds the privacy configuration settings related to the YouTube shortcode.
type YouTube struct {
Service `mapstructure:",squash"`
@@ -96,6 +96,20 @@ type YouTube struct {
PrivacyEnhanced bool
}
+// X holds the privacy configuration settings related to the X shortcode.
+type X struct {
+ Service `mapstructure:",squash"`
+
+ // When set to true, the X post and its embedded page on your site are not
+ // used for purposes that include personalized suggestions and personalized
+ // ads.
+ EnableDNT bool
+
+ // If simple mode is enabled, a static and no-JS version of the X post will
+ // be built.
+ Simple bool
+}
+
// DecodeConfig creates a privacy Config from a given Hugo configuration.
func DecodeConfig(cfg config.Provider) (pc Config, err error) {
if !cfg.IsSet(privacyConfigKey) {
diff --git a/config/privacy/privacyConfig_test.go b/config/privacy/privacyConfig_test.go
index d798721e1..1dd20215b 100644
--- a/config/privacy/privacyConfig_test.go
+++ b/config/privacy/privacyConfig_test.go
@@ -18,7 +18,6 @@ import (
qt "github.com/frankban/quicktest"
"github.com/gohugoio/hugo/config"
- "github.com/spf13/viper"
)
func TestDecodeConfigFromTOML(t *testing.T) {
@@ -34,17 +33,16 @@ disable = true
[privacy.googleAnalytics]
disable = true
respectDoNotTrack = true
-anonymizeIP = true
-useSessionStorage = true
[privacy.instagram]
disable = true
simple = true
-[privacy.twitter]
+[privacy.x]
disable = true
enableDNT = true
simple = true
[privacy.vimeo]
disable = true
+enableDNT = true
simple = true
[privacy.youtube]
disable = true
@@ -60,15 +58,14 @@ simple = true
got := []bool{
pc.Disqus.Disable, pc.GoogleAnalytics.Disable,
- pc.GoogleAnalytics.RespectDoNotTrack, pc.GoogleAnalytics.AnonymizeIP,
- pc.GoogleAnalytics.UseSessionStorage, pc.Instagram.Disable,
- pc.Instagram.Simple, pc.Twitter.Disable, pc.Twitter.EnableDNT,
- pc.Twitter.Simple, pc.Vimeo.Disable, pc.Vimeo.Simple,
- pc.YouTube.PrivacyEnhanced, pc.YouTube.Disable,
+ pc.GoogleAnalytics.RespectDoNotTrack, pc.Instagram.Disable,
+ pc.Instagram.Simple,
+ pc.Vimeo.Disable, pc.Vimeo.EnableDNT, pc.Vimeo.Simple,
+ pc.YouTube.PrivacyEnhanced, pc.YouTube.Disable, pc.X.Disable, pc.X.EnableDNT,
+ pc.X.Simple,
}
c.Assert(got, qt.All(qt.Equals), true)
-
}
func TestDecodeConfigFromTOMLCaseInsensitive(t *testing.T) {
@@ -94,7 +91,7 @@ PrivacyENhanced = true
func TestDecodeConfigDefault(t *testing.T) {
c := qt.New(t)
- pc, err := DecodeConfig(viper.New())
+ pc, err := DecodeConfig(config.New())
c.Assert(err, qt.IsNil)
c.Assert(pc, qt.Not(qt.IsNil))
c.Assert(pc.YouTube.PrivacyEnhanced, qt.Equals, false)
diff --git a/config/security/securityConfig.go b/config/security/securityConfig.go
new file mode 100644
index 000000000..a3ec5197d
--- /dev/null
+++ b/config/security/securityConfig.go
@@ -0,0 +1,230 @@
+// Copyright 2018 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package security
+
+import (
+ "bytes"
+ "encoding/json"
+ "errors"
+ "fmt"
+ "reflect"
+ "strings"
+
+ "github.com/gohugoio/hugo/common/herrors"
+ "github.com/gohugoio/hugo/common/types"
+ "github.com/gohugoio/hugo/config"
+ "github.com/gohugoio/hugo/parser"
+ "github.com/gohugoio/hugo/parser/metadecoders"
+ "github.com/mitchellh/mapstructure"
+)
+
+const securityConfigKey = "security"
+
+// DefaultConfig holds the default security policy.
+var DefaultConfig = Config{
+ Exec: Exec{
+ Allow: MustNewWhitelist(
+ "^(dart-)?sass(-embedded)?$", // sass, dart-sass, dart-sass-embedded.
+ "^go$", // for Go Modules
+ "^git$", // For Git info
+ "^npx$", // used by all Node tools (Babel, PostCSS).
+ "^postcss$",
+ "^tailwindcss$",
+ ),
+ // These have been tested to work with Hugo's external programs
+ // on Windows, Linux and MacOS.
+ OsEnv: MustNewWhitelist(`(?i)^((HTTPS?|NO)_PROXY|PATH(EXT)?|APPDATA|TE?MP|TERM|GO\w+|(XDG_CONFIG_)?HOME|USERPROFILE|SSH_AUTH_SOCK|DISPLAY|LANG|SYSTEMDRIVE)$`),
+ },
+ Funcs: Funcs{
+ Getenv: MustNewWhitelist("^HUGO_", "^CI$"),
+ },
+ HTTP: HTTP{
+ URLs: MustNewWhitelist(".*"),
+ Methods: MustNewWhitelist("(?i)GET|POST"),
+ },
+}
+
+// Config is the top level security config.
+// {"name": "security", "description": "This section holds the top level security config.", "newIn": "0.91.0" }
+type Config struct {
+ // Restricts access to os/exec.
+ // { "newIn": "0.91.0" }
+ Exec Exec `json:"exec"`
+
+ // Restricts access to certain template funcs.
+ Funcs Funcs `json:"funcs"`
+
+ // Restricts access to resources.GetRemote, getJSON, getCSV.
+ HTTP HTTP `json:"http"`
+
+ // Allow inline shortcodes
+ EnableInlineShortcodes bool `json:"enableInlineShortcodes"`
+}
+
+// Exec holds os/exec policies.
+type Exec struct {
+ Allow Whitelist `json:"allow"`
+ OsEnv Whitelist `json:"osEnv"`
+}
+
+// Funcs holds template funcs policies.
+type Funcs struct {
+ // OS env keys allowed to query in os.Getenv.
+ Getenv Whitelist `json:"getenv"`
+}
+
+type HTTP struct {
+ // URLs to allow in remote HTTP (resources.Get, getJSON, getCSV).
+ URLs Whitelist `json:"urls"`
+
+ // HTTP methods to allow.
+ Methods Whitelist `json:"methods"`
+
+ // Media types where the Content-Type in the response is used instead of resolving from the file content.
+ MediaTypes Whitelist `json:"mediaTypes"`
+}
+
+// ToTOML converts c to TOML with [security] as the root.
+func (c Config) ToTOML() string {
+ sec := c.ToSecurityMap()
+
+ var b bytes.Buffer
+
+ if err := parser.InterfaceToConfig(sec, metadecoders.TOML, &b); err != nil {
+ panic(err)
+ }
+
+ return strings.TrimSpace(b.String())
+}
+
+func (c Config) CheckAllowedExec(name string) error {
+ if !c.Exec.Allow.Accept(name) {
+ return &AccessDeniedError{
+ name: name,
+ path: "security.exec.allow",
+ policies: c.ToTOML(),
+ }
+ }
+ return nil
+}
+
+func (c Config) CheckAllowedGetEnv(name string) error {
+ if !c.Funcs.Getenv.Accept(name) {
+ return &AccessDeniedError{
+ name: name,
+ path: "security.funcs.getenv",
+ policies: c.ToTOML(),
+ }
+ }
+ return nil
+}
+
+func (c Config) CheckAllowedHTTPURL(url string) error {
+ if !c.HTTP.URLs.Accept(url) {
+ return &AccessDeniedError{
+ name: url,
+ path: "security.http.urls",
+ policies: c.ToTOML(),
+ }
+ }
+ return nil
+}
+
+func (c Config) CheckAllowedHTTPMethod(method string) error {
+ if !c.HTTP.Methods.Accept(method) {
+ return &AccessDeniedError{
+ name: method,
+ path: "security.http.method",
+ policies: c.ToTOML(),
+ }
+ }
+ return nil
+}
+
+// ToSecurityMap converts c to a map with 'security' as the root key.
+func (c Config) ToSecurityMap() map[string]any {
+ // Take it to JSON and back to get proper casing etc.
+ asJson, err := json.Marshal(c)
+ herrors.Must(err)
+ m := make(map[string]any)
+ herrors.Must(json.Unmarshal(asJson, &m))
+
+ // Add the root
+ sec := map[string]any{
+ "security": m,
+ }
+ return sec
+}
+
+// DecodeConfig creates a security Config from a given Hugo configuration.
+func DecodeConfig(cfg config.Provider) (Config, error) {
+ sc := DefaultConfig
+ if cfg.IsSet(securityConfigKey) {
+ m := cfg.GetStringMap(securityConfigKey)
+ dec, err := mapstructure.NewDecoder(
+ &mapstructure.DecoderConfig{
+ WeaklyTypedInput: true,
+ Result: &sc,
+ DecodeHook: stringSliceToWhitelistHook(),
+ },
+ )
+ if err != nil {
+ return sc, err
+ }
+
+ if err = dec.Decode(m); err != nil {
+ return sc, err
+ }
+ }
+
+ if !sc.EnableInlineShortcodes {
+ // Legacy
+ sc.EnableInlineShortcodes = cfg.GetBool("enableInlineShortcodes")
+ }
+
+ return sc, nil
+}
+
+func stringSliceToWhitelistHook() mapstructure.DecodeHookFuncType {
+ return func(
+ f reflect.Type,
+ t reflect.Type,
+ data any,
+ ) (any, error) {
+ if t != reflect.TypeOf(Whitelist{}) {
+ return data, nil
+ }
+
+ wl := types.ToStringSlicePreserveString(data)
+
+ return NewWhitelist(wl...)
+ }
+}
+
+// AccessDeniedError represents a security policy conflict.
+type AccessDeniedError struct {
+ path string
+ name string
+ policies string
+}
+
+func (e *AccessDeniedError) Error() string {
+ return fmt.Sprintf("access denied: %q is not whitelisted in policy %q; the current security configuration is:\n\n%s\n\n", e.name, e.path, e.policies)
+}
+
+// IsAccessDenied reports whether err is an AccessDeniedError
+func IsAccessDenied(err error) bool {
+ var notFoundErr *AccessDeniedError
+ return errors.As(err, &notFoundErr)
+}
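
A brief sketch of the policy checks against DefaultConfig; "pandoc" is just an arbitrary binary outside the default allow list.

    package main

    import (
        "fmt"

        "github.com/gohugoio/hugo/config/security"
    )

    func main() {
        conf := security.DefaultConfig

        // "^git$" is part of the default exec allow list.
        fmt.Println(conf.CheckAllowedExec("git")) // <nil>

        // Anything else yields an *AccessDeniedError carrying the policy
        // path and the current policy rendered as TOML.
        err := conf.CheckAllowedExec("pandoc")
        fmt.Println(security.IsAccessDenied(err)) // true
    }
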
diff --git a/config/security/securityConfig_test.go b/config/security/securityConfig_test.go
new file mode 100644
index 000000000..faa05a97f
--- /dev/null
+++ b/config/security/securityConfig_test.go
@@ -0,0 +1,167 @@
+// Copyright 2018 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package security
+
+import (
+ "testing"
+
+ qt "github.com/frankban/quicktest"
+ "github.com/gohugoio/hugo/config"
+)
+
+func TestDecodeConfigFromTOML(t *testing.T) {
+ c := qt.New(t)
+
+ c.Run("Slice whitelist", func(c *qt.C) {
+ c.Parallel()
+ tomlConfig := `
+
+
+someOtherValue = "bar"
+
+[security]
+enableInlineShortcodes=true
+[security.exec]
+allow=["a", "b"]
+osEnv=["a", "b", "c"]
+[security.funcs]
+getEnv=["a", "b"]
+
+`
+
+ cfg, err := config.FromConfigString(tomlConfig, "toml")
+ c.Assert(err, qt.IsNil)
+
+ pc, err := DecodeConfig(cfg)
+ c.Assert(err, qt.IsNil)
+ c.Assert(pc, qt.Not(qt.IsNil))
+ c.Assert(pc.EnableInlineShortcodes, qt.IsTrue)
+ c.Assert(pc.Exec.Allow.Accept("a"), qt.IsTrue)
+ c.Assert(pc.Exec.Allow.Accept("d"), qt.IsFalse)
+ c.Assert(pc.Exec.OsEnv.Accept("a"), qt.IsTrue)
+ c.Assert(pc.Exec.OsEnv.Accept("e"), qt.IsFalse)
+ c.Assert(pc.Funcs.Getenv.Accept("a"), qt.IsTrue)
+ c.Assert(pc.Funcs.Getenv.Accept("c"), qt.IsFalse)
+ })
+
+ c.Run("String whitelist", func(c *qt.C) {
+ c.Parallel()
+ tomlConfig := `
+
+
+someOtherValue = "bar"
+
+[security]
+[security.exec]
+allow="a"
+osEnv="b"
+
+`
+
+ cfg, err := config.FromConfigString(tomlConfig, "toml")
+ c.Assert(err, qt.IsNil)
+
+ pc, err := DecodeConfig(cfg)
+ c.Assert(err, qt.IsNil)
+ c.Assert(pc, qt.Not(qt.IsNil))
+ c.Assert(pc.Exec.Allow.Accept("a"), qt.IsTrue)
+ c.Assert(pc.Exec.Allow.Accept("d"), qt.IsFalse)
+ c.Assert(pc.Exec.OsEnv.Accept("b"), qt.IsTrue)
+ c.Assert(pc.Exec.OsEnv.Accept("e"), qt.IsFalse)
+ })
+
+ c.Run("Default exec.osEnv", func(c *qt.C) {
+ c.Parallel()
+ tomlConfig := `
+
+
+someOtherValue = "bar"
+
+[security]
+[security.exec]
+allow="a"
+
+`
+
+ cfg, err := config.FromConfigString(tomlConfig, "toml")
+ c.Assert(err, qt.IsNil)
+
+ pc, err := DecodeConfig(cfg)
+ c.Assert(err, qt.IsNil)
+ c.Assert(pc, qt.Not(qt.IsNil))
+ c.Assert(pc.Exec.Allow.Accept("a"), qt.IsTrue)
+ c.Assert(pc.Exec.OsEnv.Accept("PATH"), qt.IsTrue)
+ c.Assert(pc.Exec.OsEnv.Accept("e"), qt.IsFalse)
+ })
+
+ c.Run("Enable inline shortcodes, legacy", func(c *qt.C) {
+ c.Parallel()
+ tomlConfig := `
+
+
+someOtherValue = "bar"
+enableInlineShortcodes=true
+
+[security]
+[security.exec]
+allow="a"
+osEnv="b"
+
+`
+
+ cfg, err := config.FromConfigString(tomlConfig, "toml")
+ c.Assert(err, qt.IsNil)
+
+ pc, err := DecodeConfig(cfg)
+ c.Assert(err, qt.IsNil)
+ c.Assert(pc.EnableInlineShortcodes, qt.IsTrue)
+ })
+}
+
+func TestToTOML(t *testing.T) {
+ c := qt.New(t)
+
+ got := DefaultConfig.ToTOML()
+
+ c.Assert(got, qt.Equals,
+ "[security]\n enableInlineShortcodes = false\n\n [security.exec]\n allow = ['^(dart-)?sass(-embedded)?$', '^go$', '^git$', '^npx$', '^postcss$', '^tailwindcss$']\n osEnv = ['(?i)^((HTTPS?|NO)_PROXY|PATH(EXT)?|APPDATA|TE?MP|TERM|GO\\w+|(XDG_CONFIG_)?HOME|USERPROFILE|SSH_AUTH_SOCK|DISPLAY|LANG|SYSTEMDRIVE)$']\n\n [security.funcs]\n getenv = ['^HUGO_', '^CI$']\n\n [security.http]\n methods = ['(?i)GET|POST']\n urls = ['.*']",
+ )
+}
+
+func TestDecodeConfigDefault(t *testing.T) {
+ t.Parallel()
+ c := qt.New(t)
+
+ pc, err := DecodeConfig(config.New())
+ c.Assert(err, qt.IsNil)
+ c.Assert(pc, qt.Not(qt.IsNil))
+ c.Assert(pc.Exec.Allow.Accept("a"), qt.IsFalse)
+ c.Assert(pc.Exec.Allow.Accept("npx"), qt.IsTrue)
+ c.Assert(pc.Exec.Allow.Accept("Npx"), qt.IsFalse)
+
+ c.Assert(pc.HTTP.URLs.Accept("https://example.org"), qt.IsTrue)
+ c.Assert(pc.HTTP.Methods.Accept("POST"), qt.IsTrue)
+ c.Assert(pc.HTTP.Methods.Accept("GET"), qt.IsTrue)
+ c.Assert(pc.HTTP.Methods.Accept("get"), qt.IsTrue)
+ c.Assert(pc.HTTP.Methods.Accept("DELETE"), qt.IsFalse)
+ c.Assert(pc.HTTP.MediaTypes.Accept("application/msword"), qt.IsFalse)
+
+ c.Assert(pc.Exec.OsEnv.Accept("PATH"), qt.IsTrue)
+ c.Assert(pc.Exec.OsEnv.Accept("GOROOT"), qt.IsTrue)
+ c.Assert(pc.Exec.OsEnv.Accept("HOME"), qt.IsTrue)
+ c.Assert(pc.Exec.OsEnv.Accept("SSH_AUTH_SOCK"), qt.IsTrue)
+ c.Assert(pc.Exec.OsEnv.Accept("a"), qt.IsFalse)
+ c.Assert(pc.Exec.OsEnv.Accept("e"), qt.IsFalse)
+ c.Assert(pc.Exec.OsEnv.Accept("MYSECRET"), qt.IsFalse)
+}
diff --git a/config/security/whitelist.go b/config/security/whitelist.go
new file mode 100644
index 000000000..5ce369a1f
--- /dev/null
+++ b/config/security/whitelist.go
@@ -0,0 +1,116 @@
+// Copyright 2021 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package security
+
+import (
+ "encoding/json"
+ "fmt"
+ "regexp"
+ "strings"
+)
+
+const (
+ acceptNoneKeyword = "none"
+)
+
+// Whitelist holds a whitelist.
+type Whitelist struct {
+ acceptNone bool
+ patterns []*regexp.Regexp
+
+ // Store this for debugging/error reporting
+ patternsStrings []string
+}
+
+// MarshalJSON is for internal use only.
+func (w Whitelist) MarshalJSON() ([]byte, error) {
+ if w.acceptNone {
+ return json.Marshal(acceptNoneKeyword)
+ }
+
+ return json.Marshal(w.patternsStrings)
+}
+
+// NewWhitelist creates a new Whitelist from zero or more patterns.
+// An empty patterns list, or a pattern with the value 'none', creates
+// a whitelist that accepts nothing.
+func NewWhitelist(patterns ...string) (Whitelist, error) {
+ if len(patterns) == 0 {
+ return Whitelist{acceptNone: true}, nil
+ }
+
+ var acceptSome bool
+ var patternsStrings []string
+
+ for _, p := range patterns {
+ if p == acceptNoneKeyword {
+ acceptSome = false
+ break
+ }
+
+ if ps := strings.TrimSpace(p); ps != "" {
+ acceptSome = true
+ patternsStrings = append(patternsStrings, ps)
+ }
+ }
+
+ if !acceptSome {
+ return Whitelist{
+ acceptNone: true,
+ }, nil
+ }
+
+ var patternsr []*regexp.Regexp
+
+ for i := range patterns {
+ p := strings.TrimSpace(patterns[i])
+ if p == "" {
+ continue
+ }
+ re, err := regexp.Compile(p)
+ if err != nil {
+ return Whitelist{}, fmt.Errorf("failed to compile whitelist pattern %q: %w", p, err)
+ }
+ patternsr = append(patternsr, re)
+ }
+
+ return Whitelist{patterns: patternsr, patternsStrings: patternsStrings}, nil
+}
+
+// MustNewWhitelist creates a new Whitelist from zero or more patterns and panics on error.
+func MustNewWhitelist(patterns ...string) Whitelist {
+ w, err := NewWhitelist(patterns...)
+ if err != nil {
+ panic(err)
+ }
+ return w
+}
+
+// Accept reports whether name is whitelisted.
+func (w Whitelist) Accept(name string) bool {
+ if w.acceptNone {
+ return false
+ }
+
+ for _, p := range w.patterns {
+ if p.MatchString(name) {
+ return true
+ }
+ }
+ return false
+}
+
+func (w Whitelist) String() string {
+ return fmt.Sprint(w.patternsStrings)
+}
diff --git a/config/security/whitelist_test.go b/config/security/whitelist_test.go
new file mode 100644
index 000000000..add3345a8
--- /dev/null
+++ b/config/security/whitelist_test.go
@@ -0,0 +1,46 @@
+// Copyright 2021 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package security
+
+import (
+ "testing"
+
+ qt "github.com/frankban/quicktest"
+)
+
+func TestWhitelist(t *testing.T) {
+ t.Parallel()
+ c := qt.New(t)
+
+ c.Run("none", func(c *qt.C) {
+ c.Assert(MustNewWhitelist("none", "foo").Accept("foo"), qt.IsFalse)
+ c.Assert(MustNewWhitelist().Accept("foo"), qt.IsFalse)
+ c.Assert(MustNewWhitelist("").Accept("foo"), qt.IsFalse)
+ c.Assert(MustNewWhitelist(" ", " ").Accept("foo"), qt.IsFalse)
+ c.Assert(Whitelist{}.Accept("foo"), qt.IsFalse)
+ })
+
+ c.Run("One", func(c *qt.C) {
+ w := MustNewWhitelist("^foo.*")
+ c.Assert(w.Accept("foo"), qt.IsTrue)
+ c.Assert(w.Accept("mfoo"), qt.IsFalse)
+ })
+
+ c.Run("Multiple", func(c *qt.C) {
+ w := MustNewWhitelist("^foo.*", "^bar.*")
+ c.Assert(w.Accept("foo"), qt.IsTrue)
+ c.Assert(w.Accept("bar"), qt.IsTrue)
+ c.Assert(w.Accept("mbar"), qt.IsFalse)
+ })
+}
diff --git a/config/services/servicesConfig.go b/config/services/servicesConfig.go
index 559848f5c..f9d5e1a6e 100644
--- a/config/services/servicesConfig.go
+++ b/config/services/servicesConfig.go
@@ -31,7 +31,8 @@ type Config struct {
Disqus Disqus
GoogleAnalytics GoogleAnalytics
Instagram Instagram
- Twitter Twitter
+ Twitter Twitter // deprecated in favor of X in v0.141.0
+ X X
RSS RSS
}
@@ -53,9 +54,15 @@ type Instagram struct {
// This means that if you use Bootstrap 4 or want to provide your own CSS, you want
// to disable the inline CSS provided by Hugo.
DisableInlineCSS bool
+
+ // App or Client Access Token.
+ // If you are using a Client Access Token, remember that you must combine it with your App ID
+ // using a pipe symbol (|); otherwise the request will fail.
+ AccessToken string
}
// Twitter holds the functional configuration settings related to the Twitter shortcodes.
+// Deprecated in favor of X in v0.141.0.
type Twitter struct {
// The Simple variant of Twitter is decorated with a basic set of inline styles.
// This means that if you want to provide your own CSS, you want
@@ -63,6 +70,14 @@ type Twitter struct {
DisableInlineCSS bool
}
+// X holds the functional configuration settings related to the X shortcodes.
+type X struct {
+ // The Simple variant of X is decorated with a basic set of inline styles.
+ // This means that if you want to provide your own CSS, you want
+ // to disable the inline CSS provided by Hugo.
+ DisableInlineCSS bool
+}
+
// RSS holds the functional configuration settings related to the RSS feeds.
type RSS struct {
// Limit the number of pages.
@@ -86,6 +101,9 @@ func DecodeConfig(cfg config.Provider) (c Config, err error) {
if c.RSS.Limit == 0 {
c.RSS.Limit = cfg.GetInt(rssLimitKey)
+ if c.RSS.Limit == 0 {
+ c.RSS.Limit = -1
+ }
}
return
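
The effect of the new fallback, as a sketch; reading -1 as an "unlimited" sentinel is an assumption on my part; the change above only guarantees the decoded value is no longer 0.

    package main

    import (
        "fmt"

        "github.com/gohugoio/hugo/config"
        "github.com/gohugoio/hugo/config/services"
    )

    func main() {
        // No rss limit configured anywhere.
        c, err := services.DecodeConfig(config.New())
        if err != nil {
            panic(err)
        }
        fmt.Println(c.RSS.Limit) // -1
    }
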
diff --git a/config/services/servicesConfig_test.go b/config/services/servicesConfig_test.go
index ed3038159..952a7fe1c 100644
--- a/config/services/servicesConfig_test.go
+++ b/config/services/servicesConfig_test.go
@@ -18,7 +18,6 @@ import (
qt "github.com/frankban/quicktest"
"github.com/gohugoio/hugo/config"
- "github.com/spf13/viper"
)
func TestDecodeConfigFromTOML(t *testing.T) {
@@ -37,6 +36,8 @@ id = "ga_id"
disableInlineCSS = true
[services.twitter]
disableInlineCSS = true
+[services.x]
+disableInlineCSS = true
`
cfg, err := config.FromConfigString(tomlConfig, "toml")
c.Assert(err, qt.IsNil)
@@ -55,7 +56,7 @@ disableInlineCSS = true
func TestUseSettingsFromRootIfSet(t *testing.T) {
c := qt.New(t)
- cfg := viper.New()
+ cfg := config.New()
cfg.Set("disqusShortname", "root_short")
cfg.Set("googleAnalytics", "ga_root")
@@ -65,5 +66,4 @@ func TestUseSettingsFromRootIfSet(t *testing.T) {
c.Assert(config.Disqus.Shortname, qt.Equals, "root_short")
c.Assert(config.GoogleAnalytics.ID, qt.Equals, "ga_root")
-
}
diff --git a/config/sitemap.go b/config/sitemap.go
deleted file mode 100644
index 4031b7ec1..000000000
--- a/config/sitemap.go
+++ /dev/null
@@ -1,44 +0,0 @@
-// Copyright 2019 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package config
-
-import (
- "github.com/spf13/cast"
- jww "github.com/spf13/jwalterweatherman"
-)
-
-// Sitemap configures the sitemap to be generated.
-type Sitemap struct {
- ChangeFreq string
- Priority float64
- Filename string
-}
-
-func DecodeSitemap(prototype Sitemap, input map[string]interface{}) Sitemap {
-
- for key, value := range input {
- switch key {
- case "changefreq":
- prototype.ChangeFreq = cast.ToString(value)
- case "priority":
- prototype.Priority = cast.ToFloat64(value)
- case "filename":
- prototype.Filename = cast.ToString(value)
- default:
- jww.WARN.Printf("Unknown Sitemap field: %s\n", key)
- }
- }
-
- return prototype
-}
diff --git a/config/testconfig/testconfig.go b/config/testconfig/testconfig.go
new file mode 100644
index 000000000..8f70e6cb7
--- /dev/null
+++ b/config/testconfig/testconfig.go
@@ -0,0 +1,83 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+// This package should only be used for testing.
+package testconfig
+
+import (
+ _ "unsafe"
+
+ "github.com/gohugoio/hugo/common/maps"
+ "github.com/gohugoio/hugo/config"
+ "github.com/gohugoio/hugo/config/allconfig"
+ "github.com/gohugoio/hugo/deps"
+ "github.com/gohugoio/hugo/hugofs"
+ toml "github.com/pelletier/go-toml/v2"
+ "github.com/spf13/afero"
+)
+
+func GetTestConfigs(fs afero.Fs, cfg config.Provider) *allconfig.Configs {
+ if fs == nil {
+ fs = afero.NewMemMapFs()
+ }
+ if cfg == nil {
+ cfg = config.New()
+ }
+ // Make sure that the workingDir exists.
+ workingDir := cfg.GetString("workingDir")
+ if workingDir != "" {
+ if err := fs.MkdirAll(workingDir, 0o777); err != nil {
+ panic(err)
+ }
+ }
+
+ configs, err := allconfig.LoadConfig(allconfig.ConfigSourceDescriptor{Fs: fs, Flags: cfg, Environ: []string{"EMPTY_TEST_ENVIRONMENT"}})
+ if err != nil {
+ panic(err)
+ }
+ return configs
+}
+
+func GetTestConfig(fs afero.Fs, cfg config.Provider) config.AllProvider {
+ return GetTestConfigs(fs, cfg).GetFirstLanguageConfig()
+}
+
+func GetTestDeps(fs afero.Fs, cfg config.Provider, beforeInit ...func(*deps.Deps)) *deps.Deps {
+ if fs == nil {
+ fs = afero.NewMemMapFs()
+ }
+ conf := GetTestConfig(fs, cfg)
+ d := &deps.Deps{
+ Conf: conf,
+ Fs: hugofs.NewFrom(fs, conf.BaseConfig()),
+ }
+ for _, f := range beforeInit {
+ f(d)
+ }
+ if err := d.Init(); err != nil {
+ panic(err)
+ }
+ return d
+}
+
+func GetTestConfigSectionFromStruct(section string, v any) config.AllProvider {
+ data, err := toml.Marshal(v)
+ if err != nil {
+ panic(err)
+ }
+ p := maps.Params{
+ section: config.FromTOMLConfigString(string(data)).Get(""),
+ }
+ cfg := config.NewFrom(p)
+ return GetTestConfig(nil, cfg)
+}
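
A hypothetical usage sketch of these helpers; the "imaging" section and the option struct are only for illustration.

    package main

    import (
        "fmt"

        "github.com/gohugoio/hugo/config/testconfig"
    )

    // imageOpts is a made-up options struct for illustration only.
    type imageOpts struct {
        Quality int
    }

    func main() {
        // All defaults, on an in-memory filesystem.
        conf := testconfig.GetTestConfig(nil, nil)
        fmt.Println(conf.Environment())

        // Only the "imaging" section is populated, round-tripped through TOML.
        conf = testconfig.GetTestConfigSectionFromStruct("imaging", imageOpts{Quality: 81})
        fmt.Println(conf.GetConfigSection("imaging"))
    }
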
diff --git a/create/content.go b/create/content.go
index 0e05adf93..a4661c1ba 100644
--- a/create/content.go
+++ b/create/content.go
@@ -16,138 +16,189 @@ package create
import (
"bytes"
-
- "github.com/pkg/errors"
-
+ "errors"
+ "fmt"
"io"
"os"
- "os/exec"
"path/filepath"
"strings"
- "github.com/gohugoio/hugo/hugofs/files"
+ "github.com/gohugoio/hugo/hugofs/glob"
+
+ "github.com/gohugoio/hugo/common/hexec"
+ "github.com/gohugoio/hugo/common/hstrings"
+ "github.com/gohugoio/hugo/common/paths"
"github.com/gohugoio/hugo/hugofs"
"github.com/gohugoio/hugo/helpers"
"github.com/gohugoio/hugo/hugolib"
"github.com/spf13/afero"
- jww "github.com/spf13/jwalterweatherman"
)
-// NewContent creates a new content file in the content directory based upon the
-// given kind, which is used to lookup an archetype.
-func NewContent(
- sites *hugolib.HugoSites, kind, targetPath string) error {
- targetPath = filepath.Clean(targetPath)
- ext := helpers.Ext(targetPath)
- ps := sites.PathSpec
- archetypeFs := ps.BaseFs.SourceFilesystems.Archetypes.Fs
- sourceFs := ps.Fs.Source
+const (
+ // DefaultArchetypeTemplateTemplate is the template used in 'hugo new site'
+ // and the template we use as a fallback.
+ DefaultArchetypeTemplateTemplate = `---
+title: "{{ replace .File.ContentBaseName "-" " " | title }}"
+date: {{ .Date }}
+draft: true
+---
- jww.INFO.Printf("attempting to create %q of %q of ext %q", targetPath, kind, ext)
+`
+)
- archetypeFilename, isDir := findArchetype(ps, kind, ext)
- contentPath, s := resolveContentPath(sites, sourceFs, targetPath)
-
- if isDir {
-
- langFs, err := hugofs.NewLanguageFs(sites.LanguageSet(), archetypeFs)
- if err != nil {
- return err
- }
-
- cm, err := mapArcheTypeDir(ps, langFs, archetypeFilename)
- if err != nil {
- return err
- }
-
- if cm.siteUsed {
- if err := sites.Build(hugolib.BuildCfg{SkipRender: true}); err != nil {
- return err
- }
- }
-
- name := filepath.Base(targetPath)
- return newContentFromDir(archetypeFilename, sites, sourceFs, cm, name, contentPath)
+// NewContent creates a new content file in h (or a full bundle if the archetype is a directory)
+// in targetPath.
+func NewContent(h *hugolib.HugoSites, kind, targetPath string, force bool) error {
+ if _, err := h.BaseFs.Content.Fs.Stat(""); err != nil {
+ return errors.New("no existing content directory configured for this project")
}
- // Building the sites can be expensive, so only do it if really needed.
- siteUsed := false
-
- if archetypeFilename != "" {
+ cf := hugolib.NewContentFactory(h)
+ if kind == "" {
var err error
- siteUsed, err = usesSiteVar(archetypeFs, archetypeFilename)
+ kind, err = cf.SectionFromFilename(targetPath)
if err != nil {
return err
}
}
- if siteUsed {
- if err := sites.Build(hugolib.BuildCfg{SkipRender: true}); err != nil {
- return err
- }
+ b := &contentBuilder{
+ archeTypeFs: h.PathSpec.BaseFs.Archetypes.Fs,
+ sourceFs: h.PathSpec.Fs.Source,
+ ps: h.PathSpec,
+ h: h,
+ cf: cf,
+
+ kind: kind,
+ targetPath: targetPath,
+ force: force,
}
- content, err := executeArcheTypeAsTemplate(s, "", kind, targetPath, archetypeFilename)
+ ext := paths.Ext(targetPath)
+
+ b.setArcheTypeFilenameToUse(ext)
+
+ withBuildLock := func() (string, error) {
+ if !h.Configs.Base.NoBuildLock {
+ unlock, err := h.BaseFs.LockBuild()
+ if err != nil {
+ return "", fmt.Errorf("failed to acquire a build lock: %s", err)
+ }
+ defer unlock()
+ }
+
+ if b.isDir {
+ return "", b.buildDir()
+ }
+
+ if ext == "" {
+ return "", fmt.Errorf("failed to resolve %q to an archetype template", targetPath)
+ }
+
+ if !h.Conf.ContentTypes().IsContentFile(b.targetPath) {
+ return "", fmt.Errorf("target path %q is not a known content format", b.targetPath)
+ }
+
+ return b.buildFile()
+ }
+
+ filename, err := withBuildLock()
if err != nil {
return err
}
- if err := helpers.SafeWriteToDisk(contentPath, bytes.NewReader(content), s.Fs.Source); err != nil {
- return err
- }
-
- jww.FEEDBACK.Println(contentPath, "created")
-
- editor := s.Cfg.GetString("newContentEditor")
- if editor != "" {
- jww.FEEDBACK.Printf("Editing %s with %q ...\n", targetPath, editor)
-
- cmd := exec.Command(editor, contentPath)
- cmd.Stdin = os.Stdin
- cmd.Stdout = os.Stdout
- cmd.Stderr = os.Stderr
-
- return cmd.Run()
+ if filename != "" {
+ return b.openInEditorIfConfigured(filename)
}
return nil
}
-func targetSite(sites *hugolib.HugoSites, fi hugofs.FileMetaInfo) *hugolib.Site {
- for _, s := range sites.Sites {
- if fi.Meta().Lang() == s.Language().Lang {
- return s
- }
- }
- return sites.Sites[0]
+type contentBuilder struct {
+ archeTypeFs afero.Fs
+ sourceFs afero.Fs
+
+ ps *helpers.PathSpec
+ h *hugolib.HugoSites
+ cf hugolib.ContentFactory
+
+ // Builder state
+ archetypeFi hugofs.FileMetaInfo
+ targetPath string
+ kind string
+ isDir bool
+ dirMap archetypeMap
+ force bool
}
-func newContentFromDir(
- archetypeDir string,
- sites *hugolib.HugoSites,
- targetFs afero.Fs,
- cm archetypeMap, name, targetPath string) error {
+func (b *contentBuilder) buildDir() error {
+ // Split the dir into content files and the rest.
+ if err := b.mapArcheTypeDir(); err != nil {
+ return err
+ }
+
+ var contentTargetFilenames []string
+ var baseDir string
+
+ for _, fi := range b.dirMap.contentFiles {
+
+ targetFilename := filepath.Join(b.targetPath, strings.TrimPrefix(fi.Meta().PathInfo.Path(), b.archetypeFi.Meta().PathInfo.Path()))
+
+ // ===> post/my-post/pages/bio.md
+ abs, err := b.cf.CreateContentPlaceHolder(targetFilename, b.force)
+ if err != nil {
+ return err
+ }
+ if baseDir == "" {
+ baseDir = strings.TrimSuffix(abs, targetFilename)
+ }
+
+ contentTargetFilenames = append(contentTargetFilenames, abs)
+ }
+
+ var contentInclusionFilter *glob.FilenameFilter
+ if !b.dirMap.siteUsed {
+ // We don't need to build everything.
+ contentInclusionFilter = glob.NewFilenameFilterForInclusionFunc(func(filename string) bool {
+ filename = strings.TrimPrefix(filename, string(os.PathSeparator))
+ for _, cn := range contentTargetFilenames {
+ if strings.Contains(cn, filename) {
+ return true
+ }
+ }
+ return false
+ })
+ }
+
+ if err := b.h.Build(hugolib.BuildCfg{NoBuildLock: true, SkipRender: true, ContentInclusionFilter: contentInclusionFilter}); err != nil {
+ return err
+ }
+
+ for i, filename := range contentTargetFilenames {
+ if err := b.applyArcheType(filename, b.dirMap.contentFiles[i]); err != nil {
+ return err
+ }
+ }
+
+ // Copy the rest as is.
+ for _, fi := range b.dirMap.otherFiles {
+ meta := fi.Meta()
- for _, f := range cm.otherFiles {
- meta := f.Meta()
- filename := meta.Path()
- // Just copy the file to destination.
in, err := meta.Open()
if err != nil {
- return errors.Wrap(err, "failed to open non-content file")
+ return fmt.Errorf("failed to open non-content file: %w", err)
}
-
- targetFilename := filepath.Join(targetPath, strings.TrimPrefix(filename, archetypeDir))
-
+ targetFilename := filepath.Join(baseDir, b.targetPath, strings.TrimPrefix(fi.Meta().Filename, b.archetypeFi.Meta().Filename))
targetDir := filepath.Dir(targetFilename)
- if err := targetFs.MkdirAll(targetDir, 0777); err != nil && !os.IsExist(err) {
- return errors.Wrapf(err, "failed to create target directory for %s:", targetDir)
+
+ if err := b.sourceFs.MkdirAll(targetDir, 0o777); err != nil && !os.IsExist(err) {
+ return fmt.Errorf("failed to create target directory for %q: %w", targetDir, err)
}
- out, err := targetFs.Create(targetFilename)
+ out, err := b.sourceFs.Create(targetFilename)
if err != nil {
return err
}
@@ -161,26 +212,185 @@ func newContentFromDir(
out.Close()
}
- for _, f := range cm.contentFiles {
- filename := f.Meta().Path()
- s := targetSite(sites, f)
- targetFilename := filepath.Join(targetPath, strings.TrimPrefix(filename, archetypeDir))
-
- content, err := executeArcheTypeAsTemplate(s, name, archetypeDir, targetFilename, filename)
- if err != nil {
- return errors.Wrap(err, "failed to execute archetype template")
- }
-
- if err := helpers.SafeWriteToDisk(targetFilename, bytes.NewReader(content), targetFs); err != nil {
- return errors.Wrap(err, "failed to save results")
- }
- }
-
- jww.FEEDBACK.Println(targetPath, "created")
+ b.h.Log.Printf("Content dir %q created", filepath.Join(baseDir, b.targetPath))
return nil
}
+func (b *contentBuilder) buildFile() (string, error) {
+ contentPlaceholderAbsFilename, err := b.cf.CreateContentPlaceHolder(b.targetPath, b.force)
+ if err != nil {
+ return "", err
+ }
+
+ usesSite, err := b.usesSiteVar(b.archetypeFi)
+ if err != nil {
+ return "", err
+ }
+
+ var contentInclusionFilter *glob.FilenameFilter
+ if !usesSite {
+ // We don't need to build everything.
+ contentInclusionFilter = glob.NewFilenameFilterForInclusionFunc(func(filename string) bool {
+ filename = strings.TrimPrefix(filename, string(os.PathSeparator))
+ return strings.Contains(contentPlaceholderAbsFilename, filename)
+ })
+ }
+
+ if err := b.h.Build(hugolib.BuildCfg{NoBuildLock: true, SkipRender: true, ContentInclusionFilter: contentInclusionFilter}); err != nil {
+ return "", err
+ }
+
+ if err := b.applyArcheType(contentPlaceholderAbsFilename, b.archetypeFi); err != nil {
+ return "", err
+ }
+
+ b.h.Log.Printf("Content %q created", contentPlaceholderAbsFilename)
+
+ return contentPlaceholderAbsFilename, nil
+}
+
+func (b *contentBuilder) setArcheTypeFilenameToUse(ext string) {
+ var pathsToCheck []string
+
+ if b.kind != "" {
+ pathsToCheck = append(pathsToCheck, b.kind+ext)
+ }
+
+ pathsToCheck = append(pathsToCheck, "default"+ext)
+
+ for _, p := range pathsToCheck {
+ fi, err := b.archeTypeFs.Stat(p)
+ if err == nil {
+ b.archetypeFi = fi.(hugofs.FileMetaInfo)
+ b.isDir = fi.IsDir()
+ return
+ }
+ }
+}
+
+func (b *contentBuilder) applyArcheType(contentFilename string, archetypeFi hugofs.FileMetaInfo) error {
+ p := b.h.GetContentPage(contentFilename)
+ if p == nil {
+ panic(fmt.Sprintf("[BUG] no Page found for %q", contentFilename))
+ }
+
+ f, err := b.sourceFs.Create(contentFilename)
+ if err != nil {
+ return err
+ }
+ defer f.Close()
+
+ if archetypeFi == nil {
+ return b.cf.ApplyArchetypeTemplate(f, p, b.kind, DefaultArchetypeTemplateTemplate)
+ }
+
+ return b.cf.ApplyArchetypeFi(f, p, b.kind, archetypeFi)
+}
+
+func (b *contentBuilder) mapArcheTypeDir() error {
+ var m archetypeMap
+
+ seen := map[hstrings.Strings2]bool{}
+
+ walkFn := func(path string, fim hugofs.FileMetaInfo) error {
+ if fim.IsDir() {
+ return nil
+ }
+
+ pi := fim.Meta().PathInfo
+
+ if pi.IsContent() {
+ pathLang := hstrings.Strings2{pi.PathBeforeLangAndOutputFormatAndExt(), fim.Meta().Lang}
+ if seen[pathLang] {
+ // Duplicate content file, e.g. page.md and page.html.
+ // In the regular build, we will filter out the duplicates, but
+ // for archetype folders these are ambiguous and we need to
+ // fail.
+ return fmt.Errorf("duplicate content file found in archetype folder: %q; having both e.g. %s.md and %s.html is ambiguous", path, pi.BaseNameNoIdentifier(), pi.BaseNameNoIdentifier())
+ }
+ seen[pathLang] = true
+ m.contentFiles = append(m.contentFiles, fim)
+ if !m.siteUsed {
+ var err error
+ m.siteUsed, err = b.usesSiteVar(fim)
+ if err != nil {
+ return err
+ }
+ }
+ return nil
+ }
+
+ m.otherFiles = append(m.otherFiles, fim)
+
+ return nil
+ }
+
+ walkCfg := hugofs.WalkwayConfig{
+ WalkFn: walkFn,
+ Fs: b.archeTypeFs,
+ Root: filepath.FromSlash(b.archetypeFi.Meta().PathInfo.Path()),
+ }
+
+ w := hugofs.NewWalkway(walkCfg)
+
+ if err := w.Walk(); err != nil {
+ return fmt.Errorf("failed to walk archetype dir %q: %w", b.archetypeFi.Meta().Filename, err)
+ }
+
+ b.dirMap = m
+
+ return nil
+}
+
+func (b *contentBuilder) openInEditorIfConfigured(filename string) error {
+ editor := b.h.Conf.NewContentEditor()
+ if editor == "" {
+ return nil
+ }
+
+ editorExec := strings.Fields(editor)[0]
+ editorFlags := strings.Fields(editor)[1:]
+
+ var args []any
+ for _, editorFlag := range editorFlags {
+ args = append(args, editorFlag)
+ }
+ args = append(
+ args,
+ filename,
+ hexec.WithStdin(os.Stdin),
+ hexec.WithStderr(os.Stderr),
+ hexec.WithStdout(os.Stdout),
+ )
+
+ b.h.Log.Printf("Editing %q with %q ...\n", filename, editorExec)
+
+ cmd, err := b.h.Deps.ExecHelper.New(editorExec, args...)
+ if err != nil {
+ return err
+ }
+
+ return cmd.Run()
+}
+
+func (b *contentBuilder) usesSiteVar(fi hugofs.FileMetaInfo) (bool, error) {
+ if fi == nil {
+ return false, nil
+ }
+ f, err := fi.Meta().Open()
+ if err != nil {
+ return false, err
+ }
+ defer f.Close()
+ bb, err := io.ReadAll(f)
+ if err != nil {
+ return false, fmt.Errorf("failed to read archetype file: %w", err)
+ }
+
+ return bytes.Contains(bb, []byte(".Site")) || bytes.Contains(bb, []byte("site.")), nil
+}
+
type archetypeMap struct {
 // These need to be parsed and executed as Go templates.
contentFiles []hugofs.FileMetaInfo
@@ -190,160 +400,3 @@ type archetypeMap struct {
// expensive, so only do when needed.
siteUsed bool
}
-
-func mapArcheTypeDir(
- ps *helpers.PathSpec,
- fs afero.Fs,
- archetypeDir string) (archetypeMap, error) {
-
- var m archetypeMap
-
- walkFn := func(path string, fi hugofs.FileMetaInfo, err error) error {
-
- if err != nil {
- return err
- }
-
- if fi.IsDir() {
- return nil
- }
-
- fil := fi.(hugofs.FileMetaInfo)
-
- if files.IsContentFile(path) {
- m.contentFiles = append(m.contentFiles, fil)
- if !m.siteUsed {
- m.siteUsed, err = usesSiteVar(fs, path)
- if err != nil {
- return err
- }
- }
- return nil
- }
-
- m.otherFiles = append(m.otherFiles, fil)
-
- return nil
- }
-
- walkCfg := hugofs.WalkwayConfig{
- WalkFn: walkFn,
- Fs: fs,
- Root: archetypeDir,
- }
-
- w := hugofs.NewWalkway(walkCfg)
-
- if err := w.Walk(); err != nil {
- return m, errors.Wrapf(err, "failed to walk archetype dir %q", archetypeDir)
- }
-
- return m, nil
-}
-
-func usesSiteVar(fs afero.Fs, filename string) (bool, error) {
- f, err := fs.Open(filename)
- if err != nil {
- return false, errors.Wrap(err, "failed to open archetype file")
- }
- defer f.Close()
- return helpers.ReaderContains(f, []byte(".Site")), nil
-}
-
-// Resolve the target content path.
-func resolveContentPath(sites *hugolib.HugoSites, fs afero.Fs, targetPath string) (string, *hugolib.Site) {
- targetDir := filepath.Dir(targetPath)
- first := sites.Sites[0]
-
- var (
- s *hugolib.Site
- siteContentDir string
- )
-
- // Try the filename: my-post.en.md
- for _, ss := range sites.Sites {
- if strings.Contains(targetPath, "."+ss.Language().Lang+".") {
- s = ss
- break
- }
- }
-
- var dirLang string
-
- for _, dir := range sites.BaseFs.Content.Dirs {
- meta := dir.Meta()
- contentDir := meta.Filename()
-
- if !strings.HasSuffix(contentDir, helpers.FilePathSeparator) {
- contentDir += helpers.FilePathSeparator
- }
-
- if strings.HasPrefix(targetPath, contentDir) {
- siteContentDir = contentDir
- dirLang = meta.Lang()
- break
- }
- }
-
- if s == nil && dirLang != "" {
- for _, ss := range sites.Sites {
- if ss.Lang() == dirLang {
- s = ss
- break
- }
- }
- }
-
- if s == nil {
- s = first
- }
-
- if targetDir != "" && targetDir != "." {
- exists, _ := helpers.Exists(targetDir, fs)
-
- if exists {
- return targetPath, s
- }
- }
-
- if siteContentDir == "" {
-
- }
-
- if siteContentDir != "" {
- pp := filepath.Join(siteContentDir, strings.TrimPrefix(targetPath, siteContentDir))
- return s.PathSpec.AbsPathify(pp), s
- } else {
- var contentDir string
- for _, dir := range sites.BaseFs.Content.Dirs {
- contentDir = dir.Meta().Filename()
- if dir.Meta().Lang() == s.Lang() {
- break
- }
- }
- return s.PathSpec.AbsPathify(filepath.Join(contentDir, targetPath)), s
- }
-
-}
-
-// FindArchetype takes a given kind/archetype of content and returns the path
-// to the archetype in the archetype filesystem, blank if none found.
-func findArchetype(ps *helpers.PathSpec, kind, ext string) (outpath string, isDir bool) {
- fs := ps.BaseFs.Archetypes.Fs
-
- var pathsToCheck []string
-
- if kind != "" {
- pathsToCheck = append(pathsToCheck, kind+ext)
- }
- pathsToCheck = append(pathsToCheck, "default"+ext, "default")
-
- for _, p := range pathsToCheck {
- fi, err := fs.Stat(p)
- if err == nil {
- return p, fi.IsDir()
- }
- }
-
- return "", false
-}
diff --git a/create/content_template_handler.go b/create/content_template_handler.go
deleted file mode 100644
index 1576fabdb..000000000
--- a/create/content_template_handler.go
+++ /dev/null
@@ -1,149 +0,0 @@
-// Copyright 2017 The Hugo Authors. All rights reserved.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-package create
-
-import (
- "bytes"
- "fmt"
- "path/filepath"
- "strings"
- "time"
-
- "github.com/pkg/errors"
-
- "github.com/gohugoio/hugo/helpers"
- "github.com/gohugoio/hugo/source"
-
- "github.com/gohugoio/hugo/hugolib"
- "github.com/gohugoio/hugo/tpl"
- "github.com/spf13/afero"
-)
-
-// ArchetypeFileData represents the data available to an archetype template.
-type ArchetypeFileData struct {
- // The archetype content type, either given as --kind option or extracted
- // from the target path's section, i.e. "blog/mypost.md" will resolve to
- // "blog".
- Type string
-
- // The current date and time as a RFC3339 formatted string, suitable for use in front matter.
- Date string
-
- // The Site, fully equipped with all the pages etc. Note: This will only be set if it is actually
- // used in the archetype template. Also, if this is a multilingual setup,
- // this site is the site that best matches the target content file, based
- // on the presence of language code in the filename.
- Site *hugolib.SiteInfo
-
- // Name will in most cases be the same as TranslationBaseName, e.g. "my-post".
- // But if that value is "index" (bundles), the Name is instead the owning folder.
- // This is the value you in most cases would want to use to construct the title in your
- // archetype template.
- Name string
-
- // The target content file. Note that the .Content will be empty, as that
- // has not been created yet.
- source.File
-}
-
-const (
- // ArchetypeTemplateTemplate is used as initial template when adding an archetype template.
- ArchetypeTemplateTemplate = `---
-title: "{{ replace .Name "-" " " | title }}"
-date: {{ .Date }}
-draft: true
----
-
-`
-)
-
-var (
- archetypeShortcodeReplacementsPre = strings.NewReplacer(
- "{{<", "{x{<",
- "{{%", "{x{%",
- ">}}", ">}x}",
- "%}}", "%}x}")
-
- archetypeShortcodeReplacementsPost = strings.NewReplacer(
- "{x{<", "{{<",
- "{x{%", "{{%",
- ">}x}", ">}}",
- "%}x}", "%}}")
-)
-
-func executeArcheTypeAsTemplate(s *hugolib.Site, name, kind, targetPath, archetypeFilename string) ([]byte, error) {
-
- var (
- archetypeContent []byte
- archetypeTemplate []byte
- err error
- )
-
- f, err := s.SourceSpec.NewFileInfoFrom(targetPath, targetPath)
- if err != nil {
- return nil, err
- }
-
- if name == "" {
- name = f.TranslationBaseName()
-
- if name == "index" || name == "_index" {
- // Page bundles; the directory name will hopefully have a better name.
- dir := strings.TrimSuffix(f.Dir(), helpers.FilePathSeparator)
- _, name = filepath.Split(dir)
- }
- }
-
- data := ArchetypeFileData{
- Type: kind,
- Date: time.Now().Format(time.RFC3339),
- Name: name,
- File: f,
- Site: &s.Info,
- }
-
- if archetypeFilename == "" {
- // TODO(bep) archetype revive the issue about wrong tpl funcs arg order
- archetypeTemplate = []byte(ArchetypeTemplateTemplate)
- } else {
- archetypeTemplate, err = afero.ReadFile(s.BaseFs.Archetypes.Fs, archetypeFilename)
- if err != nil {
- return nil, fmt.Errorf("failed to read archetype file %s", err)
- }
-
- }
-
- // The archetype template may contain shortcodes, and these does not play well
- // with the Go templates. Need to set some temporary delimiters.
- archetypeTemplate = []byte(archetypeShortcodeReplacementsPre.Replace(string(archetypeTemplate)))
-
- // Reuse the Hugo template setup to get the template funcs properly set up.
- templateHandler := s.Deps.Tmpl.(tpl.TemplateHandler)
- templateName := "_text/" + helpers.Filename(archetypeFilename)
- if err := templateHandler.AddTemplate(templateName, string(archetypeTemplate)); err != nil {
- return nil, errors.Wrapf(err, "Failed to parse archetype file %q:", archetypeFilename)
- }
-
- templ, _ := templateHandler.Lookup(templateName)
-
- var buff bytes.Buffer
- if err := templ.Execute(&buff, data); err != nil {
- return nil, errors.Wrapf(err, "Failed to process archetype file %q:", archetypeFilename)
- }
-
- archetypeContent = []byte(archetypeShortcodeReplacementsPost.Replace(buff.String()))
-
- return archetypeContent, nil
-
-}
diff --git a/create/content_test.go b/create/content_test.go
index f43d3a5f4..429edfc26 100644
--- a/create/content_test.go
+++ b/create/content_test.go
@@ -14,76 +14,99 @@
package create_test
import (
+ "fmt"
"os"
"path/filepath"
"strings"
"testing"
+ "github.com/gohugoio/hugo/config"
+ "github.com/gohugoio/hugo/config/allconfig"
+ "github.com/gohugoio/hugo/config/testconfig"
+
"github.com/gohugoio/hugo/deps"
"github.com/gohugoio/hugo/hugolib"
- "fmt"
-
"github.com/gohugoio/hugo/hugofs"
qt "github.com/frankban/quicktest"
"github.com/gohugoio/hugo/create"
"github.com/gohugoio/hugo/helpers"
"github.com/spf13/afero"
- "github.com/spf13/viper"
)
-func TestNewContent(t *testing.T) {
-
+// TODO(bep) clean this up. Export the test site builder in Hugolib or something.
+func TestNewContentFromFile(t *testing.T) {
cases := []struct {
+ name string
kind string
path string
- expected []string
+ expected any
}{
- {"post", "post/sample-1.md", []string{`title = "Post Arch title"`, `test = "test1"`, "date = \"2015-01-12T19:20:04-07:00\""}},
- {"post", "post/org-1.org", []string{`#+title: ORG-1`}},
- {"emptydate", "post/sample-ed.md", []string{`title = "Empty Date Arch title"`, `test = "test1"`}},
- {"stump", "stump/sample-2.md", []string{`title: "Sample 2"`}}, // no archetype file
- {"", "sample-3.md", []string{`title: "Sample 3"`}}, // no archetype
- {"product", "product/sample-4.md", []string{`title = "SAMPLE-4"`}}, // empty archetype front matter
- {"lang", "post/lang-1.md", []string{`Site Lang: en|Name: Lang 1|i18n: Hugo Rocks!`}},
- {"lang", "post/lang-2.en.md", []string{`Site Lang: en|Name: Lang 2|i18n: Hugo Rocks!`}},
- {"lang", "content/post/lang-3.nn.md", []string{`Site Lang: nn|Name: Lang 3|i18n: Hugo Rokkar!`}},
- {"lang", "content_nn/post/lang-4.md", []string{`Site Lang: nn|Name: Lang 4|i18n: Hugo Rokkar!`}},
- {"lang", "content_nn/post/lang-5.en.md", []string{`Site Lang: en|Name: Lang 5|i18n: Hugo Rocks!`}},
- {"lang", "post/my-bundle/index.md", []string{`Site Lang: en|Name: My Bundle|i18n: Hugo Rocks!`}},
- {"lang", "post/my-bundle/index.en.md", []string{`Site Lang: en|Name: My Bundle|i18n: Hugo Rocks!`}},
- {"lang", "content/post/my-bundle/index.nn.md", []string{`Site Lang: nn|Name: My Bundle|i18n: Hugo Rokkar!`}},
- {"shortcodes", "shortcodes/go.md", []string{
+ {"Post", "post", "post/sample-1.md", []string{`title = "Post Arch title"`, `test = "test1"`, "date = \"2015-01-12T19:20:04-07:00\""}},
+ {"Post org-mode", "post", "post/org-1.org", []string{`#+title: ORG-1`}},
+ {"Post, unknown content filetype", "post", "post/sample-1.pdoc", false},
+ {"Empty date", "emptydate", "post/sample-ed.md", []string{`title = "Empty Date Arch title"`, `test = "test1"`}},
+ {"Archetype file not found", "stump", "stump/sample-2.md", []string{`title: "Sample 2"`}}, // no archetype file
+ {"No archetype", "", "sample-3.md", []string{`title: "Sample 3"`}}, // no archetype
+ {"Empty archetype", "product", "product/sample-4.md", []string{`title = "SAMPLE-4"`}}, // empty archetype front matter
+ {"Filenames", "filenames", "content/mypage/index.md", []string{"title = \"INDEX\"\n+++\n\n\nContentBaseName: mypage"}},
+ {"Branch Name", "name", "content/tags/tag-a/_index.md", []string{"+++\ntitle = 'Tag A'\n+++"}},
+
+ {"Lang 1", "lang", "post/lang-1.md", []string{`Site Lang: en|Name: Lang 1|i18n: Hugo Rocks!`}},
+ {"Lang 2", "lang", "post/lang-2.en.md", []string{`Site Lang: en|Name: Lang 2|i18n: Hugo Rocks!`}},
+ {"Lang nn file", "lang", "content/post/lang-3.nn.md", []string{`Site Lang: nn|Name: Lang 3|i18n: Hugo Rokkar!`}},
+ {"Lang nn dir", "lang", "content_nn/post/lang-4.md", []string{`Site Lang: nn|Name: Lang 4|i18n: Hugo Rokkar!`}},
+ {"Lang en in nn dir", "lang", "content_nn/post/lang-5.en.md", []string{`Site Lang: en|Name: Lang 5|i18n: Hugo Rocks!`}},
+ {"Lang en default", "lang", "post/my-bundle/index.md", []string{`Site Lang: en|Name: My Bundle|i18n: Hugo Rocks!`}},
+ {"Lang en file", "lang", "post/my-bundle/index.en.md", []string{`Site Lang: en|Name: My Bundle|i18n: Hugo Rocks!`}},
+ {"Lang nn bundle", "lang", "content/post/my-bundle/index.nn.md", []string{`Site Lang: nn|Name: My Bundle|i18n: Hugo Rokkar!`}},
+ {"Site", "site", "content/mypage/index.md", []string{"RegularPages .Site: 10", "RegularPages site: 10"}},
+ {"Shortcodes", "shortcodes", "shortcodes/go.md", []string{
`title = "GO"`,
"{{< myshortcode >}}",
"{{% myshortcode %}}",
- "{{</* comment */>}}\n{{%/* comment */%}}"}}, // shortcodes
+ "{{</* comment */>}}\n{{%/* comment */%}}",
+ }}, // shortcodes
}
+ c := qt.New(t)
+
for i, cas := range cases {
cas := cas
- t.Run(fmt.Sprintf("%s-%d", cas.kind, i), func(t *testing.T) {
- t.Parallel()
- c := qt.New(t)
+
+ c.Run(cas.name, func(c *qt.C) {
+ c.Parallel()
+
mm := afero.NewMemMapFs()
c.Assert(initFs(mm), qt.IsNil)
cfg, fs := newTestCfg(c, mm)
- h, err := hugolib.NewHugoSites(deps.DepsCfg{Cfg: cfg, Fs: fs})
+ conf := testconfig.GetTestConfigs(fs.Source, cfg)
+ h, err := hugolib.NewHugoSites(deps.DepsCfg{Configs: conf, Fs: fs})
c.Assert(err, qt.IsNil)
+ err = create.NewContent(h, cas.kind, cas.path, false)
- c.Assert(create.NewContent(h, cas.kind, cas.path), qt.IsNil)
+ if b, ok := cas.expected.(bool); ok && !b {
+ if !b {
+ c.Assert(err, qt.Not(qt.IsNil))
+ }
+ return
+ }
+
+ c.Assert(err, qt.IsNil)
fname := filepath.FromSlash(cas.path)
if !strings.HasPrefix(fname, "content") {
fname = filepath.Join("content", fname)
}
- content := readFileFromFs(t, fs.Source, fname)
- for _, v := range cas.expected {
+
+ content := readFileFromFs(c, fs.Source, fname)
+
+ for _, v := range cas.expected.([]string) {
found := strings.Contains(content, v)
if !found {
- t.Fatalf("[%d] %q missing from output:\n%q", i, v, content)
+ c.Fatalf("[%d] %q missing from output:\n%q", i, v, content)
}
}
})
@@ -91,60 +114,50 @@ func TestNewContent(t *testing.T) {
}
}
-func TestNewContentFromDir(t *testing.T) {
+func TestNewContentFromDirSiteFunction(t *testing.T) {
mm := afero.NewMemMapFs()
c := qt.New(t)
archetypeDir := filepath.Join("archetypes", "my-bundle")
- c.Assert(mm.MkdirAll(archetypeDir, 0755), qt.IsNil)
-
- archetypeThemeDir := filepath.Join("themes", "mytheme", "archetypes", "my-theme-bundle")
- c.Assert(mm.MkdirAll(archetypeThemeDir, 0755), qt.IsNil)
+ defaultArchetypeDir := filepath.Join("archetypes", "default")
+ c.Assert(mm.MkdirAll(archetypeDir, 0o755), qt.IsNil)
+ c.Assert(mm.MkdirAll(defaultArchetypeDir, 0o755), qt.IsNil)
contentFile := `
File: %s
-Site Lang: {{ .Site.Language.Lang }}
-Name: {{ replace .Name "-" " " | title }}
-i18n: {{ T "hugo" }}
+site RegularPages: {{ len site.RegularPages }}
+
`
- c.Assert(afero.WriteFile(mm, filepath.Join(archetypeDir, "index.md"), []byte(fmt.Sprintf(contentFile, "index.md")), 0755), qt.IsNil)
- c.Assert(afero.WriteFile(mm, filepath.Join(archetypeDir, "index.nn.md"), []byte(fmt.Sprintf(contentFile, "index.nn.md")), 0755), qt.IsNil)
-
- c.Assert(afero.WriteFile(mm, filepath.Join(archetypeDir, "pages", "bio.md"), []byte(fmt.Sprintf(contentFile, "bio.md")), 0755), qt.IsNil)
- c.Assert(afero.WriteFile(mm, filepath.Join(archetypeDir, "resources", "hugo1.json"), []byte(`hugo1: {{ printf "no template handling in here" }}`), 0755), qt.IsNil)
- c.Assert(afero.WriteFile(mm, filepath.Join(archetypeDir, "resources", "hugo2.xml"), []byte(`hugo2: {{ printf "no template handling in here" }}`), 0755), qt.IsNil)
-
- c.Assert(afero.WriteFile(mm, filepath.Join(archetypeThemeDir, "index.md"), []byte(fmt.Sprintf(contentFile, "index.md")), 0755), qt.IsNil)
- c.Assert(afero.WriteFile(mm, filepath.Join(archetypeThemeDir, "resources", "hugo1.json"), []byte(`hugo1: {{ printf "no template handling in here" }}`), 0755), qt.IsNil)
+ c.Assert(afero.WriteFile(mm, filepath.Join(archetypeDir, "index.md"), fmt.Appendf(nil, contentFile, "index.md"), 0o755), qt.IsNil)
+ c.Assert(afero.WriteFile(mm, filepath.Join(defaultArchetypeDir, "index.md"), []byte("default archetype index.md"), 0o755), qt.IsNil)
c.Assert(initFs(mm), qt.IsNil)
cfg, fs := newTestCfg(c, mm)
- h, err := hugolib.NewHugoSites(deps.DepsCfg{Cfg: cfg, Fs: fs})
+ conf := testconfig.GetTestConfigs(fs.Source, cfg)
+ h, err := hugolib.NewHugoSites(deps.DepsCfg{Configs: conf, Fs: fs})
c.Assert(err, qt.IsNil)
c.Assert(len(h.Sites), qt.Equals, 2)
- c.Assert(create.NewContent(h, "my-bundle", "post/my-post"), qt.IsNil)
+ c.Assert(create.NewContent(h, "my-bundle", "post/my-post", false), qt.IsNil)
+ cContains(c, readFileFromFs(t, fs.Source, filepath.Join("content", "post/my-post/index.md")), `site RegularPages: 10`)
- cContains(c, readFileFromFs(t, fs.Source, filepath.Join("content", "post/my-post/resources/hugo1.json")), `hugo1: {{ printf "no template handling in here" }}`)
- cContains(c, readFileFromFs(t, fs.Source, filepath.Join("content", "post/my-post/resources/hugo2.xml")), `hugo2: {{ printf "no template handling in here" }}`)
+ // Default bundle archetype
+ c.Assert(create.NewContent(h, "", "post/my-post2", false), qt.IsNil)
+ cContains(c, readFileFromFs(t, fs.Source, filepath.Join("content", "post/my-post2/index.md")), `default archetype index.md`)
- // Content files should get the correct site context.
- // TODO(bep) archetype check i18n
- cContains(c, readFileFromFs(t, fs.Source, filepath.Join("content", "post/my-post/index.md")), `File: index.md`, `Site Lang: en`, `Name: My Post`, `i18n: Hugo Rocks!`)
- cContains(c, readFileFromFs(t, fs.Source, filepath.Join("content", "post/my-post/index.nn.md")), `File: index.nn.md`, `Site Lang: nn`, `Name: My Post`, `i18n: Hugo Rokkar!`)
-
- cContains(c, readFileFromFs(t, fs.Source, filepath.Join("content", "post/my-post/pages/bio.md")), `File: bio.md`, `Site Lang: en`, `Name: My Post`)
-
- c.Assert(create.NewContent(h, "my-theme-bundle", "post/my-theme-post"), qt.IsNil)
- cContains(c, readFileFromFs(t, fs.Source, filepath.Join("content", "post/my-theme-post/index.md")), `File: index.md`, `Site Lang: en`, `Name: My Theme Post`, `i18n: Hugo Rocks!`)
- cContains(c, readFileFromFs(t, fs.Source, filepath.Join("content", "post/my-theme-post/resources/hugo1.json")), `hugo1: {{ printf "no template handling in here" }}`)
+ // Regular file with bundle kind.
+ c.Assert(create.NewContent(h, "my-bundle", "post/foo.md", false), qt.IsNil)
+ cContains(c, readFileFromFs(t, fs.Source, filepath.Join("content", "post/foo.md")), `draft: true`)
+ // Regular files should fall back to the default archetype (we have no regular file archetype).
+ c.Assert(create.NewContent(h, "my-bundle", "mypage.md", false), qt.IsNil)
+ cContains(c, readFileFromFs(t, fs.Source, filepath.Join("content", "mypage.md")), `draft: true`)
}
func initFs(fs afero.Fs) error {
- perm := os.FileMode(0755)
+ perm := os.FileMode(0o755)
var err error
// create directories
@@ -160,7 +173,16 @@ func initFs(fs afero.Fs) error {
}
}
- // create files
+ // create some dummy content
+ for i := 1; i <= 10; i++ {
+ filename := filepath.Join("content", fmt.Sprintf("page%d.md", i))
+ afero.WriteFile(fs, filename, []byte(`---
+title: Test
+---
+`), 0o666)
+ }
+
+ // create archetype files
for _, v := range []struct {
path string
content string
@@ -173,11 +195,40 @@ func initFs(fs afero.Fs) error {
path: filepath.Join("archetypes", "post.org"),
content: "#+title: {{ .BaseFileName | upper }}",
},
+ {
+ path: filepath.Join("archetypes", "name.md"),
+ content: `+++
+title = '{{ replace .Name "-" " " | title }}'
++++`,
+ },
{
path: filepath.Join("archetypes", "product.md"),
content: `+++
title = "{{ .BaseFileName | upper }}"
+++`,
+ },
+ {
+ path: filepath.Join("archetypes", "filenames.md"),
+ content: `...
+title = "{{ .BaseFileName | upper }}"
++++
+
+
+ContentBaseName: {{ .File.ContentBaseName }}
+
+`,
+ },
+ {
+ path: filepath.Join("archetypes", "site.md"),
+ content: `...
+title = "{{ .BaseFileName | upper }}"
++++
+
+Len RegularPages .Site: {{ len .Site.RegularPages }}
+Len RegularPages site: {{ len site.RegularPages }}
+
+
+`,
},
{
path: filepath.Join("archetypes", "emptydate.md"),
@@ -185,7 +236,7 @@ title = "{{ .BaseFileName | upper }}"
},
{
path: filepath.Join("archetypes", "lang.md"),
- content: `Site Lang: {{ .Site.Language.Lang }}|Name: {{ replace .Name "-" " " | title }}|i18n: {{ T "hugo" }}`,
+ content: `Site Lang: {{ site.Language.Lang }}|Name: {{ replace .Name "-" " " | title }}|i18n: {{ T "hugo" }}`,
},
// #3623x
{
@@ -221,14 +272,14 @@ Some text.
return nil
}
-func cContains(c *qt.C, v interface{}, matches ...string) {
+func cContains(c *qt.C, v any, matches ...string) {
for _, m := range matches {
c.Assert(v, qt.Contains, m)
}
}
// TODO(bep) extract common testing package with this and some others
-func readFileFromFs(t *testing.T, fs afero.Fs, filename string) string {
+func readFileFromFs(t testing.TB, fs afero.Fs, filename string) string {
t.Helper()
filename = filepath.FromSlash(filename)
b, err := afero.ReadFile(fs, filename)
@@ -247,8 +298,7 @@ func readFileFromFs(t *testing.T, fs afero.Fs, filename string) string {
return string(b)
}
-func newTestCfg(c *qt.C, mm afero.Fs) (*viper.Viper, *hugofs.Fs) {
-
+func newTestCfg(c *qt.C, mm afero.Fs) (config.Provider, *hugofs.Fs) {
cfg := `
theme = "mytheme"
@@ -259,27 +309,37 @@ languageName = "English"
[languages.nn]
weight = 2
languageName = "Nynorsk"
-contentDir = "content_nn"
+[module]
+[[module.mounts]]
+ source = 'archetypes'
+ target = 'archetypes'
+[[module.mounts]]
+ source = 'content'
+ target = 'content'
+ lang = 'en'
+[[module.mounts]]
+ source = 'content_nn'
+ target = 'content'
+ lang = 'nn'
`
if mm == nil {
mm = afero.NewMemMapFs()
}
- mm.MkdirAll(filepath.FromSlash("content_nn"), 0777)
+ mm.MkdirAll(filepath.FromSlash("content_nn"), 0o777)
- mm.MkdirAll(filepath.FromSlash("themes/mytheme"), 0777)
+ mm.MkdirAll(filepath.FromSlash("themes/mytheme"), 0o777)
c.Assert(afero.WriteFile(mm, filepath.Join("i18n", "en.toml"), []byte(`[hugo]
-other = "Hugo Rocks!"`), 0755), qt.IsNil)
+other = "Hugo Rocks!"`), 0o755), qt.IsNil)
c.Assert(afero.WriteFile(mm, filepath.Join("i18n", "nn.toml"), []byte(`[hugo]
-other = "Hugo Rokkar!"`), 0755), qt.IsNil)
+other = "Hugo Rokkar!"`), 0o755), qt.IsNil)
- c.Assert(afero.WriteFile(mm, "config.toml", []byte(cfg), 0755), qt.IsNil)
+ c.Assert(afero.WriteFile(mm, "config.toml", []byte(cfg), 0o755), qt.IsNil)
- v, _, err := hugolib.LoadConfig(hugolib.ConfigSourceDescriptor{Fs: mm, Filename: "config.toml"})
+ res, err := allconfig.LoadConfig(allconfig.ConfigSourceDescriptor{Fs: mm, Filename: "config.toml"})
c.Assert(err, qt.IsNil)
- return v, hugofs.NewFrom(mm, v)
-
+ return res.LoadingInfo.Cfg, hugofs.NewFrom(mm, res.LoadingInfo.BaseConfig)
}
diff --git a/docs/themes/gohugoioTheme/assets/js/filesaver.js b/create/skeletons/site/assets/.gitkeep
similarity index 100%
rename from docs/themes/gohugoioTheme/assets/js/filesaver.js
rename to create/skeletons/site/assets/.gitkeep
diff --git a/docs/themes/gohugoioTheme/layouts/partials/svg/exclamation.svg b/create/skeletons/site/content/.gitkeep
similarity index 100%
rename from docs/themes/gohugoioTheme/layouts/partials/svg/exclamation.svg
rename to create/skeletons/site/content/.gitkeep
diff --git a/create/skeletons/site/data/.gitkeep b/create/skeletons/site/data/.gitkeep
new file mode 100644
index 000000000..e69de29bb
diff --git a/create/skeletons/site/i18n/.gitkeep b/create/skeletons/site/i18n/.gitkeep
new file mode 100644
index 000000000..e69de29bb
diff --git a/create/skeletons/site/layouts/.gitkeep b/create/skeletons/site/layouts/.gitkeep
new file mode 100644
index 000000000..e69de29bb
diff --git a/create/skeletons/site/static/.gitkeep b/create/skeletons/site/static/.gitkeep
new file mode 100644
index 000000000..e69de29bb
diff --git a/create/skeletons/site/themes/.gitkeep b/create/skeletons/site/themes/.gitkeep
new file mode 100644
index 000000000..e69de29bb
diff --git a/create/skeletons/skeletons.go b/create/skeletons/skeletons.go
new file mode 100644
index 000000000..a6241ef92
--- /dev/null
+++ b/create/skeletons/skeletons.go
@@ -0,0 +1,182 @@
+// Copyright 2024 The Hugo Authors. All rights reserved.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package skeletons
+
+import (
+ "bytes"
+ "embed"
+ "errors"
+ "io/fs"
+ "path/filepath"
+ "strings"
+
+ "github.com/gohugoio/hugo/helpers"
+ "github.com/gohugoio/hugo/parser"
+ "github.com/gohugoio/hugo/parser/metadecoders"
+ "github.com/spf13/afero"
+)
+
+//go:embed all:site/*
+var siteFs embed.FS
+
+//go:embed all:theme/*
+var themeFs embed.FS
+
+// CreateTheme creates a theme skeleton.
+func CreateTheme(createpath string, sourceFs afero.Fs, format string) error {
+ if exists, _ := helpers.Exists(createpath, sourceFs); exists {
+ return errors.New(createpath + " already exists")
+ }
+
+ format = strings.ToLower(format)
+
+ siteConfig := map[string]any{
+ "baseURL": "https://example.org/",
+ "languageCode": "en-US",
+ "title": "My New Hugo Site",
+ "menus": map[string]any{
+ "main": []any{
+ map[string]any{
+ "name": "Home",
+ "pageRef": "/",
+ "weight": 10,
+ },
+ map[string]any{
+ "name": "Posts",
+ "pageRef": "/posts",
+ "weight": 20,
+ },
+ map[string]any{
+ "name": "Tags",
+ "pageRef": "/tags",
+ "weight": 30,
+ },
+ },
+ },
+ "module": map[string]any{
+ "hugoVersion": map[string]any{
+ "extended": false,
+ "min": "0.146.0",
+ },
+ },
+ }
+
+ err := createSiteConfig(sourceFs, createpath, siteConfig, format)
+ if err != nil {
+ return err
+ }
+
+ defaultArchetype := map[string]any{
+ "title": "{{ replace .File.ContentBaseName \"-\" \" \" | title }}",
+ "date": "{{ .Date }}",
+ "draft": true,
+ }
+
+ err = createDefaultArchetype(sourceFs, createpath, defaultArchetype, format)
+ if err != nil {
+ return err
+ }
+
+ return copyFiles(createpath, sourceFs, themeFs)
+}
+
+// CreateSite creates a site skeleton.
+func CreateSite(createpath string, sourceFs afero.Fs, force bool, format string) error {
+ format = strings.ToLower(format)
+ if exists, _ := helpers.Exists(createpath, sourceFs); exists {
+ if isDir, _ := helpers.IsDir(createpath, sourceFs); !isDir {
+ return errors.New(createpath + " already exists but is not a directory")
+ }
+
+ isEmpty, _ := helpers.IsEmpty(createpath, sourceFs)
+
+ switch {
+ case !isEmpty && !force:
+ return errors.New(createpath + " already exists and is not empty. See --force.")
+ case !isEmpty && force:
+ var all []string
+ fs.WalkDir(siteFs, ".", func(path string, d fs.DirEntry, err error) error {
+ if d.IsDir() && path != "." {
+ all = append(all, path)
+ }
+ return nil
+ })
+ all = append(all, filepath.Join(createpath, "hugo."+format))
+ for _, path := range all {
+ if exists, _ := helpers.Exists(path, sourceFs); exists {
+ return errors.New(path + " already exists")
+ }
+ }
+ }
+ }
+
+ siteConfig := map[string]any{
+ "baseURL": "https://example.org/",
+ "title": "My New Hugo Site",
+ "languageCode": "en-us",
+ }
+
+ err := createSiteConfig(sourceFs, createpath, siteConfig, format)
+ if err != nil {
+ return err
+ }
+
+ defaultArchetype := map[string]any{
+ "title": "{{ replace .File.ContentBaseName \"-\" \" \" | title }}",
+ "date": "{{ .Date }}",
+ "draft": true,
+ }
+
+ err = createDefaultArchetype(sourceFs, createpath, defaultArchetype, format)
+ if err != nil {
+ return err
+ }
+
+ return copyFiles(createpath, sourceFs, siteFs)
+}
+
+func copyFiles(createpath string, sourceFs afero.Fs, skeleton embed.FS) error {
+ return fs.WalkDir(skeleton, ".", func(path string, d fs.DirEntry, err error) error {
+ _, slug, _ := strings.Cut(path, "/")
+ if d.IsDir() {
+ return sourceFs.MkdirAll(filepath.Join(createpath, slug), 0o777)
+ } else {
+ if filepath.Base(path) != ".gitkeep" {
+ data, _ := fs.ReadFile(skeleton, path)
+ return helpers.WriteToDisk(filepath.Join(createpath, slug), bytes.NewReader(data), sourceFs)
+ }
+ return nil
+ }
+ })
+}
+
+func createSiteConfig(fs afero.Fs, createpath string, in map[string]any, format string) (err error) {
+ var buf bytes.Buffer
+ err = parser.InterfaceToConfig(in, metadecoders.FormatFromString(format), &buf)
+ if err != nil {
+ return err
+ }
+
+ return helpers.WriteToDisk(filepath.Join(createpath, "hugo."+format), &buf, fs)
+}
+
+func createDefaultArchetype(fs afero.Fs, createpath string, in map[string]any, format string) (err error) {
+ var buf bytes.Buffer
+ err = parser.InterfaceToFrontMatter(in, metadecoders.FormatFromString(format), &buf)
+ if err != nil {
+ return err
+ }
+
+ return helpers.WriteToDisk(filepath.Join(createpath, "archetypes", "default.md"), &buf, fs)
+}
diff --git a/create/skeletons/theme/assets/css/main.css b/create/skeletons/theme/assets/css/main.css
new file mode 100644
index 000000000..166ade924
--- /dev/null
+++ b/create/skeletons/theme/assets/css/main.css
@@ -0,0 +1,22 @@
+body {
+ color: #222;
+ font-family: sans-serif;
+ line-height: 1.5;
+ margin: 1rem;
+ max-width: 768px;
+}
+
+header {
+ border-bottom: 1px solid #222;
+ margin-bottom: 1rem;
+}
+
+footer {
+ border-top: 1px solid #222;
+ margin-top: 1rem;
+}
+
+a {
+ color: #00e;
+ text-decoration: none;
+}
diff --git a/create/skeletons/theme/assets/js/main.js b/create/skeletons/theme/assets/js/main.js
new file mode 100644
index 000000000..e2aac5275
--- /dev/null
+++ b/create/skeletons/theme/assets/js/main.js
@@ -0,0 +1 @@
+console.log('This site was generated by Hugo.');
diff --git a/create/skeletons/theme/content/_index.md b/create/skeletons/theme/content/_index.md
new file mode 100644
index 000000000..652623b57
--- /dev/null
+++ b/create/skeletons/theme/content/_index.md
@@ -0,0 +1,9 @@
++++
+title = 'Home'
+date = 2023-01-01T08:00:00-07:00
+draft = false
++++
+
+Laborum voluptate pariatur ex culpa magna nostrud est incididunt fugiat
+pariatur do dolor ipsum enim. Consequat tempor do dolor eu. Non id id anim anim
+excepteur excepteur pariatur nostrud qui irure ullamco.
diff --git a/create/skeletons/theme/content/posts/_index.md b/create/skeletons/theme/content/posts/_index.md
new file mode 100644
index 000000000..e7066c092
--- /dev/null
+++ b/create/skeletons/theme/content/posts/_index.md
@@ -0,0 +1,7 @@
++++
+title = 'Posts'
+date = 2023-01-01T08:30:00-07:00
+draft = false
++++
+
+Tempor est exercitation ad qui pariatur quis adipisicing aliquip nisi ea consequat ipsum occaecat. Nostrud consequat ullamco laboris fugiat esse esse adipisicing velit laborum ipsum incididunt ut enim. Dolor pariatur nulla quis fugiat dolore excepteur. Aliquip ad quis aliqua enim do consequat.
diff --git a/create/skeletons/theme/content/posts/post-1.md b/create/skeletons/theme/content/posts/post-1.md
new file mode 100644
index 000000000..3e3fc6b25
--- /dev/null
+++ b/create/skeletons/theme/content/posts/post-1.md
@@ -0,0 +1,10 @@
++++
+title = 'Post 1'
+date = 2023-01-15T09:00:00-07:00
+draft = false
+tags = ['red']
++++
+
+Tempor proident minim aliquip reprehenderit dolor et ad anim Lorem duis sint eiusmod. Labore ut ea duis dolor. Incididunt consectetur proident qui occaecat incididunt do nisi Lorem. Tempor do laborum elit laboris excepteur eiusmod do. Eiusmod nisi excepteur ut amet pariatur adipisicing Lorem.
+
+Occaecat nulla excepteur dolore excepteur duis eiusmod ullamco officia anim in voluptate ea occaecat officia. Cillum sint esse velit ea officia minim fugiat. Elit ea esse id aliquip pariatur cupidatat id duis minim incididunt ea ea. Anim ut duis sunt nisi. Culpa cillum sit voluptate voluptate eiusmod dolor. Enim nisi Lorem ipsum irure est excepteur voluptate eu in enim nisi. Nostrud ipsum Lorem anim sint labore consequat do.
diff --git a/create/skeletons/theme/content/posts/post-2.md b/create/skeletons/theme/content/posts/post-2.md
new file mode 100644
index 000000000..22b828769
--- /dev/null
+++ b/create/skeletons/theme/content/posts/post-2.md
@@ -0,0 +1,10 @@
++++
+title = 'Post 2'
+date = 2023-02-15T10:00:00-07:00
+draft = false
+tags = ['red','green']
++++
+
+Anim eiusmod irure incididunt sint cupidatat. Incididunt irure irure irure nisi ipsum do ut quis fugiat consectetur proident cupidatat incididunt cillum. Dolore voluptate occaecat qui mollit laborum ullamco et. Ipsum laboris officia anim laboris culpa eiusmod ex magna ex cupidatat anim ipsum aute. Mollit aliquip occaecat qui sunt velit ut cupidatat reprehenderit enim sunt laborum. Velit veniam in officia nulla adipisicing ut duis officia.
+
+Exercitation voluptate irure in irure tempor mollit Lorem nostrud ad officia. Velit id fugiat occaecat do tempor. Sit officia Lorem aliquip eu deserunt consectetur. Aute proident deserunt in nulla aliquip dolore ipsum Lorem ut cupidatat consectetur sit sint laborum. Esse cupidatat sit sint sunt tempor exercitation deserunt. Labore dolor duis laborum est do nisi ut veniam dolor et nostrud nostrud.
diff --git a/create/skeletons/theme/content/posts/post-3/bryce-canyon.jpg b/create/skeletons/theme/content/posts/post-3/bryce-canyon.jpg
new file mode 100644
index 000000000..9a923bea0
Binary files /dev/null and b/create/skeletons/theme/content/posts/post-3/bryce-canyon.jpg differ
diff --git a/create/skeletons/theme/content/posts/post-3/index.md b/create/skeletons/theme/content/posts/post-3/index.md
new file mode 100644
index 000000000..ca42a664b
--- /dev/null
+++ b/create/skeletons/theme/content/posts/post-3/index.md
@@ -0,0 +1,12 @@
++++
+title = 'Post 3'
+date = 2023-03-15T11:00:00-07:00
+draft = false
+tags = ['red','green','blue']
++++
+
+Occaecat aliqua consequat laborum ut ex aute aliqua culpa quis irure esse magna dolore quis. Proident fugiat labore eu laboris officia Lorem enim. Ipsum occaecat cillum ut tempor id sint aliqua incididunt nisi incididunt reprehenderit. Voluptate ad minim sint est aute aliquip esse occaecat tempor officia qui sunt. Aute ex ipsum id ut in est velit est laborum incididunt. Aliqua qui id do esse sunt eiusmod id deserunt eu nostrud aute sit ipsum. Deserunt esse cillum Lorem non magna adipisicing mollit amet consequat.
+
+
+
+Sit excepteur do velit veniam mollit in nostrud laboris incididunt ea. Amet eu cillum ut reprehenderit culpa aliquip labore laborum amet sit sit duis. Laborum id proident nostrud dolore laborum reprehenderit quis mollit nulla amet veniam officia id id. Aliquip in deserunt qui magna duis qui pariatur officia sunt deserunt.
diff --git a/create/skeletons/theme/data/.gitkeep b/create/skeletons/theme/data/.gitkeep
new file mode 100644
index 000000000..e69de29bb
diff --git a/create/skeletons/theme/i18n/.gitkeep b/create/skeletons/theme/i18n/.gitkeep
new file mode 100644
index 000000000..e69de29bb
diff --git a/create/skeletons/theme/layouts/_partials/footer.html b/create/skeletons/theme/layouts/_partials/footer.html
new file mode 100644
index 000000000..a7cd916d0
--- /dev/null
+++ b/create/skeletons/theme/layouts/_partials/footer.html
@@ -0,0 +1 @@
+<p>Copyright {{ now.Year }}. All rights reserved.</p>
diff --git a/create/skeletons/theme/layouts/_partials/head.html b/create/skeletons/theme/layouts/_partials/head.html
new file mode 100644
index 000000000..02c224018
--- /dev/null
+++ b/create/skeletons/theme/layouts/_partials/head.html
@@ -0,0 +1,5 @@
+<meta charset="utf-8">
+<meta name="viewport" content="width=device-width, initial-scale=1.0">
+<title>{{ if .IsHome }}{{ site.Title }}{{ else }}{{ printf "%s | %s" .Title site.Title }}{{ end }}</title>
+{{ partialCached "head/css.html" . }}
+{{ partialCached "head/js.html" . }}
diff --git a/create/skeletons/theme/layouts/_partials/head/css.html b/create/skeletons/theme/layouts/_partials/head/css.html
new file mode 100644
index 000000000..d76d23a16
--- /dev/null
+++ b/create/skeletons/theme/layouts/_partials/head/css.html
@@ -0,0 +1,9 @@
+{{- with resources.Get "css/main.css" }}
+ {{- if hugo.IsDevelopment }}
+ <link rel="stylesheet" href="{{ .RelPermalink }}">
+ {{- else }}
+ {{- with . | minify | fingerprint }}
+ <link rel="stylesheet" href="{{ .RelPermalink }}" integrity="{{ .Data.Integrity }}" crossorigin="anonymous">
+ {{- end }}
+ {{- end }}
+{{- end }}
diff --git a/create/skeletons/theme/layouts/_partials/head/js.html b/create/skeletons/theme/layouts/_partials/head/js.html
new file mode 100644
index 000000000..16ffbedfe
--- /dev/null
+++ b/create/skeletons/theme/layouts/_partials/head/js.html
@@ -0,0 +1,16 @@
+{{- with resources.Get "js/main.js" }}
+ {{- $opts := dict
+ "minify" (not hugo.IsDevelopment)
+ "sourceMap" (cond hugo.IsDevelopment "external" "")
+ "targetPath" "js/main.js"
+ }}
+ {{- with . | js.Build $opts }}
+ {{- if hugo.IsDevelopment }}
+ <script src="{{ .RelPermalink }}"></script>
+ {{- else }}
+ {{- with . | fingerprint }}
+ <script src="{{ .RelPermalink }}" integrity="{{ .Data.Integrity }}" crossorigin="anonymous"></script>
+ {{- end }}
+ {{- end }}
+ {{- end }}
+{{- end }}
diff --git a/create/skeletons/theme/layouts/_partials/header.html b/create/skeletons/theme/layouts/_partials/header.html
new file mode 100644
index 000000000..7980a00e1
--- /dev/null
+++ b/create/skeletons/theme/layouts/_partials/header.html
@@ -0,0 +1,2 @@
+<h1>{{ site.Title }}</h1>
+{{ partial "menu.html" (dict "menuID" "main" "page" .) }}
diff --git a/create/skeletons/theme/layouts/_partials/menu.html b/create/skeletons/theme/layouts/_partials/menu.html
new file mode 100644
index 000000000..14245b55d
--- /dev/null
+++ b/create/skeletons/theme/layouts/_partials/menu.html
@@ -0,0 +1,51 @@
+{{- /*
+Renders a menu for the given menu ID.
+
+@context {page} page The current page.
+@context {string} menuID The menu ID.
+
+@example: {{ partial "menu.html" (dict "menuID" "main" "page" .) }}
+*/}}
+
+{{- $page := .page }}
+{{- $menuID := .menuID }}
+
+{{- with index site.Menus $menuID }}
+ <nav>
+ <ul>
+ {{- partial "inline/menu/walk.html" (dict "page" $page "menuEntries" .) }}
+ </ul>
+ </nav>
+{{- end }}
+
+{{- define "_partials/inline/menu/walk.html" }}
+ {{- $page := .page }}
+ {{- range .menuEntries }}
+ {{- $attrs := dict "href" .URL }}
+ {{- if $page.IsMenuCurrent .Menu . }}
+ {{- $attrs = merge $attrs (dict "class" "active" "aria-current" "page") }}
+ {{- else if $page.HasMenuCurrent .Menu .}}
+ {{- $attrs = merge $attrs (dict "class" "ancestor" "aria-current" "true") }}
+ {{- end }}
+ {{- $name := .Name }}
+ {{- with .Identifier }}
+ {{- with T . }}
+ {{- $name = . }}
+ {{- end }}
+ {{- end }}
+
+```
+
+[emoji shortcodes]: /quick-reference/emojis/
diff --git a/docs/content/en/_common/menu-entry-properties.md b/docs/content/en/_common/menu-entry-properties.md
new file mode 100644
index 000000000..daeadd79d
--- /dev/null
+++ b/docs/content/en/_common/menu-entry-properties.md
@@ -0,0 +1,31 @@
+---
+_comment: Do not remove front matter.
+---
+
+
+
+identifier
+: (`string`) Required when two or more menu entries have the same `name`, or when localizing the `name` using translation tables. Must start with a letter, followed by letters, digits, or underscores.
+
+name
+: (`string`) The text to display when rendering the menu entry.
+
+params
+: (`map`) User-defined properties for the menu entry.
+
+parent
+: (`string`) The `identifier` of the parent menu entry. If `identifier` is not defined, use `name`. Required for child entries in a nested menu.
+
+post
+: (`string`) The HTML to append when rendering the menu entry.
+
+pre
+: (`string`) The HTML to prepend when rendering the menu entry.
+
+title
+: (`string`) The HTML `title` attribute of the rendered menu entry.
+
+weight
+: (`int`) A non-zero integer indicating the entry's position relative to the root of the menu, or to its parent for a child entry. Lighter entries float to the top, while heavier entries sink to the bottom.
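+
+For example, a hypothetical nested menu entry might combine several of these properties in the site configuration (illustrative names and paths only):
+
+{{< code-toggle file=hugo >}}
+# Illustrative example; adjust names and pageRef values to match your content.
+[[menus.main]]
+name = 'Products'
+identifier = 'products'
+pageRef = '/products'
+weight = 20
+
+[[menus.main]]
+name = 'Hardware'
+parent = 'products'
+pageRef = '/products/hardware'
+weight = 1
+{{< /code-toggle >}}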
diff --git a/docs/content/en/_common/methods/page/next-and-prev.md b/docs/content/en/_common/methods/page/next-and-prev.md
new file mode 100644
index 000000000..f859961a4
--- /dev/null
+++ b/docs/content/en/_common/methods/page/next-and-prev.md
@@ -0,0 +1,60 @@
+---
+_comment: Do not remove front matter.
+---
+
+Hugo determines the _next_ and _previous_ page by sorting the site's collection of regular pages according to this sorting hierarchy:
+
+Field|Precedence|Sort direction
+:--|:--|:--
+[`weight`]|1|descending
+[`date`]|2|descending
+[`linkTitle`]|3|descending
+[`path`]|4|descending
+
+[`date`]: /methods/page/date/
+[`weight`]: /methods/page/weight/
+[`linkTitle`]: /methods/page/linktitle/
+[`path`]: /methods/page/path/
+
+The sorted page collection used to determine the _next_ and _previous_ page is independent of other page collections, which may lead to unexpected behavior.
+
+For example, with this content structure:
+
+```text
+content/
+├── pages/
+│ ├── _index.md
+│ ├── page-1.md <-- front matter: weight = 10
+│ ├── page-2.md <-- front matter: weight = 20
+│ └── page-3.md <-- front matter: weight = 30
+└── _index.md
+```
+
+And these templates:
+
+```go-html-template {file="layouts/_default/list.html"}
+{{ range .Pages.ByWeight }}
+ <h2><a href="{{ .RelPermalink }}">{{ .LinkTitle }}</a></h2>
+{{ end }}
+```
+
+```go-html-template {file="layouts/_default/single.html"}
+{{ with .Prev }}
+ <a href="{{ .RelPermalink }}">Previous</a>
+{{ end }}
+
+{{ with .Next }}
+ <a href="{{ .RelPermalink }}">Next</a>
+{{ end }}
+```
+
+When you visit page-2:
+
+- The `Prev` method points to page-3
+- The `Next` method points to page-1
+
+To reverse the meaning of _next_ and _previous_ you can change the sort direction in your [site configuration], or use the [`Next`] and [`Prev`] methods on a `Pages` object for more flexibility.
+
+[site configuration]: /configuration/page/
+[`Next`]: /methods/pages/next/
+[`Prev`]: /methods/pages/prev/
diff --git a/docs/content/en/_common/methods/page/nextinsection-and-previnsection.md b/docs/content/en/_common/methods/page/nextinsection-and-previnsection.md
new file mode 100644
index 000000000..54d240eb4
--- /dev/null
+++ b/docs/content/en/_common/methods/page/nextinsection-and-previnsection.md
@@ -0,0 +1,78 @@
+---
+_comment: Do not remove front matter.
+---
+
+Hugo determines the _next_ and _previous_ page by sorting the current section's regular pages according to this sorting hierarchy:
+
+Field|Precedence|Sort direction
+:--|:--|:--
+[`weight`]|1|descending
+[`date`]|2|descending
+[`linkTitle`]|3|descending
+[`path`]|4|descending
+
+[`date`]: /methods/page/date/
+[`weight`]: /methods/page/weight/
+[`linkTitle`]: /methods/page/linktitle/
+[`path`]: /methods/page/path/
+
+The sorted page collection used to determine the _next_ and _previous_ page is independent of other page collections, which may lead to unexpected behavior.
+
+For example, with this content structure:
+
+```text
+content/
+├── pages/
+│ ├── _index.md
+│ ├── page-1.md <-- front matter: weight = 10
+│ ├── page-2.md <-- front matter: weight = 20
+│ └── page-3.md <-- front matter: weight = 30
+└── _index.md
+```
+
+And these templates:
+
+```go-html-template {file="layouts/_default/list.html"}
+{{ range .Pages.ByWeight }}
+ <h2><a href="{{ .RelPermalink }}">{{ .LinkTitle }}</a></h2>
+{{ end }}
+```
+
+```go-html-template {file="layouts/_default/single.html"}
+{{ with .PrevInSection }}
+ <a href="{{ .RelPermalink }}">Previous</a>
+{{ end }}
+
+{{ with .NextInSection }}
+ <a href="{{ .RelPermalink }}">Next</a>
+{{ end }}
+```
+
+When you visit page-2:
+
+- The `PrevInSection` method points to page-3
+- The `NextInSection` method points to page-1
+
+To reverse the meaning of _next_ and _previous_ you can change the sort direction in your [site configuration], or use the [`Next`] and [`Prev`] methods on a `Pages` object for more flexibility.
+
+[site configuration]: /configuration/page/
+[`Next`]: /methods/pages/next/
+[`Prev`]: /methods/pages/prev/
+
+## Example
+
+Code defensively by checking for page existence:
+
+```go-html-template
+{{ with .PrevInSection }}
+ <a href="{{ .RelPermalink }}">Previous</a>
+{{ end }}
+
+{{ with .NextInSection }}
+ <a href="{{ .RelPermalink }}">Next</a>
+{{ end }}
+```
+
+## Alternative
+
+Use the [`Next`] and [`Prev`] methods on a `Pages` object for more flexibility.
diff --git a/docs/content/en/_common/methods/page/output-format-methods.md b/docs/content/en/_common/methods/page/output-format-methods.md
new file mode 100644
index 000000000..1e914db03
--- /dev/null
+++ b/docs/content/en/_common/methods/page/output-format-methods.md
@@ -0,0 +1,35 @@
+---
+_comment: Do not remove front matter.
+---
+
+### Get IDENTIFIER
+
+(`any`) Returns the `OutputFormat` object with the given identifier.
+
+### MediaType
+
+(`media.Type`) Returns the media type of the output format.
+
+### MediaType.MainType
+
+(`string`) Returns the main type of the output format's media type.
+
+### MediaType.SubType
+
+(`string`) Returns the subtype of the output format's media type.
+
+### Name
+
+(`string`) Returns the identifier of the output format.
+
+### Permalink
+
+(`string`) Returns the permalink of the page generated by the current output format.
+
+### Rel
+
+(`string`) Returns the `rel` value of the output format, either the default or as defined in the site configuration.
+
+### RelPermalink
+
+(`string`) Returns the relative permalink of the page generated by the current output format.
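+
+A minimal sketch of how these methods are typically used, assuming the page has an RSS output format (illustrative only):
+
+```go-html-template
+{{/* Illustrative only: link to the page's RSS output, if any. */}}
+{{ with .OutputFormats.Get "rss" }}
+  <link rel="alternate" type="{{ .MediaType }}" href="{{ .RelPermalink }}" title="{{ .Name }}">
+{{ end }}
+```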
diff --git a/docs/content/en/_common/methods/pages/group-sort-order.md b/docs/content/en/_common/methods/pages/group-sort-order.md
new file mode 100644
index 000000000..e2997a1bd
--- /dev/null
+++ b/docs/content/en/_common/methods/pages/group-sort-order.md
@@ -0,0 +1,5 @@
+---
+_comment: Do not remove front matter.
+---
+
+For the optional sort order, specify either `asc` for ascending order, or `desc` for descending order.
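+
+For example, with a grouping method that accepts the optional sort order (an illustrative snippet):
+
+```go-html-template
+{{/* Illustrative only: group by month, oldest group first. */}}
+{{ range .Pages.GroupByDate "January 2006" "asc" }}
+  <h2>{{ .Key }}</h2>
+  {{ range .Pages }}
+    <h3><a href="{{ .RelPermalink }}">{{ .LinkTitle }}</a></h3>
+  {{ end }}
+{{ end }}
+```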
diff --git a/docs/content/en/_common/methods/pages/next-and-prev.md b/docs/content/en/_common/methods/pages/next-and-prev.md
new file mode 100644
index 000000000..462545c3f
--- /dev/null
+++ b/docs/content/en/_common/methods/pages/next-and-prev.md
@@ -0,0 +1,72 @@
+---
+_comment: Do not remove front matter.
+---
+
+Hugo determines the _next_ and _previous_ page by sorting the page collection according to this sorting hierarchy:
+
+Field|Precedence|Sort direction
+:--|:--|:--
+[`weight`]|1|descending
+[`date`]|2|descending
+[`linkTitle`]|3|descending
+[`path`]|4|descending
+
+[`date`]: /methods/page/date/
+[`weight`]: /methods/page/weight/
+[`linkTitle`]: /methods/page/linktitle/
+[`path`]: /methods/page/path/
+
+The sorted page collection used to determine the _next_ and _previous_ page is independent of other page collections, which may lead to unexpected behavior.
+
+For example, with this content structure:
+
+```text
+content/
+├── pages/
+│ ├── _index.md
+│ ├── page-1.md <-- front matter: weight = 10
+│ ├── page-2.md <-- front matter: weight = 20
+│ └── page-3.md <-- front matter: weight = 30
+└── _index.md
+```
+
+And these templates:
+
+```go-html-template {file="layouts/_default/list.html"}
+{{ range .Pages.ByWeight }}
+ <h2><a href="{{ .RelPermalink }}">{{ .LinkTitle }}</a></h2>
+{{ end }}
+```
+
+```go-html-template {file="layouts/_default/single.html"}
+{{ $pages := .CurrentSection.Pages.ByWeight }}
+
+{{ with $pages.Prev . }}
+ <a href="{{ .RelPermalink }}">Previous</a>
+{{ end }}
+
+{{ with $pages.Next . }}
+ <a href="{{ .RelPermalink }}">Next</a>
+{{ end }}
+```
+
+When you visit page-2:
+
+- The `Prev` method points to page-3
+- The `Next` method points to page-1
+
+To reverse the meaning of _next_ and _previous_ you can chain the [`Reverse`] method to the page collection definition:
+
+```go-html-template {file="layouts/_default/single.html"}
+{{ $pages := .CurrentSection.Pages.ByWeight.Reverse }}
+
+{{ with $pages.Prev . }}
+ <a href="{{ .RelPermalink }}">Previous</a>
+{{ end }}
+
+{{ with $pages.Next . }}
+ <a href="{{ .RelPermalink }}">Next</a>
+{{ end }}
+```
+
+[`Reverse`]: /methods/pages/reverse/
diff --git a/docs/content/en/_common/methods/resource/global-page-remote-resources.md b/docs/content/en/_common/methods/resource/global-page-remote-resources.md
new file mode 100644
index 000000000..49146aed4
--- /dev/null
+++ b/docs/content/en/_common/methods/resource/global-page-remote-resources.md
@@ -0,0 +1,6 @@
+---
+_comment: Do not remove front matter.
+---
+
+> [!note]
+> Use this method with [global resources](g), [page resources](g), or [remote resources](g).
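+
+As an illustrative sketch (the file names and URL are hypothetical), the three resource types are captured like this:
+
+```go-html-template
+{{/* Illustrative only: global, page, and remote resources. */}}
+{{ $global := resources.Get "images/a.jpg" }}
+{{ $pageResource := .Resources.Get "b.jpg" }}
+{{ $remote := resources.GetRemote "https://example.org/c.jpg" }}
+```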
diff --git a/docs/content/en/_common/methods/resource/processing-spec.md b/docs/content/en/_common/methods/resource/processing-spec.md
new file mode 100644
index 000000000..395217328
--- /dev/null
+++ b/docs/content/en/_common/methods/resource/processing-spec.md
@@ -0,0 +1,36 @@
+---
+_comment: Do not remove front matter.
+---
+
+## Process specification
+
+The process specification is a space-delimited, case-insensitive list of one or more of the following in any sequence:
+
+action
+: Applicable to the [`Process`](/methods/resource/process) method only. Specify zero or one of `crop`, `fill`, `fit`, or `resize`. If you specify an action you must also provide dimensions.
+
+dimensions
+: Provide width _or_ height when using the [`Resize`](/methods/resource/resize) method, else provide both width _and_ height. See [details](/content-management/image-processing/#dimensions).
+
+anchor
+: Use with the [`Crop`](/methods/resource/crop) and [`Fill`](/methods/resource/fill) methods. Specify zero or one of `TopLeft`, `Top`, `TopRight`, `Left`, `Center`, `Right`, `BottomLeft`, `Bottom`, `BottomRight`, or `Smart`. Default is `Smart`. See [details](/content-management/image-processing/#anchor).
+
+rotation
+: Typically specify zero or one of `r90`, `r180`, or `r270`. Also supports arbitrary rotation angles. See [details](/content-management/image-processing/#rotation).
+
+target format
+: Specify zero or one of `gif`, `jpeg`, `png`, `tiff`, or `webp`. See [details](/content-management/image-processing/#target-format).
+
+quality
+: Applicable to JPEG and WebP images. Optionally specify `qN` where `N` is an integer in the range [0, 100]. Default is `75`. See [details](/content-management/image-processing/#quality).
+
+hint
+: Applicable to WebP images and equivalent to the `-preset` flag for the [`cwebp`] encoder. Specify zero or one of `drawing`, `icon`, `photo`, `picture`, or `text`. Default is `photo`. See [details](/content-management/image-processing/#hint).
+
+[`cwebp`]: https://developers.google.com/speed/webp/docs/cwebp
+
+background color
+: When converting a PNG or WebP with transparency to a format that does not support transparency, optionally specify a background color using a 3-digit or a 6-digit hexadecimal color code. Default is `#ffffff` (white). See [details](/content-management/image-processing/#background-color).
+
+resampling filter
+: Typically specify zero or one of `Box`, `Lanczos`, `CatmullRom`, `MitchellNetravali`, `Linear`, or `NearestNeighbor`. Other resampling filters are available. See [details](/content-management/image-processing/#resampling-filter).
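+
+For example, a single process specification can combine an action, dimensions, an anchor, a target format, and a quality setting (an illustrative snippet; the image path is hypothetical):
+
+```go-html-template
+{{ with resources.Get "images/original.jpg" }}
+  {{/* Illustrative only: fill to 600x400 anchored at the center, encoded as WebP at quality 80. */}}
+  {{ with .Process "fill 600x400 center webp q80" }}
+    <img src="{{ .RelPermalink }}" width="{{ .Width }}" height="{{ .Height }}" alt="">
+  {{ end }}
+{{ end }}
+```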
diff --git a/docs/content/en/_common/methods/taxonomy/get-a-taxonomy-object.md b/docs/content/en/_common/methods/taxonomy/get-a-taxonomy-object.md
new file mode 100644
index 000000000..6fb729c17
--- /dev/null
+++ b/docs/content/en/_common/methods/taxonomy/get-a-taxonomy-object.md
@@ -0,0 +1,67 @@
+---
+_comment: Do not remove front matter.
+---
+
+Before we can use a `Taxonomy` method, we need to capture a `Taxonomy` object.
+
+## Capture a Taxonomy object
+
+Consider this site configuration:
+
+{{< code-toggle file=hugo >}}
+[taxonomies]
+genre = 'genres'
+author = 'authors'
+{{< /code-toggle >}}
+
+And this content structure:
+
+```text
+content/
+├── books/
+│ ├── and-then-there-were-none.md --> genres: suspense
+│ ├── death-on-the-nile.md --> genres: suspense
+│ └── jamaica-inn.md --> genres: suspense, romance
+│ └── pride-and-prejudice.md --> genres: romance
+└── _index.md
+```
+
+To capture the "genres" `Taxonomy` object from within any template, use the [`Taxonomies`] method on a `Site` object.
+
+```go-html-template
+{{ $taxonomyObject := .Site.Taxonomies.genres }}
+```
+
+To capture the "genres" `Taxonomy` object when rendering its page with a taxonomy template, use the [`Terms`] method on the page's [`Data`] object:
+
+```go-html-template {file="layouts/_default/taxonomy.html"}
+{{ $taxonomyObject := .Data.Terms }}
+```
+
+To inspect the data structure:
+
+```go-html-template
+{{ debug.Dump $taxonomyObject }}
+```
+
+Although the [`Alphabetical`] and [`ByCount`] methods provide a better data structure for ranging through the taxonomy, you can render the weighted pages by term directly from the `Taxonomy` object:
+
+```go-html-template
+{{ range $term, $weightedPages := $taxonomyObject }}
+  <h2><a href="{{ .Page.RelPermalink }}">{{ .Page.LinkTitle }}</a></h2>
+  <ul>
+    {{ range .Pages }}
+      <li><a href="{{ .RelPermalink }}">{{ .LinkTitle }}</a></li>
+    {{ end }}
+  </ul>
+{{ end }}
+```
+
+In the example above, the first anchor element is a link to the term page.
+
+[`Alphabetical`]: /methods/taxonomy/alphabetical/
+[`ByCount`]: /methods/taxonomy/bycount/
+
+[`data`]: /methods/page/data/
+[`terms`]: /methods/page/data/#in-a-taxonomy-template
+[`taxonomies`]: /methods/site/taxonomies/
diff --git a/docs/content/en/_common/methods/taxonomy/ordered-taxonomy-element-methods.md b/docs/content/en/_common/methods/taxonomy/ordered-taxonomy-element-methods.md
new file mode 100644
index 000000000..ec5f8e406
--- /dev/null
+++ b/docs/content/en/_common/methods/taxonomy/ordered-taxonomy-element-methods.md
@@ -0,0 +1,24 @@
+---
+_comment: Do not remove front matter.
+---
+
+An ordered taxonomy is a slice, where each element is an object that contains the term and a slice of its weighted pages.
+
+Each element of the slice provides these methods:
+
+Count
+: (`int`) Returns the number of pages to which the term is assigned.
+
+Page
+: (`page.Page`) Returns the term's `Page` object, useful for linking to the term page.
+
+Pages
+: (`page.Pages`) Returns a `Pages` object containing the `Page` objects to which the term is assigned, sorted by [taxonomic weight](g). To sort or group, use any of the [methods] available to the `Pages` object. For example, sort by the last modification date.
+
+Term
+: (`string`) Returns the term name.
+
+WeightedPages
+: (`page.WeightedPages`) Returns a slice of weighted pages to which the term is assigned, sorted by taxonomic weight. The `Pages` method above is more flexible, allowing you to sort and group.
+
+[methods]: /methods/pages/
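+
+As a sketch of how these methods fit together, assuming a site with a "genres" taxonomy and using `Alphabetical` to produce the ordered taxonomy:
+
+```go-html-template
+{{ range .Site.Taxonomies.genres.Alphabetical }}
+  <h2><a href="{{ .Page.RelPermalink }}">{{ .Term }}</a> ({{ .Count }})</h2>
+  <ul>
+    {{ range .Pages.ByLastmod }}
+      <li><a href="{{ .RelPermalink }}">{{ .LinkTitle }}</a></li>
+    {{ end }}
+  </ul>
+{{ end }}
+```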
diff --git a/docs/content/en/_common/parsable-date-time-strings.md b/docs/content/en/_common/parsable-date-time-strings.md
new file mode 100644
index 000000000..92842767e
--- /dev/null
+++ b/docs/content/en/_common/parsable-date-time-strings.md
@@ -0,0 +1,14 @@
+---
+_comment: Do not remove front matter.
+---
+
+Format|Time zone
+:--|:--
+`2023-10-15T13:18:50-07:00`|`America/Los_Angeles`
+`2023-10-15T13:18:50-0700`|`America/Los_Angeles`
+`2023-10-15T13:18:50Z`|`Etc/UTC`
+`2023-10-15T13:18:50`|Default is `Etc/UTC`
+`2023-10-15`|Default is `Etc/UTC`
+`15 Oct 2023`|Default is `Etc/UTC`
+
+The last three examples are not fully qualified, and default to the `Etc/UTC` time zone.
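+
+For example, a short sketch parsing one of the partially qualified strings above with `time.AsTime`:
+
+```go-html-template
+{{ $t := time.AsTime "15 Oct 2023" }}
+{{ $t.Format "2006-01-02T15:04:05Z07:00" }} → 2023-10-15T00:00:00Z
+```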
diff --git a/docs/content/en/_common/permalink-tokens.md b/docs/content/en/_common/permalink-tokens.md
new file mode 100644
index 000000000..4aec68fb8
--- /dev/null
+++ b/docs/content/en/_common/permalink-tokens.md
@@ -0,0 +1,71 @@
+---
+_comment: Do not remove front matter.
+---
+
+`:year`
+: The 4-digit year as defined in the front matter `date` field.
+
+`:month`
+: The 2-digit month as defined in the front matter `date` field.
+
+`:monthname`
+: The name of the month as defined in the front matter `date` field.
+
+`:day`
+: The 2-digit day as defined in the front matter `date` field.
+
+`:weekday`
+: The 1-digit day of the week as defined in the front matter `date` field (Sunday = `0`).
+
+`:weekdayname`
+: The name of the day of the week as defined in the front matter `date` field.
+
+`:yearday`
+: The 1- to 3-digit day of the year as defined in the front matter `date` field.
+
+`:section`
+: The content's section.
+
+`:sections`
+: The content's sections hierarchy. You can select a subset of the sections using _slice syntax_: `:sections[1:]` includes all but the first, `:sections[:last]` includes all but the last, `:sections[last]` includes only the last, and `:sections[1:2]` includes sections 2 and 3. This slice access will not throw out-of-bounds errors, so you don't have to be exact.
+
+`:title`
+: The `title` as defined in front matter, else the automatic title. Hugo generates titles automatically for section, taxonomy, and term pages that are not backed by a file.
+
+`:slug`
+: The `slug` as defined in front matter, else the `title` as defined in front matter, else the automatic title. Hugo generates titles automatically for section, taxonomy, and term pages that are not backed by a file.
+
+`:filename`
+: The content's file name without extension, applicable to the `page` page kind.
+
+ {{< deprecated-in v0.144.0 >}}
+ The `:filename` token has been deprecated. Use `:contentbasename` instead.
+ {{< /deprecated-in >}}
+
+`:slugorfilename`
+: The `slug` as defined in front matter, else the content's file name without extension, applicable to the `page` page kind.
+
+ {{< deprecated-in v0.144.0 >}}
+ The `:slugorfilename` token has been deprecated. Use `:slugorcontentbasename` instead.
+ {{< /deprecated-in >}}
+
+`:contentbasename`
+: {{< new-in 0.144.0 />}}
+: The [content base name].
+
+[content base name]: /methods/page/file/#contentbasename
+
+`:slugorcontentbasename`
+: {{< new-in 0.144.0 />}}
+: The `slug` as defined in front matter, else the [content base name].
+
+For time-related values, you can also use the layout string components defined in Go's [time package]. For example:
+
+[time package]: https://pkg.go.dev/time#pkg-constants
+
+{{< code-toggle file=hugo >}}
+permalinks:
+ posts: /:06/:1/:2/:title/
+{{< /code-toggle >}}
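+
+A similar configuration using the named tokens described above; the `posts` key mirrors the example and is only illustrative:
+
+{{< code-toggle file=hugo >}}
+permalinks:
+  posts: /:year/:month/:title/
+{{< /code-toggle >}}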
+
diff --git a/docs/content/en/_common/ref-and-relref-error-handling.md b/docs/content/en/_common/ref-and-relref-error-handling.md
new file mode 100644
index 000000000..1d67bbc1f
--- /dev/null
+++ b/docs/content/en/_common/ref-and-relref-error-handling.md
@@ -0,0 +1,10 @@
+---
+_comment: Do not remove front matter.
+---
+
+By default, Hugo will throw an error and fail the build if it cannot resolve the path. You can change this to a warning in your site configuration, and specify a URL to return when the path cannot be resolved.
+
+{{< code-toggle file=hugo >}}
+refLinksErrorLevel = 'warning'
+refLinksNotFoundURL = '/some/other/url'
+{{< /code-toggle >}}
diff --git a/docs/content/en/_common/ref-and-relref-options.md b/docs/content/en/_common/ref-and-relref-options.md
new file mode 100644
index 000000000..ed0dd14c6
--- /dev/null
+++ b/docs/content/en/_common/ref-and-relref-options.md
@@ -0,0 +1,12 @@
+---
+_comment: Do not remove front matter.
+---
+
+path
+: (`string`) The path to the target page. Paths without a leading slash (`/`) are resolved first relative to the current page, and then relative to the rest of the site.
+
+lang
+: (`string`) The language of the target page. Default is the current language. Optional.
+
+outputFormat
+: (`string`) The output format of the target page. Default is the current output format. Optional.
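+
+For example, a sketch passing these options as a map to the `relref` function; the path and language values here are hypothetical:
+
+```go-html-template
+{{ relref . (dict "path" "/books/book-1" "lang" "de") }}
+```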
diff --git a/docs/content/en/_common/render-hooks/pageinner.md b/docs/content/en/_common/render-hooks/pageinner.md
new file mode 100644
index 000000000..a598b880a
--- /dev/null
+++ b/docs/content/en/_common/render-hooks/pageinner.md
@@ -0,0 +1,47 @@
+---
+_comment: Do not remove front matter.
+---
+
+## PageInner details
+
+{{< new-in 0.125.0 />}}
+
+The primary use case for `PageInner` is to resolve links and [page resources](g) relative to an included `Page`. For example, create an "include" shortcode to compose a page from multiple content files, while preserving a global context for footnotes and the table of contents:
+
+```go-html-template {file="layouts/shortcodes/include.html" copy=true}
+{{ with .Get 0 }}
+ {{ with $.Page.GetPage . }}
+ {{- .RenderShortcodes }}
+ {{ else }}
+ {{ errorf "The %q shortcode was unable to find %q. See %s" $.Name . $.Position }}
+ {{ end }}
+{{ else }}
+ {{ errorf "The %q shortcode requires a positional parameter indicating the logical path of the file to include. See %s" .Name .Position }}
+{{ end }}
+```
+
+Then call the shortcode in your Markdown:
+
+```text {file="content/posts/p1.md"}
+{{%/* include "/posts/p2" */%}}
+```
+
+Any render hook triggered while rendering `/posts/p2` will get:
+
+- `/posts/p1` when calling `Page`
+- `/posts/p2` when calling `PageInner`
+
+`PageInner` falls back to the value of `Page` if not relevant, and always returns a value.
+
+> [!note]
+> The `PageInner` method is only relevant for shortcodes that invoke the [`RenderShortcodes`] method, and you must call the shortcode using [Markdown notation].
+
+As a practical example, Hugo's embedded link and image render hooks use the `PageInner` method to resolve Markdown link and image destinations. See the source code for each:
+
+- [Embedded link render hook]
+- [Embedded image render hook]
+
+[`RenderShortcodes`]: /methods/page/rendershortcodes/
+[Markdown notation]: /content-management/shortcodes/#notation
+[Embedded link render hook]: {{% eturl render-link %}}
+[Embedded image render hook]: {{% eturl render-image %}}
diff --git a/docs/content/en/_common/scratch-pad-scope.md b/docs/content/en/_common/scratch-pad-scope.md
new file mode 100644
index 000000000..b659497d8
--- /dev/null
+++ b/docs/content/en/_common/scratch-pad-scope.md
@@ -0,0 +1,21 @@
+---
+_comment: Do not remove front matter.
+---
+
+## Scope
+
+The method or function used to create a scratch pad determines its scope. For example, use the `Store` method on a `Page` object to create a scratch pad scoped to the page.
+
+Scope|Method or function
+:--|:--
+page|[`PAGE.Store`]
+site|[`SITE.Store`]
+global|[`hugo.Store`]
+local|[`collections.NewScratch`]
+shortcode|[`SHORTCODE.Store`]
+
+[`page.store`]: /methods/page/store
+[`site.store`]: /methods/site/store
+[`hugo.store`]: /functions/hugo/store
+[`collections.newscratch`]: /functions/collections/newscratch
+[`shortcode.store`]: /methods/shortcode/store
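+
+As a brief sketch contrasting the page and global scopes; the keys and values are arbitrary:
+
+```go-html-template
+{{ .Store.Set "subtitle" "On the page" }}  {{/* scoped to the current page */}}
+{{ hugo.Store.Set "mode" "draft" }}        {{/* scoped to the entire build */}}
+{{ .Store.Get "subtitle" }} → On the page
+{{ hugo.Store.Get "mode" }} → draft
+```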
diff --git a/docs/content/en/_common/store-methods.md b/docs/content/en/_common/store-methods.md
new file mode 100644
index 000000000..1dd776130
--- /dev/null
+++ b/docs/content/en/_common/store-methods.md
@@ -0,0 +1,86 @@
+---
+_comment: Do not remove front matter.
+---
+
+## Methods
+
+### Set
+
+Sets the value of the given key.
+
+```go-html-template
+{{ .Store.Set "greeting" "Hello" }}
+```
+
+### Get
+
+Gets the value of the given key.
+
+```go-html-template
+{{ .Store.Set "greeting" "Hello" }}
+{{ .Store.Get "greeting" }} → Hello
+```
+
+### Add
+
+Adds the given value to the existing value(s) of the given key.
+
+For single values, `Add` accepts values that support Go's `+` operator. If the first value added to a key is a slice or array, subsequent values are appended to it.
+
+```go-html-template
+{{ .Store.Set "greeting" "Hello" }}
+{{ .Store.Add "greeting" "Welcome" }}
+{{ .Store.Get "greeting" }} → HelloWelcome
+```
+
+```go-html-template
+{{ .Store.Set "total" 3 }}
+{{ .Store.Add "total" 7 }}
+{{ .Store.Get "total" }} → 10
+```
+
+```go-html-template
+{{ .Store.Set "greetings" (slice "Hello") }}
+{{ .Store.Add "greetings" (slice "Welcome" "Cheers") }}
+{{ .Store.Get "greetings" }} → [Hello Welcome Cheers]
+```
+
+### SetInMap
+
+Takes a `key`, `mapKey`, and `value`, and adds a `mapKey`/`value` pair to the map stored at the given `key`.
+
+```go-html-template
+{{ .Store.SetInMap "greetings" "english" "Hello" }}
+{{ .Store.SetInMap "greetings" "french" "Bonjour" }}
+{{ .Store.Get "greetings" }} → map[english:Hello french:Bonjour]
+```
+
+### DeleteInMap
+
+Takes a `key` and `mapKey`, and removes the `mapKey` entry from the map stored at the given `key`.
+
+```go-html-template
+{{ .Store.SetInMap "greetings" "english" "Hello" }}
+{{ .Store.SetInMap "greetings" "french" "Bonjour" }}
+{{ .Store.DeleteInMap "greetings" "english" }}
+{{ .Store.Get "greetings" }} → map[french:Bonjour]
+```
+
+### GetSortedMapValues
+
+Returns an array of values from `key` sorted by `mapKey`.
+
+```go-html-template
+{{ .Store.SetInMap "greetings" "english" "Hello" }}
+{{ .Store.SetInMap "greetings" "french" "Bonjour" }}
+{{ .Store.GetSortedMapValues "greetings" }} → [Hello Bonjour]
+```
+
+### Delete
+
+Removes the given key.
+
+```go-html-template
+{{ .Store.Set "greeting" "Hello" }}
+{{ .Store.Delete "greeting" }}
+```
diff --git a/docs/content/en/_common/syntax-highlighting-options.md b/docs/content/en/_common/syntax-highlighting-options.md
new file mode 100644
index 000000000..36144e090
--- /dev/null
+++ b/docs/content/en/_common/syntax-highlighting-options.md
@@ -0,0 +1,56 @@
+---
+_comment: Do not remove front matter.
+---
+
+anchorLineNos
+: (`bool`) Whether to render each line number as an HTML anchor element, setting the `id` attribute of the surrounding `span` element to the line number. Irrelevant if `lineNos` is `false`. Default is `false`.
+
+codeFences
+: (`bool`) Whether to highlight fenced code blocks. Default is `true`.
+
+guessSyntax
+: (`bool`) Whether to automatically detect the language if the `LANG` argument is blank or set to a language for which there is no corresponding [lexer](g). Falls back to a plain text lexer if unable to automatically detect the language. Default is `false`.
+
+ > [!note]
+ > The Chroma syntax highlighter includes lexers for approximately 250 languages, but only 5 of these have implemented automatic language detection.
+
+hl_Lines
+: (`string`) A space-delimited list of lines to emphasize within the highlighted code. To emphasize lines 2, 3, 4, and 7, set this value to `2-4 7`. This option is independent of the `lineNoStart` option.
+
+hl_inline
+: (`bool`) Whether to render the highlighted code without a wrapping container. Default is `false`.
+
+lineAnchors
+: (`string`) When rendering a line number as an HTML anchor element, prepend this value to the `id` attribute of the surrounding `span` element. This provides unique `id` attributes when a page contains two or more code blocks. Irrelevant if `lineNos` or `anchorLineNos` is `false`.
+
+lineNoStart
+: (`int`) The number to display at the beginning of the first line. Irrelevant if `lineNos` is `false`. Default is `1`.
+
+lineNos
+: (`any`) Controls line number display. Default is `false`.
+ - `true`: Enable line numbers, controlled by `lineNumbersInTable`.
+ - `false`: Disable line numbers.
+ - `inline`: Enable inline line numbers (sets `lineNumbersInTable` to `false`).
+ - `table`: Enable table-based line numbers (sets `lineNumbersInTable` to `true`).
+
+lineNumbersInTable
+: (`bool`) Whether to render the highlighted code in an HTML table with two cells. The left table cell contains the line numbers, while the right table cell contains the code. Irrelevant if `lineNos` is `false`. Default is `true`.
+
+noClasses
+: (`bool`) Whether to use inline CSS styles instead of an external CSS file. Default is `true`. To use an external CSS file, set this value to `false` and generate the CSS file from the command line:
+
+ ```text
+ hugo gen chromastyles --style=monokai > syntax.css
+ ```
+
+style
+: (`string`) The CSS styles to apply to the highlighted code. Case-sensitive. Default is `monokai`. See [syntax highlighting styles].
+
+tabWidth
+: (`int`) Substitute this number of spaces for each tab character in your highlighted code. Irrelevant if `noClasses` is `false`. Default is `4`.
+
+wrapperClass
+: {{< new-in 0.140.2 />}}
+: (`string`) The class or classes to use for the outermost element of the highlighted code. Default is `highlight`.
+
+[syntax highlighting styles]: /quick-reference/syntax-highlighting-styles/
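+
+As an illustrative sketch, passing a few of these options to the `transform.Highlight` function; the input string and chosen style are arbitrary:
+
+```go-html-template
+{{ $input := "console.log('Hello world');" }}
+{{ transform.Highlight $input "js" "lineNos=inline, style=emacs" }}
+```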
diff --git a/docs/content/en/_common/time-layout-string.md b/docs/content/en/_common/time-layout-string.md
new file mode 100644
index 000000000..3664eaef2
--- /dev/null
+++ b/docs/content/en/_common/time-layout-string.md
@@ -0,0 +1,46 @@
+---
+_comment: Do not remove front matter.
+---
+
+Format a `time.Time` value based on [Go's reference time]:
+
+[Go's reference time]: https://pkg.go.dev/time#pkg-constants
+
+```text
+Mon Jan 2 15:04:05 MST 2006
+```
+
+Create a layout string using these components:
+
+Description|Valid components
+:--|:--
+Year|`"2006" "06"`
+Month|`"Jan" "January" "01" "1"`
+Day of the week|`"Mon" "Monday"`
+Day of the month|`"2" "_2" "02"`
+Day of the year|`"__2" "002"`
+Hour|`"15" "3" "03"`
+Minute|`"4" "04"`
+Second|`"5" "05"`
+AM/PM mark|`"PM"`
+Time zone offsets|`"-0700" "-07:00" "-07" "-070000" "-07:00:00"`
+
+Replace the sign in the layout string with `Z` to print `Z` instead of an offset when the time zone is UTC.
+
+Description|Valid components
+:--|:--
+Time zone offsets|`"Z0700" "Z07:00" "Z07" "Z070000" "Z07:00:00"`
+
+```go-html-template
+{{ $t := "2023-01-27T23:44:58-08:00" }}
+{{ $t = time.AsTime $t }}
+{{ $t = $t.Format "Jan 02, 2006 3:04 PM Z07:00" }}
+
+{{ $t }} → Jan 27, 2023 11:44 PM -08:00
+```
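+
+For contrast, a small sketch converting the same value to UTC, where the `Z07:00` component prints `Z` instead of an offset:
+
+```go-html-template
+{{ $t := time.AsTime "2023-01-27T23:44:58-08:00" }}
+{{ $t.UTC.Format "Jan 02, 2006 3:04 PM Z07:00" }} → Jan 28, 2023 7:44 AM Z
+```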
+
+Strings such as `PST` and `CET` are not time zones. They are time zone _abbreviations_.
+
+Strings such as `-07:00` and `+01:00` are not time zones. They are time zone _offsets_.
+
+A time zone is a geographic area with the same local time. For example, the time zone abbreviated as `PST` or `PDT`, depending on daylight saving time, is `America/Los_Angeles`.
diff --git a/docs/content/en/_index.md b/docs/content/en/_index.md
index 334704833..358f6a4d9 100644
--- a/docs/content/en/_index.md
+++ b/docs/content/en/_index.md
@@ -1,49 +1,4 @@
---
-title: "The world’s fastest framework for building websites"
-date: 2017-03-02T12:00:00-05:00
-features:
- - heading: Blistering Speed
- image_path: /images/icon-fast.svg
- tagline: What's modern about waiting for your site to build?
- copy: Hugo is the fastest tool of its kind. At <1 ms per page, the average site builds in less than a second.
-
- - heading: Robust Content Management
- image_path: /images/icon-content-management.svg
- tagline: Flexibility rules. Hugo is a content strategist's dream.
- copy: Hugo supports unlimited content types, taxonomies, menus, dynamic API-driven content, and more, all without plugins.
-
- - heading: Shortcodes
- image_path: /images/icon-shortcodes.svg
- tagline: Hugo's shortcodes are Markdown's hidden superpower.
- copy: We love the beautiful simplicity of markdown’s syntax, but there are times when we want more flexibility. Hugo shortcodes allow for both beauty and flexibility.
-
- - heading: Built-in Templates
- image_path: /images/icon-built-in-templates.svg
- tagline: Hugo has common patterns to get your work done quickly.
- copy: Hugo ships with pre-made templates to make quick work of SEO, commenting, analytics and other functions. One line of code, and you're done.
-
- - heading: Multilingual and i18n
- image_path: /images/icon-multilingual2.svg
- tagline: Polyglot baked in.
- copy: Hugo provides full i18n support for multi-language sites with the same straightforward development experience Hugo users love in single-language sites.
-
- - heading: Custom Outputs
- image_path: /images/icon-custom-outputs.svg
- tagline: HTML not enough?
- copy: Hugo allows you to output your content in multiple formats, including JSON or AMP, and makes it easy to create your own.
-sections:
- - heading: "300+ Themes"
- cta: Check out the Hugo themes.
- link: http://themes.gohugo.io/
- color_classes: bg-accent-color white
- image: /images/homepage-screenshot-hugo-themes.jpg
- copy: "Hugo provides a robust theming system that is easy to implement but capable of producing even the most complicated websites."
- - heading: "Capable Templating"
- cta: Get Started.
- link: templates/
- color_classes: bg-primary-color-light black
- image: /images/home-page-templating-example.png
- copy: "Hugo's Go-based templating provides just the right amount of logic to build anything from the simple to complex. If you prefer Jade/Pug-like syntax, you can also use Amber, Ace, or any combination of the three."
+title: The world's fastest framework for building websites
+description: Hugo is one of the most popular open-source static site generators. With its amazing speed and flexibility, Hugo makes building websites fun again.
---
-
-Hugo is one of the most popular open-source static site generators. With its amazing speed and flexibility, Hugo makes building websites fun again.
diff --git a/docs/content/en/about/_index.md b/docs/content/en/about/_index.md
index 8ed441b61..e55800959 100644
--- a/docs/content/en/about/_index.md
+++ b/docs/content/en/about/_index.md
@@ -1,20 +1,9 @@
---
title: About Hugo
-linktitle: Overview
-description: Hugo's features, roadmap, license, and motivation.
-date: 2017-02-01
-publishdate: 2017-02-01
-lastmod: 2017-02-01
+linkTitle: About
+description: Learn about Hugo and its features, privacy protections, and security model.
categories: []
keywords: []
-menu:
- docs:
- parent: "about"
- weight: 1
-weight: 1
-draft: false
+weight: 10
aliases: [/about-hugo/,/docs/]
-toc: false
---
-
-Hugo is not your average static site generator.
diff --git a/docs/content/en/about/benefits.md b/docs/content/en/about/benefits.md
deleted file mode 100644
index 0ba28c5cc..000000000
--- a/docs/content/en/about/benefits.md
+++ /dev/null
@@ -1,43 +0,0 @@
----
-title: The Benefits of Static Site Generators
-linktitle: The Benefits of Static
-description: Improved performance, security and ease of use are just a few of the reasons static site generators are so appealing.
-date: 2017-02-01
-publishdate: 2017-02-01
-lastmod: 2017-02-01
-keywords: [ssg,static,performance,security]
-menu:
- docs:
- parent: "about"
- weight: 30
-weight: 30
-sections_weight: 30
-draft: false
-aliases: []
-toc: false
----
-
-The purpose of website generators is to render content into HTML files. Most are "dynamic site generators." That means the HTTP server---i.e., the program that sends files to the browser to be viewed---runs the generator to create a new HTML file every time an end user requests a page.
-
-Over time, dynamic site generators were programmed to cache their HTML files to prevent unnecessary delays in delivering pages to end users. A cached page is a static version of a web page.
-
-Hugo takes caching a step further and all HTML files are rendered on your computer. You can review the files locally before copying them to the computer hosting the HTTP server. Since the HTML files aren't generated dynamically, we say that Hugo is a *static site generator*.
-
-This has many benefits. The most noticeable is performance. HTTP servers are *very* good at sending files---so good, in fact, that you can effectively serve the same number of pages with a fraction of the memory and CPU needed for a dynamic site.
-
-## More on Static Site Generators
-
-* ["An Introduction to Static Site Generators", David Walsh][]
-* ["Hugo vs. Wordpress page load speed comparison: Hugo leaves WordPress in its dust", GettingThingsTech][hugovwordpress]
-* ["Static Site Generators", O'Reilly][]
-* [StaticGen: Top Open-Source Static Site Generators (GitHub Stars)][]
-* ["Top 10 Static Website Generators", Netlify blog][]
-* ["The Resurgence of Static", dotCMS][dotcms]
-
-
-["An Introduction to Static Site Generators", David Walsh]: https://davidwalsh.name/introduction-static-site-generators
-["Static Site Generators", O'Reilly]: http://www.oreilly.com/web-platform/free/files/static-site-generators.pdf
-["Top 10 Static Website Generators", Netlify blog]: https://www.netlify.com/blog/2016/05/02/top-ten-static-website-generators/
-[hugovwordpress]: https://gettingthingstech.com/hugo-vs.-wordpress-page-load-speed-comparison-hugo-leaves-wordpress-in-its-dust/
-[StaticGen: Top Open-Source Static Site Generators (GitHub Stars)]: https://www.staticgen.com/
-[dotcms]: https://dotcms.com/blog/post/the-resurgence-of-static
diff --git a/docs/content/en/about/features.md b/docs/content/en/about/features.md
index 4176c60df..ff1a6b8eb 100644
--- a/docs/content/en/about/features.md
+++ b/docs/content/en/about/features.md
@@ -1,87 +1,136 @@
---
-title: Hugo Features
-linktitle: Hugo Features
-description: Hugo boasts blistering speed, robust content management, and a powerful templating language making it a great fit for all kinds of static websites.
-date: 2017-02-01
-publishdate: 2017-02-01
-lastmod: 2017-02-01
-menu:
- docs:
- parent: "about"
- weight: 20
+title: Features
+description: Hugo's rich and powerful feature set provides the framework and tools to create static sites that build in seconds, often less.
+categories: []
+keywords: []
weight: 20
-sections_weight: 20
-draft: false
-toc: true
---
-## General
+## Framework
-* [Extremely fast][] build times (< 1 ms per page)
-* Completely cross platform, with [easy installation][install] on macOS, Linux, Windows, and more
-* Renders changes on the fly with [LiveReload][] as you develop
-* [Powerful theming][]
-* [Host your site anywhere][hostanywhere]
+[Multiplatform]
+: Install Hugo's single executable on Linux, macOS, Windows, and more.
-## Organization
+[Multilingual]
+: Localize your project for each language and region, including translations, images, dates, currencies, numbers, percentages, and collation sequence. Hugo's multilingual framework supports single-host and multihost configurations.
-* Straightforward [organization for your projects][], including website sections
-* Customizable [URLs][]
-* Support for configurable [taxonomies][], including categories and tags
-* [Sort content][] as you desire through powerful template [functions][]
-* Automatic [table of contents][] generation
-* [Dynamic menu][] creation
-* [Pretty URLs][] support
-* [Permalink][] pattern support
-* Redirects via [aliases][]
+[Output formats]
+: Render each page of your site to one or more output formats, with granular control by page kind, section, and path. While HTML is the default output format, you can add JSON, RSS, CSV, and more. For example, create a REST API to access content.
-## Content
+[Templates]
+: Create templates using variables, functions, and methods to transform your content, resources, and data into a published page. While HTML templates are the most common, you can create templates for any output format.
-* Native Markdown and Emacs Org-Mode support, as well as other languages via *external helpers* (see [supported formats][])
-* TOML, YAML, and JSON metadata support in [front matter][]
-* Customizable [homepage][]
-* Multiple [content types][]
-* Automatic and user defined [content summaries][]
-* [Shortcodes][] to enable rich content inside of Markdown
-* ["Minutes to Read"][pagevars] functionality
-* ["Wordcount"][pagevars] functionality
+[Themes]
+: Reduce development time and cost by using one of the hundreds of themes contributed by the Hugo community. Themes are available for corporate sites, documentation projects, image portfolios, landing pages, personal and professional blogs, resumes, CVs, and more.
-## Additional Features
+[Modules]
+: Reduce development time and cost by creating or importing packaged combinations of archetypes, assets, content, data, templates, translation tables, static files, or configuration settings. A module may serve as the basis for a new site or augment an existing one.
-* Integrated [Disqus][] comment support
-* Integrated [Google Analytics][] support
-* Automatic [RSS][] creation
-* Support for [Go][], [Amber], and [Ace][] HTML templates
-* [Syntax highlighting][] powered by [Chroma][] (partly compatible with Pygments)
+[Privacy]
+: Configure your site to help comply with regional privacy regulations.
+[Security]
+: Hugo's security model is based on the premise that template and configuration authors are trusted, but content authors are not. This model enables generation of HTML output safe against code injection. Other protections prevent "shelling out" to arbitrary applications, limit access to specific environment variables, prevent connections to arbitrary remote data sources, and more.
-[Ace]: /templates/alternatives/
-[aliases]: /content-management/urls/#aliases
-[Amber]: https://github.com/eknkc/amber
-[Chroma]: https://github.com/alecthomas/chroma
-[content summaries]: /content-management/summaries/
-[content types]: /content-management/types/
-[Disqus]: https://disqus.com/
-[Dynamic menu]: /templates/menus/
-[Extremely fast]: https://github.com/bep/hugo-benchmark
-[front matter]: /content-management/front-matter/
-[functions]: /functions/
-[Go]: http://golang.org/pkg/html/template/
-[Google Analytics]: https://google-analytics.com/
-[homepage]: /templates/homepage/
-[hostanywhere]: /hosting-and-deployment/
-[install]: /getting-started/installing/
-[LiveReload]: /getting-started/usage/
-[organization for your projects]: /getting-started/directory-structure/
-[pagevars]: /variables/page/
-[Permalink]: /content-management/urls/#permalinks
-[Powerful theming]: /themes/
-[Pretty URLs]: /content-management/urls/
-[RSS]: /templates/rss/
+## Content authoring
+
+[Content formats]
+: Create your content using Markdown, HTML, AsciiDoc, Emacs Org Mode, Pandoc, or reStructuredText. Markdown is the default content format, conforming to the [CommonMark] and [GitHub Flavored Markdown] specifications.
+
+[Markdown attributes]
+: Apply HTML attributes such as `class` and `id` to Markdown images and block elements including blockquotes, fenced code blocks, headings, horizontal rules, lists, paragraphs, and tables.
+
+[Markdown extensions]
+: Leverage the embedded Markdown extensions to create tables, definition lists, footnotes, task lists, inserted text, mark text, subscripts, superscripts, and more.
+
+[Markdown render hooks]
+: Override the conversion of Markdown to HTML when rendering blockquotes, fenced code blocks, headings, images, links, and tables. For example, render every standalone image as an HTML `figure` element.
+
+[Diagrams]
+: Use fenced code blocks and Markdown render hooks to include diagrams in your content.
+
+[Mathematics]
+: Include mathematical equations and expressions in Markdown using LaTeX markup.
+
+[Syntax highlighting]
+: Syntactically highlight code examples using Hugo's embedded syntax highlighter, enabled by default for fenced code blocks in Markdown. The syntax highlighter supports hundreds of code languages and dozens of styles.
+
+[Shortcodes]
+: Use Hugo's embedded shortcodes, or create your own, to insert complex content. For example, use shortcodes to include `audio` and `video` elements, render tables from local or remote data sources, insert snippets from other pages, and more.
+
+## Content management
+
+[Content adapters]
+: Create content adapters to dynamically add content when building your site. For example, use a content adapter to create pages from a remote data source such as JSON, TOML, YAML, or XML.
+
+[Taxonomies]
+: Classify content to establish simple or complex logical relationships between pages. For example, create an authors taxonomy, and assign one or more authors to each page. Among other uses, the taxonomy system provides an inverted, weighted index to render a list of related pages, ordered by relevance.
+
+[Data]
+: Augment your content using local or remote data sources including CSV, JSON, TOML, YAML, and XML. For example, create a shortcode to render an HTML table from a remote CSV file.
+
+[Menus]
+: Provide rapid access to content via Hugo's menu system, configured automatically, globally, or on a page-by-page basis. The menu system is a key component of Hugo's multilingual architecture.
+
+[URL management]
+: Serve any page from any path via global configuration or on a page-by-page basis.
+
+## Asset pipelines
+
+[Image processing]
+: Convert, resize, crop, rotate, adjust colors, apply filters, overlay text and images, and extract EXIF data.
+
+[JavaScript bundling]
+: Transpile TypeScript and JSX to JavaScript, bundle, tree shake, minify, create source maps, and perform SRI hashing.
+
+[Sass processing]
+: Transpile Sass to CSS, bundle, tree shake, minify, create source maps, perform SRI hashing, and integrate with PostCSS.
+
+[Tailwind CSS processing]
+: Compile Tailwind CSS utility classes into standard CSS, bundle, tree shake, optimize, minify, perform SRI hashing, and integrate with PostCSS.
+
+## Performance
+
+[Caching]
+: Reduce build time and cost by rendering a partial template once and caching the result, either globally or within a given context. For example, cache the result of an asset pipeline to prevent reprocessing on every rendered page.
+
+[Segmentation]
+: Reduce build time and cost by partitioning your sites into segments. For example, render the home page and the "news section" every hour, and render the entire site once a week.
+
+[Minification]
+: Minify HTML, CSS, and JavaScript to reduce file size, bandwidth consumption, and loading times.
+
+[Multilingual]: /content-management/multilingual/
+[Multiplatform]: /installation/
+[Output formats]: /configuration/output-formats/
+[Templates]: /templates/introduction/
+[Themes]: https://themes.gohugo.io/
+[Modules]: /hugo-modules/
+[Privacy]: /configuration/privacy/
+[Security]: /about/security/
+
+[Content formats]: /content-management/formats/
+[CommonMark]: https://spec.commonmark.org/current/
+[GitHub Flavored Markdown]: https://github.github.com/gfm/
+[Markdown attributes]: /content-management/markdown-attributes/
+[Markdown extensions]: /configuration/markup/#extensions
+[Markdown render hooks]: /render-hooks/introduction/
+[Diagrams]: /content-management/diagrams/
+[Mathematics]: /content-management/mathematics/
+[Syntax highlighting]: /content-management/syntax-highlighting/
[Shortcodes]: /content-management/shortcodes/
-[sort content]: /templates/
-[supported formats]: /content-management/formats/
-[Syntax highlighting]: /tools/syntax-highlighting/
-[table of contents]: /content-management/toc/
-[taxonomies]: /content-management/taxonomies/
-[URLs]: /content-management/urls/
+
+[Content adapters]: /content-management/content-adapters/
+[Taxonomies]: /content-management/taxonomies/
+[Data]: /content-management/data-sources/
+[Menus]: /content-management/menus/
+[URL management]: /content-management/urls/
+
+[Image processing]: /content-management/image-processing/
+[JavaScript bundling]: /functions/js/build/
+[Sass processing]: /functions/css/sass/
+[Tailwind CSS processing]: /functions/css/tailwindcss/
+
+[Caching]: /functions/partials/includecached/
+[Segmentation]: /configuration/segments/
+[Minification]: /configuration/minify/
diff --git a/docs/content/en/about/hugo-and-gdpr.md b/docs/content/en/about/hugo-and-gdpr.md
deleted file mode 100644
index e193e1838..000000000
--- a/docs/content/en/about/hugo-and-gdpr.md
+++ /dev/null
@@ -1,135 +0,0 @@
-
-
----
-title: Hugo and the General Data Protection Regulation (GDPR)
-linktitle: Hugo and GDPR
-description: About how to configure your Hugo site to meet the new regulations.
-date: 2018-05-25
-layout: single
-keywords: ["GDPR", "Privacy", "Data Protection"]
-menu:
- docs:
- parent: "about"
- weight: 5
-weight: 5
-sections_weight: 5
-draft: false
-aliases: [/privacy/,/gdpr/]
-toc: true
----
-
-
- General Data Protection Regulation ([GDPR](https://en.wikipedia.org/wiki/General_Data_Protection_Regulation)) is a regulation in EU law on data protection and privacy for all individuals within the European Union and the European Economic Area. It became enforceable on 25 May 2018.
-
- **Hugo is a static site generator. By using Hugo you are already standing on very solid ground. Static HTML files on disk are much easier to reason about compared to server and database driven web sites.**
-
- But even static websites can integrate with external services, so from version `0.41`, Hugo provides a **Privacy Config** that covers the relevant built-in templates.
-
- Note that:
-
- * These settings have their defaults setting set to _off_, i.e. how it worked before Hugo `0.41`. You must do your own evaluation of your site and apply the appropriate settings.
- * These settings work with the [internal templates](/templates/internal/). Some theme may contain custom templates for embedding services like Google Analytics. In that case these options have no effect.
- * We will continue this work and improve this further in future Hugo versions.
-
-## All Privacy Settings
-
-Below are all privacy settings and their default value. These settings need to be put in your site config (e.g. `config.toml`).
-
- {{< code-toggle file="config">}}
-[privacy]
-[privacy.disqus]
-disable = false
-[privacy.googleAnalytics]
-disable = false
-respectDoNotTrack = false
-anonymizeIP = false
-useSessionStorage = false
-[privacy.instagram]
-disable = false
-simple = false
-[privacy.twitter]
-disable = false
-enableDNT = false
-simple = false
-[privacy.vimeo]
-disable = false
-simple = false
-[privacy.youtube]
-disable = false
-privacyEnhanced = false
-{{< /code-toggle >}}
-
-
-## Disable All Services
-
-An example Privacy Config that disables all the relevant services in Hugo. With this configuration, the other settings will not matter.
-
- {{< code-toggle file="config">}}
-[privacy]
-[privacy.disqus]
-disable = true
-[privacy.googleAnalytics]
-disable = true
-[privacy.instagram]
-disable = true
-[privacy.twitter]
-disable = true
-[privacy.vimeo]
-disable = true
-[privacy.youtube]
-disable = true
-{{< /code-toggle >}}
-
-## The Privacy Settings Explained
-
-### GoogleAnalytics
-
-anonymizeIP
-: Enabling this will make it so the users' IP addresses are anonymized within Google Analytics.
-
-respectDoNotTrack
-: Enabling this will make the GA templates respect the "Do Not Track" HTTP header.
-
-useSessionStorage
-: Enabling this will disable the use of Cookies and use Session Storage to Store the GA Client ID.
-
-### Instagram
-
-simple
-: If simple mode is enabled, a static and no-JS version of the Instagram image card will be built. Note that this only supports image cards and the image itself will be fetched from Instagram's servers.
-
-**Note:** If you use the _simple mode_ for Instagram and a site styled with Bootstrap 4, you may want to disable the inlines styles provided by Hugo:
-
- {{< code-toggle file="config">}}
-[services]
-[services.instagram]
-disableInlineCSS = true
-{{< /code-toggle >}}
-
-### Twitter
-
-enableDNT
-: Enabling this for the twitter/tweet shortcode, the tweet and its embedded page on your site are not used for purposes that include personalized suggestions and personalized ads.
-
-simple
-: If simple mode is enabled, a static and no-JS version of a tweet will be built.
-
-
-**Note:** If you use the _simple mode_ for Twitter, you may want to disable the inlines styles provided by Hugo:
-
- {{< code-toggle file="config">}}
-[services]
-[services.twitter]
-disableInlineCSS = true
-{{< /code-toggle >}}
-
-### YouTube
-
-privacyEnhanced
-: When you turn on privacy-enhanced mode, YouTube won’t store information about visitors on your website unless the user plays the embedded video.
-
-### Vimeo
-
-simple
-: If simple mode is enabled, the video thumbnail is fetched from Vimeo's servers and it is overlayed with a play button. If the user clicks to play the video, it will open in a new tab directly on Vimeo's website.
-
diff --git a/docs/content/en/about/introduction.md b/docs/content/en/about/introduction.md
new file mode 100644
index 000000000..9586d08f8
--- /dev/null
+++ b/docs/content/en/about/introduction.md
@@ -0,0 +1,34 @@
+---
+title: Introduction
+description: Hugo is a static site generator written in Go, optimized for speed and designed for flexibility.
+categories: []
+keywords: []
+weight: 10
+aliases: [/about/what-is-hugo/,/about/benefits/]
+---
+
+Hugo is a [static site generator] written in [Go], optimized for speed and designed for flexibility. With its advanced templating system and fast asset pipelines, Hugo renders a complete site in seconds, often less.
+
+Due to its flexible framework, multilingual support, and powerful taxonomy system, Hugo is widely used to create:
+
+- Corporate, government, nonprofit, education, news, event, and project sites
+- Documentation sites
+- Image portfolios
+- Landing pages
+- Business, professional, and personal blogs
+- Resumes and CVs
+
+Use Hugo's embedded web server during development to instantly see changes to content, structure, behavior, and presentation. Then deploy the site to your host, or push changes to your Git provider for automated builds and deployment.
+
+And with [Hugo Modules], you can share content, assets, data, translations, themes, templates, and configuration with other projects via public or private Git repositories.
+
+Learn more about Hugo's [features], [privacy protections], and [security model].
+
+[Go]: https://go.dev
+[Hugo Modules]: /hugo-modules/
+[static site generator]: https://en.wikipedia.org/wiki/Static_site_generator
+[features]: /about/features/
+[security model]: /about/security/
+[privacy protections]: /configuration/privacy/
+
+{{< youtube 0RKpf3rK57I >}}
diff --git a/docs/content/en/about/license.md b/docs/content/en/about/license.md
index a8e7c4abd..06a3a695d 100644
--- a/docs/content/en/about/license.md
+++ b/docs/content/en/about/license.md
@@ -1,165 +1,75 @@
---
-title: Apache License
-linktitle: License
-description: Hugo v0.15 and later are released under the Apache 2.0 license.
-date: 2016-02-01
-publishdate: 2016-02-01
-lastmod: 2016-03-02
-categories: ["about hugo"]
-keywords: ["License","apache"]
-menu:
- docs:
- parent: "about"
- weight: 60
-weight: 60
-sections_weight: 60
-aliases: [/meta/license]
-toc: true
+title: License
+description: Hugo is released under the Apache 2.0 license.
+categories: []
+keywords: []
+weight: 40
---
-{{% note %}}
-Hugo v0.15 and later are released under the Apache 2.0 license.
-Earlier versions of Hugo were released under the [Simple Public License](https://opensource.org/licenses/Simple-2.0).
-{{% /note %}}
+## Apache License
-_Version 2.0, January 2004_
-
+_Version 2.0, January 2004_
+<https://www.apache.org/licenses/>
-*Terms and Conditions for use, reproduction, and distribution*
+### Terms and Conditions for use, reproduction, and distribution
-## 1. Definitions
+#### 1. Definitions
-“License” shall mean the terms and conditions for use, reproduction, and
-distribution as defined by Sections 1 through 9 of this document.
+“License” shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document.
-“Licensor” shall mean the copyright owner or entity authorized by the copyright
-owner that is granting the License.
+“Licensor” shall mean the copyright owner or entity authorized by the copyright owner that is granting the License.
-“Legal Entity” shall mean the union of the acting entity and all other entities
-that control, are controlled by, or are under common control with that entity.
-For the purposes of this definition, “control” means **(i)** the power, direct or
-indirect, to cause the direction or management of such entity, whether by
-contract or otherwise, or **(ii)** ownership of fifty percent (50%) or more of the
-outstanding shares, or **(iii)** beneficial ownership of such entity.
+“Legal Entity” shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, “control” means **(i)** the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or **(ii)** ownership of fifty percent (50%) or more of the outstanding shares, or **(iii)** beneficial ownership of such entity.
-“You” (or “Your”) shall mean an individual or Legal Entity exercising
-permissions granted by this License.
+“You” (or “Your”) shall mean an individual or Legal Entity exercising permissions granted by this License.
-“Source” form shall mean the preferred form for making modifications, including
-but not limited to software source code, documentation source, and configuration
-files.
+“Source” form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files.
-“Object” form shall mean any form resulting from mechanical transformation or
-translation of a Source form, including but not limited to compiled object code,
-generated documentation, and conversions to other media types.
+“Object” form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types.
-“Work” shall mean the work of authorship, whether in Source or Object form, made
-available under the License, as indicated by a copyright notice that is included
-in or attached to the work (an example is provided in the Appendix below).
+“Work” shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below).
-“Derivative Works” shall mean any work, whether in Source or Object form, that
-is based on (or derived from) the Work and for which the editorial revisions,
-annotations, elaborations, or other modifications represent, as a whole, an
-original work of authorship. For the purposes of this License, Derivative Works
-shall not include works that remain separable from, or merely link (or bind by
-name) to the interfaces of, the Work and Derivative Works thereof.
+“Derivative Works” shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof.
-“Contribution” shall mean any work of authorship, including the original version
-of the Work and any modifications or additions to that Work or Derivative Works
-thereof, that is intentionally submitted to Licensor for inclusion in the Work
-by the copyright owner or by an individual or Legal Entity authorized to submit
-on behalf of the copyright owner. For the purposes of this definition,
-“submitted” means any form of electronic, verbal, or written communication sent
-to the Licensor or its representatives, including but not limited to
-communication on electronic mailing lists, source code control systems, and
-issue tracking systems that are managed by, or on behalf of, the Licensor for
-the purpose of discussing and improving the Work, but excluding communication
-that is conspicuously marked or otherwise designated in writing by the copyright
-owner as “Not a Contribution.”
+“Contribution” shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, “submitted” means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as “Not a Contribution.”
-“Contributor” shall mean Licensor and any individual or Legal Entity on behalf
-of whom a Contribution has been received by Licensor and subsequently
-incorporated within the Work.
+“Contributor” shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work.
-## 2. Grant of Copyright License
+#### 2. Grant of Copyright License
-Subject to the terms and conditions of this License, each Contributor hereby
-grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free,
-irrevocable copyright license to reproduce, prepare Derivative Works of,
-publicly display, publicly perform, sublicense, and distribute the Work and such
-Derivative Works in Source or Object form.
+Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form.
-## 3. Grant of Patent License
+#### 3. Grant of Patent License
-Subject to the terms and conditions of this License, each Contributor hereby
-grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free,
-irrevocable (except as stated in this section) patent license to make, have
-made, use, offer to sell, sell, import, and otherwise transfer the Work, where
-such license applies only to those patent claims licensable by such Contributor
-that are necessarily infringed by their Contribution(s) alone or by combination
-of their Contribution(s) with the Work to which such Contribution(s) was
-submitted. If You institute patent litigation against any entity (including a
-cross-claim or counterclaim in a lawsuit) alleging that the Work or a
-Contribution incorporated within the Work constitutes direct or contributory
-patent infringement, then any patent licenses granted to You under this License
-for that Work shall terminate as of the date such litigation is filed.
+Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed.
-## 4. Redistribution
+#### 4. Redistribution
-You may reproduce and distribute copies of the Work or Derivative Works thereof
-in any medium, with or without modifications, and in Source or Object form,
-provided that You meet the following conditions:
+You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions:
-* **(a)** You must give any other recipients of the Work or Derivative Works a copy of
-this License; and
-* **(b)** You must cause any modified files to carry prominent notices stating that You
-changed the files; and
-* **\(c)** You must retain, in the Source form of any Derivative Works that You distribute,
-all copyright, patent, trademark, and attribution notices from the Source form
-of the Work, excluding those notices that do not pertain to any part of the
-Derivative Works; and
+* **(a)** You must give any other recipients of the Work or Derivative Works a copy of this License; and
+* **(b)** You must cause any modified files to carry prominent notices stating that You changed the files; and
+* **(c)** You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and
* **(d)** If the Work includes a “NOTICE” text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License.
You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License.
-## 5. Submission of Contributions
+#### 5. Submission of Contributions
Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions.
-## 6. Trademarks
+#### 6. Trademarks
This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file.
-## 7. Disclaimer of Warranty
+#### 7. Disclaimer of Warranty
Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an “AS IS” BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License.
-## 8. Limitation of Liability
+#### 8. Limitation of Liability
In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.
-## 9. Accepting Warranty or Additional Liability
+#### 9. Accepting Warranty or Additional Liability
While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.
-
-_END OF TERMS AND CONDITIONS_
-
-## APPENDIX: How to apply the Apache License to your work
-
-To apply the Apache License to your work, attach the following boilerplate notice, with the fields enclosed by brackets `[]` replaced with your own identifying information. (Don't include the brackets!) The text should be enclosed in the appropriate comment syntax for the file format. We also recommend that a file or class name and description of purpose be included on the same “printed page” as the copyright notice for easier identification within third-party archives.
-
-{{< code file="apache-notice.txt" download="apache-notice.txt" >}}
-Copyright [yyyy] [name of copyright owner]
-
-Licensed under the Apache License, Version 2.0 (the "License");
-you may not use this file except in compliance with the License.
-You may obtain a copy of the License at
-
- http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{{< /code >}}
diff --git a/docs/content/en/about/new-in-032/index.md b/docs/content/en/about/new-in-032/index.md
deleted file mode 100644
index f3e56dc6b..000000000
--- a/docs/content/en/about/new-in-032/index.md
+++ /dev/null
@@ -1,209 +0,0 @@
----
-title: Hugo 0.32 HOWTO
-description: About page bundles, image processing and more.
-date: 2017-12-28
-keywords: [ssg,static,performance,security]
-menu:
- docs:
- parent: "about"
- weight: 10
-weight: 10
-sections_weight: 10
-draft: false
-aliases: []
-toc: true
-images:
-- images/blog/sunset.jpg
----
-
-
-{{% note %}}
-This documentation belongs in other places in this documentation site, but is put here first ... to get something up and running fast.
-{{% /note %}}
-
-
-Also see this demo project from [bep](https://github.com/bep/), the clever Norwegian behind these new features:
-
-* https://temp.bep.is/hugotest/
-* https://github.com/bep/hugotest (source)
-
-## Page Resources
-
-### Organize Your Content
-
-{{< figure src="/images/hugo-content-bundles.png" title="Pages with image resources" >}}
-
-The content folder above shows a mix of content pages (`md` (i.e. markdown) files) and image resources.
-
-{{% note %}}
-You can use any file type as a content resource as long as it is a MIME type recognized by Hugo (`json` files will, as one example, work fine). If you want to get exotic, you can define your [own media type](/templates/output-formats/#media-types).
-{{% /note %}}
-
-The 3 page bundles marked in red explained from top to bottom:
-
-1. The home page with one image resource (`1-logo.png`)
-2. The blog section with two image resources and two page resources (`content1.md`, `content2.md`). Note that the `_index.md` represents the URL for this section.
-3. An article (`hugo-is-cool`) with a folder with some images and one content resource (`cats-info.md`). Note that the `index.md` represents the URL for this article.
-
-The content files below `blog/posts` are just regular standalone pages.
-
-{{% note %}}
-Note that changes to any resource inside the `content` folder will trigger a reload when running in watch mode (aka server or live reload); this even works with `--navigateToChanged`.
-{{% /note %}}
-
-#### Sort Order
-
-* Pages are sorted according to standard Hugo page sorting rules.
-* Images and other resources are sorted in lexicographical order.
-
-### Handle Page Resources in Templates
-
-
-#### List all Resources
-
-```go-html-template
-{{ range .Resources }}
-
-{{ end }}
-```
-
-For an absolute URL, use `.Permalink`.
-
-**Note:** The permalink will be relative to the content page, respecting permalink settings. Also, included page resources will not have a value for `RelPermalink`.
-
-#### List All Resources by Type
-
-```go-html-template
-{{ with .Resources.ByType "image" }}
-{{ end }}
-
-```
-
-Type here is `page` for pages; otherwise it is the main type of the MIME type, e.g. `image`, `json`, etc.
-
-#### Get a Specific Resource
-
-```go-html-template
-{{ $logo := .Resources.GetByPrefix "logo" }}
-{{ with $logo }}
-{{ end }}
-```
-
-#### Include Page Resource Content
-
-```go-html-template
-{{ with .Resources.ByType "page" }}
-{{ range . }}
-
-{{ .Title }}
-{{ .Content }}
-{{ end }}
-{{ end }}
-
-```
-
-
-## Image Processing
-
-The `image` resource implements the methods `Resize`, `Fit` and `Fill`:
-
-Resize
-: Resize to the given dimension, `{{ $logo.Resize "200x" }}` will resize to 200 pixels wide and preserve the aspect ratio. Use `{{ $logo.Resize "200x100" }}` to control both height and width.
-
-Fit
-: Scale down the image to fit the given dimensions, e.g. `{{ $logo.Fit "200x100" }}` will fit the image inside a box that is 200 pixels wide and 100 pixels high.
-
-Fill
-: Resize and crop the image to the given dimensions, e.g. `{{ $logo.Fill "200x100" }}` will resize and crop to a width of 200 and a height of 100.
-
-
-{{% note %}}
-Image operations in Hugo currently **do not preserve EXIF data** as this is not supported by Go's [image package](https://github.com/golang/go/search?q=exif&type=Issues&utf8=%E2%9C%93). This will be improved on in the future.
-{{% /note %}}
-
-
-### Image Processing Examples
-
-_The photo of the sunset used in the examples below is Copyright [Bjørn Erik Pedersen](https://commons.wikimedia.org/wiki/User:Bep) (Creative Commons Attribution-Share Alike 4.0 International license)_
-
-
-{{< imgproc sunset Resize "300x" />}}
-
-{{< imgproc sunset Fill "90x120 left" />}}
-
-{{< imgproc sunset Fill "90x120 right" />}}
-
-{{< imgproc sunset Fit "90x90" />}}
-
-{{< imgproc sunset Resize "300x q10" />}}
-
-
-This is the shortcode used in the examples above:
-
-
-{{< code file="layouts/shortcodes/imgproc.html" >}}
-{{< readfile file="layouts/shortcodes/imgproc.html" >}}
-{{< /code >}}
-
-And it is used like this:
-
-```go-html-template
-{{</* imgproc sunset Resize "300x" */>}}
-```
-
-### Image Processing Options
-
-In addition to the dimensions (e.g. `200x100`) where either height or width can be omitted, Hugo supports a set of additional image options:
-
-Anchor
-: Only relevant for `Fill`. This is useful for thumbnail generation where the main subject is located in, say, the left corner. Valid values are `Center`, `TopLeft`, `Top`, `TopRight`, `Left`, `Right`, `BottomLeft`, `Bottom`, `BottomRight`. Example: `{{ $logo.Fill "200x100 BottomLeft" }}`
-
-JPEG Quality
-: Only relevant for JPEG images, values 1 to 100 inclusive, higher is better. Default is 75. `{{ $logo.Resize "200x q50" }}`
-
-Rotate
-: Rotates an image by the given angle counter-clockwise. The rotation will be performed first to get the dimensions correct. `{{ $logo.Resize "200x r90" }}`. The main use of this is to be able to manually correct for [EXIF orientation](https://github.com/golang/go/issues/4341) of JPEG images.
-
-Resample Filter
-: Filter used in resizing. Default is `Box`, a simple and fast resampling filter appropriate for downscaling. See https://github.com/disintegration/imaging for more. If you want to trade quality for faster processing, this may be an option to test.
-
-
-
-### Performance
-
-Processed images are stored below `/resources` (this location can be changed with the `resourceDir` configuration setting). This folder is deliberately placed inside the project, as it is recommended to check these files into source control as part of the project. These images are not "Hugo fast" to generate, but once generated they can be reused.
-
-If you change your image settings (e.g. size), remove or rename images etc., you will end up with unused images taking up space and cluttering your project.
-
-To clean up, run:
-
-```bash
-hugo --gc
-```
-
-
-{{% note %}}
-**GC** is short for **Garbage Collection**.
-{{% /note %}}
-
-
-## Configuration
-
-### Default Image Processing Config
-
-You can configure an `imaging` section in `config.toml` with default image processing options:
-
-```toml
-[imaging]
-# Default resample filter used for resizing. Default is Box,
-# a simple and fast averaging filter appropriate for downscaling.
-# See https://github.com/disintegration/imaging
-resampleFilter = "box"
-
-# Default JPEG quality setting. Default is 75.
-quality = 68
-```
-
-
-
-
-
diff --git a/docs/content/en/about/security.md b/docs/content/en/about/security.md
new file mode 100644
index 000000000..509ca6a75
--- /dev/null
+++ b/docs/content/en/about/security.md
@@ -0,0 +1,58 @@
+---
+title: Security model
+linkTitle: Security
+description: A summary of Hugo's security model.
+categories: []
+keywords: []
+weight: 30
+aliases: [/about/security-model/]
+---
+
+## Runtime security
+
+Hugo generates static websites, meaning the final output runs directly in the browser and interacts with any integrated APIs. However, during development and site building, the `hugo` executable itself is the runtime environment.
+
+Securing a runtime is a complex task. Hugo addresses this through a robust sandboxing approach and a strict security policy with default protections. Key features include:
+
+- Virtual file system: Hugo employs a virtual file system, limiting file access. Only the main project, not external components, can access files or directories outside the project root.
+- Read-Only access: User-defined components have read-only access to the file system, preventing unintended modifications.
+- Controlled external binaries: While Hugo utilizes external binaries for features like Asciidoctor support, these are strictly predefined with specific flags and are disabled by default. The [security policy] details these limitations.
+- No arbitrary commands: To mitigate risks, Hugo intentionally avoids implementing general functions that would allow users to execute arbitrary operating system commands.
+
+This combination of sandboxing and strict defaults effectively minimizes potential security vulnerabilities during the Hugo build process.
+
+[security policy]: /configuration/security/
+
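As an illustration of the security policy referenced above, the sketch below shows how an exec allow-list entry might look if a build needed to invoke Asciidoctor. The surrounding keys and patterns are assumptions for a typical `hugo.toml`, not values taken from this page.

```toml
# Hypothetical hugo.toml fragment (illustrative values only): extend the
# exec allow-list so the external asciidoctor binary may run during builds.
[security]
  [security.exec]
    allow = ['^go$', '^npx$', '^postcss$', '^asciidoctor$']
```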
+## Dependency security
+
+Hugo utilizes [Go Modules] to manage its dependencies, compiling as a static binary. Go Modules create a `go.sum` file, a critical security feature. This file acts as a database, storing the expected cryptographic checksums of all dependencies, including those required indirectly (transitive dependencies).
+
+[Hugo Modules], which extend Go Modules' functionality, also produce a `go.sum` file. To ensure dependency integrity, commit this `go.sum` file to your version control. If Hugo detects a checksum mismatch during the build process, it will fail, indicating a possible attempt to [tamper with your project's dependencies].
+
+[Go Modules]: https://go.dev/wiki/Modules#modules
+[Hugo Modules]: /hugo-modules/
+[tamper with your project's dependencies]: https://julienrenaux.fr/2019/12/20/github-actions-security-risk/
+
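A minimal sketch of the workflow described above, assuming a Hugo Modules project; the module path is a placeholder:

```bash
# Pin and verify module dependencies (module path is a placeholder).
hugo mod init github.com/example/my-site
hugo mod get -u              # resolve modules and write go.mod / go.sum
hugo mod verify              # fails if cached modules no longer match go.sum
git add go.mod go.sum        # commit the checksum database with the project
git commit -m "Pin module dependencies"
```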
+## Web application security
+
+Hugo's security philosophy is rooted in established security standards, primarily aligning with the threats defined by [OWASP]. For HTML output, Hugo operates under a clear trust model. This model assumes that template and configuration authors, the developers, are trustworthy. However, the data supplied to these templates is inherently considered untrusted. This distinction is crucial for understanding how Hugo handles potential security risks.
+
+[OWASP]: https://en.wikipedia.org/wiki/OWASP
+
+To prevent unintended escaping of data that developers know is safe, Hugo provides [`safe`] functions, such as [`safeHTML`]. These functions allow developers to explicitly mark data as trusted, bypassing the default escaping mechanisms. This is essential for scenarios where data is generated or sourced from reliable sources. However, an exception exists: enabling [inline shortcodes]. By activating this feature, you are implicitly trusting the logic within the shortcodes and the data contained within your content files.
+
+[`safeHTML`]: /functions/safe/html/
+[inline shortcodes]: /content-management/shortcodes/#inline
+
+It's vital to remember that Hugo is a static site generator. This architectural choice significantly reduces the attack surface by eliminating the complexities and vulnerabilities associated with dynamic user input. Unlike dynamic websites, Hugo generates static HTML files, minimizing the risk of real-time attacks. Regarding content, Hugo's default Markdown renderer is [configured to sanitize] potentially unsafe content. This default behavior ensures that potentially malicious code or scripts are removed or escaped. However, this setting can be reconfigured if you have a high degree of confidence in the safety of your content sources.
+
+[configured to sanitize]: /configuration/markup/#rendererunsafe
+
+In essence, Hugo prioritizes secure output by establishing a clear trust boundary between developers and data. By default, it errs on the side of caution, sanitizing potentially unsafe content and escaping data. Developers have the flexibility to adjust these defaults through [`safe`] functions and [configuration options], but they must do so with a clear understanding of the security implications. Hugo's static site generation model further strengthens its security posture by minimizing dynamic vulnerabilities.
+
+[`safe`]: /functions/safe
+[configuration options]: /configuration/security
+
+## Configuration
+
+See [configure security](/configuration/security/).
diff --git a/docs/content/en/about/what-is-hugo.md b/docs/content/en/about/what-is-hugo.md
deleted file mode 100644
index 257c7e82d..000000000
--- a/docs/content/en/about/what-is-hugo.md
+++ /dev/null
@@ -1,65 +0,0 @@
----
-title: What is Hugo
-linktitle: What is Hugo
-description: Hugo is a fast and modern static site generator written in Go, and designed to make website creation fun again.
-date: 2017-02-01
-publishdate: 2017-02-01
-lastmod: 2017-02-01
-layout: single
-menu:
- docs:
- parent: "about"
- weight: 10
-weight: 10
-sections_weight: 10
-draft: false
-aliases: [/overview/introduction/,/about/why-i-built-hugo/]
-toc: true
----
-
-Hugo is a general-purpose website framework. Technically speaking, Hugo is a [static site generator][]. Unlike systems that dynamically build a page with each visitor request, Hugo builds pages when you create or update your content. Since websites are viewed far more often than they are edited, Hugo is designed to provide an optimal viewing experience for your website's end users and an ideal writing experience for website authors.
-
-Websites built with Hugo are extremely fast and secure. Hugo sites can be hosted anywhere, including [Netlify][], [Heroku][], [GoDaddy][], [DreamHost][], [GitHub Pages][], [GitLab Pages][], [Surge][], [Aerobatic][], [Firebase][], [Google Cloud Storage][], [Amazon S3][], [Rackspace][], [Azure][], and [CloudFront][] and work well with CDNs. Hugo sites run without the need for a database or dependencies on expensive runtimes like Ruby, Python, or PHP.
-
-We think of Hugo as the ideal website creation tool with nearly instant build times, able to rebuild whenever a change is made.
-
-## How Fast is Hugo?
-
-{{< youtube "CdiDYZ51a2o" >}}
-
-## What Does Hugo Do?
-
-In technical terms, Hugo takes a source directory of files and templates and uses these as input to create a complete website.
-
-## Who Should Use Hugo?
-
-Hugo is for people that prefer writing in a text editor over a browser.
-
-Hugo is for people who want to hand code their own website without worrying about setting up complicated runtimes, dependencies and databases.
-
-Hugo is for people building a blog, a company site, a portfolio site, documentation, a single landing page, or a website with thousands of pages.
-
-
-
-[@spf13]: https://twitter.com/@spf13
-[Aerobatic]: https://www.aerobatic.com/
-[Amazon S3]: https://aws.amazon.com/s3/
-[Azure]: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-static-website
-[CloudFront]: https://aws.amazon.com/cloudfront/ "Amazon CloudFront"
-[DreamHost]: https://www.dreamhost.com/
-[Firebase]: https://firebase.google.com/docs/hosting/ "Firebase static hosting"
-[GitHub Pages]: https://pages.github.com/
-[GitLab Pages]: https://about.gitlab.com/features/pages/
-[Go language]: https://golang.org/
-[GoDaddy]: https://www.godaddy.com/ "Godaddy.com Hosting"
-[Google Cloud Storage]: https://cloud.google.com/storage/
-[Heroku]: https://www.heroku.com/
-[Jekyll]: https://jekyllrb.com/
-[Middleman]: https://middlemanapp.com/
-[Nanoc]: https://nanoc.ws/
-[Netlify]: https://netlify.com
-[Rackspace]: https://www.rackspace.com/cloud/files
-[Surge]: https://surge.sh
-[contributing to it]: https://github.com/gohugoio/hugo
-[rackspace]: https://www.rackspace.com/cloud/files
-[static site generator]: /about/benefits/
diff --git a/docs/content/en/commands/_index.md b/docs/content/en/commands/_index.md
new file mode 100644
index 000000000..5869bfd9d
--- /dev/null
+++ b/docs/content/en/commands/_index.md
@@ -0,0 +1,8 @@
+---
+title: Command line interface
+linkTitle: CLI
+description: Use the command line interface (CLI) to manage your site.
+categories: []
+keywords: []
+weight: 10
+---
diff --git a/docs/content/en/commands/hugo.md b/docs/content/en/commands/hugo.md
index 4f5527b27..ef0bca9a5 100644
--- a/docs/content/en/commands/hugo.md
+++ b/docs/content/en/commands/hugo.md
@@ -1,12 +1,11 @@
---
-date: 2019-07-31
title: "hugo"
slug: hugo
url: /commands/hugo/
---
## hugo
-hugo builds your site
+Build your site
### Synopsis
@@ -15,7 +14,7 @@ hugo is the main command, used to build your Hugo site.
Hugo is a Fast and Flexible Static Site Generator
built with love by spf13 and friends in Go.
-Complete documentation is available at http://gohugo.io/.
+Complete documentation is available at https://gohugo.io/.
```
hugo [flags]
@@ -24,59 +23,62 @@ hugo [flags]
### Options
```
- -b, --baseURL string hostname (and path) to the root, e.g. http://spf13.com/
- -D, --buildDrafts include content marked as draft
- -E, --buildExpired include expired content
- -F, --buildFuture include content with publishdate in the future
- --cacheDir string filesystem path to cache directory. Defaults: $TMPDIR/hugo_cache/
- --cleanDestinationDir remove files from destination not found in static directories
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- -c, --contentDir string filesystem path to content directory
- --debug debug output
- -d, --destination string filesystem path to write files to
- --disableKinds strings disable different kind of pages (home, RSS etc.)
- --enableGitInfo add Git revision, date and author info to the pages
- -e, --environment string build environment
- --forceSyncStatic copy all files when static is changed.
- --gc enable to run some cleanup tasks (remove unused cache files) after the build
- -h, --help help for hugo
- --i18n-warnings print missing translations
- --ignoreCache ignores the cache directory
- --ignoreVendor ignores any _vendor directory
- -l, --layoutDir string filesystem path to layout directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --minify minify any supported output format (HTML, XML etc.)
- --noChmod don't sync permission mode of files
- --noTimes don't sync modification time of files
- --path-warnings print warnings on duplicate target paths etc.
- --quiet build in quiet mode
- --renderToMemory render to memory (only useful for benchmark testing)
- -s, --source string filesystem path to read files relative from
- --templateMetrics display metrics about template executions
- --templateMetricsHints calculate some improvement hints when combined with --templateMetrics
- -t, --theme strings themes to use (located in /themes/THEMENAME/)
- --themesDir string filesystem path to themes directory
- --trace file write trace to file (not useful in general)
- -v, --verbose verbose output
- --verboseLog verbose logging
- -w, --watch watch filesystem for changes and recreate as needed
+ -b, --baseURL string hostname (and path) to the root, e.g. https://spf13.com/
+ -D, --buildDrafts include content marked as draft
+ -E, --buildExpired include expired content
+ -F, --buildFuture include content with publishdate in the future
+ --cacheDir string filesystem path to cache directory
+ --cleanDestinationDir remove files from destination not found in static directories
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -c, --contentDir string filesystem path to content directory
+ -d, --destination string filesystem path to write files to
+ --disableKinds strings disable different kind of pages (home, RSS etc.)
+ --enableGitInfo add Git revision, date, author, and CODEOWNERS info to the pages
+ -e, --environment string build environment
+ --forceSyncStatic copy all files when static is changed.
+ --gc enable to run some cleanup tasks (remove unused cache files) after the build
+ -h, --help help for hugo
+ --ignoreCache ignores the cache directory
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ -l, --layoutDir string filesystem path to layout directory
+ --logLevel string log level (debug|info|warn|error)
+ --minify minify any supported output format (HTML, XML etc.)
+ --noBuildLock don't create .hugo_build.lock file
+ --noChmod don't sync permission mode of files
+ --noTimes don't sync modification time of files
+ --panicOnWarning panic on first WARNING log
+ --poll string set this to a poll interval, e.g --poll 700ms, to use a poll based approach to watch for file system changes
+ --printI18nWarnings print missing translations
+ --printMemoryUsage print memory usage to screen at intervals
+ --printPathWarnings print warnings on duplicate target paths etc.
+ --printUnusedTemplates print warnings on unused templates.
+ --quiet build in quiet mode
+ --renderSegments strings named segments to render (configured in the segments config)
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --templateMetrics display metrics about template executions
+ --templateMetricsHints calculate some improvement hints when combined with --templateMetrics
+ -t, --theme strings themes to use (located in /themes/THEMENAME/)
+ --themesDir string filesystem path to themes directory
+ --trace file write trace to file (not useful in general)
+ -w, --watch watch filesystem for changes and recreate as needed
```
### SEE ALSO
-* [hugo check](/commands/hugo_check/) - Contains some verification checks
-* [hugo config](/commands/hugo_config/) - Print the site configuration
-* [hugo convert](/commands/hugo_convert/) - Convert your content to different formats
-* [hugo deploy](/commands/hugo_deploy/) - Deploy your site to a Cloud provider.
-* [hugo env](/commands/hugo_env/) - Print Hugo version and environment info
-* [hugo gen](/commands/hugo_gen/) - A collection of several useful generators.
-* [hugo import](/commands/hugo_import/) - Import your site from others.
-* [hugo list](/commands/hugo_list/) - Listing out various types of content
-* [hugo mod](/commands/hugo_mod/) - Various Hugo Modules helpers.
-* [hugo new](/commands/hugo_new/) - Create new content for your site
-* [hugo server](/commands/hugo_server/) - A high performance webserver
-* [hugo version](/commands/hugo_version/) - Print the version number of Hugo
+* [hugo build](/commands/hugo_build/) - Build your site
+* [hugo completion](/commands/hugo_completion/) - Generate the autocompletion script for the specified shell
+* [hugo config](/commands/hugo_config/) - Display site configuration
+* [hugo convert](/commands/hugo_convert/) - Convert front matter to another format
+* [hugo deploy](/commands/hugo_deploy/) - Deploy your site to a cloud provider
+* [hugo env](/commands/hugo_env/) - Display version and environment info
+* [hugo gen](/commands/hugo_gen/) - Generate documentation and syntax highlighting styles
+* [hugo import](/commands/hugo_import/) - Import a site from another system
+* [hugo list](/commands/hugo_list/) - List content
+* [hugo mod](/commands/hugo_mod/) - Manage modules
+* [hugo new](/commands/hugo_new/) - Create new content
+* [hugo server](/commands/hugo_server/) - Start the embedded web server
+* [hugo version](/commands/hugo_version/) - Display version
-###### Auto generated by spf13/cobra on 31-Jul-2019
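For orientation, a typical invocation combining a few of the flags listed above might look like this; the base URL is a placeholder:

```bash
# Production-style build: run GC on the resource cache, minify output,
# and override the base URL (placeholder value).
hugo --gc --minify --baseURL https://example.org/
```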
diff --git a/docs/content/en/commands/hugo_build.md b/docs/content/en/commands/hugo_build.md
new file mode 100644
index 000000000..582cbe511
--- /dev/null
+++ b/docs/content/en/commands/hugo_build.md
@@ -0,0 +1,72 @@
+---
+title: "hugo build"
+slug: hugo_build
+url: /commands/hugo_build/
+---
+## hugo build
+
+Build your site
+
+### Synopsis
+
+build is the main command, used to build your Hugo site.
+
+Hugo is a Fast and Flexible Static Site Generator
+built with love by spf13 and friends in Go.
+
+Complete documentation is available at https://gohugo.io/.
+
+```
+hugo build [flags]
+```
+
+### Options
+
+```
+ -b, --baseURL string hostname (and path) to the root, e.g. https://spf13.com/
+ -D, --buildDrafts include content marked as draft
+ -E, --buildExpired include expired content
+ -F, --buildFuture include content with publishdate in the future
+ --cacheDir string filesystem path to cache directory
+ --cleanDestinationDir remove files from destination not found in static directories
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -c, --contentDir string filesystem path to content directory
+ -d, --destination string filesystem path to write files to
+ --disableKinds strings disable different kind of pages (home, RSS etc.)
+ --enableGitInfo add Git revision, date, author, and CODEOWNERS info to the pages
+ -e, --environment string build environment
+ --forceSyncStatic copy all files when static is changed.
+ --gc enable to run some cleanup tasks (remove unused cache files) after the build
+ -h, --help help for build
+ --ignoreCache ignores the cache directory
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ -l, --layoutDir string filesystem path to layout directory
+ --logLevel string log level (debug|info|warn|error)
+ --minify minify any supported output format (HTML, XML etc.)
+ --noBuildLock don't create .hugo_build.lock file
+ --noChmod don't sync permission mode of files
+ --noTimes don't sync modification time of files
+ --panicOnWarning panic on first WARNING log
+ --poll string set this to a poll interval, e.g --poll 700ms, to use a poll based approach to watch for file system changes
+ --printI18nWarnings print missing translations
+ --printMemoryUsage print memory usage to screen at intervals
+ --printPathWarnings print warnings on duplicate target paths etc.
+ --printUnusedTemplates print warnings on unused templates.
+ --quiet build in quiet mode
+ --renderSegments strings named segments to render (configured in the segments config)
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --templateMetrics display metrics about template executions
+ --templateMetricsHints calculate some improvement hints when combined with --templateMetrics
+ -t, --theme strings themes to use (located in /themes/THEMENAME/)
+ --themesDir string filesystem path to themes directory
+ --trace file write trace to file (not useful in general)
+ -w, --watch watch filesystem for changes and recreate as needed
+```
+
+### SEE ALSO
+
+* [hugo](/commands/hugo/) - Build your site
+
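As with the root command, the flags above can be combined. A sketch of a preview-style build; the destination directory name is arbitrary:

```bash
# Include draft and future-dated content and write to a separate directory.
hugo build --buildDrafts --buildFuture --destination public-preview
```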
diff --git a/docs/content/en/commands/hugo_check.md b/docs/content/en/commands/hugo_check.md
deleted file mode 100644
index fc727476a..000000000
--- a/docs/content/en/commands/hugo_check.md
+++ /dev/null
@@ -1,43 +0,0 @@
----
-date: 2019-07-31
-title: "hugo check"
-slug: hugo_check
-url: /commands/hugo_check/
----
-## hugo check
-
-Contains some verification checks
-
-### Synopsis
-
-Contains some verification checks
-
-### Options
-
-```
- -h, --help help for check
-```
-
-### Options inherited from parent commands
-
-```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
-```
-
-### SEE ALSO
-
-* [hugo](/commands/hugo/) - hugo builds your site
-* [hugo check ulimit](/commands/hugo_check_ulimit/) - Check system ulimit settings
-
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/commands/hugo_check_ulimit.md b/docs/content/en/commands/hugo_check_ulimit.md
deleted file mode 100644
index 478c9a850..000000000
--- a/docs/content/en/commands/hugo_check_ulimit.md
+++ /dev/null
@@ -1,47 +0,0 @@
----
-date: 2019-07-31
-title: "hugo check ulimit"
-slug: hugo_check_ulimit
-url: /commands/hugo_check_ulimit/
----
-## hugo check ulimit
-
-Check system ulimit settings
-
-### Synopsis
-
-Hugo will inspect the current ulimit settings on the system.
-This is primarily to ensure that Hugo can watch enough files on some OSs
-
-```
-hugo check ulimit [flags]
-```
-
-### Options
-
-```
- -h, --help help for ulimit
-```
-
-### Options inherited from parent commands
-
-```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
-```
-
-### SEE ALSO
-
-* [hugo check](/commands/hugo_check/) - Contains some verification checks
-
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/commands/hugo_completion.md b/docs/content/en/commands/hugo_completion.md
new file mode 100644
index 000000000..ac60dc148
--- /dev/null
+++ b/docs/content/en/commands/hugo_completion.md
@@ -0,0 +1,46 @@
+---
+title: "hugo completion"
+slug: hugo_completion
+url: /commands/hugo_completion/
+---
+## hugo completion
+
+Generate the autocompletion script for the specified shell
+
+### Synopsis
+
+Generate the autocompletion script for hugo for the specified shell.
+See each sub-command's help for details on how to use the generated script.
+
+
+### Options
+
+```
+ -h, --help help for completion
+```
+
+### Options inherited from parent commands
+
+```
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
+```
+
+### SEE ALSO
+
+* [hugo](/commands/hugo/) - Build your site
+* [hugo completion bash](/commands/hugo_completion_bash/) - Generate the autocompletion script for bash
+* [hugo completion fish](/commands/hugo_completion_fish/) - Generate the autocompletion script for fish
+* [hugo completion powershell](/commands/hugo_completion_powershell/) - Generate the autocompletion script for powershell
+* [hugo completion zsh](/commands/hugo_completion_zsh/) - Generate the autocompletion script for zsh
+
diff --git a/docs/content/en/commands/hugo_completion_bash.md b/docs/content/en/commands/hugo_completion_bash.md
new file mode 100644
index 000000000..41fb47c0c
--- /dev/null
+++ b/docs/content/en/commands/hugo_completion_bash.md
@@ -0,0 +1,65 @@
+---
+title: "hugo completion bash"
+slug: hugo_completion_bash
+url: /commands/hugo_completion_bash/
+---
+## hugo completion bash
+
+Generate the autocompletion script for bash
+
+### Synopsis
+
+Generate the autocompletion script for the bash shell.
+
+This script depends on the 'bash-completion' package.
+If it is not installed already, you can install it via your OS's package manager.
+
+To load completions in your current shell session:
+
+ source <(hugo completion bash)
+
+To load completions for every new session, execute once:
+
+#### Linux:
+
+ hugo completion bash > /etc/bash_completion.d/hugo
+
+#### macOS:
+
+ hugo completion bash > $(brew --prefix)/etc/bash_completion.d/hugo
+
+You will need to start a new shell for this setup to take effect.
+
+
+```
+hugo completion bash
+```
+
+### Options
+
+```
+ -h, --help help for bash
+ --no-descriptions disable completion descriptions
+```
+
+### Options inherited from parent commands
+
+```
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
+```
+
+### SEE ALSO
+
+* [hugo completion](/commands/hugo_completion/) - Generate the autocompletion script for the specified shell
+
diff --git a/docs/content/en/commands/hugo_completion_fish.md b/docs/content/en/commands/hugo_completion_fish.md
new file mode 100644
index 000000000..7f971c3ca
--- /dev/null
+++ b/docs/content/en/commands/hugo_completion_fish.md
@@ -0,0 +1,56 @@
+---
+title: "hugo completion fish"
+slug: hugo_completion_fish
+url: /commands/hugo_completion_fish/
+---
+## hugo completion fish
+
+Generate the autocompletion script for fish
+
+### Synopsis
+
+Generate the autocompletion script for the fish shell.
+
+To load completions in your current shell session:
+
+ hugo completion fish | source
+
+To load completions for every new session, execute once:
+
+ hugo completion fish > ~/.config/fish/completions/hugo.fish
+
+You will need to start a new shell for this setup to take effect.
+
+
+```
+hugo completion fish [flags]
+```
+
+### Options
+
+```
+ -h, --help help for fish
+ --no-descriptions disable completion descriptions
+```
+
+### Options inherited from parent commands
+
+```
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
+```
+
+### SEE ALSO
+
+* [hugo completion](/commands/hugo_completion/) - Generate the autocompletion script for the specified shell
+
diff --git a/docs/content/en/commands/hugo_completion_powershell.md b/docs/content/en/commands/hugo_completion_powershell.md
new file mode 100644
index 000000000..6ea17892b
--- /dev/null
+++ b/docs/content/en/commands/hugo_completion_powershell.md
@@ -0,0 +1,53 @@
+---
+title: "hugo completion powershell"
+slug: hugo_completion_powershell
+url: /commands/hugo_completion_powershell/
+---
+## hugo completion powershell
+
+Generate the autocompletion script for powershell
+
+### Synopsis
+
+Generate the autocompletion script for powershell.
+
+To load completions in your current shell session:
+
+ hugo completion powershell | Out-String | Invoke-Expression
+
+To load completions for every new session, add the output of the above command
+to your powershell profile.
+
+
+```
+hugo completion powershell [flags]
+```
+
+### Options
+
+```
+ -h, --help help for powershell
+ --no-descriptions disable completion descriptions
+```
+
+### Options inherited from parent commands
+
+```
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
+```
+
+### SEE ALSO
+
+* [hugo completion](/commands/hugo_completion/) - Generate the autocompletion script for the specified shell
+
diff --git a/docs/content/en/commands/hugo_completion_zsh.md b/docs/content/en/commands/hugo_completion_zsh.md
new file mode 100644
index 000000000..b9e79f9f3
--- /dev/null
+++ b/docs/content/en/commands/hugo_completion_zsh.md
@@ -0,0 +1,67 @@
+---
+title: "hugo completion zsh"
+slug: hugo_completion_zsh
+url: /commands/hugo_completion_zsh/
+---
+## hugo completion zsh
+
+Generate the autocompletion script for zsh
+
+### Synopsis
+
+Generate the autocompletion script for the zsh shell.
+
+If shell completion is not already enabled in your environment you will need
+to enable it. You can execute the following once:
+
+ echo "autoload -U compinit; compinit" >> ~/.zshrc
+
+To load completions in your current shell session:
+
+ source <(hugo completion zsh)
+
+To load completions for every new session, execute once:
+
+#### Linux:
+
+ hugo completion zsh > "${fpath[1]}/_hugo"
+
+#### macOS:
+
+ hugo completion zsh > $(brew --prefix)/share/zsh/site-functions/_hugo
+
+You will need to start a new shell for this setup to take effect.
+
+
+```
+hugo completion zsh [flags]
+```
+
+### Options
+
+```
+ -h, --help help for zsh
+ --no-descriptions disable completion descriptions
+```
+
+### Options inherited from parent commands
+
+```
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
+```
+
+### SEE ALSO
+
+* [hugo completion](/commands/hugo_completion/) - Generate the autocompletion script for the specified shell
+
diff --git a/docs/content/en/commands/hugo_config.md b/docs/content/en/commands/hugo_config.md
index 6e1094903..2b4eaafa1 100644
--- a/docs/content/en/commands/hugo_config.md
+++ b/docs/content/en/commands/hugo_config.md
@@ -1,47 +1,53 @@
---
-date: 2019-07-31
title: "hugo config"
slug: hugo_config
url: /commands/hugo_config/
---
## hugo config
-Print the site configuration
+Display site configuration
### Synopsis
-Print the site configuration, both default and custom settings.
+Display site configuration, both default and custom settings.
```
-hugo config [flags]
+hugo config [command] [flags]
```
### Options
```
- -h, --help help for config
+ -b, --baseURL string hostname (and path) to the root, e.g. https://spf13.com/
+ --cacheDir string filesystem path to cache directory
+ -c, --contentDir string filesystem path to content directory
+ --format string preferred file format (toml, yaml or json) (default "toml")
+ -h, --help help for config
+ --lang string the language to display config for. Defaults to the first language defined.
+ --printZero include config options with zero values (e.g. false, 0, "") in the output
+ --renderSegments strings named segments to render (configured in the segments config)
+ -t, --theme strings themes to use (located in /themes/THEMENAME/)
```
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo](/commands/hugo/) - hugo builds your site
+* [hugo](/commands/hugo/) - Build your site
* [hugo config mounts](/commands/hugo_config_mounts/) - Print the configured file mounts
-###### Auto generated by spf13/cobra on 31-Jul-2019
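A few illustrative invocations using the flags documented above; the `de` language code is only an example and assumes such a language is configured:

```bash
hugo config                      # merged configuration in TOML (the default format)
hugo config --format json        # same, rendered as JSON
hugo config --lang de            # configuration for one language (example code)
hugo config mounts               # the configured file mounts
```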
diff --git a/docs/content/en/commands/hugo_config_mounts.md b/docs/content/en/commands/hugo_config_mounts.md
index 98cd6f9eb..06a781220 100644
--- a/docs/content/en/commands/hugo_config_mounts.md
+++ b/docs/content/en/commands/hugo_config_mounts.md
@@ -1,5 +1,4 @@
---
-date: 2019-07-31
title: "hugo config mounts"
slug: hugo_config_mounts
url: /commands/hugo_config_mounts/
@@ -8,39 +7,39 @@ url: /commands/hugo_config_mounts/
Print the configured file mounts
-### Synopsis
-
-Print the configured file mounts
-
```
-hugo config mounts [flags]
+hugo config mounts [flags] [args]
```
### Options
```
- -h, --help help for mounts
+ -b, --baseURL string hostname (and path) to the root, e.g. https://spf13.com/
+ --cacheDir string filesystem path to cache directory
+ -c, --contentDir string filesystem path to content directory
+ -h, --help help for mounts
+ --renderSegments strings named segments to render (configured in the segments config)
+ -t, --theme strings themes to use (located in /themes/THEMENAME/)
```
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo config](/commands/hugo_config/) - Print the site configuration
+* [hugo config](/commands/hugo_config/) - Display site configuration
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/commands/hugo_convert.md b/docs/content/en/commands/hugo_convert.md
index 1e85b72f1..a8d0b6a38 100644
--- a/docs/content/en/commands/hugo_convert.md
+++ b/docs/content/en/commands/hugo_convert.md
@@ -1,16 +1,15 @@
---
-date: 2019-07-31
title: "hugo convert"
slug: hugo_convert
url: /commands/hugo_convert/
---
## hugo convert
-Convert your content to different formats
+Convert front matter to another format
### Synopsis
-Convert your content (e.g. front matter) to different formats.
+Convert front matter to another format.
See convert's subcommands toJSON, toTOML and toYAML for more information.
@@ -25,25 +24,24 @@ See convert's subcommands toJSON, toTOML and toYAML for more information.
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo](/commands/hugo/) - hugo builds your site
+* [hugo](/commands/hugo/) - Build your site
* [hugo convert toJSON](/commands/hugo_convert_tojson/) - Convert front matter to JSON
* [hugo convert toTOML](/commands/hugo_convert_totoml/) - Convert front matter to TOML
* [hugo convert toYAML](/commands/hugo_convert_toyaml/) - Convert front matter to YAML
-###### Auto generated by spf13/cobra on 31-Jul-2019
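A cautious way to use the convert subcommands, based on the flags shown above; the output directory name is arbitrary:

```bash
# Write converted copies to a separate directory instead of editing in place.
hugo convert toYAML --output converted
# Converting in place requires --unsafe; back up (or commit) your content first.
hugo convert toYAML --unsafe
```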
diff --git a/docs/content/en/commands/hugo_convert_toJSON.md b/docs/content/en/commands/hugo_convert_toJSON.md
index 484d942e6..fe81146f9 100644
--- a/docs/content/en/commands/hugo_convert_toJSON.md
+++ b/docs/content/en/commands/hugo_convert_toJSON.md
@@ -1,5 +1,4 @@
---
-date: 2019-07-31
title: "hugo convert toJSON"
slug: hugo_convert_toJSON
url: /commands/hugo_convert_tojson/
@@ -14,7 +13,7 @@ toJSON converts all front matter in the content directory
to use JSON for the front matter.
```
-hugo convert toJSON [flags]
+hugo convert toJSON [flags] [args]
```
### Options
@@ -26,24 +25,23 @@ hugo convert toJSON [flags]
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- -o, --output string filesystem path to write files to
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- --unsafe enable less safe operations, please backup first
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ -o, --output string filesystem path to write files to
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
+ --unsafe enable less safe operations, please backup first
```
### SEE ALSO
-* [hugo convert](/commands/hugo_convert/) - Convert your content to different formats
+* [hugo convert](/commands/hugo_convert/) - Convert front matter to another format
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/commands/hugo_convert_toTOML.md b/docs/content/en/commands/hugo_convert_toTOML.md
index 53f25cf66..490b15ee6 100644
--- a/docs/content/en/commands/hugo_convert_toTOML.md
+++ b/docs/content/en/commands/hugo_convert_toTOML.md
@@ -1,5 +1,4 @@
---
-date: 2019-07-31
title: "hugo convert toTOML"
slug: hugo_convert_toTOML
url: /commands/hugo_convert_totoml/
@@ -14,7 +13,7 @@ toTOML converts all front matter in the content directory
to use TOML for the front matter.
```
-hugo convert toTOML [flags]
+hugo convert toTOML [flags] [args]
```
### Options
@@ -26,24 +25,23 @@ hugo convert toTOML [flags]
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- -o, --output string filesystem path to write files to
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- --unsafe enable less safe operations, please backup first
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ -o, --output string filesystem path to write files to
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
+ --unsafe enable less safe operations, please backup first
```
### SEE ALSO
-* [hugo convert](/commands/hugo_convert/) - Convert your content to different formats
+* [hugo convert](/commands/hugo_convert/) - Convert front matter to another format
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/commands/hugo_convert_toYAML.md b/docs/content/en/commands/hugo_convert_toYAML.md
index 4173883d8..9b00ce247 100644
--- a/docs/content/en/commands/hugo_convert_toYAML.md
+++ b/docs/content/en/commands/hugo_convert_toYAML.md
@@ -1,5 +1,4 @@
---
-date: 2019-07-31
title: "hugo convert toYAML"
slug: hugo_convert_toYAML
url: /commands/hugo_convert_toyaml/
@@ -14,7 +13,7 @@ toYAML converts all front matter in the content directory
to use YAML for the front matter.
```
-hugo convert toYAML [flags]
+hugo convert toYAML [flags] [args]
```
### Options
@@ -26,24 +25,23 @@ hugo convert toYAML [flags]
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- -o, --output string filesystem path to write files to
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- --unsafe enable less safe operations, please backup first
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ -o, --output string filesystem path to write files to
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
+ --unsafe enable less safe operations, please backup first
```
### SEE ALSO
-* [hugo convert](/commands/hugo_convert/) - Convert your content to different formats
+* [hugo convert](/commands/hugo_convert/) - Convert front matter to another format
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/commands/hugo_deploy.md b/docs/content/en/commands/hugo_deploy.md
index 054a16d42..696acf51f 100644
--- a/docs/content/en/commands/hugo_deploy.md
+++ b/docs/content/en/commands/hugo_deploy.md
@@ -1,23 +1,22 @@
---
-date: 2019-07-31
title: "hugo deploy"
slug: hugo_deploy
url: /commands/hugo_deploy/
---
## hugo deploy
-Deploy your site to a Cloud provider.
+Deploy your site to a cloud provider
### Synopsis
-Deploy your site to a Cloud provider.
+Deploy your site to a cloud provider
See https://gohugo.io/hosting-and-deployment/hugo-deploy/ for detailed
documentation.
```
-hugo deploy [flags]
+hugo deploy [flags] [args]
```
### Options
@@ -27,30 +26,30 @@ hugo deploy [flags]
--dryRun dry run
--force force upload of all files
-h, --help help for deploy
- --invalidateCDN invalidate the CDN cache via the cloudFrontDistributionID listed in the deployment target (default true)
+ --invalidateCDN invalidate the CDN cache listed in the deployment target (default true)
--maxDeletes int maximum # of files to delete, or -1 to disable (default 256)
--target string target deployment from deployments section in config file; defaults to the first one
+ --workers int number of workers to transfer files. defaults to 10 (default 10)
```
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo](/commands/hugo/) - hugo builds your site
+* [hugo](/commands/hugo/) - Build your site
-###### Auto generated by spf13/cobra on 31-Jul-2019
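
An illustrative invocation built from the flags documented above (`--dryRun`, `--target`, `--workers`, `--invalidateCDN`); the `production` target name is a placeholder for an entry in the site's deployment configuration:

```
# Show what would be uploaded or deleted without changing anything:
hugo deploy --dryRun

# Deploy to a named target with more parallel uploads and no CDN invalidation:
hugo deploy --target production --workers 20 --invalidateCDN=false
```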
diff --git a/docs/content/en/commands/hugo_env.md b/docs/content/en/commands/hugo_env.md
index 5d134d693..7e21733a4 100644
--- a/docs/content/en/commands/hugo_env.md
+++ b/docs/content/en/commands/hugo_env.md
@@ -1,19 +1,18 @@
---
-date: 2019-07-31
title: "hugo env"
slug: hugo_env
url: /commands/hugo_env/
---
## hugo env
-Print Hugo version and environment info
+Display version and environment info
### Synopsis
-Print Hugo version and environment info. This is useful in Hugo bug reports.
+Display version and environment info. This is useful in Hugo bug reports
```
-hugo env [flags]
+hugo env [flags] [args]
```
### Options
@@ -25,22 +24,21 @@ hugo env [flags]
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo](/commands/hugo/) - hugo builds your site
+* [hugo](/commands/hugo/) - Build your site
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/commands/hugo_gen.md b/docs/content/en/commands/hugo_gen.md
index e195882de..ae11a0321 100644
--- a/docs/content/en/commands/hugo_gen.md
+++ b/docs/content/en/commands/hugo_gen.md
@@ -1,16 +1,15 @@
---
-date: 2019-07-31
title: "hugo gen"
slug: hugo_gen
url: /commands/hugo_gen/
---
## hugo gen
-A collection of several useful generators.
+Generate documentation and syntax highlighting styles
### Synopsis
-A collection of several useful generators.
+Generate documentation for your project using Hugo's documentation engine, including syntax highlighting for various programming languages.
### Options
@@ -21,26 +20,24 @@ A collection of several useful generators.
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo](/commands/hugo/) - hugo builds your site
-* [hugo gen autocomplete](/commands/hugo_gen_autocomplete/) - Generate shell autocompletion script for Hugo
+* [hugo](/commands/hugo/) - Build your site
* [hugo gen chromastyles](/commands/hugo_gen_chromastyles/) - Generate CSS stylesheet for the Chroma code highlighter
-* [hugo gen doc](/commands/hugo_gen_doc/) - Generate Markdown documentation for the Hugo CLI.
+* [hugo gen doc](/commands/hugo_gen_doc/) - Generate Markdown documentation for the Hugo CLI
* [hugo gen man](/commands/hugo_gen_man/) - Generate man pages for the Hugo CLI
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/commands/hugo_gen_autocomplete.md b/docs/content/en/commands/hugo_gen_autocomplete.md
deleted file mode 100644
index 0c4657a87..000000000
--- a/docs/content/en/commands/hugo_gen_autocomplete.md
+++ /dev/null
@@ -1,64 +0,0 @@
----
-date: 2019-07-31
-title: "hugo gen autocomplete"
-slug: hugo_gen_autocomplete
-url: /commands/hugo_gen_autocomplete/
----
-## hugo gen autocomplete
-
-Generate shell autocompletion script for Hugo
-
-### Synopsis
-
-Generates a shell autocompletion script for Hugo.
-
-NOTE: The current version supports Bash only.
- This should work for *nix systems with Bash installed.
-
-By default, the file is written directly to /etc/bash_completion.d
-for convenience, and the command may need superuser rights, e.g.:
-
- $ sudo hugo gen autocomplete
-
-Add `--completionfile=/path/to/file` flag to set alternative
-file-path and name.
-
-Logout and in again to reload the completion scripts,
-or just source them in directly:
-
- $ . /etc/bash_completion
-
-```
-hugo gen autocomplete [flags]
-```
-
-### Options
-
-```
- --completionfile string autocompletion file (default "/etc/bash_completion.d/hugo.sh")
- -h, --help help for autocomplete
- --type string autocompletion type (currently only bash supported) (default "bash")
-```
-
-### Options inherited from parent commands
-
-```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
-```
-
-### SEE ALSO
-
-* [hugo gen](/commands/hugo_gen/) - A collection of several useful generators.
-
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/commands/hugo_gen_chromastyles.md b/docs/content/en/commands/hugo_gen_chromastyles.md
index c893a7f33..2863e46b4 100644
--- a/docs/content/en/commands/hugo_gen_chromastyles.md
+++ b/docs/content/en/commands/hugo_gen_chromastyles.md
@@ -1,5 +1,4 @@
---
-date: 2019-07-31
title: "hugo gen chromastyles"
slug: hugo_gen_chromastyles
url: /commands/hugo_gen_chromastyles/
@@ -10,42 +9,42 @@ Generate CSS stylesheet for the Chroma code highlighter
### Synopsis
-Generate CSS stylesheet for the Chroma code highlighter for a given style. This stylesheet is needed if pygmentsUseClasses is enabled in config.
+Generate CSS stylesheet for the Chroma code highlighter for a given style. This stylesheet is needed if markup.highlight.noClasses is disabled in config.
-See https://help.farbox.com/pygments.html for preview of available styles
+See https://xyproto.github.io/splash/docs/all.html for a preview of the available styles
```
-hugo gen chromastyles [flags]
+hugo gen chromastyles [flags] [args]
```
### Options
```
- -h, --help help for chromastyles
- --highlightStyle string style used for highlighting lines (see https://github.com/alecthomas/chroma) (default "bg:#ffffcc")
- --linesStyle string style used for line numbers (see https://github.com/alecthomas/chroma)
- --style string highlighter style (see https://help.farbox.com/pygments.html) (default "friendly")
+ -h, --help help for chromastyles
+ --highlightStyle string foreground and background colors for highlighted lines, e.g. --highlightStyle "#fff000 bg:#000fff"
+ --lineNumbersInlineStyle string foreground and background colors for inline line numbers, e.g. --lineNumbersInlineStyle "#fff000 bg:#000fff"
+ --lineNumbersTableStyle string foreground and background colors for table line numbers, e.g. --lineNumbersTableStyle "#fff000 bg:#000fff"
+ --style string highlighter style (see https://xyproto.github.io/splash/docs/) (default "friendly")
```
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo gen](/commands/hugo_gen/) - A collection of several useful generators.
+* [hugo gen](/commands/hugo_gen/) - Generate documentation and syntax highlighting styles
-###### Auto generated by spf13/cobra on 31-Jul-2019
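
Since the synopsis above notes the stylesheet is only needed when `markup.highlight.noClasses` is disabled, here is a small sketch of how the command is typically used; the `monokai` style and the output path are arbitrary examples:

```
# Generate class-based CSS for the chosen Chroma style and save it
# where the site's stylesheets live (paths are illustrative):
hugo gen chromastyles --style monokai > assets/css/syntax.css

# Optionally set the highlighted-line colors shown in the flag help above:
hugo gen chromastyles --style monokai --highlightStyle "#fff000 bg:#000fff" > assets/css/syntax.css
```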
diff --git a/docs/content/en/commands/hugo_gen_doc.md b/docs/content/en/commands/hugo_gen_doc.md
index 317560a79..3d808e75c 100644
--- a/docs/content/en/commands/hugo_gen_doc.md
+++ b/docs/content/en/commands/hugo_gen_doc.md
@@ -1,25 +1,23 @@
---
-date: 2019-07-31
title: "hugo gen doc"
slug: hugo_gen_doc
url: /commands/hugo_gen_doc/
---
## hugo gen doc
-Generate Markdown documentation for the Hugo CLI.
+Generate Markdown documentation for the Hugo CLI
### Synopsis
Generate Markdown documentation for the Hugo CLI.
+ This command is, mostly, used to create up-to-date documentation
+ of Hugo's command-line interface for https://gohugo.io/.
-This command is, mostly, used to create up-to-date documentation
-of Hugo's command-line interface for http://gohugo.io/.
-
-It creates one Markdown file per command with front matter suitable
-for rendering in Hugo.
+ It creates one Markdown file per command with front matter suitable
+ for rendering in Hugo.
```
-hugo gen doc [flags]
+hugo gen doc [flags] [args]
```
### Options
@@ -32,22 +30,21 @@ hugo gen doc [flags]
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo gen](/commands/hugo_gen/) - A collection of several useful generators.
+* [hugo gen](/commands/hugo_gen/) - Generate documentation and syntax highlighting styles
-###### Auto generated by spf13/cobra on 31-Jul-2019
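
This is the command that produces pages like the ones touched here. A hedged example; the `--dir` flag belongs to the full options list that is not visible in this hunk, so treat it as an assumption:

```
# Regenerate the CLI docs into the docs tree
# (--dir assumed from the command's full help output):
hugo gen doc --dir docs/content/en/commands
```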
diff --git a/docs/content/en/commands/hugo_gen_man.md b/docs/content/en/commands/hugo_gen_man.md
index a6f7b7534..14fe859e3 100644
--- a/docs/content/en/commands/hugo_gen_man.md
+++ b/docs/content/en/commands/hugo_gen_man.md
@@ -1,5 +1,4 @@
---
-date: 2019-07-31
title: "hugo gen man"
slug: hugo_gen_man
url: /commands/hugo_gen_man/
@@ -11,11 +10,11 @@ Generate man pages for the Hugo CLI
### Synopsis
This command automatically generates up-to-date man pages of Hugo's
-command-line interface. By default, it creates the man page files
-in the "man" directory under the current directory.
+ command-line interface. By default, it creates the man page files
+ in the "man" directory under the current directory.
```
-hugo gen man [flags]
+hugo gen man [flags] [args]
```
### Options
@@ -28,22 +27,21 @@ hugo gen man [flags]
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo gen](/commands/hugo_gen/) - A collection of several useful generators.
+* [hugo gen](/commands/hugo_gen/) - Generate documentation and syntax highlighting styles
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/commands/hugo_import.md b/docs/content/en/commands/hugo_import.md
index 67b91e66c..2b8e62951 100644
--- a/docs/content/en/commands/hugo_import.md
+++ b/docs/content/en/commands/hugo_import.md
@@ -1,16 +1,15 @@
---
-date: 2019-07-31
title: "hugo import"
slug: hugo_import
url: /commands/hugo_import/
---
## hugo import
-Import your site from others.
+Import a site from another system
### Synopsis
-Import your site from other web site generators like Jekyll.
+Import a site from another system.
Import requires a subcommand, e.g. `hugo import jekyll jekyll_root_path target_path`.
@@ -23,23 +22,22 @@ Import requires a subcommand, e.g. `hugo import jekyll jekyll_root_path target_p
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo](/commands/hugo/) - hugo builds your site
+* [hugo](/commands/hugo/) - Build your site
* [hugo import jekyll](/commands/hugo_import_jekyll/) - hugo import from Jekyll
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/commands/hugo_import_jekyll.md b/docs/content/en/commands/hugo_import_jekyll.md
index e126c2107..8746c156e 100644
--- a/docs/content/en/commands/hugo_import_jekyll.md
+++ b/docs/content/en/commands/hugo_import_jekyll.md
@@ -1,5 +1,4 @@
---
-date: 2019-07-31
title: "hugo import jekyll"
slug: hugo_import_jekyll
url: /commands/hugo_import_jekyll/
@@ -11,11 +10,11 @@ hugo import from Jekyll
### Synopsis
hugo import from Jekyll.
-
+
Import from Jekyll requires two paths, e.g. `hugo import jekyll jekyll_root_path target_path`.
```
-hugo import jekyll [flags]
+hugo import jekyll [flags] [args]
```
### Options
@@ -28,22 +27,21 @@ hugo import jekyll [flags]
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo import](/commands/hugo_import/) - Import your site from others.
+* [hugo import](/commands/hugo_import/) - Import a site from another system
-###### Auto generated by spf13/cobra on 31-Jul-2019
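
The synopsis above already gives the two-path form; as a concrete, illustrative run (both paths are placeholders):

```
# Convert an existing Jekyll site into a new Hugo project skeleton:
hugo import jekyll ~/old-jekyll-site ./new-hugo-site
```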
diff --git a/docs/content/en/commands/hugo_list.md b/docs/content/en/commands/hugo_list.md
index 16bd1933e..741ca1d68 100644
--- a/docs/content/en/commands/hugo_list.md
+++ b/docs/content/en/commands/hugo_list.md
@@ -1,18 +1,17 @@
---
-date: 2019-07-31
title: "hugo list"
slug: hugo_list
url: /commands/hugo_list/
---
## hugo list
-Listing out various types of content
+List content
### Synopsis
-Listing out various types of content.
+List content.
-List requires a subcommand, e.g. `hugo list drafts`.
+List requires a subcommand, e.g. hugo list drafts
### Options
@@ -23,26 +22,26 @@ List requires a subcommand, e.g. `hugo list drafts`.
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo](/commands/hugo/) - hugo builds your site
-* [hugo list all](/commands/hugo_list_all/) - List all posts
-* [hugo list drafts](/commands/hugo_list_drafts/) - List all drafts
-* [hugo list expired](/commands/hugo_list_expired/) - List all posts already expired
-* [hugo list future](/commands/hugo_list_future/) - List all posts dated in the future
+* [hugo](/commands/hugo/) - Build your site
+* [hugo list all](/commands/hugo_list_all/) - List all content
+* [hugo list drafts](/commands/hugo_list_drafts/) - List draft content
+* [hugo list expired](/commands/hugo_list_expired/) - List expired content
+* [hugo list future](/commands/hugo_list_future/) - List future content
+* [hugo list published](/commands/hugo_list_published/) - List published content
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/commands/hugo_list_all.md b/docs/content/en/commands/hugo_list_all.md
index 6a820ee18..e0f1efdcb 100644
--- a/docs/content/en/commands/hugo_list_all.md
+++ b/docs/content/en/commands/hugo_list_all.md
@@ -1,19 +1,18 @@
---
-date: 2019-07-31
title: "hugo list all"
slug: hugo_list_all
url: /commands/hugo_list_all/
---
## hugo list all
-List all posts
+List all content
### Synopsis
-List all of the posts in your content directory, include drafts, future and expired pages.
+List all content including draft, future, and expired.
```
-hugo list all [flags]
+hugo list all [flags] [args]
```
### Options
@@ -25,22 +24,21 @@ hugo list all [flags]
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo list](/commands/hugo_list/) - Listing out various types of content
+* [hugo list](/commands/hugo_list/) - List content
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/commands/hugo_list_drafts.md b/docs/content/en/commands/hugo_list_drafts.md
index 2dff18040..25ddc78d3 100644
--- a/docs/content/en/commands/hugo_list_drafts.md
+++ b/docs/content/en/commands/hugo_list_drafts.md
@@ -1,19 +1,18 @@
---
-date: 2019-07-31
title: "hugo list drafts"
slug: hugo_list_drafts
url: /commands/hugo_list_drafts/
---
## hugo list drafts
-List all drafts
+List draft content
### Synopsis
-List all of the drafts in your content directory.
+List draft content.
```
-hugo list drafts [flags]
+hugo list drafts [flags] [args]
```
### Options
@@ -25,22 +24,21 @@ hugo list drafts [flags]
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo list](/commands/hugo_list/) - Listing out various types of content
+* [hugo list](/commands/hugo_list/) - List content
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/commands/hugo_list_expired.md b/docs/content/en/commands/hugo_list_expired.md
index 1703d9277..1936b9920 100644
--- a/docs/content/en/commands/hugo_list_expired.md
+++ b/docs/content/en/commands/hugo_list_expired.md
@@ -1,19 +1,18 @@
---
-date: 2019-07-31
title: "hugo list expired"
slug: hugo_list_expired
url: /commands/hugo_list_expired/
---
## hugo list expired
-List all posts already expired
+List expired content
### Synopsis
-List all of the posts in your content directory which has already expired.
+List content with a past expiration date.
```
-hugo list expired [flags]
+hugo list expired [flags] [args]
```
### Options
@@ -25,22 +24,21 @@ hugo list expired [flags]
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo list](/commands/hugo_list/) - Listing out various types of content
+* [hugo list](/commands/hugo_list/) - List content
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/commands/hugo_list_future.md b/docs/content/en/commands/hugo_list_future.md
index bb4621484..3152639c2 100644
--- a/docs/content/en/commands/hugo_list_future.md
+++ b/docs/content/en/commands/hugo_list_future.md
@@ -1,19 +1,18 @@
---
-date: 2019-07-31
title: "hugo list future"
slug: hugo_list_future
url: /commands/hugo_list_future/
---
## hugo list future
-List all posts dated in the future
+List future content
### Synopsis
-List all of the posts in your content directory which will be posted in the future.
+List content with a future publication date.
```
-hugo list future [flags]
+hugo list future [flags] [args]
```
### Options
@@ -25,22 +24,21 @@ hugo list future [flags]
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo list](/commands/hugo_list/) - Listing out various types of content
+* [hugo list](/commands/hugo_list/) - List content
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/commands/hugo_list_published.md b/docs/content/en/commands/hugo_list_published.md
new file mode 100644
index 000000000..a7a08c7b4
--- /dev/null
+++ b/docs/content/en/commands/hugo_list_published.md
@@ -0,0 +1,44 @@
+---
+title: "hugo list published"
+slug: hugo_list_published
+url: /commands/hugo_list_published/
+---
+## hugo list published
+
+List published content
+
+### Synopsis
+
+List content that is not draft, future, or expired.
+
+```
+hugo list published [flags] [args]
+```
+
+### Options
+
+```
+ -h, --help help for published
+```
+
+### Options inherited from parent commands
+
+```
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
+```
+
+### SEE ALSO
+
+* [hugo list](/commands/hugo_list/) - List content
+
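
A short sketch of how the list subcommands combine with ordinary shell tools; it assumes CSV-style output with the content path in the first column, which may differ between Hugo versions:

```
# Show drafts and expired content from the project root:
hugo list drafts
hugo list expired

# Extract just the file paths from the published list
# (assumes CSV output with the path as the first column):
hugo list published | cut -d ',' -f 1
```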
diff --git a/docs/content/en/commands/hugo_mod.md b/docs/content/en/commands/hugo_mod.md
index 3fa7b7e3f..25a27185d 100644
--- a/docs/content/en/commands/hugo_mod.md
+++ b/docs/content/en/commands/hugo_mod.md
@@ -1,23 +1,21 @@
---
-date: 2019-07-31
title: "hugo mod"
slug: hugo_mod
url: /commands/hugo_mod/
---
## hugo mod
-Various Hugo Modules helpers.
+Manage modules
### Synopsis
Various helpers to help manage the modules in your project's dependency graph.
-
 Most operations here require a Go version installed on your system (>= Go 1.12) and the relevant VCS client (typically Git).
This is not needed if you only operate on modules inside /themes or if you have vendored them via "hugo mod vendor".
Note that Hugo will always start out by resolving the components defined in the site
-configuration, provided by a _vendor directory (if no --ignoreVendor flag provided),
+configuration, provided by a _vendor directory (if no --ignoreVendorPaths flag provided),
Go Modules, or a folder inside the themes directory, in that order.
See https://gohugo.io/hugo-modules/ for more information.
@@ -27,57 +25,35 @@ See https://gohugo.io/hugo-modules/ for more information.
### Options
```
- -b, --baseURL string hostname (and path) to the root, e.g. http://spf13.com/
- -D, --buildDrafts include content marked as draft
- -E, --buildExpired include expired content
- -F, --buildFuture include content with publishdate in the future
- --cacheDir string filesystem path to cache directory. Defaults: $TMPDIR/hugo_cache/
- --cleanDestinationDir remove files from destination not found in static directories
- -c, --contentDir string filesystem path to content directory
- -d, --destination string filesystem path to write files to
- --disableKinds strings disable different kind of pages (home, RSS etc.)
- --enableGitInfo add Git revision, date and author info to the pages
- --forceSyncStatic copy all files when static is changed.
- --gc enable to run some cleanup tasks (remove unused cache files) after the build
- -h, --help help for mod
- --i18n-warnings print missing translations
- --ignoreCache ignores the cache directory
- -l, --layoutDir string filesystem path to layout directory
- --minify minify any supported output format (HTML, XML etc.)
- --noChmod don't sync permission mode of files
- --noTimes don't sync modification time of files
- --path-warnings print warnings on duplicate target paths etc.
- --templateMetrics display metrics about template executions
- --templateMetricsHints calculate some improvement hints when combined with --templateMetrics
- -t, --theme strings themes to use (located in /themes/THEMENAME/)
- --trace file write trace to file (not useful in general)
+ -h, --help help for mod
```
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo](/commands/hugo/) - hugo builds your site
-* [hugo mod clean](/commands/hugo_mod_clean/) - Delete the entire Hugo Module cache.
-* [hugo mod get](/commands/hugo_mod_get/) - Resolves dependencies in your current Hugo Project.
-* [hugo mod graph](/commands/hugo_mod_graph/) - Print a module dependency graph.
-* [hugo mod init](/commands/hugo_mod_init/) - Initialize this project as a Hugo Module.
-* [hugo mod tidy](/commands/hugo_mod_tidy/) - Remove unused entries in go.mod and go.sum.
-* [hugo mod vendor](/commands/hugo_mod_vendor/) - Vendor all module dependencies into the _vendor directory.
+* [hugo](/commands/hugo/) - Build your site
+* [hugo mod clean](/commands/hugo_mod_clean/) - Delete the Hugo Module cache for the current project
+* [hugo mod get](/commands/hugo_mod_get/) - Resolves dependencies in your current Hugo project
+* [hugo mod graph](/commands/hugo_mod_graph/) - Print a module dependency graph
+* [hugo mod init](/commands/hugo_mod_init/) - Initialize this project as a Hugo Module
+* [hugo mod npm](/commands/hugo_mod_npm/) - Various npm helpers
+* [hugo mod tidy](/commands/hugo_mod_tidy/) - Remove unused entries in go.mod and go.sum
+* [hugo mod vendor](/commands/hugo_mod_vendor/) - Vendor all module dependencies into the _vendor directory
+* [hugo mod verify](/commands/hugo_mod_verify/) - Verify dependencies
-###### Auto generated by spf13/cobra on 31-Jul-2019
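
Pulling the subcommands listed above into one hedged end-to-end sketch; the module path for `init` is hypothetical, and the dependency is the same example used in these pages:

```
# Initialize the project as a Hugo Module (module path is hypothetical):
hugo mod init github.com/example/my-site

# Add a component dependency at a pinned version:
hugo mod get github.com/gohugoio/testshortcodes@v0.3.0

# Drop unused entries and, if desired, vendor everything into _vendor/:
hugo mod tidy
hugo mod vendor
```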
diff --git a/docs/content/en/commands/hugo_mod_clean.md b/docs/content/en/commands/hugo_mod_clean.md
index ef21ddfcd..ff2255e53 100644
--- a/docs/content/en/commands/hugo_mod_clean.md
+++ b/docs/content/en/commands/hugo_mod_clean.md
@@ -1,52 +1,51 @@
---
-date: 2019-07-31
title: "hugo mod clean"
slug: hugo_mod_clean
url: /commands/hugo_mod_clean/
---
## hugo mod clean
-Delete the entire Hugo Module cache.
+Delete the Hugo Module cache for the current project
### Synopsis
-Delete the entire Hugo Module cache.
-
-Note that after you run this command, all of your dependencies will be re-downloaded next time you run "hugo".
-
-Also note that if you configure a positive maxAge for the "modules" file cache, it will also be cleaned as part of "hugo --gc".
-
-
+Delete the Hugo Module cache for the current project.
```
-hugo mod clean [flags]
+hugo mod clean [flags] [args]
```
### Options
```
- -h, --help help for clean
+ --all clean entire module cache
+ -b, --baseURL string hostname (and path) to the root, e.g. https://spf13.com/
+ --cacheDir string filesystem path to cache directory
+ -c, --contentDir string filesystem path to content directory
+ -h, --help help for clean
+ --pattern string pattern matching module paths to clean (all if not set), e.g. "**hugo*"
+ --renderSegments strings named segments to render (configured in the segments config)
+ -t, --theme strings themes to use (located in /themes/THEMENAME/)
```
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo mod](/commands/hugo_mod/) - Various Hugo Modules helpers.
+* [hugo mod](/commands/hugo_mod/) - Manage modules
-###### Auto generated by spf13/cobra on 31-Jul-2019
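
Two illustrative invocations based on the `--pattern` and `--all` flags documented above:

```
# Clean cached copies of modules whose path matches a glob:
hugo mod clean --pattern "**hugo*"

# Clean the entire module cache, not just this project's entries:
hugo mod clean --all
```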
diff --git a/docs/content/en/commands/hugo_mod_get.md b/docs/content/en/commands/hugo_mod_get.md
index 678bcded5..a5c9a9ea9 100644
--- a/docs/content/en/commands/hugo_mod_get.md
+++ b/docs/content/en/commands/hugo_mod_get.md
@@ -1,36 +1,41 @@
---
-date: 2019-07-31
title: "hugo mod get"
slug: hugo_mod_get
url: /commands/hugo_mod_get/
---
## hugo mod get
-Resolves dependencies in your current Hugo Project.
+Resolves dependencies in your current Hugo project
### Synopsis
-Resolves dependencies in your current Hugo Project.
+Resolves dependencies in your current Hugo project.
Some examples:
Install the latest version possible for a given module:
hugo mod get github.com/gohugoio/testshortcodes
-
+
Install a specific version:
hugo mod get github.com/gohugoio/testshortcodes@v0.3.0
-Install the latest versions of all module dependencies:
+Install the latest versions of all direct module dependencies:
+
+ hugo mod get
+ hugo mod get ./... (recursive)
+
+Install the latest versions of all module dependencies (direct and indirect):
hugo mod get -u
+ hugo mod get -u ./... (recursive)
Run "go help get" for more information. All flags available for "go get" is also relevant here.
Note that Hugo will always start out by resolving the components defined in the site
-configuration, provided by a _vendor directory (if no --ignoreVendor flag provided),
+configuration, provided by a _vendor directory (if no --ignoreVendorPaths flag provided),
Go Modules, or a folder inside the themes directory, in that order.
See https://gohugo.io/hugo-modules/ for more information.
@@ -38,7 +43,7 @@ See https://gohugo.io/hugo-modules/ for more information.
```
-hugo mod get [flags]
+hugo mod get [flags] [args]
```
### Options
@@ -50,22 +55,21 @@ hugo mod get [flags]
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo mod](/commands/hugo_mod/) - Various Hugo Modules helpers.
+* [hugo mod](/commands/hugo_mod/) - Manage modules
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/commands/hugo_mod_graph.md b/docs/content/en/commands/hugo_mod_graph.md
index 397283180..cb2bdfb5a 100644
--- a/docs/content/en/commands/hugo_mod_graph.md
+++ b/docs/content/en/commands/hugo_mod_graph.md
@@ -1,12 +1,11 @@
---
-date: 2019-07-31
title: "hugo mod graph"
slug: hugo_mod_graph
url: /commands/hugo_mod_graph/
---
## hugo mod graph
-Print a module dependency graph.
+Print a module dependency graph
### Synopsis
@@ -15,34 +14,39 @@ Note that for vendored modules, that is the version listed and not the one from
```
-hugo mod graph [flags]
+hugo mod graph [flags] [args]
```
### Options
```
- -h, --help help for graph
+ -b, --baseURL string hostname (and path) to the root, e.g. https://spf13.com/
+ --cacheDir string filesystem path to cache directory
+ --clean delete module cache for dependencies that fail verification
+ -c, --contentDir string filesystem path to content directory
+ -h, --help help for graph
+ --renderSegments strings named segments to render (configured in the segments config)
+ -t, --theme strings themes to use (located in /themes/THEMENAME/)
```
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo mod](/commands/hugo_mod/) - Various Hugo Modules helpers.
+* [hugo mod](/commands/hugo_mod/) - Manage modules
-###### Auto generated by spf13/cobra on 31-Jul-2019
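
A brief sketch of the graph command together with the new `--clean` flag shown above:

```
# Print the module dependency graph for the current project:
hugo mod graph

# Also drop cached copies of dependencies that fail verification:
hugo mod graph --clean
```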
diff --git a/docs/content/en/commands/hugo_mod_init.md b/docs/content/en/commands/hugo_mod_init.md
index 53c5412f1..3315e97d6 100644
--- a/docs/content/en/commands/hugo_mod_init.md
+++ b/docs/content/en/commands/hugo_mod_init.md
@@ -1,53 +1,56 @@
---
-date: 2019-07-31
title: "hugo mod init"
slug: hugo_mod_init
url: /commands/hugo_mod_init/
---
## hugo mod init
-Initialize this project as a Hugo Module.
+Initialize this project as a Hugo Module
### Synopsis
Initialize this project as a Hugo Module.
-It will try to guess the module path, but you may help by passing it as an argument, e.g:
+ It will try to guess the module path, but you may help by passing it as an argument, e.g:
- hugo mod init github.com/gohugoio/testshortcodes
-
-Note that Hugo Modules supports multi-module projects, so you can initialize a Hugo Module
-inside a subfolder on GitHub, as one example.
+ hugo mod init github.com/gohugoio/testshortcodes
+ Note that Hugo Modules supports multi-module projects, so you can initialize a Hugo Module
+ inside a subfolder on GitHub, as one example.
+
```
-hugo mod init [flags]
+hugo mod init [flags] [args]
```
### Options
```
- -h, --help help for init
+ -b, --baseURL string hostname (and path) to the root, e.g. https://spf13.com/
+ --cacheDir string filesystem path to cache directory
+ -c, --contentDir string filesystem path to content directory
+ -h, --help help for init
+ --renderSegments strings named segments to render (configured in the segments config)
+ -t, --theme strings themes to use (located in /themes/THEMENAME/)
```
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo mod](/commands/hugo_mod/) - Various Hugo Modules helpers.
+* [hugo mod](/commands/hugo_mod/) - Manage modules
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/commands/hugo_mod_npm.md b/docs/content/en/commands/hugo_mod_npm.md
new file mode 100644
index 000000000..39a559e0f
--- /dev/null
+++ b/docs/content/en/commands/hugo_mod_npm.md
@@ -0,0 +1,45 @@
+---
+title: "hugo mod npm"
+slug: hugo_mod_npm
+url: /commands/hugo_mod_npm/
+---
+## hugo mod npm
+
+Various npm helpers
+
+### Synopsis
+
+Various npm (Node package manager) helpers.
+
+```
+hugo mod npm [command] [flags]
+```
+
+### Options
+
+```
+ -h, --help help for npm
+```
+
+### Options inherited from parent commands
+
+```
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
+```
+
+### SEE ALSO
+
+* [hugo mod](/commands/hugo_mod/) - Manage modules
+* [hugo mod npm pack](/commands/hugo_mod_npm_pack/) - Experimental: Prepares and writes a composite package.json file for your project
+
diff --git a/docs/content/en/commands/hugo_mod_npm_pack.md b/docs/content/en/commands/hugo_mod_npm_pack.md
new file mode 100644
index 000000000..5ece05769
--- /dev/null
+++ b/docs/content/en/commands/hugo_mod_npm_pack.md
@@ -0,0 +1,59 @@
+---
+title: "hugo mod npm pack"
+slug: hugo_mod_npm_pack
+url: /commands/hugo_mod_npm_pack/
+---
+## hugo mod npm pack
+
+Experimental: Prepares and writes a composite package.json file for your project
+
+### Synopsis
+
+Prepares and writes a composite package.json file for your project.
+
+On first run it creates a "package.hugo.json" in the project root if not already there. This file will be used as a template file
+with the base dependency set.
+
+This set will be merged with all "package.hugo.json" files found in the dependency tree, picking the version closest to the project.
+
+This command is marked as 'Experimental'. We think it's a great idea, so it's not likely to be
+removed from Hugo, but we need to test this out in "real life" to get a feel of it,
+so this may/will change in future versions of Hugo.
+
+
+```
+hugo mod npm pack [flags] [args]
+```
+
+### Options
+
+```
+ -b, --baseURL string hostname (and path) to the root, e.g. https://spf13.com/
+ --cacheDir string filesystem path to cache directory
+ -c, --contentDir string filesystem path to content directory
+ -h, --help help for pack
+ --renderSegments strings named segments to render (configured in the segments config)
+ -t, --theme strings themes to use (located in /themes/THEMENAME/)
+```
+
+### Options inherited from parent commands
+
+```
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
+```
+
+### SEE ALSO
+
+* [hugo mod npm](/commands/hugo_mod_npm/) - Various npm helpers
+
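
A hedged sketch of the intended flow, based on the synopsis above; `npm install` is ordinary npm usage, not a Hugo flag:

```
# Merge package.hugo.json files from the project and its modules
# into a composite package.json, then install the result:
hugo mod npm pack
npm install
```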
diff --git a/docs/content/en/commands/hugo_mod_tidy.md b/docs/content/en/commands/hugo_mod_tidy.md
index 68e7ac2d1..c7ae40625 100644
--- a/docs/content/en/commands/hugo_mod_tidy.md
+++ b/docs/content/en/commands/hugo_mod_tidy.md
@@ -1,46 +1,45 @@
---
-date: 2019-07-31
title: "hugo mod tidy"
slug: hugo_mod_tidy
url: /commands/hugo_mod_tidy/
---
## hugo mod tidy
-Remove unused entries in go.mod and go.sum.
-
-### Synopsis
-
-Remove unused entries in go.mod and go.sum.
+Remove unused entries in go.mod and go.sum
```
-hugo mod tidy [flags]
+hugo mod tidy [flags] [args]
```
### Options
```
- -h, --help help for tidy
+ -b, --baseURL string hostname (and path) to the root, e.g. https://spf13.com/
+ --cacheDir string filesystem path to cache directory
+ -c, --contentDir string filesystem path to content directory
+ -h, --help help for tidy
+ --renderSegments strings named segments to render (configured in the segments config)
+ -t, --theme strings themes to use (located in /themes/THEMENAME/)
```
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo mod](/commands/hugo_mod/) - Various Hugo Modules helpers.
+* [hugo mod](/commands/hugo_mod/) - Manage modules
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/commands/hugo_mod_vendor.md b/docs/content/en/commands/hugo_mod_vendor.md
index f3beb1253..dc403affe 100644
--- a/docs/content/en/commands/hugo_mod_vendor.md
+++ b/docs/content/en/commands/hugo_mod_vendor.md
@@ -1,49 +1,51 @@
---
-date: 2019-07-31
title: "hugo mod vendor"
slug: hugo_mod_vendor
url: /commands/hugo_mod_vendor/
---
## hugo mod vendor
-Vendor all module dependencies into the _vendor directory.
+Vendor all module dependencies into the _vendor directory
### Synopsis
Vendor all module dependencies into the _vendor directory.
-
-If a module is vendored, that is where Hugo will look for it's dependencies.
-
+ If a module is vendored, that is where Hugo will look for its dependencies.
+
```
-hugo mod vendor [flags]
+hugo mod vendor [flags] [args]
```
### Options
```
- -h, --help help for vendor
+ -b, --baseURL string hostname (and path) to the root, e.g. https://spf13.com/
+ --cacheDir string filesystem path to cache directory
+ -c, --contentDir string filesystem path to content directory
+ -h, --help help for vendor
+ --renderSegments strings named segments to render (configured in the segments config)
+ -t, --theme strings themes to use (located in /themes/THEMENAME/)
```
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo mod](/commands/hugo_mod/) - Various Hugo Modules helpers.
+* [hugo mod](/commands/hugo_mod/) - Manage modules
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/commands/hugo_mod_verify.md b/docs/content/en/commands/hugo_mod_verify.md
new file mode 100644
index 000000000..2f22a2e49
--- /dev/null
+++ b/docs/content/en/commands/hugo_mod_verify.md
@@ -0,0 +1,50 @@
+---
+title: "hugo mod verify"
+slug: hugo_mod_verify
+url: /commands/hugo_mod_verify/
+---
+## hugo mod verify
+
+Verify dependencies
+
+### Synopsis
+
+Verify checks that the dependencies of the current module, which are stored in a local downloaded source cache, have not been modified since being downloaded.
+
+```
+hugo mod verify [flags] [args]
+```
+
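+For example (a minimal sketch; the `--clean` flag is taken from the options listed below):
+
+```sh
+# Verify the downloaded module dependencies; re-run with --clean to delete
+# any cached module that fails verification so it can be fetched again.
+hugo mod verify
+hugo mod verify --clean
+```
+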
+### Options
+
+```
+ -b, --baseURL string hostname (and path) to the root, e.g. https://spf13.com/
+ --cacheDir string filesystem path to cache directory
+ --clean delete module cache for dependencies that fail verification
+ -c, --contentDir string filesystem path to content directory
+ -h, --help help for verify
+ --renderSegments strings named segments to render (configured in the segments config)
+ -t, --theme strings themes to use (located in /themes/THEMENAME/)
+```
+
+### Options inherited from parent commands
+
+```
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
+```
+
+### SEE ALSO
+
+* [hugo mod](/commands/hugo_mod/) - Manage modules
+
diff --git a/docs/content/en/commands/hugo_new.md b/docs/content/en/commands/hugo_new.md
index 77d5fc853..2788ef168 100644
--- a/docs/content/en/commands/hugo_new.md
+++ b/docs/content/en/commands/hugo_new.md
@@ -1,12 +1,11 @@
---
-date: 2019-07-31
title: "hugo new"
slug: hugo_new
url: /commands/hugo_new/
---
## hugo new
-Create new content for your site
+Create new content
### Synopsis
@@ -19,62 +18,33 @@ If archetypes are provided in your theme or site, they will be used.
Ensure you run this within the root directory of your site.
-```
-hugo new [path] [flags]
-```
-
### Options
```
- -b, --baseURL string hostname (and path) to the root, e.g. http://spf13.com/
- -D, --buildDrafts include content marked as draft
- -E, --buildExpired include expired content
- -F, --buildFuture include content with publishdate in the future
- --cacheDir string filesystem path to cache directory. Defaults: $TMPDIR/hugo_cache/
- --cleanDestinationDir remove files from destination not found in static directories
- -c, --contentDir string filesystem path to content directory
- -d, --destination string filesystem path to write files to
- --disableKinds strings disable different kind of pages (home, RSS etc.)
- --editor string edit new content with this editor, if provided
- --enableGitInfo add Git revision, date and author info to the pages
- --forceSyncStatic copy all files when static is changed.
- --gc enable to run some cleanup tasks (remove unused cache files) after the build
- -h, --help help for new
- --i18n-warnings print missing translations
- --ignoreCache ignores the cache directory
- -k, --kind string content type to create
- -l, --layoutDir string filesystem path to layout directory
- --minify minify any supported output format (HTML, XML etc.)
- --noChmod don't sync permission mode of files
- --noTimes don't sync modification time of files
- --path-warnings print warnings on duplicate target paths etc.
- --templateMetrics display metrics about template executions
- --templateMetricsHints calculate some improvement hints when combined with --templateMetrics
- -t, --theme strings themes to use (located in /themes/THEMENAME/)
- --trace file write trace to file (not useful in general)
+ -h, --help help for new
```
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo](/commands/hugo/) - hugo builds your site
+* [hugo](/commands/hugo/) - Build your site
+* [hugo new content](/commands/hugo_new_content/) - Create new content
* [hugo new site](/commands/hugo_new_site/) - Create a new site (skeleton)
-* [hugo new theme](/commands/hugo_new_theme/) - Create a new theme
+* [hugo new theme](/commands/hugo_new_theme/) - Create a new theme (skeleton)
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/commands/hugo_new_content.md b/docs/content/en/commands/hugo_new_content.md
new file mode 100644
index 000000000..9624e9a61
--- /dev/null
+++ b/docs/content/en/commands/hugo_new_content.md
@@ -0,0 +1,59 @@
+---
+title: "hugo new content"
+slug: hugo_new_content
+url: /commands/hugo_new_content/
+---
+## hugo new content
+
+Create new content
+
+### Synopsis
+
+Create a new content file and automatically set the date and title.
+It will guess which kind of file to create based on the path provided.
+
+You can also specify the kind with `-k KIND`.
+
+If archetypes are provided in your theme or site, they will be used.
+
+Ensure you run this within the root directory of your site.
+
+```
+hugo new content [path] [flags]
+```
+
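+For example (a sketch; the path is illustrative, and the archetype used depends on your project):
+
+```sh
+# Create a new post from the matching archetype; add -k KIND to force a kind.
+hugo new content posts/my-first-post.md
+```
+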
+### Options
+
+```
+ -b, --baseURL string hostname (and path) to the root, e.g. https://spf13.com/
+ --cacheDir string filesystem path to cache directory
+ -c, --contentDir string filesystem path to content directory
+ --editor string edit new content with this editor, if provided
+ -f, --force overwrite file if it already exists
+ -h, --help help for content
+ -k, --kind string content type to create
+ --renderSegments strings named segments to render (configured in the segments config)
+ -t, --theme strings themes to use (located in /themes/THEMENAME/)
+```
+
+### Options inherited from parent commands
+
+```
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
+```
+
+### SEE ALSO
+
+* [hugo new](/commands/hugo_new/) - Create new content
+
diff --git a/docs/content/en/commands/hugo_new_site.md b/docs/content/en/commands/hugo_new_site.md
index 29ab9e8ef..0f0096ae4 100644
--- a/docs/content/en/commands/hugo_new_site.md
+++ b/docs/content/en/commands/hugo_new_site.md
@@ -1,5 +1,4 @@
---
-date: 2019-07-31
title: "hugo new site"
slug: hugo_new_site
url: /commands/hugo_new_site/
@@ -21,30 +20,29 @@ hugo new site [path] [flags]
### Options
```
- --force init inside non-empty directory
- -f, --format string config & frontmatter format (default "toml")
+ -f, --force init inside non-empty directory
+ --format string preferred file format (toml, yaml or json) (default "toml")
-h, --help help for site
```
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo new](/commands/hugo_new/) - Create new content for your site
+* [hugo new](/commands/hugo_new/) - Create new content
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/commands/hugo_new_theme.md b/docs/content/en/commands/hugo_new_theme.md
index 8a97aad97..b1c937bae 100644
--- a/docs/content/en/commands/hugo_new_theme.md
+++ b/docs/content/en/commands/hugo_new_theme.md
@@ -1,19 +1,18 @@
---
-date: 2019-07-31
title: "hugo new theme"
slug: hugo_new_theme
url: /commands/hugo_new_theme/
---
## hugo new theme
-Create a new theme
+Create a new theme (skeleton)
### Synopsis
-Create a new theme (skeleton) called [name] in the current directory.
+Create a new theme (skeleton) called [name] in ./themes.
New theme is a skeleton. Please add content to the touched files. Add your
name to the copyright line in the license and adjust the theme.toml file
-as you see fit.
+according to your needs.
```
hugo new theme [name] [flags]
@@ -28,22 +27,21 @@ hugo new theme [name] [flags]
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo new](/commands/hugo_new/) - Create new content for your site
+* [hugo new](/commands/hugo_new/) - Create new content
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/commands/hugo_server.md b/docs/content/en/commands/hugo_server.md
index f6db5a080..d735f449a 100644
--- a/docs/content/en/commands/hugo_server.md
+++ b/docs/content/en/commands/hugo_server.md
@@ -1,22 +1,20 @@
---
-date: 2019-07-31
title: "hugo server"
slug: hugo_server
url: /commands/hugo_server/
---
## hugo server
-A high performance webserver
+Start the embedded web server
### Synopsis
Hugo provides its own webserver which builds and serves the site.
While hugo server is high performance, it is a webserver with limited options.
-Many run it in production, but the standard behavior is for people to use it
-in development and use a more full featured server such as Nginx or Caddy.
-'hugo server' will avoid writing the rendered and served content to disk,
-preferring to store it in memory.
+The `hugo server` command will by default write and serve files from disk, but
+you can render to memory by using the `--renderToMemory` flag. This can be
+faster in some cases, but it will consume more memory.
By default hugo will also watch your files for any changes you make and
automatically rebuild the site. It will then live reload any open browser pages
@@ -24,70 +22,77 @@ and push the latest content to them. As most Hugo sites are built in a fraction
of a second, you will be able to save and see your changes nearly instantly.
```
-hugo server [flags]
+hugo server [command] [flags]
```
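+
+For example (a sketch; both flags appear in the options below):
+
+```sh
+# Serve the site from memory and jump to the changed page on live reload.
+hugo server --renderToMemory --navigateToChanged
+```
+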
### Options
```
- --appendPort append port to baseURL (default true)
- -b, --baseURL string hostname (and path) to the root, e.g. http://spf13.com/
- --bind string interface to which the server will bind (default "127.0.0.1")
- -D, --buildDrafts include content marked as draft
- -E, --buildExpired include expired content
- -F, --buildFuture include content with publishdate in the future
- --cacheDir string filesystem path to cache directory. Defaults: $TMPDIR/hugo_cache/
- --cleanDestinationDir remove files from destination not found in static directories
- -c, --contentDir string filesystem path to content directory
- -d, --destination string filesystem path to write files to
- --disableBrowserError do not show build errors in the browser
- --disableFastRender enables full re-renders on changes
- --disableKinds strings disable different kind of pages (home, RSS etc.)
- --disableLiveReload watch without enabling live browser reload on rebuild
- --enableGitInfo add Git revision, date and author info to the pages
- --forceSyncStatic copy all files when static is changed.
- --gc enable to run some cleanup tasks (remove unused cache files) after the build
- -h, --help help for server
- --i18n-warnings print missing translations
- --ignoreCache ignores the cache directory
- -l, --layoutDir string filesystem path to layout directory
- --liveReloadPort int port for live reloading (i.e. 443 in HTTPS proxy situations) (default -1)
- --meminterval string interval to poll memory usage (requires --memstats), valid time units are "ns", "us" (or "µs"), "ms", "s", "m", "h". (default "100ms")
- --memstats string log memory usage to this file
- --minify minify any supported output format (HTML, XML etc.)
- --navigateToChanged navigate to changed content file on live browser reload
- --noChmod don't sync permission mode of files
- --noHTTPCache prevent HTTP caching
- --noTimes don't sync modification time of files
- --path-warnings print warnings on duplicate target paths etc.
- -p, --port int port on which the server will listen (default 1313)
- --renderToDisk render to Destination path (default is render to memory & serve from there)
- --templateMetrics display metrics about template executions
- --templateMetricsHints calculate some improvement hints when combined with --templateMetrics
- -t, --theme strings themes to use (located in /themes/THEMENAME/)
- --trace file write trace to file (not useful in general)
- -w, --watch watch filesystem for changes and recreate as needed (default true)
+ --appendPort append port to baseURL (default true)
+ -b, --baseURL string hostname (and path) to the root, e.g. https://spf13.com/
+ --bind string interface to which the server will bind (default "127.0.0.1")
+ -D, --buildDrafts include content marked as draft
+ -E, --buildExpired include expired content
+ -F, --buildFuture include content with publishdate in the future
+ --cacheDir string filesystem path to cache directory
+ --cleanDestinationDir remove files from destination not found in static directories
+ -c, --contentDir string filesystem path to content directory
+ --disableBrowserError do not show build errors in the browser
+ --disableFastRender enables full re-renders on changes
+ --disableKinds strings disable different kind of pages (home, RSS etc.)
+ --disableLiveReload watch without enabling live browser reload on rebuild
+ --enableGitInfo add Git revision, date, author, and CODEOWNERS info to the pages
+ --forceSyncStatic copy all files when static is changed.
+ --gc enable to run some cleanup tasks (remove unused cache files) after the build
+ -h, --help help for server
+ --ignoreCache ignores the cache directory
+ -l, --layoutDir string filesystem path to layout directory
+ --liveReloadPort int port for live reloading (i.e. 443 in HTTPS proxy situations) (default -1)
+ --minify minify any supported output format (HTML, XML etc.)
+ -N, --navigateToChanged navigate to changed content file on live browser reload
+ --noChmod don't sync permission mode of files
+ --noHTTPCache prevent HTTP caching
+ --noTimes don't sync modification time of files
+ -O, --openBrowser open the site in a browser after server startup
+ --panicOnWarning panic on first WARNING log
+ --poll string set this to a poll interval, e.g --poll 700ms, to use a poll based approach to watch for file system changes
+ -p, --port int port on which the server will listen (default 1313)
+ --pprof enable the pprof server (port 8080)
+ --printI18nWarnings print missing translations
+ --printMemoryUsage print memory usage to screen at intervals
+ --printPathWarnings print warnings on duplicate target paths etc.
+ --printUnusedTemplates print warnings on unused templates.
+ --renderSegments strings named segments to render (configured in the segments config)
+ --renderStaticToDisk serve static files from disk and dynamic files from memory
+ --templateMetrics display metrics about template executions
+ --templateMetricsHints calculate some improvement hints when combined with --templateMetrics
+ -t, --theme strings themes to use (located in /themes/THEMENAME/)
+ --tlsAuto generate and use locally-trusted certificates.
+ --tlsCertFile string path to TLS certificate file
+ --tlsKeyFile string path to TLS key file
+ --trace file write trace to file (not useful in general)
+ -w, --watch watch filesystem for changes and recreate as needed (default true)
```
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo](/commands/hugo/) - hugo builds your site
+* [hugo](/commands/hugo/) - Build your site
+* [hugo server trust](/commands/hugo_server_trust/) - Install the local CA in the system trust store
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/commands/hugo_server_trust.md b/docs/content/en/commands/hugo_server_trust.md
new file mode 100644
index 000000000..22ca2491e
--- /dev/null
+++ b/docs/content/en/commands/hugo_server_trust.md
@@ -0,0 +1,41 @@
+---
+title: "hugo server trust"
+slug: hugo_server_trust
+url: /commands/hugo_server_trust/
+---
+## hugo server trust
+
+Install the local CA in the system trust store
+
+```
+hugo server trust [flags] [args]
+```
+
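+For example (a sketch that pairs this command with the server's `--tlsAuto` flag, listed on the [hugo server](/commands/hugo_server/) page):
+
+```sh
+# Install the local CA in the system trust store, then serve the site
+# over HTTPS with locally-trusted certificates.
+hugo server trust
+hugo server --tlsAuto
+```
+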
+### Options
+
+```
+ -h, --help help for trust
+ --uninstall Uninstall the local CA (but do not delete it).
+```
+
+### Options inherited from parent commands
+
+```
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
+```
+
+### SEE ALSO
+
+* [hugo server](/commands/hugo_server/) - Start the embedded web server
+
diff --git a/docs/content/en/commands/hugo_version.md b/docs/content/en/commands/hugo_version.md
index c0bc9359b..14cc92a00 100644
--- a/docs/content/en/commands/hugo_version.md
+++ b/docs/content/en/commands/hugo_version.md
@@ -1,19 +1,18 @@
---
-date: 2019-07-31
title: "hugo version"
slug: hugo_version
url: /commands/hugo_version/
---
## hugo version
-Print the version number of Hugo
+Display version
### Synopsis
-All software has versions. This is Hugo's.
+Display version and environment info. This is useful in Hugo bug reports.
```
-hugo version [flags]
+hugo version [flags] [args]
```
### Options
@@ -25,22 +24,21 @@ hugo version [flags]
### Options inherited from parent commands
```
- --config string config file (default is path/config.yaml|json|toml)
- --configDir string config dir (default "config")
- --debug debug output
- -e, --environment string build environment
- --ignoreVendor ignores any _vendor directory
- --log enable Logging
- --logFile string log File path (if set, logging enabled automatically)
- --quiet build in quiet mode
- -s, --source string filesystem path to read files relative from
- --themesDir string filesystem path to themes directory
- -v, --verbose verbose output
- --verboseLog verbose logging
+ --clock string set the clock used by Hugo, e.g. --clock 2021-11-06T22:30:00.00+09:00
+ --config string config file (default is hugo.yaml|json|toml)
+ --configDir string config dir (default "config")
+ -d, --destination string filesystem path to write files to
+ -e, --environment string build environment
+ --ignoreVendorPaths string ignores any _vendor for module paths matching the given Glob pattern
+ --logLevel string log level (debug|info|warn|error)
+ --noBuildLock don't create .hugo_build.lock file
+ --quiet build in quiet mode
+ -M, --renderToMemory render to memory (mostly useful when running the server)
+ -s, --source string filesystem path to read files relative from
+ --themesDir string filesystem path to themes directory
```
### SEE ALSO
-* [hugo](/commands/hugo/) - hugo builds your site
+* [hugo](/commands/hugo/) - Build your site
-###### Auto generated by spf13/cobra on 31-Jul-2019
diff --git a/docs/content/en/configuration/_index.md b/docs/content/en/configuration/_index.md
new file mode 100644
index 000000000..7cb08cc73
--- /dev/null
+++ b/docs/content/en/configuration/_index.md
@@ -0,0 +1,7 @@
+---
+title: Configuration
+description: Configure your site.
+categories: []
+keywords: []
+weight: 10
+---
diff --git a/docs/content/en/configuration/all.md b/docs/content/en/configuration/all.md
new file mode 100644
index 000000000..9bc05057f
--- /dev/null
+++ b/docs/content/en/configuration/all.md
@@ -0,0 +1,362 @@
+---
+title: All settings
+description: The complete list of Hugo configuration settings.
+categories: []
+keywords: []
+weight: 20
+aliases: [/getting-started/configuration/]
+---
+
+## Settings
+
+archetypeDir
+: (`string`) The designated directory for [archetypes](g). Default is `archetypes`. {{% module-mounts-note %}}
+
+assetDir
+: (`string`) The designated directory for [global resources](g). Default is `assets`. {{% module-mounts-note %}}
+
+baseURL
+: (`string`) The absolute URL of your published site including the protocol, host, path, and a trailing slash.
+
+build
+: See [configure build](/configuration/build/).
+
+buildDrafts
+: (`bool`) Whether to include draft content when building a site. Default is `false`.
+
+buildExpired
+: (`bool`) Whether to include expired content when building a site. Default is `false`.
+
+buildFuture
+: (`bool`) Whether to include future content when building a site. Default is `false`.
+
+cacheDir
+: (`string`) The designated cache directory. See [details](#cache-directory).
+
+caches
+: See [configure file caches](/configuration/caches/).
+
+canonifyURLs
+: (`bool`) See [details](/content-management/urls/#canonical-urls) before enabling this feature. Default is `false`.
+
+capitalizeListTitles
+: {{< new-in 0.123.3 />}}
+: (`bool`) Whether to capitalize automatic list titles. Applicable to section, taxonomy, and term pages. Default is `true`. Use the [`titleCaseStyle`](#titlecasestyle) setting to configure capitalization rules.
+
+cascade
+: See [configure cascade](/configuration/cascade/).
+
+cleanDestinationDir
+: (`bool`) Whether to remove files from the site's destination directory that do not have corresponding files in the `static` directory during the build. Default is `false`.
+
+contentDir
+: (`string`) The designated directory for content files. Default is `content`. {{% module-mounts-note %}}
+
+copyright
+: (`string`) The copyright notice for a site, typically displayed in the footer.
+
+dataDir
+: (`string`) The designated directory for data files. Default is `data`. {{% module-mounts-note %}}
+
+defaultContentLanguage
+: (`string`) The project's default language key, conforming to the syntax described in [RFC 5646]. This value must match one of the defined language keys. Default is `en`.
+
+defaultContentLanguageInSubdir
+: (`bool`) Whether to publish the default language site to a subdirectory matching the `defaultContentLanguage`. Default is `false`.
+
+defaultOutputFormat
+: (`string`) The default output format for the site. If unspecified, the first available format in the defined order (by weight, then alphabetically) will be used.
+
+deployment
+: See [configure deployment](/configuration/deployment/).
+
+disableAliases
+: (`bool`) Whether to disable generation of alias redirects. Even if this option is enabled, the defined aliases will still be present on the page. This allows you to manage redirects separately, for example, by generating 301 redirects in an `.htaccess` file or a Netlify `_redirects` file using a custom output format. Default is `false`.
+
+disableDefaultLanguageRedirect
+: {{< new-in 0.140.0 />}}
+: (`bool`) Whether to disable generation of the alias redirect to the default language when `DefaultContentLanguageInSubdir` is `true`. Default is `false`.
+
+disableHugoGeneratorInject
+: (`bool`) Whether to disable injection of a `<meta name="generator">` tag into the home page. Default is `false`.
+
+disableKinds
+: (`[]string`) A slice of page [kinds](g) to disable during the build process, any of `404`, `home`, `page`, `robotstxt`, `rss`, `section`, `sitemap`, `taxonomy`, or `term`.
+
+disableLanguages
+: (`[]string`) A slice of language keys representing the languages to disable during the build process. Although this is functional, consider using the [`disabled`] key under each language instead.
+
+disableLiveReload
+: (`bool`) Whether to disable automatic live reloading of the browser window. Default is `false`.
+
+disablePathToLower
+: (`bool`) Whether to disable transformation of page URLs to lower case.
+
+enableEmoji
+: (`bool`) Whether to allow emoji in Markdown. Default is `false`.
+
+enableGitInfo
+: (`bool`) For sites under Git version control, whether to enable the [`GitInfo`] object for each page. With the [default front matter configuration], the `Lastmod` method on a `Page` object will return the Git author date. Default is `false`.
+
+enableMissingTranslationPlaceholders
+: (`bool`) Whether to show a placeholder instead of the default value or an empty string if a translation is missing. Default is `false`.
+
+enableRobotsTXT
+: (`bool`) Whether to enable generation of a `robots.txt` file. Default is `false`.
+
+environment
+: (`string`) The build environment. Default is `production` when running `hugo` and `development` when running `hugo server`.
+
+frontmatter
+: See [configure front matter](/configuration/front-matter/).
+
+hasCJKLanguage
+: (`bool`) Whether to automatically detect [CJK](g) languages in content. Affects the values returned by the [`WordCount`] and [`FuzzyWordCount`] methods. Default is `false`.
+
+HTTPCache
+: See [configure HTTP cache](/configuration/http-cache/).
+
+i18nDir
+: (`string`) The designated directory for translation tables. Default is `i18n`. {{% module-mounts-note %}}
+
+ignoreCache
+: (`bool`) Whether to ignore the cache directory. Default is `false`.
+
+ignoreFiles
+: (`[]string`) A slice of [regular expressions](g) used to exclude specific files from a build. These expressions are matched against the absolute file path and apply to files within the `content`, `data`, and `i18n` directories. For more advanced file exclusion options, see the section on [module mounts].
+
+ignoreLogs
+: (`[]string`) A slice of message identifiers corresponding to warnings and errors you wish to suppress. See [`erroridf`] and [`warnidf`].
+
+ignoreVendorPaths
+: (`string`) A [glob](g) pattern matching the module paths to exclude from the `_vendor` directory.
+
+imaging
+: See [configure imaging](/configuration/imaging/).
+
+languageCode
+: (`string`) The site's language tag, conforming to the syntax described in [RFC 5646]. This value does not affect translations or localization. Hugo uses this value to populate:
+
+ - The `language` element in the [embedded RSS template]
+ - The `lang` attribute of the `html` element in the [embedded alias template]
+ - The `og:locale` `meta` element in the [embedded Open Graph template]
+
+  When present in the root of the configuration, this value is ignored if one or more language keys exist. Please specify this value independently for each language key.
+
+languages
+: See [configure languages](/configuration/languages/).
+
+layoutDir
+: (`string`) The designated directory for templates. Default is `layouts`. {{% module-mounts-note %}}
+
+mainSections
+: (`string` or `[]string`) The main sections of a site. If set, the [`MainSections`] method on the `Site` object returns the given sections, otherwise it returns the section with the most pages.
+
+markup
+: See [configure markup](/configuration/markup/).
+
+mediaTypes
+: See [configure media types](/configuration/media-types/).
+
+menus
+: See [configure menus](/configuration/menus/).
+
+minify
+: See [configure minify](/configuration/minify/).
+
+module
+: See [configure modules](/configuration/module/).
+
+newContentEditor
+: (`string`) The editor to use when creating new content.
+
+noBuildLock
+: (`bool`) Whether to disable creation of the `.hugo_build.lock` file. Default is `false`.
+
+noChmod
+: (`bool`) Whether to disable synchronization of file permission modes. Default is `false`.
+
+noTimes
+: (`bool`) Whether to disable synchronization of file modification times. Default is `false`.
+
+outputFormats
+: See [configure output formats](/configuration/output-formats/).
+
+outputs
+: See [configure outputs](/configuration/outputs/).
+
+page
+: See [configure page](/configuration/page/).
+
+pagination
+: See [configure pagination](/configuration/pagination/).
+
+panicOnWarning
+: (`bool`) Whether to panic on the first WARNING. Default is `false`.
+
+params
+: See [configure params](/configuration/params/).
+
+permalinks
+: See [configure permalinks](/configuration/permalinks/).
+
+pluralizeListTitles
+: (`bool`) Whether to pluralize automatic list titles. Applicable to section pages. Default is `true`.
+
+printI18nWarnings
+: (`bool`) Whether to log WARNINGs for each missing translation. Default is `false`.
+
+printPathWarnings
+: (`bool`) Whether to log WARNINGs when Hugo publishes two or more files to the same path. Default is `false`.
+
+printUnusedTemplates
+: (`bool`) Whether to log WARNINGs for each unused template. Default is `false`.
+
+privacy
+: See [configure privacy](/configuration/privacy/).
+
+publishDir
+: (`string`) The designated directory for publishing the site. Default is `public`.
+
+refLinksErrorLevel
+: (`string`) The logging error level to use when the `ref` and `relref` functions, methods, and shortcodes are unable to resolve a reference to a page. Either `ERROR` or `WARNING`. Any `ERROR` will fail the build. Default is `ERROR`.
+
+refLinksNotFoundURL
+: (`string`) The URL to return when the `ref` and `relref` functions, methods, and shortcodes are unable to resolve a reference to a page.
+
+related
+: See [configure related content](/configuration/related-content/).
+
+relativeURLs
+: (`bool`) See [details](/content-management/urls/#relative-urls) before enabling this feature. Default is `false`.
+
+removePathAccents
+: (`bool`) Whether to remove [non-spacing marks](https://www.compart.com/en/unicode/category/Mn) from [composite characters](https://en.wikipedia.org/wiki/Precomposed_character) in content paths. Default is `false`.
+
+renderSegments
+: {{< new-in 0.124.0 />}}
+: (`[]string`) A slice of [segments](g) to render. If omitted, all segments are rendered. This option is typically set via a command-line flag, such as `hugo --renderSegments segment1,segment2`. The provided segment names must correspond to those defined in the [`segments`] configuration.
+
+resourceDir
+: (`string`) The designated directory for caching output from [asset pipelines](g). Default is `resources`.
+
+security
+: See [configure security](/configuration/security/).
+
+sectionPagesMenu
+: (`string`) When set, each top-level section will be added to the menu identified by the provided value. See [details](/content-management/menus/#define-automatically).
+
+segments
+: See [configure segments](/configuration/segments/).
+
+server
+: See [configure server](/configuration/server/).
+
+services
+: See [configure services](/configuration/services/).
+
+sitemap
+: See [configure sitemap](/configuration/sitemap/).
+
+staticDir
+: (`string`) The designated directory for static files. Default is `static`. {{% module-mounts-note %}}
+
+summaryLength
+: (`int`) Applicable to [automatic summaries], the minimum number of words returned by the [`Summary`] method on a `Page` object. The `Summary` method will return content truncated at the paragraph boundary closest to the specified `summaryLength`, but at least this minimum number of words.
+
+taxonomies
+: See [configure taxonomies](/configuration/taxonomies/).
+
+templateMetrics
+: (`bool`) Whether to print template execution metrics to the console. Default is `false`. See [details](/troubleshooting/performance/#template-metrics).
+
+templateMetricsHints
+: (`bool`) Whether to print template execution improvement hints to the console. Applicable when `templateMetrics` is `true`. Default is `false`. See [details](/troubleshooting/performance/#template-metrics).
+
+theme
+: (`string` or `[]string`) The [theme](g) to use. Multiple themes can be listed, with precedence given from left to right. See [details](/hugo-modules/theme-components/).
+
+themesDir
+: (`string`) The designated directory for themes. Default is `themes`.
+
+timeout
+: (`string`) The timeout for generating page content, either as a [duration] or in seconds. This timeout is used to prevent infinite recursion during content generation. You may need to increase this value if your pages take a long time to generate, for example, due to extensive image processing or reliance on remote content. Default is `30s`.
+
+timeZone
+: (`string`) The time zone used to parse dates without time zone offsets, including front matter date fields and values passed to the [`time.AsTime`] and [`time.Format`] template functions. The list of valid values may be system dependent, but should include `UTC`, `Local`, and any location in the [IANA Time Zone Database]. For example, `America/Los_Angeles` and `Europe/Oslo` are valid time zones.
+
+title
+: (`string`) The site title.
+
+titleCaseStyle
+: (`string`) The capitalization rules to follow when Hugo automatically generates a section title, or when using the [`strings.Title`] function. One of `ap`, `chicago`, `go`, `firstupper`, or `none`. Default is `ap`. See [details](#title-case-style).
+
+uglyurls
+: See [configure ugly URLs](/configuration/ugly-urls/).
+
+## Cache directory
+
+Hugo's file cache directory is configurable via the [`cacheDir`] configuration option or the `HUGO_CACHEDIR` environment variable. If neither is set, Hugo will use, in order of preference:
+
+1. If running on Netlify: `/opt/build/cache/hugo_cache/`. This means that if you run your builds on Netlify, all caches configured with `:cacheDir` will be saved and restored on the next build. For other [CI/CD](g) vendors, please read their documentation. For a CircleCI example, see [this configuration].
+1. In a `hugo_cache` directory below the OS user cache directory as defined by Go's [os.UserCacheDir] function. On Unix systems, per the [XDG base directory specification], this is `$XDG_CACHE_HOME` if non-empty, else `$HOME/.cache`. On macOS, this is `$HOME/Library/Caches`. On Windows, this is `%LocalAppData%`. On Plan 9, this is `$home/lib/cache`.
+1. In a `hugo_cache_$USER` directory below the OS temp dir.
+
+To determine the current `cacheDir`:
+
+```sh
+hugo config | grep cachedir
+```
+
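+To override the cache location for a single build, you can set the environment variable mentioned above (the path below is illustrative):
+
+```sh
+# Build the site using a custom cache directory.
+HUGO_CACHEDIR=/tmp/hugo_cache hugo
+```
+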
+## Title case style
+
+Hugo's [`titleCaseStyle`] setting governs capitalization for automatically generated section titles and the [`strings.Title`] function. By default, it follows the capitalization rules published in the Associated Press Stylebook. Change this setting to use other capitalization rules.
+
+ap
+: Use the capitalization rules published in the [Associated Press Stylebook]. This is the default.
+
+chicago
+: Use the capitalization rules published in the [Chicago Manual of Style].
+
+go
+: Capitalize the first letter of every word.
+
+firstupper
+: Capitalize the first letter of the first word.
+
+none
+: Disable transformation of automatic section titles, and disable the transformation performed by the `strings.Title` function. This is useful if you would prefer to manually capitalize section titles as needed, and to bypass opinionated theme usage of the `strings.Title` function.
+
+## Localized settings
+
+Some configuration settings, such as menus and custom parameters, can be defined separately for each language. See [configure languages](/configuration/languages/#localized-settings).
+
+[`cacheDir`]: #cachedir
+[`disabled`]: /configuration/languages/#disabled
+[`erroridf`]: /functions/fmt/erroridf/
+[`FuzzyWordCount`]: /methods/page/fuzzywordcount/
+[`GitInfo`]: /methods/page/gitinfo/
+[`MainSections`]: /methods/site/mainsections/
+[`segments`]: /configuration/segments/
+[`strings.Title`]: /functions/strings/title/
+[`Summary`]: /methods/page/summary/
+[`time.AsTime`]: /functions/time/astime/
+[`time.Format`]: /functions/time/format/
+[`titleCaseStyle`]: #titlecasestyle
+[`warnidf`]: /functions/fmt/warnidf/
+[`WordCount`]: /methods/page/wordcount/
+[Associated Press Stylebook]: https://www.apstylebook.com/
+[automatic summaries]: /content-management/summaries/#automatic-summary
+[Chicago Manual of Style]: https://www.chicagomanualofstyle.org/home.html
+[default front matter configuration]: /configuration/front-matter/
+[duration]: https://pkg.go.dev/time#Duration
+[embedded alias template]: {{% eturl alias %}}
+[embedded Open Graph template]: {{% eturl opengraph %}}
+[embedded RSS template]: {{% eturl rss %}}
+[IANA Time Zone Database]: https://en.wikipedia.org/wiki/List_of_tz_database_time_zones
+[module mounts]: /configuration/module/#mounts
+[os.UserCacheDir]: https://pkg.go.dev/os#UserCacheDir
+[RFC 5646]: https://datatracker.ietf.org/doc/html/rfc5646#section-2.1
+[this configuration]: https://github.com/bep/hugo-sass-test/blob/6c3960a8f4b90e8938228688bc49bdcdd6b2d99e/.circleci/config.yml
+[XDG base directory specification]: https://specifications.freedesktop.org/basedir-spec/latest/
diff --git a/docs/content/en/configuration/build.md b/docs/content/en/configuration/build.md
new file mode 100644
index 000000000..116294f05
--- /dev/null
+++ b/docs/content/en/configuration/build.md
@@ -0,0 +1,81 @@
+---
+title: Configure build
+linkTitle: Build
+description: Configure global build options.
+categories: []
+keywords: []
+aliases: [/getting-started/configuration-build/]
+---
+
+The `build` configuration section contains global build-related configuration options.
+
+{{< code-toggle config=build />}}
+
+buildStats
+: See the [build stats](#build-stats) section below.
+
+cachebusters
+: See the [cache busters](#cache-busters) section below.
+
+noJSConfigInAssets
+: (`bool`) Whether to disable writing a `jsconfig.json` in your `assets` directory with mapping of imports from running [js.Build](/hugo-pipes/js). This file is intended to help with intellisense/navigation inside code editors such as [VS Code](https://code.visualstudio.com/). Note that if you do not use `js.Build`, no file will be written.
+
+useResourceCacheWhen
+: (`string`) When to use the resource file cache, one of `never`, `fallback`, or `always`. Applicable when transpiling Sass to CSS. Default is `fallback`.
+
+## Cache busters
+
+The `build.cachebusters` configuration option was added to support development using Tailwind 3.x's JIT compiler where a `build` configuration may look like this:
+
+{{< code-toggle file=hugo >}}
+[build]
+ [build.buildStats]
+ enable = true
+ [[build.cachebusters]]
+ source = "assets/watching/hugo_stats\\.json"
+ target = "styles\\.css"
+ [[build.cachebusters]]
+ source = "(postcss|tailwind)\\.config\\.js"
+ target = "css"
+ [[build.cachebusters]]
+ source = "assets/.*\\.(js|ts|jsx|tsx)"
+ target = "js"
+ [[build.cachebusters]]
+ source = "assets/.*\\.(.*)$"
+ target = "$1"
+{{< /code-toggle >}}
+
+When `buildStats` is enabled, Hugo writes a `hugo_stats.json` file on each build with the HTML classes etc. that are used in the rendered output. Changes to this file will trigger a rebuild of the `styles.css` file. You also need to add `hugo_stats.json` to Hugo's server watcher. See [Hugo Starter Tailwind Basic](https://github.com/bep/hugo-starter-tailwind-basic) for a running example.
+
+source
+: (`string`) A [regular expression](g) matching file(s) relative to one of the virtual component directories in Hugo, typically `assets/...`.
+
+target
+: (`string`) A [regular expression](g) matching the keys in the resource cache that should be expired when `source` changes. You can use the matching regexp groups from `source` in the expression, e.g. `$1`.
+
+## Build stats
+
+{{< code-toggle config=build.buildStats />}}
+
+enable
+: (`bool`) Whether to create a `hugo_stats.json` file in the root of your project. This file contains arrays of the `class` attributes, `id` attributes, and tags of every HTML element within your published site. Use this file as a data source when [removing unused CSS] from your site. This process is also known as pruning, purging, or tree shaking. Default is `false`.
+
+[removing unused CSS]: /functions/resources/postprocess/
+
+disableIDs
+: (`bool`) Whether to exclude `id` attributes. Default is `false`.
+
+disableTags
+: (`bool`) Whether to exclude element tags. Default is `false`.
+
+disableClasses
+: (`bool`) Whether to exclude `class` attributes. Default is `false`.
+
+> [!note]
+> Given that CSS purging is typically limited to production builds, place the `buildStats` object below [`config/production`].
+>
+> Because this feature is built for speed, there may be "false positive" detections (e.g., HTML elements that are not HTML elements) while parsing the published site. These "false positives" are infrequent and inconsequential.
+
+Due to the nature of partial server builds, new values are added to the file while the server is running, but stale values will not be removed until you restart the server or run a regular `hugo` build.
+
+[`config/production`]: /configuration/introduction/#configuration-directory
diff --git a/docs/content/en/configuration/caches.md b/docs/content/en/configuration/caches.md
new file mode 100644
index 000000000..03b499dcb
--- /dev/null
+++ b/docs/content/en/configuration/caches.md
@@ -0,0 +1,30 @@
+---
+title: Configure file caches
+linkTitle: Caches
+description: Configure file caches.
+categories: []
+keywords: []
+---
+
+This is the default configuration:
+
+{{< code-toggle config=caches />}}
+
+## Keys
+
+dir
+: (`string`) The absolute file system path where the cached files will be stored. You can begin the path with the `:cacheDir` or `:resourceDir` token. These tokens will be replaced with the actual configured cache directory and resource directory paths, respectively.
+
+maxAge
+: (`string`) The [duration](g) a cached entry remains valid before being evicted. A value of `0` disables the cache. A value of `-1` means the cache entry never expires (the default).
+
+## Tokens
+
+`:cacheDir`
+: (`string`) The designated cache directory. See [details](/configuration/all/#cachedir).
+
+`:project`
+: (`string`) The base directory name of the current Hugo project. By default, this ensures each project has isolated file caches, so running `hugo --gc` will only affect the current project's cache and not those of other Hugo projects on the same machine.
+
+`:resourceDir`
+: (`string`) The designated directory for caching output from [asset pipelines](g). See [details](/configuration/all/#resourcedir).
diff --git a/docs/content/en/configuration/cascade.md b/docs/content/en/configuration/cascade.md
new file mode 100644
index 000000000..d91996301
--- /dev/null
+++ b/docs/content/en/configuration/cascade.md
@@ -0,0 +1,77 @@
+---
+title: Configure cascade
+linkTitle: Cascade
+description: Configure cascade.
+categories: []
+keywords: []
+---
+
+You can configure your site to cascade front matter values to the home page and any of its descendants. However, this cascading will be prevented if the descendant already defines the field, or if a closer ancestor [node](g) has already cascaded a value for the same field through its front matter's `cascade` key.
+
+> [!note]
+> You can also configure cascading behavior within a page's front matter. See [details].
+
+For example, to cascade a "color" parameter to the home page and all its descendants:
+
+{{< code-toggle file=hugo >}}
+title = 'Home'
+[cascade.params]
+color = 'red'
+{{< /code-toggle >}}
+
+## Target
+
+
+The `target`[^1] keyword allows you to target specific pages or [environments](g). For example, to cascade a "color" parameter to pages within the "articles" section, including the "articles" section page itself:
+
+[^1]: The `_target` alias for `target` is deprecated and will be removed in a future release.
+
+{{< code-toggle file=hugo >}}
+[cascade.params]
+color = 'red'
+[cascade.target]
+path = '{/articles,/articles/**}'
+{{< /code-toggle >}}
+
+Use any combination of these keywords to target pages and/or environments:
+
+environment
+: (`string`) A [glob](g) pattern matching the build [environment](g). For example: `{staging,production}`.
+
+kind
+: (`string`) A [glob](g) pattern matching the [page kind](g). For example: `{taxonomy,term}`.
+
+lang
+: (`string`) A [glob](g) pattern matching the [page language]. For example: `{en,de}`.
+
+path
+: (`string`) A [glob](g) pattern matching the page's [logical path](g). For example: `{/books,/books/**}`.
+
+## Array
+
+Define an array of cascade parameters to apply different values to different targets. For example:
+
+{{< code-toggle file=hugo >}}
+[[cascade]]
+[cascade.params]
+color = 'red'
+[cascade.target]
+path = '{/books/**}'
+kind = 'page'
+lang = '{en,de}'
+[[cascade]]
+[cascade.params]
+color = 'blue'
+[cascade.target]
+path = '{/films/**}'
+kind = 'page'
+environment = 'production'
+{{< /code-toggle >}}
+
+[details]: /content-management/front-matter/#cascade-1
+[page language]: /methods/page/language/
diff --git a/docs/content/en/configuration/content-types.md b/docs/content/en/configuration/content-types.md
new file mode 100644
index 000000000..4c5b5a23b
--- /dev/null
+++ b/docs/content/en/configuration/content-types.md
@@ -0,0 +1,63 @@
+---
+title: Configure content types
+linkTitle: Content types
+description: Configure content types.
+categories: []
+keywords: []
+---
+
+{{< new-in 0.144.0 />}}
+
+Hugo supports six [content formats](g):
+
+{{% include "/_common/content-format-table.md" %}}
+
+These can be used as either page content or [page resources](g). When used as page resources, their [resource type](g) is `page`.
+
+Consider this example of a [page bundle](g):
+
+```text
+content/
+└── example/
+ ├── index.md <-- content
+ ├── a.adoc <-- resource (resource type: page)
+ ├── b.html <-- resource (resource type: page)
+ ├── c.md <-- resource (resource type: page)
+ ├── d.org <-- resource (resource type: page)
+ ├── e.pdc <-- resource (resource type: page)
+ ├── f.rst <-- resource (resource type: page)
+ ├── g.jpg <-- resource (resource type: image)
+ └── h.png <-- resource (resource type: image)
+```
+
+The `index.md` file is the page's content, while the other files are page resources. Files `a` through `f` are of resource type `page`, while `g` and `h` are of resource type `image`.
+
+When you build a site, Hugo does not publish page resources having a resource type of `page`. For example, this is the result of building the site above:
+
+```text
+public/
+├── example/
+│ ├── g.jpg
+│ ├── h.png
+│ └── index.html
+└── index.html
+```
+
+The default behavior is appropriate in most cases. Given that page resources containing markup are typically intended for inclusion in the main content, publishing them independently is generally undesirable.
+
+The default behavior is determined by the `contentTypes` configuration:
+
+{{< code-toggle config=contentTypes />}}
+
+In this default configuration, page resources with those media types will have a resource type of `page`, and will not be automatically published. To change the resource type assignment from `page` to `text` for a given media type, remove the corresponding entry from the list.
+
+For example, to set the resource type of `text/html` files to `text`, thereby enabling automatic publication, remove the `text/html` entry:
+
+{{< code-toggle file=hugo >}}
+contentTypes:
+ text/asciidoc: {}
+ text/markdown: {}
+ text/org: {}
+ text/pandoc: {}
+ text/rst: {}
+{{< /code-toggle >}}
diff --git a/docs/content/en/configuration/deployment.md b/docs/content/en/configuration/deployment.md
new file mode 100644
index 000000000..fad50da63
--- /dev/null
+++ b/docs/content/en/configuration/deployment.md
@@ -0,0 +1,159 @@
+---
+title: Configure deployment
+linkTitle: Deployment
+description: Configure deployments to Amazon S3, Azure Blob Storage, or Google Cloud Storage.
+categories: []
+keywords: []
+---
+
+> [!note]
+> This configuration is only relevant when running `hugo deploy`. See [details](/host-and-deploy/deploy-with-hugo-deploy/).
+
+## Top-level options
+
+These settings control the overall behavior of the deployment process. This is the default configuration:
+
+{{< code-toggle file=hugo config=deployment />}}
+
+confirm
+: (`bool`) Whether to prompt for confirmation before deploying. Default is `false`.
+
+dryRun
+: (`bool`) Whether to simulate the deployment without any remote changes. Default is `false`.
+
+force
+: (`bool`) Whether to re-upload all files. Default is `false`.
+
+invalidateCDN
+: (`bool`) Whether to invalidate the CDN cache listed in the deployment target. Default is `true`.
+
+maxDeletes
+: (`int`) The maximum number of files to delete, or `-1` to disable. Default is `256`.
+
+matchers
+: (`[]*Matcher`) A slice of [matchers](#matchers-1).
+
+order
+: (`[]string`) An ordered slice of [regular expressions](g) that determines upload priority (left to right). Files not matching any expression are uploaded last in an arbitrary order.
+
+target
+: (`string`) The target deployment [`name`](#name). Defaults to the first target.
+
+targets
+: (`[]*Target`) A slice of [targets](#targets-1).
+
+workers
+: (`int`) The number of concurrent workers to use when uploading files. Default is `10`.
+
+## Targets
+
+A target represents a deployment target such as "staging" or "production".
+
+cloudFrontDistributionID
+: (`string`) The CloudFront Distribution ID, applicable if you are using the Amazon Web Services CloudFront CDN. Hugo will invalidate the CDN when deploying this target.
+
+exclude
+: (`string`) A [glob](g) pattern matching files to exclude when deploying to this target. Local files failing the include/exclude filters are not uploaded, and remote files failing these filters are not deleted.
+
+googleCloudCDNOrigin
+: (`string`) The Google Cloud project and CDN origin to invalidate when deploying this target, specified as `<project>/<origin>`.
+
+include
+: (`string`) A [glob](g) pattern matching files to include when deploying to this target. Local files failing the include/exclude filters are not uploaded, and remote files failing these filters are not deleted.
+
+name
+: (`string`) An arbitrary name for this target.
+
+stripIndexHTML
+: (`bool`) Whether to map files named `<dir>/index.html` to `<dir>` on the remote (except for the root `index.html`). This is useful for key-value cloud storage (e.g., Amazon S3, Google Cloud Storage, Azure Blob Storage) to align canonical URLs with object keys. Default is `false`.
+
+url
+: (`string`) The [destination URL](#destination-urls) for deployment.
+
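+For example, a minimal target definition for a hypothetical Google Cloud Storage bucket, with `index.html` stripping enabled, might look like this (the bucket name is illustrative):
+
+{{< code-toggle file=hugo >}}
+[[deployment.targets]]
+name = 'production'
+url = 'gs://my-bucket'
+stripIndexHTML = true
+{{< /code-toggle >}}
+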
+## Matchers
+
+A Matcher represents a configuration to be applied to files whose paths match
+the specified pattern.
+
+cacheControl
+: (`string`) The caching attributes to use when serving the blob. See [details][cacheControl].
+
+contentEncoding
+: (`string`) The encoding used for the blob's content, if any. See [details][contentEncoding].
+
+contentType
+: (`string`) The media type of the blob being written. See [details][contentType].
+
+force
+: (`bool`) Whether matching files should be re-uploaded. Useful when other route-determined metadata (e.g., `contentType`) has changed. Default is `false`.
+
+gzip
+: (`bool`) Whether the file should be gzipped before upload. If so, the `ContentEncoding` field will automatically be set to `gzip`. Default is `false`.
+
+pattern
+: (`string`) A [regular expression](g) used to match paths. Paths are converted to use forward slashes (`/`) before matching.
+
+[cacheControl]: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Cache-Control
+[contentEncoding]: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Encoding
+[contentType]: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Type
+
+## Destination URLs
+
+Service|URL example
+:--|:--
+Amazon Simple Storage Service (S3)|`s3://my-bucket?region=us-west-1`
+Azure Blob Storage|`azblob://my-container`
+Google Cloud Storage (GCS)|`gs://my-bucket`
+
+With Google Cloud Storage you can target a subdirectory:
+
+```text
+gs://my-bucket?prefix=a/subdirectory
+```
+
+You can also deploy to storage servers compatible with Amazon S3, such as:
+
+- [Ceph]
+- [MinIO]
+- [SeaweedFS]
+
+[Ceph]: https://ceph.com/
+[MinIO]: https://www.minio.io/
+[SeaweedFS]: https://github.com/chrislusf/seaweedfs
+
+For example, the `url` for a MinIO deployment target might resemble this:
+
+```text
+s3://my-bucket?endpoint=https://my.minio.instance&awssdk=v2&use_path_style=true&disable_https=false
+```
+
+## Example
+
+{{< code-toggle file=hugo >}}
+[deployment]
+ order = ['.jpg$', '.gif$']
+ [[deployment.matchers]]
+ cacheControl = 'max-age=31536000, no-transform, public'
+ gzip = true
+ pattern = '^.+\.(js|css|svg|ttf)$'
+ [[deployment.matchers]]
+ cacheControl = 'max-age=31536000, no-transform, public'
+ gzip = false
+ pattern = '^.+\.(png|jpg)$'
+ [[deployment.matchers]]
+ contentType = 'application/xml'
+ gzip = true
+ pattern = '^sitemap\.xml$'
+ [[deployment.matchers]]
+ gzip = true
+ pattern = '^.+\.(html|xml|json)$'
+ [[deployment.targets]]
+ url = 's3://my_production_bucket?region=us-west-1'
+ cloudFrontDistributionID = 'E1234567890ABCDEF0'
+ exclude = '**.{heic,psd}'
+ name = 'production'
+ [[deployment.targets]]
+ url = 's3://my_staging_bucket?region=us-west-1'
+ exclude = '**.{heic,psd}'
+ name = 'staging'
+{{< /code-toggle >}}
diff --git a/docs/content/en/configuration/front-matter.md b/docs/content/en/configuration/front-matter.md
new file mode 100644
index 000000000..9f51b8a5a
--- /dev/null
+++ b/docs/content/en/configuration/front-matter.md
@@ -0,0 +1,91 @@
+---
+title: Configure front matter
+linkTitle: Front matter
+description: Configure front matter.
+categories: []
+keywords: []
+---
+
+## Dates
+
+There are four methods on a `Page` object that return a date.
+
+Method|Description
+:--|:--
+[`Date`]|Returns the date of the given page.
+[`ExpiryDate`]|Returns the expiry date of the given page.
+[`Lastmod`]|Returns the last modification date of the given page.
+[`PublishDate`]|Returns the publish date of the given page.
+
+[`Date`]: /methods/page/date
+[`ExpiryDate`]: /methods/page/expirydate
+[`Lastmod`]: /methods/page/lastmod
+[`PublishDate`]: /methods/page/publishdate
+
+Hugo determines the values to return based on this configuration:
+
+{{< code-toggle config=frontmatter />}}
+
+The `ExpiryDate` method, for example, returns the `expirydate` value if it exists, otherwise it returns `unpublishdate`.
+
+You can also use custom date parameters:
+
+{{< code-toggle file=hugo >}}
+[frontmatter]
+date = ["myDate", "date"]
+{{< /code-toggle >}}
+
+In the example above, the `Date` method returns the `myDate` value if it exists, otherwise it returns `date`.
+
+To fall back to the default sequence of dates, use the `:default` token:
+
+{{< code-toggle file=hugo >}}
+[frontmatter]
+date = ["myDate", ":default"]
+{{< /code-toggle >}}
+
+In the example above, the `Date` method returns the `myDate` value if it exists, otherwise it returns the first valid date from `date`, `publishdate`, `pubdate`, `published`, `lastmod`, and `modified`.
+
+## Aliases
+
+Some of the front matter fields have aliases.
+
+Front matter field|Aliases
+:--|:--
+`expiryDate`|`unpublishdate`
+`lastmod`|`modified`
+`publishDate`|`pubdate`, `published`
+
+The default front matter configuration includes these aliases.
+
+## Tokens
+
+Hugo provides several [tokens](g) to assist with front matter configuration.
+
+Token|Description
+:--|:--
+`:default`|The default ordered sequence of date fields.
+`:fileModTime`|The file's last modification timestamp.
+`:filename`|The date from the file name, if present.
+`:git`|The Git author date for the file's last revision.
+
+When Hugo extracts a date from a file name, it uses the rest of the file name to generate the page's [`slug`], but only if a slug isn't already specified in the page's front matter. For example, given the name `2025-02-01-article.md`, Hugo will set the `date` to `2025-02-01` and the `slug` to `article`.
+
+[`slug`]: /content-management/front-matter/#slug
+
+To enable access to the Git author date, set [`enableGitInfo`] to `true`, or use the `--enableGitInfo` flag when building your site.
+
+[`enableGitInfo`]: /configuration/all/#enablegitinfo
+
+Consider this example:
+
+{{< code-toggle file=hugo >}}
+[frontmatter]
+date = [':filename', ':default']
+lastmod = ['lastmod', ':fileModTime']
+{{< /code-toggle >}}
+
+To determine `date`, Hugo tries to extract the date from the file name, falling back to the default ordered sequence of date fields.
+
+To determine `lastmod`, Hugo looks for a `lastmod` field in front matter, falling back to the file's last modification timestamp.
diff --git a/docs/content/en/configuration/http-cache.md b/docs/content/en/configuration/http-cache.md
new file mode 100644
index 000000000..788d22a08
--- /dev/null
+++ b/docs/content/en/configuration/http-cache.md
@@ -0,0 +1,107 @@
+---
+title: Configure the HTTP cache
+linkTitle: HTTP cache
+description: Configure the HTTP cache.
+categories: []
+keywords: []
+---
+
+> [!note]
+> This configuration is only relevant when using the [`resources.GetRemote`] function.
+
+## Layered caching
+
+Hugo employs a layered caching system.
+
+```goat {.w-40}
+ .-----------.
+| dynacache |
+ '-----+-----'
+ |
+ v
+ .----------.
+| HTTP cache |
+ '-----+----'
+ |
+ v
+ .----------.
+| file cache |
+ '-----+----'
+```
+
+Dynacache
+: An in-memory cache employing a Least Recently Used (LRU) eviction policy. Entries are removed from the cache when changes occur, when they match [cache-busting] patterns, or under low-memory conditions.
+
+HTTP Cache
+: An HTTP cache for remote resources as specified in [RFC 9111]. Optimal performance is achieved when resources include appropriate HTTP cache headers. The HTTP cache utilizes the file cache for storage and retrieval of cached resources.
+
+File cache
+: See [configure file caches].
+
+The HTTP cache involves two key aspects: determining which content to cache (the caching process itself) and defining the frequency with which to check for updates (the polling strategy).
+
+## HTTP caching
+
+The HTTP cache behavior is defined for a configured set of resources. Stale resources will be refreshed from the file cache, even if their configured Time-To-Live (TTL) has not expired. If HTTP caching is disabled for a resource, Hugo will bypass the cache and access the file directly.
+
+The default configuration disables everything:
+
+{{< code-toggle file=hugo >}}
+[HTTPCache.cache.for]
+excludes = ['**']
+includes = []
+{{< /code-toggle >}}
+
+cache.for.excludes
+: (`[]string`) A list of [glob](g) patterns to exclude from caching.
+
+cache.for.includes
+: (`[]string`) A list of [glob](g) patterns to cache.
+
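+For example, to cache all remote resources fetched from a hypothetical host while leaving everything else uncached, a sketch might look like this:
+
+{{< code-toggle file=hugo >}}
+[HTTPCache.cache.for]
+includes = ['https://example.org/**']
+excludes = []
+{{< /code-toggle >}}
+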
+## HTTP polling
+
+Polling is used in watch mode (e.g., `hugo server`) to detect changes in remote resources. Polling can be enabled even if HTTP caching is disabled. Detected changes trigger a rebuild of pages using the affected resource. Polling can be disabled for specific resources, typically those known to be static.
+
+The default configuration disables everything:
+
+{{< code-toggle file=hugo >}}
+[[HTTPCache.polls]]
+disable = true
+high = '0s'
+low = '0s'
+[HTTPCache.polls.for]
+includes = ['**']
+excludes = []
+{{< /code-toggle >}}
+
+polls
+: A slice of polling configurations.
+
+polls.disable
+: (`bool`) Whether to disable polling for this configuration.
+
+polls.low
+: (`string`) The minimum polling interval expressed as a [duration](g). This is used after a recent change and gradually increases towards `polls.high`.
+
+polls.high
+: (`string`) The maximum polling interval expressed as a [duration](g). This is used when the resource is considered stable.
+
+polls.for.excludes
+: (`[]string`) A list of [glob](g) patterns to exclude from polling for this configuration.
+
+polls.for.includes
+: (`[]string`) A list of [glob](g) patterns to include in polling for this configuration.
+
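+For example, a sketch that polls resources from a hypothetical host at intervals between 30 seconds and 5 minutes, while keeping polling disabled for everything else:
+
+{{< code-toggle file=hugo >}}
+[[HTTPCache.polls]]
+low = '30s'
+high = '5m'
+[HTTPCache.polls.for]
+includes = ['https://example.org/**']
+[[HTTPCache.polls]]
+disable = true
+[HTTPCache.polls.for]
+includes = ['**']
+{{< /code-toggle >}}
+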
+## Behavior
+
+Polling and HTTP caching interact as follows:
+
+- With polling enabled, rebuilds are triggered only by actual changes, detected via `eTag` changes (Hugo generates an MD5 hash if the server doesn't provide one).
+- If polling is enabled but HTTP caching is disabled, the remote is checked for changes only after the file cache's TTL expires (e.g., a `maxAge` of `10h` with a `1s` polling interval is inefficient).
+- If both polling and HTTP caching are enabled, changes are checked for even before the file cache's TTL expires. Cached `eTag` and `last-modified` values are sent in `if-none-match` and `if-modified-since` headers, respectively, and a cached response is returned on HTTP [304].
+
+[`resources.GetRemote`]: /functions/resources/getremote/
+[304]: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/304
+[cache-busting]: /configuration/build/#cache-busters
+[configure file caches]: /configuration/caches/
+[RFC 9111]: https://datatracker.ietf.org/doc/html/rfc9111
diff --git a/docs/content/en/configuration/imaging.md b/docs/content/en/configuration/imaging.md
new file mode 100644
index 000000000..13ecf9c26
--- /dev/null
+++ b/docs/content/en/configuration/imaging.md
@@ -0,0 +1,69 @@
+---
+title: Configure imaging
+linkTitle: Imaging
+description: Configure imaging.
+categories: []
+keywords: []
+---
+
+## Processing options
+
+These are the default settings for processing images:
+
+{{< code-toggle file=hugo >}}
+[imaging]
+anchor = 'Smart'
+bgColor = '#ffffff'
+hint = 'photo'
+quality = 75
+resampleFilter = 'box'
+{{< /code-toggle >}}
+
+anchor
+: (`string`) When using the [`Crop`] or [`Fill`] method, the anchor determines the placement of the crop box. One of `TopLeft`, `Top`, `TopRight`, `Left`, `Center`, `Right`, `BottomLeft`, `Bottom`, `BottomRight`, or `Smart`. Default is `Smart`.
+
+bgColor
+: (`string`) The background color of the resulting image. Applicable when converting from a format that supports transparency to a format that does not support transparency, for example, when converting from PNG to JPEG. Expressed as an RGB [hexadecimal] value. Default is `#ffffff`.
+
+[hexadecimal]: https://developer.mozilla.org/en-US/docs/Web/CSS/hex-color
+
+hint
+: (`string`) Applicable to WebP images, this option corresponds to a set of predefined encoding parameters. One of `drawing`, `icon`, `photo`, `picture`, or `text`. Default is `photo`. See [details](/content-management/image-processing/#hint).
+
+quality
+: (`int`) Applicable to JPEG and WebP images, this value determines the quality of the converted image. Higher values produce better quality images, while lower values produce smaller files. Set this value to a whole number between `1` and `100`, inclusive. Default is `75`.
+
+resampleFilter
+: (`string`) The resampling filter used when resizing an image. Default is `box`. See [details](/content-management/image-processing/#resampling-filter).
+
+## EXIF data
+
+These are the default settings for extracting EXIF data from images:
+
+{{< code-toggle file=hugo >}}
+[imaging.exif]
+includeFields = ""
+excludeFields = ""
+disableDate = false
+disableLatLong = false
+{{< /code-toggle >}}
+
+disableDate
+: (`bool`) Whether to disable extraction of the image creation date/time. Default is `false`.
+
+disableLatLong
+: (`bool`) Whether to disable extraction of the GPS latitude and longitude. Default is `false`.
+
+excludeFields
+: (`string`) A [regular expression](g) matching the tags to exclude when extracting EXIF data.
+
+includeFields
+: (`string`) A [regular expression](g) matching the tags to include when extracting EXIF data. To include all available tags, set this value to `".*"`.
+
+> [!note]
+> To improve performance and decrease cache size, Hugo excludes the following tags: `ColorSpace`, `Contrast`, `Exif`, `Exposure[M|P|B]`, `Flash`, `GPS`, `JPEG`, `Metering`, `Resolution`, `Saturation`, `Sensing`, `Sharp`, and `WhiteBalance`.
+>
+> To control tag availability, change the `excludeFields` or `includeFields` settings as described above.
+
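+For example, to extract only date-related tags, you might use a pattern like this (the expression is illustrative):
+
+{{< code-toggle file=hugo >}}
+[imaging.exif]
+includeFields = '.*Date.*'
+{{< /code-toggle >}}
+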
+[`Crop`]: /methods/resource/crop/
+[`Fill`]: /methods/resource/fill/
diff --git a/docs/content/en/configuration/introduction.md b/docs/content/en/configuration/introduction.md
new file mode 100644
index 000000000..8f8ad4c1e
--- /dev/null
+++ b/docs/content/en/configuration/introduction.md
@@ -0,0 +1,284 @@
+---
+title: Introduction
+description: Configure your site using files, directories, and environment variables.
+categories: []
+keywords: []
+weight: 10
+---
+
+## Sensible defaults
+
+Hugo offers many configuration options, but its defaults are often sufficient. A new site requires only these settings:
+
+{{< code-toggle file=hugo >}}
+baseURL = 'https://example.org/'
+languageCode = 'en-us'
+title = 'My New Hugo Site'
+{{< /code-toggle >}}
+
+Only define settings that deviate from the defaults. A smaller configuration file is easier to read, understand, and debug. Keep your configuration concise.
+
+> [!note]
+> The best configuration file is a short configuration file.
+
+## Configuration file
+
+Create a site configuration file in the root of your project directory, naming it `hugo.toml`, `hugo.yaml`, or `hugo.json`, with that order of precedence.
+
+```text
+my-project/
+└── hugo.toml
+```
+
+> [!note]
+> For versions v0.109.0 and earlier, the site configuration file was named `config`. While you can still use this name, it's recommended to switch to the newer naming convention, `hugo`.
+
+A simple example:
+
+{{< code-toggle file=hugo >}}
+baseURL = 'https://example.org/'
+languageCode = 'en-us'
+title = 'ABC Widgets, Inc.'
+[params]
+subtitle = 'The Best Widgets on Earth'
+[params.contact]
+email = 'info@example.org'
+phone = '+1 202-555-1212'
+{{< /code-toggle >}}
+
+To use a different configuration file when building your site, use the `--config` flag:
+
+```sh
+hugo --config other.toml
+```
+
+Combine two or more configuration files, with left-to-right precedence:
+
+```sh
+hugo --config a.toml,b.yaml,c.json
+```
+
+> [!note]
+> See the specifications for each file format: [TOML], [YAML], and [JSON].
+
+## Configuration directory
+
+Instead of a single site configuration file, split your configuration by [environment](g), root configuration key, and language. For example:
+
+```text
+my-project/
+└── config/
+ ├── _default/
+ │ ├── hugo.toml
+ │ ├── menus.en.toml
+ │ ├── menus.de.toml
+ │ └── params.toml
+ └── production/
+ └── params.toml
+```
+
+The root configuration keys are {{< root-configuration-keys >}}.
+
+### Omit the root key
+
+When splitting the configuration by root key, omit the root key in the component file. For example, these are equivalent:
+
+{{< code-toggle file=config/_default/hugo >}}
+[params]
+foo = 'bar'
+{{< /code-toggle >}}
+
+{{< code-toggle file=config/_default/params >}}
+foo = 'bar'
+{{< /code-toggle >}}
+
+### Recursive parsing
+
+Hugo parses the `config` directory recursively, allowing you to organize the files into subdirectories. For example:
+
+```text
+my-project/
+└── config/
+ └── _default/
+ ├── navigation/
+ │ ├── menus.de.toml
+ │ └── menus.en.toml
+ └── hugo.toml
+```
+
+### Example
+
+```text
+my-project/
+└── config/
+ ├── _default/
+ │ ├── hugo.toml
+ │ ├── menus.en.toml
+ │ ├── menus.de.toml
+ │ └── params.toml
+ ├── production/
+ │ ├── hugo.toml
+ │ └── params.toml
+ └── staging/
+ ├── hugo.toml
+ └── params.toml
+```
+
+Considering the structure above, when running `hugo --environment staging`, Hugo will use every setting from `config/_default` and merge `staging`'s on top of those.
+
+Let's take an example to understand this better. Let's say you are using Google Analytics for your website. This requires you to specify a [Google tag ID] in your site configuration:
+
+{{< code-toggle file=hugo >}}
+[services.googleAnalytics]
+ID = 'G-XXXXXXXXX'
+{{< /code-toggle >}}
+
+Now consider the following scenario:
+
+1. You don't want to load the analytics code when running `hugo server`.
+1. You want to use different Google tag IDs for your production and staging environments. For example:
+ - `G-PPPPPPPPP` for production
+ - `G-SSSSSSSSS` for staging
+
+To satisfy these requirements, configure your site as follows:
+
+1. `config/_default/hugo.toml`
+ - Exclude the `services.googleAnalytics` section. This will prevent loading of the analytics code when you run `hugo server`.
+ - By default, Hugo sets its `environment` to `development` when running `hugo server`. In the absence of a `config/development` directory, Hugo uses the `config/_default` directory.
+1. `config/production/hugo.toml`
+ - Include this section only:
+
+ {{< code-toggle file=hugo >}}
+ [services.googleAnalytics]
+ ID = 'G-PPPPPPPPP'
+ {{< /code-toggle >}}
+
+ - You do not need to include other parameters in this file. Include only those parameters that are specific to your production environment. Hugo will merge these parameters with the default configuration.
+ - By default, Hugo sets its `environment` to `production` when running `hugo`. The analytics code will use the `G-PPPPPPPPP` tag ID.
+
+1. `config/staging/hugo.toml`
+
+ - Include this section only:
+
+ {{< code-toggle file=hugo >}}
+ [services.googleAnalytics]
+ ID = 'G-SSSSSSSSS'
+ {{< /code-toggle >}}
+
+ - You do not need to include other parameters in this file. Include only those parameters that are specific to your staging environment. Hugo will merge these parameters with the default configuration.
+ - To build your staging site, run `hugo --environment staging`. The analytics code will use the `G-SSSSSSSSS` tag ID.
+
+## Merge configuration settings
+
+Hugo merges configuration settings from themes and modules, prioritizing the project's own settings. Given this simplified project structure with two themes:
+
+```text
+project/
+├── themes/
+│ ├── theme-a/
+│ │ └── hugo.toml
+│ └── theme-b/
+│ └── hugo.toml
+└── hugo.toml
+```
+
+and this project-level configuration:
+
+{{< code-toggle file=hugo >}}
+baseURL = 'https://example.org/'
+languageCode = 'en-us'
+title = 'My New Hugo Site'
+theme = ['theme-a','theme-b']
+{{< /code-toggle >}}
+
+Hugo merges settings in this order:
+
+1. Project configuration (`hugo.toml` in the project root)
+1. `theme-a` configuration
+1. `theme-b` configuration
+
+The `_merge` setting within each top-level configuration key controls _which_ settings are merged and _how_ they are merged.
+
+The value for `_merge` can be one of:
+
+none
+: No merge.
+
+shallow
+: Only add values for new keys.
+
+deep
+: Add values for new keys, merge existing.
+
+Note that you don't need to be as verbose as the default setup below; a `_merge` value set higher up is inherited unless overridden.
+
+{{< code-toggle file=hugo dataKey="config_helpers.mergeStrategy" skipHeader=true />}}
+
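+For example, to override the merge strategy for a single top-level key, place `_merge` directly under that key (a minimal sketch):
+
+{{< code-toggle file=hugo >}}
+[params]
+_merge = 'deep'
+{{< /code-toggle >}}
+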
+## Environment variables
+
+You can also configure settings using operating system environment variables:
+
+```sh
+export HUGO_BASEURL=https://example.org/
+export HUGO_ENABLEGITINFO=true
+export HUGO_ENVIRONMENT=staging
+hugo
+```
+
+The above sets the [`baseURL`], [`enableGitInfo`], and [`environment`] configuration options and then builds your site.
+
+> [!note]
+> An environment variable takes precedence over the values set in the configuration file. This means that if you set a configuration value with both an environment variable and in the configuration file, the value in the environment variable will be used.
+
+Environment variables simplify configuration for [CI/CD](g) deployments like GitHub Pages, GitLab Pages, and Netlify by allowing you to set values directly within their respective configuration and workflow files.
+
+> [!note]
+> Environment variable names must be prefixed with `HUGO_`.
+>
+> To set custom site parameters, prefix the name with `HUGO_PARAMS_`.
+
+For snake_case variable names, the standard `HUGO_` prefix won't work. Hugo infers the delimiter from the first character following `HUGO`. This allows for variations like `HUGOxPARAMSxAPI_KEY=abcdefgh` using any [permitted delimiter].
+
+In addition to configuring standard settings, environment variables may be used to override default values for certain internal settings:
+
+DART_SASS_BINARY
+: (`string`) The absolute path to the Dart Sass executable. By default, Hugo searches for the executable in each of the paths in the `PATH` environment variable.
+
+HUGO_FILE_LOG_FORMAT
+: (`string`) A format string for the file path, line number, and column number displayed when reporting errors, or when calling the `Position` method from a shortcode or Markdown render hook. Valid tokens are `:file`, `:line`, and `:col`. Default is `:file::line::col`.
+
+HUGO_MEMORYLIMIT
+: {{< new-in 0.123.0 />}}
+: (`int`) The maximum amount of system memory, in gigabytes, that Hugo can use while rendering your site. Default is 25% of total system memory. Note that `HUGO_MEMORYLIMIT` is a "best effort" setting. Don't expect Hugo to build a million pages with only 1 GB of memory. You can get more information about how this behaves during the build by building with `hugo --logLevel info` and looking for the `dynacache` label.
+
+HUGO_NUMWORKERMULTIPLIER
+: (`int`) The number of workers used in parallel processing. Default is the number of logical CPUs.
+
+## Current configuration
+
+Display the complete site configuration with:
+
+```sh
+hugo config
+```
+
+Display a specific configuration setting with:
+
+```sh
+hugo config | grep [key]
+```
+
+Display the configured file mounts with:
+
+```sh
+hugo config mounts
+```
+
+[`baseURL`]: /configuration/all#baseurl
+[`enableGitInfo`]: /configuration/all#enablegitinfo
+[`environment`]: /configuration/all#environment
+[Google tag ID]: https://support.google.com/tagmanager/answer/12326985?hl=en
+[JSON]: https://datatracker.ietf.org/doc/html/rfc7159
+[permitted delimiter]: https://pubs.opengroup.org/onlinepubs/000095399/basedefs/xbd_chap08.html
+[TOML]: https://toml.io/en/latest
+[YAML]: https://yaml.org/spec/
diff --git a/docs/content/en/configuration/languages.md b/docs/content/en/configuration/languages.md
new file mode 100644
index 000000000..540cfd34f
--- /dev/null
+++ b/docs/content/en/configuration/languages.md
@@ -0,0 +1,193 @@
+---
+title: Configure languages
+linkTitle: Languages
+description: Configure the languages in your multilingual site.
+categories: []
+keywords: []
+---
+
+## Base settings
+
+Configure the following base settings within the site's root configuration:
+
+{{< code-toggle file=hugo >}}
+defaultContentLanguage = 'en'
+defaultContentLanguageInSubdir = false
+disableDefaultLanguageRedirect = false
+disableLanguages = []
+{{< /code-toggle >}}
+
+defaultContentLanguage
+: (`string`) The project's default language key, conforming to the syntax described in [RFC 5646]. This value must match one of the defined [language keys](#language-keys). Default is `en`.
+
+defaultContentLanguageInSubdir
+: (`bool`) Whether to publish the default language site to a subdirectory matching the `defaultContentLanguage`. Default is `false`.
+
+disableDefaultLanguageRedirect
+: {{< new-in 0.140.0 />}}
+: (`bool`) Whether to disable generation of the alias redirect to the default language when `DefaultContentLanguageInSubdir` is `true`. Default is `false`.
+
+disableLanguages
+: (`[]string`) A slice of language keys representing the languages to disable during the build process. Although this is functional, consider using the [`disabled`](#disabled) key under each language instead.
+
+## Language settings
+
+Configure each language under the `languages` key:
+
+{{< code-toggle config=languages />}}
+
+In the above, `en` is the [language key](#language-keys).
+
+disabled
+: (`bool`) Whether to disable this language when building the site. Default is `false`.
+
+languageCode
+: (`string`) The language tag as described in [RFC 5646]. This value does not affect localization or URLs. Hugo uses this value to populate:
+
+ - The `lang` attribute of the `html` element in the [embedded alias template]
+ - The `language` element in the [embedded RSS template]
+ - The `locale` property in the [embedded OpenGraph template]
+
+ Access this value from a template using the [`Language.LanguageCode`] method on a `Site` or `Page` object.
+
+languageDirection
+: (`string`) The language direction, either left-to-right (`ltr`) or right-to-left (`rtl`). Use this value in your templates with the global [`dir`] HTML attribute. Access this value from a template using the [`Language.LanguageDirection`] method on a `Site` or `Page` object.
+
+languageName
+: (`string`) The language name, typically used when rendering a language switcher. Access this value from a template using the [`Language.LanguageName`] method on a `Site` or `Page` object.
+
+title
+: (`string`) The site title for this language. Access this value from a template using the [`Title`] method on a `Site` object.
+
+weight
+: (`int`) The language [weight](g). When set to a non-zero value, this is the primary sort criteria for this language. Access this value from a template using the [`Language.Weight`] method on a `Site` or `Page` object.
+
+## Localized settings
+
+Some configuration settings can be defined separately for each language. For example:
+
+{{< code-toggle file=hugo >}}
+[languages.en]
+languageCode = 'en-US'
+languageName = 'English'
+weight = 1
+title = 'Project Documentation'
+timeZone = 'America/New_York'
+[languages.en.pagination]
+path = 'page'
+[languages.en.params]
+subtitle = 'Reference, Tutorials, and Explanations'
+{{< /code-toggle >}}
+
+The following configuration keys can be defined separately for each language:
+
+{{< per-lang-config-keys >}}
+
+Any key not defined in a `languages` object will fall back to the global value in the root of the site configuration.
+
+## Language keys
+
+Language keys must conform to the syntax described in [RFC 5646]. For example:
+
+{{< code-toggle file=hugo >}}
+defaultContentLanguage = 'de'
+[languages.de]
+ weight = 1
+[languages.en-US]
+ weight = 2
+[languages.pt-BR]
+ weight = 3
+{{< /code-toggle >}}
+
+Artificial languages with private use subtags as defined in [RFC 5646 § 2.2.7] are also supported. Omit the `art-x-` prefix from the language key. For example:
+
+{{< code-toggle file=hugo >}}
+defaultContentLanguage = 'en'
+[languages.en]
+weight = 1
+[languages.hugolang]
+weight = 2
+{{< /code-toggle >}}
+
+> [!note]
+> Private use subtags must not exceed 8 alphanumeric characters.
+
+## Example
+
+{{< code-toggle file=hugo >}}
+defaultContentLanguage = 'de'
+defaultContentLanguageInSubdir = true
+disableDefaultLanguageRedirect = false
+
+[languages.de]
+contentDir = 'content/de'
+disabled = false
+languageCode = 'de-DE'
+languageDirection = 'ltr'
+languageName = 'Deutsch'
+title = 'Projekt Dokumentation'
+weight = 1
+
+[languages.de.params]
+subtitle = 'Referenz, Tutorials und Erklärungen'
+
+[languages.en]
+contentDir = 'content/en'
+disabled = false
+languageCode = 'en-US'
+languageDirection = 'ltr'
+languageName = 'English'
+title = 'Project Documentation'
+weight = 2
+
+[languages.en.params]
+subtitle = 'Reference, Tutorials, and Explanations'
+{{< /code-toggle >}}
+
+> [!note]
+> In the example above, omit `contentDir` if [translating by file name].
+
+## Multihost
+
+Hugo supports multiple languages in a multihost configuration. This means you can configure a `baseURL` per `language`.
+
+> [!note]
+> If you define a `baseURL` for one language, you must define a unique `baseURL` for all languages.
+
+For example:
+
+{{< code-toggle file=hugo >}}
+defaultContentLanguage = 'fr'
+[languages]
+ [languages.en]
+ baseURL = 'https://en.example.org/'
+ languageName = 'English'
+ title = 'In English'
+ weight = 2
+ [languages.fr]
+ baseURL = 'https://fr.example.org'
+ languageName = 'Français'
+ title = 'En Français'
+ weight = 1
+{{< /code-toggle >}}
+
+With the above, Hugo publishes two sites, each with their own root:
+
+```text
+public
+├── en
+└── fr
+```
+
+[`dir`]: https://developer.mozilla.org/en-US/docs/Web/HTML/Global_attributes/dir
+[`Language.LanguageCode`]: /methods/site/language/#languagecode
+[`Language.LanguageDirection`]: /methods/site/language/#languagedirection
+[`Language.LanguageName`]: /methods/site/language/#languagename
+[`Language.Weight`]: /methods/site/language/#weight
+[`Title`]: /methods/site/title/
+[embedded alias template]: {{% eturl alias %}}
+[embedded OpenGraph template]: {{% eturl opengraph %}}
+[embedded RSS template]: {{% eturl rss %}}
+[RFC 5646]: https://datatracker.ietf.org/doc/html/rfc5646#section-2.1
+[RFC 5646 § 2.2.7]: https://datatracker.ietf.org/doc/html/rfc5646#section-2.2.7
+[translating by file name]: /content-management/multilingual/#translation-by-file-name
diff --git a/docs/content/en/configuration/markup.md b/docs/content/en/configuration/markup.md
new file mode 100644
index 000000000..b6135cee5
--- /dev/null
+++ b/docs/content/en/configuration/markup.md
@@ -0,0 +1,341 @@
+---
+title: Configure markup
+linkTitle: Markup
+description: Configure markup.
+categories: []
+keywords: []
+aliases: [/getting-started/configuration-markup/]
+---
+
+## Default handler
+
+In its default configuration, Hugo uses [Goldmark] to render Markdown to HTML.
+
+{{< code-toggle file=hugo >}}
+[markup]
+defaultMarkdownHandler = 'goldmark'
+{{< /code-toggle >}}
+
+Files ending with `.md`, `.mdown`, or `.markdown` are processed as Markdown, unless you've explicitly set a different format using the `markup` field in your front matter.
+
+To use a different renderer for Markdown files, specify one of `asciidocext`, `org`, `pandoc`, or `rst` in your site configuration.
+
+`defaultMarkdownHandler`|Renderer
+:--|:--
+`asciidocext`|[AsciiDoc]
+`goldmark`|[Goldmark]
+`org`|[Emacs Org Mode]
+`pandoc`|[Pandoc]
+`rst`|[reStructuredText]
+
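+For example, to render Markdown files with Pandoc instead of Goldmark:
+
+{{< code-toggle file=hugo >}}
+[markup]
+defaultMarkdownHandler = 'pandoc'
+{{< /code-toggle >}}
+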
+To use AsciiDoc, Pandoc, or reStructuredText you must install the relevant renderer and update your [security policy].
+
+> [!note]
+> Unless you need a unique capability provided by one of the alternative Markdown handlers, we strongly recommend that you use the default setting. Goldmark is fast, well maintained, conforms to the [CommonMark] specification, and is compatible with [GitHub Flavored Markdown] (GFM).
+
+## Goldmark
+
+This is the default configuration for the Goldmark Markdown renderer:
+
+{{< code-toggle config=markup.goldmark />}}
+
+### Extensions
+
+The extensions below, excluding Extras and Passthrough, are enabled by default.
+
+Extension|Documentation|Enabled
+:--|:--|:-:
+`cjk`|[Goldmark Extensions: CJK]|:heavy_check_mark:
+`definitionList`|[PHP Markdown Extra: Definition lists]|:heavy_check_mark:
+`extras`|[Hugo Goldmark Extensions: Extras]||
+`footnote`|[PHP Markdown Extra: Footnotes]|:heavy_check_mark:
+`linkify`|[GitHub Flavored Markdown: Autolinks]|:heavy_check_mark:
+`passthrough`|[Hugo Goldmark Extensions: Passthrough]||
+`strikethrough`|[GitHub Flavored Markdown: Strikethrough]|:heavy_check_mark:
+`table`|[GitHub Flavored Markdown: Tables]|:heavy_check_mark:
+`taskList`|[GitHub Flavored Markdown: Task list items]|:heavy_check_mark:
+`typographer`|[Goldmark Extensions: Typographer]|:heavy_check_mark:
+
+#### Extras
+
+{{< new-in 0.126.0 />}}
+
+Enable [deleted text], [inserted text], [mark text], [subscript], and [superscript] elements in Markdown.
+
+Element|Markdown|Rendered
+:--|:--|:--
+Deleted text|`~~foo~~`|<del>foo</del>
+Inserted text|`++bar++`|<ins>bar</ins>
+Mark text|`==baz==`|<mark>baz</mark>
+Subscript|`H~2~O`|H<sub>2</sub>O
+Superscript|`1^st^`|1<sup>st</sup>
+
+The "subscript" feature of the Extras extension conflicts with the Strikethrough extension. To render subscript and strikethrough text concurrently you must:
+
+1. Disable the Strikethrough extension
+1. Enable the "deleted text" feature of the Extras extension
+
+For example:
+
+{{< code-toggle file=hugo >}}
+[markup.goldmark.extensions]
+strikethrough = false
+
+[markup.goldmark.extensions.extras.delete]
+enable = true
+
+[markup.goldmark.extensions.extras.subscript]
+enable = true
+{{< /code-toggle >}}
+
+#### Passthrough
+
+{{< new-in 0.122.0 />}}
+
+Enable the Passthrough extension to include mathematical equations and expressions in Markdown using LaTeX markup. See [mathematics in Markdown] for details.
+
+#### Typographer
+
+The Typographer extension replaces certain character combinations with HTML entities as specified below:
+
+Markdown|Replaced by|Description
+:--|:--|:--
+`...`|`…`|horizontal ellipsis
+`'`|`’`|apostrophe
+`--`|`–`|en dash
+`---`|`—`|em dash
+`<<`|`«`|left angle quote
+`"`|`“`|left double quote
+`'`|`‘`|left single quote
+`>>`|`»`|right angle quote
+`"`|`”`|right double quote
+`'`|`’`|right single quote
+
+### Settings explained
+
+Most of the Goldmark settings above are self-explanatory, but some require explanation.
+
+duplicateResourceFiles
+: {{< new-in 0.123.0 />}}
+: (`bool`) Whether to duplicate shared page resources for each language on multilingual single-host sites. See [multilingual page resources] for details. Default is `false`.
+
+ > [!note]
+ > With multilingual single-host sites, setting this parameter to `false` will enable Hugo's [embedded link render hook] and [embedded image render hook]. This is the default configuration for multilingual single-host sites.
+
+parser.wrapStandAloneImageWithinParagraph
+: (`bool`) Whether to wrap image elements without adjacent content within a `p` element when rendered. This is the default Markdown behavior. Set to `false` when using an [image render hook] to render standalone images as `figure` elements. Default is `true`.
+
+parser.autoDefinitionTermID
+: {{< new-in 0.144.0 />}}
+: (`bool`) Whether to automatically add `id` attributes to description list terms (i.e., `dt` elements). When `true`, the `id` attribute of each `dt` element is accessible through the [`Fragments.Identifiers`] method on a `Page` object.
+
+parser.autoHeadingID
+: (`bool`) Whether to automatically add `id` attributes to headings (i.e., `h1`, `h2`, `h3`, `h4`, `h5`, and `h6` elements).
+
+parser.autoIDType
+: (`string`) The strategy used to automatically generate `id` attributes, one of `github`, `github-ascii`, or `blackfriday`.
+
+ - `github` produces GitHub-compatible `id` attributes
+ - `github-ascii` drops any non-ASCII characters after accent normalization
+ - `blackfriday` produces `id` attributes compatible with the Blackfriday Markdown renderer
+
+ This is also the strategy used by the [anchorize](/functions/urls/anchorize) template function. Default is `github`.
+
+parser.attribute.block
+: (`bool`) Whether to enable [Markdown attributes] for block elements. Default is `false`.
+
+parser.attribute.title
+: (`bool`) Whether to enable [Markdown attributes] for headings. Default is `true`.
+
+renderHooks.image.enableDefault
+: {{< new-in 0.123.0 />}}
+: (`bool`) Whether to enable the [embedded image render hook]. Default is `false`.
+
+ > [!note]
+ > The embedded image render hook is automatically enabled for multilingual single-host sites if [duplication of shared page resources] is disabled. This is the default configuration for multilingual single-host sites.
+
+renderHooks.link.enableDefault
+: {{< new-in 0.123.0 />}}
+: (`bool`) Whether to enable the [embedded link render hook]. Default is `false`.
+
+ > [!note]
+ > The embedded link render hook is automatically enabled for multilingual single-host sites if [duplication of shared page resources] is disabled. This is the default configuration for multilingual single-host sites.
+
+renderer.hardWraps
+: (`bool`) Whether to replace newline characters within a paragraph with `br` elements. Default is `false`.
+
+renderer.unsafe
+: (`bool`) Whether to render raw HTML mixed within Markdown. This is unsafe unless the content is under your control. Default is `false`.
+
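+For example, a sketch that enables raw HTML passthrough and Markdown attributes for block elements, two of the settings described above:
+
+{{< code-toggle file=hugo >}}
+[markup.goldmark.parser.attribute]
+block = true
+[markup.goldmark.renderer]
+unsafe = true
+{{< /code-toggle >}}
+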
+## AsciiDoc
+
+This is the default configuration for the AsciiDoc renderer:
+
+{{< code-toggle config=markup.asciidocExt />}}
+
+### Settings explained
+
+attributes
+: (`map`) A map of key-value pairs, each a document attribute. See Asciidoctor's [attributes].
+
+backend
+: (`string`) The backend output file format. Default is `html5`.
+
+extensions
+: (`string array`) An array of enabled extensions, one or more of `asciidoctor-html5s`, `asciidoctor-bibtex`, `asciidoctor-diagram`, `asciidoctor-interdoc-reftext`, `asciidoctor-katex`, `asciidoctor-latex`, `asciidoctor-mathematical`, or `asciidoctor-question`.
+
+ > [!note]
+ > To mitigate security risks, entries in the extension array may not contain forward slashes (`/`), backslashes (`\`), or periods. Due to this restriction, extensions must be in Ruby's `$LOAD_PATH`.
+
+failureLevel
+: (`string`) The minimum logging level that triggers a non-zero exit code (failure). Default is `fatal`.
+
+noHeaderOrFooter
+: (`bool`) Whether to output an embeddable document, which excludes the header, the footer, and everything outside the body of the document. Default is `true`.
+
+preserveTOC
+: (`bool`) Whether to preserve the table of contents (TOC) rendered by Asciidoctor. By default, to make the TOC compatible with existing themes, Hugo removes the TOC rendered by Asciidoctor. To render the TOC, use the [`TableOfContents`] method on a `Page` object in your templates. Default is `false`.
+
+safeMode
+: (`string`) The safe mode level, one of `unsafe`, `safe`, `server`, or `secure`. Default is `unsafe`.
+
+sectionNumbers
+: (`bool`) Whether to number each section title. Default is `false`.
+
+trace
+: (`bool`) Whether to include backtrace information on errors. Default is `false`.
+
+verbose
+: (`bool`) Whether to verbosely print processing information and configuration file checks to stderr. Default is `false`.
+
+workingFolderCurrent
+: (`bool`) Whether to set the working directory to be the same as that of the AsciiDoc file being processed, allowing [includes] to work with relative paths. Set to `true` to render diagrams with the [asciidoctor-diagram] extension. Default is `false`.
+
+### Configuration example
+
+{{< code-toggle file=hugo >}}
+[markup.asciidocExt]
+ extensions = ["asciidoctor-html5s", "asciidoctor-diagram"]
+ workingFolderCurrent = true
+ [markup.asciidocExt.attributes]
+ my-base-url = "https://example.com/"
+ my-attribute-name = "my value"
+{{< /code-toggle >}}
+
+### Syntax highlighting
+
+Follow the steps below to enable syntax highlighting.
+
+#### Step 1
+
+Set the `source-highlighter` attribute in your site configuration. For example:
+
+{{< code-toggle file=hugo >}}
+[markup.asciidocExt.attributes]
+source-highlighter = 'rouge'
+{{< /code-toggle >}}
+
+#### Step 2
+
+Generate the highlighter CSS. For example:
+
+```text
+rougify style monokai.sublime > assets/css/syntax.css
+```
+
+#### Step 3
+
+In your base template add a link to the CSS file:
+
+```go-html-template {file="layouts/_default/baseof.html"}
+<head>
+  ...
+  {{ with resources.Get "css/syntax.css" }}
+    <link rel="stylesheet" href="{{ .RelPermalink }}">
+  {{ end }}
+  ...
+</head>
+```
+
+Then add the code to be highlighted to your markup:
+
+```text
+[#hello,ruby]
+----
+require 'sinatra'
+
+get '/hi' do
+ "Hello World!"
+end
+----
+```
+
+### Troubleshooting
+
+Run `hugo --logLevel debug` to examine Hugo's call to the Asciidoctor executable:
+
+```txt
+INFO 2019/12/22 09:08:48 Rendering book-as-pdf.adoc with C:\Ruby26-x64\bin\asciidoctor.bat using asciidoc args [--no-header-footer -r asciidoctor-html5s -b html5s -r asciidoctor-diagram --base-dir D:\prototypes\hugo_asciidoc_ddd\docs -a outdir=D:\prototypes\hugo_asciidoc_ddd\build -] ...
+```
+
+## Highlight
+
+This is the default configuration.
+
+{{< code-toggle config=markup.highlight />}}
+
+{{% include "/_common/syntax-highlighting-options.md" %}}
+
+## Table of contents
+
+This is the default configuration for the table of contents, applicable to Goldmark and Asciidoctor:
+
+{{< code-toggle config=markup.tableOfContents />}}
+
+startLevel
+: (`int`) Heading levels less than this value will be excluded from the table of contents. For example, to exclude `h1` elements from the table of contents, set this value to `2`. Default is `2`.
+
+endLevel
+: (`int`) Heading levels greater than this value will be excluded from the table of contents. For example, to exclude `h4`, `h5`, and `h6` elements from the table of contents, set this value to `3`. Default is `3`.
+
+ordered
+: (`bool`) Whether to generate an ordered list instead of an unordered list. Default is `false`.
+
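+For example, to include `h2` through `h4` headings in the table of contents:
+
+{{< code-toggle file=hugo >}}
+[markup.tableOfContents]
+startLevel = 2
+endLevel = 4
+{{< /code-toggle >}}
+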
+[`Fragments.Identifiers`]: /methods/page/fragments/#identifiers
+[`TableOfContents`]: /methods/page/tableofcontents/
+[asciidoctor-diagram]: https://asciidoctor.org/docs/asciidoctor-diagram/
+[attributes]: https://asciidoctor.org/docs/asciidoc-syntax-quick-reference/#attributes-and-substitutions
+[CommonMark]: https://spec.commonmark.org/current/
+[deleted text]: https://developer.mozilla.org/en-US/docs/Web/HTML/Element/del
+[duplication of shared page resources]: /configuration/markup/#duplicateresourcefiles
+[embedded image render hook]: /render-hooks/images/#default
+[embedded link render hook]: /render-hooks/links/#default
+[GitHub Flavored Markdown]: https://github.github.com/gfm/
+[GitHub Flavored Markdown: Autolinks]: https://github.github.com/gfm/#autolinks-extension-
+[GitHub Flavored Markdown: Strikethrough]: https://github.github.com/gfm/#strikethrough-extension-
+[GitHub Flavored Markdown: Tables]: https://github.github.com/gfm/#tables-extension-
+[GitHub Flavored Markdown: Task list items]: https://github.github.com/gfm/#task-list-items-extension-
+[Goldmark]: https://github.com/yuin/goldmark/
+[Goldmark Extensions: CJK]: https://github.com/yuin/goldmark?tab=readme-ov-file#cjk-extension
+[Goldmark Extensions: Typographer]: https://github.com/yuin/goldmark?tab=readme-ov-file#typographer-extension
+[Hugo Goldmark Extensions: Extras]: https://github.com/gohugoio/hugo-goldmark-extensions?tab=readme-ov-file#extras-extension
+[Hugo Goldmark Extensions: Passthrough]: https://github.com/gohugoio/hugo-goldmark-extensions?tab=readme-ov-file#passthrough-extension
+[image render hook]: /render-hooks/images/
+[includes]: https://docs.asciidoctor.org/asciidoc/latest/syntax-quick-reference/#includes
+[inserted text]: https://developer.mozilla.org/en-US/docs/Web/HTML/Element/ins
+[mark text]: https://developer.mozilla.org/en-US/docs/Web/HTML/Element/mark
+[Markdown attributes]: /content-management/markdown-attributes/
+[mathematics in Markdown]: /content-management/mathematics/
+[multilingual page resources]: /content-management/page-resources/#multilingual
+[PHP Markdown Extra: Definition lists]: https://michelf.ca/projects/php-markdown/extra/#def-list
+[PHP Markdown Extra: Footnotes]: https://michelf.ca/projects/php-markdown/extra/#footnotes
+[security policy]: /configuration/security/
+[subscript]: https://developer.mozilla.org/en-US/docs/Web/HTML/Element/sub
+[superscript]: https://developer.mozilla.org/en-US/docs/Web/HTML/Element/sup
+[AsciiDoc]: https://asciidoc.org/
+[Emacs Org Mode]: https://orgmode.org/
+[Pandoc]: https://www.pandoc.org/
+[reStructuredText]: https://docutils.sourceforge.io/rst.html
diff --git a/docs/content/en/configuration/media-types.md b/docs/content/en/configuration/media-types.md
new file mode 100644
index 000000000..ea89ee04a
--- /dev/null
+++ b/docs/content/en/configuration/media-types.md
@@ -0,0 +1,82 @@
+---
+title: Configure media types
+linkTitle: Media types
+description: Configure media types.
+categories: []
+keywords: []
+---
+
+{{% glossary-term "media type" %}}
+
+Configured media types serve multiple purposes in Hugo, including the definition of [output formats](g). This is the default media type configuration in tabular form:
+
+{{< datatable "config" "mediaTypes" "_key" "suffixes" >}}
+
+The `suffixes` column in the table above shows the suffixes associated with each media type. For example, Hugo associates `.html` and `.htm` files with the `text/html` media type.
+
+> [!note]
+> The first suffix is the primary suffix. Use the primary suffix when naming template files. For example, when creating a template for an RSS feed, use the `xml` suffix.
+
+## Default configuration
+
+The following is the default configuration that matches the table above:
+
+{{< code-toggle file=hugo config=mediaTypes />}}
+
+delimiter
+: (`string`) The delimiter between the file name and the suffix. The delimiter, in conjunction with the suffix, forms the file extension. Default is `"."`.
+
+suffixes
+: (`[]string`) The suffixes associated with this media type. The first suffix is the primary suffix.
+
+## Modify a media type
+
+You can modify any of the default media types. For example, to switch the primary suffix for `text/html` from `html` to `htm`:
+
+{{< code-toggle file=hugo >}}
+[mediaTypes.'text/html']
+suffixes = ['htm','html']
+{{< /code-toggle >}}
+
+If you alter a default media type, you must also explicitly redefine all output formats that utilize that media type. For example, to ensure the changes above affect the `html` output format, redefine the `html` output format:
+
+{{< code-toggle file=hugo >}}
+[outputFormats.html]
+mediaType = 'text/html'
+{{< /code-toggle >}}
+
+## Create a media type
+
+You can create new media types as needed. For example, to create a media type for an Atom feed:
+
+{{< code-toggle file=hugo >}}
+[mediaTypes.'application/atom+xml']
+suffixes = ['atom']
+{{< /code-toggle >}}
+
+## Media types without suffixes
+
+Occasionally, you may need to create a media type without a suffix or delimiter. For example, [Netlify] recognizes configuration files named `_redirects` and `_headers`, which Hugo can generate using custom [output formats](g).
+
+To support these custom output formats, register a custom media type with no suffix or delimiter:
+
+{{< code-toggle file=hugo >}}
+[mediaTypes."text/netlify"]
+delimiter = ""
+{{< /code-toggle >}}
+
+The custom output format definitions would look something like this:
+
+{{< code-toggle file=hugo >}}
+[outputFormats.redir]
+baseName = "_redirects"
+isPlainText = true
+mediaType = "text/netlify"
+[outputFormats.headers]
+baseName = "_headers"
+isPlainText = true
+mediaType = "text/netlify"
+notAlternative = true
+{{< /code-toggle >}}
+
+[Netlify]: https://www.netlify.com/
diff --git a/docs/content/en/configuration/menus.md b/docs/content/en/configuration/menus.md
new file mode 100644
index 000000000..759f53ff3
--- /dev/null
+++ b/docs/content/en/configuration/menus.md
@@ -0,0 +1,135 @@
+---
+title: Configure menus
+linkTitle: Menus
+description: Centrally define menu entries for one or more menus.
+categories: []
+keywords: []
+---
+
+> [!note]
+> To understand Hugo's menu system, please refer to the [menus] page.
+
+There are three ways to define menu entries:
+
+1. [Automatically]
+1. [In front matter]
+1. In site configuration
+
+This page covers the site configuration method.
+
+## Example
+
+To define entries for a "main" menu:
+
+{{< code-toggle file=hugo >}}
+[[menus.main]]
+name = 'Home'
+pageRef = '/'
+weight = 10
+
+[[menus.main]]
+name = 'Products'
+pageRef = '/products'
+weight = 20
+
+[[menus.main]]
+name = 'Services'
+pageRef = '/services'
+weight = 30
+{{< /code-toggle >}}
+
+This creates a menu structure that you can access with the [`Menus`] method on a `Site` object:
+
+```go-html-template
+{{ range .Site.Menus.main }}
+ ...
+{{ end }}
+```
+
+See [menu templates] for a detailed example.
+
+To define entries for a "footer" menu:
+
+{{< code-toggle file=hugo >}}
+[[menus.footer]]
+name = 'Terms'
+pageRef = '/terms'
+weight = 10
+
+[[menus.footer]]
+name = 'Privacy'
+pageRef = '/privacy'
+weight = 20
+{{< /code-toggle >}}
+
+Access this menu structure in the same way:
+
+```go-html-template
+{{ range .Site.Menus.footer }}
+ ...
+{{ end }}
+```
+
+## Properties
+
+Menu entries usually include at least three properties: `name`, `weight`, and either `pageRef` or `url`. Use `pageRef` for internal page destinations and `url` for external destinations.
+
+These are the available menu entry properties:
+
+{{% include "/_common/menu-entry-properties.md" %}}
+
+pageRef
+: (`string`) The [logical path](g) of the target page. For example:
+
+ page kind|pageRef
+ :--|:--
+ home|`/`
+ page|`/books/book-1`
+ section|`/books`
+ taxonomy|`/tags`
+ term|`/tags/foo`
+
+url
+: (`string`) The destination URL. Use this for external destinations only.
+
+## Nested menu
+
+This nested menu demonstrates some of the available properties:
+
+{{< code-toggle file=hugo >}}
+[[menus.main]]
+name = 'Products'
+pageRef = '/products'
+weight = 10
+
+[[menus.main]]
+name = 'Hardware'
+pageRef = '/products/hardware'
+parent = 'Products'
+weight = 1
+
+[[menus.main]]
+name = 'Software'
+pageRef = '/products/software'
+parent = 'Products'
+weight = 2
+
+[[menus.main]]
+name = 'Services'
+pageRef = '/services'
+weight = 20
+
+[[menus.main]]
+name = 'Hugo'
+pre = ''
+url = 'https://gohugo.io/'
+weight = 30
+[menus.main.params]
+rel = 'external'
+{{< /code-toggle >}}
+
+[`Menus`]: /methods/site/menus/
+[Automatically]: /content-management/menus/#define-automatically
+[In front matter]: /content-management/menus/#define-in-front-matter
+[menu templates]: /templates/menu/
+[menus]: /content-management/menus/
diff --git a/docs/content/en/configuration/minify.md b/docs/content/en/configuration/minify.md
new file mode 100644
index 000000000..a530cb73d
--- /dev/null
+++ b/docs/content/en/configuration/minify.md
@@ -0,0 +1,15 @@
+---
+title: Configure minify
+linkTitle: Minify
+description: Configure minify.
+categories: []
+keywords: []
+---
+
+This is the default configuration:
+
+{{< code-toggle config=minify />}}
+
+See the [tdewolff/minify] project page for details.
+
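+For example, a sketch that minifies the final output while skipping XML files entirely, using two of the settings shown above:
+
+{{< code-toggle file=hugo >}}
+[minify]
+disableXML = true
+minifyOutput = true
+{{< /code-toggle >}}
+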
+[tdewolff/minify]: https://github.com/tdewolff/minify
diff --git a/docs/content/en/configuration/module.md b/docs/content/en/configuration/module.md
new file mode 100644
index 000000000..d736b7c6f
--- /dev/null
+++ b/docs/content/en/configuration/module.md
@@ -0,0 +1,179 @@
+---
+title: Configure modules
+linkTitle: Modules
+description: Configure modules.
+categories: []
+keywords: []
+aliases: [/hugo-modules/configuration/]
+---
+
+## Top-level options
+
+This is the default configuration:
+
+{{< code-toggle file=hugo >}}
+[module]
+noProxy = 'none'
+noVendor = ''
+private = '*.*'
+proxy = 'direct'
+vendorClosest = false
+workspace = 'off'
+{{< /code-toggle >}}
+
+auth
+: {{< new-in 0.144.0 />}}
+: (`string`) Configures `GOAUTH` when running the Go command for module operations. This is a semicolon-separated list of authentication commands for go-import and HTTPS module mirror interactions. This is useful for private repositories. See `go help goauth` for more information.
+
+noProxy
+: (`string`) A comma-separated list of [glob](g) patterns matching paths that should not use the [configured proxy server](#proxy).
+
+noVendor
+: (`string`) A [glob](g) pattern matching module paths to skip when vendoring.
+
+private
+: (`string`) A comma-separated list of [glob](g) patterns matching paths that should be treated as private.
+
+proxy
+: (`string`) The proxy server to use to download remote modules. Default is `direct`, which means `git clone` and similar.
+
+replacements
+: (`string`) Primarily useful for local module development, a comma-separated list of mappings from module paths to directories. Paths may be absolute or relative to the [`themesDir`].
+
+ {{< code-toggle file=hugo >}}
+ [module]
+ replacements = 'github.com/bep/my-theme -> ../..,github.com/bep/shortcodes -> /some/path'
+ {{< /code-toggle >}}
+
+vendorClosest
+: (`bool`) Whether to pick the vendored module closest to the module using it. The default behavior is to pick the first. Note that there can still be only one dependency of a given module path, so once it is in use it cannot be redefined. Default is `false`.
+
+workspace
+: (`string`) The Go workspace file to use, either as an absolute path or a path relative to the current working directory. Enabling this activates Go workspace mode and requires Go 1.18 or later. The default is `off`.
+
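+For example, a sketch that routes module downloads through a proxy while keeping a hypothetical private organization's modules out of it:
+
+{{< code-toggle file=hugo >}}
+[module]
+proxy = 'https://proxy.example.org'
+private = 'github.com/my-org/*'
+noProxy = 'github.com/my-org/*'
+{{< /code-toggle >}}
+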
+You may also use environment variables to set any of the above. For example:
+
+```sh
+export HUGO_MODULE_PROXY="https://proxy.example.org"
+export HUGO_MODULE_REPLACEMENTS="github.com/bep/my-theme -> ../.."
+export HUGO_MODULE_WORKSPACE="/my/hugo.work"
+```
+
+{{% include "/_common/gomodules-info.md" %}}
+
+## Hugo version
+
+You can specify a required Hugo version for your module in the `module` section. Users will then receive a warning if their Hugo version is incompatible.
+
+This is the default configuration:
+
+{{< code-toggle config=module.hugoVersion />}}
+
+You can omit any of the settings above.
+
+extended
+: (`bool`) Whether the extended edition of Hugo is required, satisfied by installing either the extended or extended/deploy edition.
+
+max
+: (`string`) The maximum Hugo version supported, for example `0.143.0`.
+
+min
+: (`string`) The minimum Hugo version supported, for example `0.123.0`.
+
+[`themesDir`]: /configuration/all/#themesdir
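+
+For example, a module that supports the standard edition and requires Hugo 0.123.0 or later might declare (illustrative values):
+
+{{< code-toggle file=hugo >}}
+[module.hugoVersion]
+extended = false
+min = '0.123.0'
+{{< /code-toggle >}}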
+
+## Imports
+
+{{< code-toggle file=hugo >}}
+[[module.imports]]
+disable = false
+ignoreConfig = false
+ignoreImports = false
+path = "github.com/gohugoio/hugoTestModules1_linux/modh1_2_1v"
+[[module.imports]]
+path = "my-shortcodes"
+{{< /code-toggle >}}
+
+disable
+: (`bool`) Whether to disable the module but keep version information in the `go.*` files. Default is `false`.
+
+ignoreConfig
+: (`bool`) Whether to ignore module configuration files, for example, `hugo.toml`. This will also prevent loading of any transitive module dependencies. Default is `false`.
+
+ignoreImports
+: (`bool`) Whether to ignore module imports. Default is `false`.
+
+noMounts
+: (`bool`) Whether to disable directory mounting for this import. Default is `false`.
+
+noVendor
+: (`bool`) Whether to disable vendoring for this import. This setting is restricted to the main project. Default is `false`.
+
+path
+: (`string`) The module path, either a valid Go module path (e.g., `github.com/gohugoio/myShortcodes`) or the directory name if stored in the [`themesDir`].
+
+[`themesDir`]: /configuration/all/#themesdir
+
+{{% include "/_common/gomodules-info.md" %}}
+
+## Mounts
+
+Before Hugo v0.56.0, custom component paths could only be configured by setting [`archetypeDir`], [`assetDir`], [`contentDir`], [`dataDir`], [`i18nDir`], [`layoutDir`], or [`staticDir`] in the site configuration. Module mounts offer greater flexibility than these legacy settings, but you cannot use both.
+
+[`archetypeDir`]: /configuration/all/
+[`assetDir`]: /configuration/all/
+[`contentDir`]: /configuration/all/
+[`dataDir`]: /configuration/all/
+[`i18nDir`]: /configuration/all/
+[`layoutDir`]: /configuration/all/
+[`staticDir`]: /configuration/all/
+
+> [!note]
+> If you use module mounts do not use the legacy settings.
+
+### Default mounts
+
+> [!note]
+> Adding a new mount to a target root will cause the existing default mount for that root to be ignored. If you still need the default mount, you must explicitly add it along with the new mount.
+
+These are the default mounts:
+
+{{< code-toggle config=module.mounts />}}
+
+source
+: (`string`) The source directory of the mount. For the main project, this can be either project-relative or absolute. For other modules it must be project-relative.
+
+target
+: (`string`) Where the mount will reside within Hugo's virtual file system. It must begin with one of Hugo's component directories: `archetypes`, `assets`, `content`, `data`, `i18n`, `layouts`, or `static`. For example, `content/blog`.
+
+disableWatch
+: {{< new-in 0.128.0 />}}
+: (`bool`) Whether to disable watching in watch mode for this mount. Default is `false`.
+
+lang
+: (`string`) The language code, e.g. "en". Relevant for `content` mounts, and `static` mounts when in multihost mode.
+
+includeFiles
+: (`string` or `[]string`) One or more [glob](g) patterns matching files or directories to include. If `excludeFiles` is not set, the files matching `includeFiles` will be the files mounted.
+
+ The glob patterns are matched against file names relative to the source root. Use Unix-style forward slashes (`/`), even on Windows. A single forward slash (`/`) matches the mount root, and double asterisks (`**`) act as a recursive wildcard, matching all directories and files beneath a given point (e.g., `/posts/**.jpg`). The search is case-insensitive.
+
+excludeFiles
+: (`string` or `[]string`) One or more [glob](g) patterns matching files to exclude.
+
+### Example
+
+{{< code-toggle file=hugo >}}
+[module]
+[[module.mounts]]
+ source="content"
+ target="content"
+ excludeFiles="docs/*"
+[[module.mounts]]
+ source="node_modules"
+ target="assets"
+[[module.mounts]]
+ source="assets"
+ target="assets"
+{{< /code-toggle >}}
diff --git a/docs/content/en/configuration/output-formats.md b/docs/content/en/configuration/output-formats.md
new file mode 100644
index 000000000..2627c6df4
--- /dev/null
+++ b/docs/content/en/configuration/output-formats.md
@@ -0,0 +1,209 @@
+---
+title: Configure output formats
+linkTitle: Output formats
+description: Configure output formats.
+categories: []
+keywords: []
+---
+
+{{% glossary-term "output format" %}}
+
+You can output a page in as many formats as you want. Define any number of output formats, provided each resolves to a unique file system path.
+
+This is the default output format configuration in tabular form:
+
+{{< datatable
+ "config"
+ "outputFormats"
+ "_key"
+ "mediaType"
+ "weight"
+ "baseName"
+ "isHTML"
+ "isPlainText"
+ "noUgly"
+ "notAlternative"
+ "path"
+ "permalinkable"
+ "protocol"
+ "rel"
+ "root"
+ "ugly"
+>}}
+
+## Default configuration
+
+The following is the default configuration that matches the table above:
+
+{{< code-toggle config=outputFormats />}}
+
+baseName
+: (`string`) The base name of the published file. Default is `index`.
+
+isHTML
+: (`bool`) Whether to classify the output format as HTML. Hugo uses this value to determine when to create alias redirects and when to inject the LiveReload script. Default is `false`.
+
+isPlainText
+: (`bool`) Whether to parse templates for this output format with Go's [text/template] package instead of the [html/template] package. Default is `false`.
+
+mediaType
+: (`string`) The [media type](g) of the published file. This must match one of the [configured media types].
+
+notAlternative
+: (`bool`) Whether to exclude this output format from the values returned by the [`AlternativeOutputFormats`] method on a `Page` object. Default is `false`.
+
+noUgly
+: (`bool`) Whether to disable ugly URLs for this output format when [`uglyURLs`] are enabled in your site configuration. Default is `false`.
+
+path
+: (`string`) The published file's directory path, relative to the root of the publish directory. If not specified, the file will be published using its content path.
+
+permalinkable
+: (`bool`) Whether to return the rendering output format rather than the main output format when invoking the [`Permalink`] and [`RelPermalink`] methods on a `Page` object. See [details](#link-to-output-formats). Enabled by default for the `html` and `amp` output formats. Default is `false`.
+
+protocol
+: (`string`) The protocol (scheme) of the URL for this output format. For example, `https://` or `webcal://`. Default is the scheme of the [`baseURL`] parameter in your site configuration, typically `https://`.
+
+rel
+: (`string`) If provided, you can assign this value to `rel` attributes in `link` elements when iterating over output formats in your templates. Default is `alternate`.
+
+root
+: (`bool`) Whether to publish files to the root of the publish directory. Default is `false`.
+
+ugly
+: (`bool`) Whether to enable uglyURLs for this output format when `uglyURLs` is `false` in your site configuration. Default is `false`.
+
+weight
+: (`int`) When set to a non-zero value, Hugo uses the `weight` as the first criteria when sorting output formats, falling back to the name of the output format. Lighter items float to the top, while heavier items sink to the bottom. Hugo renders output formats sequentially based on the sort order. Default is `0`, except for the `html` output format, which has a default weight of `10`.
+
+## Modify an output format
+
+You can modify any of the default output formats. For example, to prioritize `json` rendering over `html` rendering, when both are generated, adjust the [`weight`](#weight):
+
+{{< code-toggle file=hugo >}}
+[outputFormats.json]
+weight = 1
+[outputFormats.html]
+weight = 2
+{{< /code-toggle >}}
+
+The example above shows that when you modify a default output format, you only need to define the properties that differ from their default values.
+
+## Create an output format
+
+You can create new output formats as needed. For example, you may wish to create an output format to support Atom feeds.
+
+### Step 1
+
+Output formats require a specified media type. Because Atom feeds use `application/atom+xml`, which is not one of the [default media types], you must create it first.
+
+{{< code-toggle file=hugo >}}
+[mediaTypes.'application/atom+xml']
+suffixes = ['atom']
+{{< /code-toggle >}}
+
+See [configure media types] for more information.
+
+### Step 2
+
+Create a new output format:
+
+{{< code-toggle file=hugo >}}
+[outputFormats.atom]
+mediaType = 'application/atom+xml'
+noUgly = true
+{{< /code-toggle >}}
+
+Note that we use the default settings for all other output format properties.
+
+### Step 3
+
+Specify the page [kinds](g) for which to render this output format:
+
+{{< code-toggle file=hugo >}}
+[outputs]
+home = ['html', 'rss', 'atom']
+section = ['html', 'rss', 'atom']
+taxonomy = ['html', 'rss', 'atom']
+term = ['html', 'rss', 'atom']
+{{< /code-toggle >}}
+
+See [configure outputs] for more information.
+
+### Step 4
+
+Create a template to render the output format. Since Atom feeds are lists, you need to create a list template. Consult the [template lookup order] to find the correct template path:
+
+```text
+layouts/_default/list.atom.atom
+```
+
+We leave writing the template code as an exercise for you. Aim for a result similar to the [embedded RSS template].
+
+## List output formats
+
+To access output formats, each `Page` object provides two methods: [`OutputFormats`] (for all formats, including the current one) and [`AlternativeOutputFormats`]. Use `AlternativeOutputFormats` to create a link `rel` list within your site's `head` element, as shown below:
+
+```go-html-template
+{{ range .AlternativeOutputFormats }}
+  <link rel="{{ .Rel }}" type="{{ .MediaType.Type }}" href="{{ .Permalink | safeURL }}">
+{{ end }}
+```
+
+## Link to output formats
+
+By default, a `Page` object's [`Permalink`] and [`RelPermalink`] methods return the URL of the [primary output format](g), typically `html`. This behavior remains consistent regardless of the template used.
+
+For example, in `single.json.json`, you'll see:
+
+```go-html-template
+{{ .RelPermalink }} → /that-page/
+{{ with .OutputFormats.Get "json" }}
+ {{ .RelPermalink }} → /that-page/index.json
+{{ end }}
+```
+
+To make these methods return the URL of the _current_ template's output format, you must set the [`permalinkable`] setting to `true` for that format.
+
+With `permalinkable` set to true for `json` in the same `single.json.json` template:
+
+```go-html-template
+{{ .RelPermalink }} → /that-page/index.json
+{{ with .OutputFormats.Get "html" }}
+ {{ .RelPermalink }} → /that-page/
+{{ end }}
+```
+
+## Template lookup order
+
+Each output format requires a template conforming to the [template lookup order].
+
+For the highest specificity in the template lookup order, include the page kind, output format, and suffix in the file name:
+
+```text
+[page kind].[output format].[suffix]
+```
+
+For example, for section pages:
+
+Output format|Template path
+:--|:--
+`html`|`layouts/_default/section.html.html`
+`json`|`layouts/_default/section.json.json`
+`rss`|`layouts/_default/section.rss.xml`
+
+[`AlternativeOutputFormats`]: /methods/page/alternativeoutputformats/
+[`OutputFormats`]: /methods/page/outputformats/
+[`Permalink`]: /methods/page/permalink/
+[`RelPermalink`]: /methods/page/relpermalink/
+[`baseURL`]: /configuration/all/#baseurl
+[`permalinkable`]: #permalinkable
+[`uglyURLs`]: /configuration/ugly-urls/
+[configure media types]: /configuration/media-types/
+[configure outputs]: /configuration/outputs/
+[configured media types]: /configuration/media-types/
+[default media types]: /configuration/media-types/
+[embedded RSS template]: {{% eturl rss %}}
+[html/template]: https://pkg.go.dev/html/template
+[template lookup order]: /templates/lookup-order/
+[text/template]: https://pkg.go.dev/text/template
diff --git a/docs/content/en/configuration/outputs.md b/docs/content/en/configuration/outputs.md
new file mode 100644
index 000000000..9a83cb6e9
--- /dev/null
+++ b/docs/content/en/configuration/outputs.md
@@ -0,0 +1,49 @@
+---
+title: Configure outputs
+linkTitle: Outputs
+description: Configure which output formats to render for each page kind.
+categories: []
+keywords: []
+---
+
+{{% glossary-term "output format" %}}
+
+Learn more about creating and configuring output formats in the [configure output formats] section.
+
+## Outputs per page kind
+
+The following default configuration determines the output formats generated for each page kind:
+
+{{< code-toggle config=outputs />}}
+
+To render the built-in `json` output format for the `home` page kind, assuming you've already created the necessary template, add the following to your configuration:
+
+{{< code-toggle file=hugo >}}
+[outputs]
+home = ['html','rss','json']
+{{< /code-toggle >}}
+
+Notice in this example that we only specified the `home` page kind. You don't need to include entries for other page kinds unless you intend to modify their default output formats.
+
+> [!note]
+> The order of the output formats in the arrays above is important. The first element will be the _primary output format_ for that page kind, and in most cases that should be `html` as shown in the default configuration.
+>
+> The primary output format for a given page kind determines the value returned by the [`Permalink`] and [`RelPermalink`] methods on a `Page` object.
+>
+> See the [link to output formats] section for details.
+
+## Outputs per page
+
+Add output formats to a page's rendering using the `outputs` field in its front matter. For example, to include `json` in the output formats rendered for a specific page:
+
+{{< code-toggle file=content/example.md fm=true >}}
+title = 'Example'
+outputs = ['json']
+{{< /code-toggle >}}
+
+In its default configuration, Hugo will render both the `html` and `json` output formats for this page. The `outputs` field appends to, rather than replaces, the site's configured outputs.
+
+[`Permalink`]: /methods/page/permalink/
+[`RelPermalink`]: /methods/page/relpermalink/
+[configure output formats]: /configuration/output-formats/
+[link to output formats]: /configuration/output-formats/#link-to-output-formats
diff --git a/docs/content/en/configuration/page.md b/docs/content/en/configuration/page.md
new file mode 100644
index 000000000..81169e546
--- /dev/null
+++ b/docs/content/en/configuration/page.md
@@ -0,0 +1,34 @@
+---
+title: Configure page
+linkTitle: Page
+description: Configure page behavior.
+categories: []
+keywords: []
+---
+
+{{< new-in 0.133.0 />}}
+
+{{% glossary-term "default sort order" %}}
+
+Hugo uses the default sort order to determine the _next_ and _previous_ page relative to the current page when calling these methods on a `Page` object:
+
+- [`Next`](/methods/page/next/) and [`Prev`](/methods/page/prev/)
+- [`NextInSection`](/methods/page/nextinsection/) and [`PrevInSection`](/methods/page/previnsection/)
+
+This behavior is based on the following default site configuration:
+
+{{< code-toggle config=page />}}
+
+To reverse the meaning of _next_ and _previous_:
+
+{{< code-toggle file=hugo >}}
+[page]
+ nextPrevInSectionSortOrder = 'asc'
+ nextPrevSortOrder = 'asc'
+{{< /code-toggle >}}
+
+> [!note]
+> These settings do not apply to the [`Next`] or [`Prev`] methods on a `Pages` object.
+
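+A minimal sketch of how these methods are typically used in a single-page template; the anchor markup is illustrative, not part of the configuration:
+
+```go-html-template
+{{ with .PrevInSection }}
+  <a href="{{ .RelPermalink }}">Previous in section</a>
+{{ end }}
+{{ with .NextInSection }}
+  <a href="{{ .RelPermalink }}">Next in section</a>
+{{ end }}
+```
+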
+[`Next`]: /methods/pages/next/
+[`Prev`]: /methods/pages/prev/
diff --git a/docs/content/en/configuration/pagination.md b/docs/content/en/configuration/pagination.md
new file mode 100644
index 000000000..66b3b8cf4
--- /dev/null
+++ b/docs/content/en/configuration/pagination.md
@@ -0,0 +1,45 @@
+---
+title: Configure pagination
+linkTitle: Pagination
+description: Configure pagination.
+categories: []
+keywords: []
+---
+
+This is the default configuration:
+
+{{< code-toggle config=pagination />}}
+
+disableAliases
+: (`bool`) Whether to disable alias generation for the first pager. Default is `false`.
+
+pagerSize
+: (`int`) The number of pages per pager. Default is `10`.
+
+path
+: (`string`) The segment of each pager URL indicating that the target page is a pager. Default is `page`.
+
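+For example, to display 20 pages per pager across the entire site (an illustrative value):
+
+{{< code-toggle file=hugo >}}
+[pagination]
+pagerSize = 20
+{{< /code-toggle >}}
+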
+With multilingual sites you can define the pagination behavior for each language:
+
+{{< code-toggle file=hugo >}}
+[languages.en]
+contentDir = 'content/en'
+languageCode = 'en-US'
+languageDirection = 'ltr'
+languageName = 'English'
+weight = 1
+[languages.en.pagination]
+disableAliases = true
+pagerSize = 10
+path = 'page'
+[languages.de]
+contentDir = 'content/de'
+languageCode = 'de-DE'
+languageDirection = 'ltr'
+languageName = 'Deutsch'
+weight = 2
+[languages.de.pagination]
+disableAliases = true
+pagerSize = 20
+path = 'blatt'
+{{< /code-toggle >}}
diff --git a/docs/content/en/configuration/params.md b/docs/content/en/configuration/params.md
new file mode 100644
index 000000000..239b0c2da
--- /dev/null
+++ b/docs/content/en/configuration/params.md
@@ -0,0 +1,100 @@
+---
+title: Configure params
+linkTitle: Params
+description: Create custom site parameters.
+categories: []
+keywords: []
+---
+
+Use the `params` key for custom parameters:
+
+{{< code-toggle file=hugo >}}
+baseURL = 'https://example.org/'
+title = 'Project Documentation'
+languageCode = 'en-US'
+[params]
+subtitle = 'Reference, Tutorials, and Explanations'
+[params.contact]
+email = 'info@example.org'
+phone = '+1 206-555-1212'
+{{< /code-toggle >}}
+
+Access the custom parameters from your templates using the [`Params`] method on a `Site` object:
+
+[`Params`]: /methods/site/params/
+
+```go-html-template
+{{ .Site.Params.subtitle }} → Reference, Tutorials, and Explanations
+{{ .Site.Params.contact.email }} → info@example.org
+```
+
+Key names should use camelCase or snake_case. While TOML, YAML, and JSON allow kebab-case keys, they are not valid [identifiers](g) and cannot be used when [chaining](g) identifiers.
+
+For example, you can do either of these:
+
+```go-html-template
+{{ .Site.Params.camelCase.foo }}
+{{ .Site.Params.snake_case.foo }}
+```
+
+But you cannot do this:
+
+```go-html-template
+{{ .Site.Params.kebab-case.foo }}
+```
+
+## Multilingual sites
+
+For multilingual sites, create a `params` key under each language:
+
+{{< code-toggle file=hugo >}}
+baseURL = 'https://example.org/'
+defaultContentLanguage = 'en'
+
+[languages.de]
+languageCode = 'de-DE'
+languageDirection = 'ltr'
+languageName = 'Deutsch'
+title = 'Projekt Dokumentation'
+weight = 1
+
+[languages.de.params]
+subtitle = 'Referenz, Tutorials und Erklärungen'
+
+[languages.de.params.contact]
+email = 'info@de.example.org'
+phone = '+49 30 1234567'
+
+[languages.en]
+languageCode = 'en-US'
+languageDirection = 'ltr'
+languageName = 'English'
+title = 'Project Documentation'
+weight = 2
+
+[languages.en.params]
+subtitle = 'Reference, Tutorials, and Explanations'
+
+[languages.en.params.contact]
+email = 'info@example.org'
+phone = '+1 206-555-1212'
+{{< /code-toggle >}}
+
+## Namespacing
+
+To prevent naming conflicts, module and theme developers should namespace any custom parameters specific to their module or theme.
+
+{{< code-toggle file=hugo >}}
+[params.modules.myModule.colors]
+background = '#efefef'
+font = '#222222'
+{{< /code-toggle >}}
+
+To access the module/theme settings:
+
+```go-html-template
+{{ $cfg := .Site.Params.modules.myModule }}
+
+{{ $cfg.colors.background }} → #efefef
+{{ $cfg.colors.font }} → #222222
+```
diff --git a/docs/content/en/configuration/permalinks.md b/docs/content/en/configuration/permalinks.md
new file mode 100644
index 000000000..0810624a6
--- /dev/null
+++ b/docs/content/en/configuration/permalinks.md
@@ -0,0 +1,162 @@
+---
+title: Configure permalinks
+linkTitle: Permalinks
+description: Configure permalinks.
+categories: []
+keywords: []
+---
+
+This is the default configuration:
+
+{{< code-toggle config=permalinks />}}
+
+Define a URL pattern for each top-level section. Each URL pattern can target a given language and/or page kind.
+
+> [!note]
+> The [`url`] front matter field overrides any matching permalink pattern.
+
+## Monolingual example
+
+With this content structure:
+
+```text
+content/
+├── posts/
+│ ├── bash-in-slow-motion.md
+│ └── tls-in-a-nutshell.md
+├── tutorials/
+│ ├── git-for-beginners.md
+│ └── javascript-bundling-with-hugo.md
+└── _index.md
+```
+
+Render tutorials under "training", and render the posts under "articles" with a date-based hierarchy:
+
+{{< code-toggle file=hugo >}}
+[permalinks.page]
+posts = '/articles/:year/:month/:slug/'
+tutorials = '/training/:slug/'
+[permalinks.section]
+posts = '/articles/'
+tutorials = '/training/'
+{{< /code-toggle >}}
+
+The structure of the published site will be:
+
+```text
+public/
+├── articles/
+│ ├── 2023/
+│ │ ├── 04/
+│ │ │ └── bash-in-slow-motion/
+│ │ │ └── index.html
+│ │ └── 06/
+│ │ └── tls-in-a-nutshell/
+│ │ └── index.html
+│ └── index.html
+├── training/
+│ ├── git-for-beginners/
+│ │ └── index.html
+│ ├── javascript-bundling-with-hugo/
+│ │ └── index.html
+│ └── index.html
+└── index.html
+```
+
+To create a date-based hierarchy for regular pages in the content root:
+
+{{< code-toggle file=hugo >}}
+[permalinks.page]
+"/" = "/:year/:month/:slug/"
+{{< /code-toggle >}}
+
+Use the same approach with taxonomy terms. For example, to omit the taxonomy segment of the URL:
+
+{{< code-toggle file=hugo >}}
+[permalinks.term]
+'tags' = '/:slug/'
+{{< /code-toggle >}}
+
+## Multilingual example
+
+Use the `permalinks` configuration as a component of your localization strategy.
+
+With this content structure:
+
+```text
+content/
+├── en/
+│ ├── books/
+│ │ ├── les-miserables.md
+│ │ └── the-hunchback-of-notre-dame.md
+│ └── _index.md
+└── es/
+ ├── books/
+ │ ├── les-miserables.md
+ │ └── the-hunchback-of-notre-dame.md
+ └── _index.md
+```
+
+And this site configuration:
+
+{{< code-toggle file=hugo >}}
+defaultContentLanguage = 'en'
+defaultContentLanguageInSubdir = true
+
+[languages.en]
+contentDir = 'content/en'
+languageCode = 'en-US'
+languageDirection = 'ltr'
+languageName = 'English'
+weight = 1
+
+[languages.en.permalinks.page]
+books = "/books/:slug/"
+
+[languages.en.permalinks.section]
+books = "/books/"
+
+[languages.es]
+contentDir = 'content/es'
+languageCode = 'es-ES'
+languageDirection = 'ltr'
+languageName = 'Español'
+weight = 2
+
+[languages.es.permalinks.page]
+books = "/libros/:slug/"
+
+[languages.es.permalinks.section]
+books = "/libros/"
+{{< /code-toggle >}}
+
+The structure of the published site will be:
+
+```text
+public/
+├── en/
+│ ├── books/
+│ │ ├── les-miserables/
+│ │ │ └── index.html
+│ │ ├── the-hunchback-of-notre-dame/
+│ │ │ └── index.html
+│ │ └── index.html
+│ └── index.html
+├── es/
+│ ├── libros/
+│ │ ├── les-miserables/
+│ │ │ └── index.html
+│ │ ├── the-hunchback-of-notre-dame/
+│ │ │ └── index.html
+│ │ └── index.html
+│ └── index.html
+└── index.html
+```
+
+## Tokens
+
+Use these tokens when defining a URL pattern.
+
+{{% include "/_common/permalink-tokens.md" %}}
+
+[`url`]: /content-management/front-matter/#url
diff --git a/docs/content/en/configuration/privacy.md b/docs/content/en/configuration/privacy.md
new file mode 100644
index 000000000..c94f2c1c3
--- /dev/null
+++ b/docs/content/en/configuration/privacy.md
@@ -0,0 +1,43 @@
+---
+title: Configure privacy
+linkTitle: Privacy
+description: Configure your site to help comply with regional privacy regulations.
+categories: []
+keywords: []
+aliases: [/about/privacy/]
+---
+
+## Responsibility
+
+Site authors are responsible for ensuring compliance with regional privacy regulations, including but not limited to:
+
+- GDPR (General Data Protection Regulation): Applies to individuals within the European Union and the European Economic Area.
+- CCPA (California Consumer Privacy Act): Applies to California residents.
+- CPRA (California Privacy Rights Act): Expands upon the CCPA with stronger consumer privacy protections.
+- Virginia Consumer Data Protection Act (CDPA): Applies to businesses that collect, process, or sell the personal data of Virginia residents.
+
+Hugo's privacy settings can assist in compliance efforts.
+
+## Embedded templates
+
+Hugo provides [embedded templates](g) to simplify site and content creation. Some of these templates interact with external services. For example, the `youtube` shortcode connects with YouTube's servers to embed videos on your site.
+
+Some of these templates include settings to enhance privacy.
+
+## Configuration
+
+> [!note]
+> These settings affect the behavior of some of Hugo's embedded templates. These settings may or may not affect the behavior of templates provided by third parties in their modules or themes.
+
+These are the default privacy settings for Hugo's embedded templates:
+
+{{< code-toggle config=privacy />}}
+
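+For example, to enable privacy-enhanced mode for the embedded `youtube` shortcode (one of the settings shown above):
+
+{{< code-toggle file=hugo >}}
+[privacy.youtube]
+privacyEnhanced = true
+{{< /code-toggle >}}
+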
+See each template's documentation for a description of its privacy settings:
+
+- [Disqus partial](/templates/embedded/#privacy-disqus)
+- [Google Analytics partial](/templates/embedded/#privacy-google-analytics)
+- [Instagram shortcode](/shortcodes/instagram/#privacy)
+- [Vimeo shortcode](/shortcodes/vimeo/#privacy)
+- [X shortcode](/shortcodes/x/#privacy)
+- [YouTube shortcode](/shortcodes/youtube/#privacy)
diff --git a/docs/content/en/configuration/related-content.md b/docs/content/en/configuration/related-content.md
new file mode 100644
index 000000000..c6e182fae
--- /dev/null
+++ b/docs/content/en/configuration/related-content.md
@@ -0,0 +1,111 @@
+---
+title: Configure related content
+linkTitle: Related content
+description: Configure related content.
+categories: []
+keywords: []
+---
+
+> [!note]
+> To understand Hugo's related content identification, please refer to the [related content] page.
+
+Hugo provides a sensible default configuration for identifying related content, but you can customize it in your site configuration, either globally or per language.
+
+## Default configuration
+
+This is the default configuration:
+
+{{< code-toggle config=related />}}
+
+> [!note]
+> Adding a `related` section to your site configuration requires you to provide a full configuration. You cannot override individual default values without specifying all related settings.
+
+## Top-level options
+
+threshold
+: (`int`) A value between 0 and 100, inclusive. A lower value returns more matches, though they may be less relevant.
+
+includeNewer
+: (`bool`) Whether to include pages newer than the current page in the related content listing. This will mean that the output for older posts may change as new related content gets added. Default is `false`.
+
+toLower
+: (`bool`) Whether to transform keywords in both the indexes and the queries to lower case. This may give more accurate results at a slight performance penalty. Default is `false`.
+
+## Per-index options
+
+name
+: (`string`) The index name. This value maps directly to a page parameter. Hugo supports string values (`author` in the example), lists (`tags`, `keywords`, etc.), and time and date objects.
+
+type
+: (`string`) One of `basic` or `fragments`. Default is `basic`.
+
+applyFilter
+: (`string`) Apply a `type`-specific filter to the result of a search. This is currently only used for the `fragments` type.
+
+weight
+: (`int`) An integer weight that indicates how important this parameter is relative to the other parameters. It can be `0`, which has the effect of turning this index off, or even negative. Test with different values to see what fits your content best. Default is `0`.
+
+cardinalityThreshold
+: (`int`) If between 1 and 100, this is a percentage. All keywords that are used in more than this percentage of documents are removed. For example, setting this to `60` will remove all keywords that are used in more than 60% of the documents in the index. If `0`, no keyword is removed from the index. Default is `0`.
+
+pattern
+: (`string`) This is currently only relevant for dates. When listing related content, we may want to list content that is also close in time. Setting "2006" (default value for date indexes) as the pattern for a date index will add weight to pages published in the same year. For busier blogs, "200601" (year and month) may be a better default.
+
+toLower
+: (`bool`) Whether to transform keywords in both the indexes and the queries to lower case. This may give more accurate results at a slight performance penalty. Default is `false`.
+
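+For example, to also give weight to pages published in the same year and month, you might add a date index like this (hypothetical weight):
+
+{{< code-toggle file=hugo >}}
+[[related.indices]]
+name = 'date'
+pattern = '200601'
+weight = 10
+{{< /code-toggle >}}
+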
+## Example
+
+Imagine we're building a book review site. Our main content will be book reviews, and we'll use genres and authors as taxonomies. When someone views a book review, we want to show a short list of related reviews based on shared authors and genres.
+
+Create the content:
+
+```text
+content/
+└── book-reviews/
+ ├── book-review-1.md
+ ├── book-review-2.md
+ ├── book-review-3.md
+ ├── book-review-4.md
+ └── book-review-5.md
+```
+
+Configure the taxonomies:
+
+{{< code-toggle file=hugo >}}
+[taxonomies]
+author = 'authors'
+genre = 'genres'
+{{< /code-toggle >}}
+
+Configure the related content identification:
+
+{{< code-toggle file=hugo >}}
+[related]
+includeNewer = true
+threshold = 80
+toLower = true
+[[related.indices]]
+name = 'authors'
+weight = 2
+[[related.indices]]
+name = 'genres'
+weight = 1
+{{< /code-toggle >}}
+
+We've configured the `authors` index with a weight of `2` and the `genres` index with a weight of `1`. This means Hugo treats shared `authors` as twice as significant as shared `genres`.
+
+Then render a list of 5 related reviews with a partial template like this:
+
+```go-html-template {file="layouts/partials/related.html" copy=true}
+{{ with site.RegularPages.Related . | first 5 }}
+  <p>Related content:</p>
+  <ul>
+    {{ range . }}
+      <li><a href="{{ .RelPermalink }}">{{ .LinkTitle }}</a></li>
+    {{ end }}
+  </ul>
+{{ end }}
+```
+
+[related content]: /content-management/related-content/
diff --git a/docs/content/en/configuration/security.md b/docs/content/en/configuration/security.md
new file mode 100644
index 000000000..f950dd233
--- /dev/null
+++ b/docs/content/en/configuration/security.md
@@ -0,0 +1,50 @@
+---
+title: Configure security
+linkTitle: Security
+description: Configure security.
+categories: []
+keywords: []
+---
+
+Hugo's built-in security policy, which restricts access to `os/exec`, remote communication, and similar operations, is configured via allow lists. By default, access is restricted. If a build attempts to use a feature not included in the allow list, it will fail, providing a detailed message.
+
+This is the default security configuration:
+
+{{< code-toggle config=security />}}
+
+enableInlineShortcodes
+: (`bool`) Whether to enable [inline shortcodes]. Default is `false`.
+
+exec.allow
+: (`[]string`) A slice of [regular expressions](g) matching the names of external executables that Hugo is allowed to run.
+
+exec.osEnv
+: (`[]string`) A slice of [regular expressions](g) matching the names of operating system environment variables that Hugo is allowed to access.
+
+funcs.getenv
+: (`[]string`) A slice of [regular expressions](g) matching the names of operating system environment variables that Hugo is allowed to access with the [`os.Getenv`] function.
+
+http.methods
+: (`[]string`) A slice of [regular expressions](g) matching the HTTP methods that the [`resources.GetRemote`] function is allowed to use.
+
+http.mediaTypes
+: (`[]string`) Applicable to the `resources.GetRemote` function, a slice of [regular expressions](g) matching the `Content-Type` in HTTP responses that Hugo trusts, bypassing file content analysis for media type detection.
+
+http.urls
+: (`[]string`) A slice of [regular expressions](g) matching the URLs that the `resources.GetRemote` function is allowed to access.
+
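+For example, to allow `resources.GetRemote` to fetch only from a single, hypothetical domain:
+
+{{< code-toggle file=hugo >}}
+[security.http]
+urls = ['^https://example\.org/']
+{{< /code-toggle >}}
+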
+> [!note]
+> Setting an allow list to the string `none` will completely disable the associated feature.
+
+You can also override the site configuration with environment variables. For example, to block `resources.GetRemote` from accessing any URL:
+
+```sh
+export HUGO_SECURITY_HTTP_URLS=none
+```
+
+Learn more about [using environment variables] to configure your site.
+
+[`os.Getenv`]: /functions/os/getenv
+[`resources.GetRemote`]: /functions/resources/getremote
+[inline shortcodes]: /content-management/shortcodes/#inline
+[using environment variables]: /configuration/introduction/#environment-variables
diff --git a/docs/content/en/configuration/segments.md b/docs/content/en/configuration/segments.md
new file mode 100644
index 000000000..0c4098770
--- /dev/null
+++ b/docs/content/en/configuration/segments.md
@@ -0,0 +1,77 @@
+---
+title: Configure segments
+linkTitle: Segments
+description: Configure your site for segmented rendering.
+categories: []
+keywords: []
+---
+
+{{< new-in 0.124.0 />}}
+
+> [!note]
+> The `segments` configuration applies only to segmented rendering. While it controls when content is rendered, it doesn't restrict access to Hugo's complete object graph (sites and pages), which remains fully available.
+
+Segmented rendering offers several advantages:
+
+- Faster builds: Process large sites more efficiently.
+- Rapid development: Render only a subset of your site for quicker iteration.
+- Scheduled rebuilds: Rebuild specific sections at different frequencies (e.g., home page and news hourly, full site weekly).
+- Targeted output: Generate specific output formats (like JSON for search indexes).
+
+## Segment definition
+
+Each segment is defined by include and exclude filters:
+
+- Filters: Each segment has zero or more exclude filters and zero or more include filters.
+- Matchers: Each filter contains one or more field [glob](g) matchers.
+- Logic: Matchers within a filter use AND logic. Filters within a section (include or exclude) use OR logic.
+
+## Filter fields
+
+Available fields for filtering:
+
+kind
+: (`string`) A [glob](g) pattern matching the [page kind](g). For example: `{taxonomy,term}`.
+
+lang
+: (`string`) A [glob](g) pattern matching the [page language]. For example: `{en,de}`.
+
+output
+: (`string`) A [glob](g) pattern matching the [output format](g) of the page. For example: `{html,json}`.
+
+path
+: (`string`) A [glob](g) pattern matching the page's [logical path](g). For example: `{/books,/books/**}`.
+
+## Example
+
+Place broad filters, such as those for language or output format, in the excludes section. For example:
+
+{{< code-toggle file=hugo >}}
+[segments.segment1]
+ [[segments.segment1.excludes]]
+ lang = "n*"
+ [[segments.segment1.excludes]]
+ lang = "en"
+ output = "rss"
+ [[segments.segment1.includes]]
+ kind = "{home,term,taxonomy}"
+ [[segments.segment1.includes]]
+ path = "{/docs,/docs/**}"
+{{< /code-toggle >}}
+
+## Rendering segments
+
+Render specific segments using the [`renderSegments`] configuration or the `--renderSegments` flag:
+
+```bash
+hugo --renderSegments segment1
+```
+
+You can configure multiple segments and use a comma-separated list with `--renderSegments` to render them all.
+
+```bash
+hugo --renderSegments segment1,segment2
+```
+
+[`renderSegments`]: /configuration/all/#rendersegments
+[page language]: /methods/page/language/
diff --git a/docs/content/en/configuration/server.md b/docs/content/en/configuration/server.md
new file mode 100644
index 000000000..92f0f0cfa
--- /dev/null
+++ b/docs/content/en/configuration/server.md
@@ -0,0 +1,128 @@
+---
+title: Configure server
+linkTitle: Server
+description: Configure the development server.
+categories: []
+keywords: []
+---
+
+These settings apply only to Hugo's development server, so the recommended approach is a dedicated [configuration directory] for development, where you configure the server accordingly.
+
+[configuration directory]: /configuration/introduction/#configuration-directory
+
+```text
+project/
+└── config/
+ ├── _default/
+ │ └── hugo.toml
+ └── development/
+ └── server.toml
+```
+
+## Default settings
+
+The development server defaults to redirecting to `/404.html` for any requests to URLs that don't exist. See the [404 errors](#404-errors) section below for details.
+
+{{< code-toggle config=server />}}
+
+force
+: (`bool`) Whether to force a redirect even if there is existing content in the path.
+
+from
+: (`string`) A [glob](g) pattern matching the requested URL. Either `from` or `fromRe` must be set. If both `from` and `fromRe` are specified, the URL must match both patterns.
+
+fromHeaders
+: {{< new-in 0.144.0 />}}
+: (`map[string]string`) Headers to match for the redirect. This maps the HTTP header name to a [glob](g) pattern with values to match. If the map is empty, the redirect will always be triggered.
+
+fromRe
+: {{< new-in 0.144.0 />}}
+: (`string`) A [regular expression](g) used to match the requested URL. Either `from` or `fromRe` must be set. If both `from` and `fromRe` are specified, the URL must match both patterns. Capture groups from the regular expression are accessible in the `to` field as `$1`, `$2`, and so on.
+
+status
+: (`string`) The HTTP status code to use for the redirect. A status code of 200 will trigger a URL rewrite.
+
+to
+: (`string`) The URL to forward the request to.
+
+## Headers
+
+Include headers in every server response to facilitate testing, particularly for features like [Content Security Policies].
+
+[Content Security Policies]: https://developer.mozilla.org/en-US/docs/Web/HTTP/CSP
+
+{{< code-toggle file=config/development/server >}}
+[[headers]]
+for = "/**"
+
+[headers.values]
+X-Frame-Options = "DENY"
+X-XSS-Protection = "1; mode=block"
+X-Content-Type-Options = "nosniff"
+Referrer-Policy = "strict-origin-when-cross-origin"
+Content-Security-Policy = "script-src localhost:1313"
+{{< /code-toggle >}}
+
+## Redirects
+
+You can define simple redirect rules.
+
+{{< code-toggle file=config/development/server >}}
+[[redirects]]
+from = "/myspa/**"
+to = "/myspa/"
+status = 200
+force = false
+{{< /code-toggle >}}
+
+The `200` status code in this example triggers a URL rewrite, which is typically the desired behavior for [single-page applications].
+
+[single-page applications]: https://en.wikipedia.org/wiki/Single-page_application
+
+## 404 errors
+
+The development server defaults to redirecting to `/404.html` for any requests to URLs that don't exist.
+
+{{< code-toggle config=server />}}
+
+If you've already defined other redirects, you must explicitly add the 404 redirect.
+
+{{< code-toggle file=config/development/server >}}
+[[redirects]]
+force = false
+from = "/**"
+to = "/404.html"
+status = 404
+{{< /code-toggle >}}
+
+For multilingual sites, ensure the default language 404 redirect is defined last:
+
+{{< code-toggle file=config/development/server >}}
+defaultContentLanguage = 'en'
+defaultContentLanguageInSubdir = false
+[[redirects]]
+from = '/fr/**'
+to = '/fr/404.html'
+status = 404
+
+[[redirects]] # Default language must be last.
+from = '/**'
+to = '/404.html'
+status = 404
+{{< /code-toggle >}}
+
+When the default language is served from a subdirectory:
+
+{{< code-toggle file=config/development/server >}}
+defaultContentLanguage = 'en'
+defaultContentLanguageInSubdir = true
+[[redirects]]
+from = '/fr/**'
+to = '/fr/404.html'
+status = 404
+
+[[redirects]] # Default language must be last.
+from = '/**'
+to = '/en/404.html'
+status = 404
+{{< /code-toggle >}}
diff --git a/docs/content/en/configuration/services.md b/docs/content/en/configuration/services.md
new file mode 100644
index 000000000..dbe3893a7
--- /dev/null
+++ b/docs/content/en/configuration/services.md
@@ -0,0 +1,52 @@
+---
+title: Configure services
+linkTitle: Services
+description: Configure embedded templates.
+categories: []
+keywords: []
+---
+
+Hugo provides [embedded templates](g) to simplify site and content creation. Some of these templates are configurable. For example, the embedded Google Analytics template requires a Google tag ID.
+
+This is the default configuration:
+
+{{< code-toggle config=services />}}
+
+disqus.shortname
+: (`string`) The `shortname` used with the Disqus commenting system. See [details](/templates/embedded/#disqus). To access this value from a template:
+
+ ```go-html-template
+ {{ .Site.Config.Services.Disqus.Shortname }}
+ ```
+
+googleAnalytics.id
+: (`string`) The Google tag ID for Google Analytics 4 properties. See [details](/templates/embedded/#google-analytics). To access this value from a template:
+
+ ```go-html-template
+ {{ .Site.Config.Services.GoogleAnalytics.ID }}
+ ```
+
+instagram.accessToken
+: (`string`) Do not use. Deprecated in [v0.123.0]. The embedded `instagram` shortcode no longer uses this setting.
+
+instagram.disableInlineCSS
+: (`bool`) Do not use. Deprecated in [v0.123.0]. The embedded `instagram` shortcode no longer uses this setting.
+
+rss.limit
+: (`int`) The maximum number of items to include in an RSS feed. Set to `-1` for no limit. Default is `-1`. See [details](/templates/rss/). To access this value from a template:
+
+ ```go-html-template
+ {{ .Site.Config.Services.RSS.Limit }}
+ ```
+
+twitter.disableInlineCSS
+: (`bool`) Do not use. Deprecated in [v0.141.0]. Use the `x` shortcode instead.
+
+x.disableInlineCSS
+: (`bool`) Whether to disable the inline CSS rendered by the embedded `x` shortcode. See [details](/shortcodes/x/#privacy). Default is `false`. To access this value from a template:
+
+ ```go-html-template
+ {{ .Site.Config.Services.X.DisableInlineCSS }}
+ ```
+
+[v0.141.0]: https://github.com/gohugoio/hugo/releases/tag/v0.141.0
+[v0.123.0]: https://github.com/gohugoio/hugo/releases/tag/v0.123.0
diff --git a/docs/content/en/configuration/sitemap.md b/docs/content/en/configuration/sitemap.md
new file mode 100644
index 000000000..bc972994c
--- /dev/null
+++ b/docs/content/en/configuration/sitemap.md
@@ -0,0 +1,24 @@
+---
+title: Configure sitemap
+linkTitle: Sitemap
+description: Configure the sitemap.
+categories: []
+keywords: []
+---
+
+These are the default sitemap configuration values. They apply to all pages unless overridden in front matter.
+
+{{< code-toggle config=sitemap />}}
+
+changefreq
+: (`string`) How frequently a page is likely to change. Valid values are `always`, `hourly`, `daily`, `weekly`, `monthly`, `yearly`, and `never`. With the default value of `""` Hugo will omit this field from the sitemap. See [details](https://www.sitemaps.org/protocol.html#changefreqdef).
+
+disable
+: {{< new-in 0.125.0 />}}
+: (`bool`) Whether to disable page inclusion. Default is `false`. Set to `true` in front matter to exclude the page.
+
+filename
+: (`string`) The name of the generated file. Default is `sitemap.xml`.
+
+priority
+: (`float`) The priority of a page relative to any other page on the site. Valid values range from 0.0 to 1.0. With the default value of `-1` Hugo will omit this field from the sitemap. See [details](https://www.sitemaps.org/protocol.html#prioritydef).
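+
+For example, to override these values for a specific page (illustrative values):
+
+{{< code-toggle file=content/example.md fm=true >}}
+title = 'Example'
+[sitemap]
+changefreq = 'monthly'
+priority = 0.5
+{{< /code-toggle >}}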
diff --git a/docs/content/en/configuration/taxonomies.md b/docs/content/en/configuration/taxonomies.md
new file mode 100644
index 000000000..4b5ba97a5
--- /dev/null
+++ b/docs/content/en/configuration/taxonomies.md
@@ -0,0 +1,68 @@
+---
+title: Configure taxonomies
+linkTitle: Taxonomies
+description: Configure taxonomies.
+categories: []
+keywords: []
+---
+
+The default configuration defines two [taxonomies](g), `categories` and `tags`.
+
+{{< code-toggle config=taxonomies />}}
+
+When creating a taxonomy:
+
+- Use the singular form for the key (e.g., `category`).
+- Use the plural form for the value (e.g., `categories`).
+
+Then use the value as the key in front matter:
+
+{{< code-toggle file=content/example.md fm=true >}}
+---
+title: Example
+categories:
+ - vegetarian
+ - gluten-free
+tags:
+ - appetizer
+ - main course
+{{< /code-toggle >}}
+
+If you do not expect to assign more than one [term](g) from a given taxonomy to a content page, you may use the singular form for both key and value:
+
+{{< code-toggle file=hugo >}}
+taxonomies:
+ author: author
+{{< /code-toggle >}}
+
+Then in front matter:
+
+{{< code-toggle file=content/example.md fm=true >}}
+---
+title: Example
+author:
+ - Robert Smith
+{{< /code-toggle >}}
+
+The example above illustrates that even with a single term, the value is still provided as an array.
+
+You must explicitly define the default taxonomies to maintain them when adding a new one:
+
+{{< code-toggle file=hugo >}}
+taxonomies:
+ author: author
+ category: categories
+ tag: tags
+{{< /code-toggle >}}
+
+To disable the taxonomy system, use the [`disableKinds`] setting in the root of your site configuration to disable the `taxonomy` and `term` page [kinds](g).
+
+{{< code-toggle file=hugo >}}
+disableKinds = ['taxonomy','term']
+{{< /code-toggle >}}
+
+[`disableKinds`]: /configuration/all/#disablekinds
+
+See the [taxonomies] section for more information.
+
+[taxonomies]: /content-management/taxonomies/
diff --git a/docs/content/en/configuration/ugly-urls.md b/docs/content/en/configuration/ugly-urls.md
new file mode 100644
index 000000000..ec1dd8a49
--- /dev/null
+++ b/docs/content/en/configuration/ugly-urls.md
@@ -0,0 +1,36 @@
+---
+title: Configure ugly URLs
+linkTitle: Ugly URLs
+description: Configure ugly URLs.
+categories: []
+keywords: []
+---
+
+{{% glossary-term "ugly url" %}} For example:
+
+```text
+https://example.org/section/article.html
+```
+
+In its default configuration, Hugo generates [pretty URLs](g). For example:
+
+```text
+https://example.org/section/article/
+```
+
+This is the default configuration:
+
+{{< code-toggle config=uglyURLs />}}
+
+To generate ugly URLs for the entire site:
+
+{{< code-toggle file=hugo >}}
+uglyURLs = true
+{{< /code-toggle >}}
+
+To generate ugly URLs for specific sections of your site:
+
+{{< code-toggle file=hugo >}}
+[uglyURLs]
+books = true
+films = false
+{{< /code-toggle >}}
diff --git a/docs/content/en/content-management/_index.md b/docs/content/en/content-management/_index.md
index 28f2ecf82..4e2060756 100644
--- a/docs/content/en/content-management/_index.md
+++ b/docs/content/en/content-management/_index.md
@@ -1,20 +1,8 @@
---
-title: Content Management
-linktitle: Content Management Overview
+title: Content management
description: Hugo makes managing large static sites easy with support for archetypes, content types, menus, cross references, summaries, and more.
-date: 2017-02-01
-publishdate: 2017-02-01
-lastmod: 2017-02-01
-menu:
- docs:
- parent: "content-management"
- weight: 1
-keywords: [source, organization]
-categories: [content management]
-weight: 01 #rem
-draft: false
+categories: []
+keywords: []
+weight: 10
aliases: [/content/,/content/organization]
-toc: false
---
-
-A static site generator needs to extend beyond front matter and a couple of templates to be both scalable and *manageable*. Hugo was designed with not only developers in mind, but also content managers and authors.
diff --git a/docs/content/en/content-management/archetypes.md b/docs/content/en/content-management/archetypes.md
index 354ef0fef..db0838504 100644
--- a/docs/content/en/content-management/archetypes.md
+++ b/docs/content/en/content-management/archetypes.md
@@ -1,97 +1,186 @@
---
title: Archetypes
-linktitle: Archetypes
-description: Archetypes are templates used when creating new content.
-date: 2017-02-01
-publishdate: 2017-02-01
-keywords: [archetypes,generators,metadata,front matter]
-categories: ["content management"]
-menu:
- docs:
- parent: "content-management"
- weight: 70
- quicklinks:
-weight: 70 #rem
-draft: false
+description: An archetype is a template for new content.
+categories: []
+keywords: []
aliases: [/content/archetypes/]
-toc: true
---
-## What are Archetypes?
+## Overview
-**Archetypes** are content template files in the [archetypes directory][] of your project that contain preconfigured [front matter][] and possibly also a content disposition for your website's [content types][]. These will be used when you run `hugo new`.
+A content file consists of [front matter](g) and markup. The markup is typically Markdown, but Hugo also supports other [content formats](g). Front matter can be TOML, YAML, or JSON.
+The `hugo new content` command creates a new file in the `content` directory, using an archetype as a template. This is the default archetype:
-The `hugo new` uses the `content-section` to find the most suitable archetype template in your project. If your project does not contain any archetype files, it will also look in the theme.
+{{< code-toggle file=archetypes/default.md fm=true >}}
+title = '{{ replace .File.ContentBaseName `-` ` ` | title }}'
+date = '{{ .Date }}'
+draft = true
+{{< /code-toggle >}}
-{{< code file="archetype-example.sh" >}}
-hugo new posts/my-first-post.md
-{{< /code >}}
+When you create new content, Hugo evaluates the [template actions](g) within the archetype. For example:
-The above will create a new content file in `content/posts/my-first-post.md` using the first archetype file found of these:
+```sh
+hugo new content posts/my-first-post.md
+```
+
+With the default archetype shown above, Hugo creates this content file:
+
+{{< code-toggle file=content/posts/my-first-post.md fm=true >}}
+title = 'My First Post'
+date = '2023-08-24T11:49:46-07:00'
+draft = true
+{{< /code-toggle >}}
+
+You can create an archetype for one or more [content types](g). For example, use one archetype for posts, and use the default archetype for everything else:
+
+```text
+archetypes/
+├── default.md
+└── posts.md
+```
+
+## Lookup order
+
+Hugo looks for archetypes in the `archetypes` directory in the root of your project, falling back to the `archetypes` directory in themes or installed modules. An archetype for a specific content type takes precedence over the default archetype.
+
+For example, with this command:
+
+```sh
+hugo new content posts/my-first-post.md
+```
+
+The archetype lookup order is:
1. `archetypes/posts.md`
-2. `archetypes/default.md`
-3. `themes/my-theme/archetypes/posts.md`
-4. `themes/my-theme/archetypes/default.md`
+1. `archetypes/default.md`
+1. `themes/my-theme/archetypes/posts.md`
+1. `themes/my-theme/archetypes/default.md`
-The last two list items are only applicable if you use a theme and it uses the `my-theme` theme name as an example.
+If none of these exists, Hugo uses a built-in default archetype.
-## Create a New Archetype Template
+## Functions and context
-A fictional example for the section `newsletter` and the archetype file `archetypes/newsletter.md`. Create a new file in `archetypes/newsletter.md` and open it in a text editor.
+You can use any template [function](g) within an archetype. As shown above, the default archetype uses the [`replace`](/functions/strings/replace) function to replace hyphens with spaces when populating the title in front matter.
-{{< code file="archetypes/newsletter.md" >}}
+Archetypes receive the following [context](g):
+
+Date
+: (`string`) The current date and time, formatted in compliance with RFC3339.
+
+File
+: (`hugolib.fileInfo`) Returns file information for the current page. See [details](/methods/page/file).
+
+Type
+: (`string`) The [content type](g) inferred from the top-level directory name, or as specified by the `--kind` flag passed to the `hugo new content` command.
+
+Site
+: (`page.Site`) The current site object. See [details](/methods/site/).
+
+## Date format
+
+To insert date and time with a different format, use the [`time.Now`] function:
+
+[`time.Now`]: /functions/time/now/
+
+{{< code-toggle file=archetypes/default.md fm=true >}}
+title = '{{ replace .File.ContentBaseName `-` ` ` | title }}'
+date = '{{ time.Now.Format "2006-01-02" }}'
+draft = true
+{{< /code-toggle >}}
+
+## Include content
+
+Although typically used as a front matter template, you can also use an archetype to populate content.
+
+For example, in a documentation site you might have a section (content type) for functions. Every page within this section should follow the same format: a brief description, the function signature, examples, and notes. We can pre-populate the page to remind content authors of the standard format.
+
+````text {file="archetypes/functions.md"}
---
-title: "{{ replace .Name "-" " " | title }}"
-date: {{ .Date }}
+date: '{{ .Date }}'
draft: true
+title: '{{ replace .File.ContentBaseName `-` ` ` | title }}'
---
-**Insert Lead paragraph here.**
+A brief description of what the function does, using simple present tense in the third person singular form. For example:
-## New Cool Posts
+`someFunction` returns the string `s` repeated `n` times.
-{{ range first 10 ( where .Site.RegularPages "Type" "cool" ) }}
-* {{ .Title }}
-{{ end }}
-{{< /code >}}
+## Signature
-When you create a new newsletter with:
-
-```bash
-hugo new newsletter/the-latest-cool.stuff.md
+```text
+func someFunction(s string, n int) string
```
-It will create a new newsletter type of content file based on the archetype template.
+## Examples
-**Note:** the site will only be built if the `.Site` is in use in the archetype file, and this can be time consuming for big sites.
+One or more practical examples, each within a fenced code block.
-The above _newsletter type archetype_ illustrates the possibilities: The full Hugo `.Site` and all of Hugo's template funcs can be used in the archetype file.
+## Notes
+Additional information to clarify as needed.
+````
-## Directory based archetypes
+Although you can include [template actions](g) within the content body, remember that Hugo evaluates these once---at the time of content creation. In most cases, place template actions in a [template](g) where Hugo evaluates the actions every time you [build](g) the site.
-Since Hugo `0.49` you can use complete directories as archetype templates. Given this archetype directory:
+## Leaf bundles
-```bash
-archetypes
+You can also create archetypes for [leaf bundles](g).
+
+For example, in a photography site you might have a section (content type) for galleries. Each gallery is a leaf bundle with content and images.
+
+Create an archetype for galleries:
+
+```text
+archetypes/
+├── galleries/
+│ ├── images/
+│ │ └── .gitkeep
+│ └── index.md <-- same format as default.md
+└── default.md
+```
+
+Subdirectories within an archetype must contain at least one file. Without a file, Hugo will not create the subdirectory when you create new content. The name and size of the file are irrelevant. The example above includes a `.gitkeep` file, an empty file commonly used to preserve otherwise empty directories in a Git repository.
+
+To create a new gallery:
+
+```sh
+hugo new galleries/bryce-canyon
+```
+
+This produces:
+
+```text
+content/
+├── galleries/
+│ └── bryce-canyon/
+│ ├── images/
+│ │ └── .gitkeep
+│ └── index.md
+└── _index.md
+```
+
+## Specify archetype
+
+Use the `--kind` command line flag to specify an archetype when creating content.
+
+For example, let's say your site has two sections: articles and tutorials. Create an archetype for each content type:
+
+```text
+archetypes/
+├── articles.md
├── default.md
-└── post-bundle
- ├── bio.md
- ├── images
- │ └── featured.jpg
- └── index.md
+└── tutorials.md
```
-```bash
-hugo new --kind post-bundle posts/my-post
+To create an article using the articles archetype:
+
+```sh
+hugo new content articles/something.md
```
-Will create a new folder in `/content/posts/my-post` with the same set of files as in the `post-bundle` archetypes folder. All content files (`index.md` etc.) can contain template logic, and will receive the correct `.Site` for the content's language.
+To create an article using the tutorials archetype:
-
-
-[archetypes directory]: /getting-started/directory-structure/
-[content types]: /content-management/types/
-[front matter]: /content-management/front-matter/
+```sh
+hugo new content --kind tutorials articles/something.md
+```
diff --git a/docs/content/en/content-management/authors.md b/docs/content/en/content-management/authors.md
deleted file mode 100644
index 4cec5281a..000000000
--- a/docs/content/en/content-management/authors.md
+++ /dev/null
@@ -1,184 +0,0 @@
----
-title: Authors
-linktitle: Authors
-description:
-date: 2016-08-22
-publishdate: 2017-03-12
-lastmod: 2017-03-12
-keywords: [authors]
-categories: ["content management"]
-menu:
- docs:
- parent: "content-management"
- weight: 55
-weight: 55 #rem
-draft: true
-aliases: [/content/archetypes/]
-toc: true
-comments: Before this page is published, need to also update both site- and page-level variables documentation.
----
-
-
-
-Larger sites often have multiple content authors. Hugo provides standardized author profiles to organize relationships between content and content creators for sites operating under a distributed authorship model.
-
-## Author Profiles
-
-You can create a profile containing metadata for each author on your website. These profiles have to be saved under `data/_authors/`. The filename of the profile will later be used as an identifier. This way Hugo can associate content with one or multiple authors. An author's profile can be defined in the JSON, YAML, or TOML format.
-
-### Example: Author Profile
-
-Let's suppose Alice Allison is a blogger. A simple unique identifier would be `alice`. Now, we have to create a file called `alice.toml` in the `data/_authors/` directory. The following example is the standardized template written in TOML:
-
-{{< code file="data/_authors/alice.toml" >}}
-givenName = "Alice" # or firstName as alias
-familyName = "Allison" # or lastName as alias
-displayName = "Alice Allison"
-thumbnail = "static/authors/alice-thumb.jpg"
-image = "static/authors/alice-full.jpg"
-shortBio = "My name is Alice and I'm a blogger."
-bio = "My name is Alice and I'm a blogger... some other stuff"
-email = "alice.allison@email.com"
-weight = 10
-
-[social]
- facebook = "alice.allison"
- twitter = "alice"
- website = "www.example.com"
-
-[params]
- random = "whatever you want"
-{{< /code >}}
-
-All variables are optional but it's advised to fill all important ones (e.g. names and biography) because themes can vary in their usage.
-
-You can store files for the `thumbnail` and `image` attributes in the `static` folder. Then add the path to the photos relative to `static`; e.g., `/static/path/to/thumbnail.jpg`.
-
-`weight` allows you to define the order of an author in an `.Authors` list and can be accessed on list or via the `.Site.Authors` variable.
-
-The `social` section contains all the links to the social network accounts of an author. Hugo is able to generate the account links for the most popular social networks automatically. This way, you only have to enter your username. You can find a list of all supported social networks [here](#linking-social-network-accounts-automatically). All other variables, like `website` in the example above remain untouched.
-
-The `params` section can contain arbitrary data much like the same-named section in the config file. What it contains is up to you.
-
-## Associate Content Through Identifiers
-
-Earlier it was mentioned that content can be associated with an author through their corresponding identifier. In our case, blogger Alice has the identifier `alice`. In the front matter of a content file, you can create a list of identifiers and assign it to the `authors` variable. Here are examples for `alice` using YAML and TOML, respectively.
-
-```
----
-title: Why Hugo is so Awesome
-date: 2016-08-22T14:27:502:00
-authors: ["alice"]
----
-
-Nothing to read here. Move along...
-```
-
-```
-+++
-title = Why Hugo is so Awesome
-date = "2016-08-22T14:27:502:00"
-authors: ["alice"]
-+++
-
-Nothing to read here. Move along...
-```
-
-Future authors who might work on this blog post can append their identifiers to the `authors` array in the front matter as well.
-
-## Work with Templates
-
-After a successful setup it's time to give some credit to the authors by showing them on the website. Within the templates Hugo provides a list of the author's profiles if they are listed in the `authors` variable within the front matter.
-
-The list is accessible via the `.Authors` template variable. Printing all authors of a the blog post is straight forward:
-
-```
-{{ range .Authors }}
- {{ .DisplayName }}
-{{ end }}
-=> Alice Allison
-```
-
-Even if there are co-authors you may only want to show the main author. For this case you can use the `.Author` template variable **(note the singular form)**. The template variable contains the profile of the author that is first listed with his identifier in the front matter.
-
-{{% note %}}
-You can find a list of all template variables to access the profile information in [Author Variables](/variables/authors/).
-{{% /note %}}
-
-### Link Social Network Accounts
-
-As aforementioned, Hugo is able to generate links to profiles of the most popular social networks. The following social networks with their corrersponding identifiers are supported: `github`, `facebook`, `twitter`, `pinterest`, `instagram`, `youtube` and `linkedin`.
-
-This is can be done with the `.Social.URL` function. Its only parameter is the name of the social network as they are defined in the profile (e.g. `facebook`, `twitter`). Custom variables like `website` remain as they are.
-
-Most articles feature a small section with information about the author at the end. Let's create one containing the author's name, a thumbnail, a (summarized) biography and links to all social networks:
-
-{{< code file="layouts/partials/author-info.html" download="author-info.html" >}}
-{{ with .Author }}
-
-{{ end }}
-{{< /code >}}
-
-## Who Published What?
-
-That question can be answered with a list of all authors and another list containing all articles that they each have written. Now we have to translate this idea into templates. The [taxonomy][] feature allows us to logically group content based on information that they have in common; e.g. a tag or a category. Well, many articles share the same author, so this should sound familiar, right?
-
-In order to let Hugo know that we want to group content based on their author, we have to create a new taxonomy called `author` (the name corresponds to the variable in the front matter). Here is the snippet in a `config.yaml` and `config.toml`, respectively:
-
-```
-taxonomies:
- author: authors
-```
-
-```
-[taxonomies]
- author = "authors"
-```
-
-
-### List All Authors
-
-In the next step we can create a template to list all authors of your website. Later, the list can be accessed at `www.example.com/authors/`. Create a new template in the `layouts/taxonomy/` directory called `authors.term.html`. This template will be exclusively used for this taxonomy.
-
-{{< code file="layouts/taxonomy/author.term.html" download="author.term.html" >}}
-
-{{< /code >}}
-
-`.Data.Terms` contains the identifiers of all authors and we can range over it to create a list with all author names. The `$profile` variable gives us access to the profile of the current author. This allows you to generate a nice info box with a thumbnail, a biography and social media links, like at the [end of a blog post](#linking-social-network-accounts-automatically).
-
-### List Each Author's Publications
-
-Last but not least, we have to create the second list that contains all publications of an author. Each list will be shown in its own page and can be accessed at `www.example.com/authors/`. Replace `` with a valid author identifier like `alice`.
-
-The layout for this page can be defined in the template `layouts/taxonomy/author.html`.
-
-{{< code file="layouts/taxonomy/author.html" download="author.html" >}}
-{{ range .Pages }}
-
- written by {{ .Author.DisplayName }}
- {{ .Summary }}
-{{ end }}
-{{< /code >}}
-
-The example above generates a simple list of all posts written by a single author. Inside the loop you've access to the complete set of [page variables][pagevars]. Therefore, you can add additional information about the current posts like the publishing date or the tags.
-
-With a lot of content this list can quickly become very long. Consider to use the [pagination][] feature. It splits the list into smaller chunks and spreads them over multiple pages.
-
-[pagevars]: /variables/page/
-[pagination]: /templates/pagination/
diff --git a/docs/content/en/content-management/build-options.md b/docs/content/en/content-management/build-options.md
new file mode 100644
index 000000000..8c29a19b9
--- /dev/null
+++ b/docs/content/en/content-management/build-options.md
@@ -0,0 +1,303 @@
+---
+title: Build options
+description: Build options help define how Hugo must treat a given page when building the site.
+categories: []
+keywords: []
+aliases: [/content/build-options/]
+---
+
+
+
+Build options are stored in a reserved front matter object named `build`[^1] with these defaults:
+
+[^1]: The `_build` alias for `build` is deprecated and will be removed in a future release.
+
+{{< code-toggle file=content/example/index.md fm=true >}}
+[build]
+list = 'always'
+publishResources = true
+render = 'always'
+{{< /code-toggle >}}
+
+list
+: When to include the page within page collections. Specify one of:
+
+ - `always`: Include the page in _all_ page collections. For example, `site.RegularPages`, `.Pages`, etc. This is the default value.
+ - `local`: Include the page in _local_ page collections. For example, `.RegularPages`, `.Pages`, etc. Use this option to create fully navigable but headless content sections.
+ - `never`: Do not include the page in _any_ page collection.
+
+publishResources
+: Applicable to [page bundles], determines whether to publish the associated [page resources]. Specify one of:
+
+ - `true`: Always publish resources. This is the default value.
+ - `false`: Only publish a resource when invoking its [`Permalink`], [`RelPermalink`], or [`Publish`] method within a template.
+
+render
+: When to render the page. Specify one of:
+
+ - `always`: Always render the page to disk. This is the default value.
+ - `link`: Do not render the page to disk, but assign `Permalink` and `RelPermalink` values.
+ - `never`: Never render the page to disk, and exclude it from all page collections.
+
+> [!note]
+> Any page, regardless of its build options, will always be available by using the [`.Page.GetPage`] or [`.Site.GetPage`] method.
+
+## Example -- headless page
+
+Create an unpublished page whose content and resources can be included in other pages.
+
+```text
+content/
+├── headless/
+│ ├── a.jpg
+│ ├── b.jpg
+│ └── index.md <-- leaf bundle
+└── _index.md <-- home page
+```
+
+Set the build options in front matter:
+
+{{< code-toggle file=content/headless/index.md fm=true >}}
+title = 'Headless page'
+[build]
+ list = 'never'
+ publishResources = false
+ render = 'never'
+{{< /code-toggle >}}
+
+To include the content and images on the home page:
+
+```go-html-template {file="layouts/_default/home.html"}
+{{ with .Site.GetPage "/headless" }}
+ {{ .Content }}
+ {{ range .Resources.ByType "image" }}
+    <img src="{{ .RelPermalink }}" alt="">
+ {{ end }}
+{{ end }}
+```
+
+The published site will have this structure:
+
+```text
+public/
+├── headless/
+│ ├── a.jpg
+│ └── b.jpg
+└── index.html
+```
+
+In the example above, note that:
+
+1. Hugo did not publish an HTML file for the page.
+1. Despite setting `publishResources` to `false` in front matter, Hugo published the [page resources] because we invoked the [`RelPermalink`] method on each resource. This is the expected behavior.
+
+## Example -- headless section
+
+Create an unpublished section whose content and resources can be included in other pages.
+
+```text
+content/
+├── headless/
+│ ├── note-1/
+│ │ ├── a.jpg
+│ │ ├── b.jpg
+│ │ └── index.md <-- leaf bundle
+│ ├── note-2/
+│ │ ├── c.jpg
+│ │ ├── d.jpg
+│ │ └── index.md <-- leaf bundle
+│ └── _index.md <-- branch bundle
+└── _index.md <-- home page
+```
+
+Set the build options in front matter, using the `cascade` keyword to "cascade" the values down to descendant pages.
+
+{{< code-toggle file=content/headless/_index.md fm=true >}}
+title = 'Headless section'
+[[cascade]]
+[cascade.build]
+ list = 'local'
+ publishResources = false
+ render = 'never'
+{{< /code-toggle >}}
+
+In the front matter above, note that we have set `list` to `local` to include the descendant pages in local page collections.
+
+To include the content and images on the home page:
+
+```go-html-template {file="layouts/_default/home.html"}
+{{ with .Site.GetPage "/headless" }}
+ {{ range .Pages }}
+ {{ .Content }}
+ {{ range .Resources.ByType "image" }}
+      <img src="{{ .RelPermalink }}" alt="">
+ {{ end }}
+ {{ end }}
+{{ end }}
+```
+
+The published site will have this structure:
+
+```text
+public/
+├── headless/
+│ ├── note-1/
+│ │ ├── a.jpg
+│ │ └── b.jpg
+│ └── note-2/
+│ ├── c.jpg
+│ └── d.jpg
+└── index.html
+```
+
+In the example above, note that:
+
+1. Hugo did not publish an HTML file for the page.
+1. Despite setting `publishResources` to `false` in front matter, Hugo correctly published the [page resources] because we invoked the [`RelPermalink`] method on each resource. This is the expected behavior.
+
+## Example -- list without publishing
+
+Publish a section page without publishing the descendant pages. For example, to create a glossary:
+
+```text
+content/
+├── glossary/
+│ ├── _index.md
+│ ├── bar.md
+│ ├── baz.md
+│ └── foo.md
+└── _index.md
+```
+
+Set the build options in front matter, using the `cascade` keyword to "cascade" the values down to descendant pages.
+
+{{< code-toggle file=content/glossary/_index.md fm=true >}}
+title = 'Glossary'
+[build]
+render = 'always'
+[[cascade]]
+[cascade.build]
+ list = 'local'
+ publishResources = false
+ render = 'never'
+{{< /code-toggle >}}
+
+To render the glossary:
+
+```go-html-template {file="layouts/glossary/list.html"}
+<dl>
+  {{ range .Pages }}
+    <dt>{{ .Title }}</dt>
+    <dd>{{ .Content }}</dd>
+  {{ end }}
+</dl>
+```
+
+The published site will have this structure:
+
+```text
+public/
+├── glossary/
+│ └── index.html
+└── index.html
+```
+
+## Example -- publish without listing
+
+Publish a section's descendant pages without publishing the section page itself.
+
+```text
+content/
+├── books/
+│ ├── _index.md
+│ ├── book-1.md
+│ └── book-2.md
+└── _index.md
+```
+
+Set the build options in front matter:
+
+{{< code-toggle file=content/books/_index.md fm=true >}}
+title = 'Books'
+[build]
+render = 'never'
+list = 'never'
+{{< /code-toggle >}}
+
+The published site will have this structure:
+
+```text
+public/
+├── books/
+│ ├── book-1/
+│ │ └── index.html
+│ └── book-2/
+│ └── index.html
+└── index.html
+```
+
+## Example -- conditionally hide section
+
+Consider this example: a documentation site has a team of contributors with access to 20 custom shortcodes. Each shortcode takes several arguments, and contributors need documentation to reference when using them.
+
+Instead of external documentation for the shortcodes, include an "internal" section that is hidden when building the production site.
+
+```text
+content/
+├── internal/
+│ ├── shortcodes/
+│ │ ├── _index.md
+│ │ ├── shortcode-1.md
+│ │ └── shortcode-2.md
+│ └── _index.md
+├── reference/
+│ ├── _index.md
+│ ├── reference-1.md
+│ └── reference-2.md
+├── tutorials/
+│ ├── _index.md
+│ ├── tutorial-1.md
+│ └── tutorial-2.md
+└── _index.md
+```
+
+Set the build options in front matter, using the `cascade` keyword to "cascade" the values down to descendant pages, and use the `target` keyword to target the production environment.
+
+{{< code-toggle file=content/internal/_index.md >}}
+title = 'Internal'
+[[cascade]]
+[cascade.build]
+render = 'never'
+list = 'never'
+[cascade.target]
+environment = 'production'
+{{< /code-toggle >}}
+
+The production site will have this structure:
+
+```text
+public/
+├── reference/
+│ ├── reference-1/
+│ │ └── index.html
+│ ├── reference-2/
+│ │ └── index.html
+│ └── index.html
+├── tutorials/
+│ ├── tutorial-1/
+│ │ └── index.html
+│ ├── tutorial-2/
+│ │ └── index.html
+│ └── index.html
+└── index.html
+```
+
+[`.Page.GetPage`]: /methods/page/getpage/
+[`.Site.GetPage`]: /methods/site/getpage/
+[`Permalink`]: /methods/resource/permalink/
+[`Publish`]: /methods/resource/publish/
+[`RelPermalink`]: /methods/resource/relpermalink/
+[page bundles]: /content-management/page-bundles/
+[page resources]: /content-management/page-resources/
diff --git a/docs/content/en/content-management/comments.md b/docs/content/en/content-management/comments.md
index 0034309f5..fee4fb372 100644
--- a/docs/content/en/content-management/comments.md
+++ b/docs/content/en/content-management/comments.md
@@ -1,20 +1,9 @@
---
title: Comments
-linktitle: Comments
description: Hugo ships with an internal Disqus template, but this isn't the only commenting system that will work with your new Hugo website.
-date: 2017-02-01
-publishdate: 2017-02-01
-lastmod: 2017-03-09
-keywords: [sections,content,organization]
-categories: [project organization, fundamentals]
-menu:
- docs:
- parent: "content-management"
- weight: 140
-weight: 140 #rem
-draft: false
+categories: []
+keywords: []
aliases: [/extras/comments/]
-toc: true
---
Hugo ships with support for [Disqus](https://disqus.com/), a third-party service that provides comment and community capabilities to websites via JavaScript.
@@ -29,59 +18,55 @@ Hugo comes with all the code you need to load Disqus into your templates. Before
Disqus comments require you set a single value in your [site's configuration file][configuration] like so:
-{{< code-toggle copy="false" >}}
-disqusShortname = "yourdiscussshortname"
+{{< code-toggle file=hugo >}}
+[services.disqus]
+shortname = 'your-disqus-shortname'
{{< /code-toggle >}}
-For many websites, this is enough configuration. However, you also have the option to set the following in the [front matter][] of a single content file:
+For many websites, this is enough configuration. However, you also have the option to set the following in the [front matter] of a single content file:
-* `disqus_identifier`
-* `disqus_title`
-* `disqus_url`
+- `disqus_identifier`
+- `disqus_title`
+- `disqus_url`
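+
+For example, a single post could override these values in its front matter (the identifiers shown are purely illustrative):
+
+{{< code-toggle file=content/posts/my-post.md fm=true >}}
+title = 'My post'
+disqus_identifier = 'my-post'
+disqus_title = 'My post'
+disqus_url = 'https://example.org/posts/my-post/'
+{{< /code-toggle >}}
+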
-### Render Hugo's Built-in Disqus Partial Template
+### Render Hugo's built-in Disqus partial template
-Disqus has its own [internal template](https://gohugo.io/templates/internal/#disqus) available, to render it add the following code where you want comments to appear:
+Disqus has its own [internal template](/templates/embedded/#disqus) available. To render it, add the following code where you want comments to appear:
-```
+```go-html-template
{{ template "_internal/disqus.html" . }}
```
-## Comments Alternatives
+## Alternatives
-There are a few alternatives to commenting on static sites for those who do not want to use Disqus:
+Commercial commenting systems:
-* [Static Man](https://staticman.net/)
-* [Talkyard](https://www.talkyard.io/blog-comments) (Open source, & serverless hosting)
-* [txtpen](https://txtpen.github.io/hn/)
-* [IntenseDebate](http://intensedebate.com/)
-* [Graph Comment][]
-* [Muut](http://muut.com/)
-* [isso](http://posativ.org/isso/) (Self-hosted, Python)
- * [Tutorial on Implementing Isso with Hugo][issotutorial]
-* [Utterances](https://utteranc.es/) (Open source, Github comments widget built on Github issues)
-* [Remark](https://github.com/umputun/remark) (Open source, Golang, Easy to run docker)
-* [Commento](https://commento.io/) (Open Source, available as a service, local install, or docker image)
+- [Emote](https://emote.com/)
+- [Graph Comment](https://graphcomment.com/)
+- [Hyvor Talk](https://talk.hyvor.com/)
+- [IntenseDebate](https://intensedebate.com/)
+- [ReplyBox](https://getreplybox.com/)
-
-
+Open-source commenting systems:
-
-
-[configuration]: /getting-started/configuration/
-[disquspartial]: /templates/partials/#disqus
+[configuration]: /configuration/
+[disquspartial]: /templates/embedded/#disqus
[disqussetup]: https://disqus.com/profile/signup/
[forum]: https://discourse.gohugo.io
[front matter]: /content-management/front-matter/
-[Graph Comment]: https://graphcomment.com/
[kaijuissue]: https://github.com/spf13/kaiju/issues/new
[issotutorial]: https://stiobhart.net/2017-02-24-isso-comments/
-[partials]: /templates/partials/
+[partials]: /templates/partial/
[MongoDB]: https://www.mongodb.com/
-[tweet]: https://twitter.com/spf13
diff --git a/docs/content/en/content-management/content-adapters.md b/docs/content/en/content-management/content-adapters.md
new file mode 100644
index 000000000..3468bb728
--- /dev/null
+++ b/docs/content/en/content-management/content-adapters.md
@@ -0,0 +1,349 @@
+---
+title: Content adapters
+description: Create content adapters to dynamically add content when building your site.
+categories: []
+keywords: []
+---
+
+{{< new-in 0.126.0 />}}
+
+## Overview
+
+A content adapter is a template that dynamically creates pages when building a site. For example, use a content adapter to create pages from a remote data source such as JSON, TOML, YAML, or XML.
+
+Unlike templates that reside in the `layouts` directory, content adapters reside in the `content` directory, no more than one per directory per language. When a content adapter creates a page, the page's [logical path](g) will be relative to the content adapter.
+
+```text
+content/
+├── articles/
+│ ├── _index.md
+│ ├── article-1.md
+│ └── article-2.md
+├── books/
+│ ├── _content.gotmpl <-- content adapter
+│ └── _index.md
+└── films/
+ ├── _content.gotmpl <-- content adapter
+ └── _index.md
+```
+
+Each content adapter is named _content.gotmpl and uses the same [syntax] as templates in the `layouts` directory. You can use any of the [template functions] within a content adapter, as well as the methods described below.
+
+## Methods
+
+Use these methods within a content adapter.
+
+### AddPage
+
+Adds a page to the site.
+
+```go-html-template {file="content/books/_content.gotmpl"}
+{{ $content := dict
+ "mediaType" "text/markdown"
+ "value" "The _Hunchback of Notre Dame_ was written by Victor Hugo."
+}}
+{{ $page := dict
+ "content" $content
+ "kind" "page"
+ "path" "the-hunchback-of-notre-dame"
+ "title" "The Hunchback of Notre Dame"
+}}
+{{ .AddPage $page }}
+```
+
+### AddResource
+
+Adds a page resource to the site.
+
+```go-html-template {file="content/books/_content.gotmpl"}
+{{ with resources.Get "images/a.jpg" }}
+ {{ $content := dict
+ "mediaType" .MediaType.Type
+ "value" .
+ }}
+ {{ $resource := dict
+ "content" $content
+ "path" "the-hunchback-of-notre-dame/cover.jpg"
+ }}
+ {{ $.AddResource $resource }}
+{{ end }}
+```
+
+Then retrieve the new page resource with something like:
+
+```go-html-template {file="layouts/_default/single.html"}
+{{ with .Resources.Get "cover.jpg" }}
+  <img src="{{ .RelPermalink }}" alt="">
+{{ end }}
+```
+
+### Site
+
+Returns the `Site` to which the pages will be added.
+
+```go-html-template {file="content/books/_content.gotmpl"}
+{{ .Site.Title }}
+```
+
+> [!note]
+> Note that the `Site` returned isn't fully built when invoked from a content adapter; if you call a method that depends on pages, e.g. `.Site.Pages`, you will get an error saying "this method cannot be called before the site is fully initialized".
+
+### Store
+
+Returns a persistent “scratch pad” to store and manipulate data. The main use case for this is to transfer values between executions when [EnableAllLanguages](#enablealllanguages) is set. See [examples](/methods/page/store/).
+
+```go-html-template {file="content/books/_content.gotmpl"}
+{{ .Store.Set "key" "value" }}
+{{ .Store.Get "key" }}
+```
+
+### EnableAllLanguages
+
+By default, Hugo executes the content adapter for the language defined by the _content.gotmpl file. Use this method to activate the content adapter for all languages.
+
+```go-html-template {file="content/books/_content.gotmpl"}
+{{ .EnableAllLanguages }}
+{{ $content := dict
+ "mediaType" "text/markdown"
+ "value" "The _Hunchback of Notre Dame_ was written by Victor Hugo."
+}}
+{{ $page := dict
+ "content" $content
+ "kind" "page"
+ "path" "the-hunchback-of-notre-dame"
+ "title" "The Hunchback of Notre Dame"
+}}
+{{ .AddPage $page }}
+```
+
+## Page map
+
+Set any [front matter field] in the map passed to the [`AddPage`](#addpage) method, excluding `markup`. Instead of setting the `markup` field, specify the `content.mediaType` as described below.
+
+This table describes the fields most commonly passed to the `AddPage` method.
+
+Key|Description|Required
+:--|:--|:-:
+`content.mediaType`|The content [media type]. Default is `text/markdown`. See [content formats] for examples.|
+`content.value`|The content value as a string.|
+`dates.date`|The page creation date as a `time.Time` value.|
+`dates.expiryDate`|The page expiry date as a `time.Time` value.|
+`dates.lastmod`|The page last modification date as a `time.Time` value.|
+`dates.publishDate`|The page publication date as a `time.Time` value.|
+`params`|A map of page parameters.|
+`path`|The page's [logical path](g) relative to the content adapter. Do not include a leading slash or file extension.|:heavy_check_mark:
+`title`|The page title.|
+
+> [!note]
+> While `path` is the only required field, we recommend setting `title` as well.
+>
+> When setting the `path`, Hugo transforms the given string to a logical path. For example, setting `path` to `A B C` produces a logical path of `/section/a-b-c`.
+
+## Resource map
+
+Construct the map passed to the [`AddResource`](#addresource) method using the fields below.
+
+Key|Description|Required
+:--|:--|:-:
+`content.mediaType`|The content [media type].|:heavy_check_mark:
+`content.value`|The content value as a string or resource.|:heavy_check_mark:
+`name`|The resource name.|
+`params`|A map of resource parameters.|
+`path`|The resource's [logical path](g) relative to the content adapter. Do not include a leading slash.|:heavy_check_mark:
+`title`|The resource title.|
+
+> [!note]
+> If the `content.value` is a string Hugo creates a new resource. If the `content.value` is a resource, Hugo obtains the value from the existing resource.
+>
+> When setting the `path`, Hugo transforms the given string to a logical path. For example, setting `path` to `A B C/cover.jpg` produces a logical path of `/section/a-b-c/cover.jpg`.
+
+## Example
+
+Create pages from remote data, where each page represents a book review.
+
+### Step 1
+
+Create the content structure.
+
+```text
+content/
+└── books/
+ ├── _content.gotmpl <-- content adapter
+ └── _index.md
+```
+
+### Step 2
+
+Inspect the remote data to determine how to map key-value pairs to front matter fields.
+
+
+### Step 3
+
+Create the content adapter.
+
+```go-html-template {file="content/books/_content.gotmpl" copy=true}
+{{/* Get remote data. */}}
+{{ $data := dict }}
+{{ $url := "https://gohugo.io/shared/examples/data/books.json" }}
+{{ with try (resources.GetRemote $url) }}
+ {{ with .Err }}
+ {{ errorf "Unable to get remote resource %s: %s" $url . }}
+ {{ else with .Value }}
+ {{ $data = . | transform.Unmarshal }}
+ {{ else }}
+ {{ errorf "Unable to get remote resource %s" $url }}
+ {{ end }}
+{{ end }}
+
+{{/* Add pages and page resources. */}}
+{{ range $data }}
+
+ {{/* Add page. */}}
+ {{ $content := dict "mediaType" "text/markdown" "value" .summary }}
+ {{ $dates := dict "date" (time.AsTime .date) }}
+ {{ $params := dict "author" .author "isbn" .isbn "rating" .rating "tags" .tags }}
+ {{ $page := dict
+ "content" $content
+ "dates" $dates
+ "kind" "page"
+ "params" $params
+ "path" .title
+ "title" .title
+ }}
+ {{ $.AddPage $page }}
+
+ {{/* Add page resource. */}}
+ {{ $item := . }}
+ {{ with $url := $item.cover }}
+ {{ with try (resources.GetRemote $url) }}
+ {{ with .Err }}
+ {{ errorf "Unable to get remote resource %s: %s" $url . }}
+ {{ else with .Value }}
+ {{ $content := dict "mediaType" .MediaType.Type "value" .Content }}
+ {{ $params := dict "alt" $item.title }}
+ {{ $resource := dict
+ "content" $content
+ "params" $params
+ "path" (printf "%s/cover.%s" $item.title .MediaType.SubType)
+ }}
+ {{ $.AddResource $resource }}
+ {{ else }}
+ {{ errorf "Unable to get remote resource %s" $url }}
+ {{ end }}
+ {{ end }}
+ {{ end }}
+
+{{ end }}
+```
+
+### Step 4
+
+Create a single template to render each book review.
+
+```go-html-template {file="layouts/books/single.html" copy=true}
+{{ define "main" }}
+  <h1>{{ .Title }}</h1>
+
+  {{ with .Resources.GetMatch "cover.*" }}
+    <img src="{{ .RelPermalink }}" alt="{{ .Params.alt }}">
+  {{ end }}
+
+  {{ .Content }}
+{{ end }}
+```
+
+## Multilingual sites
+
+With multilingual sites you can:
+
+1. Create one content adapter for all languages using the [`EnableAllLanguages`](#enablealllanguages) method as described above.
+1. Create content adapters unique to each language. See the examples below.
+
+### Translations by file name
+
+With this site configuration:
+
+{{< code-toggle file=hugo >}}
+[languages.en]
+weight = 1
+
+[languages.de]
+weight = 2
+{{< /code-toggle >}}
+
+Include a language designator in the content adapter's file name.
+
+```text
+content/
+└── books/
+ ├── _content.de.gotmpl
+ ├── _content.en.gotmpl
+ ├── _index.de.md
+ └── _index.en.md
+```
+
+### Translations by content directory
+
+With this site configuration:
+
+{{< code-toggle file=hugo >}}
+[languages.en]
+contentDir = 'content/en'
+weight = 1
+
+[languages.de]
+contentDir = 'content/de'
+weight = 2
+{{< /code-toggle >}}
+
+Create a single content adapter in each directory:
+
+```text
+content/
+├── de/
+│ └── books/
+│ ├── _content.gotmpl
+│ └── _index.md
+└── en/
+ └── books/
+ ├── _content.gotmpl
+ └── _index.md
+```
+
+## Page collisions
+
+Two or more pages collide when they have the same publication path. Due to concurrency, the content of the published page is indeterminate. Consider this example:
+
+```text
+content/
+└── books/
+ ├── _content.gotmpl <-- content adapter
+ ├── _index.md
+ └── the-hunchback-of-notre-dame.md
+```
+
+If the content adapter also creates `books/the-hunchback-of-notre-dame`, the content of the published page is indeterminate. You cannot control the processing order.
+
+To detect page collisions, use the `--printPathWarnings` flag when building your site.
+
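+For example, a plain build with the flag enabled looks like this:
+
+```sh
+hugo --printPathWarnings
+```
+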
+[content formats]: /content-management/formats/#classification
+[front matter field]: /content-management/front-matter/#fields
+[media type]: https://en.wikipedia.org/wiki/Media_type
+[syntax]: /templates/introduction/
+[template functions]: /functions/
diff --git a/docs/content/en/content-management/cross-references.md b/docs/content/en/content-management/cross-references.md
deleted file mode 100644
index f51271306..000000000
--- a/docs/content/en/content-management/cross-references.md
+++ /dev/null
@@ -1,88 +0,0 @@
----
-title: Links and Cross References
-description: Shortcodes for creating links to documents.
-date: 2017-02-01
-publishdate: 2017-02-01
-lastmod: 2017-03-31
-categories: [content management]
-keywords: ["cross references","references", "anchors", "urls"]
-menu:
- docs:
- parent: "content-management"
- weight: 100
-weight: 100 #rem
-aliases: [/extras/crossreferences/]
-toc: true
----
-
-
-The `ref` and `relref` shortcode resolves the absolute or relative permalink given a path to a document.
-
-## Use `ref` and `relref`
-
-```go-html-template
-{{* ref "document.md" */>}}
-{{* ref "#anchor" */>}}
-{{* ref "document.md#anchor" */>}}
-{{* ref "/blog/my-post" */>}}
-{{* ref "/blog/my-post.md" */>}}
-{{* relref "document.md" */>}}
-{{* relref "#anchor" */>}}
-{{* relref "document.md#anchor" */>}}
-```
-
-The single parameter to `ref` is a string with a content `documentname` (e.g., `about.md`) with or without an appended in-document `anchor` (`#who`) without spaces. Hugo is flexible in how we search for documents, so the file suffix may be omitted.
-
-**Paths without a leading `/` will first be tried resolved relative to the current page.**
-
-You will get an error if your document could not be uniquely resolved. The error behaviour can be configured, see below.
-
-### Link to another language version
-
-Link to another language version of a document, you need to use this syntax:
-
-```go-html-template
-{{* relref path="document.md" lang="ja" */>}}
-```
-
-### Get another Output Format
-
-To link to a given Output Format of a document, you can use this syntax:
-
-```go-html-template
-{{* relref path="document.md" outputFormat="rss" */>}}
-```
-
-### Anchors
-
-When an `anchor` is provided by itself, the current page’s unique identifier will be appended; when an `anchor` is provided appended to `documentname`, the found page's unique identifier will be appended:
-
-```go-html-template
-{{* relref "#anchors" */>}} => #anchors:9decaf7
-```
-
-The above examples render as follows for this very page as well as a reference to the "Content" heading in the Hugo docs features pageyoursite
-
-```go-html-template
-{{* relref "#who" */>}} => #who:9decaf7
-{{* relref "/blog/post.md#who" */>}} => /blog/post/#who:badcafe
-```
-
-More information about document unique identifiers and headings can be found [below]({{< ref "#hugo-heading-anchors" >}}).
-
-
-## Ref and RelRef Configuration
-
-The behaviour can, since Hugo 0.45, be configured in `config.toml`:
-
-refLinksErrorLevel ("ERROR")
-: When using `ref` or `relref` to resolve page links and a link cannot resolved, it will be logged with this log level. Valid values are `ERROR` (default) or `WARNING`. Any `ERROR` will fail the build (`exit -1`).
-
-refLinksNotFoundURL
-: URL to be used as a placeholder when a page reference cannot be found in `ref` or `relref`. Is used as-is.
-
-
-[lists]: /templates/lists/
-[output formats]: /templates/output-formats/
-[shortcode]: /content-management/shortcodes/
-[bfext]: /content-management/formats/#blackfriday-extensions
diff --git a/docs/content/en/content-management/data-sources.md b/docs/content/en/content-management/data-sources.md
new file mode 100644
index 000000000..3fc98b36a
--- /dev/null
+++ b/docs/content/en/content-management/data-sources.md
@@ -0,0 +1,111 @@
+---
+title: Data sources
+description: Use local and remote data sources to augment or create content.
+categories: []
+keywords: []
+aliases: [/extras/datafiles/,/extras/datadrivencontent/,/doc/datafiles/,/templates/data-templates/]
+---
+
+Hugo can access and [unmarshal](g) local and remote data sources including CSV, JSON, TOML, YAML, and XML. Use this data to augment existing content or to create new content.
+
+A data source might be a file in the `data` directory, a [global resource](g), a [page resource](g), or a [remote resource](g).
+
+## Data directory
+
+The `data` directory in the root of your project may contain one or more data files, in either a flat or nested tree. Hugo merges the data files to create a single data structure, accessible with the `Data` method on a `Site` object.
+
+Hugo also merges data directories from themes and modules into this single data structure, where the `data` directory in the root of your project takes precedence.
+
+> [!note]
+> Hugo reads the combined data structure into memory and keeps it there for the entire build. For data that is infrequently accessed, use global or page resources instead.
+
+Theme and module authors may wish to namespace their data files to prevent collisions. For example:
+
+```text
+project/
+└── data/
+ └── mytheme/
+ └── foo.json
+```
+
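+A template could then read the namespaced file with something like this (assuming `foo.json` contains a `greeting` field; both names are illustrative):
+
+```go-html-template
+{{ with site.Data.mytheme.foo }}
+  <p>{{ .greeting }}</p>
+{{ end }}
+```
+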
+> [!note]
+> Do not place CSV files in the `data` directory. Access CSV files as page, global, or remote resources.
+
+See the documentation for the [`Data`] method on a `Site` object for details and examples.
+
+## Global resources
+
+Use the `resources.Get` and `transform.Unmarshal` functions to access data files that exist as global resources.
+
+See the [`transform.Unmarshal`](/functions/transform/unmarshal/#global-resource) documentation for details and examples.
+
+## Page resources
+
+Use the `Resources.Get` method on a `Page` object combined with the `transform.Unmarshal` function to access data files that exist as page resources.
+
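+For example, a leaf bundle might keep a small JSON file next to its `index.md` (the `ratings.json` name is illustrative, and the sketch assumes the file holds a JSON array):
+
+```go-html-template
+{{ with .Resources.Get "ratings.json" }}
+  {{ $ratings := . | transform.Unmarshal }}
+  <p>{{ len $ratings }} ratings</p>
+{{ end }}
+```
+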
+See the [`transform.Unmarshal`](/functions/transform/unmarshal/#page-resource) documentation for details and examples.
+
+## Remote resources
+
+Use the `resources.GetRemote` and `transform.Unmarshal` functions to access remote data.
+
+See the [`transform.Unmarshal`](/functions/transform/unmarshal/#remote-resource) documentation for details and examples.
+
+## Augment existing content
+
+Use data sources to augment existing content. For example, create a shortcode to render an HTML table from a global CSV resource.
+
+```csv {file="assets/pets.csv"}
+"name","type","breed","age"
+"Spot","dog","Collie","3"
+"Felix","cat","Malicious","7"
+```
+
+```text {file="content/example.md"}
+{{* csv-to-table "pets.csv" */>}}
+```
+
+```go-html-template {file="layouts/shortcodes/csv-to-table.html"}
+{{ with $file := .Get 0 }}
+ {{ with resources.Get $file }}
+ {{ with . | transform.Unmarshal }}
+      <table>
+        <thead>
+          <tr>
+            {{ range index . 0 }}
+              <th>{{ . }}</th>
+            {{ end }}
+          </tr>
+        </thead>
+        <tbody>
+          {{ range after 1 . }}
+            <tr>
+              {{ range . }}
+                <td>{{ . }}</td>
+              {{ end }}
+            </tr>
+          {{ end }}
+        </tbody>
+      </table>
+    {{ end }}
+ {{ else }}
+ {{ errorf "The %q shortcode was unable to find %s. See %s" $.Name $file $.Position }}
+ {{ end }}
+{{ else }}
+ {{ errorf "The %q shortcode requires one positional argument, the path to the CSV file relative to the assets directory. See %s" .Name .Position }}
+{{ end }}
+```
+
+Hugo renders this to:
+
+name|type|breed|age
+:--|:--|:--|:--
+Spot|dog|Collie|3
+Felix|cat|Malicious|7
+
+## Create new content
+
+Use [content adapters] to create new content.
+
+[`Data`]: /methods/site/data/
+[content adapters]: /content-management/content-adapters/
diff --git a/docs/content/en/content-management/diagrams.md b/docs/content/en/content-management/diagrams.md
new file mode 100644
index 000000000..0070ced59
--- /dev/null
+++ b/docs/content/en/content-management/diagrams.md
@@ -0,0 +1,260 @@
+---
+title: Diagrams
+description: Use fenced code blocks and Markdown render hooks to include diagrams in your content.
+categories: []
+keywords: []
+---
+
+## GoAT diagrams (ASCII)
+
+Hugo natively supports [GoAT] diagrams with an [embedded code block render hook]. This means that this code block:
+
+````txt
+```goat
+ . . . .--- 1 .-- 1 / 1
+ / \ | | .---+ .-+ +
+ / \ .---+---. .--+--. | '--- 2 | '-- 2 / \ 2
+ + + | | | | ---+ ---+ +
+ / \ / \ .-+-. .-+-. .+. .+. | .--- 3 | .-- 3 \ / 3
+ / \ / \ | | | | | | | | '---+ '-+ +
+ 1 2 3 4 1 2 3 4 1 2 3 4 '--- 4 '-- 4 \ 4
+
+```
+````
+
+Will be rendered as:
+
+```goat
+
+ . . . .--- 1 .-- 1 / 1
+ / \ | | .---+ .-+ +
+ / \ .---+---. .--+--. | '--- 2 | '-- 2 / \ 2
+ + + | | | | ---+ ---+ +
+ / \ / \ .-+-. .-+-. .+. .+. | .--- 3 | .-- 3 \ / 3
+ / \ / \ | | | | | | | | '---+ '-+ +
+ 1 2 3 4 1 2 3 4 1 2 3 4 '--- 4 '-- 4 \ 4
+```
+
+## Mermaid diagrams
+
+Hugo does not provide a built-in template for Mermaid diagrams. Create your own using a [code block render hook]:
+
+```go-html-template {file="layouts/_default/_markup/render-codeblock-mermaid.html" copy=true}
+<pre class="mermaid">
+ {{ .Inner | htmlEscape | safeHTML }}
+</pre>
+{{ .Page.Store.Set "hasMermaid" true }}
+```
+
+Then include this snippet at the _bottom_ of your base template, before the closing `body` tag:
+
+```go-html-template {file="layouts/_default/baseof.html" copy=true}
+{{ if .Store.Get "hasMermaid" }}
+  <script type="module">
+    import mermaid from 'https://cdn.jsdelivr.net/npm/mermaid/dist/mermaid.esm.min.mjs';
+    mermaid.initialize({ startOnLoad: true });
+  </script>
+{{ end }}
+```
+
+With that you can use the `mermaid` language in Markdown code blocks:
+
+````text {copy=true}
+```mermaid
+sequenceDiagram
+ participant Alice
+ participant Bob
+ Alice->>John: Hello John, how are you?
+ loop Healthcheck
+ John->>John: Fight against hypochondria
+ end
+ Note right of John: Rational thoughts prevail!
+ John-->>Alice: Great!
+ John->>Bob: How about you?
+ Bob-->>John: Jolly good!
+```
+````
+
+## Goat ASCII diagram examples
+
+### Graphics
+
+```goat
+ .
+ 0 3 P * Eye / ^ /
+ *-------* +y \ +) \ / Reflection
+ 1 /| 2 /| ^ \ \ \ v
+ *-------* | | v0 \ v3 --------*--------
+ | |4 | |7 | *----\-----*
+ | *-----|-* +-----> +x / v X \ .-.<-------- o
+ |/ |/ / / o \ | / | Refraction / \
+ *-------* v / \ +-' / \
+ 5 6 +z v1 *------------------* v2 | o-----o
+ v
+
+```
+
+### Complex
+
+```goat
++-------------------+ ^ .---.
+| A Box |__.--.__ __.--> | .-. | |
+| | '--' v | * |<--- | |
++-------------------+ '-' | |
+ Round *---(-. |
+ .-----------------. .-------. .----------. .-------. | | |
+ | Mixed Rounded | | | / Diagonals \ | | | | | |
+ | & Square Corners | '--. .--' / \ |---+---| '-)-' .--------.
+ '--+------------+-' .--. | '-------+--------' | | | | / Search /
+ | | | | '---. | '-------' | '-+------'
+ |<---------->| | | | v Interior | ^
+ ' <---' '----' .-----------. ---. .--- v |
+ .------------------. Diag line | .-------. +---. \ / . |
+ | if (a > b) +---. .--->| | | | | Curved line \ / / \ |
+ | obj->fcn() | \ / | '-------' |<--' + / \ |
+ '------------------' '--' '--+--------' .--. .--. | .-. +Done?+-'
+ .---+-----. | ^ |\ | | /| .--+ | | \ /
+ | | | Join \|/ | | Curved | \| |/ | | \ | \ /
+ | | +----> o --o-- '-' Vertical '--' '--' '-- '--' + .---.
+ <--+---+-----' | /|\ | | 3 |
+ v not:line 'quotes' .-' '---'
+ .-. .---+--------. / A || B *bold* | ^
+ | | | Not a dot | <---+---<-- A dash--is not a line v |
+ '-' '---------+--' / Nor/is this. ---
+
+```
+
+### Process
+
+```goat
+ .
+ .---------. / \
+ | START | / \ .-+-------+-. ___________
+ '----+----' .-------. A / \ B | |COMPLEX| | / \ .-.
+ | | END |<-----+CHOICE +----->| | | +--->+ PREPARATION +--->| X |
+ v '-------' \ / | |PROCESS| | \___________/ '-'
+ .---------. \ / '-+---+---+-'
+ / INPUT / \ /
+ '-----+---' '
+ | ^
+ v |
+ .-----------. .-----+-----. .-.
+ | PROCESS +---------------->| PROCESS |<------+ X |
+ '-----------' '-----------' '-'
+```
+
+### File tree
+
+Created from
+
+```goat {width=300 color="orange"}
+───Linux─┬─Android
+ ├─Debian─┬─Ubuntu─┬─Lubuntu
+ │ │ ├─Kubuntu
+ │ │ ├─Xubuntu
+ │ │ └─Xubuntu
+ │ └─Mint
+ ├─Centos
+ └─Fedora
+```
+
+### Sequence diagram
+
+
+
+```goat {class="w-40"}
+┌─────┐ ┌───┐
+│Alice│ │Bob│
+└──┬──┘ └─┬─┘
+ │ │
+ │ Hello Bob! │
+ │───────────>│
+ │ │
+ │Hello Alice!│
+ │<───────────│
+┌──┴──┐ ┌─┴─┐
+│Alice│ │Bob│
+└─────┘ └───┘
+
+```
+
+### Flowchart
+
+
+
+```goat
+ _________________
+ ╱ ╲ ┌─────┐
+ ╱ DO YOU UNDERSTAND ╲____________________________________________________│GOOD!│
+ ╲ FLOW CHARTS? ╱yes └──┬──┘
+ ╲_________________╱ │
+ │no │
+ _________▽_________ ______________________ │
+ ╱ ╲ ╱ ╲ ┌────┐ │
+╱ OKAY, YOU SEE THE ╲________________╱ ... AND YOU CAN SEE ╲___│GOOD│ │
+╲ LINE LABELED 'YES'? ╱yes ╲ THE ONES LABELED 'NO'? ╱yes└──┬─┘ │
+ ╲___________________╱ ╲______________________╱ │ │
+ │no │no │ │
+ ________▽_________ _________▽__________ │ │
+ ╱ ╲ ┌───────────┐ ╱ ╲ │ │
+ ╱ BUT YOU SEE THE ╲___│WAIT, WHAT?│ ╱ BUT YOU JUST ╲___ │ │
+ ╲ ONES LABELED 'NO'? ╱yes└───────────┘ ╲ FOLLOWED THEM TWICE? ╱yes│ │ │
+ ╲__________________╱ ╲____________________╱ │ │ │
+ │no │no │ │ │
+ ┌───▽───┐ │ │ │ │
+ │LISTEN.│ └───────┬───────┘ │ │
+ └───┬───┘ ┌──────▽─────┐ │ │
+ ┌─────▽────┐ │(THAT WASN'T│ │ │
+ │I HATE YOU│ │A QUESTION) │ │ │
+ └──────────┘ └──────┬─────┘ │ │
+ ┌────▽───┐ │ │
+ │SCREW IT│ │ │
+ └────┬───┘ │ │
+ └─────┬─────┘ │
+ │ │
+ └─────┬─────┘
+ ┌───────▽──────┐
+ │LET'S GO DRING│
+ └───────┬──────┘
+ ┌─────────▽─────────┐
+ │HEY, I SHOULD TRY │
+ │INSTALLING FREEBSD!│
+ └───────────────────┘
+
+```
+
+### Table
+
+
+
+```goat {class="w-80 dark-blue"}
+┌────────────────────────────────────────────────┐
+│ │
+├────────────────────────────────────────────────┤
+│SYNTAX = { PRODUCTION } . │
+├────────────────────────────────────────────────┤
+│PRODUCTION = IDENTIFIER "=" EXPRESSION "." . │
+├────────────────────────────────────────────────┤
+│EXPRESSION = TERM { "|" TERM } . │
+├────────────────────────────────────────────────┤
+│TERM = FACTOR { FACTOR } . │
+├────────────────────────────────────────────────┤
+│FACTOR = IDENTIFIER │
+├────────────────────────────────────────────────┤
+│ | LITERAL │
+├────────────────────────────────────────────────┤
+│ | "[" EXPRESSION "]" │
+├────────────────────────────────────────────────┤
+│ | "(" EXPRESSION ")" │
+├────────────────────────────────────────────────┤
+│ | "{" EXPRESSION "}" . │
+├────────────────────────────────────────────────┤
+│IDENTIFIER = letter { letter } . │
+├────────────────────────────────────────────────┤
+│LITERAL = """" character { character } """" .│
+└────────────────────────────────────────────────┘
+```
+
+[code block render hook]: /render-hooks/code-blocks/
+[embedded code block render hook]: {{% eturl render-codeblock-goat %}}
+[GoAT]: https://github.com/bep/goat
diff --git a/docs/content/en/content-management/formats.md b/docs/content/en/content-management/formats.md
index 158f34199..1acaae063 100644
--- a/docs/content/en/content-management/formats.md
+++ b/docs/content/en/content-management/formats.md
@@ -1,262 +1,132 @@
---
-title: Supported Content Formats
-linktitle: Supported Content Formats
-description: Both HTML and Markdown are supported content formats.
-date: 2017-01-10
-publishdate: 2017-01-10
-lastmod: 2017-04-06
-categories: [content management]
-keywords: [markdown,asciidoc,mmark,pandoc,content format]
-menu:
- docs:
- parent: "content-management"
- weight: 20
-weight: 20 #rem
-draft: false
-aliases: [/content/markdown-extras/,/content/supported-formats/,/doc/supported-formats/,/tutorials/mathjax/]
-toc: true
+title: Content formats
+description: Create your content using Markdown, HTML, Emacs Org Mode, AsciiDoc, Pandoc, or reStructuredText.
+categories: []
+keywords: []
+aliases: [/content/markdown-extras/,/content/supported-formats/,/doc/supported-formats/]
---
-**Markdown is the main content format** and comes in two flavours: The excellent [Blackfriday project][blackfriday] (name your files `*.md` or set `markup = "markdown"` in front matter) or its fork [Mmark][mmark] (name your files `*.mmark` or set `markup = "mmark"` in front matter), both very fast markdown engines written in Go.
+## Introduction
-For Emacs users, [go-org](https://github.com/niklasfasching/go-org) provides built-in native support for Org-mode (name your files `*.org` or set `markup = "org"` in front matter)
+You may mix content formats throughout your site. For example:
-But in many situations, plain HTML is what you want. Just name your files with `.html` or `.htm` extension inside your content folder. Note that if you want your HTML files to have a layout, they need front matter. It can be empty, but it has to be there:
-
-```html
----
-title: "This is a content file in HTML"
----
-
-
-
Hello, Hugo!
-
+```text
+content/
+└── posts/
+ ├── post-1.md
+ ├── post-2.adoc
+ ├── post-3.org
+ ├── post-4.pandoc
+ ├── post-5.rst
+ └── post-6.html
```
-{{% note "Deeply Nested Lists" %}}
-Before you begin writing your content in markdown, Blackfriday has a known issue [(#329)](https://github.com/russross/blackfriday/issues/329) with handling deeply nested lists. Luckily, there is an easy workaround. Use 4-spaces (i.e., tab) rather than 2-space indentations.
-{{% /note %}}
+Regardless of content format, all content must have [front matter], preferably including both `title` and `date`.
-## Configure BlackFriday Markdown Rendering
+Hugo selects the content renderer based on the `markup` identifier in front matter, falling back to the file extension. See the [classification] table below for a list of markup identifiers and recognized file extensions.
-You can configure multiple aspects of Blackfriday as show in the following list. See the docs on [Configuration][config] for the full list of explicit directions you can give to Hugo when rendering your site.
+[classification]: #classification
+[front matter]: /content-management/front-matter/
-{{< readfile file="/content/en/readfiles/bfconfig.md" markdown="true" >}}
+## Formats
-## Extend Markdown
+### Markdown
-Hugo provides some convenient methods for extending markdown.
+Create your content in [Markdown] preceded by front matter.
-### Task Lists
+Markdown is Hugo's default content format. Hugo natively renders Markdown to HTML using [Goldmark]. Goldmark is fast and conforms to the [CommonMark] and [GitHub Flavored Markdown] specifications. You can configure Goldmark in your [site configuration][configure goldmark].
-Hugo supports [GitHub-styled task lists (i.e., TODO lists)][gfmtasks] for the Blackfriday markdown renderer. If you do not want to use this feature, you can disable it in your configuration.
+Hugo provides custom Markdown features including:
-#### Example Task List Input
+[Attributes]
+: Apply HTML attributes such as `class` and `id` to Markdown images and block elements including blockquotes, fenced code blocks, headings, horizontal rules, lists, paragraphs, and tables.
-{{< code file="content/my-to-do-list.md" >}}
-- [ ] a task list item
-- [ ] list syntax required
-- [ ] incomplete
-- [x] completed
-{{< /code >}}
+[Extensions]
+: Leverage the embedded Markdown extensions to create tables, definition lists, footnotes, task lists, inserted text, mark text, subscripts, superscripts, and more.
-#### Example Task List Output
+[Mathematics]
+: Include mathematical equations and expressions in Markdown using LaTeX markup.
-The preceding markdown produces the following HTML in your rendered website:
+[Render hooks]
+: Override the conversion of Markdown to HTML when rendering fenced code blocks, headings, images, and links. For example, render every standalone image as an HTML `figure` element.
-```
-
-
a task list item
-
list syntax required
-
incomplete
-
completed
-
+[Attributes]: /content-management/markdown-attributes/
+[CommonMark]: https://spec.commonmark.org/current/
+[Extensions]: /configuration/markup/#extensions
+[GitHub Flavored Markdown]: https://github.github.com/gfm/
+[Goldmark]: https://github.com/yuin/goldmark
+[Markdown]: https://daringfireball.net/projects/markdown/
+[Mathematics]: /content-management/mathematics/
+[Render hooks]: /render-hooks/introduction/
+[configure goldmark]: /configuration/markup/#goldmark
+
+### HTML
+
+Create your content in [HTML] preceded by front matter. The content is typically what you would place within an HTML document's `body` or `main` element.
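+
+For example, a minimal HTML content file might look like this:
+
+```html {file="content/about.html"}
+---
+title: 'About'
+---
+<p>Hello, Hugo!</p>
+```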
+
+[HTML]: https://developer.mozilla.org/en-US/docs/Learn_web_development/Getting_started/Your_first_website/Creating_the_content
+
+### Emacs Org Mode
+
+Create your content in the [Emacs Org Mode] format preceded by front matter. You can use Org Mode keywords for front matter. See [details].
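+
+For example, a minimal Org Mode content file might use keywords in place of front matter (the values are illustrative):
+
+```text {file="content/posts/my-post.org"}
+#+TITLE: My post
+#+DATE: 2024-01-15
+
+My first Org Mode post.
+```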
+
+[details]: /content-management/front-matter/#emacs-org-mode
+[Emacs Org Mode]: https://orgmode.org/
+
+### AsciiDoc
+
+Create your content in the [AsciiDoc] format preceded by front matter. Hugo renders AsciiDoc content to HTML using the Asciidoctor executable. You must install Asciidoctor and its dependencies (Ruby) to use the AsciiDoc content format.
+
+You can configure the AsciiDoc renderer in your [site configuration][configure asciidoc].
+
+In its default configuration, Hugo passes these CLI flags when calling the Asciidoctor executable:
+
+```text
+--no-header-footer
```
-#### Example Task List Display
+The CLI flags passed to the Asciidoctor executable depend on configuration. You may inspect the flags when building your site:
-The following shows how the example task list will look to the end users of your website. Note that visual styling of lists is up to you. This list has been styled according to [the Hugo Docs stylesheet][hugocss].
-
-- [ ] a task list item
-- [ ] list syntax required
-- [ ] incomplete
-- [x] completed
-
-### Emojis
-
-To add emojis directly to content, set `enableEmoji` to `true` in your [site configuration][config]. To use emojis in templates or shortcodes, see [`emojify` function][].
-
-For a full list of emojis, see the [Emoji cheat sheet][emojis].
-
-### Shortcodes
-
-If you write in Markdown and find yourself frequently embedding your content with raw HTML, Hugo provides built-in shortcodes functionality. This is one of the most powerful features in Hugo and allows you to create your own Markdown extensions very quickly.
-
-See [Shortcodes][sc] for usage, particularly for the built-in shortcodes that ship with Hugo, and [Shortcode Templating][sct] to learn how to build your own.
-
-### Code Blocks
-
-Hugo supports GitHub-flavored markdown's use of triple back ticks, as well as provides a special [`highlight` shortcode][hlsc], and syntax highlights those code blocks natively using *Chroma*. Users also have an option to use *Pygments* instead. See the [Syntax Highlighting][hl] section for details.
-
-## Mmark
-
-Mmark is a [fork of BlackFriday][mmark] and markdown superset that is well suited for writing [IETF documentation][ietf]. You can see examples of the syntax in the [Mmark GitHub repository][mmark] or the full syntax on [Miek Gieben's website][].
-
-### Use Mmark
-
-As Hugo ships with Mmark, using the syntax is as easy as changing the extension of your content files from `.md` to `.mmark`.
-
-In the event that you want to only use Mmark in specific files, you can also define the Mmark syntax in your content's front matter:
-
-```
----
-title: My Post
-date: 2017-04-01
-markup: mmark
----
+```text
+hugo --logLevel info
```
-{{% warning %}}
-Thare are some features not available in Mmark; one example being that shortcodes are not translated when used in an included `.mmark` file ([#3131](https://github.com/gohugoio/hugo/issues/3137)), and `EXTENSION_ABBREVIATION` ([#1970](https://github.com/gohugoio/hugo/issues/1970)) and the aforementioned GFM todo lists ([#2270](https://github.com/gohugoio/hugo/issues/2270)) are not fully supported. Contributions are welcome.
-{{% /warning %}}
+[AsciiDoc]: https://asciidoc.org/
+[configure the AsciiDoc renderer]: /configuration/markup/#asciidoc
+[configure asciidoc]: /configuration/markup/#asciidoc
-## MathJax with Hugo
+### Pandoc
-[MathJax](http://www.mathjax.org/) is a JavaScript library that allows the display of mathematical expressions described via a LaTeX-style syntax in the HTML (or Markdown) source of a web page. As it is a pure a JavaScript library, getting it to work within Hugo is fairly straightforward, but does have some oddities that will be discussed here.
+Create your content in the [Pandoc] format preceded by front matter. Hugo renders Pandoc content to HTML using the Pandoc executable. You must install Pandoc to use the Pandoc content format.
-This is not an introduction into actually using MathJax to render typeset mathematics on your website. Instead, this page is a collection of tips and hints for one way to get MathJax working on a website built with Hugo.
+Hugo passes these CLI flags when calling the Pandoc executable:
-### Enable MathJax
+```text
+--mathjax
+```
-The first step is to enable MathJax on pages that you would like to have typeset math. There are multiple ways to do this (adventurous readers can consult the [Loading and Configuring](http://docs.mathjax.org/en/latest/configuration.html) section of the MathJax documentation for additional methods of including MathJax), but the easiest way is to use the secure MathJax CDN by include a `
-{{< /code >}}
+### reStructuredText
-One way to ensure that this code is included in all pages is to put it in one of the templates that live in the `layouts/partials/` directory. For example, I have included this in the bottom of my template `footer.html` because I know that the footer will be included in every page of my website.
+Create your content in the [reStructuredText] format preceded by front matter. Hugo renders reStructuredText content to HTML using [Docutils], specifically rst2html. You must install Docutils and its dependencies (Python) to use the reStructuredText content format.
-### Options and Features
+Hugo passes these CLI flags when calling the rst2html executable:
-MathJax is a stable open-source library with many features. I encourage the interested reader to view the [MathJax Documentation](http://docs.mathjax.org/en/latest/index.html), specifically the sections on [Basic Usage](http://docs.mathjax.org/en/latest/index.html#basic-usage) and [MathJax Configuration Options](http://docs.mathjax.org/en/latest/index.html#mathjax-configuration-options).
+```text
+--leave-comments --initial-header-level=2
+```
-### Issues with Markdown
+[Docutils]: https://docutils.sourceforge.io/
+[reStructuredText]: https://docutils.sourceforge.io/rst.html
-{{% note %}}
-The following issues with Markdown assume you are using `.md` for content and BlackFriday for parsing. Using [Mmark](#mmark) as your content format will obviate the need for the following workarounds.
+## Classification
-When using Mmark with MathJax, use `displayMath: [['$$','$$'], ['\\[','\\]']]`. See the [Mmark `README.md`](https://github.com/miekg/mmark/wiki/Syntax#math-blocks) for more information. In addition to MathJax, Mmark has been shown to work well with [KaTeX](https://github.com/Khan/KaTeX). See this [related blog post from a Hugo user](http://nosubstance.me/post/a-great-toolset-for-static-blogging/).
-{{% /note %}}
+{{% include "/_common/content-format-table.md" %}}
-After enabling MathJax, any math entered between proper markers (see the [MathJax documentation][mathjaxdocs]) will be processed and typeset in the web page. One issue that comes up, however, with Markdown is that the underscore character (`_`) is interpreted by Markdown as a way to wrap text in `emph` blocks while LaTeX (MathJax) interprets the underscore as a way to create a subscript. This "double speak" of the underscore can result in some unexpected and unwanted behavior.
+When converting content to HTML, Hugo uses:
-### Solution
+- Native renderers for Markdown, HTML, and Emacs Org mode
+- External renderers for AsciiDoc, Pandoc, and reStructuredText
-There are multiple ways to remedy this problem. One solution is to simply escape each underscore in your math code by entering `\_` instead of `_`. This can become quite tedious if the equations you are entering are full of subscripts.
-
-Another option is to tell Markdown to treat the MathJax code as verbatim code and not process it. One way to do this is to wrap the math expression inside a `<div>` `</div>` block. Markdown would ignore these sections and they would get passed directly on to MathJax and processed correctly. This works great for display style mathematics, but for inline math expressions the line break induced by the `<div>` is not acceptable. The syntax for instructing Markdown to treat inline text as verbatim is by wrapping it in backticks (`` ` ``). You might have noticed, however, that the text included in between backticks is rendered differently than standard text (on this site these are items highlighted in red). To get around this problem, we could create a new CSS entry that would apply standard styling to all inline verbatim text that includes MathJax code. Below I will show the HTML and CSS source that would accomplish this (note this solution was adapted from [this blog post](http://doswa.com/2011/07/20/mathjax-in-markdown.html)---all credit goes to the original author).
-
-{{< code file="mathjax-markdown-solution.html" >}}
-
-
-
-{{< /code >}}
-
-
-
-As before, this content should be included in the HTML source of each page that will be using MathJax. The next code snippet contains the CSS that is used to have verbatim MathJax blocks render with the same font style as the body of the page.
-
-{{< code file="mathjax-style.css" >}}
-code.has-jax {
- font: inherit;
- font-size: 100%;
- background: inherit;
- border: inherit;
- color: #515151;
-}
-{{< /code >}}
-
-In the CSS snippet, notice the line `color: #515151;`. `#515151` is the value assigned to the `color` attribute of the `body` class in my CSS. In order for the equations to fit in with the body of a web page, this value should be the same as the color of the body.
-
-### Usage
-
-With this setup, everything is in place for a natural usage of MathJax on pages generated using Hugo. In order to include inline mathematics, just put LaTeX code in between `` `$ TeX Code $` `` or `` `\( TeX Code \)` ``. To include display style mathematics, just put LaTeX code in between `<div>$$TeX Code$$</div>`. All the math will be properly typeset and displayed within your Hugo generated web page!
-
-## Additional Formats Through External Helpers
-
-Hugo has a new concept called _external helpers_. It means that you can write your content using [Asciidoc][ascii], [reStructuredText][rest], or [pandoc]. If you have files with associated extensions, Hugo will call external commands to generate the content. ([See the Hugo source code for external helpers][helperssource].)
-
-For example, for Asciidoc files, Hugo will try to call the `asciidoctor` or `asciidoc` command. This means that you will have to install the associated tool on your machine to be able to use these formats. ([See the Asciidoctor docs for installation instructions](http://asciidoctor.org/docs/install-toolchain/)).
-
-To use these formats, just use the standard extension and the front matter exactly as you would do with natively supported `.md` files.
-
-Hugo passes reasonable default arguments to these external helpers by default:
-
-- `asciidoc`: `--no-header-footer --safe -`
-- `asciidoctor`: `--no-header-footer --safe --trace -`
-- `rst2html`: `--leave-comments --initial-header-level=2`
-- `pandoc`: `--mathjax`
-
-{{% warning "Performance of External Helpers" %}}
-Because additional formats are external commands generation performance will rely heavily on the performance of the external tool you are using. As this feature is still in its infancy, feedback is welcome.
-{{% /warning %}}
-
-## Learn Markdown
-
-Markdown syntax is simple enough to learn in a single sitting. The following are excellent resources to get you up and running:
-
-* [Daring Fireball: Markdown, John Gruber (Creator of Markdown)][fireball]
-* [Markdown Cheatsheet, Adam Pritchard][mdcheatsheet]
-* [Markdown Tutorial (Interactive), Garen Torikian][mdtutorial]
-* [The Markdown Guide, Matt Cone][mdguide]
-
-[`emojify` function]: /functions/emojify/
-[ascii]: http://asciidoctor.org/
-[bfconfig]: /getting-started/configuration/#configuring-blackfriday-rendering
-[blackfriday]: https://github.com/russross/blackfriday
-[mmark]: https://github.com/miekg/mmark
-[config]: /getting-started/configuration/
-[developer tools]: /tools/
-[emojis]: https://www.webpagefx.com/tools/emoji-cheat-sheet/
-[fireball]: https://daringfireball.net/projects/markdown/
-[gfmtasks]: https://guides.github.com/features/mastering-markdown/#syntax
-[helperssource]: https://github.com/gohugoio/hugo/blob/77c60a3440806067109347d04eb5368b65ea0fe8/helpers/general.go#L65
-[hl]: /content-management/syntax-highlighting/
-[hlsc]: /content-management/shortcodes/#highlight
-[hugocss]: /css/style.css
-[ietf]: https://tools.ietf.org/html/
-[mathjaxdocs]: https://docs.mathjax.org/en/latest/
-[mdcheatsheet]: https://github.com/adam-p/markdown-here/wiki/Markdown-Cheatsheet
-[mdguide]: https://www.markdownguide.org/
-[mdtutorial]: http://www.markdowntutorial.com/
-[Miek Gieben's website]: https://miek.nl/2016/march/05/mmark-syntax-document/
-[mmark]: https://github.com/mmarkdown/mmark
-[org]: http://orgmode.org/
-[pandoc]: http://www.pandoc.org/
-[Pygments]: http://pygments.org/
-[rest]: http://docutils.sourceforge.net/rst.html
-[sc]: /content-management/shortcodes/
-[sct]: /templates/shortcode-templates/
+Native renderers are faster than external renderers.
diff --git a/docs/content/en/content-management/front-matter.md b/docs/content/en/content-management/front-matter.md
index 37a51a12a..8bfbd1acc 100644
--- a/docs/content/en/content-management/front-matter.md
+++ b/docs/content/en/content-management/front-matter.md
@@ -1,191 +1,362 @@
---
-title: Front Matter
-linktitle:
-description: Hugo allows you to add front matter in yaml, toml, or json to your content files.
-date: 2017-01-09
-publishdate: 2017-01-09
-lastmod: 2017-02-24
-categories: [content management]
-keywords: ["front matter", "yaml", "toml", "json", "metadata", "archetypes"]
-menu:
- docs:
- parent: "content-management"
- weight: 30
-weight: 30 #rem
-draft: false
+title: Front matter
+description: Use front matter to add metadata to your content.
+categories: []
+keywords: []
aliases: [/content/front-matter/]
-toc: true
---
-**Front matter** allows you to keep metadata attached to an instance of a [content type][]---i.e., embedded inside a content file---and is one of the many features that gives Hugo its strength.
+## Overview
-{{< youtube Yh2xKRJGff4 >}}
+The front matter at the top of each content file is metadata that:
-## Front Matter Formats
+- Describes the content
+- Augments the content
+- Establishes relationships with other content
+- Controls the published structure of your site
+- Determines template selection
-Hugo supports four formats for front matter, each with their own identifying tokens.
+Provide front matter using a serialization format, one of [JSON], [TOML], or [YAML]. Hugo determines the front matter format by examining the delimiters that separate the front matter from the page content.
-TOML
-: identified by opening and closing `+++`.
+[json]: https://www.json.org/
+[toml]: https://toml.io/
+[yaml]: https://yaml.org/
-YAML
-: identified by opening and closing `---`.
+See examples of front matter delimiters by toggling between the serialization formats below.
-JSON
-: a single JSON object surrounded by '`{`' and '`}`', followed by a new line.
-
-ORG
-: a group of Org mode keywords in the format '`#+KEY: VALUE`'. Any line that does not start with `#+` ends the front matter section.
- Keyword values can be either strings (`#+KEY: VALUE`) or a whitespace separated list of strings (`#+KEY[]: VALUE_1 VALUE_2`).
-
-### Example
-
-{{< code-toggle >}}
-title = "spf13-vim 3.0 release and new website"
-description = "spf13-vim is a cross platform distribution of vim plugins and resources for Vim."
-tags = [ ".vimrc", "plugins", "spf13-vim", "vim" ]
-date = "2012-04-06"
-categories = [
- "Development",
- "VIM"
-]
-slug = "spf13-vim-3-0-release-and-new-website"
+{{< code-toggle file=content/example.md fm=true >}}
+title = 'Example'
+date = 2024-02-02T04:14:54-08:00
+draft = false
+weight = 10
+[params]
+author = 'John Smith'
{{< /code-toggle >}}
-## Front Matter Variables
+Front matter fields may be [boolean](g), [integer](g), [float](g), [string](g), [arrays](g), or [maps](g). Note that the TOML format also supports unquoted date/time values.
-### Predefined
+## Fields
-There are a few predefined variables that Hugo is aware of. See [Page Variables][pagevars] for how to call many of these predefined variables in your templates.
+The most common front matter fields are `date`, `draft`, `title`, and `weight`, but you can specify metadata using any of the fields below.
+
+> [!note]
+> The field names below are reserved. For example, you cannot create a custom field named `type`. Create custom fields under the `params` key. See the [parameters] section for details.
+
+[parameters]: #parameters
aliases
-: an array of one or more aliases (e.g., old published paths of renamed content) that will be created in the output directory structure . See [Aliases][aliases] for details.
+: (`string array`) An array of one or more aliases, where each alias is a relative URL that will redirect the browser to the current location. Access these values from a template using the [`Aliases`] method on a `Page` object. See the [aliases] section for details.
-audio
-: an array of paths to audio files related to the page; used by the `opengraph` [internal template](/templates/internal) to populate `og:audio`.
+build
+: (`map`) A map of [build options].
+
+cascade
+: (`map`) A map of front matter keys whose values are passed down to the page's descendants unless overwritten by self or a closer ancestor's cascade. See the [cascade] section for details.
date
-: the datetime assigned to this page. This is usually fetched from the `date` field in front matter, but this behaviour is configurable.
+: (`string`) The date associated with the page, typically the creation date. Note that the TOML format also supports unquoted date/time values. See the [dates](#dates) section for examples. Access this value from a template using the [`Date`] method on a `Page` object.
description
-: the description for the content.
+: (`string`) Conceptually different than the page `summary`, the description is typically rendered within a `meta` element within the `head` element of the published HTML file. Access this value from a template using the [`Description`] method on a `Page` object.
draft
-: if `true`, the content will not be rendered unless the `--buildDrafts` flag is passed to the `hugo` command.
+: (`bool`) Whether to disable rendering unless you pass the `--buildDrafts` flag to the `hugo` command. Access this value from a template using the [`Draft`] method on a `Page` object.
expiryDate
-: the datetime at which the content should no longer be published by Hugo; expired content will not be rendered unless the `--buildExpired` flag is passed to the `hugo` command.
+: (`string`) The page expiration date. On or after the expiration date, the page will not be rendered unless you pass the `--buildExpired` flag to the `hugo` command. Note that the TOML format also supports unquoted date/time values. See the [dates](#dates) section for examples. Access this value from a template using the [`ExpiryDate`] method on a `Page` object.
headless
-: if `true`, sets a leaf bundle to be [headless][headless-bundle].
-
-images
-: an array of paths to images related to the page; used by [internal templates](/templates/internal) such as `_internal/twitter_cards.html`.
+: (`bool`) Applicable to [leaf bundles], whether to set the `render` and `list` [build options] to `never`, creating a headless bundle of [page resources].
isCJKLanguage
-: if `true`, Hugo will explicitly treat the content as a CJK language; both `.Summary` and `.WordCount` work properly in CJK languages.
+: (`bool`) Whether the content language is in the [CJK](g) family. This value determines how Hugo calculates word count, and affects the values returned by the [`WordCount`], [`FuzzyWordCount`], [`ReadingTime`], and [`Summary`] methods on a `Page` object.
keywords
-: the meta keywords for the content.
-
-layout
-: the layout Hugo should select from the [lookup order][lookup] when rendering the content. If a `type` is not specified in the front matter, Hugo will look for the layout of the same name in the layout directory that corresponds with a content's section. See ["Defining a Content Type"][definetype]
+: (`string array`) An array of keywords, typically rendered within a `meta` element within the `head` element of the published HTML file, or used as a [taxonomy](g) to classify content. Access these values from a template using the [`Keywords`] method on a `Page` object.
lastmod
-: the datetime at which the content was last modified.
+: (`string`) The date that the page was last modified. Note that the TOML format also supports unquoted date/time values. See the [dates](#dates) section for examples. Access this value from a template using the [`Lastmod`] method on a `Page` object.
+
+layout
+: (`string`) Provide a template name to [target a specific template], overriding the default [template lookup order]. Set the value to the base file name of the template, excluding its extension. Access this value from a template using the [`Layout`] method on a `Page` object.
linkTitle
-: used for creating links to content; if set, Hugo defaults to using the `linktitle` before the `title`. Hugo can also [order lists of content by `linktitle`][bylinktitle].
+: (`string`) Typically a shorter version of the `title`. Access this value from a template using the [`LinkTitle`] method on a `Page` object.
markup
-: **experimental**; specify `"rst"` for reStructuredText (requires`rst2html`) or `"md"` (default) for Markdown.
+: (`string`) An identifier corresponding to one of the supported [content formats]. If not provided, Hugo determines the content renderer based on the file extension.
+
+menus
+: (`string`, `string array`, or `map`) If set, Hugo adds the page to the given menu or menus. See the [menus] page for details.
+
+modified
+: Alias to [lastmod](#lastmod).
outputs
-: allows you to specify output formats specific to the content. See [output formats][outputs].
+: (`string array`) The [output formats] to render. See [configure outputs] for more information.
+
+params
+: {{< new-in 0.123.0 />}}
+: (`map`) A map of custom [page parameters].
+
+pubdate
+: Alias to [publishDate](#publishdate).
publishDate
-: if in the future, content will not be rendered unless the `--buildFuture` flag is passed to `hugo`.
+: (`string`) The page publication date. Before the publication date, the page will not be rendered unless you pass the `--buildFuture` flag to the `hugo` command. Note that the TOML format also supports unquoted date/time values. See the [dates](#dates) section for examples. Access this value from a template using the [`PublishDate`] method on a `Page` object.
+
+published
+: Alias to [publishDate](#publishdate).
resources
-: used for configuring page bundle resources. See [Page Resources][page-resources].
+: (`map array`) An array of maps to provide metadata for [page resources].
-series
-: an array of series this page belongs to, as a subset of the `series` [taxonomy](/content-management/taxonomies/); used by the `opengraph` [internal template](/templates/internal) to populate `og:see_also`.
+sitemap
+: (`map`) A map of sitemap options. See the [sitemap templates] page for details. Access these values from a template using the [`Sitemap`] method on a `Page` object.
slug
-: appears as the tail of the output URL. A value specified in front matter will override the segment of the URL based on the filename.
+: (`string`) Overrides the last segment of the URL path. Not applicable to section pages. See the [URL management] page for details. Access this value from a template using the [`Slug`] method on a `Page` object.
summary
-: text used when providing a summary of the article in the `.Summary` page variable; details available in the [content-summaries](/content-management/summaries/) section.
+: (`string`) Conceptually different than the page `description`, the summary either summarizes the content or serves as a teaser to encourage readers to visit the page. Access this value from a template using the [`Summary`] method on a `Page` object.
title
-: the title for the content.
+: (`string`) The page title. Access this value from a template using the [`Title`] method on a `Page` object.
+
+translationKey
+: (`string`) An arbitrary value used to relate two or more translations of the same page, useful when the translated pages do not share a common path. Access this value from a template using the [`TranslationKey`] method on a `Page` object.
type
-: the type of the content; this value will be automatically derived from the directory (i.e., the [section][]) if not specified in front matter.
+: (`string`) The [content type](g), overriding the value derived from the top-level section in which the page resides. Access this value from a template using the [`Type`] method on a `Page` object.
+
+unpublishdate
+: Alias to [expirydate](#expirydate).
url
-: the full path to the content from the web root. It makes no assumptions about the path of the content file. It also ignores any language prefixes of
-the multilingual feature.
-
-videos
-: an array of paths to videos related to the page; used by the `opengraph` [internal template](/templates/internal) to populate `og:video`.
+: (`string`) Overrides the entire URL path. Applicable to regular pages and section pages. See the [URL management] page for details.
weight
-: used for [ordering your content in lists][ordering]. Lower weight gets higher precedence. So content with lower weight will come first.
+: (`int`) The page [weight](g), used to order the page within a [page collection](g). Access this value from a template using the [`Weight`] method on a `Page` object.
-\
-: field name of the *plural* form of the index. See `tags` and `categories` in the above front matter examples. _Note that the plural form of user-defined taxonomies cannot be the same as any of the predefined front matter variables._
+[URL management]: /content-management/urls/#slug
+[`Summary`]: /methods/page/summary/
+[`aliases`]: /methods/page/aliases/
+[`date`]: /methods/page/date/
+[`description`]: /methods/page/description/
+[`draft`]: /methods/page/draft/
+[`expirydate`]: /methods/page/expirydate/
+[`fuzzywordcount`]: /methods/page/wordcount/
+[`keywords`]: /methods/page/keywords/
+[`lastmod`]: /methods/page/date/
+[`layout`]: /methods/page/layout/
+[`linktitle`]: /methods/page/linktitle/
+[`publishdate`]: /methods/page/publishdate/
+[`readingtime`]: /methods/page/readingtime/
+[`sitemap`]: /methods/page/sitemap/
+[`slug`]: /methods/page/slug/
+[`summary`]: /methods/page/summary/
+[`title`]: /methods/page/title/
+[`translationkey`]: /methods/page/translationkey/
+[`type`]: /methods/page/type/
+[`weight`]: /methods/page/weight/
+[`wordcount`]: /methods/page/wordcount/
+[aliases]: /content-management/urls/#aliases
+[build options]: /content-management/build-options/
+[cascade]: #cascade-1
+[configure outputs]: /configuration/outputs/#outputs-per-page
+[content formats]: /content-management/formats/#classification
+[leaf bundles]: /content-management/page-bundles/#leaf-bundles
+[menus]: /content-management/menus/#define-in-front-matter
+[output formats]: /configuration/output-formats/
+[page parameters]: #parameters
+[page resources]: /content-management/page-resources/#metadata
+[sitemap templates]: /templates/sitemap/
+[target a specific template]: /templates/lookup-order/#target-a-template
+[template lookup order]: /templates/lookup-order/
-{{% note "Hugo's Default URL Destinations" %}}
-If neither `slug` nor `url` is present and [permalinks are not configured otherwise in your site `config` file](/content-management/urls/#permalinks), Hugo will use the filename of your content to create the output URL. See [Content Organization](/content-management/organization) for an explanation of paths in Hugo and [URL Management](/content-management/urls/) for ways to customize Hugo's default behaviors.
-{{% /note %}}
+## Parameters
-### User-Defined
+{{< new-in 0.123.0 />}}
-You can add fields to your front matter arbitrarily to meet your needs. These user-defined key-values are placed into a single `.Params` variable for use in your templates.
+Specify custom page parameters under the `params` key in front matter:
-The following fields can be accessed via `.Params.include_toc` and `.Params.show_comments`, respectively. The [Variables][] section provides more information on using Hugo's page- and site-level variables in your templates.
+{{< code-toggle file=content/example.md fm=true >}}
+title = 'Example'
+date = 2024-02-02T04:14:54-08:00
+draft = false
+weight = 10
+[params]
+author = 'John Smith'
+{{< /code-toggle >}}
-{{< code-toggle copy="false" >}}
-include_toc: true
-show_comments: false
-{{ code-toggle >}}
+Access these values from a template using the [`Params`] or [`Param`] method on a `Page` object.
+[`param`]: /methods/page/param/
+[`params`]: /methods/page/params/
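+
+For example, a minimal sketch of reading the custom `author` parameter defined above from a template:
+
+```go-html-template
+{{/* Prints "John Smith" for the front matter shown above. */}}
+{{ .Params.author }}
+
+{{/* Same lookup with a fallback if the parameter is not set. */}}
+{{ .Param "author" | default "Anonymous" }}
+```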
-## Order Content Through Front Matter
+Hugo provides [embedded templates] to optionally insert metadata within the `head` element of your rendered pages. These embedded templates expect the following front matter parameters:
-You can assign content-specific `weight` in the front matter of your content. These values are especially useful for [ordering][ordering] in list views. You can use `weight` for ordering of content and the convention of [`_weight`][taxweight] for ordering content within a taxonomy. See [Ordering and Grouping Hugo Lists][lists] to see how `weight` can be used to organize your content in list views.
+Parameter|Data type|Used by these embedded templates
+:--|:--|:--
+`audio`|`[]string`|[`opengraph.html`]
+`images`|`[]string`|[`opengraph.html`], [`schema.html`], [`twitter_cards.html`]
+`videos`|`[]string`|[`opengraph.html`]
-## Override Global Markdown Configuration
+The embedded templates will skip a parameter if not provided in front matter, but will throw an error if the data type is unexpected.
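+
+For example, front matter that feeds the embedded Open Graph and Twitter Cards templates might look like this (the file names are illustrative):
+
+{{< code-toggle file=content/example.md fm=true >}}
+title = 'Example'
+[params]
+images = ['sunset.jpg']
+videos = ['clip.mp4']
+{{< /code-toggle >}}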
-It's possible to set some options for Markdown rendering in a content's front matter as an override to the [BlackFriday rendering options set in your project configuration][config].
+## Taxonomies
-## Front Matter Format Specs
+Classify content by adding taxonomy terms to front matter. For example, with this site configuration:
-* [TOML Spec][toml]
-* [YAML Spec][yaml]
-* [JSON Spec][json]
+{{< code-toggle file=hugo >}}
+[taxonomies]
+tag = 'tags'
+genre = 'genres'
+{{< /code-toggle >}}
-[variables]: /variables/
-[aliases]: /content-management/urls/#aliases/
-[archetype]: /content-management/archetypes/
-[bylinktitle]: /templates/lists/#by-link-title
-[config]: /getting-started/configuration/ "Hugo documentation for site configuration"
-[content type]: /content-management/types/
-[contentorg]: /content-management/organization/
-[definetype]: /content-management/types/#defining-a-content-type "Learn how to specify a type and a layout in a content's front matter"
-[headless-bundle]: /content-management/page-bundles/#headless-bundle
-[json]: https://www.ecma-international.org/publications/files/ECMA-ST/ECMA-404.pdf "Specification for JSON, JavaScript Object Notation"
-[lists]: /templates/lists/#ordering-content "See how to order content in list pages; for example, templates that look to specific _index.md for content and front matter."
-[lookup]: /templates/lookup-order/ "Hugo traverses your templates in a specific order when rendering content to allow for DRYer templating."
-[ordering]: /templates/lists/ "Hugo provides multiple ways to sort and order your content in list templates"
-[outputs]: /templates/output-formats/ "With the release of v22, you can output your content to any text format using Hugo's familiar templating"
-[page-resources]: /content-management/page-resources/
-[pagevars]: /variables/page/
-[section]: /content-management/sections/
-[taxweight]: /content-management/taxonomies/
-[toml]: https://github.com/toml-lang/toml "Specification for TOML, Tom's Obvious Minimal Language"
-[urls]: /content-management/urls/
-[variables]: /variables/
-[yaml]: http://yaml.org/spec/ "Specification for YAML, YAML Ain't Markup Language"
+Add taxonomy terms as shown below:
+
+{{< code-toggle file=content/example.md fm=true >}}
+title = 'Example'
+date = 2024-02-02T04:14:54-08:00
+draft = false
+weight = 10
+tags = ['red','blue']
+genres = ['mystery','romance']
+[params]
+author = 'John Smith'
+{{< /code-toggle >}}
+
+You can add taxonomy terms to the front matter of any of these [page kinds](g):
+
+- `home`
+- `page`
+- `section`
+- `taxonomy`
+- `term`
+
+Access taxonomy terms from a template using the [`Params`] or [`GetTerms`] method on a `Page` object. For example:
+
+```go-html-template {file="layouts/_default/single.html"}
+{{ with .GetTerms "tags" }}
+  <ul>
+    {{ range . }}
+      <li><a href="{{ .RelPermalink }}">{{ .LinkTitle }}</a></li>
+    {{ end }}
+  </ul>
+{{ end }}
+```
+
+[`Params`]: /methods/page/params/
+[`GetTerms`]: /methods/page/getterms/
+
+## Cascade
+
+A [node](g) can cascade front matter values to its descendants. However, this cascading will be prevented if the descendant already defines the field, or if a closer ancestor node has already cascaded a value for that same field.
+
+For example, to cascade a "color" parameter from the home page to all its descendants:
+
+{{< code-toggle file=content/_index.md fm=true >}}
+title = 'Home'
+[cascade.params]
+color = 'red'
+{{< /code-toggle >}}
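+
+Descendant pages can then read the cascaded value like any other page parameter. A minimal sketch:
+
+```go-html-template
+{{/* Prints "red" on any descendant page unless a closer ancestor or the page itself overrides it. */}}
+{{ .Params.color }}
+```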
+
+### Target
+
+
+
+The `target`[^1] keyword allows you to target specific pages or [environments](g). For example, to cascade a "color" parameter from the home page only to pages within the "articles" section, including the "articles" section page itself:
+
+[^1]: The `_target` alias for `target` is deprecated and will be removed in a future release.
+
+{{< code-toggle file=content/_index.md fm=true >}}
+title = 'Home'
+[cascade.params]
+color = 'red'
+[cascade.target]
+path = '{/articles,/articles/**}'
+{{< /code-toggle >}}
+
+Use any combination of these keywords to target pages and/or environments:
+
+environment
+: (`string`) A [glob](g) pattern matching the build [environment](g). For example: `{staging,production}`.
+
+kind
+: (`string`) A [glob](g) pattern matching the [page kind](g). For example: `{taxonomy,term}`.
+
+path
+: (`string`) A [glob](g) pattern matching the page's [logical path](g). For example: `{/books,/books/**}`.
+
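+For example, to cascade a parameter only when building for a specific [environment](g) (the parameter and value are illustrative):
+
+{{< code-toggle file=content/_index.md fm=true >}}
+title = 'Home'
+[cascade.params]
+banner = 'Preview build'
+[cascade.target]
+environment = '{staging,development}'
+{{< /code-toggle >}}
+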
+### Array
+
+Define an array of cascade parameters to apply different values to different targets. For example:
+
+{{< code-toggle file=content/_index.md fm=true >}}
+title = 'Home'
+[[cascade]]
+[cascade.params]
+color = 'red'
+[cascade.target]
+path = '{/books/**}'
+kind = 'page'
+[[cascade]]
+[cascade.params]
+color = 'blue'
+[cascade.target]
+path = '{/films/**}'
+kind = 'page'
+{{< /code-toggle >}}
+
+> [!note]
+> For multilingual sites, defining cascade values in your site configuration is often more efficient. This avoids repeating the same cascade values on the home, section, taxonomy, or term page for each language. See [details](/configuration/cascade/).
+>
+> If you choose to define cascade values in front matter for a multilingual site, you must create a corresponding home, section, taxonomy, or term page for every language.
+
+## Emacs Org Mode
+
+If your [content format] is [Emacs Org Mode], you may provide front matter using Org Mode keywords. For example:
+
+```text {file="content/example.org"}
+#+TITLE: Example
+#+DATE: 2024-02-02T04:14:54-08:00
+#+DRAFT: false
+#+AUTHOR: John Smith
+#+GENRES: mystery
+#+GENRES: romance
+#+TAGS: red
+#+TAGS: blue
+#+WEIGHT: 10
+```
+
+Note that you can also specify array elements on a single line:
+
+```text {file="content/example.org"}
+#+TAGS[]: red blue
+```
+
+[content format]: /content-management/formats/
+[emacs org mode]: https://orgmode.org/
+
+## Dates
+
+When populating a date field, whether a [custom page parameter](#parameters) or one of the four predefined fields ([`date`](#date), [`expiryDate`](#expirydate), [`lastmod`](#lastmod), [`publishDate`](#publishdate)), use one of these parsable formats:
+
+{{% include "/_common/parsable-date-time-strings.md" %}}
+
+To override the default time zone, set the [`timeZone`](/configuration/all/#timezone) in your site configuration. The order of precedence for determining the time zone is:
+
+1. The time zone offset in the date/time string
+1. The time zone specified in your site configuration
+1. The `Etc/UTC` time zone
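+
+For example, the first date below carries its own offset, while the second has none and therefore falls back to your site's `timeZone` setting (or `Etc/UTC` if unset). The values are illustrative:
+
+{{< code-toggle file=content/example.md fm=true >}}
+title = 'Example'
+date = 2024-02-02T04:14:54-08:00
+lastmod = '2024-03-15T10:00:00'
+{{< /code-toggle >}}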
+
+[`opengraph.html`]: {{% eturl opengraph %}}
+[`schema.html`]: {{% eturl schema %}}
+[`twitter_cards.html`]: {{% eturl twitter_cards %}}
+[embedded templates]: /templates/embedded/
diff --git a/docs/content/en/content-management/image-processing/index.md b/docs/content/en/content-management/image-processing/index.md
index b83a6c103..8d60c4f93 100644
--- a/docs/content/en/content-management/image-processing/index.md
+++ b/docs/content/en/content-management/image-processing/index.md
@@ -1,195 +1,447 @@
---
-title: "Image Processing"
-description: "Image Page resources can be resized and cropped."
-date: 2018-01-24T13:10:00-05:00
-lastmod: 2018-01-26T15:59:07-05:00
-linktitle: "Image Processing"
-categories: ["content management"]
-keywords: [bundle,content,resources,images]
-weight: 4004
-draft: false
-toc: true
-menu:
- docs:
- parent: "content-management"
- weight: 32
+title: Image processing
+description: Resize, crop, rotate, filter, and convert images.
+categories: []
+keywords: []
---
-## The Image Page Resource
+## Image resources
-The `image` is a [Page Resource]({{< relref "/content-management/page-resources" >}}), and the processing methods listed below does not work on images inside your `/static` folder.
+To process an image you must access the file as a page resource, global resource, or remote resource.
+### Page resource
-To get all images in a [Page Bundle]({{< relref "/content-management/organization#page-bundles" >}}):
+{{% glossary-term "page resource" %}}
+```text
+content/
+└── posts/
+ └── post-1/ <-- page bundle
+ ├── index.md
+ └── sunset.jpg <-- page resource
+```
+
+To access an image as a page resource:
```go-html-template
-{{ with .Resources.ByType "image" }}
+{{ $image := .Resources.Get "sunset.jpg" }}
+```
+
+### Global resource
+
+{{% glossary-term "global resource" %}}
+
+```text
+assets/
+└── images/
+ └── sunset.jpg <-- global resource
+```
+
+To access an image as a global resource:
+
+```go-html-template
+{{ $image := resources.Get "images/sunset.jpg" }}
+```
+
+### Remote resource
+
+{{% glossary-term "remote resource" %}}
+
+To access an image as a remote resource:
+
+```go-html-template
+{{ $image := resources.GetRemote "https://gohugo.io/img/hugo-logo.png" }}
+```
+
+## Image rendering
+
+Once you have accessed an image as a resource, render it in your templates using the `Permalink`, `RelPermalink`, `Width`, and `Height` properties.
+
+Example 1: Throws an error if the resource is not found.
+
+```go-html-template
+{{ $image := .Resources.GetMatch "sunset.jpg" }}
+<img src="{{ $image.RelPermalink }}" width="{{ $image.Width }}" height="{{ $image.Height }}" alt="">
+```
+
+Example 2: Skips image rendering if the resource is not found.
+
+```go-html-template
+{{ $image := .Resources.GetMatch "sunset.jpg" }}
+{{ with $image }}
+  <img src="{{ .RelPermalink }}" width="{{ .Width }}" height="{{ .Height }}" alt="">
{{ end }}
-
```
-## Image Processing Methods
-
-
-The `image` resource implements the methods `Resize`, `Fit` and `Fill`, each returning the transformed image using the specified dimensions and processing options.
-
-Resize
-: Resizes the image to the specified width and height.
-
-```go
-// Resize to a width of 600px and preserve ratio
-{{ $image := $resource.Resize "600x" }}
-
-// Resize to a height of 400px and preserve ratio
-{{ $image := $resource.Resize "x400" }}
-
-// Resize to a width 600px and a height of 400px
-{{ $image := $resource.Resize "600x400" }}
-```
-
-Fit
-: Scale down the image to fit the given dimensions while maintaining aspect ratio. Both height and width are required.
-
-```go
-{{ $image := $resource.Fit "600x400" }}
-```
-
-Fill
-: Resize and crop the image to match the given dimensions. Both height and width are required.
-
-```go
-{{ $image := $resource.Fill "600x400" }}
-```
-
-
-{{% note %}}
-Image operations in Hugo currently **do not preserve EXIF data** as this is not supported by Go's [image package](https://github.com/golang/go/search?q=exif&type=Issues&utf8=%E2%9C%93). This will be improved on in the future.
-{{% /note %}}
-
-
-## Image Processing Options
-
-In addition to the dimensions (e.g. `600x400`), Hugo supports a set of additional image options.
-
-
-JPEG Quality
-: Only relevant for JPEG images, values 1 to 100 inclusive, higher is better. Default is 75.
-
-```go
-{{ $image.Resize "600x q50" }}
-```
-
-Rotate
-: Rotates an image by the given angle counter-clockwise. The rotation will be performed first to get the dimensions correct. The main use of this is to be able to manually correct for [EXIF orientation](https://github.com/golang/go/issues/4341) of JPEG images.
-
-```go
-{{ $image.Resize "600x r90" }}
-```
-
-Anchor
-: Only relevant for the `Fill` method. This is useful for thumbnail generation where the main motive is located in, say, the left corner.
-Valid are `Center`, `TopLeft`, `Top`, `TopRight`, `Left`, `Right`, `BottomLeft`, `Bottom`, `BottomRight`.
-
-```go
-{{ $image.Fill "300x200 BottomLeft" }}
-```
-
-Resample Filter
-: Filter used in resizing. Default is `Box`, a simple and fast resampling filter appropriate for downscaling.
-
-Examples are: `Box`, `NearestNeighbor`, `Linear`, `Gaussian`.
-
-See https://github.com/disintegration/imaging for more. If you want to trade quality for faster processing, this may be a option to test.
-
-```go
-{{ $image.Resize "600x400 Gaussian" }}
-```
-
-## Image Processing Examples
-
-_The photo of the sunset used in the examples below is Copyright [Bjørn Erik Pedersen](https://commons.wikimedia.org/wiki/User:Bep) (Creative Commons Attribution-Share Alike 4.0 International license)_
-
-
-{{< imgproc sunset Resize "300x" />}}
-
-{{< imgproc sunset Fill "90x120 left" />}}
-
-{{< imgproc sunset Fill "90x120 right" />}}
-
-{{< imgproc sunset Fit "90x90" />}}
-
-{{< imgproc sunset Resize "300x q10" />}}
-
-
-This is the shortcode used in the examples above:
-
-
-{{< code file="layouts/shortcodes/imgproc.html" >}}
-{{< readfile file="layouts/shortcodes/imgproc.html" >}}
-{{< /code >}}
-
-And it is used like this:
+Example 3: A more concise way to skip image rendering if the resource is not found.
```go-html-template
-{{</* imgproc sunset Resize "300x" /*/>}}
+{{ with .Resources.GetMatch "sunset.jpg" }}
+  <img src="{{ .RelPermalink }}" width="{{ .Width }}" height="{{ .Height }}" alt="">
+{{ end }}
```
+Example 4: Skips rendering if there's problem accessing a remote resource.
-{{% note %}}
-**Tip:** Note the self-closing shortcode syntax above. The `imgproc` shortcode can be called both with and without **inner content**.
-{{% /note %}}
-
-## Image Processing Config
-
-You can configure an `imaging` section in `config.toml` with default image processing options:
-
-```toml
-[imaging]
-# Default resample filter used for resizing. Default is Box,
-# a simple and fast averaging filter appropriate for downscaling.
-# See https://github.com/disintegration/imaging
-resampleFilter = "box"
-
-# Default JPEG quality setting. Default is 75.
-quality = 75
-
-# Anchor used when cropping pictures.
-# Default is "smart" which does Smart Cropping, using https://github.com/muesli/smartcrop
-# Smart Cropping is content aware and tries to find the best crop for each image.
-# Valid values are Smart, Center, TopLeft, Top, TopRight, Left, Right, BottomLeft, Bottom, BottomRight
-anchor = "smart"
-
+```go-html-template
+{{ $url := "https://gohugo.io/img/hugo-logo.png" }}
+{{ with try (resources.GetRemote $url) }}
+ {{ with .Err }}
+ {{ errorf "%s" . }}
+ {{ else with .Value }}
+    <img src="{{ .RelPermalink }}" width="{{ .Width }}" height="{{ .Height }}" alt="">
+ {{ else }}
+ {{ errorf "Unable to get remote resource %q" $url }}
+ {{ end }}
+{{ end }}
```
-All of the above settings can also be set per image procecssing.
+## Image processing methods
-## Smart Cropping of Images
+The `image` resource implements the [`Process`], [`Resize`], [`Fit`], [`Fill`], [`Crop`], [`Filter`], [`Colors`] and [`Exif`] methods.
-By default, Hugo will use the [Smartcrop](https://github.com/muesli/smartcrop), a library created by [muesli](https://github.com/muesli), when cropping images with `.Fill`. You can set the anchor point manually, but in most cases the smart option will make a good choice. And we will work with the library author to improve this in the future.
+> [!note]
+> Metadata (EXIF, IPTC, XMP, etc.) is not preserved during image transformation. Use the `Exif` method with the _original_ image to extract EXIF metadata from JPEG, PNG, TIFF, and WebP images.
-An example using the sunset image from above:
+### Process
+{{< new-in 0.119.0 />}}
-{{< imgproc sunset Fill "200x200 smart" />}}
+> [!note]
+> The `Process` method is also available as a filter, which is more effective if you need to apply multiple filters to an image. See [Process filter](/functions/images/process).
+Process processes the image with the given specification. The specification can contain an optional action, one of `resize`, `crop`, `fit` or `fill`. This means that you can use this method instead of [`Resize`], [`Fit`], [`Fill`], or [`Crop`].
-## Image Processing Performance Consideration
+See [Options](#image-processing-options) for available options.
-Processed images are stored below `/resources` (can be set with `resourceDir` config setting). This folder is deliberately placed in the project, as it is recommended to check these into source control as part of the project. These images are not "Hugo fast" to generate, but once generated they can be reused.
+You can also use this method to apply image processing that does not need any scaling, e.g., format conversions:
-If you change your image settings (e.g. size), remove or rename images etc., you will end up with unused images taking up space and cluttering your project.
+```go-html-template
+{{/* Convert the image from JPG to PNG. */}}
+{{ $png := $jpg.Process "png" }}
+```
-To clean up, run:
+Some more examples:
-```bash
+```go-html-template
+{{/* Rotate the image 90 degrees counter-clockwise. */}}
+{{ $image := $image.Process "r90" }}
+
+{{/* Scaling actions. */}}
+{{ $image := $image.Process "resize 600x" }}
+{{ $image := $image.Process "crop 600x400" }}
+{{ $image := $image.Process "fit 600x400" }}
+{{ $image := $image.Process "fill 600x400" }}
+```
+
+### Resize
+
+Resize an image to the given width and/or height.
+
+If you specify both width and height, the resulting image will be disproportionally scaled unless the original image has the same aspect ratio.
+
+```go-html-template
+{{/* Resize to a width of 600px and preserve aspect ratio */}}
+{{ $image := $image.Resize "600x" }}
+
+{{/* Resize to a height of 400px and preserve aspect ratio */}}
+{{ $image := $image.Resize "x400" }}
+
+{{/* Resize to a width of 600px and a height of 400px */}}
+{{ $image := $image.Resize "600x400" }}
+```
+
+### Fit
+
+Downscale an image to fit the given dimensions while maintaining aspect ratio. You must provide both width and height.
+
+```go-html-template
+{{ $image := $image.Fit "600x400" }}
+```
+
+### Fill
+
+Crop and resize an image to match the given dimensions. You must provide both width and height. Use the [`anchor`] option to change the crop box anchor point.
+
+```go-html-template
+{{ $image := $image.Fill "600x400" }}
+```
+
+### Crop
+
+Crop an image to match the given dimensions without resizing. You must provide both width and height. Use the [`anchor`] option to change the crop box anchor point.
+
+```go-html-template
+{{ $image := $image.Crop "600x400" }}
+```
+
+### Filter
+
+Apply one or more [filters] to an image.
+
+```go-html-template
+{{ $image := $image.Filter (images.GaussianBlur 6) (images.Pixelate 8) }}
+```
+
+Write this in a more functional style using pipes. Hugo applies the filters in the order given.
+
+```go-html-template
+{{ $image := $image | images.Filter (images.GaussianBlur 6) (images.Pixelate 8) }}
+```
+
+Sometimes it can be useful to create the filter chain once and then reuse it.
+
+```go-html-template
+{{ $filters := slice (images.GaussianBlur 6) (images.Pixelate 8) }}
+{{ $image1 := $image1.Filter $filters }}
+{{ $image2 := $image2.Filter $filters }}
+```
+
+### Colors
+
+`.Colors` returns a slice of hex strings with the dominant colors in the image using a simple histogram method.
+
+```go-html-template
+{{ $colors := $image.Colors }}
+```
+
+This method is fast, but if you also scale down your images, extracting the colors from the scaled-down image is better for performance.
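+
+For example, a sketch that scales the image first and then reads the palette from the smaller copy:
+
+```go-html-template
+{{ $small := $image.Resize "300x" }}
+{{ $colors := $small.Colors }}
+```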
+
+### EXIF
+
+Provides an [EXIF] object containing image metadata.
+
+You may access EXIF data in JPEG, PNG, TIFF, and WebP images. To prevent errors when processing images without EXIF data, wrap the access in a [`with`] statement.
+
+```go-html-template
+{{ with $image.Exif }}
+ Date: {{ .Date }}
+ Lat/Long: {{ .Lat }}/{{ .Long }}
+ Tags:
+ {{ range $k, $v := .Tags }}
+ TAG: {{ $k }}: {{ $v }}
+ {{ end }}
+{{ end }}
+```
+
+You may also access EXIF fields individually, using the [`lang.FormatNumber`] function to format the fields as needed.
+
+```go-html-template
+{{ with $image.Exif }}
+  <ul>
+    {{ with .Date }}<li>Date: {{ .Format "January 02, 2006" }}</li>{{ end }}
+    {{ with .Tags.ApertureValue }}<li>Aperture: {{ lang.FormatNumber 2 . }}</li>{{ end }}
+    {{ with .Tags.BrightnessValue }}<li>Brightness: {{ lang.FormatNumber 2 . }}</li>{{ end }}
+    {{ with .Tags.ExposureTime }}<li>Exposure Time: {{ . }}</li>{{ end }}
+    {{ with .Tags.FNumber }}<li>F Number: {{ . }}</li>{{ end }}
+    {{ with .Tags.FocalLength }}<li>Focal Length: {{ . }}</li>{{ end }}
+    {{ with .Tags.ISOSpeedRatings }}<li>ISO Speed Ratings: {{ . }}</li>{{ end }}
+    {{ with .Tags.LensModel }}<li>Lens Model: {{ . }}</li>{{ end }}
+  </ul>
+{{ end }}
+```
+
+#### EXIF methods
+
+Date
+: (`time.Time`) Returns the image creation date/time. Format with the [`time.Format`] function.
+
+Lat
+: (`float64`) Returns the GPS latitude in degrees.
+
+Long
+: (`float64`) Returns the GPS longitude in degrees.
+
+Tags
+: (`exif.Tags`) Returns a collection of the available EXIF tags for this image. You may include or exclude specific tags from this collection in the [site configuration].
+
+## Image processing options
+
+The [`Resize`], [`Fit`], [`Fill`], and [`Crop`] methods accept a space-delimited, case-insensitive list of options. The order of the options within the list is irrelevant.
+
+### Dimensions
+
+With the [`Resize`] method you must specify width, height, or both. The [`Fit`], [`Fill`], and [`Crop`] methods require both width and height. All dimensions are in pixels.
+
+```go-html-template
+{{ $image := $image.Resize "600x" }}
+{{ $image := $image.Resize "x400" }}
+{{ $image := $image.Resize "600x400" }}
+{{ $image := $image.Fit "600x400" }}
+{{ $image := $image.Fill "600x400" }}
+{{ $image := $image.Crop "600x400" }}
+```
+
+### Rotation
+
+Rotates an image counter-clockwise by the given angle. Hugo performs rotation _before_ scaling. For example, if the original image is 600x400 and you wish to rotate the image 90 degrees counter-clockwise while scaling it by 50%:
+
+```go-html-template
+{{ $image = $image.Resize "200x r90" }}
+```
+
+In the example above, the width represents the desired width _after_ rotation.
+
+To rotate an image without scaling, use the dimensions of the original image:
+
+```go-html-template
+{{ with .Resources.GetMatch "sunset.jpg" }}
+ {{ with .Resize (printf "%dx%d r90" .Height .Width) }}
+    <img src="{{ .RelPermalink }}" width="{{ .Width }}" height="{{ .Height }}" alt="">
+ {{ end }}
+{{ end }}
+```
+
+In the example above, on the second line, we have reversed width and height to reflect the desired dimensions _after_ rotation.
+
+### Anchor
+
+When using the [`Crop`] or [`Fill`] method, the _anchor_ determines the placement of the crop box. You may specify `TopLeft`, `Top`, `TopRight`, `Left`, `Center`, `Right`, `BottomLeft`, `Bottom`, `BottomRight`, or `Smart`.
+
+The default value is `Smart`, which uses [Smartcrop] image analysis to determine the optimal placement of the crop box. You may override the default value in the [site configuration].
+
+For example, if you have a 400x200 image with a bird in the upper left quadrant, you can create a 200x100 thumbnail containing the bird:
+
+```go-html-template
+{{ $image.Crop "200x100 TopLeft" }}
+```
+
+If you apply [rotation](#rotation) when using the [`Crop`] or [`Fill`] method, specify the anchor relative to the rotated image.
+
+### Target format
+
+By default, Hugo encodes the image in the source format. You may convert the image to another format by specifying `bmp`, `gif`, `jpeg`, `jpg`, `png`, `tif`, `tiff`, or `webp`.
+
+```go-html-template
+{{ $image.Resize "600x webp" }}
+```
+
+To convert an image without scaling, use the dimensions of the original image:
+
+```go-html-template
+{{ with .Resources.GetMatch "sunset.jpg" }}
+ {{ with .Resize (printf "%dx%d webp" .Width .Height) }}
+    <img src="{{ .RelPermalink }}" width="{{ .Width }}" height="{{ .Height }}" alt="">
+ {{ end }}
+{{ end }}
+```
+
+### Quality
+
+Applicable to JPEG and WebP images, the `q` value determines the quality of the converted image. Higher values produce better quality images, while lower values produce smaller files. Set this value to a whole number between 1 and 100, inclusive.
+
+The default value is 75. You may override the default value in the [site configuration].
+
+```go-html-template
+{{ $image.Resize "600x webp q50" }}
+```
+
+### Hint
+
+Applicable to WebP images, this option corresponds to a set of predefined encoding parameters, and is equivalent to the `-preset` flag for the [`cwebp`] encoder.
+
+Value|Example
+:--|:--
+`drawing`|Hand or line drawing with high-contrast details
+`icon`|Small colorful image
+`photo`|Outdoor photograph with natural lighting
+`picture`|Indoor photograph such as a portrait
+`text`|Image that is primarily text
+
+The default value is `photo`. You may override the default value in the [site configuration].
+
+```go-html-template
+{{ $image.Resize "600x webp picture" }}
+```
+
+### Background color
+
+When converting an image from a format that supports transparency (e.g., PNG) to a format that does _not_ support transparency (e.g., JPEG), you may specify the background color of the resulting image.
+
+Use either a 3-digit or 6-digit hexadecimal color code (e.g., `#00f` or `#0000ff`).
+
+The default value is `#ffffff` (white). You may override the default value in the [site configuration].
+
+```go-html-template
+{{ $image.Resize "600x jpg #b31280" }}
+```
+
+### Resampling filter
+
+You may specify the resampling filter used when resizing an image. Commonly used resampling filters include:
+
+Filter|Description
+:--|:--
+`Box`|Simple and fast averaging filter appropriate for downscaling
+`Lanczos`|High-quality resampling filter for photographic images yielding sharp results
+`CatmullRom`|Sharp cubic filter that is faster than the Lanczos filter while providing similar results
+`MitchellNetravali`|Cubic filter that produces smoother results with less ringing artifacts than CatmullRom
+`Linear`|Bilinear resampling filter, produces smooth output, faster than cubic filters
+`NearestNeighbor`|Fastest resampling filter, no antialiasing
+
+The default value is `Box`. You may override the default value in the [site configuration].
+
+```go-html-template
+{{ $image.Resize "600x400 Lanczos" }}
+```
+
+See [github.com/disintegration/imaging] for the complete list of resampling filters. If you wish to improve image quality at the expense of performance, you may wish to experiment with the alternative filters.
+
+## Image processing examples
+
+_The photo of the sunset used in the examples below is Copyright [Bjørn Erik Pedersen](https://bep.is) (Creative Commons Attribution-Share Alike 4.0 International license)_
+
+{{< imgproc path="sunset.jpg" spec="resize 480x" alt="A sunset" />}}
+
+{{< imgproc path="sunset.jpg" spec="fill 120x150 left" alt="A sunset" />}}
+
+{{< imgproc path="sunset.jpg" spec="fill 120x150 right" alt="A sunset" />}}
+
+{{< imgproc path="sunset.jpg" spec="fit 120x120" alt="A sunset" />}}
+
+{{< imgproc path="sunset.jpg" spec="crop 240x240 center" alt="A sunset" />}}
+
+{{< imgproc path="sunset.jpg" spec="resize 360x q10" alt="A sunset" />}}
+
+## Configuration
+
+See [configure imaging](/configuration/imaging).
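+
+As a quick illustration, a site configuration that overrides several of the defaults mentioned above might look like this (the values are examples, not recommendations):
+
+{{< code-toggle file=hugo >}}
+[imaging]
+quality = 81
+resampleFilter = 'CatmullRom'
+anchor = 'TopLeft'
+hint = 'photo'
+bgColor = '#ffffff'
+{{< /code-toggle >}}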
+
+## Smart cropping of images
+
+By default, Hugo uses the [Smartcrop] library when cropping images with the `Crop` or `Fill` methods. You can set the anchor point manually, but in most cases the `Smart` option will make a good choice.
+
+Examples using the sunset image from above:
+
+{{< imgproc path="sunset.jpg" spec="fill 200x200 smart" alt="A sunset" />}}
+
+{{< imgproc path="sunset.jpg" spec="crop 200x200 smart" alt="A sunset" />}}
+
+## Image processing performance consideration
+
+Hugo caches processed images in the `resources` directory. If you include this directory in source control, Hugo will not have to regenerate the images in a [CI/CD](g) workflow (e.g., GitHub Pages, GitLab Pages, Netlify, etc.). This results in faster builds.
+
+If you change image processing methods or options, or if you rename or remove images, the `resources` directory will contain unused images. To remove the unused images, perform garbage collection with:
+
+```sh
hugo --gc
```
-
-{{% note %}}
-**GC** is short for **Garbage Collection**.
-{{% /note %}}
-
-
-
+[`anchor`]: /content-management/image-processing#anchor
+[`Colors`]: #colors
+[`Crop`]: #crop
+[`cwebp`]: https://developers.google.com/speed/webp/docs/cwebp
+[`Exif`]: #exif
+[`Fill`]: #fill
+[`Filter`]: #filter
+[`Fit`]: #fit
+[`lang.FormatNumber`]: /functions/lang/formatnumber/
+[`Process`]: #process
+[`Resize`]: #resize
+[`time.Format`]: /functions/time/format/
+[`with`]: /functions/go-template/with/
+[EXIF]: https://en.wikipedia.org/wiki/Exif
+[filters]: /functions/images/filter/#image-filters
+[github.com/disintegration/imaging]: https://github.com/disintegration/imaging#image-resizing
+[site configuration]: /configuration/imaging/
+[Smartcrop]: https://github.com/muesli/smartcrop#smartcrop
diff --git a/docs/content/en/content-management/image-processing/sunset.jpg b/docs/content/en/content-management/image-processing/sunset.jpg
index 7d7307bed..4dbcc0836 100644
Binary files a/docs/content/en/content-management/image-processing/sunset.jpg and b/docs/content/en/content-management/image-processing/sunset.jpg differ
diff --git a/docs/content/en/content-management/markdown-attributes.md b/docs/content/en/content-management/markdown-attributes.md
new file mode 100644
index 000000000..f52a48f17
--- /dev/null
+++ b/docs/content/en/content-management/markdown-attributes.md
@@ -0,0 +1,108 @@
+---
+title: Markdown attributes
+description: Use Markdown attributes to add HTML attributes when rendering Markdown to HTML.
+categories: []
+keywords: []
+---
+
+## Overview
+
+Hugo supports Markdown attributes on images and block elements including blockquotes, fenced code blocks, headings, horizontal rules, lists, paragraphs, and tables.
+
+For example:
+
+```text
+This is a paragraph.
+{class="foo bar" id="baz"}
+```
+
+With `class` and `id` you can use shorthand notation:
+
+```text
+This is a paragraph.
+{.foo .bar #baz}
+```
+
+Hugo renders both of these to:
+
+```html
+<p class="foo bar" id="baz">This is a paragraph.</p>
+```
+
+## Block elements
+
+Update your site configuration to enable Markdown attributes for block-level elements.
+
+{{< code-toggle file=hugo >}}
+[markup.goldmark.parser.attribute]
+title = true # default is true
+block = true # default is false
+{{< /code-toggle >}}
+
+## Standalone images
+
+By default, when the [Goldmark] Markdown renderer encounters a standalone image element (no other elements or text on the same line), it wraps the image element within a paragraph element per the [CommonMark specification].
+
+[CommonMark specification]: https://spec.commonmark.org/current/
+[Goldmark]: https://github.com/yuin/goldmark
+
+If you were to place an attribute list beneath an image element, Hugo would apply the attributes to the surrounding paragraph, not the image.
+
+To apply attributes to a standalone image element, you must disable the default wrapping behavior:
+
+{{< code-toggle file=hugo >}}
+[markup.goldmark.parser]
+wrapStandAloneImageWithinParagraph = false # default is true
+{{< /code-toggle >}}
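+
+With that setting disabled, an attribute list placed beneath a standalone image applies to the `img` element itself. A sketch (the image path and attributes are illustrative):
+
+```text
+![A sunset](sunset.jpg)
+{class="portrait" loading="lazy"}
+```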
+
+## Usage
+
+You may add [global HTML attributes], or HTML attributes specific to the current element type. Consistent with its content security model, Hugo removes HTML event attributes such as `onclick` and `onmouseover`.
+
+[global HTML attributes]: https://developer.mozilla.org/en-US/docs/Web/HTML/Global_attributes
+
+The attribute list consists of one or more key-value pairs, separated by spaces or commas, wrapped by braces. You must quote string values that contain spaces. Unlike HTML, boolean attributes must have both key and value.
+
+For example:
+
+```text
+> This is a blockquote.
+{class="foo bar" hidden=hidden}
+```
+
+Hugo renders this to:
+
+```html
+<blockquote class="foo bar" hidden="hidden">
+  <p>This is a blockquote.</p>
+</blockquote>
+```
+
+In most cases, place the attribute list beneath the markup element. For headings and fenced code blocks, place the attribute list on the right.
+
+Element|Position of attribute list
+:--|:--
+blockquote | bottom
+fenced code block | right
+heading | right
+horizontal rule | bottom
+image | bottom
+list | bottom
+paragraph | bottom
+table | bottom
+
+For example:
+
+````text
+## Section 1 {class=foo}
+
+```bash {class=foo linenos=inline}
+declare a=1
+echo "${a}"
+```
+
+This is a paragraph.
+{class=foo}
+````
+
+As shown above, the attribute list for fenced code blocks is not limited to HTML attributes. You can also configure syntax highlighting by passing one or more of [these options](/functions/transform/highlight/#options).
diff --git a/docs/content/en/content-management/mathematics.md b/docs/content/en/content-management/mathematics.md
new file mode 100644
index 000000000..e0c8ba4d0
--- /dev/null
+++ b/docs/content/en/content-management/mathematics.md
@@ -0,0 +1,238 @@
+---
+title: Mathematics in Markdown
+linkTitle: Mathematics
+description: Include mathematical equations and expressions in Markdown using LaTeX markup.
+categories: []
+keywords: []
+---
+
+{{< new-in 0.122.0 />}}
+
+## Overview
+
+Mathematical equations and expressions written in [LaTeX] are common in academic and scientific publications. Your browser typically renders this mathematical markup using an open-source JavaScript display engine such as [MathJax] or [KaTeX].
+
+For example, with this LaTeX markup:
+
+```text
+\[
+\begin{aligned}
+KL(\hat{y} || y) &= \sum_{c=1}^{M}\hat{y}_c \log{\frac{\hat{y}_c}{y_c}} \\
+JS(\hat{y} || y) &= \frac{1}{2}(KL(y||\frac{y+\hat{y}}{2}) + KL(\hat{y}||\frac{y+\hat{y}}{2}))
+\end{aligned}
+\]
+```
+
+The MathJax display engine renders this:
+
+\[
+\begin{aligned}
+KL(\hat{y} || y) &= \sum_{c=1}^{M}\hat{y}_c \log{\frac{\hat{y}_c}{y_c}} \\
+JS(\hat{y} || y) &= \frac{1}{2}(KL(y||\frac{y+\hat{y}}{2}) + KL(\hat{y}||\frac{y+\hat{y}}{2}))
+\end{aligned}
+\]
+
+Equations and expressions can be displayed inline with other text, or as standalone blocks. Block presentation is also known as "display" mode.
+
+Whether an equation or expression appears inline, or as a block, depends on the delimiters that surround the mathematical markup. Delimiters are defined in pairs, where each pair consists of an opening and closing delimiter. The opening and closing delimiters may be the same, or different.
+
+> [!note]
+> You can configure Hugo to render mathematical markup on the client side using the MathJax or KaTeX display engine, or you can render the markup with the [`transform.ToMath`] function while building your site.
+>
+> The first approach is described below.
+
+## Setup
+
+Follow these instructions to include mathematical equations and expressions in your Markdown using LaTeX markup.
+
+### Step 1
+
+Enable and configure the Goldmark [passthrough extension] in your site configuration. The passthrough extension preserves raw Markdown within delimited snippets of text, including the delimiters themselves.
+
+{{< code-toggle file=hugo copy=true >}}
+[markup.goldmark.extensions.passthrough]
+enable = true
+
+[markup.goldmark.extensions.passthrough.delimiters]
+block = [['\[', '\]'], ['$$', '$$']]
+inline = [['\(', '\)']]
+
+[params]
+math = true
+{{< /code-toggle >}}
+
+The configuration above enables mathematical rendering on every page unless you set the `math` parameter to `false` in front matter. To enable mathematical rendering as needed, set the `math` parameter to `false` in your site configuration, and set the `math` parameter to `true` in front matter. Use this parameter in your base template as shown in [Step 3].
+
+> [!note]
+> The configuration above precludes the use of the `$...$` delimiter pair for inline equations. Although you can add this delimiter pair to the configuration and JavaScript, you will need to double-escape the `$` symbol when used outside of math contexts to avoid unintended formatting.
+>
+> See the [inline delimiters](#inline-delimiters) section for details.
+
+To disable passthrough of inline snippets, omit the `inline` key from the configuration:
+
+{{< code-toggle file=hugo >}}
+[markup.goldmark.extensions.passthrough.delimiters]
+block = [['\[', '\]'], ['$$', '$$']]
+{{< /code-toggle >}}
+
+You can define your own opening and closing delimiters, provided they match the delimiters that you set in [Step 2].
+
+{{< code-toggle file=hugo >}}
+[markup.goldmark.extensions.passthrough.delimiters]
+block = [['@@', '@@']]
+inline = [['@', '@']]
+{{< /code-toggle >}}
+
+### Step 2
+
+Create a partial template to load MathJax or KaTeX. The example below loads MathJax, or you can use KaTeX as described in the [engines](#engines) section.
+
+```go-html-template {file="layouts/partials/math.html" copy=true}
+<!-- MathJax v3 with delimiters matching the site configuration; the CDN URL is illustrative. -->
+<script>
+  MathJax = {
+    tex: {
+      displayMath: [['\\[', '\\]'], ['$$', '$$']],
+      inlineMath: [['\\(', '\\)']]
+    }
+  };
+</script>
+<script async src="https://cdn.jsdelivr.net/npm/mathjax@3/es5/tex-chtml.js"></script>
+```
+
+The delimiters above must match the delimiters in your site configuration.
+
+### Step 3
+
+Conditionally call the partial template from the base template.
+
+```go-html-template {file="layouts/_default/baseof.html"}
+<head>
+ ...
+ {{ if .Param "math" }}
+ {{ partialCached "math.html" . }}
+ {{ end }}
+ ...
+</head>
+```
+
+The example above loads the partial template if you have set the `math` parameter in front matter to `true`. If you have not set the `math` parameter in front matter, the conditional statement falls back to the `math` parameter in your site configuration.
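+
+For reference, the `.Param` lookup above is roughly equivalent to this explicit sketch:
+
+```go-html-template
+{{/* Sketch: the page-level front matter value wins over the site-level parameter. */}}
+{{ $math := site.Params.math }}
+{{ if isset .Params "math" }}
+  {{ $math = .Params.math }}
+{{ end }}
+{{ if $math }}
+  {{ partialCached "math.html" . }}
+{{ end }}
+```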
+
+### Step 4
+
+Include mathematical equations and expressions in Markdown using LaTeX markup.
+
+```text {file="content/math-examples.md" copy=true}
+This is an inline \(a^*=x-b^*\) equation.
+
+These are block equations:
+
+\[a^*=x-b^*\]
+
+\[ a^*=x-b^* \]
+
+\[
+a^*=x-b^*
+\]
+
+These are also block equations:
+
+$$a^*=x-b^*$$
+
+$$ a^*=x-b^* $$
+
+$$
+a^*=x-b^*
+$$
+```
+
+If you set the `math` parameter to `false` in your site configuration, you must set the `math` parameter to `true` in front matter. For example:
+
+{{< code-toggle file=content/math-examples.md fm=true >}}
+title = 'Math examples'
+date = 2024-01-24T18:09:49-08:00
+[params]
+math = true
+{{< /code-toggle >}}
+
+## Inline delimiters
+
+The configuration, JavaScript, and examples above use the `\(...\)` delimiter pair for inline equations. The `$...$` delimiter pair is a common alternative, but using it may result in unintended formatting if you use the `$` symbol outside of math contexts.
+
+If you add the `$...$` delimiter pair to your configuration and JavaScript, you must double-escape the `$` when outside of math contexts, regardless of whether mathematical rendering is enabled on the page. For example:
+
+```text
+A \\$5 bill _saved_ is a \\$5 bill _earned_.
+```
+
+> [!note]
+> If you use the `$...$` delimiter pair for inline equations, and occasionally use the `$` symbol outside of math contexts, you must use MathJax instead of KaTeX to avoid unintended formatting caused by [this KaTeX limitation](https://github.com/KaTeX/KaTeX/issues/437).
+
+## Engines
+
+MathJax and KaTeX are open-source JavaScript display engines. Both engines are fast, but at the time of this writing MathJax v3.2.2 is slightly faster than KaTeX v0.16.11.
+
+> [!note]
+> If you use the `$...$` delimiter pair for inline equations, and occasionally use the `$` symbol outside of math contexts, you must use MathJax instead of KaTeX to avoid unintended formatting caused by [this KaTeX limitation](https://github.com/KaTeX/KaTeX/issues/437).
+>
+> See the [inline delimiters](#inline-delimiters) section for details.
+
+To use KaTeX instead of MathJax, replace the partial template from [Step 2] with this:
+
+```go-html-template {file="layouts/partials/math.html" copy=true}
+<!-- KaTeX with auto-render, delimiters matching the site configuration; the CDN URLs are illustrative. -->
+<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/katex@latest/dist/katex.min.css">
+<script defer src="https://cdn.jsdelivr.net/npm/katex@latest/dist/katex.min.js"></script>
+<script defer src="https://cdn.jsdelivr.net/npm/katex@latest/dist/contrib/auto-render.min.js"
+  onload="renderMathInElement(document.body, {delimiters: [{left: '\\[', right: '\\]', display: true}, {left: '$$', right: '$$', display: true}, {left: '\\(', right: '\\)', display: false}], throwOnError: false});"></script>
+```
+
+The delimiters above must match the delimiters in your site configuration.
+
+## Chemistry
+
+Both MathJax and KaTeX provide support for chemical equations. For example:
+
+```text
+$$C_p[\ce{H2O(l)}] = \pu{75.3 J // mol K}$$
+```
+
+$$C_p[\ce{H2O(l)}] = \pu{75.3 J // mol K}$$
+
+As shown in [Step 2] above, MathJax supports chemical equations without additional configuration. To add chemistry support to KaTeX, enable the mhchem extension as described in the KaTeX [documentation](https://katex.org/docs/libs).
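+
+For example, loading the extension could look like this sketch; the CDN URL is illustrative, and the script must load after KaTeX itself:
+
+```go-html-template
+<script defer src="https://cdn.jsdelivr.net/npm/katex@latest/dist/contrib/mhchem.min.js"></script>
+```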
+
+[`transform.ToMath`]: /functions/transform/tomath/
+[KaTeX]: https://katex.org/
+[LaTeX]: https://www.latex-project.org/
+[MathJax]: https://www.mathjax.org/
+[passthrough extension]: /configuration/markup/#passthrough
+[Step 2]: #step-2
+[Step 3]: #step-3
diff --git a/docs/content/en/content-management/menus.md b/docs/content/en/content-management/menus.md
index 9ac6f8bff..6d01173dc 100644
--- a/docs/content/en/content-management/menus.md
+++ b/docs/content/en/content-management/menus.md
@@ -1,123 +1,97 @@
---
title: Menus
-linktitle: Menus
-description: Hugo has a simple yet powerful menu system.
-date: 2017-02-01
-publishdate: 2017-02-01
-lastmod: 2017-03-31
-categories: [content management]
-keywords: [menus]
-draft: false
-menu:
- docs:
- parent: "content-management"
- weight: 120
-weight: 120 #rem
+description: Create menus by defining entries, localizing each entry, and rendering the resulting data structure.
+categories: []
+keywords: []
aliases: [/extras/menus/]
-toc: true
---
-{{% note "Lazy Blogger"%}}
-If all you want is a simple menu for your sections, see the ["Section Menu for Lazy Bloggers" in Menu Templates](/templates/menu-templates/#section-menu-for-lazy-bloggers).
-{{% /note %}}
+## Overview
-You can do this:
+To create a menu for your site:
-* Place content in one or many menus
-* Handle nested menus with unlimited depth
-* Create menu entries without being attached to any content
-* Distinguish active element (and active branch)
+1. Define the menu entries
+1. [Localize](multilingual/#menus) each entry
+1. Render the menu with a [template]
-## What is a Menu in Hugo?
+Create multiple menus, either flat or nested. For example, create a main menu for the header, and a separate menu for the footer.
-A **menu** is a named array of menu entries accessible by name via the [`.Site.Menus` site variable][sitevars]. For example, you can access your site's `main` menu via `.Site.Menus.main`.
+There are three ways to define menu entries:
-{{% note "Menus on Multilingual Sites" %}}
-If you make use of the [multilingual feature](/content-management/multilingual/), you can define language-independent menus.
-{{% /note %}}
+1. Automatically
+1. In front matter
+1. In site configuration
-See the [Menu Entry Properties][me-props] for all the variables and functions related to a menu entry.
+> [!note]
+> Although you can use these methods in combination when defining a menu, the menu will be easier to conceptualize and maintain if you use one method throughout the site.
-## Add content to menus
+## Define automatically
-Hugo allows you to add content to a menu via the content's [front matter](/content-management/front-matter/).
+To automatically define a menu entry for each top-level [section](g) of your site, enable the section pages menu in your site configuration.
-### Simple
-
-If all you need to do is add an entry to a menu, the simple form works well.
-
-#### A Single Menu
-
-```
----
-menu: "main"
----
-```
-
-#### Multiple Menus
-
-```
----
-menu: ["main", "footer"]
----
-```
-
-#### Advanced
-
-
-```
----
-menu:
- docs:
- parent: 'extras'
- weight: 20
----
-```
-
-## Add Non-content Entries to a Menu
-
-You can also add entries to menus that aren’t attached to a piece of content. This takes place in your Hugo project's [`config` file][config].
-
-Here’s an example snippet pulled from a configuration file:
-
-{{< code-toggle file="config" >}}
-[[menu.main]]
- name = "about hugo"
- pre = ""
- weight = -110
- identifier = "about"
- url = "/about/"
-[[menu.main]]
- name = "getting started"
- pre = ""
- post = "New!"
- weight = -100
- url = "/getting-started/"
+{{< code-toggle file=hugo >}}
+sectionPagesMenu = "main"
{{< /code-toggle >}}
-{{% note %}}
-The URLs must be relative to the context root. If the `baseURL` is `https://example.com/mysite/`, then the URLs in the menu must not include the context root `mysite`. Using an absolute URL will override the baseURL. If the value used for `URL` in the above example is `https://subdomain.example.com/`, the output will be `https://subdomain.example.com`.
-{{% /note %}}
+This creates a menu structure that you can access with `site.Menus.main` in your templates. See [menu templates] for details.
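+
+For example, a minimal sketch that renders the entries of the "main" menu (the surrounding markup is illustrative):
+
+```go-html-template
+<nav>
+  <ul>
+    {{ range site.Menus.main }}
+      <li><a href="{{ .URL }}">{{ .Name }}</a></li>
+    {{ end }}
+  </ul>
+</nav>
+```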
-## Nesting
+## Define in front matter
-All nesting of content is done via the `parent` field.
+To add a page to the "main" menu:
-The parent of an entry should be the identifier of another entry. The identifier should be unique (within a menu).
+{{< code-toggle file=content/about.md fm=true >}}
+title = 'About'
+menus = 'main'
+{{< /code-toggle >}}
-The following order is used to determine an Identifier:
+Access the entry with `site.Menus.main` in your templates. See [menu templates] for details.
-`.Name > .LinkTitle > .Title`
+To add a page to the "main" and "footer" menus:
-This means that `.Title` will be used unless `.LinkTitle` is present, etc. In practice, `.Name` and `.Identifier` are only used to structure relationships and therefore never displayed.
+{{< code-toggle file=content/contact.md fm=true >}}
+title = 'Contact'
+menus = ['main','footer']
+{{< /code-toggle >}}
-In this example, the top level of the menu is defined in your [site `config` file][config]. All content entries are attached to one of these entries via the `.Parent` field.
+Access the entry with `site.Menus.main` and `site.Menus.footer` in your templates. See [menu templates] for details.
-## Render Menus
+> [!note]
+> The configuration key in the examples above is `menus`. The `menu` (singular) configuration key is an alias for `menus`.
-See [Menu Templates](/templates/menu-templates/) for information on how to render your site menus within your templates.
+### Properties
-[config]: /getting-started/configuration/
-[multilingual]: /content-management/multilingual/
-[sitevars]: /variables/
-[me-props]: /variables/menus/
+Use these properties when defining menu entries in front matter:
+
+{{% include "/_common/menu-entry-properties.md" %}}
+
+### Example
+
+This front matter menu entry demonstrates some of the available properties:
+
+{{< code-toggle file=content/products/software.md fm=true >}}
+title = 'Software'
+[menus.main]
+parent = 'Products'
+weight = 20
+pre = ''
+[menus.main.params]
+class = 'center'
+{{< /code-toggle >}}
+
+Access the entry with `site.Menus.main` in your templates. See [menu templates] for details.
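+
+A sketch of how a template might use the `pre` property and the custom `class` parameter from the entry above (markup illustrative):
+
+```go-html-template
+{{ range site.Menus.main }}
+  <a class="{{ .Params.class }}" href="{{ .URL }}">{{ .Pre }} {{ .Name }}</a>
+{{ end }}
+```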
+
+## Define in site configuration
+
+See [configure menus](/configuration/menus/).
+
+## Localize
+
+Hugo provides two methods to localize your menu entries. See [multilingual].
+
+## Render
+
+See [menu templates].
+
+[menu templates]: /templates/menu/
+[multilingual]: /content-management/multilingual/#menus
+[template]: /templates/menu/
diff --git a/docs/content/en/content-management/multilingual.md b/docs/content/en/content-management/multilingual.md
index 49565d948..d419f4381 100644
--- a/docs/content/en/content-management/multilingual.md
+++ b/docs/content/en/content-management/multilingual.md
@@ -1,190 +1,42 @@
---
-title: Multilingual Mode
-linktitle: Multilingual and i18n
-description: Hugo supports the creation of websites with multiple languages side by side.
-date: 2017-01-10
-publishdate: 2017-01-10
-lastmod: 2017-01-10
-categories: [content management]
-keywords: [multilingual,i18n, internationalization]
-menu:
- docs:
- parent: "content-management"
- weight: 150
-weight: 150 #rem
-draft: false
+title: Multilingual mode
+linkTitle: Multilingual
+description: Localize your project for each language and region, including translations, images, dates, currencies, numbers, percentages, and collation sequence. Hugo's multilingual framework supports single-host and multihost configurations.
+categories: []
+keywords: []
aliases: [/content/multilingual/,/tutorials/create-a-multilingual-site/]
-toc: true
---
-You should define the available languages in a `languages` section in your site configuration.
+## Configuration
-> Also See [Hugo Multilingual Part 1: Content translation](https://regisphilibert.com/blog/2018/08/hugo-multilingual-part-1-managing-content-translation/)
+See [configure languages](/configuration/languages/).
-## Configure Languages
-
-The following is an example of a site configuration for a multilingual Hugo project:
-
-{{< code-toggle file="config" >}}
-DefaultContentLanguage = "en"
-copyright = "Everything is mine"
-
-[params]
-[params.navigation]
-help = "Help"
-
-[languages]
-[languages.en]
-title = "My blog"
-weight = 1
-[languages.en.params]
-linkedin = "https://linkedin.com/whoever"
-
-[languages.fr]
-title = "Mon blogue"
-weight = 2
-[languages.fr.params]
-linkedin = "https://linkedin.com/fr/whoever"
-[languages.fr.params.navigation]
-help = "Aide"
-{{< /code-toggle >}}
-
-Anything not defined in a `languages` block will fall back to the global value for that key (e.g., `copyright` for the English `en` language). This also works for `params`, as demonstrated witgh `help` above: You will get the value `Aide` in French and `Help` in all the languages without this parameter set.
-
-With the configuration above, all content, sitemap, RSS feeds, paginations,
-and taxonomy pages will be rendered below `/` in English (your default content language) and then below `/fr` in French.
-
-When working with front matter `Params` in [single page templates][singles], omit the `params` in the key for the translation.
-
-`defaultContentLanguage` sets the project's default language. If not set, the default language will be `en`.
-
-If the default language needs to be rendererd below its own language code (`/en`) like the others, set `defaultContentLanguageInSubdir: true`.
-
-Only the obvious non-global options can be overridden per language. Examples of global options are `baseURL`, `buildDrafts`, etc.
-
-### Disable a Language
-
-You can disable one or more languages. This can be useful when working on a new translation.
-
-```toml
-disableLanguages = ["fr", "ja"]
-```
-
-Note that you cannot disable the default content language.
-
-We kept this as a standalone setting to make it easier to set via [OS environment](/getting-started/configuration/#configure-with-environment-variables):
-
-```bash
-HUGO_DISABLELANGUAGES="fr ja" hugo
-```
-If you have already a list of disabled languages in `config.toml`, you can enable them in development like this:
-
-```bash
-HUGO_DISABLELANGUAGES=" " hugo server
-```
-
-
-### Configure Multilingual Multihost
-
-From **Hugo 0.31** we support multiple languages in a multihost configuration. See [this issue](https://github.com/gohugoio/hugo/issues/4027) for details.
-
-This means that you can now configure a `baseURL` per `language`:
-
-
-> If a `baseURL` is set on the `language` level, then all languages must have one and they must all be different.
-
-Example:
-
-{{< code-toggle file="config" >}}
-[languages]
-[languages.fr]
-baseURL = "https://example.fr"
-languageName = "Français"
-weight = 1
-title = "En Français"
-
-[languages.en]
-baseURL = "https://example.com"
-languageName = "English"
-weight = 2
-title = "In English"
-{{ code-toggle >}}
-
-With the above, the two sites will be generated into `public` with their own root:
-
-```bash
-public
-├── en
-└── fr
-```
-
-**All URLs (i.e `.Permalink` etc.) will be generated from that root. So the English home page above will have its `.Permalink` set to `https://example.com/`.**
-
-When you run `hugo server` we will start multiple HTTP servers. You will typlically see something like this in the console:
-
-```bash
-Web Server is available at 127.0.0.1:1313 (bind address 127.0.0.1)
-Web Server is available at 127.0.0.1:1314 (bind address 127.0.0.1)
-Press Ctrl+C to stop
-```
-
-Live reload and `--navigateToChanged` between the servers work as expected.
-
-### Taxonomies and Blackfriday
-
-Taxonomies and [Blackfriday configuration][config] can also be set per language:
-
-
-{{< code-toggle file="config" >}}
-[Taxonomies]
-tag = "tags"
-
-[blackfriday]
-angledQuotes = true
-hrefTargetBlank = true
-
-[languages]
-[languages.en]
-weight = 1
-title = "English"
-[languages.en.blackfriday]
-angledQuotes = false
-
-[languages.fr]
-weight = 2
-title = "Français"
-[languages.fr.Taxonomies]
-plaque = "plaques"
-{{ code-toggle >}}
-
-## Translate Your Content
+## Translate your content
There are two ways to manage your content translations. Both ensure each page is assigned a language and is linked to its counterpart translations.
-### Translation by filename
+### Translation by file name
Considering the following example:
1. `/content/about.en.md`
-2. `/content/about.fr.md`
+1. `/content/about.fr.md`
The first file is assigned the English language and is linked to the second.
The second file is assigned the French language and is linked to the first.
-Their language is __assigned__ according to the language code added as a __suffix to the filename__.
+Their language is __assigned__ according to the language code added as a __suffix to the file name__.
-By having the same **path and base filename**, the content pieces are __linked__ together as translated pages.
+By having the same **path and base file name**, the content pieces are __linked__ together as translated pages.
-{{< note >}}
-If a file has no language code, it will be assigned the default language.
-{{ note >}}
+> [!note]
+> If a file has no language code, it will be assigned the default language.
### Translation by content directory
-This system uses different content directories for each of the languages. Each language's content directory is set using the `contentDir` param.
-
-{{< code-toggle file="config" >}}
+This system uses different content directories for each of the languages. Each language's `content` directory is set using the `contentDir` parameter.
+{{< code-toggle file=hugo >}}
languages:
en:
weight: 10
@@ -194,267 +46,384 @@ languages:
weight: 20
languageName: "Français"
contentDir: "content/french"
-
{{< /code-toggle >}}
The value of `contentDir` can be any valid path -- even absolute path references. The only restriction is that the content directories cannot overlap.
-Considering the following example in conjunction with the configuration above:
+Considering the following example in conjunction with the configuration above:
1. `/content/english/about.md`
-2. `/content/french/about.md`
+1. `/content/french/about.md`
The first file is assigned the English language and is linked to the second.
The second file is assigned the French language and is linked to the first.
-Their language is __assigned__ according to the content directory they are __placed__ in.
+Their language is __assigned__ according to the `content` directory they are __placed__ in.
-By having the same **path and basename** (relative to their language content directory), the content pieces are __linked__ together as translated pages.
+By having the same **path and basename** (relative to their language `content` directory), the content pieces are __linked__ together as translated pages.
-### Bypassing default linking.
+### Bypassing default linking
Any pages sharing the same `translationKey` set in front matter will be linked as translated pages regardless of basename or location.
Considering the following example:
1. `/content/about-us.en.md`
-2. `/content/om.nn.md`
-3. `/content/presentation/a-propos.fr.md`
+1. `/content/om.nn.md`
+1. `/content/presentation/a-propos.fr.md`
-```yaml
-# set in all three pages
+{{< code-toggle file=hugo >}}
translationKey: "about"
-```
-
-By setting the `translationKey` front matter param to `about` in all three pages, they will be __linked__ as translated pages.
+{{< /code-toggle >}}
+By setting the `translationKey` front matter parameter to `about` in all three pages, they will be __linked__ as translated pages.
### Localizing permalinks
-Because paths and filenames are used to handle linking, all translated pages will share the same URL (apart from the language subdirectory).
+Because paths and file names are used to handle linking, all translated pages will share the same URL (apart from the language subdirectory).
-To localize the URLs, the [`slug`]({{< ref "/content-management/organization/index.md#slug" >}}) or [`url`]({{< ref "/content-management/organization/index.md#url" >}}) front matter param can be set in any of the non-default language file.
+To localize URLs:
-For example, a French translation (`content/about.fr.md`) can have its own localized slug.
+- For a regular page, set either [`slug`] or [`url`] in front matter
+- For a section page, set [`url`] in front matter
-{{< code-toggle >}}
-Title: A Propos
+For example, a French translation can have its own localized slug.
+
+{{< code-toggle file=content/about.fr.md fm=true >}}
+title: A Propos
slug: "a-propos"
{{< /code-toggle >}}
+At render, Hugo will build both `/about/` and `/fr/a-propos/` without affecting the translation link.
-At render, Hugo will build both `/about/` and `/fr/a-propos/` while maintaining their translation linking.
+### Page bundles
-{{% note %}}
-If using `url`, remember to include the language part as well: `/fr/compagnie/a-propos/`.
-{{%/ note %}}
-
-### Page Bundles
-
-To avoid the burden of having to duplicate files, each Page Bundle inherits the resources of its linked translated pages' bundles except for the content files (markdown files, html files etc...).
+To avoid the burden of having to duplicate files, each Page Bundle inherits the resources of its linked translated pages' bundles except for the content files (Markdown files, HTML files etc.).
Therefore, from within a template, the page will have access to the files from all linked pages' bundles.
If, across the linked bundles, two or more files share the same basename, only one will be included and chosen as follows:
-* File from current language bundle, if present.
-* First file found across bundles by order of language `Weight`.
+- File from current language bundle, if present.
+- First file found across bundles by order of language `Weight`.
-{{% note %}}
-Page Bundle resources follow the same language assignment logic as content files, both by filename (`image.jpg`, `image.fr.jpg`) and by directory (`english/about/header.jpg`, `french/about/header.jpg`).
-{{%/ note %}}
+> [!note]
+> Page Bundle resources follow the same language assignment logic as content files, both by file name (`image.jpg`, `image.fr.jpg`) and by directory (`english/about/header.jpg`, `french/about/header.jpg`).
-## Reference the Translated Content
+## Reference translated content
To create a list of links to translated content, use a template similar to the following:
-{{< code file="layouts/partials/i18nlist.html" >}}
+```go-html-template {file="layouts/partials/i18nlist.html"}
{{ if .IsTranslated }}
  <h4>{{ i18n "translations" }}</h4>
  <ul>
    {{ range .Translations }}
      <li><a href="{{ .RelPermalink }}">{{ .Language.LanguageName }}: {{ .LinkTitle }}</a></li>
    {{ end }}
  </ul>
{{ end }}
-{{< /code >}}
+```
-The above can be put in a `partial` (i.e., inside `layouts/partials/`) and included in any template, whether a [single content page][contenttemplate] or the [homepage][]. It will not print anything if there are no translations for a given page.
+The above can be put in a `partial` (i.e., inside `layouts/partials/`) and included in any template. It will not print anything if there are no translations for a given page.
The above also uses the [`i18n` function][i18func] described in the next section.
-### List All Available Languages
+### List all available languages
`.AllTranslations` on a `Page` can be used to list all translations, including the page itself. On the home page it can be used to build a language navigator:
-
-{{< code file="layouts/partials/allLanguages.html" >}}
+```go-html-template {file="layouts/partials/allLanguages.html"}
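+{{/* Sketch: list all translations of the home page, including the current one. */}}
+<ul>
+  {{ range $.Site.Home.AllTranslations }}
+    <li><a href="{{ .RelPermalink }}">{{ .Language.LanguageName }}</a></li>
+  {{ end }}
+</ul>
+```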
-{{< /code >}}
-
-## Translation of Strings
-
-Hugo uses [go-i18n][] to support string translations. [See the project's source repository][go-i18n-source] to find tools that will help you manage your translation workflows.
-
-Translations are collected from the `themes//i18n/` folder (built into the theme), as well as translations present in `i18n/` at the root of your project. In the `i18n`, the translations will be merged and take precedence over what is in the theme folder. Language files should be named according to [RFC 5646][] with names such as `en-US.toml`, `fr.toml`, etc.
-
-{{% note %}}
-From **Hugo 0.31** you no longer need to use a valid language code. It can be anything.
-
-See https://github.com/gohugoio/hugo/issues/3564
-
-{{% /note %}}
-
-From within your templates, use the `i18n` function like this:
-
-```
-{{ i18n "home" }}
```
-This uses a definition like this one in `i18n/en-US.toml`:
+## Translation of strings
-```
-[home]
-other = "Home"
+See the [`lang.Translate`] template function.
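+
+For example, a minimal sketch, assuming a `home` key defined in `i18n/en.toml`, `i18n/fr.toml`, and `i18n/de.toml`:
+
+```go-html-template
+{{/* T is an alias for lang.Translate; "home" is an illustrative key. */}}
+{{ T "home" }}
+```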
+
+## Localization
+
+The following localization examples assume your site's primary language is English, with translations to French and German.
+
+{{< code-toggle file=hugo >}}
+defaultContentLanguage = 'en'
+
+[languages]
+[languages.en]
+contentDir = 'content/en'
+languageName = 'English'
+weight = 1
+[languages.fr]
+contentDir = 'content/fr'
+languageName = 'Français'
+weight = 2
+[languages.de]
+contentDir = 'content/de'
+languageName = 'Deutsch'
+weight = 3
+
+{{< /code-toggle >}}
+
+### Dates
+
+With this front matter:
+
+{{< code-toggle fm=true >}}
+date = 2021-11-03T12:34:56+01:00
+{{< /code-toggle >}}
+
+And this template code:
+
+```go-html-template
+{{ .Date | time.Format ":date_full" }}
```
-Often you will want to use to the page variables in the translations strings. To do that, pass on the "." context when calling `i18n`:
+The rendered page displays:
-```
-{{ i18n "wordCount" . }}
+Language|Value
+:--|:--
+English|Wednesday, November 3, 2021
+Français|mercredi 3 novembre 2021
+Deutsch|Mittwoch, 3. November 2021
+
+See [`time.Format`] for details.
+
+### Currency
+
+With this template code:
+
+```go-html-template
+{{ 512.5032 | lang.FormatCurrency 2 "USD" }}
```
-This uses a definition like this one in `i18n/en-US.toml`:
+The rendered page displays:
-```
-[wordCount]
-other = "This article has {{ .WordCount }} words."
-```
-An example of singular and plural form:
+Language|Value
+:--|:--
+English|$512.50
+Français|512,50 $US
+Deutsch|512,50 $
-```
-[readingTime]
-one = "One minute to read"
-other = "{{.Count}} minutes to read"
-```
-And then in the template:
+See [lang.FormatCurrency] and [lang.FormatAccounting] for details.
-```
-{{ i18n "readingTime" .ReadingTime }}
+### Numbers
+
+With this template code:
+
+```go-html-template
+{{ 512.5032 | lang.FormatNumber 2 }}
```
-## Customize Dates
+The rendered page displays:
-At the time of this writing, Go does not yet have support for internationalized locales for dates, but if you do some work, you can simulate it. For example, if you want to use French month names, you can add a data file like ``data/mois.yaml`` with this content:
+Language|Value
+:--|:--
+English|512.50
+Français|512,50
+Deutsch|512,50
-~~~yaml
-1: "janvier"
-2: "février"
-3: "mars"
-4: "avril"
-5: "mai"
-6: "juin"
-7: "juillet"
-8: "août"
-9: "septembre"
-10: "octobre"
-11: "novembre"
-12: "décembre"
-~~~
+See [lang.FormatNumber] and [lang.FormatNumberCustom] for details.
-...then index the non-English date names in your templates like so:
+### Percentages
-~~~html
-
-~~~
+With this template code:
-This technique extracts the day, month and year by specifying ``.Date.Day``, ``.Date.Month``, and ``.Date.Year``, and uses the month number as a key, when indexing the month name data file.
+```go-html-template
+{{ 512.5032 | lang.FormatPercent 2 }}
+```
+
+The rendered page displays:
+
+Language|Value
+:--|:--
+English|512.50%
+Français|512,50 %
+Deutsch|512,50 %
+
+See [lang.FormatPercent] for details.
## Menus
-You can define your menus for each language independently. Creating multilingual menus works just like [creating regular menus][menus], except they're defined in language-specific blocks in the configuration file:
+Localization of menu entries depends on how you define them:
-```
-defaultContentLanguage = "en"
+- When you define menu entries [automatically] using the section pages menu, you must use translation tables to localize each entry.
+- When you define menu entries [in front matter], they are already localized based on the front matter itself. If the front matter values are insufficient, use translation tables to localize each entry.
+- When you define menu entries [in site configuration], you must create language-specific menu entries under each language key. If the names of the menu entries are insufficient, use translation tables to localize each entry.
+
+### Create language-specific menu entries
+
+#### Method 1 -- Use a single configuration file
+
+For a simple menu with a small number of entries, use a single configuration file. For example:
+
+{{< code-toggle file=hugo >}}
+[languages.de]
+languageCode = 'de-DE'
+languageName = 'Deutsch'
+weight = 1
+
+[[languages.de.menus.main]]
+name = 'Produkte'
+pageRef = '/products'
+weight = 10
+
+[[languages.de.menus.main]]
+name = 'Leistungen'
+pageRef = '/services'
+weight = 20
[languages.en]
-weight = 0
-languageName = "English"
+languageCode = 'en-US'
+languageName = 'English'
+weight = 2
-[[languages.en.menu.main]]
-url = "/"
-name = "Home"
-weight = 0
-
-
-[languages.de]
+[[languages.en.menus.main]]
+name = 'Products'
+pageRef = '/products'
weight = 10
-languageName = "Deutsch"
-[[languages.de.menu.main]]
-url = "/"
-name = "Startseite"
-weight = 0
+[[languages.en.menus.main]]
+name = 'Services'
+pageRef = '/services'
+weight = 20
+{{< /code-toggle >}}
+
+#### Method 2 -- Use a configuration directory
+
+With a more complex menu structure, create a [configuration directory] and split the menu entries into multiple files, one file per language. For example:
+
+```text
+config/
+└── _default/
+ ├── menus.de.toml
+ ├── menus.en.toml
+ └── hugo.toml
```
-The rendering of the main navigation works as usual. `.Site.Menus` will just contain the menu in the current language. Note that `absLangURL` below will link to the correct locale of your website. Without it, menu entries in all languages would link to the English version, since it's the default content language that resides in the root directory.
+{{< code-toggle file=config/_default/menus.de >}}
+[[main]]
+name = 'Produkte'
+pageRef = '/products'
+weight = 10
+[[main]]
+name = 'Leistungen'
+pageRef = '/services'
+weight = 20
+{{< /code-toggle >}}
-```
-
+{{< code-toggle file=config/_default/menus.en >}}
+[[main]]
+name = 'Products'
+pageRef = '/products'
+weight = 10
+[[main]]
+name = 'Services'
+pageRef = '/services'
+weight = 20
+{{< /code-toggle >}}
+### Use translation tables
+
+When rendering the text that appears in each menu entry, the [example menu template] does this:
+
+```go-html-template
+{{ or (T .Identifier) .Name | safeHTML }}
```
-## Missing Translations
+It queries the translation table for the current language using the menu entry's `identifier` and returns the translated string. If the translation table does not exist, or if the `identifier` key is not present in the translation table, it falls back to `name`.
+
+The `identifier` depends on how you define menu entries:
+
+- If you define the menu entry [automatically] using the section pages menu, the `identifier` is the page's `.Section`.
+- If you define the menu entry [in site configuration] or [in front matter], set the `identifier` property to the desired value.
+
+For example, if you define menu entries in site configuration:
+
+{{< code-toggle file=hugo >}}
+[[menus.main]]
+ identifier = 'products'
+ name = 'Products'
+ pageRef = '/products'
+ weight = 10
+[[menus.main]]
+ identifier = 'services'
+ name = 'Services'
+ pageRef = '/services'
+ weight = 20
+{{< /code-toggle >}}
+
+Create corresponding entries in the translation tables:
+
+{{< code-toggle file=i18n/de >}}
+products = 'Produkte'
+services = 'Leistungen'
+{{< /code-toggle >}}
+
+## Missing translations
If a string does not have a translation for the current language, Hugo will use the value from the default language. If no default value is set, an empty string will be shown.
While translating a Hugo website, it can be handy to have a visual indicator of missing translations. The [`enableMissingTranslationPlaceholders` configuration option][config] will flag all untranslated strings with the placeholder `[i18n] identifier`, where `identifier` is the id of the missing translation.
-{{% note %}}
-Hugo will generate your website with these missing translation placeholders. It might not be suitable for production environments.
-{{% /note %}}
+> [!note]
+> Hugo will generate your website with these missing translation placeholders. It might not be suitable for production environments.
-For merging of content from other languages (i.e. missing content translations), see [lang.Merge](/functions/lang.merge/).
+For merging of content from other languages (i.e. missing content translations), see [lang.Merge].
-To track down missing translation strings, run Hugo with the `--i18n-warnings` flag:
+To track down missing translation strings, run Hugo with the `--printI18nWarnings` flag:
-```
- hugo --i18n-warnings | grep i18n
+```sh
+hugo --printI18nWarnings | grep i18n
i18n|MISSING_TRANSLATION|en|wordCount
```
-## Multilingual Themes support
+## Multilingual themes support
To support Multilingual mode in your themes, some considerations must be taken for the URLs in the templates. If there is more than one language, URLs must meet the following criteria:
-* Come from the built-in `.Permalink` or `.RelPermalink`
-* Be constructed with the [`relLangURL` template function][rellangurl] or the [`absLangURL` template function][abslangurl] **OR** be prefixed with `{{ .LanguagePrefix }}`
+- Come from the built-in `.Permalink` or `.RelPermalink`
+- Be constructed with the [`relLangURL`] or [`absLangURL`] template function, or be prefixed with `{{ .LanguagePrefix }}`
-If there is more than one language defined, the `LanguagePrefix` variable will equal `/en` (or whatever your `CurrentLanguage` is). If not enabled, it will be an empty string (and is therefore harmless for single-language Hugo websites).
+If there is more than one language defined, the `LanguagePrefix` method will return `/en` (or whatever the current language is). If not enabled, it will be an empty string (and is therefore harmless for single-language Hugo websites).
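+
+For example, a sketch of a language-aware link (the path is illustrative):
+
+```go-html-template
+<a href="{{ "/about/" | relLangURL }}">About</a>
+```
+
+For a French page this typically renders as `href="/fr/about/"`; for the default language it renders as `href="/about/"` unless the default language is published in its own subdirectory.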
-[abslangurl]: /functions/abslangurl
-[config]: /getting-started/configuration/
-[contenttemplate]: /templates/single-page-templates/
-[go-i18n-source]: https://github.com/nicksnyder/go-i18n
-[go-i18n]: https://github.com/nicksnyder/go-i18n
-[homepage]: /templates/homepage/
-[i18func]: /functions/i18n/
-[menus]: /content-management/menus/
-[rellangurl]: /functions/rellangurl
-[RFC 5646]: https://tools.ietf.org/html/rfc5646
-[singles]: /templates/single-page-templates/
+## Generate multilingual content with `hugo new content`
+
+If you organize content with translations in the same directory:
+
+```sh
+hugo new content post/test.en.md
+hugo new content post/test.de.md
+```
+
+If you organize content with translations in different directories:
+
+```sh
+hugo new content content/en/post/test.md
+hugo new content content/de/post/test.md
+```
+
+[`absLangURL`]: /functions/urls/abslangurl/
+[`lang.Translate`]: /functions/lang/translate
+[`relLangURL`]: /functions/urls/rellangurl/
+[`slug`]: /content-management/urls/#slug
+[`time.Format`]: /functions/time/format/
+[`url`]: /content-management/urls/#url
+[automatically]: /content-management/menus/#define-automatically
+[config]: /configuration/
+[configuration directory]: /configuration/introduction/#configuration-directory
+[example menu template]: /templates/menu/#example
+[i18func]: /functions/lang/translate/
+[in front matter]: /content-management/menus/#define-in-front-matter
+[in site configuration]: /content-management/menus/#define-in-site-configuration
+[lang.FormatAccounting]: /functions/lang/formataccounting/
+[lang.FormatCurrency]: /functions/lang/formatcurrency/
+[lang.FormatNumber]: /functions/lang/formatnumber/
+[lang.FormatNumberCustom]: /functions/lang/formatnumbercustom/
+[lang.FormatPercent]: /functions/lang/formatpercent/
+[lang.Merge]: /functions/lang/merge/
diff --git a/docs/content/en/content-management/organization/1-featured-content-bundles.png b/docs/content/en/content-management/organization/1-featured-content-bundles.png
deleted file mode 100644
index 1706a29d6..000000000
Binary files a/docs/content/en/content-management/organization/1-featured-content-bundles.png and /dev/null differ
diff --git a/docs/content/en/content-management/organization/index.md b/docs/content/en/content-management/organization/index.md
index c12e07c26..a7682bfad 100644
--- a/docs/content/en/content-management/organization/index.md
+++ b/docs/content/en/content-management/organization/index.md
@@ -1,91 +1,90 @@
---
-title: Content Organization
-linktitle: Organization
+title: Content organization
+linkTitle: Organization
description: Hugo assumes that the same structure that works to organize your source content is used to organize the rendered site.
-date: 2017-02-01
-publishdate: 2017-02-01
-lastmod: 2017-02-01
-categories: [content management,fundamentals]
-keywords: [sections,content,organization,bundle,resources]
-menu:
- docs:
- parent: "content-management"
- weight: 10
-weight: 10 #rem
-draft: false
+categories: []
+keywords: []
aliases: [/content/sections/]
-toc: true
---
-## Page Bundles
+## Page bundles
Hugo `0.32` announced page-relative images and other resources packaged into `Page Bundles`.
-These terms are connected, and you also need to read about [Page Resources]({{< relref "/content-management/page-resources" >}}) and [Image Processing]({{< relref "/content-management/image-processing" >}}) to get the full picture.
+These terms are connected, and you also need to read about [Page Resources](/content-management/page-resources) and [Image Processing](/content-management/image-processing) to get the full picture.
-{{% imgproc 1-featured Resize "300x" %}}
-The illustration shows 3 bundles. Note that the home page bundle cannot contain other content pages, but other files (images etc.) are fine.
-{{% /imgproc %}}
+```text
+content/
+├── blog/
+│ ├── hugo-is-cool/
+│ │ ├── images/
+│ │ │ ├── funnier-cat.jpg
+│ │ │ └── funny-cat.jpg
+│ │ ├── cats-info.md
+│ │ └── index.md
+│ ├── posts/
+│ │ ├── post1.md
+│ │ └── post2.md
+│ ├── 1-landscape.jpg
+│ ├── 2-sunset.jpg
+│ ├── _index.md
+│ ├── content-1.md
+│ └── content-2.md
+├── 1-logo.png
+└── _index.md
+```
+The file tree above shows three bundles. Note that the home page bundle cannot contain other content pages, although other files (images etc.) are allowed.
-{{% note %}}
-The bundle documentation is **work in progress**. We will publish more comprehensive docs about this soon.
-{{% /note %}}
-
-
-# Organization of Content Source
-
+## Organization of content source
In Hugo, your content should be organized in a manner that reflects the rendered website.
-While Hugo supports content nested at any level, the top levels (i.e. `content/`) are special in Hugo and are considered the content type used to determine layouts etc. To read more about sections, including how to nest them, see [sections][].
+While Hugo supports content nested at any level, the top levels (i.e. `content/`) are special in Hugo and are considered the content type used to determine layouts etc. To read more about sections, including how to nest them, see [sections].
-Without any additional configuration, the following will just work:
+Without any additional configuration, the following will automatically work:
-```
+```txt
.
└── content
└── about
- | └── _index.md // <- https://example.com/about/
+ | └── index.md // <- https://example.org/about/
├── posts
- | ├── firstpost.md // <- https://example.com/posts/firstpost/
+ | ├── firstpost.md // <- https://example.org/posts/firstpost/
| ├── happy
- | | └── ness.md // <- https://example.com/posts/happy/ness/
- | └── secondpost.md // <- https://example.com/posts/secondpost/
+ | | └── ness.md // <- https://example.org/posts/happy/ness/
+ | └── secondpost.md // <- https://example.org/posts/secondpost/
└── quote
- ├── first.md // <- https://example.com/quote/first/
- └── second.md // <- https://example.com/quote/second/
+ ├── first.md // <- https://example.org/quote/first/
+ └── second.md // <- https://example.org/quote/second/
```
-## Path Breakdown in Hugo
+## Path breakdown in Hugo
+The following demonstrates the relationships between your content organization and the output URL structure for your Hugo website when it renders. These examples assume you are [using pretty URLs][pretty], which is the default behavior for Hugo. The examples also assume a key-value of `baseURL = "https://example.org/"` in your [site's configuration file][config].
-The following demonstrates the relationships between your content organization and the output URL structure for your Hugo website when it renders. These examples assume you are [using pretty URLs][pretty], which is the default behavior for Hugo. The examples also assume a key-value of `baseURL = "https://example.com"` in your [site's configuration file][config].
+### Index pages: `_index.md`
-### Index Pages: `_index.md`
+`_index.md` has a special role in Hugo. It allows you to add front matter and content to `home`, `section`, `taxonomy`, and `term` pages.
-`_index.md` has a special role in Hugo. It allows you to add front matter and content to your [list templates][lists]. These templates include those for [section templates][], [taxonomy templates][], [taxonomy terms templates][], and your [homepage template][].
+> [!note]
+> Access the content and metadata within an `_index.md` file by invoking the `GetPage` method on a `Site` or `Page` object.
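+
+For example, a minimal sketch of that lookup (the `/posts` path is illustrative):
+
+```go-html-template
+{{ with site.GetPage "/posts" }}
+  <h2>{{ .Title }}</h2>
+  {{ .Content }}
+{{ end }}
+```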
-{{% note %}}
-**Tip:** You can get a reference to the content and metadata in `_index.md` using the [`.Site.GetPage` function](/functions/getpage/).
-{{% /note %}}
+You can create one `_index.md` for your home page and one in each of your content sections, taxonomies, and terms. The following shows typical placement of an `_index.md` that would contain content and front matter for a `posts` section list page on a Hugo website:
-You can keep one `_index.md` for your homepage and one in each of your content sections, taxonomies, and taxonomy terms. The following shows typical placement of an `_index.md` that would contain content and front matter for a `posts` section list page on a Hugo website:
-
-
-```
+```txt
. url
. ⊢--^-⊣
. path slug
. ⊢--^-⊣⊢---^---⊣
-. filepath
+. file path
. ⊢------^------⊣
content/posts/_index.md
```
At build, this will output to the following destination with the associated values:
-```
+```txt
url ("/posts/")
⊢-^-⊣
@@ -93,18 +92,16 @@ At build, this will output to the following destination with the associated valu
⊢--------^---------⊣⊢-^-⊣
permalink
⊢----------^-------------⊣
-https://example.com/posts/index.html
+https://example.org/posts/index.html
```
-The [sections][] can be nested as deeply as you need. The important part to understand is, that to make the section tree fully navigational, at least the lower-most section needs a content file. (i.e. `_index.md`).
+The [sections] can be nested as deeply as you want. The important thing to understand is that to make the section tree fully navigational, at least the lower-most section must include a content file (i.e., `_index.md`).
+### Single pages in sections
-### Single Pages in Sections
+Single content files in each of your sections will be rendered by a [single template]. Here is an example of a single `post` within `posts`:
-Single content files in each of your sections are going to be rendered as [single page templates][singles]. Here is an example of a single `post` within `posts`:
-
-
-```
+```txt
path ("posts/my-first-hugo-post.md")
. ⊢-----------^------------⊣
. section slug
@@ -112,9 +109,9 @@ Single content files in each of your sections are going to be rendered as [singl
content/posts/my-first-hugo-post.md
```
-At the time Hugo builds your site, the content will be output to the following destination:
+When Hugo builds your site, the content will be output to the following destination:
-```
+```txt
url ("/posts/my-first-hugo-post/")
⊢------------^----------⊣
@@ -122,119 +119,33 @@ At the time Hugo builds your site, the content will be output to the following d
⊢--------^--------⊣⊢-^--⊣⊢-------^---------⊣
permalink
⊢--------------------^---------------------⊣
-https://example.com/posts/my-first-hugo-post/index.html
+https://example.org/posts/my-first-hugo-post/index.html
```
+## Paths explained
-## Paths Explained
-
-The following concepts will provide more insight into the relationship between your project's organization and the default behaviors of Hugo when building the output website.
+The following concepts provide more insight into the relationship between your project's organization and the default Hugo behavior when building output for the website.
### `section`
-A default content type is determined by a piece of content's section. `section` is determined by the location within the project's `content` directory. `section` *cannot* be specified or overridden in front matter.
+A default content type is determined by the section in which a content item is stored. `section` is determined by the location within the project's `content` directory. `section` *cannot* be specified or overridden in front matter.
### `slug`
-A content's `slug` is either `name.extension` or `name/`. The value for `slug` is determined by
-
-* the name of the content file (e.g., `lollapalooza.md`) OR
-* front matter overrides
+The `slug` is the last segment of the URL path, defined by the file name and optionally overridden by a `slug` value in front matter. See [URL Management](/content-management/urls/#slug) for details.
### `path`
-A content's `path` is determined by the section's path to the file. The file `path`
+A content's `path` is determined by the section's path to the file. The file `path`:
-* is based on the path to the content's location AND
-* does not include the slug
+- Is based on the path to the content's location AND
+- Does not include the slug
### `url`
-The `url` is the relative URL for the piece of content. The `url`
+The `url` is the entire URL path, defined by the file path and optionally overridden by a `url` value in front matter. See [URL Management](/content-management/urls/#url) for details.
-* is based on the content's location within the directory structure OR
-* is defined in front matter and *overrides all the above*
-
-## Override Destination Paths via Front Matter
-
-Hugo believes that you organize your content with a purpose. The same structure that works to organize your source content is used to organize the rendered site. As displayed above, the organization of the source content will be mirrored in the destination.
-
-There are times where you may need more control over your content. In these cases, there are fields that can be specified in the front matter to determine the destination of a specific piece of content.
-
-The following items are defined in this order for a specific reason: items explained further down in the list will override earlier items, and not all of these items can be defined in front matter:
-
-### `filename`
-
-This isn't in the front matter, but is the actual name of the file minus the extension. This will be the name of the file in the destination (e.g., `content/posts/my-post.md` becomes `example.com/posts/my-post/`).
-
-### `slug`
-
-When defined in the front matter, the `slug` can take the place of the filename for the destination.
-
-{{< code file="content/posts/old-post.md" >}}
----
-title: New Post
-slug: "new-post"
----
-{{< /code >}}
-
-This will render to the following destination according to Hugo's default behavior:
-
-```
-example.com/posts/new-post/
-```
-
-### `section`
-
-`section` is determined by a content's location on disk and *cannot* be specified in the front matter. See [sections][] for more information.
-
-### `type`
-
-A content's `type` is also determined by its location on disk but, unlike `section`, it *can* be specified in the front matter. See [types][]. This can come in especially handy when you want a piece of content to render using a different layout. In the following example, you can create a layout at `layouts/new/mylayout.html` that Hugo will use to render this piece of content, even in the midst of many other posts.
-
-{{< code file="content/posts/my-post.md" >}}
----
-title: My Post
-type: new
-layout: mylayout
----
-{{< /code >}}
-
-
-
-
-
-### `url`
-
-A complete URL can be provided. This will override all the above as it pertains to the end destination. This must be the path from the baseURL (starting with a `/`). `url` will be used exactly as it provided in the front matter and will ignore the `--uglyURLs` setting in your site configuration:
-
-{{< code file="content/posts/old-url.md" >}}
----
-title: Old URL
-url: /blog/new-url/
----
-{{< /code >}}
-
-Assuming your `baseURL` is [configured][config] to `https://example.com`, the addition of `url` to the front matter will make `old-url.md` render to the following destination:
-
-```
-https://example.com/blog/new-url/
-```
-
-You can see more information on how to control output paths in [URL Management][urls].
-
-[config]: /getting-started/configuration/
-[formats]: /content-management/formats/
-[front matter]: /content-management/front-matter/
-[getpage]: /functions/getpage/
-[homepage template]: /templates/homepage/
-[homepage]: /templates/homepage/
-[lists]: /templates/lists/
-[pretty]: /content-management/urls/#pretty-urls
-[section templates]: /templates/section-templates/
+[config]: /configuration/
+[pretty]: /content-management/urls/#appearance
[sections]: /content-management/sections/
-[singles]: /templates/single-page-templates/
-[taxonomy templates]: /templates/taxonomy-templates/
-[taxonomy terms templates]: /templates/taxonomy-templates/
-[types]: /content-management/types/
-[urls]: /content-management/urls/
+[single template]: /templates/types/#single
diff --git a/docs/content/en/content-management/page-bundles.md b/docs/content/en/content-management/page-bundles.md
index 0d665759c..f6a5cf771 100644
--- a/docs/content/en/content-management/page-bundles.md
+++ b/docs/content/en/content-management/page-bundles.md
@@ -1,185 +1,145 @@
---
-title : "Page Bundles"
-description : "Content organization using Page Bundles"
-date : 2018-01-24T13:09:00-05:00
-lastmod : 2018-01-28T22:26:40-05:00
-linktitle : "Page Bundles"
-keywords : ["page", "bundle", "leaf", "branch"]
-categories : ["content management"]
-toc : true
-menu :
- docs:
- identifier : "page-bundles"
- parent : "content-management"
- weight : 11
+title: Page bundles
+description: Use page bundles to logically associate one or more resources with content.
+categories: []
+keywords: []
---
-Page Bundles are a way to group [Page Resources](/content-management/page-resources/).
+## Introduction
-A Page Bundle can be one of:
+A page bundle is a directory that encapsulates both content and associated resources.
-- Leaf Bundle (leaf means it has no children)
-- Branch Bundle (home page, section, taxonomy terms, taxonomy list)
+By way of example, this site has an "about" page and a "privacy" page:
-| | Leaf Bundle | Branch Bundle |
-|-------------------------------------|----------------------------------------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| Usage | Collection of content and attachments for single pages | Collection of attachments for section pages (home page, section, taxonomy terms, taxonomy list) |
-| Index file name | `index.md` [^fn:1] | `_index.md` [^fn:1] |
-| Allowed Resources | Page and non-page (like images, pdf, etc.) types | Only non-page (like images, pdf, etc.) types |
-| Where can the Resources live? | At any directory level within the leaf bundle directory. | Only in the directory level **of** the branch bundle directory i.e. the directory containing the `_index.md` ([ref](https://discourse.gohugo.io/t/question-about-content-folder-structure/11822/4?u=kaushalmodi)). |
-| Layout type | `single` | `list` |
-| Nesting | Does not allow nesting of more bundles under it | Allows nesting of leaf or branch bundles under it |
-| Example | `content/posts/my-post/index.md` | `content/posts/_index.md` |
-| Content from non-index page files... | Accessed only as page resources | Accessed only as regular pages |
+```text
+content/
+├── about/
+│ ├── index.md
+│ └── welcome.jpg
+└── privacy.md
+```
+The "about" page is a page bundle. It logically associates a resource with content by bundling them together. Resources within a page bundle are [page resources], accessible with the [`Resources`] method on the `Page` object.
-## Leaf Bundles {#leaf-bundles}
+Page bundles are either _leaf bundles_ or _branch bundles_.
-A _Leaf Bundle_ is a directory at any hierarchy within the `content/`
-directory, that contains an **`index.md`** file.
+leaf bundle
+: A _leaf bundle_ is a directory that contains an `index.md` file and zero or more resources. Analogous to a physical leaf, a leaf bundle is at the end of a branch. It has no descendants.
-### Examples of Leaf Bundle organization {#examples-of-leaf-bundle-organization}
+branch bundle
+: A _branch bundle_ is a directory that contains an `_index.md` file and zero or more resources. Analogous to a physical branch, a branch bundle may have descendants including leaf bundles and other branch bundles. Top-level directories with or without `_index.md` files are also branch bundles. This includes the home page.
+
+> [!note]
+> In the definitions above and the examples below, the extension of the index file depends on the [content format](g). For example, use `index.md` for Markdown content, `index.html` for HTML content, `index.adoc` for AsciiDoc content, etc.
+
+## Comparison
+
+Page bundle characteristics vary by bundle type.
+
+| | Leaf bundle | Branch bundle |
+|---------------------|---------------------------------------------------------|---------------------------------------------------------|
+| Index file | `index.md` | `_index.md` |
+| Example | `content/about/index.md` | `content/posts/_index.md ` |
+| [Page kinds](g) | `page` | `home`, `section`, `taxonomy`, or `term` |
+| Template types | [single] | [home], [section], [taxonomy], or [term] |
+| Descendant pages | None | Zero or more |
+| Resource location   | Adjacent to the index file or in a nested subdirectory  | Same as a leaf bundle, but excludes descendant bundles   |
+| [Resource types](g) | `page`, `image`, `video`, etc. | all but `page` |
+
+Files with [resource type](g) `page` include content written in Markdown, HTML, AsciiDoc, Pandoc, reStructuredText, and Emacs Org Mode. In a leaf bundle, excluding the index file, these files are only accessible as page resources. In a branch bundle, these files are only accessible as content pages.
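+
+For example, a sketch of a single template listing a leaf bundle's image resources (markup illustrative):
+
+```go-html-template
+{{ range .Resources.ByType "image" }}
+  <img src="{{ .RelPermalink }}" alt="">
+{{ end }}
+```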
+
+## Leaf bundles
+
+A _leaf bundle_ is a directory that contains an `index.md` file and zero or more resources. Analogous to a physical leaf, a leaf bundle is at the end of a branch. It has no descendants.
```text
content/
├── about
-│ ├── index.md
+│ └── index.md
├── posts
│ ├── my-post
-│ │ ├── content1.md
-│ │ ├── content2.md
-│ │ ├── image1.jpg
-│ │ ├── image2.png
+│ │ ├── content-1.md
+│ │ ├── content-2.md
+│ │ ├── image-1.jpg
+│ │ ├── image-2.png
│ │ └── index.md
│ └── my-other-post
-│ └── index.md
-│
+│ └── index.md
└── another-section
- ├── ..
- └── not-a-leaf-bundle
- ├── ..
- └── another-leaf-bundle
- └── index.md
+ ├── foo.md
+ └── not-a-leaf-bundle
+ ├── bar.md
+ └── another-leaf-bundle
+ └── index.md
```
-In the above example `content/` directory, there are four leaf
-bundles:
+There are four leaf bundles in the example above:
about
-: This leaf bundle is at the root level (directly under
- `content` directory) and has only the `index.md`.
+: This leaf bundle does not contain any page resources.
my-post
-: This leaf bundle has the `index.md`, two other content
- Markdown files and two image files.
+: This leaf bundle contains an index file, two resources of [resource type](g) `page`, and two resources of resource type `image`.
+
+ - content-1, content-2
+
+ These are resources of resource type `page`, accessible via the [`Resources`] method on the `Page` object. Hugo will not render these as individual pages.
+
+ - image-1, image-2
+
+ These are resources of resource type `image`, accessible via the `Resources` method on the `Page` object
my-other-post
-: This leaf bundle has only the `index.md`.
+: This leaf bundle does not contain any page resources.
another-leaf-bundle
-: This leaf bundle is nested under couple of
- directories. This bundle also has only the `index.md`.
+: This leaf bundle does not contain any page resources.
-{{% note %}}
-The hierarchy depth at which a leaf bundle is created does not matter,
-as long as it is not inside another **leaf** bundle.
-{{% /note %}}
+> [!note]
+> Create leaf bundles at any depth within the `content` directory, but a leaf bundle may not contain another bundle. Leaf bundles do not have descendants.
+## Branch bundles
-### Headless Bundle {#headless-bundle}
-
-A headless bundle is a bundle that is configured to not get published
-anywhere:
-
-- It will have no `Permalink` and no rendered HTML in `public/`.
-- It will not be part of `.Site.RegularPages`, etc.
-
-But you can get it by `.Site.GetPage`. Here is an example:
-
-```go-html-template
-{{ $headless := .Site.GetPage "/some-headless-bundle" }}
-{{ $reusablePages := $headless.Resources.Match "author*" }}
-
Authors
-{{ range $reusablePages }}
-
{{ .Title }}
- {{ .Content }}
-{{ end }}
-```
-
-_In this example, we are assuming the `some-headless-bundle` to be a headless
- bundle containing one or more **page** resources whose `.Name` matches
- `"author*"`._
-
-Explanation of the above example:
-
-1. Get the `some-headless-bundle` Page "object".
-2. Collect a *slice* of resources in this *Page Bundle* that matches
- `"author*"` using `.Resources.Match`.
-3. Loop through that *slice* of nested pages, and output their `.Title` and
- `.Content`.
-
----
-
-A leaf bundle can be made headless by adding below in the Front Matter
-(in the `index.md`):
-
-```toml
-headless = true
-```
-
-{{% note %}}
-Only leaf bundles can be made headless.
-{{% /note %}}
-
-There are many use cases of such headless page bundles:
-
-- Shared media galleries
-- Reusable page content "snippets"
-
-
-## Branch Bundles {#branch-bundles}
-
-A _Branch Bundle_ is any directory at any hierarchy within the
-`content/` directory, that contains at least an **`_index.md`** file.
-
-This `_index.md` can also be directly under the `content/` directory.
-
-{{% note %}}
-Here `md` (markdown) is used just as an example. You can use any file
-type as a content resource as long as it is a content type recognized by Hugo.
-{{% /note %}}
-
-
-### Examples of Branch Bundle organization {#examples-of-branch-bundle-organization}
+A _branch bundle_ is a directory that contains an `_index.md` file and zero or more resources. Analogous to a physical branch, a branch bundle may have descendants including leaf bundles and other branch bundles. Top-level directories with or without `_index.md` files are also branch bundles. This includes the home page.
```text
content/
-├── branch-bundle-1
-│ ├── branch-content1.md
-│ ├── branch-content2.md
-│ ├── image1.jpg
-│ ├── image2.png
-│ └── _index.md
-└── branch-bundle-2
- ├── _index.md
- └── a-leaf-bundle
- └── index.md
+├── branch-bundle-1/
+│ ├── _index.md
+│ ├── content-1.md
+│ ├── content-2.md
+│ ├── image-1.jpg
+│ └── image-2.png
+├── branch-bundle-2/
+│ ├── a-leaf-bundle/
+│ │ └── index.md
+│ └── _index.md
+└── _index.md
```
-In the above example `content/` directory, there are two branch
-bundles (and a leaf bundle):
+There are three branch bundles in the example above:
-`branch-bundle-1`
-: This branch bundle has the `_index.md`, two
- other content Markdown files and two image files.
+home page
+: This branch bundle contains an index file, two descendant branch bundles, and no resources.
-`branch-bundle-2`
-: This branch bundle has the `_index.md` and a
- nested leaf bundle.
+branch-bundle-1
+: This branch bundle contains an index file, two resources of [resource type](g) `page`, and two resources of resource type `image`.
-{{% note %}}
-The hierarchy depth at which a branch bundle is created does not
-matter.
-{{% /note %}}
+branch-bundle-2
+: This branch bundle contains an index file and a leaf bundle.
-[^fn:1]: The `.md` extension is just an example. The extension can be `.html`, `.json` or any of any valid MIME type.
+> [!note]
+> Create branch bundles at any depth within the `content` directory. Branch bundles may have descendants.
+
+## Headless bundles
+
+Use [build options] in front matter to create an unpublished leaf or branch bundle whose content and resources you can include in other pages.
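+
+For example, a minimal sketch that pulls content from an unpublished bundle into another page; the `/snippets` path and `note.md` resource are hypothetical:
+
+```go-html-template
+{{/* Get the unpublished bundle, then render one of its resources. */}}
+{{ with site.GetPage "/snippets" }}
+  {{ with .Resources.Get "note.md" }}
+    {{ .Content }}
+  {{ end }}
+{{ end }}
+```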
+
+[`Resources`]: /methods/page/resources/
+[build options]: /content-management/build-options/
+[home]: /templates/types/#home
+[page resources]: /content-management/page-resources/
+[section]: /templates/types/#section
+[single]: /templates/types/#single
+[taxonomy]: /templates/types/#taxonomy
+[term]: /templates/types/#term
diff --git a/docs/content/en/content-management/page-resources.md b/docs/content/en/content-management/page-resources.md
index dcd19e42f..204ca5301 100644
--- a/docs/content/en/content-management/page-resources.md
+++ b/docs/content/en/content-management/page-resources.md
@@ -1,106 +1,131 @@
---
-title : "Page Resources"
-description : "Page Resources -- images, other pages, documents etc. -- have page-relative URLs and their own metadata."
-date: 2018-01-24
-categories: ["content management"]
-keywords: [bundle,content,resources]
-weight: 4003
-draft: false
-toc: true
-linktitle: "Page Resources"
-menu:
- docs:
- parent: "content-management"
- weight: 31
+title: Page resources
+description: Use page resources to logically associate assets with a page.
+categories: []
+keywords: []
---
-## Properties
+Page resources are only accessible from [page bundles](/content-management/page-bundles/), those directories with `index.md` or `_index.md` files at their root. They are available only to the page with which they are bundled.
-ResourceType
-: The main type of the resource. For example, a file of MIME type `image/jpg` has the ResourceType `image`.
+In this example, `first-post` is a page bundle with access to 10 page resources including audio, data, documents, images, and video. Although `second-post` is also a page bundle, it has no page resources and is unable to directly access the page resources associated with `first-post`.
-Name
-: Default value is the filename (relative to the owning page). Can be set in front matter.
-
-Title
-: Default value is the same as `.Name`. Can be set in front matter.
-
-Permalink
-: The absolute URL to the resource. Resources of type `page` will have no value.
-
-RelPermalink
-: The relative URL to the resource. Resources of type `page` will have no value.
-
-Content
-: The content of the resource itself. For most resources, this returns a string with the contents of the file. This can be used to inline some resources, such as `` or ``.
-
-MediaType
-: The MIME type of the resource, such as `image/jpg`.
-
-MediaType.MainType
-: The main type of the resource's MIME type. For example, a file of MIME type `application/pdf` has for MainType `application`.
-
-MediaType.SubType
-: The subtype of the resource's MIME type. For example, a file of MIME type `application/pdf` has for SubType `pdf`. Note that this is not the same as the file extension - PowerPoint files have a subtype of `vnd.mspowerpoint`.
-
-MediaType.Suffixes
-: A slice of possible suffixes for the resource's MIME type.
-
-## Methods
-ByType
-: Returns the page resources of the given type.
-
-```go
-{{ .Resources.ByType "image" }}
-```
-Match
-: Returns all the page resources (as a slice) whose `Name` matches the given Glob pattern ([examples](https://github.com/gobwas/glob/blob/master/readme.md)). The matching is case-insensitive.
-
-```go
-{{ .Resources.Match "images/*" }}
+```text
+content
+└── post
+ ├── first-post
+ │ ├── images
+ │ │ ├── a.jpg
+ │ │ ├── b.jpg
+ │ │ └── c.jpg
+ │ ├── index.md (root of page bundle)
+ │ ├── latest.html
+ │ ├── manual.json
+ │ ├── notice.md
+ │ ├── office.mp3
+ │ ├── pocket.mp4
+ │ ├── rating.pdf
+ │ └── safety.txt
+ └── second-post
+ └── index.md (root of page bundle)
```
-GetMatch
-: Same as `Match` but will return the first match.
+## Examples
-### Pattern Matching
-```go
-// Using Match/GetMatch to find this images/sunset.jpg ?
-.Resources.Match "images/sun*" ✅
-.Resources.Match "**/Sunset.jpg" ✅
-.Resources.Match "images/*.jpg" ✅
-.Resources.Match "**.jpg" ✅
-.Resources.Match "*" 🚫
-.Resources.Match "sunset.jpg" 🚫
-.Resources.Match "*sunset.jpg" 🚫
+Use any of these methods on a `Page` object to capture page resources:
+
+- [`Resources.ByType`]
+- [`Resources.Get`]
+- [`Resources.GetMatch`]
+- [`Resources.Match`]
+
+Once you have captured a resource, use any of the applicable [`Resource`] methods to return a value or perform an action.
+
+The following examples assume this content structure:
+
+```text
+content/
+└── example/
+ ├── data/
+ │ └── books.json <-- page resource
+ ├── images/
+ │ ├── a.jpg <-- page resource
+ │ └── b.jpg <-- page resource
+ ├── snippets/
+ │ └── text.md <-- page resource
+ └── index.md
```
-## Page Resources Metadata
+Render a single image, and throw an error if the file does not exist:
-Page Resources' metadata is managed from their page's front matter with an array/table parameter named `resources`. You can batch assign values using a [wildcards](http://tldp.org/LDP/GNU-Linux-Tools-Summary/html/x11655.htm).
+```go-html-template
+{{ $path := "images/a.jpg" }}
+{{ with .Resources.Get $path }}
+  <img src="{{ .RelPermalink }}" width="{{ .Width }}" height="{{ .Height }}" alt="">
+{{ else }}
+ {{ errorf "Unable to get page resource %q" $path }}
+{{ end }}
+```
-{{% note %}}
-Resources of type `page` get `Title` etc. from their own front matter.
-{{% /note %}}
+Render all images, resized to 300 px wide:
+
+```go-html-template
+{{ range .Resources.ByType "image" }}
+ {{ with .Resize "300x" }}
+    <img src="{{ .RelPermalink }}" width="{{ .Width }}" height="{{ .Height }}" alt="">
+ {{ end }}
+{{ end }}
+```
+
+Render the Markdown snippet:
+
+```go-html-template
+{{ with .Resources.Get "snippets/text.md" }}
+ {{ .Content }}
+{{ end }}
+```
+
+List the titles in the data file, and throw an error if the file does not exist:
+
+```go-html-template
+{{ $path := "data/books.json" }}
+{{ with .Resources.Get $path }}
+ {{ with . | transform.Unmarshal }}
+    <p>Books:</p>
+    <ul>
+      {{ range . }}
+        <li>{{ .title }}</li>
+      {{ end }}
+    </ul>
+ {{ end }}
+{{ else }}
+ {{ errorf "Unable to get page resource %q" $path }}
+{{ end }}
+```
+
+## Metadata
+
+The page resources' metadata is managed from the corresponding page's front matter with an array/table parameter named `resources`. You can batch assign values using [wildcards](https://tldp.org/LDP/GNU-Linux-Tools-Summary/html/x11655.htm).
+
+> [!note]
+> Resources of type `page` get `Title` etc. from their own front matter.
name
-: Sets the value returned in `Name`.
+: (`string`) Sets the value returned in `Name`.
-{{% warning %}}
-The methods `Match` and `GetMatch` use `Name` to match the resources.
-{{%/ warning %}}
+> [!note]
+> The methods `Match`, `Get` and `GetMatch` use `Name` to match the resources.
title
-: Sets the value returned in `Title`
+: (`string`) Sets the value returned in `Title`.
params
-: A map of custom key/values.
+: (`map`) A map of custom key-value pairs.
+### Resources metadata example
-### Resources metadata example
-
-{{< code-toggle copy="false">}}
+{{< code-toggle file=content/example.md fm=true >}}
title: Application
date : 2018-01-25
resources :
@@ -134,9 +159,8 @@ From the example above:
- All `PDF` files will get a new `Name`. The `name` parameter contains a special placeholder [`:counter`](#the-counter-placeholder-in-name-and-title), so the `Name` will be `pdf-file-1`, `pdf-file-2`, `pdf-file-3`.
- Every docx in the bundle will receive the `word` icon.
-{{% warning %}}
-The __order matters__ --- Only the **first set** values of the `title`, `name` and `params`-**keys** will be used. Consecutive parameters will be set only for the ones not already set. For example, in the above example, `.Params.icon` is already first set to `"photo"` in `src = "documents/photo_specs.pdf"`. So that would not get overridden to `"pdf"` by the later set `src = "**.pdf"` rule.
-{{%/ warning %}}
+> [!note]
+> The order matters; only the first values set for the `title`, `name`, and `params` keys will be used. Consecutive parameters are set only for the keys not already set. In the example above, `.Params.icon` is first set to `"photo"` by `src = "documents/photo_specs.pdf"`, so it is not overridden to `"pdf"` by the later `src = "**.pdf"` rule.
### The `:counter` placeholder in `name` and `title`
@@ -146,7 +170,8 @@ The counter starts at 1 the first time they are used in either `name` or `title`
For example, if a bundle has the resources `photo_specs.pdf`, `other_specs.pdf`, `guide.pdf` and `checklist.pdf`, and the front matter has specified the `resources` as:
-{{< code-toggle copy="false">}}
+{{< code-toggle file=content/inspections/engine/index.md fm=true >}}
+title = 'Engine inspections'
[[resources]]
src = "*specs.pdf"
title = "Specification #:counter"
@@ -163,3 +188,110 @@ the `Name` and `Title` will be assigned to the resource files as follows:
| guide.pdf | `"pdf-file-2.pdf` | `"guide.pdf"` |
| other\_specs.pdf | `"pdf-file-3.pdf` | `"Specification #1"` |
| photo\_specs.pdf | `"pdf-file-4.pdf` | `"Specification #2"` |
+
+## Multilingual
+
+{{< new-in 0.123.0 />}}
+
+By default, with a multilingual single-host site, Hugo does not duplicate shared page resources when building the site.
+
+> [!note]
+> This behavior is limited to Markdown content. Shared page resources for other [content formats] are copied into each language bundle.
+
+Consider this site configuration:
+
+{{< code-toggle file=hugo >}}
+defaultContentLanguage = 'de'
+defaultContentLanguageInSubdir = true
+
+[languages.de]
+languageCode = 'de-DE'
+languageName = 'Deutsch'
+weight = 1
+
+[languages.en]
+languageCode = 'en-US'
+languageName = 'English'
+weight = 2
+{{< /code-toggle >}}
+
+And this content:
+
+```text
+content/
+└── my-bundle/
+ ├── a.jpg <-- shared page resource
+ ├── b.jpg <-- shared page resource
+ ├── c.de.jpg
+ ├── c.en.jpg
+ ├── index.de.md
+ └── index.en.md
+```
+
+With v0.122.0 and earlier, Hugo duplicated the shared page resources, creating copies for each language:
+
+```text
+public/
+├── de/
+│ ├── my-bundle/
+│ │ ├── a.jpg <-- shared page resource
+│ │ ├── b.jpg <-- shared page resource
+│ │ ├── c.de.jpg
+│ │ └── index.html
+│ └── index.html
+├── en/
+│ ├── my-bundle/
+│ │ ├── a.jpg <-- shared page resource (duplicate)
+│ │ ├── b.jpg <-- shared page resource (duplicate)
+│ │ ├── c.en.jpg
+│ │ └── index.html
+│ └── index.html
+└── index.html
+```
+
+With v0.123.0 and later, Hugo places the shared resources in the page bundle for the default content language:
+
+```text
+public/
+├── de/
+│ ├── my-bundle/
+│ │ ├── a.jpg <-- shared page resource
+│ │ ├── b.jpg <-- shared page resource
+│ │ ├── c.de.jpg
+│ │ └── index.html
+│ └── index.html
+├── en/
+│ ├── my-bundle/
+│ │ ├── c.en.jpg
+│ │ └── index.html
+│ └── index.html
+└── index.html
+```
+
+This approach reduces build times, storage requirements, bandwidth consumption, and deployment times, ultimately reducing cost.
+
+> [!note]
+> To resolve Markdown link and image destinations to the correct location, you must use link and image render hooks that capture the page resource with the [`Resources.Get`] method, and then invoke its [`RelPermalink`] method.
+>
+> By default, with multilingual single-host sites, Hugo enables its [embedded link render hook] and [embedded image render hook] to resolve Markdown link and image destinations.
+>
+> You may override the embedded render hooks as needed, provided they capture the resource as described above.
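+
+For example, a minimal sketch of an image render hook (illustrative only, not the embedded hook itself) that resolves the image destination to a page resource:
+
+```go-html-template {file="layouts/_default/_markup/render-image.html"}
+{{ with .Page.Resources.Get .Destination }}
+  <img src="{{ .RelPermalink }}" alt="{{ $.Text }}">
+{{ else }}
+  {{/* Fall back to the original destination. */}}
+  <img src="{{ .Destination }}" alt="{{ .Text }}">
+{{ end }}
+```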
+
+Although duplicating shared page resources is inefficient, you can enable this feature in your site configuration if desired:
+
+{{< code-toggle file=hugo >}}
+[markup.goldmark]
+duplicateResourceFiles = true
+{{< /code-toggle >}}
+
+[`RelPermalink`]: /methods/resource/relpermalink/
+[`Resource`]: /methods/resource
+[`Resources.ByType`]: /methods/page/resources#bytype
+[`Resources.Get`]: /methods/page/resources#get
+[`Resources.GetMatch`]: /methods/page/resources#getmatch
+[`Resources.Match`]: /methods/page/resources#match
+[content formats]: /content-management/formats/
+[embedded image render hook]: /render-hooks/images/#default
+[embedded link render hook]: /render-hooks/links/#default
diff --git a/docs/content/en/content-management/related-content.md b/docs/content/en/content-management/related-content.md
new file mode 100644
index 000000000..d7b18dab0
--- /dev/null
+++ b/docs/content/en/content-management/related-content.md
@@ -0,0 +1,102 @@
+---
+title: Related content
+description: List related content in "See Also" sections.
+categories: []
+keywords: []
+aliases: [/content/related/,/related/,/content-management/related/]
+---
+
+Hugo uses a set of factors to identify a page's related content based on front matter parameters. This can be tuned to the desired set of indices and parameters or left to Hugo's default [related content configuration](/configuration/related-content/).
+
+## List related content
+
+Listing up to 5 related pages (which share the same _date_ or _keyword_ parameters) is as simple as including something similar to this partial in your template:
+
+```go-html-template {file="layouts/partials/related.html" copy=true}
+{{ with site.RegularPages.Related . | first 5 }}
+  <p>Related content:</p>
+  <ul>
+    {{ range . }}
+      <li><a href="{{ .RelPermalink }}">{{ .LinkTitle }}</a></li>
+    {{ end }}
+  </ul>
+{{ end }}
+```
+
+The `Related` method takes one argument which may be a `Page` or an options map. The options map has these options:
+
+indices
+: (`slice`) The indices to search within.
+
+document
+: (`page`) The page for which to find related content. Required when specifying an options map.
+
+namedSlices
+: (`slice`) The keywords to search for, expressed as a slice of `KeyValues` using the [`keyVals`] function.
+
+fragments
+: (`slice`) A list of special keywords that are used for indices configured as type "fragments". These will match the [fragment](g) identifiers of the documents.
+
+A fictional example using all of the above options:
+
+```go-html-template
+{{ $page := . }}
+{{ $opts := dict
+ "indices" (slice "tags" "keywords")
+ "document" $page
+ "namedSlices" (slice (keyVals "tags" "hugo" "rocks") (keyVals "date" $page.Date))
+ "fragments" (slice "heading-1" "heading-2")
+}}
+```
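+
+You can then pass the options map to the `Related` method, for example:
+
+```go-html-template
+{{ $related := site.RegularPages.Related $opts | first 5 }}
+```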
+
+> [!note]
+> We improved and simplified this feature in Hugo 0.111.0. Before this we had 3 different methods: `Related`, `RelatedTo` and `RelatedIndices`. Now we have only one method: `Related`. The old methods are still available but deprecated. Also see [this blog article](https://regisphilibert.com/blog/2018/04/hugo-optmized-relashionships-with-related-content/) for a great explanation of more advanced usage of this feature.
+
+## Index content headings
+
+Hugo can index the headings in your content and use this to find related content. You can enable this by adding an index of type `fragments` to your `related` configuration:
+
+{{< code-toggle file=hugo >}}
+[related]
+threshold = 20
+includeNewer = true
+toLower = false
+[[related.indices]]
+name = "fragmentrefs"
+type = "fragments"
+applyFilter = true
+weight = 80
+{{< /code-toggle >}}
+
+- The `name` maps to an optional front matter slice attribute that can be used to link from the page level down to the fragment/heading level.
+- If `applyFilter` is enabled, `.HeadingsFiltered` on each page in the result will reflect the filtered headings. This is useful if you want to show the headings in the related content listing:
+
+```go-html-template
+{{ $related := .Site.RegularPages.Related . | first 5 }}
+{{ with $related }}
+  <ul>
+    {{ range . }}
+      <li>
+        <a href="{{ .RelPermalink }}">{{ .LinkTitle }}</a>
+        {{ range .HeadingsFiltered }}
+          {{ .Title }}
+        {{ end }}
+      </li>
+    {{ end }}
+  </ul>
+{{ end }}
+```
+
+## Configuration
+
+See [configure related content](/configuration/related-content/).
+
+[`keyVals`]: /functions/collections/keyvals/
diff --git a/docs/content/en/content-management/related.md b/docs/content/en/content-management/related.md
deleted file mode 100644
index 640cb04c0..000000000
--- a/docs/content/en/content-management/related.md
+++ /dev/null
@@ -1,135 +0,0 @@
----
-title: Related Content
-description: List related content in "See Also" sections.
-date: 2017-09-05
-categories: [content management]
-keywords: [content]
-menu:
- docs:
- parent: "content-management"
- weight: 40
-weight: 30
-draft: false
-aliases: [/content/related/,/related/]
-toc: true
----
-
-
-Hugo uses a set of factors to identify a page's related content based on Front Matter parameters. This can be tuned to the desired set of indices and parameters or left to Hugo's default [Related Content configuration](#configure-related-content).
-
-## List Related Content
-
-
-To list up to 5 related pages (which share the same _date_ or _keyword_ parameters) is as simple as including something similar to this partial in your single page template:
-
-{{< code file="layouts/partials/related.html" >}}
-{{ $related := .Site.RegularPages.Related . | first 5 }}
-{{ with $related }}
-
-{{ end }}
-{{< /code >}}
-
-### Methods
-
-Here is the list of "Related" methods available on a page collection such `.RegularPages`.
-
-#### .Related PAGE
-Returns a collection of pages related the given one.
-
-```
-{{ $related := .Site.RegularPages.Related . }}
-```
-
-#### .RelatedIndices PAGE INDICE1 [INDICE2 ...]
-Returns a collection of pages related to a given one restricted to a list of indices.
-
-```
-{{ $related := .Site.RegularPages.RelatedIndices . "tags" "date" }}
-```
-
-#### .RelatedTo KEYVALS [KEYVALS2 ...]
-Returns a collection of pages related together by a set of indices and their match.
-
-In order to build those set and pass them as argument, one must use the `keyVals` function where the first argument would be the `indice` and the consective ones its potential `matches`.
-
-```
-{{ $related := .Site.RegularPages.RelatedTo ( keyVals "tags" "hugo" "rocks") ( keyVals "date" .Date ) }}
-```
-
-{{% note %}}
-Read [this blog article](https://regisphilibert.com/blog/2018/04/hugo-optmized-relashionships-with-related-content/) for a great explanation of more advanced usage of this feature.
-{{% /note %}}
-
-## Configure Related Content
-Hugo provides a sensible default configuration of Related Content, but you can fine-tune this in your configuration, on the global or language level if needed.
-
-### Default configuration
-
-Without any `related` configuration set on the project, Hugo's Related Content methods will use the following.
-
-```yaml
-related:
- threshold: 80
- includeNewer: false
- toLower: false
- indices:
- - name: keywords
- weight: 100
- - name: date
- weight: 10
-```
-
-Custom configuration should be set using the same syntax.
-
-{{% note %}}
-If you add a `related` config section, you need to add a complete configuration. It is not possible to just set, say, `includeNewer` and use the rest from the Hugo defaults.
-{{% /note %}}
-
-### Top Level Config Options
-
-threshold
-: A value between 0-100. Lower value will give more, but maybe not so relevant, matches.
-
-includeNewer
-: Set to true to include **pages newer than the current page** in the related content listing. This will mean that the output for older posts may change as new related content gets added.
-
-toLower
-: Set to true to lower case keywords in both the indexes and the queries. This may give more accurate results at a slight performance penalty. Note that this can also be set per index.
-
-### Config Options per Index
-
-name
-: The index name. This value maps directly to a page param. Hugo supports string values (`author` in the example) and lists (`tags`, `keywords` etc.) and time and date objects.
-
-weight
-: An integer weight that indicates _how important_ this parameter is relative to the other parameters. It can be 0, which has the effect of turning this index off, or even negative. Test with different values to see what fits your content best.
-
-pattern
-: This is currently only relevant for dates. When listing related content, we may want to list content that is also close in time. Setting "2006" (default value for date indexes) as the pattern for a date index will add weight to pages published in the same year. For busier blogs, "200601" (year and month) may be a better default.
-
-toLower
-: See above.
-
-## Performance Considerations
-
-**Fast is Hugo's middle name** and we would not have released this feature had it not been blistering fast.
-
-This feature has been in the back log and requested by many for a long time. The development got this recent kick start from this Twitter thread:
-
-{{< tweet 898398437527363585 >}}
-
-Scott S. Lowe removed the "Related Content" section built using the `intersect` template function on tags, and the build time dropped from 30 seconds to less than 2 seconds on his 1700 content page sized blog.
-
-He should now be able to add an improved version of that "Related Content" section without giving up the fast live-reloads. But it's worth noting that:
-
-* If you don't use any of the `Related` methods, you will not use the Relate Content feature, and performance will be the same as before.
-* Calling `.RegularPages.Related` etc. will create one inverted index, also sometimes named posting list, that will be reused for any lookups in that same page collection. Doing that in addition to, as an example, calling `.Pages.Related` will work as expected, but will create one additional inverted index. This should still be very fast, but worth having in mind, especially for bigger sites.
-
-{{% note %}}
-We currently do not index **Page content**. We thought we would release something that will make most people happy before we start solving [Sherlock's last case](https://github.com/joearms/sherlock).
-{{% /note %}}
diff --git a/docs/content/en/content-management/sections.md b/docs/content/en/content-management/sections.md
index 79ae201d4..f7a2296f5 100644
--- a/docs/content/en/content-management/sections.md
+++ b/docs/content/en/content-management/sections.md
@@ -1,98 +1,139 @@
---
-title: Content Sections
-linktitle: Sections
-description: "Hugo generates a **section tree** that matches your content."
-date: 2017-02-01
-publishdate: 2017-02-01
-lastmod: 2017-02-01
-categories: [content management]
-keywords: [lists,sections,content types,organization]
-menu:
- docs:
- parent: "content-management"
- weight: 50
-weight: 50 #rem
-draft: false
+title: Sections
+description: Organize content into sections.
+
+categories: []
+keywords: []
aliases: [/content/sections/]
-toc: true
---
-A **Section** is a collection of pages that gets defined based on the
-organization structure under the `content/` directory.
+## Overview
-By default, all the **first-level** directories under `content/` form their own
-sections (**root sections**).
+{{% glossary-term "section" %}}
-If a user needs to define a section `foo` at a deeper level, they need to create
-a directory named `foo` with an `_index.md` file (see [Branch Bundles][branch bundles]
-for more information).
-
-
-{{% note %}}
-A **section** cannot be defined or overridden by a front matter parameter -- it
-is strictly derived from the content organization structure.
-{{% /note %}}
-
-## Nested Sections
-
-The sections can be nested as deeply as you need.
-
-```bash
-content
-└── blog <-- Section, because first-level dir under content/
- ├── funny-cats
- │ ├── mypost.md
- │ └── kittens <-- Section, because contains _index.md
- │ └── _index.md
- └── tech <-- Section, because contains _index.md
- └── _index.md
+```text
+content/
+├── articles/ <-- section (top-level directory)
+│ ├── 2022/
+│ │ ├── article-1/
+│ │ │ ├── cover.jpg
+│ │ │ └── index.md
+│ │ └── article-2.md
+│ └── 2023/
+│ ├── article-3.md
+│ └── article-4.md
+├── products/ <-- section (top-level directory)
+│ ├── product-1/ <-- section (has _index.md file)
+│ │ ├── benefits/ <-- section (has _index.md file)
+│ │ │ ├── _index.md
+│ │ │ ├── benefit-1.md
+│ │ │ └── benefit-2.md
+│ │ ├── features/ <-- section (has _index.md file)
+│ │ │ ├── _index.md
+│ │ │ ├── feature-1.md
+│ │ │ └── feature-2.md
+│ │ └── _index.md
+│ └── product-2/ <-- section (has _index.md file)
+│ ├── benefits/ <-- section (has _index.md file)
+│ │ ├── _index.md
+│ │ ├── benefit-1.md
+│ │ └── benefit-2.md
+│ ├── features/ <-- section (has _index.md file)
+│ │ ├── _index.md
+│ │ ├── feature-1.md
+│ │ └── feature-2.md
+│ └── _index.md
+├── _index.md
+└── about.md
```
-**The important part to understand is, that to make the section tree fully navigational, at least the lower-most section needs a content file. (e.g. `_index.md`).**
+The example above has two top-level sections: articles and products. None of the directories under articles are sections, while all of the directories under products are sections. A section within a section is known as a nested section or subsection.
-{{% note %}}
-When we talk about a **section** in correlation with template selection, it is
-currently always the *root section* only (`/blog/funny-cats/mypost/ => blog`).
+## Explanation
-If you need a specific template for a sub-section, you need to adjust either the `type` or `layout` in front matter.
-{{% /note %}}
+Sections and non-sections behave differently.
-## Example: Breadcrumb Navigation
+||Sections|Non-sections
+:--|:-:|:-:
+Directory names become URL segments|:heavy_check_mark:|:heavy_check_mark:
+Have logical ancestors and descendants|:heavy_check_mark:|:x:
+Have list pages|:heavy_check_mark:|:x:
-With the available [section variables and methods](#section-page-variables-and-methods) you can build powerful navigation. One common example would be a partial to show Breadcrumb navigation:
+With the file structure from the [example above](#overview):
-{{< code file="layouts/partials/breadcrumb.html" download="breadcrumb.html" >}}
-
- {{ template "breadcrumbnav" (dict "p1" . "p2" .) }}
-
-{{ define "breadcrumbnav" }}
-{{ if .p1.Parent }}
-{{ template "breadcrumbnav" (dict "p1" .p1.Parent "p2" .p2 ) }}
-{{ else if not .p1.IsHome }}
-{{ template "breadcrumbnav" (dict "p1" .p1.Site.Home "p2" .p2 ) }}
-{{ end }}
-
-{{ end }}
-{{< /code >}}
+1. The list page for the articles section includes all articles, regardless of directory structure; none of the subdirectories are sections.
+1. The articles/2022 and articles/2023 directories do not have list pages; they are not sections.
+1. The list page for the products section, by default, includes product-1 and product-2, but not their descendant pages. To include descendant pages, use the `RegularPagesRecursive` method instead of the `Pages` method in the list template, as shown in the sketch after this list.
+1. All directories in the products section have list pages; each directory is a section.
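+
+For example, a minimal sketch of a products list template that includes all descendant regular pages:
+
+```go-html-template {file="layouts/products/list.html"}
+<ul>
+  {{ range .RegularPagesRecursive }}
+    <li><a href="{{ .RelPermalink }}">{{ .LinkTitle }}</a></li>
+  {{ end }}
+</ul>
+```
+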
-## Section Page Variables and Methods
+## Template selection
-Also see [Page Variables](/variables/page/).
+Hugo has a defined [lookup order] to determine which template to use when rendering a page. The [lookup rules] consider the top-level section name; subsection names are not considered when selecting a template.
-{{< readfile file="/content/en/readfiles/sectionvars.md" markdown="true" >}}
+With the file structure from the [example above](#overview):
-## Content Section Lists
+Content directory|Section template
+:--|:--
+`content/products`|`layouts/products/list.html`
+`content/products/product-1`|`layouts/products/list.html`
+`content/products/product-1/benefits`|`layouts/products/list.html`
-Hugo will automatically create pages for each *root section* that list all of the content in that section. See the documentation on [section templates][] for details on customizing the way these pages are rendered.
+Content directory|Single template
+:--|:--
+`content/products`|`layouts/products/single.html`
+`content/products/product-1`|`layouts/products/single.html`
+`content/products/product-1/benefits`|`layouts/products/single.html`
-## Content *Section* vs Content *Type*
+If you need to use a different template for a subsection, specify `type` and/or `layout` in front matter.
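+
+For example, a sketch of front matter for the benefits subsection, with hypothetical `type` and `layout` values:
+
+{{< code-toggle file=content/products/product-1/benefits/_index.md fm=true >}}
+title = 'Benefits'
+type = 'benefits'
+layout = 'benefits-list'
+{{< /code-toggle >}}
+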
-By default, everything created within a section will use the [content `type`][content type] that matches the *root section* name. For example, Hugo will assume that `posts/post-1.md` has a `posts` content `type`. If you are using an [archetype][] for your `posts` section, Hugo will generate front matter according to what it finds in `archetypes/posts.md`.
+## Ancestors and descendants
-[archetype]: /content-management/archetypes/
-[content type]: /content-management/types/
-[directory structure]: /getting-started/directory-structure/
-[section templates]: /templates/section-templates/
-[branch bundles]: /content-management/page-bundles/#branch-bundles
+A section has one or more ancestors (including the home page), and zero or more descendants. With the file structure from the [example above](#overview):
+
+```text
+content/products/product-1/benefits/benefit-1.md
+```
+
+The content file (benefit-1.md) has four ancestors: benefits, product-1, products, and the home page. This logical relationship allows us to use the `.Parent` and `.Ancestors` methods to traverse the site structure.
+
+For example, use the `.Ancestors` method to render breadcrumb navigation.
+
+```go-html-template {file="layouts/partials/breadcrumb.html"}
+<nav aria-label="Breadcrumb" class="breadcrumb">
+  <ol>
+    {{ range .Ancestors.Reverse }}
+      <li><a href="{{ .RelPermalink }}">{{ .LinkTitle }}</a></li>
+    {{ end }}
+    <li><a aria-current="page" href="{{ .RelPermalink }}">{{ .LinkTitle }}</a></li>
+  </ol>
+</nav>
+```
+
+With this CSS:
+
+```css
+.breadcrumb ol {
+ padding-left: 0;
+}
+
+.breadcrumb li {
+ display: inline;
+}
+
+.breadcrumb li:not(:last-child)::after {
+ content: "»";
+}
+```
+
+Hugo renders this, where each breadcrumb is a link to the corresponding page:
+
+```text
+Home » Products » Product 1 » Benefits » Benefit 1
+```
+
+[lookup order]: /templates/lookup-order/
+[lookup rules]: /templates/lookup-order/#lookup-rules
diff --git a/docs/content/en/content-management/shortcodes.md b/docs/content/en/content-management/shortcodes.md
index 3be1c6f9e..2de387f39 100644
--- a/docs/content/en/content-management/shortcodes.md
+++ b/docs/content/en/content-management/shortcodes.md
@@ -1,422 +1,230 @@
---
title: Shortcodes
-linktitle:
-description: Shortcodes are simple snippets inside your content files calling built-in or custom templates.
-godocref:
-date: 2017-02-01
-publishdate: 2017-02-01
-lastmod: 2017-03-31
-menu:
- docs:
- parent: "content-management"
- weight: 35
-weight: 35 #rem
-categories: [content management]
-keywords: [markdown,content,shortcodes]
-draft: false
+description: Use embedded, custom, or inline shortcodes to insert elements such as videos, images, and social media embeds into your content.
+categories: []
+keywords: []
aliases: [/extras/shortcodes/]
-testparam: "Hugo Rocks!"
-toc: true
---
-## What a Shortcode is
+## Introduction
-Hugo loves Markdown because of its simple content format, but there are times when Markdown falls short. Often, content authors are forced to add raw HTML (e.g., video ``) to Markdown content. We think this contradicts the beautiful simplicity of Markdown's syntax.
+{{% glossary-term shortcode %}}
-Hugo created **shortcodes** to circumvent these limitations.
+There are three types of shortcodes: embedded, custom, and inline.
-A shortcode is a simple snippet inside a content file that Hugo will render using a predefined template. Note that shortcodes will not work in template files. If you need the type of drop-in functionality that shortcodes provide but in a template, you most likely want a [partial template][partials] instead.
+## Embedded
-In addition to cleaner Markdown, shortcodes can be updated any time to reflect new classes, techniques, or standards. At the point of site generation, Hugo shortcodes will easily merge in your changes. You avoid a possibly complicated search and replace operation.
+Hugo's embedded shortcodes are pre-defined templates within the application. Refer to each shortcode's documentation for specific usage instructions and available arguments.
-## Use Shortcodes
+{{% list-pages-in-section path=/shortcodes %}}
-{{< youtube 2xkNJL4gJ9E >}}
+## Custom
-In your content files, a shortcode can be called by calling `{{%/* shortcodename parameters */%}}`. Shortcode parameters are space delimited, and parameters with internal spaces can be quoted.
+Create custom shortcodes to simplify and standardize content creation. For example, the following shortcode template generates an audio player using a [global resource](g):
-The first word in the shortcode declaration is always the name of the shortcode. Parameters follow the name. Depending upon how the shortcode is defined, the parameters may be named, positional, or both, although you can't mix parameter types in a single call. The format for named parameters models that of HTML with the format `name="value"`.
-
-Some shortcodes use or require closing shortcodes. Again like HTML, the opening and closing shortcodes match (name only) with the closing declaration, which is prepended with a slash.
-
-Here are two examples of paired shortcodes:
-
-```
-{{%/* mdshortcode */%}}Stuff to `process` in the *center*.{{%/* /mdshortcode */%}}
+```go-html-template {file="layouts/shortcodes/audio.html"}
+{{ with resources.Get (.Get "src") }}
+  <audio controls preload="auto" src="{{ .RelPermalink }}"></audio>
+{{ end }}
```
-```
-{{* highlight go */>}} A bunch of code here {{* /highlight */>}}
+Then call the shortcode from within markup:
+
+```text {file="content/example.md"}
+{{</* audio src=/audio/test.mp3 */>}}
```
-The examples above use two different delimiters, the difference being the `%` character in the first and the `<>` characters in the second.
+Learn more about creating shortcodes in the [shortcode templates] section.
-### Shortcodes with Markdown
+## Inline
-In Hugo `0.55` we changed how the `%` delimiter works. Shortcodes using the `%` as the outer-most delimiter will now be fully rendered when sent to the content renderer (e.g. Blackfriday for Markdown), meaning they can be part of the generated table of contents, footnotes, etc.
+An inline shortcode is a shortcode template defined within content.
-If you want the old behavior, you can put the following line in the start of your shortcode template:
+Hugo's security model is based on the premise that template and configuration authors are trusted, but content authors are not. This model enables generation of HTML output safe against code injection.
-```
-{{ $_hugo_config := `{ "version": 1 }` }}
+To conform with this security model, creating shortcode templates within content is disabled by default. If you trust your content authors, you can enable this functionality in your site's configuration:
+
+{{< code-toggle file=hugo >}}
+[security]
+enableInlineShortcodes = true
+{{< /code-toggle >}}
+
+For more information, see [configure security](/configuration/security/).
+
+The following example demonstrates an inline shortcode, `date.inline`, that accepts a single positional argument: a date/time [layout string].
+
+```text {file="content/example.md"}
+Today is
+{{</* date.inline ":date_medium" */>}}
+ {{- now | time.Format (.Get 0) -}}
+{{</* /date.inline */>}}.
+
+Today is {{</* date.inline ":date_full" /*/>}}.
```
+In the example above, the inline shortcode is executed twice: once upon definition and again when subsequently called. Hugo renders this to:
-### Shortcodes Without Markdown
-
-The `<` character indicates that the shortcode's inner content does *not* need further rendering. Often shortcodes without markdown include internal HTML:
-
-```
-{{* myshortcode */>}}
Hello World!
{{* /myshortcode */>}}
+```html
+<p>Today is Jan 30, 2025.</p>
+<p>Today is Thursday, January 30, 2025.</p>
```
-### Nested Shortcodes
+Inline shortcodes process their inner content within the same context as regular shortcode templates, allowing you to use any available [shortcode method].
-You can call shortcodes within other shortcodes by creating your own templates that leverage the `.Parent` variable. `.Parent` allows you to check the context in which the shortcode is being called. See [Shortcode templates][sctemps].
+> [!note]
+> You cannot [nest](#nesting) inline shortcodes.
-## Use Hugo's Built-in Shortcodes
+Learn more about creating shortcodes in the [shortcode templates] section.
-Hugo ships with a set of predefined shortcodes that represent very common usage. These shortcodes are provided for author convenience and to keep your markdown content clean.
+## Calling
-### `figure`
+Shortcode calls involve three syntactical elements: tags, arguments, and notation.
-`figure` is an extension of the image syntax in markdown, which does not provide a shorthand for the more semantic [HTML5 `
` element][figureelement].
+### Tags
-The `figure` shortcode can use the following named parameters:
+Some shortcodes expect content between opening and closing tags. For example, the embedded [`details`] shortcode requires an opening and closing tag:
-src
-: URL of the image to be displayed.
-
-link
-: If the image needs to be hyperlinked, URL of the destination.
-
-target
-: Optional `target` attribute for the URL if `link` parameter is set.
-
-rel
-: Optional `rel` attribute for the URL if `link` parameter is set.
-
-alt
-: Alternate text for the image if the image cannot be displayed.
-
-title
-: Image title.
-
-caption
-: Image caption.
-
-class
-: `class` attribute of the HTML `figure` tag.
-
-height
-: `height` attribute of the image.
-
-width
-: `width` attribute of the image.
-
-attr
-: Image attribution text.
-
-attrlink
-: If the attribution text needs to be hyperlinked, URL of the destination.
-
-#### Example `figure` Input
-
-{{< code file="figure-input-example.md" >}}
-{{* figure src="/media/spf13.jpg" title="Steve Francia" */>}}
-{{< /code >}}
-
-#### Example `figure` Output
-
-{{< output file="figure-output-example.html" >}}
-
-
-
-
Steve Francia
-
-
-{{< /output >}}
-
-### `gist`
-
-Bloggers often want to include GitHub gists when writing posts. Let's suppose we want to use the [gist at the following url][examplegist]:
-
-```
-https://gist.github.com/spf13/7896402
+```text
+{{</* details summary="See the details" */>}}
+This is a **bold** word.
+{{</* /details */>}}
```
-We can embed the gist in our content via username and gist ID pulled from the URL:
+Some shortcodes do not accept content. For example, the embedded [`instagram`] shortcode requires a single _positional_ argument:
-```
-{{* gist spf13 7896402 */>}}
+```text
+{{</* instagram CxOWiQNP2MO */>}}
```
-#### Example `gist` Input
+Some shortcodes optionally accept content. For example, you can call the embedded [`qr`] shortcode with content:
-If the gist contains several files and you want to quote just one of them, you can pass the filename (quoted) as an optional third argument:
-
-{{< code file="gist-input.md" >}}
-{{* gist spf13 7896402 "img.html" */>}}
-{{< /code >}}
-
-#### Example `gist` Output
-
-{{< output file="gist-output.html" >}}
-{{< gist spf13 7896402 >}}
-{{< /output >}}
-
-#### Example `gist` Display
-
-To demonstrate the remarkably efficiency of Hugo's shortcode feature, we have embedded the `spf13` `gist` example in this page. The following simulates the experience for visitors to your website. Naturally, the final display will be contingent on your stylesheets and surrounding markup.
-
-{{< gist spf13 7896402 >}}
-
-### `highlight`
-
-This shortcode will convert the source code provided into syntax-highlighted HTML. Read more on [highlighting](/tools/syntax-highlighting/). `highlight` takes exactly one required `language` parameter and requires a closing shortcode.
-
-#### Example `highlight` Input
-
-{{< code file="content/tutorials/learn-html.md" >}}
-{{* highlight html */>}}
-
-
-
{{ .Title }}
- {{ range .Pages }}
- {{ .Render "summary"}}
- {{ end }}
-
-
-{{* /highlight */>}}
-{{< /code >}}
-
-#### Example `highlight` Output
-
-The `highlight` shortcode example above would produce the following HTML when the site is rendered:
-
-{{< output file="tutorials/learn-html/index.html" >}}
-<sectionid="main">
- <div>
- <h1id="title">{{ .Title }}</h1>
- {{ range .Pages }}
- {{ .Render "summary"}}
- {{ end }}
- </div>
-</section>
-{{< /output >}}
-
-{{% note "More on Syntax Highlighting" %}}
-To see even more options for adding syntax-highlighted code blocks to your website, see [Syntax Highlighting in Developer Tools](/tools/syntax-highlighting/).
-{{% /note %}}
-
-### `instagram`
-
-If you'd like to embed a photo from [Instagram][], you only need the photo's ID. You can discern an Instagram photo ID from the URL:
-
-```
-https://www.instagram.com/p/BWNjjyYFxVx/
+```text
+{{</* qr */>}}
+https://gohugo.io
+{{</* /qr */>}}
```
-#### Example `instagram` Input
+Or use the self-closing syntax with a trailing slash to pass the text as an argument:
-{{< code file="instagram-input.md" >}}
-{{* instagram BWNjjyYFxVx */>}}
-{{< /code >}}
-
-You also have the option to hide the caption:
-
-{{< code file="instagram-input-hide-caption.md" >}}
-{{* instagram BWNjjyYFxVx hidecaption */>}}
-{{< /code >}}
-
-#### Example `instagram` Output
-
-By adding the preceding `hidecaption` example, the following HTML will be added to your rendered website's markup:
-
-{{< output file="instagram-hide-caption-output.html" >}}
-{{< instagram BWNjjyYFxVx hidecaption >}}
-{{< /output >}}
-
-#### Example `instagram` Display
-
-Using the preceding `instagram` with `hidecaption` example above, the following simulates the displayed experience for visitors to your website. Naturally, the final display will be contingent on your stylesheets and surrounding markup.
-
-{{< instagram BWNjjyYFxVx hidecaption >}}
-
-
-### `param`
-
-Gets a value from the current `Page's` params set in front matter, with a fall back to the site param value. It will log an `ERROR` if the param with the given key could not be found in either.
-
-```bash
-{{* param testparam */>}}
+```text
+{{</* qr text=https://gohugo.io /*/>}}
```
-Since `testparam` is a param defined in front matter of this page with the value `Hugo Rocks!`, the above will print:
+Refer to each shortcode's documentation for specific usage instructions and available arguments.
-{{< param testparam >}}
+### Arguments
-To access deeply nested params, use "dot syntax", e.g:
+Shortcode arguments can be either _named_ or _positional_.
-```bash
-{{* param "my.nested.param" */>}}
+Named arguments are passed as case-sensitive key-value pairs, as seen in this example with the embedded [`figure`] shortcode. The `src` argument, for instance, is required.
+
+```text
+{{</* figure src=/images/kitten.jpg */>}}
```
-### `ref` and `relref`
+Positional arguments, on the other hand, are determined by their position. The embedded `instagram` shortcode, for example, expects the first argument to be the Instagram post ID.
-These shortcodes will look up the pages by their relative path (e.g., `blog/post.md`) or their logical name (`post.md`) and return the permalink (`ref`) or relative permalink (`relref`) for the found page.
-
-`ref` and `relref` also make it possible to make fragmentary links that work for the header links generated by Hugo.
-
-{{% note "More on Cross References" %}}
-Read a more extensive description of `ref` and `relref` in the [cross references](/content-management/cross-references/) documentation.
-{{% /note %}}
-
-`ref` and `relref` take exactly one required parameter of _reference_, quoted and in position `0`.
-
-#### Example `ref` and `relref` Input
-
-```
-[Neat]({{* ref "blog/neat.md" */>}})
-[Who]({{* relref "about.md#who" */>}})
+```text
+{{</* instagram CxOWiQNP2MO */>}}
```
-#### Example `ref` and `relref` Output
+Shortcode arguments are space-delimited, and arguments with internal spaces must be quoted.
-Assuming that standard Hugo pretty URLs are turned on.
-
-```
-Neat
-Who
+```text
+{{</* figure src=/images/kitten.jpg alt="A white kitten" */>}}
```
-### `tweet`
+Shortcodes accept [scalar](g) arguments, one of [string](g), [integer](g), [floating point](g), or [boolean](g).
-You want to include a single tweet into your blog post? Everything you need is the URL of the tweet:
-
-```
-https://twitter.com/spf13/status/877500564405444608
+```text
+{{</* my-shortcode name="John Smith" age=24 married=false */>}}
```
-#### Example `tweet` Input
+You can optionally use multiple lines when providing several arguments to a shortcode for better readability:
-Pass the tweet's ID from the URL as a parameter to the `tweet` shortcode:
-
-{{< code file="example-tweet-input.md" >}}
-{{* tweet 877500564405444608 */>}}
-{{< /code >}}
-
-#### Example `tweet` Output
-
-Using the preceding `tweet` example, the following HTML will be added to your rendered website's markup:
-
-{{< output file="example-tweet-output.html" >}}
-{{< tweet 877500564405444608 >}}
-{{< /output >}}
-
-#### Example `tweet` Display
-
-Using the preceding `tweet` example, the following simulates the displayed experience for visitors to your website. Naturally, the final display will be contingent on your stylesheets and surrounding markup.
-
-{{< tweet 877500564405444608 >}}
-
-### `vimeo`
-
-Adding a video from [Vimeo][] is equivalent to the YouTube shortcode above.
-
-```
-https://vimeo.com/channels/staffpicks/146022717
+```text
+{{</* figure
+ src=/images/kitten.jpg
+ alt="A white kitten"
+ caption="This is a white kitten"
+ loading=lazy
+*/>}}
```
-#### Example `vimeo` Input
+Use a [raw string literal](g) if you need to pass a multiline string:
-Extract the ID from the video's URL and pass it to the `vimeo` shortcode:
-
-{{< code file="example-vimeo-input.md" >}}
-{{* vimeo 146022717 */>}}
-{{< /code >}}
-
-#### Example `vimeo` Output
-
-Using the preceding `vimeo` example, the following HTML will be added to your rendered website's markup:
-
-{{< output file="example-vimeo-output.html" >}}
-{{< vimeo 146022717 >}}
-{{< /output >}}
-
-{{% tip %}}
-If you want to further customize the visual styling of the YouTube or Vimeo output, add a `class` named parameter when calling the shortcode. The new `class` will be added to the `