Documentation testing

GitLab documentation is stored in projects with code, and treated like code. To maintain standards and quality of documentation, we use processes similar to those used for code.

Merge requests containing changes to Markdown (.md) files run these CI/CD jobs:

  • docs-lint markdown: Tests the documentation content with Vale, the Markdown structure with markdownlint, and other tests in /scripts/.
  • docs-lint links: Checks the validity of internal links in the documentation suite.
  • mermaidlint: Checks for invalid Mermaid charts in the documentation.
  • ui-docs-links lint: Checks the validity of links from UI elements, such as app/views files.

Tests in /scripts/

The tests in /scripts/ look for page content problems that Vale and markdownlint cannot test for. The docs-lint markdown job fails if any of these tests fail:

  • Curl (curl) commands must use long-form options (--header) instead of short options, like -h.
  • Documentation pages must contain front matter indicating ownership of the page.
  • Non-standard Unicode space characters (NBSP, NNBSP, ZWSP) must not be used in documentation, because they can cause irregularities in search indexing and grepping.
  • The changelog must not contain duplicate versions.
  • No files in the doc/ directory may be executable.
  • Use instead of
  • Directories and filenames must use underscores instead of dashes.
  • Directories and filenames must be in lower case.
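
Some of the checks above can be approximated locally with plain grep. The following is a minimal sketch: the file name and patterns are illustrative, not the exact expressions the CI/CD job uses.

```shell
# Create a sample page containing two violations (illustrative fixture only).
printf 'Run curl -H "PRIVATE-TOKEN: x" https://example.com\n' >  sample.md
printf 'A non-breaking\302\240space lives here.\n'            >> sample.md

# Flag short-form curl options; long-form options such as --header pass.
grep -nE 'curl.* -[A-Za-z]( |$)' sample.md

# Flag non-standard Unicode spaces by their UTF-8 byte sequences
# (NBSP, NNBSP, ZWSP). LC_ALL=C forces byte-wise matching.
LC_ALL=C grep -nP '\xc2\xa0|\xe2\x80\xaf|\xe2\x80\x8b' sample.md

# Flag file or directory names containing uppercase letters or dashes.
echo 'Doc-Page.md' | grep -nE '[A-Z]|-'
```

Each grep prints the line (or name) that would fail the corresponding check.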

Merge requests containing changes to Markdown (.md) files run a docs-lint links job, which runs two types of link checks. In both cases, links with destinations that begin with http or https are considered external links, and skipped:

  • bundle exec nanoc check internal_links: Tests links to internal pages.
  • bundle exec nanoc check internal_anchors: Tests links to topic title anchors on internal pages.
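
Conceptually, the internal_links check resolves each relative link on a page and verifies that the target exists. A toy sketch of that idea follows; the directory layout and patterns are invented for illustration, and nanoc's real checks are more thorough.

```shell
# Build a two-page fixture: one valid link, one broken link.
mkdir -p site
printf '[good](other.md) and [bad](missing.md)\n' > site/page.md
touch site/other.md

# Extract relative link destinations and verify each target file exists.
grep -oP '(?<=\]\()[^)#]+' site/page.md | while read -r target; do
  [ -e "site/$target" ] || echo "broken internal link: $target"
done
```

Only the link whose target file is absent is reported.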

Failures from these tests are displayed at the end of the test results in the Issues found! area. For example, failures in the internal_anchors test follow this format:

[ ERROR ] internal_anchors - Broken anchor detected!
  - source file `/tmp/gitlab-docs/public/ee/user/application_security/api_fuzzing/index.html`
  - destination `/tmp/gitlab-docs/public/ee/development/code_review.html`
  - link `../../../development/code_review.html#review-response-slo`
  - anchor `#review-response-slo`
  • Source file: The full path to the file containing the error. To find the file in the gitlab repository, replace /tmp/gitlab-docs/public/ee with doc, and .html with .md.
  • Destination: The full path to the file not found by the test. To find the file in the gitlab repository, replace /tmp/gitlab-docs/public/ee with doc, and .html with .md.
  • Link: The actual link the script attempted to find.
  • Anchor: If present, the topic title anchor the script attempted to find.
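
The path substitution described above can be done with sed. For example, using the source file from the sample failure:

```shell
src='/tmp/gitlab-docs/public/ee/user/application_security/api_fuzzing/index.html'

# Map the built-site path back to the file in the gitlab repository:
# /tmp/gitlab-docs/public/ee -> doc, and .html -> .md
echo "$src" | sed -e 's|^/tmp/gitlab-docs/public/ee|doc|' -e 's|\.html$|.md|'
```

This prints doc/user/application_security/api_fuzzing/index.md, the file to fix in the repository.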

Check for multiple instances of the same broken link on each page reporting an error. Even if a specific broken link appears multiple times on a page, the test reports it only once.

Tests in mermaidlint

Mermaid builds charts and diagrams from code.

The mermaidlint job runs on merge requests that contain changes to Markdown files. The script (scripts/lint/check_mermaid.mjs) returns an error if any Markdown files return a Mermaid syntax error.

To help debug your Mermaid charts, use the Mermaid Live Editor.
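
A rough idea of what such a check has to do: extract each mermaid fence from a Markdown file so its contents can be handed to a Mermaid parser. The extraction half can be sketched in shell; the awk pattern below is illustrative, and check_mermaid.mjs uses the real Mermaid parser rather than this kind of text matching.

````shell
# Build a fixture page containing one Mermaid chart.
cat > chart.md <<'EOF'
Some text.

```mermaid
graph TD;
  A-->B;
```
EOF

# Print only the contents of mermaid fences; a real check would parse each
# extracted chart and report any syntax errors with the offending file name.
awk '/^```mermaid$/{flag=1; next} /^```$/{flag=0} flag' chart.md > extracted.txt
cat extracted.txt
````

The output contains only the chart definition, ready to be validated.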

Tests in ui-docs-links lint

The ui-docs-links lint job uses haml-lint to test that all documentation links from UI elements (app/views files, for example) link to valid pages and anchors.

Install documentation linters

To help adhere to the documentation style guidelines, and improve the content added to documentation, install documentation linters and integrate them with your code editor. At a minimum, install markdownlint and Vale to match the checks run in build pipelines. Both tools can integrate with your code editor.

Run documentation tests locally

Similar to previewing your changes locally, you can also run documentation tests on your local computer. This has the advantages of:

  • Speeding up the feedback loop. You can find out about problems with the changes in your branch without waiting for a CI/CD pipeline to run.
  • Lowering costs. Running tests locally is cheaper than running tests on the cloud infrastructure GitLab uses.

It’s important to:

  • Keep the tools up-to-date, and match the versions used in our CI/CD pipelines.
  • Run linters, documentation link tests, and UI link tests the same way they are run in CI/CD pipelines. It’s important to use the same configuration we use in CI/CD pipelines, which can differ from the tool’s default configuration.

Run Vale or markdownlint locally

Installation and configuration instructions for markdownlint and Vale are available.

Run locally

Use a Rake task to run the tests locally.


Prerequisites:

  • You have either:
    • The required lint tools installed on your computer.
    • A working Docker or containerd installation, to use an image with these tools pre-installed.

  1. Go to your gitlab directory.
  2. Run:

    rake lint:markdown

To specify a single file or directory you would like to run lint checks for, run:

MD_DOC_PATH=path/to/ rake lint:markdown

The output should be similar to:

=> Linting documents at path /path/to/gitlab as <user>...
=> Checking for cURL short options...
=> Checking for duplicate entries...
=> Checking /path/to/gitlab/doc for executable permissions...
=> Checking for new files...
=> Linting markdown style...
=> Linting prose...
✔ 0 errors, 0 warnings and 0 suggestions in 1 file.
✔ Linting passed

To test links in the documentation locally:

  1. Go to the gitlab-docs directory.
  2. Run the following commands:

    # Check for broken internal links
    bundle exec nanoc check internal_links
    # Check for broken topic title anchors on internal pages
    bundle exec nanoc check internal_anchors

To test documentation links in the GitLab UI locally:

  1. Open the gitlab directory in a terminal window.
  2. Run:

    bundle exec haml-lint -i DocumentationLinks

If you receive an error the first time you run this test, run bundle install, which installs the dependencies for GitLab, and try again.

If you don’t want to install all of the dependencies to test the links, you can:

  1. Open the gitlab directory in a terminal window.
  2. Install haml-lint:

    gem install haml_lint
  3. Run:

    haml-lint -i DocumentationLinks

If you manually install haml-lint with this process, it does not update automatically, and you should make sure your version matches the version used by GitLab.
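
One way to check that a manually installed haml-lint matches the version GitLab uses is to compare it against the version pinned in the gitlab project's Gemfile.lock. A sketch follows; the Gemfile.lock line is a fabricated fixture, so read the real file in your checkout instead.

```shell
# Fixture standing in for the gitlab project's Gemfile.lock.
printf '    haml_lint (0.58.0)\n' > Gemfile.lock

# Version pinned by the project.
pinned=$(grep -oP 'haml_lint \(\K[0-9.]+' Gemfile.lock)

# Compare against the locally installed version, if any.
if command -v haml-lint >/dev/null; then
  installed=$(haml-lint --version | grep -oE '[0-9][0-9.]*' | head -n 1)
  [ "$pinned" = "$installed" ] || echo "mismatch: $installed installed, $pinned pinned"
else
  echo "haml-lint not installed; install version $pinned"
fi
```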

Update linter configuration

Vale and markdownlint configurations are under source control in each project, so updates must be committed to each project individually.

The configuration in the gitlab project should be treated as the source of truth, and all updates should first be made there.

On a regular basis, the changes made in gitlab project to the Vale and markdownlint configuration should be synchronized to the other projects. In each of the supported projects:

  1. Create a new branch.
  2. Copy the configuration files from the gitlab project into this branch, overwriting the project’s old configuration. Make sure no project-specific changes from the gitlab project are included. For example, RelativeLinks.yml is hard coded for specific projects.
  3. Create a merge request and submit it to a technical writer for review and merge.
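
Step 2 is typically a straight copy of the configuration directories. The sketch below assumes local checkouts side by side and uses .vale/gitlab as the configuration location and SubstitutionWarning.yml as a sample rule; both are assumptions, so copy whatever configuration files your projects actually carry.

```shell
# Fixture checkouts (in real use, these are your cloned projects).
mkdir -p gitlab/.vale/gitlab other-project/.vale/gitlab
printf 'extends: existence\n' > gitlab/.vale/gitlab/SubstitutionWarning.yml

# Overwrite the other project's Vale rules with the gitlab copies,
# then drop project-specific rules such as RelativeLinks.yml.
rm -rf other-project/.vale/gitlab
cp -R gitlab/.vale/gitlab other-project/.vale/
rm -f other-project/.vale/gitlab/RelativeLinks.yml

ls other-project/.vale/gitlab
```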

Update linting images

Lint tests run in CI/CD pipelines using images from the gitlab-docs container registry.

If a new version of a dependency is released (like a new version of Ruby), we should update the images to use the newer version. Then, we can update the configuration files in each of our documentation projects to point to the new image.

To update the linting images:

  1. In gitlab-docs, open a merge request to update .gitlab-ci.yml to use the new tooling version. (Example MR)
  2. When merged, start a Build every hour scheduled pipeline.
  3. Go to the pipeline you started, and manually run the relevant build-images job, for example, image:docs-lint-markdown.
  4. In the job output, get the name of the new image. (Example job output)
  5. Verify that the new image was added to the container registry.
  6. Open merge requests to update each of these configuration files to point to the new image. In each merge request, include a small doc update to trigger the job that uses the image.
  7. In each merge request, check the relevant job output to confirm the updated image was used for the test. (Example job output)
  8. Assign the merge requests to any technical writer to review and merge.

Configure pre-push hooks

Git pre-push hooks allow Git users to:

  • Run tests or other processes before pushing a branch.
  • Avoid pushing a branch if failures occur with these tests.

lefthook is a Git hooks manager. It makes configuring, installing, and removing Git hooks simpler. Configuration for it is available in the lefthook.yml file for the gitlab project.

To set up lefthook for documentation linting, see Pre-push static analysis.

To show Vale errors on push, see Show Vale warnings on push.

Disable linting on documentation

Some, but not all, linting can be disabled on documentation files.

Tool versions used in CI/CD pipelines

You should use linter versions that are the same as those used in our CI/CD pipelines for maximum compatibility with the linting rules we use.

To match the versions of markdownlint-cli2 and vale used in the GitLab projects, refer to:

Versions set in these two locations should be the same.

| Tool | Version | Command | Additional information |
|------|---------|---------|------------------------|
| markdownlint-cli2 | Latest | yarn global add markdownlint-cli2 | None. |
| markdownlint-cli2 | Specific | yarn global add markdownlint-cli2@0.8.1 | The @ indicates a specific version, and this example updates the tool to version 0.8.1. |
| Vale (using asdf) | Specific | asdf install | Installs the version of Vale set in the .tool-versions file in a project. |
| Vale (other) | Specific | Not applicable. | Binaries can be downloaded directly. |
| Vale (using brew) | Latest | brew update && brew upgrade vale | This command is for macOS only. |

Supported projects

For the specifics of each test run in our CI/CD pipelines, see the configuration for those tests in the relevant projects:

We also run some documentation tests in the following projects: