Maintenance Rake tasks
GitLab provides Rake tasks for general maintenance.
Gather GitLab and system information
This command gathers information about your GitLab installation and the system it runs on. This information can be useful when asking for help or reporting issues. In a multi-node environment, run this command on nodes running GitLab Rails to avoid PostgreSQL socket errors.
- Linux package installations:

  sudo gitlab-rake gitlab:env:info

- Self-compiled installations:

  bundle exec rake gitlab:env:info RAILS_ENV=production
Example output:
System information
System: Ubuntu 20.04
Proxy: no
Current User: git
Using RVM: no
Ruby Version: 2.7.6p219
Gem Version: 3.1.6
Bundler Version:2.3.15
Rake Version: 13.0.6
Redis Version: 6.2.7
Sidekiq Version:6.4.2
Go Version: unknown
GitLab information
Version: 15.5.5-ee
Revision: 5f5109f142d
Directory: /opt/gitlab/embedded/service/gitlab-rails
DB Adapter: PostgreSQL
DB Version: 13.8
URL: https://app.gitaly.gcp.gitlabsandbox.net
HTTP Clone URL: https://app.gitaly.gcp.gitlabsandbox.net/some-group/some-project.git
SSH Clone URL: git@app.gitaly.gcp.gitlabsandbox.net:some-group/some-project.git
Elasticsearch: no
Geo: no
Using LDAP: no
Using Omniauth: yes
Omniauth Providers:
GitLab Shell
Version: 14.12.0
Repository storage paths:
- default: /var/opt/gitlab/git-data/repositories
- gitaly: /var/opt/gitlab/git-data/repositories
GitLab Shell path: /opt/gitlab/embedded/service/gitlab-shell
Gitaly
- default Address: unix:/var/opt/gitlab/gitaly/gitaly.socket
- default Version: 15.5.5
- default Git Version: 2.37.1.gl1
- gitaly Address: tcp://10.128.20.6:2305
- gitaly Version: 15.5.5
- gitaly Git Version: 2.37.1.gl1
Show GitLab license information
This command shows information about your GitLab license and how many seats are used. It is only available on GitLab Enterprise Edition installations: a license cannot be installed into GitLab Community Edition.
This information can be useful when raising tickets with Support, or for programmatically checking your license parameters.
- Linux package installations:

  sudo gitlab-rake gitlab:license:info

- Self-compiled installations:

  bundle exec rake gitlab:license:info RAILS_ENV=production
Example output:
Today's Date: 2020-02-29
Current User Count: 30
Max Historical Count: 30
Max Users in License: 40
License valid from: 2019-11-29 to 2020-11-28
Email associated with license: user@example.com
Check GitLab configuration
The gitlab:check Rake task runs the following Rake tasks:

- gitlab:gitlab_shell:check
- gitlab:gitaly:check
- gitlab:sidekiq:check
- gitlab:incoming_email:check
- gitlab:ldap:check
- gitlab:app:check
- gitlab:geo:check (only if you’re running Geo)
It checks that each component was set up according to the installation guide and suggests fixes for issues found. This command must be run from your application server and doesn’t work correctly on component servers like Gitaly.
You may also have a look at our troubleshooting guides.
Additionally you should also verify database values can be decrypted using the current secrets.
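One way to do this is the gitlab:doctor:secrets Rake task. For example, on Linux package installations:

  sudo gitlab-rake gitlab:doctor:secrets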
To run gitlab:check, run:
- Linux package installations:

  sudo gitlab-rake gitlab:check

- Self-compiled installations:

  bundle exec rake gitlab:check RAILS_ENV=production
Use SANITIZE=true for gitlab:check if you want to omit project names from the output.
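For example, on Linux package installations:

  sudo gitlab-rake gitlab:check SANITIZE=true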
Example output:
Checking Environment ...
Git configured for git user? ... yes
Has python2? ... yes
python2 is supported version? ... yes
Checking Environment ... Finished
Checking GitLab Shell ...
GitLab Shell version? ... OK (1.2.0)
Repo base directory exists? ... yes
Repo base directory is a symlink? ... no
Repo base owned by git:git? ... yes
Repo base access is drwxrws---? ... yes
post-receive hook up-to-date? ... yes
post-receive hooks in repos are links: ... yes
Checking GitLab Shell ... Finished
Checking Sidekiq ...
Running? ... yes
Checking Sidekiq ... Finished
Checking GitLab App...
Database config exists? ... yes
Database is SQLite ... no
All migrations up? ... yes
GitLab config exists? ... yes
GitLab config up to date? ... no
Cable config exists? ... yes
Resque config exists? ... yes
Log directory writable? ... yes
Tmp directory writable? ... yes
Init script exists? ... yes
Init script up-to-date? ... yes
Redis version >= 2.0.0? ... yes
Checking GitLab ... Finished
Rebuild authorized_keys file
In some cases it is necessary to rebuild the authorized_keys file, for example, if after an upgrade you receive Permission denied (publickey) when pushing via SSH and find 404 Key Not Found errors in the gitlab-shell.log file.
To rebuild authorized_keys, run:
- Linux package installations:

  sudo gitlab-rake gitlab:shell:setup

- Self-compiled installations:

  cd /home/git/gitlab
  sudo -u git -H bundle exec rake gitlab:shell:setup RAILS_ENV=production
Example output:
This will rebuild an authorized_keys file.
You will lose any data stored in authorized_keys file.
Do you want to continue (yes/no)? yes
Clear Redis cache
If for some reason the dashboard displays the wrong information, you might want to clear Redis’ cache. To do this, run:
- Linux package installations:

  sudo gitlab-rake cache:clear

- Self-compiled installations:

  cd /home/git/gitlab
  sudo -u git -H bundle exec rake cache:clear RAILS_ENV=production
Precompile the assets
Sometimes during version upgrades you might end up with broken CSS or missing icons. In that case, try to precompile the assets again.
This Rake task only applies to self-compiled installations. Read more about troubleshooting this problem when running the Linux package. The guidance for the Linux package might be applicable for Kubernetes and Docker deployments of GitLab, though in general, container-based installations don’t have issues with missing assets.
- Self-compiled installations:

  cd /home/git/gitlab
  sudo -u git -H bundle exec rake gitlab:assets:compile RAILS_ENV=production
For Linux package installations, the unoptimized assets (JavaScript, CSS) are frozen at
the release of upstream GitLab. The Linux package installation includes optimized versions
of those assets. Unless you are modifying the JavaScript / CSS code on your
production machine after installing the package, there should be no reason to redo
rake gitlab:assets:compile
on the production machine. If you suspect that assets
have been corrupted, you should reinstall the Linux package.
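A minimal sketch of reinstalling the package, assuming a Debian or Ubuntu host running the Enterprise Edition package; adjust the package name and package manager for your distribution:

  # Assumption: Debian/Ubuntu host with the gitlab-ee package installed
  sudo apt-get install --reinstall gitlab-ee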
Check TCP connectivity to a remote site
Sometimes you need to know if your GitLab installation can connect to a TCP service on another machine (for example a PostgreSQL or web server) to troubleshoot proxy issues. A Rake task is included to help you with this.
- Linux package installations:

  sudo gitlab-rake gitlab:tcp_check[example.com,80]

- Self-compiled installations:

  cd /home/git/gitlab
  sudo -u git -H bundle exec rake gitlab:tcp_check[example.com,80] RAILS_ENV=production
Clear exclusive lease (DANGER)
GitLab uses a shared lock mechanism, ExclusiveLease, to prevent simultaneous operations on a shared resource. An example is running periodic garbage collection on repositories.
In very specific situations, an operation locked by an Exclusive Lease can fail without releasing the lock. If you can’t wait for it to expire, you can run this task to manually clear it.
To clear all exclusive leases:
sudo gitlab-rake gitlab:exclusive_lease:clear
To specify a lease type or lease type + id, specify a scope:
# to clear all leases for repository garbage collection:
sudo gitlab-rake gitlab:exclusive_lease:clear[project_housekeeping:*]
# to clear a lease for repository garbage collection in a specific project: (id=4)
sudo gitlab-rake gitlab:exclusive_lease:clear[project_housekeeping:4]
Display status of database migrations
See the background migrations documentation for how to check that migrations are complete when upgrading GitLab.
To check the status of specific migrations, you can use the following Rake task:
sudo gitlab-rake db:migrate:status
To check the tracking database on a Geo secondary site, you can use the following Rake task:
sudo gitlab-rake db:migrate:status:geo
This outputs a table with a Status of up or down for each migration. Example:
database: gitlabhq_production
Status Migration ID Type Milestone Name
--------------------------------------------------
up 20240701074848 regular 17.2 AddGroupIdToPackagesDebianGroupComponents
up 20240701153843 regular 17.2 AddWorkItemsDatesSourcesSyncToIssuesTrigger
up 20240702072515 regular 17.2 AddGroupIdToPackagesDebianGroupArchitectures
up 20240702133021 regular 17.2 AddWorkspaceTerminationTimeoutsToRemoteDevelopmentAgentConfigs
up 20240604064938 post 17.2 FinalizeBackfillPartitionIdCiPipelineMessage
up 20240604111157 post 17.2 AddApprovalPolicyRulesFkOnApprovalGroupRules
Starting with GitLab 17.1, migrations are executed in an order that conforms to the GitLab release cadence.
Run incomplete database migrations
Database migrations can be stuck in an incomplete state, with a down status in the output of the sudo gitlab-rake db:migrate:status command.
- To complete these migrations, use the following Rake task:

  sudo gitlab-rake db:migrate

- After the command completes, run sudo gitlab-rake db:migrate:status to check if all migrations are completed (have an up status).

- Hot reload puma and sidekiq services:

  sudo gitlab-ctl hup puma
  sudo gitlab-ctl restart sidekiq
Starting with GitLab 17.1, migrations are executed in an order that conforms to the GitLab release cadence.
Rebuild database indexes
Database indexes can be rebuilt regularly to reclaim space and maintain healthy levels of index bloat over time. Reindexing can also be run as a regular cron job. A “healthy” level of bloat is highly dependent on the specific index, but generally should be below 30%.
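A minimal crontab sketch for scheduled reindexing, assuming a Linux package installation and an off-peak window; the schedule, binary path, and log file are illustrative and should be adapted to your environment:

  # Illustrative root crontab entry: rebuild the most bloated indexes nightly at 02:00
  0 2 * * * /opt/gitlab/bin/gitlab-rake gitlab:db:reindex >> /var/log/gitlab/reindex-cron.log 2>&1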
Prerequisites:
- This feature requires PostgreSQL 12 or later.
- These index types are not supported: expression indexes, partitioned indexes, and indexes used for constraint exclusion.
To manually rebuild a database index:

- Optional. To send annotations to a Grafana (4.6 or later) endpoint, enable annotations with these custom environment variables (see setting custom environment variables; a sketch of supplying them is shown after this list):

  - GRAFANA_API_URL: The base URL for Grafana, such as http://some-host:3000.
  - GRAFANA_API_KEY: A Grafana API key with at least the Editor role.

- Run the Rake task to rebuild the two indexes with the highest estimated bloat:

  sudo gitlab-rake gitlab:db:reindex

- The reindexing task (gitlab:db:reindex) rebuilds only the two indexes in each database with the highest bloat. To rebuild more than two indexes, run the task again until all desired indexes have been rebuilt.
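A minimal sketch of supplying the Grafana variables for a single run, assuming a Linux package installation; the URL and key are placeholders, and it relies on Rake reading VAR=value arguments into its environment. For a persistent setting, use the custom environment variables mechanism referenced above:

  # Placeholder values; replace with your Grafana base URL and API key
  sudo gitlab-rake gitlab:db:reindex GRAFANA_API_URL="http://some-host:3000" GRAFANA_API_KEY="<grafana-api-key>"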
Notes
- Rebuilding database indexes is a disk-intensive task, so you should perform the task during off-peak hours. Running the task during peak hours can lead to increased bloat, and can also cause certain queries to perform slowly.
- The task requires free disk space for the index being restored. The created indexes are appended with _ccnew. If the reindexing task fails, re-running the task cleans up the temporary indexes.
- The time it takes for database index rebuilding to complete depends on the size of the target database. It can take between several hours and several days.
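A minimal sketch for listing any leftover temporary indexes after a failed run, assuming a Linux package installation; the query is illustrative, and re-running the reindex task performs the cleanup itself:

  # List indexes that still carry the temporary _ccnew suffix
  sudo gitlab-psql -c "SELECT indexname FROM pg_indexes WHERE indexname LIKE '%_ccnew%';"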
Dump the database schema
In rare circumstances, the database schema can differ from what the application code expects even if all database migrations are complete. If this does occur, it can lead to odd errors in GitLab.
To dump the database schema:
SCHEMA=/tmp/structure.sql gitlab-rake db:schema:dump
The Rake task creates a /tmp/structure.sql file that contains the database schema dump.
To determine if there are any differences:
- Go to the db/structure.sql file in the gitlab project. Select the branch that matches your GitLab version. For example, the file for GitLab 16.2: https://gitlab.com/gitlab-org/gitlab/-/blob/16-2-stable-ee/db/structure.sql.
- Compare /tmp/structure.sql with the db/structure.sql file for your version.
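A minimal sketch of the comparison, assuming the 16-2-stable-ee branch matches your version; the raw URL and output path are illustrative:

  # Download the reference schema for your version, then compare it with the dump
  curl -o /tmp/reference_structure.sql "https://gitlab.com/gitlab-org/gitlab/-/raw/16-2-stable-ee/db/structure.sql"
  diff /tmp/reference_structure.sql /tmp/structure.sql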
Check the database for schema inconsistencies
- Introduced in GitLab 15.11.
This Rake task checks the database schema for any inconsistencies and prints them in the terminal. This task is a diagnostic tool to be used under the guidance of GitLab Support. You should not use the task for routine checks as database inconsistencies might be expected.
gitlab-rake gitlab:db:schema_checker:run
Troubleshooting
Advisory lock connection information
After running the db:migrate Rake task, you may see output like the following:
main: == [advisory_lock_connection] object_id: 173580, pg_backend_pid: 5532
main: == [advisory_lock_connection] object_id: 173580, pg_backend_pid: 5532
The messages returned are informational and can be ignored.
PostgreSQL socket errors when executing the gitlab:env:info Rake task
After running sudo gitlab-rake gitlab:env:info on Gitaly or other non-Rails nodes, you might see the following error:
PG::ConnectionBad: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/var/opt/gitlab/postgresql/.s.PGSQL.5432"?
This is because, in a multi-node environment, the gitlab:env:info Rake task should only be executed on the nodes running GitLab Rails.