# Import/Export development documentation

Troubleshooting and general development guidelines and tips for the Import/Export feature.

This document is originally based on the Import/Export 201 presentation available on YouTube.

## Troubleshooting commands

Find the current status of the import, and any logged errors, using the JID:

```ruby
# Rails console
Project.find_by_full_path('group/project').import_state.slice(:jid, :status, :last_error)
> {"jid"=>"414dec93f941a593ea1a6894", "status"=>"finished", "last_error"=>nil}
```

```shell
# Logs
grep JID /var/log/gitlab/sidekiq/current
grep "Import/Export error" /var/log/gitlab/sidekiq/current
grep "Import/Export backtrace" /var/log/gitlab/sidekiq/current
tail /var/log/gitlab/gitlab-rails/importer.log
```

## Troubleshooting performance issues

Read through the known performance problems with Import/Export below.

### OOM errors

Out of memory (OOM) errors are normally caused by the Sidekiq Memory Killer:


An import status stuck in `started`, together with the following Sidekiq log entry, signal a memory issue:

```shell
WARN: Work still in progress <struct with JID>
```


### Timeout errors

Timeout errors occur due to the `StuckImportJobsWorker` marking the process as failed:

```ruby
class StuckImportJobsWorker
  include ApplicationWorker
  include CronjobQueue

  IMPORT_JOBS_EXPIRATION = 15.hours.to_i

  def perform
    import_state_without_jid_count = mark_import_states_without_jid_as_failed!
    import_state_with_jid_count = mark_import_states_with_jid_as_failed!
    # ...
  end
end
```

```shell
Marked stuck import jobs as failed. JIDs: xyz
```
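The expiration check itself can be reduced to a small standalone sketch (simplified stand-ins, not the actual worker code; `alive_jids` represents the set of JIDs Sidekiq still knows about):

```ruby
# Simplified sketch (not the actual worker): an import is considered stuck
# when it started longer ago than the expiration window and its JID is no
# longer known to Sidekiq.
IMPORT_JOBS_EXPIRATION = 15 * 60 * 60 # 15.hours.to_i

def stuck_import?(started_at, now, alive_jids, jid)
  (now - started_at) > IMPORT_JOBS_EXPIRATION && !alive_jids.include?(jid)
end

stuck_import?(0, 16 * 60 * 60, [], 'abc')      # => true: 16 hours in, JID gone
stuck_import?(0, 16 * 60 * 60, ['abc'], 'abc') # => false: still running
```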
Both the export and the import jobs hold all project data in memory at once:

```
  +-----------+    +-----------------------------------+
  |Export Job |--->| Calls ActiveRecord `as_json` and  |
  +-----------+    | `to_json` on all project models   |
                   +-----------------------------------+

  +-----------+    +-----------------------------------+
  |Import Job |--->| Loads all JSON in memory, then    |
  +-----------+    | inserts into the DB in batches    |
                   +-----------------------------------+
```

### Problems and solutions

| Problem | Possible solutions |
| ------- | ------------------ |
| Slow JSON loading/dumping models from the database | Split the worker |
| | Batch export |
| | Optimize SQL |
| | Move away from `ActiveRecord` callbacks (difficult) |
| High memory usage (see also some analysis) | DB commit sweet spot that uses less memory |
| | Netflix Fast JSON API may help |
| | Batch reading/writing to disk and any SQL |
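As an illustration of the batching idea, a hypothetical exporter could stream newline-delimited JSON in batches instead of calling `to_json` on everything at once (`export_in_batches` and the sample data are invented for this sketch):

```ruby
require 'json'
require 'stringio'

# Hypothetical sketch: serialize records to an IO stream in small batches,
# so only one batch of serialized output is built up at a time.
def export_in_batches(records, io, batch_size: 100)
  records.each_slice(batch_size) do |batch|
    batch.each { |record| io.puts(JSON.generate(record)) }
  end
end

issues = [
  { id: 1, title: 'First issue' },
  { id: 2, title: 'Second issue' },
  { id: 3, title: 'Third issue' }
]

buffer = StringIO.new
export_in_batches(issues, buffer, batch_size: 2)
buffer.string.lines.count # => 3
```

With `ActiveRecord`, `each_slice` would become `find_each`/`in_batches`, so rows are also read from the database in batches rather than loaded all at once.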

### Temporary solutions

Until the performance problems are tackled, the workaround for importing big projects is a foreground import:

- Foreground import of big projects for customers, using the import template in the infrastructure tracker.


The Import/Export feature is constantly updated (adding new things to export), however the code hasn't been refactored in a long time. We should perform a code audit (see the confidential issue) to make sure its dynamic nature does not increase the number of security concerns.

## Security in the code

The following classes provide a layer of security to the Import/Export feature.

The `AttributeCleaner` removes any prohibited keys:

```ruby
# AttributeCleaner
# Removes all `_ids` and other prohibited keys
class AttributeCleaner
  ALLOWED_REFERENCES = RelationFactory::PROJECT_REFERENCES + RelationFactory::USER_REFERENCES + ['group_id']

  def clean
    @relation_hash.reject do |key, _value|
      prohibited_key?(key) || !@relation_class.attribute_method?(key) || excluded_key?(key)
    end
  end

  # ...
end
```
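The idea can be reduced to a small standalone sketch (the constant and helper below are simplified stand-ins for the real class):

```ruby
# Simplified sketch of the AttributeCleaner idea: reject foreign-key style
# `_id`/`_ids` attributes unless they are explicitly allowed references.
ALLOWED_REFERENCES = %w[group_id].freeze

def clean(relation_hash)
  relation_hash.reject do |key, _value|
    (key.end_with?('_id') || key.end_with?('_ids')) &&
      !ALLOWED_REFERENCES.include?(key)
  end
end

clean('title' => 'Bug', 'author_id' => 42, 'group_id' => 7)
# => {"title"=>"Bug", "group_id"=>7}
```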


The `AttributeConfigurationSpec` checks and confirms the addition of new columns:

```shell
# AttributeConfigurationSpec
It looks like #{relation_class}, which is exported using the project Import/Export, has new attributes:

Please add the attribute(s) to SAFE_MODEL_ATTRIBUTES if you consider this can be exported.
Otherwise, please blacklist the attribute(s) in IMPORT_EXPORT_CONFIG by adding it to its correspondent
model in the +excluded_attributes+ section.

SAFE_MODEL_ATTRIBUTES: #{File.expand_path(safe_attributes_file)}
IMPORT_EXPORT_CONFIG: #{Gitlab::ImportExport.config_file}
```

The `ModelConfigurationSpec` checks and confirms the addition of new models:

```shell
# ModelConfigurationSpec
New model(s) <#{new_models.join(',')}> have been added, related to #{parent_model_name}, which is exported by
the Import/Export feature.

If you think this model should be included in the export, please add it to `#{Gitlab::ImportExport.config_file}`.

Definitely add it to `#{File.expand_path(ce_models_yml)}`
to signal that you've handled this error and to prevent it from showing up in the future.
```

The `ExportFileSpec` detects encrypted or sensitive columns:

```shell
# ExportFileSpec
Found a new sensitive word <#{key_found}>, which is part of the hash #{parent.inspect}
If you think this information shouldn't get exported, please exclude the model or attribute in
IMPORT_EXPORT_CONFIG.

Otherwise, please add the exception to +safe_list+ in CURRENT_SPEC using #{sensitive_word} as the
key and the correspondent hash or model as the value.

Also, if the attribute is a generated unique token, please add it to RelationFactory::TOKEN_RESET_MODELS
if it needs to be reset (to prevent duplicate column problems while importing to the same instance).

IMPORT_EXPORT_CONFIG: #{Gitlab::ImportExport.config_file}
```


## Versioning

Import/Export does not use strict SemVer, since it has frequent constant changes during a single GitLab release. It does require an update when there is a breaking change.

```ruby
# ImportExport
module Gitlab
  module ImportExport
    extend self

    # For every version update, the version history has to be kept up to date.
    VERSION = '0.2.4'
```
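A minimal sketch of what such a compatibility check could look like (hypothetical, not the actual version checker; stdlib `Gem::Version` handles the numeric comparison):

```ruby
require 'rubygems'

# Hypothetical sketch: an export is importable as long as it was produced
# by an Import/Export version no newer than the current one.
def compatible?(exported_version, current_version)
  Gem::Version.new(exported_version) <= Gem::Version.new(current_version)
end

compatible?('0.2.4', '0.2.4') # => true
compatible?('0.2.5', '0.2.4') # => false
```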

### Version history

The current version history also displays the equivalent GitLab version, and is useful for knowing which versions are incompatible with each other.

| Exporting GitLab version | Importing GitLab version |
| ------------------------ | ------------------------ |
| 11.7 to current          | 11.7 to current          |
| 11.1 to 11.6             | 11.1 to 11.6             |
| 10.8 to 11.0             | 10.8 to 11.0             |
| 10.4 to 10.7             | 10.4 to 10.7             |
| 8.10.3 to 8.11           | 8.10.3 to 8.11           |
| 8.10.0 to 8.10.2         | 8.10.0 to 8.10.2         |
| 8.9.5 to 8.9.11          | 8.9.5 to 8.9.11          |
| 8.9.0 to 8.9.4           | 8.9.0 to 8.9.4           |

### When to bump the version up

We have to bump the version if we rename models or columns, or perform any format modifications in the JSON structure or the file structure of the archive file.

We do not need to bump the version up in any of the following cases:

- Add a new column or a model
- Remove a column or model (unless there is a DB constraint)
- Export new things (such as a new type of upload)

Every time we bump the version, the integration specs will fail and can be fixed with:

```shell
bundle exec rake gitlab:import_export:bump_version
```

## A quick dive into the code

### Import/Export configuration (`import_export.yml`)

The main configuration file, `import_export.yml`, defines which models can be exported and imported.

Model relationships to be included in the project import/export:

```yaml
  - labels:
    - :priorities
  - milestones:
    - events:
      - :push_event_payload
  - issues:
    - events:
    - ...
```

Only include the following attributes for the models specified:

```yaml
    - :id
    - :email
```

Do not include the following attributes for the models specified:

```yaml
    - :name
    - :path
    - ...
```

Extra methods to be called by the export:

```yaml
# Methods
    - :type
    - :type
```
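To illustrate how such a relation tree can drive serialization, here is a hedged sketch that converts a nested tree (like the one above) into the `include` option understood by ActiveRecord's `as_json`/`to_json` (`tree_to_include` is invented for this example):

```ruby
# Hypothetical sketch: turn a nested relation tree into the `include:`
# structure that ActiveRecord's `as_json`/`to_json` accept.
def tree_to_include(tree)
  tree.map do |node|
    if node.is_a?(Hash)
      node.transform_values { |children| { include: tree_to_include(children) } }
    else
      node
    end
  end
end

tree = [{ milestones: [{ events: [:push_event_payload] }] }, :labels]
tree_to_include(tree)
# => [{ milestones: { include: [{ events: { include: [:push_event_payload] } }] } }, :labels]
```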


The import job status moves from `none` to `finished` or `failed` through different states:

```shell
import_status: none -> scheduled -> started -> finished/failed
```
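These transitions can be sketched as a simple lookup table (illustrative only; the real code uses a proper state machine on the import state):

```ruby
# Illustrative sketch of the import status transitions shown above.
TRANSITIONS = {
  'none'      => %w[scheduled],
  'scheduled' => %w[started],
  'started'   => %w[finished failed]
}.freeze

def valid_transition?(from, to)
  TRANSITIONS.fetch(from, []).include?(to)
end

valid_transition?('scheduled', 'started') # => true
valid_transition?('none', 'finished')     # => false
```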

While the status is `started`, the `Importer` code processes each step required for the import.

```ruby
# ImportExport::Importer
module Gitlab
  module ImportExport
    class Importer
      def execute
        if import_file && check_version! && restorers.all?(&:restore) && overwrite_project
          project
        else
          raise Projects::ImportService::Error.new(@shared.errors.join(', '))
        end
      rescue => e
        # ...
      end

      def restorers
        [repo_restorer, wiki_restorer, project_tree, avatar_restorer,
         uploads_restorer, lfs_restorer, statistics_restorer]
      end
```

The export service is similar to the `Importer`, but it saves the data instead of restoring it.


```ruby
# ImportExport::ExportService
module Projects
  module ImportExport
    class ExportService < BaseService
      def save_all!
        if save_services
          Gitlab::ImportExport::Saver.save(project: project, shared: @shared)
        end
      end

      def save_services
        [version_saver, avatar_saver, project_tree_saver, uploads_saver, repo_saver,
         wiki_repo_saver, lfs_saver].all?(&:save)
      end
```
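The `all?(&:save)` pattern above is worth noting: each saver (and each restorer in the `Importer`) exposes the same single-method interface, and the pipeline succeeds only if every step does. A toy illustration (the `Struct`-based stand-ins are invented):

```ruby
# Toy stand-in for the saver classes: each responds to `save` and returns
# a boolean; `all?` short-circuits at the first failing step.
FakeSaver = Struct.new(:name, :result) do
  def save
    result
  end
end

savers = [FakeSaver.new('version', true),
          FakeSaver.new('avatar', true),
          FakeSaver.new('repo', false)]

savers.all?(&:save) # => false
```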