gitlab.com/gitlab-org/gitlab-foss.git
author     GitLab Bot <gitlab-bot@gitlab.com>  2022-05-27 03:08:34 +0300
committer  GitLab Bot <gitlab-bot@gitlab.com>  2022-05-27 03:08:34 +0300
commit     17f6b320a11fc5bc1261994a7a93b34096e365e3 (patch)
tree       d81697c28fa4e84e3bcc28c53d22e14c4bd1300f
parent     1c47c8e2d613829993eafb59a903223389389d86 (diff)
Add latest changes from gitlab-org/gitlab@master
-rw-r--r--  doc/raketasks/backup_restore.md                                                              |  38
-rw-r--r--  doc/user/infrastructure/iac/terraform_state.md                                               |  27
-rw-r--r--  lib/backup/manager.rb                                                                        |  19
-rw-r--r--  lib/backup/repositories.rb                                                                   |  20
-rw-r--r--  qa/qa/specs/features/browser_ui/4_verify/pipeline/multi-project_pipelines_spec.rb            | 113
-rw-r--r--  qa/qa/specs/features/browser_ui/4_verify/pipeline/parent_child_pipelines_dependent_relationship_spec.rb | 136
-rw-r--r--  spec/lib/backup/repositories_spec.rb                                                         |  48
-rw-r--r--  spec/tasks/gitlab/backup_rake_spec.rb                                                        |   2
8 files changed, 134 insertions(+), 269 deletions(-)
diff --git a/doc/raketasks/backup_restore.md b/doc/raketasks/backup_restore.md
index c679cc16aca..956e2b4c070 100644
--- a/doc/raketasks/backup_restore.md
+++ b/doc/raketasks/backup_restore.md
@@ -426,6 +426,25 @@ For example, for installations from source:
sudo -u git -H bundle exec rake gitlab:backup:create REPOSITORIES_STORAGES=storage1,storage2
```
+#### Back up specific project repositories
+
+> [Introduced](https://gitlab.com/gitlab-org/gitlab/-/merge_requests/88094) in GitLab 15.1.
+
+You can back up a specific project or list of projects using the `REPOSITORIES_PATHS` option. The option accepts a comma-separated list of
+project paths. For example:
+
+- Omnibus GitLab installations:
+
+ ```shell
+ sudo gitlab-backup create REPOSITORIES_PATHS=gitlab-org/gitlab,gitlab-org/gitaly
+ ```
+
+- Installations from source:
+
+ ```shell
+ sudo -u git -H bundle exec rake gitlab:backup:create REPOSITORIES_PATHS=gitlab-org/gitlab,gitlab-org/gitaly
+ ```
+
#### Uploading backups to a remote (cloud) storage
You can let the backup script upload (using the [Fog library](https://fog.io/))
@@ -1259,6 +1278,25 @@ For example, for installations from source:
sudo -u git -H bundle exec rake gitlab:backup:restore BACKUP=timestamp_of_backup REPOSITORIES_STORAGES=storage1,storage2
```
+#### Restore specific project repositories
+
+> [Introduced](https://gitlab.com/gitlab-org/gitlab/-/merge_requests/88094) in GitLab 15.1.
+
+You can restore a specific project or list of projects using the `REPOSITORIES_PATHS` option. These projects must exist within the
+specified backup. The option accepts a comma-separated list of project paths. For example:
+
+- Omnibus GitLab installations:
+
+ ```shell
+ sudo gitlab-backup restore BACKUP=timestamp_of_backup REPOSITORIES_PATHS=gitlab-org/gitlab,gitlab-org/gitaly
+ ```
+
+- Installations from source:
+
+ ```shell
+ sudo -u git -H bundle exec rake gitlab:backup:restore BACKUP=timestamp_of_backup REPOSITORIES_PATHS=gitlab-org/gitlab,gitlab-org/gitaly
+ ```
+
## Alternative backup strategies
If your GitLab instance contains a lot of Git repository data, you may find the
diff --git a/doc/user/infrastructure/iac/terraform_state.md b/doc/user/infrastructure/iac/terraform_state.md
index 1eb8fdacc2f..0acb1acb66d 100644
--- a/doc/user/infrastructure/iac/terraform_state.md
+++ b/doc/user/infrastructure/iac/terraform_state.md
@@ -104,12 +104,31 @@ The following example demonstrates how to change the state name. The same workfl
You can use a GitLab-managed Terraform state backend as a
[Terraform data source](https://www.terraform.io/language/state/remote-state-data).
-1. Create a file named `example.auto.tfvars` with the following contents:
+1. In your `main.tf` or other relevant file, declare these variables. Leave the values empty.
+
+ ```hcl
+ variable "example_remote_state_address" {
+ type = string
+ description = "Gitlab remote state file address"
+ }
+
+ variable "example_username" {
+ type = string
+ description = "Gitlab username to query remote state"
+ }
+
+ variable "example_access_token" {
+ type = string
+ description = "Gitlab acess token to query remote state"
+ }
+ ```
+
+1. To override the values from the previous step, create a file named `example.auto.tfvars`. This file should **not** be versioned in your project repository.
```plaintext
- example_remote_state_address=https://gitlab.com/api/v4/projects/<TARGET-PROJECT-ID>/terraform/state/<TARGET-STATE-NAME>
- example_username=<GitLab username>
- example_access_token=<GitLab Personal Access Token>
+ example_remote_state_address = "https://gitlab.com/api/v4/projects/<TARGET-PROJECT-ID>/terraform/state/<TARGET-STATE-NAME>"
+ example_username = "<GitLab username>"
+ example_access_token = "<GitLab Personal Access Token>"
```
1. In a `.tf` file, define the data source by using [Terraform input variables](https://www.terraform.io/language/values/variables):
diff --git a/lib/backup/manager.rb b/lib/backup/manager.rb
index f249becb8f9..16b8f21c9e9 100644
--- a/lib/backup/manager.rb
+++ b/lib/backup/manager.rb
@@ -11,7 +11,8 @@ module Backup
LIST_ENVS = {
skipped: 'SKIP',
- repositories_storages: 'REPOSITORIES_STORAGES'
+ repositories_storages: 'REPOSITORIES_STORAGES',
+ repositories_paths: 'REPOSITORIES_PATHS'
}.freeze
TaskDefinition = Struct.new(
@@ -190,7 +191,11 @@ module Backup
max_storage_concurrency = ENV['GITLAB_BACKUP_MAX_STORAGE_CONCURRENCY'].presence
strategy = Backup::GitalyBackup.new(progress, incremental: incremental?, max_parallelism: max_concurrency, storage_parallelism: max_storage_concurrency)
- Repositories.new(progress, strategy: strategy, storages: repositories_storages)
+ Repositories.new(progress,
+ strategy: strategy,
+ storages: list_env(:repositories_storages),
+ paths: list_env(:repositories_paths)
+ )
end
def build_files_task(app_files_dir, excludes: [])
@@ -266,7 +271,8 @@ module Backup
tar_version: tar_version,
installation_type: Gitlab::INSTALLATION_TYPE,
skipped: ENV['SKIP'],
- repositories_storages: ENV['REPOSITORIES_STORAGES']
+ repositories_storages: ENV['REPOSITORIES_STORAGES'],
+ repositories_paths: ENV['REPOSITORIES_PATHS']
}
end
@@ -279,7 +285,8 @@ module Backup
tar_version: tar_version,
installation_type: Gitlab::INSTALLATION_TYPE,
skipped: list_env(:skipped).join(','),
- repositories_storages: list_env(:repositories_storages).join(',')
+ repositories_storages: list_env(:repositories_storages).join(','),
+ repositories_paths: list_env(:repositories_paths).join(',')
)
end
@@ -472,10 +479,6 @@ module Backup
@skipped ||= list_env(:skipped)
end
- def repositories_storages
- @repositories_storages ||= list_env(:repositories_storages)
- end
-
def list_env(name)
list = ENV.fetch(LIST_ENVS[name], '').split(',')
list += backup_information[name].split(',') if backup_information[name]
diff --git a/lib/backup/repositories.rb b/lib/backup/repositories.rb
index 4a31e87b969..29b20bd2e73 100644
--- a/lib/backup/repositories.rb
+++ b/lib/backup/repositories.rb
@@ -3,19 +3,25 @@
require 'yaml'
module Backup
+ # Backs up and restores repositories by querying the database
class Repositories < Task
extend ::Gitlab::Utils::Override
- def initialize(progress, strategy:, storages: [])
+ # @param [IO] progress IO interface to output progress
+ # @param [Object] :strategy Fetches backups from gitaly
+ # @param [Array<String>] :storages Filter by specified storage names. Empty means all storages.
+ # @param [Array<String>] :paths Filter by specified project paths. Empty means all projects, groups and snippets.
+ def initialize(progress, strategy:, storages: [], paths: [])
super(progress)
@strategy = strategy
@storages = storages
+ @paths = paths
end
override :dump
- def dump(path, backup_id)
- strategy.start(:create, path, backup_id: backup_id)
+ def dump(destination_path, backup_id)
+ strategy.start(:create, destination_path, backup_id: backup_id)
enqueue_consecutive
ensure
@@ -23,8 +29,8 @@ module Backup
end
override :restore
- def restore(path)
- strategy.start(:restore, path)
+ def restore(destination_path)
+ strategy.start(:restore, destination_path)
enqueue_consecutive
ensure
@@ -36,7 +42,7 @@ module Backup
private
- attr_reader :strategy, :storages
+ attr_reader :strategy, :storages, :paths
def enqueue_consecutive
enqueue_consecutive_projects
@@ -66,12 +72,14 @@ module Backup
def project_relation
scope = Project.includes(:route, :group, namespace: :owner)
scope = scope.id_in(ProjectRepository.for_repository_storage(storages).select(:project_id)) if storages.any?
+ scope = scope.where_full_path_in(paths) if paths.any?
scope
end
def snippet_relation
scope = Snippet.all
scope = scope.id_in(SnippetRepository.for_repository_storage(storages).select(:snippet_id)) if storages.any?
+ scope = scope.joins(:project).merge(Project.where_full_path_in(paths)) if paths.any?
scope
end
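In `project_relation` and `snippet_relation`, the new `paths` filter is applied only when the caller supplied one, layered on top of any storage filter. Below is a rough array-based analogue of that scope composition in plain Ruby (not the real ActiveRecord relations; `ProjectStub` and `ALL_PROJECTS` are invented stand-ins for illustration):

```ruby
# Array-based analogue of the guarded scope composition in the hunk above.
# ProjectStub and ALL_PROJECTS are illustrative stand-ins, not real models.
ProjectStub = Struct.new(:full_path, :storage)

ALL_PROJECTS = [
  ProjectStub.new('gitlab-org/gitlab', 'default'),
  ProjectStub.new('gitlab-org/gitaly', 'storage1'),
  ProjectStub.new('other-group/other-project', 'default')
].freeze

def project_relation(storages: [], paths: [])
  scope = ALL_PROJECTS
  # Each filter narrows the scope only when it was supplied, mirroring the
  # `if storages.any?` / `if paths.any?` guards above.
  scope = scope.select { |project| storages.include?(project.storage) } if storages.any?
  scope = scope.select { |project| paths.include?(project.full_path) } if paths.any?
  scope
end

p project_relation(paths: ['gitlab-org/gitlab']).map(&:full_path)
# => ["gitlab-org/gitlab"]
p project_relation.map(&:full_path).length # no filters => all 3 projects
```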
diff --git a/qa/qa/specs/features/browser_ui/4_verify/pipeline/multi-project_pipelines_spec.rb b/qa/qa/specs/features/browser_ui/4_verify/pipeline/multi-project_pipelines_spec.rb
deleted file mode 100644
index c5408f30d16..00000000000
--- a/qa/qa/specs/features/browser_ui/4_verify/pipeline/multi-project_pipelines_spec.rb
+++ /dev/null
@@ -1,113 +0,0 @@
-# frozen_string_literal: true
-
-module QA
- RSpec.describe 'Verify' do
- describe 'Multi-project pipelines' do
- let(:downstream_job_name) { 'downstream_job' }
- let(:executor) { "qa-runner-#{SecureRandom.hex(4)}" }
- let!(:group) { Resource::Group.fabricate_via_api! }
-
- let(:upstream_project) do
- Resource::Project.fabricate_via_api! do |project|
- project.group = group
- project.name = 'upstream-project'
- end
- end
-
- let(:downstream_project) do
- Resource::Project.fabricate_via_api! do |project|
- project.group = group
- project.name = 'downstream-project'
- end
- end
-
- let!(:runner) do
- Resource::Runner.fabricate_via_api! do |runner|
- runner.token = group.reload!.runners_token
- runner.name = executor
- runner.tags = [executor]
- end
- end
-
- before do
- add_ci_file(downstream_project, downstream_ci_file)
- add_ci_file(upstream_project, upstream_ci_file)
-
- Flow::Login.sign_in
- upstream_project.visit!
- Flow::Pipeline.visit_latest_pipeline(status: 'passed')
- end
-
- after do
- runner.remove_via_api!
- [upstream_project, downstream_project].each(&:remove_via_api!)
- end
-
- it(
- 'creates a multi-project pipeline with artifact download',
- testcase: 'https://gitlab.com/gitlab-org/gitlab/-/quality/test_cases/358064'
- ) do
- Page::Project::Pipeline::Show.perform do |show|
- expect(show).to have_passed
- expect(show).not_to have_job(downstream_job_name)
-
- show.expand_linked_pipeline
-
- expect(show).to have_job(downstream_job_name)
- end
- end
-
- private
-
- def add_ci_file(project, file)
- Resource::Repository::Commit.fabricate_via_api! do |commit|
- commit.project = project
- commit.commit_message = 'Add CI config file'
- commit.add_files([file])
- end
- end
-
- def upstream_ci_file
- {
- file_path: '.gitlab-ci.yml',
- content: <<~YAML
- stages:
- - test
- - deploy
-
- job1:
- stage: test
- tags: ["#{executor}"]
- script: echo 'done' > output.txt
- artifacts:
- paths:
- - output.txt
-
- staging:
- stage: deploy
- trigger:
- project: #{downstream_project.path_with_namespace}
- strategy: depend
- YAML
- }
- end
-
- def downstream_ci_file
- {
- file_path: '.gitlab-ci.yml',
- content: <<~YAML
- "#{downstream_job_name}":
- stage: test
- tags: ["#{executor}"]
- needs:
- - project: #{upstream_project.path_with_namespace}
- job: job1
- ref: main
- artifacts: true
- script: cat output.txt
- YAML
- }
- end
- end
- end
-end
diff --git a/qa/qa/specs/features/browser_ui/4_verify/pipeline/parent_child_pipelines_dependent_relationship_spec.rb b/qa/qa/specs/features/browser_ui/4_verify/pipeline/parent_child_pipelines_dependent_relationship_spec.rb
deleted file mode 100644
index f2278c7bf6d..00000000000
--- a/qa/qa/specs/features/browser_ui/4_verify/pipeline/parent_child_pipelines_dependent_relationship_spec.rb
+++ /dev/null
@@ -1,136 +0,0 @@
-# frozen_string_literal: true
-
-module QA
- RSpec.describe 'Verify', :runner, :reliable do
- describe 'Parent-child pipelines dependent relationship' do
- let!(:project) do
- Resource::Project.fabricate_via_api! do |project|
- project.name = 'pipelines-dependent-relationship'
- end
- end
-
- let!(:runner) do
- Resource::Runner.fabricate_via_api! do |runner|
- runner.project = project
- runner.name = project.name
- runner.tags = ["#{project.name}"]
- end
- end
-
- before do
- Flow::Login.sign_in
- end
-
- after do
- runner.remove_via_api!
- end
-
- it(
- 'parent pipelines passes if child passes',
- testcase: 'https://gitlab.com/gitlab-org/gitlab/-/quality/test_cases/358062'
- ) do
- add_ci_files(success_child_ci_file)
- Flow::Pipeline.visit_latest_pipeline
-
- Page::Project::Pipeline::Show.perform do |parent_pipeline|
- expect(parent_pipeline).to have_child_pipeline
- expect { parent_pipeline.has_passed? }.to eventually_be_truthy
- end
- end
-
- it(
- 'parent pipeline fails if child fails',
- testcase: 'https://gitlab.com/gitlab-org/gitlab/-/quality/test_cases/358063'
- ) do
- add_ci_files(fail_child_ci_file)
- Flow::Pipeline.visit_latest_pipeline
-
- Page::Project::Pipeline::Show.perform do |parent_pipeline|
- expect(parent_pipeline).to have_child_pipeline
- expect { parent_pipeline.has_failed? }.to eventually_be_truthy
- end
- end
-
- private
-
- def success_child_ci_file
- {
- file_path: '.child-ci.yml',
- content: <<~YAML
- child_job:
- stage: test
- tags: ["#{project.name}"]
- needs:
- - project: #{project.path_with_namespace}
- job: job1
- ref: main
- artifacts: true
- script:
- - cat output.txt
- - echo "Child job done!"
-
- YAML
- }
- end
-
- def fail_child_ci_file
- {
- file_path: '.child-ci.yml',
- content: <<~YAML
- child_job:
- stage: test
- tags: ["#{project.name}"]
- script: exit 1
-
- YAML
- }
- end
-
- def parent_ci_file
- {
- file_path: '.gitlab-ci.yml',
- content: <<~YAML
- stages:
- - build
- - test
- - deploy
-
- default:
- tags: ["#{project.name}"]
-
- job1:
- stage: build
- script: echo "build success" > output.txt
- artifacts:
- paths:
- - output.txt
-
- job2:
- stage: test
- trigger:
- include: ".child-ci.yml"
- strategy: depend
-
- job3:
- stage: deploy
- script: echo "parent deploy done"
-
- YAML
- }
- end
-
- def add_ci_files(child_ci_file)
- Resource::Repository::Commit.fabricate_via_api! do |commit|
- commit.project = project
- commit.commit_message = 'Add parent and child pipelines CI files.'
- commit.add_files(
- [
- child_ci_file,
- parent_ci_file
- ]
- )
- end.project.visit!
- end
- end
- end
-end
diff --git a/spec/lib/backup/repositories_spec.rb b/spec/lib/backup/repositories_spec.rb
index 1581e4793e3..211b0d91f9f 100644
--- a/spec/lib/backup/repositories_spec.rb
+++ b/spec/lib/backup/repositories_spec.rb
@@ -6,6 +6,7 @@ RSpec.describe Backup::Repositories do
let(:progress) { spy(:stdout) }
let(:strategy) { spy(:strategy) }
let(:storages) { [] }
+ let(:paths) { [] }
let(:destination) { 'repositories' }
let(:backup_id) { 'backup_id' }
@@ -13,7 +14,8 @@ RSpec.describe Backup::Repositories do
described_class.new(
progress,
strategy: strategy,
- storages: storages
+ storages: storages,
+ paths: paths
)
end
@@ -107,6 +109,29 @@ RSpec.describe Backup::Repositories do
expect(strategy).to have_received(:finish!)
end
end
+
+ describe 'paths' do
+ let_it_be(:project) { create(:project, :repository) }
+
+ let(:paths) { [project.full_path] }
+
+ it 'calls enqueue for all repositories on the specified project', :aggregate_failures do
+ excluded_project = create(:project, :repository)
+ excluded_project_snippet = create(:project_snippet, :repository, project: excluded_project)
+ excluded_personal_snippet = create(:personal_snippet, :repository, author: excluded_project.first_owner)
+
+ subject.dump(destination, backup_id)
+
+ expect(strategy).to have_received(:start).with(:create, destination, backup_id: backup_id)
+ expect(strategy).not_to have_received(:enqueue).with(excluded_project, Gitlab::GlRepository::PROJECT)
+ expect(strategy).not_to have_received(:enqueue).with(excluded_project_snippet, Gitlab::GlRepository::SNIPPET)
+ expect(strategy).not_to have_received(:enqueue).with(excluded_personal_snippet, Gitlab::GlRepository::SNIPPET)
+ expect(strategy).to have_received(:enqueue).with(project, Gitlab::GlRepository::PROJECT)
+ expect(strategy).to have_received(:enqueue).with(project, Gitlab::GlRepository::WIKI)
+ expect(strategy).to have_received(:enqueue).with(project, Gitlab::GlRepository::DESIGN)
+ expect(strategy).to have_received(:finish!)
+ end
+ end
end
describe '#restore' do
@@ -208,5 +233,26 @@ RSpec.describe Backup::Repositories do
expect(strategy).to have_received(:finish!)
end
end
+
+ context 'paths' do
+ let(:paths) { [project.full_path] }
+
+ it 'calls enqueue for all repositories on the specified project', :aggregate_failures do
+ excluded_project = create(:project, :repository)
+ excluded_project_snippet = create(:project_snippet, :repository, project: excluded_project)
+ excluded_personal_snippet = create(:personal_snippet, :repository, author: excluded_project.first_owner)
+
+ subject.restore(destination)
+
+ expect(strategy).to have_received(:start).with(:restore, destination)
+ expect(strategy).not_to have_received(:enqueue).with(excluded_project, Gitlab::GlRepository::PROJECT)
+ expect(strategy).not_to have_received(:enqueue).with(excluded_project_snippet, Gitlab::GlRepository::SNIPPET)
+ expect(strategy).not_to have_received(:enqueue).with(excluded_personal_snippet, Gitlab::GlRepository::SNIPPET)
+ expect(strategy).to have_received(:enqueue).with(project, Gitlab::GlRepository::PROJECT)
+ expect(strategy).to have_received(:enqueue).with(project, Gitlab::GlRepository::WIKI)
+ expect(strategy).to have_received(:enqueue).with(project, Gitlab::GlRepository::DESIGN)
+ expect(strategy).to have_received(:finish!)
+ end
+ end
end
end
diff --git a/spec/tasks/gitlab/backup_rake_spec.rb b/spec/tasks/gitlab/backup_rake_spec.rb
index 52a0a9a7385..4a3b81a072f 100644
--- a/spec/tasks/gitlab/backup_rake_spec.rb
+++ b/spec/tasks/gitlab/backup_rake_spec.rb
@@ -465,7 +465,7 @@ RSpec.describe 'gitlab:app namespace rake task', :delete do
stub_env('GITLAB_BACKUP_MAX_STORAGE_CONCURRENCY', 2)
expect(::Backup::Repositories).to receive(:new)
- .with(anything, strategy: anything, storages: [])
+ .with(anything, strategy: anything, storages: [], paths: [])
.and_call_original
expect(::Backup::GitalyBackup).to receive(:new).with(anything, max_parallelism: 5, storage_parallelism: 2, incremental: false).and_call_original