gitlab.com/gitlab-org/gitlab-foss.git
author    GitLab Bot <gitlab-bot@gitlab.com>  2022-04-08 00:08:27 +0300
committer GitLab Bot <gitlab-bot@gitlab.com>  2022-04-08 00:08:27 +0300
commit    4df2fd43039fb3f09086525501db071790e1699f (patch)
tree      9af966dc861351d7a7d97dd9f6a586d5700a63d3
parent    c2bdb9d02768a61bee7560113f4d4c83dc91338e (diff)
Add latest changes from gitlab-org/gitlab@master
-rw-r--r--  app/assets/javascripts/analytics/shared/components/projects_dropdown_filter.vue  8
-rw-r--r--  app/assets/javascripts/groups/components/group_item.vue  4
-rw-r--r--  app/assets/javascripts/header_search/components/header_search_autocomplete_items.vue  4
-rw-r--r--  app/assets/javascripts/issues/show/components/incidents/incident_tabs.vue  7
-rw-r--r--  app/assets/javascripts/jira_connect/branches/components/project_dropdown.vue  4
-rw-r--r--  app/assets/javascripts/jira_connect/subscriptions/components/group_item_name.vue  9
-rw-r--r--  app/assets/javascripts/packages_and_registries/dependency_proxy/app.vue  1
-rw-r--r--  app/assets/javascripts/pipelines/components/pipelines_list/pipelines_ci_templates.vue  4
-rw-r--r--  app/assets/javascripts/pipelines/components/pipelines_list/tokens/pipeline_trigger_author_token.vue  8
-rw-r--r--  app/assets/javascripts/projects/settings/topics/components/topics_token_selector.vue  4
-rw-r--r--  app/assets/javascripts/runner/components/runner_assigned_item.vue  10
-rw-r--r--  app/assets/javascripts/search/topbar/components/searchable_dropdown_item.vue  4
-rw-r--r--  app/assets/javascripts/vue_shared/components/filtered_search_bar/tokens/author_token.vue  1
-rw-r--r--  app/assets/javascripts/vue_shared/components/markdown/field.vue  14
-rw-r--r--  app/assets/javascripts/vue_shared/components/metric_images/metric_images_tab.vue  119
-rw-r--r--  app/assets/javascripts/vue_shared/components/metric_images/metric_images_table.vue  266
-rw-r--r--  app/assets/javascripts/vue_shared/components/metric_images/store/actions.js  85
-rw-r--r--  app/assets/javascripts/vue_shared/components/metric_images/store/index.js  14
-rw-r--r--  app/assets/javascripts/vue_shared/components/metric_images/store/mutation_types.js  13
-rw-r--r--  app/assets/javascripts/vue_shared/components/metric_images/store/mutations.js  39
-rw-r--r--  app/assets/javascripts/vue_shared/components/metric_images/store/state.js  10
-rw-r--r--  app/assets/javascripts/vue_shared/components/project_avatar.vue  4
-rw-r--r--  app/assets/javascripts/vue_shared/components/registry/title_area.vue  4
-rw-r--r--  app/views/projects/usage_quotas/index.html.haml  2
-rw-r--r--  app/views/search/results/_blob_highlight.html.haml  4
-rw-r--r--  data/removals/15_0/15-0-remove_ff_push_rules_supersede_code_owners.yml  16
-rw-r--r--  doc/administration/get_started.md  7
-rw-r--r--  doc/ci/runners/saas/macos/environment.md  1
-rw-r--r--  doc/development/backend/create_source_code_be/index.md  4
-rw-r--r--  doc/development/workhorse/channel.md  199
-rw-r--r--  doc/development/workhorse/configuration.md  219
-rw-r--r--  doc/development/workhorse/gitlab_features.md  73
-rw-r--r--  doc/development/workhorse/index.md  84
-rw-r--r--  doc/development/workhorse/new_features.md  46
-rw-r--r--  doc/update/removals.md  10
-rw-r--r--  doc/user/project/import/bitbucket_server.md  28
-rw-r--r--  locale/gitlab.pot  51
-rw-r--r--  qa/qa/resource/bulk_import_group.rb  16
-rw-r--r--  qa/qa/resource/project.rb  15
-rw-r--r--  qa/qa/specs/features/api/1_manage/import_large_github_repo_spec.rb  40
-rw-r--r--  qa/qa/specs/features/api/1_manage/migration/gitlab_migration_large_project_spec.rb  400
-rw-r--r--  qa/qa/specs/features/api/1_manage/migration/gitlab_project_migration_common.rb  8
-rw-r--r--  spec/db/schema_spec.rb  18
-rw-r--r--  spec/frontend/ide/components/new_dropdown/upload_spec.js  22
-rw-r--r--  spec/frontend/ide/stores/actions/merge_request_spec.js  387
-rw-r--r--  spec/frontend/ide/stores/actions/project_spec.js  172
-rw-r--r--  spec/frontend/ide/stores/actions/tree_spec.js  92
-rw-r--r--  spec/frontend/ide/stores/actions_spec.js  537
-rw-r--r--  spec/frontend/ide/stores/modules/branches/actions_spec.js  30
-rw-r--r--  spec/frontend/ide/stores/modules/clientside/actions_spec.js  8
-rw-r--r--  spec/frontend/ide/stores/modules/commit/actions_spec.js  384
-rw-r--r--  spec/frontend/ide/stores/modules/file_templates/actions_spec.js  62
-rw-r--r--  spec/frontend/ide/stores/modules/merge_requests/actions_spec.js  35
-rw-r--r--  spec/frontend/ide/stores/modules/pane/actions_spec.js  27
-rw-r--r--  spec/frontend/ide/stores/modules/pipelines/actions_spec.js  152
-rw-r--r--  spec/frontend/ide/stores/modules/terminal_sync/actions_spec.js  31
-rw-r--r--  spec/frontend/issues/show/components/incidents/incident_tabs_spec.js  1
-rw-r--r--  spec/frontend/packages_and_registries/dependency_proxy/app_spec.js  6
-rw-r--r--  spec/frontend/runner/components/runner_assigned_item_spec.js  3
-rw-r--r--  spec/frontend/vue_shared/components/markdown/field_spec.js  2
-rw-r--r--  spec/frontend/vue_shared/components/metric_images/__snapshots__/metric_images_table_spec.js.snap  73
-rw-r--r--  spec/frontend/vue_shared/components/metric_images/metric_images_tab_spec.js  174
-rw-r--r--  spec/frontend/vue_shared/components/metric_images/metric_images_table_spec.js  230
-rw-r--r--  spec/frontend/vue_shared/components/metric_images/mock_data.js  5
-rw-r--r--  spec/frontend/vue_shared/components/metric_images/store/actions_spec.js  158
-rw-r--r--  spec/frontend/vue_shared/components/metric_images/store/mutations_spec.js  147
-rw-r--r--  workhorse/README.md  38
-rw-r--r--  workhorse/doc/architecture/channel.md  197
-rw-r--r--  workhorse/doc/architecture/gitlab_features.md  69
-rw-r--r--  workhorse/doc/channel.md  15
-rw-r--r--  workhorse/doc/development/new_features.md  42
-rw-r--r--  workhorse/doc/development/tests.md  19
-rw-r--r--  workhorse/doc/operations/configuration.md  215
-rw-r--r--  workhorse/doc/operations/install.md  52
74 files changed, 3351 insertions(+), 1911 deletions(-)
diff --git a/app/assets/javascripts/analytics/shared/components/projects_dropdown_filter.vue b/app/assets/javascripts/analytics/shared/components/projects_dropdown_filter.vue
index b3ae671d611..b2b033de75d 100644
--- a/app/assets/javascripts/analytics/shared/components/projects_dropdown_filter.vue
+++ b/app/assets/javascripts/analytics/shared/components/projects_dropdown_filter.vue
@@ -11,6 +11,7 @@ import {
import { debounce } from 'lodash';
import { filterBySearchTerm } from '~/analytics/shared/utils';
import { getIdFromGraphQLId } from '~/graphql_shared/utils';
+import { AVATAR_SHAPE_OPTION_RECT } from '~/vue_shared/constants';
import { DEFAULT_DEBOUNCE_AND_THROTTLE_MS } from '~/lib/utils/constants';
import { n__, s__, __ } from '~/locale';
import getProjects from '../graphql/projects.query.graphql';
@@ -204,6 +205,7 @@ export default {
return getIdFromGraphQLId(project.id);
},
},
+ AVATAR_SHAPE_OPTION_RECT,
};
</script>
<template>
@@ -227,7 +229,7 @@ export default {
:entity-id="getEntityId(selectedProjects[0])"
:entity-name="selectedProjects[0].name"
:size="16"
- shape="rect"
+ :shape="$options.AVATAR_SHAPE_OPTION_RECT"
:alt="selectedProjects[0].name"
class="gl-display-inline-flex gl-vertical-align-middle gl-mr-2"
/>
@@ -255,7 +257,7 @@ export default {
:entity-id="getEntityId(project)"
:entity-name="project.name"
:src="project.avatarUrl"
- shape="rect"
+ :shape="$options.AVATAR_SHAPE_OPTION_RECT"
/>
<div>
<div data-testid="project-name">{{ project.name }}</div>
@@ -279,7 +281,7 @@ export default {
:entity-id="getEntityId(project)"
:entity-name="project.name"
:src="project.avatarUrl"
- shape="rect"
+ :shape="$options.AVATAR_SHAPE_OPTION_RECT"
/>
<div>
<div data-testid="project-name">{{ project.name }}</div>
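The hunks above show the pattern this commit applies across many components: the hard-coded `shape="rect"` string literal is replaced by the shared `AVATAR_SHAPE_OPTION_RECT` constant, exposed to the template through `$options`. A minimal sketch of the pattern (the component itself is hypothetical):

```vue
<script>
import { GlAvatar } from '@gitlab/ui';
import { AVATAR_SHAPE_OPTION_RECT } from '~/vue_shared/constants';

export default {
  components: { GlAvatar },
  // Non-reactive values placed on the component definition are
  // reachable in the template via `$options`.
  AVATAR_SHAPE_OPTION_RECT,
};
</script>

<template>
  <!-- One shared constant instead of a repeated string literal -->
  <gl-avatar :shape="$options.AVATAR_SHAPE_OPTION_RECT" :size="32" entity-name="Example" />
</template>
```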
diff --git a/app/assets/javascripts/groups/components/group_item.vue b/app/assets/javascripts/groups/components/group_item.vue
index 924c8c9db3e..4f21f68fa65 100644
--- a/app/assets/javascripts/groups/components/group_item.vue
+++ b/app/assets/javascripts/groups/components/group_item.vue
@@ -9,6 +9,7 @@ import {
} from '@gitlab/ui';
import { visitUrl } from '~/lib/utils/url_utility';
import UserAccessRoleBadge from '~/vue_shared/components/user_access_role_badge.vue';
+import { AVATAR_SHAPE_OPTION_RECT } from '~/vue_shared/constants';
import { VISIBILITY_TYPE_ICON, GROUP_VISIBILITY_TYPE } from '../constants';
import eventHub from '../event_hub';
@@ -112,6 +113,7 @@ export default {
},
},
safeHtmlConfig: { ADD_TAGS: ['gl-emoji'] },
+ AVATAR_SHAPE_OPTION_RECT,
};
</script>
@@ -145,7 +147,7 @@ export default {
:aria-label="group.name"
>
<gl-avatar
- shape="rect"
+ :shape="$options.AVATAR_SHAPE_OPTION_RECT"
:entity-name="group.name"
:src="group.avatarUrl"
:alt="group.name"
diff --git a/app/assets/javascripts/header_search/components/header_search_autocomplete_items.vue b/app/assets/javascripts/header_search/components/header_search_autocomplete_items.vue
index 11b4f681b63..4e65d2d5055 100644
--- a/app/assets/javascripts/header_search/components/header_search_autocomplete_items.vue
+++ b/app/assets/javascripts/header_search/components/header_search_autocomplete_items.vue
@@ -11,6 +11,7 @@ import {
import { mapState, mapGetters } from 'vuex';
import { s__ } from '~/locale';
import highlight from '~/lib/utils/highlight';
+import { AVATAR_SHAPE_OPTION_RECT } from '~/vue_shared/constants';
import { GROUPS_CATEGORY, PROJECTS_CATEGORY, LARGE_AVATAR_PX, SMALL_AVATAR_PX } from '../constants';
export default {
@@ -66,6 +67,7 @@ export default {
return this.currentFocusedOption?.html_id === data.html_id;
},
},
+ AVATAR_SHAPE_OPTION_RECT,
};
</script>
@@ -93,7 +95,7 @@ export default {
:entity-id="data.id"
:entity-name="data.label"
:size="avatarSize(data)"
- shape="square"
+ :shape="$options.AVATAR_SHAPE_OPTION_RECT"
/>
<span v-safe-html="highlightedName(data.label)"></span>
</div>
diff --git a/app/assets/javascripts/issues/show/components/incidents/incident_tabs.vue b/app/assets/javascripts/issues/show/components/incidents/incident_tabs.vue
index 52bce7484cd..ea0e15adfed 100644
--- a/app/assets/javascripts/issues/show/components/incidents/incident_tabs.vue
+++ b/app/assets/javascripts/issues/show/components/incidents/incident_tabs.vue
@@ -17,12 +17,13 @@ export default {
GlTab,
GlTabs,
HighlightBar,
- MetricsTab: () => import('ee_component/issues/show/components/incidents/metrics_tab.vue'),
TimelineTab: () =>
import('ee_component/issues/show/components/incidents/timeline_events_tab.vue'),
+ IncidentMetricTab: () =>
+ import('ee_component/issues/show/components/incidents/incident_metric_tab.vue'),
},
mixins: [glFeatureFlagsMixin()],
- inject: ['fullPath', 'iid', 'uploadMetricsFeatureAvailable'],
+ inject: ['fullPath', 'iid'],
apollo: {
alert: {
query: getAlert,
@@ -93,7 +94,7 @@ export default {
<highlight-bar :alert="alert" />
<description-component v-bind="$attrs" />
</gl-tab>
- <metrics-tab v-if="uploadMetricsFeatureAvailable" data-testid="metrics-tab" />
+ <incident-metric-tab />
<gl-tab
v-if="alert"
class="alert-management-details"
diff --git a/app/assets/javascripts/jira_connect/branches/components/project_dropdown.vue b/app/assets/javascripts/jira_connect/branches/components/project_dropdown.vue
index 88005cccd89..9b36642feb7 100644
--- a/app/assets/javascripts/jira_connect/branches/components/project_dropdown.vue
+++ b/app/assets/javascripts/jira_connect/branches/components/project_dropdown.vue
@@ -7,6 +7,7 @@ import {
GlAvatarLabeled,
} from '@gitlab/ui';
import { __ } from '~/locale';
+import { AVATAR_SHAPE_OPTION_RECT } from '~/vue_shared/constants';
import { PROJECTS_PER_PAGE } from '../constants';
import getProjectsQuery from '../graphql/queries/get_projects.query.graphql';
@@ -80,6 +81,7 @@ export default {
i18n: {
selectProjectText: __('Select a project'),
},
+ AVATAR_SHAPE_OPTION_RECT,
};
</script>
@@ -107,7 +109,7 @@ export default {
>
<gl-avatar-labeled
class="gl-text-truncate"
- shape="rect"
+ :shape="$options.AVATAR_SHAPE_OPTION_RECT"
:size="32"
:src="project.avatarUrl"
:label="project.name"
diff --git a/app/assets/javascripts/jira_connect/subscriptions/components/group_item_name.vue b/app/assets/javascripts/jira_connect/subscriptions/components/group_item_name.vue
index e6c172dae9e..509a32460bb 100644
--- a/app/assets/javascripts/jira_connect/subscriptions/components/group_item_name.vue
+++ b/app/assets/javascripts/jira_connect/subscriptions/components/group_item_name.vue
@@ -1,5 +1,6 @@
<script>
import { GlAvatar, GlIcon } from '@gitlab/ui';
+import { AVATAR_SHAPE_OPTION_RECT } from '~/vue_shared/constants';
export default {
components: {
@@ -12,6 +13,7 @@ export default {
required: true,
},
},
+ AVATAR_SHAPE_OPTION_RECT,
};
</script>
@@ -19,7 +21,12 @@ export default {
<div class="gl-display-flex gl-align-items-center">
<gl-icon name="folder-o" class="gl-mr-3" />
<div class="gl-display-none gl-flex-shrink-0 gl-sm-display-flex gl-mr-3">
- <gl-avatar :size="32" shape="rect" :entity-name="group.name" :src="group.avatar_url" />
+ <gl-avatar
+ :size="32"
+ :shape="$options.AVATAR_SHAPE_OPTION_RECT"
+ :entity-name="group.name"
+ :src="group.avatar_url"
+ />
</div>
<div>
diff --git a/app/assets/javascripts/packages_and_registries/dependency_proxy/app.vue b/app/assets/javascripts/packages_and_registries/dependency_proxy/app.vue
index 4ddefcbdf6a..67c2ca02d20 100644
--- a/app/assets/javascripts/packages_and_registries/dependency_proxy/app.vue
+++ b/app/assets/javascripts/packages_and_registries/dependency_proxy/app.vue
@@ -55,6 +55,7 @@ export default {
deleteCacheAlertMessageSuccess: s__(
'DependencyProxy|All items in the cache are scheduled for removal.',
),
+ clearCache: s__('DependencyProxy|Clear cache'),
},
confirmClearCacheModal: 'confirm-clear-cache-modal',
modalButtons: {
diff --git a/app/assets/javascripts/pipelines/components/pipelines_list/pipelines_ci_templates.vue b/app/assets/javascripts/pipelines/components/pipelines_list/pipelines_ci_templates.vue
index d50229e47c4..9dc6ae317ea 100644
--- a/app/assets/javascripts/pipelines/components/pipelines_list/pipelines_ci_templates.vue
+++ b/app/assets/javascripts/pipelines/components/pipelines_list/pipelines_ci_templates.vue
@@ -2,6 +2,7 @@
import { GlAvatar, GlButton, GlCard, GlSprintf, GlIcon, GlLink } from '@gitlab/ui';
import { mergeUrlParams } from '~/lib/utils/url_utility';
import { sprintf } from '~/locale';
+import { AVATAR_SHAPE_OPTION_RECT } from '~/vue_shared/constants';
import {
STARTER_TEMPLATE_NAME,
RUNNERS_AVAILABILITY_SECTION_EXPERIMENT_NAME,
@@ -33,6 +34,7 @@ export default {
RUNNERS_DOCUMENTATION_LINK_CLICKED_EVENT,
RUNNERS_SETTINGS_BUTTON_CLICKED_EVENT,
I18N,
+ AVATAR_SHAPE_OPTION_RECT,
inject: ['pipelineEditorPath', 'suggestedCiTemplates'],
props: {
ciRunnerSettingsPath: {
@@ -187,7 +189,7 @@ export default {
:src="template.logo"
:size="48"
class="gl-mr-5 gl-bg-white dark-mode-override"
- shape="rect"
+ :shape="$options.AVATAR_SHAPE_OPTION_RECT"
:alt="template.name"
data-testid="template-logo"
/>
diff --git a/app/assets/javascripts/pipelines/components/pipelines_list/tokens/pipeline_trigger_author_token.vue b/app/assets/javascripts/pipelines/components/pipelines_list/tokens/pipeline_trigger_author_token.vue
index 33115d72b9c..746cf238646 100644
--- a/app/assets/javascripts/pipelines/components/pipelines_list/tokens/pipeline_trigger_author_token.vue
+++ b/app/assets/javascripts/pipelines/components/pipelines_list/tokens/pipeline_trigger_author_token.vue
@@ -83,13 +83,7 @@ export default {
@input="searchAuthors"
>
<template #view="{ inputValue }">
- <gl-avatar
- v-if="activeUser"
- :size="16"
- :src="activeUser.avatar_url"
- shape="circle"
- class="gl-mr-2"
- />
+ <gl-avatar v-if="activeUser" :size="16" :src="activeUser.avatar_url" class="gl-mr-2" />
<span>{{ activeUser ? activeUser.name : inputValue }}</span>
</template>
<template #suggestions>
diff --git a/app/assets/javascripts/projects/settings/topics/components/topics_token_selector.vue b/app/assets/javascripts/projects/settings/topics/components/topics_token_selector.vue
index e8b0e95b142..d4c97cbf038 100644
--- a/app/assets/javascripts/projects/settings/topics/components/topics_token_selector.vue
+++ b/app/assets/javascripts/projects/settings/topics/components/topics_token_selector.vue
@@ -1,6 +1,7 @@
<script>
import { GlTokenSelector, GlAvatarLabeled } from '@gitlab/ui';
import { s__ } from '~/locale';
+import { AVATAR_SHAPE_OPTION_RECT } from '~/vue_shared/constants';
import searchProjectTopics from '../queries/project_topics_search.query.graphql';
export default {
@@ -65,6 +66,7 @@ export default {
this.$emit('update', tokens);
},
},
+ AVATAR_SHAPE_OPTION_RECT,
};
</script>
<template>
@@ -85,7 +87,7 @@ export default {
:entity-name="dropdownItem.name"
:label="dropdownItem.name"
:size="32"
- shape="rect"
+ :shape="$options.AVATAR_SHAPE_OPTION_RECT"
/>
</template>
</gl-token-selector>
diff --git a/app/assets/javascripts/runner/components/runner_assigned_item.vue b/app/assets/javascripts/runner/components/runner_assigned_item.vue
index ea8074199a6..38bdfecb7df 100644
--- a/app/assets/javascripts/runner/components/runner_assigned_item.vue
+++ b/app/assets/javascripts/runner/components/runner_assigned_item.vue
@@ -1,5 +1,6 @@
<script>
import { GlAvatar, GlLink } from '@gitlab/ui';
+import { AVATAR_SHAPE_OPTION_RECT } from '~/vue_shared/constants';
export default {
components: {
@@ -25,13 +26,20 @@ export default {
default: null,
},
},
+ AVATAR_SHAPE_OPTION_RECT,
};
</script>
<template>
<div class="gl-display-flex gl-align-items-center gl-py-5">
<gl-link :href="href" data-testid="item-avatar" class="gl-text-decoration-none! gl-mr-3">
- <gl-avatar shape="rect" :entity-name="name" :alt="name" :src="avatarUrl" :size="48" />
+ <gl-avatar
+ :shape="$options.AVATAR_SHAPE_OPTION_RECT"
+ :entity-name="name"
+ :alt="name"
+ :src="avatarUrl"
+ :size="48"
+ />
</gl-link>
<gl-link :href="href" class="gl-font-weight-bold gl-text-gray-900!">{{ fullName }}</gl-link>
diff --git a/app/assets/javascripts/search/topbar/components/searchable_dropdown_item.vue b/app/assets/javascripts/search/topbar/components/searchable_dropdown_item.vue
index 42d6444e690..a4254a355a2 100644
--- a/app/assets/javascripts/search/topbar/components/searchable_dropdown_item.vue
+++ b/app/assets/javascripts/search/topbar/components/searchable_dropdown_item.vue
@@ -2,6 +2,7 @@
import { GlDropdownItem, GlAvatar, GlSafeHtmlDirective as SafeHtml } from '@gitlab/ui';
import highlight from '~/lib/utils/highlight';
import { truncateNamespace } from '~/lib/utils/text_utility';
+import { AVATAR_SHAPE_OPTION_RECT } from '~/vue_shared/constants';
export default {
name: 'SearchableDropdownItem',
@@ -46,6 +47,7 @@ export default {
return highlight(this.item[this.name], this.searchText);
},
},
+ AVATAR_SHAPE_OPTION_RECT,
};
</script>
@@ -61,7 +63,7 @@ export default {
:src="item.avatar_url"
:entity-id="item.id"
:entity-name="item[name]"
- shape="rect"
+ :shape="$options.AVATAR_SHAPE_OPTION_RECT"
:size="32"
/>
<div class="gl-display-flex gl-flex-direction-column">
diff --git a/app/assets/javascripts/vue_shared/components/filtered_search_bar/tokens/author_token.vue b/app/assets/javascripts/vue_shared/components/filtered_search_bar/tokens/author_token.vue
index b70317b2ec4..696456be990 100644
--- a/app/assets/javascripts/vue_shared/components/filtered_search_bar/tokens/author_token.vue
+++ b/app/assets/javascripts/vue_shared/components/filtered_search_bar/tokens/author_token.vue
@@ -95,7 +95,6 @@ export default {
v-if="activeTokenValue"
:size="16"
:src="getAvatarUrl(activeTokenValue)"
- shape="circle"
class="gl-mr-2"
/>
{{ activeTokenValue ? activeTokenValue.name : inputValue }}
diff --git a/app/assets/javascripts/vue_shared/components/markdown/field.vue b/app/assets/javascripts/vue_shared/components/markdown/field.vue
index 179ac2fc57f..722df3cc58b 100644
--- a/app/assets/javascripts/vue_shared/components/markdown/field.vue
+++ b/app/assets/javascripts/vue_shared/components/markdown/field.vue
@@ -1,5 +1,5 @@
<script>
-import { GlIcon } from '@gitlab/ui';
+import { GlIcon, GlSafeHtmlDirective } from '@gitlab/ui';
import $ from 'jquery';
import '~/behaviors/markdown/render_gfm';
import { debounce, unescape } from 'lodash';
@@ -24,6 +24,9 @@ export default {
GlIcon,
Suggestions,
},
+ directives: {
+ SafeHtml: GlSafeHtmlDirective,
+ },
mixins: [glFeatureFlagsMixin()],
props: {
/**
@@ -308,6 +311,9 @@ export default {
);
},
},
+ safeHtmlConfig: {
+ ADD_TAGS: ['gl-emoji'],
+ },
};
</script>
@@ -369,19 +375,19 @@ export default {
<div
v-show="previewMarkdown"
ref="markdown-preview"
+ v-safe-html:[$options.safeHtmlConfig]="markdownPreview"
class="js-vue-md-preview md md-preview-holder"
- v-html="markdownPreview /* eslint-disable-line vue/no-v-html */"
></div>
</template>
<div
v-if="referencedCommands && previewMarkdown && !markdownPreviewLoading"
+ v-safe-html:[$options.safeHtmlConfig]="referencedCommands"
class="referenced-commands"
data-testid="referenced-commands"
- v-html="referencedCommands /* eslint-disable-line vue/no-v-html */"
></div>
<div v-if="shouldShowReferencedUsers" class="referenced-users">
<gl-icon name="warning-solid" />
- <span v-html="addMultipleToDiscussionWarning /* eslint-disable-line vue/no-v-html */"></span>
+ <span v-safe-html:[$options.safeHtmlConfig]="addMultipleToDiscussionWarning"></span>
</div>
</div>
</template>
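The change above swaps raw `v-html` bindings for the `v-safe-html` directive from `@gitlab/ui`, which sanitizes the markup before rendering (here extended to allow `<gl-emoji>` tags). A minimal sketch of the same wiring in a hypothetical component:

```vue
<script>
import { GlSafeHtmlDirective } from '@gitlab/ui';

export default {
  directives: {
    // Registered locally, used in the template as `v-safe-html`.
    SafeHtml: GlSafeHtmlDirective,
  },
  props: {
    html: { type: String, required: true },
  },
  // Extra tags allowed through sanitization, as in the diff above.
  safeHtmlConfig: { ADD_TAGS: ['gl-emoji'] },
};
</script>

<template>
  <!-- Sanitized equivalent of `v-html="html"` -->
  <div v-safe-html:[$options.safeHtmlConfig]="html"></div>
</template>
```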
diff --git a/app/assets/javascripts/vue_shared/components/metric_images/metric_images_tab.vue b/app/assets/javascripts/vue_shared/components/metric_images/metric_images_tab.vue
new file mode 100644
index 00000000000..3e796a73f72
--- /dev/null
+++ b/app/assets/javascripts/vue_shared/components/metric_images/metric_images_tab.vue
@@ -0,0 +1,119 @@
+<script>
+import { GlFormGroup, GlFormInput, GlLoadingIcon, GlModal, GlTab } from '@gitlab/ui';
+import { mapState, mapActions } from 'vuex';
+import { __, s__ } from '~/locale';
+import UploadDropzone from '~/vue_shared/components/upload_dropzone/upload_dropzone.vue';
+import MetricImagesTable from '~/vue_shared/components/metric_images/metric_images_table.vue';
+
+export default {
+ components: {
+ GlFormGroup,
+ GlFormInput,
+ GlLoadingIcon,
+ GlModal,
+ GlTab,
+ MetricImagesTable,
+ UploadDropzone,
+ },
+ inject: ['canUpdate', 'projectId', 'iid'],
+ data() {
+ return {
+ currentFiles: [],
+ modalVisible: false,
+ modalUrl: '',
+ modalUrlText: '',
+ };
+ },
+ computed: {
+ ...mapState(['metricImages', 'isLoadingMetricImages', 'isUploadingImage']),
+ actionPrimaryProps() {
+ return {
+ text: this.$options.i18n.modalUpload,
+ attributes: {
+ loading: this.isUploadingImage,
+ disabled: this.isUploadingImage,
+ category: 'primary',
+ variant: 'confirm',
+ },
+ };
+ },
+ },
+ mounted() {
+ this.setInitialData({ modelIid: this.iid, projectId: this.projectId });
+ this.fetchImages();
+ },
+ methods: {
+ ...mapActions(['fetchImages', 'uploadImage', 'setInitialData']),
+ clearInputs() {
+ this.modalVisible = false;
+ this.modalUrl = '';
+ this.modalUrlText = '';
+      this.currentFiles = [];
+ },
+ openMetricDialog(files) {
+ this.modalVisible = true;
+ this.currentFiles = files;
+ },
+ async onUpload() {
+ try {
+ await this.uploadImage({
+ files: this.currentFiles,
+ url: this.modalUrl,
+ urlText: this.modalUrlText,
+ });
+ // Error case handled within action
+ } finally {
+ this.clearInputs();
+ }
+ },
+ },
+ i18n: {
+ modalUpload: __('Upload'),
+ modalCancel: __('Cancel'),
+ modalTitle: s__('Incidents|Add image details'),
+ modalDescription: s__(
+ "Incidents|Add text or a link to display with your image. If you don't add either, the file name displays instead.",
+ ),
+ dropDescription: s__(
+ 'Incidents|Drop or %{linkStart}upload%{linkEnd} a metric screenshot to attach it to the incident',
+ ),
+ },
+};
+</script>
+
+<template>
+ <gl-tab :title="s__('Incident|Metrics')" data-testid="metrics-tab">
+ <div v-if="isLoadingMetricImages">
+ <gl-loading-icon class="gl-p-5" size="sm" />
+ </div>
+ <gl-modal
+ modal-id="upload-metric-modal"
+ size="sm"
+ :action-primary="actionPrimaryProps"
+ :action-cancel="{ text: $options.i18n.modalCancel }"
+ :title="$options.i18n.modalTitle"
+ :visible="modalVisible"
+ @hidden="clearInputs"
+ @primary.prevent="onUpload"
+ >
+ <p>{{ $options.i18n.modalDescription }}</p>
+ <gl-form-group :label="__('Text (optional)')" label-for="upload-text-input">
+ <gl-form-input id="upload-text-input" v-model="modalUrlText" />
+ </gl-form-group>
+
+ <gl-form-group
+ :label="__('Link (optional)')"
+ label-for="upload-url-input"
+ :description="s__('Incidents|Must start with http or https')"
+ >
+ <gl-form-input id="upload-url-input" v-model="modalUrl" />
+ </gl-form-group>
+ </gl-modal>
+ <metric-images-table v-for="metric in metricImages" :key="metric.id" v-bind="metric" />
+ <upload-dropzone
+ v-if="canUpdate"
+ :drop-description-message="$options.i18n.dropDescription"
+ @change="openMetricDialog"
+ />
+ </gl-tab>
+</template>
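The tab injects `canUpdate`, `projectId`, and `iid` and dispatches `setInitialData` plus `fetchImages` on mount, so a host app must supply those injections and a store built from the factory below. A hypothetical mounting sketch:

```javascript
// Hypothetical mounting helper; the element and dataset names are
// illustrative, not part of the commit.
import Vue from 'vue';
import createStore from '~/vue_shared/components/metric_images/store';
import MetricImagesTab from '~/vue_shared/components/metric_images/metric_images_tab.vue';

export function mountMetricImagesTab(el, service) {
  return new Vue({
    el,
    store: createStore({}, service), // `service` supplies the API calls
    provide: {
      canUpdate: true,
      projectId: el.dataset.projectId,
      iid: el.dataset.iid,
    },
    render(h) {
      return h(MetricImagesTab);
    },
  });
}
```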
diff --git a/app/assets/javascripts/vue_shared/components/metric_images/metric_images_table.vue b/app/assets/javascripts/vue_shared/components/metric_images/metric_images_table.vue
new file mode 100644
index 00000000000..8eb8e52728d
--- /dev/null
+++ b/app/assets/javascripts/vue_shared/components/metric_images/metric_images_table.vue
@@ -0,0 +1,266 @@
+<script>
+import {
+ GlButton,
+ GlFormGroup,
+ GlFormInput,
+ GlCard,
+ GlIcon,
+ GlLink,
+ GlModal,
+ GlSprintf,
+ GlTooltipDirective,
+} from '@gitlab/ui';
+import { mapActions } from 'vuex';
+import { __, s__ } from '~/locale';
+
+export default {
+ i18n: {
+ modalDelete: __('Delete'),
+ modalDescription: s__('Incident|Are you sure you wish to delete this image?'),
+ modalCancel: __('Cancel'),
+ modalTitle: s__('Incident|Deleting %{filename}'),
+ editModalUpdate: __('Update'),
+ editModalTitle: s__('Incident|Editing %{filename}'),
+ editIconTitle: s__('Incident|Edit image text or link'),
+ deleteIconTitle: s__('Incident|Delete image'),
+ },
+ components: {
+ GlButton,
+ GlFormGroup,
+ GlFormInput,
+ GlCard,
+ GlIcon,
+ GlLink,
+ GlModal,
+ GlSprintf,
+ },
+ directives: {
+ GlTooltip: GlTooltipDirective,
+ },
+ inject: ['canUpdate'],
+ props: {
+ id: {
+ type: Number,
+ required: true,
+ },
+ filePath: {
+ type: String,
+ required: true,
+ },
+ filename: {
+ type: String,
+ required: true,
+ },
+ url: {
+ type: String,
+ required: false,
+ default: null,
+ },
+ urlText: {
+ type: String,
+ required: false,
+ default: null,
+ },
+ },
+ data() {
+ return {
+ isCollapsed: false,
+ isDeleting: false,
+ isUpdating: false,
+ modalVisible: false,
+ editModalVisible: false,
+ modalUrl: this.url,
+ modalUrlText: this.urlText,
+ };
+ },
+ computed: {
+ deleteActionPrimaryProps() {
+ return {
+ text: this.$options.i18n.modalDelete,
+ attributes: {
+ loading: this.isDeleting,
+ disabled: this.isDeleting,
+ category: 'primary',
+ variant: 'danger',
+ },
+ };
+ },
+ updateActionPrimaryProps() {
+ return {
+ text: this.$options.i18n.editModalUpdate,
+ attributes: {
+ loading: this.isUpdating,
+ disabled: this.isUpdating,
+ category: 'primary',
+ variant: 'confirm',
+ },
+ };
+ },
+ arrowIconName() {
+ return this.isCollapsed ? 'chevron-right' : 'chevron-down';
+ },
+ bodyClass() {
+ return [
+ 'gl-border-1',
+ 'gl-border-t-solid',
+ 'gl-border-gray-100',
+ { 'gl-display-none': this.isCollapsed },
+ ];
+ },
+ },
+ methods: {
+ ...mapActions(['deleteImage', 'updateImage']),
+ toggleCollapsed() {
+ this.isCollapsed = !this.isCollapsed;
+ },
+ resetEditFields() {
+ this.modalUrl = this.url;
+ this.modalUrlText = this.urlText;
+ this.editModalVisible = false;
+ this.modalVisible = false;
+ },
+ async onDelete() {
+ try {
+ this.isDeleting = true;
+ await this.deleteImage(this.id);
+ } finally {
+ this.isDeleting = false;
+ this.modalVisible = false;
+ }
+ },
+ async onUpdate() {
+ try {
+ this.isUpdating = true;
+ await this.updateImage({
+ imageId: this.id,
+ url: this.modalUrl,
+ urlText: this.modalUrlText,
+ });
+ } finally {
+ this.isUpdating = false;
+ this.modalUrl = '';
+ this.modalUrlText = '';
+ this.editModalVisible = false;
+ }
+ },
+ },
+};
+</script>
+
+<template>
+ <gl-card
+ class="collapsible-card border gl-p-0 gl-mb-5"
+ header-class="gl-display-flex gl-align-items-center gl-border-b-0 gl-py-3"
+ :body-class="bodyClass"
+ >
+ <gl-modal
+ body-class="gl-pb-0! gl-min-h-6!"
+ modal-id="delete-metric-modal"
+ size="sm"
+ :visible="modalVisible"
+ :action-primary="deleteActionPrimaryProps"
+ :action-cancel="{ text: $options.i18n.modalCancel }"
+ @primary.prevent="onDelete"
+ @hidden="resetEditFields"
+ >
+ <template #modal-title>
+ <gl-sprintf :message="$options.i18n.modalTitle">
+ <template #filename>
+ {{ filename }}
+ </template>
+ </gl-sprintf>
+ </template>
+ <p>{{ $options.i18n.modalDescription }}</p>
+ </gl-modal>
+
+ <gl-modal
+ modal-id="edit-metric-modal"
+ size="sm"
+ :action-primary="updateActionPrimaryProps"
+ :action-cancel="{ text: $options.i18n.modalCancel }"
+ :visible="editModalVisible"
+ data-testid="metric-image-edit-modal"
+ @hidden="resetEditFields"
+ @primary.prevent="onUpdate"
+ >
+ <template #modal-title>
+ <gl-sprintf :message="$options.i18n.editModalTitle">
+ <template #filename>
+ {{ filename }}
+ </template>
+ </gl-sprintf>
+ </template>
+
+ <gl-form-group :label="__('Text (optional)')" label-for="upload-text-input">
+ <gl-form-input
+ id="upload-text-input"
+ v-model="modalUrlText"
+ data-testid="metric-image-text-field"
+ />
+ </gl-form-group>
+
+ <gl-form-group
+ :label="__('Link (optional)')"
+ label-for="upload-url-input"
+ :description="s__('Incidents|Must start with http or https')"
+ >
+ <gl-form-input
+ id="upload-url-input"
+ v-model="modalUrl"
+ data-testid="metric-image-url-field"
+ />
+ </gl-form-group>
+ </gl-modal>
+
+ <template #header>
+ <div class="gl-w-full gl-display-flex gl-flex-direction-row gl-justify-content-space-between">
+ <div class="gl-display-flex gl-flex-direction-row gl-align-items-center gl-w-full">
+ <gl-button
+ class="collapsible-card-btn gl-display-flex gl-text-decoration-none gl-reset-color! gl-hover-text-blue-800! gl-shadow-none!"
+ :aria-label="filename"
+ variant="link"
+ category="tertiary"
+ data-testid="collapse-button"
+ @click="toggleCollapsed"
+ >
+ <gl-icon class="gl-mr-2" :name="arrowIconName" />
+ </gl-button>
+ <gl-link v-if="url" :href="url" target="_blank" data-testid="metric-image-label-span">
+ {{ urlText == null || urlText == '' ? filename : urlText }}
+ <gl-icon name="external-link" class="gl-vertical-align-middle" />
+ </gl-link>
+ <span v-else data-testid="metric-image-label-span">{{
+ urlText == null || urlText == '' ? filename : urlText
+ }}</span>
+ <div class="gl-ml-auto btn-group">
+ <gl-button
+ v-if="canUpdate"
+ v-gl-tooltip.bottom
+ icon="pencil"
+ :aria-label="__('Edit')"
+ :title="$options.i18n.editIconTitle"
+ data-testid="edit-button"
+ @click="editModalVisible = true"
+ />
+ <gl-button
+ v-if="canUpdate"
+ v-gl-tooltip.bottom
+ icon="remove"
+ :aria-label="__('Delete')"
+ :title="$options.i18n.deleteIconTitle"
+ data-testid="delete-button"
+ @click="modalVisible = true"
+ />
+ </div>
+ </div>
+ </div>
+ </template>
+ <div
+ v-show="!isCollapsed"
+ class="gl-display-flex gl-flex-direction-column"
+ data-testid="metric-image-body"
+ >
+ <img class="gl-max-w-full gl-align-self-center" :src="filePath" />
+ </div>
+ </gl-card>
+</template>
diff --git a/app/assets/javascripts/vue_shared/components/metric_images/store/actions.js b/app/assets/javascripts/vue_shared/components/metric_images/store/actions.js
new file mode 100644
index 00000000000..832fb891838
--- /dev/null
+++ b/app/assets/javascripts/vue_shared/components/metric_images/store/actions.js
@@ -0,0 +1,85 @@
+import createFlash from '~/flash';
+import { s__ } from '~/locale';
+import * as types from './mutation_types';
+
+export const fetchImagesFactory = (service) => async ({ state, commit }) => {
+ commit(types.REQUEST_METRIC_IMAGES);
+ const { modelIid, projectId } = state;
+
+ try {
+ const response = await service.getMetricImages({ id: projectId, modelIid });
+ commit(types.RECEIVE_METRIC_IMAGES_SUCCESS, response);
+ } catch (error) {
+ commit(types.RECEIVE_METRIC_IMAGES_ERROR);
+ createFlash({ message: s__('MetricImages|There was an issue loading metric images.') });
+ }
+};
+
+export const uploadImageFactory = (service) => async (
+ { state, commit },
+ { files, url, urlText },
+) => {
+ commit(types.REQUEST_METRIC_UPLOAD);
+
+ const { modelIid, projectId } = state;
+
+ try {
+ const response = await service.uploadMetricImage({
+ file: files.item(0),
+ id: projectId,
+ modelIid,
+ url,
+ urlText,
+ });
+ commit(types.RECEIVE_METRIC_UPLOAD_SUCCESS, response);
+ } catch (error) {
+ commit(types.RECEIVE_METRIC_UPLOAD_ERROR);
+ createFlash({ message: s__('MetricImages|There was an issue uploading your image.') });
+ }
+};
+
+export const updateImageFactory = (service) => async (
+ { state, commit },
+ { imageId, url, urlText },
+) => {
+ commit(types.REQUEST_METRIC_UPLOAD);
+
+ const { modelIid, projectId } = state;
+
+ try {
+ const response = await service.updateMetricImage({
+ modelIid,
+ id: projectId,
+ imageId,
+ url,
+ urlText,
+ });
+ commit(types.RECEIVE_METRIC_UPDATE_SUCCESS, response);
+ } catch (error) {
+ commit(types.RECEIVE_METRIC_UPLOAD_ERROR);
+ createFlash({ message: s__('MetricImages|There was an issue updating your image.') });
+ }
+};
+
+export const deleteImageFactory = (service) => async ({ state, commit }, imageId) => {
+ const { modelIid, projectId } = state;
+
+ try {
+ await service.deleteMetricImage({ imageId, id: projectId, modelIid });
+ commit(types.RECEIVE_METRIC_DELETE_SUCCESS, imageId);
+ } catch (error) {
+ createFlash({ message: s__('MetricImages|There was an issue deleting the image.') });
+ }
+};
+
+export const setInitialData = ({ commit }, data) => {
+ commit(types.SET_INITIAL_DATA, data);
+};
+
+export default (service) => ({
+ fetchImages: fetchImagesFactory(service),
+ uploadImage: uploadImageFactory(service),
+ updateImage: updateImageFactory(service),
+ deleteImage: deleteImageFactory(service),
+ setInitialData,
+});
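Each action is built by a factory that closes over an injected `service`, so the same store logic can back different REST endpoints. A sketch of a service object satisfying the interface these factories call; the method names match the calls above, but the endpoints shown are illustrative, not the real API paths:

```javascript
// Hypothetical service implementation for the action factories above.
const exampleService = {
  getMetricImages: ({ id, modelIid }) =>
    fetch(`/api/projects/${id}/incidents/${modelIid}/metric_images`).then((r) => r.json()),
  uploadMetricImage: ({ file, id, modelIid, url, urlText }) => {
    const data = new FormData();
    data.append('file', file);
    if (url) data.append('url', url);
    if (urlText) data.append('url_text', urlText);
    return fetch(`/api/projects/${id}/incidents/${modelIid}/metric_images`, {
      method: 'POST',
      body: data,
    }).then((r) => r.json());
  },
  updateMetricImage: ({ imageId, id, modelIid, url, urlText }) =>
    fetch(`/api/projects/${id}/incidents/${modelIid}/metric_images/${imageId}`, {
      method: 'PUT',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ url, url_text: urlText }),
    }).then((r) => r.json()),
  deleteMetricImage: ({ imageId, id, modelIid }) =>
    fetch(`/api/projects/${id}/incidents/${modelIid}/metric_images/${imageId}`, {
      method: 'DELETE',
    }),
};
```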
diff --git a/app/assets/javascripts/vue_shared/components/metric_images/store/index.js b/app/assets/javascripts/vue_shared/components/metric_images/store/index.js
new file mode 100644
index 00000000000..f13dde9a2bc
--- /dev/null
+++ b/app/assets/javascripts/vue_shared/components/metric_images/store/index.js
@@ -0,0 +1,14 @@
+import Vue from 'vue';
+import Vuex from 'vuex';
+import actionsFactory from './actions';
+import mutations from './mutations';
+import createState from './state';
+
+Vue.use(Vuex);
+
+export default (initialState, service) =>
+ new Vuex.Store({
+ actions: actionsFactory(service),
+ mutations,
+ state: createState(initialState),
+ });
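Wiring it together: the store factory takes initial state and the injected service, so an instance might be created like this (using the hypothetical `exampleService` sketched above):

```javascript
import createStore from '~/vue_shared/components/metric_images/store';

// Initial state values here are illustrative.
const store = createStore({ modelIid: '7', projectId: 15 }, exampleService);

// Components then dispatch the factory-built actions as usual:
store.dispatch('fetchImages');
```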
diff --git a/app/assets/javascripts/vue_shared/components/metric_images/store/mutation_types.js b/app/assets/javascripts/vue_shared/components/metric_images/store/mutation_types.js
new file mode 100644
index 00000000000..8f1b31217a2
--- /dev/null
+++ b/app/assets/javascripts/vue_shared/components/metric_images/store/mutation_types.js
@@ -0,0 +1,13 @@
+export const REQUEST_METRIC_IMAGES = 'REQUEST_METRIC_IMAGES';
+export const RECEIVE_METRIC_IMAGES_SUCCESS = 'RECEIVE_METRIC_IMAGES_SUCCESS';
+export const RECEIVE_METRIC_IMAGES_ERROR = 'RECEIVE_METRIC_IMAGES_ERROR';
+
+export const REQUEST_METRIC_UPLOAD = 'REQUEST_METRIC_UPLOAD';
+export const RECEIVE_METRIC_UPLOAD_SUCCESS = 'RECEIVE_METRIC_UPLOAD_SUCCESS';
+export const RECEIVE_METRIC_UPLOAD_ERROR = 'RECEIVE_METRIC_UPLOAD_ERROR';
+
+export const RECEIVE_METRIC_UPDATE_SUCCESS = 'RECEIVE_METRIC_UPDATE_SUCCESS';
+
+export const RECEIVE_METRIC_DELETE_SUCCESS = 'RECEIVE_METRIC_DELETE_SUCCESS';
+
+export const SET_INITIAL_DATA = 'SET_INITIAL_DATA';
diff --git a/app/assets/javascripts/vue_shared/components/metric_images/store/mutations.js b/app/assets/javascripts/vue_shared/components/metric_images/store/mutations.js
new file mode 100644
index 00000000000..b42234b2829
--- /dev/null
+++ b/app/assets/javascripts/vue_shared/components/metric_images/store/mutations.js
@@ -0,0 +1,39 @@
+import * as types from './mutation_types';
+
+export default {
+ [types.REQUEST_METRIC_IMAGES](state) {
+ state.isLoadingMetricImages = true;
+ },
+ [types.RECEIVE_METRIC_IMAGES_SUCCESS](state, images) {
+ state.metricImages = images || [];
+ state.isLoadingMetricImages = false;
+ },
+ [types.RECEIVE_METRIC_IMAGES_ERROR](state) {
+ state.isLoadingMetricImages = false;
+ },
+ [types.REQUEST_METRIC_UPLOAD](state) {
+ state.isUploadingImage = true;
+ },
+ [types.RECEIVE_METRIC_UPLOAD_SUCCESS](state, image) {
+ state.metricImages.push(image);
+ state.isUploadingImage = false;
+ },
+ [types.RECEIVE_METRIC_UPLOAD_ERROR](state) {
+ state.isUploadingImage = false;
+ },
+ [types.RECEIVE_METRIC_UPDATE_SUCCESS](state, image) {
+ state.isUploadingImage = false;
+ const metricIndex = state.metricImages.findIndex((img) => img.id === image.id);
+ if (metricIndex >= 0) {
+ state.metricImages.splice(metricIndex, 1, image);
+ }
+ },
+ [types.RECEIVE_METRIC_DELETE_SUCCESS](state, imageId) {
+ const metricIndex = state.metricImages.findIndex((image) => image.id === imageId);
+ state.metricImages.splice(metricIndex, 1);
+ },
+ [types.SET_INITIAL_DATA](state, { modelIid, projectId }) {
+ state.modelIid = modelIid;
+ state.projectId = projectId;
+ },
+};
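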
diff --git a/app/assets/javascripts/vue_shared/components/metric_images/store/state.js b/app/assets/javascripts/vue_shared/components/metric_images/store/state.js
new file mode 100644
index 00000000000..b734e5c87a6
--- /dev/null
+++ b/app/assets/javascripts/vue_shared/components/metric_images/store/state.js
@@ -0,0 +1,10 @@
+export default ({ modelIid, projectId } = {}) => ({
+ // Initial state
+ modelIid,
+ projectId,
+
+ // View state
+ metricImages: [],
+ isLoadingMetricImages: false,
+ isUploadingImage: false,
+});
diff --git a/app/assets/javascripts/vue_shared/components/project_avatar.vue b/app/assets/javascripts/vue_shared/components/project_avatar.vue
index f16187022a5..402e75962d2 100644
--- a/app/assets/javascripts/vue_shared/components/project_avatar.vue
+++ b/app/assets/javascripts/vue_shared/components/project_avatar.vue
@@ -1,5 +1,6 @@
<script>
import { GlAvatar } from '@gitlab/ui';
+import { AVATAR_SHAPE_OPTION_RECT } from '~/vue_shared/constants';
export default {
components: {
@@ -31,12 +32,13 @@ export default {
return this.alt ?? this.projectName;
},
},
+ AVATAR_SHAPE_OPTION_RECT,
};
</script>
<template>
<gl-avatar
- shape="rect"
+ :shape="$options.AVATAR_SHAPE_OPTION_RECT"
:entity-name="projectName"
:src="projectAvatarUrl"
:alt="avatarAlt"
diff --git a/app/assets/javascripts/vue_shared/components/registry/title_area.vue b/app/assets/javascripts/vue_shared/components/registry/title_area.vue
index d108d8d689d..fc0976b0792 100644
--- a/app/assets/javascripts/vue_shared/components/registry/title_area.vue
+++ b/app/assets/javascripts/vue_shared/components/registry/title_area.vue
@@ -1,6 +1,7 @@
<script>
import { GlAvatar, GlSprintf, GlLink, GlSkeletonLoader } from '@gitlab/ui';
import { isEqual } from 'lodash';
+import { AVATAR_SHAPE_OPTION_RECT } from '~/vue_shared/constants';
export default {
name: 'TitleArea',
@@ -53,6 +54,7 @@ export default {
}
},
},
+ AVATAR_SHAPE_OPTION_RECT,
};
</script>
@@ -64,7 +66,7 @@ export default {
<gl-avatar
v-if="avatar"
:src="avatar"
- shape="rect"
+ :shape="$options.AVATAR_SHAPE_OPTION_RECT"
class="gl-align-self-center gl-mr-4"
/>
diff --git a/app/views/projects/usage_quotas/index.html.haml b/app/views/projects/usage_quotas/index.html.haml
index fc889242a83..5b4edc92d1d 100644
--- a/app/views/projects/usage_quotas/index.html.haml
+++ b/app/views/projects/usage_quotas/index.html.haml
@@ -7,7 +7,7 @@
.col-sm-12
= s_('UsageQuota|Usage of project resources across the %{strong_start}%{project_name}%{strong_end} project').html_safe % { strong_start: '<strong>'.html_safe, strong_end: '</strong>'.html_safe, project_name: @project.name } + '.'
%a{ href: help_page_path('user/usage_quotas.md'), target: '_blank', rel: 'noopener noreferrer' }
- = s_('UsageQuota|Learn more about usage quotas.')
+ = s_('UsageQuota|Learn more about usage quotas') + '.'
= gl_tabs_nav do
= gl_tab_link_to '#storage-quota-tab', item_active: true do
diff --git a/app/views/search/results/_blob_highlight.html.haml b/app/views/search/results/_blob_highlight.html.haml
index 729eda331b5..7ba114496af 100644
--- a/app/views/search/results/_blob_highlight.html.haml
+++ b/app/views/search/results/_blob_highlight.html.haml
@@ -6,7 +6,7 @@
.blob-content{ data: { blob_id: blob.id, path: blob.path, highlight_line: highlight, qa_selector: 'file_content' } }
- blob.present.highlight.lines.each_with_index do |line, index|
- i = index + offset
- .line_holder.code-search-line
+ .line_holder.code-search-line.gl-display-flex
.line-numbers
.gl-display-flex
%span.diff-line-num.gl-pl-3
@@ -22,7 +22,7 @@
%a{ href: "#{blob_link}#L#{i}", id: "blob-L#{i}", 'data-line-number' => i, class: 'gl-display-flex! gl-align-items-center gl-justify-content-end' }
= sprite_icon('link', css_class: 'gl-ml-3! gl-mr-1!')
= i
- %pre.code.highlight
+ %pre.code.highlight.flex-grow-1
%code
= line.html_safe
diff --git a/data/removals/15_0/15-0-remove_ff_push_rules_supersede_code_owners.yml b/data/removals/15_0/15-0-remove_ff_push_rules_supersede_code_owners.yml
deleted file mode 100644
index b88b6a3f38f..00000000000
--- a/data/removals/15_0/15-0-remove_ff_push_rules_supersede_code_owners.yml
+++ /dev/null
@@ -1,16 +0,0 @@
-- name: "Removed feature flag PUSH_RULES_SUPERSEDE_CODE_OWNERS" # The name of the feature to be deprecated
- announcement_milestone: "14.8" # The milestone when this feature was first announced as deprecated.
- announcement_date: "2022-02-22" # The date of the milestone release when this feature was first announced as deprecated. This should almost always be the 22nd of a month (YYYY-MM-22), unless you did an out of band blog post.
- removal_milestone: "15.0" # The milestone when this feature is planned to be removed
- removal_date: "2022-05-22" # The date of the milestone release when this feature is planned to be removed. This should almost always be the 22nd of a month (YYYY-MM-22), unless you did an out of band blog post.
- breaking_change: true # If this deprecation is a breaking change, set this value to true
- reporter: tlinz # GitLab username of the person reporting the deprecation
- body: | # Do not modify this line, instead modify the lines below.
- The feature flag `PUSH_RULES_SUPERSEDE_CODE_OWNERS` has been removed in GitLab 15.0. From now on, push rules will supersede CODEOWNERS. The CODEOWNERS feature is no longer available for access control.
-# The following items are not published on the docs page, but may be used in the future.
- stage: create # (optional - may be required in the future) String value of the stage that the feature was created in. e.g., Growth
- tiers: # (optional - may be required in the future) An array of tiers that the feature is available in currently. e.g., [Free, Silver, Gold, Core, Premium, Ultimate]
- issue_url: https://gitlab.com/gitlab-org/gitlab/-/issues/262019 # (optional) This is a link to the deprecation issue in GitLab
- documentation_url: # (optional) This is a link to the current documentation page
- image_url: # (optional) This is a link to a thumbnail image depicting the feature
- video_url: # (optional) Use the youtube thumbnail URL with the structure of https://img.youtube.com/vi/UNIQUEID/hqdefault.jpg
diff --git a/doc/administration/get_started.md b/doc/administration/get_started.md
index 3fd8613d23f..b2bdd876499 100644
--- a/doc/administration/get_started.md
+++ b/doc/administration/get_started.md
@@ -62,9 +62,10 @@ You may need to import projects from external sources like GitHub, Bitbucket, or
### Popular project imports
-- [GitHub Enterprise to self-managed GitLab](../integration/github.md): Enabling OAuth makes it easier for developers to find and import their projects.
-- [Bitbucket Server](../user/project/import/bitbucket_server.md#limitations): There are certain data limitations.
- For assistance with these data types, contact your GitLab account manager or GitLab Support about our professional migration services.
+- [GitHub Enterprise to self-managed GitLab](../integration/github.md)
+- [Bitbucket Server](../user/project/import/bitbucket_server.md)
+
+For assistance with these imports, contact your GitLab account manager or GitLab Support about our professional migration services.
## GitLab instance security
diff --git a/doc/ci/runners/saas/macos/environment.md b/doc/ci/runners/saas/macos/environment.md
index 95545923712..4d209fe4cd6 100644
--- a/doc/ci/runners/saas/macos/environment.md
+++ b/doc/ci/runners/saas/macos/environment.md
@@ -35,6 +35,7 @@ Each image is running a specific version of macOS and Xcode.
| macos-10.14-xcode-10 | <https://gitlab.com/gitlab-org/ci-cd/shared-runners/images/macstadium/orka/-/blob/main/toolchain/mojave.yml> |
| macos-10.15-xcode-11 | <https://gitlab.com/gitlab-org/ci-cd/shared-runners/images/macstadium/orka/-/blob/main/toolchain/catalina.yml> |
| macos-11-xcode-12 | <https://gitlab.com/gitlab-org/ci-cd/shared-runners/images/macstadium/orka/-/blob/main/toolchain/big-sur.yml> |
+| macos-11-xcode-13 | <https://gitlab.com/gitlab-org/ci-cd/shared-runners/images/macstadium/orka/-/blob/main/toolchain/monterey.yml> |
### Image update policy
diff --git a/doc/development/backend/create_source_code_be/index.md b/doc/development/backend/create_source_code_be/index.md
index 6421ca3754a..8661d8b4d74 100644
--- a/doc/development/backend/create_source_code_be/index.md
+++ b/doc/development/backend/create_source_code_be/index.md
@@ -21,14 +21,12 @@ The team works across three codebases: Workhorse, GitLab Shell and GitLab Rails.
## Workhorse
-GitLab Workhorse is a smart reverse proxy for GitLab. It handles "large" HTTP
+[GitLab Workhorse](../../workhorse/index.md) is a smart reverse proxy for GitLab. It handles "large" HTTP
requests such as file downloads, file uploads, `git push`, `git pull` and `git` archive downloads.
Workhorse itself is not a feature, but there are several features in GitLab
that would not work efficiently without Workhorse.
-Workhorse documentation is available in the [Workhorse repository](https://gitlab.com/gitlab-org/gitlab/tree/master/workhorse).
-
## GitLab Shell
GitLab Shell handles Git SSH sessions for GitLab and modifies the list of authorized keys.
diff --git a/doc/development/workhorse/channel.md b/doc/development/workhorse/channel.md
new file mode 100644
index 00000000000..9d7dd524c2f
--- /dev/null
+++ b/doc/development/workhorse/channel.md
@@ -0,0 +1,199 @@
+---
+stage: Create
+group: Source Code
+info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
+---
+
+# Websocket channel support
+
+In some cases, GitLab can provide in-browser terminal access to an
+environment (which is a running server or container, onto which a
+project has been deployed), or even access to services running in CI
+through a WebSocket. Workhorse manages the WebSocket upgrade and
+the long-lived connection itself, which frees up GitLab to
+process other requests.
+
+This document outlines the architecture of these connections.
+
+## Introduction to WebSockets
+
+A websocket is an "upgraded" HTTP/1.1 request. Its purpose is to
+permit bidirectional communication between a client and a server.
+**Websockets are not HTTP**. Clients can send messages (known as
+frames) to the server at any time, and vice-versa. Client messages
+are not necessarily requests, and server messages are not necessarily
+responses. WebSocket URLs have schemes like `ws://` (unencrypted) or
+`wss://` (TLS-secured).
+
+When requesting an upgrade to WebSocket, the browser sends an HTTP/1.1
+request that looks like this:
+
+```plaintext
+GET /path.ws HTTP/1.1
+Connection: upgrade
+Upgrade: websocket
+Sec-WebSocket-Protocol: terminal.gitlab.com
+# More headers, including security measures
+```
+
+At this point, the connection is still HTTP, so this is a request and
+the server can send a normal HTTP response, including `404 Not Found`,
+`500 Internal Server Error`, etc.
+
+If the server decides to permit the upgrade, it will send an HTTP
+`101 Switching Protocols` response. From this point, the connection
+is no longer HTTP: it is a WebSocket, and frames, not HTTP requests,
+will flow over it. The connection persists until either the client
+or the server closes it.
+
+In addition to the subprotocol, individual websocket frames may
+also specify a message type - examples include `BinaryMessage`,
+`TextMessage`, `Ping`, `Pong` or `Close`. Only binary frames can
+contain arbitrary data - other frames are expected to be valid
+UTF-8 strings, in addition to any subprotocol expectations.
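As a concrete illustration of the handshake described above, a browser performs it implicitly through the WebSocket API; a minimal sketch (the host name is an example, and the URL and subprotocol come from the terminal case covered below):

```javascript
// The browser sends the Upgrade request shown above when the
// WebSocket is constructed; the second argument becomes the
// Sec-WebSocket-Protocol header.
const ws = new WebSocket(
  'wss://gitlab.example.com/group/project/-/environments/1/terminal.ws',
  ['terminal.gitlab.com'],
);

ws.binaryType = 'arraybuffer'; // receive BinaryMessage frames as ArrayBuffer

ws.onopen = () => {
  // 101 Switching Protocols succeeded; frames can flow now.
  ws.send(new TextEncoder().encode('ls\n')); // arbitrary text input
};
ws.onclose = (event) => console.log('closed', event.code);
```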
+
+## Browser to Workhorse
+
+Using the terminal as an example, GitLab serves a JavaScript terminal
+emulator to the browser on a URL like
+`https://gitlab.com/group/project/-/environments/1/terminal`.
+This opens a websocket connection to, for example,
+`wss://gitlab.com/group/project/-/environments/1/terminal.ws`.
+This endpoint doesn't exist in GitLab - only in Workhorse.
+
+When receiving the connection, Workhorse first checks that the
+client is authorized to access the requested terminal. It does
+this by performing a "preauthentication" request to GitLab.
+
+If the client has the appropriate permissions and the terminal
+exists, GitLab responds with a successful response that includes
+details of the terminal that the client should be connected to.
+Otherwise, it returns an appropriate HTTP error response.
+
+Errors are passed back to the client as HTTP responses, but if
+GitLab returns valid terminal details to Workhorse, it will
+connect to the specified terminal, upgrade the browser to a
+WebSocket, and proxy between the two connections for as long
+as the browser's credentials are valid. Workhorse will also
+send regular `PingMessage` control frames to the browser, to
+keep intervening proxies from terminating the connection
+while the browser is present.
+
+The browser must request an upgrade with a specific subprotocol:
+
+### `terminal.gitlab.com`
+
+This subprotocol considers `TextMessage` frames to be invalid.
+Control frames, such as `PingMessage` or `CloseMessage`, have
+their usual meanings.
+
+`BinaryMessage` frames sent from the browser to the server are
+arbitrary text input.
+
+`BinaryMessage` frames sent from the server to the browser are
+arbitrary text output.
+
+These frames are expected to contain ANSI terminal control codes
+and may be in any encoding.
+
+### `base64.terminal.gitlab.com`
+
+This subprotocol considers `BinaryMessage` frames to be invalid.
+Control frames, such as `PingMessage` or `CloseMessage`, have
+their usual meanings.
+
+`TextMessage` frames sent from the browser to the server are
+base64-encoded arbitrary text input (so the server must
+base64-decode them before inputting them).
+
+`TextMessage` frames sent from the server to the browser are
+base64-encoded arbitrary text output (so the browser must
+base64-decode them before outputting them).
+
+In their base64-encoded form, these frames are expected to
+contain ANSI terminal control codes, and may be in any encoding.
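A sketch of the base64 framing described above, as a browser client might implement it (helper names are illustrative):

```javascript
// Encode arbitrary bytes into a base64 TextMessage, and decode the
// server's base64 output back into bytes, per the subprotocol above.
function encodeFrame(bytes) {
  let binary = '';
  bytes.forEach((b) => {
    binary += String.fromCharCode(b);
  });
  return btoa(binary);
}

function decodeFrame(text) {
  const binary = atob(text);
  return Uint8Array.from(binary, (ch) => ch.charCodeAt(0));
}

// Usage: ws.send(encodeFrame(new TextEncoder().encode('ls\n')));
```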
+
+## Workhorse to GitLab
+
+Using again the terminal as an example, before upgrading the browser,
+Workhorse sends a normal HTTP request to GitLab on a URL like
+`https://gitlab.com/group/project/environments/1/terminal.ws/authorize`.
+This returns a JSON response containing details of where the
+terminal can be found, and how to connect to it. In particular,
+the following details are returned in case of success:
+
+- WebSocket URL to **connect** to, e.g.: `wss://example.com/terminals/1.ws?tty=1`
+- WebSocket subprotocols to support, e.g.: `["channel.k8s.io"]`
+- Headers to send, e.g.: `Authorization: Token xxyyz..`
+- Certificate authority to verify `wss` connections with (optional)
+
+Workhorse periodically re-checks this endpoint, and if it gets an
+error response, or the details of the terminal change, it will
+terminate the websocket session.
+
+## Workhorse to the WebSocket server
+
+In GitLab, environments or CI jobs may have a deployment service (e.g.,
+`KubernetesService`) associated with them. This service knows
+where the terminals or the service for an environment may be found, and these
+details are returned to Workhorse by GitLab.
+
+These URLs are *also* WebSocket URLs, and GitLab tells Workhorse
+which subprotocols to speak over the connection, along with any
+authentication details required by the remote end.
+
+Before upgrading the browser's connection to a websocket,
+Workhorse opens an HTTP client connection, according to the
+details given to it by GitLab, and attempts to upgrade
+that connection to a websocket. If it fails, an error
+response is sent to the browser; otherwise, the browser is
+also upgraded.
+
+Workhorse now has two websocket connections, albeit with
+differing subprotocols. It decodes incoming frames from the
+browser, re-encodes them to the channel's subprotocol, and
+sends them to the channel. Similarly, it decodes incoming
+frames from the channel, re-encodes them to the browser's
+subprotocol, and sends them to the browser.
+
+When either connection closes or enters an error state,
+Workhorse detects the error and closes the other connection,
+terminating the channel session. If the browser is the
+connection that has disconnected, Workhorse will send an ANSI
+`End of Transmission` control code (the `0x04` byte) to the
+channel, encoded according to the appropriate subprotocol.
+Workhorse will automatically reply to any websocket ping frame
+sent by the channel, to avoid being disconnected.
+
+Currently, Workhorse supports only the following subprotocols.
+Supporting new deployment services will require adding support
+for new subprotocols:
+
+### `channel.k8s.io`
+
+Used by Kubernetes, this subprotocol defines a simple multiplexed
+channel.
+
+Control frames have their usual meanings. `TextMessage` frames are
+invalid. `BinaryMessage` frames represent I/O to a specific file
+descriptor.
+
+The first byte of each `BinaryMessage` frame represents the file
+descriptor (fd) number, as a `uint8` (so the value `0x00` corresponds
+to fd 0, `STDIN`, while `0x01` corresponds to fd 1, `STDOUT`).
+
+The remaining bytes represent arbitrary data. For frames received
+from the server, they are bytes that have been received from that
+fd. For frames sent to the server, they are bytes that should be
+written to that fd.
+
+### `base64.channel.k8s.io`
+
+Also used by Kubernetes, this subprotocol defines a similar multiplexed
+channel to `channel.k8s.io`. The main differences are:
+
+- `TextMessage` frames are valid, rather than `BinaryMessage` frames.
+- The first byte of each `TextMessage` frame represents the file
+  descriptor as a numeric UTF-8 character (so the character `U+0030`,
+  or "0", is fd 0, `STDIN`).
+- The remaining bytes represent base64-encoded arbitrary data.
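To make the fd multiplexing concrete, a minimal client-side sketch of handling `channel.k8s.io` frames (it assumes the binary WebSocket setup shown earlier; `terminal` stands in for a hypothetical terminal emulator):

```javascript
// Each BinaryMessage frame: first byte = fd number, rest = payload.
ws.onmessage = ({ data }) => {
  const bytes = new Uint8Array(data);
  const fd = bytes[0]; // 0x00 = STDIN, 0x01 = STDOUT
  const payload = bytes.subarray(1);
  if (fd === 1) {
    terminal.write(new TextDecoder().decode(payload));
  }
};

// Sending input to fd 0 (STDIN): prefix the payload with the fd byte.
function sendStdin(text) {
  const payload = new TextEncoder().encode(text);
  const frame = new Uint8Array(payload.length + 1);
  frame[0] = 0; // fd 0
  frame.set(payload, 1);
  ws.send(frame);
}
```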
diff --git a/doc/development/workhorse/configuration.md b/doc/development/workhorse/configuration.md
new file mode 100644
index 00000000000..f715cf442eb
--- /dev/null
+++ b/doc/development/workhorse/configuration.md
@@ -0,0 +1,219 @@
+---
+stage: Create
+group: Source Code
+info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
+---
+
+# Workhorse configuration
+
+For historical reasons, Workhorse uses command line flags, a configuration file, and environment variables.
+
+All new configuration options that get added to Workhorse should go into the configuration file.
+
+## CLI options
+
+```plaintext
+ gitlab-workhorse [OPTIONS]
+
+Options:
+ -apiCiLongPollingDuration duration
+ Long polling duration for job requesting for runners (default 50ns)
+ -apiLimit uint
+ Number of API requests allowed at single time
+ -apiQueueDuration duration
+ Maximum queueing duration of requests (default 30s)
+ -apiQueueLimit uint
+ Number of API requests allowed to be queued
+ -authBackend string
+ Authentication/authorization backend (default "http://localhost:8080")
+ -authSocket string
+ Optional: Unix domain socket to dial authBackend at
+ -cableBackend string
+ Optional: ActionCable backend (default authBackend)
+ -cableSocket string
+ Optional: Unix domain socket to dial cableBackend at (default authSocket)
+ -config string
+ TOML file to load config from
+ -developmentMode
+ Allow the assets to be served from Rails app
+ -documentRoot string
+ Path to static files content (default "public")
+ -listenAddr string
+ Listen address for HTTP server (default "localhost:8181")
+ -listenNetwork string
+ Listen 'network' (tcp, tcp4, tcp6, unix) (default "tcp")
+ -listenUmask int
+ Umask for Unix socket
+ -logFile string
+ Log file location
+ -logFormat string
+ Log format to use defaults to text (text, json, structured, none) (default "text")
+ -pprofListenAddr string
+ pprof listening address, e.g. 'localhost:6060'
+ -prometheusListenAddr string
+ Prometheus listening address, e.g. 'localhost:9229'
+ -proxyHeadersTimeout duration
+ How long to wait for response headers when proxying the request (default 5m0s)
+ -secretPath string
+ File with secret key to authenticate with authBackend (default "./.gitlab_workhorse_secret")
+ -version
+ Print version and exit
+```
+
+The 'auth backend' refers to the GitLab Rails application. The name is
+a holdover from when GitLab Workhorse only handled Git push/pull over
+HTTP.
+
+GitLab Workhorse can listen on either a TCP or a Unix domain socket. It
+can also open a second TCP listening socket with the Go
+[`net/http/pprof` profiler server](http://golang.org/pkg/net/http/pprof/).
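+
+For example, to expose the profiler server alongside the main listener
+(both addresses are placeholders):
+
+```shell
+gitlab-workhorse -listenAddr localhost:8181 -pprofListenAddr localhost:6060
+```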
+
+GitLab Workhorse can listen for Redis events (currently only the
+builds/register queue for runners). This requires you to pass a valid
+TOML configuration file via the `-config` flag.
+For regular setups it only requires the following (replacing the string
+with the actual socket):
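+
+```plaintext
+[redis]
+URL = "unix:///var/run/gitlab/redis.sock"
+```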
+
+## Redis
+
+GitLab Workhorse integrates with Redis to do long polling for CI build
+requests. This is configured via two things:
+
+- Redis settings in the TOML configuration file
+- The `-apiCiLongPollingDuration` command line flag to control polling
+ behavior for CI build requests
+
+It is OK to enable Redis in the configuration file but to leave CI polling
+disabled; this just results in an idle Redis pubsub connection. The
+opposite is not possible: CI long polling requires a correct Redis
+configuration.
+
+Below we discuss the options for the `[redis]` section in the configuration
+file.
+
+```plaintext
+[redis]
+URL = "unix:///var/run/gitlab/redis.sock"
+Password = "my_awesome_password"
+Sentinel = [ "tcp://sentinel1:23456", "tcp://sentinel2:23456" ]
+SentinelMaster = "mymaster"
+```
+
+- `URL` takes a string in the format `unix://path/to/redis.sock` or
+`tcp://host:port`.
+- `Password` is only required if your Redis instance is password-protected.
+- `Sentinel` is used if you are using Sentinel.
+
+ NOTE:
+ If both `Sentinel` and `URL` are given, only `Sentinel` will be used.
+
+Optional fields are as follows:
+
+```plaintext
+[redis]
+DB = 0
+MaxIdle = 1
+MaxActive = 1
+```
+
+- `DB` is the database to connect to. Defaults to `0`.
+- `MaxIdle` is how many idle connections can be in the Redis pool at once. Defaults to `1`.
+- `MaxActive` is how many connections the pool can keep. Defaults to `1`.
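+
+Putting the required and optional fields together, a complete `[redis]`
+section could look like this (all values are placeholders):
+
+```plaintext
+[redis]
+URL = "unix:///var/run/gitlab/redis.sock"
+Password = "my_awesome_password"
+DB = 0
+MaxIdle = 1
+MaxActive = 1
+```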
+
+## Relative URL support
+
+If you are mounting GitLab at a relative URL, e.g.
+`example.com/gitlab`, then you should also use this relative URL in
+the `authBackend` setting:
+
+```plaintext
+gitlab-workhorse -authBackend http://localhost:8080/gitlab
+```
+
+## Interaction of authBackend and authSocket
+
+The interaction between `authBackend` and `authSocket` can be a bit
+confusing. It comes down to: if `authSocket` is set it overrides the
+_host_ part of `authBackend` but not the relative path.
+
+In table form:
+
+|authBackend|authSocket|Workhorse connects to?|Rails relative URL|
+|---|---|---|---|
+|unset|unset|`localhost:8080`|`/`|
+|`http://localhost:3000`|unset|`localhost:3000`|`/`|
+|`http://localhost:3000/gitlab`|unset|`localhost:3000`|`/gitlab`|
+|unset|`/path/to/socket`|`/path/to/socket`|`/`|
+|`http://localhost:3000`|`/path/to/socket`|`/path/to/socket`|`/`|
+|`http://localhost:3000/gitlab`|`/path/to/socket`|`/path/to/socket`|`/gitlab`|
+
+The same applies to `cableBackend` and `cableSocket`.
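+
+For example, the last row of the table corresponds to an invocation like
+this (the socket path is a placeholder):
+
+```shell
+gitlab-workhorse -authBackend http://localhost:3000/gitlab -authSocket /path/to/socket
+```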
+
+## Error tracking
+
+GitLab Workhorse supports remote error tracking with
+[Sentry](https://sentry.io). To enable this feature, set the
+`GITLAB_WORKHORSE_SENTRY_DSN` environment variable.
+You can also set the `GITLAB_WORKHORSE_SENTRY_ENVIRONMENT` environment
+variable to use the Sentry environment functionality to separate
+staging, production, and development.
+
+Omnibus (`/etc/gitlab/gitlab.rb`):
+
+```ruby
+gitlab_workhorse['env'] = {
+  'GITLAB_WORKHORSE_SENTRY_DSN' => 'https://foobar',
+  'GITLAB_WORKHORSE_SENTRY_ENVIRONMENT' => 'production'
+}
+```
+
+Source installations (`/etc/default/gitlab`):
+
+```plaintext
+export GITLAB_WORKHORSE_SENTRY_DSN='https://foobar'
+export GITLAB_WORKHORSE_SENTRY_ENVIRONMENT='production'
+```
+
+## Distributed Tracing
+
+Workhorse supports distributed tracing through [LabKit](https://gitlab.com/gitlab-org/labkit/) using [OpenTracing APIs](https://opentracing.io).
+
+By default, no tracing implementation is linked into the binary, but different OpenTracing providers can be linked in using [build tags](https://golang.org/pkg/go/build/#hdr-Build_Constraints) or build constraints. This can be done by setting the `BUILD_TAGS` make variable.
+
+For more details on the supported providers, see the LabKit documentation. As an example, for Jaeger tracing support, set `BUILD_TAGS="tracer_static tracer_static_jaeger"`:
+
+```shell
+make BUILD_TAGS="tracer_static tracer_static_jaeger"
+```
+
+Once Workhorse is compiled with an OpenTracing provider, tracing is configured via the `GITLAB_TRACING` environment variable.
+
+For example:
+
+```shell
+GITLAB_TRACING=opentracing://jaeger ./gitlab-workhorse
+```
+
+## Continuous Profiling
+
+Workhorse supports continuous profiling through [LabKit](https://gitlab.com/gitlab-org/labkit/) using [Stackdriver Profiler](https://cloud.google.com/profiler).
+
+By default, the Stackdriver Profiler implementation is linked into the binary using [build tags](https://golang.org/pkg/go/build/#hdr-Build_Constraints), though it's not
+required and can be skipped.
+
+For example:
+
+```shell
+make BUILD_TAGS=""
+```
+
+Once Workhorse is compiled with continuous profiling support, the profiler configuration can be set via the `GITLAB_CONTINUOUS_PROFILING`
+environment variable.
+
+For example:
+
+```shell
+GITLAB_CONTINUOUS_PROFILING="stackdriver?service=workhorse&service_version=1.0.1&project_id=test-123" ./gitlab-workhorse
+```
+
+For more information, see the [LabKit monitoring documentation](https://gitlab.com/gitlab-org/labkit/-/blob/master/monitoring/doc.go).
diff --git a/doc/development/workhorse/gitlab_features.md b/doc/development/workhorse/gitlab_features.md
new file mode 100644
index 00000000000..cdec2a8f0c1
--- /dev/null
+++ b/doc/development/workhorse/gitlab_features.md
@@ -0,0 +1,73 @@
+---
+stage: Create
+group: Source Code
+info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
+---
+
+# Features that rely on Workhorse
+
+Workhorse itself is not a feature, but there are several features in
+GitLab that would not work efficiently without Workhorse.
+
+To put the efficiency benefit in context, consider that in 2020Q3 on
+GitLab.com [we see](https://thanos-query.ops.gitlab.net/graph?g0.range_input=1h&g0.max_source_resolution=0s&g0.expr=sum(ruby_process_resident_memory_bytes%7Bapp%3D%22webservice%22%2Cenv%3D%22gprd%22%2Crelease%3D%22gitlab%22%7D)%20%2F%20sum(puma_max_threads%7Bapp%3D%22webservice%22%2Cenv%3D%22gprd%22%2Crelease%3D%22gitlab%22%7D)&g0.tab=1&g1.range_input=1h&g1.max_source_resolution=0s&g1.expr=sum(go_memstats_sys_bytes%7Bapp%3D%22webservice%22%2Cenv%3D%22gprd%22%2Crelease%3D%22gitlab%22%7D)%2Fsum(go_goroutines%7Bapp%3D%22webservice%22%2Cenv%3D%22gprd%22%2Crelease%3D%22gitlab%22%7D)&g1.tab=1)
+Rails application threads using on average
+about 200MB of RSS vs about 200KB for Workhorse goroutines.
+
+Examples of features that rely on Workhorse:
+
+## 1. `git clone` and `git push` over HTTP
+
+Git clone, pull and push are slow because they transfer large amounts
+of data and because each is CPU intensive on the GitLab side. Without
+Workhorse, HTTP access to Git repositories would compete with regular
+web access to the application, requiring us to run way more Rails
+application servers.
+
+## 2. CI runner long polling
+
+GitLab CI runners fetch new CI jobs by polling the GitLab server.
+Workhorse acts as a kind of "waiting room" where CI runners can sit
+and wait for new CI jobs. Because of Go's efficiency we can fit a lot
+of runners in the waiting room at little cost. Without this waiting
+room mechanism we would have to add a lot more Rails server capacity.
+
+## 3. File uploads and downloads
+
+File uploads and downloads may be slow either because the file is
+large or because the user's connection is slow. Workhorse can handle
+the slow part for Rails. This improves the efficiency of features such
+as CI artifacts, package repositories, LFS objects, etc.
+
+## 4. Websocket proxying
+
+Features such as the web terminal require a long-lived connection
+between the user's web browser and a container inside GitLab that is
+not directly accessible from the internet. Dedicating a Rails
+application thread to proxying such a connection would cost much more
+memory than it costs to have Workhorse look after it.
+
+## Quick facts (how does Workhorse work)
+
+- Workhorse can handle some requests without involving Rails at all:
+ for example, JavaScript files and CSS files are served straight
+ from disk.
+- Workhorse can modify responses sent by Rails: for example if you use
+ `send_file` in Rails then GitLab Workhorse will open the file on
+ disk and send its contents as the response body to the client.
+- Workhorse can take over requests after asking permission from Rails.
+ Example: handling `git clone`.
+- Workhorse can modify requests before passing them to Rails. Example:
+ when handling a Git LFS upload Workhorse first asks permission from
+ Rails, then it stores the request body in a tempfile, then it sends
+ a modified request containing the tempfile path to Rails.
+- Workhorse can manage long-lived WebSocket connections for Rails.
+ Example: handling the terminal websocket for environments.
+- Workhorse does not connect to PostgreSQL, only to Rails and (optionally) Redis.
+- We assume that all requests that reach Workhorse pass through an
+ upstream proxy such as NGINX or Apache first.
+- Workhorse does not accept HTTPS connections.
+- Workhorse does not clean up idle client connections.
+- We assume that all requests to Rails pass through Workhorse.
+
+For more information see ['A brief history of GitLab Workhorse'](https://about.gitlab.com/2016/04/12/a-brief-history-of-gitlab-workhorse/).
diff --git a/doc/development/workhorse/index.md b/doc/development/workhorse/index.md
new file mode 100644
index 00000000000..f7ca16e0f31
--- /dev/null
+++ b/doc/development/workhorse/index.md
@@ -0,0 +1,84 @@
+---
+stage: Create
+group: Source Code
+info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
+---
+
+# GitLab Workhorse
+
+GitLab Workhorse is a smart reverse proxy for GitLab. It handles
+"large" HTTP requests such as file downloads, file uploads, Git
+push/pull and Git archive downloads.
+
+Workhorse itself is not a feature, but there are [several features in
+GitLab](gitlab_features.md) that would not work efficiently without Workhorse.
+
+The canonical source for Workhorse is
+[`gitlab-org/gitlab/workhorse`](https://gitlab.com/gitlab-org/gitlab/tree/master/workhorse).
+Prior to [epic #4826](https://gitlab.com/groups/gitlab-org/-/epics/4826), it was
+[`gitlab-org/gitlab-workhorse`](https://gitlab.com/gitlab-org/gitlab-workhorse/tree/master),
+but that repository is no longer used for development.
+
+## Install Workhorse
+
+To install GitLab Workhorse you need [Go 1.15 or newer](https://golang.org/dl) and
+[GNU Make](https://www.gnu.org/software/make/).
+
+To install into `/usr/local/bin` run `make install`.
+
+```plaintext
+make install
+```
+
+To install into `/foo/bin`, set the `PREFIX` variable.
+
+```plaintext
+make install PREFIX=/foo
+```
+
+On some operating systems, such as FreeBSD, you may have to use
+`gmake` instead of `make`.
+
+*NOTE*: Some features depend on build tags. Make sure to check
+[Workhorse configuration](configuration.md) to enable them.
+
+### Run time dependencies
+
+Workhorse uses [Exiftool](https://www.sno.phy.queensu.ca/~phil/exiftool/) for
+removing EXIF data (which may contain sensitive information) from uploaded
+images. If you installed GitLab:
+
+- Using the Omnibus package, you're all set.
+  *NOTE*: if you are using CentOS Minimal, you may need to install the
+  `perl` package: `yum install perl`.
+- From source, make sure `exiftool` is installed:
+
+ ```shell
+ # Debian/Ubuntu
+ sudo apt-get install libimage-exiftool-perl
+
+ # RHEL/CentOS
+ sudo yum install perl-Image-ExifTool
+ ```
+
+## Testing your code
+
+Run the tests with:
+
+```plaintext
+make clean test
+```
+
+Each feature in GitLab Workhorse should have an integration test that
+verifies that the feature 'kicks in' on the right requests and leaves
+other requests unaffected. It is better to also have package-level tests
+for specific behavior, but the high-level integration tests should take
+priority during development.
+
+It is OK if a feature is only covered by integration tests.
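+
+During development it can also be handy to run the tests of a single Go
+package directly; the package path and test name here are only examples:
+
+```shell
+go test -v -run TestUpload ./internal/upload/
+```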
+
+<!--
+## License
+
+This code is distributed under the MIT license, see the [LICENSE](LICENSE) file.
+-->
diff --git a/doc/development/workhorse/new_features.md b/doc/development/workhorse/new_features.md
new file mode 100644
index 00000000000..6b17af0fc34
--- /dev/null
+++ b/doc/development/workhorse/new_features.md
@@ -0,0 +1,46 @@
+---
+stage: Create
+group: Source Code
+info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
+---
+
+# Adding new features to Workhorse
+
+GitLab Workhorse is a smart reverse proxy for GitLab. It handles
+"long" HTTP requests such as file downloads, file uploads, Git
+push/pull and Git archive downloads.
+
+Workhorse itself is not a feature, but there are [several features in GitLab](gitlab_features.md) that would not work efficiently without Workhorse.
+
+At first glance, it may look like Workhorse is just a pipeline for processing HTTP streams so that you can reduce the amount of logic in your Ruby on Rails controller, but there are good reasons to avoid treating it like that.
+
+Engineers embarking on the quest of offloading a feature to Workhorse often find that the endeavor is much harder than originally anticipated: in part because of the new programming language (only a few engineers at GitLab are Go developers), and in part because of Workhorse's demanding requirements. Workhorse is stateless, memory and disk usage must be kept under tight control, and the request should not be slowed down in the process.
+
+## Can I add a new feature to Workhorse?
+
+We suggest following this route only if it is absolutely necessary and no other options are available.
+
+Splitting a feature between the Rails codebase and Workhorse is deliberately choosing to introduce technical debt. It adds complexity to the system and coupling between the two components.
+
+- Building features using Workhorse has a considerable complexity cost, so you should prefer designs based on Rails requests and Sidekiq jobs.
+- Even when using Rails+Sidekiq is "more work" than using Rails+Workhorse, Rails+Sidekiq is easier to maintain in the long term because Workhorse is unique to GitLab while Rails+Sidekiq is an industry standard.
+- For "global" behaviors around web requests consider using a Rack middleware instead of Workhorse.
+- Generally speaking, we should only use Rails+Workhorse if the HTTP client expects behavior that is not reasonable to implement in Rails, like "long" requests.
+
+## What is a "long" request?
+
+There is one order of magnitude between Workhorse's and Puma's RAM usage. Keeping a connection open for longer than milliseconds is a problem because of the amount of RAM it monopolizes once it reaches the Ruby on Rails controller.
+
+So far we identified two classes of "long" requests: data transfers and HTTP long polling.
+
+`git push`, `git pull`, uploading or downloading an artifact, the CI runner waiting for a new job are all good examples of long requests.
+
+With the rise of cloud-native installations, Workhorse's feature-set was extended to add object storage direct-upload, to get rid of the shared Network File System (NFS) drives.
+
+In 2020 @nolith presented at FOSDEM a talk titled [_Speed up the monolith. Building a smart reverse proxy in Go_](https://archive.fosdem.org/2020/schedule/event/speedupmonolith/).
+You can watch the recording for more details on the history of Workhorse and the NFS removal.
+
+[Uploads development documentation](../uploads.md)
+contains the most common use-cases for adding a new type of upload and may answer all of your questions.
+
+If you still think we should add a new feature to Workhorse, please open an issue explaining **what you want to implement** and **why it can't be implemented in our Ruby codebase**. Workhorse maintainers will be happy to help you assess the situation.
diff --git a/doc/update/removals.md b/doc/update/removals.md
index bb3b54573b8..80f5006982e 100644
--- a/doc/update/removals.md
+++ b/doc/update/removals.md
@@ -30,16 +30,6 @@ For removal reviewers (Technical Writers only):
## 15.0
-### Removed feature flag PUSH_RULES_SUPERSEDE_CODE_OWNERS
-
-WARNING:
-This feature was changed or removed in 15.0
-as a [breaking change](https://docs.gitlab.com/ee/development/contributing/#breaking-changes).
-Before updating GitLab, review the details carefully to determine if you need to make any
-changes to your code, settings, or workflow.
-
-The feature flag `PUSH_RULES_SUPERSEDE_CODE_OWNERS` has been removed in GitLab 15.0. From now on, push rules will supersede CODEOWNERS. The CODEOWNERS feature is no longer available for access control.
-
### Request a new review
The ability to [request a new review](https://docs.gitlab.com/ee/user/project/merge_requests/reviews/#request-a-new-review) has been removed in GitLab 15.0. This feature is replaced with [requesting attention](https://docs.gitlab.com/ee/user/project/merge_requests/#request-attention-to-a-merge-request) to a merge request.
diff --git a/doc/user/project/import/bitbucket_server.md b/doc/user/project/import/bitbucket_server.md
index 4e3642eb3bd..b6241dbbdb0 100644
--- a/doc/user/project/import/bitbucket_server.md
+++ b/doc/user/project/import/bitbucket_server.md
@@ -24,11 +24,10 @@ created as private in GitLab as well.
## Import your Bitbucket repositories
-Prerequisites:
+Prerequisite:
- An administrator must have enabled the **Bitbucket Server** in
**Admin > Settings > General > Visibility and access controls > Import sources**.
-- Review the importer's [limitations](#limitations).
To import your Bitbucket repositories:
@@ -41,24 +40,27 @@ To import your Bitbucket repositories:
1. Select the projects to import, or import all projects. You can filter projects by name and select
the namespace for which to import each project.
-## Limitations
+### Items that are not imported
-- GitLab doesn't allow comments on arbitrary lines of code. Any out-of-bounds Bitbucket comments are
- inserted as comments in the merge request.
-- Bitbucket Server allows multiple threading levels. The importer collapses this into one thread and
- quotes part of the original comment.
-- Declined pull requests have unreachable commits. This prevents the importer from generating a
- proper diff. These pull requests show up as empty changes.
-- Project filtering doesn't support fuzzy search. Only starts with or full match strings are
- supported.
-
-The following aren't imported:
+The following items aren't imported:
- Pull request approvals
- Attachments in Markdown
- Task lists
- Emoji reactions
+### Items that are imported but changed
+
+The following items are changed when they are imported:
+
+- GitLab doesn't allow comments on arbitrary lines of code. Any out-of-bounds Bitbucket comments are
+ inserted as comments in the merge request.
+- Multiple threading levels are collapsed into one thread, with part of
+  the original comment quoted.
+- Declined pull requests have unreachable commits. These pull requests show up as empty changes.
+- Project filtering doesn't support fuzzy search. Only **starts with** or **full match** strings are
+ supported.
+
## User assignment
When issues and pull requests are importing, the importer tries to find the author's email address
diff --git a/locale/gitlab.pot b/locale/gitlab.pot
index cba3c316a32..f07fcef5623 100644
--- a/locale/gitlab.pot
+++ b/locale/gitlab.pot
@@ -969,6 +969,9 @@ msgstr[1] ""
msgid "%{service_ping_link_start}What information is shared with GitLab Inc.?%{service_ping_link_end}"
msgstr ""
+msgid "%{size} %{unit}"
+msgstr ""
+
msgid "%{size} GiB"
msgstr ""
@@ -19913,18 +19916,6 @@ msgstr ""
msgid "Incidents|Must start with http or https"
msgstr ""
-msgid "Incidents|There was an issue deleting the image."
-msgstr ""
-
-msgid "Incidents|There was an issue loading metric images."
-msgstr ""
-
-msgid "Incidents|There was an issue updating your image."
-msgstr ""
-
-msgid "Incidents|There was an issue uploading your image."
-msgstr ""
-
msgid "Incident|Add new timeline event"
msgstr ""
@@ -23871,6 +23862,18 @@ msgstr ""
msgid "MetricChart|There is too much data to calculate. Please change your selection."
msgstr ""
+msgid "MetricImages|There was an issue deleting the image."
+msgstr ""
+
+msgid "MetricImages|There was an issue loading metric images."
+msgstr ""
+
+msgid "MetricImages|There was an issue updating your image."
+msgstr ""
+
+msgid "MetricImages|There was an issue uploading your image."
+msgstr ""
+
msgid "Metrics"
msgstr ""
@@ -40507,9 +40510,6 @@ msgstr ""
msgid "UsageQuota|Buy additional minutes"
msgstr ""
-msgid "UsageQuota|Buy storage"
-msgstr ""
-
msgid "UsageQuota|CI minutes usage by month"
msgstr ""
@@ -40540,10 +40540,10 @@ msgstr ""
msgid "UsageQuota|LFS storage"
msgstr ""
-msgid "UsageQuota|Learn more about usage quotas."
+msgid "UsageQuota|Learn more about excess storage usage"
msgstr ""
-msgid "UsageQuota|Namespace storage used"
+msgid "UsageQuota|Learn more about usage quotas"
msgstr ""
msgid "UsageQuota|No CI minutes usage data available."
@@ -40564,10 +40564,7 @@ msgstr ""
msgid "UsageQuota|Purchase more storage"
msgstr ""
-msgid "UsageQuota|Purchased storage"
-msgstr ""
-
-msgid "UsageQuota|Purchased storage used"
+msgid "UsageQuota|Purchased storage available"
msgstr ""
msgid "UsageQuota|Repository"
@@ -40594,12 +40591,24 @@ msgstr ""
msgid "UsageQuota|Storage used"
msgstr ""
+msgid "UsageQuota|This is the total amount of storage used across your projects within this namespace."
+msgstr ""
+
+msgid "UsageQuota|This is the total amount of storage used by projects above the free %{actualRepositorySizeLimit} storage limit."
+msgstr ""
+
msgid "UsageQuota|This namespace contains locked projects"
msgstr ""
msgid "UsageQuota|This namespace has no projects which use shared runners"
msgstr ""
+msgid "UsageQuota|Total excess storage used"
+msgstr ""
+
+msgid "UsageQuota|Total namespace storage used"
+msgstr ""
+
msgid "UsageQuota|Unlimited"
msgstr ""
diff --git a/qa/qa/resource/bulk_import_group.rb b/qa/qa/resource/bulk_import_group.rb
index a22529152e1..31db8ae4cc6 100644
--- a/qa/qa/resource/bulk_import_group.rb
+++ b/qa/qa/resource/bulk_import_group.rb
@@ -7,10 +7,14 @@ module QA
:destination_group,
:import_id
- attribute :access_token do
+ attribute :import_access_token do
api_client.personal_access_token
end
+ attribute :gitlab_address do
+ QA::Runtime::Scenario.gitlab_address
+ end
+
# In most cases we will want to set path the same as source group
# but it can be set to a custom name as well when imported via API
attribute :destination_group_path do
@@ -19,18 +23,16 @@ module QA
# Can't define path as attribue since @path is set in base class initializer
alias_method :path, :destination_group_path
- delegate :gitlab_address, to: 'QA::Runtime::Scenario'
-
- def fabricate_via_browser_ui!
+ def fabricate!
Page::Main::Menu.perform(&:go_to_create_group)
Page::Group::New.perform do |group|
group.switch_to_import_tab
- group.connect_gitlab_instance(gitlab_address, api_client.personal_access_token)
+ group.connect_gitlab_instance(gitlab_address, import_access_token)
end
Page::Group::BulkImport.perform do |import_page|
- import_page.import_group(path, sandbox.path)
+ import_page.import_group(destination_group_path, sandbox.full_path)
end
reload!
@@ -49,7 +51,7 @@ module QA
{
configuration: {
url: gitlab_address,
- access_token: access_token
+ access_token: import_access_token
},
entities: [
{
diff --git a/qa/qa/resource/project.rb b/qa/qa/resource/project.rb
index 7b91815a43e..dba3eb2e219 100644
--- a/qa/qa/resource/project.rb
+++ b/qa/qa/resource/project.rb
@@ -343,16 +343,17 @@ module QA
parse_body(response)
end
- def pipelines
- response = get(request_url(api_pipelines_path))
- parse_body(response)
- end
-
def pipeline_schedules
response = get(request_url(api_pipeline_schedules_path))
parse_body(response)
end
+ def pipelines(auto_paginate: false, attempts: 0)
+ return parse_body(api_get_from(api_pipelines_path)) unless auto_paginate
+
+ auto_paginated_response(request_url(api_pipelines_path, per_page: '100'), attempts: attempts)
+ end
+
def issues(auto_paginate: false, attempts: 0)
return parse_body(api_get_from(api_issues_path)) unless auto_paginate
@@ -387,9 +388,7 @@ module QA
api_resource[:import_status] == "finished"
end
- unless mirror_succeeded
- raise "Mirroring failed with error: #{api_resource[:import_error]}"
- end
+ raise "Mirroring failed with error: #{api_resource[:import_error]}" unless mirror_succeeded
end
def remove_via_api!
diff --git a/qa/qa/specs/features/api/1_manage/import_large_github_repo_spec.rb b/qa/qa/specs/features/api/1_manage/import_large_github_repo_spec.rb
index f2c6ffc7c47..e30995cb074 100644
--- a/qa/qa/specs/features/api/1_manage/import_large_github_repo_spec.rb
+++ b/qa/qa/specs/features/api/1_manage/import_large_github_repo_spec.rb
@@ -111,7 +111,7 @@ module QA
user.remove_via_api! unless example.exception
next unless defined?(@import_time)
- # save data for comparison after run finished
+ # save data for creating the comparison notification
save_json(
"data",
{
@@ -121,26 +121,30 @@ module QA
source: {
name: "GitHub",
project_name: github_repo,
- branches: gh_branches.length,
- commits: gh_commits.length,
- labels: gh_labels.length,
- milestones: gh_milestones.length,
- mrs: gh_prs.length,
- mr_comments: gh_prs.sum { |_k, v| v[:comments].length },
- issues: gh_issues.length,
- issue_comments: gh_issues.sum { |_k, v| v[:comments].length }
+ data: {
+ branches: gh_branches.length,
+ commits: gh_commits.length,
+ labels: gh_labels.length,
+ milestones: gh_milestones.length,
+ mrs: gh_prs.length,
+ mr_comments: gh_prs.sum { |_k, v| v[:comments].length },
+ issues: gh_issues.length,
+ issue_comments: gh_issues.sum { |_k, v| v[:comments].length }
+ }
},
target: {
name: "GitLab",
project_name: imported_project.path_with_namespace,
- branches: gl_branches.length,
- commits: gl_commits.length,
- labels: gl_labels.length,
- milestones: gl_milestones.length,
- mrs: mrs.length,
- mr_comments: mrs.sum { |_k, v| v[:comments].length },
- issues: gl_issues.length,
- issue_comments: gl_issues.sum { |_k, v| v[:comments].length }
+ data: {
+ branches: gl_branches.length,
+ commits: gl_commits.length,
+ labels: gl_labels.length,
+ milestones: gl_milestones.length,
+ mrs: mrs.length,
+ mr_comments: mrs.sum { |_k, v| v[:comments].length },
+ issues: gl_issues.length,
+ issue_comments: gl_issues.sum { |_k, v| v[:comments].length }
+ }
},
not_imported: {
mrs: @mr_diff,
@@ -158,7 +162,7 @@ module QA
start = Time.now
# import the project and log gitlab path
- Runtime::Logger.info("== Importing project '#{github_repo}' in to '#{imported_project.reload!.full_path}' ==")
+ logger.info("== Importing project '#{github_repo}' in to '#{imported_project.reload!.full_path}' ==")
# fetch all objects right after import has started
fetch_github_objects
diff --git a/qa/qa/specs/features/api/1_manage/migration/gitlab_migration_large_project_spec.rb b/qa/qa/specs/features/api/1_manage/migration/gitlab_migration_large_project_spec.rb
new file mode 100644
index 00000000000..1095d78c7ad
--- /dev/null
+++ b/qa/qa/specs/features/api/1_manage/migration/gitlab_migration_large_project_spec.rb
@@ -0,0 +1,400 @@
+# frozen_string_literal: true
+
+# rubocop:disable Rails/Pluck, Layout/LineLength, RSpec/MultipleMemoizedHelpers
+module QA
+ RSpec.describe "Manage", :requires_admin, only: { job: 'large-gitlab-import' } do
+ describe "Gitlab migration" do
+ let(:logger) { Runtime::Logger.logger }
+ let(:differ) { RSpec::Support::Differ.new(color: true) }
+ let(:gitlab_group) { 'gitlab-migration' }
+ let(:gitlab_source_address) { "https://staging.gitlab.com" }
+
+ let(:import_wait_duration) do
+ {
+ max_duration: (ENV['QA_LARGE_IMPORT_DURATION'] || 3600).to_i,
+ sleep_interval: 30
+ }
+ end
+
+ let(:admin_api_client) { Runtime::API::Client.as_admin }
+
+ # explicitly create a PAT via the API so it is not created via the UI in environments where the admin token env var is not present
+ let(:target_api_client) do
+ Runtime::API::Client.new(
+ user: user,
+ personal_access_token: Resource::PersonalAccessToken.fabricate_via_api! do |pat|
+ pat.api_client = admin_api_client
+ end.token
+ )
+ end
+
+ let(:user) do
+ Resource::User.fabricate_via_api! do |usr|
+ usr.api_client = admin_api_client
+ end
+ end
+
+ let(:source_api_client) do
+ Runtime::API::Client.new(
+ gitlab_source_address,
+ personal_access_token: ENV["QA_LARGE_IMPORT_GL_TOKEN"],
+ is_new_session: false
+ )
+ end
+
+ let(:sandbox) do
+ Resource::Sandbox.fabricate_via_api! do |group|
+ group.api_client = admin_api_client
+ end
+ end
+
+ let(:destination_group) do
+ Resource::Group.fabricate_via_api! do |group|
+ group.api_client = admin_api_client
+ group.sandbox = sandbox
+ group.path = "imported-group-destination-#{SecureRandom.hex(4)}"
+ end
+ end
+
+ # Source group and its objects
+ #
+ let(:source_group) do
+ Resource::Sandbox.fabricate_via_api! do |group|
+ group.api_client = source_api_client
+ group.path = gitlab_group
+ end
+ end
+
+ let(:source_project) { source_group.projects.find { |project| project.name.include?("dri") }.reload! }
+ let(:source_branches) { source_project.repository_branches(auto_paginate: true).map { |b| b[:name] } }
+ let(:source_commits) { source_project.commits(auto_paginate: true).map { |c| c[:id] } }
+ let(:source_labels) { source_project.labels(auto_paginate: true).map { |l| l.except(:id) } }
+ let(:source_milestones) { source_project.milestones(auto_paginate: true).map { |ms| ms.except(:id, :web_url, :project_id) } }
+ let(:source_pipelines) { source_project.pipelines.map { |pp| pp.except(:id, :web_url, :project_id) } }
+ let(:source_mrs) { fetch_mrs(source_project, source_api_client) }
+ let(:source_issues) { fetch_issues(source_project, source_api_client) }
+
+ # Imported group and its objects
+ #
+ let(:imported_group) do
+ Resource::BulkImportGroup.fabricate_via_api! do |group|
+ group.import_access_token = source_api_client.personal_access_token # token for importing on source instance
+ group.api_client = target_api_client # token used by qa framework to access resources in destination instance
+ group.gitlab_address = gitlab_source_address
+ group.source_group = source_group
+ group.sandbox = destination_group
+ end
+ end
+
+ let(:imported_project) { imported_group.projects.find { |project| project.name.include?("dri") }.reload! }
+ let(:branches) { imported_project.repository_branches(auto_paginate: true).map { |b| b[:name] } }
+ let(:commits) { imported_project.commits(auto_paginate: true).map { |c| c[:id] } }
+ let(:labels) { imported_project.labels(auto_paginate: true).map { |l| l.except(:id) } }
+ let(:milestones) { imported_project.milestones(auto_paginate: true).map { |ms| ms.except(:id, :web_url, :project_id) } }
+ let(:pipelines) { imported_project.pipelines.map { |pp| pp.except(:id, :web_url, :project_id) } }
+ let(:mrs) { fetch_mrs(imported_project, target_api_client) }
+ let(:issues) { fetch_issues(imported_project, target_api_client) }
+
+ before do
+ Runtime::Feature.enable(:bulk_import_projects)
+
+ destination_group.add_member(user, Resource::Members::AccessLevel::MAINTAINER)
+ end
+
+ # rubocop:disable RSpec/InstanceVariable
+ after do |example|
+ next unless defined?(@import_time)
+
+ # save data for creating the comparison notification
+ save_json(
+ "data",
+ {
+ importer: :gitlab,
+ import_time: @import_time,
+ source: {
+ name: "GitLab Source",
+ project_name: source_project.path_with_namespace,
+ data: {
+ branches: source_branches.length,
+ commits: source_commits.length,
+ labels: source_labels.length,
+ milestones: source_milestones.length,
+ pipelines: source_pipelines.length,
+ mrs: source_mrs.length,
+ mr_comments: source_mrs.sum { |_k, v| v[:comments].length },
+ issues: source_issues.length,
+ issue_comments: source_issues.sum { |_k, v| v[:comments].length }
+ }
+ },
+ target: {
+ name: "GitLab Target",
+ project_name: imported_project.path_with_namespace,
+ data: {
+ branches: branches.length,
+ commits: commits.length,
+ labels: labels.length,
+ milestones: milestones.length,
+ pipelines: pipelines.length,
+ mrs: mrs.length,
+ mr_comments: mrs.sum { |_k, v| v[:comments].length },
+ issues: issues.length,
+ issue_comments: issues.sum { |_k, v| v[:comments].length }
+ }
+ },
+ not_imported: {
+ mrs: @mr_diff,
+ issues: @issue_diff
+ }
+ }
+ )
+ end
+ # rubocop:enable RSpec/InstanceVariable
+
+ it "migrates large gitlab group via api" do
+ start = Time.now
+
+ # trigger import and log imported group path
+ logger.info("== Importing group '#{gitlab_group}' in to '#{imported_group.full_path}' ==")
+
+ # fetch all objects right after import has started
+ fetch_source_gitlab_objects
+
+ # wait for import to finish and save import time
+ logger.info("== Waiting for import to be finished ==")
+ expect { imported_group.import_status }.to eventually_eq('finished').within(import_wait_duration)
+ @import_time = Time.now - start
+
+ aggregate_failures do
+ verify_repository_import
+ verify_labels_import
+ verify_milestones_import
+ verify_pipelines_import
+ verify_merge_requests_import
+ verify_issues_import
+ end
+ end
+
+ # Fetch source project objects for comparison
+ #
+ # @return [void]
+ def fetch_source_gitlab_objects
+ logger.info("== Fetching source group objects ==")
+
+ source_branches
+ source_commits
+ source_labels
+ source_milestones
+ source_pipelines
+ source_mrs
+ source_issues
+ end
+
+ # Verify repository imported correctly
+ #
+ # @return [void]
+ def verify_repository_import
+ logger.info("== Verifying repository import ==")
+ expect(imported_project.description).to eq(source_project.description)
+ expect(branches).to match_array(source_branches)
+ expect(commits).to match_array(source_commits)
+ end
+
+ # Verify imported labels
+ #
+ # @return [void]
+ def verify_labels_import
+ logger.info("== Verifying label import ==")
+ expect(labels).to include(*source_labels)
+ end
+
+ # Verify milestones import
+ #
+ # @return [void]
+ def verify_milestones_import
+ logger.info("== Verifying milestones import ==")
+ expect(milestones).to match_array(source_milestones)
+ end
+
+ # Verify pipelines import
+ #
+ # @return [void]
+ def verify_pipelines_import
+ logger.info("== Verifying pipelines import ==")
+ expect(pipelines).to match_array(source_pipelines)
+ end
+
+ # Verify imported merge requests and mr issues
+ #
+ # @return [void]
+ def verify_merge_requests_import
+ logger.info("== Verifying merge request import ==")
+ @mr_diff = verify_mrs_or_issues('mr')
+ end
+
+ # Verify imported issues and issue comments
+ #
+ # @return [void]
+ def verify_issues_import
+ logger.info("== Verifying issue import ==")
+ @issue_diff = verify_mrs_or_issues('issue')
+ end
+
+ # Verify imported mrs or issues and return missing items
+ #
+ # @param [String] type verification object, 'mr' or 'issue'
+ # @return [Hash]
+ def verify_mrs_or_issues(type)
+ # Compare lengths to get an easy-to-read overview of how many objects are missing
+ #
+ expected = type == 'mr' ? source_mrs : source_issues
+ actual = type == 'mr' ? mrs : issues
+ count_msg = "Expected to contain same amount of #{type}s. Source: #{expected.length}, Target: #{actual.length}"
+ expect(actual.length).to eq(expected.length), count_msg
+
+ missing_comments = verify_comments(type, actual, expected)
+
+ {
+ "#{type}s": (expected.keys - actual.keys).map { |it| actual[it].slice(:title, :url) },
+ "#{type}_comments": missing_comments
+ }
+ end
+
+ # Verify imported comments
+ #
+ # @param [String] type verification object, 'mrs' or 'issues'
+ # @param [Hash] actual
+ # @param [Hash] expected
+ # @return [Hash]
+ def verify_comments(type, actual, expected)
+ actual.each_with_object([]) do |(key, actual_item), missing_comments|
+ expected_item = expected[key]
+ title = actual_item[:title]
+ msg = "expected #{type} with title '#{title}' to have"
+
+ # Print title in the error message to see which object is missing
+ #
+ expect(actual_item).to be_truthy, "#{msg} been imported"
+ next unless expected_item
+
+ # Print difference in the description
+ #
+ expected_body = expected_item[:body]
+ actual_body = actual_item[:body]
+ body_msg = <<~MSG
+ #{msg} same description. diff:\n#{differ.diff(expected_body, actual_body)}
+ MSG
+ expect(actual_body).to eq(expected_body), body_msg
+
+ # Print amount difference first
+ #
+ expected_comments = expected_item[:comments]
+ actual_comments = actual_item[:comments]
+ comment_count_msg = <<~MSG
+ #{msg} same amount of comments. Source: #{expected_comments.length}, Target: #{actual_comments.length}
+ MSG
+ expect(actual_comments.length).to eq(expected_comments.length), comment_count_msg
+ expect(actual_comments).to match_array(expected_comments)
+
+ # Save missing comments
+ #
+ comment_diff = expected_comments - actual_comments
+ next if comment_diff.empty?
+
+ missing_comments << {
+ title: title,
+ target_url: actual_item[:url],
+ source_url: expected_item[:url],
+ missing_comments: comment_diff
+ }
+ end
+ end
+
+ private
+
+ # Project merge requests with comments
+ #
+ # @param [QA::Resource::Project] project
+ # @param [Runtime::API::Client] client
+ # @return [Hash]
+ def fetch_mrs(project, client)
+ imported_mrs = project.merge_requests(auto_paginate: true, attempts: 2)
+
+ Parallel.map(imported_mrs, in_threads: 4) do |mr|
+ resource = Resource::MergeRequest.init do |resource|
+ resource.project = project
+ resource.iid = mr[:iid]
+ resource.api_client = client
+ end
+
+ [mr[:iid], {
+ url: mr[:web_url],
+ title: mr[:title],
+ body: sanitize_description(mr[:description]) || '',
+ comments: resource
+ .comments(auto_paginate: true, attempts: 2)
+ .map { |c| sanitize_comment(c[:body]) }
+ }]
+ end.to_h
+ end
+
+ # Project issues with comments
+ #
+ # @param [QA::Resource::Project] project
+ # @param [Runtime::API::Client] client
+ # @return [Hash]
+ def fetch_issues(project, client)
+ imported_issues = project.issues(auto_paginate: true, attempts: 2)
+
+ Parallel.map(imported_issues, in_threads: 4) do |issue|
+ resource = Resource::Issue.init do |issue_resource|
+ issue_resource.project = project
+ issue_resource.iid = issue[:iid]
+ issue_resource.api_client = client
+ end
+
+ [issue[:iid], {
+ url: issue[:web_url],
+ title: issue[:title],
+ body: sanitize_description(issue[:description]) || '',
+ comments: resource
+ .comments(auto_paginate: true, attempts: 2)
+ .map { |c| sanitize_comment(c[:body]) }
+ }]
+ end.to_h
+ end
+
+ # Importer user mention pattern
+ #
+ # @return [Regexp]
+ def created_by_pattern
+ @created_by_pattern ||= /\n\n \*By gitlab-migration on \S+ \(imported from GitLab\)\*/
+ end
+
+ # Remove added prefixes and legacy diff format from comments
+ #
+ # @param [String] body
+ # @return [String]
+ def sanitize_comment(body)
+ body&.gsub(created_by_pattern, "")
+ end
+
+ # Remove created by prefix from description
+ #
+ # @param [String] body
+ # @return [String]
+ def sanitize_description(body)
+ body&.gsub(created_by_pattern, "")
+ end
+
+ # Save json as file
+ #
+ # @param [String] name
+ # @param [Hash] json
+ # @return [void]
+ def save_json(name, json)
+ File.open("tmp/#{name}.json", "w") { |file| file.write(JSON.pretty_generate(json)) }
+ end
+ end
+ end
+end
+# rubocop:enable Rails/Pluck, Layout/LineLength, RSpec/MultipleMemoizedHelpers
diff --git a/qa/qa/specs/features/api/1_manage/migration/gitlab_project_migration_common.rb b/qa/qa/specs/features/api/1_manage/migration/gitlab_project_migration_common.rb
index 3d96ecc2990..5a9cef6fded 100644
--- a/qa/qa/specs/features/api/1_manage/migration/gitlab_project_migration_common.rb
+++ b/qa/qa/specs/features/api/1_manage/migration/gitlab_project_migration_common.rb
@@ -4,10 +4,10 @@ module QA
# Disable on live envs until bulk_import_projects toggle is on by default
# Otherwise tests running in parallel can disable feature in the middle of other test
RSpec.shared_context 'with gitlab project migration', requires_admin: 'creates a user via API',
- feature_flag: {
- name: 'bulk_import_projects',
- scope: :global
- } do
+ feature_flag: {
+ name: 'bulk_import_projects',
+ scope: :global
+ } do
let(:source_project_with_readme) { false }
let(:import_wait_duration) { { max_duration: 300, sleep_interval: 2 } }
let(:admin_api_client) { Runtime::API::Client.as_admin }
diff --git a/spec/db/schema_spec.rb b/spec/db/schema_spec.rb
index c53610f7d4c..a2941ff2d1a 100644
--- a/spec/db/schema_spec.rb
+++ b/spec/db/schema_spec.rb
@@ -180,18 +180,16 @@ RSpec.describe 'Database schema' do
'PrometheusAlert' => %w[operator]
}.freeze
- context 'for enums' do
- ApplicationRecord.descendants.each do |model|
- # skip model if it is an abstract class as it would not have an associated DB table
- next if model.abstract_class?
+ context 'for enums', :eager_load do
+ # skip model if it is an abstract class as it would not have an associated DB table
+ let(:models) { ApplicationRecord.descendants.reject(&:abstract_class?) }
- describe model do
- let(:ignored_enums) { ignored_limit_enums(model.name) }
- let(:enums) { model.defined_enums.keys - ignored_enums }
+ it 'uses smallint for enums in all models', :aggregate_failures do
+ models.each do |model|
+ ignored_enums = ignored_limit_enums(model.name)
+ enums = model.defined_enums.keys - ignored_enums
- it 'uses smallint for enums' do
- expect(model).to use_smallint_for_enums(enums)
- end
+ expect(model).to use_smallint_for_enums(enums)
end
end
end
diff --git a/spec/frontend/ide/components/new_dropdown/upload_spec.js b/spec/frontend/ide/components/new_dropdown/upload_spec.js
index 7303f81aad0..5a7419d6dce 100644
--- a/spec/frontend/ide/components/new_dropdown/upload_spec.js
+++ b/spec/frontend/ide/components/new_dropdown/upload_spec.js
@@ -69,25 +69,21 @@ describe('new dropdown upload', () => {
jest.spyOn(FileReader.prototype, 'readAsText');
});
- it('calls readAsText and creates file in plain text (without encoding) if the file content is plain text', (done) => {
+ it('calls readAsText and creates file in plain text (without encoding) if the file content is plain text', async () => {
const waitForCreate = new Promise((resolve) => vm.$on('create', resolve));
vm.createFile(textTarget, textFile);
expect(FileReader.prototype.readAsText).toHaveBeenCalledWith(textFile);
- waitForCreate
- .then(() => {
- expect(vm.$emit).toHaveBeenCalledWith('create', {
- name: textFile.name,
- type: 'blob',
- content: 'plain text',
- rawPath: '',
- mimeType: 'test/mime-text',
- });
- })
- .then(done)
- .catch(done.fail);
+ await waitForCreate;
+ expect(vm.$emit).toHaveBeenCalledWith('create', {
+ name: textFile.name,
+ type: 'blob',
+ content: 'plain text',
+ rawPath: '',
+ mimeType: 'test/mime-text',
+ });
});
it('creates a blob URL for the content if binary', () => {
diff --git a/spec/frontend/ide/stores/actions/merge_request_spec.js b/spec/frontend/ide/stores/actions/merge_request_spec.js
index e62811a4517..5592e2664c4 100644
--- a/spec/frontend/ide/stores/actions/merge_request_spec.js
+++ b/spec/frontend/ide/stores/actions/merge_request_spec.js
@@ -63,56 +63,47 @@ describe('IDE store merge request actions', () => {
mock.onGet(/api\/(.*)\/projects\/abcproject\/merge_requests/).reply(200, mockData);
});
- it('calls getProjectMergeRequests service method', (done) => {
- store
- .dispatch('getMergeRequestsForBranch', { projectId: TEST_PROJECT, branchId: 'bar' })
- .then(() => {
- expect(service.getProjectMergeRequests).toHaveBeenCalledWith(TEST_PROJECT, {
- source_branch: 'bar',
- source_project_id: TEST_PROJECT_ID,
- state: 'opened',
- order_by: 'created_at',
- per_page: 1,
- });
-
- done();
- })
- .catch(done.fail);
+ it('calls getProjectMergeRequests service method', async () => {
+ await store.dispatch('getMergeRequestsForBranch', {
+ projectId: TEST_PROJECT,
+ branchId: 'bar',
+ });
+ expect(service.getProjectMergeRequests).toHaveBeenCalledWith(TEST_PROJECT, {
+ source_branch: 'bar',
+ source_project_id: TEST_PROJECT_ID,
+ state: 'opened',
+ order_by: 'created_at',
+ per_page: 1,
+ });
});
- it('sets the "Merge Request" Object', (done) => {
- store
- .dispatch('getMergeRequestsForBranch', { projectId: TEST_PROJECT, branchId: 'bar' })
- .then(() => {
- expect(store.state.projects.abcproject.mergeRequests).toEqual({
- 2: expect.objectContaining(mrData),
- });
- done();
- })
- .catch(done.fail);
+ it('sets the "Merge Request" Object', async () => {
+ await store.dispatch('getMergeRequestsForBranch', {
+ projectId: TEST_PROJECT,
+ branchId: 'bar',
+ });
+ expect(store.state.projects.abcproject.mergeRequests).toEqual({
+ 2: expect.objectContaining(mrData),
+ });
});
- it('sets "Current Merge Request" object to the most recent MR', (done) => {
- store
- .dispatch('getMergeRequestsForBranch', { projectId: TEST_PROJECT, branchId: 'bar' })
- .then(() => {
- expect(store.state.currentMergeRequestId).toEqual('2');
- done();
- })
- .catch(done.fail);
+ it('sets "Current Merge Request" object to the most recent MR', async () => {
+ await store.dispatch('getMergeRequestsForBranch', {
+ projectId: TEST_PROJECT,
+ branchId: 'bar',
+ });
+ expect(store.state.currentMergeRequestId).toEqual('2');
});
- it('does nothing if user cannot read MRs', (done) => {
+ it('does nothing if user cannot read MRs', async () => {
store.state.projects[TEST_PROJECT].userPermissions[PERMISSION_READ_MR] = false;
- store
- .dispatch('getMergeRequestsForBranch', { projectId: TEST_PROJECT, branchId: 'bar' })
- .then(() => {
- expect(service.getProjectMergeRequests).not.toHaveBeenCalled();
- expect(store.state.currentMergeRequestId).toBe('');
- })
- .then(done)
- .catch(done.fail);
+ await store.dispatch('getMergeRequestsForBranch', {
+ projectId: TEST_PROJECT,
+ branchId: 'bar',
+ });
+ expect(service.getProjectMergeRequests).not.toHaveBeenCalled();
+ expect(store.state.currentMergeRequestId).toBe('');
});
});
@@ -122,15 +113,13 @@ describe('IDE store merge request actions', () => {
mock.onGet(/api\/(.*)\/projects\/abcproject\/merge_requests/).reply(200, []);
});
- it('does not fail if there are no merge requests for current branch', (done) => {
- store
- .dispatch('getMergeRequestsForBranch', { projectId: TEST_PROJECT, branchId: 'foo' })
- .then(() => {
- expect(store.state.projects[TEST_PROJECT].mergeRequests).toEqual({});
- expect(store.state.currentMergeRequestId).toEqual('');
- done();
- })
- .catch(done.fail);
+ it('does not fail if there are no merge requests for current branch', async () => {
+ await store.dispatch('getMergeRequestsForBranch', {
+ projectId: TEST_PROJECT,
+ branchId: 'foo',
+ });
+ expect(store.state.projects[TEST_PROJECT].mergeRequests).toEqual({});
+ expect(store.state.currentMergeRequestId).toEqual('');
});
});
});
@@ -140,17 +129,18 @@ describe('IDE store merge request actions', () => {
mock.onGet(/api\/(.*)\/projects\/abcproject\/merge_requests/).networkError();
});
- it('flashes message, if error', (done) => {
- store
- .dispatch('getMergeRequestsForBranch', { projectId: TEST_PROJECT, branchId: 'bar' })
+ it('flashes message, if error', () => {
+ return store
+ .dispatch('getMergeRequestsForBranch', {
+ projectId: TEST_PROJECT,
+ branchId: 'bar',
+ })
.catch(() => {
expect(createFlash).toHaveBeenCalled();
expect(createFlash.mock.calls[0][0].message).toBe(
'Error fetching merge requests for bar',
);
- })
- .then(done)
- .catch(done.fail);
+ });
});
});
});
@@ -165,29 +155,15 @@ describe('IDE store merge request actions', () => {
.reply(200, { title: 'mergerequest' });
});
- it('calls getProjectMergeRequestData service method', (done) => {
- store
- .dispatch('getMergeRequestData', { projectId: TEST_PROJECT, mergeRequestId: 1 })
- .then(() => {
- expect(service.getProjectMergeRequestData).toHaveBeenCalledWith(TEST_PROJECT, 1);
-
- done();
- })
- .catch(done.fail);
+ it('calls getProjectMergeRequestData service method', async () => {
+ await store.dispatch('getMergeRequestData', { projectId: TEST_PROJECT, mergeRequestId: 1 });
+ expect(service.getProjectMergeRequestData).toHaveBeenCalledWith(TEST_PROJECT, 1);
});
- it('sets the Merge Request Object', (done) => {
- store
- .dispatch('getMergeRequestData', { projectId: TEST_PROJECT, mergeRequestId: 1 })
- .then(() => {
- expect(store.state.currentMergeRequestId).toBe(1);
- expect(store.state.projects[TEST_PROJECT].mergeRequests['1'].title).toBe(
- 'mergerequest',
- );
-
- done();
- })
- .catch(done.fail);
+ it('sets the Merge Request Object', async () => {
+ await store.dispatch('getMergeRequestData', { projectId: TEST_PROJECT, mergeRequestId: 1 });
+ expect(store.state.currentMergeRequestId).toBe(1);
+ expect(store.state.projects[TEST_PROJECT].mergeRequests['1'].title).toBe('mergerequest');
});
});
@@ -196,32 +172,28 @@ describe('IDE store merge request actions', () => {
mock.onGet(/api\/(.*)\/projects\/abcproject\/merge_requests\/1/).networkError();
});
- it('dispatches error action', (done) => {
+ it('dispatches error action', () => {
const dispatch = jest.fn();
- getMergeRequestData(
+ return getMergeRequestData(
{
commit() {},
dispatch,
state: store.state,
},
{ projectId: TEST_PROJECT, mergeRequestId: 1 },
- )
- .then(done.fail)
- .catch(() => {
- expect(dispatch).toHaveBeenCalledWith('setErrorMessage', {
- text: 'An error occurred while loading the merge request.',
- action: expect.any(Function),
- actionText: 'Please try again',
- actionPayload: {
- projectId: TEST_PROJECT,
- mergeRequestId: 1,
- force: false,
- },
- });
-
- done();
+ ).catch(() => {
+ expect(dispatch).toHaveBeenCalledWith('setErrorMessage', {
+ text: 'An error occurred while loading the merge request.',
+ action: expect.any(Function),
+ actionText: 'Please try again',
+ actionPayload: {
+ projectId: TEST_PROJECT,
+ mergeRequestId: 1,
+ force: false,
+ },
});
+ });
});
});
});
@@ -240,27 +212,22 @@ describe('IDE store merge request actions', () => {
.reply(200, { title: 'mergerequest' });
});
- it('calls getProjectMergeRequestChanges service method', (done) => {
- store
- .dispatch('getMergeRequestChanges', { projectId: TEST_PROJECT, mergeRequestId: 1 })
- .then(() => {
- expect(service.getProjectMergeRequestChanges).toHaveBeenCalledWith(TEST_PROJECT, 1);
-
- done();
- })
- .catch(done.fail);
+ it('calls getProjectMergeRequestChanges service method', async () => {
+ await store.dispatch('getMergeRequestChanges', {
+ projectId: TEST_PROJECT,
+ mergeRequestId: 1,
+ });
+ expect(service.getProjectMergeRequestChanges).toHaveBeenCalledWith(TEST_PROJECT, 1);
});
- it('sets the Merge Request Changes Object', (done) => {
- store
- .dispatch('getMergeRequestChanges', { projectId: TEST_PROJECT, mergeRequestId: 1 })
- .then(() => {
- expect(store.state.projects[TEST_PROJECT].mergeRequests['1'].changes.title).toBe(
- 'mergerequest',
- );
- done();
- })
- .catch(done.fail);
+ it('sets the Merge Request Changes Object', async () => {
+ await store.dispatch('getMergeRequestChanges', {
+ projectId: TEST_PROJECT,
+ mergeRequestId: 1,
+ });
+ expect(store.state.projects[TEST_PROJECT].mergeRequests['1'].changes.title).toBe(
+ 'mergerequest',
+ );
});
});
@@ -269,32 +236,30 @@ describe('IDE store merge request actions', () => {
mock.onGet(/api\/(.*)\/projects\/abcproject\/merge_requests\/1\/changes/).networkError();
});
- it('dispatches error action', (done) => {
+ it('dispatches error action', async () => {
const dispatch = jest.fn();
- getMergeRequestChanges(
- {
- commit() {},
- dispatch,
- state: store.state,
+ await expect(
+ getMergeRequestChanges(
+ {
+ commit() {},
+ dispatch,
+ state: store.state,
+ },
+ { projectId: TEST_PROJECT, mergeRequestId: 1 },
+ ),
+ ).rejects.toEqual(new Error('Merge request changes not loaded abcproject'));
+
+ expect(dispatch).toHaveBeenCalledWith('setErrorMessage', {
+ text: 'An error occurred while loading the merge request changes.',
+ action: expect.any(Function),
+ actionText: 'Please try again',
+ actionPayload: {
+ projectId: TEST_PROJECT,
+ mergeRequestId: 1,
+ force: false,
},
- { projectId: TEST_PROJECT, mergeRequestId: 1 },
- )
- .then(done.fail)
- .catch(() => {
- expect(dispatch).toHaveBeenCalledWith('setErrorMessage', {
- text: 'An error occurred while loading the merge request changes.',
- action: expect.any(Function),
- actionText: 'Please try again',
- actionPayload: {
- projectId: TEST_PROJECT,
- mergeRequestId: 1,
- force: false,
- },
- });
-
- done();
- });
+ });
});
});
});
@@ -312,25 +277,20 @@ describe('IDE store merge request actions', () => {
jest.spyOn(service, 'getProjectMergeRequestVersions');
});
- it('calls getProjectMergeRequestVersions service method', (done) => {
- store
- .dispatch('getMergeRequestVersions', { projectId: TEST_PROJECT, mergeRequestId: 1 })
- .then(() => {
- expect(service.getProjectMergeRequestVersions).toHaveBeenCalledWith(TEST_PROJECT, 1);
-
- done();
- })
- .catch(done.fail);
+ it('calls getProjectMergeRequestVersions service method', async () => {
+ await store.dispatch('getMergeRequestVersions', {
+ projectId: TEST_PROJECT,
+ mergeRequestId: 1,
+ });
+ expect(service.getProjectMergeRequestVersions).toHaveBeenCalledWith(TEST_PROJECT, 1);
});
- it('sets the Merge Request Versions Object', (done) => {
- store
- .dispatch('getMergeRequestVersions', { projectId: TEST_PROJECT, mergeRequestId: 1 })
- .then(() => {
- expect(store.state.projects[TEST_PROJECT].mergeRequests['1'].versions.length).toBe(1);
- done();
- })
- .catch(done.fail);
+ it('sets the Merge Request Versions Object', async () => {
+ await store.dispatch('getMergeRequestVersions', {
+ projectId: TEST_PROJECT,
+ mergeRequestId: 1,
+ });
+ expect(store.state.projects[TEST_PROJECT].mergeRequests['1'].versions.length).toBe(1);
});
});
@@ -339,32 +299,28 @@ describe('IDE store merge request actions', () => {
mock.onGet(/api\/(.*)\/projects\/abcproject\/merge_requests\/1\/versions/).networkError();
});
- it('dispatches error action', (done) => {
+ it('dispatches error action', () => {
const dispatch = jest.fn();
- getMergeRequestVersions(
+ return getMergeRequestVersions(
{
commit() {},
dispatch,
state: store.state,
},
{ projectId: TEST_PROJECT, mergeRequestId: 1 },
- )
- .then(done.fail)
- .catch(() => {
- expect(dispatch).toHaveBeenCalledWith('setErrorMessage', {
- text: 'An error occurred while loading the merge request version data.',
- action: expect.any(Function),
- actionText: 'Please try again',
- actionPayload: {
- projectId: TEST_PROJECT,
- mergeRequestId: 1,
- force: false,
- },
- });
-
- done();
+ ).catch(() => {
+ expect(dispatch).toHaveBeenCalledWith('setErrorMessage', {
+ text: 'An error occurred while loading the merge request version data.',
+ action: expect.any(Function),
+ actionText: 'Please try again',
+ actionPayload: {
+ projectId: TEST_PROJECT,
+ mergeRequestId: 1,
+ force: false,
+ },
});
+ });
});
});
});
@@ -503,37 +459,36 @@ describe('IDE store merge request actions', () => {
);
});
- it('dispatches actions for merge request data', (done) => {
- openMergeRequest({ state: store.state, dispatch: store.dispatch, getters: mockGetters }, mr)
- .then(() => {
- expect(store.dispatch.mock.calls).toEqual([
- ['getMergeRequestData', mr],
- ['setCurrentBranchId', testMergeRequest.source_branch],
- [
- 'getBranchData',
- {
- projectId: mr.projectId,
- branchId: testMergeRequest.source_branch,
- },
- ],
- [
- 'getFiles',
- {
- projectId: mr.projectId,
- branchId: testMergeRequest.source_branch,
- ref: 'abcd2322',
- },
- ],
- ['getMergeRequestVersions', mr],
- ['getMergeRequestChanges', mr],
- ['openMergeRequestChanges', testMergeRequestChanges.changes],
- ]);
- })
- .then(done)
- .catch(done.fail);
+ it('dispatches actions for merge request data', async () => {
+ await openMergeRequest(
+ { state: store.state, dispatch: store.dispatch, getters: mockGetters },
+ mr,
+ );
+ expect(store.dispatch.mock.calls).toEqual([
+ ['getMergeRequestData', mr],
+ ['setCurrentBranchId', testMergeRequest.source_branch],
+ [
+ 'getBranchData',
+ {
+ projectId: mr.projectId,
+ branchId: testMergeRequest.source_branch,
+ },
+ ],
+ [
+ 'getFiles',
+ {
+ projectId: mr.projectId,
+ branchId: testMergeRequest.source_branch,
+ ref: 'abcd2322',
+ },
+ ],
+ ['getMergeRequestVersions', mr],
+ ['getMergeRequestChanges', mr],
+ ['openMergeRequestChanges', testMergeRequestChanges.changes],
+ ]);
});
- it('updates activity bar view and gets file data, if changes are found', (done) => {
+ it('updates activity bar view and gets file data, if changes are found', async () => {
store.state.entries.foo = {
type: 'blob',
path: 'foo',
@@ -548,28 +503,24 @@ describe('IDE store merge request actions', () => {
{ new_path: 'bar', path: 'bar' },
];
- openMergeRequest({ state: store.state, dispatch: store.dispatch, getters: mockGetters }, mr)
- .then(() => {
- expect(store.dispatch).toHaveBeenCalledWith(
- 'openMergeRequestChanges',
- testMergeRequestChanges.changes,
- );
- })
- .then(done)
- .catch(done.fail);
+ await openMergeRequest(
+ { state: store.state, dispatch: store.dispatch, getters: mockGetters },
+ mr,
+ );
+ expect(store.dispatch).toHaveBeenCalledWith(
+ 'openMergeRequestChanges',
+ testMergeRequestChanges.changes,
+ );
});
- it('flashes message, if error', (done) => {
+ it('flashes message, if error', () => {
store.dispatch.mockRejectedValue();
- openMergeRequest(store, mr)
- .catch(() => {
- expect(createFlash).toHaveBeenCalledWith({
- message: expect.any(String),
- });
- })
- .then(done)
- .catch(done.fail);
+ return openMergeRequest(store, mr).catch(() => {
+ expect(createFlash).toHaveBeenCalledWith({
+ message: expect.any(String),
+ });
+ });
});
});
});
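Note: the conversions in this file, and in the spec files that follow, all apply the same mechanical pattern: a Jest test that threaded `done`/`done.fail` through a promise chain becomes an `async` test that awaits the dispatch and asserts afterwards. A minimal self-contained sketch of the before/after shape (the `fetchData` stand-in is illustrative, not part of this diff):

    // stand-in async operation so the sketch runs on its own
    const fetchData = () => Promise.resolve(['a', 'b']);

    // before: callback style, where failures must be routed to done.fail by hand
    it('loads data (callback style)', (done) => {
      fetchData()
        .then((data) => {
          expect(data).toEqual(['a', 'b']);
          done();
        })
        .catch(done.fail);
    });

    // after: async/await, where a rejected promise fails the test automatically
    it('loads data (async style)', async () => {
      const data = await fetchData();
      expect(data).toEqual(['a', 'b']);
    });

The async form also removes a class of silent failures where a thrown assertion inside `.then()` was swallowed unless `.catch(done.fail)` was wired up correctly.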
diff --git a/spec/frontend/ide/stores/actions/project_spec.js b/spec/frontend/ide/stores/actions/project_spec.js
index e07dcf22860..cc7d39b4d43 100644
--- a/spec/frontend/ide/stores/actions/project_spec.js
+++ b/spec/frontend/ide/stores/actions/project_spec.js
@@ -146,22 +146,16 @@ describe('IDE store project actions', () => {
});
});
- it('calls the service', (done) => {
- store
- .dispatch('refreshLastCommitData', {
- projectId: store.state.currentProjectId,
- branchId: store.state.currentBranchId,
- })
- .then(() => {
- expect(service.getBranchData).toHaveBeenCalledWith('abc/def', 'main');
-
- done();
- })
- .catch(done.fail);
+ it('calls the service', async () => {
+ await store.dispatch('refreshLastCommitData', {
+ projectId: store.state.currentProjectId,
+ branchId: store.state.currentBranchId,
+ });
+ expect(service.getBranchData).toHaveBeenCalledWith('abc/def', 'main');
});
- it('commits getBranchData', (done) => {
- testAction(
+ it('commits getBranchData', () => {
+ return testAction(
refreshLastCommitData,
{
projectId: store.state.currentProjectId,
@@ -181,14 +175,13 @@ describe('IDE store project actions', () => {
],
// action
[],
- done,
);
});
});
describe('showBranchNotFoundError', () => {
- it('dispatches setErrorMessage', (done) => {
- testAction(
+ it('dispatches setErrorMessage', () => {
+ return testAction(
showBranchNotFoundError,
'main',
null,
@@ -204,7 +197,6 @@ describe('IDE store project actions', () => {
},
},
],
- done,
);
});
});
@@ -216,8 +208,8 @@ describe('IDE store project actions', () => {
jest.spyOn(api, 'createBranch').mockResolvedValue();
});
- it('calls API', (done) => {
- createNewBranchFromDefault(
+ it('calls API', async () => {
+ await createNewBranchFromDefault(
{
state: {
currentProjectId: 'project-path',
@@ -230,21 +222,17 @@ describe('IDE store project actions', () => {
dispatch() {},
},
'new-branch-name',
- )
- .then(() => {
- expect(api.createBranch).toHaveBeenCalledWith('project-path', {
- ref: 'main',
- branch: 'new-branch-name',
- });
- })
- .then(done)
- .catch(done.fail);
+ );
+ expect(api.createBranch).toHaveBeenCalledWith('project-path', {
+ ref: 'main',
+ branch: 'new-branch-name',
+ });
});
- it('clears error message', (done) => {
+ it('clears error message', async () => {
const dispatchSpy = jest.fn().mockName('dispatch');
- createNewBranchFromDefault(
+ await createNewBranchFromDefault(
{
state: {
currentProjectId: 'project-path',
@@ -257,16 +245,12 @@ describe('IDE store project actions', () => {
dispatch: dispatchSpy,
},
'new-branch-name',
- )
- .then(() => {
- expect(dispatchSpy).toHaveBeenCalledWith('setErrorMessage', null);
- })
- .then(done)
- .catch(done.fail);
+ );
+ expect(dispatchSpy).toHaveBeenCalledWith('setErrorMessage', null);
});
- it('reloads window', (done) => {
- createNewBranchFromDefault(
+ it('reloads window', async () => {
+ await createNewBranchFromDefault(
{
state: {
currentProjectId: 'project-path',
@@ -279,18 +263,14 @@ describe('IDE store project actions', () => {
dispatch() {},
},
'new-branch-name',
- )
- .then(() => {
- expect(window.location.reload).toHaveBeenCalled();
- })
- .then(done)
- .catch(done.fail);
+ );
+ expect(window.location.reload).toHaveBeenCalled();
});
});
describe('loadEmptyBranch', () => {
- it('creates a blank tree and sets loading state to false', (done) => {
- testAction(
+ it('creates a blank tree and sets loading state to false', () => {
+ return testAction(
loadEmptyBranch,
{ projectId: TEST_PROJECT_ID, branchId: 'main' },
store.state,
@@ -302,20 +282,18 @@ describe('IDE store project actions', () => {
},
],
expect.any(Object),
- done,
);
});
- it('does nothing, if tree already exists', (done) => {
+ it('does nothing, if tree already exists', () => {
const trees = { [`${TEST_PROJECT_ID}/main`]: [] };
- testAction(
+ return testAction(
loadEmptyBranch,
{ projectId: TEST_PROJECT_ID, branchId: 'main' },
{ trees },
[],
[],
- done,
);
});
});
@@ -372,56 +350,48 @@ describe('IDE store project actions', () => {
const branchId = '123-lorem';
const ref = 'abcd2322';
- it('when empty repo, loads empty branch', (done) => {
+ it('when empty repo, loads empty branch', () => {
const mockGetters = { emptyRepo: true };
- testAction(
+ return testAction(
loadBranch,
{ projectId, branchId },
{ ...store.state, ...mockGetters },
[],
[{ type: 'loadEmptyBranch', payload: { projectId, branchId } }],
- done,
);
});
- it('when branch already exists, does nothing', (done) => {
+ it('when branch already exists, does nothing', () => {
store.state.projects[projectId].branches[branchId] = {};
- testAction(loadBranch, { projectId, branchId }, store.state, [], [], done);
+ return testAction(loadBranch, { projectId, branchId }, store.state, [], []);
});
- it('fetches branch data', (done) => {
+ it('fetches branch data', async () => {
const mockGetters = { findBranch: () => ({ commit: { id: ref } }) };
jest.spyOn(store, 'dispatch').mockResolvedValue();
- loadBranch(
+ await loadBranch(
{ getters: mockGetters, state: store.state, dispatch: store.dispatch },
{ projectId, branchId },
- )
- .then(() => {
- expect(store.dispatch.mock.calls).toEqual([
- ['getBranchData', { projectId, branchId }],
- ['getMergeRequestsForBranch', { projectId, branchId }],
- ['getFiles', { projectId, branchId, ref }],
- ]);
- })
- .then(done)
- .catch(done.fail);
+ );
+ expect(store.dispatch.mock.calls).toEqual([
+ ['getBranchData', { projectId, branchId }],
+ ['getMergeRequestsForBranch', { projectId, branchId }],
+ ['getFiles', { projectId, branchId, ref }],
+ ]);
});
- it('shows an error if branch can not be fetched', (done) => {
+ it('shows an error if branch can not be fetched', async () => {
jest.spyOn(store, 'dispatch').mockReturnValue(Promise.reject());
- loadBranch(store, { projectId, branchId })
- .then(done.fail)
- .catch(() => {
- expect(store.dispatch.mock.calls).toEqual([
- ['getBranchData', { projectId, branchId }],
- ['showBranchNotFoundError', branchId],
- ]);
- done();
- });
+ await expect(loadBranch(store, { projectId, branchId })).rejects.toBeUndefined();
+
+ expect(store.dispatch.mock.calls).toEqual([
+ ['getBranchData', { projectId, branchId }],
+ ['showBranchNotFoundError', branchId],
+ ]);
});
});
@@ -449,17 +419,13 @@ describe('IDE store project actions', () => {
jest.spyOn(store, 'dispatch').mockResolvedValue();
});
- it('dispatches branch actions', (done) => {
- openBranch(store, branch)
- .then(() => {
- expect(store.dispatch.mock.calls).toEqual([
- ['setCurrentBranchId', branchId],
- ['loadBranch', { projectId, branchId }],
- ['loadFile', { basePath: undefined }],
- ]);
- })
- .then(done)
- .catch(done.fail);
+ it('dispatches branch actions', async () => {
+ await openBranch(store, branch);
+ expect(store.dispatch.mock.calls).toEqual([
+ ['setCurrentBranchId', branchId],
+ ['loadBranch', { projectId, branchId }],
+ ['loadFile', { basePath: undefined }],
+ ]);
});
});
@@ -468,22 +434,18 @@ describe('IDE store project actions', () => {
jest.spyOn(store, 'dispatch').mockReturnValue(Promise.reject());
});
- it('dispatches correct branch actions', (done) => {
- openBranch(store, branch)
- .then((val) => {
- expect(store.dispatch.mock.calls).toEqual([
- ['setCurrentBranchId', branchId],
- ['loadBranch', { projectId, branchId }],
- ]);
-
- expect(val).toEqual(
- new Error(
- `An error occurred while getting files for - <strong>${projectId}/${branchId}</strong>`,
- ),
- );
- })
- .then(done)
- .catch(done.fail);
+ it('dispatches correct branch actions', async () => {
+ const val = await openBranch(store, branch);
+ expect(store.dispatch.mock.calls).toEqual([
+ ['setCurrentBranchId', branchId],
+ ['loadBranch', { projectId, branchId }],
+ ]);
+
+ expect(val).toEqual(
+ new Error(
+ `An error occurred while getting files for - <strong>${projectId}/${branchId}</strong>`,
+ ),
+ );
});
});
});
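Note: the error-path tests above replace the `.then(done.fail).catch(...)` idiom with `await expect(...).rejects`, which both asserts that the promise rejected and fails the test if it unexpectedly resolved. A sketch of that conversion (the `failingAction` stand-in is illustrative):

    // stand-in action that always rejects, so the sketch is self-contained
    const failingAction = () => Promise.reject(new Error('boom'));

    // before: the resolution path had to be failed explicitly
    it('propagates the error (callback style)', (done) => {
      failingAction()
        .then(done.fail)
        .catch((e) => {
          expect(e.message).toBe('boom');
          done();
        });
    });

    // after: rejects asserts the failure and unwraps the reason in one step
    it('propagates the error (async style)', async () => {
      await expect(failingAction()).rejects.toThrow('boom');
    });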
diff --git a/spec/frontend/ide/stores/actions/tree_spec.js b/spec/frontend/ide/stores/actions/tree_spec.js
index 8d7328725e9..fc44cbb21ae 100644
--- a/spec/frontend/ide/stores/actions/tree_spec.js
+++ b/spec/frontend/ide/stores/actions/tree_spec.js
@@ -62,27 +62,21 @@ describe('Multi-file store tree actions', () => {
});
});
- it('adds data into tree', (done) => {
- store
- .dispatch('getFiles', basicCallParameters)
- .then(() => {
- projectTree = store.state.trees['abcproject/main'];
-
- expect(projectTree.tree.length).toBe(2);
- expect(projectTree.tree[0].type).toBe('tree');
- expect(projectTree.tree[0].tree[1].name).toBe('fileinfolder.js');
- expect(projectTree.tree[1].type).toBe('blob');
- expect(projectTree.tree[0].tree[0].tree[0].type).toBe('blob');
- expect(projectTree.tree[0].tree[0].tree[0].name).toBe('fileinsubfolder.js');
-
- done();
- })
- .catch(done.fail);
+ it('adds data into tree', async () => {
+ await store.dispatch('getFiles', basicCallParameters);
+ projectTree = store.state.trees['abcproject/main'];
+
+ expect(projectTree.tree.length).toBe(2);
+ expect(projectTree.tree[0].type).toBe('tree');
+ expect(projectTree.tree[0].tree[1].name).toBe('fileinfolder.js');
+ expect(projectTree.tree[1].type).toBe('blob');
+ expect(projectTree.tree[0].tree[0].tree[0].type).toBe('blob');
+ expect(projectTree.tree[0].tree[0].tree[0].name).toBe('fileinsubfolder.js');
});
});
describe('error', () => {
- it('dispatches error action', (done) => {
+ it('dispatches error action', async () => {
const dispatch = jest.fn();
store.state.projects = {
@@ -103,28 +97,26 @@ describe('Multi-file store tree actions', () => {
mock.onGet(/(.*)/).replyOnce(500);
- getFiles(
- {
- commit() {},
- dispatch,
- state: store.state,
- getters,
- },
- {
- projectId: 'abc/def',
- branchId: 'main-testing',
- },
- )
- .then(done.fail)
- .catch(() => {
- expect(dispatch).toHaveBeenCalledWith('setErrorMessage', {
- text: 'An error occurred while loading all the files.',
- action: expect.any(Function),
- actionText: 'Please try again',
- actionPayload: { projectId: 'abc/def', branchId: 'main-testing' },
- });
- done();
- });
+ await expect(
+ getFiles(
+ {
+ commit() {},
+ dispatch,
+ state: store.state,
+ getters,
+ },
+ {
+ projectId: 'abc/def',
+ branchId: 'main-testing',
+ },
+ ),
+ ).rejects.toEqual(new Error('Request failed with status code 500'));
+ expect(dispatch).toHaveBeenCalledWith('setErrorMessage', {
+ text: 'An error occurred while loading all the files.',
+ action: expect.any(Function),
+ actionText: 'Please try again',
+ actionPayload: { projectId: 'abc/def', branchId: 'main-testing' },
+ });
});
});
});
@@ -137,15 +129,9 @@ describe('Multi-file store tree actions', () => {
store.state.entries[tree.path] = tree;
});
- it('toggles the tree open', (done) => {
- store
- .dispatch('toggleTreeOpen', tree.path)
- .then(() => {
- expect(tree.opened).toBeTruthy();
-
- done();
- })
- .catch(done.fail);
+ it('toggles the tree open', async () => {
+ await store.dispatch('toggleTreeOpen', tree.path);
+ expect(tree.opened).toBeTruthy();
});
});
@@ -163,24 +149,23 @@ describe('Multi-file store tree actions', () => {
Object.assign(store.state.entries, createEntriesFromPaths(paths));
});
- it('opens the parents', (done) => {
- testAction(
+ it('opens the parents', () => {
+ return testAction(
showTreeEntry,
'grandparent/parent/child.txt',
store.state,
[{ type: types.SET_TREE_OPEN, payload: 'grandparent/parent' }],
[{ type: 'showTreeEntry', payload: 'grandparent/parent' }],
- done,
);
});
});
describe('setDirectoryData', () => {
- it('sets tree correctly if there are no opened files yet', (done) => {
+ it('sets tree correctly if there are no opened files yet', () => {
const treeFile = file({ name: 'README.md' });
store.state.trees['abcproject/main'] = {};
- testAction(
+ return testAction(
setDirectoryData,
{ projectId: 'abcproject', branchId: 'main', treeList: [treeFile] },
store.state,
@@ -201,7 +186,6 @@ describe('Multi-file store tree actions', () => {
},
],
[],
- done,
);
});
});
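Note: dropping the trailing `done` argument from `testAction` relies on the helper returning a promise in that case; returning that promise from the `it` block is what makes Jest wait for the helper's own assertions. A sketch with a hypothetical `increment` action, assuming the `helpers/vuex_action_helper` import path these specs conventionally use:

    import testAction from 'helpers/vuex_action_helper';

    // hypothetical Vuex action, used only for illustration
    const increment = ({ commit }, amount) => commit('ADD', amount);

    it('commits ADD', () => {
      // with no done callback, testAction returns a promise;
      // returning it lets Jest treat a failed expectation as a rejection
      return testAction(increment, 5, { count: 0 }, [{ type: 'ADD', payload: 5 }], []);
    });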
diff --git a/spec/frontend/ide/stores/actions_spec.js b/spec/frontend/ide/stores/actions_spec.js
index be43c618095..3889c4f11c3 100644
--- a/spec/frontend/ide/stores/actions_spec.js
+++ b/spec/frontend/ide/stores/actions_spec.js
@@ -43,15 +43,9 @@ describe('Multi-file store actions', () => {
});
describe('redirectToUrl', () => {
- it('calls visitUrl', (done) => {
- store
- .dispatch('redirectToUrl', 'test')
- .then(() => {
- expect(visitUrl).toHaveBeenCalledWith('test');
-
- done();
- })
- .catch(done.fail);
+ it('calls visitUrl', async () => {
+ await store.dispatch('redirectToUrl', 'test');
+ expect(visitUrl).toHaveBeenCalledWith('test');
});
});
@@ -89,15 +83,10 @@ describe('Multi-file store actions', () => {
expect(store.dispatch.mock.calls).toEqual(expect.arrayContaining(expectedCalls));
});
- it('removes all files from changedFiles state', (done) => {
- store
- .dispatch('discardAllChanges')
- .then(() => {
- expect(store.state.changedFiles.length).toBe(0);
- expect(store.state.openFiles.length).toBe(2);
- })
- .then(done)
- .catch(done.fail);
+ it('removes all files from changedFiles state', async () => {
+ await store.dispatch('discardAllChanges');
+ expect(store.state.changedFiles.length).toBe(0);
+ expect(store.state.openFiles.length).toBe(2);
});
});
@@ -121,24 +110,18 @@ describe('Multi-file store actions', () => {
});
describe('tree', () => {
- it('creates temp tree', (done) => {
- store
- .dispatch('createTempEntry', {
- name: 'test',
- type: 'tree',
- })
- .then(() => {
- const entry = store.state.entries.test;
-
- expect(entry).not.toBeNull();
- expect(entry.type).toBe('tree');
+ it('creates temp tree', async () => {
+ await store.dispatch('createTempEntry', {
+ name: 'test',
+ type: 'tree',
+ });
+ const entry = store.state.entries.test;
- done();
- })
- .catch(done.fail);
+ expect(entry).not.toBeNull();
+ expect(entry.type).toBe('tree');
});
- it('creates new folder inside another tree', (done) => {
+ it('creates new folder inside another tree', async () => {
const tree = {
type: 'tree',
name: 'testing',
@@ -148,22 +131,16 @@ describe('Multi-file store actions', () => {
store.state.entries[tree.path] = tree;
- store
- .dispatch('createTempEntry', {
- name: 'testing/test',
- type: 'tree',
- })
- .then(() => {
- expect(tree.tree[0].tempFile).toBeTruthy();
- expect(tree.tree[0].name).toBe('test');
- expect(tree.tree[0].type).toBe('tree');
-
- done();
- })
- .catch(done.fail);
+ await store.dispatch('createTempEntry', {
+ name: 'testing/test',
+ type: 'tree',
+ });
+ expect(tree.tree[0].tempFile).toBeTruthy();
+ expect(tree.tree[0].name).toBe('test');
+ expect(tree.tree[0].type).toBe('tree');
});
- it('does not create new tree if already exists', (done) => {
+ it('does not create new tree if already exists', async () => {
const tree = {
type: 'tree',
path: 'testing',
@@ -173,76 +150,52 @@ describe('Multi-file store actions', () => {
store.state.entries[tree.path] = tree;
- store
- .dispatch('createTempEntry', {
- name: 'testing',
- type: 'tree',
- })
- .then(() => {
- expect(store.state.entries[tree.path].tempFile).toEqual(false);
- expect(document.querySelector('.flash-alert')).not.toBeNull();
-
- done();
- })
- .catch(done.fail);
+ await store.dispatch('createTempEntry', {
+ name: 'testing',
+ type: 'tree',
+ });
+ expect(store.state.entries[tree.path].tempFile).toEqual(false);
+ expect(document.querySelector('.flash-alert')).not.toBeNull();
});
});
describe('blob', () => {
- it('creates temp file', (done) => {
+ it('creates temp file', async () => {
const name = 'test';
- store
- .dispatch('createTempEntry', {
- name,
- type: 'blob',
- mimeType: 'test/mime',
- })
- .then(() => {
- const f = store.state.entries[name];
-
- expect(f.tempFile).toBeTruthy();
- expect(f.mimeType).toBe('test/mime');
- expect(store.state.trees['abcproject/mybranch'].tree.length).toBe(1);
-
- done();
- })
- .catch(done.fail);
+ await store.dispatch('createTempEntry', {
+ name,
+ type: 'blob',
+ mimeType: 'test/mime',
+ });
+ const f = store.state.entries[name];
+
+ expect(f.tempFile).toBeTruthy();
+ expect(f.mimeType).toBe('test/mime');
+ expect(store.state.trees['abcproject/mybranch'].tree.length).toBe(1);
});
- it('adds tmp file to open files', (done) => {
+ it('adds tmp file to open files', async () => {
const name = 'test';
- store
- .dispatch('createTempEntry', {
- name,
- type: 'blob',
- })
- .then(() => {
- const f = store.state.entries[name];
-
- expect(store.state.openFiles.length).toBe(1);
- expect(store.state.openFiles[0].name).toBe(f.name);
+ await store.dispatch('createTempEntry', {
+ name,
+ type: 'blob',
+ });
+ const f = store.state.entries[name];
- done();
- })
- .catch(done.fail);
+ expect(store.state.openFiles.length).toBe(1);
+ expect(store.state.openFiles[0].name).toBe(f.name);
});
- it('adds tmp file to staged files', (done) => {
+ it('adds tmp file to staged files', async () => {
const name = 'test';
- store
- .dispatch('createTempEntry', {
- name,
- type: 'blob',
- })
- .then(() => {
- expect(store.state.stagedFiles).toEqual([expect.objectContaining({ name })]);
-
- done();
- })
- .catch(done.fail);
+ await store.dispatch('createTempEntry', {
+ name,
+ type: 'blob',
+ });
+ expect(store.state.stagedFiles).toEqual([expect.objectContaining({ name })]);
});
it('sets tmp file as active', () => {
@@ -251,24 +204,18 @@ describe('Multi-file store actions', () => {
expect(store.dispatch).toHaveBeenCalledWith('setFileActive', 'test');
});
- it('creates flash message if file already exists', (done) => {
+ it('creates flash message if file already exists', async () => {
const f = file('test', '1', 'blob');
store.state.trees['abcproject/mybranch'].tree = [f];
store.state.entries[f.path] = f;
- store
- .dispatch('createTempEntry', {
- name: 'test',
- type: 'blob',
- })
- .then(() => {
- expect(document.querySelector('.flash-alert')?.textContent.trim()).toEqual(
- `The name "${f.name}" is already taken in this directory.`,
- );
-
- done();
- })
- .catch(done.fail);
+ await store.dispatch('createTempEntry', {
+ name: 'test',
+ type: 'blob',
+ });
+ expect(document.querySelector('.flash-alert')?.textContent.trim()).toEqual(
+ `The name "${f.name}" is already taken in this directory.`,
+ );
});
});
});
@@ -372,45 +319,38 @@ describe('Multi-file store actions', () => {
});
describe('updateViewer', () => {
- it('updates viewer state', (done) => {
- store
- .dispatch('updateViewer', 'diff')
- .then(() => {
- expect(store.state.viewer).toBe('diff');
- })
- .then(done)
- .catch(done.fail);
+ it('updates viewer state', async () => {
+ await store.dispatch('updateViewer', 'diff');
+ expect(store.state.viewer).toBe('diff');
});
});
describe('updateActivityBarView', () => {
- it('commits UPDATE_ACTIVITY_BAR_VIEW', (done) => {
- testAction(
+ it('commits UPDATE_ACTIVITY_BAR_VIEW', () => {
+ return testAction(
updateActivityBarView,
'test',
{},
[{ type: 'UPDATE_ACTIVITY_BAR_VIEW', payload: 'test' }],
[],
- done,
);
});
});
describe('setEmptyStateSvgs', () => {
- it('commits setEmptyStateSvgs', (done) => {
- testAction(
+ it('commits setEmptyStateSvgs', () => {
+ return testAction(
setEmptyStateSvgs,
'svg',
{},
[{ type: 'SET_EMPTY_STATE_SVGS', payload: 'svg' }],
[],
- done,
);
});
});
describe('updateTempFlagForEntry', () => {
- it('commits UPDATE_TEMP_FLAG', (done) => {
+ it('commits UPDATE_TEMP_FLAG', () => {
const f = {
...file(),
path: 'test',
@@ -418,17 +358,16 @@ describe('Multi-file store actions', () => {
};
store.state.entries[f.path] = f;
- testAction(
+ return testAction(
updateTempFlagForEntry,
{ file: f, tempFile: false },
store.state,
[{ type: 'UPDATE_TEMP_FLAG', payload: { path: f.path, tempFile: false } }],
[],
- done,
);
});
- it('commits UPDATE_TEMP_FLAG and dispatches for parent', (done) => {
+ it('commits UPDATE_TEMP_FLAG and dispatches for parent', () => {
const parent = {
...file(),
path: 'testing',
@@ -441,17 +380,16 @@ describe('Multi-file store actions', () => {
store.state.entries[parent.path] = parent;
store.state.entries[f.path] = f;
- testAction(
+ return testAction(
updateTempFlagForEntry,
{ file: f, tempFile: false },
store.state,
[{ type: 'UPDATE_TEMP_FLAG', payload: { path: f.path, tempFile: false } }],
[{ type: 'updateTempFlagForEntry', payload: { file: parent, tempFile: false } }],
- done,
);
});
- it('does not dispatch for parent, if parent does not exist', (done) => {
+ it('does not dispatch for parent, if parent does not exist', () => {
const f = {
...file(),
path: 'test',
@@ -459,71 +397,66 @@ describe('Multi-file store actions', () => {
};
store.state.entries[f.path] = f;
- testAction(
+ return testAction(
updateTempFlagForEntry,
{ file: f, tempFile: false },
store.state,
[{ type: 'UPDATE_TEMP_FLAG', payload: { path: f.path, tempFile: false } }],
[],
- done,
);
});
});
describe('setCurrentBranchId', () => {
- it('commits setCurrentBranchId', (done) => {
- testAction(
+ it('commits setCurrentBranchId', () => {
+ return testAction(
setCurrentBranchId,
'branchId',
{},
[{ type: 'SET_CURRENT_BRANCH', payload: 'branchId' }],
[],
- done,
);
});
});
describe('toggleFileFinder', () => {
- it('commits TOGGLE_FILE_FINDER', (done) => {
- testAction(
+ it('commits TOGGLE_FILE_FINDER', () => {
+ return testAction(
toggleFileFinder,
true,
null,
[{ type: 'TOGGLE_FILE_FINDER', payload: true }],
[],
- done,
);
});
});
describe('setErrorMessage', () => {
- it('commits error message', (done) => {
- testAction(
+ it('commits error message', () => {
+ return testAction(
setErrorMessage,
'error',
null,
[{ type: types.SET_ERROR_MESSAGE, payload: 'error' }],
[],
- done,
);
});
});
describe('deleteEntry', () => {
- it('commits entry deletion', (done) => {
+ it('commits entry deletion', () => {
store.state.entries.path = 'testing';
- testAction(
+ return testAction(
deleteEntry,
'path',
store.state,
[{ type: types.DELETE_ENTRY, payload: 'path' }],
[{ type: 'stageChange', payload: 'path' }, createTriggerChangeAction()],
- done,
);
});
- it('does not delete a folder after it is emptied', (done) => {
+ it('does not delete a folder after it is emptied', () => {
const testFolder = {
type: 'tree',
tree: [],
@@ -540,7 +473,7 @@ describe('Multi-file store actions', () => {
'testFolder/entry-to-delete': testEntry,
};
- testAction(
+ return testAction(
deleteEntry,
'testFolder/entry-to-delete',
store.state,
@@ -549,7 +482,6 @@ describe('Multi-file store actions', () => {
{ type: 'stageChange', payload: 'testFolder/entry-to-delete' },
createTriggerChangeAction(),
],
- done,
);
});
@@ -569,8 +501,8 @@ describe('Multi-file store actions', () => {
});
describe('and previous does not exist', () => {
- it('reverts the rename before deleting', (done) => {
- testAction(
+ it('reverts the rename before deleting', () => {
+ return testAction(
deleteEntry,
testEntry.path,
store.state,
@@ -589,7 +521,6 @@ describe('Multi-file store actions', () => {
payload: testEntry.prevPath,
},
],
- done,
);
});
});
@@ -604,21 +535,20 @@ describe('Multi-file store actions', () => {
store.state.entries[oldEntry.path] = oldEntry;
});
- it('does not revert rename before deleting', (done) => {
- testAction(
+ it('does not revert rename before deleting', () => {
+ return testAction(
deleteEntry,
testEntry.path,
store.state,
[{ type: types.DELETE_ENTRY, payload: testEntry.path }],
[{ type: 'stageChange', payload: testEntry.path }, createTriggerChangeAction()],
- done,
);
});
- it('when previous is deleted, it reverts rename before deleting', (done) => {
+ it('when previous is deleted, it reverts rename before deleting', () => {
store.state.entries[testEntry.prevPath].deleted = true;
- testAction(
+ return testAction(
deleteEntry,
testEntry.path,
store.state,
@@ -637,7 +567,6 @@ describe('Multi-file store actions', () => {
payload: testEntry.prevPath,
},
],
- done,
);
});
});
@@ -650,7 +579,7 @@ describe('Multi-file store actions', () => {
jest.spyOn(eventHub, '$emit').mockImplementation();
});
- it('does not purge model cache for temporary entries that got renamed', (done) => {
+ it('does not purge model cache for temporary entries that got renamed', async () => {
Object.assign(store.state.entries, {
test: {
...file('test'),
@@ -660,19 +589,14 @@ describe('Multi-file store actions', () => {
},
});
- store
- .dispatch('renameEntry', {
- path: 'test',
- name: 'new',
- })
- .then(() => {
- expect(eventHub.$emit.mock.calls).not.toContain('editor.update.model.dispose.foo-bar');
- })
- .then(done)
- .catch(done.fail);
+ await store.dispatch('renameEntry', {
+ path: 'test',
+ name: 'new',
+ });
+ expect(eventHub.$emit.mock.calls).not.toContain('editor.update.model.dispose.foo-bar');
});
- it('purges model cache for renamed entry', (done) => {
+ it('purges model cache for renamed entry', async () => {
Object.assign(store.state.entries, {
test: {
...file('test'),
@@ -682,17 +606,12 @@ describe('Multi-file store actions', () => {
},
});
- store
- .dispatch('renameEntry', {
- path: 'test',
- name: 'new',
- })
- .then(() => {
- expect(eventHub.$emit).toHaveBeenCalled();
- expect(eventHub.$emit).toHaveBeenCalledWith(`editor.update.model.dispose.foo-key`);
- })
- .then(done)
- .catch(done.fail);
+ await store.dispatch('renameEntry', {
+ path: 'test',
+ name: 'new',
+ });
+ expect(eventHub.$emit).toHaveBeenCalled();
+ expect(eventHub.$emit).toHaveBeenCalledWith(`editor.update.model.dispose.foo-key`);
});
});
@@ -731,8 +650,8 @@ describe('Multi-file store actions', () => {
]);
});
- it('if not changed, completely unstages and discards entry if renamed to original', (done) => {
- testAction(
+ it('if not changed, completely unstages and discards entry if renamed to original', () => {
+ return testAction(
renameEntry,
{ path: 'renamed', name: 'orig' },
store.state,
@@ -751,24 +670,22 @@ describe('Multi-file store actions', () => {
},
],
[createTriggerRenameAction('renamed', 'orig')],
- done,
);
});
- it('if already in changed, does not add to change', (done) => {
+ it('if already in changed, does not add to change', () => {
store.state.changedFiles.push(renamedEntry);
- testAction(
+ return testAction(
renameEntry,
{ path: 'orig', name: 'renamed' },
store.state,
[expect.objectContaining({ type: types.RENAME_ENTRY })],
[createTriggerRenameAction('orig', 'renamed')],
- done,
);
});
- it('routes to the renamed file if the original file has been opened', (done) => {
+ it('routes to the renamed file if the original file has been opened', async () => {
store.state.currentProjectId = 'test/test';
store.state.currentBranchId = 'main';
@@ -776,17 +693,12 @@ describe('Multi-file store actions', () => {
opened: true,
});
- store
- .dispatch('renameEntry', {
- path: 'orig',
- name: 'renamed',
- })
- .then(() => {
- expect(router.push.mock.calls).toHaveLength(1);
- expect(router.push).toHaveBeenCalledWith(`/project/test/test/tree/main/-/renamed/`);
- })
- .then(done)
- .catch(done.fail);
+ await store.dispatch('renameEntry', {
+ path: 'orig',
+ name: 'renamed',
+ });
+ expect(router.push.mock.calls).toHaveLength(1);
+ expect(router.push).toHaveBeenCalledWith(`/project/test/test/tree/main/-/renamed/`);
});
});
@@ -809,25 +721,20 @@ describe('Multi-file store actions', () => {
});
});
- it('updates entries in a folder correctly, when folder is renamed', (done) => {
- store
- .dispatch('renameEntry', {
- path: 'folder',
- name: 'new-folder',
- })
- .then(() => {
- const keys = Object.keys(store.state.entries);
-
- expect(keys.length).toBe(3);
- expect(keys.indexOf('new-folder')).toBe(0);
- expect(keys.indexOf('new-folder/file-1')).toBe(1);
- expect(keys.indexOf('new-folder/file-2')).toBe(2);
- })
- .then(done)
- .catch(done.fail);
+ it('updates entries in a folder correctly, when folder is renamed', async () => {
+ await store.dispatch('renameEntry', {
+ path: 'folder',
+ name: 'new-folder',
+ });
+ const keys = Object.keys(store.state.entries);
+
+ expect(keys.length).toBe(3);
+ expect(keys.indexOf('new-folder')).toBe(0);
+ expect(keys.indexOf('new-folder/file-1')).toBe(1);
+ expect(keys.indexOf('new-folder/file-2')).toBe(2);
});
- it('discards renaming of an entry if the root folder is renamed back to a previous name', (done) => {
+ it('discards renaming of an entry if the root folder is renamed back to a previous name', async () => {
const rootFolder = file('old-folder', 'old-folder', 'tree');
const testEntry = file('test', 'test', 'blob', rootFolder);
@@ -841,53 +748,45 @@ describe('Multi-file store actions', () => {
},
});
- store
- .dispatch('renameEntry', {
- path: 'old-folder',
- name: 'new-folder',
- })
- .then(() => {
- const { entries } = store.state;
-
- expect(Object.keys(entries).length).toBe(2);
- expect(entries['old-folder']).toBeUndefined();
- expect(entries['old-folder/test']).toBeUndefined();
-
- expect(entries['new-folder']).toBeDefined();
- expect(entries['new-folder/test']).toEqual(
- expect.objectContaining({
- path: 'new-folder/test',
- name: 'test',
- prevPath: 'old-folder/test',
- prevName: 'test',
- }),
- );
- })
- .then(() =>
- store.dispatch('renameEntry', {
- path: 'new-folder',
- name: 'old-folder',
- }),
- )
- .then(() => {
- const { entries } = store.state;
-
- expect(Object.keys(entries).length).toBe(2);
- expect(entries['new-folder']).toBeUndefined();
- expect(entries['new-folder/test']).toBeUndefined();
-
- expect(entries['old-folder']).toBeDefined();
- expect(entries['old-folder/test']).toEqual(
- expect.objectContaining({
- path: 'old-folder/test',
- name: 'test',
- prevPath: undefined,
- prevName: undefined,
- }),
- );
- })
- .then(done)
- .catch(done.fail);
+ await store.dispatch('renameEntry', {
+ path: 'old-folder',
+ name: 'new-folder',
+ });
+ const { entries } = store.state;
+
+ expect(Object.keys(entries).length).toBe(2);
+ expect(entries['old-folder']).toBeUndefined();
+ expect(entries['old-folder/test']).toBeUndefined();
+
+ expect(entries['new-folder']).toBeDefined();
+ expect(entries['new-folder/test']).toEqual(
+ expect.objectContaining({
+ path: 'new-folder/test',
+ name: 'test',
+ prevPath: 'old-folder/test',
+ prevName: 'test',
+ }),
+ );
+
+ await store.dispatch('renameEntry', {
+ path: 'new-folder',
+ name: 'old-folder',
+ });
+ const { entries: newEntries } = store.state;
+
+ expect(Object.keys(newEntries).length).toBe(2);
+ expect(newEntries['new-folder']).toBeUndefined();
+ expect(newEntries['new-folder/test']).toBeUndefined();
+
+ expect(newEntries['old-folder']).toBeDefined();
+ expect(newEntries['old-folder/test']).toEqual(
+ expect.objectContaining({
+ path: 'old-folder/test',
+ name: 'test',
+ prevPath: undefined,
+ prevName: undefined,
+ }),
+ );
});
describe('with file in directory', () => {
@@ -919,24 +818,21 @@ describe('Multi-file store actions', () => {
});
});
- it('creates new directory', (done) => {
+ it('creates new directory', async () => {
expect(store.state.entries[newParentPath]).toBeUndefined();
- store
- .dispatch('renameEntry', { path: filePath, name: fileName, parentPath: newParentPath })
- .then(() => {
- expect(store.state.entries[newParentPath]).toEqual(
- expect.objectContaining({
- path: newParentPath,
- type: 'tree',
- tree: expect.arrayContaining([
- store.state.entries[`${newParentPath}/${fileName}`],
- ]),
- }),
- );
- })
- .then(done)
- .catch(done.fail);
+ await store.dispatch('renameEntry', {
+ path: filePath,
+ name: fileName,
+ parentPath: newParentPath,
+ });
+ expect(store.state.entries[newParentPath]).toEqual(
+ expect.objectContaining({
+ path: newParentPath,
+ type: 'tree',
+ tree: expect.arrayContaining([store.state.entries[`${newParentPath}/${fileName}`]]),
+ }),
+ );
});
describe('when new directory exists', () => {
@@ -949,40 +845,30 @@ describe('Multi-file store actions', () => {
rootDir.tree.push(newDir);
});
- it('inserts in new directory', (done) => {
+ it('inserts in new directory', async () => {
expect(newDir.tree).toEqual([]);
- store
- .dispatch('renameEntry', {
- path: filePath,
- name: fileName,
- parentPath: newParentPath,
- })
- .then(() => {
- expect(newDir.tree).toEqual([store.state.entries[`${newParentPath}/${fileName}`]]);
- })
- .then(done)
- .catch(done.fail);
+ await store.dispatch('renameEntry', {
+ path: filePath,
+ name: fileName,
+ parentPath: newParentPath,
+ });
+ expect(newDir.tree).toEqual([store.state.entries[`${newParentPath}/${fileName}`]]);
});
- it('when new directory is deleted, it undeletes it', (done) => {
- store.dispatch('deleteEntry', newParentPath);
+ it('when new directory is deleted, it undeletes it', async () => {
+ await store.dispatch('deleteEntry', newParentPath);
expect(store.state.entries[newParentPath].deleted).toBe(true);
expect(rootDir.tree.some((x) => x.path === newParentPath)).toBe(false);
- store
- .dispatch('renameEntry', {
- path: filePath,
- name: fileName,
- parentPath: newParentPath,
- })
- .then(() => {
- expect(store.state.entries[newParentPath].deleted).toBe(false);
- expect(rootDir.tree.some((x) => x.path === newParentPath)).toBe(true);
- })
- .then(done)
- .catch(done.fail);
+ await store.dispatch('renameEntry', {
+ path: filePath,
+ name: fileName,
+ parentPath: newParentPath,
+ });
+ expect(store.state.entries[newParentPath].deleted).toBe(false);
+ expect(rootDir.tree.some((x) => x.path === newParentPath)).toBe(true);
});
});
});
@@ -1023,30 +909,25 @@ describe('Multi-file store actions', () => {
document.querySelector('.flash-container').remove();
});
- it('passes the error further unchanged without dispatching any action when response is 404', (done) => {
+ it('passes the error further unchanged without dispatching any action when response is 404', async () => {
mock.onGet(/(.*)/).replyOnce(404);
- getBranchData(...callParams)
- .then(done.fail)
- .catch((e) => {
- expect(dispatch.mock.calls).toHaveLength(0);
- expect(e.response.status).toEqual(404);
- expect(document.querySelector('.flash-alert')).toBeNull();
- done();
- });
+ await expect(getBranchData(...callParams)).rejects.toEqual(
+ new Error('Request failed with status code 404'),
+ );
+ expect(dispatch.mock.calls).toHaveLength(0);
+ expect(document.querySelector('.flash-alert')).toBeNull();
});
- it('does not pass the error further and flashes an alert if error is not 404', (done) => {
+ it('does not pass the error further and flashes an alert if error is not 404', async () => {
mock.onGet(/(.*)/).replyOnce(418);
- getBranchData(...callParams)
- .then(done.fail)
- .catch((e) => {
- expect(dispatch.mock.calls).toHaveLength(0);
- expect(e.response).toBeUndefined();
- expect(document.querySelector('.flash-alert')).not.toBeNull();
- done();
- });
+ await expect(getBranchData(...callParams)).rejects.toEqual(
+ new Error('Branch not loaded - <strong>abc/def/main-testing</strong>'),
+ );
+
+ expect(dispatch.mock.calls).toHaveLength(0);
+ expect(document.querySelector('.flash-alert')).not.toBeNull();
});
});
});
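Note: the 404/418 assertions above work because Jest's `toEqual` compares `Error` objects by message, so a rejected axios error can be matched against a plain `new Error('Request failed with status code 404')`. A self-contained sketch of that shape (the `/thing` endpoint is illustrative):

    import axios from 'axios';
    import MockAdapter from 'axios-mock-adapter';

    it('surfaces HTTP failures as rejections', async () => {
      const mock = new MockAdapter(axios);
      mock.onGet('/thing').replyOnce(404);

      // toEqual on errors compares only the message, not the class or stack
      await expect(axios.get('/thing')).rejects.toEqual(
        new Error('Request failed with status code 404'),
      );

      mock.restore();
    });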
diff --git a/spec/frontend/ide/stores/modules/branches/actions_spec.js b/spec/frontend/ide/stores/modules/branches/actions_spec.js
index 135dbc1f746..306330e3ba2 100644
--- a/spec/frontend/ide/stores/modules/branches/actions_spec.js
+++ b/spec/frontend/ide/stores/modules/branches/actions_spec.js
@@ -42,21 +42,20 @@ describe('IDE branches actions', () => {
});
describe('requestBranches', () => {
- it('should commit request', (done) => {
- testAction(
+ it('should commit request', () => {
+ return testAction(
requestBranches,
null,
mockedContext.state,
[{ type: types.REQUEST_BRANCHES }],
[],
- done,
);
});
});
describe('receiveBranchesError', () => {
- it('should commit error', (done) => {
- testAction(
+ it('should commit error', () => {
+ return testAction(
receiveBranchesError,
{ search: TEST_SEARCH },
mockedContext.state,
@@ -72,20 +71,18 @@ describe('IDE branches actions', () => {
},
},
],
- done,
);
});
});
describe('receiveBranchesSuccess', () => {
- it('should commit received data', (done) => {
- testAction(
+ it('should commit received data', () => {
+ return testAction(
receiveBranchesSuccess,
branches,
mockedContext.state,
[{ type: types.RECEIVE_BRANCHES_SUCCESS, payload: branches }],
[],
- done,
);
});
});
@@ -110,8 +107,8 @@ describe('IDE branches actions', () => {
});
});
- it('dispatches success with received data', (done) => {
- testAction(
+ it('dispatches success with received data', () => {
+ return testAction(
fetchBranches,
{ search: TEST_SEARCH },
mockedState,
@@ -121,7 +118,6 @@ describe('IDE branches actions', () => {
{ type: 'resetBranches' },
{ type: 'receiveBranchesSuccess', payload: branches },
],
- done,
);
});
});
@@ -131,8 +127,8 @@ describe('IDE branches actions', () => {
mock.onGet(/\/api\/v4\/projects\/\d+\/repository\/branches(.*)$/).replyOnce(500);
});
- it('dispatches error', (done) => {
- testAction(
+ it('dispatches error', () => {
+ return testAction(
fetchBranches,
{ search: TEST_SEARCH },
mockedState,
@@ -142,20 +138,18 @@ describe('IDE branches actions', () => {
{ type: 'resetBranches' },
{ type: 'receiveBranchesError', payload: { search: TEST_SEARCH } },
],
- done,
);
});
});
describe('resetBranches', () => {
- it('commits reset', (done) => {
- testAction(
+ it('commits reset', () => {
+ return testAction(
resetBranches,
null,
mockedContext.state,
[{ type: types.RESET_BRANCHES }],
[],
- done,
);
});
});
diff --git a/spec/frontend/ide/stores/modules/clientside/actions_spec.js b/spec/frontend/ide/stores/modules/clientside/actions_spec.js
index d2777623b0d..c2b9de192d9 100644
--- a/spec/frontend/ide/stores/modules/clientside/actions_spec.js
+++ b/spec/frontend/ide/stores/modules/clientside/actions_spec.js
@@ -26,15 +26,13 @@ describe('IDE store module clientside actions', () => {
});
describe('pingUsage', () => {
- it('posts to usage endpoint', (done) => {
+ it('posts to usage endpoint', async () => {
const usageSpy = jest.fn(() => [200]);
mock.onPost(TEST_USAGE_URL).reply(() => usageSpy());
- testAction(actions.pingUsage, PING_USAGE_PREVIEW_KEY, rootGetters, [], [], () => {
- expect(usageSpy).toHaveBeenCalled();
- done();
- });
+ await testAction(actions.pingUsage, PING_USAGE_PREVIEW_KEY, rootGetters, [], []);
+ expect(usageSpy).toHaveBeenCalled();
});
});
});
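Note: the `pingUsage` spec above shows the variant where the assertion lives outside the helper: awaiting `testAction` and then checking the spy replaces passing a callback as the helper's sixth argument. A sketch under the same assumptions (the `ping` action and `/usage_ping` endpoint are hypothetical):

    import axios from 'axios';
    import MockAdapter from 'axios-mock-adapter';
    import testAction from 'helpers/vuex_action_helper';

    // hypothetical action whose only side effect is a request
    const ping = () => axios.post('/usage_ping');

    it('posts to the endpoint', async () => {
      const mock = new MockAdapter(axios);
      const spy = jest.fn(() => [200]);
      mock.onPost('/usage_ping').reply(spy);

      // no commits or dispatches expected; the side effect is the request
      await testAction(ping, null, {}, [], []);
      expect(spy).toHaveBeenCalled();

      mock.restore();
    });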
diff --git a/spec/frontend/ide/stores/modules/commit/actions_spec.js b/spec/frontend/ide/stores/modules/commit/actions_spec.js
index cb6bb7c1202..d65039e89cc 100644
--- a/spec/frontend/ide/stores/modules/commit/actions_spec.js
+++ b/spec/frontend/ide/stores/modules/commit/actions_spec.js
@@ -57,40 +57,25 @@ describe('IDE commit module actions', () => {
});
describe('updateCommitMessage', () => {
- it('updates store with new commit message', (done) => {
- store
- .dispatch('commit/updateCommitMessage', 'testing')
- .then(() => {
- expect(store.state.commit.commitMessage).toBe('testing');
- })
- .then(done)
- .catch(done.fail);
+ it('updates store with new commit message', async () => {
+ await store.dispatch('commit/updateCommitMessage', 'testing');
+ expect(store.state.commit.commitMessage).toBe('testing');
});
});
describe('discardDraft', () => {
- it('resets commit message to blank', (done) => {
+ it('resets commit message to blank', async () => {
store.state.commit.commitMessage = 'testing';
- store
- .dispatch('commit/discardDraft')
- .then(() => {
- expect(store.state.commit.commitMessage).not.toBe('testing');
- })
- .then(done)
- .catch(done.fail);
+ await store.dispatch('commit/discardDraft');
+ expect(store.state.commit.commitMessage).not.toBe('testing');
});
});
describe('updateCommitAction', () => {
- it('updates store with new commit action', (done) => {
- store
- .dispatch('commit/updateCommitAction', '1')
- .then(() => {
- expect(store.state.commit.commitAction).toBe('1');
- })
- .then(done)
- .catch(done.fail);
+ it('updates store with new commit action', async () => {
+ await store.dispatch('commit/updateCommitAction', '1');
+ expect(store.state.commit.commitAction).toBe('1');
});
});
@@ -139,34 +124,24 @@ describe('IDE commit module actions', () => {
});
});
- it('updates commit message with short_id', (done) => {
- store
- .dispatch('commit/setLastCommitMessage', { short_id: '123' })
- .then(() => {
- expect(store.state.lastCommitMsg).toContain(
- 'Your changes have been committed. Commit <a href="http://testing/-/commit/123" class="commit-sha">123</a>',
- );
- })
- .then(done)
- .catch(done.fail);
+ it('updates commit message with short_id', async () => {
+ await store.dispatch('commit/setLastCommitMessage', { short_id: '123' });
+ expect(store.state.lastCommitMsg).toContain(
+ 'Your changes have been committed. Commit <a href="http://testing/-/commit/123" class="commit-sha">123</a>',
+ );
});
- it('updates commit message with stats', (done) => {
- store
- .dispatch('commit/setLastCommitMessage', {
- short_id: '123',
- stats: {
- additions: '1',
- deletions: '2',
- },
- })
- .then(() => {
- expect(store.state.lastCommitMsg).toBe(
- 'Your changes have been committed. Commit <a href="http://testing/-/commit/123" class="commit-sha">123</a> with 1 additions, 2 deletions.',
- );
- })
- .then(done)
- .catch(done.fail);
+ it('updates commit message with stats', async () => {
+ await store.dispatch('commit/setLastCommitMessage', {
+ short_id: '123',
+ stats: {
+ additions: '1',
+ deletions: '2',
+ },
+ });
+ expect(store.state.lastCommitMsg).toBe(
+ 'Your changes have been committed. Commit <a href="http://testing/-/commit/123" class="commit-sha">123</a> with 1 additions, 2 deletions.',
+ );
});
});
@@ -221,74 +196,49 @@ describe('IDE commit module actions', () => {
});
});
- it('updates stores working reference', (done) => {
- store
- .dispatch('commit/updateFilesAfterCommit', {
- data,
- branch,
- })
- .then(() => {
- expect(store.state.projects.abcproject.branches.main.workingReference).toBe(data.id);
- })
- .then(done)
- .catch(done.fail);
+ it('updates stores working reference', async () => {
+ await store.dispatch('commit/updateFilesAfterCommit', {
+ data,
+ branch,
+ });
+ expect(store.state.projects.abcproject.branches.main.workingReference).toBe(data.id);
});
- it('resets all files changed status', (done) => {
- store
- .dispatch('commit/updateFilesAfterCommit', {
- data,
- branch,
- })
- .then(() => {
- store.state.openFiles.forEach((entry) => {
- expect(entry.changed).toBeFalsy();
- });
- })
- .then(done)
- .catch(done.fail);
+ it('resets all files changed status', async () => {
+ await store.dispatch('commit/updateFilesAfterCommit', {
+ data,
+ branch,
+ });
+ store.state.openFiles.forEach((entry) => {
+ expect(entry.changed).toBeFalsy();
+ });
});
- it('sets files commit data', (done) => {
- store
- .dispatch('commit/updateFilesAfterCommit', {
- data,
- branch,
- })
- .then(() => {
- expect(f.lastCommitSha).toBe(data.id);
- })
- .then(done)
- .catch(done.fail);
+ it('sets files commit data', async () => {
+ await store.dispatch('commit/updateFilesAfterCommit', {
+ data,
+ branch,
+ });
+ expect(f.lastCommitSha).toBe(data.id);
});
- it('updates raw content for changed file', (done) => {
- store
- .dispatch('commit/updateFilesAfterCommit', {
- data,
- branch,
- })
- .then(() => {
- expect(f.raw).toBe(f.content);
- })
- .then(done)
- .catch(done.fail);
+ it('updates raw content for changed file', async () => {
+ await store.dispatch('commit/updateFilesAfterCommit', {
+ data,
+ branch,
+ });
+ expect(f.raw).toBe(f.content);
});
- it('emits changed event for file', (done) => {
- store
- .dispatch('commit/updateFilesAfterCommit', {
- data,
- branch,
- })
- .then(() => {
- expect(eventHub.$emit).toHaveBeenCalledWith(`editor.update.model.content.${f.key}`, {
- content: f.content,
- changed: false,
- });
- })
- .then(done)
- .catch(done.fail);
+ it('emits changed event for file', async () => {
+ await store.dispatch('commit/updateFilesAfterCommit', {
+ data,
+ branch,
+ });
+ expect(eventHub.$emit).toHaveBeenCalledWith(`editor.update.model.content.${f.key}`, {
+ content: f.content,
+ changed: false,
+ });
});
});
@@ -349,138 +299,93 @@ describe('IDE commit module actions', () => {
jest.spyOn(service, 'commit').mockResolvedValue({ data: COMMIT_RESPONSE });
});
- it('calls service', (done) => {
- store
- .dispatch('commit/commitChanges')
- .then(() => {
- expect(service.commit).toHaveBeenCalledWith('abcproject', {
- branch: expect.anything(),
- commit_message: 'testing 123',
- actions: [
- {
- action: commitActionTypes.update,
- file_path: expect.anything(),
- content: '\n',
- encoding: expect.anything(),
- last_commit_id: undefined,
- previous_path: undefined,
- },
- ],
- start_sha: TEST_COMMIT_SHA,
- });
-
- done();
- })
- .catch(done.fail);
+ it('calls service', async () => {
+ await store.dispatch('commit/commitChanges');
+ expect(service.commit).toHaveBeenCalledWith('abcproject', {
+ branch: expect.anything(),
+ commit_message: 'testing 123',
+ actions: [
+ {
+ action: commitActionTypes.update,
+ file_path: expect.anything(),
+ content: '\n',
+ encoding: expect.anything(),
+ last_commit_id: undefined,
+ previous_path: undefined,
+ },
+ ],
+ start_sha: TEST_COMMIT_SHA,
+ });
});
- it('sends lastCommit ID when not creating new branch', (done) => {
+ it('sends lastCommit ID when not creating new branch', async () => {
store.state.commit.commitAction = '1';
- store
- .dispatch('commit/commitChanges')
- .then(() => {
- expect(service.commit).toHaveBeenCalledWith('abcproject', {
- branch: expect.anything(),
- commit_message: 'testing 123',
- actions: [
- {
- action: commitActionTypes.update,
- file_path: expect.anything(),
- content: '\n',
- encoding: expect.anything(),
- last_commit_id: TEST_COMMIT_SHA,
- previous_path: undefined,
- },
- ],
- start_sha: undefined,
- });
-
- done();
- })
- .catch(done.fail);
+ await store.dispatch('commit/commitChanges');
+ expect(service.commit).toHaveBeenCalledWith('abcproject', {
+ branch: expect.anything(),
+ commit_message: 'testing 123',
+ actions: [
+ {
+ action: commitActionTypes.update,
+ file_path: expect.anything(),
+ content: '\n',
+ encoding: expect.anything(),
+ last_commit_id: TEST_COMMIT_SHA,
+ previous_path: undefined,
+ },
+ ],
+ start_sha: undefined,
+ });
});
- it('sets last Commit Msg', (done) => {
- store
- .dispatch('commit/commitChanges')
- .then(() => {
- expect(store.state.lastCommitMsg).toBe(
- 'Your changes have been committed. Commit <a href="webUrl/-/commit/123" class="commit-sha">123</a> with 1 additions, 2 deletions.',
- );
-
- done();
- })
- .catch(done.fail);
+ it('sets last Commit Msg', async () => {
+ await store.dispatch('commit/commitChanges');
+ expect(store.state.lastCommitMsg).toBe(
+ 'Your changes have been committed. Commit <a href="webUrl/-/commit/123" class="commit-sha">123</a> with 1 additions, 2 deletions.',
+ );
});
- it('adds commit data to files', (done) => {
- store
- .dispatch('commit/commitChanges')
- .then(() => {
- expect(store.state.entries[store.state.openFiles[0].path].lastCommitSha).toBe(
- COMMIT_RESPONSE.id,
- );
-
- done();
- })
- .catch(done.fail);
+ it('adds commit data to files', async () => {
+ await store.dispatch('commit/commitChanges');
+ expect(store.state.entries[store.state.openFiles[0].path].lastCommitSha).toBe(
+ COMMIT_RESPONSE.id,
+ );
});
- it('resets stores commit actions', (done) => {
+ it('resets stores commit actions', async () => {
store.state.commit.commitAction = COMMIT_TO_NEW_BRANCH;
- store
- .dispatch('commit/commitChanges')
- .then(() => {
- expect(store.state.commit.commitAction).not.toBe(COMMIT_TO_NEW_BRANCH);
- })
- .then(done)
- .catch(done.fail);
+ await store.dispatch('commit/commitChanges');
+ expect(store.state.commit.commitAction).not.toBe(COMMIT_TO_NEW_BRANCH);
});
- it('removes all staged files', (done) => {
- store
- .dispatch('commit/commitChanges')
- .then(() => {
- expect(store.state.stagedFiles.length).toBe(0);
- })
- .then(done)
- .catch(done.fail);
+ it('removes all staged files', async () => {
+ await store.dispatch('commit/commitChanges');
+ expect(store.state.stagedFiles.length).toBe(0);
});
describe('merge request', () => {
- it('redirects to new merge request page', (done) => {
+ it('redirects to new merge request page', async () => {
jest.spyOn(eventHub, '$on').mockImplementation();
store.state.commit.commitAction = COMMIT_TO_NEW_BRANCH;
store.state.commit.shouldCreateMR = true;
- store
- .dispatch('commit/commitChanges')
- .then(() => {
- expect(visitUrl).toHaveBeenCalledWith(
- `webUrl/-/merge_requests/new?merge_request[source_branch]=${store.getters['commit/placeholderBranchName']}&merge_request[target_branch]=main&nav_source=webide`,
- );
-
- done();
- })
- .catch(done.fail);
+ await store.dispatch('commit/commitChanges');
+ expect(visitUrl).toHaveBeenCalledWith(
+ `webUrl/-/merge_requests/new?merge_request[source_branch]=${store.getters['commit/placeholderBranchName']}&merge_request[target_branch]=main&nav_source=webide`,
+ );
});
- it('does not redirect to new merge request page when shouldCreateMR is not checked', (done) => {
+ it('does not redirect to new merge request page when shouldCreateMR is not checked', async () => {
jest.spyOn(eventHub, '$on').mockImplementation();
store.state.commit.commitAction = COMMIT_TO_NEW_BRANCH;
store.state.commit.shouldCreateMR = false;
- store
- .dispatch('commit/commitChanges')
- .then(() => {
- expect(visitUrl).not.toHaveBeenCalled();
- done();
- })
- .catch(done.fail);
+ await store.dispatch('commit/commitChanges');
+ expect(visitUrl).not.toHaveBeenCalled();
});
it('does not redirect to merge request page if shouldCreateMR is checked, but branch is the default branch', async () => {
@@ -514,17 +419,11 @@ describe('IDE commit module actions', () => {
});
});
- it('shows failed message', (done) => {
- store
- .dispatch('commit/commitChanges')
- .then(() => {
- const alert = document.querySelector('.flash-container');
-
- expect(alert.textContent.trim()).toBe('failed message');
+ it('shows failed message', async () => {
+ await store.dispatch('commit/commitChanges');
+ const alert = document.querySelector('.flash-container');
- done();
- })
- .catch(done.fail);
+ expect(alert.textContent.trim()).toBe('failed message');
});
});
@@ -548,52 +447,37 @@ describe('IDE commit module actions', () => {
});
describe('first commit of a branch', () => {
- it('commits TOGGLE_EMPTY_STATE mutation on empty repo', (done) => {
+ it('commits TOGGLE_EMPTY_STATE mutation on empty repo', async () => {
jest.spyOn(service, 'commit').mockResolvedValue({ data: COMMIT_RESPONSE });
jest.spyOn(store, 'commit');
- store
- .dispatch('commit/commitChanges')
- .then(() => {
- expect(store.commit.mock.calls).toEqual(
- expect.arrayContaining([
- ['TOGGLE_EMPTY_STATE', expect.any(Object), expect.any(Object)],
- ]),
- );
- done();
- })
- .catch(done.fail);
+ await store.dispatch('commit/commitChanges');
+ expect(store.commit.mock.calls).toEqual(
+ expect.arrayContaining([['TOGGLE_EMPTY_STATE', expect.any(Object), expect.any(Object)]]),
+ );
});
- it('does not commit TOGGLE_EMPTY_STATE mutation on existing project', (done) => {
+ it('does not commit TOGGLE_EMPTY_STATE mutation on existing project', async () => {
COMMIT_RESPONSE.parent_ids.push('1234');
jest.spyOn(service, 'commit').mockResolvedValue({ data: COMMIT_RESPONSE });
jest.spyOn(store, 'commit');
- store
- .dispatch('commit/commitChanges')
- .then(() => {
- expect(store.commit.mock.calls).not.toEqual(
- expect.arrayContaining([
- ['TOGGLE_EMPTY_STATE', expect.any(Object), expect.any(Object)],
- ]),
- );
- done();
- })
- .catch(done.fail);
+ await store.dispatch('commit/commitChanges');
+ expect(store.commit.mock.calls).not.toEqual(
+ expect.arrayContaining([['TOGGLE_EMPTY_STATE', expect.any(Object), expect.any(Object)]]),
+ );
});
});
});
describe('toggleShouldCreateMR', () => {
- it('commits both toggle and interacting with MR checkbox actions', (done) => {
- testAction(
+ it('commits both toggle and interacting with MR checkbox actions', () => {
+ return testAction(
actions.toggleShouldCreateMR,
{},
store.state,
[{ type: mutationTypes.TOGGLE_SHOULD_CREATE_MR }],
[],
- done,
);
});
});
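Note: the commit specs above lean on `jest.spyOn(service, 'commit').mockResolvedValue(...)`, which is what lets the converted tests simply `await store.dispatch('commit/commitChanges')` and then assert on the recorded call. A reduced sketch of that stubbing shape (the `service` object and `commitChanges` wrapper are illustrative stand-ins for the IDE modules):

    // hypothetical service and action, used only for illustration
    const service = {
      commit: () => Promise.reject(new Error('no network in tests')),
    };
    const commitChanges = () => service.commit('abcproject', { branch: 'main' });

    it('calls the service with the expected payload', async () => {
      jest.spyOn(service, 'commit').mockResolvedValue({ data: { id: '123' } });

      await commitChanges();

      expect(service.commit).toHaveBeenCalledWith('abcproject', { branch: 'main' });
    });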
diff --git a/spec/frontend/ide/stores/modules/file_templates/actions_spec.js b/spec/frontend/ide/stores/modules/file_templates/actions_spec.js
index 9ff950b0875..1080a30d2d8 100644
--- a/spec/frontend/ide/stores/modules/file_templates/actions_spec.js
+++ b/spec/frontend/ide/stores/modules/file_templates/actions_spec.js
@@ -20,21 +20,20 @@ describe('IDE file templates actions', () => {
});
describe('requestTemplateTypes', () => {
- it('commits REQUEST_TEMPLATE_TYPES', (done) => {
- testAction(
+ it('commits REQUEST_TEMPLATE_TYPES', () => {
+ return testAction(
actions.requestTemplateTypes,
null,
state,
[{ type: types.REQUEST_TEMPLATE_TYPES }],
[],
- done,
);
});
});
describe('receiveTemplateTypesError', () => {
- it('commits RECEIVE_TEMPLATE_TYPES_ERROR and dispatches setErrorMessage', (done) => {
- testAction(
+ it('commits RECEIVE_TEMPLATE_TYPES_ERROR and dispatches setErrorMessage', () => {
+ return testAction(
actions.receiveTemplateTypesError,
null,
state,
@@ -49,20 +48,18 @@ describe('IDE file templates actions', () => {
},
},
],
- done,
);
});
});
describe('receiveTemplateTypesSuccess', () => {
- it('commits RECEIVE_TEMPLATE_TYPES_SUCCESS', (done) => {
- testAction(
+ it('commits RECEIVE_TEMPLATE_TYPES_SUCCESS', () => {
+ return testAction(
actions.receiveTemplateTypesSuccess,
'test',
state,
[{ type: types.RECEIVE_TEMPLATE_TYPES_SUCCESS, payload: 'test' }],
[],
- done,
);
});
});
@@ -81,23 +78,17 @@ describe('IDE file templates actions', () => {
});
});
- it('rejects if selectedTemplateType is empty', (done) => {
+ it('rejects if selectedTemplateType is empty', async () => {
const dispatch = jest.fn().mockName('dispatch');
- actions
- .fetchTemplateTypes({ dispatch, state })
- .then(done.fail)
- .catch(() => {
- expect(dispatch).not.toHaveBeenCalled();
-
- done();
- });
+ await expect(actions.fetchTemplateTypes({ dispatch, state })).rejects.toBeUndefined();
+ expect(dispatch).not.toHaveBeenCalled();
});
- it('dispatches actions', (done) => {
+ it('dispatches actions', () => {
state.selectedTemplateType = { key: 'licenses' };
- testAction(
+ return testAction(
actions.fetchTemplateTypes,
null,
state,
@@ -111,7 +102,6 @@ describe('IDE file templates actions', () => {
payload: pages[0].concat(pages[1]).concat(pages[2]),
},
],
- done,
);
});
});
@@ -121,16 +111,15 @@ describe('IDE file templates actions', () => {
mock.onGet(/api\/(.*)\/templates\/licenses/).replyOnce(500);
});
- it('dispatches actions', (done) => {
+ it('dispatches actions', () => {
state.selectedTemplateType = { key: 'licenses' };
- testAction(
+ return testAction(
actions.fetchTemplateTypes,
null,
state,
[],
[{ type: 'requestTemplateTypes' }, { type: 'receiveTemplateTypesError' }],
- done,
);
});
});
@@ -184,8 +173,8 @@ describe('IDE file templates actions', () => {
});
describe('receiveTemplateError', () => {
- it('dispatches setErrorMessage', (done) => {
- testAction(
+ it('dispatches setErrorMessage', () => {
+ return testAction(
actions.receiveTemplateError,
'test',
state,
@@ -201,7 +190,6 @@ describe('IDE file templates actions', () => {
},
},
],
- done,
);
});
});
@@ -217,46 +205,43 @@ describe('IDE file templates actions', () => {
.replyOnce(200, { content: 'testing content' });
});
- it('dispatches setFileTemplate if template already has content', (done) => {
+ it('dispatches setFileTemplate if template already has content', () => {
const template = { content: 'already has content' };
- testAction(
+ return testAction(
actions.fetchTemplate,
template,
state,
[],
[{ type: 'setFileTemplate', payload: template }],
- done,
);
});
- it('dispatches success', (done) => {
+ it('dispatches success', () => {
const template = { key: 'mit' };
state.selectedTemplateType = { key: 'licenses' };
- testAction(
+ return testAction(
actions.fetchTemplate,
template,
state,
[],
[{ type: 'setFileTemplate', payload: { content: 'MIT content' } }],
- done,
);
});
- it('dispatches success and uses name key for API call', (done) => {
+ it('dispatches success and uses name key for API call', () => {
const template = { name: 'testing' };
state.selectedTemplateType = { key: 'licenses' };
- testAction(
+ return testAction(
actions.fetchTemplate,
template,
state,
[],
[{ type: 'setFileTemplate', payload: { content: 'testing content' } }],
- done,
);
});
});
@@ -266,18 +251,17 @@ describe('IDE file templates actions', () => {
mock.onGet(/api\/(.*)\/templates\/licenses\/mit/).replyOnce(500);
});
- it('dispatches error', (done) => {
+ it('dispatches error', () => {
const template = { name: 'testing' };
state.selectedTemplateType = { key: 'licenses' };
- testAction(
+ return testAction(
actions.fetchTemplate,
template,
state,
[],
[{ type: 'receiveTemplateError', payload: template }],
- done,
);
});
});
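The recurring change in this spec, and in most of the specs below, is the migration off Jest's `done` callback: `testAction` is now returned (or awaited) so Jest handles the promise itself. A minimal sketch of the two styles, using names from the spec above and assuming GitLab's `helpers/vuex_action_helper`, which resolves once the expected mutations and dispatches have been asserted:

```javascript
import testAction from 'helpers/vuex_action_helper';

// Before: callback style. A failing expectation inside the helper could
// surface as a timeout rather than the real assertion error.
it('commits REQUEST_TEMPLATE_TYPES', (done) => {
  testAction(actions.requestTemplateTypes, null, state, [{ type: types.REQUEST_TEMPLATE_TYPES }], [], done);
});

// After: promise style. Returning the promise lets Jest report
// rejections and assertion failures directly.
it('commits REQUEST_TEMPLATE_TYPES', () => {
  return testAction(actions.requestTemplateTypes, null, state, [{ type: types.REQUEST_TEMPLATE_TYPES }], []);
});
```

Rejection paths move to `await expect(...).rejects`, as the `fetchTemplateTypes` case above shows.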
diff --git a/spec/frontend/ide/stores/modules/merge_requests/actions_spec.js b/spec/frontend/ide/stores/modules/merge_requests/actions_spec.js
index e1f2b165dd9..344fe3a41c3 100644
--- a/spec/frontend/ide/stores/modules/merge_requests/actions_spec.js
+++ b/spec/frontend/ide/stores/modules/merge_requests/actions_spec.js
@@ -28,21 +28,20 @@ describe('IDE merge requests actions', () => {
});
describe('requestMergeRequests', () => {
- it('should commit request', (done) => {
- testAction(
+ it('should commit request', () => {
+ return testAction(
requestMergeRequests,
null,
mockedState,
[{ type: types.REQUEST_MERGE_REQUESTS }],
[],
- done,
);
});
});
describe('receiveMergeRequestsError', () => {
- it('should commit error', (done) => {
- testAction(
+ it('should commit error', () => {
+ return testAction(
receiveMergeRequestsError,
{ type: 'created', search: '' },
mockedState,
@@ -58,20 +57,18 @@ describe('IDE merge requests actions', () => {
},
},
],
- done,
);
});
});
describe('receiveMergeRequestsSuccess', () => {
- it('should commit received data', (done) => {
- testAction(
+ it('should commit received data', () => {
+ return testAction(
receiveMergeRequestsSuccess,
mergeRequests,
mockedState,
[{ type: types.RECEIVE_MERGE_REQUESTS_SUCCESS, payload: mergeRequests }],
[],
- done,
);
});
});
@@ -118,8 +115,8 @@ describe('IDE merge requests actions', () => {
});
});
- it('dispatches success with received data', (done) => {
- testAction(
+ it('dispatches success with received data', () => {
+ return testAction(
fetchMergeRequests,
{ type: 'created' },
mockedState,
@@ -129,7 +126,6 @@ describe('IDE merge requests actions', () => {
{ type: 'resetMergeRequests' },
{ type: 'receiveMergeRequestsSuccess', payload: mergeRequests },
],
- done,
);
});
});
@@ -156,8 +152,8 @@ describe('IDE merge requests actions', () => {
);
});
- it('dispatches success with received data', (done) => {
- testAction(
+ it('dispatches success with received data', () => {
+ return testAction(
fetchMergeRequests,
{ type: null },
{ ...mockedState, ...mockedRootState },
@@ -167,7 +163,6 @@ describe('IDE merge requests actions', () => {
{ type: 'resetMergeRequests' },
{ type: 'receiveMergeRequestsSuccess', payload: mergeRequests },
],
- done,
);
});
});
@@ -177,8 +172,8 @@ describe('IDE merge requests actions', () => {
mock.onGet(/\/api\/v4\/merge_requests(.*)$/).replyOnce(500);
});
- it('dispatches error', (done) => {
- testAction(
+ it('dispatches error', () => {
+ return testAction(
fetchMergeRequests,
{ type: 'created', search: '' },
mockedState,
@@ -188,21 +183,19 @@ describe('IDE merge requests actions', () => {
{ type: 'resetMergeRequests' },
{ type: 'receiveMergeRequestsError', payload: { type: 'created', search: '' } },
],
- done,
);
});
});
});
describe('resetMergeRequests', () => {
- it('commits reset', (done) => {
- testAction(
+ it('commits reset', () => {
+ return testAction(
resetMergeRequests,
null,
mockedState,
[{ type: types.RESET_MERGE_REQUESTS }],
[],
- done,
);
});
});
diff --git a/spec/frontend/ide/stores/modules/pane/actions_spec.js b/spec/frontend/ide/stores/modules/pane/actions_spec.js
index 42fe8b400b8..98c4f22dac8 100644
--- a/spec/frontend/ide/stores/modules/pane/actions_spec.js
+++ b/spec/frontend/ide/stores/modules/pane/actions_spec.js
@@ -7,19 +7,19 @@ describe('IDE pane module actions', () => {
const TEST_VIEW_KEEP_ALIVE = { name: 'test-keep-alive', keepAlive: true };
describe('toggleOpen', () => {
- it('dispatches open if closed', (done) => {
- testAction(actions.toggleOpen, TEST_VIEW, { isOpen: false }, [], [{ type: 'open' }], done);
+ it('dispatches open if closed', () => {
+ return testAction(actions.toggleOpen, TEST_VIEW, { isOpen: false }, [], [{ type: 'open' }]);
});
- it('dispatches close if opened', (done) => {
- testAction(actions.toggleOpen, TEST_VIEW, { isOpen: true }, [], [{ type: 'close' }], done);
+ it('dispatches close if opened', () => {
+ return testAction(actions.toggleOpen, TEST_VIEW, { isOpen: true }, [], [{ type: 'close' }]);
});
});
describe('open', () => {
describe('with a view specified', () => {
- it('commits SET_OPEN and SET_CURRENT_VIEW', (done) => {
- testAction(
+ it('commits SET_OPEN and SET_CURRENT_VIEW', () => {
+ return testAction(
actions.open,
TEST_VIEW,
{},
@@ -28,12 +28,11 @@ describe('IDE pane module actions', () => {
{ type: types.SET_CURRENT_VIEW, payload: TEST_VIEW.name },
],
[],
- done,
);
});
- it('commits KEEP_ALIVE_VIEW if keepAlive is true', (done) => {
- testAction(
+ it('commits KEEP_ALIVE_VIEW if keepAlive is true', () => {
+ return testAction(
actions.open,
TEST_VIEW_KEEP_ALIVE,
{},
@@ -43,28 +42,26 @@ describe('IDE pane module actions', () => {
{ type: types.KEEP_ALIVE_VIEW, payload: TEST_VIEW_KEEP_ALIVE.name },
],
[],
- done,
);
});
});
describe('without a view specified', () => {
- it('commits SET_OPEN', (done) => {
- testAction(
+ it('commits SET_OPEN', () => {
+ return testAction(
actions.open,
undefined,
{},
[{ type: types.SET_OPEN, payload: true }],
[],
- done,
);
});
});
});
describe('close', () => {
- it('commits SET_OPEN', (done) => {
- testAction(actions.close, null, {}, [{ type: types.SET_OPEN, payload: false }], [], done);
+ it('commits SET_OPEN', () => {
+ return testAction(actions.close, null, {}, [{ type: types.SET_OPEN, payload: false }], []);
});
});
});
diff --git a/spec/frontend/ide/stores/modules/pipelines/actions_spec.js b/spec/frontend/ide/stores/modules/pipelines/actions_spec.js
index 3ede37e2eed..b76b673c3a2 100644
--- a/spec/frontend/ide/stores/modules/pipelines/actions_spec.js
+++ b/spec/frontend/ide/stores/modules/pipelines/actions_spec.js
@@ -25,6 +25,7 @@ import {
import * as types from '~/ide/stores/modules/pipelines/mutation_types';
import state from '~/ide/stores/modules/pipelines/state';
import axios from '~/lib/utils/axios_utils';
+import waitForPromises from 'helpers/wait_for_promises';
import { pipelines, jobs } from '../../../mock_data';
describe('IDE pipelines actions', () => {
@@ -44,32 +45,30 @@ describe('IDE pipelines actions', () => {
});
describe('requestLatestPipeline', () => {
- it('commits request', (done) => {
- testAction(
+ it('commits request', () => {
+ return testAction(
requestLatestPipeline,
null,
mockedState,
[{ type: types.REQUEST_LATEST_PIPELINE }],
[],
- done,
);
});
});
describe('receiveLatestPipelineError', () => {
- it('commits error', (done) => {
- testAction(
+ it('commits error', () => {
+ return testAction(
receiveLatestPipelineError,
{ status: 404 },
mockedState,
[{ type: types.RECEIVE_LASTEST_PIPELINE_ERROR }],
[{ type: 'stopPipelinePolling' }],
- done,
);
});
- it('dispatches setErrorMessage is not 404', (done) => {
- testAction(
+ it('dispatches setErrorMessage if status is not 404', () => {
+ return testAction(
receiveLatestPipelineError,
{ status: 500 },
mockedState,
@@ -86,7 +85,6 @@ describe('IDE pipelines actions', () => {
},
{ type: 'stopPipelinePolling' },
],
- done,
);
});
});
@@ -123,7 +121,7 @@ describe('IDE pipelines actions', () => {
.reply(200, { data: { foo: 'bar' } }, { 'poll-interval': '10000' });
});
- it('dispatches request', (done) => {
+ it('dispatches request', async () => {
jest.spyOn(axios, 'get');
jest.spyOn(Visibility, 'hidden').mockReturnValue(false);
@@ -133,34 +131,21 @@ describe('IDE pipelines actions', () => {
currentProject: { path_with_namespace: 'abc/def' },
};
- fetchLatestPipeline({ dispatch, rootGetters });
+ await fetchLatestPipeline({ dispatch, rootGetters });
expect(dispatch).toHaveBeenCalledWith('requestLatestPipeline');
- jest.advanceTimersByTime(1000);
-
- new Promise((resolve) => requestAnimationFrame(resolve))
- .then(() => {
- expect(axios.get).toHaveBeenCalled();
- expect(axios.get).toHaveBeenCalledTimes(1);
- expect(dispatch).toHaveBeenCalledWith(
- 'receiveLatestPipelineSuccess',
- expect.anything(),
- );
-
- jest.advanceTimersByTime(10000);
- })
- .then(() => new Promise((resolve) => requestAnimationFrame(resolve)))
- .then(() => {
- expect(axios.get).toHaveBeenCalled();
- expect(axios.get).toHaveBeenCalledTimes(2);
- expect(dispatch).toHaveBeenCalledWith(
- 'receiveLatestPipelineSuccess',
- expect.anything(),
- );
- })
- .then(done)
- .catch(done.fail);
+ await waitForPromises();
+
+ expect(axios.get).toHaveBeenCalled();
+ expect(axios.get).toHaveBeenCalledTimes(1);
+ expect(dispatch).toHaveBeenCalledWith('receiveLatestPipelineSuccess', expect.anything());
+
+ jest.advanceTimersByTime(10000);
+
+ expect(axios.get).toHaveBeenCalled();
+ expect(axios.get).toHaveBeenCalledTimes(2);
+ expect(dispatch).toHaveBeenCalledWith('receiveLatestPipelineSuccess', expect.anything());
});
});
@@ -169,27 +154,22 @@ describe('IDE pipelines actions', () => {
mock.onGet('/abc/def/commit/abc123def456ghi789jkl/pipelines').reply(500);
});
- it('dispatches error', (done) => {
+ it('dispatches error', async () => {
const dispatch = jest.fn().mockName('dispatch');
const rootGetters = {
lastCommit: { id: 'abc123def456ghi789jkl' },
currentProject: { path_with_namespace: 'abc/def' },
};
- fetchLatestPipeline({ dispatch, rootGetters });
+ await fetchLatestPipeline({ dispatch, rootGetters });
- jest.advanceTimersByTime(1500);
+ await waitForPromises();
- new Promise((resolve) => requestAnimationFrame(resolve))
- .then(() => {
- expect(dispatch).toHaveBeenCalledWith('receiveLatestPipelineError', expect.anything());
- })
- .then(done)
- .catch(done.fail);
+ expect(dispatch).toHaveBeenCalledWith('receiveLatestPipelineError', expect.anything());
});
});
- it('sets latest pipeline to `null` and stops polling on empty project', (done) => {
+ it('sets latest pipeline to `null` and stops polling on empty project', () => {
mockedState = {
...mockedState,
rootGetters: {
@@ -197,26 +177,31 @@ describe('IDE pipelines actions', () => {
},
};
- testAction(
+ return testAction(
fetchLatestPipeline,
{},
mockedState,
[{ type: types.RECEIVE_LASTEST_PIPELINE_SUCCESS, payload: null }],
[{ type: 'stopPipelinePolling' }],
- done,
);
});
});
describe('requestJobs', () => {
- it('commits request', (done) => {
- testAction(requestJobs, 1, mockedState, [{ type: types.REQUEST_JOBS, payload: 1 }], [], done);
+ it('commits request', () => {
+ return testAction(
+ requestJobs,
+ 1,
+ mockedState,
+ [{ type: types.REQUEST_JOBS, payload: 1 }],
+ [],
+ );
});
});
describe('receiveJobsError', () => {
- it('commits error', (done) => {
- testAction(
+ it('commits error', () => {
+ return testAction(
receiveJobsError,
{ id: 1 },
mockedState,
@@ -232,20 +217,18 @@ describe('IDE pipelines actions', () => {
},
},
],
- done,
);
});
});
describe('receiveJobsSuccess', () => {
- it('commits data', (done) => {
- testAction(
+ it('commits data', () => {
+ return testAction(
receiveJobsSuccess,
{ id: 1, data: jobs },
mockedState,
[{ type: types.RECEIVE_JOBS_SUCCESS, payload: { id: 1, data: jobs } }],
[],
- done,
);
});
});
@@ -258,8 +241,8 @@ describe('IDE pipelines actions', () => {
mock.onGet(stage.dropdownPath).replyOnce(200, jobs);
});
- it('dispatches request', (done) => {
- testAction(
+ it('dispatches request', () => {
+ return testAction(
fetchJobs,
stage,
mockedState,
@@ -268,7 +251,6 @@ describe('IDE pipelines actions', () => {
{ type: 'requestJobs', payload: stage.id },
{ type: 'receiveJobsSuccess', payload: { id: stage.id, data: jobs } },
],
- done,
);
});
});
@@ -278,8 +260,8 @@ describe('IDE pipelines actions', () => {
mock.onGet(stage.dropdownPath).replyOnce(500);
});
- it('dispatches error', (done) => {
- testAction(
+ it('dispatches error', () => {
+ return testAction(
fetchJobs,
stage,
mockedState,
@@ -288,69 +270,64 @@ describe('IDE pipelines actions', () => {
{ type: 'requestJobs', payload: stage.id },
{ type: 'receiveJobsError', payload: stage },
],
- done,
);
});
});
});
describe('toggleStageCollapsed', () => {
- it('commits collapse', (done) => {
- testAction(
+ it('commits collapse', () => {
+ return testAction(
toggleStageCollapsed,
1,
mockedState,
[{ type: types.TOGGLE_STAGE_COLLAPSE, payload: 1 }],
[],
- done,
);
});
});
describe('setDetailJob', () => {
- it('commits job', (done) => {
- testAction(
+ it('commits job', () => {
+ return testAction(
setDetailJob,
'job',
mockedState,
[{ type: types.SET_DETAIL_JOB, payload: 'job' }],
[{ type: 'rightPane/open', payload: rightSidebarViews.jobsDetail }],
- done,
);
});
- it('dispatches rightPane/open as pipeline when job is null', (done) => {
- testAction(
+ it('dispatches rightPane/open as pipeline when job is null', () => {
+ return testAction(
setDetailJob,
null,
mockedState,
[{ type: types.SET_DETAIL_JOB, payload: null }],
[{ type: 'rightPane/open', payload: rightSidebarViews.pipelines }],
- done,
);
});
- it('dispatches rightPane/open as job', (done) => {
- testAction(
+ it('dispatches rightPane/open as job', () => {
+ return testAction(
setDetailJob,
'job',
mockedState,
[{ type: types.SET_DETAIL_JOB, payload: 'job' }],
[{ type: 'rightPane/open', payload: rightSidebarViews.jobsDetail }],
- done,
);
});
});
describe('requestJobLogs', () => {
- it('commits request', (done) => {
- testAction(requestJobLogs, null, mockedState, [{ type: types.REQUEST_JOB_LOGS }], [], done);
+ it('commits request', () => {
+ return testAction(requestJobLogs, null, mockedState, [{ type: types.REQUEST_JOB_LOGS }], []);
});
});
describe('receiveJobLogsError', () => {
- it('commits error', (done) => {
- testAction(
+ it('commits error', () => {
+ return testAction(
receiveJobLogsError,
null,
mockedState,
@@ -366,20 +343,18 @@ describe('IDE pipelines actions', () => {
},
},
],
- done,
);
});
});
describe('receiveJobLogsSuccess', () => {
- it('commits data', (done) => {
- testAction(
+ it('commits data', () => {
+ return testAction(
receiveJobLogsSuccess,
'data',
mockedState,
[{ type: types.RECEIVE_JOB_LOGS_SUCCESS, payload: 'data' }],
[],
- done,
);
});
});
@@ -395,8 +370,8 @@ describe('IDE pipelines actions', () => {
mock.onGet(`${TEST_HOST}/project/builds/trace`).replyOnce(200, { html: 'html' });
});
- it('dispatches request', (done) => {
- testAction(
+ it('dispatches request', () => {
+ return testAction(
fetchJobLogs,
null,
mockedState,
@@ -405,7 +380,6 @@ describe('IDE pipelines actions', () => {
{ type: 'requestJobLogs' },
{ type: 'receiveJobLogsSuccess', payload: { html: 'html' } },
],
- done,
);
});
@@ -426,22 +400,21 @@ describe('IDE pipelines actions', () => {
mock.onGet(`${TEST_HOST}/project/builds/trace`).replyOnce(500);
});
- it('dispatches error', (done) => {
- testAction(
+ it('dispatches error', () => {
+ return testAction(
fetchJobLogs,
null,
mockedState,
[],
[{ type: 'requestJobLogs' }, { type: 'receiveJobLogsError' }],
- done,
);
});
});
});
describe('resetLatestPipeline', () => {
- it('commits reset mutations', (done) => {
- testAction(
+ it('commits reset mutations', () => {
+ return testAction(
resetLatestPipeline,
null,
mockedState,
@@ -450,7 +423,6 @@ describe('IDE pipelines actions', () => {
{ type: types.SET_DETAIL_JOB, payload: null },
],
[],
- done,
);
});
});
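The `fetchLatestPipeline` hunks above also swap chained `requestAnimationFrame` promises for `waitForPromises` plus Jest's fake timers. A condensed sketch of the resulting flow, mirroring the hunk (it assumes the suite runs with fake timers enabled, as GitLab's Jest setup does):

```javascript
import waitForPromises from 'helpers/wait_for_promises';

it('polls the pipelines endpoint', async () => {
  await fetchLatestPipeline({ dispatch, rootGetters }); // first request starts

  await waitForPromises(); // flush microtasks so the mocked axios handler runs
  expect(axios.get).toHaveBeenCalledTimes(1);

  jest.advanceTimersByTime(10000); // fire the next poll interval synchronously
  expect(axios.get).toHaveBeenCalledTimes(2);
});
```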
diff --git a/spec/frontend/ide/stores/modules/terminal_sync/actions_spec.js b/spec/frontend/ide/stores/modules/terminal_sync/actions_spec.js
index 22b0615c6d0..448fd909f39 100644
--- a/spec/frontend/ide/stores/modules/terminal_sync/actions_spec.js
+++ b/spec/frontend/ide/stores/modules/terminal_sync/actions_spec.js
@@ -22,43 +22,37 @@ describe('ide/stores/modules/terminal_sync/actions', () => {
});
describe('upload', () => {
- it('uploads to mirror and sets success', (done) => {
+ it('uploads to mirror and sets success', async () => {
mirror.upload.mockReturnValue(Promise.resolve());
- testAction(
+ await testAction(
actions.upload,
null,
rootState,
[{ type: types.START_LOADING }, { type: types.SET_SUCCESS }],
[],
- () => {
- expect(mirror.upload).toHaveBeenCalledWith(rootState);
- done();
- },
);
+ expect(mirror.upload).toHaveBeenCalledWith(rootState);
});
- it('sets error when failed', (done) => {
+ it('sets error when failed', () => {
const err = { message: 'it failed!' };
mirror.upload.mockReturnValue(Promise.reject(err));
- testAction(
+ return testAction(
actions.upload,
null,
rootState,
[{ type: types.START_LOADING }, { type: types.SET_ERROR, payload: err }],
[],
- done,
);
});
});
describe('stop', () => {
- it('disconnects from mirror', (done) => {
- testAction(actions.stop, null, rootState, [{ type: types.STOP }], [], () => {
- expect(mirror.disconnect).toHaveBeenCalled();
- done();
- });
+ it('disconnects from mirror', async () => {
+ await testAction(actions.stop, null, rootState, [{ type: types.STOP }], []);
+ expect(mirror.disconnect).toHaveBeenCalled();
});
});
@@ -83,20 +77,17 @@ describe('ide/stores/modules/terminal_sync/actions', () => {
};
});
- it('connects to mirror and sets success', (done) => {
+ it('connects to mirror and sets success', async () => {
mirror.connect.mockReturnValue(Promise.resolve());
- testAction(
+ await testAction(
actions.start,
null,
rootState,
[{ type: types.START_LOADING }, { type: types.SET_SUCCESS }],
[],
- () => {
- expect(mirror.connect).toHaveBeenCalledWith(TEST_SESSION.proxyWebsocketPath);
- done();
- },
);
+ expect(mirror.connect).toHaveBeenCalledWith(TEST_SESSION.proxyWebsocketPath);
});
it('sets error if connection fails', () => {
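Where assertions used to live inside the callback passed to `testAction`, the migrated specs await the helper and assert afterwards, as in the `stop` action above:

```javascript
it('disconnects from mirror', async () => {
  await testAction(actions.stop, null, rootState, [{ type: types.STOP }], []);

  // Side-effect checks follow the awaited helper instead of running
  // inside the old `done` callback.
  expect(mirror.disconnect).toHaveBeenCalled();
});
```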
diff --git a/spec/frontend/issues/show/components/incidents/incident_tabs_spec.js b/spec/frontend/issues/show/components/incidents/incident_tabs_spec.js
index 890bf88f6a2..de32e057f27 100644
--- a/spec/frontend/issues/show/components/incidents/incident_tabs_spec.js
+++ b/spec/frontend/issues/show/components/incidents/incident_tabs_spec.js
@@ -34,6 +34,7 @@ describe('Incident Tabs component', () => {
provide: {
fullPath: '',
iid: '',
+ projectId: '',
uploadMetricsFeatureAvailable: true,
glFeatures: { incidentTimeline: true, incidentTimelineEvents: true },
},
diff --git a/spec/frontend/packages_and_registries/dependency_proxy/app_spec.js b/spec/frontend/packages_and_registries/dependency_proxy/app_spec.js
index 4e6870be5ff..dbe9793fb8c 100644
--- a/spec/frontend/packages_and_registries/dependency_proxy/app_spec.js
+++ b/spec/frontend/packages_and_registries/dependency_proxy/app_spec.js
@@ -259,6 +259,12 @@ describe('DependencyProxyApp', () => {
it('shows the clear cache dropdown list', () => {
expect(findClearCacheDropdownList().exists()).toBe(true);
+
+ const clearCacheDropdownItem = findClearCacheDropdownList().findComponent(
+ GlDropdownItem,
+ );
+
+ expect(clearCacheDropdownItem.text()).toBe('Clear cache');
});
it('shows the clear cache confirmation modal', () => {
diff --git a/spec/frontend/runner/components/runner_assigned_item_spec.js b/spec/frontend/runner/components/runner_assigned_item_spec.js
index c6156c16d4a..1ff6983fbe7 100644
--- a/spec/frontend/runner/components/runner_assigned_item_spec.js
+++ b/spec/frontend/runner/components/runner_assigned_item_spec.js
@@ -1,6 +1,7 @@
import { GlAvatar } from '@gitlab/ui';
import { shallowMountExtended } from 'helpers/vue_test_utils_helper';
import RunnerAssignedItem from '~/runner/components/runner_assigned_item.vue';
+import { AVATAR_SHAPE_OPTION_RECT } from '~/vue_shared/constants';
const mockHref = '/group/project';
const mockName = 'Project';
@@ -40,7 +41,7 @@ describe('RunnerAssignedItem', () => {
alt: mockName,
entityName: mockName,
src: mockAvatarUrl,
- shape: 'rect',
+ shape: AVATAR_SHAPE_OPTION_RECT,
size: 48,
});
});
diff --git a/spec/frontend/vue_shared/components/markdown/field_spec.js b/spec/frontend/vue_shared/components/markdown/field_spec.js
index cf98630c654..d1c4d777d44 100644
--- a/spec/frontend/vue_shared/components/markdown/field_spec.js
+++ b/spec/frontend/vue_shared/components/markdown/field_spec.js
@@ -85,7 +85,7 @@ describe('Markdown field component', () => {
describe('mounted', () => {
const previewHTML = `
<p>markdown preview</p>
- <video src="${FIXTURES_PATH}/static/mock-video.mp4" muted="muted"></video>
+ <video src="${FIXTURES_PATH}/static/mock-video.mp4"></video>
`;
let previewLink;
let writeLink;
diff --git a/spec/frontend/vue_shared/components/metric_images/__snapshots__/metric_images_table_spec.js.snap b/spec/frontend/vue_shared/components/metric_images/__snapshots__/metric_images_table_spec.js.snap
new file mode 100644
index 00000000000..5dd12d9edf5
--- /dev/null
+++ b/spec/frontend/vue_shared/components/metric_images/__snapshots__/metric_images_table_spec.js.snap
@@ -0,0 +1,73 @@
+// Jest Snapshot v1, https://goo.gl/fbAQLP
+
+exports[`Metrics upload item renders the metrics image component 1`] = `
+<gl-card-stub
+ bodyclass="gl-border-1,gl-border-t-solid,gl-border-gray-100,[object Object]"
+ class="collapsible-card border gl-p-0 gl-mb-5"
+ footerclass=""
+ headerclass="gl-display-flex gl-align-items-center gl-border-b-0 gl-py-3"
+>
+ <gl-modal-stub
+ actioncancel="[object Object]"
+ actionprimary="[object Object]"
+ body-class="gl-pb-0! gl-min-h-6!"
+ dismisslabel="Close"
+ modalclass=""
+ modalid="delete-metric-modal"
+ size="sm"
+ titletag="h4"
+ >
+
+ <p>
+ Are you sure you wish to delete this image?
+ </p>
+ </gl-modal-stub>
+
+ <gl-modal-stub
+ actioncancel="[object Object]"
+ actionprimary="[object Object]"
+ data-testid="metric-image-edit-modal"
+ dismisslabel="Close"
+ modalclass=""
+ modalid="edit-metric-modal"
+ size="sm"
+ titletag="h4"
+ >
+
+ <gl-form-group-stub
+ label="Text (optional)"
+ label-for="upload-text-input"
+ labeldescription=""
+ optionaltext="(optional)"
+ >
+ <gl-form-input-stub
+ data-testid="metric-image-text-field"
+ id="upload-text-input"
+ />
+ </gl-form-group-stub>
+
+ <gl-form-group-stub
+ description="Must start with http or https"
+ label="Link (optional)"
+ label-for="upload-url-input"
+ labeldescription=""
+ optionaltext="(optional)"
+ >
+ <gl-form-input-stub
+ data-testid="metric-image-url-field"
+ id="upload-url-input"
+ />
+ </gl-form-group-stub>
+ </gl-modal-stub>
+
+ <div
+ class="gl-display-flex gl-flex-direction-column"
+ data-testid="metric-image-body"
+ >
+ <img
+ class="gl-max-w-full gl-align-self-center"
+ src="test_file_path"
+ />
+ </div>
+</gl-card-stub>
+`;
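This `.snap` file is generated rather than hand-written: Jest serializes the stubbed render tree produced by the `toMatchSnapshot()` assertion in `metric_images_table_spec.js` (further down in this diff) and stores it here. A sketch of the producing side:

```javascript
it('renders the metrics image component', () => {
  mountComponent({}, shallowMount);

  // Serializes wrapper.element and diffs it against the stored snapshot;
  // any markup change fails the test until the snapshot is regenerated.
  expect(wrapper.element).toMatchSnapshot();
});
```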
diff --git a/spec/frontend/vue_shared/components/metric_images/metric_images_tab_spec.js b/spec/frontend/vue_shared/components/metric_images/metric_images_tab_spec.js
new file mode 100644
index 00000000000..2cefa77b72d
--- /dev/null
+++ b/spec/frontend/vue_shared/components/metric_images/metric_images_tab_spec.js
@@ -0,0 +1,174 @@
+import { GlFormInput, GlModal } from '@gitlab/ui';
+import { shallowMount } from '@vue/test-utils';
+import Vue from 'vue';
+import merge from 'lodash/merge';
+import Vuex from 'vuex';
+import MetricImagesTable from '~/vue_shared/components/metric_images/metric_images_table.vue';
+import MetricImagesTab from '~/vue_shared/components/metric_images/metric_images_tab.vue';
+import createStore from '~/vue_shared/components/metric_images/store';
+import waitForPromises from 'helpers/wait_for_promises';
+import UploadDropzone from '~/vue_shared/components/upload_dropzone/upload_dropzone.vue';
+import { fileList, initialData } from './mock_data';
+
+const service = {
+ getMetricImages: jest.fn(),
+};
+
+const mockEvent = { preventDefault: jest.fn() };
+
+Vue.use(Vuex);
+
+describe('Metric images tab', () => {
+ let wrapper;
+ let store;
+
+ const mountComponent = (options = {}) => {
+ store = createStore({}, service);
+
+ wrapper = shallowMount(
+ MetricImagesTab,
+ merge(
+ {
+ store,
+ provide: {
+ canUpdate: true,
+ iid: initialData.issueIid,
+ projectId: initialData.projectId,
+ },
+ },
+ options,
+ ),
+ );
+ };
+
+ beforeEach(() => {
+ mountComponent();
+ });
+
+ afterEach(() => {
+ if (wrapper) {
+ wrapper.destroy();
+ wrapper = null;
+ }
+ });
+
+ const findUploadDropzone = () => wrapper.findComponent(UploadDropzone);
+ const findImages = () => wrapper.findAllComponents(MetricImagesTable);
+ const findModal = () => wrapper.findComponent(GlModal);
+ const submitModal = () => findModal().vm.$emit('primary', mockEvent);
+ const cancelModal = () => findModal().vm.$emit('hidden');
+
+ describe('empty state', () => {
+ beforeEach(() => {
+ mountComponent();
+ });
+
+ it('renders the upload component', () => {
+ expect(findUploadDropzone().exists()).toBe(true);
+ });
+ });
+
+ describe('permissions', () => {
+ beforeEach(() => {
+ mountComponent({ provide: { canUpdate: false } });
+ });
+
+ it('hides the upload component when disallowed', () => {
+ expect(findUploadDropzone().exists()).toBe(false);
+ });
+ });
+
+ describe('onLoad action', () => {
+ it('should load images', async () => {
+ service.getMetricImages.mockImplementation(() => Promise.resolve(fileList));
+
+ mountComponent();
+
+ await waitForPromises();
+
+ expect(findImages().length).toBe(1);
+ });
+ });
+
+ describe('add metric dialog', () => {
+ const testUrl = 'test url';
+
+ it('should open the add metric dialog when clicked', async () => {
+ mountComponent();
+
+ findUploadDropzone().vm.$emit('change');
+
+ await waitForPromises();
+
+ expect(findModal().attributes('visible')).toBe('true');
+ });
+
+ it('should close when cancelled', async () => {
+ mountComponent({
+ data() {
+ return { modalVisible: true };
+ },
+ });
+
+ cancelModal();
+
+ await waitForPromises();
+
+ expect(findModal().attributes('visible')).toBeFalsy();
+ });
+
+ it('should add files and url when selected', async () => {
+ mountComponent({
+ data() {
+ return { modalVisible: true, modalUrl: testUrl, currentFiles: fileList };
+ },
+ });
+
+ const dispatchSpy = jest.spyOn(store, 'dispatch');
+
+ submitModal();
+
+ await waitForPromises();
+
+ expect(dispatchSpy).toHaveBeenCalledWith('uploadImage', {
+ files: fileList,
+ url: testUrl,
+ urlText: '',
+ });
+ });
+
+ describe('url field', () => {
+ beforeEach(() => {
+ mountComponent({
+ data() {
+ return { modalVisible: true, modalUrl: testUrl };
+ },
+ });
+ });
+
+ it('should display the url field', () => {
+ expect(wrapper.find('#upload-url-input').attributes('value')).toBe(testUrl);
+ });
+
+ it('should display the url text field', () => {
+ expect(wrapper.find('#upload-text-input').attributes('value')).toBe('');
+ });
+
+ it('should clear url when cancelled', async () => {
+ cancelModal();
+
+ await waitForPromises();
+
+ expect(wrapper.findComponent(GlFormInput).attributes('value')).toBe('');
+ });
+
+ it('should clear url when submitted', async () => {
+ submitModal();
+
+ await waitForPromises();
+
+ expect(wrapper.findComponent(GlFormInput).attributes('value')).toBe('');
+ });
+ });
+ });
+});
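A note on the `mountComponent` factory above: `lodash/merge` deep-merges per-test overrides into shared defaults, so a test can replace one injected value, or add `data()`, without restating the store or the rest of `provide`. Sketch using the names from this spec:

```javascript
const mountComponent = (options = {}) => {
  store = createStore({}, service);

  wrapper = shallowMount(
    MetricImagesTab,
    merge(
      {
        store,
        provide: { canUpdate: true, iid: initialData.issueIid, projectId: initialData.projectId },
      },
      options, // e.g. { provide: { canUpdate: false } } overrides that key only
    ),
  );
};
```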
diff --git a/spec/frontend/vue_shared/components/metric_images/metric_images_table_spec.js b/spec/frontend/vue_shared/components/metric_images/metric_images_table_spec.js
new file mode 100644
index 00000000000..d792bd46ccd
--- /dev/null
+++ b/spec/frontend/vue_shared/components/metric_images/metric_images_table_spec.js
@@ -0,0 +1,230 @@
+import { GlLink, GlModal } from '@gitlab/ui';
+import { shallowMount, mount } from '@vue/test-utils';
+import Vue from 'vue';
+import merge from 'lodash/merge';
+import Vuex from 'vuex';
+import createStore from '~/vue_shared/components/metric_images/store';
+import MetricsImageTable from '~/vue_shared/components/metric_images/metric_images_table.vue';
+import waitForPromises from 'helpers/wait_for_promises';
+
+const defaultProps = {
+ id: 1,
+ filePath: 'test_file_path',
+ filename: 'test_file_name',
+};
+
+const mockEvent = { preventDefault: jest.fn() };
+
+Vue.use(Vuex);
+
+describe('Metrics upload item', () => {
+ let wrapper;
+ let store;
+
+ const mountComponent = (options = {}, mountMethod = mount) => {
+ store = createStore();
+
+ wrapper = mountMethod(
+ MetricsImageTable,
+ merge(
+ {
+ store,
+ propsData: {
+ ...defaultProps,
+ },
+ provide: { canUpdate: true },
+ },
+ options,
+ ),
+ );
+ };
+
+ afterEach(() => {
+ if (wrapper) {
+ wrapper.destroy();
+ wrapper = null;
+ }
+ });
+
+ const findImageLink = () => wrapper.findComponent(GlLink);
+ const findLabelTextSpan = () => wrapper.find('[data-testid="metric-image-label-span"]');
+ const findCollapseButton = () => wrapper.find('[data-testid="collapse-button"]');
+ const findMetricImageBody = () => wrapper.find('[data-testid="metric-image-body"]');
+ const findModal = () => wrapper.findComponent(GlModal);
+ const findEditModal = () => wrapper.find('[data-testid="metric-image-edit-modal"]');
+ const findDeleteButton = () => wrapper.find('[data-testid="delete-button"]');
+ const findEditButton = () => wrapper.find('[data-testid="edit-button"]');
+ const findImageTextInput = () => wrapper.find('[data-testid="metric-image-text-field"]');
+ const findImageUrlInput = () => wrapper.find('[data-testid="metric-image-url-field"]');
+
+ const closeModal = () => findModal().vm.$emit('hidden');
+ const submitModal = () => findModal().vm.$emit('primary', mockEvent);
+ const deleteImage = () => findDeleteButton().vm.$emit('click');
+ const closeEditModal = () => findEditModal().vm.$emit('hidden');
+ const submitEditModal = () => findEditModal().vm.$emit('primary', mockEvent);
+ const editImage = () => findEditButton().vm.$emit('click');
+
+ it('renders the metrics image component', () => {
+ mountComponent({}, shallowMount);
+
+ expect(wrapper.element).toMatchSnapshot();
+ });
+
+ it('shows a link with the correct url', () => {
+ const testUrl = 'test_url';
+ mountComponent({ propsData: { url: testUrl } });
+
+ expect(findImageLink().attributes('href')).toBe(testUrl);
+ expect(findImageLink().text()).toBe(defaultProps.filename);
+ });
+
+ it('shows a link with the url text, if url text is present', () => {
+ const testUrl = 'test_url';
+ const testUrlText = 'test_url_text';
+ mountComponent({ propsData: { url: testUrl, urlText: testUrlText } });
+
+ expect(findImageLink().attributes('href')).toBe(testUrl);
+ expect(findImageLink().text()).toBe(testUrlText);
+ });
+
+ it('shows the url text with no url, if no url is present', () => {
+ const testUrlText = 'test_url_text';
+ mountComponent({ propsData: { urlText: testUrlText } });
+
+ expect(findLabelTextSpan().text()).toBe(testUrlText);
+ });
+
+ describe('expand and collapse', () => {
+ beforeEach(() => {
+ mountComponent();
+ });
+
+ it('the card is expanded by default', () => {
+ expect(findMetricImageBody().isVisible()).toBe(true);
+ });
+
+ it('the card is collapsed when clicked', async () => {
+ findCollapseButton().trigger('click');
+
+ await waitForPromises();
+
+ expect(findMetricImageBody().isVisible()).toBe(false);
+ });
+ });
+
+ describe('delete functionality', () => {
+ it('should open the delete modal when clicked', async () => {
+ mountComponent({ stubs: { GlModal: true } });
+
+ deleteImage();
+
+ await waitForPromises();
+
+ expect(findModal().attributes('visible')).toBe('true');
+ });
+
+ describe('when the modal is open', () => {
+ beforeEach(() => {
+ mountComponent(
+ {
+ data() {
+ return { modalVisible: true };
+ },
+ },
+ shallowMount,
+ );
+ });
+
+ it('should close the modal when cancelled', async () => {
+ closeModal();
+
+ await waitForPromises();
+
+ expect(findModal().attributes('visible')).toBeFalsy();
+ });
+
+ it('should delete the image when selected', async () => {
+ const dispatchSpy = jest.spyOn(store, 'dispatch').mockImplementation(jest.fn());
+
+ submitModal();
+
+ await waitForPromises();
+
+ expect(dispatchSpy).toHaveBeenCalledWith('deleteImage', defaultProps.id);
+ });
+ });
+
+ describe('canUpdate permission', () => {
+ it('delete button is hidden when user lacks update permissions', () => {
+ mountComponent({ provide: { canUpdate: false } });
+
+ expect(findDeleteButton().exists()).toBe(false);
+ });
+ });
+ });
+
+ describe('edit functionality', () => {
+ it('should open the edit modal when clicked', async () => {
+ mountComponent({ stubs: { GlModal: true } });
+
+ editImage();
+
+ await waitForPromises();
+
+ expect(findEditModal().attributes('visible')).toBe('true');
+ });
+
+ describe('when the modal is open', () => {
+ beforeEach(() => {
+ mountComponent({
+ data() {
+ return { editModalVisible: true };
+ },
+ propsData: { urlText: 'test' },
+ stubs: { GlModal: true },
+ });
+ });
+
+ it('should close the modal when cancelled', async () => {
+ closeEditModal();
+
+ await waitForPromises();
+
+ expect(findEditModal().attributes('visible')).toBeFalsy();
+ });
+
+ it('should update the image when selected', async () => {
+ const dispatchSpy = jest.spyOn(store, 'dispatch').mockImplementation(jest.fn());
+
+ submitEditModal();
+
+ await waitForPromises();
+
+ expect(dispatchSpy).toHaveBeenCalledWith('updateImage', {
+ imageId: defaultProps.id,
+ url: null,
+ urlText: 'test',
+ });
+ });
+
+ it('should clear edits when the modal is closed', async () => {
+ await findImageTextInput().setValue('test value');
+ await findImageUrlInput().setValue('http://www.gitlab.com');
+
+ expect(findImageTextInput().element.value).toBe('test value');
+ expect(findImageUrlInput().element.value).toBe('http://www.gitlab.com');
+
+ closeEditModal();
+
+ await waitForPromises();
+
+ editImage();
+
+ await waitForPromises();
+
+ expect(findImageTextInput().element.value).toBe('test');
+ expect(findImageUrlInput().element.value).toBe('');
+ });
+ });
+ });
+});
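The finder and interaction helpers in this spec follow a common GitLab idiom: selectors are declared once, keyed on `data-testid` attributes instead of CSS classes, and interactions are one-line wrappers over the finders:

```javascript
// A markup change means updating a single selector line.
const findDeleteButton = () => wrapper.find('[data-testid="delete-button"]');
const findEditModal = () => wrapper.find('[data-testid="metric-image-edit-modal"]');

// Interactions compose the finders into readable one-liners.
const deleteImage = () => findDeleteButton().vm.$emit('click');
const submitEditModal = () => findEditModal().vm.$emit('primary', mockEvent);
```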
diff --git a/spec/frontend/vue_shared/components/metric_images/mock_data.js b/spec/frontend/vue_shared/components/metric_images/mock_data.js
new file mode 100644
index 00000000000..480491077fb
--- /dev/null
+++ b/spec/frontend/vue_shared/components/metric_images/mock_data.js
@@ -0,0 +1,5 @@
+export const fileList = [{ filePath: 'test', filename: 'hello', id: 5, url: null }];
+
+export const fileListRaw = [{ file_path: 'test', filename: 'hello', id: 5, url: null }];
+
+export const initialData = { issueIid: '123', projectId: 456 };
diff --git a/spec/frontend/vue_shared/components/metric_images/store/actions_spec.js b/spec/frontend/vue_shared/components/metric_images/store/actions_spec.js
new file mode 100644
index 00000000000..518cf354675
--- /dev/null
+++ b/spec/frontend/vue_shared/components/metric_images/store/actions_spec.js
@@ -0,0 +1,158 @@
+import Vue from 'vue';
+import Vuex from 'vuex';
+import actionsFactory from '~/vue_shared/components/metric_images/store/actions';
+import * as types from '~/vue_shared/components/metric_images/store/mutation_types';
+import createStore from '~/vue_shared/components/metric_images/store';
+import testAction from 'helpers/vuex_action_helper';
+import createFlash from '~/flash';
+import { convertObjectPropsToCamelCase } from '~/lib/utils/common_utils';
+import { fileList, initialData } from '../mock_data';
+
+jest.mock('~/flash');
+const service = {
+ getMetricImages: jest.fn(),
+ uploadMetricImage: jest.fn(),
+ updateMetricImage: jest.fn(),
+ deleteMetricImage: jest.fn(),
+};
+
+const actions = actionsFactory(service);
+
+const defaultState = {
+ issueIid: 1,
+ projectId: '2',
+};
+
+Vue.use(Vuex);
+
+describe('Metrics tab store actions', () => {
+ let store;
+ let state;
+
+ beforeEach(() => {
+ store = createStore(defaultState);
+ state = store.state;
+ });
+
+ afterEach(() => {
+ createFlash.mockClear();
+ });
+
+ describe('fetching metric images', () => {
+ it('should call success action when fetching metric images', () => {
+ service.getMetricImages.mockImplementation(() => Promise.resolve(fileList));
+
+ return testAction(actions.fetchImages, null, state, [
+ { type: types.REQUEST_METRIC_IMAGES },
+ {
+ type: types.RECEIVE_METRIC_IMAGES_SUCCESS,
+ payload: convertObjectPropsToCamelCase(fileList, { deep: true }),
+ },
+ ]);
+ });
+
+ it('should call error action when fetching metric images with an error', async () => {
+ service.getMetricImages.mockImplementation(() => Promise.reject());
+
+ await testAction(
+ actions.fetchImages,
+ null,
+ state,
+ [{ type: types.REQUEST_METRIC_IMAGES }, { type: types.RECEIVE_METRIC_IMAGES_ERROR }],
+ [],
+ );
+ expect(createFlash).toHaveBeenCalled();
+ });
+ });
+
+ describe('uploading metric images', () => {
+ const payload = {
+ // mock the FileList api
+ files: {
+ item() {
+ return fileList[0];
+ },
+ },
+ url: 'test_url',
+ };
+
+ it('should call success action when uploading an image', () => {
+ service.uploadMetricImage.mockImplementation(() => Promise.resolve(fileList[0]));
+
+ return testAction(actions.uploadImage, payload, state, [
+ { type: types.REQUEST_METRIC_UPLOAD },
+ {
+ type: types.RECEIVE_METRIC_UPLOAD_SUCCESS,
+ payload: fileList[0],
+ },
+ ]);
+ });
+
+ it('should call error action when failing to upload an image', async () => {
+ service.uploadMetricImage.mockImplementation(() => Promise.reject());
+
+ await testAction(
+ actions.uploadImage,
+ payload,
+ state,
+ [{ type: types.REQUEST_METRIC_UPLOAD }, { type: types.RECEIVE_METRIC_UPLOAD_ERROR }],
+ [],
+ );
+ expect(createFlash).toHaveBeenCalled();
+ });
+ });
+
+ describe('updating metric images', () => {
+ const payload = {
+ url: 'test_url',
+ urlText: 'url text',
+ };
+
+ it('should call success action when updating an image', () => {
+ service.updateMetricImage.mockImplementation(() => Promise.resolve());
+
+ return testAction(actions.updateImage, payload, state, [
+ { type: types.REQUEST_METRIC_UPLOAD },
+ {
+ type: types.RECEIVE_METRIC_UPDATE_SUCCESS,
+ },
+ ]);
+ });
+
+ it('should call error action when failing to update an image', async () => {
+ service.updateMetricImage.mockImplementation(() => Promise.reject());
+
+ await testAction(
+ actions.updateImage,
+ payload,
+ state,
+ [{ type: types.REQUEST_METRIC_UPLOAD }, { type: types.RECEIVE_METRIC_UPLOAD_ERROR }],
+ [],
+ );
+ expect(createFlash).toHaveBeenCalled();
+ });
+ });
+
+ describe('deleting a metric image', () => {
+ const payload = fileList[0].id;
+
+ it('should call success action when deleting an image', () => {
+ service.deleteMetricImage.mockImplementation(() => Promise.resolve());
+
+ return testAction(actions.deleteImage, payload, state, [
+ {
+ type: types.RECEIVE_METRIC_DELETE_SUCCESS,
+ payload,
+ },
+ ]);
+ });
+ });
+
+ describe('initial data', () => {
+ it('should set the initial data correctly', () => {
+ return testAction(actions.setInitialData, initialData, state, [
+ { type: types.SET_INITIAL_DATA, payload: initialData },
+ ]);
+ });
+ });
+});
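These specs build their actions with `actionsFactory(service)`, injecting `jest.fn()` mocks in place of the real HTTP service. The factory's internals are not part of this diff; the following is a hedged sketch of the shape the spec implies, with the service call signature and flash message as assumptions:

```javascript
import createFlash from '~/flash';
import { convertObjectPropsToCamelCase } from '~/lib/utils/common_utils';
import * as types from './mutation_types';

// Actions close over the injected service, so tests can swap in mocks
// instead of intercepting HTTP requests.
export default (service) => ({
  async fetchImages({ state, commit }) {
    commit(types.REQUEST_METRIC_IMAGES);
    try {
      // Assumed signature; the spec only shows that getMetricImages is called.
      const response = await service.getMetricImages({
        id: state.projectId,
        issueIid: state.issueIid,
      });
      commit(
        types.RECEIVE_METRIC_IMAGES_SUCCESS,
        convertObjectPropsToCamelCase(response, { deep: true }),
      );
    } catch {
      commit(types.RECEIVE_METRIC_IMAGES_ERROR);
      createFlash({ message: 'There was an issue loading metric images.' }); // assumed copy
    }
  },
});
```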
diff --git a/spec/frontend/vue_shared/components/metric_images/store/mutations_spec.js b/spec/frontend/vue_shared/components/metric_images/store/mutations_spec.js
new file mode 100644
index 00000000000..754f729e657
--- /dev/null
+++ b/spec/frontend/vue_shared/components/metric_images/store/mutations_spec.js
@@ -0,0 +1,147 @@
+import { cloneDeep } from 'lodash';
+import * as types from '~/vue_shared/components/metric_images/store/mutation_types';
+import mutations from '~/vue_shared/components/metric_images/store/mutations';
+import { initialData } from '../mock_data';
+
+const defaultState = {
+ metricImages: [],
+ isLoadingMetricImages: false,
+ isUploadingImage: false,
+};
+
+const testImages = [
+ { filename: 'test.filename', id: 5, filePath: 'test/file/path', url: null },
+ { filename: 'second.filename', id: 6, filePath: 'second/file/path', url: 'test/url' },
+ { filename: 'third.filename', id: 7, filePath: 'third/file/path', url: 'test/url' },
+];
+
+describe('Metric images mutations', () => {
+ let state;
+
+ const createState = (customState = {}) => {
+ state = {
+ ...cloneDeep(defaultState),
+ ...customState,
+ };
+ };
+
+ beforeEach(() => {
+ createState();
+ });
+
+ describe('REQUEST_METRIC_IMAGES', () => {
+ beforeEach(() => {
+ mutations[types.REQUEST_METRIC_IMAGES](state);
+ });
+
+ it('should set the loading state', () => {
+ expect(state.isLoadingMetricImages).toBe(true);
+ });
+ });
+
+ describe('RECEIVE_METRIC_IMAGES_SUCCESS', () => {
+ beforeEach(() => {
+ mutations[types.RECEIVE_METRIC_IMAGES_SUCCESS](state, testImages);
+ });
+
+ it('should unset the loading state', () => {
+ expect(state.isLoadingMetricImages).toBe(false);
+ });
+
+ it('should set the metric images', () => {
+ expect(state.metricImages).toEqual(testImages);
+ });
+ });
+
+ describe('RECEIVE_METRIC_IMAGES_ERROR', () => {
+ beforeEach(() => {
+ mutations[types.RECEIVE_METRIC_IMAGES_ERROR](state);
+ });
+
+ it('should unset the loading state', () => {
+ expect(state.isLoadingMetricImages).toBe(false);
+ });
+ });
+
+ describe('REQUEST_METRIC_UPLOAD', () => {
+ beforeEach(() => {
+ mutations[types.REQUEST_METRIC_UPLOAD](state);
+ });
+
+ it('should set the loading state', () => {
+ expect(state.isUploadingImage).toBe(true);
+ });
+ });
+
+ describe('RECEIVE_METRIC_UPLOAD_SUCCESS', () => {
+ const initialImage = testImages[0];
+ const newImage = testImages[1];
+
+ beforeEach(() => {
+ createState({ metricImages: [initialImage] });
+ mutations[types.RECEIVE_METRIC_UPLOAD_SUCCESS](state, newImage);
+ });
+
+ it('should unset the loading state', () => {
+ expect(state.isUploadingImage).toBe(false);
+ });
+
+ it('should add the new metric image after the existing one', () => {
+ expect(state.metricImages).toMatchObject([initialImage, newImage]);
+ });
+ });
+
+ describe('RECEIVE_METRIC_UPLOAD_ERROR', () => {
+ beforeEach(() => {
+ mutations[types.RECEIVE_METRIC_UPLOAD_ERROR](state);
+ });
+
+ it('should unset the loading state', () => {
+ expect(state.isUploadingImage).toBe(false);
+ });
+ });
+
+ describe('RECEIVE_METRIC_UPDATE_SUCCESS', () => {
+ const initialImage = testImages[0];
+ // Spread copy: keeps initialImage distinct and avoids mutating the shared testImages fixture.
+ const newImage = { ...testImages[0], url: 'https://www.gitlab.com' };
+
+ beforeEach(() => {
+ createState({ metricImages: [initialImage] });
+ mutations[types.RECEIVE_METRIC_UPDATE_SUCCESS](state, newImage);
+ });
+
+ it('should unset the loading state', () => {
+ expect(state.isUploadingImage).toBe(false);
+ });
+
+ it('should replace the existing image with the new one', () => {
+ expect(state.metricImages).toMatchObject([newImage]);
+ });
+ });
+
+ describe('RECEIVE_METRIC_DELETE_SUCCESS', () => {
+ const deletedImageId = testImages[1].id;
+ const expectedResult = [testImages[0], testImages[2]];
+
+ beforeEach(() => {
+ createState({ metricImages: [...testImages] });
+ mutations[types.RECEIVE_METRIC_DELETE_SUCCESS](state, deletedImageId);
+ });
+
+ it('should remove the correct metric image', () => {
+ expect(state.metricImages).toEqual(expectedResult);
+ });
+ });
+
+ describe('SET_INITIAL_DATA', () => {
+ beforeEach(() => {
+ mutations[types.SET_INITIAL_DATA](state, initialData);
+ });
+
+ it('should set the initial data', () => {
+ expect(state.issueIid).toBe(initialData.issueIid);
+ expect(state.projectId).toBe(initialData.projectId);
+ });
+ });
+});
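The mutation specs above need no store at all: a Vuex mutation is a plain `(state, payload)` function, so each test mutates a local state object and asserts on it directly:

```javascript
const state = { metricImages: [], isLoadingMetricImages: false, isUploadingImage: false };

mutations[types.REQUEST_METRIC_IMAGES](state);
expect(state.isLoadingMetricImages).toBe(true);

mutations[types.RECEIVE_METRIC_IMAGES_SUCCESS](state, testImages);
expect(state.isLoadingMetricImages).toBe(false);
expect(state.metricImages).toEqual(testImages);
```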
diff --git a/workhorse/README.md b/workhorse/README.md
index ee62c230aa7..c57f90b4a49 100644
--- a/workhorse/README.md
+++ b/workhorse/README.md
@@ -4,37 +4,11 @@ group: Source Code
info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
---
-# GitLab Workhorse
+# Workhorse documentation
-GitLab Workhorse is a smart reverse proxy for GitLab. It handles
-"large" HTTP requests such as file downloads, file uploads, Git
-push/pull and Git archive downloads.
+This document was moved to [another location](../doc/development/workhorse/index.md).
-Workhorse itself is not a feature, but there are [several features in
-GitLab](doc/architecture/gitlab_features.md) that would not work efficiently without Workhorse.
-
-## Canonical source
-
-The canonical source for Workhorse is
-[`gitlab-org/gitlab/workhorse`](https://gitlab.com/gitlab-org/gitlab/tree/master/workhorse).
-Prior to [epic #4826](https://gitlab.com/groups/gitlab-org/-/epics/4826), it was
-[`gitlab-org/gitlab-workhorse`](https://gitlab.com/gitlab-org/gitlab-workhorse/tree/master),
-but that repository is no longer used for development.
-
-## Documentation
-
-Workhorse documentation is available in the [`doc` folder of this repository](doc/).
-
-- Architectural overview
- - [GitLab features that rely on Workhorse](doc/architecture/gitlab_features.md)
- - [Websocket channel support](doc/architecture/channel.md)
-- Operating Workhorse
- - [Source installation](doc/operations/install.md)
- - [Workhorse configuration](doc/operations/configuration.md)
-- [Contributing](CONTRIBUTING.md)
- - [Adding new features](doc/development/new_features.md)
- - [Testing your code](doc/development/tests.md)
-
-## License
-
-This code is distributed under the MIT license, see the [LICENSE](LICENSE) file.
+<!-- This redirect file can be deleted after <2022-07-01>. -->
+<!-- Redirects that point to other docs in the same project expire in three months. -->
+<!-- Redirects that point to docs in a different project or site (for example, link is not relative and starts with `https:`) expire in one year. -->
+<!-- Before deletion, see: https://docs.gitlab.com/ee/development/documentation/redirects.html -->
diff --git a/workhorse/doc/architecture/channel.md b/workhorse/doc/architecture/channel.md
index 9d7dd524c2f..04bb0e7652f 100644
--- a/workhorse/doc/architecture/channel.md
+++ b/workhorse/doc/architecture/channel.md
@@ -2,198 +2,15 @@
stage: Create
group: Source Code
info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
+redirect_to: '../../../doc/development/workhorse/channel.md'
+remove_date: '2022-07-01'
---
# Websocket channel support
-In some cases, GitLab can provide in-browser terminal access to an
-environment (which is a running server or container, onto which a
-project has been deployed), or even access to services running in CI
-through a WebSocket. Workhorse manages the WebSocket upgrade and
-long-lived connection to the websocket connection, which frees
-up GitLab to process other requests.
+This document was moved to [another location](../../../doc/development/workhorse/channel.md).
-This document outlines the architecture of these connections.
-
-## Introduction to WebSockets
-
-A websocket is an "upgraded" HTTP/1.1 request. Their purpose is to
-permit bidirectional communication between a client and a server.
-**Websockets are not HTTP**. Clients can send messages (known as
-frames) to the server at any time, and vice-versa. Client messages
-are not necessarily requests, and server messages are not necessarily
-responses. WebSocket URLs have schemes like `ws://` (unencrypted) or
-`wss://` (TLS-secured).
-
-When requesting an upgrade to WebSocket, the browser sends a HTTP/1.1
-request that looks like this:
-
-```plaintext
-GET /path.ws HTTP/1.1
-Connection: upgrade
-Upgrade: websocket
-Sec-WebSocket-Protocol: terminal.gitlab.com
-# More headers, including security measures
-```
-
-At this point, the connection is still HTTP, so this is a request and
-the server can send a normal HTTP response, including `404 Not Found`,
-`500 Internal Server Error`, etc.
-
-If the server decides to permit the upgrade, it will send a HTTP
-`101 Switching Protocols` response. From this point, the connection
-is no longer HTTP. It is a WebSocket and frames, not HTTP requests,
-will flow over it. The connection will persist until the client or
-server closes the connection.
-
-In addition to the subprotocol, individual websocket frames may
-also specify a message type - examples include `BinaryMessage`,
-`TextMessage`, `Ping`, `Pong` or `Close`. Only binary frames can
-contain arbitrary data - other frames are expected to be valid
-UTF-8 strings, in addition to any subprotocol expectations.
-
-## Browser to Workhorse
-
-Using the terminal as an example, GitLab serves a JavaScript terminal
-emulator to the browser on a URL like
-`https://gitlab.com/group/project/-/environments/1/terminal`.
-This opens a websocket connection to, e.g.,
-`wss://gitlab.com/group/project/-/environments/1/terminal.ws`,
-This endpoint doesn't exist in GitLab - only in Workhorse.
-
-When receiving the connection, Workhorse first checks that the
-client is authorized to access the requested terminal. It does
-this by performing a "preauthentication" request to GitLab.
-
-If the client has the appropriate permissions and the terminal
-exists, GitLab responds with a successful response that includes
-details of the terminal that the client should be connected to.
-Otherwise, it returns an appropriate HTTP error response.
-
-Errors are passed back to the client as HTTP responses, but if
-GitLab returns valid terminal details to Workhorse, it will
-connect to the specified terminal, upgrade the browser to a
-WebSocket, and proxy between the two connections for as long
-as the browser's credentials are valid. Workhorse will also
-send regular `PingMessage` control frames to the browser, to
-keep intervening proxies from terminating the connection
-while the browser is present.
-
-The browser must request an upgrade with a specific subprotocol:
-
-### `terminal.gitlab.com`
-
-This subprotocol considers `TextMessage` frames to be invalid.
-Control frames, such as `PingMessage` or `CloseMessage`, have
-their usual meanings.
-
-`BinaryMessage` frames sent from the browser to the server are
-arbitrary text input.
-
-`BinaryMessage` frames sent from the server to the browser are
-arbitrary text output.
-
-These frames are expected to contain ANSI text control codes
-and may be in any encoding.
-
-### `base64.terminal.gitlab.com`
-
-This subprotocol considers `BinaryMessage` frames to be invalid.
-Control frames, such as `PingMessage` or `CloseMessage`, have
-their usual meanings.
-
-`TextMessage` frames sent from the browser to the server are
-base64-encoded arbitrary text input (so the server must
-base64-decode them before inputting them).
-
-`TextMessage` frames sent from the server to the browser are
-base64-encoded arbitrary text output (so the browser must
-base64-decode them before outputting them).
-
-In their base64-encoded form, these frames are expected to
-contain ANSI terminal control codes, and may be in any encoding.
-
-## Workhorse to GitLab
-
-Using again the terminal as an example, before upgrading the browser,
-Workhorse sends a normal HTTP request to GitLab on a URL like
-`https://gitlab.com/group/project/environments/1/terminal.ws/authorize`.
-This returns a JSON response containing details of where the
-terminal can be found, and how to connect it. In particular,
-the following details are returned in case of success:
-
-- WebSocket URL to **connect** to, e.g.: `wss://example.com/terminals/1.ws?tty=1`
-- WebSocket subprotocols to support, e.g.: `["channel.k8s.io"]`
-- Headers to send, e.g.: `Authorization: Token xxyyz..`
-- Certificate authority to verify `wss` connections with (optional)
-
-Workhorse periodically re-checks this endpoint, and if it gets an
-error response, or the details of the terminal change, it will
-terminate the websocket session.
-
-## Workhorse to the WebSocket server
-
-In GitLab, environments or CI jobs may have a deployment service (e.g.,
-`KubernetesService`) associated with them. This service knows
-where the terminals or the service for an environment may be found, and these
-details are returned to Workhorse by GitLab.
-
-These URLs are *also* WebSocket URLs, and GitLab tells Workhorse
-which subprotocols to speak over the connection, along with any
-authentication details required by the remote end.
-
-Before upgrading the browser's connection to a websocket,
-Workhorse opens a HTTP client connection, according to the
-details given to it by Workhorse, and attempts to upgrade
-that connection to a websocket. If it fails, an error
-response is sent to the browser; otherwise, the browser is
-also upgraded.
-
-Workhorse now has two websocket connections, albeit with
-differing subprotocols. It decodes incoming frames from the
-browser, re-encodes them to the channel's subprotocol, and
-sends them to the channel. Similarly, it decodes incoming
-frames from the channel, re-encodes them to the browser's
-subprotocol, and sends them to the browser.
-
-When either connection closes or enters an error state,
-Workhorse detects the error and closes the other connection,
-terminating the channel session. If the browser is the
-connection that has disconnected, Workhorse will send an ANSI
-`End of Transmission` control code (the `0x04` byte) to the
-channel, encoded according to the appropriate subprotocol.
-Workhorse will automatically reply to any websocket ping frame
-sent by the channel, to avoid being disconnected.
-
-Currently, Workhorse only supports the following subprotocols.
-Supporting new deployment services will require new subprotocols
-to be supported:
-
-### `channel.k8s.io`
-
-Used by Kubernetes, this subprotocol defines a simple multiplexed
-channel.
-
-Control frames have their usual meanings. `TextMessage` frames are
-invalid. `BinaryMessage` frames represent I/O to a specific file
-descriptor.
-
-The first byte of each `BinaryMessage` frame represents the file
-descriptor (fd) number, as a `uint8` (so the value `0x00` corresponds
-to fd 0, `STDIN`, while `0x01` corresponds to fd 1, `STDOUT`).
-
-The remaining bytes represent arbitrary data. For frames received
-from the server, they are bytes that have been received from that
-fd. For frames sent to the server, they are bytes that should be
-written to that fd.
-
-### `base64.channel.k8s.io`
-
-Also used by Kubernetes, this subprotocol defines a similar multiplexed
-channel to `channel.k8s.io`. The main differences are:
-
-- `TextMessage` frames are valid, rather than `BinaryMessage` frames.
-- The first byte of each `TextMessage` frame represents the file
- descriptor as a numeric UTF-8 character, so the character `U+0030`,
- or "0", is fd 0, STDIN).
-- The remaining bytes represent base64-encoded arbitrary data.
+<!-- This redirect file can be deleted after <2022-07-01>. -->
+<!-- Redirects that point to other docs in the same project expire in three months. -->
+<!-- Redirects that point to docs in a different project or site (for example, link is not relative and starts with `https:`) expire in one year. -->
+<!-- Before deletion, see: https://docs.gitlab.com/ee/development/documentation/redirects.html -->
diff --git a/workhorse/doc/architecture/gitlab_features.md b/workhorse/doc/architecture/gitlab_features.md
index cdec2a8f0c1..42dd919690d 100644
--- a/workhorse/doc/architecture/gitlab_features.md
+++ b/workhorse/doc/architecture/gitlab_features.md
@@ -6,68 +6,9 @@ info: To determine the technical writer assigned to the Stage/Group associated w
# Features that rely on Workhorse
-Workhorse itself is not a feature, but there are several features in
-GitLab that would not work efficiently without Workhorse.
+This document was moved to [another location](../../../doc/development/workhorse/gitlab_features.md).
-To put the efficiency benefit in context, consider that in 2020Q3 on
-GitLab.com [we see][https://thanos-query.ops.gitlab.net/graph?g0.range_input=1h&g0.max_source_resolution=0s&g0.expr=sum(ruby_process_resident_memory_bytes%7Bapp%3D%22webservice%22%2Cenv%3D%22gprd%22%2Crelease%3D%22gitlab%22%7D)%20%2F%20sum(puma_max_threads%7Bapp%3D%22webservice%22%2Cenv%3D%22gprd%22%2Crelease%3D%22gitlab%22%7D)&g0.tab=1&g1.range_input=1h&g1.max_source_resolution=0s&g1.expr=sum(go_memstats_sys_bytes%7Bapp%3D%22webservice%22%2Cenv%3D%22gprd%22%2Crelease%3D%22gitlab%22%7D)%2Fsum(go_goroutines%7Bapp%3D%22webservice%22%2Cenv%3D%22gprd%22%2Crelease%3D%22gitlab%22%7D)&g1.tab=1]
-Rails application threads using on average
-about 200MB of RSS vs about 200KB for Workhorse goroutines.
-
-Examples of features that rely on Workhorse:
-
-## 1. `git clone` and `git push` over HTTP
-
-Git clone, pull and push are slow because they transfer large amounts
-of data and because each is CPU intensive on the GitLab side. Without
-Workhorse, HTTP access to Git repositories would compete with regular
-web access to the application, requiring us to run way more Rails
-application servers.
-
-## 2. CI runner long polling
-
-GitLab CI runners fetch new CI jobs by polling the GitLab server.
-Workhorse acts as a kind of "waiting room" where CI runners can sit
-and wait for new CI jobs. Because of Go's efficiency we can fit a lot
-of runners in the waiting room at little cost. Without this waiting
-room mechanism we would have to add a lot more Rails server capacity.
-
-## 3. File uploads and downloads
-
-File uploads and downloads may be slow either because the file is
-large or because the user's connection is slow. Workhorse can handle
-the slow part for Rails. This improves the efficiency of features such
-as CI artifacts, package repositories, LFS objects, etc.
-
-## 4. Websocket proxying
-
-Features such as the web terminal require a long lived connection
-between the user's web browser and a container inside GitLab that is
-not directly accessible from the internet. Dedicating a Rails
-application thread to proxying such a connection would cost much more
-memory than it costs to have Workhorse look after it.
-
-## Quick facts (how does Workhorse work)
-
-- Workhorse can handle some requests without involving Rails at all:
- for example, JavaScript files and CSS files are served straight
- from disk.
-- Workhorse can modify responses sent by Rails: for example if you use
- `send_file` in Rails then GitLab Workhorse will open the file on
- disk and send its contents as the response body to the client (see
- the sketch below).
-- Workhorse can take over requests after asking permission from Rails.
- Example: handling `git clone`.
-- Workhorse can modify requests before passing them to Rails. Example:
- when handling a Git LFS upload Workhorse first asks permission from
- Rails, then it stores the request body in a tempfile, then it sends
- a modified request containing the tempfile path to Rails.
-- Workhorse can manage long-lived WebSocket connections for Rails.
- Example: handling the terminal websocket for environments.
-- Workhorse does not connect to PostgreSQL, only to Rails and (optionally) Redis.
-- We assume that all requests that reach Workhorse pass through an
- upstream proxy such as NGINX or Apache first.
-- Workhorse does not accept HTTPS connections.
-- Workhorse does not clean up idle client connections.
-- We assume that all requests to Rails pass through Workhorse.
-
-For more information see ['A brief history of GitLab Workhorse'](https://about.gitlab.com/2016/04/12/a-brief-history-of-gitlab-workhorse/).
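-
-To make the `send_file` interception concrete, here is a loose Go
-sketch of the idea. The `X-Sendfile` header name and the overall shape
-are assumptions for illustration; Workhorse's real implementation
-differs:
-
-```go
-package sendfiledemo
-
-import (
-    "net/http"
-    "net/http/httputil"
-    "net/url"
-    "os"
-    "strconv"
-)
-
-// NewSendfileProxy proxies to Rails, but if the response carries an
-// X-Sendfile header, it streams that file from disk instead of the
-// upstream body.
-func NewSendfileProxy(rails *url.URL) http.Handler {
-    proxy := httputil.NewSingleHostReverseProxy(rails)
-    proxy.ModifyResponse = func(resp *http.Response) error {
-        path := resp.Header.Get("X-Sendfile")
-        if path == "" {
-            return nil // not a send_file response; pass through
-        }
-        f, err := os.Open(path)
-        if err != nil {
-            return err // surfaces as a 502 from the proxy
-        }
-        resp.Body.Close()
-        resp.Body = f // the proxy copies the file to the client
-        resp.Header.Del("X-Sendfile")
-        if fi, err := f.Stat(); err == nil {
-            resp.ContentLength = fi.Size()
-            resp.Header.Set("Content-Length", strconv.FormatInt(fi.Size(), 10))
-        }
-        return nil
-    }
-    return proxy
-}
-```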
+<!-- This redirect file can be deleted after <2022-07-01>. -->
+<!-- Redirects that point to other docs in the same project expire in three months. -->
+<!-- Redirects that point to docs in a different project or site (for example, link is not relative and starts with `https:`) expire in one year. -->
+<!-- Before deletion, see: https://docs.gitlab.com/ee/development/documentation/redirects.html -->
diff --git a/workhorse/doc/channel.md b/workhorse/doc/channel.md
index 68553170775..86078321dff 100644
--- a/workhorse/doc/channel.md
+++ b/workhorse/doc/channel.md
@@ -1 +1,14 @@
-This file was moved to [`architecture/channel.md`](doc/architecture/channel.md).
+---
+stage: Create
+group: Source Code
+info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
+---
+
+# Websocket channel support
+
+This document was moved to [another location](../../doc/development/workhorse/channel.md).
+
+<!-- This redirect file can be deleted after <2022-07-01>. -->
+<!-- Redirects that point to other docs in the same project expire in three months. -->
+<!-- Redirects that point to docs in a different project or site (for example, link is not relative and starts with `https:`) expire in one year. -->
+<!-- Before deletion, see: https://docs.gitlab.com/ee/development/documentation/redirects.html -->
diff --git a/workhorse/doc/development/new_features.md b/workhorse/doc/development/new_features.md
index 97128d4e6f2..60f0ad75539 100644
--- a/workhorse/doc/development/new_features.md
+++ b/workhorse/doc/development/new_features.md
@@ -6,41 +6,9 @@ info: To determine the technical writer assigned to the Stage/Group associated w
# Adding new features to Workhorse
-GitLab Workhorse is a smart reverse proxy for GitLab. It handles
-"long" HTTP requests such as file downloads, file uploads, Git
-push/pull and Git archive downloads.
+This document was moved to [another location](../../../doc/development/workhorse/new_features.md).
-Workhorse itself is not a feature, but there are [several features in GitLab](https://gitlab.com/gitlab-org/gitlab/-/blob/master/workhorse/doc/architecture/gitlab_features.md) that would not work efficiently without Workhorse.
-
-At first glance, it may look like Workhorse is just a pipeline for processing HTTP streams so that you can reduce the amount of logic in your Ruby on Rails controller, but there are good reasons to avoid treating it like that.
-
-Engineers embarking on the quest of offloading a feature to Workhorse often find that the effort is much greater than originally anticipated. This is partly because of the new programming language (only a few engineers at GitLab are Go developers), and partly because of Workhorse's demanding requirements: Workhorse is stateless, memory and disk usage must be kept under tight control, and requests should not be slowed down in the process.
-
-## Can I add a new feature to Workhorse?
-
-We suggest following this route only if absolutely necessary and no other options are available.
-
-Splitting a feature between the Rails code-base and Workhorse is deliberately choosing to introduce technical debt. It adds complexity to the system and coupling between the two components.
-
-- Building features using Workhorse has a considerable complexity cost, so you should prefer designs based on Rails requests and Sidekiq jobs.
-- Even when using Rails+Sidekiq is "more work" than using Rails+Workhorse, Rails+Sidekiq is easier to maintain in the long term because Workhorse is unique to GitLab while Rails+Sidekiq is an industry standard.
-- For "global" behaviors around web requests consider using a Rack middleware instead of Workhorse.
-- Generally speaking, we should only use Rails+Workhorse if the HTTP client expects behavior that is not reasonable to implement in Rails, like "long" requests.
-
-## What is a "long" request?
-
-There is one order of magnitude between Workhorse and Puma RAM usage. Keeping a connection open for longer than a few milliseconds is a problem because of the amount of RAM it monopolizes once it reaches the Ruby on Rails controller.
-
-So far we identified two classes of "long" requests: data transfers and HTTP long polling.
-
-`git push`, `git pull`, uploading or downloading an artifact, and a CI runner waiting for a new job are all good examples of long requests.
-
-With the rise of cloud-native installations, Workhorse's feature set was extended with object storage direct upload, removing the need for shared Network File System (NFS) drives.
-
-In 2020, @nolith presented a talk at FOSDEM titled [_Speed up the monolith. Building a smart reverse proxy in Go_](https://archive.fosdem.org/2020/schedule/event/speedupmonolith/).
-You can watch the recording for more details on the history of Workhorse and the NFS removal.
-
-The [uploads development documentation](https://docs.gitlab.com/ee/development/uploads.html)
-contains the most common use cases for adding a new type of upload and may answer all of your questions.
-
-If you still think we should add a new feature to Workhorse, please open an issue explaining **what you want to implement** and **why it can't be implemented in our Ruby code-base**. Workhorse maintainers will be happy to help you assess the situation.
+<!-- This redirect file can be deleted after <2022-07-01>. -->
+<!-- Redirects that point to other docs in the same project expire in three months. -->
+<!-- Redirects that point to docs in a different project or site (for example, link is not relative and starts with `https:`) expire in one year. -->
+<!-- Before deletion, see: https://docs.gitlab.com/ee/development/documentation/redirects.html -->
diff --git a/workhorse/doc/development/tests.md b/workhorse/doc/development/tests.md
index 032730fab45..a5eb7571cc0 100644
--- a/workhorse/doc/development/tests.md
+++ b/workhorse/doc/development/tests.md
@@ -6,18 +6,9 @@ info: To determine the technical writer assigned to the Stage/Group associated w
# Testing your code
-Run the tests with:
+This document was moved to [another location](../../../doc/development/workhorse/index.md).
-```plaintext
-make clean test
-```
-
-## Coverage / what to test
-
-Each feature in GitLab Workhorse should have an integration test that
-verifies that the feature 'kicks in' on the right requests and leaves
-other requests unaffected. It is better to also have package-level tests
-for specific behavior, but the high-level integration tests should take
-priority during development.
-
-It is OK if a feature is only covered by integration tests.
+<!-- This redirect file can be deleted after <2022-07-01>. -->
+<!-- Redirects that point to other docs in the same project expire in three months. -->
+<!-- Redirects that point to docs in a different project or site (for example, link is not relative and starts with `https:`) expire in one year. -->
+<!-- Before deletion, see: https://docs.gitlab.com/ee/development/documentation/redirects.html -->
diff --git a/workhorse/doc/operations/configuration.md b/workhorse/doc/operations/configuration.md
index f715cf442eb..62dc8369dfb 100644
--- a/workhorse/doc/operations/configuration.md
+++ b/workhorse/doc/operations/configuration.md
@@ -6,214 +6,9 @@ info: To determine the technical writer assigned to the Stage/Group associated w
# Workhorse configuration
-For historical reasons, Workhorse uses a mixture of command line flags, a configuration file, and environment variables.
+This document was moved to [another location](../../../doc/development/workhorse/configuration.md).
-All new configuration options that get added to Workhorse should go into the configuration file.
-
-## CLI options
-
-```plaintext
- gitlab-workhorse [OPTIONS]
-
-Options:
- -apiCiLongPollingDuration duration
- Long polling duration for job requesting for runners (default 50ns)
- -apiLimit uint
- Number of API requests allowed at single time
- -apiQueueDuration duration
- Maximum queueing duration of requests (default 30s)
- -apiQueueLimit uint
- Number of API requests allowed to be queued
- -authBackend string
- Authentication/authorization backend (default "http://localhost:8080")
- -authSocket string
- Optional: Unix domain socket to dial authBackend at
- -cableBackend string
- Optional: ActionCable backend (default authBackend)
- -cableSocket string
- Optional: Unix domain socket to dial cableBackend at (default authSocket)
- -config string
- TOML file to load config from
- -developmentMode
- Allow the assets to be served from Rails app
- -documentRoot string
- Path to static files content (default "public")
- -listenAddr string
- Listen address for HTTP server (default "localhost:8181")
- -listenNetwork string
- Listen 'network' (tcp, tcp4, tcp6, unix) (default "tcp")
- -listenUmask int
- Umask for Unix socket
- -logFile string
- Log file location
- -logFormat string
- Log format to use defaults to text (text, json, structured, none) (default "text")
- -pprofListenAddr string
- pprof listening address, e.g. 'localhost:6060'
- -prometheusListenAddr string
- Prometheus listening address, e.g. 'localhost:9229'
- -proxyHeadersTimeout duration
- How long to wait for response headers when proxying the request (default 5m0s)
- -secretPath string
- File with secret key to authenticate with authBackend (default "./.gitlab_workhorse_secret")
- -version
- Print version and exit
-```
-
-The 'auth backend' refers to the GitLab Rails application. The name is
-a holdover from when GitLab Workhorse only handled Git push/pull over
-HTTP.
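-
-For example, a hypothetical invocation combining several of these flags
-(all values are placeholders):
-
-```shell
-gitlab-workhorse -listenAddr localhost:8181 \
-  -authBackend http://localhost:8080 \
-  -documentRoot public \
-  -config config.toml
-```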
-
-GitLab Workhorse can listen on either a TCP or a Unix domain socket. It
-can also open a second TCP listening socket for the Go
-[`net/http/pprof` profiler server](http://golang.org/pkg/net/http/pprof/).
-
-GitLab Workhorse can listen for Redis events (currently only `builds/register`
-for runners). This requires you to pass a valid TOML configuration file via
-the `-config` flag. For regular setups, only the settings shown in the
-`[redis]` section below are required (replace the example strings with
-your actual socket path).
-
-## Redis
-
-GitLab Workhorse integrates with Redis to do long polling for CI build
-requests. This is configured via two things:
-
-- Redis settings in the TOML configuration file
-- The `-apiCiLongPollingDuration` command line flag to control polling
- behavior for CI build requests
-
-It is OK to enable Redis in the configuration file but to leave CI polling
-disabled; this just results in an idle Redis pubsub connection. The
-opposite is not possible: CI long polling requires a correct Redis
-configuration.
-
-Below we discuss the options for the `[redis]` section in the configuration
-file.
-
-```toml
-[redis]
-URL = "unix:///var/run/gitlab/redis.sock"
-Password = "my_awesome_password"
-Sentinel = [ "tcp://sentinel1:23456", "tcp://sentinel2:23456" ]
-SentinelMaster = "mymaster"
-```
-
-- `URL` takes a string in the format `unix://path/to/redis.sock` or
- `tcp://host:port`.
-- `Password` is only required if your Redis instance is password-protected.
-- `Sentinel` is used if you are using Sentinel.
-
- NOTE:
- If both `Sentinel` and `URL` are given, only `Sentinel` is used.
-
-Optional fields are as follows:
-
-```toml
-[redis]
-DB = 0
-MaxIdle = 1
-MaxActive = 1
-```
-
-- `DB` is the database to connect to. Defaults to `0`.
-- `MaxIdle` is the maximum number of idle connections in the Redis pool. Defaults to `1`.
-- `MaxActive` is the maximum number of connections the pool can keep. Defaults to `1`.
-
-## Relative URL support
-
-If you are mounting GitLab at a relative URL, e.g.
-`example.com/gitlab`, then you should also use this relative URL in
-the `authBackend` setting:
-
-```shell
-gitlab-workhorse -authBackend http://localhost:8080/gitlab
-```
-
-## Interaction of authBackend and authSocket
-
-The interaction between `authBackend` and `authSocket` can be a bit
-confusing. It comes down to: if `authSocket` is set it overrides the
-_host_ part of `authBackend` but not the relative path.
-
-In table form:
-
-|authBackend|authSocket|Workhorse connects to?|Rails relative URL|
-|---|---|---|---|
-|unset|unset|`localhost:8080`|`/`|
-|`http://localhost:3000`|unset|`localhost:3000`|`/`|
-|`http://localhost:3000/gitlab`|unset|`localhost:3000`|`/gitlab`|
-|unset|`/path/to/socket`|`/path/to/socket`|`/`|
-|`http://localhost:3000`|`/path/to/socket`|`/path/to/socket`|`/`|
-|`http://localhost:3000/gitlab`|`/path/to/socket`|`/path/to/socket`|`/gitlab`|
-
-The same applies to `cableBackend` and `cableSocket`.
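-
-For example, with this hypothetical invocation (last row of the table),
-Workhorse connects to `/path/to/socket` but keeps `/gitlab` as the
-Rails relative URL:
-
-```shell
-gitlab-workhorse -authBackend http://localhost:3000/gitlab \
-  -authSocket /path/to/socket
-```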
-
-## Error tracking
-
-GitLab Workhorse supports remote error tracking with
-[Sentry](https://sentry.io). To enable this feature, set the
-`GITLAB_WORKHORSE_SENTRY_DSN` environment variable.
-You can also set the `GITLAB_WORKHORSE_SENTRY_ENVIRONMENT` environment variable to
-use the Sentry environment functionality to separate staging, production and
-development.
-
-Omnibus (`/etc/gitlab/gitlab.rb`):
-
-```ruby
-gitlab_workhorse['env'] = {
- 'GITLAB_WORKHORSE_SENTRY_DSN' => 'https://foobar',
- 'GITLAB_WORKHORSE_SENTRY_ENVIRONMENT' => 'production'
-}
-```
-
-Source installations (`/etc/default/gitlab`):
-
-```shell
-export GITLAB_WORKHORSE_SENTRY_DSN='https://foobar'
-export GITLAB_WORKHORSE_SENTRY_ENVIRONMENT='production'
-```
-
-## Distributed Tracing
-
-Workhorse supports distributed tracing through [LabKit](https://gitlab.com/gitlab-org/labkit/) using [OpenTracing APIs](https://opentracing.io).
-
-By default, no tracing implementation is linked into the binary, but different OpenTracing providers can be linked in using [build tags](https://golang.org/pkg/go/build/#hdr-Build_Constraints) or build constraints. This can be done by setting the `BUILD_TAGS` make variable.
-
-For more details of the supported providers, see LabKit, but as an example, for Jaeger tracing support, include the tags: `BUILD_TAGS="tracer_static tracer_static_jaeger"`.
-
-```shell
-make BUILD_TAGS="tracer_static tracer_static_jaeger"
-```
-
-Once Workhorse is compiled with an OpenTracing provider, tracing is configured via the `GITLAB_TRACING` environment variable.
-
-For example:
-
-```shell
-GITLAB_TRACING=opentracing://jaeger ./gitlab-workhorse
-```
-
-## Continuous Profiling
-
-Workhorse supports continuous profiling through [LabKit](https://gitlab.com/gitlab-org/labkit/) using [Stackdriver Profiler](https://cloud.google.com/profiler).
-
-By default, the Stackdriver Profiler implementation is linked in the binary using [build tags](https://golang.org/pkg/go/build/#hdr-Build_Constraints), though it's not
-required and can be skipped.
-
-For example:
-
-```shell
-make BUILD_TAGS=""
-```
-
-Once Workhorse is compiled with continuous profiling support, the profiler configuration can be set via the `GITLAB_CONTINUOUS_PROFILING`
-environment variable.
-
-For example:
-
-```shell
-GITLAB_CONTINUOUS_PROFILING="stackdriver?service=workhorse&service_version=1.0.1&project_id=test-123 ./gitlab-workhorse"
-```
-
-For more information, see the [LabKit monitoring documentation](https://gitlab.com/gitlab-org/labkit/-/blob/master/monitoring/doc.go).
+<!-- This redirect file can be deleted after <2022-07-01>. -->
+<!-- Redirects that point to other docs in the same project expire in three months. -->
+<!-- Redirects that point to docs in a different project or site (for example, link is not relative and starts with `https:`) expire in one year. -->
+<!-- Before deletion, see: https://docs.gitlab.com/ee/development/documentation/redirects.html -->
diff --git a/workhorse/doc/operations/install.md b/workhorse/doc/operations/install.md
index 088bd26c877..9f0a8212783 100644
--- a/workhorse/doc/operations/install.md
+++ b/workhorse/doc/operations/install.md
@@ -1,44 +1,14 @@
-# Installation
-
-To install GitLab Workhorse you need [Go 1.15 or
-newer](https://golang.org/dl) and [GNU
-Make](https://www.gnu.org/software/make/).
-
-To install into `/usr/local/bin` run `make install`.
-
-```shell
-make install
-```
-
-To install into `/foo/bin`, set the `PREFIX` variable.
-
-```shell
-make install PREFIX=/foo
-```
+---
+stage: Create
+group: Source Code
+info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
+---
-On some operating systems, such as FreeBSD, you may have to use
-`gmake` instead of `make`.
-
-*NOTE*: Some features depend on build tags; make sure to check
-[Workhorse configuration](configuration.md) to enable them.
-
-## Run time dependencies
-
-### Exiftool
-
-Workhorse uses [Exiftool](https://www.sno.phy.queensu.ca/~phil/exiftool/) for
-removing EXIF data (which may contain sensitive information) from uploaded
-images. If you installed GitLab:
-
-- Using the Omnibus package, you're all set.
- *NOTE* that if you are using CentOS Minimal, you may need to install
- the `perl` package: `yum install perl`.
-- From source, make sure `exiftool` is installed:
+# Installation
- ```shell
- # Debian/Ubuntu
- sudo apt-get install libimage-exiftool-perl
+This document was moved to [another location](../../../doc/development/workhorse/index.md).
- # RHEL/CentOS
- sudo yum install perl-Image-ExifTool
- ```
+<!-- This redirect file can be deleted after <2022-07-01>. -->
+<!-- Redirects that point to other docs in the same project expire in three months. -->
+<!-- Redirects that point to docs in a different project or site (for example, link is not relative and starts with `https:`) expire in one year. -->
+<!-- Before deletion, see: https://docs.gitlab.com/ee/development/documentation/redirects.html -->