gitlab.com/gitlab-org/gitlab-foss.git
Diffstat (limited to 'doc/user/application_security')
-rw-r--r--  doc/user/application_security/api_fuzzing/index.md | 653
-rw-r--r--  doc/user/application_security/configuration/index.md | 4
-rw-r--r--  doc/user/application_security/container_scanning/index.md | 71
-rw-r--r--  doc/user/application_security/coverage_fuzzing/index.md | 10
-rw-r--r--  doc/user/application_security/cve_id_request.md | 2
-rw-r--r--  doc/user/application_security/dast/checks/16.7.md | 2
-rw-r--r--  doc/user/application_security/dast/checks/209.1.md | 10
-rw-r--r--  doc/user/application_security/dast/dast_troubleshooting.md | 2
-rw-r--r--  doc/user/application_security/dast/index.md | 66
-rw-r--r--  doc/user/application_security/dast_api/index.md | 659
-rw-r--r--  doc/user/application_security/dependency_scanning/analyzers.md | 8
-rw-r--r--  doc/user/application_security/dependency_scanning/index.md | 16
-rw-r--r--  doc/user/application_security/generate_test_vulnerabilities/index.md | 2
-rw-r--r--  doc/user/application_security/get-started-security.md | 27
-rw-r--r--  doc/user/application_security/iac_scanning/index.md | 2
-rw-r--r--  doc/user/application_security/index.md | 27
-rw-r--r--  doc/user/application_security/offline_deployments/index.md | 6
-rw-r--r--  doc/user/application_security/policies/index.md | 12
-rw-r--r--  doc/user/application_security/policies/scan-execution-policies.md | 33
-rw-r--r--  doc/user/application_security/policies/scan-result-policies.md | 4
-rw-r--r--  doc/user/application_security/sast/analyzers.md | 38
-rw-r--r--  doc/user/application_security/sast/index.md | 64
-rw-r--r--  doc/user/application_security/secret_detection/index.md | 543
-rw-r--r--  doc/user/application_security/security_dashboard/index.md | 28
-rw-r--r--  doc/user/application_security/vulnerabilities/index.md | 20
-rw-r--r--  doc/user/application_security/vulnerabilities/severities.md | 2
-rw-r--r--  doc/user/application_security/vulnerability_report/index.md | 13
-rw-r--r--  doc/user/application_security/vulnerability_report/pipeline.md | 24
28 files changed, 1743 insertions, 605 deletions
diff --git a/doc/user/application_security/api_fuzzing/index.md b/doc/user/application_security/api_fuzzing/index.md
index 80e4700c34c..8e371ed4dc6 100644
--- a/doc/user/application_security/api_fuzzing/index.md
+++ b/doc/user/application_security/api_fuzzing/index.md
@@ -39,6 +39,7 @@ or other scanners) during a scan could cause inaccurate results.
You can run a Web API fuzzing scan using the following methods:
- [OpenAPI Specification](#openapi-specification) - versions 2 and 3.
+- [GraphQL Schema](#graphql-schema)
- [HTTP Archive](#http-archive-har) (HAR)
- [Postman Collection](#postman-collection) - version 2.0 or 2.1
@@ -76,6 +77,7 @@ To enable Web API fuzzing:
- For manual configuration instructions, see the respective section, depending on the API type:
- [OpenAPI Specification](#openapi-specification)
+ - [GraphQL Schema](#graphql-schema)
- [HTTP Archive (HAR)](#http-archive-har)
- [Postman Collection](#postman-collection)
- Otherwise, see [Web API fuzzing configuration form](#web-api-fuzzing-configuration-form).
@@ -95,7 +97,7 @@ a YAML snippet that you can paste in your GitLab CI/CD configuration.
To generate an API Fuzzing configuration snippet:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Configuration**.
1. In the **API Fuzzing** row, select **Enable API Fuzzing**.
1. Complete the fields. For details see [Available CI/CD variables](#available-cicd-variables).
@@ -262,7 +264,7 @@ Example `.gitlab-ci.yml` file using a HAR file:
FUZZAPI_TARGET_URL: http://test-deployment/
```
-This is a minimal configuration for API fuzzing. From here you can:
+This example is a minimal configuration for API fuzzing. From here you can:
- [Run your first scan](#running-your-first-scan).
- [Add authentication](#authentication).
@@ -270,6 +272,118 @@ This is a minimal configuration for API fuzzing. From here you can:
For details of API fuzzing configuration options, see [Available CI/CD variables](#available-cicd-variables).
+### GraphQL Schema
+
+> Support for GraphQL Schema was [introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/352780) in GitLab 15.4.
+
+GraphQL is a query language for your API and an alternative to REST APIs.
+API Fuzzing supports testing GraphQL endpoints in multiple ways:
+
+- Test using the GraphQL Schema. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/352780) in GitLab 15.4.
+- Test using a recording (HAR) of GraphQL queries.
+- Test using a Postman Collection containing GraphQL queries.
+
+This section documents how to test using a GraphQL schema. The GraphQL schema support in
+API Fuzzing can query the schema from endpoints that support introspection.
+Introspection is enabled by default to allow tools like GraphiQL to work.
+
+#### API Fuzzing scanning with a GraphQL endpoint URL
+
+The GraphQL support in API Fuzzing can query a GraphQL endpoint for the schema.
+
+NOTE:
+The GraphQL endpoint must support introspection queries for this method to work correctly.
+
+To configure API Fuzzing to use a GraphQL endpoint URL that provides information about the target API to test:
+
+1. [Include](../../../ci/yaml/index.md#includetemplate)
+ the [`API-Fuzzing.gitlab-ci.yml` template](https://gitlab.com/gitlab-org/gitlab/-/blob/master/lib/gitlab/ci/templates/Security/API-Fuzzing.gitlab-ci.yml) in your `.gitlab-ci.yml` file.
+
+1. Provide the GraphQL endpoint path, for example `/api/graphql`. Specify the path by adding the `FUZZAPI_GRAPHQL` variable.
+
+1. The target API instance's base URL is also required. Provide it by using the `FUZZAPI_TARGET_URL`
+ variable or an `environment_url.txt` file.
+
+ Adding the URL in an `environment_url.txt` file at your project's root is great for testing in
+ dynamic environments. See the [dynamic environment solutions](#dynamic-environment-solutions) section of our documentation for more information.
+
+Complete example configuration of using a GraphQL endpoint URL:
+
+```yaml
+stages:
+ - fuzz
+
+include:
+ - template: API-Fuzzing.gitlab-ci.yml
+
+apifuzzer_fuzz:
+ variables:
+ FUZZAPI_GRAPHQL: /api/graphql
+ FUZZAPI_TARGET_URL: http://test-deployment/
+```
+
+This example is a minimal configuration for API Fuzzing. From here you can:
+
+- [Run your first scan](#running-your-first-scan).
+- [Add authentication](#authentication).
+- Learn how to [handle false positives](#handling-false-positives).
+
+#### API Fuzzing with a GraphQL Schema file
+
+To configure API Fuzzing to use a GraphQL schema file that provides information about the target API to test:
+
+1. [Include](../../../ci/yaml/index.md#includetemplate)
+ the [`API-Fuzzing.gitlab-ci.yml` template](https://gitlab.com/gitlab-org/gitlab/-/blob/master/lib/gitlab/ci/templates/Security/API-Fuzzing.gitlab-ci.yml) in your `.gitlab-ci.yml` file.
+
+1. Provide the GraphQL endpoint path, for example `/api/graphql`. Specify the path by adding the `FUZZAPI_GRAPHQL` variable.
+
+1. Provide the location of the GraphQL schema file. You can provide the location as a file path
+ or URL. Specify the location by adding the `FUZZAPI_GRAPHQL_SCHEMA` variable.
+
+1. The target API instance's base URL is also required. Provide it by using the `FUZZAPI_TARGET_URL`
+ variable or an `environment_url.txt` file.
+
+ Adding the URL in an `environment_url.txt` file at your project's root is great for testing in
+ dynamic environments. See the [dynamic environment solutions](#dynamic-environment-solutions) section of our documentation for more information.
+
+Complete example configuration of using a GraphQL schema file:
+
+```yaml
+stages:
+ - fuzz
+
+include:
+ - template: API-Fuzzing.gitlab-ci.yml
+
+apifuzzer_fuzz:
+ variables:
+ FUZZAPI_GRAPHQL: /api/graphql
+ FUZZAPI_GRAPHQL_SCHEMA: test-api-graphql.schema
+ FUZZAPI_TARGET_URL: http://test-deployment/
+```
+
+Complete example configuration of using a GraphQL schema file URL:
+
+```yaml
+stages:
+ - fuzz
+
+include:
+ - template: API-Fuzzing.gitlab-ci.yml
+
+apifuzzer_fuzz:
+ variables:
+ FUZZAPI_GRAPHQL: /api/graphql
+ FUZZAPI_GRAPHQL_SCHEMA: http://file-store/files/test-api-graphql.schema
+ FUZZAPI_TARGET_URL: http://test-deployment/
+```
+
+This example is a minimal configuration for API Fuzzing. From here you can:
+
+- [Run your first scan](#running-your-first-scan).
+- [Add authentication](#authentication).
+- Learn how to [handle false positives](#handling-false-positives).
+
### Postman Collection
The [Postman API Client](https://www.postman.com/product/api-client/) is a popular tool that
@@ -344,34 +458,264 @@ For details of API fuzzing configuration options, see [Available CI/CD variables
#### Postman variables
+> - Support for Postman Environment file format was [introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/356312) in GitLab 15.1.
+> - Support for multiple variable files was [introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/356312) in GitLab 15.1.
+> - Support for Postman variable scopes: Global and Environment was [introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/356312) in GitLab 15.1.
+
+##### Variables in Postman Client
+
Postman allows the developer to define placeholders that can be used in different parts of the
-requests. These placeholders are called variables, as explained in the Postman documentation,
-[Using variables](https://learning.postman.com/docs/sending-requests/variables/).
+requests. These placeholders are called variables, as explained in [using variables](https://learning.postman.com/docs/sending-requests/variables/).
You can use variables to store and reuse values in your requests and scripts. For example, you can
edit the collection to add variables to the document:
![Edit collection variable tab View](img/api_fuzzing_postman_collection_edit_variable.png)
+Alternatively, you can add variables in an environment:
+
+![Edit environment variables View](img/api_fuzzing_postman_environment_edit_variable.png)
+
You can then use the variables in sections such as URL, headers, and others:
![Edit request using variables View](img/api_fuzzing_postman_request_edit.png)
-Variables can be defined at different [scopes](https://learning.postman.com/docs/sending-requests/variables/#variable-scopes)
-(for example, Global, Collection, Environment, Local, and Data). In this example, they're defined at
-the Environment scope:
+Postman has grown from a basic client tool with a polished user experience into a more complex ecosystem that allows testing APIs with scripts, creating complex collections that trigger secondary requests, and setting variables along the way. Not every feature in the Postman ecosystem is supported. For example, scripts are not supported. The main focus of the Postman support is to ingest Postman Collection definitions that are used by the Postman Client, together with their related variables defined in the workspace, environments, and the collections themselves.
-![Edit environment variables View](img/api_fuzzing_postman_environment_edit_variable.png)
+Postman allows creating variables in different scopes. Each scope has a different level of visibility in the Postman tools. For example, you can create a variable in a _global environment_ scope that is seen by every operation definition and workspace. You can also create a variable in a specific _environment_ scope that is only visible and used when that specific environment is selected for use. Some scopes are not always available. For example, requests created in the Postman Client do not have a _local_ scope, but test scripts do.
-When you export a Postman collection, only Postman collection variables are exported into the
-Postman file. For example, Postman does not export environment-scoped variables into the Postman
-file.
+Variable scopes in Postman can be a daunting topic, and not everyone is familiar with them. We strongly recommend that you read [Variable Scopes](https://learning.postman.com/docs/sending-requests/variables/#variable-scopes) in the Postman documentation before moving forward.
-By default, the API fuzzer uses the Postman file to resolve Postman variable values. If a JSON file
-is set in a GitLab CI/CD variable `FUZZAPI_POSTMAN_COLLECTION_VARIABLES`, then the JSON
-file takes precedence to get Postman variable values.
+As mentioned above, there are different variable scopes, and each of them has a purpose and can be used to provide more flexibility to your Postman document. There is an important note on how values for variables are computed, as per Postman documentation:
-WARNING:
-Although Postman can export environment variables into a JSON file, the format is not compatible with the JSON expected by `FUZZAPI_POSTMAN_COLLECTION_VARIABLES`.
+> If a variable with the same name is declared in two different scopes, the value stored in the variable with narrowest scope is used. For example, if there is a global variable named `username` and a local variable named `username`, the local value is used when the request runs.
+
+The following is a summary of the variable scopes supported by the Postman Client and API Fuzzing:
+
+- **Global Environment (Global) scope** is a special pre-defined environment that is available throughout a workspace. We can also refer to the _global environment_ scope as the _global_ scope. The Postman Client allows exporting the global environment into a JSON file, which can be used with API Fuzzing.
+- **Environment scope** is a named group of variables created by a user in the Postman Client.
+The Postman Client supports a single active environment along with the global environment. The variables defined in an active user-created environment take precedence over variables defined in the global environment. The Postman Client allows exporting your environment into a JSON file, which can be used with API Fuzzing.
+- **Collection scope** is a group of variables declared in a given collection. The collection variables are available to the collection where they have been declared and the nested requests or collections. Variables defined in the collection scope take precedence over the _global environment_ scope and also the _environment_ scope.
+The Postman Client can export one or more collections into a JSON file. This JSON file contains the selected collections, requests, and collection variables.
+- **API Fuzzing Scope** is a new scope added by API Fuzzing to allow users to provide extra variables, or override variables defined in other supported scopes. This scope is not supported by Postman. The _API Fuzzing Scope_ variables are provided using a [custom JSON file format](#api-fuzzing-scope-custom-json-file-format). Use this scope to:
+  - Override values defined in the environment or collection.
+  - Define variables used in scripts.
+  - Define a single row of data from the unsupported _data_ scope.
+- **Data scope** is a group of variables whose names and values come from JSON or CSV files. A Postman collection runner like [Newman](https://learning.postman.com/docs/running-collections/using-newman-cli/command-line-integration-with-newman/) or [Postman Collection Runner](https://learning.postman.com/docs/running-collections/intro-to-collection-runs/) executes the requests in a collection as many times as there are entries in the JSON or CSV file. A good use case for these variables is to automate tests using scripts in Postman.
+API Fuzzing does **not** support reading data from a CSV or JSON file.
+- **Local scope** variables are defined in Postman scripts. API Fuzzing does **not** support Postman scripts and, by extension, variables defined in scripts. You can still provide values for the script-defined variables by defining them in one of the supported scopes, or in our custom JSON format.
+
+Not all scopes are supported by API Fuzzing, and variables defined in scripts are not supported. The following table is sorted from broadest to narrowest scope.
+
+| Scope | Postman | API Fuzzing | Comment |
+| ------------------ |:---------:|:------------:| :--------|
+| Global Environment | Yes | Yes | Special pre-defined environment |
+| Environment | Yes | Yes | Named environments |
+| Collection | Yes | Yes | Defined in your Postman collection |
+| API Fuzzing Scope | No | Yes | Custom scope added by API Fuzzing |
+| Data | Yes | No | External files in CSV or JSON format |
+| Local | Yes | No | Variables defined in scripts |
+
+For more details on how to define variables and export variables in different scopes, see:
+
+- [Defining collection variables](https://learning.postman.com/docs/sending-requests/variables/#defining-collection-variables)
+- [Defining environment variables](https://learning.postman.com/docs/sending-requests/variables/#defining-environment-variables)
+- [Defining global variables](https://learning.postman.com/docs/sending-requests/variables/#defining-global-variables)
+
+##### Exporting from Postman Client
+
+The Postman Client lets you export different file formats. For instance, you can export a Postman collection or a Postman environment.
+The exported environment can be the global environment (which is always available) or any custom environment you have previously created. When you export a Postman Collection, it may contain only declarations for _collection_ and _local_ scoped variables; _environment_ scoped variables are not included.
+
+To get the declarations for _environment_ scoped variables, you have to export one environment at a time. Each exported file only includes variables from the selected environment.
+
+For more details on exporting variables in different supported scopes, see:
+
+- [Exporting collections](https://learning.postman.com/docs/getting-started/importing-and-exporting-data/#exporting-collections)
+- [Exporting environments](https://learning.postman.com/docs/getting-started/importing-and-exporting-data/#exporting-environments)
+- [Downloading global environments](https://learning.postman.com/docs/sending-requests/variables/#downloading-global-environments)
+
+##### API Fuzzing Scope, custom JSON file format
+
+Our custom JSON file format is a JSON object where each object property represents a variable name and the property value represents the variable value. This file can be created using your favorite text editor, or it can be produced by an earlier job in your pipeline.
+
+This example defines two variables `base_url` and `token` in the API Fuzzing scope:
+
+```json
+{
+ "base_url": "http://127.0.0.1/",
+ "token": "Token 84816165151"
+}
+```
+
+##### Using scopes with API Fuzzing
+
+The _global_, _environment_, _collection_, and _GitLab API Fuzzing_ scopes are supported in [GitLab 15.1 and later](https://gitlab.com/gitlab-org/gitlab/-/issues/356312). GitLab 15.0 and earlier supports only the _collection_ and _GitLab API Fuzzing_ scopes.
+
+The following table provides a quick reference for mapping scope files/URLs to API Fuzzing configuration variables:
+
+| Scope | How to Provide |
+| ------------------ | --------------- |
+| Global Environment | `FUZZAPI_POSTMAN_COLLECTION_VARIABLES` |
+| Environment | `FUZZAPI_POSTMAN_COLLECTION_VARIABLES` |
+| Collection | `FUZZAPI_POSTMAN_COLLECTION` |
+| API Fuzzing Scope | `FUZZAPI_POSTMAN_COLLECTION_VARIABLES` |
+| Data | Not supported |
+| Local | Not supported |
+
+The Postman Collection document automatically includes any _collection_ scoped variables. The Postman Collection is provided with the configuration variable `FUZZAPI_POSTMAN_COLLECTION`. This variable can be set to a single [exported Postman collection](https://learning.postman.com/docs/getting-started/importing-and-exporting-data/#exporting-collections).
+
+Variables from other scopes are provided through the `FUZZAPI_POSTMAN_COLLECTION_VARIABLES` configuration variable. The configuration variable supports a comma-delimited (`,`) file list in [GitLab 15.1 and later](https://gitlab.com/gitlab-org/gitlab/-/issues/356312). GitLab 15.0 and earlier supports only a single file. The order of the files provided is not important, because each file provides the needed scope information.
+
+The configuration variable `FUZZAPI_POSTMAN_COLLECTION_VARIABLES` can be set to:
+
+- [Exported Global environment](https://learning.postman.com/docs/sending-requests/variables/#downloading-global-environments)
+- [Exported environments](https://learning.postman.com/docs/getting-started/importing-and-exporting-data/#exporting-environments)
+- [API Fuzzing Custom JSON format](#api-fuzzing-scope-custom-json-file-format)
+
+##### Undefined Postman variables
+
+There is a chance that the API Fuzzing engine does not find all the variable references that your Postman collection file uses. Possible causes include:
+
+- You are using _data_ or _local_ scoped variables. As stated previously, these scopes are not supported by API Fuzzing, so unless the values for these variables have been provided through [the API Fuzzing scope](#api-fuzzing-scope-custom-json-file-format), the values of the _data_ and _local_ scoped variables are undefined.
+- A variable name was typed incorrectly, and the name does not match the defined variable.
+- The Postman Client supports a new dynamic variable that is not supported by API Fuzzing.
+
+When possible, API Fuzzing follows the same behavior as the Postman Client does when dealing with undefined variables. The text of the variable reference remains the same, and there is no text substitution. The same behavior also applies to any unsupported dynamic variables.
+
+For example, if a request definition in the Postman Collection references the variable `{{full_url}}` and the variable is not found, it is left unchanged with the value `{{full_url}}`.
+
+##### Dynamic Postman variables
+
+In addition to variables that a user can define at various scope levels, Postman has a set of pre-defined variables called _dynamic_ variables. The [_dynamic_ variables](https://learning.postman.com/docs/writing-scripts/script-references/variables-list/) are already defined and their name is prefixed with a dollar sign (`$`), for instance, `$guid`. _Dynamic_ variables can be used like any other variable, and in the Postman Client, they produce random values during the request/collection run.
+
+An important difference between API Fuzzing and Postman is that API Fuzzing returns the same value for each usage of the same dynamic variables. This differs from the Postman Client behavior which returns a random value on each use of the same dynamic variable. In other words, API Fuzzing uses static values for dynamic variables while Postman uses random values.
+
+The supported dynamic variables during the scanning process are:
+
+| Variable | Value |
+| ----------- | ----------- |
+| `$guid` | `611c2e81-2ccb-42d8-9ddc-2d0bfa65c1b4` |
+| `$isoTimestamp` | `2020-06-09T21:10:36.177Z` |
+| `$randomAbbreviation` | `PCI` |
+| `$randomAbstractImage` | `http://no-a-valid-host/640/480/abstract` |
+| `$randomAdjective` | `auxiliary` |
+| `$randomAlphaNumeric` | `a` |
+| `$randomAnimalsImage` | `http://no-a-valid-host/640/480/animals` |
+| `$randomAvatarImage` | `https://no-a-valid-host/path/to/some/image.jpg` |
+| `$randomBankAccount` | `09454073` |
+| `$randomBankAccountBic` | `EZIAUGJ1` |
+| `$randomBankAccountIban` | `MU20ZPUN3039684000618086155TKZ` |
+| `$randomBankAccountName` | `Home Loan Account` |
+| `$randomBitcoin` | `3VB8JGT7Y4Z63U68KGGKDXMLLH5` |
+| `$randomBoolean` | `true` |
+| `$randomBs` | `killer leverage schemas` |
+| `$randomBsAdjective` | `viral` |
+| `$randomBsBuzz` | `repurpose` |
+| `$randomBsNoun` | `markets` |
+| `$randomBusinessImage` | `http://no-a-valid-host/640/480/business` |
+| `$randomCatchPhrase` | `Future-proofed heuristic open architecture` |
+| `$randomCatchPhraseAdjective` | `Business-focused` |
+| `$randomCatchPhraseDescriptor` | `bandwidth-monitored` |
+| `$randomCatchPhraseNoun` | `superstructure` |
+| `$randomCatsImage` | `http://no-a-valid-host/640/480/cats` |
+| `$randomCity` | `Spinkahaven` |
+| `$randomCityImage` | `http://no-a-valid-host/640/480/city` |
+| `$randomColor` | `fuchsia` |
+| `$randomCommonFileExt` | `wav` |
+| `$randomCommonFileName` | `well_modulated.mpg4` |
+| `$randomCommonFileType` | `audio` |
+| `$randomCompanyName` | `Grady LLC` |
+| `$randomCompanySuffix` | `Inc` |
+| `$randomCountry` | `Kazakhstan` |
+| `$randomCountryCode` | `MD` |
+| `$randomCreditCardMask` | `3622` |
+| `$randomCurrencyCode` | `ZMK` |
+| `$randomCurrencyName` | `Pound Sterling` |
+| `$randomCurrencySymbol` | `£` |
+| `$randomDatabaseCollation` | `utf8_general_ci` |
+| `$randomDatabaseColumn` | `updatedAt` |
+| `$randomDatabaseEngine` | `Memory` |
+| `$randomDatabaseType` | `text` |
+| `$randomDateFuture` | `Tue Mar 17 2020 13:11:50 GMT+0530 (India Standard Time)` |
+| `$randomDatePast` | `Sat Mar 02 2019 09:09:26 GMT+0530 (India Standard Time)` |
+| `$randomDateRecent` | `Tue Jul 09 2019 23:12:37 GMT+0530 (India Standard Time)` |
+| `$randomDepartment` | `Electronics` |
+| `$randomDirectoryPath` | `/usr/local/bin` |
+| `$randomDomainName` | `trevor.info` |
+| `$randomDomainSuffix` | `org` |
+| `$randomDomainWord` | `jaden` |
+| `$randomEmail` | `Iva.Kovacek61@no-a-valid-host.com` |
+| `$randomExampleEmail` | `non-a-valid-user@example.net` |
+| `$randomFashionImage` | `http://no-a-valid-host/640/480/fashion` |
+| `$randomFileExt` | `war` |
+| `$randomFileName` | `neural_sri_lanka_rupee_gloves.gdoc` |
+| `$randomFilePath` | `/home/programming_chicken.cpio` |
+| `$randomFileType` | `application` |
+| `$randomFirstName` | `Chandler` |
+| `$randomFoodImage` | `http://no-a-valid-host/640/480/food` |
+| `$randomFullName` | `Connie Runolfsdottir` |
+| `$randomHexColor` | `#47594a` |
+| `$randomImageDataUri` | `data:image/svg+xml;charset=UTF-8,%3Csvg%20xmlns%3D%22http%3A%2F%2Fwww.w3.org%2F2000%2Fsvg%22%20version%3D%221.1%22%20baseProfile%3D%22full%22%20width%3D%22undefined%22%20height%3D%22undefined%22%3E%20%3Crect%20width%3D%22100%25%22%20height%3D%22100%25%22%20fill%3D%22grey%22%2F%3E%20%20%3Ctext%20x%3D%220%22%20y%3D%2220%22%20font-size%3D%2220%22%20text-anchor%3D%22start%22%20fill%3D%22white%22%3Eundefinedxundefined%3C%2Ftext%3E%20%3C%2Fsvg%3E` |
+| `$randomImageUrl` | `http://no-a-valid-host/640/480` |
+| `$randomIngverb` | `navigating` |
+| `$randomInt` | `494` |
+| `$randomIP` | `241.102.234.100` |
+| `$randomIPV6` | `dbe2:7ae6:119b:c161:1560:6dda:3a9b:90a9` |
+| `$randomJobArea` | `Mobility` |
+| `$randomJobDescriptor` | `Senior` |
+| `$randomJobTitle` | `International Creative Liaison` |
+| `$randomJobType` | `Supervisor` |
+| `$randomLastName` | `Schneider` |
+| `$randomLatitude` | `55.2099` |
+| `$randomLocale` | `ny` |
+| `$randomLongitude` | `40.6609` |
+| `$randomLoremLines` | `Ducimus in ut mollitia.\nA itaque non.\nHarum temporibus nihil voluptas.\nIste in sed et nesciunt in quaerat sed.` |
+| `$randomLoremParagraph` | `Ab aliquid odio iste quo voluptas voluptatem dignissimos velit. Recusandae facilis qui commodi ea magnam enim nostrum quia quis. Nihil est suscipit assumenda ut voluptatem sed. Esse ab voluptas odit qui molestiae. Rem est nesciunt est quis ipsam expedita consequuntur.` |
+| `$randomLoremParagraphs` | `Voluptatem rem magnam aliquam ab id aut quaerat. Placeat provident possimus voluptatibus dicta velit non aut quasi. Mollitia et aliquam expedita sunt dolores nam consequuntur. Nam dolorum delectus ipsam repudiandae et ipsam ut voluptatum totam. Nobis labore labore recusandae ipsam quo.` |
+| `$randomLoremSentence` | `Molestias consequuntur nisi non quod.` |
+| `$randomLoremSentences` | `Et sint voluptas similique iure amet perspiciatis vero sequi atque. Ut porro sit et hic. Neque aspernatur vitae fugiat ut dolore et veritatis. Ab iusto ex delectus animi. Voluptates nisi iusto. Impedit quod quae voluptate qui.` |
+| `$randomLoremSlug` | `eos-aperiam-accusamus, beatae-id-molestiae, qui-est-repellat` |
+| `$randomLoremText` | `Quisquam asperiores exercitationem ut ipsum. Aut eius nesciunt. Et reiciendis aut alias eaque. Nihil amet laboriosam pariatur eligendi. Sunt ullam ut sint natus ducimus. Voluptas harum aspernatur soluta rem nam.` |
+| `$randomLoremWord` | `est` |
+| `$randomLoremWords` | `vel repellat nobis` |
+| `$randomMACAddress` | `33:d4:68:5f:b4:c7` |
+| `$randomMimeType` | `audio/vnd.vmx.cvsd` |
+| `$randomMonth` | `February` |
+| `$randomNamePrefix` | `Dr.` |
+| `$randomNameSuffix` | `MD` |
+| `$randomNatureImage` | `http://no-a-valid-host/640/480/nature` |
+| `$randomNightlifeImage` | `http://no-a-valid-host/640/480/nightlife` |
+| `$randomNoun` | `bus` |
+| `$randomPassword` | `t9iXe7COoDKv8k3` |
+| `$randomPeopleImage` | `http://no-a-valid-host/640/480/people` |
+| `$randomPhoneNumber` | `700-008-5275` |
+| `$randomPhoneNumberExt` | `27-199-983-3864` |
+| `$randomPhrase` | `You can't program the monitor without navigating the mobile XML program!` |
+| `$randomPrice` | `531.55` |
+| `$randomProduct` | `Pizza` |
+| `$randomProductAdjective` | `Unbranded` |
+| `$randomProductMaterial` | `Steel` |
+| `$randomProductName` | `Handmade Concrete Tuna` |
+| `$randomProtocol` | `https` |
+| `$randomSemver` | `7.0.5` |
+| `$randomSportsImage` | `http://no-a-valid-host/640/480/sports` |
+| `$randomStreetAddress` | `5742 Harvey Streets` |
+| `$randomStreetName` | `Kuhic Island` |
+| `$randomTransactionType` | `payment` |
+| `$randomTransportImage` | `http://no-a-valid-host/640/480/transport` |
+| `$randomUrl` | `https://no-a-valid-host.net` |
+| `$randomUserAgent` | `Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.9.8; rv:15.6) Gecko/20100101 Firefox/15.6.6` |
+| `$randomUserName` | `Jarrell.Gutkowski` |
+| `$randomUUID` | `6929bb52-3ab2-448a-9796-d6480ecad36b` |
+| `$randomVerb` | `navigate` |
+| `$randomWeekday` | `Thursday` |
+| `$randomWord` | `withdrawal` |
+| `$randomWords` | `Samoa Synergistic sticky copying Grocery` |
+| `$timestamp` | `1562757107` |
+
+##### Example: Global Scope
+
+In this example, [the _global_ scope is exported](https://learning.postman.com/docs/sending-requests/variables/#downloading-global-environments) from the Postman Client as `global-scope.json` and provided to API Fuzzing through the `FUZZAPI_POSTMAN_COLLECTION_VARIABLES` configuration variable.
Here is an example of using `FUZZAPI_POSTMAN_COLLECTION_VARIABLES`:
@@ -384,21 +728,171 @@ include:
variables:
FUZZAPI_PROFILE: Quick-10
- FUZZAPI_POSTMAN_COLLECTION: postman-collection_serviceA.json
+ FUZZAPI_POSTMAN_COLLECTION: postman-collection.json
+ FUZZAPI_POSTMAN_COLLECTION_VARIABLES: global-scope.json
+ FUZZAPI_TARGET_URL: http://test-deployment/
+```
+
+##### Example: Environment Scope
+
+In this example, [the _environment_ scope is exported](https://learning.postman.com/docs/getting-started/importing-and-exporting-data/#exporting-environments) from the Postman Client as `environment-scope.json` and provided to API Fuzzing through the `FUZZAPI_POSTMAN_COLLECTION_VARIABLES` configuration variable.
+
+Here is an example of using `FUZZAPI_POSTMAN_COLLECTION_VARIABLES`:
+
+```yaml
+stages:
+ - fuzz
+
+include:
+ - template: API-Fuzzing.gitlab-ci.yml
+
+variables:
+ FUZZAPI_PROFILE: Quick
+ FUZZAPI_POSTMAN_COLLECTION: postman-collection.json
+ FUZZAPI_POSTMAN_COLLECTION_VARIABLES: environment-scope.json
+ FUZZAPI_TARGET_URL: http://test-deployment/
+```
+
+##### Example: Collection Scope
+
+The _collection_ scope variables are included in the exported Postman Collection file and provided through the `FUZZAPI_POSTMAN_COLLECTION` configuration variable.
+
+Here is an example of using `FUZZAPI_POSTMAN_COLLECTION`:
+
+```yaml
+stages:
+ - fuzz
+
+include:
+ - template: API-Fuzzing.gitlab-ci.yml
+
+variables:
+ FUZZAPI_PROFILE: Quick
+ FUZZAPI_POSTMAN_COLLECTION: postman-collection.json
FUZZAPI_TARGET_URL: http://test-deployment/
FUZZAPI_POSTMAN_COLLECTION_VARIABLES: variable-collection-dictionary.json
```
-The file `variable-collection-dictionary.json` is a JSON document. This JSON is an object with
-key-value pairs for properties. The keys are the variables' names, and the values are the variables'
+##### Example: API Fuzzing Scope
+
+The API Fuzzing Scope is used for two main purposes: defining _data_ and _local_ scope variables that are not supported by API Fuzzing, and changing the value of an existing variable defined in another scope. The API Fuzzing Scope is provided through the `FUZZAPI_POSTMAN_COLLECTION_VARIABLES` configuration variable.
+
+Here is an example of using `FUZZAPI_POSTMAN_COLLECTION_VARIABLES`:
+
+```yaml
+stages:
+ - fuzz
+
+include:
+ - template: API-Fuzzing.gitlab-ci.yml
+
+variables:
+ FUZZAPI_PROFILE: Quick
+ FUZZAPI_POSTMAN_COLLECTION: postman-collection.json
+ FUZZAPI_POSTMAN_COLLECTION_VARIABLES: api-fuzzing-scope.json
+ FUZZAPI_TARGET_URL: http://test-deployment/
+```
+
+The file `api-fuzzing-scope.json` uses our [custom JSON file format](#api-fuzzing-scope-custom-json-file-format). This JSON is an object with key-value pairs for properties. The keys are the variables' names, and the values are the variables'
values. For example:
- ```json
- {
- "base_url": "http://127.0.0.1/",
- "token": "Token 84816165151"
- }
- ```
+```json
+{
+ "base_url": "http://127.0.0.1/",
+ "token": "Token 84816165151"
+}
+```
+
+##### Example: Multiple Scopes
+
+In this example, a _global_ scope, _environment_ scope, and _collection_ scope are configured. The first step is to export our various scopes.
+
+- [Export the _global_ scope](https://learning.postman.com/docs/sending-requests/variables/#downloading-global-environments) as `global-scope.json`
+- [Export the _environment_ scope](https://learning.postman.com/docs/getting-started/importing-and-exporting-data/#exporting-environments) as `environment-scope.json`
+- Export the Postman Collection which includes the _collection_ scope as `postman-collection.json`
+
+The Postman Collection is provided using the `FUZZAPI_POSTMAN_COLLECTION` variable, while the other scopes are provided using the `FUZZAPI_POSTMAN_COLLECTION_VARIABLES` variable. API Fuzzing can identify which scope the provided files match using data provided in each file.
+
+```yaml
+stages:
+ - fuzz
+
+include:
+ - template: API-Fuzzing.gitlab-ci.yml
+
+variables:
+ FUZZAPI_PROFILE: Quick
+ FUZZAPI_POSTMAN_COLLECTION: postman-collection.json
+ FUZZAPI_POSTMAN_COLLECTION_VARIABLES: global-scope.json,environment-scope.json
+ FUZZAPI_TARGET_URL: http://test-deployment/
+```
+
+##### Example: Changing a Variable's Value
+
+When using exported scopes, it's often the case that the value of a variable must be changed for use with API Fuzzing. For example, a _collection_ might contain a variable named `api_version` with a value of `v2`, while your test needs a value of `v1`. Instead of modifying the exported collection to change the value, the API Fuzzing scope can be used to change its value. This works because the _API Fuzzing_ scope takes precedence over all other scopes.
+
+The _collection_ scope variables are included in the exported Postman Collection file and provided through the `FUZZAPI_POSTMAN_COLLECTION` configuration variable.
+
+The API Fuzzing Scope is provided through the `FUZZAPI_POSTMAN_COLLECTION_VARIABLES` configuration variable, but first, we must create the file.
+The file `api-fuzzing-scope.json` uses our [custom JSON file format](#api-fuzzing-scope-custom-json-file-format). This JSON is an object with key-value pairs for properties. The keys are the variables' names, and the values are the variables'
+values. For example:
+
+```json
+{
+ "api_version": "v1"
+}
+```
+
+Our CI definition:
+
+```yaml
+stages:
+ - fuzz
+
+include:
+ - template: API-Fuzzing.gitlab-ci.yml
+
+variables:
+ FUZZAPI_PROFILE: Quick
+ FUZZAPI_POSTMAN_COLLECTION: postman-collection.json
+ FUZZAPI_POSTMAN_COLLECTION_VARIABLES: api-fuzzing-scope.json
+ FUZZAPI_TARGET_URL: http://test-deployment/
+```
+
+##### Example: Changing a Variable's Value with Multiple Scopes
+
+When using exported scopes, it's often the case that the value of a variable must be changed for use with API Fuzzing. For example, an _environment_ scope might contain a variable named `api_version` with a value of `v2`, while your test needs a value of `v1`. Instead of modifying the exported file to change the value, the API Fuzzing scope can be used. This works because the _API Fuzzing_ scope takes precedence over all other scopes.
+
+In this example, a _global_ scope, _environment_ scope, _collection_ scope, and _API Fuzzing_ scope are configured. The first step is to export and create our various scopes.
+
+- [Export the _global_ scope](https://learning.postman.com/docs/sending-requests/variables/#downloading-global-environments) as `global-scope.json`
+- [Export the _environment_ scope](https://learning.postman.com/docs/getting-started/importing-and-exporting-data/#exporting-environments) as `environment-scope.json`
+- Export the Postman Collection which includes the _collection_ scope as `postman-collection.json`
+
+The API Fuzzing scope is provided by creating a file `api-fuzzing-scope.json` that uses our [custom JSON file format](#api-fuzzing-scope-custom-json-file-format). This JSON is an object with key-value pairs for properties. The keys are the variables' names, and the values are the variables'
+values. For example:
+
+```json
+{
+ "api_version": "v1"
+}
+```
+
+The Postman Collection is provided using the `FUZZAPI_POSTMAN_COLLECTION` variable, while the other scopes are provided using the `FUZZAPI_POSTMAN_COLLECTION_VARIABLES` variable. API Fuzzing can identify which scope the provided files match using data provided in each file.
+
+```yaml
+stages:
+ - fuzz
+
+include:
+ - template: API-Fuzzing.gitlab-ci.yml
+
+variables:
+ FUZZAPI_PROFILE: Quick
+ FUZZAPI_POSTMAN_COLLECTION: postman-collection.json
+ FUZZAPI_POSTMAN_COLLECTION_VARIABLES: global-scope.json,environment-scope.json,api-fuzzing-scope.json
+ FUZZAPI_TARGET_URL: http://test-deployment/
+```
## API fuzzing configuration
@@ -420,21 +914,23 @@ provide a script that performs an authentication flow or calculates the token.
#### HTTP Basic Authentication
[HTTP basic authentication](https://en.wikipedia.org/wiki/Basic_access_authentication)
-is an authentication method built in to the HTTP protocol and used in conjunction with
+is an authentication method built into the HTTP protocol and used in conjunction with
[transport layer security (TLS)](https://en.wikipedia.org/wiki/Transport_Layer_Security).
-To use HTTP basic authentication, two CI/CD variables are added to your `.gitlab-ci.yml` file:
-- `FUZZAPI_HTTP_USERNAME`: The username for authentication.
-- `FUZZAPI_HTTP_PASSWORD`: The password for authentication.
+We recommend that you [create a CI/CD variable](../../../ci/variables/index.md#custom-cicd-variables)
+for the password (for example, `TEST_API_PASSWORD`), and set it to be masked. You can create CI/CD
+variables from the GitLab project's page at **Settings > CI/CD**, in the **Variables** section.
+Because of the [limitations on masked variables](../../../ci/variables/index.md#mask-a-cicd-variable),
+you should Base64-encode the password before adding it as a variable.
-For the password, we recommended that you [create a CI/CD variable](../../../ci/variables/index.md#custom-cicd-variables)
-(for example, `TEST_API_PASSWORD`) set to the password. You can create CI/CD variables from the
-GitLab projects page at **Settings > CI/CD**, in the **Variables** section. Use that variable
-as the value for `FUZZAPI_HTTP_PASSWORD`:
+Finally, add two CI/CD variables to your `.gitlab-ci.yml` file:
+
+- `FUZZAPI_HTTP_USERNAME`: The username for authentication.
+- `FUZZAPI_HTTP_PASSWORD_BASE64`: The Base64-encoded password for authentication.
```yaml
stages:
- - fuzz
+ - fuzz
include:
- template: API-Fuzzing.gitlab-ci.yml
@@ -444,9 +940,13 @@ variables:
FUZZAPI_HAR: test-api-recording.har
FUZZAPI_TARGET_URL: http://test-deployment/
FUZZAPI_HTTP_USERNAME: testuser
- FUZZAPI_HTTP_PASSWORD: $TEST_API_PASSWORD
+ FUZZAPI_HTTP_PASSWORD_BASE64: $TEST_API_PASSWORD
```
+#### Raw password
+
+If you do not want to Base64-encode the password (or if you are using GitLab 15.3 or earlier), you can provide the raw password in `FUZZAPI_HTTP_PASSWORD` instead of using `FUZZAPI_HTTP_PASSWORD_BASE64`.
+
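+For example, a minimal sketch that reuses the HAR-based configuration from the Base64 example above, assuming the CI/CD variable `$TEST_API_PASSWORD` holds the unencoded password:
+
+```yaml
+stages:
+  - fuzz
+
+include:
+  - template: API-Fuzzing.gitlab-ci.yml
+
+variables:
+  FUZZAPI_PROFILE: Quick-10
+  FUZZAPI_HAR: test-api-recording.har
+  FUZZAPI_TARGET_URL: http://test-deployment/
+  FUZZAPI_HTTP_USERNAME: testuser
+  FUZZAPI_HTTP_PASSWORD: $TEST_API_PASSWORD
+```
+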
#### Bearer Tokens
Bearer tokens are used by several different authentication mechanisms, including OAuth2 and JSON Web
@@ -493,7 +993,7 @@ Follow these steps to provide the bearer token with `FUZZAPI_OVERRIDES_ENV`:
##### Token generated at test runtime
If the bearer token must be generated and doesn't expire during testing, you can provide API
-fuzzing a file containing the token. A prior stage and job, or part of the API fuzzing job, can
+fuzzing with a file containing the token. A prior stage and job, or part of the API fuzzing job, can
generate this file.
API fuzzing expects to receive a JSON file with the following structure:
@@ -605,7 +1105,10 @@ profile increases as the number of tests increases.
|[`FUZZAPI_OPENAPI_ALL_MEDIA_TYPES`](#openapi-specification) | Use all supported media types instead of one when generating requests. Causes test duration to be longer. Default is disabled. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/333304) in GitLab 14.10. |
|[`FUZZAPI_OPENAPI_MEDIA_TYPES`](#openapi-specification) | Colon (`:`) separated media types accepted for testing. Default is disabled. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/333304) in GitLab 14.10. |
|[`FUZZAPI_HAR`](#http-archive-har) | HTTP Archive (HAR) file. |
+|[`FUZZAPI_GRAPHQL`](#graphql-schema) | Path to GraphQL endpoint, for example `/api/graphql`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/352780) in GitLab 15.4. |
+|[`FUZZAPI_GRAPHQL_SCHEMA`](#graphql-schema) | A URL or filename for a GraphQL schema in JSON format. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/352780) in GitLab 15.4. |
|[`FUZZAPI_POSTMAN_COLLECTION`](#postman-collection) | Postman Collection file. |
+|[`FUZZAPI_POSTMAN_COLLECTION_VARIABLES`](#postman-variables) | Path to a JSON file to extract Postman variable values. The support for comma-separated (`,`) files was [introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/356312) in GitLab 15.1. |
-|[`FUZZAPI_POSTMAN_COLLECTION_VARIABLES`](#postman-variables) | Path to a JSON file to extract Postman variable values. |
|[`FUZZAPI_OVERRIDES_FILE`](#overrides) | Path to a JSON file containing overrides. |
|[`FUZZAPI_OVERRIDES_ENV`](#overrides) | JSON string containing headers to override. |
@@ -616,6 +1119,7 @@ profile increases as the number of tests increases.
|[`FUZZAPI_OVERRIDES_INTERVAL`](#overrides) | How often to run overrides command in seconds. Defaults to `0` (once). |
|[`FUZZAPI_HTTP_USERNAME`](#http-basic-authentication) | Username for HTTP authentication. |
|[`FUZZAPI_HTTP_PASSWORD`](#http-basic-authentication) | Password for HTTP authentication. |
+|[`FUZZAPI_HTTP_PASSWORD_BASE64`](#http-basic-authentication) | Password for HTTP authentication, Base64-encoded. [Introduced](https://gitlab.com/gitlab-org/security-products/analyzers/api-fuzzing-src/-/merge_requests/702) in GitLab 15.4. |
### Overrides
@@ -1062,21 +1566,21 @@ This example excludes the `/auth` resource. This does not exclude child resource
```yaml
variables:
- FUZZAPI_EXCLUDE_PATHS=/auth
+ FUZZAPI_EXCLUDE_PATHS: /auth
```
To exclude `/auth` and its child resources (`/auth/child`), we use a wildcard.
```yaml
variables:
- FUZZAPI_EXCLUDE_PATHS=/auth*
+ FUZZAPI_EXCLUDE_PATHS: /auth*
```
To exclude multiple paths we can use the `;` character. In this example we exclude `/auth*` and `/v1/*`.
```yaml
variables:
- FUZZAPI_EXCLUDE_PATHS=/auth*;/v1/*
+ FUZZAPI_EXCLUDE_PATHS: /auth*;/v1/*
```
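+
+The snippets above show only the `variables` section. As a minimal sketch (reusing the template, target URL, and OpenAPI file from the earlier examples), a complete job configuration using `FUZZAPI_EXCLUDE_PATHS` could look like this:
+
+```yaml
+stages:
+  - fuzz
+
+include:
+  - template: API-Fuzzing.gitlab-ci.yml
+
+variables:
+  FUZZAPI_TARGET_URL: http://test-deployment/
+  FUZZAPI_OPENAPI: test-api-specification.json
+  FUZZAPI_EXCLUDE_PATHS: /auth*;/v1/*
+```
+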
### Exclude parameters
@@ -1336,7 +1840,15 @@ Each value in `FUZZAPI_EXCLUDE_URLS` is a regular expression. Characters such as
The following example excludes the URL `http://target/api/auth` and its child resources.
```yaml
+stages:
+ - fuzz
+
+include:
+ - template: API-Fuzzing.gitlab-ci.yml
+
variables:
+ FUZZAPI_TARGET_URL: http://target/
+ FUZZAPI_OPENAPI: test-api-specification.json
FUZZAPI_EXCLUDE_URLS: http://target/api/auth
```
@@ -1345,7 +1857,15 @@ variables:
To exclude the URLs `http://target/api/buy` and `http://target/api/sell` while still scanning their child resources (for instance `http://target/api/buy/toy` or `http://target/api/sell/chair`), you could use the value `http://target/api/buy/$,http://target/api/sell/$`. This value consists of two regular expressions separated by a `,` character: `http://target/api/buy/$` and `http://target/api/sell/$`. In each regular expression, the trailing `$` character indicates where the matching URL should end.
```yaml
+stages:
+ - fuzz
+
+include:
+ - template: API-Fuzzing.gitlab-ci.yml
+
variables:
+ FUZZAPI_TARGET_URL: http://target/
+ FUZZAPI_OPENAPI: test-api-specification.json
FUZZAPI_EXCLUDE_URLS: http://target/api/buy/$,http://target/api/sell/$
```
@@ -1354,7 +1874,15 @@ variables:
To exclude the URLs `http://target/api/buy` and `http://target/api/sell`, and their child resources, we provide multiple URLs separated by the `,` character as follows:
```yaml
+stages:
+ - fuzz
+
+include:
+ - template: API-Fuzzing.gitlab-ci.yml
+
variables:
+ FUZZAPI_TARGET_URL: http://target/
+ FUZZAPI_OPENAPI: test-api-specification.json
FUZZAPI_EXCLUDE_URLS: http://target/api/buy,http://target/api/sell
```
@@ -1363,7 +1891,15 @@ variables:
To exclude exactly `https://target/api/v1/user/create` and `https://target/api/v2/user/create`, or any other version (`v3`, `v4`, and more), we could use `https://target/api/v.*/user/create$`. In this regular expression, `.` indicates any character, `*` indicates zero or more times, and `$` indicates that the URL should end there.
```yaml
+stages:
+ - fuzz
+
+include:
+ - template: API-Fuzzing.gitlab-ci.yml
+
variables:
+ FUZZAPI_TARGET_URL: http://target/
+ FUZZAPI_OPENAPI: test-api-specification.json
FUZZAPI_EXCLUDE_URLS: https://target/api/v.*/user/create$
```
@@ -1679,11 +2215,11 @@ For more information, see [Offline environments](../offline_deployments/index.md
### `Error waiting for API Security 'http://127.0.0.1:5000' to become available`
-A bug exists in versions of the API Fuzzing analyzer prior to v1.6.196 that can cause a background process to fail under certain conditions. The solution is to update to a newer version of the DAST API analyzer.
+A bug exists in versions of the API Fuzzing analyzer prior to v1.6.196 that can cause a background process to fail under certain conditions. The solution is to update to a newer version of the API Fuzzing analyzer.
The version information can be found in the job details for the `apifuzzer_fuzz` job.
-If the issue is occurring with versions v1.6.196 or greater, please contact Support and provide the following information:
+If the issue is occurring with versions v1.6.196 or greater, contact Support and provide the following information:
1. Reference this troubleshooting section and ask for the issue to be escalated to the Dynamic Analysis Team.
1. The full console output of the job.
@@ -1750,12 +2286,15 @@ This solution is for pipelines in which the target API URL doesn't change (is st
For environments where the target API remains the same, we recommend you specify the target URL by using the `FUZZAPI_TARGET_URL` environment variable. In your `.gitlab-ci.yml` file, add a variable `FUZZAPI_TARGET_URL`. The variable must be set to the base URL of the API testing target. For example:
```yaml
+stages:
+ - fuzz
+
include:
- - template: API-Fuzzing.gitlab-ci.yml
+ - template: API-Fuzzing.gitlab-ci.yml
- variables:
- FUZZAPI_TARGET_URL: http://test-deployment/
- FUZZAPI_OPENAPI: test-api-specification.json
+variables:
+ FUZZAPI_TARGET_URL: http://test-deployment/
+ FUZZAPI_OPENAPI: test-api-specification.json
```
#### Dynamic environment solutions
@@ -1793,7 +2332,7 @@ TODO
### Use OpenAPI with an invalid schema
-There are cases where the document is autogenerated with an invalid schema or cannot be edited manually in a timely manner. In those scenarios, the API Security is able to perform a relaxed validation by setting the variable `FUZZAPI_OPENAPI_RELAXED_VALIDATION`. We recommend providing a fully compliant OpenAPI document to prevent unexpected behaviors.
+There are cases where the document is autogenerated with an invalid schema, or cannot be edited manually in a timely manner. In those scenarios, API Fuzzing can perform a relaxed validation if you set the variable `FUZZAPI_OPENAPI_RELAXED_VALIDATION`. We recommend providing a fully compliant OpenAPI document to prevent unexpected behaviors.
#### Edit a non-compliant OpenAPI file
@@ -1810,25 +2349,25 @@ If your OpenAPI document is generated manually, load your document in the editor
Relaxed validation is meant for cases when the OpenAPI document cannot meet OpenAPI specifications, but it still has enough content to be consumed by different tools. A validation is performed but less strictly in regards to document schema.
-API Security can still try to consume an OpenAPI document that does not fully comply with OpenAPI specifications. To instruct API Security to perform a relaxed validation, set the variable `FUZZAPI_OPENAPI_RELAXED_VALIDATION` to any value, for example:
+API Fuzzing can still try to consume an OpenAPI document that does not fully comply with OpenAPI specifications. To instruct the API Fuzzing analyzer to perform a relaxed validation, set the variable `FUZZAPI_OPENAPI_RELAXED_VALIDATION` to any value, for example:
```yaml
- stages:
- - fuzz
+stages:
+ - fuzz
- include:
- - template: API-Fuzzing.gitlab-ci.yml
+include:
+ - template: API-Fuzzing.gitlab-ci.yml
- variables:
- FUZZAPI_PROFILE: Quick-10
- FUZZAPI_TARGET_URL: http://test-deployment/
- FUZZAPI_OPENAPI: test-api-specification.json
- FUZZAPI_OPENAPI_RELAXED_VALIDATION: On
+variables:
+ FUZZAPI_PROFILE: Quick-10
+ FUZZAPI_TARGET_URL: http://test-deployment/
+ FUZZAPI_OPENAPI: test-api-specification.json
+ FUZZAPI_OPENAPI_RELAXED_VALIDATION: 'On'
```
### `No operation in the OpenAPI document is consuming any supported media type`
-API Security uses the specified media types in the OpenAPI document to generate requests. If no request can be created due to the lack of supported media types, then an error will be thrown.
+API Fuzzing uses the media types specified in the OpenAPI document to generate requests. If no request can be created due to the lack of supported media types, an error is thrown.
**Error message**
diff --git a/doc/user/application_security/configuration/index.md b/doc/user/application_security/configuration/index.md
index 9ca1a6f125f..32a523a1871 100644
--- a/doc/user/application_security/configuration/index.md
+++ b/doc/user/application_security/configuration/index.md
@@ -29,7 +29,7 @@ all security features are configured by default.
To view a project's security configuration:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Configuration**.
Select **Configuration history** to see the `.gitlab-ci.yml` file's history.
@@ -56,7 +56,7 @@ You can configure the following security controls:
- Can be configured by adding a configuration block to your agent configuration. For more details, read [Operational Container Scanning](../../clusters/agent/vulnerabilities.md#enable-operational-container-scanning).
- [Secret Detection](../secret_detection/index.md)
- Select **Configure with a merge request** to create a merge request with the changes required to
- enable Secret Detection. For more details, read [Enable Secret Detection via an automatic merge request](../secret_detection/index.md#enable-secret-detection-via-an-automatic-merge-request).
+ enable Secret Detection. For more details, read [Enable Secret Detection using a merge request](../secret_detection/index.md#enable-secret-detection-using-a-merge-request).
- [API Fuzzing](../api_fuzzing/index.md)
- Select **Enable API Fuzzing** to use API Fuzzing for the current project. For more details, read [API Fuzzing](../../../user/application_security/api_fuzzing/index.md#enable-web-api-fuzzing).
- [Coverage Fuzzing](../coverage_fuzzing/index.md)
diff --git a/doc/user/application_security/container_scanning/index.md b/doc/user/application_security/container_scanning/index.md
index 7bb3cb4f64c..961ccf6b563 100644
--- a/doc/user/application_security/container_scanning/index.md
+++ b/doc/user/application_security/container_scanning/index.md
@@ -1,7 +1,7 @@
---
type: reference, howto
-stage: Protect
-group: Container Security
+stage: Secure
+group: Composition Analysis
info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
---
@@ -14,6 +14,7 @@ info: To determine the technical writer assigned to the Stage/Group associated w
> - Integration with Grype as an alternative scanner [introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/326279) in GitLab 14.0.
> - [Changed](https://gitlab.com/gitlab-org/gitlab/-/merge_requests/86092) the major analyzer version from `4` to `5` in GitLab 15.0.
> - [Moved](https://gitlab.com/gitlab-org/gitlab/-/merge_requests/86783) from GitLab Ultimate to GitLab Free in 15.0.
+> - Container Scanning variables that reference Docker [renamed](https://gitlab.com/gitlab-org/gitlab/-/issues/357264) in GitLab 15.4.
Your application's Docker image may itself be based on Docker images that contain known
vulnerabilities. By including an extra Container Scanning job in your pipeline that scans for those
@@ -68,7 +69,7 @@ information directly in the merge request.
| [Solutions for vulnerabilities (auto-remediation)](#solutions-for-vulnerabilities-auto-remediation) | No | Yes |
| Support for the [vulnerability allow list](#vulnerability-allowlisting) | No | Yes |
| [Access to Security Dashboard page](#security-dashboard) | No | Yes |
-| [Access to Dependency List page](../dependency_list/) | No | Yes |
+| [Access to Dependency List page](../dependency_list/index.md) | No | Yes |
## Requirements
@@ -83,7 +84,7 @@ To enable container scanning in your pipeline, you need the following:
- [Build and push](../../packages/container_registry/index.md#build-and-push-by-using-gitlab-cicd)
the Docker image to your project's container registry.
- If you're using a third-party container registry, you might need to provide authentication
- credentials through the `DOCKER_USER` and `DOCKER_PASSWORD` [configuration variables](#available-cicd-variables).
+ credentials through the `CS_REGISTRY_USER` and `CS_REGISTRY_PASSWORD` [configuration variables](#available-cicd-variables).
For more details on how to use these variables, see [authenticate to a remote registry](#authenticate-to-a-remote-registry).
## Configuration
@@ -157,13 +158,13 @@ include:
container_scanning:
variables:
- DOCKER_IMAGE: example.com/user/image:tag
+ CS_IMAGE: example.com/user/image:tag
```
##### Authenticate to a remote registry
-Scanning an image in a private registry requires authentication. Provide the username in the `DOCKER_USER`
-variable, and the password in the `DOCKER_PASSWORD` configuration variable.
+Scanning an image in a private registry requires authentication. Provide the username in the `CS_REGISTRY_USER`
+variable, and the password in the `CS_REGISTRY_PASSWORD` configuration variable.
For example, to scan an image from AWS Elastic Container Registry:
@@ -178,9 +179,9 @@ container_scanning:
include:
- template: Security/Container-Scanning.gitlab-ci.yml
- DOCKER_IMAGE: <aws_account_id>.dkr.ecr.<region>.amazonaws.com/<image>:<tag>
- DOCKER_USER: AWS
- DOCKER_PASSWORD: "$AWS_ECR_PASSWORD"
+ CS_IMAGE: <aws_account_id>.dkr.ecr.<region>.amazonaws.com/<image>:<tag>
+ CS_REGISTRY_USER: AWS
+ CS_REGISTRY_PASSWORD: "$AWS_ECR_PASSWORD"
```
Authenticating to a remote registry is not supported when [FIPS mode](../../../development/fips_compliance.md#enable-fips-mode) is enabled.
@@ -190,7 +191,7 @@ Authenticating to a remote registry is not supported when [FIPS mode](../../../d
> [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/345434) in GitLab 14.6.
The `CS_DISABLE_DEPENDENCY_LIST` CI/CD variable controls whether the scan creates a
-[Dependency List](../dependency_list/)
+[Dependency List](../dependency_list/index.md)
report. This variable is currently only supported when the `trivy` analyzer is used. The variable's default setting of `"false"` causes the scan to create the report. To disable
the report, set the variable to `"true"`:
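A minimal sketch of such an override, assuming the standard template include:

```yaml
include:
  - template: Security/Container-Scanning.gitlab-ci.yml

container_scanning:
  variables:
    # Skip creation of the Dependency List report (supported by the trivy analyzer only).
    CS_DISABLE_DEPENDENCY_LIST: "true"
```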
@@ -230,10 +231,10 @@ container_scanning:
```
When you enable this feature, you may see [duplicate findings](../terminology/index.md#duplicate-finding)
-in the [Vulnerability Report](../vulnerability_report/)
-if [Dependency Scanning](../dependency_scanning/)
+in the [Vulnerability Report](../vulnerability_report/index.md)
+if [Dependency Scanning](../dependency_scanning/index.md)
is enabled for your project. This happens because GitLab can't automatically deduplicate findings
-across different types of scanning tools. Please reference [this comparison](../dependency_scanning/#dependency-scanning-compared-to-container-scanning)
+across different types of scanning tools. Please reference [this comparison](../dependency_scanning/index.md#dependency-scanning-compared-to-container-scanning)
between GitLab Dependency Scanning and Container Scanning for more details on which types of dependencies are likely to be duplicated.
#### Available CI/CD variables
@@ -251,7 +252,7 @@ including a large number of false positives.
| `CI_APPLICATION_REPOSITORY` | `$CI_REGISTRY_IMAGE/$CI_COMMIT_REF_SLUG` | Docker repository URL for the image to be scanned. | All |
| `CI_APPLICATION_TAG` | `$CI_COMMIT_SHA` | Docker repository tag for the image to be scanned. | All |
| `CS_ANALYZER_IMAGE` | `registry.gitlab.com/security-products/container-scanning:5` | Docker image of the analyzer. | All |
-| `CS_DEFAULT_BRANCH_IMAGE` | `""` | The name of the `DOCKER_IMAGE` on the default branch. See [Setting the default branch image](#setting-the-default-branch-image) for more details. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/338877) in GitLab 14.5. | All |
+| `CS_DEFAULT_BRANCH_IMAGE` | `""` | The name of the `CS_IMAGE` on the default branch. See [Setting the default branch image](#setting-the-default-branch-image) for more details. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/338877) in GitLab 14.5. | All |
| `CS_DISABLE_DEPENDENCY_LIST` | `"false"` | Disable Dependency Scanning for packages installed in the scanned image. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/345434) in GitLab 14.6. | All |
| `CS_DISABLE_LANGUAGE_VULNERABILITY_SCAN` | `"true"` | Disable scanning for language-specific packages installed in the scanned image. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/345434) in GitLab 14.6. | All |
| `CS_DOCKER_INSECURE` | `"false"` | Allow access to secure Docker registries using HTTPS without validating the certificates. | All |
@@ -259,10 +260,14 @@ including a large number of false positives.
| `CS_IGNORE_UNFIXED` | `"false"` | Ignore vulnerabilities that are not fixed. | All |
| `CS_REGISTRY_INSECURE` | `"false"` | Allow access to insecure registries (HTTP only). Should only be set to `true` when testing the image locally. Works with all scanners, but the registry must listen on port `80/tcp` for Trivy to work. | All |
| `CS_SEVERITY_THRESHOLD` | `UNKNOWN` | Severity level threshold. The scanner outputs vulnerabilities with severity level higher than or equal to this threshold. Supported levels are Unknown, Low, Medium, High, and Critical. | Trivy |
-| `DOCKER_IMAGE` | `$CI_APPLICATION_REPOSITORY:$CI_APPLICATION_TAG` | The Docker image to be scanned. If set, this variable overrides the `$CI_APPLICATION_REPOSITORY` and `$CI_APPLICATION_TAG` variables. | All |
-| `DOCKER_PASSWORD` | `$CI_REGISTRY_PASSWORD` | Password for accessing a Docker registry requiring authentication. The default is only set if `$DOCKER_IMAGE` resides at [`$CI_REGISTRY`](../../../ci/variables/predefined_variables.md). Not supported when [FIPS mode](../../../development/fips_compliance.md#enable-fips-mode) is enabled. | All |
-| `DOCKER_USER` | `$CI_REGISTRY_USER` | Username for accessing a Docker registry requiring authentication. The default is only set if `$DOCKER_IMAGE` resides at [`$CI_REGISTRY`](../../../ci/variables/predefined_variables.md). Not supported when [FIPS mode](../../../development/fips_compliance.md#enable-fips-mode) is enabled. | All |
-| `DOCKERFILE_PATH` | `Dockerfile` | The path to the `Dockerfile` to use for generating remediations. By default, the scanner looks for a file named `Dockerfile` in the root directory of the project. You should configure this variable only if your `Dockerfile` is in a non-standard location, such as a subdirectory. See [Solutions for vulnerabilities](#solutions-for-vulnerabilities-auto-remediation) for more details. | All |
+| <!-- start_remove The following content will be removed on remove_date: '2023-08-22' --> `DOCKER_IMAGE` | `$CI_APPLICATION_REPOSITORY:$CI_APPLICATION_TAG` | **Deprecated** will be removed in GitLab 16.0. Replaced by `CS_IMAGE`. The Docker image to be scanned. If set, this variable overrides the `$CI_APPLICATION_REPOSITORY` and `$CI_APPLICATION_TAG` variables. | All |
+| `DOCKER_PASSWORD` | `$CI_REGISTRY_PASSWORD` | **Deprecated** will be removed in GitLab 16.0. Replaced by `CS_REGISTRY_PASSWORD`. Password for accessing a Docker registry requiring authentication. The default is only set if `$DOCKER_IMAGE` resides at [`$CI_REGISTRY`](../../../ci/variables/predefined_variables.md). Not supported when [FIPS mode](../../../development/fips_compliance.md#enable-fips-mode) is enabled. | All |
+| `DOCKER_USER` | `$CI_REGISTRY_USER` | **Deprecated** will be removed in GitLab 16.0. Replaced by `CS_REGISTRY_USER`. Username for accessing a Docker registry requiring authentication. The default is only set if `$DOCKER_IMAGE` resides at [`$CI_REGISTRY`](../../../ci/variables/predefined_variables.md). Not supported when [FIPS mode](../../../development/fips_compliance.md#enable-fips-mode) is enabled. | All |
+| `DOCKERFILE_PATH` | `Dockerfile` | **Deprecated** will be removed in GitLab 16.0. Replaced by `CS_DOCKERFILE_PATH`. The path to the `Dockerfile` to use for generating remediations. By default, the scanner looks for a file named `Dockerfile` in the root directory of the project. You should configure this variable only if your `Dockerfile` is in a non-standard location, such as a subdirectory. See [Solutions for vulnerabilities](#solutions-for-vulnerabilities-auto-remediation) for more details. | All <!-- end_remove --> |
+| `CS_IMAGE` | `$CI_APPLICATION_REPOSITORY:$CI_APPLICATION_TAG` | The Docker image to be scanned. If set, this variable overrides the `$CI_APPLICATION_REPOSITORY` and `$CI_APPLICATION_TAG` variables. | All |
+| `CS_REGISTRY_PASSWORD` | `$CI_REGISTRY_PASSWORD` | Password for accessing a Docker registry requiring authentication. The default is only set if `$CS_IMAGE` resides at [`$CI_REGISTRY`](../../../ci/variables/predefined_variables.md). Not supported when [FIPS mode](../../../development/fips_compliance.md#enable-fips-mode) is enabled. | All |
+| `CS_REGISTRY_USER` | `$CI_REGISTRY_USER` | Username for accessing a Docker registry requiring authentication. The default is only set if `$CS_IMAGE` resides at [`$CI_REGISTRY`](../../../ci/variables/predefined_variables.md). Not supported when [FIPS mode](../../../development/fips_compliance.md#enable-fips-mode) is enabled. | All |
+| `CS_DOCKERFILE_PATH` | `Dockerfile` | The path to the `Dockerfile` to use for generating remediations. By default, the scanner looks for a file named `Dockerfile` in the root directory of the project. You should configure this variable only if your `Dockerfile` is in a non-standard location, such as a subdirectory. See [Solutions for vulnerabilities](#solutions-for-vulnerabilities-auto-remediation) for more details. | All |
| `SECURE_LOG_LEVEL` | `info` | Set the minimum logging level. Messages of this logging level or higher are output. From highest to lowest severity, the logging levels are: `fatal`, `error`, `warn`, `info`, `debug`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/10880) in GitLab 13.1. | All |
### Supported distributions
@@ -309,7 +314,7 @@ Starting with GitLab 14.10, `-fips` is automatically added to `CS_ANALYZER_IMAGE
enabled in the GitLab instance.
Container scanning of images in authenticated registries is not supported when [FIPS mode](../../../development/fips_compliance.md#enable-fips-mode)
-is enabled. When `CI_GITLAB_FIPS_MODE` is `"true"`, and `DOCKER_USER` or `DOCKER_PASSWORD` is set,
+is enabled. When `CI_GITLAB_FIPS_MODE` is `"true"`, and `CS_REGISTRY_USER` or `CS_REGISTRY_PASSWORD` is set,
the analyzer exits with an error and does not perform the scan.
### Enable Container Scanning through an automatic merge request
@@ -426,14 +431,14 @@ container_scanning:
variables:
CS_DEFAULT_BRANCH_IMAGE: $CI_REGISTRY_IMAGE:$CI_COMMIT_SHA
before_script:
- - export DOCKER_IMAGE="$CI_REGISTRY_IMAGE/$CI_COMMIT_BRANCH:$CI_COMMIT_SHA"
+ - export CS_IMAGE="$CI_REGISTRY_IMAGE/$CI_COMMIT_BRANCH:$CI_COMMIT_SHA"
- |
if [ "$CI_COMMIT_BRANCH" == "$CI_DEFAULT_BRANCH" ]; then
- export DOCKER_IMAGE="$CI_REGISTRY_IMAGE:$CI_COMMIT_SHA"
+ export CS_IMAGE="$CI_REGISTRY_IMAGE:$CI_COMMIT_SHA"
fi
```
-`CS_DEFAULT_BRANCH_IMAGE` should remain the same for a given `DOCKER_IMAGE`. If it changes, then a
+`CS_DEFAULT_BRANCH_IMAGE` should remain the same for a given `CS_IMAGE`. If it changes, then a
duplicate set of vulnerabilities are created, which must be manually dismissed.
When using [Auto DevOps](../../../topics/autodevops/index.md), `CS_DEFAULT_BRANCH_IMAGE` is
@@ -500,7 +505,7 @@ This example excludes from `gl-container-scanning-report.json`:
- `generalallowlist` block allows you to specify CVE IDs globally. All vulnerabilities with matching CVE IDs are excluded from the scan report.
-- `images` block allows you to specify CVE IDs for each container image independently. All vulnerabilities from the given image with matching CVE IDs are excluded from the scan report. The image name is retrieved from one of the environment variables used to specify the Docker image to be scanned, such as `$CI_APPLICATION_REPOSITORY:$CI_APPLICATION_TAG` or `DOCKER_IMAGE`. The image provided in this block **must** match this value and **must not** include the tag value. For example, if you specify the image to be scanned using `DOCKER_IMAGE=alpine:3.7`, then you would use `alpine` in the `images` block, but you cannot use `alpine:3.7`.
+- `images` block allows you to specify CVE IDs for each container image independently. All vulnerabilities from the given image with matching CVE IDs are excluded from the scan report. The image name is retrieved from one of the environment variables used to specify the Docker image to be scanned, such as `$CI_APPLICATION_REPOSITORY:$CI_APPLICATION_TAG` or `CS_IMAGE`. The image provided in this block **must** match this value and **must not** include the tag value. For example, if you specify the image to be scanned using `CS_IMAGE=alpine:3.7`, then you would use `alpine` in the `images` block, but you cannot use `alpine:3.7`.
You can specify a container image in multiple ways:
@@ -649,8 +654,8 @@ you're using a non-GitLab Docker registry, you must change the `$CI_REGISTRY` va
To scan an image in an external private registry, you must configure access credentials so the
container scanning analyzer can authenticate itself before attempting to access the image to scan.
-If you use the GitLab [Container Registry](../../packages/container_registry/),
-the `DOCKER_USER` and `DOCKER_PASSWORD` [configuration variables](#available-cicd-variables)
+If you use the GitLab [Container Registry](../../packages/container_registry/index.md),
+the `CS_REGISTRY_USER` and `CS_REGISTRY_PASSWORD` [configuration variables](#available-cicd-variables)
are set automatically and you can skip this configuration.
This example shows the configuration needed to scan images in a private [Google Container Registry](https://cloud.google.com/container-registry/):
@@ -661,12 +666,12 @@ include:
container_scanning:
variables:
- DOCKER_USER: _json_key
- DOCKER_PASSWORD: "$GCP_CREDENTIALS"
- DOCKER_IMAGE: "gcr.io/path-to-you-registry/image:tag"
+ CS_REGISTRY_USER: _json_key
+ CS_REGISTRY_PASSWORD: "$GCP_CREDENTIALS"
+    CS_IMAGE: "gcr.io/path-to-your-registry/image:tag"
```
-Before you commit this configuration, [add a CI/CD variable](../../../ci/variables/#add-a-cicd-variable-to-a-project)
+Before you commit this configuration, [add a CI/CD variable](../../../ci/variables/index.md#add-a-cicd-variable-to-a-project)
for `GCP_CREDENTIALS` containing the JSON key, as described in the
[Google Cloud Platform Container Registry documentation](https://cloud.google.com/container-registry/docs/advanced-authentication#json-key).
Also:
@@ -706,12 +711,12 @@ The results are stored in `gl-container-scanning-report.json`.
## Reports JSON format
The container scanning tool emits JSON reports which the [GitLab Runner](https://docs.gitlab.com/runner/)
-recognizes through the [`artifacts:reports`](../../../ci/yaml/#artifactsreports)
+recognizes through the [`artifacts:reports`](../../../ci/yaml/index.md#artifactsreports)
keyword in the CI configuration file.
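For illustration, a job declares such a report with `artifacts:reports:container_scanning`, roughly as in the following sketch; the bundled container scanning template already declares this, so you normally don't write it yourself:

```yaml
container_scanning:
  artifacts:
    reports:
      # The report file produced by the analyzer.
      container_scanning: gl-container-scanning-report.json
```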
Once the CI job finishes, the Runner uploads these reports to GitLab, and they are then available in
the CI Job artifacts. In GitLab Ultimate, these reports can be viewed in the corresponding [pipeline](../vulnerability_report/pipeline.md)
-and become part of the [Vulnerability Report](../vulnerability_report/).
+and become part of the [Vulnerability Report](../vulnerability_report/index.md).
These reports must follow a format defined in the
[security report schemas](https://gitlab.com/gitlab-org/security-products/security-report-schemas/). See:
@@ -772,7 +777,7 @@ Some vulnerabilities can be fixed by applying the solution that GitLab
automatically generates.
To enable remediation support, the scanning tool _must_ have access to the `Dockerfile` specified by
-the [`DOCKERFILE_PATH`](#available-cicd-variables) CI/CD variable. To ensure that the scanning tool
+the [`CS_DOCKERFILE_PATH`](#available-cicd-variables) CI/CD variable. To ensure that the scanning tool
has access to this
file, it's necessary to set [`GIT_STRATEGY: fetch`](../../../ci/runners/configure_runners.md#git-strategy) in
your `.gitlab-ci.yml` file by following the instructions described in this document's
diff --git a/doc/user/application_security/coverage_fuzzing/index.md b/doc/user/application_security/coverage_fuzzing/index.md
index 154884c16e7..bec242a0426 100644
--- a/doc/user/application_security/coverage_fuzzing/index.md
+++ b/doc/user/application_security/coverage_fuzzing/index.md
@@ -53,7 +53,7 @@ You can use the following fuzzing engines to test the specified languages.
To confirm the status of coverage-guided fuzz testing:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Configuration**.
1. In the **Coverage Fuzzing** section the status is:
- **Not configured**
@@ -174,7 +174,7 @@ artifacts files you can download from the CI/CD pipeline.
To view details of the corpus registry:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Configuration**.
1. In the **Coverage Fuzzing** section, select **Manage corpus**.
@@ -202,7 +202,7 @@ provided by the `COVFUZZ_CORPUS_NAME` variable. The corpus is updated on every p
To upload an existing corpus file:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Configuration**.
1. In the **Coverage Fuzzing** section, select **Manage corpus**.
1. Select **New corpus**.
@@ -277,7 +277,7 @@ For a complete example, read the [Go coverage-guided fuzzing example](https://gi
It's also possible to run the coverage-guided fuzzing jobs longer and without blocking your main
pipeline. This configuration uses the GitLab
-[parent-child pipelines](../../../ci/pipelines/parent_child_pipelines.md).
+[parent-child pipelines](../../../ci/pipelines/downstream_pipelines.md#parent-child-pipelines).
The suggested workflow in this scenario is to have long-running, asynchronous fuzzing jobs on the
main or development branch, and short synchronous fuzzing jobs on all other branches and MRs. This
@@ -376,4 +376,4 @@ corpus file extracts into a folder named `corpus`.
If you see this error message when running the fuzzing job with `COVFUZZ_USE_REGISTRY` set to `true`,
ensure that duplicates are allowed. For more details, see
-[duplicate Generic packages](../../packages/generic_packages/#do-not-allow-duplicate-generic-packages).
+[duplicate Generic packages](../../packages/generic_packages/index.md#do-not-allow-duplicate-generic-packages).
diff --git a/doc/user/application_security/cve_id_request.md b/doc/user/application_security/cve_id_request.md
index 5ffd47527c5..6f076bbe3f9 100644
--- a/doc/user/application_security/cve_id_request.md
+++ b/doc/user/application_security/cve_id_request.md
@@ -1,6 +1,6 @@
---
type: tutorial
-stage: Secure
+stage: Govern
group: Threat Insights
info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
---
diff --git a/doc/user/application_security/dast/checks/16.7.md b/doc/user/application_security/dast/checks/16.7.md
index 2e6607575db..a052149ee4d 100644
--- a/doc/user/application_security/dast/checks/16.7.md
+++ b/doc/user/application_security/dast/checks/16.7.md
@@ -25,7 +25,7 @@ Only three directives are applicable for the `Strict-Transport-Security` header.
Note that invalid directives, or the `Strict-Transport-Security` header appearing more than once (if the values are
different) is considered invalid.
-Prior to adding to this security configuration to your website, it is recommended you review the hstspreload.org
+Prior to adding this security configuration to your website, it is recommended you review the hstspreload.org
[Deployment Recommendations](https://hstspreload.org/#deployment-recommendations).
## Details
diff --git a/doc/user/application_security/dast/checks/209.1.md b/doc/user/application_security/dast/checks/209.1.md
index 2e4163bdec0..f2713a70afd 100644
--- a/doc/user/application_security/dast/checks/209.1.md
+++ b/doc/user/application_security/dast/checks/209.1.md
@@ -9,17 +9,17 @@ info: To determine the technical writer assigned to the Stage/Group associated w
## Description
The application was found to return error data such as stack traces. Depending on the data contained within the error message,
-this information could be used by an attacker to conduct further attacks. While stack traces are helpful during development
-and debugging, they should not be presented to users when an error occurs.
+this information could be used by an attacker to conduct further attacks. While stack traces are helpful during development
+and debugging, they should not be presented to users when an error occurs.
## Remediation
Applications should handle exception conditions internally and map known failure types to error codes that can be displayed
to a user. These error codes should be customized to the application and returned along with the relevant HTTP error code.
-When an error occurs, the application identifies the error type or class, and displays a numerical value to the
-user. Requests should also be tracked so when a user is presented with an error code, it has a corresponding request ID.
-Support teams can then correlate the HTTP error, the customized error code, and the request ID in the log files to
+When an error occurs, the application identifies the error type or class, and displays a numerical value to the
+user. Requests should also be tracked so when a user is presented with an error code, it has a corresponding request ID.
+Support teams can then correlate the HTTP error, the customized error code, and the request ID in the log files to
determine the root cause of the error without leaking details to the end user.
Example of returning customized errors:
diff --git a/doc/user/application_security/dast/dast_troubleshooting.md b/doc/user/application_security/dast/dast_troubleshooting.md
index 0c7a9806c72..4e87f1898cc 100644
--- a/doc/user/application_security/dast/dast_troubleshooting.md
+++ b/doc/user/application_security/dast/dast_troubleshooting.md
@@ -5,7 +5,7 @@ info: To determine the technical writer assigned to the Stage/Group associated w
type: reference, howto
---
-# Dynamic Application Security Testing (DAST) Troubleshooting **(ULTIMATE)**
+# Troubleshooting Dynamic Application Security Testing (DAST) **(ULTIMATE)**
The following troubleshooting scenarios have been collected from customer support cases. If you
experience a problem not addressed here, or the information here does not fix your problem, create a
diff --git a/doc/user/application_security/dast/index.md b/doc/user/application_security/dast/index.md
index a49dd8fd646..0f446ddee3e 100644
--- a/doc/user/application_security/dast/index.md
+++ b/doc/user/application_security/dast/index.md
@@ -280,7 +280,7 @@ page.
You can enable or configure DAST settings using the UI. The generated settings are formatted so they
can be conveniently pasted into the `.gitlab-ci.yml` file.
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Configuration**.
1. In the **Dynamic Application Security Testing (DAST)** section, select **Enable DAST** or
**Configure DAST**.
@@ -357,13 +357,9 @@ variables:
#### Import API specification from a file
If your API specification file is in your repository, you can provide its filename as the target.
-The API specification file must be in the `/zap/wrk` directory.
```yaml
dast:
- before_script:
- - mkdir -p /zap/wrk
- - cp api-specification.yml /zap/wrk/api-specification.yml
variables:
GIT_STRATEGY: fetch
DAST_API_SPECIFICATION: api-specification.yml
@@ -1075,7 +1071,7 @@ The on-demand DAST scan runs and the project's dashboard shows the results.
To run a saved on-demand scan:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > On-demand Scans**.
1. Select the **Scan library** tab.
1. In the scan's row, select **Run scan**.
@@ -1094,7 +1090,7 @@ The on-demand DAST scan runs, and the project's dashboard shows the results.
To schedule a scan:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > On-demand Scans**.
1. Select **New scan**.
1. Complete the **Scan name** and **Description** text boxes.
@@ -1143,14 +1139,16 @@ To delete an on-demand scan:
1. In the saved scan's row select **More actions** (**{ellipsis_v}**), then select **Delete**.
1. Select **Delete** to confirm the deletion.
-### Site profile
+## Site profile
-A site profile describes the attributes of a web site to scan on demand with DAST. A site profile is
-required for an on-demand DAST scan.
+A site profile defines the attributes and configuration details of the deployed application,
+website, or API to be scanned by DAST. A site profile can be referenced in `.gitlab-ci.yml` and
+on-demand scans.
-A site profile contains the following:
+A site profile contains:
-- **Profile name**: A name you assign to the site to be scanned.
+- **Profile name**: A name you assign to the site to be scanned. While a site profile is referenced
+ in either `.gitlab-ci.yml` or an on-demand scan, it **cannot** be renamed.
- **Site type**: The type of target to be scanned, either website or API scan.
- **Target URL**: The URL that DAST runs against.
- **Excluded URLs**: A comma-separated list of URLs to exclude from the scan.
@@ -1168,7 +1166,7 @@ When an API site type is selected, a [host override](#host-override) is used to
When configured, request headers and password fields are encrypted using [`aes-256-gcm`](https://en.wikipedia.org/wiki/Advanced_Encryption_Standard) before being stored in the database.
This data can only be read and decrypted with a valid secrets file.
-#### Site profile validation
+### Site profile validation
> - Site profile validation [introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/233020) in GitLab 13.8.
> - Meta tag validation [introduced](https://gitlab.com/groups/gitlab-org/-/epics/6460) in GitLab 14.2.
@@ -1192,7 +1190,7 @@ All these methods are equivalent in functionality. Use whichever is feasible.
In [GitLab 14.2 and later](https://gitlab.com/gitlab-org/gitlab/-/issues/324990), site profile
validation happens in a CI job using the [GitLab Runner](../../../ci/runners/index.md).
-#### Create a site profile
+### Create a site profile
To create a site profile:
@@ -1203,7 +1201,7 @@ To create a site profile:
The site profile is created.
-#### Edit a site profile
+### Edit a site profile
If a site profile is linked to a security policy, a user cannot edit the profile from this page. See
[Scan execution policies](../policies/scan-execution-policies.md)
@@ -1220,7 +1218,7 @@ To edit a site profile:
1. In the profile's row select the **More actions** (**{ellipsis_v}**) menu, then select **Edit**.
1. Edit the fields then select **Save profile**.
-#### Delete a site profile
+### Delete a site profile
If a site profile is linked to a security policy, a user cannot delete the profile from this page.
See [Scan execution policies](../policies/scan-execution-policies.md)
@@ -1234,13 +1232,13 @@ To delete a site profile:
1. In the profile's row, select the **More actions** (**{ellipsis_v}**) menu, then select **Delete**.
1. Select **Delete** to confirm the deletion.
-#### Validate a site profile
+### Validate a site profile
Validating a site is required to run an active scan.
To validate a site profile:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Configuration**.
1. In the **Dynamic Application Security Testing (DAST)** section, select **Manage profiles**.
1. Select the **Site Profiles** tab.
@@ -1266,7 +1264,7 @@ To validate a site profile:
The site is validated and an active scan can run against it. A site profile's validation status is
revoked only when it's revoked manually, or its file, header, or meta tag is edited.
-#### Retry a failed validation
+### Retry a failed validation
> - [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/322609) in GitLab 14.3.
> - [Deployed behind the `dast_failed_site_validations` flag](../../../administration/feature_flags.md), enabled by default.
@@ -1277,13 +1275,13 @@ page.
To retry a site profile's failed validation:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Configuration**.
1. In the **Dynamic Application Security Testing (DAST)** section, select **Manage profiles**.
1. Select the **Site Profiles** tab.
1. In the profile's row, select **Retry validation**.
-#### Revoke a site profile's validation status
+### Revoke a site profile's validation status
WARNING:
When a site profile's validation status is revoked, all site profiles that share the same URL also
@@ -1297,12 +1295,12 @@ To revoke a site profile's validation status:
The site profile's validation status is revoked.
-#### Validated site profile headers
+### Validated site profile headers
The following are code samples of how you can provide the required site profile header in your
application.
-##### Ruby on Rails example for on-demand scan
+#### Ruby on Rails example for on-demand scan
Here's how you can add a custom header in a Ruby on Rails application:
@@ -1315,7 +1313,7 @@ class DastWebsiteTargetController < ActionController::Base
end
```
-##### Django example for on-demand scan
+#### Django example for on-demand scan
Here's how you can add a
[custom header in Django](https://docs.djangoproject.com/en/2.2/ref/request-response/#setting-header-fields):
@@ -1329,7 +1327,7 @@ class DastWebsiteTargetView(View):
return response
```
-##### Node (with Express) example for on-demand scan
+#### Node (with Express) example for on-demand scan
Here's how you can add a
[custom header in Node (with Express)](https://expressjs.com/en/5x/api.html#res.append):
@@ -1341,22 +1339,26 @@ app.get('/dast-website-target', function(req, res) {
})
```
-### Scanner profile
+## Scanner profile
> - [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/222767) in GitLab 13.4.
> - [Added](https://gitlab.com/gitlab-org/gitlab/-/issues/225804) in GitLab 13.5: scan mode, AJAX spider, debug messages.
-A scanner profile defines the scanner settings used to run an on-demand scan:
+A scanner profile defines the configuration details of a security scanner. A scanner profile can be
+referenced in `.gitlab-ci.yml` and on-demand scans.
-- **Profile name:** A name you give the scanner profile. For example, "Spider_15".
+A scanner profile contains:
+
+- **Profile name:** A name you give the scanner profile. For example, "Spider_15". While a scanner
+ profile is referenced in either `.gitlab-ci.yml` or an on-demand scan, it **cannot** be renamed.
- **Scan mode:** A passive scan monitors all HTTP messages (requests and responses) sent to the target. An active scan attacks the target to find potential vulnerabilities.
- **Spider timeout:** The maximum number of minutes allowed for the spider to traverse the site.
- **Target timeout:** The maximum number of seconds DAST waits for the site to be available before
starting the scan.
-- **AJAX spider:** Run the AJAX spider, in addition to the traditional spider, to crawl the target site.
+- **AJAX spider:** Run the AJAX spider, in addition to the traditional spider, to crawl the target site.
- **Debug messages:** Include debug messages in the DAST console output.
-#### Create a scanner profile
+### Create a scanner profile
To create a scanner profile:
@@ -1366,7 +1368,7 @@ To create a scanner profile:
1. Complete the form. For details of each field, see [Scanner profile](#scanner-profile).
1. Select **Save profile**.
-#### Edit a scanner profile
+### Edit a scanner profile
If a scanner profile is linked to a security policy, a user cannot edit the profile from this page.
See [Scan execution policies](../policies/scan-execution-policies.md)
@@ -1381,7 +1383,7 @@ To edit a scanner profile:
1. Edit the form.
1. Select **Save profile**.
-#### Delete a scanner profile
+### Delete a scanner profile
If a scanner profile is linked to a security policy, a user cannot delete the profile from this
page. See [Scan execution policies](../policies/scan-execution-policies.md)
diff --git a/doc/user/application_security/dast_api/index.md b/doc/user/application_security/dast_api/index.md
index 1f86f2ffa49..f15dce37123 100644
--- a/doc/user/application_security/dast_api/index.md
+++ b/doc/user/application_security/dast_api/index.md
@@ -55,6 +55,7 @@ The following projects demonstrate DAST API scanning:
You can specify the API you want to scan by using:
- [OpenAPI v2 or v3 Specification](#openapi-specification)
+- [GraphQL Schema](#graphql-schema)
- [HTTP Archive (HAR)](#http-archive-har)
- [Postman Collection v2.0 or v2.1](#postman-collection)
@@ -199,7 +200,119 @@ variables:
DAST_API_TARGET_URL: http://test-deployment/
```
-This is a minimal configuration for DAST API. From here you can:
+This example is a minimal configuration for DAST API. From here you can:
+
+- [Run your first scan](#running-your-first-scan).
+- [Add authentication](#authentication).
+- Learn how to [handle false positives](#handling-false-positives).
+
+### GraphQL Schema
+
+> Support for GraphQL Schema was [introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/352780) in GitLab 15.4.
+
+GraphQL is a query language for your API and an alternative to REST APIs.
+DAST API supports testing GraphQL endpoints in multiple ways:
+
+- Test using the GraphQL Schema. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/352780) in GitLab 15.4.
+- Test using a recording (HAR) of GraphQL queries.
+- Test using a Postman Collection containing GraphQL queries.
+
+This section documents how to test using a GraphQL schema. The GraphQL schema support in
+DAST API is able to query the schema from endpoints that support introspection.
+Introspection is enabled by default to allow tools like GraphiQL to work.
+
+#### DAST API scanning with a GraphQL endpoint URL
+
+The GraphQL support in DAST API is able to query a GraphQL endpoint for the schema.
+
+NOTE:
+The GraphQL endpoint must support introspection queries for this method to work correctly.
+
+To configure DAST API to use a GraphQL endpoint URL that provides information about the target API to test:
+
+1. [Include](../../../ci/yaml/index.md#includetemplate)
+ the [`DAST-API.gitlab-ci.yml` template](https://gitlab.com/gitlab-org/gitlab/-/blob/master/lib/gitlab/ci/templates/Security/DAST-API.gitlab-ci.yml) in your `.gitlab-ci.yml` file.
+
+1. Provide the path to the GraphQL endpoint, for example `/api/graphql`. Specify the location by adding the `DAST_API_GRAPHQL` variable.
+
+1. The target API instance's base URL is also required. Provide it by using the `DAST_API_TARGET_URL`
+ variable or an `environment_url.txt` file.
+
+ Adding the URL in an `environment_url.txt` file at your project's root is great for testing in
+ dynamic environments. See the [dynamic environment solutions](#dynamic-environment-solutions) section of our documentation for more information.
+
+Complete example configuration of using a GraphQL endpoint path:
+
+```yaml
+stages:
+ - dast
+
+include:
+ - template: DAST-API.gitlab-ci.yml
+
+dast_api:
+ variables:
+ DAST_API_GRAPHQL: /api/graphql
+ DAST_API_TARGET_URL: http://test-deployment/
+```
+
+This example is a minimal configuration for DAST API. From here you can:
+
+- [Run your first scan](#running-your-first-scan).
+- [Add authentication](#authentication).
+- Learn how to [handle false positives](#handling-false-positives).
+
+#### DAST API scanning with a GraphQL Schema file
+
+To configure DAST API to use a GraphQL schema file that provides information about the target API to test:
+
+1. [Include](../../../ci/yaml/index.md#includetemplate)
+ the [`DAST-API.gitlab-ci.yml` template](https://gitlab.com/gitlab-org/gitlab/-/blob/master/lib/gitlab/ci/templates/Security/DAST-API.gitlab-ci.yml) in your `.gitlab-ci.yml` file.
+
+1. Provide the GraphQL endpoint path, for example `/api/graphql`. Specify the path by adding the `DAST_API_GRAPHQL` variable.
+
+1. Provide the location of the GraphQL schema file. You can provide the location as a file path
+ or URL. Specify the location by adding the `DAST_API_GRAPHQL_SCHEMA` variable.
+
+1. The target API instance's base URL is also required. Provide it by using the `DAST_API_TARGET_URL`
+ variable or an `environment_url.txt` file.
+
+ Adding the URL in an `environment_url.txt` file at your project's root is great for testing in
+ dynamic environments. See the [dynamic environment solutions](#dynamic-environment-solutions) section of our documentation for more information.
+
+Complete example configuration of using a GraphQL schema file:
+
+```yaml
+stages:
+ - dast
+
+include:
+ - template: DAST-API.gitlab-ci.yml
+
+dast_api:
+ variables:
+ DAST_API_GRAPHQL: /api/graphql
+ DAST_API_GRAPHQL_SCHEMA: test-api-graphql.schema
+ DAST_API_TARGET_URL: http://test-deployment/
+```
+
+Complete example configuration of using a GraphQL schema file URL:
+
+```yaml
+stages:
+ - dast
+
+include:
+ - template: DAST-API.gitlab-ci.yml
+
+dast_api:
+ variables:
+ DAST_API_GRAPHQL: /api/graphql
+ DAST_API_GRAPHQL_SCHEMA: http://file-store/files/test-api-graphql.schema
+ DAST_API_TARGET_URL: http://test-deployment/
+```
+
+This example is a minimal configuration for DAST API. From here you can:
- [Run your first scan](#running-your-first-scan).
- [Add authentication](#authentication).
@@ -267,35 +380,266 @@ This is a minimal configuration for DAST API. From here you can:
- [Add authentication](#authentication).
- Learn how to [handle false positives](#handling-false-positives).
-##### Postman variables
+#### Postman variables
+
+> - Support for Postman Environment file format was [introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/356312) in GitLab 15.1.
+> - Support for multiple variable files was [introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/356312) in GitLab 15.1.
+> - Support for Postman variable scopes: Global and Environment was [introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/356312) in GitLab 15.1.
+
+##### Variables in Postman Client
Postman allows the developer to define placeholders that can be used in different parts of the
-requests. These placeholders are called variables, as explained in [Using variables](https://learning.postman.com/docs/sending-requests/variables/).
+requests. These placeholders are called variables, as explained in [using variables](https://learning.postman.com/docs/sending-requests/variables/).
You can use variables to store and reuse values in your requests and scripts. For example, you can
edit the collection to add variables to the document:
![Edit collection variable tab View](img/dast_api_postman_collection_edit_variable.png)
+Alternatively, you can add variables in an environment:
+
+![Edit environment variables View](img/dast_api_postman_environment_edit_variable.png)
+
You can then use the variables in sections such as URL, headers, and others:
![Edit request using variables View](img/dast_api_postman_request_edit.png)
-Variables can be defined at different [scopes](https://learning.postman.com/docs/sending-requests/variables/#variable-scopes)
-(for example, Global, Collection, Environment, Local, and Data). In this example, they're defined at
-the Environment scope:
+Postman has grown from a basic client tool with a polished user experience into a more complex ecosystem that allows testing APIs with scripts, creating complex collections that trigger secondary requests, and setting variables along the way. Not every feature in the Postman ecosystem is supported; for example, scripts are not supported. The main focus of the Postman support is to ingest Postman Collection definitions that are used by the Postman Client, together with their related variables defined in the workspace, environments, and the collections themselves.
-![Edit environment variables View](img/dast_api_postman_environment_edit_variable.png)
+Postman allows creating variables in different scopes. Each scope has a different level of visibility in the Postman tools. For example, you can create a variable in a _global environment_ scope that is seen by every operation definition and workspace. You can also create a variable in a specific _environment_ scope that is only visible and used when that specific environment is selected for use. Some scopes are not always available. For example, requests created in the Postman Client do not have a _local_ scope, but test scripts do.
-When you export a Postman collection, only Postman collection variables are exported into the
-Postman file. For example, Postman does not export environment-scoped variables into the Postman
-file.
+Variable scopes in Postman can be a daunting topic, and not everyone is familiar with them. We strongly recommend that you read [Variable Scopes](https://learning.postman.com/docs/sending-requests/variables/#variable-scopes) in the Postman documentation before moving forward.
-By default, the DAST API scanner uses the Postman file to resolve Postman variable values. If a JSON file
-is set in a GitLab CI/CD environment variable `DAST_API_POSTMAN_COLLECTION_VARIABLES`, then the JSON
-file takes precedence to get Postman variable values.
+As mentioned above, there are different variable scopes, each with its own purpose, and they can be used to provide more flexibility to your Postman document. There is an important note on how values for variables are computed, as stated in the Postman documentation:
-WARNING:
-Although Postman can export environment variables into a JSON file, the format is not compatible with the JSON expected by `DAST_API_POSTMAN_COLLECTION_VARIABLES`.
+> If a variable with the same name is declared in two different scopes, the value stored in the variable with narrowest scope is used. For example, if there is a global variable named `username` and a local variable named `username`, the local value is used when the request runs.
+
+The following is a summary of the variable scopes supported by the Postman Client and DAST API:
+
+- **Global Environment (Global) scope** is a special pre-defined environment that is available throughout a workspace. We can also refer to the _global environment_ scope as the _global_ scope. The Postman Client allows exporting the global environment into a JSON file, which can be used with DAST API.
+- **Environment scope** is a named group of variables created by a user in the Postman Client.
+The Postman Client supports a single active environment along with the global environment. The variables defined in an active user-created environment take precedence over variables defined in the global environment. The Postman Client allows exporting your environment into a JSON file, which can be used with DAST API.
+- **Collection scope** is a group of variables declared in a given collection. The collection variables are available to the collection where they have been declared and to the nested requests or collections. Variables defined in the collection scope take precedence over the _global environment_ scope and the _environment_ scope.
+The Postman Client can export one or more collections into a JSON file. This JSON file contains the selected collections, requests, and collection variables.
+- **DAST API Scope** is a new scope added by DAST API to allow users to provide extra variables, or to override variables defined in other supported scopes. This scope is not supported by Postman. The _DAST API Scope_ variables are provided using a [custom JSON file format](#dast-api-scope-custom-json-file-format). Use it to:
+  - Override values defined in the environment or collection scopes.
+  - Define variables that would otherwise come from scripts.
+  - Define a single row of data from the unsupported _data_ scope.
+- **Data scope** is a group of variables whose names and values come from JSON or CSV files. A Postman collection runner like [Newman](https://learning.postman.com/docs/running-collections/using-newman-cli/command-line-integration-with-newman/) or [Postman Collection Runner](https://learning.postman.com/docs/running-collections/intro-to-collection-runs/) executes the requests in a collection as many times as there are entries in the JSON or CSV file. A good use case for these variables is to automate tests using scripts in Postman.
+DAST API does **not** support reading data from a CSV or JSON file.
+- **Local scope** is a group of variables defined in Postman scripts. DAST API does **not** support Postman scripts and, by extension, variables defined in scripts. You can still provide values for script-defined variables by defining them in one of the supported scopes or in our custom JSON format.
+
+Not all scopes are supported by DAST API, and variables defined in scripts are not supported. The following table is sorted from broadest scope to narrowest scope.
+
+| Scope |Postman | DAST API | Comment |
+| ------------------ |:---------:|:------------:| :--------|
+| Global Environment | Yes | Yes | Special pre-defined environment |
+| Environment | Yes | Yes | Named environments |
+| Collection         | Yes       | Yes          | Defined in your Postman collection |
+| DAST API Scope | No | Yes | Custom scope added by DAST API |
+| Data | Yes | No | External files in CSV or JSON format |
+| Local | Yes | No | Variables defined in scripts |
+
+For more details on how to define variables and export variables in different scopes, see:
+
+- [Defining collection variables](https://learning.postman.com/docs/sending-requests/variables/#defining-collection-variables)
+- [Defining environment variables](https://learning.postman.com/docs/sending-requests/variables/#defining-environment-variables)
+- [Defining global variables](https://learning.postman.com/docs/sending-requests/variables/#defining-global-variables)
+
+##### Exporting from Postman Client
+
+The Postman Client lets you export different file formats. For instance, you can export a Postman collection or a Postman environment.
+The exported environment can be the global environment (which is always available) or any custom environment you have previously created. When you export a Postman Collection, it may contain only declarations for _collection_ and _local_ scoped variables; _environment_ scoped variables are not included.
+
+To get the declarations for _environment_ scoped variables, you have to export one environment at a time. Each exported file includes only variables from the selected environment.
+
+For more details on exporting variables in different supported scopes, see:
+
+- [Exporting collections](https://learning.postman.com/docs/getting-started/importing-and-exporting-data/#exporting-collections)
+- [Exporting environments](https://learning.postman.com/docs/getting-started/importing-and-exporting-data/#exporting-environments)
+- [Downloading global environments](https://learning.postman.com/docs/sending-requests/variables/#downloading-global-environments)
+
+##### DAST API Scope, custom JSON file format
+
+Our custom JSON file format is a JSON object where each object property represents a variable name and the property value represents the variable value. This file can be created using your favorite text editor, or it can be produced by an earlier job in your pipeline.
+
+This example defines two variables `base_url` and `token` in the DAST API scope:
+
+```json
+{
+ "base_url": "http://127.0.0.1/",
+ "token": "Token 84816165151"
+}
+```
+
+##### Using scopes with DAST API
+
+The _global_, _environment_, _collection_, and _GitLab DAST API_ scopes are supported in [GitLab 15.1 and later](https://gitlab.com/gitlab-org/gitlab/-/issues/356312). GitLab 15.0 and earlier supports only the _collection_ and _GitLab DAST API_ scopes.
+
+The following table provides a quick reference for mapping scope files/URLs to DAST API configuration variables:
+
+| Scope | How to Provide |
+| ------------------ | --------------- |
+| Global Environment | `DAST_API_POSTMAN_COLLECTION_VARIABLES` |
+| Environment | `DAST_API_POSTMAN_COLLECTION_VARIABLES` |
+| Collection | `DAST_API_POSTMAN_COLLECTION` |
+| DAST API Scope | `DAST_API_POSTMAN_COLLECTION_VARIABLES` |
+| Data | Not supported |
+| Local | Not supported |
+
+The Postman Collection document automatically includes any _collection_ scoped variables. The Postman Collection is provided with the configuration variable `DAST_API_POSTMAN_COLLECTION`. This variable can be set to a single [exported Postman collection](https://learning.postman.com/docs/getting-started/importing-and-exporting-data/#exporting-collections).
+
+Variables from other scopes are provided through the `DAST_API_POSTMAN_COLLECTION_VARIABLES` configuration variable. The configuration variable supports a comma (`,`) delimited file list in [GitLab 15.1 and later](https://gitlab.com/gitlab-org/gitlab/-/issues/356312). GitLab 15.0 and earlier supports only a single file. The order of the files provided is not important, as the files provide the needed scope information. A combined example follows the list below.
+
+The configuration variable `DAST_API_POSTMAN_COLLECTION_VARIABLES` can be set to:
+
+- [Exported Global environment](https://learning.postman.com/docs/sending-requests/variables/#downloading-global-environments)
+- [Exported environments](https://learning.postman.com/docs/getting-started/importing-and-exporting-data/#exporting-environments)
+- [DAST API Custom JSON format](#dast-api-scope-custom-json-file-format)
+
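+For example, the following sketch combines an exported collection with variable files from several scopes (multiple files require GitLab 15.1 or later). The file names are illustrative; `dast-api-scope.json` stands for a file in the [custom JSON format](#dast-api-scope-custom-json-file-format):
+
+```yaml
+stages:
+  - dast
+
+include:
+  - template: DAST-API.gitlab-ci.yml
+
+variables:
+  DAST_API_PROFILE: Quick
+  DAST_API_POSTMAN_COLLECTION: postman-collection.json
+  # Comma-delimited list: global environment, named environment, and DAST API scope files.
+  DAST_API_POSTMAN_COLLECTION_VARIABLES: global-scope.json,environment-scope.json,dast-api-scope.json
+  DAST_API_TARGET_URL: http://test-deployment/
+```
+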
+##### Undefined Postman variables
+
+There is a chance that the DAST API engine does not find all the variable references that your Postman collection file uses. Some possible causes are:
+
+- You are using _data_ or _local_ scoped variables, and as stated previously these scopes are not supported by DAST API. Unless the values for these variables have been provided through [the DAST API scope](#dast-api-scope-custom-json-file-format), the values of the _data_ and _local_ scoped variables are undefined.
+- A variable name was typed incorrectly, and the name does not match the defined variable.
+- The Postman Client supports a new dynamic variable that is not supported by DAST API.
+
+When possible, DAST API follows the same behavior as the Postman Client does when dealing with undefined variables. The text of the variable reference remains the same, and there is no text substitution. The same behavior also applies to any unsupported dynamic variables.
+
+For example, if a request definition in the Postman Collection references the variable `{{full_url}}` and the variable is not found, the reference is left unchanged with the value `{{full_url}}`.
+
+##### Dynamic Postman variables
+
+In addition to variables that a user can define at various scope levels, Postman has a set of pre-defined variables called _dynamic_ variables. The [_dynamic_ variables](https://learning.postman.com/docs/writing-scripts/script-references/variables-list/) are already defined and their name is prefixed with a dollar sign (`$`), for instance, `$guid`. _Dynamic_ variables can be used like any other variable, and in the Postman Client, they produce random values during the request/collection run.
+
+An important difference between DAST API and Postman is that DAST API returns the same value for each usage of the same dynamic variable. This differs from the Postman Client behavior, which returns a random value on each use of the same dynamic variable. In other words, DAST API uses static values for dynamic variables, while Postman uses random values.
+
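+For example (an illustrative URL), if a request in your collection targets `http://test-deployment/api/users/{{$guid}}`, DAST API uses the same value, `http://test-deployment/api/users/611c2e81-2ccb-42d8-9ddc-2d0bfa65c1b4`, for every request during the scan, whereas the Postman Client would generate a new GUID on each use.
+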
+The supported dynamic variables during the scanning process are:
+
+| Variable | Value |
+| ----------- | ----------- |
+| `$guid` | `611c2e81-2ccb-42d8-9ddc-2d0bfa65c1b4` |
+| `$isoTimestamp` | `2020-06-09T21:10:36.177Z` |
+| `$randomAbbreviation` | `PCI` |
+| `$randomAbstractImage` | `http://no-a-valid-host/640/480/abstract` |
+| `$randomAdjective` | `auxiliary` |
+| `$randomAlphaNumeric` | `a` |
+| `$randomAnimalsImage` | `http://no-a-valid-host/640/480/animals` |
+| `$randomAvatarImage` | `https://no-a-valid-host/path/to/some/image.jpg` |
+| `$randomBankAccount` | `09454073` |
+| `$randomBankAccountBic` | `EZIAUGJ1` |
+| `$randomBankAccountIban` | `MU20ZPUN3039684000618086155TKZ` |
+| `$randomBankAccountName` | `Home Loan Account` |
+| `$randomBitcoin` | `3VB8JGT7Y4Z63U68KGGKDXMLLH5` |
+| `$randomBoolean` | `true` |
+| `$randomBs` | `killer leverage schemas` |
+| `$randomBsAdjective` | `viral` |
+| `$randomBsBuzz` | `repurpose` |
+| `$randomBsNoun` | `markets` |
+| `$randomBusinessImage` | `http://no-a-valid-host/640/480/business` |
+| `$randomCatchPhrase` | `Future-proofed heuristic open architecture` |
+| `$randomCatchPhraseAdjective` | `Business-focused` |
+| `$randomCatchPhraseDescriptor` | `bandwidth-monitored` |
+| `$randomCatchPhraseNoun` | `superstructure` |
+| `$randomCatsImage` | `http://no-a-valid-host/640/480/cats` |
+| `$randomCity` | `Spinkahaven` |
+| `$randomCityImage` | `http://no-a-valid-host/640/480/city` |
+| `$randomColor` | `fuchsia` |
+| `$randomCommonFileExt` | `wav` |
+| `$randomCommonFileName` | `well_modulated.mpg4` |
+| `$randomCommonFileType` | `audio` |
+| `$randomCompanyName` | `Grady LLC` |
+| `$randomCompanySuffix` | `Inc` |
+| `$randomCountry` | `Kazakhstan` |
+| `$randomCountryCode` | `MD` |
+| `$randomCreditCardMask` | `3622` |
+| `$randomCurrencyCode` | `ZMK` |
+| `$randomCurrencyName` | `Pound Sterling` |
+| `$randomCurrencySymbol` | `£` |
+| `$randomDatabaseCollation` | `utf8_general_ci` |
+| `$randomDatabaseColumn` | `updatedAt` |
+| `$randomDatabaseEngine` | `Memory` |
+| `$randomDatabaseType` | `text` |
+| `$randomDateFuture` | `Tue Mar 17 2020 13:11:50 GMT+0530 (India Standard Time)` |
+| `$randomDatePast` | `Sat Mar 02 2019 09:09:26 GMT+0530 (India Standard Time)` |
+| `$randomDateRecent` | `Tue Jul 09 2019 23:12:37 GMT+0530 (India Standard Time)` |
+| `$randomDepartment` | `Electronics` |
+| `$randomDirectoryPath` | `/usr/local/bin` |
+| `$randomDomainName` | `trevor.info` |
+| `$randomDomainSuffix` | `org` |
+| `$randomDomainWord` | `jaden` |
+| `$randomEmail` | `Iva.Kovacek61@no-a-valid-host.com` |
+| `$randomExampleEmail` | `non-a-valid-user@example.net` |
+| `$randomFashionImage` | `http://no-a-valid-host/640/480/fashion` |
+| `$randomFileExt` | `war` |
+| `$randomFileName` | `neural_sri_lanka_rupee_gloves.gdoc` |
+| `$randomFilePath` | `/home/programming_chicken.cpio` |
+| `$randomFileType` | `application` |
+| `$randomFirstName` | `Chandler` |
+| `$randomFoodImage` | `http://no-a-valid-host/640/480/food` |
+| `$randomFullName` | `Connie Runolfsdottir` |
+| `$randomHexColor` | `#47594a` |
+| `$randomImageDataUri` | `data:image/svg+xml;charset=UTF-8,%3Csvg%20xmlns%3D%22http%3A%2F%2Fwww.w3.org%2F2000%2Fsvg%22%20version%3D%221.1%22%20baseProfile%3D%22full%22%20width%3D%22undefined%22%20height%3D%22undefined%22%3E%20%3Crect%20width%3D%22100%25%22%20height%3D%22100%25%22%20fill%3D%22grey%22%2F%3E%20%20%3Ctext%20x%3D%220%22%20y%3D%2220%22%20font-size%3D%2220%22%20text-anchor%3D%22start%22%20fill%3D%22white%22%3Eundefinedxundefined%3C%2Ftext%3E%20%3C%2Fsvg%3E` |
+| `$randomImageUrl` | `http://no-a-valid-host/640/480` |
+| `$randomIngverb` | `navigating` |
+| `$randomInt` | `494` |
+| `$randomIP` | `241.102.234.100` |
+| `$randomIPV6` | `dbe2:7ae6:119b:c161:1560:6dda:3a9b:90a9` |
+| `$randomJobArea` | `Mobility` |
+| `$randomJobDescriptor` | `Senior` |
+| `$randomJobTitle` | `International Creative Liaison` |
+| `$randomJobType` | `Supervisor` |
+| `$randomLastName` | `Schneider` |
+| `$randomLatitude` | `55.2099` |
+| `$randomLocale` | `ny` |
+| `$randomLongitude` | `40.6609` |
+| `$randomLoremLines` | `Ducimus in ut mollitia.\nA itaque non.\nHarum temporibus nihil voluptas.\nIste in sed et nesciunt in quaerat sed.` |
+| `$randomLoremParagraph` | `Ab aliquid odio iste quo voluptas voluptatem dignissimos velit. Recusandae facilis qui commodi ea magnam enim nostrum quia quis. Nihil est suscipit assumenda ut voluptatem sed. Esse ab voluptas odit qui molestiae. Rem est nesciunt est quis ipsam expedita consequuntur.` |
+| `$randomLoremParagraphs` | `Voluptatem rem magnam aliquam ab id aut quaerat. Placeat provident possimus voluptatibus dicta velit non aut quasi. Mollitia et aliquam expedita sunt dolores nam consequuntur. Nam dolorum delectus ipsam repudiandae et ipsam ut voluptatum totam. Nobis labore labore recusandae ipsam quo.` |
+| `$randomLoremSentence` | `Molestias consequuntur nisi non quod.` |
+| `$randomLoremSentences` | `Et sint voluptas similique iure amet perspiciatis vero sequi atque. Ut porro sit et hic. Neque aspernatur vitae fugiat ut dolore et veritatis. Ab iusto ex delectus animi. Voluptates nisi iusto. Impedit quod quae voluptate qui.` |
+| `$randomLoremSlug` | `eos-aperiam-accusamus, beatae-id-molestiae, qui-est-repellat` |
+| `$randomLoremText` | `Quisquam asperiores exercitationem ut ipsum. Aut eius nesciunt. Et reiciendis aut alias eaque. Nihil amet laboriosam pariatur eligendi. Sunt ullam ut sint natus ducimus. Voluptas harum aspernatur soluta rem nam.` |
+| `$randomLoremWord` | `est` |
+| `$randomLoremWords` | `vel repellat nobis` |
+| `$randomMACAddress` | `33:d4:68:5f:b4:c7` |
+| `$randomMimeType` | `audio/vnd.vmx.cvsd` |
+| `$randomMonth` | `February` |
+| `$randomNamePrefix` | `Dr.` |
+| `$randomNameSuffix` | `MD` |
+| `$randomNatureImage` | `http://no-a-valid-host/640/480/nature` |
+| `$randomNightlifeImage` | `http://no-a-valid-host/640/480/nightlife` |
+| `$randomNoun` | `bus` |
+| `$randomPassword` | `t9iXe7COoDKv8k3` |
+| `$randomPeopleImage` | `http://no-a-valid-host/640/480/people` |
+| `$randomPhoneNumber` | `700-008-5275` |
+| `$randomPhoneNumberExt` | `27-199-983-3864` |
+| `$randomPhrase` | `You can't program the monitor without navigating the mobile XML program!` |
+| `$randomPrice` | `531.55` |
+| `$randomProduct` | `Pizza` |
+| `$randomProductAdjective` | `Unbranded` |
+| `$randomProductMaterial` | `Steel` |
+| `$randomProductName` | `Handmade Concrete Tuna` |
+| `$randomProtocol` | `https` |
+| `$randomSemver` | `7.0.5` |
+| `$randomSportsImage` | `http://no-a-valid-host/640/480/sports` |
+| `$randomStreetAddress` | `5742 Harvey Streets` |
+| `$randomStreetName` | `Kuhic Island` |
+| `$randomTransactionType` | `payment` |
+| `$randomTransportImage` | `http://no-a-valid-host/640/480/transport` |
+| `$randomUrl` | `https://no-a-valid-host.net` |
+| `$randomUserAgent` | `Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.9.8; rv:15.6) Gecko/20100101 Firefox/15.6.6` |
+| `$randomUserName` | `Jarrell.Gutkowski` |
+| `$randomUUID` | `6929bb52-3ab2-448a-9796-d6480ecad36b` |
+| `$randomVerb` | `navigate` |
+| `$randomWeekday` | `Thursday` |
+| `$randomWord` | `withdrawal` |
+| `$randomWords` | `Samoa Synergistic sticky copying Grocery` |
+| `$timestamp` | `1562757107` |
+
+##### Example: Global Scope
+
+In this example, [the _global_ scope is exported](https://learning.postman.com/docs/sending-requests/variables/#downloading-global-environments) from the Postman Client as `global-scope.json` and provided to DAST API through the `DAST_API_POSTMAN_COLLECTION_VARIABLES` configuration variable.
Here is an example of using `DAST_API_POSTMAN_COLLECTION_VARIABLES`:
@@ -308,21 +652,170 @@ include:
variables:
DAST_API_PROFILE: Quick
- DAST_API_POSTMAN_COLLECTION: postman-collection_serviceA.json
- DAST_API_POSTMAN_COLLECTION_VARIABLES: variable-collection-dictionary.json
+ DAST_API_POSTMAN_COLLECTION: postman-collection.json
+ DAST_API_POSTMAN_COLLECTION_VARIABLES: global-scope.json
+ DAST_API_TARGET_URL: http://test-deployment/
+```
+
+##### Example: Environment Scope
+
+In this example, [the _environment_ scope is exported](https://learning.postman.com/docs/getting-started/importing-and-exporting-data/#exporting-environments) from the Postman Client as `environment-scope.json` and provided to DAST API through the `DAST_API_POSTMAN_COLLECTION_VARIABLES` configuration variable.
+
+Here is an example of using `DAST_API_POSTMAN_COLLECTION_VARIABLES`:
+
+```yaml
+stages:
+ - dast
+
+include:
+ - template: DAST-API.gitlab-ci.yml
+
+variables:
+ DAST_API_PROFILE: Quick
+ DAST_API_POSTMAN_COLLECTION: postman-collection.json
+ DAST_API_POSTMAN_COLLECTION_VARIABLES: environment-scope.json
DAST_API_TARGET_URL: http://test-deployment/
```
-The file `variable-collection-dictionary.json` is a JSON document. This JSON is an object with
-key-value pairs for properties. The keys are the variables' names, and the values are the variables'
+##### Example: Collection Scope
+
+The _collection_ scope variables are included in the exported Postman Collection file and provided through the `DAST_API_POSTMAN_COLLECTION` configuration variable.
+
+Here is an example of using `DAST_API_POSTMAN_COLLECTION`:
+
+```yaml
+stages:
+ - dast
+
+include:
+ - template: DAST-API.gitlab-ci.yml
+
+variables:
+ DAST_API_PROFILE: Quick
+ DAST_API_POSTMAN_COLLECTION: postman-collection.json
+ DAST_API_TARGET_URL: http://test-deployment/
+```
+
+##### Example: DAST API Scope
+
+The DAST API Scope is used for two main purposes: defining _data_ and _local_ scope variables that are not supported by DAST API, and changing the value of an existing variable defined in another scope. The DAST API Scope is provided through the `DAST_API_POSTMAN_COLLECTION_VARIABLES` configuration variable.
+
+Here is an example of using `DAST_API_POSTMAN_COLLECTION_VARIABLES`:
+
+```yaml
+stages:
+ - dast
+
+include:
+ - template: DAST-API.gitlab-ci.yml
+
+variables:
+ DAST_API_PROFILE: Quick
+ DAST_API_POSTMAN_COLLECTION: postman-collection.json
+ DAST_API_POSTMAN_COLLECTION_VARIABLES: dast-api-scope.json
+ DAST_API_TARGET_URL: http://test-deployment/
+```
+
+The file `dast-api-scope.json` uses our [custom JSON file format](#dast-api-scope-custom-json-file-format). This JSON is an object with key-value pairs for properties. The keys are the variables' names, and the values are the variables'
values. For example:
- ```json
- {
- "base_url": "http://127.0.0.1/",
- "token": "Token 84816165151"
- }
- ```
+```json
+{
+ "base_url": "http://127.0.0.1/",
+ "token": "Token 84816165151"
+}
+```
+
+##### Example: Multiple Scopes
+
+In this example, a _global_ scope, _environment_ scope, and _collection_ scope are configured. The first step is to export our various scopes.
+
+- [Export the _global_ scope](https://learning.postman.com/docs/sending-requests/variables/#downloading-global-environments) as `global-scope.json`
+- [Export the _environment_ scope](https://learning.postman.com/docs/getting-started/importing-and-exporting-data/#exporting-environments) as `environment-scope.json`
+- Export the Postman Collection which includes the _collection_ scope as `postman-collection.json`
+
+The Postman Collection is provided using the `DAST_API_POSTMAN_COLLECTION` variable, while the other scopes are provided using the `DAST_API_POSTMAN_COLLECTION_VARIABLES` variable. DAST API identifies the scope of each provided file from data included in the file itself.
+
+```yaml
+stages:
+ - dast
+
+include:
+ - template: DAST-API.gitlab-ci.yml
+
+variables:
+ DAST_API_PROFILE: Quick
+ DAST_API_POSTMAN_COLLECTION: postman-collection.json
+ DAST_API_POSTMAN_COLLECTION_VARIABLES: global-scope.json,environment-scope.json
+ DAST_API_TARGET_URL: http://test-deployment/
+```
+
+##### Example: Changing a Variable's Value
+
+When using exported scopes, you often must change the value of a variable for use with DAST API. For example, the _collection_ scope might contain a variable named `api_version` with a value of `v2`, while your test needs a value of `v1`. Instead of modifying the exported collection, you can use the DAST API scope to change the value, because the _DAST API_ scope takes precedence over all other scopes.
+
+The _collection_ scope variables are included in the exported Postman Collection file and provided through the `DAST_API_POSTMAN_COLLECTION` configuration variable.
+
+The DAST API Scope is provided through the `DAST_API_POSTMAN_COLLECTION_VARIABLES` configuration variable, but first, we must create the file.
+The file `dast-api-scope.json` uses our [custom JSON file format](#dast-api-scope-custom-json-file-format). This JSON is an object with key-value pairs for properties. The keys are the variables' names, and the values are the variables'
+values. For example:
+
+```json
+{
+ "api_version": "v1"
+}
+```
+
+Our CI definition:
+
+```yaml
+stages:
+ - dast
+
+include:
+ - template: DAST-API.gitlab-ci.yml
+
+variables:
+ DAST_API_PROFILE: Quick
+ DAST_API_POSTMAN_COLLECTION: postman-collection.json
+ DAST_API_POSTMAN_COLLECTION_VARIABLES: dast-api-scope.json
+ DAST_API_TARGET_URL: http://test-deployment/
+```
+
+##### Example: Changing a Variable's Value with Multiple Scopes
+
+When using exported scopes, you often must change the value of a variable for use with DAST API. For example, an _environment_ scope might contain a variable named `api_version` with a value of `v2`, while your test needs a value of `v1`. Instead of modifying the exported file, you can use the DAST API scope to change the value, because the _DAST API_ scope takes precedence over all other scopes.
+
+In this example, a _global_ scope, _environment_ scope, _collection_ scope, and _DAST API_ scope are configured. The first step is to export and create our various scopes.
+
+- [Export the _global_ scope](https://learning.postman.com/docs/sending-requests/variables/#downloading-global-environments) as `global-scope.json`
+- [Export the _environment_ scope](https://learning.postman.com/docs/getting-started/importing-and-exporting-data/#exporting-environments) as `environment-scope.json`
+- Export the Postman Collection which includes the _collection_ scope as `postman-collection.json`
+
+To use the DAST API scope, create a file `dast-api-scope.json` in our [custom JSON file format](#dast-api-scope-custom-json-file-format). This JSON is an object with key-value pairs for properties. The keys are the variables' names, and the values are the variables'
+values. For example:
+
+```json
+{
+ "api_version": "v1"
+}
+```
+
+The Postman Collection is provided using the `DAST_API_POSTMAN_COLLECTION` variable, while the other scopes are provided using the `DAST_API_POSTMAN_COLLECTION_VARIABLES` variable. DAST API identifies the scope of each provided file from data included in the file itself.
+
+```yaml
+stages:
+ - dast
+
+include:
+ - template: DAST-API.gitlab-ci.yml
+
+variables:
+ DAST_API_PROFILE: Quick
+ DAST_API_POSTMAN_COLLECTION: postman-collection.json
+ DAST_API_POSTMAN_COLLECTION_VARIABLES: global-scope.json,environment-scope.json,dast-api-scope.json
+ DAST_API_TARGET_URL: http://test-deployment/
+```
## Authentication
@@ -332,17 +825,19 @@ provide a script that performs an authentication flow or calculates the token.
### HTTP Basic Authentication
[HTTP basic authentication](https://en.wikipedia.org/wiki/Basic_access_authentication)
-is an authentication method built in to the HTTP protocol and used in conjunction with
+is an authentication method built into the HTTP protocol and used in conjunction with
[transport layer security (TLS)](https://en.wikipedia.org/wiki/Transport_Layer_Security).
-To use HTTP basic authentication, two CI/CD variables are added to your `.gitlab-ci.yml` file:
-- `DAST_API_HTTP_USERNAME`: The username for authentication.
-- `DAST_API_HTTP_PASSWORD`: The password for authentication.
+We recommend that you [create a CI/CD variable](../../../ci/variables/index.md#custom-cicd-variables)
+for the password (for example, `TEST_API_PASSWORD`), and set it to be masked. You can create CI/CD
+variables from the GitLab project's page at **Settings > CI/CD**, in the **Variables** section.
+Because of the [limitations on masked variables](../../../ci/variables/index.md#mask-a-cicd-variable),
+you should Base64-encode the password before adding it as a variable.
-For the password, we recommended that you [create a CI/CD variable](../../../ci/variables/index.md#custom-cicd-variables)
-(for example, `TEST_API_PASSWORD`) set to the password. You can create CI/CD variables from the
-GitLab projects page at **Settings > CI/CD**, in the **Variables** section. Use that variable
-as the value for `DAST_API_HTTP_PASSWORD`:
+Finally, add two CI/CD variables to your `.gitlab-ci.yml` file:
+
+- `DAST_API_HTTP_USERNAME`: The username for authentication.
+- `DAST_API_HTTP_PASSWORD_BASE64`: The Base64-encoded password for authentication.
```yaml
stages:
@@ -356,9 +851,13 @@ variables:
DAST_API_HAR: test-api-recording.har
DAST_API_TARGET_URL: http://test-deployment/
DAST_API_HTTP_USERNAME: testuser
- DAST_API_HTTP_PASSWORD: $TEST_API_PASSWORD
+ DAST_API_HTTP_PASSWORD_BASE64: $TEST_API_PASSWORD
```
+#### Raw password
+
+If you do not want to Base64-encode the password (or if you are using GitLab 15.3 or earlier), you can provide the raw password in `DAST_API_HTTP_PASSWORD` instead of using `DAST_API_HTTP_PASSWORD_BASE64`.
+
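+A minimal sketch of the raw-password variant, assuming the same HAR-based job as the example above and a CI/CD variable (for example, `TEST_API_PASSWORD`) that holds the plain-text password:
+
+```yaml
+stages:
+  - dast
+
+include:
+  - template: DAST-API.gitlab-ci.yml
+
+variables:
+  DAST_API_PROFILE: Quick
+  DAST_API_HAR: test-api-recording.har
+  DAST_API_TARGET_URL: http://test-deployment/
+  DAST_API_HTTP_USERNAME: testuser
+  DAST_API_HTTP_PASSWORD: $TEST_API_PASSWORD
+```
+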
### Bearer tokens
Bearer tokens are used by several different authentication mechanisms, including OAuth2 and JSON Web
@@ -383,7 +882,7 @@ Follow these steps to provide the Bearer token with `DAST_API_OVERRIDES_ENV`:
can create CI/CD variables from the GitLab projects page at **Settings > CI/CD**, in the
**Variables** section.
Due to the format of `TEST_API_BEARERAUTH` it's not possible to mask the variable.
- To mask the token's value, you can create a second variable with the token value's, and define
+ To mask the token's value, you can create a second variable with the token's value, and define
`TEST_API_BEARERAUTH` with the value `{"headers":{"Authorization":"Bearer $MASKED_VARIABLE"}}`.
1. In your `.gitlab-ci.yml` file, set `DAST_API_OVERRIDES_ENV` to the variable you just created:
@@ -402,12 +901,12 @@ Follow these steps to provide the Bearer token with `DAST_API_OVERRIDES_ENV`:
DAST_API_OVERRIDES_ENV: $TEST_API_BEARERAUTH
```
-1. To validate that authentication is working, run an DAST API test and review the job logs
+1. To validate that authentication is working, run a DAST API test and review the job logs
and the test API's application logs.
#### Token generated at test runtime
-If the Bearer token must be generated and doesn't expire during testing, you can provide DAST API a file that has the token. A prior stage and job, or part of the DAST API job, can
+If the Bearer token must be generated and doesn't expire during testing, you can provide DAST API with a file that has the token. A prior stage and job, or part of the DAST API job, can
generate this file.
DAST API expects to receive a JSON file with the following structure:
@@ -439,7 +938,7 @@ variables:
DAST_API_OVERRIDES_FILE: dast-api-overrides.json
```
-To validate that authentication is working, run an DAST API test and review the job logs and
+To validate that authentication is working, run a DAST API test and review the job logs and
the test API's application logs.
#### Token has short expiration
@@ -552,8 +1051,10 @@ can be added, removed, and modified by creating a custom configuration.
|[`DAST_API_OPENAPI_ALL_MEDIA_TYPES`](#openapi-specification) | Use all supported media types instead of one when generating requests. Causes test duration to be longer. Default is disabled. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/333304) in GitLab 14.10. |
|[`DAST_API_OPENAPI_MEDIA_TYPES`](#openapi-specification) | Colon (`:`) separated media types accepted for testing. Default is disabled. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/333304) in GitLab 14.10. |
|[`DAST_API_HAR`](#http-archive-har) | HTTP Archive (HAR) file. |
+|[`DAST_API_GRAPHQL`](#graphql-schema) | Path to GraphQL endpoint, for example `/api/graphql`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/352780) in GitLab 15.4. |
+|[`DAST_API_GRAPHQL_SCHEMA`](#graphql-schema) | A URL or filename for a GraphQL schema in JSON format. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/352780) in GitLab 15.4. |
|[`DAST_API_POSTMAN_COLLECTION`](#postman-collection) | Postman Collection file. |
-|[`DAST_API_POSTMAN_COLLECTION_VARIABLES`](#postman-variables) | Path to a JSON file to extract postman variable values. |
+|[`DAST_API_POSTMAN_COLLECTION_VARIABLES`](#postman-variables) | Path to a JSON file to extract Postman variable values. Support for comma-separated (`,`) files was [introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/356312) in GitLab 15.1. |
|[`DAST_API_OVERRIDES_FILE`](#overrides) | Path to a JSON file containing overrides. |
|[`DAST_API_OVERRIDES_ENV`](#overrides) | JSON string containing headers to override. |
|[`DAST_API_OVERRIDES_CMD`](#overrides) | Overrides command. |
@@ -562,7 +1063,8 @@ can be added, removed, and modified by creating a custom configuration.
|`DAST_API_POST_SCRIPT` | Run user command or script after scan session has finished. |
|[`DAST_API_OVERRIDES_INTERVAL`](#overrides) | How often to run overrides command in seconds. Defaults to `0` (once). |
|[`DAST_API_HTTP_USERNAME`](#http-basic-authentication) | Username for HTTP authentication. |
-|[`DAST_API_HTTP_PASSWORD`](#http-basic-authentication) | Password for HTTP authentication. |
+|[`DAST_API_HTTP_PASSWORD`](#http-basic-authentication) | Password for HTTP authentication. Consider using `DAST_API_HTTP_PASSWORD_BASE64` instead. |
+|[`DAST_API_HTTP_PASSWORD_BASE64`](#http-basic-authentication) | Password for HTTP authentication, base64-encoded. [Introduced](https://gitlab.com/gitlab-org/security-products/analyzers/api-fuzzing-src/-/merge_requests/702) in GitLab 15.4. |
|`DAST_API_SERVICE_START_TIMEOUT` | How long to wait for target API to become available in seconds. Default is 300 seconds. |
|`DAST_API_TIMEOUT` | How long to wait for API responses in seconds. Default is 30 seconds. |
@@ -1010,21 +1512,21 @@ This example excludes the `/auth` resource. This does not exclude child resource
```yaml
variables:
- DAST_API_EXCLUDE_PATHS=/auth
+ DAST_API_EXCLUDE_PATHS: /auth
```
To exclude `/auth`, and child resources (`/auth/child`), we use a wildcard.
```yaml
variables:
- DAST_API_EXCLUDE_PATHS=/auth*
+ DAST_API_EXCLUDE_PATHS: /auth*
```
To exclude multiple paths we use the `;` character. In this example we exclude `/auth*` and `/v1/*`.
```yaml
variables:
- DAST_API_EXCLUDE_PATHS=/auth*;/v1/*
+ DAST_API_EXCLUDE_PATHS: /auth*;/v1/*
```
To exclude one or more nested levels within a path we use `**`. In this example we are testing API endpoints. We are testing `/api/v1/` and `/api/v2/` of a data query requesting `mass`, `brightness` and `coordinates` data for `planet`, `moon`, `star`, and `satellite` objects. Example paths that could be scanned include, but are not limited to:
@@ -1037,7 +1539,7 @@ In this example we test the `brightness` endpoint only:
```yaml
variables:
- DAST_API_EXCLUDE_PATHS=/api/**/mass;/api/**/coordinates
+ DAST_API_EXCLUDE_PATHS: /api/**/mass;/api/**/coordinates
```
### Exclude parameters
@@ -1297,7 +1799,15 @@ Each value in `DAST_API_EXCLUDE_URLS` is a regular expression. Characters such a
The following example excludes the URL `http://target/api/auth` and its child resources.
```yaml
+stages:
+ - dast
+
+include:
+ - template: DAST-API.gitlab-ci.yml
+
variables:
+ DAST_API_TARGET_URL: http://target/
+ DAST_API_OPENAPI: test-api-specification.json
DAST_API_EXCLUDE_URLS: http://target/api/auth
```
@@ -1306,7 +1816,15 @@ variables:
To exclude the URLs `http://target/api/buy` and `http://target/api/sell` while still allowing their child resources to be scanned (for instance, `http://target/api/buy/toy` or `http://target/api/sell/chair`), you could use the value `http://target/api/buy/$,http://target/api/sell/$`. This value consists of two regular expressions separated by a `,` character: `http://target/api/buy/$` and `http://target/api/sell/$`. In each regular expression, the trailing `$` character indicates where the matching URL should end.
```yaml
+stages:
+ - dast
+
+include:
+ - template: DAST-API.gitlab-ci.yml
+
variables:
+ DAST_API_TARGET_URL: http://target/
+ DAST_API_OPENAPI: test-api-specification.json
DAST_API_EXCLUDE_URLS: http://target/api/buy/$,http://target/api/sell/$
```
@@ -1315,7 +1833,15 @@ variables:
To exclude the URLs `http://target/api/buy` and `http://target/api/sell`, and their child resources, provide multiple URLs separated by the `,` character as follows:
```yaml
+stages:
+ - dast
+
+include:
+ - template: DAST-API.gitlab-ci.yml
+
variables:
+ DAST_API_TARGET_URL: http://target/
+ DAST_API_OPENAPI: test-api-specification.json
DAST_API_EXCLUDE_URLS: http://target/api/buy,http://target/api/sell
```
@@ -1324,7 +1850,15 @@ variables:
To exclude exactly `https://target/api/v1/user/create` and `https://target/api/v2/user/create`, or any other version (`v3`, `v4`, and so on), you could use `https://target/api/v.*/user/create$`. In this regular expression, `.` indicates any character, `*` indicates zero or more occurrences of the preceding character, and the trailing `$` indicates that the URL should end there.
```yaml
+stages:
+ - dast
+
+include:
+ - template: DAST-API.gitlab-ci.yml
+
variables:
+ DAST_API_TARGET_URL: http://target/
+ DAST_API_OPENAPI: test-api-specification.json
DAST_API_EXCLUDE_URLS: https://target/api/v.*/user/create$
```
@@ -1544,7 +2078,7 @@ The DAST API engine outputs an error message when it cannot establish a connecti
**Error message**
- In [GitLab 13.11 and later](https://gitlab.com/gitlab-org/gitlab/-/issues/323939), `Failed to start scanner session (version header not found).`
-- In GitLab 13.10 and earlier, `API Security version header not found. Are you sure that you are connecting to the API Security server?`.
+- In GitLab 13.10 and earlier, `API Security version header not found. Are you sure that you are connecting to the DAST API server?`.
**Solution**
@@ -1568,12 +2102,15 @@ This solution is for pipelines in which the target API URL doesn't change (is st
For environments where the target API remains the same, we recommend you specify the target URL by using the `DAST_API_TARGET_URL` environment variable. In your `.gitlab-ci.yml`, add a variable `DAST_API_TARGET_URL`. The variable must be set to the base URL of the API testing target. For example:
```yaml
+stages:
+ - dast
+
include:
- - template: DAST-API.gitlab-ci.yml
+ - template: DAST-API.gitlab-ci.yml
- variables:
- DAST_API_TARGET_URL: http://test-deployment/
- DAST_API_OPENAPI: test-api-specification.json
+variables:
+ DAST_API_TARGET_URL: http://test-deployment/
+ DAST_API_OPENAPI: test-api-specification.json
```
#### Dynamic environment solutions
@@ -1603,7 +2140,7 @@ deploy-test-target:
### Use OpenAPI with an invalid schema
-There are cases where the document is autogenerated with an invalid schema or cannot be edited manually in a timely manner. In those scenarios, the API Security is able to perform a relaxed validation by setting the variable `DAST_API_OPENAPI_RELAXED_VALIDATION`. We recommend providing a fully compliant OpenAPI document to prevent unexpected behaviors.
+There are cases where the document is autogenerated with an invalid schema, or where it cannot be edited manually in a timely manner. In those scenarios, DAST API can perform a relaxed validation when you set the variable `DAST_API_OPENAPI_RELAXED_VALIDATION`. We recommend providing a fully compliant OpenAPI document to prevent unexpected behaviors.
#### Edit a non-compliant OpenAPI file
@@ -1620,25 +2157,25 @@ If your OpenAPI document is generated manually, load your document in the editor
Relaxed validation is meant for cases when the OpenAPI document cannot meet OpenAPI specifications but still has enough content to be consumed by different tools. Validation is still performed, but less strictly with regard to the document schema.
-API Security can still try to consume an OpenAPI document that does not fully comply with OpenAPI specifications. To instruct API Security to perform a relaxed validation, set the variable `DAST_API_OPENAPI_RELAXED_VALIDATION` to any value, for example:
+DAST API can still try to consume an OpenAPI document that does not fully comply with OpenAPI specifications. To instruct DAST API to perform a relaxed validation, set the variable `DAST_API_OPENAPI_RELAXED_VALIDATION` to any value, for example:
```yaml
- stages:
- - dast
+stages:
+ - dast
- include:
- - template: DAST-API.gitlab-ci.yml
+include:
+ - template: DAST-API.gitlab-ci.yml
- variables:
- DAST_API_PROFILE: Quick
- DAST_API_TARGET_URL: http://test-deployment/
- DAST_API_OPENAPI: test-api-specification.json
- DAST_API_OPENAPI_RELAXED_VALIDATION: On
+variables:
+ DAST_API_PROFILE: Quick
+ DAST_API_TARGET_URL: http://test-deployment/
+ DAST_API_OPENAPI: test-api-specification.json
+ DAST_API_OPENAPI_RELAXED_VALIDATION: 'On'
```
### `No operation in the OpenAPI document is consuming any supported media type`
-API Security uses the specified media types in the OpenAPI document to generate requests. If no request can be created due to the lack of supported media types, then an error will be thrown.
+DAST API uses the specified media types in the OpenAPI document to generate requests. If no request can be created due to the lack of supported media types, then an error will be thrown.
**Error message**
diff --git a/doc/user/application_security/dependency_scanning/analyzers.md b/doc/user/application_security/dependency_scanning/analyzers.md
index acbc94cba47..a59399f7e8d 100644
--- a/doc/user/application_security/dependency_scanning/analyzers.md
+++ b/doc/user/application_security/dependency_scanning/analyzers.md
@@ -32,14 +32,6 @@ The Dependency Scanning analyzers' current major version number is 2.
Dependency Scanning is pre-configured with a set of **default images** that are
maintained by GitLab, but users can also integrate their own **custom images**.
-<!--- start_remove The following content will be removed on remove_date: '2022-08-22' -->
-
-The [`bundler-audit`](https://gitlab.com/gitlab-org/gitlab/-/issues/289832) and [`retire.js`](https://gitlab.com/gitlab-org/gitlab/-/issues/350510) analyzers were deprecated
-in GitLab 14.8 and [removed](https://gitlab.com/gitlab-org/gitlab/-/merge_requests/86704) in 15.0.
-Use Gemnasium instead.
-
-<!--- end_remove -->
-
## Official default analyzers
Any custom change to the official analyzers can be achieved by using a
diff --git a/doc/user/application_security/dependency_scanning/index.md b/doc/user/application_security/dependency_scanning/index.md
index 521bb6adbf0..7aabbdd3194 100644
--- a/doc/user/application_security/dependency_scanning/index.md
+++ b/doc/user/application_security/dependency_scanning/index.md
@@ -50,12 +50,12 @@ possible, we encourage you to use all of our security scanning tools:
declared software dependencies (and those installed as a sub-dependency).
Dependency Scanning can not detect software dependencies that are pre-bundled
into the container's base image. To identify pre-bundled dependencies, enable
- [Container Scanning](../container_scanning/) language scanning using the
- [`CS_DISABLE_LANGUAGE_VULNERABILITY_SCAN` variable](../container_scanning/#report-language-specific-findings).
-- [Container Scanning](../container_scanning/) analyzes your containers and tells
+ [Container Scanning](../container_scanning/index.md) language scanning using the
+ [`CS_DISABLE_LANGUAGE_VULNERABILITY_SCAN` variable](../container_scanning/index.md#report-language-specific-findings).
+- [Container Scanning](../container_scanning/index.md) analyzes your containers and tells
you about known risks in the operating system's (OS) packages. You can configure it
to also report on software and language dependencies, if you enable it and use
- the [`CS_DISABLE_LANGUAGE_VULNERABILITY_SCAN` variable](../container_scanning/#report-language-specific-findings).
+ the [`CS_DISABLE_LANGUAGE_VULNERABILITY_SCAN` variable](../container_scanning/index.md#report-language-specific-findings).
Turning this variable on can result in some duplicate findings, as we do not yet
de-duplicate results between Container Scanning and Dependency Scanning. For more details,
efforts to de-duplicate these findings can be tracked in
@@ -304,7 +304,7 @@ table.supported-languages ul {
<li>
<a id="notes-regarding-supported-languages-and-package-managers-3"></a>
<p>
- npm is only supported when `lockfileVersion = 1` or `lockfileVersion = 2`. Work to add support for `lockfileVersion = 3` is being tracked in issue <a href="https://gitlab.com/gitlab-org/gitlab/-/issues/365176">GitLab#365176</a>.
+ npm is only supported when <code>lockfileVersion = 1</code> or <code>lockfileVersion = 2</code>. Work to add support for <code>lockfileVersion = 3</code> is being tracked in issue <a href="https://gitlab.com/gitlab-org/gitlab/-/issues/365176">GitLab#365176</a>.
</p>
</li>
<li>
@@ -452,8 +452,8 @@ We only execute one build in the directory where a build file has been detected.
multiple Gradle, Maven, or sbt builds, or any combination of these, `gemnasium-maven` only analyzes dependencies for the first build file
that is detected. Build files are searched for in the following order:
-1. `build.gradle` or `build.gradle.kts` for single or [multi-project](https://docs.gradle.org/current/userguide/intro_multi_project_builds.html) Gradle builds.
1. `pom.xml` for single or [multi-module](https://maven.apache.org/pom.html#Aggregation) Maven projects.
+1. `build.gradle` or `build.gradle.kts` for single or [multi-project](https://docs.gradle.org/current/userguide/intro_multi_project_builds.html) Gradle builds.
1. `build.sbt` for single or [multi-project](https://www.scala-sbt.org/1.x/docs/Multi-Project.html) sbt builds.
The search begins with the root directory and then continues with subdirectories if no builds are found in the root directory. Consequently an sbt build file in the root directory would be detected before a Gradle build file in a subdirectory.
@@ -521,7 +521,7 @@ always take the latest dependency scanning artifact available.
To enable Dependency Scanning in a project, you can create a merge request:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Configuration**.
1. In the **Dependency Scanning** row, select **Configure with a merge request**.
1. Review and merge the merge request to enable Dependency Scanning.
@@ -627,7 +627,7 @@ The following variables are used for configuring specific analyzers (used for a
The previous tables are not an exhaustive list of all variables that can be used. They contain all specific GitLab and analyzer variables we support and test. There are many other variables, such as environment variables, that you can pass in and that will work. Because that list is large, and we may be unaware of many of the variables in it, it is not documented.
For example, to pass the non-GitLab environment variable `HTTPS_PROXY` to all Dependency Scanning jobs,
-set it as a [custom CI/CD variable in your `.gitlab-ci.yml`](../../../ci/variables/#create-a-custom-cicd-variable-in-the-gitlab-ciyml-file)
+set it as a [custom CI/CD variable in your `.gitlab-ci.yml`](../../../ci/variables/index.md#create-a-custom-cicd-variable-in-the-gitlab-ciyml-file)
file like this:
```yaml
diff --git a/doc/user/application_security/generate_test_vulnerabilities/index.md b/doc/user/application_security/generate_test_vulnerabilities/index.md
index aafbebb91cd..4d424acf9c3 100644
--- a/doc/user/application_security/generate_test_vulnerabilities/index.md
+++ b/doc/user/application_security/generate_test_vulnerabilities/index.md
@@ -1,6 +1,6 @@
---
type: reference, howto
-stage: Secure
+stage: Govern
group: Threat Insights
info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
---
diff --git a/doc/user/application_security/get-started-security.md b/doc/user/application_security/get-started-security.md
index 4c2b971b5fa..ee7864b5ce9 100644
--- a/doc/user/application_security/get-started-security.md
+++ b/doc/user/application_security/get-started-security.md
@@ -6,25 +6,36 @@ info: To determine the technical writer assigned to the Stage/Group associated w
# Get started with GitLab application security **(ULTIMATE)**
-Complete the following steps to get the most from GitLab application security tools.
+<i class="fa fa-youtube-play youtube" aria-hidden="true"></i>
+For an overview, see [Adopting GitLab application security](https://www.youtube.com/watch?v=5QlxkiKR04k).
-1. Enable [Secret Detection](secret_detection/index.md) scanning for your default branch.
-1. Enable [Dependency Scanning](dependency_scanning/index.md) for your default branch so you can start identifying existing
+The following steps help you get the most from GitLab application security tools. They are a recommended order of operations; you can implement capabilities in a different order or omit features that do not apply to your specific needs.
+
+1. Enable [Secret Detection](secret_detection/index.md) and [Dependency Scanning](dependency_scanning/index.md)
+ to identify any leaked secrets and vulnerable packages in your codebase.
+
+ - For all security scanners, enable them by updating your [`.gitlab-ci.yml`](../../ci/yaml/gitlab_ci_yaml.md) directly on your `default` branch (see the sketch after this list). This creates a baseline scan of your `default` branch, which is necessary for
+ feature branch scans to be compared against. This allows [merge requests](../project/merge_requests/index.md)
+ to display only newly-introduced vulnerabilities. Otherwise, merge requests will display every
+ vulnerability in the branch, regardless of whether it was introduced by a change in the branch.
+ - If you are after simplicity, enable only Secret Detection first. It only has one analyzer,
+ no build requirements, and relatively simple findings: is this a secret or not?
+ - It is good practice to enable Dependency Scanning early so you can start identifying existing
vulnerable packages in your codebase.
-1. Add security scans to feature branch pipelines. The same scans should be enabled as are running
- on your default branch. Subsequent scans will show only new vulnerabilities by comparing the feature branch to the default branch results.
1. Let your team get comfortable with [vulnerability reports](vulnerability_report/index.md) and
establish a vulnerability triage workflow.
1. Consider creating [labels](../project/labels.md) and [issue boards](../project/issue_board.md) to
help manage issues created from vulnerabilities. Issue boards allow all stakeholders to have a
- common view of all issues.
+ common view of all issues and track remediation progress.
+1. Use [scheduled pipelines](../../ci/pipelines/schedules.md#scheduled-pipelines) to regularly scan important branches such as `default` or those used for maintenance releases.
+ - Running regular dependency and [container scans](container_scanning/index.md) will surface newly-discovered vulnerabilities that already exist in your repository.
+ - Scheduled scans are most useful for projects or important branches with low development activity where pipeline scans are infrequent.
1. Create a [scan result policy](policies/index.md) to limit new vulnerabilities from being merged
- into your default branch.
+ into your `default` branch.
1. Monitor the [Security Dashboard](security_dashboard/index.md) trends to gauge success in
remediating existing vulnerabilities and preventing the introduction of new ones.
1. Enable other scan types such as [SAST](sast/index.md), [DAST](dast/index.md),
[Fuzz testing](coverage_fuzzing/index.md), or [Container Scanning](container_scanning/index.md).
- Be sure to add the same scan types to both feature pipelines and default branch pipelines.
1. Use [Compliance Pipelines](../../user/project/settings/index.md#compliance-pipeline-configuration)
or [Scan Execution Policies](policies/scan-execution-policies.md) to enforce required scan types
and ensure separation of duties between security and engineering.
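+
+As a sketch of the first step above, the following `.gitlab-ci.yml` snippet enables both Secret Detection and Dependency Scanning by including the GitLab-managed templates (template paths assumed from the standard GitLab CI/CD template library):
+
+```yaml
+include:
+  - template: Security/Secret-Detection.gitlab-ci.yml
+  - template: Security/Dependency-Scanning.gitlab-ci.yml
+```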
diff --git a/doc/user/application_security/iac_scanning/index.md b/doc/user/application_security/iac_scanning/index.md
index 16f08de738b..1b9cdb11ea3 100644
--- a/doc/user/application_security/iac_scanning/index.md
+++ b/doc/user/application_security/iac_scanning/index.md
@@ -116,7 +116,7 @@ that you can download and analyze.
To enable IaC Scanning in a project, you can create a merge request:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Configuration**.
1. In the **Infrastructure as Code (IaC) Scanning** row, select **Configure with a merge request**.
1. Review and merge the merge request to enable IaC Scanning.
diff --git a/doc/user/application_security/index.md b/doc/user/application_security/index.md
index 7c7d5380a24..fbd617351da 100644
--- a/doc/user/application_security/index.md
+++ b/doc/user/application_security/index.md
@@ -98,7 +98,7 @@ approach.
## Security scanning with Auto DevOps
To enable all GitLab Security scanning tools, with default settings, enable
-[Auto DevOps](../../topics/autodevops/):
+[Auto DevOps](../../topics/autodevops/index.md):
- [Auto SAST](../../topics/autodevops/stages.md#auto-sast)
- [Auto Secret Detection](../../topics/autodevops/stages.md#auto-secret-detection)
@@ -361,7 +361,7 @@ Learn more on overriding security jobs:
- [Overriding SAST jobs](sast/index.md#overriding-sast-jobs).
- [Overriding Dependency Scanning jobs](dependency_scanning/index.md#overriding-dependency-scanning-jobs).
- [Overriding Container Scanning jobs](container_scanning/index.md#overriding-the-container-scanning-template).
-- [Overriding Secret Detection jobs](secret_detection/index.md#customizing-settings).
+- [Overriding Secret Detection jobs](secret_detection/index.md#configure-scan-settings).
- [Overriding DAST jobs](dast/index.md#customize-dast-settings).
- [Overriding License Compliance jobs](../compliance/license_compliance/index.md#overriding-the-template).
@@ -397,15 +397,6 @@ Validation depends on the schema version declared in the security report artifac
You can always find supported and deprecated schema versions in the [source code](https://gitlab.com/gitlab-org/gitlab/blob/master/lib/gitlab/ci/parsers/security/validators/schema_validator.rb).
-<!--- start_remove The following content will be removed on remove_date: '2022-08-22' -->
-
-### Enable security report validation (removed)
-
- This feature was [deprecated](https://gitlab.com/gitlab-org/gitlab/-/issues/354928) in GitLab 14.9
- and [removed](https://gitlab.com/gitlab-org/gitlab/-/merge_requests/85400) in GitLab 15.0.
-
- <!--- end_remove -->
-
## Interact with findings and vulnerabilities
You can interact with the results of the security scanning tools in several locations:
@@ -414,7 +405,7 @@ You can interact with the results of the security scanning tools in several loca
- [Project Security Dashboard](security_dashboard/index.md)
- [Security pipeline tab](security_dashboard/index.md)
- [Group Security Dashboard](security_dashboard/index.md)
-- [Security Center](security_dashboard/#security-center)
+- [Security Center](security_dashboard/index.md#security-center)
- [Vulnerability Report](vulnerability_report/index.md)
- [Vulnerability Pages](vulnerabilities/index.md)
- [Dependency List](dependency_list/index.md)
@@ -455,7 +446,7 @@ Security and compliance teams must ensure that security scans:
GitLab provides two methods of accomplishing this, each with advantages and disadvantages.
-- [Compliance framework pipelines](../project/settings/#compliance-pipeline-configuration)
+- [Compliance framework pipelines](../project/settings/index.md#compliance-pipeline-configuration)
are recommended when:
- Scan execution enforcement is required for any scanner that uses a GitLab template, such as SAST IaC, DAST, Dependency Scanning,
@@ -497,6 +488,11 @@ Feedback is welcome on our vision for [unifying the user experience for these tw
-->
### Secure job failing with exit code 1
+WARNING:
+Debug logging can be a serious security risk. The output may contain the content of
+environment variables and other secrets available to the job. The output is uploaded
+to the GitLab server and visible in job logs.
+
If a Secure job is failing and it's unclear why, add `SECURE_LOG_LEVEL: "debug"` as a global CI/CD variable for
more verbose output that is helpful for troubleshooting.
@@ -534,6 +530,11 @@ Select **new pipeline** to run a new pipeline.
### Getting warning messages `… report.json: no matching files`
+WARNING:
+Debug logging can be a serious security risk. The output may contain the content of
+environment variables and other secrets available to the job. The output is uploaded
+to the GitLab server and visible in job logs.
+
This message is often followed by the [error `No files to upload`](../../ci/pipelines/job_artifacts.md#error-message-no-files-to-upload),
and preceded by other errors or warnings that indicate why the JSON report wasn't generated. Check
the entire job log for such messages. If you don't find these messages, retry the failed job after
diff --git a/doc/user/application_security/offline_deployments/index.md b/doc/user/application_security/offline_deployments/index.md
index 7aeb094093c..7344695886a 100644
--- a/doc/user/application_security/offline_deployments/index.md
+++ b/doc/user/application_security/offline_deployments/index.md
@@ -87,9 +87,9 @@ above. You can find more information at each of the pages below:
- [Container scanning offline directions](../container_scanning/index.md#running-container-scanning-in-an-offline-environment)
- [SAST offline directions](../sast/index.md#running-sast-in-an-offline-environment)
-- [Secret Detection offline directions](../secret_detection/#running-secret-detection-in-an-offline-environment)
+- [Secret Detection offline directions](../secret_detection/index.md#running-secret-detection-in-an-offline-environment)
- [DAST offline directions](../dast/run_dast_offline.md#run-dast-in-an-offline-environment)
-- [API Fuzzing offline directions](../api_fuzzing/#running-api-fuzzing-in-an-offline-environment)
+- [API Fuzzing offline directions](../api_fuzzing/index.md#running-api-fuzzing-in-an-offline-environment)
- [License Compliance offline directions](../../compliance/license_compliance/index.md#running-license-compliance-in-an-offline-environment)
- [Dependency Scanning offline directions](../dependency_scanning/index.md#running-dependency-scanning-in-an-offline-environment)
@@ -236,4 +236,4 @@ an offline environment.
Note that these steps are specific to GitLab Secure with AutoDevOps. Using other stages with
AutoDevOps may require other steps covered in the
-[Auto DevOps documentation](../../../topics/autodevops/).
+[Auto DevOps documentation](../../../topics/autodevops/index.md).
diff --git a/doc/user/application_security/policies/index.md b/doc/user/application_security/policies/index.md
index 53f9c400259..9b86ef7316a 100644
--- a/doc/user/application_security/policies/index.md
+++ b/doc/user/application_security/policies/index.md
@@ -1,6 +1,6 @@
---
-stage: Protect
-group: Container Security
+stage: Govern
+group: Security Policies
info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
---
@@ -54,7 +54,7 @@ to select, edit, and unlink a security policy project.
As a project owner, take the following steps to create or edit an association between your current
project and a project that you would like to designate as the security policy project:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Policies**.
1. Select **Edit Policy Project**, and search for and select the
project you would like to link from the dropdown menu.
@@ -78,7 +78,7 @@ policies for all available environments. You can check a
policy's information (for example, description or enforcement
status), and create and edit deployed policies:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Policies**.
![Policies List Page](img/policies_list_v15_1.png)
@@ -89,7 +89,7 @@ status), and create and edit deployed policies:
You can use the policy editor to create, edit, and delete policies:
-1. On the top bar, select **Menu > Projects** and find your group.
+1. On the top bar, select **Main menu > Projects** and find your group.
1. On the left sidebar, select **Security & Compliance > Policies**.
- To create a new policy, select **New policy** which is located in the **Policies** page's header.
You can then select which type of policy to create.
@@ -144,6 +144,6 @@ for more information on the product direction of security policies within GitLab
When you create a new security policy or change an existing policy, a new branch is automatically created with the branch name following the pattern `update-policy-<timestamp>`. For example: `update-policy-1659094451`.
-If you have group or instance push rules that do not allow branch name patterns that contain the text `update-policy-<timestamp>`, you will get an error that states `Branch name does not follow the pattern 'update-policy-<timestamp>'`.
+If you have group or instance push rules that do not allow branch name patterns that contain the text `update-policy-<timestamp>`, you will get an error that states `Branch name does not follow the pattern 'update-policy-<timestamp>'`.
The workaround is to amend your group or instance push rules to allow branches following the pattern `update-policy-` followed by an integer timestamp.
diff --git a/doc/user/application_security/policies/scan-execution-policies.md b/doc/user/application_security/policies/scan-execution-policies.md
index eb1f9a7c7b8..f2fc52a2de8 100644
--- a/doc/user/application_security/policies/scan-execution-policies.md
+++ b/doc/user/application_security/policies/scan-execution-policies.md
@@ -1,12 +1,13 @@
---
-stage: Protect
-group: Container Security
+stage: Govern
+group: Security Policies
info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
---
# Scan execution policies **(ULTIMATE)**
-> Group-level security policies were [introduced](https://gitlab.com/groups/gitlab-org/-/epics/4425) in GitLab 15.2 [with a flag](../../../administration/feature_flags.md) named `group_level_security_policies`. Enabled by default.
+> - Group-level security policies were [introduced](https://gitlab.com/groups/gitlab-org/-/epics/4425) in GitLab 15.2.
+> - Group-level security policies were [enabled on GitLab.com](https://gitlab.com/gitlab-org/gitlab/-/issues/356258) in GitLab 15.4.
Group, sub-group, or project owners can use scan execution policies to require that security scans run on a specified
schedule or with the project (or multiple projects if the policy is defined at a group or sub-group level) pipeline. Required scans are injected into the CI pipeline as new jobs
@@ -14,10 +15,10 @@ with a long, random job name. In the unlikely event of a job name collision, the
any pre-existing job in the pipeline. If a policy is created at the group-level, it will apply to every child
project or sub-group. A group-level policy cannot be edited from a child project or sub-group.
-This feature has some overlap with [compliance framework pipelines](../../project/settings/#compliance-pipeline-configuration),
+This feature has some overlap with [compliance framework pipelines](../../project/settings/index.md#compliance-pipeline-configuration),
as we have not [unified the user experience for these two features](https://gitlab.com/groups/gitlab-org/-/epics/7312).
For details on the similarities and differences between these features, see
-[Enforce scan execution](../#enforce-scan-execution).
+[Enforce scan execution](../index.md#enforce-scan-execution).
NOTE:
Policy jobs are created in the `test` stage of the pipeline. If you modify the default pipeline
@@ -88,8 +89,7 @@ This rule enforces the defined actions and schedules a scan on the provided date
|------------|------|-----------------|-------------|
| `type` | `string` | `schedule` | The rule's type. |
| `branches` | `array` of `string` | `*` or the branch's name | The branch the given policy applies to (supports wildcard). |
-| `cadence` | `string` | CRON expression (for example, `0 0 * * *`) | A whitespace-separated string containing five fields that represents the scheduled time. <!--- start_remove The following content will be removed on remove_date: '2022-08-22' --> |
-| `clusters` (removed) | `object` | | This field was [removed](https://gitlab.com/gitlab-org/gitlab/-/issues/356465) in 15.0. The cluster where the given policy enforces running selected scans (only for `container_scanning`/`cluster_image_scanning` scans). The key of the object is the name of the Kubernetes cluster configured for your project in GitLab. In the optionally provided value of the object, you can precisely select Kubernetes resources that are scanned. <!--- end_remove --> |
+| `cadence` | `string` | CRON expression (for example, `0 0 * * *`) | A whitespace-separated string containing five fields that represents the scheduled time. |
GitLab supports the following types of CRON syntax for the `cadence` field:
@@ -98,23 +98,6 @@ GitLab supports the following types of CRON syntax for the `cadence` field:
It is possible that other elements of the CRON syntax will work in the cadence field, however, GitLab does not officially test or support them.
-<!--- start_remove The following content will be removed on remove_date: '2022-08-22' -->
-
-### `cluster` schema (removed)
-
-This schema was [removed](https://gitlab.com/gitlab-org/gitlab/-/issues/356465) in 15.0.
-
-Use this schema to define `clusters` objects in the [`schedule` rule type](#schedule-rule-type).
-
-| Field | Type | Possible values | Description |
-|--------------|---------------------|--------------------------|-------------|
-| `containers` | `array` of `string` | | The container name that is scanned (only the first value is currently supported). |
-| `resources` | `array` of `string` | | The resource name that is scanned (only the first value is currently supported). |
-| `namespaces` | `array` of `string` | | The namespace that is scanned (only the first value is currently supported). |
-| `kinds` | `array` of `string` | `deployment`/`daemonset` | The resource kind that should be scanned (only the first value is currently supported). |
-
-<!--- end_remove -->
-
## `scan` action type
This action executes the selected `scan` with additional parameters when conditions for at least one
@@ -146,7 +129,7 @@ Note the following:
- Container scanning and cluster image scanning scans configured for the `pipeline` rule type ignore the cluster defined in the `clusters` object.
They use predefined CI/CD variables defined for your project. Cluster selection with the `clusters` object is supported for the `schedule` rule type.
A cluster with a name provided in the `clusters` object must be created and configured for the project.
-- The SAST scan uses the default template and runs in a [child pipeline](../../../ci/pipelines/parent_child_pipelines.md).
+- The SAST scan uses the default template and runs in a [child pipeline](../../../ci/pipelines/downstream_pipelines.md#parent-child-pipelines).
## Example security policies project
diff --git a/doc/user/application_security/policies/scan-result-policies.md b/doc/user/application_security/policies/scan-result-policies.md
index 3eee4957e2f..78a97b36e92 100644
--- a/doc/user/application_security/policies/scan-result-policies.md
+++ b/doc/user/application_security/policies/scan-result-policies.md
@@ -1,6 +1,6 @@
---
-stage: Protect
-group: Container Security
+stage: Govern
+group: Security Policies
info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
---
diff --git a/doc/user/application_security/sast/analyzers.md b/doc/user/application_security/sast/analyzers.md
index cbd64e278c8..ec8e8e6fd93 100644
--- a/doc/user/application_security/sast/analyzers.md
+++ b/doc/user/application_security/sast/analyzers.md
@@ -9,7 +9,7 @@ info: To determine the technical writer assigned to the Stage/Group associated w
> [Moved](https://gitlab.com/groups/gitlab-org/-/epics/2098) from GitLab Ultimate to GitLab Free in 13.3.
Static Application Security Testing (SAST) uses analyzers
-to detect vulnerabilities in source code. Each analyzer is a wrapper around a [scanner](../terminology/#scanner), a third-party code analysis tool.
+to detect vulnerabilities in source code. Each analyzer is a wrapper around a [scanner](../terminology/index.md#scanner), a third-party code analysis tool.
The analyzers are published as Docker images that SAST uses to launch dedicated containers for each
analysis.
@@ -20,7 +20,7 @@ For each scanner, an analyzer:
- Exposes its detection logic.
- Handles its execution.
-- Converts its output to a [standard format](../terminology/#secure-report-format).
+- Converts its output to a [standard format](../terminology/index.md#secure-report-format).
## SAST analyzers
@@ -77,7 +77,7 @@ You can choose to disable the other analyzers early and use Semgrep-based scanni
- You'll enjoy significantly faster scanning, reduced CI minutes usage, and more customizable scanning rules.
- However, vulnerabilities previously reported by language-specific analyzers will be reported again under certain conditions, including if you've dismissed the vulnerabilities before. The system behavior depends on:
- whether you've excluded the Semgrep-based analyzer from running in the past.
- - which analyzer first discovered the vulnerabilities shown in the project's [Vulnerability Report](../vulnerability_report/).
+ - which analyzer first discovered the vulnerabilities shown in the project's [Vulnerability Report](../vulnerability_report/index.md).
### Vulnerability translation
@@ -103,7 +103,7 @@ You can choose to use Semgrep-based scanning instead of language-specific analyz
We recommend taking this approach if any of these cases applies:
-- You haven't used SAST before on a project, so you don't already have SAST vulnerabilities in your [Vulnerability Report](../vulnerability_report/).
+- You haven't used SAST before on a project, so you don't already have SAST vulnerabilities in your [Vulnerability Report](../vulnerability_report/index.md).
- You're having trouble configuring one of the analyzers whose coverage overlaps with Semgrep-based coverage. For example, you might have trouble setting up the SpotBugs-based analyzer to compile your code.
- You've already seen and dismissed vulnerabilities created by ESLint, Gosec, or Flawfinder scanning, and you've kept the re-created vulnerabilities created by Semgrep.
@@ -120,6 +120,36 @@ To switch to Semgrep-based scanning early, you can:
1. Merge the MR and wait for the default-branch pipeline to run.
1. Use the Vulnerability Report to dismiss the findings that are no longer detected by the language-specific analyzers.
+#### Preview Semgrep-based scanning
+
+You can see how Semgrep-based scanning will work in your projects before the GitLab-managed Stable CI/CD template for SAST is updated.
+We recommend that you test this change in a merge request but continue using the Stable template in your default branch pipeline configuration.
+
+In GitLab 15.3, we [activated a feature flag](https://gitlab.com/gitlab-org/gitlab/-/issues/362179) to migrate security findings on the default branch from other analyzers to Semgrep.
+We [plan to remove the deprecated analyzers](https://gitlab.com/gitlab-org/gitlab/-/issues/352554) from the Stable CI/CD template in GitLab 15.4.
+
+To preview the upcoming changes to the CI/CD configuration:
+
+1. Open an MR to switch from the Stable CI/CD template, `SAST.gitlab-ci.yml`, to [the Latest template](https://gitlab.com/gitlab-org/gitlab/-/blob/master/lib/gitlab/ci/templates/Jobs/SAST.latest.gitlab-ci.yml), `SAST.latest.gitlab-ci.yml`.
+ - On GitLab.com, use the latest template directly:
+
+ ```yaml
+ include:
+ template: 'SAST.latest.gitlab-ci.yml'
+ ```
+
+ - On a Self-Managed instance, download the template from GitLab.com:
+
+ ```yaml
+ include:
+ remote: 'https://gitlab.com/gitlab-org/gitlab/-/raw/2851f4d5/lib/gitlab/ci/templates/Jobs/SAST.latest.gitlab-ci.yml'
+ ```
+
+1. Verify that scanning jobs succeed in the MR. You'll notice findings from the removed analyzers in _Fixed_ and findings from Semgrep in _New_. (Some findings may show different names, descriptions, and severities, since GitLab manages and edits the Semgrep rulesets.)
+1. Close the MR.
+
+To learn more about Stable and Latest templates, see documentation on [CI/CD template versioning](../../../development/cicd/templates.md#versioning).
+
## Customize analyzers
Use [CI/CD variables](index.md#available-cicd-variables)
diff --git a/doc/user/application_security/sast/index.md b/doc/user/application_security/sast/index.md
index d4b0d5b972c..c9bfecffecc 100644
--- a/doc/user/application_security/sast/index.md
+++ b/doc/user/application_security/sast/index.md
@@ -78,16 +78,17 @@ You can also [view our language roadmap](https://about.gitlab.com/direction/secu
|------------------------------------------------|-----------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------|
| .NET Core | [Security Code Scan](https://security-code-scan.github.io) | 11.0 |
| .NET Framework<sup>1</sup> | [Security Code Scan](https://security-code-scan.github.io) | 13.0 |
+| .NET (all versions, C# only) | [Semgrep](https://semgrep.dev) | 15.4 |
| Apex (Salesforce) | [PMD](https://pmd.github.io/pmd/index.html) | 12.1 |
| C | [Semgrep](https://semgrep.dev) | 14.2 |
| C/C++ | [Flawfinder](https://github.com/david-a-wheeler/flawfinder) | 10.7 |
| Elixir (Phoenix) | [Sobelow](https://github.com/nccgroup/sobelow) | 11.1 |
| Go | [Gosec](https://github.com/securego/gosec) | 10.7 |
| Go | [Semgrep](https://semgrep.dev) | 14.4 |
-| Groovy<sup>2</sup> | [SpotBugs](https://spotbugs.github.io/) with the [find-sec-bugs](https://find-sec-bugs.github.io/) plugin | 11.3 (Gradle) & 11.9 (Ant, Maven, SBT) |
+| Groovy<sup>2</sup> | [SpotBugs](https://spotbugs.github.io/) with the [find-sec-bugs](https://find-sec-bugs.github.io/) plugin | 11.3 (Gradle) & 11.9 (Maven, SBT) |
| Helm Charts | [Kubesec](https://github.com/controlplaneio/kubesec) | 13.1 |
| Java (any build system) | [Semgrep](https://semgrep.dev) | 14.10 |
-| Java<sup>2</sup> | [SpotBugs](https://spotbugs.github.io/) with the [find-sec-bugs](https://find-sec-bugs.github.io/) plugin | 10.6 (Maven), 10.8 (Gradle) & 11.9 (Ant, SBT) |
+| Java<sup>2</sup> | [SpotBugs](https://spotbugs.github.io/) with the [find-sec-bugs](https://find-sec-bugs.github.io/) plugin | 10.6 (Maven), 10.8 (Gradle) & 11.9 (SBT) |
| Java (Android) | [MobSF (beta)](https://github.com/MobSF/Mobile-Security-Framework-MobSF) | 13.5 |
| JavaScript | [ESLint security plugin](https://github.com/nodesecurity/eslint-plugin-security) | 11.8 |
| JavaScript | [Semgrep](https://semgrep.dev) | 13.10 |
@@ -103,16 +104,16 @@ You can also [view our language roadmap](https://about.gitlab.com/direction/secu
| React | [Semgrep](https://semgrep.dev) | 13.10 |
| Ruby | [brakeman](https://brakemanscanner.org) | 13.9 |
| Ruby on Rails | [brakeman](https://brakemanscanner.org) | 10.3 |
-| Scala<sup>2</sup> | [SpotBugs](https://spotbugs.github.io/) with the [find-sec-bugs](https://find-sec-bugs.github.io/) plugin | 11.0 (SBT) & 11.9 (Ant, Gradle, Maven) |
+| Scala<sup>2</sup> | [SpotBugs](https://spotbugs.github.io/) with the [find-sec-bugs](https://find-sec-bugs.github.io/) plugin | 11.0 (SBT) & 11.9 (Gradle, Maven) |
| Swift (iOS) | [MobSF (beta)](https://github.com/MobSF/Mobile-Security-Framework-MobSF) | 13.5 |
| TypeScript | [ESLint security plugin](https://github.com/nodesecurity/eslint-plugin-security) | 11.9, [merged](https://gitlab.com/gitlab-org/gitlab/-/issues/36059) with ESLint in 13.2 |
| TypeScript | [Semgrep](https://semgrep.dev) | 13.10 |
-1. .NET 4 support is limited. The analyzer runs in a Linux container and does not have access to Windows-specific libraries or features. We currently plan to [migrate C# coverage to Semgrep-based scanning](https://gitlab.com/gitlab-org/gitlab/-/issues/347258) to make it easier to scan C# projects.
-1. The SpotBugs-based analyzer supports [Ant](https://ant.apache.org/), [Gradle](https://gradle.org/), [Maven](https://maven.apache.org/), and [SBT](https://www.scala-sbt.org/). It can also be used with variants like the
+1. .NET 4 support is limited. The analyzer runs in a Linux container and does not have access to Windows-specific libraries or features. Use the Semgrep-based scanner if you need .NET 4 support.
+1. The SpotBugs-based analyzer supports [Gradle](https://gradle.org/), [Maven](https://maven.apache.org/), and [SBT](https://www.scala-sbt.org/). It can also be used with variants like the
[Gradle wrapper](https://docs.gradle.org/current/userguide/gradle_wrapper.html),
[Grails](https://grails.org/),
-and the [Maven wrapper](https://github.com/takari/maven-wrapper).
+and the [Maven wrapper](https://github.com/takari/maven-wrapper). However, SpotBugs has [limitations](https://gitlab.com/gitlab-org/gitlab/-/issues/350801) when used against [Ant](https://ant.apache.org/)-based projects. We recommend using the Semgrep-based analyzer for Ant-based Java projects.
### Multi-project support
@@ -242,7 +243,7 @@ successfully, and an error may occur.
To enable and configure SAST with default settings:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance** > **Configuration**.
1. In the SAST section, select **Configure with a merge request**.
1. Review and merge the merge request to enable SAST.
@@ -262,7 +263,7 @@ successfully, and an error may occur.
To enable and configure SAST with customizations:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Configuration**.
1. If the project does not have a `.gitlab-ci.yml` file, select **Enable SAST** in the Static
Application Security Testing (SAST) row, otherwise select **Configure SAST**.
@@ -337,7 +338,7 @@ False positive detection is available in a subset of the [supported languages](#
> [Introduced](https://gitlab.com/groups/gitlab-org/-/epics/5144) in GitLab 14.2.
Source code is volatile; as developers make changes, source code may move within files or between files.
-Security analyzers may have already reported vulnerabilities that are being tracked in the [Vulnerability Report](../vulnerability_report/).
+Security analyzers may have already reported vulnerabilities that are being tracked in the [Vulnerability Report](../vulnerability_report/index.md).
These vulnerabilities are linked to specific problematic code fragments so that they can be found and fixed.
If the code fragments are not tracked reliably as they move, vulnerability management is harder because the same vulnerability could be reported again.
@@ -550,8 +551,8 @@ Some analyzers can be customized with CI/CD variables.
| `KUBESEC_HELM_CHARTS_PATH` | Kubesec | Optional path to Helm charts that `helm` uses to generate a Kubernetes manifest that `kubesec` scans. If dependencies are defined, `helm dependency build` should be ran in a `before_script` to fetch the necessary dependencies. |
| `KUBESEC_HELM_OPTIONS` | Kubesec | Additional arguments for the `helm` executable. |
| `COMPILE` | Gosec, SpotBugs | Set to `false` to disable project compilation and dependency fetching. [Introduced for `SpotBugs`](https://gitlab.com/gitlab-org/gitlab/-/issues/195252) analyzer in GitLab 13.1 and [`Gosec`](https://gitlab.com/gitlab-org/gitlab/-/issues/330678) analyzer in GitLab 14.0. |
-| `ANT_HOME` | SpotBugs | The `ANT_HOME` variable. |
-| `ANT_PATH` | SpotBugs | Path to the `ant` executable. |
+| `ANT_HOME` | SpotBugs | The `ANT_HOME` variable. |
+| `ANT_PATH` | SpotBugs | Path to the `ant` executable. |
| `GRADLE_PATH` | SpotBugs | Path to the `gradle` executable. |
| `JAVA_OPTS` | SpotBugs | Additional arguments for the `java` executable. |
| `JAVA_PATH` | SpotBugs | Path to the `java` executable. |
@@ -565,6 +566,22 @@ Some analyzers can be customized with CI/CD variables.
| `PHPCS_SECURITY_AUDIT_PHP_EXTENSIONS` | phpcs-security-audit | Comma separated list of additional PHP Extensions. |
| `SAST_DISABLE_BABEL` | NodeJsScan | **{warning}** **[Removed](https://gitlab.com/gitlab-org/gitlab/-/merge_requests/64025)** in GitLab 13.5 |
| `SAST_SEMGREP_METRICS` | Semgrep | Set to `"false"` to disable sending anonymized scan metrics to [r2c](https://r2c.dev/). Default: `true`. Introduced in GitLab 14.0 from the [confidential issue](../../project/issues/confidential_issues.md) `https://gitlab.com/gitlab-org/gitlab/-/issues/330565`. |
+| `SAST_SCANNER_ALLOWED_CLI_OPTS` | Semgrep | CLI options (arguments with a value, or flags) that are passed to the underlying security scanner when running the scan. Only a limited set of [options](#security-scanner-configuration) is accepted. Separate a CLI option and its value with either a blank space or an equals (`=`) character, for example `name1 value1` or `name1=value1`. Separate multiple options with blank spaces, for example `name1 value1 name2 value2`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/368565) in GitLab 15.3. |
+
+#### Security scanner configuration
+
+SAST analyzers internally use open source security scanners to perform the analysis. We set the
+recommended configuration for each security scanner so you don't need to tune it yourself. However,
+in rare cases our default scanner configuration might not suit your requirements.
+
+To allow some customization of scanner behavior, you can add a limited set of flags to the
+underlying scanner. Specify the flags in the `SAST_SCANNER_ALLOWED_CLI_OPTS` CI/CD variable. These
+flags are added to the scanner's CLI options.
+
+| Analyzer | CLI option | Description |
+|------------------------------------------------------------------------------|--------------------|------------------------------------------------------------------------------|
+| [Semgrep](https://gitlab.com/gitlab-org/security-products/analyzers/semgrep) | `--max-memory` | Sets the maximum system memory to use when running a rule on a single file. Measured in MB. |
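+
+For example, the following `.gitlab-ci.yml` extract is a minimal sketch that passes `--max-memory`
+to the Semgrep-based analyzer through `SAST_SCANNER_ALLOWED_CLI_OPTS`. The `semgrep-sast` job name
+and the `2048` MB limit are illustrative; adjust them to match your configuration and runners.
+
+```yaml
+include:
+  - template: Jobs/SAST.gitlab-ci.yml
+
+semgrep-sast:
+  variables:
+    SAST_SCANNER_ALLOWED_CLI_OPTS: "--max-memory 2048"
+```
+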
#### Custom CI/CD variables
@@ -729,6 +746,31 @@ variables:
SECURE_LOG_LEVEL: "debug"
```
+### Pipeline errors related to changes in the GitLab-managed CI/CD template
+
+The [GitLab-managed SAST CI/CD template](#configure-sast-manually) controls which [analyzer](analyzers.md) jobs run and how they're configured. While using the template, you might experience a job failure or other pipeline error. For example, you might:
+
+- See an error message like `'<your job>' needs 'spotbugs-sast' job, but 'spotbugs-sast' is not in any previous stage` when you view an affected pipeline.
+- Experience another type of unexpected issue with your CI/CD pipeline configuration.
+
+If you're experiencing a job failure or seeing a SAST-related `yaml invalid` pipeline status, you can temporarily revert to an older version of the template so your pipelines keep working while you investigate the issue. To use an older version of the template, change the existing `include` statement in your CI/CD YAML file to refer to a specific template version, such as `v15.3.3-ee`:
+
+```yaml
+include:
+ remote: 'https://gitlab.com/gitlab-org/gitlab/-/raw/v15.3.3-ee/lib/gitlab/ci/templates/Jobs/SAST.gitlab-ci.yml'
+```
+
+If your GitLab instance has limited network connectivity, you can also download the file and host it elsewhere.
+
+We recommend that you only use this solution temporarily and that you return to [the standard template](#configure-sast-manually) as soon as possible.
+
+### Errors in a specific analyzer job
+
+GitLab SAST [analyzers](analyzers.md) are released as container images.
+If you're seeing a new error that doesn't appear to be related to [the GitLab-managed SAST CI/CD template](#configure-sast-manually) or changes in your own project, you can try [pinning the affected analyzer to a specific older version](#pinning-to-minor-image-version).
+
+Each [analyzer project](analyzers.md#sast-analyzers) has a `CHANGELOG.md` file listing the changes made in each available version.
+
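+For example, the following `.gitlab-ci.yml` extract is a minimal sketch of pinning the Semgrep-based
+analyzer to an older version. The `semgrep-sast` job name, the `SAST_ANALYZER_IMAGE_TAG` variable,
+and the `3.0` tag are assumptions; check [Pinning to minor image version](#pinning-to-minor-image-version)
+and the analyzer's `CHANGELOG.md` for the values that apply to your project.
+
+```yaml
+include:
+  - template: Jobs/SAST.gitlab-ci.yml
+
+semgrep-sast:
+  variables:
+    SAST_ANALYZER_IMAGE_TAG: "3.0"
+```
+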
### `Error response from daemon: error processing tar file: docker-tar: relocation error`
This error occurs when the Docker version that runs the SAST job is `19.03.0`.
diff --git a/doc/user/application_security/secret_detection/index.md b/doc/user/application_security/secret_detection/index.md
index 93c32f998fa..fe029b26ce5 100644
--- a/doc/user/application_security/secret_detection/index.md
+++ b/doc/user/application_security/secret_detection/index.md
@@ -6,26 +6,42 @@ info: To determine the technical writer assigned to the Stage/Group associated w
# Secret Detection **(FREE)**
-> [Moved](https://gitlab.com/gitlab-org/gitlab/-/issues/222788) from GitLab Ultimate to GitLab Free in 13.3.
-
-A recurring problem when developing applications is that people may accidentally commit secrets to
-their remote Git repositories. Secrets include keys, passwords, API tokens, and other sensitive
+> - In GitLab 13.1, Secret Detection was split from the [SAST configuration](../sast/index.md#configuration)
+> into its own CI/CD template. If you're using GitLab 13.0 or earlier and SAST is enabled, then
+> Secret Detection is already enabled.
+> - [Moved](https://gitlab.com/gitlab-org/gitlab/-/issues/222788) from GitLab Ultimate to GitLab
+> Free in 13.3.
+> - [In GitLab 14.0](https://gitlab.com/gitlab-org/gitlab/-/issues/297269), Secret Detection jobs
+> `secret_detection_default_branch` and `secret_detection` were consolidated into one job,
+> `secret_detection`.
+
+People may accidentally commit secrets to
+remote Git repositories. Secrets include keys, passwords, API tokens, and other sensitive
information. Anyone with access to the repository could use the secrets for malicious purposes.
-Secrets exposed in this way must be treated as compromised, and be replaced, which can be costly.
-It's important to prevent secrets from being committed to a Git repository.
+Exposed secrets are compromised and must be replaced, which can be costly.
+
+To help prevent secrets from being committed to a Git repository, you can use Secret Detection
+to scan your repository for secrets. Scanning is language
+and framework agnostic, but does not support scanning binary files.
+
+Secret Detection uses a specific analyzer containing the
+[Gitleaks](https://github.com/zricethezav/gitleaks) tool to scan the repository for secrets, in a
+`secret-detection` job. The results are saved as a
+[Secret Detection report artifact](../../../ci/yaml/artifacts_reports.md#artifactsreportssecret_detection)
+that you can later download and analyze. Due to implementation limitations, we always take the
+latest Secret Detection artifact available.
+
+GitLab SaaS supports post-processing hooks, so you can take action when a secret is found. For
+more information, see [Post-processing and revocation](post_processing.md).
-Secret Detection uses the [Gitleaks](https://github.com/zricethezav/gitleaks) tool to scan the
-repository for secrets. All identified secrets are reported in the:
+All identified secrets are reported in the:
- Merge request widget
- Pipelines' **Security** tab
-- [Security Dashboard](../security_dashboard/)
+- [Security Dashboard](../security_dashboard/index.md)
![Secret Detection in merge request widget](img/secret_detection_v13_2.png)
-WARNING:
-Secret Detection does not support scanning binary files.
-
## Detected secrets
Secret Detection uses a [default ruleset](https://gitlab.com/gitlab-org/security-products/analyzers/secrets/-/blob/master/gitleaks.toml)
@@ -34,105 +50,57 @@ patterns using [custom rulesets](#custom-rulesets). If you want to contribute ru
"well-identifiable" secrets, follow the steps detailed in the
[community contributions guidelines](https://gitlab.com/gitlab-org/gitlab/-/issues/345453).
-## Requirements
-
-To run Secret Detection jobs, by default, you need GitLab Runner with the
-[`docker`](https://docs.gitlab.com/runner/executors/docker.html) or
-[`kubernetes`](https://docs.gitlab.com/runner/install/kubernetes.html) executor.
-If you're using the shared runners on GitLab.com, this is enabled by default.
-
-WARNING:
-Our Secret Detection jobs expect a Linux/amd64 container type. Windows containers are not supported.
-
-WARNING:
-If you use your own runners, make sure the Docker version installed
-is **not** `19.03.0`. See [troubleshooting information](../sast#error-response-from-daemon-error-processing-tar-file-docker-tar-relocation-error) for details.
-
-### Making Secret Detection available to all GitLab tiers
+## Features per tier
-To make Secret Detection available to as many customers as possible, we have enabled it for all GitLab tiers.
-However not all features are available on every tier. See the breakdown below for more details.
+Different features are available in different [GitLab tiers](https://about.gitlab.com/pricing/).
-#### Summary of features per tier
+| Capability | In Free & Premium | In Ultimate |
+|:---------------------------------------------------------------- |:-----------------------|:-----------------------|
+| [Configure Secret Detection scanner](#enable-secret-detection) | **{check-circle}** Yes | **{check-circle}** Yes |
+| [Customize Secret Detection settings](#configure-scan-settings) | **{check-circle}** Yes | **{check-circle}** Yes |
+| Download [JSON Report](../sast/index.md#reports-json-format) | **{check-circle}** Yes | **{check-circle}** Yes |
+| See new findings in the merge request widget | **{dotted-circle}** No | **{check-circle}** Yes |
+| View identified secrets in the pipelines' **Security** tab | **{dotted-circle}** No | **{check-circle}** Yes |
+| [Manage vulnerabilities](../vulnerabilities/index.md) | **{dotted-circle}** No | **{check-circle}** Yes |
+| [Access the Security Dashboard](../security_dashboard/index.md) | **{dotted-circle}** No | **{check-circle}** Yes |
+| [Customize Secret Detection rulesets](#custom-rulesets) | **{dotted-circle}** No | **{check-circle}** Yes |
-Different features are available in different [GitLab tiers](https://about.gitlab.com/pricing/),
-as shown in the following table:
+## Enable Secret Detection
-| Capability | In Free & Premium | In Ultimate |
-|:----------------------------------------------------------------|:--------------------|:-------------------|
-| [Configure Secret Detection scanner](#configuration) | **{check-circle}** | **{check-circle}** |
-| [Customize Secret Detection settings](#customizing-settings) | **{check-circle}** | **{check-circle}** |
-| Download [JSON Report](../sast/index.md#reports-json-format) | **{check-circle}** | **{check-circle}** |
-| See new findings in the merge request widget | **{dotted-circle}** | **{check-circle}** |
-| View identified secrets in the pipelines' **Security** tab | **{dotted-circle}** | **{check-circle}** |
-| [Manage vulnerabilities](../vulnerabilities/index.md) | **{dotted-circle}** | **{check-circle}** |
-| [Access the Security Dashboard](../security_dashboard/index.md) | **{dotted-circle}** | **{check-circle}** |
-| [Customize Secret Detection rulesets](#custom-rulesets) | **{dotted-circle}** | **{check-circle}** |
+Prerequisites:
-## Configuration
+- GitLab Runner with the [`docker`](https://docs.gitlab.com/runner/executors/docker.html) or
+  [`kubernetes`](https://docs.gitlab.com/runner/install/kubernetes.html) executor. If you're using the
+  shared runners on GitLab.com, this is enabled by default.
+- If you use your own runners, make sure the Docker version installed is **not** `19.03.0`. See
+  [troubleshooting information](../sast/index.md#error-response-from-daemon-error-processing-tar-file-docker-tar-relocation-error)
+  for details.
+- A Linux/amd64 container type. Windows containers are not supported.
+- A GitLab CI/CD configuration (`.gitlab-ci.yml`) that includes the `test` stage.
-> - In GitLab 13.1, Secret Detection was split from the [SAST configuration](../sast#configuration) into its own CI/CD template. If you're using GitLab 13.0 or earlier and SAST is enabled, then Secret Detection is already enabled.
-> - [In GitLab 14.0](https://gitlab.com/gitlab-org/gitlab/-/issues/297269), Secret Detection jobs `secret_detection_default_branch` and `secret_detection` were consolidated into one job, `secret_detection`.
+To enable Secret Detection, either:
-Secret Detection is performed by a [specific analyzer](https://gitlab.com/gitlab-org/gitlab/-/blob/master/lib/gitlab/ci/templates/Security/Secret-Detection.gitlab-ci.yml)
-during the `secret-detection` job. It runs regardless of your app's programming language.
+- Enable [Auto DevOps](../../../topics/autodevops/index.md), which includes [Auto Secret Detection](../../../topics/autodevops/stages.md#auto-secret-detection).
-The Secret Detection analyzer includes [Gitleaks](https://github.com/zricethezav/gitleaks) checks.
+- [Enable Secret Detection by including the template](#enable-secret-detection-by-including-the-template).
-Note that the Secret Detection analyzer ignores Password-in-URL vulnerabilities if the password
-begins with a dollar sign (`$`), as this likely indicates the password is an environment variable.
-For example, `https://username:$password@example.com/path/to/repo` isn't detected, while
-`https://username:password@example.com/path/to/repo` is.
+- [Enable Secret Detection using a merge request](#enable-secret-detection-using-a-merge-request).
-NOTE:
-You don't have to configure Secret Detection manually as shown in this section if you're using
-[Auto Secret Detection](../../../topics/autodevops/stages.md#auto-secret-detection),
-provided by [Auto DevOps](../../../topics/autodevops/index.md).
+### Enable Secret Detection by including the template
-To enable Secret Detection for GitLab 13.1 and later, you must include the
-`Secret-Detection.gitlab-ci.yml` template that's provided as a part of your GitLab installation. For
-GitLab versions earlier than 11.9, you can copy and use the job as defined in that template.
+We recommend this method if you have an existing GitLab CI/CD configuration file.
-Ensure your `.gitlab-ci.yml` file has a `stage` called `test`, and add the following to your `.gitlab-ci.yml` file:
+Add the following extract to your `.gitlab-ci.yml` file:
```yaml
include:
- template: Security/Secret-Detection.gitlab-ci.yml
```
-The included template creates Secret Detection jobs in your CI/CD pipeline and scans
-your project's source code for secrets.
-
-The results are saved as a
-[Secret Detection report artifact](../../../ci/yaml/artifacts_reports.md#artifactsreportssecret_detection)
-that you can later download and analyze. Due to implementation limitations, we
-always take the latest Secret Detection artifact available.
-
-### Supported distributions
-
-The default scanner images are build off a base Alpine image for size and maintainability.
-
-#### FIPS-enabled images
-
-> [Introduced](https://gitlab.com/groups/gitlab-org/-/epics/6479) in GitLab 14.10.
-
-GitLab offers [Red Hat UBI](https://www.redhat.com/en/blog/introducing-red-hat-universal-base-image)
-versions of the images that are FIPS-enabled. To use the FIPS-enabled images, you can either:
-
-- Set the `SAST_IMAGE_SUFFIX` to `-fips`.
-- Add the `-fips` extension to the default image name.
-
-For example:
-
-```yaml
-variables:
- SECRET_DETECTION_IMAGE_SUFFIX: '-fips'
-
-include:
- - template: Security/Secret-Detection.gitlab-ci.yml
-```
+Pipelines now include a Secret Detection job, and the results are included in the merge request
+widget.
-### Enable Secret Detection via an automatic merge request
+### Enable Secret Detection using a merge request
> - [Introduced](https://gitlab.com/groups/gitlab-org/-/epics/4496) in GitLab 13.11, deployed behind a feature flag, enabled by default.
> - [Feature flag removed](https://gitlab.com/gitlab-org/gitlab/-/issues/329886) in GitLab 14.1.
@@ -142,45 +110,36 @@ This method works best with no existing `.gitlab-ci.yml` file, or with a minimal
file. If you have a complex GitLab configuration file it may not be parsed successfully, and an
error may occur.
-To enable Secret Detection in a project, you can create a merge request:
+To enable Secret Detection using a merge request:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Configuration**.
1. In the **Secret Detection** row, select **Configure with a merge request**.
-1. Review and merge the merge request to enable Secret Detection.
+1. Review and merge the merge request.
-Pipelines now include a Secret Detection job.
+Pipelines now include a Secret Detection job, and the results are included in the merge request
+widget.
-### Customizing settings
+## Configure scan settings
The Secret Detection scan settings can be changed through [CI/CD variables](#available-cicd-variables)
-by using the
-[`variables`](../../../ci/yaml/index.md#variables) parameter in `.gitlab-ci.yml`.
+by using the [`variables`](../../../ci/yaml/index.md#variables) parameter in `.gitlab-ci.yml`.
WARNING:
-All customization of GitLab security scanning tools should be tested in a merge request before
+All configuration of GitLab security scanning tools should be tested in a merge request before
merging these changes to the default branch. Failure to do so can give unexpected results,
including a large number of false positives.
To override a job definition, (for example, change properties like `variables` or `dependencies`),
-declare a job with the same name as the secret detection job to override. Place this new job after the template
-inclusion and specify any additional keys under it.
-
-WARNING:
-Beginning in GitLab 13.0, the use of [`only` and `except`](../../../ci/yaml/index.md#only--except)
-is no longer supported. When overriding the template, you must use [`rules`](../../../ci/yaml/index.md#rules) instead.
+declare a job with the same name as the secret detection job to override. Place this new job after
+the template inclusion and specify any additional keys under it.
-#### `GIT_DEPTH` variable
+In the following example _extract_ of a `.gitlab-ci.yml` file:
-The [`GIT_DEPTH` CI/CD variable](../../../ci/runners/configure_runners.md#shallow-cloning) affects Secret Detection.
-The Secret Detection analyzer relies on generating patches between commits to scan content for
-secrets. If you override the default, ensure the value is greater than 1. If the number of commits
-in an MR is greater than the `GIT_DEPTH` value, Secret Detection will [fail to detect secrets](#error-couldnt-run-the-gitleaks-command-exit-status-2).
-
-#### Custom settings example
-
-In the following example, we include the Secret Detection template and at the same time we
-override the `secret_detection` job with the `SECRET_DETECTION_HISTORIC_SCAN` CI/CD variable to `true`:
+- The Secret Detection template is [included](../../../ci/yaml/index.md#include).
+- In the `secret_detection` job, the CI/CD variable `SECRET_DETECTION_HISTORIC_SCAN` is set to
+ `true`. Because the template is evaluated before the pipeline configuration, the last mention of
+ the variable takes precedence.
```yaml
include:
@@ -191,10 +150,7 @@ secret_detection:
SECRET_DETECTION_HISTORIC_SCAN: "true"
```
-Because the template is [evaluated before](../../../ci/yaml/index.md#include)
-the pipeline configuration, the last mention of the variable takes precedence.
-
-#### Available CI/CD variables
+### Available CI/CD variables
Secret Detection can be customized by defining available CI/CD variables:
@@ -202,7 +158,7 @@ Secret Detection can be customized by defining available CI/CD variables:
|-----------------------------------|---------------|-------------|
| `SECRET_DETECTION_EXCLUDED_PATHS` | "" | Exclude vulnerabilities from output based on the paths. This is a comma-separated list of patterns. Patterns can be globs (see [`doublestar.Match`](https://pkg.go.dev/github.com/bmatcuk/doublestar/v4@v4.0.2#Match) for supported patterns), or file or folder paths (for example, `doc,spec` ). Parent directories also match patterns. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/225273) in GitLab 13.3. |
| `SECRET_DETECTION_HISTORIC_SCAN` | false | Flag to enable a historic Gitleaks scan. |
-| `SECRET_DETECTION_IMAGE_SUFFIX` | "" | Suffix added to the image name. If set to `-fips`, `FIPS-enabled` images are used for scan. See [FIPS-enabled images](#fips-enabled-images) for more details. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/355519) in GitLab 14.10. |
+| `SECRET_DETECTION_IMAGE_SUFFIX` | "" | Suffix added to the image name. If set to `-fips`, `FIPS-enabled` images are used for scan. See [Use FIPS-enabled images](#use-fips-enabled-images) for more details. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/355519) in GitLab 14.10. |
| `SECRET_DETECTION_LOG_OPTIONS` | "" | [`git log`](https://git-scm.com/docs/git-log) options used to define commit ranges. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/350660) in GitLab 15.1.|
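+
+For example, the following `.gitlab-ci.yml` extract is a minimal sketch that excludes findings under
+two hypothetical paths from the Secret Detection output, using the job override pattern described in
+[Configure scan settings](#configure-scan-settings):
+
+```yaml
+include:
+  - template: Security/Secret-Detection.gitlab-ci.yml
+
+secret_detection:
+  variables:
+    SECRET_DETECTION_EXCLUDED_PATHS: "vendor,node_modules"
+```
+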
In previous GitLab versions, the following variables were also available:
@@ -211,45 +167,79 @@ In previous GitLab versions, the following variables were also available:
|-----------------------------------|---------------|-------------|
| `SECRET_DETECTION_COMMIT_FROM` | - | The commit a Gitleaks scan starts at. [Removed](https://gitlab.com/gitlab-org/gitlab/-/issues/243564) in GitLab 13.5. Replaced with `SECRET_DETECTION_COMMITS`. |
| `SECRET_DETECTION_COMMIT_TO` | - | The commit a Gitleaks scan ends at. [Removed](https://gitlab.com/gitlab-org/gitlab/-/issues/243564) in GitLab 13.5. Replaced with `SECRET_DETECTION_COMMITS`. |
-| `SECRET_DETECTION_COMMITS` | - | The list of commits that Gitleaks should scan. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/243564) in GitLab 13.5. [Removed](https://gitlab.com/gitlab-org/gitlab/-/issues/352565) in GitLab 15.0. |
+| `SECRET_DETECTION_COMMITS` | - | The list of commits that Gitleaks should scan. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/243564) in GitLab 13.5. [Removed](https://gitlab.com/gitlab-org/gitlab/-/issues/352565) in GitLab 15.0. |
-### Custom rulesets **(ULTIMATE)**
+#### Use FIPS-enabled images
-> - [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/211387) in GitLab 13.5.
-> - [Added](https://gitlab.com/gitlab-org/gitlab/-/issues/339614) support for
-> passthrough chains. Expanded to include additional passthrough types of `file`, `git`, and `url` in GitLab 14.6.
-> - [Added](https://gitlab.com/gitlab-org/gitlab/-/issues/235359) support for overriding rules in GitLab 14.8.
+> [Introduced](https://gitlab.com/groups/gitlab-org/-/epics/6479) in GitLab 14.10.
-You can customize the default secret detection rules provided with GitLab.
-Ruleset customization supports the following capabilities that can be used
-simultaneously:
+The default scanner images are built off a base Alpine image for size and maintainability. GitLab
+offers [Red Hat UBI](https://www.redhat.com/en/blog/introducing-red-hat-universal-base-image)
+versions of the images that are FIPS-enabled.
-- [Disabling predefined rules](#disable-predefined-analyzer-rules).
-- [Overriding predefined rules](#override-predefined-analyzer-rules).
-- Modifying the default behavior of the Secret Detection analyzer by [synthesizing and passing a custom configuration](#synthesize-a-custom-configuration).
+To use the FIPS-enabled images, either:
-Customization allows replacing the default secret detection rules with rules that you define.
+- Set the `SECRET_DETECTION_IMAGE_SUFFIX` CI/CD variable to `-fips`.
+- Add the `-fips` extension to the default image name.
-To create a custom ruleset:
+For example:
-1. Create a `.gitlab` directory at the root of your project, if one doesn't already exist.
-1. Create a custom ruleset file named `secret-detection-ruleset.toml` in the `.gitlab` directory.
+```yaml
+variables:
+ SECRET_DETECTION_IMAGE_SUFFIX: '-fips'
-#### Disable predefined analyzer rules
+include:
+ - template: Security/Secret-Detection.gitlab-ci.yml
+```
-To disable analyzer rules:
+## Full history Secret Detection
-1. Set the `disabled` flag to `true` in the context of a `ruleset` section.
+By default, Secret Detection scans only the current state of the Git repository. Any secrets
+contained in the repository's history are not detected. To address this, Secret Detection can
+scan the Git repository's full history.
+
+We recommend you run a full history scan only once, after enabling Secret Detection. A full history
+scan can take a long time, especially for larger repositories with lengthy Git histories. After
+completing an initial full history scan, use only standard Secret Detection as part of your
+pipeline.
+
+### Enable full history Secret Detection
+
+To enable full history Secret Detection, set the variable `SECRET_DETECTION_HISTORIC_SCAN` to `true` in your `.gitlab-ci.yml` file.
+
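+For example, the following `.gitlab-ci.yml` extract is a minimal sketch that enables a full history
+scan by overriding the `secret_detection` job, using the same override pattern as in
+[Configure scan settings](#configure-scan-settings):
+
+```yaml
+include:
+  - template: Security/Secret-Detection.gitlab-ci.yml
+
+secret_detection:
+  variables:
+    SECRET_DETECTION_HISTORIC_SCAN: "true"
+```
+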
+## Custom rulesets **(ULTIMATE)**
+
+> - [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/211387) in GitLab 13.5.
+> - [Added](https://gitlab.com/gitlab-org/gitlab/-/issues/339614) support for passthrough chains.
+> Expanded to include additional passthrough types of `file`, `git`, and `url` in GitLab 14.6.
+> - [Added](https://gitlab.com/gitlab-org/gitlab/-/issues/235359) support for overriding rules in
+> GitLab 14.8.
+
+You can customize the default Secret Detection rules provided with GitLab.
-1. In one or more `ruleset.identifier` subsections, list the rules that you want disabled. Every `ruleset.identifier` section has:
+The following customization options can be used separately, or in combination:
- - a `type` field, to name the predefined rule identifier.
- - a `value` field, to name the rule to be disabled.
+- [Disable predefined rules](#disable-predefined-analyzer-rules).
+- [Override predefined rules](#override-predefined-analyzer-rules).
+- [Synthesize a custom configuration](#synthesize-a-custom-configuration).
-##### Example: Disable predefined rules of Secret Detection analyzer
+### Disable predefined analyzer rules
-In the following example, the disabled rules is assigned to `secrets`
-by matching the `type` and `value` of identifiers:
+If there are specific Secret Detection rules that you don't want active, you can disable them.
+
+To disable analyzer rules:
+
+1. Create a `.gitlab` directory at the root of your project, if one doesn't already exist.
+1. Create a custom ruleset file named `secret-detection-ruleset.toml` in the `.gitlab` directory, if
+ one doesn't already exist.
+1. Set the `disabled` flag to `true` in the context of a `ruleset` section.
+1. In one or more `ruleset.identifier` subsections, list the rules to disable. Every
+ `ruleset.identifier` section has:
+ - A `type` field for the predefined rule identifier.
+ - A `value` field for the rule name.
+
+In the following example `secret-detection-ruleset.toml` file, the disabled rules are assigned to
+`secrets` by matching the `type` and `value` of identifiers:
```toml
[secrets]
@@ -260,29 +250,30 @@ by matching the `type` and `value` of identifiers:
value = "RSA private key"
```
-#### Override predefined analyzer rules
+### Override predefined analyzer rules
-To override rules:
-
-1. In one or more `ruleset.identifier` subsections, list the rules that you want to override. Every `ruleset.identifier` section has:
+If there are specific Secret Detection rules you want to customize, you can override them. For
+example, you might increase the severity of specific secrets.
- - a `type` field, to name the predefined rule identifier that the Secret Detection analyzer uses.
- - a `value` field, to name the rule to be overridden.
+To override rules:
+1. Create a `.gitlab` directory at the root of your project, if one doesn't already exist.
+1. Create a custom ruleset file named `secret-detection-ruleset.toml` in the `.gitlab` directory, if
+ one doesn't already exist.
+1. In one or more `ruleset.identifier` subsections, list the rules to override. Every
+ `ruleset.identifier` section has:
+ - A `type` field for the predefined rule identifier.
+ - A `value` field for the rule name.
1. In the `ruleset.override` context of a `ruleset` section,
provide the keys to override. Any combination of keys can be
overridden. Valid keys are:
-
- description
- message
- name
- severity (valid options are: Critical, High, Medium, Low, Unknown, Info)
-##### Example: Override predefined rules of Secret Detection analyzer
-
-In the following example, rules
-are matched by the `type` and `value` of identifiers and
-then overridden:
+In the following example `secret-detection-ruleset.toml` file, rules are matched by the `type` and
+`value` of identifiers and then overridden:
```toml
[secrets]
@@ -297,166 +288,157 @@ then overridden:
severity = "Info"
```
-#### Synthesize a custom configuration
+### Synthesize a custom configuration
-To create a custom configuration, you can use passthrough chains.
+To create a custom configuration, you can use passthroughs. Passthroughs can also be chained
+to build more complex configurations. For more details, see
+[SAST Customize rulesets](../sast/customize_rulesets.md).
-1. In the `secret-detection-ruleset.toml` file, do one of the following:
+In the `secret-detection-ruleset.toml` file, do one of the following:
- - Define a custom ruleset:
+- Define a custom ruleset, for example:
- ```toml
- [secrets]
- description = 'secrets custom rules configuration'
+ ```toml
+ [secrets]
+ description = 'secrets custom rules configuration'
- [[secrets.passthrough]]
- type = "raw"
- target = "gitleaks.toml"
- value = """\
- title = "gitleaks config"
- # add regexes to the regex table
- [[rules]]
- description = "Test for Raw Custom Rulesets"
- regex = '''Custom Raw Ruleset T[est]{3}'''
- """
- ```
+ [[secrets.passthrough]]
+ type = "raw"
+ target = "gitleaks.toml"
+ value = """\
+ title = "gitleaks config"
+ # add regexes to the regex table
+ [[rules]]
+ description = "Test for Raw Custom Rulesets"
+ regex = '''Custom Raw Ruleset T[est]{3}'''
+ """
+ ```
- - Provide the name of the file containing a custom ruleset:
+- Provide the name of the file containing a custom ruleset, for example:
- ```toml
- [secrets]
- description = 'secrets custom rules configuration'
+ ```toml
+ [secrets]
+ description = 'secrets custom rules configuration'
- [[secrets.passthrough]]
- type = "file"
- target = "gitleaks.toml"
- value = "config/gitleaks.toml"
- ```
+ [[secrets.passthrough]]
+ type = "file"
+ target = "gitleaks.toml"
+ value = "config/gitleaks.toml"
+ ```
-Passthroughs can also be chained to build more complex configurations.
-For more details, see [SAST Customize ruleset section](../sast/customize_rulesets.md).
+## Running Secret Detection in an offline environment **(PREMIUM SELF)**
-### Logging level
-
-To control the verbosity of logs set the `SECURE_LOG_LEVEL` CI/CD variable. Messages of this logging level or higher are output. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/10880) in GitLab 13.1.
-
-From highest to lowest severity, the logging levels are:
-
-- `fatal`
-- `error`
-- `warn`
-- `info` (default)
-- `debug`
-
-## Post-processing and revocation
+For self-managed GitLab instances in an environment with limited, restricted, or intermittent access
+to external resources through the internet, some configuration changes are required for the Secret
+Detection job to run successfully. The instructions in this section must be completed together with
+the instructions detailed in [offline environments](../offline_deployments/index.md).
-Upon detection of a secret, GitLab SaaS supports post-processing hooks.
-For more information, see [Post-processing and revocation](post_processing.md).
+### Configure GitLab Runner
-## Full History Secret Detection
+By default, a runner tries to pull Docker images from the GitLab container registry even if a local
+copy is available. We recommend using this default setting, to ensure Docker images remain current.
+However, if no network connectivity is available, you must change the default GitLab Runner
+`pull_policy` variable.
-GitLab 12.11 introduced support for scanning the full history of a repository. This new functionality
-is particularly useful when you are enabling Secret Detection in a repository for the first time and you
-want to perform a full secret detection scan. Running a secret detection scan on the full history can take a long time,
-especially for larger repositories with lengthy Git histories. We recommend not setting this CI/CD variable
-as part of your normal job definition.
+Set the GitLab Runner `pull_policy` configuration setting to
+[`if-not-present`](https://docs.gitlab.com/runner/executors/docker.html#using-the-if-not-present-pull-policy).
-A new configuration variable ([`SECRET_DETECTION_HISTORIC_SCAN`](#available-cicd-variables))
-can be set to change the behavior of the GitLab Secret Detection scan to run on the entire Git history of a repository.
+### Use local Secret Detection analyzer image
-## Running Secret Detection in an offline environment
+Use a local Secret Detection analyzer image if you want to obtain the image from a local Docker
+registry instead of the GitLab container registry.
-For self-managed GitLab instances in an environment with limited, restricted, or intermittent access
-to external resources through the internet, some adjustments are required for the Secret Detection job to
-run successfully. For more information, see [Offline environments](../offline_deployments/index.md).
+Prerequisites:
-### Requirements for offline Secret Detection
+- Importing Docker images into a local offline Docker registry depends on your
+ network security policy. Consult your IT staff to find an accepted and approved process
+ to import or temporarily access external resources.
-To use Secret Detection in an offline environment, you need:
+1. Import the default Secret Detection analyzer image from `registry.gitlab.com` into your
+ [local Docker container registry](../../packages/container_registry/index.md):
-- GitLab Runner with the [`docker` or `kubernetes` executor](#requirements).
-- A Docker Container Registry with locally available copy of Secret Detection [analyzer](https://gitlab.com/gitlab-org/security-products/analyzers) images.
-- Configure certificate checking of packages (optional).
+ ```plaintext
+ registry.gitlab.com/security-products/secret-detection:3
+ ```
-GitLab Runner has a [default `pull policy` of `always`](https://docs.gitlab.com/runner/executors/docker.html#using-the-always-pull-policy),
-meaning the runner tries to pull Docker images from the GitLab container registry even if a local
-copy is available. The GitLab Runner [`pull_policy` can be set to `if-not-present`](https://docs.gitlab.com/runner/executors/docker.html#using-the-if-not-present-pull-policy)
-in an offline environment if you prefer using only locally available Docker images. However, we
-recommend keeping the pull policy setting to `always` if not in an offline environment, as this
-enables the use of updated scanners in your CI/CD pipelines.
+ The Secret Detection analyzer's image is [periodically updated](../index.md#vulnerability-scanner-maintenance)
+ so you may need to periodically update the local copy.
-### Make GitLab Secret Detection analyzer image available inside your Docker registry
+1. Set the CI/CD variable `SECURE_ANALYZERS_PREFIX` to the local Docker container registry.
-Import the following default Secret Detection analyzer images from `registry.gitlab.com` into your
-[local Docker container registry](../../packages/container_registry/index.md):
+ ```yaml
+ include:
+ - template: Security/Secret-Detection.gitlab-ci.yml
-```plaintext
-registry.gitlab.com/security-products/secret-detection:3
-```
+ variables:
+ SECURE_ANALYZERS_PREFIX: "localhost:5000/analyzers"
+ ```
-The process for importing Docker images into a local offline Docker registry depends on
-**your network security policy**. Please consult your IT staff to find an accepted and approved
-process by which external resources can be imported or temporarily accessed. These scanners are [periodically updated](../index.md#vulnerability-scanner-maintenance)
-with new definitions, and you may be able to make occasional updates on your own.
+The Secret Detection job should now use the local copy of the Secret Detection analyzer Docker
+image, without requiring internet access.
-For details on saving and transporting Docker images as a file, see Docker's documentation on
-[`docker save`](https://docs.docker.com/engine/reference/commandline/save/), [`docker load`](https://docs.docker.com/engine/reference/commandline/load/),
-[`docker export`](https://docs.docker.com/engine/reference/commandline/export/), and [`docker import`](https://docs.docker.com/engine/reference/commandline/import/).
+### Configure a custom Certificate Authority
-### Set Secret Detection CI/CD variables to use the local Secret Detection analyzer container image
+To trust a custom Certificate Authority, set the `ADDITIONAL_CA_CERT_BUNDLE` variable to the bundle
+of CA certificates that you trust. Do this either in the `.gitlab-ci.yml` file, in a file
+variable, or as a CI/CD variable.
-Add the following configuration to your `.gitlab-ci.yml` file. You must replace
-`SECURE_ANALYZERS_PREFIX` to refer to your local Docker container registry:
+- In the `.gitlab-ci.yml` file, the `ADDITIONAL_CA_CERT_BUNDLE` value must contain the
+ [text representation of the X.509 PEM public-key certificate](https://tools.ietf.org/html/rfc7468#section-5.1).
-```yaml
-include:
- - template: Security/Secret-Detection.gitlab-ci.yml
+ For example:
-variables:
- SECURE_ANALYZERS_PREFIX: "localhost:5000/analyzers"
-```
+ ```yaml
+ variables:
+ ADDITIONAL_CA_CERT_BUNDLE: |
+ -----BEGIN CERTIFICATE-----
+ MIIGqTCCBJGgAwIBAgIQI7AVxxVwg2kch4d56XNdDjANBgkqhkiG9w0BAQsFADCB
+ ...
+ jWgmPqF3vUbZE0EyScetPJquRFRKIesyJuBFMAs=
+ -----END CERTIFICATE-----
+ ```
-The Secret Detection job should now use the local copy of the Secret Detection analyzer Docker image to scan your code and generate
-security reports without requiring internet access.
+- If using a file variable, set the value of `ADDITIONAL_CA_CERT_BUNDLE` to the path to the
+ certificate.
-#### If support for Custom Certificate Authorities are needed
+- If using a variable, set the value of `ADDITIONAL_CA_CERT_BUNDLE` to the text
+ representation of the certificate.
-Support for custom certificate authorities was introduced in the following versions.
+## Troubleshooting
-| Analyzer | Version |
-| -------- | ------- |
-| secrets | [v3.0.0](https://gitlab.com/gitlab-org/security-products/analyzers/secrets/-/releases/v3.0.0) |
+### Set the logging level
-To trust a custom Certificate Authority, set the `ADDITIONAL_CA_CERT_BUNDLE` variable to the bundle
-of CA certs that you want to trust in the SAST environment. The `ADDITIONAL_CA_CERT_BUNDLE` value should contain the [text representation of the X.509 PEM public-key certificate](https://tools.ietf.org/html/rfc7468#section-5.1). For example, to configure this value in the `.gitlab-ci.yml` file, use the following:
+> [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/10880) in GitLab 13.1.
-```yaml
-variables:
- ADDITIONAL_CA_CERT_BUNDLE: |
- -----BEGIN CERTIFICATE-----
- MIIGqTCCBJGgAwIBAgIQI7AVxxVwg2kch4d56XNdDjANBgkqhkiG9w0BAQsFADCB
- ...
- jWgmPqF3vUbZE0EyScetPJquRFRKIesyJuBFMAs=
- -----END CERTIFICATE-----
-```
+Set the logging level to `debug` when you need diagnostic information in a Secret Detection job log.
-The `ADDITIONAL_CA_CERT_BUNDLE` value can also be configured as a [custom variable in the UI](../../../ci/variables/index.md#custom-cicd-variables), either as a `file`, which requires the path to the certificate, or as a variable, which requires the text representation of the certificate.
+WARNING:
+Debug logging can be a serious security risk. The output may contain the content of environment
+variables and other secrets available to the job. The output is uploaded to the GitLab server and
+visible in job logs.
-## Troubleshooting
+1. In the `.gitlab-ci.yml` file, set the `SECURE_LOG_LEVEL` CI/CD variable to `debug`, as shown in the
+   example after these steps.
+1. Run the Secret Detection job.
+1. Analyze the content of the Secret Detection job's log.
+1. In the `.gitlab-ci.yml` file, set the `SECURE_LOG_LEVEL` CI/CD variable back to `info` (the default).
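+
+A minimal `.gitlab-ci.yml` extract for step 1. Setting `SECURE_LOG_LEVEL` at the top level applies
+it to all security scanning jobs in the pipeline:
+
+```yaml
+variables:
+  SECURE_LOG_LEVEL: "debug"
+```
+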
-### Getting warning message `gl-secret-detection-report.json: no matching files`
+### Warning: `gl-secret-detection-report.json: no matching files`
For information on this, see the [general Application Security troubleshooting section](../../../ci/pipelines/job_artifacts.md#error-message-no-files-to-upload).
### Error: `Couldn't run the gitleaks command: exit status 2`
-If a pipeline is triggered from a merge request containing 60 commits while the `GIT_DEPTH` variable's
-value is less than that, the Secret Detection job fails as the clone is not deep enough to contain all of the
-relevant commits. For information on the current default value, see the
-[pipeline configuration documentation](../../../ci/pipelines/settings.md#limit-the-number-of-changes-fetched-during-clone).
+The Secret Detection analyzer relies on generating patches between commits to scan content for
+secrets. If the number of commits in a merge request is greater than the value of the
+[`GIT_DEPTH` CI/CD variable](../../../ci/runners/configure_runners.md#shallow-cloning), Secret
+Detection fails to detect secrets.
+
+For example, if a pipeline is triggered from a merge request containing 60 commits and the
+`GIT_DEPTH` variable's value is less than 60, the Secret Detection job fails as the clone is not
+deep enough to contain all of the relevant commits. To verify the current default value, see
+[pipeline configuration](../../../ci/pipelines/settings.md#limit-the-number-of-changes-fetched-during-clone).
-To confirm this as the cause of the error, set the
-[logging level](../../application_security/secret_detection/index.md#logging-level) to `debug`, then
+To confirm this as the cause of the error, set the [logging level](#set-the-logging-level) to `debug`, then
rerun the pipeline. The logs should look similar to the following example. The text "object not
found" is a symptom of this error.
@@ -476,11 +458,12 @@ secret_detection:
GIT_DEPTH: 100
```
-### `secret-detection` job fails with `ERR fatal: ambiguous argument` message
+### Error: `ERR fatal: ambiguous argument`
-Your `secret-detection` job can fail with `ERR fatal: ambiguous argument` error if your
-repository's default branch is unrelated to the branch the job was triggered for.
-See issue [!352014](https://gitlab.com/gitlab-org/gitlab/-/issues/352014) for more details.
+Secret Detection can fail with the error `ERR fatal: ambiguous argument` if your
+repository's default branch is unrelated to the branch the job was triggered for. See issue
+[352014](https://gitlab.com/gitlab-org/gitlab/-/issues/352014) for more details.
-To resolve the issue, make sure to correctly [set your default branch](../../project/repository/branches/default.md#change-the-default-branch-name-for-a-project) on your repository. You should set it to a branch
-that has related history with the branch you run the `secret-detection` job on.
+To resolve the issue, make sure to correctly [set your default branch](../../project/repository/branches/default.md#change-the-default-branch-name-for-a-project)
+on your repository. You should set it to a branch that has related history with the branch you run
+the `secret-detection` job on.
diff --git a/doc/user/application_security/security_dashboard/index.md b/doc/user/application_security/security_dashboard/index.md
index f3c834e06c7..967e8da58a9 100644
--- a/doc/user/application_security/security_dashboard/index.md
+++ b/doc/user/application_security/security_dashboard/index.md
@@ -1,6 +1,6 @@
---
type: reference, howto
-stage: Secure
+stage: Govern
group: Threat Insights
info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
---
@@ -49,12 +49,15 @@ To reduce false negatives in [dependency scans](../../../user/application_securi
> - [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/285477) in GitLab 13.11, date range slider to visualize data between given dates.
The project Security Dashboard shows the total number of vulnerabilities
-over time, with up to 365 days of historical data. Data refreshes daily at 01:15 UTC.
-It shows statistics for all vulnerabilities.
+over time, with up to 365 days of historical data. Data refresh begins daily at 01:15 UTC by a scheduled job.
+Each refresh captures a snapshot of open vulnerabilities. Data is not backported to prior days,
+so vulnerabilities opened after the job has already run for the day are not reflected in the
+counts until the following day's refresh.
+The project Security Dashboard shows statistics for all vulnerabilities with a current status of `Needs triage` or `Confirmed`.
To view total number of vulnerabilities over time:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Security Dashboard**.
1. Filter and search for what you need.
- To filter the chart by severity, select the legend name.
@@ -67,7 +70,7 @@ To view total number of vulnerabilities over time:
To download an SVG image of the vulnerabilities chart:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Security dashboard**.
1. Select **Save chart as an image** (**{download}**).
@@ -78,7 +81,7 @@ branches of projects in a group and its subgroups.
To view vulnerabilities over time for a group:
-1. On the top bar, select **Menu > Groups** and select a group.
+1. On the top bar, select **Main menu > Groups** and select a group.
1. Select **Security > Security Dashboard**.
1. Hover over the chart to get more details about vulnerabilities.
- You can display the vulnerability trends over a 30, 60, or 90-day time frame (the default is 90 days).
@@ -88,15 +91,16 @@ To view vulnerabilities over time for a group:
## View project security status for a group
-Use the group Security Dashboard to view the security status of projects. The security status is based
-on the number of detected vulnerabilities.
+Use the group Security Dashboard to view the security status of projects.
To view project security status for a group:
-1. On the top bar, select **Menu > Groups** and select a group.
+1. On the top bar, select **Main menu > Groups** and select a group.
1. Select **Security > Security Dashboard**.
-Projects are [graded](#project-vulnerability-grades) by vulnerability severity. Dismissed vulnerabilities are excluded.
+Each project is assigned a letter [grade](#project-vulnerability-grades) according to the highest-severity open vulnerability.
+Dismissed or resolved vulnerabilities are excluded. Each project can receive only one letter grade and appears only once
+in the Project security status report.
To view vulnerabilities, go to the group's [vulnerability report](../vulnerability_report/index.md).
@@ -127,13 +131,13 @@ The Security Center includes:
### View the Security Center
-To view the Security Center, on the top bar, select **Menu > Security**.
+To view the Security Center, on the top bar, select **Main menu > Security**.
### Add projects to the Security Center
To add projects to the Security Center:
-1. On the top bar, select **Menu > Security**.
+1. On the top bar, select **Main menu > Security**.
1. On the left sidebar, select **Settings**, or select **Add projects**.
1. Use the **Search your projects** text box to search for and select projects.
1. Select **Add projects**.
diff --git a/doc/user/application_security/vulnerabilities/index.md b/doc/user/application_security/vulnerabilities/index.md
index ad397c3fe04..91793272cce 100644
--- a/doc/user/application_security/vulnerabilities/index.md
+++ b/doc/user/application_security/vulnerabilities/index.md
@@ -1,5 +1,5 @@
---
-stage: Secure
+stage: Govern
group: Threat Insights
info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
---
@@ -46,14 +46,14 @@ are reintroduced and detected by subsequent scans have a _new_ vulnerability rec
existing vulnerability is no longer detected in a project's `default` branch, you should change its
status to **Resolved**. This ensures that if it is accidentally reintroduced in a future merge, it
is reported again as a new record. You can use the Vulnerability Report's
-[Activity filter](../vulnerability_report/#activity-filter) to select all vulnerabilities that are
+[Activity filter](../vulnerability_report/index.md#activity-filter) to select all vulnerabilities that are
no longer detected, and change their status.
## Change status of a vulnerability
To change a vulnerability's status from its Vulnerability Page:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Vulnerability report**.
1. Select the vulnerability's description.
1. From the **Status** dropdown list select a status, then select **Change status**.
@@ -77,7 +77,7 @@ that when Jira integration is enabled, the GitLab issue feature is not available
To create a GitLab issue for a vulnerability:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Vulnerability report**.
1. Select the vulnerability's description.
1. Select **Create issue**.
@@ -99,7 +99,7 @@ Prerequisites:
To create a Jira issue for a vulnerability:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Vulnerability report**.
1. Select the vulnerability's description.
1. Select **Create Jira issue**.
@@ -132,7 +132,7 @@ Be aware of the following conditions between a vulnerability and a linked issue:
To link a vulnerability to existing issues:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Vulnerability report**.
1. Select the vulnerability's description.
1. In the **Linked issues** section, select the plus icon (**{plus}**).
@@ -167,7 +167,7 @@ To resolve a vulnerability, you can either:
To resolve the vulnerability with a merge request:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Vulnerability report**.
1. Select the vulnerability's description.
1. From the **Resolve with merge request** dropdown list, select **Resolve with merge request**.
@@ -179,7 +179,7 @@ Process the merge request according to your standard workflow.
To manually apply the patch that GitLab generated for a vulnerability:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Vulnerability report**.
1. Select the vulnerability's description.
1. From the **Resolve with merge request** dropdown list, select **Download patch to resolve**.
@@ -197,7 +197,7 @@ Security training helps your developers learn how to fix vulnerabilities. Develo
To enable security training for vulnerabilities in your project:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Configuration**.
1. On the tab bar, select **Vulnerability Management**.
1. To enable a security training provider, turn on the toggle.
@@ -215,7 +215,7 @@ Vulnerabilities with a CWE are most likely to return a training result.
To view the security training for a vulnerability:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Vulnerability report**.
1. Select the vulnerability for which you want to view security training.
1. Select **View training**.
diff --git a/doc/user/application_security/vulnerabilities/severities.md b/doc/user/application_security/vulnerabilities/severities.md
index 987dac677e7..aed86cd93aa 100644
--- a/doc/user/application_security/vulnerabilities/severities.md
+++ b/doc/user/application_security/vulnerabilities/severities.md
@@ -1,6 +1,6 @@
---
type: reference
-stage: Secure
+stage: Govern
group: Threat Insights
info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
---
diff --git a/doc/user/application_security/vulnerability_report/index.md b/doc/user/application_security/vulnerability_report/index.md
index 15a287356f8..ba448d410ea 100644
--- a/doc/user/application_security/vulnerability_report/index.md
+++ b/doc/user/application_security/vulnerability_report/index.md
@@ -1,6 +1,6 @@
---
type: reference, howto
-stage: Secure
+stage: Govern
group: Threat Insights
info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
---
@@ -44,7 +44,7 @@ At the project level, the Vulnerability Report also contains:
To view the project-level vulnerability report:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Vulnerability report**.
## Vulnerability Report actions
@@ -57,6 +57,7 @@ From the Vulnerability Report you can:
- [View an issue raised for a vulnerability](#view-issues-raised-for-a-vulnerability).
- [Change the status of vulnerabilities](#change-status-of-vulnerabilities).
- [Export details of vulnerabilities](#export-vulnerability-details).
+- [Sort vulnerabilities by date detected](#sort-vulnerabilities-by-date-detected).
- [Manually add a vulnerability finding](#manually-add-a-vulnerability-finding).
## Vulnerability Report filters
@@ -186,6 +187,12 @@ Vulnerability records cannot be deleted, so a permanent record always remains.
If a vulnerability is dismissed in error, reverse the dismissal by changing its status.
+## Sort vulnerabilities by date detected
+
+By default, vulnerabilities are sorted by severity, with the highest-severity vulnerabilities listed first.
+
+To sort vulnerabilities by the date they were detected, select the **Detected** column header.
+
## Export vulnerability details
> - [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/213014) in the Security Center (previously known as the Instance Security Dashboard) and project-level Vulnerability Report (previously known as the Project Security Dashboard) in GitLab 13.0.
@@ -247,7 +254,7 @@ To undo this action, select a different status from the same menu.
To add a new vulnerability finding from your project level Vulnerability Report page:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **Security & Compliance > Vulnerability Report**.
1. Select **Submit Vulnerability**.
1. Complete the fields and submit the form.
diff --git a/doc/user/application_security/vulnerability_report/pipeline.md b/doc/user/application_security/vulnerability_report/pipeline.md
index 32916f4c9c7..7faf273515c 100644
--- a/doc/user/application_security/vulnerability_report/pipeline.md
+++ b/doc/user/application_security/vulnerability_report/pipeline.md
@@ -1,6 +1,6 @@
---
type: reference, howto
-stage: Secure
+stage: Govern
group: Threat Insights
info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
---
@@ -11,7 +11,7 @@ info: To determine the technical writer assigned to the Stage/Group associated w
To view vulnerabilities in a pipeline:
-1. On the top bar, select **Menu > Projects** and find your project.
+1. On the top bar, select **Main menu > Projects** and find your project.
1. On the left sidebar, select **CI/CD > Pipelines**.
1. From the list, select the pipeline you want to check for vulnerabilities.
1. Select the **Security** tab.
@@ -26,7 +26,7 @@ For example, if a pipeline contains DAST and SAST jobs, but the DAST job fails b
The pipeline vulnerability report only shows results contained in the security report artifacts. This report differs from
the [Vulnerability Report](index.md), which contains cumulative results of all successful jobs, and from the merge request
-[security widget](../#view-security-scan-information-in-merge-requests), which combines the branch results with
+[security widget](../index.md#view-security-scan-information-in-merge-requests), which combines the branch results with
cumulative results.
Before GitLab displays results, the vulnerability findings in all pipeline reports are [deduplicated](#deduplication-process).
@@ -35,7 +35,7 @@ Before GitLab displays results, the vulnerability findings in all pipeline repor
**Scan details** shows a summary of vulnerability findings in the pipeline and the source reports.
-GitLab displays one row of information for each [scan type](../terminology/#scan-type-report-type) artifact present in
+GitLab displays one row of information for each [scan type](../terminology/index.md#scan-type-report-type) artifact present in
the pipeline.
Note that each scan type's total number of vulnerabilities includes dismissed findings. If the number of findings
@@ -83,11 +83,11 @@ incorporated once the pipeline finishes.
When a pipeline contains jobs that produce multiple security reports of the same type, it is possible that the same
vulnerability finding is present in multiple reports. This duplication is common when different scanners are used to
-increase coverage. The deduplication process allows you to maximize the vulnerability scanning coverage while reducing
+increase coverage, but can also exist within a single report. The deduplication process allows you to maximize the vulnerability scanning coverage while reducing
the number of findings you need to manage.
-A finding is considered a duplicate of another finding when their [scan type](../terminology/#scan-type-report-type),
-[location](../terminology/#location-fingerprint) and
+A finding is considered a duplicate of another finding when their [scan type](../terminology/index.md#scan-type-report-type),
+[location](../terminology/index.md#location-fingerprint), and at least one of their
[identifiers](../../../development/integrations/secure.md#identifiers) are the same.
The scan type must match because each can have its own definition for the location of a vulnerability. For example,
@@ -95,8 +95,9 @@ static analyzers are able to locate a file path and line number, whereas a conta
name instead.
When comparing identifiers, GitLab does not compare `CWE` and `WASC` during deduplication because they are
-"type identifiers" and are used to classify groups of vulnerabilities. Including these identifiers results in
-many findings being incorrectly considered duplicates.
+"type identifiers" and are used to classify groups of vulnerabilities. Including these identifiers would result in
+many findings being incorrectly considered duplicates. Two findings are considered unique if none of their
+remaining identifiers match.
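+
+As a rough illustration of this comparison (not GitLab's actual implementation), a duplicate check could look like the
+following sketch, where the dictionary keys are assumed field names:
+
+```python
+# Illustrative sketch of the duplicate check described above; the field names
+# are assumptions, not GitLab's internal data model.
+TYPE_IDENTIFIER_PREFIXES = ("CWE-", "WASC-")
+
+
+def comparable_identifiers(finding):
+    """Identifiers used for deduplication, excluding type identifiers (CWE, WASC)."""
+    return {
+        identifier
+        for identifier in finding["identifiers"]
+        if not identifier.startswith(TYPE_IDENTIFIER_PREFIXES)
+    }
+
+
+def is_duplicate(finding, other):
+    """Findings are duplicates when scan type and location fingerprint match
+    and they share at least one non-type identifier."""
+    return (
+        finding["scan_type"] == other["scan_type"]
+        and finding["location_fingerprint"] == other["location_fingerprint"]
+        and bool(comparable_identifiers(finding) & comparable_identifiers(other))
+    )
+```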
In a set of duplicated findings, the first occurrence of a finding is kept and the remaining are skipped. Security
reports are processed in alphabetical file path order, and findings are processed sequentially in the order they
@@ -124,16 +125,17 @@ appear in a report.
- Location fingerprint: `adc83b19e793491b1c6ea0fd8b46cd9f32e592fc`
- Identifiers: CWE-798
- Deduplication result: duplicates because `CWE` identifiers are ignored.
-- Example 3: matching scan type, location and identifiers.
+- Example 3: matching scan type, location, and an identifier.
- Finding
- Scan type: `container_scanning`
- Location fingerprint: `adc83b19e793491b1c6ea0fd8b46cd9f32e592fc`
- - Identifiers: CVE-2022-25510, CWE-259
+ - Identifiers: CVE-2019-12345, CVE-2022-25510, CWE-259
- Other Finding
- Scan type: `container_scanning`
- Location fingerprint: `adc83b19e793491b1c6ea0fd8b46cd9f32e592fc`
- Identifiers: CVE-2022-25510, CWE-798
- Deduplication result: duplicates because all criteria match, and type identifiers are ignored.
+  Only one identifier needs to match; in this case, CVE-2022-25510.
The examples above don't include the raw location values. Each scan type defines its own
`fingerprint_data`, which is used to generate a `SHA1` hash that is used as the `location_fingerprint`.
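+
+As an illustration of that last step, a location fingerprint could be computed like this; the concrete
+`fingerprint_data` values per scan type are assumptions chosen only to mirror the file path versus image/package
+distinction described above:
+
+```python
+# Illustrative sketch: each analyzer defines its own fingerprint_data, so the
+# inputs below are assumptions, not the exact values GitLab uses.
+import hashlib
+
+
+def location_fingerprint(fingerprint_data: str) -> str:
+    """SHA1 hash of the scan-type-specific fingerprint data."""
+    return hashlib.sha1(fingerprint_data.encode("utf-8")).hexdigest()
+
+
+# A static analysis location might combine a file path and line number, while a
+# container scanning location might use the image and package name instead.
+print(location_fingerprint("src/main.c:42"))
+print(location_fingerprint("alpine:3.17:openssl"))
+```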