| author | Marcel Amirault <ravlen@gmail.com> | 2019-05-05 15:06:37 +0000 |
|---|---|---|
| committer | Achilleas Pipinellis <axil@gitlab.com> | 2019-05-05 15:06:37 +0000 |
| commit | 339d0a5b9259a27da77ef090c46d180314f624aa (patch) | |
| tree | abbb6fc68272f0c0bafac7c30bd3790e540dc24b /doc/ci/metrics_reports.md | |
| parent | 4eac38d48ca4731e91b9d35ef2ac4e90214f4e59 (diff) | |
| download | gitlab-ce-339d0a5b9259a27da77ef090c46d180314f624aa.tar.gz | |
Docs: Merge EE doc/ci to CE
Diffstat (limited to 'doc/ci/metrics_reports.md')
-rw-r--r-- | doc/ci/metrics_reports.md | 39 |
1 file changed, 39 insertions(+), 0 deletions(-)
diff --git a/doc/ci/metrics_reports.md b/doc/ci/metrics_reports.md
new file mode 100644
index 00000000000..36e7c82cc3a
--- /dev/null
+++ b/doc/ci/metrics_reports.md
@@ -0,0 +1,39 @@
+# Metrics Reports **[PREMIUM]**
+
+> [Introduced](https://gitlab.com/gitlab-org/gitlab-ee/issues/9788) in [GitLab Premium](https://about.gitlab.com/pricing) 11.10.
+Requires GitLab Runner 11.10 and above.
+
+## Overview
+
+GitLab provides a range of reporting tools for [merge requests](../user/project/merge_requests/index.md), such as [JUnit reports](./junit_test_reports.md), [code quality](https://docs.gitlab.com/ee/user/project/merge_requests/code_quality.html), and performance tests. While JUnit is a great open framework for tests that "pass" or "fail", it is also important to surface other types of metrics from a given change.
+
+You can configure your job to use custom Metrics Reports, and GitLab will display a report on the merge request so that changes are easier and faster to identify, without having to check the entire log.
+
+![Metrics Reports](./img/metrics_reports.png)
+
+## Use cases
+
+Consider the following examples of data that can use Metrics Reports:
+
+1. Memory usage
+1. Load testing results
+1. Code complexity
+1. Code coverage stats
+
+## How it works
+
+Metrics are read from the metrics report (default: `metrics.txt`). They are parsed and displayed in the merge request widget.
+
+## How to set it up
+
+Add a job that creates a [metrics report](yaml/README.md#artifactsreportsmetrics-premium) (default filename: `metrics.txt`). The file must conform to the [OpenMetrics](https://openmetrics.io/) format.
+
+For example:
+
+```yaml
+metrics:
+  script:
+    - echo 'metric_name metric_value' > metrics.txt
+  artifacts:
+    reports:
+      metrics: metrics.txt
+```
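
The `script` step in the job above can emit any number of metrics, as long as the file follows the OpenMetrics text format. As a rough sketch, such a script step could look like the following; the metric names and values here are illustrative examples, not ones defined by GitLab:

```shell
# Write a metrics file in OpenMetrics text format.
# Each metric is one "name value" line; "# TYPE" comment lines are optional
# metadata. All names and values below are made up for illustration.
{
  echo '# TYPE memory_usage_bytes gauge'
  echo 'memory_usage_bytes 6508544'
  echo '# TYPE code_complexity gauge'
  echo 'code_complexity 12'
} > metrics.txt
cat metrics.txt
```

Pointing `artifacts:reports:metrics` at the resulting `metrics.txt` would then surface both values in the merge request widget, where they can be compared against the target branch.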