author    Kamil Trzciński <ayufan@ayufan.eu> 2018-06-22 14:44:37 +0000
committer Kamil Trzciński <ayufan@ayufan.eu> 2018-06-22 14:44:37 +0000
commit    afb1ee87a7ff44ba644db7460334d2b648f6dea0 (patch)
tree      a4ebade2b7183887f3b9d257a0dd806e4e15dbcb
parent    69e54faa4e12de7c81b2b4d493286963a0614faf (diff)
parent    10ff1632b653401f934d1a53a4627ee881d71a8d (diff)
Merge branch 'patch-29' into 'master'
Update lfs_administration.md with language edits

See merge request gitlab-org/gitlab-ce!19950
-rw-r--r-- doc/workflow/lfs/lfs_administration.md | 52
1 file changed, 26 insertions(+), 26 deletions(-)
diff --git a/doc/workflow/lfs/lfs_administration.md b/doc/workflow/lfs/lfs_administration.md
index ba1e4e55d5b..8a2f230f505 100644
--- a/doc/workflow/lfs/lfs_administration.md
+++ b/doc/workflow/lfs/lfs_administration.md
@@ -17,7 +17,7 @@ There are various configuration options to help GitLab server administrators:
* Enabling/disabling Git LFS support
* Changing the location of LFS object storage
-* Setting up an object storage supported by [Fog](http://fog.io/about/provider_documentation.html)
+* Setting up object storage supported by [Fog](http://fog.io/about/provider_documentation.html)
### Configuration for Omnibus installations
@@ -44,31 +44,31 @@ In `config/gitlab.yml`:
storage_path: /mnt/storage/lfs-objects
```
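For Omnibus installations, the counterpart of the `gitlab.yml` snippet above lives in `/etc/gitlab/gitlab.rb`. A minimal sketch, assuming the `lfs_storage_path` setting name used by Omnibus packages of this era (verify against your version's documentation):

```ruby
# /etc/gitlab/gitlab.rb
gitlab_rails['lfs_enabled'] = true
# Store LFS objects on a mounted volume instead of the default path.
# Setting name assumed from Omnibus conventions; check your version's docs.
gitlab_rails['lfs_storage_path'] = "/mnt/storage/lfs-objects"
```

Run `sudo gitlab-ctl reconfigure` for the change to take effect.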
-## Storing LFS objects to an object storage
+## Storing LFS objects in remote object storage
> [Introduced][ee-2760] in [GitLab Premium][eep] 10.0. Brought to GitLab Core
in 10.7.
-It is possible to store LFS objects to a remote object storage which allows you
-to offload R/W operation on local hard disk and freed up disk space significantly.
-You can check which object storage can be integrated with GitLab [here](http://fog.io/about/provider_documentation.html)
-(Since GitLab is tightly integrated with `Fog`, you can refer the documentation)
-You can also use an object storage in a private local network. For example,
-[Minio](https://www.minio.io/) is standalone object storage, easy to setup, and works well with GitLab instance.
+It is possible to store LFS objects in remote object storage, which allows you
+to offload local hard disk R/W operations and free up disk space significantly.
+GitLab is tightly integrated with `Fog`, so you can refer to its [documentation](http://fog.io/about/provider_documentation.html)
+to check which storage services can be integrated with GitLab.
+You can also use external object storage in a private local network. For example,
+[Minio](https://www.minio.io/) is a standalone object storage service that is easy to set up and works well with GitLab instances.
-GitLab provides two different options as the uploading mechanizm. One is "Direct upload", and another one is "Background upload".
+GitLab provides two different options for the upload mechanism: "Direct upload" and "Background upload" (see the configuration sketch after these lists).
**Option 1. Direct upload**
-1. User pushes a lfs file to the GitLab instance
-1. GitLab-workhorse uploads the file to the object storage
-1. GitLab-workhorse notifies to GitLab-rails that the uploading process is done
+1. User pushes an LFS file to the GitLab instance
+1. GitLab-workhorse uploads the file directly to the external object storage
+1. GitLab-workhorse notifies GitLab-rails that the upload process is complete
**Option 2. Background upload**
-1. User pushes a lfs file to the GitLab instance
-1. GitLab-rails stores the file to the local files storage
-1. GitLab-rails uploads the file to object storage asynchronously
+1. User pushes an LFS file to the GitLab instance
+1. GitLab-rails stores the file in the local file storage
+1. GitLab-rails then uploads the file to the external object storage asynchronously
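Which mechanism is used is controlled by the object storage settings. A hedged `/etc/gitlab/gitlab.rb` sketch, assuming the `lfs_object_store_*` setting names used by Omnibus packages in this era (check your version's documentation for the exact keys):

```ruby
# /etc/gitlab/gitlab.rb -- setting names assumed, not verbatim from this doc
gitlab_rails['lfs_object_store_enabled'] = true
# Option 1: Workhorse streams uploads straight to object storage
gitlab_rails['lfs_object_store_direct_upload'] = true
# Option 2: files land on local disk first, then move asynchronously
gitlab_rails['lfs_object_store_background_upload'] = false
```

When direct upload is enabled, the background mechanism is not used; treat the two as alternatives rather than complements.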
The following general settings are supported.
@@ -83,7 +83,7 @@ The following general settings are supported.
The `connection` settings match those provided by [Fog](https://github.com/fog).
-Here is the configuration example with S3.
+Here is a configuration example with S3.
| Setting | Description | example |
|---------|-------------|---------|
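The rows of this table are truncated by the hunk above. For illustration, a hypothetical Omnibus sketch of an S3 `connection` hash built from standard Fog/AWS keys (the region and credentials are placeholders, not real values):

```ruby
# /etc/gitlab/gitlab.rb -- key names assumed from Fog's AWS provider
gitlab_rails['lfs_object_store_remote_directory'] = "lfs-objects"
gitlab_rails['lfs_object_store_connection'] = {
  'provider' => 'AWS',               # Fog provider name
  'region' => 'eu-central-1',        # placeholder region
  'aws_access_key_id' => 'AKIA-PLACEHOLDER',       # placeholder credential
  'aws_secret_access_key' => 'SECRET-PLACEHOLDER'  # placeholder credential
}
```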
@@ -101,14 +101,14 @@ Here is a configuration example with GCS.
|---------|-------------|---------|
| `provider` | The provider name | `Google` |
| `google_project` | GCP project name | `gcp-project-12345` |
-| `google_client_email` | The email address of a service account | `foo@gcp-project-12345.iam.gserviceaccount.com` |
-| `google_json_key_location` | The json key path to the | `/path/to/gcp-project-12345-abcde.json` |
+| `google_client_email` | The email address of the service account | `foo@gcp-project-12345.iam.gserviceaccount.com` |
+| `google_json_key_location` | The path to the JSON key file | `/path/to/gcp-project-12345-abcde.json` |
-_NOTE: Service account must have a permission to access the bucket. See more https://cloud.google.com/storage/docs/authentication_
+_NOTE: The service account must have permission to access the bucket. [See more](https://cloud.google.com/storage/docs/authentication)_
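Mapping that table onto an Omnibus `connection` hash gives roughly the following. A sketch assuming the `lfs_object_store_connection` key of this era; the values are the placeholders from the table:

```ruby
# /etc/gitlab/gitlab.rb -- values taken from the GCS table above
gitlab_rails['lfs_object_store_connection'] = {
  'provider' => 'Google',
  'google_project' => 'gcp-project-12345',
  'google_client_email' => 'foo@gcp-project-12345.iam.gserviceaccount.com',
  'google_json_key_location' => '/path/to/gcp-project-12345-abcde.json'
}
```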
### Manual uploading to an object storage
-There are two ways to do the same thing with automatic uploading which described above.
+There are two ways to manually achieve the same result as the automatic uploading described above.
**Option 1: rake task**
@@ -204,15 +204,15 @@ and [projects APIs](../../api/projects.md).
## Troubleshooting: `Google::Apis::TransmissionError: execution expired`
-If LFS integration is configred with Google Cloud Storage and background upload (`background_upload: true` and `direct_upload: false`)
-sidekiq workers may encouter this error. This is because uploading timed out by huge files.
-For the record, upto 6GB lfs files can be uploaded without any extra steps, otherwise you need the following workaround.
+If LFS integration is configured with Google Cloud Storage and background uploads (`background_upload: true` and `direct_upload: false`),
+Sidekiq workers may encounter this error. This is because the upload of very large files times out.
+LFS files up to 6GB can be uploaded without any extra steps; for larger files you need to use the following workaround.
```shell
$ sudo gitlab-rails console # Login to rails console
-> # Setup timeouts. 20 minutes is enough to upload 30GB LFS files.
-> # Those settings are only effective in the same session, i.e. Those are not effective in sidekiq workers.
+> # Set up timeouts. 20 minutes is enough to upload 30GB LFS files.
+> # These settings are only in effect for the current session, i.e. they do not apply to Sidekiq workers.
> ::Google::Apis::ClientOptions.default.open_timeout_sec = 1200
> ::Google::Apis::ClientOptions.default.read_timeout_sec = 1200
> ::Google::Apis::ClientOptions.default.send_timeout_sec = 1200
@@ -223,7 +223,7 @@ $ sudo gitlab-rails console # Login to rails console
> end
```
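The middle of that console session (between the timeout settings and the closing `end`) is elided by the hunk boundary. For orientation only, a hypothetical sketch of what such a migration loop looks like; the model and method names are assumptions, not the verbatim elided code:

```ruby
# Hypothetical sketch: iterate over LFS objects still stored locally and
# push each one to the remote object store within the raised timeouts.
LfsObject.where(file_store: [nil, 1]).find_each do |lfs_object|
  lfs_object.file.migrate!(ObjectStorage::Store::REMOTE)
end
```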
-See more information in https://gitlab.com/gitlab-org/gitlab-ce/merge_requests/19581
+See more information in [!19581](https://gitlab.com/gitlab-org/gitlab-ce/merge_requests/19581)
## Known limitations