author    Marcel Amirault <ravlen@gmail.com>  2018-06-18 08:12:25 +0000
committer Marcel Amirault <ravlen@gmail.com>  2018-06-18 08:12:25 +0000
commit    10ff1632b653401f934d1a53a4627ee881d71a8d (patch)
tree      53b29a8a6bd5170eb3adbc24e46bd9853ac35ba5 /doc/workflow
parent    ad5e4469ecdee93e2d064e00ec463c702d47822d (diff)
Update lfs_administration.md with language edits
Diffstat (limited to 'doc/workflow')
-rw-r--r--  doc/workflow/lfs/lfs_administration.md | 52
1 file changed, 26 insertions(+), 26 deletions(-)
diff --git a/doc/workflow/lfs/lfs_administration.md b/doc/workflow/lfs/lfs_administration.md
index ba1e4e55d5b..8a2f230f505 100644
--- a/doc/workflow/lfs/lfs_administration.md
+++ b/doc/workflow/lfs/lfs_administration.md
@@ -17,7 +17,7 @@ There are various configuration options to help GitLab server administrators:
* Enabling/disabling Git LFS support
* Changing the location of LFS object storage
-* Setting up an object storage supported by [Fog](http://fog.io/about/provider_documentation.html)
+* Setting up object storage supported by [Fog](http://fog.io/about/provider_documentation.html)
### Configuration for Omnibus installations
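For Omnibus installations, the equivalent settings live in `/etc/gitlab/gitlab.rb`. A minimal sketch, with illustrative values rather than a definitive configuration:

```ruby
# /etc/gitlab/gitlab.rb -- illustrative values only
gitlab_rails['lfs_enabled'] = true                             # enable/disable Git LFS support
gitlab_rails['lfs_storage_path'] = "/mnt/storage/lfs-objects"  # change the local storage location
```

Run `sudo gitlab-ctl reconfigure` afterwards for the settings to take effect.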
@@ -44,31 +44,31 @@ In `config/gitlab.yml`:
storage_path: /mnt/storage/lfs-objects
```
-## Storing LFS objects to an object storage
+## Storing LFS objects in remote object storage
> [Introduced][ee-2760] in [GitLab Premium][eep] 10.0. Brought to GitLab Core
in 10.7.
-It is possible to store LFS objects to a remote object storage which allows you
-to offload R/W operation on local hard disk and freed up disk space significantly.
-You can check which object storage can be integrated with GitLab [here](http://fog.io/about/provider_documentation.html)
-(Since GitLab is tightly integrated with `Fog`, you can refer the documentation)
-You can also use an object storage in a private local network. For example,
-[Minio](https://www.minio.io/) is standalone object storage, easy to setup, and works well with GitLab instance.
+It is possible to store LFS objects in remote object storage, which allows you
+to offload local hard disk R/W operations and free up disk space significantly.
+GitLab is tightly integrated with `Fog`, so you can refer to its [documentation](http://fog.io/about/provider_documentation.html)
+to check which storage services can be integrated with GitLab.
+You can also use external object storage in a private local network. For example,
+[Minio](https://www.minio.io/) is a standalone object storage service that is easy to set up and works well with GitLab instances.
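As a sketch of the Minio case: because Minio speaks the S3 protocol, it can be reached through Fog's `AWS` provider with a custom endpoint. The keys, bucket, and endpoint below are placeholders, not recommended values:

```yaml
# config/gitlab.yml -- illustrative Minio connection via the S3-compatible API
lfs:
  object_store:
    enabled: true
    remote_directory: lfs-objects         # bucket created on the Minio server
    connection:
      provider: AWS                       # Minio is addressed as an S3-compatible store
      aws_access_key_id: minio-access-key
      aws_secret_access_key: minio-secret-key
      region: us-east-1
      endpoint: 'http://127.0.0.1:9000'   # your Minio endpoint
      path_style: true                    # use path-style URLs, as Minio expects
```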
-GitLab provides two different options as the uploading mechanizm. One is "Direct upload", and another one is "Background upload".
+GitLab provides two different options for the uploading mechanism: "Direct upload" and "Background upload".
**Option 1. Direct upload**
-1. User pushes a lfs file to the GitLab instance
-1. GitLab-workhorse uploads the file to the object storage
-1. GitLab-workhorse notifies to GitLab-rails that the uploading process is done
+1. User pushes an LFS file to the GitLab instance
+1. GitLab-workhorse uploads the file directly to the external object storage
+1. GitLab-workhorse notifies GitLab-rails that the upload process is complete
**Option 2. Background upload**
-1. User pushes a lfs file to the GitLab instance
-1. GitLab-rails stores the file to the local files storage
-1. GitLab-rails uploads the file to object storage asynchronously
+1. User pushes an LFS file to the GitLab instance
+1. GitLab-rails stores the file in the local file storage
+1. GitLab-rails then uploads the file to the external object storage asynchronously
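Which mechanism is used is controlled by two flags in the object storage settings. A minimal sketch for `config/gitlab.yml`, assuming the flag names from the general settings below:

```yaml
# config/gitlab.yml -- pick one of the two mechanisms
lfs:
  object_store:
    enabled: true
    direct_upload: true        # Option 1: workhorse uploads straight to object storage
    background_upload: false   # Option 2: invert both flags to upload asynchronously
```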
The following general settings are supported.
@@ -83,7 +83,7 @@ The following general settings are supported.
The `connection` settings match those provided by [Fog](https://github.com/fog).
-Here is the configuration example with S3.
+Here is a configuration example with S3.
| Setting | Description | Example |
|---------|-------------|---------|
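Assembled into `config/gitlab.yml`, an S3 connection might look like the following sketch (region, bucket, and credentials are placeholders):

```yaml
# config/gitlab.yml -- illustrative S3 connection
lfs:
  object_store:
    enabled: true
    remote_directory: lfs-objects   # S3 bucket name
    connection:
      provider: AWS
      region: eu-central-1
      aws_access_key_id: AKIAXXXXXXXXXXXXXXXX
      aws_secret_access_key: secret-access-key-placeholder
```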
@@ -101,14 +101,14 @@ Here is a configuration example with GCS.
|---------|-------------|---------|
| `provider` | The provider name | `Google` |
| `google_project` | GCP project name | `gcp-project-12345` |
-| `google_client_email` | The email address of a service account | `foo@gcp-project-12345.iam.gserviceaccount.com` |
-| `google_json_key_location` | The json key path to the | `/path/to/gcp-project-12345-abcde.json` |
+| `google_client_email` | The email address of the service account | `foo@gcp-project-12345.iam.gserviceaccount.com` |
+| `google_json_key_location` | The path to the JSON key | `/path/to/gcp-project-12345-abcde.json` |
-_NOTE: Service account must have a permission to access the bucket. See more https://cloud.google.com/storage/docs/authentication_
+_NOTE: The service account must have permission to access the bucket. [See more](https://cloud.google.com/storage/docs/authentication)_
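Using the values from the table above, a GCS connection in `config/gitlab.yml` might look like this sketch:

```yaml
# config/gitlab.yml -- illustrative GCS connection, values mirror the table above
lfs:
  object_store:
    enabled: true
    remote_directory: lfs-objects   # GCS bucket name
    connection:
      provider: Google
      google_project: gcp-project-12345
      google_client_email: foo@gcp-project-12345.iam.gserviceaccount.com
      google_json_key_location: /path/to/gcp-project-12345-abcde.json
```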
### Manual uploading to object storage
-There are two ways to do the same thing with automatic uploading which described above.
+There are two ways to manually do the same thing as the automatic uploading described above.
**Option 1: rake task**
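A sketch of the rake invocation, assuming the `gitlab:lfs:migrate` task that accompanies this feature:

```shell
# Omnibus installations: migrate existing local LFS objects to object storage
$ sudo gitlab-rake gitlab:lfs:migrate

# Source installations: run the task through bundler
$ sudo -u git -H bundle exec rake gitlab:lfs:migrate RAILS_ENV=production
```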
@@ -204,15 +204,15 @@ and [projects APIs](../../api/projects.md).
## Troubleshooting: `Google::Apis::TransmissionError: execution expired`
-If LFS integration is configred with Google Cloud Storage and background upload (`background_upload: true` and `direct_upload: false`)
-sidekiq workers may encouter this error. This is because uploading timed out by huge files.
-For the record, upto 6GB lfs files can be uploaded without any extra steps, otherwise you need the following workaround.
+If LFS integration is configured with Google Cloud Storage and background uploads (`background_upload: true` and `direct_upload: false`),
+Sidekiq workers may encounter this error. This is because the upload times out for very large files.
+LFS files up to 6GB can be uploaded without any extra steps; otherwise, you need to use the following workaround.
```shell
$ sudo gitlab-rails console # Log in to the Rails console
-> # Setup timeouts. 20 minutes is enough to upload 30GB LFS files.
-> # Those settings are only effective in the same session, i.e. Those are not effective in sidekiq workers.
+> # Set up timeouts. 20 minutes is enough to upload 30GB LFS files.
+> # These settings only apply to the current session, i.e. they are not picked up by Sidekiq workers.
> ::Google::Apis::ClientOptions.default.open_timeout_sec = 1200
> ::Google::Apis::ClientOptions.default.read_timeout_sec = 1200
> ::Google::Apis::ClientOptions.default.send_timeout_sec = 1200
@@ -223,7 +223,7 @@ $ sudo gitlab-rails console # Login to rails console
> end
```
-See more information in https://gitlab.com/gitlab-org/gitlab-ce/merge_requests/19581
+See more information in [!19581](https://gitlab.com/gitlab-org/gitlab-ce/merge_requests/19581)
## Known limitations