push/pull: roll back common problems, add troubleshooting guide
pared committed Dec 19, 2019
1 parent c5b4c71 commit 350b2db
Showing 3 changed files with 17 additions and 26 deletions.
13 changes: 0 additions & 13 deletions static/docs/command-reference/pull.md
@@ -100,19 +100,6 @@ reflinks or hardlinks to put it in the workspace without copying. See

- `-v`, `--verbose` - displays detailed tracing information.

## Common problems

- `[Errno 24] Too many open files` - most common for macOS users who work with
  an S3 remote. Each download thread needs to open one or more file
  descriptors. The higher the `--jobs` value, the more file descriptors need to
  be open on the host filesystem. If too many file descriptors are already
  open, the host filesystem will not allow opening new ones. The recommended
  way to handle this error is to increase the open file descriptor limit. On
  UNIX systems this can be done with `ulimit -n`. On Windows, please refer to
  [official resources](https://blogs.technet.microsoft.com/markrussinovich/2009/09/29/pushing-the-limits-of-windows-handles/).
  If changing the limit is not possible, a workaround is to use a lower
  `--jobs` value.

## Examples

For using the `dvc pull` command, a remote storage must be defined. (See
13 changes: 0 additions & 13 deletions static/docs/command-reference/push.md
@@ -104,19 +104,6 @@ to push.

- `-v`, `--verbose` - displays detailed tracing information.

## Common problems

- `[Errno 24] Too many open files` - most common for macOS users who work with
  an S3 remote. Each upload thread needs to open one or more file descriptors.
  The higher the `--jobs` value, the more file descriptors need to be open on
  the host filesystem. If too many file descriptors are already open, the host
  filesystem will not allow opening new ones. The recommended way to handle
  this error is to increase the open file descriptor limit. On UNIX systems
  this can be done with `ulimit -n`. On Windows, please refer to
  [official resources](https://blogs.technet.microsoft.com/markrussinovich/2009/09/29/pushing-the-limits-of-windows-handles/).
  If changing the limit is not possible, a workaround is to use a lower
  `--jobs` value.

## Examples

For using the `dvc push` command, a remote storage must be defined. (See
17 changes: 17 additions & 0 deletions static/docs/user-guide/troubleshooting.md
@@ -0,0 +1,17 @@
# Troubleshooting

In this section we cover some of the known issues that a DVC user might
stumble upon.

## Too many open files error

A known problem some users run into with the `dvc pull`, `dvc fetch`, and
`dvc push` commands is `[Errno 24] Too many open files` (most common for S3
remotes on macOS). Each download or upload thread needs one or more open file
descriptors, so the higher the `--jobs` value, the more file descriptors are
open on the host file system, and the limit may be reached, causing this
error.

To solve this, it's often possible to increase the open file descriptor limit,
for example with `ulimit -n` on UNIX-like systems, or by
[increasing the Handles limit](https://blogs.technet.microsoft.com/markrussinovich/2009/09/29/pushing-the-limits-of-windows-handles/)
on Windows. Otherwise, please try using a lower `--jobs` value.
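On UNIX-like systems, the steps above might look like the following sketch.
The soft limit can be raised up to the hard limit without root; the example
`dvc pull --jobs 4` at the end is illustrative, not a required value:

```shell
# Show the current soft limit on open file descriptors
ulimit -Sn

# Raise the soft limit as far as the hard limit allows
# (the hard limit itself can only be raised by root)
hard=$(ulimit -Hn)
ulimit -Sn "$hard"
ulimit -Sn   # verify the new limit took effect

# If raising the limit isn't possible, retry the transfer with
# fewer parallel jobs instead, e.g.:
#   dvc pull --jobs 4
```

Note that `ulimit` changes only apply to the current shell session; to make
them persistent, consult your operating system's documentation.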
