some crates take long enough to make docs.rs look stuck #335

Closed
dekellum opened this issue Apr 24, 2019 · 7 comments
Labels
A-builds Area: Building the documentation for a crate

Comments

@dekellum

Noticed this after waiting 4 hours for a crate update to build on docs.rs. The docs.rs queue has remained unchanged, the most recent docs are for stm32f7 (4 hours ago), and the last failure is riot-sys-0.2.2 (5 hours ago).

https://docs.rs/releases/queue

Queue

  1. stm32f4-0.7.0
  2. stm32f3-0.7.0
  3. stm32f2-0.7.0
  4. stm32f1-0.7.0
  5. stm32f0-0.7.0

Meanwhile, crates.io has many updated crates since these:

fakecargo (0.2.0)
easy_reader (0.4.0)
security-framework (0.3.1)
security-framework-sys (0.3.1)
twapi (0.4.3)
oauth1-request-derive (0.2.1)
http_req (0.4.9)
esplugin-ffi (2.1.2)
esplugin (2.1.2)
flo_binding (0.1.2)
...

Sampling suggests these haven't been built by docs.rs yet. Should they not at least have appeared in the queue by now?

@QuietMisdreavus
Member

The queue has lagged behind because the rapid release of many stm32 crates has slowed the builder down: each of these crates takes so long to build and add to our database that everything else falls behind.

The same thread both loads crates onto the queue and builds them, so if a particular section of the queue is taking a while, nothing else will get loaded until it's gone through its backlog. In fact, if you look now, you'll see that the queue has changed, and now the builder is stuck on more stm crates, with more in the backlog.
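
Roughly, the builder loop works like this (a simplified sketch for illustration only, with made-up types and helpers, not the actual docs.rs code):

```rust
use std::collections::VecDeque;

// Made-up types and helpers, just to show the shape of the problem.
struct Release {
    name: String,
    version: String,
}

fn fetch_new_releases() -> Vec<Release> {
    // Would poll the crates.io index for new publishes; stubbed out here.
    Vec::new()
}

fn build_docs(release: &Release) {
    // Would run rustdoc and store the output; for crates like stm32f4,
    // a single call here can take hours.
    println!("building {} {}", release.name, release.version);
}

fn main() {
    let mut queue: VecDeque<Release> = VecDeque::new();
    loop {
        // Step 1: load newly published crates onto the queue.
        queue.extend(fetch_new_releases());

        // Step 2: build the next crate. Because steps 1 and 2 share one
        // thread, a multi-hour build keeps step 1 from running again, so
        // the /releases/queue page looks frozen even though nothing failed.
        if let Some(release) = queue.pop_front() {
            build_docs(&release);
        }
    }
}
```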

@dekellum dekellum changed the title docs.rs is stuck or lagging? some crates take long enough to make docs.rs look stuck Apr 25, 2019
@dekellum
Author

Out of curiosity, I downloaded stm32f4 0.7.0 from crates.io, un-tar'd and ran:

cargo doc --features "rt, stm32f401, stm32f407, stm32f413, stm32f469" --no-deps

...per the selected features for docs.rs. Sure enough, it takes ~20 minutes on one CPU and wants 6.1 GB of RAM to produce 1.1 GB of docs. Looks like the project is aware of the issue: stm32-rs/stm32-rs#3

Should individual doc build jobs be canceled and reported as an error after some more reasonable time period?
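
For example (just a sketch of the idea; the crate path and the 30-minute limit are placeholders, and this is not how docs.rs is actually implemented), the doc build could run as a child process that gets killed once it exceeds a deadline:

```rust
use std::process::{Child, Command};
use std::thread::sleep;
use std::time::{Duration, Instant};

// Spawn `cargo doc` for one crate and kill it if it runs past `timeout`.
fn build_with_timeout(crate_dir: &str, timeout: Duration) -> std::io::Result<bool> {
    let mut child: Child = Command::new("cargo")
        .arg("doc")
        .arg("--no-deps")
        .current_dir(crate_dir)
        .spawn()?;

    let start = Instant::now();
    loop {
        // try_wait() returns immediately: Some(status) once the child exits.
        if let Some(status) = child.try_wait()? {
            return Ok(status.success());
        }
        if start.elapsed() > timeout {
            // Deadline exceeded: kill the build and report it as a failure.
            child.kill()?;
            child.wait()?; // reap the killed process
            return Ok(false);
        }
        sleep(Duration::from_secs(5));
    }
}

fn main() -> std::io::Result<()> {
    // Placeholder directory and limit.
    let ok = build_with_timeout("./stm32f4-0.7.0", Duration::from_secs(30 * 60))?;
    println!("build succeeded: {ok}");
    Ok(())
}
```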

The queue page has become hard to find; I think it was previously linked somewhere. Whether or not there is much advantage to having a separate thread poll for crate updates and push them onto the queue, some reporting improvements might include:

  • the current crate being processed and start timestamp or duration

  • the timestamp of, or duration since, when the queue was last populated

This way I'd have been able to conclude that there wasn't some transient failure with my crate update.

@QuietMisdreavus
Member

In case anyone finds this thread again today, the queue is currently stuck thanks to stm32ral, which has taken a few hours to build so far and looks like it will take a couple more. I'll start working on #343 soon, which should help out this issue some. I worry that the only real "fix" for this issue is either to significantly optimize how we do file storage, add a unified file format to rustdoc itself, or find some other way to solve the fact that each build of this crate generates over a million files.

@dekellum
Author

dekellum commented May 2, 2019

generates over a million files

Oh, wow! Isn't this a case of a few rare and exceptional crates interfering too regularly with steady operation for normal crates? If there were a timeout mechanism (and a way to cancel the build), wouldn't that be fairer to the majority of crates, making docs.rs more stable? With that in place, another potential accommodation would be to schedule a retry with a much longer timeout at some off-peak time, assuming there is an off-peak?

@dekellum
Author

dekellum commented May 2, 2019

...or scheduling a retry with a longer timeout at a lower priority, as per your #344.
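
Sketching that idea (invented types and numbers, only to illustrate the policy; the real docs.rs queue lives in a database and looks nothing like this): a build that hits its timeout could go back onto the queue with a lower priority and a much longer limit:

```rust
use std::cmp::Reverse;
use std::collections::BinaryHeap;

// Invented representation of a queued build.
#[derive(PartialEq, Eq, PartialOrd, Ord)]
struct QueuedBuild {
    priority: u32, // lower number = built sooner
    name: String,
    timeout_secs: u64,
}

// Re-queue a build that hit its timeout: lower priority, longer limit.
fn requeue_after_timeout(queue: &mut BinaryHeap<Reverse<QueuedBuild>>, build: QueuedBuild) {
    queue.push(Reverse(QueuedBuild {
        priority: build.priority + 10,        // behind normal-priority builds
        timeout_secs: build.timeout_secs * 4, // much more time on the next attempt
        name: build.name,
    }));
}

fn main() {
    let mut queue = BinaryHeap::new();
    requeue_after_timeout(
        &mut queue,
        QueuedBuild {
            priority: 0,
            name: "stm32ral".to_string(),
            timeout_secs: 30 * 60,
        },
    );
    // Reverse makes this a min-heap on priority, so higher-numbered
    // (lower-priority) retries pop after everything else in the queue.
    while let Some(Reverse(build)) = queue.pop() {
        println!("{} ({} s limit)", build.name, build.timeout_secs);
    }
}
```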

@QuietMisdreavus
Member

Rescheduling with lower priority post-#344 won't really help, since there's still only one builder: it'll still block all other builds while it's running. I feel like the easiest solution is #343, since these crates don't need to be built for more than one platform (I've asked their maintainers in the past if that would be an issue, and they said no), and it would speed up crate builds generally.

@syphar
Member

syphar commented Aug 3, 2023

I'll close this issue for now.

@syphar closed this as not planned on Aug 3, 2023