FastHttpUser requests are blocking #1810

Closed
soitinj opened this issue Jul 8, 2021 · 7 comments · Fixed by #1812
@soitinj
Contributor

soitinj commented Jul 8, 2021

Describe the bug

I'd like to simulate concurrent requests, much like a real JavaScript client would make. This can be done by spawning multiple greenlets within a task and resolving them cooperatively, an approach also suggested in this issue report. This works just fine with HttpUser, but FastHttpUser requests are blocking, which means the spawned requests are handled sequentially instead of cooperatively.

This also affects the reported request times. For example, in the sample code below, the reported time for the request to /example_url_3 is the sum of the times of all three requests, the reported time for /example_url_2 is the sum of the times for /example_url_1 and /example_url_2, and so on. This is the most unfortunate part of the issue.

geventhttpclient already supports concurrent requests, but its concurrency defaults to 1. The geventhttpclient UserAgent can be passed a concurrency kwarg, as explained here, but Locust never sets it. I forked locust and made this commit that allows adjusting the concurrency from a user class that extends FastHttpUser. From my perspective, setting the concurrency to some number above 1 solves the issue; in the code sample below, concurrency=3 would be enough.
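
For context, here is a minimal sketch of that kwarg used with geventhttpclient directly, outside Locust (the example.com URLs are placeholders, not part of the report):

import gevent
from geventhttpclient.useragent import UserAgent

# UserAgent forwards keyword arguments such as `concurrency` to the
# underlying HTTP client pool; the default of 1 serializes requests
# to the same host, which is the behaviour FastHttpUser currently shows.
ua = UserAgent(concurrency=3)

urls = [
    "http://example.com/example_url_1",  # placeholder URLs
    "http://example.com/example_url_2",
    "http://example.com/example_url_3",
]

# With concurrency=3, all three requests can be in flight at once
# instead of completing one after another.
greenlets = [gevent.spawn(ua.urlopen, url) for url in urls]
gevent.joinall(greenlets)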

Expected behavior

Requests should be executed cooperatively.

Actual behavior

Requests block and complete sequentially instead of being executed cooperatively.

Steps to reproduce

from locust import TaskSet, task, between
from locust.contrib.fasthttp import FastHttpUser
from gevent.pool import Group


class ExampleTaskSet(TaskSet):

    wait_time = between(4, 5)

    @task
    def example_task(self):
        # Spawn all three requests in separate greenlets and wait for them;
        # with FastHttpUser they currently complete one after another.
        group = Group()
        group.spawn(lambda: self.client.get("/example_url_1"))
        group.spawn(lambda: self.client.get("/example_url_2"))
        group.spawn(lambda: self.client.get("/example_url_3"))
        group.join()

class ExampleUser(FastHttpUser):
    tasks = [ExampleTaskSet]
    host = ''

Environment

  • OS: macOS 10.14.6 Mojave
  • Python version: 3.7.2
  • Locust version: 1.5.3
  • Locust command line that you ran: locust
@soitinj soitinj added the bug label Jul 8, 2021
@mboutet
Contributor

mboutet commented Jul 8, 2021

If instead you do:

group = Group()
group.add(gevent.spawn_later(0, self.client.get, "/example_url_1"))
group.add(gevent.spawn_later(0, self.client.get, "/example_url_2"))
group.add(gevent.spawn_later(0, self.client.get, "/example_url_3"))
group.join()

Or:

greenlets = [
    gevent.spawn(self.client.get, "/example_url_1"),
    gevent.spawn(self.client.get, "/example_url_2"),
    gevent.spawn(self.client.get, "/example_url_3"),
]
gevent.joinall(greenlets)

Does it work?

@soitinj
Contributor Author

soitinj commented Jul 12, 2021

Sorry for the slow response. I tried both of your suggestions, but they made no difference. These are the response times I get when spawning custom greenlets vs. making standard sequential requests:

Running cooperatively (spawning custom greenlets):
[Screenshot: response times, 2021-07-12 21:41:40]
Running sequentially (without spawning greenlets):
[Screenshot: response times, 2021-07-12 21:41:45]

@cyberw
Collaborator

cyberw commented Jul 12, 2021

If you make a PR with that change we can merge it, no problem.

@soitinj
Contributor Author

soitinj commented Jul 13, 2021

Made a pull request: #1812

@FerdinandNell

Quick question: how do we set the concurrency then? This is quite a major feature that is not documented in the Locust.IO docs.

@heyman
Member

heyman commented Jun 29, 2022

how do we set the concurrency then?

class MyUser(FastHttpUser):
    concurrency = 5

This is only necessary if you want to make concurrent requests within a single simulated user with custom spawned greenlets.

With that said, I think it would make sense to increase the default concurrency to something like 5. For the default use case where Users are single-threaded, it shouldn't affect the functionality in any way with only a very minor impact on the memory footprint.
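
For reference, here is a minimal sketch combining that class attribute with the greenlet-spawning task from the original report (the host and URL paths are placeholders):

import gevent
from locust import task, between
from locust.contrib.fasthttp import FastHttpUser

class ExampleUser(FastHttpUser):
    host = "http://localhost:8080"  # placeholder host
    wait_time = between(4, 5)
    # Allow up to 3 in-flight requests on this user's client,
    # matching the number of greenlets spawned below.
    concurrency = 3

    @task
    def example_task(self):
        greenlets = [
            gevent.spawn(self.client.get, "/example_url_1"),
            gevent.spawn(self.client.get, "/example_url_2"),
            gevent.spawn(self.client.get, "/example_url_3"),
        ]
        gevent.joinall(greenlets)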

@FerdinandNell

Thanks, got it working nonetheless. Had to set it to 15 for the 15 concurrent endpoints we load during app launch.

cyberw added a commit that referenced this issue Aug 29, 2022
…ion in #1810 Add a test (which also serves as an example for how to use it)