I've been trying to use the SPARQLStore in a multiprocessing setup, i.e. I fork more processes, they get their own graph/store, and make some queries.
They all use this package underneath.
My first bug was this: RDFLib/sparqlwrapper#84 (which made the sub-process hang)
After that was fixed it got much worse :)
I don't actually understand how, but somehow, when keepalive is installed, multiple processes will get mixed up in what they get back from their http requests.
I.e. the main process forks into processes A and B, and they each make a SPARQL query. Then, sometimes, process A will get back the result meant for process B! And sometimes the XML document being read will change halfway through parsing it, and you get weird XML parsing errors.
I can only assume the different processes somehow share a buffer for reading stuff from the network, but I really don't understand how this happens. Normally, setting up shared memory between forks is lots of hassle :)
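The shared-buffer suspicion is actually close to what happens: file descriptors (including sockets) opened before `fork()` are shared between parent and child, down to the read position, so one process's read advances the other's. A kept-alive HTTP connection opened in the parent behaves the same way. A minimal sketch of that mechanism, using a plain file instead of a socket so it runs deterministically (POSIX only, because of `os.fork`):

```python
# Demonstrates that a descriptor opened before fork() is shared between
# processes, including its read offset - the same mechanism that lets a
# kept-alive socket's response bytes end up split between forked workers.
import os
import tempfile

def shared_offset_demo():
    with tempfile.TemporaryFile() as f:
        f.write(b"0123456789")
        f.flush()
        f.seek(0)
        pid = os.fork()
        if pid == 0:
            # Child reads the first half through the shared descriptor...
            os.read(f.fileno(), 5)
            os._exit(0)
        os.waitpid(pid, 0)
        # ...and the parent's next read starts where the child stopped.
        return os.read(f.fileno(), 5)

print(shared_offset_demo())  # b'56789', not b'01234'
```

This is why "shared memory between forks is lots of hassle" doesn't apply here: nothing in user space is shared, but the kernel-side open file description (and hence the stream position) is.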
I condensed the problem to the program below. My db has a bunch of resources, with a single ID property, which is the same as the id in their URL - so I can fork one process per resource, and check that I get the right ID back.
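(The original condensed program did not survive in this copy of the issue; the sketch below is a hypothetical reconstruction of the *shape* of that reproducer, with a local HTTP server standing in for the SPARQL endpoint so it is self-contained. The server, handler, and `check_ids` names are all invented for illustration.)

```python
# Hypothetical reconstruction of the reproducer described above: fork one
# process per resource, have each query an endpoint whose answer is the
# resource's own id, and check that each process reads back its own id.
import http.server
import os
import threading
import urllib.parse
import urllib.request

class EchoIdHandler(http.server.BaseHTTPRequestHandler):
    # Stands in for the SPARQL endpoint: the id in the URL is echoed back,
    # mirroring a resource whose ID property equals the id in its URL.
    def do_GET(self):
        rid = urllib.parse.parse_qs(urllib.parse.urlparse(self.path).query)["id"][0]
        body = rid.encode()
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the output quiet
        pass

def check_ids(n=4):
    server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), EchoIdHandler)
    port = server.server_address[1]
    threading.Thread(target=server.serve_forever, daemon=True).start()
    pids = []
    for rid in range(n):
        pid = os.fork()
        if pid == 0:
            with urllib.request.urlopen(f"http://127.0.0.1:{port}/?id={rid}") as resp:
                got = resp.read().decode()
            # With keepalive installed, `got` could be a sibling's result;
            # exit nonzero on a mismatch so the parent can count failures.
            os._exit(0 if got == str(rid) else 1)
        pids.append(pid)
    mismatches = sum(os.waitpid(p, 0)[1] != 0 for p in pids)
    server.shutdown()
    return mismatches

if __name__ == "__main__":
    print("mismatches:", check_ids())
```

Without a shared kept-alive connection this prints `mismatches: 0`; the bug reported here is that, with keepalive installed and a connection opened before the fork, the per-process checks intermittently fail.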
I don't know how to even start fixing this - and it's not very urgent for me (I've replaced the code with `requests`), it's mainly as a heads up for other people using this!