
Support figures that are larger than Chromium's maximum message size #19

Open · bhachauk opened this issue Jul 24, 2020 · 12 comments
Labels: feature (something new), P3 (not needed for current cycle)
@bhachauk

Hi,

I am trying to export a figure using the method

fig.to_image(format="png", engine="kaleido")

I get the following exception:

Transform failed. Error stream:

[0723/161014.534602:FATAL:ipc_mojo_bootstrap.cc(729)] Check failed: message->data_num_bytes() <= Channel::kMaximumMessageSize  (366505976 vs. 134217728)

How can I handle this check?
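
(For reference, a minimal sketch of the kind of call that hits this check; the point count below is a hypothetical placeholder, not the reporter's actual data.)

    import numpy as np
    import plotly.graph_objects as go

    # Hypothetical large scatter; the serialized figure JSON grows with the point count.
    n = 8_000_000
    fig = go.Figure(go.Scattergl(x=np.arange(n), y=np.random.rand(n), mode="markers"))

    # Fails the kMaximumMessageSize check once the figure JSON exceeds ~128 MiB.
    img_bytes = fig.to_image(format="png", engine="kaleido")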

@nicolaskruchten nicolaskruchten transferred this issue from plotly/plotly.py Jul 24, 2020
@nicolaskruchten

@jonmmease transferred from Plotly.py

@jonmmease
Collaborator

Hi @Bhanuchander210, this message indicates that the size of the figure JSON bundle exceeded the maximum Chromium message size. This is generally a pretty large threshold (roughly an 8-million-point scatter plot).

Could you describe the figure you are trying to export? Are you able to display it interactively? Did you ever try exporting it using Orca? Thanks!
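
(A rough pre-flight check, as a sketch: measure the figure's serialized JSON size against the 134217728-byte limit from the error message. `fig` below is assumed to be the figure being exported; it is not defined in this thread.)

    import plotly.io as pio

    CHROMIUM_MAX_MESSAGE_BYTES = 134217728  # the limit reported in the error above

    # `fig` is the figure about to be exported.
    payload_bytes = len(pio.to_json(fig).encode("utf-8"))
    print(f"figure JSON is {payload_bytes} bytes; Chromium's limit is {CHROMIUM_MAX_MESSAGE_BYTES}")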

@bhachauk
Author

bhachauk commented Jul 24, 2020

Hi @jonmmease. As you mentioned, yes, it is a pretty large scatter plot: approximately 6 columns × 1,696,900 rows (some values may be NaN, so the exact number of points will vary).

Are you able to display it interactively?

  • Yes, I can display the plot interactively.

Did you ever try exporting it using the Orca?

  • Yes, I have tried that as well, but it does not seem to work.

It shows the following exception:

For some reason plotly.py was unable to communicate with the
local orca server process, even though the server process seems to be running.

Please review the process and connection information below:

orca status
-----------
    state: running
    executable: /home/bhanuchander/miniconda3/bin/orca
    version: 1.2.1
    port: 46368
    pid: 11821
    command: ['/home/bhanuchander/miniconda3/bin/orca', 'serve', '-p', '46368', '--plotly', '/home/bhanuchander/miniconda3/lib/python3.7/site-packages/plotly/package_data/plotly.min.js', '--graph-only', '--mathjax', 'https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.5/MathJax.js']

@jonmmease
Collaborator

Thanks for the details. Since it does look like a figure-size issue, it's not surprising that Orca fails as well: Electron uses the same Chromium engine underneath.

It should be possible to work around this limitation, but it will take some redesign work to support streaming the figure specification from the C++ side to the JavaScript side.

Right now, we parse the input JSON in C++ and then let Chromium transfer the resulting dictionary from C++ to JavaScript. We may need to transfer the figure specification in string form in chunks and then parse it into a JavaScript object at the end.
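
(To illustrate that idea as a sketch, in Python rather than the actual C++/JavaScript: the figure specification string would be split into pieces smaller than the message limit and only parsed into an object once all pieces have arrived. The names and chunk size here are purely illustrative.)

    import json

    CHUNK_SIZE = 64 * 1024 * 1024  # illustrative chunk size, below the 128 MiB message limit

    def send_in_chunks(figure_json, send):
        # Split the figure specification string and send each piece separately.
        for start in range(0, len(figure_json), CHUNK_SIZE):
            send(figure_json[start:start + CHUNK_SIZE])

    def receive_and_parse(chunks):
        # Reassemble on the receiving side and parse once, at the end.
        return json.loads("".join(chunks))

    # Round-trip a (small) figure spec through the chunking scheme.
    received = []
    send_in_chunks(json.dumps({"data": [], "layout": {}}), received.append)
    figure_spec = receive_and_parse(received)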

@jonmmease jonmmease added the bug something broken label Jul 24, 2020
@jonmmease jonmmease changed the title FATAL:ipc_mojo_bootstrap.cc(729)] Check failed: message->data_num_bytes() <= Channel::kMaximumMessageSize Support figures that are larger than Chromium's maximum message size Jul 24, 2020
@jayceslesar

I also get this issue with large datasets

@rskabelund

I am also getting this error on large plots.

    figure.to_image(
      File "/opt/conda/lib/python3.8/site-packages/plotly/basedatatypes.py", line 3495, in to_image
        return pio.to_image(self, *args, **kwargs)
      File "/opt/conda/lib/python3.8/site-packages/plotly/io/_kaleido.py", line 131, in to_image
        img_bytes = scope.transform(
      File "/opt/conda/lib/python3.8/site-packages/kaleido/scopes/plotly.py", line 103, in transform
        response = self._perform_transform(
      File "/opt/conda/lib/python3.8/site-packages/kaleido/scopes/base.py", line 305, in _perform_transform
        raise ValueError(message)
    ValueError: Transform failed.

    Error stream: [0513/215523.654363:FATAL:ipc_mojo_bootstrap.cc(729)]
    Check failed: message->data_num_bytes() <= Channel::kMaximumMessageSize (400041304 vs. 134217728) #0 0x55f02719ef89

Has there been any traction on chunking messages from C++ to JavaScript? I have been searching all over for a workaround, but that might be the only solution for me.

Are there any alternatives for getting base64 data from a Plotly plot for large datasets? Is there a way to chunk it on the Python side? I tried pushing back and producing aggregate data for these plots, but they need to have all the data, and I am stuck since we need to save to disk.

Any suggestions would be greatly appreciated.

@JesseFarebro

Also experiencing this issue. I have a large figure with ~32 subplots; hopefully there's a solution on the horizon.

@artizzq

artizzq commented May 2, 2022

Same problem, please help

@Rob-Hulley

I just updated kaleido to v0.2.1 and it has resolved the issue for me.

@rskabelund

Agree with @Rob-Hulley, I updated kaleido and my previously non-functional example now functions.
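
(A quick way to confirm which kaleido version is installed, assuming Python 3.8+ for importlib.metadata:)

    from importlib.metadata import version

    # Comments above report the problem going away after upgrading to kaleido 0.2.1.
    print(version("kaleido"))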

@matzl

matzl commented Jan 9, 2024

Same issue. I spent quite some time generating a plot with many subplots and massive data behind it. I can barely view it in the browser and had hoped that I could at least export it as a .png :(

@almontoya

Running into this issue too; however, it's a bit of a catch-22 if you use Windows 11, as per this issue: https://github.com/plotly/Kaleido/issues/126. Although I can choose to render plots interactively, with many points the app becomes slow to render, which is why I export to a static image and render that (I can't with the latest data... too big). It would be great if the latest version of kaleido also worked on Windows 11 and dealt with this issue.

@gvwilson gvwilson self-assigned this Jul 26, 2024
@gvwilson gvwilson removed their assignment Aug 3, 2024
@gvwilson gvwilson added feature something new P3 not needed for current cycle and removed bug something broken labels Aug 14, 2024