Tensor slice is too large to serialize #4291
There is a hard limit of 2 GB for serializing individual tensors because of the 32-bit signed size field in protobuf. See #3766, #2648, etc. You need to find a way to break things up into smaller tensors/variables.
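A protobuf message length is encoded as a signed 32-bit integer, so one serialized slice cannot exceed 2**31 - 1 bytes (~2 GB). A minimal sketch of the "break it up" advice, using NumPy so it runs without TensorFlow (the helper `split_under_limit` is illustrative, not a TensorFlow API):

```python
import numpy as np

# Protobuf's signed 32-bit length field caps a single serialized
# tensor slice at 2**31 - 1 bytes (~2 GB).
MAX_BYTES = 2**31 - 1

def split_under_limit(array, max_bytes=MAX_BYTES):
    """Split `array` along axis 0 into chunks that each fit under `max_bytes`."""
    bytes_per_row = array.itemsize * int(np.prod(array.shape[1:], dtype=np.int64))
    rows_per_chunk = max(1, max_bytes // bytes_per_row)
    num_chunks = -(-array.shape[0] // rows_per_chunk)  # ceiling division
    return np.array_split(array, num_chunks, axis=0)

# A small array stands in for an oversized variable; a tiny byte limit
# forces several chunks for illustration.
x = np.zeros((10, 4), dtype=np.float32)   # 16 bytes per row
chunks = split_under_limit(x, max_bytes=64)
```

Each resulting chunk could then back its own variable, keeping every individually serialized piece under the limit.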
@aselle Thanks for your quick response. Are you planning to provide some operations to break big tensors/variables into smaller ones? It is very easy to exceed the 2 GB limit with large models.
@concretevitamin can comment further if need be, but the plan is to checkpoint tensors in a format that does not use protobufs, sidestepping this limitation.
@concretevitamin Are you able to provide an example of how to use tf.split()? Currently my code is something like:
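For reference, `tf.split(value, num_or_size_splits, axis)` divides a tensor into sub-tensors along one axis. The sketch below uses `np.split`, which behaves analogously, so it runs without TensorFlow; the shapes are illustrative, not taken from the thread's elided code:

```python
import numpy as np

# An (8, 3) "embedding" split into 4 pieces of 2 rows each,
# mirroring tf.split(value, 4, axis=0).
value = np.arange(24, dtype=np.float32).reshape(8, 3)
pieces = np.split(value, 4, axis=0)

# Each piece could back its own variable, so no single variable
# approaches the 2 GB serialization limit; concatenating the
# pieces along axis 0 recovers the original tensor.
```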
Finally, how do we solve this problem?
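One practical workaround, sketched here with plain NumPy (the helpers `save_chunked` and `load_chunked` are hypothetical, not a TensorFlow API): keep the oversized array outside the checkpoint, write it as several `.npy` chunks, and concatenate them on load, so no single serialized blob approaches the protobuf limit.

```python
import os
import tempfile
import numpy as np

def save_chunked(array, directory, num_chunks):
    # Write each chunk as its own .npy file so no single serialized
    # blob approaches the 2 GB protobuf limit.
    for i, chunk in enumerate(np.array_split(array, num_chunks, axis=0)):
        np.save(os.path.join(directory, f"chunk_{i}.npy"), chunk)

def load_chunked(directory, num_chunks):
    chunks = [np.load(os.path.join(directory, f"chunk_{i}.npy"))
              for i in range(num_chunks)]
    return np.concatenate(chunks, axis=0)

with tempfile.TemporaryDirectory() as d:
    big = np.random.rand(100, 8).astype(np.float32)
    save_chunked(big, d, num_chunks=5)
    restored = load_chunked(d, num_chunks=5)
```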
Hi,
I ran into an issue saying "Tensor slice is too large to serialize (conservative estimate: 2268204567 bytes)", and I am wondering why there is a limitation that prevents the TensorSliceWriter from attempting to serialize variables larger than 2 GB. How can I bypass this limitation if I want to use a variable larger than 2 GB?
Thanks
Jinlong