
Wasmtime-related memory allocation failure on aarch64 #833

Closed
nazar-pc opened this issue Sep 23, 2022 · 4 comments
@nazar-pc (Member)

Looks like this:

subspace-node    | 2022-09-15 21:22:39 Subspace    
subspace-node    | 2022-09-15 21:22:39 ✌️  version 0.1.0-unknown    
subspace-node    | 2022-09-15 21:22:39 ❤️  by Subspace Labs <https://subspace.network>, 2021-2022    
subspace-node    | 2022-09-15 21:22:39 📋 Chain specification: Subspace Gemini 2a    
subspace-node    | 2022-09-15 21:22:39 🏷  Node name: counterpoint_aarch64_pi4_hdd    
subspace-node    | 2022-09-15 21:22:39 👤 Role: AUTHORITY    
subspace-node    | 2022-09-15 21:22:39 💾 Database: ParityDb at /var/subspace/chains/subspace_gemini_2a/paritydb/full    
subspace-node    | 2022-09-15 21:22:39 ⛓  Native runtime: subspace-4 (subspace-0.tx0.au0)    
subspace-node    | 
subspace-node    | ====================
subspace-node    | 
subspace-node    | Version: 0.1.0-unknown
subspace-node    | 
subspace-node    |    0: sp_panic_handler::set::{{closure}}
subspace-node    |    1: std::panicking::rust_panic_with_hook
subspace-node    |    2: std::panicking::begin_panic_handler::{{closure}}
subspace-node    |    3: std::sys_common::backtrace::__rust_end_short_backtrace
subspace-node    |    4: rust_begin_unwind
subspace-node    |    5: core::panicking::panic_fmt
subspace-node    |    6: sc_consensus_subspace::archiver::initialize_archiver
subspace-node    |    7: subspace_service::new_partial
subspace-node    |    8: subspace_service::new_full::{{closure}}
subspace-node    |    9: <core::future::from_generator::GenFuture<T> as core::future::future::Future>::poll
subspace-node    |   10: sc_cli::runner::Runner<C>::run_node_until_exit
subspace-node    |   11: subspace_node::main
subspace-node    |   12: std::sys_common::backtrace::__rust_begin_short_backtrace
subspace-node    |   13: std::rt::lang_start::{{closure}}
subspace-node    |   14: main
subspace-node    |   15: __libc_start_main
subspace-node    |   16: <unknown>
subspace-node    | 
subspace-node    | 2022-09-15 21:22:42 [PrimaryChain] Cannot create a runtime error=Other("cannot create the wasmtime engine: failed to create memory pool mapping: mmap failed to allocate 0x3080000000 bytes: Cannot allocate memory (os error 12)")
subspace-node    | 
subspace-node    | Thread 'main' panicked at 'Failed to make runtime API call during last archived block search: Application(VersionInvalid("cannot create the wasmtime engine: failed to create memory pool mapping: mmap failed to allocate 0x3080000000 bytes: Cannot allocate memory (os error 12)"))', /code/crates/sc-consensus-subspace/src/archiver.rs:72
subspace-node    | 
subspace-node    | This is a bug. Please report it at:
subspace-node    | 
subspace-node    |      https://forum.subspace.network
subspace-node    | 
subspace-node exited with code 1

Details:

  • Raspberry Pi 4, 8 GiB RAM
  • Docker Compose
  • Distributor ID: Debian
    Description:    Debian GNU/Linux 11 (bullseye)
    Release:        11
    Codename:       bullseye
  • Linux raspberrypi4 5.15.56-v8+ #1575 SMP PREEMPT Fri Jul 22 20:31:26 BST 2022 aarch64 GNU/Linux

Originally reported on the forum: https://forum.subspace.network/t/failed-to-allocate-bytes-exception-when-running-docker-on-aarch64/606
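
A note on the failing allocation: 0x3080000000 bytes is exactly 194 GiB, far beyond any Pi's physical RAM, but this mmap is an up-front virtual address space reservation for wasmtime's memory pool, not a request for physical memory. A minimal sketch of that kind of reservation, assuming the libc crate on a 64-bit Linux target (illustrative only, not the actual wasmtime code):

    use std::ptr;

    fn main() {
        // 194 GiB, the exact size from the log above; requires a 64-bit target.
        const LEN: usize = 0x3080000000;
        // PROT_NONE + MAP_NORESERVE reserves address space without touching
        // physical RAM; pages only cost memory once they are made accessible.
        let addr = unsafe {
            libc::mmap(
                ptr::null_mut(),
                LEN,
                libc::PROT_NONE,
                libc::MAP_PRIVATE | libc::MAP_ANONYMOUS | libc::MAP_NORESERVE,
                -1,
                0,
            )
        };
        if addr == libc::MAP_FAILED {
            // The ENOMEM (os error 12) path from the report.
            eprintln!("mmap failed: {}", std::io::Error::last_os_error());
        } else {
            println!("reserved {LEN:#x} bytes at {addr:p}");
            unsafe { libc::munmap(addr, LEN) };
        }
    }

So an ENOMEM here usually points at an address-space or kernel VM accounting limit rather than a lack of free RAM.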

@nazar-pc nazar-pc added bug Something isn't working node Node (service library/node app) labels Sep 23, 2022
@nazar-pc nazar-pc self-assigned this Sep 23, 2022
@nazar-pc (Member Author) commented Sep 24, 2022

This happens on the block with a runtime upgrade.

I wasn't able to reproduce it, though. I tried running on:

  • an x86-64 Ubuntu 22.04 machine (Docker, which uses QEMU under the hood for aarch64); it has 128 GiB of RAM, so it is basically impossible to run out of it
  • a Rock64 with 4 GiB of RAM and 4 GiB of swap (the most powerful single-board computer I have right now), with Ubuntu 20.04 and also Docker; it actually succeeds even with 2 GiB of swap. Here is the memory usage at a block height of ~17000 (a quick limit check is sketched after this list):
                  total        used        free      shared  buff/cache   available
    Mem:          3.8Gi       3.1Gi        16Mi        25Mi       735Mi       689Mi
    Swap:         4.0Gi       1.6Gi       2.4Gi
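
As referenced above, since this kind of failure is about virtual address space rather than physical memory, one quick check on an affected machine is the process address-space limit. A small sketch, assuming the libc crate (a hypothetical standalone helper, not part of the node):

    fn main() {
        // RLIMIT_AS caps the total virtual address space of the process,
        // which bounds how large an mmap reservation can ever be.
        let mut lim = libc::rlimit { rlim_cur: 0, rlim_max: 0 };
        let ret = unsafe { libc::getrlimit(libc::RLIMIT_AS, &mut lim) };
        assert_eq!(ret, 0, "getrlimit failed");
        if lim.rlim_cur == libc::RLIM_INFINITY {
            println!("RLIMIT_AS: unlimited");
        } else {
            println!("RLIMIT_AS: {} bytes", lim.rlim_cur);
        }
    }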

@nazar-pc (Member Author)

An Orange Pi with 2 GiB of RAM, 4 GiB of swap, and Ubuntu 20.04 works fine too. A Raspberry Pi 3B+ with 1 GiB of RAM doesn't really work: there simply isn't enough RAM, it seems, and it only has USB 2.0 interfaces to put swap on 😞

@nazar-pc (Member Author)

Upstream Substrate issue: paritytech/substrate#12538
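
The "memory pool mapping" in the error comes from wasmtime's pooling instance allocator, which reserves its whole pool up front when the engine is created. Purely for illustration (Substrate configures wasmtime internally, so this is not a knob the node exposes), an embedder using the wasmtime crate directly could avoid the large up-front reservation roughly like this:

    use wasmtime::{Config, Engine, InstanceAllocationStrategy};

    fn main() -> anyhow::Result<()> {
        let mut config = Config::new();
        // On-demand allocation maps linear memories per instance as needed,
        // instead of reserving one huge pool mapping at engine creation.
        config.allocation_strategy(InstanceAllocationStrategy::OnDemand);
        let _engine = Engine::new(&config)?;
        Ok(())
    }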

@nazar-pc (Member Author)

Closing for now, as this is primarily an OS configuration issue, and we'll certainly pull in any Substrate improvements that happen on this front.
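
For anyone who lands here, two kernel settings worth inspecting when a large mmap reservation is refused (a hedged diagnostic sketch; the exact root cause can differ per system) can be read straight from /proc:

    use std::fs;

    fn main() {
        // Standard Linux sysctl views; the values help explain why a large
        // reservation might be refused even with plenty of free RAM.
        for path in [
            "/proc/sys/vm/overcommit_memory", // 0/1 permit big reservations; 2 is strict
            "/proc/sys/vm/max_map_count",     // caps the number of mappings per process
        ] {
            match fs::read_to_string(path) {
                Ok(v) => println!("{path} = {}", v.trim()),
                Err(e) => eprintln!("{path}: {e}"),
            }
        }
    }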
