
module file: no environment variable with absolute install path #163

Closed
psychocoderHPC opened this issue Sep 16, 2022 · 2 comments · Fixed by #164

Comments

@psychocoderHPC

As a workaround to have the HIP kernel in the trace file, I need to set the environment variable

export HSA_TOOLS_LIB=<path to omnitrace install>/omnitrace-1.6.0/rocm-5.1.0/lib/libomnitrace.so

The problem is that the module file does not provide an environment variable, e.g. omnitrace_ROOT or omnitrace_HOME, which I could use to set HSA_TOOLS_LIB.
Currently, I have to use $omnitrace_DIR/../../.., which is not very elegant.

Tested with: omnitrace 1.6.0
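
Concretely, the workaround boils down to deriving the install prefix from the CMake package directory (a sketch; it assumes omnitrace_DIR points at <install prefix>/share/cmake/omnitrace, as in the module file quoted later in this thread):

export HSA_TOOLS_LIB=${omnitrace_DIR}/../../../lib/libomnitrace.so   # walk three levels up from the CMake config dir, then into lib/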

@skyreflectedinmirrors commented Sep 20, 2022

I can probably manually edit the modules on Crusher to make this work for the moment, but @jrmadsen I think all we'd need here is to change set ROOT... to a setenv omnitrace_ROOT ..., e.g., from Crusher:

#%Module1.0

module-whatis "omnitrace (version 1.6.0)"

proc ModulesHelp { } {
    puts stderr "Loads omnitrace v1.6.0"
}

~set ROOT [file normalize [file dirname [file normalize ${ModulesCurrentModulefile}]]/../../..]~
set omnitrace_ROOT [file normalize [file dirname [file normalize ${ModulesCurrentModulefile}]]/../../..]

prereq "rocm/5.1.0"

prepend-path CMAKE_PREFIX_PATH "${omnitrace_ROOT}"
prepend-path PATH "${omnitrace_ROOT}/bin"
prepend-path LD_LIBRARY_PATH "${omnitrace_ROOT}/lib"
prepend-path PYTHONPATH "${omnitrace_ROOT}/lib/python/site-packages"
setenv omnitrace_DIR "${omnitrace_ROOT}/share/cmake/omnitrace"
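
Once the module exposes the install root as an environment variable, the workaround from the issue could drop the relative-path gymnastics entirely (a sketch of how it would then be consumed; the module name is illustrative and it assumes omnitrace_ROOT ends up pointing at the prefix whose lib/ directory contains libomnitrace.so):

module load omnitrace/1.6.0
export HSA_TOOLS_LIB=${omnitrace_ROOT}/lib/libomnitrace.so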

@jrmadsen (Collaborator)

You don't even need to make this many edits. You can just add:

setenv omnitrace_ROOT "${ROOT}"

jrmadsen added a commit that referenced this issue Sep 21, 2022
- improved error handling in dyninst
- improved error handling in omnitrace exe
- new logging facility for omnitrace exe
- improved backtraces
- disable concurrent kernels in rocprofiler
- updates `setup-env.sh` and modulefile (see the environment sketch after this list)
  - set `omnitrace_ROOT`
  - set `HSA_TOOLS_LIB` if roctracer or rocprofiler enabled
  - set `ROCP_TOOL_LIB` if rocprofiler enabled
  - closes #163
- No longer make setting `HSA_ENABLE_INTERRUPT=0` the default
  - this has performance implications
  - this was set to work around a bug in ROCR which caused an ioctl call in ROCm to hang when interrupted. But it was only interrupted when realtime sampling was enabled, since the CPU-clock doesn't increment when waiting
  - This bug should be fixed in ROCm 5.3
- omnitrace no longer activates a realtime sampler by default when sampling, thus this bug is no longer encountered unless the user explicitly triggers realtime sampling
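
For illustration only, the environment set up by those changes might look roughly like the following once `setup-env.sh` is sourced or the modulefile is loaded (this is not the actual script contents; the install prefix and the `ROCP_TOOL_LIB` target are assumptions based on the module file and the workaround earlier in this thread):

export omnitrace_ROOT=<install prefix>/omnitrace-1.6.0/rocm-5.1.0
export HSA_TOOLS_LIB=${omnitrace_ROOT}/lib/libomnitrace.so   # only when roctracer or rocprofiler support is enabled
export ROCP_TOOL_LIB=${omnitrace_ROOT}/lib/libomnitrace.so   # only when rocprofiler support is enabled (assumed to be the same library)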