[SOLVED] This ORT build has ['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'] enabled #250
Replies: 3 comments 11 replies
-
What versions of onnxruntime and CUDA do you use?
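If it helps, you can check both the installed version and which execution providers your build exposes from inside python_embeded. A minimal sketch using onnxruntime's own reporting functions:

import onnxruntime
print(onnxruntime.__version__)                # installed ORT version
print(onnxruntime.get_available_providers())  # e.g. ['CUDAExecutionProvider', 'CPUExecutionProvider']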
-
Hi @lisdiyanto2024, I am getting this same exact error message from the ReActor node after installing the YoloWorld/ESAM nodes two days ago. I do not believe I have Python 3.12 anywhere on my machine; it's all 3.11.8 (or 3.11.x). I successfully ran the command for the ORT Azure DevOps feed for CUDA 12: still the error. Can you give me any guidance?
-
UPDATE: The command below worked for me. I can now use ReActor again with CUDA 12.4. Hopefully this doesn't break YoloWorld/ESAM, because I'm trying to build a connected workflow. In python_embeded (not elsewhere) I needed to uninstall the existing onnxruntime and revert it to 1.15.1. I can confirm this works with the most recent CUDA/drivers/cuDNN (see my specs above). Just install the appropriate version:
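For anyone following along, a sketch of what that reinstall looks like, assuming the standard ComfyUI portable layout (the exact package name, onnxruntime vs onnxruntime-gpu, depends on what is currently installed; 1.15.1 is the version mentioned above):

.\python_embeded\python.exe -m pip uninstall -y onnxruntime onnxruntime-gpu
.\python_embeded\python.exe -m pip install onnxruntime-gpu==1.15.1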
-
I've just recently started having a problem running ReActorFaceSwap (maybe after installing other models/nodes), because I could execute ReActorFaceSwap before. I use Python 3.11.
The error message is as follows; a minimal sketch of the explicit-providers call it asks for is shown after the traceback:
Error occurred when executing ReActorFaceSwap:
This ORT build has ['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'], ...)
File "D:\ComfyUI_windows_portable\ComfyUI\execution.py", line 151, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\ComfyUI\execution.py", line 81, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-0246\utils.py", line 381, in new_func
res_value = old_func(*final_args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\ComfyUI\execution.py", line 74, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-reactor-node\nodes.py", line 240, in execute
script.process(
File "D:\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-reactor-node\scripts\reactor_faceswap.py", line 86, in process
result = swap_face(
^^^^^^^^^^
File "D:\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-reactor-node\scripts\reactor_swapper.py", line 199, in swap_face
source_faces = analyze_faces(source_img)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-reactor-node\scripts\reactor_swapper.py", line 118, in analyze_faces
face_analyser = copy.deepcopy(getAnalysisModel())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "copy.py", line 172, in deepcopy
File "copy.py", line 271, in _reconstruct
File "copy.py", line 146, in deepcopy
File "copy.py", line 231, in _deepcopy_dict
File "copy.py", line 146, in deepcopy
File "copy.py", line 231, in _deepcopy_dict
File "copy.py", line 172, in deepcopy
File "copy.py", line 271, in _reconstruct
File "copy.py", line 146, in deepcopy
File "copy.py", line 231, in _deepcopy_dict
File "copy.py", line 172, in deepcopy
File "copy.py", line 273, in _reconstruct
File "D:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\insightface\model_zoo\model_zoo.py", line 33, in setstate
self.init(model_path)
File "D:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\insightface\model_zoo\model_zoo.py", line 25, in init
super().init(model_path, **kwargs)
File "D:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 396, in init
raise e
File "D:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 383, in init
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "D:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 415, in _create_inference_session
raise ValueError(
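For context, the fix the error message itself suggests is passing the providers list explicitly when the session is created. A minimal sketch (the model path here is a placeholder, not the file ReActor actually loads):

import onnxruntime

# Since ORT 1.9 the execution providers must be listed explicitly.
# "model.onnx" is a placeholder path for illustration only.
session = onnxruntime.InferenceSession(
    "model.onnx",
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
)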