High-Speed Inference Using External Devices (Windows)

When you run this tool with the -use_external_device option and an external inference device is available,
part of the analysis is performed on the external device instead of the CPU.

On Windows, external inference devices are supported via NVIDIA CUDA for high-speed inference.
To use CUDA, you need CUDA v11.8 and cuDNN v8.x installed.

Additionally, the directory containing cudnn64_8.dll (for example, C:\tools\cuda\bin) must be added to the PATH environment variable.
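If the tool fails to use the external device, a common cause is that cudnn64_8.dll is not discoverable via PATH. As a minimal sketch (not part of the tool itself), the following Python snippet checks whether the DLL can be found in any PATH directory before you launch the tool; the C:\tools\cuda\bin path is only an example location.

```python
import os

def find_on_path(filename, path_env=None):
    """Return the first PATH directory containing filename, or None."""
    if path_env is None:
        path_env = os.environ.get("PATH", "")
    for d in path_env.split(os.pathsep):
        if d and os.path.isfile(os.path.join(d, filename)):
            return d
    return None

# Check whether cuDNN's DLL is reachable before launching the tool.
hit = find_on_path("cudnn64_8.dll")
if hit is None:
    print(r"cudnn64_8.dll not found on PATH; add its directory, e.g. C:\tools\cuda\bin")
else:
    print(f"cudnn64_8.dll found in {hit}")
```

If the DLL is missing, add its directory to PATH (System Properties > Environment Variables, or `setx PATH` in a command prompt) and open a new terminal before retrying.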

For detailed installation instructions, refer to NVIDIA's documentation: CUDA Installation Guide for Microsoft Windows.