File libtorch_backend.cpp
LibTorch inference backend implementation.
#include "libtorch_backend.hpp"
#include <iostream>
#include <torch/script.h>
#include <torch/torch.h>
Namespaces
| Type | Name |
|---|---|
| namespace | emulator |
| namespace | inference |
Classes
| Type | Name |
|---|---|
| struct | Impl Private implementation details for LibTorchBackend . |
Detailed Description
Provides native C++ neural network inference using LibTorch (PyTorch C++ API). This backend loads TorchScript models and executes inference without Python.
Note:
Models must be exported to TorchScript format (.pt) using torch.jit.trace() or torch.jit.script() before use with this backend.
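The load-and-infer flow described above can be sketched with the public LibTorch API. This is a minimal standalone example, not the backend's actual implementation; the model path `model.pt` and the input shape are illustrative assumptions.

```cpp
#include <torch/script.h>
#include <torch/torch.h>
#include <iostream>
#include <vector>

int main() {
    try {
        // Load a model previously exported with torch.jit.trace() or
        // torch.jit.script() (hypothetical path, for illustration only).
        torch::jit::script::Module module = torch::jit::load("model.pt");
        module.eval();

        // Build an example input tensor (the shape is an assumption).
        std::vector<torch::jit::IValue> inputs;
        inputs.push_back(torch::ones({1, 3, 224, 224}));

        // Run inference natively, with no Python interpreter involved.
        torch::NoGradGuard no_grad;
        at::Tensor output = module.forward(inputs).toTensor();
        std::cout << "output shape: " << output.sizes() << '\n';
    } catch (const c10::Error& e) {
        std::cerr << "failed to load or run model: " << e.what() << '\n';
        return 1;
    }
    return 0;
}
```

Wrapping `torch::jit::load` and `forward` in a `try`/`catch` on `c10::Error` is the conventional way to surface a missing or malformed TorchScript file at load time rather than crashing later.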
See also: LibTorchBackend
The documentation for this file was generated from the following file: components/emulator_comps/common/src/inference/libtorch_backend.cpp