
[jetson] Building FastDeploy from Source

Tested environment

  • JetPack 4.6.3
  • CUDA 10.2
  • cuDNN 8.2
  • TensorRT 8.2
  • GCC 7.5.0
  • CMake 3.25.2

Building the Jetson deployment library

On Jetson, FastDeploy currently supports only three inference backends: ONNX Runtime (CPU), TensorRT (GPU), and Paddle Inference (GPU).

Upgrading CMake

First remove the old version:

sudo apt purge cmake

Add the signing key:

wget -O - https://apt.kitware.com/keys/kitware-archive-latest.asc 2>/dev/null | sudo apt-key add -

Add the repository to your sources list and update.

Stable release:

sudo apt-add-repository 'deb https://apt.kitware.com/ubuntu/ bionic main'
sudo apt-get update

Release candidate (optional):

sudo apt-add-repository 'deb https://apt.kitware.com/ubuntu/ bionic-rc main'
sudo apt-get update

Install the new version:

sudo apt install -y cmake
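After installation, it is worth confirming that the new CMake actually satisfies the version requirement stated below. A minimal sketch using `sort -V` for version comparison (the 3.25.2 value is an assumption; substitute the version printed by `cmake --version`):

```shell
required=3.10.0
installed=3.25.2   # assumption: replace with the version printed by `cmake --version`

# sort -V orders version strings numerically; if the required version
# sorts first, the installed version is new enough
lowest=$(printf '%s\n%s\n' "$required" "$installed" | sort -V | head -n1)
if [ "$lowest" = "$required" ]; then
    echo "cmake $installed satisfies >= $required"
else
    echo "cmake $installed is too old" >&2
fi
```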

Building and installing the C++ SDK

Build requirements:

  • gcc/g++ >= 5.4 (8.2 recommended)
  • cmake >= 3.10.0
  • JetPack >= 4.6.1

If you need to integrate the Paddle Inference backend, download the Jetpack C++ package matching your development environment from the Paddle Inference prebuilt library page and extract it.

git clone https://github.com/PaddlePaddle/FastDeploy.git
cd FastDeploy
mkdir build && cd build
# ENABLE_PADDLE_BACKEND and PADDLEINFERENCE_DIRECTORY are optional;
# omit them if you do not need the Paddle Inference backend
cmake .. -DBUILD_ON_JETSON=ON \
         -DENABLE_VISION=ON \
         -DENABLE_PADDLE_BACKEND=ON \
         -DPADDLEINFERENCE_DIRECTORY=/Download/paddle_inference_jetson \
         -DCMAKE_INSTALL_PREFIX=${PWD}/installed_fastdeploy
make -j8
make install

After the build completes, the C++ inference library is generated under the directory specified by CMAKE_INSTALL_PREFIX.
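To consume the installed SDK from your own project, the install directory can be pulled into a CMake build. A hedged sketch, assuming the installed SDK ships a FastDeploy.cmake that defines FASTDEPLOY_INCS and FASTDEPLOY_LIBS; the path and the my_infer_demo target are placeholders:

```cmake
cmake_minimum_required(VERSION 3.10)
project(my_infer_demo CXX)

# Point this at the directory produced by `make install` above (placeholder path)
set(FASTDEPLOY_INSTALL_DIR "/path/to/installed_fastdeploy")

# FastDeploy.cmake ships inside the installed SDK and defines
# FASTDEPLOY_INCS / FASTDEPLOY_LIBS
include(${FASTDEPLOY_INSTALL_DIR}/FastDeploy.cmake)
include_directories(${FASTDEPLOY_INCS})

add_executable(my_infer_demo main.cc)
target_link_libraries(my_infer_demo ${FASTDEPLOY_LIBS})
```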

Building and installing the Python package

The build likewise requires:

  • gcc/g++ >= 5.4 (8.2 recommended)
  • cmake >= 3.10.0
  • JetPack >= 4.6.1
  • python >= 3.6

Python packaging depends on wheel, so run pip install wheel before building.

If you need to integrate the Paddle Inference backend, download the Jetpack C++ package matching your development environment from the Paddle Inference prebuilt library page and extract it.

All build options are passed in via environment variables:

git clone https://github.com/PaddlePaddle/FastDeploy.git
cd FastDeploy/python
export BUILD_ON_JETSON=ON
export ENABLE_VISION=ON

# ENABLE_PADDLE_BACKEND and PADDLEINFERENCE_DIRECTORY are optional
export ENABLE_PADDLE_BACKEND=ON
export PADDLEINFERENCE_DIRECTORY=/Download/paddle_inference_jetson

python setup.py build
python setup.py bdist_wheel
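The ON/OFF options above are plain environment variables that the build script reads at configure time. A minimal, hypothetical illustration of that pattern (the opt helper is not part of FastDeploy):

```python
import os

def opt(name, default="OFF"):
    """Read a boolean ON/OFF build option from the environment."""
    return os.environ.get(name, default).upper() == "ON"

# Simulate the exports from the shell session above
os.environ["BUILD_ON_JETSON"] = "ON"
os.environ["ENABLE_VISION"] = "ON"

print(opt("BUILD_ON_JETSON"))        # True
print(opt("ENABLE_PADDLE_BACKEND"))  # False: unset options fall back to OFF
```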

Once the build finishes, a wheel package is generated in the FastDeploy/python/dist directory; install it directly with pip install.

If you change build options during development, delete the two subdirectories build and .setuptools-cmake-build under FastDeploy/python before rebuilding, to avoid stale-cache effects.
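The cache cleanup described above amounts to removing those two directories before re-running the build, e.g. from inside FastDeploy/python:

```shell
# Remove stale build caches so changed options take full effect
rm -rf build .setuptools-cmake-build
```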
