Building a Holistic SDK with MediaPipe in a WSL Environment

About MediaPipe

MediaPipe is an open-source Google framework for building machine-learning pipelines that process time-series data such as video and audio.

This article walks through generating a holistic (whole-body detection) SDK with bazel commands, so that you can customize the specific resources each project needs, such as MediaPipe calculators.

Steps to Build the MediaPipe AAR

1. Enter the app directory and create the required files

Enter the app directory

cd mediapipe/examples/android/src/java/com/google/mediapipe/apps/ 

Create a folder

mkdir build_aar

Enter the build directory

cd build_aar 

Create and open the BUILD file

vim BUILD
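Taken together, the commands above can be run as one shell session. MEDIAPIPE_ROOT is an assumption for where your mediapipe checkout lives; adjust it to your own path:

```shell
# Sketch of step 1. MEDIAPIPE_ROOT is an assumed checkout location; inside a
# real checkout the apps directory already exists and mkdir -p simply adds the
# new package folder without touching anything else.
MEDIAPIPE_ROOT="${MEDIAPIPE_ROOT:-$HOME/mediapipe}"
APPS_DIR="$MEDIAPIPE_ROOT/mediapipe/examples/android/src/java/com/google/mediapipe/apps"
mkdir -p "$APPS_DIR/build_aar"
cd "$APPS_DIR/build_aar"
# vim BUILD   # open vim here and paste in the BUILD contents from step 2
```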

2. Edit the BUILD file

Because we have just created the BUILD file, it is empty when vim opens it. Press the i key to switch into insert mode; you can then edit the file. (Pressing ESC returns you to normal mode, where commands such as :w are entered.)

Enter the following code:

load("//mediapipe/java/com/google/mediapipe:mediapipe_aar.bzl", "mediapipe_aar")

mediapipe_aar(
    name = "aar_holistic",
    calculators = ["//mediapipe/graphs/holistic_tracking:holistic_tracking_gpu_deps"],
)

cc_library(
    name = "mediapipe_jni_lib",
    srcs = [":libmediapipe_jni.so"],
    alwayslink = 1,
)

android_library(
    name = "holistic",
    srcs = [],
    assets = [
        "//mediapipe/graphs/holistic_tracking:holistic_tracking_gpu.binarypb",
        "//mediapipe/modules/face_detection:face_detection_short_range.tflite",
        "//mediapipe/modules/face_landmark:face_landmark.tflite",
        "//mediapipe/modules/hand_landmark:hand_landmark_full.tflite",
        "//mediapipe/modules/hand_landmark:hand_landmark_lite.tflite",
        "//mediapipe/modules/hand_landmark:handedness.txt",
        "//mediapipe/modules/holistic_landmark:hand_recrop.tflite",
        "//mediapipe/modules/pose_detection:pose_detection.tflite",
        "//mediapipe/modules/pose_landmark:pose_landmark_full.tflite",
    ],
    assets_dir = "",
    javacopts = ["-Acom.google.auto.value.AutoBuilderIsUnstable"],
    manifest = ":AndroidManifest.xml",
    visibility = ["//visibility:public"],
    deps = [
        "//mediapipe/framework/formats:classification_java_proto_lite",
        "//mediapipe/framework/formats:landmark_java_proto_lite",
        "//mediapipe/java/com/google/mediapipe/framework:android_framework",
        "//mediapipe/java/com/google/mediapipe/solutioncore:solution_base",
        "//third_party:autovalue",
        "@maven//:androidx_annotation_annotation",
        "@maven//:com_google_code_findbugs_jsr305",
        "@maven//:com_google_guava_guava",
    ],
)

Once you have finished typing, double-check the formatting: the file must not contain tab characters (use spaces for indentation). Press ESC to leave insert mode, type :w to save, then type :q to end the editing session.

3. Run the bazel commands

  • After saving the BUILD file, return to the mediapipe root directory. Once you have confirmed the environment and configuration are in order, run the following commands one after the other:
bazel build -c opt --strip=ALWAYS \
    --host_crosstool_top=@bazel_tools//tools/cpp:toolchain \
    --fat_apk_cpu=arm64-v8a,armeabi-v7a \
    --legacy_whole_archive=0 \
    --features=-legacy_whole_archive \
    --copt=-fvisibility=hidden \
    --copt=-ffunction-sections \
    --copt=-fdata-sections \
    --copt=-fstack-protector \
    --copt=-Oz \
    --copt=-fomit-frame-pointer \
    --copt=-DABSL_MIN_LOG_LEVEL=2 \
    --linkopt=-Wl,--gc-sections,--strip-all \
    //mediapipe/examples/android/src/java/com/google/mediapipe/apps/build_aar:aar_holistic.aar

bazel build -c opt --strip=ALWAYS \
    --host_crosstool_top=@bazel_tools//tools/cpp:toolchain \
    --fat_apk_cpu=arm64-v8a,armeabi-v7a \
    --legacy_whole_archive=0 \
    --features=-legacy_whole_archive \
    --copt=-fvisibility=hidden \
    --copt=-ffunction-sections \
    --copt=-fdata-sections \
    --copt=-fstack-protector \
    --copt=-Oz \
    --copt=-fomit-frame-pointer \
    --copt=-DABSL_MIN_LOG_LEVEL=2 \
    --linkopt=-Wl,--gc-sections,--strip-all \
    //mediapipe/examples/android/src/java/com/google/mediapipe/apps/build_aar:holistic.aar

After a successful build, the console prints the path to the aar. Go to that path and use cp to copy the aar over to the Windows file system.

Note: the two commands in the code block produce different aar files; both need to be copied.
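The copy step can be sketched as a small helper. The bazel-bin path and the /mnt/c destination shown in the comment are assumptions; use the AAR path that bazel actually prints, and any Windows folder you like (WSL mounts the C: drive at /mnt/c):

```shell
# copy_aars SRC DST: copy every .aar found in SRC into DST, creating DST if needed.
copy_aars() {
    mkdir -p "$2"
    cp "$1"/*.aar "$2"/
}

# In a real session SRC is the bazel-bin package directory and DST any
# Windows-visible folder, e.g.:
#   copy_aars bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/apps/build_aar \
#             /mnt/c/Users/<you>/aars
```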

Integrating into a Project

Create an Android demo app, add the required dependencies to the gradle file, and place the packaged aars in the app module's libs folder.

dependencies {
    implementation fileTree(dir: 'libs', include: ['*.jar', '*.aar'])
    implementation 'androidx.appcompat:appcompat:1.0.2'
    implementation 'androidx.constraintlayout:constraintlayout:1.1.3'
    testImplementation 'junit:junit:4.12'
    androidTestImplementation 'androidx.test.ext:junit:1.1.0'
    androidTestImplementation 'androidx.test.espresso:espresso-core:3.1.1'
    // MediaPipe deps
    implementation 'com.google.flogger:flogger:latest.release'
    implementation 'com.google.flogger:flogger-system-backend:latest.release'
    implementation 'com.google.code.findbugs:jsr305:latest.release'
    implementation 'com.google.guava:guava:27.0.1-android'
    implementation 'com.google.protobuf:protobuf-javalite:3.19.1'
    // CameraX core library
    def camerax_version = "1.0.0-beta10"
    implementation "androidx.camera:camera-core:$camerax_version"
    implementation "androidx.camera:camera-camera2:$camerax_version"
    implementation "androidx.camera:camera-lifecycle:$camerax_version"
    // AutoValue
    def auto_value_version = "1.8.1"
    implementation "com.google.auto.value:auto-value-annotations:$auto_value_version"
    annotationProcessor "com.google.auto.value:auto-value:$auto_value_version"
}

Replace MainActivity with the following code (note that the class here is named holistic_activity and uses the layout activity_holistic_activity; adjust these names to match your project).

import android.content.pm.ApplicationInfo;
import android.content.pm.PackageManager;
import android.graphics.SurfaceTexture;
import android.os.Bundle;
import androidx.appcompat.app.AppCompatActivity;
import android.util.Log;
import android.util.Size;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.ViewGroup;
import com.google.mediapipe.components.CameraHelper;
import com.google.mediapipe.components.CameraXPreviewHelper;
import com.google.mediapipe.components.ExternalTextureConverter;
import com.google.mediapipe.components.FrameProcessor;
import com.google.mediapipe.components.PermissionHelper;
import com.google.mediapipe.framework.AndroidAssetUtil;
import com.google.mediapipe.glutil.EglManager;

public class holistic_activity extends AppCompatActivity {
  private static final String TAG = "MainActivity";

  // Flips the camera-preview frames vertically by default, before sending them into FrameProcessor
  // to be processed in a MediaPipe graph, and flips the processed frames back when they are
  // displayed. This may be needed because OpenGL represents images assuming the image origin is at
  // the bottom-left corner, whereas MediaPipe in general assumes the image origin is at the
  // top-left corner.
  // NOTE: use "flipFramesVertically" in manifest metadata to override this behavior.
  private static final boolean FLIP_FRAMES_VERTICALLY = true;

  static {
    // Load all native libraries needed by the app.
    System.loadLibrary("mediapipe_jni");
    try {
      System.loadLibrary("opencv_java3");
    } catch (UnsatisfiedLinkError e) {
      // Some example apps (e.g. template matching) require OpenCV 4.
      System.loadLibrary("opencv_java4");
    }
  }

  // Sends camera-preview frames into a MediaPipe graph for processing, and displays the processed
  // frames onto a {@link Surface}.
  protected FrameProcessor processor;
  // Handles camera access via the {@link CameraX} Jetpack support library.
  protected CameraXPreviewHelper cameraHelper;
  // {@link SurfaceTexture} where the camera-preview frames can be accessed.
  private SurfaceTexture previewFrameTexture;
  // {@link SurfaceView} that displays the camera-preview frames processed by a MediaPipe graph.
  private SurfaceView previewDisplayView;
  // Creates and manages an {@link EGLContext}.
  private EglManager eglManager;
  // Converts the GL_TEXTURE_EXTERNAL_OES texture from Android camera into a regular texture to be
  // consumed by {@link FrameProcessor} and the underlying MediaPipe graph.
  private ExternalTextureConverter converter;
  // ApplicationInfo for retrieving metadata defined in the manifest.
  private ApplicationInfo applicationInfo;

  @Override
  protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_holistic_activity);
    try {
      applicationInfo =
          getPackageManager().getApplicationInfo(getPackageName(), PackageManager.GET_META_DATA);
    } catch (PackageManager.NameNotFoundException e) {
      Log.e(TAG, "Cannot find application info: " + e);
    }
    previewDisplayView = new SurfaceView(this);
    setupPreviewDisplayView();
    // Initialize asset manager so that MediaPipe native libraries can access the app assets, e.g.,
    // binary graphs.
    AndroidAssetUtil.initializeNativeAssetManager(this);
    eglManager = new EglManager(null);
    processor =
        new FrameProcessor(
            this,
            eglManager.getNativeContext(),
            applicationInfo.metaData.getString("binaryGraphName"),
            applicationInfo.metaData.getString("inputVideoStreamName"),
            applicationInfo.metaData.getString("outputVideoStreamName"));
    processor
        .getVideoSurfaceOutput()
        .setFlipY(
            applicationInfo.metaData.getBoolean("flipFramesVertically", FLIP_FRAMES_VERTICALLY));
    PermissionHelper.checkAndRequestCameraPermissions(this);
  }

  @Override
  protected void onResume() {
    super.onResume();
    converter = new ExternalTextureConverter(eglManager.getContext());
    converter.setFlipY(
        applicationInfo.metaData.getBoolean("flipFramesVertically", FLIP_FRAMES_VERTICALLY));
    converter.setConsumer(processor);
    if (PermissionHelper.cameraPermissionsGranted(this)) {
      startCamera();
    }
  }

  @Override
  protected void onPause() {
    super.onPause();
    converter.close();
  }

  @Override
  public void onRequestPermissionsResult(
      int requestCode, String[] permissions, int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    PermissionHelper.onRequestPermissionsResult(requestCode, permissions, grantResults);
  }

  protected void onCameraStarted(SurfaceTexture surfaceTexture) {
    previewFrameTexture = surfaceTexture;
    // Make the display view visible to start showing the preview. This triggers the
    // SurfaceHolder.Callback added to (the holder of) previewDisplayView.
    previewDisplayView.setVisibility(View.VISIBLE);
  }

  protected Size cameraTargetResolution() {
    return null; // No preference and let the camera (helper) decide.
  }

  public void startCamera() {
    cameraHelper = new CameraXPreviewHelper();
    cameraHelper.setOnCameraStartedListener(
        surfaceTexture -> {
          onCameraStarted(surfaceTexture);
        });
    CameraHelper.CameraFacing cameraFacing =
        applicationInfo.metaData.getBoolean("cameraFacingFront", false)
            ? CameraHelper.CameraFacing.FRONT
            : CameraHelper.CameraFacing.BACK;
    cameraHelper.startCamera(
        this, cameraFacing, /*surfaceTexture=*/ null, cameraTargetResolution());
  }

  protected Size computeViewSize(int width, int height) {
    return new Size(width, height);
  }

  protected void onPreviewDisplaySurfaceChanged(
      SurfaceHolder holder, int format, int width, int height) {
    // (Re-)Compute the ideal size of the camera-preview display (the area that the
    // camera-preview frames get rendered onto, potentially with scaling and rotation)
    // based on the size of the SurfaceView that contains the display.
    Size viewSize = computeViewSize(width, height);
    Size displaySize = cameraHelper.computeDisplaySizeFromViewSize(viewSize);
    boolean isCameraRotated = cameraHelper.isCameraRotated();
    // Connect the converter to the camera-preview frames as its input (via
    // previewFrameTexture), and configure the output width and height as the computed
    // display size.
    converter.setSurfaceTextureAndAttachToGLContext(
        previewFrameTexture,
        isCameraRotated ? displaySize.getHeight() : displaySize.getWidth(),
        isCameraRotated ? displaySize.getWidth() : displaySize.getHeight());
  }

  private void setupPreviewDisplayView() {
    previewDisplayView.setVisibility(View.GONE);
    ViewGroup viewGroup = findViewById(R.id.preview_display_layout);
    viewGroup.addView(previewDisplayView);
    previewDisplayView
        .getHolder()
        .addCallback(
            new SurfaceHolder.Callback() {
              @Override
              public void surfaceCreated(SurfaceHolder holder) {
                processor.getVideoSurfaceOutput().setSurface(holder.getSurface());
              }

              @Override
              public void surfaceChanged(
                  SurfaceHolder holder, int format, int width, int height) {
                onPreviewDisplaySurfaceChanged(holder, format, width, height);
              }

              @Override
              public void surfaceDestroyed(SurfaceHolder holder) {
                processor.getVideoSurfaceOutput().setSurface(null);
              }
            });
  }
}
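The activity above configures itself from <meta-data> entries in the manifest. A minimal sketch of what the demo's AndroidManifest.xml would need is shown below; the binarypb name comes from the assets listed in the BUILD file, while the stream names input_video/output_video are assumptions based on the public holistic_tracking_gpu graph, so verify them against your graph if you have customized it:

```xml
<!-- Sketch only: camera permission plus the meta-data keys the activity reads. -->
<uses-permission android:name="android.permission.CAMERA" />

<application>
    <meta-data android:name="binaryGraphName" android:value="holistic_tracking_gpu.binarypb" />
    <meta-data android:name="inputVideoStreamName" android:value="input_video" />
    <meta-data android:name="outputVideoStreamName" android:value="output_video" />
    <meta-data android:name="flipFramesVertically" android:value="true" />
    <meta-data android:name="cameraFacingFront" android:value="true" />
</application>
```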

Summary

With that, we have built the Holistic SDK in a WSL environment. You are likely to run into environment problems along the way; work through the environment setup step by step and, when needed, use sudo su to switch to root.