I am working on a project to replace touch input with hand tracking for system-wide interactions (e.g., scrolling, swiping, opening apps) on an Android device powered by Snapdragon XR1. I want to enable this functionality at the system level, so it works on the home screen and across all apps, not just within a specific application.

What I Want to Achieve:

  • Detect and track hand movements using the device's camera.
  • Map hand gestures to system input events (e.g., swipe, scroll, tap).
  • Inject these input events into the Android system so they can be recognized by the home screen and other apps.
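The middle step above (mapping tracked hand motion to discrete gestures) can be sketched independently of any particular tracker. The snippet below is a minimal, hypothetical classifier that turns the displacement of a single normalized landmark (e.g., the wrist landmark that a tracker such as MediaPipe Hands reports in [0, 1] image coordinates) into a swipe or tap; the threshold values are illustrative, not tuned.

```java
// Hypothetical sketch: classify the displacement of one tracked hand
// landmark (normalized [0,1] coordinates) into a discrete gesture.
// Thresholds are illustrative placeholders, not tuned values.
public class GestureClassifier {
    public enum Gesture { NONE, TAP, SWIPE_LEFT, SWIPE_RIGHT, SWIPE_UP, SWIPE_DOWN }

    private static final float SWIPE_THRESHOLD = 0.15f; // min travel for a swipe
    private static final float TAP_THRESHOLD   = 0.02f; // max jitter for a "tap"

    /** Classify the movement between two normalized landmark positions. */
    public static Gesture classify(float x0, float y0, float x1, float y1) {
        float dx = x1 - x0, dy = y1 - y0;
        // Barely any motion: treat as a tap-like dwell.
        if (Math.abs(dx) < TAP_THRESHOLD && Math.abs(dy) < TAP_THRESHOLD) {
            return Gesture.TAP;
        }
        if (Math.abs(dx) >= Math.abs(dy)) {
            if (dx >  SWIPE_THRESHOLD) return Gesture.SWIPE_RIGHT;
            if (dx < -SWIPE_THRESHOLD) return Gesture.SWIPE_LEFT;
        } else {
            // Image coordinates grow downward, so +dy is a downward swipe.
            if (dy >  SWIPE_THRESHOLD) return Gesture.SWIPE_DOWN;
            if (dy < -SWIPE_THRESHOLD) return Gesture.SWIPE_UP;
        }
        return Gesture.NONE;
    }

    public static void main(String[] args) {
        // A large rightward move of the landmark.
        System.out.println(classify(0.5f, 0.5f, 0.8f, 0.5f));
    }
}
```

In practice you would debounce over several frames and track per-finger state rather than a single landmark, but the classification step itself stays app-agnostic, which is what makes it reusable regardless of which Android layer does the injection.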

I’ve looked into using a foreground service to run the hand tracking in the background, but I’m unsure which layer of the Android architecture (e.g., HAL, Linux kernel, or Android framework) is the most appropriate place for the injection side. Specifically, I don’t know whether to:

  • Modify the Android framework layer to create a custom system service.
  • Work at the HAL layer for hardware-specific optimizations (e.g., offloading to the Snapdragon XR1's DSP/GPU).
  • Modify the Linux kernel layer for low-level hardware access.
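For context on what any of those layers ultimately has to produce: a recognized swipe must be expanded into a timed sequence of pointer samples before it can be injected, whether that happens via `AccessibilityService.dispatchGesture()` (built from `GestureDescription.StrokeDescription`, no system privileges needed), via `Instrumentation.sendPointerSync()` with the platform-level `INJECT_EVENTS` permission, or via a kernel `uinput` device. The sketch below shows only that synthesis step in plain Java; the class and method names are illustrative, and the Android-specific injection call is deliberately left out.

```java
// Hypothetical sketch of the event-synthesis step: expand one recognized
// swipe into evenly spaced, timestamped pointer samples. On Android, each
// sample would become a point on a GestureDescription.StrokeDescription path
// (accessibility route) or a MotionEvent (privileged injection route).
import java.util.ArrayList;
import java.util.List;

public class SwipeSynthesizer {
    /** One synthetic touch sample: screen position plus time offset in ms. */
    public record Sample(float x, float y, long tMillis) {}

    /** Linearly interpolate a swipe into (steps + 1) evenly spaced samples. */
    public static List<Sample> synthesize(float x0, float y0, float x1, float y1,
                                          long durationMillis, int steps) {
        List<Sample> samples = new ArrayList<>();
        for (int i = 0; i <= steps; i++) {
            float t = (float) i / steps; // interpolation fraction in [0, 1]
            samples.add(new Sample(x0 + (x1 - x0) * t,
                                   y0 + (y1 - y0) * t,
                                   (long) (durationMillis * t)));
        }
        return samples;
    }

    public static void main(String[] args) {
        // An upward swipe from (300, 1200) to (300, 400) over 200 ms.
        for (Sample s : synthesize(300, 1200, 300, 400, 200, 4)) {
            System.out.println(s);
        }
    }
}
```

Keeping this step in plain code like this also makes the architectural choice less risky: the tracking-and-synthesis pipeline stays the same whether the samples are ultimately delivered by an accessibility service, a platform-signed system service, or a kernel-level input device.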

Title: How to Implement System-Wide Hand Tracking on Android Using Snapdragon XR1 (Stack Overflow; tagged: mediapipe)