How to Implement System-Wide Hand Tracking on Android Using Snapdragon XR1?
I am working on a project to replace touch input with hand tracking for system-wide interactions (e.g., scrolling, swiping, opening apps) on an Android device powered by Snapdragon XR1. I want to enable this functionality at the system level, so it works on the home screen and across all apps, not just within a specific application.
What I Want to Achieve:
- Detect and track hand movements using the device's camera.
- Map hand gestures to system input events (e.g., swipe, scroll, tap).
- Inject these input events into the Android system so they can be recognized by the home screen and other apps.
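For the injection part, the only app-level mechanism I'm aware of that can reach the home screen and other apps without a platform signature is an AccessibilityService calling dispatchGesture(). Here is a minimal sketch of what I have in mind — the service name is just a placeholder, and it would still need the usual manifest declaration and the user enabling it in Settings > Accessibility:

```kotlin
import android.accessibilityservice.AccessibilityService
import android.accessibilityservice.GestureDescription
import android.graphics.Path
import android.view.accessibility.AccessibilityEvent

// Placeholder name: must be declared in the manifest with
// android.permission.BIND_ACCESSIBILITY_SERVICE and enabled by the user
// before dispatchGesture() will do anything.
class HandGestureInjectorService : AccessibilityService() {

    override fun onAccessibilityEvent(event: AccessibilityEvent?) = Unit
    override fun onInterrupt() = Unit

    /** Injects a vertical swipe (scroll) into whatever app is in the foreground. */
    fun injectScroll(x: Float, fromY: Float, toY: Float, durationMs: Long = 200L) {
        val path = Path().apply {
            moveTo(x, fromY)
            lineTo(x, toY)
        }
        val gesture = GestureDescription.Builder()
            .addStroke(GestureDescription.StrokeDescription(path, 0L, durationMs))
            .build()
        // Null callback/handler: fire-and-forget; the system delivers the stroke
        // as if the user had touched the screen.
        dispatchGesture(gesture, null, null)
    }
}
```

Is this the right direction, or does system-wide injection of this kind really require going below the framework?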
I’ve looked into using a foreground service to handle hand tracking in the background, but I’m unsure which layer of the Android architecture (e.g., HAL, Linux kernel, or Android framework) would be most appropriate for this. Specifically, I don’t know whether to:
Modify the Android Framework Layer to create a custom system service.
Work at the HAL Layer for hardware-specific optimizations (e.g., using Snapdragon XR1's DSP/GPU).
Modify the Linux Kernel Layer for low-level hardware access.
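Independently of which layer handles the injection, my current plan for the tracking itself is to run MediaPipe's HandLandmarker task inside the foreground service and feed it camera frames, with the GPU delegate as a guess at what maps best onto the XR1. Roughly something like this (the model path and the camera frame source are placeholders):

```kotlin
import android.content.Context
import com.google.mediapipe.framework.image.MPImage
import com.google.mediapipe.tasks.core.BaseOptions
import com.google.mediapipe.tasks.core.Delegate
import com.google.mediapipe.tasks.vision.core.RunningMode
import com.google.mediapipe.tasks.vision.handlandmarker.HandLandmarker
import com.google.mediapipe.tasks.vision.handlandmarker.HandLandmarkerResult

// Builds a HandLandmarker for live camera frames. "hand_landmarker.task" is the
// standard MediaPipe Tasks model bundle; Delegate.GPU is my assumption for the
// XR1 (CPU would be the fallback).
fun createHandLandmarker(
    context: Context,
    onResult: (HandLandmarkerResult) -> Unit
): HandLandmarker {
    val options = HandLandmarker.HandLandmarkerOptions.builder()
        .setBaseOptions(
            BaseOptions.builder()
                .setModelAssetPath("hand_landmarker.task")
                .setDelegate(Delegate.GPU)
                .build()
        )
        .setRunningMode(RunningMode.LIVE_STREAM)
        .setNumHands(1)
        .setResultListener { result: HandLandmarkerResult, _: MPImage ->
            // Landmarks arrive here asynchronously; this is where I would
            // classify swipe/scroll/tap and hand them to the injection side.
            onResult(result)
        }
        .build()
    return HandLandmarker.createFromOptions(context, options)
}

// In the camera callback of the foreground service, each frame would go through:
//   handLandmarker.detectAsync(mpImage, frameTimestampMs)
```

Given that, is a plain app-level service plus gesture injection enough, or would a custom system service, HAL work, or kernel changes actually buy me anything here (latency, access to the DSP, boot-time availability)?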