How to Detect and Anchor AR Content to an Arm or Object at Close Range with ARKit

I'm working on an ARKit project where I need to detect and interact with objects at close range. Specifically, I'm trying to anchor an image onto someone's arm, leg, or chest when they point their camera at it.

The Problem: `ARBodyTrackingConfiguration` in ARKit requires the entire body to be in view, but I need to work with close-range body parts only (e.g., just an arm). When I aim the camera at an arm from close up, ARKit fails to detect the body or provide anything to anchor to. I can currently place an image onto the wall behind the arm, but I can't reliably detect and anchor to the arm itself.

The Desired Outcome:

- Detect an arm, leg, or chest at close range.
- Allow the user to select a point (via tap) on the detected body part.
- Anchor the image to the selected point so it moves with the body part.

Secondary Use Case: I'm exploring similar functionality for objects, such as wrapping an image around a black mug in real life. The workflow would look like this:
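For context, here's a sketch of the Vision-based detection I've been experimenting with as a workaround. `VNDetectHumanBodyPoseRequest` can return partial joint sets even when the full body isn't visible, but at very close range joint confidence drops and I'm not sure this is the right approach (the confidence threshold below is an arbitrary guess):

```swift
import ARKit
import Vision

// Runs 2D human body-pose detection on the current camera frame.
// Unlike ARBodyTrackingConfiguration, Vision can report a partial
// skeleton, so some joints (e.g., a wrist or elbow) may still come
// back when only an arm fills the frame.
func detectArmJoints(in frame: ARFrame,
                     completion: @escaping ([VNHumanBodyPoseObservation.JointName: CGPoint]) -> Void) {
    let request = VNDetectHumanBodyPoseRequest { request, _ in
        guard let observation = request.results?.first as? VNHumanBodyPoseObservation,
              let points = try? observation.recognizedPoints(.all) else {
            completion([:])
            return
        }
        // Keep only confidently detected joints. Locations are
        // normalized image coordinates (0...1, lower-left origin).
        var joints: [VNHumanBodyPoseObservation.JointName: CGPoint] = [:]
        for (name, point) in points where point.confidence > 0.3 { // arbitrary cutoff
            joints[name] = point.location
        }
        completion(joints)
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage,
                                        orientation: .right)
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

The joints come back in 2D image space, so I'd still need to raycast or use scene depth to turn a joint location into a 3D anchor point.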

1. The user selects an image from their phone.
2. They aim the camera at the black mug.
3. They tap the mug in the AR view.
4. The image wraps around the mug realistically (with curvature).

Current Progress: I can detect surfaces using raycasting and place an image on a surface, but:

- This doesn't work for an arm or other body parts at close range.
- The image is placed flat and doesn't curve to match the surface (e.g., a cylindrical mug or arm).

What I Need:

- Detection: Is there a way to detect arms or objects like mugs at close range without requiring the entire body?
- Wrapping: Are there any tutorials or examples for wrapping a 2D image around a cylindrical or curved surface (e.g., a mug or arm)?
- Anchoring: How can I anchor the image to the arm (or an object like a mug) so that it moves with the body or object?

Any advice, examples, or tutorials on how to achieve this would be greatly appreciated!
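For the wrapping part, the closest I've gotten is approximating the tapped object as a cylinder in SceneKit and letting it wrap the image as the side-wall texture. The radius and height here are hard-coded guesses; in practice they'd need to be estimated from the tapped geometry:

```swift
import ARKit
import SceneKit

// Sketch, assuming a SceneKit-based ARSCNView: approximate the mug
// (or arm) as a cylinder and apply the 2D image as its diffuse
// material so SceneKit curves it around the surface.
func placeWrappedImage(_ image: UIImage,
                       at result: ARRaycastResult,
                       in sceneView: ARSCNView) {
    let cylinder = SCNCylinder(radius: 0.04, height: 0.1) // rough mug size, guessed
    let material = SCNMaterial()
    material.diffuse.contents = image
    material.isDoubleSided = true
    // SCNCylinder has three material slots: side wall first, then the
    // top and bottom caps. Reuse one material for all three.
    cylinder.materials = [material, material, material]

    let node = SCNNode(geometry: cylinder)
    node.simdTransform = result.worldTransform
    sceneView.scene.rootNode.addChildNode(node)
}
```

This at least gets curvature, but the node only sticks to a static raycast hit, not to a moving arm or mug.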
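For the anchoring part, the only per-joint approach I've found still goes through `ARBodyTrackingConfiguration`, so it inherits the full-body requirement that's failing for me at close range. A sketch that repositions a content node on a tracked joint each frame (`imageNode` is a hypothetical node already holding the image geometry):

```swift
import ARKit
import SceneKit

// Once an ARBodyAnchor exists (which requires enough of the body in
// view), move the content node onto a tracked joint every frame so
// the image follows the limb.
class BodyFollower: NSObject, ARSessionDelegate {
    let imageNode: SCNNode

    init(imageNode: SCNNode) {
        self.imageNode = imageNode
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            // Joint transforms are expressed relative to the body
            // anchor's origin (the hip joint), so compose them with
            // the anchor's own world transform.
            guard let jointTransform =
                bodyAnchor.skeleton.modelTransform(for: .leftHand) else { continue }
            imageNode.simdTransform = bodyAnchor.transform * jointTransform
        }
    }
}
```

So what I'm really missing is either a way to get an `ARBodyAnchor` (or equivalent) with only an arm in view, or a different tracking mechanism entirely.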
