I'm not sure if my post fully complies with the rules or if anyone will be able to clone the repo, but I'm stuck.

After changing the Swift Language Version from 5 to 6 and making a few adjustments (strict concurrency) to Apple's sample project available here: https://developer.apple.com/tutorials/sample-apps/capturingphotos-captureandsave, it unfortunately crashes upon opening.

I'm trying to understand why this happens. I noticed that while the photo gallery works fine in the app, the camera session does not. My question is: does anyone know what I might be doing wrong? Interestingly, if I revert to Swift 5, the crash doesn't occur even with the changes.

Here are the relevant code snippets: where I configure the capture session, where I start it, and where I process the sample buffer:

class Camera {
    …

    private func configureCaptureSession(completionHandler: (_ success: Bool) -> Void) {
        
        var success = false
        
        self.captureSession.beginConfiguration()
        
        defer {
            self.captureSession.commitConfiguration()
            completionHandler(success)
        }
        
        guard
            let captureDevice = captureDevice,
            let deviceInput = try? AVCaptureDeviceInput(device: captureDevice)
        else {
            logger.error("Failed to obtain video input.")
            return
        }
        
        let photoOutput = AVCapturePhotoOutput()
                        
        captureSession.sessionPreset = AVCaptureSession.Preset.photo

        let videoOutput = AVCaptureVideoDataOutput()
        videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "VideoDataOutputQueue"))
  
        guard captureSession.canAddInput(deviceInput) else {
            logger.error("Unable to add device input to capture session.")
            return
        }
        guard captureSession.canAddOutput(photoOutput) else {
            logger.error("Unable to add photo output to capture session.")
            return
        }
        guard captureSession.canAddOutput(videoOutput) else {
            logger.error("Unable to add video output to capture session.")
            return
        }
        
        captureSession.addInput(deviceInput)
        captureSession.addOutput(photoOutput)
        captureSession.addOutput(videoOutput)
        
        self.deviceInput = deviceInput
        self.photoOutput = photoOutput
        self.videoOutput = videoOutput
        
        photoOutput.isHighResolutionCaptureEnabled = true
        photoOutput.maxPhotoQualityPrioritization = .quality
        
        updateVideoOutputConnection()
        
        isCaptureSessionConfigured = true
        
        success = true
    }

    …

    func start() async {
        let authorized = await checkAuthorization()
        guard authorized else {
            logger.error("Camera access was not authorized.")
            return
        }
        
        if isCaptureSessionConfigured {
            if !captureSession.isRunning {
                sessionQueue.async { [self] in
                    self.captureSession.startRunning()
                }
            }
            return
        }
        
        sessionQueue.async { [self] in
            self.configureCaptureSession { success in
                guard success else { return }
                self.captureSession.startRunning()
            }
        }
    }
}

extension Camera: @preconcurrency AVCaptureVideoDataOutputSampleBufferDelegate {

    @MainActor    
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        guard let pixelBuffer = sampleBuffer.imageBuffer else { return }
        
        guard let device = captureDevice else { return }
        let rotationCoordinator = AVCaptureDevice.RotationCoordinator(
            device: device,
            previewLayer: nil
        )
        
        if connection.isVideoOrientationSupported {
            connection.videoRotationAngle = rotationCoordinator.videoRotationAngleForHorizonLevelCapture
        }

        addToPreviewStream?(CIImage(cvPixelBuffer: pixelBuffer))
    }
}

If anyone is willing to take a look, I have prepared a repo with two commits. In the initial commit I added Apple's example code verbatim, and in the 'Swift 6 update' commit I made some adjustments to silence the compiler errors. However, I'm not sure whether I did it correctly.

link to repo: https://github.com/miltenkot/CapturingPhotosSwift6

P.S.: To test the camera session you need to run this example on a real device; on the simulator you can only open the photo library, so there is no crash.

1 Answer


tl;dr

The problem is that the AVCaptureVideoDataOutputSampleBufferDelegate protocol has a nonisolated requirement, captureOutput(_:didOutput:from:), but your implementation is isolated to the main actor. An isolated method cannot be used to satisfy a nonisolated protocol requirement.
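
To see the rule in isolation, here is a minimal, self-contained sketch. The names (Worker, MainActorWorker, doWork) are made up for illustration and are not from the sample project:

protocol Worker {
    // A plain (nonisolated) requirement: the caller, not the conforming type,
    // decides which thread or queue invokes it.
    func doWork()
}

@MainActor
final class MainActorWorker: Worker {
    var counter = 0   // main-actor-isolated state

    // Declaring the method nonisolated is what satisfies the requirement.
    // If it were left main-actor-isolated, Swift 6 would reject the conformance
    // with the same kind of error quoted later in this answer.
    nonisolated func doWork() {
        // Hop to the main actor explicitly for any isolated state.
        Task { @MainActor in
            self.counter += 1
        }
    }
}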


When the app crashes, you can expand the thread panel on the left in Xcode's debug navigator, and you will see the name of the queue on which it is crashing, namely VideoDataOutputQueue.

That is the thread associated with the queue that was supplied when the AVCaptureVideoDataOutput was instantiated and its sample buffer delegate and queue were specified:

let videoOutput = AVCaptureVideoDataOutput()
videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "VideoDataOutputQueue"))

That queue is the one on which the AVCaptureVideoDataOutputSampleBufferDelegate method captureOutput(_:didOutput:from:) is called.

But you have marked that method as @MainActor (presumably in order to access the deviceOrientation property, which is isolated to the main actor):

extension Camera: @preconcurrency AVCaptureVideoDataOutputSampleBufferDelegate {

    @MainActor
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        guard let pixelBuffer = sampleBuffer.imageBuffer else { return }
        
        if connection.isVideoOrientationSupported,
           let videoOrientation = videoOrientationFor(deviceOrientation) {
            connection.videoOrientation = videoOrientation
        }

        addToPreviewStream?(CIImage(cvPixelBuffer: pixelBuffer))
    }

}

The use of @preconcurrency (which, admittedly, the compiler actually suggests as a possible work-around) silenced a critical compiler error:

Main actor-isolated instance method 'captureOutput(_:didOutput:from:)' cannot be used to satisfy nonisolated protocol requirement.

Bottom line: you cannot simply add actor isolation to a delegate method that a framework calls. The framework dictates how, and on which queue, it is called, not you. You must remove the @MainActor qualifier from captureOutput.

Once you do that, the crash disappears.
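
For completeness, here is a rough sketch of what the conformance could look like once the isolation is removed. This assumes Camera itself is not main-actor-isolated (as in Apple's original sample) and that addToPreviewStream is a plain closure property; the rotation handling is omitted, since any main-actor-only state it needs (such as deviceOrientation) would have to be read elsewhere rather than from the delegate queue:

extension Camera: AVCaptureVideoDataOutputSampleBufferDelegate {

    // Nonisolated, matching the protocol requirement: AVFoundation is free to
    // call this on the "VideoDataOutputQueue" queue that was passed to
    // setSampleBufferDelegate(_:queue:).
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        guard let pixelBuffer = sampleBuffer.imageBuffer else { return }
        addToPreviewStream?(CIImage(cvPixelBuffer: pixelBuffer))
    }
}

With the method nonisolated, the @preconcurrency attribute on the conformance should no longer be needed to silence the isolation mismatch.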
