[camera_avfoundation] Implementation swift migration - part 5 #9397

Open · wants to merge 4 commits into base: `main`
6 changes: 6 additions & 0 deletions packages/camera/camera_avfoundation/CHANGELOG.md
@@ -1,3 +1,9 @@
## 0.9.19+3

* Migrates lifecycle methods (`start`, `stop`, `close`) to Swift.
* Migrates exposure and focus related methods to Swift.
* Migrates `receivedImageStreamData` and `reportInitializationState` methods to Swift.

## 0.9.19+2

* Adds the `Camera` Swift protocol.
@@ -33,15 +33,23 @@ protocol Camera: FlutterTexture, AVCaptureVideoDataOutputSampleBufferDelegate,

func setUpCaptureSessionForAudioIfNeeded()

/// Informs the Dart side of the plugin of the current camera state and capabilities.
func reportInitializationState()

/// Acknowledges the receipt of one image stream frame.
///
/// This should be called each time a frame is received. Failing to call it may
/// cause later frames to be dropped instead of streamed.
func receivedImageStreamData()

func start()
func stop()

/// Starts recording a video with an optional streaming messenger.
///
/// If the messenger is non-nil, each captured frame is also streamed to the
/// Dart side through it, allowing image streaming concurrently with recording.
///
/// @param messengerForStreaming Nullable messenger used to stream each captured frame.
func startVideoRecording(
completion: @escaping (_ error: FlutterError?) -> Void,
messengerForStreaming: FlutterBinaryMessenger?
Expand All @@ -60,12 +68,31 @@ protocol Camera: FlutterTexture, AVCaptureVideoDataOutputSampleBufferDelegate,

func setExposureMode(_ mode: FCPPlatformExposureMode)
func setExposureOffset(_ offset: Double)

/// Sets the exposure point, in a (0,1) coordinate system.
///
/// If @c point is nil, the exposure point will reset to the center.
func setExposurePoint(
_ point: FCPPlatformPoint?,
withCompletion: @escaping (_ error: FlutterError?) -> Void
)

/// Sets the focus mode on the current AVCaptureDevice.
///
/// If @c focusMode is set to FocusModeAuto, the AVCaptureDevice is configured to use
/// AVCaptureFocusModeContinuousAutoFocus when supported; otherwise it falls back to
/// AVCaptureFocusModeAutoFocus. If neither AVCaptureFocusModeContinuousAutoFocus nor
/// AVCaptureFocusModeAutoFocus is supported, the focus mode is not changed.
/// If @c focusMode is set to FocusModeLocked, the AVCaptureDevice is configured to use
/// AVCaptureFocusModeAutoFocus. If AVCaptureFocusModeAutoFocus is not supported, the focus
/// mode is not changed.
///
/// @param mode The focus mode that should be applied.
func setFocusMode(_ mode: FCPPlatformFocusMode)

/// Sets the focus point, in a (0,1) coordinate system.
///
/// If @c point is nil, the focus point will reset to the center.
func setFocusPoint(
_ point: FCPPlatformPoint?,
completion: @escaping (_ error: FlutterError?) -> Void
@@ -23,10 +23,182 @@ final class DefaultCamera: FLTCam, Camera {

/// Maximum number of frames pending processing.
/// To limit memory consumption, limit the number of frames pending processing.
/// After some testing, 4 was determined to be the best maximum value.
/// https://github.com/flutter/plugins/pull/4520#discussion_r766335637
private var maxStreamingPendingFramesCount = 4

private var exposureMode = FCPPlatformExposureMode.auto
private var focusMode = FCPPlatformFocusMode.auto

func reportInitializationState() {
// Get all the state on the current thread, not the main thread.
let state = FCPPlatformCameraState.make(
withPreviewSize: FCPPlatformSize.make(
withWidth: Double(previewSize.width),
height: Double(previewSize.height)
),
exposureMode: exposureMode,
focusMode: focusMode,
exposurePointSupported: captureDevice.isExposurePointOfInterestSupported,
focusPointSupported: captureDevice.isFocusPointOfInterestSupported
)

FLTEnsureToRunOnMainQueue { [weak self] in
self?.dartAPI?.initialized(with: state) { _ in
// Ignore any errors, as this is just an event broadcast.
}
}
}

func receivedImageStreamData() {
streamingPendingFramesCount -= 1
}
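The pending-frames accounting above pairs with `maxStreamingPendingFramesCount`: frames are forwarded only while fewer than the maximum number of acknowledgements are outstanding, and dropped otherwise. A minimal standalone sketch of that backpressure scheme (the increment/drop site is assumed here, as it is not part of this hunk):

```swift
// Sketch of the frame backpressure scheme implied by
// maxStreamingPendingFramesCount and receivedImageStreamData().
final class FrameStreamer {
    let maxPending = 4
    private(set) var pending = 0
    private(set) var sent = 0
    private(set) var dropped = 0

    // Called for every captured frame (assumed call site).
    func onFrame() {
        guard pending < maxPending else {
            dropped += 1  // consumer is behind; drop instead of buffering
            return
        }
        pending += 1
        sent += 1
    }

    // Mirrors receivedImageStreamData(): the Dart side acks one frame.
    func receivedImageStreamData() { pending -= 1 }
}
```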

func start() {
videoCaptureSession.startRunning()
audioCaptureSession.startRunning()
}

func stop() {
videoCaptureSession.stopRunning()
audioCaptureSession.stopRunning()
}

func setExposureMode(_ mode: FCPPlatformExposureMode) {
exposureMode = mode
applyExposureMode()
}

private func applyExposureMode() {
try? captureDevice.lockForConfiguration()
Contributor:
> Can we do `defer { captureDevice.unlockForConfiguration() }` in case we update the code and introduce an early exit?

Contributor Author:
> Do you mean an early exit after `lockForConfiguration` fails? If so, I wouldn't add `defer`, because I don't think we have to call `unlockForConfiguration` if `lockForConfiguration` has failed (currently we discard the lock result, in line with the ObjC implementation).
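A self-contained sketch of the trade-off discussed in this thread, using a hypothetical `ConfigLock` type in place of `AVCaptureDevice`: registering `defer` only after a successful lock keeps the unlock paired with every exit path, while a failed lock never triggers a spurious unlock.

```swift
import Foundation

// Hypothetical stand-in for AVCaptureDevice's
// lockForConfiguration()/unlockForConfiguration() pair.
final class ConfigLock {
    private(set) var lockCount = 0
    private(set) var unlockCount = 0
    var shouldFail = false

    func lockForConfiguration() throws {
        if shouldFail { throw NSError(domain: "demo", code: 1) }
        lockCount += 1
    }
    func unlockForConfiguration() { unlockCount += 1 }
}

// defer is registered only after the lock succeeds, so a failed lock
// never triggers a spurious unlock, and any early return after the
// lock still unlocks exactly once.
func configure(_ device: ConfigLock, earlyExit: Bool) {
    guard (try? device.lockForConfiguration()) != nil else { return }
    defer { device.unlockForConfiguration() }
    if earlyExit { return }  // unlock still runs via defer
    // ... apply configuration here ...
}
```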

switch exposureMode {
case .locked:
// AVCaptureExposureMode.autoExpose automatically adjusts the exposure one time, and then locks exposure for the device
captureDevice.setExposureMode(.autoExpose)
case .auto:
if captureDevice.isExposureModeSupported(.continuousAutoExposure) {
captureDevice.setExposureMode(.continuousAutoExposure)
} else {
captureDevice.setExposureMode(.autoExpose)
}
@unknown default:
assertionFailure("Unknown exposure mode")
}
captureDevice.unlockForConfiguration()
}

func setExposureOffset(_ offset: Double) {
try? captureDevice.lockForConfiguration()
captureDevice.setExposureTargetBias(Float(offset), completionHandler: nil)
captureDevice.unlockForConfiguration()
Contributor:
> Same here.
}

func setExposurePoint(
_ point: FCPPlatformPoint?, withCompletion completion: @escaping (FlutterError?) -> Void
) {
guard captureDevice.isExposurePointOfInterestSupported else {
completion(
FlutterError(
code: "setExposurePointFailed",
message: "Device does not have exposure point capabilities",
details: nil))
return
}

let orientation = UIDevice.current.orientation
try? captureDevice.lockForConfiguration()
// A nil point resets to the center.
let exposurePoint = cgPoint(
for: point ?? FCPPlatformPoint.makeWith(x: 0.5, y: 0.5), withOrientation: orientation)
captureDevice.setExposurePointOfInterest(exposurePoint)
captureDevice.unlockForConfiguration()
// Retrigger auto exposure
applyExposureMode()
completion(nil)
}

func setFocusMode(_ mode: FCPPlatformFocusMode) {
focusMode = mode
applyFocusMode()
}

func setFocusPoint(_ point: FCPPlatformPoint?, completion: @escaping (FlutterError?) -> Void) {
guard captureDevice.isFocusPointOfInterestSupported else {
completion(
FlutterError(
code: "setFocusPointFailed",
message: "Device does not have focus point capabilities",
details: nil))
return
}

let orientation = deviceOrientationProvider.orientation()
try? captureDevice.lockForConfiguration()
// A nil point resets to the center.
captureDevice.setFocusPointOfInterest(
cgPoint(
for: point ?? .makeWith(x: 0.5, y: 0.5),
withOrientation: orientation)
)
captureDevice.unlockForConfiguration()
// Retrigger auto focus
applyFocusMode()
completion(nil)
}

private func applyFocusMode() {
applyFocusMode(focusMode, onDevice: captureDevice)
}

private func applyFocusMode(
_ focusMode: FCPPlatformFocusMode, onDevice captureDevice: FLTCaptureDevice
) {
try? captureDevice.lockForConfiguration()
switch focusMode {
case .locked:
// AVCaptureFocusMode.autoFocus automatically adjusts the focus one time, and then locks focus
if captureDevice.isFocusModeSupported(.autoFocus) {
captureDevice.setFocusMode(.autoFocus)
}
case .auto:
if captureDevice.isFocusModeSupported(.continuousAutoFocus) {
captureDevice.setFocusMode(.continuousAutoFocus)
} else if captureDevice.isFocusModeSupported(.autoFocus) {
captureDevice.setFocusMode(.autoFocus)
}
@unknown default:
assertionFailure("Unknown focus mode")
}
captureDevice.unlockForConfiguration()
}

private func cgPoint(
for point: FCPPlatformPoint, withOrientation orientation: UIDeviceOrientation
)
-> CGPoint
{
var x = point.x
var y = point.y
switch orientation {
case .portrait: // 90 ccw
y = 1 - point.x
x = point.y
case .portraitUpsideDown: // 90 cw
x = 1 - point.y
y = point.x
case .landscapeRight: // 180
x = 1 - point.x
y = 1 - point.y
case .landscapeLeft:
// No rotation required
break
default:
// No rotation required
break
}
return CGPoint(x: x, y: y)
}
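The orientation remapping above can be sanity-checked in isolation. This is a standalone re-implementation using plain doubles (not the plugin's `FCPPlatformPoint`/`CGPoint` types):

```swift
// Standalone copy of the remapping in cgPoint(for:withOrientation:),
// operating on (0,1)-normalized coordinates.
enum Orientation { case portrait, portraitUpsideDown, landscapeLeft, landscapeRight }

func remap(x: Double, y: Double, orientation: Orientation) -> (x: Double, y: Double) {
    switch orientation {
    case .portrait:           return (x: y, y: 1 - x)      // 90° ccw
    case .portraitUpsideDown: return (x: 1 - y, y: x)      // 90° cw
    case .landscapeRight:     return (x: 1 - x, y: 1 - y)  // 180°
    case .landscapeLeft:      return (x: x, y: y)          // no rotation
    }
}
```

Note that the center point (0.5, 0.5) is a fixed point of every rotation, which is why resetting to the center works regardless of orientation.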

func captureOutput(
_ output: AVCaptureOutput,
didOutput sampleBuffer: CMSampleBuffer,
@@ -241,6 +413,22 @@ final class DefaultCamera: FLTCam, Camera {
}
}

func close() {
stop()
for input in videoCaptureSession.inputs {
videoCaptureSession.removeInput(FLTDefaultCaptureInput(input: input))
}
for output in videoCaptureSession.outputs {
videoCaptureSession.removeOutput(output)
}
for input in audioCaptureSession.inputs {
audioCaptureSession.removeInput(FLTDefaultCaptureInput(input: input))
}
for output in audioCaptureSession.outputs {
audioCaptureSession.removeOutput(output)
}
}

func copyPixelBuffer() -> Unmanaged<CVPixelBuffer>? {
var pixelBuffer: CVPixelBuffer?
pixelBufferSynchronizationQueue.sync {