Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic

Post · Replies · Boosts · Views · Activity

Camera becomes black for a few production users during photo capture
PLATFORM AND VERSION: iOS 18.5

I wanted to bring to your attention a critical issue some of our production users are experiencing with the CoinOut app. Specifically, users are encountering a problem when attempting to capture photos of receipts using the app's customized camera feature. The camera, which uses AVCaptureVideoPreviewLayer and AVCaptureDevice, occasionally fails to load the preview, resulting in a black screen instead of the expected camera view. This camera blackout significantly impacts the user experience because it prevents users from snapping photos of their receipts, a core function of the CoinOut app. Any help or suggestions for this issue would be greatly appreciated.

STEPS TO REPRODUCE
Open the app and tap the camera icon. It displays the camera to capture a photo. The camera shows black for a few production users.

```
class ViewController: UIViewController {
    @IBOutlet private weak var captureButton: UIButton!
    private var fillLayer: CAShapeLayer!
    private var previewLayer: AVCaptureVideoPreviewLayer!
    private var output: AVCapturePhotoOutput!
    private var device: AVCaptureDevice!
    private var session: AVCaptureSession!
    private var highResolutionEnabled: Bool = false
    private let sessionQueue = DispatchQueue(label: "session queue")

    override func viewDidLoad() {
        super.viewDidLoad()
        setupCamera()
        customiseUI()
    }

    @IBAction func startCamera(sender: UIButton) {
        didTapTakePhoto()
    }

    private func setupCamera() {
        let session = AVCaptureSession()
        session.sessionPreset = AVCaptureSession.Preset.high
        previewLayer = AVCaptureVideoPreviewLayer(session: session)
        output = AVCapturePhotoOutput()
        device = AVCaptureDevice.default(.builtInWideAngleCamera, for: AVMediaType.video, position: .back)
        if let device = self.device {
            do {
                let input = try AVCaptureDeviceInput(device: device)
                if session.canAddInput(input) {
                    session.addInput(input)
                } else {
                    print("\(#fileID):\(#function):\(#line) : Session Input addition failed")
                }
                if session.canAddOutput(output) {
                    output.isHighResolutionCaptureEnabled = self.highResolutionEnabled
                    session.addOutput(output)
                } else {
                    print("\(#fileID):\(#function):\(#line) : Session Input high resolution failed")
                }
                previewLayer.videoGravity = .resizeAspectFill
                previewLayer.session = session
                sessionQueue.async {
                    session.startRunning()
                }
                self.session = session
                self.session.accessibilityElementIsFocused()
                try device.lockForConfiguration()
                if device.isWhiteBalanceModeSupported(AVCaptureDevice.WhiteBalanceMode.autoWhiteBalance) {
                    device.whiteBalanceMode = .autoWhiteBalance
                } else {
                    print("\(#fileID):\(#function):\(#line) : isWhiteBalanceModeSupported not supported")
                }
                if device.isWhiteBalanceModeSupported(AVCaptureDevice.WhiteBalanceMode.continuousAutoWhiteBalance) {
                    device.whiteBalanceMode = .continuousAutoWhiteBalance
                } else {
                    print("\(#fileID):\(#function):\(#line) : isWhiteBalanceModeSupported not supported")
                }
                if device.isFocusModeSupported(.continuousAutoFocus) {
                    device.focusMode = .continuousAutoFocus
                } else if device.isFocusModeSupported(.autoFocus) {
                    device.focusMode = .autoFocus
                }
                device.unlockForConfiguration()
            } catch {
                print("\(#fileID):\(#function):\(#line) : \(error.localizedDescription)")
            }
        } else {
            print("\(#fileID):\(#function):\(#line) : Device found as nil")
        }
    }

    private func customiseUI() {
        let path = UIBezierPath(roundedRect: CGRect(x: 0, y: 0, width: self.view.bounds.width, height: self.view.bounds.height), cornerRadius: 0)
        let rectangleWidth = view.frame.width - (view.frame.width * 0.16)
        let x = (view.frame.width - rectangleWidth) / 2
        let rectangleHeight = view.frame.height - (view.frame.height * 0.16)
        let y = (view.frame.height - rectangleHeight) / 2
        let roundRect = UIBezierPath(roundedRect: CGRect(x: x, y: y, width: rectangleWidth, height: rectangleHeight), byRoundingCorners: .allCorners, cornerRadii: CGSize(width: 0, height: 0))
        roundRect.move(to: CGPoint(x: self.view.center.x, y: self.view.center.y))
        path.append(roundRect)
        path.usesEvenOddFillRule = true
        fillLayer = CAShapeLayer()
        fillLayer.path = path.cgPath
        fillLayer.fillRule = .evenOdd
        fillLayer.opacity = 0.4
        previewLayer.addSublayer(fillLayer)
        previewLayer.frame = view.bounds
        view.layer.addSublayer(previewLayer)
        view.bringSubviewToFront(captureButton)
    }

    private func didTapTakePhoto() {
        let settings = self.getSettings(camera: self.device)
        if device.isAdjustingFocus {
            do {
                try device.lockForConfiguration()
                device.focusMode = .continuousAutoFocus
                device.unlockForConfiguration()
                device.addObserver(self, forKeyPath: "adjustingFocus", options: [.new], context: nil)
            } catch {
                print(error)
            }
        } else {
            output.capturePhoto(with: settings, delegate: self)
        }
    }

    func getSettings(camera: AVCaptureDevice) -> AVCapturePhotoSettings {
        var settings = AVCapturePhotoSettings()
        if let rawFormat = output.availableRawPhotoPixelFormatTypes.first {
            settings = AVCapturePhotoSettings(rawPixelFormatType: OSType(rawFormat))
        }
        settings.isHighResolutionPhotoEnabled = self.highResolutionEnabled
        let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
        let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType] as [String: Any]
        settings.previewPhotoFormat = previewFormat
        return settings
    }
}

extension ViewController: AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput, willCapturePhotoFor resolvedSettings: AVCaptureResolvedPhotoSettings) {
        AudioServicesDisposeSystemSoundID(1108)
    }

    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        guard let data = photo.fileDataRepresentation() else { return }
        let image = UIImage(data: data)!
        showImage(cropped: image)
    }

    func showImage(cropped: UIImage) {
        let vc = self.storyboard?.instantiateViewController(withIdentifier: "ImagePreviewViewController") as? ImagePreviewViewController
        vc?.captured = cropped
        self.present(vc!, animated: true)
    }
}
```
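Editor's note: one variable the code above never inspects is camera authorization; a denied or restricted status is a plausible (unconfirmed) cause of a black preview on a subset of devices. A minimal sketch of that pre-flight check, not part of the original post's code:

```
import AVFoundation

// Hypothetical pre-flight check; if access is denied or restricted,
// the preview layer will render black.
func checkCameraAccess(_ completion: @escaping (Bool) -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        completion(true)
    case .notDetermined:
        AVCaptureDevice.requestAccess(for: .video, completionHandler: completion)
    default:
        // .denied or .restricted
        completion(false)
    }
}
```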
1
0
244
Jul ’25
Telephoto Lens Keeps Switching to Other Lenses on iPhone 16 Pro Max During PPG (Finger on Camera)
Hi, I’m building a PPG-based heart rate feature where the user places their finger over the rear telephoto camera. On iPhone 16 Pro Max, I'm explicitly selecting the telephoto lens like this:

```
videoDevice = AVCaptureDevice.default(.builtInTelephotoCamera, for: .video, position: .back)
```

And trying to lock it:

```
if #available(iOS 15.0, *), device.activePrimaryConstituentDeviceSwitchingBehavior != .unsupported {
    try? device.lockForConfiguration()
    device.setPrimaryConstituentDeviceSwitchingBehavior(.locked, restrictedSwitchingBehaviorConditions: [])
    device.unlockForConfiguration()
}
```

I also lock everything else to prevent dynamic changes:

```
try device.lockForConfiguration()
device.focusMode = .locked
device.exposureMode = .locked
device.whiteBalanceMode = .locked
device.videoZoomFactor = 1.0
device.automaticallyEnablesLowLightBoostWhenAvailable = false
device.automaticallyAdjustsVideoHDREnabled = false
device.unlockForConfiguration()
```

Despite this, the camera still switches to another lens, especially under different lighting, even though the user’s finger fully covers the lens.

Questions:
How can I completely prevent lens switching in this scenario?
Would using videoZoomFactor = 3.0 or 5.0 better enforce use of the telephoto lens?

Thanks! Gal
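Editor's note: constituent-device switching only applies to virtual devices (such as .builtInTripleCamera); a physical .builtInTelephotoCamera reports .unsupported for it. A small diagnostic sketch (an assumption about where to look, not a confirmed fix) that logs which physical lens is actually delivering frames:

```
import AVFoundation

// On a virtual device, activePrimaryConstituentDevice names the physical
// lens currently in use; on a physical device it is nil.
func logActiveLens(of device: AVCaptureDevice) {
    print("selected device type:", device.deviceType.rawValue)
    if let active = device.activePrimaryConstituentDevice {
        print("active constituent:", active.deviceType.rawValue)
    } else {
        print("no constituent switching (physical device selected)")
    }
}
```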
3
0
206
Jul ’25
Always audio from latest connected external USB mic
Hello! I have two mics connected to a USB hub, and the USB hub is connected to my iPad. Both mics appear in the audio session's list of available inputs. The problem is that regardless of which mic I select in my app (using setPreferredInput() on the audio session), the audio keeps coming from the mic that was most recently connected to the USB hub. Does anyone know if this is a limitation of iPadOS/iOS?
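For reference, a minimal sketch of the selection call the post describes (the port name is a placeholder):

```
import AVFoundation

// Select a specific input port by name from the session's available inputs.
func selectInput(named portName: String) throws {
    let session = AVAudioSession.sharedInstance()
    guard let port = session.availableInputs?.first(where: { $0.portName == portName }) else {
        return
    }
    try session.setPreferredInput(port)
}
```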
1
1
209
Jul ’25
Playing periodic audio in background using AVFoundation - facing audio session startup failure
Hello everyone, I’m new to Swift development and have been working on an audio module that plays a specific sound at regular intervals, similar to a workout timer that signals switching exercises every few minutes. Following the AVFoundation documentation, I’m configuring my audio session like this:

```
let session = AVAudioSession.sharedInstance()
try session.setCategory(
    .playback,
    mode: .default,
    options: [.interruptSpokenAudioAndMixWithOthers, .duckOthers]
)
self.engine.attach(self.player)
self.engine.connect(self.player, to: self.engine.outputNode, format: self.audioFormat)
try? session.setActive(true)
```

When it’s time to play cues, I schedule playback on a DispatchQueue:

```
// scheduleAudio uses DispatchQueue
self.scheduleAudio(at: interval.start) {
    do {
        try audio.engine.start()
        audio.node.play()
        for sample in interval.samples {
            audio.node.scheduleBuffer(sample.buffer, at: AVAudioTime(hostTime: sample.hostTime))
        }
    } catch {
        print("Audio activation failed: \(error)")
    }
}
```

This works perfectly in the foreground. But once the app goes into the background, the scheduled callback runs, yet the audio engine fails to start, resulting in an error with code 561015905. Interestingly, if the app is already playing audio before going to the background, the scheduled sounds continue to play as expected. I have added the required background audio mode to my Info.plist by including the key UIBackgroundModes with the value audio. Is there anything else I should configure? What is the best practice for playing periodic audio while the app runs in the background? How do apps like turn-by-turn navigation handle continuous audio playback in the background? Any advice or pointers would be greatly appreciated!
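As an aside, codes like this decode as four-character codes: 561015905 is "!pla", which corresponds to AVAudioSession.ErrorCode.cannotStartPlaying. A tiny decoder sketch:

```
// Decode an AVAudioSession/OSStatus-style error code into its
// four-character form.
func fourCC(_ code: Int) -> String {
    let bytes = (0..<4).reversed().map { UInt8(truncatingIfNeeded: code >> ($0 * 8)) }
    return String(bytes: bytes, encoding: .ascii) ?? "????"
}

print(fourCC(561015905)) // "!pla" -> AVAudioSession.ErrorCode.cannotStartPlaying
```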
0
0
228
Jul ’25
Can't hear audio via speaker on a real iOS device
This is my native module implementation. I receive a base64-encoded string from the server and pass it to my native PCM player module to play audio.

App.tsx:

```
PcmPlayer.writeChunk(e.data);
```

PcmPlayer.swift:

```
import AVFoundation

@objc(PcmPlayer)
class PcmPlayer: RCTEventEmitter {
    private var engine: AVAudioEngine?
    private var playerNode: AVAudioPlayerNode?
    private var format: AVAudioFormat?
    private var bufferQueue = [Data]()
    private var isPlaying = false
    private var hasEnded = false
    private var scheduledBufferCount = 0
    private let minBufferBytes = 50000
    private let pcmQueue = DispatchQueue(label: "pcm.queue")

    override init() {
        super.init()
    }

    override func supportedEvents() -> [String]! {
        return ["onStatus", "onMessage"]
    }

    @objc(initPlayer:channels:bitsPerSample:)
    func initPlayer(_ sampleRate: NSNumber, channels: NSNumber, bitsPerSample: NSNumber) {
        pcmQueue.async {
            self.stopInternal()
            let session = AVAudioSession.sharedInstance()
            do {
                try session.setCategory(.playback, mode: .default, options: [])
                try session.setActive(true, options: .notifyOthersOnDeactivation)
                try session.setMode(.default)
                print("🔈 Audio session active. Output route:", session.currentRoute.outputs)
            } catch {
                print("❌ Audio session setup failed:", error)
                return
            }
            self.engine = AVAudioEngine()
            self.playerNode = AVAudioPlayerNode()
            guard let engine = self.engine, let playerNode = self.playerNode else {
                print("❌ Engine or playerNode is nil")
                return
            }
            engine.attach(playerNode)
            self.format = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                        sampleRate: sampleRate.doubleValue,
                                        channels: AVAudioChannelCount(channels.uintValue),
                                        interleaved: false)
            guard let format = self.format else {
                print("❌ Failed to create AVAudioFormat")
                return
            }
            engine.connect(playerNode, to: engine.mainMixerNode, format: format)
            do {
                try engine.start()
                playerNode.play()
                engine.mainMixerNode.outputVolume = 1.0
                print("✅ AVAudioEngine started with format:", format)
            } catch {
                print("❌ Engine start failed:", error)
            }
            self.hasEnded = false
        }
    }

    @objc(writeChunk:)
    func writeChunk(_ base64Pcm: String) {
        pcmQueue.async {
            guard base64Pcm.count >= 10 else {
                print("⚠️ Skipping short base64 string")
                return
            }
            var padded = base64Pcm
            let mod4 = base64Pcm.count % 4
            if mod4 > 0 {
                padded += String(repeating: "=", count: 4 - mod4)
            }
            guard let data = Data(base64Encoded: padded, options: .ignoreUnknownCharacters) else {
                print("❌ Failed to decode base64")
                return
            }
            self.bufferQueue.append(data)
            print("📥 Received PCM chunk (\(data.count) bytes)")
            print("📥 writeChunk called. isPlaying=\(self.isPlaying), bufferQueue.count=\(self.bufferQueue.count)")
            if !self.isPlaying {
                self.isPlaying = true
                self.waitForBufferAndStartPlayback()
            } else if self.scheduledBufferCount == 0 {
                self.isPlaying = true
                self.waitForBufferAndStartPlayback()
            }
        }
    }

    private func waitForBufferAndStartPlayback() {
        DispatchQueue.global().async {
            while self.queueSize() < self.minBufferBytes && !self.hasEnded {
                Thread.sleep(forTimeInterval: 0.01)
            }
            self.writeLoop()
        }
    }

    private func writeLoop() {
        DispatchQueue.global().async {
            writeLoop: while self.isPlaying {
                if self.bufferQueue.isEmpty {
                    for _ in 0..<100 {
                        Thread.sleep(forTimeInterval: 0.01)
                        if !self.bufferQueue.isEmpty { break }
                    }
                    if self.bufferQueue.isEmpty {
                        print("🔇 No more data to play after waiting")
                        self.isPlaying = false
                        break writeLoop
                    }
                }
                var data: Data?
                self.pcmQueue.sync {
                    if !self.bufferQueue.isEmpty {
                        data = self.bufferQueue.removeFirst()
                    }
                }
                guard let chunk = data else {
                    print("⚠️ No data to process")
                    continue
                }
                if let buffer = self.pcmBufferFromData(chunk) {
                    self.scheduledBufferCount += 1
                    self.playerNode?.scheduleBuffer(buffer, completionHandler: {
                        self.pcmQueue.async {
                            self.scheduledBufferCount -= 1
                            if self.bufferQueue.isEmpty && self.scheduledBufferCount == 0 {
                                print("ℹ️ Playback idle - waiting for more data")
                                self.isPlaying = false
                            }
                        }
                    })
                }
            }
        }
    }

    private func pcmBufferFromData(_ data: Data) -> AVAudioPCMBuffer? {
        guard let format = self.format else { return nil }
        let frameCount = UInt32(data.count / 2)
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount) else {
            print("❌ Failed to create AVAudioPCMBuffer")
            return nil
        }
        buffer.frameLength = frameCount
        guard let floatChannelData = buffer.floatChannelData?[0] else {
            print("❌ floatChannelData is nil")
            return nil
        }
        data.withUnsafeBytes { (rawBuffer: UnsafeRawBufferPointer) in
            let int16Buffer = rawBuffer.bindMemory(to: Int16.self)
            let count = min(int16Buffer.count, Int(frameCount))
            for i in 0..<count {
                floatChannelData[i] = Float32(int16Buffer[i]) / Float32(Int16.max)
            }
        }
        return buffer
    }

    @objc(stopPlayer)
    func stopPlayer() {
        pcmQueue.async {
            self.stopInternal()
        }
    }

    private func stopInternal() {
        print("🛑 stopInternal called")
        self.playerNode?.stop()
        self.engine?.stop()
        self.engine?.reset()
        self.playerNode = nil
        self.engine = nil
        self.format = nil
        self.bufferQueue.removeAll()
        self.isPlaying = false
        self.hasEnded = true
        self.scheduledBufferCount = 0
    }

    @objc(canWrite:rejecter:)
    func canWrite(_ resolve: @escaping RCTPromiseResolveBlock, rejecter reject: RCTPromiseRejectBlock) {
        pcmQueue.async {
            resolve(self.bufferQueue.count < 20)
        }
    }

    @objc(flushPlayer:rejecter:)
    func flushPlayer(_ resolve: @escaping RCTPromiseResolveBlock, rejecter reject: RCTPromiseRejectBlock) {
        pcmQueue.async {
            self.bufferQueue.removeAll()
            resolve(nil)
        }
    }

    @objc static override func requiresMainQueueSetup() -> Bool {
        return false
    }

    private func queueSize() -> Int {
        return pcmQueue.sync {
            return self.bufferQueue.reduce(0) { $0 + $1.count }
        }
    }
}
```

I can't hear any audio on my real iOS device, although it works fine in the Simulator.
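Editor's note on pcmBufferFromData (an observation, not a confirmed root cause): frameCount = data.count / 2 assumes mono 16-bit input. For multi-channel PCM, the frame count should also be divided by the channel count, and the code above only fills channel 0. A sketch of the frame-count arithmetic:

```
import AVFoundation

// frames = bytes / (bytesPerSample * channels) for interleaved Int16 PCM.
func frameCount(bytes: Int, channels: Int) -> AVAudioFrameCount {
    AVAudioFrameCount(bytes / (2 * channels))
}
```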
0
0
209
Jul ’25
Making DataScannerViewController work in the Simulator
Before you post "Camera doesn't work on the Simulator": that's no longer true. I've made a solution that makes the Simulator believe there's an actual hardware device connected, allowing users to stream the macOS camera to the iOS Simulator (for more info, see RocketSim's documentation: https://docs.rocketsim.app/features/hzQMSrSga7BGWvxdNVdwYs/simulator-camera-support/58tQ5jvevLNSnyUEA7VgAv). It works for VNDocumentCameraViewController, but when I try opening DataScannerViewController, I run straight into: Failed to start scanning: The operation couldn’t be completed. (VisionKit.DataScannerViewController.ScanningUnavailable error 0.) My question: How does this view controller determine whether scanning is available? Is there a certain capability the available AVCaptureDevices need to support, maybe? Any direction would be helpful for me to make this work for developers, helping them build apps faster!
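For anyone digging into this, the public availability gates are the two static properties below; logging both may show which one the Simulator fails (a starting point, not a full answer):

```
import VisionKit

// isSupported reflects device capability; isAvailable additionally reflects
// runtime state such as camera authorization.
@MainActor
func logScannerAvailability() {
    print("supported:", DataScannerViewController.isSupported)
    print("available:", DataScannerViewController.isAvailable)
}
```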
0
1
364
Jul ’25
Linking to iTunesLibrary requires access every launch?
Hello, I have a command line application that uses iTunesLibrary to "save" the state of what I have listened to. I have it run every night via a LaunchAgent. You can see the source here: https://github.com/bolsinga/itunes_json

Prior to Sequoia it would run nightly. I'd just have to grant it access to the Music library once, and it would be fine thereafter. However, with Sequoia it requires UI interaction to grant it access every time. This means it no longer runs unattended overnight, defeating its purpose. I have the console logs of when this happens. You can see them in my issue tracking it here: https://github.com/bolsinga/itunes_json/issues/410

One thing that makes me wonder is that it is a command line application, not a bundle. How do I make a command line application get access to MusicKit / iTunesLibrary, and keep it thereafter? I'd like to get my pre-Sequoia behavior back. I've filed FB15592660 too. I've granted it access to run in the background, as well as access to my Music library (please see attached screenshots).

```
AMPLibraryAgent 10:48:29.489944-0700 xpc Connection from framework client invalidated pid:57606 clientname:iTunesLibrary(itunes_json)
AMPLibraryAgent 10:48:29.492763-0700 service Unloading domains(14) for ClientID:iTunesLibrary(itunes_json)-1229 previous open:15 new open:1
itunes_json 10:48:59.980864-0700 connection [0x157f05800] activating connection: mach=true listener=false peer=false name=com.apple.amp.library.framework
tccd 10:48:59.982568-0700 access AUTHREQ_ATTRIBUTION: msgID=1795.214, attribution={accessing={TCCDProcess: identifier=itunes_json, pid=57652, auid=501, euid=501, binary_path=/Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json}, requesting={TCCDProcess: identifier=com.apple.AMPLibraryAgent, pid=1795, auid=501, euid=501, binary_path=/System/Library/PrivateFrameworks/AMPLibrary.framework/Versions/A/Support/AMPLibraryAgent}, },
tccd 10:48:59.982651-0700 access requestor: TCCDProcess: identifier=com.apple.AMPLibraryAgent, pid=1795, auid=501, euid=501, binary_path=/System/Library/PrivateFrameworks/AMPLibrary.framework/Versions/A/Support/AMPLibraryAgent is checking access for accessor TCCDProcess: identifier=itunes_json, pid=57652, auid=501, euid=501, binary_path=/Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json
tccd 10:48:59.995636-0700 access AUTHREQ_SUBJECT: msgID=1795.214, subject=/Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json,
tccd 10:48:59.996283-0700 access -[TCCDAccessIdentity staticCode]: static code for: identifier /Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json, type: 1: 0xc00341b00 at /Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json
tccd 10:49:00.018205-0700 access Failed to match existing code requirement for subject /Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json and service kTCCServiceMediaLibrary cdhash H"6bc380972f4df49b337a2a05308fb7b98fbe6473" or cdhash H"0708bcaabbfbab8770522050f7e2642d4d864f31"
tccd 10:49:00.018997-0700 access AUTHREQ_PROMPTING: msgID=1795.214, service=kTCCServiceMediaLibrary, subject=Sub:{/Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json}Resp:{TCCDProcess: identifier=itunes_json, pid=57652, auid=501, euid=501, binary_path=/Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json},
AMPLibraryAgent 10:49:02.489170-0700 xpc ampld> register framework ClientName:iTunesLibrary(itunes_json)
tccd 10:49:02.488189-0700 events Publishing <TCCDEvent: type=Create, service=kTCCServiceMediaLibrary, identifier_type=Path, identifier=/Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json> to 4 subscribers: { 633 = "<TCCDEventSubscriber: token=633, state=Initial, csid=(null)>"; 628 = "<TCCDEventSubscriber: token=628, state=Passed, csid=com.apple.chronod>"; 464 = "<TCCDEventSubscriber: token=464, state=Passed, csid=com.apple.cloudd>"; 513 = "<TCCDEventSubscriber: token=513, state=Passed, csid=com.apple.photolibraryd>"; }
AMPLibraryAgent 10:49:02.490391-0700 xpc ampld> registered framework ClientName:iTunesLibrary(itunes_json) with clientID:1230
itunes_json 10:49:02.792084-0700 connection [0x147e04340] activating connection: mach=true listener=false peer=false name=com.apple.amp.artworkd
itunes_json 10:49:02.801482-0700 <Missing Description> openDatabase 0xe4af30f4493e5ef5 artwork folder Y '<private>'
itunes_json 10:49:02.805087-0700 <Missing Description> openDatabase 0xf2db6e8d7672edc9 artwork folder Y '<private>'
itunes_json 10:49:02.806736-0700 <Missing Description> openDatabase 0xfb2acd898c951851 artwork folder Y '<private>'
itunes_json 10:49:02.813286-0700 <Missing Description> openDatabase 0xf0f4919c5ff0e88 artwork folder Y '<private>'
itunes_json 10:49:09.634928-0700 connection [0x600002b6a0d0] activating connection: mach=true listener=false peer=false name=com.apple.cfprefsd.daemon
itunes_json 10:49:09.635019-0700 connection [0x600002b78000] activating connection: mach=true listener=false peer=false name=com.apple.cfprefsd.agent
AMPLibraryAgent 10:49:12.382878-0700 xpc Connection from framework client invalidated pid:57652 clientname:iTunesLibrary(itunes_json)
AMPLibraryAgent 10:49:12.383474-0700 service Unloading domains(14) for ClientID:iTunesLibrary(itunes_json)-1230 previous open:15 new open:1
```

Attachment: itunes_json.log
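For context, the kTCCServiceMediaLibrary prompt in the logs is triggered by the standard iTunesLibrary entry point, roughly:

```
import iTunesLibrary

// Opening the library is what fires the TCC media-library check.
func dumpLibraryCount() {
    do {
        let library = try ITLibrary(apiVersion: "1.0")
        print("media items:", library.allMediaItems.count)
    } catch {
        print("ITLibrary init failed:", error)
    }
}
```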
7
0
1.1k
Jul ’25
ScaleTimeRange will cause noise in sound
I'm using AVFoundation to build a multi-track editor app, which can insert multiple tracks and clips, including scaling clips to change their playback speed (I'm also not sure whether AVFoundation is the best choice for me). After scaling with the scaleTimeRange API, there are short bursts of noise during playback. Also, sometimes playback of the AVMutableComposition through AVPlayer with an AVPlayerItem is fine, but after exporting with AVAssetReader, the resulting file contains short bursts of noise. Not sure why. Here is an example project, which can be built and run directly: https://github.com/luckysmg/daily_images/raw/refs/heads/main/TestDemo.zip
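For context, a minimal sketch of the scaleTimeRange call in question, stretching the first second of a composition to two seconds (half speed); the time values are illustrative:

```
import AVFoundation

// Retime a range of the composition; audio inside the range is resampled,
// which is where the reported noise shows up.
func slowDownFirstSecond(of composition: AVMutableComposition) {
    let range = CMTimeRange(start: .zero, duration: CMTime(value: 1, timescale: 1))
    composition.scaleTimeRange(range, toDuration: CMTime(value: 2, timescale: 1))
}
```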
0
0
138
Jul ’25
FCPX Workflow Extension: Drag & Drop FCPXML with Titles to Timeline Not Working Despite Following File Promise Guidelines
I'm developing a Final Cut Pro X workflow extension that transcribes audio and creates a text output. I need to allow users to drag this text directly from my extension into FCPX's timeline as titles.

Current Implementation:
Using NSFilePromiseProvider as per Apple's guidelines for drag and drop.
Generating valid FCPXML (v1.10) with proper structure: complete resources section with format and asset references; event and project hierarchy; asset clip with connected title elements; proper timing and duration calculations.
Supporting multiple pasteboard types: com.apple.finalcutpro.xml.v1-10, com.apple.finalcutpro.xml.v1-9, com.apple.finalcutpro.xml.

What's Working:
Drag operation initiates correctly.
File promise provider is set up properly.
FCPXML generation is successful (verified content).
All required pasteboard types are registered.
Proper logging confirms data is being requested and provided.

Current Pasteboard Types Offered:
com.apple.NSFilePromiseItemMetaData
com.apple.pasteboard.promised-file-name
com.apple.pasteboard.promised-suggested-file-name
com.apple.pasteboard.promised-file-content-type
Apple files promise pasteboard type
com.apple.pasteboard.NSFilePromiseID
com.apple.pasteboard.promised-file-url
com.apple.finalcutpro.xml.v1-10
com.apple.finalcutpro.xml.v1-9
com.apple.finalcutpro.xml

What additional requirements or considerations are needed to make FCPX accept the dragged FCPXML content? Are there specific requirements for workflow extensions regarding drag and drop operations with titles that aren't documented? Any insights, especially from those who have implemented similar functionality in FCPX workflow extensions, would be greatly appreciated.

Technical Details:
macOS Version: 15.5 (24F74)
FCPX Version: 11.1.1
Extension built with SwiftUI and AppKit integration
Using NSFilePromiseProvider and NSPasteboardItemDataProvider
Full pasteboard type support for FCPXML versions
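For reference, a minimal sketch of offering FCPXML under one of the listed custom type identifiers (fcpxml is a placeholder string; this mirrors the setup the post describes, not a known fix for the timeline rejection):

```
import AppKit

// Put an FCPXML payload on the drag pasteboard under FCPX's custom type.
func pasteboardItem(for fcpxml: String) -> NSPasteboardItem {
    let item = NSPasteboardItem()
    let fcpxType = NSPasteboard.PasteboardType("com.apple.finalcutpro.xml.v1-10")
    item.setString(fcpxml, forType: fcpxType)
    return item
}
```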
0
0
206
Jul ’25
Live HLS Stream Not Playing After Window Resize in Vision Pro
Hi everyone, I'm developing a visionOS app for Apple Vision Pro, and I've encountered an issue related to window resizing at runtime when using AVPlayer to play a live HLS stream.

✅ What I'm Trying to Do
Play a live HLS stream (from Wowza) inside my app using AVPlayer.
Support resizing the immersive window using Vision Pro’s built-in runtime scaling gesture.
The stream works fine at the default window size when the app launches.

❌ Problem
If I resize the app’s window at runtime (using the Vision Pro pinch-drag gesture), then try to start the stream, it does not play. Instead, it just shows the "Loading live stream..." state and never proceeds to playback. This issue only occurs after resizing the window; if I don’t resize, the stream works perfectly every time.

🧪 What I’ve Tried
Verified the HLS URL; it’s working and plays fine in Safari and in the app before resizing.
Set .automaticallyWaitsToMinimizeStalling = false on AVPlayer.
Observed that .status on AVPlayerItem never reaches .readyToPlay after resizing.
Tried to force the window size back using UIWindowScene.requestGeometryUpdate(...), but the behavior persists.
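For anyone reproducing this, a small observation sketch (the Combine-based logging is illustrative, not from the original post) that surfaces whether the item actually fails or merely never becomes ready:

```
import AVFoundation
import Combine

var cancellables = Set<AnyCancellable>()

// Log status transitions so a silent failure isn't mistaken for buffering.
func observeReadiness(of item: AVPlayerItem) {
    item.publisher(for: \.status)
        .sink { status in
            switch status {
            case .readyToPlay:
                print("ready to play")
            case .failed:
                print("failed:", item.error?.localizedDescription ?? "unknown error")
            default:
                print("status still unknown")
            }
        }
        .store(in: &cancellables)
}
```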
2
0
290
Jul ’25
After playing an HDR video on iPhone for a while, the HDR effect disappears and the screen brightness decrease
When I use AVPlayer to obtain the video frame CVPixelBufferRef of an HDR video, and use AVSampleBufferDisplayLayer to display it on the screen, after a period of time the HDR video content and the screen gradually darken, losing the HDR effect.

Steps to reproduce:
1. Create an AVPlayer to loop an HDR video, specifying the video frame format as kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange.
2. Create a timer to get the video frame CVPixelBufferRef at 30 frames per second.
3. Use AVSampleBufferDisplayLayer to display the CVPixelBufferRef on the screen.
4. Don't operate the phone; wait for a period of time (such as 40 minutes). The HDR effect disappears and the screen darkens.

Note:
You need to use an iPhone device running iOS 18.5 or below.
You need to ensure that the HDR video plays in a loop, that is, that the screen continues to display HDR content. Depending on the device, you need to wait 20-40 minutes.
In the iPhone Photos app, the same problem occurs after playing an HDR video in a loop for a long time.

Expected Results: When rendering HDR content for a long time, the HDR effect is always preserved; the HDR content and screen do not darken.

Current Results: After about 20-40 minutes, the HDR effect disappears and the screen darkens.
4
0
780
Jul ’25
HDR video & screen brightness
When I play an HDR video in the iPhone Photos app, I can see the HDR effect clearly. But if the HDR video plays continuously for more than 30-40 minutes, the HDR effect disappears and the brightness is compressed into the SDR range. This happens on any iPhone; depending on the phone, it may take 20-30 minutes, 30-40 minutes, or even just a few minutes, as on the iPhone 12 mini. Similarly, if I use AVPlayer to play and preview an HDR video for more than 30-40 minutes, the HDR effect disappears and the screen brightness dims; the currentEDRHeadroom also gradually decreases to 1. Note: test with an HDR video longer than 1 hour, or loop a shorter one. My question is how to avoid losing the HDR effect after 30-40 minutes when I use CAMetalLayer to render HDR video.
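For reference, the headroom value mentioned above is exposed on UIScreen; a minimal sketch of sampling it (how often to sample is up to you):

```
import UIKit

// currentEDRHeadroom drifts back toward 1.0 as the system clamps to SDR;
// potentialEDRHeadroom is the display's ceiling under current conditions.
func logHeadroom(of screen: UIScreen) {
    print("potential:", screen.potentialEDRHeadroom,
          "current:", screen.currentEDRHeadroom)
}
```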
1
0
204
Jul ’25
AVKit - PiP with AVSampleBufferDisplayLayer Error
```
AVPictureInPictureControllerContentSource *contentSource =
    [[AVPictureInPictureControllerContentSource alloc]
        initWithSampleBufferDisplayLayer:self.renderView.sampleBufferDisplayLayer
                        playbackDelegate:self];
AVPictureInPictureController *pictureInPictureController =
    [[AVPictureInPictureController alloc] initWithContentSource:contentSource];
pictureInPictureController.delegate = self;

- (void)pictureInPictureController:(AVPictureInPictureController *)pictureInPictureController
    failedToStartPictureInPictureWithError:(NSError *)error {
    // error: NSError, domain: @"PGPegasusErrorDomain", code: -1003, 0x00000002819fe3a0
}
```

When I start PiP playback for the first time, I get the error NSError domain @"PGPegasusErrorDomain", code -1003. Why? The second start works fine.
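Editor's note: one thing worth ruling out before the first start (an assumption, not a confirmed cause) is calling start before the controller reports PiP as possible. A minimal Swift sketch of the standard pre-flight checks:

```
import AVKit

// Only attempt PiP when the device supports it, and defer
// startPictureInPicture() until isPictureInPicturePossible is true.
func startIfPossible(_ controller: AVPictureInPictureController) {
    guard AVPictureInPictureController.isPictureInPictureSupported() else { return }
    if controller.isPictureInPicturePossible {
        controller.startPictureInPicture()
    }
}
```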
0
0
189
Jul ’25
How can third-party iOS apps obtain real-time waveform / spectrogram data for Apple Music tracks (similar to djay & other DJ apps)?
Hi everyone, I’m working on an iOS MusicKit app that overlays a metronome on top of Apple Music playback, using ApplicationMusicPlayer. To line the clicks up perfectly I’d like access to low-level audio analysis data—ideally a waveform / spectrogram or beat grid—while the track is playing. I’ve noticed that several approved DJ apps (e.g. djay, Serato, rekordbox) can already: • Display detailed scrolling waveforms of Apple Music songs • Scratch, loop or time-stretch those tracks in real time That implies they receive decoded PCM frames or at least high-resolution analysis data from Apple Music under a special entitlement. My questions: Does MusicKit (or any public framework) expose real-time audio buffers, FFT bins, or beat markers for streaming Apple Music content? If not, is there an Apple program or entitlement that developers can apply for—similar to the “DJ with Apple Music” initiative—to gain that deeper access? Where can I find official documentation or a point of contact for this kind of request? I’ve searched the docs and forums but only see standard MusicKit playback APIs, which don’t appear to expose raw audio for DRM-protected songs. Any guidance, links or insider tips on the proper application process would be hugely appreciated! Thanks in advance.
1
3
326
Jul ’25
Apple Music / MusicKit and Simulator
It's been an ask for a few years, and I'm wondering if there are any plans, or whether the '26 SDKs/tools allow Apple Music to work in the Simulator. I develop for the Vision Pro, so the usual 'fix' of running on the device is a bit of a hard ask. At the very least, a small sample library that works in the Simulator would be welcome (similar to how Photos works). Cheers
1
1
199
Jul ’25
Final Cut Pro Workflow Extension Drag and Drop to Timeline Not Working Despite Proper FCPXML Clip Structure
Our Final Cut Pro workflow extension, built with the ProExtensionHost framework, uses an advanced NSPasteboardItemDataProvider system with multi-version FCPXML support (1.9, 1.10, 1.13) and proper relative-path UIDs for Motion templates. We've implemented a clip-wrapper approach with placeholder assets and elements containing effects to enable direct timeline drag functionality. However, drag and drop from our workflow extension directly to the timeline is still not working despite a proper element structure in our FCPXML. Our implementation creates valid clip elements with effects applied, but the Final Cut Pro timeline doesn't accept them during drag operations.

Steps to Reproduce:
1. Create a Final Cut Pro workflow extension using the ProExtensionHost framework with an NSPasteboardItemDataProvider implementation.
2. Generate FCPXML with the proper element structure.

Expected Result: The clip should be accepted by the timeline and the effect applied from the workflow extension.
Actual Result: The timeline rejects the drag operation from the ProExtensionHost-based workflow extension.

Question: Are there additional requirements or ProExtensionHost API calls needed, beyond standard NSPasteboardItemDataProvider, for timeline drag functionality from a Final Cut Pro workflow extension?
0
0
196
Jul ’25
Apple Music iOS 26 features in Android
Since many users like me use Apple Music on Android, and the app is almost as feature-rich as on iOS, it would be fantastic if the developers could add the new iOS 26 features to the Android app, along with a minor UI change. I know it's challenging to implement Liquid Glass on Android hardware or design, but features like auto-mix, pronunciation, and translation could be added. Kindly consider this request!
1
0
225
Jul ’25
IPadOS 17 external camera exposure
I'm developing an iPad app that will be mostly dedicated to a certain external camera for visually impaired people. The Linux UVC API (e.g., using guvcview) allows enabling automatic exposure for the camera. The iOS API isExposureModeSupported unfortunately returns false for every exposure mode. Is it a bug? Or perhaps AVFoundation doesn't support UVC exposure yet?
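For reference, a minimal sketch of the capability probe the post describes, run against an external AVCaptureDevice (device discovery is assumed to be done elsewhere):

```
import AVFoundation

// Probe which exposure modes a capture device claims to support.
func logExposureSupport(for device: AVCaptureDevice) {
    let modes: [(String, AVCaptureDevice.ExposureMode)] = [
        ("locked", .locked),
        ("autoExpose", .autoExpose),
        ("continuousAutoExposure", .continuousAutoExposure),
        ("custom", .custom),
    ]
    for (name, mode) in modes {
        print(name, device.isExposureModeSupported(mode))
    }
}
```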
1
2
690
Jul ’25
tvOS 26 beta 2 does not support Dolby Atmos on Apple TV 4K (3rd generation)
On an Apple TV 4K (3rd generation) running tvOS 26 beta 2, with two HomePod 2 speakers paired to the device, music and movie sources with Dolby Atmos can only be listened to in stereo; Dolby Atmos is reported as not supported.
0
0
190
Jul ’25
Infrequent sound inconsistency with Apple Music
Since macOS 26, Apple Music has had intermittent drops in playback quality on some tracks, seemingly at random. I don't know if others have experienced it. It doesn't happen on the built-in speakers or when connected via Bluetooth, but it occurs quite often on the AUX I/O. It is more noticeable on headphones with 48 kHz and higher frequency bandwidth. Here is the feedback report: FB18062589
0
0
184
Jul ’25
FCPX Workflow Extension: Drag & Drop FCPXML with Titles to Timeline Not Working Despite Following File Promise Guidelines
I'm developing a Final Cut Pro X workflow extension that transcribes audio and creates a text output. I need to allow users to drag this text directly from my extension into FCPX's timeline as titles.

Current Implementation:
- Using NSFilePromiseProvider as per Apple's guidelines for drag and drop
- Generating valid FCPXML (v1.10) with proper structure:
  - Complete resources section with format and asset references
  - Event and project hierarchy
  - Asset clip with connected title elements
  - Proper timing and duration calculations
- Supporting multiple pasteboard types:
  - com.apple.finalcutpro.xml.v1-10
  - com.apple.finalcutpro.xml.v1-9
  - com.apple.finalcutpro.xml

What's Working:
- Drag operation initiates correctly
- File promise provider is set up properly
- FCPXML generation is successful (verified content)
- All required pasteboard types are registered
- Proper logging confirms data is being requested and provided

Current Pasteboard Types Offered:
- com.apple.NSFilePromiseItemMetaData
- com.apple.pasteboard.promised-file-name
- com.apple.pasteboard.promised-suggested-file-name
- com.apple.pasteboard.promised-file-content-type
- Apple files promise pasteboard type
- com.apple.pasteboard.NSFilePromiseID
- com.apple.pasteboard.promised-file-url
- com.apple.finalcutpro.xml.v1-10
- com.apple.finalcutpro.xml.v1-9
- com.apple.finalcutpro.xml

Questions:
- What additional requirements or considerations are needed to make FCPX accept the dragged FCPXML content?
- Are there specific requirements for workflow extensions regarding drag and drop operations with titles that aren't documented?

Any insights, especially from those who have implemented similar functionality in FCPX workflow extensions, would be greatly appreciated.

Technical Details:
- macOS Version: 15.5 (24F74)
- FCPX Version: 11.1.1
- Extension built with SwiftUI and AppKit integration
- Using NSFilePromiseProvider and NSPasteboardItemDataProvider
- Full pasteboard type support for FCPXML versions
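For reference, a minimal sketch of a file-promise source for FCPXML; the file name, the com.apple.finalcutpro.xml file type, and the generateFCPXML() helper are assumptions standing in for the extension's real implementation:

import AppKit

final class FCPXMLPromiseSource: NSObject, NSFilePromiseProviderDelegate {
    let queue = OperationQueue()

    /// Hypothetical generator standing in for the extension's real title output.
    func generateFCPXML() -> String {
        "<?xml version=\"1.0\" encoding=\"UTF-8\"?><fcpxml version=\"1.10\"></fcpxml>"
    }

    func makeProvider() -> NSFilePromiseProvider {
        // Promise an .fcpxml file; the drop target materializes it on drop.
        NSFilePromiseProvider(fileType: "com.apple.finalcutpro.xml", delegate: self)
    }

    // MARK: NSFilePromiseProviderDelegate

    func filePromiseProvider(_ filePromiseProvider: NSFilePromiseProvider,
                             fileNameForType fileType: String) -> String {
        "Titles.fcpxml"
    }

    func filePromiseProvider(_ filePromiseProvider: NSFilePromiseProvider,
                             writePromisedFileTo url: URL,
                             completionHandler: @escaping (Error?) -> Void) {
        do {
            try generateFCPXML().write(to: url, atomically: true, encoding: .utf8)
            completionHandler(nil)
        } catch {
            completionHandler(error)
        }
    }

    func operationQueue(for filePromiseProvider: NSFilePromiseProvider) -> OperationQueue {
        queue
    }
}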
Replies
0
Boosts
0
Views
206
Activity
Jul ’25
Live HLS Stream Not Playing After Window Resize in Vision Pro
Hi everyone, I'm developing a visionOS app for Apple Vision Pro, and I've encountered an issue related to window resizing at runtime when using AVPlayer to play a live HLS stream.

✅ What I'm Trying to Do
- Play a live HLS stream (from Wowza) inside my app using AVPlayer.
- Support resizing the immersive window using Vision Pro's built-in runtime scaling gesture.
- The stream works fine at the default window size when the app launches.

❌ Problem
If I resize the app's window at runtime (using the Vision Pro pinch-drag gesture) and then try to start the stream, it does not play. Instead, it just shows the "Loading live stream..." state and never proceeds to playback. This issue only occurs after resizing the window; if I don't resize, the stream works perfectly every time.

🧪 What I've Tried
- Verified the HLS URL; it's working and plays fine in Safari and in the app before resizing.
- Set .automaticallyWaitsToMinimizeStalling = false on AVPlayer.
- Observed that .status on AVPlayerItem never reaches .readyToPlay after resizing.
- Tried to force the window size back using UIWindowScene.requestGeometryUpdate(...), but the behavior persists.
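For reference, a minimal sketch of the player setup and status observation described above, assuming a placeholder stream URL; KVO via Combine is just one of several ways to watch for .readyToPlay:

import AVFoundation
import Combine

final class LiveStreamController {
    let player: AVPlayer
    private var cancellables = Set<AnyCancellable>()

    init(url: URL) {
        let item = AVPlayerItem(url: url)
        player = AVPlayer(playerItem: item)
        // Matches the setting mentioned in the post.
        player.automaticallyWaitsToMinimizeStalling = false

        // Watch for the item becoming playable; the post reports this
        // never firing after the window has been resized.
        item.publisher(for: \.status)
            .sink { [weak self] status in
                switch status {
                case .readyToPlay:
                    self?.player.play()
                case .failed:
                    print("Item failed: \(String(describing: item.error))")
                default:
                    break
                }
            }
            .store(in: &cancellables)
    }
}

// Usage, with a hypothetical Wowza endpoint:
// let controller = LiveStreamController(url: URL(string: "https://example.com/live/stream.m3u8")!)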
Replies
2
Boosts
0
Views
290
Activity
Jul ’25
After playing an HDR video on iPhone for a while, the HDR effect disappears and the screen brightness decreases
When I use AVPlayer to obtain the video frames (CVPixelBufferRef) of an HDR video, and use AVSampleBufferDisplayLayer to display them on the screen, after a period of time the HDR video content and the screen gradually darken, losing the HDR effect.

Steps to reproduce:
1. Create an AVPlayer to loop an HDR video, specifying the video frame format as kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange
2. Create a timer to get the video frame CVPixelBufferRef at 30 frames per second
3. Use AVSampleBufferDisplayLayer to display the CVPixelBufferRef on the screen
4. Don't operate the phone; wait for a period of time (such as 40 minutes). The HDR effect disappears and the screen darkens

Note:
- You need an iPhone device running iOS 18.5 or below
- You need to ensure the HDR video plays in a loop, so that the screen continuously displays HDR content; depending on the device, you may need to wait 20-40 minutes
- In the iPhone Photos app, the same problem occurs after looping an HDR video for a long time

Expected results: When rendering HDR content for a long time, the HDR effect is always preserved, and neither the HDR content nor the screen darkens.

Actual results: After about 20-40 minutes, the HDR effect disappears and the screen darkens.
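A condensed sketch of the pipeline in the reproduction steps above, assuming a placeholder video URL; the CMSampleBuffer wrapping uses real Core Media calls but omits most error handling:

import AVFoundation
import CoreMedia

final class HDRFramePump {
    let player: AVPlayer
    let output: AVPlayerItemVideoOutput
    let displayLayer = AVSampleBufferDisplayLayer()
    private var timer: Timer?

    init(url: URL) {
        // Step 1: request 10-bit frames.
        let attrs: [String: Any] = [
            kCVPixelBufferPixelFormatTypeKey as String:
                kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange
        ]
        output = AVPlayerItemVideoOutput(pixelBufferAttributes: attrs)
        let item = AVPlayerItem(url: url)
        item.add(output)
        player = AVPlayer(playerItem: item)
    }

    func start() {
        player.play()
        // Step 2: poll for new frames at 30 fps.
        timer = Timer.scheduledTimer(withTimeInterval: 1.0 / 30.0, repeats: true) { [weak self] _ in
            guard let self else { return }
            let itemTime = self.output.itemTime(forHostTime: CACurrentMediaTime())
            guard self.output.hasNewPixelBuffer(forItemTime: itemTime),
                  let pixelBuffer = self.output.copyPixelBuffer(forItemTime: itemTime,
                                                                itemTimeForDisplay: nil)
            else { return }
            // Step 3: wrap the pixel buffer and enqueue it on the display layer.
            var formatDesc: CMVideoFormatDescription?
            CMVideoFormatDescriptionCreateForImageBuffer(allocator: nil,
                                                         imageBuffer: pixelBuffer,
                                                         formatDescriptionOut: &formatDesc)
            guard let formatDesc else { return }
            var timing = CMSampleTimingInfo(duration: .invalid,
                                            presentationTimeStamp: itemTime,
                                            decodeTimeStamp: .invalid)
            var sampleBuffer: CMSampleBuffer?
            CMSampleBufferCreateReadyWithImageBuffer(allocator: nil,
                                                     imageBuffer: pixelBuffer,
                                                     formatDescription: formatDesc,
                                                     sampleTiming: &timing,
                                                     sampleBufferOut: &sampleBuffer)
            if let sampleBuffer { self.displayLayer.enqueue(sampleBuffer) }
        }
    }
}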
Replies
4
Boosts
0
Views
780
Activity
Jul ’25
HDR video & screen brightness
When I play an HDR video in the iPhone Photos app, I can see the HDR effect clearly. But if the HDR video plays continuously for more than 30-40 minutes, the HDR effect disappears and the brightness is compressed to the SDR range. This issue appears on any iPhone; depending on the phone, it may take 20-30 minutes, 30-40 minutes, or even a few minutes, as on iPhone 12 mini. Similarly, if I use AVPlayer to play and preview an HDR video for more than 30-40 minutes, the HDR effect disappears and the screen brightness dims. Also, currentEDRHeadroom gradually decreases to 1. Note: test with an HDR video longer than 1 hour, and if the video is short, loop it. My question is how to avoid losing the HDR effect after 30-40 minutes when using CAMetalLayer to render HDR video.
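As a way to confirm the decay, the screen's EDR headroom can be polled, and the layer configured for extended range; a small sketch, assuming iOS 16+ APIs (the 5-second interval and the color space choice are arbitrary):

import UIKit
import Metal
import QuartzCore

final class EDRHeadroomMonitor {
    private var timer: Timer?

    func start(on screen: UIScreen) {
        timer = Timer.scheduledTimer(withTimeInterval: 5, repeats: true) { _ in
            // currentEDRHeadroom is the headroom available right now;
            // the post reports it decaying toward 1.0 over 20-40 minutes.
            print("potential:", screen.potentialEDRHeadroom,
                  "current:", screen.currentEDRHeadroom)
        }
    }

    func configure(_ layer: CAMetalLayer) {
        // Typical EDR setup for a CAMetalLayer.
        layer.wantsExtendedDynamicRangeContent = true
        layer.pixelFormat = .rgba16Float
        layer.colorspace = CGColorSpace(name: CGColorSpace.extendedLinearDisplayP3)
    }
}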
Replies
1
Boosts
0
Views
204
Activity
Jul ’25
AVKit - PiP with AVSampleBufferDisplayLayer Error
AVPictureInPictureControllerContentSource *contentSource =
    [[AVPictureInPictureControllerContentSource alloc]
        initWithSampleBufferDisplayLayer:self.renderView.sampleBufferDisplayLayer
                        playbackDelegate:self];
AVPictureInPictureController *pictureInPictureController =
    [[AVPictureInPictureController alloc] initWithContentSource:contentSource];
pictureInPictureController.delegate = self;

- (void)pictureInPictureController:(AVPictureInPictureController *)pictureInPictureController
    failedToStartPictureInPictureWithError:(NSError *)error {
    // error: NSError domain @"PGPegasusErrorDomain", code -1003, 0x00000002819fe3a0
}

When I first start PiP playback, I get the error NSError domain @"PGPegasusErrorDomain", code -1003. Why? The second start is OK.
Replies
0
Boosts
0
Views
189
Activity
Jul ’25
How can third-party iOS apps obtain real-time waveform / spectrogram data for Apple Music tracks (similar to djay & other DJ apps)?
Hi everyone, I’m working on an iOS MusicKit app that overlays a metronome on top of Apple Music playback, using ApplicationMusicPlayer. To line the clicks up perfectly I’d like access to low-level audio analysis data—ideally a waveform / spectrogram or beat grid—while the track is playing. I’ve noticed that several approved DJ apps (e.g. djay, Serato, rekordbox) can already: • Display detailed scrolling waveforms of Apple Music songs • Scratch, loop or time-stretch those tracks in real time That implies they receive decoded PCM frames or at least high-resolution analysis data from Apple Music under a special entitlement. My questions: Does MusicKit (or any public framework) expose real-time audio buffers, FFT bins, or beat markers for streaming Apple Music content? If not, is there an Apple program or entitlement that developers can apply for—similar to the “DJ with Apple Music” initiative—to gain that deeper access? Where can I find official documentation or a point of contact for this kind of request? I’ve searched the docs and forums but only see standard MusicKit playback APIs, which don’t appear to expose raw audio for DRM-protected songs. Any guidance, links or insider tips on the proper application process would be hugely appreciated! Thanks in advance.
Replies
1
Boosts
3
Views
326
Activity
Jul ’25
Apple Music / MusicKit and Simulator
This has been an ask for a few years, and I'm wondering if there are any plans, or whether the '26 SDKs/tools allow Apple Music to work in the Simulator. I develop for the Vision Pro, so the usual 'fix' of running on the device is a bit of a hard ask. At the very least, a small sample library that works in the Simulator would be welcome (similar to how Photos works). Cheers
Replies
1
Boosts
1
Views
199
Activity
Jul ’25
Final Cut Pro Workflow Extension Drag and Drop to Timeline Not Working Despite Proper FCPXML Clip Structure
Our Final Cut Pro workflow extension, built with the ProExtensionHost framework, uses an NSPasteboardItemDataProvider-based system with multi-version FCPXML support (1.9, 1.10, 1.13) and proper relative-path UIDs for Motion templates. We've implemented a clip-wrapper approach with placeholder assets and elements containing effects to enable direct timeline drag functionality. However, drag and drop from our workflow extension directly to the timeline is still not working, despite proper element structure in our FCPXML. Our implementation creates valid clip elements with effects applied, but the Final Cut Pro timeline doesn't accept them during drag operations.

Steps to reproduce:
1. Create a Final Cut Pro workflow extension using the ProExtensionHost framework with an NSPasteboardItemDataProvider implementation
2. Generate FCPXML with proper element structure:

Expected result: The clip should be accepted by the timeline and the effect applied.
Actual result: The timeline rejects the drag operation from the ProExtensionHost-based workflow extension.

Question: Are there additional requirements or ProExtensionHost API calls needed, beyond standard NSPasteboardItemDataProvider, for timeline drag functionality from a Final Cut Pro workflow extension?
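For comparison with the file-promise approach in the earlier post, here is the shape of a lazy NSPasteboardItemDataProvider setup for the FCPX XML types; a minimal sketch, assuming a hypothetical generateFCPXML() helper in place of the extension's real generator:

import AppKit

final class FCPXMLPasteboardSource: NSObject, NSPasteboardItemDataProvider {
    // The multi-version FCPX XML types mentioned in the post.
    static let xmlTypes: [NSPasteboard.PasteboardType] = [
        .init("com.apple.finalcutpro.xml.v1-13"),
        .init("com.apple.finalcutpro.xml.v1-10"),
        .init("com.apple.finalcutpro.xml.v1-9"),
    ]

    /// Hypothetical stand-in for the extension's real FCPXML generator.
    func generateFCPXML(version: String) -> String {
        "<?xml version=\"1.0\" encoding=\"UTF-8\"?><fcpxml version=\"\(version)\"></fcpxml>"
    }

    func makeDragItem() -> NSPasteboardItem {
        let item = NSPasteboardItem()
        // Data is produced lazily, only when the drop target asks for a type.
        item.setDataProvider(self, forTypes: Self.xmlTypes)
        return item
    }

    func pasteboard(_ pasteboard: NSPasteboard?, item: NSPasteboardItem,
                    provideDataForType type: NSPasteboard.PasteboardType) {
        let version = type.rawValue.hasSuffix("v1-13") ? "1.13"
                    : type.rawValue.hasSuffix("v1-10") ? "1.10" : "1.9"
        item.setString(generateFCPXML(version: version), forType: type)
    }
}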
Replies
0
Boosts
0
Views
196
Activity
Jul ’25
Apple Music iOS 26 features on Android
Since many users like me use Apple Music on Android, and the app is almost as feature-rich as on iOS, it would be fantastic if the developers could add the new iOS 26 features to the Android app, along with a minor UI refresh. I know it's challenging to implement Liquid Glass on Android hardware and design, but features like auto-mix, pronunciation, and translation could be added. Kindly consider this request!
Replies
1
Boosts
0
Views
225
Activity
Jul ’25
iPadOS 17 external camera exposure
I'm developing an iPad app that will be mostly dedicated to a certain external camera for visually impaired people. The Linux UVC API (e.g. via guvcview) allows enabling automatic exposure for this camera. The iOS API isExposureModeSupported unfortunately returns false for every exposure mode. Is it a bug? Or does AVFoundation perhaps not support UVC exposure control yet?
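A short sketch of the check described above, assuming an iPadOS 17+ device with an external UVC camera attached:

import AVFoundation

func logExposureSupport() {
    // The .external device type requires iOS/iPadOS 17.
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.external],
        mediaType: .video,
        position: .unspecified)
    for device in discovery.devices {
        let modes: [(String, AVCaptureDevice.ExposureMode)] = [
            ("locked", .locked),
            ("autoExpose", .autoExpose),
            ("continuousAutoExposure", .continuousAutoExposure),
            ("custom", .custom),
        ]
        for (name, mode) in modes {
            // The post reports this returning false for every mode on UVC cameras.
            print(device.localizedName, name, device.isExposureModeSupported(mode))
        }
    }
}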
Replies
1
Boosts
2
Views
690
Activity
Jul ’25