Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

Posts under Graphics & Games topic

Background GPU Access availability
I would love to use Background GPU Access to do some video processing in the background. However, the documentation for BGContinuedProcessingTaskRequest.Resources.gpu clearly states: "Not all devices support background GPU use. For more information, see Performing long-running tasks on iOS and iPadOS." Is there a list of currently released devices that do (or don't) support background GPU use? That would help us understand what part of our user base can use this feature (and what hardware we need to test on as developers). For example, it seems it isn't supported on an M1 iPad Pro with the current iOS 26 beta, and the simulators also don't appear to support the background GPU resource. It would be great to understand which hardware is capable of using this feature!
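Until such a list exists, a runtime check is the practical fallback. Below is a minimal sketch of gating the feature per device; it assumes the iOS 26 BackgroundTasks API shape (in particular that BGTaskScheduler.supportedResources reports GPU availability; verify against the SDK headers), and the task identifier is hypothetical:

    import BackgroundTasks

    // Hypothetical identifier; it must also be declared in Info.plist
    // under BGTaskSchedulerPermittedIdentifiers.
    let taskID = "com.example.videoProcessing"

    func submitGPUTaskIfSupported() {
        // Assumption: supportedResources reflects this device's background GPU capability.
        guard BGTaskScheduler.supportedResources.contains(.gpu) else {
            print("This device does not support background GPU access")
            return
        }
        let request = BGContinuedProcessingTaskRequest(
            identifier: taskID,
            title: "Processing video",
            subtitle: "Export continues in the background"
        )
        request.requiredResources = .gpu
        do {
            try BGTaskScheduler.shared.submit(request)
        } catch {
            print("Submit failed: \(error)")
        }
    }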
Replies: 4 · Boosts: 0 · Views: 1.1k · Activity: Jul ’25
macOS Tahoe Beta 4 disabled __asm keyword for Metal
Hi developers, I maintain a shipped app that uses string concatenation to construct a Metal shader and compile it on-device. Beta 4 seems to have disabled the __asm keyword, resulting in a compilation failure. The error is: v2/GEMMKernel.cpp:229: error: program_source:23:9: error: illegal string literal in 'asm' __asm("air.simdgroup_async_copy_1d.p3i8.p1i8"); The relevant code is available at https://github.com/liuliu/ccv/blob/unstable/lib/nnc/mfa/v2/GEMMHeaders.cpp#L30, although any __asm will trip this. Please give us guidance on whether this is a regression or something that will be enforced in the 26 release. Personally, I would consider this a bug, given it won't impact any already-compiled shaders. Thanks for your patience reading this!
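For anyone trying to reproduce this outside a full app, here is a minimal sketch of the on-device compile path that trips the error; the shader string is illustrative (any __asm declaration should trigger it):

    import Metal

    let source = """
    #include <metal_stdlib>
    using namespace metal;
    // Any __asm declaration trips the Beta 4 front end:
    void copy1d() __asm("air.simdgroup_async_copy_1d.p3i8.p1i8");
    kernel void dummy() {}
    """

    let device = MTLCreateSystemDefaultDevice()!
    do {
        _ = try device.makeLibrary(source: source, options: nil)
        print("compiled")
    } catch {
        // On macOS Tahoe Beta 4 this reports: illegal string literal in 'asm'
        print("compile failed: \(error)")
    }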
Topic: Graphics & Games · SubTopic: Metal
Replies: 4 · Boosts: 6 · Views: 948 · Activity: Jul ’25
SceneKit is now deprecated - when will it be removed?
Now that SceneKit has been marked as soft deprecated, is there a planned date or timeframe when it will be completely removed from iOS? I’m concerned about how long my existing SceneKit-based game will continue to work, especially as an indie developer without the resources for a quick rewrite to RealityKit.
Replies: 4 · Boosts: 1 · Views: 567 · Activity: 16h
What good is NSBitmapFormatAlphaNonpremultiplied?
If I create a bitmap image and then try to get ready to draw into it, like so:

    NSBitmapImageRep* newRep = [[NSBitmapImageRep alloc]
        initWithBitmapDataPlanes: nullptr
                      pixelsWide: 128
                      pixelsHigh: 128
                   bitsPerSample: 8
                 samplesPerPixel: 4
                        hasAlpha: YES
                        isPlanar: NO
                  colorSpaceName: NSDeviceRGBColorSpace
                    bitmapFormat: NSBitmapFormatAlphaNonpremultiplied | NSBitmapFormatThirtyTwoBitBigEndian
                     bytesPerRow: 4 * 128
                    bitsPerPixel: 32];
    [NSGraphicsContext setCurrentContext:
        [NSGraphicsContext graphicsContextWithBitmapImageRep: newRep]];

then the log shows this error:

    CGBitmapContextCreate: unsupported parameter combination: RGB 8 bits/component, integer 512 bytes/row kCGImageAlphaLast kCGImageByteOrderDefault kCGImagePixelFormatPacked
    Valid parameters for RGB color space model are:
    16 bits per pixel, 5 bits per component, kCGImageAlphaNoneSkipFirst
    32 bits per pixel, 8 bits per component, kCGImageAlphaNoneSkipFirst
    32 bits per pixel, 8 bits per component, kCGImageAlphaNoneSkipLast
    32 bits per pixel, 8 bits per component, kCGImageAlphaPremultipliedFirst
    32 bits per pixel, 8 bits per component, kCGImageAlphaPremultipliedLast
    32 bits per pixel, 10 bits per component, kCGImageAlphaNone|kCGImagePixelFormatRGBCIF10|kCGImageByteOrder16Little
    64 bits per pixel, 16 bits per component, kCGImageAlphaPremultipliedLast
    64 bits per pixel, 16 bits per component, kCGImageAlphaNoneSkipLast
    64 bits per pixel, 16 bits per component, kCGImageAlphaPremultipliedLast|kCGBitmapFloatComponents|kCGImageByteOrder16Little
    64 bits per pixel, 16 bits per component, kCGImageAlphaNoneSkipLast|kCGBitmapFloatComponents|kCGImageByteOrder16Little
    128 bits per pixel, 32 bits per component, kCGImageAlphaPremultipliedLast|kCGBitmapFloatComponents
    128 bits per pixel, 32 bits per component, kCGImageAlphaNoneSkipLast|kCGBitmapFloatComponents
    See Quartz 2D Programming Guide (available online) for more information.

If I don't use NSBitmapFormatAlphaNonpremultiplied as part of the format, I don't get the error message. My question is: why does the constant NSBitmapFormatAlphaNonpremultiplied exist if you can't use it like this? If you're wondering why I wanted to do this: I want to extract the RGBA pixel data from an image, which might have non-premultiplied alpha. Elsewhere online, I saw advice that if you want to look at the pixels of an image, you should draw it into a bitmap whose format you know and look at those pixels, and I don't want the process of drawing to premultiply my alpha.
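The error list shows that CG drawing contexts accept only premultiplied (or skipped) alpha, so one workaround is to draw premultiplied and divide the alpha back out afterwards. A minimal Swift sketch under that assumption (straight-alpha recovery is lossy where alpha is small and impossible at alpha == 0):

    import CoreGraphics

    // Draw `image` into a premultiplied RGBA8 context, then un-premultiply in place.
    func straightAlphaPixels(of image: CGImage) -> [UInt8]? {
        let w = image.width, h = image.height
        var buf = [UInt8](repeating: 0, count: w * h * 4)
        let drawn = buf.withUnsafeMutableBytes { raw -> Bool in
            guard let ctx = CGContext(data: raw.baseAddress, width: w, height: h,
                                      bitsPerComponent: 8, bytesPerRow: w * 4,
                                      space: CGColorSpace(name: CGColorSpace.sRGB)!,
                                      bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
            else { return false }
            ctx.draw(image, in: CGRect(x: 0, y: 0, width: w, height: h))
            return true
        }
        guard drawn else { return nil }
        for i in stride(from: 0, to: buf.count, by: 4) {
            let a = Int(buf[i + 3])
            guard a > 0 else { continue }   // alpha == 0: color is unrecoverable
            buf[i]     = UInt8(min(255, Int(buf[i])     * 255 / a))
            buf[i + 1] = UInt8(min(255, Int(buf[i + 1]) * 255 / a))
            buf[i + 2] = UInt8(min(255, Int(buf[i + 2]) * 255 / a))
        }
        return buf
    }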
Replies: 3 · Boosts: 0 · Views: 181 · Activity: Jun ’25
Customize the Metal Performance HUD on Apple TV
Hi there, is it possible to customize the Metal Performance HUD on Apple TV, similar to how it can be done on iPhone and iPad? I'd like to see things like compiled shaders for my apps on tvOS.
Replies: 3 · Boosts: 0 · Views: 373 · Activity: Aug ’25
Unable to play .aivu with VideoPlayerComponent
I’m trying to play an Apple Immersive video in the .aivu format with VideoPlayerComponent, following the official documentation here: https://aninterestingwebsite.com/documentation/RealityKit/VideoPlayerComponent Here is a simplified version of the code I'm running in another application:

    import SwiftUI
    import RealityKit
    import AVFoundation

    struct ImmersiveView: View {
        var body: some View {
            RealityView { content in
                let player = AVPlayer(url: Bundle.main.url(
                    forResource: "Apple_Immersive_Video_Beach", withExtension: "aivu")!)
                let videoEntity = Entity()
                var videoPlayerComponent = VideoPlayerComponent(avPlayer: player)
                videoPlayerComponent.desiredImmersiveViewingMode = .full
                videoPlayerComponent.desiredViewingMode = .stereo
                player.play()
                videoEntity.components.set(videoPlayerComponent)
                content.add(videoEntity)
            }
        }
    }

Full code is here: https://github.com/tomkrikorian/AIVU-VideoPlayerComponentIssueSample But the video does not play in my project, even though the file is correct (it plays in Apple Immersive Video Utility), and I'm getting these errors when the app crashes:

    App VideoPlayer+Component Caption: onComponentDidUpdate Media Type is invalid
    Domain=SpatialAudioServicesErrorDomain Code=2020631397 "xpc error" UserInfo={NSLocalizedDescription=xpc error}
    CA_UISoundClient.cpp:436 Got error -4 attempting to SetIntendedSpatialAudioExperience
    [0x101257490|InputElement #0|Initialize] Number of channels = 0 in AudioChannelLayout does not match number of channels = 2 in stream format.

The video I'm using is the official sample, which can be found here, but I tried several different files shot by my clients and the same errors appear, so the issue is definitely not the files but something on the RealityKit side: https://aninterestingwebsite.com/documentation/immersivemediasupport/authoring-apple-immersive-video Steps to reproduce the issue: open the AIVUPlayerSample project, run it, and look at the logs. All code is in ImmersiveView.swift, and a sample file is included in the project. Expected results: if I followed the documentation and samples provided, I should see my video played in full immersive mode inside my ImmersiveSpace. Am I doing something wrong in the code? I'm basically following the documentation here. Feedback ticket: FB19971306
Replies: 3 · Boosts: 0 · Views: 865 · Activity: Aug ’25
Any way to save metrics presets as preferences in Metal HUD 4 on macOS?
I mean… I want to persist the configuration with defaults rather than launching apps via open with the saved environment variables. This is pretty easy on iOS and other platforms, so what about on macOS?
Replies: 3 · Boosts: 0 · Views: 447 · Activity: Aug ’25
How to use MetalPerformancePrimitives
I am trying to learn the new Metal Performance Primitives APIs. I have added the MetalPerformancePrimitives framework and included the header in my shader code per the documentation: #include <MetalPerformancePrimitives/MetalPerformancePrimitives.h> Unfortunately, Xcode complains that the header cannot be found. How do I include it properly? I am using Xcode 26 on Tahoe. The MetalPerformancePrimitives framework is present on my machine, and I can inspect the headers in the filesystem.
Replies: 3 · Boosts: 1 · Views: 791 · Activity: Oct ’25
RealityKit captureHighResolutionFrame from session is broken on iOS 26?
A bit of background on what our app is doing: we have a RealityKit ARView session running, during which we place objects in RealityKit. At some point the user can "take a photo", and we use session.captureHighResolutionFrame to capture a frame. We then use the captured frame and frame.camera.projectPoint to project our objects back to 2D. The issue we found is that on devices running iOS 26, the first photo the user takes, i.e. the first frame received from session.captureHighResolutionFrame, gives an incorrect CGPoint from frame.camera.projectPoint. If the user takes a second photo with the same camera position, the second frame gives a correct CGPoint. I noticed a difference between the first and subsequent frames that I believe corresponds to the issue: the yaw value of the camera (frame.camera.eulerAngles.y) on the first frame is not correct (it is inconsistent with any subsequent frame). I also created a small example app, following the Building an Immersive Experience with RealityKit example. The issue exists in this app on iOS 26, while iOS 18.* has consistent values between first and subsequent captured frames. Note: the yaw value seems to differ more if we start the session in portrait but take the photo in landscape. Example result for three captured frames:

    Frame captured with yaw: 1.4855177402496338
    Frame captured with yaw: -0.08803760260343552
    Frame captured with yaw: -0.08179682493209839

Example code:

    class CustomARView: ARView, ARSessionDelegate {
        required init(frame: CGRect) {
            super.init(frame: frame)
        }

        required init?(coder decoder: NSCoder) {
            fatalError("init(coder:) has not been implemented")
        }

        func setup() {
            let singleTap = UITapGestureRecognizer(target: self, action: #selector(handleTap))
            addGestureRecognizer(singleTap)
        }

        @objc func handleTap(_ gestureRecognizer: UIGestureRecognizer) {
            Task {
                do {
                    let frame = try await session.captureHighResolutionFrame()
                    print("Frame captured with yaw: \(Double(frame.camera.eulerAngles.y))")
                } catch {
                }
            }
        }
    }

    struct CustomARViewUIViewRepresentable: UIViewRepresentable {
        func makeUIView(context: Context) -> some UIView {
            let arView = CustomARView(frame: .zero)
            arView.setup()
            return arView
        }

        func updateUIView(_ uiView: UIViewType, context: Context) { }
    }

    struct ContentView: View {
        var body: some View {
            CustomARViewUIViewRepresentable()
                .frame(maxWidth: .infinity, maxHeight: .infinity)
                .ignoresSafeArea()
        }
    }
Replies: 3 · Boosts: 1 · Views: 712 · Activity: Sep ’25
SceneKit view and SceneKit editor color difference
Hello there, I'm having trouble matching what I see in the SceneKit editor with the output of the resulting scene in an SCNView. For a glitter effect, I have set a high value on the diffuse intensity, which looks fine in the editor, but when running the game the colors are much darker. To see if the intensity value is merely capped, I set the same multiplier on the hat below, but there it is blown out, which looks to me like some grading is going on. I tried switching on HDR rendering, but that didn't make a difference. I tried disabling linear rendering, and that simply made everything darker still, which I expect. Does someone have an idea what else this could be? What rendering is the SceneKit editor using, and how can I match it? Interestingly, when I take a screenshot of the editor window for this post, the image is also blown out... what is going on? :) Thanks so much for any pointers, Seb
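One experiment that can help isolate tone-mapping differences (a sketch, not a known fix: the editor's exact pipeline isn't documented, and `scene`/`scnView` stand in for your own setup): render through an explicit camera with HDR on and exposure adaptation pinned, so any automatic grading is under your control.

    import SceneKit

    let cameraNode = SCNNode()
    let camera = SCNCamera()
    camera.wantsHDR = true                  // opt in to the HDR pipeline explicitly
    camera.wantsExposureAdaptation = false  // pin exposure so nothing auto-grades
    camera.exposureOffset = 0
    camera.minimumExposure = 0
    camera.maximumExposure = 0
    cameraNode.camera = camera
    scene.rootNode.addChildNode(cameraNode)
    scnView.pointOfView = cameraNode

If the darkening tracks these settings, the mismatch is in tone mapping rather than in the material values themselves.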
Replies: 3 · Boosts: 0 · Views: 267 · Activity: Apr ’25
Clarifying when Game Center activity events fire relative to authentication
Hello, in our game we enforce an age gate before showing Game Center sign-in. Only after the user passes the age gate do we call GKLocalPlayer.localPlayer.authenticateHandler. The reason I'm asking is that we want to reliably detect if the game was launched from a Game Center activity in the Games app (iOS 26+). If the user prefers to enter via activities, we don't want to miss that event during cold start. Our current proposal is:

- Register a GKLocalPlayerListener early in didFinishLaunchingWithOptions: so the app is ready to catch events.
- Queue any incoming events in our dispatcher.
- Only process those events after the user passes the age gate and authentication succeeds.

My questions are:

- Does player:wantsToPlayGameActivity:completionHandler: ever fire before authentication, or only after the local player is authenticated?
- If it only fires after authentication, is our "register early but gate processing" approach (sketched below) the correct way to ensure we don't miss activity launches?
- Is there any recommended pattern to distinguish an "activity launch" from a "normal launch" in this age-gate scenario?

We want to respect Apple's age gate requirements, but also ensure activity launches are not lost if the user prefers that entry point. Sorry if this is a stupid question; I just want to be sure we're following the right pattern. Thanks for any clarification or best-practice guidance!
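A sketch of the register-early, gate-processing pattern described above. Assumptions are flagged: the Swift method signature is my guess at the projection of player:wantsToPlayGameActivity:completionHandler:, and the GKGameActivity handling is reduced to a placeholder; verify both against the iOS 26 GameKit headers.

    import GameKit

    final class ActivityEventGate: NSObject, GKLocalPlayerListener {
        static let shared = ActivityEventGate()
        private var pending: [() -> Void] = []
        private var isReady = false   // true once age gate + authentication both pass

        // Call from application(_:didFinishLaunchingWithOptions:).
        func register() {
            GKLocalPlayer.local.register(self)
        }

        // Assumed Swift projection of player:wantsToPlayGameActivity:completionHandler:.
        func player(_ player: GKPlayer, wantsToPlay activity: GKGameActivity,
                    completionHandler: @escaping () -> Void) {
            enqueue {
                // Route the activity into the game here.
                completionHandler()
            }
        }

        private func enqueue(_ work: @escaping () -> Void) {
            if isReady { work() } else { pending.append(work) }
        }

        // Call after the age gate is passed and authenticateHandler reports success.
        func unlock() {
            isReady = true
            pending.forEach { $0() }
            pending.removeAll()
        }
    }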
Replies: 3 · Boosts: 0 · Views: 606 · Activity: Feb ’26
receivedTurnEventForMatch giving stale data
In my turn-based game, I receive the GKTurnBasedEventListener event receivedTurnEventForMatch and decode the match.matchData. On occasion, the matchData is clearly stale and is from the previous turn. If I bring up the matchmaker view controller and select that same match, the data is not stale, so it's not a matter of failing to call endTurn. I have tried both loadMatchWithID and loadMatchesWithCompletionHandler after receiving receivedTurnEventForMatch, but the data is still stale. Advice?
Replies: 3 · Boosts: 0 · Views: 925 · Activity: 2d
CIBumpDistortion filter not working on my view
I'm trying to apply a CIBumpDistortion Core Image filter to a view that contains a UILabel (my storyLabel). The goal is to create a visual bump/magnifying-glass effect over the text. However, despite my attempts, the filter doesn't seem to render at all. The view and the label appear as normal, with no distortion effect. I've tried adjusting the filter parameters and reviewing the view hierarchy, but without success. I also haven't been able to find clear documentation or examples for applying this filter to a UIView's layer.

    //
    //  TVView.swift
    //  Mistery
    //
    //  Created by Joje on 31/07/25.
    //

    import CoreImage
    import CoreImage.CIFilterBuiltins
    import UIKit
    import AVFoundation

    final class TVView: UIView {
        // Text animation properties
        private var textAnimationTimer: Timer?
        private var fullTextToAnimate: String = ""
        private var currentCharIndex: Int = 0

        // Static-video properties
        private var player: AVQueuePlayer?
        private var playerLayer: AVPlayerLayer?
        private var playerLooper: AVPlayerLooper?

        var onNextButtonTap: () -> Void = {}

        // MARK: - Subviews

        // The TV image
        private(set) lazy var tvImageView: UIImageView = {
            let imageView = UIImageView()
            imageView.translatesAutoresizingMaskIntoConstraints = false
            imageView.image = UIImage(named: "tvFinal")
            imageView.contentMode = .scaleAspectFit
            return imageView
        }()

        // The text that scrolls inside the TV
        private(set) lazy var storyLabel: UILabel = {
            let label = UILabel()
            label.translatesAutoresizingMaskIntoConstraints = false
            //label.backgroundColor = .gray
            label.textColor = .red
            label.font = UIFont(name: "MeltedMonster", size: 30)
            label.textAlignment = .left
            label.numberOfLines = 0
            label.text = ""
            return label
        }()

        private(set) lazy var nextButton: UIButton = {
            let button = UIButton(type: .system)
            button.translatesAutoresizingMaskIntoConstraints = false
            //button.backgroundColor = .darkGray
            button.addTarget(self, action: #selector(didPressNextButton), for: .touchUpInside)
            return button
        }()

        // MARK: - Lifecycle

        override init(frame: CGRect) {
            super.init(frame: frame)
            backgroundColor = .black
            setupVideoPlayer()
            addSubviews()
            setupConstraints()
        }

        override func layoutSubviews() {
            super.layoutSubviews()
            playerLayer?.frame = tvImageView.frame.insetBy(
                dx: tvImageView.frame.width * 0.05,
                dy: tvImageView.frame.height * 0.18)
            setupFisheyeEffect()
        }

        private func setupFisheyeEffect() {
            // Create the filter
            guard let filter = CIFilter(name: "CIBumpDistortion") else { return print("error") }
            storyLabel.layer.shouldRasterize = true
            storyLabel.layer.rasterizationScale = UIScreen.main.scale
            // Set the parameters
            filter.setDefaults()
            // Center of the effect
            let center = CIVector(x: storyLabel.bounds.midX, y: storyLabel.bounds.midY)
            filter.setValue(center, forKey: kCIInputCenterKey)
            // Distortion radius
            filter.setValue(storyLabel.bounds.width, forKey: kCIInputRadiusKey)
            // Distortion intensity
            filter.setValue(7, forKey: kCIInputScaleKey)
            storyLabel.layer.filters = [filter]
        }

        required init?(coder: NSCoder) {
            fatalError("init(coder:) has not been implemented")
        }

        // MARK: - Button actions

        @objc private func didPressNextButton() {
            onNextButtonTap()
        }

        @objc private func animateNextCharacter() {
            guard currentCharIndex < fullTextToAnimate.count else {
                textAnimationTimer?.invalidate()
                return
            }
            let currentTextIndex = fullTextToAnimate.index(fullTextToAnimate.startIndex, offsetBy: currentCharIndex)
            let partialText = String(fullTextToAnimate[...currentTextIndex])
            storyLabel.text = partialText
            currentCharIndex += 1
        }

        public func updateStoryText(with text: String) {
            textAnimationTimer?.invalidate()
            storyLabel.text = ""
            fullTextToAnimate = text
            currentCharIndex = 0
            textAnimationTimer = Timer.scheduledTimer(timeInterval: 0.12, target: self,
                selector: #selector(animateNextCharacter), userInfo: nil, repeats: true)
        }

        // MARK: - Setup methods

        private func setupVideoPlayer() {
            guard let videoURL = Bundle.main.url(forResource: "static-video", withExtension: "mov") else {
                print("Error: could not find the video file static-video.mov")
                return
            }
            let playerItem = AVPlayerItem(url: videoURL)
            player = AVQueuePlayer(playerItem: playerItem)
            // LINE WITH A POSSIBLE ERROR
            playerLooper = AVPlayerLooper(player: player!, templateItem: playerItem)
            playerLayer = AVPlayerLayer(player: player)
            playerLayer?.videoGravity = .resizeAspectFill
            if let layer = playerLayer {
                self.layer.addSublayer(layer)
            }
            player?.play()
        }

        private func addSubviews() {
            self.addSubview(storyLabel)
            self.addSubview(tvImageView)
            self.addSubview(nextButton)
        }

        private func setupConstraints() {
            NSLayoutConstraint.activate([
                // TV image
                tvImageView.centerXAnchor.constraint(equalTo: centerXAnchor),
                tvImageView.centerYAnchor.constraint(equalTo: centerYAnchor),
                tvImageView.widthAnchor.constraint(equalTo: widthAnchor),
                // TV text
                storyLabel.centerXAnchor.constraint(equalTo: tvImageView.centerXAnchor, constant: -50),
                storyLabel.centerYAnchor.constraint(equalTo: tvImageView.centerYAnchor, constant: -25),
                storyLabel.widthAnchor.constraint(equalTo: tvImageView.widthAnchor, multiplier: 0.35),
                storyLabel.heightAnchor.constraint(equalTo: tvImageView.heightAnchor, multiplier: 0.42),
                // TV button
                nextButton.topAnchor.constraint(equalTo: tvImageView.centerYAnchor, constant: -25),
                nextButton.centerXAnchor.constraint(equalTo: self.centerXAnchor, constant: 190),
                nextButton.widthAnchor.constraint(equalToConstant: 100),
                nextButton.heightAnchor.constraint(equalToConstant: 160)
            ])
        }
    }

    #Preview {
        ViewController()
    }
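The likely root cause: as far as I know, CALayer.filters is honored only on macOS; on iOS it is ignored, which matches the "no effect at all" symptom. A workaround sketch under that assumption: rasterize the label into a UIImage, run CIBumpDistortion through a CIContext, and display the result (for example in a UIImageView layered over the TV image):

    import CoreImage.CIFilterBuiltins
    import UIKit

    // Render the label into an image, distort it, and return the result.
    func bumpDistortedImage(of label: UILabel) -> UIImage? {
        let rendered = UIGraphicsImageRenderer(bounds: label.bounds).image { ctx in
            label.layer.render(in: ctx.cgContext)
        }
        guard let input = CIImage(image: rendered) else { return nil }

        let filter = CIFilter.bumpDistortion()
        filter.inputImage = input
        filter.center = CGPoint(x: input.extent.midX, y: input.extent.midY)
        filter.radius = Float(input.extent.width / 2)
        filter.scale = 0.7
        guard let output = filter.outputImage else { return nil }

        let context = CIContext()
        guard let cgImage = context.createCGImage(output, from: input.extent) else { return nil }
        return UIImage(cgImage: cgImage, scale: rendered.scale, orientation: .up)
    }

Calling this whenever the label's text changes, and showing the image above (or instead of) the label, produces the bump effect that layer.filters silently drops on iOS.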
Replies: 3 · Boosts: 0 · Views: 194 · Activity: Sep ’25
GameKit Turn Based Matches Push Notifications
I'm developing a game that supports GameKit turn-based matches. What I don't understand is this: is tapping the Game Center push notification the only way for the GKTurnBasedEventListener to trigger? What if someone misses the push message (swiping it away by accident or something like that) but still wants to join? Is there some inbox somewhere where the pending messages can be seen or fetched? Also, it was mentioned in a very old WWDC video (from 2013, I think that's the latest with information about turn-based matches) that the notification also includes a badge for the icon. However, I do not understand how to implement that. Is there any documentation for that?
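On the inbox question: pending matches can always be fetched directly, so a dismissed push isn't fatal. A sketch using the async projection of loadMatchesWithCompletionHandler to find matches waiting on the local player:

    import GameKit

    // Fetch every turn-based match involving the local player and
    // keep the open ones where it is currently our turn.
    func matchesAwaitingLocalPlayer() async throws -> [GKTurnBasedMatch] {
        let matches = try await GKTurnBasedMatch.loadMatches()
        return matches.filter { match in
            match.status == .open &&
            match.currentParticipant?.player?.gamePlayerID == GKLocalPlayer.local.gamePlayerID
        }
    }

Calling this when the game becomes active (and again when returning from the background) covers the missed-notification case.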
Replies: 3 · Boosts: 0 · Views: 913 · Activity: Jan ’26
How does the automatch feature work in GameKit?
I'm developing a turn-based game. When I present the GKTurnBasedMatchmakerViewController, players can opt in to automatch instead of selecting a specific friend as an opponent. How exactly does the matching work if a player doesn't specify anything explicitly? Does Game Center send push notifications in a round-robin fashion to all friends, with the first one to accept matched as the opponent? Is this documented somewhere?
Replies: 3 · Boosts: 0 · Views: 533 · Activity: Jan ’26
Is there any future for screensavers on macOS?
I haven't been looking at screensavers for a long time because of Apple's lack of will (or resources?) to provide a public version of the private modern SDK that Apple itself has been using for a very long time now. I'm now looking at the Screen Saver pane in System Settings (the What-If version of System Preferences in an alternate universe where all screens are in portrait mode). In macOS Sequoia, it seems that 3rd-party screensavers are not welcome, considering that they are relegated to the "Other" section at the bottom of the list and you have to click Show All to start seeing them. I also had a quick look at macOS Tahoe Beta 3, and it looks like all the real screensavers are gone (3rd-party and Apple's own: Hello, Message, Flurry, etc.), or at least it takes a Nobel Prize to find them (and the Search field is not useful). I tried to install a 3rd-party screen saver on macOS Tahoe Beta 3; it doesn't show up in the list. To summarize:

- No public access to modern APIs, AFAIK.
- A UI that is hostile to 3rd-party screen savers on macOS Sequoia.
- Apparently only screensavers that are slideshows or movies curated by Apple in macOS Tahoe b3.

Hence the question: is there any future for screen savers on macOS? Because if there's none, I won't waste my time trying to update some old screen savers.
Replies: 3 · Boosts: 0 · Views: 576 · Activity: Aug ’25
RealityView content scale factor
Hi, following the recent deprecation of SceneKit, I'm trying to move a couple of my SceneKit projects to RealityKit. One thing I can't seem to find is how to change the content scale factor when using a RealityView in SwiftUI. It was really easy to do in SceneKit with just an SCNView property, and it seems it's also possible when using ARView, but I can't find a way to do it with a RealityView. Maybe it's a SwiftUI limitation?
Replies: 3 · Boosts: 1 · Views: 208 · Activity: Jul ’25
RealityKit .kinematic + collisions (visionOS)
Hi everyone, I'm new to visionOS development. I'm trying to create a physics-based scene (with gravity) where users can pick up and move objects on a workbench. I am struggling with physics interactions during the drag gesture:

- Kinematic mode: if I switch to .kinematic during the drag, the object moves smoothly but clips through other objects (no collisions).
- Dynamic mode: I tried keeping it .dynamic and applying linear velocity toward the hand position, but the movement feels laggy and unresponsive.
- Hybrid approach: I tried switching to .kinematic during DragGesture.onChanged and back to .dynamic on collision, but this causes the entity to jitter/shake violently when touching other objects.

Has anyone found a clean way to drag objects while maintaining solid collisions? Thanks for your help!
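For the laggy dynamic case, one pattern worth trying (a sketch, with a gain that needs tuning per scene): keep the body .dynamic and drive it each update with a velocity proportional to its distance from the gesture target, instead of teleporting the entity.

    import RealityKit

    // Steer a dynamic body toward `target` by setting its velocity every frame.
    // Higher stiffness is more responsive but can overshoot and jitter.
    func steer(_ entity: Entity, toward target: SIMD3<Float>, stiffness: Float = 12) {
        let current = entity.position(relativeTo: nil)
        let velocity = (target - current) * stiffness
        entity.components.set(PhysicsMotionComponent(linearVelocity: velocity))
    }

Because the body stays .dynamic throughout, the physics engine keeps resolving collisions, so there is no clipping and no mode-switch jitter.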
Replies: 3 · Boosts: 0 · Views: 841 · Activity: 2w
Memory leak when no draw calls issued to encoder
I noticed that when the render command encoder adds no draw calls, an app's memory usage seems to grow unboundedly. I'm using a super simple MTKView-based setup with the following delegate (code at the end). If I add the simplest of draw calls, e.g., a single vertex, the app's memory usage is normal, around 100-ish MB. I attached a couple of screenshots, one from Xcode and one from Instruments. What's going on here? Is this an illegal program? If yes, why does it not crash, as it would if the encoder or command buffer weren't ended? Or is there some race condition at play here due to the lack of draws?

    class Renderer: NSObject, MTKViewDelegate {
        var device: MTLDevice
        var commandQueue: MTL4CommandQueue
        var commandBuffer: MTL4CommandBuffer
        var allocator: MTL4CommandAllocator

        override init() {
            guard let d = MTLCreateSystemDefaultDevice(),
                  let queue = d.makeMTL4CommandQueue(),
                  let cmdBuffer = d.makeCommandBuffer(),
                  let alloc = d.makeCommandAllocator()
            else {
                fatalError("unable to create metal 4 objects")
            }
            self.device = d
            self.commandQueue = queue
            self.commandBuffer = cmdBuffer
            self.allocator = alloc
            super.init()
        }

        func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}

        func draw(in view: MTKView) {
            guard let drawable = view.currentDrawable else { return }
            commandBuffer.beginCommandBuffer(allocator: allocator)
            guard let descriptor = view.currentMTL4RenderPassDescriptor,
                  let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: descriptor)
            else {
                fatalError("unable to create encoder")
            }
            encoder.endEncoding()
            commandBuffer.endCommandBuffer()
            commandQueue.waitForDrawable(drawable)
            commandQueue.commit([commandBuffer])
            commandQueue.signalDrawable(drawable)
            drawable.present()
        }
    }
Replies: 3 · Boosts: 0 · Views: 436 · Activity: Jan ’26
Creating a new app
I want to create a game on the App Store. The first page would show the terms and conditions and an option to start the game. It would have 200 categories: from Saudi Arabia, from TV series, from games, girls-only, Qatar and the UAE, anime, Turkish series, tourism, countries, international companies, and electronics companies.
Replies: 3 · Boosts: 0 · Views: 347 · Activity: Mar ’26