Reality Composer Pro

Prototype and produce content for AR experiences using Reality Composer Pro.

Posts under Reality Composer Pro subtopic

Post

Replies

Boosts

Views

Created

Exporting .reality files from Reality Composer Pro
I've been using the macOS Xcode Reality Composer to export interactive .reality files that can be hosted on the web and linked to, triggering Quick Look to open the interactive AR experience. That works really well. I've just downloaded the Xcode 15 beta, which ships with the new Reality Composer Pro, and I can't see a way to export to .reality files anymore. It seems that this is only for building content that ships in native iOS etc. apps, rather than experiences that can be viewed in Quick Look. Am I missing something, or is it no longer possible to export .reality files? Thanks.
3
2
2.1k
Sep ’23
Reality Composer/Xcode 15
Reality Composer is no longer available in the Xcode 15 release. Is this intended, or will it be available in later releases? Do I need to revert to Xcode 15 beta 8 to get Reality Composer?
4
0
1.1k
Oct ’23
Reality Composer Pro for iPhone game
We are building an AR experience for deployment on iPhones. We are using Unity, but it looks as if Reality Composer Pro has better features for spatial audio. I am not sure whether Reality Composer Pro can only be used for Vision Pro, or whether it can also be used for deployment on iPhone or iPad.
1
0
611
Mar ’24
ShaderGraphMaterial with Occlusion Surface Output fails to load on iOS and macOS
A ShaderGraphMaterial with an Occlusion Surface Output generated with RealityComposer 2 fails to load on iOS 18 and macOS 15 with the following error: RealityFoundation.ShaderGraphMaterial.LoadError.invalidTypeFound (https://aninterestingwebsite.com/documentation/realitykit/shadergraphmaterial/loaderror/invalidtypefound). This happens with both https://aninterestingwebsite.com/documentation/shadergraph/realitykit/occlusion-surface-(realitykit) and https://aninterestingwebsite.com/documentation/shadergraph/realitykit/shadow-receiving-occlusion-surface-(realitykit).

RealityView { content in
    do {
        let bgEntity = ModelEntity(
            mesh: .generateCone(height: 0.5, radius: 0.1),
            materials: [SimpleMaterial(color: .red, isMetallic: true)]
        )
        bgEntity.position.z = -0.2
        content.add(bgEntity)

        let occlusionMaterial = try await ShaderGraphMaterial(
            named: "/Root/OcclusionMaterial",
            from: "OcclusionMaterial"
        )
        let testEntity = ModelEntity(
            mesh: .generateSphere(radius: 0.4),
            materials: [occlusionMaterial]
        )
        content.add(testEntity)
        content.cameraTarget = testEntity
    } catch {
        print("Shader Graph Load Error:")
        dump(error)
    }
}
.realityViewCameraControls(.orbit)
.edgesIgnoringSafeArea(.all)

Feedback ID: FB15081296
2
1
1.5k
Sep ’24
[visionOS] How to render side-by-side stereo video?
I want to render a 3D/stereoscopic video in an Apple Vision Pro window using RealityKit/RealityView. The video is left-right stereo. The straightforward approach would be to spawn a quad and give it a custom Shader Graph material that uses a CameraIndexSwitch node to choose between the right texture and the left texture. https://i.sstatic.net/XawqjNcg.png The issue I have here is that I have to extract the video frames from my AVSampleBufferVideoRenderer. This should work OK, but not if I'm playing FairPlay content. So, my question is: how do I render stereo FairPlay videos in a SwiftUI RealityView?
1
0
736
Dec ’24
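A minimal sketch of the quad-plus-material approach described above, for the non-FairPlay case. The material path and parameter names ("/Root/StereoMaterial", "LeftImage", "RightImage") are assumptions, not a confirmed setup:

import RealityKit
import RealityKitContent

// Hedged sketch: a quad whose shader-graph material uses a Camera Index Switch
// to pick the left or right texture per eye. Names below are placeholders.
func makeStereoQuad() async throws -> ModelEntity {
    var material = try await ShaderGraphMaterial(
        named: "/Root/StereoMaterial",   // assumed material path
        from: "Scene",                   // assumed .usda file name
        in: realityKitContentBundle
    )
    try material.setParameter(name: "LeftImage",
                              value: .textureResource(try await TextureResource(named: "left")))
    try material.setParameter(name: "RightImage",
                              value: .textureResource(try await TextureResource(named: "right")))
    return ModelEntity(mesh: .generatePlane(width: 1.6, height: 0.9),
                       materials: [material])
}

For FairPlay-protected frames this breaks down, since the decoded pixels aren't accessible for texture extraction; that part of the question stands.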
Vision Pro build failed
error: [xrsimulator] Component Compatibility: EnvironmentLightingConfiguration not available for 'xros 1.0', please update 'platforms' array in Package.swift
error: [xrsimulator] Exception thrown during compile: compileFailedBecause(reason: "compatibility faults")
error: Tool exited with code 1
5
0
696
Jan ’25
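The error above points at the content package's manifest; a hedged sketch of the fix it is asking for, assuming the component requires visionOS 2.0 (package name and tools version are placeholders):

// swift-tools-version: 6.0
import PackageDescription

let package = Package(
    name: "RealityKitContent",           // placeholder package name
    platforms: [
        .visionOS(.v2)                   // raise from .v1 ('xros 1.0')
    ],
    products: [
        .library(name: "RealityKitContent", targets: ["RealityKitContent"])
    ],
    targets: [
        .target(name: "RealityKitContent")
    ]
)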
RC Pro Timeline Notification Not Received in Xcode
I'm having a heck of a time getting this to work. I'm trying to add an event notification at the end of a timeline animation to trigger something in code, but I'm not receiving the notification from RC Pro. I've watched that Compose Interactive 3D Content video quite a few times now and have tried many different ways. RC Pro has the correct ID names on the notifications. I'm not a programmer at all. Just a lowly 3D artist. Here is my code...

import SwiftUI
import RealityKit
import RealityKitContent

extension Notification.Name {
    static let button1Pressed = Notification.Name("button1pressed")
    static let button2Pressed = Notification.Name("button2pressed")
    static let button3Pressed = Notification.Name("button3pressed")
}

struct MainButtons: View {
    @State private var transitionToNextSceneForButton1 = false
    @State private var transitionToNextSceneForButton2 = false
    @State private var transitionToNextSceneForButton3 = false
    @Environment(AppModel.self) var appModel
    @Environment(\.dismissWindow) var dismissWindow

    // Notification publishers for each button
    private let button1PressedReceived = NotificationCenter.default.publisher(for: .button1Pressed)
    private let button2PressedReceived = NotificationCenter.default.publisher(for: .button2Pressed)
    private let button3PressedReceived = NotificationCenter.default.publisher(for: .button3Pressed)

    var body: some View {
        ZStack {
            RealityView { content in
                // Load your RC Pro scene that contains the 3D buttons.
                if let immersiveContentEntity = try? await Entity(named: "MainButtons", in: realityKitContentBundle) {
                    content.add(immersiveContentEntity)
                }
            }
            // Optionally attach a gesture if you want to debug a generic tap:
            .gesture(
                TapGesture().targetedToAnyEntity().onEnded { value in
                    print("3D Object tapped")
                    _ = value.entity.applyTapForBehaviors()
                    // Do not post a test notification here; rely on RC Pro timeline events.
                }
            )
        }
        .onAppear {
            dismissWindow(id: "main")
            // Remove any test notification posting code.
        }
        // Listen for distinct button notifications.
        .onReceive(button1PressedReceived) { (output) in
            print("Button 1 pressed notification received")
            transitionToNextSceneForButton1 = true
        }
        .onReceive(button2PressedReceived.receive(on: DispatchQueue.main)) { _ in
            print("Button 2 pressed notification received")
            transitionToNextSceneForButton2 = true
        }
        .onReceive(button3PressedReceived.receive(on: DispatchQueue.main)) { _ in
            print("Button 3 pressed notification received")
            transitionToNextSceneForButton3 = true
        }
        // Present next scenes for each button as needed. For example, for button 1:
        .fullScreenCover(isPresented: $transitionToNextSceneForButton1) {
            FacilityTour()
                .environment(appModel)
        }
        // You can add additional fullScreenCover modifiers for button 2 and 3 transitions.
    }
}
5
0
540
Mar ’25
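One plausible cause, offered as an assumption rather than a confirmed diagnosis: RC Pro timeline Notification actions are delivered under the single name "RealityKit.NotificationTrigger", with the ID set in RC Pro carried in userInfo, rather than under custom names like "button1pressed". A sketch of a receiver built on that assumption (userInfo keys as used in Apple's interactive-content sample):

import Combine
import Foundation

// Hedged sketch: listen for the shared RC Pro trigger name and route on the
// identifier carried in userInfo.
let timelineTriggers = NotificationCenter.default
    .publisher(for: Notification.Name("RealityKit.NotificationTrigger"))
    .compactMap { note in
        note.userInfo?["RealityKit.NotificationTrigger.Identifier"] as? String
    }

// Usage inside the view:
// .onReceive(timelineTriggers) { id in
//     if id == "button1pressed" { transitionToNextSceneForButton1 = true }
// }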
Pass Video/Frames to a Shader Graph?
Wondering if this is even possible without using CVImageBuffer and passing each frame as an image, which I imagine will be very expensive. I have a PoC of a shader graph that applies a radial zoom effect to an image. In RealityKit I'm passing the image as a resource:

if let textureResource = try? await TextureResource(named: "fuji") {
    let value = MaterialParameters.Value.textureResource(textureResource)
    try? material.setParameter(name: "MyImage", value: value)
    model.model?.materials = [material]
}

Thanks in advance
0
0
140
Apr ’25
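One avenue worth testing here, stated as an assumption rather than a recommendation: TextureResource.DrawableQueue, which lets you present GPU drawables into an existing texture parameter each frame instead of rebuilding a TextureResource per frame. A rough sketch; the dimensions and the placeholder asset are assumptions:

import Metal
import RealityKit

// Hedged sketch: stream frames into the shader-graph parameter "MyImage"
// through a drawable queue.
func attachDrawableQueue(to material: inout ShaderGraphMaterial) throws {
    let descriptor = TextureResource.DrawableQueue.Descriptor(
        pixelFormat: .bgra8Unorm,
        width: 1920, height: 1080,          // assumed video dimensions
        usage: [.renderTarget, .shaderRead],
        mipmapsMode: .none
    )
    let queue = try TextureResource.DrawableQueue(descriptor)
    let texture = try TextureResource.load(named: "fuji")   // placeholder contents
    texture.replace(withDrawables: queue)
    try material.setParameter(name: "MyImage", value: .textureResource(texture))
    // Per video frame: queue.nextDrawable(), blit the decoded pixel buffer into
    // drawable.texture with a Metal blit encoder, then drawable.present().
}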
The skeletal model moves
I want to make a model with added bones move by dragging it with gestures. The model was exported from Blender. My understanding is that this uses IKComponent, but I don't know how to use it specifically.
2
0
161
Apr ’25
VideoDockingRegion AVPlayer video plays through all objects
Hi, I'm trying to place an object in front of an AVPlayer that is docked in a VideoDockingRegion, but when launched in immersive space, the video shows through the objects placed in front of it. How do I make sure these objects stay visible? (Image attached for reference.)
0
0
122
Apr ’25
Don't Apple frameworks support non-English USDZ?
I have two cubes in my Blender project, but one gets lost after importing the USDZ file exported from that project. It seems that Apple frameworks don't support non-English USDZ content.
1
0
108
Apr ’25
Materials and shaders fail to render correctly on visionOS when developing with Unity
Specifically: materials display normally in the Unity editor, but after deployment to a Vision Pro device, some materials are lost or shader effects are abnormal (for example, transparency channels fail or lighting calculations are wrong). This problem has affected our development progress, and we hope to get help from technical support.
0
0
79
Apr ’25
attenuation map covers over object
Hi, I'm trying to create a virtual movie theater, but after running computeDiffuseReflectionUVs.py and applying the attenuation map, I noticed the light falloff effect just covers over the objects. I used the Apple-provided attenuation map (I did not specify the attenuation map name in the Python script) with a sample size of 6000. I thought the Python script would calculate vertices and create shadows for, say, the backs of the chairs. Am I understanding this wrong?
1
0
114
Apr ’25
How to obtain a video stream that includes both the physical world and digital content when using the Enterprise API on Vision Pro
We have successfully obtained the "Main Camera access" and "Passthrough in screen capture" permissions from Apple. Currently, the video streams we receive are from the physical world and do not include the digital world. How can we obtain video streams that include both the physical and digital worlds? Thank you!
0
0
128
Apr ’25
Playing USDZ animation at last known location of reference object
Hi there, I'm using Reality Composer Pro to anchor virtual content to a .referenceobject. However, moving the reference object quickly causes tracking to stop. (I know this is a limitation, so I'm trying to make it a feature.) Is there a way to play a USDZ animation at the last known location after detecting that the reference object is no longer being tracked? Is it possible to set this up in Reality Composer Pro? Nearly everything is set up in Reality Composer Pro, with my immersive scene just anchoring virtual content to the reference object in the RCP scene, so my immersive view just does this:

if let immersiveContentEntity = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
    content.add(immersiveContentEntity)
}

and this:

.onAppear {
    appModel.immersiveSpaceState = .open
}
.onDisappear {
    appModel.immersiveSpaceState = .closed
}

I have tried using SpatialTracking and WorldTrackingProvider, but I'm still quite new to Swift and coding in general, so I'm unsure how to implement them in conjunction with my RCP scene, and whether this is actually the right way to do it. Apologies for my lack of knowledge.
0
0
96
Apr ’25
Playing USDZ animation at last known location of reference object
Hi there, I'm using Reality Composer Pro to anchor virtual content to a .referenceobject. However, moving the reference object quickly causes tracking to stop. (I know this is a limitation, and I am trying to embrace it as a feature.) Is there a way to play a USDZ animation at the last known location after detecting that the reference object is no longer tracked? Is it possible to set this up in Reality Composer Pro? I'm trying to get the USDZ to play before the virtual content disappears (due to the reference object not being located), so that it smooths out the vanishing of the content. Nearly everything is set up in Reality Composer Pro, with my immersive scene just adding virtual content to the reference object, which anchors it in the RCP scene, so my immersive view just does this:

if let immersiveContentEntity = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
    content.add(immersiveContentEntity)
}

and this:

.onAppear {
    appModel.immersiveSpaceState = .open
}
.onDisappear {
    appModel.immersiveSpaceState = .closed
}

I have tried using SpatialTracking and WorldTrackingProvider, but I'm still quite new to Swift and coding in general, so I'm unsure how to implement them in conjunction with my RCP scene, and whether this is the right way to go about it. Also, I have implemented this at the beginning of object tracking: all I had to do was add an onAppear behavior to the object to play a USDZ, and that works. Doing it for disappearing (due to loss of the reference object) seems to be a lot harder.
0
0
120
Apr ’25
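For both variants of this question, a hedged sketch of the code-side approach, assuming ARKit's ObjectTrackingProvider is the tracking source and that the RCP content can be mirrored by an entity you control (all names are placeholders):

import ARKit
import RealityKit

// Hedged sketch: follow an object anchor while it is tracked; when tracking
// drops, leave the entity at its last known pose and play its USDZ animation.
func followObjectAnchors(_ provider: ObjectTrackingProvider, entity: Entity) async {
    for await update in provider.anchorUpdates {
        let anchor = update.anchor
        if anchor.isTracked {
            entity.transform = Transform(matrix: anchor.originFromAnchorTransform)
        } else if let animation = entity.availableAnimations.first {
            // The last known transform is still set from the final tracked update.
            entity.playAnimation(animation)
        }
    }
}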
I'm trying to use VideoMaterial on an object in Xcode 16, and I'm lost.
I believe I have created a VideoMaterial and assigned it to a mesh with code I found in the developer documentation, but I'm getting this error: "Trailing closure passed to parameter of type 'String' that does not accept a closure". I have attached a photo of the code and where the error happens. Any help will be greatly appreciated.
0
0
127
May ’25
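For reference, the usual VideoMaterial setup looks like the sketch below (the asset name is a placeholder); the quoted compiler error typically means a closure ended up where a String argument, such as a resource name, was expected:

import AVFoundation
import RealityKit

// Hedged sketch: a plane that plays a bundled video through VideoMaterial.
func makeVideoEntity() -> ModelEntity? {
    guard let url = Bundle.main.url(forResource: "clip", withExtension: "mp4") else {
        return nil                       // assumed bundled asset
    }
    let player = AVPlayer(url: url)
    let material = VideoMaterial(avPlayer: player)
    let entity = ModelEntity(mesh: .generatePlane(width: 1.6, height: 0.9),
                             materials: [material])
    player.play()
    return entity
}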
Launching a timeline on a specific model via notification
Hello! I’m familiar with the discussion on “Sending messages to the scene”, and I’ve successfully used that code. However, I have several instances of the same model in my scene. Is it possible to make only one specific model respond to a notification? For example, can I pass something like RealityKit.NotificationTrigger.SourceEntity in userInfo or use another method to target just one instance?
1
1
204
May ’25
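For context, the baseline "messages to the scene" pattern the post refers to is sketched below. Whether the trigger can be scoped to a single instance (for example via a SourceEntity-style userInfo key) is exactly the open question, so treat this only as the documented starting point:

import Foundation
import RealityKit

// Hedged sketch: trigger an RC Pro behavior from code. The trigger is
// addressed by identifier within a scene, which is why every instance
// sharing the identifier responds.
func triggerBehavior(identifier: String, in scene: RealityKit.Scene) {
    NotificationCenter.default.post(
        name: Notification.Name("RealityKit.NotificationTrigger"),
        object: nil,
        userInfo: [
            "RealityKit.NotificationTrigger.Scene": scene,
            "RealityKit.NotificationTrigger.Identifier": identifier
        ]
    )
}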
My MacBook screen has little messed-up pixels. Help!
Help me, please. I can't remove the little messed-up pixels, and there are more and more of them. Please help me! 2025 MacBook Pro, no screen protector; the photos show the actual screen.
1
0
152
May ’25
Looking for help on setting up shader graph nodes to create a Gaussian blur effect.
Could anyone share ideas or a node setup to implement a Gaussian blur in a shader graph material, with a blur-size parameter? Thanks!
0
0
160
May ’25
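A small helper that may be useful for whoever attempts this: the per-tap weights for a separable Gaussian blur can be precomputed and baked into shader-graph constant nodes. A sketch, where tap count and sigma are whatever the effect needs:

import Foundation

// Hedged sketch: normalized 1D Gaussian weights for an N-tap separable blur.
func gaussianWeights(taps: Int, sigma: Double) -> [Double] {
    let radius = taps / 2
    let raw = (-radius...radius).map { x in
        exp(-Double(x * x) / (2.0 * sigma * sigma))
    }
    let sum = raw.reduce(0, +)
    return raw.map { $0 / sum }     // weights sum to 1
}

// Example: gaussianWeights(taps: 9, sigma: 2.0)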