Main Problem:
I am adding this section after the fact to clarify the problem. I can pause my video (I do not want it playing on a loop). When my node comes into sight, it plays my video, even if the video is paused. If the video has finished playing and the node comes back into sight, it restarts. I want to remove this behavior.
In my app, I have an SKVideoNode created from an AVPlayer(url:), placed in 3D space using SCNNode and SCNGeometry objects. I use ARKit image tracking to determine when a specific image is found, and then play a video there. All is good and dandy, except that the player decides to play on its own every time it comes into sight; more precisely, whenever the ARImageAnchor that the SCNNode is attached to comes into sight. Either way, the AVPlayer plays every time the node comes into view of the camera. I use
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if keyPath == "rate" {
        print((object as! AVPlayer).rate)
    }
}
to print out the rate, and it reports 1 even though it was previously 0. I added a print statement, e.g. print("Play"), to every one of my functions that calls player.pause() or player.play(), and none of them are called when the rate changes as above. How can I find the source of what is changing the rate of my player?
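One way to chase that down, offered only as a sketch (the property and function names below are illustrative, not from my project): observe the player's rate with block-based KVO and dump the call stack whenever it changes, so whatever is calling play() shows up in the console. A symbolic breakpoint on -[AVPlayer setRate:] in Xcode gives the same information interactively.
import AVFoundation

// Illustrative names; keep the observation alive for as long as the player lives.
var rateObservation: NSKeyValueObservation?

func watchRateChanges(of player: AVPlayer) {
    rateObservation = player.observe(\.rate, options: [.old, .new]) { _, change in
        print("rate changed: \(change.oldValue ?? 0) -> \(change.newValue ?? 0)")
        // The stack symbols reveal which code path triggered play()/pause().
        Thread.callStackSymbols.forEach { print($0) }
    }
}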
I checked the original root node, self.sceneView.scene.rootNode.childNodes, to make sure I am not creating extra video nodes/SCNNodes/AVPlayers, etc., and it seems that there is only one.
Any ideas on why the SKVideoNode/AVPlayer is playing as the SCNNode comes into sight of the camera using ARKit? Thanks in advance!
Edit 1:
I made a workaround to determine only when a user has tapped on this node:
let tap = UITapGestureRecognizer(target: self, action: #selector(self!.tapGesture))
tap.delegate = self!
tap.name = "MyTap"
self!.sceneView.addGestureRecognizer(tap)
and then, inside the following function, I put
@objc func tapGesture(_ gesture: UITapGestureRecognizer) {
    let tappedNodes = self.sceneView.hitTest(gesture.location(in: gesture.view), options: [SCNHitTestOption.searchMode: 1])
    if !tappedNodes.isEmpty {
        for nodes in tappedNodes {
            if nodes.node == videoPlayer3D {
                videoPlayer3D.tappedVideoPlayer = true
                videoPlayer3D.playOrPause()
                break
            }
        }
    }
}
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if keyPath == "rate" {
        print((object as! AVPlayer).rate)
        if !self.tappedVideoPlayer {
            self.player.pause() // HERE
        }
    }
}
where videoPlayer3D is the SCNNode that contains the SKVideoNode.
However, I get the error com.apple.scenekit.scnview-renderer (17): EXC_BAD_ACCESS (code=2, address=0x16d8f7ad0) on the line labeled "HERE" above. It seems that the scene view's renderer is attempting to alter my video node in its render loop, even though I don't implement the renderer(_:updateAtTime:) function at all; I only use
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let imageAnchor = anchor as? ARImageAnchor else { return }
    createVideoNode(imageAnchor)
}
to determine when I see an image (i.e., image tracking), and then I create the node. Any tips?
Thought 1
The error states that some method is being called on an SCNView object from the renderer method (that's what I understand from the error), but I don't call anything on the node there explicitly. I think some default action may be triggered as the node comes into view, but I'm not 100% sure how to access it or determine which method it is. The objects I'm using are not SCNView objects, and I don't believe they inherit from SCNView (see the first paragraph for the types used). I'm just looking to remove the "action" of the node playing every time it comes into view.
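One possibility I'm considering, purely as a sketch and an assumption: the KVO callback may be firing on SceneKit's render thread, so pausing the player from there could be what crashes. Deferring the pause to the main queue would rule that out:
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    guard keyPath == "rate", let player = object as? AVPlayer else { return }
    print(player.rate)
    if !self.tappedVideoPlayer {
        // Hop to the main queue so the SceneKit render thread never mutates the player directly.
        DispatchQueue.main.async {
            player.pause()
        }
    }
}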
ADDITION:
For the sake of following the creation of my video player, if you're interested, here it is. Let me know if there is anything else you'd like to see (not sure what else you might want to see), and thanks for your help.
func createVideoNode(_ anchor: ARImageAnchor, initialPOV: SCNNode) -> My3DPlayer? {
    guard let currentFrame = self.sceneView.session.currentFrame else {
        return nil
    }
    let delegate = UIApplication.shared.delegate as! AppDelegate
    var videoPlayer: My3DPlayer!
    videoPlayer = delegate.testing ? My3DPlayer(data: nil, currentFrame: currentFrame, anchor: anchor) : My3DPlayer(data: self.urlData, currentFrame: currentFrame, anchor: anchor)

    //Create TapGesture
    let tap = UITapGestureRecognizer(target: self, action: #selector(self.tapGesture))
    tap.delegate = self
    tap.name = "MyTap"
    self.sceneView.addGestureRecognizer(tap)

    return videoPlayer
}
My3DPlayer class:
class My3DPlayer: SCNNode {

    init(geometry: SCNGeometry?) {
        super.init()
        self.geometry = geometry
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    convenience init(data: Data?, currentFrame: ARFrame, anchor: ARImageAnchor) {
        self.init(geometry: nil)
        self.createPlayer(currentFrame, data, anchor)
    }

    private func createPlayer(_ frame: ARFrame, _ data: Data?, _ anchor: ARImageAnchor) {
        let physicalSize = anchor.referenceImage.physicalSize
        print("Init Player W/ physicalSize: \(physicalSize)")

        //Create video
        if (UIApplication.shared.delegate! as! AppDelegate).testing {
            let path = Bundle.main.path(forResource: "Bear", ofType: "mov")
            self.url = URL(fileURLWithPath: path!)
        }
        else {
            let url = data!.getAVAssetURL(location: "MyLocation")
            self.url = url
        }
        let asset = AVAsset(url: self.url)
        let track = asset.tracks(withMediaType: AVMediaType.video).first!
        let playerItem = AVPlayerItem(asset: asset)
        let player = AVPlayer(playerItem: playerItem)
        self.player = player

        var videoSize = track.naturalSize.applying(track.preferredTransform)
        videoSize = CGSize(width: abs(videoSize.width), height: abs(videoSize.height))
        print("Init Video W/ size: \(videoSize)")

        //Determine if landscape or portrait
        self.landscape = videoSize.width > videoSize.height
        print(self.landscape == true ? "Landscape" : "Portrait")

        //Do something when video ended
        NotificationCenter.default.addObserver(self, selector: #selector(playerDidFinishPlaying(note:)), name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: nil)

        //Add observer to determine when Player is ready
        player.addObserver(self, forKeyPath: "status", options: [], context: nil)

        //Create video Node
        let videoNode = SKVideoNode(avPlayer: player)

        //Create 2d scene to put 2d player on - SKScene
        videoNode.position = CGPoint(x: videoSize.width/2, y: videoSize.height/2)
        videoNode.size = videoSize

        //Portrait -- //Landscape doesn't need adjustments??
        if !self.landscape {
            let width = videoNode.size.width
            videoNode.size.width = videoNode.size.height
            videoNode.size.height = width
            videoNode.position = CGPoint(x: videoNode.size.width/2, y: videoNode.size.height/2)
        }
        let scene = SKScene(size: videoNode.size)

        //Add videoNode to scene
        scene.addChild(videoNode)

        //Create Button-look even though we don't use the button. Just creates the illusion of pressing play and pause
        let image = UIImage(named: "PlayButton")!
        let texture = SKTexture(image: image)
        self.button = SKSpriteNode(texture: texture)
        self.button.position = videoNode.position

        //Makes the button look like a square
        let minimumSize = [videoSize.width, videoSize.height].min()!
        self.button.size = CGSize(width: minimumSize/4, height: minimumSize/4)
        scene.addChild(button)

        //Get ratio difference from physicalSize and video size
        let widthRatio = Float(physicalSize.width)/Float(videoSize.width)
        let heightRatio = Float(physicalSize.height)/Float(videoSize.height)
        let finalRatio = [widthRatio, heightRatio].min()!

        //Create a Plane (SCNPlane) to put the SKScene on
        let plane = SCNPlane(width: scene.size.width, height: scene.size.height)
        plane.firstMaterial?.diffuse.contents = scene
        plane.firstMaterial?.isDoubleSided = true

        //Set self.geometry = plane
        self.geometry = plane

        //Size the node correctly
        //Find the real scaling variable
        let scale = CGFloat(finalRatio)
        let appearanceAction = SCNAction.scale(to: scale, duration: 0.4)
        appearanceAction.timingMode = .easeOut

        //Set initial scale to 0 then use action to scale up
        self.scale = SCNVector3Make(0, 0, 0)
        self.runAction(appearanceAction)
    }

    @objc func playerDidFinishPlaying(note: Notification) {
        self.player.seek(to: .zero, toleranceBefore: .zero, toleranceAfter: .zero)
        self.player.seek(to: .zero, toleranceBefore: .zero, toleranceAfter: .zero)
        self.setButtonAlpha(alpha: 1)
    }
}
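Not part of the class above, but worth noting as a sketch: createPlayer registers both a NotificationCenter observer and a KVO observer on "status" and never removes them. A deinit along these lines would balance them (assuming player is stored as an optional or implicitly unwrapped property, since the property declarations are not shown):
deinit {
    // Balance the observers registered in createPlayer(_:_:_:).
    // `player` is assumed to be an optional/implicitly unwrapped stored property.
    NotificationCenter.default.removeObserver(self)
    player?.removeObserver(self, forKeyPath: "status")
}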
Efforts1:
I have tried to stop tracking via:
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let imageAnchor = anchor as? ARImageAnchor else { return }
    createVideoNode(imageAnchor)
    self.resetConfiguration(turnOnConfig: true, turnOnImageTracking: false)
}

func resetConfiguration(turnOnConfig: Bool = true, turnOnImageTracking: Bool = false) {
    let configuration = ARWorldTrackingConfiguration()
    if turnOnImageTracking {
        guard let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources", bundle: nil) else {
            fatalError("Missing expected asset catalog resources.")
        }
        configuration.planeDetection = .horizontal
        configuration.detectionImages = referenceImages
    }
    else {
        configuration.planeDetection = []
    }
    if turnOnConfig {
        sceneView.session.run(configuration, options: [.resetTracking])
    }
}
Above, I have tried to reset the configuration. This only seems to reset the detected planes, because the video still plays on render: whether it was paused or had finished, it resets and starts over, or continues playing where it left off.
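If the intent of the reset was also to throw away the anchor that had already been detected, ARKit's run options allow that too; this is shown only as a variation on the reset above (an assumption on my part), not as a confirmed fix for the playback behavior:
// Re-run the session and also discard the anchors created so far (including the ARImageAnchor).
sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])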
Efforts2:
I have tried
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let imageAnchor = anchor as? ARImageAnchor else { return }
    createVideoNode(imageAnchor)
    self.pauseTracking()
}

func pauseTracking() {
    self.sceneView.session.pause()
}
This stops everything, so the camera feed even freezes, since nothing is being tracked anymore. It is completely useless here.
Ok, so here is a fix. See renderer(_:updateAtTime:).
var player: AVPlayer!
// SKVideoNode that renders the video inside the SKScene
var videoSpriteNode: SKVideoNode!
var play = true

@objc func tap(_ recognizer: UITapGestureRecognizer) {
    if play {
        play = false
        player.pause()
    } else {
        play = true
        player.play()
    }
}

func setVideo() -> SKScene {
    let size = CGSize(width: 500, height: 500)
    let skScene = SKScene(size: size)

    let videoURL = Bundle.main.url(forResource: "video.mp4", withExtension: nil)!
    player = AVPlayer(url: videoURL)

    skScene.scaleMode = .aspectFit

    videoSpriteNode = SKVideoNode(avPlayer: player)
    videoSpriteNode.position = CGPoint(x: size.width/2, y: size.height/2)
    videoSpriteNode.size = size
    videoSpriteNode.yScale = -1
    skScene.addChild(videoSpriteNode)

    player.play()

    return skScene
}

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    if let image = anchor as? ARImageAnchor {
        print("found")

        let planeGeometry = SCNPlane(width: image.referenceImage.physicalSize.width, height: image.referenceImage.physicalSize.height)
        let plane = SCNNode(geometry: planeGeometry)
        planeGeometry.materials.first?.diffuse.contents = setVideo()
        plane.transform = SCNMatrix4MakeRotation(-.pi/2, 1, 0, 0)

        node.addChildNode(plane)
    }
}

func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    if !play {
        player.pause()
    }
}
Use this idea in your code.
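A sketch of folding the same idea into the question's setup, assuming videoPlayer3D is stored as an optional property and given a Bool flag that the tap handler toggles (the flag name userWantsPlayback is invented for illustration):
func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    // `userWantsPlayback` is an illustrative flag, not from the original post:
    // the tap handler sets it, and the render loop re-asserts the paused state
    // every frame, undoing whatever resumes the SKVideoNode on visibility changes.
    guard let videoPlayer = videoPlayer3D, !videoPlayer.userWantsPlayback else { return }
    videoPlayer.player?.pause()
}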