 

SceneKit get texture coordinate after touch with Swift

I want to manipulate 2D textures in a 3D SceneKit scene. Therefore I used this code to get local coordinates:

@IBAction func tap(sender: UITapGestureRecognizer) {
    var arr:NSArray = my3dView.hitTest(sender.locationInView(my3dView), options: NSDictionary(dictionary: [SCNHitTestFirstFoundOnlyKey:true]))
    var res:SCNHitTestResult = arr.firstObject as SCNHitTestResult

    var vect:SCNVector3 = res.localCoordinates
}

I have the texture read out from my scene with:

    var mat:SCNNode = myscene.rootNode.childNodes[0] as SCNNode
    var child:SCNNode = mat.childNodeWithName("ID12", recursively: false)
    var geo:SCNMaterial = child.geometry.firstMaterial
    var channel = geo.diffuse.mappingChannel        
    var textureimg:UIImage = geo.diffuse.contents as UIImage

And now I want to draw at the touch point on the texture... How can I do that? How can I transform my touch coordinate to the texture image?

asked Feb 13 '23 by Philsen

1 Answer

Sounds like you have two problems. (Without even using regular expressions. :))

First, you need to get the texture coordinates of the tapped point -- that is, the point in 2D texture space on the surface of the object. You've almost got that right already. SCNHitTestResult provides those with the textureCoordinatesWithMappingChannel method. (You're using localCoordinates, which gets you a point in the 3D space owned by the node in the hit-test result.) And you already seem to have found the business about mapping channels, so you know what to pass to that method.
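Using the names from the question's code, that lookup is a one-liner (a sketch; `res` and `geo` here stand for the hit-test result and material from the question):

```swift
// Texture-space point of the tap, using the question's `res` and `geo`:
let channel = geo.diffuse.mappingChannel
let texcoord = res.textureCoordinatesWithMappingChannel(channel)
// texcoord is a CGPoint in unit texture space: (0,0) to (1,1)
```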

Problem #2 is how to draw.

You're doing the right thing to get the material's contents as a UIImage. Once you've got that, you could look into drawing with UIGraphics and CGContext functions -- create an image with UIGraphicsBeginImageContext, draw the existing image into it, then draw whatever new content you want to add at the tapped point. After that, you can get the image you were drawing with UIGraphicsGetImageFromCurrentImageContext and set it as the new diffuse.contents of your material. However, that's probably not the best way -- you're schlepping a bunch of image data around on the CPU, and the code is a bit unwieldy, too.
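As a rough sketch of that CPU-based route (the function name and parameters are illustrative, not from the question; `material` is the SCNMaterial and `texcoord` the unit texture coordinate from the hit test):

```swift
func drawTouch(material: SCNMaterial, texcoord: CGPoint) {
    let oldImage = material.diffuse.contents as UIImage

    // Create a bitmap context and redraw the existing texture into it
    UIGraphicsBeginImageContext(oldImage.size)
    oldImage.drawInRect(CGRect(origin: CGPointZero, size: oldImage.size))

    // Scale the 0.0-1.0 texture coords up to pixel coords
    let point = CGPoint(x: texcoord.x * oldImage.size.width,
                        y: texcoord.y * oldImage.size.height)

    // Mark the tapped spot (here, a 10pt green dot)
    let context = UIGraphicsGetCurrentContext()
    CGContextSetFillColorWithColor(context, UIColor.greenColor().CGColor)
    CGContextFillEllipseInRect(context,
        CGRect(x: point.x - 5, y: point.y - 5, width: 10, height: 10))

    // Swap the edited image back into the material
    material.diffuse.contents = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
}
```

Note the round trip: the whole texture is redrawn on the CPU on every tap, which is exactly the overhead the SpriteKit approach below avoids.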

A better approach might be to take advantage of the integration between SceneKit and SpriteKit. This way, all your 2D drawing is happening in the same GPU context as the 3D drawing -- and the code's a bit simpler.

You can set your material's diffuse.contents to a SpriteKit scene. (To use the UIImage you currently have for that texture, just stick it on an SKSpriteNode that fills the scene.) Once you have the texture coordinates, you can add a sprite to the scene at that point.

var nodeToDrawOn: SCNNode!
var skScene: SKScene!

func mySetup() { // or viewDidLoad, or wherever you do setup
    // whatever else you're doing for setup, plus:

    // 1. remember which node we want to draw on
    nodeToDrawOn = myScene.rootNode.childNodeWithName("ID12", recursively: true)

    // 2. set up that node's texture as a SpriteKit scene
    let currentImage = nodeToDrawOn.geometry!.firstMaterial!.diffuse.contents as UIImage
    skScene = SKScene(size: currentImage.size)
    nodeToDrawOn.geometry!.firstMaterial!.diffuse.contents = skScene

    // 3. put the currentImage into a background sprite for the skScene
    let background = SKSpriteNode(texture: SKTexture(image: currentImage))
    background.position = CGPoint(x: skScene.frame.midX, y: skScene.frame.midY)
    skScene.addChild(background)
}

@IBAction func tap(sender: UITapGestureRecognizer) {
    let results = my3dView.hitTest(sender.locationInView(my3dView), options: [SCNHitTestFirstFoundOnlyKey: true]) as [SCNHitTestResult]
    if let result = results.first {
        if result.node === nodeToDrawOn {
            // 1. get the texture coordinates
            let channel = nodeToDrawOn.geometry!.firstMaterial!.diffuse.mappingChannel
            let texcoord = result.textureCoordinatesWithMappingChannel(channel)

            // 2. place a sprite there
            let sprite = SKSpriteNode(color: SKColor.greenColor(), size: CGSize(width: 10, height: 10))
            // scale coords: texcoords go 0.0-1.0, skScene space is in pixels
            sprite.position.x = texcoord.x * skScene.size.width
            sprite.position.y = texcoord.y * skScene.size.height
            skScene.addChild(sprite)
        }
    }
}

For more details on the SpriteKit approach (in Objective-C) see the SceneKit State of the Union Demo from WWDC14. That shows a SpriteKit scene used as the texture map for a torus, with spheres of paint getting thrown at it -- whenever a sphere collides with the torus, it gets an SCNHitTestResult and uses its texcoords to create a paint splatter in the SpriteKit scene.


Finally, some Swift style comments on your code (unrelated to the question and answer):

  • Use let instead of var wherever you don't need to reassign a value, and the optimizer will make your code go faster.
  • Explicit type annotations (res: SCNHitTestResult) are rarely necessary.
  • Swift dictionaries are bridged to NSDictionary, so you can pass them directly to an API that takes NSDictionary.
  • Casting to a Swift typed array (hitTest(...) as [SCNHitTestResult]) saves you from having to cast the contents.
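Applied to the question's tap handler, those points give something like this (same behavior, just tidier):

```swift
@IBAction func tap(sender: UITapGestureRecognizer) {
    // Swift dictionary passed directly; typed-array cast; no var, no annotations
    let results = my3dView.hitTest(sender.locationInView(my3dView),
        options: [SCNHitTestFirstFoundOnlyKey: true]) as [SCNHitTestResult]
    if let result = results.first {
        let vect = result.localCoordinates
        // ...
    }
}
```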
answered Feb 15 '23 by rickster