I'm trying to sync image edits between two copies of the same app running on different iPhones. I would like to send an NSSet * from one device to another (which I imagine involves encoding it as NSData), then deserializing it back to an NSSet and using it in a touchesMoved-type method. Is this feasible, or should I work on syncing the UIImages instead? I worry that UIImage syncing would have too much latency for realtime interaction.
Thanks for your help!
I take it you are dealing with a set of UITouch objects. Sending those directly will not work, as UITouch does not conform to NSCoding.
You will need to extract the information you need from each UITouch and put it into something that does conform to NSCoding.
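One way to do that (a sketch; the method name and dictionary keys here are my own choices, not an Apple API) is to wrap each touch's location and phase in an NSDictionary, which already conforms to NSCoding:

```objc
// Build an NSCoding-compliant representation of a set of touches.
// The keys ("x", "y", "phase") are arbitrary names chosen for illustration.
- (NSSet *)encodableSetFromTouches:(NSSet *)touches inView:(UIView *)view {
    NSMutableSet *result = [NSMutableSet setWithCapacity:[touches count]];
    for (UITouch *touch in touches) {
        CGPoint point = [touch locationInView:view];
        NSDictionary *info = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithFloat:point.x], @"x",
                              [NSNumber numberWithFloat:point.y], @"y",
                              [NSNumber numberWithInt:touch.phase], @"phase",
                              nil];
        [result addObject:info];
    }
    return result;
}
```

The resulting set contains only property-list objects, so it can be archived as shown next.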
NSSet * someSet = ...;
NSData * serializedSet = [NSKeyedArchiver archivedDataWithRootObject:someSet];
Then send this data using GameKit.
The other device, when it gets data, converts it back to a set.
NSData * receivedData = ....;
NSSet * set = [NSKeyedUnarchiver unarchiveObjectWithData:receivedData];
Then call whatever method you need to process the set. Since you are likely updating UI components here, make sure to perform that selector on the main thread.
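For example (assuming a hypothetical -processTouchSet: method on your controller), since the data may arrive on a background thread:

```objc
// GameKit delivers data on whatever thread it likes; hop to the
// main thread before touching any UIKit objects.
NSSet *set = [NSKeyedUnarchiver unarchiveObjectWithData:receivedData];
[self performSelectorOnMainThread:@selector(processTouchSet:)
                       withObject:set
                    waitUntilDone:NO];
```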
Also, this approach only works if the touch events you are dealing with are stateless, meaning it doesn't matter where a touch was before. Otherwise it can cause problems. It may be better to work out what has changed after the touch events, send that to the other device, and update it only with the deltas of how the UIImage changed (in other words, what processing was applied to the image), rather than the raw touches.
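As a sketch of that delta approach (the "op" key and its values are invented for illustration; they would be whatever edits your app supports), you would send a small description of the edit instead of raw touches or images:

```objc
// Describe what changed, not the raw input. The receiver replays
// the same edit against its own copy of the UIImage.
NSDictionary *delta = [NSDictionary dictionaryWithObjectsAndKeys:
                       @"stroke", @"op",
                       [NSNumber numberWithFloat:lastPoint.x], @"x",
                       [NSNumber numberWithFloat:lastPoint.y], @"y",
                       nil];
NSData *payload = [NSKeyedArchiver archivedDataWithRootObject:delta];
// send payload over GameKit as before
```

A delta like this is far smaller than a UIImage, which should address the latency concern.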