OS X has a nice feature, the Accessibility API, which lets your app control certain system elements and other applications from code. But it stops working once you turn on the App Sandbox, and sandboxing is mandatory for submitting an app to the Mac App Store.
This is what Apple says:
With App Sandbox, you can and should enable your app for accessibility, as described in this document. However, you cannot sandbox an assistive app such as a screen reader, and you cannot sandbox an app that controls another app.
But the Mac App Store does have a few applications that use the Accessibility API.
So apparently it is possible to combine the App Sandbox, the Accessibility API, and App Store distribution, but how?
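For illustration, here is roughly the kind of thing that works in a non-sandboxed build: a minimal Swift sketch that reads another app's window titles through the Accessibility API. TextEdit is just an example target, and Accessibility access must already be granted in System Preferences > Security & Privacy > Privacy > Accessibility.

```swift
import Cocoa
import ApplicationServices

// Bail out unless the user has granted this process Accessibility access.
guard AXIsProcessTrusted() else {
    print("Accessibility access has not been granted")
    exit(1)
}

// Pick a target app to inspect; TextEdit is used purely as an example.
guard let target = NSWorkspace.shared.runningApplications
        .first(where: { $0.bundleIdentifier == "com.apple.TextEdit" }) else {
    print("TextEdit is not running")
    exit(1)
}

// Create an accessibility element for the target process and ask for its windows.
let appElement = AXUIElementCreateApplication(target.processIdentifier)
var windowsValue: CFTypeRef?
let error = AXUIElementCopyAttributeValue(appElement,
                                          kAXWindowsAttribute as CFString,
                                          &windowsValue)

if error == .success, let windows = windowsValue as? [AXUIElement] {
    for window in windows {
        var titleValue: CFTypeRef?
        AXUIElementCopyAttributeValue(window, kAXTitleAttribute as CFString, &titleValue)
        print("Window title:", titleValue as? String ?? "(untitled)")
    }
} else {
    print("Could not read windows, AXError:", error.rawValue)
}
```

With the App Sandbox entitlement enabled, these same calls stop working, which is exactly the problem described above.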
As of a couple of years ago, Accessibility APIs were not available from within the OS X sandbox, but that might have changed. It is more likely, however, that those apps managed to get a special exception from Apple, and that they have additional sandboxing entitlements that partially break them out of the sandbox just enough to let them use the accessibility APIs.
What you should do is first file a bug via bugreporter.apple.com, explaining what you're trying to do with accessibility and why, and asking for a sandboxing exception. If they approve the exception, they will probably give you a custom bit of sandbox profile language that makes it possible to call the accessibility APIs from within the (somewhat weakened) sandbox.
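To give a rough idea of the shape such an exception takes, here is a hypothetical sketch of an .entitlements file that pairs the standard App Sandbox entitlement with a temporary sandbox-profile-language exception. The exception key and especially the SBPL rule shown are placeholders; the actual rule (and possibly a different exception key) would be whatever Apple supplies.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Standard App Sandbox entitlement -->
    <key>com.apple.security.app-sandbox</key>
    <true/>
    <!-- Hypothetical temporary exception written in sandbox profile
         language (SBPL); the real rule would be supplied by Apple. -->
    <key>com.apple.security.temporary-exception.sbpl</key>
    <array>
        <string>(allow mach-lookup (global-name "com.apple.placeholder.service"))</string>
    </array>
</dict>
</plist>
```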
With that said, depending on what you're doing, don't be surprised if Apple says, "Sorry, that app doesn't fit the app store model. Please sign your app using Developer ID with sandboxing disabled, and distribute it outside the store."