Given a set of two or more logical conditions, is it possible to algorithmically determine that exactly ONE of them will evaluate to TRUE? For example:
# this should pass, since for every X exactly one condition is met
cond 1: (X >= 1.0)
cond 2: (X < 1.0)
# this should fail, since X=1.5 would meet neither condition
cond 1: (X < 1.0)
cond 2: (X > 2.0)
# this should also fail, since X=1.0 would meet both conditions
cond 1: (X < 2.0)
cond 2: (X > 0.0)
# there may be more than one variable involved
cond 1: (X >= 1.0 && Y >= 0)
cond 2: (X < 1.0 && Y <= -1)
These conditions are generated from a domain-specific language and are used to determine the next execution path: users compose a condition for each option when the execution tree splits into multiple paths, and the condition that evaluates to true determines the path to be taken. For the simulation to be valid, there should be only one possible path for any given values.
At present, I evaluate these conditions at runtime and throw a tantrum if more than one (or none) of them are True.
I would like to be able to check erroneous conditions during the parse stage (domain language to compilable source code). Is it possible? How would one go about validating the conditions?
EDIT: I'll restate, because it seems the answers assume a number of things which have since been confirmed. With regards to what can be included in the conditions, the scope is rather wide in practice. All of these are possible conditions:
X >= Y && Y < Z
X.within_radius(0.4)
X IN some_array
X * Y < Z
It does seem that a solution covering all possible conditions is not feasible (or at least, given my limited knowledge, not feasible within the time allocated to the problem). I will revisit this some day, but for now I am accepting the answer that brought me forward the furthest.
If you can state your conditions (and the constraint that only one is true) in terms of Presburger arithmetic, then you can write a decision procedure to verify that property statically. This seems perfectly achievable from the examples above.
The "blunt instrument" approach is to basically interface with something like an automatic theorem prover or an SMT solver (where you would basically be trying to prove the negation of the statement "there exists some value x that satisfies constraint1 XOR constraint2"). I've programmatically interfaced with CVC3 before, and found it pretty good to work with, but my understand is that it has been surpassed by other SMT solvers.
Anything else you do to solve this problem is probably going to end up approximating some implementation of the kinds of tools I've suggested, I think. Depending on exactly how your constraints are specified, you might be able to get away with implementing some kind of decision procedure for something like Presburger arithmetic.
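As a concrete illustration of that approach (not from the answer above, and using the Z3 solver's Python bindings rather than CVC3), here is a minimal sketch that checks the first example from the question:
# Sketch: verify that exactly one of two conditions holds for every real X,
# using the Z3 SMT solver (the z3-solver package). cond1/cond2 mirror the
# first example in the question.
from z3 import Real, And, Or, Not, Solver, unsat

X = Real('X')
cond1 = X >= 1.0
cond2 = X < 1.0

# "Exactly one is true" is violated when both hold or when neither holds.
violation = Or(And(cond1, cond2), And(Not(cond1), Not(cond2)))

s = Solver()
s.add(violation)
if s.check() == unsat:
    print("OK: exactly one condition holds for every X")
else:
    print("FAIL, counterexample:", s.model())
Replacing cond1/cond2 with the second example from the question (X < 1.0, X > 2.0) makes the check report a counterexample somewhere in the range [1.0, 2.0], where neither condition holds.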
In general, no. But if what you're really asking is whether it is possible for conditions made up of boolean combinations of inequalities between a finite set of independent integer variables and constants, then there's hope. You can check exhaustively: substitute into the variables every constant that appears in your inequalities (plus each of those constants +1 and -1), and verify that exactly one condition holds for every such combination of values.
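A minimal sketch of that exhaustive check, under the assumption that each condition is supplied as a Python callable over named integer variables (the helper name and example values here are illustrative, not from the answer):
from itertools import product

def exactly_one_always_true(conditions, variables, constants):
    # Candidate values: every constant from the inequalities, plus one
    # above and one below each constant.
    candidates = sorted({c + d for c in constants for d in (-1, 0, 1)})
    # Try every assignment of candidate values to the variables and
    # verify that exactly one condition holds each time.
    for values in product(candidates, repeat=len(variables)):
        env = dict(zip(variables, values))
        if sum(bool(cond(**env)) for cond in conditions) != 1:
            return False, env   # counterexample assignment
    return True, None

# First (passing) pair of conditions from the question:
print(exactly_one_always_true(
    [lambda X: X >= 1, lambda X: X < 1], variables=["X"], constants=[1]))
# -> (True, None)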
If you want to check whether exactly one condition is true (out of two or more possible conditions), it may be helpful to refer to this XOR question on SO: xor-of-three-values. Taking directly from its answer:
(a ^ b ^ c) && !(a && b && c)
In your case:
(cond 1 ^ cond 2 ^ cond 3) && !(cond 1 && cond 2 && cond 3)
There's also a general solution that works for any number of conditions: increment a count each time a condition is true, then check that the count equals 1 once all conditions have been tested.
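For instance, a tiny runtime sketch of that counting check (the helper name is illustrative):
def exactly_one_true(conditions):
    # conditions is any iterable of already-evaluated booleans.
    return sum(1 for c in conditions if c) == 1

X = 1.0
print(exactly_one_true([X >= 1.0, X < 1.0]))  # True
print(exactly_one_true([X < 2.0, X > 0.0]))   # False: both hold for X = 1.0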