If I want to compute a^b mod c, there is an efficient way of doing this without computing a^b in full. However, when programming, if I write f g x then g(x) is computed in full regardless of f.
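For comparison (not J, but the same idea): the efficient method is modular exponentiation, which reduces mod c after every multiplication. Python's built-in three-argument pow does this, so the equivalent of the J example below returns instantly even though 2^10000000 has roughly three million digits:

```python
# Three-argument pow computes (2 ** 10_000_000) % 1000 by
# square-and-multiply with reduction at every step, never
# materializing the full power.
print(pow(2, 10_000_000, 1000))  # → 376
```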
J provides special code for composing f and g in certain cases, and modular power is one of them. For instance, the following runs extremely quickly.
1000&| @ (2&^) 10000000x
This is because the 'atop' conjunction @ tells the language to compose the functions, using the special code where possible. If I remove it, execution becomes unbearably slow.
If, however, I want to work with x^x, then ^~ no longer works and I get limit errors for large values. Binding those large values, however, does work.
So
999&| @ (100333454&^) 100333454x
executes nice and fast but
999&| @ ^~ 100333454x
gives me a limit error - the RHS is too big.
Am I right in thinking that in this case, J is not using an efficient power-modulo algorithm?
Per the special code page:
m&|@^ dyad avoids exponentiation for integer arguments
m&|@(n&^) monad avoids exponentiation for integer arguments
The case for ^~ is not supported by special code.
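To make the distinction concrete, here is a minimal Python sketch (an illustration, not J's internal code) of the square-and-multiply algorithm that the m&|@^ special code presumably applies. Because it reduces mod m after every multiplication, intermediate values never exceed m^2, so the x^x case that trips J's limit error poses no difficulty:

```python
def powmod(a, b, m):
    """Right-to-left square-and-multiply exponentiation.
    Reduces mod m at every step, so no intermediate value
    ever exceeds m * m."""
    result = 1
    a %= m
    while b > 0:
        if b & 1:                 # current bit of the exponent is set
            result = result * a % m
        a = a * a % m             # square the base, reduce immediately
        b >>= 1
    return result

x = 100333454
print(powmod(x, x, 999))  # the x^x case that gives J a limit error
```

Without the special code, J has to evaluate the full right argument of m&| first, which for x^x at this size is astronomically large, hence the limit error.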