 

Partial Derivatives in Haskell

A while back a friend wanted help with a program that could solve for the roots of functions using Newton's method, and naturally for that I needed some way to calculate the derivative of a function numerically, and this is what I came up with:

deriv f x = (f (x+h) - f x) / h where h = 0.00001

Newton's method was a fairly easy thing to implement, and it works rather well. But now I've started to wonder - Is there some way I could use this function to solve partial derivatives in a numerical manner, or is that something that would require a full-on CAS? I would post my attempts but I have absolutely no clue what to do yet.
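
For reference, here's a minimal sketch of the kind of Newton iteration I mean (the tolerance and iteration cap are arbitrary values I picked, not anything principled):

newton :: (Double -> Double) -> Double -> Double
newton f = go (100 :: Int)
  where
    go 0 x = x                                   -- give up after the iteration cap
    go n x
      | abs fx < 1e-9 = x                        -- close enough to a root
      | otherwise     = go (n - 1) (x - fx / deriv f x)
      where fx = f x

-- e.g. newton (\x -> x*x - 2) 1  ≈  1.41421356...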

Please keep in mind that I am new to Haskell. Thank you!

asked Jan 11 '18 by Ben DalFavero

1 Answer

You can certainly do much the same thing as you've already implemented, only perturbing each coordinate separately (i.e. multivariate finite differences). But first, as you should always do with top-level functions, add a type signature:

deriv :: (Double -> Double) -> Double -> Double

That's not the most general signature possible, but it's probably general enough for everything you'll need. For brevity I'll define

type ℝ = Double

in the following, so the signature reads

deriv :: (ℝ -> ℝ) -> ℝ -> ℝ

Now what you want is, for example in ℝ²:

grad :: ((ℝ,ℝ) -> ℝ) -> (ℝ,ℝ) -> (ℝ,ℝ)
grad f (x,y) = ((f (x+h,y) - f (x,y)) / h, (f (x,y+h) - f (x,y)) / h)
 where h = 0.00001
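
For instance, taking f(x,y) = x² + y² (a toy example of mine, not anything from your code), this gives roughly the analytic gradient (2x, 2y), off by about h because of the forward-difference truncation error:

grad (\(x,y) -> x*x + y*y) (1,2)   -- ≈ (2.00001, 4.00001); exact gradient at (1,2) is (2,4)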

It's awkward to have to write out the components individually and to tie the definition to a vector space of one particular dimension. A generic way of doing it:

import Data.VectorSpace  -- both modules come from the vector-space package
import Data.Basis

grad :: (HasBasis v, Scalar v ~ ℝ) => (v -> ℝ) -> v -> v
grad f x = recompose [ (e, (f (x ^+^ h *^ basisValue e) - f x) ^/ h)
                     | (e,_) <- decompose x ]
 where h = 0.00001
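
As a quick sanity check (again a toy example of my own, relying on the HasBasis instances for Double and for pairs that the vector-space package already ships):

grad (\(x,y) -> x*y) ((1,2) :: (ℝ,ℝ))   -- ≈ (2, 1), matching the analytic gradient (y, x)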

Note that this kind of finite differencing with a pre-chosen step size is always a tradeoff between truncation error from the higher-order terms and floating-point rounding error, so definitely check out automatic differentiation.
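
For instance, here is a minimal standalone sketch using the ad package's Numeric.AD.grad (unrelated to the grad defined above), which needs no step size at all and is exact up to floating point:

import Numeric.AD (grad)

-- reverse-mode automatic differentiation: no h to tune
main :: IO ()
main = print (grad (\[x, y] -> x*x + y*y) [1, 2 :: Double])
-- prints [2.0,4.0]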

answered Sep 30 '22 by leftaroundabout