
Why does a lower delta cause my PID controller to adjust with less precision?

Tags:

java

algorithm

I'm not entirely sure if this is the right place to ask this. Well, it is a programming problem I suppose.


I am trying to create a simple PID Controller simulation in Java.

In short, there is a target value and a current value. The current value is modified by a number. You give the PID controller the current value, and it returns a number in the hope that this number will cause the current value to approach the target value. Over time, as you keep using the PID controller, it "learns" (using integrals and derivatives) and eventually returns more and more accurate values. This is useful for, say, maintaining the equilibrium of a boat by controlling the wheel movement.


The formula used by a PID controller is pretty general and quite straightforward - or so I thought. In the example below, the value returned by the PID controller is simply added to the current value. I assume it would also work with more complex applications (that involve multiplication or division, etc.). This is my program:

public class PID {

    private static double Kp = 0.1;
    private static double Kd = 0.01;
    private static double Ki = 0.005;

    private static double targetValue = 100.0;
    private static double currentValue = 1.0;

    private static double integral = 0.0;
    private static double previousError = 0.0;

    private static double dt = 0.5;

    private static double max = 5;
    private static double min = -5;

    public static void main(String[] args) throws Exception {
        while (true) {
            Thread.sleep((long) (1000.0 * dt));
            double error      = targetValue - currentValue;
            integral          = integral + error * dt;
            double derivative = (error - previousError) / dt;
            double output     = Kp * error + Ki * integral + Kd * derivative;
            previousError     = error;

            // Clamp the output to the allowed range [min, max]:
            if (output > max) output = max;
            if (output < min) output = min;

            // Apply the output to the current value:
            System.out.println(currentValue + " + " + output + " = " + (currentValue + output));
            currentValue += output;
        }
    }

}

If you run this, you will see that the PID controller eventually manages to cause the current value to be very very close to the target value.

It's pretty cool. Now, I wanted to see my results a bit faster (because I plan to make some sort of interactive graph), so I decided to change the delta dt to 0.1.

Alas, the resulting value is no longer close to 100! Now it appears to reach 105 and then, very slowly, decrease to 100. That's not good!

Now imagine having dt at 0.01! It is extremely slow to reach 102, and it no longer comes back down to 100; it just keeps increasing!

So my question is: why does a lower delta cause this?

My code is based on this PDF document and they use 0.01 just fine.

Asked Nov 16 '15 by Voldemort


1 Answer

It's quite simple: you are getting integral windup.

See, "integral" is not limited in grow, BUT you limit the effect of the output to be in range [-5, 5]

There are many solutions; my quick-and-dirty fix is to limit "integral" between min and max, as in the sketch below.
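
A minimal sketch of that anti-windup clamp, assuming it is placed right after the integral update inside the loop (the variable names match the question's code):

            integral = integral + error * dt;

            // Anti-windup: clamp the integral term itself to the actuator limits,
            // so it cannot keep accumulating while the output is saturated.
            if (integral > max) integral = max;
            if (integral < min) integral = min;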

With that fix there is no overshoot bigger than a single digit with a loop time of 0.5, 0.1, or 0.01 (but also limiting the derivative).

Limiting the derivative can be handled with the same trick used to prevent "derivative kick": use the difference between the previous and the current measured value instead of the difference between errors. Just pay attention that you also need to invert the sign; see the sketch below.
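
A minimal sketch of that derivative-on-measurement variant, assuming a new field previousValue (a made-up name, for illustration) is kept next to previousError and updated every loop iteration:

            // Since error = targetValue - currentValue and the target is constant,
            // d(error)/dt = -d(currentValue)/dt, hence the inverted sign.
            double derivative = -(currentValue - previousValue) / dt;
            previousValue = currentValue;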

But if you need to simulate a PID with any value of dt as fast as possible, just comment out the sleep!

Answered Oct 20 '22 by Lesto