Is there a method to help make this Python logic run faster

I've been working on a solution to grab data from a PLC sensor with Python. I worked out the syntax using cpppo, and it works just fine as far as getting the data from the tag in a loop, in a presumed serialized manner.

To test this new Python cpppo solution I've connected the machine that runs the Python logic to a PLC via a VPN tunnel and I have it poll a specific tag/sensor. This tag is also polled and recorded with a different non-Python solution that is connected to the local machine network via Ethernet.


Question

Does anyone know a way I could re-write this simple code below so that it polls 3 or even 4 times per second? Is there anything else that could be contributing to this?

  • By "this", I mean that the "other" (non-Python) method records poll responses from this same tag up to 3 times per second, while the Python cpppo solution seems to record at most 2 per second. As a result it occasionally misses a weight when 3 weight values arrive in one second. It's not always 3 values per second—sometimes there are only 2—but 3 in a second does happen.



The Data

The sensor data is returned enclosed in square brackets and represents weight in grams with decimal precision. Below is a small sample of the raw data.

[610.5999755859375]
[607.5]
[623.5999755859375]
[599.7999877929688]
[602.5999755859375]
[610.0]

Python Code

Note: The Python logic writes the polled value from the sensor to a CSV file and generates the time stamp from the system date and time via datetime.now(). Before writing, I convert the value to a string with str() and strip the square brackets via str(x).replace('[','').replace(']','').

from cpppo.server.enip.get_attribute import proxy_simple
from datetime import datetime
import time, csv

CsvFile = "C:\\folder\\Test\\Test.csv"
host = "<IPAddress>"

while True:
    x, = proxy_simple(host).read("<TagName>")

    with open(CsvFile,"a",newline='') as file:
        csv_file = csv.writer(file) 
        for val in x:
            y = str(x).replace('[','').replace(']','')
            csv_file.writerow([datetime.now(), y])
#time.sleep(0.05)
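As an aside on the bracket stripping described in the note above: since cpppo hands back the value as a one-element list, indexing it directly avoids the string round-trip entirely. A minimal sketch, using sample values from the data above:

```python
def to_grams(reading):
    """Convert a cpppo-style one-element [value] list to a float.

    Avoids the str(x).replace('[','').replace(']','') round-trip.
    """
    return float(reading[0])

print(to_grams([610.5999755859375]))  # prints 610.5999755859375
print(to_grams([607.5]))              # prints 607.5
```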

The Problem and Testing Results

When I compare the Python-captured CSV records with the records from the other, non-Python capture method, the Python-generated CSV is frequently missing records.

Notable Details (just in case)

  • There is a time stamp difference of a second or less between the two systems, since each generates its time stamp at capture time.

  • This specific sensor can spit out three values in one second but NOT always; sometimes one a second, or two a second, or none in a second.

  • The other method uses Java, I think, but that code is not accessible for me to compare logic.

  • I'm using Python version 3.6.5 (v3.6.5:f59c0932b4, Mar 28 2018, 16:07:46) [MSC v.1900 32 bit (Intel)] and from Windows 10.
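One way to check where the two-polls-per-second ceiling comes from is to time the loop itself. A minimal, hypothetical sketch—`read_once` here is a stand-in for the real `proxy_simple(host).read(...)` call, not cpppo itself:

```python
import time

def read_once():
    """Hypothetical stand-in for proxy_simple(host).read("<TagName>")."""
    return [610.0]

def measure_rate(n=100):
    """Time n poll iterations and return the achieved polls per second."""
    start = time.perf_counter()
    for _ in range(n):
        read_once()
    elapsed = time.perf_counter() - start
    return n / elapsed
```

Swapping the real read into `read_once` would show whether the network round-trip (including the per-iteration reconnect) is what caps the rate.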

Results

Python Method CSV (missing the correlated 606.6 value)

2018-04-12 13:56:42.249408,610.5999755859375
2018-04-12 13:56:42.909309,607.5
2018-04-12 13:56:43.559472,623.5999755859375
2018-04-12 13:56:44.259771,599.7999877929688
2018-04-12 13:56:44.910270,602.5999755859375
2018-04-12 13:56:45.541044,610.0

Other Method CSV Results (contains the 606.6 value)

12/04/2018 13:56:41,610.6
12/04/2018 13:56:42,607.5
12/04/2018 13:56:42,623.6
12/04/2018 13:56:43,606.6
12/04/2018 13:56:43,599.8
12/04/2018 13:56:44,602.6
12/04/2018 13:56:44,610

Problem Note: Python missed capturing the 12/04/2018 13:56:43,606.6 record, whereas the other system recorded it. I suspect this is due to some minuscule delay in this logic, since I only ever see missing values when comparing against the other, non-Python captured file.

Asked Feb 13 '26 by Bitcoin Murderous Maniac

1 Answer

The key part of your code is:

while True:
    x, = proxy_simple(host).read("<TagName>")

    with open(CsvFile,"a",newline='') as file:
        for val in x:
            # ...

In pseudo code:

forever:
    create proxy
    open output file
    process values from proxy

You mention that the sensor might produce about 3 values per second. If you look at the implementation of read(), what it does is set up a new reader object and yield all values from it.

You probably believe your code is running like this:

  1. create proxy
  2. open output file
  3. process value
  4. goto 3

But in fact, it is probably running like this:

  1. create proxy
  2. open output file
  3. process value
  4. goto 1

Each time you call read(), it yields the values it knows about at the time. It does not wait for any new values to arrive.
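To make the difference concrete, here is a toy model (not cpppo itself) that counts how many times the "connection" gets set up under each pattern:

```python
class ToyProxy:
    """Toy stand-in for proxy_simple: construction simulates connection setup."""
    connections = 0

    def __init__(self):
        ToyProxy.connections += 1

    def read(self, tag):
        # Yields whatever value is available right now, like read() does.
        yield [610.0]

# Original pattern: a new proxy per loop iteration -> one setup per poll.
ToyProxy.connections = 0
for _ in range(5):
    x, = ToyProxy().read("<TagName>")
assert ToyProxy.connections == 5

# Proposed pattern: one reused proxy -> one setup total.
ToyProxy.connections = 0
source = ToyProxy()
for _ in range(5):
    x, = source.read("<TagName>")
assert ToyProxy.connections == 1
```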

Try this refactoring:

source = proxy_simple(host)
with open(CsvFile,"a",newline='') as file:
    while True:
        for message in source.read("<TagName>"):
            for val in message:
                # ...

The other error in your original code is this:

x, = proxy_simple(host).read("<TagName>")

If read() returns multiple values, you only use the first one. That's why there are two for loops in my proposed code above.

Then you will only open the connection and the output file once per program run, not once per message.

Answered Feb 15 '26 by John Zwinck


