I'm starting to learn Python and encountered a few issues while trying to read values from a file.
My parameter file is somewhat like this:
var1 11111111
path_value "some/space containing path/file.txt"
var3 something
#some other values
var4 some/value1
var5 some/value2
var6 some/value3
This is my code:
file = open('this_file.txt')
for line in file:
    fields = line.strip().split()
    if(fields[0] in "var1"):
        ## this will give me 11111111
        var_1_value = fields[1]
    if(fields[0] in "path_value"):
        ## this will give me only till "some/space
        path_value_contains = fields[1]
How do I fetch the path properly? I'm not sure this is an efficient way of doing things. Can you please let me know any better ways to do this?
I'm not using any modules.
As I mentioned in the comments, I strongly recommend you use a stdlib module that's good at this, but if you're intent on rolling your own, here's the problem:
if fields[0] in "path_value"

That's not testing whether fields[0] is equal to "path_value"; it's testing whether fields[0] is a substring of "path_value", so any one of "p", "a", "t", "h", "_", etc. (or any longer substring, like "path") would match. Try instead:

if fields[0] == "path_value"
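A quick demonstration of the difference between the two checks:

```python
# The substring test matches far more than intended:
print("path" in "path_value")   # True -- "path" is a substring
print("t" in "path_value")      # True -- single characters match too
print("path" == "path_value")   # False -- equality is what you actually want
```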
That said, it seems you're having a totally different problem and that one is merely related to a copy/paste error. You can try doing this if you're on Python3:
for line in file:
    param_name, *param_values = line.split()
    param_value = ' '.join(param_values)
    # then proceed as usual
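Note that joining the pieces back together still leaves the literal quote characters in the value. The stdlib module hinted at above could be shlex, whose split() honors shell-style quoting, so a quoted path with spaces comes back as a single token with the quotes removed. A minimal sketch, assuming a parameter file in the question's format (the file name and sample contents here are just for illustration):

```python
import shlex

# Create a sample parameter file matching the question's format
# (the file name 'this_file.txt' is an assumption for this demo).
with open('this_file.txt', 'w') as f:
    f.write('var1 11111111\n')
    f.write('path_value "some/space containing path/file.txt"\n')
    f.write('#some other values\n')

params = {}
with open('this_file.txt') as f:
    for line in f:
        line = line.strip()
        if not line or line.startswith('#'):
            continue  # skip blank lines and comments
        # shlex.split keeps the quoted string together and strips the quotes
        name, *values = shlex.split(line)
        params[name] = ' '.join(values)

print(params['path_value'])  # some/space containing path/file.txt
```

This also gives you all the parameters in one dict lookup instead of a chain of if statements, which addresses the efficiency concern.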