I have a text file like this, with several thousand lines:
    5.6 4.5 6.8 "6.5"
    5.4 8.3 1.2 "9.3"
The last term on each line is a number enclosed in double quotes.
What I want to do, using Python (if possible), is to assign the four columns to floating-point variables. The main problem is the last term: I found no way of removing the double quotes from the number. Is this possible on Linux?
This is what I tried:
    #!/usr/bin/python
    import os, sys, re, string, array

    name = sys.argv[1]
    infile = open(name, "r")
    cont = 0
    while 1:
        line = infile.readline()
        if not line:
            break
        l = re.split("\s+", string.strip(line)).replace('\"', '')
        cont = cont + 1
        a = l[0]
        b = l[1]
        c = l[2]
        d = l[3]
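For what it's worth, the line that builds l is where this attempt breaks: re.split() returns a list, and lists have no .replace() method, so that line raises an AttributeError. The replacement has to happen on the string, before splitting. One way to repair just that line:

    # Remove the quotes from the raw line first, then split on whitespace.
    l = re.split(r"\s+", line.strip().replace('"', ''))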
Option 1: Remove any double quotes in a string with 'my string'.replace('"', ''). This substitutes every instance of a double quote anywhere in the string with an empty string.
Option 2: Remove the first and last characters of a string by slicing: 'my string'[1:-1].
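A short sketch of both options, using a quoted field like the ones in the question as an example value:

    s = '"6.5"'

    # Option 1: replace() removes every double quote anywhere in the string.
    print(s.replace('"', ''))   # 6.5

    # Option 2: slicing drops only the first and last characters.
    print(s[1:-1])              # 6.5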
By using the escape character \" we can use double quotes to enclose a string that includes text quoted between double quotes. Similarly, the escape character \' lets us add an apostrophe to a string that is enclosed in single quotes:

    print('Sammy\'s balloon is red.')
If you need to use a double quote inside a string, you can use the backslash character to escape it, so it is not read as the end of the string.
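For example (the sentence here is made up for illustration), the backslashes keep the inner quotes from terminating the string:

    print("She said \"hello\" and walked away.")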
If you want to remove the enclosing quotes from a string before printing it, you can call the str.strip() method and pass the single and double quote characters to be stripped from the beginning and end of the string. For example, the expression '"hello world"'.strip('\'"') evaluates to 'hello world'.
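Applied to a quoted field from the question's data:

    s = '"9.3"'

    # strip() removes leading/trailing characters from the given set;
    # quotes in the middle of the string are left alone.
    print(s.strip('\'"'))        # 9.3
    print(float(s.strip('"')))   # 9.3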
    for line in open(name, "r"):
        line = line.replace('"', '').strip()
        a, b, c, d = map(float, line.split())
This is bare-bones, and will raise an exception if, for example, a line doesn't contain exactly four values.
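If the file may contain blank or malformed lines, here is a slightly more defensive sketch; it assumes name holds the filename from sys.argv[1] as in the question, and the skip-and-warn policy is just one possible choice:

    import sys

    with open(name, "r") as infile:
        for lineno, line in enumerate(infile, 1):
            fields = line.replace('"', '').split()
            if len(fields) != 4:
                # Skip lines that don't have exactly four values.
                sys.stderr.write("skipping line %d: expected 4 values, got %d\n"
                                 % (lineno, len(fields)))
                continue
            try:
                a, b, c, d = map(float, fields)
            except ValueError:
                sys.stderr.write("skipping line %d: non-numeric value\n" % lineno)
                continue
            # ... use a, b, c, d here ...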