Based on the answer to this question, I'd like to determine the number of columns in my file.
The file looks like this:
Header,,Header2,,Header3,,
1,2,3,4,5,6
11,12,13,14,15,16
When I now try to use the stats command:
stats 'data.dat'
max_col = STATS_columns
Gnuplot gives the error that there's bad data on line 1 of file data.dat, which is obviously the header.
If I remove the header, everything works fine, but I'm planning to use columnheader for automated labelling of the curves (as discussed e.g. here), so removing the header is not a solution.
If it matters: I'm working on a Windows-machine.
As hinted in the comments, the solution is simply modifying the command like this:
stats 'data.dat' skip 1
max_col = STATS_columns
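With the column count known, the header row can still be used for automated labelling. A minimal sketch (assuming a comma-separated data.dat as shown in the question, with x-values in column 1):

```gnuplot
set datafile separator ","
stats 'data.dat' skip 1 nooutput      # skip the header row
max_col = STATS_columns
# label each curve with its column header
plot for [i=2:max_col] 'data.dat' using 1:i with lines title columnheader(i)
```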
Be aware that STATS_columns will not necessarily give you the maximum number of columns in a file or datablock; it only reports the number of columns in the first data row. If subsequent rows have more columns, stats will not take this into account.
If you need to know the maximum number of columns of data that is not a regular table, the following code (tested in gnuplot 5.2.5) counts the separators in every line of an "irregular" datablock and reports the maximum. A possible drawback: commented lines containing separators are counted as well. A file has to be loaded into a datablock first. I am happy to learn if there is a better way.
### maximum number of columns in datablock
reset session
$Data <<EOD
11 12 33
21 22 23 24 25
31 32 33 34 35 36 37 38 39
41 42 43 44 45
51 52
EOD
# Method with stats
stats $Data nooutput
print "STATS_columns: ", STATS_columns
# Method with counting separators
# CountChar(s,c): number of occurrences of character c in string s
CountChar(s,c) = int(sum[Count_i=1:strlen(s)] (s[Count_i:Count_i] eq c))
ColMax = 0
Separator = " "    # the datablock above is space-separated
do for [i=1:|$Data|] {
    ColCount = CountChar($Data[i],Separator)+1
    ColMax = (ColCount > ColMax ? ColCount : ColMax)
}
print "Maximum Columns: ", ColMax
### end of code
Result:
STATS_columns: 3
Maximum Columns: 9
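For whitespace-separated datablocks, gnuplot's built-in words() function (which returns the number of whitespace-separated tokens in a string) offers a shorter alternative to counting separator characters. A sketch, assuming the same $Data as above:

```gnuplot
# alternative: count whitespace-separated tokens per line with words()
ColMax = 0
do for [i=1:|$Data|] {
    ColMax = (words($Data[i]) > ColMax ? words($Data[i]) : ColMax)
}
print "Maximum Columns: ", ColMax
```

Note that words() splits on any run of whitespace, so it also handles multiple consecutive spaces or tabs, but it does not apply to comma-separated data.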