I need to process a large number of files in a directory. The files can be partitioned into several groups based upon the file names; that is, the file names can be pattern matched to determine which 'group' they belong to. For instance, the names are like this:
etc ...
Each 'group' has a different processing methodology (i.e. a different command is called for processing).
I want to write a bash script to:
I am running on Ubuntu 10.0.4. I am new to bash and would appreciate a skeleton code snippet to help me get started writing this script.
The syntax for looping through each file individually is: create a variable (f for file, for example), then define the set of values you want that variable to cycle through. In this case, cycle through all files in the current directory using the * wildcard, which matches everything.
To loop through a directory and print the name of each file, run: for FILE in *; do echo "$FILE"; done
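If you prefer it spread over multiple lines, the same loop looks like this (the *.csv narrowing mentioned in the comment is just an example):

#!/bin/bash
# Visit every entry in the current directory and print its name.
# Narrow the pattern (e.g. *.csv) if you only want certain files.
for f in *; do
    echo "$f"
done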
To see a list of all subdirectories and files within your current working directory, use the ls command.
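As a quick sanity check, you can also pass the same glob you plan to use in the loop to ls to preview what it matches; the file names shown in the comment are made-up examples:

# Preview which files a pattern matches before using it in a loop.
ls *.csv
# e.g. 20110104_customers_bulk_import.csv  20110104_allstats.csv  (hypothetical names)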
The easiest way is probably just to iterate each group separately. This side-steps the parsing issue entirely.
DIRECTORY=.

for i in $DIRECTORY/YYYYMMDD_*_bulk_import.csv; do
    echo "$i"    # Process $i
done

for i in $DIRECTORY/YYYYMMDD_*_genstats_import.csv; do
    echo "$i"    # Process $i
done

for i in $DIRECTORY/YYYYMMDD_*allstats.csv; do
    echo "$i"    # Process $i
done
Set DIRECTORY to whatever directory you want to search. The default, ., will search the current working directory.
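If it is more convenient, you can also take the directory as a command-line argument and fall back to the current directory when none is given; this $1/default idiom is my own addition, not part of the answer above:

#!/bin/bash
# Use the first script argument as the search directory, defaulting to "." if omitted.
DIRECTORY=${1:-.}

for i in "$DIRECTORY"/YYYYMMDD_*_bulk_import.csv; do
    # Process $i here (YYYYMMDD_ is the same placeholder used above)
    echo "bulk import file: $i"
done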
Here is a basic iteration over files, with a case statement to determine the file type.
#!/bin/bash

for f in *; do
    case $f in
        [0-9]*_bulk_import.csv)
            echo "$f case 1"
            ;;
        [0-9]*_genstats_import.csv)
            echo "$f case 2"
            ;;
        [0-9]*allstats.csv)
            echo "$f case 3"
            ;;
    esac
done
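To hook up the actual processing, replace each echo with the command for that group; the process_*.sh script names below are hypothetical placeholders for whatever commands you actually call:

#!/bin/bash

for f in *.csv; do
    [ -e "$f" ] || continue  # skip the literal pattern when no .csv files exist
    case $f in
        [0-9]*_bulk_import.csv)
            ./process_bulk.sh "$f"       # group 1: bulk import files
            ;;
        [0-9]*_genstats_import.csv)
            ./process_genstats.sh "$f"   # group 2: genstats import files
            ;;
        [0-9]*allstats.csv)
            ./process_allstats.sh "$f"   # group 3: allstats files
            ;;
        *)
            # anything else is ignored
            ;;
    esac
done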