This post has been a long time coming. I've written all sorts of scripts for the console, and some of them may well come in handy as system calls from PHP. This is especially relevant for large and very large text files.
1. Replacing characters in a file
2. Removing Windows-style line endings
3. Fast line counting
4. Cut columns from a CSV-like file
5. Sort file by columns
6. Parsing ini files with awk
I am using awk 4.6. For those who don't know, awk is a specialized C-like language (quite reminiscent of PHP, by the way) for processing text data. It works very, very fast.
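A typical awk program is just a set of pattern { action } pairs applied line by line; here is a tiny illustration (the file name data.txt and the threshold are arbitrary, not from any of the tips below):
awk '$3 > 100 { big++ } END { print big+0, "lines have a third column over 100" }' data.txt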
1) Replace all 'e' with 'o' in the file foo:
sed -i.bak 'y/e/o/' foo
-i.bak creates a backup copy of the file named foo.bak.
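Note that y/// transliterates individual characters; to replace whole substrings, use the usual s///g form instead (the words here are arbitrary):
sed -i.bak 's/cat/dog/g' foo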
2) If you copied a file that uses Windows line endings (\r\n), simply strip them:
col -bx < dosfile > newfile
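If col is not at hand, stripping the carriage returns with tr works just as well:
tr -d '\r' < dosfile > newfile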
3) Very quickly count the number of lines in a file (hundreds of times faster than doing it in PHP):
wc -l file
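If you need only the number itself, without the file name appended (handy when capturing the output from another program), read the file from stdin:
wc -l < file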
4) Cut multiple columns from a file:
cat somefile | awk '{print $1, $2, $4, $6;}'
Here the input is a CSV-like file whose delimiter is a tab or space only. If you need a comma or something else, it is easy to specify a different separator:
awk 'BEGIN {FS = ","} {print $1, $2, $4, $6;}'
The output is columns 1, 2, 4, and 6.
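If the output should keep the comma as the separator, set OFS as well; and for a simple selection like this, cut does the same job (the delimiter and column numbers are just examples):
awk 'BEGIN {FS = ","; OFS = ","} {print $1, $2, $4, $6}' somefile
cut -d',' -f1,2,4,6 somefile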
5) If the task is to sort the file foo by columns, where the column separator is a tab (passed to -t as a literal tab, e.g. $'\t' in bash):
sort -t$'\t' -b -k 1n,3n foo
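For instance, to sort a comma-separated file numerically by its second column in descending order (the file name is arbitrary):
sort -t',' -k2,2nr data.csv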
6) I won't write this one out exactly, since I don't remember it offhand, but if necessary I can show how to parse ini files using awk.
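A minimal sketch of the idea, assuming a simple ini file with [section] headers, key=value lines and ;/# comments (the file name config.ini is arbitrary):
awk '
  /^[;#]/ { next }                                   # skip comment lines
  /^\[/   { gsub(/\[|\]/, ""); section = $0; next }  # remember the current [section]
  /=/     { split($0, kv, "=")
            gsub(/^[ \t]+|[ \t]+$/, "", kv[1])
            gsub(/^[ \t]+|[ \t]+$/, "", kv[2])
            print section "." kv[1] " = " kv[2] }    # print as section.key = value
' config.ini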