Today I was struggling to parse huge log files with PHP. The problem was that the log file was huge and new entries were constantly appended at the end, so PHP took forever to parse the whole file just to extract the last few records.
The solution? Use AWK, the shell programmer's Swiss Army knife :)
I knew every line had a unique timestamp (well, almost), and I could store the timestamp of the last line parsed. So I wrote a small AWK command:
exec("awk '/Jun 12 00:13:01/ { line_fnd = NR } END { print NR, line_fnd }' OFS='=' /var/log/huge.log", $arg1);
This prints the total number of lines and the line number where the timestamp was found, separated by an '=' sign.
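To see what the one-liner produces, here is the same AWK command run against a small sample log (the file contents below are made up for illustration):

```shell
#!/bin/sh
# Create a tiny sample log; real logs would be /var/log/huge.log as in the post.
cat > /tmp/huge.log <<'EOF'
Jun 12 00:12:58 host app: record one
Jun 12 00:13:01 host app: record two
Jun 12 00:13:05 host app: record three
Jun 12 00:13:09 host app: record four
EOF

# Remember the line number of the last line matching the stored timestamp,
# then in the END block print total lines and the matched line number.
# The OFS='=' assignment before the filename sets the output separator.
awk '/Jun 12 00:13:01/ { line_fnd = NR } END { print NR, line_fnd }' OFS='=' /tmp/huge.log
```

On this sample the command prints `4=2`: four lines in total, timestamp found on line 2.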
Then it's quite easy to parse the output of this command and work out how many new lines to grab. You can tail that many lines, redirect the output to a temporary file, parse that file line by line, do whatever you want with each record, and finally store the timestamp of the last line you parsed.
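The whole technique can be sketched in shell (the post drives it from PHP; the file path, sample log contents, and stored timestamp here are assumptions for illustration):

```shell
#!/bin/sh
# Sample log: the first two lines were already parsed on a previous run.
cat > /tmp/huge.log <<'EOF'
Jun 12 00:12:58 host app: old record
Jun 12 00:13:01 host app: last parsed record
Jun 12 00:13:05 host app: new record one
Jun 12 00:13:09 host app: new record two
EOF

# Timestamp saved at the end of the previous run.
last_ts='Jun 12 00:13:01'

# Step 1: find total lines and the line number of the stored timestamp,
# '='-separated (e.g. "4=2").
counts=$(awk -v ts="$last_ts" '$0 ~ ts { line_fnd = NR } END { print NR, line_fnd }' OFS='=' /tmp/huge.log)
total=${counts%=*}
found=${counts#*=}

# Step 2: lines appended since the last parsed one.
new_lines=$((total - found))

# Step 3: tail that many lines into a temporary file for line-by-line parsing.
tail -n "$new_lines" /tmp/huge.log > /tmp/huge.new
```

After this, `/tmp/huge.new` holds only the records added since the last run, so the per-line parsing never touches the bulk of the log.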
This shaved a huge chunk off my script's execution time :)
Anyone facing this problem might find this technique useful, and if there is a better solution, please do let me know.
2 comments:
could also be done with "logtail"
(might be faster)