1) Determine how many lines of data were in the 10 files.
2) Copy the originals as 01.txt to 10.txt, so we can use "?" wildcards.
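The copy step might look like this (the original file names are not shown in the write-up, so report1.txt through report10.txt below are stand-ins):

```shell
cd "$(mktemp -d)"
# Stand-in originals; the real report names are an assumption
for i in $(seq 1 10); do echo "sample data $i" > "report$i.txt"; done

# Zero-pad the copies (1 -> 01.txt ... 10 -> 10.txt) so the two-character
# "??.txt" wildcard matches all ten files and nothing else
for i in $(seq 1 10); do
  cp "report$i.txt" "$(printf '%02d' "$i").txt"
done
ls ??.txt
```

The zero-padding matters: without it, "?.txt" would match 1.txt through 9.txt but miss 10.txt.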
3) "cat" all 10 files, then count lines using wc:
--> cat ??.txt | wc -l
41025
That's 41,025 lines of information.
4) Add more filters to see how many possible messages there could be:
--> cat ??.txt | grep Maint | grep - | wc -l
5640
That’s 5,640 messages included in that batch of 10 reports.
5) Now we needed to find out which messages were common across all 10 reports, and work those.
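One way to find the common messages, assuming the message IDs follow a pattern like 21-18891 (the ID format here is an assumption, and the dummy data stands in for the real reports): list each file's unique IDs, then keep the IDs that show up in all 10 files.

```shell
cd "$(mktemp -d)"
# Dummy reports: every file shares 21-18891, plus one ID unique to that file
for i in $(seq 1 10); do
  printf '%s\n' 'Maint - 21-18891 common failure' "Maint - 31-000$i unique failure" \
    > "$(printf '%02d' "$i").txt"
done

# Unique IDs per file, then count how many files each ID appears in;
# an ID with a count of 10 is common to every report
for f in ??.txt; do
  grep -o '[0-9][0-9]-[0-9]*' "$f" | sort -u
done | sort | uniq -c | awk '$1 == 10 {print $2}'
```

The `sort -u` inside the loop is what makes the trailing count a file count rather than an occurrence count.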
6) The technical content for each reported error ranged from 14 to about 23 lines. Using "grep" with -A for "after" the pattern, we could create a report that gave most of the details.
Initially the engineer did the steps manually; a script has since been written to automate them.
What took 13 hours per test run in Excel now takes less than 5 seconds in a Linux shell.
From 13 hours... to less than 5 seconds.
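A minimal sketch of what the automated steps might look like, reusing the filters from steps 3 and 4 (the dummy data below stands in for the real reports, and the actual script may differ):

```shell
cd "$(mktemp -d)"
# Dummy reports standing in for 01.txt .. 10.txt
for i in $(seq 1 10); do
  printf '%s\n' 'Maint - 21-18891 failed' 'header line' > "$(printf '%02d' "$i").txt"
done

total=$(cat ??.txt | wc -l)                       # step 3: total lines
msgs=$(cat ??.txt | grep Maint | grep - | wc -l)  # step 4: candidate messages
echo "lines=$total messages=$msgs"
```

Everything runs in a single pass over the files, which is why the shell version finishes in seconds.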
Next, find one error message in these 10 reports. The information is obfuscated for security: use grep to find the pattern 21-18891 and display the 23 lines after it.
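That lookup, run here against dummy data since the real report content is obfuscated, would look like:

```shell
cd "$(mktemp -d)"
# Dummy report: one matching message followed by 30 detail lines
{ echo 'Maint - 21-18891 failure'; seq 1 30 | sed 's/^/detail line /'; } > 01.txt

# -A 23 prints each matching line plus the 23 lines after it
grep -A 23 '21-18891' ??.txt
```

With the real reports, the same command collects the full technical block for that error from every file it appears in.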
The goal, of course, was to eliminate all errors, but they focused on the common ones first.