Merge pull request #144 from daverodgman/compression_perf
speed up analyze-outcomes stage
daverodgman authored Jan 4, 2024
2 parents 135c5f5 + ba120af commit 865d01b
Showing 1 changed file with 7 additions and 4 deletions.
vars/analysis.groovy: 11 changes (7 additions & 4 deletions)
@@ -237,16 +237,19 @@ def process_outcomes() {
 deleteDir()
 }

-// The complete outcome file is ~14MB compressed as I write.
+// The complete outcome file is 2.1GB uncompressed / 56MB compressed as I write.
 // Often we just want the failures, so make an artifact with just those.
 // Only produce a failure file if there was a failing job (otherwise
 // we'd just waste time creating an empty file).
+//
+// Note that grep ';FAIL;' could pick up false positives, if another field such
+// as test description or test suite was "FAIL".
 if (gen_jobs.failed_builds) {
 sh '''\
-awk -F';' '$5 == "FAIL"' outcomes.csv >"failures.csv"
+LC_ALL=C grep ';FAIL;' outcomes.csv >"failures.csv"
 # Compress the failure list if it is large (for some value of large)
 if [ "$(wc -c <failures.csv)" -gt 99999 ]; then
-    xz failures.csv
+    xz -0 -T0 failures.csv
 fi
 '''
 }
@@ -258,7 +261,7 @@ fi
 }
 }
 } finally {
-sh 'xz outcomes.csv'
+sh 'xz -0 -T0 outcomes.csv'
 archiveArtifacts(artifacts: 'outcomes.csv.xz, failures.csv*',
 fingerprint: true,
 allowEmptyArchive: true)
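For illustration only (not part of the commit): a sketch of how the old and new failure filters behave. It assumes the layout implied by the awk expression, namely a semicolon-separated outcomes.csv whose fifth field is the PASS/FAIL result; the sample lines below are hypothetical.

    # Hypothetical sample lines (field 5 is the result):
    #   platform1;config1;test_suite_aes;AES encrypt;FAIL;cause
    #   platform1;config1;FAIL;some test description;PASS;

    # Old filter: splits every line into fields and keeps a line only when
    # field 5 is exactly "FAIL" -- it matches just the first sample line.
    awk -F';' '$5 == "FAIL"' outcomes.csv >"failures.csv"

    # New filter: a plain substring search for ";FAIL;", which would also keep
    # the second sample line (the false positive the new comment warns about).
    # LC_ALL=C forces the C locale, so grep matches bytes instead of doing
    # locale-aware multibyte processing, which is much faster on a multi-GB file.
    LC_ALL=C grep ';FAIL;' outcomes.csv >"failures.csv"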
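Similarly, a generic sketch (not taken from the commit) of the trade-off made by the new compression flags:

    # Plain xz uses preset -6 and a single thread: best ratio, but slowest.
    xz outcomes.csv

    # -0 selects the fastest, lightest preset; -T0 tells xz to use as many
    # worker threads as there are CPU cores, splitting the input into blocks
    # that are compressed in parallel (at a small cost in compression ratio).
    xz -0 -T0 outcomes.csv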
