Reduce CI fuzz iterations as we're now timing out #3691
Conversation
I've assigned @joostjager as a reviewer!
ACK, seems we hit yet another failing case in the `router` target now: #3692
```diff
 else
-	HFUZZ_RUN_ARGS="$HFUZZ_RUN_ARGS -N1000000"
+	HFUZZ_RUN_ARGS="$HFUZZ_RUN_ARGS -N500000"
```
Any specific reasoning behind the 10x, 10x and 2x reductions?
No. I probably could have benchmarked, but in general there's not a lot of value in `full_stack_target` in CI because it's just too dense to make any real progress, and similar for `chanmon_consistency_target`.
I checked out a fuzzer run on the attr failures PR and grepped the logs. Interestingly it seems like indeed: [per-target timing log omitted]
Maybe I am not interpreting it correctly, though. Also, from this log you'd say that all the fast tests (<30 sec) don't need their iteration count changed.
Force-pushed 0244aa0 to 9f68f1b
Ah, thanks for doing that. Yea, I was going on the results when I fuzz with real corpuses, where…
Huh, interesting. I went ahead and slowed this one down, though.
Yep!
Force-pushed c87c3f3 to ad4fb1e
Codecov Report: All modified and coverable lines are covered by tests ✅

```
@@            Coverage Diff             @@
##             main    #3691      +/-   ##
==========================================
- Coverage   89.18%   89.16%   -0.02%
==========================================
  Files         155      155
  Lines      120796   120796
  Branches   120796   120796
==========================================
- Hits       107731   107710      -21
- Misses      10415    10432      +17
- Partials     2650     2654       +4
==========================================
```
Force-pushed 2773431 to b93b1d6
When we made `test_node_counter_consistency` run more aggressively, our `process_network_graph` fuzzer got materially slower, resulting in consistent fuzz CI job timeouts. Thus, here, we tweak the iteration count on all our fuzz jobs to get them running in more consistent times. We also further reduce `full_stack_target` iterations in anticipation of a later commit which will start using our hard-coded fuzz seeds, creating substantially more coverage and slowing down fuzzing iterations.
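As a rough illustration of the hard-coded-seeds idea this commit anticipates, a fuzz target can replay checked-in seed inputs through the same entry point the fuzzer drives. This is a minimal sketch only; the `do_test` name and `test_cases/` path are assumptions for illustration, not the actual rust-lightning harness:

```rust
use std::fs;

// Stand-in for the target's real entry point, which would parse `data`
// and exercise the code under test (message handling, routing, etc.).
fn do_test(data: &[u8]) {
    let _ = data;
}

// Replaying checked-in seeds gives deterministic coverage of previously
// interesting inputs, at the cost of longer per-iteration runtimes,
// which is why the commit above pre-emptively lowers iteration counts.
#[test]
fn replay_seed_corpus() {
    for entry in fs::read_dir("test_cases/full_stack").unwrap() {
        let data = fs::read(entry.unwrap().path()).unwrap();
        do_test(&data);
    }
}
```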
In 3145168 we disabled `test_node_counter_consistency` in debug builds since it can make things very slow, including the `lightning-rapid-gossip-sync` tests. We should, however, have kept it when fuzzing, since that gives us testing of potential coverage gaps in normal tests.
This should materially improve our fuzzing coverage in CI.
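A minimal sketch of the gating pattern this commit message describes, assuming the check is keyed off Rust's `test`/`fuzzing` cfg flags; the types and field names are illustrative, not rust-lightning's actual graph code:

```rust
use std::collections::HashMap;

// Illustrative graph: node counters mapped to the short channel IDs
// they claim to participate in, plus the set of known channels.
struct NetworkGraph {
    channels: HashMap<u64, ()>,
    node_channels: HashMap<u64, Vec<u64>>,
}

impl NetworkGraph {
    // An O(nodes x channels-per-node) consistency walk. Compiling it in
    // only for tests and fuzzing keeps ordinary debug builds (e.g. the
    // lightning-rapid-gossip-sync tests) fast, while fuzzers still get
    // to catch coverage gaps that normal tests miss.
    fn test_node_counter_consistency(&self) {
        #[cfg(any(test, fuzzing))]
        for (node, channels) in self.node_channels.iter() {
            for scid in channels {
                assert!(
                    self.channels.contains_key(scid),
                    "node counter {node} references missing channel {scid}"
                );
            }
        }
    }
}

fn main() {
    let mut graph = NetworkGraph { channels: HashMap::new(), node_channels: HashMap::new() };
    graph.channels.insert(42, ());
    graph.node_channels.insert(0, vec![42]);
    graph.test_node_counter_consistency(); // no-op unless the test/fuzzing cfg is set
}
```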
Force-pushed b93b1d6 to 4b4bad8
LGTM
Fuzz failure is #3708
Great that fuzzing again found an issue. This is awesome.
Before merging this, I think we still want to see a successful fuzz run well within the timeout?
Yea, happy to wait until we can at least do a fuzzing run and check that all the timings make sense.
I think GitHub has slowed down the runners, so now our fuzz tests are timing out. Here we just reduce the iteration count a bit.