Bug #16336

Need to address concurrency limits in various bulk actions

Added by Partha Aji over 1 year ago. Updated 6 months ago.

Assigned To: -
Target version: Team Brad - Backlog
Difficulty: easy
Pull request:
Bugzilla link:
Story points: -
Velocity based estimate: -
Release: Katello Backlog
Release relationship: Auto


There seems to be a possibility that bulk actions can get overloaded when there are concurrent tasks with too many hosts.
This is likely to affect pretty much all of the bulk actions that are run concurrently. Look at the following:

The Dynflow runtime keeps the data of currently-running tasks in memory while waiting for events; this could account for the memory consumption if the tasks get stuck.

One way this might be solved would require Dynflow being able to drop some of the data it keeps in memory and load it from the database again on demand.

Another thing that might help would be for the Katello bulk actions to set some upper limit on concurrent tasks (Dynflow has support for this), so we would not get into the level of concurrency where the memory issues appear. Another question is whether the timeout limit for the stuck actions could be lowered to finish the task sooner (so it does not take up that much memory).

Likely to be fixed with something like https://github.com/Dynflow/dynflow/blob/master/examples/sub_plan_concurrency_control.rb#L49
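For illustration, a minimal sketch of what that could look like, modeled on the linked example: the class names (PerHostAction, BulkHostAction) and the limit value are made up here, and it assumes Dynflow's WithSubPlans mixin with the limit_concurrency_level helper used in that example.

```ruby
require 'dynflow'

# Hypothetical per-host action; one sub-plan of this is triggered per host.
class PerHostAction < Dynflow::Action
  def plan(host_id)
    plan_self(:host_id => host_id)
  end

  def run
    # the actual per-host work would go here
  end
end

# Hypothetical bulk action that caps how many sub-plans run at once,
# modeled on Dynflow's examples/sub_plan_concurrency_control.rb.
class BulkHostAction < Dynflow::Action
  include Dynflow::Action::WithSubPlans

  MAX_CONCURRENT_SUB_TASKS = 10 # assumed limit; would need tuning per deployment

  def plan(host_ids)
    # Ask Dynflow to keep at most N sub-plans running at the same time,
    # so a bulk action over thousands of hosts does not hold them all in memory.
    limit_concurrency_level(MAX_CONCURRENT_SUB_TASKS)
    plan_self(:host_ids => host_ids)
  end

  def create_sub_plans
    # One sub-plan per host; Dynflow throttles how many execute concurrently.
    input[:host_ids].map { |id| trigger(PerHostAction, id) }
  end
end
```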


#1 Updated by Justin Sherrill over 1 year ago

  • Category set to Orchestration
  • Release set to Katello Backlog
  • Difficulty set to easy

#2 Updated by John Mitsch 6 months ago

  • Target version set to Team Brad - Backlog
