config option to disable spawning / evacuate #75
I would propose a pool config: …

Or if you want something more fine-grained, e.g. …
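The code snippets for both variants were lost in this scrape. As a rough illustration only, a per-pool boolean along these lines could express the coarse and the fine-grained option (all option names here are hypothetical, not resalloc's actual schema):

```yaml
# pools.yaml -- hypothetical options, not the real resalloc schema
aws_x86_64:
  max: 10
  # coarse-grained: one boolean that stops spawning in this pool
  paused: true

aws_aarch64:
  max: 5
  # fine-grained: pause spawning only, while termination keeps working
  pause:
    spawning: true
    termination: false
```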
I like your idea, sounds good, but I have an alternative proposal; you can pick which one you want to be implemented. Since PR #79, we now have an easy way of terminating all resources via `resalloc-maint resource-delete --all`. Maybe we could add `resalloc-maint pause` and `resalloc-maint resume` commands. Then the maintenance workflow would be:

```
resalloc-maint pause
resalloc-maint resource-delete --all
# Whatever maintenance tasks needed to be done
resalloc-maint resume
```

I like this solution better because I would expect a command for doing this instead of a config option (even though I originally reported the RFE as a config option), but that is just my subjective feeling. I think it may also have some advantages, such as one could safely run Ansible playbooks and replace the configs, and the spawning won't be accidentally resumed. Lastly, the naming is just a suggestion. What do you think?
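The thread does not show how `pause`/`resume` would be implemented. One minimal sketch (entirely an assumption, not resalloc's actual mechanism; the state-file path and function names are hypothetical) is a marker file that the spawning loop checks before allocating new resources:

```python
import os

def pause(statedir="/var/lib/resallocserver"):
    """Hypothetical: create a marker file; the spawner skips allocation while it exists."""
    open(os.path.join(statedir, "paused"), "w").close()

def resume(statedir="/var/lib/resallocserver"):
    """Hypothetical: remove the marker file so spawning continues on the next tick."""
    try:
        os.unlink(os.path.join(statedir, "paused"))
    except FileNotFoundError:
        pass  # resuming an already-running server is a no-op

def spawning_allowed(statedir="/var/lib/resallocserver"):
    """Hypothetical check the manager loop would run before starting new resources."""
    return not os.path.exists(os.path.join(statedir, "paused"))
```

A file-based flag survives server restarts, which matches the point above about the pause not being accidentally resumed by config redeployment.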
I am fine with the proposal! But technically I don't think we should create … Also, what you propose is not a per-pool setting, but all or nothing. I think …
Sounds good to me
True, I was only in situations when I needed to pause all pools, so I … and the …

Then it would be easy to add cmdline parameters for specifying pool …
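As a sketch of what such a per-pool command-line selector could parse (this `--pool` option is illustrative only; it is not part of resalloc-maint's real interface):

```python
import argparse

def build_parser():
    """Hypothetical argument parser for pause/resume with a pool selector."""
    parser = argparse.ArgumentParser(prog="resalloc-maint")
    sub = parser.add_subparsers(dest="command", required=True)
    for cmd in ("pause", "resume"):
        p = sub.add_parser(cmd)
        # repeatable option; omitting it means "all pools"
        p.add_argument("--pool", action="append",
                       help="limit to the given pool(s); default: all pools")
    return parser

args = build_parser().parse_args(["pause", "--pool", "aws_x86_64"])
```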
+1
If we want to (temporarily) disable spawning new resources, we currently need to edit `/etc/resallocserver/pools.yaml` and set `max: 0` for each pool section. This is too cumbersome as we have 10 pools in Copr.

I would like to propose some boolean option in `server.yaml`, or in some main section of `pools.yaml`, that would supersede the `max` value and temporarily disable spawning new resources.
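For context, the per-pool workaround described above looks roughly like this (pool names are illustrative, not Copr's actual pool list):

```yaml
# /etc/resallocserver/pools.yaml -- setting max: 0 by hand to stop spawning
aws_x86_64:
  max: 0   # was e.g. 10; edited manually
aws_aarch64:
  max: 0   # was e.g. 5; edited manually
# ...repeated for every one of the 10 pools
```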