Investigate impact of Mocha spam #4230
Comments
We did spam Arabica extensively and had some issues, but they were limited to the cluster being configured incorrectly, where each node only had access to a small amount of RAM. We had >7 MB blocks averaging 5 s, which is to be expected on a local 4-node network.
Adding some logs from our RPC here (which no longer seems to be working for people trying to use it for blobs). Prior to restarting the node, lots of these:
After I restarted, the same volume of messages, but they're now these:
Additional info: in app.toml we have set
HW specs: Ryzen 7700X (8 cores) and 64 GB RAM
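The specific app.toml values weren't captured in the comment above, so purely as an illustration: these are the gRPC message-size knobs in a Cosmos SDK `app.toml` that operators commonly tune once blocks and blob queries approach 8 MiB. The values shown are the SDK defaults, not the operator's actual settings.

```toml
# Hypothetical illustration — not the operator's real config.
[grpc]
enable = true
address = "localhost:9090"
# Message-size limits in bytes; the receive default can be too small
# for multi-MiB blob queries against large blocks.
max-recv-msg-size = "10485760"
max-send-msg-size = "2147483647"
```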
I found 3 reports:
According to https://celestia-tools.brightlystake.com/, the Mocha endpoints are healthy, with the exception of http://celestia-t-rpc.noders.services.
@mindstyle85 do you have system metrics (CPU, RAM, Network I/O) from the POPs consensus node?
Thanks, super helpful!
Everything else seems fine. My hypothesis is that your server was maxing out its 1 Gb/s network link due to txsim and wasn't able to service any other inbound gRPC requests.
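As a back-of-envelope sketch of that hypothesis, using the block size and interval mentioned in this thread (≈8 MiB blocks every ~5 s) — the peer counts below are my own assumptions for illustration, not measurements from the node:

```python
# Back-of-envelope: sustained bandwidth implied by gossiping one block
# every BLOCK_INTERVAL_S seconds to multiple peers.
BLOCK_MIB = 8          # block size from the thread (~8 MiB after the bump)
BLOCK_INTERVAL_S = 5   # observed average block time
LINK_GBPS = 1          # hypothesized 1 Gb/s link

block_bits = BLOCK_MIB * 1024 * 1024 * 8
base_mbps = block_bits / BLOCK_INTERVAL_S / 1e6  # one copy of each block
print(f"one stream: {base_mbps:.1f} Mb/s")       # ~13.4 Mb/s

# Uploading the block (or its parts) to each peer multiplies this;
# the peer counts here are illustrative, not measured.
for peers in (8, 40, 75):
    total_mbps = peers * base_mbps
    share = total_mbps / (LINK_GBPS * 1000)
    print(f"{peers:>3} peers: {total_mbps:7.1f} Mb/s "
          f"({share:.0%} of a {LINK_GBPS} Gb/s link)")
```

At roughly 75 concurrent upload streams this saturates a 1 Gb/s link, which is consistent with the node failing to service other inbound gRPC requests under txsim load.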
Ah, the full disk was just the OS disk filling up with logs, but we cleaned that. The actual disk with the DB still has about 50% of its space left. Not sure if it's working properly yet, though; I still see the broken-pipe error messages in the logs, so it looks like it hasn't fully recovered yet.
Context
I spammed Mocha last night #4212 (comment)
Problem
Some node operators reported issues on Mocha last night
Proposal
cc @evan-forbes: did we ever run similar txsim load on Arabica after bumping to 8 MiB? Any similar symptoms from network tests?