
Spam Mocha and analyze block times #4212

Closed
rootulp opened this issue Jan 13, 2025 · 6 comments


rootulp commented Jan 13, 2025

Context

Mocha increased its maximum block size to 8 MiB.

Problem

We want to be confident that block times can remain ~6 seconds with full 8 MiB blocks.

Proposal

  1. Spam Mocha via txsim for a couple hours. Use a range of blob sizes. Verify that blocks are near full.
  2. After a couple hours, analyze the block times. There exists a Go binary for this.
rootulp self-assigned this Jan 13, 2025

rootulp commented Jan 13, 2025

Looks like Mocha blocks over the past week haven't been that full, so yes, we need to spam it. I originally tried

$ txsim --key-path ~/.celestia-app --grpc-endpoint rpc-mocha.pops.one:9090 --feegrant --blob 10 --blob-amounts 1-10 --blob-sizes 1000-1000000

but I'm observing some logs of the form

{"level":"error","error":"broadcast tx error: tx size 3095389 bytes is larger than the application's configured MaxTxSize of 2097152 bytes for version 3: exceeds max tx size limit","address":"celestia1sqpcnmafsg3e4lvcwwzkzg6qlwcytuurkq2l7d","blobs count":"7","total byte size of blobs":3094218,"time":"2025-01-13T15:53:30-05:00","message":"tx failed"}

so trying this instead

$ txsim --key-path ~/.celestia-app --grpc-endpoint rpc-mocha.pops.one:9090 --feegrant --blob 10 --blob-amounts 1 --blob-sizes 1000000-2000000


rootulp commented Jan 13, 2025

It's kinda tough to get consistently full blocks b/c the txs land in different blocks. For example, the logs look like

{"level":"info","height":4156070,"address":"celestia1tj5t0chjtv44fs3du4f3mjlsexz7s9v5sckx23","blobs count":"1","total byte size of blobs":1720203,"time":"2025-01-13T15:56:18-05:00","message":"tx committed"}
{"level":"info","height":4156070,"address":"celestia1gt45dyfjjxtd58l3kjesc83uttk8ewyqljtu65","blobs count":"1","total byte size of blobs":1720203,"time":"2025-01-13T15:56:18-05:00","message":"tx committed"}
{"level":"info","height":4156072,"address":"celestia1fzkmlt3mee0gv6svpwg90lsmrwdncmmdrw68ac","blobs count":"1","total byte size of blobs":1720203,"time":"2025-01-13T15:56:18-05:00","message":"tx committed"}
{"level":"info","height":4156073,"address":"celestia10ny9aqs63fg8ysr6vvpfh3mtev9kypejagz0g2","blobs count":"1","total byte size of blobs":1720203,"time":"2025-01-13T15:56:18-05:00","message":"tx committed"}
{"level":"info","height":4156075,"address":"celestia17s9ampc4u7whp6fu2emjv8y437kjc7yk8254s0","blobs count":"1","total byte size of blobs":1720203,"time":"2025-01-13T15:56:18-05:00","message":"tx committed"}
{"level":"info","height":4156077,"address":"celestia1n5yl702v2t4273qp4un83cjcxkuqyzelj38sfj","blobs count":"1","total byte size of blobs":1720203,"time":"2025-01-13T15:56:18-05:00","message":"tx committed"}
{"level":"info","height":4156079,"address":"celestia15cgjxztmxtugmppfa6et6kwuj7z532ttcysl7e","blobs count":"1","total byte size of blobs":1720203,"time":"2025-01-13T15:56:18-05:00","message":"tx committed"}
{"level":"info","height":4156081,"address":"celestia1htcw8j4epfw37zxt0sk6g74smvduwpa7s260l5","blobs count":"1","total byte size of blobs":1720203,"time":"2025-01-13T15:56:18-05:00","message":"tx committed"}
{"level":"info","height":4156082,"address":"celestia16jqxzfwus0mj2qmt2a04dzrlf2qqq4pw9crfj7","blobs count":"1","total byte size of blobs":1720203,"time":"2025-01-13T15:56:18-05:00","message":"tx committed"}

and then there are a bunch of consecutive blocks that are all > 1 MiB but rarely > 2 MiB


rootulp commented Jan 13, 2025

A little crazy, but a tip from Josh was to run:

$ txsim --key-path ~/.celestia-app --grpc-endpoint rpc-mocha.pops.one:9090 --blob 4 --blob-amounts 1 --blob-sizes 1782579 --feegrant

I ran that in 5+ terminal tabs and now blocks are better utilized. Some are ~6 MiB.

(Screenshot: 2025-01-13 at 4:51:03 PM)


rootulp commented Jan 14, 2025

I let txsim run on Mocha overnight, but 2 of the 5 terminal tabs hit:

{"level":"error","error":"sequence 2: tx was evicted from the mempool","time":"2025-01-14T08:28:24-05:00","message":"sequence failed"}
Error: sequence 2: tx was evicted from the mempool

Blocks on Mocha don't come near 8 MiB, so next steps are:

  1. Repeat txsim on Arabica to see if it's possible to get 8 MiB blocks there
  2. Investigate txsim code to see if it's possible to submit txs in a way that actually fills blocks


rootulp commented Jan 14, 2025

Update: I think my previous issue was that I was overwhelming a single server. I got Mocha block utilization higher by spreading txsim across multiple endpoints. I'm running against 4 gRPC endpoints:

$ txsim --key-path ~/.celestia-app --grpc-endpoint rpc-mocha.pops.one:9090 --feegrant --blob 8 --blob-amounts 1 --blob-sizes 1000000

$ txsim --key-path ~/.celestia-app --grpc-endpoint grpc-1.testnet.celestia.nodes.guru:10790 --feegrant --blob 8 --blob-amounts 1 --blob-sizes 1000000

$ txsim --key-path ~/.celestia-app --grpc-endpoint mocha-4-consensus.mesa.newmetric.xyz:9090 --feegrant --blob 8 --blob-amounts 1 --blob-sizes 1000000

$ txsim --key-path ~/.celestia-app --grpc-endpoint full.consensus.mocha-4.celestia-mocha.com:9090 --feegrant --blob 8 --blob-amounts 1 --blob-sizes 1000000

Blocks are now > 4 MiB:

(Screenshot: 2025-01-14 at 3:33:44 PM)


rootulp commented Jan 15, 2025

Going to close this b/c:

$ go run main.go
Fetched a total of 110 blocks (from ~4170798 up to ~4170898).
Average bytes_in_block: 5617659.89 bytes (~5.36 MiB)
Average block_time:     6761.83 ms (~6.76 seconds)

rootulp closed this as completed Jan 15, 2025