Such a cool program! I am running a 16-core M3 Max with 128 GB RAM (2 TB SSD), so it can handle arbitrary precision arithmetic very well. Is it possible to use one of the solutions here so that the zoom can be increased beyond 0.000005 while retaining precision?
I assume that you are referring to the GPU 2D visualisation. Last time I checked, I performed some (empirical) calculations and found that I couldn't go further than 0.000005 before the visualisation became a mess of pixels. I'm not very familiar with BigDecimal (or its C# equivalent), but this was on an old late-2013 MacBook Pro with an old version of .NET.
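As a sketch of why the image degrades around that zoom level, assuming the renderer stores coordinates as IEEE-754 doubles (the centre value and step size below are purely illustrative): once the per-pixel step falls below the ULP (unit in the last place) of the centre coordinate, neighbouring pixels round to the same value.

```java
public class DoubleZoomLimit {
    public static void main(String[] args) {
        // An illustrative deep-zoom centre coordinate.
        double centre = -0.743643887037151;

        // Smallest representable step near this value (~1.1e-16 here).
        double ulp = Math.ulp(centre);
        System.out.println("ULP near centre: " + ulp);

        // A per-pixel step at a very deep zoom: smaller than half an ULP,
        // so adding it rounds straight back to the original value.
        double step = 1e-18;
        System.out.println("centre + step == centre ? " + (centre + step == centre));
    }
}
```

At that point every pixel in a neighbourhood maps to the same sample, which matches the "mess of pixels" behaviour you observed.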
Also,
Do you have a source that states that the M3 Max (or maybe the whole Apple Silicon line-up) supports arbitrary precision arithmetic?
Implementing BigDecimal (or its C# equivalent) might not be trivial, since I suspect it would require major updates to both the .NET and Unity versions, which (most probably) would break a lot of things.
@0xAdriaTorralba To my knowledge, there is no hardware that supports arbitrary precision arithmetic. It is almost always (if not always) done in software. Therefore, the question is moot.
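To illustrate the software-only point, here is a minimal sketch in Java (since BigDecimal is Java's class; C# has no built-in decimal equivalent of comparable range, though `System.Numerics.BigInteger` can be used to build one). Two points closer together than a double can distinguish remain distinct in software arbitrary precision:

```java
import java.math.BigDecimal;
import java.math.MathContext;

public class SoftwarePrecision {
    public static void main(String[] args) {
        // 50 significant decimal digits, computed entirely in software.
        MathContext mc = new MathContext(50);

        BigDecimal a = new BigDecimal("-0.743643887037151"); // illustrative coordinate
        BigDecimal step = new BigDecimal("1e-18");           // below double's ULP here
        BigDecimal b = a.add(step, mc);

        // BigDecimal keeps the two points distinct...
        System.out.println("distinct in BigDecimal: " + (a.compareTo(b) != 0));
        // ...while converting to hardware doubles collapses them.
        System.out.println("collapse as doubles:    " + (a.doubleValue() == b.doubleValue()));
    }
}
```

The trade-off is speed: each software-precision operation costs many hardware instructions, so a BigDecimal-style renderer is typically orders of magnitude slower than the double path regardless of how much RAM the machine has.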