Author here. This is my first published paper, developed independently. The novel approach uses a fixed set of witness primes to reduce verification to O(1) work per even number. A full C# implementation is on GitHub for anyone to run.
> Goldbach's conjecture states: every even number ≥ 4 is a sum of two primes. The naive check for an even n
> tries many primes p and hopes that n − p is prime. Our idea is simpler: fix a small set of primes
> Q = {q1, . . . , qK} (the “gear”), and for each even n only test p = n − q with q ∈ Q.
I don't see how your idea is different from the naive check. As far as I can tell, you are basically saying do the naive check but stop after the first 250–300 primes?
The idea is that with a fixed gear, instead of exhaustively checking against every prime, a small subset (K = 300) is sufficient for effectively complete coverage. That holds at large slices in the quadrillions and higher, and across an entire run of evens up to 1 trillion. It saves a massive amount of compute and appears just as effective as the naive check: orders of magnitude faster.

As far as I know, no one has tested this method or written an algorithm precisely like this, or determined that K = 300 is the sweet spot for the prime set. Complexity isn't required for improvement.
Think of it like this: “naive 300” would try to check whether n minus any of the first 300 primes lands on another prime. For big n, it falls apart fast as prime gaps explode, and you start missing left and right. But here I use a small, constant set of primes, the “gear”, and for each even n I check n − q for every q in that gear. Not just as a casual test, but as a full, multi-threaded, checkpointed sweep with audited logs.
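A minimal Python sketch of what I mean (this is a reconstruction for illustration, not the actual C# implementation; `is_prime` here is a standard deterministic Miller–Rabin, and the gear is taken to be the first 300 odd primes):

```python
def is_prime(n):
    """Deterministic Miller-Rabin; these witnesses suffice for 64-bit n."""
    if n < 2:
        return False
    small = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)
    for p in small:
        if n % p == 0:
            return n == p
    # Write n - 1 = d * 2^s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in small:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False
    return True

def first_odd_primes(k):
    """The fixed 'gear': the first k odd primes."""
    primes, p = [], 3
    while len(primes) < k:
        if is_prime(p):
            primes.append(p)
        p += 2
    return primes

GEAR = first_odd_primes(300)  # K = 300, the claimed sweet spot

def gear_witness(n):
    """Return a q in GEAR with n - q prime, or None if the gear misses n."""
    for q in GEAR:
        if is_prime(n - q):
            return q
    return None
```

For example, `gear_witness(100)` returns 3 (since 97 is prime). The cost per even n is bounded by K primality tests, and in practice the first few gear primes almost always hit, which is where the constant-time behavior comes from.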
At 12 threads I can do 10^12 in about 3 hours 45 minutes, though it takes over the whole system. I think it could be optimized further in the right hands or on better hardware. I should add some benchmarks for sure; will do so soon.
How long does it take for you to test up to 10^9? 10^12?
I wrote https://github.com/sethtroisi/goldbach in the last four hours, and I think it's 10-100x faster than your code.
I know of at least another 4x I could improve it, but I've been nerd sniped enough for the night. I wouldn't trust it 100%, but it finds all terms in https://oeis.org/A025018 and https://oeis.org/A025019, and I can verify against https://sweet.ua.pt/tos/goldbach.html
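For anyone wanting to reproduce that cross-check, here's a naive few-line Python sketch (not the code in my repo): A025018 lists the even numbers whose smallest Goldbach prime sets a new record, and A025019 lists those record primes.

```python
def is_prime(n):
    """Trial division; fine for the small range checked here."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def min_goldbach_prime(n):
    """Smallest prime p with n - p also prime (assumes one exists)."""
    p = 2
    while True:
        if is_prime(p) and is_prime(n - p):
            return p
        p += 1

def records(limit):
    """(n, p) pairs where the minimal Goldbach prime p sets a new record."""
    best, out = 0, []
    for n in range(4, limit, 2):
        p = min_goldbach_prime(n)
        if p > best:
            best = p
            out.append((n, p))
    return out
```

`records(1000)` should reproduce the opening terms of the two sequences, i.e. (4, 2), (6, 3), (12, 5), (30, 7), (98, 19), (220, 23), (308, 31), (556, 47), (992, 73). The real verification needs a far faster primality test, but the logic is the same.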