Hacker News

hnlyman
Generating All 32-Bit Primes (Part I) hnlyman.github.io

susam · 4 hours ago

I have a little tool called Prime Grid Explorer at https://susam.net/primegrid.html that I wrote for my own amusement. It can display all primes below 3317044064679887385961981 (an 82-bit integer).

The largest three primes it can show are

  3317044064679887385961783
  3317044064679887385961801
  3317044064679887385961813
Visit https://susam.net/primegrid.html#3317044064679887385961781-2... to see them plotted. Click the buttons labelled '·' and 't' to enable the grid and tooltips, then hover over each circle to see its value.

So essentially it can test all 81-bit integers and some 82-bit integers for primality. It does so using the Miller-Rabin primality test with prime bases derived from https://oeis.org/A014233 (OEIS A014233). The algorithm is implemented in about 80 lines of plain JavaScript. If you view the source, look for the function isPrimeByMR.

The Miller-Rabin test is inherently probabilistic. It tests whether a number is a probable prime by checking whether certain number-theoretic congruence relations hold for a given base a. The test can yield false positives, that is, a composite number may pass the test. But it cannot have false negatives, so a number that fails the test is definitely composite. The more bases the test passes, the more likely it is that the tested number is prime. It has been computationally verified that there are no false positives below 3317044064679887385961981 when tested with the prime bases 2, 3, 5, ..., 41. So although the algorithm is probabilistic, it functions as a deterministic test for all numbers below this bound when tested with these 13 bases.

flancian · a minute ago

Very cool, thank you! Both the visualization tool and the description of Miller-Rabin. I didn't know an algorithm with these properties existed! Furthermore, your tool gave me a more intuitive feel for the rate at which primes "thin out" than any treatment of the topic I'd read previously.

senfiaj · 7 hours ago

There is also the segmented Sieve of Eratosthenes. It has similar performance but uses much less memory: it only needs to store the primes from 2 to sqrt(n). For example, for n = 1000000, RAM has to hold only 168 additional numbers (the primes below sqrt(1000000) = 1000).

I use this algorithm here https://surenenfiajyan.github.io/prime-explorer/
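A minimal Python sketch of the segmented approach, assuming an ordinary base sieve up to sqrt(n); the segment size and names are illustrative:

```python
import math

def segmented_primes(n, segment_size=1 << 16):
    """Yield all primes <= n while keeping only the primes up to sqrt(n) in memory."""
    limit = math.isqrt(n)
    # Ordinary sieve for the base primes up to sqrt(n).
    base = [True] * (limit + 1)
    base[0:2] = [False, False]
    for i in range(2, math.isqrt(limit) + 1):
        if base[i]:
            base[i * i :: i] = [False] * len(base[i * i :: i])
    base_primes = [i for i, is_p in enumerate(base) if is_p]
    yield from base_primes
    # Sieve each fixed-size window [lo, hi) using only the base primes.
    for lo in range(limit + 1, n + 1, segment_size):
        hi = min(lo + segment_size, n + 1)
        seg = [True] * (hi - lo)
        for p in base_primes:
            start = max(p * p, (lo + p - 1) // p * p)  # first multiple of p in window
            for m in range(start, hi, p):
                seg[m - lo] = False
        yield from (lo + i for i, is_p in enumerate(seg) if is_p)
```

For n = 1000000 this keeps only the 168 base primes plus one window resident at a time.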

dahart · 3 hours ago

The Pseudosquares Sieve will drop the memory requirements much further from sqrt(n) to log^2(n). https://link.springer.com/chapter/10.1007/11792086_15

hnlyman (OP) · 2 hours ago

Yep, this is the natural way to go, especially considering the possibility of parallel computing and the importance of cache locality, etc.

ojciecczas · an hour ago

Do you know the Sieve of Atkin (https://en.wikipedia.org/wiki/Sieve_of_Atkin)? It's mind-blowing.

forinti · 7 hours ago

If you take all 53 8-bit primes, you can use modular arithmetic with a residue base to work with numbers up to

64266330917908644872330635228106713310880186591609208114244758680898150367880703152525200743234420230

This would require 334 bits.
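The residue-number-system idea can be sketched in Python, using all primes below 256 as the moduli and the Chinese Remainder Theorem for reconstruction; the names are illustrative:

```python
# All 8-bit primes, found by trial division.
primes = [p for p in range(2, 256)
          if all(p % d for d in range(2, int(p ** 0.5) + 1))]

M = 1
for p in primes:
    M *= p  # any 0 <= x < M is uniquely determined by its residues

def to_residues(x):
    """Represent x by its remainder modulo each prime."""
    return [x % p for p in primes]

def from_residues(r):
    """Recover x from its residues via the Chinese Remainder Theorem."""
    x = 0
    for p, ri in zip(primes, r):
        q = M // p  # product of all the other moduli
        x = (x + ri * q * pow(q, -1, p)) % M
    return x
```

The appeal is that addition and multiplication can then be done independently in each small modulus, with no carries between them.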

mark-r · 6 hours ago

You can combine the Sieve and Wheel techniques to reduce the memory requirements dramatically. There's no need to use a bit for numbers that you already know can't be prime. You can find a Python implementation at https://stackoverflow.com/a/62919243/5987
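The wheel part can be sketched in Python: with a 2·3·5 wheel, only 8 of every 30 residues can hold a prime, so a sieve needs storage only for those positions. This sketch computes the spokes at runtime rather than hardcoding them; the names are illustrative, not the linked implementation:

```python
WHEEL = 2 * 3 * 5  # 30
SPOKES = [r for r in range(WHEEL) if all(r % p for p in (2, 3, 5))]
# SPOKES == [1, 7, 11, 13, 17, 19, 23, 29]: one byte can cover each block of 30.

def wheel_candidates(n):
    """Yield 2, 3, 5 and every number in [2, n] coprime to 30 (a superset of the primes)."""
    yield from (p for p in (2, 3, 5) if p <= n)
    for base in range(0, n + 1, WHEEL):
        for r in SPOKES:
            if 1 < base + r <= n:
                yield base + r
```

A sieve then only needs to mark composites among these candidates, shrinking the bitmap to 8 bits per 30 numbers.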

tromp · 3 hours ago

Or a C implementation at https://tromp.github.io/pearls.html#sieve which runs in well under 10s.

hnlyman (OP) · 2 hours ago

I'd be interested in seeing an explanation of the code, since it looks pretty incomprehensible to me. Per the arbitrary rules I set for myself, I'm not allowed to precompute/hardcode the wheel (looks like this implementation uses a hardcoded wheel of size 2x3x5=30). I wonder if/by how much the performance would suffer by computing and storing the coprime remainders in memory instead of handing them directly to the compiler.

dahart · 3 hours ago

This got me through many of the first 100 problems on Project Euler:

    n = 1000000  # must be even
    sieve = [True] * (n//2)
    for i in range(3, int(n**0.5)+1, 2):
        if sieve[i//2]: sieve[i*i//2::i] = [False] * ((n-i*i-1)//(2*i)+1)
    …
    # x is prime if x%2 and sieve[x//2]
Edit: I guess I irked someone. :/ Yes this is a memory hog, but to me beautiful because it’s so tiny and simple. I never tried very hard, but I wonder if it could be made a real one-liner.

davispeck · an hour ago

I always like seeing implementations that start from trial division and gradually introduce optimizations like wheel factorization.

It makes the trade-offs much clearer than jumping straight to a complex sieve.

reader9274 · 2 hours ago

Very well written

ZyanWu · 7 hours ago

> There is a long way to go from here. Kim Walisch's primesieve can generate all 32-bit primes in 0.061s (though this is without writing them to a file)

Oh, come on, just use a bash redirection and be done with it. It takes a minute and you'd have another result for comparison.

marxisttemp · 6 hours ago

Why include writing the primes to a file instead of, say, standard output? That increases the optimization space drastically, and the I/O will eclipse all the careful bitwise math.

Does having the primes in a file even allow faster is-prime lookup of a number?

hnlyman (OP) · 3 hours ago

No real reason. It's just an arbitrary task I made for myself. I might have to adjust the goal if writing to the file becomes the lion's share of the runtime, but I'll be pretty happy with myself if that's the project's biggest problem.

logicallee · 7 hours ago

There are also very fast primality tests that work statistically, such as Miller-Rabin. I tested it in the browser here[1] and it can do them all in about three minutes on my phone.

[1] https://claude.ai/public/artifacts/baa198ed-5a17-4d04-8cef-7...

amelius · 4 hours ago

What are the false positive/negative rates?

logicallee · 4 hours ago

For the witness set used here, it has been proven that there are no false positives for n < 2³⁷ (and Miller-Rabin never yields false negatives).

_alternator_ · 4 hours ago

Nice. Notably with Miller-Rabin, you can also iterate the test cheaply and get exponentially low false positive/negative rates. I believe that this is how prime factors for RSA keys are usually chosen; choose an error rate below 2^-1000 and sleep extremely soundly knowing that the universe is more likely to evaporate in the next second than that you’ve got a false positive prime.
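The iterated, randomized form described above can be sketched in Python; the round count and names are illustrative. Each round, a composite survives with probability at most 1/4:

```python
import random

def mr_round(n, a):
    """One Miller-Rabin round: True iff odd n > 3 passes for base a."""
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    x = pow(a, d, n)
    if x == 1 or x == n - 1:
        return True
    for _ in range(s - 1):
        x = x * x % n
        if x == n - 1:
            return True
    return False

def probably_prime(n, rounds=500):
    """With 500 independent random bases, a composite slips through
    with probability at most 4**-500 = 2**-1000."""
    if n < 4:
        return n in (2, 3)
    if n % 2 == 0:
        return False
    return all(mr_round(n, random.randrange(2, n - 1)) for _ in range(rounds))
```

For a prime n every base passes, so the "probably" only cuts one way: a True answer might in principle be wrong, a False answer never is.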
