I asked a question earlier that involved pulling large primes from a text file and putting them into another file. The script below was supposed to grab every prime up to and including the first prime after 2^32, but for some reason it stopped working.
#!/bin/bash
# Copy primes from primes.txt into primes2.txt until the last prime
# written exceeds 2^32 (so the first prime after 2^32 is included).
n=4294967296
last=0
while read number
do
    if [ $last -gt $n ]
    then break
    fi
    echo $number
    last=$number
done < primes.txt > primes2.txt
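For what it's worth, the logic itself does what I expect on a small input; here is a quick sanity check with a hypothetical tiny test file and a cut-off of 10 instead of 2^32:
# tiny stand-in for primes.txt, with a small threshold
printf '%s\n' 2 3 5 7 11 13 17 > test_primes.txt
n=10
last=0
while read number
do
    if [ $last -gt $n ]
    then break
    fi
    echo $number
    last=$number
done < test_primes.txt
# prints 2 3 5 7 11: everything up to and including the first prime past the cut-off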
On the full primes.txt, though, the script ended up looping through these 11 numbers:
4232004449
4232004479
4232004493
4232004509
4232004527
4232004533
4232004559
4232004589
4232004593
4232004613
004437
The original file didn't have 004437 in it, and my bash will handle numbers over 8999999999999999999.
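Both of those are easy to double-check from the shell (a quick sketch; primes.txt is the same input file as above):
# 004437 never appears as a line in the input (prints 0)
grep -c '^004437$' primes.txt
# and this bash's [ -gt ] comparison copes with values that large
[ 8999999999999999999 -gt 4294967296 ] && echo "64-bit comparison ok"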
Does anybody have a clue as to why this happened?
64-bit Ubuntu 10.04, 16GB RAM, 8-cores @ 3.60 GHz
GNU bash, version 4.1.5(1)-release (x86_64-pc-linux-gnu)
Update:
After downloading and compiling the "fixed" bash provided by jfgagne and linking my script to it, it errored out at exactly the same spot. Using the significantly faster perl equivalent from my original prime question, I got these file sizes from ls -al:
11 next_prime (just to make sure this was counting bytes accurately)
2147483659 primes2.txt
2147483670 one_too_many
2147483659 = 2^31 + 11
The size of the next prime (4232004631) is 11 bytes (ten digits plus a newline).
primes2.txt holds all primes up to 4232004613. I also realized that the 004437 at the bottom of this error loop is the tail end of the prime 4232004437. It seems like something is trying to advance, but is stuck.
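One way to pin down where it gets stuck (a sketch using GNU grep and coreutils, which this Ubuntu box has): find the byte offset at which the last successfully copied prime starts in the input, and compare it against 2^31 and the size of primes2.txt.
# offset of the last copied prime in the input; if the output is a
# byte-for-byte prefix of the input, this should be 2147483659 - 11 = 2147483648,
# i.e. exactly 2^31
grep -b -m 1 '^4232004613$' primes.txt
# current size of the partial output (2147483659 per ls -al above)
stat -c %s primes2.txt
# and the identity quoted above: 2^31 + 11
echo $(( 2**31 + 11 ))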