What is the most efficient algorithm to find the closest prime less than a given number $n$?












Problem

Given a number $n$, $2 \leq n \leq 2^{63}$. $n$ could itself be prime. Find the prime $p$ that is closest to $n$.



Using the fact that every prime $p > 3$ is odd and of the form $6k + 1$ or $6k + 5$, I wrote a for-loop that checks each number from $n - 1$ down to $2$ for primality. Instead of testing every number, I only need to test the odd numbers with $p \equiv 1 \pmod{6}$ or $p \equiv 5 \pmod{6}$. However, I wonder if there is a faster algorithm for this problem, i.e., are there constraints that can further restrict the range of numbers that need to be checked? Any idea would be greatly appreciated.
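For concreteness, a minimal sketch of the loop described above (Python; `is_prime` and `prev_prime` are illustrative names, and plain trial division stands in for the primality check, which is far too slow near $2^{63}$):

```python
def is_prime(m):
    """Trial division in 6k +/- 1 steps; fine for a sketch, too slow for huge m."""
    if m < 2:
        return False
    if m < 4:          # 2 and 3 are prime
        return True
    if m % 2 == 0 or m % 3 == 0:
        return False
    f = 5
    while f * f <= m:
        if m % f == 0 or m % (f + 2) == 0:
            return False
        f += 6
    return True

def prev_prime(n):
    """Largest prime strictly less than n, testing only 2, 3, and 6k +/- 1 candidates."""
    for c in range(n - 1, 1, -1):
        if c in (2, 3) or c % 6 in (1, 5):
            if is_prime(c):
                return c
    return None        # no prime below n (only happens for n <= 2)
```

The $6k \pm 1$ filter skips two thirds of the candidates, but each trial-division test still costs $O(\sqrt{m})$, which is the real bottleneck the answers below address.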


















  • Use a strong probabilistic test and then verify primality with a deterministic test, e.g. see primes.utm.edu/prove/index.html
    – Bill Dubuque, Apr 27 '11 at 0:38

  • Would storing all the primes within the range in some array and using a (suitably modified) binary search be impractical?
    – J. M. is not a mathematician, Apr 27 '11 at 0:40

  • You only have to check the interval $[n/2, 2n]$ because of Bertrand's postulate.
    – lhf, Apr 27 '11 at 0:53

  • @Chan: You will just be wasting a lot of people's time by asking in multiple places. If you are looking for different solutions, you should mention that in your question. Also, I disagree that you won't find good practical answers here. IMO, you will find much better answers (if needed, using 'bitwise' tricks) here than on Stack Overflow.
    – Aryabhata, Apr 27 '11 at 2:05

  • I think it's OK to ask (essentially) the same question in two venues, **as long as you're upfront about it**, that is, as long as in each venue you note that you're asking it in the other, and you give a link so people can easily check that they are not duplicating effort.
    – Gerry Myerson, Apr 27 '11 at 5:14
















elementary-number-theory algorithms prime-numbers






asked Apr 27 '11 at 0:26 by Chan; edited May 11 '12 at 20:02 by J. M. is not a mathematician








3 Answers



















This is an extension of Bill's comment.



It is certainly reasonable to use strong probabilistic tests and then verify with the AKS algorithm. In fact, if we look for primes alternately above and below the target number and test with AKS, where each primality test takes time at most $\log^{O(1)} N$ and each candidate is prime with probability roughly $\dfrac{1}{\log N}$, one would be at least 99% sure to find a prime in time $O(\log^{O(1)} N)$. But this is not deterministic: conceivably, there exists an interval where this works very poorly, far worse than polylogarithmic time. To be fair, I suspect such an interval does not exist in the numbers up to $2^{64}$. But speaking generally, a deterministic algorithm might be desired.



To that end, I refer you to the results of Polymath4, "Finding Primes," a collaborative mathematical project that sought a method to deterministically find primes of about a given size. In their paper, they describe deterministic methods that take time $O(N^{0.5 + \epsilon})$, which I think is pretty exciting.



Ultimately, I would say that Bill's suggestion of probabilistically checking numbers and then verifying their primality is optimal. But the promise of a deterministic algorithm is... quite alluring, I think, too.
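The search strategy this answer describes might be sketched as follows (Python; the Miller-Rabin implementation and the `nearest_prime` helper are illustrative, not code from the answer, and a verifying deterministic test would follow each "probably prime" result):

```python
import random

def miller_rabin(n, rounds=20):
    """Probabilistic test: 'composite' answers are always correct; a
    'probably prime' answer errs with probability at most 4**-rounds."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):     # dispose of tiny factors first
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:                  # write n - 1 = d * 2**s with d odd
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)               # fast modular exponentiation
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False               # witness found: n is composite
    return True

def nearest_prime(n):
    """Probe alternately below and above n, as the answer suggests."""
    if miller_rabin(n):
        return n
    for d in range(1, n):
        if miller_rabin(n - d):
            return n - d
        if miller_rabin(n + d):
            return n + d
```

Since the expected gap to the next prime is about $\ln n$, this probes only $O(\log n)$ candidates on average, each costing a handful of modular exponentiations.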






– davidlowryduda (answered Apr 27 '11 at 1:04; edited Jan 5 at 3:17 by Gaurang Tandon)









  • Just a note: the gain from using a probabilistic primality test is that if a number is deemed composite, we can skip running the deterministic test on it, as probabilistic tests usually don't lie about compositeness.
    – Aryabhata, Apr 27 '11 at 2:18

  • This is very clear! Whenever I find primes of a large size, I always just search using Miller-Rabin and AKS. But it is still probabilistic, not deterministic.
    – davidlowryduda, Apr 27 '11 at 2:31

  • @mixed: Most probabilistic tests guarantee compositeness (I believe Miller-Rabin is one of those). So using Miller-Rabin then AKS is a deterministic algorithm!
    – Aryabhata, Apr 27 '11 at 2:55

  • @mixed: There are many deterministic algorithms whose runtime is not known precisely; Shell sort is one classic example. Deterministic does not mean you know the runtime, it just means you are 100% confident of the output. The whole point of using a probabilistic test is that we will likely move past the composites quickly and run AKS only on some composites and the primes, saving a lot of computation time.
    – Aryabhata, Apr 27 '11 at 3:05

  • @mixed: Come on. You know very well it does not matter if it was AKS or not :-) Anyway, I agree that it is not deterministic with respect to the runtime, but it is deterministic with respect to the output. We are talking of different things, I suppose. Anyway...
    – Aryabhata, Apr 27 '11 at 3:22




















You mention $2^{63}$. It's known that below $4 \cdot 10^{18} \approx 2^{61.8}$, there is at least one prime in every interval of 1476 consecutive numbers. So a reasonable algorithm would be to sieve an interval of numbers smaller than $n$ (2000 numbers if you want to be on the safe side) by primes up to some small bound, then check the remaining numbers with the Miller-Rabin test. Test any that pass with a primality-proving algorithm; do not use AKS as other answers have suggested, since it is not fast in this range.



Of course 2000 is overkill; an interval of length 200 should be good enough for 99.9997% of cases. But sieving is fast, so it's not really obvious what length is optimal. In any case, the more important optimization issue is deciding how far to sieve rather than how long an interval to sieve.



As for the primality-proving algorithm, ECPP works. It was recently proven that there are no BPSW pseudoprimes less than $2^{64}$, so a Lucas test following your MR test (to base 2) is even faster.
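A rough sketch of the sieving stage this answer describes (Python; the window of 2000 follows the answer's safe-side figure, the small-prime bound of 1000 is an assumption, and the function name is made up for illustration; survivors would still go to Miller-Rabin plus a Lucas test):

```python
def sieve_interval_below(n, width=2000, small_prime_bound=1000):
    """Strike out multiples of small primes in [n - width, n).
    Returns the surviving candidates, largest first; each survivor
    still needs a real primality test."""
    lo = max(2, n - width)
    survives = [True] * (n - lo)
    # small primes via trial division -- cheap at this bound
    small_primes = [p for p in range(2, small_prime_bound)
                    if all(p % q for q in range(2, int(p**0.5) + 1))]
    for p in small_primes:
        start = ((lo + p - 1) // p) * p      # first multiple of p >= lo
        for m in range(start, n, p):
            if m != p:                       # never strike out p itself
                survives[m - lo] = False
    return [lo + i for i, alive in enumerate(survives) if alive][::-1]
```

Each small prime $p$ eliminates about $\mathrm{width}/p$ candidates with one division and a short marking loop, which is why sieving first is so much cheaper than running a primality test on every candidate.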



















  • Using a segmented sieve up to the root of $n$ would be a good optimization.
    – saruftw, Jun 11 '16 at 10:45

  • @saru95 I don't think so, not at that size. It's faster to sieve out the easy composites and test the rest than to do billions of divisions.
    – Charles, Jun 11 '16 at 14:37




















Use sieving to optimize your loop even more.



Guess an interval of reasonable length in which your prime is going to be found. Initialize a corresponding table with all entries equal to "true".



Go over a list of small primes $p_i$ and compute $n \bmod p_i$. Using that value alone, you can mark some of the entries in the table as "false".



Now do what Bill suggests: go over the remaining entries, use Miller-Rabin, and then, if you want to be really sure, some deterministic test (not necessarily AKS, which is slow).
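The step of marking entries from $n \bmod p_i$ alone can be sketched like this (Python; names are illustrative). The point is that $p$ divides $n - k$ exactly when $k \equiv n \pmod{p}$, so a single modulus locates every multiple of $p$ in the window without any further division:

```python
def mark_composites(n, length, small_primes):
    """table[k-1] covers candidate n - k, for k = 1 .. length.
    An entry is set to False when a small prime divides its candidate."""
    table = [True] * length
    for p in small_primes:
        k = n % p              # smallest k with p dividing n - k ...
        if k == 0:
            k = p              # ... but k = 0 would point at n itself
        while k <= length:
            if n - k != p:     # never strike out p itself (tiny n only)
                table[k - 1] = False
            k += p
    return table
```

Entries still marked "true" are the ones to feed into Miller-Rabin, as the answer says.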



















    @mixed: There are many determinstic algorithms whose runtime is not known precisely. Shell Sort is one classic example. Deterministic does not mean you know the runtime. It just means you are 100% confident of the output. The whole point of using a probabilistic test is that we will likely move over the composites quickly and only run AKS on some composites and primes saving a lot of computation time.
    $endgroup$
    – Aryabhata
    Apr 27 '11 at 3:05






    1




    1




    $begingroup$
    @mixed: Come on. You know very well it does not matter if it was AKS or not :-) Anyway, I agree that it is not determinstic with respect to the runtime, but it is determinstic with respect to the output. We are talking of different things I suppose. Anyway...
    $endgroup$
    – Aryabhata
    Apr 27 '11 at 3:22




    $begingroup$
    @mixed: Come on. You know very well it does not matter if it was AKS or not :-) Anyway, I agree that it is not determinstic with respect to the runtime, but it is determinstic with respect to the output. We are talking of different things I suppose. Anyway...
    $endgroup$
    – Aryabhata
    Apr 27 '11 at 3:22











    8












    $begingroup$

    You mention $2^{63}$. It's known that below $4\cdot10^{18}\approx2^{61.8}$, there is at least one prime in every interval of 1476 numbers. So a reasonable algorithm would be to sieve an interval of numbers smaller than $n$ (of length up to 2000 if you want to be on the safe side) by the primes up to some small bound, then check the remaining numbers with the Miller-Rabin test. Test any that pass with a primality-proving algorithm—do not use AKS as other answers have suggested; it is not fast in this range.



    Of course 2000 is overkill; an interval of length 200 should be good enough for 99.9997% of cases. But sieving is fast, so it's not really obvious what length is optimal. In any case, the more important optimization issue is deciding how far to sieve rather than how long an interval to sieve.



    As for the primality-proving algorithm, ECPP works. Recently it was proven that there are no BPSW pseudoprimes less than $2^{64}$, so a Lucas test following your MR test (to base 2) is even faster.
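    For concreteness, here is a minimal Python sketch of this recipe. It is a sketch, not a tuned implementation: the window length and the small-prime list are illustrative choices, and the Miller-Rabin bases $2$ through $37$ (known to make the test deterministic below $2^{64}$) stand in for the MR + Lucas combination described above.

```python
# Sketch of the approach above: sieve a window below n with small primes,
# then test the survivors. The 12 bases 2..37 are known to make
# Miller-Rabin deterministic for all m < 2^64, so is_prime here plays
# the role of the MR + Lucas (BPSW) combination.

_BASES = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)

def is_prime(m):
    """Miller-Rabin; deterministic for m < 2^64 with _BASES."""
    if m < 2:
        return False
    for p in _BASES:
        if m % p == 0:
            return m == p
    d, s = m - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in _BASES:
        x = pow(a, d, m)
        if x in (1, m - 1):
            continue
        for _ in range(s - 1):
            x = x * x % m
            if x == m - 1:
                break
        else:
            return False
    return True

def prev_prime(n, window=2000):
    """Largest prime strictly less than n (n >= 3)."""
    lo = max(2, n - window)
    candidates = list(range(lo, n))
    alive = [c == 2 or c % 2 == 1 for c in candidates]
    for p in (3, 5, 7, 11, 13, 17, 19, 23, 29, 31):  # sieve the window
        for i, c in enumerate(candidates):
            if alive[i] and c != p and c % p == 0:
                alive[i] = False
    for i in range(len(candidates) - 1, -1, -1):     # scan down from n - 1
        if alive[i] and is_prime(candidates[i]):
            return candidates[i]
    return prev_prime(lo, window)  # never reached below 4e18 with window 2000

print(prev_prime(100))    # 97
print(prev_prime(2**31))  # 2147483647
```

    The sieve here uses trial remainders for brevity; computing one remainder per small prime and stepping through the window, as another answer describes, avoids the repeated divisions.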

















    $endgroup$













    • $begingroup$
      Using a segmented sieve up to the root of $n$ would be a good optimization.
      $endgroup$
      – saruftw
      Jun 11 '16 at 10:45










    • $begingroup$
      @saru95 I don't think so, not at that size. Faster to sieve out the easy composites and test the rest rather than doing billions of divisions.
      $endgroup$
      – Charles
      Jun 11 '16 at 14:37

















    edited May 11 '12 at 19:26

























    answered Apr 27 '11 at 19:38









    Charles













    3












    $begingroup$

    Use sieving to optimize your loop even more.



    Guess an interval of reasonable length on which your prime is going to be found. Initialize a corresponding table with entries all equal to "true".



    Go over a list of small primes $p_i$, and compute $n \bmod p_i$. Using that value, you can mark off some of the entries in the table as "false".



    Now do what Bill suggests: go over the remaining entries, use Miller-Rabin, and then, if you want to be really sure, a deterministic test (not necessarily AKS, which is slow).
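    A minimal Python sketch of these steps (the interval length `L`, the small-prime list, and the trial-division stand-in for the final Miller-Rabin/deterministic check are all illustrative choices):

```python
# Sketch of the steps above: guess an interval [lo, n), mark every
# entry "true", cross off multiples of small primes using one modular
# reduction per prime, then run a strict test on what survives.
# trial_is_prime stands in for the Miller-Rabin + deterministic check.

def trial_is_prime(m):
    # naive trial division; fine for a sketch, replace with MR in practice
    if m < 2:
        return False
    f = 2
    while f * f <= m:
        if m % f == 0:
            return False
        f += 1
    return True

def closest_prime_below(n, L=200):
    lo = max(2, n - L)
    table = [True] * (n - lo)          # table[i] represents lo + i
    for p in (2, 3, 5, 7, 11, 13):
        start = lo + (-lo) % p         # first multiple of p that is >= lo
        for m in range(start, n, p):   # step through the table, no divisions
            if m != p:                 # don't cross off p itself
                table[m - lo] = False
    for i in range(n - lo - 1, -1, -1):  # scan downward from n - 1
        if table[i] and trial_is_prime(lo + i):
            return lo + i
    return None                        # enlarge L if this ever happens

print(closest_prime_below(1000))  # 997
```

    Note that only one modular reduction per small prime is needed to find `start`; after that, crossing off entries is pure stepping, which is what makes the sieve cheap.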

















    $endgroup$


























        edited Apr 27 '11 at 12:52

























        answered Apr 27 '11 at 0:48









        Yuval Filmus






























            Thanks for contributing an answer to Mathematics Stack Exchange!


