Quantifying how crowded points are in a $d$-dimensional cube


























Let $x_1, \ldots, x_n$ be distinct points in the $d$-dimensional cube $[-1,1]^d$. What is a lower bound on the quantity
$$ \sum_{1 \le j < k \le n} \frac{1}{\|x_k-x_j\|}, $$
where $\| \cdot \|$ is the Euclidean norm?



An asymptotic answer for large $n$ is also fine, and bounds on any similar quantity, such as the sum with the norm replaced by the norm squared, are also welcome.



My motivation is a problem that appeared in the book The Cauchy-Schwarz Master Class, which proved the following:




If $-1 \le x_1 < x_2 < \cdots < x_n \le 1$ then $$ \sum_{1 \le j < k \le n} \frac{1}{x_k-x_j} \ge \frac{n^2 \log n}{8}. $$
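For intuition, the one-dimensional bound can be checked numerically. The sketch below is my own illustration (not from the book): it evaluates the pair sum for equally spaced points in $[-1,1]$ and compares it with $n^2 \log n / 8$.

```python
import math
import numpy as np

def pair_sum_1d(x):
    """S(x) = sum over pairs j < k of 1/(x_k - x_j), for distinct 1-D points."""
    x = np.sort(np.asarray(x, dtype=float))
    diff = x[:, None] - x[None, :]                       # diff[k, j] = x_k - x_j
    mask = np.tril(np.ones(diff.shape, dtype=bool), -1)  # keep entries with k > j
    return float(np.sum(1.0 / diff[mask]))

n = 200
x = np.linspace(-1.0, 1.0, n)        # equally spaced points in [-1, 1]
lhs = pair_sum_1d(x)
rhs = n**2 * math.log(n) / 8         # the book's lower bound
assert lhs >= rhs                    # the bound holds comfortably here
```

Equally spaced points make the sum easy to reason about: the gaps are multiples of $2/(n-1)$, and the resulting sum is on the order of $n^2 \log n$, consistent with the bound.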



































  • As it stands your question is a bit too broad, because it is simultaneously asking for the best bound, an asymptotic bound, and "bounds on any quantity that looks similar". Could you please edit your question to make it more focused?
    – Lord_Farin
    Dec 24 '18 at 9:13










  • Ok, I edited a little. Since I don't know the difficulty of this question it's bound to be a little vague, but the whole point is to stimulate some discussion and get any useful bounds.
    – Sandeep Silwal
    Dec 24 '18 at 16:25










  • These kinds of "simple" questions very quickly get out of hand...
    – Fabian
    Jan 1 at 4:35






























Tags: geometry, inequality, upper-lower-bounds














asked Dec 24 '18 at 1:47, edited Dec 24 '18 at 16:25
– Sandeep Silwal












2 Answers
This is an answer only up to $n = \exp(ad)$ for some $a \in \Theta(1)$, but it may be enough to get started. There are as many as $n = \exp(ad)$ points $y_1,\ldots, y_n$ in $\{-1,1\}^d$ that satisfy the following:
$\|y_i-y_j\|_1 \in \Theta(d)$ for each $i \ne j$, where $\| \cdot \|_1$ denotes the Manhattan metric.
[Google error-correcting codes.]

Then for each $i \ne j$, the following holds (if $y_i$ and $y_j$ differ in $m$ coordinates, then $\|y_i-y_j\|_1 = 2m$ while $\|y_i-y_j\|_2 = 2\sqrt{m}$):

$$\frac{1}{\|y_i-y_j\|_2} \le \frac{1}{c\sqrt{d}}$$

for some $c \in \Omega(1)$, which implies the following:

Inequality 1: $$\sum_{1 \le i < j \le n} \frac{1}{\|y_i-y_j\|_2} \le \frac{n^2}{2c\sqrt{d}}$$

However, note that for any distinct $x, y \in [-1,1]^d$ the following holds, since the diameter of the cube is $2\sqrt{d}$:

Inequality 2: $$\frac{1}{\|x-y\|_2} \ge \frac{1}{2\sqrt{d}}$$

So Inequalities 1 and 2 together imply that for $y_1,\ldots, y_n$ as above,

$\sum_{1 \le i < j \le n} \frac{1}{\|y_i-y_j\|_2}$ is no more than a constant multiple larger than $\sum_{1 \le i < j \le n} \frac{1}{\|x_i-x_j\|_2}$ for any other $x_1,\ldots, x_n \in [-1,1]^d$.
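As a numerical illustration of the first step, the sketch below uses random $\pm 1$ points in place of an explicit error-correcting code (my own substitution: with high probability, random corners also have pairwise $\ell_1$ distance $\Theta(d)$, by concentration of the binomial), and checks that the resulting pair sum is $O(n^2/\sqrt{d})$.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 400, 50                              # n far below exp(a d)
Y = rng.choice([-1.0, 1.0], size=(n, d))    # random corners of {-1,1}^d

# Pairwise Euclidean distances via the Gram matrix.
G = Y @ Y.T
sq = np.diag(G)
D2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * G, 0.0)
iu = np.triu_indices(n, 1)
dist = np.sqrt(D2[iu])

# If two +-1 vectors differ in m coordinates, their distance is 2*sqrt(m);
# m concentrates near d/2, so distances concentrate near sqrt(2 d).
assert np.all(dist >= np.sqrt(d))           # plays the role of c*sqrt(d)
S = float(np.sum(1.0 / dist))               # S is O(n^2 / sqrt(d)) here
```

This only demonstrates the distance concentration for one random draw; the answer's code-based construction makes the $\Theta(d)$ separation deterministic.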






– Mike, answered Dec 27 '18 at 19:16











































The situation seems to be less interesting asymptotically for fixed $d \ge 2$. We have
$$\frac{1}{\|x-y\|_2} \ge \frac{1}{2\sqrt{d}}$$ for any $x, y \in [-1, 1]^d$, so there's an easy lower bound of
$$\sum_{1 \le j < k \le n} \frac{1}{\|x_k - x_j\|} \ge \frac{1}{2\sqrt{d}} \binom{n}{2} = \Omega(n^2).$$ This lower bound is actually achieved (up to a constant) by letting the $x_j$ lie on a grid, say $\Lambda^+ = \{n^{-1/d}, 2n^{-1/d}, \dots, 1\}^d$.

First note that if we set
$$\Lambda = \{-1, \dots, -2n^{-1/d}, -n^{-1/d}, 0, n^{-1/d}, 2n^{-1/d}, \dots, 1\}^d$$
we have
$$\sum_{j \ne k} \frac{1}{\|x_k - x_j\|} < \sum_{v \in \Lambda,\, v \ne 0} \frac{1}{\|v\|}$$
for each $k$.

Also, since $d \ge 2$, the integral
$$\int_{[-1, 1]^d} \frac{dx}{\|x\|}$$
actually converges, say to some $I_d < \infty$, meaning in particular that
$$\frac{1}{n} \sum_{v \in \Lambda,\, v \ne 0} \frac{1}{\|v\|} \to I_d$$
(this is just a standard grid approximation).

Putting these together, we get
$$\sum_{1 \le j < k \le n} \frac{1}{\|x_k - x_j\|} = \frac{1}{2} \sum_k \sum_{j \ne k} \frac{1}{\|x_k - x_j\|} < \frac{n}{2} \sum_{v \in \Lambda,\, v \ne 0} \frac{1}{\|v\|} \sim \frac{n^2}{2} I_d,$$
so our sum is $O(n^2)$, matching the lower bound.
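A small computation supports the $\Theta(n^2)$ behavior. This is my own sketch, taking $d = 2$ and points on the grid $\Lambda^+$ scaled into $(0,1]^2$; it checks that the pair sum sits between the easy $\binom{n}{2}/(2\sqrt{d})$ lower bound and a constant multiple of $n^2$.

```python
import numpy as np

d, m = 2, 20
n = m ** d                                   # n = m^d grid points
axis = np.arange(1, m + 1) / m               # {n^{-1/d}, 2 n^{-1/d}, ..., 1}
X = np.stack(np.meshgrid(axis, axis, indexing="ij"), -1).reshape(n, d)

# Pairwise distances via the Gram matrix, then the pair sum S.
G = X @ X.T
sq = np.diag(G)
D = np.sqrt(np.maximum(sq[:, None] + sq[None, :] - 2.0 * G, 0.0))
iu = np.triu_indices(n, 1)
S = float(np.sum(1.0 / D[iu]))

lower = n * (n - 1) / (4.0 * np.sqrt(d))     # the (1/(2 sqrt d)) * C(n,2) bound
assert lower <= S <= 10.0 * n * n            # S = Theta(n^2) for fixed d
```

The constant `10.0` in the upper check is a generous ad hoc cap for this grid size, not the sharp constant $I_d/2$ from the answer.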






– user125932



















































