The conventional (Type 1) approach is to treat each prime number as a whole unit in cardinal terms.
So 2 represents one prime unit in this sense; 3 then represents another prime unit, and so on.
However, there is an alternative (Type 2) manner of looking at the primes, where each prime number is viewed as constituting a group of related members in ordinal terms.
So 2 in this sense represents a prime that is made up of (unique) 1st and 2nd members; 3 represents a prime made up of unique 1st, 2nd and 3rd members, and so on.
Likewise, the conventional manner of measuring the frequency of primes is based on the Type 1 approach.
Therefore, in considering for example how many of the first hundred natural numbers are prime, we treat the occurrence of each prime as a single unit.
And, as we have seen, a simple and surprisingly accurate formula can be given for the frequency of primes from this perspective, i.e. n/(log n – 1).
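As a rough check on this Type 1 formula, the estimate can be compared against a direct count. Below is a minimal Python sketch; the names is_prime, n, actual and estimate are my own for illustration, and log here is assumed to be the natural logarithm:

```python
import math

def is_prime(m):
    """Trial division -- adequate for small m."""
    if m < 2:
        return False
    return all(m % d for d in range(2, math.isqrt(m) + 1))

n = 100
# Type 1 count: each prime counts as a single unit
actual = sum(1 for m in range(2, n + 1) if is_prime(m))
# The n/(log n - 1) estimate from the text
estimate = n / (math.log(n) - 1)
print(actual)              # 25 primes up to 100
print(round(estimate, 1))  # roughly 27.7
```

So for n = 100 the formula overshoots somewhat, but as the text notes it becomes surprisingly accurate for larger n.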
However, there is an alternative way (based on the Type 2 perspective) of measuring the frequency of primes.
Here each prime is treated not as a single unit (in cardinal terms) but as a group of related members.
So 2 (as prime) is made up of 2 units, 3 is made up of 3 units, and so on.
Of course this is true of the natural numbers also.
Therefore, in considering the same example of the frequency of primes in the first 100 numbers, we need to sum the members of all the prime numbers contained up to 100,
i.e. 2 + 3 + 5 + 7 + ........ + 97.
Now, a simple formula that I suggest to obtain this result is based on the corresponding sum of the first n numbers divided by (log n – 1).
The formula for the sum of the first n numbers is n(n + 1)/2.
Therefore the corresponding formula for the sum of (Type 2) primes is
n(n + 1)/(2(log n – 1)).
In the following table I present the estimated total of prime members (using this formula) against the actual total.
Up to    Actual No.    Estimated No.    % Accuracy

 100        1060           1401           75.66
 200        4227           4676           90.40
 300        8275           9599           86.21
 400       13877          16067           86.37
 500       21536          24019           89.66
 600       29296          33408           87.69
 700       38612          44199           87.36
 800       49078          56363           87.07
 900       61797          69876           88.44
1000       75127          84719           88.67
1100       91953         100873           91.57
1200      110728         118324           93.58
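Both columns of such a table can be reproduced with a short script. The following Python sketch uses a simple sieve for the actual member totals; the names prime_member_sum and type2_estimate are my own, and the formula again assumes the natural logarithm:

```python
import math

def prime_member_sum(n):
    """Sum of all primes up to n -- the Type 2 'member' total."""
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for p in range(2, math.isqrt(n) + 1):
        if sieve[p]:
            # Mark all multiples of p from p*p onward as composite
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return sum(m for m in range(2, n + 1) if sieve[m])

def type2_estimate(n):
    """The suggested estimate n(n + 1)/(2(log n - 1))."""
    return n * (n + 1) / (2 * (math.log(n) - 1))

for n in (100, 200, 500):
    a, e = prime_member_sum(n), type2_estimate(n)
    print(n, a, round(e), f"{100 * a / e:.2f}%")
```

For n = 100 this gives an actual total of 1060 against an estimate of about 1401, matching the first row of the table.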

What is interesting is that this formula does not provide nearly as accurate predictions as the corresponding Type 1 approach (based on n/(log n – 1)).
However, the relative accuracy does gradually increase overall, so that once we pass 1000 it seems to remain consistently above 90%.