In the last blog entry, I contrasted the (static) analytic notion of independence with the corresponding (dynamic) holistic notion of interdependence.
Once again the analytic notion is 1-dimensional in nature, corresponding to interpretation with respect to just one polar frame of reference; the holistic notion by contrast is multidimensional, corresponding to interpretation with respect to a number of reference frames simultaneously.
The simplest - and most important - case of holistic interpretation is then with respect to 2 complementary reference frames that are positive and negative in terms of each other. (These 2 dimensions are thereby represented in quantitative terms by the corresponding 2 roots of unity i.e. + 1 and – 1).
Therefore when we holistically view both the random and ordered nature of the primes through the complementary cardinal and ordinal aspects of number, we then can truly realise the ultimate dynamic interdependence of both these notions.
Now such interdependence is directly qualitative, entailing the (unconscious) negation of what has already been (consciously) posited in experience. So, again, at a crossroads initially within single reference frames, one can posit both L and R turns in an unambiguous manner. However when one then simultaneously attempts to combine approaches to the crossroads from both N and S directions, this leads to the realisation of the mutual interdependence of L and R (as both L and R). This then enables us to (unconsciously) negate in experience the unambiguous conscious identification of each turn as either L or R (separately).
Now this latter holistic recognition, arising from the dynamic negation of the single posited pole of recognition, is properly of a 2-dimensional nature.
However customary mathematical understanding takes place in an analytic manner within single (posited) poles of reference.
Therefore to indirectly express the qualitative holistic recognition, that is 2-dimensional, in a 1-dimensional manner, we take the equivalent of the square root of – 1.
In other words, we have here the vitally important point that the imaginary notion itself represents the indirect quantitative means of expressing the holistic qualitative notion of interdependence in a reduced analytic manner.
So all the Riemann (Zeta 1) zeros are postulated to lie on the imaginary line (through 1/2).
This in fact points clearly to the fact that the numbers representing these zeros intrinsically represent the holistic notion of interdependence with respect to both the random and ordered nature of the primes throughout the number system. Now if we were to postulate that a zero could lie off this imaginary line, then we would in effect be maintaining that such a zero was of a qualitatively different nature (than those lying on the imaginary line)!
Once again we saw that when approached from the cardinal perspective in analytic (1-dimensional) terms, both the random and ordered nature of the primes lie at two extremes from each other, with individual primes highly random and the collective relationship of the primes (with the natural numbers) highly ordered respectively.
Then when approached from the ordinal perspective, in reverse manner, both the ordered and random nature of the primes again lie at two extremes from each other with now the individual natural number members of each prime highly ordered and the overall collective nature of the primes highly random.
So when we combine both reference frames (cardinal and ordinal) both random and order notions are rendered paradoxical, for what is random from one perspective is ordered from the other, and what is ordered from one perspective is random from the other.
Thus quite remarkably, the Riemann non-trivial zeros, in a direct sense represent holistic points approaching pure interdependence with respect to both the random and ordered nature of the primes. These points can then be indirectly expressed in an analytic manner on the imaginary axis (through 1/2).
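For anyone who wishes to see these locations concretely, here is a minimal numerical sketch. It assumes the freely available mpmath Python library (not something referred to above) and simply computes the first few Zeta 1 zeros; each has real part 1/2, with the imaginary part giving its height on the line.

```python
# A minimal numerical sketch (assuming the mpmath library is installed):
# the first few Riemann (Zeta 1) non-trivial zeros all lie on the line
# with real part 1/2.
from mpmath import zetazero

for n in range(1, 6):
    z = zetazero(n)              # n-th non-trivial zero, computed numerically
    print(n, z.real, z.imag)     # real part 1/2; imaginary part gives the height
```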
Now it may help to appreciate this more clearly by once again going back to the number system that contains both primes and composite natural numbers.
Each prime (from the cardinal perspective) is of a random nature (with no component factors). However each composite number by contrast is of an ordered nature (with multiple factors).
I showed already how there is an extremely close link as between the frequency of the Riemann (Zeta 1) non-trivial zeros and the corresponding frequency of the natural factors of the composite natural numbers.
However we keep moving between two extremes. So we start with 1 and then the two primes 2 and 3 (with no factors). Then suddenly we hit 4 (with 2 factors). We then have another prime 5 (with no factors) and then 6 (with 3 factors).
Therefore with respect to factors (up to 6) we have 2 factor points at 4 and another 3 factor points at 6.
So these factor points occur in a discontinuous manner due to the existence of primes (with no factors).
Thus the Riemann non-trivial zeros can be fruitfully seen as an attempt to smooth out these factor points so as to take account of the random nature of the primes.
Therefore though the overall sum of factors and non-trivial zeros will closely match each other (up to a given number) the factors will occur as discontinuous blocks of numbers (where order is separated from randomness) whereas the non-trivial zeros will occur as single points (where at each location order is closely balanced with randomness).
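To make the factor counting above concrete, here is a small Python sketch. I am assuming, as the examples in the text suggest (4 has 2 factors, 6 has 3, while primes have none), that a composite number contributes all of its divisors greater than 1 and that a prime contributes nothing; the running total then arrives in the discontinuous blocks just described.

```python
# A small sketch of the factor counts discussed above, assuming (as the
# examples 4 -> 2 and 6 -> 3 suggest) that each composite n contributes
# all of its divisors greater than 1, while a prime (or 1) contributes none.

def natural_factor_count(n):
    if n == 1:
        return 0
    divisors = [d for d in range(2, n + 1) if n % d == 0]
    return 0 if divisors == [n] else len(divisors)   # a prime's only such divisor is itself

running_total = 0
for n in range(1, 13):
    count = natural_factor_count(n)
    running_total += count
    print(n, count, running_total)   # the factor points arrive in discontinuous blocks
```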
Put another way, remarkably at the location of each Riemann zero, the primes and natural numbers closely approximate the situation where they can be seen to be identical with each other in a dynamic relative manner. And such identity equates to pure energy states (in both physical and psychological terms).
Expressed again in an alternative manner, at these points the cardinal and ordinal aspects of the number system, likewise approach full identity with each other (again in a dynamic relative manner).
Indeed this situation can be viewed even more clearly from the ordinal perspective.
So if we take for example the prime 3, this has 3 individual members (in a natural number fashion) i.e. 1st, 2nd and 3rd members respectively. Now remember that the various prime roots of 1 (except for the trivial case of 1) represent the Zeta 2 zeros!
Indirectly we can give these ordinal members an individual quantitative identity, through the 3 roots of 1 i.e. – .5 + .866...i, – .5 – .866...i and 1.
However the interdependent identity of these roots is expressed through their sum = 0.
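A quick numerical check of these two statements (individual quantitative identity through the 3 roots, collective interdependence through their zero sum) can be sketched as follows.

```python
# The 3 roots of 1: each root gives an ordinal member an indirect quantitative
# identity, while their sum (= 0) expresses the interdependence of the group.
import cmath

roots = [cmath.exp(2j * cmath.pi * k / 3) for k in range(3)]
for r in roots:
    print(round(r.real, 3), round(r.imag, 3))    # 1, -0.5 + 0.866i, -0.5 - 0.866i

print(abs(sum(roots)) < 1e-9)                    # True: the sum is (numerically) zero
```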
So the prime 3, expressed in this sense, is inseparable from its 3 natural number members in ordinal terms.
Thus each member can preserve a certain random individual identity, while remaining closely ordered with the remaining members of its group.
In this way the zeta zeros (Zeta 1 and Zeta 2) can approach perfect balance as between order and randomness, both with respect to the cardinal behaviour of the primes in terms of the number system as a whole and their corresponding ordinal behaviour in terms of each individual prime.
Tuesday, May 27, 2014
More on Randomness and Order (3)
In the last couple of entries I showed that from a dynamic interactive perspective - which is the appropriate way to view the nature of number - randomness and order are necessarily complementary aspects.
Thus, in cardinal terms, what is random from the individual perspective (i.e. each single prime) is highly ordered from the opposite collective perspective (i.e. the overall relationship of the primes to the natural numbers).
Then in a reverse ordinal manner, what is highly ordered from the individual perspective (i.e. each natural number member of a prime group) is random from the overall collective perspective (i.e. in the selection of primes).
So when we combine both reference frames (cardinal and ordinal) the notions of randomness and order lead directly to paradox.
Now once again, it is vital to appreciate a fundamental point relating to - what I term - the analytic and holistic aspects of mathematical appreciation.
Analytic appreciation in every context takes place within single isolated poles of reference. So the analytic aspect in this crucial respect is of a rational linear (i.e. 1-dimensional) nature.
Conventional Mathematics is then formally defined in such a linear analytic manner.
This leads to an unambiguous (dualistic) interpretation of relationships in a somewhat static absolute type fashion.
So for example the interpretations of randomness and order are unambiguously separated from this perspective.
What is random therefore is - by definition - not ordered and what is ordered is thereby not random!
However by contrast holistic appreciation takes place simultaneously as between multiple reference frames. So the holistic aspect - at a minimum - requires two opposite poles in - what I term - 2-dimensional appreciation.
By its very nature holistic appreciation is inherently of a dynamic interactive nature leading to a circular (paradoxical) form of interpretation.
Now all understanding - including of course mathematical - inherently entails both analytic (Type 1) and holistic (Type 2) appreciation.
However, quite remarkably, no formal recognition whatsoever is given to the holistic (Type 2) aspect in Conventional Mathematics. Such mathematics therefore, for all its admittedly great achievements, is fundamentally of a reduced - and thereby distorted - nature. Thus it is not suited to appropriate appreciation of the true nature of the number system in the two-way relationships of the primes and natural numbers.
I have illustrated countless times before the precise nature of both analytic and holistic type appreciation respectively in terms of the two turns at a crossroads. If one heads N and encounters a crossroads, a left (L) or right (R) turn has an unambiguous meaning; if, on passing through, one then turns around heading S and once more encounters the crossroads, again either a left (L) or right (R) turn has an unambiguous meaning.
This is because interpretation takes place in a linear analytic manner (within independent frames of reference).
However if one now considers the interdependence of both N and S, through attempting to view both polar reference frames simultaneously, then deep paradox results with respect to the interpretation of both turns at the crossroads. For what is left (L) when approached from the N direction is right (R) when approached from the opposite S direction; and what is right (R) when approached from the N direction is left (L) from the opposite S direction.
Now implicitly this recognition that L and R turns at a crossroads have a merely relative arbitrary validity, depending on polar context, is of a holistic (2-dimensional) nature.
Again, whereas in direct terms, analytic appreciation is of a rational nature, holistic appreciation is directly intuitive (though indirectly can be expressed in a rational fashion).
Now because Conventional Mathematics has no formal recognition of the holistic - as opposed to the analytic - aspect of interpretation, this inevitably means that it cannot deal with the key notion of interdependence (in any relevant context) except in a reduced manner.
So in the context of the present discussion, randomness and order are akin to the L and R turns at our crossroads.
So what is random from the cardinal perspective is ordered from the corresponding ordinal perspective; and what is ordered from the ordinal perspective is random from the corresponding cardinal perspective.
Thus, if we are to grasp the truly relative nature of randomness and order (and order and randomness) with respect to the behaviour of the primes, we must be able to appreciate in a truly holistic manner (where opposite frames of polar reference can be mutually embraced).
However, once again in formal terms, our very appreciation of number with respect to both its individual and collective aspects is of a strictly analytic nature (viewed within a merely quantitative frame of reference).
So true holistic appreciation requires the balanced recognition that all mathematical understanding entails both quantitative and qualitative type appreciation.
So we have the customary analytic mathematical understanding with respect to the number system (where quantitative is abstracted from qualitative interpretation).
Then we have the greatly unrecognised holistic aspect of mathematical understanding (where both quantitative and qualitative aspects are understood as dynamically interdependent).
The non-trivial zeros (Zeta 1 and Zeta 2) relate directly to this latter unrecognised holistic aspect of the number system (where both quantitative and qualitative aspects of understanding are interdependent).
So, quite obviously, the non-trivial zeros - which are an inseparable component of the number system - cannot be satisfactorily interpreted in the conventional analytic manner.
Sunday, May 25, 2014
More on Randomness and Order (2)
In the previous blog entry, I commented on the dynamic complementary nature of both randomness and order with respect to the number system.
So from one perspective (i.e. Type 1), the individual nature of cardinal primes reflects the extreme of random behaviour; however the collective nature of primes (with respect to the number system) reflects the opposite extreme of ordered behaviour.
Then when we look at prime behaviour from the corresponding (Type 2) ordinal perspective, these relationships are completely reversed.
For example, if we take the prime number 3, it thereby is composed individually of its 1st, 2nd and 3rd members respectively.
So - by definition - these individual members are thereby ordered in a natural number interdependent manner.
Thus the notion of 2nd therefore arises through its relationship with 1st and 3rd; likewise 3rd arises through its relationship with 1st and 2nd and finally, 1st arises through its relationship with 2nd and 3rd members respectively.
So the individual notions of 1st, 2nd and 3rd (as the 3 ordinal natural number members of 3) are thereby highly ordered in a relative manner.
However, once again this ordering cannot be absolute, for a certain arbitrary fixing of identity is always required in locating the 1st member of a group.
So again 3 as a prime group entails 3 individual members. Now, in principle, any of these 3 could be identified as the 1st member. So once again, in any relative context with respect to identifying 2nd and 3rd members respectively, the initial position with respect to the 1st member must be fixed!
Now this is similar to the cardinal approach, in the sense that 1 is itself excluded from the relationship of the primes with the natural numbers. So all other natural numbers (except 1) can be uniquely expressed as the combination of primes! Therefore though the number 1 is not specifically included, it underlies in a fundamental manner the nature of all other natural numbers.
So returning to the ordinal approach, the individual natural number members of a prime group are of a highly ordered nature.
Then in reverse, the combined collection of these prime groups lies at the opposite extreme, exhibiting the extreme of random behaviour.
What this means in effect is that in this context, the order in which we take the primes is of little consequence with the same basic conclusion applying to the ordinal behaviour of its natural number members.
In other words if one picked a prime at random from the collection of all primes then the behaviour of each of its individual natural number members lies at the extreme of a totally ordered identity (in relative terms).
However we are now left with a considerable dilemma.
Once again from the (Type 1) cardinal perspective, the individual primes represent a random extreme (with respect to the overall number system) whereas the collective behaviour of the primes represents the opposite ordered extreme (in relative terms); then from the (Type 2) ordinal perspective, the individual natural numbered members, comprising each prime, represent an ordered extreme, whereas the collection of all primes lie at the other extreme of random behaviour (in relative terms).
So both sets of results are fully paradoxical in terms of each other.
In other words, what is random with respect to the primes from the cardinal perspective (Type 1) is ordered from the alternative ordinal (Type 2) perspective; and what is random from the ordinal perspective (Type 2) is ordered from the alternative cardinal perspective (Type 1).
And it is vital to recognise that our actual appreciation of number is necessarily of a dynamic interactive nature combining both cardinal and ordinal type appreciation.
So, for example, the explicit recognition of 3 as a cardinal prime implicitly entails corresponding ordinal recognition of its constituent individual members in a natural number fashion; equally the explicit recognition of the 1st, 2nd, and 3rd members of 3, implicitly entails cardinal recognition of the number 3.
Now, in direct terms, cardinal and ordinal recognition are quantitative as to qualitative - entailing notions of independence and interdependence - respectively.
The huge limitation of the accepted conventional mathematical approach is that it attempts to abstract the quantitative from the qualitative aspect of appreciation. Thus it reduces an inherently dynamic interactive relationship as between two complementary aspects of number to a merely absolute type interpretation.
Thus the standard approach to viewing the relationship of the primes to the natural numbers is to treat both with respect to their mere cardinal identities.
This leads therefore to a substantial misinterpretation of the true relationship between both aspects.
In other words the ultimately mysterious relationship as between the primes and the natural numbers is the same mysterious relationship as between their cardinal and ordinal identities.
Indeed ultimately this points to the fundamental connection as between both the quantitative notions of independence and qualitative notions of interdependence that underlie the behaviour of the number system (and indeed everything in phenomenal creation).
Friday, May 23, 2014
More on Randomness and Order (1)
The key point to grasp with respect to understanding of the number system is that randomness and order are complementary notions, which can only be properly appreciated in a dynamic interactive manner.
Indeed randomness in this context implies non-order (i.e. lack of order), whereas from the opposite perspective, order implies non-randomness (i.e. lack of randomness).
Crucially therefore both notions imply each other and cannot be interpreted - except in reduced fashion - in the standard absolute manner that characterises the very paradigm of Conventional Mathematics.
Indeed in the deeper sense, both randomness and order imply the key notions of independence and interdependence respectively. And as I have repeatedly stated, this requires - in terms of a consistent mathematical approach - both analytic (Type 1) and holistic (Type 2) aspects of interpretation in equal balance.
As once again Conventional Mathematics is formally defined solely in terms of the analytic (Type 1) aspect, it is severely limited therefore in its capacity to unravel the true nature of these issues.
In effect, in its attempt to confine interpretation within the Type 1 aspect, it thereby adopts a distorted perspective (which robs these very notions of randomness and order of their true dynamic nature).
It can help to demonstrate the relative nature of randomness and order by initially looking again at the tossing of an unbiased coin.
Now with respect to each individual toss, the random nature of the outcome is in evidence.
Thus a high level of unpredictability attaches to the outcome (which can be either H or T).
However when we now look at the other polar extreme of the collective nature of a combined series of tosses, the ordered nature of the outcome now comes sharply into evidence.
Therefore as the number of tosses increases, without finite limit, the proportion of Heads and Tails recorded will approximate ever closer to 1/2.
Now, of course, in terms of finite events, we do not get a result of 1/2 in absolute terms. Rather the result is strictly relative and approximate, approaching ever closer to 1/2 (which however is not exact in a finite manner).
So the collective notion of order here is strictly of a relative approximate nature.
However, when we view the matter appropriately, this implies that the notion of randomness (attaching to each individual toss) is also necessarily of a relative approximate nature.
Now one might attempt to maintain that the probability of a H or T is exactly 1/2 for each separate trial.
However, strictly in any practical experimental context, it would be impossible to ensure conditions guaranteeing this absolute lack of bias. So some degree of bias, arising from the nature of the coin, weather conditions, manner of tossing etc. will inevitably arise (regardless of how small).
More importantly, the very notion of independence (randomness) with respect to each individual trial implies the corresponding notion of interdependence (order) with respect to the collective group of trials. So once again, though we cannot successfully predict the outcome of any individual trial, we can expect to predict to a high degree of accuracy the overall result attaching to a combined group of trials!
Thus we can sum up so far, by once more stating that both the random nature of each individual trial (in tossing a coin) and the ordered nature of the collective outcome (of a combined group of tosses) are strictly of a relative approximate nature. Randomness and order are thus complementary notions that can only be properly understood in a dynamic interactive manner.
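The point is easily simulated. The sketch below (a plain Python illustration, with an arbitrary seed for repeatability) shows each individual toss remaining unpredictable while the collective proportion of heads settles ever closer to 1/2 as the number of tosses grows, without ever being exactly 1/2 in any finite run.

```python
# Simulated coin tossing: individual outcomes are random, yet the collective
# proportion of heads approaches 1/2 in a merely relative, approximate manner.
import random

random.seed(1)                                   # arbitrary seed, for repeatability
for trials in (10, 100, 10_000, 1_000_000):
    heads = sum(random.randint(0, 1) for _ in range(trials))
    print(trials, heads / trials)
```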
Now the relationship of the cardinal primes to the natural numbers can be understood in a similar manner.
When we look at the individual nature of the primes, they are strongly characterised by their random nature.
This in turn corresponds with the view of the primes as the independent building blocks of the natural number system.
However quite clearly the primes are not random in any absolute sense. There are certain restrictions on this randomness. Apart from 2, no even number can be prime; then in the denary system, apart from 5, no other number ending in 5 can be prime!
Then when we look at the collective nature of the primes, they are now in complementary fashion strongly characterised by their ordered nature, exhibiting a stunning overall degree of regularity. However once again this order is of a relative rather than absolute nature. Thus, like the frequency of Heads and Tails in the tossing of coins, we can equally predict the frequency of primes (and by extension the composite natural numbers) to a high - though never absolutely total - degree of accuracy.
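This collective regularity is easy to see numerically. The following sketch compares the actual count of primes up to n with the simple estimate n/ln(n) given by the Prime Number Theorem; the ratio drifts towards 1 yet never equals it exactly for any finite n.

```python
# Counting primes up to n (simple sieve) against the Prime Number Theorem
# estimate n / ln(n): the agreement is ever closer, but only ever relative.
from math import log

def prime_count(n):
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return sum(sieve)

for n in (100, 10_000, 1_000_000):
    actual = prime_count(n)
    estimate = n / log(n)
    print(n, actual, round(estimate), round(actual / estimate, 4))
```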
Even when we allow for the refined modifications to such predictions using the Riemann zeros, the estimation of the number of primes (to a certain number) strictly remains of a relative - rather than absolute - nature. Now in theory one might attempt to maintain that, given infinite adjustments (through using the total set of zeros), our predictions would then be absolute. However by their very nature, we can only have access to a finite number of such zeros. Thus such total adjustment, by its very nature, is not possible.
So once again the notions of randomness and order in the cardinal number system are relative and approximate, based on complementary poles, which can only be properly appreciated in a dynamic interactive manner.
One of the great problems with the way that we view the natural number system in cardinal terms is that customarily it is viewed as being composed of individual numbers that are independent of each other.
Thus though it may well be accepted that the primes serve as the building blocks of the (composite) natural numbers, once they have been derived from the primes, they are then misleadingly treated as independent units in a similar manner.
So 1, 2, 3, 4,.... are all treated as independent number entities in all subsequent calculations. This then leads to the gravely mistaken impression that the number system itself exists in abstraction as some absolute system frozen in space and time.
This distorted tendency is then greatly accentuated through the very paradigm that defines conventional mathematical thinking, which is linear (1-dimensional) in nature (based on single isolated poles of reference).
Thus the very paradigm on which Conventional Mathematics is based is utterly unsuited to come to grips with the true dynamic interactive nature of the number system based on complementary poles of reference.
Thus internal and external constitute one key set of dynamic polarities. We cannot form for example the notion of a mathematical object such as a number in an external manner, without a corresponding mental perception of the number that is - relatively - of an internal nature.
So we cannot have objective truth in the absence of subjective interpretation. The very illusion of absolute truth in Mathematics itself reflects the mistaken belief that objective truth can exist in the absence of such subjective interpretation! So this mistaken belief itself reflects an important - though ultimately untenable - form of interpretation!
Individual (part) and collective (whole) aspects then constitute another key set of dynamic polarities. So we cannot form a notion of an individual number perception in the absence of the collective conceptual notion of number. From the opposite perspective, we cannot form the general concept of number in the absence of particular number perceptions. Thus number, with respect to its individual and collective aspects, is dynamic and interactive (based on complementary opposite poles).
However, Conventional Mathematics is once again based on the reduced notion that these two aspects can be abstracted from each other in a static absolute manner.
Though the natural number system - from the cardinal perspective - intimately depends on the primes, the natural numbers are then treated as exhibiting even more randomness than the primes.
Thus in a lottery where 100 people are given tickets numbered 1 - 100 respectively, if we draw 25 tickets (allowing replacement) each ticket has an equal chance of being chosen on each draw (though as we have seen this must strictly be interpreted in a relative, rather than absolute manner). So the tickets are randomly chosen in this manner.
However, the composite natural numbers depend on the unique order exhibited by the primes. So it seems somewhat paradoxical that we should then consider all such numbers in a merely random fashion.
This simply therefore reflects a reduced - and unsatisfactory - linear way of attempting to view the number system.
Quite clearly the randomness of the system cannot be understood in the absence of its corresponding order; likewise the order of the system cannot be understood in the absence of its corresponding randomness.
This notion of true interdependence with respect to the number system requires holistic appreciation of a circular kind (rather than analytic understanding in a linear manner).
Then in true appreciation of the - relatively - independent and interdependent aspects of the number system both analytic (Type 1) and holistic (Type 2) understanding are required.
Wednesday, May 21, 2014
New Perspective on Prime Number Theorem (6)
It is important to keep restating my central purpose on these blogs.
Once again I am demonstrating how mathematical relationships are inherently of a dynamic interactive nature entailing complementary opposite polarities.
In actual experience the key polarities then relate firstly to external (objective) and internal (subjective). We cannot for example have knowledge of a mathematical entity such as a number in an (objective) external manner without the corresponding mental perception of this number which - relatively - is of a (subjective) internal nature. So we cannot therefore in Mathematics have "objective" truth in the absence of corresponding "mental" interpretation (both of which dynamically interact in experience).
Therefore the standard mathematical approach of attempting to treat mathematical objects in an absolute abstract manner is ultimately untenable. Of course I will readily admit that, very much in the manner of Newtonian Physics, such an assumption can prove extremely valuable in approximating truth in a partial manner. In other words the notion of objective validity in an absolute type manner represents one very useful - though ultimately limiting - interpretation of mathematical truth.
Secondly, all experience (including of course mathematical) necessarily involves the dynamic interaction of the two fundamental poles of whole and part. This can be expressed alternatively as the interaction of quantitative and qualitative, individual and collective and perhaps most crucially of notions of independence and interdependence respectively.
Now, as it stands, the accepted mathematical enterprise is but of a very reduced nature where, in every context, qualitative notions of interdependence are reduced to independent, merely quantitative terms.
Admittedly huge progress has been made, though again necessarily in a limited manner, through its reduced quantitative assumptions.
However, it is doomed ultimately to failure with respect to understanding the key relationship as between the primes and the natural numbers.
So it will take an enormous paradigm shift, such as has never occurred before within Mathematics, to start dealing with this fundamental issue in an appropriate manner.
In fact as I have been repeatedly stating in these blog entries it requires an inherently dynamic interactive approach based on complementary poles of reference to properly understand the two-way relationship as between the primes and natural numbers.
This therefore requires both an analytic (quantitative) and holistic (qualitative) method of interpreting all mathematical variables. And it has to be said that, given the increasingly abstract nature of mathematical developments, the holistic manner of understanding - which in truth is every bit as important as the analytic - has been all but eliminated in formal terms from Mathematics.
The current obsession to prove the Riemann Hypothesis reflects very much the highly reduced quantitative bias of Mathematics. Here an attempt is made to treat both primes and natural numbers as independent entities in a merely quantitative cardinal manner.
However both the primes and natural numbers equally have an ordinal - as well as cardinal - identity. And whereas the cardinal identity directly focuses on such numbers as independent entities, the ordinal aspect relates by contrast directly to the qualitative notion of number as interdependent (where meaning is necessarily derived from their relationship with other numbers).
So, properly understood, the relationship of the primes with the natural numbers entails the two-way dynamic interaction within the number system of both quantitative notions of individual numbers as independent and qualitative notions of the collective interdependence of all numbers.
I have been illustrating in the past few blog entries this two-way relationship from yet another perspective.
Starting with the Type 1 perspective, which focuses directly on the quantitative nature of the number system, I have been at pains to show its hidden ordinal aspect.
So we start with the well known set of cardinal primes (which I refer to as Order 1 Primes).
However these implicitly contain a natural number basis (in ordinal terms) i.e. so that they can be ranked in order as 1st, 2nd, 3rd, 4th and so on.
By then switching to the prime rankings within this natural number ordinal set, we are then able to identify a new subset of primes in a cardinal manner (which I identify as Order 2 Primes).
And we can continue on interactively in this manner switching as between primes and natural numbers in cardinal and ordinal manner to identify an unlimited number of subsets of Higher Order Primes.
And I demonstrated how the Prime Number Theorem would then apply to each of these subsets.
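As I read this construction, the Order 2 Primes are simply those primes whose rank in the ordinary listing of the primes is itself prime, and each further order repeats the same filtering. The sketch below (with illustrative names of my own choosing, not established notation) generates the first few members of each order on that assumption.

```python
# A hedged sketch of the iterative construction described above: keep only
# those primes whose rank (1st, 2nd, 3rd, ...) is itself prime, then repeat.
# The names primes_up_to and next_order are illustrative only.

def primes_up_to(limit):
    sieve = [True] * (limit + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return [i for i, is_p in enumerate(sieve) if is_p]

def is_prime(k):
    return k > 1 and all(k % d for d in range(2, int(k ** 0.5) + 1))

def next_order(primes):
    # retain only those primes whose rank in the given list is itself prime
    return [p for rank, p in enumerate(primes, start=1) if is_prime(rank)]

order1 = primes_up_to(200)       # the ordinary (Order 1) primes
order2 = next_order(order1)      # 3, 5, 11, 17, 31, ...
order3 = next_order(order2)      # 5, 11, 31, ...
print(order2[:8])
print(order3[:5])
```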
So we have the strange paradox here that what we consider as cardinal primes implicitly can also be viewed as ordinal natural numbers. Thus the relationship as between the - ultimately unlimited - subsets of cardinal primes throughout the number system, can equally be viewed as the relationship between corresponding subsets of ordinal natural numbers!
We then viewed from the Type 2 perspective the much less recognised relationship as between the ratio of natural to prime factors per number (up to n).
Now in Type 1 terms the Prime Number Theorem relates to a quantitative notion of frequency i.e. the number of cardinal primes up to n, on the natural number scale.
However in Type 2 terms the Prime Number Theorem relates to a qualitative notion of frequency, in that we are comparing prime with natural number factors (which reflects an ordinal notion).
This is a crucially important point. However because of the great neglect of ordinal type notions - which are mistakenly assumed to be derived from cardinal - within Conventional Mathematics, the appropriate mathematical language does not even exist to discuss this issue coherently.
We have in fact two distinct notions of the ordinal. The first is within a prime group of numbers. So 3 is a cardinal prime; however the group of 3 necessarily contains 1st, 2nd and 3rd members in natural number ordinal terms. And the notion of addition connects these 3 members so that the group of 3 (in cardinal terms) = 1st + 2nd + 3rd members in an ordinal manner. So again, we see that the explicit notion of a cardinal prime implicitly entails natural number notions (in an ordinal manner).
However the 2nd - largely unrecognised - notion of ordinal relates to the unique combinations of prime numbers, through which all composite natural numbers are derived.
So when I express 6 (as a cardinal natural number) as the product of 2 * 3, I am strictly using the prime numbers 2 and 3 in an ordinal fashion. In other words a unique interdependence is established as between these two primes resulting in the composite number 6, which we then view as an independent cardinal number. And the notion of multiplication ordinally connects, in this case, the relationship between the primes.
So 6 in cardinal natural number terms = 2 * 3 (from an ordinal prime perspective).
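A trivial Python sketch of this unique composition of the composite natural numbers from the primes:

```python
# Each composite natural number arises as a unique combination (product) of
# primes, e.g. 6 = 2 * 3: a simple trial-division factorisation illustrates this.
def prime_factors(n):
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(prime_factors(6))     # [2, 3]
print(prime_factors(12))    # [2, 2, 3]
```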
Thus, when we look appropriately at the matter, a double paradox is revealed with respect to the number system.
From the Type 1 perspective, each (explicit) cardinal prime implicitly entails natural numbers in an ordinal fashion.
Then, from the complementary Type 2 perspective, each (explicit) cardinal natural number, implicitly entails primes in an ordinal fashion.
When one appreciates this two-way interaction appropriately, it then becomes obvious that the primes and natural numbers are ultimately totally interdependent with each other (in both cardinal and ordinal fashion).
Indeed it is only the phenomenal tendency to attempt to view relationships within isolated reference frames - especially pronounced within Mathematics - that creates the illusion of a causal relationship between them.
So Conventional Mathematics is still firmly stuck in this totally one-sided notion of the primes as the (cardinal) building blocks of the (cardinal) natural number system.
However when we properly include both Type 1 (cardinal) and Type 2 (ordinal) perspectives, which are quantitative and qualitative with respect to each other, the number system is understood in a dynamic interactive manner, with prime and natural number notions, and cardinal and ordinal notions, ultimately fully interdependent with each other in an a priori ineffable manner.
Of course, as I have been saying from the very beginning of these blog entries, this means that the attempt to prove or disprove the Riemann Hypothesis is strictly futile.
Quite simply, the Riemann Hypothesis by its very nature transcends the limits of the conventional mathematical approach. It points in fact to the ultimate condition necessary for the consistent reconciliation of both cardinal (quantitative) and ordinal (qualitative) notions within the number system. And clearly this cannot be achieved within a paradigm that does not (formally) recognise a distinct role for the qualitative. So the a priori consistency (as between quantitative and qualitative notions) to which the Riemann Hypothesis applies, is already necessarily assumed in the very use of the conventional mathematical axioms!
However there is a much bigger issue to be faced here than the role of the Riemann Hypothesis, important as it admittedly is! This is that the very paradigm on which Conventional Mathematics is built is crucially flawed.
Rather than representing all valid mathematical inquiry, it represents just a small - though admittedly very important - special case.
We are now in need - not alone of properly understanding the number system - but indeed all mathematical relationships - of an unparalleled paradigm shift to a truly dynamic interactive manner of appreciating mathematical relationships, that entails both quantitative (analytic) and qualitative (holistic) aspects of appreciation in equal balance.
Once again I am demonstrating how mathematical relationships are inherently of a dynamic interactive nature entailing complementary opposite polarities.
In actual experience the key polarities then relate firstly to external (objective) and internal (subjective). We cannot for example have knowledge of a mathematical entity such as a number in an (objective) external manner without the corresponding mental perception of this number which - relatively - is of a (subjective) internal nature. So we cannot therefore in Mathematics have "objective" truth in the absence of corresponding "mental" interpretation (both of which dynamically interact in experience).
Therefore the standard mathematical approach of attempting to treat mathematical objects in an absolute abstract manner is ultimately untenable. Of course I will readily admit that, very much in the manner of Newtonian Physics, such an assumption can prove extremely valuable in approximating truth in a partial manner. In other words the notion of objective validity in an absolute type manner represents one very useful - though ultimately limiting - interpretation of mathematical truth.
Secondly, all experience (including of course mathematical) necessarily involves the dynamic interaction of the two fundamental poles of whole and part. This can be expressed alternatively as the interaction of quantitative and qualitative, individual and collective and perhaps most crucially of notions of independence and interdependence respectively.
Now, as it stands, the accepted mathematical enterprise is of a very reduced nature where, in every context, qualitative notions of interdependence are reduced to independent, merely quantitative terms.
Admittedly huge progress has been made, though again necessarily in a limited manner, through its reduced quantitative assumptions.
However, it is doomed ultimately to failure with respect to understanding the key relationship as between the primes and the natural numbers.
So it will take an enormous paradigm shift, such as never has occurred before within Mathematics to start dealing with this fundamental issue in an appropriate manner.
In fact as I have been repeatedly stating in these blog entries it requires an inherently dynamic interactive approach based on complementary poles of reference to properly understand the two-way relationship as between the primes and natural numbers.
This therefore requires both an analytic (quantitative) and holistic (qualitative) method of interpreting all mathematical variables. And it has to be said that, given the increasingly abstract nature of mathematical developments, the holistic manner of understanding - which in truth is every bit as important as the analytic - has been all but eliminated in formal terms from Mathematics.
The current obsession with proving the Riemann Hypothesis reflects very much the highly reduced quantitative bias of Mathematics. Here an attempt is made to treat both primes and natural numbers as independent entities in a merely quantitative cardinal manner.
However both the primes and natural numbers equally have an ordinal - as well as cardinal - identity. And whereas the cardinal identity directly focuses on such numbers as independent entities, the ordinal aspect relates by contrast directly to the qualitative notion of number as interdependent (where meaning is necessarily derived from their relationship with other numbers).
So properly understanding the relationship of the primes with the natural numbers entails the two-way dynamic interaction within the number system of both quantitative notions of individual numbers as independent and qualitative notions of the collective interdependence of all numbers.
I have been illustrating in the past few blog entries this two-way relationship from yet another perspective.
Starting with the Type 1 perspective, which focuses directly on the quantitative nature of the number system, I have been at pains to show its hidden ordinal aspect.
So we start with the well known set of cardinal primes (which I refer to as Order 1 Primes).
However these implicitly contain a natural number basis (in ordinal terms) i.e. so that they can be ranked in order as 1st, 2nd, 3rd, 4th and so on.
By then switching to the prime rankings within this natural number ordinal set, we are able to identify a new subset of primes in a cardinal manner (which I identify as Order 2 Primes).
And we can continue on interactively in this manner switching as between primes and natural numbers in cardinal and ordinal manner to identify an unlimited number of subsets of Higher Order Primes.
And I demonstrated how the Prime Number Theorem would then apply to each of these subsets.
So we have the strange paradox here that what we consider as cardinal primes implicitly can also be viewed as ordinal natural numbers. Thus the relationship as between the - ultimately unlimited - subsets of cardinal primes throughout the number system, can equally be viewed as the relationship between corresponding subsets of ordinal natural numbers!
We then viewed, from the Type 2 perspective, the much less recognised ratio of natural to prime factors per number (up to n).
Now in Type 1 terms the Prime Number Theorem relates to a quantitative notion of frequency i.e. the number of cardinal primes up to n, on the natural number scale.
However in Type 2 terms the Prime Number Theorem relates to a qualitative notion of frequency, in that we are comparing prime with natural number factors (which reflects an ordinal notion).
This is a crucially important point. However because of the great neglect of ordinal type notions - which are mistakenly assumed to be derived from cardinal - within Conventional Mathematics, the appropriate mathematical language does not even exist to discuss this issue coherently.
We have in fact two distinct notions of the ordinal. The first is within a prime group of numbers. So 3 is a cardinal prime; however the group of 3 necessarily contains 1st, 2nd and 3rd members in natural number ordinal terms. And the notion of addition connects these 3 members so that the group of 3 (in cardinal terms) = 1st + 2nd + 3rd members in an ordinal manner. So again, we see that the explicit notion of a cardinal prime implicitly entails natural number notions (in an ordinal manner).
However the 2nd - largely unrecognised - notion of ordinal relates to the unique combinations of prime numbers, through which all composite natural numbers are derived.
So when I express 6 (as a cardinal natural number) as the product of 2 * 3, I am strictly using the prime numbers 2 and 3 in an ordinal fashion. In other words a unique interdependence is established as between these two primes resulting in the composite number 6, which we then view as an independent cardinal number. And the notion of multiplication ordinally connects, in this case, the relationship between the primes.
So 6 in cardinal natural number terms = 2 * 3 (from an ordinal prime perspective).
Thus, when we look appropriately at the matter, a double paradox is revealed with respect to the number system.
From the Type 1 perspective, each (explicit) cardinal prime implicitly entails natural numbers in an ordinal fashion.
Then, from the complementary Type 2 perspective, each (explicit) cardinal natural number, implicitly entails primes in an ordinal fashion.
When one appreciates this two-way interaction appropriately, it then becomes obvious that the primes and natural numbers are ultimately totally interdependent with each other (in both cardinal and ordinal fashion).
Indeed it is only the phenomenal tendency to attempt to view relationships within isolated reference frames - especially pronounced within Mathematics - that creates the illusion of a causal relationship between them.
So Conventional Mathematics is still firmly stuck in this totally one-sided notion of the primes as the (cardinal) building blocks of the (cardinal) natural number system.
However when we properly include both Type 1 (cardinal) and Type 2 (ordinal) perspectives, which are quantitative and qualitative with respect to each other, the number system is understood in a dynamic interactive manner, with prime and natural number notions, and cardinal and ordinal notions, ultimately fully interdependent with each other in an a priori ineffable manner.
Of course, as I have been saying from the very beginning of these blog entries, this means that the attempt to prove or disprove the Riemann Hypothesis is strictly futile.
Quite simply, the Riemann Hypothesis by its very nature transcends the limits of the conventional mathematical approach. It points in fact to the ultimate condition necessary for the consistent reconciliation of both cardinal (quantitative) and ordinal (qualitative) notions within the number system. And clearly this cannot be achieved within a paradigm that does not (formally) recognise a distinct role for the qualitative. So the a priori consistency (as between quantitative and qualitative notions) to which the Riemann Hypothesis applies, is already necessarily assumed in the very use of the conventional mathematical axioms!
However there is a much bigger issue to be faced here than the role of the Riemann Hypothesis, important as it admittedly is! This is that the very paradigm on which Conventional Mathematics is built is crucially flawed.
Rather than representing all valid mathematical inquiry, it represents just a small - though admittedly very important - special case.
We are now in need - not alone for properly understanding the number system, but indeed all mathematical relationships - of an unparalleled paradigm shift to a truly dynamic interactive manner of appreciation, entailing both quantitative (analytic) and qualitative (holistic) aspects in equal balance.
Tuesday, May 20, 2014
New Perspective on Prime Number Theorem (5)
In the last two blog entries I showed how, coming from the standard Type 1 perspective, the Prime Number Theorem can be indefinitely extended within the number system to an unlimited set of relationships involving Higher Order Primes.
So what are commonly referred to as the primes thereby in this context represent the 1st Order Primes. However 2nd Order, 3rd Order, ..., Nth Order Primes can be subsequently defined, all of which are bound by a corresponding Prime Number Theorem equivalent.
However, though more difficult to properly envisage, it is equally possible to view the Prime Number Theorem from the (largely unrecognised) Type 2 perspective, once again resulting in a potentially unlimited set of Prime Number Theorem equivalents.
Remember once again that the key formulation here relates to the average frequency of the natural factors of a number (up to n) divided by the corresponding average frequency of its (distinct) prime factors!
We suggested that this was given (for large n) by log n/log(log n).
Therefore by letting t1 = log n, the relationship could then be expressed as t1/log t1.
We also pointed out the fascinating connection here with the well known harmonic series (sum of the reciprocals of the natural numbers) and the corresponding prime series (of the sum of the reciprocals of the prime numbers).
Indeed n/log n could equally be expressed (for large n) as,
1 + 1 + 1 + 1 +....../(1 + 1/2 + 1/3 + 1/4 +.....)
t1/log t1 i.e. log n/log(log n) is then expressed as,
1 + 1/2 + 1/3 + 1/4 +......../(1/2 + 1/3 + 1/5 + 1/7 +.......)
So we are using the reciprocals of the 0th Order Primes (i.e. natural numbers) and 1st Order Primes (i.e. the regular primes) in this relationship.
However we could equally define a new (infinite) series based on the reciprocals of the 2nd Order Primes
i.e. 1/3 + 1/5 + 1/11 + 1/17 + .........
It is fascinating to speculate that the value of this sum would in turn be given by,
log{log(log n)} + C where C represents a constant.
Then for sufficiently large n, this would approximate to log{log(log n)}.
Then in turn we could define a new (infinite) series based on the reciprocals of the 3rd Order Primes i.e.
1/5 + 1/11 + 1/31 + 1/41 +....
Again it would be interesting to speculate that the value of this sum in turn would be given by:
log[log{log(log n)}] + D where D represents a constant.
Then for sufficiently large n, again the result would be approximated well as log[log{log(log n)}].
And we could continue on in this manner for 4th Order, 5th Order,....Nth Order Primes.
One interesting - and perhaps extremely surprising - conclusion, if these speculated formulas hold, is that the (infinite) series of reciprocals for all Higher Order Primes would ultimately be divergent in value.
Thus again, no matter how often we thin out the original series of prime numbers to derive Higher Order Primes, the (infinite) sum of the corresponding reciprocal series would ultimately diverge (albeit extremely slowly).
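These speculations are easy to explore numerically. The following sketch (a rough exploration only; the helper names primes_up_to and next_order are my own) computes the partial sums of the reciprocals of the 2nd and 3rd Order Primes up to a given limit and prints the corresponding iterated logarithms alongside. Since the constants C and D above are unspecified, and since partial sums up to any finite limit can never settle convergence or divergence on their own, the output should be read as illustration rather than evidence.

```python
import math

def primes_up_to(limit):
    """Sieve of Eratosthenes: all primes <= limit."""
    sieve = [True] * (limit + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = [False] * len(range(p * p, limit + 1, p))
    return [i for i, flag in enumerate(sieve) if flag]

def next_order(plist):
    """From one Order of Primes, keep the members whose 1-based rank is itself prime."""
    prime_ranks = set(primes_up_to(len(plist)))
    return [p for rank, p in enumerate(plist, start=1) if rank in prime_ranks]

n = 10**6
order2 = next_order(primes_up_to(n))   # 2nd Order Primes up to n: 3, 5, 11, 17, 31, ...
order3 = next_order(order2)            # 3rd Order Primes up to n: 5, 11, 31, 59, ...

print("2nd Order reciprocal sum:", sum(1.0 / p for p in order2))
print("log{log(log n)}         :", math.log(math.log(math.log(n))))
print("3rd Order reciprocal sum:", sum(1.0 / p for p in order3))
print("log[log{log(log n)}]    :", math.log(math.log(math.log(math.log(n)))))
```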
Now we could express the original ratio as between the average number of natural factors (up to n) contained by a number and the corresponding average number of prime factors as log n/ log(log n), which is replicated in terms of the ratio of the sum of the reciprocals of the natural numbers and the corresponding sum of reciprocals of the standard (1st Order) primes
i.e. 1 + 1/2 + 1/3 + 1/4 +......../(1/2 + 1/3 + 1/5 + 1/7 +.......).
This therefore suggests that we can extend this indefinitely in a Type 2 fashion throughout the number system.
So for example, if we confine ourselves to (distinct) prime factors that are 2nd Order Primes (i.e. 3, 5, 11, 17, 31,...) and also confine ourselves to the natural factors of numbers based on these primes, then the new ratio of natural to prime factors will be given as log(log n)/log{log(log n)}.
If we let t2 = log(log n) then this ratio could then be expressed as t2/log t2.
This ratio could then in turn be expressed as the ratio of the sum of reciprocals based on the 1st Order Primes divided by the corresponding sum of reciprocals based on 2nd Order Primes
i.e. 1/2 + 1/3 + 1/5 + 1/7 +....../(1/3 + 1/5 + 1/11 + 1/17 +.....).
We could then go on to get a new ratio of natural to prime factors based on consideration of 3rd Order Primes which would be approximated by log{log(log n)}/log[log{log(log n)}].
Then by letting t3 = log{log(log n)}, we could express this ratio as t3/log t3.
This result once again could then be expressed as the ratio of the sums of reciprocals of the 2nd Order and 3rd Order Primes,
i.e. 1/3 + 1/5 + 1/11 + 1/17 +..../(1/5 + 1/11 + 1/31 + 1/41 + .....).
And we could proceed indefinitely in this manner, in the general case expressing a ratio of natural to prime factors based on consideration of Nth Order Primes, which would be given in turn as tN/log tN.
And this result in turn could then be expressed as the ratio of the sums of reciprocals of the (N – 1)th and Nth Order Primes respectively.
Monday, May 19, 2014
New Perspective on Prime Number Theorem (4)
In yesterday's blog entry, I was approaching the relationship as between the primes and the natural numbers from the Type 1 perspective, where ultimately we view the primes and natural numbers in cardinal terms.
However what is vital to appreciate, in showing how the same fundamental relationship as between the primes and natural numbers is reiterated without limit throughout the number system through the definition of different order primes, is that (i) the cardinal notion of number necessarily implies the ordinal; (ii) the very notion of the primes necessarily implies the natural numbers.
So properly understood - which inherently requires holistic dynamic appreciation - both cardinal and ordinal notions are ultimately completely interdependent; likewise the notions of prime and natural numbers are ultimately completely interdependent with each other.
When one examines the approach I adopted in deriving 2nd Order, 3rd Order,....Nth Order Primes, it is clear that we keep switching as between both cardinal and ordinal notions and also as between prime and natural number notions.
So for example in moving from the 1st Order Primes i.e. 2, 3, 5, 7, 11, .... to the 2nd Order Primes 3, 5, 11, 17, 31,...., we initially rank the 1st Order Primes sequentially in natural number order (i.e. 1, 2, 3, 4, 5,...).
We have moved here therefore to natural number notions of an ordinal kind. Indeed even momentary reflection on the issue will indicate that the very ability to order the primes implicitly implies the natural numbers (in an ordinal manner). So this begs the key question of how the primes can possibly be unambiguously seen as the building blocks of the natural number system (in cardinal terms), when the very recognition of the primes already requires the natural numbers (in an ordinal manner).
Having ranked the 1st Order Primes sequentially in natural number fashion, we then choose from among these rankings, those corresponding sequentially with the primes (again in an ordinal manner).
So from our list of 1st Order Primes we choose the 2nd, 3rd, 5th, 7th, 11th,.... members on the list.
We then identify these ordinally derived members with the original cardinal primes on the list thus giving us the 2nd Order Primes 3, 5, 11, 17, 31,......
We then express the frequency of these cardinal primes in relation to the original list of primes (also expressed in a cardinal manner).
Therefore though the ultimate relationship is expressed as between both prime and natural numbers expressed in a cardinal manner, their very derivation entails switching as between both prime and natural number notions, and also as between cardinal and ordinal notions.
There is another aspect to all this that might not appear immediately obvious.
I have listed below the first 20 natural numbers and the corresponding list of primes (up to 20).
So we are expressing here the relationship of the 0th Order Primes (i.e. the natural numbers) and the 1st Order Primes (the full list of primes).
1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20
2, 3, 5, 7, 11, 13, 17, 19
Now in the 2nd case, I will provide a list of the first 20 1st Order Primes with the corresponding list of 2nd Order Primes.
2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47, 53, 59, 61, 67, 71
3, 5, 11, 17, 31, 41, 59, 67
Though in one sense obvious, we can point to a remarkable finding here, in that the relationship as between the 1st Order and 2nd Order Primes exactly replicates the relationship as between the starting list of the natural numbers (0th Order) and primes (1st Order).
Thus in the first 20 natural numbers we have a frequency of 8 primes. Likewise in the first 20 (1st Order) Primes, we have a frequency of 8 (2nd Order) Primes.
And no matter how large our starting value n, the frequency in both cases would be exactly the same.
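As a quick check of this replication, here is a minimal sketch (the function name primes_up_to is my own). It counts the primes among the first 20 natural numbers and the 2nd Order Primes among the first 20 (1st Order) Primes. By construction the two counts must agree for any cut-off, since a member of the prime list is selected precisely when its rank is one of the primes up to that cut-off.

```python
def primes_up_to(limit):
    """Sieve of Eratosthenes: all primes <= limit."""
    sieve = [True] * (limit + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = [False] * len(range(p * p, limit + 1, p))
    return [i for i, flag in enumerate(sieve) if flag]

N = 20
first_primes = primes_up_to(100)[:N]   # the first 20 (1st Order) Primes: 2, 3, ..., 71
prime_set = set(primes_up_to(N))       # the primes up to 20

primes_among_naturals = [k for k in range(1, N + 1) if k in prime_set]
second_order = [p for rank, p in enumerate(first_primes, start=1) if rank in prime_set]

print(primes_among_naturals)   # [2, 3, 5, 7, 11, 13, 17, 19]
print(second_order)            # [3, 5, 11, 17, 31, 41, 59, 67]
print(len(primes_among_naturals), len(second_order))   # 8 8 - the same frequency in both cases
```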
Thus there is not just one relationship governing the primes (1st Order) and natural numbers (0th Order).
We have in fact a never-ending iteration of subsequent relationships as between (N – 1)th Order and Nth Order primes where the same fundamental pattern is exactly replicated.
In fact, in all these cases the (N – 1)th Order Primes can be seen as playing the role of the natural numbers, replicating the original relationship of the primes with the natural numbers.
So in fact we have two ways in general of expressing all these relationships.
We can express each relationship - as I have been doing so far - as that between (N – 1)th Order and Nth Order Primes. So the first standard formulation (i.e. primes and natural numbers) therefore expresses the relationship as between the 1st Order and 0th Order Primes. So the natural numbers are here revealed as expressing a special form of primes!
We can express each relationship as between the Nth Order Primes and the Nth Order Natural Numbers. So now the primes are equally revealed as representing a special form of natural number!
So the first standard formulation (again of the primes and natural numbers) can equally be expressed as that between the 1st Order Primes and the 1st Order Natural Numbers.
Thus all subsequent formulations, from this perspective, entail a relationship as between both higher Order Primes and Natural Numbers.
Sunday, May 18, 2014
New Perspective on Prime Number Theorem (3)
We have seen how the average frequency of the natural factors of a number expressed relative to the corresponding average frequency of the prime factors (up to n) can be approximated as log n/log(log n).
Now it is well known that the sum of the terms of the harmonic series i.e. the sum of the reciprocals of the natural numbers (up to n) is given approximately by log n + γ, where γ is the Euler-Mascheroni constant (= .5772156649...).
Thus, 1/1 + 1/2 + 1/3 + 1/4 + ... + 1/n = log n + γ (approximately, with the approximation improving as n increases).
However for sufficiently large n as γ is constant, log n + γ approximates more simply to log n.
It is also known that the corresponding series entailing the sum of the reciprocals of the primes (up to n) is given approximately by log(log n) + B1, where B1 = Mertens constant (= .2614972128...).
So, 1/2 + 1/3 + 1/5 + 1/7 + ... (taken over the primes up to n) = log(log n) + B1 (approximately).
Again however, for sufficiently large n, as B1 is constant, log(log n) + B1 approximates to log(log n).
Therefore we have the delightful connection that the average frequency of the natural factors of a number expressed relative to the corresponding average frequency of its prime factors (up to n) is replicated as the ratio of the sum of the reciprocals of the natural numbers to the corresponding sum of the reciprocals of the primes (up to n).
This would also provide a fascinating new way of expressing the prime number theorem.
So, once again, if we let t = log n, then log n/log(log n) = t/log t.
Therefore t/log t can be expressed as the ratio of the two series (up to n) entailing the sum of the reciprocals of the natural numbers and primes respectively.
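For illustration, here is a minimal sketch (standard library only; primes_up_to is my own helper) that computes both sums up to n = 10^6, compares them with log n + γ and log(log n) + B1, and then compares their ratio with t/log t. Note that the ratio comparison is only rough at this range, since the constants γ and B1 are not yet negligible.

```python
import math

def primes_up_to(limit):
    """Sieve of Eratosthenes: all primes <= limit."""
    sieve = [True] * (limit + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = [False] * len(range(p * p, limit + 1, p))
    return [i for i, flag in enumerate(sieve) if flag]

n = 10**6
gamma = 0.5772156649   # Euler-Mascheroni constant, as quoted above
B1 = 0.2614972128      # Mertens constant, as quoted above

harmonic = sum(1.0 / k for k in range(1, n + 1))
prime_recip = sum(1.0 / p for p in primes_up_to(n))

print(harmonic, math.log(n) + gamma)                                 # harmonic sum vs log n + γ
print(prime_recip, math.log(math.log(n)) + B1)                       # prime reciprocal sum vs log(log n) + B1
print(harmonic / prime_recip, math.log(n) / math.log(math.log(n)))   # their ratio vs t/log t
```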
This result can be extended indefinitely.
I have talked before of 1st Order, 2nd Order, 3rd Order,....Nth Order Primes.
The 1st Order Primes are - what we directly recognise as - the prime numbers, which (up to 100) are 2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47, 53, 59, 61, 67, 71, 73, 79, 83, 89 and 97.
To obtain the 2nd Order Primes, we give these primes natural number ordinal rankings and then choose those cardinal primes whose rankings are themselves prime.
So for example with respect to the 1st 5 primes, we rank these in the following manner (with cardinal primes in 1st row and ordinal rankings in 2nd).
2, 3, 5, 7, 11
1, 2, 3, 4, 5
We now choose the numbers that refer to the prime rankings (i.e. 2, 3 and 5)
So 3, 5 and 11 are now the three primes, from the original 5, belonging to the 2nd Order Primes.
Now if we list 2nd Order Primes to 100 we get 3, 5, 11, 17, 31, 41, 59, 67 and 83.
Now again, we can give these 2nd Order primes (in ascending order) natural number ordinal rankings as indicated below:
3, 5, 11, 17, 31, 41, 59, 67, 83
1, 2, 3, 4, 5, 6, 7, 8, 9
Once more to obtain the 3rd Order Primes, we choose the numbers corresponding to the prime rankings
i.e. 5, 11, 31 and 59
We then give these 3rd Order Primes a natural number ordinal ranking before picking those corresponding to (ordinal) prime rankings as our new set of 4th Order Primes, i.e.
5, 11, 31, 59
1, 2, 3, 4
So the 4th Order Primes (up to n = 100) are 11 and 31.
Again we rank these in natural number ordinal terms as 1 and 2 respectively
11, 31
1, 2
Then by choosing the cardinal number corresponding to the (ordinal) prime number ranking 2, we obtain the only 5th Order Prime (up to 100) i.e. 31.
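The whole construction above can be condensed into a short sketch (the function names primes_up_to and next_order are my own): starting from the 1st Order Primes up to 100, we repeatedly keep just those members whose ordinal rank is itself prime, which reproduces the lists given above.

```python
def primes_up_to(limit):
    """Sieve of Eratosthenes: all primes <= limit."""
    sieve = [True] * (limit + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = [False] * len(range(p * p, limit + 1, p))
    return [i for i, flag in enumerate(sieve) if flag]

def next_order(plist):
    """From one Order of Primes, keep the members whose 1-based rank is itself prime."""
    prime_ranks = set(primes_up_to(len(plist)))
    return [p for rank, p in enumerate(plist, start=1) if rank in prime_ranks]

order = primes_up_to(100)   # 1st Order Primes up to 100
for k in range(2, 6):
    order = next_order(order)
    print(k, order)
# 2 [3, 5, 11, 17, 31, 41, 59, 67, 83]
# 3 [5, 11, 31, 59]
# 4 [11, 31]
# 5 [31]
```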
However since the value of n is unlimited in finite terms, the number of different Orders of Primes is likewise unlimited in finite terms.
Thus the point I am making is that the order among the natural numbers and primes (i.e. natural numbers and 1st Order Primes) is continually replicated throughout the number system, as between 1st Order and 2nd Order Primes, 2nd Order and 3rd Order Primes, 3rd Order and 4th Order, ..., (N – 1)th Order and Nth Order Primes (where N ultimately has no limit in finite terms).
The frequency of 1st Order Primes (up to n) is, as we have seen, approximated by n/log n.
Now if we let n1 = n/log n, then this latter relationship (of the frequency of 2nd Order among the 1st Order Primes) can be expressed as n1/log n1.
In turn, if we now let n2 = n1/log n1, then the frequency of 3rd Order among the 2nd Order Primes can be expressed as n2/log n2.
Then if we let n3 = n2/log n2, then the frequency of 4th Order among the 3rd Order Primes can be expressed in turn as n3/log n3.
Ultimately, letting nN = n(N – 1)/log n(N – 1), the frequency of Nth Order among the (N – 1)th Order Primes can be expressed in turn as nN/log nN.
In fact, in terms of the above approach, the natural numbers can be expressed as 0th Order Primes!
So rather than the Prime Number Theorem being confined to just one relationship i.e. the frequency of 1st Order among the 0th Order Primes (primes among the natural numbers), we have in fact potentially an unlimited number of Prime Number Theorems continually reiterated throughout the number system i.e. for the frequency of 2nd Order among the 1st Order, the frequency of 3rd Order among the 2nd Order, ..., to finally the frequency of Nth Order among the (N – 1)th Order Primes.
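As a rough numerical check of this family of Prime Number Theorem equivalents (a sketch only, with my own helper names), the following counts the kth Order Primes up to n and prints the iterated n/log n estimate alongside. As with the ordinary Prime Number Theorem, the agreement is only approximate at any finite n and improves very slowly as n grows.

```python
import math

def primes_up_to(limit):
    """Sieve of Eratosthenes: all primes <= limit."""
    sieve = [True] * (limit + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = [False] * len(range(p * p, limit + 1, p))
    return [i for i, flag in enumerate(sieve) if flag]

def next_order(plist):
    """From one Order of Primes, keep the members whose 1-based rank is itself prime."""
    prime_ranks = set(primes_up_to(len(plist)))
    return [p for rank, p in enumerate(plist, start=1) if rank in prime_ranks]

n = 10**6
order = primes_up_to(n)      # 1st Order Primes up to n
estimate = n / math.log(n)   # n1 = n/log n
for k in range(1, 5):
    print(k, len(order), round(estimate))   # actual count of kth Order Primes vs iterated estimate
    order = next_order(order)
    estimate = estimate / math.log(estimate)
```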
Saturday, May 17, 2014
New Perspective on Prime Number Theorem (2)
In my last blog entry I indicated that the prime number theorem can be viewed in two complementary ways (that dynamically interact with each other).
From the standard (Type 1) perspective it relates to the frequency of the primes with respect to the natural numbers.
However from the corresponding - largely unrecognised (Type 2) perspective - it relates to the average frequency of the natural factors of a number with respect to its prime factors.
Once again what is important to understand here is the truly complementary nature of both aspects (where the relationship as between the primes and natural numbers is directly inverted).
So again from the Type 1 perspective, numbers are treated in an independent quantitative manner. What this implicitly implies is that all numbers are defined with respect to the default 1st dimension.
Thus any number, n, in this approach is defined more fully as n^1. Therefore 2 is 2^1, 3 is 3^1, 4 is 4^1 and so on.
From this perspective log n measures the (estimated) average spread (or gap) as between each prime (up to the number n).
This means that the intervening numbers between the primes are all composite. So log n (strictly log n – 1) therefore represents an unbroken sequence of composite natural numbers!
Then n/log n, measures the (estimated) frequency of the primes among the natural numbers.
It is also important to recognise the additive nature of the relationship in that the frequency of primes (up to n) combined with the remaining frequency among composite natural numbers is connected through addition (with the sum = n).
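These Type 1 statements are easy to check numerically. The short sketch below (standard library only; primes_up_to is my own helper) measures the average gap between successive primes up to n and the frequency of primes, and confirms the additive split of the natural numbers into primes, composites and the unit 1.

```python
import math

def primes_up_to(limit):
    """Sieve of Eratosthenes: all primes <= limit."""
    sieve = [True] * (limit + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = [False] * len(range(p * p, limit + 1, p))
    return [i for i, flag in enumerate(sieve) if flag]

n = 10**6
primes = primes_up_to(n)

avg_gap = (primes[-1] - primes[0]) / (len(primes) - 1)   # average spread between successive primes
print(avg_gap, math.log(n))                              # roughly log n

print(len(primes), n / math.log(n))                      # frequency of primes vs n/log n

composites = n - len(primes) - 1                         # the number 1 is neither prime nor composite
print(len(primes) + composites + 1 == n)                 # the additive relationship: the parts sum to n
```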
However from the Type 2 perspective, numbers are treated - relatively - in an interdependent qualitative manner. What this implies is that all numbers relate directly to a dimension (power or exponent) that is defined with respect to the default base quantity of 1.
Thus any number in this approach is defined as 1^n. Therefore 2 is now 1^2, 3 is 1^3, 4 is 1^4 and so on.
The distinct prime factors can be seen here as representing the corresponding dimensional power of the number (with which the Type 2 perspective is directly concerned).
So using the same example as in yesterday's entry the number 12 has two distinct prime factors i.e. 2 and 3.
This therefore constitutes 2 possible dimensions, which could be geometrically illustrated as the sides of a rectangle (resulting from multiplying 2 by 3, i.e. 2 * 3).
However 12, as we have seen (according to my manner of definition) has 5 natural factors i.e. 2, 3, 4, 6 and 12.
Thus through multiplication of these factors 5 possible dimensions result (geometrically represented by a 5-dimensional box with sides 2, 3, 4, 6 and 12 respectively).
Thus both the prime and natural factors give rise directly here from the Type 2 perspective to the notion of number as representing dimensional powers; however, as we have seen, with the (standard) Type 1 aspect both prime and natural numbers are treated directly as number quantities (defined with respect to the default dimension of 1).
And what is vital to recognise is that, in dynamic interactive terms, the Type 1 and Type 2 aspects are quantitative and qualitative (with respect to each other).
Therefore whereas the Type 1 aspect relates directly to notions of number independence (suited to addition), the Type 2 aspect relates directly to the notion of number interdependence (in the dimensional changes brought about through the multiplication of differing numbers).
Whereas from the Type 1 perspective, log n relates to an unbroken sequence of (composite) natural numbers, in Type 2 terms it relates in complementary fashion directly to the number of prime factors of a number.
And whereas from the Type 1 perspective, n/log n relates to the frequency of primes among the natural numbers (which is connected to the naturals through an additive relationship), in a complementary Type 2 manner, n/log n relates to the frequency of the natural factors (with respect to the prime factors), with the relationship between both in this case being multiplicative. So again n/log n in this case measures the number by which we multiply the (average) frequency of the prime factors to obtain the corresponding (average) frequency of the natural factors of a number.
Finally whereas in Type 1 terms we move from the individual numbers to consideration of relationships with respect to the overall collective set of numbers, in Type 2 terms we move in reverse fashion from the overall set of numbers to consideration of the results for the (average) individual number.
So this illustrates well how, with respect to the dynamic interactive nature of the number system, we have the two-way relationship of the part with the whole and the whole with the part respectively.
So we have in fact two complementary perspectives on the prime number theorem which in dynamic interactive terms are the direct opposite of each other.
Again in the (standard) Type 1 approach, it is customary to view the primes as the basic (quantitative) building blocks of the natural number system.
However from the Type 2 perspective, this is directly reversed with the natural numbers in a (qualitative) ordinal manner seen as the building blocks of each prime number.
So once again if we look at the number 6, for example, in Type 1 terms this natural number externally represents the quantitative product (in cardinal terms) of its two prime factors (i.e. 2 and 3).
Thus we treat each cardinal number here as an indivisible whole unit!
However, when we look internally, as it were at the composition of both prime numbers, we find that they necessarily already comprise a sequence of natural numbers in a qualitative ordinal manner.
Thus 2 necessarily consists of a 1st and 2nd member, whereas the number 3 necessarily consists of a 1st, 2nd and 3rd member!
Therefore we cannot explicitly even refer to a prime externally in cardinal terms, without implicit recognition of its natural number composition in an ordinal manner.
Of course from the opposite perspective, we cannot explicitly refer to the ordinal natural number composition of a prime, without implicit recognition of its prime identity in cardinal terms!
Thus in dynamic interactive terms both the cardinal and ordinal aspects of number are ultimately fully interdependent with each other (in an ineffable manner).
It is important to remember that the quantitative also has a qualitative aspect (when viewed from an opposite perspective) and vice versa the qualitative a quantitative aspect.
Thus the Type 2 perspective I offer here of the prime number theorem is presented now in a quantitative type manner.
However it is vital to keep remembering that in dynamic interactive terms, complementary (opposite) poles are always quantitative as to qualitative (and qualitative as to quantitative) with each other.
Thus the importance of the relationship between the primes and natural numbers, when seen from this perspective, is that they serve as the crucial means through which both the quantitative and qualitative aspects of the number system - and indeed ultimately all created phenomena - are communicated with each other.
Friday, May 16, 2014
New Perspective on Prime Number Theorem (1)
The notion of the factors of a number can be defined in different ways.
For example if we take the number 12, it can be uniquely expressed in terms of its prime constituents as 2 * 2 * 3.
However one of these factors (i.e. 2) repeats. Therefore in terms of distinct primes, 12 entails just two building blocks (i.e. 2 and 3).
However 12 has several more factors when we consider all those numbers with which it can be evenly divided.
Now clearly, 12 can be divided by 1. However as all numbers by definition can be divided by 1, we will exclude this from consideration as a trivial factor.
12 can also be divided by 2, 3, 4, 6 and 12. Now again we could query the inclusion of 12 itself, as every integer - by definition - includes itself as a factor. However 12 can perhaps be validly considered as somewhat less trivial a factor than 1 (as not all integers are divisible by 12).
So in this sense, in which I define it, 12 has 5 factors.
Finally, where primes themselves are concerned, we simply ignore their factors. Therefore for example, though 7 is a factor of 7, because the number is prime, we do not count it.
So we are now left with two ways of defining factors.
Once again we have the prime factors, relating to the distinct prime building blocks of a number.
Then we have - what I define as - the natural factors which are defined with respect to the composite numbers to include all factors (other than 1).
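In code, the two notions of factor used here can be stated directly (a small sketch; the function names are my own): natural factors means every divisor other than the trivial factor 1, and prime factors means the distinct prime divisors.

```python
def natural_factors(n):
    """All divisors of n other than the trivial factor 1 (the definition used in these entries)."""
    return [d for d in range(2, n + 1) if n % d == 0]

def distinct_prime_factors(n):
    """The distinct prime building blocks of n."""
    factors, d = set(), 2
    while d * d <= n:
        while n % d == 0:
            factors.add(d)
            n //= d
        d += 1
    if n > 1:
        factors.add(n)
    return sorted(factors)

print(natural_factors(12))          # [2, 3, 4, 6, 12] - the five natural factors
print(distinct_prime_factors(12))   # [2, 3] - the two distinct prime factors
```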
The important question then arises as to the relationship as between the average number of natural and prime factors contained by a number.
There is a well known theorem i.e. the Hardy-Ramanujan Theorem, which states that on average the number of (distinct) prime factors contained by a number can be given as log(log n), with the accuracy of this estimate increasing for large n.
Then recently in my own researches, I came up with an estimate that on average the number of natural factors contained by a number can be given as log n – 1.
However as for very large n, log n – 1 would closely approximate log n, we can give this result more simply as log n.
So if we let log n = t, we can now say that on average, for large n, the number of natural factors contained by a number = t.
We can also then say that on average, therefore the number of prime factors contained by the same number = log t.
So the ratio of natural to prime factors is given by t/log t.
So for example, in the region of the number system where each number on average contains 1000 natural factors, we would expect it (again on average) to contain nearly 7 (6.9) prime factors. Thus we would expect over 140 times more natural than prime factors per number in this region of the number system.
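As a rough numerical check of these two averages (a sketch of mine, not part of the original argument), one can tally both kinds of factor up to some limit and compare with log n – 1 and log(log n); the agreement is loose at small limits and tightens only slowly. The last lines redo the 1000-factor illustration as t/log t:

```python
import math

N = 10000   # any modest limit will do; agreement improves only slowly as N grows

# Sieve the primes up to N.
is_prime = [True] * (N + 1)
is_prime[0] = is_prime[1] = False
for p in range(2, int(N ** 0.5) + 1):
    if is_prime[p]:
        for m in range(p * p, N + 1, p):
            is_prime[m] = False

# nat[k]: divisors of k other than 1 (set to 0 for primes, per the convention above).
# omega[k]: distinct prime factors of k (the Hardy-Ramanujan count).
nat = [0] * (N + 1)
omega = [0] * (N + 1)
for d in range(2, N + 1):
    for m in range(d, N + 1, d):
        nat[m] += 1
        if is_prime[d]:
            omega[m] += 1
for k in range(2, N + 1):
    if is_prime[k]:
        nat[k] = 0

print(sum(nat) / N, math.log(N) - 1)            # average natural factors vs log n - 1
print(sum(omega) / N, math.log(math.log(N)))    # average prime factors vs log(log n)

# The illustration above: where numbers average t = 1000 natural factors,
# the expected prime-factor count is log t and the ratio is t/log t.
t = 1000
print(math.log(t), t / math.log(t))             # ~6.9 and ~145
```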
Therefore what we have concluded is that not only does the prime number theorem apply to the distribution of the primes among the natural numbers, but equally it applies to the distribution of the (distinct) prime factors among all the natural factors of a number.
So once again we have distinguished two complementary aspects to the distribution of primes (among the natural numbers).
Again in Type 1 terms we have the well-recognised distribution of the primes among the natural numbers (in cardinal terms).
However, in Type 2 terms, we have the largely unrecognised distribution of the (distinct) prime factors among the natural factors of a number.
Once again the Type 1 approach is - literally - 1-dimensional in nature, where all numbers are considered in reduced fashion as lying on the same number line.
So here, strictly speaking we view relationships in a merely quantitative manner (as befits notions of number independence).
However, the Type 2 approach is multidimensional in nature where numbers represent differing dimensions against a fixed quantitative base (as befits notions of number interdependence).
But in dynamic interactive terms, the Type 1 and Type 2 aspects are fully complementary with each other.
Thus we cannot properly view quantitative (Type 1) notions of number independence in the absence of corresponding qualitative (Type 2) notions of number interdependence; likewise we cannot properly view qualitative (Type 2) notions of number interdependence in the absence of corresponding quantitative (Type 1) notions of number independence.
Thus the distribution of primes among the natural numbers, for both numbers and factors, occurs simultaneously in the coincidence of quantitative and qualitative aspects.
Thus the ultimate relationship between the primes and natural numbers (and natural numbers and primes) in both Type 1 and Type 2 terms is determined in a purely synchronous ineffable manner.
Thursday, May 15, 2014
Interesting Observations
I have previously commented on the important complementary link as between the simple formula for estimating the frequency of primes up to n, i.e. n/(log n – 1), and the corresponding simple formula for estimating the frequency of the factors of composite numbers up to n, i.e. n(log n – 1).
The product of the two formulae is therefore n².
This means that if we multiply the frequency of primes by the corresponding frequency of factors of composite numbers (up to n), the result will closely approximate n².
Alternatively, if we know the frequency of primes up to a given number, then we can use this result to estimate the corresponding total number of factors (of the composites). Equally, we can use the total number of factors to estimate the corresponding frequency of the primes.
For example, we know that there are 25 primes up to 100. So n² = 10,000, with the corresponding frequency of factors of composite numbers estimated at 10,000/25 = 400.
In fact as we have seen the actual number of such factors (up to 100) = 357.
So this approach already gives a reasonably accurate estimate (which improves in relative terms for higher n ultimately approaching 100% accuracy).
Alternatively we can use knowledge of the total number of factors to estimate the corresponding frequency of primes.
So the estimate of frequency of primes (up to 100) = 10,000/357 = 28 (which is just 3 more than the correct result).
An even more striking expression of this result can be given with respect to the average frequency of primes and factors of composites up to a given number.
So quite simply, when we multiply the average frequency of primes (up to a given number) by the corresponding average frequency of factors of the composites (to the same number), the result will closely approximate 1 (with the accuracy increasing towards 100% in relative terms at higher n).
The average frequency of primes up to 100 = 25/100 = .25 and the average frequency of factors = 357/100 = 3.57.
The product of these two numbers = .8925 (which is already reasonably close to 1).
Now at 200, the average frequency of primes = 46/200 = .23 and the average frequency of factors = 852/200 = 4.26. So the product of both = .23 * 4.26 = .9798.
So we can see that the relative accuracy has greatly increased in moving from 100 to 200 and is already very close to 1!
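Both checks can be reproduced directly. Here is a minimal sketch (the helper name is mine) that counts the primes and the composite factors up to 100 and 200, then forms the n² cross-estimates and the product of the two averages:

```python
import math

def counts_up_to(n):
    """Count the primes up to n and the factors of the composites up to n
    (a composite's factors being all its divisors other than 1)."""
    is_prime = [True] * (n + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            for m in range(p * p, n + 1, p):
                is_prime[m] = False
    primes = sum(is_prime)
    factors = sum(1 for d in range(2, n + 1)
                    for m in range(d, n + 1, d) if not is_prime[m])
    return primes, factors

for n in (100, 200):
    p, f = counts_up_to(n)
    print(n, p, f)                 # figures quoted above: 25, 357 and 46, 852
    print(n * n / p, n * n / f)    # cross-estimates: ~400 and ~28 at n = 100
    print((p / n) * (f / n))       # product of the two averages -> close to 1
```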
There is another very interesting point that can be made here!
As we have seen, up to n = 100, the average frequency of primes is 1 in 4. This means that the average unbroken sequence of composite numbers (between primes) = 3.
So the average number of factors is closely related to this average unbroken sequence of composite numbers. In fact we just add 1 to the unbroken sequence.
We know for example that there are 46 primes up to 200. So the average is 1 in every 4.35 numbers (approx).
This means that the average unbroken sequence of composite numbers (between each prime) = 3.35.
So the estimate for average number of factors per number (for n = 200) = 3.35 + 1 = 4.35.
And as we have seen the actual number of factors (up to 200) = 852.
This gives an average for each number of 852/200 = 4.26 (which compares very well with our estimate of 4.35).
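The arithmetic of this rule of thumb, as a quick sketch using only the counts already quoted above (no recounting here):

```python
# Figures quoted above: 25 primes / 357 factors up to 100, 46 primes / 852 factors up to 200.
for n, primes, factors in [(100, 25, 357), (200, 46, 852)]:
    avg_gap = n / primes             # 1 prime in every ~4 (n=100) or ~4.35 (n=200) numbers
    composite_run = avg_gap - 1      # average unbroken run of composites between primes
    estimate = composite_run + 1     # the rule of thumb for factors per number
    actual = factors / n             # 3.57 and 4.26 respectively
    print(n, round(estimate, 2), round(actual, 2))
```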
I will make just one more observation at this point.
As we have seen, the (accumulated) sum of the factors of the composite numbers up to n, on the real scale, is very closely related to the corresponding frequency of the Riemann (Zeta 1) zeros up to t, on the imaginary scale (where n = t/2π).
Therefore we can equally say that the frequency of primes (up to n) multiplied by the corresponding frequency of zeros (up to t) approximates very closely to n².
Thus, once more, we can use the frequency of primes to estimate the corresponding frequency of (non-trivial) zeros or alternatively use the frequency of the zeros to estimate the frequency of the primes.
Again, as we have seen the frequency of primes to 200 = 46.
Therefore we can estimate the corresponding frequency of zeros to 1256.63... (i.e. 200 * 2π) as 200²/46 = 40,000/46 = 870 (to the nearest integer).
As the actual number of zeros to 1256.63... = 861, we can see already the relative closeness of the two results!
Alternatively by making use of the actual existence of 861 zeros to 1256.63..., we can estimate the frequency of primes to 200 as 40,000/861 = 46 (to nearest integer) which in fact is the exact result in this case!
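That cross-check can be written out directly; the zero count of 861 is the figure quoted above, and the final line adds, purely for comparison, the standard Riemann-von Mangoldt approximation for the number of zeros up to t (not part of the original argument):

```python
import math

n = 200
primes_to_n = 46                   # primes up to 200
t = n * 2 * math.pi                # 1256.63..., the matching height on the imaginary scale
zeros_to_t = 861                   # non-trivial zeros up to t, as quoted above

print(round(n * n / primes_to_n))  # ~870: zero count estimated from the primes
print(round(n * n / zeros_to_t))   # ~46: prime count estimated from the zeros

# For comparison only: the standard Riemann-von Mangoldt approximation
# N(t) ~ (t/2π)·log(t/2π) - t/2π + 7/8, which gives ~860.5 here.
print(t / (2 * math.pi) * math.log(t / (2 * math.pi)) - t / (2 * math.pi) + 7 / 8)
```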
Unexpected Link
In yesterday's blog entry, I showed a close and fascinating link as between the accumulated sum of the Riemann (Zeta 1) zeros and a corresponding sum relating to the aggregate of the factors of the composite nos.
I also demonstrated how the simple formula n(n + 1)(log n – 1)/2 can be used to estimate both sums.
Having completed this entry, I then considered another aggregate sum entailing the Riemann zeros.
Here we multiply each prime in ascending sequence by the matching entry from the list of (non-trivial) zeros (likewise arranged in ascending sequence).
So the 1st prime number is multiplied by the 1st Riemann zero, the 2nd prime by the 2nd Riemann zero, and so on.
Here "up to n" relates to the value of the primes in question.
So to illustrate this new aggregate up to 10, we find the sum (2 * 14.13) + (3 * 21.02) + (5 * 25.01) + (7 * 30.42) = 429.31, with the zeros expressed correct to 2 decimal places.
Once again, because the non-trivial zeros are estimated with respect to the imaginary scale t (where n = t/2π), to convert to n we divide 429.31 by 2π, giving 68.33 (or 68 rounded to the nearest integer).
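The same small calculation as a sketch, with the first four non-trivial zeros quoted to 2 decimal places as above:

```python
import math

primes = [2, 3, 5, 7]                    # the primes up to 10
zeros = [14.13, 21.02, 25.01, 30.42]     # first four Riemann (Zeta 1) zeros, to 2 d.p.

aggregate = sum(p * z for p, z in zip(primes, zeros))
print(aggregate)                          # 429.31
print(aggregate / (2 * math.pi))          # ~68.33, i.e. rescaled from the t to the n scale
```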
In the table below I show these accumulated totals (in col. 2).
Up to n | Agg. of primes * by zeros = (2) | Accumulated sum of zeros = (3) | (2)/(2/π) = (4) | (4)/(3) as %
10   | 68    | 91    | 107   | 117.58
20   | 228   | 493   | 359   | 72.82
30   | 634   | 1234  | 996   | 80.71
40   | 1228  | 2677  | 1928  | 72.02
50   | 2518  | 4221  | 3956  | 93.72
60   | 3736  | 6370  | 5869  | 92.14
70   | 5244  | 8767  | 8236  | 93.94
80   | 8079  | 12200 | 12691 | 104.02
90   | 10437 | 15858 | 16394 | 103.38
100  | 11808 | 20133 | 18548 | 92.13
110  | 15060 | 24958 | 23656 | 94.78
Then in col. 3 I show again the total accumulated sum of zeros (appropriately rescaled to n) as dealt with in yesterday's entry.
In fact, there is an unexpected link as between the two aggregates (in cols 2 and 3 respectively).
If we divide the total in col. 2 by 2/π (or alternatively multiply by π/2), we obtain a new set of figures (in col. 4) which bears close comparison with the previous aggregate measure for the Riemann zeros (in yesterday's entry).
The link between the two sets of figures (2/π) is not accidental. We have already seen how this shows up as a very important measurement with respect to the Zeta 2 zeros (with the average reduced value for both the cos and sin parts of all the roots of 1 converging to 2/π).
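That 2/π value can be checked numerically. Assuming that the "reduced value" refers to the absolute value of each cos and sin part, a quick sketch shows both averages approaching 2/π ≈ 0.6366 as n grows:

```python
import math

def avg_abs_parts(n):
    """Average absolute value of the cos and sin parts of the n roots of 1."""
    cos_avg = sum(abs(math.cos(2 * math.pi * k / n)) for k in range(n)) / n
    sin_avg = sum(abs(math.sin(2 * math.pi * k / n)) for k in range(n)) / n
    return cos_avg, sin_avg

for n in (12, 120, 1200):
    print(n, avg_abs_parts(n), 2 / math.pi)   # both averages approach 2/π ≈ 0.6366
```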
In the final column (col. 5), I show the measurement in col. 4 as a percentage of that in col. 3.
The formula for the new estimate in col. 2 (i.e. the aggregate of each prime multiplied by the corresponding non-trivial zero) is given as {n(n + 1)(log n – 1)}/π.
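As a rough check of this formula against the table (a sketch of mine; the col. 2 figures are taken from two of the rows above, and the fit is still loose at these small values of n):

```python
import math

def col2_estimate(n):
    """{n(n + 1)(log n - 1)}/π, the suggested fit for the col. 2 aggregate."""
    return n * (n + 1) * (math.log(n) - 1) / math.pi

for n, actual in [(50, 2518), (100, 11808)]:   # two sample rows from the table above
    print(n, round(col2_estimate(n)), actual)  # ~2364 vs 2518 and ~11590 vs 11808
```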