Examination of a multiple-universe version of the partition function of the universe, based upon Penrose's cyclic conformal cosmology, leading to uniform values of Planck's constant \hbar and invariant physical laws in each universe of the 'multiverse'

Starting with Shankar's treatment of the partition function, we insert the modified Heisenberg uncertainty principle so as to give the inflaton a role in the formation of a partition function of the universe. The end result is that, even with the existence of a multiverse, i.e. simultaneous universes, physical laws are uniform throughout the multiple universes. Key words: cyclic conformal cosmology (Penrose), modified HUP.


What is unique about our application of a modification of the Penrose cyclic cosmology theory?
We review the modification of the Penrose cyclic conformal cosmology paradigm given in [1] and include it in a partition function given in [2], as a way to bring in the modified Heisenberg uncertainty principle [3] and thereby ascertain the role of the inflaton, which we write up using Padmanabhan's reference [4].
Modifying the HUP and including the inflaton in the partition function will then lead, after we include the results from [2], to a way to discuss how to get a uniform value for Planck's constant, \hbar, in each per-cycle creation of new universes.
This is contrary to the supposition given by Susskind and others [5] of up to 10^100 universes, with only, say, 10^6 of them surviving due to sufficiently 'robust' cosmological values for stable physical law. The end result is that we would have instead a 'multiverse' which is dynamic and stable over time. And so we review our present modification of the Penrose cyclic conformal cosmology model to take into account multiple universes.

2. Extending Penrose's suggestion of cyclic universes, black hole evaporation, and the embedding structure our universe is contained within

This multiverse embeds BHs and may resolve what appears to be an impossible dichotomy, so that minimum information is conserved between a set of partition functions per universe. In what follows we use the notation of [1], while noting that there is a subsequent alteration of the notation used for partition functions.
However, there is non-uniqueness of information put into each partition function \Xi_i. Hawking radiation from the black holes is collated via a strange-attractor collection in the mega-universe structure to form a new big bang for each of the N universes represented by \Xi_i. Verification of this mega-structure compression and expansion of information, with a non-uniqueness of information placed in each of the N universes, favors ergodic-mixing treatments of initial values for each of the N universes expanding from a singularity beginning. For the entropy we use Ng's infinite quantum statistics (Ng, 2008), with entropy S \approx n [14]. How to tie in this energy expression, as in Eq. (1), will be to look at the formation of a nontrivial gravitational measure as a new big bang for each of the N universes. Each of the energies E_i identified with Eq. (2) above is iterated for N universes (Penrose, 2006) [14]. Then the following holds, namely, from [1]:

Claim 1. For N universes, with \Xi_j|_{before nucleation regime} for j = 1 to N being the partition function of each universe just before the blend into the right-hand side of Eq. (3) above for our present universe, each of the independent universes given by \Xi_j|_{before nucleation regime} is constructed by the absorption of one to ten million black holes taking in energy [1] [14].

Furthermore, the main point is similar to what was done in [18] in terms of general ergodic mixing. What is done in Claim 1 and Claim 2 is to come up with a protocol as to how a multi-dimensional representation of black hole physics enables continual mixing of spacetime [16], largely as a way to avoid the Anthropic principle as a preferred set of initial conditions. How can a graviton with a wavelength 10^-4 the size of the universe interact, spatially, with a Kerr black hole? Embedding the BH in a multiverse setting may be the only way out.
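The ergodic-mixing protocol of Claim 1 can be sketched numerically. The following is a minimal illustration, not the paper's actual computation: we take N toy universes, use Ng's counting S ~ n to assign each one a log-partition-function value, and form the uniform average that stands in for the mixing of Eq. (3). The functional form ln(Xi_i) ~ n_i and all sample values are assumptions for illustration only.

```python
import random

def toy_log_partition(n_count: float) -> float:
    # Ng's infinite quantum statistics gives entropy S ~ n; here we simply
    # take ln(Xi_i) ~ n_i as an illustrative, assumed functional form.
    return n_count

def ergodic_average(n_counts) -> float:
    # Stand-in for the weighted averaging of Eq. (3): a uniform average
    # of ln(Xi_i) over the N universes being mixed.
    return sum(toy_log_partition(n) for n in n_counts) / len(n_counts)

random.seed(0)
N = 1000  # number of universes being mixed
# Invented entropy counts n_i, narrowly spread around a common value:
n_counts = [random.gauss(100.0, 5.0) for _ in range(N)]
avg = ergodic_average(n_counts)
print(f"averaged ln(Xi) over {N} universes: {avg:.2f}")
```

The point of the sketch is only that mixing many nearly-identical inputs drives the averaged partition data toward a single cycle-to-cycle value, which is the qualitative claim of the text.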
Claim 1 is particularly important. The idea here is to use what is known as CCC cosmology, which can be thought of as follows.
First, have a big bang (initial expansion) for the universe. After redshift z = 10, within the first billion years after the big bang, SMBH formation starts. Matter-energy is vacuumed up by the SMBHs, which at a much later date than the present era gather up all the matter-energy of the universe and recycle it in a cyclic conformal translation. Here T_{ab} is the source for the gravitational field, \rho is the mass-energy density, g_{ab} is the gravitational metric, and the vacuum energy is rescaled with c_1 a constant. The main methodology in the Penrose proposal has been, in Eq. (6), evaluating a change in the metric g_{ab} by a conformal mapping \hat{\Omega} to

\hat{g}_{ab} = \hat{\Omega}^2 g_{ab}    (7)

Penrose's suggestion has been to utilize the following [1]:

\hat{\Omega} \to \hat{\Omega}^{-1}    (8)

Infall into cosmic black holes has been the main mechanism which the author asserts would be useful for the recycling apparent in Eq. (8). Eq. (9) is to be generalized by a weighted averaging as given by Eq. (3), where the averaging is collated over perhaps thousands of universes, call that number N, with an ergodic mixing of all these universes; the ergodic mixing represented by Eq. (3) generalizes Eq. (9) from cycle to cycle.
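The conformal step above can be restated compactly, together with the one standard check that matters for CCC, namely that a conformal rescaling leaves null cones, and hence the causal structure carried across the crossover, unchanged:

```latex
\hat{g}_{ab} = \hat{\Omega}^{2}\, g_{ab},
\qquad
\hat{\Omega} \;\to\; \hat{\Omega}^{-1} \ \text{at the crossover.}
% Null directions are preserved: for any null vector k^{a},
\hat{g}_{ab}\, k^{a} k^{b} \;=\; \hat{\Omega}^{2}\, g_{ab}\, k^{a} k^{b} \;=\; 0 .
```

This preservation of null structure is what allows information from one aeon to be handed to the next even though local scales are rescaled.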

Now for the mixing being put in, and Birkhoff's ergodic theorem
We will afterwards work out the particulars of the partition function. But before that, we treat the 'mixing' of inputs into the partition function of the universe, i.e. an elaboration on Eq. (3) above. To do this, first look at Birkhoff's ergodic theorem, from [7]. In the end, we need a way to establish the bona fides of Eq. (9), and the averaging of both Eq. (10) and Eq. (4) above needs to be put into a consistent general treatment for a cycle-to-cycle invariant of cosmological creation.
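For reference, the standard textbook statement of Birkhoff's pointwise ergodic theorem, which is what the averaging here appeals to (stated in its usual form, not in the notation of [7]):

```latex
\text{Let } T : X \to X \text{ preserve a probability measure } \mu
\text{ and let } f \in L^{1}(\mu). \text{ Then for } \mu\text{-a.e. } x \in X,
\lim_{n \to \infty} \frac{1}{n} \sum_{k=0}^{n-1} f\!\left(T^{k} x\right) = \bar{f}(x),
\qquad
\text{and if } T \text{ is ergodic, } \ \bar{f} = \int_{X} f \, d\mu .
```

In the present setting, the time average along cycles of cosmological creation is being identified with an ensemble average over the N universes, which is exactly the equality the ergodic hypothesis supplies.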
To do this, we also refer to the generalized treatment from [ ]. Having said that, the remaining constraint is to come up with a suitably averaged value of the partition function in the above work. Our averaging will eventually have to be reconciled with the Birkhoff ergodic mixing theorem.

How to average out Planck's constant, \hbar, in the partition function given in Eq. (11)
We begin with Shankar's treatment of the partition function [9]. This is the baseline of the constraint which will make Planck's constant, \hbar, a constant per universe-creation cycle, as given by Eq. (9); i.e., Eq. (9) is confirmed by Eq. (18). We will next go to how this ties into Eq. (10) above, via how the averaging affects the choice of inputs into Eq. (18). Doing this will allow investigation of how to falsify the Birkhoff ergodic mixing theorem, as mentioned next.
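Since the explicit form of Shankar's Eq. (18) is not reproduced above, the way a partition function depends on \hbar can be illustrated with the simplest textbook case, the one-dimensional harmonic oscillator. This is an illustrative stand-in, not the partition function actually used in Eq. (18):

```python
import math

def z_oscillator(beta: float, hbar: float, omega: float, n_max: int = 2000) -> float:
    """Canonical partition function Z = sum_n exp(-beta*hbar*omega*(n + 1/2))
    for a 1-D harmonic oscillator, truncated at n_max levels."""
    return sum(math.exp(-beta * hbar * omega * (n + 0.5)) for n in range(n_max))

def z_closed_form(beta: float, hbar: float, omega: float) -> float:
    # Summing the geometric series gives Z = 1 / (2 sinh(beta*hbar*omega/2)).
    return 1.0 / (2.0 * math.sinh(0.5 * beta * hbar * omega))

# Z depends on hbar only through the product beta*hbar*omega, so any shift
# in hbar between cycles would be directly visible in thermodynamic averages
# derived from Z; demanding an invariant Z per cycle constrains hbar.
z_num = z_oscillator(beta=1.0, hbar=1.0, omega=1.0)
z_exact = z_closed_form(beta=1.0, hbar=1.0, omega=1.0)
print(z_num, z_exact)
```

The closed form makes the constraint explicit: fixing the averaged partition function cycle to cycle fixes \beta \hbar \omega, which is the sense in which an averaged, invariant Z pins down \hbar.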

Applying the Birkhoff ergodic averaging of Eq. (10) to the inputs into Eq. (18)
To do this, we specifically look at the wavelength \lambda, applying [7] to \lambda. One over the wavelength is proportional to frequency, so we represent the wavelength with its invariance set in stone. Here we are assuming the formulation is as follows, with N the number of recycled "universes": the averaging by the Birkhoff theorem implies that there is a critical invariance, and this invariance should be linked to the diameter of a nonsingular bounce point. A nonsingular bounce, i.e. the beginning of an expansion of a new universe, is the main point of [11]. Furthermore, if we apply the insights of [12,13], we obtain Eq. (19). We will try to show at a later date that these quantities are invariant per cycle, but the upshot is that if there is a natural fit to Eq. (19), and if \lambda is fixed as an invariant per cycle, given by Eq. (19), then the invariance of \hbar per cycle is maintained.

Conclusion
Implications of the invariance of \hbar: uniform physical laws per universe, and not the 10^1000 created universes with only, say, 10^10 surviving through a cosmic cycle.
In a word, this demolishes the program of the cosmic landscape of string theory [5], and gives credence to the possibility of an invariant multiverse which would not be collapsing.
If this is confirmed experimentally, it will do much to reduce what has at times been a postmodern fragmentation of basic physics inquiry, and to give physics a uniform set of laws, regardless of whether there are many worlds or just one, i.e. one universe or many universes. It will also allow investigation of the information-theory approach of [14] to event horizons and early-universe cosmology. How [14] could influence a choice of partition functions is given in Reference A below.

6. Acknowledgements
This work is supported in part by National Natural Science Foundation of China grant No. 11375279.