Local chaos
Posted: April 27th, 2013, 12:55 pm
by Allinone
I have read that quantum mechanics is chaotic, in that every event is random, but any series of events is predictable within the laws of probability. What I don't really understand is how the function of a system can be local, or how interaction between particles and events can be restrained or determined by the speed of light. Surely all particles must be connected in some instantaneous way in order for the laws of probability to apply to quantum events; wouldn't there need to be communication in order for events to conform to probability?
So if I imagine a scenario in which I throw a die 60 times, I can expect to get roughly ten of each number. I know it isn't exact, but as I throw the die an increasing number of times, the margins of error become smaller. So by the time we reach 6,000,000 throws, the counts of each result will be relatively consistent. If we then take 6,000,000 dice and give them to 6,000,000 people, who reside in various locations, the results will be the same. Does this not imply that random events are not local, but inherently connected?
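A quick simulation makes this convergence concrete. This is only a minimal sketch (Python, using the 60 and 6,000,000 throw counts from the post; the function name is illustrative):

    import random
    from collections import Counter

    def face_frequencies(n_rolls):
        # Roll a fair die n_rolls times; return each face's relative frequency.
        counts = Counter(random.randint(1, 6) for _ in range(n_rolls))
        return {face: counts[face] / n_rolls for face in range(1, 7)}

    print(face_frequencies(60))         # ragged: frequencies wander far from 1/6
    print(face_frequencies(6_000_000))  # each face lands very close to 0.1667

Note that splitting the 6,000,000 throws among 6,000,000 separated throwers changes nothing in the code or the outcome: each call to random.randint is generated independently of every other.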
Re: Local chaos
Posted: April 27th, 2013, 2:18 pm
by A Poster He or I
Actually, sub-atomic particles do have the capacity to "beat the odds" of probability once they become entangled. This is the territory of Bell's Theorem, and "quantum entanglement" (that is, the ability of entangled quanta to violate Bell's Inequality) has been empirically demonstrated since 1972, confirming the kind of instantaneous interconnection that you have pondered. However, no reputable physicist that I'm aware of considers this communication. Rather, the consensus is that meaningful communication is not possible via entanglement. In short, there are no means to distinguish communicated information from randomness on the receiving end of the communication.
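For concreteness, the version of the inequality most experiments test is the CHSH form of Bell's inequality. Writing E(a,b) for the correlation between measurement outcomes at detector settings a and b, any local hidden-variable theory requires

    S = E(a,b) - E(a,b') + E(a',b) + E(a',b'),    |S| ≤ 2

while quantum mechanics predicts values of |S| up to 2√2 ≈ 2.83 for suitably entangled pairs, and that is what the experiments observe.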
In my opinion, the philosophical upshot of this is that communication entails transmission and therefore requires space-time. Given a "back-drop" of space-time (or in other words, observing in a macroscopic frame of reference), inherent interconnection per se does not obviate the full character of randomness as we experience it. So it has no practical effect, nor can it be leveraged for any practical purpose that simple statistical odds don't already accomplish.
At the quantum scale, however, the situation is clearly different. This difference can yield practical effects in principle; indeed, quantum computers will rely on those effects once they're invented. But we must acknowledge that quantum entanglement isn't happening "in" space-time as we currently understand space-time (namely, Newtonian or Relativistic space-time allows only local connections between events, but entanglement demonstrates non-local connections). Entanglement implicates our understanding of space-time as incomplete. But it does not necessarily imply instantaneous communication; rather what is being demonstrated is simultaneity. Is simultaneity communication? No, not if the "receiver" (a clearly arbitrary designation given a simultaneous scenario) can't distinguish it from statistical randomness.
Re: Local chaos
Posted: April 27th, 2013, 3:05 pm
by Creative
A Poster He or I wrote: Rather, the consensus is that meaningful communication is not possible via entanglement. In short, there are no means to distinguish communicated information from randomness on the receiving end of the communication. ...
Entanglement implicates our understanding of space-time as incomplete. But it does not necessarily imply instantaneous communication; rather what is being demonstrated is simultaneity. Is simultaneity communication? No, not if the "receiver" (a clearly arbitrary designation given a simultaneous scenario) can't distinguish it from statistical randomness.
There are two questions that deserve further examination:
1) The nature of time (or duration). There are two versions to be explored and compared: a) that which we feel as the passing of time within us, a heterogeneous, indivisible flow of memory from the past into the present, and b) that which we use to compare the simultaneity of external events and view as some spatial passing. Bergson wrote on and critiqued these two notions, while de Broglie went further (and suggested that Bergson should have gone further) by suggesting that even space should be construed as an internal feeling of flow as duration unfolds into the present.
2) The nature of entanglement and communication. A species could very well "communicate" a change or a new understanding via non-local effects at a very deep level (what Bohm calls the Implicate Order or what Rupert Sheldrake calls the Morphic Resonance Field) without there being some overt language or perceptual communication. Communication may need to be refined or redefined to include possible subtle non-local effects.
Re: Local chaos
Posted: April 27th, 2013, 5:47 pm
by Allinone
A Poster He or I, so does Bell's theorem suggest that all particles are thus entangled? If energy, and the particles that represent it, were at one stage part of a singularity, then wouldn't that mean all particles, and quantum events, behave relative to each other's behaviour? Is it not in fact necessary that events are entangled on the level of quantum events? I understand this communication doesn't occur in or through the medium of space, but rather through the realm that energy resides in prior to the creation of space. I think of space as a quality, the same as mass, but on an underlying level I can only imagine that all energy, or that which is the potential for events to occur, is a singular whole. Measurable events are possible by virtue of the qualities that the whole is able to exhibit, but due to its fundamentally singular nature, all events are connected by their nature of being manifestations of a single potential.
Re: Local chaos
Posted: April 27th, 2013, 6:28 pm
by A Poster He or I
so does Bell's theorem suggest that all particles are thus entangled? If energy, and the particles that represent it, were at one stage part of a singularity, then wouldn't that mean all particles, and quantum events, behave relative to each other's behaviour? Is it not in fact necessary that events are entangled on the level of quantum events?
Bell's Theorem per se does not speculate so, nor does it even posit, predict or prove entanglement. Entanglement was a prediction of quantum mechanics (specifically, Schroedinger wave mechanics back in the 1920s). Bell merely provided an empirically verifiable metric whereby we could see if entanglement really happens as QM predicted. And it does. Now that QM's prediction of entanglement is considered fact, popular interpretations of entanglement suggest that (given a Big Bang origin for the universe) the original singularity must have been a single quantum event, sharing the same wave function. Once entangled, always entangled, says Schroedinger wave mechanics. Therefore, all of space-time remains entangled in a single, vast, incomprehensibly complex wave function.
Incomprehensibly complex is the key term to appreciate here. What it means is that in PRACTICAL terms, entanglement (outside of highly engineered experimental situations) is FAR too subtle a phenomenon to be leveraged by existing technology or to be scientifically considered as any basis for such common occurrences as serendipity, coincidence, trans-cultural parallelism, soul-mates, etc. Even in the laboratory, as soon as two entangled quanta stop interacting, they quickly lose their phase coherence (the actual manifestation of entanglement per wave mechanics) as they begin interacting with the environment, inheriting phase information from every quantum encountered while propagating their own phase info into the jumble.
I understand this communication doesn't occur in or through the medium of space, but rather through the realm that energy resides in prior to the creation of space. I think of space as a quality, the same as mass, but on an underlying level I can only imagine that all energy, or that which is the potential for events to occur, is a singular whole. Measurable events are possible by virtue of the qualities that the whole is able to exhibit, but due to its fundamentally singular nature, all events are connected by their nature of being manifestations of a single potential.
Yes, this is the essence of Bohm's Implicate Order, for example. Just remember that such speculative theorizing currently has no scientific basis, given its untestability. Broadly speaking, holistic phenomena remain outside of science's ability to model, given that scientific methodology is reductionistic. Only complexity theory has made inroads into holistic modeling, but it is a slow-to-develop paradigm since it is mostly dependent upon ever more powerful computing.
Re: Local chaos
Posted: April 29th, 2013, 3:32 pm
by Skakos
It is interesting to note that we ourselves define the "limits" of what we call an "event". So what is "random" and what is not depends heavily on us. We could call six rolls of the dice in the same location one event, or six rolls of the dice in six different locations one event. Whether or not these are "interconnected" is really an interesting question...
Re: Local chaos
Posted: April 29th, 2013, 6:03 pm
by Allinone
A Poster He or I, I can understand that the entanglement, or relationship between quantum events, would be vastly complex, but does the complexity not fade when we deal in statistical probabilities? For example, when multiple beams of light are focused, there will remain a degree of scattering in all the beams; the number of particles will be random to a degree, but as the number of beams measured increases, the overall numbers of stray photons will be close to an average predicted by probability. If this were not the case, and events were not in some way determined by other chance events, then it's hard to see how any of the fundamental forces are consistently produced.
Skakos, I too find this an interesting idea. If we had 60 million people on 60 million planets around the galaxy, and they all had one throw of the die, then by knowing the locations of all the throws that resulted in 1, 2, 3, 4, and 5, we could predict with a high degree of accuracy where the throws resulted in a 6. At least this is what I believe.
Re: Local chaos
Posted: April 30th, 2013, 2:57 pm
by A Poster He or I
...I can understand that the entanglement, or relationship between quantum events, would be vastly complex, but does the complexity not fade when we deal in statistical probabilities? For example, when multiple beams of light are focused, there will remain a degree of scattering in all the beams; the number of particles will be random to a degree, but as the number of beams measured increases, the overall numbers of stray photons will be close to an average predicted by probability. If this were not the case, and events were not in some way determined by other chance events, then it's hard to see how any of the fundamental forces are consistently produced.
Well, as I see it, you're just describing the contrast between the sub-atomic and macroscopic perspectives of reality. Complexity does not fade as more variables are measured and assigned values. Rather, complexity begins to display emergent patterning relative to the macroscopically-biased schemata interpreting the measurements. This is unavoidable, since measurement entails interaction with macroscopic devices at the measuring end, and macroscopic reality is a byproduct of cognitive patterning. The PATTERNS display greater simplicity in themselves relative to their complexly interacting components, but complexity overall (that is, at the level of the components) is greater than ever. This is the essence of emergent properties as complexity theory attempts to model them.
Re: Local chaos
Posted: May 1st, 2013, 2:29 am
by MazerRackhem
Greetings. I think there are a couple of aspects of the language that need to be defined before we can get too much further. In his original post Allinone talked of quantum mechanics being random and chaotic. These are not the same thing. Quantum phenomena are random, but not chaotic. The orbits of three massive objects subject to mutual gravitational attraction, on the other hand, are chaotic, but not random.
A chaotic system is one characterized by extreme sensitivity to initial conditions. Such systems often exhibit a unique forward limit set referred to as a chaotic attractor, which is most often a fractal set in N-dimensional phase space.
A random event, by contrast, is one which is fundamentally described by a probability distribution; that is, an event which is not deterministic.
In short, chaotic systems as classically defined are not random but deterministic, and by definition no truly random event can be chaotic since true randomness cannot be tied to initial conditions.
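A minimal sketch may help fix the distinction (Python; the logistic map used here is a standard textbook example of deterministic chaos, not something from this thread):

    def logistic_orbit(x, steps, r=4.0):
        # Iterate the deterministic logistic map x -> r*x*(1-x).
        for _ in range(steps):
            x = r * x * (1 - x)
        return x

    # Two starting points differing by one part in ten billion:
    a, b = 0.2, 0.2 + 1e-10
    for steps in (10, 20, 30, 40):
        print(steps, logistic_orbit(a, steps), logistic_orbit(b, steps))
    # By roughly 40 steps the two orbits have completely decorrelated:
    # extreme sensitivity to initial conditions, with no randomness anywhere.

Every run prints exactly the same numbers; the system is perfectly deterministic, merely unpredictable in practice.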
A roll of the die may be taken as random, though given enough information its outcome may be deterministic. We may accept that the result of the toss of a fair die can be taken a priori as probabilistic, since the conditions of the air currents applied during the fall are sufficiently complex to make all outcomes equally likely for the throw of any given die whose path is not deterministically worked out ahead of time. I bring this point up only because I will continue to use dice in my post, though we could substitute some truly random phenomenon in their place, say the decay of a cesium atom.
As Allinone describes above, the outcome of any roll of the die is random, but the aggregate results of 6 million tosses are not. This is a result of the law of large numbers, however, and does not display an interconnectedness between the events. The argument to the contrary is often referred to as "the Gambler's Fallacy," which results from assuming that the results of one toss have some effect on the results of a future toss; this is not the case. In another post Allinone discussed predicting where the 6s were tossed if we knew where all the 1-5s were tossed. This is of course true, because if we know where all of the 1-5 tosses are, the only other thing the remaining players could have tossed were 6s! But I think he was aiming for something a bit deeper.
It has been shown that our minds often fall into a trap of "observed probabilities" which do not bear out in reality. This comes from the post hoc view of events. If a person rolls a die he knows to be fair many times and does not get a 6, he begins to rate the likelihood of a 6 on the next roll higher; he begins to feel that a 6 is "overdue" and thus more likely. However, even if you have not rolled a 6 in 20 consecutive rolls, the chance of a 6 on the 21st roll is 1/6, exactly the same as on the first roll. The fallacy comes from our post hoc view of events. It is highly unlikely that one will not roll a 6 for 20 rolls (only about 2.6%) and even less likely that one will not roll one for 21 rolls (about 2.2%). The mind knows this intuitively and thus begins to expect that a 6 is due, but the chances have not changed. It is only in retrospect that we see that the string of rolls is unlikely, but this is true of any string of rolls.
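These numbers are easy to check; a minimal sketch in Python (the trial count is arbitrary):

    import random

    print(f"P(no 6 in 20 rolls) = {(5/6)**20:.1%}")  # ~2.6%
    print(f"P(no 6 in 21 rolls) = {(5/6)**21:.1%}")  # ~2.2%

    # Condition on 20 consecutive non-6s, then look at roll 21:
    hits = trials = 0
    for _ in range(500_000):
        if all(random.randint(1, 6) != 6 for _ in range(20)):
            trials += 1
            hits += (random.randint(1, 6) == 6)
    print(hits / trials)  # ~0.1667: still 1/6, nothing is "overdue"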
If you roll a die 20 times, the probability that you come up with the exact sequence you do is (1/6)^20, roughly 2.7×10^-14 percent, but this does not demonstrate that you have beaten the odds in any way; it is only in hindsight that the result can be seen to be unlikely. That result was exactly as likely as all the others and thus no great stroke of luck. Had you written down that exact list of numbers, in order, before rolling the die, that would be a different matter. The probability of a string of random events cannot be determined in this post hoc fashion. No reasonable person would call a string of 20 random numbers rolled on a die a miracle, because the fallacy is obvious. However, our minds are often tricked into exactly that fallacy when it is much subtler, as when a gambler begins to think his lucky roll is "due."
I was going to put another tidbit about probability and the mind here but I think I've made the point I wished to and don't want to travel too far down the rabbit hole.
The point is: the distribution of the decay of atoms in radioactive material does not display an interaction between the atoms, but rather is the result of there being so many of them. At any given moment there is a tiny statistical chance that a specific given atom will decay; however, since there is a huge number of atoms that each have a small chance to decay, some number of atoms in the substance decaying is a near certainty. Just like tossing trillions of dice. The odds that a specific die comes up 6 are exactly 1/6 regardless of what any other die comes up; if no other die in the universe came up 6, it would not increase the odds that your die comes up 6 by one iota. If your intuition suggests otherwise, it is the well-documented Gambler's Fallacy rearing up. However, the odds that no die out of trillions thrown comes up 6 are astronomically small. Thus by the law of large numbers the overall count of 6s is a near certainty (within, of course, a margin of error); this is due not to any interaction between the dice, but rather to the properties intrinsic to each individual die.
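The decay analogy in a minimal sketch (Python; the atom count and per-atom decay probability are made-up illustrative numbers):

    import random

    N, p = 10_000_000, 1e-4  # ten million "atoms", each with a 0.01% chance to decay
    decays = sum(random.random() < p for _ in range(N))
    print(decays, "decays; expected about", int(N * p))
    # Each atom decays (or not) independently of all the others, yet the
    # aggregate count (~1000, give or take a few percent) is a near
    # certainty, purely by the law of large numbers.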
Perhaps one last point will clarify the matter. You speak of the fact that as we toss more and more dice the distribution becomes clearer and clearer. This is indeed true, with one very important caveat. Say we toss 600 dice, and then 6 trillion. We should expect that the 6 trillion dice conform much more closely to the expected 16.67% frequency of 6s; however, we should also expect that the number of 6s by which we are off is much larger than for the 600 dice.
That is: we may have rolled 88 dice with a 6 out of 600 total tosses, which gives us 14.67% 6s. Rolling even as close as 16.65% for the 6 trillion rolls, however, gives us 999,000,000,000 dice with a 6; thus we are off the perfect value by a full billion rolls! So as we roll ever greater numbers of dice we get closer and closer to the exact probability, due to the large numbers involved, but only when we consider the number of 6s as a percentage of the total rolled. We do not actually get closer to a perfect distribution in terms of the number of individual die tosses; in nominal terms, we typically get further away.
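This caveat is the square-root law of the binomial distribution: the spread in the count of 6s grows like √n while the spread in the fraction shrinks like 1/√n. A minimal sketch (Python with NumPy; the repeat count and seed are arbitrary):

    import numpy as np

    rng = np.random.default_rng(0)
    for n in (600, 6_000_000, 6_000_000_000_000):
        sixes = rng.binomial(n, 1/6, size=1_000)  # 1,000 repeats of n rolls
        abs_dev = np.abs(sixes - n / 6).mean()    # grows roughly like sqrt(n)
        rel_dev = abs_dev / n                     # shrinks roughly like 1/sqrt(n)
        print(f"n={n}: off by ~{abs_dev:,.0f} sixes, a fraction of {rel_dev:.1e}")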
Thus the statistical convergence has everything to do with there being many, many events taking place, so that the results of any particular event influence the total outcome very little; this does not constitute any interaction between the events themselves.