
Randomness

Polymath257

Think & Care
Staff member
Premium Member
So one could tap this behavior for a source of randomness...but this is always to be understood with respect to a system with no detectable, orderly connection to the system generating the randomness. Many other contrived systems of behavior could serve the same role.

I suspect that the overall behavior of the atom or particle undergoing the weak force occurs within a given context and that context is an orderly one even if we cannot observe that level of order. This, of course, is a speculation that goes beyond the current scientific understanding.

More than that, it directly contradicts the modern understanding. We understand the weak force fairly well at this point. And it is a quantum field theory, just like the field theory for electrodynamics. And, for example, the *timing* of the decay of a neutron is random, even though the averages give a predictable half life if we have enough neutrons ( I use neutron decay because it is mediated by the weak force).

However, it is more than just a belief on my part. Looking at the systemic nature of the vast majority of natural systems we can see that this random behavior within an overall order may indicate an unknown systemic behavior in which the weak force participates.

Consider as an analogy the cooling of a cloud of gas in a static volume. As the gas cools, the individual atoms undergo changes in kinetic energy in an entirely unpredictable way. Although we understand the laws of physics that apply to atoms and their kinetic interactions, we simply can't measure all the atoms in a gas, or even one atom in the gas, nor can we measure their trajectories in such a way as to predict when one or another atom will collide and what their resulting kinetic energies will be. But we can at the statistical level of the whole.

And that is what puts the kinetic theory of gases as a *chaotic* phenomenon as opposed to a *random* phenomenon as in quantum mechanics. In kinetic theory, the randomness does, in fact, come from our ignoring (or not knowing) some relevant information.
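The chaotic-versus-random distinction can be made concrete with a toy deterministic system. A minimal sketch using the logistic map (my own illustrative example, not drawn from the posts above): the rule is fully deterministic, yet two almost identical starting points diverge rapidly, so the unpredictability comes from ignorance of initial conditions rather than intrinsic randomness.

```python
# Logistic map: x_{n+1} = r * x_n * (1 - x_n), a fully deterministic rule.
# At r = 4 it is chaotic: tiny differences in the starting point grow
# exponentially, which is the kinetic-theory kind of unpredictability.

def logistic_orbit(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-10)  # perturb only the 10th decimal place

# The same rule on almost the same start gives wildly different
# values after a few dozen steps.
print(max(abs(x - y) for x, y in zip(a[30:], b[30:])))
```

Nothing here is random in the quantum sense; rerunning the script reproduces the orbits exactly, yet prediction fails in practice once the initial condition is known only approximately.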

Now, I have to hedge a bit here. Modern kinetic theory is based on the quantum behavior of the molecules in the gas. And this quantum behavior does have a macroscopic effect in the heat capacities of gases. This was actually the first recognized deviation from classical physics.

The quantum aspects become even more important in solid state theory. Once again, though, the randomness at the quantum level tends to be averaged out at the macroscopic level *except* that some aspects of conductivity (for example) would not happen at all without the quantum aspects of the electron (that it is a fermion and not a boson).

So I think that randomness does exist, but I doubt that it exists in a vacuum, without some amount of orderly behavior, which suggests that it is not intrinsically random, only random with respect to another frame of reference.

Well, the current evidence, primarily from the violation of Bell's inequalities, says that this is probably wrong, at least if you want to keep anything like local causality.
 

Skwim

Veteran Member
"Radioactive decay is a stochastic (i.e. random) process at the level of single atoms. According to quantum theory, it is impossible to predict when a particular atom will decay,[1][2][3] regardless of how long the atom has existed. However, for a collection of atoms, the collection's expected decay rate is characterized in terms of their measured decay constants or half-lives "
Note that the decay of a single atom isn't said to have no cause---an absolute and utter state of randomness---but only that it is unpredictable. So again, "random" comes down to describing an event whose classification depends on our ability to discern its nature, and not on the true nature of the event itself. As a matter of randomness, it puts radioactive decay in the same category as the flip of a coin: an event that, for lack of information, we can't predict, but know is determined by several controlling conditions.

Want to regard "random" in this sense, a condition dependent on human ability? Fine, but I don't find it very meaningful.

 

Segev Moran

Well-Known Member
I'm going to agree that your other examples are NOT of randomness, but at most of chaos.

These, however, are in a different category. We have actual reasons to believe that quantum events are *random*: that there are NOT 'hidden variables' that determine the outcomes. So, it isn't just the inability to calculate something that is, in principle, possible to calculate. From what we understand, it is impossible *in principle* to calculate these.
As we lack the knowledge, I can't claim that it is either true or false :)
We do know, however, that the position of particles is set by probability: not a random position but a probable position.
In atoms, the location of electrons is quantized, meaning it is not random but rather has an order to it.
I guess only time will tell if the universe is random or not ;)
 

Bob the Unbeliever

Well-Known Member
So, if I throw a coin 1 million times and I predict that about 50% of the outcomes are heads, and I find it to be a pretty good prediction, does that entail that the experiment is not random?

I ask because I think that what you consider not random does not exist. What we see, as solid and predictable as it might seem, is just the average of many somethings which are individually inherently random.

Ciao

- viole

A thing can be fully random, but also fully contained by limits. In the case of the classic coin-toss, the limit is heads or tails (assuming it never, ever lands on its edge; theoretically possible given that the coin is not infinitely thin).
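The million-flip point above can be checked directly. A minimal sketch with a simulated coin (a pseudo-random generator standing in for a physical coin): each flip is individually unpredictable, yet the aggregate stays tightly inside the 50/50 limit.

```python
import random

random.seed(12345)  # fixed seed so the run is repeatable

# One million simulated fair-coin flips: 1 for heads, 0 for tails.
flips = [random.randrange(2) for _ in range(1_000_000)]
heads_fraction = sum(flips) / len(flips)

# No single flip can be predicted, but the fraction of heads is
# constrained to land very close to 0.5.
print(round(heads_fraction, 3))
```

The run is random at the level of individual flips and orderly at the level of the collection, which is exactly the "fully random but fully contained" idea.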

This is a somewhat difficult concept for some folk to grasp. I remember back in the 1980's, there was a big kerfuffle over Fuzzy Logic.

Classic Logic (according to popular science magazines of the day) utilized T or F, with zero shading: it was a case of absolute Either/Or. In numerical terms, one would typically use zero (0) for False, and one (1) for True.

Introduce Fuzzy Logic, however, and you introduce shading or weighting of the results. You cannot express it using T/F, but mathematically? Decimal fractions would typically be used-- thus, in Fuzzy Logic it's quite possible to be 1/2 True and 1/2 False, or 0.50 as expressed numerically.

Apparently, this was a Big Deal in embedded processing. My dad and I had been using weighted calculations since the inception of our program. For example, to calculate a 3-D membrane or surface using single data points, you need to interpolate points over the entire surface to have a smooth membrane. You do this by using the actual real-world data points, and you weight them by their distance from the new interpolated point. That is, the nearer to the new point a real-world datum is, the higher its weighting in the final calculation.

A simple way to think about this is using Shares. Say you have 10 data points, but you need to fill a grid of 100, so you'll need to interpolate 90 virtual points from the real data.

You could use simple averages of all 10, but then, you'd have a righteous mess. :)

Instead, you could create 30 shares from the 10. The one nearest to your new point might get as many as 3 or 5 or more shares, whereas a point very far away might not get any shares... obviously, you'd do this with weighted factors, i.e. multiply the value of a nearby point by 3 (in this simplistic example), but multiply a far point by 0.01 or less. Then you essentially take the "average" of the results.
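The share-based scheme described above is essentially inverse-distance weighting. A minimal sketch in Python (function and variable names are mine, purely illustrative of the idea, not any particular product's code):

```python
def idw_interpolate(known, qx, qy, power=2):
    """Estimate a value at (qx, qy) from known (x, y, value) points.

    Each known point gets a weight of 1 / distance**power, so a
    nearby point collects many 'shares' and a far point almost none.
    """
    num = 0.0
    den = 0.0
    for x, y, v in known:
        d2 = (x - qx) ** 2 + (y - qy) ** 2
        if d2 == 0:
            return v  # query sits exactly on a data point
        w = 1.0 / d2 ** (power / 2)
        num += w * v
        den += w
    return num / den

# Four surveyed heights at the corners of a unit square.
points = [(0, 0, 10.0), (1, 0, 20.0), (0, 1, 30.0), (1, 1, 40.0)]

# The centre is equidistant from all four, so it gets the plain average.
print(idw_interpolate(points, 0.5, 0.5))  # → 25.0
```

A query near the (0, 0) corner returns a value close to 10.0, because that point's weight dwarfs the others: the "shares" picture in code form.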

Fuzzy Logic. Very useful in handling imprecise data, such as Voice Analysis. Or? Google Search requests. You seldom hear the term these days...

.. but once upon a time? Fuzzy Logic was All The Rage in digital computing circles. :D
 

Bob the Unbeliever

Well-Known Member
The problem isn't the seed. The problem is that the algorithm behind the RND function does not really generate a string of truly random numbers. For the purposes of any commercial application like accounting or inventory control, the numbers generated were more than sufficient.

However, the RND functions in most commonly used languages fall far short of producing what science would refer to as random numbers.

Of course! If you pick the same SEED every time, you get the exact same sequence of numbers-- every time.

However, if you use a pretty good random number as the SEED? The odds of getting the exact same sequence of numbers more than once, is pretty low.

Using the ever-increasing clock-tick as a SEED guarantees it won't repeat until it exceeds the limits of whichever INTEGER you utilized in creating it. Then, naturally, it'll roll over and you start again... but for a processor that only lasted roughly 10 years or less before replacement? More than good enough-- the odds of roll-over are nil.

Thus the odds of a repeated string of numbers are also very low-- yes, it's not really **random**, not how Cosmic Rays are, but it's Good Enough for Service. :)
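The seed behaviour described above is easy to demonstrate with Python's `random` module (a Mersenne Twister, i.e. a pseudo-random generator, so everything here is deterministic by construction):

```python
import random
import time

# Same seed, same sequence -- every time.
random.seed(42)
first = [random.randrange(100) for _ in range(5)]
random.seed(42)
second = [random.randrange(100) for _ in range(5)]
print(first == second)  # True: the generator is fully deterministic

# Seeding from the ever-increasing clock gives a different sequence on
# (almost) every run -- "good enough for service", though nothing here
# is truly random in the cosmic-ray sense.
random.seed(time.time_ns())
print([random.randrange(100) for _ in range(5)])
```

The clock-based seed only repeats if two runs start on the same nanosecond tick, which is the software analogue of the roll-over argument above.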
 

Bob the Unbeliever

Well-Known Member
And yet, many prefer the "inexactness" of analog music as can be seen by the resurgence of vinyl.

Indeed: to a trained ear? Vinyl in Analog format can actually have better dynamic range-- if you ignore the noise inherent in the system (and a very clean record on a modern high-mass turntable, can have almost zero noise, indeed, you could point to the fault being in the amplifier circuits or the speakers themselves). Thus one of the more complex Classical Pieces should sound better on vinyl than digital.

The problem is? When engineers set the standards for Digital Music, many compromises had to be made, in order to Get It Done. It wasn't ideal, but it was Good Enough.

The principal weakness was the hard limit on dynamic range. A good classical piece can exceed this range easily-- that's often why some classical CD's sound really quiet (as opposed to the much more simplistic pop music, which rarely stresses the system at all...). Recording at a quieter starting point gives more "head room" to get loud enough to do the piece justice. :)

Naturally, all of the above is purely theoretical to me. My ears have long since deteriorated to the point I cannot tell the difference between a quality digital recording, and an equally quality analog one.

I'm assured by Experts: people who make a very good living listening to, and writing up, classical musical performances -- that analog is better.

Of course: The snob in me has to point out, that nothing beats a Live Performance, so there's that... :D

I expect that part of the appeal? Is due in part to the fact we are a social species.
 

Bob the Unbeliever

Well-Known Member
Does that mean that the randomness is constrained by some distribution/decay equation?

I expect that's exactly what it means--- radioactive decay always seems to have a measurable half-life, regardless of how irregularly the individual decay events take place.

I'm told that some radioactive materials decay in a more "regular" fashion than others, but all are essentially a collection of random decay events, which collectively eliminate 1/2 of the original element at a measurable rate: the half-life.
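That "collection of random decay events with a measurable half-life" can be sketched in a toy simulation (the population size and per-step decay probability are arbitrary choices of mine, not physical constants):

```python
import random

random.seed(0)  # repeatable run

N0 = 100_000   # starting population of simulated "atoms"
p = 0.05       # each atom's chance of decaying in one time step

n = N0
steps_to_half = 0
while n > N0 // 2:
    # Every surviving atom independently "rolls the dice" this step;
    # no individual decay is predictable.
    decayed = sum(1 for _ in range(n) if random.random() < p)
    n -= decayed
    steps_to_half += 1

# Theory predicts a half-life of about ln(2) / p ≈ 13.9 steps,
# and the purely random simulation lands right on it.
print(steps_to_half)
```

No single atom's decay time can be predicted, yet the time for half the population to vanish is stable run after run: order at the collective level out of randomness at the individual level.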
 

Bob the Unbeliever

Well-Known Member
So one could tap this behavior for a source of randomness...but this is always to be understood with respect to a system with no detectable, orderly connection to the system generating the randomness. Many other contrived systems of behavior could serve the same role.

I suspect that the overall behavior of the atom or particle undergoing the weak force occurs within a given context and that context is an orderly one even if we cannot observe that level of order. This, of course, is a speculation that goes beyond the current scientific understanding.

However, it is more than just a belief on my part. Looking at the systemic nature of the vast majority of natural systems we can see that this random behavior within an overall order may indicate an unknown systemic behavior in which the weak force participates.

Consider as an analogy the cooling of a cloud of gas in a static volume. As the gas cools, the individual atoms undergo changes in kinetic energy in an entirely unpredictable way. Although we understand the laws of physics that apply to atoms and their kinetic interactions, we simply can't measure all the atoms in a gas, or even one atom in the gas, nor can we measure their trajectories in such a way as to predict when one or another atom will collide and what their resulting kinetic energies will be. But we can at the statistical level of the whole.

So I think that randomness does exist, but I doubt that it exists in a vacuum, without some amount of orderly behavior, which suggests that it is not intrinsically random, only random with respect to another frame of reference.

I think you'd be quite fascinated with a Thought Experiment, Maxwell's Demons (if you are not already familiar with the subject).

Maxwell's demon - Wikipedia

:)
 

Ponder This

Well-Known Member
Let's look at a classic example of randomness:

A person rolls two six-sided dice at a craps table.
We regard that process as producing a 'random' result.
However, a skilled dice thrower can take control of the dice and throw them for long runs without getting the dreaded seven that ends their turn. They are able to use a series of dice settings and throwing techniques to improve their results. The reality is that this classically 'random' process can produce repeatable, predictable results. In this case, the expression that this process is 'random' is an expression of ignorance, as in "I don't really know how to predict this result".

Cue definition of the adjective 'random': 'made, done, happening, or chosen without method or conscious decision'.

Now let's look at a scientific principle:
In physics, the uncertainty principle roughly states:
'we cannot simultaneously know precisely both the momentum and the position of a particle'
What the uncertainty principle does not state is:
'a particle cannot simultaneously have precise position and momentum'
In other words, the principle is a statement of our ignorance, our inability to predict or know. The uncertainty principle comes from the process by which we observe things using light.

Quantum mechanics is an expression of the limits of our ability to know. When scientists say that an electron is bouncing around randomly, what they mean is that they don't know both exactly where it is and what its momentum is.

Now let's look at what 'random' means in DNA mutations:
it means that we cannot predict precisely when these mutations will occur.
We could do things which will increase the frequency of mutations. We might even be able to induce a specific particular mutation. But as far as predicting the wealth of mutations occurring in DNA everywhere on the planet... we are at a loss. We can't actually precisely predict when or what mutations will occur. Hence, they are 'random'.

The definition of random means that these events should occur 'without method or conscious decision'.
The claim is, generally, that there is a method, but no conscious decision.
 

Polymath257

Think & Care
Staff member
Premium Member
Of course! If you pick the same SEED every time, you get the exact same sequence of numbers-- every time.

However, if you use a pretty good random number as the SEED? The odds of getting the exact same sequence of numbers more than once, is pretty low.

Using the ever-increasing clock-tick as a SEED guarantees it won't repeat until it exceeds the limits of whichever INTEGER you utilized in creating it. Then, naturally, it'll roll over and you start again... but for a processor that only lasted roughly 10 years or less before replacement? More than good enough-- the odds of roll-over are nil.

Thus the odds of a repeated string of numbers are also very low-- yes, it's not really **random**, not how Cosmic Rays are, but it's Good Enough for Service. :)

The problem isn't just getting a repeat sequence. If you want to do Monte Carlo simulations, you need very high quality randomness in the sequence generated. That means that several statistical tests need to be passed. Think of it like this: no single coin flip should determine the next one. But neither should any sequence of 10 coin flips determine the next one. In fact, if you plot out the 1024 possible *sequences* of 10 coin flips, those should appear random also. But in most RND functions in most programming languages, at about this level you do begin to see patterns: things are NOT evenly distributed.
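The kind of test described above can be sketched directly: chop a long stream of simulated flips into 10-flip blocks and check that all 1024 possible patterns occur about equally often. This minimal version uses Python's built-in Mersenne Twister, which passes this simple check; a poor generator would show lopsided counts (the seed and sample size are arbitrary choices of mine):

```python
import random
from collections import Counter

random.seed(7)

# One million simulated coin flips.
flips = [random.randrange(2) for _ in range(1_000_000)]

# Chop the stream into 100,000 consecutive 10-flip blocks and count
# how often each of the 1024 possible patterns occurs.
blocks = [tuple(flips[i:i + 10]) for i in range(0, len(flips), 10)]
counts = Counter(blocks)

# For a good generator every pattern shows up close to the expected
# count, with only sampling-noise-sized deviations.
expected = len(blocks) / 1024          # about 97.7 per pattern
worst = max(abs(c - expected) for c in counts.values())
print(len(counts), round(worst, 1))
```

This is only one of the many statistical tests a Monte Carlo-grade generator must pass; real test batteries run dozens of such checks at many block sizes.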
 

sealchan

Well-Known Member
I think you'd be quite fascinated with a Thought Experiment, Maxwell's Demons (if you are not already familiar with the subject).

Maxwell's demon - Wikipedia

:)

Yes, I read this...and I had heard of it before. For me this shows that top to bottom reality seems to be systemic. You cannot successfully cheat the rules by manipulation of the parts because the whole system maintains an equilibrium in line with the rules as we know them. And there is no way to experimentally close off a physical system in this Universe such that one can violate any of the fundamental laws.

The whole Universe acts as if it has already figured out any possible loophole and sealed it off by design. Such a design is the hallmark of a dumb replication process running in an evolutionary, adaptive system.
 

Polymath257

Think & Care
Staff member
Premium Member
Let's look at a classic example of randomness:

A person rolls two six-sided dice at a craps table.
We regard that process as producing a 'random' result.
However, a skilled dice thrower can take control of the dice and throw them for long runs without getting the dreaded seven that ends their turn. They are able to use a series of dice settings and throwing techniques to improve their results. The reality is that this classically 'random' process can produce repeatable, predictable results. In this case, the expression that this process is 'random' is an expression of ignorance, as in "I don't really know how to predict this result".

Cue definition of the adjective 'random': 'made, done, happening, or chosen without method or conscious decision'.

Yes, the values the dice end up with are determined by things like the force of the throw, the level of wear of the table, the rotation of the dice, etc. They could, theoretically, be predicted by having a computer watch the initial part of the throw and do a quick calculation to determine the end state before it actually ends.

Now let's look at a scientific principle:
In physics, the uncertainty principle roughly states:
'we cannot simultaneously know precisely both the momentum and the position of a particle'
What the uncertainty principle does not state is:
'a particle cannot simultaneously have precise position and momentum'
In other words, the principle is a statement of our ignorance, our inability to predict or know. The uncertainty principle comes from the process by which we observe things using light.

Quantum mechanics is an expression of the limits of our ability to know. When scientists say that an electron is bouncing around randomly, what they mean is that they don't know both exactly where it is and what its momentum is.

If all you look at is the uncertainty principle, you would be correct. But there is more to QM than just that. QM actually predicts *probabilities* and specifically states that the outcome cannot be determined *even theoretically* from the initial state. It is NOT simply a 'lack of knowledge', but an impossibility that the outcome is determined by previous states. What's more, when we get into entangled particles, the correlations are such that *no* hidden variables can explain the observed phenomena unless there are very deep violations of causality.

Now let's look at what 'random' means in DNA mutations:
it means that we cannot predict precisely when these mutations will occur.
We could do things which will increase the frequency of mutations. We might even be able to induce a specific particular mutation. But as far as predicting the wealth of mutations occurring in DNA everywhere on the planet... we are at a loss. We can't actually precisely predict when or what mutations will occur. Hence, they are 'random'.

The definition of random means that these events should occur 'without method or conscious decision'.
The claim is, generally, that there is a method, but no conscious decision.

And this seems to be intermediate. For example, a mutation caused by a radioactive decay nearby would indeed be random, not just in which gene is affected, but in the timing, etc. A mutation produced by a mutagenic chemical may well be predictable in some ways given the affinity of that chemical to DNA, for example.
 

1robin

Christian/Baptist
I support a version of compatibilism, not necessarily Dennett's, as I said before. I do not discount Dennett completely, but disagree with him.
As you probably guessed I am not much of a Dennett fan, I think he is a lousy philosopher.



You apparently did not read and did not respond to what I said. All these examples indicate choices within a limited range, and not complete Free Will. As I said . . .
Choices don't exist without free will. Also, the range of events I described did not occur within too limited a range. If atoms ricocheting off each other is all that determines events, then after every intentional thought I had, just about any secondary event could have taken place. The fact that hundreds of events transpired that allowed me to fulfill my intention suggests that blind forces are not what governed these events (choices). Again, if you get hungry (and atoms in motion don't care if you starve to death), why were these blind forces so obliging as to perform the hundreds of necessary actions that allowed you to actually eat some food? No, the fact is that to explain the set of all events you need both determinism and free will. Either one alone just isn't enough to explain intentional fulfillment.

Disagree if you like, but pretty much all examples you come up with would fall within predictable outcomes within a range of possible outcomes. I do believe in a form of Dennett's compatibilism where a degree of potential free will exists. Maybe what you call intentionalism, but nonetheless that largely takes place within a range of possible predetermined choices and outcomes.
I don't agree that the event that follows another occurs within a narrow range. If atoms in motion determine all events, can you imagine the size of the set of all possible actions that follow from a previous action? It's incomprehensibly large. Let's say you think to yourself that you want to call a family member; the number of actions that can follow that thought is countless if blind physics is determining events. Why is a force that has no intentionality so obliging as to let you go through the hundreds of actions required for you to actually place the call? So far I simply can't make you reconcile this fact with your worldview. Nothing else in our posts makes any difference unless you can reckon with these trillions of intentional thoughts that are being fulfilled (supposedly) by a force that does not care anything about fulfilling anything.



As far as the contemporary view of the science of evolution and science in general Fred Hoyle is an antique. No by the present knowledge it is not valid that "The numbers one calculates from the facts seem to me so overwhelming as to put this conclusion almost beyond question." Bogus use of probability.
I don't find your rejection of a scholar as well credentialed as Fred Hoyle persuasive. It appears to be a desperation move, but maybe you could explain further. I am not sure what aspect of reality Fred was referring to when he made the statement, but I can think of one that it does apply to. The teleological argument for God (the fine-tuning argument) suggests very strongly that our universe exhibits intentionality in even its most remote events. For the universe to support advanced life (be consistent with God's purpose) it must be balanced on a knife edge. Many of the relationships, like the strong nuclear force, expansion rate, and symmetry breaking, have constants thrown into their relationships that must all be exactly what they are to get any universe that can support life. Keep in mind this is no sharpshooter fallacy; this (if life is the prize) is basically the universe winning the life lottery over and over and over again against unimaginable odds, and these odds are multiplicative, not additive. Just as if the same person kept winning the same lottery over and over and over we would all suspect manipulation by an intelligent source, so with the universe. It appears, as he stated, that a superintellect has monkeyed with all the relationships in the universe. Sorry, I did not intend to spend so much time on this footnote.

Hoyle smells of bogus Intelligent Design, so does your referenced website, and in the above anthropomorphizes the nature of our physical existence. Nothing is sighted nor blind in the physical nature of our existence.
I can quote similar conclusions from hundreds of scientists, including the most modern, but you can see that Hoyle's logic is sound just by looking at the example I gave above. I suspect your rejection of Hoyle is based on some level of bias instead of his actual abilities. In a contest between your credentials and Hoyle's, it should be obvious (all things being equal) whose I should place my trust in.



Not clear nor coherent. Science is what science is, and thank goodness without the religious assumptions of Fred Hoyle, and Intelligent design advocates.
Modern science (abstract science) exists because of the faith of men like Hoyle. One of the greatest works on the subject (I can't recall the name), written by an atheist, examined why it was that modern science (not technological innovation) was discovered only in the heart of the Christian west, of all places. As much as the atheist author tried, he kept having to conclude that the reason was directly related to the scientists' faith (78% of Nobel Laureates are Christians, with many of the rest being Jews). The reason was that Christian scientists believed that God (being rational) would create a rational universe. Their scientific efforts were the result of their trying to decode from the universe the rationality they believed God had put in it.


Look, all this stuff is interesting, but you have not touched my central claim. Why are trillions of intentional states fulfilled by a force that has no interest in fulfilling anyone's intention? Why are atoms in motion so obliging despite having no will? You can't explain this by determinism alone, but I wish you would at least try. Until you do, my trillions of examples of free will remain perfectly intact. Explain to me why uncaring physical forces, which care for and intend nothing at all, would enable us to carry on a debate.
 

sayak83

Veteran Member
Staff member
Premium Member
I'm going to agree that your other examples are NOT of randomness, but at most of chaos.

These, however, are in a different category. We have actual reasons to believe that quantum events are *random*: that there are NOT 'hidden variables' that determine the outcomes. So, it isn't just the inability to calculate something that is, in principle, possible to calculate. From what we understand, it is impossible *in principle* to calculate these.
Since chaotic events are unpredictable because they are super-sensitive to initial conditions and perturbations, and since QM says that there will be small-scale random fluctuations in the states of all microscopic particles, does it not mean that chaos itself is caused by the inherent randomness of the QM scale and not by humans' inability to detect all governing states to absolute precision?
 

Polymath257

Think & Care
Staff member
Premium Member
Since chaotic events are unpredictable because they are super-sensitive to initial conditions and perturbations, and since QM says that there will be small-scale random fluctuations in the states of all microscopic particles, does it not mean that chaos itself is caused by the inherent randomness of the QM scale and not by humans' inability to detect all governing states to absolute precision?

Well, the *classical* equations for turbulence are deterministic, but subject to chaos (sensitive dependence of initial conditions). So chaos itself doesn't depend on quantum randomness. It appears even in classical equations.

But, since all real systems are quantum at base, there is always a good question how well the classical equations approximate the quantum behavior and what happens at the quantum level when the classical equations show chaos. I have seen a couple of papers that discuss this subject. What appears to happen in the cases I have seen is that the quantum randomness spreads out the classical chaotic behavior even more. This makes sense: any inherent randomness would be expected to provide the uncertainty in initial conditions required for the classical equations to be unpredictable.

My main caution is that this doesn't seem to have been well studied, so conclusions are tentative here.
 

atanu

Member
Premium Member
The randomness isn't constrained, but the fact of randomness leads to a decay equation. The simple fact that the decay is truly random, so far as we can tell, means that the rate of decay of the atoms is proportional to the number of undecayed atoms present.

When the rate of change of a quantity at a given instant is proportional to the value of the quantity at that instant, that means there is an exponential relationship. (eˣ is the only thing in maths that differentiates into itself.) More here, including half-life etc: Exponential decay - Wikipedia

This is a nice example of how order arises, quite naturally, out of randomness. (Somebody should tell this to followers of Intelligent Design! ;))
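Spelled out, the step from "rate proportional to quantity" to the half-life is the standard textbook derivation (added here for completeness, not part of the original posts):

```latex
\frac{dN}{dt} = -\lambda N
\quad\Longrightarrow\quad
N(t) = N_0\, e^{-\lambda t},
\qquad
t_{1/2} = \frac{\ln 2}{\lambda}
\;\;\text{(the time at which } N = N_0/2\text{)}.
```

So a single decay constant λ, describing the purely random per-atom behaviour, fixes both the exponential curve and the half-life.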

Well, in a sense. In the same way that randomness in a coin flip would be 'constrained' by the 50%-50% split for heads and tails. To be a fair coin requires that. But it would still be meaningful to say the coin flip was random (outside of physical issues which don't happen in the quantum situation).

I expect that's exactly what it means--- radioactive decay always seems to have a measurable half-life, regardless of how irregularly the individual decay events take place.

I'm told that some radioactive materials decay in a more "regular" fashion than others, but all are essentially a collection of random decay events, which collectively eliminate 1/2 of the original element at a measurable rate: the half-life.

We can always extract patterns out of so-called randomness.
 

exchemist

Veteran Member
We can always extract patterns out of so-called randomness.
I think that is going too far. But often, in natural phenomena at least, individual randomness gives rise to collective order, due to the operation of underlying physical "laws", as we call them.
 

wellwisher

Well-Known Member
The concept of random and statistics is usually explained and demonstrated using cards and dice. The conceptual problem is, cards and dice are manmade objects; they are not natural objects. It is a magic trick.

Dice are designed and manufactured to weigh the same on all sides. This manufacturing trick is what allows all sides to have the same odds. The difference between the sides is not based on physical potential but on a set of subjective markings, which, by human convention, assign different subjective values or potentials to each side. Subjective markings are not a natural potential, which is why the outcome appears random. The subjective operates under random laws. The black-box approach amplifies subjectivity.

Natural is different from this. For example, the hydrogen atom has distinct quantized energy levels, each with a different natural potential. The natural hydrogen-atom die is loaded differently on each side. It is not assigned subjective loads, based on made-up pictures, to make the game work. It is assigned natural potentials that are distinct and different: quanta. As an exercise, design a die where the material weight is different on each side. The odds will change.

Most of the use for statistics is based on manmade things, done in factories. That is fine, because subjective human endeavors, such as choosing the best cell phone, benefit from subjective tools. The problem has been the assignment of the results of this manmade game, a magic trick, to natural reality, which does not use the same type of dice and cards. The dice and cards of nature are loaded and/or marked. For example, the periodic table is like a deck of cards in which each card is not equally likely, with the hydrogen card appearing the most.

The idea that life can appear from other solvents besides water is based on manmade cards, where you subjectively assign any solvent all types of magical properties beyond natural chemistry. Natural chemistry uses loaded and marked cards, with water a loaded wild card.

Random, as applied to nature, is a type of religion. This God of random can do anything, similar to the God of determinism. The difference is that the God of Random takes longer, and his acts often come about when you don't expect them. Regardless, he is still assumed to be omnipotent; all things have finite odds. He is also a mystery God, in that although we know all things are possible with this God, predicting the whims of this god is not easy. This God is also not personified. This, too, is due to his unpredictable nature; therefore his image is not fixed or determined. His image is like a deck of cards with various subjective pictures.
 

atanu

Member
Premium Member
I think that is going too far. But often, in natural phenomena at least, individual randomness gives rise to collective order, due to the operation of underlying physical "laws", as we call them.

I believe that the starting point of our empirical studies, treating the ever-changing forms as discrete, different objects/things/particles, skims over a kaleidoscopic show, the base of which is probably an undifferentiated whole. We, being part of this whole, cannot subject it to objective study.

If this is true, we do not know how it affects results of our studies.
 