
The World As A Hologram

uberrobonomicon4000

Active Member
"Before analyzing in detail the resulting distinction between quantum and classical information, let us take note of some common misconceptions about the relationship of information to the physical world related to the increasingly popular idea that information is physical in nature. The greatest and most common error of this sort is naively to identify information with the physical systems that may be used in communicating it."
Jaeger, G. (2009). Entanglement, information, and the interpretation of quantum mechanics. Springer.
Why can information not be identified with physical systems? Again, what type of information are we talking about other than bits? Bits are not information. They just represent information, and it's actually not even information at all; it is just data.
 

LegionOnomaMoi

Veteran Member
Premium Member
Why can information not be identified with physical systems? Again, what type of information are we talking about other than bits? Bits are not information. They just represent information, and it's actually not even information at all; it is just data.

The above questions touch on a few different topics, so I apologize if I don't address them systematically (also, the only sleep I've had in the past three days was a few hours, and I've sort of spent all my time looking into research practices, software, and participant recruitment/management/attrition/etc., so if I am incoherent, I apologize).

I think the first issue which requires some clarification is what I (and Susskind, the physicist in the video clip) mean by information. The link I included in my last post is a major contribution by Miller to information theory and therefore to cognitive science, psychology, communications, etc. (as soon as you get off of the elevator at Harvard's William James Hall, there are a bunch of materials, such as a first printing of Chomsky's Syntactic Structures, which were foundational in the cognitive and computer sciences; one of them is a plaque with a print of Miller's study).

The classical definition of information is in its simplest form one of possibilities. Language, binary code, Morse code, genetic code, etc., are all capable of representing meaningful data because they can be "chunked" into units, and these units can have at least 2 states. In classical physics, entropy and information theory were for a while almost as tied together as information theory is with modern (quantum) physics. But while computer science, linguistics, etc., increasingly used information/communications theory, its use in physics decreased. The reason has to do with what information means when used formally.

As I said, information is more or less just possible states/alternatives. The ancient Greeks used pebbles to vote. But pebbles are only "information" when someone can interpret the states as meaningful in some way. The Greek use of pebbles did not make pebbles binary units (bits) of information. Their interpretation of the pebbles' states did.

Bits ARE information, as information is defined (formally) in terms of entropy or possible states of some physical entities, elements, etc. But they are not physical states or physical entities in and of themselves, any more than any unit of measurement is. A computer bit has a particular physical realization, but it is far from the only possible physical instantiation of a binary unit, nor does any information unit require a direct correspondence with some physical entity or characteristic of some physical entity.
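To make that concrete, here is a minimal sketch (Python, purely illustrative; the three example systems are my own):

    import math

    # Three very different physical systems, each with exactly two
    # clearly distinguishable, mutually exclusive states.
    coin = ("heads", "tails")
    flag = ("white side up", "black side up")
    wire = ("low voltage", "high voltage")

    # The information capacity depends only on the number of states,
    # not on what the states physically are: log2(2) = 1 bit each.
    for system in (coin, flag, wire):
        print(system, "->", math.log2(len(system)), "bit")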

In classical mechanics, the use of words like "state" or "property" or "characteristic" had a clear meaning because they had a clear relation to a physical system. A rock thrown from a particular height at a certain speed with certain wind conditions could be described mathematically in terms of the location of the rock, the space the rock occupied, the velocity of the rock, and so forth. Every measurement or value of the system (a rock being thrown in some environment) corresponded quite clearly to some physical aspect of that system.

This is not true in quantum physics. There is no such relation. When we describe the "state" of a quantum system, this state has no physical reality. It's a vector space with particular properties (a Hilbert space). In order to describe actual physical properties, an entirely different formalism/method is required. Simplistically, a quantum system in some experiment exists in the physical world but it is described by the researchers using mathematical formalisms that do not directly relate to the system itself, nor is it clear what they mean. It is not clear, for example, whether a quantum system which is described as being in multiple states is actually in multiple states (a violation of classical logic and causality), or whether this description is simply a probability/statistical function.

When a quantum system is measured, the description of the system fails. Imagine throwing a rock in a lab designed with all kinds of devices to measure location, velocity, etc., but every time you throw the rock, the only way you can predict where the rock will end up is by describing it as existing in different places. However, you never actually observe the rock in multiple places. There are then two different steps when it comes to dealing with quantum systems. The first is the description, a mathematical transcription of the system which is based on theory, not on observations of the system itself (it can't be). Then you run the system (e.g., firing electrons at some detector). The description of the system you have represents the electrons as being in different states at the same time. But you never observe this, and you require another mathematical device (usually Hermitian operators) to describe the measurements.

In other words, while your "system" is completely described by your mathematical transcription, you will never actually be able to relate that description to anything physical. Doing so would be like actually observing the rock you throw as being in different places at the same time.

Qubits are used to describe the states of quantum systems. However, once again these "states" cannot be observed and it is not clear what they represent (i.e., how they correspond to the physical world). Unlike operators used in the measurement process, however, using an information theory approach makes the physical nature secondary and the problem of relating "states" of the system to actual characteristics of it a non-issue. However, how the mathematics used in QM describes reality isn't all that important in this sense, because the focus is only on possible states of a prepared system after it is measured. Which makes the ontological status of the states and/or the physical system largely irrelevant. The important thing is only the initial set-up and the final states, which is the one thing in QM we know very well.

However, because the system is described in terms of multiple states, and we do not know how these states actually correspond to physical reality (does classical logic fail at the most fundamental level of reality? do physical systems violate causality?), there is no way to relate qubits to the actual quantum systems. They can only relate to the outcomes.
 

uberrobonomicon4000

Active Member
The reason why I said a bit and a qubit are two entirely different things is based on their states. A regular bit, as you already said, is either 1 or 0 (on or off). It can be in either one of those two states at any period of time, but not both. A qubit, however, is similar to a classical bit but very much different, in that a qubit can be in multiple states simultaneously due to quantum superposition.
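Roughly, in the usual notation (a minimal Python sketch, not a real quantum calculation; the amplitudes here are just an example):

    import numpy as np

    # A classical bit is in exactly one of two values at any time.
    classical_bit = 0          # or 1, never both

    # A qubit state is usually written as two complex amplitudes (alpha, beta)
    # with |alpha|^2 + |beta|^2 = 1; a measurement gives 0 with probability
    # |alpha|^2 and 1 with probability |beta|^2.
    alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)   # an equal superposition
    probabilities = np.abs(np.array([alpha, beta])) ** 2
    print(probabilities)       # [0.5 0.5] -- though any single measurement still yields 0 or 1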

I don’t refer to a bit as information. Information is something that is useful and understandable. A bit just has properties attributed to it which make up the bit. A regular bit in this case, or that of the video, is referring to a particle, such as a photon, which deals with particle physics and is probably more useful to a physicist who studies them. What I did find interesting is that a particle or bit of “information” can be compressed to the point where actual information about the bit is hidden. I think Susskind referred to the tiniest possible unit of a bit as a Planck, which is a compressed particle when it enters or gets near the horizon of a black hole.

So maybe coded or programmed isn’t the right word to use here. Maybe encoded would be a little more accurate, since a bit can be said to store information. Which is very interesting to think about, because it’s actually the bits that make up the information, not the bits themselves.
 

uberrobonomicon4000

Active Member
But that's kinda just making up your own terminology. Also, Planck units - Wikipedia, the free encyclopedia
No it’s not.

Information theory deals with more than physics. A bit is a basic “unit” of information. It is the actual bits that make up the information. This is a science and technology section, so maybe I am being too technical, which may go beyond the intended purpose of the video. It has also been a couple weeks since I last saw the video, where he gets into vectors.

Susskind starts out with a basic unit and uses that unit (the bit) to describe “The world as a hologram”, where he gets into the “building blocks” of creating a hologram, a 3-dimensional picture or image, and relates it to “reality”.
An actual hologram is just an illusion of a 3-dimensional environment on a 2D surface, while an illusion can be just about anything.

[Image: Zonenplatte_Cosinus.png, from the Wikipedia article on Holography]

This is an image on the wiki page for Holography, which provides an illusion of motion.
 

LegionOnomaMoi

Veteran Member
Premium Member
The reason why I said a bit and a qubit are two entirely different things is based on their states. A regular bit, as you already said, is either 1 or 0 (on or off).

Or heads/tails. Or black/white. It simply refers to something which can exist in only two states. The reason it is (IMO) misleading to say a bit and a qubit are "two entirely different things" is
1) the reason behind the name is no accident, but a deliberate choice with the intent of associating qubits with bits as related things
2) both are part of information theory.
3) both refer to units of information


I don’t refer to a bit as information.
When most people use the term "bit" or the term "information" they are using it in ways that are different than the way scientists do (at least, scientists who work in fields that use information theory).
In common usage, this

Information is something that is useful and understandable.
is true and a good enough definition. But information, as used in computer science, physics, some fields within psychology, etc., has a formal/mathematical definition.


A bit just has properties attributed to it which make up the bit.

Rather, a bit is a property attributed to something. It's not often used that way, because it has become so associated with computers that the term "bit" is often used to refer to computer bits, rather than as a property intrinsic to computer hardware. But technically, whether we are talking computers or coins, a bit is a property attributed to the entity in question. A coin has two states, which means it "is" a bit in that it has the property of being in one of two possible states.

I think Susskind referred to the tiniest possible unit of a bit as a Planck, which is a compressed particle when it enters or gets near the horizon of a black hole.

That gets into notions concerning whether spacetime is actually discrete (that is, there are minimal units of spacetime which cannot be divided any further) and quantum gravity.
 

uberrobonomicon4000

Active Member
Rather, a bit is a property attributed to something. It's not often used that way, because it has become so associated with computers that the term "bit" is often used to refer to computer bits, rather than as a property intrinsic to computer hardware. But technically, whether we are talking computers or coins, a bit is a property attributed to the entity in question. A coin has two states, which means it "is" a bit in that it has the property of being in one of two possible states.
I agree that a bit is a property attributed to something on an extremely small scale, like that of a planck.

He is however referring to a bit as an elementary particle and hidden information as entropy.

So how many Plancks are needed before the bits become relevant to determine the information of a bit, being an elementary particle like that of a photon? A very large amount.
I think he said 1000s of trillions of Plancks, which is incompressible, because the “bits” are so compressed that they cannot move or do anything else. So they make up the actual bit.
 

uberrobonomicon4000

Active Member
Yes it is. Good we discussed this.
It is clear you didn’t read my entire post and don’t really care to contribute anything to this conversation. You already said this video is “mind blowing” to you so I didn’t expect for you to completely understand or know everything I’m talking about. So I will further explain.

Information Theory is or has originally been used by mathematicians (introduced and created by Claude E. Shannon – who is the Einstein of Information Theory), electrical engineers, computer scientists, bioinformaticians, and now by “physicists”. Why? Because physicists didn’t think they needed information theory, yet every other branch of science uses it, and physicists were late to adopt it. Now physicists use it to smash atoms and chart all of that data, study black holes, or build quantum computers.

Information like I said is something that is meaningful and can be useful. For example, let’s say you know absolutely nothing. Well, I drop a symbol or number into your empty brain bank of information. That number is the number 1. You don’t know what that number is, or what it means, unless you have something else to compare it to. Well, you knew nothing before, but now you know something. So the only other alternative to not knowing is knowing, which is what a bit represents: a unit or ‘bit’ of information. It isn’t the completed picture, so to speak, and in this case that of a hologram.

I drop another number 1 into your brain bank, which is no longer empty. You now have two of these numbers. You can do a lot more with 2 than one. I drop a third, so on and so forth. You have now gained knowledge of the numbers and what they might represent and can therefore do more with them. Possibly come up with some combinations of the numbers, add and subtract depending on how many you have. So as you experiment with these numbers you are gaining knowledge, which leads to understanding, and you can then translate it as information. Information is something that can be shared with other people and used by others.
 

LegionOnomaMoi

Veteran Member
Premium Member
I agree that a bit is a property attributed to something on an extremely small scale, like that of a planck.

A banner/flag that is several times larger than a billboard can be a "bit".

He is however referring to a bit as an elementary particle and hidden information as entropy.

"Entropy" in physics was (and still is) defined in terms of information theory long before quantum physics was. Actually, it is more appropriate to say that information theory was defined in terms of entropy as understood in physics. That's where Shannon borrowed the term.

So how many Plancks
Planck length is a unit of measurement, and is relevant here because it is the smallest possible radius of curvature in the GTR (i.e., in a 4-dimensional universe) that is compatible with quantum mechanics.

It is also fundamental in quantum physics (or, rather, Planck's constant is), because Planck's constant is intrinsic to the various equations used in QM (including the famous Schroedinger equation) to describe the dynamics of quantum systems. One of the things this means (from an information theory point of view) is that Planck's constant relates the uncertainty in any measurement of any quantum system between how well the measurement determines position and how well it determines momentum. Greater precision for one means less for the other, and Planck's constant is used in the equation which expresses this relationship.
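To put rough numbers on that trade-off (a back-of-the-envelope sketch using the standard relation Δx Δp ≥ ħ/2; the example values are mine):

    # Reduced Planck constant (J*s) and the usual uncertainty relation
    # delta_x * delta_p >= hbar / 2.
    hbar = 1.054571817e-34

    def min_momentum_uncertainty(delta_x):
        """Smallest momentum uncertainty allowed for a given position uncertainty."""
        return hbar / (2 * delta_x)

    # Pinning a particle's position down to about an atom's width (1e-10 m)
    # forces at least ~5.3e-25 kg*m/s of momentum uncertainty.
    print(min_momentum_uncertainty(1e-10))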

The reason information theory uses the term entropy is because information theory is a mathematically precise (i.e., formalized) framework for understanding communication, whether we are dealing with speech, TCP/IP, or Morse code. However, communication is all about sending information. What Shannon did was prove certain things about the uncertainty related to the transmission of information over a channel of any type. Uncertainty is a matter of probability, and probability was (and is) key in statistical mechanics. Entropy in physics is a measure of disorder; more specifically, it is a statistical concept relating a system with its possible configurations and, more importantly, the change from some initial configuration to some final configuration.
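In other words, Shannon's measure is H = -Σ p_i log2(p_i), summed over the possible states. A minimal sketch (Python, just to make the idea concrete):

    import math

    def entropy_bits(probabilities):
        """Shannon entropy in bits: H = -sum(p * log2(p))."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(entropy_bits([0.5, 0.5]))   # fair coin: 1.0 bit of uncertainty
    print(entropy_bits([0.9, 0.1]))   # biased coin: ~0.47 bits
    print(entropy_bits([1.0]))        # a coin that always lands heads: 0 bits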

Communication is all about possible configurations. Static, "noise", signals, words, spelling, etc., are all about configuration states. Miller's work became foundational in cognitive science, linguistics, and psychology because he related the idea of a "bit" ("One bit of information is the amount of information that we need to make a decision between two equally likely alternatives. If we must decide whether a man is less than six feet tall or more than six feet tall and if we know that the chances are 50-50, then we need one bit of information. Notice that this unit of information does not refer in any way to the unit of length that we use -- feet, inches, centimeters, etc. However you measure the man's height, we still need just one bit of information") to human memory and developed the idea of "chunking".

If given a sequence of random letters, such as "sdfrhjldgskhinedslhg", you'd probably have great difficulty memorizing them. However, if given a sequence with an even greater number of letters, but which was "in a hole in the ground there lived a hobbit", it would be a lot easier to memorize. That's because in the first case, each letter is "chunked" separately, and constitutes a unit of information. However, when letters are "chunked" as words, then they no longer represent single units of information; the words do.

"bit" or "binary unit", is used across multiple disciplines because of certain properties a log with base 2 has, but that isn't really all that important here. More important is that no matter how you wish to define information, whether in terms of "meaning" or "language" or whatever, that definition can always be related to the mathematical version used in information theory, making that version far more useful.
 

LegionOnomaMoi

Veteran Member
Premium Member
Information Theory is or has originally been used by mathematicians (introduced and created by Claude E. Shannon – who is the Einstein of Information Theory)

Not exactly, although he was perhaps the key player in providing a theory which tied together vital work done before, such as the development of Boolean algebra (by its founder, Boole) and foundational work in computability theory by Turing and others (including those who introduced the term "bit"). Shannon pretty much established the mathematical framework which enabled information theory to exist, but his more influential work was co-authored with Weaver, and within computer science, cognitive science, and most fields which use information theory, guys like Turing, von Neumann, and Kolmogorov are at least as important if not more so. Not only was the idea of a "bit" around before him, but so were computers.

Why? Because physicist didn’t think they needed information theory, yet every other branch of science uses it and physicists were later to adopt it.
Shannon's key contribution was adopting the development of statistical mechanics and incorporating one key component of it (entropy) into communication theory. In other words, most of the mathematics was used in physics about 40 years before Shannon was born, and had begun nearly a century before that.

Information like I said is something that is meaningful and can be useful.
As these are entirely relative terms (what is meaningful to one may be completely useless and meaningless to another), they aren't helpful for use within any science. Additionally, while you may say that this: "wsdkfjle" is neither meaningful nor useful, there are any number of ways it can be. Each letter, for example, has associated with it a place in at least one alphabet and therefore is a meaningful element of the alphabet of that language, and as such is also useful.



I drop another number 1 into your brain bank which is no longer empty. You now have two of these numbers. You can do a lot more with 2 than one.
Which makes it no longer a "bit". I have three distinct configuration possibilities: 1, 1, and 11.

Information is something that can be share with other people and used by others.

The problem, however, is that what can be shared and used by others at one moment may not be at another. For example, we have had for some time various objects with a script written on them. This script is called Linear B. Nobody in the world was able to understand anything written in this script. Then some brilliant work within Indo-European linguistics resulted in its identification as an early script used for writing ancient Greek. The script was around when nobody could read it, but it still contained information. In fact, unlike most things used to communicate in the modern digital world, we KNEW it represented information.

Information theory provides a way of dealing with the kinds of ambiguities that come with concepts like "knowledge" or "meaning" or even "concepts".
 

uberrobonomicon4000

Active Member
A banner/flag that is several times larger than a billboard can be a "bit".
I disagree, a bit, alone by itself has no special meaning. A flag is a flag, a banner is a banner. Either you have a flag or you don’t. Any other information related to the flag will comprise more bits.

So to refer to a flag or banner as a bit of information is a bit misleading.

"Entropy" in physics was (and still is) defined in terms of information theory long before quantum physics was. Actually, it is more appropriate to say that information theory was defined in terms of entropy as understood in physics. That's where Shannon borrowed the term.


Planck length is a unit of measurement, and is relevant here because it is the smallest possible radius of curvature in the GTR (i.e., in a 4-dimensional universe) that is compatible with quantum mechanics.

It is also fundamental in quantum physics (or, rather, Planck's constant is), because Planck's constant is intrinsic to the various equations used in QM (including the famous Schroedinger equation) to describe the dynamics of quantum systems. One of the things this means (from an information theory point of view) is that Planck's constant relates the uncertainty in any measurement of any quantum system between how well the measurement determines position and how well it determines momentum. Greater precision for one means less for the other, and Planck's constant is used in the equation which expresses this relationship.

The reason information theory uses the term entropy is because information theory is a mathematically precise (i.e., formalized) framework for understanding communication, whether we are dealing with speech, TCP/IP, or Morse code. However, communication is all about sending information. What Shannon did was prove certain things about the uncertainty related to the transmission of information over a channel of any type. Uncertainty is a matter of probability, and probability was (and is) key in statistical mechanics. Entropy in physics is a measure of disorder; more specifically, it is a statistical concept relating a system with its possible configurations and, more importantly, the change from some initial configuration to some final configuration.

Communication is all about possible configurations. Static, "noise", signals, words, spelling, etc., are all about configuration states. Miller's work became foundational in cognitive science, linguistics, and psychology because he related the idea of a "bit" ("One bit of information is the amount of information that we need to make a decision between two equally likely alternatives. If we must decide whether a man is less than six feet tall or more than six feet tall and if we know that the chances are 50-50, then we need one bit of information. Notice that this unit of information does not refer in any way to the unit of length that we use -- feet, inches, centimeters, etc. However you measure the man's height, we still need just one bit of information") to human memory and developed the idea of "chunking".
When Susskind was referring to an elementary particle as a bit and comparing that to a computer bit, he jumped to the word “bytes”.

Computers, electronics, or whatever piece of technology you have, do in fact operate on bits. I’m not saying this was intentional or maybe he misspoke, but it is in fact blurring (skewing) the lines as to what a bit actually means or is.
A byte is actually 8 bits, and depending on the logarithm, you will be working with different units: nat, ban, qubit.
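For example, the same fair-coin uncertainty comes out in different units depending only on the base of the logarithm (a quick Python sketch, purely illustrative):

    import math

    # The uncertainty of one fair coin toss, expressed in different units;
    # only the base of the logarithm changes.
    states = 2
    print(math.log2(states))    # 1.0    bit (base 2)
    print(math.log(states))     # ~0.693 nat (base e)
    print(math.log10(states))   # ~0.301 ban (base 10, also called a hartley)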

Maybe he actually wanted to say qubit, but then he would have to explain what a qubit is and people would need to know what a bit is before they could ever wrap their head around a qubit.
If given a sequence of random letters, such as "sdfrhjldgskhinedslhg", you'd probably have great difficulty memorizing them. However, if given a sequence with an even greater number of letters, but which was "in a hole in the ground there lived a hobbit", it would be a lot easier to memorize. That's because in the first case, each letter is "chunked" separately, and constitutes a unit of information. However, when letters are "chunked" as words, then they no longer represent single units of information; the words do.

"bit" or "binary unit", is used across multiple disciplines because of certain properties a log with base 2 has, but that isn't really all that important here. More important is that no matter how you wish to define information, whether in terms of "meaning" or "language" or whatever, that definition can always be related to the mathematical version used in information theory, making that version far more useful.

We are no longer dealing with a single unit or piece of information, but a collective sum. A letter represents a sound; it has more meaning than what is actually attributed to it, which is what most people may take for granted, but that is looking at the very intricate details or properties of the letter. It has its own clearly defined shape. The letter “A” can be both uppercase and lowercase, depending on the type of word you are using and how you are using it in a sentence. All of these things are comprised of bits. Not just a single unit. But the more bits you have, the more information you are giving to that entity. So these words or whatever you wish to use are a set of data types. They are no longer bits of information. A data type can be a letter, word, picture, graphic, or whatever.
 

uberrobonomicon4000

Active Member
Not exactly, although he was perhaps the key player in providing a theory which tied together vital work done before, such as the development of Boolean algebra (by its founder, Boole) and foundational work in computability theory by Turing and others (including those who introduced the term "bit"). Shannon pretty much established the mathematical framework which enabled information theory to exist, but his more influential work was co-authored with Weaver, and within computer science, cognitive science, and most fields which use information theory, guys like Turing, von Neumann, and Kolmogorov are at least as important if not more so. Not only was the idea of a "bit" around before him, but so were computers.


Shannon's key contribution was adopting the development of statistical mechanics and incorporating one key component of it (entropy) into communication theory. In other words, most of the mathematics was used in physics about 40 years before Shannon was born, and had begun nearly a century before that.


As these are entirely relative terms (what is meaningful to one may be completely useless and meaningless to another), they aren't helpful for use within any science. Additionally, while you may say that this: "wsdkfjle" is neither meaningful nor useful, there are any number of ways it can be. Each letter, for example, has associated with it a place in at least one alphabet and therefore is a meaningful element of the alphabet of that language, and as such is also useful.




Which makes it no longer a "bit". I have three distinct configuration possibilities: 1, 1, and 11.



The problem, however, is that what can be shared and used by others at one moment may not be at another. For example, we have had for some time various objects with a script written on them. This script is called Linear B. Nobody in the world was able to understand anything written in this script. Then some brilliant work within Indo-European linguistics resulted in its identification as an early script used for writing ancient Greek. The script was around when nobody could read it, but it still contained information. In fact, unlike most things used to communicate in the modern digital world, we KNEW it represented information.

Information theory provides a way of dealing with the kinds of ambiguities that come with concepts like "knowledge" or "meaning" or even "concepts".
I know, but that can be said about a lot of things. That is like saying Galileo discovered gravity when gravity has always been there and archers would shoot their arrows to practice their trajectories before physics ever existed. Mechanical computers were around in the 1800s and the first programmer was a lady, by the name of Lovelace, but you don’t hear much about her. It was another 100 years or more before the first electronic computer was developed. The wheel has been around for ages too.

I see where this is going, and I don’t care to get into a deep philosophical discussion of whether or not science has always been around even if not in principle compared to modern day science. The fact of the matter is somewhere down the line, people started to take the time to record or document information. That information was useful information, even if it was numbers, letters, symbols, hieroglyphs, whatever.

However you view information is entirely up to you. But information by no means is just a bunch of random letters from the alphabet thrown together to make a bunch of random nonsense.
 

LegionOnomaMoi

Veteran Member
Premium Member
I disagree, a bit, alone by itself has no special meaning. A flag is a flag, a banner is a banner. Either you have a flag or you don’t. Any other information related to the flag will comprise more bits.

A "bit" has a definition in the sciences:

"The log is to the base 2 and entropy is expressed in bits. For example, the entropy of a fair coin toss is 1 bit."
Cover, T. M., & Thomas, J. A. (2006). Elements of information theory. Wiley-Interscience.

Or, the same expressed in a different way: "Bit: A unit of uncertainty: one bit of uncertainty is equivalent to uncertainty regarding the truth or falsity of one elementary proposition"
from the glossary of Klir, G. J. (2005). Uncertainty and information: foundations of generalized information theory. Wiley-IEEE Press.

So to refer to a flag or banner as a bit of information is a bit misleading.
It isn't, as the "unit of information" which is defined as a bit is "the physical realization of a system- any system- that after appropriate preparation and operation 1. can exist in one of two clearly distinguishable, mutually exclusive, states (0–1; no–yes; true–false; on–off; left path–right path; etc.), and 2. can be read-out or measured to determine which of the two states it is"

Roederer, J. G. (2005). Information and its Role in Nature. Springer.

A flag that has one side which is white, and another which is black, fits all the above descriptions. The last (and most descriptive) makes this clear, as we can see if it fits the necessary conditions. The first is whether or not the "system" (the flag in this case) can "exist in one of two clearly distinguishable, mutually exclusive, states". It can: in one state the white side is displayed, in another it is the black. As for the second, any number of optical devices (including the human eye) can "read-out" which of the states the flag is in.
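To make those two conditions concrete, here's a toy sketch (Python; the class and names are mine, purely illustrative):

    class FlagBit:
        """A flag with one white side and one black side, treated as a bit."""

        def __init__(self, side_showing="white"):
            # Condition 1: two clearly distinguishable, mutually exclusive states.
            assert side_showing in ("white", "black")
            self.side_showing = side_showing

        def flip(self):
            self.side_showing = "black" if self.side_showing == "white" else "white"

        def read_out(self):
            # Condition 2: the state can be measured and reported as 0 or 1.
            return 0 if self.side_showing == "white" else 1

    flag = FlagBit("black")
    print(flag.read_out())   # 1 -- one bit of information about the flag's state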

When Susskind was referring to an elementary particle as a bit and comparing that to a computer bit, he jumped to the word “bytes”.

Computers, electronics, or whatever piece of technology you have, do in fact operate on bits. I’m not saying this was intentional or maybe he misspoke, but it is in fact blurring (skewing) the lines as to what a bit actually means or is.
A byte is actually 8 bits, and depending on the logarithm, you will be working with different units: nat, ban, qubit.

Here's Susskind's definition of a bit: "Entropy and information are naturally measured in “bits”. A bit is the entropy of a two state system if nothing is known".

Susskind, L., & Lindesay, J. (2005). An introduction to black holes, information and the string theory revolution. World Scientific.

This relates to what he is talking about mainly in terms of time: "Now, in the classical theory, there are no limits on how much information can be sent in an arbitrarily small time with arbitrarily small energy. However in quantum mechanics, to send a single bit requires at least one quantum"

Also, "to obtain a single bit of information about the state of the proton, the observer has to probe it with at least one quantum...But such an interaction between the proton and the probe quantum is at high enough energy that it can cause a baryon violating interaction even from the perspective of the proton’s frame of reference.Thus the observer cannot measure and report the absence of baryon violation at the horizon without himself causing it."



We are no longer dealing with a single unit or piece of information, but a collective sum. A letter represents a sound; it has more meaning than what is actually attributed to it, which is what most people may take for granted, but that is looking at the very intricate details or properties of the letter. It has its own clearly defined shape. The letter “A” can be both uppercase and lowercase, depending on the type of word you are using and how you are using it in a sentence. All of these things are comprised of bits. Not just a single unit.

Shape, sound, etc., is utterly irrelevant. That is the entire point of formal definitions: they are not ambiguous. A letter can be thousands upon thousands of bits, or 1, or 2, or none. It depends upon the system in which the letter exists. And the fact that this formal definition enables us to also make some intuitively pleasing connections with commonplace uses of "information" is a bonus. When we read words, we do not read them as strings of letters (if you don't believe this, see how long it takes to read a sentence and then type one with the same number of letters and read it). Nor do we interpret sentences in terms of strings of phonemes.

But the more bits you have, the more information you are giving to that entity. So these words or whatever you wish to use are a set of data types. They are no longer bits of information. A data type can be a letter, word, picture, graphic, or whatever.

And that "whatever" can always be defined in terms of bits. Even in quantum computing, the distinction between qubits and bits has less to do with the definition of "bit" and more to do with the influence of computers.
 

uberrobonomicon4000

Active Member
A "bit" has a definition in the sciences:
I know what a bit is. It has a definition in electrical engineering, computer science and mathematics too. Which all agree with one another.
"The log is to the base 2 and entropy is expressed in bits. For example, the entropy of a fair coin toss is 1 bit."
Cover, T. M., & Thomas, J. A. (2006). Elements of information theory. Wiley-Interscience.

Or, the same expressed in a different way: "Bit: A unit of uncertainty: one bit of uncertainty is equivalent to uncertainty regarding the truth or falsity of one elementary proposition"
from the glossary of Klir, G. J. (2005). Uncertainty and information: foundations of generalized information theory. Wiley-IEEE Press.
If a bit is a unit of uncertainty in physics then that bit contains no information. Meaning it is hidden information. That is exactly what Susskind said in his video. So you are applying entropy or uncertainty as to what the bit may be, not what it actually is. As in you have a coin, but just because it lands on heads once doesn’t mean it will land on heads again based on that bit of entropy, so that bit of information is useless. Because it doesn’t matter what side the coin lands on, unless you are trying to determine how many states it can actually be in. Yet, we already know how many states a bit can be in.
It isn't, as the "unit of information" which is defined as a bit is "the physical realization of a system- any system- that after appropriate preparation and operation 1. can exist in one of two clearly distinguishable, mutually exclusive, states (0–1; no–yes; true–false; on–off; left path–right path; etc.), and 2. can be read-out or measured to determine which of the two states it is"

Roederer, J. G. (2005). Information and its Role in Nature. Springer.

A flag that has one side which is white, and another which is black, fits all the above descriptions. The last (and most descriptive) makes this clear, as we can see if it fits the necessary conditions. The first is whether or not the "system" (the flag in this case) can "exist in one of two clearly distinguishable, mutually exclusive, states". It can: in one state the white side is displayed, in another it is the black. As for the second, any number of optical devices (including the human eye) can "read-out" which of the states the flag is in.
Sure, if you are referring to a qubit, but not a regular bit. A regular bit can’t be in two states or have two states like you described. If that is the case then you are interfering with the bit based on the observer effect. A bit in this case would either be:

A) a white flag or not be a white flag.
B) a black flag or not be a black flag.

Then your bit of entropy would be what the flag is, if it isn’t a white flag or a black flag.

Here's Susskind's definition of a bit: "Entropy and information are naturally measured in “bits”. A bit is the entropy of a two state system if nothing is known".

Susskind, L., & Lindesay, J. (2005). An introduction to black holes, information and the string theory revolution. World Scientific.

This relates to what he is talking about mainly in terms of time: "Now, in the classical theory, there are no limits on how much information can be sent in an arbitrarily small time with arbitrarily small energy. However in quantum mechanics, to send a single bit requires at least one quantum"

Also, "to obtain a single bit of information about the state of the proton, the observer has to probe it with at least one quantum...But such an interaction between the proton and the probe quantum is at high enough energy that it can cause a baryon violating interaction even from the perspective of the proton’s frame of reference.Thus the observer cannot measure and report the absence of baryon violation at the horizon without himself causing it."
Again, you say a “bit” has a definition in the sciences, but you are specifically referring to physics. Again he even says: “a bit is the entropy of a two state system if nothing is known.”
This is more or less a guess as to what state the two state system is in. There is no certainty about what state the bit is in.


Shape, sound, etc., is utterly irrelevant. That is the entire point of formal definitions: they are not ambiguous. A letter can be thousands upon thousands of bits, or 1, or 2, or none. It depends upon the system in which the letter exists. And the fact that this formal definition enables us to also make some intuitively pleasing connections with commonplace uses of "information" is a bonus. When we read words, we do not read them as strings of letters (if you don't believe this, see how long it takes to read a sentence and then type one with the same number of letters and read it). Nor do we interpret sentences in terms of strings of phonemes.
They are not irrelevant. They very much matter. You can't just make up the rules as you go. That is like saying water is a bit, then you take away its hydrogen atom. Guess what? Water is no longer a bit.
And that "whatever" can always be defined in terms of bits. Even in quantum computing, the distinction between qubits and bits has less to do with the definition of "bit" and more to do with the influence of computers.
I know and you are right. It can be described in bits and not just a single bit, but a whole plethora of bits.
 

LegionOnomaMoi

Veteran Member
Premium Member
If a bit is a unit of uncertainty in physics
None of the definitions I cited concerned physics. I deliberately avoided referring to physics texts until I cited Susskind.

then that bit contains no information.
I know what a bit is. It has a definition in electrical engineering, computer science and mathematics too. Which all agree with one another.
All the definitions of a bit in the sciences are based on a mathematical approach to information. Information and meaning are deliberately not equated or directly related because in order to be formal, a definition of information cannot involve the "fuzzy" nature of semantics which are involved in language and which make "meaning" so problematic a basis for any theory of information.

That is exactly what Susskind said in his video. So you are applying entropy or uncertainty as to what the bit may be, not what it actually is.

There is nothing that a "bit" actually is. Because a bit is a property of some system, or is a characteristic of some system. A coin is a bit if and only if the states "heads" and "tails" are treated in a particular way. In an actual computer, every individual bit is actually not either one state or another. Instead, it is a signal within a particular range that is read as an all or nothing signal, making it binary.

As in you have a coin, but just because it lands on heads once doesn’t mean it will land on heads again based on that bit of entropy, so that bit of information is useless.
Entropy is defined in terms of possible states, and so is information. All information is utterly useless in some situation. English is utterly useless for those who don't understand it. Information theory, as a field or as a part of some other field, is science. Which means that information doesn't correspond to the way the term is used in English. Nor does the word "information" correspond well to words in most languages, from ancient Greek to Navajo or Basque. "Useful" is not a clear, easily characterized term. It is not much good to any scientific programme if a term like information requires such a fuzzy notion as "useful".


Because it doesn’t matter what side the coin lands on, unless you are trying to determine how many states it can actually be in. Yet, we already know how many states a bit can be in
A bit is not itself in any state. It is defined as some system which can be in one of two states with equal probability.
Sure, if you are referring to a qubit, but not a regular bit. A regular bit can’t be in two states or have two states like you described. If that is the case then you are interfering with the bit based on the observer effect. A bit in this case would either be:

A) a white flag or not be a white flag.
B) a black flag or not be a black flag.

That misses the second requirement of a binary unit of information. I said one side was white, and the other black (like a coin has "heads" and "tails"). Every coin has two sides just like a flag which has one side white and the other black. What makes it a "bit" is the possible states which can be "read" or "computed" or "measured" by some device, person, etc.


Then your bit of entropy would be what the flag is, if it isn't a white flag or a black flag.

Again, you say a “bit” has a definition in the sciences, but you are specifically referring to physics.
I'm not. I quoted from three academic texts which are on information theory itself. I also linked earlier to the extremely influential paper "The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information"

That paper was published in the journal The Psychological Review, and Miller defines a bit as follows: "One bit of information is the amount of information that we need to make a decision between two equally likely alternatives". That's exactly what Susskind's definition entails.

Again he even says: “a bit is the entropy of a two state system if nothing is known.”
Because, as Miller (in a psychology journal) states, a bit corresponds to the state of a system in which two alternatives are equally likely.

This is more or less a guess as to what state the two state system is in. There is no certainty about what state the bit is in.
Which is why it is a bit. All of information is about uncertainty. If I cannot represent more than one state, I cannot transmit information. You seem to want to equate "bit" and "information" with the way these are used outside of the sciences.


They are not irrelevant. They very much matter. You can't just make up the rules as you go.
I'm not. In fact, I'm following those who made very sure that the "rules" were as clear as possible.

That is like saying water is a bit, then you take away its hydrogen atom. Guess what? Water is no longer a bit.

Nothing in any computer on the planet is run on a system in which anything is physically either in one state or another. What the "bits" of a computer really correspond to are signals within certain ranges. In other words, all the 1's in any binary code implemented on your computer are not identical or in the same state at all. But we design computers to treat them that way.
 

uberrobonomicon4000

Active Member
Legion:

I study computer science and information technology.

I know what a bit is and how computers understand them. All the way down to the binary.
 

LegionOnomaMoi

Veteran Member
Premium Member
Legion:

I study computer science and information technology.

I know what a bit is and how computers understand them. All the way down to the binary.
That seems to be the problem. First, because you are already at a level of abstraction. That "binary" is actually an alpha-cut through a fuzzy set. The computer doesn't distinguish between two absolute binary states. A voltage of 2.6 or 2.7 will be interpreted as 2.9 and therefore a "1", while a voltage close to 0 will be interpreted as a "0".

Second, you are conflating how a computer uses binary code and what binary digits actually are. The fact that a computer relies on a binary based architecture and that this is a matter of voltage states (which are not actually binary, but are treated as such) does not mean that binary digits are defined based only on their use within computers. Were that the case, then all of information theory would be limited to computer science, and more specifically to computer architecture and how it implements boolean algebra. Moreover, even within computer science, a "bit" is by definition something which is either 1 or 0. That is, it has two possible states.
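Roughly, the read-out works like this (a simplified Python sketch; the threshold values are illustrative, not those of any particular logic family):

    def read_logical_bit(voltage, low_max=0.8, high_min=2.0):
        """Interpret a measured voltage as a logical 0 or 1 (thresholds are made up here)."""
        if voltage <= low_max:
            return 0
        if voltage >= high_min:
            return 1
        return None   # in the forbidden band: treated as an invalid level

    # 2.6 V and 2.7 V both read out as the same logical "1",
    # even though the two physical states are not identical.
    print(read_logical_bit(2.6), read_logical_bit(2.7), read_logical_bit(0.1))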

A regular bit can’t be in two states or have two states like you described.

This is more or less a guess as to what state the two state system is in. There is no certainty about what state the bit is in.

Uncertainty is absolutely fundamental. Bits in a computer are only useful because of uncertainty: it is possible for each one to be in one of two states (technically, again, this is not true, as the absolute voltage isn't used), which means you don't know the state in advance. In Morse code, the binary units don't correspond to voltage as in a computer but can be implemented on a drum or by tapping on a wall, because in the end they depend upon the length of a sound which is either in one form or the other. People on ships have used flags to communicate, and although they don't use a binary system, they could. They could communicate by using a flag with two different colors, one on each side. What makes binary units of information useful is that we have some system capable of being in two possible states. The fact that the states are possible rather than determined means that there is uncertainty. Without that uncertainty, bits are useless.

Information theory generalized from physics, math, and communication theory to extend ideas within these. But bits were already abstractions of a system which could be in one of two states. And already information depended upon entropy. Shannon (and Weaver) just proved this.

The only thing that makes anything a "bit" is whether it is a system in which two and only two states are possible, and something can measure/observe the system as existing in one or the other state.

If you understand computers, all the way down to logic gates and silicon, that doesn't mean you are understanding bits the way that they are defined, only the way they are implemented in one particular abstraction. Saying you understand how computers work doesn't exactly say much about why the definitions I gave you in several different technical works, three of which were solely on information theory, are wrong.
 

uberrobonomicon4000

Active Member
That seems to be the problem. First, because you are already at a level of abstraction. That "binary" is actually an alpha-cut through a fuzzy set. The computer doesn't distinguish between two absolute binary states. A voltage of 2.6 or 2.7 will be interpreted as 2.9 and therefore a "1", while a voltage close to 0 will be interpreted as a "0".

Second, you are conflating how a computer uses binary code and what binary digits actually are. The fact that a computer relies on a binary based architecture and that this is a matter of voltage states (which are not actually binary, but are treated as such) does not mean that binary digits are defined based only on their use within computers. Were that the case, then all of information theory would be limited to computer science, and more specifically to computer architecture and how it implements boolean algebra. Moreover, even within computer science, a "bit" is by definition something which is either 1 or 0. That is, it has two possible states.
In all seriousness I don’t think the problem lies with the definition. I have said things multiple times that you have repeatedly responded to with definitions or quotes of how other people have defined a bit of entropy or a bit in general, which don’t conflict with anything I have said. So I almost get the feeling that we are speaking past each other on this topic. With that being said, I have not been relating everything (well, really nothing) directly to a binary system of computer bits. I am just letting you know I understand bits and how they can be manipulated when it comes to computers. This doesn’t disagree with some of the definitions I am most familiar with and some you have also posted.

As in:

'It isn't, as the "unit of information" which is defined as a bit is "the physical realization of a system- any system- that after appropriate preparation and operation 1. can exist in one of two clearly distinguishable, mutually exclusive, states (0–1; no–yes; true–false; on–off; left path–right path; etc.), and 2. can be read-out or measured to determine which of the two states it is"

Roederer, J. G. (2005). Information and its Role in Nature. Springer.'

That is an accurate definition.
Uncertainty is absolutely fundamental. Bits in a computer are only useful because of uncertainty: it is possible for each one to be in one of two states (technically, again, this is not true, as the absolute voltage isn't used), which means you don't know the state in advance. In Morse code, the binary units don't correspond to voltage as in a computer but can be implemented on a drum or by tapping on a wall, because in the end they depend upon the length of a sound which is either in one form or the other. People on ships have used flags to communicate, and although they don't use a binary system, they could. They could communicate by using a flag with two different colors, one on each side. What makes binary units of information useful is that we have some system capable of being in two possible states. The fact that the states are possible rather than determined means that there is uncertainty. Without that uncertainty, bits are useless.

Information theory generalized from physics, math, and communication theory to extend ideas within these. But bits were already abstractions of a system which could be in one of two states. And already information depended upon entropy. Shannon (and Weaver) just proved this.

The only thing that makes anything a "bit" is whether it is a system in which two and only two states are possible, and something can measure/observe the system as existing in one or the other state.

If you understand computers, all the way down to logic gates and silicon, that doesn't mean you are understanding bits the way that they are defined, only the way they are implemented in one particular abstraction. Saying you understand how computers work doesn't exactly say much about why the definitions I gave you in several different technical works, three of which were solely on information theory, are wrong.
I have not been disagreeing with any of your definitions or saying they are wrong. I have been disagreeing with your interpretation of some of them, as to what a bit actually is or represents. Just like you have been disagreeing with everything I have said. So I guess neither one of our arguments really carries a positive or negative charge; we just stabilize or neutralize each other's arguments.

Which is kind of funny, and also annoying, but not really, because we have been going on and on about bits for quite some time now.

So here is my final question, since I haven’t really asked any.

How can a bit be in two possible states when dealing with a black hole, since no one has actually gone into a black hole or gotten close enough to physically put anything in it? How could this not be a curvature or fluctuation in space, like a ripple in the water? That is what I would like to know.

Not a tireless debate on bits! :p Unless of course you have anything else you would like to say about bits that is more related to electrical charges or components of some system that involves photons.
 