I must not be understanding you, for which I apologize. In all seriousness, I don't think the problem lies with the definition. I have said things multiple times that you have repeatedly responded to with definitions or quotes as to how other people define either a bit of entropy or a bit in general, none of which conflict with anything I have said.
That does seem to be the case. So I almost get the feeling that we are speaking past each other on this topic.
I believe you, and I thought that perhaps that might be the issue, because of the way bits are treated somewhat differently in computer science/programming/etc. It seemed from some of your statements that your issue was with the use of uncertainty to characterize units of information. I may have read too much into this, as well as into your comment about knowing computers, because there actually is a difference in emphasis when it comes to computers vs. information science. That being said, I have not been relating anything directly to a binary system of computer bits. I am just letting you know that I understand bits and how they can be manipulated when it comes to computers.
This doesn't disagree with the definitions I am most familiar with, some of which you have also posted.
As in:
'It isn't, as the "unit of information" which is defined as a bit is "the physical realization of a system - any system - that after appropriate preparation and operation 1. can exist in one of two clearly distinguishable, mutually exclusive, states (0/1; no/yes; true/false; on/off; left path/right path; etc.), and 2. can be read-out or measured to determine which of the two states it is"
Roederer, J. G. (2005). Information and its Role in Nature. Springer.'
That is an accurate definition.
From some of what you said earlier (in particular, what I quoted in my last reply), it seemed like the issue was one of knowledge about the states. That's a terrible way to phrase it, but with computers, the uncertainty part of information is not the focus and is sometimes not mentioned at all, even when the discussion gets down to the level of voltage. There doesn't need to be any mention of uncertainty, or any emphasis on the irrelevance of the particular physical system, because whatever digital technology one is using, bits work pretty much the same way and the binary "system" is always the same thing.
How can a bit be in two possible states when dealing with a black hole, since no one has actually gone into a black hole or gotten close enough to physically put anything into it?
Here's where I think we might be talking past one another. A bit cannot be in two possible states, ever. It is defined as the amount of uncertainty we have before a measurement of any system for which an actual measurement (or observation) would yield one of two possible states. Particles have spin, even massless particles like photons. The fact that quantum systems can be entangled means that you can know something about one system by performing a measurement on the other, even at a distance. That ability to know translates into nonlocal measurements of sorts (but causality constraints make it impossible to transmit information nonlocally).
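To make the "amount of uncertainty" part quantitative, here is a sketch using the standard Shannon formula (textbook material, nothing specific to black holes):

$$H = -\sum_i p_i \log_2 p_i$$

For a system with two equally likely outcomes, $H = -\left(\tfrac{1}{2}\log_2 \tfrac{1}{2} + \tfrac{1}{2}\log_2 \tfrac{1}{2}\right) = 1$ bit. That one bit is the uncertainty before the measurement; the measurement itself always yields a single definite state, never two.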
Basically, the issue is that the laws of thermodynamics from classical physics turn out to have a direct relation to measurement in quantum physics, which means they also have a direct relation to information. In statistical mechanics, entropy dealt with uncertainty that was epistemological: it reflected our lack of knowledge of the system's exact microstate. In QM, the uncertainty principle guarantees that there will be a certain amount of entropy no matter what, and Planck's constant determines how much uncertainty we have for any observation of a system.
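For concreteness, one standard place Planck's constant enters is the Heisenberg relation (again, just the textbook inequality as a sketch):

$$\Delta x \, \Delta p \geq \frac{\hbar}{2}$$

where $\hbar = h/2\pi$. No preparation of the system can push the product of those uncertainties below that floor, which is why the residual entropy is a matter of principle rather than of our ignorance.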
So the laws of physics entail certain things which (it appeared) black holes violate, because they seemed to entail a violation of conservation (which, in QM, can be framed, and is actually easier to frame, in terms of information, because the "states" of quantum systems are entirely mathematical until observation).
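To spell out the conservation I mean (a standard sketch, not anything novel): quantum states evolve by a unitary operator $U$ with $U^\dagger U = I$, so inner products are preserved:

$$\langle \phi' | \psi' \rangle = \langle \phi | U^\dagger U | \psi \rangle = \langle \phi | \psi \rangle$$

Distinguishable states therefore stay distinguishable under evolution, which is exactly the sense of information conservation that black holes appeared to break.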