This is my only question on logic in RG; there are other questions on applications of logic, which I recommend.
Truth values can come in any number and type, not just two or three; it depends on the finesse desired. Information processing and communication seem to be described by a tri-state system or more, in classical systems such as FPGAs, ICs, CPUs, and others, in multiple applications programmed by SystemVerilog, an IEEE standard. This has replaced the Boolean algebra of the two-state system indicated by Shannon, also in gate construction with physical systems. The primary reason, in my opinion, is in dealing more effectively with noise.
Although, constructionally, a three-state system can always be embedded in a two-state system, efficiency and scalability suffer. This should be more evident in quantum computing, offering new vistas, as explained in the preprint
Tri-state+ communication symmetry using the algebraic approach
As new evidence accumulates, including from modern robots interacting with humans in complex cyber-physical systems, this question asks first whether only a mathematical description of reality is evident, while a physical description is denied. Ternary logic would then replace the physical description of choices with a possible, third truth value, one we already face in physics, biology, psychology, and everyday life, where choices are more than a coin toss.
The physical description of "heads or tails" is denied in favor of opening up to a third possibility, and so on, to as many possibilities as needed. Are we no longer just black or white, but open to a blended reality as well?
In reference to tri-state, ternary, or three-valued logic, and why it may solve cases that two-value logic, or Boolean algebra, struggles with, see
Pablo Cobreros, Paul Égré, David Ripley & Robert van Rooij (2014) Foreword: Three-valued logics and their applications, Journal of Applied Non-Classical Logics, 24:1-2, 1-11, DOI: 10.1080/11663081.2014.909631
at https://www.tandfonline.com/doi/citedby/10.1080/11663081.2014.909631
The question, "Who is the King of France?" has no True or False answer, not even "no one." There is no King of France presently, the correct answer is not the name of a person, or vacant.
Three-valued logics, besides contingency, reference failure, and vagueness, have been associated with at least three other phenomena of interest in which the notion of indeterminacy plays a central role – namely conditionals, computability, and the semantic paradoxes. Our work adds quantum, as in the Bohr hypothesis of all-states-at-once, obscurity, ambivalence, lack of perceived form, and many others (e.g., adding a third value in ternary logic leads to a total of 3^3 = 27 distinct operators on a single input value, and two-input operators give ternary logic 3^(3×3) = 19,683 possibilities; a small counting sketch follows the links below), as given in
https://www.researchgate.net/publication/290284114_What_Is_Identification_That_We_Can_Identify_It_Part_I
https://www.researchgate.net/publication/290283924_What_Is_Identification_That_We_Can_Identify_It_Part_II
https://www.researchgate.net/publication/347563918/
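As a quick sanity check on those operator counts, here is a minimal Python sketch; the encoding of truth tables as tuples is just one illustrative choice, not taken from the papers above:

```python
from itertools import product

# Three truth values; 'Z' is the third, non-Boolean one.
VALUES = (0, 1, "Z")

# A unary operator is any truth table VALUES -> VALUES: 3^3 = 27 of them.
unary_ops = list(product(VALUES, repeat=len(VALUES)))

# A binary operator is any truth table VALUES x VALUES -> VALUES: 3^(3*3).
binary_op_count = len(VALUES) ** (len(VALUES) ** 2)

print(len(unary_ops), binary_op_count)   # 27 19683
```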
Whether or not we can observe something directly, contemplating its possible existence and reference frames may allow us to understand how it might play a role in how the world works. Physics does this routinely.
It is impossible for science to pursue truth directly, because we do not know where truth lies, and so we have to base ourselves on experiments and observations, which can be flawed in more than one way. This, of course, can lead to flawed conclusions, so even the most secure scientific statements should never be confused with truth or its absence. This is not two-level logic.
In normal two-state computing today, one can achieve higher throughput by simply replacing multiplication with "repeated addition". This unrolls the loop and uses fast addition instead of sluggish multiplication. One can find many videos on YouTube about this "trick".
A similar effect happens with tri-state: pre-computation of results in three logical truth values can offer nearly twenty thousand (3^9 = 19,683) distinct two-input operators for discrimination, versus 16 (2^4) in the binary case. This is without any quantum consideration.
In the control and protective circuits of complex electrical systems it is frequently necessary to make intricate interconnections of relay contacts and switches. Examples of these circuits occur in automatic telephone exchanges, industrial motor control equipment and in almost any circuits designed to perform complex operations automatically. Two problems that occur in connection with such networks of switches will be treated here. The first, which will be called analysis, is to determine the operating characteristics of a given circuit. It is, of course, always possible to analyze any given circuit by setting up all possible sets of initial conditions (positions of switches and relays) and following through the chain of events so instigated. This method is, however, very tedious and open to frequent error.
The second problem is that of synthesis. Given certain characteristics, it is required to find a circuit incorporating these characteristics. The solution of this type of problem is not unique and it is therefore additionally desirable that the circuit requiring the least number of switch blades and relay contacts be found. Although a solution can usually be obtained by a "cut and try" method, first satisfying one requirement and then making additions until all are satisfied, the circuit so obtained will seldom be the simplest. This method also has the disadvantages of being long, and the resulting design often contains hidden "sneak circuits."
The method of solution of these problems which will be developed here may be described briefly as follows: Any circuit is represented by a set of equations, the terms of the equations representing the various relays and switches of the circuit. A calculus is developed for manipulating these equations by simple mathematical processes, most of which are similar to ordinary algebraic algorisms. This calculus is shown to be exactly analogous to the Calculus of Propositions [Boolean algebra] used in the symbolic study of logic. For the synthesis problem the desired characteristics are first written as a system of equations, and the equations are then manipulated into the form representing the simplest circuit. The circuit may then be immediately drawn from the equations. By this method it is always possible to find the simplest circuit containing only series and parallel connections.
These were the opening words of Shannon's Master's thesis (see the file attached). They are true for idealized switches, or relays. However, for real cases one needs three states. This deeply modifies quantum computing, deprecating qubits, as we show (op. cit.).
Trust is what gives meaning to information (Ed Gerck, op. cit.) and allows information to be evaluated. That needs more than just information.
For example, the same letters GIFT can mean "present" or "poison", depending on one's trust in the speaker's language (English or German). Shannon's arguments do not even address this discrepancy, which does not relate to information "in the wire".
See: Tri-state Quantum Information Model, available from https://www.researchgate.net/publication/347563918_Tri-state_Quantum_Information_Model
The Texas energy crisis and the Capitol voting revolt are trying to show us more than failure. That the systems that make our lives run are fallible is not new. But we should think about what we'll do when they fail. A third truth value gives us a different logic, to deal with indeterminacy. Not just true or false, but also obscure and ambiguous, as well as other forms of indeterminacy, affect the results. Not all answers are yes or no.
Voting fraud includes fraud against voting. Things are not so black/white, and we are seeing it all around us. What will be the next example?
Yep, this is already part of more fault-tolerant languages in the shape of the "null" value (true times null equals null, false times null equals null).
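A minimal Python sketch of that strict null-propagation rule, using None as the third value (note that SQL's logical AND follows Kleene rules and differs in some cases; this is only an illustration):

```python
# Strict null propagation, as described above: any operand that is null
# makes the result null.  (This mirrors SQL *arithmetic* with NULL; SQL's
# logical AND/OR follow Kleene rules instead and differ in some cases.)

NULL = None

def times(a, b):
    """'Multiply' two truth values, propagating null strictly."""
    if a is NULL or b is NULL:
        return NULL
    return a and b  # treating 'times' as conjunction for plain booleans

print(times(True, NULL))    # None
print(times(False, NULL))   # None
print(times(True, False))   # False
```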
I've argued in the past that nulls ought to be supported at the hardware level. Currently, standard signed integer ranges tend to be asymmetrical -- the range of eight-bit values runs from zero to 127 and from minus one to minus 127, with the bit pattern that would have been "minus zero" treated as an additional integer value, "minus 128". Asymmetrical ranges mean that if we take clipped measurement data (such as recorded audio) and invert it, the -128 values become +128, which is an unsupported value, leading to an overflow and a crash.
With hindsight it would have been far more sensible to have used the extra value to signify "null". You'd no longer risk crashing an app just by inverting a datastream's polarity, you wouldn't need so many manual overflow checks, and if your software accidentally did something silly like dividing by zero, the result would be a legitimate "null" ("disregard me, I'm not reliable data") rather than a program halt.
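A small pure-Python sketch of the asymmetry and the inversion problem described above (simulating 8-bit two's-complement wrap-around, no particular hardware assumed):

```python
def to_int8(x):
    """Wrap an integer into the signed 8-bit range -128..127 (two's complement)."""
    return ((x + 128) % 256) - 128

for v in (127, 1, 0, -1, -127, -128):
    print(v, "-> inverted:", to_int8(-v))
# -128 "inverts" to -128 again: +128 does not exist in the 8-bit range,
# which is exactly the clipped-audio polarity-inversion problem described above.
```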
The absence of a null in badly-designed computer systems was also partly responsible for the "y2k" problem. Without the option of adding "unknown" as a date value, programmers used specific real two-digit year codes as "private code" for "unknown", or "null", and when those real dates came around ... problems.
Some languages will use "certainty registers", so that 2 times null is null, but two plus null is two higher than zero plus null - this might be expressed as a "two", and an additional error code flag. The use of multi-purpose error-state registers allows calculations to continue even with unreliable input data, and can be especially useful in the contexts of avionics and spacecraft design, where you may need to be able to continue calculating even if critical sensors fail or go offline.
In the case of our two-sensor problem, where one sensor gives outputs of zero and then two, and the other is offline, and where both values need to be added, then with consecutive inputs from the "good" sensor of zero and two (plus a state error flag), we can at least see that the new combined reading has just increased by two, rather than discard the entire calculation. For calculations based on multiple sensors, certainty or uncertainty (or probability) values can be calculated in the background in parallel to the actual data, so that uncertainty over the values reported by a sensor is automatically reflected in the certainty of the final calculated answer. If the sensor only plays a small part in the calculations or decision-making, the result is reported as confident ... if its dubious data is critical to the final decision or calculation, its assigned uncertainty makes a correspondingly larger contribution to the final reported uncertainty.
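A hedged sketch of that two-sensor case in Python; the Reading type and its propagation rule are invented here for illustration, not taken from any existing library:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    value: float     # best-available numeric value
    reliable: bool   # certainty flag carried alongside the data

    def __add__(self, other):
        # Keep computing with the numbers we have; the combined flag
        # records that part of the input was unreliable.
        return Reading(self.value + other.value,
                       self.reliable and other.reliable)

good_before = Reading(0.0, reliable=True)
good_after  = Reading(2.0, reliable=True)
offline     = Reading(0.0, reliable=False)  # offline sensor: value is a placeholder

before = good_before + offline
after  = good_after + offline
print(after.value - before.value, after.reliable)   # 2.0 False
# The combined reading still shows the +2 change instead of being discarded.
```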
Normally these uncertainty registers are implemented at the language level. It would be nice if (for processors used in spacecraft and other critical applications, like maybe self-driving cars) they could be implemented at the processor level, as hardware registers. Ideally, you want to be able to aim for having multi-processor systems, where wires can be cut, or modules can be switched in and out "live" and "hot-swapped" while the vehicle is in motion, and have the distributed decision-making system not care. As soon as a module's electrics are compromised, it flags itself as unreliable, and its contribution to the final decisions is downgraded accordingly. If it's completely unplugged, or its datastream stops, or shows checksum corruptions, then again, the status of its data values gets downgraded w.r.t. their credibility.
If a seven-redundant-processor deep-space craft has multiple failures, and its seven modules are reduced to two, and these two report conflicting results, then ideally we ought to be able to choose one of them based on which set of data is more reliable. Or if multiple modules give conflicting results, then if module A has a rotten gyroscope but good optics, B has bad optics and a partially-working acceleration sensor, and C has a great gyro but dodgy optics, then instead of discarding all three modules' data as unreliable, we can still feed their data into the algorithm, and with proper uncertainty values, still be able to carry out meaningful calculations. All three might normally provide conflicting final data that's flagged as utterly unreliable (leading to all three modules being disregarded), but with certainty registers, they can still share their source sensor data and hopefully calculate the same output.
Of course, a certainty register isn't a complete answer, since some people might want a second register to express how confident we are that the first register's certainty values are reliable. In some cases, we might not need the certainty register to have the same resolution as the actual data, in which case we can steal a few bits from the certainty register and use them as general error flags and state flags, for differently-nuanced versions of "null" ("this certainty value seems legitimate, when manually sanity-checked, outside the usual calculations"; "we're declaring all these generated certainty values as good, because the module had a full checkup recently"; "this number represents a calculation where even the supposed certainty is unreliable").
Climate change is another example. Things are not just bad. There are many good aspects of climate change, of course. But the insistence on yes or no makes it difficult to wait, and see evolution working. Indeterminacy may require delayed evaluation, and goals that fluctuate.
These remarks are in relation to modeling the data used to convey statements about conjectures. It seems that reports of Truth States (true/false) are interesting either in-the-moment or in-a-temporal-sequence. It seems also that identifying changes in the data in a Truth Status report underlies this interest in a pragmatic/engineering sense. The suggested true/false/indeterminate statements seem to imply a report of an in-the-moment state. As has been pointed out, we have not been universally successful in elegantly modelling indeterminate states. However, the data required to carry out such modelling may already be in the data sets, as Ed Gerck suggests, but in a richer form if the indeterminate state is placed in a change sequence. The data may show, at one moment, a state reported as true in relation to some criteria and, at the next, a state about which no Status is able to be reported because some identified change occurred in the data. This data not only identifies what characterises the indeterminate state but also the data against which subsequent data changes may be evaluated as to no change or the direction of possible change: either towards a False Status or towards a return to a True Status. Given change in data, comparison with known prior data may enable statements about the probability of subsequent changes resulting from particular changes in the prior data record. That data-in-a-process transition is temporal information hidden in binary logic. These simple constructs may, in the first instance, be modeled with four characters, T, Y, F, X. Or perhaps more appropriately in a two-bit 4-valued Gray code, 00, 01, 11, 10.
Where
00 stands for false
01 stands for moving towards true
11 stands for true
10 stands for moving towards false.
These suggestions may be directly extensible in Gray code (to include fuzzy logic) to any depth required.
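A minimal Python sketch of the suggested two-bit Gray-code encoding (the single-bit-flip transition rule is my reading of the proposal, not a definitive implementation):

```python
# Two-bit Gray code for the four-valued, direction-aware logic suggested above.
STATES = {
    "00": "false",
    "01": "moving towards true",
    "11": "true",
    "10": "moving towards false",
}

# Adjacent codes differ by exactly one bit, so a single bit-flip models one
# step of change around the cycle F -> ->T -> T -> ->F -> F.
CYCLE = ["00", "01", "11", "10"]

def step(code, forward=True):
    i = CYCLE.index(code)
    return CYCLE[(i + 1) % 4] if forward else CYCLE[(i - 1) % 4]

print(STATES[step("00")])   # moving towards true
print(STATES[step("11")])   # moving towards false
```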
RB and all: The importance of evaluating indeterminacy cannot be overstated. A tri-state system may be all one needs, as it can embed more states, recursively, ad infinitum, and is simple enough to code today.
Let's call the three states 0, 1 and Z, avoiding numerical bias on Z. The diagram shows how we can get three states with today's systems that only seem to offer two, either 0 or 1.
In the Chinese language, one can routinely say "Huh" as a Z state when asked Yes or No, to the same effect as in the diagram.
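As a concrete illustration, one possible Python encoding of the three states 0, 1, and Z, with Kleene-style connectives (one of several reasonable choices for the semantics of Z):

```python
Z = "Z"  # the third, indeterminate state (neither 0 nor 1)

def t_not(a):
    return Z if a is Z else 1 - a

def t_and(a, b):
    if a == 0 or b == 0:      # a definite 0 decides the conjunction
        return 0
    if a is Z or b is Z:      # otherwise indeterminacy propagates
        return Z
    return 1

def t_or(a, b):
    return t_not(t_and(t_not(a), t_not(b)))  # De Morgan still holds

print(t_and(1, Z), t_or(0, Z), t_not(Z))     # Z Z Z
```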
Binary modelling may be viewed as being based on an in-the-moment logic. Dynamic systems may be viewed as being based on an in-a-temporal-sequence logic with many possible data sets and many possible states.
Dynamic systems models may include multiple data sets, the variations in which may include indeterminacy in data availability that precludes binary decision making. The suggestion Ed Gerck makes of including indeterminacy as a logical attribute is clearly appropriate for some overall states of complex systems. If data about complex system attributes moves to a system state where evaluations show data in a desired range, then no changes may be required. Missing or out-of-range data may generate uncertainty about the overall system state. Change may be required following evaluation of the data. Further observation (data collection) following a change may show that data about the system's attributes trends towards reducing uncertainty, that is, 'fixing' the system. Or the data may show varying degrees of 'failing'.
Identifying the difference between fixing and failing seems to be a critical system management strategy. Hence the suggestion of logically differentiating between the two types of uncertainty in time-modelling complex systems. Doing so requires a four-valued logic in order to avoid losing system evaluation criteria that may be found in the often imperfect data sets characteristic of complex systems.
RB: dynamic evaluation is needed in robotic systems and day-to-day life.
Communication is done better with Verilog, which has a coherent bus between different challenge-response systems.
The bus is coherent, avoiding race conditions between competing responses arriving back and new challenges being issued. Communication code uses the Curry-Howard relationship to provide an out-of-band interconnect between different systems, like a wormhole connecting different universes.
It is not just a matter of having 3, 4, or n states. One also needs a coherent bus, offering an out-of-band signal (trust) for interconnection. Both information (the unexpected) and trust (the known) are needed for communication to exist.
For example, if a friend tells me he is sending me a GIFT, if I have the out-of-band information (trust) that he speaks German, I may interconnect expecting a poison -- which is the meaning of GIFT in German.
EG: Current uses of Verilog (:)) suggest that it is highly successful in modelling hardware-level electronic systems. Your instances in the discussion above move far beyond abstractions of hardware. Can you assist us further by showing how this tool moves from what may be seen as essentially instantiations of syntax to what seems to be becoming a discussion about the semantics of the underlying logic?
RB: This points to the good use of philosophy in highly technical areas. As another example, people need to learn about ad hominem attacks and other modes of failure in discussions, such as the "sharpshooter paradox", in order to avoid them in mining the gold of truth among peers. As human beings, we are all peers to one another.
The semantics of the underlying logic comes from various sources. One particular important one comes from the Curry-Howard relationship (CH). This is worth studying because it is readily and widely applicable. The CH establishes a connection between computer code and structural logic.
So that, if you can produce working code, the logic is thereby proved by using CH. Our group applied it in studying subjective, intersubjective, objective, and abstract realities in CS, a matter even more important today with modern robots such as self-driving cars. We discussed how different protocols, and persons, can view the same data differently, in subjective, intersubjective, objective, and abstract reality modes. See a summary at:
https://www.researchgate.net/publication/318661666_On_ABSTRACT_OBJECTIVE_SUBJECTIVE_and_INTERSUBJECTIVE_Modes
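For readers new to the Curry-Howard relationship, a tiny Lean sketch makes the point that a well-typed program is a proof; the definitions and names below are mine, purely illustrative:

```lean
-- Curry-Howard: a well-typed term *is* a proof of its type.
-- A function of type A → B is a proof of the implication A → B.
def modusPonens {A B : Prop} (h : A → B) (a : A) : B := h a

-- A pair of proofs is a proof of the conjunction.
def andIntro {A B : Prop} (a : A) (b : B) : A ∧ B := ⟨a, b⟩
```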
My remarks were directed towards ways of dealing with the difficulties of accommodating change in binary logic and the importance of the added possibilities of introducing temporal direction to temporal logic in the analysis of change in empirical worlds.
I am less interested in provably correct code that makes no empirical claims. Provably correct code based on rule-following still leaves questions of 'fit', in empirical applications, unanswered. Pragmatically correct code enables confidence in our applications in our empirical complex world. All these statements are assertions for which all the evidence is not yet in.
However, the discussion paper on The Principles of Ternary Logic provides a useful perspective on the nature of the third value that bears upon the interesting work you are engaged in.
See: https://firstfrontiers.com/
The 'elephant' in the room of provable correctness turns on the problem of undecidability. Lots of really interesting ideas fail because there are, to date, still problems whose solutions fail to pass the barrier of contradiction in binary logic. And because, as has been pointed out above, any other current logic may, with difficulty, be modeled in binary logic, the current situation seems unlikely to change. It follows that the current abstractions we use must fail unless a means is found of showing empirically, outside binary logic, that the Halting Problem can be approximately resolved. Moving to a 3-valued logic, essentially an oscillator, introduces the possibility of a solution. In moving to a four-valued logic we introduce the possibility of a range of satisficing logical solutions in temporal sequences, following Herbert A. Simon, and that may be the best we may have at the moment.
RB: Yes, logic has turned out to mean "proof", or better, "evidence". The barrier of contradiction in binary or Boolean logic is solved by abandoning LEM (Law of Excluded Middle), which one must do with three states! There IS now a valid alternative between True and False. The world is not Boolean.
A four (or more!) valued logic can then be built recursively on the three-valued logic. For example, we use a 4-valued logic in DAOF logic, for Distinguished, Ambiguous, Obscure, and Formless, useful e.g. for the analysis of texts. "The present King of France" is F, while a "rubber band" should be considered A, as it can be both a "fixing device" and an "erasing rubber" (an eraser). A teacher, ignoring that A level for the rubber band, may give a student a text written in pencil, with a rubber band to fasten it, only to receive the text back modified as the student wants!
Hopefully I can prove that everything that the translation says looks good (‘is true’) is provable in my system? (completeness)
RB: I think -- in a general view -- only by using three states, or more states, recursively, can one prove that a system has completeness, so that everything that the translation says looks good ('is true') and is provable in it, thus complete and consistent.
This goes beyond Gödel, who found that Boolean logic systems at least as complex as arithmetic can be either complete or consistent, but not both. The "limitation" is not in the complexity of arithmetic, but in the Boolean logic. The world is not Boolean.
The Halting Problem's grandfather is Decision Theory (DT). Of the two types of DT, descriptive theory, based on the retrospective evaluation of decisions under constraints after the fact of deciding, seems more open to investigating what people do than normative theory. Much depends on whether the goal is to emulate human decision making (so study/model human behaviour) or to investigate how to instantiate some aspects of predetermined decision making based on sets of logically defined variable constraints. There is a vast literature on the first and not a lot on the second.
In another field of choice-making a question was raised about the impact of normative sequential decisions on the probability of particular outcomes. I modelled this as a four-valued cellular automaton analogous to Langton's two-valued Ant. It might be expected that the eventual sequences, and the model behaviour they construct, would see, with four options, a distribution that reliably generated a 25/25/25/25 mix in the outcome behaviour pattern. That was not the case, and it is a salutary lesson for the expectations of anyone involved in investigating sequences of algorithmic outcomes.
All and RB: This can be better understood by using quantum theory. But why would quantum raise its head in the classical? In classical logic, of all fields, so theoretical and, apparently, just what one wants?
Because quantum is the underlying reality of the universe, on which the classical is built, and we measure it objectively using universality (in physics, as given by the renormalization group). That is also how we know that everything must be modified to include quantum, inducing an overall change. Not only were Maxwell's equations wrong, as they did not allow diamagnetism or the laser (now 75 years old), but general relativity must be wrong as well.
The "arrow of time", surpringly, offers a way out also -- by itself. The "arrow of time" is about the apparent impossibility of "remembering the future" in physics. Sean M. Caroll, of Caltech, talks extensively about it. As neuroscience tells it, "deja vu" is a recognizable and valid phenomenon of "remembering the future" -- not an illusion by deranged patients, as Caroll would see it. The future seems to be accessible, at least partially, to the powers of the mind.
Mathematically, one can also "remember the future" by doing a simple Maclaurin series, or by doing any calculus or analysis.
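A small Python sketch of that idea: extrapolating a signal a short step into the "future" from its present value and derivatives, i.e. nothing more than a truncated Maclaurin/Taylor expansion:

```python
import math

def taylor_predict(f_derivs, dt, terms=4):
    """Predict f(t + dt) from the derivatives of f evaluated at t."""
    return sum(f_derivs[n] * dt**n / math.factorial(n) for n in range(terms))

# Example: "remember the future" of sin(t) at t = 0.
# Derivatives of sin at 0 cycle through 0, 1, 0, -1, ...
derivs_at_0 = [0.0, 1.0, 0.0, -1.0, 0.0, 1.0]
dt = 0.5
print(taylor_predict(derivs_at_0, dt, terms=6))  # ~0.4794
print(math.sin(dt))                              # 0.4794...
```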
Thus, the "arrow of time" is just surprising when we consider Boolean logic as true. In the real world, however, TRS, of ref.(1), is valid, and although it may masquerade as Boolean (TRS may be embedded as Boolean), everything is decidable. And ambiguity is a valid reality, as well as other forms of uncertainty.
This interconnects with race conditions and the halting problem in CS, showing that they are already solved in TRS, ref. (1).
[1] https://www.researchgate.net/publication/350214519_TRS_Tri-State_Communication_Theory_and_Trust_Analogy_Semantics_connecting_computer_code_with_structural_logic_by_the_Curry-Howard_relationship
Either true or false is ridiculous. It is true, false, or neither of the two.
Everything is in a state of becoming something other than it is until the last movement ceases and quantum probability gives up. Fortunately no sign of that yet.
All: the question here is whether nature and QM can be used as a better foundation for the concept than made-up postulates from the last 4,000 years of quaint mathematics.
Where do these "hidden structures" in math come from? If we accept that QM is our most comprehensive theory of nature, then it is logical to look for an answer to this math problem in nature and using QM, not in made up concepts that deny such structures a priori.
This is not a call for Platonism, or a return to the mathematical program proposed by David Hilbert in the 1920s.
In structural logic, a system is inconsistent if it admits a proof of the absurd.
If the system has a cut elimination theorem, then if it has a proof of the absurd, or of the empty sequent, it should also have such a proof without cuts. Since inspection of the cut-free rules typically shows that no cut-free proof can end in the absurd or the empty sequent, once a system is shown to have a cut elimination theorem it is normally immediate that the system is consistent.
An aside: on scrutiny of the history of the development of science there appear to be no essences other than in Platonic worlds, which exist only in the mind. We construct more or less satisfactory models of our beliefs, for which we then construct confirmatory models, knowing that they are both place markers for the yet-to-be-put-forward 'improved' models supported by 'better evidence'.
That is a healthy position to adopt because it leaves other constructed possibilities open and prevents the return of scholasticism.
RB : The fish does not taste the water it swims in. We do not taste water. We are limited by our paradigms. That is why it is important to force the limits, so that we can enlarge our vision.
The dictatorship of the LEM (law of the excluded middle) is forced open by a third truth level. And soon we discover more levels, not necessarily recursive on three; one can follow Galois fields, in extended definition, to unlimited logical states, and/or use polymorphism.
RB: In addition, I agree with your last part above. That would be a healthy position to adopt, as said above, but it must also leave out failed constructed possibilities as being 'closed' -- a matter of history. They have to stay behind, which prevents the return of empty scholasticism.
EG: Agreed that failed possibilities must be left behind.
We can expect no new outcomes from doing the same thing over without modification or adjustment to overcome the reason for failure. However, knowing about past errors and reporting them is an efficient strategy for reducing mistakes. Written into history are the heroic failures who eventually 'came good'. Edison said something like: "I have not failed. I've just found/constructed 10,000 ways that won't work".
RB: But Tesla, facing Edison, calculated first and avoided the "10,000" experiments. It is important to deprecate; it is similar to the god of destruction in Hindu religion, and to the quantum mechanical states of second quantization, where one creates and destroys. Destruction must exist, we learn, to avoid accumulation of clutter. It is used in housecleaning, and in running code.
Hi Ann Gerck,
No problem with your remarks except to say that problem classes have, in their expression, problem solutions. It seems that problem solutions are generated at higher levels of abstraction, and then frequently applied at lower levels. You are correct that the detritus of research/running code needs house cleaning and the garbage taken out. I was referring to the content of abstractions where solutions are generated, and noting that learning comes from changes in approach, that is, changes in strategies.
In an earlier remark I said that Ed's work was interesting because a new approach is suggested where old ones failed (to accommodate quantum phenomena elegantly). So, is the fundamental issue how to model, in running code, states, events, processes and outcomes with inadequate tools? If so my first response would be to change the tools and test the outcome, reporting the results.
RB: According to the laws of logic, a sound statement always begets a sound statement, with no chance for a stray aberration to be born -- that would be wrong, not a sound conclusion.
Therefore, we don't need to keep a roster of "don't dos" together, ready for consultation, when that moment could need it -- that moment will never come. Keeping it as a failed experience, in history, is enough of a bad memory.
There are many examples, where paying too much attention to the negative experiences that one wants to avoid, can actually create a path for it to manifest again. So, let's move forward only with the good, the bad cannot be created from it. There is nothing to fear.
RB: One can realize that the numbers that can be reached from an operation must all have a quite specific kind of symmetry.
We have an example of a quintic polynomial, not solvable because those solutions have a more open symmetry than the kind of symmetry which some set of polynomial formulas can generate, not because there is 'no solution'.
Therefore, it must be impossible to represent its solutions in terms of those kinds of formulas.
Saying this again, if all one has are these ‘solution specific’ formulas, which generate collections of numbers that have a specific kind of symmetry, and since we have an example of a polynomial in x^5 whose solutions do not contain this symmetry, then we must deduce that it is not solvable by a formula we have.
According to Galois Theory, the reason that polynomials in x^5 do not have a general solution formula is because the group S_5 contains a non-abelian normal subgroup that does not have any non-trivial normal subgroups (it’s the subgroup A_5).
That is why things work up to x^4, because the groups S_4, S_3 and S_2 are too small for anything to go wrong.
It is not so much that things were going well and then we just had a hiccup for x^5, but rather the opposite: things went well for x^2, x^3 and x^4 because they are special cases – the groups are too small to have any ‘bad’ behaviour.
Therefore, things can go 'wrong' as groups get bigger, because we can find more inconsistencies. We can avoid that by using a different group, larger in the symmetries that it offers, even though it seems like an overkill for smaller cases.
We are finding that out in information theory for communication. In a simple case, we can consider information given by Boolean logic, where LEM is valid, and Shannon's theory mapping Boolean logic to electric relays. Thus, it seems that any electrical circuit of relays can be analysed and synthesized by following Boolean logic, as Shannon did so well.
Things seem to work well, but some problems require tri-state logic in a challenge-response system and coherent semantics, as it can be programmed by Verilog, to work better. What happened? We explain it in terms of a Topological Relationship (TR, as we call it), in which a continuous path in higher dimension MUST map to a discontinuous path in lower dimension, a well-known theorem. The dimension of the problem is larger, which can be explained as a 'different symmetry' above. The problem can be solved in lower dimension, but with less scalability and more approximations.
We find the same in quantum physics, where qubits reveal themselves to be too simple, as do qutrits and qudits; one needs qtrust with GF(3^n), with an open-ended number of states and trust semantics, which must be used.
The simple rule, to put it simply, is that a round peg doesn't fit into a square hole.
One can then use such common sense to demystify Galois fields. We note that, in the case of a quintic polynomial, n = 5 is the smallest value of n such that the symmetric group of degree n has a non-abelian simple subgroup.
In other words, for the general polynomial of degree n (i.e. with indeterminate coefficients), the Galois group is S_n, and the alternating group A_n is simple for n ≥ 5 and is, of course, not abelian. Hence S_n is not solvable for n ≥ 5.
This does not mean that a quintic polynomial is not solvable. It is just not solvable by what used to work for n ≤ 4.
Yes, we know the quintic polynomial is solvable in some easy cases. Say
(x-1)(x-2)(x-3)(x-4)(x-5)
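A hedged check with SymPy (assuming SymPy is installed; the second polynomial, x^5 - x - 1, is a standard example whose Galois group is S_5, so it has no radical solution):

```python
from sympy import symbols, solve

x = symbols('x')

# The "easy" quintic above factors over the integers, so its roots are explicit.
print(solve((x - 1)*(x - 2)*(x - 3)*(x - 4)*(x - 5), x))   # [1, 2, 3, 4, 5]

# A general quintic with Galois group S_5 has no solution in radicals;
# SymPy can only return symbolic root objects (CRootOf), not radical formulas.
print(solve(x**5 - x - 1, x))
```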
Unfortunately for the third possibility, computers only have two voltage levels.
Say you had three possibilities: true, false, don't know.
Let your computer run for a while, and your result would read
don't know, don't know, don't know, ...
The preprint Tri-state+ communication symmetry using the algebraic approach
shows that, in analogy to the role of stimulated emission in explaining the thermal radiation of bodies in communication, a third truth value Z, as a new process of a coherent logic state, should exist in the Information Theory (IT) by Shannon.
In 1916, Einstein argued that, in addition to the random processes of absorption and spontaneous emission, a third, new, and coherent process of stimulated emission must exist, which then exactly reproduced the experimental study of communication as the thermal radiation of bodies. This is the so-called black-body radiation law, which shows how radiation is communicated by three processes for every extent, wavelength, and temperature, and it provided the basis for the later invention of the laser. Similarly, taking this three-fold symmetry as more general, representative of the needs of a communication process, this work considers that the Information Theory (IT) by Shannon, with only two random logical states, "0" and "1", needs a third, new process of a coherent logic state Z.
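For reference, the three-process balance Einstein wrote down can be sketched in standard textbook notation (the symbols below are the usual A and B coefficients, not taken from the preprint):

```latex
% Detailed balance between two levels 1 and 2 in a radiation field of density \rho(\nu):
%   absorption  =  spontaneous emission  +  stimulated emission (the coherent process)
N_1 B_{12}\,\rho(\nu) \;=\; N_2 A_{21} \;+\; N_2 B_{21}\,\rho(\nu)
% With the Boltzmann ratio N_2/N_1 = e^{-h\nu/kT} and B_{12} = B_{21}, solving for \rho
% recovers Planck's black-body law; dropping the stimulated term gives only Wien's form:
\rho(\nu) \;=\; \frac{A_{21}/B_{21}}{e^{h\nu/kT} - 1}
```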
Ed
Fine, maybe, but you avoid saying what the third truth value is.
Or what the three are if they are together. Of course you could just express numbers in a base 3 system, and have some code with letters, but lose contact with logic.
My conception of Shannon or info theory is that it has only a very indirect, analogous relation to physics phenomena. Nothing deep. Mostly with entropy, not emission.
Communication is something else altogether. There, info theory is relevant.
Communication is the process whereby information is transferred from one point in spacetime, called the source, to another point in spacetime, called the destination. Information is what is transferred from source to destination; if nothing is transferred, then the information is zero and there is no communication. Information can also be seen as surprise as to what is received: if there is no surprise, because the information has already been received, the information is zero. This relates to uncertainty, and information is a measure of uncertainty, and relates to entropy. The average information is called the source entropy. These are all standard terms, and one can google them, or look them up in a book. In particular, I recommend "Communication Systems" by A. Bruce Carlson, McGraw Hill.
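In symbols, the standard definitions being summarized are:

```latex
% Self-information ("surprise") of a message x received with probability p(x):
I(x) = -\log_2 p(x) \quad \text{[bits]}
% Source entropy = the average information per message:
H(X) = \sum_x p(x)\, I(x) = -\sum_x p(x) \log_2 p(x)
```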
Ed
Don't know what you are thinking, but you should know that just 2 levels are needed for communication. Then if you use more, perhaps all the better. Better still if you use N levels.
You still dodge the issue (as to the meaning/function of a third level).
In physics we don't care if A and B communicate; all we care about is whether a signal can go from A to B.
This establishes a unity between Einstein's and Shannon's theories, hitherto not reported, both referring to communication processes between bodies, and deprecates the two-state bit and qubit.
The Shannon-Weaver model ignored not only the need for at least three states, "0", "1", and "Z", but also the semantics of state "Z": the coherent, out-of-band interconnects (in higher order, given by GF(3^n) with 3, 9, 27, ... states) that provided the meaning, which was later found to be an essential part of a wider communication system in a federated environment with independent authorities.
Multi-valued logics are well known, as are the associated conventions and nomenclatures applied to their applications. There is a need for clearly distinguishing between the state(s) of the machines in which they (multi-valued logics) are instantiated, the states of the algorithms in which they are exemplified, the logical abstractions that are modeled in their use, and the limitations/constraints that become apparent over time. What I am drawing attention to here may be summarised by affirming that while mathematical abstractions may be true by definition, the types of logic under discussion here may or may not have truth status in use, because they are simply inadequate in application. The phenomenal world is complex, but that emergent complexity may not become any more accessible merely by adding different labels. So, try it and see what happens, reporting all the layers of the outcomes. There may be a strong component of "suck it and see" about what you propose, that is, the outcome may not be predictable.
RB: Good point. Using a well-known physical model may seem like a guarantee against a fantasy representation, but we have encountered counterexamples. The complex number system, e.g., seems physical. We can understand sqrt(-1) as a rotation, and this becomes more a part of our reality than sqrt(2) -- which we can't measure, being irrational.
And we calculate complex numbers in code and in algebra, but all we can measure are finite integers. We can never measure a complex number, just its parts under some other model, such as a polar form. If we had a different model, something else would be measured. But what we consider real does not depend on a model.
Now, if all we can measure are finite integers, why not assume them as basic, and build our world with those parts only?
Such is the approach of constructive mathematics, in which the quality "exists" is strictly represented by "can be constructed". By using only parts that we can construct, we can have a further test more easily, using the same rule.
This avoids a fundamental disconnect in the human approach, in that for the physicist everything in the universe is finite and discrete, but in conventional mathematics everything is continuous and infinite. We, therefore, avoid using the continuous and the infinite, which we can't construct, and build a world under the hypothesis of "can construct", of constructive mathematics, for all of its parts.
Then, the finite integers are part of a Galois number system, always. The four arithmetic operations apply, +-×÷, and everything can be calculated and built, using Galois numbers.
How about the number 4? Four is not a direct Galois number, but 2^2 is an extended Galois number of base 2. Or, we can use 9 = 3^2 and just occupy the first 4 positions. Thus, any finite integer can be put into correspondence with a Galois number system, extended or not, and they all obey the four arithmetic operations, we can rest assured. There are a large number of possibilities, without limit.
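As a concrete illustration, a minimal Python sketch of GF(3^2) = GF(9), built as GF(3)[x]/(x^2 + 1); the representation as coefficient pairs is just one illustrative choice (x^2 + 1 is irreducible over GF(3)):

```python
# GF(9) = GF(3)[x]/(x^2 + 1): elements are pairs (a0, a1) meaning a0 + a1*x,
# with coefficients mod 3 and the reduction rule x^2 = -1 = 2 (mod 3).

P = 3  # base prime

def add(u, v):
    return ((u[0] + v[0]) % P, (u[1] + v[1]) % P)

def mul(u, v):
    a0, a1 = u
    b0, b1 = v
    # (a0 + a1 x)(b0 + b1 x) = a0*b0 + (a0*b1 + a1*b0) x + a1*b1 x^2,  x^2 -> 2
    return ((a0*b0 + 2*a1*b1) % P, (a0*b1 + a1*b0) % P)

def inv(u):
    # Brute-force inverse: the field has only 9 elements, so just search.
    for c0 in range(P):
        for c1 in range(P):
            if mul(u, (c0, c1)) == (1, 0):
                return (c0, c1)
    raise ZeroDivisionError("zero has no inverse")

a, b = (1, 2), (2, 1)        # 1 + 2x and 2 + x
print(add(a, b))             # (0, 0)
print(mul(a, b))             # (0, 2)
print(mul(a, inv(a)))        # (1, 0): every nonzero element has an inverse
```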
Thus, reality can exist, because it can always be constructed, when using Galois numbers. This is what the Einstein-Shannon model is telling us: reality can't be constructed with Boolean logic, but can be constructed with GF(3^n), for a communication process. Information does not exist with Boolean logic. The world has more symmetries in information than Boolean logic can represent.
We can, however, always embed more logic levels in Boolean logic, but not without adding discontinuity. For example, chiral information does not survive a projection.
I think we should distinguish states performing a logical function from those in communication which merely help to carry numbers in N-ary representation.
Also states with just two voltage levels from multiple voltage levels.
No need to mix too many things together at once.
Again, just two voltage levels and adequate code carry information.
Don't think logical processing is that important these days.
JW: two-state logic can't represent a conversation in a national language of each party, as they can trust the same syntactic expression (word) to have different meanings, and point to different things (referents).
Frege dealt with this, 100 years ago, and it is becoming more obvious with the Internet. There is no "referential theory of meaning". If one says "the first star to appear" and the other says "the last star to disappear", they are likely referring to the same planet, Venus. If one says GIFT, it means different things in German and English. Shannon did not consider this, only the syntactic expression -- as if meaning could be logically built by what is said.
I assign no meaning to your objection, for example. It is like you asked "Is the King of France bald?" -- as France has no King, it is not a YES or NO, but a third logical value.
Einstein described how communication between bodies required a third, coherent process, and named it "stimulated emission" -- it must exist. Shannon is also treating communication between bodies, so where is the coherent process? Not to be found in the Shannon theory, but it must exist. Therefore, the Shannon theory must be modified. Empirically, Verilog and tri-state circuits do it, on a large scale. Time for the Shannon fans to smell the coffee, and have a new theory that represents the missing symmetries. Tri-state+ is this new theory, with new solutions and predictions. Time to follow what Einstein described more than 100 years ago.
When communicating, people believe they are following their own logic. They believe that they do what they want. A mistake. They must follow the laws of the greater nature, in addition to their own prejudices and hidden persuaders. They can easily be nudged, as well, by others and by advertising.
For example, when using the Internet, they are open to attacks, willing or not. The solution is to use the law that the attackers must also follow. The solution is simple. No one can attack what does not exist. Offering "no target" is possible, under Einstein's understanding. If there is no sympathetic resonance, there is no interaction possible. The amplitude, or number of attackers, is of no importance. If the frequency is different, there is no interaction. In other words, if there is no coherence, there is no interaction. A photon can only interfere with itself, number of photons (intensity) doesn't matter, only frequency.
A coherent process must exist, also as a way of protection. And this shows, in the Einstein expression, when he postulated a coherent process and obtained an expression for the equilibrium that avoided the "ultraviolet catastrophe". That was the dominance, leading to a catastrophe, of the high frequencies, which was avoided by the coherent process, favoring the lower frequencies. Likewise, on the Internet, savvy humans can apply the same law and achieve protection. If the interaction gets obsessive, or dominating, a catastrophe can be avoided by disengaging.
Hi Ann Gerck,
It is worth considering whether the 'Laws of Nature' are constructed before the 'facts' or after the 'facts'. It is also worth considering whether the 'Laws of Nature' are human constructs and are therefore subject to revision should countervailing evidence appear. Of course, if the 'Laws of Nature' are beyond question then a problem arises with the concept of free will which on my understanding is still widely accepted by humans. Furthermore, the issue arises of what to do when 'the Laws of Nature' seem to be in contradiction at different scales or different contexts. What differentiates the disciplines of science from other human activities is a high regard for the veracity of the evidence that is used to support the claims made about the activities in science.
Code that runs on a machine simply means you got the syntax correct. Running code says little or nothing about the semantics (and rarely anything about the pragmatics) associated with the environment in which the code is run.
RB and TG: The laws of nature are independent of what we call "facts". We learn in history, that history itself can change. Facts are what we were willing to believe, physics teaches. Once, we believed it was a fact that the Sun circled the Earth. We are still very primitive, and that was a few years ago.
Einstein showed the need for a coherent process in the communication of bodies. One thinks humans might be exempt from this fact. We are not, because we necessarily use bodies, matter, to communicate. Binary logic ignores this reality. Anything that obeys binary logic must be false for bodies, and Peirce understood this. We see that in Internet discussions as well. Some things are intersubjective, like a medical diagnosis. Other things are objective, like the Sun. These latter things are not a question of opinion. Only fools would discuss whether the Sun continues to exist at night, just because we can't see it.
We are not in a position to condemn movements and their interpretations. Physics has valid reasons; chemistry deserves our greatest respect, all schools are serving progress. Every mathematical expression needs to be precise.
What challenges us, then, is that individuals with clear thinking and good hearts would fail to see more of the truth we see, a testing reality for all of us. That is the intersubjectivity of individual positions, not objectivity. But, of course, a medical diagnosis is intersubjective, and that is useful. Physics, however, is objective, as nature is the final arbiter.
We are not doing this work to build our own statue, our own name. Instead, we are giving our best efforts to reaching the hearts and minds of everyone with concepts of truth that can offer lasting freedom to humanity, objectively -- the function of physics is not only to inform, but to make people aware.
And, if our best efforts are not enough, we can do better ourselves, because we can learn.
Come on Ed, it is basic how to go from bits, to bytes to words.
You have a code in binary for each symbol or letter in the alphabet. Then a sequence of ones and zeros is translated into language.
You can do this sequentially or partly in parallel.
If you use plain language, then you should understand, if you both know the same language. Or use a translator (maybe some mistakes then, but better than nothing).
Stimulated emission is a physics term, nothing to do with human communication.
JW: One can go from bits to words and phrases. But there is no "referential theory of meaning". What does the word actually mean? One can send and receive the word GIFT, but it means different things, in German or English.
No meaning is transmitted, because it needs to be coherent. If you try to transmit it, an attacker may change it. It needs a parallel channel, protected 100%.
Shannon did not describe it; Shannon describes a syntactic equivalence only. The word's meanings and referents are not only denied, they are not transmitted. Say GIFT -- what does it point to?
JW: Of course, "stimulated emission" applies not only to bodies that we must use to transmit and receive information, but also to how we communicate. For example, your answer stimulated me to emit my reply in coherence (stimulated emission), and so on. To communicate, you need not only information, as surprise, but also coherence, as that which both sides know.
Why are you so bent on the idea that communicators speak different languages? Normally it is not so. I agree that certain common features like codes or languages must be present to communicate, but you don't worry about communicating with aliens from another planet every day.
Well, yes, gift can mean married or poison in Swedish. Swedes rarely get married, they go "bo".
Regards, Juan
JW: Frege already gave 100 years ago the background why there is no "referential theory of meaning". No need to repeat. Shannon theory does not convey meaning. This is well-known, and a problem for communication, and allows attacks. One needs a coherent state, in addition to on/off. There is no alternative. Physicists can read Einstein, he gave the correct reasons for bodies. There must be a coherent state.
On the web: Most translators are familiar with the expression “traduttore, traditore” meaning “translator, traitor” and have their own personal experiences with the difficulties in translation. We have all seen poor-quality translations, translated text that is virtually unintelligible for a native speaker, translations that misrepresent the original text and blatant mistakes whether in subtitles, song lyrics, or in day-to-day document translations.
Translators become the villains in this story -- the easy targets when pointing the finger. After all, translation is really about just taking words from one language and finding the equivalent in the target language, right? So how hard can it really be?
Well, for starters, translation is no easy task and involves much more than simply transferring the words into another language. It requires research, thorough understanding of both the original and target languages, cultural knowledge, and specific training on the topic you are translating. And even then, there are still inherent problems with the language itself that lend themselves to numerous interpretations and glaring mistakes. There are just some phrases that are so connected to cultural context that it is next to impossible to provide an equivalent translation of the text that also bears the same meaning.
So, what exactly is the translator’s job when faced with these difficult expressions? Is it better to translate them literally so as not to “betray” the text but at the risk of a lower quality translation, or is it better to find the closest alternative that makes sense in the target language, even though the translated version may slightly modify the idea? Most translators would say that their task is to effectively communicate the same idea so that it makes sense to native speakers, but does that mean we are doomed to constant criticism?
JW: Thank you for the Swedish lesson. GIFT has indeed two distinct meanings in Swedish, neither of which is the same as the English 'gift'. Gift means both 'married' (when used as an adjective) and 'poison' (when used as a noun), something that has caused confusion to plenty of Swedish language-learners.
Translation is where we find more evidence to deprecate Shannon theory more easily, as there are many known examples. The Internet has only made these cases more evident.
Translation is also used for attacks, especially buffer overflows, where machines interpret a piece of data as code and execute it. For example, by not clearing user input of such attacks, the widely used SQL language can be abused maliciously. The Shannon theory cannot be used to block such attacks, as the meaning of the expression is often not considered as an engineering problem.
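A standard Python sqlite3 illustration of that point about "clearing" user input; the table and input strings are invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cr3t')")

user_input = "nobody' OR '1'='1"   # attacker-controlled text

# UNSAFE: the input is spliced into the query string, so data is executed as SQL.
unsafe = "SELECT * FROM users WHERE name = '%s'" % user_input
print(conn.execute(unsafe).fetchall())               # leaks every row

# SAFE: a parameterized query keeps the input as pure data.
safe = "SELECT * FROM users WHERE name = ?"
print(conn.execute(safe, (user_input,)).fetchall())  # []
```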
"Unfortunately, our translation systems made an error last week that misinterpreted what this individual posted." Expect more of these disclaimers.
Facebook has apologised after an error in its machine-translation service saw Israeli police arrest a Palestinian man for posting “good morning” on his social media profile.
The man, a construction worker in the West Bank settlement of Beitar Illit, near Jerusalem, posted a picture of himself leaning against a bulldozer with the caption “يصبحهم”, or “yusbihuhum”, which translates as “good morning”.
But Facebook’s artificial intelligence-powered translation service, which it built after parting ways with Microsoft’s Bing translation in 2016, instead translated the word into “hurt them” in English or “attack them” in Hebrew.
In computers, attackers can exploit the cache side-channel protections, to leak secret information and break basic security mechanisms. A translation lookaside buffer (TLB) is a memory cache that is used to reduce the time taken to access a user memory location. It is a part of the chip's memory-management unit (MMU). The TLB stores the recent translations of virtual memory to physical memory and can be called an address-translation cache. Hardware translation lookaside buffers (TLBs) can be abused to leak fine-grained information about a victim’s activity even when CPU cache activity is guarded by state-of-the-art cache side-channel protections.
For example, this can be used to leak a 256-bit EdDSA secret key from a single capture after 17 seconds of computation time with a 98% success rate, report Ben Gras, Kaveh Razavi, Herbert Bos, and Cristiano Giuffrida of Vrije Universiteit, at USENIX.
In biology, translation is the process of translating the sequence of a messenger RNA (mRNA) molecule into a sequence of amino acids during protein synthesis. The genetic code describes the relationship between the sequence of base pairs in a gene and the corresponding amino acid sequence that it encodes. This mechanism can also be attacked, disrupting protein synthesis without disrupting the corresponding genetic code. To protect, one of the mechanisms in nature is to use a triplet code, where each codon consists of three non-overlapping nucleotides. The code is degenerate, as different triplet base pairs can code for the same amino acid. For example, AAA and AAG both code for lysine.
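A toy Python sketch of that triplet degeneracy, with only a handful of codons included (just enough to show AAA and AAG mapping to the same amino acid):

```python
# Partial RNA codon table: just enough to illustrate degeneracy, not complete.
CODONS = {
    "AUG": "Met",   # start codon
    "AAA": "Lys",
    "AAG": "Lys",   # degenerate: same amino acid as AAA
    "UUU": "Phe",
    "UAA": "STOP",
}

def translate(mrna):
    """Read non-overlapping triplets and map each codon to an amino acid."""
    peptide = []
    for i in range(0, len(mrna) - 2, 3):
        aa = CODONS.get(mrna[i:i+3], "?")
        if aa == "STOP":
            break
        peptide.append(aa)
    return peptide

print(translate("AUGAAAUUUUAA"))  # ['Met', 'Lys', 'Phe']
print(translate("AUGAAGUUUUAA"))  # ['Met', 'Lys', 'Phe']  (AAA -> AAG: same protein)
```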
Excellent question Ed. I have the same opinion as you on the subject. Greetings
Communication is based on coherence, even though it must have a random, incoherent component. This random component is what calls attention first, as information is defined as a measure of uncertainty, and that reveals itself as a coherent understanding. The inability of the Shannon theory to efficiently represent coherence is based on Boolean logic, which denies a third truth value, an alternative to YES or NO, but humans, animals, plants, fungi, minerals, and every classification of matter have created it, in communication within the group and across boundaries, in coherent interconnects. This was first explained by Peirce, and currently by Jones. Coherence seems to emerge out of indeterminacy, in trying to decide upon uncertainty, between YES and NO. In the Chinese language, speakers routinely say "huh". One needs a "huh" utterance to disambiguate at first, a third truth value.
A con game, an attack based on the abuse of trust, usually includes an insistence on an immediate YES or NO.
COVID is an example, likely, of a virus that coherently interconnects animals and humans. We also find coherent interconnects between plants and animals. We all share the same 20 amino acids. All the same atoms in the Periodic Table. As physics explains, the atoms in our bodies came from the stars, another coherent interconnect. All nature is coherently interconnected. Ignore that, and we ignore a basic aspect of any form of life we know -- coherence. Which, to exist, requires a third truth value.
Shannon theory, by denying coherence as basic, proves by absurdity the very existence of coherence. And we can also prove it constructively, by our inability to define the opposite -- randomness. To define randomness, one has to use coherence: randomness is what has no coherence.
It is absurd to remove coherence from any stage of communication, and stimulated emission shows that even at the atomic scale it must exist. It is not just random emission and random absorption; there must also be coherence, which is a form of order.
Order in the universe, at every level. This is a message that physics reveals in every phenomenon. Also, order in the quantum model, which is another message coming out of new developments in quantum physics. And it will impact quantum computation, which one expects to become possible on desktops and mobile devices.
Some random questions that come to mind.
What is the place of coherence before, during and after the 'Big Bang'?
So, what carries coherence? Is it still like Dark matter, somewhat enigmatic? Suggesting coherence has attributes, as yet undefined, other than presence or absence? Or do we find coherence as an attribute of quantum phenomena extending to all phenomena from quantum scales upwards? Is it an epiphenomenon or an emergent property? Or is it an attribute of the processes of observation? What is the metric for coherence?
What is gained/lost if coherence is removed from our modelling attempts?
RB: Physics makes no declarations about what came before the "Big Bang". Coherence shows itself after the Big Bang, as an increase in entropy. A word with a small Shannon entropy value (i.e., a word that is not evenly distributed across the documents) tends to be more discriminative, while one with a large entropy value (i.e., a word that is more evenly distributed across the documents) is less useful in discriminating among documents. The universe tends to go from less evenly distributed (less coherence) to more evenly distributed (more coherence).
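As a small numerical sketch of the word-entropy heuristic just mentioned (the document counts below are invented for illustration):

import math

def shannon_entropy(counts):
    """Entropy, in bits, of a word's distribution across documents."""
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)

evenly_spread = [5, 5, 5, 5]    # word spread evenly over 4 documents
concentrated  = [18, 1, 0, 1]   # word concentrated in one document
print(shannon_entropy(evenly_spread))  # 2.0 bits -> less discriminative
print(shannon_entropy(concentrated))   # ~0.57 bits -> more discriminative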
Now, coherence can have a structure but that is mostly unknown. We are suggesting that it follows GF(3^n) in communication, but GF(2^n) in information, such as in encryption by AES.
We currently fit any information to be encrypted into GF(2^8); maybe we need higher values of n, maybe a different base prime, maybe a different group.
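For concreteness, here is a minimal sketch of multiplication in GF(2^8) as used inside AES; the reduction polynomial x^8 + x^4 + x^3 + x + 1 (0x11B) is the AES choice, and other irreducible polynomials would give other representations of the same field.

def gf256_mul(a, b):
    """Multiply two elements of GF(2^8), reducing by the AES polynomial 0x11B."""
    result = 0
    while b:
        if b & 1:
            result ^= a          # addition in GF(2^n) is XOR
        a <<= 1
        if a & 0x100:
            a ^= 0x11B           # reduce modulo x^8 + x^4 + x^3 + x + 1
        b >>= 1
    return result

# Worked example from the AES specification: {57} x {83} = {C1}.
print(hex(gf256_mul(0x57, 0x83)))  # 0xc1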
Mapping 'coherence' onto mathematical constructs merely changes the language used and does not address the underlying concept or define its attributes or address the questions about attributes raised above. At best, instances of Galois Theory give access to some tools and techniques developed in a subset of finite number theory. Is the suggestion here that what may apply in number theory also applies to 'coherence'? What would you suggest is the crucial 'real world test' that could confirm such a claim that is independent of the observer?
Another way of getting at the issues raised in our remarks is to ask what is present in 'coherent phenomena' and what is absent in 'incoherent phenomena' in the sense used in physics, other than that applied in wave theory.
RB: Mapping 'coherence' onto mathematical constructs would be useless, as you say, if we did not also map said 'coherence' to a well-defined physical model, as used in the technical parlance of stimulated emission.
The coherence of the stimulated-emission photon is as absolute as possible, and can be seen as a "perfect copy" of the incoming photon, including direction of propagation, polarization, etc. The mathematical model is of an equivalence being created by coherence. The two photons are equivalent, in stimulated emission. Here, also, coherence does not create an equality, but an equivalence class. This equivalence class has a dispersion; it may reveal further equivalences as one increases resolution. It may look continuous in the macro, but it is discrete in the micro. We do not necessarily have to consider the micro.
So, continuity may be present in 'coherent phenomena' and may also be absent in 'incoherent phenomena'. It may, however, just be a question of scale, not of structure.
Phenomena may always be discrete, which is the current thinking in physics. Nothing is continuous, it just may look so. Wave theory disagrees, and cannot reproduce single-photon interference, observed in experiments. There is no matter-wave duality; matter always behaves the same way -- as a quantum wave of probability. There is no localization of quantum waves, contrary to classical waves or particles.
As remarked in
https://www.researchgate.net/publication/350950693_Tri-state_Communication_Symmetry
"WARNING: Because the answer is also against the use of the LEM, it cannot use Boolean or binary logic to represent the thought process as YES/NO answers, or infinitesimals [18], nor continuity [5], and no usual mathematical formulas that are built in conformance, and any bit-wise operators. In particular, the XOR (also called Bit-wise Addition Modulo-2) operator cannot be used — even though ubiquitously available in hardware, in silicon gates. Therefore, in this paper we refrain from using conventional mathematical language and its Boolean baggage, as we are describing higher-dimensional logic — and prefer text, where different meanings and referents [19] can apply, and are naturally sorted, to the literal (name) expression.This is important, for example, in discussing network coding, the trust analogy, and tri-state Verilog.The reader is advised, therefore, to look for indeterminacy in the language as valuable topological pointers to further processing by multilevel logic, or deeper thinking, and not as a binary “fault” of the text. Further publications will advance suitable mathematical language to what, today, can only be richly expressed in non-binary language form."
Language, thus, to the rescue, as richer than present mathematics.
Three or more states logic can explain what binary logic cannot. Take, e.g., the double-slit experiment, considered mysterious even in QM. This is easy to explain when one focuses on breaking the LEM. Breaking the LEM is easy to explain in the particle picture, but impossible to explain in the wave picture, where tri-state+ is embedded in binary logic (and LEM is valid). This continues at
https://www.researchgate.net/post/On_what_is_real
Law of the Excluded Middle (LEM)? Two-state logic is complete in its own terms, but the universe is not limited by it, not limited by the LEM. Where is the "maybe" we so often face?
A simple oriented graph with three non-collinear points, forming a triangle, already shows it: one can reach the second point, from the first, by either of two routes; it is ambivalent. One needs more logical reasoning states, even in simple cases.
The problem is that one then loses the LEM, which is an intrinsic measure of success. This LEM function is now to be provided by intersubjective agreements, much like in the real world. So the LEM makes things easier and faster; without it, as we see in online groups trying to come to an agreement on anything, it seems to take too long. Too much time is lost!
A binary tree following GF(2^n), with Galois fields over suitable finite integers, can be used in more powerful decision-making, applying the LEM to n cases in succession. It can even allow one to convince contrarians (a private experiment) of unpopular views. The LEM can be used well. It is the base of computing.
And GF(2^n) solves the incompleteness problem, as n grows without bounds. Try it.
According to a PM, some people may think that there seems to be a fundamental lack of understanding of logic here, including models, and quantum computing.
For instance, we seem to be connecting the qubit-based quantum computing to classical logic through the fact that a qubit has a two-state basis.
But whether the Law of the Excluded Middle (LEM) is satisfied would be somehow largely unrelated to the number of values in a model of the logic, and this would be further entirely separate from the size of the basis of a quantum system.
This seems just a desperate play on words. The LEM cannot be violated in a binary system, in a system with only two values of logic. Where else is there to go? There is no other choice in a two-state logic system but "1" or "0". One sees that artificial force in US congressional testimony: one asks for a decision, and only YES or NO is offered.
Further, the number and types of states of the logic system perforce dictate the size of the basis of the quantum system. A single spin-1/2 particle cannot constitute a spin-1 quantum system. A binary logic system, where the LEM must be valid, has values "1" or "0" in each type, incommunicado. A spin-1/2 particle cannot transform into or become an integer-spin particle; they are of different types. It is possible, however, to confuse type with number. This might be the source of confusion here.
But the number of particles of a certain type does not increase or decrease the number of logical levels available. That number is the number of logic levels, as revealed for the electron by the two possible outcomes of the Stern-Gerlach experiment, as is well known. In the Stern-Gerlach experiment, silver atoms were observed to possess two possible discrete angular momenta despite having no orbital angular momentum, and the number of particles was not important.
The conventional definition of the spin quantum number is s = n/2, where n can be any non-negative integer. Hence the allowed values of s are 0, 1/2, 1, 3/2, 2, etc. The value of s for an elementary particle depends only on the type of particle and cannot be altered in any known way (in contrast to the spin direction, as well-known).
The types of spin remain incommunicado in a unidirectional way -- a boson is always a boson, but a fermion is not always a fermion. A photon is always a boson, while electrons, which are fermions, may collectively behave as a boson. Further, a boson can be made of an even number of fermions, like Helium-4, which is collectively a boson.
The quantum-mechanical interpretation of momentum is as phase dependence in the position, and of orbital angular momentum as phase dependence in the angular position.
Quantum computing (QC) does not mean that everything is possible. That would be anarchy.
QC means, as Bohr said, that all states are possible at once. So, first, before being possible in actuality, a state has to exist as a state. How many states can one have? Physics has had an answer since Einstein in 1917, with stimulated emission: it has to be three or more.
Electrons have half-integer spin and are fermions that obey the Pauli exclusion principle, while photons have integer spin and do not. But so-called q-fermions (proposed in 1991), which satisfy q-deformed anti-commutation relations (pertaining to spin 1/2), can have the property that more than one q-fermion can occupy a given quantum state.
Thus, randomness cannot exist. But chaos, as calculated by a physical or mathematical law, can exist. This allows one to determine what is random -- random being what cannot be calculated, and thus cannot exist.
Logic is complete, and there is hope that everyone will accept it.
In particular, there are many deductive systems for first-order logic which are both sound (i.e., all provable statements are true in all models) and complete (i.e. all statements which are true in all models are provable).
The foundations of first-order logic were developed independently by Gottlob Frege and Charles Sanders Peirce.
Of importance here, false premises (such as that irrationals, continuity, or infinitesimals exist) can lead to either a true or a false conclusion, even in a valid argument.
Infinity, when regarded as "not a number", exists and can be used safely.
Gödel's proof is binary, and obeys the LEM (Law of the Excluded Middle).
Opposite statements, G and ~G, cannot both be true in a consistent axiomatic binary system. So the truth of G must be undecidable. But this LEM-based logic fails in three-valued or higher logics. Not all logic systems are undecidable.
This is important for quantum computing. And, for example:
When faced today with a YES or NO decision, choose a MAYBE. That is a decision that at least postpones the outcome, and tomorrow will be another day. Repeat, for any number of days. Always preserve your hope for the future.
What seems impossible now, may find a solution tomorrow.
When projecting 3D we avoid chiral loss by projecting two 2D images (as our eyes do). This is analogous to GF(3) --> GF(2^2). So, one expects that tri-state can be represented by two binary trees. From [1,2], the number of combinations generated is sufficient. To form tri-state+, one just needs GF(2^n), with n > 2.
Thus, QC can be implemented in binary hardware. No one needs quantum annealing or Majorana fermions, highly hypothetical particles that could avoid loss of information.
[1] Ed Gerck. Tri-State+ Communication Symmetry Using the Algebraic Approach. Computational Nanotechnology. 2021. Vol. 8. No. 3. Pp. 29–35.
[2] Ed Gerck. On the Physical Representation of Quantum Systems. Computational Nanotechnology. 2021. Vol. 8. No. 3. Pp. 13–18.
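As a sketch of the embedding idea only (a plain two-bits-per-trit encoding chosen for illustration; it is not the construction of [1, 2]), binary hardware can carry ternary values:

def trits_to_bits(trits):
    """Pack trits (0, 1, 2) into a bit string, two bits per trit."""
    return "".join(format(t, "02b") for t in trits)

def bits_to_trits(bits):
    """Unpack two-bit groups back into trits; the pattern '11' is left unused."""
    out = []
    for i in range(0, len(bits), 2):
        t = int(bits[i:i + 2], 2)
        if t == 3:
            raise ValueError("invalid encoding")
        out.append(t)
    return out

trits = [2, 0, 1, 1, 2]
bits = trits_to_bits(trits)
print(bits)                         # 1000010110
assert bits_to_trits(bits) == trits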
Hi Ed Gerck,
You may find these works of interest:
https://ivv5hpp.uni-muenster.de/u/rds/blau_review.pdf
and more recent developments detailed in this series beginning:
https://www.researchgate.net/post/Is_this_a_new_valid_logic_And_what_does_layer_logic_mean
Communication and cybersecurity need to take an algebraic approach, and use GF(3^n) (ternary), not GF(2^m) (binary). Boolean logic has revealed itself to be insufficient.
The triangle inequality needs to be modified. The path between three points, A, B, and C, may be faster when taking the longer route (network coding), but may appear to take the shortest route (MITM).
Hi Ed Gerck,
Logic abstracts states from existence by distancing states from events. Changes in states imply changes in attributes, before and after, introducing temporality. So far as we know, existence is instantiated through exemplars of entities having states in a context, up to the end of time.
Assertions may be made about temporally related changes in the truth values of sets of attributes of entities in particular states.
Indeterminacy of existence has a temporal direction, retrospectively identifiable as changes in a past state in which the attributes of the entity were becoming false, and the appearance of a new state of the entity, the attributes of which are becoming true. Assertions about the truth of statements about existing entities in a context have a four-valued logic in paradigm instances of temporality: true, becoming false, false, becoming true.
Indeterminacy is always on the way to becoming something else. The questions that matter are around whether what it is "becoming" is positively or negatively valued, and then what actions should be taken to alter attributes that facilitate desired changes in entities. This last may be seen as the dilemma of praxis.
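A minimal sketch of the four-valued scheme just described; the cyclic "advance" operation below is only one possible reading of the description, added here for illustration.

from enum import Enum

class TemporalTruth(Enum):
    TRUE = "true"
    BECOMING_FALSE = "becoming false"
    FALSE = "false"
    BECOMING_TRUE = "becoming true"

CYCLE = [TemporalTruth.TRUE, TemporalTruth.BECOMING_FALSE,
         TemporalTruth.FALSE, TemporalTruth.BECOMING_TRUE]

def advance(value):
    """One step of temporal change through the four values (an illustrative assumption)."""
    return CYCLE[(CYCLE.index(value) + 1) % len(CYCLE)]

print(advance(TemporalTruth.TRUE))           # TemporalTruth.BECOMING_FALSE
print(advance(TemporalTruth.BECOMING_TRUE))  # TemporalTruth.TRUE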
RB: You wrote, "Indeterminacy is always on the way to becoming something else." This is illogical, given how logic is defined by Ulrich Blau, referenced above.
According to Blau, logic is atemporal, absolute.
According to nature, we learn from the sunset. That's when we can see the stars! They were always there ... though. There is no "becoming".
That's how we can understand indeterminacy versus "becoming".
It is intersubjectively "becoming" to us, but not necessarily "becoming" in an objective sense. The stars pre-existed us.
Then, we learn from the Holographic Principle (HP). When in doubt, look at nature. We don't need to learn anything -- everything is given in nature. We just have to be grateful, and forget the ego.
The ego is not holographic, though reproduced a lot. It always insists on being different, which is the mark of solipsism.
Indeterminacy is a logical state. What it will become, if anything, is ... indeterminate. It is atemporal, absolute. It is an end in itself. What will become of ... the egg?
The indeterminacy is at the root of stimulated emission: will a photon be emitted or not? A coin toss is likewise indeterminate. That is the essence of what we call "choice".
A perfect choice is 50/50, as in a coin. This is what we call chaos -- of course it is deterministic, but we cannot follow all the events that lead to a particular result.
Hi Ed Gerck,
Suppose, for a brief discussion about probability, that the axioms governing your coin-toss analogy had included: each coin toss is independent of prior coin tosses, and also independent of the tosses that follow. That is the conventional view.
Suppose that is not the case, and suppose the following is much closer to what goes on in a world of existence. Suppose the coin has a top and a bottom, coloured black and white respectively. Suppose that each coin toss is affected by prior coin tosses and also influences future coin tosses, leaving evidence of the outcomes. Now suppose there exists an experiment conducted on a plain white 2D grid surface where the coin is tossed and heads or tails are the possible outcomes. The outcomes are recorded from the centre of the grid: tails for black, heads for white. At each coin toss, the records may be used to show the tossing events as analogous to clock events, that is, the passage of time, showing that patterns emerge under the simplest of systems. The details follow Langton's Ant. See: https://adambaskerville.github.io/posts/LangtonsAnt/
The point is that Langton's Ant creates a simulation of apparently random behaviour until eventual order (the so-called superhighway) emerges out of the chaos, in every case.
Instead of colours we could have any pair of descriptors or even true or false.
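A compact sketch of Langton's Ant with the standard rules (the step count is an arbitrary choice); it shows the apparent disorder that eventually yields the ordered "superhighway".

def langtons_ant(steps):
    """Run Langton's Ant on an initially all-white grid; return the set of black cells."""
    black = set()
    x = y = 0
    dx, dy = 0, -1                 # facing "up" (y grows downward here)
    for _ in range(steps):
        if (x, y) in black:
            dx, dy = dy, -dx       # black cell: turn left
            black.remove((x, y))   # flip to white
        else:
            dx, dy = -dy, dx       # white cell: turn right
            black.add((x, y))      # flip to black
        x, y = x + dx, y + dy      # move forward one cell
    return black

# After roughly 10,000 chaotic-looking steps the ant settles into a repeating
# 104-step "highway" pattern.
print(len(langtons_ant(11_000)), "black cells after 11,000 steps")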
RB: Thanks. Yes, I did suppose that each coin toss is independent of the others.
But suppose they are not, which is an interesting case, as you say. The important assumption, it seems at first sight, is that an external influence would account for it, not the coin itself. The coin, by itself, would always be fair.
One could then detect such an influence by using Bayes' theorem to update beliefs. No order is created by the coin, but, due to the influence, an external order exists -- and is thus detectable.
I will take a further look into this.
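As a minimal sketch of the Bayes-update idea (a Beta-Binomial model with a uniform prior; the toss data are invented for illustration):

def update(alpha, beta, tosses):
    """Update a Beta(alpha, beta) belief about P(heads) from a string of 'H'/'T' outcomes."""
    for t in tosses:
        if t == "H":
            alpha += 1
        else:
            beta += 1
    return alpha, beta

alpha, beta = update(1.0, 1.0, "HHTHHHHTHH")               # 8 heads, 2 tails (hypothetical data)
print("posterior mean P(heads) =", alpha / (alpha + beta))  # 0.75, hinting at an external bias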
RB: According to Ulrich Blau, logic is atemporal, absolute. If we keep that in mind, order cannot come from chaos, except if already in chaos.
Chaos implies an objective mathematical rule, but one unknown to the observer. It looks random, but randomness does not exist. Not even in a 50/50 coin toss.
However, by observing the game long enough, we can learn the rules. This we call Physics. And, it can change Mathematics, by a paradigm shift. In an instant.
Order comes from outside. Some call it the coherent channel. Others call it trust. Some call it the Creator. No religious connotation. Not confidence, either. Trust may describe it better.
Hi Ed Gerck,
It looks to me that number relations may describe, but do not account for, the emergence of order out of the chaos of multiple inter-related iterations of events.
Wolfram has another view in his extended material. See: https://www.wolframalpha.com/
Warning, this is, as you may know already, a "very deep rabbit hole". Nevertheless, Wolfram's work raises interesting questions as to the significance of the growing extent to which particular sets of number relations seem to map onto actual events.
RB: The hypothesis of the emergence of order out of chaos, is of central interest to materialism.
The denial of what one could trust is a denial of trust as an external factor. There is no Creator, one tries to assume.
So, order is supposed to come out of nothing -- as if by itself.
The rational mind cannot accept, though, this path of an effect without cause. Nature is causal. That is the base of all sciences.
Even Allan Kardec, the Spiritism pioneer, believed in something similar -- the spontaneous generation of life. He ignored the existence of germs, bacteria, and viruses.
But Ulrich Blau only accepts logic if it is "sterile" -- absolute, thus atemporal and delocalized. Not in spacetime, not a "becoming".
Thus, logically, order must exist in what, in our ignorance, we call chaos.
Since randomness does not exist, even the appearance of randomness must also be chaos.
Order is, thus, a consequence of the order in chaos, and lack of order is the same.
We do not know the objective, mathematical, inescapable, law behind chaos. But we can see its hand on phenomena, subjectively.
Therefore, seeking community agreement is a way to discover it more intersubjectively, on the way to be objective. This community agreement was studied by Michael Polanyi, and has been called trust in some languages. Other languages have no single word to express it, like Portuguese and Korean.
No one uses real numbers in cryptography, for lack of precision, no matter how many digits are used. People use natural numbers, in modular arithmetic mod p, where p is a natural power of a prime, like 2^8. How did we come to this? No reals, and yet absolute precision?
The reals introduce other difficulties, in the infinite digits that would be needed, as in 0.666... = 2/3, which no one can ever measure (not physically; there is no unlimited time).
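A tiny illustration of the contrast: floating point cannot hold 2/3 exactly, while modular arithmetic over a prime represents "two-thirds" exactly via a multiplicative inverse. The prime p = 257 is an arbitrary choice for the example.

from fractions import Fraction

print(Fraction(2, 3) == Fraction(2/3))   # False: the float is not exactly two-thirds
print(Fraction(2/3))                     # 6004799503160661/9007199254740992

p = 257                                  # an arbitrary prime for illustration
inv3 = pow(3, -1, p)                     # multiplicative inverse of 3 modulo p (Python 3.8+)
two_thirds_mod = (2 * inv3) % p
print((two_thirds_mod * 3) % p)          # 2 -- recovered exactly, with no digits lost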
According to Ulrich Blau of LMU (Munich), one can only accept logic if it is "sterile" -- absolute, thus atemporal and delocalized.
Not in spacetime, not a "becoming". Then, logic must be absolute, with no spacetime relativity, and no reflection -- thus, logic is understood as first-order logic, following Ulrich Blau.
Hi Ed Gerck,
A simple figure, a circle, defies logic in the traditional sense, where the relationship between circumference and radius only rarely approaches integer values for both. In the translation between a logical circle and an actual circle, every actual circle will either "hold water" in a plane or be "wasteful of pencil lead" if drawn, or of materials if constructed.
If calculated out, a series of circles with increasing integer radii will approach integer values for their circumferences, then slightly exceed integer values, in a sequence whose differences have a wave form over 100+ iterations.
So circles have circumference-radius relationships that "become close" to paired integer values and then "move away" from integer values.
Of course, the involvement of pi is our best current guess, but the question remains: are there to be no 2D circle relationships in Blau's world of logic because they lack the precision of integers?
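A quick numerical sketch of the observation above, listing the fractional part of the circumference C = 2πr for small integer radii; the near-integer threshold is an arbitrary choice.

import math

for r in range(1, 12):
    c = 2 * math.pi * r
    frac = c - math.floor(c)
    close = "  <-- close to an integer" if min(frac, 1 - frac) < 0.07 else ""
    print(f"r = {r:2d}  C = {c:8.4f}  fractional part = {frac:.4f}{close}")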
Good question. As is known, a circle has an area equal to that of a rectangle with one side equal to π times the radius and the other side equal to the radius. This can be seen geometrically, by slicing the circle and rearranging the slices into an approximate rectangle.
As quantum mechanics is true, and we do think it is ontic, one could think that a circle has no allowed existence IF the rationals do not fit. Not only must r be rational, but also π×r. The rationals can be mapped 1:1 to the integers. Space is grainy, it seems, not only non-Euclidean.
Circles, as we think we know them, are incompatible not only with QM. They are incompatible with QM, with Martin-Löf types in CS, and with computers, three paradigm shifts developed later, after Newton, Leibniz, and Cauchy.
Hi Ed Gerck,
The truths about pi as an attribute of circles in 2D do not apply in 3D, where circles on 3D surfaces may have integer circumferences and associated integer radii. Adding a third dimension restores a semblance of order, but raises further questions about the ubiquity of logic, that is, in which dimensions it does or does not apply. Do these dimensional constraints apply to three-state, four-state, and higher-state logics in any coherent sequence?