Well, actually, theoretical physicists, because they deal with
statistical methods rather than tangibles, are _especially_
drawn to information theory.
Consider where the shift happened, from "energy is neither
created nor destroyed" to "information cannot be destroyed":
theoretical physics. Theoretical physics is
precisely the source of the notion. "Information (data) is the
base unit. There can be nothing smaller than the base unit.
Therefore, everything is made of the base unit." If you were to
do something like, say, try to get "meta" with the information
source and start talking about semantics and meaning, social
constructs of knowledge and such, the ultimate _human_ basis for
all of these systems, complete with all of the flaws that go
with that - well, there wouldn't be any room for it in a system
with information-as-base-unit. That's the issue with it
ultimately. That doesn't invalidate its usefulness on a
pragmatic level, but it does show where its limits are.
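(For concreteness - the formal version of that slogan, in the standard quantum-mechanical framing, is just unitarity:)

```latex
|\psi(t)\rangle = U\,|\psi(0)\rangle, \qquad U^\dagger U = I
\quad\Longrightarrow\quad |\psi(0)\rangle = U^\dagger\,|\psi(t)\rangle
```

The earlier state is always recoverable - that's the whole content of "information cannot be destroyed." Note there's no slot in it for meaning or semantics, which is exactly the point.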
== Oh, you're talking quantum information theory. I think in
qubit computing, the logic gates are primarily matrix
transforms, which definitely gives them more flexibility than
standard binary logic.
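A minimal sketch of what "gates as matrix transforms" looks like (plain numpy, nothing framework-specific assumed):

```python
import numpy as np

# A qubit state is a length-2 complex vector; a gate is a
# unitary matrix applied to it.
ket0 = np.array([1, 0], dtype=complex)      # |0>

# Hadamard gate: sends |0> to an equal superposition of
# |0> and |1> - flexibility plain binary doesn't have.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0
print(np.abs(state) ** 2)                   # [0.5 0.5] - Born rule
```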
== I'd like to see work done on qutrits, but as we've barely
bothered with ternary logic, I don't see much benefit to jumping
to qutrits 'til we've mastered ternary. In a way, qubits are
easier to work with than ternary logic because the probabilities
allow some assertions of "trueness" and "falseness" while
maintaining the ability to negotiate relative truth values that
are interdependent quantities in an entangled state.
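For comparison, a qutrit is the same machinery in three dimensions - a toy sketch, same caveats:

```python
import numpy as np

# A qutrit lives in C^3: basis states |0>, |1>, |2>. Its
# "truth values" are amplitudes, not three fixed symbols.
state = np.array([1, 1, 1], dtype=complex)
state /= np.linalg.norm(state)      # normalize to unit length

print(np.abs(state) ** 2)           # [1/3 1/3 1/3] - a balanced "maybe"
```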
the "This is the size of your brain" stuff. How many states are
possible. Ok. How big is the checkerboard for us to move our
pieces on. I mean, that's certainly useful stuff but it's like a
gigantic piece of paper. If I know how big my paper is and how
wide my pencil tip is, it still won't let me draw anything on it
and really, pragmatically speaking, we can do more with less.
Limits are useful, but like an unplugged computer, they don't do
much.
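(The state-counting itself is easy enough to write down - for n elements:)

```latex
N_{\text{binary}} = 2^{n}, \qquad N_{\text{ternary}} = 3^{n},
\qquad \log_2 3 \approx 1.585 \ \text{bits per trit}
```

So a trit is worth about 1.58 bits of checkerboard - and that still says nothing about what to draw on it.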
== About two years ago, I did a big search for ternary chips in
use. There are some projects, some practical things - and
I remember a Soviet computer that was ternary - probably that
same one you're talking about... but the closest things I've
found to ternary logic chips are some analog computing tech that
managed to survive (in chip form) because its functionality
couldn't be reproduced as well (or at all) in digital
equivalents. But generally, there's not much out there. I have
some work on it saved somewhere... I really expected to see
more. == Oh, LISP machines... man, they had such potential. I
wanted to learn LISP too. The professor suggested I go through
the proper steps (this was 1990) - start with Pascal and work
up - but I wanted to jump right into AI. I took a course in
Pascal, but it was like, "eh, this is like Basic," and was kinda
a waste
of time, although I did write a few things in Turbo Pascal in
the early '90s and distributed them on BBSes, so not a total
loss. The base pairings are interesting in the rule-sets they
create. It's hard _not to_ compare our gene sequences with some
kind of computer code because the similarities are so profound.
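The comparison is easy to make literal - a toy sketch, where the 2-bit assignment is my own arbitrary choice, not anything biological:

```python
# The base-pairing rule is a tiny, exact ruleset: A-T, C-G.
PAIR = {"A": "T", "T": "A", "C": "G", "G": "C"}
BITS = {"A": 0b00, "T": 0b11, "C": 0b01, "G": 0b10}  # arbitrary encoding

def complement(seq: str) -> str:
    """Apply the pairing rule base by base."""
    return "".join(PAIR[b] for b in seq)

print(complement("GATTACA"))           # CTAATGT
print([BITS[b] for b in "GATTACA"])    # [2, 0, 3, 3, 0, 1, 0]
```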
== My fascination with ternary logic, though, is that it _can't_
be mapped to binary. Aristotle's Excluded Middle has to be
included :P Considering how influential his black + white
thinking has been to, I dunno, ALL of Western Civilization, it's
no surprise that we have trouble thinking in such terms. The
very development of our languages themselves have it built into
it, the source of dichotomies, paradoxes and the like. I'm not
saying it's a bad thing: So much has been accomplished with ONLY
THIS, NEVER THAT, NOTHING IN BETWEEN and variations of it. But I
suspect things will change when we start being able to think in
terms of variations of "I don't know" that aren't necessarily
tied to probabilities (which I see as kinda primitive: "On a
scale of 0-100, how close is it to X?" - probability is linear
at its root). We can think in terms of "two at once" - the
either/or... but once we have to hold three distinct somethings,
things get wonky for us, because we've never been trained in
navigating relativities, uncertainties, unbalanced triads and
such. I found something
somewhere about the logic used in the prefrontal cortex being
ternary in nature... which makes sense - and they've mapped the
circuits (so to speak). I should find it. It was pretty
fascinating to see them under magnification, being traced out.
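For the record, the classic formalization of a logic with a first-class "I don't know" is Kleene's strong three-valued logic - a minimal sketch:

```python
# Kleene's strong three-valued logic: True, False, and
# Unknown (None here). Unknown is a value, not a probability.
def k3_not(a):
    return None if a is None else not a

def k3_and(a, b):
    if a is False or b is False:
        return False              # a definite False wins regardless
    if a is None or b is None:
        return None               # otherwise Unknown spreads
    return True

def k3_or(a, b):
    if a is True or b is True:
        return True
    if a is None or b is None:
        return None
    return False

# Aristotle's Excluded Middle fails here: "x or not-x" can
# itself be Unknown, which has no binary counterpart.
x = None
print(k3_or(x, k3_not(x)))        # None - "I don't know"
```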
== True, although then wouldn't it require another level of
processing for the "I don't knows", like a database or
something? == Ah, see, there's the issue I have with the mapping
of 3 -> 2: the levels of complexity grow beyond the pure logic
itself. Other layers of abstraction are required. To me, this
points to a fundamental flaw in the usage of binary vs ternary:
You can't just feed ternary into binary and have a logic-machine
(as it were) process it all to come up with definitive results.
You need to add additional algorithms, user input, database
abstractions, semantics, language: in short, the "uncertain bit"
seems to be everything that logic is not.
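That overhead shows up even at the lowest level. A sketch of packing one trit into two bits (my own toy encoding, nothing canonical):

```python
# Two bits can hold a trit {False, Unknown, True} - but the
# fourth bit pattern is meaningless, so every consumer needs
# an extra validation layer. The abstraction isn't optional.
FALSE, UNKNOWN, TRUE = 0b00, 0b01, 0b10    # 0b11 is left over

def decode(bits: int):
    if bits == FALSE:
        return False
    if bits == TRUE:
        return True
    if bits == UNKNOWN:
        return None
    raise ValueError("not a trit")          # the extra layer

print(decode(UNKNOWN))                      # None
try:
    decode(0b11)
except ValueError as e:
    print("needs handling:", e)
```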
== I suppose I seem to be asking the impossible, but I don't
think it is: SOLVE FOR X = "I DON'T KNOW"
==