Tuesday, 27 June 2023

Words about Words - My First Principle

Here is an essay describing what I call fuzzy logic and its effects everywhere. The question, which I must form in words, is "what is a word?" So what is it already?

Well, I would say... it is a metaphor that attempts to define something.

Is that enough? Words are made of metaphors? Considering this is a bitcoin blog and you may well be a coder, let's do an analysis from this perspective. Computers work in binary, so we can be mathematical about it, and that is useful for showing the contrast with reality.

If I were to define a word in code, I would look up its definition in the dictionary, where there would be, say, ten lines of text, and to describe this accurately I would need, let's say, 100 lines of code (we are assuming we can have a perfect code that exists before words are defined, which we can't, and it's circular, but hey, we can ignore that detail). I would also look at the thesaurus, which would give us links to the definitions of other words, and we run into yet another recursive problem. Having defined words using words not yet properly defined, we have a thesaurus of undefined words which we use to add detail to our definitions, and now our words rely on the meaning of other words, which rely on the meaning of other words, and when you call the word the computer ends up churning through code endlessly. However, to solve that I just cap our word-calling function at 1000 lines of code. There is little option but to accept that the starting point is not exact, that there are bandwidth limitations within the definition process itself and in accessing the meaning of a word, even in an artificially limited context.
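To make that a bit more concrete, here is a tiny sketch in Python of what such a capped word-calling function might look like. The toy dictionary, the budget number and the function name are all made up for illustration; this is a sketch of the idea, not a real lexicon.

```python
# A minimal sketch of the capped, recursive definition lookup described
# above. The toy dictionary and the budget of 1000 "lines" are assumptions
# for illustration only.
TOY_DICTIONARY = {
    "horse": ["large", "hoofed", "mammal"],
    "large": ["big"],
    "big": ["large"],            # circular, as real dictionaries are
    "hoofed": ["having", "hooves"],
    "mammal": ["warm-blooded", "animal"],
}

def define(word, budget=1000, seen=None):
    """Expand a word into the words that define it, stopping when the
    budget is spent, a circular reference appears, or the word is unknown."""
    seen = set() if seen is None else seen
    if budget <= 0 or word in seen or word not in TOY_DICTIONARY:
        return [word], budget            # give up and return the word as-is
    seen.add(word)
    expansion = []
    for part in TOY_DICTIONARY[word]:
        sub, budget = define(part, budget - 1, seen)
        expansion.extend(sub)
    return expansion, budget

words, remaining_budget = define("horse")
print(words, remaining_budget)
```

The cap is doing the real work there: without it, the circular "large" / "big" pair would never stop expanding.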

Compromise is cool, so I end up with words defined with 1000 lines of code each, and it seems reasonable that we could build a dictionary of code which works and whose inaccuracies seem OK (kinda like ChatGPT). A quick search tells me there are 170 thousand current words and 47 thousand obsolete words in the English language. So the math comes out to a database of approximately 21 GB. Not too bad, might fit on a small computer.
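For the curious, that 21 GB is just back-of-the-envelope arithmetic. Here is the sum, assuming (my assumption, not a measured fact) roughly 100 bytes per line of defining code:

```python
# Back-of-the-envelope check of the ~21 GB figure. The 100 bytes per line
# is an assumed average, not a measurement.
current_words  = 170_000
obsolete_words =  47_000
lines_per_word = 1_000
bytes_per_line = 100

total_bytes = (current_words + obsolete_words) * lines_per_word * bytes_per_line
print(f"{total_bytes / 1e9:.1f} GB")   # roughly 21.7 GB
```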

However, definitions are changing! I have to maintain this database, and when one word changes it affects the meaning of the other words linked to it, so I have to check all of those. Fewer updates per second than your average internet connection, but I can see that this is enough work to occupy more than what one very productive person could be expected to achieve with all their efforts. I am going to need some teamwork here.
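To picture the maintenance job, here is a rough sketch of what checking the linked words might look like: when one definition changes, we walk the links and flag everything downstream for review. The link graph is a toy example I made up, not real data.

```python
# A sketch of the maintenance problem: when one definition changes, every
# word linked to it (directly or indirectly) gets flagged for review.
from collections import deque

LINKS = {                      # word -> words whose definitions cite it
    "horse": ["pony", "mule", "cavalry"],
    "pony": ["shetland"],
    "mule": [],
    "cavalry": [],
    "shetland": [],
}

def words_to_review(changed_word):
    """Breadth-first walk of the link graph from the changed word."""
    to_review, queue = set(), deque([changed_word])
    while queue:
        word = queue.popleft()
        for linked in LINKS.get(word, []):
            if linked not in to_review:
                to_review.add(linked)
                queue.append(linked)
    return to_review

print(words_to_review("horse"))   # {'pony', 'mule', 'cavalry', 'shetland'}
```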

So looking at words like this, we have a decentralised, evolving database to which we contribute and which we cannot comprehend, but whose extent we can understand. People could even influence the consensus on definitions over time in some areas of their experience or expertise.

Knowing now that the definitions are not absolute, and given that this metaphorical database is in consensus, we can know that as long as people understand the technical limitations, the resulting communication should be as good as it gets. It seems like it might work OK if we had heaps of human workers to manage this database and maybe some kind of digital neural link. Yet at the same time it is scary to give the attribution of meaning away to this machine. To me it's scary because a machine will act absolutely to the code, in a binary way, while I know that the word database is only an attempt at accuracy.

Now let's detail how word meanings are applicable and effective depending on perspective. To go to first principles, words are different depending on context, and context can be described by time and space. This all matters because the accuracy of data transfer relates directly to the amount of information we can transmit between people, the quality of communication.

The definition of a word twenty years ago in the database is not identical to the word today or what it might be in the future. That's one aspect of time, i.e. we time-stamp the definitions that we pull in our code. There is also another aspect of time, for which I need to bring in an example of a defined word, a database item.

So this item is a "horse". You might think "a horse is a horse, of course, of course", but what about over time? When in its evolution did a horse become a horse from a pre-horse creature? This is code, so we need to define the exact date. We can also look at the horse over the time spanned by its individual life: at what point in pregnancy does the horse become a horse, and at what moment of its death and decomposition does it cease to be a horse? Yikes!

What about our database entry for a "horse" in space? What shape and form can the horse be? Where does the Venn diagram overlap with a pony, a donkey, a mule? What aspects associated with the horse comprise the horse: the hair in its mane, the hair that falls off? The smell of the horse, the horse's hooves, its shoes? It is actually very hard to define in code how to draw a line around a horse in space.

So we need to add to the thousand lines of code for each entry to specifically address time and space in a way that is appropriate for that entry, and these need to be kept up to date in a way that is customised to your particular context, a perspective script. So our database is 42 GB in total now, still less than the bitcoin blockchain, but with half of it quite dynamic. That is far more information than I can comprehend, and the comparison ends quickly, because you don't need to perceive the movements of the bitcoin blockchain in your mind at all to live and communicate effectively. Words are how we structure our thoughts and interface with other humans, so fundamental.
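One way I picture that perspective script: each entry carries explicit time and space boundaries, and a lookup only matches when your context falls inside them. Everything below, the field names, the dates, what counts as part of the horse, is invented purely to illustrate the shape of the idea.

```python
# A sketch of an entry that carries its own time and space boundaries.
# All fields and values are illustrative assumptions.
from dataclasses import dataclass
from datetime import date

@dataclass
class Entry:
    word: str
    definition: str
    valid_from: date            # when this sense of the word applies
    valid_to: date
    includes: set               # things counted as part of the referent
    excludes: set               # nearby things explicitly ruled out

HORSE = Entry(
    word="horse",
    definition="a large solid-hoofed herbivorous mammal",
    valid_from=date(1900, 1, 1),
    valid_to=date(2100, 1, 1),
    includes={"mane", "hooves"},
    excludes={"horseshoes", "shed hair"},
)

def applies(entry, when, aspect):
    """Does this entry cover the given moment and the given aspect?"""
    in_time = entry.valid_from <= when <= entry.valid_to
    return in_time and aspect in entry.includes

print(applies(HORSE, date(2023, 6, 27), "mane"))        # True
print(applies(HORSE, date(2023, 6, 27), "horseshoes"))  # False
```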

Language is the database through which we communicate. The better each participant knows it and the more synchronised the consensus, the better the communication. The better the communication, the better almost everything: less chance of miscommunication, relationship breakdowns and misinterpretation; more accurate thought processes; less chance of bugs in a program; and faster information transfer. All of this follows from a better and more inclusive consensus on the definitions.

So hopefully these words establish how errors appear through time and space within our definitions and their context. Apologies that so many meanings can come from that sentence, but hopefully fewer in the context of this essay. Actually, I'm not sure the volume of words provides more meaning by itself. I think quality is important, and sometimes less is more.

Bringing it back to society, let's consider that we create laws with these words that we have created. The more words we use, the more exponentially complex maintaining them and abiding by them accurately becomes. If you are not very clever, and if you do not grasp the dynamics I attempt to articulate here, the law easily becomes fuzzy as you extend it. It becomes less comprehensible and less well defined as you add words to it. That is why we need people, juries and judges to interpret laws, and that is one area where I am confident fewer words can be better than more. I am sure we should all be able to read the law in full in the early parts of our lifetimes. To read it effectively you probably want to know about the fuzziness we have described here and have a good grasp of the definitions of the words within the law.

So now, here, we have a lot of words about words. Hopefully the information transfer in the text is not corrupted and it can give clarity. I think this understanding of the widespread persistence of fuzzy logic, not only as a mathematical concept but as a linguistic one, is important, and I wish I had been taught this. To me, fuzziness is an important foundation of accurate thought, a first principle of first principles and a basis for building really good relationships.
