In recent times, there have been two attempts to create the Universal Character Language that jump out at me. One of them is Stephen Wolfram’s language. The other is iConji.
Of the two, while Wolfram’s language is certainly impressive, he has not made the simple leap to the idea that some form of text-based icon, similar to emoticons, is necessary. (iConji has.) Wolfram is a brilliant thinker and has done an immense service by building a computational knowledge engine that has introduced a whole generation to problems in science and mathematics. But I find he falls into some of the typical traps of those who have attempted the Universal Character: too much theorizing at the expense of actually solving the problem.
Wolfram is certainly aware of this, and the history of those who have had similar inclinations. To be sure, it is a great irony. In order to have the idea to make a Universal Character at all, one must have a deeply theoretical mind, capable of at least grasping the concepts behind the fundamentals of mathematics and logic, if not their formal theorems.
And yet, to actually build the Universal Character, one must act like someone who has none of those skills, and behave in an incredibly pragmatic, mission-driven way. One must put theory aside and focus on entirely concrete matters. Not surprisingly, these skill sets do not intersect in many individuals. And even having both, history teaches us, does not guarantee success.
I’ve often thought that you need to have the kind of personality comfortable with spending years in a research library, and also be a daredevil ready to jump off literal cliffs. Discipline and daring rarely align.
I’m not imputing any personality traits to Wolfram, of course. I am merely noting that, although he has defined what the Universal Character would have to be (really a lexicon, and pictorial, rather than a notation of logical connectives like Frege’s Begriffsschrift), as far as I know (do correct me if I’m wrong) he has not taken any concrete steps in this direction since 2016.
Which leads me to iConji, the most recent project which came close to a Universal Character Language. Like Wolfram, its creator, Kai Staats, was aware of the lineage of failures that came before him. Unlike Wolfram, he had the patronage of Apple.
I want to take a moment to discuss this failure, because I find it incredibly interesting. If you’ve been following some of my other posts in this newsletter you may see why.
Conscious that what had driven the failures of most previous creators was a megalomaniacal tendency to specify a lexicon for speakers that was too idiosyncratic, Staats decided to make iConji open source. Anyone could submit their own symbols (this was in 2010). Anyone could also add their own meanings. The symbols were supposed to be abstract and universal, and to represent their meanings directly.
Unfortunately, the lack of centralization created more problems than it solved. Ultimately, iConji could not attract enough participation from its user base and shut down.
Most interesting to me, however, is this: Staats also wanted people to create their own rebii. He was sophisticated enough to understand that what made Egyptian and Chinese writing successful was the phonosemantic element. Unfortunately, his attempt represents the rare mistake of giving too much initiative to his speakers, most likely a product of the era he was living in.
The idea of creating rebii is essential for creating a system. Unfortunately no one has been wild enough yet to attempt a universal rebus. (Coming soon. I have an Observable notebook with a lot of data in it that is up for grabs. I may publish these results in a scientific paper.)
The other thing iConji obviously did wrong was to expect images to be unambiguous. As I discussed in this post, images are not less ambiguous than words. On the contrary, they offer more potential interpretations. The only universal icons are concrete depictions of objects. We know a picture of a bucket 🪣 is a bucket if it is a faithful representation of a bucket. If we have seen a bucket, and have a word for it in our language, then given an icon 🪣 that faithfully depicts it, we will not fail to produce its name.
On the other hand, if I lived on the Andaman Islands and had never seen a vessel shaped like that, I would be hard-pressed to tell you what that picture was.
So, what can we do? Well, we can look at universal concrete concepts that are found cross-linguistically. See my notebook above. We want them to be concrete, not abstract. iConji does not do this. It tries too hard to represent all abstract concepts with universal images. Impossible. It also tries to represent every single word in grammar. Also misguided.
The iConji sentence above requires a Western audience to understand what it is talking about. We need to be familiar with coffee cups, we need to already know the @ symbol, we need to know the Arabic number system, be familiar with Westernized Babylonian timekeeping, as well as the Western question mark. None of this is transparent, and all it does is create a system that is more opaque than alphabetic writing, which is the opposite of what the purpose of such a language would be.
In contrast, here is a very crude example of a word in my language, the first word, written both phonetically and ideographically. This is just an example and subject to change. The same symbols represent both sounds and meanings, so 𓂓 represents the vowel ‘a’ when highlighted in blue and the concept ‘all’ when highlighted in red. The flying dove 🕊️ represents the sound ‘l’ when highlighted in blue. Thus the same symbols will be used to write sounds and meanings, depending on the color code. You tag phonetic symbols with combinations of ideograms to give them an address in the ontology.
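To make the color-code mechanism concrete, here is a minimal sketch in Python. The class, the ‘blue’/‘red’ mode strings, and the ‘dove’ concept value are my own illustrative placeholders, not a published spec of the system; the point is only that a single glyph carries both a phonetic and a semantic reading, and the highlighting selects which one is being written.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Glyph:
    symbol: str   # the pictorial sign itself
    phone: str    # phonetic value, read when the glyph is highlighted blue
    concept: str  # concept label, read when the glyph is highlighted red

def read(glyph: Glyph, mode: str) -> str:
    """Interpret a glyph phonetically ("blue") or ideographically ("red")."""
    if mode == "blue":
        return glyph.phone
    if mode == "red":
        return glyph.concept
    raise ValueError(f"unknown color mode: {mode}")

ka = Glyph(symbol="𓂓", phone="a", concept="all")
dove = Glyph(symbol="🕊️", phone="l", concept="dove")  # 'dove' concept is a placeholder

print(read(ka, "blue"))    # -> 'a'    (phonetic reading)
print(read(ka, "red"))     # -> 'all'  (ideographic reading)
print(read(dove, "blue"))  # -> 'l'
```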
This is a more flexible, phonosemantic writing system. Each glyph represents concepts instantly translatable into 2,000 different languages, but each one also represents a phone from the IPA. This means that the bulk of the writing one does with the system is phonetic, but there is also a semantic element to it.
Words will be spelled with sounds and then tagged with meanings, highlighted in red. It would be a bit like if I wrote an English sentence like this: a dog 🐕 swam 🏊‍♂️ across a river 🌉, and each of the emoticons was a hyperlink that led me to an exact definition of that word, an encyclopedia article, or a web of linked concepts. Except, instead of writing in the Latin alphabet ‘ABCDE’, I would use the same pictures that represent meanings to also represent sounds.
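Here is a second, self-contained sketch of that word-level idea, again with purely hypothetical names and a toy three-entry ontology: a word is a phonetic spelling plus semantic tags that resolve, hyperlink-style, to addresses in an ontology.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Toy ontology: concept label -> an "address" (here just an ID string).
ONTOLOGY: Dict[str, str] = {
    "dog": "concept/0001",
    "swim": "concept/0002",
    "river": "concept/0003",
}

@dataclass
class Word:
    phones: List[str]                              # glyphs read phonetically ("blue")
    tags: List[str] = field(default_factory=list)  # glyphs read semantically ("red")

    def pronounce(self) -> str:
        # The spelling itself is purely phonetic, like a transcription.
        return "".join(self.phones)

    def resolve(self) -> List[str]:
        # Each semantic tag behaves like a hyperlink into the ontology.
        return [ONTOLOGY.get(tag, "unknown") for tag in self.tags]

dog = Word(phones=["d", "ɒ", "ɡ"], tags=["dog"])
print(dog.pronounce())  # -> 'dɒɡ'
print(dog.resolve())    # -> ['concept/0001']
```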
The first document I produce will be a transcription of the UN Declaration of Human Rights in the English language, to demonstrate the efficacy of this system. I am almost finished with the meaning side of my language and am about to produce the Universal Rebus Principle. I invite any readers to follow me on Twitter and comment on the first document written in my language: to try to decode it, to say whether they find the design interesting, or to offer any criticisms.
I believe there are three things I have done which make my attempt at creating this product different from the ‘900 others’ of the past 1,000 years. The first is that I already know my concepts are universal, because there are cognates in 2,000 languages directly linked from the PanLex project (a sketch of such a coverage check appears at the end of this post). The second is that I am actively engaging with, and in conversation with, everyone who has attempted this task before me. My motivations are incredibly similar to those of the people who attempted this in the 17th century. I am a pansophist, and firmly believe in encyclopaedism as the pinnacle of the scientific project. Many who have attempted this recently have viewed those motives as a relic; for me they are quite sincere.
The third, final, and most significant thing that truly is different from those who have attempted this before is my commitment to the Universal Rebus Principle. This is such a small, basic intuition, with almost no intellectual merit at all: sounds have inherent meanings, or at least we experience them that way. That is my interpretation of the Rebus Principle. Therefore, I will produce a writing system in which one writes in sounds and meanings at the same time, as the Egyptians did.
To my knowledge, although John Wilkins came very close, no one has tried this before.
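As a postscript to the first of the three points above, here is a rough sketch of the kind of cross-linguistic coverage check I have in mind. The file name, column names, and the two-thousand-language threshold are hypothetical placeholders; the real PanLex data model is considerably richer and is not reproduced here.

```python
import csv
from collections import defaultdict
from typing import Dict

def concept_coverage(path: str) -> Dict[str, int]:
    """Count how many distinct languages attest each concept in the export."""
    languages = defaultdict(set)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            languages[row["concept"]].add(row["language_code"])
    return {concept: len(langs) for concept, langs in languages.items()}

if __name__ == "__main__":
    coverage = concept_coverage("panlex_export.csv")  # hypothetical file name
    # Keep only concepts attested in at least 2,000 languages.
    universal = {c: n for c, n in coverage.items() if n >= 2000}
    for concept in sorted(universal, key=universal.get, reverse=True)[:20]:
        print(concept, universal[concept])
```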