Ontology as grammar
I have hit upon something that lets one unfold a sequence out of a seed, like this visualization by @elzr: https://twitter.com/elzr/status/1371732525106790402
This is a great visualization! Thanks for sharing.
This is how I picture the design process. Each level in the tree is a design decision. With every decision you differentiate/specialize the solution further and make other solutions impossible; you reduce the solution space.
Of course, when building, the solution space is not limited like in this example. The tree is not binary, and every decision probably has infinitely many possible choices. The depth of branches varies. And it’s not even a tree in the first place, but a (semi-)lattice, where some decisions let you cross over to another branch again (which is impossible in a tree).
In computing we often deal with solution spaces restricted like this, with certain orders, regularities (symmetries!), limits, and/or discrete structures.
This example, however, also adds some interesting ideas around representations. A simple sequence of four binary decisions yields 16 possibilities. But what these 16 possibilities stand for — what they represent — is completely arbitrary. Their meaning is constructed, based on metaphoric structuring that “makes sense” to us. All examples even map more or less directly to embodied schemas.
I love the part where he reveals the logic gates, as that is a more subtle and less expected meaning, and not all sixteen paths map to a useful thing. That is much more like design, where you also end up with lots of possibilities that are not useful.
I guess a deep question here is: are these meanings showcased in the video inherent to the structure itself (Do they exist in reality?), or are they constructs of the mind? 😎
[…] I didn’t quite think of ontology as grammar before! But of course, that is the function of ontology: to provide one of the two ingredients of a grammar, the primitives (rules being the other). And the grammar is the generative framework for creating larger constructs out of these smaller primitives. A grammar helps us form meaning-/use-ful constructs more easily by guiding our choices toward more sensible ones that follow the grammar’s rules, while still allowing us to generate infinitely many solutions.
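The split between ontology (primitives) and rules can be sketched concretely. This is a minimal toy example, not anything from the thread: the symbol and word names are entirely my invention, and the grammar is deliberately trivial. The point is only that a handful of rules over a small vocabulary guides generation toward "sensible" constructs while still permitting unboundedly many of them.

```python
import random

# Toy grammar: the ontology supplies the vocabulary (terminals) and the
# categories (non-terminals); the rules supply the generative structure.
# All names here are hypothetical placeholders.
rules = {
    "SENTENCE": [["NOUN", "VERB"], ["NOUN", "VERB", "NOUN"]],
    "NOUN": [["builder"], ["pattern"], ["town"]],
    "VERB": [["shapes"], ["contains"]],
}

def derive(symbol, rng):
    """Expand a symbol by randomly picking rules until only terminals remain."""
    if symbol not in rules:  # terminal: a primitive from the ontology
        return [symbol]
    out = []
    for s in rng.choice(rules[symbol]):
        out.extend(derive(s, rng))
    return out

print(" ".join(derive("SENTENCE", random.Random(1))))
```

Every derivation is guaranteed to be well-formed by construction; the grammar never produces "verb verb", even though that string uses only vocabulary from the ontology.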
And choosing an ontology, picking those labels, is not arbitrary! It theoretically can be, of course. But that wouldn’t make sense — literally: we only pick labels that make sense to us in some way. This, I think, is how metaphorical structuring connects to this.
“Good” primitives, a carefully picked ontology that “makes sense”, are what A Pattern Language is on a fairly high level, what metaphoric structures and embodied schemas are in human language just beneath the language level, and what ifrit describes in the linked thread on a mathematical low level.
Exciting. I need to let this sink in more before I get to the second half of your post…
I think this is where “meaning” can change from what is fundamentally mathematical (and/or cognitive) at the lowest levels to image-laden and constructed at the higher levels. For instance, we now have mental models of how things work in the grammar of capitalism, and therefore certain actions make more sense in that grammar than others. Yet the whole thing is just a construction that happens to be coherent enough for us to make sense of it and with it.
In that sense, Alexander is trying to focus our attention on the lowest-level grammar that is apparently “built-in”, either into our fundamental perception of the world or — what he really wants it to be — into the structure of the universe itself.
[…] But anyway, the point here is that every partial order along the way lends itself to the construction of many grammars describing how an observation (a sequence of binary values) maps to meaning (a natural number) — and sometimes even backwards.
The order we choose to make sense of a complex thing determines the grammars that can describe how this thing is put together. In that way, grammars are a sense-making device. Some are closer to a lookup table, in that we can’t really find any elegant patterns and most of it seems arbitrary to us. Some are closer to a total order, and we can express most results with just a few elegant rules. […]
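The lookup-table vs. elegant-rule distinction can be sketched directly, assuming the simplest case from above: observations are 4-bit sequences and meanings are the natural numbers 0–15. One grammar is a single compact rule (positional notation) that also runs backwards; the other is an arbitrary bijection (the shuffle and its seed are my invention) that maps observations to meanings just as well, but offers no pattern to compress.

```python
from itertools import product
import random

observations = list(product([0, 1], repeat=4))  # all 4-bit sequences

# Rule-based grammar: one elegant rule covers every case, forwards and
# backwards (positional binary notation).
def rule(bits):
    n = 0
    for b in bits:
        n = n * 2 + b
    return n

# Lookup-table "grammar": an arbitrary bijection onto the same meanings.
# Fixed hypothetical seed so the table is reproducible; each pair must
# simply be memorized, as no rule generates it.
shuffled = random.Random(0).sample(range(16), 16)
table = dict(zip(observations, shuffled))

print(rule((1, 0, 1, 1)))    # derivable from the rule: 11
print(table[(1, 0, 1, 1)])   # only recoverable by looking it up
```

Both mappings are total and invertible; what differs is how cheaply a mind (or a program) can carry them around.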
Thoughts on this discussion
- It’s worth following the thread of arbitrary vs. meaningful ontologies, and their connection to metaphorical structuring. And how more meaningful ontologies structure the solution space “better” in various ways.
- Solution spaces in computing are usually discrete and limited. How does this observation help?
- The logic gate example is an interesting case where a subset of the solution space can represent the whole solution space, and it points towards functional completeness and ergodicity. This seems like an interesting path to investigate further.
- The idea of a grammar as a sense-making device points towards grammars being a foundational primitive for thinking — of course, language is heavily involved in how we think analytically.
- Cognitive categories are like non-terminals, kinesthetic image schemas could be one form of terminals (there might be others), and the rules form the tree/lattice-shape of the ontology.
- I’m unsure whether this is really one ontology; I suspect it is a multi-dimensional overlapping of independent tree-shaped ontologies that combined form a lattice structure.
- If grammar is a foundational thinking primitive, (spoken/written) language would naturally form from it.
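The functional-completeness point above has a classic concrete form: NAND alone generates the other gates, so a single element of the 16-function space can stand in for all of it. A minimal sketch of the standard constructions:

```python
# NAND as the sole primitive.
def nand(a, b):
    return 1 - (a & b)

def not_(a):       # NOT a  ==  a NAND a
    return nand(a, a)

def and_(a, b):    # a AND b  ==  NOT (a NAND b)
    return not_(nand(a, b))

def or_(a, b):     # a OR b  ==  (NOT a) NAND (NOT b), by De Morgan
    return nand(not_(a), not_(b))

# Verify the derived gates against the built-in operators on all inputs.
for a in (0, 1):
    for b in (0, 1):
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
```

Since {NOT, AND, OR} is complete for all Boolean functions, so is NAND by itself, which is what makes the "subset representing the whole space" observation precise.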