Symmetry and recursion as repetition

Symmetry = repetition in space, recursion = repetition in time

Stefan

[…] Moving on, then you throw in the “(a)symmetries” of the space. This might be the lower-level explanation of why we pick up on regularities and “like” symmetry so much. At the end of the day we are pattern recognition machines (it’s hard to leave the mechanist world view behind, isn’t it?), and it is very likely that a lot of our perception and cognition is “just” sense making in the form of discovering familiar patterns.

[Two paragraphs moved to -> Eleatic philosophers]

But wherever these (a)symmetries come from, they seem to be perceptible building blocks of higher-level structures, likely causing higher-level structural patterns (e.g., geometric properties) to emerge that we can then also pattern-match, because they are more likely to appear than other configurations.

I’m still mulling over the “functional completeness” part of it. The way I see it, a grammar does play a role in filtering out some forms that become the equivalent of syntax errors, so it reduces the solution space somewhat, but that is probably not in violation of what you mean by “functional completeness”? Yet a grammar still provides infinitely many possible constructions, and filtering is not the only and likely not the most important part of a grammar. The interesting thing a grammar adds is that the construction of a result can now be described as a generative sequence — the (potentially recursive) application of rules in the grammar. Which coincidentally also carves the space up into meaningful/not meaningful.
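A toy sketch of that point (my own illustration, not from the discussion): a hypothetical two-rule grammar over balanced parentheses. Every “meaningful” string is reachable through a sequence of rule applications, and the same rules carve the space of all strings into derivable and not derivable.

```python
# A toy generative grammar (hypothetical, for illustration):
#   S -> "" | "(" S ")" | S S
# Construction is a generative sequence: each string is produced by
# recursively applying the rules, starting from the empty string.

def generate(depth):
    """Enumerate all strings derivable within a given recursion depth."""
    if depth == 0:
        return {""}
    smaller = generate(depth - 1)
    results = set(smaller)
    results |= {"(" + s + ")" for s in smaller}           # rule: S -> ( S )
    results |= {a + b for a in smaller for b in smaller}  # rule: S -> S S
    return results

def is_meaningful(s, depth=4):
    """The grammar carves strings into meaningful (derivable) and not."""
    return s in generate(depth)

is_meaningful("(())")  # derivable, hence "meaningful"
is_meaningful("(()")   # a "syntax error": no rule sequence produces it
```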

But the real power is that we’ve just broken down a complex thing (the result) into a sequence of steps that are much easier to describe, whereas our default mode of operation is usually to try to break down a complex thing into its structural components along a whole-parts-configuration schema. This is at the core of Alexander’s theory; he talks at length about both structure (Book 1) and process (Book 2) and provides a lot of the reasoning for why the slightly less obvious approach of breaking complexity down in process as opposed to structure is more powerful, in the sense that we can create more complex systems that way.

My suspicion is that this is all fundamentally about the duality of result/process or data/algorithm or declarative/imperative. […]

-> Imperative declarative process structure

Stefan

about the lowest-level symmetries in that visualization from the tweet:

If we take the natural numbers representation for example, we could assign binary representations to numbers in an arbitrary order. Let’s say in a random order (meaning without any discernible structure). Then we’d have to use a lookup table for each possible mapping to make sense of it, or memorize quite a lot.

What we end up doing instead though, is we use an order that feels natural to us — from lowest to highest. We could still use a lookup table, but now we have much more efficient ways of converting a sequence of binary values into a natural number (and back): We can use an algorithm. That is more efficient, and it seems more elegant to do it that way. There are many different possible orders we can choose, but this one appeals to us a lot, because it somehow makes more sense.
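The contrast can be sketched in a few lines (a toy illustration; the names are mine): with an arbitrary assignment, a lookup table is the only way to decode, while the conventional lowest-to-highest order lets a short algorithm replace the table entirely.

```python
import random

# Arbitrary assignment: the only way back is a memorized lookup table.
patterns = [format(n, "04b") for n in range(16)]
random.shuffle(patterns)
lookup = {pattern: n for n, pattern in enumerate(patterns)}

def decode_by_table(pattern):
    return lookup[pattern]  # must memorize all 16 entries

# Conventional order: a tiny algorithm replaces the whole table,
# and the same rule works for binary strings of any length.
def decode_by_rule(pattern):
    value = 0
    for bit in pattern:  # most significant bit first
        value = value * 2 + int(bit)
    return value

decode_by_rule("1011")  # -> 11
```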

I can “feel” the symmetry in this: instead of memorizing everything, we memorize fewer things, in this case just a starting point, and then reconstruct the rest through unfolding the symmetries. It doesn’t quite match perfectly, but I feel a strong connection here between symmetries, unfolding, and the recursive aspect of generative processes (I’d intuitively say that’s a form of symmetry, but I don’t know if that holds up mathematically?).

You can also picture some other partial orders somewhere in between those extreme cases of total randomness and total order, where locally some parts “make sense” (have order) but globally it doesn’t all fit together neatly. And that is pretty much our default mode of wandering through the world. Perhaps inspired by mathematics we think there must be some total order, the one rule or law of the universe that can describe everything. And maybe that is so.

-> Eleatic philosophers

[…] Hypothesis: When we decide to take a complex thing apart structurally (structure: whole-part-configuration), it is symmetries that help us make sense of it. And when we decide to take a complex thing apart procedurally (process: sequence/path-goal), it is recursion that helps us make sense of it. Because both symmetry and recursion allow us to “compress” the complexity in a form of repetition. Symmetry as repetition in space, recursion as repetition in time.
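One way to make the compression claim concrete (a toy sketch of my own, not from the discussion): store half of a symmetric structure plus a mirroring rule, or store a recursive rule instead of the unfolded sequence. In both cases a small description reconstructs the whole.

```python
# Symmetry: repetition in space — store one half, reconstruct by mirroring.
def unfold_mirror(half):
    return half + half[::-1]

# Recursion: repetition in time — store a starting point and a rule,
# reconstruct by repeatedly applying the rule.
def unfold_rule(start, step, n):
    if n == 0:
        return [start]
    rest = unfold_rule(start, step, n - 1)
    return rest + [step(rest[-1])]  # each element built from the previous

unfold_mirror([1, 2, 3])            # -> [1, 2, 3, 3, 2, 1]
unfold_rule(1, lambda x: 2 * x, 4)  # -> [1, 2, 4, 8, 16]
```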

Prabros

I can “feel” the symmetry in this: instead of memorizing everything, we memorize fewer things, in this case just a starting point, and then reconstruct the rest through unfolding the symmetries. It doesn’t quite match perfectly, but I feel a strong connection here between symmetries, unfolding, and the recursive aspect of generative processes (I’d intuitively say that’s a form of symmetry, but I don’t know if that holds up mathematically?).

If you have some time to explore the math side of it, please have a look at the idea of quotienting and equivalence classes. I might have linked it up before, but please go through the answer by Lors Soren: https://www.quora.com/What-is-an-intuitive-explanation-of-a-quotient-space and see how in mathematical models the creation of an equivalence class captures this idea. Quotienting, which is quite similar to dividing by a base, is what leads up to the idea of abstraction, whereby you equate things that share a property modulo the properties that differ. “Modulo” here also maps to the mathematical idea of taking a remainder. So this is, in a sense, where arithmetical operations reflect the way we think about mental concepts!
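The quotienting idea can be sketched directly (a minimal illustration of the standard construction): taking remainders mod n partitions the integers into equivalence classes, and abstraction means working with the classes instead of the individual elements.

```python
# Quotienting the integers by "same remainder mod n": each equivalence
# class collects elements that agree modulo n, i.e. elements equated
# while ignoring the property that differs.
def quotient(elements, n):
    classes = {}
    for x in elements:
        classes.setdefault(x % n, []).append(x)
    return classes

quotient(range(10), 3)
# -> {0: [0, 3, 6, 9], 1: [1, 4, 7], 2: [2, 5, 8]}
```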

Prabros

Symmetry as repetition in space, recursion as repetition in time.

That is a damn awesome line. But let me scrutinize it privately and see if it holds up in different settings; if it does, I think you’ve said something striking about the structure/process dichotomy and have possibly identified a unifying concept here.


Thoughts about this discussion

  • Restricting representable forms is in violation of functional completeness; the two are distinct concepts. Both are still interesting properties, though, and deserve further investigation.
  • It could be that the structure of a chosen grammar itself is what causes “fundamental properties” to emerge from a process described by that grammar. It seems intuitively sound that if a generative grammar restricts its solution space, some permutations of possible results become more likely than others, perhaps even in a describable statistical distribution. This could explain the appearance of patterns in configurations, such as the Fifteen fundamental properties.

References

  • https://www.quora.com/What-is-an-intuitive-explanation-of-a-quotient-space
