Agency is the quality of making decisions. Such a quality assumes free will, or at least an ability to choose between alternate future realities. Determinism is hard to argue against, but that does not stop human subjects from ignoring it. Humans have an inherent desire to feel as if they are in control, even if that control amounts to selecting between a sandwich and a pizza for lunch. On some occasions the number of alternatives is so overwhelming that the subject is stifled into submission. In such cases, the agent may beg nearby acquaintances to choose for them, lest they suffer a panic attack.
Every agent has a type and a replicator. Roughly speaking, types are stochastic and replicators are deterministic. Types are linked to psychology and linguistics; replicators are linked to evolution and epistemology. Types engage in particular information processes. Replicators counteract dissipation and generate observations to be statistically validated. Keep in mind that I am attempting to create a far-reaching frame that goes well beyond the typical scope of what a 'type' or a 'replicator' is. Human types and genetic replicators are only small, albeit important, subsets. For instance, a replicator could also be an experiment where several scientists attempt to replicate a result by performing a specific measurement using a common method.
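To make the replication example concrete, here is a minimal sketch, assuming three hypothetical labs measure the same quantity with a common method and each lab's mean is then checked against the pooled mean as a rough statistical validation (the lab names and numbers are invented for illustration):

```python
import statistics

def replicate(measurements_by_lab):
    """Check whether each lab's measurements agree with the pooled mean."""
    pooled = [m for lab in measurements_by_lab.values() for m in lab]
    grand_mean = statistics.mean(pooled)
    report = {}
    for lab, values in measurements_by_lab.items():
        mean = statistics.mean(values)
        sem = statistics.stdev(values) / len(values) ** 0.5  # standard error of the mean
        report[lab] = (round(mean, 3), abs(mean - grand_mean) < 2 * sem)  # rough 2-sigma check
    return round(grand_mean, 3), report

# Hypothetical measurements of the same quantity by three labs using a common method.
data = {
    "lab_a": [9.79, 9.82, 9.81, 9.80],
    "lab_b": [9.83, 9.81, 9.80, 9.82],
    "lab_c": [9.78, 9.80, 9.79, 9.81],
}
print(replicate(data))
```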
Types are located in a territory. The closest equivalent here is ecology or systems theory. Incomplete information and limited attention are the primary qualities of this branch. Each type builds a model of its environment that represents the known, but such an expansive external reality will always remain partially unknown in the eyes of the beholder.
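As a toy illustration of a type modelling its territory under limited attention (the class and names below are hypothetical, not part of the frame itself):

```python
class Type:
    """A toy agent that only knows the parts of its territory it has observed."""

    def __init__(self):
        self.model = {}  # known locations -> observed contents

    def observe(self, territory, location, attention=1):
        # Limited attention: only cells within `attention` steps of `location` are recorded.
        for cell in range(location - attention, location + attention + 1):
            if cell in territory:
                self.model[cell] = territory[cell]

    def unknown(self, territory):
        return set(territory) - set(self.model)

territory = {i: f"resource-{i}" for i in range(10)}  # the expansive external reality
agent = Type()
agent.observe(territory, location=3)
print(sorted(agent.model))               # the known: [2, 3, 4]
print(sorted(agent.unknown(territory)))  # the still unknown: everything else
```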
Types have a complex. Complexes are subject to the considerations of theoretical computer science (not so theoretical nowadays). Complexity, information, communication, computation, algorithms, data structures, and so forth are relevant here. The basic elements are codes and data streams. As data is collected, it is encoded into memory and later decoded to be sent out as a signal to other nodes in the network. In an imperfect world, signal interference is a battle that must be fought with redundancy.
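A minimal sketch of fighting interference with redundancy, assuming a simple repetition code sent over a noisy channel (the error rate and repetition factor are arbitrary choices for illustration):

```python
import random

def encode(bits, r=3):
    """Add redundancy by repeating each bit r times."""
    return [b for b in bits for _ in range(r)]

def noisy_channel(bits, flip_prob=0.1):
    """Model interference: flip each bit with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(bits, r=3):
    """Recover each bit by majority vote over its r copies."""
    return [int(sum(bits[i:i + r]) > r // 2) for i in range(0, len(bits), r)]

message = [1, 0, 1, 1, 0, 0, 1, 0]
received = decode(noisy_channel(encode(message)))
print(message, received, message == received)
```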
Replicators have an economy due to their scarcity. Supply and demand curves can be sketched with the aid of various counting methods. Typically, such analysis would require sampling and estimation. Agents can exchange replicators in a market according to their preferences. In the aggregate, symmetries and asymmetries are the name of the game.
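A small sketch of the sampling-and-estimation step, assuming a hypothetical population in which a scarce replicator is held by a few agents and its share is estimated from a random sample with a normal-approximation confidence interval:

```python
import math
import random

def estimate_share(population, sample_size=200, seed=0):
    """Estimate the fraction of agents holding a scarce replicator from a sample."""
    rng = random.Random(seed)
    sample = rng.sample(population, sample_size)
    p_hat = sum(sample) / sample_size
    # 95% normal-approximation confidence interval for a proportion
    margin = 1.96 * math.sqrt(p_hat * (1 - p_hat) / sample_size)
    return p_hat, (p_hat - margin, p_hat + margin)

# Hypothetical population: 1 marks an agent holding the replicator, 0 otherwise.
population = [1] * 300 + [0] * 9700
print(estimate_share(population))
```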
Replicators are associated with a mechanism or deterministic process. This is the domain of physics, where any apparent randomness is considered a failure. Agents agree on a unit of measure, then several measurements can be performed. After a little cross-validation and model fitting, certain invariants may start to become apparent. Some quantities (like the energy associated with a time-invariant Lagrangian) do not change in closed systems, though systems can only be approximated as closed at either very large or very small scales.
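As a sketch of an invariant emerging from repeated measurements of an approximately closed system, here is a frictionless mass on a spring integrated with a velocity Verlet (leapfrog) scheme; the total energy stays nearly constant across the simulated measurements (all parameters are arbitrary choices for illustration):

```python
def simulate_oscillator(steps=10000, dt=0.001, m=1.0, k=1.0, x=1.0, v=0.0):
    """Velocity Verlet integration of a frictionless mass on a spring (a closed system)."""
    energies = []
    for _ in range(steps):
        v += 0.5 * dt * (-k * x) / m   # half-step kick
        x += dt * v                    # full-step drift
        v += 0.5 * dt * (-k * x) / m   # half-step kick
        energies.append(0.5 * m * v * v + 0.5 * k * x * x)  # kinetic + potential
    return energies

energies = simulate_oscillator()
# The invariant: total energy stays (nearly) constant across all measurements.
print(min(energies), max(energies))
```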
Overall, this is just a meta-frame that pegs traditional schools of thought onto a tidy hierarchy. In no sense is it a 'theory of everything' or a 'unification'. Rather, it is more like a toy that should be thrown against the wall, kicked out the window, and run over by a semi-truck. Perhaps the driver of the semi-truck will then stop and, out of guilt, give the kid a better toy. Internalizing a meta-frame such as this can help keep the mind organized while everything else falls apart.