Epistemological model of the search for biological replicators

1

The central theme of "The Extended Phenotype" by R. Dawkins can be divided into two parts:

The search for the replicator, i.e. a well-fixed [micro]biological object to which one can consistently attribute the leading role in the targeted (?) regulation of actual phenotypic diversity in the chain of population reproduction, thereby defining the conceptual boundaries of that diversity.

Among the candidates considered are the population, the organism, the organ, and the cell, but since his first sensational book the author has argued for the primary role of the gene and DNA in this process. The concept of the "extended phenotype" does not change the attitude towards the first point, but turns to the second and suggests including in the phenotype not only the characteristics directly related to the biological individual, but the entire phenomenal environment whose diversity is in some way determined by its genetic configuration through genetically determined behavior.

Considering this conceptual work through the optics of guided epistemology opens the following perspective. The focus of the problem obviously lies in an interdisciplinary space, in the desire to create a theoretical tool for the consistent connection of two groups of ontologies:

GO1. A group of ontologies of high phenomenological diversity: the ontology of a physicochemical agent, the ontology of a microbiological agent, and conceptually close ones; for the sake of brevity we will not specialize further here;

GO2. A group of high-level ontologies: the ontology of the behavior of an individual or a population; psychological descriptions based on the subjective experience of reflecting on intrapersonal and interpersonal communication practices.

2

In an epistemological context, the problem of finding a replicator is a homologue of the "hard problem of consciousness" formulated by D. Chalmers in his theory of consciousness: determining the relationship between objectively fixed physical entities / processes and subjectively fixed mental entities / processes. It is more or less homologous to a series of problems investigated in an interdisciplinary conceptual space. This series of problems can be generalized to the following wording:

The construction of consistent mappings between ontologies of different orders. The mappings must be consistent in accordance with current epistemological norms (logic, provability, falsifiability, etc.). "Order" can be defined here as a discrete measure of the generalization of sensory experience; a "mapping" is a description that, for certain scientific problems, often does not have the status of its own discipline and ontological identity, or is assigned to the zones of the "philosophy of science" or the "philosophy of consciousness" that have already received inter- or transdisciplinary status.

From an epistemological point of view, the problem of ontological impedance is at issue here:

There are several zones of scientific or technical practice with their corresponding methods of description and ontologies. The practices differ in that the operators performing them carry out different actions based on executable descriptions of different composition, structure, and organization. There are practices organized around high-level goal situations whose cooperative implementation requires a group of practices (1): for example, treating diseases that are symptomatically determined by medical observation and described in medical terms (1.1) by means of DNA correction through nucleotide synthesis or plasmid techniques (1.2). The descriptions, including those assessing the target states of (1.1) and (1.2), are incompatible at the level of executable elements: ontology integration (in the sense of ISO 11354-1) is impossible or unstable. For a genetic engineer, for example, the task "eliminate the occurrence of an allergic rash" is not well posed; a chain of specializations is needed down to formulas of the form "block the expression of protein A in synthesis B".

The cooperation of these practices in a situation of ontological incompatibility is managed by association, i.e. through a generalization of the various controlled ontologies by means of uncontrolled constructions of the episteme (everyday and general scientific ontologies, thought patterns, knowledge acquired through education and a common background, etc.) that the operators can share. If there is no epistemic consensus, or it is insufficient, cooperation is not possible: physicians and genetic engineers can arrive at a coordinated practice for treating a disease only if they speak (or are sufficiently translated into) the same language, follow the same general rules (hypothesis testing, proof, etc.), and have a common method for finding local agreements when descriptions of particular goals, entities, and processes are inconsistent.

Methodologists, philosophers of science, and epistemologists undertake attempts at unification: the introduction of additional comprehensive ontologies and / or mediator ontologies that reduce the impedance by the maneuver of "generalization – specialization along a specific path" or by "manual assignment" – a category from an implicitly introduced additional ontology, intermediate with respect to GO1 and GO2, under which the conceptualization of the replication process from GO1 is reduced to a single theoretical construct with an object designation and semantics suitable for placement in the conceptual zone GO2.
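To make the "mediator ontology" maneuver concrete, here is a minimal sketch under invented assumptions: the mappings MEDICAL_TO_MEDIATOR and MEDIATOR_TO_MOLECULAR and the function specialize are illustrative names, not a real medical or molecular ontology. The sketch only shows how a clinical goal might be specialized step by step, via an intermediate category, into a formula of the form "block the expression of protein A in synthesis B" instead of being integrated directly.

# Illustrative sketch of the "generalization - specialization along a specific path"
# maneuver via a mediator ontology. All mappings below are invented examples,
# not a real medical or molecular ontology.

# Clinical-level description (zone 1.1) -> mediator category.
MEDICAL_TO_MEDIATOR = {
    "eliminate allergic rash": "suppress histamine-driven inflammatory response",
}

# Mediator category -> executable molecular-level description (zone 1.2).
MEDIATOR_TO_MOLECULAR = {
    "suppress histamine-driven inflammatory response":
        "block the expression of protein A in synthesis B",
}

def specialize(clinical_goal: str) -> str:
    """Walk the specialization chain 1.1 -> mediator -> 1.2.

    Raises KeyError if no path exists, which mirrors the situation where
    epistemic consensus is missing and cooperation breaks down."""
    mediator = MEDICAL_TO_MEDIATOR[clinical_goal]
    return MEDIATOR_TO_MOLECULAR[mediator]

print(specialize("eliminate allergic rash"))
# -> block the expression of protein A in synthesis B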

Difficulties in such work arise from the general problems of ontologization as a method, and in this case consist of the following:

For a "naïve" association: an exponential increase in communication costs as task complexity increases, quickly leading to the need to create a method. In the absence of means of integrating heterogeneous ontologies, unification – standardization in one form or another – remains the only solution: an increase in the number of ontologies, which in turn incurs maintenance costs and recursively reproduces all the problems mentioned.

Scheme 1. Epistemological model of the replicator search in ArchiMate 3.

3

Scheme 1 shows an epistemological model of the replicator search in a notation based on the ArchiMate 3 standard. An attempt was made to adapt the notation for epistemological formulations and to keep it simple enough that the diagrams remain comprehensible.

4

In the philosophy of consciousness there is the theme of reductionism, that is, of the status of the attitude that "consciousness is reducible to physical processes". Alternatives to this position include constructs designed to alleviate the ontological impedance in some way or to conceptualize the problem state, e.g. supervenience.

In the epistemological interpretation, the attraction of reductionism is the rejection of the need for ontology collaboration in general, for example by establishing the universality of physical, logical, or mathematical descriptions. The problem then is to find a consistent reduction to the basic ontology: "computable formalisms," "mathematical methods," and so on.

What speaks against reductionism is that the organized diversity (or executable complexity) of the explanatory theoretical constructions that can be built on its foundation is low relative to the total amount of explanation required; it scales badly and expensively: the internal and / or external connectivity of the constructions is lost. This is assessed both by the formal norms of the episteme (logic) and by the subjective judgment of the operator / researcher / thinker: the constructions are too complex to be comprehensible and applicable. In effect, such a theory has low organizational quality: the composite constructions exhibit many perceptible application errors and require either an explicit narrowing of the phenomenological spectrum they try to cover in generalizations and / or constant local additions that may not be provided for in the bulk of the theory.

Critics of reductionist theories who remain within objectivist philosophy, however, must choose one of the camps in the opposition "monism – dualism", with the obvious choice being in favor of the second. This choice is forced only because the available practices and thinking tools are not adequate to the task; however, this can only be seen from an epistemological point of view. Dualism seems superfluous and too costly because it implies the presence of two irreducible intellectual spaces, two sets of thinking tools. Monism (primarily reductionist: it requires a primordial ontology and a method of reducing everything to it) is more economical, but the objectivist monism available to modern science does not provide tools adequate to the complexity of the tasks of high-level behavioral abstraction. The authority of positivist science is undeniable because of its obvious achievements, but for solving such problems the configuration of this episteme becomes a significant difficulty.

Creating yet another ontology at each level cannot in itself be a general solution to such problems. A long-term and scalable solution cannot be found through ontological engineering alone, even if it is systematically applied in the methodology of science and in interdisciplinary collaboration.

5

However, it seems that a solution to problems where ontological management is expensive can be found by introducing methods with sufficiently strong epistemological management. "Epistemological management" refers to the existence of practices and ideas that support the manipulation of the parameters of ontology groups. For example, one of the key parameters of the objectivist, positivist episteme is the concept of "truth," which has several alternative interpretations that are found intuitively and applied arbitrarily. In modern science there are methods for selecting a suitable concept of truth (e.g., logical or Bayesian) for a particular problem. However, the selection method consists of a number of heuristics, is neither disciplined nor formalized, and can therefore only be described as "weak epistemological management".
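As an illustration of the last point, here is a minimal sketch assuming invented names (Claim, evaluate_logical, evaluate_bayesian, select_truth_concept) that are not part of the original text: it contrasts a binary, logical notion of truth with a graded, Bayesian one, and uses a crude heuristic to pick between them, which is exactly the kind of informal selection described above as "weak epistemological management".

from dataclasses import dataclass
from typing import Optional

@dataclass
class Claim:
    """A hypothetical claim to be assessed (illustrative only)."""
    statement: str
    deductively_decidable: bool          # can it be settled by proof from accepted premises?
    prior: float = 0.5                   # prior degree of belief, if treated probabilistically
    likelihood_ratio: float = 1.0        # P(evidence | claim) / P(evidence | not claim)
    proof_found: Optional[bool] = None   # result of a proof attempt, if any

def evaluate_logical(claim: Claim) -> Optional[bool]:
    """Binary notion of truth: true, false, or undecided (None)."""
    return claim.proof_found

def evaluate_bayesian(claim: Claim) -> float:
    """Graded notion of truth: posterior degree of belief via Bayes' rule (odds form)."""
    prior_odds = claim.prior / (1.0 - claim.prior)
    posterior_odds = prior_odds * claim.likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

def select_truth_concept(claim: Claim) -> str:
    """A crude heuristic selector: the kind of informal rule the text
    calls 'weak epistemological management'."""
    return "logical" if claim.deductively_decidable else "bayesian"

if __name__ == "__main__":
    c = Claim("Protein A is expressed in tissue B",
              deductively_decidable=False, prior=0.3, likelihood_ratio=4.0)
    concept = select_truth_concept(c)
    verdict = evaluate_logical(c) if concept == "logical" else evaluate_bayesian(c)
    print(concept, verdict)   # bayesian 0.631...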

But the demand for a sufficiently powerful and therefore non-reducing theory of knowledge requires overcoming the limits of objectivist epistemology and objectivist ontologization. If only because of anthropological limitations, a person does not have many basic maneuvers to use. Another special generalization is required, but one beyond the limits of objectivist ontology, which culminates in the form of an "object," an "essence," a "process," or some other category.

While ontological engineering has made ontologization methodical and computable, the episteme remains unpredictable, and epistemic practices are usually uncontrolled or, more precisely, naively controlled, with no method beyond philosophical reflection. The latter has little discipline and explanatory power, much like the "natural philosophy" before Newton, which nevertheless laid the foundation for the transformation into the physics known to us.

Guided epistemology is a discipline that would provide intellectual tools for:

– supporting the lifecycle of specialized, interoperable, and executable representations capable of organizing the current phenomenological picture of an operator in any zone with a set of objectives of practices of different scale, thereby reducing the cost of maintaining the stability of cross-domain collaboration;

– integration of a wide range of objectivist ontologies through practices and representations that use epistemological management, i.e. that are resistant to the problems of ontologization owing to their degree of generalization. However, it seems that only a local integration limited by the practical scope (the agent's attention) can be a viable task, rather than a "theory of everything" consisting solely of atomic and highly mobile epistemological constructs;

– development of epistemic states in a form suitable for computation with available mathematical and software / hardware tools such as vector machines (see the sketch after this list).
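A purely illustrative sketch of the last item, under assumptions of my own (the bag-of-concepts encoding, the embed and overlap helpers, and the toy VOCAB are all invented): it only shows that an epistemic state can be given a vector form that off-the-shelf numerical tools can manipulate, for example to estimate how much shared ground two operators have.

import numpy as np

# Hypothetical fixed vocabulary of concepts shared by two practices (illustrative).
VOCAB = ["gene", "protein", "expression", "symptom", "diagnosis", "therapy"]

def embed(weighted_concepts: dict[str, float]) -> np.ndarray:
    """Encode an epistemic state as a vector over a shared concept vocabulary."""
    v = np.zeros(len(VOCAB))
    for concept, weight in weighted_concepts.items():
        if concept in VOCAB:
            v[VOCAB.index(concept)] = weight
    return v

def overlap(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity as a crude measure of shared epistemic ground."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

# Two operators from different zones (a physician and a genetic engineer).
physician = embed({"symptom": 1.0, "diagnosis": 0.8, "therapy": 0.6})
engineer = embed({"gene": 1.0, "protein": 0.9, "expression": 0.7, "therapy": 0.2})

print(f"epistemic overlap: {overlap(physician, engineer):.2f}")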

6

One of the key points of guided epistemology is attention theory, in which attention is a term for a group of unbiased generalizations that relate to the agent's ability to act selectively. Such a formulation is needed here, on the one hand, to show a certain connection with the psychological and neural notions of "attention" and, on the other hand, to distance it sufficiently from them, because the operationalization of the concept (epistemological attention, attention as a definite practice for constructing an epistemology) differs significantly from the neuropsychological one.

The most primitive operational and computable construct of epistemological attention theory is the attention pattern. One of its qualitative innovations is a controlled binary thinking position in which every conceptualization requires parallel ontological and epistemological control in each zone.

The other core is the agent framework and the cognitive architecture of the agent, which are necessary to operate well-developed object means across a variety of attention states and to be able to port these representations to compatible computer architectures (e.g., differentiable and artificial neural networks).
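To ground these notions, the following sketch is hypothetical in every name (AttentionPattern, its fields, to_vector): it merely illustrates the idea of a construct that carries parallel ontological and epistemological annotations per zone and can be projected into a vector so that compatible numerical or differentiable machinery could consume it.

from dataclasses import dataclass, field

import numpy as np

@dataclass
class AttentionPattern:
    """Hypothetical encoding of an attention pattern: each conceptualization
    carries a parallel ontological and epistemological annotation per zone."""
    focus: str                                # what the agent is selectively acting on
    ontological_tags: dict[str, str]          # zone -> category used to fix the object
    epistemological_tags: dict[str, str]      # zone -> norm governing the fixation
    salience: dict[str, float] = field(default_factory=dict)  # zone -> weight

    def to_vector(self, zones: list[str]) -> np.ndarray:
        """Project the pattern onto a fixed zone order so it can be handed to
        numerical / differentiable machinery (a porting stub, not a model)."""
        return np.array([self.salience.get(z, 0.0) for z in zones])

# Usage: the replicator search, annotated in both registers for two zones.
pattern = AttentionPattern(
    focus="replicator",
    ontological_tags={"GO1": "molecular object", "GO2": "behavioral regularity"},
    epistemological_tags={"GO1": "experimental fixation", "GO2": "interpretive description"},
    salience={"GO1": 0.9, "GO2": 0.4},
)
print(pattern.to_vector(["GO1", "GO2"]))   # [0.9 0.4]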

Original content at: anticomplexity.org…
