Threesology Research Journal
Artificial Intelligence and 3sology (56K)
Page 43

Note: the contents of this page, as well as those which precede and follow, must be read as a continuation and/or overlap so that the continuity of a relationship with the typical dichotomous assignment of Artificial Intelligence (such as the zeros and ones used in computer programming), as well as with the dichotomous arrangement of the idea that one could possibly talk seriously about peace from a different perspective (such as war being frequently used to describe an absence of peace, and vice versa), will not be lost. However, if your mind is prone to being distracted by timed or untimed commercialization (such as that seen in various types of American-based television, radio, news media and magazine publishing... not to mention the average classroom, which carries over into the everyday workplace), you may be unable to sustain prolonged exposure to divergent ideas about a singular topic without becoming confused, unless the information is provided in a very simplistic manner.

AI and 3sology pages:

Artificial Intelligence and 3sology Introduction
pg1 pg2 pg3 pg4 pg5 pg6 pg7 pg8
pg9 pg10 pg11 pg12 pg13 pg14 pg15 pg16
pg17 pg18 pg19 pg20 pg21 pg22 pg23 pg24
pg25 pg26 pg27 pg28 pg29 pg30 pg31 pg32
pg33 pg34 pg35 pg36 pg37 pg38 pg39 pg40
pg41 pg42 pg43 pg44        

As with any idea, while some people outright accept it, reject it, or take it with a grain of salt, there are those who want to verify one view or another. However, such an exercise has led to different fundamental approaches with varying interpretations, resulting in a new perspective: testing the testing procedures themselves, since the results can sometimes contradict rather than support a cherished belief. One such cherished belief is the insistence on using a binary code, or on permitting present industry computer standards (and those thus in charge) to set the course of computer design and functionality based primarily on the idea of keeping their business viable.

Qualitative tests to distinguish alternative theories

At the time that Augustin-Jean Fresnel presented his wave theory of light to the French Academy (1815), the leading physicists were adherents of Newton's corpuscular theory. It was pointed out by Siméon-Denis Poisson, as a fatal objection, that Fresnel's theory predicted a bright spot at the very centre of the shadow cast by a circular obstacle. When this was in fact observed by François Arago, Fresnel's theory was immediately accepted.

Another qualitative difference between the wave and corpuscular theories concerned the speed of light in a transparent medium. To explain the bending of light rays toward the normal to the surface when light entered the medium, the corpuscular theory demanded that light go faster while the wave theory required that it go slower. Jean-Bernard-Léon Foucault showed that the latter was correct (1850).

The three categories of experiments or observations discussed above are those that do not demand high-precision measurement. The following, however, are categories in which measurement at varying degrees of precision is involved.

1) Direct comparison of theory and experiment

This is one of the commonest experimental situations. Typically, a theoretical model makes certain specific predictions, perhaps novel in character, perhaps novel only in differing from the predictions of competing theories. There is no fixed standard by which the precision of measurement may be judged adequate. As is usual in science, the essential question is whether the conclusion carries conviction, and this is conditioned by the strength of opinion regarding alternative conclusions.

Where strong prejudice obtains, opponents of a heterodox conclusion may delay acceptance indefinitely by insisting on a degree of scrupulosity in experimental procedure that they would unhesitatingly dispense with in other circumstances.

At the opposite extreme may be cited the 1919 expedition of the English scientist-mathematician Arthur Stanley Eddington to measure the very small deflection of the light from a star as it passed close to the Sun—a measurement that requires a total eclipse. The theories involved here were Einstein's general theory of relativity and the Newtonian particle theory of light, which predicted only half the relativistic effect. The conclusion of this exceedingly difficult measurement—that Einstein's theory was followed within the experimental limits of error, which amounted to ±30 percent—was the signal for worldwide feting of Einstein. If his theory had not appealed aesthetically to those able to appreciate it and if there had been any passionate adherents to the Newtonian view, the scope for error could well have been made the excuse for a long drawn-out struggle, especially since several repetitions at subsequent eclipses did little to improve the accuracy. In this case, then, the desire to believe was easily satisfied. It is gratifying to note that recent advances in radio astronomy have allowed much greater accuracy to be achieved, and Einstein's prediction is now verified within about 1 percent.

H.O.B. note: In other words, some assert that we are not to question the relevance of an idea based on the appeal of the person who has developed or supported it, or out of habit, application, tradition, practice, observed law, or some avowed notion of what is believed to be common-sensical... such as, for example, the notion that the U.S. practices a Democracy, though very little of an actual democracy actually exists.

2) Compilation of data

Technical design, whether of laboratory instruments or for industry and commerce, depends on knowledge of the properties of materials (density, strength, electrical conductivity, etc.), some of which can only be found by very elaborate experiments (e.g., those dealing with the masses and excited states of atomic nuclei). One of the important functions of standards laboratories is to improve and extend the vast body of factual information, but much also arises incidentally rather than as the prime objective of an investigation or may be accumulated in the hope of discovering regularities or to test the theory of a phenomenon against a variety of occurrences.

3) Tests of fundamental concepts

Coulomb's law states that the force between two electric charges varies as the inverse square of their separation. Direct tests, such as those performed with a special torsion balance by the French physicist Charles-Augustin de Coulomb, for whom the law is named, can be at best approximate. A very sensitive indirect test, devised by the English scientist and clergyman Joseph Priestley (following an observation by Benjamin Franklin) but first realized by the English physicist and chemist Henry Cavendish (1771), relies on the mathematical demonstration that no electrical changes occurring outside a closed metal shell—as, for example, by connecting it to a high voltage source—produce any effect inside if the inverse square law holds. Since modern amplifiers can detect minute voltage changes, this test can be made very sensitive. It is typical of the class of null measurements in which only the theoretically expected behaviour leads to no response and any hypothetical departure from theory gives rise to a response of calculated magnitude. It has been shown in this way that if the force between charges, r apart, is proportional not to 1/r² but to 1/r^(2 + x), then x is less than 2 × 10⁻⁹.
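To make the scale of that bound concrete, here is a small hypothetical sketch in Python. Only the bound x < 2 × 10⁻⁹ is taken from the passage above; the separation r = 10 is an arbitrary invented choice of units. It computes how far a force law of 1/r^(2 + x) would deviate from a pure inverse square at that separation.

```python
# Hypothetical illustration: compare a force law 1/r**(2 + x) against the
# ideal inverse square 1/r**2.  Only the bound x < 2e-9 comes from the
# passage; the separation r = 10 is an arbitrary choice of units.

def force_ratio(r, x):
    """Ratio of the hypothetical force 1/r**(2 + x) to the ideal 1/r**2."""
    return r ** (-(2 + x)) / r ** (-2)

x_bound = 2e-9
deviation = abs(1.0 - force_ratio(10.0, x_bound))
print(f"fractional deviation at r = 10 with x = {x_bound}: {deviation:.3e}")
```

At the Cavendish-style bound the deviation amounts to a few parts in a billion, which is why only a null measurement of the kind described could hope to detect it.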

Source: "physical science, principles of." Encyclopædia Britannica Ultimate Reference Suite, 2013.

The existence of the void

To Democritus the existence of the void was a necessary element in atomistic theory. Without the void the atoms could not be separated from each other and could not move. In the 17th century Descartes rejected the existence of the void, whereas Newton's conception of action at a distance was in perfect harmony with the acceptance of the void and the drawing of a sharp distinction between occupied and nonoccupied space. The success of the Newtonian law of gravitation was one of the reasons that atomic theories came to prevail in the 18th century. Even with respect to the phenomena of light, the corpuscular and hence atomic theory of Newton, which held that light is made of tiny particles, was adopted almost universally, in spite of Huygens's brilliant development of the wave hypothesis.

When, at the beginning of the 19th century, the corpuscular theory of light in its turn was abandoned in favour of the wave theory, the case for the existence of the void had to be reopened, for the proponents of the wave theory did not think in terms of action at a distance; the propagation of waves seemed to presuppose instead a medium not only with geometrical properties but with physical ones as well. At first the physical properties of the medium, the ether, were described in the language of mechanics; later they were described in that of the electromagnetic field theory of J.C. Maxwell. Yet, to a certain extent, the old dichotomy between occupied and nonoccupied space continued to exist. According to the ether theory, the atoms moved without difficulty in the ether, whereas the ether pervaded all physical bodies.

In contemporary science this dichotomy has lost its sharpness, owing to the fact that the distinction between material phenomena, which were supposed to be discontinuous, and the phenomena of light, which were supposed to be continuous, appears to be only a relative one. In conclusion, it can be claimed that, although modern theories still speak of space and even of “empty” space, this “emptiness” is not absolute: space has come to be regarded as the seat of the electromagnetic field, and it certainly is not the void in the sense in which the term was used by Democritus.

Atoms in external aggregation versus in internal relationship

In most forms of atomism it is a matter of principle that any combination of atoms into a greater unity can only be an aggregate of these atoms. The atoms remain intrinsically unchanged and retain their identity. The classical atomic theory of chemistry was based upon the same principle: the union of the atoms into the molecules of a compound was conceived as a simple juxtaposition. Each chemical formula (e.g., H2O, H2SO4, NaCl, etc.) reflects this principle through the tacit implication that each atom is still an H, O, or S, etc., even when in combination to form a molecule.

Chemistry had twofold reasons for adopting this principle. One reason was observational, the other philosophical. The fact that some of the properties of a chemical compound could, by simple juxtaposition, be derived from those of the elements (the molecular weight, for example, equals the simple sum of the respective atomic weights) was a strong factual argument in favour of the principle. Many properties of the compounds, however, could not be determined in this way. In fact, most chemical properties of compounds differed considerably from those of the composing elements. Consequently, the principle of juxtaposition could not be based on factual data alone. It was in need of a more general support. This support was offered by the philosophical idea that inspired all atomism—namely, that if complex phenomena cannot be explained in terms of aggregates of more elementary factors, they cannot be explained at all.
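The additivity of molecular weight mentioned above (the molecular weight equals the simple sum of the respective atomic weights) can be sketched in a few lines of Python. The atomic weights below are rounded standard values, and the helper function is an invention for illustration.

```python
# Juxtaposition principle for molecular weight: the weight of a compound is
# the simple sum of the atomic weights of its constituent atoms.
# Atomic weights are rounded standard values.

ATOMIC_WEIGHT = {"H": 1.008, "O": 15.999, "S": 32.06, "Na": 22.990, "Cl": 35.45}

def molecular_weight(formula):
    """formula: list of (element, count) pairs, e.g. [("H", 2), ("O", 1)] for H2O."""
    return sum(ATOMIC_WEIGHT[element] * count for element, count in formula)

water = molecular_weight([("H", 2), ("O", 1)])   # H2O
salt = molecular_weight([("Na", 1), ("Cl", 1)])  # NaCl
print(f"H2O = {water:.3f}, NaCl = {salt:.2f}")
```

The chemical properties that do not add up this way (reactivity, for instance) are precisely the ones the passage says could not be derived by juxtaposition, forcing the appeal to a more general philosophical principle.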

For the evaluation of this idea, the development of the scientific atomic theory is highly interesting, especially with respect to the interpretation of the concept of an aggregate. Is the only interpretation of this concept that of an assemblage in which the components preserve their individuality—like, for instance, a heap of stones? Modern atomic theory offers an answer to this question. This theory still adheres to the basic principle that a complex structure has to be explained in terms of aggregates of more elementary factors, but it interprets the term aggregate in such a way that it is not limited to a mere juxtaposition of the components. In modern theories atomic and molecular structures are characterized as associations of many interacting entities that lose their own identity. The resulting aggregate originates from the converging contributions of all of its components. Yet, it forms a new entity, which in its turn controls the behaviour of its components. Instead of mere juxtaposition of components, there is an internal relationship between them. Or, expressed in another way: in order to know the properties of the components, one has to study not only the isolated components but also the structures into which they enter. To a certain extent modern atomic theory has bridged the gap between atomistic and holistic thought.

Source: "Atomism." Encyclopædia Britannica Ultimate Reference Suite, 2013.

Although we are bipedal, our pelvis is oriented like that of quadrupedal primates. The early bipedal hominins assumed erect trunk posture by bending the spine upward, particularly in the lower back (lumbar region). In order to transfer full upper-body mass to the lower limbs and to reposition muscles so that one could walk without assistance from the upper limbs and without wobbling from side to side, changes were required in the pelvis—particularly in the ilia (the large, blade-shaped bones on either side), the ischia (protuberances on which the body rests when sitting), and the sacrum (a wedge-shaped bone formed by the fusing of vertebrae). Source: "Human Evolution." Encyclopædia Britannica Ultimate Reference Suite, 2013.

Back on page 38, the article described a comparison being made between a similar functionality identified in biology and in electronics. If we look at the whole of biological evolution with the intent of making other comparisons, we should acknowledge that there are convergent, divergent, parallel and serial forms taking place gradually, in spurts (punctuations), and in variations thereof... though the usage of other labels would no doubt enable us to recognize unconventionally identified formulas or labeling, perhaps sometimes due to an infrequency not available to most researchers in a given era. Just because there is a convention for describing evolution in one way or another does not make such a description wholly inclusive for all time and all events in all places. We must allow ourselves to be flexibly open to altering our theories if such is suggested by evidence or even by theoretical proposition. Neither theory nor evidence should be treated as an evolved species of thought which limits imaginative exercises of intellectual pursuit. That which we label today as the foremost truth may well be the superstition of tomorrow, simply because someone came along and looked at the same information in a different way. As such, there is a need to look at Evolution with the ideas of Analog (generalized analogy) and Digital (specialized analogy)... though different terms such as homology, morphology, taxonomy, etc., may be used and overlapped to produce some variant containing both the generalized and specialized formulas.

Some people keep a running inventory of comparisons being made between multiple subjects, while others are easily exhausted by what appears to them as the expenditure of enormous effort. Some people appear to make such comparisons as easily as they breathe... that is, with little effort. They have no difficulty thinking analogically and then translating these ideas, these images, into a digital format... and vice versa... with an accountability program which permits substitutions and alterations "on the fly". They do not need to have their hand held, nor be continually reminded that the topic at hand is a study of AI and threesology, with "threesology" being a very vast generalized specificity reaching into multiple subject areas where the "threes phenomena" may or may not have been previously located. In short, it is an R and D (research and development) type of exploration. With this said, let's get back to it.

The modern theory of evolution provides a causal explanation of the similarities between living things. Organisms evolve by a process of descent with modification. Changes, and therefore differences, gradually accumulate over the generations. The more recent the last common ancestor of a group of organisms, the less their differentiation; similarities of form and function reflect phylogenetic propinquity. Accordingly, phylogenetic affinities can be inferred on the basis of relative similarity.

Convergent and parallel evolution

A distinction has to be made between resemblances due to propinquity of descent and those due only to similarity of function. As discussed in the section The evidence for evolution: Structural similarities, correspondence of features in different organisms that is due to inheritance from a common ancestor is called homology. The forelimbs of humans, whales, dogs, and bats are homologous. The skeletons of these limbs are all constructed of bones arranged according to the same pattern because they derive from a common ancestor with similarly arranged forelimbs. Correspondence of features due to similarity of function but not related to common descent is termed analogy. The wings of birds and of flies are analogous. Their wings are not modified versions of a structure present in a common ancestor but rather have developed independently as adaptations to a common function, flying. The similarities between the wings of bats and birds are partially homologous and partially analogous. Their skeletal structure is homologous, due to common descent from the forelimb of a reptilian ancestor; but the modifications for flying are different and independently evolved, and in this respect they are analogous.

Features that become more rather than less similar through independent evolution are said to be convergent. Convergence is often associated with similarity of function, as in the evolution of wings in birds, bats, and flies. The shark (a fish) and the dolphin (a mammal) are much alike in external morphology; their similarities are due to convergence, since they have evolved independently as adaptations to aquatic life.

Taxonomists also speak of parallel evolution. Parallelism and convergence are not always clearly distinguishable. Strictly speaking, convergent evolution occurs when descendants resemble each other more than their ancestors did with respect to some feature. Parallel evolution implies that two or more lineages have changed in similar ways, so that the evolved descendants are as similar to each other as their ancestors were. The evolution of marsupials in Australia, for example, paralleled the evolution of placental mammals in other parts of the world. There are Australian marsupials resembling true wolves, cats, mice, squirrels, moles, groundhogs, and anteaters. These placental mammals and the corresponding Australian marsupials evolved independently but in parallel lines by reason of their adaptation to similar ways of life. Some resemblances between a true anteater (genus Myrmecophaga) and a marsupial anteater, or numbat (Myrmecobius), are due to homology—both are mammals. Others are due to analogy—both feed on ants.

Parallel and convergent evolution are also common in plants. New World cacti and African euphorbias, or spurges, are alike in overall appearance although they belong to separate families. Both are succulent, spiny, water-storing plants adapted to the arid conditions of the desert. Their corresponding morphologies have evolved independently in response to similar environmental challenges.

Homology can be recognized not only between different organisms but also between repetitive structures of the same organism. This has been called serial homology. There is serial homology, for example, between the arms and legs of humans, between the seven cervical vertebrae of mammals, and between the branches or leaves of a tree. The jointed appendages of arthropods are elaborate examples of serial homology. Crayfish have 19 pairs of appendages, all built according to the same basic pattern but serving diverse functions—sensing, chewing, food handling, walking, mating, egg carrying, and swimming. Although serial homologies are not useful in reconstructing the phylogenetic relationships of organisms, they are an important dimension of the evolutionary process.

Relationships in some sense akin to those between serial homologs exist at the molecular level between genes and proteins derived from ancestral gene duplications. The genes coding for the various hemoglobin chains are an example. About 500 million years ago a chromosome segment carrying the gene coding for hemoglobin became duplicated, so that the genes in the different segments thereafter evolved in somewhat different ways, one eventually giving rise to the modern gene coding for the α hemoglobin chain, the other for the β chain. The β chain gene became duplicated again about 200 million years ago, giving rise to the γ hemoglobin chain, a normal component of fetal hemoglobin (hemoglobin F). The genes for the α, β, γ, and other hemoglobin chains are homologous; similarities in their nucleotide sequences occur because they are modified descendants of a single ancestral sequence.

There are two ways of comparing homology between hemoglobins. One is to compare the same hemoglobin chain—for instance, the α chain—in different species of animals. The degree of divergence between the α chains reflects the degree of the evolutionary relationship between the organisms, because the hemoglobin chains have evolved independently of one another since the time of divergence of the lineages leading to the present-day organisms. A second way is to make comparisons between, say, the α and β chains of a single species. The degree of divergence between the different globin chains reflects the degree of relationship between the genes coding for them. The different globins have evolved independently of each other since the time of duplication of their ancestral genes. Comparisons between homologous genes or proteins within a given organism provide information about the phylogenetic history of the genes and hence about the historical sequence of the gene duplication events.
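Both comparison strategies just described reduce to measuring divergence between sequences. A minimal sketch, using invented toy fragments (not to be read as accurate globin data), counts the fraction of positions at which two equal-length sequences differ:

```python
# Toy sketch of sequence divergence: the fraction of positions at which two
# equal-length sequences differ.  The fragments below are invented for
# illustration and are not accurate globin sequences.

def divergence(seq_a, seq_b):
    """Fraction of differing positions between two equal-length sequences."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be the same length")
    differing = sum(1 for a, b in zip(seq_a, seq_b) if a != b)
    return differing / len(seq_a)

alpha_species_1 = "VLSPADKT"  # same chain compared across two species...
alpha_species_2 = "VLSGADKT"  # ...differs at one position (recent divergence)
beta_species_1  = "VHLTPEEK"  # a different chain from the same species

print(divergence(alpha_species_1, alpha_species_2))  # small: 0.125
print(divergence(alpha_species_1, beta_species_1))   # large: 0.875
```

Low divergence between the same chain in two species points to a recent common ancestor of the organisms; high divergence between the α and β chains of one species reflects the much older duplication of the ancestral gene.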

Whether similar features in different organisms are homologous or analogous—or simply accidental—cannot always be decided unambiguously, but the distinction must be made in order to determine phylogenetic relationships. Moreover, the degrees of homology must be quantified in some way so as to determine the propinquity of common descent between species. Difficulties arise here as well. In the case of forelimbs, it is not clear whether the homologies are greater between human and bird than between human and reptile, or between human and reptile than between human and bat. The fossil record sometimes provides the appropriate information, even though the record is deficient. Fossil evidence must be examined together with the evidence from comparative studies of living forms and with the quantitative estimates provided by comparative studies of proteins and nucleic acids.

Gradual and punctuational evolution

The fossil record indicates that morphological evolution is by and large a gradual process. Major evolutionary changes are usually due to a building-up over the ages of relatively small changes. But the fossil record is discontinuous. Fossil strata are separated by sharp boundaries; accumulation of fossils within a geologic deposit (stratum) is fairly constant over time, but the transition from one stratum to another may involve gaps of tens of thousands of years. Whereas the fossils within a stratum exhibit little morphological variation, new species—characterized by small but discontinuous morphological changes—typically appear at the boundaries between strata. That is not to say that the transition from one stratum to another always involves sudden changes in morphology; on the contrary, fossil forms often persist virtually unchanged through several geologic strata, each representing millions of years.

The apparent morphological discontinuities of the fossil record are often attributed by paleontologists to the discontinuity of the sediments—that is, to the substantial time gaps encompassed in the boundaries between strata. The assumption is that, if the fossil deposits were more continuous, they would show a more gradual transition of form. Even so, morphological evolution would not always keep progressing gradually, because some forms, at least, remain unchanged for extremely long times. Examples are the lineages known as “living fossils”—for instance, the lamp shell Lingula, a genus of brachiopod (a phylum of shelled invertebrates) that appears to have remained essentially unchanged since the Ordovician Period, some 450 million years ago; or the tuatara (Sphenodon punctatus), a reptile that has shown little morphological evolution for nearly 200 million years, since the early Mesozoic.

Some paleontologists have proposed that the discontinuities of the fossil record are not artifacts created by gaps in the record but rather reflect the true nature of morphological evolution, which happens in sudden bursts associated with the formation of new species. The lack of morphological evolution, or stasis, of lineages such as Lingula and Sphenodon is in turn due to lack of speciation within those lineages. The proposition that morphological evolution is jerky, with most morphological change occurring during the brief speciation events and virtually no change during the subsequent existence of the species, is known as the punctuated equilibrium model.

Whether morphological evolution in the fossil record is predominantly punctuational or gradual is a much-debated question. The imperfection of the record makes it unlikely that the issue will be settled in the foreseeable future. Intensive study of a favourable and abundant set of fossils may be expected to substantiate punctuated or gradual evolution in particular cases. But the argument is not about whether only one or the other pattern ever occurs; it is about their relative frequency. Some paleontologists argue that morphological evolution is in most cases gradual and only rarely jerky, whereas others think the opposite is true.

Much of the problem is that gradualness or jerkiness is in the eye of the beholder. Consider the evolution of shell rib strength (the ratio of rib height to rib width) within a lineage of fossil brachiopods of the genus Eocoelia. Results of the analysis of an abundant sample of fossils in Wales from near the beginning of the Devonian Period are shown in the figure. One possible interpretation of the data is that rib strength changed little or not at all from 415 million to 413 million years ago; rapid change ensued for the next 1 million years, followed by virtual stasis from 412 million to 407 million years ago; and then another short burst of change occurred about 406 million years ago, followed by a final period of stasis. On the other hand, the same record may be interpreted as not particularly punctuated but rather as a gradual process, with the rate of change somewhat greater at particular times.
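The two readings of such a record (punctuated bursts versus gradual change) can be made concrete with a small, entirely hypothetical data series. The numbers below are invented, loosely patterned on the plateau-burst-plateau description above, and the 0.05 threshold for "rapid" change is an arbitrary choice:

```python
# Hypothetical rib-strength series sampled at 1-million-year intervals
# (all numbers invented).  Flagging intervals whose change exceeds an
# arbitrary threshold yields the "punctuated" reading; lowering the
# threshold toward zero dissolves it into the "gradual" reading.

ages_ma = [415, 414, 413, 412, 411, 410, 409, 408, 407, 406, 405]
rib_strength = [1.00, 1.00, 1.01, 1.20, 1.21, 1.21, 1.22, 1.22, 1.22, 1.40, 1.41]

rates = [abs(later - earlier)
         for earlier, later in zip(rib_strength, rib_strength[1:])]
bursts = [ages_ma[i] for i, rate in enumerate(rates) if rate > 0.05]
print("intervals of rapid change begin at (Ma):", bursts)
```

Whether the same series reads as two bursts separated by stasis or as a gradual trend with a variable rate depends entirely on the threshold chosen, which is the force of the "eye of the beholder" remark.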

The proponents of the punctuated equilibrium model propose not only that morphological evolution is jerky but also that it is associated with speciation events. They argue that phyletic evolution—that is, evolution along lineages of descent—proceeds at two levels. First, there is continuous change through time within a population. This consists largely of gene substitutions prompted by natural selection, mutation, genetic drift, and other genetic processes that operate at the level of the individual organism. The punctualists maintain that this continuous evolution within established lineages rarely, if ever, yields substantial morphological changes in species. Second, they say, there is the process of origination and extinction of species, in which most morphological change occurs. According to the punctualist model, evolutionary trends result from the patterns of origination and extinction of species rather than from evolution within established lineages.

Source: "Evolution." Encyclopædia Britannica Ultimate Reference Suite, 2013.

Subject page first Originated (saved into a folder): Thursday, November 13, 2014... 5:50 AM
Page re-Originated: Sunday, 24-Jan-2016... 08:51 AM
Initial Posting: Saturday, 13-Feb-2016... 10:59 AM
Updated Posting: Saturday, 31-March-2018... 3:58 PM

Your Questions, Comments or Additional Information are welcomed:
Herb O. Buckland