SEMANTIC CONTAGION

James F. Ross

I: Introduction: The problem of "the organization of the lexicon", with a new approach.

1. Lexical fields do not organize the lexicon; something else does. Lexical field theories were thought to display the organization of the lexicon at least at the level of contrastive word selection. But no generalized theory of networking lexical fields (semantic fields) was proposed for the overall lexical organization of natural languages, or to explain the similarity of lexical fields (with somewhat divergent members) across non-cognate languages (e.g., words for kinship), or to explain field differences among languages (e.g., differences of words for weather, or time). Lexical field theory was developed unpretentiously and may have untested potentialities. Yet, those versions that postulate parallel verbal and "mental" lexicons will suffer most from accumulating evidence that our conceptual competence, in various respects, subsists in our linguistic competence and is not separate from it, even though it can in some cases survive damage to the word-producing portions of the brain.

There are reasons of principle limiting what lexical fields can explain. As will emerge, they are not just the limitations that have encouraged "frame" semantics, or an emphasis on the "belief elements of meaning" peculiar to the lexicon of a given language, but reasons concerned with the combinatorial adaptation of words in all languages. An example of combinatorial adaptation, which I call "semantic contagion," is the italicized pair: "look down \on art; look down \at the floor".

Consequently, I go in a different direction, to develop an account of lexical organization that has lexical fields (semantic fields) as non-explanatory but observable outcomes. My account has two explanatory dimensions: (a) semantic contagion (meaning-adaptation of words to their verbal contexts), on which this paper concentrates, and (b) pragmatic traction (the engagement of talk with action that, for instance, generates symbol distinctions, yet with commonalities across cultures, e.g., words for body parts, kinship, boat parts, or disorders of the human spirit, etc.). The energy that drives both the "software" of semantic contagion and the development of new words comes from the engagement of talk with action: pragmatic traction. Both explanatory dimensions are especially prominent and easily distinguished in craft-talk (e.g., the professional talk of lawyers, doctors, boatbuilders, iron-workers, newspapermen, etc.), and particularly visible, there, are the sub-categories of semantic contagion (like analogy, metaphor, homonymy, figurative discourse, and denomination). It is on the enormous and luxuriantly varied sub-corpus of craft-talk that I propose my accounts of "semantic contagion" and a related notion, "semantic relativity" (see below), should be tested, and compared to other accounts of lexical organization.

Semantic contagion is a phenomenon, that is, a regular happening: that words adapt in meaning to other words --and word surrogates--which combine with them or are within syntagmatic reach: he swallowed /his water, /the insults, /his rage, /the lies, /his pride, /his enemies. Contrasting completion words differentiate susceptible frame words. That is the phenomenon to be explained, not just at the surface of events, but with a conception of the semantic cosmos.

The explanation discloses what I will call the underlying general semantic relativity. According to this notion, widely accepted views of the componentiality of utterance-meaning have to be rejected, as well as standard notions of what compositionality consists in (see part IV). That is because we are not combining fixed meaning-values (like fixed quantities) under a single structural syntax, but combining varying values in a syntax-affecting way. Instead of the notion of units of meaning combined by insertion into syntactical slots to make sentential wholes, we have meaning units whose IDENTITY depends reciprocally on which meaning-units they combine with, so as to determine a semantic whole which has a definite syntactic structure as a RESULT of the semantic adjustment. Thus, the explanatory order is exactly the reverse of what is usually supposed.

Lexical organization, even among the items in a particular utterance, follows principles of "general semantic relativity", analogous to the physics of general relativity. Moreover, the analogy with physics holds for a considerable depth, even to the component forces (of meaning contrast) and the roles they play (see part III). Furthermore, like physical relativity, lexical organization is dynamic. It is virtually the same across languages, so that the "laws" of semantic relativity are the same regardless of the frame of observation from which they are projected, as is the case with cosmic physics. Thus, there is a dynamic lexical organization for natural languages, one that explains many phenomena hitherto unexplained.

This project relies upon something observable: that refined behavior requires refined distinctions, which need multiple meanings and enriched vocabulary. In a word, talk is part of our activities, shaping, extending, fitting, and changing what we do, and talk undergoes reshaping, extensions and fittings of its own to what we do. In fact, in many craft-activities, the talk is part of what we are doing, from shamanism, to psychiatry, to law, to philosophy, to governing, even to sailing, as in, "Ready, about! Hard alee!" Survival skills (hunting, fishing, farming, feeding, and fasting), ornamentation (e.g., the arts), and expressive excellence (literature, music, dance, science and philosophy) evolve galaxies of words in variously geometrized semantic spaces but under universal semantic forces, the same regardless of the local geometry. I mean that particular contrasting practices, for instance, kinds of canoe paddling or steering, or kinds of sowing seeds, have associated verbal oppositions, that create a verbal geometry of what to say and what not to say, a subspace, shaped by the particular sorts of action. The features of a local space are exhibited in "affinity", "proximity", and many kinds of "opposition" (and other semantic relations), just as John Lyons postulated, among distinct words (but coapplicable in the very utterance), and in the constant polysemy of "utility" words, and of every word upon occasion,--a condition that results when a given word is dominated by contrasting completion words, as in: look for: trouble/ money/ men/ advice/ help/ support/ death/ victory/ release/ freedom. In fact, recurring words tend to differentiate in meaning. The relativity hypotheses explain that.

In all crafts, practices and rituals modified by discourse--and all are-- discourse is symbiotic with the activity it modifies. That observation is the foundation of this inquiry. The results are both polysemy and multiplicity of symbols, both of which are in constant change, like an expanding universe, as our forms of life grow and diverge. And, though I concentrate on how semantic contagion produces polysemy, and how that is accounted for by general semantic relativity, let me emphasize that the energy for semantic relativity, as well as for symbol proliferation, is from the traction between discourse and action.

2. Objectives: I move from lexical fields to broad explanatory principles of semantic relativity and, then, to describe craft-talk, where we can test hypotheses like these, in five steps. First, I explain why lexical fields do not explain semantic contagion, or lexical organization generally. Second, I explain how we can treat semantic contagion, neatly, as the display of continuous forces causing polysemy (of classifiable kinds) by fitting single words or phrases to contrasting verbal contexts: e.g., to sponge/ a stain; to sponge/ a living. I develop an analogy with cosmic physics, to describe the semantic forces and their interactions as general semantic relativity, even with analogous component principles, (part III).

What explains differentiation of meaning for the same word synchronically also explains the development of a language's expressive capacity diachronically. For if you explain how words get different meanings from being combined in different ways, you explain how the language varies its expressive power as it is used, including the effects of introducing new words, which ripple through the language, creating new affinities and oppositions of meaning to words "already" there. Having a systematic story to tell about developing expressive power is one of the most important "payoffs" of a general account of semantic contagion.

Third, I remark, and illustrate now, that lexical organization is evolutionary, adaptive: talk changes with the activity it modulates, making, selling and using computerese, trading narcotics, building space shuttles, selling junk bonds. Words like "cause" or "entail" become semantically captured in the practice of law; "store" is semantically captured in computerese; "balance", in banking. Old crafts and practices (whale oil processing; New England spinning mills; puritan predestinarianism), and their ways of talking, sometimes the very words, dry out, while the language is still "capable" of them, as historic recreations (Williamsburg, Plymouth) and historical plays and novels show; yet gradually such usage becomes "archaic", no longer part of linguistic competence. New ideas, activities and appraisals find expression in "new life forms," with new meanings for old words and new words as well, in a drug culture with "hash", "stash" and "narcowar", or in a money-culture with corporate "take-overs", "greenmail", "poison pills", "junk bonds" and "insider trading", and in the calmer talk of "ecosystems", "ozone holes", "global warming" and "the biosphere".

Just by standing in various affinity, opposition, and other combinatorial relationships with old words, new words differentiate the old ones (because the old ones now have different antonyms, synonyms, hyponyms, etc.). A new form of life,--cold war, corporate raiding or computer literacy--, makes a new semantic space with a local geometry of affinities, oppositions, congruities and unacceptabilities for words both old and new. Linguists, as far as I know, regard that as obvious, even trivial; but they cannot explain how, when new words are made or captured, old words shift around, making room, accommodating them like a crowd at a cocktail party, as I propose to do.

One consequence of semantic relativity is that certain philosophical accounts are in "big trouble" because typical compositional analyses of utterance meaning (as well as certain views of componentiality) are in principle mistaken, (see part IV). Finally, I explain what is so special about craft-talk (part V): there, semantic contagion and pragmatic traction visibly gear together to make forms of life in which meaning and experience are inextricable (as with a medical student's finding out what ascites is by poking the swollen belly of an alcoholic and feeling the difference from someone who is pregnant, obese, or has peritonitis); so it is in farming, fishing, house-building, cabinetry, goldsmithing, air traffic control, philosophy, news reporting, banking, law, medicine, ballet, music, painting, and every refinement of life. Of course, the two explanatory features of semantic organization are everywhere in discourse; but in craft-talk, they are prominent, like an aristocratic nose or authoritative height, and, so, easily distinguished from one another; they afford the separate testing of my hypotheses without a confusion of results, or a confusion with other theories. Further, besides making purely formal models (see III, 6 below), we can do dynamic experiments, by making certain kinds of games (that are like craft-talk), by altering activity and observing the effect on discourse and by altering discourse and observing the effect on activity.

II. Semantic Contagion Makes Lexical Organization. 1. Semantic fields are consequences, not explanations of semantic organization. There are several reasons.

(i) Lexical fields are idealizations. They can only be said, metaphorically, to realign, regroup or recalibrate when a member changes completion words, as in "black American" vs "black Irish". Rather, the difference of lexical field is the result of the difference of meaning, not the explanation of it. Color words ("black" or "red") contrast for people (who can be brown and still black, and can be brownish and still "red"), while the color contraries and intermediates are quite different for cows; yet, it is not that any lexical field changed but that the relevant fields are different, having different members. The difference, again, is the result of the difference of meaning, not the explanation of it. When the same word recurs ("black mood"), in a distinct lexical field ("black Irish"), something has to explain the SELECTION of its lexical field in the particular context. That has to be dynamic and causal. Lists or ranges of words will not do.

Thus we have several lexical fields for "color" words, some where "black" includes "brown" (in some racial classifications), some where "fuchsia" is not a relevant contrary to any other word (e.g., as a color for people or chickens) or where "white" includes "pink" and is never pure (as in skin tones); we also have distinct fields of color words, for paints, pigments, stone, chromatic shades, and light, etc. Yet nothing about the fields explains why the same word in one context has a different meaning (say different contraries) than in others. Neither will the difference in the sorts of objects referred to explain it, as is commonly supposed, because objects are not linguistic entities. But if one appeals to thoughts or concepts for explanation, one acknowledges (though looking in the wrong place for the explanation) the very phenomenon I am discussing: semantic contagion--adaptation of meaning caused combinatorially by words, or word surrogates. Because such combinatorial differences are causal and constant, they require semantic software: "She grabbed the handle."; "She grabbed the limelight." Thought alone will not do; the semantic software FORMS the thought by organizing the expression of thought.

We can get closer to the explanatory realities by noticing that the same word, when differentiated (e.g., to block \ a road \a scenario \a proposal), belongs to different predicate schemes. For now, simplify the idea of "predicate schemes" into

differences of what other words can be replacements for a given word in otherwise the very same utterance [and action context] (a) to convey "nearly the same idea", or (b) a contrary idea, or (c) an appropriately modified idea, where "replacement" is measured by speakers familiar with the acceptable discourse (e.g., film, electronics, waste disposal) as if they were a "qualified jury of speakers."

So, "he flinched" has different cognitive content among detectives explaining a bullet's trajectory, from its content among psychologists explaining physical signs of one's being insulted. We can map and mark the differences of meaning at the level where causation is obvious in the differences of coapplicable (as above) words. They are a kind of "on the ground" lexical field.

One reason for saying lexical fields are abstracted from predicate schemes is that differences of predicate schemes are both more and less than field differences. Predicate schemes are, as I said, roughly the "alternatives" for one word in a given sentence that will yield an acceptable sentence close in meaning or in suitable contrast to 'this' one. Such alternatives can be idealized, simplified, augmented, smoothed out, and clumped into lists called "lexical fields" (like cooking or kinship or sailing or walking or running terms). The appropriate substitutions (the predicate scheme) for a given case are often only a small portion of a lexical field because of syntagmatic exclusions, and, yet, may include antonyms, contraries or other words not in the lexical field either because they are not often enough alternatives, or because the custom is not to include bungling and failure words in the lists, though they are genuine verbal (and behavioral) alternatives. "He burned it", "He trashed it", "He cremated it", are appropriate contraries, sometimes, for most people's "broiling," "baking," etc., but not in the field of COOKING terms. Further, not every cooking term applies acceptably when any one of them does; e.g., poaching fish and eggs is an alternative to broiling, but not for a saddle of lamb. And the exclusion is not just syntagmatic, either. Smoothing out contextual incongruities and ignoring syntagmatic restrictions--e.g., that "He is her ..." excludes "her aunt"--and omitting failure and misadventure words, amounts to idealizing abstraction, a project not illicit, provided licitly understood, and not applied where it mixes up causes and effects.

Paradoxically, the very fact that the same word recurs in distinct lexical fields shows that something besides lexical fields has to explain lexical organization. For something dynamic has to explain why a word belongs to the one of its lexical fields (say people colors vs. primary colors) rather than another in the particular case. That cannot be the fact that it does. The difference of fields is an EFFECT of semantic contagion.

Lexical fields are like the constellations, practical groupings of objects into recognizable patterns (with some "common conceptual content") that allow us to tell "where we are," the way a sailor can stay at a certain latitude by sailing along under the Little Bear. Such groupings are not arbitrary in the sense of "without rational basis"; they may, in fact be pragmatically compelled by the necessities of action, the way bluewater sailing requires a dependable measure of "where we are" for success and survival. Besides, lexical fields have more contact with reality than do the obtuse abstractions of propositional calculus or of first order quantification. Not so much has been smoothed out and replaced with made-up features. Navigation by the stars, by the constellations, was our way to explore the world before the refined sextant and sat-nav. Without the earlier success, the more daring deep-space vantage would not be accomplished.

If you take lexical fields seriously, you may think the "lexicon" is the domain of fields. But "the lexicon", as a range of items, is only an idealization, an obtuse abstraction from working words under a load of action, and not the explanation of anything. Nevertheless, there has to be a general explanation of lexical organization, even if there is no such domain as "the lexicon" to be organized; for there is evident semantic organization in natural languages. In fact, studies not directed to explanation convincingly exhibit semantic organization even in the name patterns for hairstyles, cosmetics and boutiques, street names and housing developments, and automobile models, as well as trendy restaurants and nightspots (see Adrienne Lehrer's paper in this volume). Two explanatory dimensions are evident in those cases, though one is more prominent than the other; for the effect is evidently one of pragmatic traction: things that are supposed to be attractive get names connoting attraction; things associated with success, social leadership and snobbery get names to attach those associations to the things (and that includes fads in baby-names). But another dimension is needed to explain exactly how "Knock Knees" (a club name) comes to mean intimacy rather than awkwardness; that requires another story besides, a story of semantic contagion.

(ii) We can do with a few words, variously adapted, what we can do with many different ones; we can have the meaning equivalent of many lexical contrasts, without many distinct symbols. So a surly sea captain's saying, "Coffee," can range from questioning whether there is some [on board, cooked, in the pot, in the cup, or offered for shipment], to a command to find it, to cook it, to pour it out, to wipe it up, to watch out not to spill it, or even to throw it. Furthermore, we can have homonymy by contagion, the equivalent of unrelated words: charge \battery; charge \account. So, symbol complexity is not explained merely by meaning complexity or vice versa. Now the fact that there are two kinds of meaning plurality [polysemy, and many words], where neither explains the other, signals a deeper mechanism. A striking display of both points is that polysemy in one language is a string of symbols in another (see 2, next). And, furthermore, that holds pairwise over a wide domain of languages.

(iii) The lexicon evolves by semantic contagion at least as much as by "field additions" (like the invention of new word groups for computerese). That is displayed everywhere in the writing of this paper, and by the analogous application of words like "display", "read", "print out", "screen", "memory", "recall", along with new words, like "hard drive", "floppy drive", "byte", "megabyte", and "nanosecond" in computerese. So there is at least as much demand to explain meaning proliferation by combination (semantic contagion), as there is by addition of words. Something has to explain both.

(iv) The same lexical items, (a field of tree names, say), taken as a group, adapt in meaning by belonging to contrasting craft-talk: "walnut", "oak", "maple", "birch" and "cherry"..., as WOOD names differ in truth-conditions and conditions of warranted application from PLYWOOD names, VENEER names, COLOR names, FINISH names and TREE names, though the general "shape" of the contrast is maintained. The words keep their general geometry of contrasts, but acquire local differences of meaning: different conditions of application. So semantic fields can migrate, pretty much unchanged in membership, through domains of discourse, as the kind of activity differs: logging, carpentry, cabinetry, counter-making, furniture-finishing, and milling; thus, "maple" differs in sense in: "maple trees"; "maple stain"; "maple patterns"; "maple plywood". That suggests that meaning-distinction by adaptation may be more basic than by symbol distinction. Thus the stage is set for finding semantic software that explains how words adjust in meaning.

2. Polysemy in one language needs a lexical field for translation into another. A few words, variously adapted, can do what many words can do (like a few bent wires opening a lot of locks that need a lot of keys). That has to be true if polysemy in one language is lexical plurality (many distinct words) in another: "dare" in Italian is a lexical field (or several) in English: "give/ grant/ permit/ commit/ appoint/ announce/ produce/ yield/ show/ tell/ strike". This case, "dare," is particularly convincing because the English words so obviously belong to several lexical fields, yet the Italian word is not used homonymously. Similarly, "see" in English can be: "vedere/ comprendere/ conoscere/ osservare/ scoprire" in Italian; and "vedere" in Italian can be: "to see/ perceive/ observe/ notice" (etc.) in English, which may not amount to distinct fields, but surely amount to a whole field. Thus, if we can explain polysemy, there will be no point in assigning lexical fields semantic software of their own, since the fields will be consequences of the contagion (and of pragmatic traction multiplying symbols as practice demands).

3. Polysemy requires semantic contagion. We are so used to differentiated meanings we do not even notice them: he collected friends/ coins/ debts/ a pension/ the interest/ specimens/ invitations/ wives; he looked for peace/ progress/ a dollar/ his car/ a new car/ his wife/ a wife. Thus we tend not to look for an explanation of why and how the meaning-contrast of "debts" and "pension" can account for a meaning-difference in two occurrences of "collected." Even more subtle is the difference in "understands" in "He understands Italian" and "He understands music", and between "He commands my respect", and " He commands my attention". Yet something has to explain how that happens; it does not depend on the thinking of the utterer or hearer. The difference would be there, even if the utterance were composed by a computer malfunction and printed out in a part of the daily paper that no one read.

Adjustment of meaning to context is displayed, comparatively, as analogy of meaning: analogy of proportionality (see/ the color; see/ the point); denominative analogy (brilliant/ writer; brilliant/ book); and metaphor (blacken/ shoes; blacken/ his name); and the figures of speech as well. The fact that one word adapts so differently in webs of other words as to belong to distinct lexical fields [e.g. "see" = "perceive/ sense/ observe/ notice/ grasp/ understand/ comprehend/ espy/ sight...,"] is semantically fundamental. We have to explain it, and explain it by some constant causation, some force, that makes semantic contagion automatic, that is, not requiring thought or intention, just utterance [use] as part of some action (which, of course, is also some kind of thought). Differentiation of recurring words is the norm. For that, a constant cause, a force, so general and so regular as to be before our eyes, unseen, adjusts word-meanings to one another in the context; that is the universal linguistic force: "the syntactically coherent tends to make a semantic whole." That there should be such a force is quite natural when you consider that the function of syntactical unities is to express thought, to have semantic content, where the thought content expressed is part of what we DO, not in isolation but as practices.

Infants internalize semantic contagion. We learned to talk that way. We learned to think that way. Parents talk to children with a simplified grammar and semantics and by using utility words in many senses: "See daddy; see, mommy is putting peas on the spoon; see, here comes grandma"; and so forth for almost all the words that are used, and children's primers reinforce this by also using utility words in many senses: "get sick", "get home", "get paid"; "fix dinner", "fix the wagon", and so forth. {See the adaptation of "way", just above; there was no thought required, beyond my saying, and your seeing, what I meant to say.} We do not, therefore, notice that differences of meaning for the same word in contrasting verbal neighborhoods must be the result, in part, of a tendency of grammatically well-formed (or approximately well-formed) utterances to have semantic unity (as far as each can), to be something we understand to be said (whether or not we understand what is being said). "Making sense" (a semantic unity), whether or not we get "the sense", turns out to be a universal linguistic force. "He's blanketing our sails." makes sense whether or not you know enough about sailing to know what the maneuver consists in. So too, "He chamfered the stone before polishing it". You can recognize semantic integration even though you may not know enough about stone-cutting to know what chamfering is. Consider this well-formed sentence: "All things out of abstraction sail, and all their swelling canvas wear." Integration of meaning, like gravity in nature, is a constant force "downward" from utterance, embedded in its action-role, on the component words to go together so as to fill that role (with limitations to be mentioned).

Even linguists tend to suppose that we have learned to "put the pieces down in the right order", like Lego blocks, or dominos, to make semantic unities of what we say. But that cannot be right; WHAT piece has been put down is a function of pieces a long way away and sometimes quite a long time afterward, or supposed, in an activity. Semantic unity is not straightforward, like explaining a brick wall from the placing and mortaring of the bricks. Nor is it like Newtonian billiard balls, a mere product of initial force. It is like the unity of galaxies in space formed by their very passage.

The component meaning-units are dependent, for their semantic "values" or "mass", on the "resultant" meaning. If we have trouble explaining what looks like "backwards" causation, so much the worse for the poverty of our analogies. Pieces often take shape from what follows (and at some distance), like history: "Thus, he toppled the Saracen Empire with one attack". Words often gain identity from neighbors ("They cheated their way closer to the wind") and from what has gone by a while before or belongs to the discourse as a whole. So the unity of the utterance is not explained by the mere sequence of the pieces; change goes backwards, too, as in "He challenged himself and his enemies". Semantic unity of utterance has to involve a constant cause (a force); in fact, several.

Even a constant force toward "making sense" will not explain the differences of recurrent words without something that functions like mass, to be subject to gravity [the force downward from resultant meaning that causes adaptation]; for it is mass under force(s) that makes some words dominant, relatively, over others. The only way we can get adaptation, fit (relatively) to context, is (i) for a word to display a different pattern of affinities and oppositions to other words, and (ii) for another word to be, relatively, intransigent in maintaining its pattern of affinities and oppositions to other words, and (iii) for the difference in the adapting word and the intransigent word to be required by the utterance-meaning in its role in action.

The same general structure explains the simple differences of "She dropped/ a friend, her jaw, her eyes, her glasses" and the more subtle ones like "He engineered/ the new submarine;/ the peace treaty in their East;/ the indictment of every one of his friends." In brief, adaptation is the outcome of differential resistance to "giving up" relations of affinity and opposition to other words, in order to make a semantic unity that performs the expression's role in action. Thus, the action counts: "Fire! The forward gun!" vs "Fire! In the forward hold!". To explain differential resistance, we need the semantic analogue of relativistic mass in General Relativity (see III).

4. Predicate Schemes are abstractions, too. They are a map of the substitutions one might have put in, say, to distinguish "He saw her car" from "He saw her home"; they are a counterfactual map of meaning differences, and so only better than lexical fields because less abstract and less "made up" but still made up. And always incomplete. Such a map only marks what needs explaining. We need something that explains why the substitutions in contrasting cases would have been different, by explaining why there is a difference of meaning in the first place. Yet, we do have some progress: we see that whatever makes meaning "fit" context is done to attain "acceptability" of the utterance, given its role in action.

5. Semantic contagion is contrastive adaptation. Semantic contagion is observable by the "method of difference" (J. S. Mill): synchronically as differentiation of same-word meaning, and diachronically as change of same-word meaning triggered by the contrasting contexts. Contagion is the fit of words to one another like people crowded on a bench, with give and take; give is indifference; take is dominance. Difference of meaning displays itself as comparative rearrangements of oppositions and affinities to other words, for instance, ones that might be substituted in the very sentence but to different effects: You could see an invisible point but not, in the same sense, an invisible cat. You can miss me with an ax, or with a sigh, or in the crowd.

6. Dominance. Here is the nub of it. Unity of meaning is achieved if it can be. (See linguistic force). To attain that, indifferent expressions adapt to dominant ones. But what is indifferent, and what is dominant is entirely relative, partly to how the words are used elsewhere, and partly to what the words are being used to DO (both in an illocutionary and perlocutionary way). For, as I mentioned, the equivalent of gravity is the force downward, from the perlocutionary role of the utterance in the context, upon the component words to adjust so as to perform that role. [A hint of that is our consternation when by mistake we say something with quite the opposite effect from what we intend].

A word becomes dominant when pushed "to the edge of the bench", to the edge of unacceptability, and can "give" no more, say, because it has an antecedent or a syntagmatic link, or a tie to the subject matter, and so, "has its foot down". Thus, the equivalent of mass is entrenchment in a subject, anchoring to a case, or syntagmatic ties to another anchored expression. Words also adapt to avoid semantic uncompletability, and to avoid commonplace falsity or public offensiveness and various equivalents (the latter are defeasible conditions of unacceptability, the former not). Those are the other forces (see below). It is resistance, under the linguistic forces [(1) to make illocutionary sense in the perlocutionary roles of the utterance in action; (2) to avoid (a) uncompletability, (b) commonplace falsehood, and (c) other kinds of unacceptability], that makes a word dominant in an utterance.

Some words resist adjustment to a context when others do not; those that resist, dominate. Dominance is the resistance of the, relatively, definite units (whether anchored or entrenched) to concatenating unacceptably, as long as something else can "give" so as to avoid unacceptability. That makes dominance relative to other words and to context. Being dominated comes down to "giving up" affinities and oppositions to other words (in this context, as compared with various other contexts) to compose an acceptable utterance. (Notice, in an arbitrarily chosen utterance, by itself, neither dominance nor adjustment is discernible.) The dominant words fit the context without adaptation. Adaptation is relative.

No word is always dominated, or always dominates. They all get their turns. Dominance is relative, like physical mass (or size), depending on what is in the neighborhood, the discourse environment. The semantic cosmos is just a distribution of neighborhoods. There is no absolute semantic mass. The relative mass of an expression is a function of whether it tends to be indifferent to other words or intransigent. That depends on the neighborhoods it frequents and upon its entrenchment in a subject or its anchoring to benchmark cases of what it is.

Why do "eyes/ books/ friends/ jaw" all dominate "dropped" in the sentence frame "She dropped her...eyes/ books/ friend/ jaw", but only conditionally dominate "burned" in "She burned her.. eyes/ books/ friend/ jaw," since several senses of "burn" will do? Where as, some of these, "She sold her...eyes/ books/ friend/ jaw," need a "saving" context to avoid unacceptablity? Why isn't "cut" dominated in "She cut" her... eye/ books/ friend/ jaw," though it could be? Here is clear dominance: he commanded... a submarine/ respect/ a thousand dollars a day/ attention/ a regiment/ more and more of my attention/. No adaptation of "cut" is needed for the completed sentence to make a definite and acceptable sense (in context). In other words, when the same sense will do, nothing dominates a common word. This is the principle of inertia, that words recur in the same meaning unless something makes a difference in meaning (for any n-tuple of recurrences).


III. Linguistic General Relativity

1. What is Linguistic General Relativity? Every meaning element depends synchronically on every other. And the "value" of a meaning-element [its particular meaning] depends on what it is combined with and in what perlocutionary role. Yet, effects diminish with distance. So, degree of dependence, depends.

Word meaning in natural language is dynamically organized, like the distribution of matter in space-time. Everything affects everything else semantically, with the "biggest" effects being caused by the, relatively, most massive lexical items on "less" massive words that appear frequently nearby (utility words). Over time, meaning seems to become "more" diversely organized because adaptation tends to increase expressive variety, [though nonsense is a byproduct of semantic contagion too.]

Although there are synchronic slices of discourse (even very big ones), the language (la langue) does not exist atemporally. English exists only in what was said and written during seven centuries, or so. Nevertheless, slices, without regard for time order, reveal explanatory structures [that I call software], the way cell-slices do to a cell biologist; some of the structures are localized geographically, historically, and by social class; but there are underlying universal features; for example, the adaptation of a "utility" word to categorically contrasting completion words, as happens with "used," in "He used...language/ ointment/ exercise/ surgery/ railways/ deceptions/ flattery."

Diachronically, in the semantic, as well as the physical universe, mass determines space and space determines motion -- where "mass" is "dominance" and "space" is "locus of semantic adaptation" and "motion" is actual adaptation. Besides the two basic principles of linguistic inertia and universal linguistic force, there are four component forces, paralleling (1) gravity (the force on component words to achieve the meaning required by the utterance's role in action), (2) electro-magnetic force (the force of semantic inclusions, the way "man" involves "male"), (3) weak force (the force of the defeasible unacceptabilities, like commonplace falsehood, impropriety, and public offense), and (4) strong force, like the force binding the nucleus of an atom, the binding force of combined semantic and syntagmatic ties within discourse. The overall idea is that to fit one context, say, "He used English," the word "used" adds or drops no more of its relations to other words than exactly those needed to differentiate its meaning from its fit in "He used patience," and vice versa, and so on, for any other occurrence of "used", in a complete context.

There is no absolute, only relative semantic mass, indicated by how much contextual modification, especially by explicit phrases, it takes to dominate an expression so that it adapts in meaning to fit the context. That varies with context and with completion expression. Thus semantic mass is equivalent to resistance (potential).

Overall, (1) grammatically well-formed expressions adapt their words to one another to "fit" the action to which the talk belongs, and (2) HOW the "fit" is achieved varies, though under detectable forces. That IS general semantic relativity ("mass determines space; space determines motion"). Diachronically, relativity, displayed as adaptations, expands expressive capacity. Poetry not only shows it up, it shows it off.

2. The principle of inertia is observable: "there is no difference of word meaning without something that makes a difference of meaning". Construct several sentence frames, "She shot...", and complete two with the same word, "pictures," twice. "She shot pictures." "She shot pictures." If "shot" has a different meaning in the two cases something has to make a difference, the way something does in "She shot rapids" and "She shot rustlers".
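
A toy rendering of that test (again in Python, with hypothetical sense glosses of my own; nothing here analyzes "shot") makes the inertia principle concrete: identical completions introduce no difference of meaning, while contrasting completions may.

    # Hypothetical sense glosses for "shot", keyed to its completion word.
    SENSE_OF_SHOT = {
        "pictures": "photographed",
        "rapids": "ran swiftly through",
        "rustlers": "fired a weapon at",
    }

    completions = ["pictures", "pictures", "rapids", "rustlers"]
    for word in completions:
        print(f"She shot {word}.  ->  'shot' ~ {SENSE_OF_SHOT[word]}")
    # The two "pictures" frames come out identical: nothing has made a difference
    # of meaning, so, by inertia, none is assigned.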

3. In "She shot pictures, rapids and rattlers and rustlers", something has to cancel, turn off and turn on, the affinities and oppositions of "shot" to other words (in other contexts). That requires several other elements, the first and most important being, resistance to unacceptability, universal linguistic force, [namely that "grammatically well-formed sentences make what sense they can".] Another formulation of linguistic force is: "grammatically well-formed utterances resist concatenating unacceptably until forced". The logical consequence of universal force is linguistic gravity, a constant causation exerted downward from the meaning of the whole (from its role in our actions) upon the meanings of the parts, to adjust to one another.

Linguistic force manifests itself when the same sense (as any arbitrarily given one) will result in unacceptability for failure of semantic unity (or for certain defeasible reasons). In such a case, each element of the expression is under "pressure" to adjust. Comparatively, the subdominant word adapts. This is universal: utterances avoid unacceptability unless forced. So the meaning of "dropped" and "burned" and "cut" adjusts selectively, in the examples I gave earlier, just as "shot" does in the examples just above. Why does the contrast of "pictures", "rapids", and "rustlers" make a difference of meaning in another word ("shot")? (Of course, it doesn't automatically make a difference; it depends upon the order in which the sentences occur.) Because the differential adjustment of "shot" avoids unacceptable concatenations, which cannot be avoided by any available adjustment of the other words; "failure of semantic unity" (incongruity in the sense of "failure of meaning") is the most powerful of the forms of unacceptability. "Making no sense at all even to the speaker" is not a discourse function; for making no sense at all is not really talk (but only an echo of it).

4. There is weak force, as I said: "words resist concatenating to commonplace falsehoods, public offense (etc.) unless forced." That has enormous "nearby" effects. Local resistance to commonplace falsehoods, offense, impropriety, stupidity, silly puns, etc., is a semantic star builder like the weak force in physics. It tends to stabilize meaning, eliminating double meanings, for instance. See note #49. For "weak force" builds craft-talk directly out of neologisms, etymological derivatives, verbal inventions, and utility adaptations. The force is called "weak" because the resistance to commonplace falsehood, or pointlessness, is easily defeated. So, for example, "everything is garbage" would not be taken literally; yet, it could quite easily be meant literally by someone who says to the garbageman that everything on the curb should be taken away.

5. Further, there is a quantum principle. The strong force of semantic and syntagmatic inclusions combines with the "on/off" form of adjustments to other words ["affinity," "opposition," "congruence," "consequence"] to make step-wise adjustments, so that "meaning adjustments are comparatively minimal". For instance, consider figurative discourse. Relative to a given, non-figurative statement with the required words, any figure of speech can be generated by a short sequence of steps, by changing the dominant words in the sentence frame until the figurative sense is produced. There is a discussion of figurative discourse in Ch. 6, Portraying Analogy, in which it is argued that "complex meaning-relatedness can be decomposed into stepwise atomic adaptations (proportionality, simple metaphor, denomination, and paronymy)...; figurative occurrences can similarly be decomposed into stepwise adaptations whose salient feature is double differentiation, at least one of which is metaphorical and the other of which involves paronymy". There is a method of hypothetical construction offered by which one can confirm, or disconfirm, this hypothesis.

6. Adjustment is quantized; it consists in the comparative addition or loss, a whole step at a time (or several in a clump) of affinity or opposition to other words. An example of this is the fact noted earlier that, among color words that apply nowadays to people, "brown" is not opposed to "black", and "white" has only "black", "yellow", "red" and a few others as contraries. The list, however, is quite different for the skin tones of cosmetics and different again for the skin colors used by painters.

Since adjustments are by whole steps [usually taken in clumps], of giving up opposition or affinity, they are digitalized, rather than continuous. Adjustments ratchet, rather than glide, even though the semantic fit seems as seamless as a movie. (The constructive method I described in Portraying Analogy, Chapter 6 can be used to demonstrate this point, I think.)

The stepwise feature of adaptation makes modeling feasible. First, we can devise a computational model in which each semantic unit has "experiential anchors" (that might in advanced systems change over time), where the semantic value of each term (at a given occurrence) is a small number of pluses and minuses and "neutrals" (zeros) to every other semantic unit, where only certain patterns make utterance-sense. As you combine units, the anchored ones resist changing values until the unanchored (or distantly anchored) ones run through the shortest, next shortest, etc., paths of adjustments of pluses and minuses (and replacements for zeros), until a shortest path into a semantic unity is found. That gives the compositional unity of the whole (subject to various overriding rules and priorities that can be invented, to taste). Thus, you can say of everything that it is not whatever you like. But in "March's storms are Spring's nurse", "nurse" has to drop "human" as a hyponym. A formal dynamic model of adjustment to context can be made, especially using the over-simplified structural principle that semantic incompatibilities will not compute. We, of course, have no such principle in natural languages; incompatibilities can make semantic unities. Our principles of non-assembly, non-unity, have yet to be discovered.
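
As a concreteness check, here is a minimal sketch of that first kind of model (in Python; the relation values, the anchoring, and above all the "unity" test are stand-ins invented to taste, as the text allows, not claims about any natural language). Each unit carries +1, -1, or 0 toward the other units; anchored relations resist change; a breadth-first search over one-step changes finds a shortest path into "unity."

    from collections import deque

    def makes_unity(relations):
        # Toy unity test (an assumption of the sketch, not a linguistic fact):
        # the utterance coheres when no pair of units remains in opposition (-1).
        return all(value != -1 for value in relations.values())

    def adjust(relations, anchored_pairs):
        # Breadth-first search over one-step changes to unanchored relations, so
        # the first profile that makes unity is a shortest path into semantic unity.
        start = tuple(sorted(relations.items()))
        queue = deque([(start, 0)])
        seen = {start}
        while queue:
            state, steps = queue.popleft()
            current = dict(state)
            if makes_unity(current):
                return current, steps
            for pair, value in current.items():
                if pair in anchored_pairs:
                    continue           # anchored relations resist changing values
                for new_value in (+1, 0, -1):
                    if new_value == value:
                        continue
                    candidate = dict(current)
                    candidate[pair] = new_value
                    key = tuple(sorted(candidate.items()))
                    if key not in seen:
                        seen.add(key)
                        queue.append((key, steps + 1))
        return None, None

    # Hypothetical profile for "March's storms are Spring's nurse": 'nurse' starts
    # out opposed to the weather words (its literal, human-tending sense).
    relations = {
        ("storms", "spring"): +1,
        ("nurse", "storms"): -1,
        ("nurse", "spring"): -1,
    }
    anchored = {("storms", "spring")}   # entrenched in the subject matter

    adjusted, steps = adjust(relations, anchored)
    print(adjusted, "reached in", steps, "steps")

On this sketch the anchored pair never moves, and the unanchored word gives up exactly the oppositions it must and nothing more, the minimality the quantum principle asks for; its obvious limit, as noted just above, is that in natural languages incompatibilities can make semantic unities.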

Secondly, less formal models, models for craft-talk well known to the modelmaker [models made by experts sampling the talk in which the sampler is an expert] can be used to examine inertia, linguistic force, strong force, and weak force, to see whether semantic contagion really is the visible manifestation of general semantic relativity. That is what I describe in section V, below.

7. Semantic Software. There is (1) a dynamic semantic software of natural languages, that is (2) embedded synchronically in any large slice from the corpus of actual discourse, that (3) explains the adaptations of word-meaning to contrasting contexts and (4) produces multiple semantic fields from the same (or overlapping and regrouped) word members and (5) expands expressive capacity with new meaning.

Rearranging lexical fields (like color words for cows and people) is as startling as a scar. But it is a mere product (along with new words) of the simple adaptation that slides by, in silent computation, like envy, yet recalibrates words, like temperature words, as we drive through another linguistic neighborhood. So hot days and cool nights give way to hot shades and cool colors, to hot trumpets and cool saxes (hot saxes and cool flutes), to hot eyes and cool glances, hot spots and cool stances. The silent adaptation (contagion) is invisible. Look at "recalibrates" above; look at "slides by"; look at "rearranging lexical fields"; they are low on the semantic horizon and as dun colored as stealth planes, but they have adapted.

Adapted utility words, along with special words dictated by the demands of action, make galaxies of craft-talk that dot the semantic space of the language; the common structure of the space is the organization of the lexicon. That structure, cosmically, is general relativity. That, in precis, is my message.

The meaning-space forms under general relativity, constantly, locally where a singer sings "Every lover is a thief, all want, no friend. ...No wonder every love, with want, will end." Items far apart lexically, say, "lover" and "thief" get closer in meaning, while "lover" and "friend" move apart a little, and "love" and "loyal" separate, and "love" and "want" get closer, while "want" and "love" and "am interested in" pull apart. But quote St. Paul, Romans, 13:10 "Love is the one thing that cannot hurt your neighbor," and space reshapes. The worldline of "love" veers toward that of "friend" and streams along near "want", wobbles relatively to "loyal" and sharply veers toward, and away from "ends", time after time.

It is as if culture moved its knee in bed and the blanket of meaning reshaped. With big life changes (atomic war and power; cancer and pollution; jets and take-overs), there are big meaning changes: linguistic dust is sucked into new galaxies of craft-talk. Old star clusters burn out like radio tubes. Old words and neologisms congeal like gases into new spiral nebulae, fiery blue and red (the further and faster), into discourse that IS our thought for new and beautiful, bad and sad things: holes in the ozone layer; chemicals killing fish; binaural sound; acid rock; interferon; crack.

The same "software" explains (1) the meaning-adjustment (teeming/ rain; teeming/ crowds) as if it were reorganization of lexical fields to fit distinct subjects; (2) the polysemy of the "utility" words of natural languages, (like "give/ take/ run/ cut/ hit/ fix/ learn/ read/ see/ drive/ ..."), and (3) the transfer of those unbound utility words into the craftbound discourse, where some of the words are semantically captured, the way "cause" is in tort law, "run" is in baseball, and "drive" is in golf, and where other utility words remain adapted to unbound contexts, to provide the "bridging" facts between unbound and craftbound talk. Thus "I have a stomachache" has both an unbound and a craftbound use each with different truth conditions.

IV: Compositionality?

Compositionality as previously understood through this century has been falsified. There are many forms of compositionality, but basically all are, as M.J. Cresswell reasoned, required because the complexity and multiplicity of what we say could not be learned "one by one". Thus, utterance meanings have to be composed (put together), and also understood by our working them out from the utterance structure and the meanings of the individual words. It is a Lego theory of utterance meaning, however it is packaged. Moreover, it invites the errors of translation that Latin students quickly abandon (I hope). If what the components are depends on how they are assembled, as I say it does, then compositionality, as now understood, is false. You have to understand the whole to know how to translate the words: cumini sectores are not barbers, or scissors, but hair splitters.

Donald Davidson dragged the semantics of natural languages behind the grammar: that the language must be axiomatizable formally, and that there must be a finite number of meaning primitives, from which all utterances are made. I agree that a learner has to begin with a finite number of meaning units. The rest, in particular Davidson's conclusion that otherwise the language would be unlearnable, seems gratuitous--as is Chomsky's claim that unless we have a hardwired language learning machine or grammar-induction machine, we could never learn the grammar of a natural language because it has infinite potential sentences. (In a few minutes one can learn patterns of piano or singing notes that have equal variety, and just by hearing a few.) For instance, Davidson argues:

Suppose that a language lacks this feature; then no matter how many sentences a would-be speaker learns to produce and understand, there will always be others whose meanings are not given by the rules already mastered. Such a language is unlearnable.

Natural languages are unlearnable in Davidson's sense. There will always be sentences whose meanings are not given by any rules already learned for the production and understanding of sentences. That is not so much because we cannot compute the grammar of new sentences, as that we cannot break into the meaning network from our base of experience. That holds for everyone, and for extremely large samples of the well-formed utterances in the language. No matter how much you learn, you will be a stranger in so many areas of human expertise as to be unable even to understand what is being said by those familiar with the subject. Probably, most of what is said in English nowadays is semantically inaccessible to most speakers because it is craftbound. And it is a deep misunderstanding to regard that missing element as a matter of not knowing the "references" of words. Rather, in C.I. Lewis' terms, it is a case of not knowing the sense meaning, e.g., how a broken femur feels to an orthopedist's hands; what the sense of "collateral estoppel" is in law, as distinct from "equitable estoppel". We are not just short on the words; we are short on experience.

You cannot, because of craft-talk, learn any natural language as a whole. There always WILL be well-formed utterances whose meanings we cannot compute, whose truth-conditions and conditions of warranted-assertability escape us. The talk is inaccessible from our experiential base.

It seems that there is one thing that must be jettisoned for sure; that is Davidson's notion of compositionality functioning as follows:

If sentences depend for their meanings on their structures, and we understand the meaning of each item in the structure only as an abstraction from the totality of sentences in which it features, then we can give the meaning of any sentence (or word) only by giving the meaning of every sentence (and word) in the language.

Not that words are not components, or that they do not contribute individually to utterance meanings. They do. But what they contribute depends on what the other words contribute as they resolve their resistances.

V: From Unbound Discourse to Craft-talk

Once we understand some principles that explain the general relativity of word-meanings within a language, [e.g., the opposition between "boy" and "girl", but the relative affinity of "boy" and "girl" in contrast to "cat" and "dog"; and their affinity, in contrast to "philosophy" and "religion", and so on--to use an oversimplified example], we can look at the lexicon as a cosmos of word clusters, looking for (1) an internal "software" for meaning adaptations in utterances and for (2) interferences, deformation of word clusters by the demands of action (the pragmatic traction of talk that is [part of] doing something). For the necessities of action are to meaning what mass is to space.

The lexicon is ordered, but not hierarchically. It is ordered adaptively to action. We make "new" [multiple] symbols for exactitude at the expense of analogue semantic speed (differentiation by context) where the payoff rewards it; otherwise we use analogy, analogue meaning, -- as "payoff" and "rewards" in the previous clause display, and as "look at" and "look for" in the paragraph above do, too.

There really is an explanation for why we have many words, when a few words with many meanings might also do. There really is an explanation of why many meanings are lexicalized by a single morpho-phonological form. [That turns out to be a necessary consequence of dominance.] The explanatory structure in both cases is basically the same (general relativity and pragmatic traction). And HOW a meaning contrast is lexicalized (by a new word or by adaptation of a word in service) almost entirely depends on what is at stake. In fact, we often use both for different objectives, as "...am acquainted with..." and "...know of...", and, other times, simply "know".

Something is at stake in the talk that might be lost without distinct symbols, and even special phrases and sounds, (as in airplane talk, "niner"). Often, the need to avoid ambiguity (in matters of life and death, and money and marriage, too), or the demands of elegance and the display of power, count on different words to mark the meanings. Vanity, precision, and a thousand frailties and virtues, motivate marking distinct meanings with distinct sounds, (not the least social factor being the power of esoteric knowledge protected by the craft-talk of insiders: doctors, financiers, shamans, and lawyers). Just as importantly, the revelatory power of metaphor and the emotive force of figurative discourse motivate the silent assault of semantic contagion.

Semantic contagion and pragmatic traction steam, groan and boil over in craft-talk, the talk "insiders" know (with its many faces), and in its apes and impersonations, right down to the jargon of hang-tough executives and con-artists, and to "street talk" and "rap". That's where the "linguistic action" is: where the doing is.

We can use craft-talk as an experimental laboratory (1) to find craft-talk to be (partially) a product of the semantic organization of the unbound lexicon, (2) with a distinct arena of experience that, by pragmatic traction, makes its special vocabulary; and (3) to show how general relativity is taught by linguistic practice from infancy (e.g., "Mommy loves you", "Johnny loves ice cream"; "See Daddy", "See the ball"; "See the red?"; "Fix lunch", "Fix my wagon", "Fix my pants."), so that children have a general purpose semantics that unfolds into the many domains of adult craft-talk, mastered even by the greatly disadvantaged.

The general purpose semantics and pragmatics of infants is the very same software for the craft-talk of adults, and for the meanderings of the deranged. The differences lie in the experience base and the contexts of action. So, only part of the story is in general relativity, the semantics. The rest is in the distinct experiences of crafts, and in pragmatic traction, the mutual molding between talk and doing. Among dairymen there cannot be an argument as to whether a heifer is a cow; but in law there can be; and, in law, there can be one as to whether a train engine is a railroad car, or an airplane a motor vehicle.

You can tell when you have craft-talk for an activity, not just by the sudden blossoming of bunches of words associated with the activity, but more importantly, by what insiders MUST NOT SAY. That is, by restrictions on the acceptability of utterances that are otherwise well-formed and acceptable in "unbound discourse," our general purpose discourse. So a sailor says "throw me the line", not "throw me that rope". A person may ask "can I sue him?" In legal parlance, the answer is "you can sue anybody for anything". But that is not what the craft-outsider is really asking; he is asking "can I probably win?". Craft discourse is more regimented, more full of rank and restriction of meaning than unbound discourse, in particular, by excluding expressions that are semantically well-formed and close in meaning to what is required in the craft: so "that's robbery" is acceptable in unbound discourse when you are overcharged, but is a solecism in law, if violence or a threat of imminent violence is not involved.

There are not neat boundaries among related crafts, e.g., between tax law and tort law. Classifications often migrate, are appropriated, distorted, and augmented, for use in related crafts, as I illustrated with a family of wood names, above. Some crafts are more or less "close" to others. Others are unrelated: stonewall-building is nothing like canon law; goldsmithing, nothing like flight-control; policework is nothing like currency arbitrage, or electronic design. Unrelated crafts are so distant that their lexical changes, short of supernovae (like the introduction of computer analogies), don't have an apparent impact on the others, even though everything is subtly and often imperceptibly (at short range) changing relative semantic position (opposition and affinity), like the fixed stars, and all word worldlines travel subtle corkscrews of semantic space.

The two key features of lexical organization, semantic contagion and pragmatic traction, are consequences of the engagement of discourse with endeavor. When you put talk into gear ("put your mouth in gear") to DO SOMETHING, from simply lying, sighing or crying, to "telling how it happened", to talking a non-pilot down to a safe landing, or an enraged friend out of a walk-out--and everything we do is far more complicated than those words display--the necessities and niceties of the task explain the ramification (and often the simplification) of the vocabulary, including the polysemy, just as they do in the real crafts like cabinetry, masonry, boating, and instrument flying. To see the gears and levers working, look at some case of craft-talk, a planetarium for the whole language.

VI: Conclusion

One thing I take to be beyond doubt, already displayed, is that there is, comparatively, adaptation of meaning (of the same words) to varying semantic contexts -- namely, semantic contagion-- and that it is everywhere in discourse, and is principled. Whether I have sketched the principles minutely enough or accurately is another matter. Further, the examples make it plain, I think, that the identity of the lexical components of a sentence is dependent on what items are combined.

That dynamic feature of semantics cannot be explained by any account presupposing a fixed basic vocabulary, or a fixed categorical nesting of lexical fields, or even a finite innate stock of lexical markers, that maintain their "original" affinities and oppositions regardless of how they are combined.

But once you allow that there are principles of semantic combination, you have recognized general relativity of meaning. That is the simple change in assumption that has to be made to revolutionize semantics. I suggest, even urge, that we attend to the complexity of craft-talk which both displays and proves these claims and is the place to turn general speculation into manageable empirical hypotheses.

The whole of the data, the rioting waves of opposition, overlap, contrast and clash of meaning, are whipped around by the winds of our doings, shaping discourse to action, and making acts out of talk, in a sea of semantic relativity explained by harmonious and beautiful symmetries.

James Ross

University of Pennsylvania

Philadelphia, PA 3-11-91.