Module 4

Morpheme, Allomorphs, Zero Morph, Portmanteau Morph

Morphology is the study of the internal structure of words. It is the systematic study of how morphemes combine to form words. The first linguists were primarily morphologists: well-structured lists of morphological forms of Sumerian words are attested on clay tablets from Ancient Mesopotamia dating from around the 16th century BC. Morphology was also prominent in the writings of Panini (5th century BC) and in the Greek and Roman grammatical tradition. Until the 19th century, Western linguists often thought of grammar as consisting primarily of rules determining word structure.

A morpheme is the minimal meaning-bearing unit in the structure of language. Words are composed of one or more morphemes. For example, 'nation' consists of one morpheme. 'National' consists of two morphemes: nation and -al. 'Nationalize' consists of three morphemes: nation, -al and -ize. 'Nationalization' consists of four morphemes: nation, -al, -iz- and -ation. 'Denationalization' consists of five morphemes: de-, nation, -al, -iz- and -ation; in other words, it is composed of five meaningful units. More examples: sing-er-s, home-work, un-kind-ly. Morphemes combine to form words.

Some linguists have defined morphemes in terms of phonemes. They describe a morpheme as a meaningful phoneme, or series of phonemes, which cannot be further divided without destroying or changing its meaning in a particular language. A morpheme may consist of a single phoneme, but it has to have some meaning. For example, the /z/ in sings is a morpheme but the /z/ in zeal is not: the /z/ in sings is a phoneme and also a morpheme, because it marks the third person singular of the verb in the present tense. A morpheme is the smallest unit of meaning in the grammatical system of a language.

There are two types of morphemes: free and bound. Free morphemes are those which can occur independently as words in a sentence, e.g., work, happy, boy. Bound morphemes are those which cannot occur independently, e.g., -ed, -ing, -s, un-, -al; they are always attached (bound) to other morphemes: work + -ed = worked, un- + happy = unhappy, boy + -s = boys. Bound morphemes are also called affixes.

Allomorphs are variants of the same morpheme, i.e., morphs corresponding to the same morpheme; they have the same function but different forms. Some morphemes have only a single form in all contexts, e.g., -ing. But some morphemes are realised in variant forms. For example, the plural morpheme, spelt -s or -es, has three different phonetic realisations: /s/, /z/ and /iz/ in posts, dogs and bosses respectively. These variant forms of a morpheme are called its allomorphs. Another example is the past tense morpheme, spelt -d or -ed, which has three variant phonetic realisations: /d/, /t/ and /id/ in saved, picked and wanted respectively.
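The choice among the plural allomorphs is predictable from the final sound of the stem. A minimal sketch of the rule follows; spelling endings stand in for real phonemic transcription here, so the sound classes are simplified, illustrative assumptions:

```python
# Toy rule for the English plural allomorph, keyed to the stem's final sound.
# Spelling endings approximate phonemes, so the classes below are rough.

SIBILANT_ENDINGS = ("s", "z", "sh", "ch", "x")   # bosses-type stems -> /iz/
VOICELESS_ENDINGS = ("p", "t", "k", "f", "th")   # posts-type stems  -> /s/

def plural_allomorph(stem: str) -> str:
    """Return the phonetic realisation of the plural morpheme for `stem`."""
    if stem.endswith(SIBILANT_ENDINGS):
        return "/iz/"
    if stem.endswith(VOICELESS_ENDINGS):
        return "/s/"
    return "/z/"    # voiced consonants and vowels (dogs-type stems)

for word in ("post", "dog", "boss"):
    print(word, plural_allomorph(word))
```

Run on the three examples above, this yields /s/ for post, /z/ for dog and /iz/ for boss, matching the allomorphs heard in posts, dogs and bosses.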
Zero Morph

Some linguists recognise a zero morph where a morpheme is expected in a grammatical system but is not overtly represented. It is represented by the symbol Ø. The absence of a relative pronoun in the utterance "a letter I wrote" is an example of a zero morph: we expect "a letter that I wrote", but that is absent, though it is implied. Another place where zero morphs are found is countable nouns that have the same form for the singular and the plural. The plural of sheep is sheep itself, though the plural of cow is cows; the word sheep has undergone zero plural modification. One more example is verbs that have the same form for present and past tense: in the case of words like cut, hit and shut the present and the past tense are the same, i.e., they have undergone zero past tense modification.

A portmanteau word is a blend of two or more words, in which parts selected from those words are combined into a single word with a new and different sound and a new meaning all its own. The portmanteau word formed this way is a novel combination of the original sounds and meanings of the words it came from. For example, 'brunch' is a portmanteau word which blends breakfast with lunch.

A portmanteau morph is a phonological sequence that cannot be analyzed into smaller units in terms of form but has two or more distinct components in terms of meaning. It is a single morph that is analyzed as representing two underlying morphemes: a single morph which consists of two or more morphemes but which cannot be divided neatly. For example, the verb 'crashed' can be separated into the morphemes crash and -ed, but a word like 'sang', which consists of the stem sing and a past tense marker (the changed vowel), cannot be so divided, though we know that in terms of meaning it is a combination of two morphemes. Another example: the word took is a portmanteau morph representing two morphemes, take and -ed.
Inflection and Derivation; Level 1 and Level 2 Affixes in English; Ordering between Derivation and Inflection; + Boundary (Morpheme Level) and # Boundary (Word Level) in Affixation
Inflection

Inflections are affixes (i.e., bound morphemes that do not occur independently but only attached to other morphemes) used to inflect words to indicate grammatical relations. Unlike derivations, which form new words, inflections change only the form of a word and not its class (class is equivalent to a 'part of speech' in traditional grammar) or meaning, and hence they do not create new words. For example, in laugh, laughs, laughing, laughed the -s, -ing and -ed are inflections. Inflected languages have inflections for number, gender, case, tense etc. Latin, for instance, has about 120 forms of the verb, beginning with amo, amas and amat. Most European languages have far more forms than English, which has only five different forms of the verb (except be): write, writes, writing, wrote, written. Many languages have different forms of the noun, while English nouns have just two forms, singular and plural, apart from the possessive forms.
Inflections do not form new words, but serve to indicate grammatical relations such as number, gender, tense etc. Thus the English inflectional suffix -s in 'girls' makes it plural in form, but girl and girls are not two separate words belonging to two separate word classes or with different meanings. That is why leaf and leaves, or write and writes, or run and ran are not given separate headwords in dictionaries. Derivational suffixes, on the other hand, create new words from the stem: thus from 'child' we get a new word 'childhood' by attaching the derivational suffix -hood to it. Inflectional suffixes occur at the end of words, not followed by other suffixes; they are closing morphemes. In English, inflections are all suffixes and not prefixes.
The following are the inflectional affixes in English:

Nouns:
1. Plural -s (e.g., girls)
2. Possessive -'s (e.g., girl's)

Verbs:
1. Present tense singular -s (e.g., laughs)
2. Past tense -ed (e.g., laughed)
3. Present participle -ing (e.g., driving)
4. Past participle -en (e.g., driven)

Adjectives:
1. Comparative -er (e.g., higher)
2. Superlative -est (e.g., highest)

Derivation

Derivation is the process by which new words are formed from existing words; e.g., from the stem boy we can derive a new word, boyhood, by adding the suffix -hood. Derivation creates separate words, so that leaflet, writer and rerun will figure as separate words in dictionaries. Derivational affixes can be suffixes or prefixes. They are of two types:
1. Class-changing derivational affixes
2. Class-maintaining derivational affixes

Class-changing derivational affixes change the class of the word. For example, when we add -ness to good (an adjective) it becomes goodness (a noun); so here, by adding the affix -ness, the adjective becomes a noun. Similarly, 'centralise' is a verb, but when the class-changing derivational affix -ation is added to it, it becomes the noun 'centralisation'.

Class-maintaining derivational affixes do not change the class of the stem. When we attach the derivational affix un- to the stem 'happy' we get a new word, 'unhappy'. Here 'happy' and 'unhappy' are both adjectives; no change of class takes place. Hence un- in this case is a class-maintaining derivational affix. -hood in 'childhood' and il- in 'illogical' are other examples of class-maintaining derivational affixes.

Derivation creates new words: logic, logical, illogical, illogicality, logician, etc. Derivation tends to affect the meaning of the word, while inflection tends to affect only its syntactic function. However, the boundary between derivation and inflection is often fuzzy and unclear.

Level-ordered Morphology

The description is aimed at examining the shape taken by a lexical category when it goes through the morphological processes of affixation, culminating in the formation of a word. There are two levels. The proponents of the Lexical Morphology theory propose a morphological concept referred to as strata, which refers to a series of affixation morphemes which come packaged together in the word formation process. The morphemes are usually linearly arranged, so that both derivational and inflectional word formation processes occur in a series of strata or levels which are linked together. This concept provides that words are made up of the root, base and affixes, and that the affixes are organized into strata, such that certain affixes are added at level one and others at level two.

Level 1 affixes are closer to the root than Level 2 affixes, in what is known as the ordering of affixes. The import of this is that morphological rules apply in the lexicon in such a way that rules apply first to the root of the word, then outward to the subsequent layers of affixes. This results in a lexicon which is considered as having an internal structure and not just a list of words. Katamba (1989:258) likens this structure to an onion, 'with the root of the word as the core, and Level 1 as the inner layer, Level 2 as the outer layer and post-lexical phonology as the skin on the outside'.

This ordering of affixes can be illustrated in the following sketch:

Prefixes [root/base] Suffixes
[Level 1 affixes] [Root] [Level 1 affixes]
[Level 2 affixes] [Level 1 affixes] [Root] [Level 1 affixes] [Level 2 affixes]


The process begins with an underived lexical item which goes through a Level 1 morphological process of affixation. Thereafter, the derived word goes through another morphological process of affixation at Level 2. Further, alongside the morphological rules are phonological rules which show how the resultant structure built by the morphology should be articulated.
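The stratum ordering described above can be sketched as a procedure that attaches every Level 1 affix before any Level 2 affix, whatever order the affixes are listed in. The tiny affix inventory and the plain string concatenation are illustrative assumptions, not claims about English phonology:

```python
# Sketch of level-ordered affixation: Level 1 affixes attach to the root
# first (the inner onion layer), Level 2 affixes attach outside them.

LEVEL_1 = {"in-", "-ive", "-ity"}     # Class I / + boundary affixes
LEVEL_2 = {"un-", "-ness", "-er"}     # Class II / # boundary affixes

def derive(root: str, affixes: list[str]) -> str:
    """Attach `affixes` stratum by stratum, prefixes and suffixes alike."""
    for stratum in (LEVEL_1, LEVEL_2):
        for affix in affixes:
            if affix not in stratum:
                continue
            if affix.startswith("-"):             # suffix
                root = root + affix.lstrip("-")
            else:                                 # prefix
                root = affix.rstrip("-") + root
    return root

# Even with -ness listed first, Level 1 -ive attaches closer to the root.
print(derive("product", ["-ness", "-ive"]))   # productiveness
```

The point of the sketch is that the output depends on the strata, not on the listing order of the affixes, mirroring the claim that Level 1 material is always inside Level 2 material.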

A boundary is a formal device used in Chomsky & Halle (1968) to express a distinction between two types of affixes.
Example:
The derivation of the English words productívity and prodúctiveness runs as follows. In the case of productivity, we first put together pro-, duct, -ive and -ity (all Class I), and then we apply the stress rules (productívity). The derivation of productiveness is crucially different. First we put together pro-, duct and -ive, then we apply the stress rules (prodúctive), and only then do we have the chance to add the Class II affix -ness, giving prodúctiveness. Since the affixation of -ness takes place after the stress rules have applied, it is correctly predicted that -ness cannot affect the stress already assigned to productive.
The claim is that -ity is a morpheme-boundary or formative-boundary affix (i.e., +ity), and -ness a word-boundary affix (i.e., #ness).
The assumption that affixes are associated with different boundaries, viz. + (morpheme boundary) and # (word boundary), accounts for the fact that the English suffixes -ity and -ness behave differently with respect to a number of phonological rules.

It is worth observing that although the proponents of the lexical morphology theory generally agree that the lexicon is hierarchically structured, they are nonetheless not in agreement on the number of levels or strata involved. Mohanan (1982:8), for instance, advocates a 4-level hierarchical structure. On his part, Kiparsky (1982:133) suggests 3 strata. Katamba (1989:259) argues in favour of two strata, namely Stratum 1, which involves derivations, and Stratum 2, which involves inflections.
Word Formation Techniques: Blending, Clipping, Back Formations, Acronyms, Echo Word Formation, Abbreviation etc.

As knowledge grows, so language grows with it. As languages evolve and develop all the time, new words emerge to help users communicate better. Dr. Johnson's dictionary of 1755 recorded about 48,000 headwords, whereas the Oxford dictionary today has explanations for 273,000 words. Though the former was the work of one man and the latter of a team assisted by technology, we can safely say that words keep getting added through various means every single day that a language continues to exist.

"The English language is by far the richest, having the most extensive vocabulary of any in the world. This is partly due to historical factors, partly to 'the genius of the language' and its readiness to absorb words from foreign tongues, or to make new ones where existing terms are not adequate." (F. T. Wood)

Blending

Blending is a process in which the sounds and meanings of two different words are blended (combined) to form a new word. For example, 'telecast' is a word formed from tele-vision and broad-cast. Such words are also known as portmanteau words. The term is derived from the French word which combines porter (to carry) and manteau (mantle). Lewis Carroll is credited with coining this use of the word in his Through the Looking-Glass, based on the fact that a portmanteau bag is one that opens into two equal parts. Blends are also called centaur words because they are coined by a combination of two words, just as a centaur is a combination of man and horse. Some examples are:

channel + tunnel = chunnel
motor + hotel = motel
work + alcoholic = workaholic
emote + icon = emoticon
smoke + fog = smog
biography + picture = biopic
camera + recorder = camcorder
gleam + shimmer = glimmer
Most blends are formed by the following methods:
a. The beginning of one word is added to the end of the other. Ex: breakfast + lunch = brunch
b. The beginnings of two words are combined. Ex: cybernetic + organism = cyborg
c. One complete word is combined with a part of the other. Ex: guess + estimate = guesstimate
d. Two words are blended around a common sequence of sounds. Ex: California + fornication = Californication
e. Multiple sounds from two component words are blended while preserving the sounds' order. Ex: lithe + slimy = slithy
When two words are combined in their entirety, the result is considered a compound word rather than a blend. Ex: bagpipe is a compound, not a blend.
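Method (a) above can be sketched mechanically: keep the start of one word, drop the start of the other, and join the pieces. The cut points are supplied by hand in this sketch, since choosing them (typically at syllable boundaries) is the genuinely linguistic part:

```python
# Sketch of blending method (a): beginning of word1 + end of word2.
# The cut points are given by hand; finding them automatically would
# require syllable and phonotactic analysis.

def blend(word1: str, keep: int, word2: str, drop: int) -> str:
    """Keep the first `keep` letters of word1, drop the first `drop`
    letters of word2, and join the remainders."""
    return word1[:keep] + word2[drop:]

print(blend("breakfast", 2, "lunch", 1))   # brunch
print(blend("smoke", 2, "fog", 1))         # smog
print(blend("motor", 2, "hotel", 2))       # motel
```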

Abbreviation

An abbreviation is a shortened form of a word or phrase that is used to represent the full form of the word or phrase. Here are some examples:
USA (United States of America)
CEO (Chief Executive Officer)
UNESCO (United Nations Educational, Scientific and Cultural Organization)
km (kilometre)
St. (Street)
There are three types of abbreviations.

a. Acronyms: When we put the initial letters of a set of words together, or separate the initial letters of a word, we make an acronym. An acronym is always pronounced as a single word.
NATO (North Atlantic Treaty Organization)
NASA (National Aeronautics and Space Administration)
UNICEF (United Nations International Children's Emergency Fund)
b. Initialism: An initialism is formed by the first letters of a group of words that are pronounced letter by letter, not as a word. (Ex: BBC is pronounced as bee-bee-cee.)
HR (Human Resources)
UN (United Nations)
URL (uniform resource locator)
FYI (for your information)
c. Clipping: A short form created by removing one or more syllables.
ad (advertisement)
exam (examination)
phone (telephone)
varsity (university)
flu (influenza)

Clipping

Clipping is a method of shortening a long word by cutting off (clipping) one or more syllables from the word. The part of the word that remains stands for the whole word. In short, clipping is the word formation process which consists of the reduction of a word to one of its parts. It is a variety of the abbreviation method of word formation. Some examples are:

advertisement = ad
hamburger = burger
demonstration = demo
laboratory = lab
taxicab = taxi
omnibus = bus
telephone = phone
aeroplane = plane
influenza = flu
Clipping mainly consists of the following types:
a. Back clipping: This is the most common type of clipping, where the beginning of a word is retained. Ex: ad (advertisement), exam (examination), gas (gasoline), memo (memorandum), pub (public house), gym (gymnasium)
b. Fore-clipping: Here the final part of a long word is retained. Ex: phone (telephone), varsity (university), chute (parachute)
c. Middle clipping: Here the middle of the word is retained. Ex: flu (influenza), jams (pyjamas)
d. Complex clipping (clipped compounding): When a word is clipped and compounded with another word. It is a type of blending. Ex: sci-fi (science fiction), sitcom (situation comedy). Here the words were compounded and then clipped.
Clippings originate as terms of a special group like schools, the army, the police, the medical profession etc., in the intimacy of a milieu where a hint is sufficient to indicate the whole. Clipped terms of some influential groups then pass on into common usage.

Acronyms

Acronyms are abbreviations formed by using only the first letter(s) of a series of words or word parts in a phrase or name. These sets of letters are pronounced as one single word. Example:

NATO (North Atlantic Treaty Organization)
NASA (National Aeronautics and Space Administration)
UNICEF (United Nations International Children's Emergency Fund)
These examples have kept their capital letters, but many acronyms lose their capital letters to become everyday terms. Example:

laser (Light Amplification by Stimulated Emission of Radiation)
radar (Radio Detecting and Ranging)
Different varieties of acronyms exist:
a. Those pronounced as a word, containing only initial letters
Scuba (Self-Contained Underwater Breathing Apparatus)
b. Those pronounced as a word, containing non-initial letters
Interpol (International Criminal Police Organisation)
c. Recursive acronyms, in which the expansion of one of the initials is the abbreviation itself
Visa (Visa International Service Association)
d. Pseudo-acronyms, which when pronounced resemble the sounds of other words
IOU (I Owe You)
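Plain initial-letter acronyms (variety (a), and initialisms likewise) can be sketched as below. The list of skipped function words is an illustrative assumption, and whether the result is said as a word or letter by letter cannot be predicted from the letters themselves:

```python
# Sketch of acronym formation from initial letters, skipping the function
# words (of, by, and, the, for) that acronyms conventionally ignore.

FUNCTION_WORDS = {"of", "by", "and", "the", "for"}

def acronym(phrase: str) -> str:
    """Join the initial letters of the content words in `phrase`."""
    return "".join(word[0].upper()
                   for word in phrase.split()
                   if word.lower() not in FUNCTION_WORDS)

print(acronym("North Atlantic Treaty Organization"))                       # NATO
print(acronym("Light Amplification by Stimulated Emission of Radiation"))  # LASER
```

Note that this only covers initial-letter acronyms; a form like Interpol, built from non-initial letters, falls outside the sketch.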

Backformation

The formation of a word from one that looks like its derivative is called back formation. One of the most common methods of word formation is affixation: adding a prefix or suffix to create a new word, for example speak + -er = speaker or act + -or = actor. But back formation is the exact reverse process. It is the creation of a simpler or shorter form from a pre-existing, more complex form, like edit from editor or intuit from intuition. In other words, back formation is the creation of a new word from an existing form assumed, incorrectly most of the time, to be its derivative.

option (noun) → opt (verb)
insertion (noun) → insert (verb)
opinion (noun) → opine (verb)
resurrection (noun) → resurrect (verb)
beggar (noun) → beg (verb)
diagnosis (noun) → diagnose (verb)
burglar (noun) → burgle (verb)
hawker (noun) → hawk (verb)
television (noun) → televise (verb)

Back-formation is different from clipping. Back-formation may change the word's class or meaning, whereas clipping makes shortened words from longer words but does not change the class or meaning of the word. Another curious type of back formation, seen mainly in British and Australian English, is hypocorisms: first a longer word is reduced to a single syllable, then -y or -ie is added at the end. Example: movie from "moving pictures", Aussie from "Australian", bookie from "bookmaker" and telly from "television".

Echo Word Formation

Echo word formation, or echoism, is the formation of words whose sounds suggest their meaning. That is, onomatopoeia is used for the formation of words by imitating sounds. It is one of the oldest means of word making. Examples: bang, pop, buzz, click, rumble, slash, hiss, giggle, whip, cuckoo etc. The word 'barbarian' in Latin is said to be an imitation of the uncouth and unintelligible babbling (as it sounded to Roman ears) of foreign tribes. Onomatopoeic words are not the same across languages; hence the sound of a clock may be 'tick-tock' in English, 'di-da' in Mandarin and 'katchin katchin' in Japanese.

‭*********************************************************************************‬

Affixation

The process of adding prefixes, suffixes, infixes or circumfixes to words to create new ones.

Prefixes are those affixes which are added at the beginning of a base word. The most commonly used prefixes are un-, dis-, mis-, im-, il-, etc. For example:

Discipline – indiscipline
Just – unjust
Tidy – untidy
Respect – disrespect
Understand – misunderstand
Comfortable – uncomfortable
Comfort – discomfort
Responsible – irresponsible
Honest – dishonest
Suffixes are those affixes which are added at the end of a base word. The most common suffixes include '-ment', '-ness', '-ity', '-ous', '-tion', '-sion', '-al', '-able', '-ible', '-ive', '-ly', '-ate', '-er', '-or', etc. For example:

Comprehend (verb) – comprehension (noun) – comprehensible (adjective)
Inform (verb) – information (noun) – informative (adjective)
Invest (verb) – investment (noun) – investor (noun)
Write (verb) – writer (noun)
Authorise (verb) – authorisation (noun)
Move (verb) – movement (noun)
Add (verb) – addition (noun)
Happy (adjective) – happiness (noun)
Conserve (verb) – conservation (noun)
Infixes are affixes inserted into word stems, as in mother-in-law to mothers-in-law: by adding the infix -s to the first noun, the plural form is generated.

A circumfix consists of a prefix and a suffix that together produce a derived or inflected form. Example: 'enlighten' has en- and -en added to light to create a new word.

Compounding

Compound words are formed by combining one part of speech with another to form a specific word class. There are many ways in which compound words are formed: verbs are combined with adjectives to form compound verbs, a present participle is combined with a noun to form a compound noun, two nouns are combined to form a compound noun, an adjective and a noun are combined to form a compound noun, an adverb is combined with a noun to form a compound noun, an adjective is combined with a past participle to form a compound adjective, and so on. Take a look at the following examples of compound nouns, compound words and compound adjectives to understand how they work.

Examples of Word Formation by Compounding

Over (adverb) + load (noun) – Overload
White (adjective) + wash (verb) – Whitewash
Black (adjective) + board (noun) – Blackboard
Cup (noun) + board (noun) – Cupboard
Short (adjective) + hand (noun) – Shorthand
Swimming (present participle) + pool (noun) – Swimming pool
Three (adjective) + legged (past participle) – Three-legged
Break (verb) + down (preposition) – Breakdown
Up (preposition) + town (noun) – Uptown
Copy (verb) + writer (noun) – Copywriter
Sun (noun) + rise (verb) – Sunrise
Count (verb) + down (preposition) – Countdown
Flash (verb) + mob (noun) – Flash mob
Master (noun) + piece (noun) – Masterpiece
Round (adjective) + table (noun) – Round-table
There are three types of compound words:

a. Open compounding: When there is a space between the two elements.
ice cream
peanut butter
first aid
b. Hyphenated compounding: When there is a hyphen between the two compounded elements.
never-ending
in-depth
left-handed
c. Closed compounding: When the two elements are written together.
bookcase
fishbowl
fingerprint
[The plural forms of compound nouns are generated in two ways. First, compound nouns pluralized by adding the inflection -s to the end of the word: a bookcase → two bookcases; a post office → two post offices. Second, in the case of compound nouns that are made from a noun and an adverb, the first part (the noun) becomes plural: a passer-by → several passers-by; a listener-in → several listeners-in.]
Loan-word

Borrowing, or loan word, refers to the process where a foreign word is used in the language without being translated. The English language has adopted a large number of words from other languages. Remember that the word does not lose its meaning in the target language.

There are two different types of loan words.

a. Foreign words with the same spelling. This happens when a word is borrowed from a foreign language and its orthography stays the same.

ballet from French to English
patio from Spanish to English
hamster from German to English
cookie from Dutch to English
yogurt from Turkish to English
b. Foreign words with different spelling. In this case, the orthography of the word changes in the target language.

chauffeur from French to Spanish chofer
football from English to Hungarian futbal
check from English to Finnish šekki

Conversion

This word-formation process is also called zero derivation and happens when we create a new word with another part of speech without changing its form. For example:

green (noun) → green (adjective)

The adjective 'green' is derived from the noun 'green', which means a grassy area.

cheat (verb) → cheat (noun)

Conversion happens when a word changes from one word class to another. For instance, the verb to google is formed from the noun Google, and the noun read (as in a good read) is formed from the verb to read. For example:

I emailed this document to John. (emailed is a verb formed from the noun email)

He was bullied at school as a child. (bullied is a verb formed from the noun bully)

Conversion: Types

As you can see below, there are four types of conversion:

‭phrasal verb to noun‬

‭to print out → a printout ‬

‭to take over → a takeover ‬

‭verb to noun‬

must (verb) → must (noun)

As you can see, 'must' is mainly used as a verb, but it is trendy to use 'must' as a noun.

hit (verb) → hit (noun)

‭noun to verb‬

‭bottle (noun) → to bottle ‬

‭The noun 'bottle' is used as a verb these days.‬

‭mail (noun) → to mail ‬

‭adjective to verb‬

‭clean (adjective) → clean (verb) ‬

‭empty (adjective) → empty (verb) ‬

Coinage / Neologism

Coinage is a type of word-formation process in which a new word is created, either by inventing a completely new word or by adapting an existing word in a new way. This can happen because of advances in technology, movies, literature, music and popular culture. For instance:

google

teflon

aspirin

Warning!

Words that are formed by coinage are usually written in lowercase letters when they are used in context, but when we want to refer to the source of the word, it becomes a proper noun and has to be capitalized.

This basically means coming up with a completely new word without any of the processes above. Some examples are:

puzzle

bash

gimmick

gadget
Semantic Relations: Componential Analysis, Prototypes

There are several different kinds of semantic relations: terms that are semantically related to a given term. These may be found in a thesaurus.

Synonymy
Each listed synonym in a thesaurus denotes the same as the entry. The equivalence may be less than perfect, but should pass the practical test that "often when people say X they are referring to a thing that term Y also often refers to." For example:
pretty and attractive
sick and ill

Antonymy
Each listed antonym in a thesaurus denotes the opposite of the entry. Two words are said to be antonyms when they mean the opposite. For example:
up and down
dead and alive
parent and child

Hypernymy
Each listed hypernym is superordinate to this entry. This entry's referent is (one of) the kind(s) of things each hypernym refers to. For example:
animal is a hypernym of mammal (mammals are animals)
mammal is a hypernym of dog (dogs are mammals)
flower is a hypernym of tulip
red is a hypernym of scarlet, vermilion, carmine and crimson

Hyponymy
Each listed hyponym is subordinate to this entry. Each hyponym refers to a specific kind of the thing described by this entry. For example:
dog is a hyponym of mammal (dogs are among the various animals which are mammals)
mammal is a hyponym of animal
tulip is a hyponym of flower
scarlet, vermilion, carmine and crimson are hyponyms of red
Meronymy
Each listed meronym denotes part of this entry's referent. For example:
bark is a meronym of tree (bark is part of what makes up a tree)
tree is a meronym of forest
elbow is a meronym of arm
arm is a meronym of body
Holonymy
Each listed holonym has this entry's referent as a part of itself; this entry's referent is part of each listed holonym. For example:
forest is a holonym of tree (forests contain trees)
tree is a holonym of bark
body is a holonym of arm
arm is a holonym of elbow
Troponymy
Each listed troponym denotes a particular way to do this entry's referent. For example:
to trim and to slice are troponyms of to cut
to slide and to spin are troponyms of to move
to snack and to nibble are troponyms of to eat

Componential Analysis

Many different theories have been proposed for representing components of lexical meaning. All of them aim to develop a formal representation of meaning components which will allow us to account for semantic properties of words, such as their sense relations, and perhaps some syntactic properties as well.
One very influential approach during the middle of the 20th century was to treat word meanings as bundles of distinctive semantic features, in much the same way that phonemes are defined in terms of distinctive phonetic/phonological features. This approach is sometimes referred to as componential analysis of meaning. Some of the motivation for this approach can be seen in a famous example from Hjelmslev, which makes it clear that the feature of gender is an aspect of meaning that distinguishes many pairs of lexical items within certain semantic domains.

Features like gender and adulthood are binary, and so lend themselves to representation in
‭either‬‭tree or matrix format‬‭, as illustrated below.‬‭Notice that‬‭in addition to the values + and –,‬
‭features may be unspecified (represented by‬‭⌀‬‭). For‬‭example, the word ‭f‬oal‬‭is unspecified for‬
‭gender, and the word ‬‭horse‬‭is unspecified for both‬‭age and gender‬‭.‬
‭Binary feature analysis for horse terms:‬

‭Binary feature analysis for human terms:‬


‭Componential analysis provides neat explanations for some sense relations:‬
Synonyms: Synonymous senses can be represented as pairs that share all the same
components of meaning.
‭Antonyms‬‭are perfectly modelled by binary features:‬‭the two elements differ only in the‬
‭polarity for one feature, e.g. [+/– alive], [+/– awake], [+/– possible], [+/– legal], etc.‬
‭Hyperonyms & Hyponyms‬‭: The semantic components of‬‭a hyperonym (e.g. ‭c‬ hild‬‭[+human,‬
‭–adult]) are a proper subset of the semantic components of its hyponyms (e.g. ‬‭boy‬‭[+human,‬
–adult, +male]; girl [+human, –adult, –male]). In other words, each hyponym contains all
‭the semantic components of the hyperonym plus at least one more; and these “extra”‬
‭components are the ones that distinguish the meanings of taxonomic sisters.‬
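The subset relationship between a hyperonym's features and its hyponyms' features can be checked mechanically. The sketch below uses a hypothetical mini-lexicon of binary feature bundles; an unspecified feature (the ⌀ of the text) is simply left out of the dict.

```python
# Binary-feature sketch of componential analysis.
# Feature bundles map a feature name to True (+) or False (-);
# the entries below are illustrative, not a real feature inventory.

LEXICON = {
    "child": {"human": True, "adult": False},
    "boy":   {"human": True, "adult": False, "male": True},
    "girl":  {"human": True, "adult": False, "male": False},
    "man":   {"human": True, "adult": True,  "male": True},
}

def is_hyponym(word, hyperonym):
    """word is a hyponym of hyperonym if the hyperonym's (feature, value)
    pairs are a proper subset of the word's pairs."""
    return LEXICON[hyperonym].items() < LEXICON[word].items()

def taxonomic_sisters(a, b):
    """Sisters share all features but differ in the polarity of exactly one."""
    fa, fb = LEXICON[a], LEXICON[b]
    return fa.keys() == fb.keys() and sum(fa[k] != fb[k] for k in fa) == 1

print(is_hyponym("boy", "child"))        # boy's features include all of child's
print(taxonomic_sisters("boy", "girl"))  # differ only in [male]
```

The proper-subset comparison on `dict.items()` directly implements the definition above: the hyponym carries all the hyperonym's components plus at least one more.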
‭On the other hand, it is not so easy to define meronyms in this way. Moreover, while many of‬
‭the benefits of this kind of componential analysis are shared by other approaches, a number‬
of problems have been pointed out which are specific to the binary feature approach, such as
the fact that there are many lexical distinctions which do not seem to be easily expressible in
‭terms of binary features, at least not in any plausible way. Species names, for example, are a‬
‭well-known challenge to this approach. What features distinguish members of the cat family‬
‭(‬‭lion, tiger, leopard, jaguar, cougar, wildcat, lynx,‬‭cheetah‬‭, etc.) from each other? Similar‬
‭issues arise with colour terms, types of metal, etc. In order to deal with such cases, it seems‬
‭that the number of features would need to be almost as great as the number of lexical items.‬

‭An alternative explanation for Componential Analysis‬


It is the analysis of words through structured sets of semantic features, which are given as 'present',
'absent', or 'indifferent with reference to the feature'. The method thus differs from the method of
‭compositionality.‬‭Words can be defined in terms of‬‭their semantic components which usually come‬
‭in pairs called ‘semantic oppositions’. Ex: Up and Down. This is represented in componential analysis‬
‭using the symbol + (if the feature is present) and – (if the feature is absent).‬

Composition   Powered   Carries People   4 wheeler   Petrol   2 wheeler
Car              +          +                +          +         -
Bus              +          +                +          +         -
Van              +          +                +          +         -
Bicycle          -          +                -          -         +
Train            +          +                -          +         -
Plane            +          +                -          +         -
Ambulance        +          +                +          +         -

Componential analysis provides insight into the meaning of words, and a way to study the
meaning-related relationships between words.
‭Prototype‬‭:‬

I‭ n everyday discourse‬‭, the term ‘prototype’ r‬‭efers‬‭to an engineer’s model which, after testing and‬
‭possible improvement, may then go into mass production. In linguistics and in cognitive science more‬
‭generally, the term has acquired a specialized sense, although the idea of a basic unit, from which‬
‭other examples can be derived, may still be discerned. The term, namely, refers to the best, most‬
‭typical, or most central member of a category. Things belong in the category by virtue of their‬
‭sharing of commonalities with the prototype.‬

In semantics the term 'prototype' refers to a mental representation of a typical example or the most
representative instance of a category. A prototype gives listeners a way to understand a concept. For
example, within the category "dog", listeners would categorise "Great Dane", "Beagle", etc.

A prototype in semantic relations is a member, or a set of members, of a group that best represents the
group as a whole. An example that people readily recognize as representative of a group is a prototype.
For example, 'chair' is the prototype for furniture.

‭Prototype Analysis‬
Prototype theory is a theory of categorization in cognitive science, particularly in psychology
‭and ‬‭cognitive linguistics‬‭, in which there is a graded‬‭degree of belonging to a conceptual category,‬
‭and some members are more central than others. It emerged in 1971 with the work of‬
psychologist Eleanor Rosch, and it has been described as a "Copernican Revolution" in the
‭theory of categorization for its departure from the traditional ‬‭Aristotelian categories‬‭. ‬
I‭n this prototype theory, any given ‭c‬ oncept‬‭in any‬‭given language has a real world example that‬
‭best represents this concept. For example: when asked to give an example of the‬
concept furniture, a couch is more frequently cited than, say, a wardrobe. Prototype theory has
‭also been applied in ‭l‬inguistics‬‭, as part of the mapping‬‭from ‬‭phonological structure‬‭to ‬‭semantics‬‭.‬
Rosch and others developed prototype theory as a response to, and radical departure from, the
‭classical theory of concepts, which defines concepts by features. Rather than defining concepts‬
‭by features, the prototype theory defines categories based on either a specific artifact of that‬
‭category or by a set of entities within the category that represent a prototypical member. The‬
‭prototype of a category can be understood in lay terms by the object or member of a class most‬
‭often associated with that class.‬‭The prototype is‬‭the center of the class, with all other members‬
‭moving progressively further from the prototype, which leads to the gradation of categories.‬
‭Every member of the class is not equally central in human cognition.‬‭As in the example‬
‭of ‬‭furniture‬‭above, ‭c‬ ouch‬‭is more central than ‬‭wardrobe‬‭.‬‭Contrary to the classical view,‬
‭prototypes and gradations lead to an understanding of category membership not as an‬
‭all-or-nothing approach, but as more of a web of interlocking categories which overlap.‬
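Graded membership can be sketched by scoring each exemplar against the prototype: the more prototype features an exemplar shares, the more central it is. The features and exemplars below are purely illustrative, not drawn from Rosch's experiments.

```python
# Toy illustration of graded category membership (hypothetical features).

FURNITURE_PROTOTYPE = {"has_legs", "supports_person", "household", "movable"}

EXEMPLARS = {
    "chair":    {"has_legs", "supports_person", "household", "movable"},
    "couch":    {"has_legs", "supports_person", "household"},
    "wardrobe": {"household"},
}

def centrality(item):
    """Share of prototype features the item exhibits (1.0 = maximally central)."""
    return len(EXEMPLARS[item] & FURNITURE_PROTOTYPE) / len(FURNITURE_PROTOTYPE)

# Members form a gradient from the prototype outwards, not an all-or-nothing set.
ranked = sorted(EXEMPLARS, key=centrality, reverse=True)
print(ranked)   # chair is most central, wardrobe least
```

The ranking, not any fixed boundary, is the point: membership is a matter of degree, exactly the contrast with the classical Aristotelian view described above.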
‭Implication, Entailment and Presupposition‬
I‭ n a theory of meaning based on the‬‭distinction between‬‭semantics and pragmatics‬‭, it is‬
‭crucial to distinguish between‬‭what a sentence means‬‭and what a speaker intends to convey‬
‭by uttering it.‬‭Since Paul Grice’s seminal work, the‬‭distinction between ‭s‬ entence‬‭or ‭l‬iteral‬
‭meaning‬‭and ‬‭speaker meaning‬‭is quite uncontroversial‬‭among both linguists and philosophers‬
‭of language.‬‭With ‭s‬ entence meaning‬‭, one refers to‬‭the content encoded in the words a speaker‬
‭uses to utter a grammatically correct sentence, while ‬‭speaker‬‭meaning‬‭is the content the‬
‭speaker intends to convey by uttering a sentence in order to achieve a certain goal‬‭. Sentence‬
‭meaning and speaker meaning‬‭coincide‬‭whenever the‬‭speaker intends to convey exactly the‬
‭content encoded in the words used to utter a sentence. However, language users often intend‬
‭to‬‭convey contents that differ‬‭from what the sentences‬‭they utter literally mean.‬
As one of the basic forms of reasoning, inference can in general be defined as a process of
‭accepting a statement or proposition (called the conclusion) on the basis of the acceptance of‬
‭one or more other statements or propositions. It includes entailment, presupposition and‬
‭implicature‬‭. (Bublitz & Norrick)‬
When one reads or hears pieces of language, one normally tries to understand not only what
the words mean, but what the writer or speaker of those words intends to convey. One of the
‭principal difficulties that one faces when dealing with aspects of language is how to‬
‭distinguish between presupposition and entailment‬‭.‬‭These two concepts are described and‬
‭examined for the reason that they seem to provide the basis for answering a number of‬
‭questions both about speaker commitment and sentence meaning. As a matter of fact,‬
‭presupposition is what the speaker assumes to be the case prior to making an utterance‬
‭whereas entailment is what logically follows from what is asserted in the sentence.‬
Presupposition plays an important role in the production and comprehension of speech acts. It
is defined from different points of view, each of which is similar to the others in some way or
‭the other. The German mathematician‬‭Gottlob Frege‬‭is generally recognised as the first‬
‭scholar in modern times who re-introduced the philosophical study of presupposition though‬
‭the notion of presupposition goes back centuries.‬‭Yule defines presupposition as an‬
‭assumption by a speaker/writer about what is true or already known by the listener/reader.‬
‭Therefore speakers, not sentences have presupposition. For example, in “Your brother is‬
‭waiting outside” there is an obvious presupposition that you have a brother‬‭. In the sentence‬
‭“‬‭John’s brother bought three horses‬‭” the speaker will‬‭normally be expected to have the‬
‭presupposition that‬‭a person called John exists and‬‭he has a brother‬‭. Further the speaker may‬
‭also hold‬‭more specific presupposition that John’s‬‭brother has a lot of money‬‭.‬‭Hudson states‬
‭that a presupposition is something assumed or presupposed to be true in a sentence which‬
‭asserts other information.‬‭In the following example,‬‭the sentence (a) presupposes sentence‬
‭(b).‬
(‭ a)‬ ‭The child has sneezed again.‬
‭(b)‬‭The child has sneezed before.‬
There are six types of presupposition:
1. Existential presupposition: it is assumed to be present either in possessive
‭constructions‬‭(‬‭your car presupposes you have a car)‬‭or in definite noun phrases like‬
t‭he King of Sweden‬‭, the cat etc in which the‬‭speaker presupposes the existence of the‬
‭entities named.‬
2. Factive presupposition: some words used in sentences denote facts, such as
know, regret, realize, glad, odd and aware. ("I wasn't aware that she got
a job" presupposes that she got a job.)
‭3.‬ ‭Non-factive presupposition: which is assumed not to be true. Verbs like‬‭dream,‬
‭imagine‬‭and‬‭pretend‬‭are used with the presupposition‬‭that what follows is not true.‬
‭For example, “‬‭John dreamed that he was rich” presupposes‬‭that John is not rich‬‭.‬
‭4.‬ ‭Lexical presupposition: such as‬‭manage, stop‬‭and‬‭start‬‭.‬‭In this type, the use of one‬
‭form with its asserted meaning is conventionally interpreted with the presupposition‬
‭that another (non-asserted) meaning is understood. When one says “He stopped‬
‭smoking” it presupposes that he used to smoke‬‭.‬
‭5.‬ ‭Structural presupposition: in this case certain sentence structures have been analysed‬
‭as conventionally presupposing that part of the structure is assumed to be true‬‭. One‬
‭might say that speaker can use such structures to treat information as presupposed and‬
‭hence to be accepted as true by the listeners.‬‭For‬‭instance, “When did John leave?”‬
‭presupposes John left.‬
‭6.‬ ‭Counter-factual presupposition: in which what is presupposed is not only true but is‬
‭the opposite of what is true, or contrary to facts. For example, “If I were rich I would‬
‭buy a car” presupposes I’m not rich.‬
‭Entailment‬
Entailment is a term derived from formal logic and now often used as part of the study of
‭semantics. All the other essential semantic relations like equivalence and contradiction‬
‭can be defined in terms of entailment.‬‭Crystal defines‬‭it as “a term that refers to a relation‬
‭between a pair of sentences such that the truth of the second sentence necessarily follows‬
‭from the truth of the first.” For example: I can see a dog entails “I can see an animal.”‬
‭One cannot assert the first and deny the second.‬
Back and Ellege define entailment as a logical relationship between two propositions,
‭where if one is true then the other must also be true. For example, “Mary married John”‬
entails "Mary got married". There are two types of entailment: (1) background entailment
and (2) foreground entailment. On a given occasion, one sentence can have a number of
background entailments but only one foreground entailment. For example:
‭Bob chased three rabbits‬
The speaker is necessarily committed to the truth of a large number of background
‭entailments, some being:‬
1. Someone chased three rabbits
‭2.‬ ‭Bob did something to three rabbits‬
‭3.‬ ‭Bob chased three of something‬
‭4.‬ ‭Something happened‬
(‭ 2) Foreground entailment is defined by stress. It is more important for interpreting‬
‭intended meaning. For example:‬
‭a. Bob chased THREE rabbits‬
‭b. BOB chased three rabbits,‬
i‭n (a) the speaker indicated Bob chased a certain number of rabbits, while in (b) the focus‬
‭shifts to Bob. This is foreground entailment.‬
To conclude, presupposition is what the speaker assumes to be the case prior to making an
utterance. Therefore, speakers, not sentences, have presuppositions, whereas entailment is
what logically follows from what is asserted in the utterance. Sentences, not speakers, have
entailments.
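Crystal's definition ("the truth of the second sentence necessarily follows from the truth of the first") can be sketched as truth preservation across situations. The situations below form a hypothetical toy model in which sentences are reduced to atomic facts; the result of course depends on which situations are considered.

```python
# Entailment as truth preservation over a (hypothetical) set of situations.
# Each situation is simply the set of atomic facts that hold in it.

situations = [
    {"mary_married_john", "mary_got_married"},   # Mary married John
    {"mary_got_married"},                        # Mary married someone else
    set(),                                       # nobody got married
]

def true_in(sentence, situation):
    return sentence in situation

def entails(s1, s2):
    """s1 entails s2 iff s2 is true in every situation where s1 is true."""
    return all(true_in(s2, w) for w in situations if true_in(s1, w))

print(entails("mary_married_john", "mary_got_married"))  # True
print(entails("mary_got_married", "mary_married_john"))  # False
```

One cannot assert the first and deny the second, which is exactly what the `all(...)` check enforces; the failed converse shows that entailment is directional.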
‭Implicature‬
I‭ n ‬‭pragmatics‬‭, a subdiscipline of ‬‭linguistics‬‭,‬‭an ‭i‬mplicature‬‭is‬‭something the speaker‬
‭suggests or implies with an ‬‭utterance‬‭, even though‬‭it is not literally expressed. Implicatures‬
‭can aid in communicating more efficiently than by explicitly saying everything we want to‬
‭communicate. The philosopher ‭H ‬ . P. Grice‬‭coined the‬‭term in 1975‬‭. Grice‬
‭distinguished ‬‭conversational‬‭implicatures, which arise‬‭because speakers are expected to‬
‭respect general rules of conversation, and ‭c‬ onventional‬‭ones,‬‭which are tied to certain words‬
‭such as "but" or "therefore". For example the following exchange:‬
‭A: I am out of gas.‬
‭B: There is a gas station 'round the corner.‬
Here, B does not say, but conversationally implicates, that you can get gas at the gas station
round the corner, as it is open.
An example of a conventional implicature is "Donovan is poor but happy", where the word
"but" implicates a sense of contrast between being poor and being happy.

‭Here is another example:‬


Alan: Are you going to Paul's party?
‭Barb: I have to work.‬
I‭ f this was a typical exchange, Barb ‬‭meant‬‭that she‬‭is not going to Paul’s party by ‭s‬ aying‬‭that‬
‭she has to work. She did not say that she is not going to Paul’s party, and the sentence she‬
‭uttered does not mean that. Grice introduced the technical terms ‭i‬mplicate‬‭and ‬‭implicature‬‭for‬
‭the case in which what the speaker said is distinct from what the speaker thereby meant or‬
‭implied. Thus Barb implicated that she is not going; that she is not going was her implicature.‬
‭Competent speakers will grasp immediately that Barb meant both that she has to work and‬
‭that she is not going to Paul’s party.‬‭‬
All speech acts have to be inferred from contextual evidence, including what was said and
‭what words were uttered. Whether there is any significant difference in the kind of inference‬
‭required to recognize an implicature is a matter of some debate, and may depend on the type‬
‭of implicature.‬‭Competent hearers recognize and language‬‭learners acquire the ability to‬
‭recognize what a speaker implicates‬‭. Implicature is‬‭the subject of intensive experimentation‬
‭and theorizing in the‬‭field of pragmatics‬‭.‬
‭Semantic Theories: Sense and Reference, Connotation and denotation, Extension and Intension‬

‭Sense and Reference‬

Sense and Reference are two semantic terms introduced by the German mathematician and
philosopher Gottlob Frege. The reference (or "referent"; Bedeutung) of a proper name is
‭the object it means or indicates (‬‭bedeuten‬‭), whereas‬‭its ‭s‬ ense‬‭(‬‭Sinn‬‭) is what the name‬
‭expresses. The reference of a ‬‭sentence‬‭is its ‭t‬ruth‬‭value‬‭, whereas its sense is the thought that‬
‭it expresses. The sense is a 'mode of presentation', which serves to illuminate only a single‬
‭aspect of the referent.‬

S‭ ense relates to the complex system or relationships that hold between the linguistic elements‬
‭themselves (mostly the words); it is concerned only with intra linguistic relations. Ex: cow/bull,‬
‭mare/stallion, etc. are sets of words that are related in terms of gender‬‭. Other kinds of sense‬
‭relationships between words can be seen in duck/duckling, pig/piglet, father/son, narrow/wide,‬
‭buy/sell‬
A dictionary is usually concerned with sense relations, relating words to words: an unknown word to a
word whose reference is already understood.
Reference deals with the relationship between linguistic elements (words, sentences, etc.) and the
world in a particular context. The sense of an expression is the way in which the expression presents
its referent. For example, the ancients used 'the morning star' and 'the evening star' to designate what
‭turned out to be the same heavenly body, the planet‬‭Venus.‬‭These two expressions have the same‬
‭reference, but they clearly differ in that each presents that reference in a different way. So, although‬
‭coreferential, each expression is associated with a different ‘sense.’‬
When we study meaning we should distinguish between sense and reference. The object in the real
world is the referent; the concept we have of it in our minds is the reference. The symbol we use to refer
‭to them is the word. The meanings of words are given in terms of their relationships‬‭. For example,‬
‭when we say jasmine we immediately think that it is a kind of flower. Thus‬‭we explain the meaning of‬
‭a word (sense) by using other words (reference).‬‭The‬‭relationship between one word with another is‬
‭called a sense relation.‬

‭Connotation and Denotation‬

T‭ he denotation of a word is its explicit or direct meaning. It refers to the literal meaning of a word,‬
‭the ‘dictionary meaning.’ For example, if you look up the word snake in a dictionary, you will discover‬
‭that one of its denotative meanings is “any of numerous scaly, legless, sometimes venomous reptiles‬
‭having a long, tapering, cylindrical body and found in most tropical and temperate regions.”‬
‭The connotation of a word or phrase is the associated or secondary meaning. It can be something‬
‭suggested or implied by a word or thing, rather than being explicitly named or described.‬
‭Connotation refers to the associations that are connected to a certain word or the emotional‬
‭suggestions related to that word. The connotative meanings of a word exist together with the‬
‭denotative meanings. The connotations for the word snake could include evil or danger.‬
‭The connotation of a word‬‭depends on cultural context‬‭and personal associations, but the‬
‭denotation of a word is its standardised meaning‬‭within‬‭the English language. For example, the‬
‭words‬‭home and house have similar denotations‬‭or primary‬‭meanings: a home is a “shelter that is‬
‭the usual residence of a person, family, or household,” and a house is “a building in which people‬
‭live.” However, for many, these terms carry different associations or secondary meanings, also known‬
‭as connotations. Many people would agree that‬‭home‬‭connotes a sense of belonging and comfort,‬
‭whereas house conveys little more than a structure‬‭.‬
In semantics connotation and denotation are two basic concepts. A word denotes no more than
what its bare definition states. For example, the word 'head' denotes a biological organ; on the other
hand, when we speak of 'the head of an institution' we mean the person at the helm of the affairs of
that institution. This is connotative meaning. 'Fire', apart from its denotative meaning of flame, can
also connote enthusiasm, passion, spirit, love, etc.

‭I‬‭ntension and Extension‬

I‭ntension and extension in logic are correlative words that indicate the reference of a term or‬
‭concept:‬‭“intension” indicates the internal content‬‭of a term or concept that constitutes its formal‬
‭definition; and “extension” indicates its range of applicability by naming the particular objects that it‬
denotes. For instance, the intension of 'ship' as a substantive is "vehicle for conveyance on water",
whereas its extension embraces such things as cargo ships, passenger ships, battleships and sailing
‭ships.‬‭The distinction between intension and extension‬‭is‬‭not the same as between connotation and‬
‭denotation.‬
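The ship example can be sketched by modelling an intension as a rule (a predicate over properties) and an extension as the set of domain objects that satisfy the rule. The domain and property labels below are hypothetical.

```python
# Intension vs. extension over a toy domain (all entries hypothetical).

DOMAIN = {
    "cargo ship":     {"vehicle", "conveys_on_water"},
    "passenger ship": {"vehicle", "conveys_on_water"},
    "battleship":     {"vehicle", "conveys_on_water"},
    "truck":          {"vehicle"},
}

def ship_intension(props):
    """'Vehicle for conveyance on water' expressed as a rule over properties."""
    return {"vehicle", "conveys_on_water"} <= props

def extension(intension, domain):
    """The extension: everything in the domain the intension applies to."""
    return {name for name, props in domain.items() if intension(props)}

print(sorted(extension(ship_intension, DOMAIN)))
# ['battleship', 'cargo ship', 'passenger ship']
```

The same intension yields a different extension over a different domain, which is why the two notions, though correlative, must be kept apart.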
Truth-Conditional Semantics: Propositions, Truth Values, Determining the
Semantic Value of a Proposition, Compositional Procedure, Terms and
Predicates, Predicate Logic, Possible Worlds Semantics
Truth-conditional semantics is an approach to the semantics of natural language that sees
‭meaning (or at least the meaning of assertions) as being the same as, or reducible to, their ‬‭truth‬
‭conditions‬‭. This approach to semantics is principally‬‭associated with ‭D ‬ onald Davidson‬‭.‬
Truth-conditional theories of semantics attempt to define the meaning of a given proposition by
explaining when the sentence is true: for example, 'snow is white' is true if and only
if snow is white. This corresponds to what is often called the 'literal meaning'.
For any grammatical declarative sentence, a native speaker has truth-conditional intuitions about
‭it, i.e. he or she knows in what situations it is true and in what situations it is false. Here is an‬
‭example illustrating the fact that you have truth-conditional intuitions.‬
‭(1)‬ ‭There is a circle in a square. If you are given a situation depicted below, you know that‬
‭the sentence is true.‬

[figure: a circle inside a square]

‭(2)‬ ‭On the other hand, in the following situation, the sentence is false.‬

Of course, this is not the only situation where the sentence in (1) is true, and there are infinitely
many situations where it is false. However, it is not difficult to see that the sentence in (1)
‭is true in any situation where there is a circle and there is a square and the circle is in the‬
‭square, and is false in any situations where there isn’t a circle in a square. We take this‬
‭to be a good characterization of the truth-condition of the sentence.‬

We take truth-conditional intuitions to be part of the native speaker's linguistic knowledge, on a
‭par with other linguistic intuitions like grammaticality judgments. One of the primary questions in‬
‭formal semantics is how the truth-conditional intuitions arise. What is particularly important here‬
‭is the fact that native speakers have truth-conditional intuitions about all grammatical declarative‬
‭sentences.‬
‭To every‬‭proposition‬‭corresponds a truth condition,‬‭displaying how things must be for that‬
‭proposition's truth. It is natural to take a proposition and its truth condition to be one and the‬
‭same entity, for that proposition is, by its very nature, true in just those situations set out by its‬
‭truth condition. If you know the truth condition of a sentence S, then you know a condition that‬
‭enables you to determine the truth value of S in any circumstance where you might evaluate it.‬
‭This condition will enable you to determine when S is true and when S is false, regardless of‬
‭where you happen to be evaluating it. To know the meaning of S is to know the conditions that‬
‭make it true: the truth conditions.‬
The idea is that a sentence is true or false only with respect to a particular way things are, a
particular model of reality. In some states of affairs the sentence is true, and in some others it
will be false. Such an alternative state of affairs is often called a possible world. Truth
‭values are relative to such possible worlds. Formal semantics and truth conditions offer a precise‬
‭framework for understanding the meaning of language expressions by analyzing their‬
‭relationship with truth values. This allows for clear analysis of language usage and interpretation,‬
‭aiding in communication, logical reasoning, and computational language processing.‬

Meaning of Sentences: A sentence can be true or false in a given situation or circumstance.


‭(1)‬ ‭The pope talked to Prince Williams between 3 and 4 pm on Feb. 5, 2005.‬
Although we may not know what the facts are, we know what they ought to be in order to judge
‭the sentence true (i.e., truth conditions). Conversely, even if we know what the facts are, we‬
‭cannot use these facts to evaluate whether the sentence is true, if we do not understand what‬
‭the sentence means. Thus, the truth condition is a necessary component of sentence meaning,‬
‭although it may not be a sufficient component.‬
How do we arrive at the truth conditions of sentences? Can we simply list all possible sentences
with the corresponding truth conditions? No, because a language contains an infinite number of
‭sentences. We need a theory of truth that allows us to generate all of the correspondingly infinite‬
‭number of truth conditions. This means that the mechanism for specifying truth conditions must‬
‭be based on a generative device that is similar to the way syntax is characterized. That is, the‬
‭truth conditions associated with a sentence must be specified compositionally, by looking at the‬
‭smaller units and the way they are combined. One approach to specifying the meaning of pieces‬
‭of languages is to treat those meanings as constructions out of‬‭possible worlds‬‭and possible‬
‭objects. This technique is useful both in logic and in providing the semantics of natural‬
‭languages.‬
Many philosophers argue that propositions are the primary bearers of truth and falsity. This is not
‭to deny that other entities or events—utterances, sentences, beliefs etc.—are also capable of‬
‭being true or false. But, by saying that propositions are the primary bearers of truth or falsity, one‬
‭is saying that the truth or falsity of an utterance or belief depends on the truth or falsity of the‬
‭proposition expressed by that utterance or on the proposition believed (what is believed).‬
Truth-Values: Recall that a native speaker knows the truth-condition of any given grammatical
‭declarative sentence. For instance, we already know the truth-condition of ‘John smokes’,‬
‭namely, it is true if John smokes. Notice that whether the sentence is true or false is relative to‬
‭some state of affairs. This makes sense because (most) sentences are not inherently true or‬
‭false. One can only say that a sentence is true or false in a given situation. In other words, in‬
order for you to determine the truth or falsity of the sentence 'John smokes', you have to inspect
‭the situation and know (i) who John is and (ii) whether that person smokes or not. We represent‬
‭information like this in a mathematical structure called a model. You don’t need to know the‬
‭details of this. All you have to know is that a model is a mathematical representation of a‬
‭situation or state of affairs that contains enough information for you to decide the truth or falsity of‬
‭the sentence. A model is often denoted by M (or M1, M2, etc.). A semantic theory that uses‬
‭models is called model theoretic semantics, and this is the framework we will adopt. If a model M‬
‭is given, you can say a given sentence S is true or false in the situation represented by M. Or‬
‭equivalently, we say S denotes truth or falsity in M, or the denotation of S is truth or falsity in M.‬
‭The truth and falsity, furthermore, are standardly denoted by 1 and 0, respectively, which are‬
‭called‬‭truth-values‬‭.‬
What is a truth value? A truth value is the attribute of a proposition as to
‭whether the proposition is true or false. For example, the truth value for "7 is odd" is true,‬
‭which can be denoted as T/1. The truth value of "1 + 1 = 3" is false, which can be denoted‬
‭as F/0.‬
I‭t is important to keep in mind the difference between truth-values and truth-conditions. Both are‬
‭in some sense ‘meanings’ of sentences, but truth-conditions are more abstract than truth-values.‬
Truth-values are denotations of sentences in particular models, which represent particular states
‭of affairs, and they are either 0 or 1. Truth-conditions are conditions that tell you in what kind of‬
‭situation the sentence denotes 0 and in what kind of situations the sentence denotes 1.‬
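The distinction can be sketched in a few lines: a truth-condition is a rule that, given any model, returns a truth-value (1 or 0). The two toy models below are hypothetical, each recording just enough facts to evaluate 'John smokes'.

```python
# Truth-conditions as functions from models to truth-values.
# M1 and M2 are hypothetical models: each lists who smokes in that situation.

M1 = {"smokes": {"John", "Mary"}}
M2 = {"smokes": set()}

def truth_condition(subject, predicate):
    """The truth-condition of a simple sentence: a rule that maps any
    model to a truth-value, 1 (true) or 0 (false)."""
    return lambda model: 1 if subject in model[predicate] else 0

john_smokes = truth_condition("John", "smokes")  # the sentence's truth-condition

print(john_smokes(M1), john_smokes(M2))  # 1 in M1, 0 in M2
```

Here `john_smokes` itself is the truth-condition (more abstract, model-independent), while `john_smokes(M1)` and `john_smokes(M2)` are truth-values (denotations in particular models).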
Compositional semantics is concerned with phrasal meanings: the ways in which individual
‭words come together and connect to make a broader, comprehensible sentence. It is contrasted‬
‭with lexical semantics, which focusses primarily on individual word meanings.‬
The Compositionality Principle: The meaning of a complex phrase is determined solely by the
‭meanings of its parts and their syntax. A semantic system obeying this principle can deal with‬
‭infinitely many expressions. Before explaining why, however, let’s consider some examples.‬
‭Recall that it is easy to know the truth condition of any grammatical declarative sentence. Let’s‬
‭take ‘A man saw Mary’. According to the Compositionality Principle, the truth-condition of this‬
‭sentence is determined only by the meanings of its parts, namely ‘a’, ‘man’, ‘saw’, and ‘Mary’,‬
‭and how these expressions are arranged. This sounds like a truism. However, it is not so hard to‬
‭imagine an artificial language whose semantics doesn’t obey the Compositionality Principle. For‬
‭example, take a more abstract ‘language’ in which sentences are sequences of mouse clicks.‬
‭Suppose that one click means ‘Select the file that the cursor is currently on’, two clicks means‬
‭‘Open the file the cursor is currently on’, and three clicks means ‘Delete the file the cursor is‬
‭currently on.’ The sentence with three clicks is syntactically decomposable into three individual‬
clicks or one click plus two clicks; but clearly, the meaning of three clicks cannot be analysed as
three instances of the meaning of one click, or as the meaning of one click combined with the
‭meaning of two clicks. In languages like these that do not obey the Compositionality Principle,‬
‭there is no general way of computing the meaning of a complex expression based on the‬
‭meanings of its parts. In other words, the meanings of complex expressions need to be‬
‭remembered. The point here is that natural languages are not like this, and obey the‬
‭Compositionality Principle.‬
I‭f natural language semantics obeys the Compositionality Principle, there must be a way to‬
‭combine any given two grammatical phrases. And if there are only a finite number of such ways‬
‭to combine meanings that will collectively cover all grammatical sentences, we have a finite‬
‭semantic system that computes the meanings of any grammatical sentence, by combining the‬
‭meanings of its parts. It should be stressed here that how the words are arranged, i.e. the‬
‭syntactic structure, has semantic consequences. The semantics is in a way dependent on the‬
‭syntax.‬
As is clear from the above discussion, our semantic system will be made up of two finite
‭components. They are the lexicon and the compositional rules. The lexicon is a list of the‬
‭meanings of words, or more precisely, of morphemes. The compositional rules are instructions‬
‭as to how to combine different meanings.‬
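A minimal sketch of such a two-component system, with a hypothetical toy lexicon and functional application as the single compositional rule; the denotations chosen for 'smokes' and 'likes' are illustrative stand-ins for a real model.

```python
# The two finite components: a lexicon of meanings and one compositional rule.
# The toy fragment below is hypothetical.

LEXICON = {
    "John":   "John",                                   # a name denotes an individual
    "Mary":   "Mary",
    "smokes": lambda x: x in {"John"},                  # a predicate denotes a function
    "likes":  lambda y: lambda x: (x, y) in {("John", "Mary")},
}

def apply(fn, arg):
    """The single compositional rule: functional application."""
    return fn(arg)

# 'John smokes': combine the meaning of 'smokes' with the meaning of 'John'.
print(apply(LEXICON["smokes"], LEXICON["John"]))

# 'John likes Mary': the syntax dictates the order of combination,
# first 'likes' + 'Mary', then the result + 'John'.
print(apply(apply(LEXICON["likes"], LEXICON["Mary"]), LEXICON["John"]))
```

With a finite lexicon and a finite set of rules, the meanings of infinitely many sentences can be computed rather than memorised, which is exactly the point of the Compositionality Principle.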
‭Predicate Logic‬
Predicate logic, first-order logic, or quantified logic is a formal language in which propositions are
‭expressed in terms of predicates, variables and quantifiers. It is different from propositional logic‬
‭which lacks quantifiers.‬
Predicate Logic is an extension of Propositional Logic, not a replacement. It retains the central
‭tenet of Propositional Logic: that sentences express propositions and propositions denote‬
‭truth-conditions.‬
‭ redicate logic, is a collection of ‬‭formal systems‬‭used‬‭in ‭m
P ‬ athematics‬‭, ‭p ‬ hilosophy‬‭, ‬‭linguistics‬‭,‬
‭and ‬‭computer science‬‭. Predicate logic uses ‬‭quantified‬‭variables‬‭over non-logical objects, and‬
‭allows the use of sentences that contain variables, so that rather than propositions such as‬
‭"Socrates is a man", one can have expressions in the form "there exists x such that x is Socrates‬
‭and x is a man", where "there exists‬‭"‭‬is a quantifier,‬‭while ‬‭x‬‭is a variable. This distinguishes it‬
‭from ‭p‬ ropositional logic‬‭, which does not use quantifiers‬‭or ‭r‬ elations‬‭; in this sense, propositional‬
‭logic is the foundation of first-order logic.‬
Predicate logic, also known as first-order logic, is a formal system in mathematics and logic in which functions, quantifiers, and predicates are used to express statements about objects. These elements allow for a more nuanced analysis of mathematical statements and logical expressions.
In essence, predicate logic extends the capabilities of propositional logic by dealing with expressions containing variables that can take on different values. This enables the discussion of specific properties of objects and the relationships between them. It plays a critical role in a variety of fields, from computer science to linguistics, by providing a foundation for specifying and reasoning about the properties of objects and the relationships between them.

Consider a statement like "All humans are mortal." In predicate logic, this can be expressed more formally as ∀x (Human(x) → Mortal(x)), where ∀ denotes the universal quantifier "for all", and → represents implication.

‭https://calcworkshop.com/logic/predicate-logic/‬
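A quantified formula like the one above can be checked mechanically over a finite domain. The sketch below evaluates ∀x (Human(x) → Mortal(x)) by testing the implication at every member of the domain; the domain and predicate extensions are invented for illustration.

```python
# Evaluating a universally quantified formula over a small finite model.
domain = {"socrates", "plato", "fido"}
human = {"socrates", "plato"}           # extension of Human
mortal = {"socrates", "plato", "fido"}  # extension of Mortal

def implies(p, q):
    """Material implication: p -> q is false only when p is true and q false."""
    return (not p) or q

# Universal quantification: the implication holds for every x in the domain.
all_humans_mortal = all(implies(x in human, x in mortal) for x in domain)
print(all_humans_mortal)  # True

# Existential counterpart: Ex (Human(x) & not Mortal(x)), the negation above.
exists_immortal_human = any(x in human and x not in mortal for x in domain)
print(exists_immortal_human)  # False
```

Note that this finite check is only an illustration of what the quantifiers mean; real first-order models may have infinite domains, where truth cannot be decided by exhaustive enumeration.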
Possible Worlds Semantics in Logic and Language
The expression “possible worlds semantics” was first used to describe “semantics” in the logician’s sense. In this sense, possible worlds semantics is a matter of associating with a given logic a model that contains worlds, and assignments, relative to those worlds, of the truth-values of sentences, extensions of predicates, and so on. The best-known application of possible worlds semantics is in the semantics of modal logics, usually attributed to Kripke.
Richard Montague introduced possible worlds into the systematic study of the semantics of natural language; his approach was to take a simplified fragment of English and treat it as logicians had treated their artificial formal languages.
What Are Possible Worlds? Possible worlds semantics relies on there being a domain of possible worlds, and usually things in those worlds to serve as the members of the sets associated with predicates in each world. So we have possible worlds and possible individuals. Does that mean that, in order to use possible worlds semantics, we need to think that there is an infinite range of alternate universes, full of merely possible individuals, including strange individuals like golden mountains and talking donkeys? Some have argued that this is indeed the best way to understand possible worlds and possible individuals (Lewis 1986).
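Setting the metaphysics aside, the model-theoretic machinery itself is simple. Below is a hedged sketch of possible-worlds evaluation in the Kripkean style: necessity is truth in all accessible worlds and possibility is truth in some. The worlds, the valuation and the accessibility relation are all invented for illustration.

```python
# A toy Kripke-style model: three worlds, one atomic sentence, and an
# accessibility relation in which every world sees every world.
worlds = {"w1", "w2", "w3"}
true_at = {
    "it_rains": {"w1", "w2"},  # worlds where the atomic sentence holds
}
access = {w: set(worlds) for w in worlds}

def necessarily(p, w):
    """Box p at w: p is true at every world accessible from w."""
    return all(v in true_at[p] for v in access[w])

def possibly(p, w):
    """Diamond p at w: p is true at some world accessible from w."""
    return any(v in true_at[p] for v in access[w])

print(possibly("it_rains", "w3"))     # True: it rains at some accessible world
print(necessarily("it_rains", "w3"))  # False: it fails at w3 itself
```

Different constraints on the accessibility relation yield different modal logics; the universal relation used here corresponds to an S5-like setting, chosen only for simplicity.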
Seminar: Katz, Jerrold J. and Jerry A. Fodor. “The Structure of a Semantic Theory.” Language. Vol. 39.2. April-June 1963. pp. 170-210.

Katz and Fodor, in their 1963 paper The Structure of a Semantic Theory, argued that a generative grammar should be thought of as a system of rules relating the sentences of a language, in their externalised form, to their meanings. To relate the grammar to meaning, they attempted to express meaning in a universal semantic representation, in the same way that sounds can be expressed in a universal phonetic representation.
The Katz-Fodor framework for a semantic component consists of two parts: a lexicon and a set of projection rules. The lexicon contains a list of all the lexical items of the language, with the semantic readings, syntactic and phonological information associated with each item, thus forming a repository of varied information about the lexical items the language is composed of. (Multiple senses and readings, e.g. in bachelor, have been at the centre of discussion in much of this work, with attempts at formalisms to capture anomaly and analyticity. This work, however, assumes that the lexicon has some specification for each reading of a lexical item, so that the readings can all be enumerated and called upon as needed.)
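The idea of enumerable readings can be made concrete. The sketch below lists readings of bachelor in the spirit of Katz and Fodor's semantic markers and distinguishers; the markers and the wording are paraphrased for illustration, not quoted from their entries.

```python
# A Katz-Fodor style lexical entry: each reading of "bachelor" carries
# semantic markers (general categories) and a distinguisher (the residue
# of meaning specific to that reading).
bachelor = [
    {"markers": ["Human", "Male", "Adult"],
     "distinguisher": "who has never married"},
    {"markers": ["Human", "Male", "Young"],
     "distinguisher": "young knight serving under the standard of another"},
    {"markers": ["Human"],
     "distinguisher": "holder of the first academic degree"},
    {"markers": ["Animal", "Male"],
     "distinguisher": "young fur seal without a mate"},
]

def readings_with(marker):
    """Enumerate the readings compatible with a required semantic marker,
    a toy stand-in for selection during projection."""
    return [r["distinguisher"] for r in bachelor if marker in r["markers"]]

print(len(bachelor))            # 4 readings are enumerated
print(readings_with("Animal"))  # only the fur-seal reading survives
```

In the full theory, selection restrictions contributed by the surrounding sentence filter these readings during projection, which is how the framework accounts for disambiguation in context.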
The projection rules, on the other hand, are a set of rules that contribute to the structure, organisation and semantic representation of a sentence. This is therefore the part of interpretation that is traceable to syntactic structure. The assumption here is that the projection rules are interpretive, i.e. they operate on the syntactic structures generated by the base rules and the transformations, in order to produce semantic readings.
The approach to projection rules in this framework is therefore as follows: the projection rules describe the contribution of syntactic structure to meaning, and the syntactic structure is generated by recursively applying a sequence of rules, i.e. first phrase structure rules and then transformations. Differences in structure or in semantic readings between sentences are due to differences in the sequences of rules and transformations that generate them, so two sentences with the same lexical items can differ in meaning only if different sequences of rules have been applied. These projection rules can be of Type I or Type II, depending on whether they are associated with phrase structure or with transformations.
To quote: "The projection problem: A full synchronic description of a natural language is a grammatical and semantic characterisation of that language [...] Hence, a semantic theory must be constructed to have whatever properties are demanded by its role in linguistic description. Since, however, the goals of such description are reasonably well understood and since, in comparison to semantics, the nature of grammar has been clearly articulated, we may expect that by studying the contribution that semantics will be required to make [...] constraints on a semantic theory. [...] Since a fluent speaker is able to use and understand any sentence drawn from the INFINITE set of sentences of his language, and since, at any time, he has only encountered a FINITE set of sentences, it follows that the speaker's knowledge of his language takes the form of rules which project the finite set of sentences he has fortuitously encountered to the infinite set of sentences of the language. A description of the language which adequately represents the speaker's linguistic knowledge must, accordingly, state these rules. The problem of formulating these rules we shall refer to as the projection problem."
TRANSFORMATIONAL GENERATIVE GRAMMAR
Transformational-generative grammar was introduced during the late 1950s when Noam Chomsky, a student of Zellig Harris, published Syntactic Structures (1957). A modified version appeared in 1965 with the publication of Aspects of the Theory of Syntax, and modifications still continue.
Chomsky postulated a syntactic base of language (called deep structure), which consists of a series of
phrase-structure rewrite rules, i.e., a series of (possibly universal) rules that generates the underlying
phrase-structure of a sentence, and a series of rules (called transformations) that act upon the
phrase-structure to form more complex sentences. The end result of a transformational-generative grammar
is a surface structure that, after the addition of words and pronunciations, is identical to an actual sentence of
a language. All languages have the same deep structure, but they differ from each other in surface structure
because of the application of different rules for transformations, pronunciation, and word insertion. Another
important distinction made in transformational-generative grammar is the difference between language
competence (the subconscious control of a linguistic system) and language performance (the speaker's actual
use of language).
Since 1957, there have been many changes in the descriptive apparatus of TGG. Common to all
versions is the view that some rules are transformational: that is, they change one structure into another
according to such prescribed conventions as moving, inserting, deleting, and replacing items. Transformations
link deep structure with surface structure. A typical transformation is the rule for forming questions, which requires that the normal subject-verb order be inverted, so that the surface structure of Can I see you later? differs in the order of its elements from that of I can see you later. The theory postulates that the two sentences have the same order in deep structure, but the question transformation changes the order to that of the surface structure.
As can be seen, there are two aspects to TG Grammar: Transformational and Generative. These two aspects are not logically dependent on each other, though the theory gains plausibility from the interaction of the two. The Generative aspect defines the aim, namely that the grammar should be able to generate all and only the correct sentences, and the Transformational aspect defines one of the means of achieving it. That is, a grammar which is able to generate all and only the correct sentences of the language by means of applying transformational rules is a transformational generative grammar.
The concept of the tree, generated by the Phrase Structure Rules, has persisted within the Chomskyan tradition too. This was very much like the immediate constituent structures of the Bloomfieldians, but now with each node categorially labelled NP, VP, etc. With the later concept of the deep structure, the nodes could be labelled more abstractly by such terms as Wh, embedding or subordination. Further refinements such as the X-bar (X′) syntax have taken phrase structure beyond the analytic model of immediate constituency.
In his Aspects of the Theory of Syntax (1965) Chomsky presented what is known as his ‘standard theory’, which added the concepts of deep structure and surface structure: deep or underlying forms which by transformation become the surface or observable sentences of a particular language. The ‘standard theory’ also distinguishes between a speaker’s competence (knowledge of a language) and performance (actual use of a language), Chomskyan grammar being concerned with competence, not performance.
COMPETENCE AND PERFORMANCE. Competence and performance is the distinction between a person's
knowledge of language (competence) and use of it (performance). Competence can be defined as the
speaker-hearer’s knowledge of his language which allows him to construct and understand an infinite number
of actual and potential grammatical sentences in his language. It is the set of principles (rules) which a person
must possess in order to be a speaker of the language. As such, competence is an ideal, which presupposes a
“completely homogeneous speech-community.” It is hypothesized as a psychological or mental property or
function and therefore cannot be directly observed. Performance is the translation of this knowledge into
utterances. It can be defined as the actual use of language in concrete situations and is concerned with
acceptability rather than grammaticality. Performance is hampered by the speaker’s lapses like slips of the
tongue, forgetfulness, false starts etc. Thus performance is an incomplete and inaccurate demonstration of what an individual knows about his or her language. If competence is abstract, performance is concrete. Errors occurring in linguistic performance are irrelevant to the study of linguistic competence, since performance does not directly mirror competence. The goal, and at the same time the challenge, for a linguist and for a child learning a language is to figure out the underlying system of rules of the speaker-hearer on the basis of his performance. Thus, a grammar of a language may be said to aim at a syntactic description of the ideal
speaker-hearer’s intrinsic competence, and if this grammar is explicit enough, it may be termed a generative
grammar. Hence, generative grammar is there to describe the knowledge of the language of a speaker-hearer,
which knowledge he utilizes in actual language use.
Deep structure (also known as deep grammar or D-structure) is the underlying syntactic structure—or
level—of a sentence. It is an abstract underlying structure that incorporates all the syntactic information
required for the interpretation of a given sentence. Surface structure incorporates all the syntactic features of a
sentence required to convert the sentence into a spoken or written version. In contrast to surface structure
(the outward form of a sentence), deep structure is an abstract representation that identifies the ways a
sentence can be analysed and interpreted. Deep structures are generated by phrase-structure rules, and
surface structures are derived from deep structures by a series of transformations.
In Syntactic Structures (1957) Chomsky’s theory of grammar had three components – 1. Phrase Structure Component (PS Component), 2. Transformational Component (TFL Component) and 3. Morphophonemic Component (MPH Component). Grammar is regarded as a machine in which a sentence is thought of as passing progressively through each of these components. The PS Component, by means of re-write rules, generates the kernel string. The TFL Component contains T-Rules, which transform the kernel string into the desired structure. The MPH Component converts this structure into a phonological form.
In Aspects of the Theory of Syntax (1965) he made revisions to his earlier theory, the most important
being the inclusion of a totally new Semantic Component to deal with meaning. The PS Component was
modified and renamed the Base Component. The TFL Component was retained as such. The MPH Component
was renamed the Phonological Component.

The Base Component contains the branching rules, or PS Rules, which generate the deep structure. The TFL Component converts, or transforms, the deep structure into the surface structure. The Phonological Component converts the surface structure into a string of sounds, which provides the basis for the transmission of the message. The Semantic Component combines the minimal meaning-bearing units into larger and larger units of meaning.
In 1986 Chomsky published Knowledge of Language, in which he replaced the term Competence with I-Language (Internalised Language) and Performance with E-Language (Externalised Language).
TRANSFORMATIONAL ASPECT
"Essentially, transformation is a method of stating how the structures of many sentences in languages
can be generated or explained formally as the result of specific transformations applied to certain basic
sentence structures.", says R. H. Robins. Thus the transformational component shows that all the sentences in
English are in fact formed out of certain elementary or basic structures. The basic sentences are called kernel
sentences.
A kernel sentence is the simple, declarative, assertive sentence in the active voice form, from which
other sentences (more complicated or complex lengthy sentences) can be derived by applying appropriate
T-rules. For example, an English sentence like John killed the snake is a kernel sentence consisting of an NP and a VP. It is a simple, declarative sentence in the active voice and in the affirmative. It can be transformed into different structures such as The snake was killed by John., John did not kill the snake., Did John kill the snake?, Who killed the snake?, What did John kill?, Didn’t John kill the snake?, The snake was not killed by John., What did John do?, By whom was the snake killed? etc. All these sentences have different appearances. However, we can see the underlying similarity among them. They are different only in their surface structures; they have the same deep structure. Kernel sentences are generated without the application of any optional transformations. Non-kernel sentences require the application of both optional and obligatory transformations, and they differ from one another in that a different selection of optional transformations is made.
As proposed in 1957, transformational rules were a means by which one kind of sentence (such as the
passive The work was done by the local men) could be derived from another kind (such as the active The local
men did the work). Any process governed by such rules was a transformation (in the following example the
passivisation-transformation) and any sentence resulting from such rules was a transform. Transformations link
deep structure with surface structure. The two sentences “The local men did the work” and “ The work was

done by the local men” have the same order in deep structure, but the passivisation-transformation
transforms this order to that in the surface structure.
In "Syntactic Structures" Chomsky handles the active-passive relationship by saying that if S1 is a grammatical sentence of the form NP1----Aux----V----NP2, then the corresponding string of the form
NP2----Aux + be + en----V----by + NP1
is also a grammatical sentence. In this way, the sentence "The door was opened by John" is the transform of the sentence in the active voice, "John opened the door".
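The structural change can be simulated directly on analysed strings. The sketch below applies the passive T-rule and then a toy spell-out step standing in for the morphophonemic component; the analysis into NP1, Aux, V and NP2 is supplied by hand here, whereas a real grammar would obtain it from the phrase-structure component.

```python
# Passivisation T-rule: NP1 - Aux - V - NP2  ->  NP2 - Aux + be + en - V - by + NP1.
def passivize(np1, aux, v, np2):
    """Apply the passive transformation to an analysed kernel string."""
    return [np2, aux, "be", "en", v, "by", np1]

# Toy morphophonemics: spell "past + be" as "was" and hop the "en" affix
# onto the following verb (a crude stand-in for affix hopping).
def spell_out(string):
    words, i = [], 0
    while i < len(string):
        if string[i] == "past" and i + 1 < len(string) and string[i + 1] == "be":
            words.append("was"); i += 2
        elif string[i] == "en" and i + 1 < len(string):
            words.append(string[i + 1] + "ed"); i += 2   # open -> opened
        else:
            words.append(string[i]); i += 1
    return " ".join(words)

kernel = ("John", "past", "open", "the door")
print(spell_out(passivize(*kernel)))  # the door was opened by John
```

Keeping the affixes abstract until spell-out mirrors the theory's division of labour: the T-rule rearranges the string, and the morphophonemic component turns the result into an actual sentence.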
There are, however, plenty of other transformations. One that occurs in English but is not paralleled in
most languages is that of `permutation'. That is, "Has Jim played the piano?" is a transformation of "Jim has played
the piano." This occurs with all the auxiliary verbs of English as "Is he coming?", "Can you go?", "Must I sleep?", etc.
If there is no auxiliary verb, the verb `do' has to be supplied to act as one:
He goes. - Does he go? They play. - Do they play? We came. - Did we come?
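The permutation and do-support just described can be sketched as a rule on word lists. Tense agreement (came becomes did ... come) is deliberately ignored here, so the sketch covers only the bare inversion pattern, and the auxiliary list is illustrative rather than exhaustive.

```python
# Question transformation: front the auxiliary if there is one; otherwise
# supply the dummy auxiliary "do" (do-support). Tense is not handled.
AUXILIARIES = {"has", "is", "can", "must", "will"}

def question(words):
    subject, rest = words[0], words[1:]
    if rest and rest[0] in AUXILIARIES:
        # Permutation: invert subject and auxiliary.
        return [rest[0], subject] + rest[1:]
    # Do-support: insert "do" to carry the inversion.
    return ["do", subject] + rest

print(" ".join(question(["Jim", "has", "played", "the", "piano"])))
# has Jim played the piano
print(" ".join(question(["they", "play"])))
# do they play
```

The point of the sketch is that one explicit rule covers both patterns, with do-support as the elsewhere case, just as the prose above describes.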
A different, and in some ways more important type of transformation is "relative transformation" which
involves more than one "kernel sentence". There is a sense in which one sentence can be regarded as being
part of another sentence, that one structure can be embedded into another. The sentence that is embedded
into another is known as the "constituent" and the sentence into which it is embedded as the "matrix". For
example, the sentence `The boy who was standing there ran away.' can be treated as a transformation of the two
sentences:
1. The boy ran away. and 2. The boy was standing there. Thus the relative transformation places the second
sentence after `boy' in the first and then replaces `the boy' in the second by `who'.
Transformational grammar can disambiguate structures where IC analysis and PS Grammar fail. For example, take the sentence `Visiting professors can be dangerous.' We want to distinguish here two senses: the action of visiting professors can be dangerous, and professors who visit can be dangerous. We can show this by the difference in the matrix and the constituent sentences as well as in the place of embedding. On the first meaning we have the kernel sentences: 1. (Something) can be dangerous. and 2. (Someone) visits professors. We then transform the second into `visiting professors' and insert it in the place of the NP. On the second meaning the kernel sentences will be: 1. Professors can be dangerous. 2. Professors visit. Here we must apply a transformation similar to the relative transformation, giving Professors who visit can be dangerous, and then a further transformation to give the required sentence by turning `who visit' into `visiting' and placing it before `professors'. Here we can see that the deep structures of the two apparently identical sentences are quite different. Thus, sentences that appear to be identical are often transforms from different kernels, and transformational analysis can disambiguate far more.
The whole course of the subsequent decades within and outside Chomsky’s formulations has involved
the progressive downgrading of the role of transformations by recourse to the introduction of the concept of
‘deep structure’. The term transformational, so frequent in earlier textbooks, has now almost disappeared and
the Chomskyan theory is now designated simply as generative linguistics.
GENERATIVE ASPECT:
‘Generative’ is a term borrowed from mathematics into linguistics by Chomsky. It means merely that
the grammar must be so designed that by following its rules and conventions we can produce all or any of the
possible sentences of the language. A generative grammar is thus one, which precisely specifies the
membership of the set of all the grammatical sentences in a language in question and therefore excludes all
the ungrammatical sentences. To `generate' is thus to `predict' what can be sentences of the language or to
`specify' precisely what are the possible sentences of the language. Thus a grammar should `generate', `specify',
and `predict' sentences such as: He plays the piano. but not * Plays the piano he. and * He the piano plays.
The `competence' and `performance' of a native speaker of a language are related to the TG
grammarians' interest not in the neutral text but in what is linguistically possible. According to the theory, the
native speaker of a language has `internalized a set of rules' which form the basis of his ability to speak and
understand his language. It is the knowledge of these rules that is the object of the linguist's attention, not the
actual sentences he produces.
Differences between Phrase Structure Grammar and TG Grammar
1. PS Grammar rewrites a symbol as a string, whereas in TG Grammar a string is transformed into
another string. e.g. PS Grammar – S ⭢ NP + VP
TG Grammar – NP1 + Aux + V + NP2 ⭢ NP2 + Aux + be + en + V + by + NP1
2. PS Grammar uses only PS Rules. In TG Grammar both PS Rules and T-Rules are used.
3. PS Rules merely expand elements whereas T-Rules rearrange, delete, add or substitute elements, thus
altering the deep structure.
4. PS Rules merely show the arrangement of constituents within a structure while T-Rules show the
relationship between the surface structure and the deep structure. In PS Grammar the PS Rules
generate the surface structure, while in TG Grammar the PS Rules generate the deep structure and
T-Rules transform it into the surface structure.
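The generative aspect, a grammar that produces all and only the strings its rules allow, can be sketched with a toy set of PS rewrite rules. The grammar and lexicon below are invented and far too small to be realistic (the toy even overgenerates strings like `the piano plays he', since it enforces no selectional restrictions), but it shows how a finite rule set specifies a definite set of sentences.

```python
import itertools

# Toy PS rewrite rules: each symbol maps to its possible expansions.
rules = {
    "S":  [["NP", "VP"]],
    "NP": [["he"], ["the", "piano"]],
    "VP": [["V", "NP"]],
    "V":  [["plays"]],
}

def generate(symbol):
    """Recursively expand a symbol into every terminal string it derives."""
    if symbol not in rules:            # terminal word
        return [[symbol]]
    strings = []
    for expansion in rules[symbol]:
        parts = [generate(s) for s in expansion]
        for combo in itertools.product(*parts):
            strings.append([w for part in combo for w in part])
    return strings

sentences = {" ".join(s) for s in generate("S")}
print("he plays the piano" in sentences)  # True: generated by the rules
print("plays the piano he" in sentences)  # False: excluded by the rules
```

In this sense the grammar `predicts' and `specifies' its sentences: membership in the generated set is fully determined by the rules, which is what the generative aim demands of a grammar of a whole language.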
Advantages of TG Grammar
1. While structural grammar is corpus bound and limits itself to the analysis of the gathered data, TGG
is generative in the sense that it attempts to explain the generation of sentences.
2. TG Grammar helps us to relate outwardly or superficially different sentences and distinguish
between superficially similar sentences. e.g.
1) Rama killed a snake - A snake was killed by Rama
2) John is easy to please - John is eager to please
3. TG Grammar can resolve ambiguities in sentences by making use of the concept of deep structure.
