Module 4
Morphology is the study of the internal structure of words: the systematic study of how morphemes combine to form words. The first linguists were primarily morphologists. Well-structured lists of morphological forms of Sumerian words are attested on clay tablets from Ancient Mesopotamia dating from around the 16th century BC. Morphology was also prominent in the writings of Panini (5th century BC), and in the Greek and Roman grammatical tradition. Until the 19th century, Western linguists often thought of grammar as consisting primarily of rules determining word structure.
A morpheme is the minimal meaning-bearing unit in the structure of language. Words are composed of one or more morphemes. For example, 'nation' consists of one morpheme. 'National' consists of two morphemes: nation and -al; 'nationalize' consists of three morphemes: nation, -al and -ize. 'Nationalization' consists of four morphemes: nation, -al, -iz(e) and -ation. 'Denationalization' consists of five morphemes: de-, nation, -al, -iz(e) and -ation; in other words, it is composed of five meaningful units or morphemes. More examples: sing-er-s, home-work, un-kind-ly. Morphemes combine to form words.
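The layered make-up of a word like denationalization can also be shown as data. The sketch below is purely illustrative: the morpheme list is typed in by hand from the analysis above, not computed by a morphological parser.

```python
# A toy illustration (not a real morphological parser): the morphemes of
# "denationalization" as listed in the text, with prefixes marked by a
# trailing hyphen and suffixes by a leading hyphen.
word = "denationalization"
morphemes = ["de-", "nation", "-al", "-ize", "-ation"]

prefixes = [m for m in morphemes if m.endswith("-")]
suffixes = [m for m in morphemes if m.startswith("-")]
roots = [m for m in morphemes if m not in prefixes and m not in suffixes]

print(f"{word}: {len(morphemes)} morphemes")
print("root:", roots, "prefixes:", prefixes, "suffixes:", suffixes)
```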
Some linguists have defined morphemes in terms of phonemes. They describe a morpheme as a meaningful phoneme, or a series of phonemes, which cannot be further divided without destroying or changing its meaning in a particular language. A morpheme may consist of a single phoneme, but it has to have some meaning. For example, the /z/ in sings is a morpheme but the /z/ in zeal is not: the /z/ in sings is both a phoneme and a morpheme because it marks the third-person singular verb in the present tense. A morpheme is the smallest unit of meaning in the grammatical system of a language.
There are two types of morphemes: free and bound. Free morphemes are those which can occur independently as words in a sentence, e.g., work, happy, boy. Bound morphemes are those which cannot occur independently, e.g., -ed, -ing, -s, un-, -al; they are always attached, or bound, to other morphemes: work + -ed = worked, un- + happy = unhappy, boy + -s = boys. Bound morphemes are also called affixes.
Allomorphs are variants of the same morpheme, i.e., morphs corresponding to the same morpheme; they have the same function but different forms. Some morphemes have only a single form in all contexts, e.g., -ing. But some morphemes are realised in variant forms. For example, the plural morpheme, realised as -s or -es in spelling, has three different phonetic realisations: /s/, /z/ and /iz/ in posts, dogs and bosses respectively. These variant forms of a morpheme are called allomorphs. Another example is the past tense morpheme, represented in spelling by -d or -ed. It has three variant phonetic realisations: /d/, /t/ and /id/ in saved, picked and wanted respectively.
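The choice among the plural allomorphs is governed by the final sound of the stem: /iz/ after sibilants, /s/ after other voiceless sounds, /z/ elsewhere. The following is a minimal sketch of that rule; the ASCII stand-ins for phonemes and the two symbol sets are simplified assumptions, not a full phoneme inventory.

```python
# Simplified sketch of the rule selecting the plural allomorph from the
# stem's final phoneme (phonemes written as illustrative ASCII stand-ins).
SIBILANTS = {"s", "z", "sh", "zh", "ch", "j"}
VOICELESS = {"p", "t", "k", "f", "th", "s", "sh", "ch"}

def plural_allomorph(final_phoneme: str) -> str:
    """Return the phonetic realisation of the plural morpheme -s/-es."""
    if final_phoneme in SIBILANTS:
        return "/iz/"          # bosses
    if final_phoneme in VOICELESS:
        return "/s/"           # posts
    return "/z/"               # dogs

print(plural_allomorph("s"))   # bosses -> /iz/
print(plural_allomorph("t"))   # posts  -> /s/
print(plural_allomorph("g"))   # dogs   -> /z/
```

The past tense allomorphs /id/, /t/ and /d/ follow the same pattern, conditioned by /t, d/ versus other voiceless versus voiced final sounds.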
Zero Morph
Some linguists recognise a zero morph where a morpheme is expected in a grammatical system but is not represented there. It is represented by the symbol Ø. The absence of a relative pronoun in the utterance "a letter I wrote" is an example of a zero morph: we expect "a letter that I wrote", but that is absent, though implied. Zero morphs can also be found in countable nouns that have the same form for the singular and the plural. The plural of sheep is sheep itself, though the plural of cow is cows; the word sheep has undergone zero plural modification. A further example is verbs that have the same form for present and past tense: in words like cut, hit and shut the present and the past tense are the same, i.e., they have undergone zero past tense modification.
A portmanteau morph is a phonological sequence that cannot be analyzed into smaller units in terms of form but has two or more distinct components in terms of meaning: a single morph that represents two or more underlying morphemes but cannot be divided neatly. For example, the verb 'crashed' can be separated into the morphemes crash and -ed, but a word like 'sang', which consists of the stem sing and a past tense marker (the changed vowel), cannot be so divided, though we know that in terms of meaning it is a combination of two morphemes. Another example: the word took is a portmanteau morph of two morphemes, take and the past tense -ed.
Inflection and Derivation; Level 1 and Level 2 Affixes in English; Ordering between Derivation and Inflection; + Boundary (Morpheme Level) and # Boundary (Word Level) in Affixation
Inflection
Inflections are affixes (i.e. bound morphemes that do not occur independently but only attached to other morphemes) used to inflect words to indicate grammatical relations. Unlike derivation, which forms new words, inflection changes only the form of a word and not its class (class is equivalent to a 'part of speech' in traditional grammar) or meaning, and hence does not create new words. For example, in laugh, laughs, laughing, laughed the -s, -ing and -ed are inflections. Inflected languages have inflections for number, gender, case, tense etc. Latin, for instance, has about 120 forms of the verb, beginning with amo, amas and amat. Most European languages have far more forms than English, which has only five different forms of the verb (except be), e.g., write, writes, writing, wrote, written. Many languages have several forms of the noun, while English nouns have just two forms, singular and plural, apart from the possessive forms.
Inflections do not form new words, but serve to indicate grammatical relations such as number, gender, tense etc. Thus the English inflectional suffix -s in 'girls' makes it plural in form, but girl and girls are not two separate words belonging to two separate word classes or with different meanings. That is why leaf and leaves, or write and writes, or run and ran are not given separate headwords in dictionaries. Derivational suffixes, on the other hand, create new words from the stem: from 'child' we get a new word 'childhood' by attaching the derivational suffix -hood to it. Inflectional suffixes occur at the end of words, not followed by other suffixes; they are closing morphemes. In English, inflections are all suffixes, not prefixes.
The following are the inflectional affixes in English.

Nouns:
1. Plural -s (e.g., girls)
2. Possessive -'s (e.g., girl's)

Verbs:
1. Present tense singular -s (e.g., laughs)
2. Past tense -ed (e.g., laughed)
3. Present participle -ing (e.g., driving)
4. Past participle -en (e.g., driven)

Adjectives:
1. Comparative -er (e.g., higher)
2. Superlative -est (e.g., highest)
Derivation
Derivation is the process by which new words are formed from existing words; e.g., from the stem boy we can derive a new word, boyhood, by adding the suffix -hood. Derivation creates separate words, so that leaflet, writer and rerun will figure as separate words in dictionaries. Derivational affixes can be suffixes or prefixes. They are of two types:
1. Class-changing derivational affixes
2. Class-maintaining derivational affixes
Class-changing derivational affixes change the class of the word. For example, when we add -ness to good (an adjective) it becomes goodness (a noun); by adding the affix -ness the adjective becomes a noun. Similarly, 'centralise' is a verb, but when the class-changing derivational affix -ation is added to it, it becomes the noun 'centralisation.'
Class-maintaining derivational affixes do not change the class of the stem. When we supply the derivational affix un- to the stem 'happy' we get a new word, 'unhappy.' Here 'happy' and 'unhappy' are both adjectives; no change of class takes place. Hence un- in this case is a class-maintaining derivational affix. -hood in 'childhood' and il- in 'illogical' are other examples of class-maintaining derivational affixes.
Derivation creates new words: logic, logical, illogical, illogicality, logician, etc. Derivation tends to affect the meaning of the word, while inflection tends to affect only its syntactic function. However, the boundary between derivation and inflection is often fuzzy and unclear.
The description here is aimed at examining the shape taken by a lexical category when it goes through the morphological processes of affixation, which culminate in the formation of a word. There are two levels. The proponents of the Lexical Morphology theory propose a morphological concept referred to as strata: a series of affixation morphemes which come packaged together in the word formation process. The morphemes are usually linearly arranged, so that both derivational and inflectional word formation processes occur in a series of strata or levels which are linked together. This concept provides that words are made up of the root, base and affixes, and that the affixes are organized into strata, such that certain affixes are added at level one and others at level two.
Level 1 affixes are closer to the root than Level 2 affixes, in what is known as the ordering of affixes. The import of this is that morphological rules apply in the lexicon in such a way that rules apply first to the root of the word, then outward to the subsequent layers of affixes. This results in a lexicon which is considered as having an internal structure and not just a list of words. Katamba (1989:258) likens this structure to an onion, 'with the root of the word as the core, and Level 1 as the inner layer, Level 2 as the outer layer and post-lexical phonology as the skin on the outside'.
A boundary is a formal device used in Chomsky & Halle (1968) to express a distinction between two types of affixes.
Example:
The derivation of the English words productívity and prodúctiveness runs as follows. In the case of productivity, we first put together pro-, duct, -ive and -ity (all Class I), and then we apply the stress rules (productívity). The derivation of productiveness is crucially different. First we put together pro-, duct and -ive, then we apply the stress rules (prodúctive), and only then do we have the chance to add the Class II affix -ness, giving prodúctiveness. Since the affixation of -ness takes place after the stress rules have applied, it is correctly predicted that -ness cannot affect the stress already assigned to productive.
The claim is that -ity is a morpheme-boundary or formative-boundary affix (i.e. +ity), and -ness a word-boundary affix (i.e. #ness).
The assumption that affixes are associated with different boundaries, viz. + (morpheme boundary) and # (word boundary), accounts for the fact that the English suffixes -ity and -ness behave differently with respect to a number of phonological rules.
It is worth observing that although the proponents of the lexical morphology theory generally agree that the lexicon is hierarchically structured, they are nonetheless not in agreement on the number of levels or strata involved. Mohanan (1982:8), for instance, advocates a four-level hierarchical structure. For his part, Kiparsky (1982:133) suggests three strata. Katamba (1989:259) argues in favour of two strata, namely Stratum 1, which involves derivations, and Stratum 2, which involves inflections.
Word Formation Techniques: Blending, Clipping, Back Formation, Acronyms, Echo Word Formation, Abbreviation etc.
As knowledge grows, so language grows with it. As languages evolve and develop all the time, new words emerge to help users communicate better. Dr. Johnson's dictionary of 1755 recorded about 48,000 headwords, whereas the Oxford dictionary today has explanations for 273,000 words. Though the former was the work of one man and the latter of a team assisted by technology, we can safely say that words keep getting added through various means every single day that a language continues to exist.
"English language is by far the richest, having the most extensive vocabulary of any in the world. This is partly due to historical factors, partly to 'the genius of the language' and its readiness to absorb words from foreign tongues, or to make new ones where existing terms are not adequate." (F. T. Wood)
Blending
Blending is a process in which the sound and the meaning of two different words are blended (combined) to form a new word. For example, 'telecast' is formed from television and broadcast. Such words are also known as portmanteau words, from the French word combining porter (to carry) and manteau (mantle). Lewis Carroll is credited with coining this use of the word in his Through the Looking-Glass, based on the fact that a portmanteau bag is one that opens into two equal parts. Blends are also called centaur words because they are coined by a combination of two words, just as a centaur is a combination of man and horse. Some examples are: smog (smoke + fog), brunch (breakfast + lunch) and motel (motor + hotel).
Abbreviation
An abbreviation is a shortened form of a word or phrase that is used to represent the full form of the word or phrase. Here are some examples:
USA (United States of America)
CEO (Chief Executive Officer)
UNESCO (United Nations Educational, Scientific and Cultural Organization)
km (kilometer)
St. (street)
There are three types of abbreviations.
a. Acronyms: When we put together the initial letters of a set of words, or separate the initial letters of a word, we make an acronym. An acronym is always pronounced as a single word.
NATO (North Atlantic Treaty Organization)
NASA (National Aeronautics and Space Administration)
UNICEF (United Nations International Children's Emergency Fund)
b. Initialism: An initialism is formed by the first letters of a group of words that are pronounced letter by letter, not as a word (e.g., BBC is pronounced bee-bee-cee).
HR (Human Resources)
UN (United Nations)
URL (uniform resource locator)
FYI (for your information)
c. Clipping: A short form created by removing one or more syllables.
ad (advertisement)
exam (examination)
phone (telephone)
varsity (university)
flu (influenza)
Clipping
Clipping is a method of shortening a long word by cutting off (clipping) one or more syllables from the word. The part of the word that remains stands for the whole word. In short, clipping is the word formation process which consists of the reduction of a word to one of its parts; it is a variety of the abbreviation method of word formation. Some examples are:
advertisement = ad
hamburger = burger
demonstration = demo
laboratory = lab
taxicab = taxi
omnibus = bus
telephone = phone
aeroplane = plane
influenza = flu
Clipping mainly consists of the following types:
a. Back clipping: This is the most common type of clipping, where the beginning of a word is retained. Ex: ad (advertisement), exam (examination), gas (gasoline), memo (memorandum), pub (public house), gym (gymnasium)
b. Fore-clipping: Here the final part of a long word is retained. Ex: phone (telephone), varsity (university), chute (parachute)
c. Middle clipping: Here the middle of the word is retained. Ex: flu (influenza), jams (pyjamas)
d. Complex clipping (clipped compounding): When a word is clipped and compounded with another word; it is a type of blending. Ex: sci-fi (science fiction), sitcom (situation comedy). Here the words are compounded and then clipped.
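The clipping types above amount to keeping different slices of the original word, which can be mimicked with plain string slicing. The index positions below are chosen by hand for these particular words; this is an illustration, not a general clipping rule.

```python
# Clipping as string slicing: each type keeps a different part of the word.
advertisement = "advertisement"
telephone = "telephone"
influenza = "influenza"

back = advertisement[:2]       # back clipping keeps the beginning -> "ad"
fore = telephone[4:]           # fore-clipping keeps the end -> "phone"
middle = influenza[2:5]        # middle clipping keeps the middle -> "flu"
# clipped compounding: clip two words, then compound the clipped parts
complex_clip = "science"[:3] + "-" + "fiction"[:2]   # -> "sci-fi"

print(back, fore, middle, complex_clip)
```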
Clippings originate as terms of a special group, such as schools, the army, the police or the medical profession, in the intimacy of a milieu where a hint is sufficient to indicate the whole. Clipped terms of some influential groups then pass on into common usage.
Acronyms
Acronyms are abbreviations formed by using only the first letter(s) of a series of words or word parts in a phrase or name. These sets of letters are pronounced as one single word. Examples: NATO, NASA, UNICEF.
Backformation
The formation of a word from one that looks like its derivative is called back formation. One of the most common methods of word formation is affixation, adding a prefix or suffix to create a new word; for example, speak + -er = speaker or act + -or = actor. Back formation is the exact reverse process: the creation of a simpler or shorter form from a pre-existing, more complex form, like edit from editor or intuit from intuition. In other words, back formation is the creation of a new word from an existing form that is assumed, incorrectly most of the time, to be its derivative.
Back formation is different from clipping. Back formation may change the word's class or meaning, whereas clipping makes shortened words from longer words but does not change the class or meaning of the word. Another curious type of back formation, seen mainly in British and Australian English, is hypocorisms: first a longer word is reduced to a single syllable, then -y or -ie is added at the end. Examples: movie from "moving pictures", Aussie from "Australian", bookie from "bookmaker" and telly from "television".
Echo word formation, or echoism, is the formation of words whose sounds suggest their meaning; that is, onomatopoeia is used for the formation of words by imitating sounds. It is one of the oldest means of word making. Examples: bang, pop, buzz, click, rumble, slash, hiss, giggle, whip, cuckoo etc. The word 'barbarian' in Latin is said to be an imitation of the uncouth and unintelligible babbling (as it sounded to Roman ears) of foreign tribes. Onomatopoeic words are not the same across languages; hence the sound of a clock may be 'tick-tock' in English, 'di-da' in Mandarin and 'katchin katchin' in Japanese.
*********************************************************************************
Affixation
The process of adding prefixes, suffixes, infixes or circumfixes to words to create new ones.
Prefixes are those affixes which are added at the beginning of a base word. The most commonly used prefixes are un-, dis-, mis-, im-, il-, etc. For example:
Discipline – indiscipline
Just – unjust
Tidy – untidy
Respect – disrespect
Understand – misunderstand
Comfortable – uncomfortable
Comfort – discomfort
Responsible – irresponsible
Honest – dishonest
Suffixes are those affixes which are added at the end of a base word. The most common suffixes include '-ment', '-ness', '-ity', '-ous', '-tion', '-sion', '-al', '-able', '-ible', '-ive', '-ly', '-ate', '-er', '-or', etc. For example: move – movement, kind – kindness, act – action.
A circumfix consists of a prefix and a suffix that together produce a derived or inflected form. For example, 'enlighten' has en- and -en added to light to create a new word.
Compounding
Compound words are formed by combining one part of speech with another to form a specific word class. There are many ways in which compound words are formed: verbs are combined with adjectives to form compound verbs, a present participle is combined with a noun to form a compound noun, two nouns are combined to form a compound noun, an adjective and a noun are combined to form a compound noun, an adverb is combined with a noun to form a compound noun, an adjective is combined with a past participle to form a compound adjective, and so on. Take a look at the following examples of compound nouns, compound words and compound adjectives to understand how they work.
a. Open compounding: When the two elements are written as separate words.
ice cream
peanut butter
first aid
b. Hyphenated compounding: When there is a hyphen between two compounded elements.
never-ending
in-depth
left-handed
c. Closed compounding: When the two elements are written together.
bookcase
fishbowl
fingerprint
[The plural forms of compound nouns are generated in two ways. First, most compound nouns are pluralized by adding the inflection -s to the end of the word. Ex: a bookcase → two bookcases; a post office → two post offices. Second, in the case of compound nouns that are made from a noun and an adverb, the first part (the noun) becomes plural. Ex: a passer-by → several passers-by; a listener-in → several listeners-in]
Loan-word
Borrowing, or a loan word, refers to the process where a foreign word is used in the language without being translated. The English language has adopted a large number of words from other languages. Note that the word does not lose its meaning in the target language.
a. Foreign words with the same spelling: this happens when a word is borrowed from a foreign language and its orthography stays the same.
Conversion
This word-formation process is also called zero derivation and happens when we create a new word belonging to another part of speech without changing its form. For example, the noun 'green' (meaning a grassy area) is derived from the adjective 'green.' Conversion happens when a word changes from one word class to another. For instance, the verb to google is formed from the noun Google, and the noun read (as in a good read) is formed from the verb to read. For example:
I emailed this document to John. (emailed is a verb formed from the noun email)
He was bullied at school as a child. (bullied is a verb formed from the noun bully)
Conversion: Types
As you can see below, there are three types of conversion:
verb to noun
noun to verb
adjective to verb
Coinage/ Neologism
Coinage is a type of word-formation process in which a new word is created, either by inventing a completely new word or by adapting an existing word in a new way. This can happen because of advances in technology, movies, literature, music and popular culture. For instance:
google
teflon
aspirin
Warning!
Words that are formed by coinage are usually written in lowercase letters when they are used in context, but when we want to refer to the source of the word, it becomes a proper noun and has to be written with a capital letter.
This basically means coming up with a completely new word without any of the processes above.
Some examples are:
puzzle
bash
gimmick
gadget
Semantic Relations: Componential Analysis, Prototypes
There are several different kinds of semantic relations: terms that are semantically related to a given term. These may be found in a thesaurus.
Synonymy
Each listed synonym in a thesaurus denotes the same as the entry. The equivalence may be less than perfect, but should pass the practical test that "often when people say X they are referring to a thing that term Y also often refers to." For example:
pretty and attractive
sick and ill
Antonymy
Each listed antonym in a thesaurus denotes the opposite of the entry. Two words are said to be antonyms when they mean the opposite. For example:
up and down
dead and alive
parent and child
Hypernymy
Each listed hypernym is superordinate to this entry. This entry's referent is (one of) the kind(s) of things each hypernym refers to. For example:
animal is a hypernym of mammal (mammals are animals)
mammal is a hypernym of dog (dogs are mammals)
flower is a hypernym of tulip
red is a hypernym of scarlet, vermilion, carmine and crimson
Hyponymy
Each listed hyponym is subordinate to this entry. Each hyponym refers to a specific kind of the thing described by this entry. For example:
dog is a hyponym of mammal (dogs are among the various animals which are mammals)
mammal is a hyponym of animal
tulip is a hyponym of flower
scarlet, vermilion, carmine and crimson are hyponyms of red
Meronymy
Each listed meronym denotes part of this entry’s referent. For example:
bark is a meronym of tree (bark is part of what makes up a tree)
tree is a meronym of forest
elbow is a meronym of arm
arm is a meronym of body
Holonymy
Each listed holonym has this entry’s referent as a part of itself; this entry’s referent is
part of each listed holonym. For example:
forest is a holonym of tree (forests contain trees)
tree is a holonym of bark
body is a holonym of arm
arm is a holonym of elbow
Troponymy
Each listed troponym denotes a particular way to do this entry's referent. For example:
to trim and to slice are troponyms of to cut
to slide and to spin are troponyms of to move
to snack and to nibble are troponyms of to eat
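Hypernymy and hyponymy are transitive, which is why dog → mammal → animal forms a chain. The following is a toy sketch of that idea; the hand-made lexicon is limited to the examples above (real lexical databases such as WordNet are vastly larger).

```python
# Hypernymy as a lookup table: each word points to its immediate hypernym.
HYPERNYM = {
    "dog": "mammal",
    "mammal": "animal",
    "tulip": "flower",
    "scarlet": "red",
}

def hypernym_chain(word):
    """Follow hypernym links upward; hypernymy is transitive."""
    chain = []
    while word in HYPERNYM:
        word = HYPERNYM[word]
        chain.append(word)
    return chain

def is_hyponym_of(word, category):
    """A word is a hyponym of every term above it in the chain."""
    return category in hypernym_chain(word)

print(hypernym_chain("dog"))            # ['mammal', 'animal']
print(is_hyponym_of("dog", "animal"))   # True
```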
Componential analysis
Many different theories have been proposed for representing components of lexical meaning. All of them aim to develop a formal representation of meaning components which will allow us to account for semantic properties of words, such as their sense relations, and perhaps some syntactic properties as well.
One very influential approach during the middle of the 20th century was to treat word meanings as bundles of distinctive semantic features, in much the same way that phonemes are defined in terms of distinctive phonetic/phonological features. This approach is sometimes referred to as componential analysis of meaning. Some of the motivation for this approach can be seen in a famous example from Hjelmslev, which makes it clear that the feature of gender is an aspect of meaning that distinguishes many pairs of lexical items within certain semantic domains.
Features like gender and adulthood are binary, and so lend themselves to representation in either tree or matrix format, as illustrated below. Notice that in addition to the values + and –, features may be unspecified (represented by ⌀). For example, the word foal is unspecified for gender, and the word horse is unspecified for both age and gender.
Binary feature analysis, here illustrated with vehicle terms:

Composition  Powered  Carries people  4-wheeler  Petrol  2-wheeler
Car          +        +               +          +       -
Bus          +        +               +          +       -
Van          +        +               +          +       -
Bicycle      -        +               -          -       +
Train        +        +               -          +       -
Plane        +        +               -          +       -
Ambulance    +        +               +          +       -
Componential analysis provides insight into the meaning of words, and a way to study the relationships between words that are related in meaning.
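The feature matrix above translates naturally into a lookup table, which is one way componential analysis supports comparing words. Below is a sketch with + and – encoded as True and False; the feature names are informal labels, not standard notation, and only three rows of the matrix are reproduced.

```python
# A slice of the binary feature matrix as a dict of feature dicts
# (True = +, False = -).
FEATURES = {
    "car":     {"powered": True,  "carries_people": True,
                "4_wheeler": True,  "petrol": True,  "2_wheeler": False},
    "bus":     {"powered": True,  "carries_people": True,
                "4_wheeler": True,  "petrol": True,  "2_wheeler": False},
    "bicycle": {"powered": False, "carries_people": True,
                "4_wheeler": False, "petrol": False, "2_wheeler": True},
}

def shared_features(a, b):
    """Features on which two lexical items agree."""
    return {f for f in FEATURES[a] if FEATURES[a][f] == FEATURES[b][f]}

print(shared_features("car", "bus"))       # identical rows: all five features
print(shared_features("car", "bicycle"))   # they agree only on carrying people
```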
Prototype:
In everyday discourse, the term 'prototype' refers to an engineer's model which, after testing and possible improvement, may then go into mass production. In linguistics, and in cognitive science more generally, the term has acquired a specialized sense, although the idea of a basic unit, from which other examples can be derived, may still be discerned. The term refers to the best, most typical, or most central member of a category. Things belong in the category by virtue of sharing commonalities with the prototype.
A prototype, in semantic relations, is a member or a set of members of a group that best represents the group as a whole: the example of a group that is most easily recognized by people. For example, 'chair' is the prototype for furniture.
Prototype Analysis
Prototype theory is a theory of categorization in cognitive science, particularly in psychology and cognitive linguistics, in which there is a graded degree of belonging to a conceptual category, and some members are more central than others. It emerged in 1971 with the work of psychologist Eleanor Rosch, and it has been described as a "Copernican Revolution" in the theory of categorization for its departure from the traditional Aristotelian categories. In prototype theory, any given concept in any given language has a real-world example that best represents the concept. For example, when asked to give an example of the concept furniture, a couch is more frequently cited than, say, a wardrobe. Prototype theory has also been applied in linguistics, as part of the mapping from phonological structure to semantics.
Rosch and others developed prototype theory as a response to, and radical departure from, the classical theory of concepts, which defines concepts by features. Rather than defining concepts by features, prototype theory defines categories based on either a specific artifact of that category or a set of entities within the category that represent a prototypical member. The prototype of a category can be understood in lay terms as the object or member of a class most often associated with that class. The prototype is the center of the class, with all other members moving progressively further from the prototype, which leads to the gradation of categories: every member of the class is not equally central in human cognition. As in the example of furniture above, couch is more central than wardrobe. Contrary to the classical view, prototypes and gradations lead to an understanding of category membership not as an all-or-nothing matter, but as more of a web of interlocking categories which overlap.
Implication, Entailment and Presupposition
In a theory of meaning based on the distinction between semantics and pragmatics, it is crucial to distinguish between what a sentence means and what a speaker intends to convey by uttering it. Since Paul Grice's seminal work, the distinction between sentence (or literal) meaning and speaker meaning is quite uncontroversial among both linguists and philosophers of language. Sentence meaning refers to the content encoded in the words a speaker uses to utter a grammatically correct sentence, while speaker meaning is the content the speaker intends to convey by uttering a sentence in order to achieve a certain goal. Sentence meaning and speaker meaning coincide whenever the speaker intends to convey exactly the content encoded in the words used to utter a sentence. However, language users often intend to convey contents that differ from what the sentences they utter literally mean.
As one of the basic forms of reasoning, inference can in general be defined as a process of accepting a statement or proposition (called the conclusion) on the basis of the acceptance of one or more other statements or propositions. It includes entailment, presupposition and implicature (Bublitz & Norrick).
When one reads or hears pieces of language, one normally tries to understand not only what the words mean, but what the writer or speaker of those words intends to convey. One of the principal difficulties one faces when dealing with aspects of language is how to distinguish between presupposition and entailment. These two concepts are described and examined here because they seem to provide the basis for answering a number of questions about both speaker commitment and sentence meaning. As a matter of fact, presupposition is what the speaker assumes to be the case prior to making an utterance, whereas entailment is what logically follows from what is asserted in the sentence.
Presupposition plays an important role in the production and comprehension of speech acts. It is defined from different points of view, each of which is similar to the others in some way. The German mathematician Gottlob Frege is generally recognised as the first scholar in modern times to re-introduce the philosophical study of presupposition, though the notion of presupposition goes back centuries. Yule defines presupposition as an assumption by a speaker/writer about what is true or already known by the listener/reader; therefore speakers, not sentences, have presuppositions. For example, in "Your brother is waiting outside" there is an obvious presupposition that you have a brother. In the sentence "John's brother bought three horses" the speaker will normally be expected to have the presupposition that a person called John exists and that he has a brother; the speaker may also hold the more specific presupposition that John's brother has a lot of money. Hudson states that a presupposition is something assumed or presupposed to be true in a sentence which asserts other information. In the following example, sentence (a) presupposes sentence (b).
(a) The child has sneezed again.
(b) The child has sneezed before.
There are6 types of presupposition:
1. E
xistential presupposition: it is assumed to be presenteitherin possessive
constructions(your car presupposes you have a car)or in definite noun phrases like
the King of Sweden, the cat etc in which thespeaker presupposes the existence of the
entities named.
2. The factive presupposition: since some words are used in the sentences to denote
facts, such asknow, regret, realize, glad, oddandaware. (“I wasn’t aware that she got
a job” presupposes that she got a job)
3. Non-factive presupposition: which is assumed not to be true. Verbs likedream,
imagineandpretendare used with the presuppositionthat what follows is not true.
For example, “John dreamed that he was rich” presupposesthat John is not rich.
4. Lexical presupposition: triggered by words such as manage, stop and start. In this type,
the use of one form with its asserted meaning is conventionally interpreted with the
presupposition that another (non-asserted) meaning is understood. When one says “He
stopped smoking”, it presupposes that he used to smoke.
5. Structural presupposition: in this case certain sentence structures have been analysed
as conventionally presupposing that part of the structure is assumed to be true. One
might say that a speaker can use such structures to treat information as presupposed and
hence to be accepted as true by the listeners. For instance, “When did John leave?”
presupposes that John left.
6. Counterfactual presupposition: in which what is presupposed is not only untrue but is
the opposite of what is true, or contrary to facts. For example, “If I were rich I would
buy a car” presupposes that I am not rich.
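The six trigger types above can be illustrated with a small rule-based sketch. Everything below, including the trigger word lists and the `presuppositions` function, is an invented toy illustration rather than a standard NLP library: real presupposition detection would need parsing and lemmatisation.

```python
# Toy illustration of presupposition triggers (types 2-5 above).
# Word lists and function are invented for this sketch.

FACTIVE = {"know", "knew", "regret", "regretted", "realize", "realized", "aware"}
NON_FACTIVE = {"dream", "dreamed", "imagine", "imagined", "pretend", "pretended"}
LEXICAL = {"manage", "managed", "stop", "stopped", "start", "started"}
WH_WORDS = {"when", "where", "why", "how"}   # structural triggers

def presuppositions(words):
    """Label presupposition triggers found in a tokenised utterance."""
    labels = []
    for w in words:
        w = w.lower().strip("?.,!")
        if w in FACTIVE:
            labels.append(("factive", w))      # complement presupposed true
        elif w in NON_FACTIVE:
            labels.append(("non-factive", w))  # complement presupposed untrue
        elif w in LEXICAL:
            labels.append(("lexical", w))      # prior state presupposed
        elif w in WH_WORDS:
            labels.append(("structural", w))   # the event itself presupposed
    return labels

print(presuppositions("He stopped smoking".split()))
# [('lexical', 'stopped')]
print(presuppositions("When did John leave?".split()))
# [('structural', 'when')]
```

The point of the sketch is only that each presupposition type is keyed to a recognisable lexical or structural trigger, as the numbered list describes.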
Entailment
Entailment is a term derived from formal logic and now often used as part of the study of
semantics. All the other essential semantic relations, like equivalence and contradiction,
can be defined in terms of entailment. Crystal defines it as “a term that refers to a relation
between a pair of sentences such that the truth of the second sentence necessarily follows
from the truth of the first.” For example, “I can see a dog” entails “I can see an animal.”
One cannot assert the first and deny the second.
Back and Ellege define entailment as a logical relationship between two propositions,
where if one is true then the other must also be true. For example, “Mary married John”
entails “Mary got married.” There are two types of entailment: (1) background entailment
and (2) foreground entailment. A single sentence can have a number of
background entailments but only one foreground entailment. For example:
Bob chased three rabbits
The speaker is necessarily committed to the truth of a large number of background
entailments, some being:
1. Someone chased three rabbits
2. Bob did something to three rabbits
3. Bob chased three of something
4. Something happened
(2) Foreground entailment is defined by stress. It is more important for interpreting
intended meaning. For example:
a. Bob chased THREE rabbits
b. BOB chased three rabbits
In (a) the speaker indicates that Bob chased a certain number of rabbits, while in (b) the focus
shifts to Bob. This is foreground entailment.
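The background entailments of Yule's rabbit example can be sketched programmatically: each entailment weakens one constituent of the proposition. The agent-verb-count-noun encoding and the function below are assumptions made only for this sketch.

```python
# Sketch of background entailments for an agent-verb-count-noun proposition.
# The flat four-argument encoding is an invented toy representation.

def background_entailments(agent, verb, count, noun):
    """Generate weaker propositions entailed by '<agent> <verb> <count> <noun>'."""
    return [
        f"Someone {verb} {count} {noun}",            # weaken the agent
        f"{agent} did something to {count} {noun}",  # weaken the verb
        f"{agent} {verb} {count} of something",      # weaken the noun
        "Something happened",                        # weaken everything
    ]

for entailment in background_entailments("Bob", "chased", "three", "rabbits"):
    print(entailment)
```

Running the sketch reproduces the four background entailments listed above; stress (foreground entailment) selects which of these weakenings matters for the intended interpretation.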
To conclude, presupposition is what the speaker assumes to be the case prior to making an
utterance. Therefore, speakers, not sentences, have presuppositions, whereas entailment is
what logically follows from what is asserted in the utterance. Sentences, not speakers, have
entailments.
Implicature
In pragmatics, a subdiscipline of linguistics, an implicature is something the speaker
suggests or implies with an utterance, even though it is not literally expressed. Implicatures
can aid in communicating more efficiently than by explicitly saying everything we want to
communicate. The philosopher H. P. Grice coined the term in 1975. Grice
distinguished conversational implicatures, which arise because speakers are expected to
respect general rules of conversation, and conventional ones, which are tied to certain words
such as "but" or "therefore". For example, consider the following exchange:
A: I am out of gas.
B: There is a gas station 'round the corner.
Here, B does not say, but conversationally implicates, that A can get gas at the gas station
round the corner, as it is open.
An example of a conventional implicature is "Donovan is poor but happy", where the word
"but" implicates a sense of contrast between being poor and being happy.
Sense and Reference are two semantic terms introduced by the German mathematician and
philosopher Gottlob Frege. The reference (or "referent"; Bedeutung) of a proper name is
the object it means or indicates (bedeuten), whereas its sense (Sinn) is what the name
expresses. The reference of a sentence is its truth value, whereas its sense is the thought that
it expresses. The sense is a 'mode of presentation', which serves to illuminate only a single
aspect of the referent.
Sense relates to the complex system of relationships that hold between the linguistic elements
themselves (mostly the words); it is concerned only with intra-linguistic relations. E.g. cow/bull,
mare/stallion, etc. are sets of words that are related in terms of gender. Other kinds of sense
relationships between words can be seen in duck/duckling, pig/piglet, father/son, narrow/wide,
buy/sell.
A dictionary is usually concerned with sense relations, relating words to words: an unknown word to a
word whose reference is already understood.
Reference deals with the relationship between the linguistic elements (words, sentences, etc.) and
the things they stand for in a particular context. The 'sense' of an expression is the way in which the
expression presents its reference. For example, the ancients used 'the morning star' and 'the evening
star' to designate what turned out to be the same heavenly body, the planet Venus. These two
expressions have the same reference, but they clearly differ in that each presents that reference in a
different way. So, although coreferential, each expression is associated with a different 'sense.'
When we study meaning we should distinguish between sense and reference. The object in the real
world is the referent; the concept we have of it in our minds is the reference. The symbol we use to refer
to them is the word. The meanings of words are given in terms of their relationships. For example,
when we say jasmine we immediately think that it is a kind of flower. Thus we explain the meaning of
a word (sense) by using other words (reference). The relationship between one word and another is
called a sense relation.
T he denotation of a word is its explicit or direct meaning. It refers to the literal meaning of a word,
the ‘dictionary meaning.’ For example, if you look up the word snake in a dictionary, you will discover
that one of its denotative meanings is “any of numerous scaly, legless, sometimes venomous reptiles
having a long, tapering, cylindrical body and found in most tropical and temperate regions.”
The connotation of a word or phrase is the associated or secondary meaning. It can be something
suggested or implied by a word or thing, rather than being explicitly named or described.
Connotation refers to the associations that are connected to a certain word or the emotional
suggestions related to that word. The connotative meanings of a word exist together with the
denotative meanings. The connotations for the word snake could include evil or danger.
The connotation of a word depends on cultural context and personal associations, but the
denotation of a word is its standardised meaning within the English language. For example, the
words home and house have similar denotations or primary meanings: a home is a "shelter that is
the usual residence of a person, family, or household," and a house is "a building in which people
live." However, for many, these terms carry different associations or secondary meanings, also known
as connotations. Many people would agree that home connotes a sense of belonging and comfort,
whereas house conveys little more than a structure.
In semantics, connotation and denotation are two basic concepts. A word denotes no more than
what its bare definition states. For example, the word 'head' denotes a biological organ; on the other
hand, when we speak of 'the head of an institution' we mean the person at the helm of the affairs of
that institution. This is connotative meaning. Fire, apart from its denotative meaning of 'flame', can
also connote enthusiasm, passion, spirit, love, etc.
Intension and extension in logic are correlative words that indicate the reference of a term or
concept: "intension" indicates the internal content of a term or concept that constitutes its formal
definition; and "extension" indicates its range of applicability by naming the particular objects that it
denotes. For instance, the intension of 'ship' as a substantive is "vehicle for conveyance on water",
whereas its extension embraces such things as cargo ships, passenger ships, battleships and sailing
ships. The distinction between intension and extension is not the same as that between connotation
and denotation.
Truth-Conditional Semantics: Propositions, Truth Values, Determining the
Semantic Value of a Proposition, Compositional Procedure, Terms and
Predicates, Predicate Logic, Possible Worlds Semantics
Truth-conditional semantics is an approach to the semantics of natural language that sees
meaning (or at least the meaning of assertions) as being the same as, or reducible to, its truth
conditions. This approach to semantics is principally associated with Donald Davidson.
Truth-conditional theories of semantics attempt to define the meaning of a given proposition by
explaining when the sentence is true. So, for example, 'snow is white' is true if and only if
snow is white. This corresponds to what is often called the 'literal meaning'.
For any grammatical declarative sentence, a native speaker has truth-conditional intuitions about
it, i.e. he or she knows in what situations it is true and in what situations it is false. Here is an
example illustrating the fact that you have truth-conditional intuitions.
(1) There is a circle in a square. Given a situation in which a circle is drawn inside a square,
you know that the sentence is true.
(2) On the other hand, in a situation in which the circle lies outside the square, the sentence is false.
Of course, this is not the only situation where the sentence in (1) is true, and there are infinitely
many situations where it is false. However, it is not difficult to see that the sentence in (1)
is true in any situation where there is a circle and there is a square and the circle is in the
square, and is false in any situation where there isn't a circle in a square. We take this
to be a good characterization of the truth-condition of the sentence.
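The truth-condition just characterized can be made fully explicit by encoding situations as sets of shapes and checking the condition directly. The coordinate representation below is an assumption made for this sketch, not part of any standard semantics toolkit.

```python
# Sketch: the truth-condition of "There is a circle in a square".
# A situation is encoded as circles (cx, cy, r) and squares (x, y, side),
# with (x, y) the square's lower-left corner; this encoding is invented here.

def circle_in_square(circle, square):
    """True iff the circle lies entirely inside the square."""
    cx, cy, r = circle
    x, y, side = square
    return x + r <= cx <= x + side - r and y + r <= cy <= y + side - r

def sentence_true(situation):
    """The sentence is true iff some circle is inside some square."""
    return any(circle_in_square(c, s)
               for c in situation["circles"]
               for s in situation["squares"])

# Situation (1): circle centred inside the square, so the sentence is true
print(sentence_true({"circles": [(5, 5, 1)], "squares": [(0, 0, 10)]}))   # True
# Situation (2): circle outside the square, so the sentence is false
print(sentence_true({"circles": [(20, 20, 1)], "squares": [(0, 0, 10)]}))  # False
```

The function returns True in exactly the situations the characterization describes, which is what it means to give the sentence's truth-condition.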
Consider a statement like "All humans are mortal." In predicate logic, this can be expressed
more formally as ∀x(Human(x) → Mortal(x)), where ∀ denotes the universal quantifier "for
all", and → represents implication.
https://calcworkshop.com/logic/predicate-logic/
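Over a finite domain, the formula above can be evaluated mechanically by checking the implication for every individual. The domain and the predicate extensions below are toy data invented for illustration.

```python
# Evaluate ∀x(Human(x) → Mortal(x)) over a finite domain, representing the
# extension of each predicate as a set. Domain and extensions are toy data.

def forall_implies(domain, antecedent, consequent):
    """True iff every x in the domain satisfies antecedent(x) -> consequent(x)."""
    # Material implication: Human(x) -> Mortal(x) is (not Human(x)) or Mortal(x)
    return all(x not in antecedent or x in consequent for x in domain)

domain = {"socrates", "plato", "fido"}
human = {"socrates", "plato"}            # extension of Human
mortal = {"socrates", "plato", "fido"}   # extension of Mortal

print(forall_implies(domain, human, mortal))               # True
print(forall_implies(domain, human, mortal - {"plato"}))   # False: plato is a counterexample
```

Note that individuals outside the extension of Human (here "fido") satisfy the implication vacuously, exactly as material implication requires.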
Possible Worlds Semantics in Logic and Language
The expression "possible worlds semantics" was first used to describe "semantics" in the
logician's sense. In this sense, possible worlds semantics is a matter of associating with a given
logic a model that contains worlds; and assignments, relative to those worlds, of the truth-values
of sentences, extensions of predicates, and so on. The best-known application of possible worlds
semantics is in the semantics of modal logics, usually attributed to Kripke.
Richard Montague introduced possible worlds into the systematic study of the semantics of
natural language, and his approach was to take a simplified fragment of English, and then treat it
as logicians had treated their artificial formal languages.
What Are Possible Worlds? Possible worlds semantics relies on there being a domain of possible
worlds, and usually things in those worlds to be the members of the sets associated with
predicates in each world. So, we have possible worlds and possible individuals. Does that mean
that, in order to use possible worlds semantics, we need to think that there is an infinite range of
alternate universes, full of merely possible individuals, including strange individuals like golden
mountains and talking donkeys? Some have argued that this is indeed the best way to
understand possible worlds and possible individuals (Lewis 1986).
Seminar: Katz, Jerrold J. and Jerry A. Fodor. "The Structure of a Semantic Theory." Language.
Vol. 39.2. April-June 1963. pp. 170-210.
passing progressively through each of these components. The PS Component by means of re-write rules
generates the kernel string. The TFL Component contains T-Rules, which transform the kernel string into the
desired structure. The MPH Component converts this structure into a phonological form.
In Aspects of the Theory of Syntax (1965) he made revisions to his earlier theory, the most important
being the inclusion of a totally new Semantic Component to deal with meaning. The PS Component was
modified and renamed the Base Component. The TFL Component was retained as such. The MPH Component
was renamed the Phonological Component.
The Base Component contains the branching rules or the PS Rules, which generate the deep structure. The
TFL Component converts or transforms the deep structure to the surface structure. The Phonological
Component converts the surface structure into a string of sounds, which provide the basis for the transmission
of the message. The Semantic Component converts the minimal, meaning bearing units into larger and larger
units.
In 1986 Chomsky published Knowledge of Language, in which he replaced the terms Competence
with I-Language (Internalised Language) and Performance with E-Language (Externalised Language).
TRANSFORMATIONAL ASPECT
"Essentially, transformation is a method of stating how the structures of many sentences in languages
can be generated or explained formally as the result of specific transformations applied to certain basic
sentence structures.", says R. H. Robins. Thus the transformational component shows that all the sentences in
English are in fact formed out of certain elementary or basic structures. The basic sentences are called kernel
sentences.
A kernel sentence is the simple, declarative, assertive sentence in the active voice form, from which
other sentences (more complicated or complex lengthy sentences) can be derived by applying appropriate
T-rules. For example an English sentence like John killed the snake is a kernel sentence consisting of an NP and
VP. It’s a simple, declarative sentence in the active voice in the affirmative sense. It can be transformed into
different structures such as, The snake was killed by John., John did not kill the snake., Did John kill the snake?,
Who killed the snake?, What did John kill?, Didn’t John kill the snake?, The snake was not killed by John., What did
John do?, By whom was the snake killed? etc. All these sentences have different appearances. However we can
see the underlying similarity among these sentences. They are different only in their surface structures. They
have the same deep structure. Kernel sentences are generated without the application of any optional
transformations. Non-kernel sentences require the application of both optional and obligatory transformations,
and they differ one from another in that a different selection of optional transformations is made.
As proposed in 1957, transformational rules were a means by which one kind of sentence (such as the
passive The work was done by the local men) could be derived from another kind (such as the active The local
men did the work). Any process governed by such rules was a transformation (in the following example the
passivisation-transformation) and any sentence resulting from such rules was a transform. Transformations link
deep structure with surface structure. The two sentences “The local men did the work” and “ The work was
done by the local men” have the same order in deep structure, but the passivisation-transformation
transforms this order to that in the surface structure.
In "Syntactic Structures" Chomsky handles the active-passive relationship by saying that if S1 is a
grammatical sentence with the form NP1----Aux----V----NP2, then the corresponding string of form
NP2----Aux + be + en----V----by + NP1
is also a grammatical sentence. In this way, the sentence "The door was opened by John" is the transform of the
sentence in active voice "John opened the door".
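The passivisation rule can be sketched as a string operation over the kernel's constituents. The flat constituent arguments and the small past-participle table below are assumptions of this sketch; Chomsky's actual rule operates on structured strings with an Aux constituent carrying tense.

```python
# Sketch of the passive T-rule: NP1 - Aux - V - NP2 => NP2 - Aux+be+en - V - by+NP1.
# The participle table and plain-text constituents are toy assumptions.

PAST_PARTICIPLE = {"opened": "opened", "killed": "killed", "did": "done"}

def passivise(np1, verb, np2, be_form="was"):
    """Apply the passive transformation to a kernel NP1 - V(past) - NP2."""
    # 'be + en' is realised as a tensed 'be' plus the past participle of V
    return f"{np2[0].upper()}{np2[1:]} {be_form} {PAST_PARTICIPLE[verb]} by {np1}"

print(passivise("John", "opened", "the door"))
# The door was opened by John
print(passivise("the local men", "did", "the work"))
# The work was done by the local men
```

The sketch makes the structural claim concrete: the two NPs exchange positions, 'be + en' is inserted before the verb, and 'by' is inserted before the demoted agent.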
There are, however, plenty of other transformations. One that occurs in English but is not paralleled in
most languages is that of `permutation'. That is, "Has Jim played the piano?" is a transformation of "Jim has played
the piano." This occurs with all the auxiliary verbs of English as "Is he coming?", "Can you go?", "Must I sleep?", etc.
If there is no auxiliary verb, the verb `do' has to be supplied to act as one:
He goes. - Does he go? They play. - Do they play? We came. - Did we come?
A different, and in some ways more important type of transformation is "relative transformation" which
involves more than one "kernel sentence". There is a sense in which one sentence can be regarded as being
part of another sentence, that one structure can be embedded into another. The sentence that is embedded
into another is known as the "constituent" and the sentence into which it is embedded as the "matrix". For
example, the sentence `The boy who was standing there ran away.' can be treated as a transformation of the two
sentences:
1. The boy ran away. and 2. The boy was standing there. Thus the relative transformation places the second
sentence after `boy' in the first and then replaces `the boy' in the second by `who'.
Transformational grammar can disambiguate structures where IC analysis and PS Grammar fail. For
example, take the sentence `Visiting professors can be dangerous.' We want to distinguish here two senses-the
action of visiting professors can be dangerous, and professors who visit can be dangerous. We can show this by
difference in the matrix and the constituent sentence as well as the place of embedding. On the first meaning
we have the kernel sentences: 1. (Something) can be dangerous. and 2. (Someone) visits professors. We then transform the
second into `visiting professors' and insert it in the place of NP. On the second meaning the kernel sentences
will be: 1. Professors can be dangerous. 2. Professors visit. Here we must apply a transformation similar to
the relative transformation. Professors who visit can be dangerous and then a further transformation to give the
required sentence by transforming `who visit' in `visiting' and placing it before `professors'. Here we can see that
the deep structures of the two apparently identical sentences are quite different. Thus, sentences that appear to
be identical are often transforms from different kernels, and transformational analysis can disambiguate far more.
The whole course of the subsequent decades within and outside Chomsky’s formulations has involved
the progressive downgrading of the role of transformations by recourse to the introduction of the concept of
‘deep structure’. The term transformational, so frequent in earlier textbooks, has now almost disappeared and
the Chomskyan theory is now designated simply as generative linguistics.
GENERATIVE ASPECT:
‘Generative’ is a term borrowed from mathematics into linguistics by Chomsky. It means merely that
the grammar must be so designed that by following its rules and conventions we can produce all or any of the
possible sentences of the language. A generative grammar is thus one, which precisely specifies the
membership of the set of all the grammatical sentences in a language in question and therefore excludes all
the ungrammatical sentences. To `generate' is thus to `predict' what can be sentences of the language or to
`specify' precisely what are the possible sentences of the language. Thus a grammar should `generate', `specify',
and `predict' sentences such as: He plays the piano. but not * Plays the piano he. and * He the piano plays.
The `competence' and `performance' of a native speaker of a language are related to the TG
grammarians' interest not in the neutral text but in what is linguistically possible. According to the theory, the
native speaker of a language has `internalized a set of rules' which form the basis of his ability to speak and
understand his language. It is the knowledge of these rules that is the object of the linguist's attention, not the
actual sentences he produces.
Differences between Phrase Structure Grammar and TG Grammar
1. PS Grammar rewrites the symbol as a string whereas in TG Grammar a string is transformed into
another string. e.g. PS Grammar – S ⭢ NP + VP
TG Grammar – NP1 + Aux + V + NP2 ⭢ NP2 + Aux + be + en + V + by + NP1
2. PS Grammar uses only PS Rules. In TG Grammar both PS Rules and T-Rules are used.
3. PS Rules merely expand elements whereas T-Rules rearrange, delete, add or substitute elements, thus
altering the deep structure.
4. PS Rules merely show the arrangement of constituents within a structure while T-Rules show the
relationship between the surface structure and the deep structure. In PS Grammar the PS Rules
generate the surface structure, while in TG Grammar the PS Rules generate the deep structure and
T-Rules transform them into surface structure.
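The first difference can be sketched directly: a PS component rewrites symbols top-down until only terminal words remain. The toy grammar below is invented for illustration, with a single rewrite per symbol so the derivation is deterministic.

```python
# Sketch of PS rewrite rules generating the kernel "John killed the snake".
# The toy grammar is invented; each symbol has exactly one rewrite rule.

GRAMMAR = {
    "S":   ["NP", "VP"],
    "NP":  ["John"],
    "VP":  ["V", "Det", "N"],
    "V":   ["killed"],
    "Det": ["the"],
    "N":   ["snake"],
}

def generate(symbol="S"):
    """Rewrite a symbol via PS rules until only terminal words remain."""
    if symbol not in GRAMMAR:          # terminal: nothing rewrites it
        return [symbol]
    words = []
    for part in GRAMMAR[symbol]:
        words.extend(generate(part))
    return words

print(" ".join(generate()))   # John killed the snake
```

This models only the PS side of the contrast: the output is a deep-structure kernel string, which a T-rule such as passivisation would then transform into a surface structure.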
Advantages of TG Grammar
1. While structural grammar is corpus bound and limits itself to the analysis of the gathered data, TGG
is generative in the sense that it attempts to explain the generation of sentences.
2. TG Grammar helps us to relate outwardly or superficially different sentences and distinguish
between superficially similar sentences. e.g. 1) Rama killed a snake - A snake was killed
by Rama
2) John is easy to please - John is eager to please
3. TG Grammar can resolve ambiguities in sentences by making use of the concept of deep structure.