Generative Grammar and Parsing Algorithms, Study notes of Linguistics

These notes introduce generative grammar, phrase structure rules, and parsing algorithms in linguistics. They cover the idea of transformations, top-down and left-corner parsing, and the role of abstract syntactic structures in interpreting language.

Introducing the Sentence (LING499A)

So far …
• a way of talking about information processing systems
• levels of analysis (Marr)
• ways to characterize information flow (Marslen-Wilson)
• some properties of comprehension (Marslen-Wilson)
• lexical access basics
• dimensions of lexicon organization (Altmann chapter)
• context effects (Swinney)
• decomposition (Longtin)

Today
• the sentence as a mental object
• a little history
• salient properties
• kinds of evidence

What is a sentence?
Wilhelm Wundt (1832-1920), (co)founder of experimental psychology: we have to ask speakers.
• Our explanation is made in terms of mental states/processes.
• A key source of data comes from acceptability judgments.

What is a sentence? Behaviorism, or … no mental states for you!
(J.B. Watson, 1878-1958; B.F. Skinner, 1904-1990)

Behaviorist account of the sentence
• Osgood (1963)
• incorporated 'hidden' pathways: fractional s-r relationships
• allows for more linguistic sophistication, because it incorporates a measure of abstraction
• very early 'connectionism' [at least, prescient]

Perceptual evidence for the sentence
• The waterfall effect: words have greater clarity in sentences (Miller, Heise & Lichten 1951; Miller & Isard 1963).
• The sentence lends a perceptual advantage under noisy conditions.
• The implication is that there must be something more to a sentence than associative word-word relations.
  "Around accidents country honey the shoot" vs. "Accidents kill motorists on highways."

Perceptual evidence for the sentence
"Horses cry." "Horses eat."
Suppose the intensity of the noise is adjusted so that the likelihood of recognizing each of these words is 50%. If there is no linguistic representation above the word level, then we expect recognition of the entire sequence to be 25% (0.5 × 0.5). In fact, it is greater than 50%. Experientially, subjects report greater clarity in sentence contexts.

Evidence from recall: Jarvella (1971)
(a) … S1 [S2, S3]
(b) … [S1, S2] S3
• each S = 7 words
• recall of words in S2 is drastically better if it is incorporated into the most recent complex sentence, as in (a), than in (b)
• suggests that sentences are important organizing units in memory as well as perception

Generative grammar: a new model
• Sentences are grammatical if they can be derived/analyzed with a set of rules.
• The goal of the computational theory is to formulate the rules such that only the grammatical sentences of a language -- where grammaticality is assessed principally by appeal to speaker intuitions -- are derivable.
• To account for the properties that the behaviorist model failed on, it is necessary to introduce abstraction: symbols that do not directly correspond to perceptual experience.

A simple derivation
Starting axiom: S. Rules:
1. S ➝ NP VP
2. VP ➝ V NP
3. NP ➝ D N
4. N ➝ Bill
5. V ➝ hit
6. D ➝ the
7. N ➝ ball
Applying the rules one at a time expands S step by step -- S, then NP VP, then NP V NP, and so on -- until only words remain, yielding the tree [S [NP Bill] [VP [V hit] [NP [D the] [N ball]]]] for "Bill hit the ball".
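As a minimal sketch of the same idea in code (my illustration; the slides themselves only give the rules), the toy grammar can be written down and used to derive word strings top-down. The random rule choice below is just a way of showing that the rules generate a set of sentences, and NP ➝ N is taken from the left-corner slide later in these notes:

import random

# Toy grammar from the slides; keys are nonterminals, values are lists of right-hand sides.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "VP": [["V", "NP"]],
    "NP": [["D", "N"], ["N"]],
    "N":  [["Bill"], ["ball"]],
    "V":  [["hit"]],
    "D":  [["the"]],
}

def derive(symbol):
    """Rewrite `symbol` until only terminal words remain (a top-down derivation)."""
    if symbol not in GRAMMAR:                 # a word: nothing left to rewrite
        return [symbol]
    rhs = random.choice(GRAMMAR[symbol])      # pick one rule for this nonterminal
    return [word for part in rhs for word in derive(part)]

print(" ".join(derive("S")))                  # e.g. "Bill hit the ball"

Every string this produces is "grammatical" with respect to the rule set, which is the sense of grammaticality the slides define; the toy grammar also overgenerates (e.g. "the Bill hit ball"), which is the kind of thing a better rule set is supposed to exclude.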
E.g.: question formation
"John asked Mary to give him a walnut."
Apply basic phrase structure rules: John ask PAST [ Mary to give him what ]
Apply transformations:
  PAST John ask Mary to give what
  What PAST John ask Mary to give
  What did John ask Mary to give

Can we 'measure' these proposals? Do any perceptual measures correlate with the representations postulated on the basis of acceptability judgments?

Phrase structure
• the 'click' experiments [Fodor and Bever 1965; Garrett 1965; Chapin et al. 1972]
  – In her hope of marrying Anna was impractical (O R)
  – Harry's hope of marrying Anna was impractical (R O)
  (O = objective click location, R = reported location; as the converging-evidence summary below notes, phrase boundaries 'attract' clicks.)
• Potentially goes beyond surface structure:
  – The general defied the troops to fight (O R)
  – The general desired the troops to fight (R O)

Phrase structure
• Levelt's clustering analysis of relatedness judgments
• words prompt better recall for words within phrases than without (e.g. Johnson 1965)

Psychological reality?
'The distinction between competence and performance created an aura in which it could appear that "competence" was some kind of abstraction that had no particular implication for mental structure: it was performance that carried out that mapping and created behavioral expressions of the grammar. Technically (and importantly) this common view was a mistake: "competence" itself was a theory of actual knowledge, and therefore it pertained directly to mental structures. … Grammaticality intuitions are real behaviors in their own right, and a grammar that accurately distinguishes between grammatical and ungrammatical sentences already explains a vast range of behavior. …' (T&B, pp. 27-28)

Psychological reality?
The click experiments of Fodor & Bever, the memory experiments of Jarvella, the relatedness analysis of Levelt, reaction time measures, magnetic field potentials, etc. are measures, and intuitions about sentence acceptability are measures. There is no in-principle distinction between 'how mental' one measure is compared to another. One isn't more "psychologically real." We must reason from a brain potential to mental life just as we must reason from an acceptability judgment. We are always supplying linking hypotheses.

Early generative grammar: TRANSFORMATIONS
Transformations rearrange the basic skeleton.
Motivation: different structures, same elements playing similar roles (active-passive):
  Molly smacked Morton. / Morton was smacked by Molly.
Same apparent structure, but with elements playing different roles, and different variants (defy/desire):
  John desired Bill to go: John desires, Bill goes
  John defied Bill to go: John defies Bill, Bill goes
  For Bill to go was desired by John
  *For Bill to go was defied by John

Early generative grammar: TRANSFORMATIONS
"What did John ask Mary to hammer?"
Phrase structure rules generate: John ask PAST [ Mary to hammer what ]
Transformations rearrange that result:
  PAST John ask Mary to hammer what
  What PAST John ask Mary to hammer
  What do+PAST John ask Mary to hammer
  "What did John ask Mary to hammer"
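Purely as an illustration (my sketch in Python; the slides say nothing about implementation, and real transformations are defined over trees rather than word strings), the sequence of steps above can be mimicked as rewrites of a flat word string:

def question_transform(deep):
    """Mimic the slide's steps: front the tense, front the wh-word, then do-support."""
    words = deep.replace("[", "").replace("]", "").split()
    words.remove("PAST"); words.insert(0, "PAST")   # tense fronted for the question
    words.remove("what"); words.insert(0, "what")   # wh-fronting
    words[words.index("PAST")] = "did"              # do-support: stranded PAST spelled out as 'did'
    return " ".join(words)

print(question_transform("John ask PAST [ Mary to hammer what ]"))
# -> "what did John ask Mary to hammer"

The point of the slides is not this particular mechanism but the claim that the surface question and its underlying source are related by explicit, ordered operations.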
Generative grammar
Different kinds of grammars work differently, but all have ways of:
• structuring words and their categories into part-whole configurations (building a 'skeleton')
• syntactically relating two distinct positions in that configuration ('connecting the vasculature'?)
Transformations were the first and perhaps most influential way of relating two positions, but others have been developed (particularly in the 1980s).

Converging evidence for representations above the word level
• Perception
  – Click experiments (Bever, Chapin and others): phrase boundaries 'attract' clicks
  – 'Naïve' judgments of phrase structure (Levelt): relatedness judgments reveal a hierarchy very much like the phrase structure linguists infer from speaker intuitions (also Martin; cf. Gee & Grosjean 1980)
• Memory
  – E.g. Jarvella (1971): 3 clauses, organized into 2 sentences; ability to recall words from the 'middle' clause dramatically improved if that clause is incorporated into the most recent sentence

So …
• Evidence from a number of behavioral tests besides acceptability judgments converges with respect to part-whole relations in a sentence.
• During comprehension, we could use phrase structure either:
  – represented explicitly as a data structure, or
  – as an algorithm … like a map of how to proceed.
• What about transformations?

Miller & Chomsky (1963)
'The psychological plausibility of a transformational model of the language user would be strengthened, of course, if it could be shown that our performance on tasks requiring an appreciation of the structure of transformed sentences is some function of the nature, number and complexity of the grammatical transformations involved.' (Miller & Chomsky 1963: p. 481)

McMahon (1963): judge truth/falsity
a. i.  seven precedes thirteen              K  (true)
   ii. thirteen precedes seven              K  (false)
b. i.  thirteen is preceded by seven        P  (true)
   ii. seven is preceded by thirteen        P  (false)
c. i.  thirteen does not precede seven      N  (true)
   ii. seven does not precede thirteen      N  (false)
d. i.  seven is not preceded by thirteen    PN (true)
   ii. thirteen is not preceded by seven    PN (false)
(K = kernel, P = passive, N = negative, PN = passive negative)

Easy transformations
• Passive
  – The first shot the tired soldier the mosquito bit fired missed.
  – The first shot fired by the tired soldier bitten by the mosquito missed.
• Heavy NP shift
  – I gave a complete set of the annotated works of H.H. Munro to Felix.
  – I gave to Felix a complete set of the annotated works of H.H. Munro.
• Full passives
  – Fido was kissed (by Tom).
• Adjectives
  – The {red house / house which is red} is on fire.

Failure of the DTC (Derivational Theory of Complexity)?
• Any DTC-like prediction is contingent on a particular theory of grammar, which may be wrong.
• It is not surprising that transformations are not the only contributor to perceptual complexity:
  – memory demands, which may increase or decrease
  – ambiguity, where the grammar does not help
  – difficulty of access

Townsend & Bever (2001, ch. 2)
• "Alas, it soon became clear that either the linking hypothesis was wrong, or the grammar was wrong, or both."
• "The moral of this experience is clear. Cognitive science made progress by separating the question of what people understand and say from how they understand and say it. The straightforward attempt to use the grammatical model directly as a processing model failed. The question of what humans know about language is not only distinct from how children learn it, it is distinct from how adults use it."

People took away different morals from the DTC debates
• that the grammar and the parser needed separate models
• that transformations weren't "psychologically real"
Are these conclusions warranted, on the basis of the 'DTC failure' alone?

Parsing: algorithms for recovering phrase structure

Bottom-up parsing
Grammar (starting axiom S):
1. S ➝ NP VP
2. VP ➝ V NP
3. NP ➝ D N
4. N ➝ Bill
5. V ➝ hit
6. D ➝ the
7. N ➝ ball
The parser works from the words up: scanning "Bill hit the ball" left to right, it first recognizes Bill as an NP, then hit as a V, then the as a D and ball as an N; it combines D and N into the object NP, V and NP into a VP, and finally NP and VP into S, yielding [S [NP Bill] [VP [V hit] [NP [D the] [N ball]]]].
Problems?
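A minimal bottom-up sketch in code (my illustration, not from the slides): a shift-reduce loop over the same toy grammar. Greedily reducing whenever the top of the stack matches a rule's right-hand side is a simplification that happens to work for this unambiguous example; real parsers need a policy, or search, for choosing between shifting and competing reductions.

# Toy grammar, written as (left-hand side, right-hand side) pairs; rule order matters here.
RULES = [
    ("S",  ("NP", "VP")),
    ("VP", ("V", "NP")),
    ("NP", ("D", "N")),
    ("NP", ("N",)),
    ("N",  ("Bill",)), ("V", ("hit",)), ("D", ("the",)), ("N", ("ball",)),
]

def shift_reduce(words):
    """Shift words onto a stack; reduce whenever the stack top matches a rule's RHS."""
    stack = []
    buffer = list(words)
    while buffer or len(stack) > 1:
        reduced = False
        for lhs, rhs in RULES:
            n = len(rhs)
            if len(stack) >= n and tuple(node[0] for node in stack[-n:]) == rhs:
                stack[-n:] = [(lhs, stack[-n:])]     # reduce: replace RHS nodes by their parent
                reduced = True
                break
        if not reduced:
            if not buffer:
                raise ValueError("stuck: no reduction applies and no words remain")
            stack.append((buffer.pop(0), []))        # shift the next word
    return stack[0]

print(shift_reduce(["Bill", "hit", "the", "ball"]))  # nested (label, children) tuples for S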
Top-down parsing (hypothesis driven)
Same grammar, same sentence: "Bill hit the ball".
The parser starts from the axiom S and expands hypotheses before looking at the words: S is rewritten as NP VP, NP and VP are in turn expanded by the rules, and each predicted word is checked against the input from left to right until the whole tree above "Bill hit the ball" has been built.

Not all hypotheses succeed
Grammar (starting axiom S): 1. S ➝ NP VP, 2. VP ➝ V NP, 3. NP ➝ D N, 4. VP ➝ V PP, … etc.
Sentence: "Nick flew to Nova Scotia".
The correct parse expands VP as V PP, with PP ➝ P NP. A top-down parser that first hypothesizes VP ➝ V NP (and NP ➝ D N) finds that its predictions fail against "to Nova Scotia", and it has to abandon those hypotheses and try the alternative expansions.

Left-corner parse
Grammar: 1. S ➝ NP VP, 2. VP ➝ V NP, 3. NP ➝ D N, 4. VP ➝ V PP, 5. NP ➝ N, … etc.
Sentence: "Bill hit the ball".
The left-corner strategy mixes bottom-up and top-down steps: the first word Bill is recognized bottom-up as N; the N is projected to NP (rule 5); the NP, as the left corner of rule 1, is projected to S with a VP now predicted top-down; and the next word hit is then recognized as V, the left corner of the predicted VP.
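Purely illustrative (my Python sketch; the grammar rules and PP ➝ P NP come from the slides, while the terminal rules for Nick, flew, to and Nova Scotia are assumptions needed to run the example): a top-down, hypothesis-driven parser that tries each expansion and simply abandons the ones that fail, which is what the "Not all hypotheses succeed" slides walk through for "Nick flew to Nova Scotia".

# Toy grammar; VP has two expansions so that a top-down guess can fail and be retried.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "VP": [["V", "NP"], ["V", "PP"]],
    "PP": [["P", "NP"]],
    "NP": [["D", "N"], ["N"]],
    "N":  [["Bill"], ["ball"], ["Nick"], ["Nova", "Scotia"]],
    "V":  [["hit"], ["flew"]],
    "D":  [["the"]],
    "P":  [["to"]],
}

def parse(symbol, words, i):
    """Try to parse `symbol` starting at position i; yield (tree, next position)
    for every hypothesis that succeeds, so failed expansions are simply abandoned."""
    if symbol not in GRAMMAR:                     # terminal: must match the next word
        if i < len(words) and words[i] == symbol:
            yield symbol, i + 1
        return
    for rhs in GRAMMAR[symbol]:                   # hypothesis: one expansion of this rule
        results = [([], i)]
        for part in rhs:                          # extend every partial analysis with `part`
            results = [(kids + [tree], k)
                       for kids, j in results
                       for tree, k in parse(part, words, j)]
        for kids, j in results:
            yield [symbol] + kids, j

def parse_sentence(sentence):
    words = sentence.split()
    return [tree for tree, j in parse("S", words, 0) if j == len(words)]

print(parse_sentence("Bill hit the ball"))
print(parse_sentence("Nick flew to Nova Scotia"))   # VP -> V NP fails here; VP -> V PP succeeds

A left-corner parser would instead let the category of each incoming word determine which rule to project, predicting the rest of that rule top-down, as in the Bill-to-N-to-NP-to-S trace above.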