Structural Ambiguity and Lexical Relations


Donald Hindle* (AT&T Bell Laboratories)
Mats Rooth† (AT&T Bell Laboratories)

We propose that many ambiguous prepositional phrase attachments can be resolved on the basis of the relative strength of association of the preposition with verbal and nominal heads, estimated on the basis of distribution in an automatically parsed corpus. This suggests that a distributional approach can provide an approximate solution to parsing problems that, in the worst case, call for complex reasoning.

1. Introduction

Prepositional phrase attachment is the canonical case of structural ambiguity, as in the timeworn example:

Example 1
I saw the man with the telescope.

An analysis where the prepositional phrase [PP with the telescope] is part of the object noun phrase has the semantics "the man who had the telescope"; an analysis where the PP has a higher attachment (perhaps as daughter of VP) is associated with a semantics where the seeing is achieved by means of a telescope. The existence of such ambiguity raises problems for language models. It looks like it might require extremely complex computation to determine what attaches to what. Indeed, one recent proposal suggests that resolving attachment ambiguity requires the construction of a discourse model in which the entities referred to in a text are represented and reasoned about (Altmann and Steedman 1988). We take this argument to show that reasoning essentially involving reference in a discourse model is implicated in resolving attachment ambiguities in a certain class of cases. If this phenomenon is typical, there is little hope in the near term for building computational models capable of resolving such ambiguities in unrestricted text.

1.1 Structure-Based Ambiguity Resolution

There have been several structure-based proposals about ambiguity resolution in the literature; they are particularly attractive because they are simple and don't demand calculations in the semantic or discourse domains.
The two main ones are as follows.

Right Association -- a constituent tends to attach to another constituent immediately to its right (Kimball 1973).

* AT&T Bell Laboratories, 600 Mountain Ave., Murray Hill, NJ 07974, USA.
† The new affiliation of the second author is: Institut für maschinelle Sprachverarbeitung, Universität Stuttgart.
© 1993 Association for Computational Linguistics

Computational Linguistics, Volume 19, Number 1

Minimal Attachment -- a constituent tends to attach to an existing nonterminal using the fewest additional syntactic nodes (Frazier 1978).

For the particular case we are concerned with, attachment of a prepositional phrase in a verb object context as in Example 1, these two principles -- at least given the version of syntax that Frazier assumes -- make opposite predictions: Right Association predicts noun attachment, while Minimal Attachment predicts verb attachment.

Psycholinguistic work on structure-based strategies is primarily concerned with modeling the time course of parsing and disambiguation, and acknowledges that other information enters into determining a final parse. Still, one can ask what information is relevant to determining a final parse, and it seems that in this domain structure-based disambiguation is not a very good predictor. A recent study of attachment of prepositional phrases in a sample of written responses to a "Wizard of Oz" travel information experiment shows that neither Right Association nor Minimal Attachment accounts for more than 55% of the cases (Whittemore, Ferrara, and Brunner 1990). And experiments by Taraban and McClelland (1988) show that the structural models are not in fact good predictors of people's behavior in resolving ambiguity.

1.2 Resolving Ambiguity through Lexical Associations

Whittemore, Ferrara, and Brunner (1990) found lexical preferences to be the key to resolving attachment ambiguity. Similarly, Taraban and McClelland found that lexical content was key in explaining people's behavior. Various previous proposals for guiding attachment disambiguation by the lexical content of specific words have appeared (e.g. Ford, Bresnan, and Kaplan 1982; Marcus 1980). Unfortunately, it is not clear where the necessary information about lexical preferences is to be found.
Jensen and Binot (1987) describe the use of dictionary definitions for disambiguation, but dictionaries are typically rather uneven in their coverage. In the Whittemore, Ferrara, and Brunner study (1990), the judgment of attachment preferences had to be made by hand for the cases that their study covered; no precompiled list of lexical preferences was available. Thus, we are posed with the problem of how we can get a good list of lexical preferences.

Our proposal is to use co-occurrence of verbs and nouns with prepositions in a large body of text as an indicator of lexical preference. Thus, for example, the preposition to occurs frequently in the context send NP __, that is, after the object of the verb send. This is evidence of a lexical association of the verb send with to. Similarly, from occurs frequently in the context withdrawal __, and this is evidence of a lexical association of the noun withdrawal with the preposition from. This kind of association is a symmetric notion: it provides no indication of whether the preposition is selecting the verbal or nominal head, or vice versa. We will treat the association as a property of the pair of words. It is a separate issue, which we will not be concerned with in the initial part of this paper, to assign the association to a particular linguistic licensing relation. The suggestion that we want to explore is that the association revealed by textual distribution -- whether its source is a complementation relation, a modification relation, or something else -- gives us information needed to resolve prepositional attachment in the majority of cases.

2. Discovering Lexical Association in Text

A 13 million-word sample of Associated Press news stories from 1989 was automatically parsed by the Fidditch parser (Hindle 1983 and in press), using Church's

part-of-speech analyzer as a preprocessor (Church 1988), a combination that we will call simply "the parser." The parser produces a single partial syntactic description of a sentence. Consider Example 2, and its parsed representation in Example 3. The information in the tree representation is partial in the sense that some attachment information is missing: the nodes dominated by "?" have not been integrated into the syntactic representation. Note in particular that many PPs have not been attached. This is a symptom of the fact that the parser does not (in many cases) have the kind of lexical information that we have just claimed is required in resolving PP attachment.

Example 2
The radical changes in export and customs regulations evidently are aimed at remedying an extreme shortage of consumer goods in the Soviet Union and assuaging citizens angry over the scarcity of such basic items as soap and windshield wipers.

Table 1
A sample of NP heads, preceding verbs, and following prepositions derived from the parse of Example 2.

Verb      Noun        Prep   Syntax
          change      in     -V
          regulation
aim       PRO-        at
remedy    shortage    of
          good        in
          DARTopNP
          VING
          scarcity    of
          item        as
assuage   citizen
          soap
          wiper

From the syntactic analysis provided by the parser, we extracted a table containing the heads of all noun phrases. For each noun phrase head, we recorded the following preposition if any occurred (ignoring whether or not the parser had attached the preposition to the noun phrase), and the preceding verb if the noun phrase was the object of that verb. The entries in Table 1 are those generated from the text above. Each noun phrase in Example 3 is associated with an entry in the Noun column of the table. Usually this is simply the root of the head of the noun phrase: good is the root of the head of consumer goods. Noun phrases with no head, or where the head is not a common noun, are coded in a special way: DARTopNP represents a noun phrase beginning with a definite article and headed by a proper noun, and VING represents a gerundive noun phrase.
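The table-construction step just described can be sketched as follows. The NounPhrase record and its field names are hypothetical stand-ins for the parser's output, not the actual Fidditch interface:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class NounPhrase:
    """One parsed NP -- a hypothetical stand-in for the parser's output."""
    head_root: str                 # root of the head noun, e.g. "good" for "goods"
    following_prep: Optional[str]  # preposition right after the NP, attached or not
    governing_verb: Optional[str]  # root of the verb if the NP is its object
    preverbal: bool                # subject or other pre-verbal position

def table_entry(np: NounPhrase):
    """Build one (verb, noun, prep, syntax) row of the association table.
    The Syntax column gets -V when no preceding verb could license the PP."""
    syntax = "-V" if np.preverbal else ""
    return (np.governing_verb or "", np.head_root, np.following_prep or "", syntax)

# The initial subject of Example 2: no governing verb can license "in".
subject_row = table_entry(NounPhrase("change", "in", None, True))
# The empty object of passive "aimed", followed by "at".
object_row = table_entry(NounPhrase("PRO-", "at", "aim", False))
```

The empty-string cells correspond to the blank cells of Table 1; a real extraction would also need the special codings (DARTopNP, VING) described above.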
PRO- represents the empty category which, in the syntactic theory underlying the parser, is assumed to be the object of the passive verb aimed. In cases where a prepositional phrase follows the noun phrase, the head preposition appears in the Prep column; attached and unattached prepositional phrases generate the same kinds of entries. If the noun phrase is an object, the root of the governing verb appears in the Verb column: aim is the root of aimed, the verb governing the empty

[Example 3: the parser's partial parse tree for Example 2, not reproduced here; nodes dominated by "?", including several unattached PPs, are not integrated into the structure.]

category [PRO]. The last column in the table, labeled Syntax, marks with the symbol -V all cases where there is no preceding verb that might license the preposition: the initial subject of Example 2 is such a case.

In the 13 million-word sample, 2,661,872 noun phrases were identified. Of these, 467,920 were recognized as the object of a verb, and 753,843 were followed by a preposition. Of the object noun phrases identified, 223,666 were ambiguous verb-noun-preposition triples.

3. Estimating Associations

The table of verbs, nouns, and prepositions is in several respects an imperfect source of information about lexical associations. First, the parser gives us incorrect analyses in some cases. For instance, in the analysis partially described in Example 4a, the parser incorrectly classified probes as a verb, resulting in a table entry probe lightning in. Similarly, in Example 4b, the infinitival marker to has been misidentified as a preposition.

Example 4
a. [NP The space] [V probes] [NP detected lightning] [PP in Jupiter's upper atmosphere] and observed auroral emissions like Earth's northern lights in the Jovian polar regions.
b. The Bush administration told Congress on Tuesday it wants to [V preserve] [NP the right] [PP [P to] control entry] to the United States of anyone who was ever a Communist.

Second, a preposition in an entry might be structurally related to neither the noun of the entry nor the verb (if there is one), even if the entry is derived from a correct parse.[1] For instance, the phrase headed by the preposition might have a higher locus of attachment:

Example 5
a.
The Supreme Court today agreed to consider reinstating the murder conviction of a New York City man who confessed to [VING killing] [NP his former girlfriend] [P after] police illegally arrested him at his home.
b. NBC was so afraid of hostile advocacy groups and unnerving advertisers that it shot its dramatization of the landmark court case that [VPAST legalized] [NP abortion] [PP under two phony script titles].

The temporal phrase headed by after modifies confess, but given the procedure described above, Example 5a results in a tuple kill girlfriend after. In the second example, a tuple legalize abortion under is extracted, although the PP headed by under modifies the higher verb shot.

Finally, entries of the form verb noun preposition do not tell us whether to induce a lexical association between verb and preposition or between noun and preposition. We will view the first two problems as noise that we do not have the means to eliminate,

[1] For present purposes, we can consider a parse correct if it contains no incorrect information in the relevant area. Provided the PPs in Example 5 are unattached, the parses would be correct in this sense. The incorrect information is added by our table construction step, which (given our interpretation of the table) assumes that a preposition following an object NP modifies either the NP or its governing verb.

and partially address the third problem in a procedure we will now describe. We want to use the verb-noun-preposition table to derive a table of bigram counts, where a bigram is a pair consisting of a noun or verb and an associated preposition (or no preposition). To do this we need to try to assign each preposition that occurs either to the noun or to the verb that it occurs with. In some cases it is fairly certain whether the preposition attaches to the noun or the verb; in other cases, this is far less certain. Our approach is to assign the clear cases first, then to use these to decide the unclear cases that can be decided, and finally to divide the data in the remaining unresolved cases between the two hypotheses (verb and noun attachment). The procedure for assigning prepositions is as follows:

1. No Preposition -- if there is no preposition, the noun or verb is simply entered with a special symbol NULL, conceived of as the null preposition. (Items b, f, g, and j-l in Table 1 are assigned.)

2. Sure Verb Attach 1 -- the preposition is attached to the verb if the noun phrase head is a pronoun.

3. Sure Verb Attach 2 -- the preposition is attached to the verb if the verb is passivized, unless the preposition is by. The instances of by following a passive verb were left unassigned. (Item c in Table 1 is assigned.)

4. Sure Noun Attach -- the preposition is attached to the noun, if the noun phrase occurs in a context where no verb could license the prepositional phrase, specifically if the noun phrase is in a subject or other pre-verbal position. The required syntactic information is present in the last column of the table derived from the parse.
(Item a in Table 1 is assigned.)

5. Ambiguous Attach 1 -- using the table of attachments computed so far, if the LA score for the ambiguity (a score that compares the probability of noun versus verb attachment, as described below) is greater than 2.0 or less than -2.0, then assign the preposition according to the LA score. Iterate until this step produces no new attachments. (Item d in Table 1 may be assigned.)

6. Ambiguous Attach 2 -- for the remaining ambiguous triples, split the datum between the noun and the verb, assigning a count of .5 to the noun-preposition pair and .5 to the verb-preposition pair. (Item d in Table 1 is assigned, if not assigned in the previous step.)

7. Unsure Attach -- assign remaining pairs to the noun. (Items e, h, and i in Table 1 are assigned.)

This procedure gives us bigram counts representing the frequency with which a given noun occurs associated with an immediately following preposition (or no preposition), or a given verb occurs in a transitive use and is associated with a preposition immediately following the object of the verb. We use the following notation: f(w, p) is the frequency count for the pair consisting of the verb or noun w and the preposition p. The unigram frequency count for the word w (either a verb, noun, or preposition) can be viewed as a sum of bigram frequencies, and is written f(w). For instance, if p is a preposition, f(p) = Σ_w f(w, p).
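As a rough sketch (not the authors' code), the seven assignment steps might be implemented as below. The triple dictionaries, their pronoun_object and passive flags, and the la_score callback are all assumed representations of the table described above:

```python
from collections import Counter

def assign_attachments(triples, la_score, threshold=2.0):
    """Turn (verb, noun, prep) triples into noun- and verb-preposition
    bigram counts, following the seven assignment steps."""
    noun_counts, verb_counts = Counter(), Counter()
    ambiguous = []
    for t in triples:
        v, n, p = t.get("verb"), t["noun"], t.get("prep")
        if p is None:
            # 1. No Preposition: enter with the null preposition NULL
            # (we assume both the noun and, if present, its governing verb).
            noun_counts[(n, "NULL")] += 1
            if v:
                verb_counts[(v, "NULL")] += 1
        elif v and t.get("pronoun_object"):
            verb_counts[(v, p)] += 1          # 2. Sure Verb Attach 1
        elif v and t.get("passive"):
            if p != "by":
                verb_counts[(v, p)] += 1      # 3. Sure Verb Attach 2
            # 'by' after a passive verb is left unassigned
        elif t.get("syntax") == "-V":
            noun_counts[(n, p)] += 1          # 4. Sure Noun Attach
        elif v:
            ambiguous.append((v, n, p))       # decided in steps 5-6
        else:
            noun_counts[(n, p)] += 1          # 7. Unsure Attach: default to noun
    changed = True
    while changed:                            # 5. Ambiguous Attach 1: iterate
        changed, rest = False, []
        for v, n, p in ambiguous:
            s = la_score(v, n, p, noun_counts, verb_counts)
            if s > threshold:
                verb_counts[(v, p)] += 1; changed = True
            elif s < -threshold:
                noun_counts[(n, p)] += 1; changed = True
            else:
                rest.append((v, n, p))
        ambiguous = rest
    for v, n, p in ambiguous:                 # 6. Ambiguous Attach 2: split .5/.5
        verb_counts[(v, p)] += 0.5
        noun_counts[(n, p)] += 0.5
    return noun_counts, verb_counts
```

Counter values may be fractional after step 6, which is why unigram counts such as f(send) come out nonintegral, as noted below.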

3.1 The Procedure for Guessing Attachment

Our object is to develop a procedure to guess whether a preposition is attached to the verb or its object when a verb and its object are followed by a preposition. We assume that in each case of attachment ambiguity, there is a forced choice between two outcomes: the preposition attaches either to the verb or to the noun.[2] For example, in Example 6, we want to choose between two possibilities: either into is attached to the verb send or it is attached to the noun soldier.

Example 6
Moscow sent more than 100,000 soldiers into Afghanistan.

In particular, we want to choose between two structures:

Example 7
a. verb attach structure: [VP send [NP soldier NULL] [PP into ...] ...]
b. noun attach structure: [VP send [NP soldier [PP into ...]] ...]

For the verb attach case, we require not only that the preposition attach to the verb send but also that the noun soldier have no following prepositional phrase attached: since into directly follows the head of the object noun phrase, there is no room for any post-modifier of the noun soldier. We use the notation NULL to emphasize that in order for a preposition licensed by the verb to be in the immediately postnominal position, the noun must have no following complements (or adjuncts). For the case of noun attachment, the verb may or may not have additional prepositional complements following the prepositional phrase associated with the noun.

Since we have a forced choice between two outcomes, it is appropriate to use a likelihood ratio to compare the attachment probabilities (cf. Mosteller and Wallace 1964).[3] In particular, we look at the log of the ratio of the probability of verb attach to the probability of noun attach.
We will call this log likelihood ratio the LA (lexical association) score.

    LA(v, n, p) = log2 [ P(verb-attach p | v, n) / P(noun-attach p | v, n) ]

For the current example,

    P(verb-attach into | send_V, soldier_N) = P(into | send_V) P(NULL | soldier_N)

and

    P(noun-attach into | send_V, soldier_N) = P(into | soldier_N).

Again, the probability of noun attachment does not involve a term indicating that the verb sponsors no (additional) complement; when we observe a prepositional phrase that is in fact attached to the object NP, the verb might or might not have a complement or adjunct following the object phrase.

[2] Thus we are ignoring the fact that the preposition may in fact be licensed by neither the verb nor the noun, as in Example 5.

[3] In earlier versions of this paper we used a t-test for deciding attachment and a different procedure for estimating the probabilities. The current procedure has several advantages. Unlike the t-test used previously, it is sensitive to the magnitude of the difference between the two probabilities, not to our confidence in our ability to estimate those probabilities accurately. And our estimation procedure has the property that it defaults (in case of novel words) to the average behavior for nouns or verbs, for instance, reflecting a default preference with of for noun attachment.
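A minimal sketch of the LA score under maximum-likelihood estimates, using the counts the text reports for the send/soldier/into example (the estimation is refined by interpolation further below):

```python
import math

def la_score(p_prep_verb, p_null_noun, p_prep_noun):
    """LA(v, n, p) = log2( P(p|v) * P(NULL|n) / P(p|n) )."""
    return math.log2(p_prep_verb * p_null_noun / p_prep_noun)

# Maximum-likelihood estimates from the reported counts:
# f(send_V, into) = 86, f(send_V) = 1742.5,
# f(soldier_N, NULL) = 1182, f(soldier_N, into) = 1, f(soldier_N) = 1478.
p_into_send = 86 / 1742.5        # ≈ .049
p_null_soldier = 1182 / 1478     # ≈ .800
p_into_soldier = 1 / 1478        # ≈ .0007

score = la_score(p_into_send, p_null_soldier, p_into_soldier)
# Positive score -> verb attachment. With the three probabilities rounded
# as in the text, log2(.049 * .800 / .0007) ≈ 5.81.
```

The sign of the result selects the attachment, and its magnitude gives the log odds, exactly as described for the LA score in the text.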

We can estimate these probabilities from the table of co-occurrence counts as:[4]

    P(into | send_V)    = f(send_V, into) / f(send_V)       = 86 / 1742.5  ≈ .049
    P(NULL | soldier_N) = f(soldier_N, NULL) / f(soldier_N) = 1182 / 1478  ≈ .800
    P(into | soldier_N) = f(soldier_N, into) / f(soldier_N) = 1 / 1478     ≈ .0007

Thus, the LA score for this example is:

    LA(send_V, soldier_N, into) = log2 [ (.049 * .800) / .0007 ] ≈ 5.81

The LA score has several useful properties. The sign indicates which possibility, verb attachment or noun attachment, is more likely; an LA score of zero means they are equally likely. The magnitude of the score indicates how much more probable one outcome is than the other. For example, if the LA score is 2.0, then the probability of verb attachment is four times greater than noun attachment. Depending on the task, we can require a certain threshold of LA score magnitude before making a decision.[5]

As usual, in dealing with counts from corpora we must confront the problem of how to estimate probabilities when counts are small. The maximum likelihood estimate described above is not very good when frequencies are small, and when frequencies are zero, the formula will not work at all. We use a crude adjustment to observed frequencies that has the right general properties, though it is not likely to be a very good estimate when frequencies are small. For our purposes, however -- exploring in general the relation of distribution in a corpus to attachment disambiguation -- we believe it is sufficient. Other approaches to adjusting small frequencies are discussed in Church et al. (1991) and Gale, Church, and Yarowsky (in press).

The idea is to use the typical association rates of nouns and verbs to interpolate our probabilities.
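A minimal sketch of such an interpolated estimate, implementing the redefinition P(p | w) = (f(w, p) + f(C, p)/f(C)) / (f(w) + 1) given below, where C aggregates counts over all nouns (or all verbs); the aggregate counts here are hypothetical, chosen only to illustrate the properties discussed:

```python
def smoothed_p(f_pair, f_word, f_class_pair, f_class):
    """Interpolated estimate of P(p | w):
    (f(w, p) + f(C, p)/f(C)) / (f(w) + 1)."""
    return (f_pair + f_class_pair / f_class) / (f_word + 1)

# Hypothetical aggregates: across all nouns, this preposition follows
# 2% of the time, so the class average f(N, p)/f(N) is .02.
f_N, f_N_p = 1_000_000, 20_000

unseen = smoothed_p(0, 0, f_N_p, f_N)     # unseen noun -> class average .02
once = smoothed_p(1, 1, f_N_p, f_N)       # f(n,p) = f(n) = 1 -> MLE of 1 nearly halved
big = smoothed_p(86, 1742.5, f_N_p, f_N)  # large counts -> close to MLE 86/1742.5
```

The three calls illustrate the behavior claimed in the text: novel words fall back to the class average, singleton evidence is heavily discounted, and large counts are barely changed.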
Where f(N, p) = Σ_n f(n, p), f(V, p) = Σ_v f(v, p), f(N) = Σ_n f(n), and

[4] The nonintegral count for send is a consequence of the data-splitting step Ambiguous Attach 2, and the definition of unigram frequencies as a sum of bigram frequencies.

[5] An advantage of the likelihood ratio approach is that we can use it in a Bayesian discrimination framework to take into account other factors that might influence our decision about attachment (see Gale, Church, and Yarowsky [in press] for a discussion of this approach). We know of course that other information has a bearing on the attachment decision. For example, we have observed that if the noun phrase object includes a superlative adjective as a premodifier, then noun attachment is certain (for a small sample of 16 cases). We could easily take this into account by setting the prior odds ratio to heavily favor noun attachment: let's suppose that if there is a superlative in the object noun phrase, then noun attachment is say 1000 times more probable than verb attachment; otherwise, they are equally probable. Then following Mosteller and Wallace (1964), we assume that

    final attachment odds = log2(initial odds) + LA.

In case there is no superlative in the object, the initial log odds will be zero (verb and noun attachment are equally probable), and the final odds will equal our LA score. If there is a superlative,

    final attachment odds = log2(1000/1) + LA(v, n, p).

f(V) = Σ_v f(v), we redefine our probability estimates in the following way:

    P(p | n) = ( f(n, p) + f(N, p)/f(N) ) / ( f(n) + 1 )

    P(p | v) = ( f(v, p) + f(V, p)/f(V) ) / ( f(v) + 1 )

When f(n) is zero, the estimate for P(p | n) is the average f(N, p)/f(N) across all nouns, and similarly for verbs. When f(n, p) is zero, the estimate used is proportional to this average. If we have seen only one case of a noun and it occurred with a preposition p (that is, f(n, p) = 1 and f(n) = 1), then our estimate is nearly cut in half. This is the kind of effect we want, since under these circumstances we are not very confident in 1 as an estimate of P(p | n). When f(n, p) is large, the adjustment factor does not make much difference. In general, this interpolation procedure adjusts small counts in the right direction and has little effect when counts are large.

For our current example, this estimation procedure changes the LA score little:

    LA(send_V, soldier_N, into)
      = log2 [ ( (f(send_V, into) + f(V, into)/f(V)) / (f(send_V) + 1) )
               * ( (f(soldier_N, NULL) + f(N, NULL)/f(N)) / (f(soldier_N) + 1) )
               / ( (f(soldier_N, into) + f(N, into)/f(N)) / (f(soldier_N) + 1) ) ]
      ≈ 5.87

The LA score of 5.87 for this example is positive and therefore indicates verb attachment; the magnitude is large enough to suggest a strong preference for verb attachment. This method of calculating the LA score was used both to decide unsure cases in building the bigram tables as described in Ambiguous Attach 1, and to make the attachment decisions in novel ambiguous cases, as discussed in the sections following.

4. Testing Attachment

To evaluate the performance of the procedure, 1000 test sentences in which the parser identified an ambiguous verb-noun-preposition triple were randomly selected from AP news stories.
These sentences were selected from stories included in the 13 million-word sample, but the particular sentences were excluded from the calculation of lexical associations. The two authors first guessed attachments on the verb-noun-preposition triples, making a judgment on the basis of the three head words alone. The judges were required to make a choice in each instance. This task is in essence the one that we will give the computer -- to judge the attachment without any more information than the preposition and the heads of the two possible attachment sites.

This initial step provides a rough indication of what we might expect to be achievable based on the information our procedure is using. We also wanted a standard of correctness for the test sentences. We again judged the attachment for the 1000 triples,

this time using the full-sentence context, first grading the test sentences separately, and then discussing examples on which there was disagreement. Disambiguating the test sample turned out to be a surprisingly difficult task. While many decisions were straightforward, more than 10% of the sentences seemed problematic to at least one author. There are several kinds of constructions where the attachment decision is not clear theoretically. These include idioms as in Examples 8 and 9, light verb constructions (Example 10), and small clauses (Example 11).

Example 8
But over time, misery has given way to mending.

Example 9
The meeting will take place in Quantico.

Example 10
Bush has said he would not make cuts in Social Security.

Example 11
Sides said Francke kept a .38-caliber revolver in his car's glove compartment.

In the case of idioms, we made the assignment on the basis of a guess about the syntactic structure of the idiom, though this was sometimes difficult to judge. We chose always to assign light verb constructions to noun attachment, based on the fact that the noun supplies the lexical information about what prepositions are possible, and small clauses to verb attachment, based on the fact that this is a predicative construction lexically licensed by the verb.

Another difficulty arose with cases where there seemed to be a systematic semantically based indeterminacy about the attachment. In the situation described by Example 12a, the bar and the described event or events are presumably in the same location, and so there is no semantic reason to decide on one attachment. Example 12b shows a systematic benefactive indeterminacy: if you arrange something for someone, then the thing arranged is also for them. The problem in Example 12c is that signing an agreement usually involves two participants who are also parties to the agreement. Example 13 gives some further examples drawn from another test sample.

Example 12
a.
... known to frequent the same bars in one neighborhood.
b. Inaugural officials reportedly were trying to arrange a reunion for Bush and his old submarine buddies.
c. We have not signed a settlement agreement with them.

Example 13
a. It said the rebels issued a challenge to soldiers in the area and fought with them for 30 minutes.
b. The worst such attack came Nov. 11 when a death squad firing submachine guns killed 43 people in the northwest town of Segovia.
c. Another charge raised at the Contra news conference was that the Sandinistas have mined roads along the Honduran and Costa Rican borders.

d. Buckner said Control Data is in the process of negotiating a new lending agreement with its banks.
e. ... which would require ailing banks and S&Ls to obtain advance permission from regulators before raising high-cost deposits through money brokers.
f. She said the organization has opened a cleaning center in Seward and she was going to Kodiak to open another.

In general, we can say that an attachment is semantically indeterminate if situations that verify the meaning associated with one attachment also make the meaning associated with the other attachment true. Even a substantial overlap (as opposed to identity) between the classes of situations verifying the two meanings makes an attachment choice difficult.

The problems in determining attachments are heterogeneous. The idiom, light verb, and small clause constructions represent cases where the simple distinction between noun attachment and verb attachment perhaps does not make sense, or is very theory-dependent. It seems to us that the phenomenon of semantically based indeterminacy deserves further exploration. If it is often difficult to decide what licenses a prepositional phrase, we need to develop language models that appropriately capture this. For our present purpose, we decided to make an attachment choice in all cases, in some cases relying on controversial theoretical considerations, or relatively unanalyzed intuitions.

In addition to the problematic cases, 120 of the 1000 triples identified automatically as instances of the verb-object-preposition configuration turned out in fact to be other constructions, often as the result of parsing errors. Examples of this kind were given above, in the context of our description of the construction of the verb-noun-preposition table.
Some further misidentifications that showed up in the test sample are: identifying the subject of the complement clause of say as its object, as in Example 14a, which was identified as (say ministers from), and misparsing two constituents as a single-object noun phrase, as in Example 14b, which was identified as (make subject to).

Example 14
a. Ortega also said deputy foreign ministers from the five governments would meet Tuesday in Managua, ...
b. Congress made a deliberate choice to make this commission subject to the open meeting requirements ...

After agreeing on the 'correct' attachment for the test sample, we were left with 880 disambiguated verb-noun-preposition triples, having discarded the examples that were not instances of the relevant construction. Of these, 586 are noun attachments and 294 are verb attachments.

