Tree-Adjoining Grammar Parsing and Vector Representations of Supertags


Jungo Kasai
Yale University
December 14, 2017

Outline

1. Background and Motivations
2. Supertagging Models
3. Parsing Models
4. Vector Representations of Supertags
5. Ongoing TAG Parsing Work
6. Applications of TAG
7. Future Work

1. Background and Motivations

Syntactic Parsing

[Figure: phrase-structure tree for "John really likes Mary": S over NP (John) and VP, with AdvP (really) above the VP containing V (likes) and NP (Mary).]

Why do we need parsing?
Does John love Mary? Does Mary love John?
Understanding of a sentence depends on its structure.

Context-Free Grammars

S → NP VP
VP → AdvP VP
VP → V NP
AdvP → really
NP → John
NP → Mary
NP → they
V → like
V → likes

These production rules generate sentences.

Context-Free Grammars

Fundamental problem: constraints are distributed over separate rules.
How do we choose V → like or V → likes?
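To make the overgeneration problem concrete, here is a minimal sketch using NLTK (the use of NLTK is my assumption, not part of the original slides) that enumerates strings from the toy grammar above. Because agreement is not encoded anywhere in the rules, it derives both "John likes Mary" and the ungrammatical "John like Mary":

```python
# A minimal sketch (assumes NLTK is installed) showing that the toy CFG
# above overgenerates: agreement constraints are split across separate rules.
from nltk import CFG
from nltk.parse.generate import generate

grammar = CFG.fromstring("""
S -> NP VP
VP -> AdvP VP
VP -> V NP
AdvP -> 'really'
NP -> 'John' | 'Mary' | 'they'
V -> 'like' | 'likes'
""")

for sentence in generate(grammar, depth=5):
    print(" ".join(sentence))
# Output includes both "John likes Mary" and "John like Mary":
# nothing in the rules ties the subject NP to the verb form.
```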

Tree-Adjoining Grammar

Tree-Adjoining Grammar (TAG) localizes grammatical constraints:
- A finite set of lexicalized elementary trees.
- A finite set of operations (substitution and adjunction) used to combine elementary trees.

[Figure: elementary trees for "likes" (transitive), "sleep", "John", and the auxiliary tree for "really" with a VP* foot node.]

Tree-Adjoining Grammar: Substitution

[Figure: the substitution operation.]

Tree-Adjoining Grammar: Adjunction

[Figure: the adjunction operation.]

Tree-Adjoining Grammar: Adjunction

Adjunction allows for unbounded recursion while still enforcing agreement.
Example: John smartly occasionally really only likes Mary.
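To make the two operations concrete, here is a toy sketch in Python. The Node class and helper names are illustrative assumptions, not the representation used in the parsers described in these slides:

```python
# A toy sketch of TAG elementary trees and the two combination operations.
# The Node class and helpers are illustrative assumptions.

class Node:
    def __init__(self, label, children=None, subst=False, foot=False):
        self.label = label              # e.g. "S", "NP0", "VP", "likes"
        self.children = children or []
        self.subst = subst              # substitution site (NP0 with down-arrow)
        self.foot = foot                # foot node of an auxiliary tree (VP*)

def walk(node):
    yield node
    for child in node.children:
        yield from walk(child)

def substitute(site, initial_tree):
    """Fill a substitution site with an initial tree of matching category."""
    assert site.subst
    site.children, site.subst = initial_tree.children, False

def adjoin(target, aux_tree):
    """Splice an auxiliary tree in at `target`; the material previously
    below `target` reattaches under the auxiliary tree's foot node."""
    foot = next(n for n in walk(aux_tree) if n.foot)
    foot.children, foot.foot = target.children, False
    target.children = aux_tree.children

# "John really likes Mary": substitute John and Mary into the likes tree,
# then adjoin "really" at the VP node. Adjunction can be repeated without
# bound ("occasionally really only ..."), yet agreement is fixed once,
# inside the likes tree itself.
likes = Node("S", [Node("NP0", subst=True),
                   Node("VP", [Node("V", [Node("likes")]),
                               Node("NP1", subst=True)])])
substitute(likes.children[0], Node("NP", [Node("John")]))
substitute(likes.children[1].children[1], Node("NP", [Node("Mary")]))
adjoin(likes.children[1], Node("VP", [Node("AdvP", [Node("really")]),
                                      Node("VP", foot=True)]))
```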

Derivation Tree

The derivation tree records the operations.
It forms a dependency tree (each token has exactly one parent).

[Figure: derivation tree for "John really likes Mary": likes attaches to ROOT; John attaches to likes by Subst 0, really by Adj, and Mary by Subst 1.]
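Since the derivation tree is just a dependency tree, it can be stored as one parent and one operation label per token. A minimal sketch; the array layout is an illustrative assumption:

```python
# The derivation tree above, encoded as a dependency structure: each token
# stores its parent and the operation that attached it.
tokens = ["ROOT", "John", "really", "likes", "Mary"]
parent = [None, 3, 3, 0, 3]            # each token has exactly one parent
operation = [None, "Subst 0", "Adj", "ROOT", "Subst 1"]

for i in range(1, len(tokens)):
    print(f"{tokens[parent[i]]} --{operation[i]}--> {tokens[i]}")
# likes --Subst 0--> John, likes --Adj--> really,
# ROOT --ROOT--> likes,   likes --Subst 1--> Mary
```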

Two Steps in TAG Parsing

Now the reverse process.
- Supertagging: assign an elementary tree (supertag) to each token, similar to POS tagging.
- Parsing: predict operations on the elementary trees.

[Figure: alternative supertags for the word "left".]

2. Supertagging Models

Supertagging is a Bottleneck

Supertagger      Parser         Stag Acc   UAS     LAS
Gold             Chart (MICA)   100.00     97.60   97.30
Maxent (MICA)    Chart (MICA)   88.52      87.60   85.80

Supertagging is almost parsing.
There are about 5,000 supertags in the grammar.
About half of them occur only once in the training data (PTB WSJ Sections 1-22).
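As a quick illustration, the long-tail claim can be checked by counting supertag frequencies over the training data. A minimal sketch, assuming a hypothetical one-token-per-line file with the supertag in the last column (the filename and format are assumptions):

```python
# A minimal sketch for checking the supertag long tail; "train.stag" and its
# token-per-line layout with the supertag in the last column are hypothetical.
from collections import Counter

counts = Counter()
with open("train.stag") as f:
    for line in f:
        if line.strip():
            counts[line.split()[-1]] += 1

singletons = sum(1 for c in counts.values() if c == 1)
print(f"{len(counts)} supertag types, {singletons} occur only once")
```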

BiLSTM Supertagging

[Figure: BiLSTM supertagger architecture.]
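In outline, the supertagger reads word representations through a bidirectional LSTM and predicts one of the roughly 5,000 supertags at each position. A minimal PyTorch sketch; the layer sizes and the use of plain word embeddings alone (no character or POS features) are my assumptions, not details from the slides:

```python
# A minimal BiLSTM supertagger sketch; hyperparameters are illustrative.
import torch
import torch.nn as nn

class BiLSTMSupertagger(nn.Module):
    def __init__(self, vocab_size, n_supertags, emb_dim=100, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden_dim, num_layers=2,
                              bidirectional=True, batch_first=True)
        # Each position gets a distribution over the ~5,000 supertags.
        self.out = nn.Linear(2 * hidden_dim, n_supertags)

    def forward(self, word_ids):             # word_ids: (batch, seq_len)
        states, _ = self.bilstm(self.embed(word_ids))
        return self.out(states)              # (batch, seq_len, n_supertags)

tagger = BiLSTMSupertagger(vocab_size=50_000, n_supertags=5_000)
logits = tagger(torch.randint(0, 50_000, (1, 4)))  # e.g. "John really likes Mary"
print(logits.argmax(-1))                            # predicted supertag ids
```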

Supertagging is Still a Bottleneck

Supertagger      Parser         Stag Acc   UAS     LAS
Gold             Chart (MICA)   100.00     97.60   97.30
Maxent (MICA)    Chart (MICA)   88.52      87.60   85.80
BiLSTM           Chart (MICA)   89.32      90.05   88.32

We can compensate for supertagging errors by exploiting structural similarities across elementary trees.
Similarities across supertags are not utilized by the chart parser.
We use two alternative families of parsing algorithms.

3. Parsing Models

Parsing Models

- Prior work: unlexicalized chart parser (MICA) [Bangalore et al., 2009]
- Unlexicalized transition-based parser [Kasai et al., 2017, Friedman et al., 2017]
- Graph-based parser (work in progress)

Transition-based Parsing

Arc-eager system (MALT) [Nivre et al., 2006].

[Figure: target derivation tree for "John really likes Mary".]

Transition-based Parsing

[Figures: step-by-step arc-eager transitions parsing "John really likes Mary".]

Transition-based TAG Parsing

How do we learn?
- Represent the configuration by the top k elements of the stack and buffer: {s_i, b_i} for i = 1, ..., k [Chen and Manning, 2014].
- Represent s_i (b_i) by its TAG elementary tree and the substitution operations performed into s_i so far.
- Encode the TAG elementary trees and the substitution operations with dense vectors, as in the sketch below.
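A sketch of this featurization; the ConfigEncoder class, the padding scheme, and the dimensions are illustrative assumptions rather than the actual model:

```python
# A sketch of configuration featurization: embed the supertags and attachment
# operations of the top-K stack/buffer items, concatenate into one vector.
import torch
import torch.nn as nn

K = 3          # top-k stack/buffer positions used as features (assumed)
PAD = 0        # id reserved for empty positions

class ConfigEncoder(nn.Module):
    def __init__(self, n_supertags, n_ops, dim=50):
        super().__init__()
        self.stag_embed = nn.Embedding(n_supertags + 1, dim, padding_idx=PAD)
        self.op_embed = nn.Embedding(n_ops + 1, dim, padding_idx=PAD)

    def forward(self, stag_ids, op_ids):
        # stag_ids, op_ids: (batch, 2*K) ids for the top-K stack and buffer
        # items and the substitution operations performed into each of them.
        feats = torch.cat([self.stag_embed(stag_ids),
                           self.op_embed(op_ids)], dim=-1)
        return feats.flatten(1)   # one dense feature vector per configuration

def top_k_ids(items, k=K):
    """Pad/truncate a stack or buffer to the supertag ids of its top k items."""
    ids = [it.supertag_id for it in items[-k:]]
    return [PAD] * (k - len(ids)) + ids
```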

NN Transition-based Parsing Model

[Figure: transition-based parser neural network architecture.]

Example

John really likes Mary

Stack:     ROOT likes
Buffer:    Mary
Relations: {(ROOT, likes, ROOT), (likes, John, 0), ...}
Action:    RIGHT:1 (a right-arc attaching Mary by Subst 1)
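For reference, here is a hand-worked reconstruction (mine, not from the slides) of the full arc-eager transition sequence for this sentence; the configuration shown above corresponds to the RIGHT-ARC Subst 1 step:

```python
# A hand-worked arc-eager transition sequence for "John really likes Mary".
# Action names follow Nivre's arc-eager system; labels follow the
# derivation tree shown earlier.
steps = [
    # (action,             stack after,                buffer after)
    ("SHIFT",              ["ROOT", "John"],           ["really", "likes", "Mary"]),
    ("SHIFT",              ["ROOT", "John", "really"], ["likes", "Mary"]),
    ("LEFT-ARC Adj",       ["ROOT", "John"],           ["likes", "Mary"]),  # likes -> really
    ("LEFT-ARC Subst 0",   ["ROOT"],                   ["likes", "Mary"]),  # likes -> John
    ("RIGHT-ARC ROOT",     ["ROOT", "likes"],          ["Mary"]),           # ROOT -> likes
    ("RIGHT-ARC Subst 1",  ["ROOT", "likes", "Mary"],  []),                 # likes -> Mary
    ("REDUCE",             ["ROOT", "likes"],          []),
    ("REDUCE",             ["ROOT"],                   []),
]
for action, stack, buffer in steps:
    print(f"{action:<18} stack={stack} buffer={buffer}")
```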

Parsing Results

                    Gold Stags        Predicted Stags
Parsing Model       UAS     LAS       UAS     LAS
MICA Chart          97.60   97.30     90.05   88.32
Transition-based    97.67   97.45     90.23   88.77

Table: Results on Section 00. Beam size 16.
Predicted supertags are from our BiLSTM supertagger.

4. Vector Representations of Supertags

Embeddings for Elementary Trees

[Figure: embeddings for elementary trees.]

Induced Embeddings

The input is a one-hot vector for each supertag; randomly initialized weights are trained.
The resulting embeddings have no a priori knowledge of the syntactic properties of the elementary trees.

PCA Plots of Vector Representations

[Figure: declarative/subject-relative alignment (atomic embeddings). Example: "the man sneezed" vs. "the man who sneezed".]

PCA Plots of Vector Representations

[Figure: transitive/intransitive alignment (atomic embeddings). Example: "the man who devoured the pizza" vs. "the man who sneezed".]

Analogy Tests

Semantic analogies have been used to test word embeddings (e.g., [Mikolov et al., 2013]):

king : man :: queen : woman
king − man + woman ≈ queen

Analogy Tests

We use syntactic analogies to test supertag embeddings:

trans. : intrans. :: subj.rel.trans. : subj.rel.intrans.
trans. − intrans. + subj.rel.intrans. ≈ subj.rel.trans.

[Figure: the four elementary trees involved in this analogy.]
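The test itself is vector arithmetic plus a cosine-ranked nearest-neighbor lookup. A minimal sketch (the string keys are descriptive stand-ins for actual supertag identifiers); the same rank also yields the "average position" metric reported below:

```python
# A sketch of the analogy evaluation; `embeddings` maps supertag names
# to numpy vectors. Keys here are descriptive stand-ins.
import numpy as np

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

def analogy_rank(embeddings, a, b, c, d):
    """Rank of the correct answer d among all supertags, by cosine
    similarity to a - b + c (rank 1 counts toward '% correct')."""
    target = embeddings[a] - embeddings[b] + embeddings[c]
    ranked = sorted(embeddings, key=lambda t: -cosine(embeddings[t], target))
    return ranked.index(d) + 1

# e.g. trans - intrans + subj.rel.intrans ≈ subj.rel.trans
# rank = analogy_rank(emb, "trans", "intrans",
#                     "subj.rel.intrans", "subj.rel.trans")
```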

Formulation of Tests

Syntactic transformations used to construct analogies:
- Subject relativization
- Object relativization
- Subject wh-movement
- Object wh-movement
- Transitivization
- Passivization with a by-phrase
- Passivization without a by-phrase
- Infinitivization
- Dative shift

Analogy Test Results

n      # equations   % correct   Avg. position
300    246           50.40       7.98
4724   57220         4.62        289.48

Table: Analogy task results. (n restricts which supertags are considered: for a given n, only equations whose supertags are all among the n most common supertags are included.)

% correct: percent of equations for which the left-hand side's closest cosine neighbor was the right-hand side.
Avg. position: the position of the correct right-hand side in the list of supertag embeddings ranked by cosine distance from the left-hand side.

5. Ongoing TAG Parsing Work

Graph-based Parsing

[McDonald et al., 2005]
- Score n^2 directed edges (n potential parents for each of the n tokens in a sentence) using features.
- Find the maximum spanning tree (with a greedy cycle fix), as sketched below.
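A minimal decoding sketch. Picking each token's best head and then repairing cycles greedily is a simplification of the Chu-Liu/Edmonds procedure used by McDonald et al. (2005), kept here only to illustrate the idea:

```python
# A sketch of MST decoding with a greedy cycle fix; scores[h][d] is the
# score of head h -> dependent d over tokens 0..n (0 = ROOT). Simplified
# relative to the full Chu-Liu/Edmonds algorithm.
def mst_decode(scores, n):
    # Greedily pick the best-scoring head for every token.
    head = [0] * (n + 1)
    for d in range(1, n + 1):
        head[d] = max((h for h in range(n + 1) if h != d),
                      key=lambda h: scores[h][d])
    # Repair cycles: break each at its weakest edge by rerouting that
    # dependent to its best head outside the cycle (ROOT is never in one).
    while True:
        cycle = find_cycle(head, n)
        if cycle is None:
            return head
        d = min(cycle, key=lambda d: scores[head[d]][d])
        head[d] = max((h for h in range(n + 1) if h not in cycle and h != d),
                      key=lambda h: scores[h][d])

def find_cycle(head, n):
    """Return the set of nodes on some cycle of head pointers, or None."""
    for start in range(1, n + 1):
        seen, v = set(), start
        while v != 0:
            if v in seen:
                cyc, u = {v}, head[v]
                while u != v:
                    cyc.add(u)
                    u = head[u]
                return cyc
            seen.add(v)
            v = head[v]
    return None
```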

Graph-based TAG Parsing

[Figure: graph-based TAG parsing model.]

Comparison between Transition-based and Graph-based Parsing

Rich feature representations with parse history vs. global training and inference:
- Transition-based parsers have a parse history that naturally relates supertags differing only by a certain operation (e.g., transitive/intransitive).
- Transition-based parsers suffer from global error propagation.
- Graph-based parsers assign scores independently.
- Our graph-based parser uses BiLSTM feature representations (still no parse history).

New Results

Parser                             UAS     LAS
MICA Chart                         86.66   84.90
Transition-based Parsing           90.97   89.68
Joint Graph Parsing (POS + Stag)   93.26   91.89

Table: Parsing results on the test set.

6. Applications of TAG

Syntactically-oriented Textual Entailment

[Xu et al., 2017]

E.g., "The guy who left the room saw a squirrel"
  entails:         The guy left the room
  entails:         The guy saw a squirrel
  does not entail: The room saw a squirrel
  does not entail: The guy saw an animal (not purely syntactic)

Method:
- Parse the original sentence and the hypothesis.
- Transform the parses using properties of supertags.
- If the original sentence's parse subsumes the hypothesis's parse, answer YES.
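At its core the decision is a subsumption test between two parses. A minimal sketch that represents each (transformed) parse as a set of labeled arcs; this is an illustrative simplification of the actual derivation-tree comparison, with determiner arcs omitted for brevity:

```python
# A sketch of the subsumption check; parses are simplified to sets of
# (head, dependent, label) arcs, which is an illustrative assumption.
def entails(premise_arcs, hypothesis_arcs):
    """YES iff every arc of the hypothesis parse appears in the premise parse."""
    return hypothesis_arcs <= premise_arcs

premise = {("ROOT", "saw", "ROOT"), ("saw", "guy", "Subst 0"),
           ("saw", "squirrel", "Subst 1"), ("guy", "left", "Adj"),
           ("left", "room", "Subst 1")}
hypothesis = {("ROOT", "saw", "ROOT"), ("saw", "guy", "Subst 0"),
              ("saw", "squirrel", "Subst 1")}
bad = {("ROOT", "saw", "ROOT"), ("saw", "room", "Subst 0"),
       ("saw", "squirrel", "Subst 1")}

print(entails(premise, hypothesis))  # True:  "The guy saw a squirrel"
print(entails(premise, bad))         # False: "The room saw a squirrel"
```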

Syntactically-oriented Textual Entailment

[Figure: derivation trees for "the guy who left the room saw a squirrel" and the entailed hypothesis "the guy saw a squirrel".]

Syntactically-oriented Textual Entailment

[Figure: derivation trees for "the guy who left the room saw a squirrel" and the entailed hypothesis "the guy left the room".]

Syntactically-oriented Textual Entailment

System                         P      R      F1
[Rimell and Clark, 2010]       79.6   62.8   70.2
[Ng et al., 2010]              68.2   80.1   73.7
[Lien, 2014]                   88.5   50.0   63.9
Transition-based TAG Parsing   85.6   56.4   68.0
Graph-based TAG Parsing        86.2   68.6   76.4

Table: PETE test results. Precision (P), recall (R), and F1 are calculated for "entails".

Supertag-based Semantic Role Labeling

Who did what to whom?

[Figure: "Peter hit Mary with a ball yesterday", annotated with WHO, WHAT, WHOM, and WHEN roles.]

Supertag-based Semantic Role Labeling

Syntactic parsing and SRL are closely related.
Can we use supertags instead of full parses? (Work in progress.)

Supertag-based Semantic Role Labeling

Non-ensemble System              F1
[FitzGerald et al., 2015]        87.3
[Roth and Lapata, 2016]          87.7
[Marcheggiani et al., 2017]      87.7
[Marcheggiani and Titov, 2017]   88.0
Ours (supertag-based)            88.6

Table: Non-ensemble system results on the CoNLL-2009 in-domain test set for English.

7. Future Work

Future Work

Further improvement of parsing:
- Add a priori syntactic knowledge from supertags?
- Relax conditional independence in our graph-based parser.

More applications:
- Semantic parsing.

Acknowledgements

Thank you!

Robert Frank, Dan Friedman, Tom McCoy, William Merrill, Alexis Nasr, Dragomir Radev, Owen Rambow, and Pauli Xu
Computational Linguistics at Yale (CLAY) lab

References

Bangalore, S., Boullier, P., Nasr, A., Rambow, O., and Sagot, B. (2009). MICA: A probabilistic dependency parser based on tree insertion grammars. In NAACL HLT 2009 (Short Papers).

Chen, D. and Manning, C. (2014). A fast and accurate dependency parser using neural networks. In Proceedings of EMNLP 2014, pages 740-750, Doha, Qatar. Association for Computational Linguistics.

FitzGerald, N., Täckström, O., Ganchev, K., and Das, D. (2015). Semantic role labeling with neural network factors. In Proceedings of EMNLP 2015, pages 960-970, Lisbon, Portugal. Association for Computational Linguistics.

Friedman, D., Kasai, J., McCoy, R. T., Frank, R., Davis, F., and Rambow, O. (2017). Linguistically rich vector representations of supertags for TAG parsing. In Proceedings of the 13th International Workshop on Tree Adjoining Grammars and Related Formalisms, pages 122-131, Umeå, Sweden. Association for Computational Linguistics.

Kasai, J., Frank, R., McCoy, R. T., Rambow, O., and Nasr, A. (2017). TAG parsing with neural networks and vector representations of supertags. In Proceedings of EMNLP.

Lien, E. (2014). Using minimal recursion semantics for entailment recognition. In Proceedings of the Student Research Workshop at EACL 2014, pages 76-84, Gothenburg, Sweden.

Marcheggiani, D., Frolov, A., and Titov, I. (2017). A simple and accurate syntax-agnostic neural model for dependency-based semantic role labeling. In Proceedings of CoNLL 2017, pages 411-420, Vancouver, Canada. Association for Computational Linguistics.

Marcheggiani, D. and Titov, I. (2017). Encoding sentences with graph convolutional networks for semantic role labeling. In Proceedings of EMNLP 2017, pages 1507-1516, Copenhagen, Denmark. Association for Computational Linguistics.

McDonald, R., Pereira, F., Ribarov, K., and Hajič, J. (2005). Non-projective dependency parsing using spanning tree algorithms. In Proceedings of HLT/EMNLP 2005, pages 523-530, Vancouver, British Columbia, Canada. Association for Computational Linguistics.

Mikolov, T., Sutskever, I., Chen, K., Corrado, G. S., and Dean, J. (2013). Distributed representations of words and phrases and their compositionality. In Advances in Neural Information Processing Systems 26, pages 3111-3119. Curran Associates, Inc.

Ng, D., Constable, J. W., Honnibal, M., and Curran, J. R. (2010). SCHWA: PETE using CCG dependencies with the C&C parser. In Proceedings of the 5th International Workshop on Semantic Evaluation, pages 313-316.

Nivre, J., Hall, J., and Nilsson, J. (2006). MaltParser: A data-driven parser-generator for dependency parsing. In LREC.

Rimell, L. and Clark, S. (2010). Cambridge: Parser evaluation using textual entailment by grammatical relation comparison. In Proceedings of the 5th International Workshop on Semantic Evaluation, pages 268-271.

Roth, M. and Lapata, M. (2016). Neural semantic role labeling with dependency path embeddings. In Proceedings of ACL 2016 (Volume 1: Long Papers), pages 1192-1202, Berlin, Germany. Association for Computational Linguistics.

Xu, P., Frank, R., Kasai, J., and Rambow, O. (2017). TAG parser evaluation using textual entailments. In Proceedings of the 13th International Workshop on Tree Adjoining Grammars and Related Formalisms, pages 132-141, Umeå, Sweden. Association for Computational Linguistics.
