Fast Learning with Explanation and Prior Knowledge


Fast Learning with Explanation and Prior Knowledge
Sean (Xiang) Ren
Department of Computer Science
Information Sciences Institute
USC
http://inklab.usc.edu

Recipe for Modern NLP Applications
Model + Labeled Data + Computing Power

Recipe for Modern NLP Applications
Model architectures and computing power are transferrable across applications; labeled data is not!

Creating Labeled Data for Relation Extraction
TACRED dataset: 106k labeled instances for 41 relations, crowd-sourced via Amazon Mechanical Turk (Zhang et al., 2018)

Creating Labeled Data for Relation Extraction
Cost on Amazon Mechanical Turk: $0.5 per instance → $53k!
Time cost: 20 seconds per instance → 7 days
(Zhou et al., WWW20)

Labeled data for more complex tasks
(Rajpurkar et al., 2018)

Towards faster learning (with fewer labels)
Multi-task Learning, Transfer Learning, Distant Supervision, Active Learning

Towards faster learning (with fewer labels)
Challenges: availability of related data sources & strong assumptions on data distributions

Our Idea: High-level Human Supervisions

Our Idea: High-level Human Supervisions
Machine digests human rationale and learns how to make decisions

This Talk
Q1: How to augment model training with rules?
  Soft rule grounding for data augmentation (Zhou et al., WWW20)
Q2: How to handle compositional natural language input?
  Neural execution tree for NL explanation (Wang et al., ICLR20)
Q3: How to incorporate prior knowledge as inductive bias?
  Knowledge-aware graph networks (Lin et al., EMNLP19)

Standard pipeline for data annotation
Corpus → Annotator → Labels → Neural Classifier
  Microsoft was founded by Bill Gates in 1975. → ORG:FOUNDED_BY
  Apple was founded by Steven Jobs in 1976. → ORG:FOUNDED_BY
  Amazon was founded by Jeff Bezos in 1994. → ORG:FOUNDED_BY
Slow, redundant annotation efforts on similar instances!

Alternative Labeling Scheme: Surface Pattern Rules
  Microsoft was founded by Bill Gates in 1975. → ORG:FOUNDED_BY
  Apple was founded by Steven Jobs in 1976. → ORG:FOUNDED_BY
  Amazon was founded by Jeff Bezos in 1994. → ORG:FOUNDED_BY
Rule: SUBJ-ORG was founded by OBJ-PER → ORG:FOUNDED_BY
Annotate contextually similar instances via far fewer rules! (Hearst, 1992)
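As a rough illustration, hard-matching a surface-pattern rule can be sketched as plain substring matching over entity-masked sentences; the rule tuple and helper below are illustrative, not the actual implementation:

```python
# A hypothetical surface-pattern rule: body pattern -> relation label.
# Entity placeholders (SUBJ-ORG, OBJ-PER) must appear verbatim (hard matching).
RULE = ("SUBJ-ORG was founded by OBJ-PER", "ORG:FOUNDED_BY")

def hard_match(sentence: str, rule=RULE):
    """Return the rule's label if the pattern occurs verbatim, else None."""
    pattern, label = rule
    return label if pattern in sentence else None

print(hard_match("SUBJ-ORG was founded by OBJ-PER in 1975."))  # matches
print(hard_match("SUBJ-ORG was established by OBJ-PER."))      # no exact match
```

Exact matching is why paraphrases like "was established by" are missed, which motivates the soft matching introduced on the following slides.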

Neural Rule Grounding for Data Augmentation
Generalizing rule coverage via soft matching to instances
Labeling rules:
  SUBJ-ORG was founded by OBJ-PER → ORG:FOUNDED_BY
  SUBJ-PER born in OBJ-LOC → PER:ORIGIN
1. Hard-matching → hard-matched instances (x_i, y_i):
  Microsoft was founded by Bill Gates in 1975. → ORG:FOUNDED_BY
  Apple was founded by Steven Jobs in 1976. → ORG:FOUNDED_BY
2. Soft-matching → unmatched instances, annotated as (x_i, y_i, matching score):
  Microsoft was established by Bill Gates. → ORG:FOUNDED_BY, 0.8
  In 1975, Bill Gates launched Microsoft. → ORG:FOUNDED_BY, 0.7
(Zhou et al., WWW20)

A Learnable, Soft Rule Matching Function
Labeling rules:
  ENT1 was founded by ENT2 → ORG:FOUNDED_BY
  ENT1 born in ENT2 → PER:ORIGIN
2. Soft-matching on unmatched instances → (x_i, y_i, matching score):
  Microsoft was established by Bill Gates. → ORG:FOUNDED_BY, 0.8
  In 1975, Bill Gates launched Microsoft. → ORG:FOUNDED_BY, 0.7
(Zhou et al., WWW20)
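A minimal sketch of the soft-matching idea, using bag-of-words cosine similarity as a stand-in for the learnable neural matcher (all function names here are illustrative; the real matcher is trained jointly with the extractor):

```python
import math
from collections import Counter

def bow_vector(text):
    # Toy stand-in for learned sentence/rule embeddings: word counts.
    return Counter(text.lower().split())

def cosine(u, v):
    dot = sum(u[w] * v[w] for w in u)
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def soft_match(sentence, rule_body):
    """Matching score in [0, 1] between a sentence and a rule body."""
    return cosine(bow_vector(sentence), bow_vector(rule_body))

rule = "ENT1 was founded by ENT2"
print(soft_match("ENT1 was established by ENT2", rule))  # high score
print(soft_match("ENT1 born in ENT2", rule))             # lower score
```

A paraphrase of the rule body scores high without matching exactly, so its pseudo-label can be used with a confidence weight rather than discarded.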

Joint Parameter Learning: Relation Extractor + Soft Rule Matcher
Labeling rules:
  SUBJ-ORG was founded by OBJ-PER → ORG:FOUNDED_BY
  SUBJ-PER born in OBJ-LOC → PER:ORIGIN
Matched sentences (x_i, y_i) → L_matched: cross-entropy loss on relation labels
  Microsoft was founded by Bill Gates in 1975. → ORG:FOUNDED_BY
  Apple was founded by Steven Jobs in 1976. → ORG:FOUNDED_BY
Unmatched sentences, soft-matched as (x_i, y_i, matching score) → L_unmatched
  Microsoft was established by Bill Gates. → ORG:FOUNDED_BY, 0.8
  In 1975, Bill Gates launched Microsoft. → ORG:FOUNDED_BY, 0.7
(Zhou et al., WWW20)

Joint Parameter Learning: Relation Extractor + Soft Rule Matcher
Labeling rules:
  ENT1 was founded by ENT2 → ORG:FOUNDED_BY
  ENT1 born in ENT2 → PER:ORIGIN
Contrastive loss (L_clus) for discriminating by rule bodies (surface patterns)
(Zhou et al., WWW20)

Joint Parameter Learning: Relation Extractor + Soft Rule Matcher
Total objective: L = L_matched + α·L_unmatched + β·L_clean + γ·L_clus
Labeling rules:
  SUBJ-ORG was founded by OBJ-PER → ORG:FOUNDED_BY
  SUBJ-PER born in OBJ-LOC → PER:ORIGIN
1. Hard-matching → matched sentences with labels:
  Microsoft was founded by Bill Gates in 1975. → ORG:FOUNDED_BY
  Apple was founded by Steven Jobs in 1976. → ORG:FOUNDED_BY
2. Soft-matching → unmatched sentences with labels and matching scores:
  Microsoft was established by Bill Gates in 1975. → ORG:FOUNDED_BY, 0.8
  In 1975, Bill Gates launched Microsoft. → ORG:FOUNDED_BY, 0.7
(Zhou et al., WWW20)
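The weighted combination of loss terms can be sketched as follows; the term names, default weights, and helpers are assumptions for illustration, not the paper's exact objective:

```python
import math

def cross_entropy(p_true):
    # Negative log-likelihood of the gold label's predicted probability.
    return -math.log(p_true)

def joint_loss(matched_probs, soft_matched, alpha=0.5, beta=0.1, gamma=0.1,
               l_clean=0.0, l_clus=0.0):
    """Illustrative combined objective L = L_matched + a*L_unmatched + ...
    `soft_matched` pairs each pseudo-labeled instance's predicted gold
    probability with its matching score, which weights its contribution."""
    l_matched = sum(cross_entropy(p) for p in matched_probs)
    l_unmatched = sum(w * cross_entropy(p) for p, w in soft_matched)
    return l_matched + alpha * l_unmatched + beta * l_clean + gamma * l_clus
```

The key design point is that soft-matched instances enter the loss weighted by their matching score, so noisy pseudo-labels contribute less than confidently grounded ones.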

Results on Relation Extraction
[Bar chart: Relation Extraction Performance (in F1 score) on TACRED, comparing Rules, LSTM+ATT, Hard-matching, Self-Training, Mean-Teacher, NERO (-SRM), and NERO]

Study on Label Efficiency
Annotators spent 40 min labeling instances from TACRED.
Dashed: avg. # of rules / sentences labeled by annotators. Solid: avg. model F1 trained with the corresponding annotations.
{Rules + Neural Rule Grounding} produces a much more effective model with limited time!

Standard annotation pipeline: view each example → assess the example → provide a label
Rule-based annotation pipeline: view several examples → summarize rules
  Labeling rule: SUBJ-ORG was founded by OBJ-PER → ORG:FOUNDED_BY
Better label efficiency; less user-friendly, limited expressiveness

Problem: Can users provide more complex clues to explain their thought process, in a natural way?

Learning with Natural Language Explanations
Sentiment on ENT is positive or negative?
  x: There was a long wait for a table outside, but it was a little too hot in the sun anyway so our ENT was very nice.
  User's natural language explanation: Positive, because the words "very nice" are within 3 words after the ENT.
Relation between ENT1 and ENT2?
  x: Officials in Mumbai said that the two suspects, David Headley, and ENT1, who was born in Pakistan but is a ENT2 citizen, both visited Mumbai before the attacks.
  Explanation: per:nationality, because the words "is a" appear right before ENT2 and the word "citizen" is right after ENT2.
(Wang et al., to appear at ICLR'20) http://inklab.usc.edu/project-NExT

Explanations to "labeling functions"
Explanation: The words "who died" precede OBJECT by no more than three words and occur between SUBJECT and OBJECT
Pipeline: predicate/function assigning → CCG parsing → candidate logical forms → candidate scoring → inference
Candidate logical form:
  @And ( @Is ( @Word ( 'who died' ), @AtMost ( @Left ( @OBJECT ), @Num ( @Token ) ) ), @Is ( @Word ( 'who died' ), @Between ( @SUBJECT, @OBJECT ) ) )
Labeling function (most plausible):
  def LF(x):
      return 1 if And( Is( Word('who died'), AtMost( Left(OBJECT), Num(3, tokens) ) ),
                       Is( Word('who died'), Between( SUBJECT, OBJECT ) ) ) else 0
(Srivastava et al., 2017; Zettlemoyer & Collins, 2012)
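An executable Python rendering of this labeling function might look like the following; the whitespace tokenizer and bigram search are illustrative simplifications of the parsed logical form:

```python
def LF(x):
    """Illustrative rendering of the parsed labeling function: fire (return 1)
    if 'who died' precedes OBJECT by at most 3 tokens and occurs between
    SUBJECT and OBJECT; otherwise return 0."""
    toks = x.split()
    subj, obj = toks.index("SUBJECT"), toks.index("OBJECT")
    # Scan for the bigram 'who died'.
    for i in range(len(toks) - 1):
        if toks[i] == "who" and toks[i + 1] == "died":
            between = subj < i and i + 1 < obj          # Between(SUBJECT, OBJECT)
            at_most_3 = 0 < obj - (i + 1) <= 3          # AtMost(Left(OBJECT), 3)
            if between and at_most_3:
                return 1
    return 0

print(LF("SUBJECT , who died on OBJECT"))    # fires
print(LF("SUBJECT was murdered on OBJECT"))  # does not fire
```

Applied as hard matching, this function labels only sentences containing the exact phrase, which is precisely the brittleness the next slides address.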

Hard matching for data augmentation
Instance: quality ingredients preparation all around, and a very fair price for NYC.
Question: What is the sentiment polarity w.r.t. "price"?
Human labeling → Label: Positive; Explanation: because the word "price" is directly preceded by "fair".
Hard matching on unlabeled instance: it has delicious food with a fair price. → Positive

Problems with hard matching
Unlabeled set U = {x_j} → hard matching with explanations → labeled set L = {(x_j, y_j)} → model training
Challenge 1: language variations in both explanation predicates & contextual clues
Challenge 2: compositional nature of the explanations (e.g., per:nationality, because the words "is a" appear right before ENT2 and the word "citizen" is right after ENT2)

Learning with Hard & Soft Matching
Explanation → semantic parsing → logical form
Hard matching on U = {x_j} → B_hard = {(x_j, y_j)} (sampled and annotated)
Soft matching → B_soft = {(x_j, y_j, ω_j)}: annotated with a pseudo label and a confidence score
(Wang et al., ICLR20)

Neural Execution Tree (NExT) for Soft Matching
Explanation: The words "who died" precede OBJECT by no more than three words and occur between SUBJECT and OBJECT
Labeling function:
  def LF(x):
      return 1 if And( Is( Word('who died'), AtMost( Left(OBJECT), Num(3, tokens) ) ),
                       Is( Word('who died'), Between( SUBJECT, OBJECT ) ) ) else 0
Sentences:
  SUBJECT was murdered on OBJECT
  SUBJECT was killed in OBJECT
  SUBJECT , who died on OBJECT
NExT replaces each deterministic function with a neural module that outputs per-token scores:
  Word('who died') yields a soft score sequence (e.g., 0.3, 0.2, 0.9, 0.2, 0.4) instead of exact-match indicators (0, 1, 1, 1, 0)
  AtMost(Left(OBJECT), Num(3 tokens)) yields soft position scores (e.g., 0.6, 1, 1, 1, 1)
  Is(...) aggregates branch scores via argmax(p1 * p2)
  And(...) combines the two branches into a final matching score via soft logic: max(p1 + p2 - 1, 0)
(Wang et al., ICLR20)

Modules in NExT:
1. String matching
2. Soft counting
3. Soft logic
4. Deterministic functions
(Wang et al., ICLR20)
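A couple of these modules can be sketched in plain Python. The soft-logic AND follows the max(p1 + p2 - 1, 0) relaxation shown on the earlier slide; `soft_count` is a hypothetical sigmoid-style relaxation of a counting constraint, not NExT's exact formulation:

```python
import math

def soft_and(p1, p2):
    # Soft logic: Lukasiewicz t-norm relaxation of AND, as on the slide.
    return max(p1 + p2 - 1.0, 0.0)

def soft_or(p1, p2):
    # Dual relaxation of OR (bounded sum).
    return min(p1 + p2, 1.0)

def soft_count(distance, limit, temperature=1.0):
    """Soft counting sketch: relax a hard constraint like 'at most `limit`
    tokens away' into a score that decays smoothly past the limit."""
    return 1.0 / (1.0 + math.exp((distance - limit) / temperature))
```

Because every module outputs a score rather than a boolean, a sentence that violates a constraint slightly (e.g., four tokens away instead of three) still receives partial credit instead of a hard zero.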

Study on Label Efficiency (TACRED)
[Chart: # of labels needed to achieve equal performance vs. number of explanations]
Annotation time cost: giving a label + an explanation ≈ 2x giving a label

Standard annotation pipeline: view each example → assess the example → provide a label
Rule-based annotation pipeline: view several examples → summarize rules
  Labeling rule: SUBJ-ORG was founded by OBJ-PER → ORG:FOUNDED_BY
NL explanation-based annotation pipeline: view an example → provide rationale
  NL explanation: Positive, because the words "very nice" are within 3 words after the TERM.

Problem: Can we make use of prior knowledge to constrain the model learning?

Commonsense Reasoning in QA
Where do adults usually use glue sticks?  A: classroom  B: office  C: desk drawer
What do you need to fill with ink to write notes on an A4 paper?  A: fountain pen  B: printer  C: pencil
Can you choose the most plausible answer based on daily-life commonsense knowledge?
(CommonsenseQA, Talmor et al., 2018)

Pre-trained LMs don't get it for free
Fine-tuning BERT for CommonsenseQA (12k QA pairs): [Question; Answer Choice] → is the choice correct or not?
Accuracy will drop 15% if labeled data are reduced by 10%

Limitations of Fine-tuned LMs
1. Not capturing commonsense: the most plausible predictions are far from common truth (online demo of BERT's Masked-LM: https://demo.allennlp.org/masked-lm)
2. Not interpretable w/ knowledge

Neural-Symbolic Reasoning with Commonsense KG
Question: Where do adults use glue sticks?  Answer candidates: A: classroom  B: office  C: desk drawer
Knowledge-aware reasoning in symbolic space: [Figure: schema graph for the choice B (office), connecting "adult", "glue stick", and "office" via relations such as AtLocation, UsedFor/use, and ReceivesAction]
(Bill Yuchen Lin et al., EMNLP19)

Multi-relational Graph as Inductive Bias
Statement = Question + Answer → concept recognition → question concepts & answer concepts → graph construction via path finding → schema graph
Language encoder (e.g., BERT) → statement vector; graph encoder (KagNet) → graph vector; MLP → plausibility score
(Bill Yuchen Lin et al., 2019)
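The "graph construction via path finding" step can be sketched as a breadth-first search for relational paths between question and answer concepts over a toy ConceptNet-style edge list; the triples below are illustrative, not real KG data:

```python
from collections import deque

# Toy ConceptNet-style triples (illustrative, not real KG data).
EDGES = [
    ("adult", "AtLocation", "office"),
    ("glue_stick", "AtLocation", "office"),
    ("glue_stick", "UsedFor", "gluing"),
    ("adult", "CapableOf", "gluing"),
]

def neighbors(node):
    # Traverse edges in both directions; inverse relations get a ^-1 suffix.
    for h, r, t in EDGES:
        if h == node:
            yield r, t
        if t == node:
            yield r + "^-1", h

def find_paths(src, dst, max_len=2):
    """BFS for relational paths up to max_len hops between a question concept
    and an answer concept -- the kind of paths an LSTM path encoder scores."""
    paths, queue = [], deque([(src, [])])
    while queue:
        node, path = queue.popleft()
        if node == dst and path:
            paths.append(path)
            continue
        if len(path) >= max_len:
            continue
        for rel, nxt in neighbors(node):
            queue.append((nxt, path + [(node, rel, nxt)]))
    return paths

for p in find_paths("adult", "office"):
    print(p)
```

The union of concepts and edges on the found paths forms the schema graph that the graph encoder consumes.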

KagNet: Knowledge-aware Graph Network
Encoding schema graphs; statement vector s
LSTM path encoder: modeling relational paths P_{i,j} between question concept c_i^(q) and answer concept c_j^(a); P_{i,j}[k] is the k-th path between c_i^(q) and c_j^(a)
(Bill Yuchen Lin et al., 2019) KagNet: Knowledge-Aware Graph Networks for Commonsense Reasoning

Experiments
Performance on official test set: https://www.tau-nlp.org/csqa-leaderboard
Recent follow-up submissions:
- Based on XL-Net / RoBERTa (72.1)
- Using large-scale wiki docs via IR
- Transfer from other QA datasets (e.g., RACE)
- Adversarial data augmentation
(Bill Yuchen Lin et al., 2019)

No Training!
CSQA: 59.01% vs 56.53%
SWAG: 53.51% vs 51.23%
WSC
(Bill Yuchen Lin et al., 2019)

Conclusion
(Label-efficient) learning from high-level human supervisions that are abstractive, compositional, and linguistically complex
Q1: How to augment model training with rules?
  Soft rule grounding for data augmentation (Zhou et al., WWW20)
Q2: How to handle compositional natural language input?
  Neural execution tree for NL explanation (Wang et al., ICLR20)
Q3: How to incorporate prior knowledge as inductive bias?
  Knowledge-aware graph networks (Lin et al., EMNLP19)

Other related efforts
Q1: How to augment model training with rules?
  Soft rule grounding for data augmentation (Zhou et al., WWW20)
Q2: How to handle compositional natural language input?
  Neural execution tree for NL explanation (Wang et al., ICLR20)
Q3: How to incorporate background knowledge?
  Knowledge-aware graph networks (Lin et al., EMNLP19)
Learning from distant supervision: [Ye et al., EMNLP19], [Zhang et al., NAACL19], [Shang et al., EMNLP18], [Liu et al., EMNLP17]
Reasoning over heterogeneous data: [Fu et al., EMNLP18], [Jin et al., ICLR GRLM19], [Ying et al., NeurIPS18], [Ying et al., ICML18]

Students & Collaborators
Dan MacFarland, Sociology, Stanford University
Jure Leskovec, Computer Science, Stanford University
Dan Jurafsky, Computer Science, Stanford University
Jiawei Han, Computer Science, UIUC
Morteza Dehghani, Psychology, USC
Kenneth Yates, Clinical Education, USC
Craig Knoblock, USC ISI
Curt Langlotz, Bioinformatics, Stanford University
Kuansan Wang, Microsoft Academic
Leonardo Neves, Snap Research
Mark Musen, Bioinformatics, Stanford University
Funding & Research Partnership

Thank you!
USC Intelligence and Knowledge Discovery (INK) Lab
http://inklab.usc.edu/
Code: nNLP

