Carter D. — Interpreting anaphors in natural language text



Title: Interpreting anaphors in natural language text

Author: Carter D.

Abstract:

Perhaps the major obstacle to the development of computer programs capable of the sophisticated processing of natural language is the problem of representing and using the large and varied quantities of world or domain knowledge that are, in general, required. This book describes an attempt to circumvent this obstacle for one aspect of the language processing problem - that of interpreting anaphors (pronouns and other abbreviated expressions) in texts - by adopting a "shallow processing" approach. In this approach, linguistic knowledge about syntax, semantics and local focusing is exploited as heavily as possible, in order to minimise reliance on world knowledge.
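
As a rough, purely illustrative sketch of the shallow processing idea (not Carter's actual system), the Python fragment below resolves a pronoun using nothing but agreement features and simple focus/recency preferences, loosely echoing the focus-register terminology that appears in the subject index below; all class and function names are hypothetical.

# A minimal, illustrative "shallow" pronoun resolver: it uses only agreement
# features plus focus and recency preferences, and no world knowledge.
# All names here (Candidate, FocusState, resolve_pronoun) are hypothetical.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Candidate:
    text: str      # surface form of the noun phrase
    gender: str    # "m", "f" or "n"
    number: str    # "sg" or "pl"

@dataclass
class FocusState:
    discourse_focus: Optional[Candidate] = None   # what the text is "about" (DF)
    actor_focus: Optional[Candidate] = None       # most recent agent (AF)
    recent: list = field(default_factory=list)    # recency-ordered candidate NPs

def agrees(gender: str, number: str, cand: Candidate) -> bool:
    # Syntactic filter: the candidate must match the pronoun in gender and number.
    return cand.gender == gender and cand.number == number

def resolve_pronoun(gender: str, number: str, focus: FocusState) -> Optional[Candidate]:
    # Prefer the current foci; otherwise fall back to the most recent agreeing NP.
    for preferred in (focus.actor_focus, focus.discourse_focus):
        if preferred is not None and agrees(gender, number, preferred):
            return preferred
    for cand in reversed(focus.recent):            # recency rule as a last resort
        if agrees(gender, number, cand):
            return cand
    return None                                    # deeper (world) knowledge would be needed

# "Mary gave Susan the report. She read it."
state = FocusState()
mary = Candidate("Mary", "f", "sg")
report = Candidate("the report", "n", "sg")
state.actor_focus, state.discourse_focus = mary, report
state.recent = [mary, Candidate("Susan", "f", "sg"), report]

print(resolve_pronoun("f", "sg", state).text)  # "She" -> Mary (actor focus)
print(resolve_pronoun("n", "sg", state).text)  # "it"  -> the report (discourse focus)

A fuller treatment would also apply the syntactic constraints (e.g. the c-commanded pronoun criterion) and semantic checks catalogued in the index, and only then appeal to common-sense inference.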


Language: en

Category: Computer science/

Subject index status: Index with page numbers is ready


Year of publication: 1987

Number of pages: 290

Added to the catalogue: 04.12.2005

Subject index
Accessibility (in discourse representation theory)      51
Active anaphors and candidates      198
Actor focus (AF)      110 114
Actor focus stack (AFS)      110
Adequacy (referential)      212
AF (Actor Focus)      110
AFS (Actor Focus Stack)      110
AI (artificial intelligence)      21
Anaphor      14 33
Anaphor starting point (for inference)      188
Anaphor tendril      188
Anaphora      33
Antecedent      33
AR (Anaphor Resolution) rules      131
ATN (Augmented Transition Network)      69
Attributive noun phrases      47
BA (Boguraev’s Analyser)      69
Bach-Peters sentences      45
Backward centre      116
Basic rule      114
Binding (in inference)      188
Bound variable pronouns      45
c-command      60
c-commanded pronoun (CCP) criterion      153
Candidate starting point (for inference)      188
Candidate tendril      188
Cataphora      33
CCP (C-Commanded Pronoun) criterion      153
CD (Conceptual Dependency)      158
Centre      116
Chain (of inferences)      188
Class primitive      66
Coherence relation      96
Cohesion      32
Collective factors      147
Collocation      41
Common sense inference rule (CSIR)      170 188
Conjunction      36
Considerateness      13
context      84
Context of identity (of a TMN node)      84
Coreference      34
Corresponds, assertion      81
Cospecification      34 36
CSI (Common Sense Inference)      17 23
CSIR (Common Sense Inference Rule)      170
Current focus      92
Current fragment      84
DE (Discourse Entity)      55
Demon      160
Demonstrative cospecification      38
Dependent      66
Descriptional anaphors      38
DF (Discourse Focus)      110
DFS (Discourse Focus Stack)      110
Discourse      32
Discourse entity (DE)      55
Discourse focus (DF)      109 110
Discourse focus stack (DFS)      110
Discourse representation (DR) theory      50
Discourse representation structure (DRS)      50
Discourse segment purpose      101
Distributive factors      147
Dominance (of intentions)      101
Donkey sentences      46
DR (Discourse Representation)      50
DRS (Discourse Representation Structure)      50
Efficiency (referential)      212
Ellipsis      36 38
Epithet      42
Exophora      33
Expected focus      110
Explicit TMN node      77
Extended mode (in PS)      169
Extraction      169
Extraction rule      188
FDNP (Full Definite Noun Phrase)      41
Focus      17
Focus registers      110
Focus retention criterion      153
Focus update algorithm      111
Focus value      92
Focus, focusing      92
Formula      65
Forward centre      116
Fragment constructor      84 86
Fragmentation      66
Full definite noun phrase (FDNP)      41
Game-theoretical semantics (GTS)      53
Generic TMN node      84
Global focus      93
Global inference      158
Governor      66
GTS (game-theoretical semantics)      53
Habitability      66
Hearer      32
Heterogeneous inference systems      159 165
ID (Invoking Description)      55
Implicit TMN node      77
Inference      23 188
Inferred specification      133
Information constraint      41
Intentions      101
Intermediate form (for formulas)      80
Inverted nominal formula      66
Invoking description (ID)      55
Lexical cohesion      36
Local focus      93
Local inference      158
Minimal governing category (MGC)      59
MOP (Memory Organisation Packet)      165
Need-driven generation      215
Negative matches and chains      193
No-exit inference rules      192
Nominal ellipsis      39
Nominal substitution      39
Non-specific noun phrases      47
Non-vacuous binding      191
Normal pronouns      114
Null TMN node      84
p-focus      92 103
PAF (Potential Actor Focus)      110
Paraplate      68
Parasitic pronoun      136
Paycheck sentences      40
PDF (Potential Discourse Focus)      110
Personal cospecification      36
PI (Pronoun Interpretation) rules      111
Positive chains      193
Potential actor focus (PAF)      110
Potential discourse focus (PDF)      110
PP (prepositional phrase)      61
Pragmatic pronouns      45
primitive      65
Principle of anaphoric success      128
PRL (Possible Referent List)      160
Pronoun interpretation (PI) rules      111
Pronouns of laziness      40
PS (Preference Semantics system)      64
Pseudo-text (PT)      71
PT (Pseudo-Text)      71
R-pronouns      59
Recency rule      114
Redundancy      13
Ref and rel      81
Reference      34
Referential adequacy      212
Referential efficiency      212
Referential index      45
Reiteration      41
Repetition criterion      152
Satisfaction-precedence (of intentions)      101
Script      163
Semantic formula      65
Semantic primitive      65
Sense-oriented representation      75
Shallow processing      13 17
Shallow processing hypothesis      18
Speaker      32
Specialisation, assertion      81
Specialised inference systems      158 162
Specific TMN node      84
Specification      34
Starting point (for inference)      188
State of focus      92
Status (of a TMN node)      84
Strategy and tactics (in generation)      215
Strong preference      115
Structure matcher      85
Substitution      36 38
TAU (Thematic Affect Unit)      165
template      67
Tendril      188
text      32
Text model      76
Text model network (TMN)      77
Topic      102
Undetermined anaphor      147
Uniform inference systems      159 166
Vacuous binding      191
Weak inference rule      204
Weak preference      115
Word sense network (WSN)      77