36.99 € instead of 47.95 €** (incl. VAT)
**Price of the printed edition (paperback)
Available immediately via download
  • Format: PDF


Product Description
Thesis (M.A.) from the year 2004 in the subject German Studies - Linguistics, grade: 1.0, University of Freiburg (Germanistik), language: English.

Abstract: Natural language is a complicated thing. When processing a sentence, the human parser has to keep track of the sentence's structure; this requires remembering the input string, integrating new words into already-built structures, and much more, all of which has to be done on-line. If the sentence becomes too difficult, the parser loses control, and processing becomes slow or may eventually break down.

A number of complexity measures have been proposed for natural language; the most influential one at the moment is Gibson's (2000) Dependency Locality Theory (DLT). However, in a recent experiment, Konieczny and Döring (2003) found that reading times on clause-final verbs were faster, not slower, when the number of verb arguments was increased. This was taken as evidence against DLT's integration cost hypothesis and for the anticipation hypothesis originally developed by Konieczny (1996): during language processing, a listener/reader anticipates what is about to come; he is "ahead of time".

This paper presents a series of simulations modeling anticipation. Because Simple Recurrent Networks (SRNs; Elman 1990) seem to be the most adequate device for modeling verbal working memory (MacDonald & Christiansen 2002), neural networks were used for the simulations. In seven series of simulations, I was able to model the anticipation effect. In addition to a deeper understanding of anticipation, the simulations yielded insights into how SRNs function.

The paper is organized as follows. First, I will give an overview of different complexity measures; then the experiment mentioned above will be described. Third, I will briefly discuss existing models of verbal working memory. After a short introduction to neural network modeling, the core part of the paper, the seven simulation series, will be presented in detail. In the Discussion, I will argue that SRNs represent a good model of anticipation; implications for the anticipation hypothesis as well as for SRNs in general will be considered. Finally, predictions for further experiments will be discussed.
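The SRN architecture the abstract refers to is Elman's: a hidden layer whose previous state is copied back as a "context" input at the next time step, with the network trained to predict the next word. The Python/NumPy sketch below is an illustrative reconstruction of that architecture only; the layer sizes, learning rate, and truncated one-step weight update are assumptions for demonstration, not the configuration used in the thesis.

import numpy as np

class SRN:
    """Minimal Elman (1990) Simple Recurrent Network for next-word prediction."""

    def __init__(self, n_in, n_hidden, n_out, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.W_ih = rng.normal(0.0, 0.1, (n_hidden, n_in))      # input -> hidden
        self.W_hh = rng.normal(0.0, 0.1, (n_hidden, n_hidden))  # context -> hidden
        self.W_ho = rng.normal(0.0, 0.1, (n_out, n_hidden))     # hidden -> output
        self.h = np.zeros(n_hidden)  # context layer: copy of the last hidden state
        self.lr = lr

    def step(self, x):
        # The hidden state mixes the current word with the copied-back context,
        # giving the network an implicit memory of the sentence so far.
        self.h = np.tanh(self.W_ih @ x + self.W_hh @ self.h)
        z = self.W_ho @ self.h
        e = np.exp(z - z.max())
        return e / e.sum()  # softmax distribution over possible next words

    def train_step(self, x, target):
        # Truncated update of the output weights only; full SRN training would
        # also backpropagate the error into W_ih and W_hh.
        y = self.step(x)
        self.W_ho -= self.lr * np.outer(y - target, self.h)  # cross-entropy gradient
        return -np.log(y[target.argmax()])  # surprisal of the actual next word

# Toy usage with one-hot words from a five-word vocabulary:
net = SRN(n_in=5, n_hidden=8, n_out=5)
word, nxt = np.eye(5)[1], np.eye(5)[3]
print(net.train_step(word, nxt))  # surprisal falls as the prediction improves

On this reading, the anticipation effect corresponds to low surprisal at the clause-final verb: a network that has learned the regularities of verb-final clauses has already "moved ahead" by the time the verb arrives.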

For legal reasons, this download can only be delivered to billing addresses in A, B, BG, CY, CZ, D, DK, EW, E, FIN, F, GR, HR, H, IRL, I, LT, L, LR, M, NL, PL, P, R, S, SLO, SK.