Semantics boosts syntax in artificial grammar learning tasks with recursion

Anna Fedor, Máté Varga, E. Szathmáry

Research output: Contribution to journal › Article

8 Citations (Scopus)

Abstract

Center-embedded recursion (CER) in natural language is exemplified by sentences such as "The malt that the rat ate lay in the house." Parsing center-embedded structures has attracted particular attention because it may be one of the cognitive capacities that distinguish humans from all other animals. The ability to parse CER is usually tested with artificial grammar learning (AGL) tasks, in which participants must infer the underlying rule from a set of artificial sentences. One surprising result of previous AGL experiments is that learning CER is harder than had been thought. We hypothesized that this difficulty arises because artificial sentences lack semantic content, and that semantics could therefore help humans learn the syntax of center-embedded sentences. To test this, we composed sentences from four vocabularies that differed in degree of semantic content along three factors: familiarity, the meaning of words, and the semantic relationship between words. Our results show that none of these factors has an effect on its own, but combined they make learning significantly faster. This suggests that different mechanisms are at work when CER is parsed in natural and in artificial languages, which calls into question the suitability of AGL tasks with artificial vocabularies for studying the learning and processing of linguistic CER.
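
For readers less familiar with the stimulus format, the sketch below illustrates how center-embedded strings are typically constructed in AGL experiments of this kind: each "first-half" token is paired with a matching "second-half" token, and the pairs are nested so that dependencies must be resolved in last-in, first-out order. The vocabulary items and function name are illustrative assumptions, not the stimuli or code used in the study.

    import random

    # Hypothetical paired vocabulary (illustrative tokens, not the
    # study's actual stimuli): each A-class word must be matched by
    # its B-class partner on the opposite side of the embedding.
    A_WORDS = ["bi", "de", "gu"]
    PARTNER = {"bi": "ko", "de": "pa", "gu": "to"}

    def center_embedded(depth):
        """Return a string of the form A1 A2 ... An Bn ... B2 B1.

        Dependencies are nested (last in, first out), mirroring
        natural-language center embedding such as "the malt
        [that the rat ate] lay in the house".
        """
        firsts = [random.choice(A_WORDS) for _ in range(depth)]
        seconds = [PARTNER[a] for a in reversed(firsts)]
        return " ".join(firsts + seconds)

    print(center_embedded(2))  # e.g. "de gu to pa"

At depth 1 this reduces to an adjacent pair; it is the nested long-distance dependencies at depth 2 and beyond that make such strings hard to learn from form alone, which is where semantic support plausibly comes in.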

Original language: English
Pages (from-to): 776-782
Number of pages: 7
Journal: Journal of Experimental Psychology: Learning Memory and Cognition
Volume: 38
Issue number: 3
DOI: 10.1037/a0026986
Publication status: Published - May 2012

Keywords

  • Artificial grammar learning
  • Center-embedded recursion
  • Familiarity
  • Linguistics
  • Natural language
  • Semantics

ASJC Scopus subject areas

  • Experimental and Cognitive Psychology
  • Language and Linguistics
  • Linguistics and Language
  • Medicine (all)

Cite this

Semantics boosts syntax in artificial grammar learning tasks with recursion. / Fedor, Anna; Varga, Máté; Szathmáry, E.

In: Journal of Experimental Psychology: Learning Memory and Cognition, Vol. 38, No. 3, 05.2012, p. 776-782.

Research output: Contribution to journal › Article

@article{42c795b24e984d45a7b5649b51a02cf1,
title = "Semantics boosts syntax in artificial grammar learning tasks with recursion",
keywords = "Artificial grammar learning, Center-embedded recursion, Familiarity, Linguistics, Natural language, Semantics",
author = "Anna Fedor and M{\'a}t{\'e} Varga and E. Szathm{\'a}ry",
year = "2012",
month = "5",
doi = "10.1037/a0026986",
language = "English",
volume = "38",
pages = "776--782",
journal = "Journal of Experimental Psychology: Learning Memory and Cognition",
issn = "0278-7393",
publisher = "American Psychological Association Inc.",
number = "3",

}
