RemNote Community

Syntax - Major Syntactic Frameworks

Understand the basics of constituency, how dependency grammar structures sentences, and the goals and theories of generative syntax.

Summary

Constituency in Syntax

What Is Constituency?

Constituency is a fundamental concept in syntax that describes how words cluster together into meaningful groups called constituents, or phrases. A constituent is a sequence of words that functions as a single unit within a sentence. For example, in the sentence "The tall professor wrote an interesting book," the phrase "the tall professor" is a constituent: it functions as the subject of the sentence. Similarly, "an interesting book" is a constituent functioning as the object. The individual words "the," "tall," and "professor" do not function as a unit by themselves, but together they do. The key insight is that constituents have distinctive behavior: they can be moved, replaced, or referenced as single units, and this behavior reveals which words belong together.

How Constituents Move and Function

One of the clearest ways to identify constituents is through movement. When a string of words moves as a unit in a syntactic transformation, it demonstrates that those words genuinely form a unified group. Consider these examples:

Statement: "John gave Mary a book yesterday."
Question: "What did John give Mary yesterday?" (the object "a book" is questioned as a whole, replaced by the fronted "what")
Alternative statement: "Yesterday, John gave Mary a book." (the adverb "yesterday" moves to the beginning as a unit)

Notice that we can front "a book" or "yesterday," but we cannot front a string that is not a constituent: "*Gave Mary John a book yesterday" is ungrammatical. This tells us that "a book" and "yesterday" are constituents, while "gave Mary" is not.

Discontinuous Phrases

While constituents typically consist of adjacent words, some languages permit discontinuous phrases, where the words making up a single constituent are separated by words from other constituents.
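The idea that only whole subtrees count as units can be sketched computationally. Below is a minimal Python sketch, assuming an illustrative bracketing of the example sentence (the category labels and tree shape are simplified for the demonstration):

```python
# A toy constituency tree for "the tall professor wrote an interesting book",
# as nested tuples: (label, child, child, ...). Leaves are plain words.
# Every subtree in this structure is a constituent.
tree = ("S",
        ("NP", ("Det", "the"), ("Adj", "tall"), ("N", "professor")),
        ("VP", ("V", "wrote"),
               ("NP", ("Det", "an"), ("Adj", "interesting"), ("N", "book"))))

def yield_words(node):
    """Return the words spanned by a (sub)tree, in order."""
    if isinstance(node, str):          # a leaf: a single word
        return [node]
    words = []
    for child in node[1:]:             # node[0] is the category label
        words.extend(yield_words(child))
    return words

def constituents(node):
    """List every constituent (as a word string) in the tree."""
    if isinstance(node, str):
        return []
    spans = [" ".join(yield_words(node))]
    for child in node[1:]:
        spans.extend(constituents(child))
    return spans

print(constituents(tree))
```

Strings like "an interesting book" appear in the output because they are whole subtrees; a string like "wrote an" never does, which mirrors why it fails movement and substitution tests.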
For example, in German and Dutch, the parts of a complex verb form can appear in different positions within a clause:

German: "Er hat das Buch gelesen" ("He has read the book"; word for word, "He has the book read")

Here the past participle "gelesen" (read) is separated from the auxiliary verb "hat" (has) by the object "das Buch," yet the two verb forms make up a single verbal constituent. The words of the constituent are discontinuous, but they still function as a unified unit for grammatical purposes. This is important because it shows that constituents are defined by their grammatical relationships, not merely by their linear position in the sentence.

Recursive Constituents

Constituents have an important property: they can be recursive, meaning a constituent can contain other constituents of the same type within it. This recursive nesting is crucial for the infinite productivity of language, our ability to create indefinitely long sentences. Consider noun phrases (NPs). An NP can contain another NP:

"The book" (simple NP)
"The book on the table" (NP containing a prepositional phrase, which itself contains the NP "the table")
"The book on the table in the library" (NP containing two prepositional phrases, each with an NP inside)
"The book on the table in the library on Main Street" (and so on)

Because constituents can recursively contain other constituents, we can build arbitrarily long sentences from a finite set of grammatical rules. This recursive structure is one of the most important properties that distinguish human language from many other communication systems.

Dependency Grammar

The Core Idea of Dependency Grammar

Dependency grammar is an alternative way of analyzing sentence structure that differs fundamentally from constituency-based approaches. Rather than grouping words into phrases, dependency grammar organizes words according to dependency relations: direct grammatical relationships between individual words.
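The recursive NP examples above can be generated mechanically: each level of nesting is one more application of the same rule. A minimal Python sketch, assuming an illustrative rule NP → NP PP with a fixed list of prepositional phrases:

```python
# Each call applies the (illustrative) recursive rule NP -> NP PP once,
# nesting the previous NP inside a larger one.
PPS = ["on the table", "in the library", "on Main Street"]

def np(depth):
    """Build an NP containing `depth` stacked prepositional phrases."""
    if depth == 0:
        return "the book"                      # the base case: a simple NP
    return np(depth - 1) + " " + PPS[depth - 1]

for d in range(4):
    print(np(d))
```

One finite rule plus recursion yields NPs of unbounded length, which is the point of the productivity argument.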
In a dependency analysis, every word (except one) depends on exactly one other word, creating a hierarchical structure. These dependencies indicate which words modify or complement which other words, showing how meaning flows through the sentence. Think of dependency as a system of grammatical "links" between words, where each link shows that one word has a grammatical function relative to another.

The Verb as Root

In dependency grammar, the finite verb serves a special role: it is the root of the entire clause structure. Every other word in the clause either depends directly on the verb or depends on another word that ultimately traces back to the verb. For example, in "The tall professor carefully wrote an interesting book yesterday":

"wrote" is the root verb
"professor" depends on "wrote" (it is the subject)
"book" depends on "wrote" (it is the object)
"tall" depends on "professor" (it modifies the subject)
"carefully" depends on "wrote" (it modifies the verb)
"interesting" depends on "book" (it modifies the object)

The verb stands at the center of this web of relationships, with all other words connected to it through chains of dependencies. This reflects the idea that the verb is the core element of a clause, with the other elements playing supporting grammatical roles.

Directed Links Between Words

Dependencies are represented as directed links between words, where the direction indicates the hierarchical relationship. In the most common modern convention, the arrow points from the head to its dependent (the word that relies on it), though some notations draw the arrow the other way. These directed links do two important things:

Show grammatical relationships: the type of dependency (subject, object, modifier, etc.) clarifies what grammatical role each word plays.
Create hierarchy: by following the directed links, you can identify which words are central to the meaning and which provide modifying or supporting information.
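The dependency analysis above can be sketched as a small data structure. A minimal Python sketch, where each word maps to its head and a relation label (the label names are informal assumptions, not any particular annotation standard):

```python
# Dependency analysis of "the tall professor carefully wrote an interesting
# book yesterday": each word maps to (head, relation); the root verb has no head.
deps = {
    "wrote":       (None,        "root"),
    "professor":   ("wrote",     "subject"),
    "book":        ("wrote",     "object"),
    "carefully":   ("wrote",     "adverbial"),
    "yesterday":   ("wrote",     "adverbial"),
    "the":         ("professor", "determiner"),
    "tall":        ("professor", "modifier"),
    "an":          ("book",      "determiner"),
    "interesting": ("book",      "modifier"),
}

def path_to_root(word):
    """Follow head links upward from `word` until the root verb is reached."""
    chain = [word]
    while deps[word][0] is not None:
        word = deps[word][0]
        chain.append(word)
    return chain

# Every word traces back to the finite verb "wrote":
for w in deps:
    print(w, "->", path_to_root(w))
```

The loop makes the "verb as root" claim checkable: every chain of dependencies terminates at "wrote".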
A dependency can connect words that are far apart in the sentence, unlike constituents, which must form contiguous groups (unless the language allows discontinuous phrases). This makes dependency grammar particularly useful for analyzing languages with flexible word order.

Generative Syntax

The Goal of Generative Syntax

Generative syntax is a theoretical approach that seeks to write explicit rules capable of generating all and only the well-formed (grammatical) expressions of a language. The term "generative" does not mean "creative"; it is used in the mathematical sense: the rules should be able to produce (generate) all grammatical sentences. This is a very precise goal. The rules must:

Generate all sentences that native speakers accept as grammatical
Fail to generate any sentence that native speakers reject as ungrammatical

Think of generative syntax as trying to write a complete recipe for perfect cakes: every step is specified, and following the recipe produces all the correct cakes and no burnt failures.

The Autonomy of Syntax Principle

A core assumption of generative syntax is the Autonomy of Syntax principle, which states that syntactic structure is determined by its own rules, independently of meaning and communicative intent. In other words, syntax is autonomous: it operates according to its own logic. This principle suggests that:

Meaning does not determine structure. Two sentences with similar meanings may have different syntactic structures.
Structure determines meaning. The same words in different structures can mean different things.

For example, consider:

"The boy hit the ball" (the boy is the agent; the ball is the affected object)
"The ball was hit by the boy" (same basic meaning, but the ball appears in the subject position)

Both sentences convey essentially the same situation, yet they have different syntactic structures.
Neither the meaning of "boy" nor the meaning of "ball" determines whether they appear as subjects or objects; the syntactic rules do. This is somewhat counterintuitive, because we often think of language primarily as a system for communicating meaning. The autonomy principle says that while syntax ultimately serves communication, the syntactic system itself operates by its own rules.

Historical Development and Major Theories

Generative syntax was introduced in the late 1950s by Noam Chomsky, building on earlier theoretical work by Zellig Harris (who developed transformational analysis) and Louis Hjelmslev (who developed a formal structural approach). Chomsky's innovation was to formalize these ideas and propose explicit generative rules that could be stated with computational precision. Since its inception, generative syntax has evolved through several major frameworks:

Transformational Grammar (1960s-1970s) was Chomsky's original model. It proposed that sentences are generated by phrase structure rules and then modified by transformation rules that move, delete, or rearrange constituents.

Government and Binding Theory (1980s) refined and reorganized transformational grammar, introducing principles that constrain how transformations can apply. Rather than positing many construction-specific transformation rules, it proposed universal principles that govern all languages.

The Minimalist Program (1990s-present) stripped away further machinery, seeking the most minimal set of principles needed to generate language. It proposes that movement occurs only when required for convergence, that is, when certain principles must be satisfied.

Each of these theories maintains the core goal of generative syntax, writing explicit rules that generate all and only the grammatical sentences, but they differ in how many rules are needed and how those rules interact.
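The "all and only" goal described above can be made concrete with a toy grammar. The grammar below is an illustrative assumption, not a serious analysis of English: the point is that the generate function enumerates exactly the sentences the rules license, and nothing else.

```python
from itertools import product

# Toy phrase structure rules: each nonterminal maps to its possible expansions.
rules = {
    "S":  [["NP", "VP"]],
    "NP": [["the boy"], ["the ball"]],
    "VP": [["hit", "NP"], ["slept"]],
}

def generate(symbol):
    """Yield every string the grammar derives from `symbol`."""
    if symbol not in rules:            # a terminal: yield the word(s) as-is
        yield symbol
        return
    for expansion in rules[symbol]:
        # Combine every derivation of each symbol in the expansion.
        for parts in product(*(list(generate(sym)) for sym in expansion)):
            yield " ".join(parts)

sentences = sorted(generate("S"))
print(sentences)
```

Every string in the output is licensed by the rules ("the boy hit the ball", "the ball slept", ...), and no unlicensed string such as "hit the boy slept" is ever produced; that is the generative criterion in miniature.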
Flashcards
What property refers to a group of words functioning together as a single unit or phrase?
Constituency
How do constituents typically behave during syntactic transformations?
They move as entire units
What is the term for phrases where the constituent parts are separated by other elements?
Discontinuous phrases
What property allows a constituent to contain other constituents of the same or different types?
Recursion (Recursive constituents)
On what type of relations does dependency grammar arrange syntactic units?
Dependency relations (rather than constituency)
Which element serves as the root of the entire clause structure in dependency grammar?
The finite verb
What is the function of the directed links used in dependency grammar?
To connect words and indicate hierarchical relations
What is the primary goal of generative syntax regarding linguistic expressions?
To specify rules that generate all and only well-formed expressions
Who proposed generative syntax in the late 1950s?
Noam Chomsky
What are the major theories included within generative syntax?
Transformational Grammar, Government and Binding Theory, and the Minimalist Program

Key Concepts
Syntactic Structures
Constituency (syntax)
Discontinuous phrase
Recursive constituent
Dependency grammar
Directed link (dependency)
Generative Theories
Generative syntax
Transformational grammar
Government and Binding Theory
Minimalist Program
Syntactic Principles
Autonomy of syntax principle