For most people, the word “grammar” implies the dullest of
subjects. The g-word conjures up associations of pedantic English teachers enforcing
the tedious memorization of rules. That’s a shame because grammar is actually
one of the most extraordinary adaptations in all of nature.
Words are spoken in
sequence, one after another. In making sense of language, the human mind
transforms linear sequences of words into non-linear structures of thought.
This ability, known as parsing, is at the core of how the mind processes and
comprehends language.
Last night as I
was driving home, I saw a sign that read “HUGE KIDS SALE.” Knowing that our
society prohibits the sale of huge kids, I parsed the phrase into a little thought
structure shaped like this:
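“HUGE [KIDS SALE]” (not “[HUGE KIDS] SALE”)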
In parsing, the mind represents a phrase at two levels. At
the surface level are the sounds, the specific meanings of the words, and the
order of words—one thing after another. The mind transforms this into an
abstract, invisible, inaudible, tree-like structure of thought.
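The two readings of the sign can even be written out as little data structures. Here is a rough sketch in Python (nested tuples standing in for parse trees; the labels are informal, not real linguistic notation). Notice that both trees flatten back to exactly the same string of words:

```python
# Two candidate structures for the sign "HUGE KIDS SALE", written as
# nested tuples: (label, child, child). The labels are informal.
intended = ("phrase", "HUGE", ("phrase", "KIDS", "SALE"))  # a huge [kids sale]
alarming = ("phrase", ("phrase", "HUGE", "KIDS"), "SALE")  # a [huge kids] sale

def surface(tree):
    """Read off the leaves left to right: the linear order of the words."""
    if isinstance(tree, str):
        return [tree]
    _label, *children = tree
    words = []
    for child in children:
        words += surface(child)
    return words

# Both trees produce the same word order, which is why the sign stays
# ambiguous until the mind settles on one structure.
print(surface(intended))  # ['HUGE', 'KIDS', 'SALE']
print(surface(alarming))  # ['HUGE', 'KIDS', 'SALE']
```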
We can make out
the structure of a sentence even when we don’t know the meaning of all the
words. That’s why we can parse the first sentence of Lewis Carroll’s
nonsensical poem, “Jabberwocky.”
“’Twas brillig, and the slithy toves did gyre and gimble in the wabe.”
We know, for instance, that “toves” must be a plural noun and that “gyre” and “gimble” must be present-tense verbs.
We make these
transformations based on grammar. We don’t have to learn millions of different
possible sentence structures—just a small set of rules.
___
As language evolved, some words were transformed
into markers of grammar through a process known as grammaticalization. A good
example is the word “will,” which originally was a verb meaning “to want” or
“to wish,” but is now most often used to signal the future tense (as in “I will be there”). Another example is the
word “it” in the sentence, “It is
raining.” In that sentence, the word “it” has no meaning, but serves a purely
grammatical function as the subject of the sentence.
It’s hard to say
exactly how our rules of grammar developed. All human societies possess
language with fully formed systems of grammar; there are no half-baked grammars
that we can point to as examples of what the intermediate stages might have
been like.
The early stages
of grammar might have been roughly similar to a child’s early stages of
language development. The first utterances were probably single nouns:
“Bear”
then combinations of nouns:
“Dad bear”
In time, people began using some of their nouns to refer to actions, like when we say “water the plants” or “handle the situation,” thus creating verbs. They then began to string together sequences of nouns and verbs:
“Dad bear chase”
At some point, societies established agreed-upon rules of syntax that determined how the order of words affects their meaning. The rules might have determined, for example, that “Dad chase bear” was good news, and “Bear chase Dad” was bad.
The most basic
rule of syntax is that a sentence can consist of a word for a thing, a word for
an action, and often a word for another thing—otherwise known as a subject,
verb, and object. If language can be said to have one primary function, it is
to answer the question, “Who did what?” That’s what sentences do.
Sentences grew
more complex as new grammatical devices entered the language. These included
verb tenses:
“Bear chased Dad.”
articles and pronouns:
“A bear chased my dad.”
prepositional phrases:
“A bear chased my dad up a tree.”
passive constructions, in which the object is placed at the beginning of the sentence:
“My dad was chased up a tree by a bear.”
and recursion, in which phrases refer to other phrases:
“My mom said that my dad was chased up a tree by a bear.”
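Recursion is what lets sentences nest inside other sentences without limit. A minimal sketch of the idea in code (one made-up rule, nothing like the full machinery of English grammar): a rule for building a sentence that refers back to itself.

```python
# Recursion in miniature: a sentence can contain another sentence,
# so the rule for building one calls itself.
def sentence(depth):
    if depth == 0:
        return "my dad was chased up a tree by a bear"
    return "my mom said that " + sentence(depth - 1)

print(sentence(1))  # my mom said that my dad was chased up a tree by a bear
print(sentence(2))  # my mom said that my mom said that my dad was chased ...
```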
___
When we listen to someone speak, we experience each
sentence in real time, one word at a time. At any given moment, we are usually
somewhere in the middle of a sentence, yet we usually understand what the
speaker is talking about. That’s because our minds can build tree structures
“on the fly” based on the words we’ve heard so far. The process is so natural that
we become aware of it only in the rare cases when it breaks down. Try parsing this sentence:
“The old man the boat.”
Huh? You probably had to read that one a couple times
before it made sense. The first time through, you naturally assumed that “The old
man” was a noun phrase, so when “the boat” came along, you found yourself up
the creek without a verb. You had to rebuild your tree, reinterpreting it with
“old” as a noun and “man” as a verb. (“The old man the boat. The young stay
ashore.”)
“The old man the
boat” is an example of a garden path sentence, a devious sentence that lures
you into an interpretation that turns out to be a dead end. So are these:
“The man whistling tunes pianos.”
“The author wrote the novel was likely to be a best seller.”
“The cotton clothing is made of grows in Mississippi.”
“The horse raced past the barn fell.”
As garden path sentences make obvious, parsing is a matter of prediction and guesswork. As we listen to (or read) a sentence, we make guesses about how each word is probably going to fit into the overall structure of the sentence. Fortunately, those guesses are right most of the time, so we are able to parse sentences correctly at around 150 words per minute—an impressive speed.
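One way to picture the guesswork is a toy parser that always tries the most familiar reading of each word first and backs up only when the sentence cannot be completed. The sketch below is an invention for illustration: a tiny made-up lexicon, a single subject-verb-object pattern, and a “longest noun phrase first” guess standing in, very loosely, for the preferences that lead us down the garden path.

```python
# A toy model of garden-path parsing: guess the most familiar reading first,
# then reanalyze when the sentence cannot be completed. The lexicon and the
# single sentence pattern (subject, verb, object) are invented for illustration.
LEXICON = {
    "the":  ["DET"],
    "old":  ["ADJ", "NOUN"],   # usually an adjective, but "the old" can be a noun phrase
    "man":  ["NOUN", "VERB"],  # usually a noun, but also a verb ("to man the boat")
    "boat": ["NOUN"],
}

def noun_phrase_ends(words, start):
    """Positions where a noun phrase beginning at `start` could end,
    longest (most tempting) reading first."""
    ends = []
    if start < len(words) and "DET" in LEXICON.get(words[start], []):
        j = start + 1
        while j < len(words) and "ADJ" in LEXICON.get(words[j], []):
            if "NOUN" in LEXICON[words[j]]:
                ends.append(j + 1)      # e.g. "the old", with "old" as the noun
            j += 1
        if j < len(words) and "NOUN" in LEXICON.get(words[j], []):
            ends.append(j + 1)          # e.g. "the old man", with "man" as the noun
    return sorted(ends, reverse=True)

def parse(sentence):
    words = sentence.split()
    for subj_end in noun_phrase_ends(words, 0):
        subject = words[:subj_end]
        if subj_end == len(words) or "VERB" not in LEXICON.get(words[subj_end], []):
            print(f"dead end: took {' '.join(subject)!r} as the subject, but no verb follows")
            continue                    # back up and try a shorter subject
        verb = words[subj_end]
        for obj_end in noun_phrase_ends(words, subj_end + 1):
            if obj_end == len(words):   # every word accounted for
                return subject, verb, words[subj_end + 1:]
    return None

print(parse("the old man the boat"))
# dead end: took 'the old man' as the subject, but no verb follows
# (['the', 'old'], 'man', ['the', 'boat'])
```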