Tagged: prescriptivism

  • John Stokes 12:13 pm on March 12, 2012 Permalink | Reply
    Tags: grammaticality judgments, literature, newspapers, prescriptivism, text speak, txt

    Those narrow-minded, prescriptivist . . . texters? 

    Texting encourages us to be creative and unconstrained with our language, right? Traditional print media, fettered as they are by the bounds of Standard English, promote more rigid acceptability and grammaticality judgments, don’t they? Aren’t those prescriptivist editors and stodgy old style columnists just concerned with dictating how we speak and write?

    Not so, says some new research from University of Calgary linguist Joan Lee.

    Lee’s Master’s thesis tested students with varying levels of exposure to text/instant messaging versus traditional media (newspapers, magazines, literature, nonfiction). She hypothesized that those with comparatively more exposure to the free-form nature of ‘text speak’ would be comparatively more lenient in their acceptance of novel and deviant forms of words, both morphological and orthographic. What she found was the opposite. Students who had spent more time reading books and newspapers were more likely to judge novel words or deviant forms of words as acceptable. Those who had more exposure to text speak tended to be considerably more rigid and constrained in their acceptability judgments.

    This result is moderately mind-boggling. Think of all you know about texting and compare that to your expectations of the effects of traditional media on language use. From texters, we see such classics as ‘wot r u doin 2day’ and ‘ur stoopid dood’ and ‘kewl’ and, perhaps best of all, ‘kthxbai’. There’s a bunch of research discussing why texters say and spell things like this. In the preview of her thesis linked above, Lee cites one such study that says people are trying to be playful, spontaneous, socially interactive, and even creative. Compare this to what you read in the New York Times, where there’s actually someone who writes articles nitpicking the grammar of other traditional-print-media articles, including sometimes the NYTimes itself!

    I haven’t read the whole 150-page thesis yet, but it seems there are several plausible explanations for what’s going on. One idea is that for a novel form to be acceptable to texters, it must be a novel form commonly seen in text speak. So while there may not be the same types of grammatical constraints, there are conventions that are respected nonetheless. Or perhaps text speak is still free-form and not subject to constraints in the ways that traditional media might be, but free-formedness doesn’t actually equate with creativity. So while you’re tossing standardized grammar out the window, you’re not necessarily looking for the most precise, novel, creative word to express a given idea. You may be using words with odd morphology and spelling, but you’re probably not reaching deep into the dictionary to find those words. This means that readers of text speak don’t often see words they’re unfamiliar with and thus don’t often have to figure out what those unfamiliar words might mean.

    If, on the other hand, you’re a big-time reader of traditional print media, you probably encounter unusual words all the time. To figure out what they mean, you either infer their meaning from context or use your knowledge of productive morphology like -ity or -ness. This could explain why texters are ok with ur and kthxbai and wot, but not some of the novel forms that Lee proposed in her study, like canality and groundness. If you read and don’t text, the textisms may be ridiculous to you, but you might be able to come up with a meaning for canality and groundness that makes sense, and thus conclude they’re acceptable words.

    I’m sure much more will be said on this front in the near future. For now, I think it’s enough to note this interesting bit of research and sign off — TTYL, folks.

     
    • Chad Nilep 7:39 pm on March 13, 2012 Permalink | Reply

      I haven’t read the thesis at all, but it seems there are several plausible arguments that this phenomenon is not going on. According to Lee’s abstract, she administered a questionnaire to 33 university students. Small sample size, homogeneity of the subject pool, or confounded variables might cause such research to turn up artefacts that don’t hold in the larger population of texters and newspaper-readers to which they are extrapolated.

      It sounds like an interesting study, and I’ll add it to my queue of things to read in my spare time. But I’m not confident that Lee’s findings would necessarily hold up in a larger study. I’m more confident, however, that if much more is said on this front in the near future, some newspapers will report the results as though they had been proven true of a much broader general public.

    • Josh 11:33 pm on April 4, 2012 Permalink | Reply

      I can’t see past page 17 of the article, but the abstract says the sample size is 33, which seems awfully low. I’d also be curious if she controlled for gender/income/some proxy for intelligence since it’s easy to imagine some variable like this being highly correlated with text/instant message frequency. But I imagine these issues were probably addressed in pages 18-150!

      • Josh 11:34 pm on April 4, 2012 Permalink | Reply

        Note: I didn’t see Chad’s comment when I first posted.

  • The Diacritics 1:11 pm on December 19, 2011 Permalink | Reply
    Tags: english teachers, prescriptivism

    Speaking with precision 

    (posted by John)

    My first semester of law school is drawing to a close, so I thought I would write about something I heard on my very first day. I’ve been mulling it over since then, partially because at first blush it runs so counter to my beliefs about prescriptivism and the ‘rightness’ of one person’s language over another’s. Professor John Langbein finished his riveting orientation talk on the history of law schools in America with a lament about the debasement of the English language my generation is committing. My immediate reaction, as you might guess, was a bit of haughty “This old fogey just doesn’t get it. Prescriptivism is dumb!”

    But on at least some level, he was right. Professor Langbein’s point was not that language shouldn’t change because change is bad. His point was that it’s easy to lose some of the aspects of language that are most valuable—especially to someone trying to become a lawyer. To me his most potent example was the loss of precision in language, which he blamed on the overlarge number of outlets for spewing our thoughts to others. Cell phone, text, Facebook, Twitter—you catch the drift, I’m sure. It seems every major newspaper has a bi-monthly requirement for an editorial about the over-share phenomenon of Facebook statuses and Twitter updates.

    Langbein wasn’t quite talking about this, though. Think about a recent conversation you’ve had, in which you related the contents of an interaction with another person. Did it run something along the lines of “I was like . . . Then he was like . . . Then I was just like whatever and left”? It may not have, but if you do some good ol’ eavesdropping on the street you’re sure to hear something like it. (Or if you’re lucky you might get “And I was all . . . Then she was all . . . Then I was all . . . .”) This is one of the things (← there’s another one of them) that dismayed Professor Langbein. “Is that really what you were like?” he asked us. He gave other examples, too. Overusing “thing” was one of them. Another was prefacing a point we haven’t fully thought out and can’t very well express with “You know, uh, . . .” and then proceeding on our muddled way. Another was compensating for a poorly-thought-out sentence by ending it with an “. . . or whatever.”

    We can all get our point across using imprecise language, and the linguist in me recoils at the thought of saying it’s actually ‘wrong’ to do so. But you can be sure that being imprecise is one of the quickest routes to becoming an inept law student (not to mention a bad lawyer).

    So I’ll cede the point: it is worthwhile to attempt to be precise in language. If we don’t use vague fillers like “or whatever” and if we avoid saying “thing” whenever the right word doesn’t immediately come to mind, it forces us to organize our thoughts more clearly. Using precise language makes us think more precisely. I tried spending a day saying precisely what I meant every time I spoke. It was exceedingly difficult, but it seemed to help my mental organization.

    Based on our knowledge of how language allows us to think complex thoughts in the first place, it makes sense that being more precise in our speech would make us more precise in our thinking. I wrote a post a while back looking at some of Liz Spelke’s experiments that suggest language lets otherwise distinct, insulated modules of intelligence interact, thereby making us ‘smart’ compared to other species. One experiment I didn’t discuss there shows that language allows us to grasp the concept of “sets of individuals.” Babies and monkeys can distinguish “individuals” and they can distinguish “sets,” and when the set contains fewer than four items, they recognize that adding or subtracting an individual changes the size of the set. But when the set is larger than four, they cannot combine the representations of ‘set’ and ‘individual’ to understand that it is a “set of individuals” such that adding or subtracting one changes the quantity. Only once we have language is this possible.

    There are also sad but interesting cases of so-called ‘feral children’ who have been deprived of exposure to language from a very young age. These people never fully learn a language. They are also unable to perform tasks indicative of ‘higher’ human intelligence—for example, distinguishing which of two massed quantities is larger. According to still more research by Spelke and others, children without language and other animals like monkeys can distinguish between larger and smaller quantities at a ratio of about 2:1. If the quantities get much closer in number, it becomes difficult for them to guess correctly. Humans with language can do this at a considerably better rate.

    Finally, the emergence of language, some have argued, is associated with a cultural explosion of sorts: more complex tools, recursive patterns on bits of pottery, even materials that look like they could be used to go fishing. The idea is that language allowed us to do the ‘higher thought’ necessary to develop culture.

    All of this evidence suggests that we are able to think complex, highly structured thoughts in large part because we have language. It also suggests I should take Professor Langbein’s advice: you know, try not to be like, “Let’s speak more clearly or whatever.”

     
    • John Cowan 3:47 pm on December 19, 2011 Permalink | Reply

      Your first judgement on Langbein was right. “I was all/like” is a perfectly sensible quotative form, and not the least bit vague or ambiguous. Taking it literally is a ludicrous rhetorical strategy suitable only for bonking a captive audience on the head, like complaining about “It was colder than hell” because Hell is traditionally a place of fire. Does he criticize people for using the vague word “said” when so many other “more precise” terms such as shouted, hissed, shrieked, muttered, babbled, grunted, ejaculated, or bloviated are available? I suspect not.

      And I doubt he would have a word to say to anyone who used such older hedges as “as it were,” “in a sense,” “in a manner of speaking,” “if you will,” and so on, though they too are used to mark “a point we haven’t fully thought out and can’t very well express”. Human beings are not expected to formulate all their thoughts in advance and spew them out in perfectly formed standard prose — unless they are giving speeches, and even then, the speech will not sound natural if they do.

      No, let Langbein stick to law or learn the facts about what he’s talking about.

    • Phil 12:29 am on December 23, 2011 Permalink | Reply

      What your professor is really bemoaning is the demise of a world where the uneducated working classes were denied access to the means with which to publish their thoughts.

      Once upon a time, the only way your ideas were ever permitted to be put on paper was if they were first rigidly screened by arch-conservative editors, and if you’d spent years at a private school, being caned for ending sentences with prepositions and having all originality beaten out of you.

      Today, the uneducated masses have Twitter, blogs, tabloids, Facebook, Tumblr, and god knows what else. The foremost triumph of democracy is the expansion of access to ‘voice’, as political scientists would term it.

      They may not be able to express themselves as clearly as we’d like, but the question arises whether that is due to their lack of ability or our lack of flexibility. We are accustomed to seeing information presented in a certain manner, but a departure from that manner does not harm intelligibility; our objection to it is merely a reflection of systemic elitism.

      For instance, we undeniably understand that “she was like, no way, i wasn’t there!!1lol” equates perfectly well to “She vehemently denied her involvement”, and yet we can’t help but form certain preconceptions about the authors of the two statements which make it far easier to dismiss the former out of hand. Perhaps that is what we ought to work on, not on trying to make the masses conform to our narrow class-based preconception of how intelligence correlates with register.

  • The Diacritics 4:18 pm on December 13, 2011 Permalink | Reply
    Tags: bad lip reading, dubs, fake english, funny, history of english, mcgurk effect, prescriptivism, videos

    Lots of language videos 

    Stephen Fry rails against pedantic prescriptivists: “Sod them to Hades!”

    Bad Lip Reading, whose hilarious dubs bring to mind the McGurk Effect, reimagines the words of disgraced Republican candidate Herman Cain: “Mexican people don’t eat sugar, especially when it’s a mixture of lice and tiger DNA!”

    The Open University describes the history of English in a charming cartoon video.

    Finally, a short film that captures the cadences and sounds of normal spoken English while being utterly nonsensical. Apparently it’s intended to show how American English sounds to others. (Family Guy returns the favor, making fun of how British English sounds to Americans.)

     
    • krish 1:05 am on December 20, 2011 Permalink | Reply

      I MADE IT ON THE BLOG!!
