The degree to which this is true was brought home to me a couple of weeks ago, when I realized that I had to address "text spelling." Since I started TAing in grad school about 7 years ago, I've gotten a sense of the lay of the land when it comes to spelling mistakes. Certainly some students make more and some make fewer, but the general logic is usually the same, and a lot of the mistakes I used to see were similar (next time someone starts yammering to you about the Ivy League, let it be known that both Ivy League and GED students of mine have used the incorrect word "alot"). Now some new stuff is starting to turn up: spelling "when" as "wen," for example, or "that" as "dat."

So I got my students to help me make a list of text spellings, and they really got into it; they were pleased to be asked about something they felt like experts on. There were arguments about variant spellings. There were discussions about who uses which kinds of spellings when. There was a brief complaint from one student who insisted that she always texts in complete sentences (which I do not believe; c'mon, even I don't do that!), but once she saw everyone else getting into it, she dove in too. I gave the students worksheets on switching back and forth between spellings, based on what they told me, which they really liked, and there's been a noticeable falloff in the use of text spelling in essay writing ever since. This is the kind of discussion I am always wishing we could have about dialect, but the stigma is just too great.
So here's my latest thought in the never-ending search for a good way to teach standard grammar. There's a relatively new field out there called forensic linguistics, which researches how linguistics can be used to assess the authenticity (or lack thereof) of contested documents, or to test legal claims about how questions were asked in court based on the transcripts. The idea is both that each person has a quantifiable linguistic signature (tendencies to use certain phrases in certain ways), and that one can read a document and hypothesize about the identity of the writer based on linguistic tics that occur in particular places or communities. I wonder if approaching grammar as a detective technique applied to other people's texts, instead of to one's own speech or writing, would help. And the idea of how you'd try to fool someone who was trying to identify you is pretty appealing: not, of course, for criminal purposes, but it could be a less uncomfortable way of thinking about codeswitching. Maybe.
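To make the "quantifiable linguistic signature" idea concrete, here's a toy sketch of the kind of thing it means: compare how often two texts use a handful of common function words, then measure how similar the resulting frequency profiles are. This is purely illustrative (the word list and sample sentences are made up, and real forensic work is far more sophisticated), but it shows the basic mechanics of a stylometric comparison.

```python
from collections import Counter
import math

# A handful of common English function words. Real stylometric studies use
# much larger, carefully chosen word lists; this set is just for illustration.
FUNCTION_WORDS = ["the", "a", "and", "of", "to", "in", "that", "is", "it", "but"]

def signature(text):
    """Relative frequency of each function word in the text."""
    words = text.lower().split()
    counts = Counter(words)
    total = len(words) or 1
    return [counts[w] / total for w in FUNCTION_WORDS]

def cosine_similarity(a, b):
    """Cosine of the angle between two frequency vectors (1.0 = identical profile)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

# Two made-up samples: different topics, same habits with function words.
sample_a = "the cat sat on the mat and the dog sat in the hall"
sample_b = "the rain fell on the roof and the wind blew in the night"
print(round(cosine_similarity(signature(sample_a), signature(sample_b)), 3))
```

The point of the exercise, in classroom terms, would be that the "signature" survives a change of subject matter: two texts about completely different things can still share the same profile of little words, which is exactly the kind of tic a student playing detective would learn to notice, and to disguise.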