Grammar Mining (and the collected Mark SLA Lexicon)

Many of us agree that teaching “at the point of need” (as I believe Meddings and Thornbury put it) is an ideal context for formal grammar teaching.  Students’ trying to communicate something provides clear evidence that they need the grammar that would facilitate communicating it, and depending on how close they come to natural expression, evidence that their internal representation of English is capable of taking on this additional piece of information.

In interlanguage punting, I conjectured that taking a guess at grammar students may need in the future and organizing a lesson around a particular grammar point was justifiable if the lessons you used to introduce that grammar would be memorable long enough for a “point of need” to be found before the lesson was forgotten.  At the time, I was teaching weekly 1-hour grammar workshops with rotating groups of students at different levels, and as I could not teach reactively, I had to justify my grammar-first (formS-focused) approach.

Read on for the last post before the new semester starts.



Stuff I will miss 3: Intensely bitter grapes

I’ve been out of Japan, my home for almost all of my adult life, for a year now.  There are now some things I’d legitimately think of planning a vacation around to experience again.  Some of these are of the nostalgic flavor variety, and others are more profound.

(If you had had the power to predict back in 1987 that one of the effects of a global information network would be that everything textual had to be organized as a list, would that have helped you to make any wise investments?  What could you do with that information besides corner the market on scroll bars?)

Air conditioning.  People tend to think of Japan as an ancient country.  I disagree with the concept of “ancient countries” on philosophical and historical grounds (“ancient” being but one of many possible stances a modern person can take on the history of a current political entity), but in any case, you see no evidence of this claim living in an apartment in Japan.  It’s quite rare to find any domicile more than 30 years old, and the facilities within any given residence are bound to be either brand new or from around the same time the domicile itself was built (again, not a very long time ago).  Modularity is the philosophy of most interiors, leading to replaceable fluorescent ceiling fixtures, temporary flooring (often the rugs, tile, and the wafer-thin floor itself), and detachable wall AC/heating units.  The philosophy of housing in Japan seems similar to the philosophy of food in the US – ultrarational and at the convenience of industry.  My current residence in the US is older than any building I lived in in Japan, and its AC sounds like a fleet of Huey helicopters.  The idea that American buildings are old and sturdy and Japanese buildings are new and full of ad hoc fixes clashes with stereotype, but more importantly, sometimes slapdashness has perks in easy upgrades.  The AC in our school in Japan was practically silent in comparison.  If only the walls had been made of material capable of keeping heat in.

Unsweetened food choice architecture.  I still believe the point that I used to make, that the stereotype about everything in the US being incredibly sweet is false.  However, the sweet things here are definitely more prominently placed and almost always the first thing you notice on any given supermarket shelf.  There are croissants among the danishes and donuts, and plain yogurt next to the vanilla and strawberry yogurt, but the sweeter options are at least 2/3 of the options available and always hit the eye first.  This doesn’t technically force you to buy sugar-coated or HFCS-filled products, but it does make them harder to ignore.  Shopping here tends to nudge you towards cavities.  At least the dentists wear gloves here.

Meiwaku.  Interaction has a steep transaction cost in Japan.  Initiating random conversation, asking an off-script question to someone whose job isn’t specifically answering questions, mingling, sharing, and basically doing anything that anyone else might conceivably see all come weighted with fees.  Those fees come in the form of aversion to and fear of being called out for disturbing others (迷惑 meiwaku).  I don’t remember if meiwaku was in that book on Cultural Code Words, but it definitely should be – if there’s any single word that explains public behavior in Japan, it’s meiwaku.  Meiwaku, to me, is why people don’t usually talk on trains, get up to leave during the credits at the movies, or call out or loudly resist perverts on public transportation.  I see the threat of being accused of causing meiwaku as an additional burden that one feels in every public action, encouraging fairly extreme adherence to rules because the threat looms so large.  If it were a monetary cost, it would be a sales tax of 20% on all verbal transactions, pretty strongly disincentivizing public garrulousness.  The thing is, the revenue raised by this tax also allows for quiet and well-maintained (or at least not willfully destroyed) public spaces and a priority placed on social order, and it is something you can begin to miss in its absence.  Its near-complete lack in the US produces something also undesirable – reckless and risky behavior brought on by a lack of cost for disturbing other people – a recklessness subsidy, accompanied by a widely accepted cultural value on confidence and asserting oneself.  In public in the US, all options for public behavior are available to everyone at no cost in face or shame.  As a result, people avoid boring, on-rails conversations but are much more likely to engage in all manner of misanthropic behavior because of how highly weighted self-expression is over public order or consideration of others.

The dream of a pretense of post-racial society.  Japan’s mainstream concept of itself is explicitly racial.  Not “we hate other races”, but “Japan’s story is the story of our race”.  I’ve come to think that most countries are this way, and a legalistic, “our club is the best but anyone is welcome to join as long as they follow the rules” definition of citizenship is a distinct minority of countries.  Now, if one squinted at US history and let the bloody details fade into the background, one could have said that this was the story Americans have always told themselves – that America is an idea that rises above race and class.  In fact, it was true until recently that even the most conservative politicians publicly espoused a legalistic rather than blood-and-soil definition of citizenship.  I worry that having had to defend this president will cause larger parts of conservative America to abandon even the rhetoric of equality.  Cognitive dissonance and all that.

I knew there were elements in the US who envied the easy worldview offered by an explicitly racial view of their nation, and sought to tie Americanness to a mythic concept of “whiteness” just as Japan’s is to “Japaneseness”. I didn’t think these people would ever have a voice in polite society or have their chosen candidate win a national election.

Of course, it seems silly to say I miss the rose-colored view of my home country that I had while I was away from it, but that is the truth.  I miss having the US as an example of a country that didn’t mythologize itself as the work of one uniquely virtuous race while I lived in one that did.

Shallow roots.  The US is unique not in its newness (its government is actually very old compared to that of most countries) but in its inability to pretend to be ancient.  Most people, when asked the age of a country like Japan, will inevitably answer in the thousands of years.  If you consider a claim like this in practical terms, that means either that the denizens of these islands have imagined themselves to be part of a Japanese polity before they even grew rice or that a series of governments, even without continuity from one to another, have been able to exercise control over these islands since the bronze age without the benefit of any communication faster than a pony (or in the earliest days, any form of writing).  Nonetheless, part of the current cultural software in some countries like Japan is a claim to continuous existence back into antiquity, made plausible by some historical facts (people really have lived here for a long time) and some guessing enabled by lack of evidence (nobody knows what these people did besides live in huts and make cord-inscribed pottery).  The US, with all of the history of its current government being part of the written record, cannot feasibly claim any of this.

Belonging to an “ancient society” weaponizes many of the arguments conservatives tend to make in every society – that our way of life has been ever thus, distinctive and characteristic of our land and people, until vulgarians and foreigners infiltrated it and began its corrosion.  Of course, you hear these arguments in the US too, but the difference is that in an “ancient society” everyone starts the discussion by ceding most of the conservatives’ points.  Yes, our way of life has existed unperturbed for millennia, but maybe it wouldn’t be so bad to give women maternity leave.  Yes, our culture is unique and distinctive, but a little labor flexibility would help our economy’s competitiveness.  Progressives need to start with an acknowledgment of what they are breaking with or they will lose most people’s sympathy.  As I said, the US has versions of these arguments, but people often have to outsource their roots (“Western” or “Judeo-Christian”, nothing on this continent) and the mists of time don’t seem to reach much further back than 1955.  A loss of national or cultural identity can be quite freeing.

Of course, this is a list of the things I miss, and as in the last entry, moving here has certainly disabused me of my faith in my fellow citizens’ resilience in the face of appeals from antiquity.  The president seems to come to his particular flavor of chauvinism simply by liking things the way he’s used to (white by default and free of the burden of empathy), but others, even progressives, have come to embrace the framing of conservative vs. liberal as traditional vs. non-traditional or continuity vs. interruption.  I suppose I had hoped I would be coming back to a society that saw interruption as its tradition.

Let’s end on something light…

Natto maki.  More sour beans than sour grapes.

Source.  I’ll be sure to try them if I’m ever in Ho Chi Minh City.

The urinal cake principle

Imagine yourself pushing through a crowded train station during rush hour.  As you pass a certain doorway, you detect hints of lavender and hibiscus coming from within.  Do these smells evoke:

  1. just flowers, or
  2. first flowers then toilets based on your prior knowledge and experience regarding the likelihood that lavender is blooming within 100 feet of a subway platform, or
  3. just toilets?

This is the best way for me to understand the principle of dead metaphors.  A dead metaphor is the result of cutting out a semiotic middleman.  The process of a metaphor dying is a powerful and inevitable one that affects culture and particularly language in some subtle ways, as I hope to illustrate in as colorful a way as I can.

The process, in dense and unfriendly language, is this: The definition of a symbol over time changes from the first thing (flowers) to the second thing indirectly through the first thing (toilets via floral-scented deodorizing discs), and finally just comes to stand for the second thing (toilets).  This can be true even if the form of the symbol does not change – e.g. if the deodorizer continues to have a floral scent.  The reference stops being indirect and just goes straight for the thing that it was always eventually getting at.

I’ve been trying to think of more real-world examples of this principle in action.  Here are a few more:

A clothing brand (say, Members Only) is associated with rich people.  Poor people start to buy that clothing brand on sale or used because it evokes rich people.  The brand comes to be associated with poor people’s desperation to look rich.  (Louis Vuitton in Japan is rumored to head off this effect by buying back their used bags and burning them to prevent them from going to the secondhand market.)

“Political correctness” is a recognized phenomenon in polite discourse.  Reactionaries vocally dislike it and use it as a stick with which to beat their cultural enemies.  It comes to be much more widely known for its rhetorical value in reactionary discourse (specifically, their hatred of it) than as a phenomenon within polite discourse.

A famous person with an idiosyncratic name (say, “Zoey”) appears.  People who have kids during the zenith of that person’s career name their kids after him/her.  That name comes to be associated with that generation of kids rather than the famous person.

Taboo concepts have euphemisms invented to avoid direct reference to them while still maintaining the ability to refer to them indirectly if necessary.  Subsequent generations come to think of the euphemisms as simply the names for those taboo concepts, since those are usually the only words for those concepts that they ever hear.  Those generations invent new euphemisms to replace the no-longer-thought-to-be-indirect old ones.

When I was studying Japanese before moving there, we learned 便所 benjo (“convenient place”) as “bathroom”, when practically nobody alive now still uses that word.  Sometime between the 50s and now they were お手洗い otearai (“hand-wash”) or 化粧室 keshoushitsu (“powder room”). Now they are called トイレ toire, presumably until the next generation starts calling them something like ルーム ruumu.

Hipster beards are destined to go from “modern guy” to “guy who still thinks beards are modern” to “guy who doesn’t know that nobody thinks beards are modern” and in 20 or so years “guy who affects not caring that hipster beards are not considered modern” and back to just “modern guy” again.  Believe me; it happened to skinny ties.

Most words, unless they were invented very recently, are dead metaphors or have changed meanings through some other process.  A word’s history is like the miles of junk DNA we carry around with us in our cells, only using a small portion to form proteins, transmit messages or enjoy an inside joke.  Words like “give up” (my favorite example, a clear dead metaphor), “Wednesday”, or “cranberry” have stories behind their present forms and usages that are very interesting, but also very optional.  Each living word has untold numbers of lost meanings (in addition to its likely numerous current meanings) which we don’t have and don’t need access to in order to use it.  The process by which a word’s meaning changes isn’t always the middleman-eliding of the dead metaphor, but the idea that one doesn’t need all the information about the past referents of a given token to understand it in its current context is the same.

We language teachers often pride ourselves on the elaborate stories and setups that we use to explain usage of one language item or another.  One time I attended a presentation that asked us straightforwardly, “Why do you use on for buses but in for cars?”, to which several teachers laid out the possibly-made-up-but-convincing stories that they give their students.  These stories can definitely be useful for appearing to give students stage-by-stage mastery and “permission” to use a particular language item, things I definitely wanted in my early stages of Japanese learning.  Nowadays, I tend to think of these more as a bootstrap or a foot in the door (are those dead metaphors?) than as understanding itself, more affective factors than what we would usually call L2 competence.  Naturally, the end goal of most language learning is to have a grasp of the language similar to that of the fluent speakers in your target community, not to have complete explicit knowledge of the target language (although many learners confuse these two – some teachers as well).  One does not need to know the origin of a word or the logic behind its present form to use it correctly any more than one needs to have smelled fresh lavender as a reference point to know what that same smell means at the train station.


My black robes

According to The Impact, a judge has an unusually strong effect on mental health patients in causing them to follow treatment plans.  This phenomenon is called the black robe effect, a name based on what is perhaps a metaphor for, and perhaps the real, physical source of, the judge’s authority.  After only listening to the episode and googling the term “black robe effect” once, this is my understanding of the effect:

  • The effect on the patient is due to the outward signs of authority that the judge carries;
  • The effect is in causing otherwise uncooperative patients under the judge’s purview to follow advice/orders already known to those patients (i.e. the judge is not the originator of the advice/orders);
  • Most of the effect is realized in the judge’s absence as an indirect effect of his/her authority (e.g. when the patient takes a daily medication at home).

The basic outline of this effect is something I’ve found to be a major part of my job as an ESL or EFL teacher.  I’m often in the position of telling my students to do things that they could feasibly do without anyone’s saying anything, but that they’re much more likely to do when I tell them.  This is probably the one way in which I most reliably assume the “teacher role” and exercise my authority.

In fact, this is probably one of the best justifications nowadays for teachers existing at all.  We are great at causing (or forcing or allowing or facilitating; I’m not picky on the causal metaphor) people to do things that they could always do for free, and ideally creating norm-governed communities where success at those things is celebrated.  We definitely aren’t the only ones in the room anymore with access to the right information – students have all the human knowledge in the world in their pockets.  We have authority and an agreed-upon role as an arbiter of the values of our in-class community, and not much else.

Reading circles are a good example of the black robe effect in my classes.  This semester, one of my classes has read a non-fiction book over the course of a couple of months, and every 2 weeks during that time we’ve done reading circles that cover the chapters we read in the previous week (for the curious, here are the roles that I use).  Now what is my role in “teaching” the weeks that we share our reading circles sheets?  It’s pretty much the black robe effect without the gavel:

  • The effect on the students is due to the outward signs of authority that the teacher carries; (i.e. they do it because the person in the front of the room told them to)
  • The effect is in causing otherwise uncooperative students under the teacher’s purview to follow advice/orders already known to those students; (i.e. the book we’re reading has always been available to buy, as are millions of other fine books – “uncooperative” here means “wouldn’t do it by default”)
  • Most of the effect is realized in the teacher’s absence (e.g. when the student reads at home – and although I’m physically present in the classroom when they’re sharing their reading circles, I’m not participating, so then too).

One of my staple activities is even more of a textbook example of a black robe effect – I give students something called a Language Log, which is basically a blank sheet with spaces for English input (things they watched or read or people they talked to) outside the classroom and what they noticed.  Nothing about the sheet requires some deep knowledge on the part of the teacher to design or implement – it is a kind of educational MacGuffin that furthers the goals of language development without containing anything meaningful itself (the educational MacGuffin was a staple of my classes back in Japan too).  Still, if some non-authority or even one of the student’s family members gave them the same sheet and instructed him/her to keep track of input, it would not work – family members, in ESL and in mental health treatment, don’t get to wear black robes.

I’ll post again at a later date about what exactly my black robes comprise.

On Tyranny in the ESL Classroom: 20 Lessons from 20th Century Pedagogy

(Pace Timothy Snyder – originally this post was going to be “Democratic Backsliding in the ELT Classroom”, but I haven’t actually read the relevant materials for that.  The point is the same, though – a series of semi-political tips for not letting classes or institutions slide into transmissive dictatorships.  The usual caveat applies:  I certainly don’t apply as many of these rules as I’d like, and in fact wrote this partly as a warning to myself.)

Do not obey in advance

Let’s assume your students have shown a pattern of reluctance to choose input for themselves or engage in self-directed learning, which is common in language classrooms around the world.  Do not assume that this pattern will continue forever, and do not change your teaching methods in anticipation of this reluctance even before it happens.  Do not treat your students as unready for communicative or other modern methods simply because previous classes may have been.

Defend institutions

Defend modern ELT in principle.  Many classes slide into teacher-domination because expedience seems to demand it – because teachers accept the unilateral authority that the forces of student expectation and curricular deadlines seem to require.  Temporary suspensions of student-centeredness in favor of transmission-style teaching should be resisted, not just because they do not work, but because they encourage the view that  researched and rigorous concepts such as interlanguage are inconveniences standing in the way of truly efficient impartation of knowledge.  In reality, of course, that efficiency is more a path toward perfunctory teacherly role-playing than toward learners’ mastery of English.

Beware the one-party state

Many classroom dictatorships arise not because a teacher arrogates power but because his/her pupils choose to cede it when given the option.  Do not take opportunities that students give you to take full control of the classroom, and do not use your authority as a teacher to consolidate attention and legitimate authority around yourself.

Take responsibility for the face of the world

The appearance of the classroom should not reflect the will of a single person.  The only writing on the whiteboard should not be the teacher’s, the only printed text used should not be from the teacher, and the only voice heard should not be the teacher’s.  Classrooms should physically manifest the priority given to students’, not teachers’, expression.

Remember professional ethics

Oftentimes, a teacher-centered class emerges because students feel pressure to play the part of the student as they understand it.  This part, which is often defined by passive receptivity and obedience, is not simply unconscious habit – students may see it as an affirmative moral value in itself.  That is, the job of the teacher may not be just to present a more interesting alternative to silent absorption of information, but also to actively discourage students’ preconceived ideas of “how to be a student”.  Students have their own professional ethics of classroom conduct, and teachers would do well to acknowledge their existence.

(Yes, this is the opposite of Timothy Snyder’s point on this subject.  Bear with me.)

Be wary of paramilitaries

Clusters of students that are apparently sympathetic to the communicative, egalitarian, task-based curriculum that the teacher is trying to effect may appear and begin to dominate classroom activities.  The existence of these seeming allies among the student population is welcome to a degree, but can begin to create a hostile environment for students who are reluctant to engage to the same degree for reasons of identity or ability.  Remember that the job of the teacher is not to give more advantage to students who are already advantaged because of a higher starting point or previous experience with modern ELT classes, or to signal a preference for those students.  The creation of a privileged minority of students within the classroom should be avoided.

Be reflective if you must be armed

For students: Being appointed, being selected, or volunteering to be group leader means that you are responsible for the maintenance of communicative norms within that group.  When you have power over your classmates, maintain norms of discourse that do not privilege particular viewpoints – yours especially – or consist only of participation by students who are already fluent speakers.  Some students will take the reduced numbers of eyes on them when working in a small group as an invitation to dominate the conversation or to shrink back into individual study.  As the local authority, your job is to prevent either of these from happening.

Stand out

Taking a modern, communicative approach may distinguish you from your colleagues in ways that are mutually uncomfortable.  You may feel that you are passing judgment on your colleagues’ or institution’s way of doing things by breaking from it.  Indeed, some teaching milieux may have norms so deeply established for so long that trying something new is seen as synonymous with questioning everyone else’s competence.  Be open about trying new techniques and approaches and be honest about their success or failure.  Be prepared to justify them with reference to research.  Above all, be honest about why you teach the way you do, and do not acquiesce to unjustifiable pedagogical norms no matter how many people with pages-long CVs are pushing them.

Be kind to our language

Do not adopt buzzwords needlessly, and certainly do not use them without understanding them.  “Learning styles” were a litmus test for being a modern teacher for 15 years or so, during which many teachers described their classes and students with the vocabulary of what turned out to be a false theory of educational psychology.  Many still use the terminology of “learning styles”, describing an activity as “ideal for kinesthetic learners” when they could just as easily call it “less boring than sitting still”.  By adopting this terminology, teachers have appeared to endorse a theory which was debunked.

Believe in truth

In some teaching contexts, a long career is seen as a substitute for reflected-upon experience and confidence in one’s methods as equivalent to knowledge of their efficacy.  Foreign language pedagogy is a field with a long history and plenty of research.  This body of research is mature enough to offer at least some tentative answers to long-standing questions in our field, such as how central formal grammar should be in classes and how much of a difference input makes.  Access to the current state of knowledge on questions like these, and more importantly, believing that the questions have answers that can’t be ignored in favor of a local or individual long-practiced method, is a step toward more effective and more justifiable pedagogy.

Investigate

That said, the answers to pedagogy’s big questions may not come in an obvious form.  Sometimes a teacher will have great success with a method or technique that appears to come from the Middle Ages.  Commit to trying to understand how different teachers have success with different class styles and the principles underlying that success.  Above all, do not accept pedagogical prescription or proscription without the application of your critical faculties.

Make eye contact and small talk

Humanity can be brought to the classroom by simple engagement with learners as people.  Some one-on-one or small group interaction with the teacher not as a fount of wisdom but just as a person, and with the learner not as a receptacle of knowledge or target of remediation but as another person, can bring much-needed humanity back to the classroom.

Practice corporeal politics

PhD researchers who don’t teach and chalk-faced teachers who don’t reflect on practice or theory are a perfect recipe for each other’s stagnation.  Take theory that comes from people who haven’t set foot in a language classroom in years with a grain of salt.  You cannot realize good pedagogical theory without contact with learners.  I mean this in two ways – your theory will be useless if it doesn’t survive contact with actual people, and putting your theory into practice with your own students ensures that at least some people will benefit from it.

Establish a private life

You do not need to share as much with your learners as they share with you.  There is a happy medium between sterile professionalism in the classroom and complete shedding of boundaries.  Affective factors certainly do affect achievement, and that entails at least some rapport and sense of community beyond a shared interest in skillbuilding.  However, oversharing runs the risk of reducing the teacher to merely an affective variable and not an expert in either the subject or how to teach it.

Contribute to good causes

A local, institutional professional culture may fall short of maintaining pedagogical standards.  Sometimes, a national or international group, formal or informal, may function better as a community of practice for a teacher hoping to grow and keep up with current wisdom.  In any case, join (i.e., send money), attend, and especially present.  If a group of which you are a member is failing to provide something of value, you should provide it instead.

Learn from peers in other countries

ELT and especially SLA are worldwide fields, and different cultures, countries, and institutions around the world often practice radically different pedagogy.  Staying in one milieu for too long threatens to particularize your skillset; working in many countries or at least communicating with fellow teachers and learners in other countries exposes you to different sorts of problems to be solved and ways of solving them.  A frequent stumbling block in your milieu may have an extremely commonsense solution elsewhere in the world – and you may be surprised by the depth of thought that goes into an issue you thought only had one answer.

Listen for dangerous words

Pedagogy can be circumscribed a bit too cleanly by the words used to describe it.  “Syllabus”, “material”, “instruction”, “grammar”, “participation”, “master” and even “know” are all words that language teachers have good reason to take with several grains of salt.  If you hear these words being used as if their meanings were obvious, and especially if they are being used with obviously mistaken meanings, don’t be afraid to ask, “what do you mean?”  Often, the most useful discussions with colleagues and students occur over supposedly commonsense terms.

Be calm when the unthinkable arrives

Emergencies and exceptions are dangerous times.  The last day before the test might seem like a time when the norms of student-centeredness might best be suspended in favor of teacher-led review sessions.  This might even be presented as the only responsible option.  Of course, if teacher-centeredness is the most responsible path right before an exam, another exam will come soon, and the exceptional circumstance might be stretched a bit longer.  In fact, every lesson contains something of vital importance which seems to deserve priority over the luxuries of free student participation and self-directed learning.  There are always circumstances that would seem to make every class session a temporary exception or an emergency and cause the teacher to resort to a more “efficient” method.  Be very suspicious of exhortations or injunctions based on the supposedly unique circumstances of the present class period.

Be a patriot

Be a teacher, not a deliverer or keeper of information.  You can take for granted that you know the subject matter better than your students.  Knowing the metalanguage around your subject matter, including serious-sounding terms like “adjective clause”, makes it easier for you to convince other native speakers that you really earn your paycheck, but of course you will never catch up to Google search in your grammar knowledge.  Your job is bringing other people to a more complete understanding (see “dangerous words”) of the subject matter, not just knowing it yourself, and certainly not impressing your students with how much more than them you know.

Be as courageous as you can

If none of us is prepared to work for our betterment, then all of us will labor under mediocrity.

My Knowing You Has Moral Value of Life and Death

I’ve been pushed for the first time since the early weeks of this blog to comment on vegetarianism, thanks to a thoughtful post by Wandering ELT.

Like most of the strongly held opinions that made up my identity when I was in college, such as the utility of taxing custom rims or the superiority of Megadeth over Metallica, vegetarianism has turned from ideology into mere habit.  It still exists like an old UCI sweatshirt as a vestige of the intellectual life I used to have.  I still practice it (and still listen to Megadeth) more because it’s what I did yesterday and not because I am a consistently, mindfully moral person.  Obviously, no completely moral person can listen to Megadeth as much as I do.

Most of this post will have the odor of long-dormant dogma reactivated.  If I begin to sound too strident, just be glad you know me now and not when I was 22.

The moral crisis that fomented the change in my life from meat eating to not began with the death of one of my dogs in the summer of 2001.  I felt suddenly aware of how much his admittedly simple life had meant to me, and how distressing it was to think of his last moments of suffering.  I suppose almost any pet owner in the same situation feels the same things.  This time, for some reason, I was also very aware of just how few beings in the world would be capable of drawing this kind of reaction, and as weeks went on afterward, this took up more and more of my thoughts.  I was still feeling the loss itself, and some odd guilt as well for feeling this so selectively.  I began to notice that the gap between my overriding preoccupation with my dog’s well-being at the end of his life and my complete ignorance of the well-being of every other animal on earth said something very bad about how my moral circle of concern applied to the world outside.

The way I thought my moral circle was

My dog’s death that summer, and the terror attacks soon to come after it, made very salient in my mind the inhumanity of how I drew my moral circle.  Many of us have heard “expanding circle” arguments about morality, in which we treat the things that are close to us as valuable, and further things less valuable, until we are basically indifferent to things that are very distant or different from us.  What my dog’s death made very clear to me is that 1) my moral circle where animals were concerned had a monstrous gap between the animals I cared about and the animals I didn’t, and 2) the center of my moral circle was quite small and was only justified by my own ego.

The way I saw it after summer 2001

The way a circle of moral concern is often understood is that we simply don’t care about the things/beings outside it; we don’t wish them harm, but we also don’t actively try to improve their lives.  In my case, the momentary suffering ending in death of one creature debilitated me for some time, which is an inevitable and even healthy response for pets and family members near the center of one’s circle of concern.  The creatures outside of my moral concern, however, weren’t simply benignly outside of my attention.  I paid people to immiserate and kill them, albeit indirectly.  I enjoyed the fruits of their deaths and considered the savings from not giving them comfortable lives a bonus for my wallet.  In short, I wasn’t indifferent to them; I actively participated in their torture and destruction.  The revision of the outline of my moral circle from a slow fadeout into a sheer cliff was intellectually jarring.

And the center of my moral circle was me, just me and all my coincidental associations.  I don’t think things enter or leave my life for cosmically meaningful reasons.  My dog was adopted by my family, lived with us, and was loved by us because we happened to choose him that day when I was in fifth grade, not because he was made of clearly superior stuff and the universe especially wanted him to have a good life.  Through the accident of his association with us, in addition to having been born a domesticated dog rather than a pig, chicken, or cow, he was granted a life of gentle leisure rather than one of neglect, prolonged discomfort or constant agony.  His death would be seen as a tragedy rather than a transaction because he happened to come into contact with us.  In other words, my family and I were the center of a bizarre moral universe in which only the few animals near us had human-like moral value, and all others deserved to die to make our sandwiches tastier.  Our circle of concern wasn’t based on logical or universal criteria like the capacity to feel, consciousness, or a complex nervous system, but was transparently based on whether you happened to be lucky enough to know us.  It was a solipsistic moral circle, and as I mentioned earlier, the edges were dropoffs into moral worthlessness.

So by becoming vegetarian I convinced myself that my moral circle was something I could justify.  Now at least I wasn’t basing the moral value of animal life on its proximity to me, and deeming those animals who failed to meet that arbitrary criterion subject to slow torture and death.

I don’t completely agree with this point of view now, since the suffering of animals outside of human society is arguably worse than that of animals living in countries with decent animal welfare laws, even if they are being raised for meat.  If I had the choice between a meat meal from an actually happy cow (the ones from California regrettably aren’t) and a vegetarian meal of 100% imported and processed rainforest-grown grains, I might really need to think about my choice.  Of course, we don’t live in a country with decent animal welfare laws, so I’ve never had to resolve that conundrum.

(FYI, my last meaty meal ever was chicken soft tacos from the Del Taco on Campus Drive.)

Translationism

Here’s another of those posts where I try to slap a label on an ELT phenomenon I’ve noticed (Schmidt, 1994).

Translationism is the prioritizing of translation as a means of seeing and learning other languages.  It is built on the assumption that different languages are sets of arbitrarily-differing tokens which refer to identical basic phenomena in the real world, and therefore that learning another language is a matter of matching the tokens from the L2 to the tokens from the L1 (tokens being lexis or grammar forms).  It is more a result of slips in thinking or adherence to other ideologies than an ideology itself, but is common enough to warrant naming.  Some of the ideologies that it results from are native-speakerism (NSism) and nationalism, which displace translationism when convenient for that ideology.

Disclaimer: Clearly, this post is sort of a holdover from my time in Japan, where I saw this ideology reflected in the approaches taken by both Japanese ELT and Japanese culture in general toward other languages.  I don’t see as much of it in California and thankfully not in ESL.  (To the contrary, I see ESL teachers, unhelpfully in my view, warning students against using bilingual dictionaries.)  I have a feeling translationism is much more prevalent in EFL contexts, particularly ones in thrall to a national narrative that links the dominant ethnic group’s supposedly innate characteristics to its current culture and modes of expression.  Maybe my blogging self misses living in a place like that and always having things to be outraged by.

What follows is a breakdown of types and effects of translationism. ご覧ください (please take a look).


Discursive stowaways and human logic (dangling participles part 2)

Dangling participles are less ambiguous than style manuals would have you believe.  They are subject to the same basic rule that governs all modifiers – namely, that human readers with functioning representations of the real world will give them the most plausible interpretations and move on.  At worst, they are just like a lot of adverbials or adjective clauses in that they could conceivably refer to multiple parts of the sentence.  More often, dangling participles in common use are essentially idioms with set meanings, whether or not they share a subject with the main clause.  These are the ones you hear on the evening news – keep an ear out and you’ll catch quite a few.

I put together another survey after the last one to further investigate what may make a dangling participle seem more comprehensible or clear besides having the subject of the main clause as its subject.  Specifically, I was interested in a few things that seemed to be the most common implied subjects, and whether using these reliably made a dangling participle more comprehensible than other implied subjects.  My conclusion was not what I had expected.

If that hasn’t already put you to sleep, read on.


The simple past in simplified history

I had an interesting conversation with a fellow dog-owner, who happened to be an Indian nationalist [Edit: Apparently the term for people of this persuasion is “Hindu nationalist”, not “Indian nationalist”.  Thanks Adi Rajan], at the dog park.  My interlocutor was recounting some of the wrongs that had been visited on Hindus in India by foreign conquerors, and he described how one named Aurangzeb had a particularly bad habit of tearing down Hindu places of worship and replacing them with mosques.  As it happened I had just finished reading Atrocities again and was sort of on the same page mentally, or at least more prepared than average to hear stories of Mughal emperors sweeping armies across the subcontinent, disrupting agriculture and failing to plan for floods, and generally causing a kind of misery that has political power hundreds of years into the future.  Oh, and don’t ask me how we got on the topic.

You might be wondering why the parasol-bearer is so badly failing at his job.  Actually, what he’s holding aloft is a massive lemon meringue pie, which Mughal emperors would order baked after a successful military campaign as a show of strength.

Anyway, he mentioned one countermeasure that Hindus took during Aurangzeb’s reign to at least be pillaged on their own terms.  As was explained to us, it was (is?) normal in Hinduism to cremate bodies soon after death, so that the soul didn’t have anything in this world to cling to when it has to move on.  In the case of holy men, upon (physical) death the bodies were kept and/or preserved rather than cremated.  This was, of course, because holy men’s souls can move independently of their bodies.  Holy men’s mummified corpses from that era would presumably still be on hand if observant Hindus hadn’t taken it upon themselves to cremate them as well during Aurangzeb’s reign, to prevent them from falling into the hands of the Muslim conquerors, in a bit of proactive self-desecration.  This was, according to the man at the dog park, characteristic of Hindus, who always sought to keep their faith pure.

I got to thinking about how common this practice (let’s call it proactive saint cremation, or PSC) could really have been, as part of my usual ruminations on how in the creation of a group narrative, “a few people did it” turns into “people did it” and then “we did it collectively displaying the unique characteristics of our people”.

I realized that some semantic properties of the “simple past” (scare quotes for bad naming – it’s no more “simple” than the “simple present”) might enable this transition.  Namely, the blurriness of the simple past with respect to whether it refers to a single event or a stereotyped, repeated event facilitates the transition of historical occurrences from discrete to characteristic of a people, place, or time period.  The fact that the adverbials that serve to distinguish the simple past for single occurrences from the simple past for repeated occurrences are easily discarded is significant as well, as is the ease of discarding other qualifiers on the subject noun which are often grammatically inessential.

For example, let’s say this is a historically justifiable statement:

Ruling Muslims from the upper class ordered Hindu monuments destroyed in 1699.

(I’m not saying that this sentence is true – just using it as an example)

With the adverbial prepositional phrase removed, it is easily interpretable as referring to a repeated action.

Ruling Muslims from the upper class ordered Hindu monuments destroyed.

And with all the grammatically inessential (i.e., non-head) information removed from the subject noun phrase,

Muslims ordered Hindu monuments destroyed.

It would be plausible for someone just joining the conversation at this point to hear a blanket indictment of Muslims rather than a description of a particular historical event.

Now, part of what makes this possible is the particular grammatical feature of English that the same verb form, the badly-named simple past, works both as a past version of the simple present (i.e., it paints the subject with a stereotyped action occurring at no particular time, like “dogs bark”) and as a reference to a single action taking place at a specific time (which the simple present does as well, but less often – see “he shoots, he scores” or “I arrive at 6 PM”).  Of course, if you want to be very specific about the fact that an action was repeated, you could use alternatives like “Hindus used to burn their dead” or “Holy men would be preserved instead”, but the simple past in the absence of qualifying adverbials leaves either interpretation open, and therefore makes extension of historical events from single and limited to common and characteristic very tempting.

Also driving this, of course, is the omnipresent impulse to narrativize one’s national history and define one’s or someone else’s ethnic group with characteristics that are “proven” with reference to stories like the above.  In fact, my inkling is that any ambiguity in descriptions of historical events will always be used to simplify them for inclusion in one country or another’s national story.  In Japanese, it is the lack of plurals for nouns, allowing “a Japanese apologized to comfort women” to become “the Japanese apologized to comfort women” with no change in wording.  I assume other languages have similar ambiguities that can ease the transition from events that happened to national triumphs or tribal enmities.  Grammatical ambiguity as in the simple past may be but one of many forms of catalyst that make historical events into parts of a story about us.

NSism and being a gadfly

My old tutor has taken the bold step of defending the veracity of the NS/NNS (native speaker/non-native speaker) dichotomy.  Why bold?  Well, NS/NNS is to ELT what male/female and white/black (or white/non-white) are to the social sciences.  That is, the implications, interpersonal and political, of the dichotomies are so reviled that their underlying reality ends up becoming toxic by association, and people end up denying the empirical existence of those categories altogether.  In such an environment, asserting that male/female, white/black, or NS/NNS are even scientifically justifiable categorization schemes puts one in the position of speaking against the vast majority of one’s peers, which seems, yes, bold.

It looks like I’m going to take the conventional position here of defending my tutor and others who take similar stances on other issues as put-upon whistleblowers who just want to stand up for an unpopular truth, but it’s quite a bit more complicated than that.  As it turns out, in some cultures (like this one), such positions frequently result in one being valorized as an intellectual martyr for pushing back against a stifling, politically correct consensus, making it tempting for a minority of commentators on any given topic to claim the “underdog truth-teller” ground and start getting invited to speak at universities just for being perceived as a rare straight arrow in topsy-turvy academia.  This is despite the fact that if 99 experts out of 100 believe one thing and the last believes something else, the one should be deemed less likely to be right than the 99, not equally or even more likely (is this related to the phenomenon wherein as the odds of winning the lottery decrease, people focus more on and overestimate more the likelihood that they will win?).  Also, the unconventional opinion being celebrated is often suspiciously close to the cherished beliefs of reactionary and conservative elements in society.  If you’re obnoxious enough in your determination to offend the progressive consensus, you might even get called a “gadfly” or “provocateur”, terms which like “curmudgeon” seem to indicate a type of backwardness that we are obligated to find charming.  The iconoclast is a role that many people would be happy to play.

I know my tutor better than that; he doesn’t endorse blanket favoritism of NS teachers based on their supposed innate qualities, nor does every evolutionary psychologist interested in biological sex think that men and women communicate differently because of gender roles in hunter-gatherer society, or every social scientist who includes race among his/her variables think that IQ might be tied to skin tone. But many people take the cultural cachet surrounding the consensus-challenger and the thread of empiricism in these categorizations and tie to it weighty bundles of prejudice, folk wisdom, and plain assumption.  That is, they use the role of the gadfly in challenging apparent ridiculousness at the far left to on-the-sly reassert regressive principles.  Under the pretense of stating clear facts, they Morse-code their chauvinistic political beliefs.  Telling someone that you believe that white and black are different races is oddly out of place unless it is taken to signify something of much more salience, analogous to how mentioning an applicant’s great hairstyle and friendly demeanor in a letter of recommendation sends a very clear message to that applicant’s detriment.  I’m not just saying that ordinary people are statistically illiterate and not strong enough critical thinkers to understand where the salience of these categories stops, and therefore that they mistake the significance of mentioning them.  The categories are the mastheads on vast memeplexes concerning “innate” differences, and accepting the categories reliably conveys the message that you endorse their usual hangers-on as well.

This is partly because people reliably can’t parse nuanced messages (like “NSs exist but NSism is wrong”), yes, but it’s also because time spent defending the categorizations rather than fighting against the various injustices that accompany them is a statement of one’s priorities.  By defending the categorizations you reliably affirm stereotypes associated with them by 1) taking part in a known code that believers in stereotypes use, and 2) signalling by omission that the categories have greater importance to you than the injustices that follow them.

Now to cut my tutor a break again, the NS/NNS distinction is much more to the heart of our field of study than gender or racial differences are in any of the contexts you often hear them discussed.  A lot of what we actually choose to do or have our students do in the classroom absolutely depends on whether we think they can learn English the same way that we did, i.e. mostly implicitly.  This makes the NS/NNS distinction worth discussing without implications for who should be teaching them.  In my mind, the NS teacher/NNS teacher distinction is an artifact of how we assume that those teachers were themselves trained, and has hugely diminished in apparent significance since we moved from Japan to California.  On the other hand, not many news readers hearing how people of African descent are more likely to have sickle-cell anemia have any practical use for that information.  HR interviewers have much less need for information on gender differences (in any skill) than they might think.  If you hear the subject of innate differences between two types of people mentioned in most common conversational contexts, you are probably hearing a real-time misapplication of mental resources.  This extends, by the way, to discussions of NS and NNS teachers.

If the categories are as durable as I and many others think, they will survive a period of neglect while we focus on addressing the many problems the hangers-on of these categories have produced.  I don’t even see a need here to define what I mean by affirming their empirical existence; presumably anyone reading this and my other posts on race (try “Search”) knows I think part of the socially constructed meanings of these categories is that they are supposed to be objectively real (and therefore very important and unchangeable), but that doesn’t mean that their real-world effects are limited to “hurt feelings” – or that no part of them is objectively real.  Merely mentioning them in this post is sure to invite insinuations that I think they mean more than I do.  I just mean to say that if “innate differences” really are innate, they will stick around and continue to have whatever effects they do even if we don’t take bold stands defending them.