On Tyranny in the ESL Classroom: 20 Lessons from 20th Century Pedagogy

(Pace Timothy Snyder – originally this post was going to be “Democratic Backsliding in the ELT Classroom”, but I haven’t actually read the relevant materials for that.  The point is the same, though – a series of semi-political tips for not letting classes or institutions slide into transmissive dictatorships.  The usual caveat applies:  I certainly don’t apply as many of these rules as I’d like, and in fact wrote this partly as a warning to myself.)

Do not obey in advance

Let’s assume your students have shown a pattern of reluctance to choose input for themselves or engage in self-directed learning, which is common in language classrooms around the world.  Do not assume that this pattern will continue forever, and do not change your teaching methods in anticipation of this reluctance even before it happens.  Do not treat your students as unready for communicative or other modern methods simply because previous classes may have been.

Defend institutions

Defend modern ELT in principle.  Many classes slide into teacher-domination because expedience seems to demand it – because teachers accept the unilateral authority that the forces of student expectation and curricular deadlines seem to require.  Temporary suspensions of student-centeredness in favor of transmission-style teaching should be resisted, not just because they do not work, but because they encourage the view that researched and rigorous concepts such as interlanguage are inconveniences standing in the way of truly efficient impartation of knowledge.  In reality, of course, that efficiency is more a path toward perfunctory teacherly role-playing than toward learners’ mastery of English.

Beware the one-party state

Many classroom dictatorships arise not because a teacher arrogates power but because his/her pupils choose to cede it when given the option.  Do not take opportunities that students give you to take full control of the classroom, and do not use your authority as a teacher to consolidate attention and legitimate authority around yourself.

Take responsibility for the face of the world

The appearance of the classroom should not reflect the will of a single person.  The only writing on the whiteboard should not be the teacher’s, the only printed text used should not be from the teacher, and the only voice heard should not be the teacher’s.  Classrooms should physically manifest the priority given to students’, not teachers’, expression.

Remember professional ethics

Oftentimes, a teacher-centered class emerges because students feel pressure to play the part of the student as they understand it.  This part, which is often defined by passive receptivity and obedience, is not simply unconscious habit – students may see it as an affirmative moral value in itself.  That is, the job of the teacher may not be just to present a more interesting alternative to silent absorption of information, but also to actively discourage students’ preconceived ideas of “how to be a student”.  Students have their own professional ethics of classroom conduct, and teachers would do well to acknowledge their existence.

(Yes, this is the opposite of Timothy Snyder’s point on this subject.  Bear with me.)

Be wary of paramilitaries

Clusters of students that are apparently sympathetic to the communicative, egalitarian, task-based curriculum that the teacher is trying to effect may appear and begin to dominate classroom activities.  The existence of these seeming allies among the student population is welcome to a degree, but can begin to create a hostile environment for students who are reluctant to engage to the same degree for reasons of identity or ability.  Remember that the job of the teacher is not to give more advantage to students who are already advantaged because of a higher starting point or previous experience with modern ELT classes, or to signal a preference for those students.  The creation of a privileged minority of students within the classroom should be avoided.

Be reflective if you must be armed

For students: Being appointed, being selected, or volunteering to be group leader means that you are responsible for the maintenance of communicative norms within that group.  When you have power over your classmates, maintain norms of discourse that do not privilege particular viewpoints – yours especially – or consist only of participation by students who are already fluent speakers.  Some students will take the reduced numbers of eyes on them when working in a small group as an invitation to dominate the conversation or to shrink back into individual study.  As the local authority, your job is to prevent either of these from happening.

Stand out

Taking a modern, communicative approach may distinguish you from your colleagues in ways that are mutually uncomfortable.  You may feel that you are passing judgment on your colleagues’ or institution’s way of doing things by breaking from it.  Indeed, some teaching milieux may have norms so deeply established for so long that trying something new is seen as synonymous with questioning everyone else’s competence.  Be open about trying new techniques and approaches and be honest about their success or failure.  Be prepared to justify them with reference to research.  Above all, be honest about why you teach the way you do, and do not acquiesce to unjustifiable pedagogical norms no matter how many people with pages-long CVs are pushing them.

Be kind to our language

Do not adopt buzzwords needlessly, and certainly do not use them without understanding them.  “Learning styles” were a litmus test for being a modern teacher for 15 years or so, during which many teachers described their classes and students with the vocabulary of what turned out to be a false theory of educational psychology.  Many still use the terminology of “learning styles”, describing an activity as “ideal for kinesthetic learners” when they could just as easily call it “less boring than sitting still”.  By adopting this terminology, teachers have appeared to endorse a theory which was debunked.

Believe in truth

In some teaching contexts, a long career is treated as a substitute for reflected-upon experience, and confidence in one’s methods as equivalent to knowledge of their efficacy.  Foreign language pedagogy is a field with a long history and plenty of research.  This body of research is mature enough to offer at least some tentative answers to long-standing questions in our field, such as how central formal grammar should be in classes and how much of a difference input makes.  Accessing the current state of knowledge on questions like these and, more importantly, believing that those questions have answers that can’t be ignored in favor of a locally or individually long-practiced method are steps toward more effective and more justifiable pedagogy.

Investigate

That said, the answers to pedagogy’s big questions may not come in an obvious form.  Sometimes a teacher will have great success with a method or technique that appears to come from the Middle Ages.  Commit to trying to understand how different teachers have success with different class styles and the principles underlying that success.  Above all, do not accept pedagogical prescription or proscription without the application of your critical faculties.

Make eye contact and small talk

Humanity can be brought to the classroom by simple engagement with learners as people.  Some one-on-one or small group interaction with the teacher not as a fount of wisdom but just as a person, and with the learner not as a receptacle of knowledge or target of remediation but as another person, can bring much-needed humanity back to the classroom.

Practice corporeal politics

PhD researchers who don’t teach and chalk-faced teachers who don’t reflect on practice or theory are a perfect recipe for each other’s stagnation.  Take theory that comes from people who haven’t set foot in a language classroom in years with a grain of salt.  You cannot realize good pedagogical theory without contact with learners.  I mean this in two ways – your theory will be useless if it doesn’t survive contact with actual people, and putting your theory into practice with your own students ensures that at least some people will benefit from it.

Establish a private life

You do not need to share as much with your learners as they share with you.  There is a happy medium between sterile professionalism in the classroom and complete shedding of boundaries.  Affective factors certainly do affect achievement, and that entails at least some rapport and sense of community beyond a shared interest in skillbuilding.  However, oversharing runs the risk of reducing the teacher to merely an affective variable and not an expert in either the subject or how to teach it.

Contribute to good causes

A local, institutional professional culture may fall short of maintaining pedagogical standards.  Sometimes, a national or international group, formal or informal, may function better as a community of practice for a teacher hoping to grow and keep up with current wisdom.  In any case, join (i.e., send money), attend, and especially present.  If a group of which you are a member is failing to provide something of value, you should provide it instead.

Learn from peers in other countries

ELT and especially SLA are worldwide fields, and different cultures, countries, and institutions around the world often practice radically different pedagogy.  Staying in one milieu for too long threatens to particularize your skillset; working in many countries or at least communicating with fellow teachers and learners in other countries exposes you to different sorts of problems to be solved and ways of solving them.  A frequent stumbling block in your milieu may have an extremely commonsense solution elsewhere in the world – and you may be surprised by the depth of thought that goes into an issue you thought only had one answer.

Listen for dangerous words

Pedagogy can be circumscribed a bit too cleanly by the words used to describe it.  “Syllabus”, “material”, “instruction”, “grammar”, “participation”, “master” and even “know” are all words that language teachers have good reason to take with several grains of salt.  If you hear these words being used as if their meanings were obvious, and especially if they are being used with obviously mistaken meanings, don’t be afraid to ask, “what do you mean?”  Often, the most useful discussions with colleagues and students occur over supposedly commonsense terms.

Be calm when the unthinkable arrives

Emergencies and exceptions are dangerous times.  The last day before the test might seem like a time when the norms of student-centeredness might best be suspended in favor of teacher-led review sessions.  This might even be presented as the only responsible option.  Of course, if teacher-centeredness is the most responsible path right before an exam, another exam will come soon, and the exceptional circumstance might be stretched a bit longer.  In fact, every lesson contains something of vital importance which seems to deserve priority over the luxuries of free student participation and self-directed learning.  There are always circumstances that would seem to make every class session a temporary exception or an emergency and cause the teacher to resort to a more “efficient” method.  Be very suspicious of exhortations or injunctions justified by the supposedly unique circumstances of the present class period.

Be a patriot

Be a teacher, not a deliverer or keeper of information.  You can take for granted that you know the subject matter better than your students.  Knowing the metalanguage around your subject matter, including serious-sounding terms like “adjective clause”, makes it easier for you to convince other native speakers that you really earn your paycheck, but of course you will never catch up to Google search in your grammar knowledge.  Your job is bringing other people to a more complete understanding (see “dangerous words”) of the subject matter, not just knowing it yourself, and certainly not impressing your students with how much more you know than they do.

Be as courageous as you can

If none of us is prepared to work for our betterment, then all of us will labor under mediocrity.


My Knowing You Has Moral Value of Life and Death

I’ve been pushed for the first time since the early weeks of this blog to comment on vegetarianism, thanks to a thoughtful post by Wandering ELT.

Like most of the strongly held opinions that made up my identity when I was in college, such as the utility of taxing custom rims or the superiority of Megadeth over Metallica, vegetarianism has turned from ideology into mere habit.  It still exists like an old UCI sweatshirt as a vestige of the intellectual life I used to have.  I still practice it (and still listen to Megadeth) more because it’s what I did yesterday than because I am a consistently, mindfully moral person.  Obviously, no completely moral person can listen to Megadeth as much as I do.

Most of this post will have the odor of long-dormant dogma reactivated.  If I begin to sound too strident, just be glad you know me now and not when I was 22.

The moral crisis that fomented the change in my life from meat eating to not began with the death of one of my dogs in the summer of 2001.  I felt suddenly aware of how much his admittedly simple life had meant to me, and how distressing it was to think of his last moments of suffering.  I suppose almost any pet owner in the same situation feels the same things.  This time, for some reason, I was also very aware of just how few beings in the world would be capable of drawing this kind of reaction, and as weeks went on afterward, this took up more and more of my thoughts.  I was still feeling the loss itself, and some odd guilt as well for feeling this so selectively.  I began to notice that the gap between my overriding preoccupation with my dog’s well-being at the end of his life and my complete ignorance of the well-being of every other animal on earth said something very bad about how my moral circle of concern applied to the world outside.

The way I thought my moral circle was

My dog’s death that summer, and the terror attacks soon to come after it, made very salient in my mind the inhumanity of how I drew my moral circle.  Many of us have heard “expanding circle” arguments about morality, in which we treat the things that are close to us as valuable, and further things less valuable, until we are basically indifferent to things that are very distant or different from us.  What my dog’s death made very clear to me is that 1) my moral circle where animals were concerned had a monstrous gap between the animals I cared about and the animals I didn’t, and 2) the center of my moral circle was quite small and was only justified by my own ego.

The way I saw it after summer 2001

The way a circle of moral concern is often understood is that we simply don’t care about the things/beings outside it; we don’t wish them harm, but we also don’t actively try to improve their lives.  In my case, the momentary suffering ending in death of one creature debilitated me for some time, which is an inevitable and even healthy response for pets and family members near the center of one’s circle of concern.  The creatures outside of my moral concern, however, weren’t simply benignly outside of my attention.  I paid people to immiserate and kill them, albeit indirectly.  I enjoyed the fruits of their deaths and considered the savings from not giving them comfortable lives a bonus for my wallet.  In short, I wasn’t indifferent to them; I actively participated in their torture and destruction.  The revision of the outline of my moral circle from a slow fadeout into a sheer cliff was intellectually jarring.

And the center of my moral circle was me, just me and all my coincidental associations.  I don’t think things enter or leave my life for cosmically meaningful reasons.  My dog was adopted by my family, lived with us, and was loved by us because we happened to choose him that day when I was in fifth grade, not because he was made of clearly superior stuff and the universe especially wanted him to have a good life.  Through the accident of his association with us, in addition to having been born a domesticated dog rather than a pig, chicken, or cow, he was granted a life of gentle leisure rather than one of neglect, prolonged discomfort or constant agony.  His death would be seen as a tragedy rather than a transaction because he happened to come into contact with us.  In other words, my family and I were the center of a bizarre moral universe in which only the few animals near us had human-like moral value, and all others deserved to die to make our sandwiches tastier.  Our circle of concern wasn’t based on logical or universal criteria like the capacity to feel, consciousness, or a complex nervous system, but was transparently based on whether you happened to be lucky enough to know us.  It was a solipsistic moral circle, and as I mentioned earlier, the edges were dropoffs into moral worthlessness.

So by becoming vegetarian I convinced myself that my moral circle was something I could justify.  Now at least I wasn’t basing the moral value of animal life on its proximity to me, and deeming those animals who failed to meet that arbitrary criterion subject to slow torture and death.

I don’t completely agree with this point of view now, since the suffering of animals outside of human society is arguably worse than that of animals living in countries with decent animal welfare laws, even if they are being raised for meat.  If I had the choice between a meat meal from an actually happy cow (the ones from California regrettably aren’t) and a vegetarian meal of 100% imported and processed rainforest-grown grains, I might really need to think about my choice.  Of course, we don’t live in a country with decent animal welfare laws, so I’ve never had to resolve that conundrum.

(FYI, my last meaty meal ever was chicken soft tacos from the Del Taco on Campus Drive.)

Translationism

Here’s another of those posts where I try to slap a label on an ELT phenomenon I’ve noticed (Schmidt, 1994).

Translationism is the prioritizing of translation as a means of seeing and learning other languages.  It is built on the assumption that different languages are sets of arbitrarily-differing tokens which refer to identical basic phenomena in the real world, and therefore that learning another language is a matter of matching the tokens from the L2 to the tokens from the L1 (tokens being lexis or grammar forms).  It is more a result of slips in thinking or adherence to other ideologies than an ideology itself, but is common enough to warrant naming.  Some of the ideologies it results from are native-speakerism (NSism) and nationalism, each of which displaces translationism when doing so is convenient.

Disclaimer: Clearly, this post is sort of a holdover from my time in Japan, where I saw this ideology reflected in the approaches taken by both Japanese ELT and Japanese culture in general toward other languages.  I don’t see as much of it in California and thankfully not in ESL.  (To the contrary, I see ESL teachers, unhelpfully in my view, warning students against using bilingual dictionaries.)  I have a feeling translationism is much more prevalent in EFL contexts, particularly ones in thrall to a national narrative that links the dominant ethnic group’s supposedly innate characteristics to its current culture and modes of expression.  Maybe my blogging self misses living in a place like that and always having things to be outraged by.

What follows is a breakdown of types and effects of translationism.  Please take a look.


Discursive stowaways and human logic (dangling participles part 2)

Dangling participles are less ambiguous than style manuals would have you believe.  They are subject to the same basic rule that governs all modifiers – namely, that human readers with functioning representations of the real world will give them the most plausible interpretations and move on.  At worst, they are just like a lot of adverbials or adjective clauses in that they could conceivably refer to multiple parts of the sentence.  More often, dangling participles in common use are essentially idioms with set meanings, whether or not they share a subject with the main clause.  These are the ones you hear on the evening news – keep an ear out and you’ll catch quite a few.

I put together another survey after the last one to further investigate what may make a dangling participle seem more comprehensible or clear besides having the subject of the main clause as its subject.  Specifically, I was interested in a few things that seemed to be the most common implied subjects, and whether using these reliably made a dangling participle more comprehensible than other implied subjects.  My conclusion was not what I had expected.

If that hasn’t already put you to sleep, read on.


The simple past in simplified history

I had an interesting conversation with a fellow dog-owner, who happened to be an Indian nationalist [Edit: Apparently the term for people of this persuasion is “Hindu nationalist”, not “Indian nationalist”.  Thanks Adi Rajan], at the dog park.  My interlocutor was recounting some of the wrongs that had been visited on Hindus in India by foreign conquerors, and he described how one named Aurangzeb had a particularly bad habit of tearing down Hindu places of worship and replacing them with mosques.  As it happened I had just finished reading Atrocities again and was sort of on the same page mentally, or at least more prepared than average to hear stories of Mughal emperors sweeping armies across the subcontinent, disrupting agriculture and failing to plan for floods, and generally causing a kind of misery that has political power hundreds of years into the future.  Oh, and don’t ask me how we got on the topic.

You might be wondering why the parasol-bearer is so badly failing at his job.  Actually, what he’s holding aloft is a massive lemon meringue pie, which Mughal emperors would order baked after a successful military campaign as a show of strength.

Anyway, he mentioned one countermeasure that Hindus took during Aurangzeb’s reign to at least be pillaged on their own terms.  As was explained to us, it was (is?) normal in Hinduism to cremate bodies soon after death, so that the soul didn’t have anything in this world to cling to when it has to move on.  In the case of holy men, upon (physical) death the bodies were kept and/or preserved rather than cremated.  This was, of course, because holy men’s souls can move independently of their bodies.  Holy men’s mummified corpses from that era would presumably still be on hand if observant Hindus hadn’t taken it upon themselves to cremate them as well during Aurangzeb’s reign, to prevent them from falling into the hands of the Muslim conquerors, in a bit of proactive self-desecration.  This was, according to the man at the dog park, characteristic of Hindus, who always sought to keep their faith pure.

I got to thinking about how common this practice (let’s call it proactive saint cremation, or PSC) could really have been, as part of my usual ruminations on how in the creation of a group narrative, “a few people did it” turns into “people did it” and then “we did it collectively displaying the unique characteristics of our people”.

I realized that some semantic properties of the “simple past” (scare quotes for bad naming – it’s no more “simple” than the “simple present”) might enable this transition.  Namely, the blurriness of the simple past with respect to whether it refers to a single event or a stereotyped, repeated event facilitates the transition of historical occurrences from discrete to characteristic of a people, place, or time period.  The fact that the adverbials that serve to distinguish the simple past for single occurrences from the simple past for repeated occurrences are easily discarded is significant as well, as is the ease with which other, often grammatically inessential, qualifiers on the subject noun are dropped.

For example, let’s say this is a historically justifiable statement:

Ruling Muslims from the upper class ordered Hindu monuments destroyed in 1699.

(I’m not saying that this sentence is true – just using it as an example)

With the adverbial prepositional phrase removed, it is easily interpretable as referring to a repeated action.

Ruling Muslims from the upper class ordered Hindu monuments destroyed.

And with all the grammatically inessential (i.e., non-head) information removed from the subject noun phrase,

Muslims ordered Hindu monuments destroyed.

It would be plausible for someone just joining the conversation at this point to hear a blanket indictment of Muslims rather than a description of a particular historical event.

Now, part of what makes this possible is the particular grammatical feature of English that the same verb form, the badly-named simple past, works both as a past version of the simple present (i.e., it paints the subject with a stereotyped action occurring at no particular time, like “dogs bark”) and as a reference to a single action taking place at a specific time (which the simple present does as well, but less often – see “he shoots, he scores” or “I arrive at 6 PM”).  Of course, if you want to be very specific about the fact that an action was repeated, you could use alternatives like “Hindus used to burn their dead” or “Holy men would be preserved instead”, but the simple past in the absence of qualifying adverbials leaves either interpretation open, and therefore makes extension of historical events from single and limited to common and characteristic very tempting.

Also driving this, of course, is the omnipresent impulse to narrativize one’s national history and define one’s or someone else’s ethnic group with characteristics that are “proven” with reference to stories like the above.  In fact, my inkling is that any ambiguity in descriptions of historical events will always be used to simplify them for inclusion in one country or another’s national story.  In Japanese, it is the lack of plurals for nouns, allowing “a Japanese apologized to comfort women” to become “the Japanese apologized to comfort women” with no change in wording.  I assume other languages have similar ambiguities that can ease the transition from events that happened to national triumphs or tribal enmities.  Grammatical ambiguity as in the simple past may be but one of many forms of catalyst that make historical events into parts of a story about us.

NSism and being a gadfly

My old tutor has taken the bold step of defending the validity of the NS/NNS (native speaker/non-native speaker) dichotomy.  Why bold?  Well, NS/NNS is to ELT what male/female and white/black (or white/non-white) are to the social sciences.  That is, the implications, interpersonal and political, of the dichotomies are so reviled that their underlying reality ends up becoming toxic by association, and people end up denying the empirical existence of those categories altogether.  In such an environment, asserting that male/female, white/black, or NS/NNS are even scientifically justifiable categorization schemes puts one in the position of speaking against the vast majority of one’s peers, which seems, yes, bold.

It looks like I’m going to take the conventional position here of defending my tutor and others who take similar stances on other issues as put-upon whistleblowers who just want to stand up for an unpopular truth, but it’s quite a bit more complicated than that.  As it turns out, in some cultures (like this one), such positions frequently result in one being valorized as an intellectual martyr for pushing back against a stifling, politically correct consensus, making it tempting for a minority of commentators on any given topic to claim the “underdog truth-teller” ground and start getting invited to speak at universities just for being perceived as a rare straight arrow in topsy-turvy academia.  This is despite the fact that if 99 experts out of 100 believe one thing and the last believes something else, the one should be deemed less likely to be right than the 99, not equally or even more likely (is this related to the phenomenon wherein, as the odds of winning the lottery decrease, people focus more on, and overestimate more strongly, the likelihood that they will win?).  Also, the unconventional opinion being celebrated is often suspiciously close to the cherished beliefs of reactionary and conservative elements in society.  If you’re obnoxious enough in your determination to offend the progressive consensus, you might even get called a “gadfly” or “provocateur”, terms which, like “curmudgeon”, seem to indicate a type of backwardness that we are obligated to find charming.  The iconoclast is a role that many people would be happy to play.

I know my tutor better than that; he doesn’t endorse blanket favoritism of NS teachers based on their supposed innate qualities, nor does every evolutionary psychologist interested in biological sex think that men and women communicate differently because of gender roles in hunter-gatherer society, or every social scientist who includes race among his/her variables think that IQ might be tied to skin tone. But many people take the cultural cachet surrounding the consensus-challenger and the thread of empiricism in these categorizations and tie to it weighty bundles of prejudice, folk wisdom, and plain assumption.  That is, they use the role of the gadfly in challenging apparent ridiculousness at the far left to on-the-sly reassert regressive principles.  Under the pretense of stating clear facts, they Morse-code their chauvinistic political beliefs.  Telling someone that you believe that white and black are different races is oddly out of place unless it is taken to signify something of much more salience, analogous to how mentioning an applicant’s great hairstyle and friendly demeanor in a letter of recommendation sends a very clear message to that applicant’s detriment.  I’m not just saying that ordinary people are statistically illiterate and not strong enough critical thinkers to understand where the salience of these categories stops, and therefore that they mistake the significance of mentioning them.  The categories are the mastheads on vast memeplexes concerning “innate” differences, and accepting the categories reliably conveys the message that you endorse their usual hangers-on as well.

This is partly because people reliably can’t parse nuanced messages (like “NSs exist but NSism is wrong”), yes, but it’s also because time spent defending the categorizations rather than fighting against the various injustices that accompany them is a statement of one’s priorities.  By defending the categorizations you reliably affirm stereotypes associated with them by 1) taking part in a known code that believers in stereotypes use, and 2) signalling by omission that the categories have greater importance to you than the injustices that follow them.

Now to cut my tutor a break again, the NS/NNS distinction is much more to the heart of our field of study than gender or racial differences are in any of the contexts you often hear them discussed.  A lot of what we actually choose to do or have our students do in the classroom absolutely depends on whether we think they can learn English the same way that we did, i.e. mostly implicitly.  This makes the NS/NNS distinction worth discussing without implications for who should be teaching them.  In my mind, the NS teacher/NNS teacher distinction is an artifact of how we assume that those teachers were themselves trained, and has hugely diminished in apparent significance since we moved from Japan to California.  On the other hand, not many news readers hearing how people of African descent are more likely to have sickle-cell anemia have any practical use for that information.  HR interviewers have much less need for information on gender differences (in any skill) than they might think.  If you hear the subject of innate differences between two types of people mentioned in most common conversational contexts, you are probably hearing a real-time misapplication of mental resources.  This extends, by the way, to discussions of NS and NNS teachers.

If the categories are as durable as I and many others think, they will survive a period of neglect while we focus on addressing the many problems the hangers-on of these categories have produced.  I don’t even see a need here to define what I mean by affirming their empirical existence; presumably anyone reading this and my other posts on race (try “Search”) knows I think part of the socially constructed meanings of these categories is that they are supposed to be objectively real (and therefore very important and unchangeable), but that doesn’t mean that their real-world effects are limited to “hurt feelings” – or that no part of them is objectively real.  Merely mentioning them in this post is sure to invite insinuations that I think they mean more than I do.  I just mean to say that if “innate differences” really are innate, they will stick around and continue to have whatever effects they do even if we don’t take bold stands defending them.

Average vs. cumulative outrage

There’s a ceiling on how much outrage I can feel at any given moment, much like there’s a limit on how much I will consider paying for a set of dishes, even if that set contains 10,000 bone china plates.

Over the past week I’ve seen, as you have, a string of successive and increasingly shocking affronts to human decency from the President and his advisors, which should have added up to at least a few instances of my head literally hitting the ceiling.

The thing is, outrage doesn’t seem to add up in this way, and rather than each new bit of news causing me to hit the ceiling, it has simply been added to the lively simmering crock pot of intense disappointment I’ve had in my head for the last few weeks.  I get the feeling most of my liberal friends feel somewhat like this, and are just motivated enough to tweet, complain on Facebook, or maybe send the ACLU some money via PayPal.  I’m not criticizing this, just remarking that the emotional state of many liberals is less:

[Image: Pantera, Vulgar Display of Power]

and more:

[Image]

This isn’t a political post at all, actually.  It’s just something that I noticed about the way I feel about Trump and the news that might point to something interesting about how people think in general.

People, it shouldn’t need to be pointed out, are poor intuitive statisticians.  This much is obvious, and provable by trying to explain any statistic to anyone, ever.  As a species we seem to have a counting module that can think about quantifiable things only as “one” “some” or “a lot”.  What is interesting is that this tendency applies to things that don’t even seem quantifiable, like feelings of outrage or indignation.  I can be outraged at one thing, outraged at some things, or outraged in general, but my subjective experience during the last of these isn’t all that much more intense than the first.  I have an upper boundary on how much of any particular emotion I can feel, and more input that would push me in that direction simply escapes and is lost as outrage radiation.

On the other hand, any countervailing information that I get cancels out far more outrage than it should.  If I hear that Trump’s son might have a disability, or I see people making fun of Melania for speaking learner’s English, I can pretty quickly forget about the last 5 terrible things that Trump himself did.  The bad things he does and the good (or at least not bad) things about him are both quantified in my brain as “some stuff” and weighted surprisingly equally.  Whenever I am made to recall at least one redeeming thing about him, my outrage drops down from its ceiling to “some outrage”, until some fresh news item (or just remembering the last one) pushes it back up.

It’s as if my outrage is averaged out rather than summed.  Rather than adding up travesty after travesty to get to 10,000 travesty points before subtracting 100 because he seems to love his children, all the travesties are normalized to within a narrow range and then have positive things I know about him subtracted.

The principle in action here seems to be a variation on the principle outlined (as many great principles are) in Daniel Kahneman’s Thinking, Fast and Slow, that the value of any given set is usually thought of as the average of its components rather than their sum.  That is, a set of 10 like-new dishes is priced higher than a set of 12 like-new dishes and 3 broken ones.  The broken ones seem to taint the set as a whole, bringing down the value of the entire package, although the package still contains at least as many like-new dishes as the alternative.  Ergo, as long as any number of things are conceived as a set rather than taken individually, their value is likely to be considered as a mean rather than a sum.  I’m surprised and a bit disappointed (in addition to fearful of what this could mean for how we think of Trump during his administration) in light of the fact that this principle seems to apply just as well to our feelings about a person’s set of actions as to our valuation of sets of dishes.

(I had a class in which I demoed this principle improvisationally on pieces of paper, handing each student a different description of a set of dishes and asking them to price it.  The principle was proven in real time, to the surprise of several managerial types in attendance.)
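For anyone who likes their cognitive biases in code form, here is a minimal sketch of that pricing demo, with invented numbers and an invented scoring scheme (none of this is from Kahneman’s actual materials); it only shows how judging a set by its mean quality flips the ranking that judging by the sum would give:

# Toy model of average-based vs. sum-based valuation of the dish sets above.
# Scores and prices are made up purely for illustration.

def perceived_value(dishes, full_set_price=100):
    """Average-based valuation: the whole set is judged by its mean quality."""
    return (sum(dishes) / len(dishes)) * full_set_price

def summed_value(dishes, price_per_good_dish=10):
    """Sum-based valuation: only the count of usable dishes matters."""
    return sum(dishes) * price_per_good_dish

set_a = [1.0] * 10               # 10 like-new dishes
set_b = [1.0] * 12 + [0.0] * 3   # 12 like-new dishes plus 3 broken ones

print(perceived_value(set_a), perceived_value(set_b))  # 100.0 80.0  -> set A "feels" worth more
print(summed_value(set_a), summed_value(set_b))        # 100.0 120.0 -> set B has more usable dishes

The numbers are arbitrary; the only point is that the broken dishes drag down the average, and therefore the perceived value, even though set B strictly contains more good dishes.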

Things language teachers know #1 – impermanence

A truth you’re exposed to pretty early on in your training in SLA is that correctness is a matter of making your utterances target-like rather than meeting some objective standard.  This is because those who insist on a purer, correcter version of their language that nobody happens to speak are, well, incorrect – the rules of languages are defined by the people who use them.  Target-like means similar to the community that you want to be a part of, and if that community changes its mind, then what is “correct” changes too.  We EFL/ESL teachers help our students to be members of new language communities, not learn objective facts.  What appears objective and true about the rules of English only lasts as long as English speakers’ belief in or maintenance of them.  That’s right, languages and Tinkerbell have something rather crucial in common.

I understand the need to feel like you’re part of something permanent, or are playing by rules that are not just someone else’s opinion or the product of consensus.  If it bothers you, it’s best not to think about it, but impermanence is the law to which all the things we experience are temporary exceptions.  The groups at various levels of abstraction that you consider yourself part of, the morality you espouse and sometimes observe, and the language you speak all are… well, you know who said it best.

The dust and the wind are also impermanent.

Machine translation requires human-like AI

Some of you alive in the 90s might remember an episode of Star Trek: TNG that is held as an example of the philosophy of language showing up in popular culture.  In this episode, the voyagers of the Starship Enterprise arrive on a planet where the inhabitants speak a highly allegorical language, using phrases about mythic or historical figures à la “Shaka, when the walls fell” to convey messages such as “oops” or “I see your point”.  As a result of these literal translations, the Enterprise’s crew members are forced to decipher what the dense metaphors mean contextually rather than in the normal English idiom that the universal translators usually supply.  Universal translators, as you can probably guess, are supposed to work with any language on the first encounter with that language or even with the species using it, and as far as I know this is the only episode where this particular difficulty arises.


The problem is, if a universal translator can’t work with the very (infeasibly, as the article above points out) allegorical language spoken in that episode, it shouldn’t work with any language.  Even very closely related human languages use vastly different grammar and vocabulary to express greetings, thanks, obligation, and anything else under the Sol System’s sun.  To know that the verb phrase “thank you” is a show of gratitude in English (not a command, as verb phrases in isolation generally are), while an adverbial like “doumo” serves that purpose in Japanese, a universal translator would need to be a mind-reader before it was a translator, as there is no way to ferret out the fact that “doumo” and “thank you” serve the same purpose from first principles or even from the grammar of that language (which universal translators don’t always have access to; they work on every language even on the first try).  Moreover, it would need to do this mind-reading on species whose physiology it has never encountered before, meaning it would need to determine where the locus of that species’ cognition is, make intelligent predictions about how the patterns of (presumably) chemical synapse firings correlate to intentions, and map those intentions onto speech acts as they occur in real time.  The prerequisite technology for a universal translator is much larger than mere substitution and reordering of words, and approaches impossible, even by sci-fi standards.
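To make the “mere substitution and reordering of words” point concrete, here is a toy token-for-token translator; the lexicon and glosses are invented for this example and have nothing to do with how any real MT system works, but they show how word-swapping alone cannot recover the speech act behind an utterance:

# Toy token-for-token "translator": substitutes words one at a time with no
# access to context, grammar, or speaker intention.  The mini-lexicon is invented.
naive_lexicon = {
    "doumo": "very much",   # a literal-ish gloss, not the functional meaning "thanks"
    "arigatou": "grateful",
}

def token_translate(utterance):
    # Replace each token independently; unknown tokens are simply flagged.
    return " ".join(naive_lexicon.get(tok, "<" + tok + "?>") for tok in utterance.split())

print(token_translate("doumo"))  # -> "very much": the act of thanking is lost entirely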

In our world, people often discuss non-sci-fi machine translation like Google Translate as if it also were a scaling problem of existing technology, as if adding more of the same gears and cogs we already have would result in perfect language-to-language recoding.  In essence, people think the incremental improvement of current machine translation technology can save us from the years-long process of mastering new languages ourselves.  This post, with its oddly long prologue, is meant to argue that perfect machine translation would require a project of enormously grander scale than the visible inputs and outputs of textual language, and like the universal translators in Star Trek, would have a project of imposing complexity as a prerequisite, one whose implications would go far beyond mere translation.  In the case of machine translation that prerequisite is a complete human-like artificial intelligence.


Free will and vocabulary size

Criminals commit crimes of their own volition.  Society certainly plays a role in creating the conditions that produce crime, but ultimately the criminal is responsible for his or her own actions and should be held accountable as a free individual.

Words have identifiable meanings at their core.  People may use them incorrectly, or their meanings may be corrupted over time, but in order to understand them, use them, and teach them you need to know their real definitions independent of context.

I think both of these statements are wrong, but they have something interesting in common, which is a perceived need to regard certain things in isolation for some purposes when in reality they depend on other things for almost all their characteristics.  I intend here to draw a connection between what I studied in college (criminology) and some issues that haunt my current field of applied linguistics.  The connection that I will draw is one of an ultimately false atomized view of both human agents and words in language, and then I will show that realizing this atomized view is false nonetheless has very limited implications, due to institutional constraints on how we can deal with human agency and how we can teach and test vocabulary.


That started off much more formally than I intended.  I guess the issues I’m going to bring up are fairly high-minded, but keep in mind I’m used to dealing with these topics in academic writing.  I will try to address the least educated of my readers.  No, not you.  The other guy.  You know the one.
