Ancestry dot dot dot

Around junior high school, when I realized that “races” were a thing and I had one too, I started making my schoolwork Japan-themed wherever possible and ex nihilo informing my classmates that “taco”, in addition to being a receptacle for beef or chicken, meant “octopus” in Japanese.

(I wonder if the age at which you first realize your own race is a reliable shorthand for the stigmatization of the race of which you are a member…)

My classmates and teachers were nice enough not to call me out on this strange behavior. In fact, it probably would have been seen as improper if they had – after all, I was celebrating my heritage. I had Japanese ancestry, and that earned me the right to “rediscover my roots”, even in an awkward, teenage way.

(It’s funny how learning something new is framed as recovering it if you’re in a demographic thought to be born with that knowledge.)

Later, in high school, there was a club called Asian Cultural Enlightenment (ACE), which I somehow felt that I should join, although I never did. Several of my classmates in Japanese (the only Asian language elective) were members. I think I was putting a little bit of distance between me and Asian-ness, or simply taking advantage of the fact that as a stealth minority (i.e. capable of passing as white – many people assume my last name is Irish), I didn’t need to affirm any particular ethnic identity. I was fine with un-discovering my roots at this point.

Looking back, I wonder if the other members would have thought it was strange that someone with basically one toe in the pool of Asian identity would try to join an almost explicitly ethnically-based club. I also wonder how far back in my family tree I could have an Asian ancestor to legitimize an Asian identity if I had wanted to embrace one. If I merely shared with the other Asians the 99% of DNA that all humans share, would that not count as enough?

This journey down memory lane was spurred by yet another news story about cultural appropriation.



Random reflections on economics

For some time now I’ve been lucky enough to have a professor of economics as one of my private students, and helping this person put together presentations, papers, and whatnot has exposed me to a field of inquiry that is quite different from SLA.  It’s been refreshing and somewhat zen-like to see the extreme quantification of social forces and psychological phenomena and to hear the thoughts of people dedicated to that enterprise.  The following are some thoughts on what I’ve seen over the last year or so.

Quantification is not reductive

The stereotype is that economists view people’s loves and lives as “mere” numbers, which has earned economics as a field the nickname “the dismal science”.  I never got the feeling, though, that economists view quantification as taking away some quintessential human élan from the thousands or millions of people whose behavior they are analyzing.  To the contrary, it seems to be a common understanding of the field that numbers are just the only way to deal with data points that number in the millions; it would be impossible to describe something like a national gender wage gap qualitatively and still be fair to each individual.  It’s certainly not true that economists view that number as the inarguable conclusion of a research question; validity and how to test for it are problems that animate much of the literature (it seems). In short, quantification of human behavior is a necessary part of looking at data sets this large and doesn’t “reduce” people if you have an appropriately skeptical attitude toward what the numbers really mean.

Conservatives tend to place free will at the base of questions of economic justice

A basic assumption of the field which has come under question since the 1980s is that people, when presented with a field of choices, will choose correctly and consistently according to their mostly stable preferences.  It would be hard to find a bedrock principle more at odds with either modern psychology or any adult’s lived experience of other adults.

It follows from this ideology (that humans make rational choices based on stable preferences) that human choice is above reproach, that whatever people decide given a set of options is a priori proof of justice. Any attempt to “nudge” people into a better choice or to force certain choices will produce warped and economically unhealthy outcomes. If people seem to naturally separate themselves into different groups, it must reflect a natural, stable preference within those groups.  Such is the explanation often deployed to dismiss the gender pay gap as the result of women’s free will rather than any kind of injustice.

You see the basic logic at play here in many areas of public life – certain politicians seem to see no motivation for human behavior that is not economic, and no purpose for government beyond encouraging (or at least not punishing) good economic decision-making. When people, either individually or as a group, seem to display an affinity for factors other than income (e.g. family, conformity, culture, or community) when choosing a career, that choice is accounted for in their reduced income. The last thing the government should do when people make uneconomic choices is to reward them economically with nutritional assistance, hiring quotas, or tax credits.

Luckily, I am at a healthy remove from both the ideologies of free will and the prosperity gospel, and I therefore don’t think people’s choices (particularly economic choices) are self-justifying.

Glass ceilings vs. sticky floors

The glass ceiling is probably the most emblematic phenomenon from economics to make it into popular culture. Loosely defined, it is an income gap at the top of the income distribution. In practice, it is often interpreted as a man getting promoted to an upper management position over an equally hard-working woman, who unlike the man is expected to perform childcare and other domestic duties in addition to working full-time.

Of course, I don’t know many men or women in upper management of anything. I do know many men and women in jobs that pay by the hour, and many more who used to have those jobs.  Every week when I went shopping at my local MaxValu (Japanese chain supermarket), I would notice the people stocking the shelves, men and women, the cashiers, almost all women, and the mounted pictures of the store managers, all men. There are, obviously, many more people in jobs like these than in jobs like those in the last paragraph in any developed country.  But for some reason, there isn’t a metaphor in common currency to describe the observed income gap at the bottom of the income distribution.

Where it is discussed, it is called a sticky floor.  As I understand it, in economics, it is simply a parallel phenomenon to the glass ceiling, but one that concerns vastly larger numbers of people. In my mind, discussions of glass ceilings sometimes have the false-consciousness character of waitstaff on their break debating whether a 39.6% tax on the top bracket is unfairly high. Yes, it matters that Sheryl Sandberg has few peers in the Forbes 500, but it matters more and to more people that men in the bottom 10% of incomes out-earn women in the same bracket (I would include a source here, but it would reveal the identity of my student).

Because all my posts now include mandatory COCA data: the phrase “glass ceiling” occurs 465 times in the corpus, vs. 20 for “sticky floor” (only 3 of which seemed to be about economics rather than literal sticky floors).
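If you want to run the same kind of count yourself: COCA itself is queried through its web interface, but on any corpus you happen to have as plain text, a minimal sketch looks like the following (the file name is made up, and this is not COCA’s own tooling).

```python
# A minimal sketch, not the COCA interface: assumes you have some corpus
# saved locally as plain text ("my_corpus.txt" is a hypothetical name).
import re
from collections import Counter

def phrase_counts(path, phrases):
    """Count case-insensitive, whole-phrase occurrences in a text file."""
    with open(path, encoding="utf-8") as f:
        text = f.read().lower()
    return Counter({p: len(re.findall(r"\b" + re.escape(p.lower()) + r"\b", text))
                    for p in phrases})

counts = phrase_counts("my_corpus.txt", ["glass ceiling", "sticky floor"])
for phrase, n in counts.most_common():
    print(f"{phrase}: {n}")
```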

A salary scale in a company that isn’t growing

This will strike any of you who have formally learned economics before as shockingly ignorant, even if the rest of this post hasn’t. Basically, when things stop growing, it’s not as if they settle into a flat but stable equilibrium. Sometimes, growth is the very thing that was keeping the system stable.

[Graph: a worker’s salary (blue) and contribution to the company (green) over the course of a career]

This graph, drawn for me at least 2 weeks in a row by my student, shows the salary of a worker in the sort of company that hires people for life compared to that worker’s level of contribution to that company (y axes), over the career of that worker (x axis).  The salary is in blue and the level of contribution (I believe it was called “human capital”) is in green.  There are two periods where these lines are very far apart: at the beginning of the worker’s career, where he/she contributes far more than he/she takes in, and past mid-career, where he/she takes far more than he/she contributes. This graph was drawn for me mostly to explain the phenomenon of mandatory early (sometimes as low as 55) retirement ages, the rationale being that companies want to shorten the length of time that workers can draw more salary than they’re worth. It also helps explain why companies may want more and more recruits every year; it is these recruits who contribute the most to the company. As each cohort ages, larger and larger new cohorts are required to pay for the older cohorts’ increasingly opulent salaries.  This is a stable system as long as each cohort is larger than the last.

When the cohorts stop growing, it starts a chain of events that potentially results in the death of the company. First, without the contributions of new workers, the company can no longer afford the salaries of its older workers.  Older workers may take early retirement or salary reductions (and grouse mightily about today’s youth). New workers and potential recruits notice that the formerly guaranteed high late-career salary is no longer guaranteed and start to question the benefits of accepting such a low early-career salary. The company therefore has an even more difficult time finding large enough cohorts of new workers.
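Here is a toy version of that logic in code. The numbers (a 40-year career, a linear surplus curve, 3% cohort growth) are made up for illustration rather than taken from my student’s graph.

```python
# A toy version of the lifetime-wage story above. All figures here are
# illustrative assumptions: 40-year careers, a linear contribution-minus-salary
# curve, and 3% year-on-year changes in cohort size.

CAREER_YEARS = 40

def surplus(year):
    """Contribution minus salary for one worker in a given career year.
    Positive early (underpaid), negative late (overpaid); sums to zero
    over a full career."""
    return (CAREER_YEARS - 1) / 2 - year  # +19.5 in year 0, -19.5 in year 39

def company_balance(cohort_sizes):
    """Net contribution-minus-salary across everyone currently employed.
    cohort_sizes[0] is this year's recruits; cohort_sizes[-1] retires next."""
    return sum(size * surplus(year) for year, size in enumerate(cohort_sizes))

# Growing firm: each year's intake is 3% larger than the previous year's.
growing = [100 * 1.03 ** (-year) for year in range(CAREER_YEARS)]
# Flat firm: every cohort the same size.
flat = [100.0] * CAREER_YEARS
# Shrinking firm: intake has been falling 3% a year.
shrinking = [100 * 1.03 ** year for year in range(CAREER_YEARS)]

for name, cohorts in [("growing", growing), ("flat", flat), ("shrinking", shrinking)]:
    print(f"{name:9s} cohorts: balance = {company_balance(cohorts):+10.1f}")
```

Under these assumptions the balance is comfortably positive only for the growing firm, exactly zero for the flat one, and negative as soon as intake starts shrinking, which is roughly the point at which the early-retirement offers and the grousing begin.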

Call me naïve, but I hadn’t seen this clearly before, nor had I seen the implications for national pension systems. Now that I do, I am even more glad to be in ESL rather than working for Toshiba, and I definitely hope all my students have lots of kids who all pay their Social Security taxes.

Varieties of middle C culture

Where is the dividing line between “Culture”, the kind we are obliged to respect, and “culture”, the pattern of living that distinguishes communities? Is a kettle Big C Culture if you use it to brew Earl Grey tea served with scones? Is the sound of a Harley Davidson’s engine revving just a shared reference point in a few countries? What if the main character of a TV show syndicated worldwide rides one?

In an effort to tie together somewhat thematically different chapters on “Culture” in a reading book one of my classes is using, I’ve introduced the concepts of “little c” and “Big C” culture and had the students examine the situations outlined in the chapters through that lens. If the terms are new to you, this or this are decent explanations. It’s been interesting, particularly when we’ve had a Venn diagram on the whiteboard and the opportunity for students to put their own candidates for little c or Big C culture up for discussion – for example, students consider LGBT (for some reason, they didn’t want the Q) to be Big C because the term has become well-known and, to some, emblematic of first-world liberalism. Conversely, they consider karaoke to be little c culture because, in their minds, everyone has it and no one considers it to be the legacy of any particular country.

Needless to say (for anyone who’s lived in Japan), students’ opinions about karaoke surprised me quite a bit, as karaoke is regarded in Japan to be a clear example of Japanese culture succeeding and spreading around the world, alongside sushi and anime. This has raised the question in my mind as to whether the little c/Big C dichotomy needs to be amended with consideration for the fact that different cultures have not only different artifacts and practices, but different perceptions of the importance of those artifacts and practices. What is Big C in the country that produced it may not be understood as a national symbol elsewhere, and what is unremarked upon in a country may be considered a national emblem of it elsewhere.

[Diagram: Big C and little c culture]
Adapted from here.

(For the purposes of this discussion, I am flattening and homogenizing countries and cultures.  I recognize that no symbol is truly equally and universally shared in any political, ethnic, linguistic, or cultural group.)

Below the jump are my additions to the little c/Big C scheme.


Losing my mind

What follows is a long, student-unfriendly version of a 3-paragraph paper (not an essay) on a 30-day challenge that I did with an intermediate integrated skills class.  The paper has to have an academic paragraph on the time before, the time during, and the time after the challenge.  Originally, the paragraphs had to use the past tense, present tense, and future tense (with any aspect), but I haven’t followed that rule faithfully here.

Getting lost in hectic thought was the default mode of my mind before I started my 30-day challenge.  The challenge, which was to meditate 10 minutes a day for 30 days, came at a time when my mind was almost constantly in a state of emergency.  Every thought of grading, making new assignments, or updating a class vocabulary list was a red alert in a long line of red alerts.  I would be exhausted at the end of a day of classes, but unable to take a nap without thoughts of all the papers I had to grade rushing in and beating back my attempts at rest.  As a result, I was often in a sour mood and was inclined to greet any attempts at contact from colleagues or students as yet another demand on the limited resources of my attention.  When I had a minute, or just a desperate need to pretend that I did, I spent it with value-free distractions (the App Store specializes in them), afraid to glance back at the wave of paperwork threatening to crash over me from behind.

Since I started meditating, I haven’t ceased being distracted, but I have been better able to incorporate distraction into my workflow, i.e. to be mindful of distraction.  In the interior of my mind, thoughts of work have begun to appear less like photobombing tourists in the lens of my attention, and more like part of the shot.  I have become better able to take a long view of my own time and attention and to refuse to devote my full mental resources to every problem, incomplete task, or request that jumps into frame.  What is called “mindfulness” is key to this.  While I meditate, thoughts still appear, and I still think them, but I am aware of the process, and that awareness prevents me from identifying with them completely.  I become something of an observer of my own mental life.  I see how this could be described as being “mindful”, as it does in a sense feel like an additional layer of abstraction has been placed between my stream of consciousness and the thoughts that usually occupy it, but in a sense more important to me, something is also taken away.  That thing is the formerly irresistible urge to load that thought into the chamber of my executive-function pistol and start manically squeezing the trigger.  It is also the need to build a spider’s web around each thought, connected to all my other thoughts, and claim it irrevocably as mine.  In these senses I believe “mindlessness” is just as good a term as “mindfulness” for what occurs in and as a result of meditation.  In any case, dissociation from my thoughts, most of which are proverbial red circles with white numbers in them, has helped me to control the way that I react (or not) to them.

This brief experiment with meditation has given me a good deal of perspective to take with me into future semesters.  I can now see the regular rhythm of the waves of classwork as something other than a renewed threat.  Now, they seem more like tides, dangerous if unplanned for but predictable in their rises and falls.  Importantly, I also see the high water mark and know that as long as I keep my mind somewhere dry, it will recede without doing much damage.  In the future, as long as I refrain from doing something crazy like teaching 20 units, I think I will be able to maintain calm with the help of this perspective.  Also, in a more specific sense, I will be better able to resist the call to distract myself from my work.  I can recognize the formerly irresistible need to latch onto an interesting task, and this recognition enables me to prevent YouTube or WordPress (except for right now) from hijacking monotonous tasks like grading or… well, mostly grading.  Next semester and into the future, I will feel less threatened and better able to deal with inbound masses of schoolwork.

The simple present, unsimplified

Since I started my hobby/rigorous research pursuit of conducting Google Forms surveys on grammar, I have been thinking about the big one.  The one that combines the most assumptions and nuance and the simplest form into a wad of meaning with white dwarf-like density, and which is maximally unbalanced between its complexity and the earliness and brevity with which it is treated in grammar textbooks.  The big one is, of course, the present simple.

This is going to be a long post.


Fire alarm effects in ELT

I didn’t expect such a great metaphor for the ESL/EFL classroom to come from a writer on artificial intelligence.

In his article “There’s No Fire Alarm for Artificial General Intelligence”, Eliezer Yudkowsky uses the metaphor of a fire alarm to explain situations in which people act strangely without it being a faux pas.  His version of a fire alarm is a public messaging system that would give people permission to act with what in his opinion is the correct amount of urgency in the face of dangerously advanced and amoral (at least by our standards) AI.  A fire alarm, he postulates, is not simply an indication that danger exists (the other main indication being smoke), but a signal that it is acceptable to act as if it does in front of other people.  The acceptability comes from the fact that (actual and metaphorical) fire alarms are heard by everyone, and your knowledge that others also hear it enables you to take part in behavior like descending the stairs and paying a visit to the parking lot in the middle of a workday knowing that coworkers will not hold it against you.  Like many widely-shared messages, a fire alarm turns insane solo behavior into acceptable, even encouraged, group behavior.

(I heard this for the first time on Sam Harris’s podcast.  Yudkowsky sounds exactly as you might expect someone with his job description to.  Incidentally, I have some basic disagreements with a lot of what Harris says, but still enjoy listening to his interviews.  I will be more specific in a future post.)

It’s pretty close to universal knowledge that speaking one’s L2 in front of other people is face-threatening behavior.  Consider the range of situations where reproach or shame are possible results – besides the obvious ones (sitting alone on the bus), you may be considered rude, stupid, foreign, pretentious, or just strange for suddenly bursting into French at your pâtisserie or watching Chinese soap operas on your phone.  Naturally, the number of “safe” contexts to speak your L2 increases if you move to a society where most people speak that language, but it is still not close to 100% of them – at the very least, you will mark yourself as a foreigner by “practicing” in public, and in the worst case, people can just be unbelievable assholes around 2nd language speakers.  Of course, there are learners who don’t feel threatened at all by speaking their L2, and maybe those are the same people who would immediately perform a fire drill alone at the first hint of smoke in the air.  Most people need acknowledgement that they won’t be judged negatively for trying and often failing to make themselves understood in a new code – they need a public signal that legitimizes it for everyone.  Something in the ESL/EFL classroom is necessary to transform society’s gaze from judgmental to facilitative.

This may turn out to be another black robe effect.  That is, the teacher might be the variable that turns language practice from face-threatening to the group norm.  The inverse is clearly true – teachers can definitely act in ways that discourage open practice or make students ashamed of failed attempts at communication (or worse, ashamed of imperfect grammar).  Teachers can also strengthen the norm of practicing English within the class by spelling it out explicitly and practicing it themselves.  I suspect though that a lot of the legitimization of language practice is due to the physical edifice of the classroom and the rituals one must go through to join a class – signing up, visiting the bursar’s office, carrying a bookbag, etc.  You can test this by walking out of your classroom during a task and secretly observing how much of the communication in your absence is still in English, and compare it to what happens when a waiter who shares an L1 with the cook is done taking your order.  As in the experiments that Yudkowsky cites to make his case, students’ shared understanding of what behavior is validated is essential for any of that behavior to actually take place. Whatever it is that is acting as a fire alarm in language classes, its effects depend as much on the people as on the signal.

An objection to the feasibility of simulation

I used to have this fantasy about being able to predict the future by entering all the relevant data about the real world, down to the location of each atom, into a supercomputer and letting that supercomputer simply run a simulation of the world at a rate faster than actual time.  My inner materialist loved the idea of every geological force, weather system, human brain, and therefore every manifestation of the emergent property we call a “soul” being predicted (something about my needing to take the stuffing out of humanity as a teenager), and I believed that doing this with the power of computing was eminently plausible save for our lack of complete data.  I now realize that it is impossible.  No, not because I’ve stopped being a materialist.

Any computer used to run a complete simulation of the real world must be at least as big as the system that it will be used to simulate.  That is, a complete simulation of an amoeba would require at least an amoeba-sized computer, a complete simulation of a human would require at least a human-sized computer, and a complete simulation of a planet would require a planet-sized computer, etc.  This is for a reason that is a “bit” obvious once you come to see it, as I did sometime during my undergrad years (if my memory of conversations over AOL Instant Messenger serves).  Data is instantiated in computer memory in chips as 1s and 0s, or bits, which have mathematical operations performed on them which in aggregate give rise to more complex operations, everything from blogging to Microsoft Flight Simulator.  At the moment, each of those bits needs at minimum a single atom with a charge to represent its value (the details of the bleeding edge of computer memory are quite fuzzy to me; replace “atom” with “quantum particle” in this argument as you see fit).  Any atom in a simulated universe would need great amounts of bits to represent its various properties (number of neutrons, location(s), plum pudding viscosity, etc.), and thus many atoms of real-world silicon would be a minimum to represent a single simulated atom.  Because all matter is composed of particles that would need at least that number of particles of computing hardware to simulate them, hardware must always be at least as physically big as the physical system that it simulates.  So much for running a predictive version of Grays Sports Almanac on my Windows computer.
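Spelled out as back-of-envelope arithmetic, the counting argument looks like the sketch below. The specific figures are placeholder assumptions rather than physics: 10 properties per simulated atom, 64 bits per property, and a wildly optimistic one hardware atom per bit.

```python
# The counting argument above, with illustrative (not physical) numbers.
PROPERTIES_PER_ATOM = 10      # position, momentum, nuclear makeup, etc. (assumed)
BITS_PER_PROPERTY = 64        # one double-precision value per property (assumed)
HARDWARE_ATOMS_PER_BIT = 1    # the most generous storage imaginable (assumed)

def hardware_atoms_needed(simulated_atoms):
    """Minimum hardware atoms to store the state of that many simulated atoms."""
    bits = simulated_atoms * PROPERTIES_PER_ATOM * BITS_PER_PROPERTY
    return bits * HARDWARE_ATOMS_PER_BIT

for name, n_atoms in [("a single cell (~1e14 atoms)", 1e14),
                      ("a human body (~7e27 atoms)", 7e27),
                      ("the Earth (~1e50 atoms)", 1e50)]:
    needed = hardware_atoms_needed(n_atoms)
    print(f"{name}: at least {needed:.1e} hardware atoms "
          f"({needed / n_atoms:.0f}x the thing being simulated)")
```

Even with every assumption tilted in the simulator’s favor, the hardware comes out hundreds of times larger than the thing simulated, which is the whole objection in one loop.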

But maybe not all that information is needed.  Maybe not all aspects of the system need to be accurately represented in the simulation for the result to be close – the number of neutrinos flying through the Milky Way surely can’t have that much to do with whether Leicester beats Arsenal 2-1 or 2-0. But consider that that game takes place in a universe where neutrinos definitely exist and people know and talk about them.  Some proportion of viewers, players, or advertisers are surely affected by the existence of scientific research being done in the city (Leicester and London are both home to renowned universities) where they live, even if indirectly – universities are huge employers with large real estate footprints.  Seen in the broader picture, the existence of neutrinos seems like a variable actually capable of affecting the outcome of a soccer match.  Even a single sporting event isn’t really a closed system – consider how directly they are affected by weather.  And of course the types of simulated realities that are en vogue recently thanks to Black Mirror are earth-like or at least have environments capable of fooling complete human simulacra, which means that the humans in them need referents for the things that they talked about when they were still flesh and blood – can you imagine a physicist being confined in a San Junipero happily if the rules of atomic motion are not part of the virtual world?  What would you do for fun when the 80s nostalgia wears off?

It’s an open question whether a simulated mind deserves moral consideration even if it has the subatomic workings of its nervous system simplified in order to make it run on a smartphone. The point I mean to make is just that it’s impossible to have a completely simulated anything without building a computer of at least that physical size in the real world.

Grammar Mining (and the collected Mark SLA Lexicon)

Many of us agree that teaching “at the point of need” (as I believe Meddings and Thornbury put it) is an ideal context for formal grammar teaching.  Students’ trying to communicate something provides clear evidence that they need the grammar that would facilitate communicating it, and depending on how close they come to natural expression, evidence that their internal representation of English is capable of taking on this additional piece of information.

In interlanguage punting, I conjectured that taking a guess at grammar students may need in the future and organizing a lesson around a particular grammar point was justifiable if the lessons you used to introduce that grammar would be memorable long enough for a “point of need” to be found before the lesson was forgotten.  At the time, I was teaching weekly 1-hour grammar workshops with rotating groups of students at different levels, and as I could not teach reactively I had to justify my grammar-first (formS-focused) approach.

Read on for the last post before the new semester starts.


Stuff I will miss 3: Intensely bitter grapes

I’ve been out of Japan, my home for almost all of my adult life, for a year now.  There are now some things I’d legitimately think of planning a vacation around to experience again.  Some of these are of the nostalgic flavor variety, and others are more profound.

(If you had had the power to predict back in 1987 that one of the effects of a global information network would be that everything textual had to be organized as a list, would that have helped you to make any wise investments?  What could you do with that information besides corner the market on scroll bars?)

Air conditioning.  People tend to think of Japan as an ancient country.  I disagree with the concept of “ancient countries” on philosophical and historical grounds (“ancient” being but one of many possible stances a modern person can take on the history of a current political entity), but in any case, you see no evidence of this claim living in an apartment in Japan.  It’s quite rare to find any domicile more than 30 years old, and the facilities within any given residence are bound to be either brand new or from around the same time the domicile itself was built (again, not a very long time ago).  Modularity is the philosophy of most interiors, leading to replaceable fluorescent ceiling fixtures, temporary flooring (often the rugs, tile, and the wafer-thin floor itself), and detachable wall AC/heating units.  The philosophy of housing in Japan seems similar to the philosophy of food in the US – ultrarational and at the convenience of industry.  My current residence in the US is older than any building I lived in in Japan, and its AC sounds like a fleet of Huey helicopters.  The idea that American buildings are old and sturdy and Japanese buildings are new and full of ad hoc fixes clashes with stereotype, but more importantly, sometimes slapdashness has perks in easy upgrades.  My AC in our school in Japan was practically silent in comparison.  If only the walls had been made of material capable of keeping heat in.

Unsweetened food choice architecture.  I still believe the point that I used to make, that the stereotype about everything in the US being incredibly sweet is false.  However, the sweet things here are definitely more prominently placed and almost always the first thing you notice on any given supermarket shelf.  There are croissants among the danishes and donuts, and plain yogurt next to the vanilla and strawberry yogurt, but the sweeter options are at least 2/3 of the options available and always hit the eye first.  This doesn’t technically force you to buy sugar-coated or HFCS-filled products, but it does make them harder to ignore.  Shopping here tends to nudge you towards cavities.  At least the dentists wear gloves here.

Meiwaku.  Interaction has a steep transaction cost in Japan.  Initiating random conversation, asking an off-script question to someone whose job isn’t specifically answering questions, mingling, sharing, and basically doing anything that anyone else might conceivably see all come weighted with fees.  Those fees come in the form of aversion to and fear of being called out for disturbing others (迷惑 meiwaku).  I don’t remember if meiwaku was in that book on Cultural Code Words, but it definitely should be – if there’s any single word that explains public behavior in Japan, it’s meiwaku.  Meiwaku, to me, is why people don’t usually talk on trains, get up to leave during the credits at the movies, or call out or loudly resist perverts on public transportation.  I see the threat of being accused of causing meiwaku as an additional burden that one feels in every public action, encouraging fairly extreme adherence to rules because the threat looms so large.  If it were a monetary cost, it would be a sales tax of 20% on all verbal transactions, pretty strongly disincentivizing public garrulousness.  The thing is, the revenue raised by this tax also allows for quiet and well-maintained (or at least not willfully destroyed) public spaces and a priority placed on social order, and it is something you can begin to miss in its absence.  Its near-complete lack in the US produces something also undesirable – reckless and risky behavior brought on by a lack of cost for disturbing other people – a recklessness subsidy, accompanied by a widely accepted cultural value on confidence and asserting oneself.  In public in the US, all options for public behavior are available to everyone at no cost in face or shame.  As a result, people avoid boring, on-rails conversations but are much more likely to engage in all manner of misanthropic behavior because of how highly weighted self-expression is over public order or consideration of others.

The dream of a pretense of post-racial society.  Japan’s mainstream concept of itself is explicitly racial.  Not “we hate other races”, but “Japan’s story is the story of our race”.  I’ve come to think that most countries are this way, and that countries with a legalistic, “our club is the best but anyone is welcome to join as long as they follow the rules” definition of citizenship are a distinct minority.  Now, if one squinted at US history and let the bloody details fade into the background, one could have said that this was the story Americans have always told themselves – that America is an idea that rises above race and class.  In fact, it was true until recently that even the most conservative politicians publicly espoused a legalistic rather than blood-and-soil definition of citizenship.  I worry that having had to defend this president will cause larger parts of conservative America to abandon even the rhetoric of equality.  Cognitive dissonance and all that.

I knew there were elements in the US who envied the easy worldview offered by an explicitly racial view of their nation, and sought to tie Americanness to a mythic concept of “whiteness” just as Japan’s is to “Japaneseness”. I didn’t think these people would ever have a voice in polite society or have their chosen candidate win a national election.

Of course, it seems silly to say I miss the rose-colored view of my home country that I had while I was away from it, but that is the truth.  I miss having the US as an example of a country that didn’t mythologize itself as the work of one uniquely virtuous race while I lived in one that did.

Shallow roots.  The US is unique not in its newness (its government is actually very old compared to that of most countries) but in its inability to pretend to be ancient.  Most people, when asked the age of a country like Japan, will inevitably answer in the thousands of years.  If you consider a claim like this in practical terms, that means either that the denizens of these islands have imagined themselves to be part of a Japanese polity before they even grew rice or that a series of governments, even without continuity from one to another, have been able to exercise control over these islands since the bronze age without the benefit of any communication faster than a pony (or in the earliest days, any form of writing).  Nonetheless, part of the current cultural software in some countries like Japan is a claim to continuous existence back into antiquity, made plausible by some historical facts (people really have lived here for a long time) and some guessing enabled by lack of evidence (nobody knows what these people did besides live in huts and make cord-inscribed pottery).  The US, with all of the history of its current government being part of the written record, cannot feasibly claim any of this.

Belonging to an “ancient society” weaponizes many of the arguments conservatives tend to make in every society – that our way of life has been ever thus, distinctive and characteristic of our land and people, until vulgarians and foreigners infiltrated it and began its corrosion.  Of course, you hear these arguments in the US too, but the difference is that in an “ancient society” everyone starts the discussion by ceding most of the conservatives’ points.  Yes, our way of life has existed unperturbed for millennia, but maybe it wouldn’t be so bad to give women maternity leave.  Yes, our culture is unique and distinctive, but a little labor flexibility would help our economy’s competitiveness.  Progressives need to start with an acknowledgment of what they are breaking with or they will lose most people’s sympathy.  As I said, the US has versions of these arguments, but people often have to outsource their roots (“Western” or “Judeo-Christian”, nothing on this continent) and the mists of time don’t seem to reach much further back than 1955.  A loss of national or cultural identity can be quite freeing.

Of course, this is a list of the things I miss, and as in the last entry, moving here has certainly disillusioned me about my fellow citizens’ resilience in the face of appeals from antiquity.  The president seems to come to his particular flavor of chauvinism simply by liking things the way he’s used to (white by default and free of the burden of empathy), but others, even progressives, have come to embrace the framing of conservative vs. liberal as traditional vs. non-traditional or continuity vs. interruption.  I suppose I had hoped I would be coming back to a society that saw interruption as its tradition.

Let’s end on something light…

Natto maki.  More sour beans than sour grapes.

[Photo: natto maki]
Source.  I’ll be sure to try them if I’m ever in Ho Chi Minh City.

The urinal cake principle

Imagine yourself pushing through a crowded train station during rush hour.  As you pass a certain doorway, you detect hints of lavender and hibiscus coming from within.  Do these smells evoke:

  1. just flowers, or
  2. first flowers then toilets based on your prior knowledge and experience regarding the likelihood that lavender is blooming within 100 feet of a subway platform, or
  3. just toilets?

This is the best way for me to understand the principle of dead metaphors.  A dead metaphor is the result of cutting out a semiotic middleman. The process of a metaphor dying is a powerful and inevitable one that affects culture and particularly language in some subtle ways, as I hope to illustrate in as colorful a way as I can.

The process, in dense and unfriendly language, is this: The definition of a symbol over time changes from the first thing (flowers) to the second thing indirectly through the first thing (toilets via floral-scented deodorizing discs), and finally just comes to stand for the second thing (toilets).  This can be true even if the form of the symbol does not change – e.g. if the deodorizer continues to have a floral scent.  The reference stops being indirect and just goes straight for the thing that it was always eventually getting at.

I’ve been trying to think of more real-world examples of this principle in action.  Here are a few more:

A clothing brand (say, Members Only) is associated with rich people.  Poor people start to buy that clothing brand on sale or used because it evokes rich people.  The brand comes to be associated with poor people’s desperation to look rich.  (Louis Vuitton in Japan is rumored to head off this effect by buying back their used bags and burning them to prevent them going to the secondhand market)

“Political correctness” is a recognized phenomenon in polite discourse.  Reactionaries vocally dislike it and use it as a stick with which to beat their cultural enemies.  It comes to be much more widely known for its rhetorical value in reactionary discourse (specifically, their hatred of it) than as a phenomenon within polite discourse.

A famous person with an idiosyncratic name (say, “Zoey”) appears.  People who have kids during the zenith of that person’s career name their kids after him/her.  That name comes to be associated with that generation of kids rather than the famous person.

Taboo concepts have euphemisms invented to avoid direct reference to them while still maintaining the ability to refer to them indirectly if necessary.  Subsequent generations come to think of the euphemisms as simply the names for those taboo concepts, since those are usually the only words for those concepts that they ever hear.  Those generations invent new euphemisms to replace the no-longer-thought-to-be-indirect old ones.

When I was studying Japanese before moving there, we learned 便所 benjo (“convenient place”) as “bathroom”, when practically nobody alive now still uses that word.  Sometime between the 50s and now they were お手洗い otearai (“hand-wash”) or 化粧室 keshoushitsu (“powder room”). Now they are called トイレ toire, presumably until the next generation starts calling them something like ルーム ruumu.

Hipster beards are destined to go from “modern guy” to “guy who still thinks beards are modern” to “guy who doesn’t know that nobody thinks beards are modern” and in 20 or so years “guy who affects not caring that hipster beards are not considered modern” and back to just “modern guy” again.  Believe me; it happened to skinny ties.

Most words, unless they were invented very recently, are dead metaphors or have changed meanings through some other process.  A word’s history is like the miles of junk DNA we carry around with us in our cells, only using a small portion to form proteins, transmit messages or enjoy an inside joke.  Words like “give up” (my favorite example, a clear dead metaphor), “Wednesday”, or “cranberry” have stories behind their present forms and usages that are very interesting, but also very optional.  Each living word has untold numbers of lost meanings (in addition to its likely numerous current meanings) which we don’t have and don’t need access to in order to use it.  The process by which a word’s meaning changes isn’t always the middleman-eliding of the dead metaphor, but the idea that one doesn’t need all the information about the past referents of a given token to understand it in its current context is the same.

We language teachers often pride ourselves on the elaborate stories and setups that we use to explain usage of one language item or another.  One time I attended a presentation that asked us straightforwardly, “Why do you use on for buses but in for cars?”, to which several teachers laid out the possibly-made-up-but-convincing stories that they give their students.  These stories can definitely be useful for appearing to give students stage-by-stage mastery and “permission” to use a particular language item, things I definitely wanted in my early stages of Japanese learning.  Nowadays, I tend to think of these more as a bootstrap or a foot in the door (are those dead metaphors?) than as understanding itself, more affective factors than what we would usually call L2 competence.  Naturally, the end goal of most language learning is to have a grasp of the language similar to the fluent speakers in your target community, not to have complete explicit knowledge of the target language (although many learners confuse these two – some teachers as well).  One does not need to know the origin of a word or the logic behind its present form to use it correctly any more than one needs to have smelled fresh lavender as a reference point to know what that same smell means at the train station.
