What’s Black, White, and Blue All Over: The Hierarchy of Colors in Linguistic Evolution

In recent times Boyfriend and I, along with our board-game buddies, have taken a break from Settlers of Catan and decided to give Puerto Rico a shot instead. Like any other well-designed strategy game, Puerto Rico is a lot of fun, but it also makes me profoundly uncomfortable. As the name suggests, it is set in some fictional version of Puerto Rico, and the path to victory includes establishing plantations of crops like sugar, indigo, or tobacco. You get to either sell the harvest in the (presumably) European market, or ship it off to (presumably) Europe.

The game doesn’t say so explicitly, but if you’ve ever attended a high school history class, you will find it hard to miss the fact that we’re dealing with colonialism here. Of course, if you still don’t understand why this game makes me uncomfortable, allow me to present yet another detail. The plantations that you acquire need workers, right? Well, the game provides you with these workers. They are called “colonists,” they mysteriously appear on a “colonist ship,” and the “Mayor” of Puerto Rico gets to distribute them among various players. Here’s what they look like:

Yeah. Those dark brown pegs? They’re the “colonists.” Now do you understand why this award-winning game makes me uncomfortable?

Obviously, I couldn’t help addressing this issue for the entire duration of the game. It didn’t stop me from being competitive, of course, and when I came in second to Boyfriend, who had invested heavily in indigo plantations, I may have said a thing or two about how the pursuit of indigo had essentially destroyed the world, and how I hoped he was looking forward to having all his clothes dyed blue. Boyfriend had already completed his victory dance and moved on to other things at this point, but when I mentioned blue dye, his mind raced to this place:

“Did you know that in most languages blue is one of the last colors to be named?”

I looked this up. Not only is blue the very last basic color to be given a name in most languages, but most languages also follow the same pattern when it comes to the order in which other basic colors are named. As far back as 1969, anthropologist Brent Berlin and linguist Paul Kay posited that if you could determine what stage of evolution a particular language was in, you could draw accurate conclusions about how many colors had been named in that language, and which ones they were. All languages, they claimed, had terms for black (dark/cold) and white (light/warm), because these two categories were named in Stage 1. When a language progressed to Stage 2 it had a name for red, and if it was in Stages 3 or 4, it had names for yellow or green, or both. The naming of the color blue, however, took place only when a language had reached a significantly advanced stage. As happens in academia, Berlin and Kay’s work was challenged on several grounds in the following years. But as recently as 2012, a paper published in PNAS confirmed the order of naming colors that they had originally proposed. Most languages, it appears, name basic colors in the following order:

1: Black/white, 2: Red, 3 and 4: Yellow/Green, and 5: Blue.
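Berlin and Kay’s claim is implicational: each stage presupposes all the earlier ones, so knowing how far along a language is tells you exactly which colors it must already have named. Here is a minimal Python sketch of that idea (purely my own illustration, with Stages 3 and 4 collapsed into a single wave; the names are invented, and this is not a linguistic dataset):

```python
# Berlin & Kay's hierarchy is implicational: a language that has named
# the colors of any later wave has already named every earlier one.
WAVES = [
    {"black", "white"},   # wave 1: dark/cold vs. light/warm
    {"red"},              # wave 2
    {"yellow", "green"},  # waves 3 and 4: one, then the other
    {"blue"},             # wave 5: the perennial latecomer
]

def colors_named_by(wave):
    """All basic colors a language should have named by a given wave."""
    named = set()
    for level in WAVES[:wave]:
        named |= level  # union with each earlier wave
    return named

print(sorted(colors_named_by(2)))      # ['black', 'red', 'white']
print("blue" in colors_named_by(3))    # False: blue arrives last
```

If the 2012 PNAS result holds up, most natural languages’ color vocabularies would pass a check like this one.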

There is an obvious question here: Why is this the case? There is an abundance of research showing that neither cultures nor languages are homogeneous. The overwhelming similarity in the order in which basic colors are named, however, suggests that there is a fundamental commonality in human experience that dates as far back as the beginning of language.

No one has been able to lay out with any certainty what this fundamental experiential commonality is, but that doesn’t mean people haven’t tried. British Prime Minister William Gladstone, who published a study in 1858 on the work of Homer, was particularly bothered by the great writer’s use of the term “wine-colored” or “wine-like” to describe turbulent seas, stubborn oxen, and everything that lay in between. To Gladstone, neither the sea nor oxen were even close to “wine-colored.” He concluded, therefore, that the ancient Greeks hadn’t developed the ocular distinction between various colors, and that to them, the world appeared mostly black and white with some shades of red.

This theory was later debunked, and it became clear that the question needed to be addressed from a linguistic perspective, not an optical one. In his book Through the Language Glass: Why the World Looks Different in Other Languages, Guy Deutscher briefly deals with the color-naming hierarchy, and suggests that languages are universally late in arriving at a word for blue because the color rarely appears in nature.

Occurs in nature. (Image source.)

DOES NOT occur in nature.

The hypothesis does hold a certain amount of weight. Black and white are as basic as shutting your eyes and then opening them, or being able to tell the difference between night and day. Red, similarly, occurs widely in nature: in flowers, berries, some types of soil, birds, beetles, even the sunrise. Red is also one of the easiest dyes to make naturally, while blue is among the most difficult. Furthermore, as some people have proposed, red is a color that would have been familiar to human beings since the very beginning because it is the color of blood. Even setting the hunt for meat aside, between injuries in the wild, menstruation, and childbirth, early humans probably bled quite a bit themselves. Going further down the naming hierarchy, yellow is common to flowers, fruits, and animals, while even a little scrap of shrubbery would introduce you to green. Blue, however, is found only on rare insects, rare flowers, rare birds and reptiles, and a handful of berries.

Oh, and the sky. Yes, the sky appears blue. But that’s only some of the time, isn’t it? The sky also appears purple and magenta and orange and grey. And each of these colors was among the very last to be named, following after blue. Clearly, early humans weren’t big on staring into the sky. Or maybe they were just rabidly utilitarian: if the sky didn’t interfere with their daily lives, they didn’t feel the need to address it.

There are no certain answers on the subject. Each theory has holes in it. For instance, the ground is brown for the most part, as are tree trunks and the skin of many early humans. So why didn’t the name for this color appear before red? We’ll probably never know. What we can draw conclusions about, though, is the connection between this linguistic phenomenon and the spread of European colonialism. (I know. I can’t help myself: a decade of Postcolonial Studies will do that to you. But bear with me.) Scholars like Deutscher have proposed that one of the reasons blue was among the last colors to be named is that blue dyes were very difficult to make. Now, the latter part of this claim is known to be true. Because blue was a very difficult dye to make, the color acquired luxury status across the ancient and medieval world. It became the color of royalty and of the upper classes. The ancient Egyptians dyed the cloth they used to mummify bodies blue, and Julius Caesar claimed that Celtic warriors painted their bodies blue. Because of the status this color enjoyed, indigo became something of a goldmine for European traders. It became a business that laid the foundations of one of history’s biggest colonial empires.

I know, there’s no causation here. But I didn’t promise a relationship of causality, I just promised a connection. And the connection is undeniable, don’t you think? Just like the beauty of a blue planet seen from far, far away.

Want to know more? Go here. And here.

The Thomasson: Useless, but meticulously maintained

All right, I’ll admit it. It’s intimidating to come back to my blog after a break of several months. (Why was I away? Grad school applications. Don’t ask, it was brutal.) One of the things that kept me going while I was away was weekly email updates about the statistics on my blog. “Neha! Two people favorited your blog this week!” “Neha! Seven people viewed and four people liked Stuff My Boyfriend Tells Me’s Facebook page this week!” “Neha! Eleven people printed that comic of your boyfriend rocking to Daft Punk on their newborn babies’ onesies this week!” (Okay, that last one is made up. But only a little.)

Providing irrefutable proof that folks don’t know what to do with themselves in the lazy, food-filled days between Christmas and New Year, the last week of December showed unusually high activity on my blog. I immediately pointed out said blog activity to Boyfriend, and as per usual, Boyfriend nodded sagely (if “sagely” were inflected with a hint of “That’s impressive”). He was curious about why I still got updates on my blog when I wasn’t posting on it, and I said it was because I liked keeping tabs on it. “You know, to maintain it, keep track of what’s going on, etc.”

“That’s interesting. So even if you never posted on your blog again, there would still be maintenance on it?”


“So it could be a Thomasson of the blogosphere. I wonder how many of those exist…”

“A what-now of the blogosphere?”

“A Thomasson…”

(You know the drill by now. No need to ask. Just wait for it. 3…2…1…)

“Did you know, a Japanese artist coined the term ‘Thomasson’ in the 1980s to describe architectural features of buildings that no longer serve any purpose but are maintained nonetheless?”

So… that hole in the wall isn’t a door used by the miniature family that secretly lives in the basement? This staircase with the freshly painted railing actually leads nowhere?

I have no desire to turn my blog into a Thomasson. But it is only fitting that my first post after the break should deal with the nomenclatural genesis of this eloquent term. And there isn’t much to it, really. In fact, you could even say that the entirety of this term’s history lies in just two observations:

A: (As some of you may know) Artists are often interested in architecture;

B: (As some of you may not know) Japan is very interested in baseball.

How do these things connect? Bear with me.

A: The year is 1972. Channeling the flâneur within him, Japanese artist Genpei Akasegawa is wandering the streets of his city. As he walks past a building, his attention is arrested by a staircase. This staircase, to be precise:

Staircase to Nowhere

The staircase leads up to a landing and then down again on the other side. But there is no door there, a feature that immediately strikes Akasegawa as odd. “What a strange thing,” he thinks. “I have stumbled upon a Staircase to Nowhere.”* Assuming this structure-with-no-purpose is part of an older construction, Akasegawa is about to move on when he notices that the railing on the staircase has just been repaired. “Hmm,” he thinks again. “An architectural feature that is entirely useless, but still undergoes careful maintenance!” Akasegawa is fascinated, and from that point forward he seeks out architectural features like this one in other buildings as well: doors that are boarded up but freshly painted, water pipes that haven’t carried water in years but are still cleaned, or windows that hug brick walls but still have their glass replaced when broken. Anything that fits the bill of “useless but well-maintained.”

B: The year is 1980. After having played his last game for the Los Angeles Dodgers, Major League Baseball player Gary Thomasson is bought by the Yomiuri Giants of Japan’s Nippon Professional Baseball league. Having played for the San Francisco Giants, the Oakland Athletics, and the New York Yankees before he was traded to the Dodgers, Thomasson is a bit of a star. And as these things tend to happen, the Yomiuri Giants give him a contract befitting a star. The heftiest contract, in fact, in the history of the Nippon League. People have grand expectations; Thomasson is eager to deliver. But something happens upon his arrival in Japan, something dramatically antithetical to the fairytale denouement everyone is expecting: Thomasson loses his mojo. Drops his baseball chops, entirely and spectacularly. Truth be told, if he hadn’t injured his knee in a career-ending move, he might have set a new strikeout record for the league.

But the contract remains, and the Yomiuri Giants are under legal obligation to pay him. For two years, the player strikes out but cashes in. For the Nippon League–and many baseball fans in Japan–Gary Thomasson came to define “useless but maintained.”

You see the connection now, don’t you?

I’m not sure if Akasegawa was a baseball fan, but he definitely kept tabs on the performance of various players in the Nippon League. And when it dawned on him at some point during his quest that these “useless-but-maintained” architectural features should be given a name, “Thomasson” seemed an obvious (if slightly mean-spirited) choice. He started to photograph these Thomassons, compiling and publishing them in his 1985 book Hyperart: Thomasson. The term quickly gained a cult following, and the book was later translated into English.

Understandably, Gary Thomasson himself had no comment on the matter. But meanness aside, I think Akasegawa has given us a great gift here: a name for all those life situations we’ve never been able to define. That tailbone that serves no purpose but needs you to exercise the muscles around it because you once injured it in spin class? It’s a Thomasson! That sweater you have to wash in special detergent made from a baby penguin’s belly feathers, but that you don’t actually wear anywhere because it isn’t warm enough? It’s a Thomasson! That dog you adopted to bark at strangers when you moved into a sketchy neighborhood because of sketchy career choices like wanting to be a writer, but who just cannot control his excitement upon meeting new people? He’s a Thomasson! That cat who–well, any cat, really–is a Thomasson! Thomassons all round, everywhere you look.

There’s perverse beauty in the term, you have to admit. And the best part? It’s thoroughly useful, and needs no maintenance whatsoever.

Want to know more? Go here. And here.

*I may be paraphrasing Akasegawa’s thoughts here. I’m a storyteller. I can’t help myself.


Neanderthals and the Addiction to Nicotine

As a writer I sometimes feel like I’ve committed to a lifetime of searching for a new story to tell. I am completely wedded to the plot I’m working on right now (“Oh yeah, Neha – weren’t you supposed to have finished writing the first half of your novel by now?” Shut up, it’s none of your business.) but that doesn’t keep me from thinking of new ideas, new stories, even new genres. The other evening, after a day of hard and frustrating work trying to connect the various narrative strains in my book, I sat down with some green tea (actually it was scotch) and picked up an article on our long-dead cousins, the Neanderthals. And one of the things I read in it was that while Homo sapiens (our great, great, great gran-grans and pop-pops) developed the concept of living in large settlements resembling villages quite early in the game, Neanderthals lived in tiny groups that were separated from and not dependent on one another.

Gods! I think there is an alternate universe story to tell here!

I immediately engaged Boyfriend, who likes pointlessly long-drawn discussions about completely hypothetical situations more than anyone else I know. I shared some of my scotch with him (actually, he just made himself some green tea), and we plotted the world as it might have been 400,000 years ago, with the hypothetical situation that Homo sapiens and Neanderthals had gone to war with one another. And every scenario we could think of, I ended with gran-gran and pop-pop wiping out the Neanderthals.

For some reason, Boyfriend was disappointed by this; he really, really wanted the Neanderthals to survive. Maybe the Neanderthals fought back when they were attacked, he suggested.

Sure they did. But the Homo sapiens lived in bigger groups, and their interdependency had probably led to a sense of common identity. They had larger numbers – the Neanderthals lived in tiny groups, remember? – and they were fighting for something bigger than just a bunch of strangers not putting out their campfire. (You think flint is easy to find, you motherf***er? [Of course, at the time I doubt motherf***er was really considered an insult. But I digress.])

Well, maybe they were stronger…? Boyfriend suggested feebly, knowing this was barely a possibility. Even in a universe not as hypothetical as the one we were constructing, Homo sapiens occupied sub-Saharan Africa and parts of Asia – regions with tropical to temperate climates. Neanderthals were scattered across Europe, struggling to fight the cold, hunt for food, and somehow survive. I doubt they were stronger.

At this point, Boyfriend was crestfallen. What’s the point of a hypothetical universe if everything’s the same? Hey – he brightened up – maybe they fought for a bit but then called a truce and lived in harmony ever after?

Haha! Called a truce. You’re funny.

Whatever. You’re right, they probably weren’t very strong, Boyfriend admitted with resignation, and then continued:

“Did you know the gene for nicotine addiction in modern human beings comes from Neanderthals?”

Homo sapiens, Q.E.D. (Image source.)

Scientists have known for a while that about 2–4% of the genetic material of the average modern human being of non-African origin comes from Neanderthals (sub-Saharan Africans are believed to have inherited little to no genetic material from Neanderthals), mainly affecting skin and hair. A recent study conducted at Harvard University and published in Nature has managed to shed some light on the microscopic aspects of this genetic mixing. It found that the alleles for nicotine addiction, Type 2 diabetes, and Crohn’s disease in the modern human being are a result of interbreeding between Neanderthals and Homo sapiens.

Neanderthals and Homo sapiens shared a common ancestor about 600,000 years ago. But for several hundred thousand years after these ‘cousins’ parted ways, Homo sapiens lived in sub-Saharan Africa, where they hunted, gathered, and eventually settled and farmed. Neanderthals lived in hostile climates outside Africa, where the barren environment was not really conducive to farming. The study found that by the time the groups of Homo sapiens who left sub-Saharan Africa met and bred with Neanderthals, the two species had already evolved in very distinct ways, and were on the brink of incompatibility. About 40,000 years ago, not very long after this interbreeding, the Neanderthals died out.

While there have been earlier indications explaining the Neanderthal demise, the Harvard study is among the first to have found evidence across large areas of human DNA that suggests Neanderthals may, in fact, have had genetic qualities that were harmful, and that were passed on to the first human beings through interbreeding but were wiped out of the gene pool over time through natural selection. The allele for nicotine addiction is among the few that persist in modern humans.

I guess this means all of you smokers out there can no longer be guilt-tripped by your friends and family. I mean, you got the genes from the Neanderthals. It’s not your fault, right? Right?

Eh. Wrong.

Of course, Fred’s insight may apply to things other than just the human desire to smoke. (Image source.)

I am definitely among those people who get inordinately excited about evolutionary biology and genetic findings such as these. But as significant as the results of this study are, we have to bear in mind just how long we (Homo sapiens as a species, not just modern humans) have been around, the ways in which our environment has changed, and the fact that Evolution (as I have pointed out before on this blog) is really slow to catch on. It would be erroneous, then, to draw a relationship of direct causality between the findings of this study and the extinction of Neanderthals some 40,000 years ago, or diseases and addictions in humans today. While the allele for nicotine addiction can be traced back to Neanderthals, that does not necessarily mean Neanderthals were smokers or addicts of any kind. In its original carrier, the gene could have been part of a whole other DNA party altogether.

Similarly, while the high incidence of Crohn’s disease in people of eastern European descent (considered among the genetically closest to Neanderthals) may suggest some causality, this reasoning is immediately challenged by the high incidence of Type 2 diabetes in African Americans, which is likely the result of modern environmental and socio-economic factors. And let’s not forget, we got some good things from the Neanderthals too. Neanderthal DNA is what gives us stronger keratin, which leads to stronger skin, hair, and nails. But again – while this may have helped the earliest humans survive extreme cold, it is still not strong enough evidence to suggest that the Neanderthals had adapted well to the climates they lived in.

So while each of these studies uncovers a new piece of the jigsaw puzzle, the puzzle itself is so big and so complicated that I don’t know if we’ll ever really find all the pieces. Our species is becoming more enlightened and more informed with every passing minute, but in the words of one recently famous TV redhead, you (still) know nothing, Jon Snow.

Want to know more? Go here. And here, for the study published in Nature.