What’s Black, White, and Blue All Over: The Hierarchy of Colors in Linguistic Evolution

Recently, Boyfriend and I, along with our board-game buddies, have taken a break from Settlers of Catan and decided to give Puerto Rico a shot instead. Like any other well-designed strategy game, Puerto Rico is a lot of fun, but it also makes me profoundly uncomfortable. As the name suggests, it is set in some fictional version of Puerto Rico, and the path to victory includes establishing plantations of crops like sugar, indigo, or tobacco. You get to either sell the harvest in the (presumably) European market, or ship it off to (presumably) Europe.

The game doesn’t say so explicitly, but if you’ve ever attended a high school history class, you will find it hard to miss the fact that we’re dealing with colonialism here. Of course, if you still don’t understand why this game makes me uncomfortable, allow me to present yet another detail. The plantations that you acquire need workers, right? Well, the game provides you with these workers. They are called “colonists,” they mysteriously appear on a “colonist ship,” and the “Mayor” of Puerto Rico gets to distribute them among various players. Here’s what they look like:

Yeah. Those dark brown pegs? They’re the “colonists.” Now do you understand why this award-winning game makes me uncomfortable?

Obviously, I couldn’t help addressing this issue for the entire duration of the game. It didn’t stop me from being competitive, of course, and when I came in second to Boyfriend, who had invested heavily in indigo plantations, I may have said a thing or two about how the pursuit of indigo had essentially destroyed the world, and how I hoped he was looking forward to having all his clothes dyed in blue. Boyfriend had already completed his victory dance and moved on to other things at this point, and when I threatened to dye his clothes blue, his mind raced to this place:

“Did you know that in most languages blue is one of the last colors to be named?”

I looked this up. Not only is blue the very last basic color to be given a name in most languages, but most languages also follow the same pattern when it comes to the order in which other basic colors are named. As far back as 1969, anthropologist Brent Berlin and linguist Paul Kay posited that if you could determine what stage of evolution a particular language was in, you could draw accurate conclusions about how many colors had been named in this language, and which ones they were. All languages, they claimed, had terms for black (dark/cold) and white (light/warm), because these two categories were named in Stage 1. When a language progressed to Stage 2, it had a name for red, and if it was in Stages 3 or 4, it had names for yellow or green, or both. The naming of the color blue, however, took place only when a language had reached a significantly advanced stage. As happens in academia, Berlin and Kay’s work was challenged on several grounds in the following years. But as recently as 2012, a paper published in PNAS confirmed the order of naming colors that they had originally proposed. Most languages, it appears, name basic colors in the following order:

1: Black/White, 2: Red, 3: Yellow, 4: Green (or the other way around), and 5: Blue.
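
The neat thing about Berlin and Kay’s claim is that it is an implicational hierarchy: knowing a language’s stage tells you, in theory, exactly which colors it has names for. For the programmers in the audience, here’s a minimal sketch of that idea in Python (the data structure, stage comments, and function name are mine, purely for illustration):

```python
# A toy encoding of the color-naming hierarchy described above.
HIERARCHY = [
    {"black", "white"},  # Stage 1: dark/cold vs. light/warm
    {"red"},             # Stage 2
    {"yellow"},          # Stage 3 (yellow and green can trade places)
    {"green"},           # Stage 4
    {"blue"},            # Stage 5: famously last
]

def named_colors(stage: int) -> set:
    """Colors a language at a given stage is predicted to have names for."""
    colors = set()
    for level in HIERARCHY[:stage]:
        colors |= level
    return colors

print(named_colors(2))  # a Stage 2 language: black, white, red (set order may vary)
print(named_colors(5))  # all five basics, with blue arriving last
```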

There is an obvious question here: Why is this the case? There is an abundance of research showing that neither cultures nor languages are homogeneous. The overwhelming similarity in the order in which basic colors are named, however, suggests that there is a fundamental commonality in human experience that dates as far back as the beginning of language.

No one has been able to lay out with any certainty what this fundamental experiential commonality is, but that doesn’t mean people haven’t tried. William Gladstone, the British statesman and future Prime Minister, published a study of Homer’s work in 1858, and was particularly bothered by the great writer’s use of the term “wine-colored” or “wine-like” to describe turbulent seas, stubborn oxen, and everything that lay in between. To Gladstone, neither the sea nor oxen were even close to “wine-colored.” He concluded, therefore, that the ancient Greeks hadn’t developed the ocular distinction between various colors, and that to them, the world appeared mostly black and white with some shades of red.

This theory was later debunked, and it became clear that the question needed to be addressed from a linguistic perspective, not an optical one. In his book Through the Language Glass: Why the World Looks Different in Other Languages, Guy Deutscher briefly deals with the color-naming hierarchy, and suggests that languages are universally late in arriving at a word for blue because the color rarely appears in nature.

Occurs in nature. (Image source.)

DOES NOT occur in nature.

The hypothesis does hold a certain amount of weight. Black and white are as basic as shutting your eyes and then opening them, or being able to tell the difference between night and day. Red, similarly, occurs widely in nature, in flowers, berries, some types of soil, birds, beetles, even the sunrise. Red is also one of the easiest dyes to make naturally, while blue is among the most difficult. Furthermore, as some people have proposed, red is a color that would have been familiar to human beings since the very beginning because it is the color of blood. Even before early humans took to hunting and eating meat, between injuries in the wild, menstruation, and childbirth, they probably bled quite a bit themselves. Going further down the naming hierarchy, yellow is common to flowers, fruits, and animals, while even a little scrap of shrubbery would introduce you to green. Blue, however, is found only on rare insects, rare flowers, rare birds and reptiles, and a handful of berries.

Oh, and the sky. Yes, the sky appears blue. But that’s only some of the time, isn’t it? The sky also appears purple and magenta and orange and grey. And each of these colors was among the very last to be named, coming only after blue. Clearly, early humans weren’t big on staring into the sky. Or maybe they were just rabidly utilitarian: if the sky didn’t interfere with their daily lives, they didn’t feel the need to address it.

There are no certain answers on the subject. Each theory has holes in it. For instance, the ground is brown for the most part, as are tree trunks and the skin of a lot of early humans. So why didn’t the name for this color appear before red? We’ll probably never know. What we can draw conclusions about, though, is the connection between this linguistic phenomenon and the spread of European colonialism. (I know. I can’t help myself: a decade of Postcolonial Studies will do that to you. But bear with me.) Scholars like Deutscher have proposed that one of the reasons blue was among the last colors to be named is that blue dyes were very difficult to make. Now, the latter part of this claim is known to be true. Because blue was a very difficult dye to make, the color acquired luxury status across the ancient and medieval world. It became the color of royalty and of the upper classes. The ancient Egyptians dyed the cloth they used to mummify bodies blue, and Julius Caesar claimed that Celtic warriors painted their bodies blue. Because of the status this color enjoyed, indigo, when discovered, became something of a goldmine for European traders. It became a business that laid the foundations for one of history’s biggest colonial empires.

I know, there’s no causation here. But I didn’t promise a relationship of causality, I just promised a connection. And the connection is undeniable, don’t you think? Just like the beauty of a blue planet seen from far, far away.

Want to know more? Go here. And here.

Neanderthals and the Addiction to Nicotine

As a writer I sometimes feel like I’ve committed to a lifetime of searching for a new story to tell. I am completely wedded to the plot I’m working on right now (“Oh yeah, Neha – weren’t you supposed to have finished writing the first half of your novel by now?” Shut up, it’s none of your business.) but that doesn’t keep me from thinking of new ideas, new stories, even new genres. The other evening, after a day of hard and frustrating work trying to connect the various narrative strands in my book, I sat down with some green tea (actually it was scotch) and picked up an article on our long-dead cousins, the Neanderthals. And one of the things I read in it was that while Homo sapiens (our great, great, great gran-grans and pop-pops) developed the concept of living in large settlements resembling villages quite early in the game, Neanderthals lived in tiny groups that were separated from and not dependent on one another.

Gods! I think there is an alternate universe story to tell here!

I immediately engaged Boyfriend, who likes pointlessly long, drawn-out discussions about completely hypothetical situations more than anyone else I know. I shared some of my scotch with him (actually, he just made himself some green tea), and we plotted the world as it might have been 400,000 years ago, under the hypothetical that Homo sapiens and Neanderthals had gone to war with one another. And I ended every scenario we could think of with gran-gran and pop-pop wiping out the Neanderthals.

For some reason, Boyfriend was disappointed by this; he really, really wanted the Neanderthals to survive. Maybe the Neanderthals fought back when they were attacked, he suggested.

Sure they did. But the Homo sapiens lived in bigger groups, and their interdependency had probably led to a sense of common identity. They had larger numbers – the Neanderthals lived in tiny groups, remember? – and they were fighting for something bigger than just a bunch of strangers not putting out their campfire. (You think flint is easy to find, you motherf***er? [Of course, at the time I doubt motherf***er was really considered an insult. But I digress.])

Well, maybe they were stronger…? Boyfriend suggested feebly, knowing this was barely a possibility. Even in a universe not as hypothetical as the one we were constructing, Homo sapiens occupied sub-Saharan Africa and parts of Asia – regions with tropical to temperate climates. Neanderthals were scattered across Europe, struggling to fight the cold, hunt for food, and somehow survive. I doubt they were stronger.

At this point, Boyfriend was crestfallen. What’s the point of a hypothetical universe if everything’s the same? Hey – he brightened up – maybe they fought for a bit but then called a truce and lived in harmony ever after?

Haha! Called a truce. You’re funny.

Whatever. You’re right, they probably weren’t very strong, Boyfriend admitted with resignation, and then continued:

“Did you know the gene for nicotine addiction in modern human beings comes from Neanderthals?”

Homo sapiens, Q.E.D. (Image source.)

Scientists have known for a long time that about 2-4% of the genetic material of average modern human beings (not of African origin – sub-Saharan Africans are believed to have inherited little to no genetic material from Neanderthals) comes from Neanderthals, mainly affecting skin and hair. A recent study conducted at Harvard University and published in Nature has managed to shed some light on the microscopic aspects of this genetic mixing. It found that the alleles associated with nicotine addiction, Type 2 diabetes, and Crohn’s disease in modern human beings are a result of interbreeding between Neanderthals and Homo sapiens.

Neanderthals and Homo sapiens shared common ancestors about 600,000 years ago. But for several hundred thousand years after these ‘cousins’ parted ways, Homo sapiens lived in sub-Saharan Africa, where they hunted, gathered, and eventually settled and farmed. Neanderthals lived in hostile climates outside Africa, where the barren environment was not really conducive to farming. The study found that by the time the groups of Homo sapiens who left sub-Saharan Africa met and bred with Neanderthals, the two species had already evolved in very distinct ways, and were on the brink of incompatibility. About 40,000 years ago, not very long after this interbreeding, the Neanderthals died out.

While there have been plenty of attempts to explain the Neanderthal demise, the Harvard study is among the first to have found evidence across large areas of human DNA suggesting that Neanderthals may, in fact, have had genetic qualities that were harmful, that were passed on to the first modern humans through interbreeding, and that were wiped out of the gene pool over time through natural selection. The allele for nicotine addiction is among the few that persist in modern humans.
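
If you’re curious what “wiped out of the gene pool over time” actually looks like, here’s a minimal back-of-the-envelope sketch in Python, using the standard population-genetics recurrence for an allele carrying a small fitness cost. The specific numbers (a 1% fitness cost, a 2% starting frequency) are my own illustrative guesses, not figures from the study:

```python
# A toy model of selection slowly purging a mildly harmful allele.
# The 1% fitness cost and 2% starting frequency are illustrative
# guesses, not values from the Nature study.

def next_freq(p: float, s: float) -> float:
    """One generation of genic selection against an allele with fitness 1 - s."""
    return p * (1 - s) / (1 - s * p)

p, s = 0.02, 0.01  # starting frequency and selection coefficient
for generation in range(2001):
    if generation % 500 == 0:
        print(f"generation {generation:4d}: allele frequency {p:.2e}")
    p = next_freq(p, s)
```

Even that tiny 1% cost drives the allele to practically nothing within a couple of thousand generations, which, at roughly 25 years per generation, fits comfortably inside the tens of thousands of years since the interbreeding. The alleles that do persist, like the one for nicotine addiction, presumably carried little or no cost for most of that time.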

I guess this means all of you smokers out there can no longer be guilt-tripped by your friends and family. I mean, you got the genes from the Neanderthals. It’s not your fault, right? Right?

Eh. Wrong.

Of course, Fred’s insight may apply to things other than just the human desire to smoke. (Image source.)

I am definitely among those people who get inordinately excited about evolutionary biology and genetic findings such as these. But as significant as the results of this study are, we have to bear in mind that considering just how long we (Homo sapiens, not just modern humans) have been around, the ways in which our environment has changed, and the fact that Evolution (as I have pointed out before on this blog) is really slow to catch on, it would be erroneous to attach a relationship of direct causality between the findings of this study and the extinction of Neanderthals 40,000 years ago, or diseases and addictions in humans today. While the allele for nicotine addiction can be traced back to Neanderthals, that does not necessarily mean Neanderthals were smokers or addicts of any kind. In its original carrier, the gene could have been part of a whole other DNA party altogether. Similarly, while the high incidence of Crohn’s disease in people of eastern European descent (considered among the closest to Neanderthals) may suggest some causality, this reasoning is immediately challenged by the high incidence of Type 2 diabetes in African Americans, which is likely the result of modern environmental and socio-economic factors. And let’s not forget, we got some good things from the Neanderthals too. Neanderthal DNA is what gives us stronger keratin, which leads to stronger skin, hair, and nails. But again – while this may have helped the earliest humans survive extreme cold, it is still not strong enough evidence to suggest that the Neanderthals had adapted well to the extreme cold they lived in.

So while each of these studies wipes clean a new piece of a jigsaw puzzle, the puzzle itself is so big and so complicated, I don’t know if we’ll ever really find all the pieces. Our species is becoming more enlightened and more informed with every passing minute, but in the words of one recently famous TV redhead, you (still) know nothing, Jon Snow.

Want to know more? Go here. And here, for the study published in Nature.

Three Teaspoons of Vaseline a Day Keeps the Doctor Away?

Lately, Boyfriend has become enamored with the versatility of gelatin, and turns any produce he can get his hands on into jelly or jello. The results are mostly of the orgasmic-explosion-of-flavors-in-mouth type, but every now and then, they are also just plain weird. Last week, he attempted to make a clear jello glaze/cake/brick/thingie. The end product looked like a blob of Vaseline, an observation with which Boyfriend concurred. I took a spoonful and tasted it, realizing soon enough that it didn’t really have a taste so much as a vague, papery-but-buttery texture. So I declared with finality that it tasted like Vaseline too.

“Oh, that’s interesting,” Boyfriend said as he started to clean up.

Really? It’s interesting that I think it tastes like a non-edible, greasy, skin care product I’ve never tasted?

Here’s why it was interesting:

“Did you know that the guy who invented Vaseline ate three spoons of it every day, because he believed it contained a magic ingredient that could cure every disease?”

Robert Chesebrough, an English-born New Yorker, discovered Vaseline in 1869. Well, there’s actually some confusion over whether he discovered or invented it. Here’s why:

In 1859, at the green age of twenty-two, Robert Chesebrough found himself dreaming the greatest cliché of all American dreams: he wanted to be a millionaire. So one night, after the farm animals had been penned in, supper had been eaten, and the gentle snores of his businessman father wafted through the family home, young Bobby jumped out the window and took off for the oil rigs of Pennsylvania. OK, I don’t actually know if they had farm animals, or if Mr. Chesebrough senior snored. Or, for that matter, if young Bobby really ran away in the middle of the night. But he did abandon his father’s business because he thought he could make a fortune with oil.

It was only once he got to the oil rigs that he realized he actually had the curious and observant sensibility of a scientist, not the brute power and drive of an oil worker. Instead of getting his hands dirty, Bobby parked himself on the sidelines and observed the workers. (I really want to say it was because he was taken with their glistening, muscular bodies, but I have no proof of that either.)

He noticed that a black, gooey substance the workers called rod wax accumulated periodically on the metal rods and prevented the machinery from functioning smoothly. And while this rod wax was the cause of much anxiety among the oil barons, the workers used it as a salve to heal their injuries. At the time, Bobby didn’t know what the chemical properties of this black goo were. But in it, he saw the opportunity to get rich without getting dirty.

Babies with unnecessary amounts of attitude? Vaseline has a cure! (Image source.)

Bobby pinched some rod wax, stored it away carefully, and rode right back home. Over the next ten years, as his father grew old and bent of back trying to keep a business he had built for his son running, just so he could continue to feed his family (Again, don’t take my word for this. For all I know Mr. Chesebrough Sr. was having pool parties in a retirement community in Jacksonville.), Bobby Chesebrough built himself a lab, and experimented with rod wax until he finally hit upon a formula that produced a compound that had similar chemical properties but was odorless and colorless.

He named this product Vaseline, from the German Wasser (water) and the Greek elaion (oil). A greasy substance derived from the waste of oil rigs and distilled by a young, mostly self-taught chemist who believed it had magical properties. Unsurprisingly, no one bought it.

Until Bobby Chesebrough got his marketing chops on.

When pharmacists and chemists in the city rejected his product, he loaded tubs of Vaseline onto a wagon and carted them off to the countryside. Once there, he inflicted burn wounds on his own skin (seriously, what is with this trend?!) and demonstrated how quickly they healed upon the application of Vaseline.

That, and he ate three spoonfuls of Vaseline every day. Because, his decade-long intimacy with the product notwithstanding, he believed it contained a secret, magical element that had the ability to cure every illness and disease man had ever known. (Oh, the innocence of a world without Hepatitis and HIV, a world with rarely occurring cancers.) The one time Bobby fell seriously ill from pleurisy, he insisted the hospital staff cover him in Vaseline from head to toe.

Almost certainly incidentally, he recovered. Almost certainly not incidentally, this recovery turned him into an even greater Vaseline fanatic. Robert Chesebrough ate Vaseline three times a day until he died.

As for the product, well, I’m sure almost every one of us has had a tub of Vaseline Petroleum Jelly in our homes at some point. That tells us all we need to know. Of course, we also know that Vaseline doesn’t contain a mysterious, magic ingredient, and it doesn’t have secret healing powers. As Professor John Hawk of the St. Thomas Hospital in London explains, “Vaseline is an occlusive moisturizer, which means that it creates a barrier on the surface of the skin. This is beneficial because it helps the skin to retain moisture, which is crucial to the healing process, and also because it keeps wounds sterile by preventing harmful bacteria from getting in.” Thus its efficacy in mending cold sores, chapped skin, minor burns, and nappy rashes.

I... just... don't even.

I… just… don’t even.

In case you were still wondering, there is no scientifically verifiable benefit to eating Vaseline. In fact, there’s reason to believe it is a patently bad idea to eat it. Almost certainly incidentally, though, after Robert Chesebrough recovered from the Vaseline-slathered bout of pleurisy and vowed to give himself a daily dose of the ‘tonic,’ he went on to live a very healthy life till the age of 96.

Want to know more? Go here. And here.