The Lavender Scare

Frank Kameny (second picketer in line) and other activists protest outside the White House (Washington Post)

Happy Pride Month from HFE! 🏳️‍🌈

Many Americans who have learned about the state of the United States government during the 1950s will be familiar with the practices and consequences of McCarthyism. Named after Senator Joseph McCarthy, McCarthyism refers to an era in American politics in which many government officials were accused of harboring communist sympathies, and therefore of being disloyal to the United States. It is considered one of the primary effects of the Second Red Scare, a wave of fear and resentment towards communist and socialist ideologies that swept the country during the 1950s. As harmful as this political witch hunt was to many individuals, it often takes the spotlight from a concurrent practice that was equally damaging to the careers and reputations of thousands of American civil servants: the Lavender Scare. The term Lavender Scare, coined by LGBTQ historian David K. Johnson, describes the direct, pervasive discrimination against suspected homosexuals, bisexuals, and other members of the LGBTQ community within the United States federal government. The Scare resulted not only in the expulsion of thousands of competent, loyal federal workers, but also in the smearing of their reputations and the public exposure of their sexual identities to a hostile public.

In 1952, homosexuality was deemed a mental illness by the American Psychiatric Association. Most of the government, academia, and society as a whole saw homosexuality as sexual perversion, no different from pedophilia or bestiality. But while LGBTQ Americans had faced intolerance wherever they went for much of history, it was not until the 1950s that they became associated with treachery, espionage, and ties to communism, which together constituted the primary concerns of McCarthyism. Many argued that because the majority of homosexuals were closeted, they could invariably be controlled by blackmail from foreign agents, and therefore could not be trusted in the federal government. While the claim itself was unfounded at best and bigoted at worst, the existing hostility towards LGBTQ persons allowed systemic discrimination against the group to proceed with virtually no resistance from those in power.

While subtle means of rooting out suspected homosexuals in certain government institutions such as the State Department or the Armed Forces had existed long before McCarthyism, 1950 became a key year in the Lavender Scare. In June of that year, the US Senate began investigating the “threat” that homosexuals in government posed to the United States. The investigating subcommittee released its findings in December, declaring that homosexuals were a real and present threat, and that it was necessary to remove them from their positions because they were predisposed to commit acts that were morally wrong, illegal, or even treacherous.

From the original Senate report, “Employment of Homosexuals and Other Sex Perverts in Government”:

It is the opinion of this subcommittee that those who engage in acts of homosexuality and other perverted-sex activities are unsuitable for employment in the Federal Government. This conclusion is based upon the fact that persons who indulge in such degraded activity are committing not only illegal and immoral acts, but they also constitute security risks in positions of public trust.

Less than three years after the Senate report, Executive Order 10450 was signed by President Dwight D. Eisenhower, allowing the FBI, the Civil Service Commission, and the federal agencies themselves to investigate federal workers suspected of “criminal, infamous, dishonest, immoral, or notoriously disgraceful conduct, habitual use of intoxicants to excess, drug addiction, or sexual perversion”. Under this order, federal workers became subject to security investigations that probed deep into their professional and personal lives. Workers could be investigated, interrogated, and eventually fired on suspicions as arbitrary as receiving numerous calls from a member of the same sex, or wearing clothes deemed unfit for their apparent gender. An estimated 5,000 federal workers were fired as a direct result of Executive Order 10450, while thousands more hoping to work for the federal government were barred from employment. One particularly disturbing case is that of Andrew Ference, a young research assistant who worked at the American embassy in Paris. After he was discovered to have been living with a male partner, he was subjected to harsh investigations and interrogations. Fearing that he would lose his dream job, Ference came under intense mental stress. On September 7, 1954, while his investigation was still ongoing, he committed suicide in his Paris apartment.

In 1957, Frank Kameny, a Harvard-educated astronomer in the US Army Map Service, was fired from his position for refusing to disclose information about his sexuality, and was later permanently barred from any future federal employment. Furious at the injustice he had been subjected to, Kameny went on to co-found the Mattachine Society of Washington, a branch of one of America’s first national gay rights organizations (which by then had split into independent regional groups). The Society conducted some of the first gay rights protests by openly LGBTQ persons, many of whom were former federal employees fired because of their sexual or gender identity. Activists such as Kameny continued to fight all the way until the 1990s, when President Bill Clinton finally banned sexual-orientation-based discrimination in federal employment and in the granting of federal security clearances.

While the Lavender Scare no doubt left a legacy of homophobia in the American federal government and bureaucracy, it also left an important legacy of LGBTQ activism. The actions of bold leaders like Frank Kameny were the first steps in a continuing fight towards true equality for all LGBTQ Americans.

Mythical Origins Part III: Nazi Germany

Nazi German poster that compared “Aryan” (left) and Jewish (right) children for racial classification. Nazi propaganda used both existing and fabricated myths to promote the idea of a master race. (United States Holocaust Memorial Museum)

Nazi Germany was, in every sense, a totalitarian dictatorship. Through the whims of just a few powerful men, it caused and facilitated more destruction than perhaps any other state or institution in the history of the world. With absolute power in their hands, Adolf Hitler and his inner circle had complete control over anything that happened within Germany or the territories it later controlled. Any internal opposition to the ideas or actions of the Nazi Party or its leadership was quickly snuffed out by cruel agencies such as the Gestapo, the Nazi secret police. The Nazi regime is rightfully remembered as the historical epitome of tyranny and oppression; the antithesis of liberty and democracy. It is therefore surprising to learn that Hitler and his inner circle first came to power largely by legitimate means. That is, they were elected through free and fair elections or appointed by existing government officials as allowed by the constitution of the Weimar Republic. While Hitler eventually transformed Germany into the despotic nightmare that the world has come to despise, his unassuming rise to power reveals one important fact about the Nazi Party: the ideology that the party produced was not merely the insane machinations of a few corrupt politicians, but a complete set of social, moral, and political ideas that genuinely captured the hearts and minds of the German people.

Though there were a number of factors that allowed for the rise of the Nazis, one issue on the minds of many Germans was the standing of the German people in relation to the rest of the world. The humiliating result of the First World War and the economic hardship brought on by the worldwide Great Depression had left Germany in a weaker state than it had been in for decades. The Nazis therefore capitalized on the people’s desire for German strength, empowerment, and respect. This led to the creation of the myth of the Herrenrasse, the superior race of humans. This and other myths central to the Nazi Party’s ideology were themselves derived from much older myths originating in the Nordic and Germanic cultures that predated the Nazis by centuries. Like the Yoruba myths in West Africa, these myths were modified or outright rewritten to further the political agenda of the Nazi Party. More specifically, the new mythology upheld the notion that the peoples of Central and Northern Europe were inherently superior, and therefore justified the continued aggression of Nazi Germany in claiming and maintaining Germany’s place as the world’s dominant people.

Hitler, Goering, Himmler, Goebbels—all names typically associated with the core of the Nazi Party. But one name often less remembered by popular history belonged to a man whom Hitler himself credited as a spiritual co-founder of Nazism: Dietrich Eckart. Eckart was a political writer and poet from the Bavaria region of southern Germany who was active in the earliest days of the Nazi Party, helping create its underlying principles as well as the personality cult surrounding Adolf Hitler. A fervent anti-Semite, Eckart filled many of his works with furious ramblings about the growing influence of Jews in Europe, and the responsibility they held for the decline of Germany. He considered Jews an outside, corrupting force that was weakening the German people, and saw Hitler as a messiah who would save them. However, there was a key problem in convincing the German people of Eckart’s vision. Germany, for most of its history, was not unified under one political or cultural entity. Eckart himself was born in the Kingdom of Bavaria, which was effectively its own country before it became a part of Germany. How could the German people unite against their Jewish enemy if they could hardly find unity among themselves? The key was through finding a common heritage in Nordic and Germanic folklore.

Eckart, alongside future prominent Nazi officials such as chief Nazi racial theorist Alfred Rosenberg, belonged to an occultist organization in Munich called the Thule Society, named after a legendary Northern European land that appeared in Greek and Roman mythology. The society believed in a perfect, almost superhuman Aryan Teutonic race. They believed this race descended from the mythical land of Atlantis, and later migrated into Germany, where they became the Germanic peoples we know today. The society claimed, however, that the race was in danger, being corrupted by races supposedly inferior to it, such as the Jews. Not only did they believe that a cultural and genetic destruction of the master race was taking place, but also that it was part of a deliberate plot by the enemies of the Aryans to take power and remove the Aryans from their rightful place at the apex of human civilization. In the end, it was the Thule Society that first sponsored the German Workers’ Party, which, under the direction of Hitler and his henchmen, would transform itself into the Nazi Party.

While the Nazi Party did adopt its own original mythology through Nazi-affiliated scholars such as Eckart and Rosenberg, it also incorporated elements of existing European mythology to strengthen its connection to its alleged Aryan heritage.

Helmet of the SS

One example of a Nazi attempt to directly tie itself to Northern European tradition can be seen in the emblem of the SS, the Nazi paramilitary group that was under the direct control of the Nazi Party. The emblem contains two Germanic sig runes, both taken to signify “victory”. The SS emblem is now one of the most enduring symbols of Nazi Germany.

Today, mythology, whether from ancient European traditions or Nazi racial theory, continues to be part of the far-right movements of the present. Symbols such as the Celtic Cross and the Triskele are used by certain white supremacist groups, which, by using these symbols, attempt to call back to a sense of common European heritage and pride. As tragic as it is, there can be no denying that the traditions and symbols of several Northern and Central European cultures have forever been associated with the actions of a few individuals. The complex development of Nazi ideology provides a sobering lesson in what it means to embrace or butcher the truth, and how easily the line between the two can be blurred.

Mythical Origins Part I: Japan

Izanami stands as Izanagi dips the Amenonuhoko into the sea (MFA Boston)

“Myths are things which never happened, but always are.”

Salutius, 4th century CE

Every civilization in the history of the Earth has a story of who they are and how they got there. While few of these could ever be verified by historical or archeological fact, even the most unlikely of origin stories can impact a society just as much as, or even more than, if the story were certain to be true. This series will provide a brief outline of the mythical origins of a handful of civilizations, and draw historical connections between those myths and the peoples they served, or continue to serve, as a foundation. As a whole, the series will attempt to highlight that in history, myth and fact can go hand in hand, and that the line between them is not always clear, nor always relevant in shaping humanity.

In the beginning, there was only chaos; the world a formless disarray of nothing and everything at the same time. From the disorder emerged a dichotomy between two opposing ideas: Heaven and Earth, with the divide between the two being apparent in all worldly and spiritual things.

Japan’s creation myth is similar to those of the Ancient Greeks or Hawaiians, which generally hold that the universe was created from chaos; that the things that make up the universe were initially or always existent, though disordered, and were later reorganized into the universe we see today. Creation-from-chaos myths are distinct from other categories of creation myths, such as creatio ex nihilo (Latin for “creation from nothing”): the belief in a single intelligent being creating the universe from nothingness.

Takamagahara, the abode of the heavenly gods, was the first thing created from the chaos. Emerging from the primordial chaos and now living in Takamagahara were the three original creation gods: Amenominakanushi, Takamimusubi, and Kamimusubi. Seven generations of deities were born from these original three, with the final generation consisting only of the brother Izanagi and the sister Izanami. The two were gifted a sacred jeweled naginata (a traditional Japanese polearm) at birth. The siblings, now standing on the bridge between Heaven and Earth, stir the sea with their naginata, creating the Earth’s first islands from the droplets that fall from the spear’s tip, and finally descend from Heaven to live on them. Before long the two realize their anatomical differences, and organize a marriage ceremony around the pillar of Heaven. Their first set of offspring are severely deformed, which they determine, after some discussion with their heavenly elders, is a result of Izanami speaking first during the ceremony instead of Izanagi. The couple redo the ceremony, this time successfully abiding by their respective roles.

The naginata, central to Japan’s creation myth, is also central to the nation’s military history. A versatile weapon—something between a sword and a lance—it was used by everyone from samurai to warrior monks. Meanwhile, the pillar of Heaven, around which the wedding takes place, is echoed in the central pillars common in buildings constructed during the Yayoi period of Japan (300 BCE–300 CE). With regards to Izanagi and Izanami’s relation to each other, it is important to note that there is a level of ambiguity within certain Japanese words for “wife” and “little sister”, so scholars continue to debate whether or not the two can be considered related by blood. Lastly, the story of Izanami’s botched role in the ceremony reflects Japan’s traditional gender roles and historically conservative attitude towards women, as well as the Confucian philosophy on gender roles which influenced virtually all of East Asia and beyond.

The renewed union between Izanagi and Izanami results in a new set of offspring, which take multiple forms. These include new islands, geographical features such as forests and mountains, and even more gods. Finally, Izanami dies in childbirth after delivering her most volatile creation, fire. A complicated saga ensues after her death, one which results in a permanent rift between the couple. Izanami, now a resident of the underworld, threatens to condemn a thousand mortals to death each day unless her husband backs off, while Izanagi promises to answer with an even greater number of births each day, thus creating the cycle of birth and death. Meanwhile, the original couple aren’t the only ones having problems with each other; so are the many gods whom they created. After Susano’o, the storm god, gets into a quarrel with his sister, the sun goddess Amaterasu, the latter locks herself in a cave, plunging the world into darkness.

The new islands described as being created by Izanagi and Izanami align well with Honshu, Kyushu, and the other major islands that make up Japan. Amaterasu locking herself in the cave likely corresponds to a real-life natural disaster, such as an eclipse, or perhaps the infamous year 536 CE, in which most of the world, not just Japan, experienced massive crop failure, probably due to a volcanic eruption that created an ash cloud that blotted out the sun.

If all of this sounds confusing and disconnected, that’s because it is. Although Japan is considered a culturally homogenous country today, that was not always the case. Its ancient peoples consisted of small independent tribes and chiefdoms, whose various myths and legends eventually culminated in an only somewhat cohesive narrative of Japanese mythology. It was not until long after these myths were created, well into the first millennium CE, that they were set down in the surviving records we have today. The many myths that together constitute the creation myth of Japan are mostly a compilation of the many disputes, affairs, and fights between the gods. Many of these myths follow a pattern such as this: a few gods get into trouble with one another, they start fighting, and something important is ultimately created as an unintended result of the fighting.

Out of the mishmash that is Japanese mythology, the two most direct and tangible legacies of the Japanese creation myth are the Shinto religion and the alleged origin of the Japanese imperial family. Shinto is considered the indigenous religion of Japan, and has existed in some form since before the Common Era. Blending elements of Japanese folk traditions (including its mythology) and, later, Buddhism, the religion is an important symbol of Japan, influencing many of the nation’s most iconic traditions, historical events, and architecture. The semi-mythical beginning of the Japanese imperial family can also be traced to the country’s founding myths. According to legend, the aforementioned sun goddess Amaterasu—herself descended from Izanagi and the original three gods—had an extensive and documented lineage of her own. Five generations below Amaterasu lies her great-great-great-grandson, Jinmu, who is considered to be the first emperor of Japan. Another 125 generations of emperors later, the family tree arrives at Naruhito, the traditional 126th emperor who, although stripped of all but ceremonial powers, currently holds the throne of Japan. Thus, if one were to trace Naruhito’s lineage as far up as the records allow, he can be considered a direct descendant of the original heavenly gods. Many readers will no doubt be familiar with fanatic acts, such as piloting a kamikaze airplane, being done in the name of the Japanese emperor. These can be paralleled to acts of religious fanaticism, since both consider their actions vindicated by the divine.

Japan is a unique nation that was formed from unique circumstances. Its mythical origin reflects the values and customs that have transformed the country from a few tribes inhabiting a handful of islands into an enduring economic and cultural power that has influenced the entire world.

The (real) Kansas Jayhawks

Soldiers from the 7th Kansas Volunteer Cavalry (Kansas Historical Society)

What do decorated college and professional basketball player Wilt Chamberlain and a storied group of anti-slavery militias have in common? Both of their titles, “Jayhawk(er)”, are deeply connected to the history of Kansas. The term has been used to represent the University of Kansas and its athletics teams, but also Kansans as a whole, and has become a symbol of pride for the entire state. Contrary to its name and cartoon image, the Jayhawk is not actually a real bird, and while the name is recognized across the United States, few outside the state of Kansas may know the term’s true and rich history.

The term “jayhawker” is most likely a compound of the blue jay and the sparrow hawk. It was first coined by the original Kansas settlers, who admired both the blue jay’s turbulent personality and the sparrow hawk’s predatory nature, and the term became applicable to anyone from the region. It was not long, however, before the story of Kansas took a sharp turn, as the Kansas-Nebraska Act of 1854 was signed into law. The bill, passed during a time of divisiveness over the issue of slavery, granted the newly formed territories of Kansas and Nebraska the right to decide by referendum whether they would be open or closed to slavery. While intended as a lasting compromise between pro- and anti-slavery factions in the US, it only heightened tensions over the issue, which would lead to—preluding, of course, the American Civil War—a period known as Bleeding Kansas.

As word spread about the policy through which Kansas and Nebraska would decide their stance on slavery, thousands of armed supporters on both sides flooded west hoping to skew the vote in one direction or the other. The southerners, hailing mostly from neighboring Missouri, were motivated by a staunch opposition to what they viewed as tyrannical abolitionism. The majority of northerners, on the other hand, were only somewhat abolitionist, most feeling little sympathy for enslaved Africans. These settlers were mostly part of the Free Soil movement, primarily concerned with protecting the White American family farm, which would no doubt be endangered by the expansion of southern-style plantations. In fact, the majority of supporters from this movement favored barring Blacks, free or enslaved, from entering the Kansas Territory at all. Only a small portion of northern settlers, such as the legendary John Brown, opposed slavery mainly on moral grounds. As the two (or perhaps three) sides, armed and ready to fight, began to enter the territory, an interesting assortment of nicknames began to sprout for the different groups during the late 1850s. The pro-slavery bands during Bleeding Kansas were generally called “bushwhackers” due to their ambush tactics and criminal reputations, while similarly aligned groups that specifically came from Missouri were called “border ruffians”. Finally, their abolitionist counterparts, seeing themselves as rightful defenders of Kansas from pro-slavery aggression, adopted the name affiliated with the region itself: “jayhawker”.

Charles Rainsford Jennison, famed jayhawker and perhaps the most fashionable officer of the American Civil War (Dickinson College)

While the dubious motives of the anti-slavery faction may on their own be enough to disprove the notion of Bleeding Kansas as a noble struggle between good and evil, it is the means through which both sides carried out their beliefs that perhaps made the conflict so ugly. Jayhawkers were known to use any means necessary to combat their enemies, not hesitating to murder or pillage to further their cause, but also to earn personal land and monetary gains from those they murdered or pillaged. As the intra-Kansas conflict continued into the much larger Civil War in 1861, so too did many of the jayhawkers’ and bushwhackers’ tactics. Union and Confederate leadership alike detested the work of men such as Charles Jennison, a notable jayhawker who led a Union militia cavalry unit notorious for its brutality and willingness to use extrajudicial killings. A more respected, though equally uncompromising, fighting force that adopted the jayhawker moniker was Lane’s Brigade, under Senator and Brigadier General James H. Lane, which earned many victories along the Missouri border. Throughout the war much of the Western frontier conflict was defined by unhinged guerilla warfare, as thousands of civilians were robbed, displaced, or summarily executed by militants on either side of the conflict, as seen in the Lawrence and Osceola raids.

As the guerilla conflict cooled off and the Civil War came to a close, the name “jayhawker” remained in the hearts of Kansans, who did not see the term in the same negative light as their former enemies had, but instead embraced it as an homage to Kansan statehood and its contributions to the Union cause. In 1890, just 25 years after the end of the war, the University of Kansas football team took the field for the first time, proudly calling themselves the Kansas Jayhawkers. Today, the KU athletics teams instead use the truncated name “Jayhawk”, which, despite its far-from-perfect origin, continues to be the symbol of Kansan pride it was 150 years ago.

The Rhodesian Bush War: Causes and Legacy

Key officials agree to the terms of the Lancaster House Agreement, the peace agreement that would lead to the full independence of Zimbabwe (The Guardian)

By the midpoint of the 20th century, the old colonial empires of Europe were beginning to come apart. The aftermath of the Second World War had ushered in a new era of peace and created a growing distaste towards imperialism and nationalism as a whole. The part of the world most affected by centuries of colonialism was the continent of Africa, which, with few exceptions, was at one point the territory of one European nation or another. Africa’s transition into its post-colonial era is a story full of triumph and tragedy, and a process that arguably continues to this day. Perhaps the best known story from post-colonial Africa is that of South Africa, whose peaceful transition from apartheid-ridden colonial state into a (somewhat) fair and equal democracy inspired much of the world. However, today’s story revolves around its neighbor to the north, the former British colony of Rhodesia, and the country it is today known as: Zimbabwe.

Though the lands that eventually became Rhodesia/Zimbabwe were occupied by a variety of peoples, they were primarily ruled by the Ndebele Kingdom, founded by a breakaway tribe from the Zulu people to the south. The Ndebele were led by Mzilikazi, himself a former Zulu general under the famed King Shaka Zulu. The Ndebele soon grew very powerful, and by the time Europeans arrived at their door, they ruled the many tribes of Zimbabwe under a tribute system, including the Shona people, who had previously been the dominant force in the region. The divisions between the Shona and the Ndebele would continue well after the latter came into power in the mid-19th century. In 1888, after several years of British presence in the area, Mzilikazi’s son and successor, King Lobengula, agreed to a deal with the cunning, imperialist, white supremacist tycoon Cecil Rhodes that conceded mining rights to Rhodes’s British South Africa Company (BSAC). Rhodes subsequently used the concession to obtain a royal charter from the British Crown and solidify the region as a British colony under BSAC rule. The new territory owned by BSAC was named Rhodesia, whose namesake was, not surprisingly, Cecil Rhodes himself. White settlers soon flooded into Rhodesia, while its native people, Shona and Ndebele alike, were forced onto “tribal trust areas”, which filled a role similar to that of North American Indian reservations. Tension between black Africans treated as second-class citizens in their own homes and a dominant white settler class is a common theme in many former European colonies on the continent, and Rhodesia was no exception.

In 1923, the colony, now called Southern Rhodesia to differentiate it from the newly created Northern Rhodesia (now Zambia), officially became a self-governing colony of the British Empire, giving it control over most of its internal affairs while remaining technically under British rule. Following the Second World War, the British government (in accordance with its decolonization policy) pressured Southern Rhodesia to end its minority rule and expand suffrage to its black African population. The colony was ruled entirely by its 80,000 whites, while its 2.5 million blacks still lived essentially as colonial assets. Not wanting to give up its domination of the country, the white leadership of Rhodesia (which switched back to the old name once Northern Rhodesia had become Zambia) declared independence in 1965; a shocking move that was technically the first of its kind since the American Revolution. The new nation of Rhodesia, under Prime Minister Ian Smith, was now totally independent, though it did not receive any real international recognition. Rhodesia received harsh economic sanctions and condemnations not just from the British, but from the entire international community.

In 1964, shortly before Rhodesia’s Unilateral Declaration of Independence (UDI), the conflict now known as the Rhodesian Bush War began with a minor skirmish between Rhodesian forces and one of the two emerging Marxist African nationalist groups. Formerly one entity, these two disparate parties were the Zimbabwe African People’s Union (ZAPU; its military wing named ZIPRA) and its breakaway group, the Zimbabwe African National Union (ZANU; its military wing named ZANLA). The generations-long division between the Ndebele and Shona peoples was key to the split, as the two ethnic groups dominated ZIPRA and ZANLA, respectively. Throughout the war, ZIPRA and ZANLA would occasionally fight each other to gain better regional control. Opposite both of these factions stood the Rhodesian Security Forces, a well-equipped, professional army that had considerable air power and its own SAS special forces unit, and that consisted of both white and black units. ZIPRA and ZANLA, having both been expelled from the country and into Zambia, conducted the first phase of the war (1964-1972) through a number of battles along and across the Zambian border. The Rhodesians thoroughly defeated the rebels in this first phase, even becoming confident enough to release rebel leaders such as Robert Mugabe, whom they deemed to no longer be a threat.

However, as the Rhodesians were celebrating their apparent victory, ZIPRA and ZANLA weren’t just licking their wounds, but were also gearing up for the second phase. Due to the involvement of communist and non-communist factions, Cold War politics inevitably found their way into the conflict. The United States covertly supported Rhodesia due to the Rhodesian Front’s strong anticommunist sentiments, while South Africa provided ground forces to fight alongside the Rhodesians. More importantly, ZIPRA was strongly backed by the Soviet Union, while ZANLA received support from the People’s Republic of China. The advisors and resources provided by their strong allies gave the insurgents a tremendous advantage coming into the second phase (1972-1979). ZANLA’s relocation to Mozambique also meant that the Rhodesians now needed to fight along the Mozambican border as well as the Zambian one, further worsening the situation for Ian Smith’s government. As South Africa pulled out of the conflict and more insurgents entered the country, the situation for Rhodesia became desperate, with the Rhodesian Security Forces even resorting to the use of deadly chemical and biological weapons. It soon became clear that white minority rule was no longer possible, and a gradual political transition was attempted, which led to the election of the first black prime minister and president of a newly named, but short-lived, state, Zimbabwe-Rhodesia. But for the warring ZANU and ZAPU, it was still not enough, and fighting continued until the Lancaster House Agreement of December 1979. The agreement, brokered by the British government under the newly elected Prime Minister Margaret Thatcher, granted Zimbabwe full and sovereign independence, while also allowing the belligerent ZANU and ZAPU parties to hold office. After a decade and a half of brutal conflict, Zimbabwe was finally an independent nation.

In 1980, ZANU leader Robert Mugabe was elected Prime Minister, and despite the popularity of his anti-imperialist heroics, the election essentially began Mugabe’s 37-year rule as a brutal and corrupt despot. While the story of post-war Zimbabwe and Mugabe’s dictatorship is all too common among unstable African nations affected by colonialism, it is perhaps Rhodesia itself that has the most interesting legacy from the whole ordeal. Unlike South Africa, which peacefully transitioned from white minority rule into a liberal democracy, Rhodesia is still followed, long after its dissolution, by the racist sentiments and overall brutality that caused its civil war. In the last few years, an odd nostalgia for Rhodesia has appeared in certain alt-right and white supremacist communities, especially in the United States. Dylann Roof, the perpetrator of the racially motivated 2015 Charleston church shooting, wrote his hateful, violent manifesto on a website called The Last Rhodesian, and a picture surfaced of him wearing a jacket bearing the Rhodesian flag. Meanwhile, certain social media communities celebrate the memory of Rhodesia, with some claiming that the country was better off under white rule. As strange as Rhodesia’s modern legacy seems to be, there can be no doubt that European colonialism as a whole, and the countless movements that resulted from it, have shaped the social fabric of the world in more ways than one.

Eugenics in America Part III: Native American Women

Cartoon from US Department of Health, Education, and Welfare (HEW), encouraging American Indians to have a sterilization procedure. Left depicts tired parents with many children and only one horse; right depicts happy, active parents with a single child but many horses. (Akwesasne Notes, via UC Berkeley Law)

Since the arrival of the first European colonists, the indigenous population of the Americas, especially the more sparsely populated tribes in modern-day Canada and the United States, has been in grave danger due to the multitude of threats posed by colonialism and its legacy. Among these threats, the most commonly known in popular history are likely the deadly smallpox epidemics and the forced relocation of tribes onto reservations, both of which took place largely before the end of the 19th century. But a far more recent assault on the civil rights and autonomy of American Indians was brought to light well within a human lifetime of the present day, when tens of thousands of Native American women were coerced into dangerous and effectively irreversible sterilization procedures. The number of women sterilized, though small in the context of the total US population, is massive when taken as a proportion of the Native American population; far higher than in any other ethnic group in the United States.

In 1955, the Indian Health Service (IHS) was founded after the authority to oversee Indian health concerns was transferred from the Bureau of Indian Affairs to the Department of Health, Education, and Welfare. The IHS aimed to provide necessary care to the hundreds of thousands of Native Americans living on reservations across the country, and was, in fact, successful in ensuring that more people had their healthcare needs properly addressed. However, most of the doctors of the IHS were not Indians themselves, and some thus held the same prejudice against Native Americans that many others did at the time. Many were under the belief that Indians were inherently intellectually and morally inferior, and that they could not be trusted to manage their own health. These assumptions, though problematic in and of themselves, became especially concerning just a decade after the founding of the IHS, when the Service began to provide family planning services to its patients.

The United States government had long been concerned about the extremely high birth rates in many Indian communities, with some tribes averaging up to 4 children per mother in 1970, double that of America’s white population. Many attributed the problems of poverty, drug abuse, and overall social decay to the rapidly rising Indian population. The family planning program was meant to advise patients about different methods of birth control, but the prejudice against the Indians—taking the form of a flawed dynamic in which the doctors held a position of intellectual and authoritative superiority over their patients—became very apparent as many patients were coerced into receiving treatments they would have otherwise refused. The two most common procedures for women were tubal ligations (colloquially known as “getting one’s tubes tied”), in which the Fallopian tubes are blocked, and the far more dangerous hysterectomy—the complete removal of the uterus. Both procedures were extremely difficult or impossible to reverse and are considered permanent forms of birth control.

Despite the gravity of the procedures being performed, patients often did not have an interpreter through whom they could clearly communicate with their doctor, while the doctors themselves often omitted any mention of a procedure’s permanency or other long-term effects. Several Indian women later interviewed also claimed that the IHS, as well as other welfare agencies, threatened to cut their benefits should they choose to have another child. Perhaps the most coercive technique, however, was the threat of losing one’s children to foster homes, adoption, or boarding schools—a fear deeply rooted in the culture of Canadian and American Indian tribes.

From Jane Lawrence’s essay, “The Indian Health Service and the Sterilization of Native American Women”, American Indian Quarterly, via University of Nebraska Press:

A young Indian woman entered Dr. Connie Pinkerton-Uri’s Los Angeles office on a November day in 1972. The twenty-six-year-old woman asked Dr. Pinkerton-Uri for a “womb transplant” because she and her husband wished to start a family. An Indian Health Service (IHS) physician had given the woman a complete hysterectomy when she was having problems with alcoholism six years earlier. Dr. Pinkerton-Uri had to tell the young woman there was no such thing as a “womb transplant” despite the IHS physician having told her that the surgery was reversible. The woman left Dr. Pinkerton-Uri’s office in tears.

Estimates for the number of American Indian women sterilized in the 1970s are almost dumbfounding, ranging from 25 to 50 percent of Native women of childbearing age. Dr. Constance Redbird Pinkerton-Uri, a Choctaw/Cherokee physician and advocate for Indian interests, stated that the mass sterilization was not motivated by a desire to reduce the native population, but by a flawed idea that the solution to poverty was to limit the number of children a family could have. Others, however, such as Northern Cheyenne tribal judge Marie Sanchez, viewed it as a modern form of genocide; a continuation of the injustices perpetrated by the United States government against Native Americans. Whatever the motive, it was clear that the sovereignty and welfare of America’s Indian tribes were in grave danger, and demands for justice grew as many Indians rallied under the larger Red Power movement, which advocated for greater Indian self-governance and reduced influence from the American federal government. The largest victory of the movement came in 1976, when the Indian Health Care Improvement Act was passed, transferring the power of managing the IHS to the tribes themselves, with many IHS facilities having since been taken over by regional tribal authorities.

Throughout the history of the United States, the balance between the power of the federal government and the interests of the country’s indigenous people has been redefined, tested, and broken several times over. The question of what place, if any, American Indians have in the vision of an equal, prosperous country continues to be asked today. The mass sterilization of Native American women in the 1970s is just one example of how easily power can be abused, and how easily that abuse can be ignored or forgotten.

Eugenics in America Part II: African-Americans

W.E.B. Du Bois, black activist and eugenics advocate (Smithsonian)

The story of eugenics in the United States and the concurrent social movements for the interests of African Americans are deeply intertwined. History has revealed that there were actually African American supporters on both sides of the eugenics argument, but usually for different reasons than their white counterparts. The relationship that black activists had with eugenics in a given time period can provide an insight into the changing goals and reasoning of the centuries-long struggle for racial justice.

As eugenics began to gain prominence in the late 19th century, some African Americans, despite the mainstream movement often labeling those of African descent as an “unfit” group, saw it as a possible way to improve their race. While some African Americans believed in protecting the racial purity of the black race (such as Marcus Garvey), or even that the black race itself was inferior (such as William Hannibal Thomas), the majority of black eugenics proponents saw the sorting of “fit” and “unfit” groups as something no different from breeding cows or corn.

W.E.B. Du Bois was a leading intellectual within the black community and a strong advocate for “assimilationist eugenics”. He believed that it was the responsibility of the African American community to lift itself out of its current state, not just through social or environmental changes, but by selecting which of its members should procreate. Du Bois also claimed that the mixed-race children born to white slave owners (and the descendants of those children) were partially responsible for black moral decay, as they carried the genes of perverted adulterers. He observed that, like any other race, the black race contained individuals who possessed traits that were desirable or defective. One of Du Bois’ famous ideas was that of the “Talented Tenth”: he believed that only the best of the race would be able to save the whole. All the while, he insisted that the white and black races were equal, and that the differences alleged by contemporary white scientists were the result of class and environment rather than genetics.

Another prominent African American proponent of eugenics was Dr. Thomas Wyatt Turner. In contrast to Du Bois’ balanced emphasis on both nature and nurture, Turner doubled down on the ideas of biological determinism and the importance of one’s genetic background. He helped reshape the mainstream ideas of eugenics into a form that better fit the notion of racial equality. Turner’s ideas were taught to thousands of black students at Howard, Tuskegee, and Hampton. In fact, a 1915 exam from Turner’s class at Howard University read, “Define Eugenics. Explain how society may be helped by applying eugenic laws”. In the end, Turner was hugely responsible for popularizing eugenics both among the black elite and, through his volunteer lectures, among the general African American population. Years later, the NAACP, which Turner had helped found, would hold baby contests (yes, baby contests) that were no doubt tied to the ideas that Turner helped spread.

While eugenics was viewed favorably by many African Americans for decades, the emerging civil rights movement of the mid-20th century saw the idea rapidly fall out of favor. Eugenics policies that disproportionately affected African Americans, such as those of the North Carolina Eugenics Board, were common in the United States, especially in the South. One policy of the generally progressive President Lyndon B. Johnson was the allocation of federal funding towards birth control in low-income communities. Although the more radical, black nationalist faction of the civil rights movement already opposed the moderate reforms of the Johnson administration, this particular action sparked widespread outrage, since many saw it as an attempt to limit the black population and thereby suppress its influence in the United States.

The popularity of eugenics plummeted across racial communities by the 1950s and 60s, especially in response to the atrocities of the Nazi regime, which had adopted eugenics and Aryan superiority as a basis for its ideology. The fall of eugenics was particularly pronounced in the African American community, which highlighted the hypocrisy of fighting injustice abroad while it persisted at home. It was not long before eugenics was seen to be as conducive to black empowerment as lynchings or poll taxes.

Eugenics in America Part I: Buck v. Bell

Carrie and Emma Buck (Encyclopedia Virginia)

Eugenics, though a concept present to some degree for a large part of human history, began to gain significant traction among some Western intellectual and political circles in the 19th century. Advocates for eugenics argue that certain genetic traits in humans are more desirable than others, and that those who possess undesirable traits should be sterilized or otherwise removed from the genetic pool. Some eugenicists support the ideology on the basis that certain racial or ethnic groups are superior to others, while others seek to eliminate certain physical or mental disabilities from the population. While Nazi Germany and the Greek city-state of Sparta are probably history’s most famous proponents of eugenics, the practice also has an unfortunate history in the United States. This three-part article series will attempt to briefly survey three different chapters in the history of eugenics in America.

Carrie Buck was born on July 3, 1906, in Charlottesville, Virginia, to Emma and Frederick Buck. Frederick abandoned the family shortly after Carrie’s birth, and Emma was later admitted to the Virginia State Colony for Epileptics and Feebleminded, an institution that housed Virginians deemed mentally unfit to be a part of society. Carrie Buck initially had a normal childhood under her new foster parents, earning average grades in school before being removed, as was relatively common for young girls at the time, to help with domestic work. However, her life was forever changed at age 17, when she was raped and impregnated by her foster mother’s nephew. To cover up the family’s embarrassment from the incident, her foster parents committed Buck to the same institution as her mother, accusing her of feeble-mindedness and promiscuity. Carrie Buck’s newborn daughter, Vivian, was deemed to be similarly mentally feeble, although she later excelled in school. Shortly after Carrie’s admittance, the Colony’s Board of Directors authorized her sterilization via salpingectomy, an irreversible procedure that removes the patient’s Fallopian tubes.

Seeking to test the legal legitimacy of the forced sterilizations authorized by the Virginia Sterilization Act of 1924, the Colony’s superintendent, Albert Sidney Priddy (later succeeded by John Hendren Bell), asked Buck’s state-appointed guardian, Robert G. Shelton, to challenge the order for her sterilization. Shelton appealed the order first to the Amherst County Circuit Court and then to the Supreme Court of Virginia. After the order for Carrie’s sterilization was affirmed in each of those lower courts, he appealed one final time to the highest court in the country, the United States Supreme Court, in Buck v. Bell, Superintendent of State Colony for Epileptics and Feeble Minded.

Buck’s attorney, Irving P. Whitehead, argued that the Due Process Clause of the Fourteenth Amendment prohibited the Commonwealth of Virginia from performing involuntary sterilization, because a citizen was being deprived of her rights without due process of law. Meanwhile, the Colony’s attorney, A. E. Strode, cited the apparent (though not actual) genetic defects in the Bucks’ bloodline, arguing instead that the sterilization was justified on the premise that removing those defects from the Commonwealth’s collective gene pool was in the best interests of the state. On May 2, 1927, the Court delivered its verdict, ruling 8-1 in favor of Bell. The lone dissenter in the case was Justice Pierce Butler, whose Catholic faith likely influenced his decision. He did not write a dissenting opinion.

From the majority opinion of Buck v. Bell (1927):

It is better for all the world if, instead of waiting to execute degenerate offspring for crime or to let them starve for their imbecility, society can prevent those who are manifestly unfit from continuing their kind.

Three generations of imbeciles are enough.

The significance of Buck reaches far beyond the case of Carrie Buck herself. It essentially legitimized all similar eugenic practices in the United States, and is no doubt one of the largest stains on the country’s legacy when it comes to eugenics. In fact, defense teams for Nazi officials during the Nuremberg trials cited Buck in their arguments, using the decision to expose a terrible hypocrisy in American criticism of Nazi ideology during WW2.

While Skinner v. Oklahoma ruled in 1942 that criminals could not be sterilized as punishment for their crimes, it did not ban the type of state-mandated sterilization that was affirmed by Buck. Forced sterilization of ordinary citizens deemed mentally unfit remained legal in Virginia until 1974, with the last such operation in the country being performed in Oregon in 1981. Today, Buck v. Bell stands as a symbol of the struggle for disability rights in the United States, and demonstrates how disabled Americans were one of the many groups in the country who faced outright discrimination and oppression during this period.

The Brazilian Expeditionary Force

FEB soldier loads artillery with shell inscribed with A COBRA ESTÁ FUMANDO; THE SNAKE IS SMOKING (Getty Images)

The Second World War is a conflict well catalogued and studied by high school students, history buffs, filmmakers, writers, and scholars. But as significant and consequential as the war was to every corner of the world, there would inevitably be some stories forgotten by its popular history. One such story, at least outside of South America, is that of the 25,000-strong Brazilian Expeditionary Force (Força Expedicionária Brasileira, or FEB).

Prior to its entry into the war, Brazil had been a valuable trading partner to the Allied powers, and had even allowed the United States to construct air bases on its soil. Anti-Axis sentiments began to mount as Brazilian merchant ships were sunk by German U-boats, and in August of 1942, Brazil declared war on the Axis powers. Initially, Brazilian support to the Allies was no different from that of other South American countries — providing much-needed war material by becoming a key link in the supply chain across the Atlantic and into Africa. However, Brazilian leaders soon realized that sending an actual military force to the Allies’ aid would be a symbolic commitment to their cause, and would improve Brazil’s position at the negotiating table once the war came to an end.

In addition to its main infantry division, the FEB included a fighter squadron, and was supported by the Brazilian Navy. In the summer of 1944, the first Brazilian troops arrived in Naples, merging into a larger American force that was already fighting a brutal campaign in Italy. Their nickname was the Cobras Fumantes (“Smoking Snakes”), after a running joke that it was more likely for a snake to smoke than for the FEB to see any actual combat.

From the memoirs of Mark Clark, commander of the U.S. Fifth Army:

The performance of the Brazilians was, of course, important politically as well as militarily. Brazil was the only Latin American country to send an expeditionary force to take part in the European war, and, naturally, we were eager to give them a chance to make a good showing.

While the small force was not hugely impactful when considering the massive scale of military operations during WW2, its victories were nonetheless remembered in the hearts and minds of the Brazilian people. Perhaps the greatest of these was the Battle of Collecchio, in which the FEB surrounded and captured two German infantry divisions on April 29, 1945, just days before the fall of Berlin, the collapse of Nazi Germany, and the end of the war in Europe.

The legacy of the FEB in Brazil can be considered somewhat complicated, especially considering the various roles FEB veterans played during the 1964 Brazilian coup d’état. However, it can generally be said that the Brazilian Expeditionary Force is an enduring symbol of national pride for Brazil, and serves as a testament to the bravery and dedication of countless individuals during WW2, particularly those from countries whose roles in the war are not as well known.

Operation Wetback

Migrants in El Centro, CA await deportation (LA Times Archive)

Mexican immigration to America has been significant to the history of both countries for as long as they have shared a border. The continued flow of Mexican migrants has been met with a multitude of laws, policies, and doctrines from the United States over the years, each of which represents, to some extent, the broader social and political conditions of its time.

In the decades leading up to the Second World War, hundreds of thousands of Mexican immigrants entered the United States both legally and illegally, primarily to work on farms in the rural Southwest. Their diasporic communities formed and grew quickly, creating a new generation of Mexican Americans. By the outbreak of the war, the American government was in need of cheap labor to fuel the war effort, both because of the increased demand for manufactured and agricultural goods, and because millions of young men had left the traditional workforce to serve overseas. In response, the governments of the US and Mexico struck a deal known as the Bracero program, which allowed more Mexican laborers to enter the States on short-term contracts. The program eventually brought over four million braceros.

Despite the program, illegal immigrants continued to flow into the country, much to the concern of the United States. Under the Eisenhower administration in 1954, a series of deportations was authorized under the name Operation Wetback. U.S. Border Patrol agents began mass sweeps across the country. Hundreds of thousands of Mexicans, some of whom were American citizens, were packed into trucks, boats, or planes, and shipped back to Mexico. Stuck in a place they were not familiar with, with no guarantee of jobs, food, water, or shelter, they had to rebuild their lives from scratch. While the federal government boasted that it had successfully deported over a million illegal immigrants in just a few months, the true number is likely lower, since many of those deported returned to the United States several times, only to be deported once more.

Operation Wetback was, overall, a failure. Both the Bracero program and illegal immigration far outlived any consequences that came as a result of the operation, other than the continuing legacy of anti-Mexican sentiment in the United States. In fact, the sudden deportation of such a large number of Mexican laborers increased an already high demand for cheap labor, thus also increasing illegal immigration to the United States as a whole. It is also important to note the operation’s name: “wetback” is today recognized as a highly offensive slur against Mexican Americans, further tarnishing the operation’s legacy.

The operation reentered the minds of mainstream America during the 2016 Republican presidential primaries, when eventual winner Donald Trump cited it both as precedent and as evidence of the feasibility of his proposed immigration policy, which included the mass deportation of the millions of illegal immigrants living in the United States.