The Baltic Way

Demonstrators in Estonia join hands and wave the flag of the dissolved (but now restored) Republic of Estonia. (Jaak Künnap)

At around 7:00 PM local time on 23 August 1989, an ambitious photojournalist finds himself in a helicopter flying over a major highway in Latvia. Peering out of the helicopter’s window with his camera in hand, he can hardly believe his eyes. On the ground below lies a massive human chain of demonstrators holding hands along the length of the road, its ranks stretching as far as the horizon. The historic protest this photojournalist witnessed is a key moment in the history of Eastern Europe, and an important step towards dismantling the old social and political order imposed by the now-weakening Soviet Union.

Known today as the three “Baltic states”, Estonia, Latvia, and Lithuania were, for much of their recent history, under the thumb of a larger, more powerful political entity. They were, after all, situated close to several major historical powers, including Sweden, Germany, and Russia. Still, these countries had long been adamant in preserving their national identities and cultures, even in the face of powerful opposition. In few other instances has this been more true than during the late stages of the Baltic states’ membership in the Soviet Union.

On 23 August 1939, Nazi German leader Adolf Hitler and Soviet leader Josef Stalin came to an agreement known as the Molotov-Ribbentrop Pact. Under this agreement, the land lying between the two countries, which included Poland, the Baltic states, and others, would be divided into “spheres of influence” for both Germany and the Soviet Union. While Poland was split between the two powers, the Baltic states were left entirely to the Soviets. Using both military and political means to pressure them, Stalin essentially bullied the Baltic states into forming the Estonian, Latvian, and Lithuanian Soviet Socialist Republics. The process of Sovietization of the three nations included the deportation of tens of thousands of citizens who were deemed “hostile” by Soviet officials.

Fast forward fifty years from that fateful agreement and one finds a Baltic region weary of decades of Soviet control, which, among other perceived violations of Baltic autonomy, included the purposeful introduction of Russian migrants (who influenced local policy in favor of the central Soviet authority in Moscow) and policies that suppressed the expression of the languages and cultures of the three Baltic nations. Meanwhile, the Soviet Union officially maintained the stance that all three nations had voluntarily joined the USSR, rather than being forced to do so by Nazi or Soviet political maneuvering. Leading up to the 50th anniversary of the Pact, tensions and rhetoric surrounding reform or even independence in the Baltic states surged. Soviet authorities condemned these movements as harmful “nationalism” and prepared for a possible military crackdown on the region.

With the significance of the 50th anniversary in mind, local officials in all three nations began to plan one massive demonstration they hoped would capture the attention of the entire world. Though it is not clear who came up with the idea of a massive human chain, the concept was communicated to political and social organizations across Estonia, Latvia, and Lithuania, and a plan was approved. The chain would link Tallinn, Riga, and Vilnius, the three capitals of the Baltic nations. Organizers determined that in order for the plan to work, around 1,500,000 participants would be needed. 

According to most estimates, the expected numbers were met, and perhaps even exceeded on that fateful Wednesday evening. Along the highways connecting the Baltic capitals, demonstrators held hands and sang national hymns. The flags of all three nations flew proudly in the wind, as reporters swarmed to capture the spectacle. Solidarity demonstrations were held in cities across the world, including Moscow (although police quickly dispersed that demonstration). Though the whole ordeal lasted little more than 15 minutes, up to two million participants were estimated to have taken part in this historic demonstration, which amounted to a quarter of the total population of the three Baltic nations. 

The event was highly publicized in media across the world, bringing international attention to the issue of Baltic independence. Though the demonstration was initially denounced by Soviet media, which again claimed that the protest was little more than a manifestation of harmful nationalist rhetoric, it did cause Soviet leader Mikhail Gorbachev to reconsider the issue of Baltic independence. In December of 1989, mere months after the protest, Gorbachev signed an official condemnation of the secret protocols of the Molotov-Ribbentrop Pact. After free elections were held in the Soviet Union for the first time in 1990, pro-independence candidates in the Baltic countries were voted into public office. And finally, by the end of the following year, the Republic of Estonia, the Republic of Latvia, and the Republic of Lithuania were established and internationally recognized as free and independent countries.

Today, the Baltic Way remains one of the largest peaceful demonstrations in history. It has been a source of inspiration for protestors in places such as Catalonia and Hong Kong, who have emulated its human chain. The independence movements of the Baltic nations contributed to the total collapse of the Soviet Union and its stranglehold on Eastern Europe. All three of the Baltic nations now sit comfortably outside of the Russian sphere of influence as members of both NATO and the EU, and they are among the most developed and prosperous countries in the world.

Charles Hamilton Houston: “The Man who Killed Jim Crow”

Houston delivers his argument. (Charles Hamilton Houston Institute)

The era of “Jim Crow”, a period of American history marked by widespread societal and legal discrimination against African-Americans (especially in the South), is generally considered to have ended with the Civil Rights Movement in the 1960s. Famous leaders from this period, such as Dr. Martin Luther King Jr. and Malcolm X, are widely celebrated, since their actions took place at the climax of the movement. Less appreciated, however, are the many leaders who paved the way for this celebrated generation of activists, many of whom, including the subject of today’s article, never lived to see the fruits of their labor.

Charles Hamilton Houston was born in 1895 to a middle-class African-American family in Washington D.C. Houston’s father, William, was an attorney. Houston was described as a brilliant child, graduating from Dunbar High School at the age of 15. He then went on to Amherst College in Massachusetts, where he graduated as one of six valedictorians in his class. Following a brief stint teaching English at Howard University, Houston applied to serve as an officer in the United States Army upon the country’s entry into World War I. This was a formative experience for the young Houston, who witnessed the constant bigotry and racism present in a still-segregated army. He was determined to use the law to right the wrongs he saw everywhere. Houston returned to the United States in 1919, shortly after the war’s end. He enrolled in Harvard Law School, becoming the first African-American to serve as an editor of the Harvard Law Review, and graduated with honors in 1923. Houston was soon admitted to the District of Columbia Bar, where he would begin to practice law alongside his father. He also aided in the creation of the National Bar Association, which, unlike the dominant American Bar Association, recognized and accredited African-American attorneys.

Beginning in 1924, Houston returned to Howard University, only this time teaching law instead of English. Mordecai Johnson, the university’s president, saw potential in Houston, and allowed him a significant role in reforming Howard Law School. Although it was responsible for training three fourths of the country’s Black lawyers, Howard Law School still only held part-time night classes. After Houston was appointed vice-dean (effectively with the powers of a dean) of the law school in 1929, he helped bring about its transition into a full-time law school. In his new role as the head of the central institution of African-American legal education, Houston envisioned a new generation of Black lawyers who could use their skills for the advancement of their people. Among his students were James Nabrit, Oliver Hill, Spottswood Robinson, and Thurgood Marshall. Houston’s role in fighting Jim Crow, however, was not limited to the classrooms of Howard University. Rather, by working with the attorneys he had himself trained at Howard, Houston was able to make considerable strides towards racial equality under the law.

Resigning from his post at Howard in 1935, Houston would spend the remainder of his life working on civil rights law. He assumed the position of first special counsel to the National Association for the Advancement of Colored People (NAACP). One of his first cases following his departure from Howard was Hollins v. State of Oklahoma, which concerned a Black man sentenced to death by an all-white jury. Houston and his all-Black defense team were able to prevent the man from being executed. Though it was a goal of Houston’s to rid American juries of racial exclusion, it would be decades before that became a reality. Another of Houston’s primary concerns was the segregation of public schools, which had been deemed constitutional by the 1896 Supreme Court case Plessy v. Ferguson under the doctrine of “separate but equal”. He would dedicate much of his work towards attacking this doctrine, which he believed was the keystone for much of Jim Crow’s stranglehold on the South. Alongside Thurgood Marshall and the Baltimore branch of the NAACP, Houston argued Murray v. Pearson before the Maryland Court of Appeals. The case concerned Donald Gaines Murray, an applicant to the University of Maryland School of Law who was rejected due to his race. The court ruled in Murray’s favor, and ordered the school to admit him. This ruling, however, did not mean the end of segregation in America’s, or even in Maryland’s, schools. The court noted that the Equal Protection Clause of the Fourteenth Amendment applied only because the University of Maryland School of Law was the only law school in the state. In theory, a separate Black-only law school could legally exist in Maryland. Nonetheless, this was heralded as a victory for Houston and his devoted followers.

The precedent of outlawing segregation in institutions that were the only ones of their kind within their state was carried on to the federal level, thanks to Houston’s work in Missouri ex rel. Gaines v. Canada. This case was very similar in background to Murray, with the added impact that it started to raise doubt within the Supreme Court of the United States about the legitimacy of “separate but equal”. Still, however, the doctrine remained the official legal precedent in American law. Concurrent with his struggle towards desegregating American schools was Houston’s battle against racist housing covenants. These were legally binding contracts attached to properties that restricted who could purchase them, which often meant discrimination against prospective Black homeowners. Using these covenants, real estate developers could directly control the demographics of the neighborhoods they built. In 1948 the Supreme Court ruled in Shelley v. Kraemer that the enforcement of these covenants by state or local authorities was unconstitutional, thus ending a decades-long battle by Houston and the NAACP. Though Houston himself did not argue before the court, his advice to, and connections with, the Howard Law School alumni who did are another example of Houston’s vital role in dismantling Jim Crow on multiple fronts.

Charles Hamilton Houston died of a heart attack on April 22, 1950, at the age of 54. Just four years after his death came the landmark decision Brown v. Board of Education, which successfully overruled the doctrine of “separate but equal”. The case was argued by Thurgood Marshall, director-counsel of the newly established NAACP Legal Defense Fund and one of Houston’s most loyal disciples. In 1967, Marshall would be appointed by Lyndon B. Johnson as the first African-American justice to serve on the Supreme Court of the United States.

We owe it all to Charlie.

– Thurgood Marshall

The Lavender Scare

Frank Kameny (second picketer in line) and other activists protest outside the White House (Washington Post)

Happy Pride Month from HFE! 🏳️‍🌈

Many Americans who have learned about the United States government of the 1950s will be familiar with the practices and consequences of McCarthyism. Named after Senator Joseph McCarthy, McCarthyism refers to an era in American politics in which many government officials were accused of harboring communist sympathies, and therefore of being disloyal to the United States. It is considered to be one of the primary effects of the Second Red Scare, a wave of fears and resentments towards communist and socialist ideologies that swept the country during the 1950s. As harmful as this political witch hunt was to many individuals, it often takes the spotlight from a concurrent practice that was equally damaging to the careers and reputations of thousands of American civil servants: the Lavender Scare. The term Lavender Scare, coined by LGBTQ historian David K. Johnson, describes the direct, pervasive discrimination against suspected homosexuals, bisexuals, and other members of the LGBTQ community within the United States federal government. The Scare resulted not only in the expulsion of thousands of competent, loyal federal workers, but also in the smearing of their reputations and the exposure of their sexual identities to a hostile public.

In 1952, homosexuality was deemed a mental illness by the American Psychiatric Association. Most of the government, academia, and society as a whole saw homosexuality as sexual perversion, no different from pedophilia or bestiality. But while LGBTQ Americans had always faced intolerance wherever they went, it was not until the 1950s that they became associated with treachery, espionage, and ties to communism, which together constituted the primary concerns of McCarthyism. Many argued that because the majority of homosexuals were closeted, they could easily be controlled by blackmail from foreign agents, and therefore could not be trusted in the federal government. While the claim itself was unfounded at best and bigoted at worst, the existing hostile sentiment towards LGBTQ persons allowed systemic discrimination against the group to proceed with virtually no resistance from those in power.

While subtle means of rooting out suspected homosexuals had long existed in certain government institutions such as the State Department and the Armed Forces, 1950 became a key year in the Lavender Scare. In June of that year, the US Senate began investigating the “threat” that homosexuals in government posed to the United States. The Subcommittee on Investigations released its findings in December, declaring that homosexuals were a real and present threat, and that it was necessary to remove them from their positions because they were predisposed to commit acts that were morally wrong, illegal, or even treacherous.

From the original Senate report, “Employment of Homosexuals and Other Sex Perverts in Government”:

It is the opinion of this subcommittee that those who engage in acts of homosexuality and other perverted-sex activities are unsuitable for employment in the Federal Government. This conclusion is based upon the fact that persons who indulge in such degraded activity are committing not only illegal and immoral acts, but they also constitute security risks in positions of public trust.

Just three years after the Senate report, Executive Order 10450 was signed by President Dwight D. Eisenhower, allowing the FBI, the Civil Service Commission, and the federal agencies themselves to investigate federal workers suspected of “criminal, infamous, dishonest, immoral, or notoriously disgraceful conduct, habitual use of intoxicants to excess, drug addiction, or sexual perversion”. Under this order, federal workers became subject to security investigations that delved deep into their professional and personal lives. Workers could be investigated, interrogated, and eventually fired on suspicions as arbitrary as receiving numerous calls from a member of the same sex, or wearing clothes deemed unfit for their apparent gender. An estimated 5,000 federal workers were fired as a direct result of Executive Order 10450, while thousands more hoping to work for the federal government were barred from employment. One particularly disturbing case is that of Andrew Ference, a young research assistant who worked at the American embassy in Paris. After he was discovered to have been living with a male partner, he was subjected to harsh investigations and interrogations. Ference, fearing that he would lose his dream job, was under intense mental stress. On September 7, 1954, while his investigation was still ongoing, he committed suicide in his Paris apartment.

In 1957, Frank Kameny, a Harvard-educated astronomer in the US Army Map Service, was fired from his position for refusing to disclose information about his sexuality, and was later permanently barred from any future federal employment. Furious at the injustice he had been subjected to, Kameny went on to co-found the Mattachine Society of Washington, one of America’s first gay rights activist organizations. The Society conducted some of the first gay rights protests by openly LGBTQ persons, many of whom were former federal employees who had been fired due to their sexual or gender identity. Activists such as Kameny continued to fight all the way into the 1990s, when President Bill Clinton finally banned sexual-orientation-based discrimination in federal employment and in the granting of federal security clearances.

While the Lavender Scare no doubt left a legacy of homophobia in the American federal government and bureaucracy, it also left an important legacy of LGBTQ activism. The actions of bold leaders like Frank Kameny were the first steps in a continuing fight towards true equality for all LGBTQ Americans.

The Rhodesian Bush War: Causes and Legacy

Key officials agree to the terms of the Lancaster House Agreement, the peace agreement that would lead to the full independence of Zimbabwe (The Guardian)

By the midpoint of the 20th century, the old colonial empires of Europe were beginning to come apart. The aftermath of the Second World War had ushered in a new era of peace, and created a growing distaste for imperialism and nationalism as a whole. The part of the world most affected by centuries of colonialism was the continent of Africa, which, with few exceptions, was at one point the territory of one European nation or another. Africa’s transition into its post-colonial era is one full of triumph and tragedy, and is a process that arguably continues to this day. Perhaps the best known story from post-colonial Africa is that of South Africa, whose peaceful transition from an apartheid-ridden state into a (somewhat) fair and equal democracy inspired much of the world. However, today’s story will revolve around its neighbor to the north, the former British colony of Rhodesia, and the country it is known as today: Zimbabwe.

Though the lands that eventually became Rhodesia/Zimbabwe were occupied by a variety of peoples, they were primarily ruled by the Ndebele Kingdom, founded by Mzilikazi, a breakaway Zulu general who had served under the famed King Shaka Zulu. The Ndebele soon grew very powerful, and by the time Europeans arrived at their door, they ruled the many tribes of Zimbabwe under a tribute system, including the Shona people, who had previously been the dominant force in the region. The divisions between the Shona and the Ndebele would continue well after the latter came into power in the mid-19th century. In 1888, after several years of British presence in the area, Mzilikazi’s son and successor, King Lobengula, agreed to a deal with the cunning, imperialist, white supremacist tycoon Cecil Rhodes that conceded the region’s mining rights. Rhodes subsequently used the concession to obtain a royal charter from the British Crown for his British South Africa Company (BSAC), solidifying the region as a British colony under company rule. The new territory owned by BSAC was named Rhodesia, whose namesake was, not surprisingly, Cecil Rhodes himself. White settlers soon flooded into Rhodesia, while its native people, Shona and Ndebele alike, were forced onto “tribal trust areas”, which filled a similar role as North American Indian reservations. The native population was treated as second-class citizens in their own home; such tensions between black African and white British factions are a common theme in many former European colonies on the continent, and Rhodesia was no exception.

In 1923, the colony, by then called Southern Rhodesia to differentiate it from the newly created Northern Rhodesia (now Zambia), officially became a self-governing colony of the British Empire, making it effectively independent as a state, though still technically under British rule. Following the Second World War, the British government (in accordance with its decolonization policy) pressured Southern Rhodesia to end its minority rule and expand suffrage to its black African population. The colony was ruled entirely by its 80,000 whites, while its 2.5 million blacks still lived essentially as colonial assets. Not wanting to give up its domination of the country, the white leadership of Rhodesia (which reverted to the old name once Northern Rhodesia had become Zambia) declared independence in 1965, a shocking move that was technically the first of its kind since the American Revolution. The new nation of Rhodesia, under Prime Minister Ian Smith, was now totally independent, though it did not receive any real international recognition. Rhodesia received harsh economic sanctions and condemnations not just from the British, but from the entire international community.

In 1964, shortly before Rhodesia’s Unilateral Declaration of Independence (UDI), the conflict now known as the Rhodesian Bush War began with a minor skirmish between Rhodesian forces and one of the two emerging Marxist, African nationalist groups. Formerly one entity, these two disparate parties were the Zimbabwe African People’s Union (ZAPU; its military wing named ZIPRA) and its breakaway group, the Zimbabwe African National Union (ZANU; its military wing named ZANLA). The generations-long division between the Ndebele and Shona peoples was key to the split, as the two ethnic groups dominated ZIPRA and ZANLA, respectively. Throughout the war, ZIPRA and ZANLA would occasionally fight each other for better regional control. Opposite both of these factions stood the Rhodesian Security Forces, a well-equipped, professional army that had considerable air power and its own SAS special forces unit, and that consisted of both white and black units. ZIPRA and ZANLA, having both been expelled from the country and into Zambia, conducted the first phase of the war (1964-1972) through a number of battles along and across the Zambian border. The Rhodesians thoroughly defeated the rebels in this first phase, even becoming confident enough to release rebel leaders such as Robert Mugabe, whom they deemed to no longer be a threat.

However, as the Rhodesians were celebrating their apparent victory, ZIPRA and ZANLA weren’t just licking their wounds, but were also gearing up for a second phase. Due to the involvement of communist and non-communist factions, Cold War politics inevitably found their way into the conflict. The United States covertly supported Rhodesia due to the Rhodesian Front’s strong anticommunist sentiments, while South Africa provided ground forces to fight alongside the Rhodesians. More importantly, ZIPRA was strongly backed by the Soviet Union, while ZANLA received support from the People’s Republic of China. The advisors and resources provided by their powerful allies gave the insurgents a tremendous advantage coming into the second phase (1972-1979). ZANLA’s relocation to Mozambique also meant that the Rhodesians needed to fight along the Mozambican border as well as the Zambian one, further worsening the situation for Ian Smith’s government. As South Africa pulled out of the conflict and more insurgents entered the country, the situation for Rhodesia became desperate, with the Rhodesian Security Forces even resorting to the use of deadly chemical and biological weapons. It soon became clear that white minority rule was no longer tenable, and a gradual political transition was attempted, which led to the election of the first black prime minister and president of a newly named, but short-lived, state: Zimbabwe-Rhodesia. But for the warring ZANU and ZAPU, this was still not enough, and fighting continued until the Lancaster House Agreement in December 1979. The agreement, brokered by the British government under the newly incumbent Prime Minister Margaret Thatcher, granted Zimbabwe full and sovereign independence, while also allowing the belligerent ZANU and ZAPU parties to hold office. After a decade and a half of brutal conflict, Zimbabwe was finally an independent nation.

In 1980, ZANU leader Robert Mugabe was elected Prime Minister, and despite the popularity he enjoyed for his anti-imperialist heroics, the election essentially began Mugabe’s 37-year rule as a brutal and corrupt despot. While the story of post-war Zimbabwe and Mugabe’s dictatorship is all too common among unstable African nations affected by colonialism, it is perhaps Rhodesia itself that has the most interesting legacy from the whole ordeal. Unlike South Africa, which peacefully transitioned from white minority rule into a liberal democracy, the racist sentiments and overall brutality that caused its civil war still follow Rhodesia long after its dissolution. In the last few years, an odd nostalgia for Rhodesia has appeared in certain alt-right and white supremacist communities, especially in the United States. Dylann Roof, the perpetrator of the racially motivated 2015 Charleston church shooting, published his hateful, violent manifesto on a website called The Last Rhodesian, and a picture surfaced of him wearing a jacket bearing the Rhodesian flag. Meanwhile, certain social media communities celebrate the memory of Rhodesia, with some claiming that the country was better off under white rule. As strange as Rhodesia’s modern legacy may seem, there can be no doubt that European colonialism as a whole, and the countless movements that resulted from it, have shaped the social fabric of the world in more ways than one.

Eugenics in America Part III: Native American Women

Cartoon from the US Department of Health, Education, and Welfare (HEW), encouraging American Indians to undergo a sterilization procedure. The left panel depicts tired parents with many children and only one horse; the right depicts happy, active parents with a single child but many horses. (Akwesasne Notes, via UC Berkeley Law)

Since the arrival of the first European colonists, the indigenous population of the Americas, especially the more sparsely populated tribes in modern-day Canada and the United States, has been in grave danger due to the multitude of threats posed by colonialism and its legacy. Among these threats, the most commonly known in popular history are likely the deadly smallpox epidemics and the forced relocation of tribes onto reservations, both of which largely took place before the end of the 19th century. But a far more recent assault on the civil rights and autonomy of American Indians was brought to light well within living memory, when tens of thousands of Native American women were coerced into dangerous and effectively irreversible sterilization procedures. The number of women sterilized, though small in the context of the total US population, is massive when taken as a proportion of the Native American population; far higher than in any other ethnic group in the United States.

In 1955, the Indian Health Service (IHS) was founded after the authority to oversee Indian health concerns was transferred from the Bureau of Indian Affairs (then known as the Office of Indian Affairs) to the Department of Health, Education, and Welfare. The IHS aimed to provide necessary care to the millions of Native Americans living on reservations across the country, and was, in fact, successful in ensuring that more people had their healthcare needs properly addressed. However, most of the doctors of the IHS were not Indians themselves, and some thus held the same prejudices against Native Americans that many others did at the time. Many were under the belief that Indians were inherently intellectually and morally inferior, and that they could not be trusted to manage their own health. These assumptions, though problematic in and of themselves, became especially concerning just a decade after the founding of the IHS, when the Service began to provide family planning services to its patients.

The United States government had long been concerned about the extremely high birth rates in many Indian communities, with some tribes averaging up to 4 children for every adult mother in 1970, double the rate of America’s white population. Many attributed the problems of poverty, drug abuse, and overall social decay to the rapidly rising Indian population. The family planning program was meant to advise patients about different methods of birth control, but the prejudice against the Indians—taking the form of a flawed dynamic in which the doctors held a superior intellectual and authoritative position over their patients—became very apparent as many patients were coerced into receiving treatments they would have otherwise refused. The two most common procedures for women were tubal ligations (colloquially known as “getting one’s tubes tied”), in which the Fallopian tubes are blocked, and the far more dangerous hysterectomy—the complete removal of the uterus. Both procedures are extremely difficult or impossible to reverse and are considered permanent forms of birth control.

Despite the gravity of these procedures, patients often did not have an interpreter through whom they could clearly communicate with their doctor, while the doctors themselves often omitted any mention of a procedure’s permanency or other long-term effects. Several Indian women later interviewed also claimed that the IHS, as well as other welfare agencies, threatened to cut their benefits should they choose to have another child. Perhaps the most coercive technique, however, was the threat of losing one’s children to foster homes, adoption, or boarding schools—a fear deeply rooted in the culture of Canadian and American Indian tribes.

From Jane Lawrence’s essay “The Indian Health Service and the Sterilization of Native American Women”, American Indian Quarterly, via University of Nebraska Press:

A young Indian woman entered Dr. Connie Pinkerton-Uri’s Los Angeles office on a November day in 1972. The twenty-six-year-old woman asked Dr. Pinkerton-Uri for a “womb transplant” because she and her husband wished to start a family. An Indian Health Service (IHS) physician had given the woman a complete hysterectomy when she was having problems with alcoholism six years earlier. Dr. Pinkerton-Uri had to tell the young woman there was no such thing as a “womb transplant” despite the IHS physician having told her that the surgery was reversible. The woman left Dr. Pinkerton-Uri’s office in tears.

Estimates for the proportion of American Indian women sterilized in the 1970s are almost dumbfounding, ranging from 25 to 50 percent of those of childbearing age. Dr. Constance Redbird Pinkerton-Uri, a Choctaw/Cherokee physician and advocate for Indian interests, stated that the mass sterilization was motivated not by a desire to reduce the native population, but by a flawed idea that the solution to poverty was to limit the number of children a family could have. Others, however, such as Northern Cheyenne tribal judge Marie Sanchez, viewed it as a modern form of genocide; a continuation of the injustices perpetrated by the United States government against Native Americans. Whatever the motive, it was clear that the sovereignty and welfare of America’s Indian tribes were in grave danger, and many Indians began to demand justice, rallying under the larger Red Power movement, which advocated for greater Indian self-governance and reduced influence from the American federal government. The largest victory of the movement came in 1976, when the Indian Health Care Improvement Act was passed, transferring the power of managing the IHS to the tribes themselves; many IHS facilities have since been taken over by regional tribal authorities.

Throughout the history of the United States, the balance between the power of the federal government and the interests of the country’s indigenous people has been redefined, tested, and broken several times over. The question of what place, if any, American Indians have in the vision of an equal, prosperous country continues to be asked today. The mass sterilization of Native American women in the 1970s is just one example of how easily power can be abused, and how easily that abuse can be ignored or forgotten.

Eugenics in America Part II: African-Americans

W.E.B. Du Bois, black activist and eugenics advocate (Smithsonian)

The story of eugenics in the United States and the concurrent social movements for the interests of African Americans are deeply intertwined. History has revealed that there were actually African American supporters on both sides of the eugenics argument, but usually for different reasons than their white counterparts. The relationship that black activists had with eugenics in a given time period can provide an insight into the changing goals and reasoning of the centuries-long struggle for racial justice.

As eugenics began to gain prominence in the late 19th century, some African Americans, despite the mainstream movement often labeling those of African descent as an “unfit” group, saw it as a possible way to improve their race. While some African Americans believed in protecting the racial purity of the black race (such as Marcus Garvey), or even that the black race itself was inferior (such as William Hannibal Thomas), the majority of black eugenics proponents saw sorting “fit” from “unfit” individuals as no different from breeding better cows or corn.

W.E.B. Du Bois was a leading intellectual within the black community, and a strong advocate for “assimilationist eugenics”. He believed that it was the responsibility of the African American community to lift itself out of its current state, not just through social or environmental changes, but by selecting which of its members should procreate. Du Bois also claimed that the mixed-race children born to white slave owners (and the descendants of those children) were partially responsible for black moral decay, as they carried the genes of perverted adulterers. He observed that, like any other race, the black race contained individuals who possessed traits that were desirable or defective. One of Du Bois’ famous ideas was that of the “Talented Tenth”: he believed that only the best of the race would be able to save the whole race. All the while, he insisted that the white and black races were equal, and that the differences alleged by contemporary white scientists were the result of class and environment rather than genetics.

Another prominent African American proponent of eugenics was Dr. Thomas Wyatt Turner. In contrast to Du Bois’ balanced emphasis on both nature and nurture, Turner doubled down on the ideas of biological determinism and the importance of one’s genetic background. He helped reshape the mainstream ideas of eugenics into a form that better fit the notion of racial equality. Turner’s ideas were taught to thousands of black students at Howard, Tuskegee, and Hampton. In fact, a 1915 exam from Turner’s class at Howard University read, “Define Eugenics. Explain how society may be helped by applying eugenic laws”. In the end, Turner was hugely responsible for popularizing eugenics both among the black elite and, through his volunteer lectures, among the general African American population. Years later, the NAACP, which Turner helped found, would hold baby contests (yes, baby contests) that were no doubt tied to the ideas Turner helped spread.

While eugenics was viewed favorably by many African Americans for decades, the emerging civil rights movement of the mid-20th century saw the idea rapidly fall out of favor. Eugenics policies that disproportionately affected African Americans, such as those of the North Carolina Eugenics Board, were common in the United States, especially in the South. One policy of the generally progressive President Lyndon B. Johnson was the allocation of federal funding towards birth control in low-income communities. Although the more radical, black nationalist faction of the civil rights movement already opposed the moderate reforms of the Johnson administration, this particular action sparked widespread outrage, since many saw it as an attempt to limit the black population and thereby suppress its influence in the United States.

The popularity of eugenics plummeted across racial communities by the 1950s and 60s, especially in response to the atrocities of the Nazi regime, which had adopted eugenics and Aryan superiority as a basis for its ideology. The fall of eugenics was particularly pronounced in the African American community, which sometimes highlighted the hypocrisy of fighting injustice abroad while it persisted at home. It was not long before eugenics was seen to be as conducive to black empowerment as lynchings or poll taxes were.

Eugenics in America Part I: Buck v. Bell

Carrie and Emma Buck (Encyclopedia Virginia)

Eugenics, though a concept present to some degree for a large part of human history, began to gain significant traction among some Western intellectual and political circles in the 19th century. Advocates for eugenics argue that certain genetic traits in humans are more desirable than others, and that those who possess undesirable traits should be sterilized or otherwise removed from the gene pool. Some eugenicists support the ideology on the basis that certain racial or ethnic groups are superior to others, while others seek to eliminate certain physical or mental disabilities from the population. While Nazi Germany and the Greek city-state of Sparta are probably history’s most famous proponents of eugenics, the practice also has an unfortunate history in the United States. This three-part article series will briefly survey three different facets of eugenics in America.

Carrie Buck was born on July 3, 1906, in Charlottesville, Virginia to Emma and Frederick Buck. Frederick abandoned the family shortly after Carrie’s birth, and Emma was later admitted to the Virginia State Colony for Epileptics and Feebleminded, an institution that housed Virginians deemed mentally unfit to be a part of society. Carrie Buck initially had a normal childhood under her new foster parents, earning average grades in school before being removed from it, as was relatively common for young girls at the time, to help with domestic work. However, her life was forever changed at age 17, when she was raped and impregnated by her foster mother’s nephew. To cover up the family’s embarrassment from the incident, her foster parents committed Buck to the same institution as her mother, accusing her of feeble-mindedness and promiscuity. Carrie Buck’s newborn daughter, Vivian, was deemed similarly mentally feeble, although she would later excel in school. Shortly after Carrie’s admittance, the Colony’s Board of Directors authorized her sterilization via salpingectomy, an irreversible procedure that removes the patient’s Fallopian tubes.

Seeking to test the legal legitimacy of forced sterilization under the Virginia Sterilization Act of 1924, the Colony’s superintendent, Albert Sidney Priddy (later succeeded by John Hendren Bell), asked Buck’s state-appointed guardian, Robert G. Shelton, to challenge the order for her sterilization. Shelton appealed the order to both the Amherst County Circuit Court and the Supreme Court of Virginia. After the order for Carrie’s sterilization was affirmed in each of those lower courts, he appealed one final time to the highest court in the country, the United States Supreme Court, in Buck v. Bell, Superintendent of State Colony for Epileptics and Feeble Minded.

Buck’s attorney, Irving P. Whitehead, argued that the Due Process Clause of the Fourteenth Amendment prohibited the Commonwealth of Virginia from performing involuntary sterilization, because a citizen was being deprived of her rights without due process of law. Meanwhile, the Colony’s attorney, A. E. Strode, cited the apparent (though not actual) genetic defects in the Bucks’ bloodline, arguing that the sterilization was justified on the premise that removing those defects from the Commonwealth’s collective gene pool was in the best interests of the state. On May 2, 1927, the Court delivered its verdict, ruling 8-1 in favor of Bell. The lone dissenter in the case was Justice Pierce Butler, whose Catholic faith likely influenced his decision. He did not write a dissenting opinion.

From the majority opinion of Buck v. Bell (1927):

It is better for all the world if, instead of waiting to execute degenerate offspring for crime or to let them starve for their imbecility, society can prevent those who are manifestly unfit from continuing their kind.

Three generations of imbeciles are enough.

The significance of Buck reaches far beyond the case of Carrie Buck herself. It essentially legitimized all similar eugenic practices in the United States, and is no doubt one of the largest stains on the country’s legacy when it comes to eugenics. In fact, defense teams for Nazi officials during the Nuremberg trials cited Buck in their arguments, using the decision to expose a terrible hypocrisy in American criticism of Nazi ideology during WW2.

While Skinner v. Oklahoma ruled in 1942 that criminals could not be sterilized as punishment for their crimes, it did not ban the type of state-mandated sterilization that was affirmed by Buck. Forced sterilization of ordinary citizens deemed mentally unfit remained legal in Virginia until 1974, with the last such operation in the country being performed in Oregon in 1981. Today, Buck v. Bell stands as a symbol of the struggle for disability rights in the United States, and demonstrates how disabled Americans were one of the many groups in the country who faced outright discrimination and oppression during this period.

The Brazilian Expeditionary Force

FEB soldier loads artillery with a shell inscribed A COBRA ESTÁ FUMANDO (“THE SNAKE IS SMOKING”) (Getty Images)

The Second World War is a conflict well catalogued and studied by high school students, history buffs, filmmakers, writers, and scholars. But as significant and consequential as the war was to every corner of the world, there would inevitably be some stories forgotten by its popular history. One such story, at least outside of South America, is that of the 25,000-man-strong Brazilian Expeditionary Force (Força Expedicionária Brasileira, or FEB).

Prior to its entry into the war, Brazil had been a valuable trading partner to the Allied powers, and had even allowed the United States to construct air bases on its soil. Anti-Axis sentiments began to mount as Brazilian merchant ships were sunk by German U-boats, and in August of 1942, Brazil declared war on the Axis powers. Initially, Brazilian support for the Allies was no different from that of other South American countries — providing much-needed war material by becoming a key link in the supply chain across the Atlantic and into Africa. However, Brazilian leaders soon realized that sending an actual military force to the Allies’ aid would be a symbolic commitment to their cause, and would improve Brazil’s position at the negotiating table once the war came to an end.

In addition to its main infantry division, the FEB also included a fighter squadron, and was supported by the Brazilian Navy. In the summer of 1944, the first Brazilian troops arrived in Naples, merging into a larger American force that was already fighting a brutal campaign in Italy. Their nickname was the Cobras Fumantes (“Smoking Snakes”), after a running joke that it was more likely for a snake to smoke than for the FEB to see any actual combat.

From the memoirs of Mark Clark, commander of the U.S. Fifth Army:

The performance of the Brazilians was, of course, important politically as well as militarily. Brazil was the only Latin American country to send an expeditionary force to take part in the European war, and, naturally, we were eager to give them a chance to make a good showing.

While the small force was not hugely impactful when considering the massive scale of military operations during WW2, its victories were nonetheless remembered in the hearts and minds of the Brazilian people. Perhaps the greatest of these was the Battle of Collecchio, in which the FEB surrounded and captured two German infantry divisions on April 29, 1945, just days before the fall of Berlin, the collapse of Nazi Germany, and the end of the war in Europe.

The legacy of the FEB in Brazil can be considered somewhat complicated, especially considering the various roles FEB veterans played during the 1964 Brazilian coup d’état. However, it can generally be said that the Brazilian Expeditionary Force endures as a symbol of national pride for Brazil, and serves as a testament to the bravery and dedication of countless individuals during WW2, particularly those from countries whose roles in the war are not as well known.

Operation Wetback

Migrants in El Centro, CA await deportation (LA Times Archive)

Mexican immigration to America has been significant to the history of both countries for as long as they have shared a border. The continued flow of Mexican migrants has been met with a multitude of laws, policies, and doctrines from the United States over the years, each of which represents, to some extent, the broader social and political conditions of its time.

In the decades leading up to the Second World War, hundreds of thousands of Mexican immigrants entered the United States both legally and illegally, primarily to work on farms in the rural Southwest. Their diasporic communities formed and grew quickly, creating a new generation of Mexican Americans. By the outbreak of the war, the American government was in need of cheap labor to fuel the war effort, both because of the increased demand for manufactured and agricultural goods, and because millions of young men had left the traditional workforce to serve overseas. In response, the governments of the US and Mexico struck a deal known as the Bracero program, which allowed more Mexican laborers to enter the States on short-term contracts. The program eventually brought over four million braceros.

Despite the program, illegal immigrants continued to flow into the country, much to the concern of the United States. Under the Eisenhower administration in 1954, a series of deportations was authorized under the name Operation Wetback. U.S. Border Patrol agents began mass sweeps across the country. Hundreds of thousands of Mexicans, some of whom were American citizens, were packed into trucks, boats, or planes, and shipped back to Mexico. Stuck in a place they were not familiar with, with no guarantee of jobs, food, water, or shelter, they had to rebuild their lives from scratch. While the federal government boasted that it had successfully deported over a million illegal immigrants in just a few months, the true number is likely lower, since many of those deported returned to the United States several times, only to be deported (and counted) once more.

Operation Wetback was, overall, a failure. Both the Bracero program and illegal immigration far outlived any consequences of the operation, other than the continuing legacy of anti-Mexican sentiment in the United States. In fact, the sudden deportation of such a large number of Mexican laborers increased an already high demand for cheap labor, thus also increasing illegal immigration to the United States as a whole. The operation’s very name, “wetback”, is today known as a highly offensive slur towards Mexican Americans, further tarnishing its legacy.

The operation reentered the minds of mainstream America during the 2016 Republican presidential primaries, when eventual winner Donald Trump cited it both as precedent and as evidence of the feasibility of his proposed immigration policy, which included the mass deportation of the millions of illegal immigrants living in the United States.