In mid-1939 German Chancellor Adolf Hitler had a problem. He wanted to go to war with the Soviet Union in order to grab precious Lebensraum, or living space — and eradicate the Bolshevik menace. The Western powers, however, namely Great Britain and France, refused to make a deal with him.
Instead, they guaranteed the security of Poland, the next obvious Nazi target and pathway to the USSR. Hitler wanted to avoid a two-front war of the kind that had ended badly for Germany in World War I. So the Austrian corporal turned German Führer sought a deus ex machina. He found it on August 23, 1939, with the signing of the Treaty of Non-Aggression Between Germany and the Union of Soviet Socialist Republics, better known as the Hitler-Stalin Pact or the Molotov-Ribbentrop Pact, after the respective dictators and foreign ministers who negotiated its terms.
While diplomacy almost always is preferable to war, the two sometimes coincide. Plenty of plundering marauders have made common cause. But it is hard to think of an example of greater depravity: two of the worst mass murderers in history dividing the world between them.
World War I left both Germany and Russia isolated pariah states. Germany’s new Weimar republic had expected gentler treatment by the allies, having surrendered under Woodrow Wilson’s “14 Points” and then defenestrated the Kaiser and the entire imperial system. But the Versailles Treaty placed full blame on Berlin, amputated historic Germanic lands, transferred indisputably German populations to other nations, imposed the cost of the war on the German people, and kept the democratic German government out of the League of Nations, which was designed to guarantee British and French dominance of the new international order. Ravaged by political conflict and civil strife at home, Berlin schemed to overturn the artificial territorial divide, which it never accepted.
The newly created Soviet Union, successor state to the Russian Empire, was even more isolated. Forced by Germany, which triumphed on the Eastern Front, to accept the draconian Treaty of Brest-Litovsk in 1918 — the only way for the Bolsheviks to preserve their tenuous control as civil war loomed — the Communists spent the next several years battling counter-revolutionaries while seeking to reassemble the old empire. The Americans, British, French, and Japanese intervened militarily against the new regime, first hoping to keep Russia in the war and next seeking to strangle the Soviet state in its infancy. The USSR survived, but turned inward as Vladimir Ilyich Lenin’s successors battled for control and the triumphant Joseph Stalin brutally industrialized his peasant nation.
During this time, the former enemies became friends of sorts. In April 1922, Germany and Russia signed the Treaty of Rapallo, renouncing financial and territorial claims against the other. A secret annex allowed Berlin to train military personnel and test military equipment on Soviet soil, violating the Versailles Treaty. The Treaty of Berlin, signed in April 1926, guaranteed neutrality in the event of a third-party attack on the other. Trade also expanded between the two states — especially noteworthy for Moscow, which was more isolated from capitalist markets.
Then Adolf Hitler came to power on January 30, 1933. He disliked the Western powers but bore special animus toward the Soviet Union and Bolshevism, against which he had preached war. In November 1936, Berlin and Tokyo signed the Anti-Comintern Pact, which explicitly targeted the Communist International and USSR. Aided by substantial Communist parties active in Europe, Moscow initially looked to the West. In May 1935, France and the Soviet Union signed the Franco-Soviet Treaty of Mutual Assistance. After the September 1938 Munich Agreement and March 1939 German invasion of Czechoslovakia, Moscow, London, and Paris opened tripartite talks over military cooperation against Germany.
The barriers to agreement were significant, however. Only a fellow traveler could imagine Soviet Communism as a trustworthy bulwark for Western democracies. Poland refused to allow the passage of Soviet troops, lest they not be so quick to leave. And the British and French, uncertain and unenthusiastic, hoped war with Germany could be avoided and doubted the military value of the Red Army, which was recovering from Stalin’s purges. Divided over strategy, they stalled negotiations, sending their representatives by boat rather than air and denying them authority to make a deal.
This encouraged Stalin to seek a new foreign dance partner. In May he replaced Maxim Litvinov, the Westward-leaning (and Jewish) foreign minister, with Vyacheslav Mikhailovich Molotov, a hardened revolutionary and loyal apparatchik. The result was a whirlwind geopolitical romance, as Hitler pressed for a quick settlement that would free him to attack Poland and then deal with his European foes. The two great totalitarian rivals decided that they were united in “opposition to the capitalist democracies,” as the diplomats put it.
Of course, there were tensions, since both governments had spent years vilifying the other. Some top Nazis were uneasy about sacrificing the Finns and Balts, who were supposed to be racial kin of the Germans (and among whom many ethnic Germans lived). The political pirouettes performed by the Communists, especially party members in the West, who went from calling for war against the Reich to demanding peace with Germany, were even more dramatic. After all, few of them had previously acted as if they believed that "fascism is a matter of taste," as Molotov observed when the agreement was signed.
Publicly the two governments agreed not to aid or ally with any nation against the other signatory. Through a secret protocol, Berlin and Moscow defined “spheres of influence”: the two totalitarians coldly divided Poland and apportioned influence over the three Baltic States, Finland, and Romania. (They later adjusted their shares, with continuing contempt for the territories and peoples bartered back and forth.) Moscow also became a significant supplier of raw materials to the Reich, receiving industrial and military products in return.
Stalin apparently explained his decision in a speech to the Politburo on August 19, as the agreement was being finalized — though the Soviets always denied that the talk occurred. (More than one version of the supposed text exists.) He explained why “we must accept the German proposal and, with a refusal, politely send the Anglo-French mission home.”
The Soviet dictator said the agreement with Germany ensured that Berlin would invade Poland and be at war with France and Britain. As a result, “Western Europe would be subjected to serious upheavals and disorder. In this case we will have a great opportunity to stay out of the conflict, and we could plan the opportune time for us to enter the war.”
If Berlin defeated the allies, it still would have acknowledged the USSR’s geopolitical interests, he indicated. More important, “Germany will leave the war too weakened to start a war with the USSR within a decade at least.” Berlin also would need to occupy the two allied states and exploit new territories. “Obviously, this Germany will be too busy elsewhere to turn against us” — especially after a Communist revolution would break out in France and it, along with other nations that fall under the victorious Nazis, would become Moscow’s ally.
If Germany lost to Britain and France, “a Sovietization of Germany will unavoidably occur,” said Stalin, though he was afraid that Britain and France would intervene and destroy the resulting Communist government. Therefore, he argued, “our goal is that Germany should carry out the war as long as possible so that England and France grow weary and become exhausted to such a degree that they are no longer in a position to put down a Sovietized Germany.”
Stalin’s cynicism was almost complete. He concluded that "it is in the interest of the USSR, the worker’s homeland, that a war breaks out between the Reich and the Anglo-French bloc. Everything should be done so that it drags out as long as possible with the goal of weakening both sides." After signing the non-aggression pact, Moscow must "work in such a way that this war, once it is declared, will be prolonged maximally."
In some ways the aftermath was predictable. Germany invaded Poland on September 1, 1939. London and Paris declared war on Berlin on September 3. The Soviets grabbed their share of Poland two weeks later, causing that nation to cease to exist. (Reestablished after the war, Poland was not able to reclaim the lands seized by Russia, instead annexing German lands to the west.) Next, the USSR stationed troops in and ultimately swallowed the helpless Baltic countries, placed in its sphere of influence by the Hitler-Stalin Pact. In November, Moscow attacked Finland. The latter fought heroically but was forced to cede territory to the Soviet Union. Finally, Moscow demanded Bessarabia and Northern Bukovina from Romania.
But Stalin seriously overestimated French and British military effectiveness. And Hitler was even more cynical than the Soviet leader about their deal, never abandoning his underlying animus toward Communism. In June 1940, Hitler announced that Germany’s victories in the West “finally freed his hands for his important real task: the showdown with Bolshevism.”
Perhaps even more important, German and Soviet geopolitical interests clashed in the Balkans and Finland. Negotiations ensued over enlisting Moscow as a fourth member of the Axis, but Stalin could not be diverted to the Middle East/South Asia. For a time the German Führer appeared genuinely ambivalent about which direction to move, and Molotov visited Berlin in November 1940. Some of the talks had to be held in a bomb shelter during a British air raid, embarrassing the Germans.
The following month Stalin spoke to his generals; he anticipated war but hoped to delay conflict for at least two years to give the Red Army time to prepare. He got six months. Moscow’s demands were too heavy and Germany allowed the negotiations to lapse. Hitler complained that his Soviet counterpart “demands more and more” and is “a cold-blooded blackmailer.” Thus, the USSR “must be brought to her knees as soon as possible.” When Stalin was speaking with his generals, the German military was delivering its plan for the invasion of the Soviet Union. Originally scheduled for May 15, the action ultimately began on June 22.
Operation Barbarossa ended the Russo-German entente less than two years after it was forged. Hitler expected an easy victory. Before attacking, he declared, “We have only to kick in the door and the whole rotten structure will come crashing down.” Germany’s initial victories were great, but the Soviet Union’s resources were greater. Barely four years later, on May 2, 1945, the Red Army celebrated victory in the ruins of Berlin. Hitler’s thousand-year Reich collapsed 988 years early, with the Führer dying in the ruins of his chancellery.
Stalin died in 1953 of a stroke, or perhaps of poisoning by his secret police chief, Lavrentiy Beria. Ribbentrop was executed after trial by the Nuremberg tribunal, having been convicted for his role in promoting aggressive war and unleashing the Holocaust. Molotov lost influence after Stalin’s death but lived until 1986, when he died at the age of 96. He remained an unrepentant Stalinist to the end. Only in 1989 did Moscow admit the existence of the secret protocol; President Vladimir Putin later condemned the agreement as "immoral."
There almost certainly would have been widespread war without the Hitler-Stalin Pact. But the character of the conflict would have been radically different. Had the German dictator proceeded to invade Poland, followed by an attack on the USSR, the Wehrmacht would have been far less prepared for extensive operations. So would the Red Army, and there would have been no American Lend-Lease program, which effectively mechanized the Soviet military. Berlin, however, would not have been able to call on significant contingents of Hungarians, Italians, and Romanians for aid. Rather than do nothing during the infamous Sitzkrieg after declaring war on Germany, Britain and France might have launched an offensive while German troops were tied down in faraway Russia.
A strike westward without safeguarding Germany’s eastern border would have been far riskier for the Reich than the conflict’s actual course. Poland might have attacked to support its allies. Moscow probably would have stayed neutral while accelerating its armaments production. But the USSR might have taken a more active role in the conflict: without a non-aggression pact, the Soviet Union would have been the obvious next target for a Germany victorious in the West. Any subsequent German attempt to conquer the Soviet Union, however, would have lacked advanced positions in the east and the advantage of surprise. America’s involvement might have remained much the same, dedicated to saving Britain and defeating Germany — and aiding Russia if the latter was attacked.
In short, there have been few treaties with consequences as great as the Hitler-Stalin Pact. It simultaneously emboldened the Third Reich, weakened the Allies, and anesthetized the Soviets. The agreement might not have changed the course of the war, but probably lengthened it while increasing the casualty toll. Perhaps the gravest humanitarian consequence was the expansion of the Holocaust. The treaty gave Germany easier access to countries with large Jewish populations and space within these countries for death camps.
Finally, the dictators’ partnership helped transform the map of Europe. If Berlin had not abandoned friendly states along Russia’s border, the Soviet Union might not have swallowed the Baltics and chopped off pieces of Finland, Poland, and Romania. Perhaps Poland would have avoided defeat, ultimately being allied with rather than a victim of the USSR.
Yet this malign “deal of the century” was well-nigh impossible to avoid. Only very late did the Allies understand the true nature of Hitler and his regime. Soviet brutality — such as the Katyn massacre of thousands of Polish military officers — retrospectively justified Warsaw’s reluctance to admit the Red Army to fight Germany. Virtually no one imagined the success of the Wehrmacht’s Blitzkrieg, without which Stalin’s plan for sitting out the conflict might have proved prescient.
Eighty years on, the picture of Stalin, Molotov, and Ribbentrop celebrating their handiwork still offends us mentally and morally. Thankfully, that world has passed. Yet evil has not disappeared from international affairs. We should never forget the moment when two of history’s worst dictators came together to do evil, leaving immeasurable death and carnage in their wake.

Doug Bandow is a Senior Fellow at the Cato Institute. He is a former Special Assistant to President Ronald Reagan and the author of Foreign Follies: America’s New Global Empire.
Across the map of the United States, the borders of Tennessee, Oklahoma, New Mexico, and Arizona draw a distinct line. It’s the 36º30′ line, a remnant of the boundary between free and slave states drawn in 1820. It is a scar across the belly of America, and a vivid symbol of the ways in which slavery still touches nearly every facet of American history.
That pervasive legacy is the subject of a series of articles in The New York Times titled “The 1619 Project.” To cover the history of slavery and its modern effects is certainly a worthy goal, and much of the Project achieves that goal effectively. Khalil Gibran Muhammad’s portrait of the Louisiana sugar industry, for instance, vividly covers a region that its victims considered the worst of all of slavery’s forms. Even better is Nikole Hannah-Jones’s celebration of black-led political movements. She is certainly correct that “without the idealistic, strenuous and patriotic efforts of black Americans, our democracy today would most likely look very different” and “might not be a democracy at all.”
Where the 1619 articles go wrong is in a persistent and off-key theme: an effort to prove that slavery "is the country’s very origin," that slavery is the source of "nearly everything that has truly made America exceptional," and that, in Hannah-Jones’s words, the founders "used" "racist ideology" "at the nation’s founding." In this, the Times steps beyond history and into political polemic—one based on a falsehood that, in an essential way, repudiates the work of countless people of all races, including those Hannah-Jones celebrates, who have believed that what makes America "exceptional" is the proposition that all men are created equal.
As part of its ambitious “1619” inquiry into the legacy of slavery, The New York Times revives false 19th century revisionist history about the American founding.
For one thing, the idea that, in Hannah-Jones’ words, the “white men” who wrote the Declaration of Independence “did not believe” its words applied to black people is simply false. John Adams, James Madison, George Washington, Thomas Jefferson, and others said at the time that the doctrine of equality rendered slavery anathema. True, Jefferson also wrote the infamous passages suggesting that “the blacks…are inferior to the whites in the endowments both of body and mind,” but he thought even that was irrelevant to the question of slavery’s immorality. “Whatever be their degree of talent,” Jefferson wrote, “it is no measure of their rights. Because Sir Isaac Newton was superior to others in understanding, he was not therefore lord of the person or property of others.”
The myth that America was premised on slavery took off in the 1830s, not the 1770s. That was when John C. Calhoun, Alexander Stephens, George Fitzhugh, and others offered a new vision of America—one that either disregarded the facts of history to portray the founders as white supremacists, or denounced them for not being so. Relatively moderate figures such as Illinois Sen. Stephen Douglas twisted the language of the Declaration to say that the phrase “all men are created equal” actually meant only white men. Abraham Lincoln effectively refuted that in his debates with Douglas. Calhoun was, in a sense, more honest about his abhorrent views; he scorned the Declaration precisely because it made no color distinctions. “There is not a word of truth in it,” wrote Calhoun. People are “in no sense…either free or equal.” Indiana Sen. John Pettit was even more succinct. The Declaration, he said, was “a self-evident lie.”
It was these men—the generation after the founding—who manufactured the myth of American white supremacy. They did so against the opposition of such figures as Lincoln, Charles Sumner, Frederick Douglass, and John Quincy Adams. “From the day of the declaration of independence,” wrote Adams, the “wise rulers of the land” had counseled “to repair the injustice” of slavery, not perpetuate it. “Universal emancipation was the lesson which they had urged upon their contemporaries, and held forth as transcendent and irremissible [sic] duties to their children of the present age.” These opponents of the new white supremacist myth were hardly fringe figures. Lincoln and Douglass were national leaders backed by millions who agreed with their opposition to the white supremacist lie. Adams was a former president. Sumner was nearly assassinated in the Senate for opposing white supremacy. Yet their work is never discussed in the Times articles.
In 1857, Chief Justice Roger Taney sought to make the myth into the law of the land by asserting in Scott v. Sandford that the United States was created as, and could only ever be, a nation for whites. “The right of property in a slave,” he declared, “is distinctly and expressly affirmed in the Constitution.” This was false: the Constitution contains no legal protection for slavery, and doesn’t even use the word. Both Lincoln and Douglass answered Taney by citing the historical record as well as the text of the laws: the founders had called slavery both evil and inconsistent with their principles; they forbade the slave trade and tried to ban it in the territories; nothing in the Declaration or the Constitution established a color line; in fact, when the Constitution was ratified, black Americans were citizens in several states and could even vote. The founders deserved blame for not doing more, but the idea that they were white supremacists, said Douglass, was “a slander upon their memory.”
Lincoln provided the most thorough refutation. There was only one piece of evidence, he observed, ever offered to support the thesis that the Declaration’s authors didn’t mean “all men” when they wrote it: that was the fact that they did not free the slaves on July 4, 1776. Yet there were many other explanations for that which did not prove the Declaration was a lie. Most obviously, some founders may simply have been hypocrites. But that individual failing did not prove that the Declaration excluded non-whites, or that the Constitution guaranteed slavery.
Even some abolitionists embraced the white supremacy legend. William Lloyd Garrison denounced the Constitution because he believed it protected slavery. This, Douglass replied, was false both legally and factually: those who claimed it was pro-slavery had the burden of proof—yet they never offered any. The Constitution’s wording gave slavery no guarantees and provided plentiful means for abolishing it. In fact, none of its words would have to be changed for Congress to eliminate slavery overnight. It was slavery’s defenders, he argued, not its enemies, who should fear the Constitution—and secession proved him right. Slaveocrats had realized that the Constitution was, in Douglass’s words, "a glorious liberty document," and they wanted out.
Still, after the war, “Lost Cause” historians rehabilitated the Confederate vision, claiming the Constitution was a racist document, so that the legend remains today. The United States, writes Hannah-Jones, “was founded…as a slavocracy,” and the Constitution “preserved and protected slavery.” This is once more asserted as an uncontroverted fact—and Lincoln’s and Douglass’s refutations of it go unmentioned in the Times.
No doubt Taney would be delighted at this acceptance of his thesis. What accounts for it? The myth of a white supremacist founding has always served the emotional needs of many people. For racists, it offers a rationalization for hatred. For others, it offers a vision of the founders as arch-villains. Some find it comforting to believe that an evil as colossal as slavery could only be manufactured by diabolically perfect men rather than by quotidian politics and the banality of evil. For still others, it provides a new fable of the fall from Eden, attractive because it implies the possibility of a single act of redemption. If evil entered the world at a single time, by a conscious act, maybe it could be reversed by one conscious revolution.
The reality is more complex, more dreadful, and, in some ways, more glorious. After all, slavery was abolished, segregation was overturned, and the struggle today is carried on by people ultimately driven by their commitment to the principle that all men are created equal—the principle articulated at the nation’s birth. It was precisely because millions of Americans have never bought the notion that America was built as a slavocracy—and have had historical grounds for that denial—that they were willing to lay their lives on the line, not only in the 1860s but ever since, to make good on the promissory note of the Declaration.
Their efforts raise the question of what counts as the historical “truth” about the American Dream. A nation’s history, after all, occupies a realm between fact and moral commitments. Like a marriage, a constitution, or an ethical concept like “blame,” it encompasses both what actually happened and the philosophical question of what those happenings mean. Slavery certainly happened—but so, too, did the abolitionist movement and the ratification of the Thirteenth, Fourteenth, and Fifteenth Amendments. The authors of those amendments viewed them not as changing the Constitution, but as rescuing it from Taney and other mythmakers who had tried to pervert it into a white supremacist document.
In fact, it would be more accurate to say that what makes America unique isn’t slavery but the effort to abolish it. Slavery is among the oldest and most ubiquitous of all human institutions; as the Times series’ title indicates, American slavery predated the American Revolution by a century and a half. What’s unique about America is that it alone announced at birth the principle that all men are created equal—and that its people have struggled to realize that principle since then. As a result of their efforts, the Constitution today has much more to do with what happened in 1865 than in 1776, let alone 1619. Nothing could be more worthwhile than learning slavery’s history, and remembering its victims and vanquishers. But to claim that America’s essence is white supremacy is to swallow slavery’s fatal lie.
As usual, Lincoln said it best. When the founders wrote of equality, he explained, they knew they had "no power to confer such a boon" at that instant. But that was not their purpose. Instead, they "set up a standard maxim for free society, which should be familiar to all, and revered by all; constantly looked to, constantly labored for, and even though never perfectly attained, constantly approximated, and thereby constantly spreading and deepening its influence, and augmenting the happiness and value of life to all people of all colors everywhere." That constant labor, in the generations that followed, is the true source of "nearly everything that has truly made America exceptional."

Timothy Sandefur holds the Duncan Chair in Constitutional Government at the Goldwater Institute and is the author of Frederick Douglass: Self Made Man (Cato Institute, 2017).
Jonathan Blanks and Jeffrey A. Singer
What do gun owners and pain patients have in common? They both may be collateral damage of policy hastily enacted in response to catastrophic news. Mass shootings and drug overdoses naturally evoke fear and outrage. But with populism animating both major parties, we should be wary of policy making through fear. Visceral reactions to tragedies are normal, but new laws and restrictions rarely reduce harm and often make matters worse. The best public policy relies on data-driven evidence.
While all gun deaths have a common denominator of firearms, the vast majority of gun deaths have little in common with the mass shootings that dominate headlines. The scale of those differences is staggering and the facts undermine the current advocacy that focuses on “assault weapons.”
According to Mother Jones’ mass shootings database, there have been 114 mass and spree shootings in the U.S. since 1982. Those tragedies have resulted in 934 deaths and 1,406 people injured.
In 2017, there were nearly 40,000 gun deaths in the United States. Of that number, about 24,000 died by suicide. Gun suicides make up just over half of the roughly 47,000 American suicides annually. About 14,000 gun deaths were homicides, stemming primarily from street violence and intimate partner homicide.
Certainly, semi-automatic rifles made the 2017 Las Vegas shooting unfathomably deadly. But most gun deaths and most mass shootings are perpetrated with handguns. During the last federal ban on assault weapons, there was no measurable impact on gun-crime victimizations.
These facts should not preclude new gun laws, but the drivers of these deaths go beyond guns. Despite a recent uptick, homicide rates remain near historic lows after two decades of decline in violent crime. But suicides are trending upward, which is evidence that policymakers should pay more attention to the “why” rather than simply “how” so many die.
In 2017, the Centers for Disease Control and Prevention reported 47,600 opioid-related deaths. Policymakers blamed excessive prescription of opioids by doctors for addicting the population.
But federal survey data consistently show no correlation between prescription volume and the nonmedical use of opioids or opioid addiction. And medically prescribed opioids have overdose rates ranging from 0.022% to 0.04%.
Many people mistake dependency for addiction, but they are two different things. Some drugs, including opioids, antidepressants, antiepileptics and beta blockers, can make a person physically dependent after prolonged use. Abruptly stopping them can cause sometimes fatal withdrawal effects.
Addiction, on the other hand, is a distinct behavioral disease, with a major genetic component, featuring compulsive behavior despite obvious self-destructive consequences. The director of the National Institute on Drug Abuse states that opioid addiction in patients is very uncommon "even among those with preexisting vulnerabilities." Recent studies show a "misuse" rate of 0.6% in patients prescribed opioids for acute pain and roughly 1% in those on chronic opioid treatment.
High-dose prescribing is down 58% since 2008. Yet the overdose rate continues to rise, involving fentanyl or heroin 75% of the time. Evidence shows a steady exponential increase in nonmedical use of drugs since the 1970s and suggests complex socio-cultural factors are root causes. As prescription pain pills become less available for diversion into the black market, nonmedical users find cheaper and deadlier options.
Opioid dependence is real, but not necessarily detrimental. As the American Medical Association has acknowledged, there are many patients for whom opioids are the only drugs that control their pain enough to live a quality life. But our fear-based response to opioids — with top-down pill restrictions and crackdowns on prescribers — has cut off many chronic pain patients, causing a great number to self-medicate with unpredictable and dangerous drugs on the black market. Some, in desperation, turn to suicide.
The overdose problem has always been primarily a consequence of drug prohibition and the dangerous black market it fuels. To reduce overdoses, policies should be redirected from restrictive, prohibitionist interventions to those more focused on reducing the harms that result from drug use in an underground market.
Drug overdoses and gun deaths are serious problems that require changes from the status quo. However, changes should be based on data and political realities, not fears that demand policymakers "do something." Implementing the wrong policies can obscure larger problems or make bad situations tragically worse.

Jonathan Blanks is a research associate in the Cato Institute Project on Criminal Justice; Jeffrey Singer practices general surgery in Phoenix, Ariz., and is a senior fellow at the Cato Institute.
Michael D. Tanner
Democrats running for president have certainly not hesitated to criticize President Trump’s trade policies.
There is a good reason for the rhetoric. Several recent studies, from researchers at Harvard, Columbia, the IMF, and two different branches of the Federal Reserve, have all concluded that the tariffs imposed by President Trump on China and others have indeed hurt American consumers and threatened economic growth domestically and internationally. For instance, scholars at Columbia, Princeton, and the New York Fed found that the Trump tariffs had reduced U.S. real income by $1.4 billion per month by the end of 2018.
In response — or perhaps just because Americans have a reactive response to any Trump policy — polls suggest that support for free trade is on the rise. A Monmouth poll found that in 2018, 52 percent of Americans thought free-trade agreements were good for the United States, a dramatic increase from 24 percent in 2015.
But what exactly are the Democratic presidential candidates proposing as an alternative? Their policies — as opposed to their words — don’t seem all that different. In fact, some of the Democratic plans may be even more restrictive.
For example, many experts believe that the best way to restrain China would be to join with our regional allies in some sort of bloc, similar to the Trans-Pacific Partnership (TPP). And there is reason to believe that our allies would be happy to have us join the pact. But with the exception of extreme long-shot Representative John Delaney, every major Democratic candidate either joins Trump in opposing the TPP or is highly critical of the current negotiation. Even former vice president Joe Biden won’t commit to the treaty his administration negotiated.
Biden’s change in position is just his latest concession to the special interests and unions that dominate the Democratic primaries. He once voted for normal trade relations with China and for NAFTA, and pushed for the Trans-Pacific Partnership, but no longer.
Nor is it just the TPP that Democrats oppose. Like Trump, most of the major Democrats oppose NAFTA. But, with the exception of Beto O’Rourke, they also oppose Trump’s renegotiation of NAFTA (renamed the United States-Mexico-Canada Agreement, or USMCA). Most Democrats have also opposed other, bilateral trade deals, such as those with Korea and Colombia.
The left flank of the Democratic party is even more anti-trade. Elizabeth Warren, for instance, wants the focus of trade to be on labor, the environment, and, ironically, consumers. She wants the U.S. to trade only with countries that have signed the Paris Agreement and meet onerous human-rights and labor standards.
This policy would fall most heavily on poor nations that can least afford costly environmental or labor upgrades. Countries such as El Salvador, Honduras, and Guatemala would be devastated, sending a new flood of refugees streaming toward our border.
And Bernie Sanders’s opinions are quite similar to Warren’s. Both of them are in favor of steel and aluminum tariffs and oppose all current trade deals. Sanders, like Warren, wants all future negotiations to be centered around labor, the environment, and human rights.
This shouldn’t be a surprise. The Left has long opposed free trade. After all, the ability to buy and sell to whomever you wish is the antithesis of central planning.
Unfortunately, though, for those of us who believe in the free market, the 2020 race continues to offer less of a choice, and more of an echo.

Michael Tanner is a senior fellow at the Cato Institute and the author of The Inclusive Economy: How to Bring Wealth to America’s Poor.
Triggering the Google Social Credit System
by Michelle Malkin
I learned last week from a Silicon Valley whistleblower, who spoke with the intrepid investigative team at Project Veritas, that my namesake news and opinion website is on a Google blacklist.
Thank goodness the Big Tech giant hasn’t taken over the newspaper syndication business yet. Twenty years of column writing have allowed me to break news and disseminate my opinions without the tyranny of social justice algorithms downgrading or whitewashing my words. But given the toxic metastasis of social media in every aspect of our lives, especially for those who make their living exercising the First Amendment, it may only be a matter of time before this column somehow falls prey to the Google Ministry of Truth, too.
Armed with internal memos and emails, former Google software engineer Zachary Vorhies exposed how MichelleMalkin.com (online since 1999) was placed on a news blacklist banning my content from appearing on newsfeeds accessed through Android Google products. I do not advocate violence, publish porn or indulge in vulgarity or profanity (other than my occasional references to Beltway crapweasels). But I triggered the Google Social Credit System and there’s no going back.
My apparent sin: Independently growing a large organic following of readers on the internet who share my mainstream conservative views on immigration, jihad, education, social issues, economic policy, faith and more.
Other conservative victims of the Google ban hammer include: Twitchy (a Twitter aggregation site I founded in 2012), FrontPage Magazine (founded by prolific conservative author and journalist David Horowitz), the Daily Caller (founded by Fox News host and journalist Tucker Carlson), Legal Insurrection (founded by Cornell University law professor and investigative blogger William Jacobson), NewsBusters (founded by Media Research Center in 2005), The Gateway Pundit (founded by grassroots social media pioneer Jim Hoft in 2004), the American Thinker (another of the veteran conservative blogs founded in 2003 by Thomas Lifson), LifeNews.com (an independent, pro-life news site founded in 1992 by Steven Ertelt), the Catholic News Agency and The Christian Post.
I suspect, because so many of the blacklisted sites belong to the original generation of conservative bloggers, that Google’s ideology-based censorship significantly predates the timeframe of the documents that Vorhies (who worked at Google for eight years) shared with Project Veritas. Indeed, my first substantiated censorship by Google/YouTube, which was covered by The New York Times, occurred back in 2006. Around that time, it also became clear to me that humans, not algorithms, were manipulating Google Images to prioritize unspeakably crude photoshopped images of me disseminated by left-wing misogynists. And not long after, my heavily trafficked blog posts started dropping off the search engine radar altogether.
Several previous Google insiders have confirmed that the Big Tech giant discriminates against right-leaning journalists, pundits and personalities — not to mention free-thinking employees within its own workforce who’ve been persecuted, fired and even harassed by police for their whistleblowing. Leaked documents also show that a small cadre of meddling social justice overlords at Google Central Command manually manipulate search engine results — despite the company elite’s brazen denial of the practice at a recent congressional hearing.
In the early days of New Media, entrepreneurs on the left, right and center rallied around the transparency and open access mantra, “Information just wants to be free.” Now, in the wholly disingenuous names of “trust” and “safety,” the overlords of the internet want to throttle information with which they disagree. Google employees actively demote content on YouTube deemed “controversial queries,” according to internal documents from Vorhies, including the following phrases:
–Abortion is barbaric.
–Abortion is wrong.
–Abortion is murdering.
–Abortion is a crime.
“Do vaccines cause autism,” “climate change hoax,” and “Girl speaks about the danger in Germany due to rape refugees” were also all red-flagged as dangerously “fringe” by the Google P.C. police. So was President Donald Trump’s factual statement that immigration chaos has led to “people that are from all over that are killers and rapists and they’re coming into this country,” which one open borders employee complained was “explicit bias” that “we should take a stand on.”
So they’re for foreign killers and rapists coming into this country? Noted.
Internal staff complaints catalyze search engine manipulation, so political agitation among Google employees is a harbinger of speech clampdowns to come. Just last week, more than 1,000 Google employees lobbied the company to shun any contract work with U.S. Customs and Border Protection or Immigration and Customs Enforcement. Given that Google works with the hate racket and smear machine known as the Southern Poverty Law Center, you bet I’m worried that my immigration blog and column archives (not to mention all my reporting on the treasonous Silicon Valley CEOs in my upcoming book, “Open Borders Inc.”) will trip the Google Social Credit wire.
With Google’s homegrown menaces squelching our freedom of expression, damaging our reputations and livelihoods through slimy and secretive blacklists, and hampering our ability to do honest research — not to mention mining student data in schools by tethering children to Google apps/email/Chromebooks and holding their academic progress hostage to Google’s high-tech leash — who needs foreign enemies? China ain’t got nuthin’ on America’s “Don’t Be Evil” thought control freaks.