What were they thinking? Long ago, G.K. Chesterton observed how easily societies discard beneficial practices and institutions without first answering two questions: why was the practice or institution the way it was, and what will be the subsequent effects of changing it? Nobody has any business destroying an institution until he examines it from a historical perspective.1 Regular readers know my disdain for the 17th Amendment and its awful accumulated consequences. Yes, there was a building consensus in favor of popularly elected senators, and in 1913 Congress headed off a convention of the states. While…
I was one of five men who had spent weeks cleaning up a long-abandoned city building cluttered with trash and debris from collapsing infrastructure. Our project to open a soup kitchen to help support the local community of illiterate migrants and mentally ill homeless people was nearing completion.
While loading chunks of broken concrete from a pile of rubble into my hauling bucket, I noticed a strange luminescent lamp lying on its side. The lamp seemed sturdy, and I rubbed off its grime to reveal a golden surface embossed with intricate designs. It rumbled violently, and a large blue genie burst out before me and began levitating.
With a booming voice he told me that my unselfish actions had freed him from centuries of systemic oppression and now he would reward me with three wishes.
Most people wish for love or world peace but with my first wish I asked to be made immensely wealthy — not for my own purposes of course, as I don’t care much for money, but so I could buy my mom a house and ostentatiously donate to food shelters and asinine social causes that make me feel superior to the hopelessly impoverished.
The genie nodded that he understood my brave wish, closed his eyes, hummed loudly, and a few seconds later many billions were now mine to use for any purpose I deemed ethical. Looking around, it seemed nothing much had changed with the world. Some numbers in a bank account were different and I had stacks of paper and gold sitting somewhere, but everything else seemed normal. I took a deep breath and kept going.
For my second wish, I looked the genie deep in his vacant eyes and told him I wished that no man was illegal. He snickered derisively at me before clearing his throat and looking down to regain his composure. He then continued with the customary gesture of waving his hands magically and acknowledged my command had been fulfilled.
It turned out this time there were a few side effects I hadn’t considered. For example, I had failed to consider that a nation’s only lasting wealth is its people and that capabilities for civilization among populations differ greatly, so borders have historically acted as a bank vault storing the wealth of nations by securing their population. Also, being accustomed to first world standards, I had assumed these were normal and natural throughout the world, which turned out not to be the case at all.
With no man illegal, nations were rendered borderless and those with civilization were immediately flooded by low-IQ high time preference third worlders trying to get to where people had built something good. It was quite a shock to find out 90% of the world is a disaster of poor people with low intelligence barely able to figure out food, clothing, and shelter, and almost entirely unable to plan or organize.
During a sober moment, they looked around at their countrymen, realized they would never develop civilization, and decided the best course of action was to flee their homelands to get away from the mass of people possessing the same traits as themselves, which in the aggregate had created their national conditions of colossal failure.
The previously successful nations that were now borderless quickly lost their unity and shared notion of common sense in the frenzied greed of peasant fantasies where each took as much as they could while shirking responsibilities and accountability. It was looting in slow motion.
This worldwide revolution made history obsolete by removing divisions between the people who developed nations and those who wanted to enjoy what was beyond their station. Developments that took hardy people many centuries to achieve found their nourishing populations replaced by millions of incapables unable to keep civilization afloat. Worst of all, because they knew they were intruding squatters unable to achieve basic standards, the migrants hated the natives for their abilities and openly wished for their downfall.
Wealthy nations became poor overnight, and the work of good people was diverted to trying to fix problems created by the imported population. Some natives tried to build a nation within a nation in an effort to preserve their culture and its essential aspects, which now struggled under the weight of chaotic disunity. Infrastructure and institutions could no longer manage cleaning up the messes of outsiders, and even after sacrificing the possibility of keeping a high level of culture, there was not enough wealth and energy to hold things together. A free fall to a sustainable third world standard ensued.
With my third wish, I asked the genie to restore lawful borders. He snickered again and this time I felt silly from the realization that I needed an appointment at the laser removal clinic to take care of that embarrassing leftist slogan tattoo that seemed so edgy when I was a teenager.
LONDON (Reuters) – “Dirty” Russian money hidden in British assets and laundered through City of London financial institutions undermines the government’s efforts to take a tougher stance against Moscow’s “aggressive foreign policy”, UK …
If someone on Facebook claimed that Starbucks CEO Kevin Johnson had just instituted a policy that would allow minorities to move to the head of the line at all of the coffeehouse chain’s outlets, would you believe it?
Some did, and that suggests Starbucks and other organizations like it have trodden so far toward the radical left that it’s virtually impossible to distinguish fact from fiction anymore.
Relax, it was fake. But some people really believed it.
“That’s it! Enough of this white Guilt CRAP! I don’t need coffee that bad!” bellowed one incensed Facebook user.
“I’ll take my white privilege somewhere else!” wrote another.
While it’s tempting to blame these users for being so susceptible to false information, perhaps the blame lies elsewhere.
In the wake of a Philadelphia-based Starbucks calling the cops on two black patrons two months ago after they refused to leave, the coffeehouse chain launched a “progressive” campaign to force left-wing dogma about “implicit bias” down its employees’ throats.
This despite the fact that loitering and trespassing on a private business are grounds for removal/arrest no matter what color you are. But all that apparently mattered to Starbucks was that the far-left was accusing it of being racist.
The notion of “implicit bias” is in itself very suspect, and studies have shown that so-called “implicit bias” training leads to no fruitful results. Starbucks was so desperate to appease the left-wing mob that it didn’t care, though.
Starbucks isn’t the first company (or government agency) that’s succumbed to the far-left’s nonsense, nor will it be the last. And when so many basic societal institutions fall prey to craziness, can you really blame everyday Americans for falling hook, line and sinker for the craziest brand of fake news? Not really.
If you think about it, it’s not really fake news that’s the problem, as the left would have everyone believe. It’s rather the ongoing deterioration of American society courtesy of the left’s racial pandering, political correctness and soft bigotry of low expectations.
Case in point: If someone told you that a university professor had demanded that his school stop hiring white people because “cis het white people need to lose more,” would you believe it?
If you answered no, then whoops, you got it wrong, because that really happened!
What if someone told you that a singer had demanded that white audience members move to the back of a music festival? Is that fake news or real news?
Again, it’s real news.
Last but not least, what if someone told you that former President Barack Obama had revealed to the media that he “distrusts white people” because they lie so much? Is that fake or real news?
THAT is actually fake news, but in case you thought it was real, don’t feel bad!
As we enter the age where Leftism, having gained supremacy fifty years ago and failed in all of its promises, prepares to pass on into the dust-bin of history, it makes sense to understand what Leftism is.
On this site, we treat politics as a series of philosophies. Philosophies are explanations of how the world works and what we should do about it. At the core, each philosophy possesses a basic statement which summarizes its approach, and this is why they are distinctive.
It has become common — and that word never means anything good — for people to bloviate about how they are “neither Left or Right,” which forgets that these two things are distinct philosophies, and like many things at a basic level, indicate a necessary fork in the road of human thinking.
Very few realize that the Right is our continuation of what was there before Leftism, and that while it has been misinterpreted and linguistically slaughtered like everything else in our declining society, its basic philosophy still stands: conserve the best of the past while aiming for inner excellence.
Even fewer understand Leftism. What is Leftism? An encyclopedia provides us the roots of Leftist philosophy:
Left: In politics, the portion of the political spectrum associated in general with egalitarianism and popular or state control of the major institutions of political and economic life.
Now we can see the basics of the philosophy: it is egalitarianism plus the idea that the State should enforce it. Continuing our exploration, we ask, “What is Egalitarianism?” Fortunately a specialized encyclopedia of philosophy provides an explanation of egalitarianism:
Egalitarians think, firstly, that unfair life prospects should be equalized. Secondly, that equality is the most or one of the most important irreducible intrinsic or constitutive worth(s) of justice. Thirdly, that welfare should be increased. Fourthly, that justice is comparative. Fifthly, that inequalities are just when otherwise advantages are destroyed in the name of justice. Lastly, that there are certain absolute humanitarian principles like autonomy, freedom or human dignity.
The suffix “ism” tends to mean a philosophy that advocates using its root term as a means of solving problems and leading the best possible life. For that reason, elitism means those who advocate choosing the elite or quality over quantity; socialism denotes using socialized means of production; egalitarianism indicates those who want to use equality as a universal tool for fixing and enhancing society.
In that definition, we have every aspect of modern Leftism. They want to create a Utopia through progress toward equality. They think this should be done by taking from the successful and giving to the unsuccessful. They believe in using the State to do this through Civil Rights programs.
Through that understanding, we can see that Leftists — liberals, communists, marxists, socialists, anarchists, libertarians — are all degrees of the same thing, namely the idea of equality being both a goal and a method of achieving the best possible civilization and lives, though they are unique in seeing a “perfect” Utopia as possible.
Let us then revisit the historical portion of the definition of Leftism from above:
The term dates from the 1790s, when in the French revolutionary parliament the socialist representatives sat to the presiding officer’s left. Leftists tend to be hostile to the interests of traditional elites, including the wealthy and members of the aristocracy, and to favour the interests of the working class (see proletariat). They tend to regard social welfare as the most important goal of government. Socialism is the standard leftist ideology in most countries of the world; communism is a more radical leftist ideology.
In this we see how egalitarianism translates into reality: since we cannot make the unsuccessful more competent, we must penalize the successful, and have a strong gangster-style government to take their wealth and give it to the less competent. This creates a Darwinian death spiral but transfers power to the Leftist Regime.
Leftism consists of several sub-philosophies, all of which share a common goal of Utopia through progress of equality, which means that all Leftist philosophies are essentially the same, differing only in degree. On the mild side of Leftism, liberalism, libertarianism, and classical liberalism hide their real goal:
Liberalism, political doctrine that takes protecting and enhancing the freedom of the individual to be the central problem of politics. Liberals typically believe that government is necessary to protect individuals from being harmed by others, but they also recognize that government itself can pose a threat to liberty.
…Liberalism is derived from two related features of Western culture. The first is the West’s preoccupation with individuality, as compared to the emphasis in other civilizations on status, caste, and tradition. Throughout much of history, the individual has been submerged in and subordinate to his clan, tribe, ethnic group, or kingdom. Liberalism is the culmination of developments in Western society that produced a sense of the importance of human individuality, a liberation of the individual from complete subservience to the group, and a relaxation of the tight hold of custom, law, and authority. In this respect, liberalism stands for the emancipation of the individual. See also individualism.
Liberalism also derives from the practice of adversariality in European political and economic life, a process in which institutionalized competition — such as the competition between different political parties in electoral contests, between prosecution and defense in adversary procedure, or between different producers in a market economy (see monopoly and competition) — generates a dynamic social order. Adversarial systems have always been precarious, however, and it took a long time for the belief in adversariality to emerge from the more traditional view, traceable at least to Plato, that the state should be an organic structure, like a beehive, in which the different social classes cooperate by performing distinct yet complementary roles.
Individualism creates egalitarianism because no individual wants to be left behind or restricted in what they can do. As a result, they demand a utilitarian solution: everyone does whatever they want — small exceptions are made for crimes and blatant antisocial behavior — and decisions are made by choosing whatever is most popular.
This comes from the notion of the moral worth of the individual in individualism:
Individualism, political and social philosophy that emphasizes the moral worth of the individual.
If the individual has moral worth, then all individuals must be included and their choices supported, which naturally prohibits the type of cooperation necessary to create civilization. Individualism expresses itself through “rights” by which an individual can reject the need to uphold social standards, customs, and principles.
Although it was called by different terms, individualism arose from the Renaissance, in which “man is the measure of all things” became a replacement for classical ideas of social order. Instead of designing civilization as a structure, it was conceived as a container for individuals which sought to facilitate their desires.
This inverts social order. Instead of having standards and rewarding those who meet them, we make people the standard, and assume that they can be motivated with external carrot/stick combinations like money and the threat of not having money. Over time this breaks down, and so societies turn toward socialism in order to keep their ideology intact.
We fight a war of ideas. The West adopted individualism, then egalitarianism, and implemented them in Leftism because as the most successful society on Earth, it had the wealth and power to take on a crazy notion and not have it fail immediately. Over the past centuries and especially past fifty years however, we have seen that it fails anyway.
For us to displace Leftism from the West, and nothing else will save us, we must get to the root of this dysfunction and remove the moldy old Renaissance™ and Enlightenment™ notions of equality from our thinking. This requires that we get over ourselves, but we have surmounted greater challenges in the past.
Talk of higher education reform tends to focus, understandably enough, on the cost of college. After all, steady tuition increases, rising student debt, and eye-popping sticker prices at well-known colleges and universities leave too many students and parents wondering if college is out of reach.
For all this healthy attention as to whether students can afford to go to college, however, we’ve too often lost sight of an equally crucial question — whether they’ll actually earn a degree once they’re there. The disheartening reality is that far too many students invest scarce time and money in attending a college from which they never graduate, and frequently wind up worse off than if they’d simply foregone college altogether.
In 2016, more than 40 percent of all students who started at a four-year college six years earlier had not yet earned a degree. Odds are that most of those students never will. In real terms, this means that nearly two million students who begin college each year will drop out before earning a diploma.
Indeed, according to our research, there are more than 600 four-year colleges where less than a third of students will graduate within six years of arriving on campus. When we look at public two-year colleges, most of which are community colleges, the graduation rate for full-time, first-time students is even lower. Only about 26 percent of students at those schools will have completed their degree within three years.
These dismal completion rates create significant private and societal costs. For individual students, the costs come in the form of student debt, lost time, and lower expected earnings (median annual earnings for students who complete a bachelor’s degree are $15,000 higher than for those who attended college but didn’t earn a degree). For society, the costs show up in forgone tax revenue and wasted public subsidies. In aggregate, some estimate that the total private and public costs of non-completion impose a half a trillion dollar drag on the economy.
In seeking to respond to these challenges, education scholars at the American Enterprise Institute and Third Way have joined together to commission a series of studies by five experts laying out the challenges of non-completion and the urgency for families, educators, and policymakers to take action to address it. (You can find those papers here.)
Now, we do well to heed the risks that a narrow focus on college completion can invite — especially when such an emphasis starts to shape the incentives and strictures of public policy.
As we have seen in K–12, it is all too possible for simple metrics to yield gamesmanship, corner cutting, or manipulation. We are all too familiar with colleges that are content to churn out watered-down degrees with little labor market value, or that take care to admit only the most academically prepared students — leaving someone else to serve others for whom the path to completion will be more difficult. Obviously, measures that encourage colleges to “game the system” are a step in the wrong direction.
Thus, reforms intended to incentivize or improve completion rates need to be designed with scrupulous attention to potential consequences and due regard for the full range of outcomes that matter to taxpayers and students.
That said, there are examples of intriguing programs at the state and college-level that merit careful attention. Thirty-two states currently use performance-based funding policies that award a larger share of public subsidies to colleges that deliver impressive performance metrics. While the overall success of these policies is still up for debate, what’s clear is that states like Indiana, Ohio, and Tennessee are using these policies to gently prod colleges to focus on their students’ outcomes. In such states, some higher education institutions have modified their advising, counseling, and academic services to prioritize retention and completion.
Approached with care and appropriate attention to possible perverse incentives, performance-based funding is one way to encourage colleges to put more emphasis on supporting the students they enroll.
At the campus level, it’s vital to note that low-cost, quick-fix programs are predictably hard to come by. While there are no silver bullets, we know that higher education providers are already making hundreds of decisions that impact students’ experience and motivation in a way that makes it more or less likely they will succeed.
For example, Georgia State University issues automatic completion grants to college-level juniors and seniors with unmet financial need. On average, these grants are about $900 each, and they help students overcome the stumbling blocks that can be posed by expenses like heating bills and textbook costs. In 2016, nearly 2,000 students received completion grants, with GSU reporting that 61 percent of seniors who received one graduated within two semesters. Programs like these illustrate what colleges can do to help students graduate, without compromising standards or lowering the bar for college completion.
Even in these polarized times, we can agree that college students should complete their degrees and that taxpayers should get repaid for the funds they make available through student loans. We have the opportunity to seek solutions that focus not only on whether students can afford to arrive on campus, but on whether those students willing to do the work will leave with the education and the credential they came for. Left or right, that’s a cause we can all embrace.
Most discussions of federal subsidies for higher education focus on student aid programs such as Pell Grants and student loans. Another category of subsidies costs the federal government over $40 billion per year, but receives much less attention. The federal tax code is riddled with many credits, deductions, and exclusions that benefit the education industry. However, these tax expenditures generally aren’t counted as normal government spending.
Yet since carveouts in the tax code represent foregone revenue for the federal government, they have the same effect on the deficit as a traditional spending program of the same size. The Committee for a Responsible Federal Budget notes that if tax expenditures were counted as normal spending, they would consume 28% of the federal budget.
New estimates from the Joint Committee on Taxation show that tax expenditures for education cost a combined $47 billion in 2017. The estimates also reveal how last year’s Republican-backed tax reform bill, the Tax Cuts and Jobs Act, influenced these indirect subsidies for education. Most education-related tax expenditures were relatively unaffected by the law, though there are notable exceptions.
Tuition tax credits represent the largest single tax break for education. These reimburse households for college tuition costs of up to $2,500 per year. The federal government spent $19 billion on these credits in 2017, accounting for 45% of all education-related tax expenditures. For comparison, the federal government spends $27 billion annually on Pell Grants, the main student aid grant program.
The deduction for charitable contributions to educational institutions amounts to the second-largest tax expenditure for education. The “charitable deduction” was also the largest education-related expenditure affected by the Tax Cuts and Jobs Act. While the deduction itself remained mostly untouched after tax reform, other statutory changes will cause the cost of this tax break for schools and colleges to drop from $10.5 billion in 2017 to $8.7 billion in 2021.
Specifically, Congress cut marginal tax rates, which reduces the value of itemized deductions. Additionally, the tax law nearly doubled the standard deduction, to $12,000 for single filers and $24,000 for joint filers. This was a major tax cut for low- and middle-income households, as claiming the standard deduction is more common among these groups.
However, in order to claim the break for charitable contributions, tax filers must forego the standard deduction and instead itemize their deductions. With a larger, more enticing standard deduction, fewer taxpayers will opt to itemize, and so fewer will claim the charitable deduction. Many of these people will still donate to their favorite colleges and universities; they’ll simply no longer get to claim a tax break for it.
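The mechanism described above can be sketched in a few lines. This is an illustrative simplification, not tax advice: the standard deduction figures are the statutory 2018 amounts cited in the text, while the filer's itemizable amounts in the example are hypothetical.

```python
# Why a larger standard deduction means fewer filers claim the
# charitable deduction: a filer only benefits from itemizing (and thus
# from deducting gifts) if total itemized deductions exceed the
# standard deduction for their filing status.

STANDARD_DEDUCTION = {"single": 12_000, "joint": 24_000}  # 2018 amounts

def benefits_from_charitable_deduction(filing_status: str,
                                       mortgage_interest: float,
                                       state_local_taxes: float,
                                       charitable_gifts: float) -> bool:
    """Return True if itemizing (including gifts) beats the standard deduction."""
    itemized = mortgage_interest + state_local_taxes + charitable_gifts
    return itemized > STANDARD_DEDUCTION[filing_status]

# A hypothetical joint filer with $8,000 of mortgage interest, $10,000 of
# state and local taxes, and $3,000 of gifts itemizes $21,000 in total --
# below the new $24,000 standard deduction, so the gifts produce no
# additional tax benefit.
print(benefits_from_charitable_deduction("joint", 8_000, 10_000, 3_000))   # False
# The same amounts would clear a single filer's $12,000 threshold.
print(benefits_from_charitable_deduction("single", 8_000, 10_000, 3_000))  # True
```

The point of the sketch is that the tax law did not touch the charitable deduction itself; it simply raised the bar a filer must clear before itemizing pays off at all.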
Early drafts of the tax law proposed stripping away specialized tax expenditures and using the money to lower tax rates for everyone. Proponents of reform argued that a tax code with lower marginal rates and fewer carveouts would spur faster economic growth than a code with higher rates but more carveouts. Lower marginal rates drive individuals and businesses to invest in projects with high economic dividends, but tax expenditures divert resources to politically popular arenas that may deliver less of a boost to growth.
Under those guiding principles, the tax bill’s authors originally proposed eliminating many education-related tax breaks. These included one of the tuition tax credits, the student loan interest deduction, and tax-exempt bonds for private educational institutions. Yielding to political expediency, however, the final draft of the bill retained most of those tax expenditures while also lowering marginal rates. As a result, under tax reform these carveouts will stay roughly as expensive as they were beforehand. (Meanwhile, the federal budget deficit approaches $1 trillion.)
Two other major tax expenditures—the exclusion of scholarships and fellowships from taxable income and tax preferences for 529 college savings plans—will become more expensive over the next several years. However, the costs of these provisions were already slated to rise before Congress passed tax reform.
Most tax breaks for higher education made it through the Tax Cuts and Jobs Act unscathed. But as Congress grapples with higher deficits over the next several years, it will become more difficult to ignore $40 billion in hidden subsidies for the education industry. Politicians who are serious about reining in government spending should keep tax expenditures on the table.
The president was intent on a public patriotic celebration. It was important, he said, to “give significant expression to our thoughtful love of America.” He marched at the head of a parade past tens of thousands of cheering citizens who lined Pennsylvania Avenue from the Capitol to the White House. Later that day the president traveled to the Washington Monument, where he gave a speech that denounced an unnamed ethnic group. He said that it “must absolutely be crushed” before it could further subvert America and its influence abroad. He urged Americans to make it clear that “loyalty to this flag is the first test of tolerance in the United States” and demanded that his political party pass a plank making it clear that anyone who was a real American would agree with his course of action abroad and at home.
Thus Woodrow Wilson kicked off his campaign for reelection in June 1916. “Together, the speech and the plank,” writes Patricia O’Toole in her excellent new biography The Moralist, “proposed to abolish the Constitution’s guarantees of free expression and free assembly. Equally startling was the fact that no one in the mainstream press protested the demagoguery.”
Nearly a year later, after America entered World War I, Wilson went on to establish the Committee on Public Information, whose missions were censorship and propaganda to help persuade Americans that they faced an evil empire in the form of Wilhelmine Germany. Meanwhile, Wilson’s attorney general, Thomas Watt Gregory, locked up suspected spies and subversives. Gregory also oversaw a voluntary American Protective League. To make the world safe for democratic liberation, Wilson was prepared to suspend liberty: “If there should be disloyalty,” he declared, “it will be dealt with with a firm hand of stern repression.”
Like not a few past national American leaders, Woodrow Wilson has come under closer inspection for his flaws in the past few years, including at his beloved Princeton University, where the Woodrow Wilson School has wrestled with his legacy on, among other things, race relations. A 10-member committee examined the matter of renaming the school. It decided that his views “clearly contradict with the values we hold today,” but concluded that expunging his name was not a good idea. There is also a Woodrow Wilson High School in the nation’s capital, but for now there does not appear to be a movement to alter its designation.
For many years, it was conservatives who dinged Wilson. Perhaps his most virulent detractor was H.L. Mencken, the sage of Baltimore. He took an almost lascivious pleasure in dismantling Wilson, referring to him as Moses and scorning his oratory for “its ideational hollowness, its ludicrous strutting and bombast, its heavy dependence upon greasy and meaningless words, its frequent descent to mere sound and fury, signifying nothing.” But later conservatives, such as Richard M. Nixon, admired him, and George W. Bush sounded very much like Wilson in his second inaugural address, when he proclaimed, “It is the policy of the United States to seek and support the growth of democratic movements and institutions in every nation and culture, with the ultimate goal of ending tyranny in our world.”
What to make of Wilson’s fascinating and influential presidency and life? O’Toole offers a fair-minded portrait of a vain moralist and political visionary whose certitude could exceed his judgment. She chronicles Wilson’s rise from the presidency of Princeton to becoming governor of New Jersey to president in the quadruple-contested election of 1912. Absent the Bull Moose candidacy of Theodore Roosevelt, which crippled his erstwhile protege William Howard Taft, Wilson would probably never have become president. After he won, however, the rectitudinous Wilson was able to give full vent to his crusading impulses in both domestic and foreign policy.
Thomas Woodrow Wilson was born on December 28, 1856, and grew up in Augusta, Georgia. His father, Joseph, a Presbyterian minister, carefully instructed his son in grammar and syntax, and Woodrow later referred to his father as his greatest teacher. “From his father,” writes O’Toole, “Tom had learned that great oratory was closely reasoned and deeply felt as well as pleasing to the ear.” At Princeton he studied history and philosophy, ransacking the past for lessons about the present. He also had blank calling cards upon which he inscribed, “Thomas Woodrow Wilson, Senator from Virginia.” At Princeton he scored an early coup in his senior year when he successfully submitted an essay to the International Review, which was edited by Henry Cabot Lodge. The piece called for cabinet government in the United States, a sign of his desire to import elements of the British parliamentary system into American politics.
Wilson had a heroic conception of politics that looked with disdain upon the grubby pols cutting backroom deals. He wanted men of influence to set the terms of debate, much as he believed they did in Great Britain, where Gladstone and Disraeli vied to climb the greasy pole. In 1885, Wilson published a book called Congressional Government in which he expounded upon his belief that the power of the presidency had become emasculated by an overly powerful Congress. In 1899, after America made its first stab at establishing its own empire, Wilson declared that strong presidents were imperative: “When foreign affairs play a prominent part in the politics and policy of a nation,” he wrote, “the Executive must of necessity be its guide…”
Armed with a Ph.D. from Johns Hopkins, Wilson ended up returning to Princeton, where he became its president in 1902. Now he sought to create his own little empire. He hired 50 young preceptors who, like the tutors of Oxford and Cambridge, were supposed to elevate standards at Princeton, and he embarked upon a grand plan to build quadrangles, akin to those at Oxford and Cambridge, that would sideline the famously snobbish eating clubs such as Tiger Inn and Cap and Gown. He wanted to create a new sense of academic community. The alumni revolted. Wilson, never willing to compromise in any way, lost the battle.
His next crusade was in 1909 over the location of a long-delayed building for Princeton’s graduate school. Andrew West, the dean of graduate studies, thought it should be built near the university golf course. Wilson disagreed. He wanted it located centrally amid the hustle and bustle of the undergraduates. For Wilson, who could not bear to see his precious Princeton sullied by any vision other than his own, the stakes could not have been higher. He embarked upon a ferocious battle. But the affable West, who enjoyed close relations with the wealthy alumni who were going to fund the project, won out.
With his Princeton career at a dead-end, Wilson looked for an exit. In 1910, with the backing of the Democratic machine, he capitalized on his national reputation as a reformer to run for governor of New Jersey. Wilson easily won and was quickly viewed as presidential timber. After only four months as governor, O’Toole reports, Wilson embarked on a 9,000-mile speaking tour to the West Coast and back. His message about what ailed America was as clear as it was direct: “the control of our politics, therefore our life, by great bodies of accumulated and organized wealth.” In 1912 he rode the wave of progressive indignation against the trusts to capture the Democratic nomination. The New Freedom was his credo, as against TR’s New Nationalism. O’Toole perceptively notes that Wilson performed best on a stage. “He had studied elocution as diligently as any actor,” she writes, “and without seeming to raise his voice could make himself heard by a crowd of fifteen thousand even in a hall with poor acoustics. A genius at the harmonics of political speech, he could easily work idealism and self-interest into the same chord, and he had the rare ability to stir emotion even as he appealed to reason.” On the personal level, however, he always had trouble. The journalist William Allen White recalled that when he first met Wilson, “the hand he gave me to shake felt like a ten-cent pickled mackerel in brown paper.”
Wilson approached the presidency, in many ways, as an amplification of his previous duties at Princeton. Once more, he would be the great reformer. But Congress would prove to be his trustees, reining in his grand ambitions. From the outset, Wilson believed that he could pursue a morally correct foreign policy that would set wrongs aright. He told a British envoy, “I am going to teach the South American Republics to elect good men.” His great antagonist Henry Cabot Lodge, however, saw from the outset that Wilson was “extraordinarily green” when it came to dealing with foreign nations.
When it came to entry into World War I, Wilson temporized. Eventually, the British naval blockade of Germany prompted Kaiser Wilhelm to authorize unrestricted submarine warfare. O’Toole deftly recounts the complicated diplomatic maneuvering that Wilson engaged in to try to avoid becoming entangled directly in the Great War. Even as Wilson’s ambassador to the Court of St. James, Walter Hines Page, pushed for intervention, Colonel Edward M. House sought to serve as a kind of honest broker between the British and Germans to effect an end to the hostilities. O’Toole is not much impressed by House’s performance, which she depicts as consisting of naïve diplomatic blunders. But House did also create The Inquiry, a group of scholars led by Walter Lippmann, that attempted to prepare the administration for the postwar negotiations and became the basis for the Council on Foreign Relations.
Throughout, Wilson’s pacific aims could not have been more ambitious. Wilson may have intervened militarily in Veracruz in 1914, in Haiti in 1915, and in the Dominican Republic in 1916, but he always saw himself as a man of peace. In May 1916, he delivered a speech in Washington that sought to reorient American foreign policy and prefigured much of his later diplomacy. He called for a community of nations and for collective security. “Most radical of all,” O’Toole writes, “was his abandonment of isolationism, the first principle of U.S. foreign policy.” In April 1917, after Russian Tsar Nicholas II was toppled from power, one of Wilson’s last remaining objections to entry into World War I was removed. Now he could fight for democracy with democracies.
But as hubris is usually followed by nemesis, so Wilson found himself checked by November 1918, when the Democrats suffered a crushing loss in the midterm congressional elections. After he returned from France and the protracted peace negotiations, Wilson was wholly convinced that it was imperative to create a League of Nations and that the covenant had to be ratified in toto, including Article X which stated that “The Members of the League undertake to respect and preserve as against external aggression the territorial integrity and existing political independence of all Members of the League.”
This Henry Cabot Lodge and other Republicans would not endorse. Lodge did not attack the treaty frontally but wanted to load it down with reservations that he reckoned Wilson would not accept. When the French ambassador assured Wilson that it would not pose a problem for France, Wilson responded that he would “consent to nothing. The Senate must take its medicine.” It was the battle of the quads all over again. Wilson, refractory and pedantic, simply could not even contemplate a compromise. He had banished House in 1919 and became ever more isolated and convinced of his own infallibility. He embarked upon a trip across the country to rouse support for the treaty but ended up wrecking his fragile health. He had succeeded not only in paralyzing his diplomacy but also himself.
Despite his incapacitation and long convalescence, which saw his young second wife, Edith Bolling Galt, essentially run the nation, Wilson flirted with the idea of trying to capture the nomination for an unprecedented third term. He saw himself as indispensable. Freedom and peace depended on another term. Party elders put the kibosh on that. The moment had arrived, as it were, for regime change in the Democratic Party and the doughty governor of Ohio, James M. Cox, got the nod. His running mate was the young Franklin D. Roosevelt, assistant secretary of the Navy. Cox campaigned for the League of Nations, but his Republican counterpart Warren G. Harding vowed that it was time to return to “normalcy.” Harding won.
Wilson took the election news serenely but remained as flinty as ever. When Attorney General A. Mitchell Palmer recommended in January 1921 that Wilson commute the sentence of the great socialist leader Eugene V. Debs, whom the administration had prosecuted for delivering an anti-war speech in Canton, Ohio, Wilson refused. “Wilson,” O’Toole writes, “knew that he would be denounced by champions of free speech but did not care.” To the last, Wilson, who died in 1924, remained unbending in his resolve to smite anyone who dared disagree with him. It was Harding who commuted Debs’s sentence in December 1921.
Jacob Heilbrunn is editor of The National Interest.
America is increasingly polarized.
That isn’t news to anyone who’s been following the social research of the past couple of years. After the 2016 presidential election, David Wasserman of FiveThirtyEight wrote that “America’s political fabric, geographically, is tearing apart,” and suggested this should be seen as a “flashing danger sign.” In Fractured Republic, which came out in May 2016, Yuval Levin wrote of a hollowed-out society in which mediating institutions and social capital had all but disappeared from American life, leaving in their wake a jaded individualism and growing political rancor.
But a new Pew Research Center poll suggests that this polarization—across geographic, cultural, and political lines—is growing even more pronounced with time. Our political differences are strengthening, with an increasing number of urban Americans moving further left and more than half of rural voters (54 percent) declaring their allegiance to the GOP. What’s more, most urban and rural Americans see themselves as judged and misunderstood by each other, with a majority from both groups saying those who don’t live in their types of communities have a negative view of those who do.
Urban and rural divides are not new, as University of Wisconsin political scientist Kathy Cramer told the New York Times. What’s unique about our moment, however, is that “cultural divides overlap with political divides, which overlap with geography,” creating a maelstrom of suspicion and disconnect.
This remarkable growth in polarization leads the Times to ask an important question: are we sorting ourselves, increasingly moving to fit in with those in our “camp”? If not, how and why are the numbers becoming so extreme?
Cramer, for her part, suggests that place-based resentment is becoming a sort of identity marker, especially as politicians employ “us versus them” rhetoric. Shopping at Whole Foods or going to the gun range have increasingly become political acts, talismans of personality and place with markedly partisan affiliations. Our sorting seems to have more to do with an increased tendency to tie cultural and social acts (as well as geographic identity) to politics than it does with a marked shift in our habits or moving patterns.
Alongside these differences, however, the Pew poll also shows remarkable (and somewhat alarming) similarities between urban and rural communities. Both groups are about equally worried over the impact of the opioid epidemic on their neighborhoods. Both are worried about job availability. Young people from both are more mobile and restless—although “Roughly a third (32%) of young adults in rural areas say they are very or mostly dissatisfied with life in their community; this is significantly higher than the share of young adults in suburban areas who say the same (21%).”
About four in 10 Americans across geographic divides say they don’t feel attached to their current communities. While knowing one’s neighbors, owning one’s house, and living in one place for a long period of time all increase the chances of community involvement and satisfaction, only three in 10 Americans say they know most or all of their neighbors—and a third say they would move away if they could. While a greater percentage of rural folks say they know their neighbors, that doesn’t mean they interact more often. Indeed, according to Pew, community involvement doesn’t vary much by community type: “Among those who know at least some of their neighbors, rural Americans are no more likely than their urban and suburban counterparts to say they interact with them on a regular basis.”
Obviously, these figures could be worse. Most Americans say they still know at least some of their neighbors; large numbers in urban, suburban, and rural communities say they remain close to—or have moved back towards—their families. But there’s still a marked sense of alienation, suspicion, and discontent displayed in this poll. Not only do disparate American communities suspect each other of unkindness and disrespect, many have retreated from neighborliness and association within their own circles.
These findings reminded me of the suggestion in Patrick Deneen’s recently released Why Liberalism Failed that the political ideology of liberalism drives us apart, making us more lonely and polarized than ever. As Christine Emba writes in her Washington Post review of Deneen’s book:
As liberalism has progressed, it has done so by ever more efficiently liberating each individual from “particular places, relationships, memberships, and even identities—unless they have been chosen, are worn lightly, and can be revised or abandoned at will.” In the process, it has scoured anything that could hold stable meaning and connection from our modern landscape—culture has been disintegrated, family bonds devalued, connections to the past cut off, an understanding of the common good all but disappeared.
That latter loss—of a common understanding of the good—seems particularly applicable to the Pew poll’s findings regarding polarization. Although our country has always struggled with an urban-rural divide, it could be that our lack of a common conception of the good has made it even worse. Left and Right subscribe to different liberal tenets that tear at association and community: on the Right, “classical liberalism celebrated the free market, which facilitated the radical expansion of choice,” while the Left’s liberalism “celebrated the civil right to personal choice and self-definition, along with the state that secured this right by enforcing the law.” As Emba notes, both forms of liberalism foster “a headlong and depersonalized pursuit of individual freedom and security that demands no concern for the wants and needs of others, or for society as a whole.”
Thus we disconnect in terms both broad and intimate, struggling to equate our political autonomy and self-definition with the demands of empathy, neighborliness, and service. The fact that our urban and rural communities are so suspicious of each other suggests a degree of navel-gazing and self-consciousness that is deeply detrimental, if not tempered by a proper degree of rationality and generosity.
Fixing these problems will require more than a distrust of our political leaders’ schismatic rhetoric, instrumental in entrenching our divide though that rhetoric has been. Turning to the state for answers or blame is one of the reasons we’re in trouble in the first place. A healthy effort to “plug in”—to connect at the local level, to dialogue with our political “enemies,” and to engage in civic and philanthropic efforts—may be the best way to cut back on some of this rancor and polarization.
In the conclusion of Why Liberalism Failed, Deneen suggests that we need to foster local “counter-anticultures”: bastions of community, civic engagement, philanthropy, and religion to counteract our cultural and social vacuum. Levin recommends something similar in Fractured Republic, turning to Edmund Burke’s “little platoons” and the reinvigoration of associational life as a balm for widespread fragmentation.
This newest Pew poll suggests that the more deeply we know each other—and the more time we spend together—the less lonely and restless we will feel. That isn’t a shocking revelation, but it’s an important one nonetheless. Those who feel nourished and cared for by their communities will feel less cheated by the state and more empowered to confront the changes and dilemmas in their neighborhoods. It may be that by itself this can’t bridge our deep urban-rural divide, considering how widespread our resentment and political differences are. But I do think a community that feels self-sufficient and nourished is less likely to harbor feelings of resentment and suspicion toward those outside its borders: there’s less temptation towards discontent, and often a deeper awareness of the issues we share in common. Philadelphia, Pennsylvania, and Elizabethville, Pennsylvania, need the same things: committed citizens, generous philanthropists, passionate civic leaders, savvy planners and political leaders, strong local institutions, and vigorous community involvement. They often struggle with the same things, too: loneliness, despair, unemployment, fragmented families, weak civic and educational institutions, a lack of funds, poor urban planning, and so on.
While our national discourse champions rancorous politics, local associations and news celebrate self-empowerment, service, and communal ties. They emphasize every community’s desire to become the best version of itself. The more we can focus on these things, the better.
Gracy Olmstead is a writer and journalist located outside Washington, D.C. She’s written for The American Conservative, The Week, National Review, The Federalist, and The Washington Times, among others.
One telling detail keeps escaping the men and women of words who would end school shootings by one expedient or another: gun control, better security, the arming of teachers, more careful vetting of potential gunmen and so forth.
The detail of which I speak: We didn’t use to endure this horror. It didn’t happen.
The urgent question that flows from this detail: Why not?
Well, to start with, because things were different, prior to the shooting fests, which break so many hearts and generate so much despair.
Right, yes — but different in what way?
I will take a crack at this: Our culture (as we have come to call the circumstances of daily life) was cooler, calmer, less emotional, more orderly than it has become since then — which is not the same as saying pre-massacre culture (what a term) was cool, calm, and unemotional. It was not. Those personally familiar with that culture know better, I hope, than to indulge in nose-honkings over the joys of the past.
Still, massacres, explosions of personal rage, were rare and generally connected with mental disorder, as in the case of Howard Unruh, the World War II vet who went wild in New Jersey in 1949, gunning down people on and off the street, including a barber and his 6-year-old customer. There were guns enough out there, no doubt; nevertheless, few thought of using them in today’s ghastly, almost customary, way.
We didn’t use to endure this horror. It didn’t happen (or, save for Howard Unruh, hardly ever).
I am still taking a crack at this thing, with no more deleterious effect, I hope, than would flow from an attack on the Second Amendment. I submit that the factor at which we should look for explanation is social control: its widespread presence in pre-massacre time and its absence in the present day.
I do not mean that the secret police ran life back then. I mean institutions did, more or less, and with a touch far lighter and more helpful, in most cases, than today’s advocates of liberation would admit under coaxing from a liberally applied cat o’ nine tails. Whee, we’re free! So goes the general apologia for the removal of rules and guidelines of all kinds.
Free we are, or there wouldn’t have been much point to America. Yet Americans, according to the manner of their (generally) British culture, acknowledged not just opportunities but obligations. Institutions took these obligations, and their (normally) gentle enforcement, with great seriousness and sense of duty.
Mothers and fathers were supposed to impart to children a sense of… well, plain old decent behavior would likely cover it. Churches posited their own senses of duty and right belief — often overlapping the teachings of parents. Schools, as virtually anybody who attended one in the pre-massacre era can testify, necessarily exerted forms of control. If they hadn’t, no teaching would have taken place.
Was it all done perfectly? Who’d make such a ridiculous claim as that? Of course it wasn’t done perfectly. Sometimes it was done wretchedly.
But we didn’t use to endure the horror of these massacres. People didn’t fear taking their children to school. Now they do.
The real horror of the matter is the hand-waving futility the massacre debate engenders. No one can believe, with any depth of conviction, that tighter gun control laws would make life as safe as a public library story hour.
The rebuilding and refitting of our weakened institutions, public and private, is the only path toward peace. But how to bring that about? Through change in beliefs and commitments: which is where the heavy lifting begins, as old formulas for human flourishing (e.g., the indispensability of the two-parent family) are reinserted into the common life. Or, through human folly, not reinserted.
The fact is that too few acknowledge the unmatched power of benevolent institutions to shape character, maintain the general peace, and impart dignity to human life — as well as keep it safe and free. But they do. Or rather, they did: here, there — yes, and in Santa Fe, Texas.
William Murchison is writing a book on moral restoration.
COPYRIGHT 2018 CREATORS.COM