The Deliberate Deterioration of American Values (Commentary)

In hindsight, I feel very lucky to have grown up in the early 1990s, when traditional American values were still widely held and pride in the country was more the norm than the exception. I feel obligated to say this after reviewing a new poll conducted by The Wall Street Journal, which grabbed my attention for all of the wrong reasons.

In short, the poll tracks Americans’ opinions on a variety of topics—from the economy to religion—and how much those opinions have changed over the past 25 years.

For instance, in 1998, when the poll was first conducted, 70 percent of Americans said patriotism was “very important.” Today, that number has plummeted to 38 percent.

Unfortunately, this disturbing trend applies to several other societal matters. For example, in 1998, 62 percent of Americans said religion was “very important.” Now, that stands at just 39 percent.

Likewise, the percentage of Americans who said having children was “very important” declined from 59 percent in 1998 to 30 percent today. As one would expect, Americans are also placing less importance on “community involvement,” “belief in God,” and “marriage.”

On the other hand, one of the few topics Americans deemed more important in 2023 than they did in 1998 was “money.”

In other words, over the past 25 years, which is not that long when you think about it, the traditional American values that shaped the fabric of this nation have been in steep decline.

So, what could be driving this abrupt inversion of the very values that made America what it is, over such a short period of time?

Although no single factor is causing the trend, I think it is reasonable to speculate that much of this newfound antipathy towards time-honored American values stems from the left’s coordinated attack via Hollywood, academia, the mainstream media, social media, and a host of other institutions, an attack intended to achieve one simple goal: make Americans believe our country is not exceptional so that the nation can be transformed from what it was built upon into what they want it to become.

Some of the strongest evidence for this assertion comes from the poll itself, which shows that on almost all of the topics outlined above, there exists a huge gap between the beliefs of Republicans and Democrats. As you would probably guess, Republicans are much more likely than their Democratic counterparts to deem patriotism, belief in God, and having children “very important.”

However, there is a much larger story at work that has precipitated this hostility towards American values.

First and foremost, in recent decades, our public education system has launched an unrelenting attack on our nation’s history and core values. When I was a kid in school, this was not the case.

Yet, as soon as I embarked on a career in public education more than a decade ago, I experienced firsthand how our public education system undermines our nation’s principles.

While attending a well-known teachers college in Chicago, where I was pursuing my master’s degree in secondary education, I was absolutely shocked at the blatant left-wing, anti-American propaganda spouted by almost every professor in almost every class.

At first, I was slightly annoyed because I hadn’t enrolled in an expensive degree program to be brainwashed with leftist rhetoric. But then I realized that the point of the program was not to produce well-trained teachers who would challenge their students to think critically and appreciate our nation’s unique role in world history. Instead, it seemed very apparent to me (and to a few other graduate students in my cohort) that the entire goal of the two-year program was to turn us future educators into left-wing ideologues who would become social justice warriors in the classroom.

Needless to say, after graduating and teaching U.S. history for several years in public schools in Illinois and South Carolina, I grew weary of the near-obsession with the “America is a terrible country” mindset that was so prevalent among far too many of my teaching colleagues.

Yes, this is an anecdotal account. But, believe me, our current public education system is not concerned with teaching kids how to think; it is almost wholly concerned with teaching them what to think. And it goes without saying that, over the past 20 years or so, this approach has produced a new generation of Americans who not only don’t understand our nation’s history but have been led to believe that America’s founding principles and values are treacherous.

Of course, the same basic strategy has been adopted by Hollywood, the mainstream media, social media, and many other powerful and prestigious institutions.

When I was a kid, Hollywood routinely made pro-America movies. Today, not so much. Likewise, when I was growing up, the mainstream media was relatively fair and unbiased, and far less prone to castigate the country over trivial matters. Today, not so much.

It seems the left’s long march through the institutions is paying massive dividends these days, given how the zeitgeist has turned on a dime.

However, there is still reason for optimism. In recent years, we’ve seen a school choice renaissance, which promises to deal a devastating blow to the public education indoctrination monopoly. And we’ve seen Twitter shift from an anti-free-speech platform on which the left had carte blanche to a more free-speech-friendly outlet where people can feel comfortable stating their opinions, even pro-America ones, without fear of retribution.

Throughout American history, the pendulum has swung back and forth. Sometimes, for whatever reason, anti-American sentiment takes hold (as it did in the 1960s). Yet, time and time again, we have witnessed the pendulum swing back in the other direction (as it did in the 1980s).

This is not to say that the anti-American values being espoused today will cease to exist in a few years. But it is a reminder that nothing is set in stone, and American decline is not a foregone conclusion. As many historians have argued, internal deterioration, not outside forces, will most likely lead to America’s downfall. If that is the case, which I believe it to be, we had better get our house in order, and we must do all we can to entrench the values and principles that made America the world’s beacon of freedom and opportunity in the first place.

Original Article: https://heartlanddailynews.com/2023/03/the-deliberate-deterioration-of-american-values-commentary/