The American West Is Shifting to the Left

The following article is Sponsored by The Centennial Institute at Colorado Christian University and written by Jeff Hunt.

The American West has always exuded a spirit of rugged individualism, united by the common bonds of hard work and dedication to faith, family, and freedom. However, that spirit is quickly disappearing as political ideologies shift to the left, threatening the very essence of what makes the West great.

The change began on the coast, with California, Oregon, and Washington – once the very definition of the great American West – falling to leftist ideologies. At first it seemed like a joke; we dismissed it as the “Left Coast.” Yet, in the 2020 Presidential Election, Democrats not only won the Left Coast, they also won Nevada, Arizona, New Mexico, and Colorado. Joe Biden beat Donald Trump by nearly 7 million votes in the West, and that margin holds even when you include Texas.
