Don’t believe leftist lies. American history IS good.
The left has fostered a culture that encourages Americans to despise this country by focusing exclusively on the darkest parts of American history and, in some cases, by distorting the narrative to portray our nation and her most significant historical figures in the worst possible light.