When Did The American Government (and its lick-spittle minions) Ever Tell Us The Truth?
They have ALWAYS told us the truth.
They told us the truth up to and during WWII, but started lying afterwards.
They lied before WWII, started telling the truth during WWII, then started lying again.
They have ALWAYS lied to us.
See this poll on: https://poll.fm/7098479/embed