The U.S. Government disappoints me. I'm reading a book entitled "A Nation of Wusses: How America's Leaders Lost the Guts to Make Us Great," and it brings up many valid points. Most notably, it discusses the fact that politicians aren't really in it to perform their civic duty, but are instead in it for the money and publicity. Honestly, this disgusts me. What disgusts me even more is that Congressmen (and women) feel the need to dictate how our military operates, when 80% of them have never even served in the military.
That's also the problem with the Presidency. When you're elected President, the entire military might of the United States of America is at your fingertips. Considering that very few modern Presidents have served in the military, is it time for that to change? Most modern-day Presidents are career politicians who know nothing about the military or how it operates. Honestly, I'm up for reinstating the draft just to fix that.
Before we go blaming the President for every goddamn thing that's wrong with our country, maybe we should step back and look at Congress and the politicians playing partisan politics.
When I was discussing this with one of my friends (who would make a great congressman), both of us decided that should we ever run for public office, it would be as Independents.
There, Rant/Run-on Sentence/What's on my mind Complete.