The title says it all. As much rhetoric as is thrown around about the United States being a democracy, it really isn't one in any direct sense. The only real decision-making most US citizens do is voting for the people who get to run the country. After that, nearly every government decision is out of our hands. In fact, the Constitution never uses the word "democracy" at all; Article IV, Section 4 guarantees the states "a Republican Form of Government," which is why people say we are a republic rather than a democracy.
The thing is, in a representative democracy voters are supposed to elect people who carry their wishes into the business of government. But I'm increasingly convinced that most politicians only take the actions that benefit them, economically and in terms of popularity. The truth is that the people of the United States hold very little real political power. So why do we bother pretending that we have a say in our nation's politics when we don't?