I used to think that if NASA or some scientist somewhere would go ahead and find proof of life on other planets that maybe, just maybe, it might make us (here on Earth) think twice about whether or not we are so different from one another after all. Now... I don't think that way anymore. If alien creatures from the stars landed en masse all over this planet, it wouldn't stop the anger, the hatred, the killing... we humans would still be prejudiced against one another for one reason or another: color, sex, religion, ethnicity, whatever. If alien creatures did ever land on this planet, why, it would just give us another being to hold up as 'different' and therefore to be despised. Why do people need to hate other people? Why does what religion someone practices make them better or worse than their neighbors? I can read the Bible just as well as the priests and preachers, and I don't believe I have ever found anywhere therein that Jesus said we should despise or hate another being for any reason. But it seems to me that organized "Christian" churches teach nothing but hate. Homosexuality is a sin; women should be subservient to their husbands; if you don't take a particular church's word for how things are and how you are supposed to live, then you are damned. Go into almost any so-called Christian church and you will hear that they alone know what God means for us to do, and if we don't believe as they do then we are going straight to hell; do not pass go, do not collect $200, go directly to hell.
I am so disgusted by everything that's been going on. The war, this president going on with his little "I am the King" mentality, and Congress just going along with it. The newest thing that pissed me off was the CDC testimony before Congress. The White House diced, sliced and julienned the testimony that the CDC lady was supposed to give. I did not realize that they could do that. I thought that when someone or some department got called in to testify before Congress, they were supposed to tell the truth, the whole truth, etc. Not tell what the White House says they can tell and nothing beyond that. The whole system is so out of whack. I feel like I'm in a Twilight Zone episode. I have real bouts of actual vertigo sometimes just watching the news. Nothing seems real. And people seem to be fine with just going on about their lives and ignoring the situation.
And the Democrats are really pissing me off. They are just going on like it's all business as usual. And it's not! Nothing is 'as usual'. Why can't they see that? I don't think that a Democrat will win the White House in '08. They are doing all the wrong things and nothing right. And that damned Nancy Pelosi. She is really getting to me something fierce! If I have to see her stupid smile one more time I may actually throw something heavy at my television. (I need a new one, anyway.) She and Harry Reid have made a real dog's dinner of the whole thing. Well, I can't just lay all the blame on them, I guess. To be fair, I need to spread the blame around to the whole Democratic party. They are losing the '08 election and they all seem not to realize it. They are beginning to all look like grinning idiots. Pisses me off something awful!
And to think that the religious conservatives may well be the ones who bring a viable third party to our nation!? How incredible is that? Just what we need, a conservative Christian political party. Can you imagine what life would be like if that took over the government? No better than the Taliban, eh? The skinheads and the KKK and the neo-Nazis and others like them would just love that. Give them the right to burn crosses, lynch anyone who isn't a "true believer". Encouraging hate, fostering strife between the sexes, between races and ethnicities, against other forms of religion. Ugh! What a nightmare!
I'm pretty sure this isn't what my ancestors fought and died for.