Originally Posted by feministmama
I haven't met any more Christian hippies. It seems that Newt Gingrich and some other folks using Christianity to discriminate against people gave it a bad name.
So my question for Christians is this: Do you feel Christianity has changed since the 70s? Has it lost something because of politicians using it for their own gain rather than as a means of love? Or am I asking a really dumb question?
I wanted to answer this question yesterday, but ran out of time.
I don't think Christianity has changed since the 70s, but I do believe that Christians have. We all have. The world isn't like it was back then; I think we're all a little more cynical now. We are much more likely to believe that someone is trying to take advantage of us than to believe that someone needs our help. And we will NOT be taken advantage of, will we?
Our culture tends to be more self-centered and egocentric. We're so afraid of stepping on anyone's toes that sometimes we don't want to be truthful. And then, when we get up the courage to be truthful, we often forget to do so in love.
I think the shift in churches today is simply a reflection of our society as a whole. Politicians and leaders have been using religions of all kinds to serve their own agendas since the beginning of time. Religions haven't changed much over thousands of years. We're just much more aware of it now, thanks to the media.
It makes us all a little more cynical. I don't view it as a breakdown of Christianity. I think it's a breakdown of humanity.
It's hard for me, because so often God and Jesus get blamed for the mistakes we make. I am not perfect. I yell at bad drivers. I speak without thinking sometimes. I get impatient.
I am not always good. God is. I'm not. It's sad that He gets blamed just because I am inept.