I recently read a book that mentioned, as a side note, how Christians can "change our culture." I am not sure that is something we should actually be doing.
As I understand it, there will always be a sinful culture (Eph. 2:2). There is nothing new under the sun. Unbelievers will live, act, think, and believe like unbelievers. Thankfully, Christ has created a culture within His body, the Church (Rom. 8:4, Gal. 5:16). We believers coexist with unbelievers, yet we are diametrically opposed to them.
I also read a (different) book about Allied POWs in WWII. While captive, they were able to exercise, keep rank, communicate with family, and even study for a college degree. They were very nearly living a normal life, except for the guards, guns, and prison walls. In the same way, we cannot make the prison camp of sin a more Christian existence; we must rescue the prisoners. We cannot make the prison as close to a Christian society as possible; we must bring them into the real Christian society, the Body of Christ. As the city on a hill (Matt. 5:14-16), we are drawing those who live in the darkness into the light, not merely making the darkness a little less dark.
Granted, the existence of the Body of Christ in this world will have positive effects, such as hospitals, charities, universities, etc. But that is not our aim. It is perhaps this faulty philosophy that encourages Christianized pagans to perceive themselves as Christians. I do not have to quote statistics to bring to mind the plethora of Americans who think they are Christians because they grew up in a "Christian nation," a Christian family, or a Christian church. The Gospel is what they need.
1 comment:
I agree. I think Jesus just assumed that culture was going to be bad. It's almost postmillennial to think we will change the culture into a God-honoring one!