Where is the Church?
Where is the Christian Church? It Appears Dead

Has the church lost its influence in America? I for one do not believe it has. Its influence may be somewhat misplaced, but that may be God's plan to prune some of the 'dead wood' from the tree. Several denominations have declared their progressive, politically correct liberal views. Many Christians today simply do not use the word "sin" or admit there is such a thing. The liberals have set the talking points, demonizing Christianity to the point that it is embarrassing to hear the stupidity pouring from their mouths, or in most cases written across a comment space after a story. If you don't believe that, just visit my e-mail or view the comment section of a Christian story. The loss of values in this nation is linked to, or is a result of, the assault on faith in Jesus. Liberals have provided a platform for all types of special interest (sin) groups...