Is God Removing His Hand From America?

Posted by Charisma Media Staff

In the past, God raised America up as a prophetic vineyard bearing fruit for the gospel. But is He now removing His hand from America? Watch as Perry Stone answers.

