The Church & America

From the very beginning, when the Pilgrims came to America, the church played a role. Most, if not all, of the Pilgrims had been persecuted for their faith. They saw America as their last chance to worship God as they believed the Bible taught.