By Ken Ham
In a new study of 4,700 American adults, the Pew Research Center found that 80% of Americans believe in God. But what (or who) do they mean by “God”? Well, for years now I’ve been telling people that when you say “God” to this secularized culture, you can’t just assume they understand that you mean the God of the Bible. Today, you actually have to define what you mean by the word “God.” And this new research bears that out.
While 80% of Americans say they believe in God, when they were asked further questions, 23% said they believe …read more
Source: Ken Ham AIG