While reading a recent thread ("If God Could Just Show Himself..."), I was thinking of the basic idea of God "showing Himself" to those who lack faith or belief in God as a way of proving that God exists.
Most religions I'm aware of, though, tend to claim that God has already shown Himself, as recorded in Scripture. But if the whole idea is for people to have faith in the unseen, why would God bother to announce His presence in the first place? Why not just put people on Earth and see what happens? If this is supposed to be some kind of test of humanity's character, isn't God skewing the results by announcing His existence?
Moreover, why would God actively interfere and meddle in the affairs of humans - and then turn around and presume to judge human society, as if He had nothing to do with how it turned out?