Khale
Active Member
Maybe I am the only one who thinks this, but why does either camp (science or religion) care so much about the beginning of the earth?
For religion, it doesn't particularly matter whether or not your deity or deities created the earth. The message stays the same. Plus, how many texts actually state that you must believe your deity created the earth? The majority of religions I know of simply say that you need to believe in the deity itself, if that; the rest is just icing on the cake.
For science, it doesn't particularly matter what the religion camp believes. If you, after years of trying, finally convince one of them that you are right, what have you changed in their life? Nothing. They now know a single, fairly useless fact that will only help them out in preschool Jeopardy. (There may be some other uses, but this is an angry rant!)
In conclusion, please explain to me why it matters.