How useful this is I don't know, but rather than equating the 'ought' or the 'ought-not' to reality, I would instead equate it to our interpretation of reality. Present a dozen people with the same situation, and no two of them will interpret it the same way. In some cases the differences may be subtle; in others, stark.
So in your example, our mind takes cues from the baby being drop-kicked, and we then apply a layer to that based on our nature/nurture. Our morals are not reality, but subjective. Some things are almost universal, and amongst those would be protection of defenceless babies. That protection instinct is evolutionary in nature and hard-wired into our brain, and then reinforced for many by all sorts of mechanisms, amounting to a (basically) universal moral instinct which is nonetheless not a truth in reality, but subjective.
(meh...I'm taking a whole lot of shortcuts here, and just giving a chalk-outline of where my head's at with this.)
This is possible to do, but the goal of the OP is to assume that there is a moral truth that corresponds to actual reality somehow -- and to use that assumption to try to determine what it would be like, how it metaphysically works, and whether it's coherent or logically possible at all.
Shifting the correspondence to our perception of reality can produce a pragmatic, working theory of ethics; but it wouldn't help pin down what, if it existed at all, a "moral truth" would be.
To say "It's true that most people think they ought not to dropkick babies" is a proposition, a true one -- similar to how "Erin thinks green is the best color" is a true proposition.
But the statement this thread is trying to explore is "It's true that one ought not to dropkick babies" rather than "It's true that people think they ought not to dropkick babies."
Still, that's not to say you didn't have a good insight.
lewisnotmiller said:
Again, interesting. So, I'm thinking some morals which many hold could just as easily have a different value.
Let's say I am watching a man dousing a Bible in petrol and lighting it up. Depending on my background, and my interpretation of the action, and just as importantly my understanding of the motive behind the action, I could have a wide range of moral truths, all based on the reality of the Bible being burnt.
1) The man is morally bankrupt, and is working directly against the objective moral truth of my religion.
2) The man is misguided and I need to help him, as this is what my religion dictates.
3) The man is righteously demonstrating against hypocrisy and should be applauded for his moral stand.
4) Burning books, no matter the reason, is morally deficient, short-sighted, and unintelligent...
Etc...you get the idea, I'm sure.
Those may or may not be moral "truths," though -- here they are still basically subjective preference statements. Unless we identify some moral "fact" in reality for a moral "truth" to correspond to, you're just saying that you'd prefer people not to burn books, or prefer people not to offend people, etc.
lewisnotmiller said:
Makes sense...
As a quick sidenote, do you consider any parallels also in the fact that some colours are more commonly appealing, whilst others are commonly less appealing?
There's probably some sort of evolutionary, physiological, or cultural explanation for why we're more likely to have this preference over that preference. However, that still doesn't make our preferences true in and of themselves -- though we could make a truth statement
about them (e.g., "In America, men are less likely to prefer pink.")
lewisnotmiller said:
That would be a pretty fascinating study. Personally, I've changed my favourite colour several times over the years, mostly when I was a child, before settling on green. Understanding why would be difficult in some of those cases, but it might be a way of looking at preference without the inherent bias that I think affects moral testing.
Another interesting notion to bring up regarding that might be the question of doxastic voluntarism: do we choose our preferences (or our beliefs) freely, or do we just happen to have them (be it by chance, or fatalism, or determinism, or nature, or nurture, or whatever)?
For instance, if I told you I have a living dinosaur in my garage, and you understood me to mean an actual animal, not a toy or anything, could you, through sheer force of will, bring yourself to believe me just based on my saying so? Or would you be unable to help being skeptical?
The idea, if doxastic voluntarism is false, is that you don't decide your standard of evidence -- or your preferences, or your beliefs -- you just have them. And chances are my telling you I have a dinosaur in my garage wouldn't meet that standard, so no matter how hard you tried to truly believe me (just imagining that it's "possible" doesn't count; you'd have to TRULY believe me), it just wouldn't happen until some standard of evidence that you have no conscious control over is met.
If doxastic voluntarism is false, we probably don't pick our moral feelings, or our preferences, or our standards of evidence. We can of course investigate what possibly helped them to be what they are (nature vs. nurture, cultural influences, evolutionary history, whatever), but it seems like we probably had no conscious part in deciding what they are.
lewisnotmiller said:
Weight? I changed favourite colours when I was a child on a whim. A moral decision is not something I would change on a whim (no matter that it could be just as subjective). So I kinda return to my premise that for something to be moral, it needs to be affecting 'others', with the understanding that 'others' is a lot broader than just other people (or at least, it often is), and that the definition of 'others' would differ between individuals.
That's a good point -- yes, moral feelings tend to be held a lot more strongly than other preferences.
lewisnotmiller said:
I suspect yes, but what and how are pretty hard to answer. My thinking is that morals universally held would have a basis in biology. I would still consider them subjective, rather than objective moral truths, since I can't think of a case where any moral truth is absolute to the point of involuntary action, in the way that our heart beating is. But I think the 'reality' is our unconscious, biologically-driven response to certain things which has a major influence on the actions we (as humans) undertake in a given situation.
(Sorry for the length, the rambling, and the massive quoting of the OP. It was so many pages ago I thought it was important to give at least SOME context to my response.)
Agreed -- just because there may be a biological basis that makes some preferences more likely to be held than others doesn't make those preferences true. It just makes statements like "We're more likely to prefer x because of y" true, which settles nothing.