Let us start from the beginning. The materialists point to numerous correlations between physical brain states and behaviours to infer a causal relationship. They conclude that the brain generates consciousness.
The scientific assertion that brain states cause mental states is not derived merely from correlation, but from causation. Scientists have literally done experiments in which they have stimulated parts of the brain and observed the effect that stimulation has on consciousness. And those effects have been observed to be consistent and predictable, to the point that scientists have been able to map which parts of the brain are responsible for which aspects of conscious experience.
Suppose we observe that whenever there is a visible shaft of lightning in the air, there is a corresponding electrical discharge. We might be confident that the lightning and the electrical discharge are aspects of one and the same thing.
Such analogies cannot apply to the mind-body problem (M/BP) (Nagel 1974, 1986). There is no problem if we accept that two sets of empirical observations are "aspects" of the same thing, given a causal model that unifies them. But there is no such causal model in the case of the mind-body problem. For, unlike all other empirical observations, such as lightning/electricity (or water/H2O, heat/molecular-motion, life/biogenetic-function, matter/energy, etc.), in the special case of the M/BP, the correlated phenomena are not of the same KIND. That makes this particular set of "correlations" different, and the forecast that the M/BP will simply turn out to be yet another set of correlations like the rest does not hold. The question arises because of an obvious disanalogy: in all the other cases both correlates are public ("3rd person") data, whereas here one correlate is private ("1st person") data.
Empirically detectable shafts of lightning and empirically detectable electrical discharges are the same kind of thing (empirical data, detectable by instruments). So are empirically detectable brain activities and empirically detectable behaviour and circumstances. For example, pains can be correlated with their realisers, say, a particular area of the brain. If pain is functionalized in this way, the problem of mental causation has a simple solution for all pain instances. So when I say or act out that something hurts (especially when something is indeed damaging my tissues), and there is a corresponding brain image, we may have a correlation between things of the same kind. And out of that correlation we can construct a causal theory of pain function (tissue injury, avoidance, learning, recall, etc.).
Again, this is more than an inference of causation. It has been directly demonstrated that brain states affect states of consciousness.
But when the correlate in question is my feeling of pain, there's now an explanatory gap that neither the pain function theory (which is only a functional theory of tissue-damage-related doing) nor any amount of reconfirmation of the tightness of the correlation can close. What should we say about the causal powers of pain as a mental kind? The answer is that as a kind, pain will be causally heterogeneous.
This doesn't make sense. If I want to know whether you're in pain, I can ask you. Or I can observe that you yelp or scream in pain, etc. Now, it's true I can't literally feel your pain myself. So sure, you could be lying, or I could be mistaken; my eyes could be deceiving me. But that is true of literally any phenomenon we ever observe. What we observe could be inaccurate. Heck, what we observe could be a completely fabricated delusion. We could all be in the Matrix. Unless we're going to throw out all observation as useless because we can't confirm that we're not in the Matrix, this pain example is fundamentally no different from any other.
Why is person ‘xyz’ in pain? Can we derive the statement “xyz is in pain” from information exclusively about xyz’s physical/behavioral properties?
Yes, as much as we can derive any information about anything in the world based on its physical/behavioral properties.
The phenomenal mental properties are not functionally definable and hence functionally irreducible. Hence, the problem of mental causation is not solvable for phenomenal mental properties.
Nonsense.
Similarly, only if consciousness is functionally reducible will its mystery be solvable. But what stands in the way of solving the problem of consciousness is the impossibility of interpreting or defining it in terms of its causal relations to physical/biological properties.
It's not impossible at all. That is absurd. If you've ever had a headache and taken ibuprofen and felt better, you have personally, directly experienced the causal relationship between consciousness and its physical/biological properties.