Well, you can. Since matrix multiplication is associative, you
can do so. Example with right-side transforms:
Since the translation parts have to happen first and last, I don't think
this helps. Associativity isn't the issue; commutativity is.
F = T1 x S x A x T2
A is constant, S (start) is your input, T1 translates the shape back to
the origin, and T2 translates it back out. F is the final output. T1 and
T2 are "variable" in that they depend on S and will be different for each
pass. I think T2 is the total translation applied to the S object, and
T1 is the inverse of T2, but I haven't double-checked/proved that
mathematically.
If you try to combine A x T2, you don't save any work, because that
product still has to be computed once per S rotated.
I've seen this formula a lot in many books, and if there were a way to
simplify it, I'm sure someone would have by now. Plus, math.
Let me try to give a quick example. We have a shape which looks like
the letter X, which we want to rotate 45 degrees counterclockwise about
its own center, so that we end up with an X that is slightly heeled over.
The X is currently at position (1,2), so if we just applied a
rotation, we'd actually rotate it about the origin, and we don't want
that.
^
|
|. . .X
| .
|------------->
To do this we must translate back to the origin, apply the transform,
and then translate back to (1,2). Now imagine we have several hundred
other shapes to rotate/scale the same way. I think that makes clear the
operation I'm talking about: many shapes, each with a *local* transform
that needs to be applied, most of them not at the origin. I don't think
you can reduce that formula in the general case.
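To make the X example concrete, here is the translate-rotate-translate
pipeline for the shape at (1,2), again as a NumPy sketch with row
vectors and made-up helper names:

```python
import numpy as np

def translation(tx, ty):
    return np.array([[1, 0, 0],
                     [0, 1, 0],
                     [tx, ty, 1]], dtype=float)

def rotation(deg):
    """CCW rotation about the origin, row-vector convention."""
    c, s = np.cos(np.radians(deg)), np.sin(np.radians(deg))
    return np.array([[ c, s, 0],
                     [-s, c, 0],
                     [ 0, 0, 1]], dtype=float)

# Rotate 45 degrees CCW about the shape's own center (1, 2):
T1 = translation(-1, -2)   # back to the origin
A  = rotation(45)          # the constant transform
T2 = translation(1, 2)     # and back out again

M = T1 @ A @ T2            # fold once, reuse for every point of THIS shape

center = np.array([1, 2, 1]) @ M
assert np.allclose(center, [1, 2, 1])   # the center doesn't move

tip = np.array([2, 3, 1]) @ M           # one unit up-right of center
# -> (1, 2 + sqrt(2)): same distance from the center, heeled over 45 deg
```

Note that M only helps within one shape: every other shape has its own
T1/T2, so you still rebuild the product once per S, which is the "done
once per S rotated" cost above.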