As the director of recruiting for your department, you sometimes have to consider Affirmative Action motives.  Indeed, you are sympathetic to Affirmative Action yourself, and even on your own your recruiting policy would internalize those motives.  But in fact your institution has a policy.  You perceive clear external incentives coming from that policy.

Now this creates a dilemma.  For any activity like this there is some socially optimal level, and it combines your own private motivations with any additional external interests.  But the dilemma for you is how these should be combined.  One possibility is that the public motive and your own private interest stem from completely independent reasons.  In that case you should just “add together” the weight of the external incentives you feel and the weight of your own.  But it could be that what motivates your Dean to institutionalize affirmative action is exactly what motivates you.  In that case he has merely codified the incentives you would be responding to anyway, and rather than adding to them, his external incentives should perfectly crowd out your own.

There is no way of knowing which of these cases, or which point in between them, describes the true moral calculation.  That is a real dilemma, but I want to treat it as a metaphor for the dilemma you face in trying to sort out the competing voices in your own private moral decisions.

Say you have a close friend and an opportunity to do something nice for them, say, buy them a birthday gift.  You think about how nice your friend has been to you and decide that you should be especially nice back.  But compared to what?  Absent that deliberative calculation, you would have chosen the default level of generosity.  So what your deliberation has led you to conclude is that you should be more generous than the default.

But how do you know?  What exactly determined the default?  One possibility is that the default represents your cumulative wisdom about how nice you should be to other people in general.  Then your reflection on this particular friend’s particular generosity should increment the default by a lot.  But surely that’s not the relevant default.  He’s your friend, not just an arbitrary person (you wouldn’t even be considering giving a gift to an arbitrary person).  No doubt your instinctive inclination to be generous to your friend already encodes much of the collected memory and past reflection that also went into your most recent conscious deliberation.  And as long as there is any duplication, there should be crowding out.  So you optimally moderate the enthusiasm that arises from your conscious calculation.

But how much?  That is a dilemma.