A question raised over dinner last week. A group of N diners are dining out and the bill is $100. In scenario A, they are splitting the check N ways, with each paying by credit card and separately entering a gratuity for their share of the check. In scenario B, one of them is paying the whole check.
In which case do you think the total gratuity will be larger? Some thoughts:
- Because of selection bias, it’s not enough to cite folk wisdom that tables who split the check tip less (as a percentage): at tables where one person pays the whole check, that person is probably the one with the deepest pockets. So field data would be comparing the most generous diner’s tip against the average diner’s tip. The right thought experiment is to randomly assign who pays the check.
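The selection problem can be sketched with a toy simulation (the distributional assumptions and the "deepest pockets = most generous" shortcut are mine, purely for illustration):

```python
import random
from statistics import mean

random.seed(0)

# Toy model: each diner has a latent generosity rate. At "one payer" tables
# observed in the field, the payer is the deepest-pocketed (here: the most
# generous) diner, so field data compares a max against an average.
def simulate_tables(num_tables=10_000, party_size=4):
    field_single, field_split, random_single = [], [], []
    for _ in range(num_tables):
        generosity = [random.gauss(0.18, 0.04) for _ in range(party_size)]
        field_single.append(max(generosity))             # deep pockets pays
        field_split.append(mean(generosity))             # everyone tips
        random_single.append(random.choice(generosity))  # randomly assigned payer
    return mean(field_single), mean(field_split), mean(random_single)

single, split, randomized = simulate_tables()
# Field data (single) overstates single-payer generosity relative to the
# split-check average; randomly assigning the payer closes the gap.
```

Nothing here depends on the specific numbers; any spread in generosity produces the same max-versus-average bias.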
- Scenario B can actually be divided into two subcases. In Scenario B1, a single diner pays the check (and decides the tip) but collects cash from everyone else. In Scenario B2 the server divides the bill into N separate checks and hands one to each diner. We can dispense with B1 because the guy paying the bill internalizes only 1/N of the cost of the tip, so he will clearly tip more than he would in Scenario A. So we are really interested in B2.
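The B1 logic can be made concrete with a toy model (the warm-glow functional form is my assumption, not anything from the argument above): the payer gets the same satisfaction from a tip T either way, but bears only 1/N of its cost when the whole check, tip included, is split evenly.

```python
# Assumed toy utility: warm glow from a tip T is sqrt(T); the payer's cost
# share is T/N in B1 (the table splits the total) versus T in scenario A
# (you pay your own tip in full).

def optimal_tip(cost_share):
    # maximize sqrt(T) - cost_share * T over a grid of candidate tips
    tips = [t / 100 for t in range(1, 5001)]  # $0.01 .. $50.00
    return max(tips, key=lambda t: t ** 0.5 - cost_share * t)

n = 4
tip_alone = optimal_tip(1.0)    # scenario A: internalize the full tip
tip_b1 = optimal_tip(1.0 / n)   # scenario B1: internalize only 1/N of it
# Analytically T* = 1 / (4 * cost_share**2), so tip_b1 = n**2 * tip_alone.
```

The quadratic blow-up is an artifact of the sqrt specification; the qualitative point, that a smaller internalized cost share pushes the chosen tip up, holds for any diminishing-returns warm glow.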
- One force favoring larger tips in B2 is the shame of being the lowest tipper at the table. In both A and B2 a tipper worries about shame in the eyes of the server, but B2 adds two further sources. First, beyond being a low tipper relative to the overall population, having the server know that you are the lowest tipper among your peers is even more shameful. Second, and even more important, there is shame in the eyes of your friends. You are going to have to face them tomorrow and the next day.
- On the other hand, B2 introduces a free-rider effect whose impact on the total tip is ambiguous. The misers are likely to be even more miserly (and to feel even less guilty about it) when they know that others are tipping generously. But as long as it is known that there are misers at the table, the generous tippers will react by tipping even more generously to compensate. The net effect is an increase in the variance of tips, with ambiguous implications for the total.
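One simple response rule that delivers exactly this pattern (the rule and the baseline rates are my assumptions, chosen for illustration): each diner amplifies their deviation from the table norm, misers shading down and generous tippers shading up.

```python
from statistics import mean, pstdev

baseline = [0.10, 0.15, 0.20, 0.25]  # hypothetical baseline tip rates

def b2_rates(rates, reaction=0.5):
    # each tipper moves further away from the table average in the
    # direction they already lean: misers down, generous tippers up
    avg = mean(rates)
    return [r + reaction * (r - avg) for r in rates]

adjusted = b2_rates(baseline)
# Under this symmetric rule the deviations cancel: variance rises but the
# mean tip rate (and hence the total tip) is exactly unchanged, which is
# the "ambiguous implications for the total" point in miniature.
```

Breaking the symmetry (say, misers reacting more strongly than the generous) would tip the total either way, which is the point: the sign is an empirical question.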
- However I think the most important effect is a scale effect. People measure how generous they are by the percentage tip they typically leave, but the cost of being a generous tipper is the absolute level of the tip, not the percentage. When the bill is large, it’s more costly in absolute terms to leave a generous percentage. So the optimal way to maintain your self-image is to tip a large percentage when the bill is small and a smaller percentage when the bill is large. Since each diner in B2 faces only a small check, this means that tips will be larger in scenario B2.
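The scale effect can be sketched with an assumed decreasing tip-percentage schedule (the functional form and parameters are mine): the percentage people feel comfortable leaving falls as the check they face grows.

```python
import math

def tip_rate(check):
    # hypothetical schedule: about 25% on tiny checks, tapering toward 15%
    return 0.15 + 0.10 * math.exp(-check / 50)

bill, n = 100, 4
share = bill / n
# Scenario A: everyone faces the full $100 check and tips on their share.
total_a = n * tip_rate(bill) * share
# Scenario B2: everyone faces only their own $25 check.
total_b2 = n * tip_rate(share) * share
# total_b2 > total_a: the smaller check each diner sees in B2 draws the
# higher percentage, so the total gratuity is larger.
```

Any schedule that is decreasing in the size of the check the diner actually sees delivers the same ranking; the exponential taper is just one convenient choice.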
- One thing I haven’t sorted out is what to infer from the common restaurant policy of adding an automatic gratuity for large parties. On the one hand you could say it is evidence of the scale effect above: the restaurant knows that a large party means a large check and hence a lower tip percentage. But it could also be that the restaurant knows that large parties are more likely to be splitting the check, and then the policy would reveal that the restaurant believes B2 produces lower tips. Does anybody know whether restaurants continue to add a default gratuity when a large party asks to have the check split?
- The right dataset to test this would track customers who sometimes eat alone and sometimes eat with larger groups, comparing the tip they leave when eating alone to the tip they leave as part of a group. The hypothesis implied by the shame and scale effects above is that their tip percentages will be increasing across these three cases: paying for the whole group, eating alone, splitting the check.
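The test itself is a simple within-customer comparison. A minimal sketch, with entirely fabricated records standing in for the hypothetical dataset (customer names, context labels, and rates are all mine):

```python
from collections import defaultdict
from statistics import mean

# fabricated illustrative records: (customer, payment context, tip rate)
records = [
    ("ann", "paying_for_group", 0.14), ("ann", "alone", 0.18), ("ann", "splitting", 0.21),
    ("bob", "paying_for_group", 0.12), ("bob", "alone", 0.16), ("bob", "splitting", 0.19),
]

by_context = defaultdict(list)
for customer, context, rate in records:
    by_context[context].append(rate)

avg = {context: mean(rates) for context, rates in by_context.items()}
# The hypothesis predicts this ordering of average tip percentages:
ordered = avg["paying_for_group"] < avg["alone"] < avg["splitting"]
```

With real data one would of course compare within customer (or with customer fixed effects) rather than pooled averages, precisely to kill the deep-pockets selection problem noted earlier.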
(Thanks to those who commented on G+)