Raymond Chen has a post today about making change, which reminded me of an interesting problem.
American currency (or most currencies, for that matter) has the desirable property that the greedy solution to giving change results in giving change using the minimal number of pieces. I.e., if I want to give you change using the minimal number of pieces, I can start with the largest denomination, give you as many as I can without giving you too much change, and then repeat this with each smaller denomination in turn. For example, if I want to give you $1.73 in change, I do something like: a dollar, a half dollar, two dimes, and three pennies.
This took 7 pieces, and you can do no better. This is not always the case: if you had a system with a penny, a dime, and a 7-cent piece, then the greedy algorithm would give you a 5-piece solution for $0.14, but giving two 7-cent pieces is better.
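A quick sketch of the greedy procedure in Python (my own encoding for illustration, with amounts in cents; the denomination lists are assumptions, not anything from the post):

```python
def greedy_change(amount, denominations):
    """Give change for `amount` (in cents), largest denomination first."""
    pieces = []
    for d in sorted(denominations, reverse=True):
        count, amount = divmod(amount, d)
        pieces.extend([d] * count)
    return pieces

US = [100, 50, 25, 10, 5, 1]  # dollar, half dollar, quarter, dime, nickel, penny
print(greedy_change(173, US))   # [100, 50, 10, 10, 1, 1, 1] -- 7 pieces

ODD = [10, 7, 1]  # the hypothetical penny / 7-cent / dime system
print(greedy_change(14, ODD))   # [10, 1, 1, 1, 1] -- 5 pieces, but 7 + 7 uses only 2
```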
My question: Is there an easy way to describe the necessary and sufficient conditions for a system to exhibit the property that the greedy change-making algorithm always produces minimal results?
It would appear that there is not. Well, that's not 100% clear, but apparently the best known algorithm for determining whether a set of denominations has this property is O(n^3) in the number of denominations [PDF]. (Jeffrey Shallit discusses this and some other change-making problems in a rather cute paper: What This Country Needs is an 18 cent Piece [PDF].)
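For illustration only (this is a brute-force check, not the O(n^3) algorithm from the paper), you can compare greedy against a dynamic-programming optimum for every amount up to the sum of the two largest denominations, which, if I recall the Kozen-Zaks result correctly, bounds where the smallest counterexample can appear:

```python
def optimal_count(amount, denoms):
    """Minimum number of pieces for `amount`, by dynamic programming."""
    INF = float('inf')
    best = [0] + [INF] * amount
    for a in range(1, amount + 1):
        best[a] = min((best[a - d] + 1 for d in denoms if d <= a), default=INF)
    return best[amount]

def greedy_count(amount, denoms):
    """Number of pieces the greedy algorithm uses (assumes a 1-cent piece)."""
    count = 0
    for d in sorted(denoms, reverse=True):
        q, amount = divmod(amount, d)
        count += q
    return count

def is_canonical(denoms):
    """True if greedy is optimal for every amount up to the bound."""
    bound = sum(sorted(denoms)[-2:])
    return all(greedy_count(a, denoms) == optimal_count(a, denoms)
               for a in range(1, bound + 1))

print(is_canonical([25, 10, 5, 1]))  # True
print(is_canonical([10, 7, 1]))      # False: greedy pays 14 as 10+1+1+1+1
```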
In Raymond's case that method doesn't work. He gave extra money to begin with (+$0.20 over a $5 bill), so the clerk should have subtracted instead.
I'm not sure I follow. Of course this works in Raymond's case: he was owed $3.05 in change, which the clerk could have greedily paid as $2 + $1 + $.05 (or 3 x $1 + $.05 if he didn't have any $2 bills in his drawer).
Raymond didn't ask this question, but it reminded me of it. Actually, Raymond solves this in reverse, moving from small denominations to large, but this algorithm isn't as simple, as it requires some look-ahead.
E.g., if I need to pay out $.30, I have to think about the fact that paying out a nickel will allow me to use a quarter next, whereas paying a dime will preclude that. This isn't rocket science, but it's not as simple as the greedy algorithm.
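To make the look-ahead concrete, here's a sketch (my own, not Raymond's actual procedure) that finds the minimum piece count by trying every coin at each step:

```python
from functools import lru_cache

COINS = (1, 5, 10, 25)  # US coin denominations in cents

@lru_cache(maxsize=None)
def min_pieces(amount):
    """Minimum number of coins to pay `amount`, by exhaustive search."""
    if amount == 0:
        return 0
    return 1 + min(min_pieces(amount - c) for c in COINS if c <= amount)

# Paying out 30 cents small-end first requires look-ahead:
# leading with a nickel leaves 25, payable with one quarter (2 pieces total),
# while leading with a dime leaves 20, which takes two more dimes (3 total).
print(1 + min_pieces(30 - 5))   # 2
print(1 + min_pieces(30 - 10))  # 3
```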
Raymond's algorithm does have the benefit of making it more clear to the customer that they're getting the correct change, but this is only due to the average customer's lack of mathematical skill.
"this is only due to the average <i>customer's</i> lack of mathematical skill."
I more often run into the cashier's lack of mathematical skill. I like to get a peek at the POS system that various stores are using, and a lot of times the register will actually display "dispense x $1, y quarters, z pennies."
This seems ridiculous until you are buying something for $5.25 and you hand the person a $10 bill and a quarter. A lot of kids behind the counter have a meltdown about then.
In Australia we dropped the 1c and 2c completely. The prices are still 99c etc. but it is rounded on the total. $3.85 becomes the max with one each of $2, $1, 50c, 20c, 10c and 5c. Yes, we dropped the notes and only have coins for $1 and $2.
I seem to remember something like it costs more to make a 1c piece than 1c?
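The rounding-on-the-total scheme described above can be sketched like this (the exact tie-breaking rule is my assumption; actual cash-rounding rules vary by country):

```python
def round_to_5c(total_cents):
    """Round a cash total to the nearest 5 cents
    (amounts ending in 3 or 4 round up, 1 or 2 round down)."""
    return 5 * ((total_cents + 2) // 5)

print(round_to_5c(99))   # 100: a 99c item costs $1.00 in cash
print(round_to_5c(397))  # 395
```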