Two brothers, John and Dom, set up an apple stand outside their parents' house. Each sells thirty apples a day. John sells his apples at two for 50 cents (and therefore earns $7.50 per day). Dom sells his apples at three for 50 cents (and therefore earns $5.00 per day). Thus, the total received by the brothers is $12.50.
One day, John is sick, and Dom takes over his apple-selling duties. To accommodate the differing rates, Dom sells the 60 apples at five for a dollar. But selling 60 apples at five for a dollar yields only $12.00 at the end of the day. What happened to the other 50 cents?
Akshay Bist - 8 years ago
The average price of each apple on the day John was sick was 20 cents. On other days, each apple went for ~20.833 cents, i.e. (25 + 50/3)/2 cents, the average of John's 25-cent price and Dom's ~16.667-cent price. The difference, multiplied by 60 apples, comes out to 50 cents.
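The arithmetic above can be checked with a short sketch (the variable names are mine, not from the original post):

```python
# Normal day: John sells 2 for $0.50, Dom sells 3 for $0.50, 30 apples each.
john_per_apple = 0.50 / 2      # $0.25 per apple
dom_per_apple = 0.50 / 3       # ~$0.1667 per apple
normal_total = 30 * john_per_apple + 30 * dom_per_apple   # $12.50

# Sick day: all 60 apples go at 5 for $1.00.
combined_total = 60 * (1.00 / 5)                          # $12.00

# Average per-apple prices on the two kinds of day.
normal_avg = normal_total / 60      # ~$0.20833
combined_avg = combined_total / 60  # $0.20

# The per-apple difference, over 60 apples, is the missing half dollar.
missing = (normal_avg - combined_avg) * 60
print(round(missing, 2))  # 0.5
```

The point the check makes concrete: "five for a dollar" averages the two *prices per apple* as if equal numbers of apples sold at each rate were interchangeable, but 2-for-50¢ and 3-for-50¢ only blend to 5-for-$1 when apples are paired up two-and-three, which exhausts John's stock after 20 bundles.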