I think this is one of those things like the Monty Hall problem, where the result stops being counterintuitive when the numbers involved are changed. Suppose the shirt costs $1 (I'm using a currency I can represent with one keypress rather than five, so TAKE THAT EUROPE) but you still borrow $50 from each of your parents, and suppose that you then give each of your parents $1 from the change you receive for the shirt and keep the remaining $97. Now try applying the same reasoning as the OP:
Since you already gave mom and dad $1 each, you now owe them $49 each.
$49 + $49 = $98 + your $97 = $195
But so what, right? $98 is what you have to give your parents, and $97 is what you now have. Subtracting the second from the first tells you that you'll end up $1 poorer while wearing your $1 shirt, whereas adding them together tells you nothing of interest. Yet for some reason people get confused when the numbers involved are more like those in the OP.
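If it helps, here's a throwaway Python sketch of the bookkeeping (the function and names are just my own illustration, and I'm assuming the usual version of the puzzle where the shirt costs $97). For either price, debt minus cash comes out to exactly the price of the shirt, while debt plus cash is an arbitrary number that only looks meaningful when it happens to land near $100:

```python
def ledger(shirt_price, borrowed_per_parent=50, repaid_per_parent=1):
    # What you still owe after handing $1 back to each parent: $49 + $49 = $98.
    debt = 2 * (borrowed_per_parent - repaid_per_parent)
    # What's left in your pocket: $100 borrowed, minus the shirt, minus the $2 repaid.
    cash = 2 * borrowed_per_parent - shirt_price - 2 * repaid_per_parent
    return debt, cash

for price in (1, 97):
    debt, cash = ledger(price)
    # debt - cash is the real accounting (it's just the shirt's price);
    # debt + cash is the meaningless sum the riddle tries to sell you.
    print(f"${price} shirt: owe ${debt}, hold ${cash}, "
          f"debt - cash = ${debt - cash}, debt + cash = ${debt + cash}")
```

With the $1 shirt the sum is $195, which nobody would mistake for anything; with the $97 shirt it's $99, close enough to $100 that people go hunting for a missing dollar.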