If Angel bought 8 oranges, and he knows they cost 50 cents each, then you can multiply 8 by 0.50 to determine how much he spent on oranges alone:
8 x 0.50 = 4.00
Then subtract that amount from the total he spent to see how much went toward apples:
15.22 - 4.00 = 11.22
Finally, because you know Angel bought 11 apples, divide the remaining amount he spent by 11 to determine how much each apple cost:
11.22 / 11 = 1.02
So, yes, Angel's guess was reasonable, because $1.02 is indeed a little more than $1.00.
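If you want to double-check the arithmetic, here is a quick Python sketch of the same three steps (the variable names are just made up for illustration):

# quick check of the steps above
oranges_total = 8 * 0.50               # money spent on oranges
apples_total = 15.22 - oranges_total   # remainder spent on apples
price_per_apple = apples_total / 11    # cost of one apple
print(round(price_per_apple, 2))       # 1.02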
Hope this helps :)