Where are you on the spectrum from "SSA and SIA are equally valid ways of reasoning" to "it's more and more likely that in some sense SIA is just true"? I feel like I've been at the latter position for a few years now.
More SIA-ish for conventional anthropic problems. Other theories are more applicable to more specific situations, to specific questions, and to duplicate issues.
I am on a quest to show that anthropic probabilities are normal, at least in the absence of exact duplicates.
So consider this simple example: a coin is tossed. The coin is either fair, biased 3/4 towards heads, or biased 3/4 towards tails; the three options are equally likely. After being tossed, the coin is covered, and you eat a cake. Then you uncover the coin and see that it came up tails.
You can now update your probabilities about what type of coin it was: the posterior is 1/6 on the coin being heads-biased, 1/3 on it being fair, and 1/2 on it being tails-biased[1]. Your probability that the next toss comes up tails is (1/6)(1/4)+(1/3)(1/2)+(1/2)(3/4)=7/12.
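For anyone who wants to check the arithmetic, here is a minimal Bayesian-update sketch in Python with exact fractions (the coin-type labels are just mine):

```python
from fractions import Fraction as F

# Three equally likely coin types, with P(tails) for each type.
priors = {"heads-biased": F(1, 3), "fair": F(1, 3), "tails-biased": F(1, 3)}
p_tails = {"heads-biased": F(1, 4), "fair": F(1, 2), "tails-biased": F(3, 4)}

# Bayes: the posterior is proportional to prior * P(tails observed | coin type).
unnorm = {c: priors[c] * p_tails[c] for c in priors}
total = sum(unnorm.values())
posterior = {c: w / total for c, w in unnorm.items()}
for c, p in posterior.items():
    print(c, p)        # heads-biased 1/6, fair 1/3, tails-biased 1/2

# Predictive probability that the next toss is also tails.
next_tails = sum(posterior[c] * p_tails[c] for c in posterior)
print(next_tails)      # 7/12
```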
Now you are told that, had the coin come up heads, there would have been poison in the cake and you would have died before seeing the coin.
This fact turns the problem into an anthropic problem: you would never have been alive to see the coin, had it come up heads. But I can't see how that changes your probability update. If we got ethics board approval, we could actually run this experiment. And for the survivors in the tails worlds, we could toss the coin a second time (without cake or poison), just to see what it came up as. In the long run, we would indeed see a tails frequency of roughly 7/12. So the update was correct, and the poison makes no difference.
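If we did get that approval (or just simulated the experiment instead), the long-run frequency is easy to check. A toy Monte Carlo sketch, with my own names and trial count:

```python
import random

COINS = {"heads-biased": 0.25, "fair": 0.5, "tails-biased": 0.75}  # P(tails)

def poisoned_cake_trials(n=1_000_000, seed=0):
    rng = random.Random(seed)
    survivors = second_tails = 0
    for _ in range(n):
        p = COINS[rng.choice(list(COINS))]  # one of the three coins, chosen uniformly
        if rng.random() < p:                # first toss is tails, so you survive the cake
            survivors += 1
            if rng.random() < p:            # second toss of the same coin, no poison this time
                second_tails += 1
    return second_tails / survivors

print(poisoned_cake_trials())  # roughly 0.583, i.e. about 7/12
```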
Again, it seems that, if we ignore identical copies, anthropics is just normal probability theory. Now, if we knew about the poison, then we could deduce that the coin was tails from our survival. But that information gives us exactly the same update as seeing the coin was actually tails. So "I survived the cake" is exactly the same type of information as "the coin was tails".
Incubators
If we had more power in this hypothetical thought experiment, we could flip the coin and only create you if it comes up tails. Then, after getting over your surprise at existing, you could bet on the next flip of the coin - and the odds on that bet are the same as in the poisoned-cake case and in the non-anthropic case. Thus the update is the same whether you learn "tails" by seeing the coin, by surviving the cake, or by being created at all.
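A sketch of that incubator variant (again a toy simulation with my own labels): among the observers who actually get created, the coin-type frequencies match the 1/6, 1/3, 1/2 posterior from the non-anthropic update.

```python
import random

COINS = {"heads-biased": 0.25, "fair": 0.5, "tails-biased": 0.75}  # P(tails)

def incubator_trials(n=1_000_000, seed=1):
    rng = random.Random(seed)
    created = {c: 0 for c in COINS}
    for _ in range(n):
        coin = rng.choice(list(COINS))   # one of the three coins, chosen uniformly
        if rng.random() < COINS[coin]:   # tails: you get created and can look around
            created[coin] += 1
    total = sum(created.values())
    return {c: count / total for c, count in created.items()}

print(incubator_trials())
# roughly {'heads-biased': 0.167, 'fair': 0.333, 'tails-biased': 0.5}
```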
The probability of tails given the heads-biased coin is 1/4; given the fair coin it is 1/2=2/4; and given the tails-biased coin it is 3/4. So the odds are 1:2:3; multiplying these by the (equal) prior probabilities doesn't change these odds. To get probabilities, divide the odds by 6, the sum of the odds, and get 1/6, 2/6=1/3 and 3/6=1/2. ↩︎