Radioactive decay is sometimes cited as a counterexample to the law of causality, but I don’t think there is a problem. It’s not quantum mechanics that tells you decays happen without cause; it’s quantum mechanics plus some crazy extra assumption (or, as we’ll see, quantum mechanics minus some crazy assumption).
I gather that people are bothered by two things: (1) the lack of an outside agent to trigger the decay, and (2) the fact that the decay happens abruptly, with no reason for it to happen now rather than later.
Why would one object to the glib answer to #1 that beta decays, for example, are caused by the weak nuclear force? Mathematically, the reason particles decay is that free-particle states are no longer eigenstates of the Hamiltonian once one adds interaction terms, and the interaction Hamiltonian is taken to be a manifestation of the weak force. One could say that the glib answer is guilty of objectifying a term in an equation and thus violates my “abstractions are not causes” rule. When I say “weak interaction”, the objection continues, I must really refer to non-virtual W and Z bosons coming in from outside the system, which is not what I mean here. However, this objection itself rests on the assumption that free-particle eigenstates are what is really real, and I think this is dubious, precisely because it leaves pieces of the theory on unclear ontological ground. In fact, I affirm my ignorance; I frame no hypotheses. I say only that the whole Hamiltonian has some ontological ground(s), and so the drift of the state vector from a pure particle state into some superposition has some cause. (Not knowing the ultimate actors, one must fall back on philosophical arguments as to whether these actors are altered without cause, whether they are self-“moving”, and so on. The description given by the Standard Model of particle physics can’t help you with that.)
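The mathematical point can be made concrete with a toy two-level model (an illustrative sketch of my own, with arbitrary units and made-up numbers, not Standard Model machinery). With no interaction term, the “undecayed” state is an eigenstate of the Hamiltonian and simply sits there; adding an off-diagonal interaction term makes it drift continuously into a superposition with the “decayed” state:

```python
import numpy as np

# Toy model: |1> plays the role of the undecayed free-particle state,
# |2> the decay product. E1, E2, g are arbitrary illustrative values.
E1, E2, g = 1.0, 1.2, 0.1

H0 = np.diag([E1, E2])             # free Hamiltonian: |1>, |2> are eigenstates
V  = np.array([[0.0, g],
               [g, 0.0]])          # interaction term (the "weak force" analogue)
H  = H0 + V                        # full Hamiltonian: |1> is no longer an eigenstate

def evolve(H, psi0, t):
    """Solve i d|psi>/dt = H |psi| (hbar = 1) by diagonalizing H."""
    w, U = np.linalg.eigh(H)
    return U @ (np.exp(-1j * w * t) * (U.conj().T @ psi0))

psi0 = np.array([1.0, 0.0])        # start in the pure "undecayed" state
for t in [0.0, 1.0, 5.0, 10.0]:
    p_survive = abs(evolve(H, psi0, t)[0]) ** 2
    print(f"t={t:5.1f}  P(undecayed) = {p_survive:.4f}")
```

The survival probability drifts smoothly away from 1 (in this two-state toy it eventually oscillates back; true exponential decay requires a continuum of final states). With `V` set to zero, the probability stays at exactly 1 forever, which is the sense in which the interaction term is “the reason particles decay”.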
But what about the arbitrariness of when it happens? Note that this is a problem only for very strong causality claims, like Leibniz’s Principle of Sufficient Reason. Lots of perfectly good theories of causality allow for probabilistic action. “All x have a cause” doesn’t necessarily mean that everything about how and when the cause operates is predetermined. (I would think most scholastics would be uncomfortable with a claim of such wholesale determinism.) In any case, even the strongest causality principle is not threatened by particle decay as actually described by the time-dependent perturbation theory of quantum mechanics (from which decay-rate calculations come), because this description contains no abruptness or discontinuity at all. What it actually gives is a continuous drift.

Now, an experimental apparatus like a Geiger counter will of course go off at a discrete time, and how we understand that brings in the contentious matter of the interpretation of quantum mechanics. When playing the game of “quantum mechanics says”, one should use the one interpretation that does not posit a violation of QM on macroscopic scales: the many-worlds interpretation. In this picture, the state vector continuously and deterministically diffuses through all possible worlds (driven by the full Hamiltonian with interaction, or rather by whatever grounds it). We are one such world, in which the decay is measured at such-and-such a particular time, but the full state vector is a combination of this and many other possibilities. Interference terms between “worlds” are destroyed by a process called “decoherence”, which is itself the effect of a third piece of the Hamiltonian, the one due to the background environment (including the measuring device), so decoherence is definitely a causal event.
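The claim that decoherence is continuous dynamics rather than a sudden event can also be sketched numerically. The following is a deliberately simplified toy (a pure-dephasing model with a made-up rate `gamma`; nothing here is derived from a real environment Hamiltonian): a qubit in an equal superposition has its interference terms (the off-diagonal density-matrix elements) eroded smoothly, while the populations on the diagonal are untouched.

```python
import numpy as np

# Toy dephasing model: gamma stands in for the strength of the
# system-environment coupling. Purely illustrative numbers.
gamma = 0.5

def rho(t):
    """Density matrix of an equal superposition under pure dephasing."""
    c = 0.5 * np.exp(-gamma * t)          # coherence decays continuously
    return np.array([[0.5, c],
                     [c, 0.5]])

for t in [0.0, 1.0, 5.0, 20.0]:
    r = rho(t)
    print(f"t={t:5.1f}  populations={np.diag(r)}  coherence={r[0, 1]:.5f}")
```

At no point is there a jump: the “worlds” lose their ability to interfere gradually, at a rate set by the environment coupling, which is the sense in which decoherence is an effect of a piece of the Hamiltonian and not an uncaused collapse.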
Scholastics are always being accused of believing in “spooky” hidden variables as a way to save their belief in causality, but they could just as well accuse their opponents of believing in “spooky” stochastic hidden variables or “spooky” wavefunction collapse.