(Markov chain:)
for any two states A and Z, there is some n for which the probability of reaching Z within n steps of starting from A is nonzero, and some m for which the probability of reaching A within m steps of starting from Z is nonzero.
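To make the definition concrete, here is a minimal sketch in Python/NumPy (the helper name is_irreducible is my own, not from the original text): for a finite chain with n states, every state can reach every other state if and only if I + P + ... + P^(n-1) has no zero entries.

```python
import numpy as np

def is_irreducible(P):
    """Return True if every state of the finite chain with transition
    matrix P can reach every other state in some number of steps."""
    n = P.shape[0]
    reach = np.eye(n)   # accumulates I + P + P^2 + ... + P^(n-1)
    power = np.eye(n)
    for _ in range(n - 1):
        power = power @ P
        reach += power
    # Irreducible iff every entry of the accumulated reachability matrix is positive.
    return bool(np.all(reach > 0))
```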

A reducible Markov chain is easy to spot when its transition graph is disconnected. For instance, if we have two states A and B with P(A->A)=P(B->B)=1, the graph is two disconnected self-loops, and a process starting at A obviously never reaches B. In this (extreme) case, any distribution is stable, so the stable distribution is not unique.
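A quick numerical check of this claim (an illustrative sketch, not part of the original argument): with P(A->A)=P(B->B)=1 the transition matrix is the identity, so every distribution pi satisfies pi P = pi.

```python
import numpy as np

# Reducible chain: P(A->A) = P(B->B) = 1, i.e. the identity matrix.
P = np.eye(2)

# Every distribution is stable: pi @ P == pi for any pi.
for pi in (np.array([1.0, 0.0]), np.array([0.3, 0.7]), np.array([0.5, 0.5])):
    print(pi, np.allclose(pi @ P, pi))   # True in every case
```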

If, however, we choose P(A->B)=P(B->A)=1, our chain becomes irreducible, and it has a unique stable distribution: P(A)=P(B)=1/2.
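One way to verify this numerically (a sketch, assuming NumPy's eigendecomposition is acceptable for a 2x2 example) is to take a left eigenvector of P for eigenvalue 1 and normalise it to sum to 1.

```python
import numpy as np

# Irreducible chain: P(A->B) = P(B->A) = 1.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# A stable distribution satisfies pi @ P = pi, i.e. it is a left
# eigenvector of P with eigenvalue 1 (an eigenvector of P.T).
vals, vecs = np.linalg.eig(P.T)
v = vecs[:, np.argmin(np.abs(vals - 1.0))].real
pi = v / v.sum()

print(pi)                       # [0.5 0.5]
print(np.allclose(pi @ P, pi))  # True
```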