A markov chain (Markov is given the ultimate mathematical dignity of having his name lowercased!) is a random process on some state space, where X_{n+1} (the state at time n+1) depends only on X_n, and Pr(X_{n+1}=y | X_n=x) = p_{x,y} is a constant (in particular, it does not depend on n). The matrix P = (p_{x,y})_{x,y} is called the transition matrix of the process. It is a stochastic matrix (each of its rows sums to 1), and it completely determines the process.
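
To make the definitions concrete, here is a minimal sketch in Python; the two-state "weather" chain and its transition probabilities are invented for illustration, not taken from anything above:

```python
import random

states = ["sunny", "rainy"]
# P[x][y] = Pr(X_{n+1} = y | X_n = x); each row sums to 1 (stochastic matrix).
P = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(x):
    """Sample the next state given the current state x."""
    r = random.random()
    acc = 0.0
    for y, p in P[x].items():
        acc += p
        if r < acc:
            return y
    return y  # guard against floating-point round-off in the row sum

# Run the chain for a few steps from a fixed initial state.
x = "sunny"
for n in range(10):
    x = step(x)
print(x)
```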

The two important conditions on a markov chain are that it be irreducible and that it be aperiodic (also called acyclic; see my writeups there for details). If your markov chain (on a finite state space) is irreducible and aperiodic (this makes it an ergodic process), then there exists a unique stable distribution S for the chain, i.e. a distribution satisfying S·P = S. In that case, there is also convergence to that stable distribution: pick an initial state according to any distribution D (for instance, you could always pick some fixed node), then run the chain; the distribution of the n'th state of the chain converges to the stable distribution (as n tends to infinity).
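
Here is a sketch of that convergence, reusing the made-up two-state chain from above (the specific stable distribution [5/6, 1/6] belongs to that invented matrix, not to anything in the text): repeatedly multiplying any initial distribution D by P drives it toward S.

```python
P = [[0.9, 0.1],   # row 0: transitions from "sunny"
     [0.5, 0.5]]   # row 1: transitions from "rainy"

d = [1.0, 0.0]     # initial distribution D: always start at "sunny"
for n in range(50):
    # One step of the chain at the level of distributions: d <- d * P.
    d = [sum(d[x] * P[x][y] for x in range(2)) for y in range(2)]

print(d)  # approaches the unique stable distribution [5/6, 1/6]
```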

Computer science makes up new names for everything. In this case, it uses the name "markov chain" in a slightly different setting: you can condition on the last k states (rather than just the last state) and pretend it's a new type of process (an "order-k" chain). In fact, it's the same thing, if you just enlarge your state space appropriately: take the states of the new chain to be k-tuples (X_{n-k+1}, ..., X_n) of old states. Details are left as an easy exercise for the interested reader; a sketch in code follows.
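
A sketch of the enlargement trick for k=2, over the two-symbol alphabet {0, 1}; the conditional probabilities q are invented for illustration. Each state of the ordinary (order-1) chain is a pair of consecutive symbols, and a transition drops the oldest symbol and appends the new one.

```python
import random

# q[(a, b)] = Pr(next symbol = 1 | last two symbols were a, b)
q = {(0, 0): 0.1, (0, 1): 0.6, (1, 0): 0.4, (1, 1): 0.9}

def step(state):
    """One step of the enlarged chain: state is the pair (X_{n-1}, X_n)."""
    a, b = state
    nxt = 1 if random.random() < q[(a, b)] else 0
    return (b, nxt)  # the new pair forgets a and remembers the new symbol

state = (0, 0)
for n in range(20):
    state = step(state)
print(state)
```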