Markov Chains in Sandpiles

This is a quick overview of Markov chains in sandpiles. I don't know how much people know about Markov chains, so I started from the very basics (this is also important because I don't know any more than the very basics). If you find any mistakes, please correct me.

A Markov chain is a random process on a set of states $S=\{s_i\}$. At each time step, we transition between states; the transition depends only on the current state, with some probability of moving to each possible next state. For example, consider a graph $G=(V,E)$ that is a sandpile with a sink, and let $S$ be the set of stable states. At each time step, we add a chip at a vertex chosen uniformly at random, and then stabilize. If we are at state $s_i$, let the probability of transitioning to state $s_j$ be $a_{ij}$. Then $a_{ij}=\frac{k}{\#V}$, where $k$ is the number of vertices at which adding a chip to $s_i$ and stabilizing yields $s_j$; in particular, $a_{ij}=0$ if no such vertex exists.
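To make this concrete, here is a minimal Python sketch of one step of the chain. The particular graph (two vertices, each joined to the other and to the sink), the helper names `stabilize` and `add_and_stabilize`, and the convention that chips are dropped only at non-sink vertices are assumptions made for illustration, not something fixed by the text:

```python
# Hypothetical toy sandpile: vertices 0 and 1, each joined to the
# other and to a sink; chips that reach the sink disappear.
NEIGHBORS = {0: [1], 1: [0]}   # non-sink neighbors of each vertex
SINK_EDGES = {0: 1, 1: 1}      # number of edges from each vertex to the sink
VERTICES = sorted(NEIGHBORS)
DEGREE = {v: len(NEIGHBORS[v]) + SINK_EDGES[v] for v in VERTICES}

def stabilize(config):
    """Repeatedly topple any vertex holding at least deg(v) chips."""
    chips = list(config)
    while True:
        unstable = [v for v in VERTICES if chips[v] >= DEGREE[v]]
        if not unstable:
            return tuple(chips)
        v = unstable[0]
        chips[v] -= DEGREE[v]          # v sends one chip along each edge
        for w in NEIGHBORS[v]:
            chips[w] += 1              # chips sent to the sink just vanish

def add_and_stabilize(config, v):
    """One step of the chain: drop a chip at vertex v, then stabilize."""
    chips = list(config)
    chips[v] += 1
    return stabilize(tuple(chips))
```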

Having found the transition probabilities, we may form the matrix $A$ whose $(j,i)$ entry is $a_{ij}$, so that each column of $A$ sums to $1$. Say we start off in state $s_i$. This corresponds to a vector $x$ with a $1$ in the $i^{th}$ position and $0$'s everywhere else. Then the components of the vector $Ax$ give the probabilities of being at each state after we add one chip at a random vertex.
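Continuing the sketch above, we can enumerate the stable states and assemble $A$ column by column; the use of `numpy` and `itertools` here is just one convenient choice:

```python
import itertools
import numpy as np

# Stable states: each vertex holds between 0 and deg(v) - 1 chips.
STATES = sorted(itertools.product(*(range(DEGREE[v]) for v in VERTICES)))
INDEX = {s: i for i, s in enumerate(STATES)}
n = len(STATES)

# Column i of A is the distribution over next states when leaving state i,
# so each column sums to 1 and A @ x advances a distribution x by one step.
A = np.zeros((n, n))
for s in STATES:
    for v in VERTICES:                 # a chip lands at v with probability 1/#V
        t = add_and_stabilize(s, v)
        A[INDEX[t], INDEX[s]] += 1 / len(VERTICES)

x = np.zeros(n)
x[INDEX[(0, 0)]] = 1.0                 # start in the all-zero stable state
print(A @ x)                           # distribution after adding one chip
```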

Say we are only interested in the long-term behavior. For example, if we continue this process of adding a chip and then stabilizing, what fraction of the time do we spend in each state? To find this, we evaluate $\lim_{n\rightarrow \infty} A^nx$. If $A$ has $1$ as an eigenvalue of multiplicity one, and all other eigenvalues have magnitude strictly less than $1$, then this limit converges (it projects $x$ onto the $1$-eigenspace). The limit vector gives the probability of being in each state, i.e. the percentage of time spent in that state in the long run. The states with non-zero probability are the recurrent states.
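For the toy chain above, we can approximate this limit by raising $A$ to a large power (one can check that the convergence condition holds for that example); the states that keep non-zero mass are the recurrent ones:

```python
# Long-run behavior of the toy chain: A^n x for large n approximates the
# stationary distribution; states keeping non-zero mass are recurrent.
pi = np.linalg.matrix_power(A, 1000) @ x

for s, p in zip(STATES, pi):
    label = "recurrent" if p > 1e-9 else "transient"
    print(s, round(float(p), 4), label)
```

For this particular graph, the all-zero configuration turns out to be transient, while the other three stable states each receive probability $\frac{1}{3}$.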