Markov Decision Process
A Markov Decision Process (MDP) on a set of states S and a set of actions A consists of a conditional distribution P(r, s' | a, s): the joint probability of receiving reward r and transitioning to state s' after taking action a in state s.
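The definition above can be sketched as a lookup table plus a sampler; the states, actions, rewards, and probabilities below are illustrative, not from the source.

```python
import random

# P maps (s, a) to a list of ((r, s_next), probability) pairs,
# representing the joint conditional distribution P(r, s' | a, s).
P = {
    ("s0", "go"):   [((0.0, "s0"), 0.5), ((1.0, "s1"), 0.5)],
    ("s0", "stay"): [((0.0, "s0"), 1.0)],
    ("s1", "go"):   [((1.0, "s1"), 1.0)],
    ("s1", "stay"): [((0.0, "s0"), 1.0)],
}

def step(s, a, rng=random):
    """Sample (r, s') from P(r, s' | a, s)."""
    outcomes, weights = zip(*P[(s, a)])
    (r, s_next), = rng.choices(outcomes, weights=weights, k=1)
    return r, s_next
```

In an MDP the agent observes s directly, so a policy can condition on the current state alone.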
Partially Observable Markov Decision Process
A POMDP on a set of states S, a set of actions A, and a set of observations O consists of a conditional distribution P(r, s' | a, s) together with an observation distribution P(o | s): the agent never sees the state s directly, only an observation o drawn from P(o | s).
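The POMDP adds one more table to the MDP sketch, for P(o | s); again all concrete states, observations, and probabilities are illustrative assumptions.

```python
import random

# P(r, s' | a, s): (s, a) -> list of ((r, s_next), probability) pairs.
P_trans = {
    ("s0", "go"): [((1.0, "s1"), 1.0)],
    ("s1", "go"): [((0.0, "s0"), 1.0)],
}
# P(o | s): s -> list of (observation, probability) pairs.
P_obs = {
    "s0": [("dark", 0.9), ("bright", 0.1)],
    "s1": [("dark", 0.2), ("bright", 0.8)],
}

def step(s, a, rng=random):
    """Sample (r, s', o); the agent sees only (r, o), not s'."""
    outcomes, w = zip(*P_trans[(s, a)])
    (r, s_next), = rng.choices(outcomes, weights=w, k=1)
    obs, w_o = zip(*P_obs[s_next])
    o, = rng.choices(obs, weights=w_o, k=1)
    return r, s_next, o
```

Because the state is hidden, a POMDP policy must condition on the history of observations (or a belief over states) rather than on s.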
Modeling a problem as an MDP is an assumption (that the agent's observed state is Markov), but modeling it as a POMDP is not (up to Newtonian physics): the complete underlying physical state of any system evolves Markovianly, and the agent's sensors define P(o | s).