***********************************************
Discrete-time Markov decision processes (MDPs)
***********************************************


Background
=====================
We build the same Knuth-Yao model of a fair die as in the DTMC section, but this time as a Markov decision process (MDP), i.e., a model with nondeterminism.

.. seealso:: `01-building-mdps.py <todo /examples/mdps/01-building-mdps.py>`

First, we import Stormpy::

    >>> import stormpy

Transition Matrix
=====================
Since each state of an MDP can offer several nondeterministic choices (actions), the transition matrix contains one row per choice and therefore has more rows than states. We build it with a custom row grouping in which each row group collects the choices of one state, as sketched below.
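
A minimal sketch of this construction, assuming the ``stormpy.SparseMatrixBuilder`` API with custom row grouping and the 13-state die structure from the DTMC section; the second, biased choice in state 0 (probabilities 0.2/0.8) is added purely to illustrate nondeterminism::

    >>> builder = stormpy.SparseMatrixBuilder(rows=0, columns=0, entries=0, force_dimensions=False, has_custom_row_grouping=True, row_groups=0)

    >>> # State 0: two nondeterministic choices, occupying rows 0 and 1
    >>> builder.new_row_group(0)
    >>> builder.add_next_value(0, 1, 0.5)
    >>> builder.add_next_value(0, 2, 0.5)
    >>> builder.add_next_value(1, 1, 0.2)
    >>> builder.add_next_value(1, 2, 0.8)

    >>> # States 1 to 6: a single choice each, as in the DTMC
    >>> builder.new_row_group(2)
    >>> builder.add_next_value(2, 3, 0.5)
    >>> builder.add_next_value(2, 4, 0.5)
    >>> builder.new_row_group(3)
    >>> builder.add_next_value(3, 5, 0.5)
    >>> builder.add_next_value(3, 6, 0.5)
    >>> builder.new_row_group(4)
    >>> builder.add_next_value(4, 7, 0.5)
    >>> builder.add_next_value(4, 1, 0.5)
    >>> builder.new_row_group(5)
    >>> builder.add_next_value(5, 8, 0.5)
    >>> builder.add_next_value(5, 9, 0.5)
    >>> builder.new_row_group(6)
    >>> builder.add_next_value(6, 10, 0.5)
    >>> builder.add_next_value(6, 11, 0.5)
    >>> builder.new_row_group(7)
    >>> builder.add_next_value(7, 2, 0.5)
    >>> builder.add_next_value(7, 12, 0.5)

    >>> # States 7 to 12 (rows 8 to 13): absorbing states with a self-loop
    >>> for s in range(8, 14):
    ...     builder.new_row_group(s)
    ...     builder.add_next_value(s, s - 1, 1)

    >>> transition_matrix = builder.build()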


Labeling
================
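As for DTMCs, we define a state labeling; in addition, an MDP can carry a choice labeling with one entry per row of the transition matrix. A sketch, assuming the ``stormpy.storage.StateLabeling`` and ``stormpy.storage.ChoiceLabeling`` classes and the 13 states and 14 choices built above; the action names ``'a'`` and ``'b'`` for the two choices of state 0 are illustrative::

    >>> state_labeling = stormpy.storage.StateLabeling(13)
    >>> for label in {'init', 'one', 'two', 'three', 'four', 'five', 'six', 'done'}:
    ...     state_labeling.add_label(label)
    >>> state_labeling.add_label_to_state('init', 0)
    >>> for outcome, state in zip(['one', 'two', 'three', 'four', 'five', 'six'], range(7, 13)):
    ...     state_labeling.add_label_to_state(outcome, state)
    ...     state_labeling.add_label_to_state('done', state)

    >>> # Choice labels refer to rows of the matrix: state 0 owns rows 0 and 1
    >>> choice_labeling = stormpy.storage.ChoiceLabeling(14)
    >>> for label in {'a', 'b'}:
    ...     choice_labeling.add_label(label)
    >>> choice_labeling.add_label_to_choice('a', 0)
    >>> choice_labeling.add_label_to_choice('b', 1)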


Reward models
==================
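In a nondeterministic model, a state-action reward vector has one entry per choice (14 here) rather than one per state. A sketch, assuming the ``stormpy.SparseRewardModel`` constructor with the keyword ``optional_state_action_reward_vector``; the reward name ``coin_flips`` and the values are illustrative, assigning reward 1 to every coin-flipping choice and 0 to the final self-loops::

    >>> reward_models = {}
    >>> # 8 coin-flipping choices (rows 0 to 7) followed by 6 self-loops (rows 8 to 13)
    >>> action_reward = [1.0] * 8 + [0.0] * 6
    >>> reward_models['coin_flips'] = stormpy.SparseRewardModel(optional_state_action_reward_vector=action_reward)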


Exit Rates
====================
Exit rates are only needed for continuous-time models such as CTMCs and Markov automata. Since an MDP is a discrete-time model, we do not set any exit rates here.

Building the Model
====================
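A sketch of assembling the components into a model, assuming the ``stormpy.SparseModelComponents`` container and the ``stormpy.storage.SparseMdp`` constructor::

    >>> components = stormpy.SparseModelComponents(transition_matrix=transition_matrix, state_labeling=state_labeling, reward_models=reward_models, rate_transitions=False)
    >>> components.choice_labeling = choice_labeling
    >>> mdp = stormpy.storage.SparseMdp(components)

If everything fits together, the resulting model should report 13 states and 14 choices.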


POMDPs
====================
To build a partially observable MDP (POMDP) instead, additionally give
observations as one observation class (an integer) per state.
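
A sketch, assuming the model components expose an ``observability_classes`` field (a list with one integer per state) and that ``stormpy.storage.SparsePomdp`` constructs the model; both names are assumptions based on common stormpy usage, and the grouping of states into observation classes below is purely illustrative::

    >>> # Assumed field: one observation class per state (initial state,
    >>> # intermediate coin-flip states, and final die states).
    >>> components.observability_classes = [0] + [1] * 6 + [2] * 6
    >>> pomdp = stormpy.storage.SparsePomdp(components)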