How to define a variable conditional on the state of a Markov chain
Issue #62
invalid
I want to define a variable with different distributions based on the current state of a Markov chain.
For example, I define a weather variable using a Markov chain:
weather = markov.chain_from_matrix(('sunny','rainy'), ('sunny',(0.95, 0.05)), ('rainy',(0.8, 0.2)))
I want to have a different distribution of temperatures according to today's weather.
Potentially, I want something like
temp_dist = weather.switch({'sunny': p_sunny, 'rainy': p_rainy})
But Lea returns an error:
AttributeError: 'Chain' object has no attribute 'switch'
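In plain Python, the model I have in mind looks like this (this is just the data, not Lea code; the temperature values for p_sunny and p_rainy are made-up placeholders, since they are not given above):

```python
# Sketch of the intended model (plain Python, not Lea's API).
# Transition matrix: P(today's weather | yesterday's weather),
# states ordered ('sunny', 'rainy') as in chain_from_matrix above.
transition = {
    'sunny': {'sunny': 0.95, 'rainy': 0.05},
    'rainy': {'sunny': 0.80, 'rainy': 0.20},
}

# One temperature distribution per weather state (placeholder values).
p_sunny = {25: 0.7, 30: 0.3}   # warmer temperatures when sunny
p_rainy = {15: 0.6, 20: 0.4}   # cooler temperatures when rainy

# The switch I am after would pick the distribution matching the state:
temp_given_weather = {'sunny': p_sunny, 'rainy': p_rainy}
```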
Comments (3)

repo owner - changed status to wontfix
hopefully clarified (not an enhancement/bug)!

repo owner - changed status to invalid

repo owner:
A Markov chain is not a probability distribution, so the switch method cannot be called on it. To get a probability distribution, you first need to define an initial distribution, like "yesterday was sunny, with an x% chance". You can easily obtain a uniform initial distribution using the .state attribute (see doc here). From this initial state, the Markov chain can build a distribution for today (or the next days, replacing 1 by 2, 3, etc.). Then, you may define the temperature for today using a switch (i.e. a conditional probability table). For instance:

As a sanity check,
… that is the p_sunny temperature distribution, as expected.
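The steps above can be sketched in plain Python (this is not Lea's API, just the underlying arithmetic; the temperature values in p_sunny and p_rainy are made-up placeholders):

```python
# Pure-Python sketch of the answer's steps: uniform initial state,
# one step of the chain, then a switch as a mixture of distributions.

transition = {
    'sunny': {'sunny': 0.95, 'rainy': 0.05},
    'rainy': {'sunny': 0.80, 'rainy': 0.20},
}
p_sunny = {25: 0.7, 30: 0.3}   # placeholder temperatures when sunny
p_rainy = {15: 0.6, 20: 0.4}   # placeholder temperatures when rainy
temp_given_weather = {'sunny': p_sunny, 'rainy': p_rainy}

def next_state(state_dist, steps=1):
    """Propagate a distribution over states through the chain."""
    for _ in range(steps):
        new = {}
        for s, p in state_dist.items():
            for t, q in transition[s].items():
                new[t] = new.get(t, 0.0) + p * q
        state_dist = new
    return state_dist

def switch(state_dist, dist_per_state):
    """Mixture: P(temp) = sum over s of P(state=s) * P(temp | state=s)."""
    mix = {}
    for s, p in state_dist.items():
        for v, q in dist_per_state[s].items():
            mix[v] = mix.get(v, 0.0) + p * q
    return mix

# Uniform initial distribution ("yesterday"), as .state would give:
yesterday = {'sunny': 0.5, 'rainy': 0.5}
today = next_state(yesterday, 1)              # weather distribution for today
temp_today = switch(today, temp_given_weather)

# Sanity check: if today is certainly sunny, we get p_sunny back.
certain_sunny = switch({'sunny': 1.0}, temp_given_weather)
```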