markov query

 
 
kpp9c
 
      03-14-2006

I have noticed a couple of Markov implementations in Python, but none
quite seems to do what I would like. Most seem to do an analysis of
some text and create a new text based on that... What I am after, I
think (sorry, I just don't know the terminology well), is a Markov
table (or is it called a transition table?) that describes how to get
from one event to another. So if I have three events, say A, B, and C,
which can come in any order, a Markov chain describes the probability
of what the next event will be using a table. The following table
demonstrates a first-order Markov chain. There are three possible
states: the current event is either A, B, or C. For each possible
current state, there are three possible next letters, and each row in
the table indicates the relative probability of going to each next
letter. For example, if you are currently on letter A, there is a 20%
chance of repeating letter A, a 50% chance of going to B, and a 30%
chance of going to C. Note that the sum of the chances in each row is
100% (20 + 50 + 30 = 100).

Current | Next A | Next B | Next C
   A    |   20%  |   50%  |   30%
   B    |   35%  |   25%  |   40%
   C    |   70%  |   14%  |   16%

Here the sequences C B and C C would be rare and the sequence C A
common.
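
For illustration, here is a minimal sketch of how such a hand-written
first-order table could be represented and sampled in plain Python
(TABLE and next_event are made-up names for this example, not an
existing module):

import random

# Illustrative sketch only: the hand-written first-order table above,
# with each row summing to 1.0.
TABLE = {
    'A': [('A', 0.20), ('B', 0.50), ('C', 0.30)],
    'B': [('A', 0.35), ('B', 0.25), ('C', 0.40)],
    'C': [('A', 0.70), ('B', 0.14), ('C', 0.16)],
}

def next_event(current):
    """Pick the next event from the row for `current`, weighted by probability."""
    r = random.random()
    cumulative = 0.0
    for event, prob in TABLE[current]:
        cumulative += prob
        if r < cumulative:
            return event
    return TABLE[current][-1][0]  # guard against floating-point round-off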

This is a first-order Markov chain, which means that only the current
state affects the choice of the next event. A second-order Markov chain
would mean that the current state and the last state affect the choice
of the next event. A third-order Markov chain would indicate that the
current state and the last two states in the sequence will affect the
choice of the next state, and so on. Here is an example transition
table for a second-order Markov chain.

Previous Current | Next A | Next B | Next C
    A      A     |   15%  |   55%  |   30%
    A      B     |   20%  |   45%  |   35%
    A      C     |   60%  |   30%  |   10%
    B      A     |   35%  |   25%  |   40%
    B      B     |   49%  |   48%  |    3%
    B      C     |   60%  |   20%  |   20%
    C      A     |    5%  |   75%  |   20%
    C      B     |    0%  |   90%  |   10%
    C      C     |   70%  |   14%  |   16%

For example, if the current event is B and the last one was A, then the
probability of the next event being A is 20%, B is 45% and C is 35%.
The pattern C B A will never occur, and the pattern B B C will occur
rarely.
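
As an aside, a second-order table can be held the same way by keying
on the last two events; again, this is just an illustrative sketch
(TABLE2 and next_event2 are made-up names), with the probabilities
copied from the table above:

import random

# Illustrative sketch: the second-order table above, keyed on
# (previous event, current event).
TABLE2 = {
    ('A', 'A'): [('A', 0.15), ('B', 0.55), ('C', 0.30)],
    ('A', 'B'): [('A', 0.20), ('B', 0.45), ('C', 0.35)],
    ('A', 'C'): [('A', 0.60), ('B', 0.30), ('C', 0.10)],
    ('B', 'A'): [('A', 0.35), ('B', 0.25), ('C', 0.40)],
    ('B', 'B'): [('A', 0.49), ('B', 0.48), ('C', 0.03)],
    ('B', 'C'): [('A', 0.60), ('B', 0.20), ('C', 0.20)],
    ('C', 'A'): [('A', 0.05), ('B', 0.75), ('C', 0.20)],
    ('C', 'B'): [('A', 0.00), ('B', 0.90), ('C', 0.10)],
    ('C', 'C'): [('A', 0.70), ('B', 0.14), ('C', 0.16)],
}

def next_event2(previous, current):
    """Pick the next event given the last two events."""
    r = random.random()
    cumulative = 0.0
    for event, prob in TABLE2[(previous, current)]:
        cumulative += prob
        if r < cumulative:
            return event
    return TABLE2[(previous, current)][-1][0]  # round-off guard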

Does anyone know of any modules or packages that include Markov tables
for creating patterns like the ones shown above?


P.S. Does anyone know, first of all, whether these are called Markov
tables, transition tables, or probability tables? I am not sure I am
referring to them correctly, or what the differences would be, if any.

cheers,

kp

 
 
 
 
 
Stefan Behnel
 
      03-14-2006
Don't know of a Python module (although this doesn't look complex enough for a
package anyway...), but

kpp9c wrote:
> P.S. Does any one know first of all whether these are called markov
> tables, transition tables or probability tables? I am not sure i am
> referring to this correctly and what the differences would be if any


as for terminology:

http://www.csse.monash.edu.au/~lloyd...tured/HMM.html
http://en.wikipedia.org/wiki/Hidden_Markov_model


Hope it helps,
Stefan
 
 
 
 
 
Erik Max Francis
 
      03-14-2006
kpp9c wrote:

> I have noticed a couple markov implementations in python, but none
> quite seem to do what i would like. Most seem to do an analysis of some
> text and create a new text based on that... I think, (sorry i just
> don't know terminology well) a markov table (or is it called a
> transition table) to describe how to get from one event to another.


Yes, a system which does this has to build a Markov chain from a data
set and then traverse it.

> Does anyone know of any modules or packages that include markov tables
> for creating patterns like shown above.


Any program that actually uses Markov chains to generate new text based
on existing input as you've described will necessarily create a Markov
chain. So if what you want is the Markov chain itself, then it's
already in there somewhere.

> P.S. Does any one know first of all whether these are called markov
> tables, transition tables or probability tables? I am not sure i am
> referring to this correctly and what the differences would be if any


They're called Markov chains.

--
Erik Max Francis && (E-Mail Removed) && http://www.alcyone.com/max/
San Jose, CA, USA && 37 20 N 121 53 W && AIM erikmaxfrancis
Nothing spoils a confession like repentance.
-- Anatole France
 
 
kpp9c
 
      03-15-2006
> Yes, a system which does this has to build a Markov chain from a data
> set and then traverse it.


>Any program that actually uses Markov chains to generate new text based
>on existing input as you've described will necessarily create a Markov
>chain.


I think you misunderstood. If you see my original post, my whole point
was that this is exactly what I don't want. There are several
algorithms out there that already do that. I want to use a table that
I define from scratch to shape a stochastic process. In this case
there is no original input to analyze, I don't want a chain built by
analysis, and I am not necessarily working with texts.

 
 
kpp9c
 
      03-15-2006
>Yes, a system which does this has to build a Markov
>chain from a data set and then traverse it.


>>Any program that actually uses Markov chains to generate
>> new text based on existing input as you've described


Hi. That isn't really what I have described. If it were, I could use
existing algorithms. What you describe is exactly what I don't want in
this particular case. I am actually not wanting it to build a chain
from some input that it analyzes. There is no input. As you can see, I
am trying to define a table from scratch that tells it how to proceed
forward as a stochastic process.

cheers,

-kp--

 
 
Robert Kern
 
      03-15-2006
kpp9c wrote:
>>Yes, a system which does this has to build a Markov
>>chain from a data set and then traverse it.

>
>>>Any program that actually uses Markov chains to generate
>>>new text based on existing input as you've described

>
> Hi. That isn't really what i have described. If i did i could use
> exsisting algorithms. What you describe is exactly what i don't want in
> this particular case. I am actually not wanting it to build a chain
> from some input that it analyzes. There is no input. If you see, i am
> trying to define a table that tells it how to proceed forward from
> scratch as a stochastic process.


Any such program has two main parts, one that trains the transition matrix from
a data set, and one that generates output from that transition matrix. You
should be able to take such a program and only use the latter component with
your transition matrix, however you choose to create it. Googling for 'python
markov' generates several leads.
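
For what it's worth, the generation-only half could be as small as the
following sketch, which walks a first-order table supplied by the
caller and never looks at any training data (the generate function and
the table literal are made up for illustration):

import random

def generate(table, start, length):
    """Return `length` events by walking a hand-defined first-order table.

    `table` maps each state to a list of (next_state, probability)
    pairs; no training step is involved -- the caller supplies the table.
    """
    current = start
    sequence = [current]
    for _ in range(length - 1):
        r = random.random()
        cumulative = 0.0
        for event, prob in table[current]:
            cumulative += prob
            if r < cumulative:
                current = event
                break
        else:
            current = table[current][-1][0]  # floating-point round-off guard
        sequence.append(current)
    return sequence

# Example: the table from the first post in this thread.
table = {
    'A': [('A', 0.20), ('B', 0.50), ('C', 0.30)],
    'B': [('A', 0.35), ('B', 0.25), ('C', 0.40)],
    'C': [('A', 0.70), ('B', 0.14), ('C', 0.16)],
}
print(''.join(generate(table, 'A', 20)))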

--
Robert Kern
(E-Mail Removed)

"I have come to believe that the whole world is an enigma, a harmless enigma
that is made terrible by our own mad attempt to interpret it as though it had
an underlying truth."
-- Umberto Eco

 