A random question for people with science leanings: I need a… - The Mad Schemes of Dr. Tectonic
[Sep. 23rd, 2005|01:57 pm]
Beemer
A random question for people with science leanings:

I need a probability distribution, but its shape isn't really known. That inclines me to use a gaussian as a generic, because most things are gaussian.

However, it can't produce negative values. What would you use?

Truncate the gaussian below zero? Fold negative values back to positive? Something like a chi-square distribution that looks gaussian once the mean is far from zero?
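For concreteness, here's a quick numerical sketch of the three options. The language, parameter values, and the particular gamma parameterization are my choices for illustration, not anything from the post:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n = 2.0, 1.0, 100_000  # hypothetical mean/spread

# Option 1: truncate -- draw gaussians and throw away anything below zero.
draws = rng.normal(mu, sigma, n)
truncated = draws[draws >= 0]

# Option 2: fold -- reflect negative values back across zero.
folded = np.abs(rng.normal(mu, sigma, n))

# Option 3: a gamma matched to the same mean and variance. With shape
# k = (mu/sigma)^2 and scale theta = sigma^2/mu, it has mean mu and
# variance sigma^2, is strictly positive, and looks increasingly
# gaussian as k grows.
k = (mu / sigma) ** 2
theta = sigma**2 / mu
gamma = rng.gamma(k, theta, n)

for name, x in [("truncated", truncated), ("folded", folded), ("gamma", gamma)]:
    print(f"{name:9s} mean={x.mean():.3f} min={x.min():.3f}")
```

Note that truncating or folding shifts the mean upward relative to the original gaussian's mu, while the matched gamma keeps it.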

Comments:
[User Picture]From: dr_tectonic
2005-09-23 03:38 pm (UTC)
So, the agents are integrating signals from a number of sources (weighted, yes indeed) to determine their general alarm level. And once they hit their threshold, they act. That seems to me to be equivalent to what you've said above, just continuous instead of discrete.
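The actual model isn't shown anywhere in the thread, so this is only a toy sketch of the mechanism as described here: agents integrate weighted signals and act once their accumulated alarm crosses a threshold. All names, weights, and numbers are hypothetical.

```python
import random

def evacuation_step(threshold, weights, signal_stream):
    """Integrate weighted signals each step; act once alarm crosses threshold.

    Returns the step at which the agent evacuates, or None if it never does.
    """
    alarm = 0.0
    for t, signals in enumerate(signal_stream):
        alarm += sum(w * s for w, s in zip(weights, signals))
        if alarm >= threshold:
            return t
    return None

random.seed(1)
weights = [0.5, 0.3, 0.2]  # hypothetical signal weights
stream = [[random.random() for _ in range(3)] for _ in range(50)]
t_evac = evacuation_step(threshold=5.0, weights=weights, signal_stream=stream)
print(t_evac)
```

The agent's threshold is the only per-agent parameter here, which is why the question of how thresholds are distributed across the population matters.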

I don't follow once we get to the infinite-state Markov chain. How does it have any implications for the distribution? In other words, it seems like the Markov chain representation is just a way of making a histogram of the agents' thresholds... what am I missing?

Note also that one of the signals the agents integrate is "who else has already evacuated", which would violate one of the assumptions for Markov processes, no?
[User Picture]From: melted_snowball
2005-09-23 03:44 pm (UTC)
Oh, the chain is just to think about it in a different way. The point is that if the person is truly Markovian, the most no-information idea is to wrap all of the X_i states together into a single state, with probability p of "run like hell" and probability 1-p of "stay." In this context, the distribution of thresholds is geometric, and making it normal is the odd thing, not somehow a default. [Sorry--somehow I thought I'd written that, but I guess I just thought it...]
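The collapsed-chain argument is easy to check by simulation: a memoryless agent that flips the same stay/leave coin every step departs at a geometrically distributed time. A sketch (the value of p is mine, purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.2  # per-step "run like hell" probability (illustrative value)

def leave_step(rng, p):
    """Memoryless agent: each step, leave with probability p, else stay."""
    k = 1
    while rng.random() >= p:
        k += 1
    return k

sim = np.array([leave_step(rng, p) for _ in range(50_000)])

# If the agent is truly Markovian, P(T = k) = (1-p)^(k-1) * p.
for k in (1, 2, 5):
    emp = (sim == k).mean()
    theory = (1 - p) ** (k - 1) * p
    print(f"k={k}: simulated {emp:.4f}  vs  geometric {theory:.4f}")
```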

Yes, "who's already gone" would violate the Markovianness.
[User Picture]From: dr_tectonic
2005-09-23 04:00 pm (UTC)
Ah! Okay, gotcha. Yeah, I think people are definitely non-Markovian in their decision process.

I'll fool around with gammas some. (Hopefully, when we do some sensitivity analysis of the model, one of the things we'll find out is that the details of the distribution of thresholds are not especially important. That'd be convenient.)
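One handy fact for fooling around with gammas: a gamma with shape k has skewness 2/√k, which quantifies how quickly it starts "looking gaussian" as the mean moves away from zero. A quick check (shape values mine, for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Gamma skewness is 2/sqrt(k): visibly lopsided for small shape k,
# nearly gaussian for large k -- while staying strictly positive.
skews = {}
for k in (1, 4, 25, 100):
    x = rng.gamma(k, 1.0, 100_000)
    skews[k] = ((x - x.mean()) ** 3).mean() / x.std() ** 3
    print(f"k={k:3d}: theory {2 / np.sqrt(k):.3f}, sampled {skews[k]:.3f}")
```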
[User Picture]From: melted_snowball
2005-09-23 04:55 pm (UTC)
There are a few decent handbooks on basic distributions that are a sensible thing to have on your shelf. I don't have any of them (I have a mathematical statistics book), but getting one would be worthwhile.