
Implement Pareto distribution #495

Merged · 5 commits into rust-random:master · Jun 7, 2018
Conversation

@vks (Collaborator) commented Jun 5, 2018:

No description provided.

@dhardy (Member) commented Jun 5, 2018:

Okay, but I'm starting to wonder if we shouldn't discuss #290 before continuing to add more distributions.

Is there some specific motivation for this one?

///
/// # Panics
///
/// `scale` and `shape` have to be non-zero and positive.
@dhardy (Member) commented on the code above:

May be worth mentioning common variable names such as k, x_m and α.

@vks (Collaborator, Author) replied:

I agree and added those to the docs.
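For reference, a sketch of how the expanded doc comment might read after that change; the exact wording merged may differ:

    /// The Pareto distribution, with scale parameter `scale` (often
    /// written x_m or k) and shape parameter `shape` (often written α).
    ///
    /// # Panics
    ///
    /// `scale` and `shape` have to be non-zero and positive.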

@vks (Collaborator, Author) commented Jun 5, 2018:

> Okay, but I'm starting to wonder if we shouldn't discuss #290 before continuing to add more distributions.

#290 was also about adding PDFs, CDFs, moments etc. Are there specific reasons for the inclusion of the current distributions in Rand?

@dhardy (Member) commented Jun 5, 2018:

As I understand it, #290 was primarily about adding new distributions. And no, we don't currently have any framework for deciding whether to add new distributions; this is just an arbitrary point at which to bring it up (also since you opened #290 originally).

@dhardy (Member) left a review:

The implementation looks good to me. Testing is minimal; perhaps a bit more could be done. With the couple of suggested tweaks, I think we should merge, in spite of #290.

    impl Distribution<f64> for Pareto {
        fn sample<R: Rng + ?Sized>(&self, rng: &mut R) -> f64 {
            let u: f64 = rng.sample(OpenClosed01);
            self.scale * u.powf(-1.0 / self.shape)
        }
    }
@dhardy (Member) commented on the code above:

It should be equivalent to store `-1 / shape` as a field, which should speed up sampling.

@vks (Collaborator, Author) replied:

Good point! Done.
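For illustration, a minimal sketch of that change; the field name `inv_neg_shape` and the constructor details are assumptions, not necessarily what the PR merged:

    use rand::Rng;
    use rand::distributions::{Distribution, OpenClosed01};

    pub struct Pareto {
        scale: f64,
        // Precomputed -1 / shape, so sampling avoids a division per call
        // (field name is hypothetical).
        inv_neg_shape: f64,
    }

    impl Pareto {
        pub fn new(scale: f64, shape: f64) -> Pareto {
            assert!(scale > 0.0 && shape > 0.0);
            Pareto { scale, inv_neg_shape: -1.0 / shape }
        }
    }

    impl Distribution<f64> for Pareto {
        fn sample<R: Rng + ?Sized>(&self, rng: &mut R) -> f64 {
            // Inverse-CDF method: x = scale * u^(-1/shape) for u in (0, 1].
            let u: f64 = rng.sample(OpenClosed01);
            self.scale * u.powf(self.inv_neg_shape)
        }
    }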

    let d = Pareto::new(1.0, 2.0);
    let mut rng = ::test::rng(1);
    for _ in 0..1000 {
        d.sample(&mut rng);
    }
@dhardy (Member) commented on the test code above:

In theory the sample must never be less than the shape parameter, so you should either test this or give some small allowance for rounding (but hopefully this isn't necessary).

@vks (Collaborator, Author) replied:

I assume you meant the scale parameter? Done.
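A minimal sketch of such a lower-bound check, reusing the test setup quoted above (the test name is hypothetical):

    #[test]
    fn test_pareto_lower_bound() {
        let scale = 1.0;
        let d = Pareto::new(scale, 2.0);
        let mut rng = ::test::rng(1);
        for _ in 0..1000 {
            // With u in (0, 1] and a negative exponent, u.powf(-1/shape) >= 1,
            // so every sample should stay at or above the scale parameter.
            assert!(d.sample(&mut rng) >= scale);
        }
    }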

@vks (Collaborator, Author) commented Jun 6, 2018:

> Testing is minimal; perhaps a bit more could be done.

It is as extensive as the testing of Normal. But yes, we could also look at histograms and check some moments, although that would likely require large samples.
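To illustrate the moment idea: for shape > 1 the Pareto mean is scale * shape / (shape - 1), so a sample mean over a large run can be compared against it. A sketch, where the test name, seed, sample count, and tolerance are all arbitrary choices:

    #[test]
    fn test_pareto_mean() {
        let (scale, shape) = (1.0, 3.0);
        let d = Pareto::new(scale, shape);
        let mut rng = ::test::rng(2);
        let n = 100_000;
        let sum: f64 = (0..n).map(|_| d.sample(&mut rng)).sum();
        let mean = sum / n as f64;
        // For shape > 1 the Pareto mean is scale * shape / (shape - 1);
        // with shape = 3 the variance is finite, so 100k samples should
        // land well within a few percent of it.
        let expected = scale * shape / (shape - 1.0);
        assert!((mean - expected).abs() < 0.05 * expected);
    }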

@dhardy (Member) commented Jun 6, 2018:

Looks good. Still missing a benchmark but not required. Any further comments before merging? If not I'll probably merge tomorrow.
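For reference, a benchmark along the lines mentioned might look like the sketch below; it needs the nightly-only `test` crate, and the bench name and parameters are illustrative:

    #![feature(test)]
    extern crate rand;
    extern crate test;

    use rand::distributions::{Distribution, Pareto};
    use test::Bencher;

    #[bench]
    fn bench_pareto_sample(b: &mut Bencher) {
        let mut rng = rand::thread_rng();
        let d = Pareto::new(1.0, 2.0);
        // Measure the cost of drawing a single sample per iteration.
        b.iter(|| d.sample(&mut rng));
    }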

@dhardy merged commit 319a384 into rust-random:master on Jun 7, 2018.
@vks deleted the pareto branch on Jun 7, 2018.