Fix links to documentation (issue probcomp#88).
spelufo committed Sep 20, 2023
1 parent 3fe56a6 commit f11a17c
Showing 11 changed files with 31 additions and 31 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -9,7 +9,7 @@ These notebooks assume some familiarity with the [Julia programming language](ht

For reference on Gen see:

-- The [Gen documentation](https://www.gen.dev/dev/)
+- The [Gen documentation](https://www.gen.dev/docs/dev/)

- Documentation for [GenPyTorch.jl](https://probcomp.github.io/GenPyTorch.jl/dev/) and [GenTF.jl](https://probcomp.github.io/GenTF/dev/)

8 changes: 4 additions & 4 deletions tutorials/Data-Driven Proposals in Gen.ipynb
@@ -909,7 +909,7 @@
"\n",
"To see how to use the built-in importance resampling function, run\n",
"```?Gen.importance_resampling``` or check out the\n",
"[documentation](https://www.gen.dev/dev/ref/importance/#Gen.importance_resampling)."
"[documentation](https://www.gen.dev/docs/dev/ref/importance/#Gen.importance_resampling)."
]
},
{
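As context for the `Gen.importance_resampling` call referenced in the hunk above, here is a minimal, self-contained sketch of how it is typically invoked; the model, addresses, and data are illustrative placeholders, not this tutorial's code.

```julia
using Gen

# Placeholder model standing in for the tutorial's agent model.
@gen function toy_model(n::Int)
    slope ~ normal(0, 1)
    for i in 1:n
        {(:y, i)} ~ normal(slope * i, 0.1)
    end
end

# Observations are supplied as a choice map.
observations = Gen.choicemap()
for (i, y) in enumerate([1.1, 2.0, 2.9])
    observations[(:y, i)] = y
end

# Draw 100 internally weighted traces and keep one, along with a
# log marginal likelihood estimate.
(trace, lml_est) = Gen.importance_resampling(toy_model, (3,), observations, 100)
println(trace[:slope], "  ", lml_est)
```

Running `?Gen.importance_resampling` in the REPL, as the tutorial suggests, also shows the variant that accepts a custom proposal.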
@@ -1319,7 +1319,7 @@
"metadata": {},
"source": [
"We will propose the x-coordinate of the destination from a\n",
"[piecewise_uniform](https://www.gen.dev/dev/ref/distributions/#Gen.piecewise_uniform)\n",
"[piecewise_uniform](https://www.gen.dev/docs/dev/ref/distributions/#Gen.piecewise_uniform)\n",
"distribution, where we set higher probability for certain bins based on the\n",
"heuristic described above and use a uniform continuous distribution for the\n",
"coordinate within a bin. The `compute_bin_probs` function below computes the\n",
@@ -1774,7 +1774,7 @@
"source": [
"Our choice of the `score_high` value of 5. was somewhat arbitrary. To use\n",
"more informed value, we can make `score_high` into a [*trainable\n",
"parameter*](https://www.gen.dev/dev/ref/gfi/#Trainable-parameters-1)\n",
"parameter*](https://www.gen.dev/docs/dev/ref/gfi/#Trainable-parameters-1)\n",
"of the generative function. Below, we write a new version of the proposal\n",
"function that makes `score_high` trainable. However, the optimization\n",
"algorithms we will use for training work best with *unconstrained* parameters\n",
@@ -1928,7 +1928,7 @@
"Next, we choose type of optimization algorithm we will use for training. Gen\n",
"supports a set of gradient-based optimization algorithms (see [Optimizing\n",
"Trainable\n",
"Parameters](https://www.gen.dev/dev/ref/parameter_optimization/#Optimizing-Trainable-Parameters-1)).\n",
"Parameters](https://www.gen.dev/docs/dev/ref/parameter_optimization/#Optimizing-Trainable-Parameters-1)).\n",
"Here we will use gradient descent with a fixed step size of 0.001."
]
},
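The hunks above reference `piecewise_uniform`. A rough sketch of the proposal pattern they describe — higher mass on heuristically favored bins, uniform within a bin — with placeholder names (`compute_bin_probs` here is a stand-in, not the tutorial's implementation):

```julia
using Gen

# Stand-in heuristic: upweight a given set of bins by `score_high`, then normalize.
function compute_bin_probs(num_bins::Int, good_bins::Vector{Int}, score_high::Float64)
    scores = [b in good_bins ? score_high : 1.0 for b in 1:num_bins]
    return scores ./ sum(scores)
end

@gen function dest_x_proposal(num_bins::Int, good_bins::Vector{Int})
    bounds = collect(range(0.0, 1.0; length=num_bins + 1))  # bin edges on [0, 1]
    probs = compute_bin_probs(num_bins, good_bins, 5.0)
    # Piecewise-uniform: a bin is chosen according to `probs`, then the
    # coordinate is uniform within that bin.
    dest_x ~ piecewise_uniform(bounds, probs)
    return dest_x
end
```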
8 changes: 4 additions & 4 deletions tutorials/Data-Driven Proposals in Gen.jl
@@ -390,7 +390,7 @@ measurements = [
#
# To see how to use the built-in importance resampling function, run
# ```?Gen.importance_resampling``` or check out the
-# [documentation](https://www.gen.dev/dev/ref/importance/#Gen.importance_resampling).
+# [documentation](https://www.gen.dev/docs/dev/ref/importance/#Gen.importance_resampling).

# We have provided some starter code.

@@ -614,7 +614,7 @@ num_x_bins = 5
num_y_bins = 5;

# We will propose the x-coordinate of the destination from a
-# [piecewise_uniform](https://www.gen.dev/dev/ref/distributions/#Gen.piecewise_uniform)
+# [piecewise_uniform](https://www.gen.dev/docs/dev/ref/distributions/#Gen.piecewise_uniform)
# distribution, where we set higher probability for certain bins based on the
# heuristic described above and use a uniform continuous distribution for the
# coordinate within a bin. The `compute_bin_probs` function below computes the
@@ -839,7 +839,7 @@ visualize_inference(measurements, scene, start, computation_amt=5, samples=1000)

# Our choice of the `score_high` value of 5. was somewhat arbitrary. To use
# a more informed value, we can make `score_high` into a [*trainable
-# parameter*](https://www.gen.dev/dev/ref/gfi/#Trainable-parameters-1)
+# parameter*](https://www.gen.dev/docs/dev/ref/gfi/#Trainable-parameters-1)
# of the generative function. Below, we write a new version of the proposal
# function that makes `score_high` trainable. However, the optimization
# algorithms we will use for training work best with *unconstrained* parameters
@@ -927,7 +927,7 @@ end;
# Next, we choose the type of optimization algorithm we will use for training. Gen
# supports a set of gradient-based optimization algorithms (see [Optimizing
# Trainable
-# Parameters](https://www.gen.dev/dev/ref/parameter_optimization/#Optimizing-Trainable-Parameters-1)).
+# Parameters](https://www.gen.dev/docs/dev/ref/parameter_optimization/#Optimizing-Trainable-Parameters-1)).
# Here we will use gradient descent with a fixed step size of 0.001.

update = Gen.ParamUpdate(Gen.FixedStepGradientDescent(0.001), custom_dest_proposal_trainable);
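The `Gen.ParamUpdate(Gen.FixedStepGradientDescent(0.001), ...)` line above configures the optimizer for the proposal's trainable parameters. A compressed sketch of the surrounding mechanics — `@param` declared in unconstrained (log) space, parameter initialization, gradient accumulation, and the update — using made-up names and a toy objective rather than the tutorial's amortized training setup:

```julia
using Gen

@gen function trainable_proposal(x::Float64)
    @param log_score_high::Float64        # trainable, unconstrained
    score_high = exp(log_score_high)      # constrained value used downstream
    out ~ normal(score_high * x, 1.0)
    return out
end

Gen.init_param!(trainable_proposal, :log_score_high, 0.0)

# Fixed-step gradient descent over all trainable parameters of the proposal.
update = Gen.ParamUpdate(Gen.FixedStepGradientDescent(0.001), trainable_proposal)

for _ in 1:100
    # Constrain the output and follow the gradient of its log probability
    # with respect to the trainable parameter.
    (trace, _) = Gen.generate(trainable_proposal, (1.0,), Gen.choicemap((:out, 2.5)))
    Gen.accumulate_param_gradients!(trace, nothing)
    Gen.apply_update!(update)
end
```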
2 changes: 1 addition & 1 deletion tutorials/Introduction to Modeling in Gen.ipynb
@@ -9269,7 +9269,7 @@
"What if we'd want to predict `ys` given `xs`?\n",
"\n",
"Using the API method\n",
"[`generate`](https://www.gen.dev/dev/ref/gfi/#Gen.generate), we\n",
"[`generate`](https://www.gen.dev/docs/dev/ref/gfi/#Gen.generate), we\n",
"can generate a trace of a generative function in which the values of certain\n",
"random choices are constrained to given values. The constraints are a choice\n",
"map that maps the addresses of the constrained random choices to their\n",
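The hunk above introduces `Gen.generate` with constraints. A small self-contained sketch of that API, using a toy model and names rather than the tutorial's:

```julia
using Gen

@gen function line_model(xs::Vector{Float64})
    slope ~ normal(0, 2)
    intercept ~ normal(0, 2)
    for (i, x) in enumerate(xs)
        {(:y, i)} ~ normal(slope * x + intercept, 0.1)
    end
end

xs = [1.0, 2.0, 3.0]

# The constraints are a choice map from addresses to values.
constraints = Gen.choicemap()
constraints[:slope] = 0.5
constraints[:intercept] = -1.0

# Constrained choices are fixed; unconstrained choices (the ys) are sampled.
# `generate` also returns a log importance weight for the constrained choices.
(trace, weight) = Gen.generate(line_model, (xs,), constraints)
ys = [trace[(:y, i)] for i in 1:length(xs)]
```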
2 changes: 1 addition & 1 deletion tutorials/Introduction to Modeling in Gen.jl
@@ -512,7 +512,7 @@ scatter(xs, ys_sine, color="black", label=nothing)
# What if we wanted to predict `ys` given `xs`?
#
# Using the API method
-# [`generate`](https://www.gen.dev/dev/ref/gfi/#Gen.generate), we
+# [`generate`](https://www.gen.dev/docs/dev/ref/gfi/#Gen.generate), we
# can generate a trace of a generative function in which the values of certain
# random choices are constrained to given values. The constraints are a choice
# map that maps the addresses of the constrained random choices to their
12 changes: 6 additions & 6 deletions tutorials/Particle Filtering in Gen.ipynb
@@ -46,12 +46,12 @@
"\"bearings only tracking\" problem described in [4]. \n",
"\n",
"This notebook will also introduce you to the \n",
"[`Unfold`](https://www.gen.dev/dev/ref/combinators/#Unfold-combinator-1) combinator, \n",
"[`Unfold`](https://www.gen.dev/docs/dev/ref/combinators/#Unfold-combinator-1) combinator, \n",
"which can be used to improve performance of SMC.\n",
"`Unfold` is just one example of the levers that Gen provides for\n",
"improving performance; once you understand it, you can check\n",
"Gen's documentation to see how similar principles apply to the \n",
"[`Map`](https://www.gen.dev/dev/ref/combinators/#Map-combinator-1) combinator \n",
"[`Map`](https://www.gen.dev/docs/dev/ref/combinators/#Map-combinator-1) combinator \n",
"and to the static DSL. (These features are also covered in the previous tutorial,\n",
"[Scaling with Combinators and the Static Modeling Language](Scaling%20with%20Combinators%20and%20the%20Static%20Modeling%20Language.ipynb).)\n",
"\n",
@@ -607,7 +607,7 @@
"filter produces.\n",
"\n",
"Gen provides methods for initializing and updating the state of a particle\n",
"filter, documented in [Particle Filtering](https://www.gen.dev/dev/ref/pf/).\n",
"filter, documented in [Particle Filtering](https://www.gen.dev/docs/dev/ref/pf/).\n",
"\n",
"- `Gen.initialize_particle_filter`\n",
"\n",
@@ -678,7 +678,7 @@
"- The new arguments to the generative function for this step. In our case,\n",
" this is the number of measurements beyond the first measurement.\n",
"\n",
"- The [argdiff](https://www.gen.dev/dev/ref/gfi/#Argdiffs-1)\n",
"- The [argdiff](https://www.gen.dev/docs/dev/ref/gfi/#Argdiffs-1)\n",
" value, which provides detailed information about the change to the\n",
" arguments between the previous step and this step. We will revisit this\n",
" value later. For now, we indicate that we do not know how the `T::Int`\n",
@@ -42966,7 +42966,7 @@
"DSL to be very flexible and to have a simple implementation, at the cost of\n",
"performance. There are several ways of improving performance after one has a\n",
"prototype written in the built-in modeling DSL. One of these is [Generative\n",
"Function Combinators](https://www.gen.dev/dev/ref/combinators/), which make \n",
"Function Combinators](https://www.gen.dev/docs/dev/ref/combinators/), which make \n",
"the flow of information through the generative process more explicit to Gen, \n",
"and enable asymptotically more efficient inference programs.\n",
"\n",
@@ -42997,7 +42997,7 @@
"This `for` loop has a very specific pattern of information flow—there is a\n",
"sequence of states (represented by `x`, `y`, `vx`, and `vy`), and each state is\n",
"generated from the previous state. This is exactly the pattern that the\n",
"[Unfold](https://www.gen.dev/dev/ref/combinators/#Unfold-combinator-1)\n",
"[Unfold](https://www.gen.dev/docs/dev/ref/combinators/#Unfold-combinator-1)\n",
"generative function combinator is designed to handle.\n",
"\n",
"Below, we re-express the Julia `for` loop over the state sequence using the\n",
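The hunks above describe re-expressing the state-update `for` loop with the `Unfold` combinator. A compact sketch of that pattern, with a simplified 1-D state and placeholder names:

```julia
using Gen

struct State
    x::Float64
    vx::Float64
end

# Unfold kernel: step index first, previous state second; returns the new state.
@gen function kernel(t::Int, prev::State)
    x ~ normal(prev.x + prev.vx, 0.1)
    vx ~ normal(prev.vx, 0.05)
    return State(x, vx)
end

chain = Gen.Unfold(kernel)

@gen function unfolded_model(T::Int)
    x0 ~ normal(0, 1)
    vx0 ~ normal(0, 0.1)
    # Applies `kernel` T times, threading the state; step t's choices live
    # under the address :states => t.
    states ~ chain(T, State(x0, vx0))
    return states
end

trace = Gen.simulate(unfolded_model, (10,))
trace[:states => 3 => :x]   # x-coordinate at step 3
```

Because the chain makes the step-to-step dependence explicit, extending the model by one time step inside a particle filter does not require re-running the whole loop.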
12 changes: 6 additions & 6 deletions tutorials/Particle Filtering in Gen.jl
@@ -56,12 +56,12 @@
# "bearings only tracking" problem described in [4].
#
# This notebook will also introduce you to the
-# [`Unfold`](https://www.gen.dev/dev/ref/combinators/#Unfold-combinator-1) combinator,
+# [`Unfold`](https://www.gen.dev/docs/dev/ref/combinators/#Unfold-combinator-1) combinator,
# which can be used to improve performance of SMC.
# `Unfold` is just one example of the levers that Gen provides for
# improving performance; once you understand it, you can check
# Gen's documentation to see how similar principles apply to the
-# [`Map`](https://www.gen.dev/dev/ref/combinators/#Map-combinator-1) combinator
+# [`Map`](https://www.gen.dev/docs/dev/ref/combinators/#Map-combinator-1) combinator
# and to the static DSL. (These features are also covered in the previous tutorial,
# [Scaling with Combinators and the Static Modeling Language](Scaling%20with%20Combinators%20and%20the%20Static%20Modeling%20Language.ipynb).)
#
@@ -243,7 +243,7 @@ title!("Observed bearings (lines) and positions (dots)")
# filter produces.
#
# Gen provides methods for initializing and updating the state of a particle
-# filter, documented in [Particle Filtering](https://www.gen.dev/dev/ref/pf/).
+# filter, documented in [Particle Filtering](https://www.gen.dev/docs/dev/ref/pf/).
#
# - `Gen.initialize_particle_filter`
#
@@ -302,7 +302,7 @@ end;
# - The new arguments to the generative function for this step. In our case,
# this is the number of measurements beyond the first measurement.
#
-# - The [argdiff](https://www.gen.dev/dev/ref/gfi/#Argdiffs-1)
+# - The [argdiff](https://www.gen.dev/docs/dev/ref/gfi/#Argdiffs-1)
# value, which provides detailed information about the change to the
# arguments between the previous step and this step. We will revisit this
# value later. For now, we indicate that we do not know how the `T::Int`
@@ -592,7 +592,7 @@ title!("Rejuvenation with resimulation MH on the starting points")
# DSL to be very flexible and to have a simple implementation, at the cost of
# performance. There are several ways of improving performance after one has a
# prototype written in the built-in modeling DSL. One of these is [Generative
-# Function Combinators](https://www.gen.dev/dev/ref/combinators/), which make
+# Function Combinators](https://www.gen.dev/docs/dev/ref/combinators/), which make
# the flow of information through the generative process more explicit to Gen,
# and enable asymptotically more efficient inference programs.
#
@@ -623,7 +623,7 @@ title!("Rejuvenation with resimulation MH on the starting points")
# This `for` loop has a very specific pattern of information flow—there is a
# sequence of states (represented by `x`, `y`, `vx`, and `vy`), and each state is
# generated from the previous state. This is exactly the pattern that the
-# [Unfold](https://www.gen.dev/dev/ref/combinators/#Unfold-combinator-1)
+# [Unfold](https://www.gen.dev/docs/dev/ref/combinators/#Unfold-combinator-1)
# generative function combinator is designed to handle.
#
# Below, we re-express the Julia `for` loop over the state sequence using the
2 changes: 1 addition & 1 deletion tutorials/Reversible-Jump MCMC in Gen.ipynb
@@ -83,7 +83,7 @@
"\n",
"### Using `@dist` to define new distributions for convenience \n",
"To sample the number of segments, we need a distribution with support only on\n",
"the positive integers. We create one using the [`@dist` DSL](https://www.gen.dev/dev/ref/distributions/#dist_dsl-1):"
"the positive integers. We create one using the [`@dist` DSL](https://www.gen.dev/docs/dev/ref/distributions/#dist_dsl-1):"
]
},
{
2 changes: 1 addition & 1 deletion tutorials/Reversible-Jump MCMC in Gen.jl
@@ -74,7 +74,7 @@ Logging.disable_logging(Logging.Info);
#
# ### Using `@dist` to define new distributions for convenience
# To sample the number of segments, we need a distribution with support only on
-# the positive integers. We create one using the [`@dist` DSL](https://www.gen.dev/dev/ref/distributions/#dist_dsl-1):
+# the positive integers. We create one using the [`@dist` DSL](https://www.gen.dev/docs/dev/ref/distributions/#dist_dsl-1):

# A distribution that is guaranteed to be 1 or higher.
@dist poisson_plus_one(rate) = poisson(rate) + 1;
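The `@dist` definition shown above yields a first-class distribution. For illustration (assumed usage, not part of this tutorial's code), it can be sampled at an address and scored like any built-in distribution:

```julia
using Gen

@dist poisson_plus_one(rate) = poisson(rate) + 1

@gen function segments_model()
    n ~ poisson_plus_one(2.0)   # number of segments, always >= 1
    return n
end

trace = Gen.simulate(segments_model, ())
Gen.logpdf(poisson_plus_one, 4, 2.0)   # log P(poisson(2.0) + 1 == 4)
```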
6 changes: 3 additions & 3 deletions tutorials/Scaling with Combinators and the Static Modeling Language.ipynb
@@ -11,11 +11,11 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Up until this point, we have been using [Gen's generic built-in modeling language](https://www.gen.dev/dev/ref/modeling/), which is a very flexible modeling language that is shallowly embedded in Julia. However, better performance and scaling characteristics can be obtained using specialized modeling languages or modeling constructs. This notebook introduces two built-in features of Gen:\n",
"Up until this point, we have been using [Gen's generic built-in modeling language](https://www.gen.dev/docs/dev/ref/modeling/), which is a very flexible modeling language that is shallowly embedded in Julia. However, better performance and scaling characteristics can be obtained using specialized modeling languages or modeling constructs. This notebook introduces two built-in features of Gen:\n",
"\n",
"- A more specialized [Static Modeling Language](https://www.gen.dev/dev/ref/modeling/#Static-Modeling-Language-1) which is built-in to Gen.\n",
"- A more specialized [Static Modeling Language](https://www.gen.dev/docs/dev/ref/modeling/#Static-Modeling-Language-1) which is built-in to Gen.\n",
"\n",
"- A class of modeling constructs called [Generative function combinators](https://www.gen.dev/dev/ref/combinators/).\n",
"- A class of modeling constructs called [Generative function combinators](https://www.gen.dev/docs/dev/ref/combinators/).\n",
"\n",
"These features provide both constant-factor speedups, as well as improvements in asymptotic orders of growth, over the generic built-in modeling language.\n",
"\n",
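For readers skimming this diff, a small sketch of the two features named in the hunk above — a static DSL function and the Map combinator — with illustrative names (whether `@load_generated_functions` is needed depends on the Gen version):

```julia
using Gen

# One datum, in the static modeling language.
@gen (static) function datum(x::Float64, slope::Float64, intercept::Float64)
    y ~ normal(slope * x + intercept, 0.1)
    return y
end

# Map broadcasts `datum` over vectors of arguments, making the independence
# structure explicit to Gen.
data = Gen.Map(datum)

@gen (static) function regression(xs::Vector{Float64})
    slope ~ normal(0, 2)
    intercept ~ normal(0, 2)
    n = length(xs)
    slopes = fill(slope, n)
    intercepts = fill(intercept, n)
    ys ~ data(xs, slopes, intercepts)
    return ys
end

Gen.@load_generated_functions   # required on some Gen versions before calling static functions

trace = Gen.simulate(regression, ([1.0, 2.0, 3.0],))
trace[:ys => 2 => :y]   # the y for the second datum
```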
6 changes: 3 additions & 3 deletions tutorials/Scaling with Combinators and the Static Modeling Language.jl
@@ -15,11 +15,11 @@

# # Scaling with Combinators and the Static Modeling Language

-# Up until this point, we have been using [Gen's generic built-in modeling language](https://www.gen.dev/dev/ref/modeling/), which is a very flexible modeling language that is shallowly embedded in Julia. However, better performance and scaling characteristics can be obtained using specialized modeling languages or modeling constructs. This notebook introduces two built-in features of Gen:
+# Up until this point, we have been using [Gen's generic built-in modeling language](https://www.gen.dev/docs/dev/ref/modeling/), which is a very flexible modeling language that is shallowly embedded in Julia. However, better performance and scaling characteristics can be obtained using specialized modeling languages or modeling constructs. This notebook introduces two built-in features of Gen:
#
-# - A more specialized [Static Modeling Language](https://www.gen.dev/dev/ref/modeling/#Static-Modeling-Language-1) which is built-in to Gen.
+# - A more specialized [Static Modeling Language](https://www.gen.dev/docs/dev/ref/modeling/#Static-Modeling-Language-1) which is built-in to Gen.
#
-# - A class of modeling constructs called [Generative function combinators](https://www.gen.dev/dev/ref/combinators/).
+# - A class of modeling constructs called [Generative function combinators](https://www.gen.dev/docs/dev/ref/combinators/).
#
# These features provide both constant-factor speedups, as well as improvements in asymptotic orders of growth, over the generic built-in modeling language.
#
