PKPD models will likely need outputs from the same model to follow different likelihoods, e.g. one output follows a normal model while another follows a log-normal. At the moment, this sort of thing is not possible in PINTS without a lot of manual hacking. Similarly, PINTS cannot natively handle an inference problem with multiple outputs where those outputs are measured at different times.
I wanted to open the forum on this. Here's a potential proposal.
We create a `MasterProblemCollection` with the following methods:
- it is initialised with a forward model and values as usual. Crucially, it is also initialised with some way of indexing which outputs correspond to which `SubProblem` (see below). At a simple level, this could be a list `[1,1,1,2,2,2,2,3,3,3,3,3]`, which would indicate that the first three forward model outputs correspond to subproblem 1, the next four to subproblem 2 and the last five to subproblem 3. It is also initialised with a list of `times` lists: one list of times for each `SubProblem`.
- `evaluate` takes an argument `index` (in the above example, either 1, 2 or 3) which specifies which output set to return. The first time `evaluate` is called, the result is cached so that the model is only solved once, not once per output set.
- `n_outputs` also takes `index` and returns the number of outputs for that `SubProblem`.
- `times` takes `index` and returns the appropriate time list.
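To make the caching idea concrete, here is a minimal pure-Python sketch. All names, and in particular the `simulate(parameters, times)` signature, are assumptions for illustration, not the real PINTS API; it solves over the union of all requested times and slices afterwards:

```python
class MasterProblemCollection:
    """Hypothetical sketch: one forward model shared by several sub-problems."""

    def __init__(self, model, times, values, index_list):
        self._model = model
        self._times = times            # one time list per sub-problem
        self._values = values
        self._index_list = index_list  # e.g. [1, 1, 2]
        # Solve over the sorted union of all requested times, slice afterwards
        self._all_times = sorted({t for ts in times for t in ts})
        self._cached_params = None
        self._cached_output = None

    def evaluate(self, parameters, index):
        params = tuple(parameters)
        if params != self._cached_params:  # the model is solved only once
            self._cached_output = self._model.simulate(parameters, self._all_times)
            self._cached_params = params
        wanted = set(self._times[index - 1])
        rows = [i for i, t in enumerate(self._all_times) if t in wanted]
        cols = [j for j, k in enumerate(self._index_list) if k == index]
        return [[self._cached_output[i][j] for j in cols] for i in rows]

    def n_outputs(self, index):
        return self._index_list.count(index)

    def times(self, index):
        return self._times[index - 1]
```

A real implementation would return NumPy arrays and invalidate the cache more carefully, but the key point is that sibling sub-problems sharing a master trigger a single model solve per parameter vector.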
We also create a `SubProblem`:
- initialised with a `MasterProblemCollection` and an `index`
- its `evaluate` calls `MasterProblemCollection.evaluate(index)`
- and so on for `n_outputs`, `times`, etc.
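The sub-problem side is then just a thin delegating view. A sketch (again, these names are assumptions, not existing PINTS classes):

```python
class SubProblem:
    """Hypothetical view onto one output set of a MasterProblemCollection."""

    def __init__(self, master, index):
        self._master = master
        self._index = index

    def evaluate(self, parameters):
        # Delegates to the master; its cache means sibling SubProblems
        # sharing one master cost only one model solve per parameter vector.
        return self._master.evaluate(parameters, self._index)

    def n_outputs(self):
        return self._master.n_outputs(self._index)

    def times(self):
        return self._master.times(self._index)
```

Because it exposes the usual `evaluate`/`n_outputs`/`times` interface, existing log-likelihood classes could wrap a `SubProblem` without knowing about the master.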
Example use case
The user has a forward model with three outputs and would like to model the first two using a `GaussianLogLikelihood` and the third with a `LogNormalLogLikelihood`. They would do:
```python
times_1 = [1, 2, 4, 6]
times_2 = [3, 5]
times = [times_1, times_2]
index_list = [1, 1, 2]

master = MasterProblemCollection(forward_model, times, values, index_list)

subproblem_1 = SubProblem(master, 1)
subproblem_2 = SubProblem(master, 2)

loglikelihood_1 = GaussianLogLikelihood(subproblem_1)
loglikelihood_2 = LogNormalLogLikelihood(subproblem_2)

logprior_1 = ...
logprior_2 = ...

logposterior_1 = LogPosterior(loglikelihood_1, logprior_1)
logposterior_2 = LogPosterior(loglikelihood_2, logprior_2)
```
The user could then write a small wrapper `LogPosterior` class that simply returns the sum of `logposterior_1` and `logposterior_2`, and use that for inference.
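That wrapper could be as simple as the following sketch (`SumLogPosterior` is a made-up name; it only assumes each log-posterior is callable on a parameter vector, and that the component priors cover disjoint parameters so nothing is double-counted):

```python
class SumLogPosterior:
    """Hypothetical wrapper: sums log-posteriors sharing one parameter vector."""

    def __init__(self, *log_posteriors):
        self._log_posteriors = log_posteriors

    def __call__(self, parameters):
        # Valid because the output sets are conditionally independent
        # given the shared parameters, so the joint log-density factorises.
        return sum(lp(parameters) for lp in self._log_posteriors)
```

For example, `SumLogPosterior(logposterior_1, logposterior_2)` could be handed to a sampler or optimiser anywhere a single log-posterior is expected.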
@martinjrobins @MichaelClerx @chonlei (and anyone else!) interested to hear thoughts on this