Replies: 2 comments 5 replies
-
For the first part, I think before we answer a lot of the questions, you might want to refer to the documentation on the simplemodels. Already, I noticed that

```python
model = pyhf.simplemodels.correlated_background([signal_yields], [bkg_yields], [bkg_unc], [bkg_unc])
```

is incorrect. What you'd want, assuming your background uncertainty is absolute, is

```python
>>> model = pyhf.simplemodels.correlated_background([signal_yields], [bkg_yields], [bkg_yields - bkg_unc], [bkg_yields + bkg_unc])
>>> computer(model, data)
array([-0.00037948, 0.])
```

or, if the background uncertainty was relative, then

```python
>>> model = pyhf.simplemodels.correlated_background([signal_yields], [bkg_yields], [bkg_yields * (1 - bkg_unc)], [bkg_yields * (1 + bkg_unc)])
>>> computer(model, data)
array([3.45592019e-05, 2.41560547e-20])
```

since the correlated background is built from a pair of down/up template histograms (a `histosys`-style modifier).

The other thing that's tripping you up here is that the order of the parameters is not consistent between these two models (and the auxdata changes too!). For the uncorrelated model,

```python
>>> model.config.par_order
['mu', 'uncorr_bkguncrt']
```

and for the correlated one,

```python
>>> model.config.par_order
['correlated_bkg_uncertainty', 'mu']
```

so you probably (if easier) just need to change your `computer` to

```python
def computer(model, obs_yields):
    data = [obs_yields] + model.config.auxdata
    pars, twice_nllh = pyhf.infer.mle.fit(
        data,
        model,
        return_fitted_val=True,
        maxiter=200,
        par_bounds=model.config.suggested_bounds(),
    )
    return dict(zip(model.config.par_order, pars))
```

Making these changes, I get

```python
>>> model = pyhf.simplemodels.uncorrelated_background([signal_yields], [bkg_yields], [bkg_unc])
>>> computer(model, obs_yields)
{'mu': 0.9999999999542389, 'uncorr_bkguncrt': 1.0000000207942246}
```

and

```python
>>> model = pyhf.simplemodels.correlated_background([signal_yields], [bkg_yields], [bkg_yields - bkg_unc], [bkg_yields + bkg_unc])
>>> computer(model, obs_yields)
{'correlated_bkg_uncertainty': -0.00037948419484118976, 'mu': 0.0}
```

For the uncorrelated case, I suspect the likelihood is nearly flat in `mu`:

```python
>>> model.logpdf([0, 1], data)
array([-15.6542112])
>>> model.logpdf([1, 1], data)
array([-15.65917812])
```

but I have to think more about this; I think you might be setting your uncertainty incorrectly. @alexander-held ?
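A minimal stdlib-only sketch of the structure `correlated_background` builds may help here: one Poisson bin plus a linearly interpolated shift controlled by a single standard-normal-constrained parameter (the `correlated_bkg_uncertainty` above). The numbers `s`, `b`, `db` are assumptions for illustration, not the actual yields in this thread:

```python
import math

# Sketch of a one-bin "correlated background" likelihood: Poisson main term
# plus a histosys-style linear shift alpha * db between symmetric templates
# b - db and b + db, with a Normal(0, 1) constraint on alpha.
# s, b, db are assumed illustrative numbers, not the poster's yields.
s, b, db = 5.0, 50.0, 5.0
n_obs = 50.0                 # observed == background, as in the example

def logpdf(mu, alpha):
    lam = mu * s + b + alpha * db          # expected yield in the single bin
    main = n_obs * math.log(lam) - lam - math.lgamma(n_obs + 1)
    constraint = -0.5 * (alpha**2 + math.log(2 * math.pi))  # log N(alpha | 0, 1)
    return main + constraint

# observed == background is matched exactly at (mu, alpha) = (0, 0), so the
# fit prefers mu-hat = 0 in this parameterisation
print(logpdf(0.0, 0.0) > logpdf(1.0, 0.0))  # → True
```

This is why the corrected correlated fit lands at `mu = 0.0` with a tiny `correlated_bkg_uncertainty`: the data are already matched at the nominal background.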
See #820. Open issue at the moment. Constraint types cannot be changed from the config, but you can tweak the Python code and do a bit of monkeypatching to make this work.
-
Hi @jackaraz, I notice that the bkg uncertainty is very small. Does it look better for e.g. a 5% uncertainty?
-
Hi all, I have a very naive question; we are trying to understand how we should initialise our statistical model in different ways and need clarification. So let me set up an example.
Let's say I have a region with the following yields:
Typically I would initialise this as:
And I would expect $\hat{\mu}$ to be zero, since the observed and bkg yields are the same; i.e.
However, I got $\hat{\mu} = 1$. So I tested the same example in two ways; first, I used the correlated background simple model:
Which is closer to what I expected. Then I tried adding 1 to the bkg uncertainty in the uncorrelated simple model:
Which yields precisely what I expect. However, I don't know if I should add 1 to my uncertainty: is the uncertainty applied multiplicatively? Could you please confirm? Is this the same for the correlated simple model? I guess for this particular example, since I have one bin, it doesn't matter, but I'm assuming that I should not be using a correlated simple model if I do not have upper and lower envelope information for the bkg uncertainty.
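For intuition on the multiplicative question: `uncorrelated_background` corresponds to a `shapesys` modifier, where the nuisance parameter `gamma` multiplies the background yield, and `bkg_uncertainty` is an absolute per-bin uncertainty that enters only through the constraint strength `tau = (b/db)**2`. A rough stdlib-only sketch of the one-bin likelihood, with assumed illustrative yields (not the ones from the example above):

```python
import math

def poisson_logpmf(k, lam):
    # log of the Poisson pmf; k may be non-integer for the auxiliary term
    return k * math.log(lam) - lam - math.lgamma(k + 1)

# assumed one-bin numbers for illustration (not the actual example yields)
s, b, db = 5.0, 50.0, 5.0   # signal, background, absolute bkg uncertainty
n_obs = 50.0                # observed == background

# shapesys: gamma MULTIPLIES the background bin; db only sets the
# auxiliary-measurement strength tau = (b/db)^2
tau = (b / db) ** 2

def twice_nll(mu, gamma):
    main = poisson_logpmf(n_obs, mu * s + gamma * b)   # Poisson(n | mu*s + gamma*b)
    constraint = poisson_logpmf(tau, gamma * tau)      # Poisson(tau | gamma*tau)
    return -2.0 * (main + constraint)

# crude grid scan for the minimum (illustration only)
best = min(
    ((m / 100, g / 100) for m in range(-100, 101) for g in range(80, 121)),
    key=lambda p: twice_nll(*p),
)
print(best)  # → (0.0, 1.0): mu-hat = 0 when observed matches background
```

So you should not add 1 to the uncertainty yourself: pass the absolute uncertainty, and the multiplicative `gamma` (centred on 1) is handled by the model.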
My second question: I see in the documentation that the uncorrelated shape is always constrained via a Poissonian. Is there a function that uses a Gaussian instead for uncorrelated shapes? The reason I'm asking is that I'm trying to make the `pdf` similar to a simplified likelihood. Thanks!
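On the Gaussian point, one mitigating fact: for a moderate ratio `b/db`, the Poisson constraint used by `shapesys` is already numerically close to the Gaussian constraint a simplified likelihood would use, since `Poisson(tau | gamma*tau)` with `tau = (b/db)**2` approaches `Normal(gamma | 1, db/b)` for large `tau`. A small stdlib check under assumed numbers (`b`, `db` are illustrative):

```python
import math

# assumed one-bin numbers (illustrative, not the actual example yields)
b, db = 50.0, 5.0
tau = (b / db) ** 2          # = 100; shapesys constraint strength

def poisson_constraint(gamma):
    # shapesys-style constraint: log Poisson(tau | gamma * tau)
    return tau * math.log(gamma * tau) - gamma * tau - math.lgamma(tau + 1)

def gaussian_constraint(gamma):
    # simplified-likelihood style: log Normal(gamma | 1, db/b)
    sigma = db / b
    return -0.5 * (((gamma - 1.0) / sigma) ** 2 + math.log(2 * math.pi * sigma**2))

# near gamma = 1 the two penalties differ only at O(1/tau)
print(round(poisson_constraint(1.05) - poisson_constraint(1.0), 3))   # → -0.121
print(round(gaussian_constraint(1.05) - gaussian_constraint(1.0), 3)) # → -0.125
```

So unless the background uncertainty is very large relative to the yield, swapping the constraint (per the monkeypatching note in #820) should make only a small numerical difference.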