197k views
4 votes
Which statistical methods are better than basic random scans for exploring the parameter space of extensions of the Standard Model?

Given the Lagrangian of a new model beyond the Standard Model, and given a set of constraints (say, the oblique parameters, Higgs boson decays, and some signal strengths and branching ratios), my professor and I have so far been using the random number generator ran2 (from Numerical Recipes) to scan the ranges of the free parameters and check whether the calculated quantities are allowed by the constraints. However, looking over the literature, I find that most people use fancier methods such as likelihood functions and Monte Carlo. But the details of their calculations and implementations are far from explicit and seem arbitrary (especially the choice of the probability distribution to sample from).
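For concreteness, here is a minimal Python sketch of the kind of flat random scan described above. The parameter names, ranges, and the `observables` and `passes_constraints` functions are hypothetical placeholders, not the actual model; in practice the observables would come from the model's spectrum and decay calculators.

```python
import numpy as np

rng = np.random.default_rng(42)  # modern stand-in for a ran2-style generator

# Hypothetical free parameters of the new model: a scalar mass and a mixing angle
ranges = {"m_H": (200.0, 1000.0), "sin_alpha": (-0.5, 0.5)}

def observables(point):
    """Placeholder: compute oblique parameters, signal strengths, etc."""
    S = 0.01 * point["sin_alpha"] * np.log(point["m_H"] / 125.0)  # toy formula
    return {"S": S}

def passes_constraints(obs):
    """Placeholder hard cut, e.g. a 2-sigma window around a measured S value."""
    return abs(obs["S"] - 0.05) < 2 * 0.10

accepted = []
for _ in range(10_000):
    # Draw each parameter uniformly from its allowed range (a "flat" scan)
    point = {k: rng.uniform(lo, hi) for k, (lo, hi) in ranges.items()}
    if passes_constraints(observables(point)):
        accepted.append(point)

print(f"{len(accepted)} points survive the cuts")
```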

My question is: are these methods justified for this kind of phenomenological study? If so, on what basis do we choose the probability distribution? Is there any link between the likelihood function and the Lagrangian of the model?

1 Answer

6 votes

Final answer:

Sophisticated techniques such as likelihood functions, Monte Carlo simulations, and Bayesian inference with Markov Chain Monte Carlo (MCMC) sampling provide a more rigorous approach to exploring the parameter spaces of new physics models than basic random scans.

Step-by-step explanation:

When we examine statistical methods for exploring the parameter space of extensions of the Standard Model, it becomes clear that methods such as likelihood functions and Monte Carlo simulations offer more sophisticated ways of understanding the implications of a new Lagrangian and its constraints than basic random scans do.

Likelihood-based approaches are grounded in the probability of observing the given data: one optimizes the parameters so as to make the observed data most probable. This method can be more rigorous in that it does not assume prior knowledge about parameter distributions. The Bayesian inference method, by contrast, which can be implemented using software like WinBUGS, incorporates prior information to sharpen parameter estimates. This is helpful because it accounts for previous knowledge and treats parameters as random variables rather than fixed quantities. Moreover, Bayesian approaches are beneficial when errors do not follow standard distributions, and Markov Chain Monte Carlo (MCMC) sampling gives them the flexibility to escape the local minima that trap simple optimizers.
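To make the likelihood-plus-MCMC picture concrete, below is a minimal sketch of a Gaussian likelihood combined with a flat prior and a Metropolis-Hastings sampler. Everything model-specific here is an invented placeholder: the `predict` function, the pseudo-measurements in `data`, and the parameter box would, in a real study, come from the model's Lagrangian via a spectrum and decay calculator and from the actual experimental constraints.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical measurements: central value and uncertainty for two observables
data = {"S": (0.05, 0.10), "mu_gamgam": (1.10, 0.07)}

def predict(theta):
    """Placeholder model predictions as a function of the free parameters."""
    m_H, sin_a = theta
    return {"S": 0.01 * sin_a * np.log(m_H / 125.0),
            "mu_gamgam": 1.0 - 0.5 * sin_a**2}

def log_likelihood(theta):
    """Gaussian likelihood: each measurement contributes -chi^2 / 2."""
    pred = predict(theta)
    return -0.5 * sum(((pred[k] - mu) / sig) ** 2 for k, (mu, sig) in data.items())

def log_prior(theta):
    """Flat prior inside the allowed box, -inf outside."""
    m_H, sin_a = theta
    if 200.0 < m_H < 1000.0 and -0.5 < sin_a < 0.5:
        return 0.0
    return -np.inf

def log_posterior(theta):
    lp = log_prior(theta)
    if not np.isfinite(lp):
        return -np.inf
    return lp + log_likelihood(theta)

# Metropolis-Hastings: propose Gaussian steps, accept with probability
# min(1, posterior ratio); rejected proposals repeat the current point.
theta = np.array([500.0, 0.1])
step = np.array([20.0, 0.02])
logp = log_posterior(theta)
chain = []
for _ in range(20_000):
    proposal = theta + step * rng.standard_normal(2)
    logp_new = log_posterior(proposal)
    if np.log(rng.uniform()) < logp_new - logp:
        theta, logp = proposal, logp_new
    chain.append(theta.copy())

chain = np.array(chain)
print("posterior mean:", chain[5000:].mean(axis=0))  # discard burn-in
```

Note that the chain spends most of its time in high-likelihood regions, so it maps out the allowed region far more efficiently than a flat scan when the viable volume is a small fraction of the box; the prior choice matters most when the data constrain some directions only weakly.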

Choosing a probability distribution to sample from in Monte Carlo methods should be guided by the nature of the parameter space and, if Bayesian methods are used, by the form of the priors. Additionally, model comparison criteria such as the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC) reward parsimony and can be valuable tools for assessing different hypotheses about model structure.
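As a small illustration of these criteria, the sketch below evaluates AIC and BIC from a best-fit log-likelihood. The log-likelihood values and parameter counts are made-up numbers for two hypothetical model variants, not results from any real fit.

```python
import numpy as np

def aic(max_log_likelihood, n_params):
    """AIC = 2k - 2 ln L_max; lower is better."""
    return 2 * n_params - 2 * max_log_likelihood

def bic(max_log_likelihood, n_params, n_data):
    """BIC = k ln n - 2 ln L_max; penalizes extra parameters more as n grows."""
    return n_params * np.log(n_data) - 2 * max_log_likelihood

# Hypothetical comparison: a 2-parameter extension vs. a 4-parameter one,
# fitted to 30 measurements, with invented best-fit log-likelihoods.
print("model A:", aic(-12.3, 2), bic(-12.3, 2, 30))
print("model B:", aic(-10.9, 4), bic(-10.9, 4, 30))
```

Here model B fits slightly better but pays a penalty for its extra parameters, which is exactly the parsimony trade-off these criteria encode.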

answered by Vrijdenker (8.0k points)