Research workers in many fields are realizing the substantial limitations of statistical tests, test statistics, arbitrary α-levels, P-values, and dichotomous rulings concerning "statistical significance." These traditional approaches were developed at the beginning of the last century and are being replaced by modern methods that are much more useful. These methods rely on the concepts of information loss and formal evidence. They provide easy-to-compute quantities such as the probability of each hypothesis/model and evidence ratios. Furthermore, simple methods allow formal inference (e.g., prediction/forecasting) from all the models in an a priori set ("multimodel inference").

This course on information-theoretic approaches to statistical inference focuses on the practical application of these new methods and is based on Kullback-Leibler information and Akaike's information criterion (AIC). The material follows the recent textbook: Anderson, D. R. 2008. Model based inference in the life sciences: a primer on evidence. Springer, New York, NY. 184 pp. A copy of this book, a reference sheet, and several handouts are included in the registration fee.

These courses stress science and science philosophy as much as statistical methods. The focus is on quantification and qualification of formal evidence concerning alternative science hypotheses. The courses are informal, and discussion and debate are encouraged.

Registration deadline: September 15.
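As a small illustration of the "easy-to-compute quantities" mentioned above, the sketch below computes Akaike weights (interpretable as model probabilities) and an evidence ratio from a set of AIC values, following the standard information-theoretic recipe; the AIC values themselves are hypothetical, chosen only for demonstration.

```python
import math

def akaike_weights(aic_values):
    """Compute Akaike weights (model probabilities) from a list of AIC values."""
    best = min(aic_values)
    deltas = [a - best for a in aic_values]       # AIC differences, Delta_i
    rel_lik = [math.exp(-d / 2) for d in deltas]  # relative likelihoods of each model
    total = sum(rel_lik)
    return [r / total for r in rel_lik]

# Hypothetical AIC values for three candidate models
aics = [100.0, 102.0, 110.0]
weights = akaike_weights(aics)

# Evidence ratio: strength of evidence for the best model over the runner-up
evidence_ratio = weights[0] / weights[1]
```

The weights sum to one across the model set, and the evidence ratio between any two models depends only on their AIC difference, which is why these quantities are so straightforward to report alongside the models themselves.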