Principles of Nonparametric Learning
Udine (Italy), July 9-13, 2001
Advanced School coordinated by L. Györfi, Budapest University of Technology and Economics, Hungary

Modern nonparametric methods have become among the most important tools in pattern recognition, density and regression function estimation, data compression, and on-line learning and prediction. The common feature of these problems is that some unknown underlying system generates data, and the best action must be learnt from these data. The purpose of the course is to teach the basic principles of nonparametric inference, with emphasis on the areas cited above. Leading international experts in these areas will introduce the participants to the theory and advanced methods of inference.

A prototype problem to be discussed in depth is pattern recognition, in which an observation is to be classified into one of a finite number of classes and the optimal classifier must be approximated from training data. Classical nonparametric methods such as nearest neighbor and kernel methods, as well as modern methodologies including neural networks, support vector machines, and binary tree classifiers, will be studied in the course. Evolutionary optimization methodologies, e.g. genetic programming, will also be considered, as they allow nonparametric learning to handle complex data and explore structured model spaces.

The methodologies covered by this course have found applications in various fields, such as identification of biological and mechanical systems, data mining, forecasting, universal data compression, optimal portfolio strategies for stock markets, and measurement-based call admission control for high-speed communication networks.

The course is addressed to postgraduates in engineering, mathematics, and computer science, and to researchers at universities and research institutions. Please visit the school home page (www.cism.it/c2001/c03/) for information on the program, admission, and accommodation.