Alexander Gammerman

Professor, co-Director of Centre for Machine Learning

Professor Gammerman's current research interest lies in machine learning, in particular the development of conformal predictors: a set of novel machine learning techniques that guarantee the validity of predictions. Validity here means that the probability of error does not exceed a given significance level. Areas in which these techniques have been applied include medical diagnosis, drug design, forensic science, proteomics, genomics, environmental science and information security. He has published about two hundred research papers and several books on computational learning and probabilistic inference.

Professor Gammerman is a Fellow of the Royal Statistical Society and a Fellow of the Royal Society of Arts. He has chaired and served on the organising committees of many international conferences and workshops on machine learning and Bayesian methods in Europe, Russia and the United States. He was also a member of the editorial boards of the Law, Probability and Risk journal (2002-2009) and the Computer Journal (2003-2008). He has held visiting and honorary professorships at several universities in Europe and the USA.

Employment

  • 1998-present

    co-Director of Centre for Machine Learning, Royal Holloway, University of London

  • 1995-2005

    Head of Computer Science Department, Royal Holloway, University of London

  • 1993-present

Chair in Computer Science at the University of London

  • 1983-1993

    Reader in Computer Science at Heriot-Watt University, Edinburgh

  • 1974-1983

Senior Research Fellow at the Regional Research Computer Centre, St. Petersburg, Russia

Research

Conformal Prediction

Conformal prediction is a new method of machine learning. Conformal predictors are among the most accurate methods of machine learning and, unlike other state-of-the-art methods, they provide information about their own accuracy and reliability.

How good is your prediction? If you are predicting the label of a new object, how confident are you that the predicted label is correct? If the label is a number, how close do you think it is to the correct one?

In machine learning, these questions are usually answered in a fairly rough way from past experience. Conformal prediction uses past experience to determine precise levels of confidence in predictions.

More information is available here.
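The mechanics can be illustrated with a minimal sketch (not taken from any of the publications below; the function names and toy data are purely illustrative) of a full conformal predictor for classification with a 1-nearest-neighbour nonconformity score. For each candidate label, the new object is provisionally added to the training set, every example is scored by how "strange" it looks relative to the rest, and the label's p-value is the fraction of examples at least as strange as the new one:

```python
# Minimal sketch of a full conformal predictor for classification, using a
# 1-nearest-neighbour nonconformity score. Illustrative only: names and
# toy data are not from any published implementation.

def nn_score(i, bag):
    """Nonconformity of bag[i]: distance to the nearest same-label example
    divided by distance to the nearest different-label example.
    A large score means the example looks 'strange'."""
    x, y = bag[i]
    same = [abs(x - xj) for j, (xj, yj) in enumerate(bag) if j != i and yj == y]
    diff = [abs(x - xj) for j, (xj, yj) in enumerate(bag) if j != i and yj != y]
    if not diff:                 # nothing to contrast with: not strange
        return 0.0
    if not same:                 # no same-label neighbours: maximally strange
        return float("inf")
    return min(same) / min(diff)

def conformal_pvalues(train, x_new, labels):
    """p-value of each candidate label for the new object x_new: the
    fraction of examples (new one included) whose nonconformity score is
    at least as large as the new example's."""
    pvalues = {}
    for y in labels:
        bag = train + [(x_new, y)]
        scores = [nn_score(i, bag) for i in range(len(bag))]
        pvalues[y] = sum(s >= scores[-1] for s in scores) / len(bag)
    return pvalues

# Toy 1-D data: label "a" clusters near 1, label "b" near 5.
train = [(1.0, "a"), (1.1, "a"), (1.3, "a"),
         (5.0, "b"), (5.1, "b"), (5.3, "b")]
p = conformal_pvalues(train, 1.05, ["a", "b"])
# At significance level 0.2 the prediction set keeps every label whose
# p-value exceeds 0.2 -- here only "a".
prediction_set = [y for y, pv in p.items() if pv > 0.2]
```

The validity guarantee rests on the exchangeability assumption: the p-value of the true label is then distributed (approximately) uniformly, so a prediction set formed at significance level 0.2 fails to contain the true label with probability at most 0.2.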

My research interest lies in machine learning and pattern recognition. In particular, my latest work has been on the development of conformal predictors: a set of novel machine learning techniques that guarantee the validity of predictions.

Applications

Areas in which our techniques have been applied include the IT industry, medicine, the pharmaceutical industry, forensic science, genomics, the environment and finance.

Research in conformal predictors has received funding from industry, universities and research councils.

  • Exascale Compound Activity Prediction Engine
  • Proteomic Analysis

Selected Grants

My research has been widely funded by industry and academia.

AstraZeneca: Machine Learning for Chemical Synthesis; 2018-2020. £395,762.

EU Horizon 2020: Exascale Compound Activity Prediction Engine; over €4 million in total.

BBSRC: Living with uninvited guests: comparing plant and animal responses to endocytic invasions (ERASysBio). £700,000.

EPSRC: Mining the Network Behaviour of Bots. £680,623.

EPSRC: Practical competitive prediction. £406,000.

EU FP7 programme: Post-translational modification, O-PTM. £193,046.

Proteomic Analysis of the Human Serum Proteome: Analysis and Applications. £170,091.

BBSRC: Pattern Recognition Techniques for Gene and Promoter Identification and Classification in Plant Genomic Sequences. £145,210.

EPSRC: Complexity Approximation Principle and Predictive Complexity. £142,996.

EPSRC: Support Vector and Bayesian Learning Algorithms: Analysis and Applications. £142,360.

EPSRC: Comparison of the Support Vector Machine and Minimum Message Length methods for induction and prediction. £132,787.

Thales: Automated methods for detection of anomalous behaviour. £85,000.

Ph.D Students

I have supervised (and co-supervised) about 30 Ph.D students.

D. Adamskiy, T. Bellotti, B. Burford, D. Devetyarov, V. Fedorova, L. Gordon, Y. Gu, D. Lindsay, X. Liu, Z. Luo, T. Melluish, I. Nouretdinov, H. Papadopoulos, K. Proedrou, D. Ryabko, C. Saunders, J. Smith, M. Stitson, D. Surkov, J. Weston, M. Yang, C. Zhou.

Publications

Measures of Complexity

Springer | 2016 | ISBN-10: 3319357786

This book brings together historical notes, reviews of research developments, fresh ideas on how to make VC (Vapnik–Chervonenkis) guarantees tighter, and new technical contributions in the areas of machine learning, statistical inference, classification, algorithmic statistics, and pattern recognition. The contributors are leading scientists in domains such as statistics, mathematics, and theoretical computer science, and the book will be of interest to researchers and graduate students in these domains.

These recollections about the origins of VC theory were written by Alexey Chervonenkis in 2004 for several colleagues and not intended for publication. They are now published for the first time.

Conformal and Probabilistic Prediction with Applications

Springer | 2016 | ISBN-10: 3319333941

This book constitutes the refereed proceedings of the 5th International Symposium on Conformal and Probabilistic Prediction with Applications, COPA 2016, held in Madrid, Spain, in April 2016.

The volume is divided into three parts. The first part presents the invited paper "Learning with Intelligent Teacher" by Vladimir Vapnik and Rauf Izmailov, devoted to learning with privileged information and emphasizing the role of the teacher in the learning process. The second part is devoted to the theory of conformal prediction. The two papers in this part investigate various criteria of efficiency used in conformal prediction and introduce a universal probability-free version of conformal predictors. The core of the book is formed by the third part, containing experimental papers describing various applications of conformal prediction.

Statistical Learning and Data Sciences

Springer | 2015 | ISBN-10: 3319170902

This book constitutes the refereed proceedings of the Third International Symposium on Statistical Learning and Data Sciences, SLDS 2015, held in Egham, Surrey, the United Kingdom, in April 2015.

The volume is divided into five parts. The first part is devoted to two invited papers by Vladimir Vapnik. The first paper, “Learning with Intelligent Teacher: Similarity Control and Knowledge Transfer” is a further development of his research on learning with privileged information, with a special attention to the knowledge representation problem. The second, “Statistical Inference Problems and their Rigorous Solutions” suggests a novel approach to pattern recognition and regression estimation. Both papers promise to become milestones in the developing field of statistical learning. The second part consists of 16 papers that were accepted for presentation at the main event, while the other three parts reflect new research in important areas of statistical learning to which the symposium devoted special sessions.

Algorithmic Learning in a Random World

Springer | 2005 | ISBN-10: 0387001522

The main topic of this book is conformal prediction, a method of prediction recently developed in machine learning. Conformal predictors are among the most accurate methods of machine learning, and unlike other state-of-the-art methods, they provide information about their own accuracy and reliability.

The book integrates mathematical theory and revealing experimental work. It demonstrates mathematically the validity of the reliability claimed by conformal predictors when they are applied to independent and identically distributed data, and it confirms experimentally that the accuracy is sufficient for many practical problems. Later chapters generalize these results to models called repetitive structures, which originate in the algorithmic theory of randomness and statistical physics. The approach is flexible enough to incorporate most existing methods of machine learning, including newer methods such as boosting and support vector machines and older methods such as nearest neighbors and the bootstrap.

Causal Models and Intelligent Data Management

Springer | 1999 | ISBN 978-3-642-58648-4

Data analysis and inference have traditionally been research areas of statistics. However, the need to electronically store, manipulate and analyze large-scale, high-dimensional data sets requires new methods and tools, new types of databases, new efficient algorithms, new data structures, in effect new computational methods.

This monograph presents new intelligent data management methods and tools, such as the support vector machine, and new results from the field of inference, in particular of causal modeling. In 11 well-structured chapters, leading experts map out the major tendencies and future directions of intelligent data analysis. The book will become a valuable source of reference for researchers exploring the interdisciplinary area between statistics and computer science as well as for professionals applying advanced data analysis methods in industry and commerce. Students and lecturers will find the book useful as an introduction to the area.

Probabilistic Reasoning and Bayesian Belief Networks

Nelson Thornes Ltd | 1998 | ISBN-10: 1872474268

One of the most significant characteristics of an intelligent computer system is the ability to reason with judgmental knowledge. That is, how it uses heuristics, and improves its decision-making procedures in the light of examples which it is given. These heuristics are typically uncertain.

This book summarises some important work in the development of computational models of Bayesian belief networks, and their applications to medicine, transport and defence. It should be of interest to all those working in adaptive information processing, particularly in the allied fields of computer science, electrical engineering, physics and mathematics; researchers in the neurosciences and in those branches of psychology and philosophy concerned with neural modelling should also benefit from it. There are two companion volumes, "Neural Networks" and "Applications of Modern Heuristic Methods", which stand alone individually but together form a set treating a broad but integrated spectrum of techniques and tools for undertaking complex tasks.

Computational Learning and Probabilistic Reasoning

Wiley | 1996 | ISBN-10: 0471962791

This book is devoted to two interrelated techniques in solving some important problems in machine intelligence and pattern recognition, namely probabilistic reasoning and computational learning.

It is divided into four parts, the first of which describes several new inductive principles and techniques used in computational learning. The second part contains papers on Bayesian and Causal Belief networks. Part three includes chapters on case studies and descriptions of several hybrid systems and the final part describes some related theoretical work in the field of probabilistic reasoning. Real-life problems are used to demonstrate the practical and effective implementation of the relevant algorithms and techniques.

Machine Learning: Progress and Prospects

Royal Holloway | 1996

When did machine learning start? Maybe a good starting point is 1949 when Claude Shannon suggested a learning algorithm for chess playing programs. Or maybe we should go back to the 1930s when Ronald Fisher developed discriminant analysis - a type of learning where the problem is to construct a decision rule that separates two types of vector. Or could it be the 18th century when David Hume discussed the idea of induction? Or the 14th century when William of Ockham formulated the principle of "simplicity" known as "Ockham's razor" (Ockham, by the way, is a small village not far from Royal Holloway). Or it may be that like almost everything else in Western civilisation and culture, the origin of these ideas lies in the Mediterranean. After all, it was Aristotle who said that "we learn some things only by doing things".

The field of machine learning has been greatly influenced by other disciplines and the subject is in itself not a very homogeneous discipline but includes separate, overlapping subfields. There are many parallel lines of research in ML: inductive learning, neural networks, clustering, and theories of learning. They are all part of the more general field of machine learning.

Awards and Honorary positions

Videos

Reliable Diagnostics By Conformal Predictors

Royal Holloway University of London | Berlin | October 2015

This talk reviews a modern machine learning technique called Conformal Predictors. The talk outlines the basic ideas of Conformal Predictors and then illustrates the technique with applications to several medical problems.

Memorial Concert Celebrating the Life of Alexey Chervonenkis

Royal Holloway University of London | Egham | April 2015

This concert was presented to celebrate the life of Prof. Alexey Chervonenkis, who died tragically on 22nd September 2014. The memorial concert was performed at the Statistical Learning and Data Sciences Symposium 2015, with an introduction given by Prof. Alex Gammerman.

Conformal Prediction and its Applications

Yandex School of Data Analysis | Moscow | September 2013

In this talk, Prof. Gammerman introduces the concept of Conformal Prediction and its applications. In particular, a comparison of Conformal Prediction and the Bayesian approach is discussed.

Contact

A.Gammerman@rhul.ac.uk +44 1784 443434
  • Centre for Machine Learning,
  • Royal Holloway, University of London
  • Egham
  • Surrey TW20 0EX
  • United Kingdom