School of Computer Science


MLO Group / Dr Gavin Brown


Who am I? What do I do?

I am a Senior Lecturer (Associate Professor) in the Machine Learning and Optimization Group. My research interests can be summarised as: feature selection/extraction with information theoretic methods, Markov blanket algorithms, ensemble learning (aka multiple classifier systems), and online learning. I apply all of these in two domains: systems biology and adaptive compiler optimisation.

Or, in less technical jargon... click here.

Tel: 0161 275 6190


Gavin Brown
Highlighted research publications:
Beyond Fano's Inequality: Bounds on the Optimal F-Score, BER, and Cost-Sensitive Risk and Their Implications
Ming-Jie Zhao, Narayanan Edakunni, Adam Pocock and Gavin Brown
Journal of Machine Learning Research (14) 2013.

Conditional Likelihood Maximisation: A Unifying Framework for Information Theoretic Feature Selection
Gavin Brown, Adam Pocock, Ming-Jie Zhao and Mikel Lujan
Journal of Machine Learning Research (13) 2012.

Informative Priors for Markov Blanket Discovery
Adam Pocock, Mikel Lujan and Gavin Brown
International Conference on Artificial Intelligence and Statistics (AISTATS), 2012.

Boosting as a Product of Experts
Narayanan Edakunni, Gavin Brown and Tim Kovacs
Conference on Uncertainty in Artificial Intelligence (UAI), 2011.


Recent Activities:



NEWS (14/3/2014):
For all you 7-year-olds out there, I just appeared on Children's BBC explaining what 'artificial intelligence' is. See the full clip HERE.


 
NEWS (Nov 2013): I am co-chairing SPR 2014 along with Marco Loog. Please submit your best papers! Note the special journal spotlight track, for articles published in JMLR/PAMI/TNNLS in the past year. Submissions are open until March 1st, 2014.

VERY GOOD NEWS!
Very pleased to announce that my PhD student Adam Pocock has just won the BCS Distinguished Dissertation Award 2013! Read a BCS press release here! Read his thesis HERE! The judging panel said of the thesis: "The judges were very impressed by the fact that the thesis not only makes a major advance of the state of the art, but also illustrates the context of the problem and motivates the work in a way that a general audience would be able to understand. One reviewer observed that he would use it as a standard reference in the area, and recommend it to his students as a model to be aspired to."

VERY GOOD NEWS!
One of my undergraduate project students, Laura Howarth-Kirke, just won the award for Best Undergraduate Science, Engineering and Technology Student in the UK! Very proud of her. Read the full story HERE. As a nice follow-up, I won Best SET Lecturer. Very nice indeed!


NEWS (July 2013): I presented a lecture for children on the topic of "Making Computers Think". The version below is on YouTube, but a higher-resolution copy is available if you email me.



NEWS (July 2013): New JMLR paper "Beyond Fano's inequality: bounds on the optimal F-score, balanced error rate, and cost-sensitive risk using conditional entropy and their implications". Though the paper is quite dense, the basic idea is this: if you want to optimise the F-score or Balanced Error Rate, don't use infomax approaches like maximum likelihood or mutual information! Our main result shows that the infomax principle is NOT a proper criterion for this task, and can cause a catastrophic failure for the F-score in particular!
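To make the practical point concrete, here is a minimal sketch (my illustration, not code from the paper) using numpy and scikit-learn: on a heavily imbalanced problem, the usual accuracy-oriented rule of thresholding a maximum-likelihood model's posterior at 0.5 can score poorly on the F-measure, while simply re-tuning the threshold for F-score does much better. All dataset settings and numbers below are illustrative.

    # Illustrative only: compare the accuracy-optimal threshold (0.5) with an
    # F-score-tuned threshold for a maximum-likelihood (logistic regression)
    # model on heavily imbalanced data.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score, f1_score
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=20000, n_features=20,
                               weights=[0.97, 0.03], random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)  # maximum-likelihood fit
    p = clf.predict_proba(X_te)[:, 1]

    pred_acc = (p >= 0.5).astype(int)   # accuracy-optimal decision rule

    # F-score-oriented rule: sweep the threshold and keep the best F1
    # (tuned on the same split here, purely for illustration)
    ts = np.linspace(0.01, 0.99, 99)
    best_t = max(ts, key=lambda t: f1_score(y_te, (p >= t).astype(int)))
    pred_f1 = (p >= best_t).astype(int)

    print("threshold 0.50: acc=%.3f  F1=%.3f"
          % (accuracy_score(y_te, pred_acc), f1_score(y_te, pred_acc)))
    print("threshold %.2f: acc=%.3f  F1=%.3f"
          % (best_t, accuracy_score(y_te, pred_f1), f1_score(y_te, pred_f1)))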

NEWS (June 2013): New grant. I am Co-I, along with Mikel Lujan (PI), on a new EU-funded project, AXLE: Analytics on Xtremely Large European Databases. The aim is to study large-scale data analytics (including machine learning algorithms as a special case) on very large databases; we want to get to the point where 10TB is considered "normal". This is in collaboration with various EU partners, in particular Janez Demsar and colleagues, who developed the Orange Python toolkit.

NEWS (June 2013): I taught at a Summer School in Machine Learning in Spain. Great fun, great people.


Richard Stapenhurst PhD
My PhD student Richard recently finished his thesis. You can see it here.

Adam Pocock PhD
My PhD student Adam Pocock recently finished his thesis. You can see it here.

ICML 2012
We presented our work on feature selection at ICML 2012, in the journal-track special sessions. You can watch the talk here. And thanks to Charles Sutton, here is the paper's "wordcloud"...
Talks on the JMLR paper
I have been touring somewhat, giving talks about our recent JMLR paper - thanks for the invites everyone!
Visits so far: Surrey Elec Eng, Birmingham Computer Science, Manchester Medical School. Next scheduled talk: Oxford (Mathematics Dept) in May.

REUNITE project featured by BBC World Service
The BBC's flagship technology programme "Click" recently featured our project. You can hear the podcast here, or watch the video...


Papers accepted: to AISTATS, "Informative Priors for Markov Blanket Discovery", and to UAI, "Boosting as a Product of Experts".

AstraZeneca MSc Research Bursaries
I am currently investigating biomarkers for lung cancer analysis with AstraZeneca Research. AZ have sponsored our students this year, under their predictive safety science initiative.

Invited Doctoral Lecture Course University of Cagliari, Sardinia
I delivered a series of 8 invited lectures in Cagliari - see the course webpages here.

Invited lecture at IEEE symposium
I am delivering a keynote at the 2011 IEEE Symposium on Computational Intelligence, on the topic of computational intelligence in dynamic and uncertain environments.

New book chapter - Ensemble Learning
I wrote an invited book chapter for the Springer Encyclopedia of Machine Learning.
You can also see the typeset article here.

"The study of ensemble methods, with model outputs considered for their abstract properties rather than the specifics of the algorithm which produced them, allows for a wide impact across many fields of study. If we can understand precisely why, when, and how particular ensemble methods can be applied successfully, we would have made progress toward a powerful new tool for Machine Learning: the ability to automatically exploit the strengths and weaknesses of different learning systems."

New PhD (Dec 2010): Manuela Zanda completed her PhD, entitled "A Probabilistic Perspective on Ensemble Diversity". A copy of her thesis can be downloaded here.

New Grant (9th Sept 2010) - EPSRC KTA, Reuniting Refugees with Computational Intelligence.
REUNITE is a research project aiming to utilise crowdsourcing and machine learning techniques to help reunite those separated by conflict and natural disaster. Imagine the following scenario. A disaster occurs in a remote part of the developing world. The local population are forced to flee their homes. Many are separated from their family and friends. With no mobile or Internet communication, finding loved ones in the aftermath of a disaster is incredibly difficult. Relief organisations go to great lengths to help people find those they are missing. The system we are developing aims to make this process easier, faster and more secure.

Article in THE (3 June 2010): I had an article about computer science education in the Times Higher this week. Are you looking for the Computing at School group? Or for the Manchester Schools' Animation Competition? The Animation Competition is an effort led by Toby Howard to encourage schoolchildren to learn the concepts of computational thinking, and I strongly encourage all to take note!

Invited plenary talk at MCS 2010
I gave an invited talk at the International Workshop on Multiple Classifier Systems 2010, entitled "Some Thoughts at the Interface of Ensemble Methods and Feature Selection". It was repeated with (slightly) adapted slides for Microsoft Research Cairo.

New PhD (Nov 2009): Amir Ahmad completed his PhD, entitled "Data Transformations for Decision Tree Ensembles". A copy of his thesis can be downloaded here.

AISTATS 2009 paper - Feature Selection with Information Theory
The traditional approach to so-called filter methods in feature selection is to construct a criterion to measure the utility of any given feature. The more sophisticated methods penalise feature-feature correlations ('redundancy') with various penalty terms. The last 15 years have produced a flood of papers advocating different penalty terms. My recent work shows that the vast majority of these can be naturally derived from a single framework, using multivariate information theory. The work reveals that there exists a natural, smooth space of feature selection criteria, where each paper over the last 15 years corresponds to one point. Most of the space has never been explored. See the AISTATS 2009 paper for details.
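As a rough illustration of what such a criterion space looks like (my sketch, not code from the paper), the fragment below scores each candidate feature by its relevance to the class, minus a weighted redundancy term, plus a weighted class-conditional term, and selects features greedily; different weight settings correspond to different published criteria. It assumes discrete, integer-coded features and estimates mutual information from empirical counts.

    # Greedy forward selection with a parameterised filter criterion:
    #   J(X_k) = I(X_k;Y) - beta * sum_{j in S} I(X_k;X_j)
    #                     + gamma * sum_{j in S} I(X_k;X_j | Y)
    # Different (beta, gamma) choices correspond to different criteria from
    # the literature; beta = gamma = 0 is plain mutual-information ranking.
    import numpy as np

    def mutual_info(a, b):
        """I(A;B) in nats, from the empirical joint counts of two discrete arrays."""
        _, ai = np.unique(a, return_inverse=True)
        _, bi = np.unique(b, return_inverse=True)
        joint = np.zeros((ai.max() + 1, bi.max() + 1))
        np.add.at(joint, (ai, bi), 1)
        p = joint / joint.sum()
        pa = p.sum(axis=1, keepdims=True)
        pb = p.sum(axis=0, keepdims=True)
        nz = p > 0
        return float((p[nz] * np.log(p[nz] / (pa @ pb)[nz])).sum())

    def cond_mutual_info(a, b, y):
        """I(A;B|Y) = sum_y p(y) * I(A;B | Y=y)."""
        return sum((y == v).mean() * mutual_info(a[y == v], b[y == v])
                   for v in np.unique(y))

    def select_features(X, y, k, beta=1.0, gamma=1.0):
        """Pick k feature indices greedily using the criterion above."""
        selected, remaining = [], list(range(X.shape[1]))
        while len(selected) < k and remaining:
            def score(f):
                rel = mutual_info(X[:, f], y)
                red = sum(mutual_info(X[:, f], X[:, j]) for j in selected)
                cnd = sum(cond_mutual_info(X[:, f], X[:, j], y) for j in selected)
                return rel - beta * red + gamma * cnd
            best = max(remaining, key=score)
            selected.append(best)
            remaining.remove(best)
        return selected

For example, select_features(X, y, 10) returns the indices of ten features under one particular point in the criterion space; varying beta and gamma moves you to other points.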

Invited plenary at UK-KDD 2009 - Feature Selection by Filters, a Unifying Perspective
I gave an invited talk at UK-KDD 2009.

New Grant - Dynamic Ensemble Techniques (EPSRC grant EP/F023855/1)
With colleagues at Bristol, I am investigating how dynamic ensemble techniques can tackle multi-step (control) and nonstationary problems. This is in collaboration with Tim Kovacs, James Marshall and Jeremy Wyatt, conducted under our EPSRC funded ADEPT project.

New Grant - Machine Learning for Multi-Core Computers (EPSRC grant EP/G000662/1)
The computer industry is undergoing the "multi-core" revolution. When you buy a PC off the shelf these days, it is invariably "dual-core" or "quad-core", and the number of CPU "cores" executing in parallel is expected to grow into the hundreds and thousands. The problem of coordinating these cores is challenging and unsolved. With Mikel Lujan and Jeremy Singer I am working on applying machine learning to this problem, conducted under our EPSRC funded iTLS project.

IEEE TNN paper on Sparse Distributed Memories
In a project with Steve Furber I found that sparse distributed memory models, like the correlation matrix memories of Willshaw and Kanerva, could give significant insights into the design of fault-tolerant computer architectures. This resulted in an IEEE TNN paper, available here.
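For readers unfamiliar with these models, the toy sketch below (my illustration, not the paper's experiments) shows the core mechanism of a binary, Willshaw-style correlation matrix memory: sparse binary key/value pairs are stored by OR-ing their outer products into a weight matrix, and recalled by a matrix-vector product thresholded at the number of active key bits. The graceful degradation of such memories when weights fail is the link to fault-tolerant architectures.

    # Toy binary correlation matrix memory (Willshaw-style); illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)
    n_key, n_val, n_active, n_pairs = 256, 256, 8, 30

    def sparse_vec(n, k):
        """Binary vector of length n with exactly k active bits."""
        v = np.zeros(n, dtype=np.uint8)
        v[rng.choice(n, k, replace=False)] = 1
        return v

    keys = [sparse_vec(n_key, n_active) for _ in range(n_pairs)]
    vals = [sparse_vec(n_val, n_active) for _ in range(n_pairs)]

    # Store: binary OR of the outer products of each (value, key) pair
    W = np.zeros((n_val, n_key), dtype=np.uint8)
    for k, v in zip(keys, vals):
        W |= np.outer(v, k)

    def recall(key):
        """Sum the weights addressed by the key; threshold at the number
        of active key bits."""
        sums = W.astype(int) @ key.astype(int)
        return (sums >= key.sum()).astype(np.uint8)

    wrong = sum(np.any(recall(k) != v) for k, v in zip(keys, vals))
    print("imperfectly recalled pairs:", wrong, "of", n_pairs)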

Ensemble Learning
I worked for a long while on the issue of diversity in ensembles, with Jeremy Wyatt. A summary of the work can be found on this page. A slightly less optimistic (but rather insightful) take on the field can be found here.
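One classic result that makes "diversity" precise for averaging regression ensembles is the ambiguity decomposition of Krogh and Vedelsby: the ensemble's squared error equals the average member error minus the average spread of the members around the ensemble prediction. The tiny numerical check below (my illustration, using numpy) verifies the identity.

    # Ambiguity decomposition, checked numerically:
    #   (fbar - y)^2 = mean_i (f_i - y)^2 - mean_i (f_i - fbar)^2
    import numpy as np

    rng = np.random.default_rng(1)
    y = rng.normal(size=1000)                           # targets
    preds = y + rng.normal(scale=0.5, size=(5, 1000))   # five noisy ensemble members
    fbar = preds.mean(axis=0)                           # the averaging ensemble

    ens_err = np.mean((fbar - y) ** 2)
    avg_member_err = np.mean((preds - y) ** 2)
    ambiguity = np.mean((preds - fbar) ** 2)            # the 'diversity' term

    print(ens_err, avg_member_err - ambiguity)          # identical up to rounding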

Image Feature Extraction
I did a nice project with Honda several years ago on image feature extraction, which turned into a patent; I still follow up small avenues on this occasionally. Throughout this time I have maintained an interest in evolutionary speciation and optimisation, which has spun off into several useful collaborations.