KDD Nuggets 95:20, e-mailed 95-08-25

Contents:
 * GPS, KDD-95 conference: A great success
 * R. Ravula, Query: KDD in Transportation Industry?
 * R. Hauser, Net Research Tool,
     http://www.infohaus.com/access/by-seller/Agent_Knowledgebase_Associates_Inc
 * R. Hauser, KD Mine featured in NetWatch,
     http://www.pulver.com/netwatch/topten/topten.htm
 * S. Schaal, Paper on incremental local learning,
     http://www.hip.atr.co.jp/~sschaal/pub/publications.html
 * S. Salzberg, Paper on Comparing Classifiers,
     http://www.cs.jhu.edu/salzberg/home.html
Job Ad:
 * Cavill, UK: seeking a graduate student for a NN / Data Mining (KDD) project

The KDD Nuggets is a moderated mailing list for news and information relevant to Data Mining and Knowledge Discovery in Databases (KDD). Please include a DESCRIPTIVE subject line in your submission. Nuggets frequency is approximately bi-weekly.

Back issues of Nuggets, a catalog of S*i*ftware (data mining tools), references, an FAQ, and other KDD-related information are available at the Knowledge Discovery Mine, URL http://info.gte.com/~kdd/

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Date: Thu, 24 Aug 1995 00:31:39 -0400
From: rhauser@exis.net (RICK HAUSER)
Subject: Congratulations!

I am pleased to inform you that your page at http://info.gte.com/~kdd/ is featured in the Intelligent Agent section of the NetWatch Top Ten:
http://www.pulver.com/netwatch/topten/topten.htm
Or you can go to the NetWatch site and follow the links for the Top Ten. The electronic version of USA Today recently included a link to the NetWatch site.

Sites are included because they are new, outstanding, unusual, or of particular interest. This is a dynamic list, with new sites being added and current sites being replaced periodically. If you make major or otherwise noteworthy changes to your site in the future, please bring that to my attention and I will consider including your site again.

Thank you,

Rick Hauser
President
Agent Knowledgebase Associates Inc.
http://www.infohaus.com/access/by-seller/Agent_Knowledgebase_Associates_Inc

>~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
via ML-LIST
From: Stefan Schaal
Date: Tue, 1 Aug 95 13:31:27 JST
Subject: Paper available on incremental local learning

http://www.hip.atr.co.jp/~sschaal/pub/publications.html
contains a number of papers, including the following new paper:

FROM ISOLATION TO COOPERATION: AN ALTERNATIVE VIEW OF A SYSTEM OF EXPERTS
Stefan Schaal and Christopher G. Atkeson
Submitted to NIPS'95

We introduce a constructive, incremental learning system for regression problems that models data by means of locally linear experts. In contrast to other approaches, the experts are trained independently and do not compete for data during learning. Only when a prediction for a query is required do the experts cooperate by blending their individual predictions. Each expert is trained by minimizing a penalized local cross-validation error using second-order methods. In this way, an expert is able to adjust the size and shape of the receptive field in which its predictions are valid, and also to adjust its bias on the importance of individual input dimensions. The size and shape adjustment corresponds to finding a local distance metric, while the bias adjustment accomplishes local dimensionality reduction. We derive asymptotic results for our method. In a variety of simulations we demonstrate the properties of the algorithm with respect to interference, learning speed, prediction accuracy, feature detection, and task-oriented incremental learning.

The paper is 8 pages long, requires 2.3 MB of disk space (uncompressed), and can be retrieved by FTP as:
ftp://ftp.cc.gatech.edu/people/sschaal/schaal-NIPS95.ps.gz
or accessed through:
http://www.cc.gatech.edu/fac/Stefan.Schaal/
http://www.hip.atr.co.jp/~sschaal/

Comments are most welcome.
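The abstract above describes locally linear experts that train independently and cooperate only when a query arrives. As a rough illustration of that general idea (not the authors' algorithm: the receptive-field widths here are fixed rather than learned by penalized local cross validation, and every name in the code is my own), a minimal sketch:

```python
import numpy as np

class LocalExpert:
    """One locally linear model with a fixed Gaussian receptive field."""
    def __init__(self, center, width):
        self.center = np.asarray(center, dtype=float)
        self.width = float(width)   # receptive-field radius (fixed in this sketch)
        self.coef = None            # [slopes..., intercept]

    def activation(self, x):
        # How strongly this expert "claims" input x.
        d = np.asarray(x, dtype=float) - self.center
        return np.exp(-0.5 * np.dot(d, d) / self.width ** 2)

    def fit(self, X, y):
        # Each expert trains independently: a least-squares fit in which
        # every example is weighted by this expert's own activation.
        w = np.array([self.activation(x) for x in X])
        A = np.column_stack([X, np.ones(len(X))])
        W = np.diag(w)
        self.coef, *_ = np.linalg.lstsq(W @ A, W @ y, rcond=None)

    def predict(self, x):
        return self.coef[:-1] @ np.asarray(x, dtype=float) + self.coef[-1]

def blend_predict(experts, x):
    # Experts cooperate only at query time: activation-weighted average
    # of their individual predictions.
    acts = np.array([e.activation(x) for e in experts])
    preds = np.array([e.predict(x) for e in experts])
    return float(acts @ preds / acts.sum())
```

For example, three experts centered at 0.2, 0.5, and 0.8 fitted to a linear target all recover the same local model, so the blended prediction matches the target exactly; on curved targets each expert captures its own local slope.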
>~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
From: Steven Salzberg
Date: Fri, 28 Jul 95 15:50:25 EDT
Subject: new papers available via WWW; comments and feedback sought

The following paper is newly available, and comments from other machine learning researchers would be greatly appreciated. To retrieve it, go to http://www.cs.jhu.edu/salzberg/home.html. A number of other recent and not-so-recent papers are also available at that site.

Title: On Comparing Classifiers: A Critique of Current Research and Methods
Author: Steven Salzberg

Abstract: Experimental machine learning research needs to scrutinize its approach to experimental design. If not done very carefully, comparative studies of classification algorithms can easily produce statistically invalid conclusions. This paper describes several phenomena that can, if ignored, invalidate an experimental comparison. It also divides machine learning research into several different types and discusses why comparative analysis is more important for some than for others.

>~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Subject: PhD at Cranfield University
From: CAVILL@rmcs.cranfield.ac.uk
Date: Wed, 16 Aug 1995 16:40:31 +0100

3-YEAR PROJECT - NEURAL NETWORK / DATA MINING (KDD)

We are seeking an honours graduate to study knowledge discovery in large databases using neural networks in the Computing and Information Systems Management Group (CISMG). Ideally you should hold, or expect to be awarded, a good honours degree or MSc in Computing Science or a closely related discipline. The successful student will register with the University for a PhD degree, and the post will attract a non-taxable stipend of £7,000 p.a., in line with current EPSRC CASE award rates.
Interested applicants should apply in writing, including a CV and the names of two referees, to:

Ms M L Vaughn
CISMG
School of Defence Management
Cranfield University
RMCS Shrivenham
Swindon, Wilts SN6 8LA

Closing date: 2nd September 1995

Cranfield University is an exempt charity offering a centre of excellence for research and education.

>~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
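One pitfall the Salzberg abstract above points to is that two classifiers evaluated on the same test set must be compared with a paired test, and comparisons among many algorithms need a multiple-comparison correction. As an illustrative sketch (the function name and interface are my own, not from the paper), here is an exact two-sided sign test on the discordant examples, i.e. those the two classifiers disagree on:

```python
from math import comb

def sign_test_p(n_a_only, n_b_only):
    """Two-sided exact sign test for two classifiers scored on the same
    test set.  n_a_only / n_b_only count the examples where exactly one
    classifier is correct.  Under the null hypothesis of equal accuracy,
    each such discordant example is a fair coin flip."""
    n = n_a_only + n_b_only
    k = max(n_a_only, n_b_only)
    # P(at least k successes in n fair flips), doubled for two sides.
    tail = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
    return min(1.0, 2.0 * tail)
```

For example, a 9-vs-1 split of ten discordant examples gives p = 22/1024, which is about 0.02, while a 5-vs-5 split gives p = 1.0. When k algorithms are compared pairwise, the significance threshold should additionally be divided by the number of pairs, k(k-1)/2 (a Bonferroni correction), before declaring any single difference significant.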