Instance-Based Learning: Locally Weighted Regression and Radial Basis Functions - Prof. Gr, Study notes of Computer Science

A lecture note from the course Intro. to Machine Learning (CSI 5325), taught by Greg Hamerly, focusing on instance-based learning techniques, specifically locally weighted regression and radial basis functions. The lecture covers the concepts behind these methods, their differences, and the training process. It also discusses the applicability and advantages of linear models in machine learning.

Typology: Study notes

Uploaded on 08/16/2009

Intro. to Machine Learning (CSI 5325)
Lecture 20: Instance-based learning

Greg Hamerly
Spring 2008
Some content from Tom Mitchell. (slide 1/16)

Outline (slide 2/16)
1 Locally weighted regression
2 Radial basis functions
3 Learning linear functions
4 Case-based reasoning
5 Lazy and eager learning

Radial Basis Function Networks (slide 5/16)
- Global approximation to the target function, in terms of a linear combination of local approximations
- Used, e.g., for image classification
- A different kind of neural network
- Closely related to distance-weighted regression, but "eager" instead of "lazy"

Radial Basis Function Networks (slide 6/16)
where a_i(x) are the attributes describing instance x, and

    f(x) = w_0 + sum_{u=1}^{k} w_u K_u(d(x_u, x))

One common choice for K_u(d(x_u, x)) is the Gaussian kernel

    K_u(d(x_u, x)) = exp( -d^2(x_u, x) / (2 sigma_u^2) )

Training Radial Basis Function Networks (slide 7/16)
Q1: what x_u to use for each kernel function K_u(d(x_u, x))?
- Scatter uniformly throughout the instance space
- Or use training instances (reflects the instance distribution)
- Or use the means of clusters (found by k-means, Gaussian EM, etc.)
Q2: how to train the weights (assume here Gaussian K_u)?
- First choose the variance (and perhaps the mean) for each K_u, e.g. using EM
- Then hold K_u fixed and train the linear output layer (efficient methods exist to fit linear functions)
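The training recipe above (choose centers x_u, here taken from the training instances themselves, then hold the kernels K_u fixed and fit the linear output layer by least squares) can be sketched in NumPy. This is an illustrative sketch, not code from the lecture: the function names and the toy sine-regression data are my own, and a single shared sigma is assumed for all kernels.

```python
import numpy as np

def gaussian_kernel(centers, X, sigma):
    # K[i, u] = exp(-d^2(x_u, x_i) / (2 sigma^2)): one Gaussian per center x_u
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_rbf(X, y, centers, sigma):
    # Hold the kernels fixed, then train the linear output layer (w_0 + sum w_u K_u)
    Phi = np.hstack([np.ones((len(X), 1)), gaussian_kernel(centers, X, sigma)])
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def predict_rbf(X, centers, sigma, w):
    Phi = np.hstack([np.ones((len(X), 1)), gaussian_kernel(centers, X, sigma)])
    return Phi @ w

# Toy 1-D regression; kernel centers drawn from the training instances
X = np.linspace(0, 2 * np.pi, 40).reshape(-1, 1)
y = np.sin(X).ravel()
centers = X[::5]  # every 5th training instance as a center x_u
w = fit_rbf(X, y, centers, sigma=0.8)
yhat = predict_rbf(X, centers, sigma=0.8, w=w)
```

Using k-means or Gaussian EM to place the centers, as the slide suggests, would only change how `centers` is computed; the linear fit is unchanged.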
Deriving linear weights (slide 10/16)
Take the derivative of E with respect to the weights:

    E(f^) = (1/2) (Xw - y)^T (Xw - y)
    dE/dw = X^T (Xw - y) = X^T X w - X^T y

Setting this equal to zero to find the minimum:

    X^T X w - X^T y = 0
    X^T X w = X^T y
    (X^T X)^{-1} X^T X w = (X^T X)^{-1} X^T y
    w^ = (X^T X)^{-1} X^T y

Easily computed in any numerical package (e.g. Matlab), with the majority of the cost being a (d+1) x (d+1) matrix inversion.

General applicability of linear models (slide 11/16)
Linear models are extremely popular because:
- they can be solved efficiently (see the previous slides)
- they can represent more complex functions through "basis expansions": construct a linear combination of nonlinear features
Many machine learning algorithms are related to some sort of linear model:
- the perceptron with a linear output (without thresholding)
- radial basis function networks
- locally weighted regression
- non-linear (e.g. polynomial) regression
- support vector machines
- etc.

Case-Based Reasoning (slide 12/16)
Instance-based learning can be applied even when X is not R^n; this requires a different "distance" metric. Case-based reasoning is instance-based learning applied to instances with symbolic logic descriptions:

    ((user-complaint error53-on-shutdown)
     (cpu-model PowerPC)
     (operating-system Windows)
     (network-connection PCIA)
     (memory 48meg)
     (installed-applications Excel Netscape VirusScan)
     (disk 1gig)
     (likely-cause ???))
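The closed form w^ = (X^T X)^{-1} X^T y from the derivation above can be checked numerically. A minimal NumPy sketch on synthetic data of my own choosing (in practice `np.linalg.solve` or `lstsq` is preferred over forming the explicit inverse, for numerical stability):

```python
import numpy as np

# Synthetic linear data: y = X w_true + small noise, bias column prepended
rng = np.random.default_rng(0)
X = np.hstack([np.ones((50, 1)), rng.normal(size=(50, 2))])  # d+1 = 3 columns
w_true = np.array([1.0, 2.0, -3.0])
y = X @ w_true + 0.01 * rng.normal(size=50)

# Closed form: w_hat = (X^T X)^{-1} X^T y
w_hat = np.linalg.inv(X.T @ X) @ X.T @ y

# Equivalent but more stable: solve the normal equations X^T X w = X^T y
w_solve = np.linalg.solve(X.T @ X, X.T @ y)
```

As the slide notes, the dominant cost is inverting (or factoring) the (d+1) x (d+1) matrix X^T X, which is cheap when the number of features d is small.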
Case-Based Reasoning in CADET (slide 15/16)
- Instances represented by rich structural descriptions
- Multiple cases retrieved (and combined) to form a solution to the new problem
- Tight coupling between case retrieval and problem solving
Bottom line:
- Simple matching of cases is useful for tasks such as answering help-desk queries
- This is an area of ongoing research

Lazy and Eager Learning (slide 16/16)
Lazy: wait for a query before generalizing
- k-nearest neighbor, case-based reasoning
Eager: generalize before seeing the query
- radial basis function networks, ID3, backpropagation, naive Bayes, ...
Does it matter?
- An eager learner must create a single global approximation
- A lazy learner can create many local approximations
- If they use the same hypothesis space H, the lazy learner can represent more complex functions (e.g., consider H = linear functions)
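The lazy/eager distinction can be made concrete with a minimal k-nearest-neighbor regressor: it does no work at training time beyond storing the data, and forms a local approximation from the k nearest stored instances only when a query arrives. A hypothetical sketch (names and toy data are mine, not from the lecture):

```python
import numpy as np

def knn_predict(X_train, y_train, x_query, k=3):
    # Lazy learning: all generalization is deferred to query time.
    # The prediction is a local approximation: the mean target value
    # of the k training instances nearest to the query point.
    dists = np.linalg.norm(X_train - x_query, axis=1)
    nearest = np.argsort(dists)[:k]
    return y_train[nearest].mean()

# "Training" is just storing the instances
X_train = np.array([[0.0], [1.0], [2.0], [3.0], [4.0]])
y_train = np.array([0.0, 1.0, 2.0, 3.0, 4.0])

pred = knn_predict(X_train, y_train, np.array([2.1]), k=3)
```

An eager learner would instead commit to one global model (e.g. a single linear fit) before any query is seen; the lazy learner above can behave like a different local model around each query point.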