Data Science Decoded - Data Science #18 - The k-nearest neighbors algorithm (1951)

11/25/24 • 44 min


In the 18th episode we go over the original k-nearest neighbors paper: Fix, Evelyn; Hodges, Joseph L. (1951). "Discriminatory Analysis. Nonparametric Discrimination: Consistency Properties." USAF School of Aviation Medicine, Randolph Field, Texas. They introduce a nonparametric method for classifying a new observation z as belonging to one of two distributions, F or G, without assuming specific parametric forms. Using k-nearest-neighbor density estimates, the paper implements a likelihood-ratio test for classification and rigorously proves the method's consistency.
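The idea can be sketched in a few lines: estimate the densities of F and G at z from the distance to the k-th nearest sample point, then assign z by the likelihood ratio. This is a minimal one-dimensional illustration, not the paper's exact construction; the function names, sample sizes, and Gaussian test data are all assumptions made for the example.

```python
import numpy as np

def knn_density(z, sample, k):
    # 1-D k-NN density estimate: k / (n * 2 * r_k), where r_k is the
    # distance from z to its k-th nearest neighbor in the sample.
    dists = np.sort(np.abs(sample - z))
    r_k = dists[k - 1]
    return k / (len(sample) * 2 * r_k)

def classify(z, sample_f, sample_g, k=5, threshold=1.0):
    # Likelihood-ratio rule: assign z to F when f_hat(z) / g_hat(z)
    # exceeds the threshold, otherwise to G.
    f_hat = knn_density(z, sample_f, k)
    g_hat = knn_density(z, sample_g, k)
    return "F" if f_hat / g_hat > threshold else "G"

# Hypothetical test data: F ~ N(0, 1), G ~ N(3, 1).
rng = np.random.default_rng(0)
sample_f = rng.normal(0.0, 1.0, 200)
sample_g = rng.normal(3.0, 1.0, 200)
print(classify(0.2, sample_f, sample_g))  # point near F's mode
print(classify(2.8, sample_f, sample_g))  # point near G's mode
```

Taking a simple majority vote among the k nearest neighbors of the pooled samples, as the modern KNN classifier does, turns out to be equivalent to this ratio test with a threshold fixed by the sample sizes.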

The work is a precursor to the modern k-Nearest Neighbors (KNN) algorithm and established nonparametric approaches as viable alternatives to parametric methods. Its focus on consistency and data-driven learning influenced many modern machine learning techniques, including kernel density estimation and decision trees.

This paper's impact on data science is significant, introducing concepts like neighborhood-based learning and flexible discrimination.

These ideas underpin algorithms widely used today in healthcare, finance, and artificial intelligence, where robust and interpretable models are critical.

