From 2c9cee72a9b70d63fd825030b03dab7f908f63d1 Mon Sep 17 00:00:00 2001
From: Ozer TANRISEVER
Date: Mon, 8 Nov 2021 14:29:59 +0300
Subject: [PATCH] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index f8690c3..f6492c2 100644
--- a/README.md
+++ b/README.md
@@ -106,7 +106,7 @@ Motivation
 ===============

 Current semi-supervised learning approaches require strong assumptions, and perform badly if those
-assumptions are violated (e.g. low density assumption, clustering assumption). In some cases, they can perform worse than a supervised classifier trained only on the labeled exampels. Furthermore, the vast majority require O(N^2) memory.
+assumptions are violated (e.g. low density assumption, clustering assumption). In some cases, they can perform worse than a supervised classifier trained only on the labeled examples. Furthermore, the vast majority require O(N^2) memory.

 [(Loog, 2015)](http://arxiv.org/abs/1503.00269) has suggested an elegant framework (called Contrastive Pessimistic Likelihood Estimation / CPLE) which **only uses assumptions intrinsic to the chosen classifier**, and
 thus allows choosing likelihood-based classifiers which fit the domain / data