Proper Learning of k-term DNF Formulas from Satisfying Assignments

Maciej Liskiewicz, Matthias Lutter, Rüdiger Reischuk

Abstract

In certain applications only positive samples may be available to learn concepts of a class of interest.
Moreover, learning has to be done properly, i.e. the
hypothesis space has to coincide with the concept class,
and without false positives, i.e. the hypothesis must always be a subset of the true concept (one-sided error).
For the well-studied class of k-term DNF formulas it has long been known that
learning is difficult:
unless RP = NP, it is not feasible to learn k-term DNF formulas properly in a distribution-free sense, even if both positive and negative samples are available and even if false positives are allowed.

This paper constructs an efficient algorithm that, for arbitrary fixed k,
properly learns the class of k-term DNFs
with arbitrarily small relative error,
without false positives and from positive samples alone,
provided the samples are drawn from distributions such as the uniform or q-bounded ones.
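To make the setting concrete, the sketch below illustrates what a k-term DNF formula is and what a satisfying (positive) sample looks like. This is only an illustration of the concept class, not the paper's learning algorithm; the representation of terms as variable-to-value maps is an assumption chosen for clarity.

```python
# Illustrative sketch (not the paper's algorithm): a k-term DNF is a
# disjunction of at most k terms, each term a conjunction of literals.
# A term is represented here as a dict mapping variable index -> required value.

def term_satisfied(term, assignment):
    """True iff the assignment sets every literal of the term as required."""
    return all(assignment[var] == val for var, val in term.items())

def dnf_satisfied(terms, assignment):
    """True iff at least one term of the DNF is satisfied (disjunction)."""
    return any(term_satisfied(t, assignment) for t in terms)

# Example: the 2-term DNF (x0 AND NOT x1) OR x2 over three variables.
formula = [{0: True, 1: False}, {2: True}]

print(dnf_satisfied(formula, [True, False, False]))  # first term satisfied -> True
print(dnf_satisfied(formula, [False, True, False]))  # no term satisfied -> False
```

A positive sample in the paper's sense is exactly an assignment on which `dnf_satisfied` returns True; the learner sees only such assignments, drawn from the underlying distribution.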
Original language: English
Journal: Electronic Colloquium on Computational Complexity (ECCC)
Pages (from-to): 13 - 36
Number of pages: 24
ISSN: 1433-8092
Publication status: Published - 01.07.2017
