Distribution-Independent Evolvability of Linear Threshold Functions

Vitaly Feldman
Proceedings of the 24th Annual Conference on Learning Theory, PMLR 19:253-272, 2011.

Abstract

Valiant's model of evolvability casts the evolutionary process of acquiring useful functionality as a restricted form of learning from random examples (Valiant, 2009). Linear threshold functions and their various subclasses, such as conjunctions and decision lists, play a fundamental role in learning theory, and hence their evolvability has been the primary focus of research on Valiant's framework. One of the main open problems regarding the model is whether conjunctions are evolvable distribution-independently (Feldman and Valiant, 2008). We show that the answer is negative. Our proof is based on a new combinatorial parameter of a concept class that lower-bounds the complexity of learning from correlations. We contrast the lower bound with a proof that linear threshold functions having a non-negligible margin on the data points are evolvable distribution-independently via a simple mutation algorithm. Our algorithm relies on a non-linear loss function being used to select the hypotheses, instead of the 0-1 loss used in Valiant's original definition. The proof of evolvability requires that the loss function satisfy several mild conditions that are, for example, satisfied by the quadratic loss function studied in several other works (Michael, 2007; Feldman, 2009b; Valiant, 2011). An important property of our evolution algorithm is monotonicity: the algorithm guarantees evolvability without any decrease in performance. Previously, monotone evolvability had been shown only for conjunctions with quadratic loss (Feldman, 2009b) or when the distribution on the domain is severely restricted (Michael, 2007; Feldman, 2009b; Kanade et al., 2010).
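The evolution algorithm described in the abstract can be conveyed by a minimal sketch. The Python loop below is a hypothetical illustration, not the paper's construction: it maintains a weight vector, proposes small random mutations, and accepts a mutation only when it lowers the empirical quadratic loss, so performance never decreases (monotonicity). All names and parameters here (mutate, evolve, the Gaussian mutation step, the acceptance tolerance) are assumptions made for illustration.

import numpy as np

def predict(w, X):
    # Real-valued prediction clipped to [-1, 1]; the sign of X @ w is the
    # underlying linear threshold function.
    return np.clip(X @ w, -1.0, 1.0)

def quadratic_loss(w, X, y):
    # Empirical quadratic loss against {-1, +1} labels.
    return np.mean((predict(w, X) - y) ** 2)

def mutate(w, step, n_mutations, rng):
    # Candidate mutations: small random perturbations of the weights
    # (a hypothetical neighborhood; the paper's mutator differs in detail).
    return [w + step * rng.standard_normal(w.shape) for _ in range(n_mutations)]

def evolve(X, y, generations=200, step=0.05, n_mutations=20, tol=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])  # start from the trivial all-zero hypothesis
    for _ in range(generations):
        current = quadratic_loss(w, X, y)
        candidates = mutate(w, step, n_mutations, rng)
        losses = [quadratic_loss(c, X, y) for c in candidates]
        best = int(np.argmin(losses))
        # Monotone selection: keep a mutation only if it is beneficial,
        # so the loss is non-increasing across generations.
        if losses[best] <= current - tol:
            w = candidates[best]
    return w

On linearly separable data with a non-negligible margin, sign(predict(evolve(X, y), X)) should approach the target labels. The paper's actual guarantees depend on a carefully specified mutator and selection rule; this loop conveys only the overall structure, namely monotone selection among random mutations scored by a non-linear loss.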

Cite this Paper


BibTeX
@InProceedings{pmlr-v19-feldman11b,
  title     = {Distribution-Independent Evolvability of Linear Threshold Functions},
  author    = {Feldman, Vitaly},
  booktitle = {Proceedings of the 24th Annual Conference on Learning Theory},
  pages     = {253--272},
  year      = {2011},
  editor    = {Kakade, Sham M. and von Luxburg, Ulrike},
  volume    = {19},
  series    = {Proceedings of Machine Learning Research},
  address   = {Budapest, Hungary},
  month     = {09--11 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v19/feldman11b/feldman11b.pdf},
  url       = {https://proceedings.mlr.press/v19/feldman11b.html}
}
