A Statistical Implicative Analysis Based Algorithm and MMPC Algorithm for Detecting Multiple Dependencies

Elham Salehi, Jayashree Nyayachavadi, Robin Gras
Proceedings of the Fourth International Workshop on Feature Selection in Data Mining, PMLR 10:22-34, 2010.

Abstract

Discovering the dependencies among the variables of a domain from examples is an important problem in optimization. Many methods have been proposed for this purpose, but few large-scale evaluations have been conducted. Most of these methods are based on measurements of conditional probability. Statistical implicative analysis (SIA) offers another perspective on dependencies. It is important to compare the results obtained using this approach with one of the best methods currently available for this task: the MMPC heuristic. As SIA has not been used directly to address this problem, we designed an extension of it for our purpose. We conducted a large number of experiments, varying parameters such as the number of dependencies, the number of variables involved, and the type of their distribution, to compare the two approaches. The results show strong complementarities between the two methods.
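The SIA approach referenced above scores a candidate rule a → b by how surprisingly few counterexamples (cases where a holds but b does not) it has, compared with what independence would predict. The following is a minimal sketch of the standard implication-intensity measure from Gras's statistical implicative analysis, using the usual normal approximation of the Poisson counterexample count; the function name and parameter names are illustrative, and this is not the paper's specific extension of SIA.

```python
import math

def implication_intensity(n, n_a, n_not_b, n_counter):
    """Implication intensity of the rule a -> b in the SIA sense.

    n         : total number of examples
    n_a       : number of examples where a holds
    n_not_b   : number of examples where b does not hold
    n_counter : number of counterexamples (a holds, b does not)
    """
    # Expected number of counterexamples if a and b were independent.
    lam = n_a * n_not_b / n
    if lam == 0:
        return 1.0
    # Implication index: standardized deviation from independence.
    q = (n_counter - lam) / math.sqrt(lam)
    # Normal approximation of the Poisson tail: the intensity is the
    # probability of observing at least this few counterexamples by chance.
    return 0.5 * math.erfc(q / math.sqrt(2))
```

A rule with far fewer counterexamples than expected (q strongly negative) gets an intensity close to 1; when the observed count matches the independence expectation exactly (q = 0), the intensity is 0.5, i.e. no evidence of implication.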

Cite this Paper


BibTeX
@InProceedings{pmlr-v10-salehi10a,
  title     = {A Statistical Implicative Analysis Based Algorithm and MMPC Algorithm for Detecting Multiple Dependencies},
  author    = {Elham Salehi and Jayashree Nyayachavadi and Robin Gras},
  pages     = {22--34},
  year      = {2010},
  editor    = {Huan Liu and Hiroshi Motoda and Rudy Setiono and Zheng Zhao},
  volume    = {10},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hyderabad, India},
  month     = {21 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v10/salehi10a/salehi10a.pdf},
  url       = {http://proceedings.mlr.press/v10/salehi10a.html},
  abstract  = {Discovering the dependencies among the variables of a domain from examples is an important problem in optimization. Many methods have been proposed for this purpose, but few large-scale evaluations were conducted. Most of these methods are based on measurements of conditional probability. The statistical implicative analysis offers another perspective of dependencies. It is important to compare the results obtained using this approach with one of the best methods currently available for this task: the MMPC heuristic. As the SIA is not used directly to address this problem, we designed an extension of it for our purpose. We conducted a large number of experiments by varying parameters such as the number of dependencies, the number of variables involved or the type of their distribution to compare the two approaches. The results show strong complementarities of the two methods.}
}
Endnote
%0 Conference Paper
%T A Statistical Implicative Analysis Based Algorithm and MMPC Algorithm for Detecting Multiple Dependencies
%A Elham Salehi
%A Jayashree Nyayachavadi
%A Robin Gras
%B Proceedings of the Fourth International Workshop on Feature Selection in Data Mining
%C Proceedings of Machine Learning Research
%D 2010
%E Huan Liu
%E Hiroshi Motoda
%E Rudy Setiono
%E Zheng Zhao
%F pmlr-v10-salehi10a
%I PMLR
%J Proceedings of Machine Learning Research
%P 22--34
%U http://proceedings.mlr.press
%V 10
%W PMLR
%X Discovering the dependencies among the variables of a domain from examples is an important problem in optimization. Many methods have been proposed for this purpose, but few large-scale evaluations were conducted. Most of these methods are based on measurements of conditional probability. The statistical implicative analysis offers another perspective of dependencies. It is important to compare the results obtained using this approach with one of the best methods currently available for this task: the MMPC heuristic. As the SIA is not used directly to address this problem, we designed an extension of it for our purpose. We conducted a large number of experiments by varying parameters such as the number of dependencies, the number of variables involved or the type of their distribution to compare the two approaches. The results show strong complementarities of the two methods.
RIS
TY  - CPAPER
TI  - A Statistical Implicative Analysis Based Algorithm and MMPC Algorithm for Detecting Multiple Dependencies
AU  - Elham Salehi
AU  - Jayashree Nyayachavadi
AU  - Robin Gras
BT  - Proceedings of the Fourth International Workshop on Feature Selection in Data Mining
PY  - 2010/05/26
DA  - 2010/05/26
ED  - Huan Liu
ED  - Hiroshi Motoda
ED  - Rudy Setiono
ED  - Zheng Zhao
ID  - pmlr-v10-salehi10a
PB  - PMLR
SP  - 22
DP  - PMLR
EP  - 34
L1  - http://proceedings.mlr.press/v10/salehi10a/salehi10a.pdf
UR  - http://proceedings.mlr.press/v10/salehi10a.html
AB  - Discovering the dependencies among the variables of a domain from examples is an important problem in optimization. Many methods have been proposed for this purpose, but few large-scale evaluations were conducted. Most of these methods are based on measurements of conditional probability. The statistical implicative analysis offers another perspective of dependencies. It is important to compare the results obtained using this approach with one of the best methods currently available for this task: the MMPC heuristic. As the SIA is not used directly to address this problem, we designed an extension of it for our purpose. We conducted a large number of experiments by varying parameters such as the number of dependencies, the number of variables involved or the type of their distribution to compare the two approaches. The results show strong complementarities of the two methods.
ER  -
APA
Salehi, E., Nyayachavadi, J. & Gras, R. (2010). A Statistical Implicative Analysis Based Algorithm and MMPC Algorithm for Detecting Multiple Dependencies. Proceedings of the Fourth International Workshop on Feature Selection in Data Mining, in PMLR 10:22-34.