A Statistical Implicative Analysis Based Algorithm and MMPC Algorithm for Detecting Multiple Dependencies

Elham Salehi, Jayashree Nyayachavadi, Robin Gras
Proceedings of the Fourth International Workshop on Feature Selection in Data Mining, PMLR 10:22-34, 2010.

Abstract

Discovering the dependencies among the variables of a domain from examples is an important problem in optimization. Many methods have been proposed for this purpose, but few large-scale evaluations have been conducted. Most of these methods are based on measures of conditional probability. Statistical implicative analysis (SIA) offers another perspective on dependencies. It is therefore worthwhile to compare the results obtained with this approach against one of the best methods currently available for the task: the MMPC heuristic. As SIA is not directly applicable to this problem, we designed an extension of it for our purpose. To compare the two approaches, we conducted a large number of experiments, varying parameters such as the number of dependencies, the number of variables involved, and the type of their distribution. The results show that the two methods are strongly complementary.
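
To make the contrast with conditional-probability measures concrete, the sketch below computes the classical SIA implication intensity for a candidate rule a -> b over binary variables, using the Poisson model for the number of counter-examples commonly cited in the SIA literature. This is a minimal illustration of the baseline measure under that assumption, not the extension proposed in the paper; the function name and example counts are hypothetical.

# A minimal sketch, assuming the standard SIA implication-intensity formula:
# for a rule a -> b, the observed number of counter-examples (a true, b false)
# is compared with what independence would predict, via a Poisson model.
from math import exp

def implication_intensity(n, n_a, n_not_b, n_counter):
    """Intensity of implication for a -> b (values near 1 suggest a strong rule).

    n         -- total number of examples
    n_a       -- examples where a holds
    n_not_b   -- examples where b does not hold
    n_counter -- observed counter-examples (a holds and b does not)
    """
    lam = n_a * n_not_b / n          # expected counter-examples under independence
    # P(N <= n_counter) for N ~ Poisson(lam), summing the pmf iteratively.
    cdf, term = 0.0, exp(-lam)
    for k in range(n_counter + 1):
        cdf += term
        term *= lam / (k + 1)
    return 1.0 - cdf                 # probability of more counter-examples by chance

# Hypothetical data: 1000 samples, a true 300 times, b false 400 times, and only
# 20 counter-examples -> intensity close to 1, i.e. a -> b is well supported.
print(implication_intensity(1000, 300, 400, 20))

By contrast, MMPC selects the candidate parents and children of a target variable by ranking association values from conditional independence tests, so the two measures probe different aspects of the same data.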

Cite this Paper


BibTeX
@InProceedings{pmlr-v10-salehi10a,
  title     = {A Statistical Implicative Analysis Based Algorithm and MMPC Algorithm for Detecting Multiple Dependencies},
  author    = {Salehi, Elham and Nyayachavadi, Jayashree and Gras, Robin},
  booktitle = {Proceedings of the Fourth International Workshop on Feature Selection in Data Mining},
  pages     = {22--34},
  year      = {2010},
  editor    = {Liu, Huan and Motoda, Hiroshi and Setiono, Rudy and Zhao, Zheng},
  volume    = {10},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hyderabad, India},
  month     = {21 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v10/salehi10a/salehi10a.pdf},
  url       = {https://proceedings.mlr.press/v10/salehi10a.html},
  abstract  = {Discovering the dependencies among the variables of a domain from examples is an important problem in optimization. Many methods have been proposed for this purpose, but few large-scale evaluations were conducted. Most of these methods are based on measurements of conditional probability. The statistical implicative analysis offers another perspective of dependencies. It is important to compare the results obtained using this approach with one of the best methods currently available for this task: the MMPC heuristic. As the SIA is not used directly to address this problem, we designed an extension of it for our purpose. We conducted a large number of experiments by varying parameters such as the number of dependencies, the number of variables involved or the type of their distribution to compare the two approaches. The results show strong complementarities of the two methods.}
}
APA
Salehi, E., Nyayachavadi, J. & Gras, R. (2010). A Statistical Implicative Analysis Based Algorithm and MMPC Algorithm for Detecting Multiple Dependencies. Proceedings of the Fourth International Workshop on Feature Selection in Data Mining, in Proceedings of Machine Learning Research 10:22-34. Available from https://proceedings.mlr.press/v10/salehi10a.html.
