On PAC Learning Halfspaces in Non-interactive Local Privacy Model with Public Unlabeled Data

Jinyan Su, Jinhui Xu, Di Wang
Proceedings of The 14th Asian Conference on Machine Learning, PMLR 189:927-941, 2023.

Abstract

In this paper, we study the problem of PAC learning halfspaces in the non-interactive local differential privacy model (NLDP). To break through the barrier of exponential sample complexity, previous results studied a relaxed setting where the server has access to some additional public but unlabeled data. We continue in this direction. Specifically, we consider the problem under the standard setting instead of the large-margin setting studied before. Under different mild assumptions on the underlying data distribution, we propose two approaches, based on the Massart noise model and self-supervised learning, and show that it is possible to achieve sample complexities that are only linear in the dimension and polynomial in other terms for both private and public data, which significantly improves on previous results. Our methods could also be used for other private PAC learning problems.

Cite this Paper


BibTeX
@InProceedings{pmlr-v189-su23a,
  title     = {On PAC Learning Halfspaces in Non-interactive Local Privacy Model with Public Unlabeled Data},
  author    = {Su, Jinyan and Xu, Jinhui and Wang, Di},
  booktitle = {Proceedings of The 14th Asian Conference on Machine Learning},
  pages     = {927--941},
  year      = {2023},
  editor    = {Khan, Emtiyaz and Gonen, Mehmet},
  volume    = {189},
  series    = {Proceedings of Machine Learning Research},
  month     = {12--14 Dec},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v189/su23a/su23a.pdf},
  url       = {https://proceedings.mlr.press/v189/su23a.html},
  abstract  = {In this paper, we study the problem of PAC learning halfspaces in the non-interactive local differential privacy model (NLDP). To breach the barrier of exponential sample complexity, previous results studied a relaxed setting where the server has access to some additional public but unlabeled data. We continue in this direction. Specifically, we consider the problem under the standard setting instead of the large margin setting studied before. Under different mild assumptions on the underlying data distribution, we propose two approaches that are based on the Massart noise model and self-supervised learning and show that it is possible to achieve sample complexities that are only linear in the dimension and polynomial in other terms for both private and public data, which significantly improve the previous results. Our methods could also be used for other private PAC learning problems.}
}
Endnote
%0 Conference Paper
%T On PAC Learning Halfspaces in Non-interactive Local Privacy Model with Public Unlabeled Data
%A Jinyan Su
%A Jinhui Xu
%A Di Wang
%B Proceedings of The 14th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Emtiyaz Khan
%E Mehmet Gonen
%F pmlr-v189-su23a
%I PMLR
%P 927--941
%U https://proceedings.mlr.press/v189/su23a.html
%V 189
%X In this paper, we study the problem of PAC learning halfspaces in the non-interactive local differential privacy model (NLDP). To breach the barrier of exponential sample complexity, previous results studied a relaxed setting where the server has access to some additional public but unlabeled data. We continue in this direction. Specifically, we consider the problem under the standard setting instead of the large margin setting studied before. Under different mild assumptions on the underlying data distribution, we propose two approaches that are based on the Massart noise model and self-supervised learning and show that it is possible to achieve sample complexities that are only linear in the dimension and polynomial in other terms for both private and public data, which significantly improve the previous results. Our methods could also be used for other private PAC learning problems.
APA
Su, J., Xu, J. & Wang, D. (2023). On PAC Learning Halfspaces in Non-interactive Local Privacy Model with Public Unlabeled Data. Proceedings of The 14th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 189:927-941. Available from https://proceedings.mlr.press/v189/su23a.html.
