PRIVACY-PRESERVING MACHINE LEARNING IN CYBERSECURITY

Authors

  • Pranav Mani Tripathi, Senior Risk & Compliance Analyst, Taxbit, USA

Keywords:

Privacy-preserving Machine Learning, Federated Learning, Cybersecurity, Data Privacy, Compliance, Homomorphic Encryption, Differential Privacy

Abstract

The growing complexity of cybersecurity threats has made machine learning indispensable for defending digital systems. A significant obstacle, however, is protecting the sensitive data these models rely on while complying with regulations such as the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA). This paper examines privacy-preserving techniques, including differential privacy, which adds calibrated noise to data or analysis results, and secure multi-party computation, which lets multiple parties jointly compute a result without exposing their inputs, to safeguard the data used to train machine learning models for protecting computer systems. It explores how these techniques are applied in cybersecurity, showing how they mitigate threats, support regulatory compliance, and preserve model performance. Although these methods address critical privacy concerns, challenges remain, including added computational overhead and the difficulty of integrating them with existing systems. This paper aims to advance the adoption of privacy-preserving machine learning toward a secure and compliant digital ecosystem, offering practical guidance for both practitioners and regulators.
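The differential-privacy technique the abstract alludes to (adding calibrated noise to analysis results) can be illustrated with a minimal sketch of the Laplace mechanism applied to a count query. The function names (`laplace_noise`, `dp_count`) and the alert-count scenario are illustrative assumptions, not part of the paper:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, predicate, epsilon: float) -> float:
    """Differentially private count query (Laplace mechanism).

    A count has sensitivity 1: adding or removing a single record
    changes the result by at most 1, so noise is drawn at scale
    1/epsilon to satisfy epsilon-differential privacy.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Example: count intrusion alerts above a severity threshold without
# revealing the exact tally of any one host's contribution.
alert_severities = [3, 12, 7, 25, 9]
noisy = dp_count(alert_severities, lambda s: s > 5, epsilon=1.0)
```

A smaller `epsilon` yields stronger privacy but a noisier answer; in security analytics this trade-off is typically tuned against how much accuracy the downstream detection model can tolerate.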

References

Q. Yang, Y. Liu, T. Chen, and Y. Tong, Federated Machine Learning: Concept and Applications, ACM Transactions on Intelligent Systems and Technology, 10(2), 2019, 12. Retrieved from https://doi.org/10.1145/3298981.

General Data Protection Regulation (GDPR), Regulation (EU) 2016/679 of the European Parliament and of the Council, Official Journal of the European Union, 2016. Retrieved from https://eur-lex.europa.eu/eli/reg/2016/679/oj.

California Consumer Privacy Act (CCPA), California Civil Code §1798.100 - §1798.199, State of California Department of Justice, 2018. Retrieved from https://oag.ca.gov/privacy/ccpa.

S. Truex, L. Liu, and M. E. Gursoy, Demystifying Membership Inference Attacks in Machine Learning as a Service, IEEE Transactions on Services Computing, 14(6), 2019, 1939-1952. Retrieved from https://doi.org/10.1109/TSC.2019.2897553.

R. C. Geyer, T. Klein, and M. Nabi, Differentially Private Federated Learning: A Client-Level Perspective, Advances in Neural Information Processing Systems, 30, 2017, 1-11. Retrieved from https://arxiv.org/abs/1712.07557.

M. Abadi, A. Chu, I. Goodfellow, H. B. McMahan, I. Mironov, K. Talwar, and L. Zhang, Deep Learning with Differential Privacy, Proceedings of the 2016 ACM SIGSAC Conference on Computer and Communications Security, 2016, 308-318. Retrieved from https://doi.org/10.1145/2976749.2978318.

K. Bonawitz, V. Ivanov, B. Kreuter, A. Marcedone, H. B. McMahan, S. Patel, and K. Seth, Practical Secure Aggregation for Privacy-Preserving Machine Learning, Proceedings of the 2017 ACM SIGSAC Conference on Computer and Communications Security, 2017, 1175-1191. Retrieved from https://doi.org/10.1145/3133956.3133982.

National Institute of Standards and Technology (NIST), NIST Privacy Framework: A Tool for Improving Privacy through Enterprise Risk Management, Version 1.0, U.S. Department of Commerce, 2020. Retrieved from https://www.nist.gov/privacy-framework.

Published

2024-12-28

How to Cite

PRIVACY-PRESERVING MACHINE LEARNING IN CYBERSECURITY. (2024). INTERNATIONAL JOURNAL OF INFORMATION TECHNOLOGY (IJIT), 5(2). https://lib-index.com/index.php/IJIT/article/view/1611