Some ideas for a Master thesis

If you are interested in the topics below, contact Mina Sheikhalishahi (email: m.sheikhalishahi* tue.nl (replace * with @) )
  1. Game Theory Meets Multi-party Access Control: In multi-party access control, several agents are authorized to make a decision (permit or deny) about an access request for a co-owned resource, e.g. a photo shared on a social network. The main issue in multi-party access control arises from privacy conflicts in the collaborative management of the shared data: in practice, stakeholders might prefer to maximize their own profit without considering the benefit of the entire system. The goal of this project is to formulate multi-party access control as a game that models the interaction between the stakeholders. The participating stakeholders are considered the players of the game, where each player's highest reward is obtained when the privacy requirements are relaxed as much as possible. To this end, the Nash equilibrium needs to be placed at the point where the best balance between privacy and system efficiency is guaranteed (a minimal sketch of such a game follows the reading list below).

    Suggested reading:
    1. Hongxin Hu, Gail-Joon Ahn, Ziming Zhao, Dejun Yang, Game Theoretic Analysis of Multiparty Access Control in Online Social Networks, Proceedings of the 19th ACM Symposium on Access Control Models and Technologies, 2014.
    2. Fang Liu, Li Pan, Li-hong Yao, Evolutionary Game Based Analysis for User Privacy Protection Behaviors in Social Networks, IEEE International Conference on Data Science in Cyberspace (DSC), 2018.
    3. Tansu Alpcan, Lacra Pavel, Nash Equilibrium Design and Optimization, International Conference on Game Theory for Networks, 2009.
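    To make the game formulation concrete, here is a minimal Python sketch with purely illustrative, hypothetical payoffs (not taken from the papers above): two co-owners of a shared photo each choose between permitting and denying access, and the pure-strategy Nash equilibria are found by checking that no player can gain by deviating alone.

```python
# Two co-owners decide on access to a shared photo; payoffs are illustrative.
from itertools import product

STRATEGIES = ["permit", "deny"]

# payoff[(s1, s2)] = (reward of owner 1, reward of owner 2).
# Hypothetical numbers: owner 1 gains from sharing, owner 2 has stricter
# privacy requirements and is hurt when the photo is exposed.
payoff = {
    ("permit", "permit"): (3, 1),
    ("permit", "deny"):   (1, 2),
    ("deny",   "permit"): (0, 0),
    ("deny",   "deny"):   (1, 2),
}

def is_nash(s1, s2):
    """A profile is a pure Nash equilibrium if no unilateral deviation pays off."""
    u1, u2 = payoff[(s1, s2)]
    return (all(payoff[(d, s2)][0] <= u1 for d in STRATEGIES) and
            all(payoff[(s1, d)][1] <= u2 for d in STRATEGIES))

equilibria = [p for p in product(STRATEGIES, repeat=2) if is_nash(*p)]
print("Pure-strategy Nash equilibria:", equilibria)
```

    A thesis would replace the hand-written payoff table with utilities derived from the stakeholders' actual privacy requirements and the efficiency of the overall system, and then study where the equilibrium can be placed.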

  2. Compare the Efficiency of Different Classifiers over Encrypted Data:

    The collection of digital data by governments, companies, and individuals has created tremendous opportunities for knowledge-based decision making. Driven by mutual benefits, or by regulations that require certain data to be published, there is a demand for the exchange and publication of data among various parties. However, privacy is always an obstacle to sharing information. Although a considerable amount of work has been devoted to comparing the efficiency of different secure multi-party computation protocols, few works consider the effect of the data-mining structure on the computational complexity over encrypted data. In this project, we suppose several data holders are interested in sharing their data to construct a (more accurate) classifier on the whole data, without revealing any sensitive information. The aim of this project is to compare the computational complexity needed to construct different classifiers over encrypted data (a rough, back-of-the-envelope sketch follows the reading list below). The result of this project helps in selecting a better classifier in terms of the efficiency-accuracy trade-off.

    Suggested reading:
    1. Raphael Bost, Raluca Ada Popa, Stephen Tu, Shafi Goldwasser, Machine Learning Classification over Encrypted Data, NDSS Symposium, 2015.
    2. Alhassan Khedr, Glenn Gulak, Vinod Vaikuntanathan, SHIELD: Scalable Homomorphic Implementation of Encrypted Data-Classifiers, IEEE Transactions on Computers, 65(9), 2016.
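    As a first, deliberately rough illustration of the kind of comparison this project targets, the sketch below tallies the encrypted-domain operations needed to evaluate three classifiers on a single encrypted feature vector. The cost model is a simplifying assumption (additively homomorphic encryption plus a secure-comparison sub-protocol), not a measurement taken from the papers above, and construction costs would be analyzed analogously.

```python
# Back-of-the-envelope operation counts for classifier evaluation over
# encrypted features. The cost model is hypothetical: ciphertext additions and
# plaintext-ciphertext multiplications are cheap, secure comparisons are not.

def linear_classifier_cost(n_features):
    # w . x + b on encrypted x: n multiplications, n additions, one sign test.
    return {"ct_add": n_features, "pt_ct_mul": n_features, "secure_compare": 1}

def naive_bayes_cost(n_features, n_classes):
    # One (log-)score per class, then an argmax over the class scores.
    return {"ct_add": n_features * n_classes,
            "pt_ct_mul": n_features * n_classes,
            "secure_compare": n_classes - 1}

def decision_tree_cost(depth):
    # Oblivious evaluation touches every internal node of a complete tree.
    return {"ct_add": 0, "pt_ct_mul": 0, "secure_compare": 2 ** depth - 1}

for name, cost in [("linear, 30 features", linear_classifier_cost(30)),
                   ("naive Bayes, 30 features, 4 classes", naive_bayes_cost(30, 4)),
                   ("decision tree, depth 5", decision_tree_cost(5))]:
    print(f"{name:38s} {cost}")
```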
  3. Privacy Preserving Stream Data Analysis: Streaming data is data that is continuously generated by different sources. As an example, consider applications in finance, where streaming analysis makes it possible to track changes in the stock market in real time, compute value-at-risk, and automatically rebalance portfolios based on stock price movements. It is to the mutual benefit of organizations, companies, and individuals to complete their knowledge by sharing stream data continuously. However, data holders are unwilling to publish their original datasets due to privacy issues. In this project we plan to perform analysis on distributed stream data while the privacy requirements of the data owners are preserved dynamically (a minimal sketch of one possible mechanism follows the reading list below).

    Suggested reading:
    1. Jingchao Sun, Rui Zhang, Jinxue Zhang, Yanchao Zhang, PriStream: Privacy-Preserving Distributed Stream Monitoring of Thresholded Percentile Statistics, IEEE INFOCOM, 2016.
    2. Do Le Quoc, Martin Beck, Pramod Bhatotia, Ruichuan Chen, Christof Fetzer, Thorsten Strufe, PrivApprox: Privacy-Preserving Stream Analytics, USENIX, 2017.
    3. Ting Wang, Ling Liu, Butterfly: Protecting Output Privacy in Stream Mining, IEEE 24th International Conference on Data Engineering (ICDE), 2008.
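    One possible (and deliberately simple) way to preserve privacy on a stream dynamically is to perturb every published window aggregate with calibrated noise. The sketch below applies per-window Laplace noise with illustrative parameters; it is not the mechanism of PriStream or PrivApprox, just a baseline to start from.

```python
# Differentially private release of per-window sums over a simulated stream.
import numpy as np

rng = np.random.default_rng(0)

def private_window_sum(values, epsilon, sensitivity):
    """Release the sum of one window with epsilon-DP Laplace noise."""
    return float(np.sum(values) + rng.laplace(loc=0.0, scale=sensitivity / epsilon))

# Simulated stream of readings in [0, 10], processed in tumbling windows of 50.
stream = rng.uniform(0.0, 10.0, size=200)
window = 50
for i in range(0, len(stream), window):
    chunk = stream[i:i + window]
    released = private_window_sum(chunk, epsilon=0.5, sensitivity=10.0)
    print(f"window {i // window}: true sum = {chunk.sum():7.2f}, released = {released:7.2f}")
```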

  4. Privacy Preserving Adversarial Machine Learning: Sharing data with partners, authorities, and even competitors may help in inferring additional intelligence through collaborative information analysis. Specifically, sharing data leads to constructing more accurate classifiers on a richer amount of data. However, the shared data might contain sensitive information, and data owners might only be ready to share their data in encrypted form. Considering that some data providers could be adversaries who intentionally insert (encrypted) adversarial instances to fool the classifier, it is vital to train a robust classifier over encrypted data that is resistant to being fooled. Robust classifiers are mainly designed by either 1) creating a class of outliers (simulating adversarial instances), or 2) relaxing the conditions for assigning a point to a specific class (a minimal plaintext sketch of the second idea follows the reading list below). The goal of this project is to train a robust classifier over encrypted values.

    Suggested reading:
    1. Battista Biggio, Giorgio Fumera, Fabio Roli, Design of Robust Classifiers for Adversarial Environments, IEEE International Conference on Systems, Man, and Cybernetics, 2011.
    2. Battista Biggio, Giorgio Fumera, Fabio Roli, Multiple Classifier Systems for Robust Classifier Design in Adversarial Environments, International Journal of Machine Learning and Cybernetics, 2010.
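    The plaintext sketch below (toy data, no encryption yet) illustrates idea (2) from the description: a nearest-centroid classifier that refuses to label a point lying too far from every class centroid, so out-of-distribution or adversarial inputs end up in a reject outcome instead of fooling the classifier. The data, the radius, and the classifier choice are illustrative assumptions.

```python
# Nearest-centroid classification with a reject option for suspicious points.
import numpy as np

rng = np.random.default_rng(1)

# Two well-separated Gaussian classes as toy training data.
X0 = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(100, 2))
X1 = rng.normal(loc=[4.0, 4.0], scale=0.5, size=(100, 2))
centroids = np.vstack([X0.mean(axis=0), X1.mean(axis=0)])

def classify(x, centroids, reject_radius=1.5):
    """Return the nearest class, or -1 (reject) if no centroid is close enough."""
    dists = np.linalg.norm(centroids - x, axis=1)
    nearest = int(np.argmin(dists))
    return nearest if dists[nearest] <= reject_radius else -1

print(classify(np.array([0.2, -0.1]), centroids))  # close to class 0 -> 0
print(classify(np.array([3.8, 4.3]), centroids))   # close to class 1 -> 1
print(classify(np.array([2.0, 2.0]), centroids))   # far from both -> -1 (reject)
```

    The thesis question is then how to compute the distances and the threshold test when the feature vectors are only available in encrypted form.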

  5. Game Theory Meets Privacy-preserving Collaborative Data Classification: Companies, organizations, and even individuals find mutual benefits in sharing their own information to make better decisions or to increase their revenues. However, due to privacy concerns, data holders are generally unwilling to share their original data, even though they are interested in obtaining more complete knowledge out of the shared data. Hence, it is essential to define a platform in which several aspects of data sharing are taken into consideration and, through a game-theoretic approach, all parties relax their privacy requirements as much as possible to obtain a more effective output. The goal of this project is to model privacy-preserving data sharing as a game in which several aspects of data sharing come under consideration: 1) the value of the shared data (freshness, size, ...), 2) privacy gain (in terms of anonymization or differential privacy), 3) trust, 4) utility of the result, etc. The output of the game is a Nash equilibrium set at the point where the best balance between utility and privacy is obtained (a minimal best-response sketch follows the reading list below).

    Suggested reading:
    1. Mohammad Hossein Manshaei, Quanyan Zhu, Tansu Alpcan, Jean-Pierre Hubaux, Game Theory Meets Network Security and Privacy, ACM Computing Surveys (CSUR), 2013.
    2. Aron Laszka, Mark Felegyhazi, Levente Buttyán, A Survey of Interdependent Security Games, ACM Computing Surveys (CSUR), 2014.
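    A minimal best-response sketch, assuming purely hypothetical utility functions (a concave benefit in the total amount of shared data minus a convex individual privacy cost): each data holder repeatedly picks the sharing level that maximizes its own payoff, and the dynamics settle at an approximate Nash equilibrium. A real model would plug in the data-value, privacy, trust, and utility terms listed above.

```python
# Best-response dynamics for a toy data-sharing game with three parties.
import numpy as np

ALPHA = 3.0                         # weight of the collaborative benefit
COSTS = np.array([1.0, 2.0, 4.0])   # privacy sensitivity; player 2 is most private
GRID = np.linspace(0.0, 1.0, 101)   # candidate sharing levels

def payoff(i, s):
    # Concave benefit from everything shared, convex cost of own exposure.
    return ALPHA * np.sqrt(s.sum()) - COSTS[i] * s[i] ** 2

def best_response(i, s):
    trial = s.copy()
    values = []
    for level in GRID:
        trial[i] = level
        values.append(payoff(i, trial))
    return GRID[int(np.argmax(values))]

s = np.full(3, 0.5)                 # start everyone at a middling sharing level
for _ in range(50):                 # sequential best-response updates
    for i in range(3):
        s[i] = best_response(i, s)

print("equilibrium sharing levels:", np.round(s, 2))
print("payoffs:", [round(payoff(i, s), 3) for i in range(3)])
```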

  6. Privacy of Human Behavioural Analysis through Sensing Data: Human behavior is the focus of many studies, from the social sciences to health. Several studies have been devoted to analyzing human behavior through data collected from his or her smartphone sensors; examples include physical activity patterns, mobility patterns, social behaviors, daily activity patterns, etc. As these methods become widespread in behavioral research, attention needs to be directed to exploring the ethical implications of sensor-based observation for users' privacy. The aim of this project is to provide tools to address privacy issues in this field of research.

    Suggested reading:
    1. Syagnik (Sy) Banerjee, Thomas Hemphill, Phil Longstreet, Wearable Devices and Healthcare: Data Sharing and Privacy, The Information Society, 34(1), 49-57, 2017.
    2. Gabriella M. Harari, Sandrine R. Muller, Min S. H. Aung, Peter Rentfrow, Smartphone Sensing Methods for Studying Behavior in Everyday Life, Current Opinion in Behavioral Sciences, 2017.
    3. Gabriella M. Harari, Nicholas D. Lane, Rui Wang, Benjamin S. Crosier, Andrew T. Campbell, Samuel D. Gosling, Using Smartphones to Collect Behavioral Data in Psychological Science: Opportunities, Practical Considerations, and Challenges, 2017.

  7. The Privacy of Collaborative Classifier Output: Privacy-preserving data mining has focused on obtaining valid results when the input data is private. For example, secure multi-party computation techniques are used to construct a data-mining model over the whole distributed data without revealing the original data. However, these approaches might still leave potential privacy breaches: the resulting model and its predictions can themselves reveal information about individual records. The aim of this project is to investigate how the output of collaborative classifier construction over private data might violate privacy, and to find solutions that address these breaches (a toy illustration follows the reading list below).

    Suggested reading:
    1. Murat Kantarcioglu, Jiashun Jin, Chris Clifton, When Do Data Mining Results Violate Privacy?, ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2004.
    2. Radhika Kotecha, Sanjay Garg, Preserving Output-Privacy in Data Stream Classification, Progress in Artificial Intelligence, 6(2), 2017.
    3. Ting Wang, Ling Liu, Output Privacy in Data Mining, ACM Transactions on Database Systems, 2011.
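    As a toy illustration of output-side leakage (the data, the model, and the probe point are all illustrative assumptions), the sketch below trains a plain 1-nearest-neighbour classifier with and without one outlying sensitive record. The two models answer a nearby probe query differently, so the classifier's output alone can reveal whether that record was part of the training data, no matter how securely the training itself was performed.

```python
# Membership-style leakage from the output of a jointly built classifier.
import numpy as np

def knn1_predict(X, y, query):
    """Plain 1-NN prediction; stands in for any collaboratively built classifier."""
    return int(y[int(np.argmin(np.linalg.norm(X - query, axis=1)))])

rng = np.random.default_rng(2)
X = rng.normal(loc=0.0, scale=1.0, size=(50, 2))
y = (X.sum(axis=1) > 0).astype(int)      # an innocuous, learnable labelling

sensitive_x = np.array([6.0, 6.0])       # one rare record with a sensitive label
sensitive_y = 0
X_with = np.vstack([X, sensitive_x])
y_with = np.append(y, sensitive_y)

probe = np.array([5.5, 5.5])             # a query close to the sensitive record
print("without the record, the model predicts:", knn1_predict(X, y, probe))
print("with the record, the model predicts:   ", knn1_predict(X_with, y_with, probe))
```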

  8. Network Traffic Anomaly Detection using Deep Learning. See description.
