References of "Tang, Qiang"
Full Text
Peer Reviewed
Together or Alone: The Price of Privacy in Collaborative Learning
Pejo, Balazs UL; Tang, Qiang; Biczók, Gergely

in Proceedings on Privacy Enhancing Technologies (2019, July)

Machine learning algorithms have reached mainstream status and are widely deployed in many applications. The accuracy of such algorithms depends significantly on the size of the underlying training dataset; in reality, a small or medium-sized organization often does not have the necessary data to train a reasonably accurate model. For such organizations, a realistic solution is to train their machine learning models based on their joint dataset (which is a union of the individual ones). Unfortunately, privacy concerns prevent them from straightforwardly doing so. While a number of privacy-preserving solutions exist for collaborating organizations to securely aggregate the parameters in the process of training the models, we are not aware of any work that provides a rational framework for the participants to precisely balance the privacy loss and accuracy gain in their collaboration. In this paper, by focusing on a two-player setting, we model the collaborative training process as a two-player game where each player aims to achieve higher accuracy while preserving the privacy of its own dataset. We introduce the notion of Price of Privacy, a novel approach for measuring the impact of privacy protection on the accuracy in the proposed framework. Furthermore, we develop a game-theoretical model for different player types, and then either find or prove the existence of a Nash Equilibrium with regard to the strength of privacy protection for each player. Using recommendation systems as our main use case, we demonstrate how two players can make practical use of the proposed theoretical framework, including setting up the parameters and approximating the non-trivial Nash Equilibrium.
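
To make the game-theoretic framing concrete, here is a minimal Python sketch of a two-player game in which each player picks a privacy-protection level and trades accuracy gain against privacy loss. The payoff form, the weights c1 and c2, and the best-response iteration are illustrative assumptions only, not the paper's actual model or its Price of Privacy definition.

```python
# Toy two-player "collaborative learning with privacy protection" game.
# Each player i picks a protection level p_i in [0, 1]: more protection means
# less accuracy gained from the joint model but also less privacy lost.
import numpy as np

grid = np.linspace(0.0, 1.0, 101)   # candidate protection levels
c1, c2 = 0.6, 1.2                   # hypothetical privacy weights of the two players

def payoff(p_own, p_other, c):
    accuracy_gain = 1.0 - 0.25 * (p_own + p_other) ** 2   # degraded by both players' noise
    privacy_loss = c * (1.0 - p_own)                      # weaker protection -> larger loss
    return accuracy_gain - privacy_loss

def best_response(p_other, c):
    return grid[np.argmax([payoff(p, p_other, c) for p in grid])]

# Best-response dynamics: iterate until neither player wants to deviate.
p1, p2 = 0.0, 0.0
for _ in range(100):
    new_p1 = best_response(p2, c1)
    new_p2 = best_response(new_p1, c2)
    if (new_p1, new_p2) == (p1, p2):
        break
    p1, p2 = new_p1, new_p2

print(f"approximate Nash equilibrium: p1={p1:.2f}, p2={p2:.2f}")
```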

Peer Reviewed
Facilitating Privacy-preserving Recommendation-as-a-Service with Machine Learning
Wang, Jun UL; Arriaga, Afonso; Tang, Qiang et al

Poster (2018, October)

Machine-Learning-as-a-Service has become increasingly popular, with Recommendation-as-a-Service as one of the representative examples. In such services, providing privacy protection for users is an important topic. Looking at the privacy-preserving solutions proposed in the past decade, privacy and machine learning are often seen as two competing goals at stake. Although improving cryptographic primitives (e.g., secure multi-party computation (SMC) or homomorphic encryption (HE)) and devising sophisticated secure protocols have led to remarkable achievements, combining them with state-of-the-art recommender systems often yields far-from-practical solutions. We tackle this problem from the direction of machine learning. We aim to design crypto-friendly recommendation algorithms, and thus obtain efficient solutions by directly using existing cryptographic tools. In particular, we propose an HE-friendly recommender system, referred to as CryptoRec, which (1) decouples user features from the latent feature space, avoiding training the recommendation model on encrypted data; and (2) only relies on addition and multiplication operations, making the model straightforwardly compatible with HE schemes. These properties turn recommendation computation into a simple matrix-multiplication operation. To further improve efficiency, we introduce a sparse-quantization-reuse method which reduces the recommendation-computation time by 9x (compared to using CryptoRec directly), without compromising the accuracy. We demonstrate the efficiency and accuracy of CryptoRec on three real-world datasets. CryptoRec allows a server to estimate a user's preferences on thousands of items within a few seconds on a single PC, with the user's data homomorphically encrypted, while its prediction accuracy is still competitive with state-of-the-art recommender systems computing over clear data. Our solution enables Recommendation-as-a-Service on large datasets at a nearly real-time (seconds) level.
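
The key structural claim is that prediction only needs additions and multiplications, i.e. one matrix product per user. A minimal plaintext sketch of that structure follows; the weight matrix W and the data are hypothetical, and in the real setting the user's rating vector would be HE-encrypted while W stays in the clear. This is not the CryptoRec algorithm itself.

```python
# Sketch of "prediction = one matrix multiplication" (illustrative, not CryptoRec).
# The server pre-trains item-side weights W on its own plaintext data; serving a
# user then needs only additions and multiplications between the user's rating
# vector and plaintext W, which additively homomorphic encryption supports.
import numpy as np

n_items = 1000
rng = np.random.default_rng(0)

W = rng.normal(scale=0.05, size=(n_items, n_items))   # hypothetical pre-trained item weights
user_ratings = np.zeros(n_items)
rated = rng.choice(n_items, size=30, replace=False)
user_ratings[rated] = rng.integers(1, 6, size=30)

# In the encrypted setting every entry of user_ratings would be a ciphertext and
# the product below would be homomorphic mul-by-plaintext followed by additions.
predictions = user_ratings @ W            # one matrix-vector product per user
top10 = np.argsort(predictions)[::-1][:10]
print("top-10 recommended item ids:", top10)
```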

Full Text
Peer Reviewed
Differentially Private Neighborhood-based Recommender Systems
Wang, Jun UL; Tang, Qiang

in IFIP Information Security & Privacy Conference (2017, May)

Privacy issues of recommender systems have become a hot topic for society as such systems are appearing in every corner of our life. While many secure multi-party computation protocols have been proposed to prevent information leakage in the process of recommendation computation, very little has been done to restrict the information leakage from the recommendation results. In this paper, we apply the differential privacy concept to neighborhood-based recommendation methods (NBMs) under a probabilistic framework. We first present a solution, by directly calibrating Laplace noise into the training process, to find the maximum a posteriori similarity parameters in a differentially private manner. Then we connect differential privacy to NBMs by exploiting a recent observation that sampling from the scaled posterior distribution of a Bayesian model results in provably differentially private systems. Our experiments show that both solutions allow promising accuracy with a modest privacy budget, and the second solution yields better accuracy if the sampling asymptotically converges. We also compare our solutions to the recent differentially private matrix factorization (MF) recommender systems, and show that our solutions achieve better accuracy when the privacy budget is reasonably small. This is an interesting result because MF systems often offer better accuracy when differential privacy is not applied.
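
As a rough illustration of the first (Laplace-calibration) idea, the sketch below perturbs item-item similarities with Laplace noise scaled to an assumed sensitivity/epsilon and then predicts with a similarity-weighted average. The paper calibrates noise inside the MAP training procedure, so treat this only as the general output-perturbation pattern, with hypothetical data and sensitivity bound.

```python
# Output-perturbation sketch: add Laplace noise (scale = sensitivity / epsilon)
# to item-item similarities, then predict with a similarity-weighted average.
# Illustrative only; the paper calibrates the noise inside MAP training instead.
import numpy as np

rng = np.random.default_rng(1)
R = rng.integers(0, 6, size=(200, 50)).astype(float)   # toy user-item ratings (0 = unrated)

def cosine_similarities(R):
    norms = np.linalg.norm(R, axis=0, keepdims=True) + 1e-12
    X = R / norms
    return X.T @ X                                     # item-item similarity matrix

epsilon = 1.0        # privacy budget
sensitivity = 1.0    # assumed bound on one user's influence on a similarity entry

S_noisy = cosine_similarities(R) + rng.laplace(scale=sensitivity / epsilon, size=(50, 50))

def predict(user_ratings, item, S):
    rated = user_ratings > 0
    weights = S[item, rated]
    return float(weights @ user_ratings[rated] / (np.abs(weights).sum() + 1e-12))

print(predict(R[0], item=3, S=S_noisy))
```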

Full Text
Peer Reviewed
To Cheat or Not to Cheat - A Game-Theoretic Analysis of Outsourced Computation Verification
Pejo, Balazs UL; Tang, Qiang

in Fifth ACM International Workshop on Security in Cloud Computing, Abu Dhabi, 2 April 2017 (2017, April 02)

In the cloud computing era, many organizations tend to outsource their computations to third-party cloud servers in order to avoid computational burdens. To protect service quality, the integrity of the computation results needs to be guaranteed. In this paper, we develop a game-theoretic framework which helps the outsourcer to maximize its payoff while ensuring the desired level of integrity for the outsourced computation. We define two Stackelberg games and analyze the sensitivity of the optimal setting with respect to the parameters of the model.
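
A toy sketch of the leader-follower structure described above: the outsourcer (leader) commits to an audit/re-computation probability and the server (follower) best-responds with a cheating decision. All payoff numbers are invented for illustration and do not come from the paper.

```python
# Toy Stackelberg game for outsourced-computation verification (invented numbers).
# Leader (outsourcer) commits to an audit probability t; follower (server) then
# chooses whether to cheat so as to maximize its own expected payoff.
import numpy as np

gain_cheat = 1.0    # server's saving from skipping the real computation
fine = 5.0          # server's penalty if cheating is caught by an audit
audit_cost = 0.3    # outsourcer's cost of re-computing / auditing one job
damage = 4.0        # outsourcer's loss from accepting a wrong result

def server_cheats(t):
    # Expected payoff of cheating is gain_cheat - t * fine; honesty yields 0.
    return gain_cheat - t * fine > 0

def outsourcer_payoff(t):
    expected_damage = damage * (1 - t) if server_cheats(t) else 0.0
    return -(audit_cost * t + expected_damage)

ts = np.linspace(0, 1, 1001)
best_t = ts[np.argmax([outsourcer_payoff(t) for t in ts])]
print(f"leader's optimal audit probability: {best_t:.3f}, "
      f"server cheats at that point: {server_cheats(best_t)}")
```

With these numbers the leader's optimal commitment is the smallest audit rate at which cheating stops paying off, the usual deterrence logic in such Stackelberg formulations.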

Full Text
Peer Reviewed
On the power of Public-key Function-Private Functional Encryption
Iovino, Vincenzo UL; Tang, Qiang; Zebrowski, Karol

in IET Information Security (2017)

In the public-key setting, known constructions of function-private functional encryption (FPFE) were limited to very restricted classes of functionalities like inner-product [Agrawal et al., PKC 2015]. Moreover, its power has not been well investigated. In this paper, we construct FPFE for general functions and explore its powerful applications, both for general and specific functionalities. One key observation entailed by our results is that Attribute-based Encryption with function privacy implies FE, a notable fact that sheds light on the importance of the function privacy property for FE.

Full Text
Peer Reviewed
A Probabilistic View of Neighborhood-based Recommendation Methods
Wang, Jun UL; Tang, Qiang

in ICDM 2016 - IEEE International Conference on Data Mining series (ICDM) workshop CLOUDMINE (2016, December 12)

Probabilistic graphical models are an elegant framework for compactly representing complex real-world observations by modeling uncertainty and logical flow (conditionally independent factors). In this paper, we present a probabilistic framework of neighborhood-based recommendation methods (PNBM) in which similarity is regarded as an unobserved factor. Thus, PNBM reduces the estimation of user preferences to maximizing a posterior over similarity. We further introduce a novel multi-layer similarity descriptor which models and learns the joint influence of various features under PNBM, and name the new framework MPNBM. Empirical results on real-world datasets show that MPNBM allows very accurate estimation of user preferences.

Full Text
Peer Reviewed
Updatable Functional Encryption
Arriaga, Afonso; Iovino, Vincenzo UL; Tang, Qiang

in Paradigms in Cryptology – Mycrypt 2016. Malicious and Exploratory Cryptology (2016, December)

Functional encryption (FE) allows an authority to issue tokens associated with various functions, allowing the holder of some token for function f to learn only f(D) from a ciphertext that encrypts D. The standard approach is to model f as a circuit, which yields inefficient evaluations over large inputs. Here, we propose a new primitive that we call updatable functional encryption (UFE), where instead of circuits we deal with RAM programs, which are closer to how programs are expressed in von Neumann architecture. We impose strict efficiency constraints in that the run-time of a token P' on ciphertext CT is proportional to the run-time of its clear-form counterpart (program P on memory D) up to a polylogarithmic factor in the size of D, and we envision tokens that are capable of updating the ciphertext, over which other tokens can be subsequently executed. We define a security notion for our primitive and propose a candidate construction from obfuscation, which serves as a starting point towards the realization of other schemes and contributes to the study on how to compute RAM programs over public-key encrypted data.

Full Text
Game-Theoretic Framework for Integrity Verification in Computation Outsourcing
Pejo, Balazs UL; Tang, Qiang

Poster (2016, November 03)

Full Text
Peer Reviewed
Privacy-preserving Friendship-based Recommender Systems
Tang, Qiang; Wang, Jun UL

in IEEE Transactions on Dependable and Secure Computing (2016, November)

Privacy-preserving recommender systems have been an active research topic for many years. However, until today, it is still a challenge to design an efficient solution without involving a fully trusted third party or multiple semi-trusted third parties. The key obstacle is the large underlying user population (i.e., huge input size) in such systems. In this paper, we revisit the concept of friendship-based recommender systems, proposed by Jeckmans et al. and Tang and Wang. These solutions are very promising because recommendations are computed based on inputs from a very small subset of the overall user population (precisely, a user's friends and some randomly chosen strangers). We first clarify the single prediction protocol and Top-n protocol by Tang and Wang, by correcting some flaws and improving the efficiency of the single prediction protocol. We then design a decentralized single prediction protocol by getting rid of the semi-honest service provider. In order to validate the designed protocols, we crawl Twitter and construct two datasets (FMT and 10-FMT) which are equipped with auxiliary friendship information. Based on 10-FMT and the MovieLens 100k dataset with simulated friendships, we show that even if our protocols use a very small subset of the datasets, their accuracy can still be equal to or better than that of some baseline algorithms. Based on these datasets, we further demonstrate that the outputs of our protocols leak a very small amount of information about the inputs, and the leakage decreases when the input size increases. We finally show that the single prediction protocol is quite efficient but the Top-n protocol is not. However, we observe that the efficiency of the Top-n protocol can be dramatically improved if we slightly relax the desired security guarantee.
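
A plaintext sketch of the friendship-based single-prediction idea (a prediction computed only from a user's friends plus a few randomly chosen strangers). The data, the uniform averaging, and the function names are assumptions; the actual protocols run this computation under encryption.

```python
# Plaintext sketch of a friendship-based single prediction: only the ratings of
# the target user's friends plus a few random strangers enter the computation.
# Illustrative only; the actual protocols perform this computation under encryption.
import random

ratings = {                                  # user -> {item: rating}
    "alice": {"movie1": 4, "movie2": 5},
    "bob":   {"movie2": 3, "movie3": 4},
    "carol": {"movie2": 5},
    "dave":  {"movie2": 2, "movie3": 5},
}
friends_of = {"alice": ["bob", "carol"]}

def predict(user, item, k_strangers=1, seed=0):
    pool = list(friends_of[user])
    strangers = [u for u in ratings if u != user and u not in pool]
    random.Random(seed).shuffle(strangers)
    pool += strangers[:k_strangers]
    scores = [ratings[u][item] for u in pool if item in ratings[u]]
    return sum(scores) / len(scores) if scores else None

print(predict("alice", "movie2"))   # average over bob, carol and one stranger
```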

Full Text
Protect both Integrity and Confidentiality in Outsourcing Collaborative Filtering Computations
Pejo, Balazs UL; Tang, Qiang; Wang, Husen

Scientific Conference (2016, June 27)

Full Text
Peer Reviewed
Updatable Functional Encryption
Delerue Arriaga, Afonso UL; Iovino, Vincenzo UL; Tang, Qiang

in Paradigms in Cryptology - Mycrypt 2016. Malicious and Exploratory Cryptology, Second International Conference, Mycrypt 2016, Kuala Lumpur, Malaysia, December 1-2, 2016, Revised Selected Papers (2016)

Functional encryption (FE) allows an authority to issue tokens associated with various functions, allowing the holder of some token for function f to learn only f(D) from a ciphertext that encrypts D. The standard approach is to model f as a circuit, which yields inefficient evaluations over large inputs. Here, we propose a new primitive that we call updatable functional encryption (UFE), where instead of circuits we deal with RAM programs, which are closer to how programs are expressed in von Neumann architecture. We impose strict efficiency constraints in that the run-time of a token P' on ciphertext CT is proportional to the run-time of its clear-form counterpart (program P on memory D) up to a polylogarithmic factor in the size of D, and we envision tokens that are capable of updating the ciphertext, over which other tokens can be subsequently executed. We define a security notion for our primitive and propose a candidate construction from obfuscation, which serves as a starting point towards the realization of other schemes and contributes to the study on how to compute RAM programs over public-key encrypted data.

Full Text
Peer Reviewed
Recommender Systems and their Security Concerns
Wang, Jun UL; Tang, Qiang

Scientific Conference (2015, October)

Instead of simply using two-dimensional User × Item features, advanced recommender systems rely on additional dimensions (e.g. time, location, social network) in order to provide better recommendation services. In the first part of this paper, we survey a variety of dimension features and show how they are integrated into the recommendation process. As service providers collect more and more personal information, this raises serious privacy concerns among the public. On the other side, the service providers could also suffer from attacks launched by malicious users who want to bias the recommendations. In the second part of this paper, we survey attacks from and against recommender service providers, and existing solutions.

Full Text
Peer Reviewed
DoubleMod and SingleMod: Simple Randomized Secret-Key Encryption with Bounded Homomorphicity
Ryan, Peter UL; Phatak, Dhananjay S.; Tang, Qiang et al

in IACR Cryptology ePrint Archive (2014)

An encryption relation f ⊆ Z × Z with decryption function f^{-1} is "group-homomorphic" if, for any suitable plaintexts x1 and x2, x1 + x2 = f^{-1}(f(x1) + f(x2)). It is "ring-homomorphic" if furthermore x1*x2 = f^{-1}(f(x1)*f(x2)); it is "field-homomorphic" if furthermore 1/x1 = f^{-1}(f(1/x1)). Such relations would support oblivious processing of encrypted data. We propose a simple randomized encryption relation f over the integers, called DoubleMod, which is "bounded ring-homomorphic" or what some call "somewhat homomorphic." Here, "bounded" means that the number of additions and multiplications that can be performed, while not allowing the encrypted values to go out of range, is limited (any pre-specified bound on the operation count can be accommodated). Let R be any large integer. For any plaintext x ∈ Z_R, DoubleMod encrypts x as f(x) = x + au + bv, where a and b are randomly chosen integers in some appropriate interval, while (u, v) is the secret key. Here u > R^2 is a large prime and the smallest prime factor of v exceeds u. With knowledge of the key, but not of a and b, the receiver decrypts the ciphertext by computing f^{-1}(y) = (y mod v) mod u. DoubleMod generalizes an independent idea of van Dijk et al. 2010. We present and refine a new CCA1 chosen-ciphertext attack that finds the secret key of both systems (ours and van Dijk et al.'s) in linear time in the bit length of the security parameter. Under a known-plaintext attack, breaking DoubleMod is at most as hard as solving the Approximate GCD (AGCD) problem. The complexity of AGCD is not known. We also introduce the SingleMod field-homomorphic cryptosystems. The simplest SingleMod system based on the integers can be broken trivially. We had hoped that, if SingleMod is implemented inside non-Euclidean quadratic or higher-order fields with large discriminants, where GCD computations appear difficult, it may be feasible to achieve a desired level of security. We show, however, that a variation of our chosen-ciphertext attack works against SingleMod even in non-Euclidean fields.
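
Since the abstract states the encryption and decryption maps explicitly, a direct toy implementation is possible. The parameter sizes below are tiny and chosen only so that one addition and one multiplication of ciphertexts stay in range (the "bounded" homomorphism); sympy is used merely to pick example primes, and nothing here is secure.

```python
# Toy DoubleMod following the abstract: encrypt x as x + a*u + b*v with random
# a, b and secret key (u, v); decrypt y as (y mod v) mod u.
import random
from sympy import nextprime

R = 10**6                    # plaintext space Z_R
u = nextprime(R**2)          # large prime with u > R^2
v = nextprime(u**3)          # smallest prime factor of v exceeds u (v itself is prime)

def encrypt(x, u, v, rand_bound=2**16):
    # Keeping a, b small is what bounds how many operations survive decryption.
    a = random.randrange(1, rand_bound)
    b = random.randrange(1, rand_bound)
    return x + a * u + b * v

def decrypt(y, u, v):
    return (y % v) % u

x1, x2 = 123456, 654321
c1, c2 = encrypt(x1, u, v), encrypt(x2, u, v)

assert decrypt(c1, u, v) == x1
assert decrypt(c1 + c2, u, v) == x1 + x2   # additive homomorphism (bounded)
assert decrypt(c1 * c2, u, v) == x1 * x2   # multiplicative homomorphism (bounded)
print("homomorphic addition and multiplication verified on toy parameters")
```

Chaining more multiplications than the randomness budget allows makes the ciphertext overflow v and decryption fails, which is exactly the "bounded" restriction the abstract describes.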
