Hoepman, J.-H.: Privacy design strategies. In: Cuppens-Boulahia, N., Cuppens, F., Jajodia, S., Abou El Kalam, A., Sans, T. (eds.) SEC 2014. IAICT, vol. 428, pp. 446–459. Springer, Heidelberg (2014). https://doi.org/10.1007/978-3-642-55415-5_38
ENISA: Privacy Enhancing Technologies: Evolution and State of the Art. A Community Approach to PETs Maturity Assessment (2016). https://www.enisa.europa.eu/publications/pets-evolution-and-state-of-the-art
Schaub, F., Balebako, R., Durity, A.L., Cranor, L.F.: A design space for effective privacy notices. In: Eleventh Symposium on Usable Privacy and Security (SOUPS 2015), Ottawa, pp. 1–17. USENIX Association (2015)
President’s Council of Advisors on Science and Technology: Big data and privacy: a technological perspective. Report to the U.S. President, Executive Office of the President, May 2014
Cranor, L.F.: Necessary but not sufficient: standard mechanisms for privacy notice and choice. J. Telecommun. High Technol. Law 10, 273 (2012)
Cate, F.H.: The limits of notice and choice. IEEE Secur. Priv. 8(2), 59–62 (2010)
Schaub, F., Balebako, R., Cranor, L.F.: Designing effective privacy notices and controls. IEEE Internet Comput. 21(3), 70–77 (2017)
Wenning, R., et al.: The platform for privacy preferences 1.1 (P3P 1.1) specification (2006). https://www.w3.org/TR/2018/NOTE-P3P11-20180830/
Article 29 Working Party: Opinion 05/2014 on anonymisation techniques. WP216 (2014). http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp216_en.pdf
Narayanan, A., Shmatikov, V.: Robust de-anonymization of large sparse datasets. In: 2008 IEEE Symposium on Security and Privacy, SP 2008 (2008)
Cavoukian, A., Castro, D.: Big data and innovation, setting the record straight: de-identification does work. Information and Privacy Commissioner, p. 18 (2014)
Hu, R., Stalla-Bourdillon, S., Yang, M., Schiavo, V., Sassone, V.: Bridging policy, regulation and practice? A techno-legal analysis of three types of data in the GDPR. In: Data Protection and Privacy: The Age of Intelligent Machines, p. 39 (2017)
Ye, L.R.: The value of explanation in expert systems for auditing: an experimental investigation. Expert Syst. Appl. 9(4), 543–556 (1995)
Article 29 Working Party: Guidelines on transparency under regulation 2016/679. 17/EN WP260 (2016). http://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=615250
Wachter, S., Mittelstadt, B., Floridi, L.: Why a right to explanation of automated decision-making does not exist in the general data protection regulation. Int. Data Priv. Law 7, 76–99 (2017)
Selbst, A.D., Powles, J.: Meaningful information and the right to explanation. Int. Data Priv. Law 7(4), 233–242 (2017)
Biran, O., Cotton, C.: Explanation and justification in machine learning: a survey. In: IJCAI-17 Workshop on Explainable AI (XAI) Proceedings, pp. 8–13 (2017). http://www.intelligentrobots.org/files/IJCAI2017/IJCAI-17_XAI_WS_Proceedings.pdf#page=8
Lipton, Z.C.: The mythos of model interpretability. In: ICML 2016 Workshop on Human Interpretability in Machine Learning (WHI 2016) (2016). http://zacklipton.com/media/papers/mythos_model_interpretability_lipton2016.pdf
Edwards, L., Veale, M.: Slave to the algorithm? Why a ‘right to an explanation’ is probably not the remedy you are looking for. Duke Law Technol. Rev. 16, 18 (2017)
Article 29 Working Party: Guidelines on automated individual decision-making and profiling for the purposes of regulation 2016/679. 17/EN WP251rev.01 (2018). http://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=612053
Obar, J.A., Oeldorf-Hirsch, A.: The biggest lie on the internet: ignoring the privacy policies and terms of service policies of social networking services. In: TPRC 44: The 44th Research Conference on Communication, Information and Internet Policy (2016)
Cate, F.H.: Information security breaches: looking back & thinking ahead. Technical report Paper 233, Articles by Maurer Faculty (2008). http://www.repository.law.indiana.edu/facpub/233
Atzori, M., Bonchi, F., Giannotti, F., Pedreschi, D.: Anonymity preserving pattern discovery. VLDB J. 17(4), 703–727 (2008)
Hansen, M., Jensen, M., Rost, M.: Protection goals for privacy engineering. In: 2015 IEEE Security and Privacy Workshops (SPW), pp. 159–166, May 2015
Schmidt, A., Herrmann, T., Degeling, M.: From interaction to intervention: an approach for keeping humans in control in the context of socio-technical systems. In: 4th Workshop on Socio-Technical Perspective in IS development (STPIS 2018) (2018)
Ribeiro, M.T., Singh, S., Guestrin, C.: “Why should I trust you?”: explaining the predictions of any classifier. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD 2016, pp. 1135–1144. ACM, New York (2016)
Gürses, S., van Hoboken, J.: Privacy after the agile turn. In: Selinger, E., Polonetsky, J., Tene, O. (eds.) The Cambridge Handbook of Consumer Privacy, pp. 579–601. Cambridge University Press, Cambridge (2018). https://doi.org/10.1017/9781316831960.032
Oliver, I.: Privacy Engineering: A Data Flow and Ontological Approach. CreateSpace Independent Publishing, July 2014. ISBN 978-1497569713
Anton, A.I., Earp, J.B.: A requirements taxonomy for reducing web site privacy vulnerabilities. Requirements Eng. 9(3), 169–185 (2004)
Solove, D.J.: A taxonomy of privacy. Univ. Pennsylvania Law Rev. 154(3), 477 (2006). GWU Law School Public Law Research Paper No. 129
Solove, D.J.: Conceptualizing privacy. Calif. Law Rev. 90(4), 1087–1155 (2002)
Kost, M., Freytag, J.C., Kargl, F., Kung, A.: Privacy verification using ontologies. In: ARES, pp. 627–632. IEEE (2011)
Kern, T.: Flight Discipline. McGraw-Hill Education, New York (1998)
Card, A.J., Ward, J.R., Clarkson, P.J.: Beyond FMEA: the structured what-if technique (SWIFT). J. Healthc. Risk Manag. 31, 23–29 (2012)
Scandariato, R., Wuyts, K., Joosen, W.: A descriptive study of Microsoft’s threat modeling technique. Requirements Eng. 20(2), 163–180 (2015)
Gawande, A.: The Checklist Manifesto. Profile Books (2011)
Reason, J.T.: Managing the Risks of Organizational Accidents. Ashgate, Farnham (1997)
Pfleeger, S.L.: Risky business: what we have yet to learn about risk management. J. Syst. Softw. 53(3), 265–273 (2000)
Oliver, I.: Experiences in the development and usage of a privacy requirements framework. In: 24th IEEE International Requirements Engineering Conference, RE 2016, Beijing, China, 12–16 September 2016, pp. 293–302. IEEE Computer Society (2016)
Power, M.: The risk management of everything. J. Risk Finance 5, 58–65 (2004)