C.2 Privacy Protection in Human-Distant IoT Environments

Technical progress in the field of connected, human-distant sensor technology is generating a volume of data whose increasingly intelligent analysis, together with the actuator technology built on it, creates great opportunities for “smart environments” such as smart homes, smart factories, and smart cities in all areas of life. At the same time, however, major privacy challenges are emerging: key principles such as transparency, purpose limitation, data minimization, and accuracy are called into question by the progressive spread of connected, human-distant sensors.

This raises the question of how privacy can still be effectively protected in such environments. The protection of privacy itself is not negotiable from a normative perspective, because privacy is a fundamental right. What is needed, however, are new normative approaches and regulatory mechanisms, as well as new strategies for the technical and organizational implementation of established principles.

Subproject C.2 therefore aims to find solutions on three levels: first, normatively, by further developing regulation; second, organizationally, by involving intermediaries; and third, technically, by developing implementation mechanisms, especially for transparency, intervenability, and accountability.

Current PhD project in subarea C.2:

The Tension between Human-Distant IoT Devices and Transparency under Data Protection Law

-Linda Seyda-

Technical progress in human-distant IoT environments is generating large amounts of data that can be analyzed intelligently. This opens up opportunities, but at the same time poses significant risks and challenges for privacy, especially with regard to data protection principles such as data minimization and transparency. A clear and universally applicable legal and technical solution for user privacy is needed. Such a solution could be found in the development of new normative approaches and regulatory mechanisms regarding transparency.

The principle of transparency was originally developed, in substance, in the Census Judgment of the Federal Constitutional Court (BVerfG) in 1983. There, the court held that it is incompatible with the right to informational self-determination if citizens can no longer know who knows what about them, when, and on what occasion.

Today, the principle of transparency is enshrined in Art. 5 para. 1 lit. a GDPR and is essentially implemented through several instruments of the GDPR, such as the information obligations in Art. 12-14 or the data subject’s right of access in Art. 15.

The first step will be to elaborate what “transparency” means in general, in the legal context (especially in the GDPR), and in other disciplines, especially computer science.

A further step will be to examine whether the current way in which data subjects are informed provides too little, sufficient, or even too much information, whether other, more effective ways of informing them are conceivable, and how these could be implemented.

Since law and technology influence each other’s development, an interdisciplinary solution is beneficial. To arrive at such a solution, the method of concretizing legal requirements (the KORA method) could be used.

Name                       Working area(s)     Contact
Prof. Dr. Gerrit Hornung   C.2                 +49 561 804 7923, K33 2109
Linda Seyda                C.2, Tandem: C.1    +49 561 804 7717