Trust is a multilayered phenomenon and a highly relevant research topic in disciplines ranging from computer science to sociology, psychology, economics, and law. The spectrum covered by the term extends from trust in abstract systems and technology to reflexive and pre-reflexive understandings of trust. In this research area, which deals with Online Social Networks (OSNs), this multilayered and ambivalent nature of trust is the focal point of research.
OSNs not only present challenges regarding the constitution of trust (B.1), which are to be mitigated by the application of hardware-based trust anchors (B.2). The use of data by third parties, which endangers users’ private autonomy, also raises severe trust issues (B.3). Especially in crisis contexts, trust can be impaired by uncertainty and a dynamic course of events, leading to additional demands for privacy in the collaborative use of information and communication technology (ICT) and OSNs (B.4). OSNs, with the diverse trust relations they represent, require trust, but also exhibit deceptive forms of trust. In this context, the computational representation of trust and trustworthiness, as well as the modeling of socio-technical trust infrastructures, must be examined more closely.
We currently witness an increased use of recursive algorithmic systems by the companies that organize many of the most relevant OSNs. We therefore expand the perspective of research area B to address algorithmic settings and the intensified interconnectedness between practice and technology that comes with them. On the one hand, algorithms guide social practices; on the other, these practices serve as training data for further adaptations. Our shared working thesis is that the genesis and interplay of privacy and trust in social networks can no longer be investigated without considering these algorithmic loops. It is necessary to examine how privacy is deployed in the formation of trust and, conversely, how algorithmically optimized trust changes the value(s) of privacy. Our research therefore focuses on how privacy and trust relate to and (de)stabilize each other in algorithmic environments.
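The recursive coupling described above can be made concrete with a purely illustrative toy model (not taken from the source, and deliberately oversimplified): a platform ranks two hypothetical posts by accumulated engagement, users tend to follow the top-ranked recommendation, and each click is logged as training data for the next ranking round. Even from a symmetric start, the loop amplifies small random asymmetries.

```python
import random

def run_loop(rounds: int = 50, follow_rate: float = 0.8, seed: int = 0) -> dict:
    """Toy sketch of an algorithmic loop: ranking shapes behaviour,
    and the logged behaviour re-trains the ranking.

    All names (post_a, post_b, follow_rate) are illustrative assumptions,
    not entities from the research programme described in the text.
    """
    rng = random.Random(seed)
    engagement = {"post_a": 1, "post_b": 1}  # symmetric starting point
    for _ in range(rounds):
        # Step 1: the algorithm guides practice --
        # rank by accumulated engagement and recommend the leader.
        top = max(engagement, key=engagement.get)
        other = "post_b" if top == "post_a" else "post_a"
        # Step 2: practice becomes training data --
        # the user usually follows the recommendation, and the click
        # is fed back into the engagement counts for the next round.
        clicked = top if rng.random() < follow_rate else other
        engagement[clicked] += 1
    return engagement

result = run_loop()
# With rich-get-richer dynamics, engagement typically concentrates
# on one post, even though both started out identical.
```

The point of the sketch is only that trust signals (here, engagement counts) produced under the influence of the algorithm cannot be read as independent evidence about the posts themselves, which is one reason the text argues that privacy and trust must be studied jointly in such loops.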