In platforms we trust: misinformation on social networks in the presence of social mistrust
We examine the effect of social mistrust on the propagation of misinformation in a social network. Agents communicate with their peers and observe information sources, changing their opinions with a probability determined by their level of social trust, which can be low or high. Agents with low social trust are less likely to be convinced out of their opinion by their peers and more likely to observe sources that spread misinformation. A platform facilitates the creation of a network in which users are more likely to connect with agents who share their level of social trust and their social characteristics. When worldview is relatively important in determining network structure, echo chambers are more pronounced, which reduces the probability that agents come to believe misinformation. At the same time, echo chambers increase polarisation, creating a trade-off with implications for the optimal intervention of a platform wishing to reduce misinformation, which we characterise.
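The paper's formal model is not reproduced in this abstract. Purely as an illustration of the kind of dynamics described, the following is a minimal agent-based sketch: all parameter names, probabilities, and the network-formation rule are hypothetical assumptions, not the authors' specification. Low-trust agents flip opinion less readily when talking to peers but draw on misinformative sources more often, and a homophily parameter controls how strongly the platform matches agents with the same trust level.

```python
import random

def simulate(n_agents=100, share_low_trust=0.5, homophily=0.8,
             p_flip_low=0.2, p_flip_high=0.6,
             p_misinfo_source_low=0.5, p_misinfo_source_high=0.1,
             p_observe_source=0.3, steps=2000, seed=0):
    """Toy dynamics: opinion 0 = accurate belief, 1 = misinformation.

    All parameters are illustrative assumptions, not values from the paper.
    Returns the final share of agents believing misinformation.
    """
    rng = random.Random(seed)
    trust = ['low' if rng.random() < share_low_trust else 'high'
             for _ in range(n_agents)]
    opinion = [0] * n_agents

    # Platform-shaped network: each agent gets 4 links, drawn preferentially
    # from agents with the same trust level (the homophily parameter).
    neighbours = [[] for _ in range(n_agents)]
    for i in range(n_agents):
        same = [j for j in range(n_agents) if j != i and trust[j] == trust[i]]
        other = [j for j in range(n_agents) if j != i and trust[j] != trust[i]]
        for _ in range(4):
            pool = same if (rng.random() < homophily and same) else (other or same)
            neighbours[i].append(rng.choice(pool))

    for _ in range(steps):
        i = rng.randrange(n_agents)
        # Observe an information source; low-trust agents are more likely
        # to land on a misinformative one.
        if rng.random() < p_observe_source:
            p_src = (p_misinfo_source_low if trust[i] == 'low'
                     else p_misinfo_source_high)
            opinion[i] = 1 if rng.random() < p_src else 0
        # Communicate with a random neighbour; low-trust agents are harder
        # to convince out of their current opinion.
        j = rng.choice(neighbours[i])
        p_flip = p_flip_low if trust[i] == 'low' else p_flip_high
        if opinion[j] != opinion[i] and rng.random() < p_flip:
            opinion[i] = opinion[j]

    return sum(opinion) / n_agents

print(simulate())
```

Varying `homophily` in this sketch is one way to explore the abstract's trade-off: stronger sorting insulates high-trust agents from misinformation while segregating the two groups' beliefs.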