Social media forensics
that the one holding it has no other account, and the use of a pseudonym is for legitimate purposes, such as for security reasons. Real accounts can perform all the tasks, from posting to following, liking, commenting and sharing.
Cloned, farmed or manufactured accounts are duplicate accounts, usually operated by people other than the persons represented in the account. A real person can have several clones or duplicates. A click farm is a facility that cultivates or manufactures these cloned accounts and sells their services to clients, offering follows, likes, comments and shares depending on the price. Cloned, farmed or manufactured accounts are fake accounts because, while they are technically organic in that they are operated by real people, they are patently inauthentic since they do not correspond to real people.
Programmed accounts, otherwise known as bots, are the fakest of all accounts since they are not only inauthentic but also inorganic. They are machine-generated and therefore do not represent real persons. For a fee, one can avail of the services of knowledgeable people to generate fake accounts that are programmed to follow, like and even comment on someone’s posts, albeit in short prefabricated prose. However, bots cannot share.
The social presence of an FB account is usually measured by its reach and engagement. Reach refers to the number of people who have seen a post. Engagement, on the other hand, refers to interactions that go beyond viewing a post, and includes likes, comments and shares. Reach and engagement are the main parameters used in FB analytics. Engagement can be broken down into average shares per post, average comments per post, average interactions per post and interaction rate, among others.
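The engagement metrics listed above can be computed directly from per-post counts. The following is a minimal sketch with entirely hypothetical numbers, not actual FB analytics output:

```python
# Sketch: basic engagement metrics from hypothetical per-post data.
posts = [
    {"reach": 1200, "likes": 90,  "comments": 12, "shares": 30},
    {"reach": 1500, "likes": 110, "comments": 8,  "shares": 25},
    {"reach": 900,  "likes": 60,  "comments": 5,  "shares": 10},
]

n = len(posts)
avg_shares_per_post = sum(p["shares"] for p in posts) / n
avg_comments_per_post = sum(p["comments"] for p in posts) / n
avg_interactions_per_post = sum(
    p["likes"] + p["comments"] + p["shares"] for p in posts
) / n
# Interaction rate: interactions as a fraction of reach, averaged across posts.
interaction_rate = sum(
    (p["likes"] + p["comments"] + p["shares"]) / p["reach"] for p in posts
) / n
```

The same arithmetic scales to any number of posts pulled from an account's timeline.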
Taken separately, however, reach and engagement are not reliable measures of organicity and authenticity, since even cloned, farmed or manufactured accounts and bots form part of the reach and engagement. Click farms provide services that can increase engagement, even if these interactions are purchased and inauthentic. Bots can be programmed to like and comment.
Furthermore, reach and engagement do not in themselves measure approval or acceptability, considering that reach can include people who saw a post but hated it, and engagement includes people who troll an account or share a post to other walls precisely so it can be bashed there.
A relational analysis between reach and engagement can, however, reveal initial signs of fake followers. A high level of reach accompanied by uncharacteristically low levels of engagement, or reach that keeps increasing while engagement decreases, is an early warning sign. This measure alone is not enough, though, considering that real people who are part of an account's reach may not necessarily like, comment or even share. In fact, purchased clones and programmed bots are more likely to like and comment than real accounts, precisely because they are paid to do so.
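The diverging-trend warning sign described above can be sketched as a simple check over consecutive posts. The figures here are hypothetical, and a real analysis would of course work over a longer, noisier time series:

```python
# Sketch: flagging reach that rises while engagement falls (hypothetical data).
reach = [1000, 1400, 1900, 2600, 3400]   # steadily increasing across posts
engagement = [120, 115, 100, 90, 70]     # steadily decreasing across posts

def trend(xs):
    """Return +1 for a strictly increasing series, -1 for decreasing, 0 otherwise."""
    if all(b > a for a, b in zip(xs, xs[1:])):
        return 1
    if all(b < a for a, b in zip(xs, xs[1:])):
        return -1
    return 0

# Initial warning sign only: rising reach paired with falling engagement.
suspicious = trend(reach) == 1 and trend(engagement) == -1
```

As the text notes, this flag on its own proves nothing; it only marks an account for the deeper like-versus-share analysis that follows.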
Thus, the clue is not in the relationship between reach and engagement, but between engagement-related factors such as likes and shares.
Liking and sharing are two kinds of actions that are predictably related to each other. While it is not automatic that a liked post will also be shared, it is reasonable to assume that the normal overall pattern is for a higher number of likes to translate into a correspondingly higher number of shares. A statistical index is available for this, namely Pearson’s r, or the linear correlation coefficient, which measures the strength of the relationship. A weak correlation, when accompanied by a low number of shares relative to the number of likes, can then be interpreted as a possible indicator of a high number of fake likes.
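Pearson's r is straightforward to compute from per-post like and share counts. A minimal sketch, using hypothetical numbers in which shares roughly track likes, as organic behavior would:

```python
import math

def pearson_r(xs, ys):
    """Pearson's r: strength of the linear relationship between two series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-post counts for one account.
likes  = [100, 250, 400, 90, 300]
shares = [30, 70, 110, 25, 85]

r = pearson_r(likes, shares)            # near 1: shares track likes closely
share_ratio = sum(shares) / sum(likes)  # shares as a fraction of likes
```

For an organic account, r close to 1 together with a healthy share-to-like ratio is the expected baseline against which suspicious accounts are compared.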
After all, the acts of liking and sharing are the ones that differentiate a real account from fake ones. Real accounts can like and share. Cloned accounts purchased from click farms can like and share, but these come not as variable authentic behavior but as a fixed number based on the purchased package, and would behave differently from organic and authentic likes and shares. Manufactured bots can like but cannot share, and likewise behave inorganically and inauthentically because they are machine-generated.
Thus, bots lower the percentage of shares relative to likes simply because they cannot share. Furthermore, both bots and clones behave in non-random ways, simply because they are either programmed or purchased as a block, and would therefore weaken the computed correlation index even when they are lumped together with organic and authentic likes and shares.
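This weakening effect can be illustrated numerically. In the hypothetical scenario below, a purchased block of bot likes lands on posts chosen by the client rather than by how appealing the posts actually are, so the added likes do not track organic shares; the helper function is repeated from standard statistics to keep the sketch self-contained:

```python
import math

def pearson_r(xs, ys):
    """Pearson's r between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical organic behavior: shares roughly track likes.
organic_likes  = [100, 250, 400, 90, 300]
organic_shares = [30, 70, 110, 25, 85]

# A purchased block of 500 bot likes lands on two posts picked by the
# client, independent of each post's actual appeal; bots add no shares.
bot_boost     = [500, 0, 0, 500, 0]
padded_likes  = [l + b for l, b in zip(organic_likes, bot_boost)]
padded_shares = organic_shares  # bots cannot share

r_organic = pearson_r(organic_likes, organic_shares)
r_padded  = pearson_r(padded_likes, padded_shares)
ratio_organic = sum(organic_shares) / sum(organic_likes)
ratio_padded  = sum(padded_shares) / sum(padded_likes)
# The padded series shows both a weaker correlation and a lower share-to-like ratio.
```

The two forensic signals thus move together: the bot block drags the correlation down and dilutes the share-to-like ratio at the same time.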
Indeed, numbers do not lie; only people do. Inasmuch as real people farm, manufacture or program fake accounts, their inorganic and inauthentic behavior will eventually be detected through a forensic analysis of the numbers.
Electoral fraud exists when votes are manufactured.
Social media fraud exists when reach and engagement are farmed, manufactured or programmed.
Both acts are to be condemned because they maliciously game democracy.