The Monthly (Australia)

KILLING OUR MEDIA

The impact of Facebook and the tech giants

BY NICK FEIK

“Today I want to focus on the most important question of all,” wrote Facebook CEO Mark Zuckerberg. “Are we building the world we all want?” The “social infrastructure” built by the company Zuckerberg founded is now regularly used by almost two billion people. His ‘Building Global Community’ essay, which he posted on Facebook in February, is an ode to the virtues of connectivity. Joining up the world and empowering “us”, Facebook is connecting people more regularly and intricately than anything that’s ever come before, and Zuckerberg intends it to become synonymous with human progress.

“Progress now requires humanity coming together not just as cities or nations, but also as a global community.” Reasons for optimism abound. “We had a good start to 2017,” Zuckerberg said in May, on the release of Facebook’s latest financial figures. Total revenue had soared by 49% in the past year and profits topped $1 billion per month.

In April, Melbourne schoolgirl Ariella, 16, joined a Facebook group “16+ hangouts” to chat with other teens. As Rachel Baxendale reported in the Australian, when another member realised Ariella was Jewish, he and more than a dozen other teenagers began abusing her. “All aboard Jew express next stop Auschwitz gassing chambers, I hear there is a lovely shower aboard, Exterminate, Exterminate,” wrote one. “I’ll make u proud,” wrote another. “I’ll f*** her in the gas chambers.”

By the time Ariella left the group 24 hours after joining, she had compiled almost 50 pages of screenshots of abusive messages. She reported the abuse to Facebook.

“Thanks for your report,” Facebook replied. “You did the right thing by letting us know about this. We looked over the profile you reported, and though it doesn’t go against one of our specific community standards, we understand that the profile may still be offensive to you.” Facebook’s Statement of Rights and Responsibilities prohibits “hate speech”, and it has previously said that “while there is no universally accepted definition of hate speech, as a platform we define the term to mean direct and serious attacks on any protected category of people based on their race, ethnicity, national origin, religion, sex, gender, sexual orientation, disability or disease”.

Not that any of that was helpful to Ariella. Was this the world Zuckerberg talked about building?

In May, on the same day Facebook announced its first-quarter earnings, Fairfax Media employees across Australia went on strike after the company decided to cut 125 editorial staff in a bid to save $30 million.

The staff cut was greeted with great public anger and frustration. Most of it was directed at Fairfax management, particularly CEO Greg Hywood, who’d landed a $2.5 million share bonus a couple of months earlier. (Some reports suggested he may have earned as much as $7.2 million in 2016.) Fairfax’s management has not excelled over the past decade, and Hywood was never going to please many outside his family for accepting the bonus while shedding workers. But the problems Fairfax faced were much greater than those stemming from its management, and have been growing for years, just as they have at News Corp, the Guardian and almost every other major news organisation not funded by government.

It’s self-evident that news doesn’t report itself, but the economic model that has traditionally supported quality journalism is mid-collapse. Newspaper revenue has been falling by 5% per year worldwide since 2009, according to Bloomberg. Print circulation has been falling, as has print advertising.

In Australia, newspaper advertising revenue has dropped 40%, to $2.4 billion, in just five years, according to PricewaterhouseCoopers. By contrast, the online advertising market is growing at 25% a year and on various estimates will be worth $6 billion this year. According to Morgan Stanley, Google and Facebook would generate the lion’s share of this, between $4 and $5 billion – around 40% of our total advertising market and rising fast.

Globally, these two tech companies account for approximately half the entire digital advertising market. Estimates vary, but it’s widely accepted that they are picking up 80–90% of all new digital advertising.

By now these trends are reported with a degree of resignation. The leak of advertising to the tech giants seems inexorable. It’s not that readers are deserting the mastheads: the number of people who read them either in print or online has never been higher. It’s simply that “print dollars turned into digital cents”.

The New York Times recently added more digital-only subscriptions than in any quarter in its history: 300,000 for a total of 2.2 million. Yet its advertising revenue in the same quarter fell by 7%, driven by an 18% drop in print advertising.

There hasn’t been a Trump bump in Australia. And major news outlets here don’t have a potential audience of a billion people.

In May, Greg Hywood told the Senate select committee inquiry into the future of public interest journalism that in the “good old days” 85% of newspaper revenue came from advertising and 15% from subscriptions. Now it’s more like 50/50 – and not because of rising subscriptions.

While Facebook, Google and “the internet” may be responsible for the collapse of the traditional media business, blaming them is like holding a shark responsible for biting. Technology was always going to reveal mass-market advertising as a blunt instrument. Printing every single advertisement for a second-hand car, and attempting to distribute this to every single person in the market, may have seemed great at the time, but time makes fools of all of us, especially if we’re Fairfax executives. Spraying ads for holidays to Fiji across the media was never going to be as effective as simply catching those who googled “flights to Fiji”.

Facebook allows advertisers to target consumers by age range, gender, location, education level, political leanings, interests, habits, beliefs, digital activities and purchase behaviour; by what they “like” and share, and who their friends are; by what device they use. It can shoot an advertisement directly into your hand because you’re a middle-aged male who searched online for a hardware product and you’re near the new Bunnings on a Saturday afternoon.

It knows when you’re having anniversaries, when you’re pregnant, when you’re planning a bar mitzvah, and when you’re watching a sad film and might feel like chocolate. What’s more, it’s getting smarter at pegging your interests and vulnerabilities every time you log in. Its natural-language processing and machine-learning algorithms are building a profile based on what you look at and for how long, what your friends shared and what you commented on. Its systems are gauging why you chose to comment on this but not that, and are comparing what you looked at versus what you typed.

According to its own Data Policy, Facebook receives information about your activities on and off Facebook (loyalty cards, mailing lists, browser cookies, receipts, apps, mobile phone permissions and the like) from hundreds of third-party partners. Additionally, “we may share information about you within our family of companies to facilitate, support and integrate their activities and improve our services”. If you’re using Instagram, WhatsApp or Atlas, just to give a few examples, the data belongs to Facebook. Or anyone it chooses to share it with.

It may have been set up with the best of intentions – to build communities – but its corporate aim now is to build an unparalleled and irresistible machine with which to know you and influence you.

“It’s a commercial space; it’s like a shopping mall” is how Greens senator Scott Ludlam, deputy chair of the inquiry into the future of public interest journalism, describes it to me. “[And] the people who use Facebook are the commodities. It’s us that’s being sold to advertisers. I don’t know if that’s really sunk in.”

We’re discussing the implications of a cavalier attitude towards users’ data and privacy.

“I don’t know if I’d even say they’re cavalier with privacy. They’re mining our privacy on a massive scale and that’s the product: that’s what they sell.

“Their values are somewhat arbitrary, and they’re not really contestable because it’s a commercial space. ‘If you don’t like shopping in our shopping mall, you’re free to go sit in the car park.’”

Which might be a reasonable argument were it not for the ubiquity – the platform monopoly – of the online giants, and the impact two companies in particular are having on the rest of society. When it comes to collecting and employing data, they have demonstrated an inconsistent regard for users’ rights.

In 2012, Facebook supported an experiment in which researchers manipulated the News Feed of almost 700,000 users to find out if they could alter people’s emotional states. By hiding certain words from users’ feeds, the researchers discovered, unsurprisingly, that they could. “We show,” researchers announced, “via a massive experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.” The experiment was done without user knowledge or consent. When it became the subject of controversy in 2014, the researchers first claimed that they did have people’s consent, because it was “consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research”. The Facebook data scientist who led the research claimed it was carried out “because we care about the emotional impact of Facebook and the people that use our product”. Finally the company’s chief technology officer apologised, adding that the company had been “unprepared” for the anger it stirred up, which suggests that perhaps it was the backlash rather than the experiment itself that caused remorse.

In 2015, it was revealed that Facebook tracks the web browsing of everyone who visits a page on its site, even if the user doesn’t have an account or has explicitly opted out of tracking, and even after a user has logged out.

The Guardian reported on the research commissioned by a Belgian data protection agency, which argued that Facebook’s data collection processes were unlawful.

“European legislation is really quite clear on this point. To be legally valid, an individual’s consent towards online behavioural advertising must be opt-in,” explained Brendan Van Alsenoy, one of the report’s authors. “Facebook cannot rely on users’ inaction to infer consent. As far as non-users are concerned, Facebook really has no legal basis whatsoever to justify its current tracking practices.”

In May this year, European regulators announced that Facebook was breaking data privacy laws in France, Belgium and the Netherlands, and faces investigations in Spain and Germany. French regulator CNIL announced that it was applying the maximum fine that had been allowed under French privacy law when its investigation began: a grand total of €150,000. CNIL had last year issued an order that Facebook stop tracking non-users’ web activity without their consent, and stop some transfers of personal data to the US.

“We take note of the CNIL’s decision with which we respectfully disagree,” replied Facebook. It has argued that it should only be subject to rulings from the Irish data protection authority because its European headquarters are in Dublin. In Europe, though, new personal data protection regulations will come into force mid next year, potentially allowing regulators to impose fines of up to 4% of Facebook’s revenues.

Also in May, the Australian uncovered a document outlining how the social network can pinpoint “moments when young people need a confidence boost”. By monitoring posts, pictures, interactions and internet activity in real time, Facebook can determine when people as young as 14 feel “stressed”, “defeated”, “overwhelmed”, “anxious”, “nervous”, “stupid”, “silly”, “useless” and a “failure”. The confidential presentation was intended to show how well Facebook knows its users, and by implication how willing it is to use this knowledge on behalf of advertisers. Privacy laws in Australia are nowhere near as stringent as in Europe, and not enforced with any great vigour.

In the US, the Trump administration recently repealed data protection rules, meaning browser histories could be sold to advertisers without user consent. According to research from Princeton University published last year, Google and Facebook together own all of the top ten third-party data collectors.

Not that any of this has so far caused any great public outcry, either here or in the States, perhaps because it all appears to be in the service of giving people exactly what they want. Nothing seems to interest the public less than debates about privacy laws and metadata collection. Until recently, it didn’t seem to be a major issue.

In June 2007, David Stillwell, a PhD student at the University of Cambridge, created a new Facebook app called myPersonality. Volunteer users filled out different psychometric questionnaires, including a handful of psychological questions, and in return received a “personality profile”. They could also opt to share their Facebook profile data with the researchers. Stillwell was soon joined by another researcher, Michal Kosinski, and their project took off. People were happy to share intimate details, their likes and dislikes (both online and off), their age, marital status and place of residence. Before long, the two doctoral candidates owned the largest ever dataset combining Facebook profiles and psychometric scores. In 2012, wrote Hannes Grassegger and Mikael Krogerus in an article for Das Magazin and Motherboard, Kosinski proved that,

on the basis of an average of 68 Facebook “likes” by a user, it was possible to predict their skin colour (with 95 percent accuracy), their sexual orientation (88 percent accuracy), and their affiliation to the Democratic or Republican party (85 percent) … Seventy “likes” were enough to outdo what a person’s friends knew, 150 what their parents knew, and 300 “likes” what their partner knew. More “likes” could even surpass what a person thought they knew about themselves.

On the day Kosinski published these findings, “he received two phone calls”, reported Grassegger and Krogerus. “The threat of a lawsuit and a job offer. Both from Facebook.” Shortly afterwards, Facebook made “likes” private by default. The personal information users put on Facebook had always been owned by the company, to analyse or sell, but what it was worth was only just becoming clear. (It has a long history of changing privacy settings without much notice or explanation.)

Facebook wasn’t the only one to register the potential of this tool. A young assistant professor from the Cambridge psychology department, Aleksandr Kogan, soon approached Kosinski on behalf of another company that was interested in the myPersonality database. Kogan initially refused to divulge the name of this company, or why and how it planned to use the information, but eventually revealed it was Strategic Communication Laboratories. SCL is a communications group whose “election management agency” does marketing based on psychological modelling; its offshoots, one of which was named Cambridge Analytica, had been involved in dozens of election campaigns around the world. The company’s ownership structure was opaque, and Kosinski, who by this stage had become deeply suspicious of its motives, eventually broke off contact.

Kosinski was therefore dismayed, if not altogether surprised, to learn of Cambridge Analytica’s role in last year’s election of Donald Trump.

“We are thrilled that our revolutionary approach to data-driven communication has played such an integral part in President-elect Trump’s extraordinary win,” said Cambridge Analytica’s 41-year-old CEO, Alexander Nix, in a press release.

In September 2016, speaking at the Concordia Summit in New York, Nix had explained how Cambridge Analytica acquires massive amounts of personal information (legally) – from shopping data to bonus cards, club memberships to land registries, along with Facebook information and other online data – and combines it (including phone numbers and addresses) with party electoral rolls into personality profiles.

“We have profiled the personality of every adult in the United States of America – 220 million people,” Nix boasted. According to Mattathias Schwartz, writing in the Intercept, Kogan and another SCL affiliate paid 100,000 people a dollar or two to fill out an online survey and download an app that gave them access to the profiles of their unwitting Facebook friends, including their “likes” and contact lists. Data was also obtained from a further 185,000 survey participants via a different unnamed company, yielding 30 million usable profiles. No one in this larger group of 30 million knew that their Facebook profile was being harvested.

It doesn’t take a great deal of imagination to see how useful this could be to the Trump campaign. As Grassegger and Krogerus reported:

On the day of the third presidential debate between Trump and Clinton, Trump’s team tested 175,000 different ad variations for his arguments, in order to find the right versions above all via Facebook. The messages differed for the most part only in microscopic details, in order to target the recipients in the optimal psychological way: different headings, colours, captions, with a photo or video.

The Trump campaign, heavily outspent by the Clinton campaign in television, radio and print, relied almost entirely on a digital marketing strategy.

“We can address villages or apartment blocks in a targeted way,” Nix claimed. “Even individuals.” Advertising messages could be tailored, for instance, to poor and angry white people with racist tendencies, living in rust-belt districts. These would be invisible to anyone but the end user, leaving the process open to abuse.

In February, the communications director of Brexit’s Leave.EU campaign team revealed the role Cambridge Analytica had played. The company, reported the Guardian, “had taught [the campaign] how to build profiles, how to target people and how to scoop up masses of data from people’s Facebook profiles”. The official Vote Leave campaign, Leave.EU’s rival, reportedly spent 98% of its £6.8 million budget on digital media (and most of that on Facebook).

Trump’s chief strategist, Stephen Bannon, was once a board member of Cambridge Analytica. The company is owned in large part by Robert Mercer (up to 90%, according to the Guardian), whose money enabled Bannon to fund the right-wing news site Breitbart, and who funds climate-change denial think tank the Heartland Institute.

The critical point is not that wealthy conservatives may be manipulating politics – this is hardly new – but that politics has become so vulnerable to covert manipulation, on a scale never before experienced.

There is good reason for the strict regulations around the world on the use and abuse of the media in election campaigns, yet governments have almost completely abrogated responsibility when it comes to social media.

According to Labor senator Sam Dastyari, chair of the future of public interest journalism inquiry, Australia’s security agencies “are very clear that [deliberately misleading news and information] is a real and serious threat … We would be very naive to believe it’s not going to happen here.”

A BuzzFeed News analysis found that in the three months before the US election the top 20 fake-news stories on Facebook generated more engagement (shares, reactions and comments) than the top 20 real-news stories. The Pope endorsed Donald Trump. An FBI agent suspected of leaking Hillary Clinton’s email was FOUND DEAD IN APPARENT MURDER-SUICIDE. In other news, WikiLeaks confirmed that Clinton had sold weapons to ISIS, and Donald Trump dispatched his personal plane to save 200 starving marines.

Facebook’s algorithm, designed to engage people, had simply given Americans what they wanted to read.

The criticism was heated and widespread, prompting Mark Zuckerberg’s ‘Building Global Community’ essay.

Sure, there are “areas where technology and social media can contribute to divisiveness and isolation”, Zuckerberg wrote, and there are “people left behind by globalization, and movements for withdrawing from global connection”, but his answer to these problems was consistent and uniform: people need to be more connected (on Facebook). His promises to build a better network – to counter misinformation, for example, in a veiled reference to the US election campaign, or filter out abuse – rely to some extent on our goodwill and credulity. We’re denied access to Facebook’s internal workings, and that’s as Zuckerberg intends it. Which is within his right as chairman and CEO of a business. But a network this large, this influential, this secretive is more than a business. In many ways, it’s a test of our belief in the market.

Promises to fix problems ranged from introducing a different system for flagging false content to working with outside fact-checking outfits. Perhaps changes were also made to the News Feed algorithm, but if so they remained confidential. How would the new community standards be applied? Would Facebook ever make changes that crimp its business prospects? Does it accept that social obligations come with such editorial decisions? All this remained obscure, notwithstanding a mess of corporate nonsense posted by both Zuckerberg and company PR figures.

The News Feed algorithm works like this: in order to engage you, it chooses the “best” content out of several thousand potential stories that could appear in your feed each day. The stories are ranked in order of perceived importance to you (Your best friend’s having a party! Trump has bombed North Korea again!), and the News Feed prioritises stories you’ll like, comment on, share, click on, and spend time reading. It recognises who posted things and their proximity to you, how other people responded to the post, what type of post it is and when it was posted.
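To make the mechanics concrete, here is a minimal, purely illustrative sketch in Python of the kind of engagement-weighted ranking described above. The feature names, weights and decay factor are invented for illustration; Facebook’s actual model is proprietary, machine-learned and far more elaborate.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author_affinity: float      # closeness to the poster, 0-1 (hypothetical feature)
    predicted_like: float       # predicted probability you'll like it, 0-1
    predicted_comment: float    # predicted probability you'll comment, 0-1
    predicted_share: float      # predicted probability you'll share it, 0-1
    expected_dwell_secs: float  # predicted time you'd spend on it
    is_video: bool
    posted_at: datetime

def score(post: Post, now: datetime) -> float:
    """Toy engagement score: a weighted sum of predicted interactions,
    boosted for close connections and favoured formats, decayed with age."""
    engagement = (
        1.0 * post.predicted_like
        + 2.0 * post.predicted_comment      # comments weighted above likes
        + 3.0 * post.predicted_share        # shares weighted highest
        + 0.01 * post.expected_dwell_secs   # time spent reading also counts
    )
    engagement *= 1.0 + 0.5 * post.author_affinity  # friends and family rank higher
    if post.is_video:
        engagement *= 1.2                   # illustrative boost for video
    hours_old = (now - post.posted_at).total_seconds() / 3600
    return engagement / (1.0 + 0.1 * hours_old)      # fresher posts win

def rank_feed(candidates: list[Post], now: datetime, limit: int = 300) -> list[Post]:
    """Order thousands of candidate stories and keep the top few hundred."""
    return sorted(candidates, key=lambda p: score(p, now), reverse=True)[:limit]
```

The only point of the sketch is that every input is a proxy for engagement, which is why favoured formats such as video, and paid “boosting”, can so readily change what surfaces.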

As TechCrunch writer Josh Constine puts it, “The more engaging the content … the better it can accomplish its mission of connecting people while also earning revenue from ads shown in News Feed.”

Over time, as millions have joined Facebook, the number of potential posts that might populate a feed has multiplied, so the algorithm has become not only increasingly necessary to prevent users from drowning in “content” but also increasingly subject to human design.

Despite this, Facebook insists it’s not a media organisation. It’s a technology company and a neutral platform for other people’s content. It is certainly true that it piggybacks on other companies’ content. But it is also constantly testing, surveying and altering its algorithms, and the changes have vast effects. Kurt Gessler, deputy editor for digital news at the Chicago Tribune, started noticing significant changes in January, and three months later wrote a post about them, titled ‘Facebook’s algorithm isn’t surfacing one-third of our posts. And it’s getting worse’, on Medium. The Tribune’s number of posts hadn’t changed over time, nor had the type of posts. The newspaper had a steadily rising number of Facebook fans but the average post reach had fallen precipitously. “So,” he asked, “is anyone else experiencing this situation, and if so, does anyone know why and how to compensate? Because if 1 of 3 Facebook posts isn’t going to be surfaced by the algorithm to a significant degree, that would change how we play the game.”

Facebook has made it clear that it has been increasingly giving priority to videos in its News Feed. Videos and mobile ads are, not coincidentally, the very things driving Facebook’s revenue growth. It has also been rewarding publishers that post directly to Facebook instead of posting links back to their own sites. None of which bodes well for the Chicago Tribune.

There is one way to guarantee your articles will be surfaced by Facebook: by paying Facebook. As every social media editor knows, “boosting” a post with dollars is the surest way to push it up the News Feed. Greg Hywood and HuffPost Australia editor-in-chief Tory Maguire pointed out to the Senate’s future of public interest journalism inquiry that even the ABC pays Google and Facebook to promote its content. “Traffic is dollars,” said Hywood, “and if the ABC takes traffic from us by using taxpayers’ money to drive that traffic, it’s using taxpayers’ money to disadvantage commercial media organisations.”

“This is normal marketing behaviour in the digital space,” replied the ABC, “and is critical to ensuring audiences find relevant content. It is [also] used by other public broadcasters like the BBC and CBC.” As well as, needless to say, thousands of other media organisations, including Fairfax and Schwartz Media.

This is what passes for normal marketing behaviour in 2017: news organisations, haemorrhaging under the costs of producing news while losing advertising, are paying the very outfits that are killing them. Could there be a more direct expression of the twisted relationship between them? Could the power balance be any more skewed?

Mark Thompson, CEO of the New York Times Company, recently put it like this: “Advertising revenue goes principally to those who control platforms.” Over time, this will mean they also control the fate of most news organisations. So it is somewhat troubling that one of the few ways to keep a check on the power of Facebook is by maintaining a robust fourth estate.

It was on social media that I stumbled across Rachel Baxendale’s Australian article about the anti-Semitic abuse directed at Ariella on social media. It was also from social media that I acquired Baxendale’s contact details, to ask her about the article.

Baxendale had heard of the story through a contact, Dr Dvir Abramovich, chairman of the B’nai B’rith Anti-Defamation Commission. Abramovich had verified the story himself, and Baxendale then went back and forth with the schoolgirl over several days, checking details, looking at the screenshots of the abuse, discussing whether to use a pseudonym and so forth. Baxendale had explained the story to her bureau chief, and it then went to the Australian’s main editorial team in Sydney for approval. It was run past the legal team and then sub-editors, and Facebook was approached for comment. All of this is routine at a newspaper. If anyone has a complaint, it can be taken to the Australian Press Council, which will study it impartially before making a public ruling. Or readers can, of course, get in contact with the journalist in question or her editors.

By contrast, Facebook has a single email address for all global media enquiries, and its moderators had moments to deal with the matter of Ariella’s abuse. There were 50 pages of screenshot­s.

The Guardian reported in May it had seen more than 100 internal training manuals, spreadsheets and flowcharts used by Facebook in moderating controversial user posts. The Guardian also revealed that for almost two billion users and more than 100 million pieces of content to review per month (according to Zuckerberg) there were just 4500 moderators. That is one per 440,000+ users. Most work for subcontractors around the world; Facebook won’t divulge where.

They are trained for two weeks, paid little, and often have “just 10 seconds” to cast judgement on issues involving child abuse, suicide, animal cruelty, racial discrimination, revenge porn and terrorist threats, and must balance these against the desire to respect freedom of speech. Fake news? Fact-checking would be impossible. To help them, the moderators are provided with instruction manuals, which contain guidelines for dealing with matters from threats and specific issues to live broadcasts and image censorship. Facebook formulates country-specific materials to comply with national laws, and Zuckerberg often refers to the company’s attempts to follow community standards, but he really means Facebook’s Community Standards, which it determines. (“You can host neo-Nazi content on Facebook but you can’t show a nipple” is Scott Ludlam’s shorthand characterisation of these standards.) Examples of this guidance, relayed by the Guardian, give some sense of the impossibility of the moderators’ task:

Remarks such as “Someone shoot Trump” should be deleted, because as a head of state he is in a protected category. But it can be permissibl­e to say: “To snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat”, or “fuck off and die” because they are not regarded as credible threats.

Videos of violent deaths, while marked as disturbing, do not always have to be deleted because they can help create awareness of issues such as mental illness.

Some photos of non-sexual physical abuse and bullying of children do not have to be deleted or “actioned” unless there is a sadistic or celebratory element.

Photos of animal abuse can be shared, with only extremely upsetting imagery to be marked as “disturbing” …

Videos of abortions are allowed, as long as there is no nudity.

Facebook will allow people to livestream attempts to self-harm because it “doesn’t want to censor or punish people in distress”.

In December 2015, Facebook gave a commitment to the German government that it would remove criminal hate speech from the platform within 24 hours; however, a year-long German government study reported by the New York Times recently found that in some months Facebook managed to delete only 39% in the time frame sought by the German authorities, and that its performance had been getting worse in recent months. In March, the German government proposed legislation in this area, with the threat of fines up to €50 million.

Facebook had already announced plans to hire an additional 3000 moderators, and Monika Bickert, Facebook’s head of global policy management, told the Guardian, “We feel responsible to our community to keep them safe and we feel very accountable. It’s absolutely our responsibility to keep on top of it.” As ever, the community will have to take their word for it – and rely on unauthorised leaks to the media for the details.

The inquiry into the future of public interest journalism, driven by senators Sam Dastyari, Scott Ludlam, Nick Xenophon and Jacqui Lambie, was set up to examine “the impact of search engines, social media and disinformation on journalism in Australia”. At public hearings in May, a parade of speakers explained the adverse effects.

Union representative Paul Murphy explained that 2500 journalism jobs had disappeared in Australia since 2011, and that pay rates for freelancers had also declined significantly. The inquiry heard examples of regulations that applied to the Australian media but not to the tech companies, such as those relating to local content and media ownership, and heard time and again about the lack of tax paid by the tech companies. Local media had obligations, social, legal and cultural; Facebook and Google traded on the local media’s content, smashed their business models in the process, and gave little back to the community in return. How to remedy this is what the inquiry was set up to explore.

Facebook reported that it earned $326.9 million of revenue in Australia in 2016. Google reported revenue of $882 million last year. But these figures radically underrepresent the total amount they collect from Australia, which is widely regarded as at least three or four times larger. For 2016, Facebook and Google reported tax bills of $3.3 million and $16.6 million respectively. A pittance. (Perhaps Australia should count itself lucky: in the UK in 2014, Facebook paid £4327 in tax, less than what the average worker paid.)

“Until this year,” wrote journalist Michael West, “Google and Facebook entertained a corporate structure that booked the billions of dollars of revenue they made in Australia directly offshore.” Facebook sales were booked to an associated entity in Ireland, and referred to as the “purchase of advertising inventory”. Now, Facebook has declared itself to be a reseller of local advertising inventory, and the federal government has declared that its new Diverted Profits Tax – the “Google tax” – will reap billions of extra dollars in revenue from multinationals over coming years, though such projections generally rely on multinationals not altering their tax arrangements in response, and not fighting them out in court for the next decade.

Peter Fray, Professor of Journalism Practice at the University of Technology Sydney, summed up the problem for journalism: “There is no doubt there are issues around tax for Google and Facebook and they should pay their fair share, but I cannot see how publishers, journalists or politicians can blame Google and Facebook for the fact that digital revenue streams did not, do not and will not replace those of print or that in digital environments audiences have multiple choices for content on demand 24/7.”

Put a different way, how could extracting a reasonable amount of tax from Google and Facebook save local journalism and a collapsing business model?

Only by funnelling that tax revenue into journalism. For which no mechanisms yet exist, and to which the objections are obvious. Historically, there’s been ample reason to fear government involvement in private media, and little reason to propose or support it. But how quickly things have changed.

“The economic model no longer works,” Sam Dastyari tells me. “So either government intervenes and finds a way to support it – without going so far as to tip the scales of what is and isn’t journalism – or we let it die. There is no third option.”

Independent senator Nick Xenophon is wary of the term “intervention” when it comes to a government response; nevertheless, he agrees that “doing nothing is not an option”.

“It’s more a case of government levelling a playing field which has been tipped into a state of imbalance and dysfunction by the advent of disruptors,” he clarifies.

“This is not like the horse-and-buggy and automobile argument of 120 years ago. This is a case where they are piggybacking off traditional media to make a quid. And that should be reflected in some way in a compensatory mechanism.

“I don’t want us to end up in a world where we just have so-called citizen journalists and a whole range of bloggers, where there are no standards, where anything goes.”

Scott Ludlam concurs. “There is a public policy role here … because the market’s wiping these [media] entities out. The market couldn’t give a shit whether there’s strong and independent and diverse journalism going on in a society.”

“I don’t think we’ll have general daily newspapers on a weekday within the next two or three years,” the Greens senator adds. The loss of dedicated professional reporters in health, education, state politics, arts, science, environment or social affairs will have incalculable effects.

Do we still have a democracy capable of creatively moderating the worst effects of the market? With just a few exceptions, governments have shown little inclination to take on corporate interests to protect civil society, or to intervene to prevent market oligopolies.

Looking at the operations of Facebook around the world, you could easily conclude that Zuckerberg makes the rules.

Peter Fray, also a former editor-in-chief and publisher of the Sydney Morning Herald and editor of the Canberra Times and the Sunday Age, warned the Senate inquiry that media independence would be at stake under a direct subsidy model, where payments go straight from government to the media. This is a concern shared by the senators, who each stress to me the importance of maintaining the independence and diversity of the media.

“It’s not an attempt by the state to control the media,” says Ludlam. “It’s a genuine inquiry into how we can support [it].”

One of the key proposals being canvassed is a levy on Facebook and Google, which would be used to pay for public interest journalism. Dastyari, Xenophon, Ludlam and Jacqui Lambie are all open to this idea. (Liberal senator James Paterson, also a member of the select committee, is not.)

“That would free up millions of dollars to further public interest journalism,” says Xenophon.

How the proceeds of such a levy might be disbursed will be the subject of considerable debate in coming months. One method that’s been raised is a European-style grants council, in which a panel appointed by government decides how money is allocated. However, there are already reservations among the committee members. “Who determines the grants?” asks Dastyari. “Why do you get a grant and someone else doesn’t? How independent is it?”

The Labor senator believes the tax system is the best way of supporting the industry, whether it’s tax breaks for individuals buying news subscriptions or those who make donations or other investments in journalism. The fact that Xenophon, Ludlam and Paterson are also open to such an idea implies broad cross-party acceptance.

“What we’re talking about,” Dastyari adds, “is actually quite a radical rethink of the role of government as it comes to journalism.”

Xenophon is also keen to explore what he calls the “copyright approach” to supporting media, via the Competition and Consumer Act. He says a formula should be developed by which the use and sharing of intellectual property is valued and compensated fairly.

The obvious challenge all are weighing up is how to define public interest journalism. “Where do you draw the line?” asks Dastyari. “Is food blogging journalism?” Is opinion a form of journalism? Television news and current affairs?

“I think the organisations likely to take advantage of it are not necessarily the ones that most advocates have in mind,” Paterson also warns. “I suspect the first people to seek tax-deductible funds to fund a news service are those seeking to promote a particular world view. On the left, I’d expect to see a GetUp! News, Greenpeace News and maybe Asylum Seeker Resource Centre News. On the right I think we’d see an Institute of Public Affairs News, Australian Christian Lobby News and Business Council of Australia News.”

Any definition, emphasises Xenophon, will need to ensure that an organisation’s dominant purpose is to provide news and opinion and that it’s not the arm of an advocacy organisation.

The committee has until December to publish its final report, but it’s already clear that these are the key questions exercising the minds of committee members – not whether public interest journalism needs saving, or even whether government should play a role.

“I suspect we’re going to end up with a whole menu of things that ideally could work well together,” says Ludlam.

Regardless of what happens next, it is a remarkable shift in public debate.

The talk about the collapse of newsrooms in Australia has until recently tended to focus on newspapers and magazines. It’s becoming obvious that television broadcasters, both free-to-air and cable, are under major pressure too. In mid June, Channel Ten went into administration soon after reporting a $232 million loss driven by flagging advertising revenue.

The federal government promises to overhaul media ownership laws and scrap some of the constraints on the big media companies, leading to more concentration and presumably greater efficiencies for them. But it voted against the establishment of the Senate’s journalism inquiry and shows little enthusiasm for reining in the tech giants. The task of protecting the diversity of the Australian news media remains beyond the scope of its ambition. For the moment, it’s up to others.

“History,” writes Zuckerberg portentously, “is the story of how we’ve learned to come together in ever greater numbers – from tribes to cities to nations. At each step, we built social infrastructure like communities, media and governments to empower us to achieve things we couldn’t on our own.”

I return to the Facebook page to read Zuckerberg’s essay one last time, and a message pops up on my normally dormant account. Citing security concerns and unusual activity, it requests that I prove – with photo identification – that I am who I say I am. I hesitate.

“Today we are close to taking our next step.”

And as Zuckerberg runs through touted improvements and inspirational innovations, ever so casually he drops this: “Research suggests the best solutions for improving discourse may come from getting to know each other as whole people instead of just opinions – something Facebook may be uniquely suited to do.” It’s so neatly incorporated you barely notice it. We “whole people” and Facebook are suddenly indivisible.
