Editorial roundup
March 21 The Charleston Gazette
Tackling the opioid crisis
The Trump administration’s latest proposal for tackling the opioid crisis is not devoid of good ideas. It’s certainly got more meat on its bones than the president’s declaration of the crisis as a public health emergency last fall.
There was talk, though still too few specifics, of increasing treatment options for addicts and of new limits on prescriptions of oxycodone, hydrocodone and other opioids.
But Trump spent the days leading up to his announcement touting the idea of the death penalty for opioid dealers, and his Monday speech focused (as much as it focused on anything) on similar “tough on crime” policies.
The problem is, those policies don’t get good results. America — well, much of it — has learned from previous drug epidemics that such tough-talk tactics aren’t the solution. By all means, prosecute drug dealers and interrupt their trade routes. But if too much emphasis is on chucking people in jail, let alone executing them, America will just raise another generation of damaged, hopeless families.
Many law enforcement officers and politicians in this area openly acknowledged this a few years ago — a welcome departure from the years when they were afraid of being painted as “weak on crime.” Now, at least at the federal level, that mindset has returned.
At least the plan released by the administration Monday merely referred to increased use of the death penalty under existing law, rather than expanding the situations in which it might be used.
No word if the death penalty for opioid dealers would extend to the CEOs of companies that shipped millions of unnecessary painkillers to West Virginia, resulting in hundreds of deaths. No word if the doctors who overprescribed opioids, or the pharmacists who ran “pill mills,” would face execution.
Trump also claimed Monday that his beloved wall at the Mexican border would stop the flow of heroin into this country. Experts are extremely skeptical of that claim, to put it mildly.
Even the better parts of Trump’s plan have problems. Cutting prescriptions for painkillers sounds good, and West Virginia lawmakers have already taken steps in that direction. They passed a bill (SB 273) earlier this month that would limit many initial opioid prescriptions. Gov. Jim Justice asked for that bill, so even though he hadn’t signed it as of Tuesday afternoon, he almost certainly will.
That’s useful, but even though prescription painkillers kicked off this epidemic, heroin and fentanyl have eclipsed them in recent years, partially because those addicted to pills had to find other ways to feed their addiction once the pills became harder to get. And chronic pain sufferers who legitimately need opioids will likely find them harder to get under such limits.
The plan for increasing treatment for opioid addicts is still more undefined — as is how any treatment increase would be paid for.
A budget passed by Congress calls for $6 billion in spending on the opioid crisis over the next two years, but public health officials say that’s a drop in the bucket. Trump’s budget proposal calls for an additional $7 billion, which would be two or three drops in the bucket.
One specific goal from the Trump plan — to have Congress repeal a law that allows large treatment facilities to get Medicaid reimbursement — wouldn’t help West Virginia, because the state already has federal permission to waive that rule.
And speaking of Medicaid, many people in West Virginia who get treatment for their opioid addictions do so through the state's Medicaid program, expanded under the Affordable Care Act, the very law Trump and his GOP cronies keep trying to cut.
There are no easy answers for this huge and disastrous crisis. But as long as the federal government focuses on punishment rather than treatment and recovery, the problem will only get bigger.
March 20 Los Angeles Times
Privacy abuses
Reports surfaced this weekend about yet another Facebook-fueled abuse of privacy, this time by an outside company trying to manipulate voters on behalf of political causes and candidates — including Donald J. Trump in 2016. The revelations were both sadly familiar and newly outrageous.
According to the Guardian and the New York Times, a Cambridge University researcher named Aleksandr Kogan produced an innocuous-looking personality testing app for Facebook whose real purpose was to identify the sorts of marketing pitches one might be susceptible to — ones that played to people's anxieties, for example, or to their sentiments. He then gathered data not just from the roughly 270,000 people who used the app, but from tens of millions of their Facebook friends, all without the friends' knowledge or consent, according to the news articles.
The story gets worse, however. Kogan reportedly turned over the Facebook data he had harvested to a political consulting firm, Cambridge Analytica, to help it build profiles that it could use to sway voters on a massive scale. The messages could be tailored precisely to the weaknesses of a narrow group of voters, and each pitch could be confined to a single group to avoid putting off voters with different sensibilities.
As Christopher Wylie, who worked at Cambridge Analytica from its founding until 2014, told the Guardian, “We would know what kinds of messages you would be susceptible to, and where you’re going to consume that. And then how many times do we need to touch you with that in order to change how you think about something.”
Political campaigns and commercial advertisers have long sought to target their pitches; what’s different now is how the internet and enormously popular platforms such as Facebook make the process easier, more effective, wider-scale and far more intrusive. Cambridge Analytica has denied using Facebook data improperly, but news reports over the weekend strongly suggest that the firm built its profiles at least in part with data Facebook users didn’t realize they were sharing with it.
Kogan insists that he, too, did nothing wrong, and that’s one of the most disturbing elements of the story. One reason he could acquire all that data about his app users’ Facebook friends, including their posts and their likes, is because Facebook opened that information to application developers in 2011. The company rolled back access to information about app users’ friends in 2015, long after Kogan harvested and shared that data.
Facebook says that when it first learned about Kogan’s data sharing in 2015, it instructed his company and Cambridge Analytica to destroy the harvested data. If the latest news reports are true, a Facebook executive told the Guardian, “it’s a serious abuse” of the company’s rules.
That’s little comfort to the millions of people whose marketing susceptibilities have now been cataloged with Facebook’s help. And this is just the latest in a long line of privacy problems at the social network. In fact, the Federal Trade Commission entered into a settlement with the company in 2011 over misleading or false disclosures to users about app developers’ access to data, among other issues.
Whether the consent decree that Facebook signed in 2011 will have any bearing on this latest privacy affront remains to be seen. Regardless, the fundamental problem here is that companies keep finding new and unanticipated ways to use the personal information people share online — including the digital bread crumbs they leave unwittingly as they wander around the web. Are you comfortable knowing that companies are scooping up your crumbs, your “likes,” your tweets, your status updates and your lists of friends to determine whether to offer you discounts? To decide which news items to show you? To determine how best to sell you on a controversial presidential candidate or a costly ballot measure?
The largely hands-off approach taken by regulators to date has encouraged bountiful innovation and experimentation, but it’s also reinforced an act-first, ask-forgiveness-later mentality among entrepreneurs. The only guidelines are the vaguely worded prohibitions in federal law against unfair and deceptive practices; rather than trying to adopt comprehensive guidelines, the Federal Trade Commission has handled complaints on a case-by-case basis.
It’s not enough. Internet users need some measure of control over the information they reveal online, so that they are not unwittingly helping countless unseen data brokers, aggregators and analysts find new and better ways to steer their opinions, purchases and votes. A federal bill of rights covering online data would be a good place to start. How many outrages have to surface before the tech industry and the federal government start taking privacy seriously?