Biden’s $12 billion for women’s health should be just a start
On Monday, President Joe Biden signed an executive order that will create a $12 billion fund to improve our understanding of — and ideally treatments for — women’s health. It’s a welcome, if egregiously belated, investment by the U.S. government. And although $12 billion sounds like a lot of money, there’s a lot of catching up to do.
For example, one analysis found that conditions that overwhelmingly affect women, like migraines, headaches, endometriosis, anxiety disorders and chronic fatigue syndrome, are severely underfunded compared to conditions that predominantly affect men. (Anyone following the long COVID story knows that condition could easily be added to this list.)
Researchers call this the health gap, and it has serious societal and economic consequences: A recent report from McKinsey & Company found that reducing the time women spend in poor health by 25% could be worth $1 trillion, in large part because health disparities disproportionately hit women during their working years.
The funds allocated by this executive order, which cut across a wide swath of agencies and areas of health, begin to address the problem. The next step will be for Congress to approve Biden’s larger budget for 2025, thereby funding the order. The ultimate test will be whether foundational research in women’s health can attract more interest from industry, which has not given the area enough attention.
“My hope is that we’re at the beginning of a fundamental shift in the recognition of the importance of this necessary funding and research,” says Lisa Larkin, president of The Menopause Society. “It’s not enough yet, but I really am excited.”
It’s no secret that women have historically gotten the short end of the stick when it comes to medical research. For decades, women were left out of clinical trials entirely. That’s a situation the National Institutes of Health has made strides in remedying, but disparities linger.
A recent report from the RAND Corporation, commissioned by the nonprofit WHAM (Women’s Health Access Matters), found that a relatively small investment in studying women and Alzheimer’s disease, cardiovascular disease and rheumatoid arthritis would pay economic and societal dividends, says WHAM’s president, Lori Frank. Doubling the modest portion of women-focused research dollars in those three conditions — an investment that would amount to about $300 million — could increase lifespan and productive time in the workforce, while saving society some $13 billion, the report estimated.
Congress should create an independent authority responsible for establishing and enforcing baseline safety and privacy rules for social media companies. To ensure compliance, the agency should have access to relevant company information and documents and the authority to hold noncompliant companies accountable. If or when things go awry, the agency should have the authority to investigate what happened, much as the National Transportation Safety Board can investigate Boeing after its recent mishaps.
Reining in social media harms is a difficult task. But we need to start somewhere, and attempts to ban platforms after they've already become hugely influential, as some U.S. lawmakers are trying to do with TikTok, just set up an unending game of whack-a-mole.
Platforms can track the number of accounts taken down, the number of posts removed and the reasons those actions were taken. It also should be feasible to build a companywide database of the hidden but traceable device IDs for phones, and of the IP addresses, that have been used to commit privacy, safety and other rule violations, including links to the posts and activities that were the basis for the decision to catalog the person and device.
Companies should also share how algorithms are being used to moderate content, along with specifics on their safeguards against bias (research indicates, for example, that automated hate speech detection shows racial bias and can amplify race-based harm). At a minimum, companies should be banned from accepting payment from terrorist groups looking to verify social media accounts, as the Tech Transparency Project found X (formerly Twitter) to be doing.
People often forget how much content removal already happens on social media, including child pornography bans, spam filters and suspensions of individual accounts, such as the one that tracked Elon Musk's private jet. Regulating these private companies to prevent harassment, harmful data sharing and misinformation is a necessary, and natural, extension for user safety, privacy and experience.
Protecting users' privacy and safety requires research and insight
into how social media companies work, how their current policies were written, and how their content moderation decisions have historically been made and enforced. Safety teams, whose members do the essential work of content moderation and hold vital insider knowledge, have recently been scaled back at companies such as Amazon, Twitter and Google. Those layoffs, combined with the growing number of people who pursued tech careers only to find uncertainty in the private sector, have left the job market full of individuals with the skills and knowledge to tackle these issues. A new agency could recruit them to create practical, effective solutions.
Tech regulation is the rare issue that enjoys bipartisan support. And in 2018, Congress created the Cybersecurity and Infrastructure Security Agency to protect the government's cybersecurity. It can and should create another regulatory agency to confront threats from both legacy and emerging technologies, domestic and foreign. Otherwise we'll just keep experiencing one social media disaster after another.
Anika Collier Navaroli is a journalist, lawyer and senior fellow at the
Tow Center for Digital Journalism at the Columbia Journalism School. She is also a former senior policy official at Twitter and Twitch. Ellen K. Pao is a tech investor and advocate, the former CEO of Reddit and a cofounder of the award-winning diversity and inclusion nonprofit Project Include.