» biopower to the people

FITNESS TRACKERS ARE REDEFINING WHAT IT MEANS TO BE A HUMAN SUBJECT

Bitch: A Feminist Response to Pop Culture - THE FACTS ISSUE - By Mailee Hung


Since getting a Fitbit several months ago, my days have been focused on action and analysis: Wake up, check my sleep stats. Go to the gym, track my workout. Eat breakfast, log my calories. Bike to work, track my miles and steps. Repeat ad infinitum. Variety is the enemy of optimization.

And “optimization” has increasingly become a synonym for “health,” one that conjures a sense of the rational, the ordered, the programmatically ideal. To optimize one’s body is to take it to its functional maximum, to fine-tune its performance to machine-level accuracy.

Then there’s “fitness,” another term that’s been folded into this technological vision of ability and potential. “Fitness,” the Fitbit website states, “is the sum of your life.” And tracking “every part of your day—including activity, exercise, food, weight, and sleep—[helps] you find your fit, stay motivated, and see how small steps make a big impact.” In essence, Fitbit claims that not only is your day-to-day the true marker of fitness, and not only is fitness the key marker of your life, but that quantifying them as a series of inputs and outputs will ultimately improve it, too. Health trackers like the Fitbit—including the Apple Watch, Nike Fuelband, Garmin vívosmart, and Samsung Galaxy Gear—assert that your bodily output is the sum total of your experience, and that sum can be quantified.

THIS IS THE BEDROCK of the Quantified Self (QS) movement, a group of people whose rallying cry is “self knowledge through numbers.” You won’t be surprised to hear that the QS movement was first conceived in San Francisco, by former Wired magazine editors Gary Wolf and Kevin Kelly, in 2007. From determining the peak enjoyment of an album by number of listens to the most effective way to train for physical strength or endurance, QS evangelists believe that gathering data about the self is one of the most effective and meaningful ways to learn about both the human condition and the human body. “If we want to act more effectively in the world,” said Wolf in a 2010 TED Talk, “we have to get to know ourselves better.” By reflecting on ourselves as systems and using data “as a mirror,” Wolf says we can achieve levels of self-awareness—and therefore self-improvement—previously unavailable to us. Who knows what we might achieve once we attain peak personal performance?

Of course, self-tracking has been around for a long time. Cumbersome though they were, computers were small enough to be developed into wearables by fringe enthusiasts in the 1970s; throwing it back even further, women have been tracking their periods since at least 388 AD. We have been seeking ways to understand the body’s behavior for as long as we’ve turned a scientific eye to our own navels. In today’s era of ubiquitous computing, Bluetooth, and microprocessors, it only makes sense that some of our most sophisticated measurement devices be applied to ourselves. Now, the body is best understood through its abstraction: It isn’t until I’ve logged my meals and checked my stats that I’m able to comprehend what I’ve done with my day. There’s little space in the ethos of optimization for the chaotic, unpredictable, and often uncontrollable vicissitudes of being human. Order has always been a human ideal—now that we can apply it to the previously invisible and unquantifiable processes of our physical selves, has it become a defining category of a worthy life?

The answer to the question of why order and optimization are so seductive seems self-evident: Better is better. If we dig into our own incentives for self-improvement, it’s likely we’ll find similar definitions of what “better” means—greater happiness, less pain, more freedom and autonomy. But whether or not optimization through self-regulation is the means to those ends for everyone is another question altogether.

Michel Foucault theorized that a regulated population is easier to control, arguing that regulation itself is the mechanism by which modern-day states manage their constituents, a mechanism he called “biopower.” Health and fitness trackers are tools with enormous potential for smoothing out the kinks in this chain of power from the population level to the individual, not only by gathering detailed social and scientific data on the body and its daily rhythms but also by bringing statistical averages directly to the body. (My average resting heart rate is meaningless without a baseline to compare it to, but the Fitbit app helpfully does just that.) The overall health, wellness, and life expectancy of a population can be more accurately drawn and tightly controlled with better data, which is precisely what fitness trackers provide. Through wearable technologies, we are seeing a new theorization of the modern body from a tech mind-set.

***

FITBIT’S WEB-BASED DASHBOARD is an array of friendly colors and graphics, full of easy-to-read charts and cheerful icons representing your biometrics. I find sifting through the numbers an enjoyable time sink, a way to represent me to myself. Personally—and this may be anathema to QS diehards—I am less concerned with the strict accuracy of the data; it’s more about seeing trends and feeling accomplished than about acquiring “true” biological information on myself.

The term “self-tracking” is strange. Like following trail signs of an animal in the woods, it conjures a sense of both the past and the future—where it has been, where you will be going. But in the present there is only a watchfulness, an active surveillance. The “self” in self-tracking is surprisingly absent: Whatever peppy, inspirational copy Fitbit uses to move its product, it is a regulatory device, bringing statistical averages and norms to bear on the individual. Regardless of what my sleeping and waking hours are, the Fitbit day ends at 11:59 p.m. and begins at 12 a.m., and my counters, unless I change the default settings, are reset by the clock. My device allows me to compare my resting heart rate and levels of sleep to those of other women my age. I’m encouraged to move only between the hours of 9 a.m. and 5 p.m., in accordance with the typical desk-jockey lifestyle that still somehow shapes our idea of economic rhythm, despite the relative precarity and unpredictability of lives spent freelancing, contracting, interning, or otherwise shoehorned into the “sharing economy.”

This all happens, of course, with the user’s consent. I shelled out money for the thing, and no government has yet made such devices mandatory. In January 2017, though, Fitbit partnered with UnitedHealthcare and Qualcomm’s cloud-based care platform to roll out a program that would allow users to earn up to $1,500 in healthcare credits, incentivizing employees within their insurance networks to use the trackers.

This surveillance of a body in absentia is a foundational premise of biopower. Emerging in late 18th-century Europe as a new mechanism for control over a population undergoing industrialization, biopower was the technology of demographers, of those who sought control at the population level—birth rates, mortality rates, life expectancy. Biopower, per Foucault, “deals with the population as a political problem,” and develops regulatory mechanisms in order to maintain biological—and therefore also social and political—equilibrium. Rather than having a regent rule by threat of death, we have state powers that rule through regulation: academic and fitness tests, for instance, instead of soldiers marching in the streets.

Biopower works, in Foucault’s estimation, through the mediating force of the norm: a baseline for objectivity “that can be applied to both a body one wishes to discipline and a population one wishes to regularize.” Statistical averages become both a regulatory function for a population and an expectation internalized by any given person. As a former student in California public schools, I recall PE curricula essentially training us for FitnessGram physical-fitness tests, making sure we could at least measure up to the state’s baseline average. It was always a point of pride for my peers when we outperformed other schools, a juvenile satisfaction of superiority attained while unwittingly contributing to state school rankings and, by extension, funding distribution.

The implications of biopower systems go beyond making sure that the nation’s children are, on the whole, physically healthy (to say nothing of the ethical and biological assertions that go into drawing that particular boundary). In drawing power from the regulation of bodies, the norm becomes a deadly force: Anything that does not conform to it can be seen as a justifiable threat to the population. A nation under biopower—in which anyone who is not white, able-bodied, male, and straight is considered a deviation from the norm—is one that can, and does, justify racism and bigotry. This is why no form of visual recording, whether body cameras or livestreams from iPhones, can save the lives of the Black men, women, and children regularly murdered by state police.

When Foucault was theorizing biopower in 1976, he understood it as the new mechanism for exercising sovereign power over subjects. But there’s a new player in town, one that was only just coming to maturity in the late 1970s: the corporation. These days we’re seeing biopower wielded at a scope far beyond government regulation. Big Data is its new name, and the ones using it with far more creativity and canniness are based in San Francisco lofts rather than offices in Washington, D.C.

Project Baseline is Alphabet’s (that is, Google’s) newest health study. Its 10,000 subjects (“to represent different ages, backgrounds, and medical histories on behalf of humanity”) are given special watch-style health trackers and sensors to put under their mattresses, and are studied over four years. Participants agree to use the various health trackers daily, to fill out questionnaires and surveys regularly, and to perform up to four annual in-person health tests. All test subjects are volunteers; they are not compensated for their participation, nor are the health tests meant to provide any kind of medical care. If an applicant is not selected, it is likely that Project Baseline has “already met [its] requirements for people of your age, location, health status, etc., or that we do not yet have a study site near you.”

This last part is noteworthy, considering the inextricable link between location and demographics. Though the study aims to be representative of the American population, there are already known limitations to its appeal to universality. Currently, the only study sites are in the San Francisco Bay Area near Stanford University and in North Carolina near Duke University: One can imagine the data sets available in those areas, especially given that volunteering for the project requires one to know about it in the first place. The project hopes to expand globally, but questions about what that expansion looks like are unanswered for the time being.

The limitations of Project Baseline’s sample set are the precise problem with these kinds of extrapolatory projects: There will always be bias depending on how the sample is acquired. And when you’re talking about “creat[ing] a Google Maps for human health,” who gets excluded from the sample is more than just a rounding error. There are entire demographics that would literally be excluded from what constitutes “the human race.” It matters if the requirement of four annual clinic visits makes participation in the study impossible for people who, for instance, have difficulty leaving their homes, whether that’s due to physical or mental disability, or economic reasons such as lack of childcare or free time. It matters if the sample sets can only be derived from areas near clinics with the right tech. It matters if the only people able to participate are those who already believe in the goal. Without addressing these biases, Project Baseline will not be a radical leap forward in human understanding, but a codification of norms that marginalizes more sectors of the population every day.

In his book Disability Aesthetics, cultural scholar Tobin Siebers argues that disability is the most basic form of human disqualification, presumably predicated on biological fact rather than sociocultural conditions. This means that all types of social inequalities, such as racism, sexism, and ableism, stem from a biological justification for their oppression—these bodies are less fit, less healthy, less worthy, and ultimately, less human. So when a project like Project Baseline reiterates those justifications rather than challenging them, simply by virtue of who it lets through the door, we ought to be concerned about which bodies are allowed into futurity.

Health, of course, is already a state issue. State funding determines what foods are available in public-school lunches, what scientific studies get funded, and what insurance premiums look like. The health of the body becomes synecdoche for the health of the state; the precursor to the current iteration of the physical fitness test was the Presidential Fitness Test, a now-defunct testing format that President Kennedy claimed, in a 1960 Sports Illustrated op-ed piece, would combat Americans’ “increasing lack of physical fitness” that he saw as a “menace to our security.” If a healthy body must also conform to standards and regulations developed through state power and state incentives, then the oppressive function of biopower necessarily excludes and disqualifies the disorderly bodies that exist outside of its spectrum. Bodily ideals, codified by scientific argumentation for fitness, are utilized as a measure of control—ones which are functionally impossible for certain bodies to achieve. And my Fitbit is the most powerful tool available for this project.

***

BUT PERHAPS we’re dancing around the real issue here, which is death.

From a biopower perspective, the primary goal for programs like Project Baseline is more effective regulation, and therefore more effective control over the lives and deaths of the general population. From an individual perspective, Project Baseline is exciting because it offers up the possibility for deeper understanding of endemic diseases, and therefore the possibility of curing them. The project has great potential to do objectively good things (advance medical understanding) and more questionable ones (allow more granular state control). But the real reason people are volunteering for it is a desire to escape the reaper.

In Tad Friend’s 2017 New Yorker article “Silicon Valley’s Quest to Live Forever,” doctor-cum-hedge-fund-manager Joon Yun describes death as a hackable code: “Thermodynamically, there should be no reason we can’t defer entropy indefinitely. We can end aging forever.” Friend’s exploration of the ways tech-industry players are throwing money at this one seemingly unsolvable problem illustrates a view of death as simply a bug in our otherwise functional operating systems. But it’s also a little presumptuous to argue that the best way to extend lives is through some high-tech fix for shrinking telomeres when there are still millions of people in the United States alone who don’t have access to healthcare, clean water, or food.

The entire impetus for health is that it encourages longevity, and the possibility of staving off a natural death for as long as possible. And what feels closer to avoiding that final fact of biological existence than becoming closer to the machine? As if by technologizing the body, we can transmogrify ourselves into the eternal, efficient, orderly, and immortal cyborgs of our wildest fantasies. But whether you want to theorize it as the final great mystery of existence or as merely a program to be hacked, death is never simple. Perhaps its greatest irony is that it becomes easier to deal with the more you abstract it. At the level of biopower, death is just another metric to control for. At the level of the individual, well.

Even in trying to write the sentence, “When my brother died,” I find myself at a loss to complete it. There have been many sentences since his death that I’ve been unable to finish. Grief is something you learn to live with rather than escape from, a constant companion that sometimes taps you on the shoulder gently and other times lays you out cold on the side of the road, glad that you were at least able to pull over before the real sobbing started. There are no clear metrics for improvement, and no sense of progression. You can go weeks and months feeling like maybe you’re finally done crying before you find yourself on the side of the road again.

In this context, optimization is more than a seductive marketing ploy: it’s a survival strategy. Yes, we must be vigilant about where our data is going, who has access to it, and who benefits from it. We must not allow ourselves to be sorted like so many products in a warehouse, bodies codified and stratified in accordance with fitness, race, ability. We must not let our data be codified into “objective” knowledge, foreclosing on any possibilities for a dialectic, and repurposed for the benefit of eugenics-by-capitalism.

But I am finding that behavior tracking grounds me. The abstraction of myself into numbers has become the most accessible way for me to be in my body, to remind myself that I am this living thing; the messiness can be left for later. To be able to work toward a quantifiable goal, even one that is more rigid than my body can bear, is to find something tangible in grief. It is satisfying to complete the circles, to fill the bars, to earn the badge. It is comforting to see that I walked farther today than I did yesterday.

My therapist often asks me, “Where are you feeling this in your body?”

I am never able to answer her with any accuracy.

MAILEE HUNG is Bitch Media’s 2017 Writing Fellow in Technology. She is a writer based in San Francisco, California. A sci-fi aficionado, rock climber, and dumpling enthusiast, she focuses her work on the intersections of the body and technology.
