Two weeks ago, there was a major brouhaha
over the discovery that the social network Facebook had engaged in research to
measure and manipulate its users’ moods. The company had given access to users’
data and profiles to social scientists, who then manipulated the users’ News
Feeds (the constantly updating set of profile updates, pictures and
interactions from ‘friends’ that users see when they are logged into the site).
The researchers tweaked News Feeds to make them more positive or negative, in
an effort to find out whether that would make the users more or less positive
in their status updates.
The main controversy was over the fact that
this research was carried out without the users’ knowledge or permission
(though the obvious rejoinder is that knowing your mood was being
manipulated would change how you behaved on the site – the old ‘Heisenbergian’
problem of social science research). There were also some earnest defenders of
Facebook’s efforts, whose arguments centred on the fact that the company eventually
owned up to the research, and that it was, mostly, benign.
Take that as you will, but the
creeping sense of foreboding you may have felt was not misplaced. It turns out
that Facebook had conducted other experiments in the lead-up to the American
mid-term elections in 2010, and these experiments then had actual, real-world
effects. These included measuring and manipulating users’ propensity to vote,
and modelling which party users belonged to so as to send them messages based on that.
That particular experiment was conducted on 61 million Americans (approximately
90 million voted that year, meaning that two thirds of voters were unwitting
participants in that experiment).
What does that have to do with you,
you may ask. You’re happily sitting in Nairobi reading the newspaper and having
a cup of coffee. Yes, you’re logged on to Facebook and are constantly glancing
at it on your phone, computer or tablet, but these problems are distant to you.
That was, at least, until the revelations late last week. It turns out that it was not just
social scientists and Facebook researchers who were interested in users’ moods.
Part of the funding for the research came from the United States military,
which is interested in ‘emotional contagion’. According to the Atlantic
Monthly, quoting a US Defense Department website, the research was to help in
identifying social ‘tipping points’. ‘The tipping points in question include
“the 2011 Egyptian revolution, the 2011 Russian Duma elections, the 2012
Nigerian fuel subsidy crisis and the 2013 Gezi Park protests in Turkey.”’
You can have no doubt that the
lead-up to the Saba Saba rallies yesterday was equally closely studied. The
ethnically-tinged messages of hate and hubris, as well as the plans for
disruption, earnest calls for calm and peace, and official updates from both
camps were fodder for the site and, it turns out, for foreign governments.
We often treat social media as if
they are benign utilities, staying in the background while we busily update our
lives and share our feelings on them.
The amount of information we’re
willing to share is astounding. People post condolence messages before the
closest of kin are aware that the person has died. People post pictures of
two-hour-old babies – before the new mother has even come out of the delivery
room. We share messages of the deepest bile and ethnic animosity with
like-minded bigots (sometimes writing them in the vernacular, as if that gives
them an extra cloak of invisibility from the ‘wrong’ eyes).
I’m not trying to turn you off social
media, least of all Facebook. I am an inveterate social media user, who is fond
of casting argumentative cats among the social media pigeons and engaging in
the ensuing disputes. My class teacher in Standard Seven called me ‘too
argumentative’ (I don’t think she meant it kindly), and this is the perfect
medium for me to argue to my heart’s content.
But even then, I’m acutely aware that
there is nothing innocent about membership of and participation in social media
sites. They are there purely for the purpose of making money. As you catch up
with old friends and subject your children to far more exposure than will ever
be healthy, the companies are harvesting your information and making money off
it in increasingly creepy ways. The more information you offer up, the more
complete your profile becomes and the more lucrative you are to these companies.
And as the revelations last week
showed, these companies are domiciled in particular parts of the world, under
specific jurisdictions. You vote for and pay your taxes to the government of
Kenya (which you can theoretically take to task to regulate any institution
within our borders), but the social media sites to which you offer up so much private,
intimate information are not regulated by that government.
So use these media to your heart’s
content, but be aware that if your heart is not content, it may just be that
someone in a far-off land is pulling emotional levers that you’re unaware of.
Also published in the Business Daily on 8 July 2014 at http://www.businessdailyafrica.com/Opinion-and-Analysis/Facebook-News-Feed-experiment/-/539548/2375086/-/item/0/-/er4kw4/-/index.html