Author Archives: Jasmine McNealy

Jasmine E. McNealy is an assistant professor in the Department of Telecommunication, in the College of Journalism and Communications at the University of Florida, where she studies information, communication and technology with a view toward influencing law and policy.
Colorful map of Los Angeles showing different colored dots to represent race and ethnicity. Some areas are more red, some blue, some yellow, a few green.

Data for Discrimination

In early November 2016, ProPublica broke the story that Facebook’s advertising system could be used to exclude segments of its users from seeing specific ads. Advertisers could “microtarget” ad audiences based on almost 50,000 different labels that Facebook places on site users. These categories include labels connected to “Ethnic Affinities” as well as user interests and backgrounds. Facebook’s categorization of its users is based on the significant (to say the least) amount of data it collects and then allows marketers and advertisers to use. The capability of the ad system raised questions about possible discriminatory advertising practices. Of particular concern in ProPublica’s investigation was the ability of advertisers to exclude potential ad viewers by race, gender, or other identifiers, a practice directly prohibited by US federal anti-discrimination laws like the Civil Rights Act and the Fair Housing Act. States also have laws prohibiting certain kinds of discrimination in advertising based on its intended audience. (read more...)

Photograph of a welcome mat that says home with a heart instead of an o.

Finding a ‘Home’ Online

An oft-repeated mantra in scholarship on privacy is that you have the greatest expectation of privacy inside your home, and the least expectation of privacy in public. What this means is that you can legitimately assume that what happens inside your home will stay in your home (to use a phrasing usually connected with visits to Las Vegas). But if people can view or hear an event or occurrence without technological assistance, whether you are having an argument on your cellphone or you trip and fall in plain sight, you cannot reasonably believe that what happened will remain ‘private.’ This perspective permeates law and shapes how cases involving privacy and the use of personal information are resolved. But in an era in which many people live their lives online, where so much is publicly accessible, what does the concept of home mean, and how should it influence how we view privacy? (read more...)

Screenshot from Twitter of @KirkegaardEmil discussing the release of a dataset with identifiable information, and the concerned response of @esjewett. @KirkegaardEmil "The OKCupid paper has now been submitted. This means that the dataset is now public! Enjoy! :)" @esjewett "@KirkegaardEmil This dataset is highly re-identifiable. Even includes usernames? Was any work at all done to anonymize it?" @KirkegaardEmil "@esjewett No. Data is already public." @esjewett "@KirkegaardEmil Differing degrees of 'public'. Also different ethical guidelines. IMO, you should speak with a research ethicist/IRB ASAP."

The Problem of Expecting Privacy on Social Media

In May of this year, Danish researchers released a data set containing the profile information of 70,000 OkCupid users. OkCupid is a free online dating site to which, as you would expect, users post information in hopes of making a connection. The researchers collected this data by scraping the site, or using code that captures the information available. The data set included usernames, locations, and answers to personal questions about dating, sexuality, and sexual preferences. In other words, the researchers published personal information that the dating site users would expect to remain, at least theoretically, among the other members of the dating site, and that could also be used to discover the users’ real names. But should OkCupid users, and the denizens of social media in general, expect what they post online not to be made “public”? In my last blog post, I briefly pondered the normalization of doxxing and what that means for privacy online. My question, for the most part, was whether courts would see how common doxxing has become as an indication that it is not as “highly offensive to a reasonable person” as is necessary for a judgment of invasion of privacy. In that post I focused on doxxing by individuals, and sometimes the media. It’s important to note, however, that researchers have begun to participate in the same kind of behavior with little to no remorse. This leads to what I think is the overarching question: what expectation of privacy can people have in information that they place on social media or connected sites like newspaper comment forums or review sites like Yelp? (read more...)

pri-va|cy (close-up image of the word “privacy” in a dictionary, highlighted in yellow)

The Hulk, Doxxing, and Changing Standards of Privacy

By now you’ve probably heard the verdict in the Bollea v. Gawker case, the formal name of the lawsuit that Hulk Hogan (Terry Bollea being his legal name) filed against the online news site Gawker. The jury awarded the Hulkster $140 million in damages for invasion of privacy after Gawker posted a one-minute segment of a sex tape featuring the wrestler with the wife of his best friend Bubba the Love Sponge. If you got a chance to watch the trial, or at least read about what was happening, you’d know that it was very entertaining, particularly for a media/info/tech law nerd such as myself. You should also know that Hulk has unfinished business with Gawker, having recently (as of May 3, 2016) filed another lawsuit against the media organization and others claiming emotional distress. Bollea v. Gawker, as humorous as it was, is perhaps not as important as (read more...)

Photograph of FTC Chairman Jon Leibowitz standing at a podium in the Russell Senate Office Building, with two flags and two legislators sitting on either side.

Unpredictable Technologies: The need for thick description in regulatory decision-making

I call myself a scholar of information, communication, and technology with a view toward influencing law and policy. To that end, my motto over the last few years has been “Social Science matters!” And by that, I really mean that qualitative research, or research aimed at understanding how people and organizations actually use technology, is important for creating good law. Accordingly, ethnographic study, the kind that produces thick descriptions of people and culture, should be the MO of any body tasked with writing regulations. Recently I was asked to participate in training a group of telecommunications regulators who want to conduct a regulatory impact assessment (RIA). An RIA is a thorough investigation of the possible impacts of a proposed or revised regulation. In the most basic sense, the investigation is used to forecast whether the new rule will achieve what it’s supposed to, and what else could happen. Countries around the world use RIAs to evaluate regulatory needs and possible interventions. US federal agencies have been required to conduct and submit RIAs since the early 1980s, and President Bill Clinton codified this requirement in 1993 with Executive Order 12866. A second executive order, 13563, requires that agencies use “the best available techniques to quantify anticipated present and future benefits and costs as accurately as possible.” (read more...)