Everyone knows that “I have read the Terms and Conditions” is the biggest lie on the Internet. But in an age where these contracts are present on almost every website online, just what is it we’re agreeing to when we accept the terms provided to us by web companies?
Follow the show on Twitter for updates @TermsCondPod
Links and further reading:
“When Not Reading the Fine Print Can Cost You Your Soul” – NPR, March 8, 2019 https://www.npr.org/2019/03/08/701417140/when-not-reading-the-fine-print-can-cost-your-soul
“The Biggest Lie on the Internet: Ignoring the Privacy Policies and Terms of Service Policies of Social Networking Services” – Obar and Oeldorf-Hirsch, April 2016, revised August 2018 https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2757465
The Alex Kogan Experience – Against the Rules with Michael Lewis, April 16, 2019. https://atrpodcast.com/episodes/the-alex-kogan-experience-s1!d20f3
Facebook CEO Mark Zuckerberg Hearing on Data Privacy and Protection, April 10, 2018. https://www.c-span.org/video/?443543-1/facebook-ceo-mark-zuckerberg-testifies-data-protection
Music from https://filmmusic.io:
“Backbay Lounge” by Kevin MacLeod (https://incompetech.com)
License: CC BY (http://creativecommons.org/licenses/by/4.0/)
“Dreamer” by Kevin MacLeod (https://incompetech.com)
License: CC BY (http://creativecommons.org/licenses/by/4.0/)
“Loopster” by Kevin MacLeod (https://incompetech.com)
License: CC BY (http://creativecommons.org/licenses/by/4.0/)
“Samba Isobel” by Kevin MacLeod (https://incompetech.com)
License: CC BY (http://creativecommons.org/licenses/by/4.0/)
“Let That Sink In,” composed by Lee Rosevere. https://leerosevere.bandcamp.com/
Used under a Creative Commons Attribution 4.0 International License https://creativecommons.org/licenses/by/4.0/
“Royale,” performed by Josh Lippi & the Overtimers https://joshlippi.bandcamp.com/
Below is the original script for the podcast if you would like to read the content. Some clips have not been transcribed, so be sure to listen to the show above for the full experience.
It’s the biggest lie told millions of times across the Internet: “I have read the Terms and Conditions.”
We’ve all been there. You’re signing up for a new account online, and there it is. The inevitable checkbox. “I have read and agree to the Terms of Service.” Do you give it a second thought? Do you just check it and move on? Is it one of the times where a website makes you scroll through the entire terms of service before you can click the check box?
Anyhow, you’re off to the next page, continuing through whatever you were doing before. But wait a moment. What did you actually just agree to? Which of your rights did you sign away with a click?
In 2015, academic researchers Jonathan Obar and Anne Oeldorf-Hirsch wanted to know just how many people actually read the terms and conditions. So they devised a study: 543 participants completed a survey in which they proceeded through the sign-up process for a fictitious social media website called NameDrop.
The result? 74% of participants skipped the privacy policy entirely, opting for a “quick join” option instead. 74%? Ok, that’s not so bad, right? Well, there was another interesting aspect of the study.
Deep in the Terms of Service, Obar and Oeldorf-Hirsch included two “gotcha” clauses. One permitted NameDrop to hand over any and all of each user’s data to any third party organization for any reason. Most of us may not have flinched at this one, but the second clause stated, “all users of this site agree to immediately assign their first-born child to NameDrop, Inc.” and that “all individuals assigned to NameDrop automatically become the property of NameDrop, Inc. No exceptions.”
98% of all participants missed these clauses, and thus, Obar and Oeldorf-Hirsch are now in possession of many children, which they may raise as participants in future studies. Ok, that part’s definitely not true, but still… in this case, it was just a research study. Other times, companies have actually placed clauses in their TOS to see for themselves how much attention people actually give to the terms.
Earlier this year, a Florida woman won $10,000 by reading the fine print of a travel insurance policy she purchased; the policy offered the prize to the first person who responded.
In 2017, 22,000 people in England unknowingly agreed to 1,000 hours of community service in exchange for free public Wi-Fi after the company providing the service inserted the clause to raise consumer awareness.
There are many other news items about strange or interesting terms of service clauses. Thanks to NPR for reporting the ones I just mentioned (you can find a link to their piece in our show notes).
But how did we arrive at this place where we just haphazardly agree to documents we haven’t even read? Have things always been this way? We’re going to go back quite a bit to try to understand why we enter into agreements, and whether modern terms of service are truly detrimental, or a necessary evil.
If you’re an American like me, you’ll probably vaguely recall names like Thomas Hobbes, John Locke, and Jean-Jacques Rousseau. Various flavors of their ideologies may be found woven among the writings of the American founding fathers, who ultimately enshrined Locke’s inalienable rights doctrine in the Declaration of Independence in 1776.
The American experiment introduced the modern concept of a government which derived its power from the consent of those it governed, thus binding the populace and the government within a social contract. A social contract is a written or unwritten agreement between individuals and the state. Individuals agree to give up portions of their freedoms in submission to a governing body, and in exchange, that same governing body protects the remaining rights of the individuals.
John Locke laid out this idea at great length in his Second Treatise of Government, published in 1689.
[Excerpt from Sect. 131 of Second Treatise of Government]
You get the idea… We only submit to the government in order to protect our liberty and property.
It’s not a new idea. Plato wrote of similar concepts in The Republic, but it nevertheless is an important concept that’s relevant to modern society. If individuals believe their government has violated the social contract, they may take action. This has traditionally occurred at the ballot box, where citizens may vote to remove any transgressing representatives. However, as time has passed and political landscapes have changed, it could be argued that adjustments to campaign finance and voting laws have limited citizen action. But we’ll save the political discussion for another podcast. The point here is that when two parties enter into a social contract, one hopes that it is mutually beneficial, that is, everyone gains something.
If you’ve been using social media for any time at all, you can start to see where this discussion is going. While tech companies are assuredly different from governments, some of them are very large and exert enormous influence across the globe. So what does it look like when we extend this idea of a social contract to agreements we make with private entities?
It starts with the contract itself.
Contracts have existed as long as there have been laws to enforce them. Often contracts are written down to ensure there is a record of a deal. Other times they’re introduced to mitigate a lack of trust between parties who must come to an agreement. Nevertheless, before the digital age, for each agreement, there was a hard copy of a contract. Whether it was fresh off the printing press or still hot from the Xerox machine, contracts could be physically intimidating. This is often portrayed in film and television, where a character walks into the room carrying a massive stack of papers and thumps them down on the table. Often another wide-eyed character must then go through and sign and initial in various places while hearing a synopsis of the entire stack from a lawyer standing by.
While the invention of the PDF file certainly changed the way we read huge documents (consider the implications of distributing a physical copy of the recent 448-page Mueller report to every member of the United States Congress: that’s almost 240,000 sheets of paper), the rise of the World Wide Web and digital technology have led to the propagation of something we’re all familiar with: extremely long and cumbersome Terms and Conditions.
To illustrate, let’s consider Facebook’s Terms of Service. Facebook launched for United States college students in 2004. The earliest version of their Terms of Service I could find was from October 3, 2005, and it weighs in at just over 1900 words, which is about four single-spaced pages of 12-point Times New Roman font. The current Facebook Terms of Service? Well, it’s a bit more complicated. The basic terms page is about 3200 words, so six pages of text now, which doesn’t seem like a huge jump until you realize that this is just one of many terms and conditions pages Facebook has now. As the company has expanded over the years to include more features, the terms and conditions have ballooned. See for yourself at facebook.com/legal/terms.
Every website, app, game, or software application you use has a Terms and Conditions page somewhere, even the software I’m using to record this podcast right now. Just scroll down to the bottom of most pages or tap the settings icon if you’re on mobile, and you’ll see a link to Terms. These terms are, in their simplest form, what you as a user agree to by using a service offered by a website or company. Think of them like house rules. When you visit a friend’s home for dinner, it’s understood that you don’t start a food fight. Sometimes these rules are offered by your host; perhaps they want you to remove your shoes or hat. Ultimately, if you want to stick around, you have to play by the rules.
And so we’ve looped back around to the social contract- or at least a similar idea. Two entities, usually an individual and an online company or service. The company wants to offer the individual a unique service. The individual wants to partake. This agreement is literally defined by the company’s terms of service. And if you recall, a social contract involves an individual sacrificing a right or set of rights in order to obtain a service. Never has this been more relevant than in the digital age. More on that right after this.
This is no different on Twitter, YouTube, Amazon, or Instagram. By using these websites, you agree to their terms, and the most upfront thing you’ll find in those terms is that the company is going to obtain every piece of information about you that it can in order to provide you with other opportunities. Most websites record your IP address, a numerical label assigned to your network connection that can usually be traced to your approximate physical location, and thanks to smartphones, GPS, and Wi-Fi positioning, it’s not uncommon for your location to be known within a few meters. Websites are also aware of what links you click, and even how much time you spend looking at certain parts of their pages.
It’s all spelled out. Facebook clearly states “We collect information about how you use our Products, such as the types of content you view or engage with; the features you use; the actions you take; the people or accounts you interact with; and the time, frequency and duration of your activities.”
[Zuckerberg clip about “leaving Facebook”]
The voice you just heard is Mark Zuckerberg, Facebook’s CEO, and he’s absolutely right. While it seems trivial that you can just delete your Facebook account, many times we forget that most of the information Facebook has on each of its users is provided by the user. We choose who we share the information with by controlling who our friends are and adjusting the visibility of each post. Regardless of each post’s privacy settings, Facebook still collects the data from you so they can use it to enhance your experience. But what happens when those data show up in the hands of a third party, one you haven’t authorized to use your data?
In April of 2018, Facebook CEO Mark Zuckerberg appeared before a joint hearing of the US Senate Judiciary and Commerce committees (the clip of his voice earlier came from that hearing). So, why was he there? In short, some time before the 2016 election, an academic researcher named Alex Kogan began a scientific study that acquired data from users via an app connected to Facebook. His team developed an algorithm that attempted to profile users based on the data obtained through the app, but, according to Kogan, the model just didn’t work. That doesn’t seem so malicious, so where’s the issue? The problem was that a company called Cambridge Analytica had funded Kogan’s study. Once the study was over, much of the data it collected, as well as the algorithm developed by the team, was sold by Cambridge Analytica to the Ted Cruz campaign during the cycle preceding the 2016 US presidential election. Once Cruz left the race, the algorithm was acquired by the Trump campaign. When the news about this exchange went mainstream after the election, outrage erupted from the media, politicians, and many Facebook users, which led to exchanges like this between US Senators and Mark Zuckerberg:
[Sen. Durbin and Zuckerberg exchange]
This exchange goes on, and Zuckerberg, once again, reiterates the importance of users being able to determine who exactly is going to see their content. But that distinction isn’t what led to Senator Durbin asking this line of questioning. Let’s step back to the Cambridge Analytica scandal, if you can call it that… Everyone who originally participated in Kogan’s study agreed to participate and was compensated. They agreed, for a small amount of money, to sacrifice the privacy of the data they provided to the app. The problem was that the app collected information about the friends of the users who participated, hence the outrage. About 270,000 people participated in the initial study, but through the accounts of those users, Cambridge Analytica gained access to the information of over 87 million other Facebook users. I was actually one of these people; evidently one of my Facebook friends participated in the study, so yes, I was a little annoyed, perhaps even angry at the time when this happened. But should I have been?
Following the Cambridge Analytica incident, Facebook promised to close the loophole that allowed the secondhand information access, and that’s great. But it takes us back to the premise of this show. Agreeing to the terms of service. Users waived their right to be upset about their data being shared when they signed up for the service. Facebook’s Terms of Service in 2016 stated, “when you download or use such third-party services, they can access your Public Profile, which includes your username or user ID, your age range and country/language, your list of friends, as well as any information that you share with them.” It’s not entirely clear that by having access to your list of friends, third parties would gain access to their information, but Facebook’s current data policy no longer includes the phrase “your list of friends” in the quote I just read. Instead, a new sentence has been added to the section that states, “Apps and websites you use may receive your list of Facebook friends if you choose to share it with them. But apps and websites you use will not be able to receive any other information about your Facebook friends from you, or information about any of your Instagram followers (although your friends and followers may, of course, choose to share this information themselves). Information collected by these third-party services is subject to their own terms and policies, not this one.”
So what have we learned here? Facebook did indeed self-regulate following the Cambridge Analytica incident. But would they have made these changes if the scandal hadn’t made the news? Probably not. Their terms of service (which all users had agreed to) clearly allowed for the data sharing, and it was likely in their interest as a business to continue sharing.
Given the failure, or rather absence, of privacy protections in the contract made between Facebook and its users, the United States Congress held hearings in order to honor its side of the social contract between the government and the American people. That’s right, 243 years after the Declaration of Independence, the US government is still (at least for appearances’ sake) trying to protect the freedoms of its citizens. They even managed to do it in a bipartisan manner. Earlier, you heard a question from Senator Durbin, a Democrat from Illinois. I want you to listen to another clip from that same hearing:
[Lindsey Graham clip]
The Senator asking the questions? Lindsey Graham, a Republican from South Carolina.
My point here is this: whether it’s a social contract or agreeing to a company’s terms of service, we should be aware of what we’re sacrificing and weigh the costs and benefits. We’ve established that with the US government, citizens pay taxes and expect to have their freedoms protected as part of a social contract. You arguably have more to gain from entering into the contract than the government, mainly because the government couldn’t exist without the agreement. But when it comes to tech companies, you do gain access to some exciting services, but it turns out the companies have much more to gain when you agree to their terms, and it’s all about the money. That’s next time on Terms and Conditions Apply.
Terms and Conditions Apply is written and produced by Ethan D. Smith. If you’d like more information on the topics presented in this episode, be sure to visit the links in the show notes, which include the full research study by Obar and Oeldorf-Hirsch, Mark Zuckerberg’s senate testimony, and the article from NPR I mentioned about other interesting terms and conditions. If you want to learn more about the Cambridge Analytica incident, you can listen to a recent episode of the podcast “Against the Rules” with bestselling author Michael Lewis, where he talks to Alex Kogan about the scandal. That link is also in the show notes.
The theme music for this podcast is composed by Lee Rosevere, and other music is composed by Kevin MacLeod. The tracks are used under Creative Commons Attribution licenses.
The end credit theme you hear now is performed by Josh Lippi & The Overtimers. You can find links to more from all three artists in the show notes.
Thanks so much for listening. If you enjoyed this episode, share it with friends in person or on social media. Once again, I’m Ethan Smith, and I’ll meet you back here for the next episode.