Jessica Ashooh, Director of Policy at Reddit, joins Ellysse and Ashley to explore the impact of Section 230 on small to mid-sized companies and explain its importance for innovation and competition in the Internet economy.
- Emily Birnbaum, “Reddit worries it’s going to be crushed in the fight against Big Tech,” Protocol, October 28, 2020.
Jessica Ashooh: That’s what we’re talking about when we say that something stifles innovation, is it takes resources that could be developing a platform and developing safety tools, and it just sends them to lawyers.
Ellysse Dick: Welcome to Ellysse and Ashley Break the Internet, a series where we’re exploring the ins and outs of Section 230, the law that has defined content moderation for platforms big and small for over 20 years. I’m Ellysse Dick, Research Fellow at the Information Technology and Innovation Foundation. We are a tech policy think tank based in Washington, D.C.
Ashley Johnson: And I’m Ashley Johnson. I’m a Research Analyst covering Internet policy at ITIF. In this episode, we’ll be looking beyond the Big Tech companies that are often the focus of the debate surrounding Section 230 of the Communications Decency Act, and instead discussing the law’s impact on small to mid-size platforms. Joining us, we have Jessica Ashooh, Director of Policy at Reddit. Welcome to the podcast, Jessica.
Jessica Ashooh: Thank you so much. It’s great to be here.
Ellysse Dick: So, Jessica, one of the reasons we wanted to have you on is because Reddit is really unique in terms of your community and your content moderation: it's a community of communities. So can you talk a little bit about your content moderation approach?
Jessica Ashooh: Yeah, absolutely. So Reddit, as you said, follows a community moderation model. So this is very different from the more centralized, corporatized moderation models that a larger company may follow. So just for a brief overview of how Reddit handles content moderation, Reddit is divided into hundreds of thousands of different communities known as subreddits. And these are essentially message boards organized around a particular topic of interest. Now, these are all created and moderated by the users themselves on a volunteer basis. So a community moderator is not a Reddit employee; they're just a person who's really passionate about whatever topic they've come to Reddit to find community and belonging around.
And we want to empower those people to be able to set the tone to have the types of conversations that they want to have. And so, community moderators are empowered to set rules that are specific to their particular subreddit. And these can be very, very detailed. They can be very quirky. We have communities that only allow photos of certain animals. We have communities that will only allow discussion of certain sports teams. We have a wide range of communities. And we really want to give the users the power to set the tone for that conversation, and we give them a variety of tools to do so. And you can think of those community rules or those subreddit rules almost like state or municipal laws, if you were to think of Reddit as a federal system.
Now, sitting on top of those subreddit rules is the Reddit content policy, which we set as a company, and which you can think of like a constitution for Reddit. And like a constitution, it’s high level and principles-based. So it includes everything that you would expect it to include: don’t encourage violence, don’t harass people, don’t dox people, things of that nature. And we’ll step in and enforce that content policy in situations where either the community moderators are unwilling or unable to enforce rules adequately in their community, or if it’s an issue, like for example, commercial spam, where it really takes a little bit more work on the backend and it’s something that may be happening across communities that needs administrative attention from us.
Now, one of the biggest benefits of this system is that it inherently scales with our users. The only thing that really scales with users is other users. And that’s really important to us at Reddit, because frankly we’re a much smaller company in terms of employee numbers than people realize. So we’re around 750 employees and 430 million monthly active users worldwide. So we really have no choice other than to involve the users in these content moderation decisions, because that’s the only way that it scales. And frankly, it’s a more democratic approach. And we think that it’s healthier for discourse and healthier for the platform to involve users directly in these content moderation decisions. And in fact, this is borne out in our public transparency report: more than 99.7 percent of all content removals on Reddit actually happen at the volunteer moderator level. So it truly is a community-driven platform.
Ellysse Dick: So using that model, can you talk a little bit about some of the challenges you have with that approach that might not exist in more traditional moderation approaches?
Jessica Ashooh: Yeah, absolutely. I think one of the biggest challenges is that it puts a lot of weight on people who are volunteers. And so we spend a lot of time thinking as a company about how we can be sure that we’re engaging with our user community, making sure that they understand what our rules are and what the expectations are for using the platform. And also, we have to make sure that we’re providing them with the appropriate tools to be able to moderate their communities as they see fit. So for example, one of the tools that we provide our user base and our moderators is called AutoModerator. And it’s a very simple, automated tool that allows community moderators to do things like block certain words from appearing in their community, block certain racial slurs, block links from particular domains. And that’s a very basic thing that’s important to just maintaining a baseline of health in a conversation.
But one of the things about AutoModerator as it stands right now is that it does take a little bit of programming knowledge to be able to use it adequately. And so, one of the things that we’re working on right now, as we speak, is creating a version of AutoModerator that’s much more user-friendly, that you don’t need programming knowledge to be able to implement in your community. And that’s one of the things that we’re doing to try and make Reddit a more inclusive and welcoming place for people of all different knowledge levels and backgrounds.
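To make the kind of rule described above concrete, here is a purely illustrative sketch in Python. This is not Reddit's actual implementation (moderators configure the real AutoModerator with rules rather than writing code), and the class name, sample words, and domains below are invented for the example; it only shows the general shape of a word-and-domain filter:

```python
import re
from urllib.parse import urlparse

class AutoModRule:
    """Illustrative sketch of an AutoModerator-style filter.
    Not Reddit's actual implementation; names and rules are hypothetical."""

    def __init__(self, banned_words=None, banned_domains=None):
        self.banned_words = [w.lower() for w in (banned_words or [])]
        self.banned_domains = {d.lower() for d in (banned_domains or [])}

    def should_remove(self, text):
        lower = text.lower()
        # Remove posts containing any banned word (whole-word match).
        for word in self.banned_words:
            if re.search(r"\b" + re.escape(word) + r"\b", lower):
                return True
        # Remove posts linking to any banned domain (or its subdomains).
        for url in re.findall(r"https?://\S+", text):
            host = urlparse(url).netloc.lower()
            if any(host == d or host.endswith("." + d)
                   for d in self.banned_domains):
                return True
        return False

# Hypothetical rule set for demonstration.
rule = AutoModRule(banned_words=["spamword"],
                   banned_domains=["spam.example"])
print(rule.should_remove("Check out https://spam.example/deal"))  # True
print(rule.should_remove("A normal comment about cats"))          # False
```

Matching subdomains as well as exact hosts (`host.endswith("." + d)`) is one plausible design choice here, since link spam often rotates across subdomains of a single domain.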
Ellysse Dick: So we are a Section 230 podcast, so I will ask you the Section 230 question. Do you think this approach would look different if intermediary liability protections that are in place today were different or not in place at all?
Jessica Ashooh: Our approach would absolutely look different because it would have to, because we’d be fully responsible for everything on the site. And the impact of that is that we would no longer be able to have this type of relationship with our users where we put so much trust in the users and devolve decision-making into the user base, because as a business decision, it simply wouldn’t be tenable. Because when you talk about liability, really from a business point of view, it’s all about costs. So for example, even with Section 230 in place, we can still be sued. It’s most likely that those lawsuits will be dismissed, but even to bring a lawsuit to dismissal can cost upwards of $80,000. And you can see how those costs will pile up very, very quickly. And so that’s really a death sentence for smaller companies that are just getting off the ground, or even medium-sized companies like ours that are still privately held, still on venture funding, and don’t have millions of dollars to throw around defending against frivolous lawsuits.
Ashley Johnson: As you’ve sort of alluded to, much of the debate surrounding Section 230 focuses on the Big Tech platforms, like Google, Facebook, YouTube, and Twitter. But Section 230 protections apply to platforms of all sizes. What impact does Section 230 have on the broader ecosystem of small and medium-sized digital communications and social networking platforms?
Jessica Ashooh: Yeah, well it really puts them on more even footing with the larger companies that we have to compete with. And that’s really important, because I think that there is a hidden competition angle as well in the Section 230 debate. Because there are companies whose legal departments are larger than our entire employee base, and so it would be very, very difficult for us to compete in that manner. And you can see how there is actually an incentive, if you’re a larger company, for some regulatory capture, because you know that your smaller competitors and companies, frankly, that haven’t emerged yet would never be able to comply with burdensome regulations. So I think that that’s a really, really important thing to keep in mind.
And another thing to keep in mind, frankly, is how democratizing Section 230 is in terms of how people use the Internet. So it’s important to point out that Section 230, on a Reddit model, protects not only us as the company, but it protects those volunteer user moderators as well, who are themselves making content moderation decisions. And that’s a point that we brought up in our FCC comment on Section 230 rulemaking last fall, because it’s a really salient one and it’s not purely academic. We have had some of our moderators be sued by other users who take exception to the rules of the communities that the moderators have set up. And so, this is really, really important to protecting a community model of the Internet that in many ways is what the Internet was originally about, and is the healthier aspect of that early Internet that had so much promise.
Ellysse Dick: I think it’s really fascinating that Reddit is sort of this reminiscent platform of the early Internet where it’s so community-based and that you’ve managed to really, like you said, scale that approach. You now have, you said, hundreds of millions of monthly active users, and a December report said you had around 50 million daily active users. So this puts you in a really interesting mid-size but growing position. Can you talk a little bit about the role that the intermediary liability protections have played, not only in enabling your content moderation, but also this growth that you’ve been seeing?
Jessica Ashooh: I mean, to put it really frankly, we would not have been able to grow in the way that we have without the protections that Section 230 offers. And you can see that in the examples of cases where Section 230 protections have been taken away, because there are a few carveouts that have happened to Section 230. SESTA-FOSTA, of course, is the one that most people are familiar with. And basically what happens when you have a carveout in liability protections, is that content around whatever topic the carveout is specific to, that content goes away. And so, in the case of SESTA and FOSTA, for example, that content was harm reduction content, and harm reduction discussions and resources for sex workers. And so all of that content went away. And you can see how something similar could happen on any number of other really, really important topics that people have discussed having Section 230 carveouts on.
One great example is there’s been a discussion about a Section 230 carveout for opiate-related content. There are addiction support and harm reduction communities on the Internet that rely on those protections of Section 230 to be able to have those honest conversations about themselves. And it would be really damaging to people who are struggling with those issues to take away this incredible discussion space for them to process their problems. Another thing that I think we would probably see at a high level, not so much on Reddit, but kind of at a high level on other platforms, is that you would see the Internet become much more corporatized and commercialized because user-generated content would suddenly be very dangerous from a liability point of view.
And so, a larger platform wouldn’t necessarily go away or cease to exist, but I think that you would see more approved content, if you will, where it’s more content from official creators or from brands. And there’s a time and a place for that content, but that’s not really why most people come to the Internet. Most people come to the Internet because they want to see the full breadth of interesting ideas that other users have. And I think that that would go away and it would be a real shame.
Ashley Johnson: So we’ve definitely touched on this, but just to ask explicitly, what are the dangers to companies like Reddit if Congress repealed Section 230 or weakened its liability protections? And what impact would that have on consumers?
Jessica Ashooh: Yeah. From a company standpoint, for us the fear is really death by a thousand cuts. Having to fend off dozens, if not hundreds, maybe thousands, who knows, of frivolous lawsuits. And that would be not only difficult from a resource standpoint, but it would be bad for the health of the platform and bad for the health of the Internet. Because in addition to the raw costs that we talk about in terms of defending against lawsuits, yes, there is kind of the raw cost and the sunk cost of the money itself, but you also have to think about it in terms of opportunity cost. And what that means is that’s $80,000 that’s not being spent on product development. It’s not being reinvested into the platform. It’s not being invested into things like safety tools that would work to overall make the Internet a better place.
And so, it really does hamper innovation, and I know that that phrase gets thrown around a lot and has almost become cliche, but that’s what we’re talking about when we say that something stifles innovation, is it takes resources that could be developing a platform and developing safety tools, and it just sends them to lawyers.
Ellysse Dick: So when you’re talking about innovation, obviously there’s innovation within companies. What do you think it looks like for smaller startup platforms? Is it even possible with weakened protections, especially if they want to take creative approaches to content moderation like Reddit has?
Jessica Ashooh: One of the really interesting things about Section 230 and the debate on Section 230, is that many of the people who argue that Section 230 needs to be thrown out or radically reformed will say that the Internet is a very different-looking place now than it was in 1996 when Section 230 came about. But I think that that’s actually the brilliance of Section 230, is that it’s something that truly enabled the flourishing of the Internet economy as we know it today, and it enabled platforms and business models that we couldn’t even have conceived of in the 90s. And that’s not a bug, that’s by design. And so there are business models and new technologies that we haven’t even conceived of yet that hypothetically enjoy the protection of Section 230, and they’re on the cusp of being developed, but they won’t be if you take away those protections.
Ellysse Dick: So while we’re talking about company size, let’s have a hypothetical. One of the things that’s been discussed is amending Section 230 so that only companies of a certain size would have the protections available to them. How would this impact Reddit’s approach to content moderation, if, for example, you still receive protections, but larger companies did not?
Jessica Ashooh: Well, it’s a really difficult question, because how you measure size on Internet platforms is actually not especially straightforward. You can look at it from a user-base perspective, just raw people hitting the platform. And in that sense, Reddit looks very large, 430 million monthly users. But as I’ve already mentioned, we’re only 750 employees. So do we have the resources to be able to comply with a large regulatory regime? Certainly not to the degree that a larger company would. And then even if you break that data down further, users look very different. Some users spend a lot of time on the platform. Some users don’t spend a lot of time on the platform. Some users are registered, some aren’t. And that all has kind of very different implications for how you moderate content and how you approach regulating how platforms moderate content.
And so it’s not always a simple apples-to-apples comparison. Revenue is another thing that many people will look at when deciding who’s a large company and who’s not a large company. You can be drawing a healthy revenue stream, but still not making billions and billions of dollars. And I think that this conversation gets very distorted by the fact that we are literally talking about the largest, most capitalized companies in the world, perhaps in human history, when we’re talking about this debate. It almost doesn’t make sense to try and hold up a Reddit against an Amazon and say that you’re both large. Because the reality is the companies look very, very different, and frankly, the product models of all of these platforms are very, very different. And again, I think that that’s really the genius of Section 230, is that it accommodates companies large and small. It accommodates business models ranging from online sales platforms to your community message board. And that diversity of product is really, really important to keep in the marketplace.
Ellysse Dick: So do you think size and scale have any place in this debate, or are there other differentiating factors that should be considered, or should it just be blanket protections across the board to keep that flexibility that you’ve discussed?
Jessica Ashooh: I think there’s a conversation to be had about size and scale, but it’s important that that conversation be very nuanced and have eyes wide open about all of the different points that I just raised about how you can measure size and why you might measure size. Are you measuring size because you want to see how many people are exposed to something on a particular platform? Are you measuring size because you want to evaluate the resources that a company has to deal with any particular legislation? And those are two questions where the answer would imply a very different action.
Ashley Johnson: So for our final question that we ask all of our guests, we want to know what your verdict is on Section 230. Should we keep it, amend it, or repeal it?
Jessica Ashooh: I am in favor of keeping it. I think it’s working.
Ashley Johnson: Thanks so much for joining us, Jessica. That’s it for this episode. If you liked it, then please be sure to rate us and tell friends and colleagues to subscribe.
Ellysse Dick: You can find the show notes and sign up for our weekly email newsletter on our website, itif.org. Be sure to follow us on Twitter, Facebook, and LinkedIn too @ITIFdc.