We discuss the Senate hearings on child exploitation on social media, which shed light on the urgent need for action and accountability. Senators described online sexual exploitation as a national crisis, sharing alarming statistics on child sexual abuse material (CSAM) and financial sextortion. The CEOs of major social media platforms, including Discord, Meta, Snapchat, TikTok, and X (formerly known as Twitter), were questioned about their platforms' safety features and their responsibility to protect children.
Key Takeaways:
- The Senate hearings highlighted the urgent need for action and accountability in addressing the crisis of online sexual exploitation.
- There is bipartisan support for action.
- CEOs evaded direct answers and accountability, raising concerns about their commitment to addressing the issues.
- The revenue and influence of social media platforms like TikTok are significant but not widely understood.
- Witnesses in hearings tend to stick to their prepared talking points, making it difficult to achieve meaningful change.
- Addressing online harm and regulating platforms is a complex task with potential privacy concerns.
- There is hope for change, particularly in targeting vulnerable platforms like TikTok.
TW. Hi, I'm Taryn Ward.
SJ. I'm Steven Jones.
TW. ...and this is Breaking the Feed, Social Media: Beyond the Headlines.
SJ. We're taking a closer look at the core issues around social media, including freedom of expression, or free speech as it's sometimes known, to better understand the role that social media plays in our everyday lives and in society.
TW. This is a special episode covering the Senate hearings on child exploitation on social media. Although this issue is closely related to free speech and expression, it's slightly out of order, but we felt it was worth including at this stage because it's timely. This episode is rated 18+, because we cover some difficult topics that may not be appropriate for all audiences.
TW. Before the hearing began, they played a video with statements from victims of sexual exploitation and their families. It's worth watching, and it was a powerful way to frame the conversation. Let's talk about what actually happened during the hearing. Just to set this out: it was about four hours long, so we don't expect that everyone will have watched the whole thing, although we did, so that we could pull out highlights and the things we thought were particularly important.
TW. Senator Durbin opened the hearing by describing online sexual exploitation as a crisis in America, and noting that the National Center for Missing & Exploited Children, or NCMEC, now receives more than 100,000 reports of child sexual abuse material, or CSAM, per day, up from just over 1,000 per day ten years ago. Financial sextortion, too, he noted, has exploded, with NCMEC receiving fewer than 200 reports throughout all of 2021 and, in 2023, more than 22,000 through October alone. In that year, more than a dozen children died by suicide after being victims of this crime, and those are just the children we know about.
SJ. Oh my gosh. I mean, the scale of this problem... and I think the important part is "just the children we know about"; as we'll come on to later, that is a very important phrase in this context. This is a massive problem, and one over which these tech executives, who are, let's face it, some of the richest people in the world, are facing Congress, because in the eyes of legislators and the families, and of us, you and I for sure, they have not done enough about it, and in fact have actively lobbied against measures which would prevent it. So I think this is a really important topic, and I'm glad we're covering it in a timely way.
TW. Yeah, I think you're right to pull that out, that "just what we know about". You and I have been around the block a few times, and we both know plenty of people who have been affected by this who wouldn't have reported it in any meaningful way. I know people in my everyday life, and you know people in yours, who have had experiences, who have seen things, and although these were serious issues, they're not always reported, because that's another thing somebody has to go through, and it's not always worth it. So the numbers we talk about, when we see these problems unfolding in this way, really are just the tip of the iceberg.
SJ. Yeah, this, I'm afraid, is the tip of the tip of the iceberg. It's such a small fraction, because you know yourself that if you make a complaint to a social media network about something that's online, it is so difficult to get them to take it seriously. They actively find reasons not to do anything about it, and yet, when it comes to this issue, as we'll see later, they're going to try and tell us that they're taking it seriously. These things don't jibe with our experiences.
TW. Yeah, no, that's absolutely right. So, Senator Durbin first talked about smartphones, then the apps on those phones, and then specifically social media and the algorithms. He mentioned Discord first, and then Meta's Instagram, Snapchat, TikTok, and finally "X", formerly known as Twitter. He noted that their design choices, their failure to adequately invest in trust and safety, and their constant pursuit of engagement and profit over basic safety have all put our children at risk, and really, I think, underlying it all, that is why they were all there; that was the point of the whole hearing.
SJ. You know, it's not often that I find myself agreeing wholeheartedly with a politician from any party, particularly an American politician, but he couldn't have put that any better, although perhaps he could have put it a bit more strongly; that's difficult to do in the Senate, as it's a serious chamber. But yeah, of course he's right. We know from internal research that Meta, for example, knew it was endangering children, and yet its head of health and safety sat down at an inquest in the UK and told the coroner that the content was perfectly acceptable for children of 13 to view. Again, it's very difficult to take them seriously. So I think he did a good job of setting the stage, and we know these are deliberate choices they've made: they have not made it more difficult for people to lie about their age, because they just want people on there, and they've not put controls in, or made it easy for parents to use controls, or made controls essential for parents, which of course should be the case for very young children. I remember when my kid wanted access to Barbieland, or whatever the hell it was. They couldn't do it without parental controls, and you had oversight of everything they were doing there. I'm sure it was still deficient, but at least there was no choice; they couldn't get around it.
TW. Yeah, and I think he did a good job of setting this up; there was sort of a take-no-prisoners attitude from him at the beginning. He noted, for example, that several of these companies had implemented new safety features just in the last week, in the lead-up to the hearing; because the CEOs knew they'd be forced to answer for their decisions, they had all of a sudden rolled out these new features they had been sitting on for who knows how long. I thought that was a good thing to include in the opening. How much he actually believes this, we can debate, but he also noted that tech companies alone are not to blame, and that Congress needed to take a long, hard look in the mirror.
TW. He talked about Section 230 and the liability shield it provides for these platforms, and also the record-breaking profits tech companies have made as a result of it. Section 230 has remained unchanged for decades; whether you believe it should be repealed or amended, change is long overdue. It is incredibly outdated, and I think he was very right to start by acknowledging that.
SJ. Yeah, that was interesting, wasn't it? It's also something we discussed in the last podcast we recorded, on social media platforms' responsibility, free speech, and how they use it to avoid any responsibility for the terrible things people say and do. So yes, obviously, I agree. There isn't a blunt-instrument solution to this; it needs to be carefully thought through. One of the problems with congressional action very often, and actually not just Congress but all governments, is that they want headlines, so that the public can see they're doing something, and they don't necessarily think through the context, the nuance, and where this is going to go.
SJ. The problem, or I guess one of the problems, is that if they genuinely want to avoid negative consequences, they need genuine support and engagement from the tech companies, who understand their systems and what can be done with them, and I don't know that there's any genuine intention from the tech companies to do that. If there were, they wouldn't have waited until the last possible second, eking the last possible dime out of users, before introducing new safety measures ahead of the hearing. Right down to the wire.
TW. Yeah, I think that's certainly a fair statement, and before he handed things over to Senator Graham, Durbin noted that this is the one issue, and I really do mean right now the one issue, that has united Democrats and Republicans in the United States, and he voiced his hope that this hearing might be a call to action to get something on the President's desk that could actually hold Big Tech accountable.
TW. Obviously, we'll see, but Senator Graham picked that up and promised that every Republican would answer that call. Again, we'll see. When you're talking about something like this, and you see a video like the one they showed at the opening, and you hear consistently from your constituents that this is a problem affecting their lives, it is moving; I hope we see action that reflects that. I think it's worth talking about this just a little bit more, because particularly with TikTok, this is the one issue where you see overwhelming support, and so although we've not seen a lot of progress on a lot of issues, I do think there is reason to hope that we might see some change here.
SJ. Yeah, it's hard to overstate how inactive this current session of Congress has been; I read somewhere that the House has passed 30 bills this session, when normally it's something like 300. And we're focusing on the American government, let's be honest, because even though the majority of this material is made and distributed abroad, it's distributed on largely US-based platforms, and that's why the emphasis is on getting that government to do something about it.
SJ. It is remarkable that you've got people working together across the aisle. It's basically unheard of right now, particularly in the context of immigration reform, which everybody wants, and which has apparently been deliberately held up because the likely presidential candidate doesn't really want it to pass, so he can use it as an election talking point. The fact that they're willing to work together tells you that there's genuine concern from their constituents, and they need to be seen to be doing something about this even in an election year.
TW. Yeah, I think that's right, and you can see it even in the language they use. I didn't really get the sense that many of the members of this committee were trying to score points or make their colleagues on the other side look bad. You see a lot of back and forth between Democrats and Republicans really echoing the same sentiments, even using the same language, and that really does matter, because a lot of the time now, when you hear conversations between Democrats and Republicans, they're talking past each other; they're not even using the same words. It's like they're speaking two different versions of the English language. And here you have Senator Graham picking this up and saying: Republicans will answer this call. Social media companies are providing dangerous products, they're destroying lives, and they're destroying democracy itself. Let's do this. It was a really promising start to the hearing.
TW. You know, Senator Graham spoke specifically about one suicide case, and told Mark Zuckerberg straight to his face that he has blood on his hands, before comparing social media to cigarettes and guns. That's a big step, particularly for a Republican senator to take. He added that of all the people in the country they could give blanket liability protection to, this would be the last group he would pick, while gesturing broadly at the panel of social media CEOs, and he called for the repeal of Section 230. This is not a super-left, super-progressive, super-nanny-state senator; this is Senator Graham. And he noted that although there were five bills to provide more protection for consumers, once they left committee, they tended to go nowhere. So I think it is worth really noting the support that this has right now.
SJ. It won't be any surprise to you, or to listeners, for me to say that I don't really agree with Lindsey Graham on virtually anything.
TW. I am sure everyone is shocked to hear that.
SJ. I mean, good lord. The world shifted on its axis, because, yeah, he's absolutely right, and I admire him for saying it, and now I need to go and have a shower. This was really good stuff, and it's true. He said something I'd been thinking when I heard this hearing was happening: it's like inviting the cigarette companies in to tell us how to regulate them. Because that's exactly what this is. You've got a special advisory from the US Surgeon General about potential harm to children, the same way they brought one out for cigarettes. This is a similar-scale problem; this product is addictive, and it is harmful. So yeah, it was brilliant, and I sort of hate myself for saying it, but he did a good job.
TW. You're in good company, because he raised Senator Elizabeth Warren and said, you know, we have nothing in common, and he added that he had promised her he would say that publicly if he used her name, so that they could park their differences to work on this issue together. Then he congratulated all of his colleagues on the committee, Democrats in particular, for working collaboratively to address it. That is so far removed from most of what you see in US politics today; it really was a moment. It's surprising in a way, because social media has done so much to erode public decency, how we talk to each other, and all of these things, and yet it was social media, this issue, that brought people together and said: wait a minute, we have to figure out a way to work together, because this is a greater-good question. I think it's interesting anyway.
SJ. It was shocking, and that moment also stood out for me. This was sort of how Congress used to work, particularly the Senate: dealing with serious problems in a serious way, without the polarised and polarising language. I think you described that beautifully earlier on. This is them trying to tackle a serious problem, and there are so many serious problems they need to tackle in this way. It is ironic that the great polariser, the thing which has driven them to these extremes, is the thing that's brought them back together again. How frustrating that must be for Mark and his little tech buddies.
TW. Well, I think they probably had a rough day. Just before we move on: one of the things Graham noted, which I think is worth tucking away in the back of our minds, is that we have all these problems and all these concerns while AI is really just starting. So we are already underwater on this, we are already behind, and now we have this new issue on the horizon. We need to get our house in order, no pun intended, and prepare ourselves to address the new challenges that lie ahead. I think that was a really important moment too. But I don't want to spend too much time on the preliminary stuff, because it is worth looking at what was actually said.
TW. So, just one other thing, because this comes up later: Senator Durbin was clearly annoyed that not everyone was there voluntarily. He thanked two CEOs specifically, those representing Meta and TikTok, for coming voluntarily, and then said flat out that the others were there only by subpoena, and specifically that Discord's CEO only accepted service after US Marshals were sent to Discord's headquarters at taxpayer expense. Senator Durbin comes back again and again to Discord's CEO, and there's a bit of a tone there; you can tell he's annoyed, I think for good reason. You refuse to show up, you require a subpoena, and then you turn up and say, thanks so much for having me, it's great to be discussing this. It's not really the best way to start something like this.
SJ. I mean, I don't know the CEO of Discord by reputation at all, but seriously, what a moron. They invite you to come and chat; it's not really an invitation, is it? Now, to be fair, Congress has done itself some harm here, because members of Congress have also decided not to appear when asked to talk to committees in the recent past, and that does erode public respect for the forms of government and behaviour. But come on: having to send the US Marshals to tell him that he's going to come to the meeting whether he wants to or not? It's a bad start, and he was asking for trouble. I don't know who his lawyers are, but he should probably get new ones.
TW. Yeah, I would agree with that. I guess maybe he feels like he has some sort of image he needs to uphold, or something like that, but I think it was not the right decision. So, the witnesses were sworn in; I'm just putting that out there so that we're clear that the witnesses were under oath for everything that follows. They swore the usual oath, and everything they said afterwards was both under oath and on the record.
TW. So, Discord's CEO, who we were just talking about, Jason Citron, was invited to give his statement first; each witness was invited in turn to do this. I'm going to spend a little bit of time describing his, because the other witnesses used a very similar format. I don't know if they were cheating off each other's notes beforehand, or if they just got very similar advice from the probably small handful of law firms really capable of advising in these situations. Just like when we looked at social media platforms in our previous series, we think it's always worth noting how these platforms see themselves and how they present themselves, and this is an opportunity to hear directly from their CEOs rather than reading through a website. So, I think it's a really good opportunity to get a sense of what matters to them.
TW. So, Citron shared some company stats, including how many Americans they employ and in how many states, before adding a personal note about his own love of video games and his hope that Discord can be a place where anyone can find their friends. He teed this up to be about how games bring people together, and listed Discord alongside the likes of iMessage and Zoom in describing how they have revolutionised how people communicate and connect with each other. Then he stated that, just like with any other technology, there will always be people who seek to exploit it and cause problems. He did admit, to his credit, I think, that Discord has a special responsibility to protect users because of the number of young people on their platform, but then he said that's why safety is, quote, "built into everything we do". He brought it back to the personal after that, noting that he's a father of two children, whom he wants to be safe on Discord and whom he wants to be proud of him, and that's why he was glad to be there to talk about this important issue. He did not say, of course, why it required the intervention of US Marshals to get him there. He then went on to talk about some of Discord's safety features in a broad sense, including their decision not to encrypt messages and their zero-tolerance policy for CSAM on the platform.
SJ. Yeah, I've got a shovel, and you could probably shovel some more of that, right? Of course he's got zero tolerance for CSAM on his network; it's illegal. But that doesn't mean he's actively trying to stop it, or that any of his safety features really make children safer, and given the distributed nature of Discord, they're going to have to work a little bit harder than most networks to make it safer. And let's be honest: unlike most of the other platforms, which tend to target a slightly older demographic, particularly Facebook, this one is aimed at children. The pandemic did them tremendous favours; a public disaster did Discord's business a huge amount of good, just as it did Zoom's. But they now have a platform which is difficult to monitor, which is directly in the hands of children, and which is being used for all sorts of communication, illicit and licit, I guess, and that's a massive problem. Reading that statement, it didn't really hit home for me that he takes this seriously, despite being the father of two whom he wants to be proud of him.
TW. Mark Zuckerberg was next. I don't think he really needs an introduction. He started by noting the amazing things teens do on their platforms every day, and insisted that, overall, teens tell them that this is a positive experience in their lives. Zuckerberg too added a touch of the personal, saying that being a parent is one of the hardest jobs, and claimed that they are on the side of parents everywhere working to raise their kids. This is worth noting because it was definitely a theme in a lot of his responses; he would say again and again that they want to support parents, and I think it's an interesting strategic position to take.
SJ. Yes, no, I think so too. The idea that they've built safety into everything they do rang hollow, but also his framing of it, "I'm a father of two and that's why I want to do the right thing", maybe on the heels of, or in connection with, "I'm so glad to be here", and how disingenuous that sounded... it surprised me a little, because it doesn't strike me as something somebody who's really confident in their position would have to say, referring to their personal position: oh, I'm a parent to this many children, and this is why you can count on me to do the right thing. What are the facts? Let's stick to the facts, and to the actual problems. But of course, there was no mention in his opening statement of where he thinks they could do better, and to be fair, no one said that, right? Nobody was volunteering a list of mistakes they'd made or places where they could improve. But it was definitely interesting to hear from him and to hear his take.
TW. He then listed some of their efforts over the years, including a number of things we know for a fact haven't worked, and we know they know don't work too, but that didn't stop him from listing them. Then he claimed that Meta goes above and beyond legal requirements and finds and reports more harmful content than anyone else in the industry. He did not offer any commentary on why there's so much harmful content to be found or reported in the first place, but this was his time to speak, and nobody interrupted him to ask that question. He expressed hope that the conclusions of this committee would give parents what they really want, and I think positioned himself to speak for them, and then sort of put it onto the app stores: he suggested that it really should be the app stores doing age verification and giving parents control over the apps their children use.
TW. So, in other words, he was attempting to shift a large portion of responsibility, and potential liability, to someone else. In his concluding statement, he said that he wanted to recognise the families in attendance who have lost someone, and said he hoped they could make progress, and I think if Citron's statements rang hollow, they were outdone; Mark Zuckerberg managed, remarkably, to outdo Jason Citron with his statements.
SJ. Everybody's favourite android, trying to appear like a real human boy. It never lands well, and when you talk about him saying all these things about supporting parents, the only thing it reminds me of is the big food companies trying to convince people that the reason they're fat is that they don't exercise enough, and that the reason we have a climate crisis is that people don't recycle enough. It's the same strategy: it's not us, it's you, it's the way that you use our product. And to use his analogy, it's a little bit like blaming newsagents in the UK for teens smoking cigarettes. Yes, we make the most addictive substance on Earth, but it's not us, it's how it's actually sold; it's really the newsagents you should be regulating, isn't it? Because they're the ones actually peddling this crap. I know I've personally made billions from the sale of this toxic material, but it's not my fault.
TW. Yeah, no, I think that's exactly right. I don't think anyone was particularly convinced or impressed by those statements; I have to imagine the families who were there probably had a very sick feeling during that part of his statement. But Evan Spiegel of Snapchat was next; he's one of my all-time favourites. He spoke third, and he too started with stats, then quickly turned to the personal side, recounting what led him to build Snapchat with his co-founder when he was only 20 years old. There was almost a patriotic note in what he was saying: he started this when he was just 20, started from nothing and built it up. He said that at the time there were no alternatives to mainstream social media, and he really wanted an option that was fast, fun, and private, without public likes or comments. He listed some of their efforts to address concerns raised by the committee, and expressed his own belief that children under 13 should not use Snapchat. Then he too suggested that parents use the controls available to them and restrict the apps their children use, as he does, or rather, and this was a real favourite of mine, as his wife does in his own household.
TW. Spiegel then expressed gratitude for the opportunities his country has afforded him, but stopped short of singing the national anthem, and I'm sure we're all very grateful for that, and then pledged to work cooperatively with the committee. Again, I think standing on its own this would have made everyone cringe, but after the first two it was a bit like: oh, okay, more of the same then, with a slightly different twist this time around.
SJ. Okay, I think I threw up in my mouth a little bit as you were reading that. Seriously, dude. My kids, who are somewhat older than yours, were around when Snapchat became big, and they talked about it. I asked them what it was about, and they said: oh, it's a platform where kids can send naked pictures of themselves to other kids, and the other kids can't keep the photos. Of course, that turned out to be a complete lie, as you can always take a screenshot with your phone. So the private and temporary was indeed the point of Snapchat. Come on, buddy. And not only does he offload responsibility to the parents, he doesn't even take responsibility himself in his own house; he offloads it to his wife. I guess if they'd had a Stars and Stripes flag on the desk, he might have hugged it to show how patriotic and awesome he was, but fortunately they didn't. All right, I suppose we should move on to the next one.
TW. Yeah, that would have been a nice touch, and fittingly, next up was Shou Chew from TikTok, and we know how Congress feels about TikTok at the moment. Chew started with stats, and then quickly added that he's a father of three himself, so we can see a clear pattern emerging now, before saying he's proud of the efforts they've made to protect children. He said they're vigilant in enforcing their 13+ policies, which is very much news to me and everyone I know, and that they offer an experience to teens that is more restrictive than for adults. He went on to describe some of these policies, said that they expected to invest more than $2 billion this year in trust and safety, much of it here in the US, and that was really it.
SJ. I mean, if I were him, I would have kept it short too, because you're talking to a hostile audience to begin with; you are the only non-American company in the room, and they already don't like you. But seriously, how exactly is it that they enforce this 13+ rule? I actually found it difficult to talk, because my jaw hit the desk when you read that. Good lord, buddy. They're not that stupid. I know they're old, but they're not dumb.
TW. No, and I had the same thought when Evan Spiegel was talking about his view that under-13s shouldn't use Snapchat. I think about the kids at my kids' school who are on Snapchat, starting sometimes as young as nine, but certainly by 11. I would say the majority of 11-year-olds at their school are on Snapchat, and I've seen zero evidence that Snapchat has done anything about it, and I think it's similar, maybe worse, on TikTok. Again, they didn't have an opportunity to push back on these statements there and then, but I think people were definitely taking notes.
SJ. The other thing to remember is that child sexual exploitation doesn't end at 13. That's not the age, guys; there's a different legal reason why you fixate on 13 (the under-13 threshold in COPPA, the US children's privacy law). It's got nothing to do with kids suddenly becoming adults at that age, and it's not as if sexual images of teenagers are legal. I think in their own heads they haven't got their minds around the fact that it doesn't stop there.
TW. Well, last but not least, we heard from Linda Yaccarino, CEO of "X", formerly known as Twitter. She was the last to give her statement, and notably the only woman on the panel. She began by thanking the committee for the opportunity to discuss "X"'s safety features, and spoke about how urgent this issue is for her too, as a mother. So again we see that pattern lining up, although she led with the personal before she spoke about stats; it came right after the thanks for the opportunity, which is a slight variation. She described "X" as an entirely new company, which was a little bit surprising, and also as an indispensable platform for the world and for democracy, and gave her personal commitment that "X" will be part of the solution. Yaccarino said that although she was new to "X", she brings with her a wealth of experience in working with stakeholders to do this kind of work, and that "X" is not the platform of choice for tweens and children, so they don't even have a line of business for that demographic. She claimed that fewer than 1% of US users are between the ages of 13 and 17, that they don't have users under 13, and that "X" is not available to under-13s; again, there's no reason to believe they're any better at checking this than any of the other platforms. She went on to say that "X" has protective policies in place anyway, and has a zero-tolerance policy for CSAM.
TW. She did not say what impact their decision to gut their trust and safety team has had on these efforts; I'm sure that surprises no one. In closing, Yaccarino expressed her belief that platform safety and free speech can and must coexist, although, again, she offered no real explanation of how to balance those two things, or of which is her priority.
SJ. None of them are supposed to be available to people under 13, but we know they all have those users. Now, admittedly, X becomes less attractive to people under 13 as time goes by, because it's become the home of white supremacists and mis- and disinformation machines. But seriously, the point isn't that kids need to be spending time on the platform for explicit material about teens and children to be spread on it by the 99% of their users who are apparently adults. What is that? It's a meaningless statistic which just sounds pleasant, and it is in any case itself a lie. And she didn't address this: X is the only platform I know of that has deliberately allowed somebody who distributed child sexual abuse material back onto the platform, because her boss, the chief technology officer, made that decision.
TW. Yeah, it is really interesting, because listening to her at the hearing, I would say that of all the CEOs on that panel, she was probably the most enthusiastic about supporting the committee's work. She really was: you have my personal commitment to this, I'm going to help you, we're going to do this together, this is serious, and even though this isn't so much an issue for us, because we're dealing with adults, we take this seriously. But I think it's worth sharing that the very next day, Elon Musk tweeted the following: "When you hear the names of legislation or anything done by the government, it is worth remembering that the group that sent so many people to the guillotine during the French Revolution was called the Committee of Public Safety, not the cut off their heads committee".
SJ. Yeah, that just tells you how much her personal commitment counts, because it's his company. He bought it; it's not publicly traded. She works for him; he hired her. She is his little public face, marginally, and increasingly, more acceptable than his own, because of his efforts to undermine the government and democracy and blah, blah, blah. This is the man who was accused in the newspapers this week of spreading election misinformation online, in the name of free speech. Come on. So whatever she says, and I don't doubt her personal commitment, this material is abhorrent; why wouldn't she feel a personal revulsion towards it? But what does it matter what she thinks? She's not actually running the show, and we all know it. She's the one person from these tech companies who we know isn't running the show at their company, unlike Mark Zuckerberg, who is the majority shareholder and decision-maker, right?
TW. Well, and I hope that all of these CEOs are genuinely as disgusted by this material as you or I would be. But it's one thing to be disgusted by it; it's another to take action that stops it, even if that gets in the way of profits, and I think that's the issue: we're just not seeing that connection happen. And maybe we shouldn't; maybe it is the role of government to say, it's not your job to regulate yourselves, it's our job to regulate you. So here you go.
TW. But what's happened, because of Section 230 and the way this has all played out, is that these platforms are largely shielded from liability, so they can get away with these things, and there's no way to hold them accountable. So, I think it's worth looking at Senator Durbin's round of questioning. We won't do this for every single senator, because then we'd be here for nine hours instead of the four this hearing took. He specifically asked Citron whether he would support the CSAM bill, and Citron hedged and said he would be open to discussing details and, you know, blah, blah, blah. Durbin was having none of it, turned to Spiegel, and said that already in 2017, Snapchat had been identified by law enforcement as a paedophile's go-to sexual exploitation tool, and he described one specific case where a perpetrator admitted he used Snapchat, and only Snapchat, because he knew the photos would go away. Durbin asked whether Spiegel and his staff could really claim they failed to see the risk. So he was trying to nail them down, and Spiegel said: oh, this behaviour is disgusting, and we provide tools for people to be able to report it, as they had all said a few times by this point. Durbin, referring back to his example case, noted that the victim tried to sue Snapchat, but the case was dismissed due to Section 230, and then asked whether Snapchat would have worked harder to implement safety features if they had been subject to liability in circumstances like that one. I'm sure it will shock you, Steve, that Spiegel didn't provide a direct answer to that question, and instead said they only recommend friends to teens if they already have multiple people in common.
SJ. I mean, (a) that's not actually much protection; just because you know lots of people doesn't mean you're not a sexual predator. And secondly: you're a disgusting human being, seriously. Just because you're shielded from being sued by Section 230 doesn't mean you shouldn't be doing something to protect children. Yes, obviously he cares about his own children, because they all do, but he obviously doesn't care about other people's, right? That's your problem right there, mate: to you, those children are just dollars in your bank account and your children's trust fund. So let's get a little bit real here. I mean, well done, Senator Durbin, who unfortunately isn't actually allowed to physically skewer people giving testimony.
TW. No, though I think he might have liked to, especially with Citron, who he then turned back to. Remember, Citron is the CEO who had to be served by US Marshals, and this definitely got under Durbin's skin. So he turns back to him and says: "How do you justify a lack of moderation in groups of fewer than 200 people on Discord?" Like Spiegel, Citron attempted to sidestep the question, and instead talked about something they've implemented to support teens across the platform. Durbin responded that if that were working, they wouldn't all be sitting there.
SJ. Oh, 10 out of 10, Senator Durbin and well done!
TW. It was a great moment, I have to admit. I'm not often on the side of aggressive questioning from members of Congress, but in this case, bravo. So, I was surprised when Senator Durbin then turned to Chew with what felt like a softer, broader question: he asked what they're doing specifically, and whether they find CSAM on TikTok. Chew repeated his plan to invest $2 billion in trust and safety this year, and I think thought he had almost gotten away with something, until Durbin turned back with a much more direct question: why is it that TikTok is allowing children to be exploited into performing commercialised sex acts?
SJ. Ouch!!
TW. Yeah. Chew respectfully disagreed with that characterisation, and said that their livestreaming products are not for anyone under 18.
SJ. Ah, so their sexual performers have to provide passports or birth certificates? Or how else do they verify the age of these people? Surely he can't possibly believe that people might lie in order to make money. Although, given his testimony so far, I wouldn't be surprised if he claimed to be surprised by that.
TW. Well, you and I know full well that there are tools out there that can tell your age based on your face, the way you type, the language you use, and the way you interact, and I have no doubt that TikTok is collecting all of that information anyway; I would be truly shocked if they weren't. So the idea that they don't already know, or at least have some idea, is absurd. But I thought it was a good question to ask, even though he was never going to get a direct answer. Putting it out there was important.
SJ. It set the tone, didn't it?
TW. It certainly did. Let's jump back to Senator Graham. I know he's not your favourite person, but again, he started at the front of the line with Citron, and he pushed back on the notion that this is the start of a discussion, saying, you know, we've been having these discussions for a long time, and now it's time for results, and he asked whether Citron agreed with that. They had a bit of a back and forth before Graham listed the bills they're considering, in order, and asked whether Citron supports them, one by one.
TW. Citron attempted to sidestep with answers like, oh, we look forward to having conversations about this, and Graham pushed for a yes or no, eventually concluding, across the board and with no objection from Citron, that the answers were no. Finally, Graham asked whether Citron would support the repeal of Section 230, so that social media platforms could in fact be civilly liable for these issues, and of course Citron would not confirm that he would, though he admitted it could use an update. So Graham, I think importantly, noted that if we're going to wait around for this panel of witnesses to make changes and solve these issues themselves, we're going to be waiting a hell of a long time.
SJ. Well, he got that right, didn't he? It's like cutting off part of their food supply, isn't it? Why would they actively restrict the money they can make from this business? This guy is really not distinguishing himself on this panel, is he? I'm not in this position, because I don't have children of this age anymore, they're all adults, but if I were a parent of underage children, and let's face it, we're talking about 18 and down, which is a huge demographic on Discord, I think I'd be seriously reconsidering allowing my kids to use it. He has made himself look very bad, and by consequence, so has his platform.
TW. Yeah, it was not his day. It's interesting, because I think Snapchat really should have been under fire more in some ways, but he, Citron, really managed to frustrate and anger a lot of people, and so he was first in line every time. But Graham didn't only focus on him. He turned to Zuckerberg and picked up the question about Section 230 again. He started by saying, "I'm going to try to be respectful", at which you could see everyone looking at each other, trying to figure out what was going to come next, and he referenced a young man who got caught up in a sextortion ring and died by suicide, and asked Zuckerberg what he would like to say to this boy's family. Zuckerberg said that was terrible, and no one should have to go through that. So Graham asked: do you think they should be allowed to sue you? Zuckerberg of course said, I mean, I think they can sue us, and Graham said they can't, and then said the committee is done talking; it's time for action.
TW. Of course, he then went on to talk and say a lot more words, but the point was out there: you admit that this is terrible, that this should never happen to anyone, so should you be held accountable? I know that for our European listeners, and for listeners outside the United States, this might feel a little strange, this idea that the way to hold people accountable is to be able to sue them. But that's just how it works in America: you can't always count on somebody to regulate and make sure people are ticking the boxes. That is one thing that was really strange for us, coming from the US to here, and there are advantages and disadvantages to both. But I still find it very surprising when people are willing to do their jobs just well enough, allowing some mistakes and some things to go wrong, knowing that no lawsuit will result. Part of the way the US system functions is that you have this ability to sue and to win these big awards, because it scares companies into doing the right thing. I'm not saying that's how it should work; I'm just explaining that that's how it does work, and social media really is the only exception to that.
SJ. Yeah, because they're covered by Section 230, as we talked about earlier, and he knew that. He knew he can't be sued; he's sitting pretty on his pile of cash generated by the blood of others, as Lindsey Graham put it, or paraphrasing what Lindsey Graham said.
TW. Backing up just a little bit, because I don't want to get skewered myself for this: social media companies are not the only industry with some protection from civil liability. Some pharmaceutical companies have that protection, and government has some protection. But social media is the only sector like it that has this protection. We don't offer it to people who manufacture cigarettes, for example, or people who make cars; you might have carve-outs or limits, but by and large, you can sue, and social media has a sort of weird blanket protection that you just don't see in most areas of the law.
TW. So, yeah, Yaccarino was up next, and again, I give her credit for this, because she was able to sidestep with a lot more skill than Citron did. Graham asked her the same question, and his point here was really that until the courtroom doors are open, this issue is going to continue. Of course, she didn't agree with that, but it was a much more credible and softer "let's talk about this some more, I'm not saying no, let's figure this out".
SJ. She's better. Okay, that's great. But her answers were the same: let's talk more, because every minute I keep you talking, I'm making millions. I mean, she's not, because she doesn't make the money, and it's Twitter, and they're not making much money anymore. But that's the point, right? Delay, delay, delay.
TW. And this is what they do; you and I have seen this with lawmakers in Europe. This is their whole game. They spend a lot of money and a lot of time getting lawmakers to delay, because every minute there's a delay, they're making money, they're building their brands, they're getting ahead of the game, and they're creating new problems that we'll have to address tomorrow. They're always one step ahead. Speaking of which, Graham turned to Chew of TikTok and said: "Okay, $2 billion. Great. What percent of your revenue is that?" Which is a great question, because I think most people don't really have a sense of what TikTok's revenue is, and $2 billion sounds like a lot of money. Unsurprisingly, Chew declined to share their financials, and so this was left hanging, but I think it was enough to make his point.
SJ. I think it was, and to be honest, going back to something you said earlier, when he mentioned that figure in his introductory statement: it sounds like a lot of money, but when you say, well, we're going to invest $2 billion and most of it will be invested here, it sounds like a bribe. It sounds like a threat. It's like: we've got $2 billion that could be coming to one of your states, but you know, it doesn't have to, so be careful what you do. And that is in fact what he was saying; that is the game being played here.
TW. I think that's exactly right, and I hadn't even thought about it that way, but it's very true. Where's that money going to go? Who's going to make it? And this is why we're stuck in this cyclical hellscape.
TW. So, soon after this, the other committee members got to take their shots; they were able to question the witnesses. Senator Klobuchar went first and asked, I think, some very good questions, and shared some stories she had heard. I think she was genuinely moved and emotional about the whole thing, and with some of those stories you can understand why, because even repeating them is a lot. Much of the rest of the hearing, aside from questions about TikTok specifically, involved senators reading sobering stats, sharing devastating stories, and asking the witnesses, with varying degrees of passion or enthusiasm, whether they supported various pieces of legislation, and why not. Admittedly, we had some yelling and some shouting, which we often get at these hearings; it helps break things up. And the witnesses, of course, repeated the same answers they'd already prepared and provided. So if you watch that first hour, everything that comes after is largely a repeat, recycling the same things again and again.
SJ. Yeah, which is why, of course, they're willing to swear an oath. These guys, at their core, are also politicians: they stick to their talking points, they do not deviate, and their only job is not to get angry when they're shouted at and vilified and ridiculed by the people on the panel. I think this is why the senators do it: they try to rile them up so that they'll say something they don't intend to, and unfortunately for the congressional panel, this panel of witnesses was equal to the task. They didn't screw it up. From their point of view, they didn't admit that they were at fault, they didn't admit that they could be doing more, and they kept to, and in this case the phrase has never been more apt, the company line. They earned their money, right? That's what they did. They earned their exorbitant salaries and stock interests by delaying this once again.
TW. Yeah, and I'm sure they were quite well prepared for all of this; some of them have had quite a bit of practice at this point, too. So, that leaves us with the big question, right? Will this be THE thing? Will this hearing be the trigger that actually leads to change? What do you think?
SJ. It sounds like they want it to be, right? They are heavily critical of Section 230, and whether they repeal it or reform it, I think that is relatively low-hanging fruit; it's clearly problematic. These guys have pissed these guys off, right? The Senate is clearly annoyed, because nothing other than collective fury would have brought them all together and in such agreement. Partly this is because the subject is objectionable, it is vile, and these companies have allowed it to proliferate, made it easier to access, easier to do, and easier to do with anonymity.
SJ. But it is a difficult subject. Apple, was it last year or the year before, had proposed scanning iCloud for child sexual abuse imagery, and privacy advocates forced them to withdraw the proposal, and there are lots of problems with companies just scanning people's content looking for potentially criminal behaviour; there is a thin-end-of-the-wedge argument to be made. So it isn't easy; this is going to be difficult to do, and whilst I have some hope that they are taking this seriously and want to make progress, a lot of money is going to be spent by lobbying groups in Washington to limit the impact on these companies' income, and in an election year, politicians need money more than anything else, more even than votes.
TW. Right, I think that's all fair, but a few things leave me feeling slightly more hopeful. The fact that there is really broad support on both sides means there's some hope that people who don't get on board will be clear outliers, and it'll be fairly obvious who's being problematic and who cares more about things that aren't our children. This is a topic American politicians get excited about, although not so much with school shootings.
TW. I think the other thing that makes me a little bit hopeful is that this has been coming for a long time, and the build-up behind it has become strong enough. Sometimes we think about courts and governments as leading the way and doing all these things; most of the time they don't. It's public opinion, people who say, this is enough, we've had enough of this, what are you going to do about it, that pushes them to do this stuff. So I hope we see some changes to [Section] 230 over the next year, but I definitely think we're going to see some of these new bills get pushed through, at least I hope we will, and I think TikTok in particular is vulnerable, $2 billion or not. It has a target on its back, as you and I have talked about for a long time, and I think that would be a very easy place to focus their efforts.
SJ. Yeah. I think you and I both agree that tackling online anonymity is something which would be really helpful, because people commit crimes because they are untraceable, or believe they are; that needs to stop. I think meaningful age verification, genuinely meaningful age verification, is essential, and the companies' fixation on the age of 13 only exists because there's legislation, so legislation definitely has an impact. They obviously need to be regulated; they can't behave themselves. They pursue the interests of their shareholders, which of course include themselves, above the public good, because that's what they think they have to do, and on this issue, at least, it seems like the US government is actually pursuing the interests of its population, and by extension everybody else's. So let's keep those fingers crossed. And, you know, it's nice that things are moving in our direction; we designed this platform to solve or prevent many of these issues. So it will be really interesting to see. They're making our case for us, thank you very much.
TW. Yes, it's always helpful when that happens. Next time, we'll get back to our regularly scheduled programming and free expression and free speech. But thank you for joining us for this special episode. In the meantime, we'll post a transcript of this episode, with references, on our website.
SJ. Until next time, I'm Steven Jones,
TW. and I'm Taryn Ward.
SJ. Thank you for joining us for Breaking the Feed, Social Media: Beyond the Headlines.
Join the Conversation
Join the waitlist to share your thoughts and join the conversation.
The Bright Team
Two lawyers, two doctors, and an army officer walk into a Zoom meeting and make Bright the best digital social community in the world. The team’s education and diversity of experience have given us the tools to confront some of the toughest tech and social problems.