The Rise and Fall of Social Media Empires: Meta (Facebook), Part 2 Podcast Transcript
The Bright Team • Oct 19


Breaking the Feed, Social Media: Beyond the Headlines

FaceMash, The Facebook, Facebook, Instagram, WhatsApp, Meta... is the Threads flop the end of an era? In this three-part series, we go back to where it all started and trace Facebook's early successes and controversies. This episode takes a closer look at some of Facebook's early challenges and its acquisitions of Instagram and WhatsApp, with a closer look at Cambridge Analytica.

Taryn Ward  Hi. I'm Taryn Ward,

Steven Jones and I'm Steven Jones,


TW.  and this is Breaking the Feed, Social Media: Beyond the Headlines. 

SJ.  We're taking a closer look at the core issues around social media, including the rise and fall of social media empires, to better understand the role social media plays in our everyday lives and society.

TW.  Today, we'll continue to take a look at Meta. Last time, we went back to the beginning and traced the early formation of Facebook through to the acquisition of Instagram and WhatsApp. This episode, we'll take a closer look at a few of Facebook's more recent controversies, including Cambridge Analytica.

Backing up slightly, to before Facebook acquired Instagram in 2012 or WhatsApp in 2014: Facebook had already faced some difficulties in 2004. So basically, right from the start, the Winklevoss twins accused Mark Zuckerberg of stealing the idea. This was only settled four years later, in 2008, for $65 million in cash and stock options; then, not long after, in 2010, "The Social Network" came out. I don't want to spend too much time on this because there's a lot we could say. But Steve, did you watch "The Social Network" when it came out?

SJ.  Oh, yeah, well, yes, I did. I didn't go to the cinema to watch it because I wasn't that excited, but I did watch it, and I thought it was really interesting. It's a little hard to know, right? From the outside, we don't know where the truth lies, but it certainly didn't reflect well on Mark Zuckerberg. Jesse Eisenberg, I thought, played the slightly weird entrepreneur really, really well. He was really engaging; he carried the movie, right? And the Winklevoss twins did not come across as particularly endearing characters either. I'm not sure that I would want them running Facebook. In fact, I'm damn sure that I would not want them running Facebook. So it was really interesting. But there was also this look into how the company got started and got its first investment, and the pressures that Mark, who was a very young man at the time, significantly younger than both of us, had to deal with in order to make the company take off. So I did have a little bit of sympathy with him, just because it was obviously a lot of work, and we know now ourselves how difficult it is to do this. But yeah, it was a great movie. What did you think of it?

TW.  Yeah, similarly. I mean, I think it's really hard to know how much of it is true and how much is not. But I think it did a really good job of painting a colourful picture, and at least some of what they presented is verifiable. If that movie stood alone, it might have been nothing. But with the background of Facemash being the way it all started, and the accusation that the idea was stolen, and then you add this movie on top, by the time it came out and people had watched it, there weren't exactly warm feelings about Mark Zuckerberg. And I don't know if you remember this; I was thinking about it just the other day. Do you remember when Mark Zuckerberg went on... I don't know that it was an apology tour, necessarily, but the idea was that he was going to tour the US and meet people and show how in touch he was with the average person. Do you remember that?

SJ.  Yeah. Yeah, I do. It was like, "Yeah, look, feel my skin. I'm even warm."

TW.  Yeah!

SJ.  It was. It was like he'd taken the criticism of being a bit of an android to heart, a little too much to heart: no, no, look, the synthetics are really good. Yeah, it was really interesting. And why would you do that? I mean, as a tech entrepreneur, I can't imagine Elon Musk going on any kind of apology tour, can you?

TW.  Well, no, that's true. I think it probably went wrong for a number of reasons. And it's not something that I would want to do, that's for sure. But part of the problem was that he had to travel with so much security that it flew in the face of this whole "I'm an average Joe" image. So rather than seeming approachable to the people who were part of this tour, who attended events or met him in various places, I don't think it really improved how people felt about him. And you know, we said in passing last time, with all of this controversy around Elon Musk, people are starting to feel better about Mark Zuckerberg. It might be unfair, but Elon Musk looks really problematic right now, and people are starting to say, wow, maybe Mark Zuckerberg isn't so bad after all.

SJ.  Yeah, it's remarkable, isn't it? It's really interesting how quickly you can be rehabilitated. You know, Boris Johnson didn't leave office in a good place. But then, only a short time later, Liz Truss left, and everybody was like, wow, so it does get worse, and suddenly your reputation is a little bit rehabilitated. Not enough to become Prime Minister again, one hopes, but still, it's better. Yeah, it's really interesting, isn't it? You do wonder why, though. Obviously you're rich and important, and there are security risks in the US that don't exist in the UK; you see that with the way American politicians travel, right? I was part of a tour when President Obama came to Canada for the first time, and the security is an absolutely incredible machine, the stuff that travels with him, absolutely incredible. But as a tech entrepreneur, do you need that? Do you need the same level of security that the President travels with, all these armed people surrounding you? Maybe most of the time, but not if you're trying to engage with the public; that makes no sense whatsoever. So I don't know. Maybe it was bad advice. Maybe it was just being a relatively young guy and not really having the sense. But it didn't work, did it?

TW.  No, no, definitely not. But you know, there's always this balance, right? We're talking about some of the criticisms and controversies, but we're also talking about some of their successes, and this was all happening in the background. People were also switching over from just using social media on their desktops to using it on their phones, so a lot of change and transition. So Instagram came along. Facebook was already huge, making huge strides in terms of users and daily active users and all these things, but there were a lot of worries about competition and disruptors, especially from Mark Zuckerberg, and there was a fear that someone was going to do to Facebook what Facebook had done to MySpace. Some sources have claimed this was really a fear about competing on mobile in particular, and especially with Twitter. I don't know what Mark Zuckerberg knew or didn't know, but Twitter had already had some conversations with Instagram. In hindsight, Facebook's decision to buy Instagram turned out to be a really good one. And because there were concerns about whether Facebook would be able to pivot to mobile, this was a way of hedging their bets. I think it really was smart. But it was a huge risk. In hindsight I can say all this; I don't know that I would have made that decision, and I don't have all the facts that he had. But a lot of people thought this was a huge mistake.

SJ.  Yeah, that's right. It is interesting, isn't it? It's interesting thinking about his headspace because, in 2010, Facebook had 606 million users. We talked about this in a previous episode, but at that point Facebook was the third biggest country on Earth; only India and China had more people. And he was worried about a user-base collapse, credit to him. By 2012, he had over a billion users, at that time a seventh of the planet, on Facebook, and he was still worried it would collapse. But buying Instagram was brilliant. From the outside it didn't obviously make sense at the time, but it was a brilliant decision, it totally made sense, and they've definitely made the most of that acquisition, sharing all of that data across the platforms. And now, you look at Facebook, at least I look at my Facebook, and it's just cross-selling Instagram Reels, which is really just crap cross-selling of TikToks, let's be honest. So maybe, at this point, it has gone a little bit too far, because what I see on Facebook is largely Instagram content and memes.

TW.  Well, to that end, I have a quote from Mark Zuckerberg about this acquisition in particular. He said: "For years, we've focused on building the best experience for sharing photos with your friends and family. Now, we'll be able to work even more closely with the Instagram team to also offer the best experiences for sharing beautiful mobile photos with people based on your interests." I think that's probably all true. But this purchase, to your point, was largely about connecting with young people. He understood, in a way that I think a lot of people didn't, that although they were growing and had all these users on the network, they were not gaining enough ground with young people by this point, and they were scared. Facebook was really losing the ability to connect with this new generation, and they needed to find a way to do that. I think he learned the hard way, like so many of his predecessors, that what appealed to one generation wasn't necessarily going to carry over as you might hope, and Instagram was gaining that ground.

SJ.  Yeah, I think that's exactly right. There was an article, I think in The Guardian just this week, about ageism in Silicon Valley, and particularly at Facebook. They have not been cool with the youngest generation of users for, let's face it, a decade, and Instagram has helped them get over that hump, as you just said. But what's wrong with being the primary social media for people who are 55 and over? They have the majority of the world's disposable income, let's be honest, so if you're an advertiser, they're a great demographic to advertise to. But there is clearly an obsession with grabbing the youngest users, which resulted in, what, two years ago now, the suggestion that they would bring out Instagram for kids, which was so thoroughly panned that they withdrew the idea. But somewhere in the depths of Silicon Valley, somebody is still thinking about it. It is interesting. Obviously you want to capture the imagination of young people, but, and you know this because you have kids, they're not going to like your platform just because their parents like it. It doesn't matter whether it's good or not; it's not cool, because their mum and dad and, God forbid, granddad use it. So that constant reinvention is something which is probably going to have to happen. But at the same time, if you are an older person who wants to use Facebook for sharing family photos, it's become exceptionally bad at that. As that quote said, they built a good platform for exactly that, and they've systematically made it less pleasant in the hunt for those younger users.

TW.  Yeah, and Facebook paid a billion dollars for Instagram, which at the time was a staggering amount. But times continue to change, and people move on, and you have to find that growth somewhere. So in 2014, they bought WhatsApp for $22 billion. This, again, was about growth: it was about acquiring a platform that was adding over a million users every day, 70% of whom were active every day.

SJ.  Right. Now, I spent a bit of time, as you know, working internationally, and everybody was using WhatsApp in Africa and the Middle East. It was the messenger of preference, even in situations where it should not have been, when we were discussing things within governments which should definitely have been private, but it became the default because everybody used it and it was platform agnostic. Personally, I love iMessage on my Apple device, and all of my family have Apple devices, so it works perfectly well. But if you don't, it doesn't work well at all, right? So WhatsApp was the only game in town. But $22 billion is a staggering amount of money. I'm not even sure that WhatsApp was making much money at that point, and they paid $22 billion for it.

TW.  I mean, they couldn't have been making anything that would have justified that sort of valuation. But it was this potential again. Mark Zuckerberg, again to his credit, saw this potential with Instagram and saw it with WhatsApp. And to your point, WhatsApp really was important, especially in developing markets, where it was helping to fuel growth and where internet connectivity was still more limited, so this was a great way for Facebook to become part of the ecosystem.

SJ.  Yeah, absolutely. Absolutely. And also, it's an additional social graph, right? The people that you talk to on Facebook and the people that you share photos with on Instagram are not necessarily the same people that you share messages with on WhatsApp. So they suddenly get this massive view of all of the people that you're connected with, and what you do with them.

TW.  Yeah.

SJ.  Not that they're reading your messages. Though, back in the early days, those messages were not, in fact, encrypted end to end, were they?

TW.  Ha, well, that feels like a conversation for another day. But essentially, what this meant is that by 2016, Facebook owned Instagram and WhatsApp and essentially owned the social space. Twitter, LinkedIn, Pinterest and Snap all remained competitors, but 79% of adults in the US used Facebook by this point, and 32% were on Instagram. The next closest was Pinterest at 31%, then LinkedIn at 29% and Twitter at 24%, and Instagram, importantly, was growing really quickly amongst young adults. So they were really, really well positioned going into 2016, and in some ways had painted a very large target on their back. Not that it wasn't well deserved. But 2016 is really when we see the conversation start to shift negatively. Whether that's because they had this huge lead and people started to say, wait a minute, this is getting to be a little bit weird, or purely because we started to learn more about their decision-making, I don't know. But this was when they really started to take heavy criticism for fake news. They did try to build some new features that allowed users to flag fake posts, and to improve the algorithm, but reports of abuse, harassment and hate speech continued and actually picked up pace. So they really were starting to struggle with this.

SJ.  And this comes back to what we were talking about: they didn't really set out to build a social network, and they hadn't really thought through the problems. The Social Dilemma, another great movie about, not exclusively Facebook, but let's face it, largely Facebook, because it is so market dominant, talks about some of the features they built in. I think it's fair to say that nobody realised the like button was going to stimulate people's dopamine and give them a thrill every time somebody liked their post. But once they found out, they doubled down on it, right? And the same is true with fake news. The reality for Facebook is that they make money from ads, and the way people work is that they like to be part of an in-group, which makes everybody else part of an out-group, and if you're saying nasty things about the out-group, they like it. Facebook undoubtedly saw this happening but didn't really do enough to intervene, and that just increased the polarisation whose effects we've seen: with the Rohingya, where there's little doubt that Facebook played a part in that tragedy; in the Brexit vote, which I think we'll talk about in a bit; in the US election; and more recently, closer to my heart, COVID. The amount of fake news, misinformation and polarising content has just continued to spiral outwards, and I think it's very bad for everybody, including, in the long run, them.

TW.  Yep, and while that was happening, we started to see more conversations about the amount of time young people were spending on their phones, especially on social media, and some of the first really troubling reports about how Instagram might be affecting mental health, especially for young women. This was still light touch; it wasn't a situation where it was making headlines and people were talking about it all the time. We just started to ask some questions, and started to see some evidence that maybe this wasn't all it was cracked up to be. And then Cambridge Analytica broke.

So for those of you who don't know or don't remember, in 2018 this all came to light. Cambridge Analytica was a British political consulting firm that actually started back in 2013, a subsidiary of a private intelligence company, and the founders and investors were, let's say, politically connected. I don't think it serves us to point fingers or get into the weeds on this, although I think we will in another episode. But the people involved had a lot of power at their disposal, and they used it. After the scandal, the firm closed, and the fallout from all this is still ongoing. But just a little bit about what actually happened, because some of the news articles that came out tell only part of the story, or don't really explain how it all fits together.

So, a quick and dirty summary of how this worked. They extracted personal information from around 87 million Facebook users, largely through an app called "This Is Your Digital Life". This was a personality profiling app, so it looked like some of those silly quizzes that float around on Facebook every once in a while, but it asked personality questions, and the people leading it said that it resulted in over 5,000 data points on each participant. Obviously, that's a lot of data to have from one quiz, so it raises a lot of questions. But part of this was information about the participants' friends on Facebook, who they were connected to and who they were interacting with, and it spread and spread and spread. That's how you get to 87 million people. The method was psychological targeting: an attempt to influence people's attitudes, emotions or behaviours, sometimes all three, by speaking to their fundamental psychological motivations. This is done in various ways, sometimes because somebody wants to sell something; in this case, it was about elections. Cambridge Analytica operated in Australia, India, Kenya, Malta, Mexico, the UK and elsewhere. Executives said they worked on more than 200 elections around the world during this period, most famously in the United States.
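[The jump from quiz-takers to 87 million profiles comes from the friend graph. As a rough, hypothetical back-of-the-envelope, the participant count and average friend count below are illustrative assumptions, not figures from the episode:]

```python
# Hypothetical back-of-the-envelope; these counts are assumptions,
# not figures from the episode.
participants = 300_000   # people who actually took the quiz
avg_friends = 300        # assumed average friend count per participant

# Upper bound on profiles reachable through the friend graph,
# before de-duplicating shared friends.
reach_upper_bound = participants * avg_friends
print(f"{reach_upper_bound:,}")  # 90,000,000
```

[Even before removing shared friends, the reachable pool is in the tens of millions: the same order of magnitude as the reported 87 million.]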

So this information was used to further Trump's position in the 2016 presidential election. And just to note here, this is not a conspiracy theory; it is a well-established fact. There was a £500,000 fine paid to the UK's Information Commissioner's Office for the role Facebook played in it. It's not a question of whether this happened, or whether the information was used, or whether the push was to get people to vote for Trump; these are all things that definitely happened. What is up for discussion is the extent to which it was effective: to what extent did the work Cambridge Analytica did actually impact the election results?

SJ.  And I'd love to come back and talk about Brexit, because that's a subject close to my heart. But if you think about the American election, people think, well, gosh, there are like 160 million voters, how many can be swayed by this system? The reality, and for people listening in the UK this might be difficult to grasp, is that very few of those votes actually count, because some states are always going to be blue, Democrat, and some states are generally going to be red, Republican. It's those purple states that matter, and there are, what, seven of them? And they change slightly over time: Florida used to be a purple state, and now it's more of a red state.

But Trump only really needed to win by a few thousand votes in some of those states in order to swing the election. And whilst it's not conceivable that you could change the minds of 160 million people, you didn't need to; you probably needed to change the minds of no more than 100,000 people, and that is a tiny fraction of the 87 million people being targeted. The other thing social media enables is something we tech companies use all the time: A/B testing. You have a feature, and before you release it to everybody, you release it to some people and see whether their reactions differ from the reactions of the rest of the group, right? A/B testing is a brilliant device. But social media allows you to do that with political messaging. You have all of these people whose profiles you hold, and you can rapidly produce political ads and memes, which you can A/B test in defined, psychologically evaluated groups. I know a lot of people have doubts about whether these sorts of psychological tests have validity, but a lot of major companies use them and believe the results. Whether they're clinically relevant or not is beside the point; they have a very strong correlation with some behaviours, and with a very large data set, they get better, right? So it's really interesting. This gave them the opportunity to sway the minds of a few thousand people, and that's really all they had to do. But you're American; you were voting in this election, I imagine?
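[For listeners curious about the mechanics of the A/B testing Steven describes, here is a minimal sketch with entirely made-up numbers, not anything from Cambridge Analytica's actual tooling: show message variant A to one group and variant B to another, then check whether the difference in response rates is bigger than chance would explain.]

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's response rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical numbers: variant A shown to 10,000 users (520 engaged),
# variant B shown to 10,000 users (610 engaged).
z = two_proportion_z(520, 10_000, 610, 10_000)
print(f"z = {z:.2f}")  # |z| > 1.96 means the gap is unlikely to be chance
```

[Here z comes out around 2.76, so a campaign would keep variant B and iterate; run that loop over many micro-targeted groups and you get the rapid message refinement described above.]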

TW.  Well, to your point, it's a tough one. I did vote in that election. A couple of things. None of us like to believe that we're susceptible to this kind of manipulation, and I understand why that's not a pleasant thought; we'd like to think that we are rational beings making our own decisions. I would just say that there is some very good evidence, and we'll link to some studies and papers in this transcript on the website, that this works. It may not work for people who are definitely going to vote one way or the other, but to your point about which votes matter most, it's the people who are undecided, or who are willing to consider the other perspective, who are most susceptible, and that's who they were targeting. More than that, if you're really cynical and sceptical, consider the amount of money spent on this, and the amount spent on advertising more generally. If it didn't work at all, if there were no chance of convincing people to change their minds, we wouldn't see the amount of money spent on political campaigns, or on campaigns to get you to buy a certain brand, because there would be no payoff. Ad companies don't do this out of the goodness of their hearts; they're paid a lot of money, and companies don't pay ad companies because they make pretty-looking billboards. It is about results. And again, these were some very smart, very well-connected people. They knew exactly what they were doing, and they had a pretty good sense, and some pretty strong evidence, that this was going to work.

SJ.  They were talking to some pretty hard-nosed political operatives, right? It's not the politicians, at the end of the day, who make these decisions; it's the people who work for them to run the elections, and they know what they're doing. It is a data science business these days. And you're absolutely right: ad companies exist because adverts work, and whilst I definitely like to consider myself rational, I also recognise that I've bought things because somebody sold them to me in an advert and I thought they looked cool. Now, did I go out and buy something that I wasn't thinking about buying at all, or didn't think I had a need for? No. But as you said, if you have 5,000 data points on someone, you can predict very reliably whether they're going to vote Republican or Democrat. You don't advertise to them, do you? You go to the ones who are, say, 55% likely to vote Democrat and just try to change their minds by 6%, because that's all it takes at the ballot box. The winning margins are vanishingly small, and we saw that in the US election: the Democratic presidential candidate, Hillary Clinton, won the popular vote but lost in the Electoral College, because it just takes those few thousand votes to sway the result. And this nefarious data scheme was also used during the Brexit referendum, where, again, a very small number of people across the UK swayed the result one way or the other; it's certainly not inconceivable that you can change the minds of 2% of people. And remember, you don't just have to persuade people to change their vote; you can also persuade people who were going to vote for the side you don't like not to vote at all, which also has an effect.
So all of these games get played, and the problem was that Facebook, in this instance, cooperated with Cambridge Analytica in ways which didn't protect people's data, right? That's why they got dinged by the UK Commissioner. But what is the point in fining a company the size of Facebook £500,000? It makes absolutely no sense as a penalty, right? Good grief. It makes no sense.

TW.  Yeah, I mean, the answer you would probably get is that this is a process, and it's partially symbolic, laying the groundwork for future efforts, though whether you buy that or not is another question. But to your point about how this relates to Facebook: this is a privacy issue, and we'll get much deeper into it in our privacy series, where we'll explore the impact of existing privacy policies and enforcement, some of those fines that are not of an adequate size, and why all of this is a problem for democracy. The glass tea mug with the special infuser that I bought, that I don't really need, is one thing. Changing my mind about how I'm going to vote in an election, or whether I'm going to vote at all, is another. And the more information we put out there about ourselves, our children and other people we know, the more vulnerable we are. Sometimes when we talk about this, it sounds like a conspiracy theory, like we're making something small into something big, but actually I think in this case the opposite is true. I would be surprised if we didn't see a lot more conversations around Cambridge Analytica unfolding in advance of the next election cycle. In terms of Facebook, though, after Cambridge Analytica there was some blowback. Whether it was the hashtag #DeleteFacebook trend on Twitter, with a number of celebrities deleting their Facebook accounts and swearing they would never return, or a new appreciation from the public of the concept "you are the product": it's your data, your information, your content; this is not free. It did start some really important conversations.

SJ.  Yeah, I think it was a watershed moment for Facebook and social media. Because up until this point, it had been free, and it was fun, and it enabled you to keep in touch, with a very light touch, with people you hadn't seen for years. But then suddenly this idea that if you're not paying for something, you're the product became very prominent. I'm not sure how many of those celebrities and other people who posted #DeleteFacebook and left actually stayed off it, because there is a lot of convenience to it, and there are, let's face it, billions of people on it. I understand the symbolic value of a £500,000 fine, but it's just meaningless in the context of the size of these businesses. If you and I broke into an election office and changed a few votes, we'd probably serve time, and we would have changed half a dozen votes. How many votes were changed because of the work of Facebook and Cambridge Analytica? And they got a "naughty boys, go sit in the corner". I just don't think that's going to work, right? This is big business, with vast amounts of money, country-sized amounts of income. So there's a real problem here, and I think people are just starting to become aware of it. No doubt we'll talk about this in later episodes, but if there is going to be a downfall of these companies, it will be because of the business model and people's increasing awareness of it.

TW.  Yeah, and awareness is the first step, and I think 2018 brought us that. And 2019 wasn't necessarily a better year for Facebook. That year, they paid out fines to several different governments, maybe not significant enough to be meaningful, but even if they were symbolic, this wasn't just one government saying, hey, you messed up, give us money; it was several different governments saying, actually, you violated our laws, and you're going to pay because we're going to hold you accountable. And then the year ended with another major security breach: a leak of more than 267 million users' information, including phone numbers and email addresses. So you had two back-to-back years that weren't great. You mentioned COVID earlier; that's a really interesting one to talk about because it is so recent and it still stings. As we're recording this, there's a new variant which is apparently resistant to existing versions of the vaccine and boosters, we're heading into another flu season in the northern hemisphere, and there are all kinds of things happening. So this is very much fresh, and COVID, and everything that happened around it, brought a new wave of asking: how reliable is what you read and see on Facebook? And the answer, I think, for most people, is: not very! Now, whether that actually stops people from believing what they're reading is another question, but I think by now we've all wrapped our minds around the fact that this is not a reliable resource in most cases. And then in 2021, because all that wasn't enough, in February, Facebook changed WhatsApp's privacy policy to share user data with Facebook. This was particularly unpopular, given the relatively recent breach headlines, but also because there was no trust.
And this is again, an opportunity for us to look all the way back to the first episode and think about how Facebook started, how the story sort of came together, it's not difficult to imagine why people weren't really excited about giving Facebook more information.

SJ.  I can only think of reasons to give Facebook less information. And I must confess, personally, since these stories broke, I've not completed any of those quizzes or any of those other things that give some unspecified company access to more data. I think a lot of people don't understand how much can be inferred from a relatively small amount of data.

You know, if you have somebody's name and their telephone number, you can identify where they live, and every time you add a data point to that, it gets better and better, right? So, what is it, 300 data points means the algorithms can infer your choices better than your spouse can? And that's a vanishingly small number of data points given how many times you like, share, comment on, look at, and open things on Facebook, right? And now not just Facebook, but also Instagram, and, since 2021, everything that you do on WhatsApp as well. So you can imagine how big these datasets get, and how much better they are at manipulating you. Even if they're not manipulating you to decide who to vote for, they're trying to manipulate you to make decisions that will favour their advertisers, because that is the business model. They're trying to make you buy things that people want to sell, which is pretty nasty when you think about it. And it's also sort of not really transparent: when you drive past a billboard, or you see an advert during the commercial break on TV, you know you're being advertised to. The joy of social media for advertisers is that it's often possible to hide those adverts in something which looks like content. And that makes it more difficult for people to know, right?

TW.  I think that sets us up really nicely for our next episode, where we'll talk about the rebrand. It's not hard to get from the scene we've set now to why they would want to rebrand, to Meta's launch of Threads and, of course, the cage matches, and whether this could really be the end, or the beginning of the end, for Meta. In the meantime, we'll post a transcript of this episode with references on our website. You can find this and more about us at TheBrightApp.com.

SJ.  Until next time, I'm Steven Jones,

TW.  and I'm Taryn Ward.

SJ.  Thank you for joining us for Breaking the Feed, Social Media: Beyond the Headlines.
