
How to Resist the Censorship Apparatus: Insider Andrew Lowenthal

"Do you think it's a problem when major media corporations, Big Tech, academics, and NGOs get together in advance of a story in order to essentially create their own misinformation campaign ... not only that—somehow there's a connection to the highest reaches of the Intelligence Community, all the way up to Secretary of State [Antony] Blinken."


In this episode, I sit down with Andrew Lowenthal, director of liber-net, an online platform promoting civil liberties and countering digital authoritarianism.


"The civil society organizations who are receiving many millions, hundreds of millions of dollars to police disinformation need to switch from policing disinformation street crime to disinformation corporate crime," says Mr. Lowenthal.


For over a decade, Mr. Lowenthal worked to foster online free expression in Southeast, East, and South Asia and to combat disinformation from state actors. He witnessed first-hand how the very organizations in the West that were once suspicious of government surveillance infrastructure suddenly signed up to be part of it.


Watch the clip:



"This is the new weird world that we live in. Essentially, this kind-of neocon war on terror, [with a] massive amount of surveillance infrastructure that was built up, is then kind-of turned into this 'countering populism' infrastructure that the left, which previously were the people who were most suspicious of the war on terror and the various agencies that became extremely bloated under that banner, have now signed up to be to be part of," says Mr. Lowenthal.



🔴 WATCH the full episode (41 minutes) on Epoch Times: https://ept.ms/S1209AndrewLowenthal

FULL TRANSCRIPT


Jan Jekielek: Andrew Lowenthal, so good to have you on American Thought Leaders.


Andrew Lowenthal: Lovely to be here.


Mr. Jekielek: Let's talk about the Westminster Declaration. Let's talk about free speech and the need for it in the first place. For full disclosure, we are both signatories. You contacted me about it some weeks back, and I'm very proud to be part of it. What is this all about?


Mr. Lowenthal: The Westminster Declaration came out of a meeting that was called by Matt Taibbi, Michael Shellenberger, and myself, that occurred in London in June 2023. Essentially, it was a gathering of people from both the Left and the Right, however much those categories still make any sense, who retained a commitment to free speech and free expression, and were particularly concerned about the way in which anti-disinformation was being used as a cudgel for censorship.


That's the main thrust of the declaration. It says that so much of this work is illegitimate and should be pushed back against, and that we need free speech and free expression if we're going to maintain a free society.


Mr. Jekielek: Malinformation is one of these words that is particularly pernicious. Basically, it's something that is true, but goes against the prevailing narrative.


Mr. Lowenthal: It's information that is used in ways that someone else doesn't like.


Mr. Jekielek: It is something that's officially prohibited in some places.


Mr. Lowenthal: Yes, in terms of the full definition. At the same time, disinformation and misinformation have been very corrupted as terms. They have the same function as in the Virality Project, this so-called vaccine disinformation initiative run out of the Stanford Internet Observatory, which claimed that true stories of vaccine side effects should be considered misinformation. This was the advice they were giving to their seven Big Tech partners. Even the other words are being corrupted, and the whole conceptual framework is moot in many respects.


Mr. Jekielek: You have been interested in the topic of disinformation for quite some time and you have recognized its ill effects. I was looking at communist China, and you were looking at other countries. Please tell us about how you came to be focused on issues of free speech.


Mr. Lowenthal: In 2005, I founded an NGO [non-governmental organization] that predominantly worked in Southeast Asia, East Asia, and South Asia on issues of online free expression, online privacy, and other digital civil liberties issues. Because I'm from Australia, we are very focused on building bridges between Australia and Asia, and focusing on what was important around the new forms of human rights online that were emerging in the early 2000s as internet use was growing.


We were working in Indonesia. I lived in Indonesia for five years and in Thailand for almost four years. I ran that NGO for almost 18 years. We were dealing with countries that had serious levels of repression of free speech, oftentimes dictatorships like Myanmar or Thailand. Everyone that we were working with would have identified as liberals and progressives interested in human rights, democracy, and free speech. There was not really ever a question around that.


At some point, the question of disinformation began to pop up. It was real in the sense that there really were state actors like in the Philippines who were building online troll armies to distribute misinformation and disinformation online, trying to manipulate elections, decision-making, and public opinion. That was real, and is still real, and needs addressing. There were pernicious laws in Malaysia and Singapore where they would use fake news as a way, just as misinformation and disinformation are used now, to squash dissent.


The progressive activists we were working with in the NGOs knew that. They were campaigning against these fake news laws. Then I saw people in the West start getting into this new pro-content-moderation stance. No one in the Philippines or Thailand thought it was a good idea to start saying, "You need to trust the government more to manage content on the internet."


There was this schism that started developing at the same time, because that NGO space is predominantly populated with people from the middle and upper middle class. They took on far too many of the ideas that the West was imparting to them, because the money was coming from the West and the big foundations that they were reliant on.


Mr. Jekielek: From the outside, your nonprofit almost appears to be like many of these other nonprofits that exist right now in the name of making sure that good, accurate information is transmitted, but that are actually playing a kind of censorship role.


Mr. Lowenthal: There are actually two things. My first organization was called EngageMedia; then I left and started this new entity called liber-net. With my previous organization, which still exists today, there were anti-disinformation initiatives we participated in that still stand up. They were addressing troll armies in the Philippines that were clearly state-based manipulation of the public.


Mr. Jekielek: That is similar to the infamous 50 Cent Army in communist China.


Mr. Lowenthal: These things are real, but they totally overreached. They overreached partly because the culture changed, and people became more pro-censorship, and also because the philanthropy changed and started directing people more in this direction.


Mr. Jekielek: At one point, you saw organizations that had been genuinely trying to foster free speech and expression become their own worst enemy.


Mr. Lowenthal: They would definitely not see it that way. It depends who I'm holding in my mind and thinking of as an example. If I'm thinking of the local groups I worked with in Indonesia, they would've thought, “No, we are combating state and corporate disinformation.”


It has been much more in the West where the shift has occurred conceptually, where the policing of disinformation, or the transparency and advocacy around it, switched from targeting the most powerful actors to essentially targeting everyday people. That's linked to the massive amount of infrastructure that was built up during the war on terror, which was about countering violent extremism, and which then got hijacked to be about countering populism and essentially got directed towards everyday people.


That's why in Australia, for the Twitter Files that I did there, the Department of Home Affairs and its counterterrorism unit was dealing with an account with only 20 followers. What is that going to do? Why does it need to be doing that? It's completely off mission, at least based on the kind of outline that it gives to the general public.


But internally, there was a massive cultural shift where this became acceptable and was driven by progressives, but started by neocons. This is the new weird world that we live in where this neocon war on terror with massive amounts of surveillance infrastructure is then turned into this countering-populism infrastructure. The Left, the people who were most suspicious of the war on terror, and the various agencies that became extremely bloated under that banner have now signed up to be part of it.


Mr. Jekielek: Is it just about the money? You mentioned that the philanthropy groups changed their focus, saying, "We need you to do a little more of this instead. There's a few million dollars in there for you." That can mean a lot to a small nonprofit, whatever it is that they are working on. I'm wondering how much that created this effect, wherever it was.


Mr. Lowenthal: It's not so much them saying, “We need you to do a little more of this instead.” You are presented with, "This is the new strategic plan of this foundation. We're working on climate change, anti-disinformation, and women's equality.” Then you get to choose which grant you apply for. If they were just freely handing out money to organizations, they would be doing something very, very different.


But all philanthropy comes with strings attached. A lot more strings got put in place post-Trump and post-Brexit. The philanthropic sector freaked out and thought, "We can't be so hands-off when we give our money out." Obviously, there were always a certain amount of strings, but they decided they had to be much more focused on how the money they gave out was used. It became very targeted towards their specific objectives.


Part of the panic was them thinking, "When we said free speech, we didn't mean that. We meant this." There was a lot of big ideology around the internet at the beginning that was extremely utopian, that essentially imagined what the internet looked like from the perspective of people from Harvard, Yale, and the West Coast, which doesn’t represent most people at all.


Then the plebs got on the internet, and they all freaked out and said, "Oh my God, we can't have this. This is not what we meant. We assumed there would be this civilizing process." I don't know why they thought a civilizing process would come along with getting access to information.


Then you saw a switch to, "There's an infodemic, it's dangerous, and we need better control.” There were problems, that was absolutely true. A certain number of people were concerned, and for good reasons. But then a certain number of people were saying, "We're losing political control here. This is very, very dangerous, and we need to do something."


But then this all becomes mixed up. When you look at that diagram of the censorship industrial complex, not every single organization, every single fact-checker, and every single university institute is working for the man. There is a total mix.


That's why it works in some ways, because there are people who are in there doing work that very directly contradicts the actual aims of what anti-disinformation should be. For example, there is the Virality Project, which calls true stories misinformation. Then there are legitimate groups that might be a local organization in the Philippines who are combating these troll armies that are created by political parties or the government. But it's all going into one blob.


Teasing it out is quite difficult to do. Who is a legitimate actor, and who is an illegitimate actor? Sometimes they might be swapping back and forth on the same day and have some staff that are doing quite legitimate work and other staff who are doing very politicized work.


Mr. Jekielek: I'm very familiar with the German Marshall Fund, which still exists in the U.S. They had a program doing this kind of stuff, but they also had a whole bunch of programs that are very useful.


Mr. Lowenthal: There needs to be a legitimate basis for what these organizations are doing. They can't call themselves the Global Censorship Project. They need to be tasked with a mission that ordinary, well-intentioned people can get on board with, like stopping the next Christchurch massacre. That is a well-intentioned and good idea.


Mr. Jekielek: Everyone would be on board with that.


Mr. Lowenthal: Who would not want to do that? But the problem is that it gets chipped away and chipped away, and then at some point starts accelerating dramatically. I would say it was around Covid. Some people might say it was Russiagate or before, but certainly around Covid there was… an uptick would be an absolute understatement.


The sector that I was part of is dedicated to online free expression and gets hundreds of millions of dollars across hundreds or thousands of different NGOs. These are not just organizations looking at anti-disinformation; they are organizations tasked with protecting free expression online. That is their remit.


I can count on one hand the number of organizations from that sector that put their hands up during Covid and said, "This is censoring the idea that there might have been a lab leak, or questions about how effective masking is, or whether the vaccines stop transmission." Almost none of them touched any of it, but we did.


At EngageMedia, we did something modest, because it was very difficult to even get my staff on board with what was going on. That has probably changed now, but back then this was a collapse. When we talk about a collapse of institutions, there was this massive sector, which is still there, that was meant to be looking out for free expression online and that failed.


Mr. Jekielek: Actually, it was worse than failing.


Mr. Lowenthal: It was worse than failing. Lots of them became part of the apparatus of censorship. My connection there was that I could see during Covid that something wasn't quite right. I had been doing work around media censorship and propaganda for more than 25 years. I probably read Manufacturing Consent when I was 19 years old. Something was just a bit weird. This was not Iraq WMD [weapons of mass destruction]-type public opinion management. This was something new, and it was something online. There were different things going on that were much more difficult to understand.


I had this sense that something wasn't quite right, so I also knew when to look. When I came on board to work with Matt Taibbi on the Twitter Files, the reason that he brought me on was essentially because I personally knew a lot of the people that were leaders in this so-called anti-disinformation field. I knew who was connected to who, all the different funders, and where the meetings happened.


But at the same time, my mind was blown. It was mind-blowing to find them so ascendant at the center of this new system of public opinion management, to see the connections with the intelligence community, and to see a so-called anti-misinformation organization like First Draft, which actually co-coined the disinformation framework, participating in the Aspen Institute's Hunter Biden laptop tabletop exercise two months before anyone apart from the Trump campaign team and the FBI knew about the laptop. Somehow, the Aspen Institute managed to put together this exercise with the New York Times, Rolling Stone, CNN, the Washington Post, Twitter, and Facebook.


Mr. Jekielek: We weren't invited for some reason.


Mr. Lowenthal: You were not. I saw the email and you were not on there. Next time.


Mr. Jekielek: This particular instance is telling, because early on it was obvious that this was real. For years, there were incredible amounts of disinformation repeatedly fed into the system. Does everybody now understand that it's real?


Mr. Lowenthal: Yes and no. For example, a couple of months ago, I went to the 25th anniversary event of the Berkman Klein Center for Internet and Society at Harvard University. I was a fellow there for a couple of years, and I met people who thought that it was just a crazy conspiracy theory of their mom who had gone a bit QAnon. That's what they thought.


Mr. Jekielek: They still believe that.


Mr. Lowenthal: These are very intelligent people who have affiliations to Harvard University. It's probably not that surprising for many people. But other people were clued in and were actually quite encouraging in other regards. But everyone was very quiet and not willing to speak up, which is also part of where we are in these times.


A lot of people in the liberal space haven't received the memo. There are a lot of people who still think it's a conspiracy theory, so there's a lot of work to be done. For others, the threat justifies the suppression. I would talk to people who said, in a Sam Harris kind of way, “I'm okay with that because Trump was such an existential threat, and therefore it's justified.”


But most people can see that the Aspen Institute's tabletop exercise was extremely egregious. You can ask them, "How did they know there was a laptop to do this? Do you think it's a problem when major media corporations, Big Tech, academics, and NGOs get together in advance of a story in order to essentially create their own misinformation campaign?”


Not only that, somehow there's a connection to the highest reaches of the intelligence community all the way up to Secretary of State Blinken, and to this letter of the 50-plus former intelligence officers who said, "Classic earmarks of a Russian disinformation operation." You have a top to bottom situation, and what do you want to call it?


Mr. Jekielek: Coordination.


Mr. Lowenthal: Coordination is probably the politest word. At the same time, someone like Antony Blinken will be platformed at major events in this online human rights sector. There's an event called RightsCon, which I once co-organized with the NGO that runs it, that platformed Antony Blinken and Nobel Prize winner Maria Ressa in a session on disinformation at what would be the world's most important conference about human rights on the internet. Now, that tells me something is very, very rotten in Denmark.


Mr. Jekielek: But why?


Mr. Lowenthal: There has been a lot of capture. The money that has flowed into the sector has steered it in particular ways. This sector was the kind of place where Edward Snowden was celebrated. In that sector where I was, everyone thought Edward Snowden was great. He was defending freedom. He was trying to stop mass surveillance and push back against the administrative state. This was great.


Then someone got the memo and said, "This can't be the culture of the internet freedom community. We've got to tailor and shift it somehow." That came from philanthropy, and it came from powerful state actors as well.


The United States government gives a lot of money directly to internet freedom work, which comes out of this Clinton doctrine around internet freedom. Around 2010, Hillary Clinton gave quite a good speech about freedom online and said something to the effect of, "Just because someone's feelings might be hurt, or there are difficulties and challenges with free expression, it doesn't mean we should give up these American values. This is the reason we're fighting the war on terror.”


That is just an absolute contrast with the kind of people who are now the mascots of this work, like Jacinda Ardern, who says, "We will continue to be your single source of truth." There are basically inorganic and organic elements to that. You can see in the Twitter Files and all through this Aspen Institute tabletop there is coordination between state actors, the nonprofit sector, and academia. Do we need to know exactly what the mix of inorganic and organic is? It probably would be helpful to a degree, but we can clearly state that it's there and that it's happening.


Mr. Jekielek: Whatever mechanisms you create, someone's going to take advantage of those, and sometimes it will be the person you align with, and other times it will not. Why don't we create things with resilient structures that can overcome that? That idea is very central to the Westminster Declaration.


Mr. Lowenthal: Yes, absolutely. At the same time, I wouldn't say it's a free speech absolutist declaration, either. That's not what you're saying, but some people might interpret it that way. It looks to both Article 19 of the UN's Universal Declaration of Human Rights and the First Amendment, because those two things appeal to different people, and they're both useful in terms of what we can draw on.


Where the boundary sits is always the conversation. The forever conversation around free speech is, “Where is the line?” The internet has changed that because it has changed the way we express ourselves, and it can be weaponized. But just as free speech can be weaponized, censorship, content moderation, and disinformation can also be weaponized.


We need to work out quite quickly what the new configuration is, because clearly what's happening now is not working. It was a disaster during Covid. We would have gotten much better results from our leaders, and we would have held these huge mega-corporations to account much more easily and quickly, had our free speech rights been intact.


If you want to talk about safety, it literally was a question of safety. People engaged: they took jabs, they lost businesses, or they suffered from mental health issues, and it was all about their safety and well-being. The behaviors that were proposed or mandated to them, which they had to adhere to, oftentimes put them in danger and didn't keep them safe.


Free speech would have actually protected us from these depredations of the state and corporate power. There needs to be a new playing field created that fosters massive amounts of extremely rapid innovation in the online world that will meet the needs of the market for free speech.


Mr. Jekielek: One of our partners, Gan Jing World, is trying to create a clean system, which fosters freedom of speech, but no pornography. There are very specific limits outlined to encourage people to participate. The behemoths, as you called them, have become very big. There's a lot of antitrust legislation that the U.S. has ignored for a long time. Some of these behemoths are larger than certain countries at the moment.


Mr. Lowenthal: Without major political change, it's not possible to break them up. There's a cozy relationship and there's this agreement where, "We won't break you up and overregulate you, but you've got to do this for us." Essentially, it’s a quasi-monopoly situation that is working for big government and Big Tech. Someone has to get in there, be an actual disruptor, and create space for 1,000 different platforms.


Mr. Jekielek: It's not really just a censorship industrial complex. The term I like to use is the perceived consensus manufacturing complex.


Mr. Lowenthal: Yes. Because it's not only about censorship, it's also about putting bad information, malinformation, and disinformation into the system. It's about downranking information that is unhelpful to them. Walter Kirn uses this amazing metaphor about making a mixtape and pushing a bit of the treble up, and bringing the bass down, and essentially mixing reality.


Mr. Jekielek: Imagine having those dials in your hands. That is an unbelievable amount of power that you could wield without people noticing, in many cases. Zach Vorhies, one of the Google whistleblowers, came out with documents where they talked about machine learning fairness. For example, if you searched for American inventors on Google, you would get a disproportionate number of female and black inventors. The reason was that, since a lot of the greatest inventors in the past were white men, the ideological injection here was to be fair to everyone. As a concept, we're talking about this ability to dial up certain things, but also to inject entirely new visions of reality.


Mr. Lowenthal: Absolutely. Every day I'm constantly surprised that people never imagine that this might be used by people for purposes that they're against. Maybe a Trump manager in that space might say, "Let's dial down the black, the Latino, and the women. White men have been unfairly treated this last decade. Let's rebalance it again by overcompensating and therefore excluding certain people." This is a bad system all round. You don't want that. You always want some level of interpretation.


But what was so great about the way that it worked earlier on is that it just expressed what the people online were recognizing. Maybe there was some bias there, but you didn't manipulate it by changing the algorithm. You go and do a documentation project on black female inventors, and you build a website about them.


If people find that interesting, and follow it, and link to it, then it will come up higher in the search results. But you don't change the recipe and say, "We're just going to dial up and down whatever we think appeals to us at this particular time, for what this particular group of people thinks is important." Because it can go the other way, and we're seeing that now.


So again, the shoe is going to be on the other foot. That is why we had free speech in the first place, to recognize we're not building a system for one team or the other. We're building a system for the umpire. We're building a system of rules so that we have a game that we all agree on, and all agree how we're going to play it. We don't keep trying to put a blue or a red jumper on the umpire.


Mr. Jekielek: The idea is to let the best ideas rise to the top.


Mr. Lowenthal: Yes. We'll have a system that is going to enable the best ideas to rise to the top. It is one that does not manipulate, and essentially leaves it to people's personal initiative to advocate for things that are important for them. People used to be very excited about the wisdom of crowds before the crowds became mobs.


Mr. Jekielek: The Westminster Declaration has been published for a while, and there has been some interest. What do you hope to accomplish? What do you think the value is here?


Mr. Lowenthal: It's a tool, or kind of a banner or standard, that people can look to and say, "This is how we think we should approach this.” By having a large number of very prominent people signed up, it gains recognition and legitimacy. There have obviously been other letters in the past, like the Harper's Letter, but this is quite different. The Harper's Letter was more, "Hey, everyone, be nicer to each other and let each other speak."


Whereas, this is much more specific about the problem of misinformation and disinformation or the overuse of hate speech, and how the technology itself is impacting free speech. It's much more a declaration that's linked to these times around Covid and what happened with platform manipulation and censorship.


I hope that it will be a tool for people to say, "This is what we want. This is the standard that we want to see happen. This is what platforms and politicians should be adhering to." It was very deliberately, as much as possible, balanced between liberals and conservatives so that we could get back to this idea of, "We need a shared set of rules about how we have discussions, because whatever is going on now is not working. We need to get away from systems that are easy to hijack, no matter who is doing the hijacking."


Mr. Jekielek: Powerful disinformation, so-called inauthentic activity or botnets, can influence whole populations and is being used by various state actors. Disinformation has become a bad word because it's used as a euphemism for speech someone doesn't like. But there are actual bad actors, some funded to the tune of billions of dollars, that are pushing actual disinformation.


Mr. Lowenthal: The civil society organizations that are receiving hundreds of millions of dollars to police disinformation need to switch from policing disinformation street crime to disinformation corporate crime. They have essentially focused on these small actors, like crazy Joe in city X, who says, "Take this snake venom potion and you will cure your Covid." But they aren't looking at what is in Pfizer's potion. They're not focused on that.


One of the key things that needs to be changed is the orientation, which should have always been the orientation of civil society actors, which is to focus on the people with the most power. It seems entirely obvious, but they switched from taking power to account to actually working for power. To switch that back immediately would go a very, very long way.


But they also need to understand that it's very complicated. A great deal of them think their side is 100 percent right and the other side is 100 percent wrong. Their fear is around censorship that comes from the Right, and there is a certain amount of it. There needs to be a shift in who people understand to be the most powerful—the elites across the board, rather than those on either the Right or the Left.


Mr. Jekielek: Andrew Lowenthal, it’s such a pleasure to have you on the show.


Mr. Lowenthal: It was great to be here. Thank you, Jan.


Mr. Jekielek: Thank you all for joining Andrew Lowenthal and me on this episode of American Thought Leaders. I'm your host, Jan Jekielek.


🔴 WATCH the full episode (41 minutes) on Epoch Times: https://ept.ms/S1209AndrewLowenthal

 
