
Big Tech Immunity Explained: Law Professor Adam Candeub on How to Rein In Section 230 Abuse

“Who is raising children? It’s not really parents, it’s not teachers, it’s not coaches or clergymen. It’s Twitter influencers. They’re the ones that have the ears and souls of our children.”

I sit down with Adam Candeub, professor of law at Michigan State University and a senior fellow at the Center for Renewing America. Candeub served as acting deputy and then acting assistant secretary of the Commerce Department’s National Telecommunications and Information Administration (NTIA) during Trump’s presidency. He was outspoken in his criticism of what he sees as the abuse and expansion of Section 230 of the Communications Decency Act—a federal provision that grants social media companies protection from liability.

“This rather limited protection that sort of mimicked the telephone and the telegraph, which we’ve had for hundreds of years and couldn’t really survive without, has morphed into a protection of the platforms for anything they do,” Candeub says.

Candeub currently advocates for the “common carrier” approach to social media, and is involved in a number of major First Amendment cases, both at the state and federal level, which will likely shape the future of Big Tech’s impact on our society.

“We have given power to these agencies—the gobbledygook alphabet soup of security agencies—that are not really accountable to anybody. And like any other agency, they tend to be co-opted by special interests,” says Candeub.

 

Interview trailer:

 

FULL TRANSCRIPT

Jan Jekielek: Adam Candeub, such a pleasure to have you on American Thought Leaders.

Adam Candeub: A real pleasure. Thank you for having me.

Mr. Jekielek: You’ve been working for years on First Amendment law, teaching it and arguing cases. Right now there are a number of related cases at the Supreme Court, quite significant cases. You’ve also been deeply involved in coming up with this common carrier approach to dealing with Big Tech or Section 230. We’ve heard a lot about Section 230. I’m going to get you to explain a little bit about the details.

Before we go there, I’ll mention another thing. You’ve been involved in coming up with the concepts that have now been implemented in the Utah social media law. These are quite significant and might give people an idea of how you think about law and how you think about these things.

Mr. Candeub: Protecting children on the internet is not something that gets a lot of press or news coverage, but there’s actually an appetite for it. A lot of the states want what I want for the internet, which is an internet based on user control, and for minors that means parental control.

It’s amazing how the internet has smashed so many expectations, to the point that people think it’s fine for their children to essentially form contracts with these social media companies. The companies take their personal information and impose legal obligations on them, all without parental knowledge or consent.

This is something very new, very different, and we’ve accepted it for the last 10 years. A lot of people have been saying this is not the right way to go. I worked with Jean Twenge, a leading social psychologist at San Diego State University, who has worked on demonstrating the emotional and psychological harm that social media has imposed on our children. I’ve worked with Brad Wilcox at the University of Virginia and the Institute for Family Studies, and Clare Morell at the Ethics and Public Policy Center in Washington, DC.

We came up with a report with some suggestions of how the states can actually give more power to parents. Because, let’s face it, parents don’t raise children these days. Screens do. That’s a frightening thing, and that is social disintegration.

Mr. Jekielek: There’s the obvious thing, with pornography being easily accessible to children right now. Most people would be deeply concerned about this.

Mr. Candeub: The transformation of porn, the way people live, and the way young people live are some of the unrecognized shifts in our society. It’s kind of embarrassing and sort of yucky to think about. When I look at my students in their twenties, they’re a lot less active romantically than I remember myself being. This is not just an old guy criticizing the young generation. There’s plenty of data showing that young people are not as romantically involved: they’re marrying less, going on fewer dates, and having less sex.

Pornography does very likely play a role in that, because it provides a substitute and a distraction from romance and the dance of the sexes. If we don’t have people marrying and forming love relationships and having children, it’s the end. We are looking at civilizational collapse.

That’s not just hysterical conservatives saying that. If you look at the data, it’s shocking how few people in their twenties and thirties are getting married and having kids and doing the things that traditionally have allowed people to live happy, flourishing lives.

Mr. Jekielek: Please explain to me how this new social media law works in Utah, and what its implications are.

Mr. Candeub: The social media law does two or three things of significance. It says no social media firm can form any kind of account with a minor without parental consent. That doesn’t mean just click-through consent, where you say, “I hereby assert and affirm that I am of legal age.” There has to be independent third-party verification.

It requires the social media companies to give parents access to minors’ accounts, so parents can see what they’re doing. To me that’s so vital, because who is raising children? It’s not really parents, it’s not teachers, it’s not coaches or clergymen—it is Twitter influencers. They’re the ones that have the ears and souls of our children. Parents should be able to know about that.

Mr. Jekielek: Or TikTok.

Mr. Candeub: Or TikTok. TikTok influencers. Exactly. Which in many respects really means the Communist Party of China. That’s a problem. The State of Utah said, “No, parents should have access.” We see a tremendous amount of mental deterioration among young people in the rates of depression, suicidal thoughts, and visits to the emergency room because of self-harm. Again, this is not coming from a hysterical old conservative; it’s in the data. It’s remarkable that in certain areas and certain high schools, over 50 percent of young women are on some sort of antidepressant or psychopharmacological drug. This is not good.

One of the big debates in the social science community is, “Is it really social media, or is it sleep deprivation?” It could be something simple like that, because we know that social media keeps kids up. What the Utah statute says is, “No social media for kids between 10:30 pm and 6:30 am.” To this day, broadcast television cannot air indecent programming during the hours when children are likely to be watching. It’s just like the curfews that have been imposed and upheld by courts. This is the same thing, but for the internet world.

Mr. Jekielek: As I hear about the curfew, that sounds really restrictive, Adam.

Mr. Candeub: Yes, I guess. But it follows all sorts of rules that we’ve always had. As I said, curfew rules are hundreds of years old and they have been upheld by the courts. Children have fewer First Amendment rights than adults. For instance, indecency regulations still exist for broadcast television. Those few channels on your cable that are actually broadcast channels can’t show nudity and can’t say dirty words between certain hours of the day, and that’s to protect children.

As a society, not so long ago, we were quite comfortable with more aggressive efforts. When you have something like a smartphone, which can be smuggled into a bedroom, it’s very difficult for a parent to control. This is an appropriate time to say, “No, we have to come up with a rule to help everyone.”

Mr. Jekielek: The bottom line is that adult websites would need some kind of dialog requiring third-party verification, which theoretically makes it impossible for kids to get into those sites.

Mr. Candeub: It doesn’t do everything. The limit is on account formation, because of contract law; the state can more easily regulate children’s ability to form a contract. The problem goes back to the Supreme Court’s ruling in Reno v. ACLU, which said, “No, sorry, if you’re just trying to block porn, age verification is too burdensome and we should have filters instead.”

They said that in 2002, and perhaps we’ll revisit this. Other states are moving on legislation to do that. No, the Utah law does not just block porn. But if you want to start an account with a porn site, you can’t do that without your parents’ consent.

Mr. Jekielek: Got it. What does society think about the curfews?

Mr. Candeub: It depends who you ask. The corporatist, libertarian, think tank lobbyist groups in DC think it’s terrible, and that somehow children will wither and die if they can’t be on social media between the hours of 10:30 pm and 6:30 am. That’s a rather esoteric point of view. Most parents would be quite okay with it. What are the kids going to do? What horrible thing is going to happen?

They might read a book, they might talk to their parents, and maybe even they could watch a movie together with their family. We’ll see. Courts will look at that rule and they could be absolutist and say, “No, we can’t have it, this would be terrible.” On the other hand, I hope they’ll be more realistic and say, “Look, we need this for family cohesion, for the health of our children, and for the future of our society.”

Mr. Jekielek: It’s very interesting how deeply you were involved in conceptualizing this, which involves the hand of government putting its finger on the scale. On the other hand, you’re very much a strong free speech advocate. That is reflected in your writing and the cases that you’ve taken up.

Mr. Candeub: Traditionally, we have always allowed the greatest freedom for political discussion, which is essential for our society. But we’ve also recognized that other images and communications aren’t really that great for us. There is a real difference between pornography on the internet and a controversial op-ed or a tweet that people don’t like. I’m for having that distinction.

Mr. Jekielek: There’s a huge discussion right now about the disinformation industrial complex. When you were working in the Trump administration, you were looking at Section 230 reform to deal with perceived overreach by Big Tech platforms. Now, we’re seeing something much bigger that before 2020 many of us didn’t quite imagine. Section 230 is interesting because it provides a lot of freedom, but it also creates huge problems at the same time. There’s some sort of path through the middle. Please explain the picture to me.

Mr. Candeub: Sure. They are two very different but related issues, connected by the power of Big Tech and its entanglement with government. Many agencies and organs of our government, together with the nonprofit sector and academe, are able to create a unified front to project certain views.

Certainly what the Twitter Files have revealed is—I worked in government and had no idea—the degree to which the intelligence establishment and law enforcement are involved in surveilling what Americans do and say.

Something we often find in this debate is the redefining of terms, like “terrorist threat.” We always thought that was something from outside the United States. Legal authority that was generally given only for surveillance of foreign individuals and foreign communications has all of a sudden been used inside the United States.

Add to this the weird proliferation of all these strange nonprofit organizations that work hand in glove with the intelligence community to create warnings and to create concepts. You were talking about disinformation and misinformation; who even heard these terms until about five years ago?

Disinformation, yes, when talking about spies in World War II, but nobody ever talked about misinformation or disinformation on the internet before. That’s new and that has been created recently. That’s created by constellations of people who want to surveil what we say and do. That’s troublesome.

Section 230 fits in a little bit differently. It’s a short statute, and I encourage your viewers to look at it. 230C1 has become the important one, but C2 was the one that Congress was interested in. It should be a relatively simple matter. It provides the big platforms with the same protection that telephone and telegraph companies have, or even common carriers who carry packages.

When a telephone company completes a call between two conspirators who are going to commit a crime or defame someone, the two people on the phone call have legal liability, and the phone company doesn’t. That’s just what Section 230C1 does. It says, “Look, you go after the user. Facebook has no liability.”

What the courts have done, with the help of the huge proliferation of Big Tech money and influence in DC, the nonprofits, and academe, is to say, “No. What C1 covers is anything the platforms do having to do with speech.” Well, that’s like everything.

For instance, a particular case which I find notable and egregious, and which is unfortunately often cited, is Sikhs for Justice. The claim was that one of the platforms was discriminating against the Sikh religious community on the basis of their religion, and the platform got C1 protection for that.

Notice the difference between that and the libel situation. In the libel situation, two people say something defamatory on the phone. You can sue the users, but you can’t sue the phone company. Here, the Sikhs were saying, “Facebook discriminated against us,” and C1 protected Facebook. It has just been growing and growing and growing since then.

Facebook says, “I promise I’m going to carry your postings and I won’t censor them.” Then, if Facebook censors them, too bad, so sad: C1 protects them. With consumer fraud, if a platform makes fraudulent claims in violation of state law, too bad, so sad: that involves the editorial discretion of the companies, so C1 protects them.

This rather limited protection that mimicked the telephone and the telegraph, which we have had for hundreds of years and couldn’t really survive without, has morphed into a protection of the platforms for anything they do. How does this fit into this weird world in which all these nonprofits are working with the government to censor and surveil?

In a way, we’re seeing the platforms work very closely with these groups, largely under the threat of this wonderful, extraordinary legal protection being taken away. That’s what Biden has said; that’s what all these hearings are about. These senators and representatives say, “If you don’t censor more people, we’ll take away 230C1.” What’s a company to do?

I do see these things as related. They happened very suddenly, as you pointed out. It’s complicated and difficult to describe. It’s a perfect storm for inadequate democratic oversight, because this emerged and it’s hard to explain the situation.

Mr. Jekielek: I hadn’t realized, in your view, the original rules have been abused. The original law has been abused.

Mr. Candeub: Without a doubt. Yes.

Mr. Jekielek: But it’s the removal of the abuse of that law which is used as the threat, basically.

Mr. Candeub: Yes, exactly. That’s right. I’ve been involved in some of these cases and it’s very frustrating, because the platforms hire the best lawyers in the country. They are not quite unified, but they work loosely together to make sure the right opinions are put forward. For instance, the case I referenced, Sikhs for Justice, was litigated pro se for most of the case. Essentially, a non-lawyer represented himself against the Death Star of top DC lawyers.

Look what happened to the opinion. It just copied all of this language from the Big Tech lawyers, that became law, and then that’s cited in the next case. It’s unified pressure expanding C1 to an absurd degree. But it’s a mixed bag, because now the platforms rely more and more on this liability protection, and that becomes a bigger stick that the Democrats can hold over them.

Mr. Jekielek: A small company that wants to break into the social media space where there are questions of defamatory opinions or copyright infringement would need protection, otherwise it would be sued out of existence.

Mr. Candeub: As originally conceived, it makes sense. 230C1 makes sense. Facebook shouldn’t be liable for the postings of its users. I don’t want them to be liable for that. However, if they discriminate against people on the basis of their religion or their race or, as the Texas social media law addresses, on the basis of their political views, there should be some repercussions. These platforms don’t communicate anything themselves. Their users use them to communicate. They provide a service, and the service is like any other. It’s like a restaurant or any other public accommodation.

Mr. Jekielek: Your position is that you just want to bring it back to what it originally was. That’s interesting. I hadn’t fully grasped that.

Mr. Candeub: Yes. I’m always the conservative: the way it was is the best. But yes, exactly. For better or worse, a large part of my legal scholarship has been the history of common carrier law and network law, very boring stuff. If we start going into it, no one will be listening, and they’ll quickly switch this off. But that’s been the rule for hundreds of years, and it’s worked quite successfully.

A pattern we see so often with the internet is that regulators and legislators think, “This is so special and new, we have to give extra protection, or the old rules don’t apply.” But there’s nothing new under the sun, and it would be best if we returned to traditional understandings on this.

Mr. Jekielek: This is the perfect opportunity. I want to find out how you came to be studying these uninteresting things, as you describe them, but things of profound importance to our society today. Please tell me about your path. Along the way, you’ve had some very interesting cases that caused you a lot of trouble, actually.

Mr. Candeub: Yes, a little heartache.

Mr. Jekielek: Yes.

Mr. Candeub: I went to UPenn Law School, and afterwards I clerked for J. Clifford Wallace, Chief Judge of the Ninth Circuit. Then, as many people do, I worked as an associate in a law firm. This was in the late nineties, and the Telecommunications Act of 1996, of which Section 230 is a part, had just been passed. The main part of the Telecommunications Act had nothing to do with the Communications Decency Act. It had to do with local telephone competition, and now people will start snoring. But it was this weird, concocted effort to create competition at the local level. Regulatory lawyers were very busy, and I got involved with that.

I worked at the FCC for three years during the height of those issues. Then I thought I’d had enough of DC. There was a telecom bust, and I was very fortunate. I did what I always wanted to do, which was to become a professor. I moved my family out to Michigan State in East Lansing, and I taught there for about 10 years, living a quiet midwestern life.

Just by accident, I got involved with two cases. One was very controversial, with Jared Taylor. His outfit is called American Renaissance. He describes himself as a white advocate, but most of his detractors would call him a white nationalist. He was kicked off Twitter. Also, Meghan Murphy, a Canadian feminist from the Vancouver area, was kicked off because she deadnamed and misgendered a very vocal political opponent of hers.

I really got involved with these cases because of Section 230. I always taught my students that the parameters of Section 230 mirrored the old telegraph and telephone regulations. I always thought that it protected platforms against the libelous or otherwise unlawful statements of their users.

We got into court, and it was like, “That’s not the way they do things in California.” There are all these new rules that essentially say Section 230C1 protects Twitter’s decision that Jared Taylor is not worthy of being on Twitter, that Meghan Murphy is not worthy of being on Twitter, and that Twitter has the editorial discretion to make those decisions, even though it’s not in the statute. That ticked me off.

It just got me really angry, because it’s not the way the system is supposed to work. You have a statute that is very protective of platforms, which does all the good things that we like: allowing small companies to enter and allowing people to express themselves without making the platforms liable for what they say. And then you see it become even bigger through really bad arguments and abuse of the law. So, I started writing a lot about this.

Mr. Jekielek: Just to be clear, you don’t have any particular sympathies to any of these positions.

Mr. Candeub: No. I’m a middle-aged Jewish law professor, I’m not a white nationalist.

Mr. Jekielek: Obviously, but the reason I ask is important in this day and age. Lawyers get criticized for taking cases, even though lawyers are supposed to take the cases of the most egregious people. Lawyers are supposed to take those cases and defend those people vigorously. That’s how our system works.

Mr. Candeub: My representation of Jared Taylor has cost me career-wise. I could never move from Michigan State. If I hadn’t had tenure, it probably would’ve been very bad for me. But Jared followed the rules, and he’s polite. If he goes, all ethno-nationalists go. That means goodbye to Marine Le Pen of France, Giorgia Meloni, the Prime Minister of Italy, and Viktor Orban of Hungary. That is a slippery slope. I thought that it was wrong. He followed the rules, and if he goes, then who’s next?

That’s what happened. They were the beginning. He and Meghan Murphy were the canaries in the coal mine. I knew if the platforms weren’t going to follow the rules with them, then they weren’t going to follow the rules with anybody. It’s the road that led right to President Trump.

It was a belief that they had complete editorial control, regardless of the civil rights laws, regardless of contract, and regardless of consumer fraud law, to say who was on their network. Emboldened by cases like Jared Taylor’s and Meghan Murphy’s, which we lost, the platforms were able to do this. That’s why I did this: I thought it was the beginning of the end. I hate to say I was right, but I was right.

Mr. Jekielek: You hear this very often, especially in conservative debates, that this is a private company. It should be able to do whatever it wants to do.

Mr. Candeub: The telephone company was a private company, it had to provide service to everybody. The telegraph company was a private company, it had to provide service to everybody. Restaurants are private companies, they have to provide service for everybody. Schools of higher education are private institutions, and they can’t discriminate.

What these social media companies do is provide a service, just like the telephone company and just like FedEx. It’s perfectly reasonable and within the bounds of a state’s constitutional authority to say, “Look, you have to serve everybody. If they agree to follow your rules, you have to serve them.”

Mr. Jekielek: The rules can’t be, “Your viewpoint has to be our viewpoint.”

Mr. Candeub: Exactly, you’re right. That’s like the genie that says, “You have three wishes,” and you say, “I want 12,000 wishes.” You can’t play by those rules; you can’t play that game. The platforms have this wonderful, protean identity. On one hand, when they’re getting protection under Section 230C1, they say, “When we moderate content and use our editorial discretion, that’s, to use the statute’s term, ‘speech of another.’ We get protection for that.” That’s the expanded notion of C1 I’m talking about.

However, when they were challenging the Texas social media law, they said, “All the statements on our platform are our own expression.” They are somehow miraculously expressing themselves through the billions and billions of tweets or postings, and that creates a coherent message, or so they claimed. Luckily, Judge Andy Oldham, one of the great rising stars of the federal judiciary said, “That makes no sense. You can’t have both of these.”

Mr. Jekielek: You can’t have both at the same time, which seems kind of obvious, doesn’t it?

Mr. Candeub: It does, but we can talk a little bit about why the courts have been so open to Section 230 expansion. In general, judges like easy ways to dispose of cases, and liability protection does that. At the beginning of a case, they’re like, “Done. Cross it off. Next.” It is also a weird conjunction of ideologies, and we’ll see this when the issue gets up to the court.

A lot of Republicans are very libertarian and open to this idea: these are our wires, we get to do what we want with them, the First Amendment guarantees it. The Dems, the liberal judges, like the censorship it enables. It’s been a perfect storm where judges on both sides of the aisle have unified to expand C1. Only recently have Justice Thomas, in separate statements, and some judges in the lower courts said, “No, this is crazy.”

Mr. Jekielek: Let’s talk about this Texas social media law, which I know you’re a fan of.

Mr. Candeub: Yes, I’m a great fan of that. Kudos to Governor Abbott and Senator Hughes, who really allowed this to happen. It’s a very straightforward law, and it’s short. The disclosure provisions are a little bit longer, but your viewers should just look it up: HB 20, the Texas social media law; it’s on the internet. All it does is say that platforms cannot discriminate on the basis of viewpoint.

That doesn’t mean they can’t get rid of content they don’t like. Facebook can ban nudity, and it can ban four-letter words. But it does mean they can’t ban people on the basis of their viewpoint. If I am an advocate for naturism, they can’t cut me off. But they could-

Mr. Jekielek: Not allow you to post your naturism photos.

Mr. Candeub: Yes, exactly. Precisely. A very unpleasant thought, so let’s move on to the next question.

Mr. Jekielek: All right. It’s pretty simple and straightforward, but being challenged.

Mr. Candeub: Yes, exactly. It was a liberal judge that struck it down in the district court, and it was reversed and upheld by the Fifth Circuit in a really bravura opinion written by Andy Oldham. I thought it was just great. Then cert was sought by NetChoice, and we were all very excited to see what was going to happen earlier this year. Maybe because the Supreme Court had too many hot dishes on its plate, it punted. What the Court said is, “We’re going to send this over to the Solicitor General for its opinion on whether we should take cert.”

Now, for those of us in the business, it was a little rich. After all, the Texas social media law was designed, among other things, to prevent the government from pressuring the social media companies to censor or throw off politically unpopular views. And we’re going to the Biden Solicitor General, given its record on that topic, for its opinion. They’ll probably take six to seven months and issue something in May or June, the court will vote on it and decide cert in the fall, and then we will return, and the Supreme Court will have to sort out this mess.

Mr. Jekielek: Let’s talk about some of the other First Amendment-related cases that the Supreme Court is taking up. Some of them have very profound implications. There’s 303 Creative v. Elenis, and Gonzalez v. Google. Please give me a quick breakdown of what’s going on and why it’s significant.

Mr. Candeub: Sure. 303 Creative v. Elenis is a reprise of the famous Colorado Masterpiece Cakeshop case, involving whether anti-discrimination laws protecting gay people should apply to small artisan creators who make specialized products for individuals. This time it’s not cakes, it’s wedding websites. A gay couple asked a religious web designer to create their wedding website, and she refused. They brought an action under the Colorado law that prohibits discrimination on the basis of sexual preference or identity.

This puts conservatives in a funny spot. On one hand, we’re fine with non-discrimination requirements for the social media companies in Texas. But someone would say, “Conservatives are being hypocritical by saying the wedding website designer should be free on First Amendment grounds not to create the website.”

That’s actually the wrong way to look at it. Non-discrimination requirements of the kind Texas is imposing have been imposed on large network industries that provide a commodified, largely unexpressive service. Nobody thinks that when someone uses the telephone and defames you, that is the telephone company speaking. Furthermore, these services can be provided on an impersonal basis, as opposed to a wedding website, where you sit down with the bride and groom and work with them on a detailed design that expresses their particular preferences.

This bespoke, individualized work that involves the commitment and intellectual effort of one individual is a very different situation. The court can reasonably distinguish it and say, “No, we can’t have anti-discrimination laws here that implicate not just an individual’s creative endeavors and individual work, but also their religious affirmations.” The court could simply call this First Amendment freedom of religious expression and not even get into actual speech issues.

It’s different with discrimination online, where if the court says that the Texas social media law is unlawful, all of our public accommodation laws will be suspect. The lunch counter will say, “I’m going to express myself by having an all-white lunch counter or an all-black lunch counter.” The airline will be able to express itself by carrying only people of a certain religion, in the same way that the platform expresses itself by carrying only people of a certain viewpoint. If the court tries to make it consistent in that way, it’ll open up a Pandora’s box. It’s an interesting issue, and the court has to draw the right line between them.

Mr. Jekielek: A significant part of the distinction you draw is one of scale. One idea at a small scale can become a nightmare at a large scale, even though it’s a perfectly good idea at a small scale. That’s a topic for another day. How about Gonzalez v. Google, as we finish up?

Mr. Candeub: This is an interesting case. The Gonzalez plaintiffs represent numerous individuals whose family members were killed in terrorist incidents. Their theory is that YouTube’s targeted recommendations radicalized the terrorists to commit their deeds. There is a large causation issue here, and the court was very uncomfortable with it, not only in this case but in the companion case. The case could go away solely on that issue.

But the case does present a very interesting Section 230C1 issue, one that goes back to what we were talking about earlier. When you go to YouTube, you see these recommendations on the side after you watch a video. The question is, are those recommendations the speech of YouTube or the speech of the users? YouTube says, “They’re not our speech. They’re the speech of the users, and we’re just the conveyor.”

Mr. Jekielek: Because the users made them.

Mr. Candeub: The users made them. But the Gonzalez plaintiffs say, “You created the algorithm that selected them, so they’re yours.” The way that works under C1 is this: C1 protects the platform if the statements are those of its users. The telephone company has no liability for user statements that are libelous. Similarly, YouTube has no liability for the videos its users upload. But for the speech of the platform itself, it would have liability.

That’s the way C1 works. It was a very unusual argument. The justices were very confused, and you sensed discomfort. I’m hopeful that they will punt this issue and just say, “We have problems with the underlying claim,” at least from my selfish perspective. My fear is that they might, in fact, take this opportunity to cement the sort of expansive Section 230C1 protections that we spoke about earlier.

Mr. Jekielek: There’s a significant case to be made that it’s YouTube that decided what would go there.

Mr. Candeub: Technology really has changed our expectations. Yes, they created the platform, and yet there’s a tremendous resistance to saying they have to be liable for it. We treat them in this special little world. If this had been a newspaper that had delivered stories saying, “Look at this great terrorism. Want to learn more about all these great terrorist activities?” of course we would’ve held it liable. But there’s something about these online tools that makes people feel differently, and I haven’t quite figured it out.

Mr. Jekielek: Fascinating. How do we deal legally with this disinformation industrial complex that has developed? It seems not to fit neatly into any existing rules. That’s just my gut feeling, but why don’t you tell me?

Mr. Candeub: You’re right. It’s largely because it’s a product of the fourth branch of government, the administrative state. We have given power to these agencies, the gobbledygook alphabet soup of security agencies, that are not really accountable to anybody. Like any other agency, they tend to be co-opted by special interests. That’s a very dangerous brew, and it’s hard to prove. It’s hard to bring it into the sunlight, and it’s hard to bring accountability. It’s going to be a real challenge, and it’s one of the difficulties of our time.

Mr. Jekielek: Adam Candeub, such a pleasure to have you on.

Mr. Candeub: Thank you for having me, Jan.

Mr. Jekielek: Thank you all for joining Adam Candeub and me on this episode of American Thought Leaders. I’m your host, Jan Jekielek.

This interview was edited for clarity and brevity.


