Why did Twitter recently appear to censor Substack content? What was revealed in the open-source algorithms that Twitter recently published? And what does a truly free and open social media look like?
I sit down with Bill Ottman, co-founder and CEO of the Minds social media platform, to discuss a radical new vision for internet privacy and decentralization.
Watch the full interview: https://www.theepochtimes.com/does-elon-musk-want-to-buy-substack-bill-ottman-on-twitter-substack-rift-how-censorship-fuels-violence_5188501.html
FULL TRANSCRIPT
Jan Jekielek: Bill Ottman, such a pleasure to have you on American Thought Leaders.
Bill Ottman: Thanks for having me.
Mr. Jekielek: Bill, you’re the founder of a network, Minds, that is looking to completely decentralize the social media experience. Please tell me about this. Twitter recently open sourced at least part of its algorithm. What’s the significance of that?
Mr. Ottman: Yes, that is a positive step in the right direction. They have published a piece of their recommendation algorithm, and there are a couple of other components of Twitter that they've open sourced. Open sourcing is what we've been pushing for over a decade. For anyone who doesn't know, it means that the software is actually published. It's auditable and can be inspected. Theoretically, you can make sure there's no shadow banning, no bias, and no security issues.
The rationale for companies keeping their software secret is always based on this irrational fear that someone is going to steal your idea or steal your code. To be honest, the likelihood is that no one is going to care about your code. You should be flattered when people take your code, and we are. We want people to take our code, because that spreads our technology. You can actually build a hugely successful business enterprise with open source.
I could name the companies for days. You have MongoDB and WordPress. WordPress powers 30 percent of the websites on the entire internet, and they're fully open source. That is something that is only going to keep expanding. What was formerly Birdwatch, now Community Notes, is also an open source component of Twitter, and honestly, it has been one of their more successful products. It's really interesting to see their fact-checking. You'll even see Biden getting fact-checked, even Elon gets fact-checked on his own posts, and all that infrastructure is out there. It's a big step in the right direction.
Mr. Jekielek: I like how it’s called Community Notes, and not fact-checking. Fact-checking has definitely got a very bad name now because typically, it’s not really fact-checking.
Mr. Ottman: It wasn't when it was being done on Facebook, Twitter, and YouTube, with a handful of think tanks behind the scenes arbitrarily applying labels whenever they felt like it. The way that Community Notes is set up is actually similar to ours. We have a community jury system on Minds for our whole appeals process. It's similar in the sense that it's not limited to just a small handful of organizations. The community is actually participating.
Mr. Jekielek: Elon Musk did release some significant portion of the algorithm, but it's not actually the algorithm, you said. Please explain this to me.
Mr. Ottman: GitHub is where people create and publish software. The GitHub repo is github.com/twitter/the-algorithm. That's what they labeled it in this cute branding way, and everyone was so excited about it. People are analyzing it, commenting, even opening merge requests for changes. That's how GitHub works. Basically, you have a master branch, and then people can fork the code, make a little change, and then request that it gets merged back in with the main branch. It's version control software.
Unfortunately, when Substack links got limited, we realized that nothing about Substack had been updated over in the published source code for the algorithm. We now know that it's not the production code. It's an algorithm, but it's not the algorithm that is running live, from what we can tell, because none of the Substack changes were referenced.
Elon says that the whole purpose of what he’s doing is for maximum public trust, and that you have to be transparent with the algorithm because otherwise society can’t trust this as the town square, which totally makes sense. But again, you need to publish the thing that you are actually running.
Mr. Jekielek: I want to touch on this in a moment, and have you explain what that means in more detail. Also, there is code in there showing that when people are blocked repeatedly by others, the ability for their content to be seen is downgraded. As I was looking through this, there's a whole realm of possible weaponization here. I became aware of block lists. Maybe you can just tell me about this whole thing.
Mr. Ottman: Sure, yes. The more you're blocked and the more you're reported, the worse it is for you, and that can be manipulated by mobs. There are whole block lists that people can subscribe to. Someone could create a bot army just to go and block somebody, and based on that one person's bot army blocking one user, suddenly that person's reach is destroyed. Additionally, the ratio of followers to who you're following matters.
Say you have a hundred thousand followers, but you're following a hundred thousand people. That's bad according to the algorithm. They don't want you to have a high number of people that you're following, because that's bot behavior. We even had our team talking about it. They said, "Should we start unfollowing people so that we can get favored?" It messes with your head.
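To make that concrete, here is a rough, purely hypothetical sketch of how a ranking system could combine those signals. The weights and formulas are invented for illustration and are not from Twitter's published code; the point is how easily raw block counts and follow ratios become manipulable penalties.

```python
# Hypothetical illustration only -- not Twitter's published code.
# It shows how ranking can be skewed when raw block/report counts and
# follower-to-following ratios are treated as negative signals.
def reach_multiplier(followers: int, following: int, blocks: int, reports: int) -> float:
    score = 1.0
    # Following roughly as many accounts as follow you reads as "bot-like."
    if followers > 0 and following / followers >= 1.0:
        score *= 0.5
    # Raw block/report totals as a penalty -- easy for a mob or bot army to inflate.
    score *= 1.0 / (1.0 + 0.01 * (blocks + reports))
    return score

print(reach_multiplier(100_000, 100_000, 0, 0))  # 0.5   -> penalized for the follow ratio alone
print(reach_multiplier(100_000, 500, 2_000, 0))  # ~0.05 -> a block brigade crushes reach
```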
Mr. Jekielek: Right, exactly. There's a behavior modification element. A little while ago, people started asking questions. There became this thing where everyone on Twitter was constantly asking questions, some of which seemed very gratuitous or not particularly necessary. People discovered that asking questions leads to more engagement on their accounts and more followers. Please explain to me what you mean by "if this is not the code," and how the Substack piece fits into it.
Mr. Ottman: There's clearly some sort of blacklist somewhere in the Twitter infrastructure, a whole list of probably legitimate spam URLs that should be punished. But what happened last week is that you could no longer engage with links to Substack. You couldn't share them. The whole API integration between Substack and Twitter was broken, so all the Twitter embeds on Substack articles were broken. There was speculation about what was happening, and Elon alluded to this in one post where he said that Substack was trying to download all of Twitter's data and create their own social network, and so they needed to block them.
Substack is a venture-backed tech company. Andreessen Horowitz, I'm pretty sure, is a major investor. Marc Andreessen, a legendary technologist and investor, also invested in the Twitter acquisition in a big way, I think over a hundred million dollars.
Elon has also alluded to being interested in potentially buying Substack. It's this great place for independent journalists' blogs, and he's been doing all this stuff with the Twitter Files. I'm sure that he and Marc have been having conversations about it anyway, and doing their negotiations. Who knows how much Substack is worth. But the speculation is that Elon blocked Substack as a reality check, to make them understand how reliant they actually are on Twitter and its APIs, in an attempt to potentially negotiate the valuation.
This obviously had a major blowback effect. He essentially censored all of the Twitter Files journalists that he's been working with; Taibbi and Shellenberger both run Substacks, even for their main websites. The Free Press, Bari Weiss's group, also uses Substack for its backend infrastructure. It's actually impressive that Elon has managed to infuriate both mainstream and independent journalists at this point.
Again, I say this with the caveat that obviously Elon has brought Twitter to a much better place than it was pre-acquisition. Ultimately, I’m an Elon fan and I’m rooting for him, but some of these behaviors seem sort of impulsive. I would love to be able to talk with him and help him work through some of the issues.
Mr. Jekielek: You have a pretty specific philosophy. Minds is a pretty small network. Epoch Times, everyone knows, is on Minds. But there’s a whole vision behind it. You’re building something that you believe is very important. I agree, and I think it can be very important. Why don’t you tell me about that?
Mr. Ottman: We have over 6 million users now. Growing a social media network now is going to be much more difficult for us than it was for the big platforms in the early days, because we're not willing to use a lot of the surveillance tactics and what is called growth hacking. Other people call them dark growth hacking mechanisms: reaching into people's contact books and using all of these manipulation measures to grow. We're not willing to do that, which is what they leaned on for their whole growth wave. The whole analytics and customer acquisition data industry is a total mess.
It’s just a data brokering world and people are buying information from each other, and it’s really something that we’ve rejected. Where we are trying to take things is to not have our creators be reliant on us. That’s where these decentralized and open source protocols come in.
One notable feature that we’ve integrated with is called Nostr, which stands for Notes and Other Stuff Transmitted by Relays. It’s a cryptographic relay network where essentially—and this is all happening in the background, I know people’s eyes may glaze over hearing some of those words—what’s happening is that you have a crypto key pair in the background of your account and every time you post or follow somebody, your key is signing those events. What that means is that you and only you control your private key. This is the same with cryptocurrencies as it is with cryptography and encrypted messenger services. There’s a public key and a private key.
That data is stored on this relay network, meaning that you can take your private key that you get on Minds and you can actually leave Minds and go to another app on Nostr. There are dozens of them that are social media interfaces. You can upload your key, your content is there, and your followers are there. So, your identity is yours.
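For readers who want to see roughly what that signing looks like, here is a minimal sketch of a Nostr-style text note, based on my reading of the NIP-01 event format. The schnorr_sign() helper is a placeholder for a real secp256k1 Schnorr signer from a cryptography library; it is the assumed piece here.

```python
import hashlib
import json
import time

def event_id(pubkey_hex: str, created_at: int, kind: int, tags: list, content: str) -> str:
    # NIP-01: the event id is the sha256 of this canonical JSON serialization.
    serialized = json.dumps([0, pubkey_hex, created_at, kind, tags, content],
                            separators=(",", ":"), ensure_ascii=False)
    return hashlib.sha256(serialized.encode()).hexdigest()

def sign_note(private_key_hex: str, pubkey_hex: str, content: str) -> dict:
    created_at, kind, tags = int(time.time()), 1, []  # kind 1 = short text note
    eid = event_id(pubkey_hex, created_at, kind, tags, content)
    return {
        "id": eid,
        "pubkey": pubkey_hex,          # your identity: the public half of the key pair
        "created_at": created_at,
        "kind": kind,
        "tags": tags,
        "content": content,
        # Placeholder: in a real client this is a Schnorr signature over the id,
        # produced with the private key that only the user holds.
        "sig": schnorr_sign(private_key_hex, eid),
    }
```

Because only the holder of the private key can produce a valid signature, any relay or app can verify that a note came from that identity without ever holding the key itself.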
The issue right now is that on YouTube and Twitter, the web 2.0 juggernauts, if you get banned, your stuff is stuck there. Everything that you worked a decade on is ripped from you. The new paradigm is that the creator owns their identity, their content, and their social graph. Snowden is on Nostr now. Dorsey has taken a liking to it. He's repenting for his Twitter days and has become sort of a Bitcoin and decentralization guy. This new ecosystem has millions of users.
The way that we are interfacing with the Nostr protocol is the same way that Twitter could, the same way that Rumble could, or the same way that any centralized app can. Just because you’re running on servers doesn’t mean that you can’t participate in this new decentralized internet. The momentum is totally unstoppable, because at the end of the day, it creates a more resilient network that can’t get taken down.
It is censorship resistant. It actually takes liability away from the platform, because the content is unleashed on the world. It’s like the torrent music sites. Torrents have been around for decades, and we all know what they did to the music industry.
Mr. Jekielek: We're talking about Substack. By doing this limiting of Substack, which some people call the censorship of Substack, Elon inadvertently demonstrated the power of a centralized network to instantly turn off someone's whole business, if Twitter is that important to it. Maybe that's what he wanted to show, but he also reminded everybody else that that power exists, and maybe that's not the best thing.
Mr. Ottman: It certainly exposed both of those facts. He probably was caught up in whatever negotiation they have going on. You can't just play with people's livelihoods like that, in my opinion. That's the same behavior as a decade ago, when Facebook started really restricting its algorithm and caused dozens of businesses to go bankrupt. One day they're getting a thousand likes on every post, and the next day they're getting 10, just because Facebook decided to make some tweaks.
Suddenly, all these people are out of jobs and have lost their businesses. Now granted, does Facebook have the legal right to do that? Probably. But that doesn't mean you should do it just because you can. With this whole Substack fiasco, I have faith that he could probably admit he was wrong.
Mr. Jekielek: For the Epoch Times, this throttling was not academic at all. We spent years developing our Facebook presence, only to have to very aggressively pivot to an independent subscription model. The throttling happened pretty much exactly how you described it, probably by an even greater order of magnitude.
Mr. Ottman: Subscription businesses are absolutely the future. We have Minds Plus on Minds. We enable creators to set up their own memberships. We have that functionality. Being reliant on the surveillance advertising economy is just a bad decision for any site. Advertising is always going to be relevant, but typically it should be thought of as a secondary revenue stream as opposed to a primary one. When it becomes primary, you are just dependent on the whims of corporations that are not your own. That's not really ideal.
Mr. Jekielek: How did you become so interested in creating a completely open source decentralized system for social media?
Mr. Ottman: I just always thought that it was inevitable that an open source system would take over social media. We’ve seen historical precedent for that with other technologies, for instance, with Linux, which is the most famous open source operating system powering the majority of the infrastructure of the world. It used to be that all software was just free, and software licenses didn’t even really exist. It was just that the academics were all sharing software. Then, this whole proprietary software, intellectual property mode of operating came into play with Windows and everything started getting locked down.
We've seen these instances, whether it's Linux or Wikipedia. I'll be the first to say that Wikipedia has many problems with regard to the editing, the administration, and some of the censorship that occurs there. Even on our page, there are these trolls, and I know many others have had similar experiences. However, Wikipedia was revolutionary in what it did. It's fully open source, and all the software and the content is Creative Commons, meaning that it can be shared freely with attribution. Encarta and Encyclopedia Britannica, all of these institutions are gone, because no one uses them.
Wikipedia, again, not necessarily the most reliable, but it changed the world. It is the go-to spot to get the basic facts now, and it beat out all of the proprietary corporations that were trying to dominate the encyclopedia world. You see what’s going on with Bitcoin in relation to the traditional finance world. Bitcoin is a fully open source and transparent system that is just organically happening. It’s happening because it’s open and transparent and it’s accessible for people.
The idea that it would happen in social media makes sense and it is happening. It’s not just us who have taken on this philosophy. With technology systems, it is inevitable that most industries are going to move to open source.
Mr. Jekielek: The reason we became interested in Minds in the first place was that we were seeing all these other systems getting locked down: Twitter, Facebook, and YouTube. We were basically looking to go anywhere there was a philosophy of openness and no censorship. Then, Elon came and created competition again. What is the case for someone to join your platform when there aren't a ton of people engaging with you?
Mr. Ottman: We actually find that small to medium-size creators have an easier time getting engagement on Minds than on Twitter and Facebook. I know you're using the Twitter mirror function because you have a big Twitter following. But I would argue that if you were going hard, full-bore native on Minds, you would probably see different results than you've seen so far.
First of all, we don’t have restrictive algorithms, so it’s much easier to get into the recommendation feed. We also have a whole reward system, where you can earn more reach based on the contributions that you make.
You can earn these tokens, and then, tokens are worth views. You can use the tokens that you earn to boost your posts for more views. This is the inverse of how most other networks are operating, where they’re pushing you down. We’re trying to bring people up and share revenue with creators and give more reach.
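As a rough illustration of that inversion, here is a toy model of the earn-and-boost loop he describes. The exchange rates are made up for illustration; Minds' actual reward formulas aren't shown here.

```python
# Toy model of an engagement-to-tokens-to-views loop. All rates are invented.
TOKENS_PER_1000_ENGAGEMENTS = 1     # assumed earn rate
VIEWS_PER_TOKEN = 1_000             # assumed boost rate

def tokens_earned(engagements: int) -> float:
    return engagements / 1000 * TOKENS_PER_1000_ENGAGEMENTS

def boosted_views(tokens_spent: float) -> int:
    return int(tokens_spent * VIEWS_PER_TOKEN)

earned = tokens_earned(5_000)       # 5.0 tokens earned from 5,000 engagements
print(boosted_views(earned))        # 5,000 extra views bought back with those tokens
```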
But to what you're saying about the dynamic of competition with Twitter, I relate it to what happened when Obama got elected. Everybody was super happy and hopeful, and it created this apathy, where you have a figurehead who looks good and it feels like change and progress, but then it's a little emptier than you would have hoped for. With Elon, I would say there is change happening that is positive.
However, with letting people back on, for instance, he seems to be playing favorites. Not everybody has been let back on Twitter. There are probably thousands, maybe even millions, of accounts belonging to non-high-profile, random people who got banned during Covid just for sharing a study. They're not prioritized in terms of being reinstated. So, we're going to keep holding his feet to the fire.
He is more willing to change than Facebook, Google, and Apple, and he's already proven this. The thing that I love about Elon is his willingness to have the conversation in the open. He's also shattered barriers for how the elite converse with the public. He acts like a normal person. He posts memes, he plays around, and he's a goofball.
That in itself is a total game changer, because traditionally, it’s always been that the billionaires shield their personality. They don’t really act like themselves in public. But he’s changed that, and I absolutely appreciate that. There are major advances in discourse that have happened, and we just have to keep moving in the right direction.
Mr. Jekielek: What you're describing is shattering, and it also extends through to the Twitter Files. Of course, there is a lot of criticism of the Twitter Files, including that it's not nearly everything, or that it's very piecemeal. Nonetheless, there have been some unbelievable realities exposed. For you, this question of censorship is obviously essential.
On the one hand, there is a whole kind of ideology these days that censorship is actually a good thing and necessary, because the content might lead to radicalization and violence. But on the other hand, we’re also seeing the Twitter Files telling us that some really bad decisions are being made. When there’s no light of day, then people can just keep doing that with zero accountability. It’s hard to square all of this.
Mr. Ottman: Yes. Both realities are true to a degree. It is true that disinformation can run wild on social media and cause negative things to occur. It has happened in isolated instances, but you have to compare that with the devastating impact of censorship itself. We wrote a whole paper with a team of PhDs and Daryl Davis on bringing together the peer-reviewed research on this debate and showing both sides.
It’s overwhelmingly true that censorship facilitates greater isolation and greater radicalization. Let’s do a thought experiment. You are John Smith, posting a study about Covid during the pandemic. You’ve been on Twitter for 10 years, and then, you’re gone. How does that make you feel?
Mr. Jekielek: Sad, angry, potentially militant.
Mr. Ottman: Do you suddenly think that the study you posted is fake?
Mr. Jekielek: No, of course not. You’re indignant.
Mr. Ottman: You are indignant. Yes, you're reinforced. There is one example that we get into, and I'm not blaming YouTube for this, but there was one example of a woman they banned. She was a Middle Eastern dance creator of sorts, and she showed up at YouTube headquarters and shot two people.
Antonio Casilli is a European censorship researcher. He has correlated violent protests with censorship. He’s talked about the toothpaste tube effect where he studied this one Tumblr community that was a pro-anorexia community. Odd, but it exists. There are all kinds of strange places.
Tumblr tried to get rid of it because it was really nasty stuff, but people are just going to go somewhere else to talk about it. Just because you censor people doesn't mean that they stop being interested in what they're discussing. In fact, even looking at all the school shooters, it's like isolation is depression; they're essentially synonymous. The psychological impact of censorship is devastating, and we just saw it on a scale that has never been seen before.
Mr. Jekielek: We are seeing it through the Twitter Files and other disclosures in lawsuits like Missouri v. Biden. There is one thing I found really fascinating in the censorship-effect report you wrote, which is very eye-opening. I want to recommend that people take a look at it and its conclusions. The word radicalization has become a pejorative, but in this report you outlined that radicalization isn't inherently positive or negative; it's just something that happens. Please explain this to me.
Mr. Ottman: It obviously can be negative, and it is used colloquially as a pejorative, and for the most part it probably is one. But with a lot of these policies, there's no room for positive radicalization or positive extremism.
Mr. Jekielek: I’m going to jump in. My friend Bob Woodson uses the term radical grace when he talks. Radical grace is when someone forgives. For example, there’s a story in our American Essence magazine recently about a woman who was a lawyer under the Ceaușescu regime in Romania, and was too successful. A hit man was sent to kill her, and she talked him down basically, and he ended up becoming a Christian. She forgave him for what he was doing—radical grace.
Mr. Ottman: Yes, absolutely. We actually mention that kind of thing in the paper: radical forgiveness and radical love. There are instances of mothers whose sons were murdered, and then they see the killer in court and decide at that moment to forgive him, or they go visit him in jail in the years following. The catharsis and healing that can occur through that process is counterintuitive, but it's arguably one of the most effective mechanisms for healing.
That's very much the approach that Daryl Davis has taken in his de-radicalization work. For anyone who doesn't know, he's a famous black musician and race relations expert who has befriended hundreds of KKK members and inspired them to leave. Being kind and human to people is more likely to change their minds than shaming them. From a human perspective, some of this stuff is very obvious. Be nice to people if you want to have any chance at influencing them. Don't shame them for being hateful.
Mr. Jekielek: Basically, radicalization is just feeling very, very strongly about something, correct?
Mr. Ottman: Right. You need to have access to radical ideas in order to stretch yourself, and for humanity to stretch itself, into the lawful fringe areas that the First Amendment protects. That is where you're going to see all types of crazy stuff that is nonsense, but also a decent amount of the most breakthrough ideas. Under this paradigm of social media apps just banning all of that, if it has anything to do with a forbidden topic, it's just gone.
The global impact this is having on our brains is devastating in terms of the chilling effect on how we communicate publicly. Everybody's basically putting on a front where we have to protect ourselves, because you're not going to risk the following that you've developed over a decade, which is directly correlated to your livelihood and your ability to influence the world.
Mr. Jekielek: Martin Kulldorff, the former Harvard professor of epidemiology, has discussed this with me. We've been advised by people from former communist countries, in order not to lose our Twitter or Facebook presence, to apply self-censorship methods: push it to the limit, but don't go over the edge. This is basically what dissidents in those societies would do.
Of course, this is probably good advice in a sense. But on the other hand, it exposes this terrible reality that our system is like that now. Also, it’s not all radical ideas that are censored. Some of them are propped up too.
Mr. Ottman: That’s a double standard.
Mr. Jekielek: Right.
Mr. Ottman: Yes, absolutely. One of the major backlashes against Twitter censorship pre-Elon acquisition was about their deadnaming policy for trans individuals, which is actually still in their terms; they haven't rescinded those terms. They did rescind the Covid misinformation terms, but not the deadnaming rules, which Meghan Murphy got banned under for saying a man is not a woman.
She was famously getting banned for saying that. That’s a double standard, or they might say, “These protests are okay. These are loving protests. Those are hateful protests.” It’s the same double standard of what is allowed. It should just all be allowed and let people curate their own experience.
You would think, from a business perspective, that having maximum access to a customer base on both the Right and the Left would be good for business. You would think that, and I do think it's true. Somehow, this cartel of advertisers has peer-pressured the companies into creating these content policies. I would imagine they would know that half as many potential users is worse than twice as many. But it shows you how strongly the ideology has taken hold. They are willing to sacrifice those customers.
Mr. Jekielek: I’m reminded that you’re actually entering into a lawsuit against this censorship law in California which is begging to be challenged under the First Amendment. Please tell me about it.
Mr. Ottman: AB 587 is a new law in California that went into effect in January, spearheaded by Gavin Newsom and the Attorney General. It essentially mandates that social media companies report, on a biannual basis, all different kinds of analytics about their moderation, and that they have specific content policies about a specific group of topics: disinformation, misinformation, hate speech, radicalization, and extremism. None of these terms are defined, and those are the only ones they are mandating. For instance, they're not mandating child abuse material policies.
They might argue, “That is already against the law,” but it still gives you a sense of where their priorities lie. They’re only mandating specific policies around topics that are lawful under the First Amendment. First of all, of the words that I just mentioned, they don’t define them, so we don’t even know what they mean. But none of those are inherently unlawful.
What is misinformation? If someone posts about some aloe cure for whatever it is, one person is going to call that misinformation, and another is not. Maybe it is, and maybe it isn't. It's not illegal to be wrong. It's not illegal to be stupid. With extremism and radicalization, it's an extreme what? It's a radical what? They are in themselves just qualifiers; they're adjectives.
Mr. Jekielek: As you mentioned in the report, and also as Jacob Siegel mentions in his amazing new piece about the hoax of the century, it originally stems from information warfare techniques designed to deal with the war on terror. That’s what radicalization implies when that term is used.
Mr. Ottman: I want to touch on the Restrict Act, which is sort of the Patriot Act 2.0 and is related to this. But just to close the thread on the lawsuit: with James Lawrence, who leads the effort—he was Alex Berenson's attorney who famously got Alex reinstated on Twitter after they had banned him for Covid content—we are suing the state of California to get this overturned. Tim Pool is joining, the Babylon Bee is also joining, and we're hopeful.
Mr. Jekielek: Please tell me about the Restrict Act.
Mr. Ottman: Everyone's been hearing about TikTok surveillance and that we need to ban TikTok in the U.S. TikTok is a very problematic application that is a surveillance nightmare. It is clearly sending information to China, and that probably legitimately poses a national security threat.
Mr. Jekielek: And a tool of influence, I have to add. It has a massive ability to influence, if you look at Dr. Robert Epstein's work on other social media apps, which I'm familiar with.
Mr. Ottman: Right. That sentiment of people wanting to do something about TikTok is totally well founded, I would say. But with the Restrict Act, Jesse Watters famously confronted Lindsey Graham on this. Lindsey Graham, who was one of the primary advocates in Congress, didn’t even know what was in the act. He didn’t know that it was a censorship nightmare, a piece of legislation that essentially would fine people in the U.S. for using VPNs.
It’s just such a great example of the hijacking of a good cause. More sensible legislation would be an “encrypt act,” where you’re actually protecting citizens, mandating end-to-end encryption throughout the government, and then educating people about the dangers of surveillance applications in the app stores.
Regulation in this realm is so risky that it has to be very sensitively crafted. You could label apps in the app stores that are surveillance-based or that are funneling data to different territories around the world, in the same way that Twitter is labeling media companies now. You could label apps in the app store for what's going on with them.
Are they open source? Are they privacy preserving? You could create little icons. We saw it with organic food. We can use mechanisms like that to educate people and push back on TikTok. But I was joking with a friend that the Restrict Act could be one of the rare pieces of legislation where the Left and the Right come together. Because you’re getting all the TikTok people who are clearly going to be against banning TikTok, and then you’re going to get all the privacy and freedom people who don’t want the Restrict Act. Ironically, you could end up having the Left and the Right both arguing to bring down the Restrict Act.
Mr. Jekielek: It’s a hugely intractable problem with TikTok. I don’t know the answer. You have a totalitarian society taking advantage of a genuinely open society to massively push their agenda. We seem to be unable to deal with the information gathering and the influence operations of TikTok without going against our own values. How do we deal with this?
Mr. Ottman: We need consistent standards, because with all the other apps in the app store, a large majority may have similar surveillance mechanisms in them, and those companies might be U.S. based like Facebook or Google. But we also don’t know what Facebook and Google are doing with their data behind the scenes. Facebook could be selling that data to anybody, and so could Google.
When the TikTok CEO was testifying the other day, he actually said the same thing. He asked, "Can we just have consistency?" I'm not necessarily defending him at all, but it is true that we can't think of TikTok in a vacuum. You can't think that, just because Facebook is based in the U.S., we don't need to address that elephant in the room just as much.
In fact, it could be more nefarious, because we think that they’re in the U.S. and they would never betray our interests. But the likely reality is that they are betraying our interests.
Mr. Jekielek: Let’s talk about that briefly. We’ve seen the film, “The Social Dilemma,” and I know you’ve seen this film. It shows us how deeply these social media networks understand us and how they develop technology that’s addictive, and gets us using it more and more. Of course, TikTok is all this stuff on steroids. The difference is that it’s in the hands of a regime that seeks to subvert America and has zero guardrails of any sort. It’s a regime that has a multi-billion dollar forced organ harvesting industry that’s committing genocide.
A tool like this in the hands of the Chinese military is different from Facebook in my view. I don’t think we can create an equivalency. Legitimately, it’s a special category. Although, I am deeply afraid of where Facebook might be selling its data.
Mr. Ottman: This conversation, wondering about the national security risks of the apps that we store in our pockets, is an opportunity to have that conversation holistically. Yes, TikTok and other authoritarian-sponsored apps are in their own category, but we need to have the whole conversation because all of them are a threat.
Mr. Jekielek: One hundred percent. This is where Minds can give us an alternative. Here's an opportunity for you to tell us how Minds is different. How can you avoid putting your finger on the scale, as Elon did with Substack, and especially as TikTok would?
Mr. Ottman: Sure. We can, and the reason we're integrating with Nostr and other decentralized protocols is that we want to protect ourselves from ourselves. We know what could happen to us down the line. Back in the day, Twitter's slogan was "the free speech wing of the free speech party." They all thought that they were fighting for free speech in their early origins.
By integrating with sovereign, censorship-resistant protocols, you're basically protecting your user base and giving them an escape hatch. You're saying they can take their stuff with them at any point in time. In terms of the power dynamic, that keeps us on a level playing field. When we know that our users could just take their key and go to another app, we're going to respect their freedom, because we know they could do that. The problem with YouTube, Twitter, and Facebook is that they have everybody locked in. You can't move your whole following to another app. It actually takes a decade to do.
I’m hoping that we can get more large networks to adopt these protocols. From a user perspective, they shouldn’t even really feel anything different. It all feels the same. But in your settings, you would just have an option where you can download your key. That means that if Twitter ever messes with me, I can just go somewhere else.
Mr. Jekielek: This can be huge amounts of data. Imagine you're a very active user; you've got video and all sorts of stuff. You have your key, but where does this data even live if you've got however many gigabytes or even terabytes of video?
Mr. Ottman: Right now, there is a torrent integration with Nostr happening, and video and more heavy-duty media are coming. For now, it mostly supports just text and links. That data lives on the relay network, so anybody can run a relay node on their machine. That's where it lives; it lives in this decentralized network. Then, you have your key, which gives you the ability to access and write to that identity.
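As a sketch of what "living on the relay network" means in practice, a client can broadcast the same signed event to several independent relays. The relay URLs below are placeholders, and this is just an illustration of the NIP-01 wire format rather than Minds' own integration code.

```python
import asyncio
import json
import websockets  # pip install websockets

RELAYS = ["wss://relay-one.example", "wss://relay-two.example"]  # placeholder relay URLs

async def publish(event: dict) -> None:
    for url in RELAYS:
        try:
            async with websockets.connect(url) as ws:
                # NIP-01 wire format: a JSON array ["EVENT", <event object>]
                await ws.send(json.dumps(["EVENT", event]))
        except OSError:
            continue  # one dead relay doesn't matter; the others still carry the event

# asyncio.run(publish(signed_event))  # signed_event: an event like the earlier sketch
```

Because every relay holds a copy and your key works with any of them, no single server can take the identity or the content away.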
Mr. Jekielek: This brings up a couple of thoughts. One is that you’re building something step-by-step without some kind of massive funding that’s driving everything behind you. At least that’s the sense I get. It actually depends on user participation. All the people that buy into Minds are the people that are effectively hosting all of this material. That’s very interesting and compelling. Of course, it’s a work in progress.
Mr. Ottman: Yes. Our original funding round was equity crowdfunding, which was made possible through the JOBS Act, whose equity crowdfunding rules took effect in 2016 and essentially enabled non-accredited investors to invest in startups. Historically, in the whole tech industry, you've only been able to invest in startups if you're an accredited investor, which means that you need to have a certain income or net worth. That really created this siloed world where a normal person couldn't invest in tech companies.
Now, they can. We're actually reopening our funding to the public this coming week. That's at Wefunder, wefunder.com/minds. It's really important to us for the community to have its own stake. It's not a fully public company, but the equity crowdfunding shift has made it much more accessible. We need to be owned by our users. There's no world where I wouldn't want that to be the reality. That's really important. Additionally, the code is owned by everybody. Everybody owns their keys, their data, and their content. It's really this sovereign, self-ownership world that we want to move to.
Mr. Jekielek: I’ll just qualify for everyone that I didn’t know that you were opening up your funding. You’re very welcome to mention it. I want to talk more about this broader ideological push to censor, to hide information, and to keep so many things classified or top secret. You recently drew my attention to this New York Times piece about what happened with the Nord Stream 2 pipeline explosion.
The New York Times has a line where they say, “Maybe it’s in the best interest of everyone not to know,” or something in this vein. Wow, really? How could that be true? But this is the mentality that has now emerged in our society. Obviously, you’re dead set against this. What do you make of this?
Mr. Ottman: That's where a lot of my original philosophy comes from: this desire for greater access to information, both from the government and from corporations. There's always an excuse to be more secretive, whether it's about national security or the infrastructural security of the app. They say, "We can't publish our code because that would be a security risk to our users." That's what most of these companies will say, and you can go ahead and make the security-through-obscurity argument.
But time and time again, when you talk to the best cybersecurity researchers in the world, they will all say that open source encryption protocols are the most hardened and battle tested, because you have that public accountability and you’re essentially having all the smartest people in the world audit your code and find the holes.
Look at Bitcoin. Bitcoin is totally open source. It's never been hacked. There is every incentive to hack it, and yet it seems to be one of the most secure cryptographic protocols that we've ever seen. Wikipedia works; it's open source, and it's not like it's getting hacked. Linux works; exploits get found once in a while, but they get fixed and the world doesn't end.
We're trying to push that transparency-by-default mentality. It's really hard when you've spent a lot of time building a business. I remember the day before we first open sourced our code. You are nervous and you think, "Oh gosh, we've put so much time and money into building this, and we're going to put it out there." But what people don't realize is that there are licensing models and software licenses that you can use that protect you but also enable you to be transparent.
The software license that we use is the GNU Affero General Public License version 3, which comes from the Free Software Foundation. It's the same license that Twitter used on their algorithm, AGPL v3.0. What it essentially means is: "Here's the code, you can do whatever you want with it. You can use it and you can commercialize it, but any changes that you make, you have to share with the whole world as well."
You can't just take this code and then create your own proprietary app. You have to pay it forward, essentially, in terms of transparency. Some people call it a copyleft license; the content equivalent would be Creative Commons ShareAlike. That means, "You can take this music that I made and remix it into your own music, but when you share the final product, you have to share it with everybody else." I thought it was interesting that Twitter picked that license for their algorithm.
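In practice, that copyleft grant usually shows up as nothing more than a license notice at the top of each source file. Here is a minimal example using the standard SPDX identifier for AGPLv3; the project name is a placeholder.

```python
# SPDX-License-Identifier: AGPL-3.0-or-later
#
# example-social-app
# Anyone may use, modify, and commercialize this code, including running it
# as a network service, but any modified version they run or distribute must
# be made available under this same license -- the copyleft condition.

def greet() -> str:
    return "source available to everyone; changes shared back with everyone"
```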
There are other models where you can do a time-delayed release: you publish your code so it can be audited, but no commercial use can happen for a period of X years. Basically, you are being transparent, but you're protecting yourself in the business sense. With open source, most people think, "Oh, you're just giving it away." But there's a lot more nuance to it than that.
Mr. Jekielek: There's a proxy that many Chinese users use to access the Epoch Times and a lot of other sites. It's essentially a plugin for a browser. If you're on the China side of the Great Firewall, it lets your browser use the worldwide web freely. Now, the way Freegate works is that it mimics, or pretends to be, other protocols. When I talked with the guy who runs it, he said, "Never in a million years would I make my protocols public." Why? "Because the CCP would know exactly what we're doing, and I can't allow that."
So that's my point, and I think he has a very strong case. For this particular type of case, full transparency probably doesn't work. But you've suggested you don't want radical transparency in all cases. Are there examples where secrecy or classified information is acceptable in your mind? Please explain this to me.
Mr. Ottman: For something like a browser extension, which is what you said is being used?
Mr. Jekielek: It's not a VPN, it's a proxy. But effectively, what it does is allow users free access to the internet in a place where it's massively censored. With the protocol itself, though, if the bad guys knew how it worked, that would negate its use.
Mr. Ottman: Yes. I would need to dig into the specifics, but it would also be possible for traditional content publishers to integrate with Nostr or another kind of relay network or distributed system where the content lives in the decentralized web as well. There’s different strategies.
Mr. Jekielek: This is what I'm saying. I'm not saying that there aren't other ways to skin this cat, and people have tried; this one seems to be the most effective. The point is that I may not want to provide the nuclear codes to the worldwide web. That would probably be an extreme example, correct?
Mr. Ottman: No, it’s an important example. I do think that there are certainly exceptions for absolutely everything being transparent. That’s the paradox between transparency and privacy, because privacy is a cousin to secrecy in a sense. Obviously, we need privacy. There’s no freedom without privacy. If we don’t have the ability to converse privately, then we’re not free.
In the same sense, if we're not free to use applications privately, we're arguably not free. But I don't know; it's possible with that proxy that there could be some open version of it that wouldn't be stoppable. I would need to look into the specifics. Yes, we would push for as many of the systems that society relies on as possible to be as transparent as possible. However, cocktails for biowarfare and nuclear codes should be private.
Mr. Jekielek: Right. I’m just trying to understand where the limits on transparency are.
Mr. Ottman: The thing with this proxy that you're using is that the creator has his opinion. We don't know that it would be immediately taken down if it was published. Maybe if it was published, it would be forked by other developers, and they would create more resilient versions that can't be taken down.
Maybe it’s riskier for it to not be published, because if they find out who that individual is and take him out, then the whole thing is taken out, because he was the only one with access to the code in the first place.
If they can politically attack that person, then that’s a single choke point. Part of the benefit of open source is that it’s unleashed, so it can’t get taken down, because it’s been widely distributed. It’s an interesting debate though.
Mr. Jekielek: Let's go back to this general question. Privacy and secrecy are two sides of the same coin, you suggested. There's room for secrecy, maybe even more than just the nuclear codes and the bioweapon cocktails. Where do you set that line?
Mr. Ottman: Certainly for personal communications, we believe that it should be all end-to-end encrypted by default. We have no desire to have access to people’s private conversations. It’s more of a risk to both us and them to have us be able to access those conversations.
We use an end-to-end encrypted protocol for our messaging. That's one of the scary things about Twitter: they don't have that. They actually have said that encrypted messages are on the roadmap, but that's something that doesn't get spoken about enough.
That’s a Twitter file that I would like to see; the history of DMs being accessed. Think about who you have on there. You have heads of state, you have the corporate leaders of the world. Let’s see the history of Twitter admins accessing those messages. The data is there, and it has been done. We’ve seen screenshots of the Twitter admin console where if a Twitter admin goes to your page, they can just look right at your DMs. Right now, they can do that, and that Twitter file wasn’t shared.
We also need to know the status of these programs with the different government task forces, and of the web portals for them to submit takedown requests to Twitter. We don't know what's happening right now. There's a high likelihood that it is still happening.
What I can say is while running a social media app, we have received requests from the government before, not often, but on a few occasions. We absolutely require a warrant and a subpoena. We have built our infrastructure so that when those requests come in, we’re not even in a position to share it, because if we have access to the DMs, then we have to share the DMs. But if we engineer ourselves out of the driver’s seat, then we can’t. They come in and we say, “You can see what’s public on their page. That’s what we have.”
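That "engineer ourselves out of the driver's seat" idea is essentially what end-to-end encryption gives you. As a minimal sketch of the principle, using the libsodium-based PyNaCl library rather than Minds' actual messaging stack, the server only ever relays ciphertext and has nothing useful to hand over:

```python
from nacl.public import PrivateKey, Box  # pip install pynacl

# Each user generates a key pair; the private half never leaves their device.
alice_secret = PrivateKey.generate()
bob_secret = PrivateKey.generate()

# Alice encrypts to Bob's public key. The platform only ever stores/relays this blob.
ciphertext = Box(alice_secret, bob_secret.public_key).encrypt(b"meet at noon")

# Only Bob, holding his private key, can decrypt it.
plaintext = Box(bob_secret, alice_secret.public_key).decrypt(ciphertext)
assert plaintext == b"meet at noon"
```

If a subpoena arrives, the operator can produce the ciphertext, but without the users' private keys it cannot produce the conversation.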
Mr. Jekielek: The counter-argument, which you’ve heard a million times is, “The terrorists get to use these systems, so the government needs access.”
Mr. Ottman: Yes. According to all of the smartest security researchers in the world, breaking encryption makes everyone less safe. It makes U.S. infrastructure less safe when you build back doors into primary encryption protocols. The government uses Linux, and the government uses Signal. We know this from the Twitter Files.
Signal is one of the most popular open source, encrypted messenger apps. The war rooms between Twitter and the government were happening on Signal. That’s where they were talking. I have friends in the military who use Signal. They are all told to use Signal.
For the NSA and other intelligence agencies to try to break encryption is breaking their own security for conversations that they’re encouraging their own personnel to have. It’s not a good idea.
Mr. Jekielek: Interesting. I hadn’t heard that before. The thing that keeps me up at night is this thing that I call the megaphone. The megaphone is the system of all these different government and civil society and media structures working together to create a perceived consensus around issues, for example, “The vaccine is safe and effective.” That’s the one that perhaps we’ve heard the most often in recent times.
It’s a very, very powerful mechanism, and censorship is a piece of it, but not the whole thing. Jacob mentions this digital leviathan that’s potentially replacing the governance system in the U.S., if it’s not stopped somehow. What do you see happening here, and is there any way to stop it?
Mr. Ottman: It’s really alarming that the information landscape is getting taken over by these agencies. We’re seeing this as Operation Mockingbird 2.0. Historically, Operation Mockingbird was where the media was being used as a mouthpiece for the government. Now, we’re seeing that in the social media age, and citizens are the targets.
This term, domestic terrorist, is the new fetish of the intelligence agencies. They seem more concerned with domestic terrorism from Joe Citizen sharing their thoughts on social media about the government. “The freedom community is now threat number one to our national security.”
Mr. Jekielek: It just means that there is dramatically less focus on all these really serious, legitimate threats that America faces.
Mr. Ottman: Right. It would have been nice in the Twitter Files to see a hundred thousand requests to take down child sexual abuse material, but we don't see that. Maybe it's there. I hope it's there.
Mr. Jekielek: Yes.
Mr. Ottman: Yes.
Mr. Jekielek: What do you think is going to happen here, based on what you’re seeing? You think about this in a very broad way, it seems.
Mr. Ottman: The genie is out of the bottle. They got caught with their pants down. Ultimately, open source decentralized systems are going to take over just as a matter of physics, because they are more resilient and they protect the users more. As they become infused with the monetization incentives that the centralized networks have, and we reach feature parity with Big Tech, there's no reason not to use both. This is one of the positives about the Twitter deal: it used to be Big Tech versus alternative tech. You had us, Rumble, Parler, and these alternative social media apps, and then you had the legacy Silicon Valley world. There was this big split, and there was no in-between.
But now of the top apps, Twitter is kind of in the middle, probably even more over on our side. From a power dynamic and competitive perspective, Zuckerberg and Pichai and all of these CEOs are looking at this and saying, “What are we going to do next?”
“Look at what Twitter did. Twitter open sourced their algorithm. Can you believe it? What are we going to do? Would we ever do that?” It’s in the conversation now. I have been screaming from the rooftops for 12 years about just using the word open source. Now, it’s even being discussed at that level, and it actually seems to be taking hold to a certain degree.
I make the analogy to organic food a lot. Historically, food was just food. No one really knew the difference between GMO, processed, hormone-laden food and locally grown, organic-labeled food. But then this whole movement came out, and now in the grocery store you have a whole labeled organic section, and then you have the conventional section. That's what I want to see in the app stores.
In the app stores, you go in and you have a labeling infrastructure where you can see what apps are doing to you. It is like food. Our phones are extensions of us. They are having a biological impact, and with Neuralink and all of these advances, soon they're going to be in people's eyeballs.
We’re entering that transhuman reality, and so we need to be even more cognizant of nanotechnology, and nanobots in vaccines. Everything is becoming digitized and it’s going to be interfacing with us on a more intimate level, which makes it urgent to understand the nature of the software that we are injecting into ourselves. It is that much more critical.
Mr. Jekielek: One of the things about this transhumanist direction, which at least part of our society seems interested in taking, is that it inadvertently raises fundamental questions of censorship and control. Are you losing autonomy? Are you losing decision making? Is certain information kept from you? I don't know. These are profound questions.
Mr. Ottman: It's super profound. Imagine you become paralyzed or you have some neurodegenerative disease that one of these technologies could help you with. It's a very different conversation when you approach it from a health angle, a severe medical condition angle, versus a supplementary, human-performance angle.
It becomes a lot more personal. You can think about it from the perspective of a disease that could get you asking, "Would I give up some freedom to have this capability restored in my body? Would I risk having my body infected with some nanobot that could shut me down at a moment's notice, but I'm able to regain this function?" That's scary, because that is real. If you take this nanotech, you are risking that, because there are people controlling that software.
The same thing happens when you hop into a Tesla. If someone hacks Tesla, a hundred thousand cars could go off the road at once. There could be a major disaster because Tesla's code is not open. But again, they would make an argument similar to the proxy argument that you made. They're making that security-through-obscurity argument, which is a good debate.
Mr. Jekielek: Gosh, I could keep talking to you forever here. Why don't we just jump to the Minds event that's coming up on Saturday the 15th. We'll be there. You're going to be there. Let people know what they can find there if they happen to be in the Austin area.
Mr. Ottman: Yes, Austin, Texas, April 15th. If you go to festival.minds.com, you can see the whole lineup. It's like the New York City event you came to that we had at the Beacon last year. At Minds Fest, we're trying to have debates and civil dialogue with people from across the political and cultural spectrum. There's a huge appetite for it, even though the major social networks want to shut down that debate. That really is what people want to see. One person who's coming, who is historically liberal, is Steven Bonnell, known as Destiny. He's a great YouTube debater and comes at things from more of an establishment perspective, but he is willing to have the conversation.
Historically, it was liberals who championed free speech at Berkeley. Now it's the conservatives and the Christians who are fighting for free speech; it has all inverted. We just like to dig into that. We're going to be talking about the Twitter Files. We're going to be talking about social media censorship. We're going to be doing lots of live debates with the audience. We're going to be inviting the audience on stage and talking about government secrets, and probably talking about Nord Stream, which you mentioned.
Then, we're doing music afterwards. We've got some blues musicians. Daryl Davis, who I mentioned, is an amazing blues musician and piano player. He played piano with Chuck Berry, the legendary rock musician. Back in our original roots, when I was at UVM in Vermont and Minds first glimmered, we would hold these events called Gathering of the Minds. We would bring together students, student government, and professors, and then have these round tables.
We successfully got the University of Vermont to divest from weapons manufacturers. That was a result of one of the conferences that we had. Not to go too deep, but most universities have stock portfolios; they're invested in all kinds of different companies. That was when I was staunchly in Students Against War and protesting the Iraq war.
But then we blend it with culture, art, and music, because people are exhausted by the culture war. It is exhausting. I know that you probably feel that sometimes. I do. It's fascinating. I love it. But you need to balance it out and just show that people can have a good time together. That's what music does and has done historically.
Unfortunately, we don't have the kind of spirit of culture and counterculture that music from the 60s really had. It doesn't really exist anymore. I'm hoping that we can bring that back a little bit, but in a bit more of a balanced way, because you don't want the music to get overly politicized.
Mr. Jekielek: I’ll just mention this as we finish, that it almost seems like facilitating the dialogue is the radical thing now.
Mr. Ottman: Exactly right.
Mr. Jekielek: I understand exactly what you mean when you say more in a balanced way. Bill Ottman, it’s such a pleasure to have you on the show.
Mr. Ottman: Thanks so much for having me.
Mr. Jekielek: Thank you all for joining Bill Ottman and me on this episode of American Thought Leaders. I’m your host, Jan Jekielek.
To get notifications about new Kash's Corner and American Thought Leaders episodes, please sign up for our newsletter.
-
PRE-ORDER "The Shadow State" DVD:
BUY "The Real Story of January 6" documentary DVD:
https://www.epochtv.shop/product-page/dvd-the-real-story-of-january-6 (promo code "Jan" for 20% off)
-
Follow American Thought Leaders on social media:
Twitter: https://twitter.com/AmThoughtLeader
Truth Social: https://truthsocial.com/@AmThoughtLeader
Gettr: https://gettr.com/user/amthoughtleader
Facebook: https://www.facebook.com/AmericanThoughtLeaders
Gab: https://gab.com/AmThoughtLeader
Telegram: https://t.me/AmThoughtLeader