April 1, 2024

Alex Fink | The Hidden Consequences of Misinformation: How It Shapes Our Reality

Discover how Alex Fink is revolutionizing our approach to digital information with AI-enhanced solutions. Learn to navigate the media landscape more effectively.

In this episode, Srini Rao interviews Alex Fink, the founder of OtherWeb, about the challenges of navigating the media landscape and the importance of consuming reliable information. Alex shares his personal experiences growing up in the Soviet Union and Israel, where access to accurate information was limited. He discusses the impact of censorship on collective reality and the social consequences of living in an environment of fear. The conversation also delves into the current state of the media industry, the role of technology in shaping information consumption, and the need for a new system to filter and reward high-quality content.

Subscribe for ad-free interviews and bonus episodes https://plus.acast.com/s/the-unmistakable-creative-podcast.

 


Hosted on Acast. See acast.com/privacy for more information.

Transcript
Srini:
 
Alex, welcome to the Unmistakable Creative. Thanks so much for taking the time to join us. Yeah, it is my pleasure to have you here. So I found out about you by way of somebody who had pitched me multiple guests and I saw what you did. And when I read about the startup that you're the founder of, I thought to myself, hell yes, this is definitely something I want to talk about.
 
Alex Fink:
 
Thank you so much.
 
Srini:
 
given that I have an audience full of people who consume a shitload of media. But before we get into all of that, I wanted to start by asking you: where in the world were you born and raised, and what impact did that end up having on what you've ended up doing with your life and your career?
 
Alex Fink:
 
So I was born in the Soviet Union and we moved to Israel when I was six. I grew up in Israel. Then I lived in Japan for a bit after college, then California. Now I'm in Texas. Now the Soviet Union, I'm sure had a big impact on why I'm so interested in what I'm working on right now, because it was an environment where a hundred percent of the information around you is essentially fake. It's curated by the government. That's what people want you to believe. And I have these vivid memories of my parents.
 
trying to get some information, typically by locking themselves in the bathroom and listening to Voice of America on the radio at 4 a.m. so none of the neighbors would know, right? And so that gives me some idea that real information is important. People go to great lengths to get it. But somehow, in today's environment, where it's supposed to be freely available, it's still kind of hard to get.
 
Srini:
 
Mm-hmm.
 
Srini:
 
Mm-hmm. Yeah. Well, I noticed that when I was watching the coverage of the war in Ukraine, and I assume that's still the case. Like, so what does that do for the sort of collective reality of the entire population when, you know,
 
information is censored in that way? And also, how, I mean, the fact that they could go and listen to Voice of America, like, how was that even possible in those days? I mean, I'm sure now with the internet some things are possible, but talk to me about the sort of restrictions that are in place and what the impact of that is, obviously beyond sort of, you know, perpetuating propaganda.
 
Alex Fink:
 
So if you're going back to the 80s, the one thing that couldn't be blocked was just radio waves, especially long range radio. So the US actually had a conscious policy. They created Voice of America specifically for foreign propaganda, some would say, but in reality to try to inform countries behind the Iron Curtain. And so they had it in a wide variety of languages, including in Russian. And so people in the Soviet Union could listen to it, but it wasn't legal.
 
Srini:
 
Mm-hmm.
 
Alex Fink:
 
You had to hide the fact that you're listening to it. Today, as you said, everybody can install a VPN and try to go to websites that are banned in a particular country and still consume real information. Again, with the same sort of legal restrictions that if you get caught, it's probably going to be bad for you.
 
Srini:
 
Yeah. Well, so when, you know, that happens to a society at large. Because I think that, you know, our audience is overwhelmingly probably American, just based on what I've seen in downloads. Like, give people a picture of what that reality is like, because I honestly think that we take the ability to consume whatever information we want
 
for granted and obviously we'll get into some of that, but that also has certain consequences. But talk to me about what that does for the collective reality of the entire population and what Americans don't quite understand about the implications of that.
 
Alex Fink:
 
There are a few effects. Now bear in mind, I was a child, so now I'm reconstructing some childhood memories and adding things that I've read since. But I think you essentially have one official version of everything. Almost everybody in the society is like 99% sure that it's fake, but they're not 100% sure. So they're afraid to say that it's fake, because what if their neighbors don't agree? And so you have this preference falsification on a mass scale.
 
Srini:
 
Mm-hmm.
 
Alex Fink:
 
where everybody pretends to believe something but doesn't actually believe it. And nobody can actually say the things that they suspect to be true because they don't know if anybody else agrees. So that's, I think, the mass psychology that you observe in environments like this. You're beginning to see now some elements of this in the US, some preference falsification, some things that people think but are afraid to say, but it's obviously not at the same scale.
 
Srini:
 
Yeah. Well, so what happens, like, what are the social consequences of something like this, you know, in terms of the way that people act, the way that they behave, the way that they relate to each other? Is it just, like, an environment of living in perpetual fear?
 
Alex Fink:
 
Well, it depends on what the penalties are for trying to find out real information or to disseminate it. The Soviet Union in the 50s and 60s? Yes, absolutely. Everybody could be sent to a work camp, whether or not they did something bad, just because their neighbor said that they said something bad, right? Or they used the wrong tone of voice when they told a joke. The Soviet Union in the 80s, much less so. For the most part, you would be shunned a little bit. You wouldn't be accepted into the party.
 
Srini:
 
Mm-hmm.
 
Srini:
 
Yeah.
 
Alex Fink:
 
Maybe you wouldn't get promoted at work if you said the wrong thing, but there were no gulags anymore. In fact, from 85, 86 onwards, the Soviet Union had this process called glasnost, which basically means being able to say things or to voice things. And one of the things that was voiced is all the horrors of the late 40s and early 50s. That became public in the Soviet Union in the late 80s. Before that, nobody knew. And so...
 
Obviously, they couldn't then keep doing that, at least not on a mass scale, after people found out and agreed that it's bad.
 
Srini:
 
Yeah. Well, so when you look at sort of the landscape today, especially as somebody who, you know, effectively runs a media company, or a curator of media, what does that do for the media creation industry? Because if you think about the people who are listening now, many of them are content creators themselves, like bloggers. But how does that affect the media landscape when things become that way? Like, what is journalism like? I assume you don't have sort of independent media outlets the way we do here.
 
Alex Fink:
 
Well, certainly in authoritarian regimes, you don't. What you have typically is a big, you can call it a single conglomerate, even if it has different names and supposedly independent bodies, right? But there's one conglomerate of media. And to enter it, you have to say the right things in the right way. And there is a selection mechanism where only the people who say it in the perfect way enter the club. And then you start having an underground counterculture.
 
Srini:
 
Mm-hmm.
 
Alex Fink:
 
where people are trying to disseminate ideas in alternative ways. In the Soviet Union, historically, that was typically by printing little brochures or pamphlets and then giving them to somebody. And then somebody would actually manually copy the pamphlet and then give it to more people, right? That was the main mechanism. And then, of course, there were foreign mechanisms like Voice of America that were created by a competing government
 
to try and get a competing voice into this ecosystem, which again, by itself was probably American propaganda, but American propaganda was much more truthful than Soviet propaganda.
 
Srini:
 
Yeah, absolutely. Well, it's funny, because I was reading this book called Superminds, and the author was kind of tracing the evolution of communication technology and pointing to the invention of writing as one of the most significant inventions. And people don't think of writing itself as a form of technology, but as he traced all of it, it was really interesting to look at how each evolution
 
in communication technology impacted the transfer of knowledge. For example, we take it for granted that we can read books, but it's like, no printing press and that doesn't happen. And kind of like the pamphlets you're talking about, it's like, okay, the invention of the printing press is kind of a double-edged sword, because on the one hand, it allowed diffusion of knowledge across borders, allowed us to produce content at scale in a way we couldn't before, but it also enabled the spread of propaganda, for better or worse, in a lot of ways.
 
Alex Fink:
 
And if you look at it historically, one of the first things that happened when the printing press was invented was inquisitions and holy wars and witch hunts. The witch hunts specifically were intensified because of a single book that was self-published. It was called The Hammer of Witches. And that started this process in several different countries in mainland Europe, and probably 80,000 people died because of it. So yes, when you have a new technology that comes around and society is not
 
adjusted to it yet, it hasn't quite learned to live with it yet, the transition period can be pretty painful.
 
Srini:
 
We'll get to some of that, I think, as it relates to AI. But how old were you when you left Russia? Six, OK. So talk to me about the transition to a place like Israel, because Israel, obviously, given its current circumstances, is probably not the most normal environment either. I'm sure compared to the Soviet Union, it must have seemed drastically different. But talk to me about that experience. This is something I seem to find pretty consistently when I talk to people who come from Israel, is just the
 
Alex Fink:
 
Six.
 
Srini:
 
sheer abundance of technological prowess and innovation that seems to come out of there. Because I remember when I was in college, ICQ came out, and you just started asking, what is it with the Israelis and chat software? Why are they so good at this? Like, literally every chat software company at that time was coming out of Israel.
 
Alex Fink:
 
So we were all watching ICQ pretty intently, because they were the canary in the coal mine of essentially the Israeli startup nation. But let's roll this back a little bit to the time when I was six. So yes, I'm sure the environment wasn't normal. In fact, if you think about it, within five months of when we arrived, the United States began Operation Desert Storm in Iraq, and Saddam Hussein, because he couldn't fight back against the Americans,
 
Srini:
 
Mm-hmm.
 
Alex Fink:
 
He did, I guess, what every normal Middle Eastern leader does and started bombing Israel instead. And so we had kind of a welcome party to Israel, where we had just arrived and now we're wearing gas masks, sitting in these closed rooms and sleeping with our clothes on, so that if the siren starts sounding, we run to that room, lock ourselves in, and wait there with our gas masks. That was the welcome party. But other than that, Israel is a
 
Srini:
 
You
 
Alex Fink:
 
pretty normal Western country. I mean, the food is Middle Eastern, but everything about it is Western, at least in the part of Israel where I grew up. So I wouldn't say that you travel from Israel to some other country and it's like you arrived in a different world. It's about the same. So most of the life there is normal. Most of the education there is normal. You have slightly more religious education than elsewhere. But I found that fascinating, because at least all of my
 
Srini:
 
Mm-hmm.
 
Alex Fink:
 
teachers, even in the class that was supposed to teach religion, were actually non-religious, and so we were basically studying it as theology, not as something that we're supposed to believe in. We were analyzing the text, figuring out historic mistakes in it, and things like that.
 
Srini:
 
Well, I think that there are a couple of things that I wonder about. From what we see, I think, in the media and the world at large, there's this book that Richard Haass, who was an economic advisor for Bill Clinton, if I remember correctly, wrote, called A World in Disarray. They did a documentary of it on Vice News. And he'd mentioned the Middle East as the most destabilizing force in the entire world. And I wonder, one, what do we not see?
 
based on what the media shows us here in the United States. This is something I've always been curious about with people who have come from different places or seen environments like this. Because I remember my dad flew Tower Air to India once, and he was like, fuck this, I'm never flying Tower Air again. Because literally when you get to the counter to check in, this is in LA or wherever it is, he said they ask you something like 35 questions for security.
 
And when you have your layover in Tel Aviv, you're not allowed to go anywhere. You're confined to, like, this one room where your connecting flight is. He said it was so insane. But he said, you also got to remember, this is a place that's literally surrounded by enemies constantly.
 
But what do we not see when we are seeing it through the lens of like a million miles away? Like we're literally just seeing things unfold on the news. And I'm not just talking about now, but in general, like throughout the last 30, 40 years, like it just seems like a region that has been in perpetual conflict.
 
Alex Fink:
 
So let me separate the answers as it relates to Israel and to the rest of the region. In Israel, I would say we're actually seeing too much of the negative stuff in the news. But the reality when you live there is most of the time you just don't pay attention to it. Right. Yes, there were bombings where I was growing up. Right. But at some point when I learned enough math, I just did the math on how many people per year die from bombings. That year it was about 30. And how many people die from car accidents? That year it was about 600.
 
Srini:
 
No.
 
Alex Fink:
 
And I figured I shouldn't worry about bombings anymore. Obviously, in some periods it becomes worse and it actually starts affecting people in a significant way. When they had the last war with Lebanon, half the country was essentially either living somewhere else, to not be close to the northern part of the country, or they were sheltering in some way. I was working with an Israeli company at the time that had offices in Haifa, and essentially everybody relocated to their other office in the southern part of the country
 
Srini:
 
Yeah.
 
Alex Fink:
 
to not have to sit in the shelter all day. But most of the time you don't see that. It's kind of normal. Now, the rest of the Middle East, I can't claim to be an expert. I've tried to follow it pretty closely in the news. I studied Arabic for four years, so I have some understanding. But essentially, I think they have a small part of the population that lives like Western aristocrats, and they have a large part of the population that lives like it's still the 11th century. And that's
 
Srini:
 
Mm-hmm.
 
Srini:
 
Yeah.
 
Alex Fink:
 
pretty much what most of that region looks like. And in many ways, they also have very limited access to information. Many people there believe really odd things if you look at surveys made by Gallup, by Pew, things like that. And so I don't know if you can fix the region just by diplomacy or by politics or even by force. It has to start with some sort of education and getting information to the masses.
 
Srini:
 
Mm.
 
Srini:
 
Mm-hmm.
 
Alex Fink:
 
Because as long as they believe things like the US caused the tsunamis and the earthquakes that hit Indonesia, then you can't really convince them of anything else, right? If they think the US causes earthquakes, we have a problem. We can't fix this with politics.
 
Srini:
 
Yeah.
 
Well, I don't remember what it was. It was some documentary I was watching. It might have been on Vice or whatever it is, right? It might have been in Afghanistan, one of these Middle Eastern countries, where these kids grow up watching drone strikes, and they know they're all American drone strikes. And so, in one way, not that I'm advocating for the beliefs of terrorists, but what I'm
 
kind of trying to get to here is like, you're like, okay, if you don't grow up in an environment like that, then you can't really understand it. But if your entire life and your entire childhood has been nothing but seeing military drones from America, bombing your people, and you've seen that since you were six years old, the anti American sentiment actually makes sense in that context.
 
Alex Fink:
 
It makes sense to have some anti-American sentiment, but I don't think the drone strikes account for most of it. I know a bit more about Palestine than about other Middle Eastern, Arabic-speaking areas, obviously. So if you look at what a child growing up in Palestine learns over their childhood, yes, some of that is observing drone strikes. A lot of it is watching their television. And on their television, you actually have their own version of Sesame Street where
 
Srini:
 
Yeah.
 
Srini:
 
Mm-hmm.
 
Alex Fink:
 
Big Bird is wearing a suicide vest and singing about how great it is to bomb Israelis. You learn that and you learn about the drone strikes. So you learn that bad stuff is done by Israelis, and then somebody explains to you what you're supposed to do about it. Both of these things inform your worldview. It's not just one of them that works. So I think if we can in some way improve this second part, I'm not even saying replace it. I'm saying supplement it with additional information.
 
Srini:
 
Mm-hmm.
 
Srini:
 
Mm-hmm.
 
Alex Fink:
 
then things will get better. If we just come there and we try to force some solutions on the population that lives there, you're going against everything that they've seen since their childhood.
 
Srini:
 
Yeah, absolutely. Well, okay, so this is another random, weird curiosity question. If you're somebody who is not, like, you know, a natural-born Israeli citizen, so for example you immigrated there, are you still obligated to do the military service?
 
Alex Fink:
 
Yep.
 
Alex Fink:
 
You're supposed to, but the latest stat I've seen is that only 48% of the population that's supposed to serve actually does. There are quite a few loopholes. So yes, you're supposed to, but if you're religious, you don't serve. If you're depressed, you don't serve. There are quite a few people that get out.
 
Srini:
 
Yeah.
 
Srini:
 
And did you?
 
Alex Fink:
 
No, I'm one of those that didn't. It's a medical exemption and I will not get into the details of what it is.
 
Srini:
 
Okay, how did you get away with that? What excuse did you make?
 
Srini:
 
Fair enough. Okay, fair enough. I had another guest here very recently from Israel who had survived a suicide bombing, of all things. And he said, the only person I know who's ever dodged the military service is my wife. And I was like, how'd she get out of it? He said she literally just went there, told them, I don't think this is for me, and they were like, fine.
 
Alex Fink:
 
Okay, that is a dangerous approach, because technically you're supposed to spend six months in prison if you do that, if you're just a conscientious objector, right? But for women, it's actually pretty easy. Women can just claim to be religious and bring, essentially, a note from a rabbi, and they're out. For men, it's not enough to claim to be religious. You actually have to show that you're studying religion at a recognized religious yeshiva, and then you're out.
 
Srini:
 
Yeah.
 
Mm-hmm. Yeah.
 
Srini:
 
Yeah.
 
Srini:
 
Well, talk to me about the trajectory from Israel to what you're doing today, like going to Japan. So how did you end up here, of all places?
 
Alex Fink:
 
Yeah. So I was dreaming about moving to the US since I was 12, probably. I don't know why. I just read about it in books. I saw these boundless opportunities, the Wild West, the mountains of Colorado. I read all the books of Jack London. I know he's not really popular in the US, but he wrote a lot about the wild, like about Alaska, about the western part of the US. So I dreamt about it. As soon as I started working and I had the ability to choose where I work after college,
 
Srini:
 
Mm -hmm.
 
Alex Fink:
 
I selected an employer whose entire selling point to me was relocation to the US. It was kind of a demotion, actually, compared to where I worked before that. But I accepted the job. It was an L-1 visa, so you have to work for that employer for a year in the foreign office and not in their American office. They said, you should work in Israel. I asked, does it have to be Israel? And so we agreed to let me work in Japan for that year. That's how I ended up there. And from there I moved to the US, and I've been here ever since.
 
Srini:
 
So, I mean, you kind of alluded to it at the beginning of our conversation. I assume there were predecessors as well to what you're doing now at OtherWeb, but talk me through sort of the foundations of this. What was the impetus for this?
 
Alex Fink:
 
So the impetus is, I think I already mentioned, news is important to me, real information is important to me. I guess that developed into this kind of lifelong fascination with information and me becoming an information junkie, reading a lot, watching a lot, listening a lot, and remembering almost everything. I have an unusually good memory as well. And at some point, after a couple of decades of doing this, I noticed something is wrong. It's becoming worse. I don't know why.
 
Srini:
 
Hmm.
 
Alex Fink:
 
but it seems like there's more and more noise and less and less signal, even though my sources did not become worse. I'm still trying to select just as carefully as I used to, but it's requiring more effort now. And so I've been walking around with this thought, and I had my provisional diagnosis in mind, which we'll get to in a second, but I didn't do anything about it. I kept working in my normal day job, which was making video cameras and perception systems, right? Actually generating more information, you could claim.
 
Srini:
 
Mm-hmm.
 
Srini:
 
What?
 
Alex Fink:
 
And at some point, this dissonance was just becoming unbearable. It was too much, right? And I thought, okay, I'm making money making more cameras, but I don't think the world needs more cameras. On the other hand, I think the world really needs a better way to filter information, because everybody around me is having a hard time figuring out what is true. And I mentioned my parents before. They live in the US now. And these were the people I looked up to, that really
 
always knew what was going on in the world. They always were more informed than their neighbors. And now I see them struggling. They can't understand what is true and what isn't. They're not able to navigate this weird ecosystem that we have going on. So about three years ago, I started getting serious about it. I was still building cameras on the side, but I started at least talking to as many people as I can, journalists or other entrepreneurs or people in academia trying to
 
figure out what others think, what is the root cause of this, what can we do to improve it? And I developed my own theory, which was that the main root cause of everything becoming worse over the past years is that most of the content people create is monetized by ads, and most ads pay per click and per view. There's no pay-per-truth or pay-per-quality. And so over time, if the only thing incentivizing creators is clicks and views,
 
Srini:
 
Mm-hmm.
 
Alex Fink:
 
you essentially have evolution with a single selective pressure. So even if creators are trying to do the right thing, there is a selective pressure to maximize clicks and views as the most prominent trait of everything being created. So everything drifts that way and becomes clickbait over time.
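(To make the dynamic Alex describes concrete, here is a minimal, hypothetical simulation of content evolving under a clicks-only selective pressure. Everything in it, the trait names, the numbers, the selection rule, is invented for illustration; it is not code from OtherWeb.)

```python
# A toy model of "evolution with a single selective pressure": articles have
# two traits, but only sensationalism earns clicks. All values are made up.
import random

random.seed(0)
POP, GENERATIONS, SURVIVORS = 100, 50, 20

# Each "article" is an (accuracy, sensationalism) pair, both in [0, 1].
population = [(random.random(), random.random()) for _ in range(POP)]

def clicks(article):
    # Clicks reward sensationalism only; accuracy earns nothing.
    _accuracy, sensationalism = article
    return sensationalism + random.gauss(0, 0.1)

def mutate(article):
    # Imitators copy successful content with small random variation.
    return tuple(min(1.0, max(0.0, t + random.gauss(0, 0.05))) for t in article)

for _ in range(GENERATIONS):
    survivors = sorted(population, key=clicks, reverse=True)[:SURVIVORS]
    population = [mutate(random.choice(survivors)) for _ in range(POP)]

mean_acc = sum(a for a, _ in population) / POP
mean_sens = sum(s for _, s in population) / POP
print(f"accuracy={mean_acc:.2f}, sensationalism={mean_sens:.2f}")
# Typical outcome: sensationalism climbs toward 1.0, while accuracy, which is
# never selected for, merely drifts. That is the drift toward clickbait.
```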
 
Srini:
 
Yeah. Well, I think that what is so fascinating to me about this idea, my brother-in-law has been working on a startup to combat misinformation for years. And I was telling him, I said, you know, I think the biggest challenge with this notion of misinformation is that we don't experience the consequences of it immediately. It's kind of like a delayed effect, right? For example, if you have a headache, you take an aspirin, because you feel the pain right away. Whereas,
 
often we don't see the consequences of this until years down the road. When somebody who ends up in office is somebody who basically got there through misinformation, regardless of where your politics are, we can't argue that we're so divided when it comes to this idea of the truth. And I remember asking Cal Fussman about this. You'd pointed out sort of the single source of truth being the government, which is not a good thing. But one of the things that has happened as a byproduct,
 
of this sort of freedom of expression that we have, and I'm saying this even as a content creator who has the privilege of being able to do something like this show, is that you have this massive fragmentation of the media landscape. Like I remember, Cal Fussman said, you know, back in those days, you knew when Walter Cronkite got on the news that he was the source of truth. But now you layer on top of that 20 different news outlets, 50,000 different bloggers, podcasts, whatever.
 
And to your point, nobody knows necessarily what the truth is. And we also have to combat our own cognitive biases to deal with this. So talk to me about one, when you look at the media landscape today with the fact that we have the ability as individual creators to create, layer on top of it this entire ecosystem of media, how do we even begin to?
 
fix this mess? Because it's not like we can go backwards from here and say, okay, you know what, we're cutting off all the news. We're cutting off all the bloggers. We're going to go back to the Walter Cronkite era. That's not going to happen. So where is the solution? And the other thing, let's address this issue of people actually not understanding the consequences. Because, honestly, day to day I don't really care that much about misinformation, because it doesn't affect me per se on a day-to-day basis,
 
Alex Fink:
 
Right. Right.
 
Srini:
 
but I'm well aware enough to know that yeah, that has consequences that I might feel years down the road. It's kind of like smoking cigarettes for 10 years and then being shocked that you have a heart attack.
 
Alex Fink:
 
Right. So there is a lot to unpack here. First, you mentioned how do we go about this? We can't go back to Walter Cronkite. And I think that is absolutely true. We mentioned the printing press before. You couldn't fix the problems of the Inquisition and the Holy Wars by going back to only the Catholic Church publishing materials, right? You had to come up with some sort of a system to help the population filter information in a way that makes it more likely that the truth bubbles up to the top.
 
Srini:
 
Yeah.
 
Alex Fink:
 
And so this is why about 200 years after the invention of the printing press, we started seeing more serious journals where the editors are very selective about what gets published. We started seeing the scientific method, we saw peer review, and eventually all of that culminated in the Enlightenment. So it was a social process of basically a lot of smart minds building this kind of almost a sieve through which the best stuff gets through.
 
and the noise tends to get caught and not come out on the other end. I think we need some sort of a new system like this now, because as you mentioned, instead of three big networks, we now have thousands of independent publishers. And we need some sort of a social process that makes it more likely that things that are true or at least well -written end up reaching the masses as opposed to whatever is the loudest, whatever is the most infuriating.
 
You mentioned people getting elected through misinformation. I should remind people that in 2016, the most widely shared news story on Facebook was the Pope endorsing Donald Trump. Right? Yes, it had 800,000 shares. And that was the number one story for that year. So obviously that's not written with the intent to disinform. It's written with the intent to get 800,000 shares.
 
Srini:
 
Really? I did not know that. Wow.
 
Alex Fink:
 
and it worked like magic, but the result is misinformation. And this is the other thing that I really want to focus on, because I think people have this hyperactive agent detection. We always think that whatever we're observing is the nefarious intent of somebody somewhere. And so when people look at misinformation, they start looking at Russian propaganda. They start looking at those evil conservatives, or that evil cabal of Democrats in the media.
 
They start blaming somebody and if you're on the right, you're blaming the left and if you're on the left, you're blaming the right. The reality is everybody is just trying to get clicks. There is some small percentage of actual GRU agents sitting in a dungeon in St. Petersburg somewhere, but they are the minority. It's just people chasing clicks. And it just turns out that the crazier the stuff you publish, the more clicks you get. And so if we want to improve this,
 
I think we need to change the incentives. We need to actually punish bad stuff and make it so that higher quality content gets rewarded in some way.
 
Srini:
 
Yeah, so I think you bring up an interesting point because if you think about platforms like Facebook, platforms like Twitter, even platforms like Medium, I remember I got to a certain point where I had to set up a separate Medium account because I realized, I was like, wait a minute, I've written all these articles about productivity and stuff related to that. And then I was like, God, I'm like, my entire Medium feed is just a cesspool of productivity porn. I don't discover anything new or learn any new perspectives or anything. And so...
 
I think that one of the big issues with all of this is these echo chambers that get created as a result of algorithmic content curation. But, to your point, how are you going to incentivize a company like Facebook to say, you know what, let's actually change the way that we curate this newsfeed? Like, I'm sure you probably saw The Social Dilemma. Most of these people say, we built something that is kind of beyond our control, and we don't even really know how the hell it works anymore.
 
Alex Fink:
 
So this is probably where I disagree with Tristan Harris the most. He has this nonprofit that tries to approach Facebook and tell them: please be nice, please try to filter the bad stuff out, please try to show people better content. But Facebook is a C corp. They need to maximize shareholder value, right? The way to do that is to maximize engagement. Outrage porn sells, right? And so you can't tell them, don't put outrageous stuff in people's feeds,
 
Srini:
 
Yeah.
 
Srini:
 
Yeah.
 
Srini:
 
Mm-hmm.
 
Alex Fink:
 
and actually expect them to follow. You need to create some sort of a cost to putting the outrageous stuff in people's feeds. And you saw the backlash against Facebook, basically resulting in bad PR for them and hearings in Congress and things like that. And so now suddenly Adam Mosseri announces they will not be promoting politics on Threads. Well, why is that? It's because to them, the cost of political discussions is now higher
 
than the benefit of engagement that these discussions create. Now, I'm not sure we want that to be the result. I think we want political discussions. We just don't want the outrageous stuff to bubble up to the top. But it at least shows you that incentives actually drive behavior. Going and pleading with them to be nicer is obviously not going to work. The other thing that you should consider here is that ultimately Facebook and other companies like it are distributing content, but...
 
Srini:
 
Yeah.
 
Alex Fink:
 
There is a consumption layer that goes beyond that, right? Many people would click on an article on Facebook and end up in the browser, right? Or they would actually go to Facebook through their browser and not through the Facebook app. So if we actually give people tools to filter bad stuff out at the point of consumption, maybe they have a browser extension that starts blinking red and saying, alert, alert, this is unlikely to be true, I don't think the Pope endorsed Donald Trump for president, right?
 
Srini:
 
Yeah.
 
Srini:
 
Mm-hmm.
 
Alex Fink:
 
then both the people writing these articles will be making less money and possibly Facebook will be making less money on these articles so they would start promoting better ones.
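(As a thought experiment, here is a tiny sketch of the point-of-consumption filter Alex imagines. The scoring function is a keyword toy, purely hypothetical; a real extension would call a trained model, and nothing here is OtherWeb's actual code.)

```python
# Hypothetical point-of-consumption check: flag a headline before the reader
# clicks through. The marker list and threshold are invented for illustration.
DUBIOUS_MARKERS = ("shocking", "you won't believe", "doctors hate", "endorses")

def credibility_score(headline: str) -> float:
    """Return a score in [0, 1]; lower means more clickbait-like."""
    text = headline.lower()
    hits = sum(marker in text for marker in DUBIOUS_MARKERS)
    return max(0.0, 1.0 - 0.4 * hits)

def warn_if_dubious(headline: str, threshold: float = 0.7) -> str | None:
    """The 'blinking red alert' step: warn only below the threshold."""
    if credibility_score(headline) < threshold:
        return f"Alert: this is unlikely to be true: {headline!r}"
    return None

# The 2016 example from the conversation would trip the toy filter:
print(warn_if_dubious("Pope endorses Donald Trump in shocking statement"))
```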
 
Srini:
 
Yep.
 
Srini:
 
Yeah, well, so that raises the question, as somebody who runs a media outlet, I've thought about this, right? At the end of the day, we have ads on our show. And the truth is that I've always thought, how do you get an industry that has largely been dominated by advertising revenue, pretty much the entire media landscape
 
to date is dominated by ad revenue? And as much as we'd love it if everybody said, you know, hey, we'll pay, if I had everybody listening to us pay a dollar a month, I could literally tell every advertiser that I don't need them anymore. But that's not gonna happen. It just won't, because I think the other issue is we've gotten so conditioned to everything being free, right? And I think the idea that it's free is also kind of an illusion, because, look, somebody is always subsidizing your ability to consume anything that you
 
get for free. So, for example, people listen to this podcast for free; guess what, our advertisers are subsidizing that. You subscribe to a newsletter for free, and that person who writes the newsletter sells courses; well, guess what, every person who buys that course basically subsidizes the ability of those people who will never buy a damn thing to consume that content for free. And so, you know, in this world where there's
 
truly nothing free. And I'll give you one more example; this is something I brought up on the show before. I remember seeing this documentary about CBS, how somebody was doing a documentary at CBS on sweatshops at Nike. And coincidentally, Nike happened to be sponsoring the Olympics, which was being broadcast on CBS that year.
 
That's like a hundred million dollars in ad revenue. Like, I mean, if I were the CEO of CBS, I would have killed the documentary too. Not because I would want to censor a story about sweatshops, but if it's like, hey, let's show this story about sweatshops, or here's a hundred million dollars from Nike. I'm the CEO of a damn company. If I don't take the hundred million, I'm not going to have a job.
 
Alex Fink:
 
Right. Well, admittedly, there are some good counter examples to this. I think the Wall Street Journal had a chance to kill the Theranos story and didn't on principle, even though the owner of the Wall Street Journal invested a hundred million dollars into Theranos. So you have examples like this where it kind of works out fine, but I completely agree with you. I think that monetization needs to be addressed and I think you have to be pretty big to be able to address it. So.
 
Srini:
 
Yeah.
 
Srini:
 
Wow.
 
Srini:
 
Mm-hmm.
 
Alex Fink:
 
As the founder of a startup, I thought we're not in a position to try to change how people monetize content. In fact, we're not even in a position to change how we are going to monetize content. So the best we could do as a small company is first register ourselves as a public benefit corporation so that we can at least put the mission alongside the profit motive in our bylaws. So we're supposed to maximize shareholder value. We're also supposed to improve the quality of information people consume.
 
Srini:
 
Yeah.
 
Alex Fink:
 
And we have to track how we're doing that and file reports every two years, et cetera. But that gives at least me as an executive, the defense of yes, I know higher quality information means slightly less engagement. Yes, I know we're going to make less money if it's ad monetized, but I'm allowed to do this. Dear shareholders, you cannot sue me for breaching my fiduciary duty, right? That's one element. The other one is trying to bind our future selves.
 
by opening up our models and our data sets to the world so that if we change them in a nefarious way tomorrow, people would notice. If we suddenly close the source tomorrow, people would notice. So this is almost like forcing your future self to be good or future whoever replaces me in the future to be good.
 
Srini:
 
Mm-hmm.
 
Srini:
 
Yeah.
 
Well, so here's the thing. I think that one thing we're all guilty of to a degree is letting our confirmation bias guide us. And on top of that, I noticed when I started watching the YouTube videos, and I watched a lot of, it's funny, because I never found the news interesting until Trump got elected. Then every day was such a shit show. I was like, this is fucking fascinating. But I noticed I started watching John Oliver and Trevor Noah, and pretty soon Seth Meyers ended up in my feed. And I was like, oh, wow, I'm experiencing this secondhand.
 
And, you know, every now and then, because I wanted to know what the other side is being told, and this I think is another big problem, I would go watch Sean Hannity. And I was like, okay, love him or hate him, Sean Hannity has a massive fucking audience. There's something going on here, and I wanted to know what. And it was kind of like, wow, so this is what millions of people are not only being told but, like, you know,
 
fervently believe. And so you have that issue on top of it. But so, one, let's talk about the role of cognitive bias in all of this and what the implications are for people's consumption habits. But on top of that, I mean, people, I don't think, like I said, really understand the long-term effects of this, because day to day, like I said, I didn't even know that that Pope article was shared 800,000 times, and truth be told, I probably wouldn't have given a shit
 
then, because it doesn't impact me in the immediate moment. I don't feel the pain of that article spreading in any way at all. So talk to me about, I think people need to understand what the serious long-term consequences of this are, and then I'll let you kind of get into OtherWeb and how it works and how you guys do what you do.
 
Alex Fink:
 
So let me use an analogy that I probably overuse a little bit, but I think it's really fitting in this case. The food ecosystem in the US is also making people around us unhealthy. And it also takes a long time for the effects to become visible. People start eating processed foods that are hyper-palatable, have too much sugar, too much fat in them, too much salt in them. And then 20 years later, you're seeing an obesity epidemic that costs the US
 
probably four or five, maybe even more trillion dollars a year to address in medical costs. And there's no way to cut those medical costs without actually making people eat better food, right? I think you're seeing a very similar situation with information. It's just the food for our minds instead of the food for our guts. And the costs are going to be just as bad, if not worse, because bad information causes us to elect people that might press the wrong button.
 
Srini:
 
Mm-hmm. Yeah. Well, so the other thing I think about, right, is something Ryan Holiday said, something along these lines, I'm paraphrasing it. He said, you know, if it's important, then it'll probably be in a book. And so I tend to prioritize books. I read a lot; I actually don't read that much on the internet anymore.
 
But I read books every week and I think there's something to be said for information that is timeless. Now, obviously, does that mean that every book, you know, basically is an icon of truth? No, obviously not. But like, I think the other thing is that it's important to be open to being exposed to perspectives that you disagree with.
 
And the example I always come back to is, like, my brother-in-law saw me reading Roger Ailes's book, and he's like, why the fuck are you reading Roger Ailes's book? You know, he's the founder of Fox News. I'm like, yeah, I don't agree with any of his views on anything, but he built a massive audience. There's probably something useful I can learn from him. So it's almost like, I'm not going to discount the value of the message because I don't like the messenger.
 
Alex Fink:
 
Right, so I'm going to give you a scoop. We just ran a study with Harris, where we polled more than 2,000 people, or rather, they polled them for us. We gave them a bunch of statements and asked them whether they agree or not, and to what extent. When we show them the statement 'Americans need to consume both left-leaning and right-leaning media to know what's really going on,' or some version of it, more than 80% of Republicans and 80% of Democrats say yes, absolutely, definitely agree.
 
Srini:
 
Mm-hmm. Mm-hmm.
 
Alex Fink:
 
When we ask them to rate the statement 'I mostly consume news from my side of the political aisle,' more than 60% of Republicans and more than 60% of Democrats also say yes, absolutely agree.
 
So everybody knows that we need to know what the other side is thinking, but for some reason, even though they're primed to give answers that make them feel good about themselves, they're honest enough to say: no, I only consume my own side. So I think this is the reality we live in. And again, it might be driven just by incentives. People are more likely to click and share on stuff they agree with, or on stuff that demonizes the people they don't like. And so that's where we end up.
 
Srini:
 
Well, so talk to me about what you guys are doing at OtherWeb to address this issue. Like, how does it work? Why does the quality of the information that you consume improve? And also, I mean, you're also giving people the freedom to choose the sources that they consume from. So that also poses another problem, right? It's like, great, I just have another app where I'll add my, you know, newsfeed from CNN or my newsfeed from Fox.
 
Alex Fink:
 
Right. So we are taking it from a different direction. What you just described sounds a lot like Feedly, where you add RSS feeds by yourself. We actually start with all the feeds already populated. So you start with more than 200 sources, depending on which topics you selected. And then, if you don't like a source, you can disable it. But basically, to end up with a right-wing or a left-wing echo chamber, you need to do a lot of manual work. So, hopefully, not that many people are doing it.
 
Srini:
 
Yeah.
 
Srini:
 
Mm-hmm.
 
Srini:
 
Yeah.
 
Alex Fink:
 
But yes, the idea was, we want to sample from the entire spectrum, or at least the entire kind of rational Overton window. And it's not even just focused on news. We have commentary, we have podcasts, we have research studies, et cetera. But we try to sample as much information as we can from all over the web. We use a suite of AI models that we developed to filter the obviously bad stuff out and to give people some additional information to decide on what they want to consume and how. So,
 
We organize the outputs from these models as a nutrition label, so you can look at what we've learned about the content you're about to read before you read it. And by the way, you mentioned books. One of the models that we have, and I'm not sure most people actually look at its output, kind of addresses the need that you described there: it tries to evaluate how timely versus timeless the content is. We call it the time value of content. So it would rank something
 
like breaking news as really timely, but not really worth a lot in the long run, right? And something like a Wikipedia article about an event that happened 100 years ago as really timeless. And it's pretty important to know about content before you read it, just like you read the nutrition labels in the store before you buy something and end up eating it. So that was the general idea. We give people as many sorting options as possible. Again, not everybody uses them, but basically, if it
 
Srini:
 
Mm-hmm.
 
Alex Fink:
 
can be customized, we let people customize it. At some point, we even developed a model that evaluates which emotions an article is most likely to evoke in the reader. And based on how people interact, we try to optimize the feed to give them more of the stuff that they like. But they can go into advanced customizations in the app and change it. Basically, they can see everything that the app learned about them, and they can manually change it.
 
Srini:
 
Wow.
 
Srini:
 
Mm-hmm.
 
Alex Fink:
 
So if we learned that you really like infuriating stuff and you really like depressing stuff and you really don't like educational or hopeful stuff, you can see that and you can modify it to change your feed. And maybe you will also take a mental note that you should be spending more time on the hopeful stuff and less time on the infuriating stuff.
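(To ground this, here is a hypothetical sketch of what a content "nutrition label" plus reader-controlled filtering might look like. The field names, scores, and preference keys are all invented for illustration; they are not OtherWeb's actual schema or models.)

```python
# Hypothetical "nutrition label" for a piece of content, with filtering done
# at the point of consumption, under settings the reader can see and edit.
from dataclasses import dataclass

@dataclass
class NutritionLabel:
    source: str
    time_value: float      # 0 = breaking news, 1 = timeless reference
    informativity: float   # substance vs. filler, 0..1
    emotion: str           # dominant emotion the piece is likely to evoke

def keep(label: NutritionLabel, prefs: dict) -> bool:
    """Apply the reader's own preferences, which they can inspect and change."""
    if label.emotion in prefs.get("blocked_emotions", set()):
        return False
    return label.informativity >= prefs.get("min_informativity", 0.5)

feed = [
    NutritionLabel("wire service", 0.1, 0.8, "neutral"),
    NutritionLabel("outrage blog", 0.05, 0.2, "anger"),
]
prefs = {"blocked_emotions": {"anger"}, "min_informativity": 0.4}
print([item.source for item in feed if keep(item, prefs)])  # ['wire service']
```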
 
Srini:
 
Well, talk about the implications of AI as it relates to all of this, because I'm sure you've probably come across that there's a company called HeyGen that allows you to create digital avatars. And I do these introductions to my YouTube videos. I was like, this is a pain in the ass. I'm just basically recording a one-and-a-half-minute intro over and over again. I was like, that's it, I'm going to create an avatar. And I remember sending it to a friend, and he was like, holy shit, this is scary, because it's that good. You'd never know it wasn't me.
 
Like, I can show you a sample of it. And literally nobody on any of my YouTube videos has mentioned, hey, this looks like it was generated by AI. But what are the implications of all of this? Obviously there are huge upsides to this, but I also think that the downsides of AI are basically sort of the wild, wild west. Nobody knows what the fuck they're doing or has any idea. At least I feel like that, because I'm a big AI enthusiast.
 
Alex Fink:
 
Yeah, I mean, there's certainly that element of it, right, that it's pretty democratized, and therefore some people will do good stuff with it and some people will do bad stuff. And you're just hoping that the guys doing the good stuff have more resources and therefore will be more prominent. The other element of all of this is that it's a really big revolution that will affect probably 30 or 40 percent of the economy in the next few years, right? And so in that sense, it reminds me of, not that I remember it myself, but
 
Srini:
 
Yeah.
 
Alex Fink:
 
I read about the agricultural revolution, right? When agriculture was mechanized in the 1920s and 1930s, something like 30% of the US population basically had to change jobs, because we went from 30% of the population working in agriculture to something like 3% in a decade. And it's a huge transition. I'm not even sure the US is over that transition now. Even though it's been four generations, you're still seeing some of the effects of that. So we're about to have that, except the people getting displaced,
 
Srini:
 
No.
 
Alex Fink:
 
at least right now, are the people whose output is words. And so it might be a lot more painful, because that class has a lot more power than the people who worked in agriculture.
 
Srini:
 
down.
 
Srini:
 
That's kind of what I've noticed too. I'm like, okay, this is the first sort of time where you have a technological disruption that has the potential to displace sort of white-collar jobs. I've never seen that before. And, you know, even as a writer who uses AI, I've found all sorts of different ways to use it. But I realize there are definitely some scary things about this. I think creatives in particular are worried, and in some ways they should be, because I think they have to adapt.
 
But Sam Wolbin had a really interesting interview on CBS where he said, you know, the thing is, every time technology comes along, we have to adapt our own behavior. He gave a really interesting example about teaching math. He said, you know, when the calculator came out, we had to change how we taught math. And the upside I see is that this enables a more sort of Socratic-method learning approach. So, for example, now, when I take book notes,
 
I have an AI tool that has all my notes from all my books. I mean, my outline for our conversation is here. I mean, I haven't followed it, but I was like, here, put together a briefing on Alex so I can kind of know what the hell I'm talking about when I speak to him. But I make a point to go through and actually discuss with the AI. I'm like, OK, let's talk about this idea. I want to understand it better. And I think that that, to me, is really, from an educational standpoint, I think the power is sort of limitless.
 
Alex Fink:
 
It's absolutely true, and I think education is going to have to essentially get inverted from the model that we're using right now. It makes no sense to sit in a classroom and listen to the teacher lecturing on something, because you can get the same amount of information, at a pace that is much more appropriate for you, at home, with an AI doing the teaching. On the other hand, it makes absolutely no sense to do homework at home and then bring it to the classroom,
 
Srini:
 
now.
 
Alex Fink:
 
because 99 % of the time it would be AI doing the homework. So you almost have to flip those two and you have to do homework in class with somebody watching you to make sure you're actually doing it and get all your instruction at home from a personalized teacher that has a special program just for you. But setting that aside, I think that it's obviously good in the long run if we make it to the long run.
 
Srini:
 
Yeah.
 
Srini:
 
Hmm?
 
Srini:
 
Yeah.
 
Alex Fink:
 
It's one of these cases where the transition is going to be painful just because humans aren't that good at adapting, right? We have to adapt, yes, but most people don't learn to code when they're 50. Most people don't live really well with the idea of, I was a journalist until now, and now I'm going to be annotating AI models for the rest of my life, right? And so you're going to get a lot of social disruption. Or, to use an even simpler example: interpreters, right?
 
Srini:
 
Yeah.
 
Srini:
 
Mm-hmm.
 
Srini:
 
Yeah.
 
Alex Fink:
 
Interpreters are essentially only going to be useful, within five years, for annotating data. Other than that, why would you ever hire one? A model is just going to be better than any one single interpreter, right? An interpreter would still be good for annotating some esoteric corner of the data set that only they know and that the model is still not good at. But other than that, what other job could they do? They have to learn something new or just become an annotator.
 
Srini:
 
Yeah.
 
Srini:
 
Yeah.
 
Well, so, you know, it's funny. I had a conversation with David Brooks about his new book, How to Know a Person. And he said this, he said, you know, I think one of the things that we're led astray by is the phrase artificial intelligence, because it gives the impression that the machine has the intelligence, but it's really just synthesizing human intelligence in human language. And that really stayed with me because I realized one of the reasons I'm able to leverage AI so effectively is the fact that I've read thousands of books, you know, I've got a
 
thousand transcripts from conversations with people like you. So I realized, I was like, wow, the role of knowledge acquisition is gonna be even more important. But to me, what data is to large companies, your personal knowledge, or personal knowledge capital, is to the individual. It's where your power lies. In my mind, there's nothing more important an individual could be doing in the age of AI than cultivating their own knowledge base.
 
Alex Fink:
 
and probably cultivating their own understanding of themselves. Because if they don't cultivate this, and this is more of Yuval Noah Harari's points, right? If you don't know yourself, these systems will know you better than you know yourself, and then you're in trouble.
 
Srini:
 
Yeah.
 
Srini:
 
Yeah, absolutely. Well, I mean, you and I could talk all day about all of this. So in the interest of time, I want to finish with my final question, which is how we finish all of our interviews at the Unmistakable Creative. What do you think it is that makes somebody or something unmistakable?
 
Alex Fink:
 
unmistakable.
 
I tend to get too technical when you ask me things like this. I start thinking literally about what it means. But in general, I think that if people know themselves, if people really figure out what it is that they are the best in the world at, or at least the best out of everybody they know at, and they invest as many resources as possible into this, then whatever they produce over their lifetime will probably
 
be obviously a manifestation of them. Nobody will be able to mistake it for anything other than the product of this particular mind. And hopefully I will leave a mark like this in the world, but time will tell.
 
Srini:
 
Amazing. Well, I can't thank you enough for taking the time to join us and share your insights and your stories and your wisdom with our listeners. Where can people find out more about you, your work, and everything else you're up to?
 
Alex Fink:
 
So our website is otherweb.com, or if you want to skip the website and go straight to the apps, it's called OtherWeb on either the App Store or Google Play. And just go from there. There's a newsletter. There's a whole bunch of other secondary products. But in general, what I would like to ask people to do, besides just using our products, right, I don't mind if you use other products that do the same thing, if you can find them, is I just want people to pay attention to what they put into their brain,
 
because we know we're not supposed to put bad stuff in our mouths, right? But with our brains, we haven't been paying attention because as you said, it's a delayed onset problem, but it's about time we started because the delayed onset is starting now.
 
Srini:
 
Yeah.
 
Srini:
 
Yeah, amazing. And for everybody listening, we will wrap the show with that.