Episode 77

Published on: 21st Mar 2024

Algorithmic Sameness with Kyle Chayka

In this episode of 'People vs. Algorithms,' hosts Brian Morrissey, Troy Young, and Alex Schleifer engage in a wide-ranging discussion on how algorithms shape our cultural landscape, the future of AI in content creation, and the consequences of a digitally dominated world. They share insights from the Game Developer Conference about the optimism among independent studios and discuss the ramifications of algorithmic sameness across media. The episode features a detailed conversation with Kyle Chayka, a writer at the New Yorker and author of 'Filterworld,' centering on how algorithms are flattening culture and what it means for creators and consumers alike.

Topics:

  • 00:00 Kicking Off at the Game Developer Conference
  • 06:42 The Future of AI in Gaming: Opportunities and Challenges
  • 08:40 Navigating the Complexities of AI Ethics and Open Source Debates
  • 19:49 The Middle Management Dilemma: Efficiency vs. Human Touch
  • 27:33 Listener Questions: The Future of Learning and AI's Role
  • 34:01 The Digital Dilemma: Navigating Education and Work in a Tech-Driven World
  • 34:48 Rethinking Education: A Vision for the Future
  • 35:08 Introducing Kyle: A Deep Dive into Algorithms and Culture
  • 35:48 The Impact of Algorithms on Culture and Individuality
  • 38:41 Exploring the Future: The Role of Algorithms in Shaping Our World
  • 43:02 The Algorithmic Influence: From Consumption to Creation
  • 47:29 Navigating the Algorithmic Landscape: Personal Experiences and Strategies
  • 51:36 The Future of Culture in the Age of Algorithms
  • 55:59 The Personal and Professional Impact of Living Online
  • 59:01 Envisioning a Future Beyond Algorithmic Feeds
  • 01:06:28 Exploring Good Products
Transcript
Alex:

I'm at the game developer conference today,

Brian:

What's the mood at the Game Developer Conference?

Alex:

There's definitely a lot of people looking for work and more activity around studios trying to get consulting gigs, so not as hot as it was last year. But last year it was all crypto, and it's kind of nice not to see any of that.

Alex:

This year a lot of the independent studios actually sound pretty excited.

Alex:

And the fact that there's a lot of stuff changing, is kind of showing up as a lot of opportunities.

Alex:

So it's actually more optimistic than I thought it would be.

Troy:

you don't need no optimism, Alex.

Troy:

Cause you got talent.

Brian:

Welcome to People vs. Algorithms, a show about patterns in media, technology, and culture.

Brian:

I'm Brian Morrissey, and each week, I'm joined by Troy Young and Alex Schleifer as we try to connect the dots about what's happening in the worlds of media and technology.

Brian:

This week, we have a conversation with Kyle Chayka, a writer at the New Yorker and author of the recently released book Filterworld, which explores how algorithms are flattening culture.

Brian:

This is a topic Kyle has focused on for years.

Brian:

It's something that I used to call algorithmic sameness, and I saw it mostly in publishing, with a loss of control over distribution.

Brian:

Publishers needed to create content, not just for people, but for the needs of algorithms first and foremost.

Brian:

Whether that algorithm was from Google, Facebook, or more recently, TikTok.

Brian:

I mean, even the most hyper-aware can fall into the trap of shaping their content to please whatever algorithmic god of their persuasion demands.

Brian:

And to me, it's not just the story of algorithms, but the ecosystem that exists around algorithmic media, and in particular, the overemphasis, in my view, at least on optimization.

Brian:

I mean, once you're in the algorithmic fighting pits, the only way to compete seemingly is through optimization or growth hacking.

Brian:

A generation of media has been largely defined by finding growth hacks before the algorithm catches up.

Brian:

And this cat and mouse game gave us such cultural treasures as You Won't Believe What Happens Next headline conventions, the monotone TikTok voice, and the interminable backstories we get before we can finally read the recipe.

Brian:

And yet, as a consumer of media, I have to admit that algorithmic recommendations are a cheat code.

Brian:

I happily give away my discovery of new music to Spotify's algorithm.

Brian:

I never spent a lot of time working on a finely tuned taste of music, so outsourcing that's wonderful.

Brian:

I mean, I suspect many people don't have time to wade through the stacks at vinyl record stores.

Brian:

I mean, as Troy says, it's easy to fall into an anti algorithmic viewpoint.

Brian:

It just doesn't seem very cool to admit that you prefer to give over your preferences and cultural products to some computer code.

Brian:

And yet I believe most people happily make that trade off.

Brian:

I hope you enjoy this discussion with Kyle, which comes after we discuss the plight of the middle manager.

Brian:

And whether you should trust the people shaping the AI world.

Brian:

And also we hear from Graham, on the rant line.

Brian:

On that note, you can send us a question or a comment on the rant line by visiting memo.fm slash PVA.

Brian:

You can then hit record and send us a voice message that we'll use in a future episode.

Brian:

Thanks a lot for listening.

Brian:

And if you like this show (I regularly meet people who are listeners, which is always a terrific experience), please send us a rating and review wherever you listen to your podcasts.

Alex:

Our game has gotten some good responses.

Alex:

We've

Troy:

What did

Alex:

We've changed our studio name to Human Computer, and that got a good response as well.

Alex:

So, and we announced

Troy:

Hold, hold it, hold it, hold it, hold it, hold it.

Troy:

There's a lot of news.

Troy:

That's a lot of

Troy:

news.

Brian:

you gotta

Alex:

dropping a lot of news.

Alex:

Well, I'm rushing through it because you sound like you're totally disinterested, Troy.

Alex:

So I was trying to speed

Troy:

don't, Hey, hey, hey, hey, hey,

Troy:

hey.

Troy:

Don't, don't do that.

Troy:

Don't you need to talk to your guy about that?

Troy:

Don't do that.

Troy:

That's dysfunctional.

Troy:

Listen, what was the name of your previous studio?

Troy:

UUU, uni Union

Alex:

Universal entities.

Troy:

Universal Entities, sorry.

Troy:

Shortened to Unient.

Troy:

Okay.

Troy:

And now the new one's called Human something.

Alex:

Human Computer.

Troy:

Isn't that taken?

Alex:

No, we, we own Human Computer LLC.

Alex:

It's ours.

Troy:

Oh, good for you.

Brian:

Does it matter if you get a URL anymore?

Brian:

I called it The Rebooting because I couldn't get reboot.com, and now I

Alex:

Well, the URL is a different thing.

Alex:

I mean, we, what we needed.

Alex:

So our URL, if anybody wants to visit it, is humancomputer.io. But, yeah, we registered the company and it was all available.

Alex:

so we're very excited.

Brian:

When will I be able to play this game?

Alex:

Q4 2024.

Brian:

And do I need to be an advanced gamer to be able to play it?

Alex:

No, our first game will not be for advanced gamers.

Alex:

It'll be for noobs, noobs like you.

Brian:

That's well, I'm not new.

Brian:

I I did Asteroids and,

Alex:

come a long way since Pac Man.

Brian:

I got a, a Ms. Pac Man, excuse me.

Alex:

Oh, the better ones.

Alex:

Yeah,

Brian:

Yeah, my mom got addicted to Ms. Pac Man once during our childhood.

Brian:

It was like a very brief period,

Brian:

she was playing it like non stop.

Alex:

The genius thing about Ms. Pac Man is that they essentially repackaged a very similar game. They just put a bow on top of Pac Man and called it Ms. Pac Man.

Alex:

It's incredible.

Troy:

They changed some stuff, like the treats were different.

Alex:

The treats were different.

Brian:

It's a great game.

Brian:

okay, so, a lot of optimism, I guess some optimism.

Brian:

Not a lot, not as much fear and loathing there.

Alex:

I think independent studios, and it's the same thing as everywhere in media, smaller companies or individuals that have a low capital-intensive structure, are pretty flexible, are distributed, and have a bit of time to figure out what's going on, are seeing this as an opportunity. You know, you have more access to talent, you get kind of this renewed focus.

Alex:

If you're, a big studio with like hundreds of people or your cost structure is through the roof, then you're struggling a lot more.

Alex:

So, we're seeing similar trends, in gaming.

Alex:

So, but most of the people we hang out with are in the smaller space, independent games, and of course, you know, Apple and Netflix and the big platforms are here, talking to these small publishers, because you never know where the hit's going to come from.

Alex:

It's been

Troy:

Is there, was there a lot of AI talk?

Alex:

Yeah, there's been lots of AI talk.

Alex:

Ubisoft, I think, also demoed something where they had fully AI-generated characters that would have conversations with you.

Alex:

They would fully generate it.

Alex:

So they're starting to demo agents.

Alex:

There's some anxiety around the amount of junk that we will see on game stores, with AI just being able to generate lots of trash games that can then flood the market.

Alex:

But if there's 40,000 games being released every year on Steam, for example, we're really competing against 0.1 percent of that, because most of that is just trash that nobody plays.

Alex:

It's not to disparage the work of people doing games that aren't successful, but a lot of these are essentially just Unity asset rips that you just package and try to sell for $2.99.

Troy:

Hey, Alex, now that Brian has given in and given me a subscription to The Rebooting, can I get a free copy of the game?

Brian:

I'm going to pay you for your game because I value your labor, Alex,

Alex:

Thank you.

Alex:

I appreciate that.

Alex:

Well, it depends. If we're on Apple Arcade or Netflix, it might just come with your subscription. So, you know, it might be free.

Troy:

Oh, great.

Troy:

So yeah, I'll get it that way.

Troy:

Thank you.

Troy:

I don't get Brian's product as a bundle, some kind of bundle, but maybe in the future, who knows?

Troy:

Optimistic.

Alex:

Hey, Brian, do you want to, do you want to bundle your, your stuff with my game?

Alex:

We can be the next New York Times.

Alex:

You got games, you got insightful content.

Brian:

SEO content and whatnot.

Brian:

Okay.

Brian:

Speaking of SEO, I don't know if you

Troy:

jamming a lot of shit behind the paywall, Brian.

Brian:

Well, I mean, I, I need to give tangible benefits to people who support the work, those who pay.

Brian:

And there's, there's different ways to

Troy:

feel molested by the thing.

Troy:

It's filled with ads and now it's behind the paywall.

Brian:

Now the dual revenue stream is, I've never gotten a complaint about an ad.

Brian:

They're very tasteful.

Brian:

I think they're additive.

Brian:

Did you guys see the OpenAI CTO Mira Murati interview?

Brian:

I guess this stuff, it speeds up so much that I think it might even be out of the news cycle at this point because there's so much stuff.

Brian:

But basically, she was being asked in a Wall Street Journal video interview about where the data came from for Sora.

Brian:

I mean, it's a video AI, a video generator.

Brian:

We all know where video is on the internet.

Brian:

It's on YouTube.

Brian:

And she's basically asked, does it come from YouTube?

Brian:

And she got this funny look on her face that quickly became a meme because she's, you could see the lawyers seizing, seizing control of her.

Brian:

And she was like, I don't know, which is kind of funny.

Brian:

This is going to be a major issue for these companies.

Troy:

She makes the world's most complex machines, but sadly she skipped PR training.

Brian:

Yeah

Alex:

Troy, right?

Alex:

Like, you felt the same way I did, I guess. When I saw this interview, I was like, how come, how does she not have an answer to that?

Alex:

Like a perfectly, constructed answer to that question.

Alex:

They knew it was coming.

Brian:

Were you PR trained at all, Troy?

Brian:

Or did they just give up?

Brian:

Or

Brian:

I'm trying to imagine that.

Alex:

It's kind of, you get this dog from the pound that keeps pissing on the floor, and it's kind of a lost cause.

Brian:

Poor Allison Keene.

Brian:

Allison, send me a note.

Brian:

I want to know

Troy:

Yeah, of course I've been PR trained many, many times.

Troy:

Can't you tell in my polish?

Troy:

she was a little sloppy.

Troy:

I think that it also shows maybe that we're at this sort of in-between time when nobody really knows what to say about fair use; the whole market is struggling with what's the right answer to the question.

Troy:

Which is, like, you gobble up the internet to train your model, and somehow it seems more poignant when it's focused on video, which was the

Brian:

Well, it's more obvious, right?

Brian:

I mean, a lot of Silicon Valley in business, as far as I can tell, is based on plausible deniability, right?

Brian:

And you can plausibly deny a lot of things. When you build on video, you can't plausibly deny that you're using YouTube video.

Brian:

It just doesn't pass any kind of smell test by any sentient human being.

Troy:

I think we're moving into this long period of figuring out how to price human effort around content.

Troy:

And it's going to take a while, right?

Troy:

So we, we don't know.

Troy:

I think maybe partly because we don't know how much the other side of it's worth, meaning, what's the value of that content to the creator of an AI platform.

Troy:

It's hard to price the inputs.

Troy:

And it's going to take years to figure out how much we pay for farm-to-table versus industrial content production via an AI bot, how pricing differs between different types of content, and the limits of fair use in a use case that, frankly, wasn't conceived of when fair use was created.

Alex:

There are a couple of like interesting things here.

Alex:

One is it looks like this stuff's gonna have to go to court until somebody has an answer, because nobody wants to make a statement, right?

Alex:

Not even Sam Altman or the CEO; they don't even have a big statement that says, this is what we believe.

Alex:

It's, it's, it's really interesting.

Alex:

And

Brian:

I don't understand why they don't just come out and be like, look, we believe in fair use.

Brian:

We have an incredibly expansive definition of fair use.

Brian:

Anything that is public data is fair use to train on.

Brian:

Obviously, YouTube is public data.

Brian:

We train on that.

Brian:

The New York Times, public data, we train on that,

Brian:

etc, etc.

Brian:

If you want to sue us, fine.

Brian:

But this is what we do.

Alex:

Right, but think about the audience here.

Alex:

They could say that, and maybe YouTube could have even said that when they were showing clips of CNN on it in the beginning, or somebody playing a song.

Alex:

I think those were corporations fighting corporations here.

Alex:

If OpenAI comes out and says, yeah, of course we rip YouTube.

Alex:

We think it's fair use.

Alex:

Now you have thousands of YouTube creators.

Alex:

Right.

Alex:

You and me, everyone, looking at OpenAI as ripping off the content of people.

Alex:

This is not YouTube versus Google.

Alex:

So I think part of the timidity, that's my hypothesis, is the fact that if they go out and say that, they're not going to make a lot of friends.

Brian:

Okay, we'll get a better answer before they're hauled in front of Congress, because that is where this is going.

Brian:

They're getting hauled in front of Congress.

Brian:

Congress loves to do that.

Brian:

They love to get the clips out.

Brian:

And

Brian:

so

Troy:

You know, Brian, but Sam Altman performs so well in the PR ring that it's hard to compare the two.

Troy:

I thought he was unbelievable.

Troy:

He did another Lex Fridman this week.

Troy:

And he's so fast.

Troy:

And so he's really transparent, actually, but always has seemingly the right answer at his fingertips, you know. But did you listen to it?

Troy:

There was a question where Lex asked Sam if he should be trusted with the kind of terrifying potential of AGI.

Brian:

Was he like, no?

Troy:

He's like, you know, we need governance, and I'd like to think that I could be trusted, that I've shown that I'm trustworthy, but you can't rely on a single person ever.

Troy:

The system has to create, checks and balances.

Troy:

So

Brian:

So let me ask you this.

Brian:

How important is it that he be trusted, that a lot of the people who are building this be trusted?

Brian:

Because look, Musk is out there.

Brian:

He's open-sourcing Grok, and he's suing OpenAI for dishonesty, and they've got a name problem, right?

Brian:

He's trying to redefine the "open" in OpenAI, and a lot of that's esoteric.

Brian:

Okay.

Brian:

To regular people it's esoteric, like tech talk, but I think it does speak to... let's be real.

Brian:

I mean, we saw this with The Social Network and it didn't dent Zuckerberg. There's a lot of playing fast and loose with rules that's baked into the system. It's not even ask for forgiveness afterwards.

Brian:

It's just: take what you can, and then deal with the consequences down the line when you have enough lawyers and whatnot.

Brian:

And honestly, I think YouTube sold to Google because they just needed the lawyers. Does it matter if Sam Altman is, quote unquote, trusted?

Troy:

It's ridiculous to think that. A CEO of a, you know, important technology company needs to be trusted on some level, and on some level, the governance systems within that company; particularly, there's cascading systems of governance, right?

Troy:

So like the board has to be a check for Sam Altman as does his management team inside of the company.

Troy:

But as we've seen, the board can be moved with money, effectively.

Troy:

And so to me, they have to live in a bigger system of governance, which is regulation at some point, and the question is when and how. And I think that the open source idea is an important check here.

Troy:

It's that if you imagine that this is actually an era-defining kind of technology, if it is anywhere near something that is sentient, or the power that can kind of replace humans.

Troy:

Surely, it can't be dependent on an individual or a company, it's got to be a societal asset, it's like discovering DNA.

Troy:

You know, it needs to be part of... I mean, now people will take advantage of it in different ways to wring commercial value out of it.

Troy:

I think that's fine, but I think that, that the open source idea is really important here.

Troy:

I really do.

Troy:

You know, it's the same with all stuff with Elon.

Troy:

He's like a fuckhead, but he always, often has something important to contribute to the discussion.

Brian:

That's a great review of him. Well, let's move on to a different topic.

Troy:

Why, what are you talking about?

Troy:

Alex isn't going to bite on that?

Brian:

No, I think he's weary of the Elon stuff.

Brian:

I don't want to talk for

Alex:

you know, I don't want to say anything about Elon because whenever I criticize him, I'm just called, what was it?

Alex:

Derangement syndrome.

Troy:

Elon derangement syndrome?

Brian:

first and then, and

Brian:

then

Troy:

No, naive is for the different reasons for that.

Brian:

naive and deranged,

Alex:

I think Elon, I don't think it's particularly genuine, because if he ended up having OpenAI merged into Tesla, he wouldn't be so upset about it.

Alex:

However, irrespective, I'm finding him deeply uninteresting and I'm starting to get a sense of what this guy's about. The OpenAI stuff, the thing that's kind of quirky about it, is that it's such a messy structure.

Alex:

And it's, it's a lot of promises from, as we know, human beings who are fallible.

Alex:

It could be like the most important company of this generation.

Alex:

And yet we don't really fully understand how it's organized and, and who runs the show.

Alex:

So I think that's just messy.

Alex:

I would like it to be just a little bit more transparent.

Alex:

I don't particularly need it to be open as such.

Alex:

That being said, I think most of these models are going to be pushed towards open source.

Alex:

Apple's efforts seem to be entirely open source, because they know that the value that they'll bring, and that they'll take out of their customers, is not going to be the LLMs.

Alex:

So I expect a lot of this stuff is just gonna go open source.

Alex:

Now, when it comes to AGI, it's all hypothetical, right?

Alex:

There's, like, a conspiracy theory on the internet now that AGI exists and it's controlling Sam Altman to get integrated into Microsoft to take over the world, right?

Alex:

Like

Brian:

really?

Brian:

I like that.

Alex:

Yeah.

Troy:

That's a good one, but you gotta give him credit, because, I mean, to me, Sam Altman sees ahead of things, and he understood that. You know, he said at one point in this podcast that in the future there won't be a kind of governance capacity inside of the company, whatever you'd call it, safety, and what would the function be?

Troy:

Trust and safety.

Troy:

He said the whole company is trust and safety.

Troy:

At some point, like 90 percent of what we'll do as a company is manage the broader impact of our technology on society.

Troy:

Also remember that Sam Altman took no equity in the company.

Troy:

Why?

Troy:

Both anticipating that either, A, he was already wealthy enough; that, B, this would give him an incredible platform of influence, and that no one could accuse him of profiteering off of something that should have societal benefit; or, C, that he could find other hustles off of being Sam Altman that would create new investment opportunities.

Troy:

And it really didn't matter because he would be wildly rich through the whole process.

Troy:

So it's, there's a lot, a lot, a lot of stories to tell around all of this, I think.

Troy:

Who

Brian:

Yeah.

Brian:

I hope there's a movie.

Alex:

yeah.

Troy:

would play him?

Troy:

Michael Cera?

Brian:

Yeah, that's actually a really good one.

Brian:

I would like that.

Alex:

I think the interesting thing right now, though, with the content being ripped from YouTube, is that I don't expect us to hear anything from YouTube, because they're also facing the same issues.

Brian:

It's like the Spider Man meme.

Alex:

they're, they're kind of, yeah.

Alex:

Yeah, and they're kind of all, like, it's some sort of standoff at the end of a cowboy movie.

Brian:

Well, meanwhile, in my like private dinners that I do in between webinars, there, there's a lot of discussion of AI and payments and, and

Troy:

Are we moving into a conversation about middle management?

Brian:

Well, yeah, but most publishers just throw their hands up and

Alex:

Let him segue man.

Alex:

What the hell he was just

Troy:

That was a, that was a segue joke.

Troy:

It was a great joke.

Alex:

It was a great... You're just like the guy that speaks over the comedian just before the punchline because you see it coming. Yeah.

Brian:

In your media training, did you get told on the panels not to be like, oh, well, that's like what we discussed in the pre-call?

Brian:

I was like, no shit.

Brian:

That's why I'm bringing it up.

Brian:

anyway, let's just talk about middle managers.

Brian:

They're all getting offed.

Brian:

And Troy wants data, but Bloomberg crunched the data.

Brian:

Look, I know nobody cries for the middle managers, but then there are studies that show that middle managers are necessary.

Brian:

And so, Bloomberg crunched the

Alex:

Whatever the case, they're people and they're an important part of the economy right now, you

Brian:

Yes.

Brian:

You pull out the middle managers and you've collapsed the post-World War II sort of white-collar professional class, because that's what it is.

Brian:

Like very few people for all this talk of the entrepreneur, entrepreneur, entrepreneur.

Brian:

It's a very small group of people.

Brian:

And, you know, most people do not have the, the wherewithal, the risk profile to go off on their own.

Brian:

And,

Troy:

Well, it's not just entrepreneurs and middle management.

Troy:

There are other roles in the world.

Brian:

Yeah, there's individual contributors and there's, craftsmen, there's trades,

Troy:

Mm hmm.

Troy:

Discipline leaders.

Troy:

There's, experts.

Troy:

There's,

Brian:

leaders?

Troy:

Well, it's kind of

Brian:

sounds like something a middle manager calls themselves on LinkedIn.

Troy:

It's more defined by mentoring and expertise than by coordination.

Alex:

Right, but it's like four of those, in a company of a thousand.

Alex:

Sure.

Alex:

And I'm, I'm not sweating, but yeah, no, I mean, it's, it's really brutal.

Alex:

We're seeing it in tech here.

Troy:

Is it, brutal?

Alex:

Yeah, because I think it became... even when I was at Airbnb, I could see this kind of environment where the company wanted to focus and become much better at prioritizing. Troy, you've been in companies, you know how hard it is to prioritize; companies always do too much, right?

Alex:

But then you look at your organization, you say, well, of course we have all these like smaller leaders and they all want to own something.

Alex:

And it creates an organization that's working on a hundred different things at the same time and not moving in unison.

Alex:

And so you start saying, well, how do we change that?

Alex:

It's like, okay, well, it's to remove these leaders and focus on more senior leadership that runs a team of people who make shit, right?

Alex:

Whether it's, whatever it is you're making.

Alex:

And that was a trend that started for me then, I think, and now it's just been accelerated, because economics kind of, you know, forced us to rethink it, because technology is showing us that... I think working remotely made it happen.

Alex:

Like, wait, wait a second.

Alex:

You could have, everything's happening over Slack.

Alex:

People can kind of just get their work done.

Alex:

Do we need all this?

Alex:

Do we need all these layers of abstraction between the person making decisions and the person making the thing?

Troy:

who are we crying for right now?

Troy:

Listen, I know you guys want me to be the sort of callous

Brian:

Of

Brian:

course,

Brian:

no,

Troy:

argument.

Brian:

you're volunteering.

Troy:

no,

Troy:

No, but let's get a little more specific, Brian. I know you like to identify with middle management, but, like, what

Brian:

Why don't I like to identify with the cubicle class?

Brian:

That's below middle management.

Troy:

okay, but cubicle class are people that actually do things.

Troy:

What is middle management to you?

Brian:

So I used to go around the editorial meetings at a place I worked and identify people who mostly wrote emails, or actually wrote stories.

Brian:

Thanks.

Troy:

volume.

Brian:

Well, if their main output is, emails and meetings.

Brian:

Yeah.

Brian:

You're a middle manager.

Brian:

I mean, Mark Zuckerberg, you know, when he was talking about his year of efficiency, and a lot of this cascades from Silicon Valley, it just does: managers managing managers, managing managers, managing managers, managing the people who are doing the work.

Alex:

I think you could define it as, like, a person a few levels within the organization that manages three to eight people, that has oversight over, you know, a tiny part of the organization. And, and

Troy:

Someone typed something into Perplexity.

Alex:

the thing is like that manager class became important when companies started scaling and you were bringing a lot of young people in that you wanted to kind of like put to work.

Alex:

And so you needed to set up this kind of, system of like, eight or something people to one manager.

Alex:

And then on top of that, the consultants would come in and create these org structures, and it started defining becoming a manager as a career progression.

Alex:

So you start as somebody who does something and then you start, you become somebody who manages a small group of people who do something.

Alex:

And that's what gets you to becoming a senior manager and a director and stuff like

Alex:

that.

Alex:

But as the companies grew, you started having more and more of these middle managers

Troy:

Right.

Troy:

So, but Jack Welch did the same thing when he whacked half of, you know, a bunch of people at GE years ago.

Troy:

So is this just part of a cycle?

Troy:

Are we in a tightening era, or is this a fundamental restructuring of business hierarchies, such that communication technologies, coordination technologies, and kind of changes in how we think about talent inside of companies are more fundamental?

Alex:

I think it's three things.

Alex:

Can I list them?

Troy:

Do you get to

Brian:

She's the best too.

Alex:

Number one is, like, one of the drivers is definitely economics and efficiency: companies want to become more efficient and they're cutting back what they think they can cut back. Number two is a kind of philosophy shift that happened during the pandemic when we started working remotely, where, you know, new technologies, new processes were created, where it became apparent that you could actually automate a lot of this stuff and you didn't need a lot of these layers in between.

Alex:

And number three is a technological one, where companies are already bracing themselves and setting themselves up for not only technology that will improve their efficiency dramatically, potentially,

Troy:

So, C.1 on that one, right?

Troy:

Like it's a wave of efficiency.

Alex:

It's a wave of efficiency, but on top of that, it's also an existential risk.

Alex:

There's not one company that says today, like, we don't know if AI is going to really help us thrive or completely destroy parts of our business.

Alex:

And so that is kind of creating this compression in organizations that's flattening the organization.

Alex:

And it's it's a confluence.

Alex:

So it's, it's happening at a bigger scale.

Alex:

It seems to be right.

Troy:

What's the positive here?

Brian:

Well, I think the positive is that it could break apart a lot of the soullessness of companies and force a lot of people, maybe against their will... but first of all, it will reward people who stayed closer to the work.

Brian:

And I think that is a positive.

Brian:

Overall, a good thing.

Brian:

And I think that we're going to have organizations that are smaller and by necessity, when you're in smaller organizations, you need less layers of management.

Troy:

Did we have middle management in the olden, olden days?

Troy:

Did the, the blacksmith have a middle

Brian:

I was just thinking of the blacksmith.

Brian:

No, the blacksmith had a, that's why we need to bring back apprenticeships too, by the way, that's an entirely different episode.

Brian:

We're not going to solve the middle manager thing on this episode.

Brian:

Let's go into reader questions.

Brian:

I want to get into listener questions.

Troy:

Yes, the rant line.

Troy:

Are we gonna keep calling this the Rant Line?

Troy:

I think the newspaper I worked at that owned the IP, the Rant Line, went outta business.

Troy:

So we may as well just take it.

Troy:

I like it.

Alex:

yeah.

Brian:

So on the Rant Line is Graham. Let's hear from Graham.

Graham:

Well, you were convinced no one was going to record one of these, but I'm guessing you've probably got dozens.

Graham:

Also, excuse the noise.

Graham:

I'm out walking the dog, which is something I usually do when I'm listening to your podcast.

Graham:

question for you all, I guess.

Graham:

I work in the digital learning space.

Graham:

I manage a huge learning management system for a global health company.

Graham:

I'd like to know your take on the future of human learning and how we teach each other things, how much AI is going to nose itself into that space and supplant people like me.

Brian:

Alex, do you want to help Graham out with forecasting the future of,

Alex:

Yeah.

Alex:

I mean, I think some of the most exciting demos of AI that feel most, I would say, useful today are AI-based tutors that adapt to your learning style, your level, because right now a lot of content needs to be created to make kind of curriculums for people at different levels, with a different pace, with a different learning style.

Alex:

AI is able today, I mean, even if you use ChatGPT, to create a learning model for you to learn nearly anything that is accessible over the internet.

Alex:

I have very specific ways of learning things and consuming information, and I had ChatGPT teach me about shaders, which is a type of, like, computer programming, and it did an excellent job.

Alex:

So I cannot see a future for, like, textbooks or something like that, where things are set, but rather you will have these specialized AIs that are into tutoring.

Alex:

The second thing is I'm working, you know, with various companies around AI and stuff like language learning is really powerful with AI.

Alex:

Because it can not only speak to you, but, with multimodal, it can review your pronunciation.

Alex:

It

Troy:

Mm-Hmm.

Alex:

catch mistakes.

Alex:

It's

Alex:

incredible.

Troy:

you could get an American accent.

Alex:

Never.

Alex:

I've been trying.

Brian:

You should get a different accent.

Brian:

You get like a South African accent.

Brian:

That

Troy:

What was the gentleman's name that asked the question?

Troy:

First of all, I really appreciate him asking.

Troy:

Thank you for hitting the rant line.

Troy:

One little note from hustle culture, right?

Troy:

This is a hustle culture corner.

Troy:

One thing I noticed today for all of you is that if you're stuck, go for a walk because there's nothing like walking to free the mind, just saying.

Troy:

and,

Troy:

and,

Brian:

Are we in the good product

Troy:

no, no, no, no, no, no.

Troy:

But the question asker, who's... what was the name again?

Troy:

Graham.

Troy:

Graham was walking his dog when he asked the question.

Troy:

yeah.

Troy:

And I just said,

Brian:

us when he walks his dog.

Troy:

Yeah, and I think it's a great time, walking is, 'cause I went out thinking about Kyle joining the show today and what I was going to ask and all that.

Troy:

I went for a walk and it was just like, boom.

Troy:

That's good.

Troy:

I got it.

Brian:

he's about here.

Brian:

We're gonna be talking with Kyle Chayka in a few

Troy:

Can I answer the question now?

Troy:

is it my turn, Alex?

Alex:

You did not like my answer.

Troy:

I loved your answer.

Troy:

Why are you so insecure about everything?

Troy:

Jesus.

Troy:

Okay.

Troy:

Listen.

Brian:

I think.

Troy:

Okay.

Troy:

So I got to say big picture.

Troy:

Great.

Troy:

Your platforms will be fine.

Troy:

It would seem to me that more and more education is moving to management systems and all that.

Troy:

But like, I'm totally bearish on formal education.

Troy:

Right now, admittedly, educational environments are great for maybe mentorship, slash instruction, community, definitely, certification from a brand or affinity or association with a brand, but like you can learn anything.

Troy:

If you're disciplined now, you can really learn anything.

Troy:

You can learn it with, between groups of people; you can learn it from YouTube; you can learn it from online courses.

Troy:

I think people will go to college for a long time, because there's lots of other reasons to go other than to learn, but you absolutely don't need to sit in a classroom.

Brian:

honest with you, were you a good student?

Troy:

When I wanted to be.

Brian:

Oh, that's a, that's a no.

Brian:

Which is fine, I think it fits your brand.

Brian:

and that's the most important thing.

Brian:

But I think that the, extra benefits of, college are completely overrated.

Brian:

I don't get it.

Brian:

Because outside of the very elite colleges, it's not like the people I went to college with.

Brian:

I mean, no offense, Jack and everyone I hung out with at Old Eagles.

Brian:

But I don't really think that it was like super beneficial to my career or anything.

Alex:

I mean, insofar as maybe it, it acts like a country club where you meet other people and it gives you that cachet, right?

Brian:

Providence College?

Brian:

I don't

Troy:

Alex is one of the smartest, most resourceful people I know.

Troy:

Did you even go to high school?

Brian:

Did you go to high school?

Alex:

high school.

Brian:

Really?

Troy:

No, that's why he, uh,

Brian:

high school?

Troy:

so testy.

Alex:

I didn't finish high school.

Alex:

I didn't go to college.

Troy:

with us highfalutin educated folks.

Alex:

Yeah, exactly.

Alex:

And I probably missed out.

Alex:

but just like an experiment, folks, for the listeners.

Alex:

Okay.

Alex:

If you have a ChatGPT Plus subscription, 20 bucks a month, if you're interested in this, it's worth checking out, and Perplexity as well, but ChatGPT has a voice mode.

Alex:

You turn it on while you're in traffic or walking the dog, and you start a conversation with ChatGPT in voice.

Alex:

Now, there's still a little bit of a delay between answers.

Alex:

That's, that's the kind of the biggest gap that they need to cross to make this really, truly useful.

Alex:

But you can start a conversation and get ChatGPT to start teaching you any subject.

Alex:

Try it out.

Alex:

Say, I am this, I want to learn that, help me run through a set of exercises that will teach me this, right?

Troy:

Incredible

Alex:

And it is incredible what you get just out of the raw output of ChatGPT. It is not

Troy:

What a tip!

Troy:

What a tip from Hustle Corner!
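For anyone who wants to take Alex's tip beyond the app, here is a minimal sketch of the same "teach me this subject through exercises" prompt pattern, written against the OpenAI Python SDK. The model name, system prompt, and example subject are illustrative assumptions, not anything specified in the episode.

    # Minimal sketch of the tutoring prompt pattern Alex describes.
    # Assumptions: OPENAI_API_KEY is set in the environment; the model
    # name is a placeholder, not a recommendation from the episode.
    from openai import OpenAI

    client = OpenAI()

    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder; any chat-capable model works
        messages=[
            {
                "role": "system",
                "content": "You are a patient tutor. Teach in short, progressive "
                           "exercises, check my answers, and adjust the difficulty.",
            },
            {
                "role": "user",
                "content": "I'm a beginner game developer. I want to learn GLSL "
                           "shaders. Run me through a set of exercises.",
            },
        ],
    )
    print(response.choices[0].message.content)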

Brian:

Corps, let me just, one thing though, I hate to be the, the negative here, but we all have roles to play.

Brian:

What about being like well rounded, right?

Brian:

Like where I went to like college, it was a liberal arts college, and I know that's like anathema to the tech people.

Brian:

Providence College, is in Providence, Rhode Island.

Brian:

It's like when I went, when I tell people I went to school in Providence, they're like, Brown?

Brian:

I'm like, no, across town, but not Bryant and not Salve Regina.

Troy:

What about RISD? Why didn't you go to RISD?

Brian:

RISD?

Brian:

Oh, the RISD kids were cool.

Brian:

I wasn't cool enough to go there.

Brian:

I mean, they could draw and stuff.

Brian:

Alex

Troy:

would have gone to RISD.

Brian:

Oh, totally.

Brian:

what's his name?

Brian:

Brian Chesky went to RISD.

Alex:

went to, yeah, Brian and Joe Gebbia from Airbnb

Brian:

Yeah.

Alex:

to RISD.

Brian:

but we, we studied like the development of Western civilization and we, read books and all this other stuff.

Brian:

Won't this just lead to people being less well rounded in many ways?

Brian:

I mean, it's it's one thing to pick up the skills and, or is that just the fate?

Brian:

Like, people are going to be less well-rounded.

Alex:

sure.

Brian:

Definition will be

Alex:

Yeah, I think that there is... Computers and access to everything without any friction have a tendency to make us more insular.

Alex:

But at the same time, you could see it as like a recovery of our time, and spending our time in more meaningful ways.

Alex:

If we make space for it, like, individually, as communities, as a culture.

Alex:

I'm not saying it's gonna happen, but I don't think school or the education system as it is today helps us be well-rounded.

Alex:

People end up in debt, people end up having spent four years of their life on something that they're probably never going to use.

Alex:

A lot of kids are not, including me; I'm not made for the education system.

Alex:

I didn't think school made me more well-rounded. I think school, like, felt like it was slowing me down. Hey, maybe in the future we all work four days a week.

Alex:

School is about being outdoors and learning how to build shit with your hands.

Alex:

And then we spend two hours at the computers and become geniuses.

Alex:

I don't know, but it's going to have to be all these things put together,

Troy:

I like

Troy:

that.

Troy:

It's a good idea.

Troy:

You should build a school around that

Brian:

Yeah.

Brian:

in the morning you do like survival skills and then you sit at a computer for a couple hours.

Alex:

Yeah.

Alex:

Oh, Kyle just joined us.

Alex:

Hey.

Troy:

in the bathroom.

Brian:

Well, he said he was there.

Brian:

There's drilling going on.

Brian:

This is New York.

Brian:

Welcome to New York.

Troy:

Hi, Kyle.

Kyle:

It's You wouldn't want to hear elsewhere, I don't think.

Troy:

but just out of curiosity, is there a chair in the bathroom or are you sitting where I think you're sitting?

Kyle:

I moved a chair into the bathroom.

Brian:

Good.

Brian:

It's keeping it classy.

Alex:

We really appreciate it, Kyle.

Alex:

Thank you.

Alex:

Thank you for doing that.

Brian:

So Kyle, you're a writer for the New Yorker and you just published, this is your second book, right?

Brian:

Or have you written more?

Kyle:

Yeah, just a second.

Brian:

So it's Filterworld, and it's about how algorithms flatten our culture.

Brian:

This is, this is something you've been covering for a few years now, right?

Kyle:

Yeah, I feel like since 2015 or 2016, probably.

Brian:

Okay, so for those who have not read the book, what is your thesis, in basically two minutes?

Kyle:

the thesis is all in the subtitle of the book, which is how algorithms flattened culture.

Kyle:

And I think the overall argument is basically that all of our experiences being so dictated by algorithmic recommendations and feeds on digital platforms has made us more passive consumers, who don't know where content is coming from and care less who makes it. And the same situation pressures creators of that content, whether it's video or music or visual arts or writing, to kind of conform to the aesthetic standards set up by those algorithmic feeds and the platforms that they function on.

Brian:

Got it.

Brian:

But has that impacted you specifically?

Brian:

And like how you go about, you know, building your own taste?

Brian:

'Cause I understand the point, I just wonder whether it's like an all-or-nothing thing, and how you personally manage the algorithms in your life affecting you.

Kyle:

it definitely does affect me.

Kyle:

I think anyone who made their career on the internet feels the pressure of algorithmic feeds and kind of catering to the standards of them or not.

Kyle:

as a writer, I've definitely felt the pressure of talking to Twitter, optimizing my posts for a Facebook feed or for an Instagram story.

Kyle:

and then as a consumer, I mean, I kind of wrote the book to become more aware of how it was impacting me.

Kyle:

I think I let algorithmic recommendations take over a lot of what I was consuming, particularly in a music context, because that's an area where I maybe don't do enough work for myself.

Kyle:

So I was, you know, letting Spotify shape a lot of what I consumed.

Kyle:

And so, in the book and over the course of doing all this writing, I tried to step back from that and find other ways of consuming culture, like music, art, books, whatever.

Troy:

this is maybe the segment of the podcast where we can hate on Spotify for a minute.

Troy:

I have a rant about that later.

Troy:

I, but you made me think of something, on my, walk today, Kyle.

Troy:

And I was thinking, like, because one of the things that interested me is kind of where this goes next and where it would go.

Troy:

If you had to say in it, and, but I was thinking about this, this thing that I had read that.

Troy:

I guess, I see this podcast is called People vs. Algorithms, but maybe you're happy about that.

Troy:

You've landed in the right place, Kyle.

Kyle:

Very apt.

Kyle:

Heh

Kyle:

heh.

Troy:

and then I was thinking that algorithms are this kind of emergent global brain, right?

Troy:

Like they're the building blocks of the virtual world.

Troy:

and I was like, it's like they're beginning to assemble the next frame of our reality.

Troy:

Meaning, like, and yeah, it starts with a fucking Taboola text recommendation, but later it's like Alex and his VR headset.

Brian:

There's a

Troy:

he, the, the, the,

Brian:

and your VR, Alex.

Troy:

And then, but what I remember is hearing this thing that people had thought, that Mark Zuckerberg had seen the future clearly when he bought a bunch of Nvidia chips to build spare capacity, anticipating the demand for AI to power Reels feeds, right?

Troy:

Meaning that recommendation systems in a recommendation culture were actually eating an enormous amount of compute.

Troy:

And I used to just think of them as like, Oh, we got to recommend the next article in our CMS.

Troy:

We need a little script to do that.

Troy:

But really, they're the things that power the connection between everything, whether it's an additional piece of content, or it's the product you're going to buy next, or the algorithm that fuels the customer service app that's helping me do anything with my bank or something like that.

Troy:

So it really is this kind of layer of automation is what I was thinking.

Troy:

And I know you acknowledge that they can be a force of good or a force of evil or kind of both.

Troy:

and I was wondering if it just struck me that it was almost like a layer of cultural automation had been created.

Troy:

Right.

Troy:

But now what we could do is live a layer above that, just like we live a layer above automation, where someone makes our mac and cheese at the Kraft plant.

Troy:

We don't have to make that with our hands anymore.

Troy:

And it reminded me because I need to bring in Canada every time I talk on the podcast call.

Troy:

It reminded me of living in Regina, which is the city, the great city where I'm from.

Troy:

It's named after the queen.

Troy:

and we didn't have scenes like big cities.

Troy:

We didn't have like the cool, Williamsburg scene.

Troy:

But we had this ambient kind of din of culture from around the world, mostly delivered to us from like the CBC, and in particular, our late night program called Brave New Waves.

Troy:

And so we took that in.

Troy:

That was our kind of cultural, feeding tube.

Troy:

And then we went into our rec rooms in our suburban homes and we made it our own.

Troy:

and so I guess I'm going on here, but I'm wondering if it's just like, we're shifting cultural production to a new layer because I'm super happy right now on the internet, Kyle.

Troy:

And

Kyle:

Good.

Kyle:

Good for you, personally.

Kyle:

Heh Heh heh

Kyle:

Heh heh.

Troy:

I know, but I'm like, I just find a lot of shit that interests me.

Brian:

Yeah, why not just give it over to the feeds?

Brian:

I think it's, like, sort of... what I wonder, and this would be my challenge on Troy's thing to you, Kyle, is: isn't this just kind of the repricing of connoisseurs?

Brian:

Like being a connoisseur takes a lot of time and effort.

Brian:

And now there's this cheat code where you can sort of pass yourself off as having taste without doing the work.

Kyle:

I think that's very true.

Kyle:

I think the automation thing is very true.

Kyle:

And I think the connoisseurship or taste thing is very true.

Kyle:

It's like an automation of taste, of expertise, of cultural connoisseurship.

Kyle:

And sure, you can give yourself over to the recommendation algorithm.

Kyle:

You could theoretically watch whatever comes up on your Netflix homepage.

Kyle:

I don't think you'd be very happy,

Troy:

We could give ourselves over to the DJ, right?

Kyle:

for sure.

Kyle:

But the difference in that scenario is that the AM radio DJ was a person who had thoughts about what they were putting on the radio.

Kyle:

Theoretically, like, if the radio DJ was a robot, then sure.

Kyle:

I like that idea of another person existing in the same reality as you, maybe being in that town, being Canadian, period, and kind of speaking to your perspective, your identity, your life, curating something for you.

Kyle:

Though I do think, it's underrated how much work this stuff takes and how much labor curation needs.

Kyle:

Like automating that is very tempting and I totally get

Alex:

If, because Troy is a person of a certain age with a brain that's fully formed, or maybe even in decline, but to me, the impact of algorithms is really age-dependent. If it was up to me, I would not allow anyone under 14 to be exposed to algorithms, because I don't think you have fully fledged taste and thoughts, and it's so easy to manipulate, and we're so malleable then, and you see it, right?

Alex:

Like, you see CoComelon, this cartoon on Netflix, that is essentially algorithmically created. They field-tested it with kids and edited it as soon as kids lose interest. Or putting a kid in front of TikTok and watching. And I think that Troy maybe is beyond that because he was living in Regina with a single radio station. But if you fed this on...

Alex:

Do you think that has more of an impact?

Alex:

Do you think age has an impact here?

Kyle:

Yeah, I mean, you're more impressionable when you're younger and your attention is more liable to gaming.

Kyle:

I mean, maybe it's always liable to gaming, and influencers use lots of tricks to game our attention too, if we're looking at them. But for the younger generations, it's like they don't have the chance to exist outside of that recommendation system; like, they don't experience that scarcity of content that might come when you only have the one radio station.

Alex:

Right.

Kyle:

and so we like live in this surplus of opportunities to consume and content that's optimized for us to pay attention to it.

Kyle:

But it feels like as the optimization has gone up, the like long term value has also gone down in a way.

Kyle:

You can watch a TikTok feed for an hour and gain absolutely nothing from it.

Kyle:

Like it's just, it's

Brian:

Yeah.

Brian:

I wonder if people, if people are hyper aware, like I'm like aware of when I feel like I've been manipulated by an algorithm and I just don't know if people focus on that.

Brian:

Like I went to dinner last night at Ochoa and I was like, wait a second.

Brian:

This is a TikTok place.

Brian:

I'm like, everyone here is 20 plus years younger than me, and they're all like taking TikToks.

Brian:

I was like, this must be like one of these viral TikTok places.

Brian:

And I was like, oh shit, did I get manipulated into coming here by something that then got passed on to like Reels?

Brian:

But

Troy:

but haven't you always been manipulated by media, Brian?

Troy:

I mean, if you weren't... So the algorithm is just our collective conscience. It's not, oh, the tech bros in San Francisco; it is the energy of the collective, right?

Troy:

So yes, is it better to be, like, some goofball at CGM in Regina who's doing top 20 radio, or is it better that we're absorbing culture from a collective of people whose tastes are similar to our own?

Troy:

And by the way, I think it's a brilliant thesis to argue, and a very important one.

Troy:

I'm not saying that one is better than the other.

Troy:

I'm just saying my kids don't read movie critics anymore, and do they need to?

Kyle:

I think, I mean, we, we've kind of turned a corner in a way.

Kyle:

I don't know that the, the print movie critic is coming back anytime soon.

Kyle:

But what I do find interesting is like how criticism and curation and connoisseurship even have moved into these new spaces.

Kyle:

So it's not just like the algorithmic feed is suggesting stuff for me.

Kyle:

It's like new waves of creators and interesting people are finding ways to communicate their ideas through those platforms.

Kyle:

I watched a million TikTok video essays on Dune, like, breaking down what happened in the movie and how it relates to the books and all of that stuff.

Kyle:

So I don't think, the algorithmic landscape is devoid of intelligence or expertise, but you do have to seek it out in a different way, I think.

Alex:

I think there's also, it's so much more potent that the algorithm actually ends up creating culture, as much as we keep talking about recommendation.

Alex:

It's not like culture is happening, blooming all over the place.

Alex:

And then the algorithm goes and finds what humans are creating.

Alex:

The algorithm is creating the content.

Alex:

Everybody does that stupid face on YouTube.

Alex:

You know, the stupid Mr Beast face, right?

Alex:

Like the tone of voice, the music, the edit is all being generated to fit into the algorithm.

Alex:

And at some point I wonder, like, how much of that is recommendation versus just generation.

Alex:

And this is where I think you talk about this flattening of culture

Kyle:

Yeah, I mean, I think that's the pressure.

Kyle:

So as consumers, like sure, it's great that we can get our consumption curated by an algorithm, but then the creators also have to fit themselves into that mold.

Kyle:

of whatever works on a platform or whatever has proved effective on a platform.

Kyle:

So I think that's why you get the YouTube face, the TikTok voice.

Kyle:

There's this hilarious video of a nurse or like a doctor on TikTok mimicking the hand gestures

Kyle:

that

Brian:

TikTok voice?

Brian:

Because I know what you're talking about, but I'm sure some people don't specifically know.

Brian:

Yeah.

Kyle:

It's like a very staccato monotone. So it's something like: I went to dinner in Soho last night, and I had to get the pizza at this restaurant, because I saw the pizza so often on my TikTok feed.

Troy:

Yeah,

Kyle:

going and going and going.

Troy:

That was, that was great, Kyle.

Troy:

There's, there's also an NPR voice, you'll notice, which is, which is a good one.

Troy:

Hey, can you explain what the difference is between Filterworld and filter bubbles?

Kyle:

Yeah.

Kyle:

I mean, filter bubble, I was surprised to find is like an idea from quite a while ago at this point.

Kyle:

Like, it feels so relevant, but it was coined, I think... I don't want to misquote it, but it was either the late 2000s or early 2010s.

Brian:

founder.

Kyle:

Yeah.

Troy:

No, it was, it was Cory Doctorow.

Brian:

Oh, I thought it was, uh, I thought it was Eli.

Brian:

That's enshittification.

Kyle:

Yeah, exactly.

Kyle:

No, the, the filter bubble to me was like an early concept of how our own individual consumption could be shaped by a feed.

Kyle:

And I mean, feeds have taken over so much more of what we experience at this point.

Kyle:

That was not the era of Netflix, like it was not the era of Spotify being so dominant.

Kyle:

So now it's not just like seeing it on our Twitter feeds or our Facebook feeds, but actually in all forms of culture and entertainment.

Kyle:

And those feeds have also gotten much more algorithmic over the past decade.

Kyle:

So you can see, I mean, with Twitter alone: Twitter used to be pretty linear.

Kyle:

You could kind of depend on how it worked and now you don't even know how it's working on a day to day basis.

Kyle:

So it feels more warped, I guess.

Kyle:

Like the filter bubble has become more powerful and warping.

Kyle:

And so to me, it's this, like, immersive environment that we all live in.

Alex:

So you wrote this article when I was at Airbnb, I think it was "Welcome to AirSpace," right?

Alex:

And that was about, like, apartments starting to look the same,

Alex:

coffee shops looking the same.

Alex:

And I remember, I was head of design there when that happened, and it took that article for a lot of us to realize, like, oh shit,

Alex:

because we could pull out all the pictures.

Alex:

and in our case.

Alex:

it was entirely like preference based, right?

Alex:

Because when you wanted to pick a place and you weren't sure, you kind of went down a baseline where it looked clean in a certain way, and certain things photographed better, and it created this echo chamber where everybody started looking the same, because they would just perform better.

Alex:

And all of a sudden, we started

Alex:

digging into all these YouTube videos, like, here's how you take your pictures.

Alex:

Here's how you set the thing.

Alex:

And they had, like, rules for all that stuff.

Alex:

That wasn't actually algorithmically created by us.

Alex:

So at the time I didn't feel so bad.

Alex:

But

Troy:

But

Troy:

clearly you were a protagonist in Filter World,

Alex:

I mean, I was a hundred percent right.

Alex:

Like,

Troy:

the sickness

Alex:

No, and I think part of it is also, you know, we keep talking about how we don't know what the algorithms do to us.

Alex:

Some of these people building those algorithms don't really know why they're showing stuff to you.

Alex:

Because it's hitting some sort of dopamine receptor.

Alex:

And when you're throwing that at a million people, it works.

Alex:

So I think we're kind of losing control of culture.

Alex:

The Airbnb stuff felt really quaint at the time, quaint and kind of quirky, but it's a different world today, I think, and I'm terrified with AI.

Alex:

I was actually trying to pivot this conversation to AI.

Alex:

So how, how does that world look in a world of AI where everything's not only a feed, but just regurgitated in some sort of video format or text format that you consume?

Kyle:

The AI stuff scares me, but I think, I mean, part of what I wanted to talk about with this book was not just the algorithmic recommendation per se, but just how digital platforms, period, have performed this hyper-globalization on what we consume.

Kyle:

So it's the fact that Airbnb exists that enables this generic aesthetic.

Kyle:

It's not purely because of a recommendation, but people will kind of naturally optimize what they're doing on this platform that collects so many millions of people at once.

Brian:

I don't know the cause and effect.

Brian:

Like, there's something, like, the high-end cocktail bar is something I noticed in the mid-2010s.

Brian:

I don't know where that came from.

Brian:

You could be in Tokyo, you could be in Dusseldorf, you could be in London, you could be in New York, and it's the same.

Brian:

There was this sameness that pervaded a lot of these kinds of experiences.

Brian:

And I don't know how much of it is caused by feeds or digital platforms or algorithms, but, it's definitely something I've noticed, that there's a lot of just similar experiences.

Brian:

But also, the reality is, within a certain group of people, a lot of people have become very similar.

Brian:

You end up having more in common with someone in Barcelona than someone who lives in your own city or town.

Kyle:

for sure.

Kyle:

But like, what is the change that has happened?

Kyle:

Like, over, over the 2000s, 2010s, it is the internet.

Kyle:

It is our ability to follow that one guy in Barcelona, or for the barista in Williamsburg to follow the barista in Sydney and see how the latte art looks on a day-to-day basis.

Kyle:

I think, to me, it's like the granularity of globalization changed, where it's not just a corporation, it's not just supply chains, it's, like, all of our aesthetic experiences have become more similar to a vastly larger number of people all over the world than they were before.

Troy:

can I just build on it?

Troy:

Because this is the distinction between Bubbles and Filter World.

Troy:

Because filter bubbles are about creating these kinds of narratives that are part of a collective, true or not, and believing in a self-reinforcing mechanism inside of a group of people, informed by communication technology.

Troy:

And that's what really worries me, because I feel like we're living in this kind of modern fog.

Troy:

Where we potentially have these dueling narratives, none of which are anchored in any sort of objective truth.

Troy:

They're just like, what the fuck do you want to believe?

Troy:

And how did the people around you reinforce that?

Troy:

And some of that has actually nothing to do with algorithms.

Troy:

In fact, there's a paper, I sent it to you guys, from the Communications of the ACM, which is some computer science organization,

Troy:

called "Bias Skew and Search Engines are Sufficient to Explain Online Toxicity" basically saying that the read write world of web one web two and search engines were enough to create basically the tools of online community building that created, these potentially kind of warring factions or these, groups that feel really separate from one another.

Troy:

So if the flattening of culture means we all get along like kumbaya man, let's do it.

Troy:

But I'm more concerned that this technology is used to perpetuate lies.

Kyle:

On reality, the reality thing is hard to nail down, because it's like, if most people's lived realities are on the internet, then what is the fact and what is the distortion?

Kyle:

And it sometimes seems hard to separate.

Kyle:

If your, if your politics are on Twitter, then your beliefs are Twitter beliefs.

Brian:

Yeah.

Brian:

So you did, like, a social media hunger strike for the book, right?

Brian:

I don't know, did your, did your editor tell you how to do it?

Kyle:

No, I did feel like I had to do it myself.

Kyle:

Like, both.

Kyle:

I mean, as, as anyone who has finished a book knows, it's like the worst slog that you've ever gone through in your life.

Kyle:

And so this was like coming toward the end of writing the book.

Kyle:

And feeling like I needed to go the full distance, make myself uncomfortable

Brian:

Okay, so professionally you feel the need to be on Twitter and all these feeds, like TikTok and the rest of them.

Kyle:

I think so, particularly with my job now, writing an internet column for The New Yorker.

Brian:

If you didn't, would you, like, do you resent the fact that you have to submit yourself to

Brian:

these?

Troy:

beat, dude.

Brian:

No, but I know. I feel like, for instance, not to make it about me, but I will for a little bit: I do not want to be on Twitter.

Brian:

I really don't want to be on Twitter.

Brian:

And I know it's a burden.

Brian:

But,

Kyle:

double edged

Brian:

Yeah, I know.

Brian:

But I feel the need to be on there.

Brian:

I would like to have a more pastoral life.

Kyle:

Mm hmm.

Kyle:

I would, too.

Kyle:

I mean, I think having covered the internet pretty intensely for the better part of a decade, I'm, like, ready to do something slightly different.

Kyle:

And I am jealous of the people who can write about something that's not online and can have an interesting reporter life and career, and not have to deal with the vicissitudes of the next new platform and the weirdest freaks you've ever seen in your life doing something

Brian:

So does it become like punk rock to like just completely opt out of all this?

Kyle:

I think so.

Kyle:

I mean, it's not Luddite, because I read the Luddite book and the Luddites were much more aggressive and violent.

Kyle:

Brian, great, great book by Brian Merchant.

Kyle:

But it does feel to me like a back-to-the-land movement or something, or like a, you know, almost hippie thing to be like, I'm not going to be online.

Kyle:

I'm going to cut myself off from all of this.

Kyle:

I'm going to find other ways to consume culture to connect to my friends, whatever.

Kyle:

It feels, it feels rebellious at this point.

Troy:

but.

Troy:

You are a product and a participant in the world of professional media.

Troy:

You're a professional journalist.

Troy:

Did writing Filterworld, or any of your observations, change the way you thought about professional media?

Kyle:

For sure.

Kyle:

I mean, my thinking was kind of already changing as many of us witnessed the decline of the 2010s VC media bubble.

Troy:

You work at a holdout.

Kyle:

yeah, exactly.

Kyle:

I mean, subscriptions.

Kyle:

But I think like before I started writing the book and like in the mid to late 2010s, I guess, I was really obsessed with what was happening on Twitter.

Kyle:

Like I saw that as the reality of the media industry, or of culture, or of success.

Kyle:

And I think doing the book changed that, and also the algorithm cleanse of just like totally getting off all of it for three months.

Kyle:

also helped change that.

Kyle:

Just to realize, I mean, I grew up on the internet, like I'm a fully millennial internet pilled

Kyle:

person.

Brian:

When did you get the internet?

Kyle:

Probably when I was 10.

Brian:

so you are full.

Brian:

I was fully formed before I got the internet.

Kyle:

Yeah, which I think, it didn't help my life necessarily to be so online.

Brian:

I think it's harder for those of us who grew up analog to some degree.

Kyle:

But maybe now we're moving into a new analog era, like a post post internet.

Troy:

what do you think happens next?

Troy:

I was really aching to ask you this question.

Troy:

trying to think about like, shit, what comes after feed world?

Troy:

Because it doesn't sound like it's going to be good.

Troy:

Or maybe it'll be great.

Troy:

I mean, other than, you know, farming, back to the land, which, let's exclude that for a second.

Troy:

What, what happens, you

Troy:

know, on the internets?

Troy:

Well, okay.

Troy:

I get that.

Troy:

We covered it.

Troy:

But I think the obvious answer is the feed becomes infinitely more personalized.

Troy:

That's what AI does, right?

Troy:

And, and

Kyle:

I mean, do you want to live in a world of like, the most generic AI generated art stuff?

Kyle:

Like, I feel like the early products of AI generation don't inspire hope in me at all.

Kyle:

It's all not very interesting.

Kyle:

It's not very compelling.

Kyle:

Perhaps it can make feeds better.

Kyle:

Like a perfect algorithm would be better.

Kyle:

If it could be more granular, if it could somehow not push people into herd behavior.

Kyle:

But that might also just be a fact of human life.

Kyle:

If we're all congregated onto a particular platform, we'll naturally move in a similar direction as everyone else.

Brian:

What's a hopeful scenario?

Kyle:

I think it's that they get better.

Kyle:

They prioritize more human curation in some way.

Kyle:

They let you follow people who are better attuned to what you want to see.

Kyle:

I mean, personally, I have more hope for just non algorithmic consumption of stuff.

Kyle:

Like, I have more hope. Going to a record store.

Kyle:

I don't know, some very vintage things. Like, I mean, to me, Substacks and newsletters are pretty non-algorithmic for now, and the fact that I can pay someone five bucks a month to write about music in a way that I appreciate is a nice thing for me.

Troy:

Although those bastards, they'll say the opposite, but they're jonesing for the algorithm over there.

Kyle:

Yes.

Kyle:

Yeah, I mean, the way that they're moving toward an algorithmic platform is very upsetting and disturbing.

Kyle:

And it's not adding value. Like many people have pointed out, the random followers and subscribers you get from this recommendation system don't really help you that much.

Troy:

Yeah.

Alex:

Maybe the future is AI-powered bubbles.

Alex:

You just create them yourself, for yourself.

Alex:

I mean, there's something.

Alex:

Okay.

Brian:

I think we're all going to become the precogs.

Brian:

We're just going to be like immersed in content.

Troy:

Yeah.

Alex:

It's super,

Troy:

in the soup.

Troy:

Yeah.

Troy:

Kyle, one of the things you didn't talk about that much is the whole other side of this thing, which is the economics, and algorithms

Troy:

owned by platforms. The real problem, other than that it's maybe, in your view, turned some of us into zombies, the bigger problem is it's robbed media of their ability to create their own distribution.

Troy:

And that's the bigger problem, I think.

Kyle:

And not just media, not just journalists, but musicians and visual artists and anyone who wants to reach an audience really faces a lot of pressure and mediation from these systems.

Troy:

Yeah, I'm getting this sort of sickly feeling from Spotify that it's like too much.

Troy:

It's all greased up.

Troy:

It's all greased with algorithm juice. I started using Qobuz because I'm a Tidal user, like, I like high-fidelity streams, and Qobuz has a nice balance.

Troy:

I mean, there is very little algorithmic stuff.

Troy:

A lot of it is just, they curate and say this is a good album and oh my God, that felt good.

Kyle:

It feels cleaner to not deal with these things that are trying to manipulate you.

Kyle:

Like, in the book I wrote about Idagio, which is a classical music streaming service.

Kyle:

And the experience of using that software is night and day from Spotify.

Kyle:

You're not getting pushed into stuff.

Kyle:

You can click a musician and see their whole back catalog in one click.

Kyle:

You can sort by performers, you can sort by orchestra, you can sort by era.

Kyle:

The metadata is better.

Kyle:

It's, it just shows you how much better it could be

Kyle:

for the thing that you're experiencing than it is.

Troy:

I agree.

Troy:

The bonus question that we didn't ask him, Brian, that we discussed, is should we ban TikTok, Kyle?

Brian:

Oh yeah, that was the last one.

Kyle:

I think if we ban TikTok, all of the user problems will just persist in other forms, like Instagram Reels

Kyle:

or whatever

Brian:

demand for this.

Brian:

I think that the big outstanding question is, I'm sort of with you on a lot of the pernicious impacts of algorithms, at least from my point of view, but they're incredibly popular.

Brian:

If you put an algorithmic feed versus a reverse-chronological feed in front of people, they'll choose the algorithmic feed, like, 10 times out of 10.

Kyle:

For sure, yeah.

Kyle:

Yeah, they naturally drift toward that.

Kyle:

And like, TikTok proved that it's a powerful tool.

Kyle:

Or like, a compelling, addictive product for user experience.

Kyle:

The Chinese data stuff, I mean, it's neither here nor there, in a way.

Kyle:

Like, all of the data is being surveilled anyway.

Brian:

Yeah.

Brian:

Well, it just seems to be like, it's all theoretical.

Brian:

Could they use the algorithm in order to push people in one direction or the other?

Brian:

I don't, maybe, probably, if they, if they wanted to.

Brian:

Why

Alex:

I mean, I think there's some value in looking at reciprocity, of them selling a product to us that we're not allowed to sell to them, right?

Alex:

I think if only that should be looked at.

Alex:

If we can sell each other cars and computers, but maybe not that.

Alex:

So,

Brian:

try to get one of their electric vehicles in the United States.

Brian:

You can't.

Troy:

The reason America is great, Alex, is because we don't look at petty things like reciprocity as things that define who we are.

Alex:

Yeah, as somebody who grew up in the Middle East, I will disagree with you, but let's, let's

Troy:

Okay.

Troy:

All right.

Troy:

Well, the last, the last thing I would just ask of you for our listeners is your hustle culture tip of the day.

Brian:

What does that mean?

Kyle:

Oh man.

Kyle:

Hustle culture tip of the day.

Troy:

To write all these amazing articles and books and also have your, you know, whatever else you're accomplishing in life.

Kyle:

Hmm.

Kyle:

I don't know.

Kyle:

My, my hustle culture tip of the day is pay attention to your subconscious.

Kyle:

Like be a, be a Freudian or something.

Kyle:

Like when something sticks in your brain for some reason.

Kyle:

You have to go dig it up.

Brian:

Awesome.

Troy:

That's a good

Troy:

one.

Troy:

Nice

Troy:

Thank you

Troy:

Kyle

Kyle:

this

Kyle:

was

Brian:

Thank you so much, Kyle,

Brian:

for,

Brian:

Getting into the bathroom for us.

Brian:

this could be a regular

Brian:

segment.

Kyle:

Bathroom podcast.

Alex:

you came with headphones and a mic.

Alex:

I really appreciate the

Alex:

effort.

Alex:

Thank you

Brian:

Pretty good in the restroom.

Troy:

have that lamp in your bathroom?

Troy:

Kyle

Kyle:

So the bathroom in this apartment, the fan is really loud and it comes on with the light so I

Kyle:

can,

Brian:

thank you.

Brian:

This is New York City.

Brian:

This is, this is a very New York City thing.

Brian:

Nobody, by the way, I just want to tell you,

Brian:

Kyle, as someone who's spending more time outside of New York City, nobody else deals with this stuff outside of New York.

Kyle:

Oh, I totally get it.

Kyle:

you know, I've prepared in advance.

Kyle:

Like, I

Kyle:

thought about the scenario.

Brian:

Awesome.

Alex:

See you.

Brian:

much.

Troy:

subconscious.

Kyle:

Yeah, have a, have a good

Brian:

All right.

Brian:

Take

Brian:

home.

Alex:

Bye.

Troy:

That was wonderful.

Troy:

Wonderful.

Alex:

love that.

Brian:

you got to go, Alex, do you have a

Brian:

quick,

Alex:

just wanted to keep things moving.

Troy:

do you want to participate in the good product corner?

Alex:

yeah,

Brian:

good product.

Alex:

What's your good product?

Alex:

Feet.

Troy:

Well,

Alex:

to get you around.

Troy:

this is more of a scenario, good product or an experience, right?

Troy:

So I was thinking a little about Kyle last night. I was looking at a newsletter.

Troy:

And a newsletter that is the product of our wonderful modern internet world.

Troy:

One of the greats called Today in Tabs.

Troy:

No,

Brian:

mine.

Troy:

I couldn't get to yours cause it's blocked by a paywall.

Troy:

And then.

Troy:

So I'm just doing my thing, thinking about Kyle, thinking, oh my God, I feel like my interests are well served.

Troy:

And in this newsletter, it turns out that Ben Folds, the musician, is divorcing for the fifth time.

Troy:

And I thought, well, that's interesting.

Troy:

And, and, it's costing him a huge amount of money.

Troy:

More money than I would have thought that,

Brian:

Ben Folds

Troy:

seemingly sort of faded into the background.

Troy:

But no, he's busy.

Troy:

and I read about it a little on Scoop Nashville, which is a niche publication, I guess, documenting the trials and tribulations of, folks in Nashville.

Troy:

So he's an interesting guy, Ben Folds, and he is an advisor to the National Symphony Orchestra, has been for a bit.

Troy:

And then I, I really remembered, oh my God, how much I appreciated like his kind of brand of quirky, nerdy, thoughtful pop rock, right?

Troy:

And I started thinking about all the other bands I liked at that time, the Lemonheads or Elliott Smith or Ben Kweller or this

Troy:

underappreciated Canadian band called Sloan, or the Postal Service, Fountains of Wayne, all these bands, right?

Troy:

And I started just sitting in my chair, going through an old catalog of music, of course, that was infinitely available to me because of modern technology.

Troy:

And it just made me connect all these things, like the piano troubadours that came before Ben, Randy Newman and Todd Rundgren or Elton John or whatever.

Troy:

And I fired up the songs and I listened to them through Qobuz.

Troy:

Because I'm kind of over Spotify. And I spent an hour kind of reliving old times, and the whole experience of, like, random discovery, going back, reliving that, no one bugging me, was a great product. Not just a good product, but a great product.

Troy:

And three songs I played from Ben Folds: the one called Philosophy, this soupy kind of pop perfection of his song Brick, or, you guys know the song Rockin' the Suburbs, which is kind of driving pop rock.

Troy:

And while I was in that listening, I was in that newsletter and I came across another account, and I hope you looked at this, Alex, which was this guy, also randomly a Canadian, called the Tom John.

Troy:

And.

Troy:

This is the perfect kind of companion to the whole feed world thing.

Troy:

He had done a TikTok video in the same voice that Kyle projected earlier, like the fast talking kind of TikTok thing.

Troy:

And he did a thing where the video was entitled something like how to listen to your favorite song whenever you want.

Troy:

And what he did is he did this whole setup in his room of using a rotary phone to call a radio station to tell them to play a song for his girlfriend, to take that song and plug

Troy:

his Walkman that had a radio in it into his tape deck, to record the song from the Walkman onto the tape deck, and then to put it onto another mixtape because the tape deck had two

Brian:

I enjoyed that, but it was, it's not,

Troy:

uh,

Brian:

not,

Brian:

that complicated to record those songs off 97.5 WISB.

Troy:

It's a, it's a video, man.

Troy:

It's entertainment.

Troy:

The point is that it used to be a lot harder to do certain things, to listen to that Ben Folds song over and over and over again.

Troy:

And, that's the internet.

Troy:

It's an amazing place and I'm grateful for it.

Brian:

I definitely did that.

Brian:

I definitely did that.

Brian:

That was, that was a good, good product.

Brian:

My good product is Chicken McNuggets.

Brian:

I made my own version of Chicken McNuggets and they were

Troy:

That's so funny.

Troy:

We did the same last night.

Troy:

Yeah, my kids do.

Brian:

Yeah.

Troy:

They're a little dry actually.

Brian:

Well, you're not doing it right.

Troy:

You know what?

Troy:

If you want a backup good product, I will only tell you, I thought you guys would probably criticize me for it.

Troy:

Shake Shack, man.

Troy:

Oh my God.

Troy:

The SmokeShack burger.

Troy:

So good.

Alex:

Uh, we do In-N-Out here.

Alex:

We're very happy with that.

Alex:

Although there's a Shake Shack.

Brian:

Thank you all for listening, and if you like this podcast, I hope you do.

Brian:

Please leave us a rating and review on Apple or Spotify or wherever you get your podcasts.

Brian:

That takes ratings and reviews.

Brian:

Always like to get those.

Brian:

and if you have feedback, do send me a note.

Brian:

My email is bmorrissey at therebooting.com.

Brian:

Be back next week.

Alex:

alright guys, well thanks so much.

Alex:

I've gotta go meet with folks at Unity right now, so I have to bail.

Alex:

Talk about video games.

Brian:

All right.

Troy:

Good luck.

Brian:

Thanks.

Alex:

peace.


About the Podcast

People vs Algorithms
A podcast for curious media minds.
Uncovering patterns of change in media, culture, and technology, each week media veterans Brian Morrissey, Alex Schleifer and Troy Young break down stuff that matters.
Get our newsletters:
https://www.peoplevsalgorithms.com/
https://www.therebooting.com/