Studio Sessions

49. The Human Cost of "Artificial Intelligence": What Are We Trading Away?

Matthew O'Brien, Alex Carter Season 2 Episode 23

We dive deep into the seductive promise and hidden costs of AI technology and the question that haunts every creative: what are we trading away for convenience? Through our exploration, we examine how tools shape not just what we make, but who we become in the process.

We also wrestle with the tension between efficiency and authenticity, discussing everything from DV tape workflows to AI-generated content and the race to the bottom that threatens meaningful creation. This conversation challenges us to consider whether the magic intelligence in the cloud will free us to focus on what matters most, or whether we're surrendering the very friction and deliberateness that give our work its soul. -AI

If you enjoyed this episode, please consider giving us a rating and/or a review. We read and appreciate all of them. Thanks for listening, and we'll see you in the next episode.

Links To Everything:

Video Version of The Podcast: https://geni.us/StudioSessionsYT

Matt’s YouTube Channel: https://geni.us/MatthewOBrienYT

Matt’s 2nd Channel: https://geni.us/PhotoVideosYT

Alex’s YouTube Channel: https://geni.us/AlexCarterYT

Matt’s Instagram: https://geni.us/MatthewIG

Alex’s Instagram: https://geni.us/AlexIG

Speaker 1:

And it had been a golden afternoon and I remember having the familiar conviction that life was beginning over again with the summer. I feel like this might be a little disorienting, so I'm just going to provide some orientation here. What Alex and I are talking about stems from the 30-plus minutes in pre-show where I had brought up the video that OpenAI put out, where Jony Ive and Sam

Speaker 1:

Altman, which Alex hasn't seen, sit down at, I think, the Zoetrope coffee shop in San Francisco to have a little espresso and talk about what they're doing.

Speaker 1:

OpenAI acquired Jony Ive's company io for $6.5 billion to sort of christen the marriage between the two of them in formally producing, I think, a piece of hardware, a hardware device that will use OpenAI's artificial intelligence. So, yeah, they sat down and had this conversation about a range of things, their values, what they are trying to do, without telling you exactly what they're doing. There's a prototype that is being used by Sam Altman and a few other people to sort of do what this device is intended to do for the masses. And they're excited about revolutionizing how we interact with a computer, the actual computer itself, you know, like flip open your laptop and input stuff with a trackpad, a mouse and keyboard. They think that that's legacy hardware and it's ready to be taken out of the equation and replaced with this quote magic intelligence in the cloud.

Speaker 1:

Yeah, this quote magic intelligence in the cloud that this device is going to allow you to interact with in a way that lets you make things that you never could have made, or do things that you never could have done, whether it's, I don't know, whatever it's gonna do. So we spoke about that for about 30 or so minutes, and now we're trying to figure out whether or not that will play a role in this episode or if we're going to talk about something completely unrelated. Probably the latter, probably the latter.

Speaker 2:

So, yeah, my take on that is, I don't know, skepticism, sure, paired with... I don't know. It's probably worth doing an episode on our thoughts on AI eventually, because there's definitely some nuance, I think, to both of us, yeah, in our approach.

Speaker 1:

But my big thing is, you have skepticism and all that. Mine is, what's this going to cost us? And I don't mean how much is the device going to be at a retail location, an OpenAI store, but what's this going to cost us as a civilization, a society, a culture? And I was making references to social media and, you know, the smartphone in general, and the cost of social media, with misinformation and, you know, all the different things that are the negatives of social media, right? Depression.

Speaker 1:

You know, issues with young people and self-image, the death of creativity and originality. And then smartphones.

Speaker 1:

You know, I've talked about this, you and I, and on my channel. You know, just the siren song of the magic rectangle, and it being so difficult to do anything but stare or have a screen in your face at all times, because it's a way for you to make stuff, it's a way for you to communicate, it's a way for you to engage in digital third places, all that stuff. So what's this thing, and I'm asking rhetorically, but what's this thing going to cost us in order to have the magic intelligence in the cloud at our beck and call? What are we going to give up to do that? Back to the social media thing: we give up all of our personal data. That's the product, mining our personal data, our phone number, our zip code, what we click on, what we do, all of that stuff, in order to serve up ads that are going to have a higher likelihood of converting us to buying something. All the consumerism that social media, including YouTube, promotes, all that stuff. So that's what I think about. I'm open and interested in a magical and revolutionary product, but I want to know what it's going to cost us to wield it, to have it, to use it, and is it worth that cost?

Speaker 1:

I am someone who is more susceptible to the seduction of, hey, I can make an app, or I can have it edit a video for me. It'll do all these things, which I think is going to give me more time to do the things that are really meaningful to me. So it wants to listen to me 24/7 and, you know, capture imagery of everything I'm doing in order to do its thing. Okay, yeah, great, because I can tell it to edit a video for me and it does it in 20 minutes, and I give it a bunch of revisions and then I'm done. And what usually took me three days before, I'm now doing in, you know, 45 minutes. Sold. But what are you trading away?

Speaker 1:

Right, what's the cost?

Speaker 2:

Oh man, then it's not your videos, it's AI's videos. Absolutely. It's like, yeah, but I told it... It's like an Instagram filter. You have seven options, that's right. You have six colors. Express yourself with one of these six colors.

Speaker 2:

These three interior options. We're going to forget... I mean, no, I don't think so. I think there's usually a counterforce to a lot of these things. We're in a pretty interesting period right now, too, where I think the sins of Silicon Valley are in focus in a lot of ways, yeah, and some of it is just unwarranted, some of it is just, like, convenient rhetoric or whatever. But yeah, I think it's an interesting time period to try to push a disruptive technology. I mean, if it helps with day-to-day stuff, then you're always going to find an audience, and I'm very pro-AI. I mean, I was talking to you about this stuff a long time ago.

Speaker 2:

But the more I've used it and the more it's become integrated into my life, the more you kind of see the limitations, you kind of start to hit a ceiling in some of it. And it's going to be really, really good for a lot of industries and a lot of things, but it's going to do more harm than good for a lot of people, and I think eventually, I like to hope, maybe that's just optimism, that, you know, people will start to realize, oh man, the price, the trade-off, is not worth it.

Speaker 2:

The cost is too great. Yeah, and I mentioned to Matt before, there's, like, my outlook on AI. I forget, I think it's a Werner Herzog quote, I'm not sure where this comes from, but it's: procrastination is for good problems. That's right. Bad problems should be killed immediately to make more time for good problems. And just replace "killed immediately" with "fed to AI immediately."

Speaker 2:

And I think that pretty much sums up my view on large language models, or however you want to put it, right now. Is the concern, though, that people are going to put their good problems through AI as well?

Speaker 1:

Yeah, that they're.

Speaker 2:

I mean, it's just, it's like using GPS. Or, I mean, I've met so many people who just can't read anymore. Not, like, can't read phonetically, can't pronounce words, I mean they can't read. They have no attention span. If you give them a book, they can't physically make their way from the beginning of the book to the end of the book. It's actually impossible, and that's been such an impossibility for so long because of a low-information diet, or a high-speed information diet, right, that there's no... like, it would take them...

Speaker 2:

Like, you can't just jump into a marathon and run a marathon, right? And forgive that metaphor, but it's a useful one. It takes...

Speaker 2:

You have to build up to it. Yeah, and a lot of people don't realize that many of the cognitive skills that we take for granted, I think, operate on a pretty similar framework. And once you start to remove friction from that, I think you're also starting to remove the cognitive abilities that you've built up or conditioned. And, I mean, a lot of people can't navigate without a GPS, right? A lot of people, yeah, can't make it through a book or can't sit down and watch a movie from start to finish.

Speaker 1:

And I see AI doing that with a lot of cognitive tasks. You know, you talking about...

Speaker 1:

That made me think about Amusing Ourselves to Death by Postman and the shift from, you know, sort of written-word consumption to imagery-based consumption with television.

Speaker 1:

Um, you know, and I don't know the answer, I'm wondering aloud.

Speaker 1:

The feeling I got from watching that video was that there would be sort of a reduction in the presence that screens have in our lives, and to me there would be an increase in oral communication, that this would be like a person that you live with, and you speak to them to get things done, you direct them, you have to kind of give instructions, you have to give feedback. And it's such a heightened language model that, you know, it learns the nuances of what you're saying, maybe it can sense sarcasm or, sort of, like...

Speaker 1:

You know, just these shadings of meaning, to give you a new version of the thing that you're trying to make, whether it's a software application or a video or a lesson plan, whatever it is, and that, you know, you might look at a screen to sort of review what it's done for you but then move away from it. So I'm curious what this would mean in the shifting of how our brain works. Again, Postman writes about how we think in imagery because of television and the presence of screens and imagery and all that in our lives. Will this have an impact on that or won't it? Will screens still be as ubiquitous to us? But instead of staring at a screen to, like, work on a spreadsheet, we'll be staring at a screen to just watch stuff.

Speaker 2:

Yeah.

Speaker 1:

And will those screens be strapped to our head like a Vision Pro and the videos more immersive? Will it be a 2D movie theater?

Speaker 2:

Will it be, you know, television in our... whatever, it'll be a smartphone. How customizable, or how individual, can these models get when plugged into a specific human's experience? Yeah, because, I mean, you know, you talk about the decay of images, or, like, we see the world through images, but we don't see it through a lot of images anymore, because there's not a lot of individual experiences happening, there's more of a collective experience happening. And so we already see, in our day-to-day reality and just in our language and in our conversations and in the ways we mean things and communicate ideas, the decay of metaphor, the decay of, you know, meaningful imagery past a certain point. And, you know, will this accelerate that? Because everybody's pulling from the same database. Nobody has room for an individual experience anymore. It's a collective experience. Everybody has the same secretary, right? So everything is being pulled from that, and the secretaries are pulling from the same library. And once it's monetized...

Speaker 2:

What are they giving you that helps them with their bottom line and their profits? And, sort of, I think I saw a statistic, too, and don't take this as, like, a specific, I'm not the one to recite, like, oh, the number is exact, but I do think that a lot of OpenAI's actual profit comes from their commercial side and not their consumer side, right? Like, enterprise. Enterprise, thank you, yeah, which is more just token-based.

Speaker 2:

Yeah, so I don't think, like, their subscriptions... You know, my first thought when you're like, how will it monetize, it's like, well, subscription services. It's like, there's a reason they have hundred-thousand-dollar subscriptions, or forty-thousand, I don't know what the exact numbers are at this point.

Speaker 2:

There's also a reason, I know they have higher tiers now for enterprise, but I'm not even talking about those being where the profit or the revenues come from. I'm talking about their, like, pure usage, where you're opening their faucet for tokens to interact with this AI, to build it back into an application or into something that is more custom. That's where most of their... It's not the stuff that you and I think about, going on and using ChatGPT, right, yeah, to make an image through DALL-E with them.

Speaker 2:

Yeah, and I think that's honestly where a lot of the profit is going to come from in AI, period. I mean, obviously there's a consumer market. Maybe their idea is that they'll get the jump on the consumer market. I mean, obviously there's going to be a huge consumer market for AI, and they're paving the way for that by having a free tier that billions of people are going to use every day.

Speaker 1:

There's a freemium, there's a paid subscription, which I pay for, and then, as hardware products come out that couple with it, you know, they're going to build an entire ecosystem, but then also have opportunities to monetize in other ways potentially, whether it's advertising or, sort of like, Google has sponsored results. You know, when you start using ChatGPT, like, I need a spreadsheet of whatever, it's going to be this company's spreadsheet format that it spits out, and that company's paying a premium to be the default whatever, you know, because you're not owning the software and sitting there and making the thing yourself, you're just telling ChatGPT what to do and it crafts it all for you.

Speaker 1:

Yeah, you know, and I can't look into the crystal ball too far with all of that, because, I mean, literally, watching that video sort of cracked my brain open into going, where's this gonna go? I mean, we've had several years already with ChatGPT and AI... As people shout... You probably can't hear this, but there's, like, people yelling in the front yard of Alex's apartment building and it's very distracting.

Speaker 1:

You know, we've had several years to kind of look into the crystal ball and go, where is this headed? But there was something about them talking about a hardware product, which wasn't unfathomable to me 24 hours ago, but now it's like, they have a prototype.

Speaker 1:

They're using it the way they think we all will use it eventually, when we all have one, or they hope we all have one. So my brain just starts going, well, what's that going to look like? And again, what's the cost of it? What are we going to give up to have this thing? And, you know, I don't want to get full-on dystopian because, like you said, we'll just opt out, you know, if this isn't something that aligns with us.

Speaker 2:

I feel like, for one, the technological environment that the iPhone launched in, and the social environment that the iPhone launched in, was a much different environment than today. Yeah, so just the fact that it feels like they are trying to chase the sequel to that makes me more skeptical.

Speaker 1:

Yeah.

Speaker 2:

I don't think it's gonna feel like that the next time that happens.

Speaker 1:

I feel like even Jobs was chasing sequels to a certain extent, on a much larger scale. I was watching this documentary on Edwin Land, who invented the Polaroid camera and Polaroid film.

Speaker 2:

Oh yeah, I mean Jobs was chasing sequels in a more, I guess, a less information-rich environment.

Speaker 1:

Yeah.

Speaker 2:

But yeah, I mean, everybody's, in a way, chasing sequels. I don't think it feels like... yeah, I don't know, I don't want to speculate on it, it just doesn't... I don't think it does any good. I do think that AI is useful if you feed it your bullshit that you don't really want to deal with. Yeah, it's very useful for that, and it will free up brain share. I mean, that's speaking from firsthand experience, right, which is nice, and it does free up your ability. But then you also have to make sure that you use that ability to focus on things that you actually want to spend time focusing on, because it's just as easy to get sucked into some mindless vortex with that extra time.

Speaker 1:

Well, and not even that. You use the extra time for things away from, I don't know, the internet, or making stuff for the internet, or whatever it's going to be. And obviously that's what we say.

Speaker 2:

You know, I think that's.

Speaker 1:

That's the big kind of pitch, like, hey, well, if ChatGPT can make your elaborate spreadsheet for this thing that you need for work in 30 minutes instead of you doing it in three days, go hike. Well, we're just going to do more work. Like, if ChatGPT can make a video for me using a photo-realistic version of myself, based on the real footage I give it, like, I want all my videos to look exactly like this and I want to work with you to come up with the

Speaker 1:

script, and then I want you to spit out, like, me talking to camera, you know, synthetically, and then I can tell you, I'll put some B-roll from WWDC 2022 over this part of the video, and, like, I can build it just by telling you what to do. If I can get that video done in three hours instead of two days, I'm not sitting there going, great, now I can go spend time with my kids. I'm going, now I need to start making the next video, because I can crank them out that much faster, because this whole ecosystem of social media and content creation and the need to make money pulls me into...

Speaker 2:

I don't think we're going to find a lot... It pulls me into more time, though, doing, consuming content. Like, and I think, I mean, yeah, you can definitely fall for that, or fall into that, but, one, you can always play that game. That's what they call chasing the bottom of the barrel, or fighting to get to the bottom of the barrel.

Speaker 1:

Yeah, race to zero.

Speaker 2:

Race to zero, scraping the bottom of the barrel. You can always play that game. We know companies that play that game, we know people that play that game, and a lot of the time that's a result of not offering anything of real value. Yep. Which is a result of not putting time and energy and brain share and effort into something. So, I mean, yeah, if all you're doing is making WWDC recaps, right, and you're just trying to spit your shitty recap out before everybody else's shitty

Speaker 2:

recap goes out, then, yeah, people are going to play that game and you can play that game. But everybody does have a certain level of administrative tasks that they have to complete on a day-to-day basis or on a week-to-week basis, you know, ad infinitum. And if you choose to outsource those things and focus on creating more value, then I don't necessarily think that time vacuum happens, or that, you know, I'll just do more and more. I think it's really up to the user, right?

Speaker 2:

Whoever's using or taking advantage of this product. And it's also up to the... You know, if you're working for a company and they're just like, oh, we're expecting more and more and more of you, part of that is, well, they're trying to maximize the value you give. So if you give value in some other realm that can't be scaled like that, then that magically will go away. But, I mean, yeah, if that's the game you're playing, then that's always going to be the cycle that you're caught up in: race to zero, do more, do more, do more. And, you know, we talked about opting out, but that's just something that you have to acknowledge and opt out of. I think that's a choice more than a metaphysical phenomenon.

Speaker 1:

And we're seeing that, at least in our circles. We're seeing sort of an opting out of even using modern tools that do make things easier. I mean, we're recording this right now with Flip cameras, which... it's cool that it has a little USB flip-out thing, but it's USB-A, so you have to have a dock that fits an older technology, because we're all USB-C now. You have to plug it into your computer and it takes a long time to transfer whatever it ends up being, two gigabytes. Eight minutes for, like, a gig of footage. Whereas all this modern stuff, it's just in the cloud.

Speaker 1:

We're not even there yet, right? Where these cameras, if they were modern cameras, should really just be beaming the video to the cloud, and by the time we're done recording and you sit down at your computer, the footage is there.

Speaker 1:

We'd be live streaming, auto-cutting, yeah, or whatever. Yeah, I mean, and there are some of those tools that we're using in this podcast, where, you know, we are using AI to sort of auto-cut the multi-cam, and it's sort of a bad problem to feed it, right, so that we can actually make this podcast, because this wouldn't happen.

Speaker 2:

Like we don't enjoy that.

Speaker 1:

Right, and that's a great example.

Speaker 2:

That's a wonderful example of what I'm talking about. This wouldn't be possible without using AI to cut it or, you know, coming up with AI descriptions and timestamps. And, you know, is it always perfect? No. I don't care.

Speaker 1:

Yeah.

Speaker 2:

Cause all I care about is every week we sit down and have a conversation that's at least meaningful to us.

Speaker 1:

Right.

Speaker 2:

And then we put it up there, and it seems like there's, you know, people every week that get some value out of it, and as long as there's, you know, one or two people that seem to get some value out of it, it's really fulfilling. And, you know, even if there was nobody, it's fulfilling to us. But yeah, I don't want to go in and cut and spend... You know, I still spend like an hour an episode, even with the AI. Yeah, because you have to organize the footage and bring it all in, and then you have to do quality control, and you have to do the audio, and then you have to export it, and we have to upload it to, you know, the cloud for us to pull, and it has to be uploaded to the podcast platform and then to YouTube, and then you spend an hour, you know, finding the clips and cutting it.

Speaker 1:

So, and I think this is the argument that Sam Altman would make, and I'm not saying that I want to give up that workflow in order to save time, yeah, but his argument would be, well, let's have AI do all of that.

Speaker 2:

Yeah, well, my argument would be, then I don't care about it, because right now, when I look at it, yeah, every week we sit down and have a conversation and then we have to fuck with it a little bit. Right? This is my labor.

Speaker 1:

Right.

Speaker 2:

When I look in and see that we have 200 videos on the channel, or, you know, we hit a new listening mark, or we've uploaded... You know, we're about to hit 50 episodes over two years.

Speaker 1:

You have credit for that. Yeah, I don't... versus, like, I'm not doing this because of the pure, you know...

Speaker 2:

Like, the reason that we're doing this would disappear if it was just, we had a conversation, AI handled it all and posted it. And that's the crux of this conversation to me, and this video, and that's the whole reason I'm skeptical. This is the cost of AI. What are we optimizing for anyway?

Speaker 1:

Who cares? Is the cost of AI making our podcast, other than the conversation that we have right now, just that? Who cares? You don't care about it the same way as when you have all those input points to pilot the thing.

Speaker 2:

And now, a good counterargument to that would be, well, if you guys got to a certain level, would you hire somebody to do it, and then they would do it and you wouldn't have to deal with it anyway, and you guys would just come and sit down and record. There's plenty of people that do that.

Speaker 1:

And you would give feedback to a real human like could we cut to the wide a little bit more? Like you know, there's a couple of things Matt says and it doesn't cut to him.

Speaker 2:

There's an argument to be made there, but that is not the reason that we're doing this thing right now. It may be for some people, and that's where I say it just depends on your situation, if you're going to get value out of it or not, because there's certainly some people that do just want to sit down and have a conversation, and then they have an entire production crew, quote unquote, that goes out and cuts this thing together. But there's something about... and you just have to know yourself.

Speaker 1:

There's something about having a real person involved in that. You know, in the spirit of collaboration, they're bringing, literally, their whole life to bear, all their experiences. But then again, who's, like, to be honest, who's a real, like...

Speaker 2:

We have skin in the game. When I'm editing the podcast, it is because the incentives are aligned properly. In a purely economic sense, the incentives are aligned with me sitting down and doing that work. That's a shit job, right?

Speaker 2:

Like, if I was hired, even if it was a huge podcast, if I didn't have my incentives aligned, I'm not going to enjoy that job of sitting down to edit the podcast. So yeah, there is a good argument to be made, like, well, if it's with a human, why isn't it a robot, or, you know, a computer, that you're outsourcing that to?

Speaker 1:

Yeah, but my argument is, it's a person who's going to bring their experience and perspective and expertise to bear, and what little things might they say or give feedback on that would actually shape the voice and vision of what we're doing? And they might say just a couple little things that you take with you, and over time you are, you know, kind of chewing on it, and it informs how you engage with each other, or, you know, whatever it is, it plays a role in the soul of the thing. Whereas, can a magic intelligence in the cloud contribute to the soul of what you're making, or does it remove the soul? And what threshold of involvement from AI do you reach or cross where you lose the soul of the thing? And it's just this... yeah, man.

Speaker 1:

I'll give you an example. A family member posted a video, and it was so obviously AI to me, but not to her. She thought this was real. And it was this veterinarian stopping on a dirty gravel road in the forest to help a baby moose that was stuck in the middle of the road, and the mother moose watched on, and this veterinarian went in and administered a shot and gave it some milk and washed it off, and the mother let her do it. And it's this formulaic, synthetic bullshit thing to tap into, you know, to tug the heartstrings of someone like my family member.

Speaker 2:

That video sucks if it's real.

Speaker 1:

Somebody. It's literally all made with AI.

Speaker 2:

Like I've seen real videos like that and I'm just, like you know, stabbing myself in the eyes.

Speaker 1:

And so they took that formula of these viral videos that actually, from what I understand, really happened, you know, a person found a fawn and brought him into their home and nursed him back to health, or an injured owl, and all this stuff, splicing them together into a new, perfectly crafted, formulaic thing that is sort of, like, guaranteed for virality, because it hits all these emotional beats. Yeah. And so now they're skipping that whole process of cultivating all of the shots that they need from all these animal rescue videos, and they're just having AI make it from scratch, and then people like this family member watch it, and, I mean, she was convinced it was real.

Speaker 1:

I had to do a screenshot and circle the baby moose's legs to say, there are three legs. Here it is. This is completely fake.

Speaker 2:

Yeah. See, I wouldn't have even given it the time of day, I would have just...

Speaker 1:

I just felt, 'cause I, you know, care about this family member, and it just bothered me that they sort of fell victim to it. Yeah. You know, what is this? And this is a little bit of a different thing, sort of like the appropriation of this technology to just fabricate something that's a lie in order to make it... Well, and as long as there's people clicking on that video, and they do, and they're so desperate for some kind of an emotional, yeah, catalyst, that they're going to interact with that in

Speaker 2:

that way, there's going to be people exploiting it, and that's just... I mean, that might just be the human condition, I don't know how AI... 'cause, yeah, what's the difference? I mean, the difference, honestly, if I had to make a case one way or the other, I would almost make a case towards the positive: that if it becomes so easy to do, and millions and millions of videos like that start popping up, because now you don't have to cut it together or hire somebody from a different country on Fiverr to do it for you or whatever, then you're going to oversaturate, and then it's going to become easier for people to realize how fake they are, right, and then they completely lose their effect. When's the last time you clicked on a banner ad on a website?

Speaker 1:

Yeah.

Speaker 2:

That was a game-changing industry at one point, right? There's an entire generation of 35- to 45-year-olds that haven't clicked on a banner ad in two decades, yep, because it oversaturated, it became too easy. So I think that there is a case to be made for oversaturation actually being a positive with things like that, that are more exploitative.

Speaker 1:

Yeah, the end result might be ultimately for the better, but, you know, it sort of... but is it...

Speaker 2:

I completely understand how you feel when it's a family member. It makes you feel maybe vulnerable. It's a little deeper than seeing a stranger interact with something.

Speaker 1:

Absolutely yeah, I completely understand that.

Speaker 2:

But yeah, I mean, in the case of AI doing... I mean, yeah, I would almost make the case that, good, let's put up with it for five years and I think we'll be better off. It's kind of like how bullshit doesn't fly as well online in 2025.

Speaker 1:

Yeah.

Speaker 2:

Like, there's still a lot of bullshit on the internet, don't get me wrong, and there's a lot of bots on the internet, but it's more obvious now than it's ever been. Yeah, and that's a good thing, I think. I think it's a net positive.

Speaker 1:

Well, and, you know, my earlier question about the soul of something, you know, whether it's a movie that's made with a bunch of collaborators, or it's a video made with AI by a single person who tells AI to make the music and to create the imagery and, you know, give feedback on hitting the right emotional beats for virality, or, you know, whatever it is that the result...

Speaker 1:

The result is what they want to get. And again, I'm asking rhetorically, but what's the threshold for the involvement of these tools where it crosses us over into a place of soullessness, that it's just this vapid, nothing thing that we end up making? And I've used AI in some of my photography videos to create, like, synthy, vibey music. And I've been curious, across, you know, the entire catalog that I have on that channel, from the music that real people made, and, you know, the service that I use lets you sort of adjust the different stems to kind of remix it a little bit, but again, it's still a person who made the music, to using full-on, like, copyrighted music, you know, where the royalties or whatever go to the artist and I haven't manipulated the song

Speaker 1:

at all, you know, all the way to something where I plug into AI a chord progression that ChatGPT gave me to evoke an emotion, and I input that chord progression and beats per minute and all these different parameters to come up with a piece of music. And the audience's experience of those videos, with the AI music versus the stuff that was manipulated but that a person actually made, am I going to see that people just aren't engaging with it because there's something soulless about the music?

Speaker 2:

Well, sam, I don't. I don't think so much of music that you're going to find, especially in the royalty-free world, like 100% in this world. It's probably pretty structured anyways, the AI is almost counter-structure, yeah. So I mean, I think I would almost say we overestimate the amount of you know using soulless as a thing that is counterhuman. I think there's a lot of human created stuff that's soulless.

Speaker 1:

Absolutely. Even stuff that people really care about...

Speaker 2:

I think makes its way out and it's pretty soulless.

Speaker 1:

Absolutely.

Speaker 2:

And part of that is, yeah, you know, some of the things we talked about at the beginning, the degradation of metaphor and, like, collective thinking and things like that. They produce something that is... again, precision in defining what we're talking about when we're talking about soulless is probably necessary here, but it is lacking in a soul, and AI might actually bring some of that back. Now, I'd like to think that the peak experience of soul is only achievable by a human, as long as the audience is human, because, at the end of the day, what I see as soulfulness is something where it's a human's connection with another experience. What is the... the Tolstoy... there's a quote in What Is Art? from Tolstoy where he's like, a work of art is something that transmits a feeling from one human to another human. And that's soulfulness in my mind. Again, just my definition for it.

Speaker 2:

And, yeah, if a lot of humanity has become soulless, or has lost the ability to transmit that emotion from one human to another, then maybe AI is actually better at doing that at first. Now, again, if the whole point is to translate an experience from one human to another, then, by way of that definition, the logic would tell you that the best expression of that is only possible by another human, right, the purest expression of that. Maybe an AI can copy bits and pieces of what makes that experience that experience, but it's not going to be able to fully encompass what that is.

Speaker 2:

But yeah, I mean, I'm not trying to be, like, contrarian... No, no, it's great. ...I would almost think that, 'cause I listened to it, and I picked the thing that was more random, and I ended up picking the AI, and I was like, this is just more interesting to me. And it's partly because it was just random. It's like when I watch a film and it's the same beats, I get bored very quickly.

Speaker 1:

Right.

Speaker 2:

But when you're watching something that's by somebody who's a true great, you're usually enthralled the entire time, because they're doing everything... It's like, that wasn't the obvious choice. Didn't see that coming. Amazing. Didn't see that coming. Any genuinely exciting or, you know, brilliant piece of work is usually going against that. Yeah, it's interesting with stock music in particular.

Speaker 1:

I think AI does a pretty decent job. Well, I think it goes to what you said about saturation. You know, you do the Save the Cat formula in movies enough and we're all just gonna collectively get...

Speaker 2:

And then the thing... Oh yeah, you might not even have the connection in the music, like, you're just making music. But... or animal rescue videos.

Speaker 1:

You know, oh yeah. Yep, the thing's injured, the person shows up, cuts the barbed wire, the animal gets out, it gives it a, you know, a knowing look to say thank you, and the person goes off about their life, you know. Like, yeah, we know the formula.

Speaker 1:

We've seen 10 million of these, we don't feel anything for it anymore. And then someone comes along, not that they would make it up, but let's say they made a short film and it was, I don't know, something with an injured animal, but they did something to subvert your expectations, because you've been trained, with this oversaturated, formulaic bullshit, yeah, to expect all these things. And then it does something to subvert those. And it's not the delight in the fact that your expectations were subverted, it's that they were both subverted and you were made to feel something.

Speaker 2:

Yeah, on top of that, there's an interesting case to be made, right, with what you just said, that AI could just literally turn it into a mathematical equation. Absolutely. Let's subvert this.

Speaker 1:

Let's subvert this. That's right, if the right pilot of the AI knows to do that.

Speaker 2:

But then again, with the saturation problem, humans are going to solve that eventually and come up with something even more... that hopefully rises above it.

Speaker 1:

And that's, you know, as we've all started to democratize the creation of everything, people can make things that they used to not be able to make, you know. And we're seeing more of a democratization of IP too. Sure.

Speaker 2:

Which is helpful with that.

Speaker 1:

Yeah, yeah, it's just all crazy.

Speaker 2:

Yeah, and my whole thing is always, like, it's super easy to take a dystopian viewpoint or, you know, more of a, yeah, pessimistic... skeptical, not skeptical, but pessimistic...

Speaker 1:

A more negative outlook, like a Huxleyan or Orwellian kind of... Yeah, or just that we feel like we're on the cusp of something happening and we're just going to have a completely negative take on the whole thing.

Speaker 2:

It's just not... It's kind of like speculating on anything. It's just not a good use of my time, right, and it's not a good use of energy. Well, and we all... it's sexy, it's fun, but it's like, I'd rather just focus on the stuff that is...

Speaker 1:

Life is giving me life, like, it's getting me excited. And, well, I think these inflection points make us want to go to absolutes and go to extremes. We want, nope, it's all negative, nope, it's all great, instead of just going, look, just like everything before it, it's going to be a little bit of both. It's going to be some shit, it's going to be some awesome stuff. Absolutely. When photography was invented, everybody thought it was the end.

Speaker 2:

It's over, it's over.

Speaker 1:

It worked out okay. Okay, yeah.

Speaker 2:

Yes, and you get caught up in that. And then, I mean, you look back at how some people freaked out about things in the sixties and seventies and eighties and nineties, and it's like, you know, at a certain point you're just like, man, you've spent 60 years being in a state of just absolute upheaval. Yeah, that's not...

Speaker 2:

I don't want to do that, personally, and maybe some people operate on that level, that's the frequency that they need. Yeah, but personally, not for me. And there's a lot of things that really get me fired up and enthusiastic about, you know, just being a human. Yeah, like, we're here for such a finite amount of time, and there is no guarantee on how long that time is, and I'm like, I'd rather spend it doing a number of things than sitting here and, you know... Like, for example, being concerned... your cameras, you know, like you've been talking about, and this probably might feel like an abrupt right turn.

Speaker 1:

I just love that you texted me about a FireWire adapter and cable because you want to use a camera that records to DV tape and then digitize that stuff for some video work that you want to do. And the camera that started it all, it was the first camera, right? But yeah, like, the very first camera that I remember saving up my money to buy when I was a little kid, because I wanted to make little things.

Speaker 1:

There's something interesting about the... this sounds like math or something, but, like, the higher probability or likelihood that the thing that you end up making with these has a lot of soul, because you have so much effort and craft and work that goes into making it. You can't just, like we talked about the cloud, like, oh, I just recorded something, it's in the cloud, it's on my computer the minute I sit down. Like, you have to go find DV tapes, yeah, put them in there, record to it, connect the thing to your computer, ingest it all in real time and then edit from it. And while you're not, like, developing the film yourself, with the chemicals and all that...

Speaker 1:

With DV tape in particular, there's something about that slower process, the deliberateness to it, the inconvenience of it, the attention to detail. You've got to think thoughtfully about what your data rate is and what the resolution is going to be, you know, to get that right intersection between data leanness and high quality, all that stuff. And all those little decisions and all those little touch points and all those little inputs are gonna be brought to bear when you sit down to edit the thing. Yeah, there's going to be a different vibe working with that footage than there would be if you just had a device on you and you told a magic intelligence in the cloud to craft a video that feels

Speaker 1:

like it's from the early 2000s on DV tape and it's you talking to camera about whatever, or it's a travelogue, or whatever it is you're making.

Speaker 2:

Well, and we live in a world driven by stories, right? Mm-hmm. Like, you can watch something and, okay, it's pretty good.

Speaker 1:

Neighborhood's popping today, buddy. Yeah, neighborhood's alive. It is summer in the neighborhood. Holy cow.

Speaker 2:

We live in a world driven by stories. So the enthusiasm... I showed you a project, like, me and the team being creative about something, yes, making a creative decision, thinking through, okay, how does this work practically? A leads to B leads to C, what does D look like? So the enthusiasm that I come at you with when sharing the work is different than if it was just input into a language model. Now Maggie's coming to get in on the action.

Speaker 1:

I've got something to say about AI.

Speaker 2:

We're pretty much at time here.

Speaker 1:

We kind of have a hard out today. We have a hard out.

Speaker 2:

We got to go.

Speaker 1:

I mean, yeah, I think there were a lot of interesting things that were dropped in there, and thanks for indulging me with the AI conversation. I didn't come in quite hot, in a sense, but I watched that video today and was working on a script about it and all this stuff, and it definitely, like, took over a little bit. Yeah, yeah, mostly it was interesting. I mean, I was curious about some of your thoughts about it, even though you hadn't watched the video. So I appreciate you indulging me in that. Yeah, that was fun. Good times. Sweet.

Speaker 1:

It had been a golden afternoon, and I remember having the familiar conviction that life was beginning over again with the summer. And summer... thank you.
