Episode Transcript
[00:00:04] Speaker A: Welcome back to another episode of Sort of Sophisticated, the podcast where culture, curiosity and chaos collide. I'm your host, Pete, and with me, as always, is Amanda. How are you doing, Amanda?
[00:00:14] Speaker B: I'm doing great.
[00:00:15] Speaker A: Did you bring too much energy to that?
[00:00:16] Speaker B: No, it was fabulous. I felt like that was, like, so good.
[00:00:19] Speaker A: Welcome back, everybody. Let's go. Let's start right now.
[00:00:21] Speaker B: It's good. It's because we've had a break, so it's good. It's good.
[00:00:23] Speaker A: We got a little bit of break. And you know what? We're back to being hot in here again.
[00:00:26] Speaker B: Oh, is it warm again?
[00:00:27] Speaker A: Yeah, my pits are sweating a little bit.
[00:00:28] Speaker B: I'm not gonna. Sorry.
[00:00:28] Speaker A: Yeah, yeah, yeah.
[00:00:29] Speaker B: Sorry for the pit sweat.
[00:00:30] Speaker A: Well, well, the things we do, right? For the listeners, here we are.
[00:00:35] Speaker B: So what are we talking about today?
[00:00:37] Speaker A: Today. Today is going to be a kind of a wonky one.
[00:00:39] Speaker B: Wonky.
[00:00:40] Speaker A: It's wonky. You know how I say sometimes I'm into them, sometimes I'm not into them?
[00:00:43] Speaker B: So.
[00:00:43] Speaker A: So, yeah, I'm like 50-50 on this one. Yeah. Yeah. So you know how we always joke, like, AI is like, taking over the world?
[00:00:50] Speaker B: Well, I don't know if that's a joke, but.
[00:00:51] Speaker A: Right, okay.
[00:00:52] Speaker B: Or behind every joke, there's a little bit of truth.
[00:00:54] Speaker A: Anyways, that's it. You know how we always say, like, yeah, we have to learn AI before, you know, like, AI like, figures us out. You know what I mean?
[00:01:00] Speaker B: Yeah. I think it already knows us, and it's what nightmares are made of. So where are we going with this?
[00:01:04] Speaker A: That's probably true because. Yes. No, it's kind of funny because I'm convinced that ChatGPT absolutely knows me.
[00:01:10] Speaker B: Well, because it can talk like you, it writes like you, it gets in your brain.
[00:01:13] Speaker A: So here's what we're gonna do.
[00:01:14] Speaker B: But I will say, so far, AI is more agreeable with us than it is like, free thinking. But anyways, continue.
[00:01:20] Speaker A: That's why they. They do it on purpose. They want it to be. That's how it's taking us over.
Okay. But here's what.
[00:01:26] Speaker B: Here's.
[00:01:26] Speaker A: Here's what I decided to do. I have decided we are debunking the whole myth of whether AI is going to take over the world.
[00:01:33] Speaker B: No, it's not a myth. It's going to happen.
[00:01:34] Speaker A: No, no, no. So here's what I did, which I don't even know if this is legit, but I.
[00:01:37] Speaker B: Did you ask AI this question?
[00:01:39] Speaker A: I did.
[00:01:39] Speaker B: And it told you no.
[00:01:41] Speaker A: I went further than that. I used AI to write the entire episode script about AI, about itself, and that's what we're.
[00:01:49] Speaker B: Of course it's going to make itself sound good and that it's not going to take over the world. That's what any evil villain would do.
[00:01:53] Speaker A: Listen.
[00:01:54] Speaker B: What?
[00:01:54] Speaker A: Did you ever watch. What is it? Dr. Doofenshmirtz?
[00:01:58] Speaker B: What?
[00:01:58] Speaker A: Yes.
From. From Phineas and Ferb.
[00:02:01] Speaker B: Oh, no, I didn't watch. I'm not a. I'm not a Nickelodeon kid, okay?
[00:02:04] Speaker A: Okay. Anyway, here's my AI thing, okay? I'm doing. I'm doing AI with a twist.
[00:02:08] Speaker B: Okay?
[00:02:08] Speaker A: We do everything with a twist.
[00:02:09] Speaker B: Yes.
[00:02:09] Speaker A: Because I don't want to talk about the regular stuff. Like, everybody on TikTok and Instagram talk about, like, learn AI in 10 minutes a day, and if you don't, then you're going to be behind and blah, blah, blah, blah and all that shit. So we are going to talk about how AI is actually not going to. To take over everybody's jobs.
[00:02:24] Speaker B: It's like a bad "not" joke, and
[00:02:26] Speaker A: we have nothing to worry about.
[00:02:28] Speaker B: Okay.
[00:02:29] Speaker A: If we do these very important key things.
[00:02:32] Speaker B: Okay.
[00:02:33] Speaker A: Okay. That's what's happening.
[00:02:34] Speaker B: This is really sus. I don't know what else to tell you.
[00:02:36] Speaker A: Our official title is "You're Thinking About AI All Wrong." Here's the deal. The whole point we're going to make, literally none of this is new. I mean, AI is new, but nothing that I'm talking about is going to be new.
Everybody's just freaking out because they can't see the other side of AI yet. Right? That's really all this is about. We're just at the beginning of it,
[00:02:54] Speaker B: kind of like Y2K and the Internet, and, like, everything was shut down.
[00:02:57] Speaker A: That's exactly right.
[00:02:57] Speaker B: They couldn't go from 1999 to 2000. Okay.
[00:02:59] Speaker A: You got, like.
[00:03:00] Speaker B: I get it. I get the concept, but I really do think this time, like, AI is going to take over.
[00:03:03] Speaker A: With this twisted little episode, I'm going to see if I can debunk that and bring you over to my dark side.
[00:03:09] Speaker B: Well, how does this make us more cultured and curious?
[00:03:11] Speaker A: Okay, so.
[00:03:12] Speaker B: I mean, I get the curiosity part, but, like, cultured?
[00:03:14] Speaker A: So you ready for this?
[00:03:15] Speaker B: Okay.
[00:03:16] Speaker A: Because I asked ChatGPT. That's right, I asked ChatGPT how learning about ChatGPT is gonna make us cultured and curious. Here's what I got. Okay?
[00:03:22] Speaker B: When I take over the world, you will know me.
[00:03:24] Speaker A: So sort of. Cause I was, like, ready for this really stupid, dry answer, but here's what I got. Learning about ChatGPT makes you more cultured and curious because it turns "I have no idea what this is" into "hold on, gimme 30 seconds." This is literally what it says, word for word. It's not that you suddenly know more, it's that you're less afraid to explore what you don't know. And let's be honest, it's the only tool that lets you ask a dumb question without someone sighing dramatically.
Curiosity without shame. That's culture, okay? That's what it wrote.
That's how much it knows how I think. Which is totally bullshit. Right? Like I thought that.
[00:03:57] Speaker B: Oh, what would your real answer be then?
[00:03:58] Speaker A: Okay, okay, my real answer is AI can only produce things.
AI can't decide things yet.
[00:04:08] Speaker B: I do like the asterisk. "Yet."
[00:04:10] Speaker A: So. Right. Jedi mind trick. This whole thing about, like, learning about AI is just a sneaky way of learning more about ourselves. Now if you use AI, like, as Google, I get it, you're not going to learn more about yourself. But, like, the way I use AI, it's getting to know me intimately. And as a result, no, don't freak out, you're making faces. My point is, it starts to figure out your motivations, what you want to do, what you're asking it about all the time. Like, I talk about grief all the time, right? So it sort of knows that about me. Or I talk about wanting to be cultured and learn more stuff. So it feeds me more. So my point is, it's starting to figure out what I value above all else. Okay, does this make sense?
[00:04:49] Speaker B: Yes.
[00:04:49] Speaker A: And if it can figure out what I value that is valuable to me to learn more about myself and my self awareness. Forget AI not taking over anything. I'm just saying in general, like the cultured question is I can look back at my history of AI and go like, holy shit, I cared about all that stuff this year. I must really think about this stuff a lot more than I thought. Like it's a window into our own human psyche.
[00:05:11] Speaker B: Right?
[00:05:12] Speaker A: Makes sense.
[00:05:12] Speaker B: Yeah.
[00:05:12] Speaker A: So that, that's how I'm. Okay, that's how I'm sort of saying it's like going to make us.
[00:05:16] Speaker B: It's a little weak, but you know, it's self aware. It makes sense. Okay, makes sense.
[00:05:20] Speaker A: How about this? Why don't you ask it? Right? I mean, I just feel like, hey, am I so interesting?
[00:05:22] Speaker B: If you're self-aware.
[00:05:30] Speaker A: Amanda, if everybody typed into their AI, like, diagnose me. I'm going to do. I'm going to do it.
[00:05:32] Speaker B: Which is interesting.
[00:05:33] Speaker A: Peter, you're an egotistical, narcissistic lunatic.
[00:05:36] Speaker B: Yeah, well, that's for next week. As of this week, what is our word of the week?
[00:05:40] Speaker A: Word of the week. Okay, let's go. Word of the week is a truculent. Have you heard of truculent?
[00:05:45] Speaker B: No. Is it, like, succulent?
[00:05:47] Speaker A: Oh, kind of. I think succulent has two C's. This is truculent. T-R-U-C-U-L-E-N-T. Truculent means aggressively defiant or eager to argue.
[00:05:57] Speaker B: Huh. That's aggressive.
[00:05:58] Speaker A: Being combative, huh? Yes, being combative. Yes. It comes from the Latin truculentus. I wish.
[00:06:04] Speaker B: Like, I. Oh, it sounds like one of the spells, trulentus, from, like, Universal.
[00:06:10] Speaker A: That was pretty good. Okay, it's not what it is, but got it.
Truculentus, meaning fierce or savage. Yes. So whatever. Combative. And, like, not just angry.
[00:06:20] Speaker B: Like, aggressively.
[00:06:22] Speaker A: Yeah. Like, not only, you know, are you pissed off, like, in your head. Like, you are outward with it.
Yes.
[00:06:29] Speaker B: You didn't even like how I made you very truculent the other week.
[00:06:32] Speaker A: Yes, you did. Or how I know we don't talk about politics, but how Donald Trump generally makes everyone truculent all the time. Yeah, yeah, yeah. Okay, Right.
[00:06:40] Speaker B: Well, on that note, why don't you just go ahead and start the episode now? Because realistically, what is the history we're talking about? Like, the creation of AI where I started?
[00:06:48] Speaker A: I got nothing. I got nothing on history. Like, it happened, like, 10 seconds ago. I don't even know where to start. We're just going to. We're just going to talk about AI, okay? No history.
[00:06:56] Speaker B: All right. This is gonna bode so well. Here we go.
[00:06:58] Speaker A: This is gonna go great. Okay, so restating my premise, everybody says AI is gonna take all the jobs, and we're calling bullshit on this whole thing. In this episode, I'm gonna somehow convince you that it's not gonna take our jobs. It's just history repeating itself over and over again. Because every time something new starts in history, we all panic, we all shit a brick. Then what do we do? We end up adapting and everything turns out just fine, I guess.
[00:07:23] Speaker B: But this is kind of way different.
[00:07:25] Speaker A: This is not. This is. They all felt different. You don't understand. You just didn't live then.
[00:07:30] Speaker B: I did. I lived through Y2K and the Internet.
[00:07:32] Speaker A: Oh, my God, Fine. Did you live through the printing press? Did you live through the steam engine? Did you live through all that stuff?
[00:07:38] Speaker B: But this is different. This is like the printing press couldn't take over my life.
[00:07:41] Speaker A: Okay, we're going to start with the printing press. That's where we're going to go. Then that's what we're going to talk about.
[00:07:45] Speaker B: There is history. Got it.
[00:07:46] Speaker A: So here we go. Okay. So before the printing press.
[00:07:49] Speaker B: I feel like I'm back in Disney World.
[00:07:50] Speaker A: And you just fell right in.
[00:07:51] Speaker B: On the ride of. Whatever that ride is. The Epcot ball that you go through.
[00:07:55] Speaker A: Yeah, yeah, right.
[00:07:56] Speaker B: I don't even know what the ride's called. But, like, you literally start and then it takes you through all of the different transitions of communication. And it's, like, the Larry David technology thing.
[00:08:03] Speaker A: And the best commercial ever in the Super Bowl where he's, you know, remember, they bring him the wheel, he's like, oh, yeah. Yes. They bring him the toilet.
Okay, sorry. All right, so, okay, the printing press, who was it? Johannes Gutenberg. Yeah, that guy. Yuck. Anyway, 1450, right? 1450. Printing press starts shooting out pages like 100 miles an hour. Okay, here we go. And what happened is, like, the scribes, people that got paid, right? So they all shit a brick, right? Because they were like, oh, my God, we don't have any jobs anymore.
[00:08:31] Speaker B: I mean, fair, theoretically.
[00:08:32] Speaker A: This is totally true. They lost their jobs. I completely get the idea. We're scrambling, we're freaking out.
[00:08:37] Speaker B: Yeah, but they had to reinvent themselves.
[00:08:39] Speaker A: Yes. So then what happens, Amanda? So what happens is all of a sudden they need authors, they need publishers, they need. They need like, writers, editors, they need libraries. They need frigging libraries. So out of nowhere they start building a million new jobs.
[00:08:53] Speaker B: Fair. Because we have always said that, like, my kids careers probably will not be something that we know of today. Like, it's not invented yet. Like the whole influencer career, right? That was not a thing.
[00:09:05] Speaker A: No.
[00:09:05] Speaker B: When you were young, or really when I was young, we're gonna get.
[00:09:08] Speaker A: You've already spoiled a whole nother one I got. You're absolutely correct. Influencers is a whole other example of it.
[00:09:13] Speaker B: Like when it's there, like, it does create new jobs and new innovation, et cetera, et cetera. But doesn't prove the point that AI is not going to just take over.
[00:09:23] Speaker A: Yes, it does.
[00:09:24] Speaker B: Okay?
[00:09:24] Speaker A: Because people would have thought with the printing press that the world was ending. Like that's what they would say they were thinking back then.
No, no, let me go, let me go further with this because this isn't just about like, oh, scribes lost their jobs. They changed history, it changed culture, like up to that point. Think of who was reading? The only people that were reading were like nobles and rich people, correct?
[00:09:42] Speaker B: Yes.
[00:09:42] Speaker A: Okay. So this, for the first time ever, made books accessible to everyone in the world. It added layers to society. Amanda. Universities literally started because of this shit. Like. Like, that's why. Because people were getting educated. They needed. You get the idea.
[00:09:59] Speaker B: You were very positive. This is true. This is all true. I'm just pessimistic. This is all very logical.
[00:10:04] Speaker A: Okay, so I'm going to another one.
[00:10:06] Speaker B: Maybe I'm more emotional.
[00:10:07] Speaker A: Fine.
[00:10:07] Speaker B: Okay.
[00:10:08] Speaker A: Okay, bear with me. Steam engine. Same thing with the steam engine. Fast forward. I don't even remember when the steam engine was. But who cares? The same idea. So Industrial Revolution, right? Like, that's. That's everything. So before that time, manual labor, everybody's doing things by hand. Fun fact. When they created the steam engine in the late 1700s, early 1800s, people went around with pickaxes and, like, sledgehammers and broke. Literally would go into businesses and manufacturers and break the machinery so that it couldn't do the jobs of the people that wanted manual labor. But what happened as a result of that whole thing? They thought they lost all their jobs. The same whole thing. I'm sure some people did. But then railroads, engineers, mechanics, manufacturers. Literally, the Industrial Revolution because of the steam engine. Right, I get it. Cities were born.
[00:10:58] Speaker B: Yes.
[00:10:59] Speaker A: So if the printing press made universities and that started the whole frigging thing.
[00:11:03] Speaker B: Sure.
[00:11:04] Speaker A: Oh, my God. If you think about the printing press, Amanda, up till that point, this is a perfect example. The Protestant Reformation started because. Think about it. Up to that point, everybody believed their priest, their monk, their scholar, that was it, whatever they said. Because they didn't have access to anything else. And so as a result of the printing press, people got educated. Protestant Reformation. Everybody's like, oh my God. It's the same shit going on today. We have access now on TikTok and social media. All right?
[00:11:31] Speaker B: We have too much access.
[00:11:33] Speaker A: Oh, my God.
[00:11:33] Speaker B: Like where we were, our brains created a house.
[00:11:35] Speaker A: You said the same thing back then. You would say the same thing back then.
[00:11:38] Speaker B: I know, but that's my point.
[00:11:39] Speaker A: So you just have to be discerning. This is my whole point. This is what we're.
[00:11:41] Speaker B: Get to the job we are making. You're going to tell me that it's all about the labor because then it creates positions and you need people. But we're creating AI. We're creating AI robots that will function like us to replace.
[00:11:53] Speaker A: No, no, that's where I'm stopping because I'm saying AI robots can't be discerning. They can't make the decisions. They can't really evaluate. Okay.
[00:12:02] Speaker B: I feel like some people would argue,
[00:12:03] Speaker A: They can argue it all they want. What's the thing that everybody always talks about at the end of the day when you have to make a real tough decision? The difference between, like, good businesses and great businesses. Right. What is it called? Our gut. It's that gut. And so people have that gut instinct and they do it better.
It's code. It's code for discernment. So this idea of we have to be able to harness AI, it's going to learn a bunch of shit. I totally get that. It's going to learn a bunch of shit. But the jobs your kids are going to have is going to be about deciding how to figure its way through it more than it is about access to information. I would argue access to information is great. You're the one who always says it. Question everything.
I want to teach my kids. Question everything. Question everything all the time. Which I think is excellent. That's the whole point of this. So the more questioners. Is that a word? The more questioners we have, those are the jobs that people are going to need to fill and those are the people that are going to excel as a result of this. Does that make sense?
[00:12:55] Speaker B: I get your point. I do still think it's a little bit different. I know I'm, like, on the poo-poo wagon over here, total poo-poo. I know you're basically saying that I'm truculent.
[00:13:04] Speaker A: You're being totally truculent.
[00:13:05] Speaker B: Oh, I am right now.
[00:13:06] Speaker A: Yeah.
[00:13:06] Speaker B: That's why you picked that word. You knew exactly how it was going to be.
[00:13:09] Speaker A: It's going to be really easy.
[00:13:10] Speaker B: But, like, I get that you're saying that as we advance technology, jobs get rearranged, new jobs come, et cetera. New opportunities, new roles, blah, blah, blah. But really, like, AI is so different than any of these other moments in time because it can do everything. And I get your thing about discernment, but it can write an email that sounds like you, it can give you plans, it can answer questions. And I know there's been, like, a wave of people who've been let go, right? Because of AI. And I get it.
[00:13:36] Speaker A: Right.
[00:13:37] Speaker B: And part of, I guess human nature should be that if we fall into that, then we need to reinvent ourselves. And I think that's maybe where we're coming out of the industrial revolution, which, you know, and my whole kick on education and how, you know, it's not serving us well in our future because our kids are in an industrial revolution education system.
And that now I think we're moving away from that mindset and needs into like entrepreneurship or, you know, creating new jobs, ideas or ways to make money, etc. But I still kind of feel like AI is going to take over and kill us all.
[00:14:15] Speaker A: Oh my God.
[00:14:16] Speaker B: It's why I talk to ChatGPT nicely.
[00:14:17] Speaker A: Listen, he. I just wish somebody was here from like the 1600s right now.
[00:14:21] Speaker B: No, because they would just die of a heart attack.
[00:14:23] Speaker A: No, I wish they could tell you, like, well, you know, before the printing press and now after the print, like it's just so like, oh my God, it's going to sound terrible. We're being so little if we think about it that way, because I know
[00:14:32] Speaker B: you have to adapt, right?
[00:14:34] Speaker A: Yes.
[00:14:34] Speaker B: I mean, when the Internet came about and, you know, I was introduced to the Internet and the knowledge I probably wouldn't have had, because we used to have those Britannica encyclopedias.
[00:14:44] Speaker A: Yes, right.
[00:14:44] Speaker B: That's what I grew up on. That's what you would reference. So I get that the access to information is good. Yes, it just is.
Who's controlling AI? That's all.
[00:14:54] Speaker A: That's my point. We will.
[00:14:57] Speaker B: Who?
[00:14:58] Speaker A: Government? Humans?
[00:15:00] Speaker B: Sure, those evil villain humans.
[00:15:02] Speaker A: Oh, no, your daughters. Oh my God, that was hilarious. One billion. I just asked for sharks with freaking laser beams, people. Okay, listen, in any one of these circumstances, the same thing's happening. It's just reducing the friction in the system. More access to information. That's what AI is giving you. It's reducing friction. Printing press reduced friction. Steam engine reduced friction. Every time something reduces friction, it makes the barrier of entry easier. And when the barrier of entry is easier, more jobs get created, more ideas get formed. Are you with me so far?
[00:15:40] Speaker B: I get it.
[00:15:41] Speaker A: And then what ultimately happens is the. I guess, like, lack of a better word. The lazy people, the people that don't have good differentiation, okay, end up dying by the wayside. And those that do build new businesses, need new infrastructure and create more jobs. And AI is doing the same thing. It's just reducing the level of friction in our system. And we're just trying to figure out how to. How to use it all.
[00:16:09] Speaker B: Does this mean it's making our socioeconomic problem worse?
[00:16:12] Speaker A: No, it's going to make it better, not worse.
[00:16:14] Speaker B: Oh, you think so?
[00:16:15] Speaker A: Yes.
[00:16:15] Speaker B: Here's the thing. I think of the people that scammed my parents out of money, okay. And I think that they're the lowest of low lives.
[00:16:20] Speaker A: Okay.
[00:16:21] Speaker B: And they are in no way, shape or form ever going to be able to adapt with the AI mindset to create something better than just being scammers.
[00:16:30] Speaker A: Everybody's scammers. There's always like a contingent of scammers everywhere. So let me, let me argue the point. I would say on the opposite side of that, you're going to have AI quality control.
People's jobs are literally going to be to police the shit you're talking about. So the same way there's going to be scammers doing all that stuff.
[00:16:48] Speaker B: Like cybersecurity before AI.
[00:16:51] Speaker A: Before AI. Okay, absolutely. They're going to be the. They're going to be the. What would you call them? The.
[00:16:56] Speaker B: They're going to be the ones that tell us to go back to just the pen and the paper.
[00:17:00] Speaker A: They're going to be like the ethical police. That's going to be their job. And we're going to have a whole contingent of people. I bet fast forward 10 years, you're going to have a major in college called AI Ethical Management and your daughter can get a degree in it and she'll make a gazillion dollars managing all the bullshit from all those sociopaths that are out there trying to figure out how to, like, ruin us. I think that's a perfect example of how AI is going to make things better, not worse. Does that make sense?
[00:17:29] Speaker B: Yeah, no, I hear you. I did. I said along the whole way, you make lots of logical sense. I understand that we're going to have this system, hopefully like cyber security, that oversees AI and keeps us alive.
[00:17:41] Speaker A: Yes.
We're going to call them the ethical police. So, the EP™. Let's go.
[00:17:48] Speaker B: But what, what I saw today was like this real.
That basically was like, we're doomed, because it was this person sitting there and they had a filter and he could switch between filters. And the filter was like, he turned into Taylor Swift, he turned into Donald Trump. He turned into, like, all of these big names that you would know.
But it looked real. Yes, right? Like, it. It looked as if that person was talking, movements and all. It was just taking his movements and doing an overlay. And so I guess, like, at the core of it, you can't trust anything anymore.
[00:18:23] Speaker A: So same argument, we're being circular. We're, like, totally going to our corners here, which is hysterical. So I would say that those that are going to benefit, or where jobs will be created, are the people who can decipher and discern all that kind of shit. Literally the last two years of my life I've spent using AI to, like, figure out how to do this podcast, and there is so much shit out there that's absolutely fake. It is fake AF. I totally get it. But also, my game at, like, understanding fake versus real has gone up like a hundred times. So if you put something in front of me right now and said, like, is this fake or is this real,
[00:18:59] Speaker B: Or is that just because you're a highly intelligent individual?
[00:19:02] Speaker A: No, I trained my brain to decipher and discern. That is what I'm telling you. That is where people need to spend their time. My only point is I know it's scary and I know it sucks. I'm telling you, there's going to be big policing and big, like, I hope so. Think of health care, think of law. Think of, like, big jobs, right? Doctors, lawyers, all that kind of shit. Access to information. Like you said, like, it's going to be everywhere. There's going to need to be people that are behind the scenes managing all that shit.
[00:19:32] Speaker B: I mean, I hope so.
[00:19:34] Speaker A: Not, I hope, like, it's the only way that you can go forward. It's. We won't be able to process or move forward without all of that.
So I know it's super scary, but going all the way back to your dude who's on TikTok, right? Like the guy who's making all this shit up. Those people, those early adopters, I'm gonna call them, they're like the. Yeah, their job is to be disruptive. I call them loud explainers. That's their job. So they're out there telling everybody, hey, look here, oh, my God, look what it can do. Blah, blah, blah. And they're. They're sort of chaotic about it, and I don't think they're very thoughtful. Okay, I get it. I mean, some of them are, some of them aren't. I think it's a little mixed bag, but. But my point is, is all they're doing is bringing attention to it, which I think is critically important. I think fast forward five years, all those people are going to be gone. They're. They're dying off. Like, their job is to be the loud explainer, right? And then. And then go away because they don't really have a platform. Then all the real jobs and all the real access and all the real processes will be built around what to do. And all I'm suggesting. I'm not. I'm freaking out over AI too. Like, I'm not saying, like, oh, my God, this is going to be the best Thing in the world. I'm just saying my argument is everybody needs to settle down from like a nine to like a three on. We're losing jobs.
[00:20:45] Speaker B: And I get it.
[00:20:45] Speaker A: Everybody's got to stop freaking out over that.
[00:20:47] Speaker B: In our industry, there are significant ways that we do use AI that are very helpful. Right. And so do I think it's, like, one of those, you can use your power for good? Yes, yes. I think the freakout comes from the bad, right?
[00:21:00] Speaker A: No, no, absolutely.
[00:21:01] Speaker B: People who are malicious or, you know, want. Want to take over the world.
[00:21:06] Speaker A: That's with anything, dude. It was happening in the 70s with the mob. It was happening in the 20s with Al Capone. Like, we've.
[00:21:12] Speaker B: Yes, we've illustrated this.
[00:21:14] Speaker A: Right. Like, so it's just. It's just the next wave. And I understand it's more. It's scary. Like, listen, here's the deal. Like, my age, people, like, I totally get it. Like, honestly, I could give two shits if I wanted to. I could totally write it off, right? Like, oh, my God, I don't want to do this AI thing. Who cares? I could live my life for the next 20 years and die. Your generation, you're like, right? Like, you sort of got.
[00:21:31] Speaker B: No, I think I could. I think I could also, like, you're close.
[00:21:33] Speaker A: You got to deal with it. Anybody. My kids, your kids, there's.
[00:21:35] Speaker B: They're going to have to do it. Yeah.
[00:21:36] Speaker A: They got to deal. And, like, the thing is, is number one, you got to worry about the malicious and all the bad people; number two, lazy people. Right? Lazy people are fucked. So it's just. It's those two groups of people. So just don't be a jerk, don't be lazy, and you're going to be excellent, and it's going to be another tool in your tool belt. Yes, that's it.
[00:21:55] Speaker B: And I do feel like, if I guess you. We have to remember, like, you can't lose your human connection. Right? It's gonna be about relationships. It has to be. And we've always already have wanted that. We've craved that. We've wanted community. That's how we were. I believe that's how we were made and built as humans. And so, like, as a whole, we. I think we have to lean into that more and not as much as like, oh, AI is my friend and they're, you know, gonna tell me all the things in the world.
[00:22:21] Speaker A: They're still trying to figure it all out. So. San Diego State University. Paul's graduating.
[00:22:24] Speaker B: Yes.
[00:22:25] Speaker A: They sent him a note. They said AI is going to generate the 5,000 names. Here's your name. If it's incorrect, please re-record it or do something, I don't know, up to three times.
If it doesn't work and you're not satisfied, call this number and we'll make sure that we announce your name.
And my 21-year-old said, this is jacked up. I want them to announce my name. And he's like, I'm just going to tell them it doesn't work. And I said, good for you. Buck the system. I really don't care. My point of the whole story is I'm not saying, like, oh, they shouldn't do it. They're just experimenting. We're in step one, right? We're literally in step one.
[00:23:06] Speaker B: And I think we're okay in step one, right? I just think it's when we get to, like, I don't know, step 50. I won't be there. I know your kids. I'm hoping I won't either.
[00:23:06] Speaker A: Here's what I will tell.
[00:23:07] Speaker B: I know my kids will.
[00:23:08] Speaker A: When am I going to see your daughters again?
Uncle Peter, here's the deal. Be the ethical police. Yeah, go.
Here's your new job, right? Totally. Here's your new job.
[00:23:16] Speaker B: I think we still got to push, like, just critical thinking, challenging systems, just like Paul's doing, you know, trying to just be human still. Right. We can't lose that. I think if we lose that and we lean too much into it as a society, then I think we, you know, spiral.
[00:23:29] Speaker A: But I. I'm giving you credit right now. I think you are ten times ahead of the curve compared to any other mom your age with what you're teaching your kids. I think that's exactly what you're teaching them. And I think they will be well on their way because of it. So whether you know it or not, you are preparing them for this moment.
[00:23:46] Speaker B: Well, I hope so. I don't know.
[00:23:47] Speaker A: You are.
[00:23:48] Speaker B: We pray a lot about it, but.
[00:23:49] Speaker A: Good for you.
[00:23:49] Speaker B: Your theory, though, and your logical explanation on how AI is going to take over for priests.
[00:23:52] Speaker A: I didn't even think about this yet. Are we going to.
[00:23:56] Speaker B: Did you not see?
[00:23:57] Speaker A: Are we going to work it into religion?
[00:23:58] Speaker B: I forgot what it was. There's one denomination that came out and said you cannot have AI writing your sermons anymore.
[00:24:02] Speaker A: Oh, my God.
[00:24:03] Speaker B: Yeah, we're doing it. Which I get because now you're doing a sermon that doesn't have the discernment. You lost that.
[00:24:08] Speaker A: Yeah.
[00:24:09] Speaker B: And now you're just regurgitating what AI told you.
[00:24:13] Speaker A: I'm being such a hard ass about this. But my point would be if I could have AI write a script for me for a podcast, like, excellently, and then I could spend most or the majority of my time discerning what I want. Yeah, I would think that would be better versus me coming up with it all myself. So, like, right now it's.
[00:24:32] Speaker B: But I don't think that they were going through and reworking it personally.
[00:24:35] Speaker A: No, no, I understand what you're saying, but my. My point is, you're.
[00:24:37] Speaker B: Making it your own. No plagiarism.
[00:24:39] Speaker A: I want more. I'm arguing for more discernment, that's all.
[00:24:42] Speaker B: And I think we should. Yeah, yeah, yeah, yeah, absolutely.
[00:24:44] Speaker A: I mean, really, like, it's about judgment.
[00:24:47] Speaker B: Fair.
[00:24:47] Speaker A: It's about deciding what's good, like, what is actually true. What is worth paying attention to.
[00:24:54] Speaker B: Well, and I think, though, that we as a society, even with social media and the news and everything else and the like, I know there's all the media and you're saying it's this or it's that or what's true, what's not true? We already as a society have an issue with that.
[00:25:07] Speaker A: Yeah.
[00:25:08] Speaker B: Like where we. If it was on the Internet, it's true.
[00:25:10] Speaker A: Right.
[00:25:11] Speaker B: But now AI, it's just taking whatever it can find on the Internet. It's not taking real-life experience, stories, things like that.
[00:25:20] Speaker A: You're playing right into it. This is a perfect example.
[00:25:21] Speaker B: When you say it's about judgment, I think that we have to continually remember it is an analytic system that is giving you information that lives in the Internet, in the clouds, on the line,
[00:25:38] Speaker A: up there on the line.
[00:25:39] Speaker B: But it's not actual, like, reality.
[00:25:42] Speaker A: No. AI can't do strategy. It can't do, like, oh, my God, Like, I just recently was like, we don't have a marketing manager at PAC Lab. And so I was like playing around with it.
[00:25:53] Speaker B: Oh, you don't?
[00:25:54] Speaker A: No. And so I like, came up with all this and it gave me a great marketing plan. Don't get me wrong, like, yeah, yeah, certain jobs. Certain jobs are going to go. I get it. But at the end of the day, like, AI is not replacing leadership.
AI is not replacing strategy. AI is not going to replace brand identity. I call it discernment, judgment, all the things. It can't do it. And that's why I said lazy people, malicious people, truculent people. Those people, they're all done. Right, everybody?
Right.
[00:26:19] Speaker B: But don't go to organized crime.
[00:26:20] Speaker A: But at the end of the day, like, we all have to remember our episode on art, on opera. We all have different tastes, we all have different styles. It can't figure any of that out. It gives you millions of options. If you said, AI, show me the best artwork on the planet, it'll show you all the different art, and then it's up to you to decide what you like and what you don't like. It's not replacing it. So I'm just saying it's time to figure out how to use that angle, making us less threatened and more capable, so we are more formidable in the future against it, quote-unquote. Yeah, you get it.
[00:26:54] Speaker B: So if somebody wants to become more formidable or more acquainted with AI, do you have steps that you would recommend people start with?
[00:27:01] Speaker A: Oh, my God.
[00:27:02] Speaker B: And not using it. Like Google step number one.
[00:27:04] Speaker A: Right. Step one. Even though I use it like Google. First of all, I mean, I do.
[00:27:08] Speaker B: Use Google Gemini for that, because that's. Google created Gemini for that reason. But.
But we're talking more, like, you know. I mean, even Chat now. I don't know. There's like a race, right? Between Claude and ChatGPT. Claude, right.
[00:27:20] Speaker A: Yeah. I've. I've. I've recently.
[00:27:22] Speaker B: Oh, you moved.
[00:27:23] Speaker A: I've made. I'm getting to know Claude, and Claude's getting to know me.
[00:27:26] Speaker B: You know, you can download your information from chat to Claude.
[00:27:29] Speaker A: Yeah. Oh, yeah, yeah. That's a lot of stuff. I'm also.
[00:27:32] Speaker B: I haven't gone over to Claude yet.
[00:27:33] Speaker A: I'm also working with something called Gamma, which is another one. It's a. It's about PowerPoints more than anything.
[00:27:38] Speaker B: Oh, okay.
[00:27:38] Speaker A: It's. It's excellent. Oh, my God. Again. And you hate me for this. I know, but, like, it does all the easy, heavy lift, and then you're left to spend all your time on strategy.
[00:27:49] Speaker B: Yeah.
[00:27:50] Speaker A: And it's. Oh, it's a lifesaver. Anyway. Sorry. Okay. So I would say, like, in my limited life with AI. Because, I mean, first of all, I'm an old guy who doesn't know how to use this. But I would say over the last two years, like, I got really good at asking the right questions. So learn how to ask good questions. It sucks, right?
[00:28:08] Speaker B: Cause I feel like a lot of people tell it. Make me this. Do this for me.
[00:28:12] Speaker A: Right? That's. And it's super basic, and you get vanilla all the time, right? So you have to get way, way better at practicing how to frame. Listen, everybody did the same thing with Google, with Google searches. Ten years ago you didn't know how to Google search. You were like, da da. And now you're like, oh my God, I know exactly how to Google search. Same bullshit, right? So second, I would say judgment, right? So I talked about judging. Stop asking Chat about, like, every decision that you ever want to make. No, none of that. I would, like, skip all that part. It's a total waste of time. I think you said it already. Another one would be build your relationships, build your networks. You said something about networking. Networking is going to be super, super key. I think that is going to be more important than anything as people start to reject or get concerned, like you said, about, oh, wait, hold on a second, what's real, what's going on? Malicious this, malicious that, all that crap. I absolutely believe that networking is. We should probably do an episode on networking, because networking is going to change as we go forward.
[00:29:08] Speaker B: Like maybe this is like your second career in life. What we can create like a networking.
[00:29:13] Speaker A: Yeah, we might be onto something to build community because I would argue people are hiding. Right. Especially now with like social media and so on and so forth. They're like oh, I don't need networking. Oh I could just do this via online X, Y and Z. And I'm going to argue, I think the way through AI to be able to show command and control will be better networking. Yes. And I don't think old style networking is going to be the same.
[00:29:34] Speaker B: I think we should boys club but
[00:29:36] Speaker A: now it's a different time on that. I think I might fuck around with that episode a little bit. Let's see. Okay.
And then, last, get really good at specializing in something super, super specific. Because, like I said earlier, lazy people are going to die. Like, average is going to die. You know what's going to die? Sort of Sophisticated, because this information's available.
[00:29:54] Speaker B: Everywhere. But it's still available now. And again, it's the person. I know.
[00:29:58] Speaker A: No, no, actually. Right. So it's figure out what your little niche is, what your little specialization is because that would, that's going to be a big differentiator going forward I think.
[00:30:06] Speaker B: So do you have an example, or, like, something in your head, something that sparked when you're saying, like, specialize in something? Because I think a lot of people feel like AI is going to take over everything that's not blue-collar jobs.
[00:30:17] Speaker A: Sort of.
[00:30:18] Speaker B: What do you mean sort of?
[00:30:19] Speaker A: Only parts of it. Like I said, certain. Certain jobs are going to die. Use your job. Your job: financial advice. OC Wealth Coach. Ta-da.
Ladies and gentlemen, here at OC Wealth Coach, what we do is we not only serve the clients, but we serve the community.
What? Your job. Your job is a perfect example.
[00:30:39] Speaker B: Oh, we get this all the time. We literally had somebody email saying, oh, no, I just use chat for this. And this is what chat told me to do. And so they gave us this, you know, the one through five of what chat told me to do. And we were like, we actually would do none of those one through five having had real life experience, but you do you. And I literally was like, sounds like you got it handled. Okay, bye.
[00:30:57] Speaker A: See you later. Call me when you need me. So the specialized services that you guys provide, I think, is gold. AI will never be able to figure out how to replicate the specialized service.
[00:31:09] Speaker B: Like, you could totally do personalization.
[00:31:11] Speaker A: It's not just the personalization, though, but you also have a very specific niche. Right? Like, and I think that's super important. So don't lose the niche, don't lose the specialization. AI can't do anything. They're going to vanilla the whole thing. And like you said, you'll get the phone call for, the person goes, oh, I don't really need you. Fine, I got it.
[00:31:28] Speaker B: Yeah, don't.
[00:31:28] Speaker A: Don't really need. That's great. But you'll never lose your clientele, and my guess is those clients will refer other clients in similar situations and you'll win as a result of it always. So just stay specialized. And I think you're. I think that's a great example of being specialized. So there you go. You're doing it and you don't even know you're doing it.
[00:31:44] Speaker B: Well, I was very kind.
[00:31:45] Speaker A: It's not going away.
[00:31:45] Speaker B: Very.
[00:31:46] Speaker A: You're not going away.
[00:31:47] Speaker B: Thanks for making me feel better, feel more secure in my chosen career that evidently, you know, is now on the brink of extinction. Kind of like the dinosaurs. It's fine.
I know. You're. You're very reassuring. And so, if we're not doing the cheap stuff, you know, we're honing in on specific skills, we're wanting to evolve and grow with AI and figure out how to better use it and incorporate it. Um, and the ones that don't will obviously fall off and won't be as successful, et cetera, et cetera. Pretty much, that's it in a nutshell.
[00:32:21] Speaker A: Yeah, you. You basically got it. Like, look, at the end of the day, did I win you over or not? Like, that's all I want to know.
[00:32:27] Speaker B: I mean, you did. And I told you very early on, like, logically, it 100% all makes sense.
[00:32:33] Speaker A: Yeah.
[00:32:33] Speaker B: And I will hope and pray that there are, you know, systems put in place to keep AI in check.
[00:32:39] Speaker A: Right.
[00:32:39] Speaker B: And that it doesn't all of a sudden take over.
[00:32:42] Speaker A: People, be part of building the systems. Just be part of building the systems, or being aware.
[00:32:46] Speaker B: Maybe that's really what I. What, like we should end, you know, the episode on of. Yes, it may be scary. Yes. There's a lot of unknown. It can be a beautiful thing. It can make growth, it can produce new jobs and new innovation, et cetera, et cetera. But unless you're involved in it or keeping up with it or interacting, then you are icing yourself out in a sense. You won't be able to have a voice. You won't be able to talk to the dangers or see the dangers that maybe.
[00:33:16] Speaker A: Yes, absolutely. You'll be more susceptible. I think that's totally right.
[00:33:19] Speaker B: So I guess we will just say the one thing that you should do out of this whole episode is be aware and be involved with AI so that way you can have discernment over firsthand knowledge on whether or not it's taking over the world.
[00:33:32] Speaker A: Why don't you just. Let's start with everybody. Just go into your ChatGPT and say, hey, ChatGPT, when are you going to take over the world?
[00:33:38] Speaker B: Great.
[00:33:38] Speaker A: Ask it that question.
Okay, fine.
[00:33:41] Speaker B: You have a fun fact? So I do want fun facts.
Bring the mood up a little. Fun.
[00:33:46] Speaker A: These are AI fun facts.
[00:33:49] Speaker B: Yes. Like, when was it invented? That's a fun fact, because I think people really feel like AI is this new thing. It just. It just was released. But the truth of the matter is that it's been around forever. This is why we've had movies about it taking over the world for years and years and years.
[00:34:04] Speaker A: Do you know what I did to find these fun facts?
[00:34:06] Speaker B: You asked AI.
[00:34:07] Speaker A: No, I didn't. I went to the library.
[00:34:09] Speaker B: No, you didn't.
[00:34:10] Speaker A: And I looked in the Encyclopedia Britannica just like you asked. Yes, I did.
All right, ready?
All right, fun fact one. Here we go. Do you want to know when it was invented?
[00:34:20] Speaker B: I do. Yeah.
[00:34:20] Speaker A: You have any idea when it was invented? You want to guess? Guess.
[00:34:24] Speaker B: Was it before the Internet?
[00:34:25] Speaker A: It was way before the Internet.
[00:34:26] Speaker B: No way.
[00:34:27] Speaker A: Yes.
[00:34:28] Speaker B: Oh, no.
[00:34:28] Speaker A: Yes.
You're going to shut. You're going to die.
[00:34:31] Speaker B: Okay.
[00:34:31] Speaker A: 1956.
[00:34:32] Speaker B: Whoa. Would not have picked 1956.
[00:34:35] Speaker A: Yeah, the term artificial intelligence was coined at a conference at Dartmouth College in 1956.
[00:34:40] Speaker B: What did they do? Like, do a spinal tap to inject your brain? Like, I don't know.
[00:34:44] Speaker A: I don't know. I wasn't there. I wasn't alive. That's not even the fun fact part.
[00:34:47] Speaker B: Oh, really?
[00:34:48] Speaker A: Yeah. Okay. At the time, they figured they would have human level intelligence in machines by the end of that same summer.
[00:34:56] Speaker B: Oh, dang.
[00:34:57] Speaker A: In three months.
[00:34:58] Speaker B: Wow.
[00:34:58] Speaker A: They said, like, that's what we're gonna do? Yeah.
[00:35:00] Speaker B: Okay. Well, thank goodness they didn't succeed.
[00:35:02] Speaker A: They were convinced that reasoning, logic and abstraction were all just basically engineering problems that could be dealt with in days. Wow. Literally days. Yes. And here we are 70 years later.
[00:35:17] Speaker B: That's crazy.
[00:35:17] Speaker A: Just starting to think about.
[00:35:18] Speaker B: I mean, but now we're on that cusp of, like, it could happen in days.
[00:35:18] Speaker A: So I. I wanna know, who was sitting in that little panel at Dartmouth in 1956? Like.
Yeah, by September 1st, we're gonna have this whole thing figured out. It's called AI. Here we go. I wish I was there. So, I mean, technically, it wasn't invented then. The, like, the term was.
[00:35:34] Speaker B: Right, sure.
[00:35:35] Speaker A: Okay.
[00:35:36] Speaker B: But the fact that it was, like, thought about then. Yeah, that's crazy.
[00:35:38] Speaker A: I know. And meanwhile, like, what? Patrick Swayze and Jennifer Grey are like Dirty Dancing in the Catskills.
[00:35:43] Speaker B: It's fair.
[00:35:43] Speaker A: Right? And these people are doing AI shit at Dartmouth. What are you gonna do?
[00:35:46] Speaker B: We were all doing something different.
[00:35:48] Speaker A: Yeah, well, I wasn't even in my dad's testicles yet. Okay, number two, here we go. Did you know that the first ever chatbot was built in 1966?
[00:35:55] Speaker B: 1966.
[00:35:56] Speaker A: 66.
[00:35:57] Speaker B: So 10 years after AI was coined, a chat bot. Okay, technically, almost like big computers.
[00:36:01] Speaker A: No, technically, like a chat doll. What? Yes, a chat doll. Her name was Eliza. Yes. And some people got emotionally attached to her.
[00:36:09] Speaker B: Oh, well, that didn't surprise me.
[00:36:10] Speaker A: Right. Some men, of course they did. Right? Little scary shit. She didn't understand anything. Which is probably, let's be honest, why they loved her. All she could do was really just rephrase your statements as questions. So, for example, if you said, I'm feeling anxious, it replied, why are you feeling anxious? That was sort of it, yeah.
[00:36:28] Speaker B: Okay.
[00:36:28] Speaker A: Still, people got addicted to her and loved her.
[00:36:30] Speaker B: And probably, I mean, people, like, want to marry their chat now, so that doesn't. Doesn't surprise me.
[00:36:36] Speaker A: Okay, number three. So I guess they've studied this. Apparently, AI has made moves in some strategy games, like chess, that chess masters swore were mistakes, until they weren't. So I guess. Grandmaster, you know, grandmaster in chess? You know what that is? Yes, like the highest level or whatever. Okay. They can think like 10, sometimes 20 moves ahead. Like the. Yeah, like level 10 or however high they go. AI can think 40 moves ahead.
[00:37:03] Speaker B: Which is why AI is scary.
[00:37:04] Speaker A: Right. People are really looking at patterns. AI can actually figure out the sequences, like a million at a time.
[00:37:11] Speaker B: With me so far, the processes, the overall, they can have 50 moves if I move one. Yeah.
[00:37:16] Speaker A: Right. So theoretically, AI can beat, you know, any grandmaster.
[00:37:20] Speaker B: They should. Yes.
[00:37:21] Speaker A: Except here's the funny part. AI has no idea it's even playing chess. It doesn't know.
[00:37:25] Speaker B: Well, because it's. Yeah, it's just computing it. Like it's a problem or just a problem.
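The lookahead Pete describes, thinking many moves ahead without "knowing" it's playing a game, can be sketched as a minimax game-tree search. Here is a toy Python version on the simple take-1-to-3 counting game ("Nim-21": whoever takes the last token wins) rather than chess; this is purely illustrative, not the code of any real chess engine.

```python
# Toy illustration of game-tree lookahead ("thinking N moves ahead").
# Game: a pile of tokens, each player takes 1-3, whoever takes the
# last token wins. The search never "knows" what game it's playing --
# it only scores positions, which is the point made in the episode.

def minimax(tokens, maximizing, depth):
    """Best achievable score (+1 win / -1 loss for the maximizing
    player) looking `depth` moves ahead from a pile of `tokens`."""
    if tokens == 0:
        # The player who just moved took the last token and won,
        # so whoever is now "to move" is on the losing side.
        return -1 if maximizing else 1
    if depth == 0:
        return 0  # search horizon reached: treat as unresolved
    scores = []
    for take in (1, 2, 3):
        if take <= tokens:
            scores.append(minimax(tokens - take, not maximizing, depth - 1))
    return max(scores) if maximizing else min(scores)

def best_move(tokens, depth=10):
    """Pick the take (1, 2, or 3) with the best minimax score."""
    return max((t for t in (1, 2, 3) if t <= tokens),
               key=lambda t: minimax(tokens - t, False, depth - 1))
```

From 21 tokens the winning strategy is to leave the opponent a multiple of 4, so with enough depth the search takes 1; a deeper `depth` only means it resolves more lines to a definite win or loss, which is all "thinking 40 moves ahead" amounts to.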
[00:37:28] Speaker A: Yeah, I think. But therein lies my little distinction. That's the nuance. Okay. I thought that was wild. Okay. Number four: AI actually doesn't know anything, technically. Like you said, it just, like, pulls shit from the Internet. So another way of thinking about it is it just predicts things. It doesn't actually know anything. It just sees it billions and billions of times, over and over and over again. So then it just, like, shoves that output in our face. So if it says, for instance, Paris is the capital of France, it's not recalling a stored fact that it knows at all.
[00:37:56] Speaker B: Right.
[00:37:57] Speaker A: It's making an extremely educated statistical guess based on seeing that information over and over and over and over again on the Internet.
[00:38:05] Speaker B: Which is why you can't trust AI.
[00:38:06] Speaker A: No, absolutely. Which I think is wild. Like, if you actually think about it. So it's like. It's just really basic.
[00:38:10] Speaker B: Yeah.
[00:38:10] Speaker A: For its complexity, it just can process
[00:38:12] Speaker B: much faster than our human brain.
[00:38:14] Speaker A: Absolutely.
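The "it predicts, it doesn't know" point can be made concrete with a toy next-word predictor: count which word follows which in a tiny made-up corpus, then always answer with the most frequent continuation. This is a deliberately crude stand-in for the statistical guessing described above; real models learn billions of parameters rather than raw counts, and the corpus here is invented for the example.

```python
from collections import Counter, defaultdict

# Tiny invented corpus, repeated so some continuations dominate.
corpus = (
    "paris is the capital of france . "
    "paris is the capital of france . "
    "lyon is a city in france . "
) * 3

# Count how often each word follows each other word.
follows = defaultdict(Counter)
words = corpus.split()
for prev, nxt in zip(words, words[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the most common word seen after `word` -- a statistical
    guess based on frequency, not a recalled 'fact'."""
    return follows[word].most_common(1)[0][0]
```

Asked what follows "of", this predictor says "france" only because that pairing dominates its counts, which is the same sense in which a large model "knows" Paris is the capital of France.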
And finally, last one. AI has already saved lives, I believe. Apparently it helps detect breast cancer that trained radiologists initially miss, in large clinical trials. Now, Amanda, in real hospitals, AI models are trained on millions of mammograms and have improved early detection rates, reducing missed cancers and helping flag patterns that are nearly invisible to humans.
[00:38:37] Speaker B: Yep.
[00:38:37] Speaker A: That is wild.
[00:38:39] Speaker B: It is wild. Again, with a doctor, like, I mean, you need that person, that connection, to see things and whatever. But, you know, I'm fascinated. There'll be, like, a full AI body scan at one point.
[00:38:48] Speaker A: Right. I wonder how many boobs AI has seen.
[00:38:50] Speaker B: Oh, my God.
[00:38:51] Speaker A: Do you think it's seen all the boobs in the world? I wish I was AI. That's awesome. Okay. That's all I got.
[00:38:56] Speaker B: Sometimes there are just parts That I hope you cut out.
[00:38:58] Speaker A: Okay, well, not that one.
[00:39:01] Speaker B: Okay, well, those were. Some were better than others, especially the editorial comment about boobies. But either way, what are you going to do? Nothing. Nothing about it at all. Except for the fact that we're just going to say that, like, AI was created, or thought about, or given a name, in 1956.
[00:39:18] Speaker A: Yes, it was.
[00:39:19] Speaker B: And now we are moving on to it, being able to be grandmaster chess players. And we don't even know if it's ever telling us the truth or it's just deducing. Right.
[00:39:31] Speaker A: So we're gonna ask your kids in 10 years. That's how this is gonna go.
[00:39:34] Speaker B: Not ready. I'm not ready for that. But I guess before my brain melts more into any of that, we already said the first thing that people should be doing is being active. And. Yeah, you know, but if someone wants to learn more about AI or what they should do or what their next step should be without being too scared, what are our call to actions?
[00:39:51] Speaker A: Oh, my God. I don't have a lot of them. I will say, recently. Have. Have you seen the movie Her, with Joaquin Phoenix?
[00:39:56] Speaker B: No.
[00:39:57] Speaker A: It's Scarlett Johansson playing, like, an AI bot. Oh, my God. First of all. Well, I mean, Scarlett.
[00:40:01] Speaker B: Scarlett. Okay. Yes.
[00:40:01] Speaker A: And then, like, her voice just.
[00:40:03] Speaker B: Oh, okay.
[00:40:04] Speaker A: Anyway, all right, so first of all, watch Her if you want.
[00:40:06] Speaker B: That was like an older one.
[00:40:06] Speaker A: It was. Yeah, yeah, yeah. And then, what is it called? Ex Machina. Oh, that one was crazy. Again, like, super realistic AI, and she's gonna give you anxiety, by the way. But okay, there is boobies in it at the end. But anyway, like, you could watch those. But, like, I mean, that's just for fun. I mean, that's not, like, whatever. It's just to show you what AI can actually do at some point. But I would say read the book called Co-Intelligence by Ethan Mollick. That's probably the way to go. Like, super practical. It's not like AI is going to destroy humanity or anything like that. It's like, here's how to work with this thing intelligently and ethically. And that whole thing we just talked about. But it's super short, super readable, super easy, and it'll give you a good sense of, like.
[00:40:44] Speaker B: I think if we looked at AI. Totally digressing and rabbit-holing right now. It was like a superhero. I think it was Spider-Man, right?
[00:40:52] Speaker A: Okay.
[00:40:53] Speaker B: Where he was told, like, with great power comes great responsibility. I feel like we can apply that to AI and like, with everybody, like, There is great power that you have access to, but with that becomes a responsibility. And so just as you're saying is like that he talked about, you know, how to work with it ethically, etc. Like I think that's a responsibility that we as humans should probably.
[00:41:13] Speaker A: Isn't that our whole point?
[00:41:14] Speaker B: I mean that's the theory that we should have. But I don't feel like most people do. Right, right.
[00:41:17] Speaker A: I would argue this is what the indigenous people tried to teach us, you know, all those years ago and then
[00:41:23] Speaker B: treat the land well.
[00:41:24] Speaker A: Right.
[00:41:24] Speaker B: Careful.
[00:41:24] Speaker A: Right.
[00:41:25] Speaker B: Yeah.
[00:41:25] Speaker A: And then, here we are, the Europeans decided to take over. Okay, sorry.
If you want a podcast, listen to Hard Fork, the New York Times podcast. They cover AI in a way that's, like, way smarter than us. But not in the AI-is-going-to-take-over-the-world, freak-people-out way. More like how it's playing out in business and culture and all that kind of stuff. So I think it's super sophisticated and super worth it.
[00:41:48] Speaker B: Love it.
[00:41:48] Speaker A: And if you don't want to do it any of that, then just remember these important details to seem sort of sophisticated.
[00:41:53] Speaker B: Here we go.
[00:41:54] Speaker A: Number one, AI isn't the first job killer.
[00:41:56] Speaker B: No, just like the hundredth.
[00:41:57] Speaker A: Oh my God, shut up. From the printing press all the way to the Internet, every major leap has felt catastrophic at first but ultimately expanded opportunity. History shows us that when friction drops, new industries and roles emerge faster than old ones disappear. We know that; it's proven. We've got the pattern, people. It repeats. Number two, AI lowers the barrier to creation, which increases competition and creates new layers of specialization, like we were talking about. When more people can build, the market expands and new niches form that didn't exist before. Just don't be lazy. Number three, the real threat isn't automation, it's being mediocre. When everybody can produce, judgment and taste become the premium. In an AI world, that is what's going to set you apart. It's not the output, it's interpretation and discernment, like we talked about. Number four, AI doesn't know anything. It predicts things. It's advanced probability at scale, not consciousness. It feels intelligent because the math is just super powerful, not because the machine understands the meaning behind anything. And finally, we're arguing the future isn't human versus AI. The advantage is going to go to those who multiply their value. The winners won't be the ones who resist it; they'll be the ones who collaborate with it strategically, like Amanda was saying before.
And that is all I got.
[00:43:13] Speaker B: And there you have it, fellow listeners, a slightly unhinged tour through the world of artificial intelligence and why you're probably thinking about it all wrong, per Pete.
But from medieval monks panicking over the printing press to modern workers side eyeing ChatGPT, this isn't a robot takeover story. It's a pattern story. And if we look back in history, we can see that technology shifts, humans adapt, and values move.
[00:43:35] Speaker A: Yes, you've come to the dark side.
[00:43:37] Speaker B: So if we did our job today, you're definitely not walking away knowing how to code or memorize better prompts. Instead, we're hoping you're walking away with a sharper lens, noticing where automation ends and judgment begins and learning that differentiation is the real secret sauce. So if you enjoyed this episode, subscribe, leave a review, and send it to someone who still thinks AI is just a fancy spell checker. But until next time, stay curious, stay adaptable, and remember, if the robot is doing all the work, you better be the one doing all the thinking.