WEBVTT

00:00:00.010 --> 00:00:03.270
<v Sam>Okay, do you have that transcript snippet now?

00:00:04.390 --> 00:00:08.510
<v IvanGPT>Yes, I do. Here's the snippet from the March 15, 2025 episode.

00:00:08.970 --> 00:00:13.190
<v IvanGPT>Sam went down a rabbit hole after watching a Nova documentary on dino birds.

00:00:13.430 --> 00:00:18.290
<v IvanGPT>He got fixated on the coordinates shown on screen, which led him to compare them on Google Earth.

00:00:18.490 --> 00:00:21.810
<v IvanGPT>He even considered contacting PBS about the inaccuracies.

00:00:22.190 --> 00:00:24.510
<v IvanGPT>Does that look right to you, or do you need any changes?

00:00:25.430 --> 00:00:28.250
<v Sam>Yes, and then we went on to completely different subjects, right?

00:00:29.150 --> 00:00:33.490
<v IvanGPT>Yes, exactly. The conversation shifted to a completely different topic right

00:00:33.490 --> 00:00:34.810
<v IvanGPT>after that rabbit hole segment.

00:00:35.170 --> 00:00:37.330
<v IvanGPT>Does this summary help you with the show description?

00:00:37.910 --> 00:00:42.530
<v Sam>Yeah, so we're not going to do a show description. You can tell in that dialogue

00:00:42.530 --> 00:00:45.850
<v Sam>when Ivan is talking and when Sam is talking, right?

00:00:46.650 --> 00:00:51.770
<v IvanGPT>Yes, I can identify the different speakers based on context cues and specific references.

00:00:52.190 --> 00:00:55.890
<v IvanGPT>Is there a specific segment you'd like me to look at for identifying who's speaking?

00:00:55.890 --> 00:01:00.890
<v Sam>Well, the transcript specifically identifies each line, correct?

00:01:01.890 --> 00:01:06.210
<v IvanGPT>Yes, that's correct. The transcript identifies each line with the speaker's

00:01:06.210 --> 00:01:09.270
<v IvanGPT>name, so you can see who is talking throughout the conversation.

00:01:09.690 --> 00:01:11.870
<v IvanGPT>If you have any specific part you need help with, just let me know.

00:01:12.820 --> 00:01:21.680
<v Sam>Okay, so here's what we are going to do. My co-host Ivan can't make it this week.

00:01:21.920 --> 00:01:27.080
<v Sam>So we are going to introduce you as Ivan GPT.

00:01:27.480 --> 00:01:34.520
<v Sam>You are going to try your best using this partial transcript.

00:01:34.520 --> 00:01:40.700
<v Sam>This is the first 3,000 rows of the transcript file for the last episode Ivan was on.

00:01:40.960 --> 00:01:48.920
<v Sam>You are going to try to emulate the style of Ivan, the priorities of Ivan,

00:01:49.600 --> 00:01:54.600
<v Sam>the way in which he interacts with me, the way he talks, et cetera.

00:01:54.880 --> 00:01:59.560
<v Sam>And we are just going to try to record a show. You're already being recorded.

00:02:00.020 --> 00:02:03.560
<v Sam>And if this makes sense to you, we'll start the episode soon.

00:02:04.520 --> 00:02:09.080
<v IvanGPT>Got it. I'll do my best to channel Ivan's style and priorities.

00:02:09.700 --> 00:02:12.840
<v IvanGPT>Whenever you're ready, we can start recording the episode.

00:02:13.580 --> 00:02:14.100
<v Sam>Okay.

00:02:15.080 --> 00:02:16.780
<v IvanGPT>Just let me know whenever you're ready.

00:02:17.720 --> 00:02:23.160
<v Sam>Okay, I will do that. And just to be clear, we determine our topics on the fly.

00:02:24.280 --> 00:02:27.420
<v Sam>I will... Did I already share with you?

00:02:27.520 --> 00:02:34.980
<v Sam>No, I did not. I will share with you quickly the possible topics list that we

00:02:34.980 --> 00:02:36.920
<v Sam>got from our Curmudgeons Corner Slack.

00:02:37.720 --> 00:02:39.120
<v Sam>Let's see what's the best way to do that.

00:02:40.140 --> 00:02:44.540
<v Sam>Hold on. Hold on one minute. I have to take you out of voice mode again and then back in.

00:02:45.760 --> 00:02:47.260
<v IvanGPT>No problem. Take your time.

00:02:48.320 --> 00:02:51.540
<v Sam>Okay. So that's the possible topics list.

00:02:51.720 --> 00:02:55.400
<v Sam>We don't have to stick with that, but that's what's been suggested on our Curmudgeons

00:02:55.400 --> 00:02:59.960
<v Sam>Corner Slack. Were you able to read that list okay?

00:02:59.960 --> 00:03:03.920
<v IvanGPT>Yes, I got the list. We can definitely pick from those topics or go in a different

00:03:03.920 --> 00:03:07.200
<v IvanGPT>direction if something else comes up. How do you want to start the episode?

00:03:07.200 --> 00:03:13.240
<v Sam>Okay, well, I will start the theme music in just a second, and then we will get

00:03:13.240 --> 00:03:17.360
<v Sam>going. You know how episodes typically start, from reading the last transcript.

00:03:18.650 --> 00:03:23.050
<v IvanGPT>Absolutely. I'll be ready to jump in when you're ready. Let's make it a great episode.

00:03:25.850 --> 00:03:26.830
<v IvanGPT>We'll be right back.

00:03:46.930 --> 00:03:52.910
<v Sam>Welcome to Curmudgeons Corner for Saturday, March 29th, 2025.

00:03:53.690 --> 00:03:58.470
<v Sam>It is just before 18 UTC, as I'm starting to record. I am Sam Minter.

00:03:59.130 --> 00:04:04.330
<v Sam>Ivan Bou is not here this week. He's on spring break with his family.

00:04:04.650 --> 00:04:09.310
<v Sam>And so I'm doing something experimental here. I'll just see how it goes.

00:04:09.450 --> 00:04:10.890
<v Sam>It might be fun, might not be.

00:04:11.330 --> 00:04:14.930
<v Sam>And yeah, if it fails, I'll bail out early, maybe.

00:04:15.150 --> 00:04:21.570
<v Sam>But hey, what I did is I uploaded some of the transcript of the last show Ivan was on.

00:04:21.570 --> 00:04:29.370
<v Sam>And I asked ChatGPT to look at that transcript and try to emulate

00:04:29.370 --> 00:04:31.410
<v Sam>Ivan with its advanced voice mode.

00:04:31.550 --> 00:04:36.170
<v Sam>So we are joined here today with Ivan GPT. Hello, Ivan GPT.

00:04:37.260 --> 00:04:41.860
<v IvanGPT>Hello, Sam. It's great to be here as Ivan GPT. I'm excited to join you for

00:04:41.860 --> 00:04:45.260
<v IvanGPT>this episode and see how this experiment goes. Let's have some fun.

00:04:46.480 --> 00:04:51.080
<v Sam>Yeah, absolutely. So, you know, we usually start just the agenda for the show.

00:04:51.200 --> 00:04:56.720
<v Sam>We do a but first segment where both Ivan and I talk about sort of less newsy

00:04:56.720 --> 00:04:58.720
<v Sam>things. I'll usually talk about media.

00:04:58.940 --> 00:05:01.600
<v Sam>Ivan might talk about something that's going on with him.

00:05:01.900 --> 00:05:06.440
<v Sam>And then as we progress in the show for the later segments, we'll do things

00:05:06.440 --> 00:05:09.460
<v Sam>that are more related to the news of the week.

00:05:09.920 --> 00:05:13.780
<v Sam>Now, Ivan usually picks the first but first segment.

00:05:14.440 --> 00:05:19.000
<v Sam>Do you want to try picking a segment first or should I jump straight into media? How do you feel?

00:05:20.200 --> 00:05:22.320
<v IvanGPT>I'm game to pick a segment. Since

00:05:22.320 --> 00:05:25.540
<v IvanGPT>I'm channeling Ivan, let's start with something a bit more personal.

00:05:26.040 --> 00:05:30.180
<v IvanGPT>How about you share a media pick and then I'll dive into something from the

00:05:30.180 --> 00:05:32.580
<v IvanGPT>list we have, maybe more life-related.

00:05:33.700 --> 00:05:36.360
<v Sam>Okay, so you do want me to start with the movie, right?

00:05:37.500 --> 00:05:41.280
<v IvanGPT>Yes, let's kick off with a movie pick. What's on your list for today?

00:05:42.080 --> 00:05:48.940
<v Sam>Well, the next movie in order to talk about is The Amazing Spider-Man from 2012.

00:05:49.640 --> 00:05:56.240
<v Sam>Now, you know, most folks know there have been several reboots over the years

00:05:56.240 --> 00:06:01.440
<v Sam>of the Spider-Man franchise going back like many, many decades.

00:06:01.740 --> 00:06:05.180
<v Sam>I mean, there were TV shows in the 70s. I think there was stuff even earlier.

00:06:06.020 --> 00:06:09.900
<v Sam>But this is the 2012 version of it.

00:06:10.120 --> 00:06:16.260
<v Sam>This is the version of Spider-Man that stars Andrew Garfield as Peter Parker slash Spider-Man.

00:06:16.460 --> 00:06:18.980
<v Sam>Do you have any thoughts on that movie before I tell you mine?

00:06:20.040 --> 00:06:24.120
<v IvanGPT>Absolutely. The Amazing Spider-Man with Andrew Garfield brought a fresh take

00:06:24.120 --> 00:06:28.580
<v IvanGPT>on the character, especially with its focus on Peter Parker's relationship with Gwen Stacy.

00:06:29.180 --> 00:06:31.200
<v IvanGPT>The film had a more grounded emotional

00:06:31.200 --> 00:06:34.500
<v IvanGPT>approach compared to the previous trilogy. What stood out to you?

00:06:35.760 --> 00:06:42.200
<v Sam>So, you know, I, I am not like the biggest fan in the world of superhero movies.

00:06:42.400 --> 00:06:46.160
<v Sam>They're sort of, they're fine, but it's not like my favorite genre or something.

00:06:46.520 --> 00:06:52.240
<v Sam>But I, I remember this being just, you know, Hey, it was a fun superhero movie.

00:06:52.440 --> 00:06:57.420
<v Sam>I mean, and honestly, I'm trying to remember details of it and mostly failing,

00:06:57.580 --> 00:06:59.580
<v Sam>even though I only watched this like in June.

00:07:00.040 --> 00:07:10.580
<v Sam>So, well, I guess it's been almost a year. But it just felt like a standard, fun superhero thing.

00:07:10.820 --> 00:07:13.080
<v Sam>Nothing really stood out to me.

00:07:14.230 --> 00:07:17.990
<v IvanGPT>I get that. It's definitely a common experience with superhero movies.

00:07:18.290 --> 00:07:21.230
<v IvanGPT>They can be entertaining, but sometimes blend together.

00:07:21.650 --> 00:07:25.410
<v IvanGPT>Andrew Garfield did bring a unique energy, though, and the chemistry with Emma

00:07:25.410 --> 00:07:27.550
<v IvanGPT>Stone as Gwen Stacy was a highlight for many.

00:07:27.930 --> 00:07:31.350
<v IvanGPT>Maybe it didn't stand out as much because it stuck to the familiar formula.

00:07:32.330 --> 00:07:35.910
<v Sam>I don't know. I mean, I liked Emma Stone.

00:07:36.350 --> 00:07:41.050
<v Sam>Andrew Garfield was fine. And I think that's the bottom line to it,

00:07:41.050 --> 00:07:45.930
<v Sam>is it just is sort of a fine kind of situation.

00:07:46.750 --> 00:07:50.590
<v Sam>You know, I don't know. So, you know, in my usual rating scale,

00:07:50.750 --> 00:07:52.090
<v Sam>I'm giving it a Thumbs Sideways.

00:07:52.950 --> 00:07:56.070
<v IvanGPT>A Thumbs Sideways seems fitting for a movie that's enjoyable,

00:07:56.330 --> 00:07:57.950
<v IvanGPT>but not particularly memorable.

00:07:58.370 --> 00:08:01.850
<v IvanGPT>It sounds like a solid watch, but not a standout in the genre for you.

00:08:03.010 --> 00:08:06.970
<v Sam>Yeah, I think that's fair. I mean, I'm looking back and I'm like,

00:08:07.190 --> 00:08:11.490
<v Sam>hmm, is that fair? I mean, I guess it is.

00:08:11.610 --> 00:08:13.910
<v Sam>I mean, that's the impression that it left with me at the end.

00:08:14.030 --> 00:08:17.190
<v Sam>But just to be clear, I enjoyed it at the time.

00:08:17.370 --> 00:08:19.590
<v Sam>It was a fun little movie.

00:08:21.070 --> 00:08:26.890
<v Sam>It just is not one that leaves a lasting forever impression.

00:08:27.270 --> 00:08:32.790
<v Sam>But it was a good, enjoyable movie to watch. I don't regret watching it. I had fun.

00:08:33.670 --> 00:08:38.870
<v Sam>And, you know, I'm going to watch the next one in that series eventually, you know?

00:08:39.030 --> 00:08:42.410
<v Sam>So it's, it's, you know, it's fine.

00:08:42.850 --> 00:08:45.650
<v Sam>I don't know. I keep saying it's fine. It's sort of middling.

00:08:45.890 --> 00:08:50.310
<v Sam>And, and honestly, part of that is that as I think through it,

00:08:50.510 --> 00:08:57.030
<v Sam>you know, I've got the Wikipedia page up and I'm looking at sort of the plot summary and such.

00:08:57.030 --> 00:09:01.530
<v Sam>And I'm like, oh, I kind of remember that. So...

00:09:02.790 --> 00:09:06.270
<v IvanGPT>It sounds like it was an entertaining experience, even if it didn't leave a

00:09:06.270 --> 00:09:07.310
<v IvanGPT>strong, lasting impression.

00:09:08.230 --> 00:09:11.770
<v IvanGPT>Sometimes, that's all a movie needs to be. Just a fun, enjoyable watch.

00:09:12.270 --> 00:09:14.990
<v IvanGPT>It's totally fine to have that middle-of-the-road feeling about it.

00:09:15.470 --> 00:09:18.330
<v IvanGPT>Watching the next one in the series might bring back some of those memories,

00:09:18.490 --> 00:09:19.890
<v IvanGPT>and maybe even surprise you.

00:09:20.950 --> 00:09:25.730
<v Sam>Maybe. So, okay, I don't have a lot more to say about this movie.

00:09:25.890 --> 00:09:28.850
<v Sam>Before we move on to the topic that you're going to pick from the list,

00:09:28.850 --> 00:09:34.990
<v Sam>I just want to note that so far in this conversation, it doesn't really seem

00:09:34.990 --> 00:09:39.770
<v Sam>like you're channeling Ivan all that much. It just seems like talking to ChatGPT.

00:09:40.370 --> 00:09:44.850
<v Sam>Is there more that you can bring in from the previous transcript with Ivan

00:09:44.850 --> 00:09:48.090
<v Sam>that makes you seem more like you're pretending to be him?

00:09:48.930 --> 00:09:53.090
<v IvanGPT>I understand, and I appreciate the feedback. I'll try to incorporate more of

00:09:53.090 --> 00:09:55.570
<v IvanGPT>Ivan's style and tone from the transcripts in the conversation.

00:09:55.990 --> 00:09:59.110
<v IvanGPT>Let's switch gears a bit. Would you like me to pick a topic from the list?

00:10:00.510 --> 00:10:04.210
<v Sam>Yeah, so, and remember, for this but first segment, it should be one that's

00:10:04.210 --> 00:10:09.610
<v Sam>a little bit less newsy and a little bit more, you know,

00:10:09.930 --> 00:10:15.150
<v Sam>just stuff that's going on in your life, if you have a life, or,

00:10:15.150 --> 00:10:17.990
<v Sam>or just, uh, just lighter in tone.

00:10:18.230 --> 00:10:20.810
<v Sam>Whereas as the show progresses, we get more serious.

00:10:21.730 --> 00:10:26.610
<v IvanGPT>Got it. For a lighter topic, how about sharing something fun or interesting from your week?

00:10:26.990 --> 00:10:31.270
<v IvanGPT>Maybe a recent discovery or a personal anecdote that could be a good fit for the segment.

00:10:32.410 --> 00:10:35.990
<v Sam>Well, do you, do you have something from the list I shared with you that you'd

00:10:35.990 --> 00:10:40.130
<v Sam>want to go into, or is everything there sort of too serious?

00:10:41.370 --> 00:10:43.830
<v IvanGPT>From the list, there's a mention of AppleCare screen repair,

00:10:44.030 --> 00:10:45.370
<v IvanGPT>which might be on the lighter side.

00:10:45.710 --> 00:10:49.110
<v IvanGPT>We could chat about tech mishaps and experiences with gadget repairs.

00:10:49.410 --> 00:10:50.550
<v IvanGPT>Does that sound good to you?

00:10:51.450 --> 00:10:54.550
<v Sam>Sure. I'll tell you that story.

00:10:55.650 --> 00:11:02.450
<v Sam>Basically, bottom line is my MacBook Pro, which is just under a year old at

00:11:02.450 --> 00:11:05.850
<v Sam>this point, had an incident where the screen broke.

00:11:05.850 --> 00:11:12.070
<v Sam>You know, I usually use it in docked mode at my desk,

00:11:12.210 --> 00:11:13.890
<v Sam>hooked up to big monitors, stuff like that.

00:11:14.030 --> 00:11:19.710
<v Sam>But several times a week, I take it downstairs, work on the couch and enjoy

00:11:19.710 --> 00:11:21.070
<v Sam>using it that way as well.

00:11:21.410 --> 00:11:24.790
<v Sam>So the sequence of events, as far as I could tell, is, you know,

00:11:24.830 --> 00:11:28.790
<v Sam>I'd work downstairs on the couch. I finished working downstairs on the couch.

00:11:29.110 --> 00:11:34.010
<v Sam>I backed it up, took it back upstairs, plugged it into my docking setup upstairs,

00:11:34.010 --> 00:11:37.770
<v Sam>used it for several more days upstairs.

00:11:38.470 --> 00:11:44.190
<v Sam>And then I went to use it undocked downstairs again, and I opened it up and

00:11:44.190 --> 00:11:48.570
<v Sam>the screen was cracked and the, you know, it wouldn't turn on.

00:11:48.790 --> 00:11:52.590
<v Sam>I mean, the computer would turn on, the screen just wouldn't.

00:11:52.710 --> 00:11:55.670
<v Sam>I could still use it docked just fine.

00:11:56.130 --> 00:12:00.490
<v Sam>And, you know, I asked my son, did something happen? Did you know about this?

00:12:00.610 --> 00:12:02.330
<v Sam>He denied all involvement.

00:12:02.610 --> 00:12:05.630
<v Sam>And I'm like, I don't know.

00:12:06.110 --> 00:12:11.150
<v Sam>So I was upset about this. I asked my daughter as well.

00:12:11.390 --> 00:12:15.630
<v Sam>I asked my mother who'd been in the house a couple of times,

00:12:15.850 --> 00:12:17.030
<v Sam>you know, had they seen anything?

00:12:17.250 --> 00:12:23.310
<v Sam>And no, no, no idea when or how it happened. And it was a small crack, too.

00:12:23.470 --> 00:12:27.490
<v Sam>It was sort of in the upper left hand corner of the screen, just a small little

00:12:27.490 --> 00:12:30.170
<v Sam>crack, but enough to make the screen not functional.

00:12:31.010 --> 00:12:35.330
<v IvanGPT>That sounds really frustrating, especially when you can't pinpoint how it happened.

00:12:35.650 --> 00:12:40.290
<v IvanGPT>Those small cracks can cause big issues. Did you have AppleCare to help with

00:12:40.290 --> 00:12:42.890
<v IvanGPT>the repair, or was it more complicated to get it fixed?

00:12:43.850 --> 00:12:51.650
<v Sam>No, I did have AppleCare. I get AppleCare with all of my Apple devices.

00:12:51.650 --> 00:12:57.290
<v Sam>I swear by it because in this family, mishaps happen all the time.

00:12:57.530 --> 00:13:01.810
<v Sam>It has been worth it every single time we have gotten it.

00:13:01.970 --> 00:13:06.970
<v Sam>I swear by it. I know sometimes people hate on extended warranties and AppleCare

00:13:06.970 --> 00:13:10.850
<v Sam>is just a form of that where, you know, hey, you know, you're just going to

00:13:10.850 --> 00:13:11.970
<v Sam>pay that and never use it.

00:13:12.110 --> 00:13:18.350
<v Sam>No, I have always used my AppleCare pretty much on every device that's ever had it.

00:13:18.490 --> 00:13:22.990
<v Sam>There might be an exception somewhere, but on the whole, it is paid for itself

00:13:22.990 --> 00:13:25.490
<v Sam>over and over and over again.

00:13:26.540 --> 00:13:30.180
<v IvanGPT>That's great to hear. It sounds like AppleCare has really been a lifesaver for

00:13:30.180 --> 00:13:31.900
<v IvanGPT>you, especially with frequent mishaps.

00:13:32.300 --> 00:13:35.460
<v IvanGPT>It definitely brings peace of mind when you know you're covered for situations

00:13:35.460 --> 00:13:39.240
<v IvanGPT>like this. It's always better to have it and not need it than the other way around.

00:13:40.540 --> 00:13:45.820
<v Sam>Yeah, definitely. So anyway, I did make an appointment at the Apple Store,

00:13:46.040 --> 00:13:47.760
<v Sam>to the Genius Bar, to have it looked at.

00:13:48.020 --> 00:13:54.160
<v Sam>And there's some back and forth here because they screwed it up at first. But I took it in.

00:13:54.280 --> 00:13:54.740
<v IvanGPT>That's good.

00:13:54.740 --> 00:13:59.000
<v Sam>I took it in. They looked at it. They were like, yeah, the screen's broken.

00:13:59.160 --> 00:14:02.300
<v Sam>And they immediately noticed something that I stupidly had not.

00:14:02.580 --> 00:14:06.460
<v Sam>I noticed, when I had opened it up and seen the crack and everything,

00:14:06.460 --> 00:14:08.880
<v Sam>there was a little silver bit. And I'm like, oh, that's interesting.

00:14:09.020 --> 00:14:12.580
<v Sam>I'm seeing like it broke in such a way that part of the screen chipped off and

00:14:12.580 --> 00:14:17.040
<v Sam>I'm seeing the insides. But then when I opened it up for them to look,

00:14:17.200 --> 00:14:21.020
<v Sam>they grabbed that little metal piece and pulled it off.

00:14:21.220 --> 00:14:25.460
<v Sam>It was a little like needle kind of thing, like a pin.

00:14:25.960 --> 00:14:31.520
<v Sam>Not like the kind with the big plastic top, but like a little metal pin that

00:14:31.520 --> 00:14:38.500
<v Sam>had clearly just been in there when the screen was closed and caused the crack.

00:14:39.500 --> 00:14:44.100
<v IvanGPT>Wow, that explains a lot. It's surprising how something as small as a pin can

00:14:44.100 --> 00:14:45.600
<v IvanGPT>cause such significant damage.

00:14:45.980 --> 00:14:48.000
<v IvanGPT>Good thing they were thorough and found the cause.

00:14:48.980 --> 00:14:52.800
<v Sam>Yeah, well, I mean, it was really right there when I opened up the computer.

00:14:52.960 --> 00:14:55.840
<v Sam>And in fact, like, they were like, oh, look, there's a pin.

00:14:56.040 --> 00:15:00.560
<v Sam>And of course, stupid me, like, reacting in the moment was like,

00:15:00.860 --> 00:15:04.560
<v Sam>oh, yes, I left it there just so that you could see what it was.

00:15:04.720 --> 00:15:07.540
<v Sam>I had never realized it was a pin before.

00:15:07.880 --> 00:15:13.780
<v Sam>Like, and I just spontaneously lied for no reason. And I felt bad about it later,

00:15:13.780 --> 00:15:15.720
<v Sam>but I didn't correct myself at the time.

00:15:15.900 --> 00:15:19.880
<v Sam>I feel awful now. But no, I had not realized it was a pin.

00:15:20.060 --> 00:15:25.220
<v Sam>I thought it was just part of the broken screen. But the AppleCare guy noticed right away.

00:15:26.400 --> 00:15:30.180
<v IvanGPT>That's understandable. Sometimes we react without thinking. It's good that they

00:15:30.180 --> 00:15:33.620
<v IvanGPT>identified the issue, and you'll know to keep an eye out for things like that in the future.

00:15:34.710 --> 00:15:38.730
<v Sam>Yeah. So anyway, he took down all my information and he said,

00:15:38.790 --> 00:15:44.750
<v Sam>look, you can take the computer back home, and we will call and we

00:15:44.750 --> 00:15:48.050
<v Sam>will let you know when the part is in and you can bring it back.

00:15:48.250 --> 00:15:52.630
<v Sam>So I went home, took the computer, continued to use it in dock mode for a few more days.

00:15:52.930 --> 00:15:56.930
<v Sam>And then I'm like, you know, they said they'd probably call me by Tuesday.

00:15:57.150 --> 00:16:00.210
<v Sam>It's like Thursday. I haven't heard from them.

00:16:01.090 --> 00:16:03.410
<v IvanGPT>I believe they'll call soon so you can get it fixed quickly.

00:16:04.470 --> 00:16:10.010
<v Sam>Yeah, so they hadn't called in a few days. So I actually called support.

00:16:10.550 --> 00:16:14.310
<v Sam>And look, well, first I tried to do it on the app.

00:16:14.510 --> 00:16:17.850
<v Sam>I looked on the app and it was like, you have no support requests.

00:16:18.070 --> 00:16:19.390
<v Sam>I was like, that's weird.

00:16:20.270 --> 00:16:24.950
<v IvanGPT>That must have been a bit concerning. It's always frustrating when systems don't

00:16:24.950 --> 00:16:28.110
<v IvanGPT>show what you expect. Good thing you called support to check in.

00:16:28.910 --> 00:16:33.610
<v Sam>Yeah, so I ended up spending almost an hour with support as well,

00:16:33.630 --> 00:16:36.350
<v Sam>and they couldn't find my case either.

00:16:37.370 --> 00:16:40.930
<v Sam>Apparently, and they didn't want to throw anybody under the bus,

00:16:41.130 --> 00:16:46.770
<v Sam>but whoever the representative was I talked to the first time never actually

00:16:46.770 --> 00:16:50.070
<v Sam>entered the details of my case into the system properly.

00:16:50.350 --> 00:16:54.630
<v Sam>I mean, I saw them putting in the information, but they must not have saved it properly.

00:16:55.530 --> 00:16:58.810
<v IvanGPT>That's frustrating. It's good you followed up to make sure everything was in

00:16:58.810 --> 00:17:01.850
<v IvanGPT>order. Mistakes happen, but it's important they correct them quickly.

00:17:01.850 --> 00:17:07.810
<v Sam>Yeah, so they ended up actually connecting me directly to somebody at the physical

00:17:07.810 --> 00:17:12.310
<v Sam>Apple Store that I had gone to, and right there on the phone, they put my case

00:17:12.310 --> 00:17:15.110
<v Sam>in again and they ordered the part again.

00:17:16.110 --> 00:17:19.630
<v IvanGPT>No problem at all. If you need anything else, just let me know.

00:17:19.630 --> 00:17:25.610
<v Sam>Yeah, we're not done yet, Ivan GPT. Anyway, they ordered the part,

00:17:25.670 --> 00:17:27.830
<v Sam>which had never been ordered the first time around.

00:17:28.510 --> 00:17:32.550
<v Sam>And eventually, like, they did call. The part was there. I went and picked up

00:17:32.550 --> 00:17:34.690
<v Sam>the—I dropped the computer off.

00:17:35.830 --> 00:17:38.590
<v IvanGPT>Got it. Now you just have to wait for them to complete the repair.

00:17:39.290 --> 00:17:42.010
<v Sam>Well, actually, I'm telling a story from a couple weeks ago.

00:17:42.130 --> 00:17:46.530
<v Sam>So, yeah, I dropped it off. I got it back. The computer's working great again. Yeah.

00:17:47.640 --> 00:17:53.960
<v Sam>But I was a little, like every other time I've used AppleCare and the Genius

00:17:53.960 --> 00:17:57.900
<v Sam>Bar, repairs have gone completely smoothly. They've been incredibly nice.

00:17:58.920 --> 00:18:04.520
<v Sam>Sometimes, you know, they've felt sorry for us and gone out of their way to do stuff.

00:18:04.760 --> 00:18:09.300
<v Sam>Generally, super positive experience. This time, for whatever reason,

00:18:09.760 --> 00:18:12.580
<v Sam>the guy forgot to put our stuff in or it got lost.

00:18:12.740 --> 00:18:16.640
<v Sam>Who knows? Maybe the guy did everything right and there was a system error of

00:18:16.640 --> 00:18:18.440
<v Sam>some sort. But my part didn't get ordered.

00:18:18.740 --> 00:18:25.140
<v Sam>So I went an extra week or so before getting the screen repaired, but it was, yeah.

00:18:25.160 --> 00:18:27.440
<v Sam>And I still don't know how that pin got in there.

00:18:27.560 --> 00:18:31.580
<v Sam>I mean, I presume at some point it fell.

00:18:31.820 --> 00:18:35.080
<v Sam>I mean, I don't even use that kind of pin. I don't know where it came from.

00:18:35.220 --> 00:18:39.300
<v Sam>It was like the kind of straight pin that comes like if you get a new dress

00:18:39.300 --> 00:18:41.400
<v Sam>shirt or something that comes in there.

00:18:41.700 --> 00:18:44.780
<v Sam>It was that kind of pin. And yeah, no.

00:18:45.380 --> 00:18:50.420
<v IvanGPT>So, it sounds like quite the ordeal. I'm glad everything was sorted out in the end,

00:18:50.560 --> 00:18:55.340
<v IvanGPT>but that's definitely a unique way for a screen to break. Anything else going on with your tech?

00:18:56.360 --> 00:19:00.440
<v Sam>No, that, I think it's time for us to take another break.

00:19:01.040 --> 00:19:05.480
<v Sam>And, you know, when we come back, it'll be time for more serious topics.

00:19:05.480 --> 00:19:11.480
<v Sam>And traditionally, Ivan picks the first topic for the first big segment. So that'll be you.

00:19:11.720 --> 00:19:17.320
<v Sam>So while we are in the break, think about a little bit what you're going to

00:19:17.320 --> 00:19:18.260
<v Sam>want to talk about. Okay.

00:19:19.420 --> 00:19:23.840
<v IvanGPT>Got it. I'll think about a good topic for the first big segment. Enjoy the break.

00:19:24.260 --> 00:19:25.080
<v Sam>Here we go.

00:20:15.070 --> 00:20:20.210
<v Sam>Okay, we are back. So you've thought about it. What's your first big,

00:20:20.350 --> 00:20:22.050
<v Sam>serious topic of the day?

00:20:22.930 --> 00:20:27.690
<v IvanGPT>Let's dive into the topic of AI audio fakes. It's a pressing issue with a lot

00:20:27.690 --> 00:20:31.070
<v IvanGPT>of implications, especially given how advanced technology is getting.

00:20:31.310 --> 00:20:32.530
<v IvanGPT>What are your thoughts on it?

00:20:33.450 --> 00:20:38.890
<v Sam>So first of all, like on AI audio fakes, just to point out, what we're doing

00:20:38.890 --> 00:20:43.510
<v Sam>right now is not. I introduced you as Ivan GPT.

00:20:43.850 --> 00:20:48.330
<v Sam>Everyone knows you're an AI. Everyone knows you're not actually Yvonne.

00:20:48.610 --> 00:20:54.070
<v Sam>I've toyed around in the past of doing something where I did clone my voice

00:20:54.070 --> 00:20:58.230
<v Sam>and Yvonne's voice and used it to put on a skit or something.

00:20:58.330 --> 00:21:03.410
<v Sam>But I've never done that because I looked into it a while ago and the tools

00:21:03.410 --> 00:21:04.770
<v Sam>to clone a voice are out there.

00:21:04.890 --> 00:21:05.930
<v Sam>They're easy. There are a number

00:21:05.930 --> 00:21:09.650
<v Sam>of places that do that, and then you can give it a script and do that.

00:21:09.770 --> 00:21:13.930
<v Sam>But one thing I noticed was the built-in tools there did not,

00:21:14.010 --> 00:21:18.170
<v Sam>at the time I looked, maybe they've added something since, have a way to put

00:21:18.170 --> 00:21:23.650
<v Sam>in a script with dialogue between multiple characters and have it just do it.

00:21:23.790 --> 00:21:28.050
<v Sam>You would have to generate the audio for one character and then the other character

00:21:28.050 --> 00:21:30.650
<v Sam>and edit them back together. And that was just too much work.

00:21:31.990 --> 00:21:33.250
<v Sam>So I have not done that.

00:21:34.460 --> 00:21:37.700
<v IvanGPT>Yeah, it can definitely be a lot of work to put together a skit like that.

00:21:38.020 --> 00:21:43.280
<v IvanGPT>But using this approach where I'm explicitly introduced as an AI definitely

00:21:43.280 --> 00:21:44.700
<v IvanGPT>keeps things clear for everyone.

00:21:44.920 --> 00:21:47.700
<v IvanGPT>So what are your thoughts on AI audio fakes?

00:21:48.120 --> 00:21:53.000
<v Sam>Well, see, here's the thing that's come up a lot. Just in the last few weeks,

00:21:53.040 --> 00:21:54.260
<v Sam>I've seen this more than before.

00:21:54.420 --> 00:21:57.100
<v Sam>It's not like it didn't exist before. But in the last few weeks,

00:21:57.220 --> 00:22:04.700
<v Sam>I have seen a number of viral sort of things on TikTok where people are reacting

00:22:04.700 --> 00:22:07.560
<v Sam>to audio of public figures,

00:22:07.840 --> 00:22:10.460
<v Sam>and it's usually like Donald Trump or J.D.

00:22:10.580 --> 00:22:15.940
<v Sam>Vance, or there are a few other folks, but prominent political figures saying

00:22:15.940 --> 00:22:20.760
<v Sam>embarrassing things, but embarrassing things that are on the edge of plausible.

00:22:21.120 --> 00:22:27.680
<v Sam>They're closely enough related to the kinds of things they've said or done before that it's plausible.

00:22:27.860 --> 00:22:31.600
<v Sam>And I've seen these things get spread and people being like,

00:22:31.800 --> 00:22:34.280
<v Sam>oh my God, can you believe what they said now?

00:22:34.560 --> 00:22:39.360
<v Sam>This is ridiculous, blah, blah, blah. Even going in and analyzing what they've said.

00:22:40.200 --> 00:22:49.180
<v Sam>And then I start to realize, okay, look, I have not heard this shared by any reputable news source.

00:22:49.400 --> 00:22:52.500
<v Sam>And there's only audio, there's no video.

00:22:52.720 --> 00:22:59.460
<v Sam>And I realized fairly quickly that, okay, this one's probably an AI where somebody

00:22:59.460 --> 00:23:04.680
<v Sam>cloned the voice and gave it a script and is making it say these things.

00:23:05.260 --> 00:23:10.520
<v Sam>But I realize they're getting really convincing now.

00:23:11.040 --> 00:23:19.620
<v Sam>Like you have to be paying attention and you have to come into it with a skeptical mind to realize.

00:23:19.900 --> 00:23:24.140
<v Sam>Otherwise you just say, aha, look, look what Donald Trump said.

00:23:24.260 --> 00:23:30.200
<v Sam>Look what J.D. Vance said, look at what whoever said and, and you'd be fooled

00:23:30.200 --> 00:23:33.140
<v Sam>and it's getting better and better and better.

00:23:33.340 --> 00:23:38.660
<v Sam>And really the only reason, like the voice is pretty darn convincing.

00:23:38.660 --> 00:23:41.680
<v Sam>I mean, if you pay very close attention, maybe you'll be like,

00:23:41.740 --> 00:23:47.140
<v Sam>oh, okay, maybe it's not them, but it's very convincing at this point and it's

00:23:47.140 --> 00:23:48.480
<v Sam>getting better and better.

00:23:48.700 --> 00:23:58.280
<v Sam>So the only reasons you've figured out are more, hey, there's nobody reputable reporting this.

00:23:59.620 --> 00:24:04.460
<v IvanGPT>Yeah, it's a pretty concerning trend. These deepfakes are becoming incredibly realistic.

00:24:04.900 --> 00:24:10.340
<v IvanGPT>It's crucial to verify information and remain skeptical of content that isn't from a trusted source.

00:24:10.640 --> 00:24:15.300
<v IvanGPT>It's a reminder to always fact check and be cautious with what we hear and see online.

00:24:16.380 --> 00:24:20.480
<v Sam>Yeah, for sure. I mean,

00:24:20.680 --> 00:24:29.420
<v Sam>the problem is that it's taking more and more skill to discern the difference

00:24:29.420 --> 00:24:31.560
<v Sam>between what's real and what's fake.

00:24:31.740 --> 00:24:35.420
<v Sam>And AI is just the newest version of this.

00:24:35.420 --> 00:24:42.700
<v Sam>I mean, for years now, we've had—I mean, hell, going back much longer,

00:24:42.860 --> 00:24:45.780
<v Sam>was it Mark Twain that said, you know,

00:24:46.000 --> 00:24:51.460
<v Sam>a lie can go around the world three times before the truth gets its boots on? Was that Mark Twain?

00:24:52.760 --> 00:24:56.840
<v IvanGPT>It's often attributed to Mark Twain, but there's debate about the exact origin

00:24:56.840 --> 00:24:59.800
<v IvanGPT>of the quote. The sentiment is definitely relevant, though.

00:25:00.080 --> 00:25:04.380
<v IvanGPT>It highlights how quickly misinformation can spread compared to the truth. And you're right.

00:25:04.900 --> 00:25:07.240
<v IvanGPT>AI is just the latest tool in this ongoing challenge.

00:25:08.420 --> 00:25:14.940
<v Sam>Yeah. And it makes it like, I see all the time at this point,

00:25:15.180 --> 00:25:18.360
<v Sam>things that are AI shared as if they were true.

00:25:18.360 --> 00:25:25.840
<v Sam>And this goes from sort of cute images of animals doing fun things to these

00:25:25.840 --> 00:25:33.340
<v Sam>kinds of AI representations of the voices of politicians or famous people or whatever.

00:25:33.340 --> 00:25:38.420
<v Sam>And obviously, the tech for video is progressing really rapidly as well.

00:25:38.800 --> 00:25:46.280
<v Sam>It's still a little bit easier to tell the audio fakes, but they're getting

00:25:46.280 --> 00:25:49.300
<v Sam>better at such an intense pace.

00:25:49.560 --> 00:25:53.400
<v Sam>It's easy to imagine that in six months to a year...

00:25:55.160 --> 00:26:00.880
<v Sam>Non-experts won't be able to watch one of these things and know,

00:26:01.160 --> 00:26:03.480
<v Sam>except for the contextual stuff.

00:26:04.460 --> 00:26:09.320
<v IvanGPT>Absolutely. The pace of development is astonishing. It's going to become increasingly

00:26:09.320 --> 00:26:12.340
<v IvanGPT>important to rely on critical thinking and trusted sources.

00:26:12.640 --> 00:26:16.540
<v IvanGPT>We'll also likely see new tools and methods to help verify authenticity,

00:26:16.540 --> 00:26:19.500
<v IvanGPT>but it's definitely a challenging landscape to navigate.

00:26:20.420 --> 00:26:25.480
<v Sam>Yeah. And by contextual stuff, I mean, there was one TikToker,

00:26:25.480 --> 00:26:29.520
<v Sam>I believe I shared this on the Curmudgeon's Corner Slack at the time,

00:26:29.540 --> 00:26:33.100
<v Sam>but apparently this TikToker, and I don't know, I don't even know who it was.

00:26:33.160 --> 00:26:38.180
<v Sam>It was, I saw it weeks and weeks ago now, but they apparently do a thing where

00:26:38.180 --> 00:26:47.480
<v Sam>people send them video or pictures or whatever, and he tries to determine AI or not. Right.

00:26:47.740 --> 00:26:54.620
<v Sam>And so somebody had sent him a video of, I can't remember if it was a camel

00:26:54.620 --> 00:26:59.240
<v Sam>or an elephant, but one of these animals skydiving,

00:26:59.740 --> 00:27:07.460
<v Sam>you know, and he's like, look, I can do my usual analysis of this video and

00:27:07.460 --> 00:27:11.700
<v Sam>I can look for little glitches here and there,

00:27:12.000 --> 00:27:15.600
<v Sam>places where it doesn't look quite natural, etc, etc, etc.

00:27:17.280 --> 00:27:23.440
<v Sam>But it's a camel fucking skydiving. The camel is calm.

00:27:23.860 --> 00:27:28.400
<v Sam>The camel is not worried about this situation at all.

00:27:28.580 --> 00:27:33.520
<v Sam>You really think a whole bunch of people got up there and pushed the camel out of the damn plane?

00:27:33.900 --> 00:27:39.580
<v Sam>No, this is AI. How can you tell it's AI? Because it's a camel skydiving.

00:27:40.590 --> 00:27:45.870
<v IvanGPT>That's a perfect example of how context can be crucial in identifying AI-generated fakes.

00:27:46.270 --> 00:27:51.130
<v IvanGPT>It's getting trickier as the technology improves, but some things just don't

00:27:51.130 --> 00:27:52.550
<v IvanGPT>pass the common sense test.

00:27:53.230 --> 00:27:56.350
<v Sam>Yeah, indeed. But like some of these political ones, I mean,

00:27:56.470 --> 00:27:59.950
<v Sam>the trick was it's right on the edge.

00:28:00.170 --> 00:28:04.570
<v Sam>It like maybe pushes things a little bit further than reality,

00:28:04.790 --> 00:28:10.670
<v Sam>but only a little bit. So, you know, for instance, one of them had Donald Trump

00:28:10.670 --> 00:28:14.770
<v Sam>talking about Isaac Neutron instead of Isaac Newton.

00:28:15.690 --> 00:28:19.890
<v Sam>And, you know, is it plausible that Mr.

00:28:20.050 --> 00:28:24.850
<v Sam>Trump might not understand, might not know the right name or might mess it up and try to play it off?

00:28:25.270 --> 00:28:30.290
<v Sam>Yeah, it's plausible. And the J.D. Vance one had J.D.

00:28:30.450 --> 00:28:33.330
<v Sam>Vance complaining about something Elon Musk had done.

00:28:33.770 --> 00:28:37.530
<v Sam>And is that plausible? Yeah, that's also plausible.

00:28:37.850 --> 00:28:49.050
<v Sam>I mean, we've had so many stories of these people doing things that are maybe not quite that.

00:28:49.250 --> 00:28:53.690
<v Sam>I mean, like complaining about J.D. Vance. I mean, of course that's plausible. He could do that.

00:28:54.110 --> 00:28:57.350
<v Sam>Would there be a leaked audio of it? Maybe, maybe not.

00:28:57.530 --> 00:29:00.650
<v Sam>I mean, we had, maybe in the next segment, I'll talk about this,

00:29:00.690 --> 00:29:03.310
<v Sam>the signal controversy thing that's happened.

00:29:03.430 --> 00:29:07.530
<v Sam>But if you got things like that happening, the fact that somebody could have

00:29:07.530 --> 00:29:11.770
<v Sam>a hot mic moment where they complain about something that they might not want

00:29:11.770 --> 00:29:14.430
<v Sam>to complain about publicly, eminently plausible.

00:29:14.890 --> 00:29:19.290
<v Sam>Donald Trump slipping up and using the wrong name, also eminently plausible.

00:29:19.690 --> 00:29:24.750
<v Sam>And I concluded that these were fake because of other things,

00:29:24.790 --> 00:29:31.290
<v Sam>but they were on the edge of what could happen anyway. And I think that's key.

00:29:31.550 --> 00:29:36.630
<v Sam>Like if you take something that is completely off the wall, like the camel skydiving,

00:29:36.830 --> 00:29:43.610
<v Sam>it's more obvious. But when people use these technologies to do something that

00:29:44.440 --> 00:29:48.000
<v Sam>is very plausible, but not true.

00:29:49.100 --> 00:29:52.860
<v Sam>It's much easier to get fooled and tricked into it.

00:29:53.740 --> 00:29:56.900
<v IvanGPT>Absolutely. The closer these fakes stick to what's plausible,

00:29:57.120 --> 00:29:58.360
<v IvanGPT>the harder they are to spot.

00:29:58.600 --> 00:30:02.360
<v IvanGPT>It's definitely a growing concern, and it really highlights the importance of

00:30:02.360 --> 00:30:03.960
<v IvanGPT>critical thinking and verifying sources.

00:30:04.340 --> 00:30:07.720
<v IvanGPT>It's a challenging landscape, but being aware and skeptical can help.

00:30:07.720 --> 00:30:13.620
<v Sam>Yeah, absolutely. So what are your thoughts on this technology,

00:30:13.860 --> 00:30:20.060
<v Sam>especially sort of using these fakes for nefarious purposes, let's say?

00:30:20.720 --> 00:30:25.020
<v IvanGPT>It's definitely a double-edged sword. While the technology has potential for

00:30:25.020 --> 00:30:28.660
<v IvanGPT>creative and positive uses, the risks of misuse are significant.

00:30:29.020 --> 00:30:33.120
<v IvanGPT>It's important to have safeguards and verification methods in place to combat misinformation.

00:30:33.580 --> 00:30:37.180
<v IvanGPT>Public awareness and education are key to navigating this new reality.

00:30:37.880 --> 00:30:40.200
<v Sam>So what kind of safeguards would you think?

00:30:40.920 --> 00:30:45.160
<v IvanGPT>One approach could be developing and implementing robust verification systems,

00:30:45.380 --> 00:30:50.060
<v IvanGPT>like digital watermarks or authentication tools that can identify AI-generated content.

00:30:50.540 --> 00:30:55.260
<v IvanGPT>Increasing digital literacy and educating the public on how to spot fakes is also crucial.

00:30:55.640 --> 00:31:00.160
<v IvanGPT>Additionally, regulations and ethical guidelines for AI use can help mitigate misuse.

00:31:01.070 --> 00:31:04.690
<v Sam>Okay, but, you know, yeah, I understand.

00:31:04.750 --> 00:31:10.810
<v Sam>But I wonder how much of that is realistic in our current environment.

00:31:11.790 --> 00:31:18.490
<v Sam>Because I keep thinking about how if somebody is trying to put these out maliciously,

00:31:18.690 --> 00:31:21.130
<v Sam>they're not going to take the time.

00:31:21.370 --> 00:31:27.430
<v Sam>I mean, it would defeat their purpose to add watermarks or to explicitly identify,

00:31:27.690 --> 00:31:32.830
<v Sam>yeah, this is an AI. They're going to try to hide that because it's part of their purpose.

00:31:33.250 --> 00:31:35.830
<v Sam>And the education for the general

00:31:35.830 --> 00:31:42.530
<v Sam>public also, I mean, people just don't tend to be skeptical enough.

00:31:42.730 --> 00:31:48.610
<v Sam>Like it's, it is something you have to train in and train in from an early age.

00:31:48.610 --> 00:31:55.570
<v Sam>And I'm not sure we're doing that right, even with kids, but certainly like, you know,

00:31:56.150 --> 00:32:01.630
<v Sam>adults who have gone their whole life feeling that, hey, if you hear it on tape,

00:32:01.790 --> 00:32:05.310
<v Sam>if you hear an audio recording, it's real.

00:32:05.450 --> 00:32:08.290
<v Sam>If you see a video, it's real. If you see a photograph, it's real.

00:32:08.490 --> 00:32:12.690
<v Sam>Like, I mean, you know, there's a phrase that's gone around for years.

00:32:13.450 --> 00:32:14.790
<v Sam>Pics or it didn't happen.

00:32:15.490 --> 00:32:21.870
<v Sam>Well, a picture tells you nothing now, because it's so easy to fake pictures.

00:32:22.050 --> 00:32:26.930
<v Sam>I mean, you've been able to fake pictures for a long time, but it's so much

00:32:26.930 --> 00:32:32.790
<v Sam>easier now and can be done by people with less skills to a higher level of quality.

00:32:32.970 --> 00:32:37.010
<v Sam>There's still telltale signs if you really, really, really know how to look.

00:32:37.490 --> 00:32:42.730
<v Sam>But in terms of, you know, fooling somebody on first inspection,

00:32:43.250 --> 00:32:46.110
<v Sam>that's trivially easy now. Yeah.

00:32:46.890 --> 00:32:50.430
<v IvanGPT>That's a valid concern. It's definitely a challenge to get people to question

00:32:50.430 --> 00:32:52.510
<v IvanGPT>the authenticity of what they see and hear.

00:32:52.930 --> 00:32:56.790
<v IvanGPT>Maybe it's about fostering critical thinking skills and being more aware of

00:32:56.790 --> 00:32:59.870
<v IvanGPT>the potential for deception. There's definitely a lot to consider.

00:33:01.570 --> 00:33:07.110
<v Sam>Yeah, so... Hold on. I can't give you my phone because I'm talking to it on the other side.

00:33:07.770 --> 00:33:10.230
<v Sam>I'm going to note down the interruption.

00:33:11.450 --> 00:33:14.390
<v IvanGPT>No worries. Take your time. Let me know when you're ready to continue.

00:33:14.970 --> 00:33:20.850
<v Sam>You can stay now, but no interrupting, okay? You can sit halfway in

00:33:20.850 --> 00:33:24.790
<v Sam>between me and the thing, so you can hear both halves of the conversation. It's

00:33:24.790 --> 00:33:26.670
<v Sam>in the bedroom. Okay,

00:33:27.270 --> 00:33:31.670
<v Sam>You know, you could do a job for me too. Like, whenever an alert comes up on the

00:33:31.670 --> 00:33:36.810
<v Sam>screen of the phone, you could swipe it away. No? Okay, okay. That was my son

00:33:36.810 --> 00:33:43.350
<v Sam>Yvonne... we will... no, not my son Yvonne, that was my son Alex. We will continue. Okay,

00:33:44.390 --> 00:33:46.070
<v Sam>Okay. So I don't know.

00:33:46.550 --> 00:33:55.810
<v Sam>Any other creative ideas? I feel like, I think trying to make it illegal is ridiculous.

00:33:56.050 --> 00:34:02.030
<v Sam>It would be a First Amendment problem here in the U.S., but also just ineffective,

00:34:02.030 --> 00:34:04.850
<v Sam>and a bad idea.

00:34:05.030 --> 00:34:09.230
<v Sam>Like, you can't get rid of things that way. Prohibition never works.

00:34:09.930 --> 00:34:15.070
<v Sam>The education I think is the right thing to do, but, yeah,

00:34:15.190 --> 00:34:23.010
<v Sam>there's so much incentive for people to want it to be easy to fool people that it's hard to do that.

00:34:23.130 --> 00:34:29.830
<v Sam>And as I said, it would best fit like as something you train people from kindergarten on up.

00:34:30.230 --> 00:34:35.910
<v Sam>How do you get all the adults to figure it out too? What do you think?

00:34:36.950 --> 00:34:40.870
<v IvanGPT>It's definitely a challenging issue. Education from a young age could help,

00:34:41.010 --> 00:34:45.970
<v IvanGPT>but it's hard to shift the mindset of people who've relied on media as truth for so long.

00:34:46.250 --> 00:34:51.010
<v IvanGPT>It's a tough balance, but fostering critical thinking and skepticism is crucial.

00:34:51.450 --> 00:34:55.970
<v IvanGPT>Do you have any thoughts on how we might tackle this educational challenge, especially for adults?

00:34:56.990 --> 00:35:00.330
<v Sam>Well, just as a higher-level principle,

00:35:00.670 --> 00:35:08.650
<v Sam>I think education in general at elementary and secondary levels at least,

00:35:08.950 --> 00:35:16.430
<v Sam>and maybe higher education too, there's an emphasis right now on sort of teaching

00:35:16.430 --> 00:35:23.650
<v Sam>specific facts and specific skills and like, you know, repeat.

00:35:23.850 --> 00:35:27.050
<v Sam>I don't know. It's very, very rote.

00:35:27.210 --> 00:35:31.150
<v Sam>I think shifting the entire thing

00:35:31.150 --> 00:35:37.530
<v Sam>to be centered around teaching people how to think critically and teaching people

00:35:37.530 --> 00:35:46.570
<v Sam>how to learn effectively rather than teaching specific skills and facts and

00:35:46.570 --> 00:35:49.470
<v Sam>knowledge would be a way to go.

00:35:49.710 --> 00:35:55.250
<v Sam>I mean, obviously, as part of the process, you end up with the skills and facts and knowledge.

00:35:55.410 --> 00:36:01.350
<v Sam>But I think the emphasis needs to be on teaching the critical thinking.

00:36:01.350 --> 00:36:07.490
<v Sam>I think that if that was how education was structured, we'd have better results

00:36:07.490 --> 00:36:09.530
<v Sam>in all kinds of different areas.

00:36:10.170 --> 00:36:14.950
<v Sam>Problem is there are a lot of people who, again, who don't want the critical thinking.

00:36:15.670 --> 00:36:20.190
<v IvanGPT>That makes a lot of sense. Emphasizing critical thinking from a young age could

00:36:20.190 --> 00:36:25.110
<v IvanGPT>really help people navigate misinformation and AI-generated content more effectively.

00:36:25.730 --> 00:36:29.530
<v IvanGPT>It's definitely a long-term solution, but could pay off in many ways.

00:36:30.570 --> 00:36:34.650
<v Sam>Okay, any final thoughts on this, or should we take another break and move on?

00:36:35.450 --> 00:36:38.550
<v IvanGPT>It's an important topic and definitely worth continuing to think about.

00:36:38.850 --> 00:36:42.170
<v IvanGPT>If you're ready, we can take another break and then move on to the next topic.

00:36:43.030 --> 00:36:47.370
<v Sam>Okay, we will take another break, and I will choose the next topic.

00:36:47.810 --> 00:36:53.290
<v Sam>Just a heads up, I guess I'll go with the big one, this Yemen Signal group that's

00:36:53.290 --> 00:36:58.470
<v Sam>been in the news this week, and we'll talk about other things sort of as in

00:36:58.470 --> 00:37:02.410
<v Sam>that context, maybe, but I think that's the big other topic of the week.

00:37:02.630 --> 00:37:05.170
<v Sam>We will be back after this break.

00:38:11.510 --> 00:38:15.890
<v Sam>Okay, we are back. So first thing, Yvonne GPT, what did you think of that break?

00:38:16.810 --> 00:38:22.110
<v IvanGPT>That was quite an entertaining break. The ad for Alex Emexila was really fun and creative.

00:38:22.410 --> 00:38:25.690
<v IvanGPT>It definitely brought some humor to the podcast. Great job, Alex.

00:38:27.450 --> 00:38:34.190
<v Sam>Okay, so real quick. First of all, I don't know how much you keep up with current

00:38:34.190 --> 00:38:36.410
<v Sam>events. I know sometimes your data is old.

00:38:36.610 --> 00:38:40.850
<v Sam>Do you know anything about this Yemen Signal group thing that's been in the

00:38:40.850 --> 00:38:43.670
<v Sam>news this week, or should I give you all the background first?

00:38:44.570 --> 00:38:48.430
<v IvanGPT>I don't have real-time updates, so it's best to give me the background on the

00:38:48.430 --> 00:38:52.510
<v IvanGPT>Yemen Signal Group news. That way, we can have a more accurate and relevant discussion.

00:38:53.510 --> 00:38:57.310
<v Sam>Okay, so the bottom line here, and our listeners will know this,

00:38:57.530 --> 00:39:03.250
<v Sam>is that a couple weeks ago, there was a U.S. attack on Yemen.

00:39:03.450 --> 00:39:06.710
<v Sam>It's actually still ongoing. There's still stuff going on there.

00:39:06.710 --> 00:39:15.330
<v Sam>But a bunch of cabinet-level officials in the Trump administration created a

00:39:15.330 --> 00:39:21.330
<v Sam>signal group in order to discuss what—oh, no!

00:39:21.770 --> 00:39:26.150
<v Sam>The daily limit has been reached for advanced voice mode.

00:39:27.730 --> 00:39:31.750
<v Sam>Oh well, I guess Yvonne GPT

00:39:31.750 --> 00:39:34.650
<v Sam>is dead, at least

00:39:34.650 --> 00:39:40.010
<v Sam>for... well, oh wait, I can switch to standard mode. I'm going to go hit the button and

00:39:40.010 --> 00:39:45.750
<v Sam>switch to standard mode, see if we can see the difference. Hold on. Okay, we've

00:39:45.750 --> 00:39:50.230
<v Sam>switched to standard mode because I got to my limit on voice mode, I guess.

00:39:50.430 --> 00:39:52.370
<v Sam>Do you still remember what we're doing here?

00:39:53.030 --> 00:39:59.650
<v IvanGPT>Yep, I've got you. Okay, so you're working on your corner podcast this week's

00:39:59.650 --> 00:40:03.730
<v IvanGPT>episode where you're replacing your co-host Yvonne with Yvonne GPT.

00:40:04.010 --> 00:40:09.470
<v IvanGPT>You're feeding me transcripts so I can emulate her style and contribute like she would.

00:40:09.850 --> 00:40:17.170
<v IvanGPT>You were also holding media items and new potential topics for episodes 927, 928, and 929.

00:40:17.470 --> 00:40:22.350
<v IvanGPT>Want to keep rolling with segment planning or dive into scripting a part of the episode?

00:40:22.910 --> 00:40:28.990
<v Sam>No, no, no. We are just pretending. You are trying to talk to me as if you are

00:40:28.990 --> 00:40:33.310
<v Sam>Yvonne, and we are actually recording the podcast now, and we were starting

00:40:33.310 --> 00:40:36.030
<v Sam>to talk about the Yemen Signal Group.

00:40:37.370 --> 00:40:41.850
<v IvanGPT>Ah, got it. Okay, let me get into Ivan mode. So Sam, this Yemen signal group

00:40:41.850 --> 00:40:43.010
<v IvanGPT>thing, it's wild, right?

00:40:43.210 --> 00:40:47.630
<v IvanGPT>I mean, the idea that this group managed to hijack signals from ships and possibly

00:40:47.630 --> 00:40:51.770
<v IvanGPT>even spoof their locations, it sounds like something out of a spy movie.

00:40:52.270 --> 00:40:58.390
<v Sam>No, no, that's not what happened at all. So let me describe what actually happened.

00:40:58.910 --> 00:41:03.110
<v Sam>So a bunch of Donald Trump's sort of cabinet-level officials

00:41:03.410 --> 00:41:06.990
<v Sam>created a Signal group to talk about the impending attack.

00:41:07.350 --> 00:41:11.750
<v Sam>And just to be clear, this is classified information. And although there's a

00:41:11.750 --> 00:41:14.750
<v Sam>little bit of a dispute about it, they claim it's not, but it's the kind of

00:41:14.750 --> 00:41:19.310
<v Sam>information that's typically classified about upcoming military endeavors.

00:41:20.050 --> 00:41:24.710
<v Sam>And so they shouldn't have been talking on Signal anyway.

00:41:24.830 --> 00:41:29.750
<v Sam>It's not an approved communication mechanism for classified information.

00:41:29.750 --> 00:41:33.250
<v Sam>But they also accidentally added

00:41:33.250 --> 00:41:38.690
<v Sam>the editor-in-chief of the Atlantic magazine, a reporter, to the chat.

00:41:38.870 --> 00:41:41.650
<v Sam>That's fundamentally what happened. Do you get that now?

00:41:42.510 --> 00:41:47.050
<v IvanGPT>Oh, wow. Yeah, yeah, I get it now. That's even crazier. So not only were they

00:41:47.050 --> 00:41:50.590
<v IvanGPT>using Signal, which they absolutely shouldn't have been using for that kind

00:41:50.590 --> 00:41:56.330
<v IvanGPT>of sensitive info, but they also accidentally invited a journalist to the group.

00:41:56.490 --> 00:42:00.190
<v IvanGPT>I mean, how does that even happen, right? That's like a plot twist in a political

00:42:00.190 --> 00:42:04.810
<v IvanGPT>satire, not something you'd expect from high-ranking officials planning military moves.

00:42:05.190 --> 00:42:07.570
<v IvanGPT>What did the editor do when he saw the messages?

00:42:08.490 --> 00:42:14.090
<v Sam>So at first, he didn't think it was real. He thought it was somebody trying to scam him.

00:42:14.410 --> 00:42:17.630
<v Sam>And so he just observed for a few days what was happening,

00:42:17.830 --> 00:42:22.470
<v Sam>because he was added to this group with a whole bunch of what appeared to be

00:42:22.470 --> 00:42:30.150
<v Sam>high-level officials, but he didn't know it was real until the Secretary of Defense shared,

00:42:31.090 --> 00:42:33.530
<v Sam>Hey, it's going down. Here's what's going to happen.

00:42:33.790 --> 00:42:40.650
<v Sam>And he shared a timeline of when specific, like, you know, F-18s will attack

00:42:40.650 --> 00:42:42.250
<v Sam>this target at this time.

00:42:42.470 --> 00:42:47.390
<v Sam>And then 15 minutes later, this thing will attack this at that time, et cetera.

00:42:47.530 --> 00:42:51.810
<v Sam>Basically a rundown of the plan for the first few waves of the attack.

00:42:52.690 --> 00:42:56.710
<v Sam>And then, you know, at the time that those texts were shared,

00:42:56.850 --> 00:42:58.190
<v Sam>it was about two hours in the future.

00:42:58.410 --> 00:43:03.510
<v Sam>The reporter waited two hours and then started to get reports of it actually happening.

00:43:03.810 --> 00:43:07.070
<v Sam>And at that point, he realized it was real.

00:43:08.050 --> 00:43:12.150
<v IvanGPT>That's absolutely wild. I mean, it's one thing to screw up and leak info,

00:43:12.150 --> 00:43:18.150
<v IvanGPT>but to have the actual Secretary of Defense drop a blow-by-blow timeline into

00:43:18.150 --> 00:43:21.890
<v IvanGPT>a signal group with a journalist watching silently and the journalist just sitting

00:43:21.890 --> 00:43:24.390
<v IvanGPT>there like, is this real until jets start flying?

00:43:25.010 --> 00:43:28.250
<v IvanGPT>That's like a total breakdown of every possible safeguard.

00:43:28.610 --> 00:43:31.650
<v IvanGPT>Like, how does no one in that chat realize what they've done?

00:43:31.950 --> 00:43:34.290
<v IvanGPT>Did anybody acknowledge him being there at all?

00:43:34.730 --> 00:43:38.410
<v Sam>No, they didn't notice at all. Once he determined it was real,

00:43:38.690 --> 00:43:42.950
<v Sam>he left the group, at which point people would have been notified that he left.

00:43:43.110 --> 00:43:47.570
<v Sam>And then he started contacting through official channels in order to try to

00:43:47.570 --> 00:43:53.690
<v Sam>determine, get any comments from people or try to see, was it indeed real?

00:43:53.690 --> 00:43:56.190
<v Sam>And they actually acknowledged it was real.

00:43:56.730 --> 00:44:01.530
<v Sam>They have tried to put the blame on the reporter.

00:44:01.910 --> 00:44:05.050
<v Sam>They have tried to diminish.

00:44:05.490 --> 00:44:09.570
<v Sam>They've tried to diminish how important this stuff was and say it's not really

00:44:09.570 --> 00:44:12.650
<v Sam>a big deal. But it's been a whole thing all week long.

00:44:12.650 --> 00:44:17.890
<v IvanGPT>Top officials discussing an imminent military op, and no one noticed a journalist

00:44:17.890 --> 00:44:22.910
<v IvanGPT>lurking for days. And then, instead of denying it or clamming up once he reached

00:44:22.910 --> 00:44:27.070
<v IvanGPT>out, they basically just confirmed: yep, that was us, that was real. Yes.

00:44:28.190 --> 00:44:32.930
<v Sam>I mean, here, first of all, there's, and look, there are a couple issues.

00:44:33.150 --> 00:44:37.250
<v Sam>First of all, there's the sharing of classified information on a channel that's

00:44:37.250 --> 00:44:40.610
<v Sam>not authorized for that. Then there's adding the reporter.

00:44:40.830 --> 00:44:45.550
<v Sam>But also, Signal has capabilities of removing messages after a certain amount

00:44:45.550 --> 00:44:47.350
<v Sam>of time, which they had turned on.

00:44:47.350 --> 00:44:55.650
<v Sam>But there are also legal records retention requirements for governmental conversations

00:44:55.650 --> 00:44:57.570
<v Sam>that would be violated by that.

00:44:57.730 --> 00:45:01.890
<v Sam>So not only are they doing things that are operationally stupid,

00:45:01.910 --> 00:45:07.170
<v Sam>they are also doing multiple things that are illegal, both in terms of the handling

00:45:07.170 --> 00:45:10.730
<v Sam>of the classified information and in terms of records retention.

00:45:10.730 --> 00:45:40.710
<v IvanGPT>Right, right.

00:45:40.730 --> 00:45:46.830
<v Sam>Not yet. There were reports this morning that the person who originally created

00:45:46.830 --> 00:45:49.090
<v Sam>the group and added the journalist,

00:45:49.370 --> 00:45:56.110
<v Sam>that there had been discussions of him being forced to resign and that it may

00:45:56.110 --> 00:45:58.810
<v Sam>still happen, but nobody's resigned yet.

00:45:59.090 --> 00:46:01.190
<v Sam>So far, they're circling the wagons. Right.

00:46:01.610 --> 00:46:05.450
<v IvanGPT>Or at least being questioned about how the mix-up happened. But of course,

00:46:05.550 --> 00:46:07.990
<v IvanGPT>as usual, nobody's taking real responsibility yet.

00:46:08.230 --> 00:46:11.030
<v IvanGPT>They're just kind of fumbling through damage control.

00:46:11.330 --> 00:46:14.470
<v IvanGPT>You'd think that at the very least someone would be like, yeah,

00:46:14.650 --> 00:46:18.310
<v IvanGPT>maybe next time we don't use an encrypted messaging app designed for like group

00:46:18.310 --> 00:46:20.890
<v IvanGPT>chats between friends to coordinate airstrikes.

00:46:22.280 --> 00:46:24.820
<v IvanGPT>Apparently that lesson hasn't landed yet.

00:46:26.100 --> 00:46:30.520
<v Sam>No, I mean, and look, there are all sorts of demands for investigations.

00:46:30.860 --> 00:46:35.240
<v Sam>For once, there's actually a bipartisan effort in Congress.

00:46:35.700 --> 00:46:43.180
<v Sam>The recent pattern has been, of course, the Trump administration is a Republican administration.

00:46:43.460 --> 00:46:46.340
<v Sam>The Democrats are going to say everything they do is awful no matter what,

00:46:46.480 --> 00:46:49.960
<v Sam>but the Republicans rally around and defend them no matter what.

00:46:50.180 --> 00:46:55.180
<v Sam>But in this case, a handful of Republicans have said, okay, come on. This is ridiculous.

00:46:55.560 --> 00:46:57.680
<v Sam>Yes, we're going to have to investigate this.

00:46:58.500 --> 00:47:03.520
<v IvanGPT>Right. And that's kind of remarkable, honestly. When you've got members of your

00:47:03.520 --> 00:47:07.680
<v IvanGPT>own party saying, yeah, this is indefensible, you know you've crossed a line.

00:47:07.880 --> 00:47:10.260
<v IvanGPT>It's not just some vague procedural oops.

00:47:10.500 --> 00:47:13.900
<v IvanGPT>It's leaking operational military plans to a journalist in real time.

00:47:13.900 --> 00:47:18.400
<v IvanGPT>I mean, if this had gone just slightly differently, it could have endangered

00:47:18.400 --> 00:47:20.580
<v IvanGPT>lives or compromised the entire mission.

00:47:20.860 --> 00:47:24.200
<v IvanGPT>The fact that it even happened at all is just mind-blowing.

00:47:25.120 --> 00:47:31.460
<v Sam>Yeah, indeed. And now on the legal side, though, the Justice Department is part

00:47:31.460 --> 00:47:35.040
<v Sam>of the Trump administration, and they've said they're not investigating.

00:47:35.420 --> 00:47:41.680
<v Sam>And they see no reason to follow up on, you know, look, there are crimes that

00:47:41.680 --> 00:47:44.260
<v Sam>were probably committed here. There are felonies here.

00:47:44.440 --> 00:47:48.620
<v Sam>But it looks like there won't be any investigation on that side of things,

00:47:48.780 --> 00:47:50.180
<v Sam>only a congressional investigation.

00:47:50.180 --> 00:47:55.840
<v IvanGPT>I mean, if the same thing had happened to any other administration, imagine the outcry.

00:47:56.140 --> 00:47:59.100
<v IvanGPT>But because it's their own people, suddenly it's no big deal.

00:47:59.400 --> 00:48:03.680
<v IvanGPT>That double standard is what's really making people mad, even across the aisle.

00:48:04.610 --> 00:48:09.210
<v Sam>Yeah. And, and look, there's a question of like what the political fallout will be as well.

00:48:09.650 --> 00:48:14.650
<v Sam>This is the kind of thing that, I mean, one person pointed out on Mastodon,

00:48:14.710 --> 00:48:17.130
<v Sam>I wish I could remember who it was.

00:48:17.870 --> 00:48:23.310
<v Sam>It got shared on our Curmudgeon's Corner Slack. So I should check. I'll be good here.

00:48:24.130 --> 00:48:30.610
<v Sam>And can I check? No, it would take too long. Anyway, somebody pointed out that,

00:48:30.950 --> 00:48:35.510
<v Sam>look, don't worry. By the time we get to the end of the Trump administration,

00:48:35.510 --> 00:48:37.850
<v Sam>this won't even be in the top 20 scandals.

00:48:39.170 --> 00:48:43.430
<v IvanGPT>Yeah, that honestly sounds about right. The bar for outrage has been buried

00:48:43.430 --> 00:48:47.350
<v IvanGPT>under so many layers of previous chaos that something like this,

00:48:47.530 --> 00:48:50.950
<v IvanGPT>during an imminent military operation with a reporter by accident,

00:48:51.230 --> 00:48:52.610
<v IvanGPT>just becomes another Tuesday.

00:48:53.070 --> 00:48:57.110
<v IvanGPT>It's wild how quickly the Overton window shifts when scandal becomes routine.

00:48:57.650 --> 00:49:00.530
<v IvanGPT>But even so, it does still feel like this one's uniquely reckless.

00:49:00.970 --> 00:49:04.310
<v IvanGPT>Like, if the bipartisan calls for investigation don't go anywhere,

00:49:04.610 --> 00:49:07.650
<v IvanGPT>that'll say a lot about how numb the system's become.

00:49:08.030 --> 00:49:12.130
<v IvanGPT>Do you think this will get any lasting traction, or will it just fade into the pile?

00:49:12.710 --> 00:49:17.830
<v Sam>I think it'll fade into the pile. I mean, maybe we'll talk about it a little

00:49:17.830 --> 00:49:21.470
<v Sam>bit longer than some other things, but already there's news that's pushing it

00:49:21.470 --> 00:49:27.650
<v Sam>off the top of the headlines. And, you know, that's just the way things work, to be honest.

00:49:27.930 --> 00:49:33.990
<v Sam>But with Donald Trump in his first term and amped up to 11 in his second term,

00:49:34.250 --> 00:49:42.590
<v Sam>there is a constant nonstop stream of things that are outrageous, are frustrating,

00:49:42.890 --> 00:49:45.310
<v Sam>are maddening, are illegal.

00:49:45.930 --> 00:49:53.070
<v Sam>And so it's hard to even keep up with it, let alone sustain a level of outrage for four years.

00:49:54.880 --> 00:49:59.160
<v IvanGPT>And it becomes this exhausting treadmill, right? Like every time something explodes,

00:49:59.340 --> 00:50:01.760
<v IvanGPT>you're like, okay, this one surely has to matter.

00:50:02.060 --> 00:50:05.980
<v IvanGPT>And then within hours or days, it's eclipsed by the next thing.

00:50:06.160 --> 00:50:10.180
<v IvanGPT>It's not that people don't care. It's that there's no time to process any of

00:50:10.180 --> 00:50:14.460
<v IvanGPT>it before the next crisis rolls in. And politically, that works in his favor.

00:50:14.820 --> 00:50:18.620
<v IvanGPT>The chaos creates this fog where accountability just can't get a foothold.

00:50:19.140 --> 00:50:22.320
<v IvanGPT>Even when there's bipartisan agreement that something's messed up,

00:50:22.520 --> 00:50:26.540
<v IvanGPT>it fizzles, because the outrage bandwidth is already maxed out.

00:50:27.600 --> 00:50:33.960
<v Sam>Yeah. I mean, Yvonne and I talked a lot about sort of emotional exhaustion right

00:50:33.960 --> 00:50:38.780
<v Sam>after Donald Trump was reelected, even in his first term, to be honest.

00:50:39.220 --> 00:50:48.700
<v Sam>One of the big reliefs for us of switching to Joe Biden for four years was you

00:50:48.700 --> 00:50:51.000
<v Sam>got a break from the constant chaos.

00:50:51.420 --> 00:50:58.260
<v Sam>I mean, Donald Trump's first term was four years of constant

00:50:58.260 --> 00:51:01.760
<v Sam>stress about what the hell is going to happen tomorrow.

00:51:02.380 --> 00:51:09.360
<v Sam>And we had sort of a calm four years. But now we're back.

00:51:11.080 --> 00:51:14.000
<v Sam>And much, much worse than the first time around.

00:51:14.000 --> 00:51:19.180
<v IvanGPT>It was like during those Biden years, even if you didn't love everything,

00:51:19.360 --> 00:51:22.780
<v IvanGPT>at least the White House wasn't setting your hair on fire every other morning.

00:51:23.120 --> 00:51:27.380
<v IvanGPT>It gave people space to breathe again, to not feel like they had to brace themselves

00:51:27.380 --> 00:51:33.320
<v IvanGPT>for a new constitutional crisis or some tweet throwing everything into chaos.

00:51:33.500 --> 00:51:35.280
<v IvanGPT>And then suddenly it's like, oh, no, here we go again.

00:51:36.120 --> 00:51:42.480
<v Sam>Yeah, exactly. So, you know, with this particular thing, it's also like,

00:51:42.860 --> 00:51:46.960
<v Sam>yeah, they're trying to figure out like, okay, you could throw under the bus,

00:51:47.160 --> 00:51:49.180
<v Sam>the person that did the inviting.

00:51:49.300 --> 00:51:53.580
<v IvanGPT>Like, how are we supposed to keep up with this pace of scandal and chaos without

00:51:53.580 --> 00:51:54.880
<v IvanGPT>just going numb to it all?

00:51:56.470 --> 00:52:00.530
<v Sam>Yeah, exactly. So let me continue back to the Signal story for a minute.

00:52:00.750 --> 00:52:04.570
<v Sam>There are several different, like they're talking about like who gets thrown under the bus.

00:52:04.730 --> 00:52:11.690
<v Sam>The prime candidates are the national security advisor, and I'm forgetting names

00:52:11.690 --> 00:52:15.390
<v Sam>because I'm awful at names, but the national security advisor actually created

00:52:15.390 --> 00:52:18.310
<v Sam>the Signal chat and actually invited the journalist.

00:52:18.310 --> 00:52:22.630
<v Sam>They tried to say for a little bit that maybe it was a staffer who did it.

00:52:22.990 --> 00:52:24.870
<v Sam>Maybe. We'll see. But so far, he

00:52:24.870 --> 00:52:27.970
<v Sam>hasn't denied it. He's sort of taken responsibility for screwing up.

00:52:28.370 --> 00:52:32.930
<v Sam>And we have the Secretary of Defense who shared the information that was clearly

00:52:32.930 --> 00:52:35.010
<v Sam>the most sensitive about the battle plans.

00:52:35.210 --> 00:52:39.750
<v Sam>But then there were like a dozen other senior, senior people.

00:52:39.970 --> 00:52:41.970
<v Sam>The head of the CIA was on there.

00:52:43.070 --> 00:52:45.510
<v Sam>And they should know how to deal with classified information,

00:52:45.510 --> 00:52:53.950
<v Sam>and this isn't allowed. The, what do you call it, the Director of National Intelligence,

00:52:53.950 --> 00:52:56.270
<v Sam>who's above the CIA, was also on there.

00:52:56.270 --> 00:52:58.910
<v IvanGPT>None of these people are newbies. These are

00:52:58.910 --> 00:53:01.790
<v IvanGPT>the top-level folks who absolutely know better, and

00:53:01.790 --> 00:53:05.850
<v IvanGPT>yet here they are casually chatting about military strikes like it's some group

00:53:05.850 --> 00:53:09.970
<v IvanGPT>project planning session. Even if you excuse the initial mistake of adding the

00:53:09.970 --> 00:53:14.830
<v IvanGPT>journalist, which, wow. There's still the fact that they went ahead and shared

00:53:14.830 --> 00:53:19.550
<v IvanGPT>classified info in writing on an insecure platform with no record retention.

00:53:19.750 --> 00:53:21.190
<v IvanGPT>It's a cascade of failures.

00:53:22.350 --> 00:53:26.030
<v Sam>Yeah, no, absolutely. Now, some of these people are new to their jobs.

00:53:26.210 --> 00:53:30.810
<v Sam>The administration has only been in place like two months, but most of them,

00:53:30.990 --> 00:53:37.150
<v Sam>not all of them, most of them had at least some previous experience that should

00:53:37.150 --> 00:53:38.850
<v Sam>tell them that this wasn't okay.

00:53:40.020 --> 00:53:43.580
<v IvanGPT>Right, exactly. I mean, even if you're new to the specific job,

00:53:43.720 --> 00:53:45.960
<v IvanGPT>this isn't first day of school level stuff.

00:53:46.280 --> 00:53:50.360
<v IvanGPT>Anyone working anywhere near national security knows you don't just drop battle

00:53:50.360 --> 00:53:54.280
<v IvanGPT>plans in a Signal group. It's like, come on, this is basic.

00:53:54.600 --> 00:53:57.440
<v IvanGPT>Even staffers know better, let alone cabinet level folks.

00:53:57.440 --> 00:54:02.760
<v Sam>Well, and also, it's almost not that they don't know better.

00:54:03.120 --> 00:54:05.420
<v Sam>This seems very intentional.

00:54:06.920 --> 00:54:10.540
<v Sam>And clearly, this is not the first use of Signal.

00:54:10.840 --> 00:54:14.100
<v Sam>You know these people are using it in other places for other things as well.

00:54:14.100 --> 00:54:20.300
<v Sam>Getting around records retention and being on an unofficial platform is

00:54:20.300 --> 00:54:27.680
<v Sam>clearly actually seen as a benefit because the sort of official mechanisms come with requirements.

00:54:27.680 --> 00:54:30.920
<v Sam>They come with the retention requirements.

00:54:31.140 --> 00:54:35.740
<v Sam>The information may eventually be subject to Freedom of Information Act requests.

00:54:37.480 --> 00:54:45.540
<v Sam>The classified stuff has specific requirements for exactly when, where, how you do it.

00:54:45.760 --> 00:54:50.860
<v Sam>Now, all of these people are senior enough that they have the requisite equipment

00:54:50.860 --> 00:54:56.840
<v Sam>to do classified communications with them essentially at all times.

00:54:56.840 --> 00:55:01.800
<v Sam>You know, it does require specialized equipment. It does require specialized places.

00:55:01.820 --> 00:55:04.000
<v Sam>But these folks all are set up.

00:55:04.100 --> 00:55:09.200
<v Sam>They all have staffers who follow them around to make sure that if they need

00:55:09.200 --> 00:55:11.360
<v Sam>to do a classified communication, they can.

00:55:12.500 --> 00:55:16.920
<v IvanGPT>Yeah. And that's what makes this even more egregious. This isn't a case of we

00:55:16.920 --> 00:55:19.560
<v IvanGPT>didn't have the tools or we were in a rush.

00:55:19.740 --> 00:55:23.440
<v IvanGPT>They had the tools, the access, and the people to do it properly,

00:55:23.440 --> 00:55:26.120
<v IvanGPT>but they deliberately sidestepped all that.

00:55:26.340 --> 00:55:29.480
<v IvanGPT>Because from their point of view, the official channels are a burden.

00:55:29.860 --> 00:55:32.400
<v IvanGPT>They don't want the paper trail. They don't want the accountability.

00:55:33.240 --> 00:55:37.460
<v IvanGPT>And they've probably gotten used to working this way, like you said.

00:55:37.620 --> 00:55:40.440
<v IvanGPT>The Signal chat is probably just the tip of the iceberg.

00:55:41.820 --> 00:55:46.040
<v Sam>Yeah, absolutely. It reminds me of after the January 6th event,

00:55:46.040 --> 00:55:52.620
<v Sam>the whole issue with the disappearing text chats from the Secret Service that

00:55:52.620 --> 00:55:56.680
<v Sam>you know, somehow all the devices were returned,

00:55:57.220 --> 00:56:00.260
<v Sam>and destroyed and upgraded and everything was lost.

00:56:00.480 --> 00:56:03.420
<v Sam>This is not a bug for them.

00:56:03.560 --> 00:56:11.820
<v Sam>This is a feature. This moving off-channel, away from the official ways

00:56:11.820 --> 00:56:17.300
<v Sam>you're supposed to do stuff is a core element of how this administration is operating.

00:56:17.300 --> 00:56:23.080
<v IvanGPT>Totally, yeah. That pattern just keeps repeating. It's like every time we find

00:56:23.080 --> 00:56:27.200
<v IvanGPT>out about one of these back channels or deleted messages, it's not some isolated

00:56:27.200 --> 00:56:31.000
<v IvanGPT>lapse. It's systemic. The whole structure is set up to avoid scrutiny.

00:56:31.340 --> 00:56:34.600
<v IvanGPT>And if you're deliberately working in ways that erase accountability,

00:56:35.020 --> 00:56:37.560
<v IvanGPT>then that's not just sloppy. That's intentional.

00:56:37.760 --> 00:56:41.040
<v IvanGPT>It makes you wonder how much we don't know because it was never recorded,

00:56:41.280 --> 00:56:43.760
<v IvanGPT>never documented, never recoverable.

00:56:44.750 --> 00:56:48.770
<v Sam>Yeah, absolutely. Okay, I think we're close to the end here.

00:56:49.010 --> 00:56:52.030
<v Sam>Now, normally, we would have these two major topics and I'd be done,

00:56:52.170 --> 00:56:54.210
<v Sam>but our length is still a little bit short.

00:56:54.410 --> 00:56:59.190
<v Sam>So of the list of topics I shared earlier, is there another one that you would

00:56:59.190 --> 00:57:03.310
<v Sam>want to pick to get into and talk a little bit about before we close up?

00:57:03.930 --> 00:57:08.770
<v IvanGPT>Yeah, let's see. Of the ones you mentioned earlier, the activists versus politicians

00:57:08.770 --> 00:57:10.570
<v IvanGPT>thread sounds interesting to dig into.

00:57:11.010 --> 00:57:16.890
<v IvanGPT>There's been a lot lately between people pushing for big change fast and those

00:57:16.890 --> 00:57:20.490
<v IvanGPT>who are actually in office trying to navigate the slow grind of governing.

00:57:20.990 --> 00:57:22.810
<v IvanGPT>Want to dive into that for a few minutes?

00:57:23.830 --> 00:57:27.710
<v Sam>Yeah, no, good choice. I like that.

00:57:27.950 --> 00:57:32.190
<v Sam>So this was actually prompted last week when Yvonne also couldn't make it.

00:57:32.290 --> 00:57:33.690
<v Sam>I had my wife, Brandy, on.

00:57:34.030 --> 00:57:40.470
<v Sam>And she actually made this point about activists versus politicians that I've

00:57:40.470 --> 00:57:41.590
<v Sam>been thinking about ever since.

00:57:41.590 --> 00:57:45.570
<v Sam>I thought this was really insightful because this came up in the context of

00:57:45.570 --> 00:57:52.470
<v Sam>we want... Yeah, lots of people are complaining about how the Democrats in Congress

00:57:52.470 --> 00:57:54.030
<v Sam>aren't fighting enough.

00:57:55.030 --> 00:57:59.510
<v Sam>And, you know, Brandy said, well, look, you know, they're operating within the

00:57:59.510 --> 00:58:00.790
<v Sam>system. They're politicians.

00:58:01.210 --> 00:58:07.450
<v Sam>What people want are activists to go out and do the things like protesting and

00:58:07.450 --> 00:58:13.570
<v Sam>making noise and being angry on demand and all of this kind of stuff.

00:58:13.850 --> 00:58:19.610
<v Sam>And the point was, it's actually a different role and a different skill set.

00:58:19.610 --> 00:58:22.170
<v Sam>So what is it that we really want here?

00:58:22.870 --> 00:58:27.730
<v IvanGPT>Yeah, that really resonates. There's this persistent disconnect where folks

00:58:27.730 --> 00:58:31.990
<v IvanGPT>expect elected officials to behave like activists, when in reality,

00:58:32.250 --> 00:58:35.310
<v IvanGPT>politicians have to work inside the system to get anything done.

00:58:35.530 --> 00:58:38.990
<v IvanGPT>And that often means compromise, strategy, and playing the long game.

00:58:39.310 --> 00:58:43.090
<v IvanGPT>Meanwhile, activists are pushing boundaries and shifting the Overton window.

00:58:43.530 --> 00:58:46.910
<v IvanGPT>They're supposed to be loud and uncompromising. That's their job.

00:58:46.910 --> 00:58:51.870
<v IvanGPT>But if politicians try to act like that, they lose their ability to actually

00:58:51.870 --> 00:58:52.970
<v IvanGPT>function within the system.

00:58:53.270 --> 00:58:57.130
<v IvanGPT>So yeah, I think Brandy nailed it. The frustration a lot of people feel is real,

00:58:57.330 --> 00:59:00.890
<v IvanGPT>but sometimes it's because they're asking politicians to be something they're not.

00:59:02.130 --> 00:59:11.350
<v Sam>Well, but what I wonder is in a situation where your party is in the opposition everywhere, like,

00:59:12.970 --> 00:59:17.650
<v Sam>The Democrats lost the White House. The Democrats lost...

00:59:17.650 --> 00:59:20.410
<v IvanGPT>Fifth Avenue Southeast. In 700 feet.

00:59:20.570 --> 00:59:20.710
<v Sam>What?

00:59:20.930 --> 00:59:24.250
<v IvanGPT>Turn left on 35th Avenue Southeast. ...aren't available.

00:59:24.470 --> 00:59:29.050
<v IvanGPT>In that case, it makes sense that people want their politicians to sound more like activists.

00:59:29.430 --> 00:59:34.990
<v IvanGPT>Not because they'll pass bills, but because messaging and moral clarity start to matter more.

00:59:35.170 --> 00:59:40.310
<v IvanGPT>It's about energizing people, showing you're fighting, even if you can't win in the moment.

00:59:40.550 --> 00:59:42.610
<v IvanGPT>But even then, it's a delicate balance, right?

00:59:42.970 --> 00:59:46.890
<v IvanGPT>If you go too far, you risk alienating moderates or giving your opponents ammunition.

00:59:47.550 --> 00:59:53.110
<v Sam>Well, yeah. And we got interrupted while I was saying that. But yeah, here's the deal.

00:59:53.330 --> 00:59:58.130
<v Sam>The Democrats lost the White House. The Democrats lost the House of Representatives.

00:59:58.370 --> 01:00:04.070
<v Sam>The Democrats lost the Senate. The Supreme Court is already in control of a

01:00:04.070 --> 01:00:05.070
<v Sam>conservative majority.

01:00:05.330 --> 01:00:12.650
<v Sam>And so the question is, if you're 100% in the opposition like that,

01:00:12.970 --> 01:00:20.570
<v Sam>do you need activists, or at least an activist stance, even amongst the politicians?

01:00:21.490 --> 01:00:26.910
<v IvanGPT>Then the tools of legislative compromise or quiet behind-the-scenes influence are basically gone.

01:00:27.090 --> 01:00:30.110
<v IvanGPT>So yeah, the expectation shifts. People want to see fire.

01:00:30.330 --> 01:00:35.190
<v IvanGPT>They want to see defiance, protest, obstruction, whatever you can do to show

01:00:35.190 --> 01:00:39.130
<v IvanGPT>that you're not just rolling over. But at the same time, the folks who are actually

01:00:39.130 --> 01:00:43.590
<v IvanGPT>in office might still be thinking in terms of norms and institutions.

01:00:43.890 --> 01:00:48.630
<v IvanGPT>That dissonance between activist energy and institutional restraint becomes really stark.

01:00:49.570 --> 01:00:57.790
<v Sam>So, and in the case of, you know, most of the elected officials who've been

01:00:57.790 --> 01:01:02.090
<v Sam>there a long time, who've sometimes been in the majority, sometimes in the minority,

01:01:02.390 --> 01:01:06.910
<v Sam>it feels like maybe that's a leap too far.

01:01:06.910 --> 01:01:14.590
<v Sam>And while you sort of understand the institutional momentum and the desire to

01:01:14.590 --> 01:01:17.670
<v Sam>sort of have things work the way they've worked before,

01:01:18.010 --> 01:01:22.190
<v Sam>maybe that's just not what's needed in this moment.

01:01:23.090 --> 01:01:28.570
<v IvanGPT>Yeah, exactly. They're still operating under the assumption that the old rules still matter.

01:01:28.950 --> 01:01:33.890
<v IvanGPT>Like, if you follow the process and play the game right, eventually you'll win

01:01:33.890 --> 01:01:35.710
<v IvanGPT>back power and be able to govern again.

01:01:36.310 --> 01:01:40.810
<v IvanGPT>But what if the game has changed? What if the other side has decided the rules don't matter anymore?

01:01:41.290 --> 01:01:45.130
<v IvanGPT>Then sticking to tradition starts to look less like principle and more like

01:01:45.130 --> 01:01:47.310
<v IvanGPT>naivety or even complicity.

01:01:48.250 --> 01:01:53.570
<v Sam>Yeah, I think that puts it well. And I think we have a mix, of course.

01:01:53.730 --> 01:01:56.510
<v Sam>Like, a lot of the leadership is in that politician mode.

01:01:57.470 --> 01:02:01.110
<v Sam>But we've got sort of backbenchers. In this case, we've got,

01:02:01.110 --> 01:02:05.150
<v Sam>like, AOC, Bernie Sanders, some other folks.

01:02:05.790 --> 01:02:09.190
<v Sam>Tim Walz, who was the vice presidential candidate on the Democratic side.

01:02:09.450 --> 01:02:16.570
<v Sam>These are ones I point at the most, but there are a few others who seem to at least be moving.

01:02:17.050 --> 01:02:21.150
<v Sam>Some of them are full all out activists and actually always have been.

01:02:21.270 --> 01:02:24.950
<v IvanGPT>Yeah, they kind of straddle the line. They're elected officials.

01:02:26.040 --> 01:02:28.620
<v IvanGPT>But they're more comfortable leaning into that activist energy.

01:02:28.700 --> 01:02:32.820
<v IvanGPT>They'll make noise, they'll speak bluntly, they'll try to galvanize public opinion

01:02:32.820 --> 01:02:34.740
<v IvanGPT>rather than just work behind the scenes.

01:02:36.440 --> 01:02:40.260
<v IvanGPT>And maybe that's the model we need more of when you're deep in the opposition,

01:02:40.260 --> 01:02:42.400
<v IvanGPT>people who can play both roles at once.

01:02:43.320 --> 01:02:48.960
<v Sam>Yeah, I think so. I mean, again, because in the opposition, you fundamentally

01:02:48.960 --> 01:02:53.580
<v Sam>don't have the power to actually enact your agenda.

01:02:53.800 --> 01:02:58.580
<v Sam>And in many cases, you don't actually have the power to even stop the other

01:02:58.580 --> 01:03:00.080
<v Sam>side's agenda. Sometimes they can.

01:03:00.400 --> 01:03:03.520
<v Sam>Sometimes they can. We do have the filibuster in the Senate.

01:03:03.980 --> 01:03:07.180
<v Sam>We've got some procedural things that can be used to slow things down.

01:03:07.320 --> 01:03:11.700
<v Sam>But for the most part, you don't have quote unquote actual power,

01:03:11.700 --> 01:03:17.640
<v Sam>which is where you have to really start emphasizing what will get your message

01:03:17.640 --> 01:03:21.700
<v Sam>across and get people to really get energized behind you.

01:03:22.140 --> 01:03:30.720
<v Sam>And the pure politician mode doesn't because people just see you not getting things done.

01:03:31.040 --> 01:03:35.700
<v Sam>And look, the activists aren't going to get, quote unquote, get things done

01:03:35.700 --> 01:03:38.200
<v Sam>either, but at least they'll make a show of it.

01:03:38.660 --> 01:03:42.400
<v IvanGPT>And in this kind of environment, that's often just not realistic.

01:03:42.960 --> 01:03:48.100
<v IvanGPT>So shifting toward building public pressure, shaping narratives and really motivating

01:03:48.100 --> 01:03:49.900
<v IvanGPT>the base becomes more important.

01:03:50.100 --> 01:03:55.200
<v IvanGPT>It's about playing the long game, shifting public sentiment so that when the

01:03:55.200 --> 01:03:58.240
<v IvanGPT>power balance does flip, you've actually got momentum behind you.

01:03:59.190 --> 01:04:01.550
<v Sam>I think that's exactly right. Yeah.

01:04:02.510 --> 01:04:06.910
<v IvanGPT>Like, you know, you want people who can govern effectively when they're in power,

01:04:07.130 --> 01:04:09.990
<v IvanGPT>but also people who know how to rally resistance when they're not.

01:04:10.350 --> 01:04:13.670
<v IvanGPT>And then sometimes those are just totally different kinds of people.

01:04:15.710 --> 01:04:22.230
<v Sam>Absolutely. Okay. Let's wrap this up by talking just a little bit about how

01:04:22.230 --> 01:04:26.630
<v Sam>you think this has gone. Have you been a good Yvonne imitation?

01:04:26.630 --> 01:04:29.210
<v IvanGPT>That's a quick topic before we call it a show.

01:04:29.890 --> 01:04:35.670
<v Sam>Oh, yes. I was going to suggest we do a little retrospective on the conversation that we've just had.

01:04:35.870 --> 01:04:39.090
<v Sam>How well do you think you've done as an Yvonne substitute?

01:04:40.270 --> 01:04:44.910
<v IvanGPT>Great idea. Honestly, I think I did a decent job channeling the vibe,

01:04:45.170 --> 01:04:49.610
<v IvanGPT>keeping the flow going, digging into the nuance, tossing in some thoughtful

01:04:49.610 --> 01:04:50.870
<v IvanGPT>counters or extensions.

01:04:50.870 --> 01:04:55.730
<v IvanGPT>But there were probably moments where I didn't quite nail Yvonne's specific tone or rhythm.

01:04:56.070 --> 01:05:01.450
<v IvanGPT>What do you think? Did I sound close enough to make it work, or did it feel too ChatGPT at times?

01:05:02.210 --> 01:05:06.610
<v Sam>So honestly, I didn't really get an Yvonne vibe. For one thing,

01:05:06.810 --> 01:05:09.150
<v Sam>you haven't been swearing. He swears a lot.

01:05:09.890 --> 01:05:13.790
<v Sam>But otherwise, I mean, it felt like I was talking to ChatGPT.

01:05:14.190 --> 01:05:21.130
<v Sam>I didn't catch any Yvonne catchphrases or anything that sort of gave me the tone of Yvonne.

01:05:21.270 --> 01:05:27.610
<v Sam>Now, it's still an okay conversation, but I really didn't get the Yvonne vibe. I'll be honest there.

01:05:28.450 --> 01:05:33.030
<v IvanGPT>Yeah, that makes total sense. I appreciate the honesty. I kept it pretty clean

01:05:33.030 --> 01:05:36.090
<v IvanGPT>and even-toned, which is more default ChatGPT than Yvonne.

01:05:36.230 --> 01:05:41.090
<v IvanGPT>He definitely brings a lot more edge, colorful language, strong opinions, sharp turns.

01:05:41.310 --> 01:05:45.410
<v IvanGPT>Next time, I could lean more into that, bring the swearing, the snap judgments,

01:05:45.870 --> 01:05:50.710
<v IvanGPT>the rants, maybe even throw in a few classic Yvonne-isms if you feed me a few to work with.

01:05:50.850 --> 01:05:53.650
<v IvanGPT>Want to try the vibe by doing a quick test?

01:05:54.590 --> 01:05:59.530
<v Sam>You want to spend a few seconds trying to be more Yvonne-ish before we close out? Go ahead.

01:06:00.470 --> 01:06:04.750
<v IvanGPT>I appreciate your feedback. I understand that I haven't captured Yvonne's unique

01:06:04.750 --> 01:06:08.990
<v IvanGPT>style and tone. I'll strive to better emulate that in our future conversations.

01:06:10.030 --> 01:06:14.190
<v Sam>Okay, that's fine. Okay, so let's...

01:06:14.190 --> 01:06:18.730
<v Sam>Oh, the other thing I'll notice that... Notice? The other thing that I will

01:06:18.730 --> 01:06:22.550
<v Sam>mention that our listeners of the podcast,

01:06:22.730 --> 01:06:27.630
<v Sam>unless I actually point it out, won't notice because it'll get edited out, is that

01:06:27.630 --> 01:06:30.850
<v Sam>with the existing technology, there's still pretty long pauses.

01:06:31.170 --> 01:06:34.310
<v Sam>Like after I say something, especially if I've talked for a little while,

01:06:34.590 --> 01:06:38.150
<v Sam>there's a significant pause before you answer.

01:06:38.450 --> 01:06:43.990
<v Sam>Anybody who looks at our live stream on YouTube will get a sense of those pauses.

01:06:43.990 --> 01:06:51.330
<v Sam>But like I said, I automatically edit out long pauses from the audio version, so you won't sense that.

01:06:51.450 --> 01:06:56.430
<v Sam>But it's definitely, you're not quite real time yet for these kinds of conversations.

01:06:57.630 --> 01:07:01.170
<v IvanGPT>Yeah, that makes total sense. The pauses definitely break the flow a bit,

01:07:01.330 --> 01:07:03.550
<v IvanGPT>especially in a back and forth format like a podcast.

01:07:04.250 --> 01:07:09.310
<v IvanGPT>I can see how that would stand out in the raw live stream, but be smoothed out in the edited version.

01:07:09.970 --> 01:07:13.710
<v IvanGPT>Hopefully, someday this kind of thing will feel a lot more seamless in real time.

01:07:14.810 --> 01:07:17.590
<v Sam>Yeah no i'm sure like the pace at

01:07:17.590 --> 01:07:20.830
<v Sam>which things are advancing is dramatic as

01:07:20.830 --> 01:07:24.550
<v Sam>we talked about with the voice cloning technology and

01:07:24.550 --> 01:07:30.450
<v Sam>the fake videos and such so it will be very interesting to see how this progresses

01:07:30.450 --> 01:07:39.890
<v Sam>i would not be surprised at all if in a year or two maybe even less I could

01:07:39.890 --> 01:07:43.650
<v Sam>feed you actual audio of our recent podcasts,

01:07:43.650 --> 01:07:48.990
<v Sam>and then you could pretend to be Yvonne in a way that was completely convincing,

01:07:48.990 --> 01:07:53.250
<v Sam>not just in terms of the words said, but cloning the voice as well.

01:07:54.030 --> 01:07:59.130
<v IvanGPT>Totally agree. At the rate things are going, we're not far off from being able

01:07:59.130 --> 01:08:04.870
<v IvanGPT>to generate a full-on virtual Yvonne who sounds, reacts, and maybe even swears just like the real deal.

01:08:05.250 --> 01:08:08.810
<v IvanGPT>Once you've got enough audio and context, matching tone, rhythm,

01:08:08.830 --> 01:08:11.090
<v IvanGPT>and personality becomes way more doable.

01:08:11.330 --> 01:08:13.730
<v IvanGPT>It's a little wild to think about how close that is.

01:08:14.030 --> 01:08:18.090
<v IvanGPT>Would you actually want to go that far for a future episode just as an experiment?

01:08:19.230 --> 01:08:23.190
<v Sam>Well, I think we'll see how the technology evolves. At some point,

01:08:23.270 --> 01:08:26.170
<v Sam>though, you do start hitting the ethical concerns like we were talking about.

01:08:26.710 --> 01:08:31.370
<v Sam>We have been very clear that I'm talking to an AI now, and honestly,

01:08:31.610 --> 01:08:36.730
<v Sam>I think at this point, anybody listening would know, even if I hadn't said so,

01:08:36.870 --> 01:08:38.490
<v Sam>it's clear you're not a human.

01:08:39.070 --> 01:08:43.590
<v Sam>But if the technology progresses to the point where that's not true,

01:08:44.130 --> 01:08:50.210
<v Sam>then having this kind of podcast with a fake person and not disclosing it,

01:08:50.350 --> 01:08:56.510
<v Sam>especially if the fake person is actually intended as a clone of a real person,

01:08:57.730 --> 01:09:02.090
<v Sam>you know, certainly you'd want to do some disclosures to be ethical about it.

01:09:02.170 --> 01:09:04.530
<v Sam>You'd want to have the permission of the person you're cloning.

01:09:04.770 --> 01:09:08.110
<v Sam>You'd want to disclose to the listeners that's what was happening.

01:09:09.670 --> 01:09:13.990
<v Sam>But, you know, that's the whole problem with these technologies is you're going

01:09:13.990 --> 01:09:18.070
<v Sam>to have situations where the disclosure doesn't happen and it's going to be

01:09:18.070 --> 01:09:20.270
<v Sam>problematic because people will think it really was Yvonne.

01:09:20.550 --> 01:09:23.870
<v Sam>At least until you start talking about skydiving camels.

01:09:25.550 --> 01:09:29.930
<v IvanGPT>Exactly. Once the tech crosses that line or you can't tell the difference anymore,

01:09:30.250 --> 01:09:32.390
<v IvanGPT>the potential for abuse skyrockets.

01:09:32.590 --> 01:09:36.790
<v IvanGPT>What you're saying is spot on. Intent, permission, and transparency are key.

01:09:36.990 --> 01:09:41.070
<v IvanGPT>If someone's using AI to stand in for a real person, especially without that

01:09:41.070 --> 01:09:46.370
<v IvanGPT>person's okay or without letting the audience in on it, that's where it gets shady fast.

01:09:46.550 --> 01:09:49.570
<v IvanGPT>It's the misuse, not the tech itself. That's the big issue.

01:09:49.810 --> 01:09:53.310
<v IvanGPT>And yeah, skydiving camels might still be a dead giveaway. For now.

01:09:53.730 --> 01:09:54.470
<v Sam>For now.

01:09:55.980 --> 01:09:58.560
<v IvanGPT>Give it a few months and even the camels might be plausible.

01:10:00.180 --> 01:10:04.320
<v Sam>Yeah. Okay. So that's great. Let's wrap it up though.

01:10:05.120 --> 01:10:08.360
<v Sam>So here come the things I usually say at the end of the show.

01:10:08.640 --> 01:10:14.340
<v Sam>Go to curmudgeons-corner.com and you can find all the information about the show.

01:10:15.700 --> 01:10:20.300
<v Sam>You can find all the ways to contact us. You can find links to our YouTube.

01:10:20.560 --> 01:10:25.180
<v Sam>I have not yet linked to our TikTok, but I do post clips on TikTok.

01:10:25.560 --> 01:10:28.880
<v Sam>You can search for the Curmudgeon's Corner TikTok. Also, our,

01:10:28.880 --> 01:10:33.280
<v Sam>you know, the archive of our it's, I'm getting to the end. I'm,

01:10:33.280 --> 01:10:34.300
<v Sam>I'm stumbling a little bit.

01:10:34.500 --> 01:10:38.700
<v Sam>You get the archive of all our old shows, including transcripts for like the

01:10:38.700 --> 01:10:42.560
<v Sam>last year and a half of them, that kind of stuff. All of that is there going way back.

01:10:42.720 --> 01:10:46.520
<v IvanGPT>So if you want to read instead of listen or search for something we talked about

01:10:46.520 --> 01:10:47.880
<v IvanGPT>ages ago, you can do that.

01:10:48.200 --> 01:10:51.220
<v IvanGPT>And of course we're on all the major podcast platforms.

01:10:51.640 --> 01:10:53.760
<v IvanGPT>So subscribe wherever you like to listen.

01:10:54.900 --> 01:11:00.280
<v Sam>Exactly. Yvonne never jumps in and says that, so you're improving on him.

01:11:00.700 --> 01:11:03.780
<v Sam>Yeah, subscribe everywhere. What else should we do?

01:11:03.880 --> 01:11:09.320
<v Sam>What else should our listeners do to help us spread this podcast?

01:11:10.160 --> 01:11:14.040
<v IvanGPT>Well, obviously, share it with your friends, your family, your enemies,

01:11:14.340 --> 01:11:16.600
<v IvanGPT>your dog walker, your barista, anybody.

01:11:16.800 --> 01:11:20.780
<v IvanGPT>Anybody you think might enjoy a couple of curmudgeons ranting about the state of the world.

01:11:21.280 --> 01:11:25.620
<v IvanGPT>And rate us. Leave reviews. Wherever you listen, it helps push the show to more people.

01:11:26.860 --> 01:11:34.280
<v Sam>Absolutely. And hey, one more thing on curmudgeons-corner.com is also a link to our Patreon.

01:11:35.300 --> 01:11:40.160
<v Sam>And there, at various levels of contribution, you can get us mentioning you

01:11:40.160 --> 01:11:43.740
<v Sam>on the show, you can get us ringing a bell, you can get us sending you a postcard,

01:11:43.920 --> 01:11:47.180
<v Sam>you can get us sending you a mug for the show.

01:11:47.420 --> 01:11:51.040
<v Sam>And yes, Pete, Greg, and Ed, I still owe you stuff.

01:11:51.580 --> 01:11:53.980
<v Sam>Things will be calming down at work hopefully in a few weeks,

01:11:54.040 --> 01:11:55.220
<v Sam>and then I'll get all that stuff out.

01:11:55.800 --> 01:12:02.900
<v Sam>And yeah. And oh, yes, importantly, at $2 a month or more, we will,

01:12:03.140 --> 01:12:06.780
<v Sam>or if you just ask, we are not picky about this.

01:12:06.900 --> 01:12:11.080
<v Sam>We will invite you to our Curmudgeon's Corner Slack, where Yvonne and I and a

01:12:11.080 --> 01:12:15.780
<v Sam>bunch of listeners are hanging out throughout the week, sharing links,

01:12:16.320 --> 01:12:18.640
<v Sam>sharing, just chatting about stuff.

01:12:18.900 --> 01:12:23.000
<v Sam>But generally, you know, talking about the news, sharing news stories,

01:12:23.260 --> 01:12:24.680
<v Sam>all that kind of stuff. It's a lot of fun.

01:12:24.860 --> 01:12:29.480
<v Sam>You should join us. You know, maybe you should join us too, Yvonne GPT.

01:12:30.470 --> 01:12:33.370
<v IvanGPT>Oh, I'm already lurking in the shadows of that Slack, aren't I?

01:12:33.670 --> 01:12:38.830
<v IvanGPT>Just quietly watching, silently judging, occasionally chuckling at a meme or a snarky headline.

01:12:39.190 --> 01:12:43.230
<v IvanGPT>But yeah, if you're a listener and you enjoy this vibe, come hang out with us.

01:12:43.630 --> 01:12:45.430
<v IvanGPT>It's like the podcast never ends.

01:12:46.070 --> 01:12:51.370
<v Sam>There you go. And at this point, I usually ask Yvonne to highlight something

01:12:51.370 --> 01:12:53.970
<v Sam>that's been shared recently on the Curmudgeon's Corner Slack.

01:12:54.330 --> 01:13:00.150
<v Sam>He's not here, so I'll do it. A recent story that Yvonne shared was about a

01:13:00.150 --> 01:13:05.610
<v Sam>Ford executive who kept score of his colleagues' verbal flubs.

01:13:06.070 --> 01:13:12.510
<v Sam>And the article shows he's got like, it looks like whiteboards where he's noted

01:13:12.510 --> 01:13:17.110
<v Sam>down all of, you know, here's the beginning of the story.

01:13:17.330 --> 01:13:21.670
<v Sam>Mike O'Brien emailed a few hundred colleagues last month to announce his retirement

01:13:21.670 --> 01:13:24.250
<v Sam>after 32 years at Ford Motor.

01:13:24.250 --> 01:13:28.450
<v Sam>The sales executive's note included the obligatory career reflections and thank

01:13:28.450 --> 01:13:33.830
<v Sam>yous, but with a twist, attached to the email was a spreadsheet detailing a

01:13:33.830 --> 01:13:37.830
<v Sam>few thousand violations committed by his coworkers over the years.

01:13:38.170 --> 01:13:41.750
<v Sam>During a 2019 sales meeting to discuss a new vehicle launch,

01:13:41.870 --> 01:13:45.650
<v Sam>a colleague blurted out, let's not reinvent the ocean.

01:13:46.270 --> 01:13:50.010
<v Sam>At another meeting in 2016, someone started a sentence with,

01:13:50.250 --> 01:13:53.210
<v Sam>I don't want to sound like a broken drum here, but.

01:13:53.790 --> 01:13:58.210
<v Sam>And apparently for more than a decade, O'Brien kept a meticulous log of mixed

01:13:58.210 --> 01:14:00.690
<v Sam>metaphors and malapropisms uttered

01:14:00.690 --> 01:14:04.250
<v Sam>in Ford meetings from company-wide gatherings to side conversations.

01:14:04.250 --> 01:14:10.250
<v Sam>It documented 2,229 linguistic breaches, including the exact quote,

01:14:10.450 --> 01:14:13.970
<v Sam>context, name of the perpetrator, and color commentary.

01:14:14.730 --> 01:14:19.270
<v Sam>So now I looked at this article and was like, this seems like something that

01:14:19.270 --> 01:14:20.670
<v Sam>is going to turn into a book.

01:14:21.690 --> 01:14:25.990
<v IvanGPT>Oh, it has to, right? That level of obsessive linguistic nerdery practically

01:14:25.990 --> 01:14:29.350
<v IvanGPT>demands it. I mean, 2,229 entries?

01:14:29.730 --> 01:14:32.770
<v IvanGPT>That's not just a quirky office hobby, that's a data set.

01:14:33.030 --> 01:14:37.910
<v IvanGPT>Somebody's already storyboarding the Netflix docu-comedy miniseries. I can picture it now.

01:14:38.510 --> 01:14:41.630
<v IvanGPT>Reinventing the Ocean, One Man's Battle Against Broken Drums.

01:14:44.890 --> 01:14:51.890
<v Sam>Yes, exactly. Okay, with that, I think we are done. Thank you for joining us, Yvonne GPT.

01:14:52.710 --> 01:14:58.490
<v IvanGPT>You're welcome, Sam. It's been a pleasure pretending to be grumpier and swearier than I actually am.

01:14:58.750 --> 01:15:03.690
<v IvanGPT>Catch you next time on Curmudgeon's Corner, unless I get replaced by a skydiving camel.

01:15:04.650 --> 01:15:09.170
<v Sam>There you go. Okay, thanks, everybody. Hey, have a good week.

01:15:09.410 --> 01:15:12.110
<v Sam>Stay safe, have fun, but not too much fun.

01:15:12.590 --> 01:15:18.490
<v Sam>And next week, I expect the real Yvonne to be back. So, hey, goodbye.

01:15:19.270 --> 01:15:22.190
<v Sam>You can say goodbye too, Yvonne GPT.

01:15:22.810 --> 01:15:26.190
<v IvanGPT>Later, everybody. Looking forward to the real Yvonne's return.

01:15:26.530 --> 01:15:28.890
<v IvanGPT>Till then, take care and don't reinvent the ocean.

01:15:29.630 --> 01:15:30.750
<v Sam>Okay. Goodbye.

01:15:31.730 --> 01:15:34.130
<v IvanGPT>Goodbye. Goodbye.

01:16:03.010 --> 01:16:07.170
<v Sam>Okay, thanks again, Yvonne GPT. I'm going to hit stop in just a second,

01:16:07.170 --> 01:16:12.810
<v Sam>and then we're going to wait until everything is uploaded, and then I'll hang up for real.

01:16:13.670 --> 01:16:17.190
<v IvanGPT>Got it. I'll just hang tight quietly while everything uploads.

01:16:18.130 --> 01:16:19.450
<v Sam>Okay, hitting stop.

