Episode 24 – Aspirational Decaf

Description
Scott couldn’t handle another long-distance phone call with Peter, so John Chidgey of The Engineered Network is here to give a more balanced view of AI, ML, ChatGPT, and other societal-techno issues.
Transcript

00:00:00.000 —> 00:00:02.000
Friends with Brews.

00:00:02.000 —> 00:00:06.080
Well, I’m awake. My ears, I’m awake. And hello. How’s it going, Scott?

00:00:06.080 —> 00:00:16.320
This is exciting because although we’ve had Adam Bell on this podcast as a guest several times, we’ve never had a guest plus not had Peter.

00:00:16.320 —> 00:00:24.600
So Peter’s now hearing me say I’m very excited that we’re not having Peter, but we’re not having Peter. We’re having John Chidgey.

00:00:24.600 —> 00:00:25.600
Well, thanks for having me.

00:00:26.200 —> 00:00:31.360
And you, you’re a real coffee aficionado. You’re not drinking that decaf stuff that Peter always drinks.

00:00:31.360 —> 00:00:38.220
Ew, what? No. I mean I have, okay, I have aspirational decaf that I imagine I may drink at some point.

00:00:38.220 —> 00:00:40.900
It’s just sitting there waiting for me to drink it. And it’s been sitting there for a while.

00:00:40.900 —> 00:00:43.480
Which means it’s probably not good to drink it at this point.

00:00:43.480 —> 00:00:46.680
So, I think like I say, it’s aspirational decaf. It’s there.

00:00:46.680 —> 00:00:48.380
I just, yeah.

00:00:48.380 —> 00:00:54.640
The decaf starts off inferior and then of course, like any coffee, the longer it sits, the more inferior it gets.

00:00:54.640 —> 00:00:59.860
Yeah, exactly. So I’m probably scared to touch it. It’s been sitting there for a couple of years now. But that’s okay.

00:00:59.860 —> 00:01:05.080
It’s aspirational. And speaking of that, what are you drinking this morning? Or, uh, this evening? Sorry. I

00:01:05.080 —> 00:01:08.740
am drinking a local coffee roaster’s.

00:01:08.740 —> 00:01:14.640
They’re called Back Porch Coffee, and they have a blend called the Back Porch Blend.

00:01:14.640 —> 00:01:17.360
It is quite good. It’s got

00:01:17.360 —> 00:01:23.080
dark chocolate, peanut butter, and orange citrus in it, and it’s really good. We can talk about

00:01:23.820 —> 00:01:31.160
methodologies here if you want to. I use a Kalita Wave and I do a pour over, which I like because of the process of

00:01:31.160 —> 00:01:37.680
it. I don’t know, it’s just something I enjoy doing. It is harder to get the same smoothness as the

00:01:37.680 —> 00:01:40.840
What’s the plunger thing that you use?

00:01:40.840 —> 00:01:47.960
AeroPress. Yeah, the AeroPress. The AeroPress is number one for getting it smooth, with no acidity and no bitterness.

00:01:47.960 —> 00:01:52.600
But I also have to use double the grounds to get the strength of the coffee

00:01:52.600 —> 00:01:59.240
I want with the AeroPress. So I use a Kalita Wave, but with this one it’s pretty easy to

00:01:59.240 —> 00:02:03.960
make it nice and smooth and not at all bitter. So I really like this blend, this roast.

00:02:03.960 —> 00:02:04.840
Nice.

00:02:04.840 —> 00:02:05.400
What about you?

00:02:05.400 —> 00:02:09.560
Well I’m stepping out of my comfort zone and mainly because I ran out of coffee.

00:02:09.560 —> 00:02:15.960
So normally I’ll get half a kilo or, you know, a one-kilo bag of beans. It’s good for like three weeks,

00:02:15.960 —> 00:02:19.640
you know, for me having a couple of cups of coffee in the morning.

00:02:19.640 —> 00:02:26.200
And normally I have my favorite Campos Superior blend done as an espresso. But when I do that, I’ll

00:02:26.200 —> 00:02:31.720
froth up some milk and make myself a latte out of it. But if I have more time, I will either use an

00:02:31.720 —> 00:02:38.600
AeroPress or I’ll use my Hario V60 pour over. So it depends on the sort of mood I’m in and how much

00:02:38.600 —> 00:02:43.800
time I have. As odd as it sounds and I guess maybe it’s not odd because the whole point of the

00:02:43.800 —> 00:02:48.520
espresso machine originally was to make coffee more quickly so it is in fact quicker to make a

00:02:48.520 —> 00:02:54.280
coffee using the espresso machine. But in any case, depends on the mood I’m in. And so anyway,

00:02:54.280 —> 00:02:58.520
that would have been all well and good, but I’d run out of the Campos Superior blend. So when I was

00:02:58.520 —> 00:03:02.120
out in the shops yesterday, I grabbed something randomly and I hate to admit it, I was drawn in

00:03:02.120 —> 00:03:07.240
by the packaging. This particular one I’ve never tried before. It’s called The Darkness. And

00:03:07.240 —> 00:03:13.720
yeah, it’s rich and chocolatey apparently. I can kind of taste the richness; the chocolatey bit

00:03:13.720 —> 00:03:19.320
is a stretch. And it’s by a company called DC Roasters. So they’re an Australian, you know,

00:03:19.320 —> 00:03:23.880
roaster and they’re based in Victoria but they ship their stuff all around Australia. And this

00:03:23.880 —> 00:03:29.720
particular one’s a mixture of Colombian, two parts of Brazil, Armenia Brazil, Labarida,

00:03:29.720 —> 00:03:35.160
Labarida? I don’t know how to pronounce that. And South Silvestri. So they’ve got an interesting

00:03:35.160 —> 00:03:39.400
blend in here and some of it’s quite dark roasted and you can taste that bitterness.

00:03:39.400 —> 00:03:44.440
It rates itself on an intensity scale of one to ten as an eight out of ten, whatever that even means.

00:03:44.440 —> 00:03:47.680
Would I say it’s intense? I’d say it’s a little bit more bitter.

00:03:47.680 —> 00:03:51.880
And that’s probably because it was done as an espresso shot. It really brought that acidity out.

00:03:51.880 —> 00:03:53.880
But having said that, it’s smooth.

00:03:53.880 —> 00:03:58.880
And it is a bit of a slap in your face kind of a coffee, which is kind of what I need in the morning.

00:03:58.880 —> 00:04:01.880
Not too bad. Not too bad at all.

00:04:01.880 —> 00:04:05.880
That’s interesting. I am looking at the packaging and I have to admit, see,

00:04:05.880 —> 00:04:12.880
here in the US, for roasters and even, honestly, beers that would have a package this crazy,

00:04:12.880 —> 00:04:18.880
I would generally shy away from it because they’re usually going for a super strong effect that I don’t necessarily want.

00:04:18.880 —> 00:04:24.880
So I looked at this and I immediately got scared. And not just because of the skulls on the package, but…

00:04:24.880 —> 00:04:31.880
It does draw your eye, doesn’t it? Because there you are walking down the aisle and you can get something like a standard Vittoria coffee,

00:04:31.880 —> 00:04:34.680
kind of like an Albany brown colored packet.

00:04:34.680 —> 00:04:36.760
And then you see this bright green glowing thing

00:04:36.760 —> 00:04:37.600
with skulls on it.

00:04:37.600 —> 00:04:39.200
And you’re like, “Oh yeah, I’ll have a go at that.”

00:04:39.200 —> 00:04:41.280
So I got sucked in by the marketing.

00:04:41.280 —> 00:04:42.820
Sucked in by the marketing.

00:04:42.820 —> 00:04:44.520
But yeah, I don’t know if I’d get it again

00:04:44.520 —> 00:04:46.960
because I, like you, don’t necessarily like

00:04:46.960 —> 00:04:49.080
the super strong as a regular thing.

00:04:49.080 —> 00:04:51.080
I’m like, it’s nice for a change for me,

00:04:51.080 —> 00:04:56.080
but I prefer the Campos because it’s not roasted as dark.

00:04:56.080 —> 00:04:59.480
Yeah, I don’t get the whole full city roast thing.

00:04:59.480 —> 00:05:00.320
Sorry, Marco.

00:05:00.320 —> 00:05:01.360
But yeah.

00:05:01.360 —> 00:05:05.240
Yeah, so when you use your Hario, that’s basically equivalent to using the Kalita Wave.

00:05:05.240 —> 00:05:07.480
I mean, there’s people that like either one.

00:05:07.480 —> 00:05:10.960
I don’t have strong feelings either way because I haven’t used the Hario.

00:05:10.960 —> 00:05:13.040
To me, it could have been one or the other.

00:05:13.040 —> 00:05:14.440
I just happened to go with the Kalita Wave.

00:05:14.440 —> 00:05:17.520
But when do you choose to do one over the other?

00:05:17.520 —> 00:05:23.520
So when I want to have a black coffee, I will always use the Hario

00:05:23.520 —> 00:05:26.880
simply because it doesn’t make it really bitter.

00:05:26.880 —> 00:05:29.480
And if I do an AeroPress, it is still—

00:05:29.480 —> 00:05:32.840
I find AeroPress is still more bitter than a pour over.

00:05:32.840 —> 00:05:34.440
Oh, interesting.

00:05:34.440 —> 00:05:37.240
Yeah, just my taste buds, I don’t know.

00:05:37.240 —> 00:05:38.440
That’s just what I found.

00:05:38.440 —> 00:05:42.240
But the funny thing is when I’m traveling, though, I’ll always take the AeroPress.

00:05:42.240 —> 00:05:45.160
I just find it easier to work with when I’m traveling.

00:05:45.160 —> 00:05:50.560
So, for example, I spent five days in Monto, which is a little country town,

00:05:50.560 —> 00:05:54.840
about five or six hours’ drive from here, north and west of Brisbane.

00:05:54.840 —> 00:05:58.200
And they don’t have, well, they’re a country town, so they’re not open.

00:05:58.200 —> 00:06:01.480
nothing’s open on a Sunday and their coffee shops are barely open on Saturday

00:06:01.480 —> 00:06:04.760
morning. So if you want a coffee, it’s a BYO thing on the weekend.

00:06:04.760 —> 00:06:09.280
So I brought my AeroPress and pre-ground some coffee and it was actually pretty

00:06:09.280 —> 00:06:11.600
damn good, to be honest. So, but yeah, I’ll use the Hario if

00:06:11.600 —> 00:06:16.080
I want a black coffee, no milk whatsoever. And I do that from time to time. Yeah.

00:06:16.080 —> 00:06:18.480
It’s the sort of thing though, that I’ve got to be in the mood and I gotta,

00:06:18.480 —> 00:06:22.280
I gotta have more time to do it. So generally I’ll do it on a Sunday morning.

00:06:22.280 —> 00:06:23.000
It’s a process.

00:06:23.000 —> 00:06:26.680
Yeah, yeah, exactly. And one of the things I love about coffee is that, you know,

00:06:26.720 —> 00:06:29.780
The way you make it has a massive impact on how it tastes.

00:06:29.780 —> 00:06:35.020
And whilst I do drink lattes quite a bit, it’s not the only way I enjoy coffee.

00:06:35.020 —> 00:06:38.500
And I think that’s wonderful about coffee, to be honest, because with tea, it’s like you just

00:06:38.500 —> 00:06:41.300
dunk a tea bag or put tea leaves in hot water and that’s it.

00:06:41.300 —> 00:06:42.620
It’s like, great.

00:06:42.620 —> 00:06:50.660
That is true, but it’s interesting, like if you get some of the tea leaves, if you get it in

00:06:50.660 —> 00:06:57.640
leaf format, you’ll find that different tea vendors for a specific tea will recommend

00:06:57.640 —> 00:07:02.540
wildly different water temperatures and wildly different times for steeping.

00:07:02.540 —> 00:07:04.220
That’s true.

00:07:04.220 —> 00:07:09.420
It really depends, and to be honest, I’m not super great at brewing tea, so I couldn’t

00:07:09.420 —> 00:07:13.280
tell you what the differences are when I take a tea and do it exactly the way they want

00:07:13.280 —> 00:07:16.300
to versus not exactly the way they want to.

00:07:16.300 —> 00:07:18.640
I really need to drink a lot more tea.

00:07:18.640 —> 00:07:27.600
My problem is, for some reason, although I have patience for doing the pour over coffee method, I don’t know, doing the tea bugs me.

00:07:27.600 —> 00:07:30.160
Tea seems to want to be at a lower temperature.

00:07:30.160 —> 00:07:35.200
And by the time I get it brewed and by the time I get through what’s in the pot, it’s too cool already.

00:07:35.200 —> 00:07:37.080
I don’t know. I got to figure out a better way.

00:07:37.080 —> 00:07:42.320
I don’t want to pay for one of those tea robots just yet, but maybe someday I’ll have to.

00:07:43.600 —> 00:07:48.560
Well, what was transformative for me for both the tea and for the pour over

00:07:48.560 —> 00:07:52.960
was getting a, what is it, Brewista kettle.

00:07:52.960 —> 00:07:57.240
That’s... So you set the temperature and away you go, and it’s got a keep-warm function.

00:07:57.240 —> 00:07:59.760
It’s fantastic. And so my wife takes her... Yeah.

00:07:59.760 —> 00:08:01.760
So, you know, have you got one or something similar?

00:08:01.760 —> 00:08:06.840
Yeah, I’ve got a basically, probably a slightly cheaper version of something that does exactly the same thing.

00:08:06.840 —> 00:08:08.440
Yeah. Yeah. All right.

00:08:08.440 —> 00:08:13.200
Awesome. Yeah. So my wife, for example, she likes her black teas and black teas are generally done at 98

00:08:13.200 —> 00:08:17.920
or 100 depending upon the recommendation, like you say, some of them are, they want them to be hotter

00:08:17.920 —> 00:08:21.280
and you know the extra two degrees can make a difference. Anyway when I’m doing the pour overs

00:08:21.280 —> 00:08:25.200
or the AeroPress, I’ll set it lower than that, and so on and so forth. And the whole

00:08:25.200 —> 00:08:30.640
gooseneck thing, which I originally, you know, sort of scoffed at, just makes it so much easier

00:08:30.640 —> 00:08:36.480
when you’re doing a pour over to control that flow rate. And even with the AeroPress, I found

00:08:36.480 —> 00:08:44.480
because using a stout pourer, the one at the pub I was staying at, it was difficult to get that

00:08:44.480 —> 00:08:48.720
even and right. Because I found that the water flow rushed down the side and sort of like blew

00:08:48.720 —> 00:08:52.880
out straight down to the aeropress filter and then the water just went straight through without

00:08:52.880 —> 00:08:57.120
actually soaking the grounds. And I’m like, this wouldn’t have happened if I had my

00:08:57.120 —> 00:09:01.520
my goose neck. But never mind. Anyway. Yeah, it’s either pouring straight through or it’s

00:09:01.520 —> 00:09:04.200
just churning the grinds like crazy.

00:09:04.200 —> 00:09:05.840
Yeah, exactly.

00:09:05.840 —> 00:09:07.240
And you just… anyway.

00:09:07.240 —> 00:09:08.160
But yeah, you’re right.

00:09:08.160 —> 00:09:12.400
The steeping time and the temperature are two of the variables with tea.

00:09:12.400 —> 00:09:16.440
And I have learnt via my wife more than myself because she’ll tell me,

00:09:16.440 —> 00:09:19.160
”Oh, that was a good cup.” I’m like, “Yep, noted for future reference.”

00:09:19.160 —> 00:09:22.240
So I’ve gotten better at not oversteeping tea,

00:09:22.240 —> 00:09:24.120
which is something, a mistake I used to make.

00:09:24.120 —> 00:09:25.440
Has it been steeping for 20 minutes?

00:09:25.440 —> 00:09:27.040
Eh, it’s long enough.

00:09:27.040 —> 00:09:29.160
No, it’s probably too long.

00:09:29.160 —> 00:09:30.240
Too long.

00:09:30.240 —> 00:09:36.080
Besides, centuries of Japanese tea ceremonies tell me that some people do think that there’s an art to brewing tea.

00:09:36.080 —> 00:09:38.000
So, totally.

00:09:38.000 —> 00:09:40.400
I can’t argue with them. They’re the experts.

00:09:40.400 —> 00:09:46.280
All right. Hey, John, since the last time I talked to you in person or yeah, I think so.

00:09:46.280 —> 00:09:50.560
I think that you and I both have new computer equipment.

00:09:50.560 —> 00:09:52.600
I think we both have new Macs. Is that true?

00:09:52.600 —> 00:09:56.720
I think so. Well, you tell me about yours and then I’ll tell you about mine.

00:09:56.720 —> 00:10:06.480
Okay, well, I bought a 2021 MacBook Pro 14 inch with an M1 Pro.

00:10:06.480 —> 00:10:09.440
It’s the base model, except I got the terabyte of storage.

00:10:09.440 —> 00:10:10.040
Nice.

00:10:10.040 —> 00:10:13.600
So it was affordable and I could justify it.

00:10:13.600 —> 00:10:16.440
But I was famously a laptop hater before.

00:10:16.440 —> 00:10:20.560
And all of the things that I hated about laptops are pretty much gone.

00:10:20.560 —> 00:10:25.240
The noise, the heat, the compromised performance, all that stuff’s gone.

00:10:26.080 —> 00:10:31.520
The Apple Silicon made such a remarkable transformation of what it’s like to use a laptop.

00:10:31.520 —> 00:10:38.080
This is the best Mac I’ve ever owned by far, and I’ve owned some desktop Macs. I’ve even owned the

00:10:38.080 —> 00:10:46.320
old Intel cheese grater Mac Pros in the past. But this thing is amazing. It just scythes through

00:10:46.320 —> 00:10:50.720
whatever I throw at it without the fans spinning up at all. It’s pretty remarkable.

00:10:50.720 —> 00:10:54.320
Yeah, they’re very nice machines, and Apple Silicon has changed the game,

00:10:54.320 —> 00:11:02.160
Absolutely. And I love my Mac Pro 2013. I also had a cheese grater before that, a 2009 Quad

00:11:02.160 —> 00:11:09.120
Core Nehalem Mac Pro, which, to be honest, was a beast both in size and in power consumption

00:11:09.120 —> 00:11:15.520
and heat production. I mean, that thing was just a beautiful machine, but you could walk into the

00:11:15.520 —> 00:11:21.600
study and you could feel the five to 10 degrees of temperature rising as you walked into the room.

00:11:21.600 —> 00:11:25.440
I think it was a space heater. Even though it wasn’t trying to be, that’s what it was,

00:11:25.440 —> 00:11:30.560
because those Intels just kick out so much heat. And my 2013 trashcan Mac Pro,

00:11:30.560 —> 00:11:34.240
which I used for one and a half years during the worst of COVID, I picked it up secondhand,

00:11:34.240 —> 00:11:38.160
and it was a beautiful machine. And I was running a whole bunch of virtual machines,

00:11:38.160 —> 00:11:42.960
running Alpine Linux and all sorts of stuff on it. So I did that on top of using it for day-to-day

00:11:42.960 —> 00:11:47.920
work, and it just, it very rarely had any issues. It was a beautiful machine. And then Apple Silicon

00:11:47.920 —> 00:11:54.240
came out and I needed a machine that could drive my three 4k displays that I became slightly addicted

00:11:54.240 —> 00:11:59.360
to during Covid as well because I used to have the laptop screen and one external display then

00:11:59.360 —> 00:12:05.280
that became a 4k display then that became two external 4k displays and then I’m like right

00:12:05.280 —> 00:12:11.440
I really want a more powerful desktop so I ended up getting myself a Mac Studio. So the Mac Studio

00:12:11.440 —> 00:12:16.720
is so much better and quieter. Like, the power consumption I checked on the Mac Pro at idle

00:12:16.720 —> 00:12:26.120
was 137 watts average, whereas the Mac Studio at idle doesn’t even crack 50 watts.

00:12:26.120 —> 00:12:27.720
And it’s just… just…

00:12:27.720 —> 00:12:28.720
Okay.

00:12:28.720 —> 00:12:30.060
It’s next level.

00:12:30.060 —> 00:12:31.140
And it is so quiet.

00:12:31.140 —> 00:12:32.140
The fans on it.

00:12:32.140 —> 00:12:36.040
I mean, you can hear that crow in the background telling us what it thinks.

00:12:36.040 —> 00:12:38.380
But you can’t hear the fans on this thing.

00:12:38.380 —> 00:12:39.380
You just can’t.

00:12:39.380 —> 00:12:44.200
Even though they’re spinning away at 1300 RPM, you can’t hear them.

00:12:44.200 —> 00:12:45.780
And there’s no power supply noise or anything like that.

00:12:45.780 —> 00:12:46.780
It’s just beautifully quiet.

00:12:46.780 —> 00:12:49.140

Yeah, that would have definitely been the way I was going to

00:12:49.140 —> 00:12:53.100
go if I had gone with a desktop Mac again.

00:12:53.100 —> 00:12:58.260
I came to the conclusion that I was done with the iPad as a working device experiment that

00:12:58.260 —> 00:13:01.720
I had been conducting for a couple years for many reasons, and I don’t want to get into

00:13:01.720 —> 00:13:02.720
it here.

00:13:02.720 —> 00:13:04.540
That’s not a topic I want to dive into today.

00:13:04.540 —> 00:13:10.060
But the bottom line is I decided I’m tired of fighting with the iPad Pro’s operating

00:13:10.060 —> 00:13:14.060
system to do the things that I need to do and always just barely falling short.

00:13:14.060 —> 00:13:18.980
I mean, it excelled at some things, but the funny thing is that, okay, I said I wasn’t

00:13:18.980 —> 00:13:23.100
going to talk about it, but the narrative around the iPad is, “Oh, it can’t do this,

00:13:23.100 —> 00:13:27.340
it can’t do that,” but usually when people say that, it can do the things that they think

00:13:27.340 —> 00:13:28.340
it can’t.

00:13:28.340 —> 00:13:33.660
It does fall short in other areas, but it’s usually not what people think it is, and it

00:13:33.660 —> 00:13:37.060
would be more like the Federicos of the world who could tell you where it actually crashes

00:13:37.060 —> 00:13:38.060
and falls short.

00:13:38.060 —> 00:13:41.240
But anyway, I got tired of bumping up against those, and I decided, “Okay, I can afford

00:13:41.240 —> 00:13:46.960
one Mac, so I’m going with a laptop and Clay and Vic had talked to me enough to convince me that

00:13:46.960 —> 00:13:48.640
this is a laptop that I won’t hate.

00:13:48.640 —> 00:13:52.880
It’s not going to feel like the noisy thing that you can’t stand to use because it’s burning a hole

00:13:52.880 —> 00:13:56.960
in your pants or blowing fans or it’s slow because it’s overheating.

00:13:56.960 —> 00:13:59.640
And so that’s why I went with that.

00:13:59.640 —> 00:14:03.440
But the Mac Studio, I think, is an amazing computer.

00:14:03.440 —> 00:14:05.120
It really looks nice.

00:14:05.120 —> 00:14:06.800
And that tempted me at first.

00:14:07.040 —> 00:14:12.880
Yeah, it’s honestly been the fastest, quietest and best desktop Mac I’ve ever had.

00:14:12.880 —> 00:14:18.240
And then the situation at work required me to be going into the office more regularly with

00:14:18.240 —> 00:14:25.120
the tail end of COVID, so I then invested in an M2 MacBook Air. And the only upgrade I did on that

00:14:25.120 —> 00:14:32.640
was I got a 512GB SSD because I just could not handle 256GB. Same with the Mac Studio, I went to

00:14:32.640 —> 00:14:36.840
a one-terabyte SSD as opposed to the 512 stock.

00:14:36.840 —> 00:14:39.600
So, but other than that, they’re both entry-level models.

00:14:39.600 —> 00:14:42.800
Yeah, ‘cause I needed a laptop for work

00:14:42.800 —> 00:14:45.080
’cause the laptops that they give you at work

00:14:45.080 —> 00:14:47.800
from the IT department are just as bad

00:14:47.800 —> 00:14:48.840
as they always were.

00:14:48.840 —> 00:14:51.120
And the thing is I went through the catalogs

00:14:51.120 —> 00:14:55.000
and on the online IT ordering service portal

00:14:55.000 —> 00:14:57.600
and said, I want a high-performance laptop.

00:14:57.600 —> 00:15:00.640
Well, their high-performance laptop runs probably

00:15:00.640 —> 00:15:05.200
a quarter of the speed of my MacBook Air. So yeah, for sure.

00:15:05.200 —> 00:15:09.880
It’s not even a race. It’s not a contest. It’s just a joke. So I’m like, yeah, okay.

00:15:09.880 —> 00:15:13.280
Anyway, so I’ve also got myself a MacBook Air.

00:15:13.280 —> 00:15:15.960
They’re two fantastic machines.

00:15:15.960 —> 00:15:20.200
And now that I’ve got something like the Mac Studio

00:15:20.200 —> 00:15:25.120
running Final Cut Pro and doing editing for the YouTube channel has become

00:15:25.120 —> 00:15:26.640
really, really easy.

00:15:26.640 —> 00:15:29.280
Yeah, that’s cool. That’s awesome.

00:15:29.960 —> 00:15:35.320
Yeah, I have a similar thing with my work laptop except surprisingly, John, I recently,

00:15:35.320 —> 00:15:40.440
well, I don’t remember, a few months ago they upgraded my computer at work and I have a Lenovo

00:15:40.440 —> 00:15:48.600
T14 and I actually really like it. It does bog down occasionally and it does churn through

00:15:48.600 —> 00:15:53.640
battery like crazy. Those are two huge differences between it and my MacBook Pro, but that was the

00:15:53.640 —> 00:15:58.600
computer that convinced me that I would be okay with the 14-inch form factor because I’ve always

00:15:58.600 —> 00:16:03.560
thought I need the biggest laptop possible and I had a 16 inch laptop at work before that.

00:16:03.560 —> 00:16:11.080
I really like the 14 inch form factor and on the MacBook Pro it’s just about perfect for me.

00:16:11.080 —> 00:16:17.480
It’s got a beautiful enough screen with small enough bezels that I can see what I need to see,

00:16:17.480 —> 00:16:21.000
plus when I’m at the desk I have it plugged into a studio display anyway. But

00:16:21.000 —> 00:16:27.880
yeah, it’s that form factor I could not be happier with. I really like it. So I’m glad that I got that

00:16:27.880 —> 00:16:32.640
that computer at work before I decided which Mac to buy because it did convince

00:16:32.640 —> 00:16:33.760
me to go with the 14 inch.

00:16:33.760 —> 00:16:38.440
Yeah, I actually, I did seriously consider the MacBook Pros and I could have

00:16:38.440 —> 00:16:40.640
stretched and got a MacBook Pro if I wanted to.

00:16:40.640 —> 00:16:44.000
But by that time I’d got, I’d prioritized the Mac Studio.

00:16:44.000 —> 00:16:47.680
And so that kind of ate away my budget and I was like, yeah, okay,

00:16:47.680 —> 00:16:51.000
so now I’ve got MacBook Air money as opposed to MacBook Pro money.

00:16:51.000 —> 00:16:54.520
So I sort of, yeah, that sort of made my decision for me.

00:16:54.520 —> 00:17:01.000
was my previous choice from a month and a bit prior. But for me, I guess I looked at it like this.

00:17:01.000 —> 00:17:06.720
If you want to have a laptop/desktop sort of thing that you take everywhere as your machine,

00:17:06.720 —> 00:17:10.840
then you have one desktop-like machine or laptop-like, you know,

00:17:10.840 —> 00:17:17.660
you definitely go for a MacBook Pro. And I also think the 14-inch form factor is better than the larger one.

00:17:17.660 —> 00:17:21.740
I hate big laptops because they’re too… to me, it’s about portability.

00:17:21.740 —> 00:17:26.920
So the bigger the laptop the more cumbersome it is. I used to own 15 inch MacBook Pros way back in the day

00:17:26.920 —> 00:17:30.580
I never owned a 17, but for me the 17 inch was comical.

00:17:30.580 —> 00:17:35.940
Yeah, and I just couldn’t even. And even the 15 inch, after a while, got a bit too much.

00:17:35.940 —> 00:17:41.500
So I’ve been, you know, a 13 inch laptop owner ever since. I never went smaller than that to the 11.

00:17:41.500 —> 00:17:43.740
I never tried an 11 because I felt too cramped.

00:17:43.740 —> 00:17:51.200
But the 14 maintains approximately the dimensions of the 13 and yet the screen is that much bigger. I can absolutely see its appeal

00:17:51.200 —> 00:17:54.000
Yeah, and they’re a lot lighter than they used to be.

00:17:54.000 —> 00:17:55.880
I used to have some of those way back in the day.

00:17:55.880 —> 00:18:00.520
I had a couple 15-inch MacBook Pros also, and those were pretty heavy.

00:18:00.520 —> 00:18:01.720
Those were heavy beasts.

00:18:01.720 —> 00:18:04.920
For sure, definitely.

00:18:04.920 —> 00:18:09.440
Well, if you or any of the listeners out there have been listening to this podcast,

00:18:09.440 —> 00:18:14.880
you’ll know that, I don’t know why, but Peter has fallen in love with ChatGPT.

00:18:14.880 —> 00:18:19.800
I thought the whole AI thing was an interesting topic, and I thought that ChatGPT was…

00:18:19.800 —> 00:18:21.580
It’s like cryptocurrency.

00:18:21.580 —> 00:18:26.100
It’s like that all over again, where you’ve got people that are skeptical and you’ve got

00:18:26.100 —> 00:18:31.140
some people that just can’t see anything but the upsides to it.

00:18:31.140 —> 00:18:36.620
And it’s been a pretty fascinating look at how we really haven’t learned a whole lot

00:18:36.620 —> 00:18:40.940
in terms of our stance on technology.

00:18:40.940 —> 00:18:44.460
It seems like we still have a lot of people that are 100%, “This is only good, this is

00:18:44.460 —> 00:18:49.700
great,” and some people that are 100% skeptical, and then a few people in between.

00:18:49.700 —> 00:18:54.200
And I really enjoyed a podcast I listened to recently.

00:18:54.200 —> 00:18:57.080
There’s a podcast called Tech Won’t Save Us.

00:18:57.080 —> 00:18:58.740
They look at tech from a different angle.

00:18:58.740 —> 00:19:02.700
They look at it through the lens that the people who create these things and think they’re

00:19:02.700 —> 00:19:07.800
transforming the world generally aren’t, or they are, but in a bad way, not the way that

00:19:07.800 —> 00:19:09.560
they claim they are.

00:19:09.560 —> 00:19:15.060
And the woman that was on this, I’ll probably pronounce her name wrong, but Timnit Gebru,

00:19:15.060 —> 00:19:20.540
She’s an AI expert and she was working at Google and she actually got fired from Google

00:19:20.540 —> 00:19:26.180
when she was trying to build up a committee and a group, a working group in Google that

00:19:26.180 —> 00:19:33.820
would make sure that AI was being used responsibly and that all of the inbuilt prejudices and

00:19:33.820 —> 00:19:39.520
all of the inherent social problems that come from AI being designed the way it is designed

00:19:39.520 —> 00:19:41.420
would be addressed and looked at.

00:19:41.420 —> 00:19:45.280
And I guess at some point Google didn’t really want to hear it.

00:19:45.280 —> 00:19:50.160
And so I listened to that and it got me thinking a lot about some of these things.

00:19:50.160 —> 00:19:55.640
And then of course looking at some of the issues that ChatGPT has had, like with CNET

00:19:55.640 —> 00:20:00.520
trying to use it to write articles with, and it turns out, “Hey, gee, they’re full of mistakes

00:20:00.520 —> 00:20:01.520
and errors.

00:20:01.520 —> 00:20:03.300
Who knew?”

00:20:03.300 —> 00:20:09.480
And then artists who are suing AI image generators for using their names as keywords, in other

00:20:09.480 —> 00:20:15.080
words, using their work as inputs to the AI to train their models and so forth without their

00:20:15.080 —> 00:20:21.160
permission. And I just thought, I kind of wanted to get an impression of what your views on this

00:20:21.160 —> 00:20:25.640
whole thing are, you know, what your view is with respect to the strengths, the weaknesses,

00:20:25.640 —> 00:20:28.680
the good things, the bad things. How do you feel about the hype, I guess?

00:20:28.680 —> 00:20:33.880
First of all, I haven’t actually been able to get onto ChatGPT to have a play with it,

00:20:33.880 —> 00:20:37.880
because it’s at capacity. They said, “Oh, well, we’ll let you know when it’s available again.”

00:20:37.880 —> 00:20:40.040
And I’m like, yeah, great. And still waiting.

00:20:40.040 —> 00:20:41.440
And that was weeks ago.

00:20:41.440 —> 00:20:42.080
Yeah.

00:20:42.080 —> 00:20:44.880
Because I was vaguely aware of it.

00:20:44.880 —> 00:20:49.720
I’ve been listening to a bunch of different podcasts and read a bunch of articles about it.

00:20:49.720 —> 00:20:51.240
So this is not firsthand use.

00:20:51.240 —> 00:20:54.680
But to be honest, I think I’ve seen enough of here’s the input.

00:20:54.680 —> 00:20:55.880
This is the output, you know,

00:20:55.880 —> 00:21:01.160
comparatives for me to have some opinion of it, but I just need to state that up front.

00:21:01.160 —> 00:21:06.120
Before I sort of like go into that, do you remember a program many, many years ago?

00:21:06.160 —> 00:21:12.800
perhaps the first so-called AI program, ELIZA, the therapist program. Do you remember that?

00:21:12.800 —> 00:21:13.920
Yeah, I do.

00:21:13.920 —> 00:21:18.800
So ELIZA, the way it worked is it took your questions and it essentially broke them down

00:21:18.800 —> 00:21:24.480
into their constituent components and then rephrased them as a question where it could,

00:21:24.480 —> 00:21:28.960
but there was also a random generator that would ask things like, could you elaborate more on that?

00:21:28.960 —> 00:21:34.880
Or, you know, “how did that make you feel?” where it could not do that. And it was very, very basic.
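
To give a sense of how simple that mechanism is, here is a rough Python sketch of the rephrase-or-deflect loop John is describing. It is not Weizenbaum’s original ELIZA code; the keyword patterns, word reflections, and canned fallback prompts are invented purely for illustration.

    import random
    import re

    # Rough sketch of the ELIZA idea: reflect the user's words back as a
    # question where a simple pattern matches, otherwise deflect with a
    # canned prompt. Patterns and responses are made up for this example.
    REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}
    FALLBACKS = [
        "Could you elaborate on that?",
        "How did that make you feel?",
        "Why do you say that?",
    ]

    def reflect(text):
        # Swap first-person words for second-person ones ("my job" -> "your job").
        return " ".join(REFLECTIONS.get(word, word) for word in text.lower().split())

    def respond(user_input):
        # If the input matches a known pattern, rephrase it as a question.
        match = re.match(r"i feel (.*)", user_input, re.IGNORECASE)
        if match:
            return "Why do you feel " + reflect(match.group(1)) + "?"
        match = re.match(r"i am (.*)", user_input, re.IGNORECASE)
        if match:
            return "How long have you been " + reflect(match.group(1)) + "?"
        # Otherwise, deflect with a random canned prompt.
        return random.choice(FALLBACKS)

    if __name__ == "__main__":
        print(respond("I am scared to touch the decaf"))
        print(respond("It has been sitting there for years"))

Once you see that the trick is just string substitution plus a few stock questions, the veil is lifted, which is exactly the comparison being drawn to ChatGPT below.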

00:21:34.880 —> 00:21:40.000
the code behind it was really not that complicated, but it created the illusion of interactivity,

00:21:40.000 —> 00:21:43.360
like, oh hey, I’m having a conversation with a computer, this is like, this is artificial

00:21:43.360 —> 00:21:48.080
intelligence, it’s going to take over the world, and once you learn how it works, it’s like the

00:21:48.080 —> 00:21:53.920
veil is lifted and you’re like, yeah okay, so this is fun, but it’s not actually useful,

00:21:53.920 —> 00:21:58.880
like it really, really is not useful, but it’s fun and it’s cool, but it’s really not useful, and so

00:22:00.720 —> 00:22:07.920
My take on ChatGPT is that it’s ELIZA, but with machine learning.

00:22:07.920 —> 00:22:14.720
It’s more or less the same kind of, “Hey, this is really cool. Isn’t this amazing? Yeah. Is it useful?

00:22:14.720 —> 00:22:22.160
No. No, not really.” And the problem is that the data that it gives you is only good as the data

00:22:22.160 —> 00:22:26.560
that you feed into it, like any machine learning model. Machine learning is based on just pattern

00:22:26.560 —> 00:22:32.560
recognition, repetition, identification of information within patterns. And so ultimately,

00:22:32.560 —> 00:22:38.800
if you feed it garbage, you will get garbage. And the thing is that I, because having looked

00:22:38.800 —> 00:22:42.800
at what’s happened to the internet, you know, my entire life, it started out with that promise

00:22:42.800 —> 00:22:47.280
of being, oh, it’s free information, information is free, everyone can get access to all this

00:22:47.280 —> 00:22:52.000
information. It’ll be great. It’ll be like, you can have an encyclopedia and you’ll be able to

00:22:52.000 —> 00:22:56.080
search for anything and you’ll just get the answer online. It’s going to be amazing. And in many ways

00:22:56.080 —> 00:23:01.840
that is true, but the problem is that it’s also, and I mean I know that disinformation is kind of

00:23:01.840 —> 00:23:09.920
like one of those catch terms at the moment, but the opposite of truth or of reality can also be

00:23:09.920 —> 00:23:14.640
posted just as equally alongside, and in some cases weighted as equally, as something that is

00:23:14.640 —> 00:23:20.080
an established fact. If you look at the way that they would research encyclopedia articles in the

00:23:20.080 —> 00:23:25.760
past, they were way way more thorough than anything on the internet generally speaking today.

00:23:25.760 —> 00:23:31.340
Like, you know, Wikipedia is another great example, you know, citation needed kind of BS that goes back,

00:23:31.340 —> 00:23:33.720
you know, that argument’s been going on for a decade or more.

00:23:33.720 —> 00:23:41.480
So I sort of feel like the problem with any machine learning algorithm is if you have a pure source of data to feed it,

00:23:41.480 —> 00:23:47.860
that is, first of all, you’re legally allowed to use it, and second of all, it is a true representative sample of reality,

00:23:47.860 —> 00:23:51.800
whatever that might be, then you will get meaningful useful information out of it.

00:23:52.100 —> 00:23:55.740
But if you think about that, that’s actually not possible.

00:23:55.740 —> 00:24:00.500
If you’re using anything on the internet as your source of data, you are going to have random BS in there,

00:24:00.500 —> 00:24:03.540
and therefore your outputs will occasionally be random BS.

00:24:03.540 —> 00:24:08.180
And I just look at ChatGPT just from that point of view. We’ll talk about the whole image thing in a minute,

00:24:08.180 —> 00:24:12.500
but I guess I just look at that and I think to myself, “Well, sure, this is cool,

00:24:12.500 —> 00:24:18.580
but you can’t trust the answers because it’s probably anywhere from 5 to 99% BS.”

00:24:18.580 —> 00:24:26.500
I think you’re absolutely right. Anything, whether it be a chat AI or whether it be art generation…

00:24:26.500 —> 00:24:33.460
Art generation’s a little different because it can come up with things that aren’t meant to be

00:24:33.460 —> 00:24:39.940
true or false, or they’re not meant to be narratively correct. They just are.

00:24:39.940 —> 00:24:46.580
But at the same time, still with those, people have found, like for example, Asian women who

00:24:46.580 —> 00:24:52.500
have used AI image generators have found that those AI image generators want to undress them

00:24:52.500 —> 00:24:57.060
or put them in skimpy clothing more often than they do with other people.

00:24:57.060 —> 00:25:05.860
And so they’re subject to the Asian woman fetish that exists on the internet, and that goes into

00:25:05.860 —> 00:25:11.220
those AI image generators, and that’s what they output. Yeah, I remember, John, I’m sure we’re both

00:25:11.220 —> 00:25:15.860
old enough to remember when we first started on this tech roller coaster, when personal

00:25:15.860 —> 00:25:19.380
computers first started becoming a thing, and then it became the internet.

00:25:19.380 —> 00:25:20.980
I felt really optimistic about it all.

00:25:20.980 —> 00:25:23.580
It felt like it could only be a good thing.

00:25:23.580 —> 00:25:27.220
Of course, now, since then, there’s been many times where I’ve wondered if the internet

00:25:27.220 —> 00:25:28.420
was just a giant mistake.

00:25:28.420 —> 00:25:31.940
Yeah, I can’t get along without it because there’s so many things I researched and so

00:25:31.940 —> 00:25:36.060
many things I learned that would be so much more difficult to do without the internet.

00:25:36.060 —> 00:25:42.460
But looking at the state of humanity and how some of our social problems are undoubtedly

00:25:42.460 —> 00:25:48.220
caused by everybody having access to everybody else’s thoughts, and some people’s brains aren’t

00:25:48.220 —> 00:25:52.300
capable of handling that because they’ll believe any terrible sounding thing that they hear,

00:25:52.300 —> 00:25:57.980
it does make you wonder. And then to dump all of that into AIs that don’t know what reality is and

00:25:57.980 —> 00:26:03.260
have no way to… they’re even worse than the people falling for conspiracy theories on Facebook in

00:26:03.260 —> 00:26:07.820
terms of falsehood rejection, in my opinion, because they just have no clue to begin with.

00:26:07.820 —> 00:26:12.380
Yeah. The thing that I find interesting about machine learning is that people equate machine

00:26:12.380 —> 00:26:14.300
learning with human intelligence.

00:26:14.300 —> 00:26:19.100
And we can recognize things like you say, you know, like to go the whole

00:26:19.100 —> 00:26:24.220
Silicon Valley on you, like, you know, hot dog, not hot dog kind of thing.

00:26:24.220 —> 00:26:26.380
So I can tell the difference between something that’s a hot dog and it’s not

00:26:26.380 —> 00:26:29.300
a hot dog, but, you know, so can a machine learning algorithm.

00:26:29.300 —> 00:26:33.300
But in the future, let’s say that this hot dog now comes

00:26:33.300 —> 00:26:35.820
in three different subtypes and so on and so forth.

00:26:35.820 —> 00:26:37.500
And the original thing we used to call

00:26:37.500 —> 00:26:40.620
hot dog is no longer called a hot dog, you could teach a human, you know, hey,

00:26:40.820 —> 00:26:45.220
this thing you used to call a hot dog, it’s now going to be called something else. And you know,

00:26:45.220 —> 00:26:49.460
like a hoagie, I don’t know, I’m picking a random American slang name for something and I’m probably

00:26:49.460 —> 00:26:53.620
getting that wrong, doesn’t matter. But the point is that a human being can relearn that and say,

00:26:53.620 —> 00:26:57.380
you know what, I’m not going to call that anymore, maybe I’ll slip into bad habits every now and then

00:26:57.380 —> 00:27:02.500
and call it what I used to call it, but I can sort of train myself, I can learn a different way,

00:27:02.500 —> 00:27:08.260
or different thing, a different way of identifying something. And you know, machine learning

00:27:08.260 —> 00:27:13.220
algorithms, how you teach them to unlearn something that they’ve been trained on

00:27:13.220 —> 00:27:19.380
is extremely difficult. And at this point in time, the simplest way to do it is to wipe it and start

00:27:19.380 —> 00:27:23.220
over with a different data set that doesn’t include the thing that you don’t want in it

00:27:23.220 —> 00:27:27.300
anymore. And that’s not the way a human works, because a human will be, well, what are we going

00:27:27.300 —> 00:27:30.580
to do? Wipe your memory and start you out as a toddler? And we’re going to train you again?

00:27:30.580 —> 00:27:35.700
This is no longer a hot dog. I mean, that’s, humans don’t work like that. We don’t need to work like

00:27:35.700 —> 00:27:42.020
that. We can change our thinking. We can, you know, it’s, I don’t know, in many

00:27:42.020 —> 00:27:45.540
respects, I’ve just, I’ve been listening to the whole machine learning is the

00:27:45.540 —> 00:27:47.700
future and everything and there’s a whole bunch of stuff that it can do.

00:27:47.700 —> 00:27:53.420
That’s for sure and it is very cool but it will never be on its own in isolation

00:27:53.420 —> 00:27:58.900
as a technology. It will never be adaptive. It will never be at the same level that

00:27:58.900 —> 00:28:03.280
a human can be, simply because all it is is pattern recognition, and you’ve got to

00:28:03.280 —> 00:28:05.560
see that for what it is and all of its faults.

00:28:05.560 —> 00:28:11.840
So it’s not the cure-all for all of our ills, that’s for sure.

00:28:11.840 —> 00:28:16.880
Right. And sometimes it is really hard to change a human’s mind, but humans can change

00:28:16.880 —> 00:28:22.320
their minds. And we see over time, like, you know, just think about yourself and I

00:28:22.320 —> 00:28:26.840
can think about myself and over time, how much my thought processes about specific

00:28:26.840 —> 00:28:32.080
topics have changed as I’ve just realized, oh, I had a very naive, shallow view of this

00:28:32.080 —> 00:28:35.080
topic and I didn’t know what it was like from this other person’s point of view.

00:28:35.080 —> 00:28:40.400
And I think, you know, everybody, you try to keep learning, you try to keep getting

00:28:40.400 —> 00:28:44.120
better, but I don’t think there’s any way to give machine learning that

00:28:44.120 —> 00:28:49.920
understanding of, like, humans can have a collective understanding of, “Oh, this is

00:28:49.920 —> 00:28:53.280
the way we should be headed if we want a better society.” Machine learning will

00:28:53.280 —> 00:28:57.600
take all those inputs and I don’t know how it makes a judgment on, “Well, I need

00:28:57.600 —> 00:29:00.940
to be careful about this because these people clearly have some sort of motive

00:29:00.940 —> 00:29:04.660
for pushing this theory. Like we know that anytime somebody’s trying to sell you something,

00:29:04.660 —> 00:29:08.440
there’s a reason. It might be good, it might be bad, but there is a reason when people

00:29:08.440 —> 00:29:11.920
are trying to sell you stuff. And it doesn’t matter if it’s a product or a thought process

00:29:11.920 —> 00:29:16.060
or a belief system or whatever it is, people are trying to convince you to believe something

00:29:16.060 —> 00:29:22.100
for a reason. I don’t know how you teach AI or computers to understand that they have

00:29:22.100 —> 00:29:27.980
to figure that out and weed out the preconceptions and the prejudices behind the information

00:29:27.980 —> 00:29:30.860
they’re being fed. I don’t see how that’s possible.

00:29:30.860 —> 00:29:36.140
Exactly. I think that’s a difficult problem, and machine unlearning is actually a

00:29:36.140 —> 00:29:41.740
field of study in machine learning spheres and they claim that they are making progress.

00:29:41.740 —> 00:29:45.820
But I couldn’t find anything recent on it. I did a little bit of research before we started

00:29:45.820 —> 00:29:51.260
recording and I can’t see anything recently on it. It’s difficult. But it’s not just unlearning

00:29:51.260 —> 00:29:55.820
when it learns the wrong thing or takes in data that it shouldn’t, that may not be true.

00:29:55.820 —> 00:30:01.100
it’s also things like hallucination. And I originally, when I heard the term AI hallucination,

00:30:01.100 —> 00:30:05.740
I’m like, that’s okay, well they’re giving some magic mushrooms to the AI, interesting.

00:30:05.740 —> 00:30:10.460
But it’s actually fooling the machine learning pattern matching to think that something is what

00:30:10.460 —> 00:30:16.620
it isn’t. And so like, for example, arranging certain objects and images on a sign or putting

00:30:16.620 —> 00:30:21.500
stickers on a sign to make it look like it’s a stop sign when it isn’t, you know, things like that.

00:30:21.500 —> 00:30:25.740
Whereas a human might look at that sign and say, well, that’s clearly not a stop sign.

00:30:25.740 —> 00:30:28.380
And someone’s just put a sticker on it because we can separate out,

00:30:28.380 —> 00:30:34.300
like out of the broad field of our view, we know exactly what a stop sign looks like in the context

00:30:34.300 —> 00:30:38.620
of I’m driving a car, it’s on the side of the road, I’m looking at this sign, is that a stop sign?

00:30:38.620 —> 00:30:43.260
Whereas a machine learning algorithm looks at the entirety and says, well, this is the pattern that

00:30:43.260 —> 00:30:46.860
I’ve recognized previously as a stop sign. It doesn’t matter if I’m on a road or anywhere.

00:30:46.860 —> 00:30:51.100
It has no way of having additional depth and context to determine, oh, hang on,

00:30:51.100 —> 00:30:55.740
on, that’s a sticker on top of a sign, it’s not actually what the sign actually says.

00:30:55.740 —> 00:31:02.520
It has no way of, it can’t do that. So tricking AI is, that’s the term they’re calling

00:31:02.520 —> 00:31:09.300
AI hallucination. But the whole thing is, it’s problematic. And I guess the other thing

00:31:09.300 —> 00:31:13.860
is, I just want to quickly address the whole, you know, artists thing. The funny thing I

00:31:13.860 —> 00:31:19.180
was thinking about is that like artists, I’m okay, I’m not an artist, not really. I’ve

00:31:19.180 —> 00:31:23.780
done a few logos that were mostly terrible and I can’t paint to save my life.

00:31:23.780 —> 00:31:28.420
It basically looks like a big grey smudgy mess when I’m done.

00:31:28.420 —> 00:31:30.940
And that in itself is not art as far as I’m concerned.

00:31:30.940 —> 00:31:32.180
So I’m not an artist.

00:31:32.180 —> 00:31:35.960
Well, you are a photographer and that’s an artistic workflow.

00:31:35.960 —> 00:31:41.880
It’s not that you’re using your fingers or your hands to move in certain ways to generate

00:31:41.880 —> 00:31:44.800
your art, but photography is very much an art form.

00:31:44.800 —> 00:31:46.140
So you get the mindset.

00:31:46.140 —> 00:31:47.140
Oh, okay.

00:31:47.140 —> 00:31:53.380
Okay, yeah, I accept that. That’s fair. And even for years, Scott, I didn’t see myself as a photographer.

00:31:53.380 —> 00:31:56.660
But the more I’ve been doing it and the more I’ve invested in it, the more I’ve learned about it,

00:31:56.660 —> 00:32:01.860
the more I realize that I’m starting to think more like an artist when I’m setting up for a photo.

00:32:01.860 —> 00:32:05.220
I’m still a far cry from, you know, some of the professionals, that’s for sure.

00:32:05.220 —> 00:32:07.940
And I’ll probably never get to that point because I have other things in my life.

00:32:07.940 —> 00:32:11.220
But okay, fine. I accept that. Fair comment.

00:32:11.220 —> 00:32:16.100
The artist mindset is, well, the way art works is we start by mimicry.

00:32:16.100 —> 00:32:18.980
and we say, well, I really respect this kind of art.

00:32:18.980 —> 00:32:20.940
I want to make art just like this.

00:32:20.940 —> 00:32:24.140
And then, you know, as time goes on and we, you know, we learn more

00:32:24.140 —> 00:32:27.100
and we absorb more and we look around more, we try a few things.

00:32:27.100 —> 00:32:29.900
And it’s like, well, I really like doing it this way.

00:32:29.900 —> 00:32:31.780
This one thing I particularly like about it.

00:32:31.780 —> 00:32:32.900
So I’m going to go off on a tangent.

00:32:32.900 —> 00:32:34.060
I’m going to do this my way.

00:32:34.060 —> 00:32:37.140
And then, you know, they talk about having the artist’s own voice.

00:32:37.140 —> 00:32:39.660
You know, they find their own voice, their own style, their own

00:32:39.660 —> 00:32:42.460
way of conveying what they want to in their art.

00:32:42.460 —> 00:32:45.100
And it’s like, that then becomes unique

00:32:45.180 —> 00:32:46.940
because that is then unique to that individual,

00:32:46.940 —> 00:32:48.540
or rather, you hope that it is.

00:32:48.540 —> 00:32:52.580
So you apply that context now to an AI system

00:32:52.580 —> 00:32:55.700
that’s trying to auto-generate art for you.

00:32:55.700 —> 00:32:56.940
And you think to yourself,

00:32:56.940 —> 00:32:59.100
well, it’s a similar kind of a process,

00:32:59.100 —> 00:33:02.540
but the only difference is that the AI is not going to say,

00:33:02.540 —> 00:33:04.220
well, there’s one particular thing that I like,

00:33:04.220 —> 00:33:05.980
and I’m just gonna go off on a bit of a tangent here

00:33:05.980 —> 00:33:07.660
and have my own voice.

00:33:07.660 —> 00:33:09.540
It’s not capable of doing that.

00:33:09.540 —> 00:33:11.500
So it’s basically art,

00:33:11.500 —> 00:33:13.820
but it can only ever be a combination

00:33:13.820 —> 00:33:17.860
of the sum of its inputs. So whatever you put into it is all you will ever get out of

00:33:17.860 —> 00:33:23.900
it in a weird mishmash in the output. And that sort of thing is probably okay for a

00:33:23.900 —> 00:33:28.020
lot of people that don’t, and this is going to sound really, I don’t know actually how

00:33:28.020 —> 00:33:33.220
this sounds, but people that really understand and can interpret art, and like I say, that’s

00:33:33.220 —> 00:33:38.420
why it sounds weird, is because to me, appreciating art is in some ways as difficult as creating

00:33:38.420 —> 00:33:44.100
art. Because for example, like I would, before I was a photographer, I’d walk down like in a photo

00:33:44.100 —> 00:33:48.820
gallery and I’d say, well, there’s a bunch of photos there. Yep, that’s a bridge. Yep, that’s

00:33:48.820 —> 00:33:54.020
a house. And that’s a person. Good. Whereas now I’d walk down exactly the same row of photos and

00:33:54.020 —> 00:33:58.580
I’d stop and I’d say, oh my God, look at the way that they’ve matched the light and the angle and

00:33:58.580 —> 00:34:04.820
how did they do that? And I’m like, yeah, I can truly appreciate difficult photos because I know

00:34:04.820 —> 00:34:10.420
how hard it is to take them. So the average person will just say, “Hey, that’s a nice photo of a bridge.”

00:34:10.420 —> 00:34:15.940
So you type in your AI thing, it spits out a logo of a car driving on, you know, on sand or something

00:34:15.940 —> 00:34:20.580
like that, I don’t know, and you’ll look at that and you’ll say, “Well, I’m not an artist, but it’s

00:34:20.580 —> 00:34:26.180
good enough, right?” But the truth is that that’s all that can ever be, and you’ll never get anything

00:34:26.180 —> 00:34:30.900
beyond that. You’ll never get anything that is really artistic in the sense of the way a human

00:34:30.900 —> 00:34:36.660
can create art by finding their own voice. So, I mean, this is probably a very engineering

00:34:36.660 —> 00:34:44.140
take on AI from the point of view of, like, will an AI ever develop its own voice for

00:34:44.140 —> 00:34:48.100
creating art based on everything else that’s put into it? And I genuinely don’t think it

00:34:48.100 —> 00:34:53.340
can. And that comes back to the debate of AI being soulless and the nature of a soul

00:34:53.340 —> 00:34:56.980
and blah blah blah, free will and choice and yadda yadda yadda. But I mean, I don’t know

00:34:56.980 —> 00:34:59.660
if that’s just a rambling mess of thoughts there, probably is.

00:34:59.660 —> 00:35:06.140
No, no, no. I agree with you, and I think that it’s interesting because Jason Snell was talking

00:35:06.140 —> 00:35:09.660
about finding the same thing with respect to writing. He’s going, “Well, you can tell that

00:35:09.660 —> 00:35:14.540
it was just, you know, fed a bunch of, you know, bad internet stories, and it wasn’t given the good

00:35:14.540 —> 00:35:20.220
literature and all that.” But you’re right in that it’s not going to develop its own voice. And the

00:35:20.220 —> 00:35:26.380
one thing I took away from Peter’s super long narration of his ChatGPT experiment, trying to

00:35:26.380 —> 00:35:30.860
to get it to write a story for him was that it would come up with something, he would

00:35:30.860 —> 00:35:35.020
disagree with the premise, he would say “but wait, that can’t be because blah blah blah”

00:35:35.020 —> 00:35:40.100
and it would say, “Oh, you’re right!” It basically was a yes-man. I mean, it has no opinion, it

00:35:40.100 —> 00:35:44.900
has no idea, it has no creative expression, it has nothing but what it’s given and then

00:35:44.900 —> 00:35:52.080
it just goes along with that so… it can generate things, I mean CNET tried to replace

00:35:52.080 —> 00:35:55.740
writers with it and I’m sure other places have too, and I suppose you could have it

00:35:55.740 —> 00:36:00.780
generate certain types of writing, but they’re never going to be a writer that people say…

00:36:00.780 —> 00:36:07.420
If you have a… You can’t replace a Jason Snell or a John Gruber with an AI, for example,

00:36:07.420 —> 00:36:13.500
and that’s just taking tech writing. That’s not even taking creative writing, like sci-fi authors

00:36:13.500 —> 00:36:19.500
or fantasy authors or something like that. Okay, you can teach ChatGPT to write terrible fantasy.

00:36:19.500 —> 00:36:24.140
Those aren’t the books that people want to buy. No. I guess I’m not worried about creative writers

00:36:24.140 —> 00:36:26.100
getting replaced by AI.

00:36:26.100 —> 00:36:28.060
I think one of the things you just pointed out there

00:36:28.060 —> 00:36:29.660
is that people, that you’re right there,

00:36:29.660 —> 00:36:31.660
I did read an article and I didn’t keep the link

00:36:31.660 —> 00:36:33.780
of some places that were saying,

00:36:33.780 —> 00:36:37.380
I’m gonna use some ChatGPT and AI to actually,

00:36:37.380 —> 00:36:38.980
not ChatGPT, but we’re gonna use AI

00:36:38.980 —> 00:36:41.140
to replace writers and so on and so forth.

00:36:41.140 —> 00:36:42.580
And the thought that occurred to me is that,

00:36:42.580 —> 00:36:43.900
but we’ve already got that.

00:36:43.900 —> 00:36:45.740
It’s just not an AI thing necessarily.

00:36:45.740 —> 00:36:47.940
We have algorithms that scrape.

00:36:47.940 —> 00:36:50.100
So for example, let’s say you wanna

00:36:50.100 —> 00:36:53.220
look at a review on a new camera

00:36:53.220 —> 00:36:58.420
and you do a search in Google and the Google search comes up with this result, you go and look at this result

00:36:58.420 —> 00:37:05.540
and it’s like, this camera has a large 24.5 megapixel sensor, it is black in color

00:37:05.540 —> 00:37:10.340
and it has a rubberized hand grip and what you’re doing is you’re basically reading

00:37:10.340 —> 00:37:15.540
a fake review which has essentially been computer generated based on a scraping of the tech specs

00:37:15.540 —> 00:37:20.660
for that camera and I’ve come across these from time to time and I’m like they all read much the same

00:37:20.660 —> 00:37:28.760
And it’s all because it’s generated by a series of algorithms that scrape the features off of a website and create a garbage review page

00:37:28.760 —> 00:37:33.460
And it’s obvious because there’s nothing in there that’s beyond the basic text and specs

00:37:33.460 —> 00:37:39.020
It’ll say like this item is heavy and it’ll say whatever the weight is and you think to yourself

00:37:39.020 —> 00:37:44.540
Would we call that heavy? So I feel like that has already happened and

00:37:44.540 —> 00:37:50.180
It’s already terrible and AI tools aren’t really gonna make it any less terrible

00:37:50.180 —> 00:37:56.860
it’s still gonna suck, and people are just gonna move on to a page by someone who’s actually picked up the camera and actually

00:37:56.860 —> 00:38:01.500
used it. And a lot of people will probably just turn, as they already are, to YouTube

00:38:01.500 —> 00:38:08.060
for those sorts of reviews, because it’ll be like, right, well, you know, try an AI doing this, right? But I guess that’s probably next.

00:38:08.060 —> 00:38:13.640
Yeah. Yeah, I don’t know. Initially my thought was I don’t think writers and artists have anything to worry about.

00:38:13.640 —> 00:38:19.300
Apparently that’s not true. If it was BuzzFeed that we were thinking of, I found an article.

00:38:19.300 —> 00:38:21.660
Oh yeah, right, right, right.

00:38:21.660 —> 00:38:26.420
So I guess certain types of authors may have to worry about that kind of thing, but those

00:38:26.420 —> 00:38:32.080
websites are the types of sites that just want to churn out tons of content per day.

00:38:32.080 —> 00:38:36.500
And let’s be honest, between their headlines and a lot of the content of the articles,

00:38:36.500 —> 00:38:39.700
which going back to what you were talking about, research, research, what’s that?

00:38:39.700 —> 00:38:40.780
Who does that anymore?

00:38:40.780 —> 00:38:46.620
So yeah, those people need jobs and I get that, but that’s the type of thing that AI

00:38:46.620 —> 00:38:50.120
is going to come for, at least initially, unless it improves dramatically.

00:38:50.120 —> 00:38:54.280
Those are the types of jobs that I think that you might

00:38:54.280 —> 00:38:56.120
prospectively lose.

00:38:56.120 —> 00:38:57.620
We’ll see what happens with BuzzFeed.

00:38:57.620 —> 00:38:58.920
We’ll see if anybody.

00:38:58.920 —> 00:39:00.920
I don’t know, man. Do you want to?

00:39:00.920 —> 00:39:01.620
I don’t know.

00:39:01.620 —> 00:39:04.040
I guess we’ll have to see if people like that or not.

00:39:04.040 —> 00:39:07.800
Well, I think that that’s a self-solving problem, honestly, Scott,

00:39:07.800 —> 00:39:09.720
because I don’t read BuzzFeed.

00:39:09.720 —> 00:39:13.220
I can’t remember the last time I actively interacted with it. No, I just…

00:39:13.220 —> 00:39:15.940
But let’s say it wasn’t just them, that they’re just the first.

00:39:15.940 —> 00:39:23.340
Like, let’s say 90% of news sites say, “Let’s use AI to generate summaries of things that

00:39:23.340 —> 00:39:25.780
are going on for us.”

00:39:25.780 —> 00:39:27.800
And let’s say it becomes widespread that way.

00:39:27.800 —> 00:39:33.240
It’s still going to be a certain type of article, but it’s also going to let us find out whether

00:39:33.240 —> 00:39:36.680
or not humans are okay with that, or if they even notice, I guess.

00:39:36.680 —> 00:39:37.680
I don’t know.

00:39:37.680 —> 00:39:41.520
It’s like, do I personally want to read stuff that’s generated by AI?

00:39:41.520 —> 00:39:42.780
No, not really.

00:39:42.780 —> 00:39:44.980
Not for more than a laugh.

00:39:44.980 —> 00:39:48.060
And even that, I don’t want to keep doing it over and over.

00:39:48.060 —> 00:39:50.120
But I don’t know, maybe people will accept this.

00:39:50.120 —> 00:39:56.880
So maybe that type of quick, generate many articles per day stuff with fancy headlines

00:39:56.880 —> 00:39:59.640
that gets clicks, maybe that kind of stuff will go 100% AI.

00:39:59.640 —> 00:40:01.120
I don’t know.

00:40:01.120 —> 00:40:04.760
But I can’t see it being used for anything more substantial than that.

00:40:04.760 —> 00:40:08.800
I guess I feel like a lot of this is going to be a self-solving problem and it’s going

00:40:08.800 —> 00:40:12.880
to implode because if so many people just shift across and say, “I don’t need humans,

00:40:12.880 —> 00:40:18.180
We’re gonna use AI and ML to develop our articles and so on. People will simply get jack of it

00:40:18.180 —> 00:40:20.840
and they will not read it. They will not engage, and

00:40:20.840 —> 00:40:25.800
so we go back to word-of-mouth again. We go back to I don’t trust any of these websites on the internet

00:40:25.800 —> 00:40:26.580
They’re all full of it

00:40:26.580 —> 00:40:30.100
and I can’t get an honest review about anything on here because it’s all fake

00:40:30.100 —> 00:40:33.960
which, to be quite honest, if you look at reviews on Amazon or

00:40:33.960 —> 00:40:38.920
pick your platform, yeah, you got to wonder just how much you can trust them in the first place

00:40:38.920 —> 00:40:44.360
So I still feel like a lot of this has been a gradual erosion. Whether this is the end of it or not,

00:40:44.360 —> 00:40:49.500
I don’t know. Only truly gullible people that are not interested in actually finding out

00:40:49.500 —> 00:40:54.480
reality would fall for it. And I honestly don’t think that’s very many people.

00:40:54.480 —> 00:40:59.160
I think people will just vote with their feet. People are smart enough to know the difference and they will simply say, you know what?

00:40:59.160 —> 00:41:05.280
Everything’s all samey-samey because it’s all written by the same AI system and it’s all BS. I can’t trust it,

00:41:05.280 —> 00:41:09.920
so I’m not gonna read it anymore.” And then, you know, page views will drop, you know,

00:41:09.920 —> 00:41:13.440
advertising revenue will dry up, and everyone will say, “Well, what went wrong with our lives?”

00:41:13.440 —> 00:41:16.400
And it’s like, well, you know, you made a bad choice.

00:41:16.400 —> 00:41:22.640
But the problem is, 30% of America apparently doesn’t care about reality or verification of

00:41:22.640 —> 00:41:28.240
information. I mean, I think you could replace Facebook with AI, and the people that are still

00:41:28.240 —> 00:41:32.880
on Facebook using it all the time and who are reading conspiracy theories, you know,

00:41:32.880 —> 00:41:37.280
they’re using it for that type of thing, yelling angrily about social issues or whatever based on

00:41:37.280 —> 00:41:42.080
some political viewpoint, I don’t think they would notice or care. They already believe these dodgy

00:41:42.080 —> 00:41:47.760
sources that have been disproven, but the problem is, once the lie is out there, it doesn’t matter.

00:41:47.760 —> 00:41:52.800
It doesn’t matter if it’s proven false or not. People have already heard it, they’ve already

00:41:52.800 —> 00:41:57.600
believed it, and I don’t… I think there’s a lot of people that honestly wouldn’t know or tell the

00:41:57.600 —> 00:42:02.800
difference or care. I don’t know. I guess maybe because I’ve seen what’s happened to the United

00:42:02.800 —> 00:42:05.920
States over the past few years, I’m not as optimistic about it.

00:42:05.920 —> 00:42:13.120
Well, when you say 30%, it’s an oddly specific percentage. But I mean, I can’t comment on that.

00:42:13.120 —> 00:42:18.560
And I would simply point out that Facebook is dying. And people have been

00:42:18.560 —> 00:42:24.720
getting together for years with secret sort of, you know, groups and so on and enjoying conspiracy

00:42:24.720 —> 00:42:28.640
theories. And I mean, you know, there’s, there’s a bunch of people in Australia that, you know,

00:42:28.640 —> 00:42:31.240
don’t recognize the government and all that other stuff and

00:42:31.240 —> 00:42:36.880
Yada yada yada. I mean it’s everywhere just that’s just human nature people have a right to believe what they want to believe

00:42:36.880 —> 00:42:42.720
There’s a multiplier effect though, when they have such easily accessed and easily

00:42:42.720 —> 00:42:50.460
you know, registered-for sites and locations that keep actively generating stuff that purposely stokes their emotions.

00:42:50.460 —> 00:42:52.200
I do agree with you that that’s just human

00:42:52.200 —> 00:42:57.840
But I also think that Silicon Valley is giving them a multiplier effect that has made the problem substantially worse

00:42:57.840 —> 00:43:00.200
I won’t disagree with that statement,

00:43:00.200 —> 00:43:06.000
yeah, in that regard. I think ultimately Silicon Valley has a lot to answer for in a lot of things

00:43:06.000 —> 00:43:12.480
regarding social media and the social media experiment and so on. And I look at things like the Fediverse as being a

00:43:12.480 —> 00:43:15.460
Far more balanced way of dealing with the problem

00:43:15.460 —> 00:43:22.260
But having said that, it’s still susceptible to people starting their own things like Truth, or maybe, no, Truth Social, or whatever the heck it

00:43:22.260 —> 00:43:26.680
is. Like I said previously, bring-your-own Truth Social, that kind of thing.

00:43:26.800 —> 00:43:32.500
It’s like, that will still be the case, and that is no different to the way it has always been in the past.

00:43:32.500 —> 00:43:39.460
Since the beginning of human existence, where there was more than one person there was politics and there were people believing something different.

00:43:39.460 —> 00:43:43.060
So I don’t know, but in any case we’re getting a little bit off topic here.

00:43:43.060 —> 00:43:48.700
I do want to just quickly circle back to the whole artists suing AI image generators thing, because

00:43:48.700 —> 00:43:50.880
That’s something that really does annoy me

00:43:50.880 —> 00:43:55.300
when people put content out on the internet, it doesn’t matter what it is, whether it’s the written word

00:43:55.520 —> 00:44:00.240
and so on. If it’s freely available, that’s one thing. But if it’s something that you’ve put out there,

00:44:00.240 —> 00:44:04.400
like the one I was thinking of is Shutterstock, for example. They’ll put stock photos out there,

00:44:04.400 —> 00:44:08.960
but they’ll have the Shutterstock logo on it. It’s like, well, you want the version of that,

00:44:08.960 —> 00:44:12.880
the raw photo of that without the watermark on it, you’re going to pay for that. And it’s like,

00:44:12.880 —> 00:44:17.680
that’s part of their business model, low res watermark stuff. But if that gets fed into an AI,

00:44:17.680 —> 00:44:22.800
then that is not fair use. And that should be prosecutable. It’s that simple.

00:44:22.800 —> 00:44:30.360
If you’re an artist and you put stuff out there, you know, it’s funny because some of the photographers that are on photography forums will say, I never watermark my images.

00:44:30.360 —> 00:44:36.280
So I started watermarking my photos when I took them and put them on TechDistortion when I started my photography thing.

00:44:36.280 —> 00:44:43.760
And I got some quite scathing feedback at the time from a few people looking at it saying, you’re ruining your image by putting your watermark on it.

00:44:43.760 —> 00:44:46.120
And I’m like, okay.

00:44:46.120 —> 00:44:49.800
And they’re like, oh, we only ever publish our images without a watermark on them.

00:44:49.800 —> 00:44:54.040
And I’m like, yeah, but does that not imply that then anyone can take and use my

00:44:54.040 —> 00:44:56.960
picture? I don’t want them to take and use my picture without permission.

00:44:56.960 —> 00:45:00.120
And so anyway, I sort of stopped doing it.

00:45:00.120 —> 00:45:05.080
But now this whole AI sucking in millions of photos and doing the, you know,

00:45:05.080 —> 00:45:08.920
creating images from them makes me reconsider that policy.

00:45:08.920 —> 00:45:15.000
Yeah. And the good news is, well, apparently the AI isn’t smart enough to try to get rid of the watermark either,

00:45:15.000 —> 00:45:19.000
because that’s how I think they found out about some of the stock images that

00:45:19.000 —> 00:45:23.200
were getting sucked into the AI was because it was reproducing the watermark.

00:45:23.200 —> 00:45:24.880
Yeah, exactly.

00:45:24.880 —> 00:45:30.520
But yeah, so I feel like it is absolutely fair and reasonable for anyone who has

00:45:30.520 —> 00:45:35.800
any content that is paid-for content, you know, and the usage terms

00:45:35.800 —> 00:45:41.680
of it should have a new entry: may be used for AI ingestion, yada, yada, yada, or,

00:45:41.680 —> 00:45:44.640
you know, training tool sets and what have you. That should now become

00:45:44.640 —> 00:45:46.040
a new license condition.

00:45:46.040 —> 00:45:48.920
And it already is, I imagine, on some license agreements.

00:45:48.920 —> 00:45:51.000
And unless you explicitly opt into that,

00:45:51.000 —> 00:45:53.340
then you can sue any company.

00:45:53.340 —> 00:45:55.620
You should be able to, anyhow,

00:45:55.620 —> 00:45:57.520
to sue any company that’s using that information

00:45:57.520 —> 00:46:00.320
against your usage agreement.

00:46:00.320 —> 00:46:04.560
And honestly, I cannot wait to see some of these tool sets

00:46:04.560 —> 00:46:06.320
and tools be destroyed by it,

00:46:06.320 —> 00:46:08.280
because frankly, they just sucked in

00:46:08.280 —> 00:46:10.240
everything from the internet thinking that was fair use,

00:46:10.240 —> 00:46:11.500
and it just isn’t.

00:46:11.500 —> 00:46:13.280
We get mad at credit companies,

00:46:13.280 —> 00:46:19.440
and, you know, all these companies that suck in our data supposedly for our good, for how

00:46:19.440 —> 00:46:23.480
society functions, without our permission, without our knowledge, we don’t know exactly

00:46:23.480 —> 00:46:28.000
what they have, and then we get mad about that, but then these guys turn around and

00:46:28.000 —> 00:46:32.520
just suck up everybody’s stuff without permission or attribution or even recognition of the

00:46:32.520 —> 00:46:35.680
fact that they’re doing it, and that’s just wrong.

00:46:35.680 —> 00:46:41.200
It’s like, you know, I wouldn’t say it’s hypocritical, because I don’t know how they feel about how

00:46:41.200 —> 00:46:44.040
their information is used personally, but yeah, it’s just wrong.

00:46:44.040 —> 00:46:50.140
And I think I agree with you completely about it should be an opt-in thing, and even there

00:46:50.140 —> 00:46:56.400
should be multiple levels, like opt-in, no attribution necessary, use at will; opt-in,

00:46:56.400 —> 00:47:01.920
attribution required, or this is mine, I don’t want it used as inputs for your AI.

00:47:01.920 —> 00:47:08.400
You know, there has to be a right on the artist’s end to specify that, because you’re right,

00:47:08.400 —> 00:47:12.360
People are putting work out there, but that doesn’t mean that they want it to be used however.

00:47:12.360 —> 00:47:15.360
There has to be a way for people to have control over their own art

00:47:15.360 —> 00:47:21.300
I know some people who go to extremes and they’re like, well, everything should be open for everybody. That’s not quite true.

00:47:21.300 —> 00:47:26.860
I don’t believe that’s true. I think humans should be allowed to have control over their artwork

00:47:26.860 —> 00:47:32.960
Yeah, I mean you’ve got basically three options. One, don’t create art. That’s the simplest option. Just don’t do it.

00:47:32.960 —> 00:47:37.020
The second option is create art and give it freely to anyone who wants it at any time

00:47:37.020 —> 00:47:40.300
And then the third option is create art and sell it.

00:47:40.300 —> 00:47:42.860
So how you choose to sell it is the,

00:47:42.860 —> 00:47:44.740
sort of for me is the debate.

00:47:44.740 —> 00:47:46.440
So if you put it out there with a sample

00:47:46.440 —> 00:47:47.780
and say, here’s my low res sample

00:47:47.780 —> 00:47:49.420
with a watermark on it for example,

00:47:49.420 —> 00:47:50.680
the license agreement will say,

00:47:50.680 —> 00:47:52.020
well, I’m gonna sell this

00:47:52.020 —> 00:47:53.700
and these are the conditions and so on.

00:47:53.700 —> 00:47:55.860
You can’t just take my image and then use it.

00:47:55.860 —> 00:47:57.740
If you want, you have to pay for it.

00:47:57.740 —> 00:47:58.800
And then when you pay for it,

00:47:58.800 —> 00:48:00.700
that is a license subject to those conditions.

00:48:00.700 —> 00:48:02.340
And it might be, you can use it in a magazine.

00:48:02.340 —> 00:48:04.380
You can use it, if magazines are still a thing.

00:48:04.380 —> 00:48:05.540
You can use it on a website.

00:48:05.540 —> 00:48:06.920
You can use it for whatever you want.

00:48:06.920 —> 00:48:10.280
But I doubt very much whether many licenses would say,

00:48:10.280 —> 00:48:12.360
yeah, you can feed that into an AI model,

00:48:12.360 —> 00:48:14.540
but that’s the thing that should be coming.

00:48:14.540 —> 00:48:17.580
Anyone that says that you should just create art free

00:48:17.580 —> 00:48:19.400
for the world and then anyone can use it,

00:48:19.400 —> 00:48:22.240
I think that that is a massive oversimplification.

00:48:22.240 —> 00:48:23.800
If you take out like,

00:48:23.800 —> 00:48:26.520
’cause ultimately there is art for art’s sake

00:48:26.520 —> 00:48:28.120
and I wanna create art and put it out there.

00:48:28.120 —> 00:48:30.260
Well, that’s your choice if that’s what you wanna do.

00:48:30.260 —> 00:48:32.380
But there are lots of artists that put art out there

00:48:32.380 —> 00:48:34.360
and that is the way that they survive.

00:48:34.360 —> 00:48:36.820
They don’t survive based on, you know, people saying,

00:48:36.820 —> 00:48:39.140
”Oh, hey man, painting looks great.”

00:48:39.140 —> 00:48:42.900
And it’s like, “Yeah, but you know, I have no money.

00:48:42.900 —> 00:48:44.820
Now I’m going to die because I have no food

00:48:44.820 —> 00:48:46.340
and so on and so forth.”

00:48:46.340 —> 00:48:48.420
So you could probably argue that, you know,

00:48:48.420 —> 00:48:50.180
that’s probably not fair

00:48:50.180 —> 00:48:51.940
’cause that person put that energy and effort in.

00:48:51.940 —> 00:48:53.460
Some would say, “Oh, they should get a real job,”

00:48:53.460 —> 00:48:54.340
and air quotes.

00:48:54.340 —> 00:48:55.540
Great, go do a real job.

00:48:55.540 —> 00:48:57.780
And then no one will create art.

00:48:57.780 —> 00:49:01.180
So the reality is that art is something that,

00:49:01.180 —> 00:49:04.020
when I started, and this is getting really a bit off,

00:49:04.020 —> 00:49:04.980
maybe a bit off tangent,

00:49:04.980 —> 00:49:08.420
But I never really used to understand art or appreciate art.

00:49:08.420 —> 00:49:11.140
And I’ve got an engineer’s mind in a lot of ways.

00:49:11.140 —> 00:49:13.780
But the older I’ve gotten, and as you point out,

00:49:13.780 —> 00:49:15.460
getting into photography is just one example,

00:49:15.460 —> 00:49:18.940
but I’ve come to start appreciating art in different forms.

00:49:18.940 —> 00:49:20.580
And it’s something that honestly,

00:49:20.580 —> 00:49:24.660
it lends meaning and value, I think, to our lives.

00:49:24.660 —> 00:49:27.020
And whether that’s music, whether that’s painting,

00:49:27.020 —> 00:49:28.860
whether it’s photography,

00:49:28.860 —> 00:49:31.700
that form of unique self-expression.

00:49:31.700 —> 00:49:34.220
And it almost feels a little bit like a travesty

00:49:34.220 —> 00:49:40.480
to sort of trivialize it with an AI and say, well, you know, show me a picture of a field

00:49:40.480 —> 00:49:47.060
with a person standing in it. And it’s like, that is not adding value to anything or anyone.

00:49:47.060 —> 00:49:52.940
It just, it feels just terribly wrong in any way. And that is the touchy feely non-engineer

00:49:52.940 —> 00:49:53.940
argument, I guess.

00:49:53.940 —> 00:49:59.180
No, but that’s what we are. We’re touchy feely humans. And we can get a sense of an emotion

00:49:59.180 —> 00:50:06.940
about that scene based on all kinds of intangible factors, based on our own history, based on

00:50:06.940 —> 00:50:11.320
our visual preferences, based on things that have happened to us in the past.

00:50:11.320 —> 00:50:15.060
Like when you’re eating a specific meal with people, and smelling that food can bring

00:50:15.060 —> 00:50:16.060
back memories.

00:50:16.060 —> 00:50:19.140
That’s what we are, and AI is never going to have that.

00:50:19.140 —> 00:50:23.000
And I think art does play an important role, and good things happen when people are allowed

00:50:23.000 —> 00:50:28.020
to create art that inspires and motivates people, and bad things happen when people

00:50:28.020 —> 00:50:34.080
feel like they can’t create that art or they feel stifled. I just, yeah, it’s undervalued

00:50:34.080 —> 00:50:38.040
and it’s the type of thing that everybody wants to consume but nobody wants to pay for.

00:50:38.040 —> 00:50:42.900
And I feel the same way when I see people producing a free piece of software and some

00:50:42.900 —> 00:50:46.320
dude rolls up with a huge laundry list of requests and says “when are these going

00:50:46.320 —> 00:50:50.620
to be available?” and it’s like, uhhh, could you be a little bit less of a jerk about the

00:50:50.620 —> 00:50:52.880
fact that this guy’s doing all this work for free?

00:50:52.880 —> 00:50:57.360
Well, I mean, in that case, if something is freely open I’ll ask the question and say

00:50:57.360 —> 00:50:59.360
”Hey, are you looking towards this feature and that feature?”

00:50:59.360 —> 00:51:02.760
You generally get a pretty straight answer, especially from single developer apps.

00:51:02.760 —> 00:51:07.660
But when it comes to something like, “I’d pay for this feature, and I would pay for this feature,” whatever else,

00:51:07.660 —> 00:51:11.660
then I’d just have to wait until there’s a developer who’s prepared to do that, and then I’ll pay them.

00:51:11.660 —> 00:51:16.760
And that’s the beauty of having both open source and paid apps.

00:51:16.760 —> 00:51:21.860
Yeah, and fair game to saying, “Hey, I would be really interested in the software being able to do this,”

00:51:21.860 —> 00:51:25.960
but then to roll up and say, “By the way, can you publish your timeline for making that happen?”

00:51:25.960 —> 00:51:28.160
That’s where I draw the line.

00:51:28.160 —> 00:51:30.420
Yeah, OK, that’s fair.

00:51:30.420 —> 00:51:33.220
But one more thing I just think we should probably

00:51:33.220 —> 00:51:36.620
discuss as well is all this.

00:51:36.620 —> 00:51:40.860
Well, us crapping on AI, to be honest, and machine learning.

00:51:40.860 —> 00:51:42.300
It’s not all bad.

00:51:42.300 —> 00:51:46.660
I mean, there are some good uses of machine learning.

00:51:46.660 —> 00:51:51.560
And one of the ones that I have heard you mention a few times now is Whisper.

00:51:51.560 —> 00:51:54.320
And I know very little about this.

00:51:54.320 —> 00:51:56.360
It’s on my to-do list to dig into.

00:51:56.360 —> 00:51:57.960
But you’ve been using that for a little while.

00:51:57.960 —> 00:51:59.780
Can you, you want to talk a little bit about that?

00:51:59.780 —> 00:52:01.700
’Cause I’d love to know actually.

00:52:01.700 —> 00:52:02.980
Yeah, and the funny thing is,

00:52:02.980 —> 00:52:05.340
is that when I was listening to that podcast,

00:52:05.340 —> 00:52:07.420
Don’t Fall for the AI Hype episode

00:52:07.420 —> 00:52:09.780
of Tech Won’t Save Us with Timnit,

00:52:09.780 —> 00:52:11.700
she specifically mentioned OpenAI

00:52:11.700 —> 00:52:13.500
because she has some real problems

00:52:13.500 —> 00:52:15.020
with the philosophies of the founders

00:52:15.020 —> 00:52:18.820
and fair game to her because I think she’s right.

00:52:18.820 —> 00:52:22.620
But OpenAI does have a tool that I find quite useful

00:52:22.620 —> 00:52:24.300
and it’s called Whisper.

00:52:24.300 —> 00:52:28.360
And it’s a Python-based audio transcription application.

00:52:28.360 —> 00:52:31.200
And basically what it does is it takes audio files

00:52:31.200 —> 00:52:35.360
and analyzes them and spits out a transcript for you.

00:52:35.360 —> 00:52:40.360
And somebody created a C and C++ port of that,

00:52:40.360 —> 00:52:44.560
and it is also optimized for Apple Silicon.

00:52:44.560 —> 00:52:47.800
So I’ve been running that and doing tests with it.
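For reference, the basic flow being described here looks something like this in the Python package itself; a minimal sketch, assuming the small “base” model and a placeholder file name rather than whatever settings are actually used for the show:

```python
# Minimal sketch of transcribing an episode with OpenAI's open-source Whisper package
# (pip install openai-whisper). "base" and "episode.mp3" are placeholder choices.
import whisper

model = whisper.load_model("base")         # larger models are slower but more accurate
result = model.transcribe("episode.mp3")   # returns the full text plus timed segments

print(result["text"])                      # the whole transcript as one string

# Each segment carries start/end times, which is what an SRT or VTT file is built from.
for seg in result["segments"]:
    print(f'[{seg["start"]:7.2f} -> {seg["end"]:7.2f}] {seg["text"].strip()}')
```

The C/C++ port mentioned here, whisper.cpp, exposes the same models as a command-line tool, which is what makes it practical to run locally on Apple Silicon.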

00:52:47.800 —> 00:52:51.000
And I found that it’s good enough for me,

00:52:51.000 —> 00:52:55.120
It does a pretty good job, a good enough job for me that I think

00:52:55.120 —> 00:53:01.280
I can create usable transcripts for Friends with Brews that I can put up on the site.

00:53:01.280 —> 00:53:05.280
I haven’t figured out yet if I’m going to go so far as to denote speakers of each sentence

00:53:05.280 —> 00:53:09.500
and so forth, I may, but the bottom line is I think I can come up with a workflow that

00:53:09.500 —> 00:53:13.480
doesn’t take me too much time to massage that transcript into something usable, and that

00:53:13.480 —> 00:53:16.280
way it’ll give people something searchable.

00:53:16.280 —> 00:53:21.960
And also, hey, just for us, we can also search and say, “How much detail did we go into on

00:53:21.960 —> 00:53:24.080
this topic before on the podcast?”

00:53:24.080 —> 00:53:27.500
And, you know, transcripts are never a bad option to give people if you can.

00:53:27.500 —> 00:53:33.520
And I think, you know, there’s other podcasters that I know are using this as well, experimenting

00:53:33.520 —> 00:53:36.080
with it to see how it will work.

00:53:36.080 —> 00:53:40.880
And this is the type of thing where I think AI can be positive because I don’t really

00:53:40.880 —> 00:53:45.840
see any downsides to taking my own audio and creating a transcript of it.

00:53:45.840 —> 00:53:47.940
I’m not stealing work from anybody.

00:53:47.940 —> 00:53:52.000
Really what I’m doing is trying to make my podcast more accessible to people.

00:53:52.000 —> 00:53:53.440
Yeah, absolutely.

00:53:53.440 —> 00:53:57.000
So this is the type of thing where I’m positive about it.

00:53:57.000 —> 00:54:00.840
And regardless of what OpenAI as a company is doing in general,

00:54:00.840 —> 00:54:02.600
the Whisper product is something that’s pretty cool.

00:54:02.600 —> 00:54:04.960
And it’s basically just out there on GitHub for people to use.

00:54:04.960 —> 00:54:09.040
So I am going to have a crack at that at some point in the next couple of weeks.

00:54:09.040 —> 00:54:13.340
I’ve had a few things going on with the 50th episode celebrations of causality.

00:54:13.340 —> 00:54:15.560
I’ve been trying to get a whole bunch of things done before that.

00:54:15.560 —> 00:54:19.840
and that’s part of that sort of thing with the t-shirts and all that other stuff that I’ve been doing and

00:54:19.840 —> 00:54:21.240
Q&A stuff and all that stuff

00:54:21.240 —> 00:54:25.320
but it’s been on my list for a couple weeks and I’m like I really want to have a crack at this because the way

00:54:25.320 —> 00:54:32.120
I do my sound bites… sound bites, my goodness, sorry, transcripts, is that, okay, I

00:54:32.120 —> 00:54:39.040
started doing the parallel publish to YouTube a few years ago because it was a Libsyn feature where you could just say, you know

00:54:39.040 —> 00:54:45.080
YouTube as an output, it would go through and encode a static image with the entire audio,

00:54:45.080 —> 00:54:49.640
but as a video file and then it would upload it to YouTube to your account and publish it for you.
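As a rough idea of what that static-image-plus-audio conversion involves, here is a generic ffmpeg recipe; this is an illustration of the general technique, not Libsyn’s actual pipeline, and the file names are placeholders:

```bash
# Encode one still image plus the episode audio into a YouTube-friendly video file.
# cover.png and episode.mp3 are placeholder names.
ffmpeg -loop 1 -i cover.png -i episode.mp3 \
  -c:v libx264 -tune stillimage -pix_fmt yuv420p \
  -c:a aac -b:a 192k \
  -shortest episode.mp4
```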

00:54:49.640 —> 00:54:54.520
And so I did this for Causality, just for the heck of it, to see if it would make a difference.

00:54:54.520 —> 00:54:58.760
And then after a while I started to see, I was picking up a couple of patrons,

00:54:58.760 —> 00:55:02.880
who said, “Oh, we found you via YouTube.” I’m like, okay, so this is probably worth doing, but it’s not

00:55:02.880 —> 00:55:08.900
huge downloads or you know a huge section of the market, but it’s still better than not doing it.

00:55:09.480 —> 00:55:16.480
The side effect of doing that, and now I export using Ferrite into a video and then upload it manually because I get more control over it.

00:55:16.480 —> 00:55:26.480
Anyway, so when you do that, YouTube automatically generates its own SRT files that you can then download once it auto-transcribes the audio for you.

00:55:26.480 —> 00:55:36.480
And these files, like they’re generally okay, but they do need a lot of work and I’m quite the perfectionist so I don’t like putting up stuff that’s rubbish.
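For context, an SRT file is just a list of numbered cues, each with a start and end timestamp and the caption text, so fixing one mostly means correcting the words inside each cue and occasionally nudging the timings. A tiny made-up example:

```
1
00:00:00,000 --> 00:00:03,500
Welcome back to the show.

2
00:00:03,500 --> 00:00:07,200
Today we're talking about transcripts.
```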

00:55:36.480 —> 00:55:40.100
So what I’ve done is I’ve also invested in another bit of software for the Mac called

00:55:40.100 —> 00:55:47.080
Subtitle Studio, and I go in and I load the video and then I load the file, the

00:55:47.080 —> 00:55:53.020
subtitles transcript, and I go through and I make sure that everything does line up with the correct timestamps.

00:55:53.020 —> 00:55:54.500
I’ve only done that for causality

00:55:54.500 —> 00:55:56.500
I haven’t done it for a multi-person show

00:55:56.500 —> 00:56:02.740
But the transcripts that I’ve got have been done via the YouTube system whatever they use in the back end, right?

00:56:02.820 —> 00:56:07.220
But I really want to try Whisper to see if it’s better, because every time it makes a mistake,

00:56:07.220 —> 00:56:09.780
going through and fixing that is a pain in the butt.

00:56:09.780 —> 00:56:13.540
it just takes time and it discourages me from doing transcripts at all because

00:56:13.540 —> 00:56:17.760
You know, if I wasn’t a perfectionist, maybe that’d be fine, you know, just put it up there and say, hey,

00:56:17.760 —> 00:56:21.680
it’s only gonna be 90% accurate and my last name will be spelt Chiggily or something like that,

00:56:21.680 —> 00:56:28.180
and who cares, you know, and it’ll say one word when I meant another, stupid stuff like that.

00:56:28.180 —> 00:56:31.600
But I mean, if Whisper is better, then maybe that’s a better option.

00:56:31.940 —> 00:56:36.180
Yeah, and the good news is, like, when I was playing with the transcripts,

00:56:36.180 —> 00:56:40.020
I quickly came up with some regexes for some common things and a

00:56:40.020 —> 00:56:45.960
couple other little things that I can run it through every time and get rid of some of the problems and even with

00:56:45.960 —> 00:56:48.180
formatting, it helped me

00:56:48.180 —> 00:56:50.940
format it a little bit. I’ll have to see how well that works over time.
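As a sketch of the kind of cleanup pass being described, here is what a few substitution rules might look like; the specific patterns are invented stand-ins, not the actual rules used for the show’s transcripts:

```python
# Hypothetical post-processing pass: a few re.sub rules applied to the raw
# transcript text to fix recurring mis-transcriptions and spacing problems.
import re

FIXES = [
    (r"\bchat\s*g\s*[bp]\s*t\b", "ChatGPT"),  # e.g. "chat g b t" -> "ChatGPT"
    (r"\bchiggily\b", "Chidgey"),             # a recurring misspelt surname
    (r"[ \t]{2,}", " "),                      # collapse runs of spaces and tabs
]

def clean_transcript(text: str) -> str:
    for pattern, replacement in FIXES:
        text = re.sub(pattern, replacement, text, flags=re.IGNORECASE)
    return text

print(clean_transcript("so  chat g b t told mr chiggily a joke"))
# -> "so ChatGPT told mr Chidgey a joke"
```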

00:56:50.940 —> 00:56:56.540
But yeah, the good news is you can do stuff like that for common things, but you did answer a question

00:56:56.540 —> 00:56:58.780
I had. I know that YouTube is generating those

00:56:59.460 —> 00:57:04.020
trans… those captions now, and I was really curious as to whether or not people could get their hands on them.

00:57:04.020 —> 00:57:08.340
So that’s that’s actually really cool. That’s pretty cool. Yeah, I’ve been doing it now for a little while

00:57:08.340 —> 00:57:15.300
I even went back through older ones, older episodes and when I realized you could access it and download it and it was um

00:57:15.300 —> 00:57:21.060
It’s actually not that hard. Uh, but like I said, the idea of doing regexes for common problems,

00:57:21.060 —> 00:57:25.460
that might help a little bit but you know the quality of it’s not necessarily the best but

00:57:25.940 —> 00:57:30.260
The fact that I can do that and it’s like not that much extra effort is great.

00:57:30.260 —> 00:57:35.140
It’s funny, you know, because, like, the whole Podcasting 2.0 movement, where I

00:57:35.140 —> 00:57:39.140
have got transcripts you can download them as an individual file at the site or you can have them

00:57:39.140 —> 00:57:45.380
embedded and they’ll actually show up in sync with the audio if you’re listening in a podcasting 2.0

00:57:45.380 —> 00:57:49.300
compliant application like Castamatic or Podverse; something like that will show the

00:57:49.300 —> 00:57:53.860
transcript coming up at the exact moment the words are spoken, just like it should, as if it

00:57:53.860 —> 00:57:56.500
was closed captions on TV, which is really cool.
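For anyone wondering what “embedded” means in practice: in a Podcasting 2.0 feed the episode item simply points at the transcript file with a podcast:transcript tag, and compliant apps fetch it and scroll it in sync with playback. A stripped-down sketch, assuming the standard podcastindex.org namespace and placeholder URLs:

```xml
<rss version="2.0" xmlns:podcast="https://podcastindex.org/namespace/1.0">
  <channel>
    <item>
      <title>Example Episode</title>
      <enclosure url="https://example.com/episode.mp3" type="audio/mpeg" length="12345678"/>
      <!-- Apps that support the tag display this file in sync with the audio -->
      <podcast:transcript url="https://example.com/episode.vtt" type="text/vtt"/>
    </item>
  </channel>
</rss>
```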

00:57:56.500 —> 00:58:00.300
But the amount of effort it takes to get to that point is huge.

00:58:00.300 —> 00:58:02.660
And I don’t want to put out… I don’t want to put rubbish out there.

00:58:02.660 —> 00:58:04.140
So I-yeah.

00:58:04.140 —> 00:58:06.260
So I’ll give this a shot, and I’ll see what it’s like,

00:58:06.260 —> 00:58:07.340
and I’ll report back.

00:58:07.340 —> 00:58:08.220
I’ll let you know how it goes.

00:58:08.220 —> 00:58:09.100
Sounds good, yeah.

00:58:09.100 —> 00:58:12.460
By the way, at some point in time, I do want to talk to you about Podcasting 2.0.

00:58:12.460 —> 00:58:13.860
I actually meant to a long time ago.

00:58:13.860 —> 00:58:14.620
Oh, sure.

00:58:14.620 —> 00:58:15.420
And then—

00:58:15.420 —> 00:58:16.100
Yeah.

00:58:16.100 —> 00:58:18.660
I didn’t even think of it when I set this up for some reason.

00:58:18.660 —> 00:58:21.860
One interesting thing I did find Whisper to be 100% adequate for

00:58:21.860 —> 00:58:25.820
is I have a website called siracusasays.com,

00:58:25.820 —> 00:58:28.380
and I just take little tiny clips of things

00:58:28.380 —> 00:58:30.980
from John Siracusa, who comes up with lovable,

00:58:30.980 —> 00:58:33.700
very notable quotes, he’s a very quotable guy.

00:58:33.700 —> 00:58:35.780
And so I’ll just post little clips

00:58:35.780 —> 00:58:38.780
from various different podcasts that he’s on, real short.

00:58:38.780 —> 00:58:42.100
I try to make sure I send everybody to his work.

00:58:42.100 —> 00:58:44.740
I’m not trying to profit or benefit off John Siracusa.

00:58:44.740 —> 00:58:47.900
It’s just something I do as a fan ’cause it’s fun.

00:58:47.900 —> 00:58:50.320
But Whisper is amazing for generating

00:58:50.320 —> 00:58:53.880
the little transcripts that I publish with each episode of that.

00:58:53.880 —> 00:58:56.420
It just generally I don’t have to change anything.

00:58:56.420 —> 00:58:57.680
It just pumps it right out.

00:58:57.680 —> 00:59:01.680
So if it’s a clip where Marco or Casey or some other host of a podcast

00:59:01.680 —> 00:59:05.320
he’s on also says something, then, yeah, I have to split it and add names and so forth.

00:59:05.320 —> 00:59:06.920
But they’re so short anyway.

00:59:06.920 —> 00:59:11.320
We’re talking like 30 second to minute and a half clips that it doesn’t matter.

00:59:11.320 —> 00:59:14.180
But Whisper will just generate those for me now, whereas I used to have to

00:59:14.180 —> 00:59:17.160
listen to them, play them back and type it out.

00:59:17.160 —> 00:59:18.820
So I don’t even have to do that anymore.

00:59:18.820 —> 00:59:19.960
You want to know something funny?

00:59:19.960 —> 00:59:24.960
I didn’t, this is just, I came across that in my timeline and

00:59:24.960 —> 00:59:29.280
it never clicked that the Siracusa Says website was done by you.

00:59:29.280 —> 00:59:33.280
It just, I’m like, I knew there was a site, Siracusa Says, and I’m like, wow,

00:59:33.280 —> 00:59:36.760
someone’s really a big Siracusa fan and I never dug into it. And it’s like,

00:59:36.760 —> 00:59:39.120
oh wow, that’s your site? Oh, how cool is this?

00:59:39.120 —> 00:59:41.560
Well, it’s funny because I got,

00:59:41.560 —> 00:59:45.280
I finally got Peter to listen to ATP because a lot of his Apple related

00:59:45.280 —> 00:59:48.600
questions, I was like, dude, just start listening to ATP as a base primer.

00:59:48.640 —> 00:59:50.000
Just start listening to it.

00:59:50.000 —> 00:59:54.480
And he and I would laugh so much about some of the things that Siracusa would say.

00:59:54.480 —> 00:59:59.360
And I don’t know, we must have… It must have been on a Friends with Brews where we had been

00:59:59.360 —> 01:00:01.200
drinking beer or something where we came up with the idea.

01:00:01.200 —> 01:00:03.280
But anyway, it’s still fun to do.

01:00:03.280 —> 01:00:05.560
Oh, that’s awesome.

01:00:05.560 —> 01:00:06.680
I think that’s fantastic.

01:00:06.680 —> 01:00:11.240
And a part of me, to be honest, I’m waiting until someone out there is a big enough fan of my

01:00:11.240 —> 01:00:14.400
work that they do a “Chidgey Says,” because then I know I’ve made it.

01:00:14.400 —> 01:00:16.040
Until then, though, it’s okay.

01:00:16.040 —> 01:00:16.800
I’ll just keep waiting.

01:00:16.800 —> 01:00:17.120
Challenge accepted.

01:00:17.120 —> 01:00:17.520
That’s not a hint.

01:00:17.840 —> 01:00:19.480
That’s not a hint, by the way.

01:00:19.480 —> 01:00:23.760
It has to be organic, because if it’s not organic, then that does not qualify.

01:00:23.760 —> 01:00:28.640
So you are now excluded, as are any of the listeners, from ever creating such a thing,

01:00:28.640 —> 01:00:30.520
because that would be like me soliciting that.

01:00:30.520 —> 01:00:31.480
So no, you can’t do that.

01:00:31.480 —> 01:00:32.320
But anyway.

01:00:32.320 —> 01:00:36.200
I was going to say Vic will do it, but I don’t know what year he’ll get started on that.

01:00:36.200 —> 01:00:38.240
Oh, that’s harsh.

01:00:38.240 —> 01:00:40.120
Yes, good question.

01:00:40.120 —> 01:00:43.280
Twenty-something.

01:00:43.280 —> 01:00:44.440
Anyhow, it’s all good.

01:00:44.440 —> 01:00:45.880
Ah, Vic’s awesome.

01:00:45.880 —> 01:00:46.560
But never mind that.

01:00:46.560 —> 01:00:47.240
He is.

01:00:47.240 —> 01:00:47.640
He is.

01:00:47.640 —> 01:00:52.200
And he knows I’m teasing him. I know he is listening. That’s why I said that. We’ll get VicGPT on it.

01:00:52.200 —> 01:00:55.720
Yeah, those deep fake things. That’s another thing. Yeah.

01:00:55.720 —> 01:01:04.200
So, um, someone deep faked an interview with Adam Curry because they kept inviting him on their podcast and he kept not going on their podcast.

01:01:04.200 —> 01:01:05.920
So they just did a deep fake interview with him.

01:01:05.920 —> 01:01:07.400
Oh my God, that’s hilarious.

01:01:07.400 —> 01:01:10.240
It was so hilarious. I mean…

01:01:10.240 —> 01:01:12.440
I got to find that now. That is funny.

01:01:12.440 —> 01:01:16.920
I’ll see if I can find the link for you. That’s… Yeah, it’s wrong.

01:01:16.920 —> 01:01:18.120
That’s funny.

01:01:18.120 —> 01:01:23.400
All right, John. Well, I have to go get in a car and drive around a

01:01:23.400 —> 01:01:28.040
bunch of 16-year-old girls because my daughter’s having a birthday party event today.

01:01:28.040 —> 01:01:31.760
That sounds like a special kind of pain potentially.

01:01:31.760 —> 01:01:32.960
So hopefully they are.

01:01:32.960 —> 01:01:36.840
Yeah, one 16-year-old girl on their own is fine.

01:01:36.840 —> 01:01:38.640
Two is… gets louder.

01:01:38.640 —> 01:01:41.080
Three, it’s… it’s exponential.

01:01:41.080 —> 01:01:43.600
And then there’s the giggling and then there’s the… yes.

01:01:43.600 —> 01:01:46.200
So anyway, enjoy that.

01:01:46.200 —> 01:01:46.720
Enjoy that.

01:01:46.720 —> 01:01:53.120
I think it’ll be better than it could be because the good news is that my daughter is a very good human for reasons

01:01:53.120 —> 01:01:57.060
that are totally unconnected with being my daughter. She just is, okay? I’m not taking credit.

01:01:57.060 —> 01:02:00.400
So she has good friends. I’ve liked all of her friends that I’ve met

01:02:00.400 —> 01:02:04.520
So I don’t think it’ll be as bad as it could be. But yeah, it’s still it’s still something I gotta do

01:02:04.520 —> 01:02:08.840
Now it’s all good, man. All good. All good. I appreciate you being here

01:02:08.840 —> 01:02:11.800
It was so good to talk to you again after so long

01:02:11.840 —> 01:02:18.020
What would you like to point people to these days for how to find you and how to enjoy more of your

01:02:18.020 —> 01:02:20.680
accent, your untoppable

01:02:20.680 —> 01:02:23.000
voice?

01:02:23.000 —> 01:02:25.000
Oh dear. Well, Adam Curry

01:02:25.000 —> 01:02:28.080
Said that I have very nice pipes and for a second there

01:02:28.080 —> 01:02:31.360
I thought he was talking about my legs, because that’s how some people talk today.

01:02:31.360 —> 01:02:36.300
That is, pipes as legs, and I’m like, what? No… what? No… oh, you mean my voice. Oh yeah, right, cheers. Thanks, Adam.

01:02:36.300 —> 01:02:38.300
That’s very nice coming from the Podfather, but anyway. Oh…

01:02:39.400 —> 01:02:46.240
It was a weird moment. Anyhow, sorry, right, you can reach me at @chidgey, C H I D G E Y, at

01:02:46.240 —> 01:02:49.500
engineered dot space, that’s where I hang out on the Fediverse, and

01:02:49.500 —> 01:02:53.380
my projects, my passion project,

01:02:53.380 —> 01:02:58.420
I guess you could call it, is The Engineered Network, and that’s engineered dot network, as in the full word.

01:02:58.420 —> 01:03:04.940
And you’ll find podcasts there like Pragmatic, and Causality is the other big one.

01:03:04.940 —> 01:03:08.760
So yeah, if you want to hear more of me then check them out

01:03:08.760 —> 01:03:14.520
Yeah, and I’m sure that most of the people listening to this do or have, and it would

01:03:14.520 —> 01:03:18.480
be more likely to be listening to those than this, but if you’re not, for some reason,

01:03:18.480 —> 01:03:21.160
do – I love Causality, by the way.

01:03:21.160 —> 01:03:24.360
It’s just a fascinating show.

01:03:24.360 —> 01:03:29.800
And there’s so many episodes I listen to and you’re just going, “No, no, no, no,

01:03:29.800 —> 01:03:30.800
why?”

01:03:30.800 —> 01:03:31.800
I know.

01:03:31.800 —> 01:03:33.640
The human condition.

01:03:33.640 —> 01:03:42.640
Yeah, the funny thing about Causality is that I started it years ago back in 2015, and I wasn’t sure anyone was going to like it, because I thought, nah, this will be too geeky, this…

01:03:42.640 —> 01:03:49.640
Because I said at the beginning, I don’t… I don’t… I want to do… analyze disasters, but from an engineering point of view. So no hype, no…

01:03:49.640 —> 01:03:57.640
No, uh, character voices, no fake explosion sounds, none of that. I just want to keep it to… This is the sequence of events, this is what happened.

01:03:57.640 —> 01:04:01.240
These are the relevant bits of information and this is how they could have actually stopped it from happening

01:04:01.240 —> 01:04:05.520
Because to me that’s useful like having a bunch of people in a dramatization going

01:04:05.520 —> 01:04:09.540
Oh, it’s gonna blow up, and all sorts of stuff. It’s like, that doesn’t help you, that doesn’t help anybody.

01:04:09.540 —> 01:04:13.800
It’s just, like, a poor form of entertainment, and I don’t even know if it’s in good taste,

01:04:13.800 —> 01:04:20.200
but the reality is that it’s tended to be very, very popular in the end, and it’s got a…

01:04:20.200 —> 01:04:23.120
It’s not… At its peak,

01:04:23.440 —> 01:04:28.460
Pragmatic was probably five times the size, in terms of downloads, that Causality is today.

01:04:28.460 —> 01:04:33.480
But the truth is that I think the fans of causality are hardcore

01:04:33.480 —> 01:04:42.300
lifelong fans. I’ve never had anyone say that they’ve stopped listening in disgust like I got with Pragmatic several times, believe me.

01:04:42.300 —> 01:04:47.460
You know when I get some fact wrong, but on causality I’ve had nothing but positive feedback on it and

01:04:47.460 —> 01:04:50.180
It’s a pleasure to make it.

01:04:50.180 —> 01:04:54.220
I just wish I had more time to make more episodes, but yeah, I’m glad you like it. Thank you

01:04:54.220 —> 01:04:56.220
It’s very nice of you.