1
00:00:00,000 –> 00:00:10,680
So, welcome, Yevgeniy. Very excited to have you on the podcast as one of the people who

2
00:00:10,680 –> 00:00:18,200
actually initiated the whole idea of gathering the IT community in Korea. And actually, that

3
00:00:18,200 –> 00:00:23,560
might be the first question. Why would you think that, you know, IT community is something

4
00:00:23,560 –> 00:00:28,200
really important to have in Korea? And why would people, in your opinion, want to join

5
00:00:28,200 –> 00:00:36,280
the IT community and, like, actively participate in it? Well, how did this idea pop into your

6
00:00:36,280 –> 00:00:43,840
mind, like, that we need an IT community in Korea? And maybe you just wanted to be a part

7
00:00:43,840 –> 00:00:51,080
of some community, right? So how did that come to be? Well, I know that you were concerned

8
00:00:51,080 –> 00:00:58,440
with IT people not having, you know, people to go to and actually talk with, and, like,

9
00:00:58,440 –> 00:01:04,920
not having fun maybe and having, like, maybe being isolated in Korea in some way or being

10
00:01:04,920 –> 00:01:11,520
isolated internally, just being, you know, very, how to say, introverted. So are you

11
00:01:11,520 –> 00:01:21,000
an introverted person yourself? No, no. I think I’m half introverted, half extroverted.

12
00:01:21,000 –> 00:01:31,480
Depends on the people I’m together with and depends on how much beer I’ve drunk. We’re

13
00:01:31,480 –> 00:01:38,640
gonna have some beer after that, right? Yeah, definitely. Let’s go. So we were talking once,

14
00:01:38,640 –> 00:01:45,680
right, in, yeah, some place, in a shisha bar, to be precise. We were talking, and

15
00:01:45,680 –> 00:01:51,080
you were so passionate about, like, creating a community of IT specialists, like, you know,

16
00:01:51,080 –> 00:01:56,080
bringing them together and engaging them, making some space for them to talk to each

17
00:01:56,080 –> 00:02:04,360
other to network. So once we created that community, what did you feel? Yeah, so I think

18
00:02:04,360 –> 00:02:11,680
it’s great. I think there’s, like, a huge opportunity and huge potential for all

19
00:02:11,680 –> 00:02:21,080
people who are involved in the IT industry here in Korea. Like, when I was going through

20
00:02:21,080 –> 00:02:27,440
LinkedIn, I actually saw a lot of different people who were working at Naver or Coupang

21
00:02:27,440 –> 00:02:37,160
or, like, some other big tech companies. So I felt, like, if we combine all of these people

22
00:02:37,160 –> 00:02:47,040
together, there will be, like, a really huge team who can make some new things that haven’t

23
00:02:47,040 –> 00:02:52,840
been done before by anyone. So yeah, I think there’s a great opportunity here. Well, talking

24
00:02:52,840 –> 00:02:59,960
about the things that haven’t been done before, we together made a chatbot called SkyFetch,

25
00:02:59,960 –> 00:03:06,120
right? And the SkyFetch chatbot, first of all, like, how would you describe it?

26
00:03:06,120 –> 00:03:11,360
What was your idea of SkyFetch and what were, like, the problems

27
00:03:11,360 –> 00:03:21,120
it was supposed to resolve for people? Yeah, well, so, ideally, SkyFetch is the

28
00:03:21,120 –> 00:03:29,840
service for, like, people who are traveling from one place to another place and also for

29
00:03:29,840 –> 00:03:37,200
another person who needs to get, for example, some quick parcel or document

30
00:03:37,200 –> 00:03:44,280
from one place to another. So, like, travelers can create

31
00:03:44,280 –> 00:03:51,280
their posts in our SkyFetch app. And consumers, like, people who need to get delivery or something,

32
00:03:51,280 –> 00:03:58,360
they can actually find those travelers, see their posts, connect with them, and ask if

33
00:03:58,360 –> 00:04:03,520
they can make some delivery, like, deliver their parcels or documents, like, either for

34
00:04:03,520 –> 00:04:10,720
free or for some fee that they agree on. Yeah, so it’s, like, sort of the

35
00:04:10,720 –> 00:04:15,840
service which connects these two types of people. Yeah.
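The matching described here, travelers posting routes and senders finding a post that covers their route, can be sketched as a tiny data model. This is purely an illustrative sketch; the names (TripPost, DeliveryRequest, match) are hypothetical and not taken from the actual SkyFetch code:

```python
from dataclasses import dataclass

@dataclass
class TripPost:
    """A traveler's post: the route they fly and their asking fee."""
    traveler: str
    origin: str
    destination: str
    fee_krw: int  # 0 means they will carry it for free

@dataclass
class DeliveryRequest:
    """A sender's request to move a parcel between two cities."""
    sender: str
    origin: str
    destination: str

def match(trips: list[TripPost], req: DeliveryRequest) -> list[TripPost]:
    # Keep only the trip posts whose route covers the requested delivery.
    return [t for t in trips
            if t.origin == req.origin and t.destination == req.destination]

trips = [TripPost("Yevgeniy", "Seoul", "Almaty", 20000),
         TripPost("Kim", "Seoul", "Tokyo", 0)]
req = DeliveryRequest("Ilya", "Seoul", "Almaty")
print([t.traveler for t in match(trips, req)])  # → ['Yevgeniy']
```

In the real chatbot the "fee they agree on" would be negotiated in chat rather than stored on the post; this sketch only shows the matching side.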

36
00:04:15,840 –> 00:04:21,280
Well, like, did it take off? Was it successful, in your opinion, SkyFetch,

37
00:04:21,280 –> 00:04:26,120
or did it teach you something? Like, in what ways would you say it was successful and in

38
00:04:26,120 –> 00:04:34,760
what ways it wasn’t? Yeah, so, actually, it taught me a lot. Like,

39
00:04:34,760 –> 00:04:43,240
I’m still learning a lot of new things even though we partially stopped this project.

40
00:04:43,240 –> 00:04:48,420
But yeah, we actually developed the MVP and it is ready for use. I mean, it was ready

41
00:04:48,420 –> 00:04:54,320
for use, but once we started promoting this service, we actually realized that it’s really

42
00:04:54,320 –> 00:05:02,440
difficult to implement, like, marketing strategies to attract new customers. So yeah, like, the

43
00:05:02,440 –> 00:05:13,040
technical stuff itself is not the only thing which will bring some success to your company,

44
00:05:13,040 –> 00:05:20,240
to your project. Yeah. So marketing stuff is really important.

45
00:05:20,240 –> 00:05:26,840
So would you say that if there was, you know, an indication that people actually need this

46
00:05:26,840 –> 00:05:34,440
service after the initial, you know, results, would you say that it’s feasible and viable

47
00:05:34,440 –> 00:05:40,320
to develop it further or would you want to, like, stop and actually let it, you know,

48
00:05:40,320 –> 00:05:49,960
die, like, the service itself? Yeah, so based on our results, we could

49
00:05:49,960 –> 00:05:58,400
not attract, like, millions or billions of users, but I don’t know if there’s a potential,

50
00:05:58,400 –> 00:06:06,720
like, marketing strategy that could be done which, as a result, could bring

51
00:06:06,720 –> 00:06:10,400
our service, like, millions of users. I don’t know whether, like, such a marketing strategy

52
00:06:10,400 –> 00:06:22,800
exists or not, but yeah, I think there’s always the opportunity of, like, never giving up,

53
00:06:22,800 –> 00:06:25,280
you know. Never give up, never back down.

54
00:06:25,280 –> 00:06:32,400
Yeah, never back down. As a result, you’ll achieve your goal, but that’s not guaranteed,

55
00:06:32,400 –> 00:06:40,600
I guess. So I would say that from my perspective,

56
00:06:40,600 –> 00:06:48,400
like, SkyFetch was a good idea and it was really difficult, like, to really market it,

57
00:06:48,400 –> 00:06:54,560
like, to prove to people because remember, we went to KakaoTalk and we started, like,

58
00:06:54,560 –> 00:06:59,600
posting some advertisements, like, yeah, come to SkyFetch, you know, because a lot of people

59
00:06:59,600 –> 00:07:08,040
there made, like, postings, like, oh, yeah, I’m flying from Seoul to Almaty, let’s say,

60
00:07:08,040 –> 00:07:14,040
and I want to deliver some parcels for some reward and people were, like, exchanging this

61
00:07:14,040 –> 00:07:20,040
back and forth, like, via KakaoTalk, and we thought initially that, yeah, it’s

62
00:07:20,040 –> 00:07:23,480
going to, like, it’s going to blow up. There are a lot of people who need our services,

63
00:07:23,480 –> 00:07:28,720
right? So we just thought that if we put the service out there, people are going to

64
00:07:28,720 –> 00:07:35,920
use it, but the picture was, like, the reverse of it because once we started promoting it,

65
00:07:35,920 –> 00:07:40,640
we realized that people kind of don’t want to switch and the main problem is that people

66
00:07:40,640 –> 00:07:49,120
who deliver the parcels were really reluctant to go to our service. And I honestly,

67
00:07:49,120 –> 00:07:56,920
like, up until now, I can’t really precisely figure out what exactly the problem was. I can’t

68
00:07:56,920 –> 00:08:03,520
pinpoint that. I can say that just generally my feeling is for them, the incentive of going

69
00:08:03,520 –> 00:08:11,240
to some kind of, like, chatbot was not big enough to abandon KakaoTalk altogether and, like,

70
00:08:11,240 –> 00:08:17,800
the chat altogether and, you know, stop going there and go to our service. So how do you

71
00:08:17,800 –> 00:08:27,760
think, like, yourself, you used to, like, send something back to Kazakhstan or, you know,

72
00:08:27,760 –> 00:08:33,560
like, you actually went to Kazakhstan, right, to deliver something. Would you yourself as

73
00:08:33,560 –> 00:08:39,080
a courier, like, use the service or what kind of, like, pain point maybe you see now in

74
00:08:39,080 –> 00:08:45,840
retrospect, like, looking back at the past, what was the problem? Because I still don’t

75
00:08:45,840 –> 00:09:01,200
fucking know. You can swear. Nobody gives a fuck. Yeah. We’re IT people. Who the fuck cares?

76
00:09:01,200 –> 00:09:18,400
Go ahead, man. Yeah, I also don’t know, like, what’s the main reason.

77
00:09:18,400 –> 00:09:28,960
But, yeah, I agree that there was not enough incentive for couriers, like, for flyers to

78
00:09:28,960 –> 00:09:38,440
go to our service and use it. Yeah, I guess, like, in general, if we are talking

79
00:09:38,440 –> 00:09:44,640
in terms of, like, abstraction, that’s the main reason why they didn’t use it because

80
00:09:44,640 –> 00:09:55,280
they actually didn’t want to use it. They didn’t have incentives. But if we try to dig

81
00:09:55,280 –> 00:10:07,680
into the details, I also haven’t figured out why that happened. Like, I mean, what

82
00:10:07,680 –> 00:10:15,520
was the problem in the details? I also don’t know. Yeah. Yeah,

83
00:10:15,520 –> 00:10:23,100
I’d say it’s one of the main problems of the IT industry in general. Like, building something

84
00:10:23,100 –> 00:10:29,580
that nobody actually needs in the real world and that, like, has really low potential

85
00:10:29,580 –> 00:10:37,560
to attract investments. And I watched a couple of, like, podcasts, especially from people

86
00:10:37,560 –> 00:10:44,320
in the AI industry. They were, you know, talking about how to actually, like, approach this

87
00:10:44,320 –> 00:10:49,200
problem, how to approach product management itself, product creation itself, like software

88
00:10:49,200 –> 00:10:54,320
management. So they said, some of them said, like, they don’t even touch code, they don’t

89
00:10:54,320 –> 00:11:00,240
even start coding before they secure, like, five to 10 investors initially, because they

90
00:11:00,240 –> 00:11:05,340
know exactly that half of them are going to drop. And I thought, well, that’s really smart,

91
00:11:05,340 –> 00:11:11,680
but that also requires a lot of, you know, initial kind of authority to be able to do

92
00:11:11,680 –> 00:11:16,920
that. Because if you come out to investors and just sell them air, sell them, like, the

93
00:11:16,920 –> 00:11:23,800
idea and the team that can implement this idea, then you have to have some kind of

94
00:11:23,800 –> 00:11:31,680
background to actually do that. And it’s quite difficult. But I think, like, next time, next

95
00:11:31,680 –> 00:11:35,800
time we develop something, do you think it is a viable strategy

96
00:11:35,800 –> 00:11:41,480
to actually approach investors first, and try to pitch the idea first, and then only,

97
00:11:41,480 –> 00:11:47,080
like, start developing? Or is it something that, you know, you want to test out yourself,

98
00:11:47,080 –> 00:11:50,080
try another approach?

99
00:11:50,080 –> 00:12:09,480
Yeah, so I think there are, like, two different worlds. One is the objective world, where

100
00:12:09,480 –> 00:12:16,160
there’s really, like, the demand, or there’s, like, the need for some products or some services.

101
00:12:16,160 –> 00:12:23,160
And another world is just our own subjective world that we construct ourselves.

102
00:12:23,160 –> 00:12:32,720
So I think in terms of, like, SkyFetch, we just couldn’t see the clear,

103
00:12:32,720 –> 00:12:43,600
like, objective world. We were just thinking through our own subjective world. We mistakenly

104
00:12:43,600 –> 00:12:50,840
thought that SkyFetch would be a great thing. Well, in reality, probably, SkyFetch was not

105
00:12:50,840 –> 00:13:02,040
really in that much demand in the objective world. So I think before we try to create

106
00:13:02,040 –> 00:13:12,840
something new, like the next service, we first need to make sure that our mind, like,

107
00:13:12,840 –> 00:13:23,040
corresponds and aligns with the objective world. Yeah, where we can actually see that

108
00:13:23,040 –> 00:13:28,840
some product or service is really needed, like, among people. And so I think we need

109
00:13:28,840 –> 00:13:37,960
to work on ourselves so that we are not, like, in the delusional world, but rather think

110
00:13:37,960 –> 00:13:47,120
more clearly without any, like, subjective biases, maybe. Yeah. I think, well, that’s

111
00:13:47,120 –> 00:13:57,040
personally my goal for the next, like, uncertain amount of time to try to get rid of this subjective

112
00:13:57,040 –> 00:14:06,440
perception of the world, which is delusional, I think, and try to think more, like, objectively.

113
00:14:06,440 –> 00:14:14,080
Yeah. Well, that’s personally my strategy for the next uncertain amount of time. I don’t

114
00:14:14,080 –> 00:14:15,080
know.

115
00:14:15,080 –> 00:14:20,960
Oh, by the way, you can move the mic around if you’re not comfortable. It’s okay. So I

116
00:14:20,960 –> 00:14:27,560
thought I actually heard a lot of people say that, you know, software development is like

117
00:14:27,560 –> 00:14:33,800
this, in many cases, that you develop something that people don’t need because you think,

118
00:14:33,800 –> 00:14:41,240
as you said, like, you think subjectively. And yeah, that’s a really great point. But

119
00:14:41,240 –> 00:14:49,240
what I wanted to ask you was, from a software development standpoint, I remember

120
00:14:49,240 –> 00:14:54,440
you approached this problem very creatively of not knowing how exactly to develop a chatbot

121
00:14:54,440 –> 00:15:01,000
and the interface and everything. And you mentioned using AI and ChatGPT in that.

122
00:15:01,000 –> 00:15:07,360
So what do you think about, like, it’s such a big question, there are so

123
00:15:07,360 –> 00:15:13,560
many facets of it, like, oh, yeah, what do you think about AI in general? And how do

124
00:15:13,560 –> 00:15:20,520
you think your profession is going to change with AI? And do you think that, like, software

125
00:15:20,520 –> 00:15:26,080
developers are going to be, you know, obsolete in the future, like not needed in the future?

126
00:15:26,080 –> 00:15:31,200
Like, it’s such a big question. I don’t know where to begin. But let’s begin with how did

127
00:15:31,200 –> 00:15:38,200
you come up with this idea of using ChatGPT for programming? And whether you liked this

128
00:15:38,200 –> 00:15:41,680
experience, whether it was actually useful?

129
00:15:41,680 –> 00:15:48,960
Yeah, I hope that AI will replace like software developers.

130
00:15:48,960 –> 00:15:52,600
You hope? Why do you hope so?

131
00:15:52,600 –> 00:16:01,160
There’s, like, this blind belief, but actually software development is not that difficult,

132
00:16:01,160 –> 00:16:09,280
you know, you just need to get used to programming. Like, there’s nothing difficult.

133
00:16:09,280 –> 00:16:15,040
You just need to repeatedly do the same stuff like every day and don’t give up. Like, that’s

134
00:16:15,040 –> 00:16:21,680
it. So, yeah, like, these days, software developers don’t get as much salary as they used to

135
00:16:21,680 –> 00:16:31,720
get like, maybe five or 10 years ago. So the market is slowly getting into its real, like,

136
00:16:31,720 –> 00:16:41,720
environment. So yeah, I think software development will not be the type of occupation

137
00:16:41,720 –> 00:16:49,680
which will get a lot of money because actually, in reality, if you spend,

138
00:16:49,680 –> 00:16:54,760
like maybe a couple of days with actual software developers, you will realize that maybe 80

139
00:16:54,760 –> 00:17:00,880
or 90% of them are just, like, chatting with ChatGPT or searching for stuff, like,

140
00:17:00,880 –> 00:17:07,480
you know, from Stack Overflow, just copy-pasting an already existing solution. So, well, probably

141
00:17:07,480 –> 00:17:13,640
there are, like, 10 or 12% of people who are working on such things as ChatGPT. Well, yeah,

142
00:17:13,640 –> 00:17:21,800
they’re really great software developers, but the rest of them, maybe, like, 70 or 80%,

143
00:17:21,800 –> 00:17:30,800
they’re just doing the same thing every day, which has already been done, like 100 or 70

144
00:17:30,800 –> 00:17:38,360
million times before. So they just copy-paste the same stuff. You just need to know

145
00:17:38,360 –> 00:17:44,440
what you need to copy and where you need to paste in your code. So yeah, that’s it. And

146
00:17:44,440 –> 00:17:53,760
it’s not like the job that can earn like six figures or bring a lot of money. So yeah,

147
00:17:53,760 –> 00:18:02,080
well, personally, I think, like, marketing is not easier than software development.

148
00:18:02,080 –> 00:18:07,420
But yeah, in the market, actually, the software developers get a little bit more than marketers.

149
00:18:07,420 –> 00:18:18,120
So I think AI will bring the software developer

150
00:18:18,120 –> 00:18:27,800
market down, like, into the real situation which it has to be in. Yeah, that’s it. Like, for

151
00:18:27,800 –> 00:18:33,840
me, I also use, like, ChatGPT to copy-paste code. And as I said, you just need to know

152
00:18:33,840 –> 00:18:38,240
what part you need to copy and where you need to paste it.

153
00:18:38,240 –> 00:18:44,000
That’s a great perspective. I actually like that you touched upon the marketing part, because

154
00:18:44,000 –> 00:18:47,920
that’s interesting, because I’m now working in digital marketing, you’re working in software

155
00:18:47,920 –> 00:18:52,560
development. And I would say, yeah, definitely software developers get more money than we

156
00:18:52,560 –> 00:18:59,040
do. Yeah. And that’s really hard to admit. It’s painful to admit. But it is the truth.

157
00:18:59,040 –> 00:19:06,560
And I thought, yeah, like, from my perspective, also, currently, like our company, our client

158
00:19:06,560 –> 00:19:11,840
company too, it’s going through a lot of restructuring, a lot of, like, internal change. It’s not a

159
00:19:11,840 –> 00:19:15,440
secret. Everyone’s going through a lot of internal change. And, like, FAANG, even they’re

160
00:19:15,440 –> 00:19:21,160
going through a lot of, like, you know, layoffs and hiring and firing because of AI. And I

161
00:19:21,160 –> 00:19:26,200
feel it both ways now that you said that software developers, like, 80% of them,

162
00:19:26,200 –> 00:19:33,000
are doing really manual tasks, really something like copy-paste, which also, I understand,

163
00:19:33,000 –> 00:19:35,800
is difficult to admit, right? Because you’re a software developer, you kind of, like, shoot

164
00:19:35,800 –> 00:19:41,680
yourself in the foot here. But still, it is the truth. And AI is coming for our jobs, I would

165
00:19:41,680 –> 00:19:47,480
say. And even if software developers are, you know, in danger, then what to say about

166
00:19:47,480 –> 00:19:52,840
marketers? Because, like, literally, my job for the past year was copying and pasting

167
00:19:52,840 –> 00:19:57,320
things. And I was a project manager. So I was literally copying and pasting things like

168
00:19:57,320 –> 00:20:04,640
70%, at least 70% of the time was just copying and pasting things. And that’s interesting

169
00:20:04,640 –> 00:20:11,240
also, because my next question is, I know that your background is not in software development,

170
00:20:11,240 –> 00:20:17,040
you graduated in business administration, right? Like, a bachelor’s in business administration.

171
00:20:17,040 –> 00:20:22,120
And that’s fascinating for me, because for several years, I’ve been looking for an opportunity

172
00:20:22,120 –> 00:20:29,560
to change my background from a very, like, finance-related, you know, management

173
00:20:29,560 –> 00:20:34,260
related background into software development. And now you are here, like you are the person

174
00:20:34,260 –> 00:20:39,440
that I aspire to become. And how did you do it? Because there are a lot of people like

175
00:20:39,440 –> 00:20:44,560
us who want to change their career. And I’ve seen like, them struggling a lot. And for

176
00:20:44,560 –> 00:20:48,920
you, it seems like it just went naturally, like a natural change. I never knew that

177
00:20:48,920 –> 00:20:54,200
you were from a business background before we actually talked about, like, how you came to

178
00:20:54,200 –> 00:20:59,160
be, and I saw your LinkedIn. I was like, oh, you have a bachelor’s in business

179
00:20:59,160 –> 00:21:06,400
administration, that’s not related to IT development, like, at all. So what was the path like?

180
00:21:06,400 –> 00:21:09,920
Is it difficult? Like is it worth it?

181
00:21:09,920 –> 00:21:20,080
Yes. So as I said, like, software development is not the job which actually

182
00:21:20,080 –> 00:21:30,440
requires, like, a high IQ level. Wow. Yeah, actually. People think that software developers

183
00:21:30,440 –> 00:21:39,560
are really smart. But no, they’re actually people who didn’t give up, even when they couldn’t,

184
00:21:39,560 –> 00:21:48,840
like run their first Hello World code, or maybe when they couldn’t like, write the correct

185
00:21:48,840 –> 00:22:00,280
if and else statement in their function. So that actually happened to me. But people actually

186
00:22:00,280 –> 00:22:08,720
Well, most people say that I’m, like, super stubborn. So I probably agree with that. So

187
00:22:08,720 –> 00:22:17,240
when I actually failed my first code, or when I actually failed my first like if statement,

188
00:22:17,240 –> 00:22:25,960
I think I didn’t give up on that and just kept going. And I think that’s the reason

189
00:22:25,960 –> 00:22:31,520
I could land my first job as a software developer, because, you know, I

190
00:22:31,520 –> 00:22:38,840
actually don’t have, like, that high an IQ, like people who are working at OpenAI or some

191
00:22:38,840 –> 00:22:47,160
other stuff. So I think you just don’t need to give up. Just keep going, keep going. Like

192
00:22:47,160 –> 00:22:53,400
to do the same stuff every day. And yeah, actually, you will get more chances to land

193
00:22:53,400 –> 00:22:58,760
your first job. So yeah, and actually, that’s the reason why I’m saying that software developers

194
00:22:58,760 –> 00:23:03,360
are not that great, like super smart people. They’re just people who didn’t give

195
00:23:03,360 –> 00:23:08,200
up. And that’s it. They’re probably not smart people, but they’re really, like,

196
00:23:08,200 –> 00:23:16,040
stubborn. They really didn’t give up. And that’s it. Actually, like, I have some friends

197
00:23:16,040 –> 00:23:21,880
who started coding, like, who started to learn to code. But when they first faced a problem

198
00:23:21,880 –> 00:23:27,600
with an error, like, for example, in a function or in their code, they actually thought,

199
00:23:27,600 –> 00:23:35,560
well, I’m, like, super stupid for this. I’m not born for this. But actually, I have

200
00:23:35,560 –> 00:23:43,640
also experienced these feelings. And I just compared them with myself. So they just

201
00:23:43,640 –> 00:23:53,880
gave up coding because of their emotions, because of their feelings

202
00:23:53,880 –> 00:24:00,520
that I’m not born for this. But if you put aside your emotions and just, like, keep retrying,

203
00:24:00,520 –> 00:24:06,440
do the same stuff like every day, every day, you will actually realize that writing a function

204
00:24:06,440 –> 00:24:12,840
is not something that requires, like, a high IQ level, but it’s just something that you

205
00:24:12,840 –> 00:24:20,920
have already done before. And you just, you know, like automatically do these things.

206
00:24:20,920 –> 00:24:26,560
Not because you’re super smart, but because you have trained this mindset of writing like

207
00:24:26,560 –> 00:24:34,080
some code and you just automatically without any intention, just mechanically write this

208
00:24:34,080 –> 00:24:42,280
code. So it’s not related to IQ. I think it’s related to, like, automatically doing the

209
00:24:42,280 –> 00:24:48,320
same thing like every day, every day. Yeah. And that’s the secret that probably most people

210
00:24:48,320 –> 00:24:53,880
know, but they don’t believe that it actually is, like, the main secret, the

211
00:24:53,880 –> 00:24:59,080
main reason why, like, how people get their first job. But yeah, I mean, I myself actually

212
00:24:59,080 –> 00:25:07,440
went through this path. So yeah, I think this is true. You just need to do the same thing

213
00:25:07,440 –> 00:25:12,480
like the same stuff every day, every day, like you will fail. But like regarding the

214
00:25:12,480 –> 00:25:19,720
failures, you just need to put aside your emotions and keep going. Yeah.
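The “first if/else statement” he keeps coming back to really is this small. A hypothetical example of the kind of function a beginner stumbles over (the baggage rule here is invented purely for illustration):

```python
def can_carry(weight_kg: float, limit_kg: float = 23.0) -> str:
    """Classify a parcel against a (made-up) airline weight limit."""
    # A beginner's first if/else: the usual stumbles are a missing
    # colon or mis-indented branches, not the logic itself.
    if weight_kg <= limit_kg:
        return "fits"
    else:
        return "over the limit"

print(can_carry(18.5))  # → fits
print(can_carry(30.0))  # → over the limit
```

His point stands: nothing here needs a high IQ, only enough repetition that writing it becomes automatic.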

215
00:25:19,720 –> 00:25:24,720
Great stuff. Really motivational. I mean, listening to you right now, I caught myself

216
00:25:24,720 –> 00:25:30,800
thinking that maybe I don’t really want to change into IT that much. I mean, if AI is

217
00:25:30,800 –> 00:25:36,960
coming for our jobs, it’s kind of like changing the whole paradigm. But speaking about jobs,

218
00:25:36,960 –> 00:25:41,040
you mentioned like your first job, landing your first job. And probably there is a lot

219
00:25:41,040 –> 00:25:47,520
of like emotion, a lot of your, you know, memories in there and a lot of lessons you

220
00:25:47,520 –> 00:25:55,760
learned from there. So my question would be, you probably experienced some kind of imposter

221
00:25:55,760 –> 00:26:03,720
syndrome, right? So how did you land your first job? And like, how did you get rid of

222
00:26:03,720 –> 00:26:20,880
this syndrome? Yeah, so actually when I landed my first job,

223
00:26:20,880 –> 00:26:33,680
it was a small project. I don’t remember if I had like this imposter syndrome, but I

224
00:26:33,680 –> 00:26:41,600
realized that this is the kind of project that, like, for example, junior software developers like

225
00:26:41,600 –> 00:26:50,200
me could actually get to work on. So yeah, I think it was, well, in addition

226
00:26:50,200 –> 00:26:58,400
to this, like the financial budget was not that high, was not that big for that position.

227
00:26:58,400 –> 00:27:08,320
So I just got the job with minimum like salary. So I think because of that, I didn’t have

228
00:27:08,320 –> 00:27:14,720
like, imposter syndrome. For example, if I had, like, a super huge salary while being a junior

229
00:27:14,720 –> 00:27:19,040
software engineer, then yeah, probably I could get like the imposter syndrome. But since

230
00:27:19,040 –> 00:27:27,240
I was getting a relatively small amount of money, so yeah, I think I didn’t feel that imposter

231
00:27:27,240 –> 00:27:39,680
syndrome. Yeah, so I think if you’re a junior software engineer, but you get a super ideal

232
00:27:39,680 –> 00:27:46,760
environment, like, you get a super high salary, you get a lot of corporate benefits, I think

233
00:27:46,760 –> 00:27:52,440
in this case, you can get into a situation where you have, like, imposter syndrome. But

234
00:27:52,440 –> 00:28:01,920
since I was getting, like, the minimum salary, the minimum corporate benefits, I think

235
00:28:01,920 –> 00:28:07,800
I didn’t have this imposter syndrome. You know, I think I’m not that experienced.

236
00:28:07,800 –> 00:28:15,440
I’m a junior software engineer. And I realized that I’m getting the benefits that

237
00:28:15,440 –> 00:28:22,400
I actually deserve as a junior software engineer. So yeah, I think because of that, I didn’t

238
00:28:22,400 –> 00:28:26,240
have like this imposter syndrome problem.

239
00:28:26,240 –> 00:28:33,600
How did you land this job? Like, did you search on LinkedIn or on Saramin or what did

240
00:28:33,600 –> 00:28:46,800
you do? Yeah, actually, I just googled, like, full-stack software engineer positions, like

241
00:28:46,800 –> 00:28:52,840
Node.js or React developer, and just applied everywhere, like on LinkedIn and some other

242
00:28:52,840 –> 00:28:58,360
services I don’t even remember. Yeah, I just applied everywhere. And some

243
00:28:58,360 –> 00:29:05,000
of them, like, replied, some of them ignored me. But yeah, I think it was maybe LinkedIn or

244
00:29:05,000 –> 00:29:13,120
I don’t remember the website. I just applied everywhere I found, like, some

245
00:29:13,120 –> 00:29:18,800
job postings. Yeah. Nowadays, do you use LinkedIn mostly?

246
00:29:18,800 –> 00:29:27,640
No, actually, I think it’s really, like, a waste of time, or at least for

247
00:29:27,640 –> 00:29:35,480
software developers. Because, yeah, I mean, like, to apply, not to search, like, to

248
00:29:35,480 –> 00:29:46,400
apply through LinkedIn. Because, like, maybe 80% or 90% of job advertisements, like,

249
00:29:46,400 –> 00:29:53,000
they are either, like, reposted or, yeah, just being automatically reposted. And

250
00:29:53,000 –> 00:29:59,880
recruiters maybe don’t even check the submitted, like, resumes and other personal information.

251
00:29:59,880 –> 00:30:08,440
But yeah, I think it would be better, if you find some job position, you

252
00:30:08,440 –> 00:30:15,920
probably would have much more chances if you go to their official, like, corporate website

253
00:30:15,920 –> 00:30:23,120
and check if this position actually exists on their website and apply through the corporate

254
00:30:23,120 –> 00:30:32,280
website, rather than applying through LinkedIn. Because I think there are, like, too many

255
00:30:32,280 –> 00:30:44,880
bots on LinkedIn. And the information there is not actually up to date, I think.

256
00:30:44,880 –> 00:30:51,520
So as a junior developer, let’s say I just switched to IT hypothetically, what kind of

257
00:30:51,520 –> 00:30:59,320
wage? What’s the range of salary I can hope for in Korea?

258
00:30:59,320 –> 00:31:07,720
In Korea, I think it’s, like, the minimum probably, where maybe you will get something

259
00:31:07,720 –> 00:31:15,000
like 3 million won per month, maybe net. Yeah.

260
00:31:15,000 –> 00:31:20,760
So like 40 million won per year, like about 36, 35?

261
00:31:20,760 –> 00:31:21,760
Yeah.

262
00:31:21,760 –> 00:31:22,760
Okay.

263
00:31:22,760 –> 00:31:27,600
I think so. Well, it’s not based on any facts. But yeah, I think so,

264
00:31:27,600 –> 00:31:35,320
based on some job posts that I’ve seen on, like, Wanted or Saramin or Job Korea. Yeah, but

265
00:31:35,320 –> 00:31:37,120
it’s my assumption.
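For reference, the rough arithmetic behind the numbers in this exchange (illustrative only — the 3 million won figure is his guess, not a sourced statistic, and the 40% US deduction is the rate mentioned a moment later):

```python
# Guessed junior net salary in Korea, from the conversation above.
monthly_net_krw = 3_000_000
annual_net_krw = monthly_net_krw * 12
print(annual_net_krw)          # → 36000000, i.e. ~36 million won per year

# The US comparison: ~40% effective deduction on a six-figure gross.
gross_usd = 100_000
take_home_usd = gross_usd * (1 - 0.40)
print(take_home_usd)           # → 60000.0
```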

266
00:31:37,120 –> 00:31:39,520
So that’s after tax, right?

267
00:31:39,520 –> 00:31:40,520
Maybe.

268
00:31:40,520 –> 00:31:46,080
Well, after tax and before tax in Korea is not that much different. Not like in America,

269
00:31:46,080 –> 00:31:47,080
right?

270
00:31:47,080 –> 00:31:54,900
In the States, I heard it’s like up to 40% tax deduction. So you can make six figures

271
00:31:54,900 –> 00:32:02,200
on paper, like 100k yearly, but actually take home 60,000 per

272
00:32:02,200 –> 00:32:11,040
year, which is, I think, insane. Like, yeah. So you landed your first job like that, like

273
00:32:11,040 –> 00:32:16,560
applying through LinkedIn and everything, right? What was the interview like? So did they ask

274
00:32:16,560 –> 00:32:23,960
you questions like, why did you fucking decide to switch? Did they interrogate you about

275
00:32:23,960 –> 00:32:24,960
that?

276
00:32:24,960 –> 00:32:38,120
No, actually, they didn’t. Because that project, as I said, they required some junior software

277
00:32:38,120 –> 00:32:48,320
developers. And it was a small startup. It wasn’t a super huge IT company. So they

278
00:32:48,320 –> 00:32:54,080
actually didn’t care about your background. As long as you had the bare minimum knowledge

279
00:32:54,080 –> 00:33:07,480
for a junior software engineer, you could actually apply. Yeah, I think if you’re a junior

280
00:33:07,480 –> 00:33:17,720
software engineer, and you didn’t graduate from those Ivy League universities

281
00:33:17,720 –> 00:33:25,080
or some other universities where, I mean, the computer science department

282
00:33:25,080 –> 00:33:35,720
is really popular. I think if you don’t have any experience, probably it would be even

283
00:33:35,720 –> 00:33:44,000
better to start your first job at a startup rather than applying for those internships

284
00:33:44,000 –> 00:33:50,480
at Google. Well, actually you can, but I think in this case, you’ll have much better chances

285
00:33:50,480 –> 00:34:00,280
to land your job at a startup. But actually, there are a lot of people who completed their

286
00:34:00,280 –> 00:34:09,000
computer science degree not at top universities, but still could land a job at Google or

287
00:34:09,000 –> 00:34:18,880
maybe other FAANG companies. But in this case, you need to learn data structures and

288
00:34:18,880 –> 00:34:29,640
algorithms, you know, DSA, and maybe solve a lot of problems on LeetCode. Yeah, I think

289
00:34:29,640 –> 00:34:44,040
that’s one of the ways to land a job at a FAANG company. So, sorry, I lost my thought

290
00:34:44,040 –> 00:34:51,400
for a minute there. One of the best ways to land a job, you said, is to go to startups,

291
00:34:51,400 –> 00:34:59,560
right? Yeah. And I was curious, like, because there are many ways to learn software development,

292
00:34:59,560 –> 00:35:05,400
and every university teaches it differently. And there are even some videos on

293
00:35:05,400 –> 00:35:11,400
YouTube that describe the full path to certain roles, for example, like web development,

294
00:35:11,400 –> 00:35:16,200
or people say like, Oh, you should start from this and then go to there and then go there.

295
00:35:16,200 –> 00:35:22,600
So what would you say, in your opinion is like, the best starting point for someone

296
00:35:22,600 –> 00:35:27,080
who just goes into software development and doesn’t know much, what’s the best starting

297
00:35:27,080 –> 00:35:34,760
point? And what is like the pathway as far as you can go? Just your opinion about that?

298
00:35:38,120 –> 00:35:44,840
I don’t think this roadmap will be valid because, you know, all of those things

299
00:35:45,720 –> 00:35:55,080
will probably be done by AI, like Devin, or what is it, an AI assistant that will code. So I think,

300
00:35:55,080 –> 00:36:04,840
the roadmap for software developers should be restructured, because, you know,

301
00:36:05,400 –> 00:36:14,600
right now, I mean, so far, if you go to any video, all those videos will be about the

302
00:36:14,600 –> 00:36:19,480
same stuff, about the same roadmap, and like you need to learn this, this, this, and that.

303
00:36:19,480 –> 00:36:27,960
But since Devin will come out soon, and AI is evolving, I think

304
00:36:28,600 –> 00:36:37,080
the roadmap should be restructured. And actually, the requirements for software engineers will also

305
00:36:37,080 –> 00:36:47,800
be changed. Well, I personally cannot predict what those requirements will be. But as

306
00:36:47,800 –> 00:36:57,880
soon as we see some unusual requirements, I think, based on those requirements, we need to

307
00:36:59,080 –> 00:37:03,160
come up with a new roadmap for software engineers, I think.

308
00:37:04,760 –> 00:37:12,600
Because you know, all of that HTML and CSS stuff will be like 95% done by AI. So,

309
00:37:12,600 –> 00:37:17,240
yeah, I think there will be some new technical requirements for software engineers,

310
00:37:18,280 –> 00:37:26,520
or maybe the entry level, like the bar to enter, will be much higher. Probably in the future,

311
00:37:26,520 –> 00:37:34,840
you will need to know as many things as right now a mid-level or senior software developer

312
00:37:34,840 –> 00:37:42,200
has. I don’t know. Yeah. But I think, yeah, this industry, like the entire profession,

313
00:37:45,000 –> 00:37:53,400
will be changed. And technical requirements will be changed as well for software developers in the future.

314
00:37:56,200 –> 00:37:59,160
I think so. What’s Devin?

315
00:37:59,160 –> 00:38:09,640
It’s an AI assistant that will be created for the purpose of coding. So it’s not

316
00:38:09,640 –> 00:38:16,840
a super broad or general AI system. It’s an AI assistant for software development. Yeah.

317
00:38:18,840 –> 00:38:25,240
Do you think it has the potential to actually become a software engineer?

318
00:38:25,240 –> 00:38:31,240
Do you think it has the potential to become more than just an assistant and actually

319
00:38:31,240 –> 00:38:34,680
develop software itself, on its own?

320
00:38:38,600 –> 00:38:45,400
Do you think AI in general has this potential? Yeah. Well, right now ChatGPT actually

321
00:38:45,400 –> 00:38:55,160
provides technical solutions in not all, but in some or most of the cases. But, you know, ChatGPT

322
00:38:55,160 –> 00:39:02,920
is not created for software development only, right? But if there’s this AI system which will be

323
00:39:02,920 –> 00:39:13,000
created using only this technical, I mean, with this technical expertise, it will

324
00:39:14,200 –> 00:39:25,080
go through the entire API documentation for each service, like React or Node.js or Java Spring.

325
00:39:25,080 –> 00:39:33,960
Yeah, I think it will be more specialized and customized for software development.

326
00:39:36,120 –> 00:39:43,000
Yeah, the main thing is that it should always be up to date, because now ChatGPT

327
00:39:43,720 –> 00:39:49,720
has limited information, like in terms of a time cutoff. Previously it was

328
00:39:49,720 –> 00:39:56,280
only capable of handling information that was generated before a certain point in time.

329
00:39:56,280 –> 00:40:03,320
Up to 2021, right? Yeah, something like that. But yeah, well, for the software development

330
00:40:03,320 –> 00:40:08,840
industry, you always need to make sure that your AI system is up to date with the real information

331
00:40:08,840 –> 00:40:14,840
because, you know, the API documentation is changing every day. There may be some

332
00:40:14,840 –> 00:40:25,000
breaking changes where a function used to require two parameters, but now it should get only one.

333
00:40:25,880 –> 00:40:34,520
You know, if you try to pass an incorrect number of parameters, there will be an exception, an error.

334
00:40:35,080 –> 00:40:40,920
So you need to make sure that your AI system is up to date with the real information.

335
00:40:40,920 –> 00:40:47,080
You know, you always need to support this AI system.

336
00:40:50,040 –> 00:40:58,520
I remember, if I remember correctly, Sam Altman announced, I mean, I watched it on the Lex Fridman

337
00:40:58,520 –> 00:41:11,080
podcast, that ChatGPT 5 is going to come out this summer, right? So I’m actually quite

338
00:41:11,880 –> 00:41:21,320
worried by that because Altman said that he’s personally not really that fond of ChatGPT 4.

339
00:41:21,320 –> 00:41:28,120
He doesn’t think that it’s very impressive, but he thinks that ChatGPT 5 is something that he can be

340
00:41:28,120 –> 00:41:36,520
actually proud of. That’s how much difference there is, and they also don’t go with 4.5. They go with 5 directly.

341
00:41:37,240 –> 00:41:44,200
What would you think, because you might be like more knowledgeable in this industry than I am,

342
00:41:44,200 –> 00:41:52,040
about the possible capabilities of ChatGPT 5, and what I’m more interested in is

343
00:41:53,000 –> 00:41:58,040
should we really start worrying about the job market?

344
00:42:01,240 –> 00:42:07,640
Well, no, actually ChatGPT 5 will absolutely not replace developers, but

345
00:42:07,640 –> 00:42:15,640
Devin can, I mean, potentially can replace them, because at least, for example, if you are

346
00:42:16,760 –> 00:42:25,240
developing a program, sometimes you need to manipulate, well, not sometimes, but you

347
00:42:25,240 –> 00:42:33,080
always need to manipulate the file system, I mean, you need to

348
00:42:33,080 –> 00:42:40,440
manipulate the file system of the computer that you’re working with. For example, you need to create

349
00:42:41,080 –> 00:42:48,760
or delete files in your local folder. So right now ChatGPT cannot do that.

350
00:42:49,800 –> 00:42:56,280
You cannot give ChatGPT permission or access to your operating system, so that it can

351
00:42:56,280 –> 00:43:05,160
delete or create some files or folders on your computer. But yeah, I don’t know how Devin will

352
00:43:05,960 –> 00:43:20,040
be developed. I don’t know if Devin will be a plugin for code editors, or if it will be separate

353
00:43:20,040 –> 00:43:28,600
software which you need to install on your computer, or if it will be like ChatGPT, a browser

354
00:43:28,600 –> 00:43:41,560
extension, or a website, I don’t know. But at least it needs to manipulate your computer

355
00:43:41,560 –> 00:43:50,760
to send network requests or to update the file system. And right now neither ChatGPT 4 nor ChatGPT

356
00:43:50,760 –> 00:43:58,120
5 can do that, because ChatGPT is just a website in the browser. It doesn’t have full access

357
00:43:59,480 –> 00:44:08,040
to your file system. So yeah, well, ChatGPT 5 will not replace

358
00:44:08,040 –> 00:44:19,000
software engineers, but once we have this AI software which has permission to the file

359
00:44:19,000 –> 00:44:25,960
system, the end of software engineers will be much closer from that time.

360
00:44:25,960 –> 00:44:39,000
Yeah, you touched on this topic of manipulation and access. And pretty much everyone is aware of the

361
00:44:39,000 –> 00:44:46,680
conflict between Musk and Altman on the topic of making AI actually open source, open to the public.

362
00:44:46,680 –> 00:44:56,840
But I guess the underlying conflict is much wider here. It’s more about the bigger picture,

363
00:44:56,840 –> 00:45:07,640
whether AI is safe enough to actually give it so much power

364
00:45:08,520 –> 00:45:15,560
or not. And let’s actually make it a very existential question. Do you think that AI can

365
00:45:15,560 –> 00:45:18,920
and should control the world?

366
00:45:29,160 –> 00:45:39,880
Actually, I don’t think AI will be exactly the same as the human brain.

367
00:45:39,880 –> 00:45:50,440
Yeah, as the human brain, because people are trying to create, I mean, to copy and

368
00:45:51,800 –> 00:46:01,880
to make a copy of the human brain. But how can you create exactly the same thing as

369
00:46:01,880 –> 00:46:09,400
something that you actually don’t know about? The human brain is not

370
00:46:09,400 –> 00:46:16,600
studied 100% yet. So there are a lot of unknown things and there are a lot of

371
00:46:18,120 –> 00:46:24,440
things which people don’t know about the brain yet. So I think you cannot,

372
00:46:26,520 –> 00:46:36,280
I mean, people will not create exactly the same brain, an artificial brain, because they simply

373
00:46:36,280 –> 00:46:46,920
don’t have 100% of the information about how the human brain actually works. But yeah, I think

374
00:46:47,560 –> 00:46:53,560
people will not create exactly the same artificial intelligence as the human brain.

375
00:46:55,400 –> 00:47:06,120
Because, you know, the human brain is so complex. Yeah, it’s so complex. There are a lot of

376
00:47:06,120 –> 00:47:18,360
questions which are not answered yet. But I think AI can help humans find some answers to those

377
00:47:18,360 –> 00:47:25,000
questions, I think. Yeah, because honestly, what I thought when I was first posed with this dilemma,

378
00:47:25,640 –> 00:47:32,040
like whether I think personally that AI should or should not take over the world, I thought that

379
00:47:32,040 –> 00:47:36,760
there was part of me that actually said, you know, it’s okay if it takes over the world

380
00:47:37,320 –> 00:47:43,080
because it’s kind of rational. But also at the same time, the next second I realized that AI

381
00:47:43,080 –> 00:47:49,320
for this purpose, it should have all the data about humans and also kind of not perceive humans

382
00:47:49,320 –> 00:47:56,360
as humans. It’s quite impossible for AI to really understand humans on this level, like really care

383
00:47:56,360 –> 00:48:01,480
about each life and everything, because there are like billions of us, you know. And for AI

384
00:48:01,480 –> 00:48:12,120
it’s more of a resource. So it would treat the Earth kind of like a strategy game.

385
00:48:12,920 –> 00:48:19,400
So maybe that is the worst part of AI being, you know, at the head of humanity.

386
00:48:19,400 –> 00:48:24,840
Because it is rational and being rational sometimes means not caring about people’s lives.

387
00:48:25,560 –> 00:48:30,040
So that’s something I thought from the standpoint of like ethics and everything.

388
00:48:30,040 –> 00:48:35,400
Like what if it just decides that, you know, we don’t need those people here, let’s just wipe

389
00:48:35,400 –> 00:48:41,240
them out using these people here. And just calculate mathematically like, yeah, we’ll lose

390
00:48:41,240 –> 00:48:46,040
this many people, but they are the least productive part of, you know, the world, and

391
00:48:46,040 –> 00:48:52,200
basically just like wipe out an entire nation or population. So I thought, well, yeah, that’s

392
00:48:52,200 –> 00:49:00,120
really, the laws of robotics should be deeply embedded in AI in order to make it humane.

393
00:49:00,920 –> 00:49:08,200
And in that regard, do you agree more with Altman or with Musk on whether AI should be

394
00:49:08,200 –> 00:49:16,840
open, like whether OpenAI should open their code, or keep it closed and rename themselves to ClosedAI?

395
00:49:16,840 –> 00:49:33,960
Actually, I think the reason why Elon Musk wants to make OpenAI open source is not because he

396
00:49:33,960 –> 00:49:41,720
actually cares about humanity. I think he doesn’t care about humanity as much as he

397
00:49:41,720 –> 00:49:51,960
portrays himself through this conflict with OpenAI. I don’t know what he wants to get from this

398
00:49:51,960 –> 00:50:03,800
situation, but he’s definitely not the person who cares about society, about the world.

399
00:50:03,800 –> 00:50:11,560
And therefore, that’s why he wants OpenAI to be open source. I think that’s not the main reason.

400
00:50:11,560 –> 00:50:17,560
I don’t know what the reason behind that is, but he’s not that person.

401
00:50:17,560 –> 00:50:25,560
What makes you think so? That he’s not the person to care?

402
00:50:25,560 –> 00:50:29,560
No, I mean, he’s not evil, he’s not a bad guy, but

403
00:50:29,560 –> 00:50:39,560
humanity is not his number one priority, I think. Well, I’m not saying that he’s a super bad guy, but

404
00:50:39,560 –> 00:50:47,560
honestly, humanity is not his number one priority. His personal interests are his number one priority.

405
00:50:47,560 –> 00:50:55,560
Because if he actually cared about humanity, well, the first thing he should do is make the

406
00:50:55,560 –> 00:51:05,560
Tesla code open source. That would be a much better step towards human

407
00:51:05,560 –> 00:51:13,560
safety. Because, you know, probably if he made the Tesla code open source, people would find a lot of

408
00:51:13,560 –> 00:51:23,560
bugs, a lot of technical bugs, which can potentially cause a lot of problems.

409
00:51:23,560 –> 00:51:31,560
They could actually cause a lot of car accidents. And of course, this would hit Tesla’s

410
00:51:31,560 –> 00:51:39,560
stocks, and he would probably lose some part of his net worth.

411
00:51:39,560 –> 00:52:01,560
Yeah, actually, what I want to say is, a lot of open source projects are evolving slowly, you know.

412
00:52:01,560 –> 00:52:07,560
If you need to make some changes to an open source product, you need to create a pull request. It needs

413
00:52:07,560 –> 00:52:16,560
to get accepted by a lot of different people, depending on the policy of those open source

414
00:52:16,560 –> 00:52:22,560
projects. For example, in some open source projects, there’s a separate committee which should

415
00:52:22,560 –> 00:52:32,560
accept those changes first and only after that, the changes will be merged. And sometimes this entire

416
00:52:32,560 –> 00:52:42,560
process can take years, even a couple of years. So if such a successful project like

417
00:52:42,560 –> 00:52:54,560
ChatGPT becomes open source, I mean, the speed with which the AI industry

418
00:52:54,560 –> 00:53:08,560
is growing will be slowed down. You know, you need to trade industry growth for safety.

419
00:53:08,560 –> 00:53:18,560
You know, if you want to get safe AI, you will need to sacrifice the speed of evolution.

420
00:53:18,560 –> 00:53:24,560
It will slow down. Yeah, it will slow down. Definitely. I don’t know. It’s just my assumption. Yeah, you

421
00:53:24,560 –> 00:53:30,560
always need to sacrifice something, you know, to find those compromise solutions. Yeah,

422
00:53:30,560 –> 00:53:39,560
progress is very rarely ethical. Yeah, like, breakthrough things are breakthroughs

423
00:53:39,560 –> 00:53:50,560
because they’re unethical, most of the time. Well, I caught myself thinking, like, if so many

424
00:53:50,560 –> 00:53:59,560
things are being invented, you know, we won’t even need to drive cars ourselves, probably, in

425
00:53:59,560 –> 00:54:05,560
about 20 years. You can just buy an electric car like a Tesla, a self-driving car. And that’s it. You

426
00:54:05,560 –> 00:54:12,560
don’t really need to steer the wheel anymore. I mean, not now, but in 20 years, probably we won’t

427
00:54:12,560 –> 00:54:20,560
be driving ourselves. Right. And I also thought that, well, if AI is going to come and AGI is going to

428
00:54:20,560 –> 00:54:28,560
be developed, and many people, like people with authority, with knowledge, say

429
00:54:28,560 –> 00:54:34,560
that it’s going to be developed within the next 10 years. So within 10 years, by the time we are

430
00:54:34,560 –> 00:54:44,560
like 35 or 40, we’re going to be driving driverless cars, we’re going to be working in a world

431
00:54:44,560 –> 00:54:52,560
where AGI is developed, let’s say not 10 but 20 years. Doesn’t it give you, well, to me,

432
00:54:52,560 –> 00:55:00,560
it gives this kind of sense like, so what am I working for? What am I really striving for? I should

433
00:55:00,560 –> 00:55:05,560
probably go and learn something like being a carpenter, and actually do something I enjoy, you know,

434
00:55:05,560 –> 00:55:11,560
maybe writing books, maybe doing something like that. From the societal, from the technological standpoint, it

435
00:55:11,560 –> 00:55:17,560
is not that productive. But from the creative standpoint, it kind of makes sense, because AGI,

436
00:55:17,560 –> 00:55:25,560
no matter how smart it is, it might not be capable of the same creativity as people are. So do you

437
00:55:25,560 –> 00:55:32,560
sometimes have those thoughts? Especially as a software developer, I thought, maybe you know

438
00:55:32,560 –> 00:55:39,560
more and have more concerns about that. And in this case, what would your escape be from that?

439
00:55:39,560 –> 00:55:45,560
Because honestly, it’s like difficult to realize that maybe you are replaceable in the nearest future.

440
00:55:45,560 –> 00:55:50,560
Maybe you won’t need to actually work, maybe you will need to do something else.

441
00:55:50,560 –> 00:56:06,560
Yes, so actually I don’t agree with people who say that AI will make, well, of course it will make

442
00:56:06,560 –> 00:56:16,560
people jobless, but there will definitely be absolutely new types of jobs and occupations created

443
00:56:16,560 –> 00:56:26,560
which haven’t existed before. Probably right now we cannot even imagine what new types of jobs will be

444
00:56:26,560 –> 00:56:40,560
created as a result of this AI revolution. Because, you know, maybe 20 years ago people

445
00:56:40,560 –> 00:56:47,560
couldn’t even imagine there would be such jobs as prompt engineer or maybe social media marketer

446
00:56:47,560 –> 00:56:59,560
or blogger. People always complain that technology will make them jobless, you know. But right now,

447
00:56:59,560 –> 00:57:08,560
if we take a look back, we can see that yes, some jobs were eliminated by technology, but right

448
00:57:08,560 –> 00:57:18,560
now we have a lot of new jobs which didn’t exist even like 20 years ago. But yes, actually a lot of

449
00:57:18,560 –> 00:57:31,560
people say that, I mean, if you haven’t discovered your meaning and purpose

450
00:57:31,560 –> 00:57:40,560
in this life, if you haven’t found your job that you would actually do for free, then you should not

451
00:57:40,560 –> 00:57:52,560
attach yourself to the job, because if AI replaces you, you should be adaptive enough to start

452
00:57:52,560 –> 00:58:01,560
learning new things, to get these new jobs. But yeah, if you have found the job that you would actually

453
00:58:01,560 –> 00:58:10,560
love to do for free, then yeah, I think you don’t need to worry that AI will replace your job.

454
00:58:10,560 –> 00:58:22,560
Let’s say hypothetically you got fully replaced by AI, like there are no jobs in software development,

455
00:58:22,560 –> 00:58:25,560
or maybe you just don’t want to do it anymore. What would you do?

456
00:58:25,560 –> 00:58:34,560
Yeah, since I don’t have this job, as I said, that I would love to do for free, I probably haven’t found it yet.

457
00:58:34,560 –> 00:58:45,560
I don’t know what I want to do for free, like I haven’t found my passion. Probably I will just

458
00:58:45,560 –> 00:58:50,560
spend some time to learn something new, to switch to another job.

459
00:58:50,560 –> 00:58:54,560
But you have no idea what it’s going to be right now, right?

460
00:58:54,560 –> 00:59:02,560
Yeah, actually I don’t attach myself to software development. Probably I will leave this job maybe next year

461
00:59:02,560 –> 00:59:13,560
or next month, I don’t know. I don’t have this idea in my brain that I will always be the software engineer.

462
00:59:13,560 –> 00:59:21,560
Like fixation. Great stuff.

463
00:59:21,560 –> 00:59:35,560
I mean, okay, well, ladies and gentlemen, that was Yevgeny Pak for you. The prodigy, genius guy who graduated

464
00:59:35,560 –> 00:59:41,560
with a Bachelor of Business Administration and then completely turned around his career and mindset

465
00:59:41,560 –> 00:59:50,560
into becoming a software developer and is now exploring this field and constantly learning.

466
00:59:50,560 –> 00:59:57,560
So I’ve never seen anyone as motivated and as passionate about what he does.

467
00:59:57,560 –> 01:00:05,560
And even though he claims that he’s not really fixated on software development, it does seem often that he is

468
01:00:05,560 –> 01:00:08,560
fixated on software development. That’s how passionate he is.

469
01:00:08,560 –> 01:00:16,560
So I hope that you extracted value and benefits and something new from this podcast.

470
01:00:16,560 –> 01:00:21,560
And thank you. Thank you, Yevgeny, for being here.

471
01:00:21,560 –> 01:00:33,560
And I hope that we’ll get a chance to talk to you next time. Yeah, thank you.

472
01:00:33,560 –> 01:00:39,560
This was the episode with Yevgeny Pak.
