Canisius Rozario from Collide.

0:00 I know we just launched the webinar and so I am happy to just slowly kind of talk through why we're here and set the table as more people trickle in. So I, as many of you guys know, Mercury Fund

0:15 has been investing in vertical AI native companies or AI fast companies, AI first, there's like a new term all the time, we'll get into that. And I've had the benefit personally of being able to

work with a lot of amazing CTOs like you, like others that may come down the line in this series, and I can't have Granola notes running on my life all the time. So all those long conversations on

the phone, all those meetings we had, anytime we're just chatting in the Collide offices, you've provided nuggets that have been really useful for me as a venture capitalist, but also as someone

0:51 that's interested in how AI is going to change all these different legacy industries or industrial transformation, as we're going to call it.

1:00 I figured I would, you know, if there was enough interest and there seemed to be enough interest, probably likely because of you, not because of me, but I'm sure it's because a few people don't

1:08 know me yet. Yeah, that's fair, that's fair. We'll use that.

And I wanted to host something, so I'll host a conversation with you. And most importantly because, for those of you that don't know, Kinesh has had a long technical career building

1:23 amazing technical

1:25 teams at every stage, right? And we can get into kind of specifics of that in a sec, but starting from pre-seed all the way through growth stage, and most recently as the head of engineering, at

1:37 a company called Upside. But none of that was ever in the oil and gas space, energy space, anything kind of adjacent to that. And if we're being transparent, like, not a lot of it was

1:50 necessarily AI native back then, or AI anything back then, right? And so I'm excited for you to kind of jump into story arc, that character arc. And then before you start, just one key piece of

logistics, I guess, for the audience. There should be a Q&A chat, please feel free to drop questions, comments, whatever, you know, just nothing mean about me anyway. And I'm more sensitive than

Kinesh is, and we'll be monitoring that and try to sneak those in as we have this conversation.

I actually appreciate the opportunity, and I know we have kind of gone back and forth on lots of different topics. So I appreciate you just calling me, and this is my first actual webinar, and that too in

2:35 the collide office. So it's quite surreal. Yeah, this is amazing. This is really awesome. Like I've always seen it from the outside looking in, but this is interesting to kind of just sit here

and big props to Jacob. Yeah, thank you, Jacob. Yeah. So okay, tell me, like, there's not been a single day, has there, where I've seen you and felt like you were nervous about this

enormous task that you have in front of you, right? Of becoming the infrastructure layer for the oil and gas AI space. Yeah, so if I walk back, there are probably two good things I can talk about: just

3:09 my background, right? And then how that mapped into the decision I made to come over and work at Collide. That probably gives some context, right? Because part of this is, I suspect that a lot

3:20 of folks like me who have been in software for a long time, but are almost feeling forced and pushed by external factors like VCs as well, right, or board members or C-level execs, be like, hey,

we need to be AI first. Like, what are we doing to be AI first and all of this? So I think maybe my hope is that my story, in progress, just gives some steps as to what I have done and how I've

kind of gotten up to speed on a few things, and how I'm going about this process. Hopefully that's helpful. So I started my career: went to college, did computer science and engineering, then moved to

Austin. That's where my job was. My first company where I worked was National Instruments, which recently, a couple of years ago, I think, was bought by Emerson. And they were a virtual

instrumentation hardware and software company, and I was on the software side, very much Microsoft-based, actually. So we were building plugins into Visual Studio and building .NET layers over

C drivers for their virtual instrumentation. So I was there for about nine or 10 years, worked through as a software developer, then moved into technical leadership, then moved into group management

and all of those things, right? But very, very technically focused; National Instruments was a very technical company. So I was there for close to 10 years,

and then decided that I needed broader exposure into understanding how software is actually built, not the bits, but how you go through the decision-making. Let's launch a new product. What is

the impact of that? How do you go through the future aspects? So then I ended up going to a company called SolarWinds, which, while it says SolarWinds, has nothing to do with energy or anything. No,

no solar or wind. No solar or wind. It was network management and application management software. I joined them as a program manager. And basically that's where I really learned a lot about the full

life cycle of software development, or product development, and how software development fits within. So that was a great aperture opening for me to understand the role of product management versus support

5:27 versus UX versus UI versus user research,

5:32 which are all different things. Data science was the first exposure that I got over there because they had a data analytics team. Before that, I had no idea that, OK, when you release a feature,

what telemetry metrics do you want to measure, such that a data team can monitor the effect of those decisions, right? Those were very foreign concepts to me. So I spent about four years

there, and after that moved to a company called Hot Schedules, which was restaurant back-office software. Sorry, what forced these jumps? Yeah, great question. The first jump: I'd been at

National Instruments for a long time, like about 10 years. And I just felt that my growth was, I needed to grow differently, meaning I was just managing people at some point, right? And again, I was

not doing anything. And I think for every engineer on the call, or people who have gone from engineering, that's a hard shift when you've gone from doing something to now just managing and

6:33 orchestrating. You start feeling very incomplete. So that kind of was one of the reasons why at least while I didn't want to go back and start coding again, I wanted to get closer into being able

to do some more things and be closer to the action in some ways. But I think I was just kind of done at National Instruments. So I found the next thing that would help me figure out a

broader understanding of, as I said, the broad development lifecycle in a very different way, and not just what is the next code feature that you are developing. Right, so that was my first jump.

I've always tried to see: am I running towards something, or am I running from something? Right, or both. So I think the first case was probably both. Yeah, right, like, I was tired, I was burnt out,

I needed a change. The second change, from SolarWinds to Hot Schedules, really came because I realized that as a program manager, while I was accountable for running big programs, right, whether that was

product programs or internal programs, like Jira and agile migrations were a big thing, or SOC 2 compliance, GDPR, or being the legal liaison, like all of those things.

7:53 also

7:55 SolarWinds went from public to private. So I was part of that team between engineering and legal that kind of did all of the compliance work.

But I just started feeling that I could do more and more impactful work if I moved into leadership, more directly managing teams and platforms to go build product. So I think I had been away from

8:19 product directly for a little bit long enough to come back in and have that realization that now I need to kind of go back and really start building things again and build a team, build software all

8:31 over again.

And SaaS and AWS and all these things were really coming of age. And at

SolarWinds, we weren't really exposed to that. It was pretty much a big on-prem solution. So we were not really in the cloud, right? So I think one thing I've done well is to kind of just try to

see what the next bigger movement is. And I don't classify myself as a technical innovator, right? But I'm generally an early adopter or early majority, right, in that S-curve. So I think SaaS

kind of became the next reason. So I shifted into my next opportunity at Hot Schedules, which was basically an integrated back-office software system that they were building, porting over into

the cloud. Well, catch us up to Upside, because, maybe I'm impatient, but I love this jump from, again, having built an amazing technical team to now building the most AI-native,

best-in-class technical team, to me, in oil and gas. I know I say this on LinkedIn, and I don't just say it for the algorithm. I truly believe you've done something amazing there. And so I feel

like it's important that you went through all these phases, but I do love this last jump, especially as a lot of people are trying to identify the right CTO to hire or not. It's so crucial to

what you've been able to accomplish. Yeah, absolutely. So eventually, like, I landed up at Upside, which was a Series A company when I joined. So I joined them as the head of the Austin office

because that company was definitely on that path. It was a unicorn at the point where I joined them, and they were growing really fast, but they were based out of DC. So they needed talent outside as well,

10:12 in a more senior manner, right? That could really step them up.

Went there, started building the team from the ground up again. They were about 20, 25 engineers; at peak we were almost 200 engineers. I did not start as head of engineering. One of the founders who

hired me was the head of engineering. He was one of the co-founders there. And then over my time there, about five and a half, six years, I just continued to move up and then eventually led all of

engineering over there. Got an opportunity to build the team, recruit a lot of senior-level talent, but that's also the first time I got really exposed to machine learning and true data science.

Upside's a cashback app, but there's a lot of incrementality calculation: okay, the business that you are sending to merchants is truly incremental, so how do you calculate that? Right, there's a lot of

data science and machine learning pieces, and big data and all of this stuff, because there are a lot of transactions that you have to process. So that was actually another big pivot for me, to understand

how machine learning actually helps. I was there for a long time, and now the company's a multi-billion-dollar company, post-Series D. So I think, again, after being at a company for that long, I

wanted to do something different, because once again I was kind of at a point where I was just managing a lot of people but not really learning technology per se. And that's when I started looking, got

approached for a couple of different opportunities, and I had a couple of different offers for CTO positions at different places. Why, sorry, if I can pause right there, why were these places

seeking you in particular? I think because of Upside. Okay. Because Upside had gotten on the radar for a lot of companies. And my story and journey there had started getting highlighted, even though I wasn't

posting anything on LinkedIn, but people just followed, right? And in fact, two or three months ago, somebody from my network reached out, they're like, hey, somebody asked us about you earlier

in the year. I was like, yeah, that must have been from when people were recruiting. So somehow, like, word gets around, right? Upside had become a very successful company. So I think my

story there was very attractive. And when people saw that I've moved from virtual instrumentation to networking and security stuff, then to back-office restaurant software, now cashback

applications, these are very different fields; it's not that I stuck to one particular thing. I think just that flexibility, right? And being able to do well at each stage. Because, just quickly,

right? Upside had gone through all of these phases, became a multi-billion-dollar company, but before that, SolarWinds was a $4.5 billion exit.

Hot Schedules was another exit, not in the billions, but close enough. So there was also this history of going to places. You've got this golden touch, man. So now I'm just hoping that that

golden touch makes me super rich. Yeah. One day, right, that's the dream. Don't we all? Yes, exactly. So I think as I was looking, right, and as I was being sought out earlier in the year, I

13:09 think this was the first year that I seriously started thinking about that, okay, maybe I should explore, right? And when I start thinking about what should I go into,

a couple of the opportunities were going to be very much like Upside: okay, go in, they were already Series A or B, I knew the script, right? That doesn't mean that it's easy, but I just

know what it would take to get to the next level. But I knew that my one big handicap through my career over the last seven years had become that I was not close to technology, and I started feeling

that, right? I had shifted so much into leadership and management. Still all on the technical side, but much more people, process, and leadership related, not technical decisions, right? Like my base is

still pretty strong because of National Instruments. So I could figure out that, okay, these are the right decisions, but it's not that I was close enough to know, right? So I would always

14:05 surround myself. My biggest knack, my biggest strength is really surrounding myself with great people, right? Like I can hire really well, that's my strength. My strength is not that I know the

next tech to go adopt, right? But it is to find those people that can collectively really up-level the system. So I'm a good judge of that piece. So that's what I've done repeatedly at the different

14:28 companies. So I knew that that was the path for me. But at this point, I was looking for something where I could get more, learn more technology. So I was like, okay, if I'm going to go do that,

what's the right place to go learn that? And I said, okay, everybody's talking about AI, and quite frankly, I was just frustrated, because as head of engineering, even within Upside, there

14:52 was just so much conversation about what are we doing with AI? How are we like, and I was like, I don't know what that means, right? I don't know what it means to be AI first. I don't know what

15:01 it means. So it was generally a very frustrating period because different people were coming in with different perspectives, right? And the new shiny object and then some people getting more

aggressive with it and really forcing it down. Look, at our stage, at our age as well, and where we are in our careers, nobody likes getting forced into something, right? So you have to meet people

15:23 where they are. So I just felt that everybody was talking about it. Every news article, every single piece was all about AI. Just a lot of noise. And I was like, I don't know what this means

actually, right? My only reference point was ChatGPT, really, like gen AI. And AI per se is not new, right? A lot of people have been doing this, and AI has kind of gone through the

transition of machine learning, and when people talk about AI now, it's a lot more, in my opinion, gen AI, right? That's what's changed and made it much more accessible, especially

when ChatGPT came. People are like, oh, now, I'm not a great writer, I didn't even grow up here, right? But I can put my thoughts into some tool like ChatGPT and get a very refined message back.

16:08 But because I'm the subject matter expert, I know what's right, what's wrong, or should be, right? So I can tweak it. But something that could have given me anxiety for days, I can at least

just put it out on paper fairly quickly and just get some structure out. So that was my only exposure, right? I started doing my own reviews with ChatGPT, started doing other people's... This is

16:28 why you're at Upside, right? This is why I was at Upside, right? And then Upside got more, like they were promoting that too, but we just didn't know. So I think a lot of it was this sort of

frustration of not understanding what the hell AI means, right? And I hear this all the time: are you using AI? Are you an AI-native company? Executive teams and board members

or VCs, right, will be like, what are you doing with AI? I'm like, do you know what you're even asking for? What is AI, even? What does AI native mean? But what I did know is, and this piece is

17:03 a little important, then we can go into it. I started seeing my kids use it, right? So my kids go to a French private school and I don't know any French, right? So one day I just saw my daughter,

like, I was helping my son with some homework, and I see my daughter just chatting with someone, and I get frustrated and I'm like, can you please stop chatting with your friends and just go to

your homework. She's like, I am. I'm like, I'm asking you not to do it with your friends, just... She's like, Dad, I'm not talking to anyone. So out of true frustration, I stand up, go close

to her, and she has ChatGPT open, right? I didn't tell her about ChatGPT, right? But this is like, the kids have caught up, and she was basically not even typing. She was using the voice piece,

in French, saying what her homework was and getting some help. So there was a moment of pride, but also I didn't know how to react in that moment, 'cause I was like, okay, you know that you can't just

17:58 copy paste this into your assignment. She's like, yes, I know. I'm just trying to get some pointers so that then I can do more research on Google or something else. Wow. So when I saw that -

Your child's only how old? Yes. At that point, she was probably nine, right, now she's 10. But I think that was a big turning point for me, that, okay, this generation is growing up with

this tech, the same way in the late 1990s, early 2000s, I was growing up in Pakistan in the internet phase. And my dad was very much into technology and stuff. He was also learning, or he was

just, he was an early adopter of everything. But I was also learning. So both of us were going through our own learning. So that's what I feel is happening with this AI turn right now, is where

the kids and us as parents are going through that journey together. And I think that was a big turning point. So I think that moment really etched in my mind that, okay, whatever I'm trying to do

18:50 next, I may not agree with what might be going on with AI or how people talk about it at that point in time. But I was like, my background is software. I've built software teams. I have that

19:02 success. I have that understanding. Why don't I try to entrench myself where I completely immerse myself into the AI field so that at least two years from now, three years from now, at least I'll

19:14 be better prepared and understand what's real versus what's hype, rather than just trying to get frustrated with it from a distance. The thing that's also important is that then the next question is,

why could I not do this at Upside, right? And this is, I think, relevant to a lot of people who are at big companies, because to learn something you need to create space. And when people say just

do it on the side, almost everybody works, not 40 hours, but a lot more than that, right? Because your work-life balance is, like, it's very much intertwined. Okay, I'm

already working and I'm already stressed. My boss is asking for 10 different things. When exactly am I really going to learn? And I think that was the big switch: for me to learn, I needed to

put myself in a different situation. And that's why, of the different opportunities, Collide was the most AI-native. And we can talk about what that means. Yeah, yeah, yeah, because, sorry, because I

know, I mean, we used a recruiter, Collide used a recruiter, to find you.

20:15 And I know Colin, the CEO spent a lot of time with some of the top candidates and a lot of time with you specifically, right? And so you coming from this history of like a lot of data science, a

20:27 lot of ML, but realizing, hey, they're AI is another wave. Being naturally curious, right, because of your upbringing, but also as a parent and what your kids are doing. I remember Colin

20:39 coming back and saying like, This is the guy, I know it, this was like weeks and weeks before we like had finalized everything, you know? I'm actually, maybe I'm

selfishly curious, but I guess people are trusting me to ask the questions here. What about those engagements between Colin as CEO and you as a potential CTO candidate for an AI-native business,

21:02 particularly in an industry that you had no background in, made it so mutually

beneficial? Both of y'all left those conversations feeling like this is the right choice moving forward. No, I just saw that it's oil and gas and that's big money. I was like, let's just go to oil

21:18 and gas. Billion dollar valuations. Exactly. It's not even a dummy.

21:23 I think the thing that was very exciting to me was that here's one that Colin, right our CEO, our founder, he's not a

classically trained software engineer, right? He doesn't come from that background, but obviously he has worked in the field. So he has a very engineering mind in how he has gone about it. He had

built something already successful in Digital Wildcatters from the ground up, right? Then decided that he wants to pivot completely into the AI space, and was teaching me about AI things like RAG,

22:00 right? Like retrieval-augmented generation and all these things that I had no idea about. And here's somebody who, technically or background-wise, is not a software engineer or technical person,

right? So I think his depth of knowledge, his clarity of knowledge, and the one biggest piece, which is not really AI or oil and gas specific, was that he had distribution figured out. What that

means is that I've worked at a lot of big companies that built great software but had nobody to use it. Here, Colin, Chuck, and the other folks at Collide already had this: they know everybody in

the oil and gas space, and that makes a huge difference. It means that they can at least get a call, you can at least walk in, you can do a handshake, you can tell them what you're working on, find

22:46 enough validation. So they were actually finding product market fit way before my conversation with them even started. And somehow put together some basic system with,

we can talk about what that system and architecture looked like, but some basic system. Well, that's not your first 90 days. Yeah, once you're on board. Yeah, but they were able to get into

organizations, prove some value of search, but search meaning not just ChatGPT search, but search within people's, like, their own data, right? In a secure manner.
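The "search within their own data" idea mentioned here is the core of retrieval-augmented generation (RAG). As a rough sketch only (the documents, scoring, and prompt shape below are illustrative assumptions, not Collide's actual pipeline): a retrieval step ranks internal documents against the question, and the prompt grounds the model in what was retrieved.

```python
# Minimal RAG sketch: retrieve relevant internal documents, then build a
# prompt that grounds the LLM in them. Word-count cosine similarity stands
# in for a real embedding model; the sample docs are hypothetical.
from collections import Counter
import math

def tokenize(text):
    return [w.strip(".,?").lower() for w in text.split()]

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question, documents, k=2):
    """Rank internal documents by similarity to the question."""
    q = Counter(tokenize(question))
    ranked = sorted(documents, key=lambda d: cosine(q, Counter(tokenize(d))), reverse=True)
    return ranked[:k]

def build_prompt(question, documents):
    """Ground the model in retrieved context instead of its training data."""
    context = "\n".join(f"- {d}" for d in retrieve(question, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

docs = [
    "Well 7 was shut in last March due to casing pressure.",
    "The Permian field office relocated to Midland in 2021.",
    "Casing pressure limits are reviewed quarterly by the ops team.",
]
prompt = build_prompt("Why was Well 7 shut in?", docs)
print(prompt)
```

A production system would use embeddings, a vector store, and access controls rather than word-count similarity, but the retrieve-then-ground loop is the same shape.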

23:21 So I was just very impressed by that, right? I was impressed by that this is truly somebody who wants to change the industry. It has a history of doing that. And I was like, if nothing else, I'm

gonna be pushed really hard to learn a lot. So I think the common theme for me was really just learning, because we have all become millionaires, billionaires on spreadsheets many times, right,

and success metrics and all of that stuff. Up and to the right, baby. Exactly, right? So it's like, okay, you just have to continue to go, just learn, right? And I think that was kind of a big

23:53 hook for me, that this was somebody that understood it. And it's like, whatever he was telling me, I was also then learning and validating it, right? So it was not just like vaporware or

24:02 something, but he was really deeply involved in the industry.

24:08 So very clear on what you liked about this opportunity. What do you think Colin saw in you as a CTO candidate? Yeah. And I know at that point, Colin was also interviewing some people, for

instance, like from some other AI companies, well-known companies, their AI heads, right? And I think what stood out to him about me was really just

I think just my background, that I've had a history of building teams and successful outcomes, building stuff from scratch, or going into organizations that needed to change or revitalize.

24:48 I've gone through that change management not once, not twice, at least four times, right?

24:53 I know you're a humble guy, so I appreciate you going in. 'Cause I do think what a lot of companies are facing is this challenge of, hey, there isn't, there hasn't been this long history of AI in

anything. Yeah, there are people that have been working on this for a decade, but they're usually at the OpenAIs, the xAIs, the Geminis of the world, and cost a lot, or don't have a, you know,

don't necessarily have any desire to leave and do something more verticalized. What

25:21 you were able to do, I guess, in being able to jump in and assess, I guess, where the company was at, and then also build up from a tech stack standpoint, from a people standpoint. And then

again, from just a natural curiosity of where AI is going. Maybe that reflects your first 90, 100 days here, but talk through how you kind of landed and started. Yeah, it was hard,

right? So this is one thing I want to be very clear about, right? It was not easy. And I appreciate something that you said at the very beginning, that, okay, you've never seen a sense of

frenzy, right? Or complete discomfort. But I've also just put in a lot of time and dealt with my own anxiety about a new field, a new technology, going back and becoming more familiar with technology

as well. And there's no substitute for the amount of hours you will have to put in, right? So I think that's a big differentiator as well, like that grit piece. I think that kept me afloat here,

26:24 basically as well, because I knew nothing about oil and gas, but everybody around me knew a lot about oil and gas. So it was easy for me to extract information or trust that what they are telling

me is correct, right? I think it's also just in my personality that I'm not a very critical, sort of pessimistic person. Like, hey, I have to know all the details. No, pessimist is not the right

26:48 word, but

26:50 if I'm in an environment, I'm going to trust the people around me till they give me reasons not to. So I think everybody in this field, when I read about Todd Bush, when I read about Colin, when

I read about Chuck, when I read about, like, they are industry experts. I say, okay, they've got it covered. I don't need to know everything about oil and gas, because software is software is

software. That's the one thing that I think people may disagree with me on. In my career, I've just gone through different companies and industries, and I'm confident

27:19 that I can go to another company and have similar success there, because I don't always try to just learn everything about that industry, because that may or may not help me because at the end of

27:28 the day, okay, you need to know some of the basics. You need to trust the people that are around you. You have to assess who's around you, right?

So then it's about data models; it's about, okay, what are the workflows you're trying to automate, how does search work, right? But those are not oil and gas specific problems, right? Everybody

faces that. So I had to remind myself that I've built these things in the past, so I have to extract that knowledge, right? Go back. And the biggest tie-in there for me was really that the AI

28:00 was interesting, and in oil and gas, enterprise is a big thing, right? So SaaS is not always a thing. Everybody wants data to be protected. We are an Azure shop, because everybody is on Microsoft,

right? Oil and gas. So we had an Azure solution to build, and I knew nothing about Azure. All of my last two or three jobs were all AWS. But Azure's come a long way as well, right? And the

ecosystem of development and all of that stuff. So I actually had to pull my examples back from how I was building software in the 2000s at National Instruments, because every deployment there was a

custom deployment, sort of, but on-premises for users. So what did I learn there? What did I learn in data science, and how does that apply here? So I think that was a

28:47 lot of what was happening in the first 90 days. But the other big thing that I started assessing was really what's the quality of the team, right? Because that's something I've built many times.

And from everything that I'm learning about software, that I know about software and am learning about AI, when I looked at the architecture, was it passing my sniff test? And then I actually went out

into my own network and asked a couple of people to help me validate certain things. So that was a big part of my first 90 days. Okay, what does your sniff test mean? Give me more details. So my

sniff test was that

and again, this is, everybody's going to have their own opinions on this stuff, right? But when I looked at the architecture, it was basically some resources, some part of the software in

29:36 Azure, some part of the software in Heroku, and then our data ingest was running off of a developer's laptop somewhere, right? And so, okay, how does this thing actually scale? So when I asked

29:46 the question, okay, we have to deploy this to 20 customers, how would we do it? Like, I couldn't get a straight answer, right? And just like every answer, everything that the developers would

29:57 tell me if I asked them two or three further questions, it was like, oh, AI just does it. I'm like, what does that even mean, right? Like, I don't know what that means, right? It sounds like

30:06 bullshit. Yeah, and I'm like, oh, AI just solves it. And I'm like, how? Yeah. But they couldn't answer the how, which told me that they didn't understand the basics of software development

30:17 necessarily. And my problem at that point really became that, okay, if you don't understand the basics of software development, then how do you know that whatever code it's been producing or when

you pass a call to an LLM that's invoking some agent, right? Everybody talks about these things. Are they invoking the right one? How are you validating that?
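The validation question being raised here (how do you know an LLM call invoked the right agent, and that the answer is grounded, at scale?) is usually answered with automated checks over a log of invocations rather than reading chat responses by hand. A minimal sketch, where the log fields, agent names, and check names are hypothetical, not any particular product's schema:

```python
# Sketch of automated validation over logged LLM/agent invocations, instead
# of eyeballing chat responses one at a time. All field and check names are
# illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Invocation:
    question: str
    agent: str          # which agent/tool the router picked
    response: str
    sources: list = field(default_factory=list)  # doc ids the answer cited

def check(inv: Invocation) -> list:
    """Return the names of failed checks for one invocation."""
    failures = []
    if not inv.response.strip():
        failures.append("empty_response")
    if not inv.sources:
        failures.append("no_citations")          # answer isn't grounded
    if "well" in inv.question.lower() and inv.agent != "well_data":
        failures.append("wrong_agent")           # router picked the wrong tool
    return failures

def validate(log: list) -> dict:
    """Aggregate failure counts across the whole log, at any scale."""
    counts = {}
    for inv in log:
        for name in check(inv):
            counts[name] = counts.get(name, 0) + 1
    return counts

log = [
    Invocation("Why was Well 7 shut in?", "well_data", "Casing pressure.", ["doc-17"]),
    Invocation("Why was Well 9 shut in?", "general_chat", "Not sure.", []),
]
report = validate(log)
print(report)  # prints {'no_citations': 1, 'wrong_agent': 1}
```

The point is that "AI just does it" becomes testable: each routing and grounding claim turns into a check you can run over thousands of logged responses.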

Oh, we just look at the chat response. And I'm like, okay, how does this work at scale? How many chat responses are you going to be able to see yourself, visually? And then a common answer was

also that, oh, I didn't build this, so I don't know. It's like, okay. So from those, you start understanding what good engineering is: a good engineer is very curious. They like

owning stuff, maybe to a fault sometimes. But there was not that curiosity of depth. That was a signal to me. And then I was like, okay, part of this is your ecosystem is completely

31:11 fragmented. We didn't have dev, staging, prod, right? But that was also a natural part of the journey, of where the company was. So that was not my problem. My problem was that they wouldn't want to have

31:23 conversations that would elevate it, right? Like I literally had a developer tell me, oh, if you're going to tell us that we need more services here, I'm not going to agree with that because this

31:31 is a monolith. I'm like, how exactly is this a monolith, right? Monolith maybe, but you have a data ingest, you have a processing layer, you have an LLM layer, you have an agent layer, you have a

31:43 front end. So everything, by the way, was also pretty much built in Ruby. Right, right, right. So that was another - There was a tech stack issue. There was a tech stack issue. So I

31:52 could just not have any conversation with them that, okay, Python and Rust are more AI-native; like, Python is very much supported by the industry as the piece for data and AI, so all the support

32:04 framework system is around that, right? So I was like, okay, can we at least move that processing part? They're like, nope. And they're like, oh, Ruby's coming up with more packages. So it

32:13 was just like very hard to move through. But I - Well, I think part of this, sorry, by the way, is like, is your intuition based on having built teams, having had that dynamic, having,

32:23 and it's something you told me beforehand. Actually, let me pause real quick, just because I'm happy to continue asking questions this whole time. But for people on the webinar, don't

32:34 forget to submit them through the Q&A. I'll see them and I'll ask them, if you continue to trust me with the best questions, and happy to continue that as well. So, one of the things that you

32:44 mentioned before we started was that you made it purposeful, maybe likely because of what you just described, right, this execution-guys-versus-strategic-and-execution-guys distinction, to focus on more

32:56 senior-level talent, right? I remember when we first started looking at what an organizational design for a technical team might look like, it was like, okay, great, we'll be able to do, you

33:07 know, so much with AI tools that we can just hire cheap junior people. Tell me about the fallacy of that, and what's been built at Collide now, today, to kind of overcome that. Yeah, uh,

33:19 great question and

33:22 in my experience, or just my opinion, there's no substitute for experience, good or bad, right. The

33:30 reason I could sniff-test and figure out that the team wasn't a talent-dense team, right? But we had who we had at that point in the journey, right? It was all about that: would they evolve,

33:42 right?

33:44 But I'd seen so many examples of good architecture through my career that when I looked at this, it just didn't feel right, right? And I know it's like a feels thing, which you can't exactly

33:55 research, but now you can run your whole architecture through Claude, right, or ChatGPT, and get an opinion at least, right? But you have to be resourceful, right? And you have to kind of go

34:06 look, right, outside, and rely on your advisors. But I think for me, almost to a fault, every single time that I've hired people, I like starting with seniors. And the reason for that

34:19 is because, if you can get the right ones, they bring the right practices in place: what are good coding standards, what are good architecture standards? How do you

34:28 release a product quickly that you know you'll have to rework? What sacrifices are okay to make, which corners should you not cut at this point, right? So you start getting a

34:40 sense of those things. And I knew that I needed to uplevel the technical talent of the team from that standpoint, because AI became secondary to me, honestly, right, in my first three months,

34:53 because everybody around me was just talking about AI, but nobody could explain to me what was going on and when an answer wasn't the right answer. Like imagine how difficult it is that you're

35:02 sitting in front of a customer demo.

35:05 And I present one thing to you and I present the same thing to another customer and my responses are completely different. It's like, okay, how do I explain that? And the answer was, AI just does

35:15 it this way. It's like, yeah, great. Hallucinations. Hallucinations, right? It's like, okay, like how do you build a company on that? So I went back to the basics. I hired people first who

35:26 were just classically trained software engineers and architects that I had worked with for a while. Some of them became available, so I was able to get them here, because I knew that, okay, AI will

35:36 have a portion here. I don't know what it is. I know what good architecture looks like. So let me start there with my people. And at that point, once I had that base covered, then I started

35:46 reading a lot about, okay, what's the difference between applied AI and core AI. Like when we talk about the structure of a team, an AI team, it's like, okay, I needed people who understood how to use the

35:56 models to get the outcomes that we needed. How do you use tools like LangChain, right, and others to chain through structured steps and, in a non-deterministic manner, produce outcomes that

36:11 are consistent but can be derived sooner. People who understood MCP, people who understood all these other things, that's classically what I think of as applied AI engineers. So then my next step

36:23 was, okay, now let me hire applied AI engineers who were previously true software engineers and made the leap in the last one or two years to AI tools, for their own code gen, for their own kind of

36:36 decision making in the code, so that they are not doing everything deterministically but get the same outputs. Once that happened, then we went out and also hired a core AI/ML engineer that had a

36:47 background in machine learning, moved into AI to start building models. Because in oil and gas, we are going to need very specific oil and gas models too, because ChatGPT and all these other

36:57 models will only go so far. So I did it in that strategic manner, but I built the base of software engineers first. Got it, okay. I wanna maybe change the conversation over to the FDE side, but

37:10 before we get to that, we do have our first question here. I don't know who to attribute this to, but thank you for submitting, and thank you, Crystal, for sending. How do you see oil and gas

37:17 professionals fitting into the AI picture? Our foundational knowledge has been around forever, but the science and technology we use today is far more advanced than most people realize. Are there

37:25 plans to let industry experts help train these models? Yes. Great question. That is a great question. I love that question. Because I'm gonna do a plug for our collide community, which is

37:33 actually a separate part, right? So it's interesting: Collide Enterprise is the AI product to do search, workflows, and all of this stuff. But before Collide Enterprise, the AI part, there was the community.

37:43 The Collide community has almost eight, nine, ten thousand community members at this point. It'll be nine million, ten million. Yes, ten million, exactly. One day it will be. But I think

37:54 of that as the equivalent of Stack Overflow, though Stack Overflow is very software-engineering-minded, right? This Collide community is very much an

38:05 oil and gas community, of experts that come in and share, that can help each other. You can post a question, you can get some advice there. Lots of conversations. So what we are

38:16 doing right now is we are actually starting to train our own models. And part of that is that we'll be opening it up, and hopefully I can say this here, hoping that it's

38:26 very soon, to our Collide community, to provide feedback, right? To do the assessment that, okay, here are 100 questions. Reinforcement learning. Reinforcement learning, yes, right? So

38:36 the person that we just hired, who was actually part of the Microsoft team a few years ago that helped train a lot of the reinforcement learning OpenAI models, that person's now with us, and we

38:48 are just starting to kind of actually build our own machines and other things. But we are fully integrating our Collide community within this aspect, where they'll be able to come in and help us train

38:58 and validate the answers, so that we know what's good, what's bad, and what correct looks like. So that's how I think it gets created. Yeah, yeah, I'll never forget one of my first, so I

39:10 started my career at Tenaris, a casing and tubing company. And one of the first situations, I was on a rig site, you get called 2, 3 am, whatever it was, hey, you gotta go out there,

39:21 something's going on with the pipe, whatever, blah, blah. And of course, as a young technical sales engineer, just starting out in an industry I had no background in.

39:29 My family's all medical, right? I have my little like standard operating procedure, what I'm supposed to say, what I'm supposed to do, blah, blah, blah. And so I'm trying to assess the problem.

39:40 I think I have the answer to the problem. Therefore, I have like a logical potential solution. It doesn't work. I'm freaking out. And this guy with like 50, 60 years of experience comes out

39:52 there, puts his ear against the pipe, knocks on it a couple of times, says pump 40 barrels of water, we'll be good, we'll get unstuck. And it

39:59 was awesome. That's exactly how it worked. And it was such a gut punch for me, obviously, because I was supposed to be the engineer on the rig site. But I think that level of tribal knowledge

40:09 that is so inherent to not just the oil and gas industry, but any legacy industry, I think is so crucial to be able to tap into as we're building out AI native, however we want to define that.

40:23 Again, we go back to, it feels like an insult, but it really isn't. We go back to the fact that you didn't come from this industry, but you are building the frameworks to be able to allow for

40:32 industry experts like this person is asking about, to provide their feedback so that you can incorporate that context into the models, is that? Absolutely, and I think that's the other big thing

40:41 here too, right? We are still in the early phases of a lot of this stuff, right? There's change management: how do we take this technology to people in a way that they feel comfortable with, meet them where

40:52 they are, right? That's a big mindset that we must carry. So it doesn't feel forced either. But people do have a lot of this knowledge. And for instance, like sometimes people just have this

41:04 intuition, right? Due to experience or whatever. But imagine a world where I'm just talking into my phone to capture notes, right? And that can distill all of that feedback into a

41:18 model. Now that's safe, right? You have all this tribal knowledge, but they're not going to sit in front of a computer to type it, right? You can't just video record everyone either, and maybe

41:27 you can, right, or whatever, but people are more used to the phone and that interaction, so they'll be okay with just talking about that: okay, this is what happened, this is what I was

41:35 thinking through, this is why I thought all these things. And then you can really start the reinforcement learning part. I think that's a great way that people will get to become part of this

41:45 evolution. And actually, so this ties into how the organizational design is now changing, right? We're implementing forward deployed engineers, FDEs, much like the Palantir model. I thought,

41:56 I guess in the oil and gas space, I always call them technical sales.

42:03 So

42:04 I assume it's a similar model, but you're having actual individuals within clients that are prepared in some technical fashion to be able to produce or prototype something based on those

42:12 one-on-one interactions. Can you talk to me about how you guys came to the decision to deploy a structure like this, and some of the challenges that have come about from

42:22 that. Yeah, quite loaded. Yeah, sure. Yeah. So,

42:29 I think let's look at how or the best practices. The best practices. Yes. Like how we came about this because I think all of us are just very curious and learn a lot. Colin does a lot of just

42:39 research and learning, and draws from a lot of great inspiration. So I think he's kind of the one that came up with that: hey, why don't we try to follow what Palantir did, which was that

42:49 they had a platform eventually that they built, but they had this concept of a forward deployed team, Echo and Delta. I think, right, Echo was industry experts on that subject that would embed

42:59 within the companies. So think of them as a geologist, right, in our case. And when you're creating a workflow for a geologist, right, this person knows that knowledge. Yeah. And then you

43:13 have a forward deployed, Delta engineer who knows how to prototype things really quickly, right? Because you're just trying to prove out whether this workflow works for you or not. And if

43:25 it does, then you can see: is it just inherently supported in the platform, is it just the same functionality, right? Or is this something different? And if it's different and you do decide as

43:34 a company to make it more production ready because you can scale it to multiple other clients, then you bring it back into the platform to productionize it. So that's the model we are following

43:44 because what we have seen is that a lot of the requirements across a lot of our clients are very similar, right? Regulatory is a great example. We created that for one, but now we have a lot of

43:54 inbound coming. Revenue statements is another one, right? So for any use case that we want to look at, it needs to be expanded, but how do you filter out the noise, and how do you know how to

44:04 go fast? Because if you really try to build every solution production-grade, it will take a very long time. So you also need that quick learning, in a very real manner, because the one thing

44:13 that I learned about the oil and gas people, which is very different: one, it's a very handshake, in-person business, and they like to see things in action, right? So it's not good enough to

44:23 just talk about it; they want something tangible, right? And that's our product. If they can see it, they can interact with it in their own workflows, right? But building every workflow into the

44:34 platform is hard. So that's how we thought of the forward deployed team: why don't we have people that just sit with the client, truly understand what they're trying to build? Can

44:44 we prototype that fairly quickly? And if that thing sticks, then we can figure out how to productionize it. And once you have done it 10, 15, 20 times, patterns emerge, practices emerge,

44:55 commonalities emerge, that you can then start building into the platform, such that the next deployment of the same thing doesn't take you two months, but could take you two weeks. Right, but how,

45:04 so, 'cause like, I think you're right, I think you're gonna get a lot of customer feedback, especially in the early days. I remember, you know, Collide started with this augmented search play

45:14 very quickly became table stakes, right? It was like, no, no, we need application layer, we need you to be executing on certain workflows to be something that makes sense for us. 10x ROI, you

45:24 know, it's just three, but let's call it 10x. And so, what, you know, having an FDE out on site, being able to capture a lot of that data and bring it back to you guys, how do you guys work

45:38 through the prioritization of what happens versus what you guys are already starting to work on or building in your sandbox environment, and allowing the FDEs

45:50 to kind of present it, or at least talk about it. Yeah, that's a tough one. In the last two, three weeks, I think we have finally come to a point. We did an engineering onsite, actually our first one,

46:01 last week, where we went very detailed into all of this stuff, because building the tech is one thing; prioritization is a different thing altogether. And

46:12 I have to be honest, it was very chaotic, right? Okay, are we building all of this? What are we building? And then we have had a lot of leadership meetings around this. So now we

46:21 have a process in which a POC goes through seven or eight different stages, right, of vetting. The first stage is that when a client comes and talks to their FDE about, hey, I have

46:32 this next use case, they have to come back, and there's a leadership vetting: Colin, our CEO Todd, and I will sit down and assess, does this make sense to even POC? Is it too far out? And if

46:46 it is too far out, is it a strategic customer? Is it a customer that's like, hey, I'm going to pay you $3 million, just go build this thing? As a startup, you have to always balance those, right?
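The staged vetting being described can be pictured as a gate pipeline with a bail-out at every stage. A toy sketch only — the stage names and checks are hypothetical illustrations, not Collide's actual criteria:

```python
# Toy sketch of a staged POC-vetting funnel with a bail-out at each
# gate. Stage names, thresholds, and checks are hypothetical.

GATES = [
    # A POC must be strategic or valuable enough to even start.
    ("leadership_vetting", lambda poc: poc["strategic"] or poc["est_value"] > 100_000),
    # A quick prototype has to actually work for the client.
    ("quick_prototype",    lambda poc: poc["prototype_worked"]),
    # Scope too big to productionize means it's not worth pursuing.
    ("scope_check",        lambda poc: poc["est_weeks"] <= 8),
]

def vet_poc(poc: dict) -> tuple[bool, str]:
    """Run the POC through each gate; bail out at the first failure."""
    for name, check in GATES:
        if not check(poc):
            return False, name        # bailed out at this stage
    return True, "productionize"      # survived the funnel

ok, stage = vet_poc({
    "strategic": False, "est_value": 50_000,
    "prototype_worked": True, "est_weeks": 4,
})
```

The point of the early gates is exactly what's described above: a POC that fails the first vetting never consumes prototyping or core-engineering time.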

46:55 But there's at least a conversation now of that earlier vetting, right? And then you kind of go through, okay, let me put a quick prototype together for you, right? Let's make sure that the

47:04 requirements are right. So at every stage of the seven or eight stages, our hope is that the core engineering team is not really getting tapped. But what we have also done is we have

47:15 embedded two of our core principal engineers with the FDEs to be vetting these things and building quickly, so that we can also understand the patterns that emerge. But I think we have

47:26 a bailout at each stage of the POC that, okay, this is not worth it or it's not working or the scope's way bigger for this to be a valuable thing. So let's not continue to pursue this. So I think

47:38 we are going through a lot of that stuff. So that's the front end of the problem, right, the narrowing of the funnel. We are still working through that: okay, once we have decided that these three

47:46 things now have to be productionized and they have value, how do you then schedule them in the cycle? And part of what we are doing, that's why, is embedding our two top engineers onto the

47:57 front end, so that they can be thinking about what we know is in the platform and what customers are asking, so that they can be that bridge for us, right? So that's where we are today, and

48:07 this will evolve. Yeah, of course, because I imagine over time, once you get prototypes up and running, you can see how much engagement they have, how much the customers are actually going to

48:15 use them. Yeah, and that's why we didn't want to leave the FDE, per se, just on an island: hey, just go build whatever. What we are doing is we are using a lot of internal tooling to

48:25 create, for lack of a better word, packages that they should be using in the software: okay, if you're building use case A, here are the 10 things that are already ready; use these first, right? So we're

48:37 going through that process. And to be clear, Nick and Michael, our FDEs, are amazing. But I completely understand. All right, I got another question from the audience here. So have there been any

48:46 unique challenges you faced as you're building tech systems and teams for oil and gas specifically? What is some advice you have for people looking to lead technical teams in this type of environment?

48:59 I do think there's a lot of context that is hard to understand a lot of times, right? That people talk about

49:07 drilling reports, people talk about schematics,

49:11 sand. And like for somebody who didn't come from that field at all, it's really hard. And what you have to do is, as I said, right, you're not going to be able to do this alone, right? That's

49:22 the one point that I want to emphasize. It has to be a team effort. So my hope is that you have somebody on the other side who can help distill the industry knowledge into some level of what you can

49:35 translate between the software knowledge of, okay, what is the workflow? Is this a report? Is this a chat interface? Like, what are the inputs of the system, right? Like, what are the 10

49:47 variables you're going to put in? And what output do you expect? I think you need that mixture. Otherwise, you're not going to be able to really build because if you don't understand, like, the

49:57 outcomes and the inputs, and if you are not from the field, you need somebody that can help do that. So I think my advice would be that if you are a technical person and you have been tasked to

50:06 build for an operational industry and nobody on your team knows it at all, don't waste your time, right? First go and seek out who you could partner with within the company, or hire somebody that's going to sit

50:19 down with you and help you really understand those use cases. 'Cause eventually you'll come to a point where you'll be able to just map that as a software development cycle, but there's that early

50:27 discovery phase. Is it almost like a consultant might come in, like an AI consultant that is assessing all these things that we need to transition, right? I don't know what the consultants'

50:36 famous thing is: I give you a plan, and then they leave before it's actually deployed. Yeah, so I think that's why you saw that look on the side. Yeah, yeah, there's a grimace

50:42 there that - Right, and as somebody who has always been on the inside building, when consultants show up at a site, you're not gonna be - Sorry, I used the worst of that word. No, no, I had said,

50:50 I recognize that, but I think you need people who have a vested interest, right? Yeah. Who want to see this be successful, right? And yeah, sometimes they are consultants, right? But probably

51:00 somebody from your network that you trust that can help, and it's not just that they are putting in the hours, charging, like, okay, I gave you my report. No, no, right. The team is actually

51:09 going to go on and, yes, you need that, right? You need somebody in the, so classically, they were called business analysts, right? Way back when. So you need somebody who can sit, who

51:18 understands the domain, can translate this into software speak, because it's unreasonable, in my opinion, for every single software engineer or every single person to learn everything about the

51:28 field, you'll never scale, right? Because it's the same thing if I'm like, hey, now I'm building a finance software, right? Or Alex, if you're in finance and you keep talking about finance

51:38 terms and I have no idea. Is it fair for me to get up-leveled to the degree that I can have that conversation, or can we meet somewhere in the middle? Because if I keep talking about

51:47 architecture all day long and, look, this is the next decorator in Python, you'd be like, dude, what are you talking about, right? So I think inputs and outputs is a good way. And then you

51:56 have to figure out who can fill in those gaps so that the requirements can become something that you can actually take to your team, such that not every person on your team has to become an

52:06 oil-and-gas expert. But collectively, the longer they stay in the system, the more they'll understand, right? But don't have that expectation that on day one, they should learn all of that.
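That inputs-and-outputs framing lends itself to a lightweight spec a domain expert and an engineer can agree on before any code is written. A hedged sketch, with made-up field names rather than a real Collide workflow:

```python
# Sketch of an inputs/outputs contract for a domain workflow, so a
# domain expert and a software engineer can agree on the interface
# without either learning the other's whole field. All names here
# are hypothetical examples.
from dataclasses import dataclass, field

@dataclass
class WorkflowSpec:
    name: str
    inputs: dict[str, str]   # variable -> plain-English meaning/units
    outputs: dict[str, str]  # result   -> plain-English meaning/units
    open_questions: list[str] = field(default_factory=list)

    def is_ready(self) -> bool:
        """Ready to hand to engineering once no questions remain."""
        return bool(self.inputs) and bool(self.outputs) and not self.open_questions

spec = WorkflowSpec(
    name="daily_drilling_report_summary",
    inputs={"report_pdf": "daily drilling report, PDF",
            "depth_unit": "ft or m"},
    outputs={"summary": "one-paragraph summary",
             "npt_hours": "non-productive time, hours"},
    open_questions=["Which rigs are in scope?"],
)
```

The `open_questions` list is the discovery phase made explicit: the spec is ready to hand off only once the domain expert has emptied it.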

52:15 Right, okay. I think that was a hard part for me too, because I was trying to do a lot of that stuff, and the people I was hiring. But then eventually you just realize that it's still workflows,

52:23 it's still reports, it's still search, it's still all the same things that we have built, it's still asynchronous messaging, it's still event-driven architecture, it's still all of

52:32 these things, but we have the advantage that we have the best oil and gas people here working at Collide, so we could form that partnership. So I think that's a great model to follow. Got it,

52:42 okay. I know we don't have a ton of time, and I have what is, to me, the most important question I wanted to ask. But again, a reminder: if anybody does want to ask other questions

52:53 that are likely smarter than what I've been spitting out so far, please go ahead. So if you were to hire the next CTO of Collide, the AI native company, what would you be looking for? Are you

53:07 telling me that I'm about to lose my job right after this? Yeah, this is your exit interview. Yeah, so let's do this publicly.

53:15 You'll join Chuck Yates, Chuck Yates needs a job? Yes, probably. Yes, absolutely.

53:23 I think I would still look for

53:26 somebody, if you're looking at a CTO level, right? Yeah. It depends on where your company is. So if you're very early stage, people get called CTO and everybody gets called something, right? But

53:38 let's assume that you've found market traction, you're kind of where Collide is today, right? You need somebody who comes, in my opinion, again, from the classically software

53:48 background, with a history of

53:52 learning new things, because that's what AI is too, right? It's just the next tech, but I think it's going to affect more people than some of the previous techs, just like the internet did, right?

54:03 Think about this:

54:07 you need somebody that has shown curiosity through their career, has shown success through their career. Now, it doesn't have to be five different exits or whatever, right? But just

54:18 that they understand how to build a team, they understand how to manage up, they understand how to lead a team, they can take the founders' vision into execution phase. And they are open-minded

54:30 about new technology. That's a big piece too, because I'm often in conversations where having skepticism of new technology is good, but you should still be open enough to try it, because what's the

54:43 harm? There is

54:45 a harm in not doing it, though, because if you don't do it, well, this is where the industry is heading, in my opinion. Where does that leave you in five years? So I think the next CTO, they either

54:56 need to have exposure to enough LLMs, agents, some of these things, like the MCP sort of situation,

55:06 that they understand where this could go, right? So that's probably what I would look for in the next CTO. But yeah, I definitely do believe that

55:18 every CTO coming forward has to be more technically inclined than not. Got it. Because I think the field's moving in such a way that they need to understand how to scale teams, but need to understand how

55:28 to scale teams not just by people, right, but by using technology to also help scale. So it's kind of that balance, because in the past there could be CTOs who were just extreme coders,

55:40 right, who knew nothing about people; then there were CTOs who just knew everything about people but nothing about the architecture, right? The new-age CTO needs to have, I think, a mix of both, with

55:50 their own specialty, right? Like, I know my specialty will never be code; I'm not a technical innovator, as I said, right? But I know enough about technology, and I know I'm really good at

56:01 creating great teams and understanding how to take vision to execution, right? And I'm curious, right? I'll surround myself with a lot of really smart people, but that's an art as well,

56:12 right? So I know what my strength is, and that helps, right? But you need that well-rounded piece. All right, so I've been asking questions of you this whole time. Since we've got three, four,

56:25 five more minutes here, what is a question you might have for the audience around building an AI native company and an AI native technical team?

56:36 Presumably a lot of the audience here are other CTOs, other founders that are kind of in the same position, right, and starting with the same challenges. So, yeah, what

56:46 would you like to know from the audience? I think I would love to know what their experience with this term, AI native, is. Yeah, what does it mean? Yeah, what does it mean,

56:55 right? What does it mean to you? What is forcing you to explore AI, first of all, right? Is it just internal curiosity? Is it pressure? It's way harder to adopt AI, or any new tech, when

57:09 you are feeling that you don't have space, because you need that space. I think that would be my biggest question: what does AI native mean to you within your company? And if you're not

57:21 experimenting right now, what prevents you from experimenting with some of the new tech, like LLMs and agentic workflows and all of these things, right? Or my biggest question would also be that

57:33 when you think about AI, how do you actually think about it: is it running your business, or is it helping create stuff that you validate, which runs the business? And an example there is that

57:46 you could use AI to generate a lot of the code, right? There's Cursor, there's Warp, there's Windsurf, there's Claude Code, there's a lot of these tools. What are you guys using? We use a bunch

57:56 of different things; just as of recently, I've standardized everybody to use Claude Code, because it's important to pick a toolchain so that we can train everybody on it. That's for the code gen, but

58:08 for IDEs, I've told people that they're free to use what they want, but the only supported code-gen toolset for us is Claude Code, because we know that we can train it in a certain way.

58:18 So I think that would be my biggest take, right? Like, what do you think about AI native, and what prevents you from starting the journey? And my one learning was that I just had to start

58:30 the journey, right? I don't know everything about this, right? And I never will, right? The pace of innovation is so fast that it creates a lot of noise, right? And that's why I always

58:39 just go back to, okay, what am I trying to solve for the industry, right? Am I just learning technology for technology sake? Then I should go work for a research firm, not a product firm. But

58:48 if I'm working for a product firm, then my job as a technologist is to create the right outcomes for the company in the most scalable manner, to use technology to do that, right? To not just use

58:59 technology for technology's sake. And if that means that your best outcome is coding in Rust or C for a stretch, right, as far as you can kind of see how that goes, right? Yeah. So those

59:12 have kind of just been some of my curious thoughts. Well, there you go. I mean, what's the best way, I guess, for the audience to reach out to you, so y'all can have this

59:21 discussion? Like, what's the best contact? Email is great, and I'm sure we can post that. Yeah, this will be recorded. And again, I'm sorry, I can't see everyone on this call, so

59:34 please feel free to reach out. For me, it's alex@mercuryfund.com.

59:38 canisius@collide.io, and

59:41 happy to take texts and calls as

59:45 well, which I would love, and I've learned a lot, and I'm excited, right? I think everybody will learn something from this journey, and I'm happy to talk to you.

59:55 Well, Collide's lucky to have you, Canisius, and I appreciate the time. Yes, no, thank you.
