June 9, 2025

1142: Navigating and leading the way with AI guardrails at the enterprise level w/ Shaun Ayrton

In this episode, Brian interviews Shaun Ayrton, the CEO and founder of Galini, a Y Combinator and General Catalyst-backed startup helping enterprises deploy AI responsibly. Shaun shares his journey from working at McKinsey, where he drove over $500M in revenue growth, to founding Galini with his former college roommate. He explains how Galini is addressing the challenges of AI compliance, data privacy, and regulation – problems that are becoming more urgent as AI becomes a central part of business operations. Shaun dives into the unique risks and opportunities in AI and why now is the best time to be an entrepreneur.

💡 What You'll Take Away For YOUR Business

🚀 How to turn corporate experience into a startup advantage – Shaun shares how his time at McKinsey prepared him for entrepreneurship.
🤖 Why AI guardrails are critical for business adoption – Learn why most enterprises are stuck in "pilot mode" and how to break through to scale.
💰 How to build a business around regulation and compliance – Shaun reveals how Galini is creating a new category in AI security and compliance.
⚡ The biggest AI risks businesses face right now – And how to navigate them responsibly.
🛡️ How to use AI to increase efficiency without risking data security – Shaun explains the difference between smart AI deployment and reckless automation.
💡 Why timing matters in business and AI – Shaun discusses how Galini is positioned perfectly for the next wave of AI adoption.
🎯 How to use AI for business growth TODAY – Simple, practical steps to integrate AI in your business responsibly.

πŸ“ About Shaun Ayrton

Shaun Ayrton is the CEO and Founder of Galini, a YC- and General Catalyst-backed startup helping enterprises deploy AI responsibly. His partner in crime is his former college roommate and dear friend Raul Zablah, who has built and managed complex systems at Morgan Stanley, Bridgewater and Ridgeline.

Prior to founding Galini, Shaun was a leader in McKinsey's Software & Telecom practice in New York, where he drove $500M+ in revenue growth for Fortune 50 software and telecom companies. He has a track record of helping his clients achieve breakout revenue growth, two of which were acquired for ~$20B each!

Outside of work, Shaun is an avid sports fan and player. He grew up in Dubai playing cricket for the junior national team and captained the UPenn cricket team in college.

🎯 Shaun's BEST Piece of Advice for Wantrepreneurs and Entrepreneurs

"There has never been a better time to start a company. If you have even the slightest itch to build something, DO IT."

📢 Memorable Quotes

"Intelligence has never been free before. AI is changing that, and it's going to reshape everything." – Shaun Ayrton

"If you're not making at least one blunder every day, you're not moving fast enough." – Shaun Ayrton

"Entrepreneurship is about speed of learning, not perfection. Fail fast and adjust." – Shaun Ayrton


💡 Actionable Takeaways

✅ Audit your AI use cases – Are you using AI securely and responsibly?
✅ Set clear guardrails for AI in your business – Protect your data and your customers.
✅ Leverage AI for business growth – Use AI to automate, but keep human oversight.
✅ Start small, scale fast – Experiment with AI in one area of your business before expanding.
✅ Be transparent with customers – Make sure customers know how their data is being used.

🔗 Links & Resources


00:28 - Introduction to Shaun Ayrton and Galini

13:06 - From McKinsey to Entrepreneurship

22:12 - Building Responsible Guardrails for AI

32:19 - Enterprise AI Adoption and Challenges

41:30 - Good Thoughts, Good Words, Good Deeds

52:24 - Practical AI Uses and Final Advice

WEBVTT

00:00:00.281 --> 00:00:01.264
Hey, what is up?

00:00:01.264 --> 00:00:04.411
Welcome to this episode of the Wantrepreneur to Entrepreneur podcast.

00:00:04.411 --> 00:00:24.551
As always, I'm your host, Brian LoFermento. I feel like, as happens so frequently lately here on the show, and just in life in general, and certainly in business, we have so many interesting conversations that kind of revolve around AI, and that's why I'm so excited to welcome today's guest to the show.

00:00:24.551 --> 00:00:35.570
Because this is someone who thinks about AI, I'm pretty sure, all the time, and someone who has such interesting thoughts. And, most importantly, more than thinking about it, it's someone who's doing something about it, with a really cool company that's addressing future things that we're all going to have to confront when it comes to AI.

00:00:35.570 --> 00:00:38.289
So let me tell you all about today's guest and entrepreneur.

00:00:38.289 --> 00:00:39.642
His name is Shaun Ayrton.

00:00:39.962 --> 00:00:49.174
Shaun is the CEO and founder of Galini, which is a Y Combinator and General Catalyst-backed startup helping enterprises deploy AI responsibly.

00:00:49.174 --> 00:00:52.060
That's the key word in this conversation.

00:00:52.060 --> 00:01:05.379
His partner in crime is his former college roommate and dear friend, Raul Zablah, who has built and managed complex systems at Morgan Stanley, Bridgewater and Ridgeline.

00:01:05.379 --> 00:01:12.421
Prior to founding Galini, Shaun was a leader in McKinsey's software and telecom practice in New York, where he drove over $500 million in revenue growth for Fortune 50 software and telecom companies.

00:01:12.421 --> 00:01:23.643
He has a track record of helping his clients achieve breakout revenue growth, two of which were acquired for just a casual $20 billion, with a B, $20 billion each. Outside of work,

00:01:23.643 --> 00:01:25.347
Shaun is an avid sports fan and player.

00:01:25.347 --> 00:01:33.751
He grew up in Dubai playing cricket for the junior national team and captained the UPenn cricket team in college, and we're so grateful for Shaun's connections to UPenn as well.

00:01:33.751 --> 00:01:38.989
Obviously, that is an educational institution that is near and dear to our show as well through our partnership.

00:01:38.989 --> 00:01:41.061
So we are all going to learn a lot from Shaun.

00:01:41.061 --> 00:01:44.108
I know he's going to make us think a lot, so I'm not going to say anything else.

00:01:44.409 --> 00:01:46.953
Let's dive straight into my interview with Shaun Ayrton.

00:01:46.953 --> 00:01:53.429
All right, Shaun, I am so very excited that you're here with us today.

00:01:53.429 --> 00:01:55.245
First things first, welcome to the show.

00:01:55.245 --> 00:01:57.075
Thank you, Brian.

00:01:57.075 --> 00:01:57.759
Thanks for having me.

00:01:57.759 --> 00:02:04.129
Heck, yes, I'm excited to hear all the ways that your mind thinks about and works through this world of AI.

00:02:04.129 --> 00:02:06.954
But before we get there, take us beyond the bio.

00:02:06.954 --> 00:02:07.602
Who's Shaun?

00:02:07.602 --> 00:02:09.147
How'd you start doing all these cool things?

00:02:10.419 --> 00:02:13.289
Yeah, of course. I'm Shaun Ayrton here.

00:02:13.340 --> 00:02:23.430
I grew up in Dubai, so abroad. You know, it was an incredible sort of journey watching the city grow from a desert to the city it is today, in a very short amount of time.

00:02:23.961 --> 00:02:42.854
It sort of left me with the takeaway, the learning, that you can actually dream big and achieve things, and not everything has to go as planned, but overall it's sort of an endless path there. And so I came to the US very excited to sort of be at the heart of technology and innovation in the world.

00:02:42.854 --> 00:03:03.110
I studied at UPenn, and always had a mixed interest between wanting to be a sportsperson, maybe not being good enough to do it, and really enjoying the intersection of technology and business and figuring out what could happen there and the potential. And so those have been the two things that have driven me through most of my life.

00:03:03.110 --> 00:03:19.231
I spent a bunch of time at McKinsey before this, working with some incredible clients and colleagues, you know, at that intersection of software and technology, and, you know, always had the itch to do something entrepreneurial. I finally had the opportunity to do so and have jumped in and done it.

00:03:19.840 --> 00:03:21.443
Yeah, I love that overview, Shaun.

00:03:21.443 --> 00:03:30.724
What I'm so fascinated by, and excited to hear you go a little bit deeper on here, is the stark contrast, because it stands out to me: the difference between McKinsey and now being an entrepreneur yourself.

00:03:30.724 --> 00:03:52.365
Because I will confess this to you while we're here on the air, Shaun: a lot of entrepreneurs I know, behind closed doors, kind of view the McKinseys of the world as the kings of corporate jargon. Obviously, for us as entrepreneurs, we're making the most of all the resources, however limited they may be, to further the world, and we kind of go counter to the traditional ways that things have been done.

00:03:52.365 --> 00:03:57.788
Talk to me about that difference, because now I would imagine you're moving so fast in a very rapidly evolving field.

00:03:57.788 --> 00:03:59.280
What does that difference look like?

00:04:00.301 --> 00:04:01.723
It is funny you bring that up, Brian.

00:04:01.723 --> 00:04:10.873
I think one of the first things, and this is a fairly new part of my life: we kicked this off in September of last year and just sort of wrapped up Y Combinator.

00:04:10.873 --> 00:04:12.656
We went through the Y Combinator accelerator on the West Coast.

00:04:12.656 --> 00:04:18.425
I have to say there's a good amount of unlearning that has happened in those 10 weeks at a very accelerated pace.

00:04:18.425 --> 00:04:20.327
For exactly the reason you're referring to.

00:04:20.668 --> 00:04:27.896
I do think there are some incredible things I learned from McKinsey that have actually given me sort of a competitive advantage in the entrepreneurial game.

00:04:27.896 --> 00:04:31.110
A few of those things are how to navigate a corporation.

00:04:31.110 --> 00:04:39.189
It sounds easy, but knowing who to sell to, what type of conversation to have, when to engage different stakeholders to sort of drive decision making.

00:04:39.189 --> 00:04:39.910
You know how to engage them.

00:04:41.432 --> 00:04:50.923
I think, for all of those, having a seat at that table for seven, eight years before

00:04:50.923 --> 00:04:52.990
this has really, you know, almost given me the insider perspective of what it takes.

00:04:52.990 --> 00:04:57.925
But that said, there is a lot that I've had to unlearn as well, predominantly around the appetite for making mistakes.

00:04:57.925 --> 00:05:12.100
I think in most corporate jobs, and consulting is no exception, making mistakes is not really kosher. But in entrepreneurship, if you're not making at least one blunder every day, you're not moving fast enough and not learning fast enough.

00:05:12.100 --> 00:05:15.391
So that was probably the biggest unlearning that had to happen in this journey.

00:05:16.040 --> 00:05:17.442
Yeah, so well said, Shaun.

00:05:17.442 --> 00:05:29.827
I love the fact that you call that out, because it is an inevitable part of any and every entrepreneurial journey. Which, of course, I'm going to use as a segue to talk about your entrepreneurial journey, because you're doing really cool things in the world of AI.

00:05:29.827 --> 00:05:34.271
We obviously talk about AI quite frequently here on this show, but talk to us about Galini.

00:05:34.271 --> 00:05:34.992
What is it?

00:05:34.992 --> 00:05:36.144
What made you start it?

00:05:36.144 --> 00:05:37.245
Where did that idea come from?

00:05:37.245 --> 00:05:38.605
And why now, Shaun?

00:05:38.605 --> 00:05:47.595
Because it's changing every single week, and I would imagine that you jumped headfirst into an industry that you had to make sense of and continuously have to stay ahead of the curve on.

00:05:48.939 --> 00:05:58.321
Yeah. Well, firstly, I don't think there's ever been a better time to start a company, so we can come back to that at the end, but I would encourage anyone listening to this podcast: take the jump.

00:05:58.321 --> 00:06:03.629
It has never been cheaper, it's never been easier and the sort of investment appetite has never been better.

00:06:03.629 --> 00:06:08.858
And, more importantly, your ability to bring an idea to reality has never been faster.

00:06:08.858 --> 00:06:10.863
So I would recommend that strongly.

00:06:10.863 --> 00:06:12.773
What drove my co-founder and me?

00:06:12.773 --> 00:06:14.880
I mean, we've known each other for 15 years at this point.

00:06:14.880 --> 00:06:37.596
We were roommates in college, you know, from the time we were babies almost, and we sort of missed two or three of the seminal waves of technology, either because we were too young or because we were international and needed to play the visa game in the US: the cloud wave, the internet wave. I guess we were a little bit too young for the mobile wave.

00:06:37.596 --> 00:06:53.351
But this is a paradigm-shifting sort of platform of technology that's going to redefine the way everyone works, everyone sort of conducts life and everyone lives in the next five to seven years, and sitting on the sidelines just didn't sit well with us.

00:06:53.351 --> 00:07:01.401
So that's sort of, you know, the core motivation. I think, practically, both of us have seen it, me from an advisory capacity and him from an actual building capacity.

00:07:01.440 --> 00:07:15.192
He was building a lot of the systems in many of the top institutions. We realized both the potential of the latest form of AI (I mean, AI has been around a while; I mean the generative AI and the transformer architecture) but also the risk that it has.

00:07:15.192 --> 00:07:32.875
Fundamentally, it's a stochastic system, which means its responses are probabilistic, and there are a lot of industries where, for very good reasons, there's regulation around what can and cannot happen and what can and cannot be said by systems: broadly, the financial sector, the healthcare sector, government services.

00:07:32.875 --> 00:07:43.394
And what we saw very viscerally is many enterprises are stuck in sort of the pilot mode and unable to do the enterprise scale because they're unable to manage this risk.

00:07:43.394 --> 00:07:49.684
So that is really the problem that we left to help solve, as we are both very pro-AI, but we want it to be responsible.

00:07:49.684 --> 00:08:00.346
We want folks to do it in a way that they can control and control the customer's experience as well, and that's what led us to start Galini, which are essentially guardrails for AI applications.

00:08:00.809 --> 00:08:14.690
We work very closely with product leaders and engineering leaders to help them accelerate their AI deployment journey. Yeah, I love that overview, Shaun, especially because there's so many considerations, and obviously we're going to go deeper into quite a few of those avenues during our conversation today.

00:08:14.690 --> 00:08:18.627
But the first place that I want to start is those guardrails, because I think it's fascinating.

00:08:18.627 --> 00:08:21.963
I'll confess here, while we're on the air together, that I love scrolling through Reddit.

00:08:21.963 --> 00:08:33.168
I really love seeing what the public are talking about, and right now it's almost a meme at this point: how far can we push these large language models like ChatGPT, how far can we push them?

00:08:33.168 --> 00:08:35.561
At what point are they going to say, nope, I can't go there?

00:08:35.561 --> 00:08:37.683
At what point are those guardrails going to kick in?

00:08:37.745 --> 00:08:40.668
And some of those guardrails, Shaun, people don't like.

00:08:40.668 --> 00:08:41.870
Some of them, people can see.

00:08:41.870 --> 00:08:46.082
Okay, it doesn't make sense for AI to be sharing this type of information with me.

00:08:46.082 --> 00:08:49.321
Talk to me about those guardrails, because obviously there are good guardrails.

00:08:49.321 --> 00:08:50.664
There are not so great guardrails.

00:08:50.664 --> 00:08:53.552
How do you distinguish the two from each other?

00:08:54.760 --> 00:09:03.591
Yeah, it's a very astute question, Brian, and, to be honest, when you talk to five or ten different people about what a guardrail is, everyone has a different interpretation of what it means.

00:09:03.591 --> 00:09:10.967
When we are referring to guardrails, we are actually referring to the enterprise use of guardrails.

00:09:10.967 --> 00:09:18.110
So, this is sort of... Most enterprises have corporate policies that govern the way employees interact with each other, access to data, privacy and controls.

00:09:18.110 --> 00:09:27.312
In a post-AI world, things are becoming more and more agentic, and you have systems that, honestly, will very soon, if they're not already, behave like employees.

00:09:27.312 --> 00:09:29.548
They have access to employee databases.

00:09:29.548 --> 00:09:35.644
They have reasoning modules where they can decide what to access, and when and how to string together sort of pieces of information.

00:09:35.644 --> 00:09:46.746
That opens up a risk vector for many enterprises, where the potential is obvious but the risk is also pretty large, and so that is what we mean by guardrails.

00:09:47.139 --> 00:09:58.128
Now there's been a lot of discourse on more of the consumer-facing applications, whether it's OpenAI or Grok or name your application these days, and whether there should or should not be guardrails.

00:09:58.128 --> 00:10:03.067
I know that this also sort of moves towards a political issue, so it's not something we have a strong stance on.

00:10:03.067 --> 00:10:11.604
I think overall, we're very pro-AI and we want it to be done responsibly.

00:10:11.604 --> 00:10:12.285
We're also pro-open source.

00:10:12.285 --> 00:10:23.365
I think recently, with DeepSeek and some of the other innovations that have happened, it's very clear that those models are catching up to some of the cutting-edge proprietary models, and we're very excited to sort of leverage that trend as well.

00:10:23.385 --> 00:10:26.653
Going forward. Yeah, I love the fact that you really make that distinction.

00:10:26.653 --> 00:10:42.575
You talk about those enterprise considerations, because there are so many. And, quite frankly, obviously you and I are going to be talking enterprise because you work within that realm, but I would argue all of us as business owners, every single person that's tuning into this conversation, we all have to think about the ways that we use it, because we have our own data.

00:10:42.575 --> 00:10:44.524
We have our own customer data.

00:10:44.524 --> 00:10:50.041
There's a lot of sensitive stuff that a lot of us are feeding through AI these days, and free plans versus paid plans.

00:10:50.041 --> 00:10:51.605
There's so many considerations there.

00:10:51.605 --> 00:11:00.440
But one thing that I really appreciate is that you call out so distinctly that there are guardrails, not just on AI outputs, but also the inputs.

00:11:00.440 --> 00:11:02.143
What is it that we're feeding into it?

00:11:02.143 --> 00:11:13.350
Talk to us about those considerations, because I feel like a lot of us are just freely using it without thinking about both sides of that equation of not just what's the AI giving to me, but what am I giving to it?

00:11:14.639 --> 00:11:16.285
Yeah, that's a fantastic point.

00:11:16.285 --> 00:11:20.988
I mean, the first thing I'll mention, whether you're in, you know, consumer use cases or even SMB use cases.

00:11:20.988 --> 00:11:26.525
Look at the terms and conditions on the different websites for how they handle data, what they do about it.

00:11:26.525 --> 00:11:28.407
I think that is very critical.

00:11:29.386 --> 00:11:30.559
I'll give you an example.

00:11:30.679 --> 00:11:46.586
Last week I was at the T3 conference, which is a wealth management technology conference, and it was very clear, for better or worse, that many advisors were using some of the tools like ChatGPT and sort of uploading PII and sensitive client information because regulation hasn't caught up there.

00:11:46.586 --> 00:11:55.871
It's probably fine, but it's at least something you need to disclose to your customers because it is going to an open source model or an openly accessible model.

00:11:55.871 --> 00:12:01.241
So my takeaway is: definitely look at the terms and conditions and just think it through.

00:12:01.241 --> 00:12:02.687
Do the what do they call it?

00:12:02.687 --> 00:12:04.565
The PR test or the public test?

00:12:04.625 --> 00:12:10.019
If you were in the newspaper, would you feel comfortable or not comfortable saying a particular statement?

00:12:10.019 --> 00:12:12.248
And if you're not comfortable, it's good to explore solutions.

00:12:12.248 --> 00:12:32.986
That said, you know, I'm not a proponent of adding costs to your business model, so I'm not suggesting that; there are solutions that you can use that are, you know, very low cost, but at least be aware of the risks there. Yeah, I love the fact that you're also introducing us to so many different players, Shaun.

00:12:32.895 --> 00:12:34.111
We're talking about potential government regulation.

00:12:34.111 --> 00:12:35.296
We're talking about enterprise level.

00:12:35.296 --> 00:12:38.403
We're even talking about our own responsibilities as consumers.

00:12:38.403 --> 00:12:47.772
If I go to a dental office, of course I'd like to know what the heck they're doing with my data, and it was much simpler 50 years ago when everything was just pen and paper, but it's much more complex today.

00:12:47.772 --> 00:12:49.221
So I want to ask you this question.

00:12:49.221 --> 00:12:51.145
Obviously, there's no one answer to it.

00:12:51.145 --> 00:12:54.123
I'm sure it's a mix of everything, but who's responsible here?

00:12:54.123 --> 00:12:56.390
Who's going to drive that change?

00:12:56.390 --> 00:13:15.322
Because, when I think about it, part of our value add as entrepreneurs is that we drive a lot of change when it comes to technology, when it comes to innovation. To the point that you said earlier, Shaun: there's a lower penalty for you and I taking risks and failing than there is for a McKinsey, so we get to drive some of those experiments and that change.

00:13:15.322 --> 00:13:23.567
Who's going to be driving that change in this world of AI and those guardrails and use cases and privacy and data and all these things we're talking about?

00:13:25.437 --> 00:13:28.871
Yeah, that's a million-dollar question, Brian, I think, you know.

00:13:28.871 --> 00:13:37.985
Ideally we'd want the government to play a part in this, but we know that it's usually a little bit slow in the adoption and understanding of the latest technology.

00:13:37.985 --> 00:13:51.301
We also know that historically, a lot of the innovation has come from some of the larger companies, but this is a very, very unique sort of thing that we're seeing in the market, where the vast majority of innovation here is actually coming from the startups of the world.

00:13:51.301 --> 00:13:56.852
You know, sort of our fellow founders, for lack of a better word.

00:13:56.852 --> 00:14:04.921
So I think the onus, from, you know, an ethical standpoint, is on us to make sure that we are using sort of our technology responsibly.

00:14:04.921 --> 00:14:08.094
I hate to draw the metaphor, but, you know, what's that Spider-Man or Superman quote? Like,

00:14:08.094 --> 00:14:09.275
With power comes responsibility.

00:14:09.275 --> 00:14:22.297
I do think that it really falls on founders, and sort of early technology experimenters with AI, to take this onus on themselves until these become standards and norms that are applicable in our space.

00:14:22.918 --> 00:14:27.796
Yeah, it was totally a softball question, Shaun, because obviously Galini is plugging a lot of those gaps.

00:14:27.796 --> 00:14:31.402
I want to put you on the spot a little bit because I obviously can see your website.

00:14:31.402 --> 00:14:43.964
Most listeners can't see you and I right now and they definitely can't see your website as we're talking, but you have a graphic on your website that I feel like describes so succinctly where Galini fits into this mix.

00:14:43.964 --> 00:14:50.629
I love the fact that you've got the user's input feeds not to get an output, but first feeds into Galini.

00:14:50.629 --> 00:14:59.743
That makes sense of all of the things that we're talking about regulations, company policy, all of those then feeds it into the AI and gets that output.

00:14:59.743 --> 00:15:05.691
Walk us through how the heck that works, because the visual is worth a thousand words for me, truly, but I want you to explain it for listeners.

00:15:06.692 --> 00:15:07.235
Absolutely.

00:15:07.235 --> 00:15:13.876
I mean the metaphor here for technologists out there is a firewall, but instead of being a security firewall, it's an AI compliance firewall.

00:15:13.876 --> 00:15:15.635
That's exactly how it works, brian.

00:15:15.635 --> 00:15:21.775
So, as a listener, here's how you would use something like this.

00:15:21.775 --> 00:15:24.169
There are two parts to the solution.

00:15:24.169 --> 00:15:33.482
The first one is a user would type in a query, a prompt, a voice note, an agentic instruction, whatever the input format is.

00:15:33.482 --> 00:15:36.839
It would hit the Galini API and they would get a response.

00:15:36.839 --> 00:15:44.355
The developer team, based on the response, could do something about it, and the same thing happens on the output side.

00:15:44.355 --> 00:15:54.506
You know, you could be masking PII, you could be keeping certain conversations on-topic or blocking, you know, topics that are off limits for the purpose of your application.

00:15:54.506 --> 00:16:02.083
You could control what the agents can and cannot access as you're building sort of more agentic applications.

00:16:02.083 --> 00:16:03.471
So it essentially is,

00:16:03.471 --> 00:16:11.931
you can think of it, like your safety blanket or safety layer between, you know, users and a model that is not fully in your control.
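
The "compliance firewall" pattern Shaun describes, checking input before it reaches the model and scrubbing output before it reaches the user, can be sketched roughly like this. Everything here is hypothetical for illustration (the function names, the example policy, the stand-in model); it is not Galini's actual API.

```python
import re

# Example input-side policy: topics this assistant must not engage with.
BLOCKED_TOPICS = {"investment advice", "medical diagnosis"}
# Example output-side rule: mask US Social Security numbers (a PII pattern).
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def input_allowed(prompt: str) -> bool:
    """Return True if the prompt passes the example input-side policy."""
    lowered = prompt.lower()
    return not any(topic in lowered for topic in BLOCKED_TOPICS)

def mask_pii(text: str) -> str:
    """Output-side guardrail: mask SSN-shaped strings in model output."""
    return SSN_PATTERN.sub("[REDACTED]", text)

def guarded_call(prompt: str, model) -> str:
    """Wrap a model call with an input check and an output scrub."""
    if not input_allowed(prompt):
        return "Sorry, that topic is outside the scope of this assistant."
    return mask_pii(model(prompt))

# Demo with a stand-in "model" that just echoes its input:
echo = lambda p: "Echo: " + p
print(guarded_call("My SSN is 123-45-6789", echo))        # SSN masked on output
print(guarded_call("Can you give me investment advice?", echo))  # blocked on input
```

In a real deployment, the two check functions would be calls to a guardrail service rather than local regexes, and the application would decide what to do with a "blocked" response, but the shape, a policy layer sitting on both sides of a model you don't fully control, is the same.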

00:16:12.572 --> 00:16:15.941
Yeah, I love the way that you use analogies to illustrate that point.

00:16:15.941 --> 00:16:25.241
I think it's so important for all of us to understand where things are living, because now we're sending data all across the world these days with any and every query that we're sending out there.

00:16:25.241 --> 00:16:42.399
Sean, when I think about the work that you're doing, what really excites me is the fact that you guys are taking that responsibility to make sense of company policy, to make sense of all the things that at the enterprise level, they hope is happening, but you then build and deploy that solution within their environment.

00:16:42.399 --> 00:16:45.092
My question to you is what's the spark for them?

00:16:45.092 --> 00:16:46.195
What's that catalyst?

00:16:46.195 --> 00:16:55.134
Is it a pain point where enterprise-level companies are already saying, wait, we recognize that this is something we want to get on top of? Or, and I'm going to throw the insurance industry under the bus here,

00:16:55.134 --> 00:17:02.381
Is it the case of like insurance, where none of us want it but unfortunately, when something happens, we're really glad that we have it?

00:17:02.381 --> 00:17:06.914
Where's that catalyst or that spark for them to say, wait, let's prioritize this.

00:17:08.175 --> 00:17:09.438
It's a great question, Brian.

00:17:09.438 --> 00:17:10.278
I think we're.

00:17:10.278 --> 00:17:11.140
I think two things.

00:17:11.140 --> 00:17:12.801
One is we're a little bit early in the market.

00:17:12.801 --> 00:17:16.326
I think people are still early in their AI journeys, ai adoptions.

00:17:16.326 --> 00:17:19.353
They're figuring out what AI means for their company beyond,

00:17:19.353 --> 00:17:23.563
you know, the Microsoft Copilots of the world or, you know, the consumer-facing applications.

00:17:23.563 --> 00:17:25.979
So I do think this is an evolving conversation.

00:17:25.979 --> 00:17:32.088
If you ask me in a couple of years, my answer will probably change. As for the motivations that we've discussed so far,

00:17:32.088 --> 00:17:35.480
I'd say it's 50-50.

00:17:35.851 --> 00:17:44.935
There are some proactive leaders that are, like, taking a very proactive approach around how to manage something like this, and so they'd fall in your first camp.

00:17:44.935 --> 00:17:52.642
But I'd say maybe like 60, 65% of the folks that we've spoken to are almost taking the insurance angle of gosh.

00:17:52.642 --> 00:17:57.500
Is the upside of launching this capability to our customers worth the risk of the downside?

00:17:57.500 --> 00:18:02.561
And unfortunately, almost every week there's a new news article of somebody butchering this.

00:18:02.561 --> 00:18:05.478
I mean, I don't want to put any names out there, but

00:18:05.478 --> 00:18:09.940
a quick news search will give you a sense of what these are.

00:18:09.940 --> 00:18:16.002
So you know, it is more of the insurance as a driver today for enterprise adoption of guardrails.

00:18:16.002 --> 00:18:18.074
But you know our hope is in the future.

00:18:18.074 --> 00:18:21.122
Folks will take more and more of a proactive lens here.

00:18:22.090 --> 00:18:24.986
Yeah, and not just speaking from a technical perspective:

00:18:24.986 --> 00:18:39.056
I love getting inside your executive mind and hearing you call out the fact that, yeah, we're early in the market. And I feel like that's how everybody kind of feels about AI right now: it's unbelievable saying we're early, because it is obviously incredible already, but it's just evolving at such a rapid rate.

00:18:39.056 --> 00:18:42.750
So I'll put your entrepreneurial and executive mind on the spot here.

00:18:42.750 --> 00:18:44.492
How do you make sense of that?

00:18:44.492 --> 00:18:48.999
Are there some times where you go to bed at night and you think are we too early here?

00:18:48.999 --> 00:18:51.962
Are we fixing a problem that other people aren't aware of just yet?

00:18:51.962 --> 00:18:57.771
Or what are those conversations like when you're in the marketplace, when you're talking to your potential clients, your existing clients?

00:18:57.771 --> 00:19:02.691
How do you make sense of the timeline, of where you're fitting in into the more macro landscape?

00:19:04.574 --> 00:19:04.874
I do.

00:19:04.874 --> 00:19:07.278
I spend time on it all the time, Brian, all the time.

00:19:07.278 --> 00:19:12.026
It's, I guess, the fun and the not fun parts about being an entrepreneur.

00:19:12.026 --> 00:19:14.977
You constantly have to question yourself, your business model, kind of,

00:19:14.977 --> 00:19:16.981
you know, your timing, what you offer in the market.

00:19:16.981 --> 00:19:33.435
You know, one of the incredible things we learned from YC on this was: they try to simplify a very, very ambiguous, you know, thing, which is entrepreneurship, into sort of different pieces of advice at different stages of your journey. And for most of us that are early,

00:19:33.435 --> 00:19:35.260
it is: build something people want.

00:19:35.260 --> 00:19:40.911
That is sort of the mandate, and they've sort of distilled years of very successful founders into that.

00:19:40.911 --> 00:19:44.804
So the way I sort of handle that is I keep talking to customers.

00:19:44.804 --> 00:19:54.954
I keep going to conferences, speaking to leaders, and speaking at ISAF, which is a large audit conference, later this week.

00:19:54.954 --> 00:20:03.304
Being right front and center with customers, in their workflows, in the discussion, is the surest way to figure out.

00:20:03.685 --> 00:20:04.589
Are we too early or not?

00:20:04.589 --> 00:20:08.856
I think we are on the earlier side here, but we're not going to be for too long.

00:20:08.856 --> 00:20:15.512
I don't think there's been a technology that has evolved as quickly at a global scale as we are seeing it evolve.

00:20:15.512 --> 00:20:33.023
Even the open-source Manus release a week or two ago, I think it was last week, is essentially taking OpenAI's Operator and bringing it to the world at a much lower price point, and that will completely change the way folks interact with technology.

00:20:33.023 --> 00:20:37.201
The actual UI with which even consumers use technology will completely change.

00:20:37.201 --> 00:20:38.637
Imagine this.

00:20:38.637 --> 00:20:43.220
You want to use a piece of software, but you don't have to figure out how the software works.

00:20:43.220 --> 00:20:47.121
You just have a need and you express the need and it is solved.

00:20:47.121 --> 00:20:51.954
That is a world we're heading into very, very soon, just to pick on one example.

00:20:52.455 --> 00:20:53.921
Yeah, I love that, Shaun, I'll tell you.

00:20:53.921 --> 00:21:13.412
Obviously, as someone who talks to business owners for a living, I completely agree with you: even though it feels early right now, that next stage is going to come so quickly. And that's why I really appreciate the fact that you and your co-founder are so clear on the fact that there are more regulated industries that make more sense for you to dip into: government, for example, or the financial sector.

00:21:13.412 --> 00:21:19.675
Obviously, those are industries that have very sensitive data, that have a lot of data, that are leveraging AI in different ways.

00:21:19.675 --> 00:21:31.156
Talk to us about that industry-specific targeting, those conversations, how you've identified those industries, and really why they are going to be forced to be the leaders in this because of the sensitivity of their data.

00:21:32.259 --> 00:21:35.230
Yeah, I mean, that is where the pain is most felt, right?

00:21:35.230 --> 00:21:43.385
I think the reason why we're starting there, as you said, is that it's regulated, and the penalties for violating those regulations are already established and large.

00:21:43.385 --> 00:21:45.270
I can give you a couple examples.

00:21:45.270 --> 00:22:03.017
We're working with a public safety provider who's essentially bringing the next version of their video technology to market to help with crime prevention and mitigation and we're helping them put guardrails around how that technology is used.

00:22:03.017 --> 00:22:19.642
We're in early discussions with another global government around amazing internal capabilities that they're trying to deploy for their citizens, which would bring access to information and resources to their fingertips, but they need to do it in a responsible way.

00:22:19.642 --> 00:22:22.559
There's a big data challenge around that, as you can imagine.

00:22:22.559 --> 00:22:25.490
So you know those are some examples there.

00:22:25.771 --> 00:22:44.924
We're having a lot of conversations with folks all across the financial services landscape, particularly investment advisors, around, honestly, the pain of compliance and how to use technology to help ease that from a delivery standpoint in a model that's pretty tough.

00:22:44.924 --> 00:22:46.236
There are pretty thin margins there.

00:22:46.236 --> 00:22:48.213
We're talking to banks.

00:22:51.104 --> 00:23:00.655
And then, the last thing I'll mention: we're also talking to chronic care and other healthcare delivery practices as they think about using AI both in their operations and in customer service.

00:23:00.655 --> 00:23:13.799
And the last thing you want is to launch a 24/7, always-available AI chatbot where you ask, "Hey, my back's hurting, I think I injured it in some way," and it starts giving you medical advice and opens the practice up to getting sued.

00:23:13.799 --> 00:23:28.040
So those are some of the low-hanging fruit, but the vision we have in our head is this: as AI becomes more human-like, we have a set of social norms and regulations for how we interact with each other in companies.

00:23:28.040 --> 00:23:29.604
What is that protocol?

00:23:29.604 --> 00:23:32.717
What does that look like for AI?

00:23:32.717 --> 00:23:41.282
And it makes sense that this is not going to be solved by an individual provider, and so we want to ideally be that protocol in five years.

00:23:42.069 --> 00:23:47.814
Yeah, Shaun, I love the real-life examples, because it immediately shows us how much we should all support some level of guardrails.

00:23:47.814 --> 00:23:50.943
Because with the back pain example, yeah, none of us want AI doing that.

00:23:50.943 --> 00:23:56.558
We already have WebMD to terrify us when anything is wrong with our bodies, so we don't need AI to be tacking onto that.

00:23:56.558 --> 00:23:58.171
I want to switch gears a little bit, Shaun.

00:23:58.171 --> 00:24:03.053
I've been so excited not only to talk to you with regards to AI, but also to get inside of your entrepreneurial mind.

00:24:03.053 --> 00:24:07.019
It's such a fun part of these conversations for me because I think it's fascinating.

00:24:07.039 --> 00:24:15.073
The dichotomy of your industry is that AI makes a lot of people afraid because they say all of us are going to be out of jobs, it's going to replace humans.

00:24:15.073 --> 00:24:17.279
Some people say it's going to take over the world.

00:24:17.279 --> 00:24:18.394
It's going to do all these bad things.

00:24:18.394 --> 00:24:34.439
But, of course, there's also the good side, and so I'm using that as a segue to the fact that I know one of your personal mantras, and part of the way that you were raised, is the importance of good thoughts, good words, good deeds, and obviously that helps you see the good in the world amidst all of the other things that are out there.

00:24:34.439 --> 00:24:41.701
Talk to us about how that's played into the way that you see the world, the way that you see the world as an entrepreneur, and how you make these decisions in your business.

00:24:43.165 --> 00:24:45.588
Yeah, I mean that's a core sort of mantra.

00:24:45.588 --> 00:24:53.461
I grew up Zoroastrian, and it's a very simple but fundamental part of how I conduct my life and kind of what I live by.

00:24:53.461 --> 00:24:55.287
So thanks for bringing that up, Brian.

00:24:55.287 --> 00:25:03.920
I'd say for me personally it's been actually an incredible sort of accelerator and a big advantage to most things in my life.

00:25:03.920 --> 00:25:19.435
I mean, people generally like working with genuine, personable people, and I've gotten that feedback a bunch before, which I'm very fortunate for. And I truly believe the more you put out there and just help people, the more it will come back overall in your life.

00:25:19.435 --> 00:25:26.353
And so having that mindset, while it's important just in general, I think is particularly helpful as a founder, right?

00:25:26.353 --> 00:25:29.076
And so that's how that's played out.

00:25:29.175 --> 00:25:35.953
And to answer your second question around AI, look, I mean it is a scary technology to some extent.

00:25:35.953 --> 00:25:38.432
You know intelligence has never been free before.

00:25:38.432 --> 00:25:45.373
So in that paradigm, you know there are fundamental questions we have to ask ourselves about how society will operate in five, six years.

00:25:45.373 --> 00:25:53.532
But I think the fear that people have, and it's a gross generalization, so not everyone, is more the fear of change.

00:25:53.532 --> 00:26:08.596
It's easy to set something aside as "something I don't understand, and it is scary, so I won't understand it," versus just trying to use it and digging into it, even from a vocational standpoint.

00:26:08.596 --> 00:26:23.530
I think we're so early in the journey collectively that people spending an hour a day or an hour a week on this, on something that they're actually interested in, will make them almost an expert in about a month or two of just compounding that learning.

00:26:23.605 --> 00:26:31.498
So I would strongly urge anyone to just try to play with AI today, and not just, hey, go to ChatGPT and use it for search.

00:26:31.498 --> 00:26:37.194
Try to think of something where your life might be a little bit easier and use AI to try to make it happen.

00:26:37.194 --> 00:26:39.894
No need to build; just, like, make a custom GPT.

00:26:39.894 --> 00:26:47.128
You know, you could look at YouTube, look at some of the documents that have been published by some of the leading LLM labs, look at Reddit.

00:26:47.128 --> 00:26:52.353
There's so much experimentation that still hasn't happened with this technology.

00:26:52.353 --> 00:26:53.334
That's honestly incredible.

00:26:53.334 --> 00:27:07.907
I think the analogy I have here is I feel like we're still riding horses and there is a Lamborghini and nobody's driving the Lamborghini, but everyone's saying how scary it might be and all we have to do is open the door and press the accelerator and see what might happen.

00:27:07.907 --> 00:27:09.951
That's kind of where we are with AI.

00:27:09.951 --> 00:27:17.970
So I would strongly encourage folks, if you're not already, to think about ways that it might make your life better, and don't wait for a solution to come out.

00:27:17.970 --> 00:27:21.326
Go and try to experiment and try to make something useful yourself.

00:27:21.866 --> 00:27:24.471
Yeah, I love that, Shaun, especially that analogy.

00:27:24.471 --> 00:27:40.272
I would imagine that when people were riding on horseback and there was, for the first time, a car that could go, you know, 40, 50, 60 miles per hour, that would have been a terrifying speed to be sitting in a metal machine at, when all you'd ever been used to was riding on horseback.

00:27:40.272 --> 00:27:47.414
So it's cool, but I also think that it reveals a lot about your curiosity, Shaun, so I will publicly put you on the spot while we're on the air here together.

00:27:47.414 --> 00:27:51.606
What are some of those ways that curiosity manifests in your personal uses of AI?

00:27:51.606 --> 00:28:02.470
What are some of those real life use cases that you say you know what, even aside from work, because this is what I do for a living, I'm just going to use AI and play around with this and tinker with this to really leverage that curiosity.

00:28:03.632 --> 00:28:06.436
Oh, absolutely. I think there's one thing you have to realize.

00:28:06.436 --> 00:28:07.779
I'll start with the work thing.

00:28:07.779 --> 00:28:11.414
As an entrepreneur, you have to play almost every role in the company.

00:28:11.414 --> 00:28:30.788
I mean, I have a very impressive co-founder who's managing a lot of the actual tech build, but everything around the roles of CEO, COO, CIO, Chief Legal Officer, Tax Advisor, Chief Sales Officer, the whole thing, all of that is compressed into a few roles when you're at our stage.

00:28:30.788 --> 00:28:38.614
So you need to find ways to get leverage in a way that's not outsourcing or necessarily scaling hiring too much at this point.

00:28:38.933 --> 00:28:43.449
So I've actually taken a lot of my workflows, which at the moment are a lot about discovery.

00:28:43.788 --> 00:28:55.376
It's a lot about leveraging my network, whether it's LinkedIn, McKinsey, Wharton, you know, broadly my network, to get those conversations that can really inform where we take the product.

00:28:55.436 --> 00:29:03.665
And one very visceral example of something I started doing is using Operator to automate a lot of the outreach that I've had.

00:29:03.665 --> 00:29:07.115
So traditionally, you know, you sort of go onto LinkedIn.

00:29:07.115 --> 00:29:22.656
You find a particular role or types of people in your network, or people who are connected to folks in your network, and then you ask for some sort of an introduction, and people have used marketing-automation-type campaigns around that, but it's fairly clunky and not very good.

00:29:22.656 --> 00:29:32.180
Now, with OpenAI's Operator, even without coding, you can sort of prompt your way into having it run autonomously in the background and do some of this initial outreach for interest.

00:29:32.180 --> 00:29:40.586
Of course, you know I'm not going to trust it to have conversations with them, but the "Hey, this is what we're doing, would love to have a conversation with you" type of outreach for the right folks.

00:29:40.586 --> 00:29:44.795
It's pretty good at that, so that's one way I've tinkered with it in the last few weeks.

00:29:45.316 --> 00:29:46.117
Yeah, I love that.

00:29:46.117 --> 00:29:53.592
All right, time to hit YouTube and learn a heck of a lot more about Operator, because it sounds like you're having some fun playing around with that, and it's really interesting technology.

00:29:53.592 --> 00:29:53.993
And you're right.

00:29:53.993 --> 00:30:14.863
I think two months ago it didn't even exist in the marketplace, so the fact that we now have access to it is very cool. Shaun, the only question I ask that's the same in every single episode is this: I want to tap into that entrepreneurial mind and ask you for your one best piece of advice, knowing that we're being listened to by both wantrepreneurs and entrepreneurs at all different stages of their own growth journey.

00:30:14.863 --> 00:30:20.826
You're also a founder now and I think that is such a cool part of your journey and you've learned a lot as an entrepreneur.

00:30:20.826 --> 00:30:27.368
Of course, it just accelerates the entire learning experience, so what's that one piece of advice that you want to pass on to our listeners today?

00:30:28.612 --> 00:30:29.512
Yeah, absolutely.

00:30:29.512 --> 00:30:31.676
Two pieces of advice, I think, if I may.

00:30:31.676 --> 00:30:36.894
The first one is there has truly never been a better time to go and build something.

00:30:36.894 --> 00:30:39.228
So, even if you have a slight itch, do it.

00:30:39.228 --> 00:30:45.569
I can tell you on this side of the fence it is 100 times more fun and rewarding, and sort of a higher-learning experience, than anything you will do

00:30:45.569 --> 00:30:46.811
that's not as a founder.

00:30:46.811 --> 00:30:50.539
And then the second thing I'll mention: it's all about focus.

00:30:51.342 --> 00:30:58.749
It's the complete opposite of what you might learn in business school or any corporate job around understanding a market and making business plans and planning.

00:30:58.749 --> 00:31:09.346
I think "planning" is a bad word as an entrepreneur, and there's always a million things you could be doing, and if you're a creative person, you're going to be coming up with ideas all the time.

00:31:09.346 --> 00:31:17.936
Jot down the ideas, but every single day, have the one thing you're going to do that day or for that week and just focus solely on that.

00:31:17.936 --> 00:31:22.192
And the reason I say that is the world is stacked against you as an entrepreneur.

00:31:22.192 --> 00:31:33.653
You don't have the resources, you may not have the funding, you definitely don't have a team of talent working in-house, but what you do have, the one biggest competitive advantage other than passion, is your rate of learning.

00:31:33.653 --> 00:31:37.828
It's the rate of compounding, the rate of discovery, the rate of building.

00:31:37.828 --> 00:31:39.635
So lean into that.

00:31:39.635 --> 00:31:44.169
And the only way you can really lean into that is being laser-focused on something.

00:31:44.289 --> 00:31:50.027
Yes, that's why I always love that acronym for FOCUS: Follow One Course Until Success.

00:31:50.188 --> 00:31:52.055
FOCUS: Follow One Course Until Success.

00:31:52.737 --> 00:32:06.311
Shaun, it sounds like you are super laser-focused on what it is that you're doing, while also allowing your curiosity to lead the way and get those compounding effects that you've shared with us here today, and I'm so appreciative of you sharing not just your expertise but your own personal entrepreneurial journey with us here.

00:32:08.676 --> 00:32:16.366
I am a big fan of companies and leaders and innovators that are at the cutting edge of very important industries, and that is certainly true with the work that you're doing with Galini.

00:32:16.366 --> 00:32:26.893
So, for anyone else who wants to tune in and follow all of your success from here and all of the things that you guys are leading the way on, like I said, I love how simple you've made it through the graphic that explains it on your website.

00:32:26.893 --> 00:32:33.260
I love how you lay out all the different considerations of what responsible AI usage looks like on your website.

00:32:33.260 --> 00:32:36.394
So, for anyone else who wants to check that stuff out, drop those links on us.

00:32:36.394 --> 00:32:37.730
Where should listeners go from here?

00:32:39.125 --> 00:32:39.970
Yeah, thank you, Brian.

00:32:39.970 --> 00:32:44.596
Go to www.galini.ai or reach out to me directly.

00:32:44.596 --> 00:32:48.556
I would love to speak to you on the topic and honestly learn.

00:32:48.556 --> 00:32:50.691
I think it's a two-way learning journey right now.

00:32:50.691 --> 00:32:52.214
Thanks, Brian, for having me.

00:32:52.214 --> 00:33:01.545
I think you've had some incredible folks on this channel and a really incredible listening base, and so I hope I left folks with one or two things that they might have learned that they didn't know before.

00:33:01.545 --> 00:33:02.066
Heck, yeah, you sure have.

00:33:02.086 --> 00:33:03.307
Shaun, you've absolutely delivered here in today's conversation.

00:33:03.307 --> 00:33:03.866
I'm so appreciative of that.

00:33:07.431 --> 00:33:08.711
Listeners, you already know the drill.

00:33:08.711 --> 00:33:13.317
We're dropping those links down below in the show notes, no matter where it is that you're tuning into today's episode.

00:33:13.317 --> 00:33:18.582
You can find Galini's website at galini.ai. Super easy to remember, but you can click right on through from the show notes.

00:33:18.582 --> 00:33:29.740
We're also linking to Shaun's personal LinkedIn, if you want to introduce him to someone, or you want to bounce ideas off of him, and vice versa.

00:33:29.740 --> 00:33:32.453
Everybody wins from these situations.

00:33:32.453 --> 00:33:35.074
That's why we always say that a rising tide lifts all boats.

00:33:35.074 --> 00:33:39.811
So, Shaun, on behalf of myself and all the listeners worldwide, thanks so much for coming on the show today.

00:33:41.046 --> 00:33:42.711
Thank you, Brian, I really appreciate it.

00:33:44.046 --> 00:33:49.424
Hey, it's Brian here, and thanks for tuning in to yet another episode of the Wantrepreneur to Entrepreneur podcast.

00:33:49.424 --> 00:33:53.576
If you haven't checked us out online, there's so much good stuff there.

00:33:53.576 --> 00:34:02.814
Check out the show's website and all the show notes that we talked about in today's episode at thewantrepreneurshow.com, and I just want to give a shout out to our amazing guests.

00:34:02.814 --> 00:34:11.614
There's a reason why we are ad free and have produced so many incredible episodes five days a week for you, and it's because our guests step up to the plate.

00:34:11.675 --> 00:34:13.648
These are not sponsored episodes.

00:34:13.648 --> 00:34:15.251
These are not infomercials.

00:34:15.251 --> 00:34:18.746
Our guests help us cover the costs of our productions.

00:34:18.746 --> 00:34:29.708
They so deeply believe in the power of getting their message out in front of you, awesome wantrepreneurs and entrepreneurs, that they contribute to help us make these productions possible.

00:34:29.708 --> 00:34:38.184
So thank you to not only today's guest, but all of our guests in general, and I just want to invite you to check out our website, because you can send us a voicemail there.

00:34:38.184 --> 00:34:39.527
We also have live chat.

00:34:39.527 --> 00:34:43.356
If you want to interact directly with me, go to thewantrepreneurshow.com.

00:34:43.356 --> 00:34:45.568
Initiate a live chat.

00:34:45.568 --> 00:34:54.974
It's for real me, and I'm excited because I'll see you, as always, every Monday, Wednesday, Friday, Saturday, and Sunday here on the Wantrepreneur to Entrepreneur podcast.