Here’s How AI Agents Are Transforming Project Management
The player is loading ...
Here’s How AI Agents Are Transforming Project Management

Get featured on the show by leaving us a Voice Mail: https://bit.ly/MIPVM 

👉 Full Show Notes
https://www.microsoftinnovationpodcast.com/778
 
James Diekman explores how AI and automation are reshaping project management, compliance, and business innovation. Learn practical strategies for embedding AI in workflows, managing risk, and building future-ready teams. Gain insights from real-world government and enterprise projects, with a focus on responsible, efficient, and scalable technology adoption. 

🎙️ What you’ll learn  

  • Identify where AI agents can streamline project management and reporting 
  • Apply responsible AI frameworks and assurance models in regulated sectors 
  • Build and test project-specific tools using low-code and AI platforms 
  • Integrate legal and compliance checks into AI solution delivery 
  • Develop a continuous improvement approach for AI adoption in your organisation 

Highlights 

  • “Roles like project coordinator will get transformed into AI agents or chat bots.” 
  • “Setting up a project brain for each project will transform quite a bit.” 
  • “We’re seeing a big appetite for AI, especially in New South Wales.” 
  • “Every AI solution for New South Wales government must go through an AI assurance framework assessment.” 
  • “You can ask the same question multiple times and get different answers.” 
  • “Legal teams are starting to use these technologies to enhance their workflows.” 
  • “There’s a real opportunity for cybersecurity firms to invest in red teaming for AI.” 
  • “If you don’t build AI into your processes, companies will get left behind.” 
  • “We use Copilot, Cursor, and GitHub Copilot for development and proposals.” 
  • “Don’t just build a chatbot for everything—map your organisation’s processes first.” 
  • “A dedicated team or working group should constantly prioritise and build AI capabilities.” 
  • “Government, you can’t really get wrong either. It can’t be hallucinating.” 

🧰 Mentioned 

✅Keywords 
ai, project management, automation, power platform, azure, dynamics 365, responsible ai, assurance framework, legal compliance, user acceptance testing, low-code, cybersecurity 

Microsoft 365 Copilot Adoption is a Microsoft Press book for leaders and consultants. It shows how to identify high-value use cases, set guardrails, enable champions, and measure impact, so Copilot sticks. Practical frameworks, checklists, and metrics you can use this month. Get the book: https://bit.ly/CopilotAdoption

Support the show

If you want to get in touch with me, you can message me here on Linkedin.

Thanks for listening 🚀 - Mark Smith

01:18 - The Future of Project Management: AI Agents as Project Brains

05:00 - Cutting Through the Fluff: AI for Radical Transparency

07:07 - From Employee to Entrepreneur: Building a Tech Company in the Age of Low-Code

13:09 - Government’s Appetite for AI: Innovation, Guardrails, and Adoption

16:27 - Navigating AI Governance: Frameworks, Assurance, and Responsible Deployment

21:16 - The Legal Frontier: AI, Compliance, and the Rise of AI Legal Practices

30:13 - AI in Action: Practical Use Cases and the Imperative for Continuous Transformation

00:00:01 Mark Smith
Welcome to the Power Platform Show. Thanks for joining me today. I hope today's guest inspires and educates you on the possibilities of the Microsoft Power Platform. Now, let's get on with the show. Today's guest is from Byron Bay, Australia, where he's the founder and CEO of Accelerate Tech, a company specializing in Azure, generative AI, the Power Platform, and Dynamics 365. With a strong focus on innovation, James helped organizations like Westpac Transport for New South Wales, Stockland, streamline operations and successfully implement transformative Microsoft technologies. He's passionate about using genuine agile practice, not agile, not scrum, but true agile. getting solutions rapidly into users' hands and iterating from there. You'll find links to his bio and socials in the show notes for this episode. James, welcome to the show.

00:00:58 James Diekman
Thanks for having me, Mark. Agile, I like it. It's a good terminology.

00:01:05 Mark Smith  
Not agile. Sounds Australian, like, sounds like an Australian word.

00:01:09 James Diekman 
I don't know where it came from, but it seems to be sticking. And yeah, we come across it far more than we should. But anyway, we can dive into that.

00:01:18 Mark Smith 
So my question for you, just because Agile's on the tail end of this. Is project methodology going to matter in the next five years?

00:01:28 James Diekman  
I think it's going to change and organizations like ourselves and partners need to adapt to that, particularly with what that's doing to, well, every industry and so many different roles, but in particular project management. And we're experimenting and doing things internally at the moment. So where you traditionally have someone like a project coordinator doing some of your heavy lifting, I think that, those sort of roles will get transformed into AI agents or chat bots, asking a question to produce a report or to get the weekly, the weekly actions or tasks that are later overdue, those sort of things. And really, like a project brain, if you will. So setting that up for each and every project where you have an AI agent that just is ingested with everything about that project, I think that'll transform quite a bit. The other thing that we started experimenting with recently is developing project-specific tools.
So again, where you may have a coordinator manage a whole bunch of things that are very specific to that project out of a spreadsheet, well, now we've got tools like Cursor and GitHub Copilot and Windsurf where People can go in and develop something that solves a problem for that particular project, and it's very isolated to that project. And we did one recently, and one of our developers showed us and showed the client, and it was like, I had this like, wow, this is really something, because that wasn't possible before, and you wouldn't invest the time in that either. It just wasn't worth investing the time in it. And there'll be ones that are repeatable for each project, and we can get into this because there's more to it, obviously, but Yeah, I see. I see with everything that's happening at the moment, there's a lot of opportunity and a lot of interesting things that are going to happen, particularly to project management. It'll get more efficient, leaner. And I think like organizations like ours, partners, implementers have a real opportunity to stand out and just go over and above without spending more effort than they're currently doing at the moment.

00:03:41 Mark Smith 
Will it become more accountable? As in what I mean by that from, over my career, I've had a lot of PMs work for me and with different methodologies on. And one of my pet peeves was often projects became all around the methodology rather than about delivering an outcome for a customer. And then the other my little pet peeve that I always had in the PM space was when you had a PM whose character couldn't hold people accountable. So in other words, let's say they're a people pleaser or an empath or something like that, right? And this would creep into the way they actually run projects, not positively.And I'd be like, you know, part of the whole methodology is holding people to their commitments, right, in a sprint. And I'm hoping that part of it will tap into the nuance of human bullshittery and Let's get real. Are we delivering on the velocity we're talking about? Are we, did you deliver the artifacts that you said you were going to deliver or didn't you? And, you know, are we clear that the impediment, let's ask Five Whys behind it, so let's understand what it really is, rather than some surface. I'm just hoping like this is the type of stuff we're going to be able to build into these solutions going forward.

00:05:00 James Diekman
Oh, for sure. We had a customer come to us recently, a government agency, and they're their investment corporation, if you will. And they wanted to build an AI solution where they ingest all that information in, plus other information from the public domain and from the market, and then ask it those tricky questions or help them prepare for those meetings. Because a lot of the time, they're getting fed the best version of how the investments are performing. And sometimes it's a bit of ******** it's a bit of fluff. And they can go into this solution and really interrogate it and say, Well, how is it actually performing, and what are some questions I can ask about these certain things that they said in an e-mail or as a Teams recording or a message, and pull out those insights, because... I guess, and the same applies for projects, right? And I think it's got applicability in the project management space as well. If you're delivering for a customer and again, you've got that project brain and maybe you've got, for big transformations or big capital works projects, the customer could have a similar solution where they're interrogating it constantly and they're getting some raw facts, really, that they're not getting told by project managers or other people within those teams to that point that you made before. Maybe they're not getting the full picture or the full story or getting a version of it.

00:06:28 Mark Smith
You know, we caught up a couple of months ago in Auckland at the AI, Microsoft AI tour. And I think it was the first time we might have met in person. And I've had like this a privilege of kind of seeing you go from working for somebody else to starting your own company over the years and then turning it into such a success story, with the various awards and stuff you've received and fastest growing companies, one of the fastest growing companies in Australia. Tell me a bit about what was your business vision that you set out to do when you founded your company and you know, what part of the market does it serve? What do you do?

00:07:07 James Diekman
So, let me take you back a bit because I, so prior to founding Accelerate Tech, and we used to be called Sentient Dynamics, and we rebranded a couple of years ago now. But prior to that, I joined a partner called 8020 Solutions. They were acquired by NCS a few years ago, but I was more or less a founding member of that organization. I was a first staff member, first employee. So I'd come from big enterprise organizations, government organizations, that's where my career was. I've always been in tech, probably more on the modern workplace side than the traditional sort of CRM side. And really got thrown into the deep end to the coalface of, well, what's actually required to grow and scale and run a business. And doing that whilst getting paid and consulting, and that was over a couple of years. And then So I've got a lot of experience there and a lot of confidence. And during that time as well, I joined your mentoring program. I think it was around COVID or just before, I think it was 2019 or thereabout.  And yeah, that was, I still have, think about that all the time and the impact that had going through that. Now, whilst I think a lot of people go through that program and they end up wanting to be MVPs or become MVPs in the Microsoft program, Whilst that would have been nice, my focus was more on the business side of it and how can I get my, I guess, my confidence and name out there a bit more. So a lot of that, what you did in that program helped. But what I saw happening back then, it was around the 2018, was Power Apps. Power Platform wasn't really a thing. There was Power Apps, there was Power BI. I think Power Automate or Flow, as it was called back then, was starting to emerge. We started getting into it and using it and solving problems with it. And I had this real kind of, **** moment. This is kind of like what happened with SharePoint, because I did a lot with SharePoint back in 2007, 2008. And I'm like, hang on a minute, I see what's happening here. And I can see what Microsoft's doing. I can see where this market's heading and low code. So it was whether, I guess the decision over a long period of time came up, do I do it with this company? Or do I go and do it myself? And I think there's a lot of boxes that I ticked at the time. Do I have experience in working for a startup and growing that and scaling it? Yes. Do I have networks and contacts within Microsoft and now customers and doing things like the mentoring program? Yep, tick, tick, tick. There was a lot of other things, obviously, that I didn't have figured out, but it gave me enough confidence to say, right, I'm going to do this. I've got enough sort of cash saved up to support myself to be able to do it. And then I did it. And then COVID came along and I was like, oh, hang on a minute. This may not be a great timing. So I pulled back a bit and the company that I work for, 8020, were fantastic. They supported me. I ended up working for them and then ramping down as I was starting to build up my company. So I was in a very unique position. I had a very good relationship with them to be able to do that. Because it took six months to land our first client. And we were very fortunate where we landed a New South Wales government client and sort of caught a falling pass or a falling catch with them and built them something pretty quickly. And then really started to work that Microsoft engine and started doing some more projects for that agency. And then, the name sort of gets out. And then, we're now looking at, well, at that time, it's like, okay, we've got a few more projects that have come in now. This is starting to grow quicker than I had anticipated and we need to start hiring people. And yeah, so that all sort of took off in 2020. And fast forward now, we very much started off with Power Apps and Power Platform. Dynamics 365 as well, but that wasn't sort of the thing that we let in. It was really fast, speed to market, get some value, get a working solution in front of you very quickly, get you guys on it, iterate, provide feedback. So that worked for, well, it still works, but that was sort of the model in the beginning. We then pivoted pretty early on into Azure. and more in Dynamics as well, and more recently, data and AI. So I'd say our Azure capability and development capability is bigger. just as big, I'd say bigger than Power Platform and Power Apps. So we do probably more in Azure and custom development now in Dynamics than we do in Power Apps and Power Platform. And I think that's, I don't know if that's where the market's heading, but there's a lot of, particularly for Microsoft, to see more focus on agents and AI and other solutions than Power Apps and Power Automate and those workloads. So yeah, and now we're a team of 23, I think, staff members, fully Australia-based, headquartered in Sydney, office in Melbourne, office in Brisbane, and a few remote staff, myself included. And customers, we're a New Zealand customer as well. So putting a bit of focus there and a lot of partnerships with other partners and other organizations.So yeah, it's been a really fun ride over the last five years. And it's challenges as well, as all businesses do. Of course. But here we are today and we predominantly serve government and enterprise. We don't really do much in the SMB space, so very much mid-market. And I'd say 90% of our customer base is government, lots of state and local government as well.

00:13:02 Mark Smith
What have you seen from an appetite point of view when it comes to AI in that PubSec space?

00:13:09 James Diekman
I think it was, well, Queensland government, who we've did some work for, moved very quickly a couple of years ago in their central customer and digital group to stand up and we help them build out the earlier versions of this. But an internal ChatGPT solution, if you will, because obviously they don't want them using ChatGPT and other large language models where they don't have the control and governance over. So they were very quick to do that and put some investment into that, but also build out the backbone so other agencies could leverage that central AI model capability. So they were quite quick. New South Wales took a bit longer and I think spent a bit more time in strategy and governance and frameworks and making sure it was safe to use and had some guardrails and things set up. But now what we're seeing is a pretty big appetite, particularly in New South Wales. It's just like in the last, let's say, year or so. We're speaking to many agents. We've delivered many solutions for agencies, and that seems to be accelerating, ramping up. So there's a lot of, I think, support, adoption. They're not huge multi-million dollar projects. They're still very much across the board, Some dipping their toes in, but some going a bit further, putting a whole leg in. And I think that will go further as the business value and the return on investment stacks up, because it's still a very new, particularly for government area. And government, you can't really get wrong either. It can't be hallucinating. It can't be putting out false information. It needs to be super tight, right? You know, financial services is obviously very similar. But they are, yeah, I mean, we're seeing a lot of innovation, a lot of support and a lot of progress across AI within state government agencies. Local government, not as much, but we're starting to do some projects there and see some stuff coming out of those spaces. So they're not laggards by any means.

00:15:28 Mark Smith
In highly regulated spaces, often as you've identified, there's a more, a higher attention and detail to the risk of implementing something like AI from a, as you said, hallucinations, flat out creating wrong answers, et cetera, all being used in unethical ways. You talked about frameworks there a bit. Are these frameworks that you've created yourself? Are you kind of looking at the ISO standards around AI in the space? Is there any like international thing that you're tying it to? How are you looking at it? Because, you know, we're only two years or three years into this rodeo. It's still very early days, but you know, I've already identified there's five frameworks that have come out from various bodies around the world. Where are you and where do you perceive government when they look at things like this to kind of help them build trust.

00:16:27 James Diekman
So there's sort of three frameworks that we use or governance models. There's a responsible AI framework that Microsoft has put out there, which is fair.

00:16:36 Mark Smith
RAI, yeah.

00:16:37 James Diekman
Which is pretty good. There's also an Australian Federal Government AI Assurance Framework. I don't know if that's the exact name. But then there's the New South Wales Government AI Assurance Framework. So that is one that we it's mandated for every solution that is being AI solution that is being developed in New South Wales government has to go through an AI assurance framework assessment. Yep. Now, that is, it's iterated already over a couple of versions, and that's changing. It's quite a process, I must say, going through it. We've gone through it a few times now. So, yeah, it's very... I think they've started off very carefully and very diligently in building that out as something that's best practice.Now, whether they've derived that from a global standard, I'm not entirely sure. I haven't really looked at that global level in enough detail to sort of see those standards, but I'd imagine there would be alignment there. But it's, yeah, I mean, it uncovers a lot of things like who are your main users or indirect users of this solution? Could it lead to harmful outcomes or negative outcomes for the community, things like that. There's a lot of things I think when you're building out these solutions, you just don't take into account. So you go, I think the questions and the guidance that they put out there is quite good because it gets you and the business who are developing it thinking, so okay, if I'm asking this solution about a particular use case and it's helping me make a decision, is that decision, like how is that going to affect things downstream, whether internally in the agency or even the public, depending on the use case. So it gets you thinking about those things, which is really good in identifying those risks. And then those would, well, those flow into UAT and testing when this thing goes live. So there's a lot of focus that we put on with user acceptance testing. And that ultimately falls in the hands of the business. We provide a lot of guidance and I guess project management of that. But they're the ones that ultimately have to put it through its paces and work out if it is outputting the right information or not. Because at the end of the day, we're dealing with a large language model and it is probabilistic by nature. So it's you can ask the same question multiple times and you'll get different answers. You may not get the exact same answer every single time.

00:19:04 Mark Smith
Yeah.

00:19:04 James Diekman
So yeah, spending time on that, red teaming, things like that, very, very important. We've done, I think, 3 production solutions and one out there in the public domain. So unlike a lot of AI solutions I mentioned before, there's a lot of proof of concepts, there's a lot of MVPs, there's a lot of pilots, but a lot of those never see the light of day.

00:19:25 Mark Smith
Yeah.

00:19:26 James Diekman
So going through the process of taking those through to production is, yeah, there's a whole another thing you've got to contend with doing that and going through the right stage gates and governance committees and privacy committees, enterprise architecture working groups, a lot of hard questions asked. Yeah.

00:19:45 Mark Smith
Are you noticing the legal profession needing to enter the frame on projects? And, you know, my observation for my 30-year career in tech, is that we've never been in a situation where you've needed to bring a lawyer in because of what you're doing in a tech project. And this is something that we're starting to observe across Europe and the US particularly, is the need to have legal cover. So for example, you mentioned red teaming there. Often the red team need to test whether a AI system can be jailbroken. To carry out those tests, is a very fine line to carrying out a legal activity to actually do those tests. And therefore, in that scenario, do you have legal cover for doing those type of tests? What are you seeing in the Australian market?

00:20:42 James Diekman
So, look, I think... I'm trying to remember back the exact back of the project team is I wasn't super involved with that one, but I'm fairly certain they had legal, someone from legal or compliance as part of that project. Like it went through, I mean, the solution was developed within, and it was, I guess, from a technology standpoint, ready within five months, let's say, I think, from memory. There was another four months of compliance and stage gates and readiness, there was a bit outside our purview that they had to go through. However, we are working with two, I guess you could say legal in government, use cases and teams at the moment. One of them will go to production. One of them is still very much heading into pilot phase, but it's moving along. So those teams are actually starting to use these technologies themselves to enhance their workflows, to all the things that I guess AI and automation help with, right? Reduce bottlenecks, improve efficiency, give time back to the team to focus on higher value activities. All those things remain true. But yeah, it's very interesting to see that they're using these technologies as well. And I think part of it is that a lot of these legal professionals even in government, they're subscribing to their industry news and events. And it's coming up quite a lot in those spaces. And there's industry specific solutions like LexisNexis and others, where they're putting a huge focus on AI and they're showing demos and the art of the possible and all that sort of stuff. So they're seeing these things and sometimes I can't use them because either they're not in, they're not sovereign to Australia and the data's not unsure and all that sort of stuff. But they're seeing what's possible. And they're getting ideas and they're like, well, hang on a minute, can't we do that internally? And then, they're obviously doing their due diligence to figure out if they can or not. So yeah, we're seeing legal teams use these solutions. They're wanting to build these solutions, which is very interesting.

00:22:53 Mark Smith
So one of the things we're coming across is legal teams that are setting up AI practices. In other words, they're understanding the new legislation that's coming out. And just to give you an idea, I was in a conversation with someone from Microsoft, internally in Microsoft, they're monitoring over 200 items of legislation around the world in regards to AI. There's a feeling that we're going to, even though, for example, in the US with the Trump administration, them saying, hey, we're not going to be as restrictive, there's a lot of restriction coming in, right, on how, so of responsible use of AI. And so we're seeing that, yeah, these legal practices, one firm we're working with in London, has about 1000 lawyers in their team, but they have now spun up an AI legal advisory practice. And they see that there's going to be a future, and they're already getting requests in this area, is companies that implement AI workloads need to kind of get a legal sign-off from people that know the law in regards to the particular technology. So in other words, they're not becoming technologists from an AI perspective, They're becoming AI legal experts because it's just like in the tech industry, we're seeing this whole generative AI industry blow up. They're seeing it from a legal perspective that those that are deciding to are building practices around that offering.

00:24:20 James Diekman
It's, look, it's definitely going to be required. This, like, I guess any technology, it's going faster.I think particularly AI, it's going far quicker than certainly I've thought it would, I guess over the last six months, it's kind of blowing my mind. You see a lot of that kind of hype online, but there are genuinely things where it's just like, oh my God, this is crazy. And government legislation policy has never really kept up with technological change, but certainly AI and the pace of what's happening is really outstripping that. So what you mentioned before about AI practices and people who now need to become skilled and knowledgeable in what is AI legislation and what does it mean from a legal standpoint, yeah, I think that's going to be super important. And I think we'll see as more workloads and activities and tasks and things that humans would have done before get handed off to AI, as small or as big as it may be, that's going to, I guess, increase the potential of things going wrong and leading to liability, potentially adverse outcomes. And then, you'll see lawsuits and things come out. And yeah, I dare say we'll see some of those things and kind of first of its kind, you know, company getting sued for an AI agent that gave out bad advice or did something incorrectly. And I don't even know this, but are insurance companies keeping up with this as well?

00:26:04 Mark Smith
Yeah.

00:26:05 James Diekman
From their policies and those sort of things.

00:26:07 Mark Smith
I just saw a article out of the US, one of the top insurers over there, so Fortune 100, and they have already identified over 1,000 workloads internally that they are building out AI-based solutions for.You mentioned red teaming earlier. An AI practice like yours and we're obviously going to see a lot more folks that have been in the Microsoft space build data and AI businesses. Do you think that the area of red team is an in-house team that you would develop, or will you, do you think that there will be companies that are just red team specialists that you will, because, you know, red teaming is not one and done, right? It's A ongoing, iterative process, because every time your LLM gets updated, that's behind something, it's going to change the way, you know, whatever AI solution you build potentially acts, been due to its non-deterministic nature. How much do you see that's a responsibility for a company to build that new muscle that new team, if you like, within them, as we would have built, our project delivery teams, our analysis teams, our data specialist teams and stuff in the past.

00:27:27 James Diekman
So look, I think it's going to depend on the solution, but we, so we've been building those skills in-house, and I think the customer needs to have some input into that, particularly with the solutions that we develop, because Like any solution, we're only going to know what we're going to know, particularly if it's over a data source that we are not experts in or a business function that we're not experts in. We're technologists, we're not the subject matter experts. So there needs to be representation from the business and our team now, educating them on what they need to do to carry out an effective test, you know, that can be taught. And it's no different to, well, it's a form of user acceptance testing, really. Correct. I think the cybersecurity firms will start to do more of this because solutions that we develop, we never do pen testing because it's a conflict of interest. We don't pen test.

00:28:26 James Diekman
I mean, we do best practice and we test things obviously before they go out, absolutely. But for things that go out that are critical and go into the public domain, our advice to customers is, look, go and engage if you have a preferred cybersecurity firm or pen testing firm, get them to come in and do it. Because again, I think it's a conflict of interest. You can't mark your own homework. Yeah, exactly. Yeah, marking your own homework. But doing it ourselves and having the business involved in doing it is, I think it's going to cover the majority of use cases and scenarios. But for that extra, I think, layer of due diligence, yeah, I think there's a real opportunity there for cybersecurity firms to, and we're not a cybersecurity firm, obviously, but to invest in those areas because that's going to become critical. There's just going to be a tsunami of these solutions going out across all sectors, good ones and bad ones.

00:29:29 Mark Smith
My final question is, how does James use AI? Like how do you use it? How are you specifically, how's it become part of your day?

00:29:39 James Diekman
Well, I mean, I'm using it every single day from, you know, we've got Copilot, we've got a number of users who use Copilot. We use Cursor and GitHub Copilot as well. So we're using it for development. I think one of the highest value areas because I come from a technical background and I traditionally, I was doing a lot of the doing, but as time has gone on, I've needed to obviously step up and look more on the strategy side and obviously run the company. So the things where it's helping me do high value activities is like tenders, proposals, sales, marketing as well, particularly with the the new image generation that's come out of GPT. It's pretty incredible for spinning up an image, you know, to add to a LinkedIn post or to a blog article or something like that. I'm getting it to rewrite things that I've, you know, put out there and it's doing it in my kind of tone and style, which is good.

00:30:41 James Diekman
One of the, I think, the highest value things is actually spinning up a user interface to show a customer. even before you have gone into a deal or even for its attender. It's like, this is what it could look like. Now, to do that before, you wouldn't even think of investing the effort into doing that. But if you can quickly, with Cursor or GitHub Copilot, prompt it to say, well, this is what the customer's looking for. And again, you don't need to share any sensitive data. You just need to give it a fairly well-structured, maybe not even well-structured. You just need to give it what they're sort of asking for and the problem it solves. and some parameters. And within one or two shots, it can come up with something pretty darn good. And I've been on calls where I've done it in the background while someone on the call was like, is this kind of what you're thinking? And oh my God, that's yeah, that's exactly like, that's so cool. I couldn't have this and this. Like, yeah, absolutely. You know, you can, our developers are using that technology quite a bit. I mean, I know how to use it and how little bits fit together, but I'm not a software engineer. But our software engineers, they're like, yeah, it's really good. sometimes I make some silly mistakes, but we know that because we're software developers. But yeah, that's somewhere where it's, again, it's just like, it's that value add that you can give to the customer.

00:32:09 Mark Smith
And the speed, right?

00:32:10 James Diekman
And the speed. And before you, wouldn't bother doing that unless it was something that was of significant importance or value. And so that's how I'm using it. And we're rolling out our internal AI working group and strategy at the moment. And yeah, we are going to be transforming many things internally and looking where AI can assist. So really inserting it into existing processes and workflows that we've got. And I think that's the right way to do it, what I'm seeing at the moment. Don't just go and build a chatbot for everything. Work out where, obviously, how does your organization run, essentially? Map that out if you don't, you probably should have an idea of how it runs. But then work out, okay, where can this type of technology insert itself into these discrete processes, do that on a 90-day cadence and build a backlog of those things and just continually chip away at it, basically. Because this, as I said before, this technology and AI, it's changing so rapidly.These things aren't set and forget. I think you need a dedicated, not dedicated, but you need a team or a working group within an organization constantly looking at this and prioritizing capabilities and then rapidly building them in to your organization. Because if you don't, it's, I mean, from what I'm saying, I'm not fear mongering, but I think companies will get left behind. There'll be others that do it and they're more efficient. And they're the ones that are going to excel in this age.

00:33:53 Mark Smith
Hey, thanks for listening. I'm your host, Business Application MVP Mark Smith, otherwise known as the nz365guy. If there's a guest you'd like to see on the show, please message me on LinkedIn. If you want to be a supporter of the show, please check out buymeacoffee.com forward slash nz365guy. Stay safe out there and shoot for the starts.