Hi everyone,
We know that you are really looking forward to starting summer vacation and getting a tan (depending on where you live). However, we have something for you that is independent of your location.
With our fantastic speakers from the previous season, we would like to invite you to a panel debate where we will discuss new aspects of implementing artificial intelligence. In the past we have focused on the how and why of AI, but how do you actually sell your AI product?
Who are the customers?
Startups or larger enterprises?
Is it possible to sell pre-trained machine learning models or do we need to develop customized models?
In the end we need to show that we are adding value with our AI products. How can we demonstrate added value?
Join us for an amazing evening together with our wonderful speakers to discuss these questions and other things that you might have on your mind.
Agenda
17:00 - 17:05 Introduction
17:05 - 17:50 Discussion Panel
17:50 - 18:00 Summary
Video Transcript
0:29
Hi, good morning, good afternoon and good evening everyone, depending on where you are joining us from today.
0:39
I would like to start by welcoming my two amazing friends, Gosia and Håkon, who are going to help me conduct this amazing panel talk today.
0:49
And before we get started, I just want to give a quick introduction to what we are going to do today, because throughout the past year we have talked about the development of AI solutions, the underlying mathematics and advanced analytics, data science, machine learning pipelines, and real-life implementations as well.
1:09
And hopefully you have learned a lot together with us. And before we get started on today's topic, which is going to be how to sell our AI products,
1:19
please, Håkon and Gosia, introduce AI42 to us. Yes, so we would just like to give a quick introduction here to AI42. So basically,
1:42
the motivation for starting AI42 comes from the recognition that there is no
1:48
really good starting material for getting into the AI and machine learning field.
1:53
So with AI42, we are a strong team consisting of three Microsoft AI MVPs.
1:58
And we will strive to provide you with a valuable series of lectures that will help you to jumpstart your career in data science and artificial intelligence
2:07
And our aim is to provide you with the necessary know-how so that we will help you to land your dream job
2:13
Just as long as it's related to the fields of data science or machine learning
2:18
And the concept is quite simple. It involves professionals from all around the globe that will explain to you the underlying mathematics, the statistics, probability calculations, data science and also machine learning techniques
2:32
And don't worry, because we will guide you through all of this. So all you have to do is just to follow our channel and enjoy the content every second week
2:40
It will be filled with real-life cases and also expert experiences,
2:43
and, you know, we have all started from scratch, and we are very happy to help you build it up from
2:50
there. You can always stop and rewind the videos or ask for clarification in the comment
2:55
section, and we hope to assist you on this wonderful journey and to have you as a speaker
3:00
one day. We believe that by creating cross-collaborations with other organizations we can
3:07
give you the best opportunities to broaden your network in the AI and data science communities.
3:11
With the combination of our offered services, we also want to support less fortunate people
3:16
and organizations that are not that recognized yet, even though they deserve it.
3:23
Our organization is sponsored by Microsoft Admise, and we are humbled by all the support we get from our contributors as well.
3:31
Thank you for all the beautiful graphic content, and Mea Mari for the cool intro music before our event.
3:38
We are also in close collaboration with C# Corner, a global AI community, so our
3:44
lectures are going to be available on their YouTube channel in addition to our own media,
3:50
and Nicoletov creates and reviews all our text content and news on our website.
3:56
During our sessions you can also follow us on Facebook, Instagram and Twitter to become part
4:03
of a growing community. We share knowledge and fun, and you'll find all the information that will bring you
4:09
to an advanced level in the field of artificial intelligence and data science.
4:15
You can also watch our recorded sessions on our YouTube channel and find our upcoming sessions
4:20
on our Meetup page. Yes, and we also have a code of conduct.
4:26
So our code of conduct outlines the expectations that we have for participation in our community
4:32
as well as the steps for reporting unacceptable behavior. We are committed to providing a welcoming and inspiring community for everyone.
4:41
So be friendly and patient, be welcoming, and be respectful of each other.
4:48
So with that said, let's get back to the studio. Welcome back, everyone. As you might already know, today we are going to learn how to sell our AI products, and that is a hard one for many of us, but soon we're going to hear our experts' experiences.
5:17
So this afternoon we are lucky to have Leon Gordon, current Microsoft Data Platform MVP and expert in data analytics. Hi, Leon!
5:27
Hi everyone, thank you very much for having me today. It's actually a pleasure to be here with you
5:34
It's great to be here. Yeah, we also have Leila Atetti, Microsoft AI MVP and author of three very useful books in the field.
5:44
Hello, Leila. Hello, thanks so much for having me, and great to be in this talk.
5:50
Thank you. And we also have Johan Bratos, who is a data platform MVP and also a principal solution architect
6:03
Welcome, Johan. Hello, everyone. Thank you for having me. And Priyanka Shah, AI MVP and IoT Director at Avanade.
6:12
I'm very happy to see you here, Priyanka. Welcome. Hi, thank you so much. I'm so happy to be a part of this wonderful panel discussion. Looking forward to it
6:22
And last but not least, Peter Gallagher, Microsoft Azure MVP and IT consultant
6:29
Hi, thanks very much for having me along. Hi, how's it going?
6:34
It's really nice to see you all here together, and we could hear a lot of great achievements from you
6:42
guys, so congratulations on all that. Some of you might seem familiar to our audience, and that
6:48
is not a coincidence, because you were here with us at AI42 during the last few months, and you were
6:54
teaching us a lot about Microsoft AI and the Azure data platform. Thank you again for being here
7:00
with us and sharing your insights, this time on how to sell AI products.
7:06
So shall we get started? Can't wait! So who is going first? We have some questions prepared, and
7:21
Håkon, would you like to take the first one? Yep, so our first question relates to what kind of AI solutions your clients are looking for most of the time. Are they asking for just a prototype, or do they ask more for a migration or an improvement of an already existing solution?
7:45
What is your experience with that? I can go for that. Yes, sure, Leila.
7:54
So yes, I think most of the time it's prototypes these days, because people are getting their data ready.
8:01
I'm coming from a data background, so most of my customers use their data in Power BI, in SQL.
8:08
So they have their own data ready, and they kind of want to see how they can apply machine learning to it.
8:15
So for me, most of them go for a prototype to see how they can use AI and ML.
8:24
Yeah, true. I'd agree with that. So it's similar for the majority of our clients.
8:32
We see that they have their data. They have some thoughts in mind on how they want to utilize that data and how they can benefit from the value of it
8:41
Particularly in our case, for machine learning. I was literally just speaking to a client before this call about how they can start to use machine learning in telephony to predict leads and send those directly via API to their dialler software.
8:58
So again, very much a prototype, proof of concept before moving forward with hopefully a finalized solution
9:05
Yeah. Yep. So it's almost similar for the Southeast Asia region as well,
9:11
more so because, you know, companies are afraid to invest in AI for the very fact that we cannot
9:18
commit to any accuracy beforehand. I don't go out and say that for your computer vision model I'm
9:24
promising you 90% accuracy, because there will be data drift, there will be, you know, certain
9:30
incorrect predictions or recalls on your inference scoring. Which is why most of
9:36
the engagements are either advisory-driven, like, you know, we go to the client's place, we help them
9:42
discover what sort of AI they can do, and then, you know, get some traction, build a sort of MVP PoC.
9:49
Sometimes the PoCs are paid, like, you know, the current one which we are engaging in for AI for
9:54
sustainability with the government here. It's like a paid engagement for three, four months,
9:59
and then it will be rolled out to other departments for sustainability, for energy analytics.
10:07
But more often than not, people are a bit hesitant to directly jump into, let's say, a digital twin implementation or, let's say, you know, a full-fledged knowledge mining document search engine.
10:20
So they would rather see it as a prototype, get the results, and then try to roll it out on a larger scale.
10:29
There is one thing though, and that is that yes, also in the Nordics, you still see a lot of PoCs,
10:35
or they are curious, but they are a bit more mature in that they want to implement their AI.
10:42
So they want to build machine learning models into their data platforms.
10:47
But of course, they are not sure. Most of them don't have anything existing right now. Maybe they
10:51
have no statistical model running, but there are clients who come up and say: yes, we have
10:58
something, we want to migrate it to the cloud, for instance. But mostly it is: we know that we want AI,
11:04
we know that we need AI, how can we implement it? And then we have some use cases we want to do a PoC on.
11:12
So maybe it's the next step, I'm not sure, but it's still not a lot of migrations. It's mostly
11:17
greenfield. Yeah. Yeah, I'm seeing mostly PoC work, but certainly based
11:27
on ideas that the companies have already got, based on the data
11:31
they've got or the direction they want to go. And sometimes that's not even a fully formed
11:36
idea and then they need folks like us to come along and
11:39
help guide them where they need to be going. But yeah, that varies. Sometimes they've got a really
11:45
solid thing they want to do with their data, and sometimes they just know that they want to do something with AI. They've got all of this data
11:52
and they just want to be able to leverage it in some way. It's funny to see the difference
11:56
in the respective clients, even in where they want to go and how they want to take
12:04
their data. Yeah, but it's heartening to see that, you know, people do want a lot of insights and analytics, so
12:13
it's not vanilla solutions they want anymore. So even if you have a, you know, customer-related,
12:20
marketing-related implementation, they will invariably come and tell you: I want
12:24
insights on top of that, you know, I want a customer 360 view in my system. So that gives us a chance
12:31
to have a bit of AI engagement in your normal, you know, surround offerings as well. So in my
12:38
company, whenever, you know, we sell D365 Biz Apps, AI rides on top of that in the form of Customer
12:44
Insights or in the form of custom AI modeling, uh, on top of D365 RS or something of that sort.
12:51
But yeah, I mean, you know, the purse strings are not yet so open here in
12:57
Singapore to invest hugely in AI. Like, I have not seen a full-fledged, you know, couple-of-million-
13:04
dollar project straight away, like, you know, you see our credentials and you hand it over to
13:09
us. No, I mean, you have to build something for them, at least a small system, and then they
13:15
would be willing to take it forward. Yeah, I think that's a very good point, Priyanka, and what we see
13:20
a lot over here in the UK as well, at least from my experience, is that a lot of customers want
13:25
to be involved, in particular with machine learning, but their data just isn't mature enough. Their data
13:30
quality isn't up to spec to be able to invest in some of these newer technologies. So we have to
13:35
then go through a process, and they have to understand that whilst they want, again,
13:40
machine learning, they're just not mature enough for it yet. Yeah, sure. Thank you for those answers,
13:50
and it's very interesting that you mention that it's mostly PoC work. But can these
13:56
PoCs be turned into a real project in most cases, or do they sometimes stay as
14:03
a PoC and maybe get picked up later on? Yeah, I've been working with one client where
14:12
we've implemented a PoC to compete against a large-scale analytical or prediction model
14:19
that they had purchased from somebody else. And then that evolved into a larger-scale migration
14:27
to a better model. This was for airports. And then they've taken that again,
14:33
and they've gone out actually for a new bid for replacing everything,
14:38
with the custom model that we had made for them as well as the third-party ready-made one they had.
14:44
So it's getting there. That's good to hear. Are we ready for the next question, then?
14:54
Yes. How often can you use pre-trained models, or is it more likely that a solution needs to be built from scratch? Yeah, from my experience so far, I think that we generally see solutions from scratch.
15:12
It's very much specific: every customer has been unique, every client has been unique.
15:18
We haven't been able to really have an off-the-shelf solution. I've done some work previously, particularly in the housing market with prediction against property prices, which we have tried to lift and shift to customers as well
15:31
But again, there were elements of uniqueness which meant that we were better off just going forward with new solutions as opposed to trying to lift and shift. Yeah.
15:41
Gemini has actually evolved, or made, a neural network to create synthetic datasets, which
15:50
is then being sold as a prepackaged model, which is quite nice. Even though it still
15:57
needs training on the new data sources it adds, it's still a prepackaged model that you can sell, and it actually gets traction in the
16:05
market because there is not a lot of packaged shelfware in that aspect. Yeah.
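The synthetic-data idea mentioned here can be sketched in a few lines of Python. This is a deliberately minimal illustration, not the neural-network product described above; every name in it (`make_synthetic_rows`, the age/income/churn schema) is invented for the example, and a real generator would fit a model such as a GAN or a copula to production data instead of sampling hand-picked distributions.

```python
import random

def make_synthetic_rows(n, seed=42):
    """Toy synthetic-data generator: sample hand-picked distributions
    with one correlated label, seeded so output is reproducible."""
    rng = random.Random(seed)
    rows = []
    for _ in range(n):
        age = rng.randint(18, 80)
        income = round(rng.gauss(30_000 + 400 * age, 8_000), 2)
        churned = rng.random() < (0.05 + 0.003 * age)  # age-correlated label
        rows.append({"age": age, "income": income, "churned": churned})
    return rows

sample = make_synthetic_rows(1000)
```

Because the generator is seeded, two calls with the same arguments produce identical rows, which is what lets a synthetic dataset ship as a reproducible artifact.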
16:14
So the same. I think that these days most of the pre-built packages are
16:21
kind of more trustworthy. So I can say before, more than 70 percent of projects
16:28
were started from scratch, but now with most of the clients, I even heard from other colleagues that
16:35
they're using some of the pre-built packages from different kinds of vendors.
16:41
And that works fine even for our projects. I see sometimes, based on their data and the model, the pre-built one really did a good job, actually.
16:50
So I can say now it's half-half for our projects: half pre-built, and half we actually need to start from scratch.
16:59
Right. Yeah. So normally for custom vision sorts of projects, right, image processing, video analytics, we do see
17:07
that a lot of pre-built models are used and leveraged. But when it comes to NLP, as I said, right,
17:14
there's a lot of traction right now for knowledge mining, graph search kinds of use cases. So yeah, for
17:19
those, because it's more, you know, domain-specific. So if you are working for an oil and natural gas
17:25
company versus manufacturing, you know, a perfume manufacturing company or an energy-intensive
17:33
company, then for those you can't even use the same accelerators which you have built. So let's
17:39
say if I have an accelerator built on NLP for one client, that doesn't really hold true for another
17:45
client, because you need to annotate the data, you need to, you know, do all of the data analysis again
17:51
on the client-specific data. But yeah, I mean, related to, as you know, your mention
17:58
of deep learning or neural networks, deep DNNs, right, yeah, we are leveraging
18:06
some pre-packaged, pre-trained models, but again on a case-to-case basis. I mean, based on what
18:14
use cases you are implementing, I would say it really depends. Yes, it's an interesting question, actually, because we're all standing on the shoulders
18:26
of giants with this anyway. It depends on how far down you want to go as to whether or not you
18:31
call it from scratch or not. But certainly, the work that I've been doing, we train on top
18:38
of pre-built models, essentially. And it's an interesting conundrum when you do something
18:45
like that, because who owns the model that you're standing on top of? And I noticed just the other
18:50
day the UK government is on about changing the copyright laws to make this particular problem
18:55
easier, because the datasets, certainly if you're going to use them commercially, you nearly
19:00
always have to pay some sort of fee, a royalty fee or something, to use that data.
19:05
But I think UK law is going to change to make sure that that's not going to be the case,
19:10
which is an interesting concept, an interesting side of things commercially, when you start
19:15
thinking about who owns this data. But it's interesting when you think about how you use
19:18
your data as well, and I don't know what everybody else's experience of that is, but even the clients
19:25
you're working for, they may not own the data, in the respect of being able to use it for training models.
19:31
So I don't know what others are seeing as a result of that, and do our clients even
19:36
know that they can get into copyright claim issues if they use stuff that isn't actually theirs?
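The "train on top of pre-built models" pattern described here can be illustrated with a deliberately tiny sketch: a frozen feature extractor standing in for the pre-trained base, with only a small head fitted on the client's data. Everything below (`frozen_base`, `train_head`, the toy dataset) is invented for the example and is not any specific framework's API; in practice the frozen base would be a published network whose licensing terms raise exactly the ownership questions discussed above.

```python
def frozen_base(x):
    """Stand-in for a pre-trained feature extractor: its 'weights'
    are fixed and never updated during client-side training."""
    return [x, x * x]

def train_head(data, lr=0.01, epochs=500):
    """Fit only the small head (w, b) on the client's data
    by per-sample gradient descent on squared error."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            feats = frozen_base(x)
            err = sum(wi * fi for wi, fi in zip(w, feats)) + b - y
            w = [wi - lr * err * fi for wi, fi in zip(w, feats)]
            b -= lr * err
    return w, b

# Toy "client data" following y = 2x^2 + 1, which the head can
# express exactly on top of the frozen features [x, x^2].
data = [(k / 10, 2 * (k / 10) ** 2 + 1) for k in range(-10, 11)]
w, b = train_head(data)
```

The point of the structure is that only `w` and `b` are the consultant's deliverable; the base model, and whatever data trained it, belong to someone else.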
19:45
Right, yes. I can see that many clients are doing so, that they get the data sort of from a
19:53
third party, because they are storing their data, they are doing some transformation on it. So when
19:59
we get it back through the client, this data is so over-transformed and over-prepared that it's a
20:06
nightmare to figure out which feature is what, how it should actually look, and what
20:12
the purpose of that column is, and to remove the bias as well. Yeah, that's a difficult problem,
20:19
but that's a completely separate question, that one. Yes. And every time we start a project, we
20:29
We have this pre-sales period or proposal writing and all this where we try to provide value to the customers
20:38
So how do we show the value to the customer when we still don't have access to the data?
20:44
We don't have a signed contract. We only sort of have access to the description of the project,
20:49
and we can use our own experience and knowledge to put together a valuable proposal for our clients.
20:56
How would you get started in this situation? I think there's a combination of elements, at least for me, from a client perspective.
21:04
And it's making sure that the client understands the value of AI, particularly for us, machine learning.
21:12
And we focus on specific business problems that are going to have a high return on investment
21:17
By solving those problems, we find that our clients then are able to go and wave a flag
21:22
so to speak, across the business. It enables us to then get buy-in for further projects and to solve further issues within
21:29
an organization. So for me, the key is to make sure the client understands the value behind the AI, and
21:36
also to solve a specific business problem with the solution. And typically it depends on how that
21:48
request comes in to you, whether it's in the form of an RFP
21:52
or in the form of a specific use case, because a lot of times they've had consultants in,
21:58
or they've had people coming in and describing what they need or they see a pain point
22:02
And in that case, you can say, okay, I don't have access to your data, but I know a bit about your business
22:07
I know what line of business the use case and the problem are in, and we have done this for other clients.
22:12
We have done that for that client. So using references as well as your knowledge
22:17
about their business and their problem, you can go a long way in proving that you can get value. Otherwise
22:24
Yeah, it's all about presenting the vision, right? I mean, you present a vision to the client that this is possible, this is the art of the possible. Which we basically, you know, call a customer discovery journey, where you engage with the customer and then you try to sort of, you know, show them the beautiful possibilities that AI can offer.
22:49
Sometimes, you know, even the customer is not aware of it. I mean, maybe, you know, they just want insights,
22:55
but on top of that, what will happen if you have predictive forecasting? What will happen if you,
23:00
you know, over and above insights, also have some sort of IoT there, a
23:06
digital twin there? So it's like an iterative journey even for the client, and as you engage
23:12
more and as you showcase the customer vision more i think uh even even for us we discover a lot of
23:20
potential possibilities which we can engage with on a bigger scale with the client. So typically
23:26
that happens, right? We go for one solution and we end up, you know, suggesting they implement
23:34
maybe a completely different scenario altogether, or maybe a joint one: probably a data
23:40
platform modernization first. On top of that, you have insights. On top of that, you will have AI/ML.
23:47
So, yeah. So, it's all about presenting that vision to the client
23:52
First the vision, and then you hit the horrible data, and then you have the data passion project
23:57
Yeah. That is true as well. You're not wrong at all. And I think the point you made there is really key as well, Johan,
24:05
in terms of where you're entering the customer journey. So our idea is very skewed,
24:10
because we come in as a solution for a pain point. But you really pointed out that it could be
24:15
like you mentioned, coming in with already a solution in place that you have to go and develop
24:21
as opposed to actually walking the customer through the journey and identifying the right solutions for them
24:30
Yeah, I think I'd like to hear the same thing. Yeah, sorry, sorry
24:34
Yeah, sorry. No, I was just saying: does it happen sometimes that we end up overselling, and then the client
24:40
expects a lot of miracles from us? Yeah, that actually feeds into what I was going to say, to a degree.
24:49
I agree with Leon and Johan as well. It depends on where you're coming in. And it depends on the interactions you've had with that particular client as well as
24:56
to whether or not you actually produce something for them. Showing them something, I find, is the most powerful way to show them what it is that's possible.
25:04
And even if you've not got the data, you can often create a POC quite quickly that will
25:08
at least get them off a blank sheet and conceptualize what it is that you're trying to
25:13
demonstrate to them. But obviously, you know, if that turns out to be a lot of smoke and mirrors
25:21
then you can overpromise, just like Priyanka was saying, and they think, you've done it
25:25
it's finished, can we buy it? And it's, well, no, because that's just a demo. But yeah, it's interesting. It depends on what they give you in the description as to whether
25:35
you can give them something which is representative of the problem you're going to be solving. So, yeah, I can say the same thing.
25:43
So for me, because most of the people coming for AI and ML, we already work with their data,
25:50
so we know what data is in there. That's kind of a good advantage, because we can go through it.
25:56
But for new customers, as others mentioned, yep, definitely we look at the best practices in that industry.
26:02
So, for example, if they come from banking or from finance or health,
26:10
definitely we look at the combination of what we've done and what others have done in that area,
26:17
and that's good for them, because they can see: oh, this is possible, so we can have it.
26:22
So it's, as others mentioned, a combination of approaches: what's possible, what we can do
26:29
that they don't know is possible, and also the things that other people have
26:35
done, even if we haven't done it before in that sector. Yes, so now we've talked a little bit about the problem of how you can actually get into a client
26:51
and show that you can provide value. But then we have a problem sort of on the opposite side:
26:56
how can we actually make a successful exit from a client? How can we make sure that the organization you've been working with
27:04
can continue working with the solution that you have provided as a consultant?
27:10
And how can you ensure that the client has the right knowledge,
27:13
or the critical knowledge of how the solution actually works, and also has the right type of people on the team to take over the solution?
27:26
That is a challenge. It depends. Some clients actually have people on board already, and they're
27:33
the ones writing the proposal, or they are interviewing you for the project, and then
27:38
it's easy. It's just a matter of transferring, or documenting and describing, what you're delivering.
27:46
But if you need to train them as well, then it's harder, and sometimes you even have to
27:50
tell them what kind of people they need to hire. So typically, coming in to a greenfield customer
27:58
like that, you will have to come in with a team and explain to them: okay, we will do this for you.
28:04
You need these kinds of people. If you have competence in-house, we can train them.
28:10
And then we can do shadow training as well. So we work in tandem with them
28:16
So that's one way to do it. But of course, if they don't have any kind of digital maturity, it will be hard, but then your AI project would probably fail anyway.
28:24
Yeah, I agree. I think one of the key things you mentioned there is potentially identifying
28:31
the right client as well. So maybe the client might not be the right fit. It's okay to
28:37
say we want all the clients in the world, which is great, an ideal scenario. But sometimes
28:42
we need to be able to say: okay, actually, this client isn't mature enough, or isn't in a position
28:46
for us to go in and deliver a solution. Make sure the client is aware of the reasons
28:51
why that is the case, and then understands that we can deliver for
28:57
them, but they need to put this in place first. And don't be afraid of walking away from the
29:02
client at that point. Right. This is a very practical difficulty which we face, right?
29:09
We are seldom able to ring-fence the scope, to ring-fence it at the correct point in time, right?
29:16
Invariably, what happens, especially, you know, with NLP, because as I said, right, we work a lot in knowledge mining and NLP:
29:24
we hand over everything to the client. We deploy the solution.
29:29
It is all working great. We provide those two months of guarantee, warranty, training, everything.
29:34
Six months down the line, the client tells me: hey, my search engine is not working, or my accuracy is only 20, 30 percent.
29:41
So there needs to be that knowledge also about data drift, about how to retrain your model, and they want everything automated.
29:50
So even if you have a human-out-of-the-loop pipeline, you still need that re-ranking of results, deciding which results should
29:59
be more relevant. All that knowledge, I don't think, you know, we can give in one shot to the
30:05
client. So typically what we do is have a six-month or eight-month clause, depending upon the
30:12
complexity of the model developed, and based on that we slowly roll off from the project. So we have
30:18
some sort of shadowing, some sort of annual maintenance and support services, L1/L2 support,
30:25
that sort of thing. And finally, if the client agrees that they are able to handle it, that they have a technical
30:30
team on their side to mitigate these kinds of things, then we, you know, slowly
30:37
sort of roll off from the project. But otherwise it is very difficult, because, you know,
30:42
like my last project: from my previous company, the client is still
30:46
logging tickets in my name, saying: hey, sometimes the accuracy
30:53
decreases for the hedge fund model, sometimes it gives me errors, sometimes the scoring
30:59
is very low. So yeah, I mean, that is, again, you know... in the Southeast Asia region the maturity
31:05
of the clients is low as compared to, probably, the developed markets. So in growth markets the
31:11
propensity for clients to understand the project is very low. So yeah, that is a problem we
31:18
routinely face, and which is why we have watertight contracts in place: that after such-and-such a date I won't
31:24
be a part of your project. Anything else you want will be a change request.
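The data-drift problem described here is often handled with a simple statistical check that the client's team can keep running after handover: compare live feature statistics against the training baseline and flag when retraining is due. The sketch below is a generic, illustrative baseline-vs-live comparison, not any particular product's monitoring API; the function name, the z-score threshold, and the toy distributions are all invented for the example.

```python
from statistics import mean, stdev

def drift_alert(baseline, live, z_threshold=3.0):
    """Flag drift when the live mean falls outside z_threshold
    standard errors of the training baseline's mean."""
    mu, sigma = mean(baseline), stdev(baseline)
    se = sigma / (len(live) ** 0.5)  # standard error for a sample of this size
    z = abs(mean(live) - mu) / se
    return z > z_threshold

baseline = [float(x % 10) for x in range(1000)]       # training distribution
stable   = [float(x % 10) for x in range(200)]        # same distribution
shifted  = [float(x % 10) + 4.0 for x in range(200)]  # drifted upstream data

drift_alert(baseline, stable)   # stays quiet
drift_alert(baseline, shifted)  # fires, signaling it is time to retrain
```

A check this simple is exactly the kind of artifact that can be left with a client at roll-off, so that "my accuracy is only 20 percent" becomes an alert six months earlier rather than a support ticket six months later.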
31:30
Yeah, it's an interesting one, certainly from a consultancy point of view, and one that I don't
31:36
often plan heavily for, certainly not as an individual consultant because obviously we're
31:41
hoping they come back. But it drives your decision making as to what tools you're going to be using
31:47
to solve their problems. And it could be that a particular client
31:51
you know, doesn't have the knowledge in-house to be able to deal with a highly complex solution
31:58
you can give them. So it might be that you have to, for want of a better term
32:02
dumb it down a little bit for them and give them a slightly, you know, less than optimal solution
32:07
but one that they can manage themselves. Because even just retraining models
32:11
that can be above some clients' heads and they just don't have that resource in-house
32:16
So, yeah, it's interesting to think that you could give them something that's perfect that they can no longer manage once you're out of the frame
32:24
And you definitely have to think about that as a consequence. Yeah, it's interesting
32:31
Yeah, definitely. I think what we've found as well with our clients is putting in support packages.
32:36
So this is whereby we end the engagement, but we offer them maybe five, 10, or 20 days' worth of our time where they can come back to us as part of that.
32:46
Anything else outside of that, again, becomes a change request or a new piece of work.
32:55
So, yeah, the same. Actually, one of the things is that it depends on the customer.
33:02
So if they already kind of have projects that other vendors developed for them, in other industries,
33:10
it's good and bad, because they already know ML. But sometimes, for example, they carry over some problem from the previous industry the AI was developed for,
33:21
so we need to correct those misconceptions and help them see that another solution can work.
33:30
Sometimes, for example, they used some algorithms and they saw that those didn't work, so there is
33:37
a need to kind of recover from that image. So it's good and bad to work with a client that already
33:44
works with AI. But for clients that didn't, there is a good thing, because there is a blank
33:49
slate; you can start to teach them about it. But sometimes they also don't know anything, so you go to
33:55
the training first, as Johan mentioned. So you need to kind of go there.
34:00
And that's a journey, actually, for each client, and it can be totally different from client to client,
34:06
because all of them have different tools, a different journey until now. And yeah, I can say
34:12
I could never compare them to each other at all. None of them are identical.
34:18
Okay, thank you for all your answers. So I think that one of the tricky things when we're going to sell is estimation
34:31
So how do you do estimation for a machine learning project? Maybe you have some framework already that you use with all your clients
34:39
I'm finding that quite hard, actually, mostly because everyone's unique, and that's what
34:47
Layla's just said there, pretty much; it makes it difficult. It's not an exercise of just
34:53
repeating what you did before; it's all of the surrounding services that you have to hang on to
34:57
whatever AI you're using. Certainly the projects that I've been quoting and making
35:02
POCs for recently are all different, from ecology projects to manufacturing to
35:09
just folks who have existing data and want to process it. So it's difficult
35:14
So yeah, there's a lot of finger-in-the-air. But I think, if you manage your client
35:23
correctly, you can certainly work with them to go along a journey to a rough estimate, to
35:29
almost a quote. But I never give quotes, so it's always an estimate anyway
35:34
Yeah, hard is the answer. True, very true, right? So
35:42
So estimation has been a historic problem, be it AI or non-AI, right
35:47
I mean, always in all IT projects, estimation is a huge problem
35:52
You don't have any... even for your normal, you know, full-stack development projects
35:58
seldom, I think, do we have some sort of predefined templates
36:04
which we can reuse for estimation. And for AI, it is all the more difficult because sometimes it's hybrid cloud plus on-prem
36:12
Sometimes it's totally on-prem. Sometimes it is multi-cloud. And sometimes it is a combination of different components
36:20
So AI plus IoT, D365 plus Customer Insights plus Synapse plus AI and Cosmos DB, NoSQL, Elasticsearch
36:30
So a lot of components come into play, based on the different types of scenarios you're trying to implement
36:37
and the complexity of the use case which the client has. So it's very difficult, right
36:43
I mean, more often than not, when we use Azure Cognitive Search
36:47
or a knowledge-mining kind of thing, by experience of implementing it at a couple of similar clients
36:54
or a couple of similar industries, we can still, you know, sort of try to emulate what we estimated previously. But then again, you know
37:03
the data annotation and labeling, the number of datasets which they have: all of this adds
37:09
to the confusion of how to come to a, you know, very appropriate, precise estimate
37:16
So yeah, as Pete rightly pointed out, that, combined with the myriad services you are
37:22
trying to put together in that project, becomes very hard. So I think it's more guesswork: trying to, you know, rely on your instincts, your intuition, and, yeah, keeping some buffer; you have those 20-30% buffers and all of those things
37:37
but yeah, there's just no hard and fast formula to arrive at a correct estimate. No, but at least
37:44
what you can do is box it: you can put in a time box, or a frame, for the project. So yes
37:50
it is a POC; it's not supposed to be 15 datasets and a complex algorithm; it's
37:59
one or two datasets and a very specific use case, and in these cases you can time-box
38:06
quite a bit. Even though, yes, there will still be an amount of guesswork, an amount of
38:12
(I've lost my English word for it, but it's just intuition) based on what you have. But
38:22
you can at least say: okay, a POC is maybe three months if we keep it to one dataset and this
38:28
small use case. But you still have to build in some buffers and some rough estimates because
38:37
of it. Yes, every customer tends to be unique in one way or another
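The time-boxing idea described here, a fixed scope plus an explicit 20-30% contingency buffer, can be sketched as a tiny calculation. This is purely illustrative: the buffer percentages come from the discussion, while the task names and day counts are made-up assumptions.

```python
def estimate_with_buffer(task_days: dict, buffer: float = 0.25) -> float:
    """Sum per-task day estimates and add a contingency buffer (default 25%)."""
    base = sum(task_days.values())
    return base * (1 + buffer)

# Hypothetical time-boxed POC broken into tasks (working days).
poc_tasks = {
    "data exploration": 10,
    "model prototyping": 20,
    "evaluation and demo": 10,
}

base = sum(poc_tasks.values())                # 40 days of raw estimate
low = estimate_with_buffer(poc_tasks, 0.20)   # with a 20% buffer
high = estimate_with_buffer(poc_tasks, 0.30)  # with a 30% buffer
print(f"Quote as a range: {low:.0f}-{high:.0f} days (base {base})")
```

Quoting the buffered range rather than the base number is one way to communicate the guesswork honestly while still giving the client something to plan against.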
38:46
So yeah, same as Pete, Priyanka and Johan actually mentioned; it's the same things
38:53
for us. So it's very hard, I think, to estimate that. Yes, having worked with the same
38:59
solutions can help to have some predictions on that. But I've always found that doing the estimation
39:05
for a Power BI project is much easier than doing it for an AI one
39:09
That's mostly because I'm working with prediction scenarios
39:17
most of the time in my projects, and at estimation time I haven't really started to work
39:24
with their data yet to see what's there; that part makes it harder to estimate
39:29
what will happen. So I can say that, compared to BI projects, ML projects
39:35
always, I think, have estimations that fluctuate more than what we've done for
39:41
Power BI and BI projects. And what can also throw off estimations a bit is, you know, sometimes this has happened with us, that we
39:53
have had to scrap the model which had been trained, because it did not fit at all. I mean
39:59
we had estimated three months of effort, and after three months of training we found that the model
40:05
(this was for real-time video analytics) had to be scrapped, so we had to pay a heavy fine to
40:11
the client as well. But yeah, I mean, that can happen with the algorithms, right? I mean, you're at the end
40:17
of two weeks or three weeks and you're like, hey, this is not working, I need to scrap all my efforts
40:24
Yeah, I couldn't agree more. Frankly, we're in quite a similar position ourselves, in terms of both the Power BI perspective and the AI perspective. What we tend
40:33
to do is absorb a little bit of that risk ourselves, so we try to go to staged price points. So when we
40:39
get to this element, for example once we have your data available to us, we'll then release this amount
40:46
and so on and so forth. Now obviously, as you mentioned, there's a risk that when you get to a
40:50
point where what's being trained isn't appropriate, we have to absorb that cost. So we have to make
40:55
sure that we introduce a buffer as well. But it's very tricky to price correctly
41:03
And I guess it's the same thing in research as well: no result is also a result. So it depends
41:15
on how you sell that to the clients. So, thank you for the answers. Shall we go to our next question, which sort of goes back to one
41:31
of our previous questions. So we were talking about building a POC or doing
41:38
research and so on, and then we also talked about how that can actually turn into a real project
41:44
as well. So what if the POC shows better results than what you can achieve with the end solution
41:52
What do we tell our clients in that case? I think the POC will always show better results than the end solution
42:02
That's usually the end result for a POC. I think it's about educating your client up front in terms of these solutions
42:12
What we find is that a lot of our clients are very new to AI
42:17
They hear it. It's very shiny. It's very much a buzzword. There's a lot of commentary around it at the moment
42:23
But what that means, what it brings, and how to implement these types of solutions is where they need the hand-holding and walking through
42:30
Now, this doesn't mean we need to get too technical, because a lot of my clients at least gloss over some of the technical aspects
42:37
but being aware of some of the risks to the project really does help
42:41
It depends on the scope of the POC. If you get a small dataset to train and work on, of course you will get a lot better results than when you get the real datasets in the background
43:01
But it's, as Leon says, about training your clients to expect this change
43:09
explain to them in advance that normally when we do a limited POC
43:14
we'll get better results than when we go live with them. But then it's just a matter of trying to estimate roughly what that will be
43:23
and then letting them have the end result when they do it on the live
43:28
on the production environment instead. Yeah, I suppose a lot of it comes down to why you're making the POC as well
43:39
Certainly for a few of my clients I've worked with, the POC is a funding exercise, really, that they take
43:44
around just so they can get some form of funding to be able to take the project forward further
43:48
So you've got that sort of a client, as well as a rolling client who just wants to add AI to an
43:55
existing solution. But certainly, as everybody else has mentioned, educating the client I think is
44:00
massively important at this point, so that they know that your POC is usually a very targeted
44:06
solution that you're giving them for the specific purpose of either just showing them how it's going
44:11
to work and giving them the clarity, or, as I've just mentioned, for a funding exercise. And then
44:16
there's a whole separate section of development that needs to happen on top of that. And certainly
44:20
as we go right back to the start of the conversation, you may not even have the full dataset to work
44:25
with at that point. So making them aware that, look, this is likely to change. And, as Priyanka mentioned
44:30
that can actually scrap what you started out with: you know, now that I've seen the data, I
44:36
definitely wouldn't start it this way. And maybe you lose your entire POC
44:41
but at least you've got the clarity at that point. And as Johan pointed out
44:44
no result at the end can actually be a solution. Well, we've made the POC, now we've got the real data
44:50
and it doesn't work that way, so we had faulty premises to begin with. So yeah, there's a whole area that you need to be careful of, but educating the client can help. I actually have a very interesting scenario, a real-life scenario, to share here
45:08
So normally for POCs we end up using our own data, right? The client is also like, yeah, go ahead, use
45:14
your own datasets, mock datasets, open-source datasets, due to which your results are fantastic
45:20
You seal the deal. And then, so this happened. So we did this pre-sale pretty nicely
45:26
We pocketed the deal. And then we got into the real implementation
45:31
In the real implementation, they were working on Excel sheets. And they were actual horrendous Excel sheets, not even formatted properly
45:40
Like, each Excel sheet is in a different format: merged columns, and the column formats are different
45:47
So let's say you have a name column in one Excel sheet and the same name column in the other Excel sheet
45:53
and both of them are different, like, you know, not matching. So I can't even do lookups; I have to sanitize the data first
46:00
So, right from the start, they are like, okay. So when we went in for the engagement
46:04
we were like, you have a paper-and-pen system and you want to have rocket science on top of that
46:10
So our whole engagement turned from an AI and prediction, you know
46:15
forecasting-based engagement into a data platform modernization one. And yeah, I mean, it was interesting. So we told them that you need to
46:24
get your system in shape first, you need to put it on a cloud platform, and then we can do analytics on top
46:31
of it. And actually, you know, the AI/ML part was almost a half-million deal, but the DPM turned out to be more
46:37
profitable: it jumped to 2.5 million. So yeah, I mean, we were lucky in that scenario. But more
46:46
often than not, what happens with the POC and in real life is totally different. In our case it was an
46:52
unexpected thing which worked in our favor, but it could have gone the other way very easily as well
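The sanitization problem described above, where the same name column is formatted differently in each Excel sheet so lookups fail, is essentially a key-normalization task. Here is a minimal sketch with pandas; the column names, sample values, and normalization rules are illustrative assumptions, not the actual client data.

```python
import pandas as pd

def normalize_key(s: pd.Series) -> pd.Series:
    """Normalize a free-text key column: trim, collapse whitespace, lowercase."""
    return (
        s.astype(str)
         .str.strip()
         .str.replace(r"\s+", " ", regex=True)
         .str.lower()
    )

# Two hypothetical sheets whose 'name' columns don't match verbatim.
sheet_a = pd.DataFrame({"name": ["  Acme Corp ", "Globex"], "revenue": [100, 200]})
sheet_b = pd.DataFrame({"name": ["acme  corp", "GLOBEX "], "region": ["EU", "US"]})

for df in (sheet_a, sheet_b):
    df["name_key"] = normalize_key(df["name"])

# Now the lookup (merge) works despite the formatting differences.
merged = sheet_a.merge(sheet_b[["name_key", "region"]], on="name_key", how="left")
print(merged[["name", "revenue", "region"]])
```

With the raw `name` columns the merge would match nothing; normalizing both sides into a shared `name_key` first is what makes the lookup succeed.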
46:59
Yeah, I can agree exactly with that. So that problem of data quality always happens
47:06
with the data that you get in the POC versus when you go to the real project. As Priyanka mentioned, for example
47:12
it can be in Excel. I had the same scenario with predictions of, for example, how many months
47:19
before a trip people need to book their tickets, and I came up with lots of data that was not
47:26
really in the same shape. So that is reality. So we always mention to them: we can do that, but it depends
47:35
on the data quality, because the first thing, I think, is that the data quality should be
47:41
made sure to be correct there. So we tell them that you can see some differences because the data quality changes
47:48
And most of the time they agree, because when they go through the data quality parts, they
47:53
see lots of missing values, lots of things that are not consistent with each other
47:58
So I think most of the time they accept that, and they are actually happy, because they see
48:03
the issue in the data. And, of course, as others mentioned, we educate them
48:09
about that. Yep. Yes, so we have one last question here
48:23
which is about, you know... You can argue that the most important thing
48:30
when you try to sell an AI solution is to show that your solution can actually provide
48:34
some value back to the customer. But you can also say that the second most important thing
48:39
to discuss is how we should actually develop this solution. And then we're talking about: what languages should we use
48:48
What frameworks should we use? What kind of tools? How should we document this
48:53
So what are your experiences? How can you handle this type of situation
49:00
Yeah, I think, at least for us anyway, AI solutions have always fit well with our methodology
49:06
For us, it's really an iterative process, and with this process AI is a really good fit for that
49:14
Sorry, Agile's a very good fit for that. We're able to work in step with the IT teams
49:20
Sorry, you might be getting some background noise, so I'm just going to go on for a moment
49:29
Okay, I'll pick that back up. Hopefully that's a bit better now. So yeah, we try to work within the Agile methodology
49:34
because it really does fit. In terms of our toolset, we've used a fairly short
49:40
toolset up to now, but we also lean towards Python. In particular, one of our favorite libraries to use
49:46
is PyCaret, from a gentleman called Moez Ali, which is based on scikit-learn
50:00
In my experience, it depends on either the client or the consultants that we bring into the project
50:09
Sometimes the client states that we are going to do this on Databricks using Python, or we are going to use Azure ML
50:15
because that's our company standard; or, we don't want to have full-code
50:20
we only want to have low-code solutions. I've had that one a couple of times
50:26
So it depends on the project, really. We don't have any specific tools that we want to use
50:32
It's more on the project fit. Yep, same as Johan and Leon mentioned
50:42
it's the same thing. So I think the most important thing is what architecture they are using
50:47
Some governments don't want to go to the cloud, but some prefer it: they're already on the cloud
50:52
and they don't want to, for example, spend on the gateway or something to do things on-premises
50:59
So I think the most important thing is what architecture they have
51:03
We don't want to put extra costs on them to go and buy a new technology
51:08
that is not in line with what they have. So I think that's one of the things
51:14
that we should consider. And also, about the language choice, I think that sometimes it just depends
51:23
on what works better for the algorithm. I never see my customers check
51:29
what language I use, Python or R or Azure; they mostly care that the architecture is in
51:34
line with theirs, or how much cost it actually implies for them
51:45
Yeah, okay. So as far as I'm concerned, and I'm sure Eve will agree with me, we both work at Avanade
51:52
although in different locations, so we are severely constrained to Microsoft solutions, right? I mean, we
51:58
have to pitch the Microsoft solution stack alone to clients, even if they are doing, you know, hybrid
52:05
cloud, like for example AWS and some third-party tool. Which is why sometimes the choice of tools
52:12
for us can be a bit tricky: because if you stick to the Microsoft tech stack alone, right, sometimes, you
52:18
know, a Spark NLP library for NLP-related purposes could be much, much easier to use, and a pre-built model could be much easier to use as well. But yeah, I mean, you know, because of that, sometimes we have to make a choice
52:36
Sometimes Microsoft themselves, when they come in for solutioning, they might, for example, you know, pitch Synapse
52:44
Synapse Analytics, or MMLSpark. And then that sort of dictates the technology which we are going to use to build the solution
52:54
So for us, I mean, you know, the technology stack is a bit constricted by Microsoft technologies
53:03
But yeah, I mean, still, for example, ML.NET and all of these interesting Microsoft stack pieces do come into the picture
53:10
So sometimes, because we didn't have any Python developers or any open-source developers
53:17
we have developed solutions with C# developers using the ML.NET libraries as well
53:22
Yeah. Yeah, it's interesting. I don't think I've been prescribed specific tools to use. But
53:32
certainly I bear the client in mind, like we said earlier, as to where we leave them
53:38
afterwards, if we leave them with something complicated they can't handle. So, yeah
53:43
usually it's just up to me what I choose to fulfill whatever it is they've asked me to do. Yeah
53:56
And then, thank you for all those answers. Now I'll come with my backup question
54:04
I'm asking the AI42 team here first of all, and then our speakers too
54:11
So let's talk a bit about the team we are working with
54:15
What does a perfect team look like? And based on what factors would you choose the teammates
54:21
you are working with in a project? The consultant answer is, of course: it depends
54:33
But I mean, it depends on the data source, how much data preparation you need, and so on
54:41
But typically I would like to have... I mean, personally, I'm not good at machine learning or doing complex math
54:51
I'm a solution architect and a data wrangler. But typically, I would want to have people in who can do the data and the science-y stuff
55:00
but probably back them up with people who can support them on the boring tasks of prepping the data
55:08
and somebody to help them have a dialogue with the client. Sometimes that can be one person; sometimes that has to be three. But it has to be somebody who can
55:20
do all these things and fit that into the client's architecture or technology stack, as mentioned
55:27
before. Yeah, I agree. The only element I might add to that, obviously with the opening answer of
55:35
'it depends', is that I tend to, because we go into
55:39
solution delivery, have a product owner as well: so
55:43
somebody who handles and is responsible for that product and liaises with the client and the integration process as well
55:54
So, same. So, beside the hard skills, like, of course, I
56:01
need a data science person able to improve the algorithm if we have to write one, so definitely
56:07
that's a hard skill. The softer skills also matter, because, you know, each
56:13
client can have different things, so it's about how they can adapt themselves to that. So beside
56:19
all of the hard skills, the willingness to work with other tools, not being biased to just stick
56:26
with one thing, because, you know, every problem can have a different solution
56:31
So I think that also really matters. So besides the harder skills that they need to have
56:36
data wrangling and also data science, I really look for these soft skills:
56:41
how they're able to accept other technology and go and learn
56:47
and how motivated they are by that. That's an interesting one
56:56
I think it speaks a little bit to what we spoke about earlier, with how complex a project is
57:01
and what other services you're going to be touching, and whether it's an IoT project or just a visualization project
57:08
I suppose it's going to come down to that. The real answer is I'd just choose all the folks on this call
57:13
and be happy then. Easy. Yeah. One important thing which I have learned, you know
57:22
while building the perfect team, is that more often than not I have a lot of good-quality data scientists and data engineers
57:29
but the one person whose skill set I sorely miss is an MLOps person
57:34
the one who is actually going to productionalize your models and build your pipelines, parameterized pipelines
57:41
So this DevOps/MLOps person is, you know, a must-have addition apart from all the good stuff which you have
57:49
So, all the sciencey stuff, the data-sciencey stuff, the project director, architects and all of these good people:
57:58
I think it's important to have a security person, an infra-security person, and an MLOps person for a data science and AI/ML engagement
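The "parameterized pipelines" point can be made concrete with a small sketch: a training entry point that takes its settings as arguments instead of hard-coding them, which is the kind of artifact an MLOps person wires into an orchestrator for scheduled, repeatable runs. The argument names and the stubbed-out pipeline body are illustrative assumptions, not any specific product's API.

```python
import argparse
import json

def run_pipeline(data_path: str, test_size: float, n_estimators: int) -> dict:
    """Stand-in for a real training pipeline. In production, each argument
    would be supplied per run by the orchestrator (Azure ML, Airflow, ...),
    rather than being hard-coded in the script."""
    # ... load data from data_path, split with test_size, train a model
    #     with n_estimators, evaluate, register the model ...
    return {"data_path": data_path, "test_size": test_size,
            "n_estimators": n_estimators, "status": "trained"}

def main(argv=None) -> dict:
    parser = argparse.ArgumentParser(description="Parameterized training job")
    parser.add_argument("--data-path", required=True)
    parser.add_argument("--test-size", type=float, default=0.2)
    parser.add_argument("--n-estimators", type=int, default=100)
    args = parser.parse_args(argv)
    result = run_pipeline(args.data_path, args.test_size, args.n_estimators)
    print(json.dumps(result))  # structured output for the orchestrator's logs
    return result

# Same script, different parameters, no code change:
result = main(["--data-path", "data/bookings.csv", "--n-estimators", "50"])
```

The point is the shape, not the libraries: once the job is parameterized like this, retraining with new data or new hyperparameters is a configuration change rather than a code change, which is exactly what makes it productionizable.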
58:10
Yeah, I mean, having someone with ethics as well, you may need an actual ethics person depending on what sort of data you're going to be processing
58:17
And yeah, you put that together with security and ops and I think you've got the beginnings of a good team
58:23
if you've got somebody with a data science background as well. It's a hard problem, though
58:30
I also found that, for me, on top of the developers
58:38
and the security people and all those amazing people who know everything else that I don't know
58:45
I always needed software developers on my side, just to speed things up
58:51
They are always better at looking through some parts of the project
58:59
I think it is time to conclude. So with that, Gosia and Håkon, would you like to give us some conclusions here
59:10
Yes, I think we've covered a very wide range here of topics and questions
59:16
So it's a little bit difficult to draw a single straight conclusion, I think
59:21
So I think it's better just to say that, you know, we've really appreciated all of you, all of our speakers and all of the insights that you brought here to the table
59:30
And, you know, we're saving this on our channel, so anyone who is interested in some specific aspect can find it and learn more about that, I think
59:46
Yes. So thank you, everyone who joined us, and of course everyone who watched live or will watch later on
59:53
It was a very important topic, because we're all coming to AI with our projects
59:59
and we would like to know how to sell better to our clients
1:00:06
Yes, thank you, Håkon and Gosia. And I would also like to give a big thank-you to our speakers today
1:00:11
because you did bring a lot of valuable information to the stage today, again
1:00:17
Thank you a lot to our audience for joining us. And remember, we go away for the summer break now
1:00:23
but then come back in the beginning of September with a real life machine learning conference
1:00:29
So please follow us on Twitter, Instagram and Facebook and on Meetup, of course, because there you will find all the important information about our upcoming sessions and the conference as well
1:00:41
And at that conference, we are going to bring in scenarios that come from existing real-life projects, which is why it is so cool
1:00:52
So please join us for that conference as well. Did I miss something
1:00:59
No, I think... yeah, we would also like to thank all the contributors that we have, and also
1:01:07
our sponsors, for making these shows possible. And we would also like to thank, of course, you, the audience, watching this either
1:01:17
online or maybe sometime in the future. So have fun, everyone, during the summer
1:01:26
Thank you again for joining us for this half year of AI42
1:01:30
See you in the next semester. And also a special notice to our speakers, if you could just stay on the line while we're
1:01:36
finishing off. Okay, bye bye. Thank you. Bye
#Consulting
#Education
#Sales
#Machine Learning & Artificial Intelligence


