Session Details:
In this session, we'll explore how to enhance email triage using Azure AI Language Studio and Azure Logic Apps.
Companies often struggle with managing high volumes of emails in a shared inbox, leading to inefficiencies and backlogs. I will demonstrate a solution that uses Named Entity Recognition (NER) to extract data and Text Classification to assign emails to the correct department, streamlining the triage process.
Based on lessons learned at an insurance company, this session includes a hands-on demo covering practical steps, from Logic Apps to leveraging Azure AI for text analysis. The techniques discussed are applicable across various industries. Join us to discover how to create a scalable, low-code solution that enhances operational efficiency.
Speaker:
Dieter Gobeyn is a Microsoft MVP for Azure and a seasoned cloud enthusiast with 12+ years of experience in IT, architecture, and leadership. He brings a wealth of hands-on expertise in Microsoft Azure, .NET, Dynamics, and Power Platform. From the intricate realm of IT to the strategic landscape of architecture, his journey has been guided by natural curiosity and a relentless drive for innovation. Having transitioned from consultancy to leadership roles in the enterprise domain, he currently crafts end-to-end architectures for large-scale organizations, fueled by a genuine passion for collaborative problem-solving and solution crafting.
He holds certifications in TOGAF, Scrum Master, and Azure Architecture, and is skilled at creating clear, effective solutions to complex challenges in enterprise settings. He also specializes in developing architectures that accommodate various integration flows and software solutions.
0:51
Good morning good afternoon and good evening depending on where you are connecting from so welcome to this
0:58
session here for Azure User Group Sweden my name is Håkan Silfvernagel i will be
1:03
your host here today uh so our founder Jonah Andersson is on vacation so
1:09
unfortunately she couldn't be with us today here so let me go over here a little bit of
1:16
practical details so we have we have a code of
1:21
conduct here so it says be nice and friendly listen with purpose and be thoughtful be respectful to others and
1:28
seek to understand not criticize be curious and open to share ideas and be
1:34
inclusive in your comments or questions and you can also find our full
1:39
code of conduct on the link here uh below and we are also always looking out for
1:47
new speakers and new sessions so please feel free to scan this QR code and
1:53
register your interest for uh presenting a session
1:59
and finally I would also like to invite you here for a small digital fika which is a small Zoom meeting where you will
2:07
be able to ask questions and discuss uh directly here with our speaker for
2:14
today so that brings us actually here to our speaker here so let me welcome here
2:20
on the stage Dieter Gobeyn
2:26
hello Håkan thank you so much for the introduction it's very nice to be here yes
2:31
so a small more formal introduction is that Dieter is a Microsoft MVP for Azure
2:37
and he's an experienced IT architect and Azure consultant and he specializes in creating end-to-end enterprise
2:43
architectures that focus on clarity simplicity and scalability and Dieter shares his insights as
2:50
a public speaker drawing on his extensive consulting experience he focuses on cloud solutions and and
2:57
enterprise integration AI and the valuable lessons learned from managing global teams in distributed environments
3:08
thank you so much for the introduction Håkan so would you like to say a couple of
3:13
words to our to our viewers about what they will see here
3:19
yes absolutely so the presentation Yep so the presentation today will be more about uh my own lessons learned from an
3:26
insurance company I used to work for a few years ago uh we were struggling with processing data and there was lots of
3:32
emails lots of documents that we we were receiving and we really wanted to automate this process now keep in mind
3:39
this was all before Gen AI right so this technology still exists and has gotten
3:45
even better than a few years ago the demo and the presentation the slides from today will really be about optimizing that creating an
3:51
optimized but also enterprise ready solution uh to be used for managing
3:56
you know millions of emails so um yep that's the subject for today oh nice yes please share your presentation here and
4:04
I can add it to the thank you so much yeah
4:11
um
4:19
So if you could just confirm you can see my screen please yes it looks great so the show is yours perfect thank
4:26
you thank you so much for the introduction um so yeah quick quick agenda for today here um I'll introduce
4:33
myself in a couple of minutes um we will be looking at linguistics technology um
4:39
it sounds like a difficult word but it's not i promise you that the technology we will be using from Azure will be Azure
4:44
AI language studio here
4:52
um in the email triage or document um classification and then we will look
4:58
into a solution and a demo and there's questions as well uh and as Håkan explained um feel free to raise those
5:04
questions um and I can also adjust the demo if necessary let's see maybe we can try to break the demo
5:11
Good a quick introduction about myself so as already introduced um I'm a developer uh an Azure architect and
5:19
working with Azure since 2012 uh feel free to connect here on this QR code with my socials uh would there be any
5:25
questions you are very welcome to follow up through LinkedIn email or any other social media um if I'm not doing um tech
5:33
stuff such as reading books uh blogging presenting uh you will likely find me
5:38
outside so hiking and scuba diving are my two hobbies uh and I've been based in London for a couple of years cool
5:48
um now Azure AI language studio right so this is the linguistics that we want to
5:53
talk about today as a matter of fact there's a lot of use cases with linguistic technology and it exists
5:58
already in quite a lot of places right i just want to provide a couple of uh examples where this could be useful for
6:05
potentially your business right or whether it's already being used first one being uh document digitalization
6:14
right so any written document that potentially you would like to classify
6:19
uh or digitalize um rather than manually typing it over um AI linguistics
6:25
technology is quite useful for that and can be used um second example could be um document categorization right in this
6:32
case we we would like to categorize maybe the type of a document uh the reason why it was made you know the
6:39
color scheme quite a lot of parameters could be extracted here potentially also
6:44
some some of the text i'm not too sure how how visible that is uh but I guess you get the point and as last last
6:51
example is personalized communication and and obviously with with the chat bots and and agents um we all want to
6:58
speak in our own native language right and having access to language detection as a first instance is quite important
7:04
right so think about routing it to the right agent right um or potentially sending it to the right um person that
7:11
they can answer but also um you know having a solution available in your own
7:16
language or potentially dialect is quite important here so these are all examples where we use techn uh linguistics
7:23
technology today and obviously the scope is beyond th those examples right um and
7:30
then a quick introduction about linguistics technology right as a matter of fact it's not very new um it existed
7:37
already in the 1990s and this is just one of the samples with a company that I knew back in the days um some of the
7:44
people I knew had stocks there uh they bought stocks for this company now um and it was also used in in 2001 with
7:51
Windows XP speech option um so it's quite a I wouldn't say
7:56
an old technology but it has existed for about 30 years right um now interestingly enough um well the company
8:03
went bankrupt a couple of years afterwards um Nuance Communications acquired it but then the most
8:08
important part is this so when I saw this announcement a couple of years ago Microsoft acquired this company
8:14
effectively it bought a company from well about 30 years ago so it's interesting to see that Microsoft is
8:21
investing in this kind of technology um obviously things have moved along and technology is much better
8:27
right
8:32
um and this has changed over those 30 years um on the left side you see um well I purchase
8:38
a lot from Amazon and I do have to be honest Amazon is very useful for myself i ordered quite a lot from Amazon um
8:45
especially with the free delivery but one of the you know one of the the nicest thing on Amazon um according to
8:51
them right is the personalized recommendations um on the left side you see a couple of items here amazon thinks
8:59
maybe you should purchase this and according to them it has driven up sales by 35% i think this is rather interesting
9:06
right um so lots of things have changed right we've got way more data available um there's much better algorithms
9:12
available technology right we all now have access to the internet uh with multiple devices with uh agents
9:19
interaction with um chat clients Siri Amazon um also through my TV um so
9:25
there's a there's a huge huge huge change over the past years right from a technology point of view I think the
9:32
biggest changes here is that uh linguistics AI is more accessible than ever right and from now on you can also
9:38
start for free means that if you've got you know any any Azure account for example
9:46
you can start using those um AI linguistics technology um for a certain
9:52
amount for free on your subscription um we will look into this later how to do that right so quick summary um obviously
10:00
um AI has shifted from something niche I would say um to to something very mainstream and I think we do have seen
10:06
that evolution since the last couple of years very strongly right we all want to create more value by either either
10:13
providing more um customer experience adding more value to a company or increasing stock value and everything
10:20
has become more accessible than ever right so you can start for free nowadays um you don't need too much upfront
10:27
investment here um and then you know the example of driving to a store to purchase a CD or
10:33
you know uh purchasing computing power that's basically all gone um nowadays
10:38
there's also algorithms available as a model for free which we will also look into this um that you can customize for
10:45
a fairly fairly um you know small amount of effort now I do think personally that
10:50
the landscape of AI especially within Microsoft is sometimes quite difficult to follow and I just want to zoom out a
10:56
little bit in terms of what are we looking at today right um so there's a huge AI offering within within the
11:02
Microsoft Azure landscape um and within AI services uh portion we've got uh AI
11:08
um document intelligence right where we extract some
11:13
structured data we've got the Azure OpenAI service uh which has been very
11:21
widely discussed over the past I would say couple of years um this is what we will be looking at today
11:27
azure AI language right natural language processing text
11:33
analysis there's also a part with translation I think that's quite you know self-explanatory here content
11:38
safety uh search very useful for RAG integrations here there's a part
11:45
with speech uh and also quite interesting if you're
11:51
looking to automate uh document processing because you will be looking at scanning um photos recognizing what's
11:57
happening in the picture extracting text through OCR facial recognition um so that's also useful for automating processing
12:03
here now um AI language right um this was actually previously part of what has
12:09
been called Azure cognitive services and still today some of the documentation is referring
12:16
to this terminology um but it's actually part of the Azure AI suite of
12:22
tools and it does provide quite a fast time to market which means um you can
12:28
rely on those pre-built algorithms or features that's been provided um you
12:33
can rather easily customize if necessary without the need of um prompt
12:40
engineering without the need of development potentially through Python and it offers a a rather um accessible
12:47
interface through the browser so these are from my point of view the
12:52
benefits where you can start out um and the capabilities here um are kind of as
12:59
follows right so natural language processing it means that it's able to understand what's being said uh emotions
13:06
key concepts and the information right it's able to work with more than 100 languages including dialects
13:14
there's a part which offers integration with bots and virtual assistants
13:19
Um it does provide analytics and insights and I do think this is really a nice feature we will be looking into
13:26
this um during the slideshow today effectively provides feedback on how you can improve the model how you can make
13:32
it better and where you've made some mistakes with the input and output so that's a rather interesting feature there I think and obviously
13:39
provides easy integration with the Azure ecosystem um and the existing apps um depending how you like
13:46
to develop low code medium code or you know pro code so a couple of examples just so
13:52
we've got a bit of an idea about the business case right um I'm aware I've explained already a couple of them initially but these are
13:59
some some of the interesting interesting concepts that you can maybe translate for your business right Um so the first
14:06
one would be identify specific language of a text could be very useful to understand who needs to deal with a
14:11
specific email with a specific document uh or a specific use case summarizing
14:17
specific text um determining potentially if a Google review or any review is something
14:23
positive negative and then trigger another flow extracting key data information from a
14:29
specific text for example a phone number uh medical information address uh
14:34
date and time um or potentially just identifying hey what's being said identify the main concept here these are
14:41
a few examples and um if I actually just open my browser we will
14:48
then find a couple of um onetoone matching them with with an existing model or feature
14:55
so this is what we call available features within AI language right um I'll try to zoom in slightly here
15:04
um the first one would be um named entity recognition and in this case
15:10
um this feature is able to extract specific data from text right so we're looking at product information duration
15:15
address location event date uh and a whole set of standardized data types that it's able to extract um from a
15:23
given unstructured text um there's a bit more about health which I will skip um it's also able to
15:29
detect the language itself sentiment analysis is also a rather interesting one right where we
15:35
want to understand how positive neutral or negative certain things are um potentially for the Google review um it
15:42
offers summarization capabilities right um key phrase extraction this would be
15:49
more about specific words and the list really goes on so
15:54
these are all the pre um predefined um features right which allows you to
16:00
extract understand categorize um unstructured text into something more structured now if you feel like hm this
16:07
is not sufficient for my business case and that's also what I had or what we had in the company because we wanted to
16:14
extract um insurance claim numbers you can also do a custom feature right
16:21
so the list goes on here but effectively it talks about custom text classification custom named entity
16:27
recognition so all this means is that um beyond those predefined features you can
16:36
extract um something more custom right um for example you know the color of
16:41
a paint the car brand um insurance claim number the list goes on something that's
16:47
not being provided by default uh you can also train a custom model for that right and let's have a look at specifically
16:54
how this works because it's actually very straightforward uh but I do think
16:59
the software uses very specific terminology for this and it's important that you understand how it
17:04
works so the first step is select data and define a schema and this means that um basically you select a good sample
17:12
set of data which you then upload to a storage account and this is the input for the model right um so sample
17:21
data here in this case um the second step would be uh labeling the data which means for every sample data set you
17:30
explain uh underline or add what's being expected as an
17:35
output and it also means that if you can rinse and repeat that over several documents effectively you can train the
17:41
model and the model will then understand what you're trying to extract um or the purpose from um this exercise here right
17:49
so you train the model which is just a click on a button um the training will be done um there are some statistics
17:55
there where you can look at the model performance and we will look at this later as well and then you can kind
18:02
of start this feedback loop here right so if the model performance is not sufficient for you um you can add
18:07
more documents make corrections if you feel like I'm satisfied with this
18:13
um you make a release deploy the model through a rest endpoint and afterwards
18:18
uh you can extract the entities or consume the model here
18:23
so that's a brief overview of the steps involved and then we will do this also in the demo
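The consume step above boils down to one REST call against the deployed model. As a minimal sketch, assuming the asynchronous analyze-text jobs endpoint (the project name, deployment name, and task name below are hypothetical placeholders for whatever you created in Language Studio), the request body looks roughly like this:

```python
import json

# Sketch of the JSON body POSTed to
#   {endpoint}/language/analyze-text/jobs?api-version=2022-05-01
# for a custom named entity recognition task. Project, deployment,
# and task names are illustrative assumptions, not from the session.
def build_custom_ner_request(text: str, project: str, deployment: str) -> str:
    body = {
        "displayName": "email-triage",
        "analysisInput": {
            "documents": [{"id": "1", "language": "en", "text": text}]
        },
        "tasks": [{
            "kind": "CustomEntityRecognition",
            "taskName": "extract-policy-number",
            "parameters": {
                "projectName": project,
                "deploymentName": deployment,
            },
        }],
    }
    return json.dumps(body)
```

The response is retrieved by polling the job URL returned in the `operation-location` header; check the current API version in the docs before relying on this shape.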
18:29
now in terms of feedback I do think there's um two very important things that you will need to know um so the
18:35
first one is the AI model performance which means right after the training there will be a statistical score about
18:42
um how well based on the supplied information the model was trained and
18:49
how does it specifically work uh we are using a split setup right so 80% of the data is being sent to the model to be
18:57
trained and then 20% of that data um is being afterwards used to test against
19:03
that model right so that's a split setup 80/20 for example and as we see in the screenshot here um
19:10
the score seems to be really good in this
19:16
case and it goes on um with you know recommendations here um the second
19:22
one about feedback and information about how successful it is is a confidence score and a confidence score is
19:28
effectively a score that ranges between um zero and 100 or zero and
19:33
one indicating how confident the model is that the
19:41
answer is correct um and obviously the higher the score the more confident
19:47
it is that it's correct and the lower the score the less confident now I do think this is quite important to understand
19:53
how it works because it's just based on a statistical um estimation right um and I
20:01
think it's interesting to use that also in the
20:11
solution design once you receive the feedback from the AI model and decide how you would like to use it I always
20:16
say that if it's about um changing the toilet paper in the toilet maybe the confidence score is not too
20:22
important because there's no significant impact on your business right uh if it's about paying a claim for an insurance
20:28
company or you know doing something more critical uh potentially you want to have a higher confidence score 95
20:35
percent or something higher than that right so that's really depending on your business case
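That business-case dependence can be made explicit in code. A minimal sketch, where the per-category thresholds and queue names are invented assumptions rather than anything from the demo:

```python
# Hypothetical per-category thresholds: cheap-to-correct categories
# auto-route at lower confidence, high-stakes ones demand near-certainty.
AUTO_ROUTE_THRESHOLDS = {
    "car": 0.80,
    "home": 0.80,
    "health": 0.95,  # paying a claim wrongly is costly, so require more
}

def route(category: str, confidence: float) -> str:
    """Return the target queue, falling back to manual review."""
    threshold = AUTO_ROUTE_THRESHOLDS.get(category)
    if threshold is not None and confidence >= threshold:
        return f"queue:{category}"
    return "queue:manual-review"
```

The point is simply that the confidence score coming back from the model is an input to your own routing policy, not a decision by itself.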
20:41
here now um I do think as humans we are pretty good at understanding what
20:46
kind of information needs to be extracted from a text right um I did prepare this a little bit as an
20:54
interactive session I'll just present them as such for example in this case we've got an email
20:59
and Jonathan writes as discussed over the phone please find pictures of the BMW attached now uh what's happening
21:05
here in our minds as a human we immediately understand the BMW it means it must be for a car insurance right and
21:13
effectively the same happens here right so we've got um some policy number and then Eric is
21:20
providing an update um if you would be dealing with this email um you will
21:25
probably look up the the policy number uh in the system or potentially with the email address look up what kind of
21:31
policies Eric has and the list basically goes on um even with this as a human you
21:36
immediately understand this must be for home insurance um based on the scope here and so each of those examples
21:43
really um provide different solutions there but what we did at the company I used to work for right is
21:49
that the business case was as follows right so we had a
21:54
shared mailbox and that's typically what's still being used today um for lots of insurance companies basically um
22:00
there's a mailbox and clients are emailing them the mails are piling up effectively and there's a couple of
22:06
full-time employees that are just dragging and dropping um you know moving around those emails to the different
22:12
departments and the reason why um we've got different departments and claim handlers in this case is because um
22:19
insurance products are rather complicated and complex and those claim handlers need to understand every every
22:25
single detail of of uh the policy right um which means someone that's able to
22:31
deal with your home insurance won't be able to deal with your health insurance for example so they need to dispatch
22:36
those emails to specific departments um in this case we've got three departments
22:42
it could be much more um so this is what we call email triage right so the goal is really to optimize this to
22:49
automate this to make this more efficient right uh because what's happening um if there is a busy weekend
22:55
and there's lots of emails uh we want to obviously serve clients much faster here
23:00
right um I'll skip this one so we want to use
23:06
AI language studio um to you know manage those claims um and there's basically two strategies that we
23:13
can use here um with a fallback between them and I think the first strategy
23:18
would be to use a high percentage of certainty which means you want to extract key data from that email right
23:25
so think about a claims number a reference number a policy number or anything else that would be extracted um
23:31
now unfortunately some people don't provide these references right um and
23:37
then you can fall back on email categorization or understanding what's being said right so in the sample from
23:42
the BMW it's probably car insurance and if people are speaking about prescription well I guess it's more a
23:48
health related thing could be GP prescription or something else so these
23:54
are two options that could be used um but I do think that obviously the
24:00
possibilities are endless right so you could be looking at existing claims new claims understanding the urgency right um
24:07
if someone looks to be you know really mad irritated you could prioritize this email um you could use um this the
24:15
sender's email address to see if there's active or closed claims already or use AI vision to see um and view the
24:22
attachments or potentially uh OCR codes on a pre-written document
24:29
um so there's lots of room for optimization and there's lots of options there basically right
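One of those extensions, prioritizing emails from clearly irritated senders, could be sketched as follows; the sentiment labels mirror what sentiment analysis returns, but the threshold and priority values are made-up assumptions:

```python
# Hypothetical prioritization rule: emails whose sentiment analysis
# came back strongly negative jump to the front of the triage queue.
def priority(sentiment: str, negative_score: float) -> int:
    # Lower number = handled sooner.
    if sentiment == "negative" and negative_score >= 0.75:
        return 0  # clearly irritated customer, prioritize
    if sentiment == "negative":
        return 1
    return 2
```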
24:35
um now let's have a look at how do we translate this into a a solution right
24:42
um so the first one extracting the policy number or claims number uh in this case this um remember when we were
24:47
scrolling through the specific page here right um and we were looking at the available features uh we had this named
24:56
entity recognition that was able to extract predefined um data now um well it didn't
25:03
surprise me but there was no such thing available as an insurance number or insurance policy
25:10
um so I guess the solution is to develop a custom named
25:17
entity recognition for that um which is then able to extract key data from that unstructured text or
25:24
email right and as we see in the screenshot below right uh we've got someone emailing for a fire damage
25:32
um what we do in the portal is we will just assign the policy number to
25:38
this email and I think now it's a really good moment to have a first look at how this effectively works um so if I
25:46
open my browser here and let's have a look i've got a
25:53
couple of resources here i'm going to zoom in a bit more um
25:59
effectively what's being used is a language um resource here this one um
26:05
which is nothing more than a container and then effectively all the work is happening in this um language studio
26:15
itself this one yep good so if you go to language.cognitive.azure.com
26:20
uh I've got already two projects defined and the first one would be for the policy number right so what I
26:27
did is I set
26:33
up access for uh AI language studio to that storage account effectively and
26:38
then create a new project connected to the storage account and then you will um
26:44
look at the documents that are available in that storage account and these are the sample documents in this case I've got a couple of emails
26:51
here on the left side and what I did here is you can then add an entity for example policy number
26:58
which I already did here uh and then you can also underline the specific text uh
27:04
and then assign it to a specific label right so that's the labeling process we've got input outputs that you'll be
27:10
defining here not too sure why this is policy number this doesn't seem to be correct remove label
27:17
um so these are basically the steps and you just
27:24
rinse and repeat yeah so as you see the policy number is here and you just repeat that
27:30
over those things now you can also extract multiple data um for example if it would be about a specific
27:36
car number or anything else or address uh or dates then you know you can also do that here so it's not only limited to
27:42
one key data so that's the first part with the uh policy numbers that you can use or
27:48
insurance numbers or any reference numbers basically or anything else that we used um and then the next step is uh
27:55
understanding and categorizing the text right and this goes through a fairly similar process but what we
28:01
did for this is a custom text classification right um and why was
28:06
it again a custom because there was no such thing available for specific insurance needs here right um
28:14
and there's two options here right so um text classification is able to do single label classification or multi-label
28:21
classification um the single one means that each document only has one category
28:26
and multi-label means that every document or every input file of unstructured text
28:32
contains multiple categories um I think the latter one could be an example for movies right um where you assign
28:38
multiple labels to a specific movie um so there's more options there
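To make the single versus multi-label distinction concrete, here is roughly how the two result shapes differ; the category names are invented for illustration and the exact response schema may vary by API version:

```python
# Single-label: exactly one category per document.
single_label = {
    "id": "1",
    "class": [{"category": "HealthInsurance", "confidenceScore": 0.91}],
}

# Multi-label: any number of categories per document (the movie example).
multi_label = {
    "id": "2",
    "class": [
        {"category": "Action", "confidenceScore": 0.88},
        {"category": "Comedy", "confidenceScore": 0.67},
    ],
}

def top_category(result: dict) -> str:
    # Either way, you can pick the highest-confidence category.
    return max(result["class"], key=lambda c: c["confidenceScore"])["category"]
```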
28:43
so let's have a look at how this works in the portal from here and
28:48
you will recognize that the experience is fairly similar actually right so again we will be uh looking at
28:55
the documents that are uploaded to my existing storage account and then we
29:05
will add labels here and see what's being
29:17
extracted okay great uh so I've got some emails here some documents here uh again this
29:22
is also for some specific um or is it here
29:27
broken arm right and on the right side instead of underlining here I'm labeling the entire
29:33
email so this seems to be health related as it seems to be for a broken arm
29:39
um next one could be a life policy and then I'm assigning those labels here right um so you can add
29:45
the labels here assigning just one label uh for a specific
29:51
email so that's the labeling process um using my sample data and then assigning to it what I'm expecting from
29:57
that so those are the two steps and now let's have a look at a bit more of a solutioning point of view how this could
30:04
be used in a kind of um software environment or how you can use that in
30:09
your um your solution right but first maybe maybe a small word about um integration AI within middleware right
30:16
because um I do come from an integration background um also from on-premises back in the days before Azure existed I
30:24
would say um and I do think what we're doing here is enhancing um data right so we are enriching specific
30:30
cases right uh by extracting data by assigning um categorizations to existing
30:36
emails and I would say if you already have an existing integration between two systems right um this is a really good
30:43
opportunity to use AI with this because you are able to enrich the existing flow
30:48
enrich the existing data and the integration exists already so that's the first I would say lowhanging fruit from
30:55
that point of view um especially if you already have a middleware system um leverage that to to
31:03
enhance the data that you're sending um between two systems here um now if we
31:09
look more specifically about the solution design that we used um for email triage right
31:16
um we had a mailbox uh claims@company.com and we did have some software to deal with um email cases in this case
31:23
it was Dynamics um CE we did send the email or the data to a
31:30
logic app or an orchestrator uh which then kicked off a
31:36
process where the first step would be uh finding the policy number right
31:42
extracting that key data information um we then had multiple APIs depending on
31:47
each department um to find out if a policy number or claim number is found and who actually owns that policy
31:53
number and then we you know if nothing has been found if no results were there um we do look if there's any you know
32:00
classification found if we can find the purpose and the department and
32:06
then let the system handle those emails in this case Dynamics um a fairly
32:12
straightforward example I would say there's lots of different ways where you can um extend this to better fit the
32:19
needs of your business case or your company but I think generally you will rely on those two items as well
32:32
right where we can also extract policy numbers departments uh and then this
32:37
part right now maybe before we do the demo right um so
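The flow just described — extract a policy number first, fall back to classification, otherwise leave it for a human — can be sketched as plain orchestration logic. The three callables are stand-ins for the Logic App's calls to the custom NER model, the custom classification model, and the per-department policy APIs:

```python
from typing import Callable, Optional

def triage_email(
    body: str,
    extract_policy: Callable[[str], Optional[str]],    # custom NER stand-in
    classify: Callable[[str], Optional[str]],          # classification stand-in
    lookup_department: Callable[[str], Optional[str]], # department API stand-in
) -> str:
    # Strategy 1: a policy/claim number pins down the owning department.
    policy = extract_policy(body)
    if policy:
        dept = lookup_department(policy)
        if dept:
            return dept
    # Strategy 2: fall back to classifying the email text.
    dept = classify(body)
    if dept:
        return dept
    # Nothing found: a human dispatcher takes over.
    return "manual-triage"
```

In the actual solution these steps were Logic App actions rather than Python, but the branching is the same.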
32:45
I kind of demonstrated um that specific data part but I didn't demonstrate the other
32:50
possibilities right so I would say once that's done um you really go on the left
32:56
and again there's no need to develop um because it's all in the browser um on the left side training
33:03
jobs you can then start a training job um just enter a model name for example a
33:09
V2 um we are again looking at the split setup here so 80/20 for example and then
33:16
you can click on train uh a couple of minutes or potentially hours later um once that's all finished um you will see
33:23
that a model is available here i just need to look at
33:30
okay here is the model performance that's what I was looking for um so looking at the performance score
33:37
and these are rather interesting items right um I did only upload well uh 19
33:42
documents so score is not too great but there's already some data that we can use for testing right so immediately
33:49
there's a complaint about there's not enough data set um it seems to be that um I need to upload more documents so
33:55
that's useful feedback
34:01
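The 80/20 split setup mentioned for the training job can be sketched as follows; the document names and the fixed random seed are illustrative, not from the demo.

```python
# A minimal sketch of an 80/20 train/test split over labeled documents.
import random

docs = [f"email_{i}.txt" for i in range(19)]  # 19 documents, as in the demo
random.seed(0)                               # deterministic shuffle
random.shuffle(docs)

cut = int(len(docs) * 0.8)
train, test = docs[:cut], docs[cut:]
print(len(train), len(test))  # 15 4
```

Language Studio does this split for you when you configure the training job; the point is simply that roughly a fifth of the labeled data is held back to measure performance.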
It seems to be unbalanced, because some of the categorizations have
34:06
more documents than others, and there seems to be a lot of noise. But really, if you look at,
34:12
for example, the confusion matrix, you can dig deeper into the values, as well as all
34:18
the other tabs. But this one is feedback for whoever trains the model, as in:
34:24
hey, we think you can do this in order to improve it. Let's say you're happy and you want to release the
34:29
model; that's just exposing the model through a REST endpoint, and from there I would say you're good to go. Now, looking
34:37
at a bit more of a demo setup, let's see if we can go back to my
34:42
resource group. I set up a couple of samples with a Logic App here, and the first
34:49
thing that I want to do is a bit more basic: let's see if we can do some language detection here.
34:59
For those who are not familiar with Logic Apps: it is a low-code, enterprise-ready integration platform, or more of
35:06
an orchestration platform. In this case I've set up a
35:12
REST endpoint; it works with connectors that are predefined, basically, and it
35:19
allows me to connect to that AI Language service itself.
35:26
So it will then send that same request to AI
35:32
Language, and it will send me back the response here. So this is just a rather big proxy, I would say.
35:38
Nothing too complicated. So let's see how we can
35:45
then look at a demo to extract some of that data, to see if I can do a language
35:52
detection. Let's see if we can simulate this one here.
35:59
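As a rough sketch of what the Logic App forwards, a language-detection request to the Azure AI Language analyze-text endpoint looks like the payload below. Here we only build the request body and parse a canned response in the documented shape, rather than calling the live service; the document id and sample values are illustrative.

```python
def build_request(text):
    # Request body for the analyze-text API, kind "LanguageDetection".
    return {
        "kind": "LanguageDetection",
        "analysisInput": {"documents": [{"id": "1", "text": text}]},
    }

# A canned response in the documented shape, standing in for the live call.
sample_response = {
    "results": {
        "documents": [
            {"id": "1",
             "detectedLanguage": {"iso6391Name": "en",
                                  "confidenceScore": 0.9}}
        ]
    }
}

doc = sample_response["results"]["documents"][0]
lang = doc["detectedLanguage"]
print(lang["iso6391Name"], lang["confidenceScore"])  # en 0.9
```

In the Logic App, the same payload goes out over the HTTP connector and the `detectedLanguage` field drives any routing decisions.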
So: "hello, how are you buddy", and this is then the reply: it's fairly certain that this is English, with
36:06
a confidence score of 0.9. Obviously, what happens is
36:12
that if you're using shorter texts, then, I've found, it doesn't seem to be too
36:17
certain. But what I did notice is that, for example, if we try to
36:22
confuse it a little bit with some Spanish, I expect the score to drop.
36:30
Even if we send some typos, I expect it to drop even further, but it
36:35
still seems to be fairly confident that this is English. Again, it's a
36:42
very short text, but I noticed the
36:48
performance is definitely useful. This is also something we used for agents:
36:55
specifically in a country where you deal with multiple languages, or in enterprise integration, extracting the
37:03
language also allows you to route the email to specific agents, someone who is
37:09
able to deal with Spanish customers or someone able to deal with English customers, for
37:17
example. So that's this part, a small demo for language. Now a little
37:23
bit more about the setup of what we can use for email triage.
37:29
Here we have a bit more of a basic orchestration. Again, it's a REST endpoint that I
37:37
exposed here. So what are we doing? We receive an HTTP request, and the
37:43
first part is that integration with the policy number. So we
37:48
simply forward the email to
37:54
the AI Language REST endpoint. The second step is: hey, is there any policy number found, is there anything present?
38:00
If not, then we want to rely on that department extraction. And again, we do something very similar here: we just
38:06
send the entire email, for the sake of demoing, to that part, and if everything is found, we just reply
38:13
to the client. Obviously a demo, but in reality you could use something more asynchronous: put
38:19
the message on a service bus, which then picks it up and
38:25
sends it back to your ERP system. You can also expose this to your ERP
38:31
system. Right, so let's have a look at how we can do a little bit more with emails in
38:38
this case. We've got an email from Charlotte; she's asking about stolen electronics after her home
38:44
was burglarized, with some specifics. So I'm a little bit
38:49
more interested in seeing how we can extract data from this email, and
38:55
indeed, we were able to extract a specific policy number here. So
39:01
that's the policy number. Now, Charlotte was, I would say, a very nice customer, because she also mentioned it in a very
39:07
nice way: "hello, my policy number is", which is nice. I would be a bit more interested to see the
39:14
confidence score and what we
39:21
receive. What are we expecting here?
39:30
So, looking back at that,
39:39
it was indeed able to extract the text and say: hey, this is the policy number that we're looking for,
39:46
specifically its position in the text, but also a confidence score of one. Oh, I've never seen that, a one! So it seems to be
39:52
100% sure that this is indeed the policy number for this specific email. Interesting.
39:59
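Reading the NER result can be sketched like this. The entity values are stand-ins modeled on the demo; the fields follow the documented entity shape (`text`, `category`, `offset`, `length`, `confidenceScore`), and the `PolicyNumber` category name and threshold are hypothetical.

```python
# Sketch of reading a custom NER response for the policy-number case.
sample_entities = [
    {"text": "POL-123456", "category": "PolicyNumber",
     "offset": 27, "length": 10, "confidenceScore": 1.0},
]

def best_entity(entities, category, threshold=0.5):
    """Return the highest-confidence entity of a category, or None."""
    hits = [e for e in entities
            if e["category"] == category and e["confidenceScore"] >= threshold]
    return max(hits, key=lambda e: e["confidenceScore"], default=None)

hit = best_entity(sample_entities, "PolicyNumber")
print(hit["text"], hit["confidenceScore"])  # POL-123456 1.0
```

Returning `None` when no entity clears the threshold is exactly the condition the Logic App checks before falling back to classification.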
So this is what we received. Now, this is obviously also an orchestration, so
40:05
it was happy and it sent back the policy number there. The second case that we can run
40:11
through: again, we've got a fairly similar email, but in this case no policy number was
40:19
given, but it understands it anyway. Again about her burglarized
40:25
home and her electronics. This time it seems to understand that it's for non-life; now, non-life in
40:31
insurance terms also covers home insurance. So again, what
40:36
happened here? Let's have a look at this orchestration that we
40:41
defined. The first part was unable to extract the specific
40:48
key data; indeed, no entities were found, absolutely nothing, which in that case
40:54
fell back to the second step. Looking at the output here,
40:59
and let me zoom in slightly more, we can see that indeed non-life, or
41:06
home insurance, was assigned as a categorization for this email. The confidence score is not too high, I
41:12
would say, 0.4. I know the reason: I only uploaded three documents with non-life to my training set, so a very, very,
41:20
very small sample set, which also yields a very low confidence score.
41:25
But it's nice to see, and I would say surprising, that even with a very limited amount of data, it's still able
41:31
to understand the intent of the email here, or extract that information. Cool.
41:41
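One common way to act on a low-confidence classification like the 0.4 above is a threshold: route automatically only when the score is high enough, otherwise queue the email for a human. The threshold value and category names below are illustrative choices, not from the talk.

```python
# Sketch of threshold-based routing on classification confidence.
def route(classifications, threshold=0.6):
    # Pick the top-scoring category from the classifier output.
    top = max(classifications, key=lambda c: c["confidenceScore"])
    if top["confidenceScore"] >= threshold:
        return ("auto", top["category"])
    return ("manual-review", top["category"])

# The demo's low-confidence non-life result would go to manual review:
print(route([{"category": "non-life", "confidenceScore": 0.4}]))
# ('manual-review', 'non-life')
```

Tuning the threshold trades automation rate against misrouting risk, which matters most for ambiguous emails like the one in the next demo.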
So that's kind of the demo for this part. Now, I always try to challenge things a little
41:47
bit, and I was wondering: hey, what happens if people send a more confusing email? In reality this doesn't happen every
41:53
day, but in this case we've got someone who is referring to an invoice from a doctor's consult; there seems to be a
42:00
typo here, and she's also asking if she needs a fast car to drive to her appointment. So, a couple of
42:06
observations here: there's a typo within "doctor", and there's "fast", somehow "car",
42:11
and there's "drive". As a matter of fact, there are more words pointing to non-health in this email than to
42:28
health. Still, it's able to understand the intent of the email and to
42:34
see what's being said. Now, to be honest, this is a very confusing email, right? If I were the AI, if I were
42:41
reading this email, I would be equally confused. But I do think we all understand that this is more
42:47
about a doctor's consult invoice, that she's asking questions there, or
42:53
that she's referring to it, so it seems to be more of a health-related concern, for the health department of the insurance
42:58
company here. So, yeah, that's an interesting case, I would say. I do think there's later also an option to
43:03
ask more questions and be interactive, and I do invite you: if you want to play around with
43:09
this, feel free; I'm happy to change this demo here as well, and to see if we can
43:17
mess around with this, or not. Interestingly enough, I'm curious to see what the
43:23
confidence score was in this part. What are we expecting? I'm not expecting such a high confidence score, especially for
43:30
the department, considering there was some confusion; even I myself was confused when I wrote
43:36
this email. Again, I'm not trying to provide a perfect demo sample, but also trying to see how things are
43:42
going. And indeed, the confidence score dropped even further: it was indeed the health insurance that was assigned,
43:48
but with a very low confidence score, 0.35. So that's it for the
43:55
demo part, effectively. If you feel like using this, or
44:05
trying it out, I highly recommend trying it out and playing around with it yourself, for multiple benefits.
44:13
As I've explained before, it's a very fast time to market, and especially the free tier allows
44:20
you to do 5,000 text extractions, or records, free per month. So that's
44:26
great if you want to start out and play around, especially with an MVP version that you want to build. Otherwise it's
44:31
just a pay-as-you-go model; you're charged per thousand calls. However, the pricing varies heavily, so since
44:38
we've got a little bit more time, I think we can also have a look at the
44:45
Language services pricing. We can also look at the pricing here, and the reason
44:51
why: well, obviously this might change, I'm well aware that the video is
44:56
recorded, but just to point out that there are a couple of price differences. If you
45:02
have a look at, for example, sentiment
45:08
analysis: it is charged, in this case, up to $1 for one
45:13
unit; for health, in this case, it seems to be more expensive.
45:21
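A back-of-the-envelope cost estimate with the 5,000-record free tier mentioned earlier might look like this. The per-1,000-record price is a made-up placeholder, not the actual rate; check the current Azure AI Language pricing page for real numbers.

```python
# Illustrative monthly cost with a free tier and pay-as-you-go pricing.
def monthly_cost(records, free_records=5000, price_per_1000=1.0):
    billable = max(0, records - free_records)   # only records above the free tier
    return billable / 1000 * price_per_1000

print(monthly_cost(4000))    # 0.0  (within the free tier)
print(monthly_cost(25000))   # 20.0
```

Running this kind of estimate per feature (NER vs. custom classification, each with its own rate) is one way to "pick your battles" before committing to a model.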
Text classification also seems to be more expensive. So I would say pick your
45:26
battles in terms of what you want to use first. And especially:
45:31
there's a huge difference between those two. Obviously the use cases are very
45:37
different, but there's also a very big difference between this
45:42
and, for example, this, because we are starting off with different prices. And the reason why this is so different is that what we've seen
45:50
in this demo is a custom model. Custom models are charged more per
45:55
use, and you will also see that they are slower; I wouldn't say slower in
46:00
the sense that it takes minutes, but in terms of performance. The prebuilt
46:08
text features are cheaper, and they also perform faster. So my
46:15
recommendation is to use those if possible, and then potentially take a second look:
46:20
do we really need to use those custom models? What kind of business
46:25
value do they bring? You still want a return on investment. The training is charged per hour, but I
46:32
didn't really notice too many expenses there personally, especially since the
46:37
training is more of a one-off, and then you go through those cycles.
46:45
Normally the biggest expense will be the usage charges themselves. There's a little
46:51
bit of endpoint hosting, but it's fairly small, and then there are commitment tiers available as well, for
46:57
large customers. Now, if you feel like playing around with this, I do recommend the
47:03
very, very good demo setups available under learn.microsoft.com.
47:13
There are lots of resources; looking back at the other options, there's also some more material here,
47:19
which
47:24
also includes training material that you can use. So I heavily recommend using this website, or
47:32
at least this category. And with that, we are ending the session here. I
47:38
think there is a little bit of Q&A via a separate link, but I do
47:44
welcome all questions, of course. So, yeah, thank you so much for joining.
47:50
Yeah, thank you so much for presenting; it was really, really interesting, I think, and I also appreciate the
47:58
links that you gave to more information, if someone is interested in trying these things out.
48:07
So, right now
48:12
we don't have that many questions here, but as
48:17
I described in the beginning, we will have a short Zoom meeting for
48:23
those who would like to ask questions directly. So let me just find
48:29
the QR code and also the link; let me post the link here in the
48:42
chat.
48:49
So, this is the link here to our Zoom meeting.
48:58
And also, just out of interest, can
49:03
our viewers see you on stage? Are you presenting at any conference or meetup
49:08
in the near future? — I do have a couple of things lined up.
49:15
Well, not this month, but next month I'll be in Sofia for
49:21
Global AI Day, and then the weekend afterwards I'll be in Poland for SQL Day
49:28
as well, for a presentation. So that's the schedule for, I would say, the next month: a couple
49:35
more presentations. So, yeah, feel free to join in and tune in. I'm not
49:41
coming to Sweden yet, but it's on my schedule, so hopefully very soon indeed.
49:47
That's nice. Okay, then, thank you so much for your session, and also
49:52
thank you to our audience, whether you watched this live or are watching
49:58
the recorded stream on YouTube. I wish you all a very happy weekend, and
50:05
see you in our next session.
50:12
[Music]


