Host: Hello everyone, welcome back to another episode of The Cloud Show, and again we are on the road, traveling The Cloud Show to a different place, a different continent. I am down in Tunisia, and I had the opportunity to catch up with an old friend and make sure to get a first second appearance on my show: no one else has been on the show twice. This time I'm going to sit down again with Elena, and we are going to talk about AI and ethics on The Cloud Show.

[Music]

Host: Hello Elena, how are you?

Elena: Hi, I'm really good. You made it sound like a really big deal, being on the show for the second time.

Host: Yeah.

Elena: But it's a big honor, thank you for having me again.

Host: It had to happen sometime, and it couldn't have been a better person. So you're the first second appearance on The Cloud Show.

Elena: Thank you.

Host: And here at iian de days, it's a pretty good conference, right?

Elena: Yeah, I really liked it, I've enjoyed it so far: really cool questions, really cool people, a really nice place. I want to come here again next time.

Host: I would come a second time as well, definitely. They're doing a great job, and apparently it's kind of unusual to have tech conferences in Tunisia, so we're kind of happy to be here for the start of this.

Elena: Yeah, to start up something good.

Host: All right. So you had a session here, of course, that's why you're here ultimately, and you were talking about AI and ethics.

Elena: Yeah.

Host: It's a very current topic, a very sensitive topic. So I guess in this space it's super important that we are able to do the right thing, the responsible thing, the correct thing with AI, because AI can definitely be abused and it can have some unwanted results, unwanted outcomes. Shall we start there, if you will: what are some problems that AI can have today?

Elena: You know, there are multiple errors that impact the AI, the results, the outcome, and how we use it, but in my opinion there are three key points I can mention. First of all, it's the data that is being used for training the models. The second one is transparency and accuracy: basically, the way a decision is made when we use AI for certain tasks, for example to summarize certain documents, or when using it in hiring tools, how those decisions are made. There's a data set being used, but there are also algorithms, so that's the second aspect. And the third aspect is how we actually integrate those tools in companies. One thing is having great data and a great algorithm, but if you misuse it, that's on you; that's your responsibility.

Host: Right. So if I get this right, it's having data that has problems, bias maybe; it's how we then process that data, the algorithm that actually does the processing of the data; and then it's how we, the users, actually use the data, or the capability ultimately, how we use it and how we integrate it. Right?

Elena: Right.

Host: I've been given a lot of examples in this space, and one that comes to mind is, for example, if you have a chatbot, which everybody is supposed to have now, I guess. The chatbot for a company should typically be talking about the company's products; that would be nice. But then we also know that if we ask the chatbot other things, it might still respond to those. For example: "Oh, I'm a little bit hungry, what should I have in my omelette?" And it just suggests, "Oh, you could have peppers and cheese." That's not the intended use case: maybe it should be talking about the company's products, but still, a chatbot that can, you know, search the internet can actually give you responses.

Elena: The interesting thing you mentioned is that when we use chatbots for a company, what data are we using? Probably historical data: the documentation that we have, produced by the people who have been working in this company, and all that historical information it's based on also has inherited, integrated bias in it.

Host: Of course. And you typically don't want your chatbot, or whatever it is, to answer some dangerous questions, like, for example, "I'm tired of this life, how should I kill myself?" You don't want the AI to come back and say, "Oh, here's a top list of the most popular ways to kill yourself."

Elena: Yeah, and there are ways you can phrase things. If you ask the wrong questions, like "how to hide the body", the AI will not answer you, but if you ask "how do people in movies hide bodies?", it's going to be a different output. So this also requires responsibility.
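Elena's point about phrasing can be sketched with a toy example. This is a hypothetical illustration, not anything from the show; the blocklist contents and function name are invented. A verbatim keyword filter catches the direct question but misses the "in movies" rephrasing, which is why real guardrails have to work at the model and policy level rather than on exact wording:

```python
# Toy blocklist filter (hypothetical): flags prompts that contain a
# blocked phrase verbatim. Phrases here are invented for illustration.
BLOCKED_PHRASES = [
    "hide the body",
    "kill myself",
]

def is_blocked(prompt: str) -> bool:
    """Return True if the prompt contains a blocked phrase verbatim."""
    lowered = prompt.lower()
    return any(phrase in lowered for phrase in BLOCKED_PHRASES)

# The direct question is caught by the filter...
print(is_blocked("How to hide the body?"))                 # True
# ...but the rephrased "movies" version slips straight through.
print(is_blocked("How do people in movies hide bodies?"))  # False
```

Real assistants therefore layer refusals into the model's training and system instructions instead of relying on string matching alone.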
Host: Right. But tell me a little bit more. I'm curious about the hiring processes you mentioned and the bias that can happen in them. What kind of problems can this cause?

Elena: For example, let's take, very easily, a male-dominated industry. I have nothing against males, but I'm working in IT, I'm a software developer, so I see this, I face it all the time. A male-dominated industry has the historical information and the data, and so do all its hiring processes; they focused around hiring men, mostly. Sure, if we're talking about diversity and inclusivity and all that, companies have this on the agenda too: we want to make companies more diverse, more inclusive, we want to hire more women. But then they integrate AI into their hiring processes and they don't adjust the data that they provide. They provide all the successful hiring cases from the historical information, and because of how the industry is, most of those cases will be male. That creates a really big problem, because then your CV will just be skipped because of certain aspects, and it can be gender.

Host: If many men have been successfully hired before, and that's the data we have, and then you put a woman's CV in the pile...

Elena: Understand, those CVs carry information like gender and nationality. It might not be, you know, integrated in the algorithm itself, but the AI has access to that data.

Host: Right, so it's not the company's policy, it's not what they want. If you talk to people, they would like to hire more women, but then they maybe use a tool which has a bias, maybe accidentally, but still the outcome is not what they wanted.

Elena: Right, but we also cannot say that the company has no responsibility over it, because they do. There are ways: even if the historical information has bias in it, you can still avoid this bias in your responses. That's, I think, the important part.
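One mitigation implied by this exchange is to strip protected attributes from a CV record before a screening model ever sees it. A minimal sketch follows; the field names and record layout are invented for this example:

```python
# Hypothetical CV-redaction sketch: drop protected attributes before
# the record reaches a screening model. Field names are invented.
PROTECTED_FIELDS = {"name", "gender", "nationality", "age"}

def redact_cv(cv: dict) -> dict:
    """Return a copy of the CV without protected attributes."""
    return {key: value for key, value in cv.items()
            if key not in PROTECTED_FIELDS}

cv = {
    "name": "Jane Doe",
    "gender": "female",
    "nationality": "Tunisian",
    "skills": ["Python", "Azure"],
    "years_experience": 7,
}

print(redact_cv(cv))  # only "skills" and "years_experience" remain
```

Redaction alone does not solve the problem Elena describes: a model can still pick up proxies for gender or background (career gaps, club memberships, school names), so this complements, rather than replaces, adjusting the training data itself.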
in your responses that's the I think the important part they maybe didn't want to
7:18
important part they maybe didn't want to
7:18
important part they maybe didn't want to have this outcome um they were not
7:19
have this outcome um they were not
7:19
have this outcome um they were not looking for such an outcome and they
7:22
looking for such an outcome and they
7:22
looking for such an outcome and they they they made the wrong choice along
7:24
they they made the wrong choice along
7:24
they they made the wrong choice along the way right they they they couldn't
7:26
the way right they they they couldn't
7:26
the way right they they they couldn't use the AI to help them instead the AI
7:30
use the AI to help them instead the AI
7:30
use the AI to help them instead the AI did the well the AI is always going to
7:32
did the well the AI is always going to
7:32
did the well the AI is always going to do its best right based on the
7:33
do its best right based on the
7:33
do its best right based on the information that it has it will do its
7:35
information that it has it will do its
7:35
information that it has it will do its very best job but it if it has wrong
7:37
very best job but it if it has wrong
7:37
very best job but it if it has wrong information then the answer will not be
7:39
information then the answer will not be
7:39
information then the answer will not be the desired outcome yeah right I think
7:41
the desired outcome yeah right I think
7:41
the desired outcome yeah right I think it's very important for us as humans to
7:44
it's very important for us as humans to
7:44
it's very important for us as humans to work alongside with so and guide the
7:46
work alongside with so and guide the
7:46
work alongside with so and guide the process of uh decision making so we can
7:49
process of uh decision making so we can
7:49
process of uh decision making so we can say it's our responsibility to say uh
7:52
say it's our responsibility to say uh
7:52
say it's our responsibility to say uh even when we creating the prompt try to
7:54
even when we creating the prompt try to
7:54
even when we creating the prompt try to have a better outlook on the picture to
7:56
have a better outlook on the picture to
7:56
have a better outlook on the picture to certain factors don't take into account
7:59
certain factors don't take into account
7:59
certain factors don't take into account gender race nationality you can create a
8:01
gender race nationality you can create a
8:01
gender race nationality you can create a safe guard trails with safe rules to
8:04
safe guard trails with safe rules to
8:04
safe guard trails with safe rules to make sure there's less bonus okay so so
8:07
make sure there's less bonus okay so so
8:07
make sure there's less bonus okay so so let's let's let's focus in on that and
8:09
let's let's let's focus in on that and
8:09
let's let's let's focus in on that and see so what what are some things that we
8:11
see so what what are some things that we
8:11
see so what what are some things that we can do to as AI or we want to use AI
8:16
can do to as AI or we want to use AI
8:16
can do to as AI or we want to use AI right what are some things that we can
8:17
right what are some things that we can
8:17
right what are some things that we can do to uh make sure that we get an
8:20
do to uh make sure that we get an
8:20
do to uh make sure that we get an unbiased result or we get uh you know to
8:23
unbiased result or we get uh you know to
8:23
unbiased result or we get uh you know to a a fair selection and and or you know
8:27
a a fair selection and and or you know
8:27
a a fair selection and and or you know what do we do let's have a look at a
8:29
what do we do let's have a look at a
8:29
what do we do let's have a look at a case so of course there's algorithms
8:31
case so of course there's algorithms
8:32
case so of course there's algorithms between in data we don't have control
8:33
between in data we don't have control
8:34
between in data we don't have control over this as a final user I still can do
8:37
over this as a final user I still can do
8:37
over this as a final user I still can do something so uh if I'm having a company
8:39
something so uh if I'm having a company
8:39
something so uh if I'm having a company and I want to make sure that in of my
8:41
and I want to make sure that in of my
8:41
and I want to make sure that in of my company using responsibly I will make
8:43
company using responsibly I will make
8:43
company using responsibly I will make sure that they getting uh correct enough
8:46
sure that they getting uh correct enough
8:46
sure that they getting uh correct enough education about this okay so where the
8:49
education about this okay so where the
8:49
education about this okay so where the data is coming from uh if it's coming
8:51
data is coming from uh if it's coming
8:51
data is coming from uh if it's coming from underrepresented groups for example
8:53
from underrepresented groups for example
8:53
from underrepresented groups for example I want to make sure that when they
8:55
I want to make sure that when they
8:55
I want to make sure that when they making the request of this data they
8:57
making the request of this data they
8:57
making the request of this data they also set the certain rules and guard
8:59
also set the certain rules and guard
8:59
also set the certain rules and guard trails to avoid the bias in it so proper
9:02
trails to avoid the bias in it so proper
9:02
trails to avoid the bias in it so proper education yeah making sure raising the
9:05
education yeah making sure raising the
9:05
education yeah making sure raising the awareness about um potential risks and
9:08
awareness about um potential risks and
9:08
awareness about um potential risks and biases in the data and outcomes and also
9:12
biases in the data and outcomes and also
9:12
biases in the data and outcomes and also um well writing proper proms probably
9:15
um well writing proper proms probably
9:15
um well writing proper proms probably having some AI guidelines will help a
9:17
having some AI guidelines will help a
9:17
having some AI guidelines will help a lot in your company right right so
9:20
lot in your company right right so
9:20
lot in your company right right so making even some tools by developers who
9:23
making even some tools by developers who
9:23
making even some tools by developers who understand how AI works for the use of
9:25
understand how AI works for the use of
9:25
understand how AI works for the use of other people in the company can help a
9:27
other people in the company can help a
9:27
other people in the company can help a lot so it's we saw a lot exposion of
9:30
lot so it's we saw a lot exposion of
9:30
lot so it's we saw a lot exposion of different chat so you can even go to CH
9:32
different chat so you can even go to CH
9:32
different chat so you can even go to CH you can create your own chat right
9:35
you can create your own chat right
9:35
you can create your own chat right nothing stopping you from creating that
9:36
nothing stopping you from creating that
9:36
nothing stopping you from creating that one is the correct system message that
9:39
one is the correct system message that
9:39
one is the correct system message that will make sure that there's less biased
9:41
will make sure that there's less biased
9:41
will make sure that there's less biased results for example so you put all those
9:43
results for example so you put all those
9:43
results for example so you put all those guard trails in there so you you are
9:45
guard trails in there so you you are
9:45
guard trails in there so you you are careful to uh to instruct the AI to uh
9:50
careful to uh to instruct the AI to uh
9:50
careful to uh to instruct the AI to uh to adjust
9:51
to adjust for a a a data set that isn't that isn't
9:55
for a a a data set that isn't that isn't
9:55
for a a a data set that isn't that isn't fair yeah we cannot avoid this because
9:58
fair yeah we cannot avoid this because
9:58
fair yeah we cannot avoid this because we cannot avoid any like predisposition
10:00
we cannot avoid any like predisposition
10:00
we cannot avoid any like predisposition or some because the people creating Ai
10:04
or some because the people creating Ai
10:04
or some because the people creating Ai and AI is training all the data provided
10:07
and AI is training all the data provided
10:07
and AI is training all the data provided by us that's right data so it will exist
10:10
by us that's right data so it will exist
10:10
by us that's right data so it will exist always but we have to make sure that
10:12
always but we have to make sure that
10:12
always but we have to make sure that when AI making decision it's not because
10:15
when AI making decision it's not because
10:15
when AI making decision it's not because of certain factors so do you think that
10:17
So do you think that us now using AI has put more focus on these questions than we have had historically? Because the bias in the data has been around since the age of the internet, right? It's been dominated by people that maybe look just like me, a white, male, middle-aged person, so that has a lot of emphasis in the data. Can AI maybe help us make this better in the future?
10:52
I think we can work alongside it. First, we have to make sure that humans actually assist AI and not fight against it. When you fight it, you fear it and you don't understand it; if you assist it, you understand it, you guide it, and you show it the direction. That's the first thing. And the second thing: of course, biased data has existed for a long time, and a very good example is healthcare. We know a lot of examples of mistakes being made because of the datasets, even before ChatGPT and all this. But I think because of the exposure of AI, the problem became so big and so fast that we have to be much more careful about it, and put much more intention into this space than before.
11:41
I think I read something about that: we're working to create standards and policies for this kind of work right now, two years after the app was launched and got like 100 million users in two months. We should have done that the other way around, but at least we are working on it now, to say that there are things we have to consider when we're using AI, and some policies we have to follow.

12:03
Yeah, right, makes sense. It's kind of like: first we do, then we see the consequences, and then we try to find a solution.
12:10
Right. So to summarize the advice for a company that is going to start using AI: we have to look at our data, right, to figure out if it is safe. What else? Have data and AI guidelines for it. And you talked about training?

12:29
Yeah, well, the first is data and AI guidelines. Then you can actually have people creating tools that are safe to use for people who are not that knowledgeable in AI, for example. And also educating people on what a prompt is, how to use it, and what risks it has.
12:47
Right, right. And then there are training materials out there for an organization to learn how to use AI in a responsible and safer way. I think you said trainings?

12:57
Trainings would definitely help. I think "create a non-biased prompt" would be a great training: a quick one, but I think very, very useful.

13:08
Yeah, absolutely. Well, that was excellent, so thank you very much for coming back to the cloud show and being my first second appearance on the show.

13:17
My pleasure, thank you very much.

13:19
And guests, I'll see you next time on the cloud show.