Okay, all right. I will start with a brief introduction. My name is Abaya, and I am working as a software team leader with Schlumberger, short name SLB. Today's topic is scaling oil field production operations using cloud computing. I completely agree with the last presenter, who focused on understanding the requirement and keeping it simple. That is also what I follow in day-to-day life: I usually build solutions incrementally, not everything in a single day. I start with something small and then keep iterating over it. So I completely agree with everything he presented.
Before going into the main topic, I would like to give a brief introduction to the oil and gas business. For the folks who don't know, SLB is the world's number one company in upstream oil and gas services. Around the globe there are mainly two kinds of companies dealing with the oil and gas business. The first kind are the oil and gas operators, who own the oil wells, and these wells could be anywhere on Earth, onshore or offshore in the middle of the sea. The second kind are the companies that operate in these oil and gas wells. I am from the second kind: I work for the service provider, and SLB is the world's number one service provider.
What do we do? We deal with the upstream. If you want to know what upstream is, I am showing it on the screen: the left-hand side is called upstream, the middle says midstream, and the right one says downstream. We are involved in everything related to drilling the oil well and extracting the oil from under the Earth, and this oil well, as I said, could be in the middle of the sea, in some desert, or in some remote area. Everything is involved, including tools, technologies, hardware, and software, and we provide all kinds of technologies. When I say skill set, you will find petroleum engineers, geophysicists, petrophysicists, geologists, software engineers, data scientists, and machine learning engineers, everyone working as a team. Our single goal is to optimize oil production, so we want to apply all of these technologies together to optimize the oil production.
As you can see on the left-hand side, I am showing two pictures. One is an onshore pumpjack; you will mostly see that a rod is going up and down, pumping oil from the well. The other picture shows someone using a helicopter and landing on a pad where instruments are installed in the middle of the sea, and they are producing the oil there.
Then there are the midstream companies. These companies are responsible for processing and storage: once the oil is produced, someone should be there to transfer it, so these companies mostly build the pipelines and run the transport and trucks. The third kind of companies are the downstream companies. Once the oil has reached the distribution center, these companies refine the oil, and after that they also deal with the sales and marketing.
Today's topic is mostly about upstream, and about a cloud-based solution we built that helped optimize oil production.
I would like to start with some terms. What is a digital oil field? "Digital oil field" is just a fancy name we use; it is nothing new, it has been in the industry for more than 25 years. But over the last ten years or so, everyone started talking about cloud, and now we have ChatGPT, machine learning, and data scientists, so people started correlating the "digital" term with cloud and machine learning and everything. For us, the digital oil field is anything related to computing: it existed even before cloud computing, but now we use it more proactively. It means using any digital technology, whether software, hardware, or data analytics, and the end goal is to enhance the exploration, production, and management of these oil and gas fields. The general operations involve collecting a lot of data from the sensors installed in the oil field, so artificial intelligence is involved and the Internet of Things is also involved. As you can see, the right-hand picture is one I generated with AI. It shows a typical oil field where several people are working; you can also see hardware and computer systems installed, connected to a satellite so that they can get an internet connection, and they are always sending data to the remote oil field operations. Today's topic is about a similar application, which is installed and is optimizing this digital oil field operation.
I would like to start with the challenges and the requirements, which we also heard about in the last session. The challenge with production operations is that the operation is highly complex: it involves many people from different business roles and different geographies, and different workflows are always being applied. These problems are typically solved using a diverse, disconnected set of software applications and tools. Before we introduced the cloud computing part we were still running this operation, but at a smaller scale. The main challenge with the existing applications was that collecting the data was not easy, and making that data available for running scalable workflows was not straightforward, because our data was scattered and there were so many on-prem, desktop-based applications running and collecting it. It was not very scalable.
The second thing is that in an oil field the data frequency can be very different: some data could be second-based, or even sub-second, while other data could be monthly, yearly, weekly, or daily. There are also several conventions followed by different oil companies around the globe. With the existing desktop-based applications, each application talks its own data standard, so whenever we transfer data from one app to another we always need to build some kind of connector or adapter so that one application understands what the other application is talking about. That was very hard to maintain, not easily accessible, and it had poor scalability, which led to non-productive time and data quality issues as well. In the end it also produced inconsistencies, because with these on-prem apps, if some data became inconsistent, I needed to import that data again and reconnect it to several applications a couple of times. That was not a good way to run these workflows. The other common complaint from customers was that they always saw missing or incomplete data; they were always missing a complete and consistent view of the data. So these are the challenges.
Okay, I have talked enough about data and data frequencies, but I would like to give some context on what that data actually looks like. At a very high level I would like to split the data into two kinds: one is time series data, the other is structural data. For time series data: whenever we are drilling the oil well, we are putting a lot of tools inside the earth, and along that drilling pipeline many sensors are installed. These sensors capture a lot of data points from beneath the Earth, and in simple terms these data points could be pressure, temperature, the gas flow rate, the oil flow rate, or the water flow rate.
Typically this time series data looks like what I am showing on the right-hand side. First, there is an acquisition time, which is associated with every data point. Let's say some oil production volume is coming in; I am showing the value 2.3, and I am assigning the unit of measurement as well. I am also assigning the kind of data, whether it is a double or a string, so that my storage system and my workflows can also understand it. And I am assigning the source acquisition time: there are really two timestamps, one for when the on-prem tool actually streamed that data point, and one for when I start seeing that data point on the cloud.
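As a minimal sketch of such a record, here is one way the data point just described could be modeled in Python; the field names and the unit label are my own illustration, not the actual schema used in the product:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class TimeSeriesPoint:
    """One time series sample, as described above; names are illustrative."""
    source_acquisition_time: datetime  # when the on-prem tool streamed the point
    acquisition_time: datetime         # when the point first appeared on the cloud
    value: float                       # e.g. the 2.3 production volume from the talk
    unit_of_measure: str               # assumed unit label, e.g. "bbl/day"
    data_type: str                     # "double", "string", ... for storage/workflows

point = TimeSeriesPoint(
    source_acquisition_time=datetime(2024, 1, 15, 8, 0, 0, tzinfo=timezone.utc),
    acquisition_time=datetime(2024, 1, 15, 8, 0, 5, tzinfo=timezone.utc),
    value=2.3,
    unit_of_measure="bbl/day",
    data_type="double",
)
```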
There are typical key characteristics of this time series data which are important for our workflows. For example, if I want to see a forecast of the oil production, I am interested in the trends and patterns, because I want to apply time series analysis, which is a typical machine learning workflow, and that is only possible if the data is stored in a time series manner. Time series means there is always some temporality associated with the data; its chronological order is maintained. The data can be periodic or sporadic. Periodic means we have some defined frequency, like daily, hourly, or minute-based. It is also possible that there are sporadic operations: for example, if there is some downtime activity in a certain well, that downtime activity is a sporadic event; there is no fixed interval associated with it, the downtime may start now and may finish in an hour, or it may go on for more than an hour. Third, we want to look at time series data because it has patterns associated with it, and our workflows want to analyze those patterns. There is also always seasonality associated with time series data, which involves patterns that repeat at known intervals; it could be daily, weekly, or so.
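To make the periodic versus sporadic distinction concrete, here is a small sketch using pandas with made-up one-second readings; the flat 50-unit rate and the downtime window are invented for illustration:

```python
import pandas as pd

# Hypothetical one-second oil-rate readings for a single well (one hour).
idx = pd.date_range("2024-01-15", periods=3600, freq="s")
rate = pd.Series(50.0, index=idx)   # a flat, periodic second-based signal
rate.iloc[1200:1500] = 0.0          # a sporadic downtime window, no fixed interval

# Periodic view: resample the second-based stream to a per-minute average,
# the kind of regular frequency a forecasting workflow expects.
per_minute = rate.resample("1min").mean()

# Sporadic view: downtime carries no schedule, so it is detected from the data.
downtime_minutes = per_minute[per_minute == 0.0]
print(f"{len(downtime_minutes)} minutes of downtime detected")
```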
The second type of data is structural data. To explain this data I would like to give you one example. Let's say some oil and gas company has an oil well in Texas; in Texas there is a city called Cameron. You can think of it like this: in North America, that oil and gas company may have different oil wells in different geographical locations. So they always start with the top root, which is the country; then they go to the state; they may also create a zone, meaning they can combine multiple geographies and start calling that a zone. After that they reach the geography where the oil well is actually located, and they always identify it using the latitude and longitude of the oil well.
On the right-hand side I am showing this kind of structural data. As you can see, it is a graph-based structure. In the center you see a well, and there are several branches coming out from that well: one in orange, one in red, one in yellow, and one in green. These different branches show the hierarchies this well is attached to. The well is obviously attached to some field where we are doing the drilling operation, and that field exists in some state, so one line shows that this oil well exists in this state. Then there is another hierarchy, which is for petroleum engineers if they want to know the reservoir, the zone, and the completion. Completion is another concept: it is the area down the oil well from where the oil is actually produced, so that area is called the completion. Then I am showing other hierarchies as well, for example for the tools installed: whenever oil comes up to the Earth's surface, there are multiple tools installed, which we call pumps, flow lines, and separators, so that hierarchy shows which surface components this oil well's tools are connected to. Then there are other hierarchies, like events: as I said, if it is a downtime event or something else, they have their own hierarchy. This structure is a graph-based structure, so you can see what we want to store: the node ID itself, the edges, and when each relationship started and ended. All entities in this system are connected by several hierarchies, and as I said, the same entity can be attached to multiple hierarchies as well.
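A minimal sketch of how such a multi-hierarchy graph could be represented follows; the node kinds, IDs, and dates are hypothetical, and the real storage model is not shown in the talk:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Node:
    node_id: str
    kind: str           # "well", "field", "state", "zone", "completion", ...

@dataclass
class Edge:
    parent: str         # node_id of the parent entity
    child: str          # node_id of the child entity
    hierarchy: str      # which branch: geography, reservoir, tools, events, ...
    started: date       # when the relationship began
    ended: Optional[date] = None  # None while the relationship is still active

nodes = [
    Node("well-001", "well"),
    Node("field-A", "field"),
    Node("zone-3", "zone"),
]

# The same well hangs off two hierarchies at once, as described above.
edges = [
    Edge("field-A", "well-001", hierarchy="geography", started=date(2010, 5, 1)),
    Edge("zone-3", "well-001", hierarchy="reservoir", started=date(2012, 9, 1)),
]
```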
So the main challenge is this: we have talked about the data, the time series data and the structural data. The time series data can be very large; it could be petabytes of data covering more than 10 years or so. And the structural data is graph data. So our challenge was to create scalable cloud storage where we can store this massive time series data and this graph-structured data. How did we solve this?
Our first problem was the data ingestion problem. In production operations, as I said, there are several sensors installed and several tools installed, and we have different kinds of data sources as well. There are typical data sources: there can be an on-prem relational database, which is called a production data management solution; there can be edge-based or IoT devices streaming data; there are mobile phones as well, where someone is capturing some data and preparing a report; and there can be a physics-based model. If you don't know about physics-based models: for example, in any oil field they install a multiphase flow meter, and the job of this multiphase flow meter is to check the liquid characteristics, the heat exchange rate, and the proportions of oil, gas, and water whenever that raw oil is coming from the earth. So we want to capture different kinds of data from different kinds of sources, and we definitely need a scalable data ingestion system. Data is really important, because production engineers and petroleum engineers spend 80% of their time looking at data; their workflows are basically data-intensive.
To solve this problem, what we did was create autonomous agents. "Autonomous agent" is a very famous term these days; we use it around machine learning models and LLMs too, but here an autonomous agent means any software process that runs continuously and acts on the events it receives. We always wanted some process to run close to the data source and keep listening for events, like a new data point arriving, or a request to fetch data for more than 20 years, because there are a few wells that have been producing oil for more than 100 years, and we want to fetch all of those 100 years of data. You can imagine the scale: if some well has been producing second-based frequency data for 100 years, there could be billions of data points, and we want to push all these data points to our cloud storage. So we built this agent. In simple terms you can think of it like a Windows service running continuously, and whenever it receives some event from the on-prem data source, it pushes those data points securely to the cloud storage.
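As a rough sketch of that agent loop, under the assumption that the on-prem source can be read like a queue (the publish call here is a stand-in for the real cloud messaging client):

```python
import queue

# Stand-in for the on-prem data source the agent sits next to.
events: "queue.Queue[dict]" = queue.Queue()

def publish_to_cloud(point: dict) -> None:
    """Stand-in for the encrypted messaging-queue publish described in the talk."""
    print("pushed to cloud:", point)

def run_agent() -> None:
    # The agent runs continuously, like a Windows service, and acts on
    # whatever events arrive from the local data source.
    while True:
        try:
            point = events.get(timeout=5)
        except queue.Empty:
            continue  # nothing new yet; keep listening
        publish_to_cloud(point)

events.put({"tag": "oil_rate", "value": 2.3, "unit": "bbl/day"})
# run_agent()  # blocks forever; shown here as the service entry point
```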
Each agent is associated with a unique agent ID, which we manage on the cloud. To secure the communication channel between on-prem and cloud, we use a cloud-based service account, so that whenever someone installs one of these agents, that person is always using the encryption key coming from the cloud-based service account. The installer decrypts that key, and after that we have no idea about the key, because we are not storing anything. And whenever the agent communicates from on-prem to the cloud over the messaging queue, that channel is also encrypted using the same key, which makes sure that nothing is being spoofed. Each agent talks to its own messaging topic, associated with its cloud service account. That is how we handle the data ingestion.
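The talk does not name the cipher, so as a hedged illustration of a shared-key encrypted channel, here is a sketch using Fernet from the cryptography package purely as a stand-in:

```python
from cryptography.fernet import Fernet

# Stand-in for the key delivered via the cloud service account at install time;
# the actual cipher and key-distribution details are not specified in the talk.
key = Fernet.generate_key()
channel = Fernet(key)

payload = b'{"agent_id": "agent-42", "value": 2.3}'
wire_bytes = channel.encrypt(payload)   # what travels over the messaging queue
restored = channel.decrypt(wire_bytes)  # cloud side, holding the same shared key
assert restored == payload
```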
On this slide I am showing the typical life cycle of an agent, and on the right-hand side I am also showing one picture. This picture is from the patent I filed, and that patent is already approved; it shows the life cycle of the agent. Any agent can send the time series and structural data I showed earlier, and we are using the Google Protocol Buffers format for serializing and deserializing.
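The actual .proto schema is not shown in the talk, so as a stand-in here is a round trip through protobuf's generic Struct message, just to illustrate the serialize/deserialize step:

```python
from google.protobuf import struct_pb2

# A generic Struct stands in for the real message schema, which isn't shown.
point = struct_pb2.Struct()
point.update({
    "tag": "oil_production_volume",
    "value": 2.3,
    "unit": "bbl/day",   # assumed unit label
})

payload = point.SerializeToString()  # compact bytes sent over the queue

restored = struct_pb2.Struct()
restored.ParseFromString(payload)    # the cloud side restores the same fields
assert restored["value"] == 2.3
```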
19:47
for serializing and desizing we can always send the incremental incremental
19:50
always send the incremental incremental
19:50
always send the incremental incremental data on daily basis or or overly basis
19:53
data on daily basis or or overly basis
19:53
data on daily basis or or overly basis or second base or historical and this
19:55
or second base or historical and this
19:55
or second base or historical and this agent can also request some
19:57
agent can also request some
19:57
agent can also request some configuration at the startup on
19:58
configuration at the startup on
19:58
configuration at the startup on periodically or demand from cloud and it
20:01
periodically or demand from cloud and it
20:01
periodically or demand from cloud and it always use some messaging uh or
20:04
always use some messaging uh or
20:04
always use some messaging uh or communication mechanism so that it can
20:05
communication mechanism so that it can
20:05
communication mechanism so that it can scale real well it can also receive some
20:09
scale real well it can also receive some
20:09
scale real well it can also receive some commands from from cloud okay let's say
20:12
commands from from cloud okay let's say
20:12
commands from from cloud okay let's say someone wants to fetch the last 8 years
20:14
someone wants to fetch the last 8 years
20:14
someone wants to fetch the last 8 years of data then that person can basically
20:17
of data then that person can basically
20:17
of data then that person can basically use some apis hosted on on cloud and it
20:20
use some apis hosted on on cloud and it
20:20
use some apis hosted on on cloud and it can start those jobs manually and uh
20:23
can start those jobs manually and uh
20:24
can start those jobs manually and uh other challenge what we solve okay we we
20:27
other challenge what we solve okay we we
20:27
other challenge what we solve okay we we need something to monitor so we are
20:29
need something to monitor so we are
20:29
need something to monitor so we are using the same uh uh messaging pop up
20:33
using the same uh uh messaging pop up
20:33
using the same uh uh messaging pop up communication mechanism to send the
20:35
communication mechanism to send the
20:35
communication mechanism to send the logging and monitoring status as well
20:37
logging and monitoring status as well
20:37
logging and monitoring status as well those we call hardbeat and we are
20:39
those we call hardbeat and we are
20:39
those we call hardbeat and we are monitoring those hardbeats on on cloud
20:42
monitoring those hardbeats on on cloud
20:42
monitoring those hardbeats on on cloud so we basically created some dashboard
20:45
so we basically created some dashboard
20:45
so we basically created some dashboard so that we always see okay our agent is
20:47
so that we always see okay our agent is
20:47
so that we always see okay our agent is up or
20:48
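The talk doesn't show the message schema, but as a rough sketch of the idea, here is a hypothetical heartbeat being serialized and published in Scala (in the real system the payload would be a compiled Protocol Buffers message, and the bus would be Cloud Pub/Sub, Azure Service Bus, or AWS SNS/SQS):

```scala
import java.time.Instant

// Hypothetical heartbeat record; with protobuf, ScalaPB would generate
// a case class like this from a .proto definition.
final case class Heartbeat(agentId: String, sentAt: Instant, status: String)

// Stand-in for whichever cloud messaging service is configured.
trait MessageBus {
  def publish(topic: String, payload: Array[Byte]): Unit
}

object HeartbeatPublisher {
  // Publish one heartbeat; real code would emit protobuf bytes,
  // not a toString, and would run on a timer inside the agent.
  def beat(bus: MessageBus, agentId: String): Unit = {
    val hb = Heartbeat(agentId, Instant.now(), "UP")
    bus.publish("agent-heartbeats", hb.toString.getBytes("UTF-8"))
  }
}
```

A dashboard on the cloud side then only has to alert when a topic stops receiving heartbeats for a given agent.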
20:48
Now moving on to the agent life cycle. The typical life cycle starts when someone registers the agent on the cloud using a set of APIs. When the agent is registered on the cloud, we associate it with a cloud service account; that is how we maintain the authentication and authorization part. That cloud service account is also associated with some cloud pub/sub mechanism. When I say cloud pub/sub mechanism: in Google Cloud it's Cloud Pub/Sub, in Azure it's Azure Service Bus, and in AWS it's SNS or SQS. Then comes agent installation: after registering the agent, I want to install it on the on-prem machine so that it is always attached to that particular data source. It uses the same cloud service account key, and after completing the installation, if it is a Windows server, it encrypts that key using the Windows Data Protection API. As soon as the agent starts, it begins streaming data to the cloud. The ingestion part is basically sending the oil production data from these oil fields, and the data is pushed using the same messaging pub/sub mechanism. We can also send commands from the cloud to on-prem if we want to fetch any historical data.
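As a minimal sketch of that startup sequence (all names here are hypothetical; the talk doesn't show the agent's code):

```scala
// Hypothetical agent bootstrap: decrypt the stored service-account key
// (protected by DPAPI on Windows in the described system), authenticate,
// then start streaming production data.
trait KeyStore  { def decryptKey(): String }   // e.g. DPAPI-backed on Windows
trait CloudAuth { def login(key: String): Unit }
trait Uplink    { def send(record: String): Unit }

final class Agent(keys: KeyStore, auth: CloudAuth, uplink: Uplink) {
  def start(records: Iterator[String]): Unit = {
    auth.login(keys.decryptKey()) // service-account authentication
    records.foreach(uplink.send)  // push data through the pub/sub uplink
  }
}
```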
want to fetch any historical data so one important point point was
22:12
data so one important point point was
22:12
data so one important point point was let's say if there is a some disaster
22:15
let's say if there is a some disaster
22:15
let's say if there is a some disaster happens on cloud okay so what is the
22:17
happens on cloud okay so what is the
22:17
happens on cloud okay so what is the best way to reduce the downtime because
22:20
best way to reduce the downtime because
22:20
best way to reduce the downtime because I don't want our customer to wait so in
22:23
I don't want our customer to wait so in
22:23
I don't want our customer to wait so in disaster scenario the strategy we
22:25
disaster scenario the strategy we
22:25
disaster scenario the strategy we followed we created a global uh Cloud
22:29
followed we created a global uh Cloud
22:29
followed we created a global uh Cloud resource which is available in multis
22:31
resource which is available in multis
22:31
resource which is available in multis zone so whenever uh and also because we
22:34
zone so whenever uh and also because we
22:34
zone so whenever uh and also because we were using the cloud P pops up so by
22:37
were using the cloud P pops up so by
22:37
were using the cloud P pops up so by default Cloud pops up has the message
22:40
default Cloud pops up has the message
22:40
default Cloud pops up has the message storage for more than 7 days so is so
22:44
storage for more than 7 days so is so
22:44
storage for more than 7 days so is so even if some service is not available
22:47
even if some service is not available
22:47
even if some service is not available some disaster happen on cloud side so
22:50
some disaster happen on cloud side so
22:50
some disaster happen on cloud side so that message is still present in that
22:52
that message is still present in that
22:52
that message is still present in that que for at least 7 days and by creating
22:55
que for at least 7 days and by creating
22:55
que for at least 7 days and by creating a global resource that project is Global
22:59
a global resource that project is Global
22:59
a global resource that project is Global so it is available in the multiple Cloud
23:01
so it is available in the multiple Cloud
23:01
so it is available in the multiple Cloud zone so we can easily recover that
23:04
zone so we can easily recover that
23:04
zone so we can easily recover that disaster s scenarios and we can replay
23:08
disaster s scenarios and we can replay
23:08
disaster s scenarios and we can replay all those messages which are in the
23:10
all those messages which are in the
23:10
all those messages which are in the queue for last 7even days so it
23:12
queue for last 7even days so it
23:12
queue for last 7even days so it basically help us to achieve the
23:15
basically help us to achieve the
23:15
basically help us to achieve the disaster mechanism for solving the
23:17
disaster mechanism for solving the
23:17
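On Google Cloud the replay itself can be done with Pub/Sub's subscription seek, which rewinds a subscription to a timestamp inside the retention window. A hedged sketch of that recovery step (the `Subscription` trait here is an assumption, not the real client API):

```scala
import java.time.Instant

// Stand-in for the messaging client; on GCP the underlying call is a
// Pub/Sub subscription seek to a timestamp.
trait Subscription {
  def seekTo(timestamp: Instant): Unit
}

object DisasterRecovery {
  // Once services are healthy again, rewind to just before the outage
  // so every retained message is redelivered and re-processed.
  def replayFrom(sub: Subscription, outageStart: Instant): Unit =
    sub.seekTo(outageStart)
}
```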
disaster mechanism for solving the injection Pro problem now I want to
23:20
injection Pro problem now I want to
23:20
injection Pro problem now I want to store okay now data is arrived I I want
23:23
store okay now data is arrived I I want
23:23
store okay now data is arrived I I want to store that data on on cloud so the
23:26
to store that data on on cloud so the
23:26
to store that data on on cloud so the main challenge in any oil production
23:27
main challenge in any oil production
23:27
main challenge in any oil production operation is okay everyone start using
23:30
operation is okay everyone start using
23:30
operation is okay everyone start using their own terminology own data model so
23:33
their own terminology own data model so
23:33
their own terminology own data model so so someone has someone wants to call
23:36
so someone has someone wants to call
23:36
so someone has someone wants to call something like area Field Station or Val
23:39
something like area Field Station or Val
23:39
something like area Field Station or Val other person wants to call it some other
23:42
other person wants to call it some other
23:42
other person wants to call it some other tank and
23:43
tank and compressor so there are several domain
23:45
compressor so there are several domain
23:45
compressor so there are several domain concept like field asset surface
23:47
concept like field asset surface
23:47
concept like field asset surface subsurface equipment Val B holes and
23:50
subsurface equipment Val B holes and
23:50
subsurface equipment Val B holes and completion our goal is to present every
23:53
completion our goal is to present every
23:53
completion our goal is to present every single thing by using these three terms
23:55
single thing by using these three terms
23:56
single thing by using these three terms only one is called entities second is
23:58
only one is called entities second is
23:58
only one is called entities second is called properties and third one is
24:00
called properties and third one is
24:00
called properties and third one is called the relationships so entity is
24:01
called the relationships so entity is
24:01
called the relationships so entity is like anything it could be well or
24:03
like anything it could be well or
24:03
like anything it could be well or completer or any tool or compressor a
24:06
completer or any tool or compressor a
24:06
completer or any tool or compressor a properties could be any data coming from
24:10
properties could be any data coming from
24:10
properties could be any data coming from that particular entity it could be
24:11
that particular entity it could be
24:11
that particular entity it could be pressure temperature or oil flow rate
24:14
pressure temperature or oil flow rate
24:14
pressure temperature or oil flow rate and relationships is like how these
24:16
and relationships is like how these
24:16
and relationships is like how these hierarchies are attached to each other
24:18
hierarchies are attached to each other
24:18
hierarchies are attached to each other so we basically created a canonical
24:20
so we basically created a canonical
24:20
so we basically created a canonical domain model so that we can represent
24:23
domain model so that we can represent
24:23
domain model so that we can represent every single thing by using a common
24:25
every single thing by using a common
24:25
every single thing by using a common terms it will solve the pro problem what
24:28
terms it will solve the pro problem what
24:28
terms it will solve the pro problem what we earlier when every single application
24:30
we earlier when every single application
24:30
we earlier when every single application was using their own data model and we
24:33
was using their own data model and we
24:33
was using their own data model and we are always solving this problem again
24:35
are always solving this problem again
24:35
are always solving this problem again and
24:37
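A minimal sketch of that three-term model, with illustrative names rather than the production schema:

```scala
// Canonical domain model: everything is an entity, a property, or a
// relationship. Names and fields here are illustrative.
final case class Entity(id: String, kind: String) // "well", "compressor", ...
final case class Property(entityId: String, name: String,
                          value: Double, unit: String) // "pressure" in "psi", ...
final case class Relationship(parentId: String, childId: String) // hierarchy edge

object CanonicalModelExample {
  // Example: a well under a field, reporting an oil flow rate.
  val field = Entity("field-1", "field")
  val well  = Entity("well-42", "well")
  val edge  = Relationship(field.id, well.id)
  val rate  = Property(well.id, "oil_flow_rate", 1250.0, "bbl/d")
}
```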
24:37
Another use case, just to explain why we need a canonical data model: let's say in some field there are two sensors installed, and both are streaming frequency. One is coming from an IoT source and the second one is coming from some database. Let's say one person names the IoT tag something like esp.frequency, in Hertz, and the second person names it ESP_frequency, also in Hertz. But for our production domain model it is just ESP frequency. We don't care how the originating data source wants to name it or tag it; whenever data comes into our data storage, we apply our own domain model concepts. That basically helped us resolve the different terminologies and data standards, because now we are all talking the same language, and we are not confused when other people give the same measurement some other tag or name.
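A sketch of that normalization step, with the two tag spellings from the example mapped onto one canonical property name (the mapping table is hypothetical):

```scala
// Map source-specific tag names onto one canonical property name
// before anything is written to storage.
object TagNormalizer {
  private val canonical = Map(
    "esp.frequency" -> "esp_frequency", // IoT source's spelling
    "ESP_frequency" -> "esp_frequency"  // database source's spelling
  )
  def normalize(sourceTag: String): String =
    canonical.getOrElse(sourceTag, sourceTag.toLowerCase)
}
```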
giving some other some other tag or name so other challenge was okay we were
25:43
name so other challenge was okay we were
25:43
name so other challenge was okay we were we wanted to store this
25:46
we wanted to store this
25:46
we wanted to store this huge amounts of Time s data instruction
25:49
huge amounts of Time s data instruction
25:49
huge amounts of Time s data instruction data and our main requirement was we
25:52
data and our main requirement was we
25:52
data and our main requirement was we always want to keep the history of data
25:54
always want to keep the history of data
25:54
always want to keep the history of data because let's say if I'm running a
25:56
because let's say if I'm running a
25:56
because let's say if I'm running a forecasting operation or I'm running
25:58
forecasting operation or I'm running
25:58
forecasting operation or I'm running some uh other recommendation in your
26:01
some uh other recommendation in your
26:01
some uh other recommendation in your workflow I wanted to see the history of
26:03
workflow I wanted to see the history of
26:03
workflow I wanted to see the history of data always I don't want to delete
26:06
data always I don't want to delete
26:06
data always I don't want to delete anything I always want to upend
26:08
anything I always want to upend
26:08
anything I always want to upend everything so we solve this problem
26:10
everything so we solve this problem
26:10
everything so we solve this problem using Boral storage so Boral so B
26:13
using Boral storage so Boral so B
26:13
using Boral storage so Boral so B temporality is a concept when we assign
26:16
temporality is a concept when we assign
26:16
temporality is a concept when we assign at least two times 10 one is like a
26:18
at least two times 10 one is like a
26:18
at least two times 10 one is like a valid time second one is called the
26:20
valid time second one is called the
26:20
valid time second one is called the transaction time so valid time is like
26:22
transaction time so valid time is like
26:22
transaction time so valid time is like the actual time when that physical
26:24
the actual time when that physical
26:24
the actual time when that physical measurement happen at the source like
26:27
measurement happen at the source like
26:27
measurement happen at the source like when we make the pressure at source and
26:30
when we make the pressure at source and
26:30
when we make the pressure at source and proection time is when I basically
26:33
proection time is when I basically
26:33
proection time is when I basically storing that data into our Cloud stories
26:37
storing that data into our Cloud stories
26:37
storing that data into our Cloud stories so it basically helped us by storing
26:39
so it basically helped us by storing
26:39
so it basically helped us by storing these two time stamp it help us to uh
26:42
these two time stamp it help us to uh
26:43
these two time stamp it help us to uh run the temporal queries it is very
26:45
run the temporal queries it is very
26:45
run the temporal queries it is very useful for running any kind of
26:47
useful for running any kind of
26:47
useful for running any kind of historical analysis it also help help us
26:51
historical analysis it also help help us
26:51
historical analysis it also help help us to achieve the temporal Trends and
26:53
to achieve the temporal Trends and
26:54
to achieve the temporal Trends and running the long running immutable Cal
26:57
running the long running immutable Cal
26:57
running the long running immutable Cal calculation
26:59
calculation and uh we can Al we can always run the
27:03
and uh we can Al we can always run the
27:03
and uh we can Al we can always run the different data points between different
27:05
different data points between different
27:05
different data points between different data versions so that we
27:08
data versions so that we
27:08
data versions so that we can see the complete history of the data
27:11
can see the complete history of the data
27:11
can see the complete history of the data and it was really important for our
27:13
and it was really important for our
27:13
and it was really important for our historical work workflows so on right
27:15
historical work workflows so on right
27:15
historical work workflows so on right hand side as you can see I'm just
27:17
hand side as you can see I'm just
27:17
hand side as you can see I'm just showing you you the examples on the top
27:20
showing you you the examples on the top
27:20
showing you you the examples on the top it is showing the time series storage
27:21
it is showing the time series storage
27:21
it is showing the time series storage where I'm assigning two time stamp one
27:25
where I'm assigning two time stamp one
27:25
where I'm assigning two time stamp one is on the row level other one is at
27:27
is on the row level other one is at
27:27
is on the row level other one is at column LEL
27:28
column LEL and Below we I'm showing the structural
27:31
and Below we I'm showing the structural
27:31
and Below we I'm showing the structural data where the two times St is basically
27:34
data where the two times St is basically
27:34
data where the two times St is basically installed so on the first time St I'm
27:37
installed so on the first time St I'm
27:37
installed so on the first time St I'm showing that okay well2 was not attached
27:39
showing that okay well2 was not attached
27:39
showing that okay well2 was not attached to battery one but after some time on
27:42
to battery one but after some time on
27:42
to battery one but after some time on 13th October after one one day the well
27:45
13th October after one one day the well
27:45
13th October after one one day the well is attached to the battery well so I
27:47
is attached to the battery well so I
27:47
is attached to the battery well so I always wanted to store the history of
27:50
always wanted to store the history of
27:50
always wanted to store the history of data and how we achieved this we
27:52
data and how we achieved this we
27:52
data and how we achieved this we basically built a by temporary storage
27:55
basically built a by temporary storage
27:55
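To make the two timestamps concrete, here is a small sketch of a bitemporal fact and an "as of" lookup, assuming the valid-time/transaction-time split described above (the types are illustrative):

```scala
import java.time.Instant

// A bitemporal fact: validTime = when it was true at the source,
// txTime = when we recorded it in cloud storage. Append-only.
final case class Fact(key: String, value: Double,
                      validTime: Instant, txTime: Instant)

object BitemporalQuery {
  // "What did we believe at `asOf` about the value in effect at `valid`?"
  def lookup(facts: Seq[Fact], key: String,
             valid: Instant, asOf: Instant): Option[Fact] =
    facts.filter(f => f.key == key &&
                      !f.validTime.isAfter(valid) &&
                      !f.txTime.isAfter(asOf))
         .sortBy(f => (f.validTime, f.txTime))
         .lastOption
}
```

Because nothing is ever deleted, running the same query with an earlier `asOf` reproduces exactly what an engineer would have seen back then.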
basically built a by temporary storage and just to give you one example okay
27:58
and just to give you one example okay
27:58
and just to give you one example okay uh so we are basically uh we started
28:02
uh so we are basically uh we started
28:02
uh so we are basically uh we started with a cloud native version of it but
28:05
with a cloud native version of it but
28:05
with a cloud native version of it but slowly we move towards a cloud agnostic
28:08
slowly we move towards a cloud agnostic
28:08
slowly we move towards a cloud agnostic one so our first version was created
28:11
one so our first version was created
28:11
one so our first version was created using big table as a storage so on the
28:13
using big table as a storage so on the
28:13
using big table as a storage so on the top you are basically seeing the actual
28:15
top you are basically seeing the actual
28:15
top you are basically seeing the actual big table schema and bottom for the
28:17
big table schema and bottom for the
28:17
big table schema and bottom for the structural storage we use another
28:20
structural storage we use another
28:20
structural storage we use another database that's called datomic so
28:22
database that's called datomic so
28:22
database that's called datomic so datomic is basically by default provides
28:26
datomic is basically by default provides
28:26
datomic is basically by default provides a feature to run a temporal queries
28:28
a feature to run a temporal queries
28:29
a feature to run a temporal queries because it basically stores the version
28:30
because it basically stores the version
28:30
because it basically stores the version time and the trans and the transaction
28:35
time and the trans and the transaction
28:35
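The slide's actual Bigtable schema isn't reproduced here, but a common row-key pattern for this kind of time-series storage looks like the following sketch (the key layout is an assumption, not the schema from the slide):

```scala
// Illustrative Bigtable-style row key: entity id, property name, and a
// reversed timestamp so the newest points sort first in a scan.
object RowKey {
  def forPoint(entityId: String, property: String, epochMillis: Long): String = {
    val reversed = Long.MaxValue - epochMillis // newest-first ordering
    s"$entityId#$property#$reversed"
  }
}

// e.g. RowKey.forPoint("well-42", "esp_frequency", 1697155200000L)
```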
time and the trans and the transaction time so consumption worklow okay now our
28:38
time so consumption worklow okay now our
28:38
time so consumption worklow okay now our data is stored we have everything now we
28:40
data is stored we have everything now we
28:40
data is stored we have everything now we want to run a robust model so that we
28:43
want to run a robust model so that we
28:43
want to run a robust model so that we can uh uh create our consumption so our
28:45
can uh uh create our consumption so our
28:45
can uh uh create our consumption so our consumption workflows are basically
28:47
consumption workflows are basically
28:47
consumption workflows are basically interested in looking for the entities
28:49
interested in looking for the entities
28:49
interested in looking for the entities associate properties what I explained
28:51
associate properties what I explained
28:51
associate properties what I explained earlier using the same canical model I
28:54
earlier using the same canical model I
28:54
earlier using the same canical model I want to Traverse that Boral graph
28:58
want to Traverse that Boral graph
28:58
want to Traverse that Boral graph so that I can know okay if even I need
29:01
so that I can know okay if even I need
29:01
so that I can know okay if even I need to go multiple lbel down and find okay
29:05
to go multiple lbel down and find okay
29:05
to go multiple lbel down and find okay my pressure is coming from some flow
29:07
my pressure is coming from some flow
29:07
my pressure is coming from some flow Point location under certain Val so I
29:09
Point location under certain Val so I
29:09
Point location under certain Val so I can do that traversing as well I'm also
29:12
can do that traversing as well I'm also
29:12
can do that traversing as well I'm also interested in doing the lot of
29:15
interested in doing the lot of
29:15
interested in doing the lot of calculation on the time space data which
29:17
calculation on the time space data which
29:17
calculation on the time space data which is basically aggregation consuming and
29:19
is basically aggregation consuming and
29:19
is basically aggregation consuming and write back I want to apply the data
29:21
write back I want to apply the data
29:21
write back I want to apply the data quality attribute and all this data is
29:24
quality attribute and all this data is
29:24
quality attribute and all this data is basically feed into some calculation
29:26
basically feed into some calculation
29:26
basically feed into some calculation engine so that this calculation engine
29:28
engine so that this calculation engine
29:28
engine so that this calculation engine is basically running in the background
29:30
is basically running in the background
29:31
is basically running in the background and giving some recommendation and
29:32
and giving some recommendation and
29:32
and giving some recommendation and insight to production engineer okay and
29:34
insight to production engineer okay and
29:34
insight to production engineer okay and helping them to understand why their oil
29:37
helping them to understand why their oil
29:37
helping them to understand why their oil production is low or why their
29:38
production is low or why their
29:38
production is low or why their forecasting is is not is not matching so
29:42
forecasting is is not is not matching so
29:42
forecasting is is not is not matching so all this data is basically helping to
29:44
all this data is basically helping to
29:44
all this data is basically helping to run this consumption and work workflow I
29:46
run this consumption and work workflow I
29:46
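The multi-level traversal can be pictured with a small sketch over the illustrative relationship edges from earlier (again an assumption, not the production query engine):

```scala
// Walk the entity hierarchy any number of levels down by following
// parent -> children edges built from Relationship records.
object HierarchyTraversal {
  def descendants(children: Map[String, Seq[String]], root: String): Seq[String] =
    children.getOrElse(root, Seq.empty)
      .flatMap(child => child +: descendants(children, child))
}

// e.g. descendants(Map("field-1" -> Seq("well-42"),
//                      "well-42" -> Seq("flow-point-7")), "field-1")
// == Seq("well-42", "flow-point-7")
```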
run this consumption and work workflow I would like to give uh uh some example
29:50
would like to give uh uh some example
29:50
would like to give uh uh some example like I'm showing here some Advanced
29:52
like I'm showing here some Advanced
29:52
like I'm showing here some Advanced calculation like validity and S
29:55
calculation like validity and S
29:55
calculation like validity and S selection so validity means now we
29:58
selection so validity means now we
29:58
selection so validity means now we stored all data all time series data I
30:01
stored all data all time series data I
30:01
stored all data all time series data I want to see uh if there is a gap in my
30:05
want to see uh if there is a gap in my
30:05
want to see uh if there is a gap in my data so how much how much tolerance of
30:09
data so how much how much tolerance of
30:09
data so how much how much tolerance of this data Gap I can I can have in my
30:12
this data Gap I can I can have in my
30:12
this data Gap I can I can have in my work workflows so this calculation is
30:15
work workflows so this calculation is
30:15
work workflows so this calculation is basically let's say when I'm feeding
30:17
basically let's say when I'm feeding
30:17
basically let's say when I'm feeding this data to that calculation engine and
30:19
this data to that calculation engine and
30:19
this data to that calculation engine and that calculation engine sees some Gap so
30:22
that calculation engine sees some Gap so
30:22
that calculation engine sees some Gap so it basically a back fill to some status
30:26
it basically a back fill to some status
30:26
it basically a back fill to some status code so that engineer understand okay I
30:29
code so that engineer understand okay I
30:29
code so that engineer understand okay I have some bad status code or some
30:31
have some bad status code or some
30:31
have some bad status code or some unavailable status code so I mean it
30:34
unavailable status code so I mean it
30:34
unavailable status code so I mean it sound simple but it's a difficult
30:36
sound simple but it's a difficult
30:36
sound simple but it's a difficult problem to solve because just imagine I
30:39
problem to solve because just imagine I
30:39
problem to solve because just imagine I someone is seeing last 8 years of second
30:41
someone is seeing last 8 years of second
30:41
someone is seeing last 8 years of second based frequency data so I always need to
30:44
based frequency data so I always need to
30:44
based frequency data so I always need to find the last known good data point so
30:48
find the last known good data point so
30:48
find the last known good data point so that I know okay until that point I need
30:51
that I know okay until that point I need
30:51
that I know okay until that point I need to back fill so this data point could be
30:54
to back fill so this data point could be
30:54
to back fill so this data point could be maybe 1 month ago or could be 25 years
30:57
maybe 1 month ago or could be 25 years
30:57
maybe 1 month ago or could be 25 years ago as well also I want to know okay I
31:00
ago as well also I want to know okay I
31:00
ago as well also I want to know okay I want to merge the different data streams
31:03
want to merge the different data streams
31:03
want to merge the different data streams from different data source for
31:06
from different data source for
31:06
from different data source for example any single oil field if there is
31:10
example any single oil field if there is
31:10
example any single oil field if there is some uh uh Edge device installed and
31:14
some uh uh Edge device installed and
31:14
some uh uh Edge device installed and someone is also streaming data from
31:16
someone is also streaming data from
31:16
someone is also streaming data from Mobile by capturing some image or
31:19
Mobile by capturing some image or
31:19
Mobile by capturing some image or anything so but it is possible that both
31:22
anything so but it is possible that both
31:22
anything so but it is possible that both are sending the same kind of a stream so
31:25
are sending the same kind of a stream so
31:25
are sending the same kind of a stream so I need some way to know okay these are
31:28
I need some way to know okay these are
31:28
I need some way to know okay these are same kind of data so I need to D
31:30
same kind of data so I need to D
31:30
same kind of data so I need to D duplicate it and I want to merge it to
31:32
duplicate it and I want to merge it to
31:32
duplicate it and I want to merge it to the sing sing Single stream
31:35
the sing sing Single stream
31:35
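A compact sketch of the gap-fill idea (status codes and types are illustrative; the real engine works over the bitemporal store at scale):

```scala
// Pad gaps in a second-frequency series with UNAVAILABLE markers,
// between each point and the next known point.
final case class Point(epochSec: Long, value: Option[Double], status: String)

object GapFill {
  def fill(points: Seq[Point], stepSec: Long = 1L): Seq[Point] = {
    val sorted = points.sortBy(_.epochSec)
    if (sorted.isEmpty) Seq.empty
    else sorted.zip(sorted.tail).flatMap { case (a, b) =>
      val pad = ((a.epochSec + stepSec) until b.epochSec by stepSec)
        .map(t => Point(t, None, "UNAVAILABLE")) // backfilled status code
      a +: pad
    } :+ sorted.last
  }
}
```

Deduplicating the two equivalent streams then reduces to keying points by (canonical tag, valid time) and keeping one point per key.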
the sing sing Single stream other calculations are like okay uh
31:40
other calculations are like okay uh
31:40
other calculations are like okay uh there could be a monteo simulation there
31:43
there could be a monteo simulation there
31:43
there could be a monteo simulation there could be a joining of Time series it
31:46
could be a joining of Time series it
31:46
could be a joining of Time series it could be more than two time time series
31:48
could be more than two time time series
31:48
could be more than two time time series so overall we created around, 1500 time
31:52
so overall we created around, 1500 time
31:52
so overall we created around, 1500 time series calculation what we are feeding
31:54
series calculation what we are feeding
31:55
series calculation what we are feeding it to generate these insights
31:59
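For instance, a time-series join here just means aligning two or more series on their timestamps; a toy sketch:

```scala
// Inner-join two series on timestamp, pairing values that share a time.
object SeriesJoin {
  def join(a: Map[Long, Double], b: Map[Long, Double]): Map[Long, (Double, Double)] =
    a.keySet.intersect(b.keySet).map(t => t -> (a(t), b(t))).toMap
}

// e.g. join(Map(1L -> 10.0, 2L -> 11.0), Map(2L -> 0.5))
// == Map(2L -> (11.0, 0.5))
```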
32:00
And yeah, I would like to give some interesting numbers. This application is deployed around the globe, mostly for oil and gas companies in South America, Southeast Asia, and the Middle East. It is managing more than 20,000 oil wells, and we have 1,500-plus time-series calculations. I would like to give you some numbers from one client: that client has 7,500 oil wells, and the typical hierarchy started from the company, through some levels, then field, then well. The number of entities is around 30,000, total relationships around 60,000, and properties around 1.5 billion. This client has had data streaming for 25 years, and for the typical calculations we ingested around 14 billion data points. And these are the other numbers, the achievements: we were able to achieve 98% time savings, 88% cost savings, and an 80% reduction in data preparation time, and we were able to improve well uptime as well. As for the typical tech stack: this is purely a microservices-based architecture. We have 40-plus microservices running over two Kubernetes clusters. Ninety percent of the services are written in Scala, a few services are written in Go and Python, and we use Akka a lot; Akka is an actor-based framework.
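Just to illustrate the actor style (this is generic Akka classic usage, not code from the system):

```scala
import akka.actor.{Actor, ActorSystem, Props}

// A tiny actor that reacts to heartbeat messages; each agent or well
// could be represented by one lightweight actor like this.
class WellMonitor extends Actor {
  def receive: Receive = {
    case wellId: String => println(s"heartbeat received from $wellId")
  }
}

object ActorDemo extends App {
  val system  = ActorSystem("demo")
  val monitor = system.actorOf(Props[WellMonitor](), "monitor")
  monitor ! "well-42" // the message is processed asynchronously by the actor
}
```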
framework and uh yeah everything is because we started with the cloud native
33:36
because we started with the cloud native
33:36
because we started with the cloud native but still our future goal was Cloud
33:38
but still our future goal was Cloud
33:38
but still our future goal was Cloud agnostic so we by default started using
33:40
agnostic so we by default started using
33:40
agnostic so we by default started using cuties so we are using cuties since 2015
33:44
cuties so we are using cuties since 2015
33:44
cuties so we are using cuties since 2015 I mean it's very early days and for
33:46
I mean it's very early days and for
33:46
I mean it's very early days and for storage we are using red is post datomic
33:49
storage we are using red is post datomic
33:49
storage we are using red is post datomic and big table so yeah I think so that's
33:52
and big table so yeah I think so that's
33:52
and big table so yeah I think so that's all I have and I'm open for any question