Nowadays, factories around the world are going through a comprehensive digital transformation to become increasingly data-driven companies and significantly improve the performance and quality of their production and manufacturing processes. In this session, we will present a Power Platform-based solution in detail to show how a well-known no-code/low-code solution package can support these companies in their digital transformation journey. We will create a real-time production monitoring Power BI report using Power Automate, DirectQuery, and the Automatic Page Refresh feature to show the current status and the latest trends on the production floor. During the session, we will also present a Power Apps solution to modify the planned volumes based on the actual production figures. Moreover, the app-generated data will also be integrated into this Power BI report.
About the Speaker:
Ph.D. in Computer Science, 20 years of experience in software development, 10+ years of experience in the Microsoft BI stack, day-by-day usage of Power BI since its very first release, teacher of numerous Power BI trainings in Hungarian and English, co-organizer of the Hungarian Power BI Meetup. Last year I was a speaker at the Microsoft Tech Summit 2019 conference in Budapest and also at the Power Platform World Tour in Vienna. Recently, I presented at the Budapest Artificial Intelligence Meetup, Community Summit Europe 2020, Lightup Virtual Conference, Power BI Days Munich, DataWeekender, Power BI Break, Dublin Power BI User Group, and Data Platform Summit 2020.
Conference Website: https://www.2020twenty.net/power-platform-virtual-conference/
C# Corner - Community of Software and Data Developers
https://www.c-sharpcorner.com
#conference #powerplatform #powerbi #csharpcorner
0:00
So, in the next 40-45 minutes I will speak about how today's factories, production and
0:10
manufacturing factories, could use a no-code/low-code solution like the Power Platform,
0:15
which I believe is a really great no-code/low-code solution, and how they
0:21
could use it for creating production floor monitoring, especially for some kinds of IoT solutions. So first of all, after the nice
0:34
introduction from Simon about who I am: I'm speaking from Budapest, the capital city of Hungary.
0:42
I'm working for a small consulting company called Viscode. I've worked in many, many areas of software
0:50
development in the last 20 years, but mostly focusing on BI, data warehouse, data lake, and similar
0:57
topics in the last 10 years. From the very first days I have been working with Power BI, and now, as Simon
1:04
also mentioned, I'm a public speaker on Power BI, Power Platform, and Synapse topics. And just to
1:13
follow up on Simon's introduction, thank you, I really like these kinds of sessions, because every time I can show
1:21
something which is not typical for day-by-day company projects, rather focusing on some of
1:28
my hobbies or pet projects. And this whole session is also coming from one of my pet projects
1:34
at home, and I would like to share my experiences from this pet project, because I believe it would
1:40
be useful not only at home but also for companies. So nowadays there are a lot of terms
1:48
which are really important in this industry: the digital transformation of companies, the smart
1:55
factory, smart manufacturing, Industry 4.0 (in Europe it is a really important phrase), or just
2:03
IoT and industrial IoT. So a lot of words, a lot of concepts, terms, buzzwords, but I believe
2:11
they are really important, because nowadays our manufacturing and producing companies
2:19
and factories have a lot of information, because they don't just have some machines and
2:25
different devices in the factories; rather, we have a lot of sensors and special measurement
2:34
devices, which are sending the data to some kind of local environment,
2:42
and then from the local environment they are sending the information to the cloud.
2:47
And then in the cloud, we could process all this data using machine learning algorithms:
2:54
we could find anomalies, we could predict whether predictive maintenance should be executed or not,
3:01
etc. A lot of things can be done nowadays using the cloud, using all these data technologies.
3:09
And it's really cool if you just think about the automotive industry: the biggest automotive
3:17
manufacturers all have a lot of robots, devices, and sensors, and they would like to solve a lot of
3:24
issues within the company and within the factory, both at the edge (edge computing) and in the cloud, which
3:30
are the typical IoT scenarios. My idea for this whole topic is coming from last Christmas,
3:41
because at Christmas, I got a gift from my wife and my son.
3:46
It was a Raspberry Pi. And with the Raspberry Pi, I just started a pet project
3:52
in which I would like to create some kind of device simulation,
3:57
with humidity, temperature, and maybe something else. I could simulate somehow that there are some machines in my company,
4:08
in my factory, and I would like to push this information to the cloud and do some data analysis
4:16
and some other machine learning algorithms, just to play with these technologies. And one of
4:23
my first ideas was: maybe I can use the whole Power Platform, so Power BI, Power
4:31
Apps, Power Automate, maybe Power Virtual Agents, for such a situation, or is it
4:37
not the best technology for it? And my answer is that I believe for smaller IoT projects,
4:46
yes, the Power Platform is really good. And then I just started to think that maybe not for big automotive industry companies
4:56
like Audi, Mercedes, et cetera, but for smaller producing and smaller manufacturing companies who have just some CNC machines,
5:05
or one or two machines which produce items day by day: they would buy some devices and sensors
5:13
and create some smaller IoT projects. But until now, they haven't started to invest
5:20
in such IoT projects, because it is expensive. But using the Power Platform, I believe they could solve it,
5:30
because it is a low-code solution, so maybe one of the guys in the factory could learn it and start to do some smaller IoT projects.
5:40
So that was my question: could the Power Platform be used for such scenarios or not?
5:45
And I believe yes, we could use it. So in the next 30 minutes, I will show some examples of how it could be used for such scenarios.
5:57
So we will simulate different devices. There will be three different machines which will send event messages and notifications to the
6:07
Azure IoT Hub, which will ingest this data coming from the different devices in the factory.
6:15
And then there will be two different approaches, because in IoT scenarios there are paths
6:22
where we will check what kind of data arrived, process it, and if it is needed,
6:28
we will take some actions, because all these IoT scenarios are needed to have better
6:36
manufacturing or production in our factory. So there will be one approach, which is called the
6:44
hot path. Hot path means that we would like to process the data as soon as possible, and we would
6:52
like to get insight from this processed data. For example, if one of the messages from one of the
7:00
machines shows that the temperature of the machine is really high, I should create an alert and send
7:08
an alert to one of the responsible people, who could go to the machine and do
7:13
what's needed to stop the machine or decrease its temperature,
7:18
in order to avoid any kind of breakdown of the machine and then a big loss for
7:23
the company. So one approach for us today is the hot path; we will use Power Automate for it. And the other path, which in this IoT scenario is called the cold path,
7:37
is typically that we would like to process the data as fast as possible,
7:43
but it is not in scope that it should be as soon as possible,
7:49
because the goal with this cold path is that we would like to store the data.
7:56
In this session, we will store the data in an Azure SQL database,
8:00
but maybe in another, bigger project, we could use a Synapse dedicated SQL pool,
8:05
or just store the data at first in a data lake, and later upload it to a curated layer
8:11
in a SQL database or a dedicated SQL pool, or maybe it will be processed and brought into a Cosmos DB.
8:21
Whatever the case, the cold path means that we store the data somewhere, and later we could reuse and analyze this information.
8:31
Maybe we will execute a complex machine learning algorithm to understand what kinds of patterns are in this data.
8:39
Today, we will just create a Power BI report, but not just a typical Power BI report where
8:46
we analyze the data over time and say how many items were produced,
8:51
what was the average temperature of the machines, or something like this.
8:55
I would like to show something special. And the special thing is that if you go to a factory,
9:00
a manufacturing or producing factory, you always see big LCD screens.
9:07
And on the screens, there are numbers, KPIs: how many items we have produced,
9:12
what is the speed of the machine, what is the temperature around the machine,
9:17
what is the humidity around the machine. So the several most important KPIs are available
9:23
on these LCD panels. And I thought that maybe I could reproduce
9:27
this kind of dashboard in Power BI. So we will show how to create automatically
9:33
refreshing Power BI reports. We will show this kind of information. And last but not least, we will create a Power Apps application to do some data manipulation.
9:47
What kind of data manipulation could you imagine in this scenario? Typically, if you have ever been in a factory, you know that there are a lot of machines and people,
9:58
so something happens all the time. You could imagine that one machine was planned for today,
10:05
for example, to produce 500 items. Maybe we have to stop this machine because of overheating or some other problem that happened.
10:14
And now, after one hour, we could use this machine again. And at the end of the day, realistically, we could produce 400 items on this machine.
10:25
So typically, somebody who is responsible for the production floor, a team lead for example, during the day, maybe hour by hour, just refines the actual plan for today, and
10:40
so he is refining the planned volumes for the day. For such a scenario, we will implement a small data
10:49
manipulation application using Power Apps. So let's start with the first and very important thing:
10:57
how to create such a situation, how to create such reports, or use the Power Platform to be able to
11:06
check whether I could be a citizen IoT developer at home, or a citizen IoT developer in a small
11:13
factory, where I would like to grab the data from the different devices and sensors, create some
11:20
nice reports, and do the hot path to send alerts to the responsible people. How can I start such a
11:28
situation and such a project? The first option, which I also did last Christmas, is that I've got a Raspberry
11:37
Pi, like here, with keyboard, mouse, a Raspberry Pi OS, etc., or maybe I just got an Arduino or whatever
11:48
sensors and other technologies, and start to type Python or some C++ programming language,
11:57
learn what the MQTT protocol is, how to connect to a cloud. So a lot of things. And I believe it is
12:03
not a good way to start an IoT project, especially for a citizen developer, but also for a professional
12:10
developer, because there are too many things you have to learn to start such a project.
12:15
So then I just thought that maybe we should use a great solution from Microsoft, the Raspberry Pi Azure IoT Simulator.
12:27
Here you see the URL of this Raspberry Pi Azure IoT Simulator.
12:32
It is a web-based simulator. You could do whatever you would like to simulate with the Raspberry Pi.
12:39
You could write Python code, and the data could be uploaded to Azure IoT
12:44
and then processed there, and you could do all the stuff. It's a much better way to start,
12:51
but I have to say that it needs some Python development. So maybe it is also not the best way to start,
13:00
but it's much better, I believe. And finally, I realized that Microsoft
13:05
has another great open-source solution, absolutely free. It is called Azure IoT Device Simulation.
13:11
Here is the URL. You could imagine that you go to this GitHub page:
13:18
there are step-by-step, very detailed instructions on how to deploy this solution to your machine
13:25
and to your Azure subscription. And after 10-15 minutes, you are
13:30
able to create a nice environment: there will be a web page,
13:35
and within one minute I will show this web page, where you could create as many machines and sensors as you like.
13:42
You could define what kind of sensors and what kind of data could be sent from the sensors.
13:48
And then you could start simulations with a lot of devices and machines.
13:53
And all this data is automatically sent to your Azure subscription. And in the Azure subscription
14:00
there will be an Azure IoT Hub, which will ingest this data.
14:03
And from that moment, you are good to go with your different IoT scenarios,
14:10
or, for example, with our Power Platform solutions. So let me just show how it looks in reality.
14:20
First of all, when I install this solution, I will see that there is a new resource group
14:28
in my Azure subscription. And below this resource group, there will be a lot of new resources.
14:36
For example, there will be an IoT Hub, which will ingest the data.
14:41
There will be a virtual machine scale set, because in the background, the component that generates these simulated devices
14:48
and messages and sends all this stuff will run on a virtual machine.
14:53
There are some new items for having all the required network capabilities: load balancer, network security group, IP address, virtual network, etc. And finally, there will be a web page and an app service
15:10
which will help to execute this kind of simulation. So let's check how it looks.
15:16
It is not the fanciest application, but it is really useful for creating such projects.
15:23
I believe it is a really good solution. This application is called, as I mentioned earlier,
15:29
Azure IoT Device Simulation. It's absolutely free. And there are only two screens in this application.
15:38
One shows what kind of devices are available for you. And on the second one, using these devices,
15:43
you are able to start simulations. I just would like to show one of my...
15:49
There are a lot of built-in devices, just to be able to start the project without doing anything.
15:56
But I just created one of my own devices, which is Floor Machine Number One.
16:03
And in this machine, there are three sensors, which generate temperature data, humidity data,
16:11
and how many items have been produced on this machine. And you are able to specify how to generate this kind of data.
16:19
So, for example, temperature is just a random number between 20 and 50 Celsius,
16:26
or humidity is a random number between 20 and 70%. But the number of items produced today
16:36
typically has incremental behavior: first item produced, second item produced,
16:42
third item produced. So you are able to do this kind of specification.
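The sensor behavior just described (random temperature and humidity in fixed ranges, plus an ever-incrementing item counter) can be sketched in a few lines of plain Python. This is only an illustration of the idea: the value ranges come from the demo, but the class and field names here are my own assumptions, not the simulator's exact schema.

```python
import random

# Minimal sketch of the three sensors described above. The ranges
# (20-50 Celsius, 20-70 % humidity) match the demo configuration;
# the field names are illustrative assumptions.
class FloorMachine:
    def __init__(self, device_id):
        self.device_id = device_id
        self.items_produced = 0  # incremental: 1st, 2nd, 3rd item...

    def next_telemetry(self):
        self.items_produced += 1
        return {
            "deviceId": self.device_id,
            "temperature": round(random.uniform(20, 50), 1),  # Celsius
            "temperatureUnit": "C",
            "humidity": round(random.uniform(20, 70), 1),     # percent
            "itemsProduced": self.items_produced,
        }

machine = FloorMachine("floor-machine-1")
msg = machine.next_telemetry()
print(msg)
```

In the real demo this generation logic lives inside the hosted simulator, which also handles the per-device send interval and the connection to the IoT Hub.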
16:48
And now we have one device. Then you could define another machine and another machine.
16:55
And then you could go and say that you would like to create a new simulation.
16:59
First you give the name of the simulation. And then just go here and select which devices you would like to add to this simulation.
17:12
For example, I just use only Floor Machine Number One, and maybe I would like to have five similar machines
17:22
in this floor simulation, and I would like to generate messages every 10 seconds.
17:29
This is the minimum interval that this simulator can use. And then you select the IoT Hub
17:38
which will ingest these messages. And then you could start the simulation.
17:44
Previously, I created a modern factory simulation with three different devices.
17:50
Every device generates different sensor data. The first machine generates a message every 10 seconds,
17:57
the second every 15 seconds, and the third machine generates messages every 20 seconds.
18:04
And now I just click on this button here and start the simulation.
18:10
Now, in the background, first the virtual machine is warming up. It takes some seconds, not too many.
18:19
And then within 15-20 seconds, it will start to ingest the data and get the messages. Okay.
18:38
So let's start to speak about how to implement this hot path. As you know, as soon as possible
18:45
we would like to process the ingested data and check that the message doesn't show a temperature
18:52
which is too high for us, for example, or something else. And if the temperature is too high,
18:59
then we would like to send an alert to the responsible people.
19:03
For such scenarios, the IoT Hub should be somehow connected to Power Automate,
19:09
because in Power Automate we could process this message and add a condition that if the temperature is higher
19:19
than a given number of Celsius degrees, then we could send an alert. For this situation, I will use the so-called Azure Event Grid technology,
19:30
which is nothing else than a very robust, very scalable event receiving and sending service in Azure. As I remember, in one month you could send
19:45
100,000 events for free, so starting a smaller IoT project is nearly free, or there will be a very small cost
19:54
at the end of the month. The first step in using this Event Grid is that we should
20:02
somehow connect this Event Grid with the IoT Hub. To do this, you first have to go to the
20:09
Azure subscription, and as you see on my screen, you have to check that Event Grid is registered as
20:16
a resource provider. If not, then you should register it: one click, and then later you are able to
20:27
assign the Event Grid to the IoT Hub. But because the Event Grid itself doesn't have monitoring
20:33
capabilities, it is a bit hard to show that something is happening. So Ferenc is just
20:40
speaking about Event Grid, but how can I check that the events, the messages, have arrived at this Event
20:46
Grid? There is another great free open-source solution from Microsoft (I just added the URL
20:53
here); it is again a GitHub solution, the Azure Event Grid Viewer. It takes around two or three minutes to install
21:03
this Grid Viewer, which is again a web application in Azure, and it will show you what kind of
21:12
events have arrived at the Event Grid, so you could check that everything is fine. So it's good to go.
21:19
So I just would like to show how this Event Grid looks, because, as you remember,
21:26
some minutes ago I started the simulation, and you see that a lot of messages have arrived in
21:33
the meantime. So I just click here, and you see that one of the telemetry messages, which is coming from
21:39
one of the devices, is a JSON document: it has temperature data, how many pieces were produced, what the humidity is, etc.
21:49
So this information has now arrived at the Event Grid, and because the data has
21:55
arrived at the Event Grid, we will be able to process it using Power Automate.
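To make the telemetry event described here concrete, the sketch below parses a hypothetical Event Grid device-telemetry event in Python. The event type and the `iothub-connection-device-id` system property follow the IoT Hub/Event Grid integration, but the exact field names inside `body` are my own illustrative assumptions, not the real payload.

```python
import json

# A hypothetical Event Grid "device telemetry" event, shaped roughly
# like the one shown in the Grid Viewer. The body field names are
# illustrative assumptions.
raw_event = """
{
  "eventType": "Microsoft.Devices.DeviceTelemetry",
  "data": {
    "body": {
      "temperature": 31.4,
      "temperatureUnit": "C",
      "humidity": 48.2,
      "itemsProduced": 17
    },
    "systemProperties": {
      "iothub-connection-device-id": "floor-machine-1"
    }
  }
}
"""

event = json.loads(raw_event)
body = event["data"]["body"]
device_id = event["data"]["systemProperties"]["iothub-connection-device-id"]
print(device_id, body["temperature"], body["humidity"])
```

This is essentially what the Parse JSON step in Power Automate does: it turns the raw body into named fields that later steps can reference.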
22:01
But there is one step that I would like to show before that: how the Event Grid is connected to the IoT
22:10
Hub. It's very simple: in the IoT Hub there is a section called Events, and you have to click on
22:17
Event Subscription. You have to say that you have an Event Grid for this IoT Hub, and just fill in what kind of events you would like to subscribe to: for example, events like a new device is created, a device is updated, a device is
22:39
removed; or, for our project, I believe the most important one is the messages arriving
22:46
from the sensors, so I just focus on the so-called device telemetry data. So then,
23:02
you are able to create the Power Automate alert mechanism. So let's check how to create a flow in Power Automate
23:02
which will get the data from this Event Grid. And, for example, "hot flow", that will be the name.
23:15
Then the event: you can see that there is something which has to trigger this flow, this Power Automate
23:31
flow, and there is one trigger, "When a resource event occurs", for an Azure Event Grid.
23:39
Okay, then I have to say which subscription (this is my Azure subscription), what kind of
23:48
resource type should be used (IoT Hub), and which IoT Hub should be used in the Event Grid.
23:57
I have only one at this moment. And what kind of event coming from this IoT Hub should be checked: for example, the device
24:05
telemetry information. And now I immediately get the message events that are coming from the Event Grid into
24:15
Power Automate. The next step is that in this event, as you saw,
24:20
the body is a JSON document. So the temperature, the humidity,
24:25
and all this information come in a JSON document. So first of all, I have to parse this JSON
24:32
to be able to handle this information. First, I have to say which data
24:40
coming from this event should be parsed as JSON: the body.
24:45
And then I have to add what kind of JSON schema is available in this body,
24:50
because it is not so easy to know from scratch what kind of data is available.
24:55
I just come back to this Event Grid Viewer and copy-paste an existing message like this one,
25:05
go here, and say that I have a sample. Hopefully everything will be done;
25:12
I just click here. And from this sample JSON, it created the schema, instead of me typing it from scratch.
25:20
Okay, now we parse the JSON. Now we could say that I would like to have a condition.
25:27
In the condition, what is my goal? If the temperature is greater than or equal to,
25:34
for example (I just write a very small number), 25 Celsius degrees,
25:41
then I should create an alert. In a real project, of course,
25:45
we would write a much higher number, but to show you that it is working,
25:50
I just use a very small number here.
25:55
So if the threshold is exceeded, which is 25 Celsius degrees,
26:01
then I would like to send an alert. And, for example, the responsible person will be me.
26:08
I just say that it is an alert email: hot temperature. And then you could add
26:16
the date of this event (this is the creation time),
26:24
the machine which generated this problem (this is the device ID), and the temperature
26:34
at that moment which generated this alert (this was the temperature and this is the temperature unit, Celsius).
26:45
And that's all. I just scroll down, save, and now I have a flow which gets the data from
26:54
the Event Grid, checks this condition, and generates alerts as an email to me
27:03
if there are some items which have a temperature over the threshold.
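The whole hot-path rule built in the flow fits in a few lines of logic: compare the parsed temperature against a threshold and fire an alert action when it is met. The Python sketch below mirrors that decision; `send_alert` is a placeholder standing in for the email action in Power Automate, and the telemetry field names are illustrative assumptions.

```python
# Deliberately low threshold, as in the demo; a real project would
# use a much higher value.
THRESHOLD_CELSIUS = 25

def should_alert(body, threshold=THRESHOLD_CELSIUS):
    """True when the reported temperature meets or exceeds the threshold."""
    return body.get("temperature", 0) >= threshold

def send_alert(device_id, body):
    # Placeholder for the Power Automate email action; a real flow
    # would include the event creation time, device ID, temperature,
    # and temperature unit in the message.
    print(f"ALERT: {device_id} reported "
          f"{body['temperature']} {body.get('temperatureUnit', 'C')}")

telemetry = {"temperature": 31.4, "temperatureUnit": "C", "humidity": 48.2}
if should_alert(telemetry):
    send_alert("floor-machine-1", telemetry)
```

Note the `>=` comparison: with the demo's low threshold of 25, almost every simulated reading (random between 20 and 50) triggers an alert, which is why the inbox fills up so quickly.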
27:09
And hopefully I can show it in reality. Maybe within some seconds, there will be some new alerts.
27:19
Yes, here comes one. This is the date, this is the machine ID (a GUID),
27:26
and the temperature is here. And you see more and more alerts arriving,
27:35
because I entered a very small temperature threshold, just to be able to show very quickly
27:41
how the alerts arrive. So using this hot path, we are able to process the data
27:48
which is coming from the simulated devices and arriving at the IoT Hub;
27:55
the ingested data goes to the Event Grid, and from the Event Grid, Power Automate is able to process it,
28:05
check it, and send an alert if it's needed. So within some seconds, we are able to send the email alert
28:14
to the responsible people in our factory. So I believe it's really cool.
28:19
And as you see, it doesn't need too much professional implementation. You have to know, of course, the basic items in Azure IoT,
28:27
that there is an IoT Hub and an Event Grid, and you have to, of course, know Power Automate,
28:33
but you don't have to write long code; we don't have to know anything about the different protocols,
28:40
et cetera. So I believe it's really good. The other approach that I would like to show is the cold path.
28:45
It means that we will store somewhere the data which is coming from the devices,
28:50
and then we will start to analyze this data. For this session, I just stored the data
28:56
in an Azure SQL database, like here. Previously, I created an Azure SQL database
29:02
with a device data table, and in this table I just store, for each device, how many pieces were produced, what the temperature was, the humidity, etc.
29:13
Then I created an Azure Stream Analytics job, and it's better to show it in reality.
29:20
So what is this Stream Analytics job? I'm just starting it, because it takes around one
29:28
minute to start Stream Analytics jobs, and while it is starting, I will show you what
29:34
it is. And you will see again that it doesn't need too many professional development skills.
29:40
It's a very simple thing, but it's a really good technology for such scenarios.
29:48
So what is this Stream Analytics? You could imagine it is a tube.
29:53
The tube has an input part, where the messages from the IoT Hub arrive.
29:58
In the tube, you are able to do transformations (we will do a very, very simple transformation in this session), and then at
30:05
the end of the tube there is the output: what kind of output service you would like to
30:14
use. For our session, we will use an Azure SQL database. So here you could say
30:24
what kind of inputs are available for the Stream Analytics job: this is an Azure IoT Hub.
30:30
And what kind of outputs are available: for this session, I just created an Azure SQL database,
30:36
but previously I also checked that with a Synapse dedicated SQL pool
30:41
you are also able to do this Stream Analytics job. And what is this job?
30:49
We understand the data is coming from IoT, and then we will write it into an Azure SQL database, but what will happen in this Azure Stream Analytics job?
30:59
So you could imagine (I just try to show it a bit bigger) the data arriving
31:08
from the IoT Hub like this: temperature, temperature unit, humidity, and the last item is the IoT
31:16
Hub record, and in the IoT Hub record there is additional information: what was the device ID, when this
31:22
happened, etc. And from this data I am able to transform it to another shape. So I have to write
31:32
a SQL query (it is not full SQL, but a lot of SQL elements are possible to use). So I just write a query:
31:43
I select from the IoT Hub input, and the result goes to the Azure SQL output.
31:51
I just selected some of the columns from the input stream. The input stream
31:59
has this IoT Hub record with a lot of information,
32:04
and I select only one item from it, the connection device ID, as the device ID.
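The transformation described here is a simple projection plus a flattening of the nested IoT Hub record into flat columns. As a rough Python equivalent of that SQL-like query (the input field names are illustrative assumptions about the stream's shape, not the exact Stream Analytics schema):

```python
# Rough Python equivalent of the Stream Analytics query described
# above: select a few columns from the input event and flatten the
# nested IoTHub record's ConnectionDeviceId into a flat DeviceId
# column, ready to be written to the SQL output table.
def transform(event):
    return {
        "DeviceId": event["IoTHub"]["ConnectionDeviceId"],
        "EventTime": event["IoTHub"]["EnqueuedTime"],
        "Temperature": event["temperature"],
        "TemperatureUnit": event["temperature_unit"],
        "Humidity": event["humidity"],
    }

sample = {
    "temperature": 31.4,
    "temperature_unit": "C",
    "humidity": 48.2,
    "IoTHub": {
        "ConnectionDeviceId": "floor-machine-1",
        "EnqueuedTime": "2020-11-01T10:15:00Z",
    },
}
row = transform(sample)
print(row["DeviceId"], row["Temperature"])
```

In the actual job this projection runs continuously on the stream, writing one flat row per telemetry message into the SQL table.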
32:08
And if I just execute this test query, you will see that it transforms the data
32:17
of the input stream, which is coming from the IoT Hub, and generates something else,
32:22
which could be written into an Azure SQL database. If everything is fine,
32:27
I just would like to show that, because this Azure Stream Analytics job has been executing,
32:35
a lot of items, a lot of messages are now available in this database, because
32:44
earlier I just deleted everything. Now we have several messages stored.
32:50
And if this Stream Analytics job keeps running, there will be more and more messages
32:55
in the SQL table. Okay, so we have achieved that all the messages
33:02
from the different machines ingested in the IoT Hub are streamed and written
33:10
into a SQL database. So now all the information is in a SQL database,
33:16
so we are able to process it somehow and analyze this information. So I just created a nice Power BI report,
33:24
which is called the floor monitoring report. And as you see, every five seconds this Power BI report is updated and refreshed, because it uses a technology called automatic page refresh. It is a technology which was
33:43
released last year in Power BI. You have to select the page itself, and in the
33:50
format panel the last section is this page refresh. You have to first enable it,
33:55
and then you have to say what kind of refresh you would like to use for
34:01
this page. There are two approaches. One is that you use a fixed
34:05
interval; here I am using every five seconds. I'm connecting to
34:11
the Azure SQL database and grabbing the data from the Azure SQL database. Here I
34:18
would like to emphasize that this automatic page refresh technology uses DirectQuery, because a DirectQuery connection ensures that the data is
34:28
always the latest data. The other way to use this automatic page refresh technology is that you
34:37
are able to write a measure in DAX which will detect that the data has changed on the data source
34:45
side, so you don't have to load the data from the Azure SQL database into Power BI every five
34:51
seconds, for example, in our case; rather, a change detection measure could find out whether you should refresh or
34:58
keep the data which arrived last time. For our small project, I believe this five-second
35:04
automatic refresh interval is really good. And as you see, every 10 seconds Machine One
35:11
sends a message saying that a new item was produced; on Machine Two a message is generated every 15 seconds,
35:18
and on Machine Three every 20 seconds a message says that something happened. And you see
35:25
this temperature-over-time visual, which is automatically updated due to this automatic page
35:32
refresh technology. So it sounds really good. We just checked this Power BI report; now we are able
35:43
to create such an automatically refreshing report, which could be visible on a big LCD screen
35:51
on the factory wall, and everybody could see what is happening on the production floor. Cool.
35:58
The next thing that I promised to you is that there will be a data manipulation application,
36:03
which will be implemented in Power Apps, for this daily refining of the planned volumes.
36:10
So I just created, using another Power Platform component, Power Apps, a small data application.
36:21
It is not a very complicated application, I have to say. So I've got only one data source, this Azure SQL database.
36:29
This Azure SQL database has another table which contains planned volumes. So it says that on this machine, on that day,
36:38
this and this volume has to be produced. And this data is visible, as you see, in a data table here.
36:48
And when I select something in this data table, it will be visible in this small modification area.
36:56
The only value that I can modify is the planned volume, and then I can click on the save button.
37:03
I just would like to show how the save button works. So when I click on the save button: in Power Apps, every object has properties and event handlers, and one of the event handlers
37:16
is OnSelect. OnSelect means you click on the button. And if you click on the button,
37:22
you will see that we will update this table in the SQL database:
37:26
it just finds which machine's data should be updated and stores the value from the input editor.
37:36
So nothing special; as I remember, it was 10-15 minutes,
37:41
with all the design work, to create this Power Apps application. Then I just published this application,
37:48
shared it with my colleagues, etc. So from that moment, the responsible people on the floor
37:55
are able to modify the planned volumes for that day, based on the actual knowledge that this machine was down,
38:05
or these people are sick now, or something happened; so I just modify the planned data.
38:14
But I believe for a factory where there are so many problems
38:21
and so many tasks to produce and manufacture items, the most important thing is not our Power Apps application
38:28
or Power BI report or yet another application. Rather, the responsible people, the team leads
38:37
on the production floor, need one place, one hub, where they are able to check the KPIs in a Power BI report,
38:47
adjust the plan, et cetera, et cetera. So the next step, I believe, on this path,
38:53
is to integrate the Power Apps application into this Power BI report
38:57
so that, instead of having the app in one browser tab and the report in another,
39:02
we are able to have one place where all the important things are available
39:08
for these responsible people. So how does it look in Power BI?
39:13
I just created another page, a report page, in Power BI,
39:16
and added the Power Apps visual to this page, and this Power Apps visual refers to the
39:27
previously implemented Power Apps application. If everything is fine, then it should show the data in this visual
39:39
It is a really simple thing: the visual just connects to the Power Apps application,
39:47
grabs the data, and shows the same screen that is available in your browser,
39:54
in your Power Apps application, like it is embedded into Power BI. Okay.
40:01
And then, after it is done, the only question that could be asked is:
40:09
okay, everything looks nice. There is a nice Power BI report. There is an embedded Power Apps application
40:16
Okay, but it is in Power BI Desktop. Is it possible to publish this Power BI Desktop file
40:23
to the cloud? What will happen? Do I need Premium capacity or not?
40:28
And the answer is that this whole automatic page refresh technology works fine with shared capacity.
40:35
So you could, for example, publish it to My Workspace or any workspace which is not assigned to a dedicated capacity. The only limitation is that the minimum interval to refresh the data is 30 minutes.
40:50
So for our situation, it is not good, but there could be situations
40:55
where a report page refreshing automatically every 30 minutes is absolutely enough. For example, if a complex process generates
41:06
an item only every hour, then it is absolutely fine for you to use
41:10
shared capacity for your report. If you would like to use something like
41:15
this five-second refresh time, then you need a dedicated capacity. For example, my sample is using a PPU license.
41:25
PPU is the Premium Per User license. If somebody doesn't know what it means:
41:30
it is now in preview and will be generally available from the 2nd of April;
41:36
it is a per-user license to access all the Premium functionalities.
41:43
For example, if there is a company with only 10 people who are using Power BI to create reports or consume reports,
41:51
but they need some of, or maybe all of, the Premium functionalities,
41:55
and they don't want to spend on purchasing a very expensive Premium dedicated capacity,
42:03
from the 2nd of April they will be able to buy PPU licenses
42:08
with very cheap monthly rates. So if you use this PPU license
42:15
or you have a premium dedicated capacity or an embedded capacity, you are able to set up this automatic refresh
42:21
For example, here I just set it up for five seconds, and then the same data is shown
42:31
and automatically refreshed in this report. For example, you see that I am just waiting a few seconds,
42:38
and the data is refreshing. So in the browser, in the Power BI service
42:44
this automatic page refreshing is also working. And the planned volumes are also possible to check:
42:53
for example, on machine one, 400 items are planned,
43:00
but I see that today we are good to go with creating 500 pieces,
43:05
so I can update it, come back to the dashboard, and you see that now on machine one
43:12
we have 500 pieces for that day, so we are doing really well at this moment.
43:19
So back to the presentation. My conclusion is that for our small pet projects
43:26
or for a smaller factory which would like to use some kind of smaller IoT
43:31
scenario, the Power Platform with Power BI, Power Automate, Power Apps, maybe with Power Virtual Agents, could be a useful solution.
43:43
So I could say that the Power Platform loves this kind of industrial
43:48
factory IoT solutions. So thank you for coming to this session, and thank you for hearing me and watching me.
43:59
And if you have any questions now or later, my email and social media contact details are available.
44:07
So thank you for having me


