This is the keynote session of the IoT Virtual Conference 2021. In this session, Allen O'Neill explains more about IoT. He also joins a panel discussion with Clifford Agius and Pete Gallagher.
About Speaker:
Allen is a consulting engineer with a background in enterprise systems. He runs his own company specializing in systems architecture, optimization and scaling. He is also involved in a number of start-ups. Allen is a chartered engineer, a Fellow of the British Computer Society, a Microsoft MVP and Regional Director, and a C# Corner Community Adviser and MVP. His core technology interests are Big Data, IoT and Machine Learning.
Conference Website: https://www.2020twenty.net/iot-virtual-conference/
C# Corner - Community of Software and Data Developers
https://www.c-sharpcorner.com
#csharpcorner #conference #IoT #AI #keynote
0:00
Welcome, everybody, to this IoT conference, and many thanks to C Sharp Corner for organizing
0:09
it and facilitating. It's a lot of hard work in the background
0:13
It's easy enough for guys like us to rock up, so many thanks to the organizers as usual
0:19
IoT, Internet of Things, people say, well, what is it? It doesn't affect me
0:24
Well, the image that I have up there kind of talks about the things that are involved in IoT, even though you may or may not know it
0:32
Everything from robotics to computers is IoT. When we use a camera or a phone and we walk down the street and we're identified by a security camera
0:46
when we go and order something online, which we do mostly these days with lockdown
0:53
and we're able to track the delivery van coming to our place of residence
1:00
That's all done with IoT. The manufacture of products in factories, as things move along the conveyor line
1:07
IoT is used for this. When we go in and we choose something in our shopping carts
1:14
Some of us go into a shop and we don't have to go through a checkout anymore
1:20
Instead, we've got a device that we scan the items. Some of the Amazon shops now use a combination of cameras and other sensors so that there's nobody in the shop to talk to
1:33
It's great for someone like me, but you go in and you just stack your items into a basket and off you go
1:39
And it automatically calculates what you've got in your basket. You don't have to go through a checkout at all
1:44
Then we have, of course, aircraft and the amount of sensors that are on an aircraft is unreal
1:51
Probably one of our key keynote speakers, Cliff, can talk to us a little bit about that later
1:56
What I'm trying to say here is that IoT is all around us. So even though we think it isn't and we're not affected by it, it's there and it's very, very ubiquitous
2:08
So the question a lot of people say to me is, you know, I don't use IoT
2:14
And I've got a simple way to demonstrate that you do use IoT every single day
2:20
Now, I know that I haven't been to the barber when I want to get my hair cut, but in the times when I was able to get my hair cut and go down to a barber, you know, I'd go off
2:32
And if I wasn't based at home and going to my usual barber, I would look up one on Google Maps
2:38
So I was in a conference before the pandemic there, about two years ago, in Vilnius in Lithuania
2:50
And I was staying in this particular hotel and I wanted to get my hair cut
2:54
So I opened up my map and I searched for where I was and I said, I'm looking for a barbershop nearby
3:03
So I typed in barber and pressed enter. And of course, it showed us all the different places that are local
3:10
And that one looked interesting, red light barbers. And I went down and interestingly, down the bottom, it told me popular times and when it was going to be busy
3:19
And I could see there when the busiest time was, according to the little chart, etc
3:27
And that's always struck me as awesome. Like I can go in there and I can decide I want to go and get my hair cut or I want to go to a gallery
3:35
I want to do some shopping. And just by going into Google Maps, they can tell me when the best time to go is, when there's going to be less people there, et cetera, et cetera
3:43
But the question is, how is that done? Right. How is that actually achieved
3:48
And the answer is it's all about location and time and it's all about sensors
3:54
So if you think about it, we're all moving along the street every day
3:59
We have our phone with us in our pocket and we go past a location
4:07
Now, how does Google or Bing or anybody else know that we're at that location
4:11
Well, our phone has GPS and GPS is a little sensor. It's a device inside our phone
4:19
So it's part of the IoT, the Internet of Things. And of course, because they can locate where we are with a GPS sensor or with a tracking because of Wi-Fi strength, they can then overlay that on top of a map
4:35
and we have the Google map cars and Bing cars and everything going around taking pictures of all of
4:43
our buildings and all of the geolocations and because of this they can say well we're outside
4:50
a particular location and they can notice a cluster of phones or people who are in a particular GPS
4:58
location for a certain amount of time. So we could assume by just simple prediction that if
5:06
there's a certain number of phones with people attached walking along a location at any one time
5:13
and then suddenly they seem to stop and they're just staying in that location for a period of time
5:18
well then you assume that they're in that location. So we can assume that those people are in there having a
5:23
haircut. And if you look at the volume of those phones, the number of those phones of those
5:28
IP addresses of those phones that are staying in that geolocation for that period of time
5:35
well, then you can see how, from the previous screenshot, we can build up that little bar chart to show who's there and when they're there
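To make that concrete, here is a minimal sketch, with entirely made-up pings and thresholds (not how Google actually implements it), of how dwell time near a venue could be turned into an hourly busyness count:

```python
from collections import defaultdict
from math import hypot

# Each ping is (device_id, hour_of_day, minutes_since_midnight, x, y), where x/y
# are metres from the venue -- invented sample data, assumed to be in time order.
pings = [
    ("phone-1", 10, 600, 3.0, 2.0), ("phone-1", 10, 612, 2.5, 1.0),
    ("phone-1", 10, 640, 3.5, 2.5), ("phone-2", 14, 850, 1.0, 0.5),
    ("phone-2", 14, 862, 1.5, 1.0), ("phone-3", 14, 860, 50.0, 40.0),
]

NEAR_METRES = 10      # "at the barbershop" if within this radius
MIN_DWELL_MIN = 20    # walking past doesn't count; a haircut takes a while

def popular_times(pings):
    """Count devices that dwell near the venue long enough, bucketed by hour."""
    first_seen, last_seen, hour_of = {}, {}, {}
    for device, hour, minute, x, y in pings:
        if hypot(x, y) > NEAR_METRES:
            continue  # too far away to be inside
        first_seen.setdefault(device, minute)
        last_seen[device] = minute
        hour_of[device] = hour

    busy = defaultdict(int)
    for device in first_seen:
        if last_seen[device] - first_seen[device] >= MIN_DWELL_MIN:
            busy[hour_of[device]] += 1
    return dict(busy)

# phone-1 dwelt 40 minutes at 10:00, phone-2 only 12 minutes, phone-3 was too far away.
print(popular_times(pings))   # {10: 1}
```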
5:41
So the different sensors that are on a phone are ubiquitous; they're with us all the time
5:50
they're there 24-7, we have accelerometers, we have humidity, we have orientation of the phone
5:57
We've got the magnetic field of where we are, the strength of that
6:02
We've got air pressure, proximity to other phones, gyroscopes, ambient temperature, accelerometer
6:09
There's a huge amount of inbuilt sensors inside in a phone before we even start talking about anything that's very specific and doing one specific thing
6:18
And in addition to what's in the phone, the next question then is how do those things relate to the world around us
6:23
And if we look at this picture here, we can see things like our phone
6:28
Well, we get into a car and our phone can connect to the Bluetooth speaker in the car
6:34
We go into our place of work and maybe we use our phone to, I've got a thing on the back of my phone and it's when I go into the office, when I'm allowed to go into the office, I do a little touchpad against the end of the access door
6:51
or it lets me in and out and it knows that I'm there. We have refrigerators that store our food
7:00
that we can watch for the food that's going in and out. We can scan the barcode of the food that's
7:08
going in and out, or we can look even at the shape of the stuff and do image recognition of what's
7:13
going in and out. And it can tell us to make sure that we don't let stuff go out of date and we eat
7:17
et cetera, et cetera. So all of these different things, these are all sensors that are part of the world of IoT. And the message here is that it's all around us. So even though we think that we aren't affected by it, it is actually everywhere. And it's growing and it's getting even bigger
7:37
So what I'd like to do now is I'd like to introduce our two guests. And we're going to
7:43
have a panel discussion. We're going to talk to Pete and to Cliff. Interestingly, Pete is there
7:49
with his headphones on in that picture. And I would expect Cliff, who is also a pilot
7:53
to be wearing his headphones in the picture underneath. So the agenda that we're going to talk about
8:00
is we're going to talk about IoT for good. Both guys have some interesting stories to tell about that
8:06
And I think Cliff is going to be talking about it in particular later on. We're going to be talking about IoT edge computing
8:12
What is edge computing? And what is this new thing called Azure Percept
8:17
We're going to talk about cobots or co-robotics. We're going to talk about this strange thing called digital twins and how it works and interacts with IoT
8:27
And then finally, we're going to talk about a topic that's incredibly important with IoT, and that's ethics
8:33
So we're going to talk about killer robots, and we're going to talk about data privacy
8:38
So, Simon, if you want to let the chaps into the conversation there now
8:45
So, Pete, do you want to give a quick introduction to yourself? I can't do a quick introduction to myself, I'll be good at that
8:52
So I recall, years ago, I was doing a pitch for an investment fund, and they always
9:01
try and sort of say, do your elevator pitch, right? But these guys actually made it even worse, and
9:05
they said, give us your elevator pitch in a tweet. So they made us do it in 120 characters or whatever
9:12
it was at the time. So you have 60 seconds, which is all you have, Pete: you and what you're doing in IoT
9:16
Yeah, I'm a freelance software developer, an IoT specialist mainly, that's where I tend to concentrate
9:23
Although today I've been working on creating little mini games for a client in Vue and JavaScript
9:28
So it really depends on what people want to pay me to do. But I'm an Azure MVP, Microsoft Azure MVP, Microsoft Certified Trainer, Pluralsight Author
9:38
STEM Ambassador, Code Club organizer, and, I'm pretty sure, meetup organizer for Notts IoT and .NET Notts
9:45
and I'm in a podcast. And, geez, am I out of 60 seconds already or what
9:52
So it's like one of those things where you say, well, tick any of these boxes that you don't do, right
9:58
Yes, that's what I need to do. Yeah, just one box. Cliff, tell us where you're at
10:05
Me, my day job is an airline pilot. I'm with a major UK airline
10:09
So I get to fly Boeing 787s around the world, which is hence the picture you looked at earlier of me
10:14
When I'm not at 40,000 feet doing 600 miles an hour, you'll find me huddled over my computer
10:19
thrashing out some .NET code, be that Xamarin or IoT. I kind of tend to try and stick to the Xamarin IoT space
10:28
is where I'm at as a .NET developer. So work on various projects with different clients
10:34
You know, kind of excited about MAUI coming along in .NET 6. Later on, I think directly after this one
10:40
I'm going to give a talk about Handy the Hand, a 3D printed prosthetic hand that I've helped build
10:46
And also on a Tuesday night, Pete forgot to say, but we do have a Twitch stream
10:51
He left it off his list. On a Tuesday evening, 9 p.m. UK time
10:56
you can work out your time zone where you are in the world. But we have a, on Azure Live
11:00
we have an IoT stream for an hour every Tuesday night where we talk about the latest and greatest things
11:04
that are coming into IoT and try and break them down into simpleton terms
11:09
that even I can understand. Awesome. And to raise you, Allen, on your workshop and building a boat, I've got two
11:15
3D printers and I'm building an airplane. So, you know, I'll take your boat and
11:21
I'll raise you one plane. Oh my God. Okay, it's actually one of my minor dreams to
11:29
build an airplane as well, but I don't think I'd be able to fit it into my workshop.
11:33
That would require... Are you talking about real building, are you talking about a
11:39
remote plane, or... It's a four-seater Sling TSi. The kit comes from South Africa,
11:47
from Joburg, and yeah, they send you a kit and you build
11:53
it. Oh my God. I think I even see the rivet gun there just beside you. Is that your rivet gun?
11:58
That's a dimple machine. So, all the metal is
12:03
flat, and where the holes are you dimple it so the rivet head's sitting in the
12:07
top and you get a cleaner airflow, so you don't get any parasite drag, which is drag caused by
12:11
rivets and things sticking out into the airflow. Okay, we're here to talk about IoT.
12:16
I'm going to drag you onto my AfterCode show to talk about that particular project, that's guaranteed.
12:23
Yeah. Okay, so the first thing that we have on the agenda here, guys, is to talk about IoT for good
12:29
And, you know, we've seen IoT, it's ubiquitous everywhere, everything from production lines to aircraft to cars
12:41
I mean, there's more sensors than you can shake a stick at no matter where we go
12:47
But in particular, we're interested in IoT for good. And Cliff, you have been doing that directly with the prosthetic work that you do
12:56
I know you're going to talk about it in detail when you do your session after this, but can you give us a small overview of the project that you have there and how you got involved in it
13:10
I got involved because family friends of ours, their son Caden, was born without a left forearm
13:20
And, you know, he's lived with a prosthetic limb. You can see here, he has a slightly different version of this that he's given by the NHS
13:29
It's effectively just a hook on the end. And there's a story about a team that are 3D printing prosthetic arms and limbs
13:37
And his mum said, could you help? I had a 3D printer, so I started out helping, but then started to think, well, even that is much the same
13:44
It's still effectively you pull that this is actual fishing line, 100 pound fishing line
13:51
And you pull across that, you move your shoulder and it opens and closes the hook
13:55
The ones that they're 3D printing are much the same: fishing line, move your shoulder and it opens and closes the hand
14:01
And I thought we could do better. And I like IoT and electronics. So we sort of took a design from a group called Open Bionics that was open source at the time and adapted that and made it so that we can actually build it cheaper and 3D print the parts
14:16
And it's got some Adafruit boards inside. It's got some Actuonix little 20-millimetre linear actuators in there
14:23
And yeah, and it moves and sort of works such that Caden now has an arm that he can control
14:31
And with his muscles, he can control. When you say it was open source at the time, has that changed
14:37
Yes, they went closed source, which I was annoyed at the beginning
14:42
but I believe now talking with various different people, none of the team will reply to my emails, even though I've tried to reach out to them
14:49
It's just kind of sad, but I believe they went closed source
14:52
because they were aiming to try and get approval from the NHS, the National Health Service, here in the UK and across in Germany and France as well. So I believe they've gone closed source, they've turned it into a business and, you know, yeah, it's one of those things
15:07
you know, I can't complain about it. They've left their designs as were up on GitHub
15:12
but they're three, four years old now. And I'll talk more about it if you come along to my talk
15:16
but what, in 35-ish minutes, you can hear a bit more about it
15:21
But yeah, it's great, Caden uses it day in, day out. And the good thing is he wants to be a chef. So when he's at college, he can't hold a tomato, an apple, a round thing with a hook
15:29
It just doesn't work. Whereas now he's got hands. He can just like we can
15:34
He can grip around a tomato, an apple, and he can slice them up. And also he can't cut himself
15:39
So he'll never be wearing those blue plasters to. But yeah. And then there's a mobile app written in Xamarin, which allows him to control the hands and also change the settings
15:55
And, you know, as his muscles get stronger and weaker, he can set the settings for how the hand works and change the grips that are in there and things like this
16:04
But as I say, in 30 minutes time, we'll come along to my talk and learn a bit more. Excellent
16:08
Pete, have you come across any IoT for good? Have you done any yourself
16:12
Any insight there you want to share? Oh, absolutely, I've come across some. We had Sarah Maston at my meetup, Notts IoT, just recently. She's involved with
16:21
Project 15, which is a Microsoft IoT for Good project, where it's an ecology-based IoT effort,
16:30
so they're looking at how they can fix all the problems around ecology, which was fantastic
16:37
And while I was arranging that, we managed to find a BBC article where Whipsnade Zoo
16:44
were looking at taking what they called selfies of elephants so they could train AI models to help
16:52
with human-elephant conflict, which you'd think is people going and poaching elephants. That's not what
16:58
they meant at all. What they mean is that if you've got a farm somewhere in Africa, then potentially
17:02
elephants could come in and trample on that and possibly even the people that are working at the
17:07
farm. So, you know, there are what they would call conflicts that happen in those particular areas, and then
17:12
both humans and elephants can get hurt during that. So they want to deploy these machine learning models
17:18
out into Africa to help that. And they managed to get Alastair Davies
17:22
I think his last name is, to come out and speak about the stuff that he was doing
17:27
And he also put IoT devices in plastic bottles and set them afloat in the ocean to see where
17:32
this plastic that is a horrible scourge across the world at the moment, to see where that goes
17:40
So those two projects that they were speaking about were fantastic. And in fact, the good thing about the Project 15 one is it's all open source
17:47
and you can deploy the whole of that IoT infrastructure yourself. And obviously, there's no point going into a great deal of detail
17:56
But obviously, if you look at it from a Microsoft point of view, you've got quite a lot of services that you need to string together
18:03
to get a full IoT solution to work if you want to use all of the bits
18:07
So you've got Time Series Insights, Stream Analytics, and storage, and an IoT Hub, and endpoints, and all of that routing, and Logic Apps, and Flow, and Digital Twins that we'll come onto in a bit
18:18
And they've just got an ARM template that you spin up and put some bits into, and it configures all of that for you
18:24
And so the reason why they did that is to make it easy for people to grab all of that powerful IoT tech and then use it, because it's complicated to do this sort of stuff
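As a small, hedged illustration of just the device-to-cloud leg of that chain, this is roughly what sending a telemetry message into an IoT Hub looks like with the azure-iot-device Python SDK; the connection string below is a placeholder you would copy from your own hub's device registry:

```python
import json
import random
import time

from azure.iot.device import IoTHubDeviceClient, Message

# Placeholder -- copy the real device connection string from your own IoT Hub.
CONNECTION_STRING = "HostName=<your-hub>.azure-devices.net;DeviceId=<device>;SharedAccessKey=<key>"

def main():
    client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
    client.connect()
    try:
        for _ in range(5):
            # Pretend sensor readings; a real device would read its hardware here.
            payload = {"temperature": 20 + random.random() * 5,
                       "humidity": 55 + random.random() * 10}
            msg = Message(json.dumps(payload))
            msg.content_type = "application/json"
            msg.content_encoding = "utf-8"
            client.send_message(msg)   # lands in the hub, ready for routing
            time.sleep(10)
    finally:
        client.shutdown()

if __name__ == "__main__":
    main()
```

From there, routing, Stream Analytics, storage and the rest of the services mentioned above can pick the messages up, which is exactly the plumbing the Project 15 template wires together for you.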
18:37
So the easier you can make it, the better. Yeah. So, yeah, that's very interesting, that, you know,
18:43
you touched on there that it's not just the device itself but it's all of the other
18:50
things that surround the device, which brings us on to this concept of edge computing.
18:57
And I never fully understood until, you know, recently what they really meant by this
19:06
edge computing and IoT on the edge. And it seems at times that there's people at companies like
19:17
Microsoft and Google and IBM and everything else, and they're just paid to think up these, you know,
19:22
random words and stuff and slip them together and, you know, then build some marketing around them, and
19:27
then they come down to the engineers and they say, make something that does this thing, you know.
19:32
So, you know, so IoT edge computing, it conjures up things of, I don't know, something that's out on the edge of the Arctic and it's like not meant to be touched, or it's maybe on the far-flung reaches of a sheep farm in Australia or something
19:58
But in actual fact, it means anything that is, you know, sort of the last step before it's touching the environment, as it were, you know
20:09
And it seems that there's a huge amount of issues in relation to edge computing and not only allowing whatever the device it is to do its job
20:23
but then getting the data that it collects back up to wherever
20:30
this place that does something with that. There is the question of when you go and you have this small device out on the edge
20:41
is it able to do everything that it needs to do? Or do we actually need to take that data and send it back up to some big processing
20:51
Goliath center somewhere that does some big magic AI and then sends some result back down
20:58
So it's not just about the individual device. It's the whole ecosystem of everything that's
21:06
around this. So when you're looking at projects, Pete, that customers might come to you with
21:14
do you generally look at sort of the device first? Do you look at the backend first
21:20
And how do you get this synergy of stuff coming together?
21:25
It can be quite difficult, actually, depending on the client. If you get a reasonably technical
21:37
client, that job's somewhat easier. But finding a delineation between what they think
21:42
they want and what they actually want sometimes can be where you need to go, and you can
21:47
often scare clients. I mean, I'm freelance, I'm a consultant, so I have to be a little bit careful.
21:55
So I can't just fling a whole heap of stuff at somebody and tell them how much it's going to cost, because they'll likely go, I don't think we want all of that stuff.
22:01
We're too scared. We'll probably go and do it ourselves or get somebody else to do it. So there's a fair bit of holding back.
22:11
But I mean, a good example is a client I've just had recently, where they are quite technical, but they know nothing about the cloud
22:11
But they came to me and they've got these little sensors, BLE sensors that do temperature and humidity and light level
22:17
much to Cliff's disgust. He hates those particular two of them at least. Um
22:21
and they knew how to create this tiny little dumb sensor... not dumb, actually it's pretty clever, but it just sits there broadcasting out these three readings. But they had no way of grabbing that data and doing something with it. Now, when
22:41
they showed me what they had, which was like a little Python script that would run on a
22:47
PC just grabbing Bluetooth data, and they said, right, we want to put these into buildings and look at how
22:51
they warm up and cool down, and we want to put hundreds of them in the building,
22:51
and we'll need little gateways perhaps to grab the data. I immediately thought of putting it in the cloud
22:57
because that's my background, but that's not how they led. They had a little server on site that they wanted that data delivering to
23:08
and then a university would dial into that, and they had Firebird as a database or something
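A rough sketch of that kind of "grab the broadcasts and forward them" script, assuming the sensors put their three readings into BLE manufacturer data (the company ID, byte layout and server URL below are invented for illustration), might look like this with the bleak library:

```python
import asyncio
import struct

import requests
from bleak import BleakScanner

COMPANY_ID = 0xFFFF          # invented manufacturer ID for these hypothetical sensors
SERVER_URL = "http://192.168.1.50:8080/readings"   # invented address of the on-site box

def decode(payload: bytes):
    # Invented layout: int16 temperature (x100), uint16 humidity (x100), uint16 light (lux).
    temp, humidity, light = struct.unpack("<hHH", payload[:6])
    return {"temperature": temp / 100, "humidity": humidity / 100, "light": light}

def on_advertisement(device, adv):
    payload = adv.manufacturer_data.get(COMPANY_ID)
    if payload is None:
        return                       # not one of our sensors
    reading = decode(payload)
    reading["sensor"] = device.address
    try:
        # Blocking post is fine for a sketch; a real gateway would queue this off the loop.
        requests.post(SERVER_URL, json=reading, timeout=2)
    except requests.RequestException:
        pass                         # drop the sample rather than crash the gateway

async def main():
    scanner = BleakScanner(detection_callback=on_advertisement)
    await scanner.start()
    await asyncio.sleep(3600)        # listen for an hour; loop forever in practice
    await scanner.stop()

asyncio.run(main())
```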
23:14
So all they wanted was for me to grab the data from the BLE sensors
23:18
and send it to that. And of course, as a consultant, then you do what the client wants, and you absolutely, you know, advise
23:26
them of the other things they could do but they had no requirement for the cloud side whatsoever
23:30
But of course, I knew that that would come anyway. That's always interesting, it's
23:35
that particular tension between what you know is possible and what could
23:47
bring value to something. And then what the customer says, well, I don't need that stuff
23:54
I know it's cool and I know it's interesting, but we only need a ladder
24:01
We don't need the Rolls Royce. Thank you very much. Even though we know that the Rolls Royce in this case
24:06
because it is rented out by the CPU is only going to cost us one P a day
24:13
We're still going to buy that box and put it in the corner, even though it's going to cost us four grand, because that's what we know. That's true.
24:22
What's... I mean, there's a few benefits here. Obviously, having a background in IoT, it's faster for
24:28
someone like myself to spin up a proof of concept, a POC. Once you get it to a certain point and
24:35
they're happy and you've done the job, or you've nearly done the job, then I tend to knock up
24:40
a little POC in my own time, I don't charge anybody for this, of what it could look like
24:45
Because that's the other delineation there, is that if you don't understand, certainly
24:50
how the cloud side works, it's really difficult to see what the benefits are. And you kind of said that about edge as well
24:56
Until you understand what the benefits are, sort of almost in a concrete way
25:00
it's quite difficult to picture what the value is. So one of the other things I use quite regularly from a consultancy point of view is IoT Central
25:09
which is a Microsoft software as a service offering that bundles up a lot of the complicated stuff
25:15
IoT Hub, Stream Analytics, Time Series Insights, storage, Device Provisioning Service, all of those different things are bundled up
25:23
into a package that you don't have to manage. And what's beautiful from a consultancy point of view
25:27
is that you can develop a solution in that and just give it to the client and they can manage the parts they need to manage
25:33
and Microsoft manages everything else, essentially. So I lean on that quite heavily, although, you know, if you were to look at my portfolio of stuff, if
25:42
two-thirds of it is IoT Central, that might just make it look like I'm dumbing it down, but really
25:47
it does give you that, a little bit like Project 15, but, you know, as a far more developed, delivered effort.
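For flavour, a device reaches an IoT Central application through the Device Provisioning Service underneath; a minimal sketch with the azure-iot-device Python SDK, where the ID scope, device ID and key are placeholders from your own Central app, looks roughly like this:

```python
from azure.iot.device import IoTHubDeviceClient, Message, ProvisioningDeviceClient

# Placeholders -- taken from the device connection page in your own IoT Central app.
ID_SCOPE = "0ne00XXXXXX"
DEVICE_ID = "my-ble-gateway"
DEVICE_KEY = "<device symmetric key>"

# Step 1: ask the global Device Provisioning Service which hub this device belongs to.
provisioning = ProvisioningDeviceClient.create_from_symmetric_key(
    provisioning_host="global.azure-devices-provisioning.net",
    registration_id=DEVICE_ID,
    id_scope=ID_SCOPE,
    symmetric_key=DEVICE_KEY,
)
result = provisioning.register()
assert result.status == "assigned"

# Step 2: connect to the assigned hub and send telemetry as usual.
client = IoTHubDeviceClient.create_from_symmetric_key(
    symmetric_key=DEVICE_KEY,
    hostname=result.registration_state.assigned_hub,
    device_id=result.registration_state.device_id,
)
client.connect()
client.send_message(Message('{"temperature": 21.5}'))
client.shutdown()
```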
25:54
Clifford, you talked earlier on about the fact that, in general, you like to
26:01
For example, your IoT tends to go hand in hand, to excuse the pun given the hand behind you, with your Xamarin stuff
26:14
And that again shows that the ecosystem and it talks directly to the ubiquity of phones in the whole IoT space
26:25
Do you want to talk to that a bit from the point of view of edge computing? Yeah, I mean, you know, Pete said that he's more kind of cloud-orientated, I'm more sort of
26:36
down and dirty in the electronics. I like getting, you know, in there with a soldering iron
26:40
and, you know, building circuits up. And I know Pete does that as well, but that's where I'm
26:45
happy. Well, it depends who you ask, doesn't it? Yeah, so, you know, that's where I'm happiest, is
26:51
building electronics and playing around and trying to work out a circuit design or what
26:55
parts I can buy off the shelf to build up a circuit. A lot of the projects
26:59
that I do with clients revolve around Bluetooth things, and then you can use a Xamarin app quite simply
27:06
because your phone has got Bluetooth to connect those devices and take
27:11
control of them or just check in on them or the sensors etc
27:15
So I'm currently working on a project which is aviation related and that is going to use
27:23
a device that's going to be in an aircraft, but it's going to use a phone to set up that device
27:30
and control it and do all the management of the device. And then when you walk away and it hasn't
27:36
got a Bluetooth connection, that Bluetooth sensor turns off and your device runs with those settings
27:43
until next time you come along and you want to change the settings, you'll connect to it again
27:48
you'll change the settings and walk away. But you can do all that. So rather than having to take the
27:52
device from where it is, or taking a big laptop and plugging it in and things. You know, I
27:57
sort of spent many years in the car industry where, to change settings, you have to take a laptop
28:01
with a special controller card and, you know, a sort of big rucksack with it all in, and go along just
28:06
to plug into a robot to change a couple of settings, or plug into one of the machine tools and
28:12
adjust something. Whereas now you can just pop along with any mobile phone, you know, Android,
28:17
iOS, it really doesn't matter, and just sort of, you know, connect to it wirelessly via
28:22
Wi-Fi or Bluetooth and just change the settings. You know, I've worked on projects where you
28:27
talk about the edge. For me, it's stuff that's kind of right out there that doesn't have a
28:32
connection, so effectively you're putting a mobile phone-like 4G SIM inside the
28:37
device, or you're using LoRaWAN to connect to something that can eventually connect
28:43
to the cloud or the internet, which can then connect to the cloud. So for me, it's something
28:47
way out in the distance that's, you know, sort of really low power, on a battery or a solar cell,
28:52
and it's got a weak signal, but you need to amplify that to get up to the cloud
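That bandwidth constraint is why payloads at this kind of edge tend to be packed into a handful of bytes rather than sent as JSON; a small sketch, with an invented field layout, of what that packing might look like:

```python
import struct

def pack_reading(temperature_c: float, humidity_pct: float, battery_v: float) -> bytes:
    """Pack three readings into 5 bytes for a low-bandwidth uplink.

    Invented layout: int16 temperature in 0.01 degC, uint16 humidity in 0.01 %,
    uint8 battery in 0.05 V steps. A JSON equivalent would be roughly 60 bytes.
    """
    return struct.pack(
        "<hHB",
        int(round(temperature_c * 100)),
        int(round(humidity_pct * 100)),
        int(round(battery_v / 0.05)),
    )

def unpack_reading(payload: bytes):
    temp, hum, batt = struct.unpack("<hHB", payload)
    return {"temperature": temp / 100, "humidity": hum / 100, "battery": round(batt * 0.05, 2)}

packet = pack_reading(23.47, 61.2, 3.3)
print(len(packet), unpack_reading(packet))
# 5 {'temperature': 23.47, 'humidity': 61.2, 'battery': 3.3}
```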
28:57
That's really interesting when we sit it on top of what we started off talking about there,
29:03
which is the IoT for good. And I was writing an article for a client last year at some stage,
29:12
and it discussed LoRaWAN and it discussed remote gateways, and in particular it was
29:20
interested in things like, out in maybe developing countries, where there were issues with wells.
29:29
Okay, and if people had to go to a well that was so many miles away to get water on a daily basis,
29:37
usually it fell to the women and kids to do it
29:41
And they were the donkeys, unfortunately. But when they get there, it could be that the well was dry
29:50
or it was half empty or they needed a longer rope to get down or whatever
29:54
So it was discussed that potentially they would have sensors that would be..
29:59
you know, solar powered, lower down in there, would be able to measure the quality of the water
30:06
alert them before they went to one that they should go to another one, et cetera, et cetera. And it would help to improve the quality of life
30:14
And ultimately, as you had flagged there, it was all about stuff that was really on the extreme edge
30:21
It was really out there. It was nowhere from civilization. It certainly didn't have the fiber broadband that, you know
30:28
we boast about having so many, you know, gigs per second up, down and everything else. Well
30:34
these things aren't measured in gigs. They're measured in bytes per minute almost. It's really
30:42
bad, you know. So it was very interesting to hear you talk about that. I also found it interesting
30:49
that you're in the aviation business and of course aircraft are the ultimate things on the edge
30:58
aren't they? Because they fly up and they're out there. And it's my understanding that certainly
31:03
commercial aircraft can gather gigabytes of information from the time they take off to the
31:09
time they land. And then when they land, that's downloaded or whatever. I'm interested in general
31:15
aviation, i.e. small aircraft. And again, when lockdown finishes, I'm hoping I can get back in
31:24
and go towards getting my private license and everything else. One of the interesting things that I sort of understood
31:32
is that in general aviation, there isn't all of the advanced electronics
31:37
and advanced stuff on the edge that you have in commercial. And I was amazed to find out that in actual fact
31:45
when you're coming into a rural or a non-commercial airport frequently
31:56
because there's no tower, there's nobody there to tell you where to land
31:59
You're literally looking out the window trying to see if there's anybody else there
32:03
coming in to land that you're going to hit or whatever before you go down. And it's only recently that we've even got collision detection systems
32:14
Those started out also, fascinatingly enough, as an open source project for general aviation aircraft, and similar types of things have started out as open source for autonomous vehicles
32:28
So we're seeing all of this stuff that happens out on the edge that may or may not get brought back into commercialization
32:40
But nonetheless, it seems to have its genesis right on the edge before it's brought back in
32:47
One of the things, you know, we start off and we talk about computing on the edge and
32:55
devices on the edge, but one of the things that always seems to be
33:00
forgotten about and tacked on at the end is security, and the ease of authentication.
33:07
And we've seen people doing proofs of concept where they were able to hack into
33:16
cars that were being driven, remotely take over a car and pull it into the side or get it to speed up or slow down or whatever
33:25
I know that I read there last year about some proof of concept attacks on aircraft which are kind of scary
33:32
and then I came across recently Azure Percept, which aims to bring in this concept of
33:45
okay, you have your device and you want to do your stuff out on the edge
33:50
but have you thought about the whole thing about authentication and about device management
33:55
and about the software update and about, importantly, the security of it, et cetera
34:00
Well, if you haven't, here's a kind of a package that you can build around your stuff that allows you just to plug it in and it takes away that headache and forgets about it
34:11
Have you guys come across Percept yet or have you any experience of having to secure your devices on the edge
34:20
Have we come across Percept yet, Cliff? Well, there's none available in the UK, so we can't play with one here in the UK, sadly
34:27
I did, I ordered one last night, the usual, I ordered it through a virtual
34:34
US address. Yeah, there's a trick. That's what we've missed, Pete, that's what we've missed.
34:40
For security, I kind of... I don't trust any IoT device for security. So Bluetooth
34:48
data that comes from the device, I treat it just like you would a web page
34:52
you never trust the user's browser, do you, when the data comes back and they've filled in a long form and it gets back up to the server. You do server-side
35:00
validation as well as client-side. I treat IoT devices, when the data comes back to where I want
35:05
to process it, be it a mobile phone or back in the cloud, I treat them almost the same. I don't trust
35:09
the data because you don't know what's happening to it in transit. You know, Bluetooth is notoriously
35:15
difficult to secure. You know, Wi-Fi a little bit better. LoRaWAN, you know, much like Wi-Fi
35:21
you can secure it a bit better, but it's, you know, it's insecure, because you're transmitting
35:27
it via a medium which isn't, you know, HTTPS and all the other gubbins that go with it. So I just
35:33
don't trust the data, and that's the way I think of it mentally, which means that I make sure that whatever data I am sending is the bare minimum. You know, I'm trying to obfuscate it in some way as well when I'm sending it. So that's how I do it
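In practice, that "treat it like an untrusted form post" attitude just means re-validating every reading at the point where you process it; a minimal sketch of the sort of checks involved, with made-up field names and ranges:

```python
from datetime import datetime, timedelta, timezone

# Invented expectations for one class of sensor; tune per device type.
EXPECTED_FIELDS = {"device_id": str, "temperature": (int, float), "humidity": (int, float)}
KNOWN_DEVICES = {"ble-sensor-01", "ble-sensor-02"}

def validate_reading(reading: dict, received_at: datetime) -> list[str]:
    """Return a list of problems; an empty list means the reading is acceptable."""
    problems = []
    for field, expected_type in EXPECTED_FIELDS.items():
        if field not in reading:
            problems.append(f"missing field: {field}")
        elif not isinstance(reading[field], expected_type):
            problems.append(f"wrong type for {field}")
    if not problems:
        if reading["device_id"] not in KNOWN_DEVICES:
            problems.append("unknown device")              # don't trust unregistered senders
        if not -40 <= reading["temperature"] <= 85:
            problems.append("temperature out of physical range")
        if not 0 <= reading["humidity"] <= 100:
            problems.append("humidity out of physical range")
    ts = reading.get("timestamp")
    if ts is not None:
        age = received_at - datetime.fromisoformat(ts)
        if age > timedelta(hours=1) or age < timedelta(0):
            problems.append("stale or future timestamp")   # possible replayed packet
    return problems

print(validate_reading(
    {"device_id": "ble-sensor-01", "temperature": 212.0, "humidity": 55.0},
    datetime.now(timezone.utc),
))   # ['temperature out of physical range']
```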
35:47
The percept is really interesting. I'm very eager to play with one and see what we can do with it
35:54
Me and Pete have talked about it quite a few times, haven't we, Pete? So it'd be nice to get one and have a play
35:59
Yeah, we are working on it. We can say that, can't we? Tell us, Pete, about cobots. What's a cobot? Well, cobot, co-robotics, do you mean?
36:10
Yeah, like remote-controlling robots, is that sort of where you... Yeah, it's sort of,
36:16
I've heard before, when people talk about robots, and one of the things that we have here, even down
36:22
the agenda, that we can kind of blend into from this, is, you know, killer robots and Schwarzenegger's thing
36:27
coming along and all that stuff and Skynet and everything else. And, you know, people think of
36:33
robots and they think, oh, killer robots, and the robots are going to take our jobs and everything
36:38
else. When I talk about AI, I always say, look, we're not interested in replacing the human
36:46
We're interested in augmenting the human. Yeah. It's critically important to keep that sort of
36:54
at the top of the conversation. And co-bots and co-robotics in the context of what we're
37:02
discussing here is where we have robots that work with people. So for example, let's take a
37:10
healthcare scenario whereby we have a robot. That's not something like, you know, people think
37:21
robot, and they just think... they don't think of the hand that we see behind Cliff there, right? They think of
37:27
the humanoid thing, right? Generally, the first picture that comes to their mind
37:31
is usually humanoid, and killers, and laser eyes and stuff. And it's, yeah, it's generally not. Oh,
37:38
it's a thing with four wheels that scurries along the ground at two miles an hour,
37:42
trying to avoid everything, looking stupid most of the time, right? So a cobot, we could think of it
37:48
as something that maybe goes into or comes alongside a patient's bed and is nothing more
37:57
than a thing on three wheels or two wheels that says, take your medicine now, here's
38:03
your three tablets to take type of thing. And it takes the effort away from the human or maybe it's monitoring, maybe it's a bed
38:13
that we already have, in actual fact, for people who are heavily immobile
38:18
that helps to turn the patient, this type of stuff. So co-robotics in this kind of thing
38:27
is certainly becoming much more prevalent. And if you think also, a classic example
38:34
that does almost edge into the killer robot thing is... have you heard, I'm sure you have,
38:42
of these harnesses that people who do manual work can step into
38:48
and they help them to lift things. Yeah. Very much like an alien
38:53
There's a scene in there. It might be Alien 2 or something. I think it's Alien 2, isn't it
38:58
It's an exoskeleton. Yeah, exactly that. Yeah. It's quite clever, really. Yeah
39:03
Also, if you think of co-robotics, autopilot on an aircraft is effectively a co-robotic
39:08
It's flying the plane for us, so we can deal with... you know, rather than being pilots, we're systems administrators now. We step back and, you know,
39:16
the plane flies itself. We do the takeoff and put the autopilot in at, you know, 10,000 feet, and
39:22
the plane flies itself and we're just managing it and telling it, okay, go left, go right, or follow the
39:26
flight track. You know, we're managing the system, so, you know, that's kind of co-robotics.
39:32
I don't think you're ever going to get a commercial aircraft flying with passengers
39:37
and no pilots because, well, we don't have a car that can drive itself
39:41
along the road at the moment with no drivers. So you've got a third dimension, at least in the car it can stop
39:46
if it's got a problem. If the computer has a blue-screen-of-death moment,
39:50
it can just stop and everyone gets out and says, oh, well, the car's broken down, I'll call a tow truck.
39:55
You can't really do that at 40,000 feet. It's not going to work. But that, for me, is kind of co-robotics.
40:01
Passengers, if you look to the left, we're approaching the Azure cloud, if you'd care to disembark...
40:07
Yeah, one of you press Control-Alt-Delete, please. Absolutely. So let's get on now to digital twins, right? So, Cliff, we were talking
40:22
uh about the absolute sheer uh tsunami of data that is generated in an aircraft
40:30
when it flies, especially a commercial aircraft. And even the complexity of something like the aircraft engine
40:40
to say nothing about the navigations. One of the things that I've long known about aircraft
40:48
and flying aircraft is that there's so much redundancy built in, a nice redundancy of systems
40:55
And to me, Digital Twins allows us to sort of be able
41:00
to see how things could be modeled and how they could go wrong
41:05
and how they are going. Even while the aircraft is away, if we look at the edge devices that are off out on the edge
41:14
that, because they're on LoRaWAN and the data communications are very unreliable and we don't get the data in for a couple of days, we can still get an estimate of how it is by mirroring the stuff in twins
41:31
Do you guys want to give an explanation of digital twins and how it relates to IoT, maybe, in one minute
41:37
Who wants to take that on? Go on, Pete, you're the expert. Yeah, digital twins are, well, kind of alluded to in the title
41:45
a digital representation of something potentially in the real world, but not necessarily
41:50
But it can be anything from a small device to a human
41:54
You can make a digital twin of a human. And in fact, if you watched the NVIDIA GTC conference, NVIDIA got a large plan to make
42:02
digital twins of pretty much everything they can in the real world, which if you can do
42:07
something like that, it's fantastic. But one of the real powers of digital twins is what you just mentioned there a second ago, where you
42:14
can use them for modeling. And I heard a really great story about one of the early adopters of
42:19
digital twins before there was such a term even were Formula One teams, where they model everything
42:24
down to even the thickness and gauge of a wire, so that they can try reducing a wire and seeing
42:31
what the result is on the rest of the system, which is a massively complicated machine. You're talking
42:35
about another machine that's got sensors producing gigabytes of data. I bet a Formula One team probably
42:40
produces more data on their car in one race than you would in five days' worth of flying on a
42:45
plane, that's that much data. So it's a lot. They probably have sensors on more of the car than
42:53
you would on a plane, even. Planes are big enough that you probably don't have as many sensors.
42:58
I don't know, I'm just guessing. But I know that, you know, the aircraft that I fly, the 787, has
43:05
two Rolls-Royce engines, and the data they produce is, you know, sent directly back to
43:11
Rolls-Royce and also back to the airline that I work for, and they can, you know, see problems
43:16
with the engine way before we would see them as pilots sitting at the pointy end of the
43:20
jet. You know, I've landed and been met by engineers that have said, okay, we're just
43:26
going to take the aircraft, as we can see a problem starting to occur with the left engine, and we
43:33
knew nothing about it, we'd just been flying the thing for like 12, 13 hours. Like, yeah, we
43:37
can see it's going to fail, you know, in a few sectors' time, so we're going to, you
43:43
know, take it out of service now and rectify the fault. So it means that
43:48
they could plan the time of taking it out of service, because to an airline,
43:53
an aircraft is only making money when it's in the sky; on the ground it costs
43:57
money. So they could see, okay, we'll take it out of service now, get this repair effected while we've
44:03
got a spare on the ground, and no one knows, the passengers don't know, that the aircraft has changed
44:07
from one registration to another registration underneath them, but it all
44:12
happened seamlessly because they could see that data. You know, I've been airborne where we had
44:16
a technical problem with the aircraft, and I've been on the sat phone back to the UK and head
44:21
office and spoken to engineers, and they could see live streams of data, with a delay of about a minute
44:26
between what we're seeing on the aircraft and what they're seeing on their computers. So again, they can
44:30
see a digital representation of the aircraft on the systems back in the UK. So, you know
44:37
that's kind of digital twins, you know, is a way of visualizing a live system or a system that's
44:43
not currently connected to the network. You could, you know, update it, make changes, and then next
44:48
time it connects, it will suck down those changes and those new settings.
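The core of that idea can be reduced to a pair of "desired" and "reported" property bags that get reconciled whenever the real device comes back online; a toy sketch of the pattern (the concept only, not the Azure Digital Twins or IoT Hub device twin API):

```python
class DeviceTwin:
    """Toy digital twin: desired settings vs. the state the device last reported."""

    def __init__(self):
        self.desired = {}    # what the back office wants the device to be
        self.reported = {}   # what the device last told us it actually is

    def set_desired(self, **settings):
        # Operators can edit the twin even while the real device is offline.
        self.desired.update(settings)

    def device_sync(self, device):
        # Called when the device reconnects: it sucks down pending changes
        # and reports back its actual state.
        device.apply(self.desired)
        self.reported = dict(device.state)

    def drift(self):
        # Differences between what we want and what the device last reported.
        return {k: (v, self.reported.get(k)) for k, v in self.desired.items()
                if self.reported.get(k) != v}


class FakeGrip:
    """Stand-in for a real device, e.g. a prosthetic hand's grip settings."""
    def __init__(self):
        self.state = {"grip": "pinch", "strength": 3}

    def apply(self, settings):
        self.state.update(settings)


twin, hand = DeviceTwin(), FakeGrip()
twin.set_desired(grip="power", strength=5)   # changed while the hand is offline
print(twin.drift())                          # {'grip': ('power', None), 'strength': (5, None)}
twin.device_sync(hand)                       # hand reconnects and syncs
print(twin.drift())                          # {} -- twin and device now agree
```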
44:56
Yeah. So let's move on now, finally, and talk about the wonderful world of ethics. I remember when I heard this years
45:05
ago, when I was only interested in myself and my mates and my music, I was going to go, you know,
45:13
who cares, you know, who cares about these ethics things. But then, I think, as you realize the
45:19
world is not just about you, suddenly it becomes more important, you know. And the main
45:28
thing again that comes into people's heads is stuff like killer robots, but it's not just about
45:34
killer robots and whether we design IoT devices that can go out as a drone and, you know,
45:41
track somebody down and kill them. It's about things like face recognition, it's about things
45:49
like data privacy of the data that we have. So, for example, if I take my Fitbit that's charging here and I'm gathering data on the
46:03
person, what do I have to do to ensure that the data that comes off that IoT device isn't
46:11
giving away some information that is going to be injurious to the person and that we
46:16
protect the privacy? So there's a whole range of stuff there that causes an issue
46:25
Do you think from an engineering point of view that we need any kind of regulation around IoT ethics
46:36
Or is it something that should just be left up to the industry to self-regulate
46:43
You're muted, Cliff. I'm muted there. I think that's the saying of 2020-21, isn't it? I think at the moment the industry
46:54
should self-regulate, and the reason I say that is because I think it'd be too difficult for politicians and the powers that be to come up with a way of regulating it, and I think they'd get it wrong. So I think we should self-
47:09
regulate, and, you know, the likes of the Troy Hunts of the world that point
47:15
out, and the Scott Helmes, that point out all these security failures and, you know, where
47:20
things have gone wrong, we need more of that, it needs to be shouted louder when
47:25
companies do get it wrong. And then it'll mean that, you know, when clients come to me with a
47:29
project, you know, I'm less is more. If I don't need to collect the data, if we don't need it, then why
47:34
collect it? Don't collect it, don't store it. But we might need it later... but you don't need
47:38
it now, so let's not collect it. Because then, that way, you've got... you know, I always sort of work
47:43
on a premise that even if I'm trying to store it securely, it's in the open. That way, you know, what
47:49
I'm storing can be seen by anyone, and it makes me feel better about the data
47:53
I'm collecting. So it's interesting, when we talk about ethics, one of the things that we,
48:00
you know, think about is data privacy and data security, and are we collecting too much or too
48:06
little. One of the things that we don't generally talk about is inclusion, and one
48:13
of the reasons that inclusion is really important in ethics, I think, can be clearly
48:18
illustrated by one example. I recall being in the Science Museum, in fact, in London a couple of years back
48:30
and in one of the exhibitions that they had, they had a little sort of a booth
48:37
and you would go in and you would stand up, and it would look at you
48:41
and it was called the EmotoCon or something, I don't know, some silly name
48:46
But what it would do is it would zone in and it would try and using, you know, cameras to automatically recognize what is the gender of the person who is standing in front of me
48:58
What emotion are they showing in their face? What age are they
49:04
And, you know, what gender are they? That type of stuff. And I thought it was cool and I wanted to go and see what it would make of me, Santa Claus, right
49:14
and I was kind of queuing, you know, the 50-year-old geek standing there, of course, and
49:23
everything from six-year-olds up to their mums and dads all lining up as well.
49:27
So this particular young chap went in and it said that he was 11 or 12, and he went,
49:35
and it said, you're happy, and he went, and it said, you're sad, and that was awesome, right?
49:40
And this other kid then went in, and it didn't give anything.
49:50
It just kind of gave a "there's nobody standing there" response. And then his dad went in,
49:58
and, kind of, because the kid was going, what's going on, his dad went in
50:02
and stood in as well, and it gave a similar type of thing. And I suddenly clicked
50:06
what was going on: the gentleman and his son were Sikhs. So they were both wearing turbans
50:14
And it completely threw out the algorithm. Because number one, it had been
50:24
clearly trained primarily on Caucasian individuals. And B, it had never been shown anybody
50:33
who had a hat on. Irrelevant of whether it was a Sikh
50:37
turban or the queen with one of her fancy hats or whatever
50:43
It's irrelevant. The fact is that that piece of IoT had been deployed
50:50
without having done some kind of inclusivity testing. So it's very important
50:55
We still have that problem now too, don't we? Twitter's AI algorithm for selecting the faces and images
51:00
chooses Caucasian people over people of African descent and things like that
51:06
Still, I think today, in fact, which is a terrible thing for it to do
51:10
It just auto selects. And there's AI where people with dark skin against dark backgrounds just don't get shown as faces because there's not enough contrast between the background and the person's face
51:23
So there's some big problems. Absolutely. And you know what? Those things are not technically insurmountable
51:29
They're not even difficult problems. Right. And it's purely a case that they haven't got enough people with diversity in their training sets for it to be actually lifted up as a problem
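One cheap guard-rail is simply to break evaluation metrics down by group before shipping, so a gap like that shows up as a number rather than as an embarrassed museum visitor; a minimal sketch with invented labels and predictions:

```python
from collections import defaultdict

# Invented evaluation records: (group_label, model_detected_a_face)
eval_records = [
    ("no head covering", True), ("no head covering", True), ("no head covering", True),
    ("no head covering", False), ("turban", False), ("turban", False), ("turban", True),
]

def detection_rate_by_group(records):
    totals, hits = defaultdict(int), defaultdict(int)
    for group, detected in records:
        totals[group] += 1
        hits[group] += int(detected)
    return {group: hits[group] / totals[group] for group in totals}

rates = detection_rate_by_group(eval_records)
print(rates)   # {'no head covering': 0.75, 'turban': 0.333...}

# Flag the release if any group lags far behind the best one.
worst, best = min(rates.values()), max(rates.values())
if best - worst > 0.1:
    print(f"WARNING: detection rate gap of {best - worst:.2f} between groups")
```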
51:39
You know, and that's very unfortunate. So, OK, we're going to have to wind up now
51:46
That's been a very interesting tour of what exactly IoT is, through some examples of IoT being used for good, co-robots, digital twins
51:57
we didn't quite touch on the killer robots yet but I will
52:03
we should actually I saw a Twitter thing about 12 months ago
52:12
and I just said oh my god that's just the end and it showed these guys who had put together a drone
52:18
that was actually flying around with a working chainsaw, right? What could possibly go wrong, right
52:30
Yeah. So we leave it at that. Thank you very much


