Experience the Mass Robotics Convention in Boston, MA with us as we walk through the expo to see all of the cool new robotic innovations and technology!
Mass Robotics Summit and Expo 2025 https://www.massrobotics.org/event/robotics-summit-and-expo-2025/
Shot by Moloney Creative Agency: https://moloneycreativeagency.com/
Mike Moloney Instagram: https://www.instagram.com/mjmolo/
Matthew Moloney: https://matthewmoloney.com
Matthew Moloney Instagram: https://www.instagram.com/moloneymatthew_/
Watch all Matthew Moloney short films: https://www.youtube.com/@mattmoloney_/videos
Music Credits:
"Auckland" – VYEN https://www.youtube.com/watch?v=O5b1j_zQ1os
"Icelandic Arpeggios" – DivKid https://www.youtube.com/watch?v=pxoq8jEGbBo
"Instant Crush" – Corbyn Kites https://www.youtube.com/watch?v=vad0YbV9wm4
"Wander" – Emmit Fenn https://www.youtube.com/watch?v=Y1PKVhk_bvU
Video Transcript
0:00
have one part of that plant.
0:14
[Music] Any questions
0:27
about that? echoes on that screen.
0:37
[Music] Also give you a quick demo. We are zero
0:42
error. Zero error means zero-error motion control. We specialize in design, development and manufacturing of the
0:48
rotary actuators. Our rotary actuators are composed of several core components,
0:54
including the servo driver, servo motors, absolute encoders, friction brake, electric torque sensors, strain
1:01
wave gears, also known as the harmonic drives, and all these core components are
1:06
integrated in one actuator unit. The actuators look like this
1:12
one and we also are pleased to show you
1:17
this layer-by-layer structure: same product, and a more direct view of what's
1:24
happening [Music]
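The strain wave gearing (harmonic drive) mentioned above gets its very high reduction ratio from a small tooth-count difference between the flexspline and the circular spline. A toy calculation, with illustrative tooth counts not taken from the video:

```python
def harmonic_drive_ratio(flexspline_teeth: int, circular_spline_teeth: int) -> float:
    """Reduction ratio of a strain wave gear (harmonic drive).

    The flexspline has slightly fewer teeth than the circular spline,
    so one full input rotation advances the output by only the tooth
    difference -- hence very high ratios in a compact package.
    """
    diff = circular_spline_teeth - flexspline_teeth
    return flexspline_teeth / diff

# Illustrative tooth counts: 200 vs 202 teeth
ratio = harmonic_drive_ratio(200, 202)
print(ratio)  # 100.0 -> a 100:1 reduction in a single stage
```

This is why a single compact stage can replace a multi-stage conventional gearbox inside one integrated actuator unit.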
1:43
Heat. Heat.
1:48
[Music]
2:18
Heat. [Music]
2:36
So the next [Music] position's position.
2:46
[Music]
2:52
[Music] Thank [Music]
3:06
you. Thank you for
3:16
Hi, I'm Mike from Airline. I'm here at the Boston Robotics Summit uh 2025. And
3:23
uh what we're showing today is I'm actually sharing the SMC booth with an Omron collaborative robot. We're using
3:28
SMC grippers to actually do a small pick and place application. As you can see with the boxes, we're basically sorting
3:35
them from one side to the other. The Omron robot has a lot of great features with included vision um included
3:41
programming that goes with it, the ability to do arrays and other um methods of programming. It has unique
3:48
features of a landmark, which can help you with location so you're not using your zero reference coming from
3:54
the base. So if you have any questions, you can contact Airline at airlinehyd.com
4:01
uh and schedule a trip with our tech traveler. We can come and bring the technology to you.
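The landmark feature described above amounts to expressing taught points in a vision-detected reference frame rather than the robot's base frame. A simplified 2D sketch of that idea (illustrative math, not Omron's actual API):

```python
import math

def landmark_to_base(point_lm, lm_x, lm_y, lm_theta):
    """Convert a point taught in the landmark frame into base-frame
    coordinates, given the landmark's detected pose (x, y, rotation).

    If the fixture shifts, only the detected landmark pose changes;
    the taught points stay valid without re-teaching.
    """
    px, py = point_lm
    c, s = math.cos(lm_theta), math.sin(lm_theta)
    return (lm_x + c * px - s * py,
            lm_y + s * px + c * py)

# A pick point taught 100 mm "ahead" of the landmark, with the
# landmark detected at (500, 200) mm, unrotated:
print(landmark_to_base((100.0, 0.0), lm_x=500.0, lm_y=200.0, lm_theta=0.0))
# (600.0, 200.0)
```

The design benefit is exactly what the speaker describes: the zero reference follows the landmark, not the robot base.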
4:11
[Music]
4:22
[Music]
4:28
[Music]
4:41
[Music]
4:46
Look at
5:02
this. [Music]
5:20
Right. lot of what we're doing, testing, taking your technology.
5:28
It really helps my new
5:37
[Music]
5:50
friends. So they do a lot of like front end work. My name is Danielle. I'm
5:56
co-founder at Fresco Design. We're a design consultancy based outside of Boston. We help client companies through
6:02
product development. So, we go from early concept design through to design for manufacturing. Um, a lot of the
6:09
products that we work with have to consider human factors and ergonomics. So, you'll see a lot of products that
6:16
are worn on the head or over here that it's sat in, right? A lot of the
6:22
products we're designing across industries, whether it's medical or wearables or the consumer space, it has
6:29
to interact with the human body. Um, a lot of our clients are also bringing new
6:35
technologies to the world and maybe productizing something for the first time. So, we help them say, how do you
6:40
take a new technology, design it for a product with a really like wonderful user experience so that it's going to be
6:47
adopted by the consumer? A great example of that is um this is work we did for one of our
6:54
clients, Neurable. They do uh brain-computer interface technology. So their
6:59
technology uh senses your brain waves and it helps you detect when you're focused and when you're being
7:05
distracted. So they wanted to take this technology into a consumer product. So
7:11
we help them in that process integrate their technology into consumer
7:16
headphones. So a big part of that effort was how do you take the sensors and
7:22
integrate it into a form factor which can be designed for a product that can
7:27
be sold on the consumer market. So these are actually
7:32
um uh conductive fibers that are woven into ear cushions that are connected to
7:38
the technology which is inside of the housing. Part of our process as we're
7:44
designing is prototyping. So we want to understand how a product is going to look, how it's going to feel. This is an
7:51
appearance prototype. Um but it can be worn so that you can test and understand
7:57
like form factor, what it feels like, what it looks like. We help them really derisk this
8:04
product in the design for manufacturing process. So they could take that technology and go to um headphone
8:12
companies and say would you like to partner with us? So their technology is now integrated into the Master &
8:17
Dynamic headphone which has amazing audio and sound. Um so you can now you
8:22
know you can now buy this technology in a product that you're going to wear every day as a consumer.
8:28
[Music] Um another example of work we do I mentioned
8:33
prototyping. It's a huge part of the design process. So, as we're solving
8:39
design problems, we need to test and iterate constantly. This is an example
8:44
of an appearance prototype. So, this is all um 3D printed SLA parts, including
8:50
the mesh here that's inside of the helmet. An appearance prototype is what
8:56
helps our team and our client teams make design decisions and understand what direction they want to take a product
9:02
into. We also do a lot of working prototypes: prototypes that are going to be used for field testing and user
9:09
testing. And this is all part of the design evolution as we're designing for the final manufacturing intent of a
9:16
product. Um, so that's a lot of fresco in a nutshell. Um, one more thing I'll
9:23
mention is when we're designing these products, we're saying how do you show
9:30
your customer the technology that you've taken years to develop? How do you like
9:35
convey that technology in a really like easy to understand and easy to digest
9:40
format? So, our team of 3D visualization artists, we bring a lot of technology to
9:46
life um in 3D with technical animations that
9:52
showcase how a product works, how a technology works. And this is a way for
9:58
um our clients to present their technology whether it's consumer-facing
10:03
or for an investor pitch or internal stakeholders. So it's part of the the
10:08
product journey is that final storytelling of what your technology can do.
10:16
Hi, I'm Gustavo Fontana. I'm one of the co-founders of Fresco. Thank you for coming to our booth today. Uh I'd like
10:22
to tell you a little bit more about our culture. Why Fresco? Um my background is
10:28
in product development, industrial design. I was working in advanced development in the corporate side. I did that for quite a bit. And when I would
10:35
look for a design company, it was very hard to find the right mixture of front-end creativity, but also follow that with a
10:42
lot of uh execution, high quality design-for-manufacturing know-how, and then also
10:48
be very good about fast iteration, to let our clients know and show them
10:54
different ways of how to be more creative, right? And this is where we started Fresco. Uh we're a small
11:00
company. We're only 15 people in three offices and our approach is really
11:06
uh working with a very senior team, very hands-on. Our average experience is 13
11:12
years for the entire team, and our goal is to be an embedded, augmented part
11:19
of the in-house design teams of a small startup or large corporations. So we tend to be instead of this outside
11:26
company that works sending things over the wall be much more involved in the day-to-day decisions of why and how we
11:34
use design and then how to bring that design into something that is delightful
11:39
for users that works well that has been tested and evolved every day and as we
11:44
do that also that starts meeting all your business goals it starts meeting all your goals of quality of
11:50
manufacturing and, I'm sorry, manufacturing reliability, and also it's easy for the consumer
11:56
to understand. Um our background tends to be in terms of the type of products and projects we do study more with the
12:03
consumer and consumer electronics, but we really have a knack for things that people wear, things that
12:10
people interact with that they need to grip that they need to put on their heads they need to put on the on the
12:16
bodies. Though I would say human factors becomes like a natural extension of the
12:22
areas of excellence of fresco and we get to do that through a lot of as you can
12:27
see we have a lot of uh dummies of all different sizes of people, different sizes of individual elements such as ears only
12:35
or noses only or, you know, shoulder width only, and depending on what kind of
12:41
product you're working with, we're going to help you in those directions. And the prototyping, we take them from
12:47
the very rough and dirty printouts. Uh we got to the point that uh we have over
12:55
25 printers in our studio. So we print every day and every night just to kind of see where design is at. And besides
13:01
that, we also do very finished things like uh you can see here finished level appearance prototypes. That's the level
13:08
of prototyping that tends to be a bit of a dying art because people rely a lot on computer renderings, but they don't
13:15
get to have the visceral connection to how a material starts feeling or how a
13:20
color way really works out in real life until you start doing it. Funny part about this when we do a CMF study which
13:28
is color material and finishes and we do it only in the tube and we do it first
13:33
virtually in renderings which are really good about doing that. As you can see some of the rendering exercises, a lot
13:39
of the colors and that we choose when we start ordering those paints and painting them, they don't look as we expected.
13:47
So, there's a lot of iteration that we start doing mixing paints, uh, repainting, rematching until we get it
13:53
right. So, it's a very hands-on process. A lot of our clients come to a model shop and paint room right in Marlborough. Uh
14:01
because if you have someone doing that far away overseas or across the world, uh you're not going to be able to say,
14:07
"Ah, that's not the red that I wanted or that green. It just looks a little bit dead and I wanted something more vivid."
14:13
So, we can do the level of iteration and care to really get you to the finishes
14:19
and colors and impact that you want when you're doing something for, you know, visual stunning differentiation.
14:26
[Music]
15:02
Yeah, it's actually funny. Let me say that. Yes.
15:14
[Music] Yeah. [Music]
15:37
Some people try to [Music]
15:45
make
15:53
it. I remember choosing their color.
16:00
[Music]
16:09
[Music]
16:26
[Music]
16:37
Hold on. [Music]
16:48
Hi everyone. U we are here at the robotics summit for 2025 in Boston. Uh
16:56
what we have here is a virtual factory digital twin demo where uh QNX is
17:03
essentially the brain that's running your robotic system. And uh in terms of what's happening here
17:11
is uh we have a virtual factory digital twin demo where the QNX real-time operating
17:18
system powers the flexive robot and we're essentially also having a digital
17:25
twin that's running on Unity engine and the Unity engine which shows the whole
17:32
virtual factory is also running on a QNX uh real-time operating system. Through our
17:38
general embedded software development platform, QNX is essentially enabling a
17:44
lot of robotics applications which power very complex applications across
17:51
different industries. I can quickly show you a demo here where uh the real time operating system
17:59
essentially is helping you mimic what's happening in the physical digital world
18:04
and showing how that could operate within a virtual factory setting.
18:11
So as I pick up things from the robotic arm
18:19
and move along things on the right side. What we have
18:25
here is the same robot arm setting within a virtual
18:32
factory setting that is between a production and production line. So we
18:38
have different use cases here, uh, we have a pick and uh place robot, but in terms of how
18:46
this uh optimizes and accelerates embedded software development is that it's the one that's
18:53
powering safety and any kind of safety-critical embedded software
18:58
applications within robotics. Thank you again.
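The setup described above, one command stream driving both the physical arm and its Unity digital twin, can be sketched in plain Python. This is toy state mirroring with illustrative names, not the QNX or Flexiv APIs:

```python
class ArmState:
    """Joint-space state shared by the physical arm and its twin."""
    def __init__(self, joints=7):
        self.angles = [0.0] * joints

    def apply(self, joint, angle):
        self.angles[joint] = angle

def dispatch(command, physical, twin):
    """Apply one command to the real arm and to its digital twin, so
    the on-screen virtual factory stays in sync with the hardware."""
    for arm in (physical, twin):
        arm.apply(*command)

physical, twin = ArmState(), ArmState()
dispatch((2, 0.8), physical, twin)   # move joint 2 to 0.8 rad
print(twin.angles == physical.angles)  # True: the twin mirrors the arm
```

In the real demo the mirroring is driven by the RTOS rather than a Python loop, but the structure, one command fanned out to both representations, is the same idea.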
19:07
Okay. So, here's the pick and place.
19:25
Uh there's also another mode. Uh just this is the eight.
19:33
[Music]
19:53
To build any software defined embedded system, it's very obvious that you need software, you need uh hardware and also
20:02
development tools to put the software and hardware together but you but
20:07
today's software defined embedded systems are very complex and they need to be safe and secure, so you need
20:13
efficient tools to put them together, and these embedded robotic systems have to safely communicate with each other. So
20:20
for those reasons we have the QNX general embedded development platform, and if you
20:25
see here we provide not only uh like uh just the OS and the hypervisor we
20:31
provide the entire software framework network and also a starting application so that you can accelerate the building
20:37
of your software defined embedded system. So in this particular thing the only software component that's tied to
20:43
the hardware is the board support package and also the drivers here, and our software runs on an x86 processor or
20:51
an ARM processor and it can run on the cloud or in the production hardware. So
20:58
you don't need to wait uh for the development of your uh software for
21:03
building your device; you can start off on any hardware, and because most of the software above this board support
21:09
package is completely software defined, you should be able to migrate from one system to the other. We also provide a
21:16
jump start application. When I say jump start application, I mean a starting application, such as a hello world application, and we have
21:23
the virtual factory demo that is a also a jump start application for us to build
21:29
robotics. Uh let me show some of the middleware
21:35
components we have here. We have uh the sensor framework which is used to
21:40
connect the cameras, the lidars, radars. It can be used for uh autonomous
21:47
applications, autonomous functionality, and we have different uh frameworks for user interfaces. We are
21:54
going to see more about that in the demo. uh we have rich user interfaces 3D augmented reality which we are
22:01
showcasing there and the other framework here is the security framework every device
22:07
needs to have a birth certificate and we provide a birth certificate to the
22:12
extent that every subsystem can be authenticated too, and along with it we
22:18
can have code signing which can be quantum resistant too, so that it cannot
22:23
be broken by quantum computers. And we also have software defined audio so that
22:29
whenever you have audio you can have a system level audio development that you can do. Audio seems
22:36
uh like a is a necessary component that when you're building humanoids it can be also used not only for your sound but
22:45
also can be used for sensing applications. uh in one of the humanoid robots I've seen uh they are planning to
22:51
use more than 16 microphones, because microphones are omnidirectional you can
22:56
do that for sensing different kinds of sensing applications so with this particular uh we are we support a lot of
23:03
open source components too, like ROS 2, like KDL, which is an inverse kinematics library
23:11
for uh robotic systems. There are several open source components we support, and the software itself is POSIX
23:17
compliant and open, but the quick takeaway is you have most of the
23:22
software is the platform that is given so that you can accelerate the building of this uh devices. So let's go and see
23:29
and uh these are the development tools we talked about. We provide pre-certified software. When we say
23:36
pre-certified software, you can show this as evidence when you are certifying your whole system. And using
23:42
this approach, you can save more than a year or up to 18 months of certification
23:48
efforts. Shall we go and see the demo? Okay, Matt. So we've seen uh the
23:55
stack there general embedded development platform which accelerates uh the building of your softwaredefined
24:01
robotics or software defined embedded systems for any industry and we also said that we provide not only the
24:07
foundational components safety certified components but we also provide a a starting uh application so that you can
24:14
start off from that applications to build your own uh innovative product. So this is an example of such uh particular
24:20
starting applications called the virtual factory demo. What we are showcasing is QNX running on this single Intel box
24:29
which has got a single multi-core SOC, which is a 12-core here. And what we are
24:35
doing is and the bottom box that you have here is a robotic controller. That robotic controller is just a black box
24:41
here in this case which actually is controlling the seven degrees of freedom that this robotic arm has. That means
24:47
there are seven motors here that are running here. So and all the application that the use
24:55
case that needs to be done is running on that single embedded Intel box. So what
25:01
is what are we running here? We are running the Unity 3D. This entire virtual factory demo is running on that
25:08
particular one single SOC. Not only that, we are showing a digital twin of
25:14
this particular robotic arm. And if you see this is the space mouse I have here the mouse to move it and you can see in
25:20
real time it's almost very realistic model. There's no camera here. So we have a 3D
25:27
model running natively on a hard realtime operating system which has got a very ultra low latency uh like timing
25:35
requirements like that means in some of these robotics you need to have feedback because these robots when they interact
25:40
with the human beings without any fences or guard rails, they need to have force feedback. That means the moment
25:47
you feel any force, that it's trying to crush or something, it has to stop by itself. For that reason this has
25:54
to be continuously monitored. So our software is able to continuously monitor that; otherwise this robotic arm
26:01
will not function. Okay, that is one main thing while doing that high speeded operations of monitoring for the force
26:07
feedback for safety. It is also doing the unity 3D on the same thing. So you can also now that you have a multi-core
26:14
SOC you can have the safety controller. You see here the stop button here. In a typical factory you have a safety
26:21
controller externally on an assembly line and if something goes wrong from different machines it automatically
26:26
stops the assembly line. But now with the ability to do hardware consolidation
26:31
do many things as a software defined on the same single SOC you can actually have that safety controller also running
26:38
on the same box. So no more need to actually have those wires moving
26:43
everywhere. If you have introduced a new industrial controller, you need to draw a new line again back to that safety
26:49
controller. But now when you software-define this safety controller, you can actually have a software defined
26:54
signal itself as your new adaptable assembly line. So it
27:00
accelerates innovation for you. So we are running Unity, and what else are we doing? We are having this uh ArUco
27:08
augmented reality tag-based things. Each tag is marked as red, green
27:14
or blue. And if you see here in real time we have a camera
27:21
here. In real time we are able to see this, and we use an open source component
27:26
called OpenCV and based on this we are painting that green square on the screen
27:33
and this demo I can even uh maybe let's say show you uh zoom it up is it getting
27:41
lengthier for you or is that okay
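The detection-and-overlay step just described, OpenCV finds an ArUco tag, its ID maps to red, green or blue, and a colored square is painted at its location, can be sketched without OpenCV. This is a toy frame buffer; real code would call `cv2.aruco.detectMarkers` and an OpenCV draw function:

```python
TAG_COLORS = {0: "red", 1: "green", 2: "blue"}  # illustrative ID -> zone color

def paint_square(frame, x, y, size, color):
    """Fill a size x size square in a row-major frame buffer, the way
    the demo paints a colored square over each detected tag."""
    for r in range(y, y + size):
        for c in range(x, x + size):
            frame[r][c] = color
    return frame

frame = [[None] * 8 for _ in range(8)]
tag_id, tag_x, tag_y = 1, 2, 2   # pretend the detector found tag 1 here
paint_square(frame, tag_x, tag_y, 3, TAG_COLORS[tag_id])
print(frame[3][3])  # green
```

The mapping table and square-painting are the whole overlay idea; the hard part (pose estimation from the camera image) is what the OpenCV ArUco module provides.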
27:55
And the other thing is other feature of this is you also have a hypervisor running on
28:02
the same system. So with that we are running a Ubuntu guest [Music]
28:08
here. This is an Ubuntu guest. It is called RViz. It is a ROS visualizer.
28:14
ROS is an open source communication framework that is based on a publish-subscribe
28:20
interface protocol, so that you can uh send commands to control and manage the robotic systems. It's not only seeing
28:27
the native camera vision; we can also send that command again and we can also, say, visualize in the RViz world too.
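The publish-subscribe pattern mentioned here can be sketched in plain Python. This is a toy in-process broker illustrating the decoupling, not ROS 2's actual rclpy API:

```python
from collections import defaultdict

class Broker:
    """Minimal publish-subscribe broker: publishers and subscribers
    never reference each other, only a shared topic name, which is
    the decoupling ROS topics provide."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, msg):
        for cb in self._subs[topic]:
            cb(msg)

broker = Broker()
received = []
broker.subscribe("/arm/command", received.append)   # illustrative topic name
broker.publish("/arm/command", {"joint": 3, "angle": 0.5})
print(received)  # [{'joint': 3, 'angle': 0.5}]
```

In ROS the broker role is played by the middleware (DDS in ROS 2), which also handles discovery and transport across machines, but the programming model is the same.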
28:35
The fact here is, like, we are running the hypervisor, running a Linux guest on top of a QNX-based system.
28:43
Yeah. And the other aspect of it is the mixed reality aspect of it. If you
28:49
see that you have a blue, red and a green zone here, but that blue, red and
28:54
green zone are not in the physical world.
29:01
So you can use like your space mouse here. The space mouse is sending the
29:07
commands to the robotic controller, and those commands
29:13
are used to place your object that is being seen. This virtual cube is actually being placed in those zones, and
29:20
for the gamification purpose, when the cubes are placed, the colored cubes are placed in the
29:26
proper zones, the yellow cube starts on those three conveyor belts. And once it starts, we are just showing here the
29:34
amount of processing that you can do. And this is a gamification part of it.
29:39
And now this is a virtual robot completely virtual robot. Uh it's nothing. We detached it from the
29:45
physical robot. So the significance of it is you don't even need to have this
29:50
expensive physical robot to start your application. You can actually start with that because the command that you send
29:56
to the virtual robot or this physical robotic arm is the same. It does not matter:
30:03
once you get your application and everything working there, you should be able to put it here and everything works, so there
30:09
will be no surprises at all. And that virtual robot: now this one is running on this particular target itself,
30:14
locally here, but I can get a digital twin of the same thing running in the cloud so that you can have any
30:21
number of those like virtual robots or a digital twins of the robotic arms and you can have distributed teams working
30:28
on it, and anytime your code is checked in, you can get a degree of quality of your code before you even try
30:34
it on the physical hardware, which is really very expensive. So the simulation-first world, we are already there. This
30:39
is an example of simulation first before you can get it. So it enables you to accelerate the building of your uh
30:46
devices and uh yeah that's in a nutshell uh is
30:53
something in detail but uh if you have any questions yeah please feel free Matt. Yeah hopefully this will help.
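The point about developing against the virtual robot first, because the command interface is identical on the physical arm, is essentially backend swapping behind a shared interface. A minimal sketch with hypothetical class and method names:

```python
class VirtualArm:
    """Simulation backend: accepts the same commands as the hardware,
    so application code can be developed and CI-tested with no robot."""
    def __init__(self):
        self.log = []

    def move_to(self, x, y, z):
        self.log.append((x, y, z))

class PhysicalArm(VirtualArm):
    """A hardware backend would add real driver I/O behind the same
    method signatures; the application cannot tell the difference."""
    pass  # driver calls omitted in this sketch

def pick_and_place(arm):
    # Written once against the shared interface; runs on either backend.
    arm.move_to(0.3, 0.1, 0.2)   # above the part
    arm.move_to(0.3, 0.1, 0.0)   # pick
    arm.move_to(0.6, 0.4, 0.0)   # place
    return len(arm.log)

print(pick_and_place(VirtualArm()))  # 3 commands validated before hardware
```

Running `pick_and_place` against `VirtualArm` on every check-in is the "degree of quality before you try it on physical hardware" idea from the talk; swapping in `PhysicalArm` later changes nothing in the application code.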
31:00
That was awesome. [Music]
