
Endiatx’s PillBot: Revolutionizing Medical Diagnostics with Swallowable Robots at Mayo Clinic

Endiatx PillBot
"Endiatx’s PillBot: Revolutionizing Medical Diagnostics with Swallowable Robots at Mayo Clinic" Hi Guys, Welcome to Robot Philosophy, my name is Robophil where we explore the cutting edge of robotics and automation. Today, I'm excited to introduce Exciting news from Endiatx LLC, the startup making waves in medical technology! This year, they’re planning a pivotal trial for their innovative swallowable, steerable Pillbot robot camera at Mayo Clinic campuses. Imagine, instead of awkward scopes, you just pop a pill for a thorough stomach inspection! They’ve already started their first in-human trials in New Zealand, and the doctors are loving it. They plan to publish this promising data soon. Now, they’re aiming for an investigational device exemption from the FDA. Fingers crossed, folks, because this could be a game-changer! Endiatx has secured $7 million from investors and is working on a $20 million Series A round. Clearly, people believe in the power of swallowing robots! The latest PillBot version boasts a 160° field of view. PillBot Version 1.0, at just 13 mm in diameter, allows doctors to remotely control it inside your stomach. It’s like having a tiny, high-tech submarine navigating your digestive system. Future PillBot versions will be even smaller, with AI-automated movement and lab-on-chip gut-biome analysis. It’s like having a mini scientist on a field trip inside you! Endiatx’s partnership with Mayo Clinic is rock solid, thanks to a 15-year know-how license agreement. Dr. Vivek Kumbhari, Mayo Clinic–Florida’s chair of gastroenterology, has been a key player, joining the board in 2021 and helping with cadaver testing. That’s dedication! In a daring TED Talk in April, Dr. Kumbhari and Endiatx co-founder Alex Luebke showcased the PillBot live on stage. Swallowing cameras in front of a live audience? Talk about gutsy! Stay tuned for more on this incredible journey of medical innovation. Endiatx is truly turning the tables—or rather, the stomachs—of medical diagnostics! Thanks for watching Robot Philosophy! Ready to explore automation? Book a call with us today and let's get started! To get your own ____ or consultancy around Robotics & Automation.

LG’s Award-Winning CLOi ServeBot

LG CLOi ServeBot

Welcome to Robot Philosophy. My name is Robophil, and here we explore the cutting edge of robotics and automation. Today we are looking at the latest in healthcare and hospitality innovation: LG's new door-type CLOi ServeBot, unveiled at HITEC 2024. This autonomous service robot, model LDLIM31, is here to make life easier, one delivery at a time. Need a coffee? This bot's got your back; just don't ask it for a hug, it's all doors and no arms.

"Our newest service robot brings a new level of functionality and flexibility to the hospitality and healthcare channels," says Mike Kosla, senior vice president of LG Business Solutions USA. Advanced AI, communication, and control technologies ensure that the ServeBot delivers efficiency like a pro, without needing a break.

The CLOi ServeBot is perfect for multiple deliveries, with four compartments each capable of holding up to 30 kg. That's a lot of coffee cups! And thanks to its six-wheel independent suspension, it can roll over uneven surfaces without spilling a drop. The robot also features a 10.1-inch front display for mobile advertising, so while it's delivering your snacks, it might also suggest a new TV show to binge-watch. And don't worry about it bumping into things: it's got obstacle navigation down to a science, avoiding collisions better than most humans.

Winning the 2024 Red Dot "Best of the Best" design award and the 2024 iF Design Award, the CLOi ServeBot is not just functional, it's stylish too. Experience the future of delivery with LG's CLOi ServeBot, your new favorite colleague who never steals your lunch from the break room fridge.

Thanks for watching Robot Philosophy! Ready to explore automation? Book a call with us today and let's get started! To get your own ____ or consultancy around Robotics & Automation.

Sponsors

Robot Center:- www.robotcenter.co.uk – Buy Robot, Robot Buy, Robot Consultancy, Robotics Consultancy

Robots of London:- www.robotsoflondon.co.uk – Robot Hire, Robot Rental, Rent Robot, Hire Robot

Robot Philosophy:- www.robophil.com – Robot Events, Robot Consultancy, Robot Advice, Robot Insights, Robot Ideas

Revolute Robotics: Transforming Today, Innovating Tomorrow

Ladies and gentlemen, welcome to Robot Philosophy, where we explore the cutting edge of robotics and automation. Today, I’m excited to introduce you to a leader in this dynamic field—Revolute Robotics.

Revolute Robotics is revolutionizing the industry with their autonomous ground and aerial robots, built for challenging environments. Their flagship product, the Hybrid Mobility Robot (H.M.R.), is a versatile tool designed for inspection and surveillance in confined, complex, and extreme spaces.


The H.M.R. boasts hybrid mobility, seamlessly switching between rolling and flying to navigate obstacles and conserve energy. With a durable exoskeleton, it can safely explore tight spaces without fear of collisions. Its customizable payload allows for interchangeable cameras and sensors, tailored to meet the needs of inspection crews, security services, and first responders.

Revolute Robotics excels in autonomous operation, utilizing intuitive mission planning algorithms for minimal supervision. Their GPS-denied navigation combines LED lighting, stabilization sensors, and robust transmission systems for beyond visual line of sight (BVLoS) inspections. Additionally, their multi-robot capabilities enable a swarm of robots to maintain continuous situational awareness over large sites.

Revolute Robotics is committed to continuous innovation and user-friendly designs, ensuring their robots integrate seamlessly into existing workflows. Their diverse team is passionate about technology and dedicated to making a positive impact on society, building a brighter, more automated future.

If you’re interested in learning how robots can revolutionize your business, our Robot Philosophy Consultancy is here to assist. Our expert team are ready to show you how automation can transform your operations. Visit robophil.com to learn more and start your journey into the future of robotics.

Thank you for joining us at Robot Philosophy. Let’s embrace the future of automation together!

Surgical Robotics: The Future of Precision with Vicarious Surgical

Welcome to Vicarious Surgical, where tiny robots are revolutionizing surgery! Imagine a robot that can operate through a cut smaller than your pinky nail. These robotic arms are incredibly precise, allowing for less invasive surgeries and much quicker recovery times.

The Vicarious Surgical system includes a patient cart with agile robotic arms and a surgeon console. The robotic arms perform intricate maneuvers inside the body, making surgery less traumatic and promoting faster healing. Surgeons control these arms from a console using an immersive 3D headset, providing a high-definition view of the surgical field. It’s like a high-tech video game, but with real lives at stake!

One of the major benefits of this technology is its potential to significantly reduce recovery times and minimize complications. Traditional surgeries often involve large incisions and lengthy hospital stays. With Vicarious Surgical’s minimally invasive approach, patients can return to their normal lives much faster, avoiding extended hospital stays and endless reruns of daytime TV.

Moreover, this technology is expanding access to quality healthcare. With remote operation capabilities, experienced surgeons can perform procedures from afar, bringing top-notch surgical care to underserved regions. This means that more people can benefit from advanced surgical techniques, no matter where they are.

In essence, Vicarious Surgical is not just advancing surgical techniques; they are redefining the future of healthcare. By combining cutting-edge robotics with a commitment to improving patient outcomes, they are paving the way for a new era of precision and accessibility in surgery.

Thank you for exploring the future of surgical innovation with Vicarious Surgical.

Unleash the Future: Introducing the Revolutionary Robot Center Digital Signage Robot

“Introducing the new Robot Center digital signage robot – your ultimate solution for dynamic communication and engagement. Imagine a world where your messages are not just seen but truly experienced. Our digital signage robot is designed to captivate your audience with its cutting-edge features and unparalleled versatility.

“Equipped with a stunning high-definition display, the Robot Center digital signage robot delivers crystal-clear visuals that will leave a lasting impression. Whether it’s showcasing your latest promotions, sharing important announcements, or providing wayfinding assistance, our robot ensures your message stands out.

“Engage your audience like never before with interactive touch and voice command capabilities. The Robot Center digital signage robot is perfect for retail environments, corporate offices, hotels, and more. It not only attracts attention but also enhances customer experience by providing information at their fingertips.

“Seamless integration is key, and our robot is designed to fit effortlessly into any environment. Its autonomous navigation allows it to move smoothly, guiding visitors and ensuring they never miss an important update. Plus, with remote management, you can update content in real-time from anywhere, keeping your messaging relevant and timely.

“Transform your space with the Robot Center digital signage robot – a game-changer in the world of digital communication. Increase engagement, improve customer satisfaction, and stay ahead of the competition with this innovative technology. Don’t just display your message, bring it to life.

“Experience the future of communication today. Visit our website or contact us to learn more about how the Robot Center digital signage robot can revolutionize your business. Make your message unforgettable.”

Agility Robotics Strikes Groundbreaking Partnership with Manhattan Associates

Agility Robotics and Manhattan Associates

Agility Robotics Strikes Groundbreaking Partnership with Manhattan Associates Amid Strategic Streamlining.

Hi guys. Robo Phil from Robot Philosophy with another news update for you.

This week in the world of robotics and warehouse automation, Agility Robotics took a notable step by partnering with Manhattan Associates, a big name in Warehouse Management Systems (WMS).

Think of it as a tech-world handshake that promises to change how warehouses operate. This partnership isn’t just any collaboration; it’s akin to matching a sophisticated robot with the Michael Jordan of WMS, aiming to score big in efficiency and productivity.

Agility’s humanoid robot, Digit, along with their new fleet management software Agility Arc, are joining forces with Manhattan’s Active Warehouse Management solution. This integration is a bit like getting the band back together, except this band plans to revolutionize warehouse automation with their unique blend of talents.

However, the week wasn’t all about triumphs and trailblazing partnerships. Agility Robotics also announced layoffs, a bittersweet moment in their journey. It’s like realizing you’ve over-prepared for a party and need to downsize to keep the gathering intimate. The company reassured that this move is to streamline operations, focusing on what’s essential to keep the innovation and production of Digit on track.

Agility introduced some intriguing accessories for the Arc platform, including a charging dock that ensures Digit doesn’t run out of juice at crucial moments and a control pendant that sounds as if it’s straight out of a sci-fi toolkit. These additions underscore Agility’s commitment to not just creating robots but ensuring they fit seamlessly into their new warehouse homes.

Despite the challenges, including the initial need to cage the humanoid robots for safety (reminiscent of early-days tech caution), Agility’s narrative this week is one of optimistic forward motion. Under the new leadership of CEO Peggy Johnson, the company is sharpening its focus, aiming to blend the lines between human and machine efficiency in industrial settings. While the journey includes a mix of strategic partnerships and tough decisions, Agility Robotics is navigating the path with a clear vision: to make their humanoid robots indispensable to the future of work.

Thanks Guys

Sponsor: Robot Center: https://robotcenter.co.uk/
Sponsor: Robots of London: https://robotsoflondon.co.uk
Robot Strategy Call:- https://calendly.com/robo-phil/introductory-call

#robophil
#agilityrobotics
#manhattanassociates

The MiR1200 Pallet Jack

MiR1200 Pallet Jack

Revolutionising Logistics: The MiR1200 Pallet Jack Unleashes AI Precision on the Warehouse Floor

Hi guys. Robo Phil from Robot Philosophy with another news update for you. In the ever-evolving landscape of industrial automation, Mobile Industrial Robots A/S has introduced a game-changer at LogiMAT: the MiR1200 Pallet Jack. This isn’t your ordinary pallet jack! With its 3D vision and AI, it identifies and transports pallets with such precision, it could almost participate in a game of “Where’s Waldo?” with pallets as the main characters. 

Mads Paulin, the VP of R&D at MiR, might as well have touted, “This is not just an upgrade; it’s like giving the pallet jack a sixth sense.” Why the big reveal now? In a market crowded with AMRs and autonomous forklifts, Kevin Dumas from MiR explains their strategy not as being late to the party, but as making a grand entrance with a robust machine ready for the real-world challenge of imperfect pallets. It’s as though they’ve been waiting in the wings, watching others take their turn, only to leap onto the stage with a solution that says, “Watch how it’s really done.” 

Ujjwal (oodjawa) Kumar of Teradyne Robotics points out that while navigating around obstacles is child’s play for most AMRs, the MiR1200 aims higher, targeting pallets that have seen better days. It’s like the pallet jack version of a treasure hunter, unearthing valuables others might overlook. 

Thanks to its partnership with NVIDIA, the MiR1200 doesn’t just see the world; it understands it, processing data at lightning speed. This is not your typical industrial robot; it’s more like a supercomputer on wheels, designed not just for heavy lifting but for smart lifting. 

The MiR1200 Pallet Jack marks a significant step forward for Mobile Industrial Robots, showcasing not only their technological prowess but also their knack for introducing a bit of flair into the world of autonomous material handling. It’s not just about making tasks easier; it’s about redefining what’s possible in the warehouse.


Hands on with Inovo Robotics

Hi guys, Philip English here from philipenglish.com. Today, we will have a hands-on approach with Inovo Robotics and their robotic arm.

Philip:
So hi guys. My name is Philip English, and I am a robotics enthusiast, reporting on the latest business and applications of robotics and automation. My mission is to get you Robot Optimised and to support industry infrastructure and innovation for the next era. Today, we're here with Henry Wood. We did an interview with Henry, I think it was about three or four months ago, and now we're on site today to actually see the robot live. So we have it behind us, and we'll just have a look at how this robot works. So I'll get Henry to introduce himself. Can you introduce yourself, Henry, please?

Henry:
Hello again. Hi, my name is Henry Wood and I’m one of the founders of Inovo. We put together Inovo to develop robots that were going to be much more accessible for companies building products in batches. Today Phil’s come down to visit us and I’m going to show him how we program the robot and what kind of things it can do. And really looking forward to giving an overview of the product.

Philip:
Perfect. Thanks for the intro Henry. So the next step guys, is we’re just going to basically have a look at the robot and do a quick light tutorial for you. A bit like an unboxing, but more of just a hands-on Phil. So let’s get into it.

Philip:
We’re going to run through things for you guys and obviously Henry is going to give you an overview and I’m basically just going to hear and ask some questions. And so I hand it to Henry to give us a brief start up of the robot.

Henry:
Sure. Okay, so this is our main product. This is the Modular System. And the really big difference between this robot and other robots is that you can reconfigure it to change the payload and reach. So with most robots, you’ve got to choose a product from the range. They’ve got a five kilo version or a 10 kilo version with 1.3 meters or a 900 mil. And the first thing you’re faced with is this catalog where you’ve got to choose which robot’s the right product for your application. And that’s very difficult for a lot of companies because they’re changing applications all the time. So what we offer is a system where you can change the reach and payload of the robot, depending on what you want to do with it. So what we’ve got here is the arm set up, it’s built out of the three core parts, the wrist, the elbow, and the shoulder.

Henry:
But the link tubes between it are interchangeable to change the reach of the robot. Just as a quick overview, we’ve got a pendant here, which allows simple control of the robot. Say, switching off and on the robot and doing some recoveries, if the robot is left in a funny position. But basically limited to the simple tasks that an operator might do. If you want to program the robot, then you have to plug in a laptop and then you can use all of the advanced features with the comfort of a full QWERTY keyboard and a mouse. Which is a bit easier than the touch screen, in my opinion. So we’ve got a third party gripper fitted here at the moment, this is a Robotiq 2F-85. A really, really nice product. And what it allows is force and speed and aperture control.

Henry:
So you can program how hard it’s gripping something. It could lift up an egg without crushing it, but at the same time, it could lift up a brick or something where you want to apply a lot of force. And it’s really adaptable. So we’ve got some simple controls on the wrist here, including a zero gravity button. If I press that, then the ring goes blue and I can drag the arm around and move it into different positions. And on the interface that you can see behind me, this is giving us a live feed of what the robot’s doing. And it’s showing us the way points we’ve created and the program we’ve built. After that I’ve got direct controls of the gripper, and I can actually add way points to the way guide. So I can build the program really quickly by dragging it to position, recording a way point, drag it to another position, record a way point there.

Philip:
Right. I see, and then when you move the robot to different positions, does it automatically build it in this stack here? Or…

Henry:
Yeah, that’s right. This is basically the workspace. So we’ve got the visualization to see what the robot’s doing and where it’s going. And then this is the program that we’re building. So if we start with a new program and clear that, what you begin with is the start block.

Philip:
Okay.

Henry:
And then we have a toolbox on the left, which is all the different features that’s built into the robot. So I can drag out to the toolbox. I could drag a way point, which is a position the arm moves to. But as you can see, there’s various different bits in there, there’s grip functions. There’s delays, there’s logic, there’s loops, there’s controlling of third party tools. So all these things can basically be pulled out of the toolbox and plugged into a sequence, which is a simple and a logical order.

Henry:
So if I position the robot in a point, maybe put it there. And then I can set from robot. So what that does is it captures the current position of the robot. And I can give it a name, so I could call it point A. Or something helpful: this is the door handle, if we're opening a door, or this is the phone, if you're picking up the phone. And then I can drag it to a new position and I can duplicate this block. And I can set it, give it another name. And again, I can set it from the robot. And you can see in the visualization, here are the two points that I've created. And get rid of the grip block in the middle.

Henry:
And we’ve got two points and it will move between those two points. If I play that it will run, moves to point A, then it moves to point B and it stops. If you want to apply more logic, we can use a loop. So we can drag a loop out of the toolbox. Plug that in, and then put those inside it. And now if we play, keeps running around that loop.

Philip:
It’s very nice and easy, quick and simple to use, yeah.

Henry:
Yeah. And the really nice thing about this form of programming is you can't plug something incompatible into a place it doesn't fit. So it makes it really intuitive. If it's clicking together, it's going to run. If you can't get the bits to connect, then you're trying to do something that isn't possible. So, you notice it moves in an arc. You could do a linear move, so you just change it so that it basically moves in a perfectly straight line between those points.

Philip:
Oh I see and so you can have a fast bit in the middle, slow bit. [crosstalk 00:00:06:27].

Henry:
Yeah you can control the speed of all of these. If I had a third point on there, if I pause it there. So if I drag another point over here, duplicate that one, call that point C. So now we’ve got a path where it goes there to there to there. And if we run around that loop now, it’s going in a more of a triangular motion. Now that’s still linear. So it’s still doing a perfectly straight line between these points, but it goes to the points and stop. If you’ve got to pick something out of a machine, you want it to do it as smoothly and quickly as possible. So we can take the middle one as having a blend on it. In fact we can put a blend around the whole lot if you want. And it will basically mean that it smoothly moves past those points rather than coming to a stop. So it makes the whole thing run a bit faster. So, if I run that…

Philip:
Instead of having a hard stop at A, B or C, it just…

Henry:
Exactly. So you can see sort of over the B it didn’t stop, it just looped over. And so you can get the whole process running a lot smoother and faster like that.
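For readers who think in code rather than blocks, here is a rough, self-contained Python sketch of the waypoint, loop and blend program Henry builds above. The real Inovo arm is programmed through its visual block editor, so everything below is hypothetical and just mirrors the structure of that program:

```python
# A minimal, self-contained sketch of the waypoint / loop / blend idea Henry
# demonstrates. The Inovo arm is really programmed in its visual block editor;
# this only mirrors the structure of that program in plain Python.
from dataclasses import dataclass

@dataclass
class Waypoint:
    name: str
    xyz: tuple                    # position in metres (illustrative values)
    blend_radius: float = 0.0     # >0 means "blend past this point, don't stop"

def move_linear(wp: Waypoint, speed: float) -> None:
    # Stand-in for the real motion command: here we just log what would happen.
    action = "blends past" if wp.blend_radius > 0 else "stops at"
    print(f"moving to {wp.name} at {speed} m/s, {action} it")

program = [
    Waypoint("point A", (0.40, 0.00, 0.30)),
    Waypoint("point B", (0.40, 0.20, 0.30), blend_radius=0.05),  # smoothed corner
    Waypoint("point C", (0.20, 0.20, 0.30)),
]

for _ in range(3):                # the "loop" block: repeat the sequence
    for wp in program:
        move_linear(wp, speed=0.25)
```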

Philip:
So, when it comes to the end-of-arm gripper, I saw that there was a Robotiq section in there. So similar to how you program the robot, you can program the gripper as well.

Henry:
Yeah, out of the toolbox, we can grab a grip block.

Philip:
And this can be any gripper? [crosstalk 00:07:55].

Henry:
You have to tell it what gripper it is. And what we’re doing, is we’re building libraries for more and more tools. So we’ve been building libraries for Robotiq, OnRobot. We’ve got interfaces that let you drive Schunk. And so, if someone presents us a new gripper we haven’t seen before. The first thing we’ll probably do is hand it over to our engineers and get them to add that library to the driver, so that everyone can use it. Because we want to support as many different tools as possible, really.

Philip:
Perfect, sounds great.

Henry:
So, we’ve also got direct controls on the UI there. So on the far side, on the left there’s a control there where we can basically drag the gripper open and shut. And that lets us control left quite closely, we can set the effort of how gentle it is. So if we grip something like that, it’s gripping it. It’s not gripping it that hard, I can pull it out. If we open it again, and then set the force higher, that’s got it really solid set. So you get really nice, fine control with these, they’re very versatile tools.

Henry:
And one of the things you might notice here is that it’s actually plugging straight into the wrist. It’s not a cable on the outside, and this is one of the limitations with some robots is that you can attach third party tools. But you’ve got to attach them to the control box. One of the reasons we were really keen to have it attached directly to the wrist, is because these joints can just rotate freely in any direction without worrying about a cable getting stretched or snagging. And you could even drive a screw with this and go around 10 times and it will just continually rotate. So there’s a lot of flexibility that comes out of that.

Philip:
So, how does that compare to other sorts, like cobots then? They normally have something you have to plug in and then a wire that comes…

Henry:
They’re starting to offer some support, but it’s quite limited actually. A lot of cobots, you get an interface that plugs into the control box under the table. And then you get a long cable and you’ve got to run five meters of cable around the outside and then you’re cable tying around the joints. And it has just got a lot more chance of actually getting snagged or pulling out, which is a real restriction.

Philip:
Yeah. Sounds very messy if you have that.

Henry:
It can be quite messy, yeah.

Philip:
It’s only be a matter of time before it gets to somewhere it shouldn’t be and then ends up breaking something, so yeah.

Henry:
Absolutely, so you can control the gripper live like that. There's obviously a block to control the gripper mid-sequence. So if we wanted to put a grip in the middle of there, we could say this one's an open position. And then we could have a short pause and close it again. And then that would basically be running around the loop, and it will open at point B and then it will close at the third point after that. So you can sort of see, you can build up increasingly complex programs in this visual language, really easily, without having to use a text scripting language or anything.

Philip:
Well, this is it, it’s very, very easy to use, really. Very simple as you just shown me. So yeah, I’m very impressed with that.

Henry:
And then balanced with the complexity, you want to still have fine control. So, if you do need to do something very precise. So if, for example, we've got to pick this block up. You can drag it down there and get it roughly in the right position very quickly. But then you might want… I'll do it this way so the camera can see. Then you might want to get very fine control at the end. So I can stop using the zero gravity mode and I can use the controls on the screen. And then you can do very accurate…

Philip:
Minute.

Henry:
Maybe we can set it to only move one millimeter at a time. Or you can set it to only move in one axis at a time. So you could move very carefully, even if you're around delicate things that could be broken.

Philip:
And then if it was something delicate, how do you test? Or is it just a bit of trial and error? [crosstalk 00:11:40]

Henry:
It’s a good question. If you’re figuring out exactly how hard to grip an egg, it could be a bit of trial and error. The point where something will break isn’t something that we can really monitor, but you just gradually increase it. You could start off setting the grip force so that it’s got it. And then you try and pull it out the grip. And if it’s not enough, increase it a bit more. So these motions will let you do Cartesian motions. So down and up in the Z plane, X and Y. And you can set steps so that it will do small movements, or you can do faster movements.

Philip:
No, that’s great. So, we’ve got the controller here, can you just take me through how that works?

Henry:
Sure. Yeah, so this is the pendant. This remains fixed to the robot. So if you’ve got a shop floor environment in a factory, your engineer will probably come down with a laptop and they’ll have their laptop on their knee or the bench while they set up the robot. They will create the program where they can see it with all of these nice features that you get on a full sized screen. So when they leave, they can unplug the laptop and the robot can continue working. This gives a nice, simple interface for the operator to basically set up and run the program. So they don’t build the program on here.

Philip:
So, obviously if you’re in a manufacturing site, you just want to get the robot quickly moving. So you just come to this and uses a very quick, fast interface. And I suppose if you’re doing a different task as well, you can do one task, hit go, move to the next task and hit go. And you haven’t got to go and program the whole robot again.

Henry:
Exactly, the idea of when you come in here, you load program A, and it can do program A for the next few days. And then that finishes and someone’s got another program that the robot’s set up to do. They can just select it from a list here. But what they can’t do, is they can’t edit the program, which is useful because it means that once someone’s built a program here, it can’t be messed with. It’s locked if that’s the only interface you’re using.

Philip:
Right, perfect. So if I… We have a joystick here?

Henry:
That's right, yeah. The joystick's quite a nice feature we've got there. So, the joystick itself would be prone to accidental movement if you just knocked it. So you've got these buttons on the side, which are called dead man's handles. When you squeeze those, it becomes live. So now you're able to control it. You're in the mode at the moment where you're basically rotating the wrist around the point. And it'll only go so far before it reaches a limit. You can jump between other modes. So if we press this button here, then you're in Cartesian mode now. So, now it's moving forwards and backwards, left and right. And then doing that still, we've got up and down. So, that's basically allowing the robot to travel in this axis.

Philip:
So you can have a full play of different axes there.

Henry:
Exactly. And that’s in the base frame. So that means that you’ve got X, Y, and Z like that. So I’ve now put it in the tool frame. So now you’re moving along an axis which is along here. So if you’re trying to insert something, gives you a nice simple way of doing that. That’s quite hard to do with zero gravity because you’re trying to drag it in a perfectly straight line. But in this mode with a joystick, do that. And actually if you move the joystick just a tiny bit, you can move it really quite slowly. So you get a very fine control if you’re trying to do something quite precise.

Philip:
I see, yeah. Because I can imagine if you were trying to do it manually, you could be pulling it in one direction. Not actually know you and not realize you’re pulling to the right, pulling to the left. Whereas here you can do very, very, very precise.

Henry:
Absolutely. So if you’re trying to insert a pin in a hole or something like that, then it gives you the ideal way of doing that. So we’ve got various different ways of doing similar things that suit different applications.

Philip:
Yeah. And that’s a very handy feature to have, well I’d say a brilliant feature to have, really. Especially if this is going to get used in multiple different applications and devices, I can really see that being really helpful.

Henry:
Yeah, we’ve had good feedback on that. And so we’ve got the control to do it with the joystick. There’s an onscreen control and the zero gravity. So typically for almost everything, one of those is the ideal. But there’s no one solution for everything, so it’s useful to be able to jump between them quickly.

Philip:
And then I suppose the next question is, obviously the main thing for this cobot is basically if you're a small business and you want to do multiple tasks. Some of the tasks may be heavy, some of the tasks may be light. So yes, I'd be very interested in how you would set up the robot for something heavy and for something light. Can you give us an idea? [crosstalk 00:00:15:59].

Henry:
Sure. Well, this is the super power that this robot offers that others haven’t got. So what we can do is we can change the reach at the moment. This is a medium reach. I’m going to put it upright. Just because it’s a bit easier to handle like that. So I leave it in this position, I look to the screen where I can basically turn off the arms. So, I go, just off. You can hear it click and it’s off. So now what I need to do, is unlock this. No tools, just simple wheel. And then this section comes off.

Philip:
Very nice, very quick and easy.

Henry:
This weighs about eight kilos or something. So it’s not light, but it’s certainly fine for one person to lift and very easy to do on the fly.

Philip:
A one man job, really. So someone [crosstalk 00:16:48] can come along and do it for you, yeah.

Henry:
And the nice thing is you want to move this robot. You can move it in pieces as well. So this is about 10 kilos, about eight kilos, couple of kilos. So all these bits I can move by myself. But if this was all one robot, you’d probably have to get a second person to help with that. So we’ve got a range of different link sections. This is a medium, this is 150 mil. We’ve got a 300 mil or indeed you can put it together with nothing. So if I take this part here, I line up the pins.

Philip:
And this would be the maximum weight that you can get with that extension.

Henry:
Exactly. That's the trade-off. So you're trading off reach with payload. So having put that one on there, turn it back on. See the orange here, blinking indicates that it's just starting. Takes a few seconds to detect the whole system. And if you look at the user interface here…

Philip:
Oh, there we go, it’s straight in.

Henry:
Popped up, and we can rotate that, so that it’s in the same orientation as the arm.

Philip:
So you’re not there having to download another program to put a different [crosstalk 00:17:49] Yeah, very quick and easy.

Henry:
All we did was switch off the arm, change the configuration, switch it back on. And we could even run the same program, it's that flexible. Now, I'll give you a warning if you do that; I won't do that now. But once the arm's enabled, I can essentially use it in exactly the same way. It's updated all of the dynamic systems, the zero gravity and the kinematics. So it's basically ready to run. We could try and run this program. The limitation is if it's out of reach. So if you've asked it to reach something over here and it's not long enough anymore. But there you go, so this one's still valid.

Philip:
It still runs. And then what's the payload for this now?

Henry:
So this is now 10 kilos, so this becomes quite heavy duty in that short reach.

Philip:
Right. Yeah, and then the one bit there…

Henry:
With this, this would be about five or six kilos. And with the longer reach still, then it would still be able to lift three kilos. So you've obviously used some of that payload in the gripper. This is about 800 grams. But at full reach you could still be lifting two and a bit kilos, which is still fairly substantial. And we see a lot of companies where they're loading machines. They're putting blank metal [inaudible 00:18:57] in a CNC machine or lifting plastic molded bits up. They really love that longer reach and they often don't need particularly heavy payloads. So it's an ideal setup for them.

Philip:
So again, you’ve got three different worlds there, they can use the robot for.

Henry:
Exactly. And actually, in the future on the roadmap we're bringing out an active link as well. So this is a six degree of freedom arm, which is still fairly flexible. Some robots have a seventh degree of freedom and that gives them an extra layer of redundancy. So you can reach inside something and the elbow can still move. So we're bringing out an active link section which you can plug in here. And then this is an extra degree of freedom. So you can go from six DOF to seven DOF, which is a really nice feature to be able to have if you need it. But if you don't need it, you don't have to pay for it. Which is our general ethos: you only buy what you need with this robot.

Philip:
So an example of where you would use that seventh degree, can you give an example of that?

Henry:
Yeah, it’s really if the robot’s got to do quite complicated motion. If it’s got to reach inside something, around the back of something, that can be useful. We see seventh DOF being useful in probably a quarter of all the cases. It’s the minority, but it is useful in some spaces. And where we’re finding these longer reaches very useful, is things like inspection as well actually. We’re working on a number of projects, where the robot’s got a camera mounted on the end and it’s reaching around and it’s basically inspecting something from a lot of different angles. Cameras are generally a kilo or less. So it’s an ideal light payload, long reach application.

Philip:
So some of the case studies that you’ve done already, can you talk us through some of those?

Henry:
Sure. Yeah, happy to talk about those. So, we’ve got one of these in a restaurant, a food and preparation space where the robot’s loading raw ingredients into pans and woks. We’ve got another project we’re doing where this is actually a vertical farm and this is loading seedlings into pots. And then they go onto a wrapped conveyor system, where the seedlings go under artificial growing lights. And we’re looking at a couple of projects at the moment where there’s a camera mounted on the arm for all kinds of different applications. In dentistry, in part inspection, in metrology, and in a really wide range of different applications like that.

Henry:
And then the classic applications that people are used to seeing with load robots, like universal robot, where this is an ideal robot for machine tending as well. Because you can set it up for the long reach. It represents very good value if you don’t need those heavy payloads. And it’s also really versatile. If people need to move it to a different application where they might be doing assembly. So, one week it’s reaching inside machine, using the full reach. The next week, got it on a bench and it’s doing some simple assembly task. And then you don’t want that large reach, because it’s just a bulkier system. And you can get better precision and a higher payload in that short reach.

Philip:
Right, so hi guys. So Henry and I have quickly swapped places because this time I'm going to give it a go, just to show how easy it is to get the robot working. And Henry's going to give me a hand. So as I understand it, all I need to do is hit one button at the top. It turns the LEDs blue and then I've got the ability to move the robot handle around. So I'm going to go for these blue points here, this blue block here. Place it down below and then I can just hit this button on the right. And it should give me one position.

Henry:
That’s right, yeah. So it’s added your new way point to the bottom of that list. So if you want to add another one that’s directly above where you are, then it will lift out nice and square. So if you go up again and then… That’s good and then you can just press the button again to add another way point there. Should pop on the bottom of that list there, then you could bring it over this way.

Philip:
So, let’s bring it over here. So we’re going to try and put him on top of this one here, just put it here…

Henry:
Yeah, that’s nice.

Philip:
And then we’ll click it again.

Henry:
One more down there. And then finally you can try and rest it on top of that.

Philip:
Probably about there. I think I got that right, and then one more click.

Henry:
Yeah. So you've created now the path for it, and you've done all that without having to use the interface. Now, all we do is we're going to drop a grip position on this. So we can say, at your first point you were starting above it, you've moved down over it. At this point, you're closing the gripper, and then you're picking it back up again and you're leaning over. And if you wanted to drag the 3D view, you'd be able to see that that path is drawn at a set point. [crosstalk 00:23:35].

Philip:
Oh I see, so it sits here as well, perfect.

Henry:
If we’re taking a bit more time, what we do is we’d label them each. So we’d call this above, we’d call this over, we’d call this away and call this one place. So that it’s really easy to read.

Philip:
Right, makes sense.

Henry:
But yeah, so now you can go here, you should be able to just press play in the middle of that.

Philip:
Play button here.

Henry:
And it’ll go to the first point. And this is showing you which one is one, so you can see that one’s highlighted at the moment. It’s going towards the next one, sorry. Then the gripper should close, gone up, onto this one. Next one’s going to be the move across.

Philip:
And you can see the way points here as well.

Henry:
Yeah.

Philip:
That’s brilliant. [crosstalk 00:24:12].

Henry:
So now you’ve run it once and it worked. What could we do is we could turn up the speed a bit more and you could gradually tweak it. So if there’s any bits where it’s not perfectly aligned, you can use the on-screen controls to adjust the position and just get it to the point where it’s running really smoothly. And then you can get it to run it faster and optimize the whole process, so that you got it running exactly the way you want.

Philip:
Right, perfect. Yeah, you get that optimization to make sure it’s working perfectly.

Henry:
Exactly.

Philip:
Right, no, that’s great. Yeah, thanks very much for the quick test there. I could see how it’s very easy to move and get going. So thank you for that.

Henry:
It’s a pleasure, it was really nice to have you down here.

Philip:
Yeah, no, that’s fine. Well, thanks for the overview. And as I said, it’s a great product. It’s got many different applications in many different industries, really. So yeah, I’m really looking forward to see how this robot grows within the industry. So thank you Henry.

Henry:
It’s a pleasure, thank you for coming.

Inovo Robotics Interview – Hands on with Inovo Robotics
Inovo Robotics: https://inovorobotics.com/
Philip English: https://philipenglish.com/
Sponsor: Robot Center: http://www.robotcenter.co.uk

Tech London Advocates Robotics – Software Robotics

Tech London Robot Center Philip English

Hi, guys! Philip English from philipenglish.com, and today we have an interview with different tech companies, including the Tech London Advocates team, a network of tech leaders, experts and investors uniting to form the most influential group in tech; Fizyr, a leader in software for automated picking; Alias Robotics, a robot cybersecurity company; SLAMCORE, a world leader in spatial AI and algorithms that allow robots to understand space; HAL Robotics, London-based robot control specialists focusing on novel applications of robotics in the creative and construction industries; Extend Robotics, human-focused creators of robotic arms capable of remote operation anywhere in the world; IRAI Innovative Software, software developers for the fields of industry and education; IncubedIT, a leader in software for autonomous mobile robots; and ARS Recruitment, an agency working to match great candidates with awesome companies involved in automation. Find out more as they introduce their companies and answer questions about their work and industry.

Discussion:

Philip English: (25:50)
Okay.

Thomas Andersson: (25:58)
Okay. So let's kick off then. Um, I think we'll get a few more attendees as we go along. Um, and latecomers can also catch up with the recording. So this is a recorded, um, webinar. We'll put it on YouTube after this session, so you can also, um, look at it afterwards. So hello and welcome to the second, um, TLA webinar. Um, my name is Thomas Andersson. I'm the co-founder of TLA Robotics and one of the six group leaders. I'll be very brief so we can get on to the exciting part, which is the company presentations. So a little bit about Tech London Advocates, or TLA for short: it's a voluntary organization, uh, that was originally set up to promote the tech ecosystem in London, but the group has now expanded. It's got several different working groups, um, and, uh, chapters across the world as well. So, um, it's a really big group; it's about 10,000 members in the UK alone. If you want to join the group, it's free. Um, um, I should share my screen as well. I just remembered.

Thomas Andersson: (27:12)
Um, where is my… sorry, guys. It's coming up. There we go, yeah.

Thomas Andersson: (27:30)
All right. I'll go on. I will share that a bit later, so you can see the email address as well. So actually the email address is tla.robotics@gmail.com, and joining is free as well. Um, so the TLA Robotics working group was set up in April 2020 with the purpose of encouraging and promoting the robotics and automation ecosystem, with a focus on the UK and Europe. Um, we have three aims: to encourage investment in robotics and automation; to increase adoption of robotics and automation, um, in the UK and Europe, especially in the UK; and also, thirdly, to improve the gender balance that we see across the, um, um, robotics and automation, um, uh, sectors as well. So when it comes to investments, um, from what I've seen in my work, where I work a lot with, uh, investors, we see that, um, most of the investment in robotics is actually going to the US and, um, to Asia as well right now. Uh, let's see if I can share my screen here. Oh, there we go. Um,

Thomas Andersson: (28:42)
So in one recent project I've done, we found that, um, 80%, or rather 60%, of the companies were based in Europe, uh, 14% of the companies were based in the US, but US companies took 80% of all the investments, or they attracted that investment. Um, so yeah, I should also add that our medium to long term aim is to organize physical events, but that's obviously been hampered by, um, uh, COVID as well. So, so what we can see here on the screen is the team, uh, that makes up, um, TLA Robotics. Uh, I should make a note that we, um, we do take questions, uh, during the, um, presentations as well. So our team members will be able to answer any of the questions. If it's a question with a particular focus on one of the companies, we can field that as well to the company, just so you know. Uh, with that, um, I really want to say a big thanks to Philip at, um, Robot Center, who is sponsoring the Zoom hosting for this event, and without this, we wouldn't really be able to do these volunteer events. So with that, I'll stop sharing. Um, sorry, I had some problem with that. Um, over to you then, Philip.

Philip English: (30:08)
Cool. Thanks, Thomas. Uh, thank you very much. Now, I'm always a big fan of these events. Uh, I'm just gonna share my screen as well. And, uh, let me just go presenter mode.

Philip English: (30:29)
Okay. Right, can everyone see my screen? Perfect. Let me just move this as well, cause that's in my way. Brilliant. Um, right. Yeah. So, um, my name's Philip English. I am the chief operations officer at Robot Center, um, a collaborative robotics integrator. Um, I've personally been fascinated with robotics ever since I saw my first movie as a young child at the cinema, uh, which was, uh, Short Circuit, if some of you guys can remember that one, uh, Johnny 5. Um, so, uh, since then I've, uh, I've always had a passion for, uh, robots, um, as we get closer to the real-life, uh, Johnny 5. Um, a big part of that will be to do with software. Um, and this is why I've put together the five software trends. Um, so with software being famed for being fast-paced and innovative, uh, the future of robotic software is starting to unfold. Uh, so these are the five trends that we've put together for you guys.

Philip English: (31:37)
Okay. One second.

Philip English: (31:50)
The cloud. Starting with the cloud, trend one. So the cloud will only get bigger over time. Uh, the use of computing with cloud storage and other technologies that allow for higher levels of human-robot interaction and learning, uh, is contributing to the robotic transformation of companies. More powerful robot solutions are growing thanks to cloud technologies. They give the ability to handle heavier computing and processing tasks, which basically enables a more powerful and cognitive collaboration. And this greatly increases the available data to share between machines and humans. Uh, so all in all, um, it should mean that prices should drop for consumers as software functionality is expanded and refined. So that's trend number one. Trend number two, we've seen, is that Python is still dominant, uh, in the robot coding realm. And, um, basically Python is on a roll right at the moment.

Philip English: (32:48)
And according to statistics, um, it's grown rapidly to become one of the top languages in robotics, uh, mainly because, uh, Python and C++ are the two programming languages found in ROS. Um, Python has a huge number of free libraries, so you're not going to have to reinvent the wheel, and, uh, it's generally easy to use and saves time programming. Uh, and then with, with more and more robot-friendly electronics now supporting Python, um, out of the box, e.g. the Raspberry Pi, we're going to continue to see a lot more Python software in robotics, uh, from the new generation of developers around the globe.
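As a rough illustration of how little code a Python ROS node needs (rospy is the standard ROS 1 Python client library; the /cmd_vel topic and a robot or simulator listening on it are assumptions here):

```python
#!/usr/bin/env python
# Minimal ROS 1 (rospy) node: publishes a slow forward velocity on /cmd_vel.
# Assumes a ROS master is running and something is subscribed to /cmd_vel.
import rospy
from geometry_msgs.msg import Twist

def main():
    rospy.init_node("gentle_forward")
    pub = rospy.Publisher("/cmd_vel", Twist, queue_size=10)
    rate = rospy.Rate(10)            # 10 Hz publish loop
    cmd = Twist()
    cmd.linear.x = 0.1               # 0.1 m/s forward, no rotation
    while not rospy.is_shutdown():
        pub.publish(cmd)
        rate.sleep()

if __name__ == "__main__":
    main()
```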

Philip English: (33:28)
Trend three is that artificial intelligence will only get better over the next few years, um, with, uh, artificial intelligence and robotics proven to be a powerful combination for automating tasks and processes. Um, AI and robotics give flexibility in steps and actions and enable robots with learning capabilities in various applications. Uh, the artificial intelligence and robotics market is expected to register a compound annual growth rate of, uh, 28% for the forecast period of 2020 to 2025. So that's an interesting figure. And so, all in all, artificial intelligence in robots offers companies new opportunities to increase productivity, make work safer, and save people valuable time.

Philip English: (34:12)
Okay, trend four is the Internet of Things, um, which is set to grow and grow, with the Internet of Robotic Things (IoRT) set to be a $20 billion market by 2024. Um, I found a great description of, uh, IoRT from ABI Research, which is, um: the Internet of Robotic Things is a concept where intelligent devices can monitor events, fuse sensor data from a variety of sources, use local and distributed intelligence to determine the best course of action, and then act to control or manipulate objects in the physical world, in some cases while physically moving through that world. So, uh, it will be interesting to see what the, uh, Internet of Robotic Things will have in store for 2021.

Philip English: (34:58)
The last one is trend number five. So, uh, mixed reality, um, might be something to keep an eye on. Um, industrial uses of augmented reality and virtual reality in conjunction with robotics continue to grow, uh, as manufacturers, militaries and healthcare providers all look to find new uses of this technology. Uh, the rate of growth is projected to continue over the coming five years as practical uses of the technology continue to develop and existing users move into further maturity. An interesting fact I found on this one is that 75% of mixed reality is due to be on mobile by 2026.

Philip English: (35:40)
So that was just a quick overview of what we're seeing on the market. Um, if anyone is interested or wants to know more, then please feel free to reach out to me. Um, we are keen to get people onto the first step of our ROI methodology, to find out and research what's possible. And, uh, lastly, um, the Robot Center team: we, we all grew up watching sci-fi films, seeing visions of both utopian futures and dystopian futures. So we want to be architects of a real, positive future that is supported by robotic technology. And so we align with Global Goal number nine, which is to build resilient infrastructure, promote sustainable industrialization and foster innovation, and we support charities that are aligned to this. And with that, I will pass it back to the first speaker.

Thomas Andersson: (36:33)
Excellent. So, yeah, thanks Philip for that. Um, with that, we move quickly on to Alias Robotics and, um, Endika from Spain.

Endika Gil-Uriarte: (36:44)
Hello everyone. Hello, Tom. Thanks very much for extending the invitation, and in particular to Philip for sponsoring this opportunity. So very quickly, may I try to share my screen, and please do confirm that you can see me and hear me properly.

Endika Gil-Uriarte: (37:07)
That's okay. Can you hear me? Can you see me? That's fantastic. So there we go. So, um, it's Endika Gil-Uriarte, CEO of Alias Robotics, and Alias Robotics is a robot cybersecurity company. We are a company focused on this market niche, and we approach, uh, the cybersecurity of robots and robot components from the roboticists' perspective. In particular, Alias Robotics was founded upon previous stories in robotics, and we do have the experience and we do have the niche expertise that it takes to tackle the cybersecurity of these robotic systems. Now, it became very clear from Philip's, um, talk before that we live in the era of robots, and this is happening now, although we must say that these are very early days in robotics, with the, uh, dawn of an industry that will, uh, follow up in the coming years and decades and probably millennia.

Endika Gil-Uriarte: (38:13)
We see robots every day working with us in our homes, um, performing professional tasks, but more particularly, and more, let's say, intensively, we see robots in our industries, an industry that is now trying to position itself towards the Industry 4.0 paradigm, where connectivity is key across all the components, so cybersecurity becomes critical. Now, there are some public case studies that we've been publishing as Alias Robotics since 2018 showing, um, the, let's say, landscape that is happening right now in robotics. We normally say that the very same unlearned lessons that happened in other IT industries at their very dawn are happening again, and the, let's say, cybersecurity lessons are not learned, and the cybersecurity status of robotics is a thing that needs to improve. But it's better to see something rather than saying it. That's why I brought you some videos, with the little bottom line of "do not trust robots"; that's the advice we want you to take away.

Endika Gil-Uriarte: (39:32)
Um, let's say, um, first of all, this is something that we have been able to do through remotely manipulating the safety system of a very popular autonomous mobile robot. Now, alternatively, an attacker could exploit present vulnerabilities, as you can see in the top right, to retrieve the map of a sensitive industry. In this case, you can see there the headquarters of Alias Robotics in [inaudible], Spain. Alternatively, um, industrial manipulators are not attack-free entities, and an attacker could exploit the network attack vectors for ransom, or alternatively hijack the robot by using a simple USB, exploiting the vulnerabilities present on these industrial collaborative manipulators. Now, this is why Alias Robotics has built the Robot Immune System, our robot endpoint protection platform for robots. This is robot software that acts like a next-generation antivirus that is installed directly into your industrial robots. We do support the top-selling industrial robots, and there's a list that is public on our web, but please do ask us if you, um, want certain information for your robot. This is how it operates: RIS gets installed into the teach pendant, as in the case of this Universal Robots industrial robot, and you train it for a certain period of time so it learns about the usual pattern of life within this robot. After this training phase, you then have a fully intelligent detection system monitoring your robot.
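The train-then-detect pattern Endika describes is the standard anomaly-detection loop: learn a baseline of normal behaviour during a training window, then flag deviations. A toy, self-contained Python sketch of that idea (illustrative only; this is not the RIS product, and the numbers are made up):

```python
# Toy anomaly detector: learn a baseline during a training window, then flag
# values that deviate strongly from it. Purely illustrative; not the RIS product.
import statistics

def train(baseline_samples):
    # Learn the "usual pattern of life" as a simple mean and spread.
    return statistics.mean(baseline_samples), statistics.stdev(baseline_samples)

def is_anomalous(value, mean, stdev, k=3.0):
    # Flag anything more than k standard deviations from the baseline.
    return abs(value - mean) > k * stdev

# e.g. joint command rates observed during normal operation (training phase)
normal_rates = [48, 50, 51, 49, 50, 52, 50, 49]
mean, stdev = train(normal_rates)

for rate in [50, 51, 120]:   # 120 could indicate a compromised controller
    print(rate, "anomalous" if is_anomalous(rate, mean, stdev) else "normal")
```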

Thomas Andersson: (41:35)
That’s a second slap.

Endika Gil-Uriarte: (41:37)
Okay. So, the biological inspiration behind RIS: you can find this out there, and there's flexible licensing associated with it, uh, covering research and development, professional and certification licenses, according to industrial security standards. We do provide some security services as well, so we can help companies throughout the security process. And please do contact us at our locations in Spain or in Boston. And, uh, I would like to thank you again, and stay safe, stay secure.

Thomas Andersson: (42:14)
Excellent, thanks for that, Endika. Can I just remind all the attendees that taking screenshots is a really good way of capturing the emails and other contact details going onto the screen, if you want to contact anyone. With that, we’re moving on to Owen and SLAMCORE from the UK. So over to you, Owen.

Owen Nicholson: (42:39)
All right, thanks Thomas. Awesome. So let me just share my screen and we’ll jump straight in. I’m looking for thumbs up from people: is that coming across okay? Can I jump between the slides? That’s the question. Okay, cool. So, hi, my name is Owen, I’m the CEO at SLAMCORE. Now, robots struggle to cope with change: when the lighting starts to vary, when the structure changes, and when there are too many people walking around, essentially when the world gets a bit too real. Even today, robots struggle to answer three questions: where am I, how far away are things, and what are they? These are the three questions of spatial understanding, and the biggest cause of robotic failure is a robot getting the answer to one of these wrong. Now, this is a problem that nature has already solved.

Owen Nicholson: (43:35)
And there’s a reason why nearly every animal on the planet uses vision as its core sensing modality. If we take a camera and fuse together simple data from low-cost sensors, gyroscopes and accelerometers, and we optimize that software to run on low-cost silicon, then we can deliver commercial-grade spatial understanding on hardware that’s actually available today. This is exactly the approach the AR/VR industry has already taken. Here we see three of probably the most advanced spatially aware consumer, or early-stage industrial, products on the market today, and they all fuse low-cost gyroscopes and accelerometers with vision to tackle these three questions. But these tech giants have all heavily optimized their algorithms to run only on their own specific hardware. The robotics industry, by contrast, is made up of thousands of niche verticals, all with their own technical and commercial requirements.
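As a very rough illustration of what “fusing” a camera with gyroscopes and accelerometers means in practice, here is a one-axis complementary filter: the gyro is integrated at a high rate but drifts, and occasional drift-free estimates from vision pull it back. This is a textbook toy, not SLAMCORE’s algorithm, and every rate and gain below is made up.

```python
# Minimal sketch of visual-inertial fusion for one rotation axis, assuming a
# gyroscope rate (rad/s) sampled every dt and an occasional angle estimate
# derived from vision. A complementary filter is only one simple way to
# combine them.

def fuse_step(angle, gyro_rate, dt, vision_angle=None, alpha=0.98):
    """Propagate with the gyro; softly correct with vision when available."""
    predicted = angle + gyro_rate * dt          # fast but drifts over time
    if vision_angle is None:
        return predicted
    return alpha * predicted + (1 - alpha) * vision_angle  # drift-free anchor

angle = 0.0
for step in range(100):
    vision = 0.5 if step % 10 == 0 else None    # vision updates at 1/10 the rate
    angle = fuse_step(angle, gyro_rate=0.01, dt=0.01, vision_angle=vision)
print(round(angle, 4))
```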

Owen Nicholson: (44:34)
We don’t need a single solution that works well on one hardware platform; we need solutions that are flexible, configurable, and allow us to try different hardware combinations to find the one that works for us. Unsurprisingly, that’s exactly what we’re doing here at SLAMCORE. A quick bit of history: we spun out from Imperial College about four years ago, founded by absolute world leaders in the field of machine vision. After multiple funding rounds led by top investors, we’ve now built an incredible team of 25 staff from 17 nationalities around the world, who together have produced 50 patents, 500 papers and 50,000 citations during that time. So they’ve been busy, and after four years of development I’m extremely proud to tell you that the SLAMCORE spatial AI SDK is hitting the market, so developers can use it to create their own vision-based solutions with our proprietary algorithms at the core.

Owen Nicholson: (45:28)
Our public SDK is available for free, and we’ve teamed up with Intel to make it compatible out of the box with the RealSense D435i sensor, if you’ve ever used that. Through our paid program you can also gain access to other sensor configurations and build for Arm architectures such as the Qualcomm Snapdragon and Nvidia Jetson. What I’m super proud of is the Raspberry Pi: getting this running on that is a huge achievement. But let’s take a closer look at what we’re providing. The first thing, as I said, is that the robot needs to know where it is in space. Our algorithms analyze each frame from the Intel sensor and use it to build a centimeter-accurate map of the key points in the environment while simultaneously calculating the position of the robot relative to those points.

Owen Nicholson: (46:12)
That’s what SLAM stands for in SLAMCORE. Here we see it running on a remote-control car, building a point cloud outdoors while positioning that car in space to centimeter accuracy, all in real time on an Arm CPU. Second, a robot also needs to know the shape of the world, so it can navigate effectively and not crash into things. Our solution answers this using the same Intel sensor: it takes the depth information generated by the infrared depth camera and fuses it into a 3D map, again in real time on an Arm CPU, which can be used for path planning and obstacle avoidance. But for truly spatially aware machines, robots need to know not just the shape and position of the world; they need to know what the objects are. They need context.
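To make the “fuse depth into a map you can plan against” step concrete, here is a deliberately simplified sketch: depth points are binned into a grid, and any cell containing a point is treated as an obstacle. Real systems fuse many frames probabilistically and in full 3D; the cell size and points below are invented and this is not SLAMCORE’s implementation.

```python
# Rough sketch of turning depth points into an occupancy grid for path
# planning, in the spirit described above. 2D, single frame, illustrative only.

def occupancy_grid(points_xy, cell_size=0.05, width=100, height=100):
    """Mark each grid cell containing at least one depth point as occupied."""
    grid = [[0] * width for _ in range(height)]
    for x, y in points_xy:
        col, row = int(x / cell_size), int(y / cell_size)
        if 0 <= row < height and 0 <= col < width:
            grid[row][col] = 1
    return grid

# Points (in metres) that the depth camera saw in front of the robot.
obstacles = [(1.00, 1.00), (1.02, 1.05), (2.50, 0.40)]
grid = occupancy_grid(obstacles)
print(sum(map(sum, grid)), "occupied cells")   # a planner would avoid these
```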

Owen Nicholson: (46:56)
Is that a person? Is it a chair, a wall, a ceiling? We’ve developed a proprietary panoptic neural network which segments out the objects and categorizes them, all in real time. Ultimately, as we fuse the three levels together, this allows the robot to identify objects and position them in 3D space and time, which really brings full spatial understanding to robots. The great thing about what we’ve done is that in the past it could take months, maybe years, to build a great vision-based system, if it was possible at all, and we’ve reduced this from months to minutes. If you head to our website and sign up for an account, you can download the SDK and be off. We will need to approve you first, so please request access on our website. As long as you’ve got the hardware we support, you can be up and running in less than 30 seconds.
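For the “third level” described here, the essential idea is attaching semantic labels from a 2D segmentation mask to 3D points so that objects can be placed in space. The sketch below assumes a mask has already been produced by some trained network (not shown), and the label names and values are made up; it is not the actual SLAMCORE pipeline.

```python
# Illustrative: project per-pixel class ids onto 3D points to get labelled
# objects in the map. Labels and data are hypothetical.

LABELS = {0: "floor", 1: "person", 2: "chair"}

def label_points(points, mask):
    """points: list of (u, v, x, y, z); mask[v][u] holds a class id."""
    labelled = []
    for u, v, x, y, z in points:
        labelled.append((LABELS[mask[v][u]], (x, y, z)))
    return labelled

mask = [[0, 0, 1],
        [0, 2, 1]]
points = [(2, 0, 0.4, 0.0, 1.2), (1, 1, 0.1, -0.2, 0.9)]
for name, xyz in label_points(points, mask):
    print(name, "at", xyz)   # e.g. "person at (0.4, 0.0, 1.2)"
```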

Owen Nicholson: (47:41)
And you can see for yourself: in the time it’s taken me to present this slide, that was not edited or sped up, that was actually me on a fresh machine, installing it from scratch. So, for my last slide: I really do believe we’re in a new era for autonomous machines, we’ve already heard that today, and we are committing to open up this technology to the world, from hackers to tech giants, from small companies to multinationals. The more companies we have playing in this space, the more it will drive innovation and ultimately allow us to see the true potential of robots as a force for good in the world, and we want to be a huge part of that. So if you want to join us on that journey, please reach out; I’m happy to talk anytime.

Thomas Andersson: (48:22)
Excellent, thanks a lot for that one, quite a condensed presentation, very interesting. With that we move over to Fizyr and Herbert in the Netherlands.

Herbert ten Have: (48:37)
Yes, good afternoon everybody. Let me share my screen. Can you see my screen? Yes? Thumbs up, okay. We’re a scale-up in the Netherlands founded by Professor Martijn Wisse, a robotics professor; he has an academic view of the world, and I started this company, Delft Robotics, a few years ago. We’re in the business of logistics, enabling robots to cope with variation: variation of items in shapes and sizes. Everything you buy online is currently being picked by humans, so in fulfillment, order picking and so on, it gets picked into a box or a bag, and it’s also handled by humans. I’ve got a short video to show this, and this is it.

Herbert ten Have: (49:33)
So this happens all around the world, 24 hours a day, seven days a week: humans having to pick parcels. These are big parcels, and you can imagine all sorts of small parcels from China are also being picked by humans, day in, day out. On average, a parcel from DHL is handled by a human at least eight times, and up to 16 times. In this case they’re all placed on the sorter, the sorter is scanned in a scanning tunnel, and then each parcel is delivered to the area where it needs to go. So this is the situation up till now, and you can imagine that in times of corona it has only increased. Why has this not been automated so far? Because of the variation: variation in shape, in size, in color, in material, and in how items are stacked. Just as an example, if I drop this small towel a thousand times, the robot will see a thousand different, let’s say, towels, but it’s the same one.

Herbert ten Have: (50:43)
So what we’ve done is train a neural network with supervised learning to enable the robot to generalize, to see the item and know where to grasp. That’s what we do: we built this algorithm and it’s being used in both order picking and parcel handling. Just as an example, you see a roll cage we pick from, or, in another setup, we pick from a bin. And in this example you see what the camera does: it takes the image, or rather the point cloud, the 3D data, and we do the segmentation, so you see the green bounding boxes. We also output all kinds of classifiers, so we can see, for instance, whether it’s a box or a bag, all kinds of classifiers that are relevant. And then we propose the grasp poses: six degrees of freedom, X, Y, Z plus the rotations.

Herbert ten Have: (51:38)
That’s all done by the neural network in less than 300 milliseconds. To give an example, you see a baby’s drinking cup, and if you look at the point cloud at the top left, you see we are looking for a surface big enough and flat enough to attach a vacuum cup to, to pick it up. That generates a six-degrees-of-freedom grasp pose, which you also see on the groceries in the bin on the right. The green, red and blue axes represent the angle at which the suction cup should apply force. So this is just an example of the work we’ve done picking from this bin. We are in logistics, which is growing very fast, so our systems are being used in production in Europe, of course, in North America, and as far as China.
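Fizyr’s actual grasp poses come out of a neural network, so the snippet below is only a geometric stand-in for the “big and flat enough for the suction cup” check Herbert describes: fit a plane to a patch of points and accept it as a pose (position plus approach direction) if the out-of-plane spread is small. Thresholds and names are invented for illustration.

```python
# Hedged sketch, assuming a small patch of 3D points (metres) from the cloud.
import numpy as np

def suction_pose(patch_points, max_roughness=0.003):
    """Fit a plane to a patch of 3D points; return (centre, normal) or None."""
    pts = np.asarray(patch_points, dtype=float)
    centre = pts.mean(axis=0)
    # Smallest right-singular vector of the centred points is the plane normal.
    _, singular_values, vt = np.linalg.svd(pts - centre)
    normal = vt[-1]
    roughness = singular_values[-1] / np.sqrt(len(pts))  # out-of-plane spread
    if roughness > max_roughness:
        return None                     # too curved or noisy for the cup
    return centre, normal               # position plus approach direction

patch = [(0.30, 0.10, 0.50), (0.31, 0.10, 0.50),
         (0.30, 0.11, 0.501), (0.31, 0.11, 0.499)]
print(suction_pose(patch))
```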

Herbert ten Have: (52:36)
We have integrators using our software, and it’s not only for item picking and parcel handling but also for palletizing and truck unloading. We even pick towels: we have an algorithm that finds the corners of towels and then feeds them into a folding machine. We were bootstrapped in the first years, so we really grew organically, and I think we are an exception in the field; we got funding at the beginning of this year and we’re growing faster now. We did use ROS in the past, but we stopped due to quality concerns. We had everything in C++ and have now moved to Rust for our newer environments. We really have production systems, so

Herbert ten Have: (53:26)
they really rely on performance, and we have to build the best systems we can. This is our timeline: we started out as a previous winner of the Amazon Picking Challenge, and we’re based in Delft, close to the university, in the old university building. That’s it.

Thomas Andersson: (53:46)
Excellent, thanks a lot for that, Herbert. These are very exciting days for logistics robotics, for sure; we keep hearing about lots of people being interested in investing in it. With that, we’re handing over to Sebastian from HAL Robotics in the UK.

Sebastian Andraos: (54:08)
Hello? Yes, sorry, there we go. Okay. So my name is Sebastian, I’m one of the co-founders of HAL Robotics. HAL Robotics is a software company based here in London and in Paris. We provide solutions and services to model, program, simulate and, increasingly, communicate with applications involving industrial robots. As I mentioned, we have offices in London and Paris, but we have clients absolutely all over the world and in all sorts of different industries, from food and beverage to aerospace, arts and crafts to construction. The ones I’ve picked out here in particular are those that I think best exemplify what we do: they are currently largely un-automated, they tend to work with very small batches or even one-off pieces, and the operators in these fields are more often than not non-expert users, or at least we have to cater for non-expert users in these fields.

Sebastian Andraos: (55:25)
And we do that by striving for simplicity and flexibility. It’s obviously an approach that will work for any other industry, but for some sectors it’s particularly crucial. On the simplicity side, we offer robotic programming solutions tied directly into CAD packages with which our users are already familiar. The solutions themselves are built on top of our robotics framework, which is a flexible, cross-compatible, lightweight and highly extensible software library that allows us to develop custom software, running on PCs, embedded systems or in the cloud, with bespoke user interfaces to suit any skill level.

Sebastian Andraos: (56:10)
To me there are three major benefits of flexible, digitized automation, and this project, “A Bridge Too Far”, I think embodies them all perfectly. Firstly, they used simulation, and the fact that they had robots, to optimize the structure not only for performance but also taking into account manufacturing constraints like production time, tool-path feasibility and a few other parameters. They also made use of automated tool-path generation to push the normal process much further than it could be programmed by hand: there are something like 10,000 different targets on each of their panels, and I can tell you, you’re never going to do that by hand; it’s too long, it’s too boring. Finally, they used the fact that the CAD and tool paths are adaptive, so the robot is reprogrammed automatically when a part changes, to make each part in the structure unique at no extra cost, with minimal variation in time per piece and complete freedom of design, ending up with a truly mass-customized and highly performative product.
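To make the point about automated, geometry-driven target generation concrete, here is a minimal sketch: targets are computed from a parametric description of the panel, so changing the geometry regenerates all of them instead of requiring re-teaching by hand. The surface function, spacing and counts are hypothetical and this is not HAL Robotics’ API.

```python
# Targets derived from geometry: change the panel, regenerate the program.
import math

def generate_targets(panel_width, panel_height, spacing=0.01):
    """Return (x, y, z) targets covering a doubly curved stand-in surface."""
    targets = []
    x = 0.0
    while x <= panel_width:
        y = 0.0
        while y <= panel_height:
            z = 0.05 * math.sin(6 * x) * math.cos(6 * y)  # stand-in surface
            targets.append((round(x, 3), round(y, 3), round(z, 4)))
            y += spacing
        x += spacing
    return targets

# Changing the panel size regenerates the whole program at no extra cost.
print(len(generate_targets(1.0, 1.0)))   # ~10,000 targets for a 1 m panel
```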

Sebastian Andraos: (57:28)
These same strategies can be applied regardless of the industry or material being used, and they can be tied in with sensors to perform more complex adaptive processes. Here, for example, we have glassblowing, or glass bending actually in this particular case, steam bending of wood, and even robots on construction sites that we’re working on at the moment. And just because it’s fun: once you’ve got sensors in the mix, you can tie people in and have a bit of motion and interaction going on. This was a student at a workshop we did a couple of years ago now, who is controlling a robot with his right hand and the gripper with his left. It appears that I raced through that, but I’m going to leave it there, so please feel free to reach out and ask any questions you may have.

Thomas Andersson: (58:22)
Well, that gives us a spare minute; well done, Sebastian, and thanks a lot for that very interesting solution you have there. I’ll say it again to the attendees: feel free to submit any questions through the Q&A tab. With that we move on to the Miranda software from IRAI in France, and Laurent.

Laurent IRAI: (58:45)
Yep, thanks again. I’m going to share my screen. Okay. So I’m Laurent from the IRAI company, based in France. We have been developing software for industry and education since 1988. Today I will show you a piece of software that we developed for students to learn programming languages, called Miranda. This is a simulator for small robots, the kind you can find in some schools, that you can program either in Scratch or in Python. Let me show you directly inside the software. We developed this tool because schools in particular don’t have access to a lot of robots; it’s a question of budget. With this tool, students are able to work together at the same time, either in simulation and then with the real robot, or to run a test before building their own robots.

Laurent IRAI: (01:00:04)
Some schools do that with their children, and with Miranda they can do all of that: they can simulate their own robots, test a challenge inside Miranda, then transfer their program to the robot to test it for real. There is also an editor, so you can build your own robot beforehand, see it, and test every aspect of it. To support this, we propose challenges for the students built around the robots. For example, here is a small robot that’s pretty common in schools.

Laurent IRAI: (01:00:51)
The robots can be programmed, as you can see, in Scratch or in Python. For example, one challenge is to program the robot to go to the gate, and I can simply write a small program to drive it there. It’s a good way for students to learn programming without damaging the real robots. Then, as a teacher or a parent, you can follow their progress: you have your students added to your account, so you can see each challenge and their progression; these ones have done 12 of them. This is the program editor, and I can test the program directly within Miranda. These are predefined challenges that come with the software, but you also have access to an editor, so you can create your own challenges or modify the ones provided; you can bring in 3D models from the library, or 3D from your CAD software, to create your own simulation. If you have more questions, or if you want to test the software, you can go to the website linked in the chat and ask for a three-month trial, or you can go to our IRAI website in France, where you can also see all our other products on the industry and automation side, including robot simulation. Thank you.
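To give a flavour of the kind of exercise Laurent describes, here is a purely illustrative “drive to the gate” program. The robot class below is a stand-in written for this sketch; it is not Miranda’s actual Python API, whose function names will differ.

```python
# Hypothetical stand-in for a school-style simulated robot challenge.

class SimRobot:
    def __init__(self):
        self.position = 0.0          # metres along a straight track

    def forward(self, distance):
        self.position += distance

    def at_gate(self, gate_position, tolerance=0.05):
        return abs(self.position - gate_position) <= tolerance

robot = SimRobot()
GATE = 1.5
while not robot.at_gate(GATE):       # the student's "challenge" loop
    robot.forward(0.1)               # small steps until the gate is reached
print("reached the gate at", round(robot.position, 2), "m")
```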

Thomas Andersson: (01:03:10)
Excellent, thanks for that. With that, we’re moving on quickly to Austria and Christoph from Incubed IT. Over to you, Christoph.

Christoph Zehentner: (01:03:33)
Thank you very much, and thanks for the invitation. My name is Christoph Zehentner, I’m product owner and one of the seven founders of Incubed IT. Yes, you heard right, we are seven founders, and there are actually already 30 people working here at Incubed IT. As you can hear from my lovely accent, I’m from Graz, Austria. We at Incubed IT believe that autonomous mobile robots will become a commodity in every warehouse and every production shop floor within a couple of years from now, and we believe that software will be the key to robots actually becoming part of every warehouse and every production line. That’s why, over the last nine years, we developed a robotics platform that turns any vehicle, any AGV, into an autonomous mobile robot. How that actually works you can see in a nice little YouTube video, which I will let run while I speak. Over the last nine years we have invested and innovated a lot in areas like localization, environment awareness, navigation and natural obstacle avoidance, all the stuff that makes a robot go from A to B naturally. And we have seen that this technology...

Philip English: (01:04:58)
Oh, I think the screen share has stopped. A request, please: could you share your screen again?

Christoph Zehentner: (01:05:06)
Doesn’t matter. And we’ve seen over the last years that this technology has become robust enough to serve 24/7 industrial applications. What is needed now to get large fleets into the real world is ease of use. What does ease of use mean for us? On the one hand, it’s easy hardware integration: we need to get our software onto new hardware models as easily as possible. It’s also easy implementation of customer projects, so the product needs to be applied like a consumer product; we often compare it to a TV: you select the TV in the shop, and then you just use it.

Thomas Andersson: (01:05:50)
Just to interrupt: can you share your screen again? Some of the attendees are really keen. Sorry about that.

Christoph Zehentner: (01:06:01)
And on top of that, it’s easy connectivity. Our fleet management server, for example, can be installed in the cloud, making it possible to attach it to multiple add-on services such as analytics platforms, other host systems, ERP systems, whatever. And in the end, easy usage on the shop floor itself: it’s not only about a usable, easy-to-use interface, it’s much more than that; the whole robotic solution must be easy to integrate and smooth to use. We always say smooth is the new smart. To do all that, we developed a software platform that consists of three main parts. The first is the smart shuttle navigation toolkit, as we tend to call it: the navigation and localization stack that runs within the robot.

Christoph Zehentner: (01:06:56)
The second part is the fleet management server, which coordinates a fleet of heterogeneous robots, and on top of that comes data monitoring and analytics, so the customer gets insight into what is actually going on on the shop floor. To give some numbers: we have more than 300 shuttles deployed worldwide, currently running on our software platform, and the biggest fleet, consisting of 40 shuttles, has been running more than a thousand kilometers, 24/7, since 2014. But there’s more to it than robots driving around: robotics projects consist of multiple steps, and the whole life cycle can be covered by our digital twin, going from the planning phase, where you can simulate a site up front without actually having robots there (you can see what the process will look like and how many shuttles you need), through the development and testing of the installation, to go-live.

Christoph Zehentner: (01:07:58)
You take the data that came from planning, that you adapted during development, and actually put it into production. It’s a nice thing to take such a solution into real production from home, even from your home office; it’s much more convenient than sitting in a four-degree-cold warehouse. And when you have a support case, for example, we just load data from the live installation back into our digital twin and simulate how it could have happened, and whether a bug fix or a change in the configuration would resolve it, and so on. And last but not least, when you have processes that should change, you can try them up front in the digital twin, see how your KPIs vary, and even suggest changes to the customer before they know there is something to tune or improve. That’s basically it; I somehow hurried through it. If you have any questions, please feel free to contact us; I’ve given the contact information here, and once again, thank you for organizing this opportunity. I’d love to hear from you. Thank you.
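A real digital twin simulates layouts, traffic, charging and so on, but as a back-of-the-envelope version of the planning question Christoph mentions (“how many shuttles do I need?”), here is a sketch that simply divides transport demand by per-shuttle capacity. Every number and the function itself are invented for illustration and are not part of Incubed IT’s platform.

```python
import math

def shuttles_needed(jobs_per_hour, avg_trip_m, speed_mps, handling_s=30.0):
    """Estimate fleet size from transport demand and per-trip cycle time."""
    trip_time_s = avg_trip_m / speed_mps + handling_s   # drive + load/unload
    trips_per_shuttle_per_hour = 3600.0 / trip_time_s
    return math.ceil(jobs_per_hour / trips_per_shuttle_per_hour)

# 120 transport jobs/hour, 80 m average trip, 1.5 m/s shuttles:
print(shuttles_needed(jobs_per_hour=120, avg_trip_m=80, speed_mps=1.5))  # -> 3
```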

Thomas Andersson: (01:09:10)
Excellent, thanks, Christoph. Just two quick notes: there’s a question for Fizyr in the Q&A, and there’s also one for you, Christoph, that came in via the chat, if you can take a look at that. Anyway, thanks for that; it’s a very exciting kind of solution you’ve developed. With that we’re moving over to Extend Robotics, in the UK as well, and Ji Long, I believe I pronounced that right?

Chang Liu: (01:09:45)
Sorry, it’s Chang Liu. Hello everyone, my name is Chang Liu, I’m the founder and CEO of Extend Robotics. We are a human-focused robotics startup in the UK. Our vision is to extend human capabilities beyond physical presence, so we build affordable robotic arms capable of remote operation from anywhere in the world using cloud-based teleoperation software. Our technical founders are post-docs and PhDs from Imperial College, and our commercial leadership comes from Huawei. The background is that the new trend of digitization and automation is transforming almost all industries, new technologies are approaching the slope of enlightenment, and new challenges like labor shortages, an aging society and the pandemic are forcing our society to look for new technologies and adopt them faster.

Chang Liu: (01:10:54)
All of this contributes to the exponential growth of robotics as we see it today: sales of service robots exceeded 17 billion dollars and are forecast to reach 40 billion in 2023. So it’s a big market. Of course, I cannot avoid talking about COVID: 65% of UK office workers believe working remotely will become more common even after the pandemic, and many traditional industries are showing the need for remote working even for physical tasks. So it’s the worst time for humans, but maybe the best time for robotics. The fact is that over 50% of jobs still cannot be easily automated by today’s robots, due to the limited dexterity and intelligence of autonomous robots and the low scalability and high cost of traditional teleoperation robots.

Chang Liu: (01:11:58)
However, many of these operations now have a strong desire to avoid human presence. But how can robotics help with this? Since the 1950s, teleoperated robots and autonomous robots have been complementary in terms of balancing cost and dexterity. What people really need, against this background, is a robotic system that is affordable and can provide flexible dexterity, to complete the job remotely and reliably. So our solution is that we’re building next-generation teleoperation robots: an affordable, VR-controlled, multi-purpose robotic arm accessible from anywhere via the cloud. This works as the physical avatar of the user, performing manipulation or teleoperation tasks remotely.

Chang Liu: (01:12:59)
We call it the all-star twin. It is a robotic manipulation module to be integrated into your existing robots, such as mobile robots. Technically it includes three parts: a robot toolkit, an advanced mechanical assistance system (AMAS), and an imitation learning engine. Our robot toolkit is a versatile arm module with human-like dexterity and perception. It is an affordable, multi-purpose robotic arm optimized for mobile teleoperation, featuring a remarkably high power-to-mass ratio with six degrees of freedom, using high-power-density quasi-direct-drive actuators and a lightweight, optimized design with a parallel mechanism. It also integrates RGB-D sensors and a powerful compute unit for data processing and control.

Chang Liu: (01:14:07)
We manage to fit everything into one package at quite a low cost. Our AMAS is a powerful VR software which provides immersive 3D perception and intuitive control for highly dexterous teleoperation with low-cost consumer equipment. We developed an efficient point-cloud streaming pipeline and a gesture-based control graphical user interface: the 3D point-cloud perception gives the user an accurate sense of depth and a flexible viewpoint and avoids motion sickness, while the digital-twin-based gesture control gives the user an intuitive interface to send complex control signals effortlessly. Our imitation engine is a cloud-based imitation learning pipeline enabling anywhere access and flexible, data-driven AI. I’ll just quickly jump through the rest: if you’re interested, you can go to YouTube to check out our walkthrough video. Thanks for listening, and sorry for running over.
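As a rough sketch of the two directions of the teleoperation loop described here, the code below downsamples a point cloud before “streaming” it to the headset and maps the operator’s hand pose to an arm target. The voxel size, scale factor and data are assumptions for illustration, not Extend Robotics’ pipeline.

```python
# Hypothetical, simplified teleoperation building blocks.

def downsample(points, voxel=0.02):
    """Keep one point per voxel to cut bandwidth before streaming."""
    seen, kept = set(), []
    for x, y, z in points:
        key = (int(x / voxel), int(y / voxel), int(z / voxel))
        if key not in seen:
            seen.add(key)
            kept.append((x, y, z))
    return kept

def hand_to_arm_target(hand_xyz, workspace_scale=0.5):
    """Scale hand motion in VR down to the arm's smaller workspace."""
    return tuple(workspace_scale * c for c in hand_xyz)

cloud = [(0.101, 0.2, 0.5), (0.102, 0.2, 0.5), (0.4, 0.2, 0.5)]
print(len(downsample(cloud)), "points streamed")   # near-duplicates collapsed
print(hand_to_arm_target((0.2, -0.1, 0.3)))        # target sent to the arm
```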

Thomas Andersson: (01:15:36)
Thanks for that, very interesting. Anyway, we jump over to the last presentation, a note from ARS Recruitment, in case we have any people looking for work these days in the robotics sector. So with that, over to you, Sam.

Sam Robinson: (01:15:54)
Thank you. Let me just share my screen. Can we all see that? Okay, cool. So thanks for the invitation. My name is Sam Robinson, I am the owner...

Thomas Andersson: (01:16:10)
You haven’t shared anything yet. Oh, no.

Sam Robinson: (01:16:14)
Two seconds then,

Thomas Andersson: (01:16:18)
Just to note again, feel free to submit any questions in the Q&A tab and we will try to read them out later, or get them onto the main screen. There you go.

Sam Robinson: (01:16:29)
Okay, lovely, sorry. So yes, I am Sam Robinson, the owner of Automation and Recruitment Solutions. I set the business up in February this year, just before COVID struck, really specializing in automation and robotics recruitment. I just want to talk about a few things today, mainly top tips for clients who are hiring. The market is picking back up now and some people are after specialized skills, so I just want to put a few points out there, specifically around being competitive. There are a lot of people within automation and robotics looking for the same type of talent and the same type of skill set, so you really need to start thinking about what separates you from the others. It’s not always about salary: candidates in the market these days are looking for more benefits in terms of flexibility and career development.

Sam Robinson: (01:17:22)
Then think about the interview methods you’re actually using while hiring: are they formal interviews, are you switching over to Zoom or Teams meetings, and are you streamlining the process so the candidate is fully clear on how many stages the interview will take? As I just touched on, there’s not a great deal of talent out there at the moment, and it is spread thin across many companies, so there are different options to consider. Are you taking the graduate route? Or, maybe not so much for the technical roles but for senior management and director roles, are you taking people from outside the industry who bring different methodologies and different ideas? I’ve found with a couple of clients I currently work with that it’s a great way of bringing new ideas into the business. Now I just want to talk about the candidate side of things very quickly.

Sam Robinson: (01:18:10)
If you are looking for work at the moment, it’s very important that you make your CV very clear and very precise about the type of skills you have and why somebody should want to hire you. I would recommend including your achievements on your CV and making it very clear what you can bring to the business. Then, if you are selected for interview, preparation is very important. You need to do your research on the business; “what do you know about us?” is one of the interview questions that gets asked every single time, and you’d be surprised how many times that catches people out, even today, which is bizarre. I would also recommend doing some research on the managers who are actually going to be interviewing you for the role, and on top of that I suggest doing a route plan on Google Maps to see where the interview is and how long it’s going to take.

Sam Robinson: (01:18:57)
You could even do a practice run. It’s only a short presentation, but I’ve saved the most important part for last, and that’s your LinkedIn profile. It’s a public platform that everybody is on, or should be on, whether they’re looking for work or looking to hire. Your profile should include your contact details and a picture of yourself, and should make you visible so people can contact you about opportunities. Recruiters like myself use it as the number one go-to for finding talent, and if you’re not on there and not making yourself visible, there are other people out there who are. So if you need any support or help with anything regarding your LinkedIn, preparation for interviews, or your journey, even if you’re just looking around, I’d love for you to get in touch, and I’m more than happy to help, even if it’s just guidance. With that, it was only a short one; thank you very much for the invitation, and back over to you, Thomas.

Thomas Andersson: (01:19:49)
Yeah, thanks a lot for that. So that kind of concludes all the presentations, and we can now go over to the Q&A session. Let’s see here, I’ll share my screen. First of all, if any attendees want to contact us, just take a screenshot now. There is one question for Fizyr: have you worked on any application where it’s necessary to pick up items using a robotic arm located on a mobile robot, and if so, what experience and challenges have you found?

Herbert ten Have: (01:20:35)
I can answer that. Actually, a lot of people think of it that way, replacing a human who is both walking around and picking, but it’s not an interesting application for a robot for economic reasons. You want to have the robot continuously picking to have a decent ROI, a return on investment, so let’s say the robot should be picking every few seconds. Ideally you have a goods-to-person, or rather goods-to-robot, situation where the robot can keep picking. If it has to drive around, then apart from the navigation becoming harder, it would be picking, let’s say, once every 30 seconds at best, maybe once a minute, so it doesn’t support the investment in the robot.
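A quick worked version of Herbert’s argument, with illustrative numbers only (none of them come from Fizyr): a stationary goods-to-robot cell picking every few seconds does roughly an order of magnitude more picks per year than a mobile manipulator that must drive between picks, which is why the ROI case for arm-on-AMR picking is hard to make.

```python
SECONDS_PER_YEAR = 8 * 3600 * 250        # one shift, 250 working days (assumed)

def picks_per_year(seconds_per_pick):
    return SECONDS_PER_YEAR // seconds_per_pick

stationary = picks_per_year(5)           # pick every ~5 s at a fixed station
mobile = picks_per_year(45)              # drive, then pick, every ~45 s
print(stationary, mobile, round(stationary / mobile, 1))   # ~9x the output
```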

Thomas Andersson: (01:21:21)
Interesting, thanks for that. And please do keep submitting questions. I have a question myself, perhaps, if anyone wants to pitch in: what has the impact of the pandemic been on you? How has business changed? Some must have had quite a positive impact, some perhaps a slightly negative one, or any other kind of impact you want to talk about. Perhaps Herbert wants to answer first.

Herbert ten Have: (01:21:54)
Yeah, sure. So first we were shocked, of course, and we expected all projects to be put on hold, so we prepared to survive for a year. But soon after, I think less than two months, we saw that our key clients, like Fabric in Tel Aviv and then New York, who are picking groceries for home delivery, said: we go full speed ahead. They increased their pace, so we worked harder and harder for them. And then, I would say in the last three months, the parcel handlers have seen such big increases that they want more picking cells. Whether it’s DHL, UPS or FedEx, all of them have a shortage of people and all of them want picking cells in the near future.

Thomas Andersson: (01:22:47)
Interesting. Anyone else? Uh, Christoph, perhaps.

Christoph Zehentner: (01:22:55)
I would say it’s the same story here. At the beginning we saw that everybody put projects on hold, maybe out of fear, but actually it’s rising again, and rising much more than it was before.

Thomas Andersson: (01:23:13)
Yeah, interesting. What about you, Owen? How did you see the pandemic?

Owen Nicholson: (01:23:25)
It was obviously a shock, a big shock to the system, and I think we all had to rethink a bit how we were going to actually run our companies, so we moved to working remotely. I actually wrote an open letter to the robotics industry at the start, which got picked up by some of the robotics tech and business news outlets and got quite a lot of PR, essentially saying to a lot of the bigger companies and the investors: this is going to be a hard time for the robotics industry, but we are also part of the solution for the future, so let’s make sure we get through it to the other side. And that’s actually what we’re starting to see now: we’re seeing huge inbound interest on our side, and investors are starting to see that robotics has, I’d say, even been accelerated by all of this, because a lot of the larger companies have been thinking about how to make themselves more robust for the future. I absolutely wish it hadn’t happened, I wish it wasn’t here, but I think the net effect might actually be positive for the robotics industry, which is quite a bizarre thing to say.

Thomas Andersson: (01:24:30)
Interesting. What about the Alias side, Endika in Spain? What was your experience of the pandemic?

Endika Gil-Uriarte: (01:24:39)
I share what most of the previous speakers have said; it has been pretty much the same for us. We had to move to fully remote working for a while, and most of our customers have gone through the same. Now we’re back to the new normal. We have witnessed that some of our customers have reduced teams on their shop floors; they have a lot of constraints when it comes to physical contact between people and so on, and, why not, they have moved to wireless, remote monitoring of their robotic processes. And they have increasingly identified security as a requirement for business continuity. So it’s up to us, as was said before, to be safe and secure in this very new future and to adapt effectively to these new conditions, which will very likely stay for a while.

Thomas Andersson: (01:25:41)
Interesting, thanks. What about Extend Robotics? Remote control is your business, so it must have been quite an interesting area to be in during the pandemic.

Chang Liu: (01:25:55)
Yeah, I guess we’re quite lucky, because our whole solution is around remote working, especially in being able

Chang Liu: (01:26:04)
to have an intuitive user interface that allows anyone to control at least a robot arm remotely. It’s clear now that there’s a huge need, not only for remote work in the ICT sense, but also for extending that into the physical domain, and that’s exactly what we do. For example, so many university labs have simply closed and people cannot continue doing testing, and any company labs requiring physical work have had to close, with no one able to access the location. But just imagine if someone could remotely control robots on a workbench and keep working on their product continuously; that’s exactly what we can offer.

Thomas Andersson: (01:27:02)
Sebastian, has there been any impact at all? Anything interesting?

Sebastian Andraos: (01:27:08)
So, as a software company, working from home is a bit easier than it is for a hardware company, but yeah, everything slowed down a bit. I think we’re going to see things pick up; in particular, we’ve had a bit of contact from SME manufacturers who are struggling to keep as many people in their facilities and would like to automate away some of the less interesting jobs. That’s something we saw before the pandemic, and it has just spurred some of those people back into action. But yeah, it was relatively easy for us to move between locations; it’ll be okay.

Thomas Andersson: (01:27:48)
Laurent, in France?

Laurent IRAI: (01:27:51)
Yeah. So, fortunately for us it was actually an opportunity, in a way. On the simulation side for industry it doesn’t change anything for us, but in education, as you saw in the demo, Miranda is a product that came out in February, just before the pandemic. Students were actually using the software; we gave it away for free during the pandemic so they could work from home on programming the robots, which they couldn’t do in school. So Miranda really became popular in schools, and that was fortunate for us.

Thomas Andersson: (01:28:57)
Interesting, yeah; it has been a health problem above all. So there’s one question from the audience online, again about the pandemic impact, but specifically relating to how customers or end users have re-evaluated the way they access and deploy capital, perhaps looking at CapEx versus OpEx type investments for robotics. Herbert, do you want to have a stab at that as well?

Herbert ten Have: (01:29:36)
Yes, I can. We actually support both constructions, both CapEx and OpEx; we can even invoice per pick. That’s really up to the client, so we’re flexible on that. But we haven’t seen a big change there: there are integrators and end users that want a CapEx model and others that want an OpEx model, so we are just flexible, I would say.

Thomas Andersson: (01:30:02)
Okay. What about Philip? Robot Center is obviously sponsoring the event, but you’re also a system integrator working quite a lot with end customers in the UK. What’s your view?

Philip English: (01:30:19)
Yeah, so from our side we’ve seen a lot of consultancy, especially for the bigger players that are looking into new types of robotics but are a bit hesitant about actually placing orders and placing POs. They’re getting the consultancy, getting the understanding of what they need to do, ready to hit the order button. And then for the SMEs it depends on the industry: aerospace, for example, has gone quite quiet at the moment. But yeah, there’s definitely still interest out there; everyone is keen to see how this technology is going to help them.

Thomas Andersson: (01:31:03)
Interesting. Anyone else on the OpEx versus CapEx question, or let’s say the preservation of cash? Otherwise I can add something: I’m currently writing a really big report on the AGV sector, so AGVs and AMRs, autonomous mobile robots, and have interviewed about 55 to 60 companies. What we’ve seen is that two or three years ago people talked about RaaS models and leasing models, but no one really entered into them; now we’re seeing that customers are coming back and asking for these models. One of the things with the RaaS-type models, though, is that not many of the customers or end users allow the robotics fleet manager or controller software to run in the cloud, so the business models look slightly different; it’s really only leasing that’s available in most cases anyway. I’m not sure if that’s true for everyone. Anyone else on OpEx versus CapEx and cash preservation? Otherwise we have one very specific question as well, which I can’t see right now.

Herbert ten Have: (01:32:40)
I think I remember the question; it was for us: wasn’t it hard work to migrate from C++ to Rust?

Thomas Andersson: (01:32:47)
That’s it? Yeah.

Herbert ten Have: (01:32:49)
I’m probably the least qualified person on my team to answer this question, but what I’ve witnessed is that it is of course a big change, because you invest in something just to replace what you already have; it doesn’t bring you direct benefits, it’s really long-term value for your clients. So it was a strategic decision to go for the highest quality, and, as you know, Microsoft has decided to also migrate to Rust instead of C++, which is an indication of how important they consider this kind of decision. We had to train people, we had to do the planning and all that, so we invested a lot in this migration. It’s tough, but we want to stay the best, and this is one of the key elements.

Thomas Andersson: (01:33:35)
Okay, excellent, thanks for that; it also brought Microsoft into the discussion. We have a few other Q&As as well, but I think we’re running slightly over time now; we’ve had the webinar running for one hour and ten minutes, so I think that’s it, unless there are any really pressing questions. There is a question for IRAI regarding the simulator; perhaps you can answer that.

Laurent IRAI: (01:34:05)
Yeah. In the software you can edit existing robots, or plants and so on, but you can also create your own robots from parts, and then build the blocks or the Python commands to control them inside Miranda, so you can test them inside the challenges, or see the difference between the robots we have included and yours. Yep.

Thomas Andersson: (01:34:43)
Yeah. I’d also remind all the attendees that this will be available online on YouTube as well, so you’ll be able to see the links and so on. With that, we’ll close down this webinar. A big thank you to all the panelists who’ve given their time for this; hopefully we’ll all get something out of it, some nice connections or whatever. So to everybody in Europe, have a nice evening, and to everybody in the US, I hope you have a nice day. Thanks, everyone. Thanks, guys. Thanks, everyone.

TLA Robotics: https://www.techlondonadvocates.org.uk/ Tech London Advocates Robotics: TLA Youtube Philip English: https://philipenglish.com/  Sponsor – Robot Center: http://www.robotcenter.co.uk FIZYR: https://fizyr.com/ Alias Robotics: https://aliasrobotics.com/ SLAMCORE: https://www.slamcore.com/ HAL Robotics: https://hal-robotics.com/ Extend Robotics: https://www.extendrobotics.com/ IRAI Miranda: https://www.iraifrance.com/ IncubedIT: https://www.incubedit.com/ ARS Recruitment: https://www.ars-recruit.com/

Tech London Advocates Robotics – Inaugural

TLA - Robot Center - Philip English

Hi, guys! Philip English from philipenglish.com and today we have an interview with the Tech London Advocates team, who are telling us about a mixture of Robots.

Thomas Andersson: (00:00)
Hello and welcome to the inaugural TLA Robotics event. My name is Thomas Andersson and I am one of the four group leaders of TLA Robotics. I’ll be very brief so we can get onto the exciting part, which is the company presentations. Tech London Advocates, or TLA for short, is a voluntary organization, so please remember that everyone involved in today’s event has spent their free time on this. If you could mute your microphones until it’s your time to speak, that would be great as well. And I should mention that if you want to become a member of TLA, please reach out to any of us; just take a screenshot of the screen right now and get in touch after the event or during it. We’re lucky to have Russ Shaw, the founder of Tech London Advocates, on the call today.

Thomas Andersson: (00:54)
He’ll be introducing Tech London Advocates in a bit more detail, so I will talk a bit more about the robotics group. TLA Robotics was set up in April 2020 with the purpose of encouraging and promoting robotics and automation in the UK and Europe. The key reason for this is simply that the UK and Europe lag behind North America and Asia in terms of funding for robotics companies, in particular those with a hardware component. At the same time, we are also keen to see increased adoption of robotics and automation across industries, and in particular in the UK, which lags behind many of its European peers in robot penetration. Furthermore, we also aim to encourage more women in robotics, and the reason I’m laughing slightly is that today we’ve totally failed at that: it’s all men presenting here. But we promise to improve that in our coming webinars.

Thomas Andersson: (01:57)
I should also note that our medium to long term aim is to organize physical networking events; however, depending on the success of these webinars, we may continue them as well. Before I hand over to Russ to introduce Tech London Advocates in more detail, I want to extend a big thank you to Robot Center, a robotics and automation system integrator based in the UK. Robot Center has kindly sponsored the Zoom hosting for this event, and without that we wouldn’t be able to showcase the many thriving companies that are coming up. So a big thank you to Philip and his colleagues at Robot Center, first of all. You will be hearing more from Robot Center after Russ has introduced Tech London Advocates. With that, I’ll stop sharing, so straight over to you, Russ.

Russ Shaw: (02:50)
Thanks, Thomas, and congratulations to you for getting this group set up and launched today. I know this has been a passion of yours, and you and I have been speaking about it for the past few months, so many congratulations on taking an idea and turning it into a reality. And let me add my thanks to Robot Center as well. As Thomas mentioned, I am the founder of Tech London Advocates. I know many on the call today may not be familiar with Tech London Advocates and TLA Robotics, so I’m just going to take a couple of minutes to share a bit more about the group, as well as Global Tech Advocates. I launched Tech London Advocates back in 2013, so we’re just over seven years old, and I did it because I wanted to create a group of diverse leaders from all backgrounds and all walks of life, to come together as volunteers, mainly from the private sector, to promote London’s tech ecosystem and to deal with the issues and challenges we face in our ecosystem.

Russ Shaw: (03:51)
Thomas alluded to those at the start in relation to robotics, in terms of where the UK and Europe lag behind a bit. So I launched the group, and I make it very easy for people to come into it; after this event, Thomas and I will write to all of you and all of those who registered to invite and welcome you to the group. But basically I ask advocates to do three things. One: use us as a resource whenever you’re speaking, blogging or tweeting about tech, good or bad; it doesn’t matter, but we really want to encourage advocates to try and speak with a relatively consistent voice when we’re speaking to media, to government and to other key influencers. The group is open, free to join and entirely inclusive; anybody can come into the group.

Russ Shaw: (04:38)
Two: I just ask every advocate, when ready to do so, to introduce at least one new advocate to the community. We’re built on network effects, and the London group has gone from zero back in April 2013 to over 9,500 advocates, literally through people introducing new advocates to me or coming into the community through working groups like TLA Robotics. And then three: I ask every advocate to adopt the ethos that we’re here to help one another for the greater good. If an advocate reaches out and says, do you know this person, can you connect me, can you make an introduction, can you help, then either say yes, or say, look, no, I’m sorry, I can’t, but maybe I can find somebody who can. That’s all I ask of every advocate; it’s designed for very busy people. As of today, with this group, I think this is working group number 50 or 51.

Russ Shaw: (05:32)
I’ve lost count, but we do have over 50 working groups, and Thomas has been working incredibly hard to build up this community before we actually launched the TLA Robotics group. So take a look on the website; you’re welcome to get involved with other working groups, and if there’s another group beyond this one that you’re interested in, do let me know and I’ll connect you to the leaders of those groups. Obviously we’re in a strange time with the pandemic. Normally I host one or two big events in London each year, but we’ve had to postpone things; our next TLA event will actually be in March of 2021. It’s called Debate Tech. We have invited all of the candidates running for London Mayor, an election which was supposed to happen this May and will now happen in May 2021, to come to an event and debate a tech manifesto that we’re preparing with them along with Centre for London, London

Russ Shaw: (06:25)
First and techUK. Thomas, I know you wanted me to mention a couple of things on the horizon. One of the big things coming up is London Tech Week. Tech London Advocates has been a founding partner of London Tech Week since its inception back in 2014; it normally takes place during June of each year. Back in June of this year we held a small virtual London Tech Week event called London Tech Week Connects, and it was a great success; it was all done virtually, and I think we had eight or nine thousand people attend. It was so successful that the fellow founding partners, Informa Tech and London & Partners, along with TLA and Founders Forum, said: let’s run a bigger version of that in September. So for the week of the 7th of September, London Tech Week will come back. Next week I’m sending out a newsletter to all advocates, which is my London Tech Week preview.

Russ Shaw: (07:20)
So Thomas, your timing is excellent in terms of launching this group now: everybody will get a newsletter next week which will summarize and highlight how to get involved and how to register for events, and the good news is that everything is free for London Tech Week in September. One other thing I just want to talk about is that Tech London Advocates is now part of Global Tech Advocates, a network that I’ve been setting up globally since 2015. There are now approximately 20 groups in Global Tech Advocates around the world. In the UK we have the London group plus three other groups: one in the North of England, one in Belfast and one in Scotland. In Europe we have a Nordics group covering Scandinavia and the Baltics, an Italy group, a Spain group and a Paris group. In the Americas we have groups in the San Francisco Bay Area, in Canada, in Mexico and in Bogota, Colombia. In Asia we have groups in Singapore and Japan, and two groups in China, one in Shanghai and one in Shenzhen. We have five groups in the launch stage: Korea, Australia, India, the Netherlands, and what’s being called Emerging Europe, which will cover Central and Eastern Europe and the Balkans, going all the way up to Ukraine; they’re scheduled to launch later this year and early into 2021. Last year I held the first ever Global Tech Advocates summit in China; we did it in Shanghai last October, and then, as part of a Global Tech Advocates festival, we traveled on to Beijing. There were approximately 60 of us who took

Russ Shaw: (08:56)
part in the event, from nine of the 16 GTA groups at the time. So obviously things are a bit different this year, but we’ll take a look at next year; I’m thinking of doing the GTA festival again, possibly in China. And I know that there’ll be a lot of interest in this robotics group as we move forward with it. So hopefully that gives you a picture from me as to who we are and what we’re all about. Again, Thomas and I will follow up with you, hopefully over the next day or two, to bring you in, share more information and get you comfortable being part of the community. So Thomas, back to you; really looking forward to the rest of the session.

Thomas Andersson: (09:37)
Thanks for that, Russ. Let’s jump straight over to the exciting part, which is the company presentations. First up is Philip English from Robot Center, tonight’s event sponsor as well.

Philip English: (09:55)
Perfect, thank you guys, thanks for the introduction, much appreciated. And I’ll say, yeah, I’m always keen to support these types of events. So let me just put my PowerPoint on, one second. Right, can everyone see that?

Philip English: (10:24)
Right, perfect. So my name’s Philip English and I’m chief operating officer over at Robot Center, and we’re a collaborative robot company. We work with companies such as Airbus, Honeywell and the NHS to automate repetitive tasks and get our customers robot optimised, as well as a return on investment. As a system integrator, we basically analyse our customers’ needs, provide a plan for automation and then put that automation into production.

Philip English: (11:05)
What I wanted to share in this presentation is really the three key problem areas that we see customers facing in both the private and public sectors, and the reasons why they find themselves looking into these three areas. The first one is resource, the second one is optimisation and the third one is innovation. We’ll go through each one, starting with resource. Due to the ageing workforce and the skills and labour shortage, we see that people are not interested in doing the dull, dirty and dangerous jobs anymore. This means employers are struggling to recruit the right people and staff retention is low. As a result, businesses are constantly having to retrain new staff, racking up thousands of pounds in training costs and lost productivity.

Philip English: (12:01)
The next one we see is optimisation. With businesses pressured to produce their products and services faster and more cost effectively, we see employees doing low-value tasks at every step, with the physical movement of materials alone wasting a huge amount of time. The negative effects of this are twofold. First, the customer suffers from continuity issues and a reduced level of service. Secondly, the employee may suffer unnecessary stress and may even be subject to injury or illness. So that’s the second one. The next one we see is to do with innovation. Because of technology advancements, there’s increasing demand for products and services to be both bespoke and delivered next day. This can cause problems with maintaining the right inventory levels and fulfilling large orders, as well as bottlenecks within the supply chain. Coupled with a lack of visibility and traceability, these issues can lead to poor fulfilment at best and product deletion at worst, because of the reduced time available for the R&D and innovation that keeps products relevant.

Philip English: (13:08)
So with those three main issues in mind, here’s how we work at Robot Center. We’ve developed a five-step process that we follow to really get the full value of any robotics integration, which we call our ROI methodology. As a result, our customers enable more flexible resourcing, engage more efficient systems and optimisation, and explore robotics innovation across their business processes. Here are our five steps, quickly. First we Research the site and the right robotic tools to use. Then we Organise the full process and the team around it. We then Build the robotic system and develop the ecosystem for the customer. The next phase is Operate, where we keep things running with a support package. And then we move on to the Transcend stage, where we feel innovation really comes when businesses leverage their existing teams, and we support the business with their own product creation.

Philip English: (14:08)
Each of these steps has a process around it to reach the customer’s automation goals. Quickly talking about goals: Robot Center supports the 17 Global Goals. If you don’t know what they are, in 2015 world leaders agreed to 17 goals, officially known as the Sustainable Development Goals, or SDGs. The aim of these goals is to create a better world by 2030, with specific targets for that date. The Robot Center team all grew up watching sci-fi films, seeing versions of either utopian or dystopian futures, and we want to be the architects of a real, positive future that is powered and supported by robotic technology. We align to Global Goal number nine, which is to build resilient infrastructure, promote inclusive and sustainable industrialisation and foster innovation, and we support charities and robotics startups that align to this goal.

Philip English: (15:08)
I know we have a mixture of people on this call: investors, robot users, system integrators, robot vendors, startups, etc. So what I really want to do is just reach out to see who would be open to working with us on any partnering opportunities. We’re keen to extend our services to our customers and, at the same time, add more ideas and innovations to our Research process. So I’ll put my email address here, philip.english@robotcenter.co.uk, for anyone that wants to reach out and speak to me. And again, our speciality is our five-step ROI methodology for getting customers’ industries, infrastructures and innovations robot optimised to build a better future. I’m looking forward to hearing from you. Cool, and I’ll pass it back.

Thomas Andersson: (16:02)
Thanks for that, Philip. With that we’ll go straight to Henry Wood, co-founder at Inovo Robotics. You ready there, Henry?

Henry Wood: (16:11)
Thank you. Yes, here’s my demo screen. Can you see that?

Philip English: (16:20)
Yes.

Henry Wood: (16:21)
Great, thank you. Thanks, Thomas. So as Thomas said, my name is Henry Wood and I’m one of the founders of Inovo Robotics. Inovo is a fairly early-stage company; we set up in 2016 to develop robotic hardware and software for an opportunity that we saw in the manufacturing sector. I’ll start with a bit of background. The manufacturing sector is currently worth a staggering $25 trillion globally, and around 20% of this sector is mass production. Mass production has taken a really large part of its solution from automation and has been a heavy user of industrial robots for more than five decades now. But the remaining 80% of the manufacturing sector is actually batch or low-volume manufacturing, and this sector is still very reliant on manual labour for most of the machine tending, assembly, packing and handling tasks.

Henry Wood: (17:16)
Doing these tasks manually is expensive and inefficient, and most companies would like to try and use more automation. But the problem is that today’s industrial robots are a poor fit for the needs of batch environments. Today’s industrial robots generally need to be in cages, they are very much configured to do one single task for their full lifespan, and they take a lot of expertise and skill to actually set up in the first place. So Inovo set out to develop robots which are much more flexible, much easier to program and use, and much more versatile, able to do a really wide range of different tasks. We developed a modular system which allows the robot to be physically reconfigured for different tasks: you can remove link sections from the robot, you can put in longer links or shorter links, or take them out completely.

Henry Wood: (18:05)
This changes the physical reach and payload of the robot, and that makes it much easier to move the robot between different applications as needs change. And needs do change a lot, particularly in batch manufacturing, where companies are making different products month by month. They’re often switching between batches and reverting back to them later, but in between they can’t have one space permanently dedicated to one process when you’ve got customers and different batches to fulfil. So we’ve put a lot of effort into developing our software to make it very easy to use, with a very quick setup time and a much shorter learning curve, and to be very flexible. We’ve also developed an architecture which makes it much easier to expand the product in the future, so that we can add vision systems, new sensors and even new modules as the needs arise, allowing customers to buy the features they need but not have to buy everything they don’t want. So our product is essentially three in one, and this slide shows a comparison with one of the leading competitors, who have a range of three different robots to cover three different payloads and reaches; our robot basically covers all of those reaches and the majority of the payload configurations within the same system, just by changing the setup.
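
To make the reconfigurable-link idea a little more concrete, here is a minimal, purely illustrative Python sketch of how swapping link modules trades reach against payload. The module names, lengths, masses and the joint-capacity figure are invented for the example and are not Inovo specifications or Inovo’s API.

# Illustrative only: a toy model of a modular arm whose reach and payload
# change as link modules are swapped. All numbers are hypothetical.
from dataclasses import dataclass

@dataclass
class LinkModule:
    name: str
    length_m: float   # how much reach this module adds
    mass_kg: float    # structural mass the joints must carry on top of the payload

def arm_reach(modules):
    """Total reach is roughly the sum of the link lengths."""
    return sum(m.length_m for m in modules)

def arm_payload(modules, joint_capacity_kg=18.0):
    """A longer, heavier arm leaves less of the joints' capacity for payload."""
    structural_mass = sum(m.mass_kg for m in modules)
    return max(0.0, joint_capacity_kg - structural_mass)

short_config = [LinkModule("base", 0.25, 4.0), LinkModule("short link", 0.30, 2.0)]
long_config = short_config + [LinkModule("extension", 0.35, 2.5)]

for label, cfg in [("short", short_config), ("long", long_config)]:
    print(f"{label}: reach {arm_reach(cfg):.2f} m, payload ~{arm_payload(cfg):.1f} kg")

Running it shows the trade-off the presentation describes: adding the extension module increases reach but reduces the payload left over for the part being handled.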

Henry Wood: (19:30)
The typical applications that we’re addressing are quite wide-ranging, actually. You see companies making consumer products, making injection-moulded parts for the automotive and domestic markets, for electronics and things like this. And it’s not uncommon to see companies where you have an injection moulding machine and an operator standing next to it who has to physically reach inside the machine, lift out the moulded part, place it on a bench, close the door, and press a button to run the cycle again. This is a very repetitive, dull task which no one really wants to do, and it’s a poor use of skill or labour. You often find short cycle times as well, maybe between 20 seconds and a minute, where someone has to repeat that process, so it’s very difficult for them to go and do anything else useful in between.

Henry Wood: (20:19)
It’s an ideal application where a robot can be programmed very quickly to reach inside, grab the part, place it outside, close the door and send a signal to the machine to run again. A similar example is CNC, where companies are making metal parts and an operator generally has to put a blank metal part in, lift the machined part out and stack it; again, similar cycle times of between 20 or 30 seconds up to a few minutes, but very hard for anyone to do anything else in between. Our robots are also quite well suited to packing and assembly tasks, where electronics or components need to be put inside enclosures and fittings, screws and so on need to be inserted; there’s a lot of repetition there and really quite simple motions that robots are very well suited to. We also see spray painting, gluing and coatings, where a robot can be programmed to follow very precise paths.
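
As a rough illustration of how simple such a machine-tending cycle is to express, here is a hedged Python sketch. The Robot and Machine classes are hypothetical stand-ins for whatever interface a real cobot and machine controller expose (they just print actions); they are not any vendor’s actual API.

# Illustrative machine-tending loop for an injection moulding or CNC cell.
# The Robot and Machine classes below are invented placeholders.
import time

class Robot:
    def move_to(self, pose):      print(f"robot: moving to {pose}")
    def close_gripper(self):      print("robot: gripper closed (part grabbed)")
    def open_gripper(self):       print("robot: gripper opened (part released)")

class Machine:
    def wait_until_cycle_complete(self): print("machine: cycle complete")
    def open_door(self):          print("machine: door open")
    def close_door(self):         print("machine: door closed")
    def start_cycle(self):        print("machine: cycle started")

def tend_machine(robot, machine):
    """One tending cycle: wait for the machine, unload the part, restart it."""
    machine.wait_until_cycle_complete()        # block until moulding/machining finishes
    machine.open_door()
    robot.move_to("pick pose inside machine")  # reach inside the machine
    robot.close_gripper()                      # grab the finished part
    robot.move_to("drop pose on bench")        # place it on the bench or stack
    robot.open_gripper()
    robot.move_to("home pose")                 # clear the machine before closing the door
    machine.close_door()
    machine.start_cycle()                      # signal the machine to run again

robot, machine = Robot(), Machine()
for _ in range(2):                             # run two tending cycles back to back
    tend_machine(robot, machine)
    time.sleep(0.1)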

Henry Wood: (21:16)
One of the other benefits with this is that you can take a person out of a hazardous environment; you can stop them being exposed to toxic chemicals or to very dirty materials. We’re also seeing a lot of interest in logistics and fulfilment centres, where people are picking parts and placing them into packaging. They tend to have very seasonal work as well, so it’s a classic example of a process that’s repetitive but changes from month to month, and the ability to move the robot to a different task and reprogram it quickly and easily is very valuable. And we’re also starting to see some new emerging applications, things like kitchens and food preparation, which is partly a response to COVID-19, where there’s an increased desire to take people out of the loop in terms of handling food or handling products that could potentially pass on the virus.

Thomas Andersson: (22:12)
Excellent, that’s your time up. Do you want to wrap up now, Henry?

Henry Wood: (22:19)
Yeah, I’ll just quickly conclude with this graph. This is showing the coverage of more than 50 different applications we looked at, and how our one system could do all of these applications, whereas you’d need a wide range of other robots to do a similar thing. So thanks for the opportunity to talk, and I’m looking forward to being part of Tech London Advocates.

Thomas Andersson: (22:39)
Excellent, thanks for that, Henry. Very interesting, especially the food preparation now in pandemic times. So, quickly then, over to Jakub Langr, who’s the co-founder of Creation Labs, which touches on one of the points I find really interesting: generative adversarial networks. I can hardly pronounce it, but over to you, Jakub, to explain more.

Jakub Langr: (23:07)
Amazing, thank you so much for inviting me, and I’m happy to be part of Tech London Advocates. I’ll just share my screen in a second; hopefully everyone can see. Yeah? Everyone good? Cool, amazing. So, thank you for the introduction. As Thomas said, we’re Creation Labs, and we’re all about creating and engineering the right data. This is a good segue from some of the discussion we’ve had around vision and integrating vision systems into robotics. So, what’s the problem today? Especially as we move towards robots that can see, a lot of people quickly realise that deep learning is actually very data hungry. One of my favourite slides that really demonstrates this is a comparison of how long it took for some real-world breakthroughs to happen, whether that’s IBM’s Deep Blue beating Garry Kasparov, English-to-Chinese translation and vice versa, IBM’s Jeopardy! win, or GoogLeNet. All these advancements happened very shortly after the datasets were available for them.

Jakub Langr: (24:29)
But that was actually a really long time, on average almost 20 years, after the algorithms were first proposed; the fundamental algorithms of a lot of machine learning are actually quite old. The challenge really becomes the data. And you can hear the same thing from the captains of the industry, like Andrej Karpathy, who heads AI at Tesla. When we had a meeting with him, he said that within machine learning everything except for data is a commodity, and we really see that happening. The Economist has commented on this, and there’s industry research suggesting that even large enterprises struggle a lot with training data availability. So that’s an interesting problem. Most of the people building vision systems would really like to spend most of their time on the last few steps, model training, model tuning, algorithm development and operationalisation, but when you look at where the time actually goes, so much more effort goes into the cleaning, labelling and augmentation of that data.

Jakub Langr: (25:39)
I would personally have some comments on this Economist article, but I think it’s a useful starting point. And one of the key things here is also the acquisition of the data: how you’re even acquiring the raw images in the first place. So, quickly moving on, I think fundamentally there are three big problems. There’s sourcing of the data: sourcing the data with the right sensors, at the right time, capturing the right events that you need, and whether that’s a simple RGB camera, whether you have LIDAR in your stack, or whether you have some other way of doing depth estimation, all of these things are quite important. Then, of course, there’s the step of labelling, to explain to the vision system what’s going on.

Jakub Langr: (26:23)
And then lastly, there’s the step of curation and balance: making sure that the data you’ve captured actually aligns well with the proposed business problem. So one of the things we do is basically create these datasets, which our customers sometimes use exclusively as synthetic data and sometimes, more commonly, in conjunction with real data, where we basically create the perfectly curated dataset with all the sensors they need and all the labels. Now, I’m sure that sounds great, but how do we do it? One of the things that Thomas alluded to will become apparent in a second, but the basis for what we’re modelling always has to start with a 3D environment; it has to start with a 3D model of the scene.
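
As a small illustration of what the “curation and balance” step might involve, here is a hedged Python sketch that checks class balance in a labelled dataset manifest. The manifest format and the 15% threshold are invented for the example; this is not Creation Labs’ actual tooling.

# Illustrative only: check how balanced a labelled dataset is across classes.
from collections import Counter

manifest = [
    {"image": "frame_0001.png", "labels": ["car", "pedestrian"]},
    {"image": "frame_0002.png", "labels": ["car"]},
    {"image": "frame_0003.png", "labels": ["car", "cyclist"]},
    {"image": "frame_0004.png", "labels": ["car"]},
]

counts = Counter(label for item in manifest for label in item["labels"])
total = sum(counts.values())

print("class balance:")
for label, n in counts.most_common():
    print(f"  {label:<10} {n:>4}  ({n / total:.0%})")

# A curation step might flag under-represented classes for targeted
# synthetic generation, e.g. anything below a chosen share of the data.
rare = [label for label, n in counts.items() if n / total < 0.15]
print("under-represented classes to synthesise more of:", rare)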

Jakub Langr: (27:13)
Here we can get things like depth sensors, semantic segmentation (which pixel belongs to which object), 3D bounding boxes, 3D position, synthetic LIDAR; we’ve done all these types of sensors, and if there’s some custom sensor that someone needs, we can integrate it, because it’s a simulation and it’s easy for us to produce pixel-perfect semantic maps or pixel-perfect depth values. But that alone is not good enough. This set of techniques, usually called sim-to-real, started in robotics, because, as I’m sure a lot of you realise, the need there is the highest. But what people quickly realised is that it doesn’t work particularly well on its own, because it doesn’t generalise from the synthetic world onto the real world.

Jakub Langr: (28:04)
So what we do is take GANs, as Thomas alluded to, generative adversarial networks, and basically use what’s called domain adaptation to learn from a sample of real data and apply it to the rendered scene to make it essentially photorealistic. This is an example I’ll quickly show you of what we have done in autonomous vehicles; we have also done some work in robotics, and I’m happy to talk about it afterwards. Basically, here we’ve taken a computer render and made it photorealistic based on a German dataset, so hopefully this is clearly more realistic. There are some metrics too; you can talk to me about them afterwards, but they’re consistently better than simulation alone. I’ll just close off there. If you want to find out more, I’m happy to entertain any points and discussions. I’ll leave it at that. Thank you.
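
For readers curious what GAN-based domain adaptation looks like in code, here is a heavily simplified PyTorch sketch of an adversarial refiner that nudges rendered images towards the distribution of real photos. The tiny networks, the random tensors standing in for images, the loss weighting and the handful of training steps are all illustrative assumptions, not Creation Labs’ actual models or pipeline.

# Minimal, illustrative sketch of GAN-based sim-to-real domain adaptation:
# a refiner G maps synthetic renders towards the look of real photos, while
# a discriminator D tries to tell refined renders from real images.
import torch
import torch.nn as nn

G = nn.Sequential(                      # refiner: synthetic image -> "realistic" image
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 3, 3, padding=1), nn.Tanh(),
)
D = nn.Sequential(                      # discriminator: image -> real/fake score map
    nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
    nn.Conv2d(32, 1, 4, stride=2, padding=1),
)
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

synthetic = torch.rand(8, 3, 64, 64) * 2 - 1   # stand-in for rendered frames
real = torch.rand(8, 3, 64, 64) * 2 - 1        # stand-in for real photos

for step in range(5):                   # a real system trains for many more steps
    # Discriminator update: push real images towards 1, refined renders towards 0.
    refined = G(synthetic).detach()
    d_loss = bce(D(real), torch.ones_like(D(real))) + \
             bce(D(refined), torch.zeros_like(D(refined)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Refiner update: fool the discriminator while staying close to the render.
    refined = G(synthetic)
    g_loss = bce(D(refined), torch.ones_like(D(refined))) + \
             0.1 * (refined - synthetic).abs().mean()    # content-preservation term
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    print(f"step {step}: d_loss={d_loss.item():.3f} g_loss={g_loss.item():.3f}")

The content-preservation term is what keeps the refined image recognisably the same scene (and so keeps the simulator’s pixel-perfect labels valid) while the adversarial term changes only its appearance.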

Thomas Andersson: (29:01)
Very interesting, a complex subject. With that, we jump over to Sameer Puri, who’s head of sales at Eiratech. Over to you, Sameer.

Sameer Puri: (29:15)
Thanks, Tom. So glad to be a part of Tech London Advocates; thanks to Tom again, to Russ and to Phil. I’m just going to share my screen. Okay, can you guys see this? Perfect, great. So my name is Sameer Puri and I represent Eiratech Robotics. We provide autonomous automation solutions; in terms of what we do, we provide end-to-end solutions. So whatever you need in terms of robotic automation, from concept design, prototyping, simulations, integration, project management, et cetera, we provide the full spectrum of solutions. Everything we do is in house; we don’t outsource any of our core services. Electrical and mechanical engineering, safety certification, software, operations and post-sale support are all performed in house.

Sameer Puri: (30:07)
We’re a growing team, currently more than 45 full-time employees. We have a pan-European clientele, which is where most of our large to medium-sized clients are based. We are safety certified, we’re CE compliant, and we install to rigorous EU standards. In terms of our solutions overview, let me give you a quick look at our portfolio. On the hardware side, we provide goods-to-person or goods-to-order robots; these are essentially robots operating within an enclosure, moving at very high speeds of about three to four metres per second to ensure speed and efficiency. We also provide safety-rated robots that can work outside of enclosures and drive safely around humans for a variety of tasks, from material transport to line feed, et cetera. We also provide stations, customisable racks and chargers as part of this hardware portfolio.

Sameer Puri: (31:03)
We also have additional peripherals for add-on operations. So whether you need to pick to a sort wall with a put-to-light system or a light pointer, or whether you’re doing cluster picking or batch picking, we can step in with a variety of peripherals that complement our solutions. In terms of the software, we’ve got embedded software, of course, working at the most basic level with the hardware. Over and above that, we have our own fleet management system called RMS, and we have our own in-house-designed extended warehouse management system. Wrapping all of that together, we have the user interface and something that we call Eiracore. This is an operational control and analysis dashboard: it provides you with a status and statistical update on the entire system. Everything from tasks, picking, replenishment, picker efficiency, robot service and charge, et cetera, is provided in this dashboard for full operational control. In terms of applications, there are quite a few here, so let me just talk a little about our kitting operations in manufacturing. If you have inventory from which you need to pull items to build kits, which will then be taken to a line feed or assembly, this is something we can handle.
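
To give a flavour of what a fleet management layer does, here is a deliberately simple Python sketch that assigns each new pick task to the nearest idle robot. It is an illustration of the core idea only; a real system such as Eiratech’s RMS also handles traffic, charging, batching and priorities, and the robot positions below are made up.

# Illustrative only: the simplest possible fleet-management rule,
# assigning each new task to the nearest idle robot.
import math

robots = {                                       # robot id -> position (m) and idle flag
    "R1": {"pos": (0.0, 0.0), "idle": True},
    "R2": {"pos": (12.0, 3.0), "idle": True},
    "R3": {"pos": (5.0, 8.0), "idle": False},    # busy, e.g. charging or carrying a rack
}

def assign(task_pos):
    """Pick the nearest idle robot for a task at task_pos, or None if all are busy."""
    idle = [(rid, r) for rid, r in robots.items() if r["idle"]]
    if not idle:
        return None
    rid, robot = min(idle, key=lambda kv: math.dist(kv[1]["pos"], task_pos))
    robot["idle"] = False                        # mark it busy until the task completes
    return rid

print(assign((10.0, 2.0)))   # -> 'R2', the closest idle robot
print(assign((1.0, 1.0)))    # -> 'R1'
print(assign((4.0, 4.0)))    # -> None, every robot is now busy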

Sameer Puri: (32:24)
When you’re talking about line feed or flexible manufacturing, this is something that we can actually manage; we’ve been doing this for a particular automotive plant as well. Intralogistics, so moving, storing, sorting and shipping of goods, is one of our key specialties. On the retail side, if you’re looking at online grocery, so if you have a dark-store setup, or you’re looking at back-of-store operations such as marshalling, or you want to get into micro-fulfilment, which a lot of companies are talking about, this is something we provide as one of our key specialties. And last but not least, when you’re talking about e-commerce fulfilment, this is what the goods-to-person or goods-to-order system was designed for, particularly in a competitive market. So when you’re looking at fast and furious consumers that want next-day delivery or fast returns, this is something we can quite easily manage.

Sameer Puri: (33:19)
And we have demonstrated KPIs when it comes to high pick rates and accuracy. We take a consultancy approach: we pay a lot of attention to detail, and we spend a lot of time listening to our customers to analyse their pain points. We do data analysis, we define KPIs, we review processes, and essentially we’re driven by you. So whatever your challenges are, we have a portfolio of solutions that can tackle them. My details are provided on every slide, so give me a call or drop me an email and let’s talk about how we can help you go autonomous. And that’s it from my end.

Thomas Andersson: (34:06)
Okay, thanks, Sameer. Just as a quick reminder to everyone, if you look at the bottom of the screen you should have a Q&A and a comment box as well. If you want to put a question directly to any of the presenters, you may do so, and hopefully we’ll do a Q&A session at the end as well. With that, I want to hand over to Mathieu Scampini, who is the UK sales manager for Fives Intralogistics. I never know how to pronounce that, so hopefully Mathieu can help us with that.

Mathieu Scampini: (34:40)
Yes, hello everybody. I hope you can see my screen here. Yes? Thank you. So, hello everybody, I’m Mathieu Scampini, sales manager at Fives. Thanks first to Tech London Advocates for hosting us today and giving us the opportunity to introduce our newest solutions in logistics. At Fives, one of our focuses is the parcel world. This is an exciting market, and the UK alone is at the top, being number one in parcel volume per capita and one of the largest e-commerce markets, behind only China and North America. It’s also about 20 billion pounds of turnover just for the top 25 courier companies. So yes, the UK is a huge logistics platform with massive needs for volumes that have to be delivered in record times. Businesses, from what we see, usually focus on labour efficiency, time savings, improved accuracy, and also tracking and scalability. So here you start to understand why automating the sorting process is a must.

Mathieu Scampini: (35:57)
At Fives, we have a wide range of solutions for logistics, with all sorts of material handling solutions and software, so we can address all challenges for our partners. On this slide you have an overview of the major partnerships that we have. A few years ago, surveying the market, we understood that 75% of the volume in that sector is small items, and 95% is less than 25 kilos. We also saw that logistics players require automation with a smaller footprint that can be moved easily; that is scalable in terms of capacity and destinations, so the investment can be scaled according to the growth of their business, which is very important; and that is flexible to all sorts of items, whether that’s B2B or B2C. And, as is very important especially in the UK, they need a higher level of service and the automation that supports it.

Mathieu Scampini: (37:14)
So, supporting businesses to grow, we naturally move from a scheme with a low level of automation for low capacity to high-speed sortation for high capacity. Until now, this has tended to unbalance sites: on small sites with small volumes you will be less efficient than in these mega hubs, even though any logistics company or site would aim for an agile, flexible organisation and high efficiency. We felt we could do more for the business and for our customers, and that’s why we have imagined something different, which I will introduce now. Our teams at Fives came up with the Geni-Ant solution. It’s an autonomous mobile robot. It goes fast, to deliver throughput and cover distances. It’s modular, to adapt to all sorts of scenarios. It needs no physical line on the floor, so it can be installed on any warehouse floor. It’s energy efficient, it’s intelligent, avoiding traffic jams thanks to its traffic manager, and it can handle all sorts of items, from very small, even non-conveyable ones, up to a length of 1.2 metres.

Mathieu Scampini: (38:37)
On this slide you have an example of the latest design of a sorting centre based on that technology, the Geni-Ant: it sorts more than 5,000 items per hour to 200 destinations with only about 50 robots. So you can imagine all the possibilities in the kinds of applications Sameer or Philip were talking about before. Just to conclude, this solution responds to today’s needs because it can be set up in a very short time, it’s easy to install and it’s compact, so it provides a level of automation, from low to high throughput, that has never been seen before. And it’s also future-proof for tomorrow’s challenges, because it’s scalable to expand, it’s easy to reconfigure, and it allows you to invest just in what you need, growing together with your business. Thanks all for your attention, and thanks to the other presenters for making this journey even more interesting. If you have any question or inquiry, I’d be more than happy to engage with you; just let me know, you have my contact details. Thanks again.
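
A quick back-of-the-envelope check of those quoted figures, assuming for illustration that the robots share the work evenly:

# Sanity check of the quoted Geni-Ant figures (even workload assumed).
items_per_hour = 5000
robots = 50

per_robot_per_hour = items_per_hour / robots     # 100 items per robot per hour
seconds_per_sort = 3600 / per_robot_per_hour     # 36 seconds per item, per robot

print(f"{per_robot_per_hour:.0f} items per robot per hour "
      f"= roughly one sorted item every {seconds_per_sort:.0f} s per robot")

In other words, each robot would need to complete a sort roughly every 36 seconds, which is the kind of cycle time the induct-drive-drop loop of a sortation AMR is built around.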

Thomas Andersson: (39:51)
Excellent, thanks, Mathieu. With that, I think we jump over to the very exciting Bristol Robotics Lab, which is an incubator that is part of two universities, if I’m not wrong. Mark, over to you to explain it in more detail.

Mark Corderoy: (40:21)
Good evening. Thank you for the opportunity to present to you tonight. My name is Mark Corderoy and I’m the incubation manager at the Bristol Robotics Laboratory. What I want to do over the next couple of slides is explain a little bit about what we do as a laboratory and how we are helping the next generation of robotics and automation startups. The laboratory is run jointly by two universities, the University of the West of England and the University of Bristol. We have around 300 researchers and academics based there, very active master’s courses and a centre for doctoral studies. As a laboratory we’re a very horizontal facility: we don’t only specialise in very narrow fields of robotics, like some universities do. We have a very broad spectrum, and what that means from an enterprise and commercialisation point of view is that we’re well placed to work

Mark Corderoy: (41:34)
with lots of different technologies, bringing technology together to create solutions. What we have done within the laboratory is to align with the University Enterprise Zone; we’re actually neighbours, we sit in the same building, and the enterprise zone is a very vibrant community of about 85 startup companies split over three incubators. I’ll talk about those on the next slide, but very importantly they’re backed by a number of facilities that provide startup support, which is pretty unique. We have organisations like the Robotics Innovation Facility, which provides free consultancy and free support to startups; that could be access to a five-axis CNC machine, it could be modelling, it could be simulation and prototyping. We have the Health Tech Hub, which is looking very much at the new generation of health sensors, and that’s a big crossover with the IoT space.

Mark Corderoy: (42:46)
And then, because of the nature of what we are, we attract networks: we’re the base of the West of England Robotics Network, and other sector networks are based there too. So in normal, non-COVID times it’s a very vibrant place for startups to create new businesses and to interact with similarly minded entrepreneurs. To that end, we have a range of different entrepreneurial options. We have a graduate incubator for founders who have recently graduated from universities, and that’s universities from across the UK. We have the BRL technology hardware incubator, which is pretty unique; hardware is hard, as they say, and we provide a lot of support to companies in that space. And then finally we have Future Space, our grow-on space, which is our commercial offering.

Mark Corderoy: (43:43)
As an organisation we don’t take equity positions, and we don’t charge rent for the first one to three years, depending on the pathway companies come to us through, but we provide lots of support. This is a range of some of the companies we’ve helped recently, people who have grown and flown the nest. You might recognise Open Bionics, Perceptual Robotics and HomeLINK, an IoT smart homes company that has recently been acquired. What we’re trying to do as an incubator is get people to join us and then leave us successfully funded.

Mark Corderoy: (44:32)
My presentation seems to have ended prematurely. The last thing I was going to talk about was an example of how we’re helping the startup community. We’re involved in a new project called UMBRELLA, and it’s one of the most complicated acronyms I can remember, so without a slide I’m not going to try and remember it, but this is the first public IoT and 5G network, which we’re creating in the north of Bristol, in a five-mile corridor between the University of the West of England and the Bristol and Bath Science Park. We’re putting up around 200 IoT nodes, with thousands of receivers and sensors, to create a public test bed for companies to come and experiment with new 5G technology and new industrial IoT technology. We’re also having a couple of 5G masts put on campus to be part of that. It’s an initiative the university is part of together with the West of England local authorities and Toshiba. Hopefully that gives you an insight into what is a very vibrant startup community, and I think you’re about to hear from two of those startups next.

Thomas Andersson: (45:47)
Thanks, Mark. It would be nice to come down and have a look at some of your startups at some point. With that, I want to hand over to one of the incubator startups, which is Indus Four, and Arthur is the founder, I believe. Over to you, Arthur.

Arthur Keeling: (46:06)
Wonderful, thank you very much, and thank you very much for inviting us to join Tech London Advocates and for this evening of talks. I’d like to introduce you to Indus Four. Just to start, speaking about automation, I’d like to speak briefly about economics. The last 10 years have been about as bad as it gets for UK productivity, and it’s had a really detrimental effect on living standards for almost everyone within this country. This has been true across the West. This is a chart from Deloitte showing American productivity, where generations since the war have been able to expect a higher living standard than their parents, until around the financial crash, since when there has been close to zero growth in productivity. The fourth industrial revolution was promised as the solution to help lift this with new innovations and new technologies. However, as we all know, coronavirus has made the future distinctly uncertain.

Arthur Keeling: (47:05)
We don’t quite know what lies ahead of us now, and productivity looks uncertain, that’s for sure. But coming back to automation: we know automation works, as we’ve seen from a few of the people presenting tonight, but we believe it’s inflexible, expensive and complex for most companies out there, and it only really works for larger enterprises and people who have mass production requirements. We believe that, to have an end-to-end supply chain that is fit for purpose, automation and robotics need to be much more widely used, need to be easier to use, and also need to be offered with a different business model.

Arthur Keeling: (47:44)
That’s why we’ve created System INX. This is our cyber-physical operating system: it’s real time, it’s web native, it’s edge enabled and it’s vision aware. We also have an open design, so we’re able to work with third parties and other vendors, and that enables us to increase what we’re able to do as a business. We’re using the latest technologies in flow programming and the Internet of Things, not just to store data but to make that data actually useful, for people and also for machines to use on the shop floor. And we’ve worked really hard to ensure that we have flexible deployments of our systems. This means working with organisations large and small, whether they’re dealing with discrete or batch manufacturing processes, and enabling them to use it too. We’re also using machine learning and AI to ensure we’re pushing the technology forward and keeping it on the absolute cutting edge, whilst trying to deliver the best technology to people.

Arthur Keeling: (48:43)
And finally, given the range of robotic arms out there, we keep ourselves agnostic. Each arm is there to do different functions and some have different roles, and we want to work with lots of them so we’re able to deliver a wide-reaching service. This, we believe, is the collision of the cyber and physical worlds that we’re looking forward to in the future. I’d just like to say thanks very much for inviting us to talk, and if anyone has any questions afterwards or would like to speak, please get in touch. Thank you to Tech London Advocates again.

Thomas Andersson: (49:17)
Excellent, thanks again, Arthur. With that, last but not least, Snir Benedek from Benedex. Over to you, Snir.

Snir Benedek: (49:29)
Thank you very, very much. Alright, so thanks for the opportunity. Hi everyone, my name is Snir, I’m the founder of Benedex, and we’re a young startup in the UK developing a market-leading, disruptive propulsion concept for robotics. Thousands of robots are being designed as we speak; the market for unmanned platforms was estimated at about $3 billion at the beginning of 2020 and is believed to be on a sharp growth trend due to COVID-19. We seek to boost that trend: we want to make high-end robotics more accessible to everyone. Every mobile platform that is made is made for a certain function. Robot developers are experts on the payload

Snir Benedek: (50:28)
that does the function the robot is supposed to do, but they are not motion experts. Why, then, does the wheel get needlessly reinvented for every new robot that gets rolled out? This is the problem Benedex addresses with an industry-changing solution. Our patent-pending system is made of a varying number of motorised wheels, which form the drivetrain, with a pre-programmed central controller. It can be physically reconfigured, so it connects to every platform, allowing robot creators to focus on what their machine actually does and not on engineering the motion. The Benedex system fits a huge range of applications because it is modular, it is easy to use, and it is built for endurance and performance. The wheels in this case are just 22 centimetres across. Each individual wheel weighs four kilos, but each one supports up to 200 kilos of load, outputting 50 kilos of thrust at 1,500 watts,

Snir Benedek: (51:38)
and reaching speeds of up to 25 kilometres an hour. This output is absolutely unique in the market and it will outperform any competing solution. Everything is direct drive, there are no gears, so you’ll be getting the highest power efficiency and reliability. Unlike others, our solutions are made to operate anywhere, on all terrains, unlocking new possibilities for smaller developers and research facilities. In this example, our harsh-environment version includes active cooling, to sustain higher power output for longer periods of time. We have pretty much redefined flexibility to make new designs possible: our wheels attach externally and conform to your design, your payload size and your shape. Our system contributes to cleaner, greener, sustainable transportation. The solution levels the playing field for developers of mobile platforms, and with it, vehicle development becomes easier, cheaper and faster. We are now rolling out the first prototypes and we’re looking for proof-of-concept collaboration partners. Would you like to try our system with your project? Or if you’re just interested to learn more, drop us an email, come say hello. We’re keen to collaborate with AGV and robot builders, and this is the best time, so contact us today. Thank you very, very much.
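
Those wheel figures hang together through the basic relation power = force x velocity. Here is a hedged sanity check in Python, assuming (our assumption, not Benedex’s statement) that the 1,500 W figure is a continuous rating and the 50 kgf figure is peak thrust available at low speed rather than at top speed.

# Rough sanity check of the quoted wheel figures using P = F * v.
g = 9.81                                  # m/s^2

power_w = 1500.0
top_speed_ms = 25 / 3.6                   # 25 km/h is about 6.9 m/s

thrust_at_top_speed_n = power_w / top_speed_ms
print(f"at 25 km/h, 1.5 kW supports roughly {thrust_at_top_speed_n:.0f} N "
      f"(about {thrust_at_top_speed_n / g:.0f} kgf of thrust)")

peak_thrust_n = 50 * g                    # the quoted 50 kgf peak
speed_at_peak_thrust = power_w / peak_thrust_n
print(f"the full 50 kgf of thrust is available up to roughly "
      f"{speed_at_peak_thrust:.1f} m/s ({speed_at_peak_thrust * 3.6:.0f} km/h) at 1.5 kW")

On those assumptions, the full 50 kgf is a low-speed (roughly up to 11 km/h) figure, while around 22 kgf of thrust remains available at the 25 km/h top speed, which is the usual shape of a direct-drive motor’s thrust-speed curve.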

Thomas Andersson: (53:17)
Excellent, thanks for that, Snir. We now have the opportunity to do a quick

Thomas Andersson: (53:26)
Q&A; I’ve seen a few people have posted questions, and if we can answer those, that’d be great. First of all, if you want to participate in any future TLA Robotics webinar, contact us at tla.robotics@gmail.com and we’ll respond to you as quickly as we can. This is open to companies from all over Europe as well as the UK. Do we have any general questions? There are quite a few specific questions for individual panellists, which I think each of the panellists can respond to in writing as we go along. Unless we have any questions for us specifically, you’re more than welcome to contact any of the presenters through the email address shown at the start.

Thomas Andersson: (54:46)
We have one question there, a question for Benedex; I’m sure you can reach out to those guys directly. With that, I want to thank all the participants for your great presentations, and let’s keep in touch; you can follow us on LinkedIn. Actually, we do have a really interesting question from Mike, perhaps one for Mark, the two startups, or even Inovo: what is the most significant barrier a robotics startup faces in the UK? Any answers, any takers for that?

Mark Corderoy: (55:37)
I think they have all the problems of ordinary software startups, but they also have the problem of being both a hardware and a software startup. In a software startup, as they grow, the first four or five people tend to be almost clones of the early founders, writing code and things like that. Quite often in a robotics startup, the first five or six people need completely different skill sets: mechatronics, electronics, embedded systems, full-stack developers. And you need access to resources and money, which, as most people realise, is never easy to come by.

Thomas Andersson: (56:24)
Any of the startups? Henry, perhaps you have a view?

Henry Wood: (56:29)
I’d echo what Mark said, really. I think one of the big challenges we’ve found is that getting to a commercial MVP that you can actually sell to a customer can be a huge amount of development work with hardware, whereas with a software product you can often string together an MVP using existing code in a very light framework. I’m not saying there isn’t a huge amount of work to do with a software product to get it fully featured, but you can often get to an earlier MVP. With a hardware business, with our arm, until you’ve got at least six joints all working at high precision, you basically have no useful functionality for anyone. So we’ve spent several years and several million pounds just getting to a first product that’s really got any value to anyone. There’s this huge funding gap that you’ve got to get over with a hardware business.

Thomas Andersson: (57:32)
Okay. We also had a question about the long, iterative development; I think you kind of answered that, Henry. Do any of the other startups from the Bristol Robotics Lab want to answer a few questions?

Philip English: (57:52)
Yeah, I would certainly echo Henry’s points. It is exceptionally challenging when you’re dealing with customers where you cannot just put an arm or a system into play, because sadly a bug or an error there costs quite real money if you’re working with a supplier and handling goods for them. When things go wrong in the real world, it seems to have a much bigger impact on companies than a small software glitch in an app or something like that, which adds to the problems. And then hiring such a diverse talent base, from vision to hardware to electronics, also presents challenges, which I think software companies probably don’t face in the same way. It’s nice to see things like TLA Robotics coming through, to hopefully raise awareness of that, but also to hopefully break down some of the barriers that robotics companies like us face.

Thomas Andersson: (58:46)
Okay. There’s also a question about how we intend to raise awareness: what will TLA do to influence technology adoption in industry? We’re just a small part of that. We want to improve adoption of automation and robotics in particular, though there are loads of other technologies involved. The main actions we will take are, first of all, to create networking events. We will do our best to invite the best or most interesting startups, as you’ve seen present today, but we’ll also invite people from government, from academia, and people who actually use robotics. I’ve seen a few people sign up who are robotics users, and hopefully potential robotics users as well. I hope that answers that, and we’re hoping to create more interesting events in the future as we go along. Any other questions for the panellists?

Thomas Andersson: (01:00:04)
No? So with that, I think that’s it. If you have any further questions or you want to reach out to any of the panellists, the email, which I hope is on the screen right now, is the one to use; email us and we will forward it on to the panellist in question if you didn’t catch their details, so we can connect you from there. With that, I want to extend a big thank you to all the presenters and all the companies that we had here, and have a really good, continued warm evening in London.

All panelists: (01:00:38)

Thanks Thomas. Well done. Thanks everyone. Thanks very much. Thanks. Thank you. Thank you.   

TLA Robotics: https://www.techlondonadvocates.org.uk/ Tech London Advocates Robotics: TLA Youtube Philip English: https://philipenglish.com/  Sponsor – Robot Center: http://www.robotcenter.co.uk