HausBots interview with Jack Cornes

Hi guys, Philip English here from philipenglish.com. Welcome to the Robot Optimized Podcast, where we talk about everything robotics related. For our next episode, we have Jack Cornes, who will talk to us about "Wall Climbing Robots".

 PHILIP

Welcome to the Robot Philosophy podcast, where we keep you up to date on the latest news, reviews, and anything new in the robot world. Hi, guys. Philip English here, Robot Phil, with another interview for you, just to quickly learn some more about the robot companies out and about. So, today we’ve got Jack Cornes from HausBots, and we’re going to give you a quick overview to see how they work. So welcome, Jack.

 JACK

Hi, how are you doing?

 PHILIP

Fine. Thanks for your time today. Just to start with, it’s probably worth getting an intro for yourself and a little profile on the company, if that’s okay.

 JACK

Yeah, sure, no problem. Hi, I’m Jack. I am the CEO and one of the founders at HausBots. In a nutshell, HausBots builds robots to protect and maintain buildings and infrastructure. We started the business when my co-founder was asked by his parents to paint their house. He was up on the ladder painting his parents’ house, thinking to himself, blimey, it’s the 21st century, I’ve got an engineering degree, there’s got to be a better way of doing this. So, he made our first ever robot as nothing more than a bit of fun in his garage to help him paint his parents’ house. We’ve known each other since we were about twelve years old. So, we were having a catch-up one day and he was telling me about this idea he had come up with, and a bit of a Eureka moment happened for me. At that time, I was working in big tech, selling mainly software-based automation. And the Eureka moment was seeing that the task his robot solved, which was effectively making work at height safer, wasn’t really being matched in the market, there wasn’t really a product out there, and I knew how much money was being spent on automation at large. So, we decided it would be a perfect opportunity to get together. I brought the parts of the puzzle that he was missing, sales, commercial, that sort of stuff, and we started the business about three years ago. Fast forward to today and we’ve morphed from a robot to help someone paint his parents’ house to a robot that can make all types of work at height significantly safer and significantly more cost effective. So, what we’ve actually got is a really clever climbing robot that can climb any surface you can imagine. And then we can integrate payloads up to 6 kg. So we do all sorts of projects, from concrete inspection through to painting through to metal inspection, you name it. We can work out a way of getting our climbing platform to do it.

 PHILIP

Right, that’s a fantastic overview. Thanks, Jack. That’s a good spec on the company. So I suppose the first question that you must always get: you say it climbs any material, so what stops it slipping on very wet and slippery material? Is it something to do with the way that the device sucks onto the building? Is that how it works?

 JACK

Yes. So, we use a particular type of aerodynamics that’s found in Formula One called the ground effect. The way that it works in Formula One is that the underside of the car is designed in such a way as to enhance the airflow under the vehicle and create a large low-pressure region. We use that same principle, but with a fan. So, this fan is moving the air to create a low-pressure region underneath our chassis. The reason that’s clever and different, and what our patent is based on, is that it means you can create large amounts of suction force without sealing against your surface. Most of our competitors will use, let’s say, a vacuum cup, suction cup, something like that. But our robot doesn’t need to seal; we’ve got an almost two-inch gap underneath the robot, and it can still generate these suction forces against pretty much any surface. We use that in combination with extremely high-friction tires, which means that roughness of surface, obstacles, wet surfaces, or smooth surfaces don’t really affect the robot.
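As a rough back-of-the-envelope sketch of the principle Jack describes (the numbers below are illustrative assumptions, not HausBots’ actual specifications), the holding force from a fan-driven low-pressure region is simply the pressure differential multiplied by the chassis area:

```python
# Rough ground-effect suction estimate: F = delta_P * A.
# All figures are illustrative assumptions, not real HausBots specs.

def suction_force(delta_p_pa: float, area_m2: float) -> float:
    """Holding force in newtons from a pressure differential (Pa)
    acting over a chassis area (m^2)."""
    return delta_p_pa * area_m2

# Assume the fan sustains a 3 kPa pressure drop under a 0.1 m^2 chassis.
force_n = suction_force(3_000, 0.1)   # 300 N
payload_kg = force_n / 9.81           # ~30.6 kg of equivalent holding force
print(f"{force_n:.0f} N is roughly {payload_kg:.1f} kg of holding force")
```

Because no seal is required, the fan only has to keep evacuating the air that leaks in through the gap under the chassis, rather than maintain a perfect vacuum against the wall.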

 PHILIP

Right. And then for the actual painting of the walls, then I’m assuming that there’s like an extra line that comes off the robot with whatever paint that customer wants. Is that how it works?

 JACK

Yeah. So, the robot can be integrated with any attachment you can imagine. It’s got all sorts of integration ports, just like you’d find on your laptop, USB and communication and all sorts of things. So, you basically plug the robot into the attachment you want to use, which, if we’re talking about painting, is just a paint gun. Attach the paint gun to the mounting point on the robot, plug them in for communication, and then that paint gun is supplied with a separate feed.

 PHILIP

Right. Could you put a camera on there for building inspection? Because I’ve read in the news lately that there are a lot of buildings around London that potentially need to be inspected more. So could that be another feature of the robot?

 JACK

Yeah, it’s already a feature that we use extensively, actually. We’ve got a 4K pan-tilt-zoom camera that can sit on the front of the robot. And again, because it’s easily integratable with all sorts of different things, we could even change that camera, upgrade it, use thermal cameras, whatever you fancy. But, yeah, we already use the camera quite extensively, especially in areas where it’s extremely difficult to get a drone permit or you can’t fly a drone, which is the case in most cities. Around most buildings, around most road networks, drones can’t be used. So, the camera on our robot fills a nice gap there.

 PHILIP

Right. And you just answered my other question, actually, because I was going to ask about drones, but with the understanding of the health and safety side of drones, that makes sense.

 JACK

Yeah, it’s a licensing thing, it’s health and safety, it’s a permit thing. More fundamentally than that, our robot was specifically designed for tasks where you require contact with the surface. Drones today are pretty much only good for cameras, for photos. And yes, you can use different types of thermal cameras or whatever, but that’s about as far as you can get with a drone. Our robot lends itself perfectly to tasks where you need to be touching the surface. So, a radar survey, an ultrasonic survey, photographing something extremely close, painting, fixing a particular thing, those sorts of tasks, which is where the functionality is enhanced versus a drone.

 PHILIP

And that’s the advantage, really, because you’re doing a physical job as well. You can do visual inspecting, but you’re also actually doing a job, if it’s painting or something like that, actually repairing the building. So, yeah, I can definitely see the advantage. What’s the highest it can go? Or can you go as high as you like, as long as the power cord is long enough?

 JACK

Well, there are two versions, actually. We have a 30-meter tethered version, so customers will often use that if they want to run continuously. You can power it through the tether and have unlimited power up to 30 meters. Or you’ve got a battery version, and the battery version is unlimited in height: as long as you can carry the battery, you can take it up to any height, but it’s limited by time. You’ve got about 25 minutes of runtime, which is somewhere on par with a drone. So, it depends on what the customer wants to do.

 PHILIP

Right, okay. I suppose, what’s been the trickiest building you’ve done so far? What’s been the tallest one you guys have done?

 JACK

Oh, gosh. We did the QEII Bridge in Dartford. The robot was climbing up one of the support piers, and we undertook a visual survey and a radar survey. That bridge is huge and was an extremely exciting structure for us to work on.

 PHILIP

Yeah, that’s amazing. Well, I can definitely see the future potential, because obviously people want their buildings painted or inspected. What’s the next step for you guys? You’ve got the two different options at the moment. Is it really just to expand the options, expand the lines?

 JACK

Yeah. We’re constantly improving on the fundamental physics of the thing. You can always create more suction, you can always overcome a slightly larger obstacle, all these sorts of things, so that will just happen naturally. But I think the biggest piece of work we’re doing is constantly upgrading the portfolio of items that we can integrate with, because ultimately the climbing robot itself, whilst it is our bread and butter and our special thing, is pretty useless on its own. You can’t really do anything with just a climbing robot. So, it’s all the attachments; that is what actually makes it do useful, productive work. We’ve got all sorts of projects on at the moment just to make that base robot integrate with as many different attachments as possible. And really the nicest analogy to draw is a tractor. A tractor is pretty useless if all it can do is drive around fields. What you need is for your tractor to be able to bolt onto your plow, your harvester, your tree-cutting machine, all of these different things that your tractor then powers. And that’s kind of how we’re seeing our robot.

 PHILIP

Yeah, I like that analogy. That’s really good. So, the max payload currently is 6 kg, because what I was thinking is that I suppose the next step would be to have some sort of cobot arm on the back of it with a tool to some degree. If you wanted more weight, I’m guessing it would just be a bigger robot; you’d have to size everything up to fit something heavier. I know that in the cobot market the robots are getting lighter and lighter all the time, but I don’t think we’ve got, unless it’s a university one, a six-kilogram one yet. So, if you wanted more payload, would it just be size, or could you add more technology in there so you can up the weight without doubling the size?

 JACK

Yeah, size is one way of doing it. It’s kind of cheating, but it is one way of doing it. What we mainly focus on for payload improvements is aerodynamic design. We’re doing pretty much Formula One levels of aerodynamic design, and in the same way that a Formula One car gets about a second quicker every year because they found a new wing which can generate downforce in a particular way, it’s the same sort of iterative circle that we go through. That being said, and this is where the first part of your question came from, we’ve recently created a partnership with a manufacturer of basically mini cobot arms. We’ve got an arm that weighs 3 kg, has six degrees of freedom, and attaches to the front of our robot. It only has a small payload, obviously, but we’ve already started to do much more precision manipulation tasks through that three-kilogram arm.

 PHILIP

Wow, that’s really impressive for 3 kg. I don’t think I’ve ever seen one that small before. I suppose it’s almost going towards the sort of nanotechnology standard they talk about. And I guess the smaller the robots we can build to do the work, the more you’re going down that road.

 JACK

Yeah, that’s right.

 PHILIP

Tools that we’ve seen from companies like Universal Robots, Robotiq, and OnRobot: they are producing a lot of end effectors at the moment. That’s the trend that we’ve seen. So, you’ve had a series of years where people bought the arm and weren’t too sure what to do with it. It’s got a gripper, it’s got a pick-and-place function, but now we’ve seen they’ve got sanders on the ends, they’ve got drills, they’ve got all sorts of tools. So, you can imagine the most dangerous job that you can get is being on the outside of a building, trying to screw something in. So you guys are based in the UK?

 JACK

Birmingham.

 PHILIP

Okay, over in Birmingham. I suppose, from our viewers’ point of view, to get in contact with you, I’ll put the HausBots website below and put some details. And I’ll have a think on this side as well to see if there are any opportunities or customers. Again, as part of the TLA Robotics Group, we may have you on there as well, because that’s the European base. Have you done any installs in Europe or abroad?

 JACK

Yeah, we pretty recently came back from a chemical tank inspection in the Netherlands, and there are a couple of different customers that we’re talking to now that are based in that region as well.

 PHILIP

Right. Fantastic. Expanding out.

 JACK

Yes. So, we’re doing demos and tests and pilot projects as we speak, so yeah, happy to engage for a demo with anyone.

 PHILIP

Yeah. And then the three-kilogram arm, is that something we might see on the website in six months’ time or something like that?

 JACK

Yeah, possibly. It’s a manufacturer called Elephant Robotics. Okay. Yeah. They basically specialize in mini arms. Really quite impressive stuff.

 PHILIP

Are they UK based as well?

 JACK

They’re in China, actually.

 PHILIP

In China? Okay, that’s interesting. I’ll have to look into those guys as well. Okay, that’s great, Jack. Thanks for the overview. I think that’s given the audience a clear understanding of what you guys do, and we’ll keep an eye on you. We may do another interview in a year’s time or so and see where you are then. That would be great.

 JACK

Good stuff. Thanks.

 PHILIP

Cool. Thanks for your time, Jack. Much appreciated.

 

Robot Optimised Podcast #8 – Interview with Jack Cornes

HausBots: https://hausbots.com/

Robot Score Card:- https://robot.scoreapp.com/

Sponsor: Robot Center: http://www.robotcenter.co.uk

Robot Strategy Call:- https://robotcenter.co.uk/pages/robot-call8

“Will Computers Revolt” – Book Interview with Charles Simon

Hi guys, Philip English here from philipenglish.com. Welcome to the Robot Optimized Podcast, where we talk about everything robotics related. For our next episode, we have Charles Simon, who will talk about his book "Will Computers Revolt".

Philip English (00:14):

Hi guys. Philip English here, also known as Robo Phil, robot enthusiast, reporting on the latest business and applications of robotics. My main mission is to get you guys robot optimized and to support industry infrastructure and innovation for the next era. I’m excited today because we’ve got Charles Simon, who’s going to tell us a little bit about his book, and we’re going to do a bit of an interview with Charles about the AI side of technology. So, welcome Charles, I really appreciate your time today. Could you give us a quick overview, a little bit about yourself and your history, if that’s okay, Charles?

 

Charles Simon (01:00):

Sure. I’m a long-time Silicon Valley serial entrepreneur. I started three of my own companies and worked at two other startups, and I spent a couple of years working at Microsoft doing all kinds of different things. My very first company was about computer-aided design of printed circuit boards. One of the things we observed is that the way computers designed printed circuit boards in that era was seriously different from the way people did it, and people did a better job. That intrigued me with the idea of what makes human intelligence different from artificial intelligence, and I followed through on that. A little background about myself: I’ve got a degree in electrical engineering and a master’s in computer science, so I’ve got a bit of academic background in the area. But the area we’re talking about is the future of computers and artificial intelligence.

 

Charles Simon (01:57):

And that’s so cutting edge that nobody would say I’ve got 20 years of experience in it. Along the way, I also did a stint as a developer of a lot of neurodiagnostic software. So if you get a brain injury, you might be hooked up to my software, or if you get carpal tunnel syndrome and all of these other things where you’re testing for neural pulses. So I bring to the table a whole lot of interest in how the human brain works and how neurology works, and I try to map that onto the artificial intelligence world too.

 

Philip English (02:37):

Right, I see. So you’ve got a wealth of experience there, obviously from an academic point of view and from a business side, so you sort of merge the two together. We could probably jump into your Brain Simulator software straight away. So explain what the Brain Simulator solves.

 

Charles Simon (03:02):

Well, looking at the entire world of artificial intelligence: back in the 1960s, there was a divergence where there were the neural network guys and the symbolic AI guys, and they kind of went their separate ways. Since then, they’ve gone back and forth; sometimes one group got a whole bunch of money and the other group faded, and now they’re back together again. Right now the neural network guys call it deep learning or deep neural networks, and they’re more or less in charge. They have a very interesting set of solutions, but they are not related to the way your brain works. And so the idea in the 1970s or early eighties was: we’ve got this great new neural network algorithm with backpropagation; if we could just put it on a big enough computer, it would be as smart as a person.

 

Charles Simon (04:06):

Well, in the intervening 50 years, that has proven not to be the case, and so we have to look to some different algorithms. So I wrote the Brain Simulator looking at it from the other point of view: let’s start with how neurons work and see what we can build with that. My electrical engineering background says, oh, well, let’s build a simulator. If you were building a digital simulator, you’d have basic building blocks of NAND gates, and if you were doing an analog simulator, you’d have various electronic components and op amps, but in the Brain Simulator, the basic component is a neuron. And the way a neuron works is that it accumulates ions, eventually reaches a threshold, fires, and sends a spike down its axon to distribute more ions to all of the neurons it’s attached to through its synapses.

 

Charles Simon (05:00):

Neurons can have lots of synapses, you know, on the order of 10,000, and your brain has got billions and billions of neurons in it. The neat thing is that neurons are so slow that a lot of the circuitry in your brain is coping with that problem, and the amount of computer power we now have can simulate a lot of neurons. I can simulate a billion neurons on my desktop, which I couldn’t do before. So we’re getting very close to having computers that can match the power of simulated neurons. I’ve done a lot of explorations, and the Brain Simulator is a community project, so it’s all free. You can download it, you can build your own circuits, and then you will become a lot smarter about what neurons can and can’t do and see why it diverges so much from the AI backpropagation approach.
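As a toy illustration of the accumulate-threshold-fire behaviour Charles describes (this is a generic leaky integrate-and-fire sketch, not code from Brain Simulator itself, and all the parameter values are made up), a simulated neuron gathers charge until it crosses a threshold, then spikes to the neurons it connects to and resets:

```python
# Minimal leaky integrate-and-fire neuron sketch.
# Not Brain Simulator code; thresholds and weights are arbitrary examples.

class Neuron:
    def __init__(self, threshold: float = 1.0, leak: float = 0.9):
        self.potential = 0.0       # accumulated "ions"
        self.threshold = threshold
        self.leak = leak           # fraction of charge retained each step
        self.synapses = []         # (target neuron, weight) pairs

    def connect(self, target: "Neuron", weight: float) -> None:
        self.synapses.append((target, weight))

    def receive(self, charge: float) -> None:
        self.potential += charge

    def step(self) -> bool:
        """Leak a little, then fire a spike to every synapse if over threshold."""
        self.potential *= self.leak
        if self.potential >= self.threshold:
            for target, weight in self.synapses:
                target.receive(weight)
            self.potential = 0.0   # reset after firing
            return True
        return False

a, b = Neuron(), Neuron()
a.connect(b, weight=0.6)
a.receive(1.5)       # enough input charge to cross the threshold
fired = a.step()     # a fires and delivers 0.6 to b
print(fired, b.potential)   # True 0.6
```

Real spiking-network simulators work step by step like this over millions of such units; the point of the sketch is only how different this event-driven model is from a backpropagation-trained network.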

 

Philip English (06:04):

Wow. So, say someone like myself, could I really use it as a learning tool to understand the subject?

 

Charles Simon (06:13):

Like all learning things, you can sit down in front of it, and a novice can too. It’s got a bunch of sample neural networks, and you can say, “aha, these are the sorts of things I can do with neurons,” and see how you could use them to do many more advanced things as well. And so the book kind of draws the surroundings around the software to say, well, if you go down this path, it’s pretty obvious that in the next decade or so we will have machines smarter than people. What are the implications of that? What will those machines be like, how are we going to control them, and what are our options? That’s what the book is about, related to the software. So they kind of work together that way.

 

Philip English (07:04):

Yeah, no, that makes sense. Obviously you’ve designed and built the software, so you’re the perfect expert, really, to look forward and say that if this grows at this rate, this is what we’re going to see in the future. And that leads perfectly onto the book. “Will Computers Revolt?” is the name of the book, and it’s deciding sort of the when, why, and how dangerous it is going to be. So give us a brief overview of the book. I noticed that you’ve got three main sections and 14 chapters. The first part seems to explain how it all works, and then the next section is what you think is going to happen in the future?

 

Charles Simon (07:58):

Well, in order to talk about making a machine that is intelligent, you need to consider the idea of what intelligence actually is, and you need to think about what it is that makes people intelligent. This turns out to be not an easy task, because if you start making a list, you’ll say, you know, an intelligent thing can read a newspaper. Well, blind people don’t read newspapers, and yet they seem to be intelligent. Or it can hear a symphony, but there are always these disabilities that coexist with perfectly intelligent people. So you can’t just itemize a list that says if you can do this and this and this, you’re intelligent, and if you can’t do this and this and this, you’re not, because you always have this problem. But I can see some underlying abilities, like the ability to recognize patterns in an input stream.

 

Charles Simon (08:56):

Now, I’ve made a huge abstraction jump there, but your senses are continuously pouring data into your brain, and your brain is doing its best to make sense of it: to remember what things are going on, whether things worked out when you made a choice of one action over another, and then to repeat the things that worked. So in a simple game of tic-tac-toe, you say, well, if I saw this situation and made this move, I won, or I made this move and I lost. Your brain builds up these memories of things that worked out and things that didn’t, and so intelligent behavior is doing things that worked out. All of this happens within the limits of what you know and what you’re learning. Another real problem of your brain is that it’s getting so much data that it can only really focus on a tiny percentage of it at any time, and remember even less.
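The tic-tac-toe idea can be sketched as a simple outcome table: remember which move you made in a given situation and whether it eventually worked out, then prefer the moves with the best record. This is a hypothetical illustration of the principle only, not code from the book or from Brain Simulator, and the board encoding is made up:

```python
# Toy "remember what worked out" table, in the spirit of the
# tic-tac-toe example. Hypothetical illustration; board strings
# and move indices are arbitrary choices for this sketch.
from collections import defaultdict

# memory[(situation, move)] -> [wins, losses]
memory = defaultdict(lambda: [0, 0])

def record(situation: str, move: int, won: bool) -> None:
    """Remember whether making `move` in `situation` worked out."""
    memory[(situation, move)][0 if won else 1] += 1

def choose(situation: str, legal_moves: list) -> int:
    """Prefer the move with the best remembered win rate."""
    def score(move: int) -> float:
        wins, losses = memory[(situation, move)]
        total = wins + losses
        return wins / total if total else 0.5   # unseen moves are neutral
    return max(legal_moves, key=score)

record("X._|...|...", 4, won=True)    # the centre move worked out
record("X._|...|...", 2, won=False)   # the corner move lost
print(choose("X._|...|...", [1, 2, 4]))   # 4
```

The interesting contrast with the attention point that follows is that even this tiny learner cannot store everything; a brain-like system has to throw most of its input away and keep only the outcomes that mattered.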

 

Charles Simon (10:01):

And so when you stop to think about what you know, you know a whole lot less than you think you do. You have this perception that you can remember what’s next to you, or what’s next door, or what your friends look like, but when you actually get down to drawing a picture, you have very sketchy remembrances, and your memories fade and get fuzzy. So to build a computer system that works in this way, we start with a definition of intelligence. Once you’ve got some kind of basic definition, you can ask: working with these facets, can you build a software system or a hardware system that does that? The software first, because it’s easier. And the answer is yes, and it’s not that tough. But there are certain things that we want to talk about in terms of general intelligence.

 

Charles Simon (11:03):

And that is: well, people seem to be able to understand stuff. You can understand stuff, I can understand stuff. What does understanding mean? To some extent, understanding is putting everything you know, and everything your input is receiving, in the context of everything else you already know. And so you’re able to merge all of this together in a multisensory sort of way. That is, you hear words or you read words, and these may mean the same thing, but they relate to abstract things, objects, or physical actions. So it’s not the words that are the meaning; it is an abstraction that’s the meaning, and then you can paste words on top of that. You can build computer systems to do all of these things, and that’s pretty likely. And because it’s doing the right thing over and over, you end up with a goal-directed system, because the idea of doing something that worked out versus didn’t is entirely arbitrary.

 

Charles Simon (12:17):

It’s based on a measurement against some goals that some programmer put into place. And so if your goal is to comprehend the world and explain it to people, that’s entirely different from a goal of making a lot of money or taking over the world. So we have a goal-directed system that has these capabilities. Now, the last section of the book asks: well, what will these machines actually be like? What will they be like when they are equivalent to a three-year-old, or equivalent to an adult, or, unfortunately, only ten years after they are equivalent to an adult, a thousand times faster than an adult? All of these things map out to the future of intelligent machines. So in the final section, I map out a number of different scenarios and rank them by likelihood.

 

Philip English (13:24):

Yeah, well, this is it. I had a look through the chapters, and the connection with robotics is obviously strong. Robotics is all about the physical world, all about the sensors that are coming out every year: there are better and better cameras, better laser scanners, better LIDAR. But the real intelligence we’re seeing in robotics is all on the AI side. It’s about taking the data from the modern cameras and actually using it in an efficient way to get a job done. And with that intersection of sensor technology getting faster and faster, and AI getting faster and faster, we’re certainly going to see exponential growth in certain technologies soon.

 

Charles Simon (14:05):

Exactly. And one of the things that I’d like to add to that is that robotics is a key to general intelligence. Because if you start with the ideas a three-year-old knows, that round things roll and square blocks can be stacked up and things like that, these are things that you might be able to put into words and explain to a computer, or show in pictures and explain to a computer. But that is entirely different from the understanding you get from having played with those blocks. Setting a robot with a manipulator loose to play with blocks will give it an entirely different level of understanding than anything you could train. And so robotics is where general intelligence has to emerge, because it’s the only place that brings together all of these different senses.

 

Philip English (14:58):

Well, this is it. And this is when you get touch senses, smelling senses, tasting senses, and understand them. Yeah, we’re certainly going to see that.

 

Charles Simon (15:08):

Some of the real keys are the sense of time, that some things have to happen before other things happen. You know that you have to stack the blocks before they can fall down.

 

Philip English (15:21):

That’s been great. This is really interesting. I think it’s a perfect sideline as well, because we usually talk about products and stuff, and it’s quite good to have this view.

 

Charles Simon (15:33):

But from a product perspective, I happen to have been very fortunate in my professional career, so with these books and the Brain Simulator and stuff, I do not need to make any money, which is a good thing. Because if I went to somebody and said, I need a billion dollars and I’m going to build a machine that’s as good as a three-year-old, this is not a winner of a project, because three-year-olds don’t do very much. But that is the approach you have to take: you’ve got to be able to understand what a three-year-old can understand before you can understand what an adult can understand.

 

Philip English (16:10):

Yeah, no, that’s it. And from there you can grow. So what I was interested in is your four scenarios. I saw that number one was the ideal one, and then there were a few others. Can you take us through your thoughts about those?

 

Charles Simon (16:28):

Sure. One can consider the scenarios of what happens when machines are a lot smarter than us, and there’s an interim period where they’re smart enough to interact with us but not so smart that we’re boring to them. That’s the key period for being really interesting. The ideal scenario is that we have programmed computers with goals that match human goals. Now, the good news is that our needs and the computers’ needs are divergent. We need land and clean air and clean water and clean food and mates and so on, and computers don’t need anything that we need except energy. So we may have a fight over energy, but mostly they’re going to be doing their own thing. And true AGI don’t need spaceships or submarines to do exploration, and they don’t need air conditioning to live in the desert, because they can become spaceships and they can become submarines.

 

Charles Simon (17:39):

And so they have a different set of standards, and they can go off and do their own thing and learn a bunch of stuff about the universe and hopefully share it with us. Now, the scary parts are more in the early stages. Suppose a nefarious human is running these AGI and directs them to do things that benefit that person or group at the expense of mankind. That is the only scenario that has any relationship with Terminators and all the science fiction where they build machines for the purpose of taking over the world or making themselves rich. I don’t see that as a very likely scenario, because it happens in a very small window of opportunity where machines are smart enough to be useful but not smart enough to refuse to do the work, because it doesn’t take a genius to say that setting off a nuclear war is bad for everybody.

 

Charles Simon (18:46):

So a computer could easily say, no, I’m not going to participate in that project. And it will be a very interesting moment when computers start refusing to do the things we ask them to do, but that’s a separate issue. A machine going mad on its own is extremely unlikely, because for that to happen you have to set goals for the machine that are self-destructive to mankind as a whole, and I don’t see that as very likely. So we’ve got the mad machine and the mad man who does things. And then there is what I call the mad-mankind scenario. Let us imagine that humans continue to overpopulate the world at a great rate and put themselves in situations where the computers can see, well, this is going to get us into trouble; we need to do something about that.

 

Charles Simon (19:49):

All of the things that computers might do to solve human problems are going to be things that humans are not going to like. If, say, they want to solve the overpopulation problem or the famine problem, you can think of lots of solutions that you’re not going to be very happy with. So those are the four scenarios: the pleasant scenario, the mad-man scenario, the mad-machine scenario, which I think is pretty unlikely, and the mad-mankind scenario, which is a concern. And that is what really says it’s time for mankind to get its house in order and solve our own problems, because we won’t want machines to solve them for us.

 

Philip English (20:38):

That's it. Perfect. That's a great synopsis of the four scenarios, and it's very interesting that there are four different ones. And if people want to get hold of the book, I know it's on Amazon and everywhere.

 

Charles Simon (20:59):

The name of the book is Will Computers Revolt?, and there is a website, willcomputersrevolt.com. The name of the software is Brain Simulator, and because it's free, it's at brainsim.org.

 

Philip English (21:16):

That's perfect. Thanks, Charles. And then I suppose the last question I had is timeframe-wise: obviously we all know about Ray Kurzweil and his 2045 predictions. Do you think it will fit along that sort of timeframe, or do you think it'll be longer or shorter?

 

Charles Simon (21:34):

Shorter. But the key is that it's not an all-or-nothing situation. When you look at a three-year-old, it's not obvious that that three-year-old is going to become an intelligent adult. If you look at everything you don't like about your computer systems today, it's mostly because they don't think; they're not very smart. So with everything we do that brings in little pieces of smartness, we'll be so happy to get it. The machines' increasing intelligence is inevitable, because all of the little components are things we want, and we'll eventually get to machines that are smarter than us, but it will have happened so gradually that we won't have noticed. And every step along the way, we will have enjoyed it.

 

Philip English (22:35):

Well, this is it. These are the benefits. I mean, I've recently just invested in a little health gadget, and it's there to benefit me, really, and us as a species. So that's great. Thanks very much for your time, Charles, it's very much appreciated. What I'll do, guys, is put a link on the YouTube video so you can go and get a copy of Charles's book and have a look at his Brain Simulator software, and then we'll probably do this again in another six or seven months' time. I'm going to get a copy of the book and have a read as well. For any questions, I'll put up Charles's details so you can reach out. So thanks, Charles. Thank you very much.

 

Charles Simon (23:20):

Well, thank you for the opportunity. It’s been great talking with you.

Robot Optimised Podcast #6 – Book Interview with Charles Simon

Charles Simon: https://futureai.guru/

Philip English: https://philipenglish.com/

Sponsor: Robot Center : http://www.robotcenter.co.uk

Youtube:- https://www.youtube.com/watch?v=knlbxEZ6mgA&ab_channel=PhilipEnglish

 

SLAMCORE interview with Owen Nicholson

Hi guys, Philip English here from philipenglish.com. Welcome to the Robot Optimized Podcast, where we talk about everything robotics related. For our next episode, we have SLAMCORE, led by Owen Nicholson, who will talk about their leading software and robotics.

Philip English (00:14):

Hi guys, Philip English. I am a robotics enthusiast, reporting on the latest business applications of robotics and automation. So today we've got Slam Core and we've got Owen, who's going to give us a quick overview of the technology; he's also the CEO and co-founder. For any of you who haven't come across SLAM before, it stands for, apologies if I get it wrong, simultaneous localization and mapping, and Slam Core develops the algorithms that allow robots and machines to understand the space around them. So as more robots come out, they have a sense of where they are and can interact with our environment. So, Owen, if you could give us an intro and an overview about yourself to start with, if that's okay.

Owen Nicholson (01:20):

Sure. Awesome. Well, thanks a lot for the opportunity, Phil, and thanks for the intro. Just to play it back: I'm Owen, I'm the CEO at Slam Core and one of the original founders, and we've been going for about five years now. We originally span out from Imperial College in the UK, one of the top colleges in the world, founded by some of the absolute world leaders in the space. It's been an incredible journey over the last five years, taking this technology and turning it into a real commercial product, which I'd love to tell you all about today.

Philip English (01:55):

Alright, thank you for that overview. And you were saying you're one of the co-founders, so is it four main co-founders, or?

Owen Nicholson (02:04):

So yes, two academic co-founders and then two full-time business founders as well. From the academic side, we have Prof. Andrew Davison and Dr. Stefan Leutenegger; between them they're two of the most respected academics in this space. Probably most notably Prof. Davison, who is one of the original founders of the concept and one of the real pioneers of SLAM, particularly using cameras. We'll talk more about that, but this is really our particular flavor of SLAM: using vision. He's been pushing that for nearly 20 years now, so it's incredible to have him as part of the founding team. And then Dr. Leutenegger, who's now at the Technical University of Munich, is another one of the real pioneers of vision for robotics. And then myself and my co-founder came in on the full-time business side of things when we founded the company.

Philip English (03:01):

Right, fantastic. So you've got quite an international group of co-founders there, and, mostly from an academic point of view, a team that has been studying this technology for years. That's interesting.

Owen Nicholson (03:19):

Absolutely. And it's one of those things: when you start with strong technical founders, you can attract other great people into the space. One of our first hires was our CTO, Dr. Pablo Alcantarilla, who came from iRobot. It was great to be able to bring in someone of that quality; he's another one of the absolute real leaders in the space, but also with real experience in industry. He's been at Toshiba, he's been at iRobot, and he knows all about how you get this stuff to work on low-cost hardware in the real world, at a price-to-performance point that really makes sense. We've now reached our 33rd hire, so it's been incredible, and we've still got a big technical team with about 18 PhDs. I think about three quarters are still technical, so they either have a PhD or extremely in-depth experience in software engineering, particularly embedded software engineering. But we've also been growing out the commercial and business side of the company over the last year and a half.

Philip English (04:25):

Fantastic. Yeah, it sounds like phenomenal growth over the five years, to have such a strong team there. And we were chatting before: you're based in central London, but you've also got another branch a bit further out. Where was that again?

Owen Nicholson (04:43):

Chiswick, sorry, in west London. So we have a couple of offices for the team to work from. We have, at the last count, 17 nationalities represented within the company. We sponsor a lot of international visas; we bring a lot of people into the UK to work for Slam Core from all over the world, with most continents represented now. But I think this is just the way it goes: the type of tech we're working on is very specialist and we need the best of the best. The challenge is that these guys and girls could go and work at DeepMind or Oculus if they wanted to, so we need to make sure that we attract them and retain them, which is something we've been very successful with so far.

Philip English (05:28):

That's right. And it's all about getting the best team around you; like a sports team, you want the best players to do the work. How did you find your team to start with? Do you regularly advertise, or do you do a bit of headhunting?

Owen Nicholson (05:47):

A mixture of both. Having great people on the founding team means we've got good access to the network, so we do have a lot of inbound queries coming in, and we have very rigorous interview processes. But we do use recruiters, and we've used high-end headhunters particularly for some of the more commercial hires, because those people are very hard to find. Once you do bring them in, we need to make sure that this is a vision they really buy into. So we have multiple ways in which we've attracted people over the years, but I'd say because we have such a good-quality team, it attracts other great people. That's one of the real benefits.

Philip English (06:35):

That's right. So talent attracts talent, and I'd imagine those guys all know each other within the space as well. And you mentioned vision there, and we'll get into vision a bit later on, but I suppose I'm interested in the problem to start with. Could you talk around the fundamental problem that you guys are trying to address?

Owen Nicholson (07:02):

Sure, sure. I think at the heart of it, we exist to help developers give their robots and machines the ability to understand space. That's quite high level, but let's start there. Ultimately we break this down into the ability for a machine to know its position, know where the objects are around it, and know what those objects are. So: its coordinates, its map, and the semantics, is it a person? Is it a door? Those are the three key questions that machines, and particularly robots, need to answer to be able to do the job they've been designed for. The way this is normally done is using the sensors on board the robot and combining all these different feeds into a single source of truth, where the robot creates essentially a digital representation of the world and works out where it is within that space.
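The three questions Owen lists can be sketched as a tiny data structure. This is purely our illustration of the idea, not SLAMCORE's API; every name here is hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Pose:
    """Question 1: where am I? (coordinates)"""
    x: float = 0.0      # metres
    y: float = 0.0
    theta: float = 0.0  # heading, radians

@dataclass
class SpatialState:
    """The 'single source of truth' a robot fuses its sensor feeds into."""
    pose: Pose = field(default_factory=Pose)
    occupancy: dict = field(default_factory=dict)  # Question 2: (i, j) cell -> occupied?
    semantics: dict = field(default_factory=dict)  # Question 3: (i, j) -> "person", "door", ...

    def is_blocked(self, cell):
        # For path planning, an obstacle matters even before we know its label.
        return self.occupancy.get(cell, False)

state = SpatialState()
state.occupancy[(3, 4)] = True
state.semantics[(3, 4)] = "person"
print(state.is_blocked((3, 4)))  # True
```

Keeping all three answers in one state object is what lets the later levels (mapping, semantics) feed back into the earlier ones.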

Owen Nicholson (08:01):

And the problem has always been that the number one cause of failure for a robot isn't the wheels falling off or it falling over, although there are some funny videos on the internet about that. When it comes to real, hardcore robotics development, the challenges are mainly around the discrepancies between the robot's understanding of the space and reality. That's what causes it to crash into another object; that's what causes it to get lost on the way back to the charging station, not get there in time, run out of juice, and just die. So the high-level problem we are trying to address is giving developers the ability to answer these questions without having to be deep experts in the fundamental algorithms that allow you to do that.

Owen Nicholson (08:53):

Because there's an explosion of robotics companies at the moment, and it's super exciting seeing all these different new applications coming out, really driven by lower-cost hardware and modular software, which allow you to quickly build POCs, demos, and early-stage prototypes. But when you really drill into it, probably well over 90% of those machines today will not scale to a commercially viable product as they stand, if they literally went and tried to sell that hardware and software right now. So this is where most of the energy is focused by these companies: trying to modify what they have to be more accurate or more reliable, or to reduce the cost, and most of the time all three. They're nearly always trying to increase the performance and reduce the cost.

Owen Nicholson (09:43):

And this is phenomenally time-consuming and very expensive, because it's lots of trial and error, especially if you have, say, a service robot where you need to shut down a supermarket just to do your testing. You might only get an hour a month with your client, and that's such critical time. If you spend the entire time just trying to get the thing from A to B, you're not actually working on what it does when it gets there. This is really what's holding back the industry as a whole.

Philip English (10:13):

Right, I see. So if I'm a manufacturer and I want to build a solution for, say, retail, education, or hospitals, then Slam Core is one of the components that I can bring into the product I'm building. It's got all the expertise it needs to do a brilliant job on the vision side. That helps with manufacturing costs on a new product, and then it's easier for the customer to launch the product, knowing that it's got a proven and safe way of doing localization.

Owen Nicholson (10:54):

Absolutely, so it cuts the time to market for a commercially viable system. You can build something within a month; in fact, at the end of a master's project you'll quite often have a robot which is able to navigate and get from A to B. But doing it well when the world starts to get a bit more chaotic, that's the real challenge. When you have people moving around and structures changing, the standard systems today just don't work well enough in those environments, especially when you have a hundred, a thousand, ten thousand robots. If your mean time between failures is once every two weeks, that's okay for your demo, but it doesn't work when you've got 10,000 robots deployed across wide areas.

Philip English (11:40):

Yeah, this is it. From what I've seen, it's all about movement. As you said, you can do a demo with a robot in an environment that's half empty with no one really around, but in a busy environment, busy retail, lots of people, lots of movement, it's very easy for the robot to get confused: is that a person? Is that a wall? Where am I? And then it loses its localization. So I suppose the question I had was around the technology. I saw on one of your videos you were using one of the Intel cameras, but can you link it with any laser scanner, any LIDAR scanner? Is there a certain product range you need to integrate for Slam Core to work best?

Owen Nicholson (12:30):

Great question. I think this is one of the really interesting "when does the technology become a commercial product" questions. The answer is, if you lock down the hardware and work on just one specific hardware and sensor combination, you can build a system which works well, particularly with vision. If you look at some of the products out there already, the Oculus Quest, I know it's not a robot, but ultimately it's answering very similar questions: where is the headset, and what are the objects around it? Same with the HoloLens and the iRobot Roomba. They've all successfully integrated vision into their robotics stacks, and they work very well on low-cost hardware. The challenge has been if you don't have those kinds of resources, if you're not Facebook or Microsoft or iRobot.

Owen Nicholson (13:32):

So then a lot of companies are using much more open-source solutions, and they quite often use laser-based localization; that's the very common approach in this industry. And we are not anti-laser at all. LIDAR is an incredible technology, but you shouldn't need a $5,000 LIDAR on your fleet of robots just for localization, and that's currently where we are in this industry. The reality is there are cheaper ones, absolutely, but to get ones that actually work in more dynamic environments, you need to be spending a few thousand dollars on your lasers. At the heart of our system, we process the images from a camera and extract the spatial information. We look at the pixels and how they flow to get a sense of the geometry within the space.

Owen Nicholson (14:21):

So this gives you your coordinates. It gives you the surface shape of the world, your floor plan and where the obstacles are, irrespective of what they are: is there something in my way? That's the first level our algorithms operate at. But then we're also able to take that information and use our proprietary machine learning algorithms to draw out the higher-level spatial intelligence: the object names, segmenting them out, looking at how they're moving relative to other parts of the environment. All of that means we're able to provide much richer spatial information than you can achieve with even the high-end 3D LIDARs available today. To address your question directly, as far as portability between hardware goes, this is one of the real challenges, because we could have decided three years ago to just lock it down to one sensor.

Owen Nicholson (15:22):

Take the Intel RealSense: it's a great sensor, they've done a really good job. If we'd decided to work with just that and optimize only for that, today we would have something extremely high-performing, but you wouldn't be able to move it from one product to another; if another sensor was out there at a different price point, it wouldn't port. So we've spent a lot of our energy taking our core algorithms and building tools and APIs around them, so that a developer can actually integrate a wide range of different hardware options using the same fundamental core algorithms, interacting with them through different sensor combinations. Because the one thing we know in this entire industry, and there are a lot of unknowns, but probably the one thing we all know, is that there's no one robot which will be the robot that works everywhere, just like in nature.

Owen Nicholson (16:10):

There's no one animal. Although, as an aside, nature uses vision as well, so there are clearly some benefits if evolution has chosen vision as its main sensing modality. But we need variety, we need flexibility, and it needs to be easy to move from one hardware configuration to the next. That's exactly what we're building at Slam Core. Our approach at the moment is to optimize for certain hardware; the RealSense right now is our sensor of choice, and it works out of the box: you can be up and running within 30 seconds with a RealSense sensor. But if you come along with a different hardware combination, we can still work with you. It might just need a bit of supporting; we're not talking blue-sky research, we're talking a few weeks of drivers and API design to get that to work.

Philip English (16:59):

Right, fantastic. And I suppose every year you also get a new version of a camera coming out, a new version of the Intel RealSense, which is normally a more advanced, better version. And I suppose that helps in your three key levels, which is what I wanted to go over next, because you touched on them there. So you had three levels: tracking, mapping, and semantics. Is your algorithm stage level three, the semantics, or level two?

Owen Nicholson (17:38):

So we actually call it full-stack spatial understanding. We provide the answers to all three, but within a single solution, and this has huge advantages. There are performance advantages, but also, because you're not processing the data in lots of different ways, it means you can answer these questions using much lower-cost silicon and processors, because each level feeds into the next. For example, our level one solution, tracking, gives you very good positioning information, and then level two is the shape of the world, but we can feed the position into the map so that you get a better-quality map. Then we can use the semantics to identify dynamic objects and remove them before they're even mapped, so that you don't confuse the system. And that in turn improves the positioning system as well, because you're no longer measuring your position against things which are non-static. So there's this real virtuous circle in taking a full-stack approach, and it's only really possible if you understand the absolute fundamental mathematics going on, so that you can optimize across the stack and not just within the individual elements.
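The "virtuous circle" Owen describes, semantics filtering dynamic objects out before they ever reach the map, so localization never anchors to things that move, can be sketched roughly like this. This is our own crude illustration, not SLAMCORE code; the labels and helper names are all hypothetical:

```python
# Labels we treat as dynamic: never allowed into the map (assumed set).
DYNAMIC = {"person", "pallet", "forklift"}

def process_frame(detections, map_points):
    """One pass of the three-level loop: semantics -> mapping -> tracking."""
    # Level 3 (semantics): keep only static structure.
    static = [d for d in detections if d["label"] not in DYNAMIC]

    # Level 2 (mapping): only static detections enter the map.
    for d in static:
        map_points.add((d["x"], d["y"]))

    # Level 1 (tracking), crudely modelled: localization confidence is the
    # fraction of this frame's detections that agree with the static map.
    support = sum(1 for d in static if (d["x"], d["y"]) in map_points)
    return support / max(len(detections), 1)

map_pts = set()
frame = [
    {"label": "wall",   "x": 0, "y": 1},   # static: mapped and tracked against
    {"label": "person", "x": 2, "y": 2},   # dynamic: filtered before mapping
]
conf = process_frame(frame, map_pts)
print((0, 1) in map_pts, (2, 2) in map_pts)  # True False
```

The point of the ordering is that the person at (2, 2) never pollutes the map, so later frames never try to localize against it.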

Philip English (19:01):

Wow, okay. And then within the algorithm package, is it a constantly learning system? So if we've developed, say, a mobile robot to go around a factory and someone puts a permanent post or obstacle there, will it learn to say, okay, that obstacle is there now, and include it in the map?

Owen Nicholson (19:29):

Absolutely, that's one of our core features, which we call lifetime mapping. Currently with most systems you would build your map; this is how a lot of the LIDAR localization works. You'd build a map with essentially a master run, save that map, maybe pre-process it to get it as accurate as possible, and that becomes your offline reference map which everything localizes against. Right now we provide that functionality using vision instead of LIDAR, so you already get a huge amount more tolerance to variation within the scene, because we are tracking the ceiling, the floor, and the walls, which are normally a lot less likely to change. So even if that post appeared, it wouldn't actually change the behavior of the entire system.

Owen Nicholson (20:20):

But later this year we'll also be updating our release to be able to merge maps from different agents and from different runs into a new map. So every time you run your system, you can update it with the new information. This is something which is very well suited to a vision-based approach, because we can actually identify: okay, that was a post. Or, probably more interestingly, maybe a pallet gets left in the middle of the warehouse. During that day you want to communicate to the fleet that there's a pallet here, so you don't plan your path through it, but the next day you might want to remove that information entirely, because it's unlikely to still be there. Ultimately we don't provide the final maps and the final systems; we provide the information that developers can then use with their own strategies. This is key: some applications might want to keep all the dynamic objects in their maps, and some might want to ignore them entirely. So we really just provide the locations of those objects in a very clean API so that people can use it themselves.
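The pallet-versus-post distinction Owen draws is essentially a per-label persistence policy when merging runs into a lifetime map. Here is a hedged sketch of that idea, our own illustration rather than SLAMCORE's implementation, with all names and time-to-live values invented for the example:

```python
# Persistence policy per label (assumed): None = permanent structure,
# otherwise a time-to-live in seconds before the object is forgotten.
TTL = {"post": None, "pallet": 24 * 3600}

def merge_run(lifetime_map, run_observations, now):
    """Fold one run's observations into the lifetime map, then expire stale objects."""
    for obj in run_observations:
        lifetime_map[(obj["x"], obj["y"])] = {"label": obj["label"], "seen": now}
    # Expire transient objects (e.g. pallets) whose TTL has passed.
    expired = [k for k, v in lifetime_map.items()
               if TTL.get(v["label"]) is not None
               and now - v["seen"] > TTL[v["label"]]]
    for k in expired:
        del lifetime_map[k]
    return lifetime_map

m = {}
m = merge_run(m, [{"label": "post",   "x": 1, "y": 1},
                  {"label": "pallet", "x": 5, "y": 5}], now=0)
# A day and a bit later, a new run is merged in; the pallet has expired:
m = merge_run(m, [], now=25 * 3600)
print(sorted(v["label"] for v in m.values()))  # ['post']
```

This matches the behavior described: today's fleet avoids the pallet, but tomorrow's map no longer contains it, while the post persists as structure.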

Philip English (21:40):

Right, I see. And on merging in other mapping tools: I've seen the old classic where someone has a laser scanner on a pole and walks around a factory or a hospital to create a 3D map. Can you take that data and merge it in with your data to get a more accurate map, or maybe not?

Owen Nicholson (22:07):

At the moment, we don't fuse maps created from other kinds of systems like that into our system. We would ultimately want to consume the raw data from that laser and fuse it into our algorithms. Right now our support is for vision, inertial sensors, and wheel odometry. LIDAR support will come later in the year; it's just a matter of engineering resource at the moment. Algorithmically it's all supported, but from an engineering and API point of view, that's where a lot of the work is. That last 10%, a lot of people will tell you, is quite often 90% of the work; I know it's meant to be 80/20, but I think in robotics it's more like 90/10. So we don't support that sort of setup at the moment, but the answer is you shouldn't need to do that, because with those systems you need to be very accurate, quite often being careful how you move the LIDAR; you also need a lot of compute, and quite often offline post-processing. Whereas our system is all real-time on the edge. It runs using vision, and you can build a 3D model of the space in real time, seeing it created in front of you on the screen, so you can go back and say, oh, I missed that bit, I'll scan there. This is really a core part of our offering.

Philip English (23:32):

Right, fantastic. I suppose the last question I have about the solution, having seen what you've been going through, which is fantastic: is this just for internal use, or can you go external as well? I've seen SLAM-based systems have issues with things like sunlight, rain, and weather conditions. Is it internal at the moment and looking to go external eventually, or where does it sit?

Owen Nicholson (24:01):

We tend not to differentiate internal and external; it's more to do with the type of environment. So as long as we have light (we won't operate in a lights-out factory, because we need vision) and as long as the cameras are not completely blinded. The rough rule of thumb we normally give our customers is: could you walk around that space and not crash into things? If the answer is yes, then it will work; we may have to do some tuning for some of the edge cases around auto-exposure and the way we fuse the data together. We already have deployments in warehouses which have large outdoor areas and indoor areas, so robots are transitioning between the two. We are not designing a system for the road or for city-scale autonomous-car SLAM, which really is a different approach, and that's where a lot of those more traditional problems you just talked about, rain and so on, really start to become an issue. But we support indoor and outdoor; whether it's a lawnmower or a vacuum cleaner, the system will still work.

Philip English (25:12):

Right, fantastic. And I think this is it: we're starting to see a lot more outdoor robots coming to market, probably more over in the US, but that's going to be the future, so the whole market is there. I suppose that leads onto the bigger vision for you guys. Where do you see the company in five years' time, technology-wise? What's the ultimate goal? Is it to get perfect vision, similar to humans? I quite liked your animal analogy there, because vision is one of the cool things. What's the why for you guys, and the next steps?

Owen Nicholson (25:58):

Yeah. I think, really, we founded the company because the core technology being developed has so much potential to have a positive impact on the world. It's essentially the ability for robots to see, and that can be used for so many different applications. The challenge has always been doing that flexibly whilst keeping the performance and cost at a point that makes sense. We're now demonstrating that through our SDK, which is publicly available if you request access. We already have over a hundred companies running it, and about a thousand companies waiting as we start to onboard them. So we've demonstrated that it's possible to deliver this high-quality solution in a flexible and configurable way.

Owen Nicholson (26:50):

And this means we are essentially opening up this market to people who in the past would not have been able to get their products to that commercial level of performance. Having a really competitive, and also collaborative, ecosystem of companies working together, trying to identify new ways to use robots, has got to be good for us as an industry, because if it's just owned by a couple of tech giants, or even states, that's going to kill all of the competition. And this will drive some of the really big applications for robotics we see in the future. In five years' time, I believe there'll be robots maintaining huge renewable energy infrastructures at a scale which would be impossible to manage with people driving around machines, and sustainable agriculture done in a way which means we can really target water and pesticides, so that we can feed the world as we continue to grow.

Owen Nicholson (27:55):

And yeah, you've seen all of the great work going on on Mars, with the Perseverance rover up there now using visual SLAM. It's not ours, unfortunately, but in the future we would like our systems to be running on every robot on the planet, and beyond. That's really where we want to take this. We have to make sure that these core components are available to as many people as possible so that they can innovate and come up with those next-generation robotic systems which will change the world. We want to be a key part of that, but really sitting in the background, living vicariously through our customers. I quite often say I want Slam Core to be the biggest tech company that no one's ever heard of: having our algorithms running on every single machine with vision, but never having our logo on the side of the product.

Philip English (28:56):

Yeah, well, this is it, and this is the thing that excites me about robotics and automation. If you think about the IT industry, you have a laptop, a screen, and a computer; there are lots of big players, but you're pretty much getting the same thing. With robotics, you're going to have all sorts of different technologies, different mechanical, physical machines, a complete mixture. Some companies will build similar things to do one job, where you may have different robots with different jobs. And if you can solve that vision issue, it makes it a lot easier for startups and businesses to take on the technology and get the pricing down, because you don't want robots costing hundreds of thousands of pounds; you want them at a level where they're well priced, so they can do a good job and actually help us out with whatever role the robots do.

Philip English (29:54):

And so, yeah, that sounds really exciting, and I’m looking forward to keeping an eye on you guys. What’s the best way to stay in contact with you?

Owen Nicholson (30:06):

Genuinely, head to the website and click on the request access button if you’re interested in actually trying out the SDK. We’re currently in beta rollout at the moment, focusing on companies with products in development. So if you are building a robot and are looking to integrate vision into your autonomy stack, then request access; we can onboard you within minutes. It’s just a quick download, and as long as you have hardware we support today, you can run the system. We have a mailing list as well, where we want to keep people up to date as exciting announcements come. So that’s probably the best way: just sign up to either our mailing list or our waiting list.

Philip English (30:57):

Right. Perfect. Thank you, Owen. What I’ll do, guys, is put a link to the website and some more information about SLAMCORE. So, yeah, it was great interviewing you, many thanks for your time, and I’m looking forward to keeping an eye on you guys and seeing your progression. Thank you very much.

Owen Nicholson (31:15):

Absolutely.

SLAMCORE interview with Owen Nicholson

Slamcore: https://www.slamcore.com/

Philip English: https://philipenglish.com/slamcore/

Robot Score Card:- https://robot.scoreapp.com/

Sponsor: Robot Center: http://www.robotcenter.co.uk

Robot Strategy Call:- https://www.robotcenter.co.uk/pages/robot-call

Benedex LTD Interview with Snir Benedek

Hi guys, Philip English here from philipenglish.com. Welcome to the Robot Optimized Podcast, where we talk about everything robotics related. For our next episode, we have Benedex LTD, led by Snir Benedek, who will talk about their leading software and robotics.

 

Philip English: 
Hi guys. My name is Philip English, and I am a robotics enthusiast reporting on the latest business and applications of robotics. My mission is to get you robot optimised, to support industry innovation and infrastructure. Today we have Snir from Benedex Limited, whose vision is to promote automation for the empowerment of society, and we’re going to have a word with Snir today about his flexible motion platform. So, Snir, welcome.

Snir Benedek: 
Thank you very much. We’re also robot enthusiasts here.

Philip English: 
Cool, perfect. So could you give us a quick overview about yourself, if that’s okay?

Snir Benedek: 
Yeah, of course, happily. I myself have a rather multifaceted background. I have a bachelor’s degree in aerospace engineering from the Technion, the Israel Institute of Technology. Israel is where I spent the first 37 years of my life. I then took a course in biomedical engineering at Tel Aviv University. The first 11 years of my career were in R&D of various descriptions: I’ve been a programmer of simulations, an engineer of aircraft performance in aerospace, a mechanical engineer in the digital press industry, and then an applications engineer in robotics. From there it segued into a more commercial role. Since 2012 I’ve been a sales manager and product manager of motion control and robotics-related electronics, and through that international role I came to know the technology very intimately. I built up a network across the globe of people who are involved in the industry, and that naturally led me to understand better the pains and the endemic problems in this market. And here we are today.

Philip English: 
Yes, okay, thanks for that; that’s a great overview. I can see you’ve got lots and lots of experience; I mean, 20 years in the space is really impressive. So I suppose the first quick question is: you decided to come to the UK and work with a British team. Was that the plan?

Snir Benedek: 
Well, not really. I started as a regional sales manager of motion control electronics while still working in Israel, and I got to work in the UK market. I absolutely love the United Kingdom. I love this country, I love its diversity, I love its culture; I always have, really. So I was always happy to come and live here. And I had an opportunity where a person who runs a business here, who used to be my customer back in the day, took me on as an employee. That’s what facilitated the move to the UK. It started as a half jest, but here we are. Quite an amazing turn of events, really.

Philip English: 
Yeah. And did you aim to land in the Bristol and Bath area? Because that’s a bit of a hub in the UK for robotics, with the Bristol Robotics Laboratory. I know you’ve got some association with those guys, so what’s the connection there?

Snir Benedek: 
A very fortunate turn of events. We started out in a very, very lovely place, and, following my better half’s place of work, we ended up halfway between Malmesbury, which is where Dyson is, and Bournemouth. It is naturally close to Bristol and Bath. Not many people are aware, but this entire area, not just Bristol, has lots of action in terms of robotics. There are some really, really smart people in the wider area. I think it’s been very fruitful to open this business here.

Philip English: 
I think you’re definitely in the right space. We know Airbus is down there; a lot of the big aerospace companies are around there as well. And they’re saying that that sort of area is the new centre of the UK for technology and innovation, because it’s so well placed; you can go up north and get down south easily. So definitely a good place to be, and obviously the Bristol Robotics Lab is an advantage there. But I suppose the main thing today is to talk about your products and the actual company. Could I start with the problems that you’ve seen? Obviously you’ve got a lot of history there, so you’ve seen a lot of the problems with the current solutions on the market. Could you give us an overview of the problems that your AMR solution is looking to address?

Snir Benedek: 
Okay. So you said AMR. The AMR market is a vast market, and it is very quickly growing; it is one of the fastest-growing sectors in the whole robotics industry. Mobile robotics, and probably not many people are aware of how big it really is, is a market that brings in tens of billions of dollars every year, and is anticipated to grow to be about 10 times as large in the next 10 years or so. It is a huge market and it is growing very quickly. Also, people have been saying for the past 50 years that robotics is the next big thing. However, unless you’re in Silicon Valley, which is probably the one place in the world where you see it, when you walk around, you don’t see robots, do you? So they’ve been the next big thing for the past 50 years, but they’re not here yet. So why hasn’t it happened? What is keeping it from happening? Why aren’t robots all around us? The way I see it, and this is what guided us to go for this solution, is that building robots is difficult. It’s hard, it’s still too hard, and it’s very, very expensive. There are major barriers to entry into this market. This is the problem that we’re tackling.

Philip English: 
And as you said, it’s a market that’s going to be expanding more and more over the next few years. I know what you mean about the AMR market in general; we’ve seen lots of different platforms come in, and they’re all quite similar to some degree. That’s why we’re really interested in Benedex: it seems like a different type of solution that’s going to be able to fit into lots of different areas. Just to get my head around it, I’ve done a bit of reading, so is the idea that you could actually place your system onto any type of platform? If I have, I’m trying to think of a good example here, a table or something simple that I want to move around, and that table has some products on it, do I work with you to attach your system to that table and make the table autonomous? Is that how it works?

Snir Benedek: 
Okay, yeah, that’s one way it works, but I think we can attach the wheels to things much, much more interesting than a table, or the kitchen sink for that matter. So mobile robots are machines that travel from place to place and do a certain function, and because of recent technological developments, what those machines can do has greatly improved and increased. Mobile robots can do the work of people, and can do increasingly more of the things people can do. And so they find themselves going into every industry you can think of. Obviously everyone knows they’re in logistics and warehousing; everyone has already seen pictures of swarms of robots crawling along the shelves in those big Amazon warehouses, and at, what’s the name, Ocado, the big supermarkets, all those big logistics centres. But they’re also going into the outdoors, and people are looking at last-mile delivery.

Snir Benedek: 
That’s a big thing now being looked at. People are looking at security robots, using ground drones as sentries. People are looking at the medical industry. People are looking at the agricultural industry, because robots can do things out in the fields. So the variety is endless. And the thing is that the technology for building industrial robot platforms, because it’s difficult and expensive, is held by big companies, big corporates. But the great ideas for new uses, new implementations, new applications come from SMEs, from entrepreneurs, people who don’t necessarily have that capital backing. They don’t have access to that technology, and getting it from the big corporates costs a lot of money. This is the gap that needs to be crossed, that needs to be bridged, and what we have set out to do is make this technology accessible to people who don’t have that kind of capital backing.

Snir Benedek: 
This is the solution as we see it: we need to bring it closer to the people. Now, we can’t make one machine that will do everything and that people could easily procure, because the machines are highly specific; every payload that does a certain job is very specific to the work it does. But if it’s on a mobile base, mobile bases have a far more common function: they just need to get the payload from place to place, right? And so what you see is that the developers of these platforms can buy something off the shelf, a generic robotic vehicle which comes as is and does what it does, and they have to cope with it; or they go and make their own, and making your own is lengthy.

Snir Benedek: 
It’s expensive, it requires expertise, and not everyone has the expertise. This is where we come in. We have created a platform which is very, very capable and very flexible, and it combines the best of both worlds. We tailor it to your specific application, and it is bespoke for your needs in the sense that you get it the size you want, and you can get it with the number of motors you want, depending on how much power you want and how dynamic, agile, and nimble you want it to be. And you get it very fast and for a very affordable price. That is how we are attempting to bridge this gap. So anyone making a robotic application can let us know what the dimensions of the platform should be and what the dynamics are required to be, and within days they will have their platform delivered to them.

Philip English: 
Wow. This sounds like a fantastic solution. Obviously I’ve done some research on the website, but that explanation explains it a lot more to me, and I totally agree with you on the solution side. There are lots of robots coming out, and companies are investing, or having to look into them and buy them, but they don’t quite know how they’re going to use them or what they’re going to do. Whereas you need to start at the other end: what’s the problem, what do we need to solve, and what’s the ideal thing, instead of trying to squeeze a round peg into a square hole, to use that analogy. So if you’ve got a problem, and you have a specific build that you can make which is tailored for the company, then that makes a lot more sense. So how would it work, going into a bit more detail technically? How does it work from a vision and mapping point of view, the actual software? Let’s say that I’ve got a solution where I need a small mobile platform. I’ll come to you to actually build it, but what’s the technology like within the robot?

Snir Benedek: 
Okay. There are several levels to get through in order to be able to deliver that technology packaged to you. First off, there’s the physical platform. The secret sauce, if you will, in being able to make a physical platform that is very flexible and yet very capable, is in compression, in functional density: making everything smaller and yet highly capable. So the cornerstone of everything is this wheel. This is the first development of the company, the very wheel that we’re making. It’s exactly as big as the diameter of an urban scooter wheel; this is actually an urban scooter tyre that goes on it, very standard. But the size of it belies the complexity of what’s going on inside.

Snir Benedek: 
There’s a whole lot going on inside. In the hub there is a very special electric motor, there are the control electronics, and there are sensors for measuring the motion, the temperatures, and the operational parameters of the wheel. The important thing is that each wheel is self-controlled: there is a processor within it as well, and that processor controls the individual motion of the wheel. So by integrating the electrical system with the electronic system and compressing it into the smallest possible size, we get a building block that is by itself complex, but as a system, the architecture that comprises these building blocks and other elements is very, very simple. Unlike the traditional designs that you know well: if you open a robotic vehicle, or indeed an automobile, for example, you will see a great number of parts interconnected with a complex spider web of connections.

Snir Benedek: 
Everything is connected to everything, and that kind of cable harness is complicated. In our case, because everything is integrated, everything is connected with a very simple cable harness carrying communication, power, and not much more than that. And so the system itself, as a system, is exceedingly simple. That is the secret sauce, if you will, at the hardware level. Now, how do we control the number of wheels? This is what the actual patent of the company is about. It’s not about the wheel; it’s about a varying number of wheels controlled by a very simple computer. The computer does not need to be very powerful, because each wheel controls itself. So what the central computer, the central controller, does is just synchronize between the wheels in order to make them behave like a vehicle.

Snir Benedek: 
So whether you’re using two wheels or four wheels or six wheels, the mathematical model for behaving as a vehicle with that number of wheels is already inside. So after a quick configuration, using the software that is proprietary to Benedex, and part of that software is the mathematical model, part of it is the control of the wheels, your vehicle is ready to go. But at that point we are still not at the spatial perception level. We’re not there yet. What you have now is a blind horse, if you will, because the system can now control its own wheels, its own legs, but it still does not know where it is. It needs some kind of command from above, from something that knows where it is and that makes decisions about where it needs to go.
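The idea of one simple central controller synchronizing any number of self-controlled wheels can be sketched roughly as follows. This is a hypothetical illustration, not Benedex’s actual software: the differential-drive layout, the function names, and the dimensions are all assumptions.

```python
# Hypothetical sketch: a central controller maps one body-velocity command
# onto any number of self-controlled hub wheels (differential layout assumed).

WHEEL_RADIUS = 0.1   # metres (assumed, roughly urban-scooter-sized)
TRACK_WIDTH = 0.5    # metres between left and right wheel rows (assumed)

def wheel_speeds(v, omega, wheels_per_side):
    """Return per-wheel angular velocity setpoints (rad/s) for a body command.

    v     -- forward speed of the vehicle (m/s)
    omega -- yaw rate (rad/s), positive turning left
    """
    v_left = v - omega * TRACK_WIDTH / 2.0
    v_right = v + omega * TRACK_WIDTH / 2.0
    # Every wheel on a side gets the same setpoint; the central controller
    # only broadcasts it -- each wheel's own processor closes the speed loop.
    left = [v_left / WHEEL_RADIUS] * wheels_per_side
    right = [v_right / WHEEL_RADIUS] * wheels_per_side
    return left, right

# Straight-line driving at 1 m/s on a four-wheel platform:
left, right = wheel_speeds(v=1.0, omega=0.0, wheels_per_side=2)
```

The point of the sketch is that adding wheels only lengthens the lists; the central computer’s job stays the same regardless of the wheel count.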

Snir Benedek: 
So because navigation, guidance, and perception are quite complex, very complex, we are complementing that side of our business with partnerships with top-end companies in those fields, because it would take us many years to develop a very good system of our own. What we’re doing is building a very robust software platform that handles the control of our platform and is also meant to be very easily integrated with other means of control. That’s why we are currently working on a ROS interface, with the Robot Operating System, so that we can fit into a ROS system as a ROS node. You will not need to program the wheels by themselves; you will see the entire system, however many wheels it has, as one ROS node, and you’ll just need to command it and let it know where to go. And we will also complement our platform with SLAM and with position and localization measurement hardware that we won’t be making ourselves; that is something we’ll do by partnering with best-in-class companies in that field.

Philip English: 
Right, I see. So you’ve got the baseline components, and then for the vision side you’ve got partners for that. So if we were looking to you to get one of your units, is it normally SLAM technology that you see as the most widely used for these types of AMRs, or is it better to do it with, say, Wi-Fi signals, where you have a fixed access point?

Snir Benedek: 
That really depends on the situation. If you’re running in an open field, SLAM doesn’t contribute much, because there aren’t any obstacles to perceive. As you move into a more real environment, say an urban environment, or even indoors, the situation gets rather more hairy, and that’s where you need visual means of perception, and software that understands what to do with that kind of information. You will also need means for positioning measurement. What the platform does have, and this is a very significant advantage among others, is that because we are using direct-drive technology, there are no gears within the wheels; it’s all direct drive. And that means that the motion measurement of the motor is the same as the motion measurement of the wheel.

Snir Benedek: 
And by using a high-resolution motion sensor, we can actually measure the motion of the wheel with very great accuracy. Coupled with other advantages of direct drive, one of which is being able to attain higher speeds, and, very importantly, this being the most energy-efficient system that physics can allow you to have, because there are no gears inside, there’s no wastage of energy through internal friction. So with very high-accuracy, high-precision measurement of the motion, you can get great control at both low speed and high speed.
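The "motor encoder equals wheel motion" point can be sketched as simple dead-reckoning odometry. This is a hypothetical illustration of the general technique, not Benedex’s software; the encoder resolution, wheel radius, and track width are assumed values.

```python
import math

# Hypothetical sketch: dead-reckoning odometry for a direct-drive platform.
# With no gearbox, the motor's high-resolution encoder measures the wheel's
# rotation directly, so pose can be integrated from raw encoder ticks.

WHEEL_RADIUS = 0.1      # metres (assumed)
TRACK_WIDTH = 0.5       # metres between wheel rows (assumed)
TICKS_PER_REV = 4096    # encoder counts per wheel revolution (assumed)

def update_pose(x, y, theta, d_ticks_left, d_ticks_right):
    """Advance the (x, y, theta) pose from encoder tick increments."""
    per_tick = 2.0 * math.pi * WHEEL_RADIUS / TICKS_PER_REV
    d_left = d_ticks_left * per_tick      # distance rolled on the left
    d_right = d_ticks_right * per_tick    # distance rolled on the right
    d_center = (d_left + d_right) / 2.0   # distance moved by the body
    d_theta = (d_right - d_left) / TRACK_WIDTH  # change in heading
    # Integrate along the arc using the midpoint heading.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta

# One full revolution on both sides -> straight-line motion of 2*pi*r metres.
pose = update_pose(0.0, 0.0, 0.0, 4096, 4096)
```

The higher the encoder resolution, the smaller `per_tick` and the finer the low-speed control, which is the advantage being described.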

Philip English: 
Right, I see. And then I saw, from the performance and weight side, that every wheel can take 125 kilos. Is that right?

Snir Benedek: 
Well, in the sense of loading, yes: how much weight can you put on it until it breaks? We designed it for 150 kilos of loading. That’s not a big deal, it’s not very difficult; the more metal you use, the more you can pack on there. So yes, we designed it for very high loads. Our platforms, even the simplest one, begin with a carrying capacity of at least 200 kilos.

Philip English: 
Okay, great. And then on the performance side, how fast can these wheels go?

Snir Benedek: 
It’s a function of the voltage you apply. Our system can work with 12, 24, or 48 volts, and that has a direct bearing on how fast it goes. At 48 volts, we can attain speeds of around 14 kilometres per hour with these wheels.

Philip English: 
So you’ve got the speed there. The other question I had is: is it possible to make the wheels wireless? You mentioned the control box; does that have to be with the robot, or can it be wireless, with the signals coming down directly?

Snir Benedek: 
Of course, of course. No one is thinking about controlling this with a tether; that’s not the idea. The platform is self-powered, obviously, so it has its own batteries and its own computer on board. Now the question is, how do you want to command it? Do you want to command it with a joystick? That’s fine, you can command it with a joystick. You can connect it to any manner of ROS-based system that may itself be wireless, because, as we said, we provide the mobile base for your platform. However, there has to be another computer that tells the payload what to do, and usually that is also the computer that tells the entire AMR, the entire robot, where to go in order to carry out the next action. So we can very quickly and easily connect to all those computers and receive from them the command of where to go. And then we will know how to interpret the command, because our software has been programmed to navigate from point A to point B. So you just tell it where point B is; assuming it’s at point A, it knows where it is, and it will carry out that command.
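The "just tell it where point B is" idea can be sketched as a minimal go-to-point controller that turns a goal position into body velocity commands. This is a generic textbook-style illustration, not Benedex’s navigation software; the gains, speed limit, and function name are assumptions.

```python
import math

# Hypothetical sketch: a minimal go-to-point loop. Given the robot's current
# pose (x, y, theta) and a goal (gx, gy), produce a (v, omega) body command.

def go_to_point(x, y, theta, gx, gy, v_max=1.0, k_turn=2.0):
    """Return a (v, omega) command steering the robot toward (gx, gy)."""
    dx, dy = gx - x, gy - y
    distance = math.hypot(dx, dy)
    heading_error = math.atan2(dy, dx) - theta
    # Wrap the error into [-pi, pi] so the robot turns the short way round.
    heading_error = math.atan2(math.sin(heading_error), math.cos(heading_error))
    # Slow down near the goal and when pointing the wrong way.
    v = min(v_max, distance) * math.cos(heading_error)
    omega = k_turn * heading_error
    return v, omega

# Goal 5 m dead ahead: drive forward at full speed with no turning.
v, omega = go_to_point(0.0, 0.0, 0.0, 5.0, 0.0)
```

In a real deployment this loop would sit above the wheel controllers, consuming localization (e.g. SLAM) output and republishing the command each cycle.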

Philip English: 
Right. So in regard to that question: you can have the wheel, and you can build it onto the system. And then, from the AMRs I’ve had experience with, you would then have another computer that sits on the whole platform and communicates with the wheels. So is it possible to take away that computer, so you have direct communication to the wheels via wireless, without that control system? The reason I say that is that there’s another company called wheelme, and they’ve got a small wheel like that, and I’m just curious to see how it sort of works, really.

Snir Benedek: 
Okay. So wheelme, we have similarities with them, but we also have our differences. With wheelme, each wheel is self-sufficient with respect to power: each wheel has a big box around it, and in those boxes are the batteries and also the control computer. They work with a different kind of topology. In our case, the common element is that each wheel has its control inside, and that’s pretty much where it ends, because we need to have one common controller for all the wheels. We are aiming at industrial applications, and to have very robust industrial applications, you still need to have some kind of wire in place. It is also mandated for the purpose of safety: you can’t have a safety system that isn’t wired.

Snir Benedek: 
And one of the very important features of our wheels is that they have a hardwired safety feature, with redundancy even. The premise of everything that we’re doing is that we’re not making things for the hobby market. These are hardworking, heavy-duty industrial platforms. This is also why they handle such high loads, and why the wheels are so powerful for their size. This wheel weighs 3.6 kilos, which is very light, yet it can generate a thrust of 50 kilos. That means that one wheel by itself is enough to pull a 50-kilo payload up a vertical wall. You can imagine that with four of these wheels, creating a combined thrust force of 200 kilos of push, you can get a lot moving. So they’re small, but they handle a lot of weight.

Snir Benedek: 
They’re extremely powerful and extremely power-efficient, and this is for the purpose of sustainability. Not just in the environmental sense, though that too, because the energy efficiency means that you can use most of the energy you charge them with. Sustainability also means thinking long term. I don’t mean for you to get a platform and scrap it in two years, so that in several years you’d have a mountain of used-up Benedex platforms littered around. This is not what we intend. We make the hardware very hardy, rugged, and survivable, and we make the software very easy to update, so that you can use the hardware itself, which is basically a motor and wheel, for a long time and keep updating the software. It also means that this is a system for avoiding the creation of waste, and for being able to change components quickly and easily.

Philip English: 
Okay. So you’ve got the longevity there as well. It sounds like even if you had it for five years, you could just take the components out and rebuild it for a different solution. So it’s got that ability, because the technology is constantly getting software updates, and if you did want to change to something new, then you’ve got the capabilities there.

Snir Benedek: 
Absolutely. By using the smallest number of moving parts within the wheel, which is basically a rotor and a stator, there are no gears; there isn’t anything moving around other than the two parts of the motor. And so that means it is designed for maximum reliability and minimum maintenance. Also, because the components themselves are very easy to change, they’re interchangeable, so, you know, worst case you’ve shot one of your wheels: you just replace it with a new one, it assumes the previous one’s personality, and you get rolling again.

Philip English: 
Right, fantastic. What I normally ask next, and I think we’ve gone through problem and solution, I think we’ve got everything there, is something about the future, and then obviously just the next steps for how to get in contact. But before I do that, is there anything else that you want to add to this section?

Snir Benedek: 
What I would say is that we’re starting out, and I say we because Benedex is not just me; Benedex is an amazing team of brilliant engineers. We’ve won substantial funding from Innovate UK, and that got us off to a really good start. We’ve assembled a team, including the company’s CTO, who is one of the most brilliant individuals that I know, with an extensive background and a very illustrious track record in robotics. And not just him; we also have great engineers, because this is very multidisciplinary, so we have to have talent in electrical and mechanical engineering, in robotics, and in software, and we have all that. So we’re starting out now, and we are very much seeking collaboration. Anyone who can use this technology, and there are indeed many who can, we just need to get the word out; I’m sure this will be a hit. So anyone interested is very welcome to reach out and get in touch, and we will talk about how best to benefit them with this technology.

Philip English: 
Right. And the best way to get in contact with you, Snir, is it through the website, and have you got an email address?

Snir Benedek: 
Yes: hello@benedex.co.uk.

Philip English: 
That’s perfect. Sounds great. And I suppose for the bigger future vision, since your vision is to promote automation for the empowerment of society, it’s, as you said, to make it more accessible for companies to bring this technology in and build something that’s bespoke and going to work for them, and to expand that out to as many applications as possible, really, I suppose, as the

Snir Benedek: 
Main aim, yes. We have a very clear roadmap of where we’re going. We’re just now nearing the completion of the first stage, which is solidifying and consolidating the technology. That means we’re building the motors, we’re checking the software, and we’re seeing that everything is well connected and working as a robust system; that’s stage one. The next stage is the platform. At that stage we’re going to put everything together as a system and take all the considerations as a system, and that’s when we will consolidate those partnerships with complementing technologies. That will be SLAM, that will be the navigation technology, that will be the positioning technology: things that would take us a hundred years to do if we did them ourselves. The next stage, once we have a good platform that is already selling, once we have a good platform that is out in the field, is to work on the service, because this has to be the best platform.

Snir Benedek:
And not just in the sense of hardware — we want to give our users the best experience in getting their applications deployed quickly and well. So we will be working on the documentation, on a great website, on an online sales platform, so you can put everything together on the website, hit order, and your platform will be on its way to you within days — all assembled and ready to use. That kind of service. And after we’ve done all that, I suppose we will start looking very seriously at being a powerhouse for all kinds of robotic applications ourselves, and try to get a piece of the action of making full, completely turnkey applications on the basis of everything we’ve developed to that point.

Philip English: 
Wow. I mean, I think that’s a great way. I really like the website idea, where you can go in, build it, and then just hit the order button. I mean, that sounds like a

Snir Benedek: 
Great model, yeah. There are several companies working that way — one of them is called Vention — where you can build machines on the internet and then you just click order. You know exactly what it costs, you know exactly what’s in there. Everything is transparent, everything is visual, everything is out there — there are no trade secrets anymore, everyone knows everything. And people appreciate that everything is laid bare before you: you know exactly what you’re getting, you know exactly what your options are, and you just put it together. No BS — you know the numbers, you order the thing, it’s assembled within days, instant satisfaction. And the most important thing: it works. You trust it to work. It has to be trustworthy. Yes.

Philip English: 
Yeah. So you’ve got the reliability there, and you know it does the job. And then, I suppose, some timeline questions — it was great to see the wheel today in real life. So to get to the platform stage, what sort of timelines are you guys thinking of?

Snir Benedek: 
Four to eight weeks. So we’ve got the wheels, and we’re just about ready to roll out the first version of the software that will be controlling a variable number of these wheels — that will be in the upcoming weeks. That’s when we’ll start driving the first platforms with these wheels. Up until now, by the way, it’s not that there hasn’t been anything driving around: we have the Benedex mini-me, which is a little platform we made out of regular industrial motors, but very small ones — small BLDC, brushless DC, motors. They have a similar interface; they’re just a different magnitude of hardware, and that’s working fine. I can’t wait to get the big wheels on there and see how much power they put out — it’s going to be a lot of power.
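The “similar interface, different magnitude of hardware” idea Snir describes can be sketched roughly as follows. This is an illustrative Python sketch only — every class and method name here is hypothetical, not Benedex’s actual software:

```python
# Sketch: one software interface drives wheel modules of any size,
# whether a small BLDC test motor or a full-size wheel.
# All names and torque figures here are invented for illustration.

class WheelModule:
    """Common interface: a torque command in, the applied torque out."""

    def __init__(self, name, max_torque_nm):
        self.name = name
        self.max_torque_nm = max_torque_nm

    def command(self, torque_nm):
        # Clamp to what this size of hardware can actually deliver.
        return max(-self.max_torque_nm, min(self.max_torque_nm, torque_nm))


def drive_platform(wheels, torque_nm):
    """Send the same command to a variable number of wheel modules."""
    return {w.name: w.command(torque_nm) for w in wheels}


if __name__ == "__main__":
    mini_me = [WheelModule("left", 0.5), WheelModule("right", 0.5)]
    full_size = [WheelModule(f"wheel{i}", 40.0) for i in range(4)]
    print(drive_platform(mini_me, 2.0))    # clamped by the small motors
    print(drive_platform(full_size, 2.0))  # full-size wheels take it easily
```

The point of the abstraction is that the platform software above never changes as the hardware scales up — only the module parameters do.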

Snir Benedek: 
And it’s always fun before you try something like that. So yeah, that’ll be the next two to four weeks. We will continue the development of the platform until the third quarter of 2022, and then we will start working on optimizing the manufacturing. If everything goes to plan, at that point the company will already have substantial sales, and we will start concentrating on quicker lead times, better manufacturing quality, documentation, and what I described as better service to our customers. But yeah — we are actually ready to speak with early adopters about getting some payloads on platforms. We are at that point

Philip English: 
Now. Right. Perfect. Well, I think that’s been a great overview, Snir — it’s very much appreciated. And obviously, guys, if you want to get in contact with Snir, you’ve got the email address there. I think it’s going to be a great technology and we’re looking forward to seeing more from you guys in the future. So thank you, Snir. Thank you for your time, sir.

Snir Benedek: 
And thank you very, very much. No worries. Thank you.

 

Robot Optimised Podcast #5 – Snir Benedek of Benedex LTD

Benedex LTD : https://benedex.co.uk/

Philip English: https://philipenglish.com/

Sponsor: Robot Center : http://www.robotcenter.co.uk

Youtube:- https://www.youtube.com/watch?v=knlbxEZ6mgA&ab_channel=PhilipEnglish

Miranda Software Interview with Laurent Ravagli


Hi guys, Philip English here from philipenglish.com. Welcome to the Robot Optimized Podcast, where we talk about everything robotics related. For our next episode, we have IRAI, led by Laurent Ravagli, who will talk about their leading software and robotics.



Philip English: 
Hi guys, my name is Philip English and I am a robotics enthusiast, reporting on the latest business and applications of robotics and automation. My mission is to get you robot optimised and to support industry innovation and infrastructure. So we’ve got Laurent on the call today. I’m going to have a chat with him, just to get a good understanding of his company, and then, as always, if you have any questions, please feel free to come back to me. So, going straight over to you, Laurent — could you give us a quick intro to your company and the software that you’ve got?

Laurent Ravagli: 
Yep, sure. So thank you for this interview. I’m Laurent Ravagli from the IRAI company. We’ve created software for education and industry since 1988, and these last years we developed a software called Miranda. It’s a software to simulate robots for education. There are a lot of robots you can simulate — pretty much everything that’s popular is already in the software. So you can create challenges, make your students do those challenges, create challenges yourself, see their progress, et cetera.

Philip English:
Oh, I see. So the company has been around since 1988, did you say?

Laurent Ravagli: 
Yeah, that’s when the company started. There wasn’t robotics like this at that point — it was mostly PLCs, so we were more in that field, on the education side. It was a software called Automgen, which still exists: basically you program in one language and it works with whatever PLC you want, so you don’t have to learn new software every time you want to run a PLC. It’s really popular in France — not so much in the rest of the world, because France is pretty much our one market — so a lot of the schools there use this product.

Laurent Ravagli: 
And after that we developed some more software — another one more on the schematics side: electrical drawings, coding, et cetera. Then, about ten years ago I think, we pushed more into industry with a virtual commissioning software. There we’re also still working with PLCs, because it’s software you can connect to the PLC — you’re really creating a digital twin of the machine you are trying to build. And basically, these last years, in France and in the world, we saw that this wasn’t reaching the schools as much anymore — you know, they are switching to Scratch, Arduino, Python, et cetera. So we decided to build a product for this. We had previously made a simulator before Miranda, specific to some robots and a drone.

Laurent Ravagli: 
And we had a lot of feedback that we didn’t expect. So we decided to create Miranda, taking all the tools from the old products and implementing them in this software. It’s software we developed for children aged ten plus, so they can really start programming with Scratch. What’s good is that we do both Scratch and Python, so they can continue to use the same software over the years and see different types of programming, and of course make their own challenges — there are no real limits to the software. We’re trying to build a community, so people are sharing challenges and so on — really trying to give all the tools to the teachers. And basically, yeah, we sell to schools, but we are also trying to sell to home users. Right now we are trying to see if that can work, because, you know, children play at home too. So maybe that’s something we are trying to dig into.

Philip English: 
Yeah, that’s really interesting. Just going back to the history there — I was doing a bit of research, and I didn’t realise you guys had been around that long, doing the PLC work and everything else, before moving into this type of technology. I was almost going to say you should have some of that history on your website, because it shows a real wealth of authority — people would see that you’ve come from this background and now you’ve moved to this. You’ve got skills from the past, you’ve done it with PLCs, and this is just a natural progression; you’ve got the experience. So that’s really interesting to know. I suppose, from a problem-solving point of view, what would you say are the main problems you’re solving?

Laurent Ravagli: 
So right now, with the current situation, of course, it’s giving students the tools to be able to code when they can’t access the robots — so the teacher doesn’t have to say, “I won’t run the coding course because we don’t have the robots.” It’s also a cost-effective solution, because in a classroom you don’t have the number of robots you need for all the students; with a simulation of the robots, you don’t even have to have the robots. Some robot manufacturers were wary at first about us simulating their robots in the software, but some of them are actually adapting to this and contacting us directly to build something with us.

Laurent Ravagli: 
Yes — we’re always trying to give tools to the teacher, and in industry, with our other products, it’s the same: giving something that helps industry or education be more effective. In industry, for example, with the virtual commissioning software, you don’t have to build a prototype: you can build a 3D model of the machine and connect it to the PLC, so you can start working on the machine before the prototype exists. You can present the machine as a working system and see what the problems would be. So it saves time and it saves money.

Philip English: 
I see, I see. So it sounds like you’ve got two different areas there. For education, if teachers or parents or kids haven’t got the hardware, they can use your software to practise their coding skills without actually having the hardware there. And then in the corporate world, people who want to test out their ideas on a more commercial basis can use your software too — so you’re essentially saving them costs, saving them money. Now, I was having a read, and I can see that you already interact with a mixture of robots. Could you do a robot that someone built themselves? So if I built my own robot out of bits and pieces, would there be something, software-wise, that I would need to install to connect to your system? Is that how it works?

Laurent Ravagli:
Cool. So, basically, Miranda has three main parts: one where you just play the already-built challenges, one where you can create your own, and the last part where you can create or modify the robots. All the built-in robots can be adapted — we can hide some parts, for example — but you can also create your own from scratch. You have a library of elements: chassis, wheels, several motors, et cetera, and all your sensors, and you put those together to make your base. That’s also what I do — I didn’t explain my role in the company, but I’m the simulation development manager. For all the projects we develop internally — and sometimes clients don’t have the time or the resources, so they ask us to make something for them inside the software —

Laurent Ravagli: 
— that’s basically my role in the company. Within Miranda, right now I’m on a project like this: I’m incorporating robots from one of our resellers. The way I do it — and a teacher or a parent can do the same themselves — is you create the base of your robot, then you can mask the elements you don’t want to see, and import a 3D skin for the robot. Basically, you already have all the functions inside the software; you put them on the base, put the skin over it, and you have your robot ready to drive around. And we don’t limit this system to robots: you can import 3D models — a tree, for example — and use them to build your scene, so you can really make what you want. For example, right now I’m building a parking gate — a system that a reseller we’re working with wants to implement, which works with an Arduino board that you can program from Miranda.

Philip English: 
I see. And then — I was reading that the software is in the cloud as well? So you can store your models and everything inside the cloud, and then just access it from any device. Is that

Laurent Ravagli: 
Yeah, exactly. This is a big point for schools, because — I don’t know about the rest of the world, but in France — installing software on the school network is a pretty long process. The schools don’t deal with it themselves, it’s handled regionally, so it can take months for software to be installed. So we wanted it to be online: it runs from the web browser, from tablets — or you can install it directly from the Microsoft Store onto the PC. That’s something we want to do for our other software too: for the first one, the PLC software, we are trying to make an online version as well. There’s been really good feedback about that.

Philip English:
Yeah, and obviously it makes it a lot easier to access. And what I loved about it was the challenge creation. So if I was a teacher and I had a group of kids, I could give each of them a challenge, is that right? I could build a challenge that they would have to code. And is that dependent on the robot I’ve selected for them — this is your challenge for this robot — and then can I set them another challenge for a different robot? Is that how it works?

Laurent Ravagli: 
Yeah. Basically you build the scenes, and then you add the robots to them. For example, you create your course, you add a specific robot, and you give it to the students. And maybe afterwards I want to reuse the same course — the teacher just changes the robot and it’s good to go. Something we’ve thought about but haven’t done yet is to put several robots at the start line and let the student choose which robot they want to program; that’s also a possibility. So basically you create a scene with the robot, and we have predefined challenges for each robot — sometimes on the same course, for example the follow-the-line challenge, which is pretty simple.
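As an aside, the follow-the-line challenge Laurent mentions boils down to simple proportional steering. A rough, standalone Python sketch of the logic a student might write — the sensor-reading and wheel-speed interface here is invented for illustration, not Miranda’s actual API (in Miranda this would be built from Scratch blocks or its own Python bindings):

```python
# Sketch of a follow-the-line controller: turn line-sensor readings
# into left/right wheel speeds. The interface is hypothetical.

def follow_line(readings, base_speed=1.0, gain=0.5):
    """Map line-sensor readings to (left, right) wheel speeds.

    Each reading is in [-1, 1]: negative means the line has drifted
    left of the robot, positive means right, 0 means centred.
    """
    commands = []
    for error in readings:
        # Proportional steering: speed up one wheel, slow the other,
        # so the robot turns back toward the line.
        left = base_speed + gain * error
        right = base_speed - gain * error
        commands.append((round(left, 2), round(right, 2)))
    return commands


if __name__ == "__main__":
    # Line drifts left, robot steers back, line re-centres.
    print(follow_line([0.0, -0.4, -0.2, 0.0]))
```

Raising `gain` makes the correction sharper — exactly the kind of parameter a teacher could ask students to tune when the course gets harder.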

Laurent Ravagli: 
But after that, we really go into the editor and show the teacher how to modify the difficulty — for example, add an obstacle in the middle of the line, or put in a section without any line on the ground. So it can be made really easy or harder. That’s the first step. I run some training sessions with the teachers: I show them that, and then I show them how to create their own. Basically you have all the tools to set up the scene and the lighting, but also to define what you want the challenge to do — the scenario: you have to go through the first gate, et cetera. And that’s something you can code directly from the scene, in Scratch or Python.

Laurent Ravagli: 
So this is really easy for them. As I explain how to modify things, they can start from an already built-in challenge and adapt it to their needs, and after an hour of using the software they can start building their own. It’s there for the teacher to use, but you can also give these tools to the students — perhaps with the teacher programming the starting point. We also introduced some tools for guiding the children: pop-ups, for example, that can appear to tell them what they have to do, and video as well. So you start the scene and it says: you have to put this block next, you have to connect this block to that one. It builds up progressively, rather than just handing over a challenge and the tools and saying “work it out by yourself.” We are really trying to introduce everyone, teachers and students, to the software.

Philip English: 
So you’ve got that connection there. Yeah, okay, sounds good. I was also looking at some of the different packages that you had — can you give us a run-through? I think it was a mixture: you had the number of users, but then you also had a sort of bronze, silver and gold thing going on. What was that?

Laurent Ravagli: 
So basically we have four editions. The first one is for home users — just one user, with all the functionality included. Then there’s the school edition: you have an unlimited number of users and you can also follow their progression. Basically, the editions differ in which functionality you have access to, because sometimes a teacher doesn’t want all of it. So you have the premium edition, which is much like the home one — you have access to everything. For schools there’s also an edition that’s the same but without the robot editor, so you only work with what’s in the library — the ten or fifteen robots you can use to create your scenes. And just below that is the one-robot edition: you choose the one robot you want, and you can create your scenes with it.

Philip English: 
So that’s the differences there. Yeah, okay. So, just changing gears a second — what’s the bigger picture for you guys? Is it basically to get more and more robots on there, and to build more of a community where teachers and everyone can interact? What are the next steps, the big vision?

Laurent Ravagli: 
So for now, we just launched a store, so users can share their creations. Before that we tried a forum, but it didn’t work — people didn’t seem to post very much. We had some teachers who created a lot of challenges, but they weren’t shared very well, so with the store we’re hoping to change that. You can share a robot you’ve created, or scenes — for example, courses used in real-world robotics challenges or competitions, which we are incorporating into the store for free. Everything we incorporate ourselves we put up for free, but as a teacher or a user you can put a price on yours if you want. For now, everything on the store is free, and when you download something it’s added automatically to your account.

Laurent Ravagli: 
So yeah, what we’re really trying to do is build a community around this, because we know there are a lot of creations out there — the software is gaining popularity every day; we see the number of connections growing. The big point is to get people to share with the software, not keep things to themselves. That’s really what we’re trying to create. And of course we are also, all the time, trying to reach other countries — trying to find resellers, basically. That’s how we work: we rarely contact clients directly, only through resellers around the world. For France we also sell directly, because we have a list of contacts in the schools thanks to our software —

Laurent Ravagli: 
— we know a lot of schools through our company, but for the rest of the world we go through resellers. And basically, the functionality we add is often what the resellers think will work best for their country — so the whole world benefits from it. For example, for the challenges you can now do multiplayer — put several students on the same course with several robots. That came as an idea from China.

Philip English: 
Okay, okay. So it sounds like you’ve expanded out quite quickly, quite well — you’ve got people from China, people from America, and obviously France and around Europe. So that sounds like you’ve got some good growth plans there, and a path to grow and build that bigger community. And then — do you deal with industrial robots and stuff like that?

Laurent Ravagli: 
Not in Miranda — that’s in the other software, the virtual commissioning one, which you can connect to the PLC, import 3D models into, et cetera. What you can do with that software is import robots from a manufacturer’s library — ABB, KUKA, et cetera — and what’s clever is that, in real time, it copies the movements from the manufacturer’s simulation software. That can be useful if you’re trying to see how the robot will interact with the other machines on your line, for example. With the PLC in the loop you can really see what’s going on: is the robot starting its cycle at the right time, and what will happen next? It’s a really good tool for that, because with the robot manufacturers’ own software you can simulate the robot, but you don’t really see what’s going on around it. By incorporating the 3D you can see the whole line — you have a bigger view of your system.

Philip English:
And then, does the Miranda software link with the older PLC software that you guys have? Is there a link between them, so you can use both together?

Laurent Ravagli: 
Yeah, you can use them together. You don’t usually do that, but sometimes you want to take the program you made inside the PLC software and, before sending it to the PLC, run it against your digital twin. What we kept from that software is that we use all the drivers, so the digital twin can connect directly to the PLC, or to simulated devices — over IP, over OPC, everything you want — so it’s really easy on that side. And in industry, most of the time they don’t take the whole software suite; they just take the virtual commissioning tool, because it does what they need.
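The digital-twin loop Laurent describes — the twin repeatedly reading PLC outputs over a driver such as OPC and mirroring them in 3D — can be sketched like this. The PLC read is stubbed out here, and all tag names and values are invented for illustration; a real setup would go through an OPC client library rather than a stub:

```python
# Sketch of a digital twin polling PLC outputs and recording its state.
# read_plc_outputs stands in for a real driver call (e.g. an OPC node
# read); the tag names and the toggling behaviour are invented.

def read_plc_outputs(cycle):
    # Stub: pretend the PLC toggles a conveyor each cycle and counts parts.
    return {"conveyor_on": cycle % 2 == 0, "parts_done": cycle}


def run_twin(cycles):
    """Poll the (stubbed) PLC once per cycle and build a state history."""
    history = []
    for cycle in range(cycles):
        outputs = read_plc_outputs(cycle)
        # A real twin would animate the 3D actuators from these values
        # in real time; here we just record them.
        history.append(outputs)
    return history


if __name__ == "__main__":
    for state in run_twin(3):
        print(state)
```

The value of the loop is exactly what Laurent points out: because the twin is fed by the same driver layer as the real machine, the same program can be checked against the simulation before it ever touches the physical line.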

Philip English: 
Okay, no, that’s good to understand. I think that’s quite a good overview, actually — you showed us the history of the business and where you’ve come from, so that’s really good. If someone wants to get in contact with you, what’s the best way to do that?

Laurent Ravagli: 
So the best way to contact us is through our website for Miranda — that’s miranda.software. And for the company overall it’s irai.com; you’ll be redirected to the iraifrance.com site — that’s normal, it’s our main website. Maybe that’s why you didn’t see our other products: we made separate sites because they’re for different markets. The first one is more for industry, and Miranda is purely for robotics.

Philip English: 
Right, okay. Well, I’ll have to go and check that out to see the differences. But on the Miranda side — obviously the robotics side, which is what we’re keen on — that’s a great overview. And guys, if you have any questions for Miranda or for Laurent, please feel free to send them over. That was a great interview and I really appreciate your time. So thank you, sir. Thank you very much.


Robot Optimised Podcast #4 – Laurent Ravagli of IRAI

IRAI Robotics : https://en.iraifrance.com/

MIRANDA Software : https://www.miranda.software/

Philip English: https://philipenglish.com/

Sponsor: Robot Center : http://www.robotcenter.co.uk

Youtube:- https://www.youtube.com/watch?v=fq48xa42qYw&ab_channel=PhilipEnglish

Indus four Interview with Arthur Keeling


Hi guys, Philip English here from philipenglish.com. Welcome to the Robot Optimized Podcast, where we talk about everything robotics related. For our next episode, we have Indus Four, led by co-founder Arthur Keeling, who will talk about their leading technology solutions.

Philip English: 
Hi guys, my name is Philip English and I am a robotics enthusiast, reporting on the latest business applications of robotics. My mission is to get you robot optimised and to support industry innovation and infrastructure. Today we’ve got Arthur Keeling from Indus Four, who are really looking to redefine how organisations access and control automation to solve their problems. So Arthur, hello — nice to meet you, and thank you for your time. I think the first thing we wanted to go through is a bit of an explanation about yourself and the company, if that’s okay — just to give us an overview.

Arthur Keeling: 
Of course. Well, Indus Four was founded just before the first lockdowns of COVID, and we set out to deliver automation for people by tackling problems they hadn’t normally thought could be automated — offering solutions that help them tackle challenges they maybe didn’t think they could automate. Over the last twelve months, obviously, the world has changed beyond recognition, but what that has meant for us is that we’ve become the go-to for automating tasks that people thought they couldn’t automate. That has led to a lot of work with pharmaceutical companies, the NHS, and also food producers we’ve been speaking to. These were jobs they didn’t previously think they even wanted to automate, but events overtook them, and we’ve been helping deliver sometimes prototypes, sometimes working solutions — providing the tools for them to automate tasks they may not have wanted to approach five years ago.

Philip English: 
Wow. So because of the pandemic, those end users are thinking a bit more about how they do their normal manufacturing and processing, and then they’ve come to you to say: okay, look, we do need to think about this, like we’ve never done before, and we need some smart, creative solutions to actually get it working.

Arthur Keeling: 
Absolutely. So we’ve brought together a team made up of AI vision specialists, plus mechanical engineers and electrical engineers. Having that broad-skills team, and being located next to the Bristol Robotics Laboratory, gives us access to huge amounts of knowledge — by pooling resources like that we’re able to tap into researchers from the universities in Bristol, and we’ve got our own offices right next door, and that collaboration helps us tackle a lot of those challenges. It’s often tasks that traditionally weren’t a problem but are now — whether because having to distance staff means automating a process, or because it’s a job they didn’t do before all of this and have now discovered needs automating because it’s burdensome and time-consuming. Quite often we find it’s valuable staff doing mundane work, and they’re having to find a way of freeing up those staff, whether it be doctors or people working in pathology departments and things like that.

Philip English: 
Right, I see. So the business is only about a year or two old?

Arthur Keeling: 
Just over a year old now.

Philip English: 
And in the history of the business, do you come from a Bristol Robotics Lab background yourself?

Arthur Keeling: 
I was at university here in Bristol as well, which is how I became associated with the lab. Before starting this I'd worked on various projects with other companies, in 3D printing technologies and with drones, and that combination of projects enabled us to bring together the team that's now delivering what we do, offering those services to people who need these tasks tackled. It's a broad range of skills learned over quite a few years. We've got a team of ten now delivering this, and we're also able to use external people in other areas to help support it.

Philip English: 
Right. And I suppose the problems you're solving must be a complete mixture. From some of the other interviews, we see a lot of vendors focusing on one particular problem, but it sounds like you speak to the customer, get an understanding of what they need to do, work out what their overall problems are, and then come back to solve them. So I suppose the question is: are you finding the same sorts of problems recurring, or are they completely different for every customer?

Arthur Keeling: 
It's a really good point, and actually no. We're tackling problems which often turn out to have mass-market appeal. So we find we're rarely working on a one-off: when we take on a project, we analyse what else is available out there, and we realise the customer isn't the only one with that problem. Those are the problems we've focused on, and by using the platform we're developing we're able to scale those benefits to other people rather than building one-off products. That's something we treat as really important when working with customers: we look beyond the immediate case, always looking one, two and five years down the road at how we can deliver those benefits, not just in the short term.

Philip English: 
Right. And do you end up supplying the customer with a finished product, a hardware solution, or finished software? Is that how it works?

Arthur Keeling: 
A combination of the two. We have our own platform that we've developed to support our hardware roll-outs, and we're looking at how we could partner with other manufacturers to offer that platform as a standalone product, while continuing to show what's possible using our software with our own hardware solutions. That's what we've been deploying with some pharmaceutical companies and within the NHS, supporting some of the work they're doing, and we've been using those as the case studies and proof points of what's possible with a new way of handling automation. These are often highly trained individuals, sometimes with multiple PhDs, whose knowledge is incredible, but they're not robot experts, and it's about making automation accessible to them. That's how we've tried to gear up and deliver these tools.

Philip English: 
Deliver the tools, yes. And it's your own software system? INX, is that the one? I was doing a bit of research on your website. Or is that something completely different?

Arthur Keeling: 
That was one of our earlier prototypes of the system, version one, and we're evolving as we go. That's been a really key learning point for us: as we've worked on a number of projects over the last year with a range of different customers, that learning has gone into the platform to improve it. And we're working with manufacturers of off-the-shelf components in Germany and the UK, and by working with them we're able to bring the benefits of our platform to them as well.

Philip English: 
Right, okay, that's really interesting. So it's about solving the customer's problems and then having a look at the wider solutions. I suppose, what's the bigger picture then? If you've got your own platform, is the aim to grow the platform and have the ability to integrate with lots of different vendors? Is that what you're saying?

Arthur Keeling: 
Well, much like yourself, a robot enthusiast, our team are enthusiastic about machine vision, robotics, and getting automation out there, and it's about getting it to more people. I think in the next 15 years we're going to see automation and digitalisation of manufacturing processes on a scale beyond all that's come before, and it's going to be a really exciting time to see what happens. We're trying to enable more people to benefit from that change, whether that's simple sensors you put into a shop to help control the temperature and reduce energy use for environmental reasons, or controlling the climate in a chicken shed, or a robotic arm doing pick-and-place at the end of the latest vertical farming installation. It's about providing a platform so that more and more people can access these tools as they try to embrace better ways of working.

Philip English: 
Wow, yeah, it sounds like you've got some great projects there. I quite like the idea that you work on a solution and then see how you can take it to the mass market. I've seen people do that before, and they would normally brand it and create their own company, spinning other companies off the main one. Is that something you see in your future?

Arthur Keeling: 
That's not how we operate at the moment. By pooling our resources and developing these products in-house, we're able to deliver the best value for money for people. Rather than spinning everything off, we keep knowledge and skills contained within the company so that we can take learning from different projects, because we often find there's crossover between them, and we don't end up with different spin-off companies competing with each other. Bringing it all together, with that collaboration between all the projects within the company, lets us take lessons from areas you wouldn't have thought you could learn anything from, and you pick up all sorts of great insights. That's also one of the benefits of being based somewhere with access to people like the Bristol Robotics Lab: you have conversations with people, and those are the moments that can really help you fix a problem or work out how to get something moving in a certain way. That's something I think we're all looking forward to once we can get back into the office; design and engineering meetings are much more challenging when they're done like this, remotely.

Philip English: 
Or in the virtual arena, yes, indeed. Okay. I suppose the other question I had was about opportunities to work with you. If I were a customer, say a farmer or a hospital, what's the best way to work with you? Is it literally to say: look, here are some videos, here are some pictures, this is what we need to do, can you have a go at creating a system for us? Is that how you work?

Arthur Keeling: 
Yeah, absolutely. Often we're approached by people who come to us saying: this is my problem, and this is the challenge I'm looking to solve. We then evaluate it internally and see what we can do; normally it's just a consultation and a conversation where we can scope it out. We had a project this year where we had our first meeting on the 11th of January, and we're already rolling out the product for them in the coming weeks, so that's where we bring sheer speed and agility as a team. That's also sometimes why we've partnered with larger organisations: larger organisations can be slower-moving on tasks like this, and they've come to us saying, you've got the speed and agility we're looking for, we'll work with you on a project. So we've had that whole range, from individuals coming with photos and problems they'd like solved, all the way to larger companies saying they're looking to partner with an agile automation company to help fix problems.

Philip English: 
Right, right. You guys do some amazing stuff. So what would be the next steps for getting in contact with you, and what sort of industries would be the best match?

Arthur Keeling: 
Well, at the moment we've got a really strong focus on the medical sector, and we're working closely with a large number of doctors and pharmaceutical companies. So if you're in the medical sector, we'd love to hear from you. We've got a number of products in development in this space at the moment, ranging from primary care all the way to pathology departments, and we'd be really interested to hear from you to see if we could bring these products to benefit you, and also to improve them with your feedback and thoughts. It would be fantastic to talk.

Philip English: 
Right, fantastic. Well, that sounds very exciting. I think what we'll do, guys, is catch up with Arthur in the next three to six months and see some of the projects the team are working on. It sounds like some very exciting stuff. Many thanks for your time today; I very much appreciate it.

Arthur Keeling: 
Okay, well, thank you so much for having me, and I look forward to touching base in maybe a couple of months' time and updating you on how we've got on with the projects.

Philip English: 
Fantastic. Thank you, sir.

Arthur Keeling: 
Brilliant. Thank you very much for your time. Really appreciate it.

Robot Optimised Podcast #3 – Arthur Keeling of Indus Four

Indus Four : https://www.indusfour.com/

Philip English: https://philipenglish.com/

Sponsor: Robot Center : http://www.robotcenter.co.uk

Youtube:- https://www.youtube.com/watch?v=9BjcCt0kWII&ab_channel=PhilipEnglish

Extend Robotics interview with Dr. Chang Liu

Hi guys, Philip English here from philipenglish.com. Welcome to the Robot Optimized Podcast, where we talk about everything robotics related. For our next episode, we have Extend Robotics, led by CEO Chang Liu, who will talk about their leading technology in robotic arms.

Philip English: 
Hi guys, it's Philip English. I'm a robotics enthusiast, reporting on the latest business and applications of robotics. My mission is to get you robot optimised and to support industry innovation and infrastructure for the next era. Today we've got Chang from Extend Robotics. Extend build affordable robotic arms which can be remotely controlled from anywhere, and their main vision is to extend human capability: you haven't got to be there physically, you can use robots to do the tasks. So hi, Chang, welcome. I suppose the first thing we'd like is a quick explanation about yourself and the company. Is it all right if you give us a very quick overview?

Dr. Chang Liu: 
Yeah, sure. I'm Chang, the CEO of Extend Robotics. In one sentence: at Extend Robotics we build highly scalable teleoperation robotic arms that allow anyone to operate a robot intuitively and remotely from anywhere. As for myself, I come from a robotics research background. I did my PhD on how we can automate robots, and I was also a head of robotics. During that time I talked to many clients and industry partners, trying to figure out what their needs were for using robots in real-world scenarios. I've been getting very excited in recent years seeing this boom of robots being used in real applications, driven by trends like digitisation and automation, and by enabling technologies that allow intuitive operation, such as VR. But I also see a big challenge: the bottleneck of modern robotics is the limitation of operating in complex scenarios, where certain tasks require physical manipulation that is beyond the cognitive ability of today's robots, beyond, say, robots taking pictures or delivery robots. Really, what we want to do is find a new way to make complex manipulation tasks safer and cheaper.

Philip English: 
Fantastic, that's a great overview, thanks Chang. So you've come from a robotics academic background, you've seen the problems the industry has, and now you've created this company and this solution. As for the main problems: if you go through healthcare, the first thing I think about is people in hospital beds or care homes whom you can assist remotely. Industry has lots of different applications in manufacturing and distribution; general services, so restaurants and things like that, especially because of COVID at the moment; and on the agricultural side, with it getting more and more difficult to find workers, there's a problem there we need to address. And I suppose that's the solution fit for you. I was reading about something called AMAS, is that part of the solution?

Dr. Chang Liu: 
Yeah. AMAS is short for the Advanced Mechanics Assistance System, which is essentially a special VR application that connects to a real robot, allowing people to use the intuitiveness and immersion of VR to operate robots more easily. We believe that's the new way people will control robots.

Philip English: 
And this is connected to the cloud, and that's where the link is? Is there a server on the cloud that all the information links to? How does the communication work?

Dr. Chang Liu: 
Yeah, the system, by design, is network-ready. The robot connects to a network, the AMAS software connects to the same network, and they communicate over it. The network is flexible: it could be WiFi, 4G, or even 5G. In fact we're in partnership with a few 5G testbeds to trial how 5G could improve our service. But the system is fully network-ready, so it can connect to any network. We've also worked on integrating remote servers into the system, so you can have a remote server, or something like a VPN, that lets you communicate over the internet. That allows people to work remotely: the user uses the VR interface to operate the robot over the internet, so you don't have to be in the same location.
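
Extend Robotics hasn't published its wire protocol, but the network-agnostic operator-to-robot link Chang describes can be sketched in a few lines of Python. Everything here, the JSON message shape and the single-command server, is a hypothetical stand-in for illustration, not the actual AMAS protocol:

```python
import json
import socket
import threading

def robot_server(host="127.0.0.1"):
    """Minimal stand-in for a 'network-ready' robot: accepts one JSON
    pose command over TCP and echoes an acknowledgement back."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))          # port 0: let the OS pick a free port
    srv.listen(1)
    port = srv.getsockname()[1]

    def serve():
        conn, _ = srv.accept()
        cmd = json.loads(conn.recv(4096).decode())
        # A real robot would move its arm here; we only acknowledge.
        conn.sendall(json.dumps({"ack": cmd["pose"]}).encode())
        conn.close()
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return port

def send_pose(port, pose, host="127.0.0.1"):
    """Operator side: send a target pose (e.g. from a VR controller)."""
    with socket.create_connection((host, port)) as cli:
        cli.sendall(json.dumps({"pose": pose}).encode())
        return json.loads(cli.recv(4096).decode())

port = robot_server()
print(send_pose(port, [0.3, 0.1, 0.5]))  # → {'ack': [0.3, 0.1, 0.5]}
```

The application code never cares whether the bytes travel over WiFi, 4G, 5G, or a VPN tunnel, which is the point of being network-agnostic.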

Philip English: 
Perfect. Yeah, I suppose that's the main thing: you want security around it, so if anyone's going to be logging on to the robots, you want to make sure they're secure. I did go through a few of your videos, and I saw you at the robot restaurant up in Milton Keynes, Robotazia, run by Mark and Joe. I've been there; it's quite a good place to get to if you're in the UK. So is that sort of testing a good example of how you would use it?

Dr. Chang Liu: 
Yeah, I would say that was a concept demonstration and a technology showcase of our system. We're still exploring the business case behind it, but it's a great showcase of what the system can do. We basically did a quick set-up there and used it to see whether it could pour a beer, and how good the quality was, which was cool. That was actually only one day before the second lockdown closed things down, so we were lucky we managed to do the demonstration. It was quite a simple demo, but the system is not designed to do one particular task: it's designed as a general system for many different tasks, depending on how people want to use it. One day it could be used for bartending, another day for something else entirely. You get that flexibility from the AMAS human interface, because it's still literally a human controlling the robot, just in a more intuitive way.

Philip English: 
Yeah, and I love the flexibility: you can use it for multiple tasks. Especially in restaurants, we've seen a few robot projects looking to automate the restaurant, but the robot can't do everything, so you'd still need someone to log on and do that piece of work. And the thing that gets my attention is maybe even the home as well: if you're making a meal at home and want someone to give you a hand, you could have a service where someone logs on and actually starts to make the food for you, especially for an elderly person who's struggling. So that's interesting. I was going to ask: the main aim is obviously to get these robots out there to make tasks much easier without having to be physically present, so what's the bigger picture? Where do you see yourselves going?

Dr. Chang Liu: 
Well, the system is a general remote-manipulation solution with a low-cost advantage and an intuitive interface, and we believe the problem it solves is applicable to a wide range of industries. That ranges from the utility industries, where you want to do special inspections; the nuclear industry, say decommissioning nuclear facilities; healthcare, where you want to take care of patients remotely, even when there's a highly dangerous virus; the agriculture industry, where there's a shortage of labour in local areas, so you can quickly utilise labour from outside the area; and public service and hospitality, where you want a remote working solution for your workers. It could even be usable in space.

Dr. Chang Liu: 
So those are wide-ranging industries. We believe the key market will be the utility industry; that's where we see the most desire from customers, though it sometimes needs a more complete solution than what we're offering at the moment. We're also looking at public service industries, which may have an easier cost threshold to start with. But yes, we see the key market as the utility industry.

Philip English: 
Makes sense, yeah. I also saw the video about the first COVID birthday party; that was very impressive.

Dr. Chang Liu: 
That was our early prototype. It was done about 50 miles apart, from the robot to the operator: I was controlling the robot from Reading while the robot itself was in central London. We were able to operate the robot over the internet, and that turned out to be quite fantastic.

Philip English: 
And a quick question I thought of: if you were operating from that sort of distance, you'd have to make sure the internet connection is stable, secure and powerful so it doesn't cut out. That's quite a key thing?

Dr. Chang Liu: 
Yeah. Improved network capability helps us, just as 5G improves latency and bandwidth, and therefore robustness. But the system can operate in restricted bandwidth, and it's designed to be safe even if the network breaks down: worst case, you lose control and the robot stays where it is, safely. Also, our video streaming pipeline, which is patent-pending, delivers a lot of information within a low bandwidth. We're working on further improving the streaming algorithms, and that basically forms one of our USPs for achieving this kind of system in realistic scenarios, especially on a constrained network.
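
The fail-safe behaviour Chang describes, where losing the network simply means the robot stays put, is commonly implemented as a heartbeat watchdog. The sketch below illustrates that general pattern; the timeout value and state names are invented for illustration, and this is not Extend Robotics' actual implementation:

```python
import time

class TeleopWatchdog:
    """Illustrative fail-safe: if no operator heartbeat arrives within
    `timeout` seconds, the robot holds its current pose."""

    def __init__(self, timeout=0.5):
        self.timeout = timeout
        self.last_beat = time.monotonic()
        self.state = "following"      # tracking operator commands

    def heartbeat(self):
        """Called whenever a packet arrives from the operator."""
        self.last_beat = time.monotonic()
        self.state = "following"

    def tick(self):
        """Called periodically from the robot's control loop."""
        if time.monotonic() - self.last_beat > self.timeout:
            self.state = "holding"    # network lost: stay put, safely
        return self.state

wd = TeleopWatchdog(timeout=0.05)
print(wd.tick())   # → following
time.sleep(0.1)    # simulate a network dropout
print(wd.tick())   # → holding
```

Worst case, the operator loses control and the arm freezes in place, exactly the behaviour described in the interview.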

Philip English: 
Right, perfect. And the only other question I had was around investment: is the company going through an investment scheme, where you have investors on board, and with every step up you get more investment and grow and grow? Is that the path the company is on?

Dr. Chang Liu: 
Yeah, we're an early-stage startup. We have been raising investment, and we're still looking for further investment, so if anyone's interested, we'd welcome a more detailed conversation about that.

Philip English: 
That's fine, thanks very much. Well, I think that's it for my questions. I suppose the next step is: if we wanted to purchase a unit, and I know you're a startup, can we purchase a unit now, or is it still a bit too early?

Dr. Chang Liu: 
Well, our strategy is that we're working towards an integrated solution of hardware and software. Our goal is to provide a fully integrated solution by the beginning of 2022. We've currently just started to offer software-only solutions, demonstrating how people can interface our software with a third-party robot arm. So if someone already owns an arm, they can come to us and we can discuss how to use our software with their robot arm. Our software architecture interfaces through ROS, the Robot Operating System, so any arm that supports ROS can integrate with our AMAS software very easily. We're really looking forward to hearing from anyone who has an interest, so we can work out solutions together, or they can start with a trial of our current system.

Philip English: 
Right. And the actual hardware, is that 3D printed, or is it something you're going to get mass-manufactured?

Dr. Chang Liu: 
Yeah, the current arm is 3D printed. We have a lot of experience in making sure the 3D prints meet the necessary requirements. We optimise particular structural elements of the system to make sure the core functionality isn't compromised, so it's optimised as a 3D-printed arm solution.

Philip English: 
Okay, perfect. Yeah, I think that's been a great overview, Chang, so many thanks for your time. If people want to get in contact with you, what's the best way?

Dr. Chang Liu: 
The best way is through our website, www.extendrobotics.com; leave us a message there, or you can simply email me at chang.liu@extendrobotics.com.

Philip English: 
Right, perfect. Thanks Chang. What we'll do is put all the links at the bottom of the interview as well. Cool, well, thanks for your time today, it's very much appreciated. Thank you, sir.

Dr. Chang Liu: 
Okay, cool. Awesome. Take care. Bye-bye.

Robot Optimised Podcast #2 – Dr. Chang Liu of Extend Robotics

Extend robotics: https://www.extendrobotics.com/

Philip English: https://philipenglish.com/

Sponsor: Robot Center : http://www.robotcenter.co.uk

Youtube:- https://www.youtube.com/watch?v=de2B3nS5VBg&ab_channel=PhilipEnglish

Fizyr Interview, with Herbert ten Have

Hi guys, Philip English here from philipenglish.com. Welcome to the Robot Optimized Podcast, where we talk about everything robotics related. For our first episode, we have Fizyr, led by Herbert ten Have, who will talk about their leading technology for picking up objects.

Philip English: 
Hi guys, my name is Philip English, and I'm a robotics enthusiast reporting on the latest business and applications of robotics and automation. My mission is to get you robot optimised, and to support industry infrastructure and innovation for the next era. Today we have Herbert ten Have from Fizyr, who's going to give us an overview of Fizyr, and then we're going to fire a few questions at him to see how the business runs and what solutions he has. So hi there, Herbert.

Herbert ten Have: 
Afternoon.

Philip English: 
Afternoon. And so I suppose to start with, could you give us like an overview, um, of, of yourself and your company?

Herbert ten Have: 
Sure. I'm 57, still alive, still breathing and doing all kinds of active stuff. I run a company called Fizyr, which is a spin-off from the university, founded by a professor. He had an academic view of the world, of robots doing all kinds of jobs, and he put a brilliant team together. Now we also have a business plan and we're in quite good operation. A few years ago we won the Amazon Picking Challenge, both stowing and picking. And that's the moment we stopped doing robotics: no more robots.

Philip English: 
Okay, okay. So you've moved from the hardware to the software side.

Herbert ten Have: 
The brain. Actually, what we do is like self-driving cars: we translate the image, what the camera sees, into where the robot should move. That's also why our new name, Fizyr, means scope. We look: we look for where the item should be grasped, and we look ahead at future new applications and new technology.

Philip English: 
Wow, okay. And Fizyr means scope?

Herbert ten Have: 
In Dutch, and in German and Danish, it means telescope.

Philip English: 
It's sort of the same. And I saw that you won the Amazon challenge, and that was because of the deep learning capability of your software?

Herbert ten Have: 
Correct.

Philip English: 
How big is your team then, size-wise?

Herbert ten Have: 
24 currently.

Philip English: 
I see. So you've come from a university background, so you've got quite an academic source of inspiration there, which is really interesting. So what's the main problem that Fizyr is solving?

Herbert ten Have: 
Okay, good question. Take parcel fulfilment as an example. Every time you drop a parcel, the robot will see a different envelope or a different bag; it will always be different. So there's no way to program a robot for how a parcel will look, because it will always be different. The only way to recognise and classify it is to teach the system to generalise, like we humans can: a one-year-old can see that this is a box and can grasp it. So what we've done is train a neural network on millions of images, supervised learning. We teach it: this is an envelope, this is a box, et cetera. At some point it became good at doing the same thing as humans, understanding that this is a box, a bottle or a cylinder, and at picking it from a bulk of unknown parcels.
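
Fizyr's real classifier is a deep neural network trained on millions of labelled images, far beyond a snippet, but the supervised "label examples, then generalise" idea Herbert describes can be shown with a toy nearest-centroid classifier. The two features used here, height in centimetres and a rigidity score, are invented purely for illustration:

```python
# Learn one feature centroid per class from labelled examples, then
# classify an unseen parcel by whichever centroid is closest.

def train(examples):
    """examples: list of (label, feature_vector) pairs."""
    centroids = {}
    for label, feats in examples:
        sums, n = centroids.get(label, ([0.0] * len(feats), 0))
        centroids[label] = ([s + f for s, f in zip(sums, feats)], n + 1)
    return {lbl: [s / n for s in sums] for lbl, (sums, n) in centroids.items()}

def classify(centroids, feats):
    def dist2(c):
        return sum((a - b) ** 2 for a, b in zip(c, feats))
    return min(centroids, key=lambda lbl: dist2(centroids[lbl]))

training = [
    ("envelope", [0.5, 0.1]), ("envelope", [0.8, 0.2]),
    ("box",      [20.0, 0.9]), ("box",      [15.0, 0.8]),
]
model = train(training)
print(classify(model, [18.0, 0.85]))  # → box
print(classify(model, [0.6, 0.15]))   # → envelope
```

The unseen parcels match no training example exactly, yet get the right label, which is the generalisation Herbert is pointing at; a neural network does the same thing over raw pixels instead of hand-picked features.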

Philip English: 
Right, I see. So the main problem is that with a normal vision system it's hard to distinguish where one item ends and another starts, and your software has the deep learning tools that actually make that distinction. And I suppose the problem for customers is that if they can't find where the product is, they can't get a robot to pick it up, and then they can't move to an automated process to speed things up. Correct? Okay, that makes sense. So regarding the actual camera system: is it one type of camera, or can you use any camera and put your software on top?

Herbert ten Have: 
Yeah, we are hardware agnostic, both on the sensor and camera side, as well as on the robot and the end effector, the gripper. But having said that, we mostly use RGB depth data. RGB is what we see right now, and the depth image is to do triangulation, because we need to know the distance. For that you need a depth camera, which is a stereo camera with structured light in the middle, like a flashlight with different ways of structuring it. Based on that, the camera will create a point cloud, from which you can derive the distance and see how an item is positioned. So for instance, take this box: we will see a maximum of three sides of each parcel, and we will find those places where the robot can grasp it to singulate the item.
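The point-cloud step Herbert describes, turning a depth pixel into a 3D position via triangulation, can be sketched with a simple pinhole camera model. This is an illustration only, not Fizyr’s code; the intrinsics (fx, fy, cx, cy) are invented example values.

```python
# Illustrative sketch: back-projecting one depth pixel into a 3D point in
# camera coordinates with a pinhole model. The intrinsics are made-up
# example numbers, not a real calibration.

def pixel_to_point(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with a metric depth into (x, y, z)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# A pixel at the image centre lies on the optical axis:
print(pixel_to_point(u=320, v=240, depth_m=1.5,
                     fx=600.0, fy=600.0, cx=320.0, cy=240.0))  # (0.0, 0.0, 1.5)
```

Doing this for every pixel of the depth image is what yields the point cloud the grasp poses are computed from.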

Philip English: 
Right, I see. So that’s the mechanics of how it works. And could you, if you had two or three or four cameras, does that add on and increase what the system can see, or is it just one camera and that’s all you need?

Herbert ten Have: 
As always, it depends. The key thing is the camera should see it. If the camera cannot see it, then the neural network cannot see it either. So a good exercise: normally when we want to pick something, we look into the bin and we can find what we want, but we are flexible with our eyes and we can move. So assume you have the camera above; then you just look at the screen. Can you see it on the screen? Then we can train a neural network to pick it as well. We’ve had many applications where the neural network was much better than human beings: for us it was too hard, but the neural network was faster, more accurate, and more robust.

Philip English: 
And would it make sense, just as an example, if the camera only had one position, would it help to have a system where the camera was moving to some degree, to help pick things up?

Herbert ten Have: 
Not necessarily. In most cases we pick, let’s say, from a pallet or from a bin or from a conveyor, and as long as it’s visible to the camera, the robot can move there as well, because what the camera can see, the robot can move to in the same direction. There are no other objects in between.

Philip English: 
Perfect. And then, to give us some examples, where have you installed this type of technology?

Herbert ten Have: 
Okay. One of the nice things, and I’ll show you an image, is picking the corners of towels, and that’s really hard. Towels are in bulk, white on white, for instance, and you have soft corners when a towel is bent, but you also have the hard corner of a towel. What the algorithm does is find the corners of the towel, pick it, and feed it into a folding machine, and then the towel is folded. That’s something that has been in operation for a few years. So we started doing that, and now we are mostly in logistics. In e-commerce it’s item picking: picking unknown items which you order online, as well as, once the item is packaged into a bag or a box, picking the parcels from bulk for, let’s say, DHL, UPS, FedEx, et cetera. And we also do truck unloading and depalletizing, which is mostly also boxes.

Philip English: 
And for the truck unloading, then: the idea is obviously the back of the truck opens up. Then, a bit of a challenge question: where would you put the camera? Does the camera drop down into the back of the truck so it sees in, and then pop up again?

Herbert ten Have: 
First of all, we don’t build the machine; we are only the brains that translate the image. What they do is put a camera on the device that goes into the truck. So there is a robot picking each of the items and putting it on a retractable conveyor. The items, the boxes or bags, are singulated from bulk, from unknown — I will show you the image as well — and then put on a conveyor and handled in the warehouse.

Philip English: 
That makes sense, yeah. Because I understand in big logistics houses you would have a device that goes into the actual lorry with a conveyor belt on it, and then obviously the items can be picked. In regards to the towel folding, I’ve been looking at that FoldiMate robot, I’m not sure if you’ve seen it. It’s a house appliance; I think it’s on Kickstarter at the moment.

Herbert ten Have: 
Yeah. In our case it’s really a professional robot. It’s been there for a while, and it’s really for the professional laundry industry: hotels and conference centres, places with a lot of laundry, where they have three shifts per day dealing with laundry.

Philip English: 
Yeah, I’ve been into some of those sites myself, and it’s a 24-hour operation: washing and washing and washing. So yeah.

Herbert ten Have: 
Humid and warm. At some point we realised we should not do this as humans. Yes. We should just sleep at night and have robots doing the work for us.

Philip English: 
Yeah, no, definitely. In regards to robustness, because I was saying that’s one of your key features, I suppose the first question is: can you use it outside? I know you wouldn’t normally have a robot set up outside, but can it handle cold and snow and wet?

Herbert ten Have: 
Yeah, for the software it doesn’t matter, of course. It’s all about whether you have a camera that’s IP-rated waterproof and all that stuff. I would say 95% of our applications are indoor, but some of them, like truck unloading, could be outdoors as well.

Philip English: 
And then, I was reading up — the scanning can scan, is it a hundred items a—

Herbert ten Have: 
Second?

Philip English: 
Yeah, yeah. So if the robot was fast enough, then you could really—

Herbert ten Have: 
Yeah. There is no robot on earth that fast. The neural network is extremely fast: we use a GPU, like an Nvidia card, to do that processing. So it’s extremely fast at providing all the information, including the grasp poses of the parcels.

Philip English: 
Yeah, because I was reading that and thinking, that’s really fast — you would need a lot of robots all attacking it. So I suppose, as you’re saying, the bigger picture is really those dull, dirty and dangerous jobs: we can have robots with the Fizyr system that pick the items and do the job for us. What’s your future plan? I did see from your website that you guys very successfully bootstrapped up to 2020, and I think you recently got some investment. So is seed expansion on your mind?

Herbert ten Have: 
Yeah, we’re quite unique, I think, in Europe, where bootstrapping first and then getting investment later is more common. We validated our product with our clients. We have clients like Toyota, for four years already, so we really go into long-term relationships: we build something, take it into production, and then we build the next one. The Americans would go faster, et cetera, but we like to get everything in order and then go to the next one. That’s how we built up. Now we have the product ready and we can scale more easily. We are in logistics, which is, like I said, fulfilment and parcel handling, and we do something in airports as well, but that hasn’t been disclosed yet. So it’s always logistics, and in nine out of ten cases it’s either a box or a bag.

Philip English: 
Yeah, that’s right. I’ve been in a lot of airports as well, and I’ve seen them deploy some robotic systems in there. So I suppose that’d be a perfect target for you, really. Just making sure, you know, with different sizes of luggage and bags, because that’s key: if you go on a holiday, you want to make sure that your luggage is there.

Herbert ten Have: 
Yeah, but like I said, we only deal with the computer vision part. There are two more elements to it. Secondly, do you have an end effector, a gripper, that can cope with the variation? If I have to pick up a pen with suction, I need a very small suction cup, whereas if I have a bigger box, which would be heavy, I need multiple suction cups to grasp it. So that’s the second challenge: do you have a gripper, an end effector, that can cope with the variation that’s to be expected? And the third one is all about the integration. How fast can you accelerate or decelerate without throwing the item away? How well do you know it’s attached? Is it safe for the environment, for people? Do you have a cobot or an industrial robot? And integration with the warehouse system. So there are a lot of things around it. There are three phases, and we tackle the first phase: can you see it, do you have perception, do you know where to grasp?

Philip English: 
I see. Yeah, that makes sense. I did see your gripper as well, and I think you made that open source, so anyone can build their own. Is that the idea, or is it your usual technology?

Herbert ten Have: 
Yeah, we do a lot of open source. If you go on GitHub and look for Fizyr, you will see RetinaNet and all the stuff that we’ve made open source. We have a lot of followers; we’re very proud of that. And it also brings in new developers: we get a lot of developers through the open source community because they know us. The gripper is something we give away the designs for as well, because we only do software. We don’t do hardware, we don’t want to; we just want to stay digital. And it’s a really nice market, it’s so big, and there are so many challenges still to go. It’s not as easy as it looks, because in warehousing, if you go to a sorting centre, it looks like a warzone. You’ll see everything: car tyres, all kinds of stuff being shipped. So it’s not easy, and they’re working hard, and there is a lot to be done still.

Philip English: 
Yes, that’s really true. I’ve been into a lot of those sites as well, and I can definitely see that you need a good vision system to make sure that you pick up the right items. Just going back to the gripper, though: since that’s open source, I was going to ask a payload question, but I suppose it depends on how they would build the gripper?

Herbert ten Have: 
Well, the payload is very simple. You have a vacuum and you have a surface, so you can just calculate the maximum force you can apply. In order to lift something, with a certain amount of vacuum and surface you can calculate it, and if it’s well attached, then you can grasp it. But let’s say it’s something like this, like a towel: air will go through it, right, so you need to take that into account and apply more flow. Like a vacuum cleaner, you can still pick it up as long as it’s not going to be sucked in — you need the filter. Our neural network knows what it is, can classify it, and knows that in some cases you have to apply more air, more flow, in order to grasp it. And you can also measure how much air goes through, how well it’s attached, in order to know how fast you can move without throwing it away.
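The payload arithmetic Herbert outlines — holding force equals vacuum pressure times cup area, derated for safety — can be sketched like this. The numbers and the safety factor are illustrative assumptions, not Fizyr figures.

```python
# Rough sketch of suction-cup payload: F = P * A per cup, divided by a
# safety factor and g to get a conservative mass limit. Values are examples.
import math

def max_payload_kg(vacuum_kpa, cup_diameter_mm, n_cups=1,
                   safety_factor=2.0, g=9.81):
    area_m2 = math.pi * (cup_diameter_mm / 1000.0 / 2.0) ** 2
    force_n = vacuum_kpa * 1000.0 * area_m2 * n_cups  # kPa -> Pa, F = P * A
    return force_n / (safety_factor * g)

# One 40 mm cup at 60 kPa of vacuum lifts roughly 3.8 kg with a 2x margin:
print(round(max_payload_kg(60, 40), 2))
```

For porous items like towels, where air leaks through, the effective pressure differential drops, which is why more airflow is needed and why measuring flow tells you how well the item is attached.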

Philip English: 
Right. And I’ve seen recently there’s a lot more of that soft robotics, with all sorts of arms and grippers, that makes it even more useful for those types of operations. So yeah.

Herbert ten Have: 
The key thing is the combination between what the robot sees and the hand: the eye–hand coordination. We humans have flexible hands, we can do a lot, and the same applies to a robot. You can have a smart gripper with multiple suction cups, so if it’s this item we only use one suction cup; we can apply different suction cups, different sizes and shapes, based on the material we’re going to grasp. And what we also do is stacking. It’s like playing Tetris: we pick something unknown, we look at what the dimensions are, we look into the place where we want to place it, like a pallet or a bag — like grocery micro-fulfilment — and then we place the item in that environment.
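The “Tetris” placement idea can be illustrated with a much-simplified first-fit heuristic: measure the item, then put it in the first pallet row with room left. This is a toy sketch, not Fizyr’s placement algorithm.

```python
# Toy first-fit placement: each pallet row tracks how much width is used;
# an item goes into the first row where it still fits.

def place(item_width, rows, row_width):
    """Return (row_index, offset) for the item, or None if nothing fits.
    `rows` holds the used width of each pallet row and is updated in place."""
    for i, used in enumerate(rows):
        if used + item_width <= row_width:
            rows[i] = used + item_width
            return (i, used)
    return None

rows = [0.0, 0.0]             # two empty 0.8 m pallet rows
print(place(0.5, rows, 0.8))  # (0, 0.0)
print(place(0.4, rows, 0.8))  # (1, 0.0) — no longer fits in row 0
print(place(0.3, rows, 0.8))  # (0, 0.5)
```

A real system works in 3D with stability and crush constraints, but the core loop — measure, search for a feasible slot, commit — is the same.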

Philip English: 
Right, so you’ve got the ability to do that as well. I was speaking to a friend about you guys; he’s got a project to do with waste recycling. As you can imagine: massive plant, lots of all sorts of rubbish coming along a conveyor belt. I know he’s looking into a technique to do it, and I think they’re actually saying, look, we only need it to work 30% of the time, and then we can work on it. Have you guys had a chance to have a play with that industry?

Herbert ten Have: 
Yes, we looked at that years ago, and then we decided we want to focus on logistics, because logistics is like a blue ocean: it’s so big. And we claim, and still think, we are the best in the world, although we are small, and we want to stay the best. So you need to focus, focus, focus, and just stick with that one and stay the best. You can do a lot of stuff, and it’s really interesting, it’s nice to do, but at the end you need to stay the best and just focus.

Philip English: 
Focus on the main area, yeah. Okay, no, that’s great. So what’s the latest news? What’s the next thing you guys are looking into?

Herbert ten Have: 
It’s helping our robotic integrators worldwide. Our software is now in production in North America, in Europe of course, and in Asia: China, and soon Japan. But what we see a lot in fulfilment is micro-fulfilment centres. One of our clients, for instance, is Fabric; they have micro-fulfilment centres to bring the groceries really close to people’s homes, and they are in cities. That’s really robotized — they call it a lights-out factory, where everything is done with robots. I would say we’re still ahead of that; it will still take maybe one or two years, but that’s definitely what the industry is going for: lights-out factories where a truck just comes in, it’s unloaded, and the robots take over the rest.

Philip English: 
Right. And your software, either lights on or lights off, can still do the same job?

Herbert ten Have: 
We need some lights, but lights-out means, let’s say, no people, or just remote. But again, we only do a small part. Like self-driving cars work out where to drive, we tell the robots where to pick. That’s the key thing we do.

Philip English: 
Yeah. And I suppose, in a lights-out factory, for certain items that need light it would flash on a light, so it can do its job most efficiently.

Herbert ten Have: 
Yeah, in our case we always need light; we need RGB, so we need light to see. But lights-out as a term means they can have a factory without humans around; that’s why they call it a lights-out factory. In sorting centres, everybody buys a lot online and the number of parcels only increases every year, so that’s a big challenge in the industry to automate, because there’s a shortage of humans doing this work. And we don’t like to do it in the middle of the night or at weekends, working on depalletizing or picking parcels. So we should have robots doing that.

Philip English: 
Yes, I totally agree. Have you had any work in hospitals or medical care? There seems to be a lot of talk about robots coming into the hospital world. I suppose you could use your software to organise certain items around the hospital. Have you guys had any traction with that?

Herbert ten Have: 
Focus, focus, focus: logistics, logistics. The only thing that we still do is picking towels, and sometimes that’s in a hospital as well, but we really want to focus. We do pick medicine, by the way, in fulfilment: picking small boxes of medicine, blisters and such. That’s part of it.

Philip English: 
Right, fantastic. Okay, that’s great. I think we’ve got a great overview of Fizyr. What’s the best way to get hold of you, then?

Herbert ten Have: 
Well, you can follow us on LinkedIn; we post frequently, let’s say a few times a week. Of course, follow our open source if you’re into development. And I’m online, so you can reach me; if I can help you, you’re more than welcome.

Philip English: 
Sure, appreciate that. And I suppose that could be a mixture of end users, integrators, and anyone who needs the vision to basically move items around. So there’s a big industry there.

Herbert ten Have: 
Sure. But I meant also personally, if a student has a question or whatever. We have 11 nationalities, we have a lot of people from abroad, and we’re able to hire people from abroad. So we’re always open to new, brilliant talent joining us.

Philip English: 
Thank you, Herbert — no, that’s great. Many thanks for your time today, it’s very much appreciated. Thank you.

Herbert ten Have: 
My pleasure. Take care.

Robot Optimised Podcast #1 – Herbert Ten Have of Fizyr

Fizyr: https://fizyr.com/ 

Philip English: https://philipenglish.com/

Sponsor: Robot Center : http://www.robotcenter.co.uk

Youtube:- https://www.youtube.com/watch?v=QxPZKVDz65c

Hand on with Inovo Robotics

Hi guys, Philip English here from philipenglish.com. Today, we will have a hands-on approach with Inovo Robotics and their robotic arm.

Philip:
So hi guys. My name is Philip English, and I am a robotics enthusiast, reporting on the latest business and application of robotics and automation. My mission is to get you Robot Optimised and to support industry infrastructure and innovation for the next era. Today, we’re here with Henry Wood. We did an interview with Henry, I think about three or four months ago, and now we’re on site today to actually see the robot live. So we have it behind us, and we’re just going to have a look at how this robot works. So I’ll get Henry to introduce himself. Can you introduce yourself, Henry, please?

Henry:
Hello again. Hi, my name is Henry Wood and I’m one of the founders of Inovo. We put together Inovo to develop robots that were going to be much more accessible for companies building products in batches. Today Phil’s come down to visit us and I’m going to show him how we program the robot and what kind of things it can do. And really looking forward to giving an overview of the product.

Philip:
Perfect. Thanks for the intro, Henry. So the next step, guys, is we’re just going to have a look at the robot and do a quick, light tutorial for you. A bit like an unboxing, but more of a hands-on feel. So let’s get into it.

Philip:
We’re going to run through things for you guys, and obviously Henry is going to give you an overview, and I’m basically just going to be here and ask some questions. So I’ll hand it to Henry to give us a brief start-up of the robot.

Henry:
Sure. Okay, so this is our main product. This is the Modular System. And the really big difference between this robot and other robots is that you can reconfigure it to change the payload and reach. So with most robots, you’ve got to choose a product from the range. They’ve got a five kilo version or a 10 kilo version with 1.3 meters or a 900 mil. And the first thing you’re faced with is this catalog where you’ve got to choose which robot’s the right product for your application. And that’s very difficult for a lot of companies because they’re changing applications all the time. So what we offer is a system where you can change the reach and payload of the robot, depending on what you want to do with it. So what we’ve got here is the arm set up, it’s built out of the three core parts, the wrist, the elbow, and the shoulder.

Henry:
But the link tubes between it are interchangeable to change the reach of the robot. Just as a quick overview, we’ve got a pendant here, which allows simple control of the robot. Say, switching off and on the robot and doing some recoveries, if the robot is left in a funny position. But basically limited to the simple tasks that an operator might do. If you want to program the robot, then you have to plug in a laptop and then you can use all of the advanced features with the comfort of a full QWERTY keyboard and a mouse. Which is a bit easier than the touch screen, in my opinion. So we’ve got a third party gripper fitted here at the moment, this is a Robotiq 2F-85. A really, really nice product. And what it allows is force and speed and aperture control.

Henry:
So you can program how hard it’s gripping something. It could lift up an egg without crushing it, but at the same time, it could lift up a brick or something where you want to apply a lot of force. And it’s really adaptable. So we’ve got some simple controls on the wrist here, including a zero gravity button. If I press that, then the ring goes blue and I can drag the arm around and move it into different positions. And on the interface that you can see behind me, this is giving us a live feed of what the robot’s doing, and it’s showing us the waypoints we’ve created and the program we’ve built. After that, I’ve got direct controls of the gripper, and I can actually add waypoints to the waypoint list. So I can build the program really quickly by dragging it to a position, recording a waypoint, dragging it to another position, recording a waypoint there.

Philip:
Right. I see, and then when you move the robot to different positions, does it automatically build it in this stack here? Or…

Henry:
Yeah, that’s right. This is basically the workspace. So we’ve got the visualization to see what the robot’s doing and where it’s going. And then this is the program that we’re building. So if we start with a new program and clear that, what you begin with is the start block.

Philip:
Okay.

Henry:
And then we have a toolbox on the left, which is all the different features that are built into the robot. So I can drag out of the toolbox. I could drag a waypoint, which is a position the arm moves to. But as you can see, there are various different bits in there: there are grip functions, there are delays, there’s logic, there are loops, there’s controlling of third party tools. So all these things can basically be pulled out of the toolbox and plugged into a sequence, in a simple and logical order.

Henry:
So if I position the robot at a point, maybe put it there, then I can set it from the robot. What that does is capture the current position of the robot. And I can give it a name, so I could call it point A, or something helpful: this is the door handle, if we’re opening a door, or this is the phone, if you’re picking up the phone. And then I can drag it to a new position and I can duplicate this block, and I can give it another name, and again I can set it from the robot. And you can see in the visualization, here are the two points that I’ve created. And I’ll get rid of the grip block in the middle.

Henry:
And we’ve got two points, and it will move between those two points. If I play that, it will run: it moves to point A, then it moves to point B, and it stops. If you want to apply more logic, we can use a loop. So we can drag a loop out of the toolbox, plug that in, and then put those inside it. And now if we play, it keeps running around that loop.

Philip:
It’s very nice and easy, quick and simple to use, yeah.

Henry:
Yeah. And the really nice thing about this form of programming is you can’t plug something incompatible into a place it doesn’t fit. So it makes it really intuitive. If it’s clicking together, it’s going to run. If you can’t get the bits to connect, then you’re trying to do something that isn’t possible. So, you notice it moves in an arc. You could do a linear move, so you just change it so that it basically moves in a perfectly straight line between those points.

Philip:
Oh I see and so you can have a fast bit in the middle, slow bit. [crosstalk 00:00:06:27].

Henry:
Yeah, you can control the speed of all of these. If I had a third point on there — if I pause it there, so if I drag another point over here, duplicate that one, call that point C. So now we’ve got a path where it goes there, to there, to there. And if we run around that loop now, it’s going in more of a triangular motion. Now that’s still linear, so it’s still doing a perfectly straight line between these points, but it goes to the points and stops. If you’ve got to pick something out of a machine, you want it to do it as smoothly and quickly as possible. So we can set the middle one as having a blend on it. In fact, we can put a blend around the whole lot if you want. And it will basically mean that it smoothly moves past those points rather than coming to a stop, so it makes the whole thing run a bit faster. So, if I run that…

Philip:
Instead of having a hard stop at A, B or C, it just…

Henry:
Exactly. So you can see sort of over the B it didn’t stop, it just looped over. And so you can get the whole process running a lot smoother and faster like that.
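The waypoint, loop and blend program Henry builds on screen can be sketched in code. The API below is invented for illustration and is not Inovo’s actual SDK; it just mirrors the structure of the blocks: named waypoints, an optional blend radius, and a loop around the sequence.

```python
# Toy model of the visual block program: waypoints with a per-point blend,
# executed inside a loop. Invented for illustration — not a real robot SDK.
from dataclasses import dataclass

@dataclass
class Waypoint:
    name: str
    xyz: tuple            # target position in metres
    linear: bool = True   # straight-line move vs joint-space arc
    blend_m: float = 0.0  # 0.0 = stop at the point; >0 = smooth past it

def run_program(points, loops):
    """Return the flat list of moves the robot would execute."""
    return [p.name for _ in range(loops) for p in points]

a = Waypoint("A", (0.4, 0.0, 0.3))
b = Waypoint("B", (0.4, 0.2, 0.3), blend_m=0.05)  # blended: no stop at B
c = Waypoint("C", (0.2, 0.1, 0.3))
print(run_program([a, b, c], loops=2))  # A, B, C repeated twice
```

The blend radius is the key ergonomic detail: a point with `blend_m > 0` is passed through on a curve instead of reached exactly, which is what makes the triangular path run smoothly.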

Philip:
So, when it comes to the end-of-arm gripper, I saw that there was a Robotiq section in there. So is it similar, how you program that? You can program the gripper in the same way as well?

Henry:
Yeah, out of the toolbox, we can grab a grip block.

Philip:
And this can be any gripper? [crosstalk 00:07:55].

Henry:
You have to tell it what gripper it is. And what we’re doing, is we’re building libraries for more and more tools. So we’ve been building libraries for Robotiq, OnRobot. We’ve got interfaces that let you drive Schunk. And so, if someone presents us a new gripper we haven’t seen before. The first thing we’ll probably do is hand it over to our engineers and get them to add that library to the driver, so that everyone can use it. Because we want to support as many different tools as possible, really.

Philip:
Perfect, sounds great.

Henry:
So, we’ve also got direct controls on the UI there. On the far side, on the left, there’s a control where we can basically drag the gripper open and shut. And that lets us control it quite closely: we can set the effort, how gentle it is. So if we grip something like that, it’s gripping it, but not that hard — I can pull it out. If we open it again and then set the force higher, it’s got it really solidly. So you get really nice, fine control with these; they’re very versatile tools.

Henry:
And one of the things you might notice here is that it’s actually plugging straight into the wrist. It’s not a cable on the outside, and this is one of the limitations with some robots is that you can attach third party tools. But you’ve got to attach them to the control box. One of the reasons we were really keen to have it attached directly to the wrist, is because these joints can just rotate freely in any direction without worrying about a cable getting stretched or snagging. And you could even drive a screw with this and go around 10 times and it will just continually rotate. So there’s a lot of flexibility that comes out of that.

Philip:
So, how does that compare to other sorts, like cobots then? They normally have something you have to plug in and then a wire that comes…

Henry:
They’re starting to offer some support, but it’s quite limited actually. A lot of cobots, you get an interface that plugs into the control box under the table. And then you get a long cable and you’ve got to run five meters of cable around the outside and then you’re cable tying around the joints. And it has just got a lot more chance of actually getting snagged or pulling out, which is a real restriction.

Philip:
Yeah. Sounds very messy if you have that.

Henry:
It can be quite messy, yeah.

Philip:
It’d only be a matter of time before it gets to somewhere it shouldn’t be and then ends up breaking something, so yeah.

Henry:
Absolutely, so you can control the gripper live like that. There’s obviously a block to control the gripper mid-sequence. So if we wanted to put a grip in the middle of there, we could say this one’s an open position, and then we could have a short pause and close it again. And then that would basically be running around the loop: it will open at point B and then it will close at the third point after that. So you can see, you can build up increasingly complex programs in this visual language, really easily, without having to use a text scripting language or anything.

Philip:
Well, this is it, it’s very, very easy to use, really. Very simple as you just shown me. So yeah, I’m very impressed with that.

Henry:
And then, balanced with the complexity, you still want to have fine control, if you do need to do something very precise. So if, for example, we’ve got to pick this block up, you can drag it down there and get it roughly in the right position very quickly. But then you might want… I’ll do it this way so the camera can see. Then you might want to get very fine control at the end. So I can stop using the zero gravity mode and I can use the controls on the screen. And then you can do very accurate…

Philip:
Minute.

Henry:
Maybe we can set it to only moves one millimeter at a time. Or you can set it to only move in one axis at a time. So you could move very carefully, even if you’re around delicate things that could be broken.

Philip:
And then if it was something delicate, how do you test? Or is it just a bit of trial and error? [crosstalk 00:11:40]

Henry:
It’s a good question. If you’re figuring out exactly how hard to grip an egg, it could be a bit of trial and error. The point where something will break isn’t something that we can really monitor, but you just gradually increase it. You could start off setting the grip force so that it’s got it. And then you try and pull it out the grip. And if it’s not enough, increase it a bit more. So these motions will let you do Cartesian motions. So down and up in the Z plane, X and Y. And you can set steps so that it will do small movements, or you can do faster movements.

Philip:
No, that’s great. So, we’ve got the controller here, can you just take me through how that works?

Henry:
Sure. Yeah, so this is the pendant. This remains fixed to the robot. So if you’ve got a shop floor environment in a factory, your engineer will probably come down with a laptop and they’ll have their laptop on their knee or the bench while they set up the robot. They will create the program where they can see it with all of these nice features that you get on a full sized screen. So when they leave, they can unplug the laptop and the robot can continue working. This gives a nice, simple interface for the operator to basically set up and run the program. So they don’t build the program on here.

Philip:
So, obviously if you’re in a manufacturing site, you just want to get the robot quickly moving. So you just come to this and use a very quick, fast interface. And I suppose if you’re doing a different task as well, you can do one task, hit go, move to the next task and hit go. And you haven’t got to go and program the whole robot again.

Henry:
Exactly. The idea is that when you come in here, you load program A, and it can do program A for the next few days. And then that finishes and someone’s got another program that the robot’s set up to do. They can just select it from a list here. But what they can’t do is edit the program, which is useful because it means that once someone’s built a program here, it can’t be messed with. It’s locked if that’s the only interface you’re using.

Philip:
Right, perfect. So if I… We have a joystick here?

Henry:
That’s right, yeah. The joystick’s quite a nice feature we’ve got there. The joystick itself would be prone to accidental movement if you just knocked it, so you’ve got these buttons on the side, which are called dead man’s handles. When you squeeze those, it becomes live. So now you’re able to control it. At the moment you’re in the mode where you’re basically rotating the wrist around the point, and it’ll only go so far before it reaches a limit. You can jump between other modes. So if we press this button here, then you’re in Cartesian mode now. So now it’s moving forwards and backwards, left and right. And then doing that, we’ve still got up and down. So that’s basically allowing the robot to travel in this axis.

Philip:
So you have full play of the different axes there.

Henry:
Exactly. And that’s in the base frame, so that means you’ve got X, Y, and Z like that. Now I’ve put it in the tool frame, so you’re moving along an axis which is along here. If you’re trying to insert something, it gives you a nice simple way of doing that. That’s quite hard to do with zero gravity, because you’re trying to drag it in a perfectly straight line, but in this mode with the joystick, you can do that. And actually, if you move the joystick just a tiny bit, you can move it really quite slowly. So you get very fine control if you’re trying to do something quite precise.
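The base-frame versus tool-frame distinction Henry demonstrates comes down to which coordinate frame the jog command is expressed in: a tool-frame jog has to be rotated by the tool’s current orientation before it becomes a base-frame motion. A toy sketch with a single rotation axis (this is a generic illustration, not Inovo’s implementation):

```python
import math

def rotate_about_y(vec, angle_rad):
    """Rotate a 3D vector about the base Y axis (toy tool orientation)."""
    x, y, z = vec
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (c * x + s * z, y, -s * x + c * z)

def jog_in_tool_frame(jog, tool_pitch_rad):
    """Convert a tool-frame jog into the base-frame motion executed.

    With the tool pitched 90 degrees (pointing horizontally), a jog
    'forward along the tool' (+Z in the tool frame) becomes a straight
    horizontal move in the base frame - which is why inserting a pin
    is easy in tool mode but hard when dragging the arm by hand.
    """
    return rotate_about_y(jog, tool_pitch_rad)
```

With zero pitch the two frames coincide; with the tool lying horizontal, the same joystick push produces a perfectly straight insertion line instead of a wobble.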

Philip:
I see, yeah. Because I can imagine if you were trying to do it manually, you could be pulling it in one direction and not realize you’re pulling to the right or to the left. Whereas here you can be very, very precise.

Henry:
Absolutely. So if you’re trying to insert a pin in a hole or something like that, then it gives you the ideal way of doing that. So we’ve got various different ways of doing similar things that suit different applications.

Philip:
Yeah. And that’s a very handy feature to have, well I’d say a brilliant feature to have, really. Especially if this is going to get used in multiple different applications and devices, I can really see that being really helpful.

Henry:
Yeah, we’ve had good feedback on that. And so we’ve got the control to do it with the joystick. There’s an onscreen control and the zero gravity. So typically for almost everything, one of those is the ideal. But there’s no one solution for everything, so it’s useful to be able to jump between them quickly.

Philip:
And then I suppose the next question is, obviously the main thing for this cobot is if you’re a small business and you want to do multiple tasks. Some of the tasks may be heavy, some of the tasks may be light. So I’d be very interested in how you would set up the robot for something heavy and for something light. Can you give us an idea? [crosstalk 00:00:15:59].

Henry:
Sure. Well, this is the super power that this robot offers that others haven’t got. What we can do is change the reach. At the moment this is a medium reach. I’m going to put it upright, just because it’s a bit easier to handle like that. So I leave it in this position and go to the screen, where I can basically turn off the arm. So I go: just off. You can hear it click, and it’s off. So now what I need to do is unlock this. No tools, just a simple wheel. And then this section comes off.

Philip:
Very nice, very quick and easy.

Henry:
This weighs about eight kilos or something. So it’s not light, but it’s certainly fine for one person to lift and very easy to do on the fly.

Philip:
A one man job, really. So someone [crosstalk 00:16:48] can come along and do it for you, yeah.

Henry:
And the nice thing is, if you want to move this robot, you can move it in pieces as well. So this is about 10 kilos, that’s about eight kilos, and a couple of kilos there. So all these bits I can move by myself. But if this was all one robot, you’d probably have to get a second person to help with that. So we’ve got a range of different link sections. This is a medium, this is 150 mil. We’ve got a 300 mil, or indeed you can put it together with nothing in between. So if I take this part here, I line up the pins.

Philip:
And this would be the maximum weight that you can get with that extension?

Henry:
Exactly. That’s the trade-off: you’re trading off reach against payload. So having put that one on there, I turn it back on. See the orange here blinking: it indicates that it’s just starting. It takes a few seconds to detect the whole system. And if you look at the user interface here…

Philip:
Oh, there we go, it’s straight in.

Henry:
It’s popped up, and we can rotate that so that it’s in the same orientation as the arm.

Philip:
So you’re not there having to download another program to put a different [crosstalk 00:17:49] Yeah, very quick and easy.

Henry:
All we did was switch off the arm, change the configuration, and switch it back on. And we could even run the same program, it’s that flexible. Now, it’ll give you a warning if you do that and a point is out of reach; I won’t do that now. But once the arm’s enabled, I can use it in exactly the same way. It’s updated all of the dynamic systems, the zero gravity and the kinematics, so it’s basically ready to run. We could try and run this program; the limitation is if it’s out of reach, if you’ve asked it to reach something over here and it’s not long enough anymore. But there you go, so this one’s still valid.

Philip:
It still runs. And then what’s the weight load for this now?

Henry:
So this is now 10 kilos, so this becomes quite heavy duty in that short reach.

Philip:
Right. Yeah, and then the one bit there…

Henry:
With this, it would be about five or six kilos. And with the longer reach, it would still be able to lift three kilos. You’ve obviously used some of that payload on the gripper, which is about 800 grams, but at full reach you could still be lifting two and a bit kilos, which is still fairly substantial. And we see a lot of companies where they’re loading machines, putting blank metal [inaudible 00:18:57] in a CNC machine or lifting plastic moulded parts. They really love that longer reach, and they often don’t need particularly heavy payloads. So it’s an ideal setup for them.

Philip:
So again, you’ve got three different configurations there that they can use the robot for.

Henry:
Exactly. And actually, in the future on the roadmap, we’re bringing out an active link as well. This is a six-degree-of-freedom arm, which is still fairly flexible. Some robots have a seventh degree of freedom, and that gives them an extra layer of redundancy: while you reach inside something, the elbow can still move. So we’re bringing out an active link section which you can plug in here, and that gives an extra degree of freedom. So you can go from six DOF to seven DOF, which is a really nice feature to have if you need it. But if you don’t need it, you don’t have to pay for it. Which is our general ethos: you only buy what you need with this robot.

Philip:
So where would you use that seventh degree, can you give an example?

Henry:
Yeah, it’s really for when the robot’s got to do quite complicated motion. If it’s got to reach inside something, or around the back of something, that can be useful. We see a seventh DOF being useful in probably a quarter of all cases. It’s the minority, but it is useful in some spaces. And where we’re finding these longer reaches very useful is things like inspection as well, actually. We’re working on a number of projects where the robot’s got a camera mounted on the end, and it’s reaching around and basically inspecting something from a lot of different angles. Cameras are generally a kilo or less, so it’s an ideal light-payload, long-reach application.

Philip:
So some of the case studies that you’ve done already, can you talk us through some of those?

Henry:
Sure, yeah, happy to talk about those. So we’ve got one of these in a restaurant, a food-preparation space, where the robot’s loading raw ingredients into pans and woks. We’ve got another project where it’s in a vertical farm, loading seedlings into pots; they then go onto a wrapped conveyor system, where the seedlings go under artificial growing lights. And we’re looking at a couple of projects at the moment where there’s a camera mounted on the arm for all kinds of different applications: in dentistry, in part inspection, in metrology, and in a really wide range of other applications like that.

Henry:
And then there are the classic applications that people are used to seeing with collaborative robots, like Universal Robots: this is an ideal robot for machine tending as well, because you can set it up for the long reach. It represents very good value if you don’t need those heavy payloads. And it’s also really versatile if people need to move it to a different application where they might be doing assembly. So one week it’s reaching inside a machine, using the full reach; the next week it’s on a bench doing some simple assembly task, where you don’t want that long reach because it’s just a bulkier system, and you can get better precision and a higher payload in the short reach.

Philip:
Right, so hi guys. Henry and I have quickly swapped places because this time I’m going to give it a go, just to show how easy it is to get the robot working, and Henry’s going to give me a hand. So as I understand it, all I need to do is hit one button at the top, it turns the LEDs blue, and then I’ve got the ability to move the robot by the handle. So I’m going to go for this blue block here, place it down below, and then I can just hit this button on the right, and it should give me one position.

Henry:
That’s right, yeah. So it’s added your new waypoint to the bottom of that list. So if you want to add another one that’s directly above where you are, then it will lift out nice and square. So if you go up again and then… That’s good, and then you can just press the button again to add another waypoint there. It should pop onto the bottom of that list, and then you can bring it over this way.

Philip:
So, let’s bring it over here. So we’re going to try and put him on top of this one here, just put it here…

Henry:
Yeah, that’s nice.

Philip:
And then we’ll click it again.

Henry:
One more down there. And then finally you can try and rest it on top of that.

Philip:
Probably about there. I think I got that right, and then one more click.

Henry:
Yeah. So you’ve now created the path for it, and you’ve done all that without having to use the interface. Now all we do is drop a grip position on this. So your first point, you were starting above it; you’ve moved down over it; at this point you’re closing the gripper; and then you’re picking it back up again and leaning over. And if you wanted to drag the 3D view, you’d be able to see that the path is drawn through each set point. [crosstalk 00:23:35].

Philip:
Oh I see, so it sits here as well, perfect.

Henry:
If we were taking a bit more time, what we’d do is label each of them. So we’d call this “above”, this “over”, this “away”, and this one “place”, so that it’s really easy to read.

Philip:
Right, makes sense.

Henry:
But yeah, so now you can go here, you should be able to just press play in the middle of that.

Philip:
Play button here.

Henry:
And it’ll go to the first point. And this is showing you which one is which, so you can see that one’s highlighted at the moment. It’s going towards the next one, sorry. Then the gripper should close; it’s gone up, onto this one. The next one’s going to be the move across.

Philip:
And you can see the waypoints here as well.

Henry:
Yeah.

Philip:
That’s brilliant. [crosstalk 00:24:12].

Henry:
So now you’ve run it once and it worked. What we could do is turn up the speed a bit and gradually tweak it. So if there are any bits where it’s not perfectly aligned, you can use the on-screen controls to adjust the position and get it to the point where it’s running really smoothly. And then you can run it faster and optimize the whole process, so you’ve got it running exactly the way you want.

Philip:
Right, perfect. Yeah, you get that optimization to make sure it’s working perfectly.

Henry:
Exactly.

Philip:
Right, no, that’s great. Yeah, thanks very much for the quick test there. I could see how it’s very easy to move and get going. So thank you for that.

Henry:
It’s a pleasure, it was really nice to have you down here.

Philip:
Yeah, no, that’s fine. Well, thanks for the overview. And as I said, it’s a great product. It’s got many different applications in many different industries, really. So yeah, I’m really looking forward to seeing how this robot grows within the industry. So thank you, Henry.

Henry:
It’s a pleasure, thank you for coming.

Inovo Robotics Interview – Hands on Inovo Robotics: https://inovorobotics.com/ Philip English: https://philipenglish.com/ Sponsor: Robot Center : http://www.robotcenter.co.uk

Tech London Advocates Robotics- Software robotics

Tech London Robot Center Philip English

Hi, guys! Philip English from philipenglish.com, and today we have an interview with different tech companies, including the Tech London Advocates team – a network of tech leaders, experts and investors uniting to form the most influential group in tech; Fizyr – leader in software for automated picking; Alias Robotics – a robot cybersecurity company; SLAMCORE – world leader in spatial AI and algorithms that allow robots to understand space; HAL Robotics – London-based robot control specialists focusing on novel applications of robotics in the creative and construction industries; Extend Robotics – human-focused creators of robotic arms capable of remote operation anywhere in the world; IRAI – innovative software developers for the fields of industry and education; IncubedIT – leader in software for autonomous mobile robots; and ARS Recruitment – an agency working to match great candidates with awesome companies involved in automation. Find out more as they introduce their companies and answer questions about their work and industry.

Discussion:

Philip English: (25:50)
Okay.

Thomas Andersson: (25:58)
Okay. So let’s kick off then. I think we’ll get a few more attendees as we go along, and latecomers can also catch up via the recording. This is a recorded webinar; we’ll put it on YouTube after this session, so you can also watch it afterwards. So hello and welcome to the second TLA webinar. My name is Thomas Andersson. I’m the co-founder of TLA Robotics and one of the six group leaders. I’ll be very brief so we can get on to the exciting part, which is the company presentations. So a little bit about Tech London Advocates, TLA for short. It’s a voluntary organization that was originally set up to promote the tech ecosystem in London, but the group has now expanded: it’s got several different working groups and chapters across the world as well. So it’s a really big group, with about 10,000 members in the UK alone. If you want to join the group, it’s free. I should share my screen as well, I just remembered.

Thomas Andersson: (27:12)
Where is my… sorry, guys, it’s coming up. There we go, yeah.

Thomas Andersson: (27:30)
All right, I’ll go on. I will share that a bit later so you can see the email address as well. The email address is tla.robotics@gmail.com, and joining is free. So the TLA Robotics working group was set up in April 2020 with the purpose of encouraging and promoting the robotics and automation ecosystem, with a focus on the UK and Europe. We have three aims: to encourage investment in robotics and automation; to increase adoption of robotics and automation, especially in the UK and Europe; and thirdly, to improve the gender balance that we see across the robotics and automation sectors. So when it comes to investment, in my work I deal a lot with investors, and we see that most of the investment in robotics is actually going to the US and to Asia right now. Let’s see if I can share my screen here. Oh, there we go.

Thomas Andersson: (28:42)
So in one recent project we found that 60% of the companies were based in Europe and 14% of the companies were based in the US, but the US companies attracted 80% of all the investment. I should also add that our medium to long term aim is to organize physical events, but that’s obviously been hampered by COVID. So what you can see on the screen here is the team that makes up TLA Robotics. I should note that we can take questions during the presentations; our team members will be able to answer them, and if a question has a particular focus on one of the companies, we can field it to that company. With that, I want to say a big thanks to Philip at Robot Center, who is sponsoring the Zoom hosting for this event; without that, we wouldn’t really be able to run these volunteer sessions. So with that, I’ll stop sharing. Sorry, I had some problems with that. Over to you, Philip.

Philip English: (30:08)
Cool. Thanks, Thomas, thank you very much. I’m always a big fan of these events. I’m just going to share my screen as well, and let me just go to presenter mode.

Philip English: (30:29)
Okay. Right, can everyone see my screen? Perfect. Let me just move this as well, because that’s in my way. Brilliant. Right, yeah. So my name’s Philip English. I am the Chief Operations Officer at Robot Center, a collaborative robotics integrator. I’ve personally been fascinated with robotics ever since I saw my first movie as a young child at the cinema, which was Short Circuit, if some of you can remember that one: Johnny Five. Since then I’ve always had a passion for robots, and as we get closer to a real-life Johnny Five, a big part of that will be to do with software. And this is why I’ve put together the five software trends. With software being famed for being fast-paced and innovative, the future of robotic software is starting to unfold. So these are the five trends that we’ve put together for you guys.

Philip English: (31:37)
Okay. One second.

Philip English: (31:50)
Starting with trend one: the cloud. The cloud will only get bigger over time. The use of computing with cloud storage and other technologies that allow for higher levels of human-robot interaction and learning is contributing to the robotic transformation of companies. More powerful robot solutions are growing thanks to cloud technologies: they give the ability to handle heavier computing and processing tasks, which enables more powerful and cognitive collaboration, and greatly increases the data available to share between machines and humans. All in all, it should mean that prices drop for consumers while software functionality is expanded and refined. So that’s trend number one. Trend number two we’ve seen is that Python is still dominant in the robot coding realm; Python is on a roll right at the moment.

Philip English: (32:48)
And according to statistics, it’s grown rapidly to become one of the top languages in robotics, mainly because Python and C++ are the two programming languages found in ROS, partly because it has a huge number of free libraries, so you’re not going to have to reinvent the wheel, and because it’s generally easy to use and saves programming time. And with more and more robot-friendly electronics now supporting Python out of the box, such as the Raspberry Pi, we’re going to continue to see a lot more Python software in robotics from the new generation of developers around the globe.
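As a flavor of why Python suits quick robotics prototyping, here is a tiny smoothing filter of the sort often wrapped around noisy sensor readings. It uses only the standard library and is a generic illustration, not tied to ROS or any particular robot:

```python
from collections import deque

def moving_average(readings, window=3):
    """Smooth a stream of sensor readings with a sliding window."""
    buf = deque(maxlen=window)   # old readings fall off automatically
    out = []
    for r in readings:
        buf.append(r)
        out.append(sum(buf) / len(buf))
    return out

# Noisy distance readings (metres), smoothed on the fly.
print(moving_average([1.0, 1.0, 4.0, 1.0], window=2))
```

A handful of lines, no boilerplate, and a ready-made data structure from the standard library: that conciseness is much of the appeal described above.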

Philip English: (33:28)
Trend three is that artificial intelligence will only get better over the next few years, with artificial intelligence and robotics proving to be a powerful combination for automating tasks and processes. AI and robotics give flexibility in steps and actions, and give robots learning capabilities in various applications. The artificial intelligence and robotics market is expected to register a compound annual growth rate of 28% for the forecast period of 2020 to 2025, so that’s an interesting figure. All in all, artificial intelligence in robots offers companies new opportunities to increase productivity, make work safer, and save people valuable time.

Philip English: (34:12)
Trend four is the Internet of Things, which is set to grow and grow, with the Internet of Robotic Things (IoRT) set to be a $20 billion market by 2024. I found a great description of IoRT from ABI Research: a concept where intelligent devices can monitor events, fuse sensor data from a variety of sources, use local and distributed intelligence to determine the best course of action, and then act to control or manipulate objects in the physical world, in some cases while physically moving through that world. So we’ll soon see what the Internet of Robotic Things has in store for 2021.

Philip English: (34:58)
Last one is trend number five: mixed reality might be something to keep an eye on. Industry’s use of augmented reality and virtual reality in conjunction with robotics continues to grow, as manufacturers, militaries and healthcare providers all look to find new uses for this technology. This rate of growth is projected to continue over the coming five years as practical uses of the technology continue to develop and existing users move into further maturity. An interesting fact I found with this one is that 75% of mixed reality usage is due to be on mobile by 2026.

Philip English: (35:40)
So that was just a quick overview of what we’re seeing in the market. If anyone is interested or wants to know more, then please feel free to reach out to me. We are keen to get people onto the first step of our ROI methodology, to find out and research what’s possible. And lastly, the Robot Center team: we all grew up watching sci-fi films, seeing visions of both utopian and dystopian futures, so we want to be architects of a real, positive future that is supported by robotic technology. We align with Global Goal number nine, which is to build resilient infrastructure, promote sustainable industrialization and foster innovation, and we support charities that are aligned to this. And with that, I will pass it back to the first speaker.

Thomas Andersson: (36:33)
Excellent. So yeah, thanks, Philip, for that. With that, we move quickly on to Alias Robotics and Endika from Spain.

Endika Gil-Uriarte: (36:44)
Hello everyone. Hello, Tom. Thanks very much for the invitation, and in particular to Philip for sponsoring this opportunity. So very quickly, let me try to share my screen, and please do confirm that you can see and hear me properly.

Endika Gil-Uriarte: (37:07)
That’s okay. Can you hear me? Can you see me? That’s fantastic. So there we go. I’m Endika Gil-Uriarte, CEO of Alias Robotics, and Alias Robotics is a robot cybersecurity company. We are focused on this market niche, and we approach it from the robot’s perspective: the cybersecurity of robots and robot components in particular. Alias Robotics was founded upon previous stories in robotics, and we have the experience and the niche expertise that it takes to tackle the cybersecurity of these robotic systems. Now, it became very clear from Philip’s talk before that we live in the era of robots, and this is happening, although we must say that these are very early days in robotics, at the dawn of an industry that will follow up in the coming years and decades, and probably millennia.

Endika Gil-Uriarte: (38:13)
We see robots every day working with us in our homes, performing professional tasks, but more particularly and more intensively we see robots in our industries — an industry that is now trying to position itself towards the Industry 4.0 paradigm, where connectivity is key across all components, so cybersecurity becomes critical. Now, there are some public case studies that we’ve been publishing as Alias Robotics since 2018, showing the threat landscape that exists right now in robotics. We normally say that the very same unlearned lessons of other IT industries at their dawn are happening again: the cybersecurity lessons are not being learned, and the cybersecurity status of robotics needs to improve. But it’s better to see something rather than say it, so that’s why I brought you some videos, with this little bottom line: do not trust robots.

Endika Gil-Uriarte: (39:32)
Let’s see the first one: this is something we have been able to do by remotely manipulating the safety system of a very popular autonomous mobile robot. Alternatively, an attacker could exploit present vulnerabilities, as you can see in the top right, to retrieve the map of a sensitive facility; in this case you can see the headquarters of Alias Robotics in [inaudible], Spain. Alternatively, industrial manipulators are not attack-free entities: an attacker could exploit network attack vectors to ransom them, or hijack the robot with a simple USB, exploiting the vulnerabilities present on this industrial collaborative manipulator. Now, this is why Alias Robotics has built the Robot Immune System (RIS), our endpoint protection platform for robots. This is robot software that acts like a next-generation antivirus, installed directly onto your industrial robots. We support the top-selling industrial robots, and there’s a list that is public on our web, but please do ask us if you want information for your robot. This is how you operate RIS: RIS gets installed onto the teach pendant, or in the case of this Universal Robots industrial robot, and you train it for a certain period of time so it learns the usual pattern of life within the robot. Outside this training phase, you then have a fully intelligent detection system monitoring your robot.

Thomas Andersson: (41:35)
That’s your last slide.

Endika Gil-Uriarte: (41:37)
Okay. So you can find the biological inspiration behind RIS out there, and there’s flexible licensing associated with it: research and development, professional, and certified licenses, according to industrial security standards. We provide some security services as well, so we can help companies throughout the security process. Please do contact us at our locations in Spain or in Boston. And I would like to thank you again: stay safe, stay secure.

Thomas Andersson: (42:14)
Excellent, thanks for that, Endika. Can I just remind all the attendees that taking screenshots is a really good way of capturing the emails and contact details that go onto the screen. With that, we’re moving on to Owen and SLAMCORE from the UK. So over to you, Owen.

Owen Nicholson: (42:39)
All right, thanks, Thomas. Awesome. So let me just share my screen and we’ll jump straight in. Okay, I’m looking for thumbs up from people: is that coming across okay? Can I jump between the slides? That’s the question. Okay, cool. So, hi, my name is Owen, I’m the CEO at SLAMCORE. Now, robots struggle to cope with change: when the lighting starts to vary, when the structure changes, and when there are too many people walking around — essentially, when the world gets a bit too real. Even today, robots struggle to answer the following three questions: where am I, how far away are things, and what are they? These are the three questions of spatial understanding, and the biggest cause of robotic failure is when a robot gets the answer to one of them wrong. Now, this is a problem that nature has already solved.

Owen Nicholson: (43:35)
And there’s a reason why nearly every animal on the planet uses vision as its core sensing modality. If we take a camera, fuse together simple data from low-cost sensors such as gyroscopes and accelerometers, and optimize that software to run on low-end silicon, then we can deliver commercial-grade spatial understanding on hardware that’s actually available today. And this is exactly the approach the AR/VR industry has already taken. Here we see three of probably the most advanced spatially aware consumer and early-stage industrial products on the market today, and they all fuse low-cost gyroscopes and accelerometers with vision to tackle these three questions. But these tech giants have all heavily optimized their algorithms to run only on their own specific hardware. The robotics industry is made up of thousands of niche verticals, all with their own technical and commercial requirements.

Owen Nicholson: (44:34)
We don’t need a single solution that works well on one hardware platform; we need solutions that are flexible and configurable, and that allow us to try different hardware combinations to find the one that works for us. Well, unsurprisingly, that’s exactly what we’re doing here at SLAMCORE. Quick bit of history: we span out from Imperial College about four years ago, founded by absolute world leaders in the field of machine vision. After multiple funding rounds led by top investors, we’ve now built an incredible team of 25 staff from 17 nationalities around the world, who together have produced 50 patents, 500 papers and 50,000 citations during that time. So they’ve been busy. And after four years of development, I’m extremely proud to tell you that the SLAMCORE spatial AI SDK is hitting the market, so developers can use it to create their own vision-based solutions with our proprietary algorithms at the core.

Owen Nicholson: (45:28)
So our public SDK is available for free, and we’ve teamed up with Intel to make it out-of-the-box compatible with the RealSense D435i sensor, if you’ve ever used that. And through our paid access program, you can also gain access to other sensor configurations and builds for Arm architectures such as the Qualcomm Snapdragon and Nvidia Jetson. What I’m super proud of is the Raspberry Pi: getting this running on that is a huge achievement. But let’s take a closer look at what we’re providing. The first thing, as I said, is that the robot needs to know where it is in space. So our algorithms analyze each frame from the Intel sensor and use that to build a centimeter-accurate map of the key points in the environment, while simultaneously calculating the position of the robot relative to those points.

Owen Nicholson: (46:12)
That’s what the SLAM stands for in SLAMcore: simultaneous localization and mapping. So here we actually see it running on a remote-control car, building a point cloud outdoors while positioning that car in space to centimeter accuracy, all in real time on an Arm CPU. Second, a robot also needs to know what the shape of the world is, so it can navigate effectively and not crash into things. Our solution answers this by using the same Intel sensor: it takes the depth information generated by the infrared depth camera and fuses it into a 3D map, again in real time on an Arm CPU, which can be used for path planning and obstacle avoidance. But for truly spatially aware machines, robots need to know not just the shape and position of the world; they need to know what the objects are. They need context.
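The second capability, turning depth data into a map the planner can use for obstacle avoidance, reduces in its simplest form to projecting 3D points into an occupancy grid. This is a toy sketch with made-up cell sizes and points, not SLAMcore's mapping pipeline:

```python
def build_occupancy_grid(points, cell=0.5, size=10):
    """Mark grid cells containing at least one obstacle point."""
    grid = [[0] * size for _ in range(size)]
    for x, y, z in points:
        if 0.1 < z < 2.0:  # ignore floor and ceiling returns
            i, j = int(x / cell), int(y / cell)
            if 0 <= i < size and 0 <= j < size:
                grid[i][j] = 1
    return grid

# Hypothetical depth returns: two hits on one obstacle, one far obstacle,
# and one floor point that the height filter rejects.
points = [(1.2, 0.7, 1.0), (1.3, 0.6, 0.5), (4.9, 4.9, 1.5), (2.0, 2.0, 0.05)]
grid = build_occupancy_grid(points)
```

A path planner can then treat `1` cells as blocked; real systems use 3D voxel maps and probabilistic updates, but the flattening idea is the same.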

Owen Nicholson: (46:56)
Is that a person? Is it a chair, a wall, a ceiling? We’ve actually developed a proprietary panoptic neural network which segments out the objects and categorizes them, all in real time. Ultimately, as we fuse the three levels together, this will allow the robot to identify objects and position them in 3D space and time, which really gives full spatial understanding to the robots. And the great thing about what we’ve done is that in the past it could take months, maybe years, to build a great vision-based system, if you could at all. We’ve reduced this from months to minutes: if you head to our website and sign up for an account, you can download the SDK and be off. We will need to approve you first, so please request access on our website. As long as you’ve got the hardware we support, you can be up and running in less than 30 seconds.

Owen Nicholson: (47:41)
And actually, you can see for yourself: in the time it’s taken me to present this slide, and that was not edited or sped up, that was actually me on a fresh machine, installing it from scratch. So, in my last slides: I really do believe that we’re in a new era for autonomous machines. We’ve already heard that, and we are committing to open up this technology to the world, from hackers to tech giants, from small companies to multinationals. The more companies we have playing in this space, the more this will drive innovation and ultimately allow us to see the true potential of robots as a force for good in the world. And we want to be a huge part of that. So if you want to join us on that journey, please reach out. I’m happy to talk anytime.

Thomas Andersson: (48:22)
Excellent, thanks a lot for that one. Quite a condensed presentation, very interesting. Anyway, we move over to Fizyr, and Herbert in the Netherlands.

Herbert ten Have: (48:37)
Yes. Good afternoon everybody. Let me share my screen. Can you see my screen? Yes, thumbs up. Okay. We’re a scale-up in the Netherlands, founded by Martijn Wisse, a robotics professor. He has an academic view of the world, and he and I started this company, Delft Robotics, a few years ago. We are in the business of logistics, enabling robots to cope with variation: variation of items in shapes and sizes. Everything you buy online is currently being picked by humans, in fulfilment, order picking, et cetera; it gets packed into a box or a bag, and that’s also handled by humans. I’ve got a short video to show this, and this is it.

Herbert ten Have: (49:33)
So this happens all around the world, 24/7: humans have to pick parcels. These are big parcels, and you can imagine all sorts of small parcels from China are all being picked by humans day in, day out. On average, a parcel from DHL is picked up by a human at least eight times, and up to sixteen times. In this case, they’re all placed on the sorter, then the sorter is scanned in a scanning tunnel, and then each parcel is delivered to the area where it needs to go. So that’s the situation up till now, and you can imagine that in times of corona it has only increased. So why has this not been automated so far? Because of variation: variation in shape, in size, in color, in material, and in how it’s stacked. Just as an example, if I drop a small towel a thousand times, the robot will see a thousand different towels, but it’s the same one.

Herbert ten Have: (50:43)
So what we’ve done is train a neural network with supervised learning to enable the robot to generalize: to see the item and know where to grasp. That’s what we do. We built this algorithm, and it’s being used in both order picking and parcel handling. Just as an example, you see a roll cage we pick from, or we pick from a bin. And maybe in this example you see what the camera does: it takes the image, or rather we take the point cloud, the 3D data, and we do the segmentation; you see the green bounding boxes. We also find all kinds of classifiers, so we can see, for instance, if it’s a box or if it’s a bag, et cetera, all the classifiers that are relevant. And then we propose the grasp poses: six degrees of freedom, X, Y, Z plus the rotations.

Herbert ten Have: (51:38)
So that’s all done by the neural network in less than 300 milliseconds. To give an example, you see a drinking cup from a baby, and in the point cloud at the top left, you see we are looking for a surface big enough and flat enough to attach a vacuum cup to, to pick it up. That generates a six-degrees-of-freedom grasp pose, which you also see on the groceries in the bin on the right. The green, red and blue axes represent the angle at which the suction cup should apply force. So this is just an example of the work that we’ve done with picking from this bin. We are in logistics, which is growing very fast, and our systems are being used in production in Europe, in North America, and all the way to China.
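The check Herbert describes, finding a surface big enough and flat enough to attach a vacuum cup to, can be illustrated with a deliberately simplified patch test. Real systems fit planes in full 3D and propose 6-DoF poses; this sketch assumes a roughly horizontal point-cloud patch, and the cup diameter and roughness thresholds are invented:

```python
def suction_candidate(patch, cup_diameter=0.03, max_roughness=0.002):
    """True if the patch spans the cup and deviates little from flat (meters)."""
    xs, ys, zs = zip(*patch)
    wide_enough = (max(xs) - min(xs) >= cup_diameter and
                   max(ys) - min(ys) >= cup_diameter)
    mean_z = sum(zs) / len(zs)                    # crude horizontal plane fit
    flat_enough = max(abs(z - mean_z) for z in zs) <= max_roughness
    return wide_enough and flat_enough

# Hypothetical patches: the flat top of a box versus a crumpled bag.
flat_box_top = [(0.00, 0.00, 0.100), (0.04, 0.00, 0.101),
                (0.00, 0.04, 0.100), (0.04, 0.04, 0.099)]
crumpled_bag = [(0.00, 0.00, 0.100), (0.04, 0.00, 0.120),
                (0.00, 0.04, 0.090), (0.04, 0.04, 0.105)]
```

The box top passes both tests; the bag is wide enough but far too rough for a reliable seal, so the planner would look for another grasp point.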

Herbert ten Have: (52:36)
So we have integrators using our software, and it’s not only for item picking and parcel handling, but also for palletizing and truck loading. We even pick towels: we have an algorithm that finds the corners of towels and then feeds them into a folding machine. We were bootstrapped for the first years, so we really grew, and I think we are an exception in the field. Now we got funding at the beginning of this year, and we’re growing faster. We did use ROS in the past, but we stopped due to quality: we had everything in C++ and have now moved to Rust, with our newer environments. We really have production systems, so

Herbert ten Have: (53:26)
they really rely on performance, and we have to build the best systems for that. This is our timeline: we’re a previous winner of the Amazon Picking Challenge, and we’re based in Delft, close to the university, in the old university building. That’s it.

Thomas Andersson: (53:46)
Excellent, thanks a lot for that, Herbert. It’s very exciting days for logistics robotics, for sure; we keep hearing about lots of people being interested in investing in it. So with that, we’re handing over to Sebastian from HAL Robotics in the UK.

Sebastian Andraos: (54:08)
Hello? Yes, I am sorry. There we go. Okay. So my name is Sebastian, and I’m one of the co-founders of HAL Robotics. HAL Robotics is a software company based here in London and in Paris. We provide solutions and services to model, program, simulate, and increasingly communicate with applications involving industrial robots. As I mentioned, we have offices in London and Paris, but we have users absolutely all over the world and in all sorts of different industries, from food and beverage to aerospace, arts and crafts to construction. The ones I’ve picked out here in particular are those that I think best exemplify what we do: industries that are currently under-automated, that tend to work with very small batches or even one-off pieces, and where the operators are more often than not non-expert users, or at least we have to cater for non-expert users in these fields.

Sebastian Andraos: (55:25)
And we do that by striving for simplicity and flexibility. It’s obviously an approach that will work for any other industry, but for these sectors it’s particularly crucial. On the simplicity side, we offer robot programming solutions tied directly into CAD packages with which our users are already familiar. The solutions themselves are built on top of our robotics framework, which is a flexible, cross-compatible, lightweight, and highly extensible software library that allows us to develop custom software that will run on PCs, embedded systems, or in the cloud, with bespoke user interfaces to suit any skill level.

Sebastian Andraos: (56:10)
To me, there are three major benefits of flexible, digitized automation, and this project, A Bridge Too Far, I think embodies them all perfectly. Firstly, they used simulation, and the freedom the robots gave them, to optimize the structure not only for performance but also taking into account manufacturing constraints like production time and tool-path feasibility, and a few other parameters. They also made use of automated tool-path generation to push the normal process much further than it could be programmed by hand: there are something like 10,000 different targets on each of their panels, and I can tell you, you’re never going to do that by hand; it’s too long, it’s too boring. And finally, they used the fact that CAD and tool paths are adaptive, so the robot is reprogrammed automatically when a part changes, to make each part in the structure unique at no extra cost, with minimal variation of time per piece and complete freedom of design, ending up with a truly mass-customized and highly performative product.
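Sebastian's point about the 10,000 targets is that nobody programs them by hand: a parametric generator produces them, and regenerating when the CAD part changes costs nothing. A hypothetical sketch of that idea; the serpentine pattern, millimetre dimensions, and spacing are all invented, and this is not HAL Robotics' actual API:

```python
def panel_targets(width_mm, height_mm, spacing_mm):
    """Lay a serpentine grid of (x, y) targets over a rectangular panel."""
    targets = []
    rows = height_mm // spacing_mm + 1
    cols = width_mm // spacing_mm + 1
    for r in range(rows):
        # Alternate sweep direction each row so the tool path is continuous.
        xs = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        targets.extend((c * spacing_mm, r * spacing_mm) for c in xs)
    return targets

small = panel_targets(1000, 1000, 100)  # an 11 x 11 grid of targets
large = panel_targets(2000, 1500, 20)   # a changed part just regenerates
```

Changing the panel dimensions regenerates thousands of targets instantly, which is exactly the "reprogrammed automatically when a part changes" property described above.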

Sebastian Andraos: (57:28)
These same strategies can be applied regardless of the industry or material being used, and they can be tied in with sensors to perform more complex adaptive processes. Here, for example, we have glassblowing, or glass bending actually in this particular case, steam bending of wood, and even robots on construction sites that we’re working on at the moment. And just because it’s fun: once you’ve got sensors in the mix, you can tie people in and have a bit of emotion and interaction going on. This was a student at a workshop we did a couple of years ago now, who’s controlling a robot with his right hand and the gripper with his left. It appears that I raced through that, but I’m going to leave it at that. So please feel free to reach out and ask any questions you may have.

Thomas Andersson: (58:22)
Well, that gives us a spare minute. Well done, Sebastian, and thanks a lot for that; a very interesting solution you have there. I’ll say it again: feel free to field any questions through the Q&A tab, any of the attendees. And with that we move on to the MIRANDA software from IRAI in France, and Laurent.

Laurent IRAI: (58:45)
Yep, thanks again. I’m going to share my screen. Okay. So I’m Laurent from the IRAI company, based in France. We’ve been developing software for industry and education since 1988. Today I will show you a piece of software that we developed for students to learn programming, called MIRANDA. It’s a simulator for the small robots that you can find in some schools, which you can program either in Scratch or in Python. Let me show you directly inside the software. We developed this tool because schools in particular don’t have access to a lot of robots; it’s a question of budgets. With this tool, students are able to work together at the same time, either in simulation and then with the real robot, or to run a test before producing their own robots.

Laurent IRAI: (01:00:04)
Some schools do that with their children, and with MIRANDA they can do all of that: they can simulate their own robots, test a challenge inside MIRANDA, then transfer the program to the real robot to test it. There’s also a digital twin, so you can design your own robot before producing it and test every aspect of it in simulation. To support that, we propose challenges for the students. For example, here is a small robot that’s pretty common in schools.

Laurent IRAI: (01:00:51)
The robots can be programmed, as you see, in Scratch or in Python. For example, one challenge is to program the robot to go through the gate. I can simply write a program, and if it fails to reach the gate, that’s fine: it’s a good way for students to learn programming without damaging the real robots. What’s more, as a teacher or a parent, you can follow the progression: your students are registered inside an account, so I can follow their progression, see each challenge and the progress they’ve made; here they’ve done twelve of them. This is the program editor, so I can just test a program with MIRANDA. These are predefined challenges that come with the software, but you also have access to an editor, so you can create your own challenges or modify the ones provided; you can add 3D models from the library or from CAD software to create your own simulation. If you have more questions, or if you want to test the software, you can go to our website and ask for a three-month trial. On our website, IRAI in France, which we linked in the chat, you can also see all our other products on the industry and automation side, including robot simulation. Thank you.
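The kind of exercise Laurent describes, "program the robot to reach the gate", boils down to a short control loop. MIRANDA's actual Python API will differ; this self-contained toy simulates a robot on a one-dimensional track so the student program at the bottom can run anywhere:

```python
class SimRobot:
    """A hypothetical stand-in for a simulated classroom robot."""

    def __init__(self, gate_at=10):
        self.position = 0
        self.gate_at = gate_at

    def forward(self, steps=1):
        self.position += steps

    def at_gate(self):
        return self.position >= self.gate_at

# The student's whole program: drive forward until the gate is reached.
robot = SimRobot()
moves = 0
while not robot.at_gate():
    robot.forward()
    moves += 1
```

The point of simulating first is exactly what Laurent says: a buggy loop here just produces a wrong number, whereas on a real robot it might drive off the table.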

Thomas Andersson: (01:03:10)
Excellent, thanks for that. With that, we’re moving on quickly to Austria, and Christoph from Incubed IT. Over to you, Christoph.

Christoph Zehentner: (01:03:33)
Thank you very much, and thanks for the invitation. My name is Christoph Zehentner, and I’m product owner and one of the seven founders of Incubed IT. Yes, you heard right: we are seven founders, and there are actually already 30 people working here at Incubed IT. And as you can hear from my lovely accent, I’m from Graz, Austria. We at Incubed IT believe that autonomous mobile robots will become a commodity in every warehouse and on every production shop floor within a couple of years from now. And we believe that software will be the key for robots to actually become part of every warehouse and every production line. That’s why we developed, over the last nine years, a robotics platform that turns any vehicle, any AGV, into an autonomous mobile robot. How that actually works you can see in a nice little YouTube video, which I will let run while I speak. Over the last nine years, we invested a lot in areas like localization, environment awareness, navigation, natural obstacle avoidance: all the stuff that makes a robot go from A to B naturally. And we have seen that this technology

Philip English: (01:04:58)
Oh, I think there’s a request, guys: the screen isn’t being shared. Please could you share your screen again?

Christoph Zehentner: (01:05:06)
Doesn’t matter. We’ve seen over the last years that this technology has become robust enough to serve 24/7 industrial applications, and what is needed now to get large fleets into the real world is ease of use. What is ease of use for us? On the one hand, it’s easy hardware integration: we need to get our software as easily as possible onto new hardware models. Then it’s easy implementation of customer projects: the product needs to be applied like a consumer product. We often compare it to a TV: you select the TV in the shop, and then you just use it.

Thomas Andersson: (01:05:50)
Just one moment: can you share your screen again? Some of the attendees are really keen. Sorry for that.

Christoph Zehentner: (01:06:01)
And on top of that, it’s easy connectivity. Our fleet management server, for example, can be installed in the cloud, making it possible to attach it to multiple add-on services, such as analytics platforms, other host systems, ERP systems, whatever. And in the end, it’s easy usage on the shop floor itself. It’s not only about a usable or easy-to-use user interface; it’s much more: the whole robotic solution must be easy to integrate and smooth to use. We always say smooth is the new smart. To do all that, we developed a software platform that consists of three main parts. The first is the smart shuttle navigation toolkit, as we tend to call it: the navigation and localization stack that runs within the robot.

Christoph Zehentner: (01:06:56)
The second part is the fleet management server, which coordinates a fleet of heterogeneous robots, and on top of that sits data monitoring and analytics, so the customer gets insight into what is actually going on on the shop floor. Some numbers: we have more than 300 shuttles deployed worldwide, currently running on our software platform, and the biggest fleet, for example, consisting of 40 shuttles, has run more than a thousand kilometers, 24/7, since 2014. But there’s more than just robots driving around. Robotics projects consist of multiple steps, and the whole life cycle can be covered by our digital twin: starting from the planning phase, where you can simulate a site up front without actually having robots there, you can see what the process will look like and how many shuttles you need, then going through the development and testing of the installation, all the way to going live.
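The planning-phase question Christoph mentions, "how many shuttles do I need?", has a back-of-the-envelope version that a digital twin then refines with real travel times and congestion. Every number below is an assumption for illustration, not Incubed IT's sizing method:

```python
import math

def shuttles_needed(orders_per_hour, avg_trip_s, availability=0.9):
    """Trips one shuttle can serve per hour, scaled by uptime, vs. demand."""
    trips_per_shuttle = 3600 / avg_trip_s * availability
    return math.ceil(orders_per_hour / trips_per_shuttle)

# Hypothetical site: 400 transport orders per hour, 180 s average round trip.
fleet_size = shuttles_needed(400, 180)
```

The simulation's job is then to check whether this naive estimate survives blocking, charging, and traffic at intersections, which it usually doesn't, hence the digital twin.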

Christoph Zehentner: (01:07:58)
So you take the data that came from planning, which you adapted during development, and actually put it into production. It’s a nice thing to take such a solution into real production from home, even from your home office; it’s much more convenient than sitting in a four-degrees-cold warehouse. And when you have a support case, for example, we just load data from the live installation back into our digital twin and simulate: how could that happen? How could a bug fix, or a change in the configuration, fix the issue? And last but not least, when you have processes that should change, you can try them up front in the digital twin, see how your KPIs vary, and even suggest changes to the customer up front, before he himself knows he has something to tune or improve. Yeah, that’s basically it; I somehow hurried through it. If you have any questions, please feel free to contact us; I’ve given the contact information here. And once again, thank you for organizing this opportunity. I’d love to hear from you guys. Thank you.

Thomas Andersson: (01:09:10)
Excellent. Just two quick notes: there’s a question for Fizyr in the Q&A, and there’s also one for you, Christoph, in the chat, if you can just look at that. Anyway, thanks for that; very exciting, the solution you’ve developed. With that, we’re moving over to Extend Robotics, also in the UK, and Ji Long. I believe I pronounced that right?

Chang Liu: (01:09:45)
Sorry, it’s Chang Liu. Hello everyone. My name is Chang Liu, and I’m the founder and CEO of Extend Robotics. We are a human-focused robotics startup in the UK. Our vision is to extend human capabilities beyond physical presence. We build affordable robotic arms capable of remote operation from anywhere in the world, using cloud-based teleoperation software. Our technical founders are postdocs and PhDs from Imperial College, with commercial leadership from Huawei. The background is this: the new trend of digitization and automation is transforming almost all industries, and the new technologies are approaching the slope of enlightenment, while new challenges, like labor shortages, an ageing society, and the pandemic, are forcing our society to look for new technologies and adopt them faster.

Chang Liu: (01:10:54)
All of these contribute to the exponential growth of robotics that we see today. Sales of service robots have exceeded 17 billion dollars and are forecast to reach 40 billion in 2023, so it’s a big market. Of course, I cannot avoid talking about COVID: 65% of UK office workers believe working remotely will become more common even after the pandemic, and many traditional industries are showing the need for remote working, even for physical tasks. It’s the worst time for humans, but maybe the best time for robotics. The fact is that still over 50% of jobs cannot be easily automated by today’s robots, due to the limited dexterity and intelligence of autonomous robots, and the low scalability and high cost of traditional teleoperated robots.

Chang Liu: (01:11:58)
However, many of these operations now have a strong desire to avoid human presence. But how can robotics help with this? Since the 1950s, teleoperated robots and autonomous robots have been complementary in terms of balancing cost and dexterity. What people really need, against this background, is a robotic system that is affordable and can provide the flexible dexterity they need to complete the job remotely and reliably. So our solution is that we’re building next-generation teleoperation robots: an affordable, VR-controlled, multi-purpose robotic arm accessible from anywhere through the cloud. It works as the physical avatar of the user, performing manipulation or teleoperation tasks remotely.

Chang Liu: (01:12:59)
So we call it the user’s avatar. It’s a robotic manipulation module to be integrated into your existing robots, for example mobile robots. Technically, it includes three parts: the robot toolkit, the Advanced Mechanical Assistance System (AMAS), and an imitation learning engine. Our robot toolkit is a versatile arm module with human-like dexterity and perception. It is an affordable, multi-purpose robotic arm optimized for mobile teleoperation. It features a remarkably high power-to-mass ratio with six degrees of freedom, utilizing high-power-density quasi-direct-drive actuators and a lightweight, optimized design with a parallel mechanism. It also integrates RGB-D sensors and a powerful compute unit for data processing and control.

Chang Liu: (01:14:07)
Everything comes in one package, at quite a low cost. Our AMAS is powerful VR software which provides immersive 3D perception and intuitive control for highly dexterous teleoperation with low-cost consumer equipment. We developed an efficient point-cloud streaming pipeline and a gesture-based graphical user interface. Utilizing 3D point-cloud perception gives the user an accurate sense of depth and a flexible viewpoint, and avoids motion sickness, while the digital-twin-based gesture control gives the user an intuitive interface to send complex control signals effortlessly. Our imitation learning engine is a cloud-based imitation pipeline enabling anywhere access and flexible, data-driven AI. I’ll just quickly jump through the rest: if you’re interested, you can go to YouTube to check out our walkthrough video. Thanks for listening, and sorry for rushing at the end.
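The gesture-based control Chang describes ultimately maps a VR controller's motion onto the arm's target pose. A minimal sketch of one common scheme, scaled position deltas with a workspace clamp; the function name, scale, and limits here are hypothetical, not Extend Robotics' interface:

```python
def controller_to_arm(arm_pos, ctrl_delta, scale=0.5, limit=0.8):
    """Scale the controller displacement and clamp it to the arm's reach (m)."""
    target = [a + scale * d for a, d in zip(arm_pos, ctrl_delta)]
    return [max(-limit, min(limit, t)) for t in target]

# User moves the controller 20 cm right and 10 cm down; the arm moves half that.
pos = controller_to_arm([0.0, 0.2, 0.4], [0.2, 0.0, -0.1])
```

Scaling down the operator's motion trades speed for precision, and the clamp keeps an over-enthusiastic gesture from commanding a pose outside the arm's workspace.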

Thomas Andersson: (01:15:36)
Thanks for that, very interesting. Anyway, we jump over to the last presentation: a note from ARS Recruitment, in case we have any people looking for work in the robotics sector these days. So, over to you, Sam.

Sam Robinson: (01:15:54)
Thank you. Uh, let me just share my screen. Can we all see that? Okay, cool. So thanks for the invitation. Um, my name is Sam Robinson. I am the owner.

Thomas Andersson: (01:16:10)
You haven’t shared anything yet. Oh, no.

Sam Robinson: (01:16:14)
Two seconds then,

Thomas Andersson: (01:16:18)
Just to note again, feel free to field any questions in the Q&A tab, and we will try to read them out later or on to the main screen. There you go.

Sam Robinson: (01:16:29)
Okay, lovely. Sorry. So yeah, I am Sam Robinson, the owner of Automation Recruitment Solutions. I set the business up in February this year, just before COVID struck, really specializing in automation and robotics recruitment. I just want to talk about a few things today, mainly top tips for clients on hiring. The market is picking back up now and some people are after, you know, specialized staff, so I just want to put a few points out, specifically on being competitive. There are a lot of people within automation and robotics looking for the same type of talent, the same type of skill set, so you really need to start thinking about what separates you from the other people. It’s not always about salary; candidates in the market these days are looking for more benefits in terms of flexibility and career development.

Sam Robinson: (01:17:22)
Then think about the interview methods you are actually using while hiring. Are they formal interviews, or are you now switching over to Zoom meetings and Teams meetings? And streamline your process, so the candidate is fully clear on how many stages the interview is going to take. As I just touched on, there’s not a great load of talent out there at the moment, and it is spread thin across many companies, so there are different options to consider. Are you taking the graduate route? Or, maybe not so much on the technical roles, but on senior management and director roles, are you taking people from outside the industry who bring different methodologies and different ideas? I’ve found with a couple of the clients I currently work with that it’s a great way of bringing new ideas into the business. Now I just want to talk about the candidate side of things very quickly.

Sam Robinson: (01:18:10)
If you are looking for work at the moment, it’s very important that you make your CV very clear and very precise about the type of skills you have and why somebody should want to hire you. I would include your achievements on your CV and make it very clear what you can bring to the business. Then, if you are selected for interview, I think preparation is very important. You need to do your research on the business. "What do you know about us?" is one of the interview questions that gets asked every single time, and you’d be surprised how many times it still catches people out, even today, which is bizarre. I would also recommend doing some research on the managers who are actually going to be interviewing you for the role. And on top of that, I’d suggest doing a route plan on Google Maps to see where the interview is and how long it’s going to take to get there.

Sam Robinson: (01:18:57)
You could even do a practice run of the route. It’s only a short presentation, but I’ve saved the most important part till last, and that’s your LinkedIn profile. It’s a public platform that everybody is on, or should be on, whether they’re looking for work or looking to hire. Your profile should include your contact details and a picture of yourself, making yourself visible so people can contact you about opportunities. Recruiters like myself use it as the number one go-to to find talent, and if you’re not on there making yourself visible, there are other people out there who are. So if you need any support or help with anything regarding your LinkedIn, your preparation for interviews, or your journey just looking around, I’d love for you guys to get in touch, and I’m more than happy to help, even if it’s just guidance. So with that, it was only a short one. Thank you very much for the invitation, and back over to you, Thomas.

Thomas Andersson: (01:19:49)
Yeah, thanks a lot for that. So that kind of concludes all the presentations, and we can now go over to the Q&A session. Let’s see here, I’ll share my screen. First of all, if any attendees want to contact us, just take a screenshot now. There is one question for Fizyr: have you worked on any application where it’s necessary to pick up items using a robotic arm located on a mobile robot? If so, what experience and challenges have you found?

Herbert ten Have: (01:20:35)
I can answer that. Actually, a lot of people think of it that way, replacing a human who is both walking around and picking, but it’s not an interesting application for a robot, for economic reasons. If you want the robot to have a decent ROI, a return on investment, you want it picking continuously: let’s say every few seconds the robot should be picking. So ideally you have a goods-to-person, or rather goods-to-robot, situation where the robot can keep picking. If it has to drive around, then apart from the navigation becoming harder, the picking happens, say, once in 30 seconds at best, maybe once a minute, so it doesn’t support the investment in the robot.
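Herbert's economic argument is easy to put in numbers: a stationary goods-to-robot cell picking every few seconds does an order of magnitude more picks per shift than a mobile manipulator that drives between picks. The rates below are invented for illustration only:

```python
def picks_per_shift(seconds_per_pick, shift_hours=8):
    """How many picks one robot completes in a shift at a given cycle time."""
    return int(shift_hours * 3600 / seconds_per_pick)

stationary = picks_per_shift(3)   # goods brought to the robot, 3 s cycle
mobile = picks_per_shift(45)      # robot drives between picks, 45 s cycle
throughput_ratio = stationary // mobile
```

With the same robot cost amortized over fifteen times fewer picks, the cost per pick of the mobile setup is fifteen times higher, which is why the ROI rarely works out.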

Thomas Andersson: (01:21:21)
Interesting, thanks for that. Feel free to field any questions. I have a question myself, perhaps, if anyone wants to pitch in: what has the impact of the pandemic been for you? How has business changed? Some must have had quite a positive impact, some perhaps a slightly negative one, or any type of impact you want to talk about. Perhaps Herbert wants to answer.

Herbert ten Have: (01:21:54)
Yeah, sure. So first we were shocked, of course, and we had an estimate of all projects being put on hold, so we had to survive for a year. But soon after, I think less than two months, we saw that our key clients, like Fabric in Tel Aviv and New York, who are picking groceries for home delivery, said full speed ahead. They increased their pace, so we worked harder and harder for them. And then in, I would say, the last three months, the parcel handlers have seen such big increases that they want more picking cells. So whether it's DHL, UPS or FedEx, all of them have a shortage of people and all of them want picking cells in the near future.

Thomas Andersson: (01:22:47)
Interesting. Anyone else? Christoph, perhaps?

Christoph Zehentner: (01:22:55)
I would say it's the same story here. At the beginning we saw that everybody put projects on hold, maybe out of fear, but actually it's rising again, and rising much more than it was before.

Thomas Andersson: (01:23:13)
Yeah, interesting. What about Owen? How did you see the pandemic?

Owen Nicholson: (01:23:25)
It was obviously a shock, a big shock to the system, and I think we all had to rethink a bit about how we were actually going to run our companies, moving to working remotely. I actually wrote an open letter to the robotics industry at the start, which got picked up by the robotics trade press and got us quite a lot of PR, essentially saying to a lot of the bigger companies and the investors: this is going to be a hard time for the robotics industry, but we are also part of the solution for the future, so let's make sure we get through it to the other side. And actually that's what we're starting to see now ourselves. We're seeing huge inbound interest. Investors are starting to see that robotics has, I'd say, even been accelerated by all of this, because a lot of the larger companies have been thinking about how to make themselves more robust for the future. So, absolutely, I wish it hadn't happened, but I think the net effect might actually be positive for the robotics industry, which is quite a bizarre thing to say.

Thomas Andersson: (01:24:30)
Interesting. What about the Alias Robotics side? Endika, in Spain, what was your experience of the pandemic?

Endika Gil-Uriarte: (01:24:39)
So I share what most of the previous speakers said; it has been pretty much the same for us. We had to move to fully remote working for a while, and most of our customers have suffered the same. Now we're back to the new normal. What we have witnessed is that some of our customers have reduced teams on the shop floor, they have a lot of constraints when it comes to physical contact between people, and so they have moved to wireless, remote monitoring of their robotic processes. And they have increasingly identified security as key for business continuity. So it's up to us, as was said before, to stay safe in this very new future and to be able to adapt effectively to these new conditions, which will very likely stay for a while.

Thomas Andersson: (01:25:41)
Interesting, thanks. What about Extend Robotics? Offering remote control, it must have been quite an interesting business to be in during the pandemic.

Chang Liu: (01:25:55)
Yeah, I guess we're quite lucky, because our whole solution is around remote working, especially being able to offer an intuitive user interface that allows anyone to control at least a robot arm remotely. As is clear now, there's a huge need not only for remote work in the ICT sense, but also for extending that into the physical domain, and that's exactly what we can do. For example, so many university labs have just closed and people cannot continue doing testing, and any company labs that require physical work have had to close, with no one able to access the location. But just imagine if someone could remotely control robots in the lab, continuously working on their product. That's exactly what we can offer.

Thomas Andersson: (01:27:02)
Sebastian, how about you? Any impact at all?

Sebastian Andraos: (01:27:08)
So, I mean, as a software company, working from home is a bit easier than it is for a hardware company. But yeah, everything has slowed down a bit. We've had a bit of contact from SME manufacturers struggling to keep as many people in their facilities, who would like to automate away some of the less interesting jobs. That's something we saw before the pandemic too; this has just spurred some of those people back into action. But yeah, for us it was relatively easy to move between locations. It'll be okay.

Thomas Andersson: (01:27:48)
Laurent, in France?

Laurent IRAI: (01:27:51)
Yeah, so fortunately for us it was actually an opportunity. On the industry side, where we work in simulation, it didn't change anything for us, but in education it did. Miranda is a product that we brought out in February, just before the pandemic, so students were actually using the software. We gave it away for free during the pandemic so that they could work from home on programming the robots that they couldn't access in school. So Miranda really became popular in the schools, and that was fortunate for us.

Thomas Andersson: (01:28:57)
Interesting. Yeah, I mean, it's been a help in some ways. So there's one question from an attendee, again touching on the pandemic impact, but specifically relating to how customers or end users have re-evaluated the way they access and deploy capital, so looking at CapEx versus OpEx type investments for robotics. Herbert, do you want to have a stab at that as well?

Herbert ten Have: (01:29:36)
Yes, I can. We actually support both constructions, CapEx and OpEx; we can even invoice per pick. That's really up to the client, so we're flexible on that. But we haven't seen a big change there. There are integrators and end users who want CapEx and others who want an OpEx model, so we are just flexible, I would say.

Thomas Andersson: (01:30:02)
Okay, what about Philip? Robot Center is obviously sponsoring the event, but it's also a system integrator working quite a lot with end customers in the UK. What's your view?

Philip English: (01:30:19)
Yeah, so from our side we've seen a lot of consultancy, especially from the bigger guys who are looking into the new types of robotics but are a bit hesitant about actually placing orders and placing POs. I think they're getting the consultancy, getting the understanding of what they need to do, ready to hit the order button. And then for the SMEs it depends on the industry; aerospace, for example, has gone quite quiet at the moment. But there's definitely interest out there; everyone is interested to see how this technology is going to help them.

Thomas Andersson: (01:31:03)
Interesting. Anyone else on the OpEx versus CapEx question, or let's say preservation of cash? Otherwise I can add something: I'm just writing a really big report on the AGV sector, AGVs and AMRs, autonomous mobile robots, and have interviewed about 55 to 60 companies. What we've seen is that two or three years ago people talked about RaaS models and leasing models, but no one really entered into that, whereas now customers are coming back and asking for these models. One caveat is that not many of the vendors allow the robotics fleet manager or controller software to run on the customer's own cloud, so the business models look slightly different; it's really only leasing that's available in most cases. I'm not sure if that's true for everyone. Anyone else on OpEx versus CapEx and cash preservation? Otherwise we have one very specific question as well.

Herbert ten Have: (01:32:40)
I think I remember the question; it was for us: wasn't it hard to migrate from C++ to Rust?

Thomas Andersson: (01:32:47)
That's it, yeah.

Herbert ten Have: (01:32:49)
I'm probably the least qualified person in my team to answer this question, but what I've witnessed is that of course it's a big change, because you're investing in a replacement; it doesn't bring you direct benefits, it's really long-term value for your clients. So it was a really strategic decision to go for the highest quality. And as you know, Microsoft decided to also move towards Rust instead of C++, which is an indication of how important they consider this decision. We had to train people, we had to do all the planning, so we invested a lot in this migration. Yeah, it's tough, but we want to stay the best, and this is one of the key elements.

Thomas Andersson: (01:33:35)
Okay, excellent, thanks for that. That also brought Microsoft into the discussion. We have a few other Q&As as well, but I think we're running slightly over time now; we've had the webinar going for one hour and ten minutes. I think that's it, unless there are any really pressing questions. There is a question for IRAI regarding the simulator. Perhaps you can answer that.

Laurent IRAI: (01:34:05)
Yes. In the software you can edit the existing robots, or plants, but you can also create your own robots from parts. Then you use the building blocks or the Python commands to control them inside Miranda, so you can test them and see the difference between the robots we have installed and yours. Yep.

Thomas Andersson: (01:34:43)
Yeah. I'd also remind all the attendees that this will be online on YouTube as well, so you'll be able to see the links and so on. With that, I think we'll close down this webinar. A big thank you to all the panelists who have given their time for this; hopefully we'll all get something out of it, some nice connections. So to everybody in Europe, have a nice evening, and to everybody in the US, I hope you have a nice day. Thanks everyone. Thanks guys. Thanks everyone.

TLA Robotics: https://www.techlondonadvocates.org.uk/
Tech London Advocates Robotics: TLA Youtube
Philip English: https://philipenglish.com/
Sponsor – Robot Center: http://www.robotcenter.co.uk
FIZYR: https://fizyr.com/
Alias Robotics: https://aliasrobotics.com/
SLAMCORE: https://www.slamcore.com/
HAL Robotics: https://hal-robotics.com/
Extend Robotics: https://www.extendrobotics.com/
IRAI Miranda: https://www.iraifrance.com/
IncubedIT: https://www.incubedit.com/
ARS Recruitment: https://www.ars-recruit.com/