Category archive

robotics - page 26

MIT’s new Ford-funded robot can deftly navigate pedestrian traffic

Robots face plenty of challenges before they become commonplace in our lives, and many of them probably aren't things you'd think of when drawing up a list. One good example is simply walking around among people and keeping pace with pedestrians on streets and sidewalks. MIT engineers have created a new autonomous robot that can do exactly that, and in a way that doesn't impede or inconvenience the people it's walking alongside.

Why is this important? Because future robots employed in service roles will have to navigate densely populated spots with a lot of foot traffic, including hospitals, malls, neighborhoods and campuses. That's why an MIT team led by researcher Yu Fan "Steven" Chen set out to create a robot that can move around completely on its own using "socially aware navigation" – in other words, by following the unspoken social codes we all unconsciously observe when moving around together in shared space.

The robot itself is not humanoid; it's just a few feet tall and rolls on wheels. It's designed to resemble what eventual mobile service kiosks and delivery robots might look like, and it carries a LiDAR array on top for high-resolution environment sensing, plus webcams and a depth sensor for visual input. It uses machine learning to reconcile the requirement to keep moving continuously with the highly unpredictable motion of people in crowds.
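
The article doesn't describe the navigation algorithm itself, only that the robot pairs its LiDAR, camera and depth data with machine learning. To make the core problem concrete, here is a minimal Python sketch of one naive way to frame it: score candidate velocities by progress toward a goal, and penalize any that would bring the robot too close to where pedestrians are predicted to be. Everything here (the constant-velocity pedestrian model, the cost terms, the parameter values) is an illustrative assumption, not the MIT team's method, which also has to learn unwritten norms such as which side to pass on.

import numpy as np

def predict_pedestrians(peds, horizon, dt):
    """Naive constant-velocity prediction of pedestrian positions.
    peds is a list of (position, velocity) pairs (2-D numpy arrays);
    the result has shape (steps, n_pedestrians, 2)."""
    steps = int(horizon / dt)
    return np.array([[p + v * (k + 1) * dt for p, v in peds]
                     for k in range(steps)])

def choose_velocity(robot_pos, goal, peds, v_max=1.2,
                    horizon=2.0, dt=0.25, clearance=0.6):
    """Pick the candidate velocity that makes the most progress toward
    the goal while staying `clearance` meters from predicted pedestrians."""
    predictions = predict_pedestrians(peds, horizon, dt)
    best, best_cost = np.zeros(2), float("inf")
    for speed in np.linspace(0.2, v_max, 5):
        for angle in np.linspace(-np.pi, np.pi, 16, endpoint=False):
            v = speed * np.array([np.cos(angle), np.sin(angle)])
            # Progress term: how far from the goal we end up.
            cost = np.linalg.norm(robot_pos + v * horizon - goal)
            # Social term: penalize crowding any predicted pedestrian.
            for k, frame in enumerate(predictions):
                future = robot_pos + v * (k + 1) * dt
                gaps = np.linalg.norm(frame - future, axis=1)
                cost += 10.0 * np.sum(np.maximum(0.0, clearance - gaps))
            if cost < best_cost:
                best, best_cost = v, cost
    return best

# Example: one pedestrian cutting across the robot's path to the goal.
peds = [(np.array([2.0, 0.5]), np.array([-0.5, 0.0]))]
print(choose_velocity(np.array([0.0, 0.0]), np.array([4.0, 0.0]), peds))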

In tests, the robot proved able to drive on its own for up to 20 minutes at a time, smoothly navigating actual pedestrian groups without any collisions. This is key if we’re ever to actually have service robots interacting with us in the real world, the way you often see them depicted in fictional worlds like the Star Wars cinematic universe.

Just navigating pedestrian situations is tough enough, but what this project specifically sought to do was to accomplish this in a way that also left people around the robot feeling comfortable with its presence. Social norms are a huge, powerful unspoken element of interpersonal interaction, which you’ll recognize if you’ve ever tried walking down the street in a different part of the world where the unspoken codes of conduct might be subtly different.

MIT’s socially aware robot could be a big step towards workable real-life service bots, which is something research funder Ford and other automakers are very interested in.

News Source = techcrunch.com

Pepper the robot can perform funerary rites, but it shouldn’t

It’s not really clear just what “humanoid” robots are actually for. I’ve seen them do all kinds of things, but almost none of them well; at our recent Robotics event in Boston, several leading experts in the field questioned their necessity. But we grew up with Data and Robby and Cylons, and so now we have Pepper. Pepper, performing funeral rites for cash-strapped people in Japan. Really?

This week is ENDEX, the Tokyo International Funeral and Cemetery Show and Life Ending Industry Expo — which, sure, isn’t for everyone, but honestly is a very important industry and one in which technology plays an increasing role.

But while I can get behind the idea of robotic grave cleaners, epitaph etchers, urn retrievers and things along those lines, this seems like a step in the wrong direction.

Pepper can bend its body, but it can't bow, because bowing is an expression of respect and Pepper can neither express itself nor feel respect.

According to the Japan Times, “Kanagawa-based Nissei Eco Co. will offer Pepper — billed as the world’s first robot that can read emotions — to chant Buddhist sutras at funerals, providing a cheaper alternative to human priests.”

(Just as a quick aside, I don’t buy that Pepper actually reads emotions — no robot or computer vision system can do that. Hell, neither can I most of the time.)

It’s meant to help bring down the costs of funerary services in Japan, where (as elsewhere) they can run into the tens of thousands of dollars. But this is just a bad, dumb idea.

It’s a bad idea to introduce autonomy, or any machine really, into a situation that fundamentally calls for respect and sympathy, because machines have neither.

And it's a dumb idea because Pepper specifically isn't good at this job! Humanoid robots are created with the idea of existing in and interacting with human spaces. There's nothing about a human-shaped machine that makes it better at praying. If anything, it's a mockery of the devotion and care that prayer is meant to embody. Not to mention the whole no-souls thing.

Mechanically it’s no better than playing an MP3 of the sutras on repeat, though it’s a great deal more expensive.

We have to consider with care how we integrate robots, AI, the internet and other technological advances into our lives and cultures. Dull, dirty and dangerous work (the "three Ds") is a perfect fit. Shoehorning robots in where people are vulnerable like this and need a human touch is distasteful and bizarre.

News Source = techcrunch.com

MIT’s Robogami lets you build custom 3D-printable robots from standard, folding parts

Flat-pack furniture made Ikea a global powerhouse, and the same principles may help create a new generation of robots. Interactive Robogami is a project from MIT that lets users create ambulatory robots from a library of pieces that fold and fit together like origami.

“The goal is to make the process of designing robots accessible,” said Adriana Schulz, a PhD student at MIT who co-led the project. “The actuators, the materials, the code, things like that require a lot of knowledge. Our system encapsulates that expert knowledge, so the user can focus on conceptual design.”

Robogami lets users combine a library of intercompatible parts with primitives that can be printed like puzzle pieces and then folded and locked into shape. The flat-printed style reduces both print time and material cost by more than half.

It has a user-friendly GUI that has more in common with a game or 3D doodling app than with CAD or other design environments. You drag the pieces where you want them, arrange the type and placement of the feet or wheels, and add any other features you think might be useful (or cool-looking). Then you can work out how and when those parts will move, in what direction, and so on.

Meanwhile, the app is doing all kinds of calculations in the background.

“One of the key things is that you can design geometry and motion at the same time. Normally that’s two different processes, but here you can change one and see how it affects the other,” said Schulz.

“The ordering of how the legs move, the speed, these are all parametric structures,” she said. How they interact is systematized, so you don’t have to worry about, for example, synchronizing motor torques or timing a foot’s touchdown — it’s all calculated internally. “The software maps these components into a full fabrication plan, from the mesh that goes to the 3D printer to the motors and code; the user doesn’t have to worry — it’s very end to end.”

For instance, if you decide to move the front legs back towards the center, the software calculates how that might affect when and how fast they move, or whether their motion would interfere with another piece. It watches for things like wobbliness, changes in orientation, and center of gravity.
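
To give a feel for the kind of check running in the background, here is a small Python sketch under stated assumptions: a robot design reduced to leg mounting offsets and a gait expressed as phases of grounded legs, with a crude center-of-gravity test that flags wobbly phases. The structure and numbers are invented for illustration; Robogami's actual geometry and motion analysis is far richer than this.

import numpy as np

# A toy parametric design: the body's mass sits at the origin, each leg is
# described only by its mounting offset, and the gait is a list of phases,
# each naming the legs that are on the ground during that phase.
legs = {
    "front_left":  np.array([ 0.10,  0.08]),
    "front_right": np.array([ 0.10, -0.08]),
    "back_left":   np.array([-0.10,  0.08]),
    "back_right":  np.array([-0.10, -0.08]),
}
gait = [
    {"front_left", "back_right"},   # diagonal pair, like a trot
    {"front_right", "back_left"},
]

def statically_stable(stance, com=np.zeros(2), tolerance=0.02):
    """Rough wobble check: with three or more feet down the stance is
    accepted; with two, the center of mass must sit near the line
    segment joining the grounded feet."""
    pts = [legs[name] for name in stance]
    if len(pts) >= 3:
        return True
    a, b = pts
    ab = b - a
    t = np.clip(np.dot(com - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(com - (a + t * ab)) <= tolerance

for phase, stance in enumerate(gait):
    ok = statically_stable(stance)
    print(f"phase {phase}: {sorted(stance)} -> {'stable' if ok else 'wobbly'}")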

It might sound a little like baby's first robot kit, but it's clearly a versatile tool. In a handful of tests with users of varying levels of familiarity with CAD and engineering tools, the team was struck by the results. "We were really surprised by the diversity of models people made," Schulz told me. It took about 15-30 minutes to design a robot, a few hours to print it, then 30-90 minutes to assemble it.

The tool isn't aimed at any population in particular, Schulz told me (other than "anybody"), but I thought this would make for an amazing couple of weeks in a high school or college engineering course, perhaps even earlier. Design your own little bots, print them overnight, assemble them to understand how the motors and chassis fit together, then race them or have them navigate obstacles.

For now, Robogami is just the subject of the researchers' paper, published in the International Journal of Robotics Research. But Schulz said the goal was to move beyond mere locomotion and into other tasks. "Empowering people to design complex things is difficult! But I think it would be exciting to create tools that lower design barriers for casual users."

“These tools enable new approaches to teaching computational thinking and creating,” said Daniela Rus, director of MIT’s CSAIL and collaborator on the research, in a news release. “Students can not only learn by coding and making their own robots, but by bringing to life conceptual ideas about what their robots can actually do.”

Featured Image: MIT CSAIL

News Source = techcrunch.com

Industrial hack can turn powerful machines into killer robots

When we imagine the existential threat to the humanity posed by automation, we usually think of the negative impact on the workforce, not robots quite literally bludgeoning us to death.

In a post titled "Exploiting industrial collaborative robots," security researchers at IOActive detail how popular models of consumer and industrial robots have already been compromised in ways that could cause humans bodily harm. The study examines a class of collaborative robots designed to work alongside their human counterparts, often in industrial settings.

IOActive’s research focuses specifically on a set of unpatched vulnerabilities affecting the UR line of robots, made by Universal Robots, including “authentication issues in many of the control protocols, susceptibility to physical attacks, memory corruption vulnerabilities, and insecure communication transport.” The team disclosed the concerns to the company in January 2016, published a video on the exploits in July and has now detailed its method in depth in the blog post.

The hack, which targets a buffer overflow vulnerability, disables key safety measures put in place to ensure that the robots can work peaceably alongside their human counterparts. While the programming limits the physical parameters of what the industrial robots can and can’t do, hacks like the one demonstrated here allow these limits to be broken. The result could be dangerous, even catastrophic, for nearby human workers. As the study explains, “… Even the smaller UR5 model is powerful enough to seriously harm a person. While running at slow speeds, their force is more than sufficient to cause a skull fracture.”
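
IOActive's post is about firmware-level flaws in the controller, and the exploit details aren't reproduced here. To make "disabling key safety measures" concrete, though, here is an illustrative Python sketch of the kind of limit check a collaborative controller applies to motion commands; once an attacker can rewrite the limit values (for example via the memory corruption or unauthenticated protocols described above), the check stops protecting anyone. All of the names and numbers below are assumptions for the sketch, not Universal Robots' actual firmware or configuration.

from dataclasses import dataclass

@dataclass
class SafetyLimits:
    """Illustrative speed/force ceilings for collaborative operation.
    In an attack like the one IOActive describes, values like these are
    what get overwritten once the controller is compromised."""
    max_tcp_speed_mps: float = 0.25    # tool speed near humans
    max_force_newtons: float = 150.0   # contact force ceiling

def clamp_command(cmd_speed, cmd_force, limits: SafetyLimits):
    """Scale down any motion command that exceeds the configured limits."""
    speed = min(cmd_speed, limits.max_tcp_speed_mps)
    force = min(cmd_force, limits.max_force_newtons)
    if (speed, force) != (cmd_speed, cmd_force):
        print("command clamped to collaborative limits")
    return speed, force

limits = SafetyLimits()
print(clamp_command(5.0, 300.0, limits))   # clamped to safe values

# If an attacker can tamper with the limits themselves, the clamp
# silently stops doing anything useful:
limits.max_tcp_speed_mps = 10.0
limits.max_force_newtons = 1e9
print(clamp_command(5.0, 300.0, limits))   # passes through unchanged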

According to documentation from Universal Robots, these concerns are well-known. The risks include:

  • Penetration of skin by sharp edges and sharp points on tools or tool connectors

  • Penetration of skin by sharp edges and sharp points on obstacles near the robot track

  • Bruising due to stroke from the robot

  • Sprain or bone fracture due to strokes between a heavy payload and a hard surface

  • Mistakes due to unauthorized changes to the safety configuration parameters

Yikes. And these robots are already deployed and in use around the globe. Another video by IOActive demonstrates how Softbank’s NAO and Pepper robots are vulnerable to an exploit that could turn them into spying devices, collecting unencrypted video and audio data and transmitting it remotely.

In July, the company published research on how hoverboards, humanity’s other looming existential threat, are also trying to kill us.

Featured Image: Bryce Durbin/TechCrunch

News Source = techcrunch.com
