Daily Archives: November 1, 2021

This Startup Is Tackling One of the Biggest Challenges in Robotics – Inc.

Posted: November 1, 2021 at 7:05 am

If a robot one day helps you make breakfast or change a diaper, there's a good chance PickNik Robotics had a hand in it.

The Boulder, Colorado-based company develops software that makes robots smarter, allowing them to make better decisions and perform tasks more intelligently.

Hardware is the easy part when it comes to robotics. "There are a lot of companies thathave been providing decent robot arms for a couple of decades now," says PickNik founder and CEO Dave Coleman. "The real challenge is making them smarter."

Tackling this difficult problem has opened up a huge business opportunity. PickNik earned $2.2 million in revenue in 2020, giving it a three-year growth rate of 966 percent and helping it land at No. 505 on this year's Inc. 5000 list. Coleman says the company's clients have included NASA, Google, Amazon, and robotics upstarts like Kindred and Plus One Robotics.

Coleman interned at robotics startup Willow Garage back in 2010. The young company employed many of the industry's brightest minds: Early staffers went on to found companies like Savioke, which makes bots for the hotel industry, and Zipline, a manufacturer of drones meant to deliver blood and other medical supplies to remote areas.

"That was really the starting point of my whole career," says Coleman, "being surrounded by all these amazing roboticists."

During his time at Willow Garage, Coleman worked on creating open-source software that powered robotic arms. After the company folded in 2014, he continued developing the platform, earning money by consulting for clients on how to use it in conjunction with their robots. Demand was so great that he decided to form a business based on the software the following year.

Picking up toys is less of a chore when you live with a PickNik-powered robot. Photographs by Ross Mantle

When combined with PickNik's platform, a robotic arm that previously could only, say, pick up and put down components in factories is suddenly able to better negotiate its environment. A bot can be trained to avoid humans or make decisions about which pieces of equipment to move and which to leave alone. For the company's various clients, PickNik's software helps machines efficiently and safely perform a variety of tasks like picking fruits and vegetables, prepping meals, assisting with surgeries, and working on underwater oil and gas rigs.

PickNik's platform is hardware agnostic, so it can be used with off-the-shelf arms as well as custom-built ones. That's proven beneficial to the company, helping ensure it can work with clients in a wide variety of industries, from manufacturing to farming. The 30-employee startup still offers open-source software, but it also has a premium offering that includes additional functionality, more customization, and support from its employees.
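Hardware agnosticism of this kind is typically achieved through an abstraction layer: the planning software targets one small, uniform arm interface, and each vendor's hardware plugs in behind it. A minimal sketch of the idea (the class and method names here are illustrative, not PickNik's actual API):

```python
from abc import ABC, abstractmethod

class RobotArm(ABC):
    """Uniform interface the planning layer targets; vendors implement it."""

    @abstractmethod
    def move_to(self, pose): ...

    @abstractmethod
    def set_gripper(self, closed): ...

class DemoArm(RobotArm):
    """Stand-in for a vendor driver; records the commands it receives."""
    def __init__(self):
        self.log = []

    def move_to(self, pose):
        self.log.append(("move", pose))

    def set_gripper(self, closed):
        self.log.append(("grip", closed))

def pick_and_place(arm, src, dst):
    """Planner-level task that runs unchanged on any RobotArm."""
    arm.move_to(src)
    arm.set_gripper(True)   # grasp at the source pose
    arm.move_to(dst)
    arm.set_gripper(False)  # release at the destination

arm = DemoArm()
pick_and_place(arm, src=(0.3, 0.1, 0.2), dst=(0.5, -0.2, 0.2))
print(arm.log[0])  # ('move', (0.3, 0.1, 0.2))
```

Because `pick_and_place` only sees the interface, swapping an off-the-shelf arm for a custom-built one means writing a new driver class, not new task logic.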

"How can someone who doesn't have a computer science degree or an engineering degree successfully control a robot and do all sorts of cool stuff with it?" says Coleman. "That's what our product offering does."

PickNik hasn't taken any venture capital, though it has won grants from the State of Colorado, NASA, and other grant-givers. The company is working with NASA on a robotic vehicle that could be used to unload cargo and perform other tasks in the new space station. Coleman says that not taking venture capital has allowed PickNik to test the waters in exciting markets, like space, without having to promise massive returns to investors.

"We're having fun," says Coleman. "As long as we make money, we're happy, even if this isn't a unicorn company."

The way things are going, though, it might become one anyway.

Originally posted here:

This Startup Is Tackling One of the Biggest Challenges in Robotics - Inc.

Posted in Robotics | Comments Off on This Startup Is Tackling One of the Biggest Challenges in Robotics – Inc.

Are you thinking about transitioning to robotic automation? – Today’s Medical Developments

Posted: at 7:05 am

Register today for next week's Robot Roundtable, taking place on Wednesday, November 3 from 12 PM to 1 PM ET. During this online event, representatives from SCHUNK, Techman Robot, and Maxbyte Advanced Robotics Centre will discuss the future of robotics and assess the impacts and opportunities available with robotics and AI.

Our lineup of panelists will include:

Thomas Reek, VP of Sales: Automation, SCHUNK, has more than 25 years of workpiece gripping and handling experience. He graduated from NC State University with a BSEE and began his career with SCHUNK in 1994 as an applications engineer in the early years of the company's launch into the USA.

Reek's career has continued to grow and evolve at SCHUNK: he served as director of sales for the Eastern states before being promoted to his current role of vice president of sales for the entire USA. Today, Tom leads a team of motivated professional territory sales managers dedicated to serving customers and making SCHUNK the recognized leader in workpiece handling technology.

Gerardo Paniagua, Senior Application Engineer, Techman Robot, has spent several years in the field of data analysis and machine learning. Motivated and organized, Gerardo has professional experience consulting on automation projects with cobots for business-driven solutions in smart applications such as artificial intelligence for defect identification, applied 3D vision, palletizing, data communication, and more.

Click here to learn more and register today! Can't participate on the day of the event? No problem. Each registered attendee will also receive a link to the recording, which will be sent within 7-10 business days after the event.


Vi partners with startups Vizzbee Robotics Solution and Tweek Labs for 5G trials – Economic Times

Posted: at 7:04 am

Vodafone Idea (Vi) on Monday said it has partnered with two Indian startups to test 5G-based solutions for aerial traffic management and a motion-capture system.

To conduct the trials, Vi has partnered with the startups Vizzbee Robotics Solution and Tweek Labs, the company said in the statement.

"We are conducting 5G trials to identify a range of India-specific use cases that can accelerate the achievement of smart cities, smart enterprises and smart citizens," Abhijit Kishore, chief operating officer, Vodafone Idea, said in the statement.

Tweek Labs will test a full-body motion-capture suit using 5G technology, which can be an effective solution for sports training as it helps monitor the performance of athletes.

These trials come at a time when Vi, along with rivals Bharti Airtel and Reliance Jio, is preparing to participate in the upcoming 5G auctions. The airwaves will bring in the next generation of technology for subscribers.


Notre Dame professor develops swarm robots and hopes to inspire other creators – ABC 57 News

Posted: at 7:04 am


SOUTH BEND, Ind. -- Notre Dame Professor Yasemin Ozkan-Aydin has spent fifteen years working in robotics, yet despite that mechanical inclination, she takes much of her inspiration from nature, like how centipedes use their legs to travel and how ants work together to complete tasks.

But robots with legs have their own unique challenges.

According to Prof. Ozkan-Aydin, "As a product, there are lots of wheeled robots, and other stuff. But legged robots are not commercially available."

So she opted to build her own, using convenient, modern technologies like 3-D printing and low-cost materials to create "swarm robots."

"The idea here is to use the legged robots like a swarm."

These four-legged robots, which cost only about $200 in raw materials to produce, are able to navigate uneven terrain and work together to complete tasks. Ozkan-Aydin suggests they could be used in a wide variety of applications, from search and rescue operations to gathering data to help farmers plant their crops.

She hopes that these robots can also be used as a learning tool.

"Students can learn how to control the systems using a low-cost platform like that," she said.

But one major takeaway Ozkan-Aydin has from the project is her hope that, by developing these swarm robots on her own, she can inspire the next generation of engineers to create their own devices.

"Robotics is growing, and now the age of the students is also decreasing. With the availability of 3-D printers and materials, it's a growing area," she said.

And she is thankful for the opportunity and support given by the University of Notre Dame.

"We have a growing robotics group here," she said. "So people are very supportive of the research on the robots, and our Dean is also interested in this kind of research."

And she suggests that anyone looking for design inspiration should look to the natural world that surrounds them:

"I suggest to people that if they want to work on robotics, to first look at biology, to see if they can't find some useful ideas."


IBM, Boston Dynamics Using AI and Walking Robots to Rethink and Improve Industrial Monitoring – EnterpriseAI

Posted: at 7:04 am

As AI-enabled robots continue to expand in use across manufacturing facilities, power plants, warehouses and other industrial sites, more potential use cases are constantly being identified by enterprises looking to solve their critical business problems.

In response, IBM and Boston Dynamics are partnering to bring IBM software and Boston Dynamics' Spot robots together to find answers for customers dealing with challenging technology requirements at the edge and in a wide range of industrial settings.

The new collaboration between the two companies, which was announced Oct. 26 (Tuesday), will pair their technologies to focus on conducting data analysis at the edge that will be used to address worker safety, optimize field operations and boost maintenance productivity across many industrial environments.

To fill these needs, IBM Consulting, in conjunction with AI and hybrid cloud innovations from IBM Research, is developing edge software applications to work with a variety of attachable payloads available for Boston Dynamics' agile mobile robot, Spot. The payloads are essentially specialized modules that include a robotic arm, a pan-tilt-zoom camera, a thermal camera for analyzing temperatures during inspections, a LIDAR mapping module and attachable edge CPU or GPU modules.

Through the partnership, IBM and Boston Dynamics are essentially using the Spot walking robot as an intelligent roaming edge device: equipped with specialized software, it can traverse stairs, rough topography and indoor or outdoor locations, giving operations staff the ability to remotely inspect and monitor equipment with dynamic sensing capabilities.


"If IBM was not part of the picture and you bought a Spot robot, you could use it for remote operations," Skip Snyder, a senior partner at IBM Consulting who leads the intelligent connected operations practice, told EnterpriseAI. "If you were responsible for a substation, you could wake Spot up at the substation and have it move and see what it sees."

That could include, for example, an inspection of a bank of valves in a power plant and a subsequent recording of the valve readings, he said. And when those readings are combined with IBM's edge applications and the specialized payload modules, customers can gain even more insights, he added.

"What we have done is we put on an edge payload that sits on the back of Spot, with performance criteria geared toward AI," said Snyder. "Now, using the analytics aboard Spot, the robot can show what it sees, and customers can run the inferencing at the same time it is reading the valve readings on the bank of valves."

"We can do the inferencing on the edge device to actually say, OK, at the rate that the pressure is increasing, we are going to be out of [the permitted pressure range] in about six hours," said Snyder. "[A worker then needs to] create a work order and alert somebody, and we can actually have Spot create a work order within an enterprise asset management system, including IBM's Maximo Application Suite."
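The kind of edge inference Snyder describes, projecting when a rising pressure will cross its permitted limit, can be as simple as a linear extrapolation over recent readings. A rough sketch of that idea (the function name, sample readings, and 120 psi limit are invented for illustration, not taken from IBM's software):

```python
def hours_until_limit(readings, limit):
    """Fit a least-squares slope to (hour, pressure) samples and project
    how many hours remain until the limit is crossed.

    readings: list of (t_hours, pressure) pairs, oldest first.
    Returns hours from the last sample, or None if pressure is not rising.
    """
    n = len(readings)
    mean_t = sum(t for t, _ in readings) / n
    mean_p = sum(p for _, p in readings) / n
    # Least-squares slope of pressure vs. time (psi per hour).
    slope = sum((t - mean_t) * (p - mean_p) for t, p in readings) / \
            sum((t - mean_t) ** 2 for t, _ in readings)
    if slope <= 0:
        return None  # steady or falling pressure: no alert needed
    last_t, last_p = readings[-1]
    return (limit - last_p) / slope

# Hypothetical hourly valve readings climbing toward a 120 psi limit.
samples = [(0, 100.0), (1, 102.0), (2, 104.0), (3, 106.0)]
print(round(hours_until_limit(samples, limit=120.0), 1))  # 7.0
```

A result like this is what would trigger the automatic work order in the asset management system before the limit is actually reached.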

One of the first pilot projects using the Spot robot and IBM applications is being conducted by the electric and gas utility National Grid, which serves Massachusetts and New York. According to the partners, Spot is being used to conduct regularly scheduled, autonomous inspections at National Grid sites. The integration of IBM's advanced AI services gives National Grid new actionable intelligence by processing the data and enabling faster response times when a problem in a facility system is detected.

Currently under field testing is a near real-time inferencing capability that incorporates thermo-visual analysis from inspection data collected by the robot, according to the companies. The new analyses should be able to help identify hotspots and other problems with power station components that could cause serious equipment failures and power outages if left unrepaired. National Grid is planning to use edge data processing via the Spot robot to notify maintenance staff workers as soon as any defects are found so they can make repairs immediately, even as the robot continues its inspection rounds.
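At its simplest, thermo-visual hotspot detection of this sort reduces to flagging regions of a thermal frame whose temperature exceeds an expected baseline by some margin. A toy sketch of that idea on a grid of temperature readings (the function, thresholds, and data are all hypothetical, not National Grid's actual analysis):

```python
def find_hotspots(grid, baseline, margin=15.0):
    """Return (row, col, temp) for cells exceeding baseline by more than margin.

    grid: 2-D list of temperatures, e.g. one thermal-camera frame;
    baseline: expected operating temperature for the component.
    """
    hotspots = []
    for r, row in enumerate(grid):
        for c, temp in enumerate(row):
            if temp - baseline > margin:
                hotspots.append((r, c, temp))
    return hotspots

# Hypothetical 3x3 thermal frame of a component surface, 60 C baseline.
frame = [
    [61.0, 62.5, 60.8],
    [63.0, 91.2, 64.1],  # one anomalously hot cell
    [60.2, 61.7, 62.0],
]
for r, c, t in find_hotspots(frame, baseline=60.0):
    print(f"hotspot at ({r}, {c}): {t} C")  # hotspot at (1, 1): 91.2 C
```

In the pilot described above, a detection like this is what would notify maintenance staff while the robot continues its inspection rounds.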

The sensing capabilities aboard the Spot robots can scale across multiple facility sites and across many types of equipment using IBM tools and expertise in edge, 5G, security and hybrid cloud, according to IBM and Boston Dynamics.

The first concepts bringing the two technologies together began this past January and were quickly seen as game-changing, said Snyder.

"You stop and think about things from an internet of things (IoT) perspective, where you can put sensors on all these assets [used by customers]," he said. "You can put cameras, vibration sensors, microphones on, and you can put payloads together to bring all those sensors to multiple assets. The other benefit of it is it is freeing up people to do other higher-value work."

Users can even shift among the sensors in a payload and get information across multiple areas at the same time, giving them more flexibility, said Snyder. "It is being able to go in and look at a piece of equipment in multiple ways."

The big differentiator between Spot and other robotic inspection devices is that Spot can go almost anywhere, he said. "I am talking about assets that are spanning large campuses, and it is all outside where the temperatures are above 100 degrees or below zero. But Spot does not care either way, and it can walk on whatever terrain happens to be out there."

Customers can use one Spot robot or multiple Spot robots, depending on their requirements and the sizes of their facilities.

Research and development are continuing with the technology and new features and capabilities will continue to be added over time, said Snyder.


Rob Enderle, principal analyst with Enderle Group, told EnterpriseAI that the partnership between IBM and Boston Dynamics to make this technology work is understandable.

"We are rolling out the fourth industrial revolution, and no one company can do it all," said Enderle. "Collaborations like this will be critical to assuring that the coming waves of technology do not also represent a growing danger to the humans that must work in and around these emerging technologies."

At the same time, the announcement showcases a critical step that most seem to be missing amid the growing number of autonomous machines: the need to update the related hardware, said Enderle. These robots, which will eventually be used in massive numbers, need to be maintained and updated to fix bugs, provide more advanced features and distribute training. IBM may be uniquely capable of providing the secure connections needed to capture data at scale, determine if patches and updates are necessary, and help deliver those updates securely so the robots are not compromised.

That, he said, could prevent some of the more troubling potential outcomes resulting from large numbers of robotic machines that are not patched rapidly enough to avoid catastrophic problems or are compromised by hackers wishing to do harm.


Another analyst, Lian Jye Su of ABI Research, called today's edge analytics market an "ecosystem play." "To solve customer pain points, vendors must form partnerships to gain industrial expertise and deep operational knowledge," said Su. "As the number of robots and drones proliferates, we expect to see more and more new use cases popping up in the future."

Su called the partnership between IBM and Boston Dynamics an exciting project that shows off Boston Dynamics' active pace in deploying its Spot robots since the company was acquired by Hyundai.

"As a quadruped, Spot is an excellent choice to operate in unstructured environments like manufacturing facilities, power plants, and warehouses," said Su. "When coupled with IBM's expertise in data analytics and AI, this partnership should be very intriguing to end users who want to reduce their reliance on human workers while getting high ROI on their tech investments."

"Autonomous mobile robots and drones have proven to be great tools to perform human-like tasks in hard-to-reach or hazardous areas," he added. "I expect to see users demand to connect these robots to their internal software for data collection, monitoring and logging. As such, the industry will witness more collaboration between robotics hardware vendors and AI software vendors. Field applications will be an interesting area for Boston Dynamics and IBM, as more and more robots are now deployed in agriculture, oil and gas, mining, seaports and more."

Read the original post:

IBM, Boston Dynamics Using AI and Walking Robots to Rethink and Improve Industrial Monitoring - EnterpriseAI

Posted in Robotics | Comments Off on IBM, Boston Dynamics Using AI and Walking Robots to Rethink and Improve Industrial Monitoring – EnterpriseAI

New Robots May Be Creepier Than The Uncanny Valley – Forbes

Posted: at 7:04 am

Researchers have discovered something creepier than the uncanny valley: identical faces.

The uncanny valley is the scientific explanation for why we all find clowns or corpses creepy. And just when we thought nothing could be more alarming than clowns, scientists have found an even uncannier way to freak us out.

New research finds that there is something even creepier than the uncanny valley: clones. Scientists now predict that when convincing humanoid robots with identical faces are launched, we are all going to panic.

It all started with robotics professor Masahiro Mori, who described the concept of "bukimi no tani genshō" (不気味の谷現象) in 1970, later translated as the uncanny valley. If robots look very different from humans, people are fine with them. And people have no problem with real human faces. But when a face is close to human yet not quite right, people get creeped out. Think of zombies or corpses. The effect is called the uncanny valley because it corresponds to a dip in the graph of people's emotional responses versus level of human likeness.

The uncanny valley refers to the dip in this graph: when something is close to human but not right, people find it creepy.
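The shape of that dip can be sketched with a toy function: overall affinity rises with human likeness, but a sharp "valley" near (not at) full likeness pulls the response negative. This is an illustration only; the constants below are invented, not fitted to Mori's data.

```python
import math

def affinity(likeness: float) -> float:
    """Toy model of emotional affinity vs. human likeness (0 to 1).

    Affinity grows with likeness, minus a Gaussian "valley" centred
    near full human likeness. Constants are illustrative, not fitted.
    """
    dip = 1.5 * math.exp(-((likeness - 0.85) ** 2) / 0.005)
    return likeness - dip

# An "almost human" face scores worse than either a clearly
# artificial one or a fully human one: that gap is the valley.
print(affinity(0.5))   # clearly artificial
print(affinity(0.85))  # almost human: deep in the valley
print(affinity(1.0))   # fully human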

One of the best-known examples of the uncanny valley effect came from the movie The Polar Express, which gave countless kids nightmares. Known for the unsettling eyes of its computer-generated characters, the movie is now used as a teaching example in film schools.

There are several theories for why the uncanny valley exists, but they all center on the idea that an aversive response to a face that's not quite right could have a survival benefit, perhaps even helping to protect us from contagion. So for robots to be successful in the marketplace, they need to overcome the uncanny valley.

And that's exactly what is about to happen: robots are overcoming our aversions. In a new research article, Dr. Fumiya Yonemitsu of Kyushu University and colleagues asked what would happen when humanoid robots get so good that they climb out of the uncanny valley.

"Technological advances in robotics have already produced robots that are indistinguishable from human beings," they write. "If humanoid robots with the same appearance are mass-produced and become commonplace, we may encounter circumstances in which people or human-like products have faces with the exact same appearance in the future."

To test people's reactions, the team asked participants to look at photos of individuals with the same face (clones), of individuals with different faces, and of one single person. Research participants were asked to rate how realistic the images appeared, their emotional reaction to them, and how eerie they found them.

An edited photo that shows one study author's face, duplicated, as an example of a clone image.

People's responses were clear: clones were creepy. And the more identical individuals in a picture, the eerier it got. The researchers decided to call this the clone devaluation effect.

"The clone devaluation effect was stronger when the number of clone faces increased from two to four," says Yonemitsu in a press release.

It's interesting that in shows like Westworld and Humans, we are rarely shown multiple clones of each robot. Perhaps that's because doing so would interfere with our experience of the robots as individuals with their own identities. In fact, viewing duplicates could even induce disgust.

"We also noticed that the duplication of identity, that is, the personality and mind unique to a person, rather than their facial features, has an important role in this effect. Clone faces with the duplication of identity were eerier," says Yonemitsu. "These results suggest that clone faces induce eeriness and that the clone devaluation effect is related to realism and disgust reactions."

It all raises the question: why are scientists so dead set on creating robots with Terminator-like realism anyway?

"Our study clearly shows that uncomfortable situations could occur due to the rapid development of technology," says co-author Dr. Akihiko Gobara of Ritsumeikan University. "But we believe our findings can play an important role in the smooth acceptance of new technologies and enhance people's enjoyment of their benefits."

And that last line is the uncanniest of all. Blade Runner is right around the corner. Happy Halloween.

Continued here:

New Robots May Be Creepier Than The Uncanny Valley - Forbes

DeepMind’s Progress Over The Years In Robotics – Analytics India Magazine

Posted: at 7:04 am

AI research lab DeepMind has acquired and open-sourced MuJoCo (Multi-Joint Dynamics with Contact), a physics engine known for its rich and effective contact model. The move gives a major push to DeepMind's robotics ambitions.

This article will trace how DeepMind has been making consistent efforts in pushing the envelope in robotics.

In 2016, DeepMind researchers demonstrated how deep reinforcement learning can train real physical robots. The paper showed that reinforcement learning algorithms based on deep Q-functions can scale to complex 3D manipulation tasks and efficiently learn deep neural network policies. The authors further showed that training time can be reduced by parallelising the algorithm across multiple robots that asynchronously pool their policy updates. The proposed methodology can learn a variety of 3D manipulation skills in simulation, and a door-opening skill (often considered a complex task for robots), without manually designed representations.

In 2018, DeepMind published three major papers on producing flexible, natural behaviours that can be reused and adapted to solve tasks. The scientists trained agents with a variety of simulated bodies to perform activities like jumping, turning, and crouching across diverse terrains. The results showed that the agents developed these skills without receiving specific instructions.

Credit: DeepMind

Another paper demonstrated a method to train a policy network that imitates motion capture data of human behaviours to pre-learn skills like walking, getting up from the ground, turning, and running. These behaviours can then be tuned and repurposed to solve other tasks like climbing stairs and navigating through walled corridors.

The third paper introduced a neural network architecture based on state-of-the-art generative models. This research showed how the architecture is capable of learning relationships between different behaviours and imitating specific actions that are shown to it. After training, the system could encode a single observed action and generate a novel movement.

DeepMind also demonstrated a framework for data-driven robotics that uses a large dataset of recorded robot experience and scales it to several tasks using a learned reward function. The framework can be applied to accomplish three different object manipulation tasks on a real robot platform.

The scientists used a special form of human annotations as supervision to learn a reward function and demonstrate tasks with task-agnostic recorded experience. This helps in dealing with real-world tasks where the reward signal cannot be acquired directly.

The learned rewards and the large dataset of experience from different tasks are used to learn a robot policy offline using batch reinforcement learning. This approach makes it possible to train agents to perform challenging manipulation tasks like stacking rigid objects.
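A minimal sketch of that two-stage recipe, learn a reward from human labels, then fit Q-values by sweeping over a fixed log of transitions with no new robot rollouts, might look like this. The toy states, labels, and transitions are invented for illustration; the real system learns an image-based reward model and uses neural networks:

```python
# Stage 1: learn a reward function from human annotations. Each
# annotation labels a state as success (1) or failure (0); here the
# "learned" reward is just the mean label per state. (A toy stand-in
# for the paper's learned reward model.)
annotations = [(0, 0), (1, 0), (2, 0), (3, 1), (3, 1), (2, 0)]
totals, counts = {}, {}
for state, label in annotations:
    totals[state] = totals.get(state, 0) + label
    counts[state] = counts.get(state, 0) + 1
learned_reward = {s: totals[s] / counts[s] for s in totals}

# Stage 2: batch (offline) reinforcement learning. Q-values are fitted
# by repeatedly sweeping a fixed log of (state, action, next_state)
# transitions, scoring each with the learned reward; no new rollouts.
log = [(0, 1, 1), (1, 1, 2), (2, 1, 3), (1, 0, 0), (2, 0, 1), (3, 1, 3)]
GAMMA = 0.9
q = {(s, a): 0.0 for s, a, _ in log}
for _ in range(50):
    for s, a, s2 in log:
        v_next = max(q.get((s2, b), 0.0) for b in (0, 1))
        q[(s, a)] = learned_reward.get(s2, 0.0) + GAMMA * v_next
```

The key property the sketch preserves: the reward signal never comes from the environment directly, only from the model fitted to human annotations, which is what makes the approach usable on real-world tasks where rewards cannot be measured.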

DeepMind recently introduced RGB-Stacking as a new benchmark for vision-based robotic manipulation. Here the robot has to learn how to grasp different objects and balance them on top of each other. It differs from previous work in the diversity of the objects used and the variety of empirical evaluations performed to verify the results.

Credit: DeepMind

The results demonstrated that complex multi-object manipulation can be learnt using a combination of simulation and real-world data. The experiments also suggested a strong baseline for generalisation to novel objects.

The experiment is considered a major advance in DeepMind's endeavour to build generalisable and useful robots. The authors will now work on making robots better understand interaction with objects of different geometries. The RGB-Stacking benchmark has been open-sourced, along with designs for building real-robot RGB-stacking environments, RGB-object models and information for 3D printing.

MuJoCo is a physics engine that facilitates research and development in fields requiring fast and accurate simulation, such as robotics, biomechanics, graphics, and animation. Developed by Emo Todorov for Roboti, MuJoCo was one of the first full-featured simulators designed from scratch for model-based optimisation through contacts. Before DeepMind's acquisition, MuJoCo was a commercial product, from 2015 to 2021.

MuJoCo helps scale up computationally intensive techniques such as optimal control, system identification, physically consistent state estimation, and automated mechanism design, and apply them to complex dynamic systems with contact-rich behaviours. It also has applications in testing and validating control schemes before deploying them on physical robots, in gaming, and in interactive scientific visualisation.

This is arguably a slow phase for research and development in robotics. DeepMind rival OpenAI, after investing years of research, resources and effort into robotics, decided to disband its robotics research team and shift focus to domains where data is more readily available. On the industry side, too, several robotics companies have shut down or are running at major losses. Given the circumstances, robotics, despite its promise as an industry, currently attracts few buyers.

Backed by Alphabet, DeepMind's progress over the past few years has helped it hold the flag high in this field.

I am a journalist with a postgraduate degree in computer network engineering. When not reading or writing, one can find me doodling away to my heart's content.

See more here:

DeepMind's Progress Over The Years In Robotics - Analytics India Magazine

The 6th China Shenyang International Conference on Robotics kicks off in Shenyang – PRNewswire

Posted: at 7:04 am

With the theme "Intelligence Creates the Future", the three-day conference features sub-events including an opening ceremony, keynote speeches, group forums, an exhibition, challenges, and exhibition games. After the opening ceremony, speeches, exhibitions and competitions were launched around the theme of the future of robots. There are 52 exhibitors at the exhibition, mainly including SIASUN Robotics, Harbin Institute of Technology Robotics, the Shenyang Institute of Automation under the Chinese Academy of Sciences, IFLYTEK, DJI UAV, and robot companies from South Korea and Japan. The challenge events include a robot construction creativity competition, a programming competition, a football competition and more.

The conference was held both online and offline. Academicians, experts, scholars, and executives and technicians from robotics and intelligent manufacturing companies were invited to participate. Tang Lixin, an academician of the Chinese Academy of Engineering, and Qu Daokui, president of SIASUN, both delivered keynote speeches. Susan Biller, secretary general of the International Federation of Robotics, and Jeff Bernstein, president of the American Robotics Association, also addressed the conference by video.

Established in 2015, the China Shenyang International Robotics Conference is the only professional conference and exhibition with robots as its theme in Northeast China.

Image Attachments Links:

Link: http://asianetnews.net/view-attachment?attach-id=405137
Caption: Guests visited the exhibition

SOURCE The Information Office of Shenyang People's Government

See the original post here:

The 6th China Shenyang International Conference on Robotics kicks off in Shenyang - PRNewswire

Global Paint Process Automation Market (2021 to 2030) – Featuring ABB, CMA Robotics and Graco Among Others – ResearchAndMarkets.com – Business Wire

Posted: at 7:04 am

DUBLIN--(BUSINESS WIRE)--The "Paint Process Automation Market by Offering, Purpose and Vertical and Type: Opportunity Analysis and Industry Forecast, 2021-2030" report has been added to ResearchAndMarkets.com's offering.

The global paint process automation market was valued at $3.34 billion in 2020 and is projected to reach $9.22 billion by 2030, registering a CAGR of 11.7% from 2021 to 2030.
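Those headline figures can be sanity-checked with the standard compound annual growth rate formula. Note that the published values span 2020 to 2030 (ten compounding years), which works out to roughly 10.7%, a little below the quoted 11.7% for the 2021 to 2030 window, whose 2021 base value the summary does not give:

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate: (end/start)^(1/years) - 1."""
    return (end_value / start_value) ** (1 / years) - 1

# The report gives $3.34B (2020) and $9.22B (2030). Compounding the
# 2020 figure over the ten years to 2030 implies a rate of about 10.7%;
# the quoted 11.7% applies to 2021-2030, from an unstated 2021 base.
rate = cagr(3.34, 9.22, 10)
print(f"{rate:.1%}")
```

The mismatch is not an error in either number, just a reminder that a CAGR is only meaningful together with its base year.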

Paint process automation is an advanced approach to industrial painting that uses high-end machines and robots. It helps users increase competitiveness, product quality, and worker safety.

Some of the prime drivers of the paint process automation industry are consistent painting results, cost-effective painting processes, and the ability to meet industry-specific needs. These factors are estimated to propel market growth rapidly during the forecast period. However, high installation cost acts as a major barrier to growth. Conversely, the integration of advanced technologies with paint processes, and their ability to meet sustainability goals, creates lucrative opportunities for market growth during the forecast period.

The paint process automation market is segmented by offering, purpose, vertical, type, and region. By offering, it is divided into hardware, software, and services; hardware is further sub-segmented into robots, controllers, atomizers, and others, with robots split into 4-axis, 6-axis, 7-axis, and others. By purpose, the market is segregated into interior and exterior. By vertical, it is divided into automotive, aviation, agriculture, textile, furniture, pharmaceutical, electronics, construction, and others. By type, it is divided into floor-mounted, wall-mounted, rail-mounted, and other systems. Region-wise, paint process automation market trends are analyzed across North America (the U.S., Canada, and Mexico), Europe (the UK, Germany, France, Italy, and the rest of Europe), Asia-Pacific (China, Japan, India, South Korea, and the rest of Asia-Pacific), and LAMEA (Latin America, the Middle East, and Africa).

For more information about this report visit https://www.researchandmarkets.com/r/xm1eto

See the original post:

Global Paint Process Automation Market (2021 to 2030) - Featuring ABB, CMA Robotics and Graco Among Others - ResearchAndMarkets.com - Business Wire

Computer gaming and robotics set to revolutionise the future of stroke care – University of Strathclyde

Posted: at 7:04 am

A pioneering new partnership between the University of Strathclyde and Chest Heart and Stroke Scotland (CHSS) is bidding to kickstart a revolution in stroke care.

Academics at the University's world-leading Department of Biomedical Engineering are working with the charity to integrate cutting-edge research in areas like robotics and computer gaming technology with CHSS's Hospital to Home services. They aim to ensure a smooth transition for stroke patients discharged from hospital.

The partnership, in its first phase of development, is the first of its kind in Scotland for stroke.

Over the next 12 months it will integrate the work of Strathclyde engineers at the University's Sir Jules Thorn Centre for Co-creation of Rehabilitation Technology with the Hospital to Home stroke services.

The centre uses artificial intelligence and machine learning methods from computer gaming to produce tailored exercise programmes for stroke survivors that encourage and support people in their own rehabilitation.

In the first phase, the work of the partnership is open to people in the West of Scotland, who will be given opportunities to take part in cutting-edge recovery research and transform wraparound recovery support.

The technology will then be developed using the insight generated from stroke survivors going through the centre, with the resulting programmes made available in community settings across the country.

The first cohort of stroke survivors is already going through the programme, and it's hoped that, through fundraising, the numbers will increase significantly over the next year.

Dr Andy Kerr, from Biomedical Engineering at the University of Strathclyde, said: "We are delighted to have Chest Heart & Stroke Scotland as a partner in our bid to revolutionise rehabilitation."

"Our determination is to develop technology that not only helps recovery but can also be used, easily, at home and in the local community, for example in leisure centres. We consider this to be a key factor in improving access to rehabilitation technology."

"Our pilot has gone very well at the Sir Jules Thorn Centre for Co-creation of Rehabilitation Technology, and we are well placed, with the support of CHSS, to scale up our efforts in the new year."

Stroke affects more than 9,000 people a year in Scotland and is the most common cause of severe physical disability among adults. Hospital care for these patients accounts for seven per cent of all NHS beds and five per cent of the entire Scottish NHS budget. The partnership's work aims to improve rehabilitation, which has been proven to aid recovery, improve people's lives and reduce pressure on the NHS.

On World Stroke Day, CHSS set up a Stroke Care Revolution Fund to help finance the venture. It is looking initially to raise £160,000 to support the work of the centre and help 450 stroke survivors access the services through the partnership over the next 12 months.

Jane-Claire Judson, Chief Executive of Chest Heart & Stroke Scotland, said: "Every day in Scotland, 25 people's lives are changed forever by stroke. In an instant, the things we all take for granted, like the ability to walk or talk, can be taken away."

"Rehabilitation is a lifeline that helps stroke survivors get their lives back. But cutting-edge rehabilitation technology and support is out of reach of most people in Scotland. This partnership will change that. It will kick-start a revolution in stroke care in Scotland that will transform care for survivors and reduce pressure on our NHS."

Linda Hanlin is one of the first to use the pioneering stroke rehabilitation unit at the University. The 62-year-old suffered a stroke in January 2014 and has mobility issues on her left side.

The retired mum of two grown-up children went through physiotherapy but, almost eight years on, she still has trouble gripping with her left hand and has to walk with a stick.

Twice a week since the start of September, she has been put through her paces at the centre in a series of technological tests and exercises designed to retrain her limbs.

Linda said: "My balance isn't great, and I walk with a stick. I've been using the treadmill with a harness that shows you how you are working and encourages you to use your feet more evenly. My balance has definitely improved by using the treadmill."

"I love the sessions, even though they go so quickly."

See original here:

Computer gaming and robotics set to revolutionise the future of stroke care - University of Strathclyde
