
Category Archives: Robotics

This online grocery company wants its robots to deliver right into your kitchen – ZDNet

Posted: April 21, 2021 at 9:59 am

Ocado is investing £10 million in Oxford-based start-up Oxbotica, which develops autonomy software for vehicles.

Online retailer Ocado is exploring the possibility of having robots packing, transporting and delivering groceries all the way to customers' kitchens, with a new partnership designed to bring new levels of automation to the warehouse.

The British e-tailer is investing £10 million ($14 million) in Oxford-based start-up Oxbotica, which develops autonomy software for vehicles, with the objective of testing different ways of integrating the technology with Ocado's hardware. Ocado will take a seat on Oxbotica's board.

Among the projects envisioned by the two firms are autonomous vehicles travelling inside Ocado's warehouses to move orders around the buildings and surrounding yard areas, but also driverless delivery vans and even "kerb-to-kitchen" robots to facilitate what is known as last-mile logistics: the final steps between a customer's doorstep and the vehicle carrying their order.


Automating these processes could cut costs significantly. According to Ocado, logistics costs weigh heavily in the expense hierarchy of online grocery: the cost of final mile delivery alone represents 10% of sales, with labor constituting about half of the costs.

Ocado and Oxbotica had previously worked together in 2017, when the e-tailer conducted a two-week trial using an early prototype vehicle doing autonomous deliveries in London. Oxbotica has now further developed two core products that will be used by Ocado's team: a software suite that enables vehicle autonomy, and a cloud-based autonomy management system to monitor and control fleets.

A dedicated team of engineers within Ocado's Advanced Technology division will work with Oxbotica to come up with new use cases for the technology.

Ocado is keen to demonstrate that the nature of its business is not limited to online retail; rather, the company speaks of itself as a "technology company". Although the e-tailer's main activity still consists of providing online grocery services, mostly for UK retailer Marks & Spencer, Ocado is also heavily investing in robotics, AI, machine learning and edge intelligence.

This has led to the development of the Ocado Smart Platform (OSP), which includes end-to-end software systems to operate online retail businesses, from running an e-commerce website to managing the routing of delivery vans.

Perhaps the most well-known part of OSP is warehouse management, which Ocado developed in the form of huge hive-like buildings where groceries are stored in crates, over which dishwasher-sized robots coordinate to pick and mix goods for customer orders.

OSP is sold to retail businesses like Marks & Spencer, and the platform is the reason why Ocado describes itself as a technology provider. As a result, the company has focused primarily on improving OSP services; at the end of last year, for example, Ocado bought Kindred Systems, which designs AI-powered systems for warehouses, as well as robotic arm designer Haddington Dynamics.

The new partnership with Oxbotica is yet another signal from Ocado that it is willing to bring OSP's capabilities one step further.

"We are excited about the opportunity to work with Oxbotica to develop a wide range of autonomous solutions that truly have the potential to transform both our and our partners' customer fulfillment centers and service delivery operations, while also giving all end customers the widest range of options and flexibility," said Alex Harvey, chief of advanced technology at Ocado.

Due to the regulatory landscape, Ocado expects that the development of vehicles that operate in restricted areas such as inside fulfillment centers will become a reality sooner than fully autonomous deliveries to consumers' homes. The first prototypes of some early use cases are expected to be ready within two years.

With a workforce now approaching 19,000 employees, Ocado maintained that the vehicle autonomy program will not have any impact on the company's current hiring or employment levels within logistics and operations groups.


Ocado will now have to prove that its extensive investments in new technologies are reflected in profit margins. Now more than two decades in the making, the e-tailer has yet to report significant profits. The past year, marked by a rise in online sales that came as a result of the global health crisis, seems to have benefitted Ocado, which saw revenues increase by about a third to hit £2.3 billion ($3.2 billion); but the jump was largely driven by retail revenue, with earnings from technology services yet to match the company's more traditional offerings.

Ocado is also competing in a fast-evolving market, with autonomous delivery systems projected to grow at over 24% to more than $84 billion globally in 2031. Giant e-tailers like Amazon are aggressively expanding their capabilities, and have last-mile delivery robots already operating in some cities; small start-ups are also popping up to automate delivery processes.

Even more traditional companies like Ford are getting involved in the space: the automaker recently announced that it was purchasing a two-legged, two-armed robot produced by Agility Robotics, to carry parcels from delivery vehicles straight up to customers' doors.


Commentary: Self-driving buses and delivery robots welcomed but who do we blame if AI goes rogue in Singapore? – CNA

Posted: at 9:59 am

SINGAPORE: Earlier this year, Luda Lee, an AI (artificial intelligence)-powered chatbot, went rogue.

Created by Korean start-up Scatter Lab, Luda was designed to chat naturally with South Korean Facebook users (attracting more than 750,000 over just 20 days), and to improve based on user data.

Soon after, however, Luda Lee began making bigoted and offensive comments against women, minorities, foreigners and people with disabilities.

She even randomly shared the personal data of her users. Her creators apologised but now face lawsuits over the data leaks.

While some consider this a relatively innocuous example, there are cases where more serious harms were caused by AI-made decisions.

In one high-profile example in 2018, a pedestrian was hit and killed by a self-driving Uber car whose sensors had failed to see and avoid her.

Concerns over algorithmic high-frequency trading triggering widespread financial market crashes, such as the flash crash of 2010, have also been raised.

These come as Singapore is ramping up its use of AI in all areas of life. In November 2019, the Government announced its National AI Strategy, which spells out plans to deepen Singapore's use of AI to transform our economy and society.

Just this year, commuters could begin taking driverless buses at the Singapore Science Park and on Jurong Island, while on-demand delivery robots are being trialled in Punggol. Even robot dogs have been patrolling Bishan-Ang Mo Kio Park to ensure safe distancing among park-goers.

While extensive pre-trials would have been conducted, alongside safety precautions taken during their roll-out, Murphy's Law dictates that we ask: in the unlikely event that serious harms, whether physical, emotional or financial, occur, on whom (or what) does legal blame lie, and on what basis?

FINDING THE SMOKING GUN

These questions were the focus of a recent law reform report published by the Singapore Academy of Law's Law Reform Committee, as part of a series looking at the impact of robotics and AI on the law.

As this report notes, these questions sometimes have relatively straightforward answers. For instance, if a malicious individual deliberately programmes a delivery robot to break into someone's house, or disrupts the signals to a driverless bus, causing it to veer off the road and crash, most would agree that individual should be held liable.

In addition, if somebody sustains serious injury or dies from the individual's deliberate actions, some form of criminal punishment would not seem unfair. Indeed, criminal laws already exist to deal with such issues.

Save for some tweaks, present laws could still tackle cases of intentional harm, even in an AI-powered world.

WHAT HAPPENS WHEN NO HUMAN INTENDED THE HARM?

Things get trickier, however, in situations where a human did not intend the harm that arose. This is particularly so as AI systems become more autonomous, and humans' roles in their operation and supervision diminish.

Already, driverless vehicles present such a conundrum today, given that they operate at speeds that may not leave users time to take control and prevent the harm. What then?

An instinctive response might be to say the entities responsible for the system should be punished. But which entities? There are usually multiple parties involved in the development and deployment of AI systems.

Should the liable party be the one that built the system; programmed the system's code; trained the system; put it on the market; or deployed it?

Putting aside challenges of identifying harmful intent, it can be tricky to pinpoint the blameworthy party in this chain. Pinning criminal liability on all of them would also likely have the counterproductive effect of discouraging innovation: the legal equivalent of using a sledgehammer to crack a nut.

This same challenge besets the use of criminal negligence laws in situations where an offender (an individual or company) carelessly causes harm, even though no harm was intended.

Does this mean that serious harms could be inflicted by robots and AI systems, with no one being held criminally liable?

Some take this view, preferring the use of regulatory penalties such as censures, improvement notices and fines to promote the safe and responsible use of AI systems. The argument goes that threatening criminal liability on those who create AI systems for harms they didn't foresee deters the development of new, potentially game-changing technologies.

Others counter that some cases of harm are so serious that criminal laws are needed to ensure someone or something is held accountable for the damage done, to reflect society's abhorrence of such harmful conduct and to set a strong deterrent.

These questions and trade-offs are matters that policymakers and society need to deeply consider.

WEIGHING THE ALTERNATIVES

One possible approach could be to impose specific duties on designated entities to take all reasonable measures necessary to ensure the AI system's safety.

These entities would risk criminal penalties if they fail, much as worksite operators are required to ensure the safety of workers on those sites.

Another more radical solution could be to impose criminal liability on and punish the AI system itself, particularly with highly-advanced autonomous systems. After all, is it not the system that took the decision to act in a harmful way?

This approach is not unheard of: the European Parliament has suggested it be considered further, and Saudi Arabia even recognised the robot Sophia as the world's first robot citizen a few years ago.

However, such a solution does appear impractical in today's legal systems, which are shaped primarily to regulate human behaviour, as well as at the present state of technology. After all, what purpose would it serve today if a driverless bus on Jurong Island or a delivery robot in Punggol were charged, convicted and sentenced to prison, fined, or even put to death?

But as AI systems become more and more sophisticated, these questions are no longer the preserve of science fiction.

A silver bullet or one-size-fits-all solution is unlikely. Different technologies and different contexts will likely require different approaches to whether, how and against whom to apply criminal law.

Policy and ethical balances will need to be struck. Whether from a legal, policy or broader societal perspective, these are not issues for tomorrow but today.

Josh Lee Kok Thong is a member of the Singapore Academy of Law's Law Reform Committee's Subcommittee on Robotics and AI. He is also the co-founder of LawTech.Asia and the founding chairperson of the Asia-Pacific Legal Innovation and Technology Association. Simon Constantine was formerly Deputy Research Director, Law Reform at the SAL.


Webinar: Collaborative Robotics 2021 New Systems, Applications and Opportunities – April 21 – Robotics Business Review

Posted: April 17, 2021 at 12:04 pm

What's new for collaborative robotics systems, enabling technologies and applications? What's next? Attend this online session on April 21st to find out.

By RBR Staff | April 16, 2021


The introduction of collaborative robots, robotic systems that can work safely in close proximity to human co-workers, has increased task flexibility and expanded the number and types of applications for which robots can be used. Both large, established robotics suppliers and new, smaller firms have rapidly introduced innovative collaborative robotics technologies into the market, and more are on the way. Unfortunately, this fast-moving sector creates uncertainty for both end-users and developers of collaborative systems.

This webinar will act to increase clarity, providing a snapshot of the current state of the collaborative robotics sector, including emerging capabilities, new applications and business models, and powerful enabling technologies, as well as a description of what to expect in the future. Topics include:

Date/Time: Wednesday, April 21, 2021, 2 PM ET / 11 AM PT


Dan Kara, Vice President, Robotics, WTWH Media

Dan Kara is Vice President, Robotics at WTWH Media, where he is charged with driving the company's robotics initiatives, including the Robot Report and Robotics Business Review online portals and the Robotics Summit Conference and Exposition, Healthcare Robotics Engineering Forum, RoboBusiness Conference & Expo and the International Field Robotics Engineering Forum. Prior to joining WTWH, he was Practice Director, Robotics and Intelligent Systems at ABI Research and Chief Research Officer for Myria RAS, both research and advisory services firms focused on automation, robotics and intelligent systems. Dan was also President of Robotics Trends, an integrated media and research firm serving the personal, service and industrial robotics markets, and has worked as Executive Vice President of Intermedia Group and Director of Research at Ullo International.


Biomedical Engineer Receives Grant to Further His Research into Using Robotics for Cancer Detection and Monitoring – WPI News

Posted: at 12:04 pm

Haichong (Kai) Zhang, assistant professor of Robotics Engineering and Biomedical Engineering, has received a $445,742, five-year grant to continue his ongoing research into creating robotic systems to use minimally invasive technologies to safely and accurately detect and monitor cancer.

Zhang's grant is part of a larger project funded by the National Institutes of Health (NIH) and led by Martin Pomper, director of Nuclear Medicine and Molecular Imaging and professor of Radiology and Radiological Science at Johns Hopkins University. Pomper received a total of $2,266,703 for his work to create an innovative, targeted chemical compound that can be used during imaging to enable better detection and treatment of aggressive forms of cancer. Zhang, a co-investigator on the project and a subcontractor on the award, will focus on designing and creating a photoacoustic imaging apparatus, which will be part of his diagnostic imaging robot, to evaluate the chemical contrast agents that Pomper's team synthesizes.

Creating a diagnostic imaging robot has been an ongoing project for Zhang. In 2019, he received a five-year $1,869,423 Director's Early Independence Award from the NIH to support his work to create a robotic system that will detect and analyze three different indicators of prostate cancer: a 3D image of any mass; high levels of a protein produced by cancer cells; and tissue with low-oxygen levels caused by cancer growth.

Zhang said the two grants synergize to enable him to advance his research into exploring the detection of aggressive prostate cancer using photoacoustic imaging, which uses the delivery of light energy to examine tissue to improve ultrasound imagery.


‘Snakebot’ takes a dive to go where other robots can’t – GCN.com

Posted: at 12:04 pm


A snake-like robot can now slither its way through water, allowing it to inspect ships, submarines, and underwater infrastructure for damage.

Researchers from the Biorobotics Lab in the School of Computer Science's Robotics Institute at Carnegie Mellon University tested the hardened underwater modular robot snake (HUMRS) last month in the pool, diving the robot through underwater hoops, showing off its precise and smooth swimming, and demonstrating its ease of control.

"We can go places that other robots cannot," says Howie Choset, professor of computer science. "It can snake around and squeeze into hard-to-reach underwater spaces." Choset and Matt Travers, co-directors of the Biorobotics Lab, led the work.

The submersible robot snake project aims to assist the Department of Defense with inspecting ships, submarines, and other underwater infrastructure for damage or as part of routine maintenance, says Matt Fischer, the program manager at the Advanced Robotics for Manufacturing (ARM) Institute.

Snakebot could save time and money

The military has limited options for inspecting areas like a ship's hull. To do so, the Navy must either send a team of divers to the ship's location, wait until it returns to port to deploy the divers, or pull it into a dry dock -- all options that take time and money.

A submersible robot snake could allow the Navy to inspect the ship at sea, immediately alerting the crew to critical damage or sending information about issues that need attention back to port for use when the ship docks.

"If they can get that information before the ship comes into a home port or a dry dock, that saves weeks or months of time in a maintenance schedule," says Fischer, who served in the Navy for three years. "And in turn, that saves money."

Fischer, who crawled into the ballast tanks of a submarine during his service, says many sailors would gladly pass that difficult and tight duty to a robot.

Steve McKee, a co-lead of the Joint Robotics Organization for Building Organic Technologies (JROBOT), a Department of Defense task force interested in technology like the submersible robot snake, says the project will improve the readiness of equipment in the armed services.

"The advancements being made hold great promise for helping not only the Department of Defense but also various industries around the world," McKee says.

Snake in the pool!

Outside the military, the robots could inspect underwater pipes for damage or blockages, assess offshore oil rigs, or check the integrity of a tank while it is filled with liquid. The robot could be used to inspect and maintain any fluid-filled system, says Nate Shoemaker-Trejo, a mechanical and mechatronics engineer in the Biorobotics Lab working on the submersible snakebot.

The distinguishing feature is the robot's form factor and flexibility. The smallest versions of regular submersibles are usually blocky, one-piece arrangements. The robot snake is narrow and jointed, Shoemaker-Trejo says. The end result is that an underwater robot snake can squeeze around corners and into small spaces where regular submersibles can't go.

Versions of the robot snakes have already proven useful in difficult situations. Travers led a team to Mexico City in 2017 to use robot snakes in a search-and-rescue mission after an earthquake. And a robot snake made a lasting impression on Jimmy Fallon when it climbed up his leg during a guest appearance on NBC's The Tonight Show with Jimmy Fallon.

The robot's modular design allows it to adapt to different tasks, whether squeezing through tight spaces under rubble, climbing up a tree, or slithering around a corner underwater. For the underwater robot snake, the team used existing watertight modules that allow the robot to operate in bad conditions. They then added new modules containing the turbines and thrusters needed to maneuver the robot underwater.

Development progressed rapidly. The team started working on the underwater robot snake in July 2020 and by March 2021, had it swimming in the pool.

"I'm surprised that we made this robot work as fast as we did," Choset says.

A grant from the Advanced Robotics for Manufacturing Institute funded the work.

This article was republished from Futurity.

About the Author

Jason Maderer is the interim managing director of communications at Carnegie Mellon University.


How Vision Systems Work in Robotics – DesignNews

Posted: at 12:04 pm

Robot vision systems are commonly referred to as machine vision. This vision tool is used in several industrial processes, including material inspection, object recognition, and pattern recognition. Each industry applies its own priorities to machine vision. In healthcare, pattern recognition is critical. In electronics production, component inspection is important. In banking, the recognition of signatures, optical characters, and currency matters.

We talked with machine vision company Cognex to get an understanding of how vision systems work with robotics. "There are two main applications for robot guidance. You take an image of the scene, you find something, and you input that thing's coordinates. This is the angle for 2D and 3D systems. The robot can see an object and will do something with it," Brian Benoit, senior manager of product marketing for Cognex, told Design News. "Another application is inspecting an object. A robot holds a camera and moves the camera around the part to get certain images."
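To make the first use case concrete, finding an object in an image and handing its coordinates to a robot requires mapping camera pixels onto the robot's work plane. The sketch below is illustrative only, not Cognex's implementation: it fits a planar homography from a few known calibration correspondences (the function names and sample points are hypothetical), then converts a pixel detection into world coordinates.

```python
import numpy as np

def fit_homography(pixel_pts, world_pts):
    """Fit a 3x3 planar homography H mapping image pixels to
    world-plane coordinates, via the standard DLT linear system."""
    A = []
    for (u, v), (x, y) in zip(pixel_pts, world_pts):
        # Each correspondence contributes two rows of A h = 0
        A.append([u, v, 1, 0, 0, 0, -x * u, -x * v, -x])
        A.append([0, 0, 0, u, v, 1, -y * u, -y * v, -y])
    # The homography is the right singular vector of the
    # smallest singular value (the nullspace of A)
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    return vt[-1].reshape(3, 3)

def pixel_to_world(H, u, v):
    """Project a detected pixel (u, v) onto the work plane."""
    x, y, w = H @ np.array([u, v, 1.0])
    return x / w, y / w
```

A real deployment would add a calibration target and lens-distortion correction; four points is just the minimal fit. Given the mapping, a detection at pixel (50, 50) can be turned into a robot-frame pick coordinate with one call to `pixel_to_world`.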


KUKA offers a flexible 2D vision solution for its robots by integrating with Cognex VisionPro software. The vision tools locate, inspect and read codes on stationary or moving parts.

For Cognex, the vision system often goes to integrators who cobble together automation systems. "Sometimes our customer is an integrator who is building a system with robots that need vision," said Benoit. "The types of robots we work with are manipulating robots. They may lift a large payload in a cage. Those are fast and unsafe to be around."


In other cases, the customer is the end-user working with smaller, safer robots. "We also work with collaborative robots that are designed to work side-by-side with people. They're not going to use as much force," said Benoit. "They often do pick-and-place." He noted that the vision systems from Cognex are not the same as the guidance systems used by mobile robots. "We haven't seen much traction with vision on warehouse robots. They use sensors that are integrated into their system, so they don't need an outside camera."

While the large caged robots have a decades-long history, collaborative robots are a relatively new addition to the automation world. "The caged robot market is stable. The most common customer is in automotive. The need for a vision system with caged robots isn't great because there isn't as much variability," said Benoit. "With collaborative robots, you have a more unstructured environment, so there are a lot more vision applications. It's a developing market, so our strategy is to work well with any robot manufacturer. We develop software interfaces for the robot manufacturers so they can easily work with our system. With each robot, we have to develop an interface."

ABB Integrated Vision interfaces with Cognex.

The human eye is a sensor; it's our brain that understands the visual feed from our eyes. The same goes for robots: the robot's software interprets the visual data. "Software is the brain behind everything. We have two types of vision. One set of algorithms is rules-based. It looks for patterns and edges. The other algorithm is in the deep-learning space, where the software is trained by example. We're seeing traction in both," said Benoit. "The goal of the software is a good handshake between the robot and the vision system. We train the robot and the vision system to know where they are together. When the camera sees something, the robot knows where to go in response."
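The "patterns and edges" approach Benoit describes can be illustrated with a toy example. The snippet below is a naive Sobel edge detector in plain NumPy, not any vendor's algorithm: every rule (the gradient kernels, the threshold) is hand-written rather than learned from labeled examples, which is exactly what distinguishes rules-based vision from the deep-learning variety.

```python
import numpy as np

def sobel_edges(img, thresh=1.0):
    """Rules-based edge finding: convolve a grayscale image with
    fixed Sobel kernels and threshold the gradient magnitude."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T  # vertical-gradient kernel is the transpose
    h, w = img.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = img[i:i + 3, j:j + 3]
            gx[i, j] = np.sum(patch * kx)
            gy[i, j] = np.sum(patch * ky)
    # An "edge" is wherever the gradient magnitude clears the rule's threshold
    return np.hypot(gx, gy) > thresh
```

A deep-learning counterpart would replace the fixed kernels with ones learned from example images of good and bad parts, trained the way Benoit describes teaching a child what a house is.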

The vision software includes a plug-and-play connection for each robot maker. "We try to make the integration as simple as possible. We have the right hooks in our software to send signals back and forth," said Benoit. "We also rely on integrators for help on that. We work directly with some robot companies, such as Universal Robots. They have vision software that's maintained by Cognex and works directly on their robots."

Part of what sets collaborative robots apart from traditional, caged robots is that they are designed to be trained by users rather than programmed by integrators. "Universal Robots' business model is to skip the integrators. They have a point-and-click driver," said Benoit. "It's important that when they work with peripherals, they can just plug them in. They do it with our vision system and with different end effectors."

The vision system that allows the robot to identify a part and pick it up is no different from the vision system that allows the robot to perform a quality check on a part. "When the robot has a vision system, whether it's on a conveyor or the end of a robot arm, it works the same. The vision system is mounted on a robot arm, and the arm moves around," said Benoit. "The software analyzes the feed from the vision system. In many cases, the software engages in deep learning. In an integrated system, all of the analysis exists inside the vision platform. We use a neural network inside the camera."

The software for the vision system may be programmed to identify a specific object, or it may be designed to learn about what it's seeing. "Deep learning is different from conventional programming. It's like teaching a child what a house is. You don't normally tell the child the coordinates of a house. You say: that's a house, that's a house, and that's an office building. Our software is designed to do that in manufacturing."

Rob Spiegel has covered manufacturing for 19 years, 17 of them for Design News. Other topics he has covered include automation, supply chain technology, alternative energy, and cybersecurity. For 10 years, he was the owner and publisher of the food magazine Chile Pepper.


Isilon founder lifts the hood on farming startup Carbon Robotics and its weed zapping machine – GeekWire

Posted: at 12:04 pm

Carbon Robotics Autonomous Weeder uses artificial intelligence to identify and zap weeds growing in fields of vegetables. (Carbon Robotics Photo)

Carbon Robotics, a Seattle company led by Isilon Systems co-founder Paul Mikesell, is unveiling its self-driving robot that uses artificial intelligence to identify weeds growing in fields of vegetables, then zaps them with precision thermal bursts from lasers.

The startup, previously known as Maka Autonomous Robots, had been in stealth mode since 2018. Mikesell sold Isilon for $2.25 billion in 2010, helped Uber open its Seattle engineering office in 2015, then moved to Facebook's Seattle Oculus lab before taking the startup plunge again.

Carbon's tech holds the promise of reducing the cost of growing organic vegetables so that they no longer cost a whole paycheck.

"We have all of this technology that allows, for the first time, computers to see things and understand what they're looking at," Mikesell said. For him, the question was: how do we apply this to real, physical-world work? He turned to food production.

Scientists have been experimenting with laser weed-control for more than a decade after finding that the heat of lasers vaporizes water inside plant cells, destroying the cells and killing the plant. In 2013, a German company announced plans to use a laser-armed drone to zap weeds from the air.

But what farmers need is less a revolution in farming methods than a revolutionary tool that fits into their current farming patterns, Mikesell said.

Carbon worked closely with farmers in eastern Oregon and southern Idaho, he said. As a result, Carbon's robot system, the Autonomous Weeder, was built about the size of a medium tractor so it would fit in the furrows between rows of common crops like onions and sweet potatoes.

It can cover up to 16 acres of cropland a day, zapping as many as 100,000 weeds an hour, Mikesell said. And since it's self-driving, all a farmer has to do is take it to the field in the morning and turn it on.

"We're really intent on not making farmers have to change how they're doing things," Mikesell said. "That's been a key to our success. We fit right into their operations."

Weed control is an essential part of successful farming, and perhaps the essential factor for organic farmers, said Doug Collins, a soils scientist with Washington State University's Cooperative Extension research center in Puyallup, Wash., who serves on the Organic Advisory Board for the state's Agriculture Department.

"It's frequently the No. 1-cited problem," he said. "Weeds can get out of hand and you can lose a crop pretty easily. The competition from the weeds can make it so the crop is not worth harvesting."

For organic farmers in particular, the cost of weed control can be high. Collins said his research on larger organic farms in the Columbia Basin showed that farmers can spend $1,200 to $1,600 an acre hiring workers to eradicate weeds by hand with hoes. Even non-organic farmers will hire hand crews to supplement the weed-killing sprays they use.

"It's hard, physically demanding work," Collins said. "It's not fun."

There are other alternatives for weed control, like covering the space between rows with tarps or black plastic to block weeds from getting the sunlight they need, but they're not always practical for large operations, Collins said.

One key for Carbon has been its Northwest location, Mikesell said. The region is unique in combining deep expertise in AI and computer vision; an established advanced manufacturing sector; a diverse agricultural industry; and a strong venture capital community, all in close proximity.

"There's not a lot of places in the world where you have all those things coming together," he said.

The 21-person startup has raised $8.9 million to date from Fuse and Bolt.

Carbon has sold out all the robots it built for the 2021 planting season, and is looking for an industrial partner who could help it build more units for 2022, Mikesell said.

The company is looking to get into the hundreds of units built and shipped for next year, he said. There's demand for a lot more than that: tens or hundreds of thousands of them.

Visit link:

Isilon founder lifts the hood on farming startup Carbon Robotics and its weed zapping machine - GeekWire


Kroger partner Ocado is on the road to robotic grocery delivery with autonomous-vehicle investment – MarketWatch

Posted: at 12:04 pm

High-tech supermarket and logistics group Ocado is pushing toward a future where, from warehouse to doorstep, groceries will be handled by robots.

The British group, a grocery delivery rival to online retail giant Amazon AMZN, +0.60% in the U.K. and partnered with grocer Kroger KR, -0.22% in the U.S., will invest £10 million ($13.8 million) in a commercial partnership with Oxbotica, an autonomous vehicle software company, the company said on Friday.

Shares in Ocado OCDO, +1.87% climbed nearly 2% on the day, helping London's blue-chip FTSE 100 UKX, +0.52% index top 7,000 for the first time in more than a year.

The partnership will involve collaborating on hardware and software interfaces for autonomous vehicles, Ocado said, integrating Oxford, U.K.-based Oxbotica's software platform on Ocado's vehicles.

Plus: Tesla self-driving truck rival TuSimple raises $1 billion in IPO valuing it at $8.5 billion

The next steps are for Ocado to fit out its delivery vans and warehouse vehicles with data capture capabilities to help Oxbotica to test its technologies. The vision is to automate the entire grocery delivery process: from vehicles that operate inside and out of Ocado warehouses, to last-mile delivery vehicles and curb-to-kitchen robots.

"We are excited about the opportunity to work with Oxbotica to develop a wide range of autonomous solutions that truly have the potential to transform both our and our partners' [customer fulfillment center] and service delivery operations, while also giving all end customers the widest range of options and flexibility," said Alex Harvey, Ocado's head of advanced technology.

Founded by former Goldman Sachs GS, +1.11% bankers in 2000, Ocado's business is built on its roots as a high-tech grocery delivery company and it is a competitive player in the cutthroat British supermarket sector. But one of its key areas for growth is creating custom logistics and warehousing solutions using its proprietary robotics technology.

Read more: Kroger and Ocado bring bots to new high-tech facility as digital competition among grocers heats up

In that vein, it has a partnership with U.S. retail giant Kroger dating back to 2018. Earlier this week, Kroger launched its first Ocado warehouse combining robotics and machine learning for fast fresh-food delivery, located in Monroe, Ohio, north of Cincinnati.

Ocado said the ultimate ambition of its multiyear collaboration with Oxbotica is to enable partners like Kroger to reduce the costs of both last-mile delivery and logistics operations.

The £10 million investment in Oxbotica came as part of the company's Series B equity funding round, led by the venture arm of energy giant BP BP, -0.18%, which has tested autonomous vehicles using Oxbotica's technology at a refinery in Germany. Chinese tech giant Tencent 700, +1.94% and safety equipment group Halma HLMA, +0.16% were among the other investors.

Read the original:

Kroger partner Ocado is on the road to robotic grocery delivery with autonomous-vehicle investment - MarketWatch


Hanwha Robotics and elliTek, Inc. Partnership to Help United States Manufacturers Safely Reopen – PR.com

Posted: at 12:04 pm

Collaborative Robots Combined with Mechanical and Industrial IoT Expertise Creates Solution to Severe Labor Shortages & Health & Safety Concerns in Post-Pandemic World While Increasing Production.

The COVID-19 pandemic forced manufacturers to stop or reduce production. This disruption was on full display last year in empty grocery and store shelves, not to mention the loss of lives and jobs. elliTek's engineers have been working tirelessly to find cost-effective solutions to help manufacturers safely reopen their facilities and also address severe labor shortages. Hanwha's team vigorously developed advanced cobots that combine Hanwha's AI technology with its mobility capabilities.

"Adding automated robotic systems to a factory is an investment that has a quick return," said Brandon Ellis, president of elliTek. "Additionally, collaborative robots can limit the number of people on a production line to ensure appropriate distancing requirements are being met, and may also provide relief for those experiencing labor shortage situations."

Cobots can perform multiple tasks, so employees can focus on value-added duties. Cobots can help manufacturers improve quality through process automation, cut operational costs with easily controllable equipment, offer flexible switch-over between processes for small-batch production, and improve dangerous work conditions. elliTek's pre-engineered, turn-key robotic workcells allow the robots to operate at their full capacity and speed, and they are ready for quick installation.

Hanwha's HCR Advanced Collaborative Robot Series consists of three six-axis articulated robot models that can be used to collaborate with workers. Hanwha's HCR Advanced Series is unique in that two HCR cobots can run off a single control unit, resulting in a 10% reduction in operating costs. The HCR Advanced cobots can be applied immediately without changing the existing workspace. They are easy to control and flexible in reacting to changes in the production layout. Hanwha's cobots not only automate manual work, but the HCR Advanced Series also provides a safer work environment.

Furthermore, Hanwha's new advanced cobot models let users easily attach torque sensors and grippers without the need for additional cables. The improved speed, consistency, and accuracy of the HCR Advanced Series offer increased productivity for industries such as automotive, electronics, food, and pharmaceutical, to name a few.

Hanwha's R&D team developed five additional Advanced Solutions (Vision and Mobility) modules from which users can choose to customize their cobot according to the type of work and production processes.

Learn more about Hanwha's HCR Advanced Collaborative Robots at http://www.elliTek.com/Hanwha-Robots, or speak with one of elliTek's automation experts at 865-409-1555 to find out if a collaborative robot is the best automation solution for your production line.

About Hanwha Corporation

Hanwha, founded in 1952, is one of South Korea's top-ten business enterprises and a Fortune Global 500 company. Hanwha has 76 domestic affiliates and more than 350 locations around the world spanning four business areas: chemicals & energy, aerospace & mechatronics, finance, and construction & leisure/lifestyle services. Hanwha's impressive growth over the past seven decades stems from its ability to anticipate change and embrace new challenges. Learn more about Hanwha Robotics at http://www.hanwharobotics.com/En.

About elliTek, Inc.

elliTek, Inc. is an East Tennessee-based industrial automation company with a mission of empowerment. Founded in 2009, elliTek's focus on the user experience and proficiency in industrial IoT led to the development of their award-winning MES products. elliTek also offers distribution services, engineering services, and robotic solutions to clients through strategic partnerships with global technology companies. elliTek's expertise in motion control, robotics, automated machine design, and industrial IoT has led to innovative solutions that increase business responsiveness, boost productivity, and accomplish sustainability objectives, all while lowering the total cost of ownership. Listen to elliTek's podcast, "Industrial Automation It Doesn't Have To," to find out what makes elliTek different. Learn more at http://www.elliTek.com or call (865) 409-1555.

Go here to see the original:

Hanwha Robotics and elliTek, Inc. Partnership to Help United States Manufacturers Safely Reopen - PR.com


Day 2 of ProMatDX has a heavy focus on robots – Logistics Management

Posted: at 12:04 pm

Yesterday, Day 1 of ProMatDX, I focused on trends some of the industry's leading system integrators were following. Today, I'll look at robotics.

One of the presenters at last year's NextGen Supply Chain Conference, the annual technology conference I produce, was Adrian Kumar, the global head of operations science & analytics at DHL Supply Chain. The topic was robotics: When it comes to putting robotics to work in the warehouse and distribution center, few organizations have worked with as many solution providers as Kumar's team at DHL. Yet most of his presentation was about software.

When I asked him why, Kumar said simply that after more than three years of working with autonomous mobile robots (AMRs) in DHL's operations, one of the most important learnings was that the software optimizing the robots and coordinating the rest of the fulfillment operations was more important to achieving results than the robots themselves. Essentially, the hardware is becoming a commodity.

That anecdote came as no surprise to Lior Elazary, the founder and CEO of inVia Robotics. He joked that at some point, someone will mass produce robots at such a cheap price that he'll be able to concentrate on software. In fact, he described inVia as a software company. He is putting his money where his mouth is: inVia just announced a new project that utilizes the company's warehouse execution system in a conventional distribution center with limited automation; robots will come later. He explained that the software can now route a lift truck driver to get a pallet for the pallet area; route a cart runner who is delivering carts to the picking or packing areas; or direct a person in the pallet pick area.

inVia isn't the only company leading with software. Developing a solution to quickly and efficiently integrate the myriad automation platforms available today, including those from robotics suppliers, was the strategy behind SVT Robotics, co-founded by A.K. Schultz, SVT's CEO. Despite its name, SVT doesn't sell robots. Schultz and his partner Michael Howes are former Swisslog executives who were frustrated by the challenges of integrating software to create a holistic solution when they were working on projects for customers. They set out to build a better mousetrap: a cloud-based platform where automation operating systems can be accessed and then integrated to create a solution. That includes robotics, as the name implies, but also more traditional automated equipment like conveyors, sorters and AS/RS systems. "Our goal was to take the typical 12- to 18-month integration cycle down to 12 to 18 weeks for deployment, especially for companies that are new to automation," Schultz said. "We realize that it's no longer about the machine, but the software."
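The integration pattern SVT describes, in essence putting heterogeneous automation systems behind one common interface so they can be composed, is a standard adapter design. The sketch below is an invented illustration of that pattern; the class names and the single `dispatch` method are assumptions, not SVT's actual API.

```python
# Illustrative adapter pattern: robots, conveyors, and AS/RS systems
# exposed behind one shared interface so a platform can orchestrate them.
# All vendor classes and method names here are hypothetical.
from typing import Protocol

class AutomationSystem(Protocol):
    def dispatch(self, order_id: str) -> str: ...

class PickingRobot:
    def dispatch(self, order_id: str) -> str:
        return f"robot picking items for {order_id}"

class ConveyorLoop:
    def dispatch(self, order_id: str) -> str:
        return f"conveyor routing tote for {order_id}"

def fulfill(order_id: str, systems: list[AutomationSystem]) -> list[str]:
    # The platform fans one order out across every integrated system;
    # adding a new vendor means writing one adapter, not a new integration.
    return [system.dispatch(order_id) for system in systems]

print(fulfill("ORD-1", [PickingRobot(), ConveyorLoop()]))
```

Wrapping each vendor once behind a shared interface is what lets an integration cycle shrink from months to weeks: each new system is one adapter, not a bespoke point-to-point project.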

The ability for all those systems to play nice together is leading to another trend in robotics, one we'll call convergence. Or, as I wrote yesterday, the transition from a point product to a holistic solution. Tom Galluzzo, the CEO of Pittsburgh-based IAM Robotics, noted that customers are looking at how to use AMRs, one of the products IAM produces, for conveyance over long distances in place of traditional conveyor, and as a sortation device to deliver totes to the different lanes for shipping. IAM is also one of the suppliers working with a systems integrator to create a highly automated, end-to-end fulfillment engine that includes piece-picking robots, a robotic sorter and a high-density goods-to-person storage and retrieval system. "I think we're on the verge of seeing more big and complex systems being designed," Galluzzo said, "and we're seeing a lot of new entrants to the market, addressing more applications."

Without question, AMRs are more widely adopted than piece-picking robots. To some degree that's because AMRs are point solutions that require little in the way of infrastructure changes to get them up and running. Item-handling robots, on the other hand, have to work with other systems, requiring more software and hardware integration as well as infrastructure. But that might be starting to change, noted Vince Martinelli, head of product and marketing at RightHand Robotics, one of the early movers in this space. One reason is that there are more reference systems up and running, and the deployments are getting larger. "Early on, companies were just installing one or two robots," Martinelli said. "If you look at the project we just did with Apologistics in Germany, you'll see that there are a lot of robots working. You can't underestimate the value of real case studies," he added. Look closely, and you'll also see that the robots have been integrated with an AutoStore goods-to-robot storage and retrieval system, an example of convergence. In addition to experience and more reference case studies, the robot software, and integration of that software, is getting better. "APIs aren't sexy," Martinelli said. "But in our latest product, we emphasize more than ever how we can work with a system integrator."

Robotics is a crowded field, one that is becoming more crowded every day as new startups throw their hats in the ring. At some point, I'm sure, there is going to be a shakeout. So, what may determine who survives? That's a question that may be answered by data and experience. Robots, after all, rely on artificial intelligence to improve their performance, and AI needs data, the more the better. "The guy who gets 1,000 robots in there and operational first will learn a lot of things," Martinelli said.

Read more:

Day 2 of ProMatDX has a heavy focus on robots - Logistics Management

