The Prometheus League
Breaking News and Updates
Monthly Archives: June 2023
Jersey City’s Oishii partners with robotics company to bring … – ROI-NJ.com
Posted: June 4, 2023 at 9:10 am
Oishii, the Jersey City-based Japanese vertical farming company best known for its strawberries, is partnering with the Yaskawa Electric Corp., an industrial robotics company based in Japan, to develop new automation solutions to optimize vertical farming methods and scale output.
A series of Yaskawa robotic arms will power Oishii's indoor vertical strawberry farm. The robots will work in tandem with Oishii's infrastructure, technology and urban farmers to harvest strawberries at the peak of freshness.
"Yaskawa is one of the most respected names in industrial robotics today. Like Oishii, they are guided by the pursuit of quality and believe technology can solve some of the world's most pressing issues," Hiroki Koga, co-founder and CEO of Oishii, said. "We're honored to partner with a company that is invested in our mission to reinvent the future of agriculture. Together, we will deliver forward-looking solutions to bring clean, delicious produce to more people."
Yaskawa will provide industrial robots and other products as well as systems for Oishii's ongoing project to automate the entire process from sowing, raising seedlings and harvesting, to inspecting, boxing and shipping at a factory to be constructed.
The technology, Oishii says, will deliver new efficiencies that reduce food, energy and water inputs.
New Dog, New Tricks: Reflections on Construction, Robotics, and … – Archinect
Posted: at 9:10 am
Spot, a robot developed by Boston Dynamics. Image credit: Boston Dynamics
What is the current relationship between humans, robotics, and construction? What is its future? To explore these questions in depth, Archinect speaks with both Boston Dynamics and the Applied Research + Development group at Foster + Partners for their experiences and perspectives in designing, building, and applying the latest innovations in robotics on construction sites.
This article is part of the Archinect In-Depth: Artificial Intelligence series.
Along a growing suburban street outside New York City, two residential schemes are under construction on adjacent plots. On one site, a family of robotic 3D printing arms steadily builds layer after layer of a curving, honeycomb-like facade. Next door, autonomous cranes stand over a semi-completed prefabricated modular apartment building.
Throughout the 3D printed building site, a gentle hum is emitted by the movement of the 3D printing robotic arms and autonomous drones flying above them. In this human-free environment, the drones are the eyes and ears of the operation, live streaming imagery to the contractor's command facility in San Francisco for a human supervisor to occasionally monitor alongside dozens of other active sites. The drones' live stream function is somewhat of an add-on. Instead, the primary role of the drone is to collect daily 3D scans of the site on its continuous, preprogrammed route; data which is then sent to the California command center, where a series of AI programs compare the realized output to the architect's BIM model and archive each day's progress for future reference. Likewise, the remote human supervisor's role is a failsafe. Months before construction ever began, generative AI design models were communicating back and forth with the software powering the 3D printing robotic arms, sculpting a design proposal that met all brief requirements and could be delivered on time and on budget without third-party human intervention.
On the site next door, all is not well. In contrast to the gentle hum from the adjacent plot, the prefabricated modular construction site is awash with frantic activity. The night before, after the site had been powered down for the evening, hackers had inexplicably taken control of the contractor's fleet of autonomous construction bots, directing them to demolish freshly-installed structural connections between the modular apartment units. The disruption only lasted five minutes before the contractor's security system detected the unusual behavior and triggered an emergency shutdown of the site, but that's all it took to cause days' worth of setbacks.
Fast forward to the morning, and an amalgamation of architects, contractors, project managers, and law enforcement are on the scene while four-legged robots scurry throughout the site, analyzing the full extent of the damage. Although the incident only occurred hours before, the design team has already been provided with a series of options to minimize delays. AI-driven project management software had already analyzed global supply chains, subcontractor availability, and viable production sequences to produce a variety of construction programs that accounted for the need to repair the damaged structural connections. The only remaining dispute among the team was whether the client needed to be made aware of the mishap at all.
Across the street from the commotion, an elderly man is walking his granddaughter to school. The grandfather stops and motions for the young girl to remove her earphones. "I used to do that job," the grandfather said to the girl, pointing towards the 3D printing construction site. The girl looked up at the drones, bemused. How could Grandpa possibly fit inside a drone? She looked down towards the 3D printing robotic arms. This made even less sense. Studying her grandfather's hands, she giggled at the thought of concrete shooting from his fingertips before putting her earphones back on and pressing play. Grandpa's really losing it, she concluded.
It's an intriguing thought experiment: the prospect of a generation of children to whom manual, human construction workers are an alien idea. Throughout the previous century, our story would have found an audience among science fiction writers but few others. Today, the paradigm has shifted. On the topic of autonomous construction, once dismissive talk of technological limitations is gradually being supplanted by more pragmatic conversations over economy, labor, and adoption. Grandpa's construction job isn't doomed. As we will see later, the concept of a human-free construction process is far from fruition or desire. However, the premise behind the discussion represents a remarkable shift. Today, the question of how artificial intelligence can intersect with the construction industry is the preoccupation of an expansive field of companies, from startups to manufacturing giants, which make up an AI-in-construction market currently valued at approximately $500 million. One report published at the beginning of 2023 predicts that by 2031, this market will surpass $8 billion.
Our opening story offers clues about where such innovations are taking place. In our scenario, 3D printing robotic arms worked in tandem with computational models to determine how a proposed structure could be most efficiently delivered. In reality, University of Michigan researchers have designed and prototyped a 3D printed, ultra-lightweight structure using this same method.
In our scenario, autonomous drones routinely scanned construction sites, collecting data to be converted into an ever-evolving record of BIM models and visual feeds for remote inspection. In reality, drone manufacturing giant DJI openly markets the ability of its products to generate 3D point clouds of construction sites to help craft a digital model, while U.S. autonomous drone startup Skydio is working with contractors to deploy its products on the construction site.
In our scenario, AI-driven construction management software is infused with all aspects of scheduling, data analytics, and risk management. In reality, the AI construction market is becoming saturated with companies offering similar services, including construction management software giant Procore, whose products use machine learning to perform estimates, capture defects, identify risks to construction workers, and improve forecasting through historical analyses.
Our scenario's most alien proposition is of land-based, agile, autonomous robots performing the analytic and constructive tasks which in the present day are performed by humans, or at the very least, by machines directly controlled by humans. On this question, no company has captured the public imagination like Boston Dynamics. Born out of MIT in 1992, and now owned by Hyundai, the company describes its mission as creating a future in which humans and machines work together to improve safety, productivity, and quality of life. For Brian Ringley, Boston Dynamics' Principal Product Manager, the operative word for achieving this mission on the construction site is not replacement but collaboration.
"Anyone who is worried about robots being overly disruptive to humans, or completely replacing them, either hasn't been on a construction site or seriously undervalues human capability," Ringley told me in a recent conversation. "Whether on the topic of intelligence, dexterity, or communication, working with even the most advanced robots in the world gives you a profound appreciation for what humans are capable of."
Ringley's description of the collaborative relationship between humans and robotics isn't confined to literal hand-in-hand exercises. He cites examples such as the UK construction giant BAM, where human workers at the company's London headquarters supervised and directed autonomous robots on a site in the remote Shetland Islands, far off the north coast of Scotland. In another context, he cites the ability of human construction workers to deploy robotic agents on tasks with a frequency or complexity beyond human feasibility.
"We have found that teams benefit from constant data capture, but the truth is, nobody can afford to do as much data capture as they would like to," Ringley explained. "If you can deploy robotics on data capture missions, you are capturing value that simply wasn't possible to capture before. There is still a human in the loop directing the robot's path, managing what the nature of the data is, and setting up workflows to consume that data, but with the aid of robotics, they can now continuously monitor construction progress."
"It is difficult to find any BIM expert who believes they are fully leveraging the value of data-rich models," Ringley continued. "Imagine if that model wasn't just a design intent tool but was now a feedback loop tool. The model is always up-to-date with the physical reality of the site, which is invariably different from the original design model. If we could consistently and reliably capture data this way, it takes us to a whole new world of methodologies, software, and professionalization. Valuable human labor would be shifted from rote data capture and job site documentation tasks to model and system coordination. In effect, there are all sorts of avenues that could be opened up if you could trust that a model was accurate at all times. You could use that model to drive other types of automation or other types of robots. You could also more cost-effectively hand over a final as-built model to the building owner for use throughout the rest of the building's lifecycle. You can start to think about phases beyond construction."
Ringley's description of a collaborative relationship between construction AI and humans is not unique to Boston Dynamics. What sets the company apart from competitors is instead what Ringley labels "athletic intelligence," in which legged robots move with dexterity and agility designed to mimic those of humans. When Ringley describes athletic intelligence as the core spirit of the company, it's no exaggeration: the Boston Dynamics logo shows a human-like figure in mid-motion.
"You can trace this approach back to the roots of the company, and the mission of how we could bring mobile robots to the world," Ringley told me when asked about Boston Dynamics' deliberate pursuit of athletic intelligence. "Wheels can only operate on paved roads, which represents an extremely small fraction of our environment. Humans and animals have evolved with legs. Our approach is a form of co-evolution, which says that if you want to build effective automation for the human-purposed world we live in, you need to include legs. Otherwise, you run the risk of future designers being forced to design environments around automation, such as an Amazon warehouse scenario. We take the opposite view: Robotics must be designed for people and for cohabitation."
Boston Dynamics' investigations into athletic intelligence can take a variety of forms. In early 2023, a YouTube video of the company's humanoid robot Atlas lending a hand on a construction site amassed over 6 million views. The video was far from a one-hit wonder: one year earlier, a video of Atlas performing parkour amassed over 14 million views. "Atlas is ultimately a research and development project," Ringley explained when I asked if Atlas had a commercial future. "The lessons we learn from Atlas will filter into commercial products. Do we think there are future applications in construction for two-handed robotics? Is it necessarily Atlas, or even necessarily a humanoid? We have a lot to learn from customers and partners before we figure that out. For now, our videos of Atlas are more of a technology demonstrator."
While Atlas swings from handlebars, its four-legged companion Spot is leading Boston Dynamics' commercial product offering to the construction industry. Since its commercial launch in 2020, over one thousand Spot systems have been deployed across the world, with approximately one-third being deployed on construction sites. In our conversation, Ringley lists off a series of use cases for the so-called "robot dog" in the construction industry, many of which are grounded in Spot's ability to reliably and consistently capture job site data. Prominent examples included capturing data on construction work already completed, identifying deviations from design models, and capturing existing spaces set to undergo adaptation. Spot's four-legged design is crucial to these missions, enabling the autonomous robot to navigate the often uneven, variable, and obstacle-strewn nature of an active construction site.
While conversations on the impact of artificial intelligence in architecture can sometimes become narrow-focused discussions on new-age generative tools, Spot's operation and use cases are instructive examples of how artificial intelligence can permeate the design and construction process in a variety of forms. Boston Dynamics describes athletic intelligence as a form of AI, offering the robot an inherent sense of balance and perception, allowing it to traverse difficult terrain along preset routes with little or no input from users. Spot's ability to navigate unpredictable surroundings is not currently grounded in machine learning, although recent reporting suggests this may change. In a separate context, as construction managers increasingly adopt AI systems for tasks such as material quantification, on-site progress, model deviation, clash detection, and safety auditing, the demand for high-quality data spurs much of Spot's development. "These systems benefit from the frequency of data and repeatability of data across the board," Ringley told me. "You'll get better results from AI tools if you feed them better quality data. This is something that Spot is uniquely positioned to do versus other methods of capture on job sites."
In an analysis of use cases for Spot in the design and construction industry, few companies have as unique a perspective as Foster + Partners. In 2020, the firm became the first architecture practice to take part in the Boston Dynamics Early Adopter Program for Spot, deploying the robot in a range of contexts, from the Battersea Roof Gardens construction site to Foster + Partners' own campus in London as it underwent renovation.
"Spot was an example of our interest in investigating disruptive technologies, and of how robotics could be used in the AEC industry to revolutionize the way we work," Foster + Partners Senior Partner Martha Tsigkari told me in a conversation alongside Partner Adam Davis and Associate Partner Sherif Tarabishy. Tsigkari leads the company's Applied Research + Development (Applied R+D) group; a team of under 20 people, including Davis and Tarabishy, whose remit includes machine learning and robotics but also expands to areas such as performance-driven design optimization, buildability, extended reality, and digital twins. "We usually make up 1% of the company, but we believe in Einstein's relativity formula of having a small mass but huge acceleration," Tsigkari explained. "We look at what disruptive technologies exist in the wider world and seek to understand how we can integrate them into the AEC industry at all stages of the design process, from conception to completion."
Throughout our discussion, the Applied R+D group aligns with Ringley's earlier vision for Spot's predominant role in construction: consistent autonomous scans to generate data that the team can use to compare as-designed versus as-built models. However, the team's experience with Spot also opened other use cases. "We were interested in using Spot not just during construction but during building operation," Davis explained. "In our practice, we are regularly measuring factors such as air quality, lighting, and energy usage to understand changes in space over time. If you have spaces that change regularly, such as breakout areas where furniture is often manipulated, a consistent three-dimensional scan can allow you to understand how we use space."
While other industrial robots require human partners to be extensively trained in their use, the team sees promise in Spot's ability to coexist in the office environment without significant human intervention, though it is not without its challenges. "At this stage, and likely for some time to come, it is still a head-turner," Davis added. "The 'paparazzi' were quite keen to take cameras out and photograph it, and one person ran over to hug it. In a working environment, we will need changes in technology but also changes in culture, whereby we reach a point where people aren't distracted by the presence of robots."
The Applied R+D group's description of their experience with Spot offers tangible examples of what Boston Dynamics' Ringley described as the varying but collaborative relationship between humans and robotics; one which he hopes will be nurtured on future construction sites.
At the Battersea Roof Gardens site, where Spot was used to perform data capture missions along a consistent route, Tarabishy offers an insight into the human-robot relationship as it existed in the 2020 edition of Spot. "We would define a route by taking Spot to a starting point and manually driving it through the route we wished to take scans from," Tarabishy explained. "Each subsequent time, we would place it at the starting point and tell it to repeat. Initially, if it encountered a permanent obstacle, it would sit down and inform the team it was unable to proceed. As we gave feedback to Boston Dynamics, features were added which gave us options to manually maneuver around the obstacle while still collecting data, or skip the scan and move to the next mission."
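The record-and-replay workflow Tarabishy describes can be sketched in a few lines of Python. The names and API below are invented for illustration and do not reflect the actual Spot SDK; the sketch only shows the control flow of teaching a route once, replaying it, and handling a blocking obstacle by either skipping the scan or handing control to an operator.

```python
# Hypothetical sketch of a taught-route replay loop (not the Spot SDK).
from dataclasses import dataclass

@dataclass
class Waypoint:
    x: float
    y: float
    scan_here: bool = True  # whether a 3D scan is captured at this point

def replay_route(route, is_blocked, on_blocked="skip"):
    """Replay a recorded route; return the waypoints actually scanned.

    on_blocked="skip" moves on to the next mission point, mirroring the
    later feature Tarabishy mentions; any other value simulates the early
    behavior of stopping and calling for an operator.
    """
    scanned = []
    for wp in route:
        if is_blocked(wp):
            if on_blocked == "skip":
                continue  # skip this scan, proceed to the next mission point
            raise RuntimeError(f"Blocked at ({wp.x}, {wp.y}); operator needed")
        if wp.scan_here:
            scanned.append(wp)
    return scanned

# A route taught by driving the robot once, then replayed with one
# waypoint blocked by a new obstacle.
route = [Waypoint(0, 0), Waypoint(5, 0), Waypoint(5, 5)]
done = replay_route(route, is_blocked=lambda wp: (wp.x, wp.y) == (5, 0))
print(len(done))  # scans completed after skipping the blocked point
```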
"It's not a competition between humans and robots," Tsigkari noted as we reflected on the broader theme of robotics in construction. "Robotics will be good for certain things, and humans will be good for other things. Our interest here is in the human-robot collaboration, and how it could work in the AEC space. It's not a question of what robotics can offer, because we see every day that they can offer a lot. It's more about how we will interact with them. This is what many people need to come to terms with, and what forms part of our work: to ensure that the relationship between humans and robotics on a construction site or in a building is as seamless as possible."
Returning to our opening story, in which human construction workers were supplanted by robotics, the experience of Boston Dynamics and Foster + Partners strongly suggests that our fable is just that: a fictitious tale with little probability of manifesting in the foreseeable future. Readers of this article will not find themselves reminiscing on a bygone era of human construction workers as our fictitious grandfather did. The looming fear is nonetheless understandable. In 2023, when reports from economic authorities from the World Economic Forum to Goldman Sachs detail the millions of human jobs that could be replaced by artificial intelligence, there is a temptation to group robotics, generative tools, AI-powered analytics, and more into a single job-hunting army. In reality, each innovation must be addressed within its own context, and, for now, the landscape of construction robotics holds mountainous obstacles to overcome before a future dominated by entirely robotic construction sites appears on the horizon.
In robotics circles, perhaps the largest mountain is Moravec's paradox, which argues that it is significantly more difficult for computation to match humans in sensorimotor and perception skills than in reasoning. "It is comparatively easy to make computers exhibit adult level performance on intelligence tests or playing checkers, and difficult or impossible to give them the skills of a one-year-old when it comes to perception and mobility," Hans Moravec wrote in 1988. The technological landscape of the 21st century has so far proven Moravec correct, spurred faster still by the relative financial ease with which AI software can embark on a journey of trial and error when compared with robotic hardware. The resulting disparity we perceive between innovations in bits versus atoms was captured by Peter Thiel in his much-quoted remarks at Yale University in 2011. "What happened to the future?" Thiel wondered. "We wanted flying cars, instead we got 140 characters."
When we look beyond digital arenas such as ChatGPT and social media algorithms and focus our attention exclusively on computational applications in real-world settings, construction sites continue to serve as uniquely difficult environments for robotics to contend with. As Moravec's paradox suggests, ever-more complex physical environments present ever-increasing challenges for robotics to overcome. In highly-choreographed, predictable, rules-based settings such as vehicle production lines or packaging warehouses, robotics can be programmed to perform repeatable, automated tasks. Construction sites, by contrast, are in a state of constant flux. Every construction site presents its own unique context, leaving little room for pre-programmed repetition and a higher demand for the level of spatial intuition and manipulation that the human mind and body have naturally evolved over millions of years. Here, Ringley's overall assessment of the comparison between human and robotic capabilities bears repeating: "Working with even the most advanced robots in the world gives you a profound appreciation for what humans are capable of."
"It's not an exciting vision to say robots are here to replace us," Ringley told me as our conversation concluded. "What's joyous and interesting to me about this work is that there are tasks that humans and robots can achieve together that far surpass what machines can do in isolation and what people can do in isolation. Let's figure out what those things are and design systems to enable them."
Underwater robots for operations in challenging and dangerous … – Inceptive Mind
Posted: at 9:10 am
Divers are often put at considerable risk when searching for people or objects underwater due to factors such as strong currents, deep waters, and low visibility.
Engineers at the ETH Zurich spinoff company Tethys Robotics have developed an underwater robot that can be used in situations that are too dangerous for human divers.
The Tethys robot is an autonomous underwater vehicle that has been specially developed for use in challenging and dangerous environments like turbid channels and rivers. It is primarily used in situations when it is too difficult or risky to use conventional search and rescue techniques.
The Tethys robot weighs 30 kg, has a top speed of 2 meters per second, and has a fiber-optic cable reach of up to 10 km (6.2 miles). Thanks to its swappable lithium battery, the robot can operate for four hours on a single charge.
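Taken together, the quoted figures imply that the tether, not the battery, bounds the robot's straight-line operating radius; a quick back-of-envelope check using only the numbers above:

```python
# Back-of-envelope check of the Tethys figures quoted above.
# All inputs come from the article; the comparison is illustrative only.

top_speed_m_s = 2.0   # top speed, meters per second
battery_hours = 4.0   # runtime on one battery charge
tether_km = 10.0      # fiber-optic cable reach

# Distance the robot could cover at top speed on one charge:
max_distance_km = top_speed_m_s * battery_hours * 3600 / 1000
print(f"Max distance on one charge: {max_distance_km:.1f} km")  # 28.8 km

# The 10 km tether is therefore the binding constraint on radius
# for a straight-line run at top speed, not the battery.
effective_radius_km = min(max_distance_km, tether_km)
print(f"Tether-limited radius: {effective_radius_km:.1f} km")
```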
Equipped with acoustic sensors, cameras, and AI-based algorithms, the robot can autonomously search large areas underwater and quickly localize objects or people. This means that divers and rescue teams no longer have to risk working in dangerous situations.
The Tethys robot can grab objects weighing up to 40 kg and carry them back to the surface. Once the robot has located its target, an operator takes over the navigation and guides it in. This frees the emergency services to focus on other important tasks and helps ensure that the search and rescue operation runs as efficiently and safely as possible.
According to the ETH Zurich team, the underwater robot has already been used by several local authorities for underwater search and rescue operations.
NVIDIA Brings Advanced Autonomy to Mobile Robots With Isaac AMR – Nvidia
Posted: at 9:10 am
As mobile robot shipments surge to meet the growing demands of industries seeking operational efficiencies, NVIDIA is launching a new platform to enable the next generation of autonomous mobile robot (AMR) fleets.
Isaac AMR brings advanced mapping, autonomy and simulation to mobile robots and will soon be available for early customers, NVIDIA founder and CEO Jensen Huang announced during his keynote address at the COMPUTEX technology conference in Taipei.
Isaac AMR is a platform to simulate, validate, deploy, optimize and manage fleets of autonomous mobile robots. It includes edge-to-cloud software services, computing and a set of reference sensors and robot hardware to accelerate development and deployment of AMRs, reducing costs and time to market.
Mobile robot shipments are expected to climb from 251,000 units in 2023 to 1.6 million by 2028, with revenue forecast to jump from $12.6 billion to $64.5 billion in the period, according to ABI Research.
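The forecast quoted above implies steep compound annual growth over the 2023 to 2028 period; the short calculation below derives the implied rates from the article's figures (illustrative only, not taken from the ABI Research report):

```python
# Implied compound annual growth rates (CAGR) from the ABI Research
# forecast quoted above: 2023 -> 2028, i.e. a 5-year span.

def cagr(start, end, years):
    """Compound annual growth rate as a fraction."""
    return (end / start) ** (1 / years) - 1

units_growth = cagr(251_000, 1_600_000, 5)  # unit shipments
revenue_growth = cagr(12.6, 64.5, 5)        # revenue in $ billions

print(f"Unit shipments CAGR: {units_growth:.1%}")   # roughly 45% per year
print(f"Revenue CAGR:        {revenue_growth:.1%}") # roughly 39% per year
```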
Despite the explosive adoption of robots, the intralogistics industry faces challenges.
Traditionally, software applications for autonomous navigation are often coded from scratch for each robot, making it complex to roll out autonomy across different robots. Also, warehouses, factories and fulfillment centers are enormous, frequently running a million square feet or more, making them hard to map for robots and keep updated. And integrating AMRs into existing workflows, fleet management and warehouse management systems can be complicated.
For those working in advanced robotics and seeking to migrate traditional forklifts or automated guided vehicles to fully autonomous mobile robots, Isaac AMR provides the blueprint to accelerate the migration to full autonomy, reducing costs and speeding deployment of state-of-the-art AMRs.
Isaac AMR is built on the foundations of the NVIDIA Nova Orin reference architecture.
Nova Orin is the brains and eyes of Isaac AMR. It integrates multiple sensors, including stereo cameras, fisheye cameras, and 2D and 3D lidars, with the powerful NVIDIA Jetson AGX Orin system-on-module. The reference robot hardware comes with Nova Orin pre-integrated, making it easy for developers to evaluate Isaac AMR in their own environments.
The compute engine of Nova is Orin, which delivers 275 tera operations per second (TOPS) of real-time edge computing for some of the most advanced AI and hardware-accelerated algorithms.
The synchronized and calibrated sensor suite offers sensor diversity and redundancy for real-time 3D perception and mapping. Cloud-native tools for record, upload and replay enable easy debugging, map creation, training and analytics.
Isaac AMR offers a foundation for mapping, autonomy and simulation.
Isaac AMR accelerates mapping and semantic understanding of large environments by tying into DeepMap's cloud-based service to help accelerate robot mapping of large facilities from weeks to days, offering centimeter-level accuracy without the need for a highly skilled team of technicians. It can generate rich 3D voxel maps, which can be used to create occupancy maps and semantic maps for multiple types of AMRs.
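The step from a 3D voxel map to a 2D occupancy map can be sketched in a few lines. The snippet below is an illustrative simplification, not NVIDIA's implementation: it marks a ground cell occupied if any voxel in its vertical column, within a height band relevant to the robot, is filled. Real occupancy grids typically also encode unknown space and occupancy probabilities.

```python
import numpy as np

def voxels_to_occupancy(voxels, z_min, z_max):
    """Collapse a boolean voxel grid of shape (X, Y, Z) into a 2D
    occupancy map: a cell is occupied if any voxel in its column,
    between heights z_min and z_max, is filled."""
    return voxels[:, :, z_min:z_max].any(axis=2)

# Toy 4x4x5 grid with a single obstacle voxel at (x=1, y=2, z=2).
grid = np.zeros((4, 4, 5), dtype=bool)
grid[1, 2, 2] = True

# Only heights 1..3 matter for this hypothetical robot.
occ = voxels_to_occupancy(grid, z_min=1, z_max=4)
print(occ.astype(int))  # 4x4 map with a single occupied cell
```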
Additionally, Isaac AMR shortens the time to develop and deploy robots in large, highly dynamic and unstructured environments with autonomy that's enabled by multimodal navigation and cloud-based fleet optimization using NVIDIA cuOpt software.
An accelerated, modular framework enables real-time camera and lidar perception. Advanced path planners, behavior planners and semantic information let the robot operate autonomously in complex environments, while a low-code/no-code interface makes it easy to rapidly develop and customize applications for different scenarios and use cases.
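To make the path-planning step concrete, here is a generic A* search on a 2D occupancy grid, the classic core of a grid path planner. This is a minimal sketch of the technique in general, not Isaac AMR's actual planner:

```python
import heapq

# Generic A* on an occupancy grid (0 = free, 1 = obstacle), 4-connected,
# with a Manhattan-distance heuristic. Returns the cell path or None.
def astar(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # admissible heuristic
    open_set = [(h(start), 0, start, [start])]               # (f, g, cell, path)
    seen = set()
    while open_set:
        _, g, cell, path = heapq.heappop(open_set)
        if cell == goal:
            return path
        if cell in seen:
            continue
        seen.add(cell)
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heapq.heappush(open_set,
                               (g + 1 + h((nr, nc)), g + 1, (nr, nc), path + [(nr, nc)]))
    return None  # goal unreachable

grid = [[0, 0, 0],
        [1, 1, 0],   # a wall the robot must route around
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))
print(len(path))  # shortest route around the wall visits 7 cells
```

A production planner layers smoothing, kinematic constraints and dynamic-obstacle avoidance on top of a search like this, and behavior planners choose between such routes using semantic labels.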
Finally, Isaac AMR simplifies robot operations by tapping into physics-based simulation from Isaac Sim, powered by NVIDIA Omniverse, an open development platform for industrial digitalization. This can bring digital twins to life, so the robot application can be developed, tested and customized for each customer before deploying in the physical world. This significantly reduces the operational cost and complexity of deploying AMRs.
Sign up for early access to Isaac AMR.
See more here:
NVIDIA Brings Advanced Autonomy to Mobile Robots With Isaac AMR - Nvidia
Posted in Robotics
Robotic assisted surgery now available at Northeast Regional … – Kirksville Daily Express and Daily News
Posted: at 9:10 am
Northeast Regional Medical Center
Northeast Regional Medical Center is taking minimally invasive surgery to the next level with the addition of new robotic equipment. Minimally invasive techniques are advanced through the use of robotic assisted equipment, allowing surgeons to perform more complex procedures.
The new equipment features a magnified 3D high-definition vision system and tiny wristed instruments that bend and rotate far greater than the human hand. The robotic assisted technology allows surgeons to operate using the tiniest incisions with greater vision, precision and control.
"We are excited to offer this technology to Kirksville and the surrounding communities," NRMC Interim Chief Executive Officer Dwayne Blaylock said. "With the new robot, surgeons trained in this surgical instrumentation are now able to provide a number of minimally invasive surgical procedures vs. a traditional laparoscopic surgical approach."
Dr. Steven Lyons, general surgeon at NRMC, is trained in this highly specialized surgical robotics approach.
"Robotic surgery can offer a faster recovery for patients as opposed to traditional open or laparoscopic surgery," Dr. Lyons said. "It is our goal to provide safe and compassionate care. Patients typically experience a shorter recuperation period with less intense pain and many can usually return to their normal routine in a shorter period of time. In some instances we are actually able to have better visualization to perform safe surgery and it allows us to do some parts of surgeries more effectively."
While not all patients are good candidates for robotic surgery, those who are good candidates are given the option of utilizing this advanced technology vs. a more traditional surgical approach. Examples of surgical procedures that have benefitted from robotic surgery include, but are not limited to: inguinal hernia, ventral hernia, umbilical hernia, incisional hernia and hiatal hernia repair; urologic (prostate) surgery; general laparoscopic surgery; gynecologic surgery such as hysterectomies and ovary removal for benign conditions; certain thoracic procedures; gallbladder removal; and early-stage (T1 or T2) cancers.
Northeast Regional Medical Center offers a free e-newsletter with a monthly dose of health and wellness inspiration sent directly to your inbox from a trusted medical source. Sign up by visiting nermc.com/enewsletter-sign-up.
About Northeast Regional Medical Center
NRMC is a 93-bed facility with a Level III trauma center, Level III STEMI center, Level III stroke center, and ACC Certified Chest Pain Center. With over 500 healthcare professionals, NRMC is a teaching hospital associated with the founding school of osteopathic medicine. NRMC has a 4-Star CMS Quality Star rating and Spring 2023 Leapfrog A safety grade. NRMC is owned, in part, by physicians.
Posted in Robotics
Mars livestream: ESA to beam first ever live view from the red planet – Business Insider
Posted: at 9:09 am
- Mars livestream: ESA to beam first ever live view from the red planet Business Insider
- First time ever Mars livestream a chance "to get as close as it's currently possible" to the red planet CBS News
- Live from Mars! European probe beams Red Planet views to Earth in 1st-ever video feat Space.com
Read more here:
Mars livestream: ESA to beam first ever live view from the red planet - Business Insider
Posted in Mars
Mars declared unsafe for humans to live as no one can survive for longer than four years – UNILAD
Posted: at 9:09 am
For years, there's been talk of one day mankind living on Mars.
And in recent times, with the likes of Elon Musk seemingly on a one-man crusade to get there first, it looks like it might happen fairly soon.
But according to one of the most recent studies into the viability of human life on the Red Planet, it might not be as easy as you'd think.
Yep, research carried out by a team at UCLA looked to answer two key questions. The first concerned the impact of particle radiation and whether it would pose too grave a threat to human life.
The second was whether the timing of a mission to Mars could protect astronauts and the spacecraft from radiation.
To answer both questions, the scientists used geophysical models of particle radiation for a solar cycle and models of how radiation could affect both human passengers and a spacecraft.
Now, the good news is that the answers to their two questions were 'no' and 'yes', respectively.
According to the scientists' calculations, the spacecraft should provide enough protection during the round trip to and from Mars for the astronauts.
However, if the material the spacecraft is built with is too thick, then it could actually increase the amount of secondary radiation.
They also noted that this would largely be dependent on the timing of the mission as well.
Researchers claimed the best time to leave Earth would be when the solar activity is at its peak.
This is because during the 'solar maximum' the most dangerous particles are deflected, thus shielding the astronauts from the worst of it.
On the other hand, the bad news is that experts recommended humans should spend no longer than four years on any mission to the planet.
According to the study, which was published under AGU's Advancing Earth and Space Science banner, beyond this point the levels of radiation become unsafe.
It reads: "Our calculations clearly demonstrate that the best time for launching a human space flight to Mars is during the solar maximum, as it is possible to shield from Solar Energetic Particles.
"Our simulations show that an increase in shielding creates an increase in secondary radiation produced by the most energetic GCR, which results in a higher dose, introducing a limit to a mission duration.
"We estimate that a potential mission to Mars should not exceed approximately four years.
"This study shows that space radiation imposes strict limitations and presents technological difficulties for the human mission to Mars, but such a mission is still viable."
View post:
Mars declared unsafe for humans to live as no one can survive for longer than four years - UNILAD
Posted in Mars
How astronauts heading to Mars could enjoy fresh produce and grill meat – KSL.com
Posted: at 9:09 am
Estimated read time: 5-6 minutes
SAN FRANCISCO – When the first astronauts venture to Mars in the future, the crew will need access to healthy, fresh food, but there are no cosmic grocery stores along the way. And the round trip to the red planet is expected to take about three years.
Food is one of the many challenges NASA faces before sending humans into deep space, but it's a big one. Nutritious food that also stimulates the appetite is necessary to keep astronauts healthy, and freeze-dried options won't be enough.
This demand for nutrition is part of why NASA and the Canadian Space Agency began the Deep Space Food Challenge, an open call to experts around the world to develop technologies for keeping astronauts fed and healthy on long-term space missions.
The competition led the Astra Gastronomy team at Nonfiction, a design and innovation firm based in San Francisco, to develop the Space Culinary Lab. The compact kitchen-style system includes stations for growing algae and leafy greens, blending creamy coffee and even grilling meat.
"The idea here is to create a space kitchen," said Phnam Bagley, cofounder of Nonfiction. "You get to prepare the food that you want however you want it. Bringing that level of agency to astronauts is where designers like us start."
The Space Culinary Lab made it through the first phase of the Deep Space Food Challenge in October 2021. Despite not being selected during phase two, the design showcases some of the technology that could be used not only in space but also in resource-challenged environments such as refugee camps and food deserts on Earth.
The heart of the design is to bring "a bit of humanity to space," with mix and match options so astronauts aren't exhausted with the same flavors and textures as their taste buds become dull in space, Bagley said.
The lab provides ways the astronauts can also keep up a strong appetite to prevent weight loss and have access to fresh options to maintain optimal nutrition, which is crucial for their health as the crew ventures far from Earth.
The culinary lab is configured so the rounded design could slot into an existing spacecraft and would require few resources and little effort from the astronauts. The different modules included in the design are called munch, sizzle, yum and snap.
Snap provides a refreshing wall of green within the otherwise sterile environment of a spacecraft, where the astronauts can tend to microgreens grown without soil such as baby bok choy and butter greens. Pink lights provide the proper wavelength that accelerates the growth of the greens, and timed spritzes furnish the exposed roots with water and nutrients.
While the greens deliver extra flavor and healthy nutrients to a meal, there's a psychological side to tending to the plants as well.
Astronauts living for six months or longer aboard the International Space Station have shared how growing, harvesting and eating fresh produce has improved their mood and brought out their nurturing sides as they incorporated caring for plants into their routines.
The culinary lab's munch module offers another nutrition boost by growing microalgae in a bioreactor. The algae can be collected, dehydrated and mixed with fruit powders, spices, vinegar, oats and peanut butter for a tasty and nutritious snack.
Microalgae could help protect the astronauts as they leave the shielding effects of low Earth orbit and venture into the harsh radiation environment of deep space, Bagley said.
Rehydrated meats are something astronauts rely on as a source of protein. To make them more palatable, Nonfiction included sizzle as part of the culinary lab. The tiny microwave drawer, which resembles a convection oven, has glass plates and laser technology. Bagley demonstrated brushing a piece of rehydrated chicken with a blend of maple syrup and soy sauce, a combination that is "shelf stable and delicious," she said.
As the meat warms, the "marinade" helps it caramelize, and a laser draws grill marks on the meat. (You can also draw your name or even a rendering of the "Mona Lisa" if it amuses you, Bagley said.) Sizzle can be used to warm and "grill" vegetables, tofu and tortillas as well.
Since astronauts struggle to sleep properly in space, they might also be relying on extra caffeine on the long journey to Mars. That's where the yum module comes in handy. The creaming machine uses a steel probe to emulsify water- and oil-based ingredients to create lattes, chocolate ganache and mayonnaise in a self-contained way.
The futuristic space food prepared using the culinary lab was available for a taste test at Nonfiction during CNN's visit in March, including space coffee and algae mixed with different flavors.
The algae, rolled into balls or cubes after being blended with ingredients in a silicone pouch, can stay fresh for two to three days.
Two types of nutritional algae balls were on hand: one savory and one fruity. The end result resembled a snack for a long hiking trip, but it was surprisingly delicious and didn't have an algae aftertaste.
Bagley and others at Nonfiction, including Mark Alexander, Mardis Bagley, Nadia Kutyreva and Fifile Nguyen, have tasted multiple flavor combinations to get the balance right.
"I think we've realized that if we put too many ingredients together, it confuses the flavor profile, and then the algae flavor comes back," Bagley said. "We use two or three ingredients at once."
One mix blended peanut butter, oats, onion powder and vinegar with the algae for a strong, savory flavor with a pleasant, sour finish. But the favorite was the fruity algae, which mixed in powders from freeze-dried strawberries, cherries and other fruits. The fruit powders masked the algae flavor and made it taste more like a slightly sweet treat without added sugars.
Then, coffee powder, hot water, ghee, coconut oil and lecithin were blended with the emulsifying probe to create a foamy brew.
"The mechanism agitates the liquids together," Bagley said, "and creates this super creamy hot beverage, which is very satisfying in the morning."
View original post here:
How astronauts heading to Mars could enjoy fresh produce and grill meat - KSL.com
Posted in Mars
Here is the first livestream from Mars: a rare, almost real-time look … – NPR
Posted: at 9:09 am
Taking a picture of Mars is not easy.
Once light bounces off the planet, it can take between 3 and 22 minutes to travel to Earth, so there aren't truly "live" images of Mars.
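That 3-to-22-minute range follows directly from Earth–Mars geometry. A quick back-of-envelope check, using approximate closest-approach and maximum-separation distances:

```python
# One-way light travel time at Mars's approximate nearest and farthest
# distances from Earth (round numbers, for illustration only).
C_KM_S = 299_792.458           # speed of light, km/s
closest_km = 54.6e6            # ~closest approach
farthest_km = 401e6            # ~maximum separation

for label, d in (("closest", closest_km), ("farthest", farthest_km)):
    minutes = d / C_KM_S / 60
    print(f"{label}: {minutes:.1f} min")
# closest: 3.0 min, farthest: 22.3 min
```

So even a "livestream" from Mars is always at least a few minutes behind real time.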
But on Friday afternoon, the European Space Agency offered the closest thing: the first "livestream" of Mars on YouTube, which posted pictures of the planet every 50 seconds as they beamed down directly from the camera mounted on the agency's Mars Express orbiter. The livestream was about an hour long.
In this handout image supplied by the European Space Agency on July 16, 2008, the Echus Chasma, one of the largest water source regions on Mars, is pictured from ESA's Mars Express. (ESA/Getty Images)
"Normally, we see images from Mars and know that they were taken days before. I'm excited to see Mars as it is now as close to a martian 'now' as we can possibly get!" James Godfrey, the spacecraft operations manager at the ESA's mission control center, said in a statement.
In 50-second intervals, the camera panned across Mars, showing a side of the planet entering night, as well as some clouds billowing out on the corner.
The livestream celebrates the 20th anniversary of the Mars Express mission, which launched in 2003 to better understand the planet, as well as search for traces of water.
There are only a few examples of "live" footage in space, including the famous Apollo missions showing astronauts walking on the moon's surface, as well as the DART and LCROSS missions, in which NASA intentionally crashed spacecraft into an asteroid and the moon, respectively, the ESA said in a news release.
"These missions were all pretty close to home and others farther away sent perhaps an image or two in near real-time," the ESA said. "When it comes to a lengthy livestream from deep space, this is a first."
Most observations and data gathered by spacecraft are beamed down to Earth a few hours or even days later, which isn't generally an issue for scientists.
In fact, though the speed of light can make livestreams difficult, in other cases, it has been a boon for scientific discovery.
Take the Euclid mission. The telescope will capture light that has been traveling for 10 billion years, allowing scientists to see 10 billion years into the past, the ESA said.
View original post here:
Here is the first livestream from Mars: a rare, almost real-time look ... - NPR
Posted in Mars
20 years of Mars Express: Mars as never seen before – European Space Agency
Posted: at 9:09 am
Science & Exploration
02/06/2023
A new mosaic of Mars marks 20 years since the launch of ESA's Mars Express, and reveals the planet's colour and composition in spectacular detail.
The mosaic was created using data from Mars Express's High Resolution Stereo Camera (HRSC).
HRSC normally photographs Mars's surface from an altitude of about 300 km (the closest the spacecraft gets to Mars in its elliptical orbit), with the resulting images covering areas about 50 km across. However, the mosaic presented here uses a slightly different approach. To view the planet more widely, HRSC gathered 90 images at higher altitudes (of 4000 to 10 000 km), thus capturing areas around 2500 km wide. These images were then put together to form a full global view.
Such large-scale images are typically obtained to observe weather patterns on Mars, but even in the absence of atmospheric phenomena they offer wonderful views of the planet's surface.
This new view highlights variation across Mars's surface by enhancing local colour and contrast.
Thanks to its nine imaging channels, HRSC can visualise Mars not only in three dimensions but also in colour. However, the ever-changing opacity of the martian atmosphere makes it difficult to determine accurate surface colours from orbit. Dust scatters and reflects light, causing colours to shift between images and creating a patchwork-like effect when assembling a mosaic.
Until now, suppressing this effect during image processing has reduced variations in colour between different parts of Mars. But to create this mosaic, the HRSC team instead colour-referenced each constituent image to a colour model derived from high-altitude observations, allowing them to preserve colour variations and reveal a far richer colour view of Mars than has been seen before.
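The colour-referencing idea can be sketched as a per-channel gain correction: rather than flattening colour differences, each tile is scaled so its channel averages match a global colour model. This is a simplified illustration of the general technique, not the HRSC team's actual processing pipeline:

```python
import numpy as np

# Simplified sketch (not the HRSC pipeline): scale each image tile so its
# per-channel means match a reference colour model, removing atmospheric
# tint as a gain while preserving within-tile colour variation.
def colour_reference(tile: np.ndarray, model_means: np.ndarray) -> np.ndarray:
    """tile: (H, W, 3) image; model_means: per-channel target means."""
    gains = model_means / tile.mean(axis=(0, 1))  # one gain per channel
    return tile * gains

rng = np.random.default_rng(0)
tile = rng.uniform(0.2, 0.8, size=(8, 8, 3))      # a dusty, tinted tile
model = np.array([0.5, 0.35, 0.25])               # assumed reference colour model
corrected = colour_reference(tile, model)
print(np.allclose(corrected.mean(axis=(0, 1)), model))  # True
```

Because every tile is referenced to the same model, adjacent tiles agree at their seams without averaging away genuine surface colour differences.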
While beautiful in its own right, the mosaic also provides fascinating information about Mars's composition, revealing an unprecedented variety and detail of colours across its surface.
Mars is famous for its reddish colour, which is caused by high levels of oxidised iron. However, large parts of the planet appear to be rather dark and blue-toned here. These are grey-black basaltic sands of volcanic origin that form far-reaching, dark layers of sand across Mars. They pile up as they move in the wind, creating imposing sand dunes and dune fields within impact craters.
Material weathered by water, on the other hand, tends to look lighter. The two most common water-weathered minerals on Mars, clay and sulphate minerals, appear particularly bright on such colour composites; their presence was established by the OMEGA spectrometer on Mars Express. The presence of these minerals signals that liquid water existed on Mars for a long time, weathering and altering rock over time to form significant clay deposits such as Mawrth Vallis (a former outflow channel not shown in this view but previously observed by HRSC).
Sulphate minerals are visible here within the Valles Marineris canyon system, as seen most clearly in the annotated image. They are covered by a thin veneer of dark sand, but their impressive colour variations can be seen on closer inspection. Unlike clay deposits, sulphate minerals indicate more acidic environmental conditions that would be less friendly to life.
Mars Express launched in 2003 – 20 years ago – and has been orbiting the Red Planet ever since. The orbiter is imaging Mars's surface, mapping its minerals, identifying the composition and circulation of its tenuous atmosphere, probing beneath its crust, and exploring how various phenomena interact in the martian environment.
The spacecraft's HRSC, the camera responsible for these images, has revealed much about Mars's diverse surface features in the past 20 years. Its images show everything from wind-sculpted ridges and grooves to sinkholes on the flanks of colossal volcanoes to impact craters, tectonic faults, river channels and ancient lava pools.
The mission has been immensely productive in its two decades of life, creating a far fuller and more accurate understanding of our planetary neighbour than ever before. It was initially planned to last for one martian year, or around 687 Earth days, but has continued to meet and exceed its objectives. As the mission has been extended until at least the end of 2026, we can anticipate many more beautiful and insightful snapshots of Mars in the years to come.
The mission's High Resolution Stereo Camera (HRSC) was developed and is operated by the German Aerospace Center (Deutsches Zentrum für Luft- und Raumfahrt; DLR).
The development of the colour model method and the processing of the mosaic were performed by Greg Michael of the HRSC team at Freie Universität Berlin. The acquisition and planning of the high-altitude images were the responsibility of the camera operations team at the German Aerospace Center (DLR) at Berlin-Adlershof. On publication of the upcoming scientific paper on the mosaic, the georeferenced dataset will be made available through the ESA guest storage facility.
View post:
20 years of Mars Express: Mars as never seen before - European Space Agency
Posted in Mars