The Prometheus League
Breaking News and Updates
Daily Archives: August 28, 2021
Composting Is a Viable Method of Equine Carcass Management - The Horse – TheHorse.com
Posted: August 28, 2021 at 12:12 pm
Approximately 100,800 U.S. horses die each year, many of them euthanized for health reasons. However, the most widely used euthanasia drug in the equine industry, sodium pentobarbital, has been linked to secondary poisoning in both wildlife and domestic animals. It's also been detected in groundwater for up to 20 years after contamination. Proper management of euthanized horses is therefore paramount to protecting the environment and water sources.
One environmentally friendly option is carcass composting, a practice routinely used for carcass management in the swine and poultry industries. However, little evidence exists as to its efficacy and use for horses. So, a team of researchers from various universities recently performed a study to demonstrate successful equine mortality composting and document sodium pentobarbital concentrations throughout the process. University of Minnesota master's student Hannah Lochner presented the group's findings at the Equine Science Society's 2021 virtual symposium.
"In addition to environmental concerns, the existing management options for euthanized horses are increasingly limited," Lochner explained. Rendering facilities have limited abilities; burial comes with site restrictions and is not practical during winter months in northern climates; and incineration (cremation) is costly at about $1,600 per carcass. Composting, the natural, biological decomposition of a carcass aboveground, on the other hand, is biosecure, feasible, and eco-friendly.
To confirm this method's ability to degrade carcasses and reduce sodium pentobarbital concentrations, Lochner and her fellow researchers performed a composting trial of four horses euthanized for terminal medical reasons from September 2019 to April 2020. At a location in rural Minnesota, they constructed four compost piles, one for each horse, that included the following layers:
The team placed temperature loggers at depths of 46 and 91 centimeters to record pile temperatures every eight hours. They also turned the piles at Day 50 and concluded the trial at Day 216 (after five months of curing was complete). On these days, the team sampled three mirrored cross-sections of each pile and scored the amount of degradation from 1 (carcass discernible) to 5 (a few large bones remaining). They also analyzed sodium pentobarbital levels.
Lochner said that by Day 216, all piles received scores of 3 or 4, meaning the carcasses had degraded and only hair, hide, and large bones remained. Pentobarbital, while at low levels, was still detectable on Day 216. These concentrations were highest at the center of the pile on Day 50 and more consistent between cross-sections on Day 216.
"These findings suggest mortality composting is an effective method for managing equine carcasses," said Lochner, adding that further research is needed to determine the environmental implications of composting chemically euthanized horses.
See the original post here:
Composting Is a Viable Method of Equine Carcass Management The Horse - TheHorse.com
Mobile vet service allows pet owners to say goodbye in the comfort of home – TheRecord.com
Posted: at 12:12 pm
SEBRINGVILLE - It's the hardest decision a pet owner can make: saying goodbye to a beloved companion who is part of the family.
End-of-life care and euthanasia can be extremely stressful for both humans and animals; grief, and even guilt, wash into the void left behind.
"I think it's about the bond, really, that unique human-animal bond that you just want to maintain all the way through the end of a pet's life and beyond, and to recognize that the loss of a pet is significant," said Dr. Erica Dickie.
The Sebringville-based veterinarian runs Black Creek Mobile Veterinary Services, which focuses solely on compassionate end-of-life care for pets, including hospice and palliative care and in-home euthanasia. While she primarily deals with cats and dogs, she will see smaller companion animals like rabbits and guinea pigs as well.
There's no brick-and-mortar clinic; instead, Dickie travels to clients throughout Perth County and western Waterloo Region in a primary radius of 40 kilometres from her home base. She frequently ventures further afield with an additional mileage fee. "I go where the need is if I'm available."
Dickie and her small team do their best to accommodate same-day or next-day appointments. She also works with families for several weeks or months in keeping pets comfortable through pain and symptom relief.
"Animal hospice care is very similar to the philosophy in human hospice settings. There's a huge mental shift in the physician and in the family," she said. "The care is geared toward comfort, and not cure."
A relationship with Black Creek often begins with a telemedicine quality-of-life consultation, which touches on topics like pet comfort, medicine and nutrition, holistic supports, environmental changes, decisions around euthanasia, and what's best, ultimately, for both the pet and its owners.
"Part of my job as a palliative care veterinarian is preparing them for what to expect, what to look for, and that continuing communication is ongoing with me to know when things are starting to take a turn," Dickie said.
Originally from Kitchener, Dickie completed undergraduate studies at the University of Guelph and graduated from the Ontario Veterinary College in 2011.
Her work initially took her from Southampton to Stratford. She also was a locum veterinarian, working in a relief capacity at different practices.
When it came to euthanasia, Dickie adopted the practice of pre-sedation. "I believe that it creates a more gentle transition," she said. "Without it, that sudden loss of consciousness, that stays with the witness."
The journey with her cat Smokey in his final months, ultimately providing palliative home care and pain management, inspired Dickie to launch Black Creek in March 2020.
"Losing him really reminded me about the deep impact when you have those bonds, and how much the loss affects you," she said. "I felt more could be done in veterinary medicine and this is what was calling me."
Dickie earned a certification in animal hospice and palliative care through the International Association for Animal Hospice and Palliative Care.
Black Creek's inception as the pandemic began reinforced the benefits of the in-home service, as vet appointments became hard to find and people weren't allowed to accompany their pets into clinics.
Familiar surroundings reduce stress for both animals and people. Many pet owners know the pain of a long drive home after taking their companion to the vet's for the final time.
Being present for a pets final moments can be incredibly important for a family; while the decision is up to parents, Dickie encourages children and even other pets to be there.
Aftercare includes resources for grief support and assistance with cremation or burial arrangements and memorial or keepsake options.
"To be able to be at your home, and after everything is said and done, to just grieve, cry ... It's more authentic. It's really just honouring that bond."
Read more:
Mobile vet service allows pet owners to say goodbye in the comfort of home - TheRecord.com
Health care without conscience is a dangerous contradiction – CBC.ca
Posted: at 12:12 pm
This column is an opinion from Brian Bird, an assistant professor at the Peter A. Allard School of Law at the University of British Columbia. For more information about CBC's Opinion section, please see the FAQ.
Health care is a fixture of Canadian election campaigns. Wait times, bed shortages and private health care are usual topics of debate. Now we can add freedom of conscience to that list.
It has long been accepted that health-care workers in Canada have a right to distance themselves from procedures that they consider unethical. This right sustained a body blow last week when the Liberals, Conservatives and NDP said that these workers must provide referrals to other medical service providers willing to perform the procedure.
Requiring doctors to make the arrangements for procedures that they cannot perform in good conscience is far from a compromise. If you believe it is wrong to rob a bank, would you be willing to plan the robbery?
The remarks by the parties are the latest threat to conscience in Canadian health care.
In British Columbia, a private hospice that declined to provide assisted suicide lost its licence to operate. In Ontario, the highest court of that province ruled that doctors can be forced to facilitate procedures they deem immoral. In Manitoba, a university expelled a medical student with moral objections to abortion.
Hostility to conscientious health care is fuelled by the flawed belief that health care amounts to whatever a doctor, nurse or other health-care worker is lawfully permitted to do. To be a good health-care worker therefore means that you must be willing to participate in any service that is categorized by the state as health care, regardless of any ethical qualms you might have.
These ideas are dangerous.
Health care is not simply whatever the state says it is. Health care is a sphere of human activity: preserving life, healing the sick and comforting the dying. Health-care workers are not robotic technicians but, as the pandemic has dramatically reminded us, human beings with a calling.
It does not take much reflection to realize why divorcing health care from ethical considerations and reducing it to whatever is authorized by law is a risky path to follow. This prospect should alarm all of us. Finalizing this divorce will lead to disastrous consequences for individuals and society alike.
Some of us think we are already seeing and living these consequences. A decade ago, it was hard to imagine that euthanasia for the terminally ill would be legal in Canada. Only a few years ago, it was unthinkable that euthanasia would ever be afforded to persons with mental illness. But here we are.
Does anyone want a health-care system that obliges the people who work within it to disable their moral compass and unreflectively endorse whatever the state labels as health care? It is easy to say that health-care workers cannot refuse to participate in whichever services are lawful when we agree with what is lawful. But what happens when we disagree?
Health-care providers who follow their conscience are often portrayed as unprofessional, uncaring and even un-Canadian. They are scolded for bringing their personal convictions to work, but their critics lean on convictions of their own. Take the complaint that conscientious objectors abandon their patients and deny them care. It assumes but does not demonstrate that what these individuals refuse to do amounts to health care, properly understood. That matter is not a footnote; it is the heart of the debate.
Freedom of conscience in health care is not political activism. Conscientious objection rests on the view that the service in question neither promotes health nor constitutes care but instead harms patients and others.
The doctor who conscientiously refuses to participate in abortion or euthanasia does so because she considers these acts to be lethal violence against a human being. You might disagree with these beliefs, but they are not crazy. They are rationally defensible and deserve a fair hearing.
If there is any sector of our society where ample space should be granted to conscience, health care is it. Health-care professionals are, first and foremost, called to do no harm.
Many Canadians say that health-care professionals should not bring conscience to their job. But the truth is, without conscience, their job cannot be done.
Do you have a strong opinion that could add insight, illuminate an issue in the news, or change how people think about an issue? We want to hear from you. Here's how to pitch to us.
Read more here:
Health care without conscience is a dangerous contradiction - CBC.ca
Mysteries: Louise Penny's "The Madness of Crowds" Review – The Wall Street Journal
Posted: at 12:12 pm
At the start of Louise Penny's "The Madness of Crowds," Chief Inspector Armand Gamache, head of homicide for the Sûreté du Québec, is assigned to oversee security for a polarizing professor's lecture at a local college.
Abigail Robinson is a statistician who proposes that the government can ease socioeconomic pressures through eugenics and euthanasia. Gamache reluctantly fulfills his task shadowing her talk, during which a gunman fires (unsuccessfully) at the speaker. "Can you think of anyone who might want to hurt you?" the professor is asked. "Well," she says, "there's half of Canada, it seems." The following day, another woman, who was perhaps mistaken for Robinson, is found bludgeoned to death in the snow.
"The Madness of Crowds" is Ms. Penny's 17th entry in her intelligent and emotionally powerful series featuring Gamache. Once again, the author has produced a unique work twining moral issues and absorbing character studies into a challenging murder mystery. There is an abundance of deductive speculation among Gamache and his team, and the brainstorming continues even after the suspects are gathered for a final confrontation. Then, at the very last moment, the chief inspector, in a manner worthy of Aesop, Solomon, Freud or Holmes, points the finger of guilt.
Jack, the 25-year-old title character of Peter Heller's "The Guide," isn't looking to save the world, just his own sanity. Guilt-ridden by the death of his mother in a horse accident and the demise of his best friend in a river mishap, he's retreated into a solitary life centered around fishing. He leaps like a trout at the chance to work as a guide at Kingfisher Lodge, a rustic getaway for the rich and famous that promises "boutique fishing at its finest." The first guest entrusted to him is Alison, a popular singer who fishes well and finds Jack charming. He asks himself: What could be better?
Well, it would be nice if the guests and staff weren't under camera surveillance, and if there weren't off-limits areas where the penalty for trespassing is being shot at. And if Jack didn't find evidence suggesting that his predecessor as guide was murdered. Jack and Alison both wonder: What is Kingfisher Lodge, really? When at last they learn the awful truth, Jack erupts: "He wanted to bust whatever was happening here as fast as he could. . . . Barring that, he wanted to kill."
Read the original here:
Mysteries: Louise Pennys The Madness of Crowds Review - The Wall Street Journal
Letters: what caused the deaths of Indigenous children? – The B.C. Catholic
Posted: at 12:12 pm
I'm a 100-year-old retired RN and remember my days on the hospital children's ward. Every winter we had many Indigenous children with respiratory problems. It seems they had poor resistance to the diseases of the white settlers.
Now can you imagine a large group of Indigenous children in a school? Everything was foreign to them: the people, the lifestyle and the diet. And here they were crowded together.
Colds spread like wildfire, as did other diseases, as they had no resistance to our ailments. All the schools had were a few caretakers: no medicine, no nurses, no medical care at all.
How did the caretakers manage the institutions, feed the children, and care for the sick without any help? It's very likely that this residential school lifestyle caused a heavy mortality rate.
Some children ran away. Did they reach safety or perish? Were there some suicides? All these questions and conditions must be considered before passing judgment on the caretakers of these institutions.
C.M. Bourgeois, Victoria
Studies show that more than 40 per cent of women who have had an abortion were churchgoers when they ended their pregnancy. That means the fight for life starts right in our own churches by being a voice of hope to the ones we sit next to every Sunday.
In addition, 77 per cent of Canadians think Canada already has an abortion law, yet it is the only democratic country in the world with no abortion law. You can have an abortion for any reason or no reason at all, all the way up to the moment of birth.
God is pro-life and against euthanasia. With the federal election coming up, let's remember which parties allow pro-life MPs and oppose euthanasia.
Dean Clark, Langley
David Baird's Aug. 2 review of The Chosen needs to be counterbalanced with some due skepticism.
The adulation the show has received, even when muted as is the case of the B.C. Catholic review, is in some sense troubling. The Chosen is a poetic interpretation of the Gospels that often shares more in common with daytime soap operas and late-night sitcoms than with the biblical source material.
Is this wrong? Perhaps not. We have been making art out of the Gospel narratives for a long time. But The Chosen's modern twists make it feel different somehow.
It is always trying to make you feel something, which again is not wrong in and of itself, but given that the show is primarily a Protestant production, Catholics should be wary of how this reality impacts their perception of the Gospels more broadly. Catholics do not believe an evangelical conversion experience is necessary to have a relationship with Christ, and it is significant that by the end of the second season there is not a single character who comes to Christ through any other means.
The reason the show is so compelling is the very reason we should approach it with caution: it takes significant poetic liberties with characters and events from the Gospels. Too many people are saying the show is an authentic representation of the Gospel stories, and the fact that most laymen will have a difficult time separating the historical truths from the fictions is not something we should gloss over.
Of course, watch the show. I would never presume to disagree with David Baird's careful recommendation, but don't be surprised if suddenly the Matthew in your Bible becomes the Matthew from The Chosen.
Nicholas Elbers, Maple Ridge
Excerpt from:
Letters: what caused the deaths of Indigenous children? - The B.C. Catholic
Sound Health: There’s A Lot That You Don’t Know About Hospice – WGLT
Posted: at 12:12 pm
Watching a loved one die from a terminal illness is one of the most difficult human experiences to endure. Hospice is there to make it easier. Yet there are many misconceptions about what hospice is, and isn't.
In this installment of the Sound Health series, WGLT spoke to two leaders with OSF HealthCare's Hospice program: social worker Laura Baker and manager Michelle Jackson.
What hospice is, Jackson said, is simply when you've decided not to seek further treatment for a terminal illness.
"Usually the doctor thinks you have six months or less to live, but we do have some patients who live longer. With the support of us going into the home frequently, we can help them live a good, quality life. We're not about having things go quicker than intended. We just are there to support and provide comfort during the last days," said Jackson, who is also a nurse.
That last part is really important, Baker said.
"People equate hospice with euthanasia. And so we do a lot of educating with patients and family members in the home saying, 'No, we don't do that. We're here to provide support, maintain and manage symptoms, keep people comfortable, and give them as many good days as we can,'" Baker said.
There are other misconceptions, like that hospice is just for patients with cancer. It's not. It's for anyone with a chronic, end-stage illness.
And you don't need to be imminently dying. Actually, it's better to get connected with hospice sooner, so you can build a relationship with the team.
And Baker says there's another one: People think hospice is a place, rather than a service that comes to you, whether thats your home in the community, or an assisted-living or nursing home level of care. There is a Hospice Home within OSF in Peoria, so there is a place so to speak. But hospice overall is a service, its not a place where people go, Baker said.
Your vision of hospice may be someone on their deathbed, unable to get up.
That's part of it but not all of it. Some in hospice care like getting out into the world. Some set a goal for an out-of-state trip. Hospice is a Medicare-reimbursed program, so OSF can set up a travel contract with a hospice agency in, say, Florida.
"It's helping people to accomplish a goal they want to accomplish. They want to take that last trip they haven't been able to do. Sometimes it's celebrating a birthday, or they want to go see a wedding, or, 'Oh golly, I just want to see the baby born.' We had a patient who still wanted to get out and about and ride on a motorcycle. We like celebrating with them when they're able to meet those goals, those milestones," Baker said.
As a social worker, Baker works closely with the caregivers of those who are dying. Yes, it's a lot of adult sons and daughters caring for mom or dad, but they run the gamut more than you'd expect.
"We have folks that have their own young children and they're trying to raise their family, and work, and care for mom or dad. We are encountering grandchildren caring for grandparents," Baker said. "Unfortunately, we do encounter parents caring for children. Those may be infants, toddlers, grade schoolers, high schoolers. Sometimes it's adult children. We do get folks that have some developmental disabilities, in addition to whatever diagnosis brought them to hospice."
Hospice staff are interacting with those caregivers at one of the hardest points in their life.
"Oftentimes our caregivers are quite frankly very stressed. They're overwhelmed. Because they have so many of their own responsibilities, and then they have the care and the concerns for their loved one," Baker said. "Many of them are trying to care for their loved one in their loved one's home, or they've brought them into their own home. There are multiple challenges that come along with that."
Hospice manager Michelle Jackson said her team is there as much for the family as they are the patient.
"We teach them how to take care of their loved one, because we're not there 24/7. So we come in, we educate, and try to get them the support they need. If they need paid caregivers to come into the home, if there are physical limitations, we try to arrange that. Or try to use community resources that are available, or VA resources if they're a veteran," Jackson said.
Jackson and Baker and a few dozen other colleagues care for between 70 and 100 people who are in hospice in OSF's Eastern region at any given time, including in Bloomington-Normal.
"This work is very difficult, but it's the most rewarding thing you've ever done," Jackson said. "You just get that sense when you see patients, and you know they have a good quality of life, a good passing, and the family feels supported. That recognition is what keeps you going to the next patient."
You can learn more about hospice on OSF's website or by calling (309) 451-5925. Hospice would set up an assessment visit to see if you or your family member are hospice-appropriate.
Excerpt from:
Sound Health: There's A Lot That You Don't Know About Hospice - WGLT
PETA Hypocracy – AG INFORMATION NETWORK OF THE WEST – AGInfo Ag Information Network Of The West
Posted: at 12:12 pm
Will Coggin, managing director of PETAKillsAnimals.com, talked about the way PETA thinks about livestock and ranchers. "PETA would argue that the conditions that cattle are raised in are terrible, and even if they were raised in a five-star hotel with massages, PETA would say you are still killing them for food and that is wrong."
PETA's Shelter Killed More Than 1,700 Pets in 2020
Despite National Surge in Pet Adoptions, PETA's Kill Mill Put Down 10% More Pets than in 2019
Washington, D.C. (February 17, 2021) - PETA's lone animal shelter, located at the organization's headquarters in Norfolk, Virginia, euthanized 1,759 cats, dogs, and other pets in 2020. This means that PETA euthanized nearly two in three of the animals it received over the course of 2020.
PETA only adopted out 41 pets, about 1.5% of the total number of animals it received last year. This minuscule number becomes more egregious considering that 2020 saw U.S. shelters experience record-high numbers of adoptions. The figures come from animal custody records filed by PETA with the Virginia Department of Agriculture and Consumer Services.
According to the same database, the average euthanasia rate of private animal shelters in Virginia is less than 6%, more than ten times lower than PETA's. Of all the animals killed at private animal shelters in the state, 74% were killed by PETA.
View original post here:
PETA Hypocracy - AG INFORMATION NETWORK OF THE WEST - AGInfo Ag Information Network Of The West
Emory students advance artificial intelligence with a bot that aims to serve humanity – SaportaReport
Posted: at 12:12 pm
A team of six Emory computer science students is helping to usher in a new era in artificial intelligence. They've developed a chatbot capable of making logical inferences that aims to hold deeper, more nuanced conversations with humans than have previously been possible. They've christened their chatbot Emora, because it sounds like a feminine version of Emory and is similar to a Hebrew word for an eloquent sage.
The team is now refining their new approach to conversational AI: a logic-based framework for dialogue management that can be scaled to conduct real-life conversations. Their longer-term goal is to use Emora to assist first-year college students, helping them to navigate a new way of life and deal with day-to-day issues, and guiding them to proper human contacts and other resources when needed.
Eventually, they hope to further refine their chatbot, developed during the era of COVID-19 with the philosophy "Emora cares for you," to assist people dealing with social isolation and other issues, including anxiety and depression.
The Emory team is headed by graduate students Sarah Finch and James Finch, along with faculty advisor Jinho Choi, associate professor in the Department of Computer Sciences. The team also includes graduate student Han He and undergraduates Sophy Huang, Daniil Huryn and Mack Hutsell. All the students are members of Choi's Natural Language Processing Research Laboratory.
"We're taking advantage of established technology while introducing a new approach in how we combine and execute dialogue management so a computer can make logical inferences while conversing with a human," Sarah Finch says.
"We believe that Emora represents a groundbreaking moment for conversational artificial intelligence," Choi adds. "The experience that users have with our chatbot will be largely different from chatbots based on traditional, state-machine approaches to AI."
Last year, Choi and Sarah and James Finch headed a team of 14 Emory students that took first place in Amazon's Alexa Prize Socialbot Grand Challenge, winning $500,000 for their Emora chatbot. The annual Alexa Prize challenges university students to make breakthroughs in the design of chatbots, also known as socialbots: software apps that simplify interactions between humans and computers by allowing them to talk with one another.
This year, they developed a completely new version of Emora with the new team of six students.
They made the bold decision to start from scratch, instead of building on the state-machine platform they developed in 2020 for Emora. "We realized there was an upper limit to how far we could push the quality of the system we developed last year," Sarah Finch says. "We wanted to do something much more advanced, with the potential to transform the field of artificial intelligence."
They based the current Emora on three types of frameworks to advance core natural language processing technology, computational symbolic structures, and probabilistic reasoning for dialogue management.
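The article does not publish Emora's code, so the following is only an outside illustration of the general idea behind logic-based dialogue management: utterances are reduced to symbolic facts, inference rules derive new facts, and the derived facts steer the reply. All predicates, rules, and responses here are hypothetical, and the probabilistic-reasoning layer the team mentions is omitted.

```python
# Toy sketch of symbolic dialogue inference (illustrative only, not Emora's code).
# Facts are (predicate, subject, object) triples; unary facts repeat the entity.

def forward_chain(facts, rules):
    """Derive new facts by applying single-antecedent rules until fixpoint."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for if_pred, then_pred in rules:
            for pred, subj, obj in list(derived):
                if pred == if_pred:
                    new_fact = (then_pred, subj, obj)
                    if new_fact not in derived:
                        derived.add(new_fact)
                        changed = True
    return derived

def choose_response(derived):
    """Pick a reply based on inferred facts rather than surface keywords."""
    for pred, subj, obj in derived:
        if pred == "cares_about" and ("is_deceased", obj, obj) in derived:
            return f"I'm so sorry about your {obj}."
    return "Tell me more."

# "My dog died" might yield these facts after language understanding:
facts = {("owns", "user", "dog"), ("is_deceased", "dog", "dog")}
# Hypothetical rule: owning something implies caring about it.
rules = [("owns", "cares_about")]

derived = forward_chain(facts, rules)
print(choose_response(derived))  # prints: I'm so sorry about your dog.
```

The point of the sketch is that the sympathetic reply follows from an inferred fact (cares_about) rather than from a hand-scripted state transition, which is the kind of flexibility the article contrasts with traditional state-machine chatbots.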
They worked around the clock, making it into the Alexa Prize finals in June. They did not complete most of the new system, however, until just a few days before they had to submit Emora to the judges for the final round of the competition.
That gave the team no time to put finishing touches on the new system, work out the bugs, and flesh out the range of topics that it could engage in deeply with a human. While they did not win this year's Alexa Prize, the strategy led them to develop a system with more potential to open new doors for AI.
In the run-up to the finals, users of Amazons virtual assistant, known as Alexa, volunteered to test out the competing chatbots, which were not identified by their names or universities. A chatbots success was gauged by user ratings.
"The competition is extremely valuable because it gave us access to a high volume of people talking to our bot from all over the world," James Finch says. "When we wanted to try something new, we didn't have to wait long to see whether it worked. We immediately got this deluge of feedback so that we could make any needed adjustments. One of the biggest things we learned is that what people really want to talk about is their personal experiences."
Sarah and James Finch, who married in 2019, are the ultimate computer power couple. They met at age 13 in a math class in their hometown of Grand Blanc, Michigan. They were dating by high school, bonding over a shared love of computer programming. As undergraduates at Michigan State University, they worked together on a joint passion for programming computers to speak more naturally with humans.
"If we can create more flexible and robust dialogue capability in machines," Sarah Finch explains, "a more natural, conversational interface could replace pointing, clicking and hours of learning a new software interface. Everyone would be on a more equal footing because using technology would become easier."
She hopes to pursue a career in enhancing computer dialogue capabilities with private industry after receiving her PhD.
James Finch is most passionate about the intellectual aspects of solving problems and is leaning towards a career in academia after receiving his PhD.
The Alexa Prize deadlines required the couple to work many 60-hour-plus weeks on developing Emora's framework, but they didn't consider it a grind. "I've enjoyed every day," James Finch says. "Doing this kind of dialogue research is our dream and we're living it. We are making something new that will hopefully be useful to the world."
They chose to come to Emory for graduate school because of Choi, an expert in natural language processing, and Eugene Agichtein, professor in the Department of Computer Science and an expert in information retrieval.
Emora was designed not just to answer questions, but as a social companion.
A caring chatbot was an essential requirement for Choi. At the end of every team meeting, he asks one member to say something about how the others have inspired them. "When someone sees a bright side in us, and shares it with others, everyone sees that side and that makes it even brighter," he says.
Choi's enthusiasm is also infectious.
Growing up in Seoul, South Korea, he knew by the age of six that he wanted to design robots. "I remember telling my mom that I wanted to make a robot that would do homework for me so I could play outside all day," he recalls. "It has been my dream ever since. I later realized that it was not the physical robot, but the intelligence behind the robot that really attracted me."
The original Emora was built on a behavioral mathematical model similar to a flowchart and equipped with several natural language processing models. Depending on what people said to the chatbot, the machine made a choice about what path of a conversation to go down. While the system was good at chitchat, the longer a conversation went on, the greater the chance that the system would miss a social-linguistic nuance and the conversation would go off the rails, diverting from the logical thread.
This year, the Emory team designed Emora so that she could go beyond a script and make logical inferences. Rather than a flowchart, the new system breaks a conversation down into concepts and represents them using a symbolic graph. A logical inference engine allows Emora to connect the graph of an ongoing conversation to other symbolic graphs that represent a bank of knowledge and common sense. The longer a conversation continues, the more its ability to make logical inferences grows.
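The graph-based reasoning described above can be illustrated with a toy sketch. The triples, rule, and function below are hypothetical illustrations of the general technique, not Emora's actual code: conversation concepts and commonsense knowledge are both stored as (subject, relation, object) triples, and an inference rule joins them to produce a new fact.

```python
# Toy sketch of graph-based dialogue inference (hypothetical; not Emora's code).
# Both the conversation and a commonsense bank are sets of
# (subject, relation, object) triples; a simple rule joins them.

conversation = {("user", "owns", "dog"), ("user", "feels", "happy")}
knowledge = {("dog", "is_a", "pet"), ("pet", "provides", "companionship")}

def infer(conv, kb):
    """If the user owns X and X is a pet, infer the user has companionship."""
    inferred = set()
    for (s, r, o) in conv:
        if r == "owns" and (o, "is_a", "pet") in kb:
            inferred.add((s, "has", "companionship"))
    return inferred

print(infer(conversation, knowledge))
# A real system would chain many such rules over a much larger graph,
# so inferences compound as the conversation graph grows.
```

A production inference engine would apply many rules repeatedly until no new triples appear; this sketch shows only a single rule application.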
Sarah and James Finch worked on the engineering of the new Emora system, as well as designing logic structures and implementing related algorithms. Undergraduates Sophy Huang, Daniil Huryn and Mack Hutsell focused on developing dialogue content and conversational scripts for integrating within the chatbot. Graduate student Han He focused on structure parsing, including recent advances in the technology.
"A computer cannot deal with ambiguity; it can only deal with structure," Han He explains. "Our parser turns the grammar of a sentence into a graph, a structure like a tree, that describes what a chatbot user is saying to the computer."
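The idea of turning a sentence's grammar into a tree-like structure can be sketched with a toy constituency-style split. The grammar below is a deliberately simplistic illustration, not the lab's parser:

```python
# Toy illustration of mapping a sentence onto a tree-like structure
# (hypothetical; real parsers use trained grammars or neural models).

def toy_parse(sentence):
    """Split a simple 'subject verb object' sentence into a nested tuple tree."""
    subject, verb, *rest = sentence.split()
    return ("S",
            ("NP", subject),                      # noun phrase: the subject
            ("VP", ("V", verb),                   # verb phrase: verb + object
                   ("NP", " ".join(rest))))

print(toy_parse("users like pizza"))
# ('S', ('NP', 'users'), ('VP', ('V', 'like'), ('NP', 'pizza')))
```

Once a sentence is in this structured form, downstream components can traverse it instead of wrestling with raw, ambiguous text.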
He is passionate about language. Growing up in a small city in central China, he studied Japanese with the goal of becoming a linguist. His family was low income, so he taught himself computer programming and picked up odd programming jobs to help support himself. In college, he found a new passion in the field of natural language processing, or using computers to process human language.
His linguistic background enhances his technological expertise. "When you learn a foreign language, you get new insights into the role of grammar and word order," He says. "And those insights can help you to develop better algorithms and programs to teach computers how to understand language. Unfortunately, many people working in natural language processing focus primarily on mathematics without realizing the importance of grammar."
After getting his master's at the University of Houston, He chose to come to Emory for a PhD to work with Choi, who also emphasizes linguistics in his approach to natural language processing. He hopes to make a career of using artificial intelligence as an educational tool that can help give low-income children an equal opportunity to learn.
A love of language also brought senior Mack Hutsell into the fold. A native of Houston, he came to Emory's Oxford College to study English literature. His second love is computer programming and coding. When Hutsell discovered the digital humanities, which use computational methods to study literary texts, he decided on a double major in English and computer science.
"I enjoy thinking about language, especially language in the context of computers," he says.
Choi's Natural Language Processing Lab and the Emora project were a natural fit for him.
Like the other undergraduates on the team, Hutsell did miscellaneous tasks for the project while also creating content that could be injected into Emora's real-world knowledge graph. On the topic of movies, for instance, he started with an IMDB dataset. The team had to combine concepts from possible conversations about the movie data in ways that would fit into the knowledge graph template and generate unique responses from the chatbot. "Thinking about how to turn metadata and numbers into something that sounds human is a lot of fun," Hutsell says.
Language was also a key draw for senior Daniil Huryn. He was born in Belarus, moved to California with his family when he was four, and then returned to Belarus when he was 10, staying until he completed high school. He speaks English, Belarusian and Russian fluently and is studying German.
"In Belarus, I helped translate at my church," he says. "That got me thinking about how different languages work differently and that some are better at saying different things."
Huryn excelled in computer programming and astronomy in his studies in Belarus. His interests also include reading science fiction and playing video games. He began his Emory career on the Oxford campus, and eventually decided to major in computer science and minor in physics.
For the Emora project, he developed conversations about technology, including a component on AI, and another about how people were adapting to life during the pandemic.
"The experience was great," Huryn says. "I helped develop features for the bot while I was taking a course in natural language processing. I could see how some of the things I was learning about were coming together into one package to actually work."
Team member Sophy Huang, also a senior, grew up in Shanghai and came to Emory planning to go down a pre-med track. She soon realized, however, that she did not have a strong enough interest in biology and decided on a dual major in applied mathematics and statistics and psychology. Working on the Emora project also taps into her passions for computer programming and developing applications that help people.
"Psychology plays a big role in natural language processing," Huang says. "It's really about investigating how people think, talk and interact, and how those processes can be integrated into a computer."
Food was one of the topics Huang developed for Emora to discuss. "The strategy was first to connect with users by showing understanding," she says.
For instance, if someone says pizza is their favorite food, Emora would acknowledge their interest and ask what it is about pizza that they like so much.
"By continuously acknowledging and connecting with the user, asking for their opinions and perspectives and sharing her own, Emora shows that she understands and cares," Huang explains. "That encourages them to become more engaged and involved in the conversation."
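The acknowledge-then-ask pattern Huang describes could be sketched as a minimal response template. The function and wording below are hypothetical illustrations of the strategy, not Emora's actual dialogue content:

```python
# Minimal sketch of an acknowledge-and-follow-up response template
# (hypothetical illustration of the strategy; not Emora's actual scripts).

def respond_to_favorite(topic, item):
    """Acknowledge the user's stated preference, then invite elaboration."""
    return (f"{item.capitalize()} is a great choice of {topic}! "
            f"What do you like most about {item}?")

print(respond_to_favorite("food", "pizza"))
# Pizza is a great choice of food! What do you like most about pizza?
```

A real system would select among many such templates and ground the follow-up question in its knowledge graph rather than a fixed string.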
The Emora team members are still at work putting the finishing touches on their chatbot.
"We created most of the system that has the capability to do logical thinking, essentially the brain for Emora," Choi says. "The brain just doesn't know that much about the world right now and needs more information to make deeper inferences. You can think of it like a toddler. Now we're going to focus on teaching the brain so it will be on the level of an adult."
The team is confident that their system works and that they can complete full development and integration to launch beta testing sometime next spring.
Choi is most excited about the potential to use Emora to support first-year college students, answering questions about their day-to-day needs and directing them to the proper human staff or professor as appropriate. For larger issues, such as common conflicts that arise in group projects, Emora could also serve as a starting point by sharing how other students have overcome similar issues.
Choi also has a longer-term vision that the technology underlying Emora may one day be capable of assisting people dealing with loneliness, anxiety or depression. "I don't believe that socialbots can ever replace humans as social companions," he says. "But I do think there is potential for a socialbot to sympathize with someone who is feeling down, and to encourage them to get help from other people, so that they can get back to the cheerful life that they deserve."
Posted in Artificial Intelligence
Comments Off on Emory students advance artificial intelligence with a bot that aims to serve humanity – SaportaReport
Frontier Development Lab Transforms Space and Earth Science for NASA with Google Cloud Artificial Intelligence and Machine Learning Technology – SETI…
Posted: at 12:12 pm
August 26, 2021, Mountain View, Calif. Frontier Development Lab (FDL), in partnership with the SETI Institute, NASA and private sector partners including Google Cloud, is transforming space and Earth science through the application of industry-leading artificial intelligence (AI) and machine learning (ML) tools.
FDL tackles knowledge gaps in space science by pairing ML experts with researchers in physics, astronomy, astrobiology, planetary science, space medicine and Earth science. These researchers have utilized Google Cloud compute resources and expertise since 2018, specifically AI/ML technology, to address research challenges in areas like astronaut health, lunar exploration, exoplanets, heliophysics, climate change and disaster response.
With access to compute resources provided by Google Cloud, FDL has been able to scale the typical ML pipeline by more than 700 times in the last five years, facilitating new discoveries and improved understanding of our planet, solar system and the universe. Throughout this period, Google Cloud's Office of the CTO (OCTO) has provided ongoing strategic guidance to FDL researchers on how to optimize AI/ML and how to use compute resources most efficiently.
With Google Cloud's investment, recent FDL achievements include:
"Unfettered on-demand access to massive super-compute resources has transformed the FDL program, enabling researchers to address highly complex challenges across a wide range of science domains, advancing new knowledge, new discoveries and improved understandings in previously unimaginable timeframes," said Bill Diamond, president and CEO, SETI Institute. "This program, and the extraordinary results it achieves, would not be possible without the resources generously provided by Google Cloud."
"When I first met Bill Diamond and James Parr in 2017, they asked me a simple question: What could happen if we marry the best of Silicon Valley and the minds of NASA?" said Scott Penberthy, director of Applied AI at Google Cloud. "That was an irresistible challenge. We at Google Cloud simply shared some of our AI tricks and tools, one engineer to another, and they ran with it. I'm delighted to see what we've been able to accomplish together, and I am inspired by what we can achieve in the future. The possibilities are endless."
FDL leverages AI technologies to push the frontiers of science research and develop new tools to help solve some of humanity's biggest challenges. FDL teams are composed of doctoral and post-doctoral researchers who use AI/ML to tackle ground-breaking challenges. Cloud-based supercomputing resources mean that FDL teams achieve results in eight-week research sprints that would not be possible in even year-long programs with conventional compute capabilities.
"High-performance computing is normally constrained by the large amount of time, limited availability and cost of running AI experiments," said James Parr, director of FDL. "You're always in a queue. Having a common platform to integrate unstructured data and train neural networks in the cloud allows our FDL researchers from different backgrounds to work together on hugely complex problems with enormous data requirements, no matter where they are located."
Better integrating science and ML is the founding rationale and future north star of FDL's partnership with Google Cloud. ML is particularly powerful for space science when paired with a physical understanding of a problem space. The gap between what we know so far and what we collect as data is an exciting frontier for discovery and something AI/ML and cloud technology are poised to transform.
You can learn more about FDL's 2021 program here.
The FDL 2021 showcase presentations can be watched as follows:
In addition to Google Cloud, FDL is supported by partners including Lockheed Martin, Intel, Luxembourg Space Agency, MIT Portugal, Lawrence Berkeley National Lab, USGS, Microsoft, NVIDIA, Mayo Clinic, Planet and IBM.
About the SETI Institute
Founded in 1984, the SETI Institute is a non-profit, multidisciplinary research and education organization whose mission is to lead humanity's quest to understand the origins and prevalence of life and intelligence in the universe and share that knowledge with the world. Our research encompasses the physical and biological sciences and leverages expertise in data analytics, machine learning and advanced signal detection technologies. The SETI Institute is a distinguished research partner for industry, academia and government agencies, including NASA and NSF.
Contact Information:
Rebecca McDonald
Director of Communications
SETI Institute
rmcdonald@SETI.org
DOWNLOAD FULL PRESS RELEASE HERE.
Embedding Gender in International Humanitarian Law: Is Artificial Intelligence Up to the Task? – Just Security
During armed conflict, unequal power relations and structural disadvantages derived from gender dynamics are exacerbated. There has been increased recognition of these dynamics during the last several decades, particularly in the context of sexual and gender-based violence in conflict, as exemplified in United Nations Security Council Resolution 1325 on Women, Peace, and Security. Though initiatives like this resolution are a positive advancement towards the recognition of discrimination against women and the structural disadvantages they suffer during armed conflict, other aspects of armed conflict, including, notably, the use of artificial intelligence (AI) for targeting purposes, have remained resistant to insights related to gender. This is particularly problematic in the operational aspect of international humanitarian law (IHL), which contains rules on targeting in armed conflict.
The Gender Dimensions of Distinction and Proportionality
Some gendered dimensions of the application of IHL have long been recognized, especially in the context of rape and other categories of sexual violence against women during armed conflict. A great deal of attention has therefore been paid to ensuring accountability for crimes of sexual violence during times of armed conflict, while other aspects of conflict, such as the operational aspect of IHL, have remained overlooked.
In applying the principle of distinction, which requires distinguishing civilians from combatants (only the latter of whom may be the target of a lawful attack), gendered assumptions of who is a threat have often played an important role. In modern warfare, often characterized by asymmetry and urban conflict and where combatants can blend in with the civilian population, some militaries and armed groups have struggled to reliably distinguish civilians. Due to gendered stereotypes of the expected behavior of women and men, gender has operated as a de facto qualified identity that supplements the category of civilian. In practice this can mean that, for women to be targeted, IHL requirements are rigorously applied. Yet in the case of young civilian males, the bar seems to be lower: gender considerations, coupled with other factors such as geographical location, expose them to a greater risk of being targeted.
An illustrative example of this application of the principle of distinction is so-called "signature strikes," a subset of drone strikes adopted by the United States outside what it considers to be areas of active hostilities. Signature strikes target persons who are not on traditional battlefields without individually identifying them, but rather based only on "patterns of life." According to reports on these strikes, it is sufficient that the persons targeted fit into the category of military-aged males who live in regions where terrorists operate and whose behavior is assessed to be similar enough to that of terrorists to mark them for death. However, as the organization Article 36 notes, due to the lack of transparency around the use of armed drones in signature strikes, it is difficult to determine in more detail what standards the U.S. government uses to classify certain individuals as legal targets. According to a New York Times report from May 2012, in counting casualties from armed drone strikes, the U.S. government reportedly recorded all military-age males in a strike zone as combatants "unless there is explicit intelligence posthumously proving them innocent."
However, once a target is assessed as a valid military objective, the impact of gender is reversed in conducting a proportionality assessment. The principle of proportionality requires ensuring the anticipated harm to civilians and civilian objects is not excessive compared to the anticipated military advantage of an attack. But in weighing the anticipated advantage against the anticipated civilian harms, the calculated military advantage can include the expected reduction of the commander's own combatant casualties; in other words, the actual loss of civilian lives can be offset by the avoidance of prospective military casualties. This creates the de facto result that the lives of combatants, the vast majority of whom are men, are weighed as more important than those of civilians, who in a battlefield context are often disproportionately women. Taking these applications of IHL into account, we can conclude that a gendered dimension is present in the operational aspect of this branch of law.
AI Application of IHL Principles
New technologies, particularly AI, have been increasingly deployed to assist commanders in their targeting decisions. Specifically, machine-learning algorithms are being used to process massive amounts of data to identify rules or patterns, drawing conclusions about individual pieces of information based on these patterns. In warfare, AI already supports targeting decisions in various forms. For instance, AI algorithms can estimate collateral damage, thereby helping commanders undertake the proportionality analysis. Likewise, some drones have been outfitted with AI to conduct image recognition and are currently being trained to scan urban environments to find hidden attackers; in other words, to distinguish between civilians and combatants as required by the principle of distinction.
Indeed, in modern warfare, the use of AI is expanding. For example, in March 2021 the National Security Commission on AI, a U.S. congressionally-mandated commission, released a report highlighting how, in the future, AI-enabled technologies are going to permeate every facet of warfighting. It also urged the Department of Defense to integrate AI into critical functions and existing systems in order to become an AI-ready force by 2025. As Neil Davison and Jonathan Horowitz note, as the use of AI grows, it is crucial to ensure that its development and deployment (especially when coupled with the use of autonomous weapons) comply with civilian protection.
Yet even if IHL principles can be translated faithfully into the programming of AI-assisted military technologies (a big and doubtful "if"), such translation will reproduce or even magnify the disparate, gendered impacts of IHL application identified previously. As the case of drones used to undertake signature strikes demonstrates, the integration of new technologies in warfare risks importing, and in the case of AI, potentially magnifying and cementing, the gendered injustices already embodied in the application of existing law.
Gendering Artificial Intelligence-Assisted Warfare
There are several reasons that AI may end up reifying and magnifying gender inequities. First, the algorithms are only as good as their inputs, and those underlying data are problematic. To work properly, AI needs massive amounts of data. However, neither the collection nor the selection of these data is neutral. In less deadly application domains, such as mortgage loan decisions or predictive policing, there have been demonstrated instances of gender (and other) biases, both of the programmers and of the individuals tasked with classifying data samples, or even in the data sets themselves (which often contain more data on white, male subjects).
Perhaps even more difficult to identify and correct than individuals' biases are instances of machine learning that replicate and reinforce historical patterns of injustice merely because those patterns appear, to the AI, to provide useful information rather than undesirable noise. As Noel Sharkey notes, "the societal push towards greater fairness and justice is being held back by historical values about poverty, gender and ethnicity that are ossified in big data." There is no reason to believe that bias in targeting data would be any different or any easier to find.
This means that historical human biases can and do lead to incomplete or unrepresentative training data. For example, a predictive algorithm used to apply the principle of distinction on the basis of target profiles, together with other intelligence, surveillance, and reconnaissance tools, will be gender-biased if the input data equate military-aged men with combatants and disregard other factors. As the practice of signature drone strikes has demonstrated, automatically classifying men as combatants and women as vulnerable has led to mistakes in targeting. As the use of machine learning in targeting expands, these biases will be amplified if not corrected for, with each strike providing increasingly biased data.
To mitigate this result, it is critical to ensure that the data collected are diverse, accurate, and disaggregated, and that algorithm designers reflect on how the principles of distinction and proportionality can be applied in gender-biased ways. High-quality data collection means, among other things, ensuring that the data are disaggregated by gender; otherwise it will be impossible to learn what biases are operating behind the assumptions used, what works to counter those biases, and what does not.
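Why disaggregation matters can be shown with a toy error audit (all numbers below are invented for illustration): an aggregate accuracy figure can hide sharply different error rates across groups.

```python
# Toy audit showing why gender-disaggregated data matters (numbers invented).
# Each record is (group, was_prediction_correct).
records = ([("men", True)] * 90 + [("men", False)] * 10
           + [("women", True)] * 5 + [("women", False)] * 15)

def accuracy(rows):
    """Fraction of rows where the prediction was correct."""
    return sum(ok for _, ok in rows) / len(rows)

overall = accuracy(records)  # looks acceptable in aggregate
by_group = {g: accuracy([r for r in records if r[0] == g])
            for g in ("men", "women")}

print(round(overall, 2))  # 0.79
print(by_group)           # men: 0.9, women: 0.25 -- the aggregate hid this gap
```

Only the disaggregated view reveals that the hypothetical system performs far worse for one group, which is exactly the failure mode the text warns about.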
Ensuring high-quality data also requires collecting more and different types of data, including data on women. In addition, because AI tools reflect the biases of those who build them, ensuring that female employees hold technical roles and that male employees are fully trained to understand gender and other biases is also crucial to mitigating data biases. Incorporating gender advisors would also be a positive step to ensure that the design of the algorithm, and the interpretation of what the algorithm recommends or suggests, considers gender biases and dynamics.
However, issues of data quality are subsidiary to larger questions about the possibility of translating IHL into code and, even if this translation is possible, the further difficulty of incorporating gender considerations into IHL code. Encoding gender considerations into AI is challenging, to say the least, because gender is both a societal and an individual construction. Likewise, the process of developing AI is not neutral, as it has both politics and ethics embedded in it, as demonstrated by documented incidents of AI encoding biases. Finally, the very rules and principles of modern IHL were drafted when structural discrimination against women was not acknowledged, or was viewed as natural or beneficial. As a result, when considering how to translate IHL into code, it is essential to incorporate critical gender perspectives into the interpretation of the norms and laws related to armed conflict.
Gendering IHL: An Early Attempt and Work to be Done
An example of the kind of critical engagement with IHL that will be required is provided by the updated International Committee of the Red Cross (ICRC) Commentary on the Third Geneva Convention. Through the incorporation of particular considerations of "gender-specific risks and needs" (para. 1747), the updated commentary has reconsidered outdated baseline gender assumptions, such as the idea that women have non-combatant status by default, or that women must receive special consideration because they have less "resilience, agency or capacity" (para. 1682). This shift has demonstrated that it is not only desirable, but also possible to include a gender perspective in the interpretation of the rules of warfare. It also underscores the urgent need to revisit the IHL targeting principles of distinction and proportionality to assess how their application impacts genders differently, so that any algorithms developed to execute IHL principles incorporate these insights from the start.
As a first cut at this reexamination, it is essential to reassert that principles of non-discrimination also apply to IHL, and must be incorporated into any algorithmic version of these rules. In particular, the principle of distinction allows commanders to lawfully target only those identified as combatants or those who directly participate in hostilities. Article 50 of Additional Protocol I to the Geneva Conventions defines civilians in a negative way, meaning that civilians are those who do not belong to the category of combatants, and IHL makes no reference to gender as a signifier of identity for the purpose of assessing whether a given individual is a combatant. In this regard, being a military-aged male cannot be a shortcut to the identification of combatants. Men make up the category of civilians as well. As Maya Brehm notes, "there is scope for categorical targeting within a conduct of hostilities framework, but the principle of non-discrimination continues to apply in armed conflict. Adverse distinction based on race, sex, religion, national origin or similar criteria is prohibited."
Likewise, in any attempt to translate the principle of proportionality into code, there must be recognition of and correction for the gendered impacts of current proportionality calculations. For example, across Syria between 2011 and 2016, 75 percent of the civilian women killed in conflict-related violence were killed by shelling or aerial bombardment. In contrast, 49 percent of the civilian men killed in war-related violence were killed by shelling or aerial bombardment; men were killed more often by shooting. This suggests that particular tactics and weapons have disparate impacts on civilian populations that break down along gendered lines. The study's authors note that the evolving tactics used by Syrian, opposition, and international forces in the conflict contributed to a decrease in the proportion of casualties who were combatants, as the use of shelling and bombardment, two tactics shown to have high rates of civilian casualties, especially among women and children, increased over time. The study's authors also note, however, that changing patterns of civilian and combatant behavior may partially explain the increasing rates of women compared to men among civilian casualties: "A possible contributor to increasing proportions of women and children among civilian deaths could be that numbers of civilian men in the population decreased over time as some took up arms to become combatants."
As currently understood, IHL does not require an analysis of the gendered impacts of, for example, the choice of aerial bombardment versus shooting. Yet this research suggests that selecting aerial bombardment as a tactic will result in more civilian women than men being killed (nearly 37 percent of the women killed in the conflict versus 23 percent of the men). Selecting shooting as a tactic produces the opposite result, with 23 percent of civilian men killed by shooting compared to 13 percent of women. There is no "right" proportion of civilian men and women killed by a given tactic, but these disparities have profound, real-world consequences for civilian populations during and after conflict that are simply not considered under current rules of proportionality and distinction.
In this regard, although using force protection to limit one's own forces' casualties is not forbidden, such a strategy ought to consider the effect that this policy will have on the civilian population of the opposing side, including gendered impacts. Compiling data on how a certain means or method of warfare may impact the civilian population would enable commanders to make more informed decisions. Acknowledging that the effects of weapons in warfare are gendered is the first key step. In some cases, there has been progress in incorporating a gendered lens into positive IHL, as in the case of cluster munitions, where Article 5 of the convention banning these weapons notes that States shall provide gender-sensitive assistance to victims. But most of this analysis remains rudimentary and not clearly required. In the context of developing AI-assisted technologies, reflecting on the gendered impact of the algorithm is essential during AI development, acquisition, and application.
The process of encoding IHL principles of distinction and proportionality into AI systems provides a useful opportunity to revisit application of these principles with an eye toward interpretations that take into account modern gender perspectives both in terms of how such IHL principles are interpreted and how their application impacts men and women differently. As the recent update of the ICRC Commentary on the Third Geneva Convention illustrates, acknowledging and incorporating gender-specific needs in the interpretation and suggested application of the existing rules of warfare is not only possible, but also desirable.
Disclaimer: This post has been prepared as part of a research internship at the Erasmus University Rotterdam, funded by the European Union (EU) Non-Proliferation and Disarmament Consortium as part of a larger EU educational initiative aimed at building capacity in the next generation of scholars and practitioners in non-proliferation policy and programming. The views expressed in this post are those of the author and do not necessarily reflect those of the Erasmus University Rotterdam, the EU Non-Proliferation and Disarmament Consortium or other members of the network.