
Category Archives: Transhuman News

Ketamine Promising for Rare Condition Linked to Autism – Medscape

Posted: September 14, 2022 at 1:04 am

Ketamine may be an effective treatment for children with activity-dependent neuroprotective protein (ADNP) syndrome, a rare genetic condition associated with intellectual disability and autism spectrum disorder.

Also known as Helsmoortel-Van der Aa syndrome, ADNP syndrome is caused by mutations in the ADNP gene. Studies in animal models suggest that low-dose ketamine increases expression of ADNP and is neuroprotective.

Intrigued by the preclinical evidence, Alexander Kolevzon, MD, clinical director of the Seaver Autism Center at Mount Sinai in New York, and colleagues treated 10 children with ADNP syndrome with a single low dose of ketamine (0.5 mg/kg) infused intravenously over 40 minutes. The children ranged in age from 6 to 12 years.

On parent-report instruments used to assess treatment effects, ketamine was associated with "nominally significant" improvement in a variety of domains, including social behavior, attention-deficit and hyperactivity, restricted and repetitive behaviors, and sensory sensitivities.

Parent reports of improvement in these domains aligned with clinician-rated assessments based on the Clinical Global Impressions-Improvement scale.

The results also highlight the potential utility of electrophysiological measurement of auditory steady-state response and eye-tracking to track change with ketamine treatment, the researchers say.

The study was published online August 27 in Human Genetics and Genomics Advances (HGG Advances).

Ketamine was generally well tolerated. There were no clinically significant abnormalities in laboratory or cardiac monitoring, and there were no serious adverse events (AEs).

Treatment-emergent AEs were all mild to moderate, and no child required intervention.

The most common AE was elation/silliness, reported in five children (50%), all of whom had a history of similar symptoms. Drowsiness and fatigue occurred in four children (40%), two of whom had a history of drowsiness. Aggression was likewise relatively common, reported in four children (40%), all of whom had aggression at baseline.

Decreased appetite emerged as a new AE in three children (30%), increased anxiety occurred in three children (30%), and irritability, nausea/vomiting, and restlessness each occurred in two children (20%).

The researchers caution that the findings are intended to be "hypothesis generating."

"We are encouraged by these findings, which provide preliminary support for ketamine to help reduce negative effects of this devastating syndrome," Kolevzon said in a news release from Mount Sinai.

Ketamine might help ease symptoms of ADNP syndrome "by increasing expression of the ADNP gene or by promoting synaptic plasticity through glutamatergic pathways," Kolevzon told Medscape Medical News.

The next step, he said, is to get "a larger, placebo-controlled study approved for funding using repeated dosing over a longer duration of time. We are working with the FDA to get the design approved for an investigational new drug application."

Support for the study was provided by the ADNP Kids Foundation and the Foundation for Mood Disorders. Support for mediKanren was provided by the National Center for Advancing Translational Sciences and the National Institutes of Health through the Biomedical Data Translator Program. Kolevzon is on the scientific advisory board of Ovid Therapeutics, Ritrova Therapeutics, and Jaguar Therapeutics and consults for Acadia, Alkermes, GW Pharmaceuticals, Neuren Pharmaceuticals, Clinilabs Drug Development Corporation, and Scioto Biosciences.

HGG Advances. Published online August 27, 2022. Full text


See the rest here:
Ketamine Promising for Rare Condition Linked to Autism - Medscape

Posted in Human Genetics | Comments Off on Ketamine Promising for Rare Condition Linked to Autism – Medscape

How a small, unassuming fish helps reveal gene adaptations – University of Wisconsin-Madison

Posted: at 1:04 am

Jesse Weber collects stickleback with a minnow trap in the Kenai Peninsula of Alaska. Photo by Matt Chotlos

At first blush, sticklebacks might seem a bit pedestrian. The finger-length, unassuming fish with a few small dorsal spines are a ubiquitous presence in oceans and coastal watersheds around the northern hemisphere. But these small creatures are also an excellent subject for investigating the complex dance of evolutionary adaptations.

A new study published Sept. 8 in Science sheds light on the genetic basis by which stickleback populations inhabiting ecosystems near each other developed a strong immune response to tapeworm infections, and how some populations later came to tolerate the parasites.

Evolutionary biologist Jesse Weber, a professor of integrative biology at the University of Wisconsin-Madison, is one of the study's lead authors. Sticklebacks have long been a source of fascination not only for Weber, but for biologists all over the world, so much so that the fish are among the most closely studied species.

An aerial view of an experiment in the Kenai Peninsula of Alaska studying changes in stickleback traits in response to a new environment. Photo by Andrew Hendry

"We arguably know more about stickleback ecology and evolution than any other vertebrate," says Weber.

This is in part because of sticklebacks' rich abundance in places like Western Europe, where the fish have long been involved in biological study, Weber says. But the reasons for the species' star status go well beyond happenstance.

"Sticklebacks are also just super charismatic," Weber adds, noting the species' complex courtship and territorial behaviors, as well as their diverse colors, shapes and sizes, all of which vary depending on the specific ecosystem they inhabit.

While sticklebacks' diversity provides a foothold for understanding why animals evolve different traits, their value for scientists like Weber is boosted by their genetics. The fish have approximately as many genes as humans, but their genetic material is packed much more tightly: the stickleback genome is about one-sixth the size of the human genome.

"Their genome is amazingly useful," Weber says. "As far as we can tell, it's just packed more densely. This means we can efficiently investigate their genetic diversity, allowing us to ask not only, 'Why do new traits evolve?' but also, 'How are adaptations programmed into the genome?'"

On top of all that, sticklebacks take well to captive breeding. A single female can produce hundreds of offspring multiple times over the course of just a few months.

All these traits make stickleback an almost uniquely valuable species for studying the genetic basis of many types of biological adaptations. So, when Weber arrived at UW-Madison in the fall of 2020 from the University of Alaska Anchorage, he came with an entire fish colony in tow. Living in tanks, the colony contains fish from genetically distinct populations originating from different lakes and estuaries dotting northwestern North America.

A three-spined stickleback with tapeworms recently dissected from the body of the same animal. Photo by Natalie Steinel

In their quest to understand why and how the fish sometimes evolve to look and behave very differently even in relatively nearby lake systems, Weber and his colleagues can crossbreed these populations in various ways and map changes to their genomes across multiple generations relatively quickly.

Much of Weber's scientific career to this point has focused on developing tools to make this type of work more efficient. More recently, Weber has turned to using these tools to investigate coevolution: the process by which two species adapt to the presence of one another within a shared habitat.

Specifically, Weber and his colleagues have sought to understand why sticklebacks in some lakes are much more likely to be infected with tapeworms than their counterparts in nearby lakes where the tapeworms are also present.

These investigations are beginning to bear fruit. Weber, along with colleagues at the University of Connecticut and University of Massachusetts Lowell, recently identified key genetic differences between the populations.

These differences indicate that all fish populations developed a robust immune response to the tapeworms when they first moved from the sea to new freshwater habitats near the end of the last ice age. But the immune response is costly in terms of both energy and reproduction. It also leads to a large amount of inflammation and internal scarring.

Weber's work and that of his colleagues suggest that numerous populations eventually evolved to avoid these costs by ignoring, or in the lingo of immunologists "tolerating," the parasite infestation. But the tolerant population still carries the genes that produce the immune response to the tapeworms.

While they haven't yet tested it, Weber says it appears that these sticklebacks may have mutations in these fibrosis-associated genes that render them non-functional.

While the results are exciting for Weber, he's already looking toward future research that he hopes will further tell the genetic story of sticklebacks' abundant adaptations and, by extension, reveal biological processes with implications across the wide diversity of life on Earth.

Read more about the study and its findings from the University of Connecticut.

This study was supported by the Howard Hughes Medical Institute Early Career Scientist fellowship, as well as grants from the National Institutes of Health (1R01AI123659-01A1, 1R01AI146168 and 1R35GM142891).

See the article here:
How a small, unassuming fish helps reveal gene adaptations - University of Wisconsin-Madison

Posted in Human Genetics | Comments Off on How a small, unassuming fish helps reveal gene adaptations – University of Wisconsin-Madison

How Nutrigenomics Explores Links Between Nutrition And Genes – Health Digest

Posted: at 1:04 am

Anything that changes the way individuals and medical professionals view nutrition is going to be reflected in other areas, and an obvious one is the food industry. Whatever the real difference gene variations make in terms of health, the reality is this: the more that's discovered, the more the industry will react, in different ways and on different levels.

It's already the case that foods are sold that are enriched in some way, or whose richness in certain nutrients is highlighted. At the same time, foods for specific diets that treat certain ailments, such as keto, are also available. As nutrigenomics advances, nutrition plans can be created for specific genetic groups (via Indian Journal of Horticulture).

There have long been diets and food products targeted at specific health conditions: keto, for example, is aimed at lowering blood sugar levels and tackling type 2 diabetes (per Healthline). The simplest cases are those in which a variant of a single gene has led to a disorder of some kind and there's a direct connection. Nutrigenomics, however, is more expansive, and perhaps more complex, as a number of genetic variations may each affect a different response to nutrition. It's when these multiple changes are combined that they create an outcome.

The result is food that's created to deal with these differences. A University of Auckland study, highlighted in a Healthy Food Guide article, focuses on a gene-diet factor in why Crohn's disease is more common in New Zealand, and in one area in particular. The guide explains, "The research team is studying the link between foods eaten by people with Crohn's disease and different variations of the disease-related genes. Information about lifestyle and symptoms are also collected to learn more about the disease and potentially to allow tailoring of foods to genetic type."

Read the original here:
How Nutrigenomics Explores Links Between Nutrition And Genes - Health Digest

Posted in Human Genetics | Comments Off on How Nutrigenomics Explores Links Between Nutrition And Genes – Health Digest

Scientists redefine obesity with discovery of two major subtypes – EurekAlert

Posted: at 1:04 am

Image: Dr. J. Andrew Pospisilik, Chair of the Department of Epigenetics, Van Andel Institute. Credit: Courtesy of Van Andel Institute

GRAND RAPIDS, Mich. (September 12, 2022) - A team led by Van Andel Institute scientists has identified two distinct types of obesity with physiological and molecular differences that may have lifelong consequences for health, disease and response to medication.

The findings, published today in the journal Nature Metabolism, offer a more nuanced understanding of obesity than current definitions and may one day inform more precise ways to diagnose and treat obesity and associated metabolic disorders.

The study also reveals new details about the role of epigenetics and chance in health and provides insights into the link between insulin and obesity.

"Nearly two billion people worldwide are considered overweight and there are more than 600 million people with obesity, yet we have no framework for stratifying individuals according to their more precise disease etiologies," said J. Andrew Pospisilik, Ph.D., chair of Van Andel Institute's Department of Epigenetics and corresponding author of the study. "Using a purely data-driven approach, we see for the first time that there are at least two different metabolic subtypes of obesity, each with their own physiological and molecular features that influence health. Translating these findings into a clinically usable test could help doctors provide more precise care for patients."

Currently, obesity is diagnosed using body mass index (BMI), an index correlated to body fat that is generated by comparing weight in relation to height. It is an imperfect measure, Pospisilik says, because it doesn't account for underlying biological differences and can misrepresent an individual's health status.

Using a combination of laboratory studies in mouse models and deep analysis of data from TwinsUK, a pioneering research resource and study cohort developed in the United Kingdom, Pospisilik and his collaborators discovered four metabolic subtypes that influence individual body types: two prone to leanness and two prone to obesity.

One obesity subtype is characterized by greater fat mass, while the other is characterized by both greater fat mass and greater lean muscle mass. Somewhat surprisingly, the team found that the second obesity type also was associated with increased inflammation, which can elevate the risk of certain cancers and other diseases. Both subtypes were observed across multiple study cohorts, including in children. These insights are an important step toward understanding how these different types impact disease risk and treatment response.

After the subtypes were identified in the human data, the team verified the results in mouse models. This approach allowed the scientists to compare individual mice that are genetically identical, raised in the same environment and fed the same amounts of food. The study revealed that the inflammatory subtype appears to result from epigenetic changes triggered by pure chance. They also found that there seems to be no middle ground: the genetically identical sibling mice either grew to a larger size or remained smaller, with no gradient between them. A similar pattern was seen in data from more than 150 human twin pairs, each of which was virtually identical genetically.

"Our findings in the lab almost carbon copied the human twin data. We again saw two distinct subtypes of obesity, one of which appeared to be epigenetically triggerable, and was marked by higher lean mass and higher fat, high inflammatory signals, high insulin levels, and a strong epigenetic signature," Pospisilik said.

Depending on the calculation and traits in question, only 30% to 50% of human trait outcomes can be linked to genetics or environmental influences. That means as much as half of who we are is governed by something else. This phenomenon is called unexplained phenotypic variation (UPV), and it offers both a challenge and untapped potential to scientists like Pospisilik and his collaborators.

The study indicates that the roots of UPV likely lie in epigenetics, the processes that govern when and to what extent the instructions in DNA are used. Epigenetic mechanisms are the reason that individuals with the same genetic instruction manual, such as twins, may grow to have different traits, such as eye color and hair color. Epigenetics also offer tantalizing targets for precision treatment.

"This unexplained variation is difficult to study but the payoff of a deeper understanding is immense," Pospisilik said. "Epigenetics can act like a light switch that flips genes on or off, which can promote health or, when things go wrong, disease. Accounting for UPV doesn't exist in precision medicine right now, but it looks like it could be half the puzzle. Today's findings underscore the power of recognizing these subtle differences between people to guide more precise ways to treat disease."

Pospisilik is hopeful that the team's findings will inform the development of future precision medicine strategies and lead to a version of their method that may be used in doctors' offices to better understand individual patients' health and inform care.

###

Chih-Hsiang Yang, Ph.D., and Luca Fagnocchi, Ph.D., of VAI are co-first authors of the study. Other authors include Stefanos Apostle, M.S., Vanessa Wegert, M.Sc., Ilaria Panzeri, Ph.D., Darrell P. Chandler, Ph.D., Di Lu, Ph.D., Tao Yang, Ph.D., Elizabeth Gibbons, Ph.D., Rita Guerreiro, Ph.D., and José Brás, Ph.D., of VAI; Erez Dror, Ph.D., Steffen Heyne, Ph.D., and Till Wrpel of Max Planck Institute of Immunobiology and Epigenetics; Salvador Casani-Galdón, Ph.D., of BioBam Bioinformatics; Kathrin Landgraf, Ph.D., of University of Leipzig; Martin Thomasen, Louise G. Grunnet, Ph.D., and Allan A. Vaag, M.D., Ph.D., D.MSc., of Rigshospitalet; Linn Gillberg, Ph.D., of University of Copenhagen; Elin Grundberg, Ph.D., of Children's Mercy Research Institute; Ana Conesa, Ph.D., of the Spanish National Research Council and University of Florida; Antje Körner, M.D., of University of Leipzig and Helmholtz Institute for Metabolic, Obesity and Vascular Research; and PERMUTE. The authors thank the MPI-IE Facilities and Van Andel Institute's Bioinformatics and Biostatistics Core, Genomics Core, Optical Imaging Core, Pathology and Biorepository Core, and Vivarium Core. Access to twin data was generously provided by TwinsUK, without whom this study would not have been possible.

Research reported in this publication was supported by Van Andel Institute; Max Planck Gesellschaft; the European Union's Horizon 2020 Research and Innovation Program under Marie Skłodowska-Curie grant agreement no. 675610; the Novo Nordisk Foundation and the European Foundation for the Study of Diabetes; the Danish Council for Independent Research; the National Human Genome Research Institute of the National Institutes of Health under award no. R21HG011964 (Pospisilik); and the NIH Common Fund, through the Office of the NIH Director (OD), and the National Human Genome Research Institute of the National Institutes of Health under award no. R01HG012444 (Pospisilik and Nadeau). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health or other granting organizations. Approximately 5% ($50,000) of funding for this study is from federal sources; approximately 95% ($950,000) is from non-U.S. governmental sources.

###

ABOUT VAN ANDEL INSTITUTE

Van Andel Institute (VAI) is committed to improving the health and enhancing the lives of current and future generations through cutting-edge biomedical research and innovative educational offerings. Established in Grand Rapids, Michigan, in 1996 by the Van Andel family, VAI is now home to nearly 500 scientists, educators and support staff, who work with a growing number of national and international collaborators to foster discovery. The Institute's scientists study the origins of cancer, Parkinson's and other diseases and translate their findings into breakthrough prevention and treatment strategies. Our educators develop inquiry-based approaches for K-12 education to help students and teachers prepare the next generation of problem-solvers, while our Graduate School offers a rigorous, research-intensive Ph.D. program in molecular and cellular biology. Learn more at vai.org.

Nature Metabolism

Independent phenotypic plasticity axes define distinct obesity sub-types

12-Sep-2022

More:
Scientists redefine obesity with discovery of two major subtypes - EurekAlert

Posted in Human Genetics | Comments Off on Scientists redefine obesity with discovery of two major subtypes – EurekAlert

Estimating genetics of body dimensions and activity levels in pigs using automated pose estimation | Scientific Reports – Nature.com

Posted: at 1:04 am

Ethics statement

All experimental procedures were approved by the Animal Ethics Committee of KU Leuven (P004/2020), in accordance with European Community Council Directive 86/609/EEC, the ARRIVE guidelines and the ILAR Guide to the Care and Use of Experimental Animals. Researchers obtained informed consent for publication from all identifiable persons to display and reuse videos.

The study was carried out on 794 female and 746 castrated male Piétrain × PIC Camborough pigs (Vlaamse Piétrain Fokkerij, Belgium; offspring from 73 different sires and 204 dams), which had a mean age of 83.4 (±2.2) days and a mean weight of 30.6 (±5.1) kg at the start of the experiment. Observations were made during the fattening period, which could span up to 120 days and ended when pigs reached a body weight of approximately 115 kg. Per sire, a median of 26 crossbred piglets (full-sibs and half-sibs from the same Piétrain sire) were allocated in equal numbers to two identical pens in mixed-sex groups. The pig building (experimental farm, located in Belgium) consisted of seventeen identical compartments with eight semi-slatted pens (2.5 m × 4.0 m) per compartment and on average thirteen pigs per pen (0.77 m² per pig). Food and water were provided ad libitum in each pen throughout, from one trough and one nipple drinker.

Pigs were weighed individually over their fattening period every two weeks from January to July 2021. Pen by pen, all individuals were driven to the stable's central hallway, after which pigs were weighed sequentially. Weighing was carried out between 08:00 and 16:00 and was video-recorded. All piglets were weighed for the first time thirteen days after arrival at the fattening farm. Due to practical limitations, only one of the two pens per sire was thereafter selected for subsequent follow-up. All 1556 pigs were weighed up to eight times, resulting in a total of 7428 records.

Additionally, each pig was scored manually during weighing on the following physical abnormalities: ear swellings or hematomas (0 = none, 1 = one ear, 2 = both ears); the presence and size of umbilical hernia (0 = not present, 1 = present); ear biting wounds (0 = none, 1 = one ear, 2 = both ears); and tail biting wounds (0 = none, 1 = small scratches, 3 = bloody and/or infected tail; Additional File 1). All recordings were collected by the same trained professional. Lean meat percentage was recorded individually at the slaughterhouse of the Belgian Pork Group in Meer (Belgium) using AutoFom III (Frontmatec, Smoerum A/S, Denmark) [31]. Feed intake was measured at the pen level.

The walk-through pig weighing setup consisted of a ground scale weighing platform, a radio frequency identification (RFID) reader, a video camera and a computer (Fig. 1). The ground scale platform (3.4 m × 1.8 m) had an accuracy of ±0.5 kg (T.E.L.L. EAG80, Vreden, Germany) and was situated in the central hallway of the pig building. A wooden aisle helped pigs to walk individually and forward over the balance (2.5 m × 0.6 m; Fig. 1a; Additional File 2, Video S1). Body weights were registered electronically and coupled to the pig's ID using an RFID reader and custom-made software. The camera (Dahua IPC-HDW4831EMP-ASE, Dahua Technology Co., Ltd, Hangzhou, China) was mounted 2.5 m above the floor at the center of the weighing scale. Pigs were recorded from an overhead camera perspective with a frame rate of 15 frames per second and a resolution of 3840 × 2160. An example of our data collection and a video recording is provided in Fig. 1b.

Experimental setup (created with BioRender.com). (a) Schematic top view diagram of the experimental setup used in this study in the center hallway of the pig building. The blue area indicates the ground scale platform with a wooden aisle (in red). The red dashed lines indicate gates to regulate individual pig passage. (b) Schematic side view diagram of the experimental setup.

DeepLabCut 2.2b.8 [27] was installed in an Anaconda environment with Python 3.7.7 [30] on a custom-built computer running a Windows 10 64-bit operating system with an Intel Core i5-vPro CPU (2.60 GHz) and 8 GB RAM. Training, evaluation and analysis of the neural network were performed using DeepLabCut in Google Colaboratory (COLAB) (https://colab.research.google.com/).

To detect body parts on a pig that is walking through the experimental setup, a neural network was trained using DeepLabCut 2.2b [27] as described in Nath et al. [32]. A minimalistic eight-body-part configuration (Fig. 2a; Table 1) was necessary to estimate hip width, shoulder width and body length. Operational definitions can be found in Table 1. Head body parts (Nose, Ear left and Ear right) were also labeled, but not included in our final structural model as these body parts were frequently occluded in consecutive frames.

(a) Schematic overview of the eight body positions annotated for pose configuration in DeepLabCut [27] (created with BioRender.com). 1 = Spine1; 2 = Shoulder left; 3 = Shoulder right; 4 = Center; 5 = Spine2; 6 = Hip left; 7 = Hip right; 8 = Tail base. (b) Example of a labeled pig during weighing using the DeepLabCut software.

Seven videos of approximately one hour recorded on two different days were selected to include variable pig sizes (20–120 kg), and each video contained multiple pig weighings. From these seven videos, several frames were extracted for annotation using k-means clustering in DeepLabCut. We first annotated 457 frames (~1 frame per pig), which were split into a training dataset (95%; 434 frames) and a test dataset (5%; 23 frames). The network was trained in Google Colaboratory using the ResNet-50 architecture with a batch size of 2. We trained our algorithm until the loss function reached an optimum, which in this study indicated minimal loss with a minimum number of iterations. Next, we compared the mean pixel errors of several models within this optimal region. Models with the lowest mean pixel errors were visually checked for body-part tracking performance on entire videos. Hereafter, the model that performed best was tested for flexibility using unseen single-pig videos with pigs of variable size (20 vs 120 kg) weighed on different days. As model performance was suboptimal at first, poorly tracked outlier frames were extracted using the DeepLabCut jump algorithm [32]. This algorithm identifies frames in which one or more body parts jumped more than a criterion value (in pixels) from the last frame [32]. These outlier frames were refined manually and then added to the training dataset for re-training. In total, 150 outlier frames were extracted from six novel videos containing one single pig to improve tracking performance (25 frames per pig). The final training dataset consisted of 577 frames (95%) and the test dataset of 30 frames (5%). The network was then trained again using the same features as the first training. Additional File 3, Video S2 shows an example of a pig with body-part tracking.
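
As a rough illustration, the annotate-train-refine loop described here maps onto the public DeepLabCut API as in the sketch below; the config path, video names and iteration count are placeholders rather than the study's actual settings.

```python
# Minimal sketch of the DeepLabCut training/refinement loop described above.
# Paths and video names are placeholders, not the study's actual files.
import deeplabcut

config = "/path/to/pig-weighing/config.yaml"
videos = ["/path/to/weighing_day1.mp4", "/path/to/weighing_day2.mp4"]

# Extract candidate frames for annotation with k-means clustering
deeplabcut.extract_frames(config, mode="automatic", algo="kmeans", userfeedback=False)
deeplabcut.label_frames(config)  # opens the GUI for manual annotation

# Train a ResNet-50 network (the 95/5 train/test split is set in config.yaml)
deeplabcut.create_training_dataset(config, net_type="resnet_50")
deeplabcut.train_network(config, maxiters=200000)
deeplabcut.evaluate_network(config)  # reports train/test mean pixel error

# Analyze full videos, then pull poorly tracked frames with the 'jump' algorithm
deeplabcut.analyze_videos(config, videos)
deeplabcut.extract_outlier_frames(config, videos, outlieralgorithm="jump")
deeplabcut.refine_labels(config)     # manually correct the outlier frames
deeplabcut.merge_datasets(config)    # add refined frames to the training set
deeplabcut.create_training_dataset(config, net_type="resnet_50")
deeplabcut.train_network(config)     # re-train on the augmented dataset
```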

After posture extraction of body parts using DeepLabCut, body dimension parameters were estimated. The raw dataset contained body part positions and tracking probabilities for 5,102,260 frames. Individual pig IDs were first coupled with video recordings based on the time of measurement from the weight dataset. The following steps and analyses were performed in R [33]. Frames with a mean tracking probability < 0.1 over all eight body parts were removed (2,792,252 frames left). This large reduction in the number of frames (~50% removed) was mainly caused by video frames without any pigs, for example in between weighings of different pens or in between weighings of individual pigs.
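
A minimal sketch of this filtering step, assuming DeepLabCut's standard .h5 output format (MultiIndex columns of scorer/bodyparts/coords) and a placeholder file name; the study performed this step in R, so the pandas version below is purely illustrative.

```python
# Sketch: drop frames whose mean tracking likelihood over all eight body
# parts is < 0.1, mirroring the quality-control step described above.
import pandas as pd

df = pd.read_hdf("weighing_day1DLC_resnet50.h5")          # placeholder file name
likelihood = df.xs("likelihood", level="coords", axis=1)  # one column per body part

keep = likelihood.mean(axis=1) >= 0.1   # mean over the 8 body parts, per frame
df_clean = df[keep]
print(f"kept {keep.sum()} of {len(keep)} frames")
```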

Next, for every weighing event, start and end points were determined to estimate body dimensions and activity traits. For a specific weighing event, a subset was first created containing all frames between the previous and next weighing events. The times of entrance and departure of the pig on the weighing scale were estimated using the x-position (in pixels) of the tail base, as the movement of pigs was predominantly along the x-axis (from right to left; Fig. 2b). The frame of entrance was defined as the first frame of a subset where the rolling median (over 10 frames) of the tail base x-position exceeded 1100 pixels (Fig. 3). Likewise, the first frame after a pig's weighing event with a rolling median tail base x-position < 250 pixels was used to determine the time of departure. If these criteria were not met, the first frame and/or the frame at which the weight record took place were used for the time of entrance/departure.

Determination of time window for a weight recording. (a) First, a subset is created as all tail base x-positions between time of recording of the next (orange) and previous (red) weight recording. The start time of the time window is determined as the first value before the own weight recording (green) above the threshold of 1100 pixels (dashed purple line; pig entering weighing scale). The end time of the time window is determined as the first value after the own weight recording (green) below the threshold of 250 pixels (dashed purple line; pig leaving weighing scale). (b) The extracted time window on which body part dimensions will be estimated and trajectory analysis will be performed.
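
The windowing logic of Fig. 3 might be sketched as follows; the Series layout, frame indexing and fallback behaviour are assumptions based on the description above, not the authors' R code.

```python
# Sketch of the time-window logic in Fig. 3: a pig's entrance is the first
# frame whose 10-frame rolling median of the tail-base x-position exceeds
# 1100 px; departure is the first frame after the weight record below 250 px.
# `tail_x`: pandas Series of tail-base x-positions indexed by frame number;
# `rec_frame`: the frame at which the weight was recorded.
import pandas as pd

def weighing_window(tail_x: pd.Series, rec_frame: int,
                    enter_px: float = 1100, exit_px: float = 250) -> tuple:
    med = tail_x.rolling(10, min_periods=1).median()
    before, after = med.loc[:rec_frame], med.loc[rec_frame:]

    entered = before[before > enter_px]          # pig stepping onto the scale
    start = entered.index[0] if len(entered) else before.index[0]

    left = after[after < exit_px]                # pig stepping off the scale
    end = left.index[0] if len(left) else rec_frame
    return start, end                            # fallbacks as described above
```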

Hip width, shoulder width and body length of a pig were estimated as the median value of the distance between certain body parts over all frames for a specific weight recording (Table 1, Fig. 2). These body dimensions, in pixels, were converted to metric units, with 1 cm calculated to be equivalent to 29.1 pixels. The conversion ratio from pixels to centimeters was based on the distance between tiles of the weighing scale, which was known to be exactly 50 cm. Total surface area was estimated using the mean value of the area calculated with the st_area function from the R-package sf [34], using all outer body part locations. Standard deviations of the body part positions were also calculated over all frames between entrance and departure after quality control (as described above) to assess the stability of the estimates.
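
The body-dimension computations might look like the following sketch. The paper computed surface area with sf::st_area in R, so the shoelace formula here is an illustrative stand-in, and the array layouts are assumptions.

```python
# Sketch of the body-dimension estimates: median inter-landmark distance over
# the weighing window, converted at the reported 29.1 px/cm, plus a shoelace
# polygon area standing in for the paper's sf::st_area call.
import numpy as np

PX_PER_CM = 29.1   # from the 50 cm tile spacing on the weighing scale

def median_distance_cm(a: np.ndarray, b: np.ndarray) -> float:
    """a, b: (n_frames, 2) arrays of x,y pixel positions of two body parts."""
    return float(np.median(np.linalg.norm(a - b, axis=1))) / PX_PER_CM

def polygon_area_m2(outer: np.ndarray) -> float:
    """Shoelace area of the outer body-part polygon for a single frame.
    outer: (k, 2) pixel coordinates ordered around the body outline."""
    x, y = outer[:, 0], outer[:, 1]
    area_px = 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))
    return area_px / (PX_PER_CM ** 2) / 1e4   # px^2 -> cm^2 -> m^2

# e.g. hip width: median distance between 'Hip left' and 'Hip right'
# hip_width_cm = median_distance_cm(hip_left_xy, hip_right_xy)
```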

Trajectory analysis was performed using the R-package trajr [35] for the left and right shoulders, left and right hips and the tail base. For each body part, pixel coordinates were extracted, trajectories were rescaled from pixels to cm, and a smoothed trajectory was created using the TrajSmoothSG function. From these smoothed trajectories, the following activity-related features were derived: mean and standard deviation of speed and acceleration (TrajDerivatives), a straightness index (TrajStraightness) and sinuosity (TrajSinuosity2).

The straightness index and sinuosity are related to the concept of tortuosity and associated with an animal's orientation and searching behavior [35,36]. The straightness index is calculated as the Euclidean distance between the start and end points divided by the total length of the movement [36]. The straightness index indicates how close the animal's path was to a straight line connecting the start and final points, and varies from 0 to 1. Thus it quantifies path efficiency: the closer to 1, the higher the efficiency. In our experiment, this path efficiency is highest when a pig walks in a straight line during weighing (straightness index = 1). Any deviations from this straight line, due to increased activity of the pig during weighing, lower the straightness index towards zero. Sinuosity estimates the tortuosity of a random search path by combining step length and the mean cosine of an animal's turning angles [35,36,37]. The sinuosity of a trajectory varies between 0 (random movement) and 1 (directed movement).

In this study we hypothesize that mean speed, the straightness index and sinuosity are related to a pig's activity during weighing. In an extreme case, a pig will walk in a straight line towards the RFID reader, stand motionless until its weight is recorded and continue its walk in a straight line after the gate is opened. This would result in a low mean speed (m/s), a sinuosity > 0 and a straightness index of 1. We hypothesize that more active pigs will present more lateral movements, increasing the mean speed and lowering the straightness index and sinuosity. So, generally, calmer pigs will display a lower mean speed during weighing, although they might have run at high speed towards the RFID reader.
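
The paper derived these features with the R-package trajr; the sketch below is an illustrative Python re-implementation of two of them (mean speed and the straightness index as defined above), with the frame rate assumed to be the camera's 15 fps and the input already smoothed and rescaled to cm.

```python
# Sketch of two activity features: mean speed and the straightness index
# (net displacement / total path length), computed from a smoothed trajectory.
import numpy as np

def activity_features(xy_cm: np.ndarray, fps: float = 15.0) -> dict:
    """xy_cm: (n_frames, 2) trajectory of one body part, already in cm."""
    steps = np.diff(xy_cm, axis=0)                 # per-frame displacement
    step_len = np.linalg.norm(steps, axis=1)       # cm per frame
    speed = step_len * fps / 100.0                 # m/s

    path_len = step_len.sum()
    net_disp = np.linalg.norm(xy_cm[-1] - xy_cm[0])
    straightness = net_disp / path_len if path_len > 0 else np.nan

    return {"mean_speed_m_s": float(speed.mean()),
            "sd_speed_m_s": float(speed.std()),
            "straightness": float(straightness)}   # 1 = perfectly straight walk
```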

The estimates of body dimensions from video recordings analyzed with DeepLabCut were validated on an independent set of 60 pigs after the initial experiment. These pigs came from five pens of different ages (92–166 days) and were measured manually for tail-neck length and hip width using a simple measuring tape. For the manual recordings, pig surface area was estimated as tail-neck length multiplied by hip width. The manual estimates for tail-neck length, hip width and pig surface area were then compared to the estimates from the video analysis by calculating Pearson correlations and the root mean squared error (RMSE).

Automated activity traits were validated by comparing them with manual activity scores given by five trained observers. Video footage of 1748 pig weighings was manually scored for pig activity by at least two observers per pig on a scale from 1 (calm) to 5 (very active). This ordinal activity scale was constructed based on D'Eath et al. and Holl et al. [17,24]. The average activity score per pig was then compared with the automated activity scores by calculating Pearson correlations.
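
Both validation comparisons reduce to the same two statistics, sketched below with placeholder variable names (the original analysis was done in R).

```python
# Sketch of the validation metrics: Pearson r and RMSE between manual tape
# measurements and DeepLabCut-derived estimates, and Pearson r between
# averaged observer activity scores and the automated activity traits.
import numpy as np
from scipy.stats import pearsonr

def validate(manual: np.ndarray, automated: np.ndarray) -> tuple:
    r, _p = pearsonr(manual, automated)
    rmse = float(np.sqrt(np.mean((manual - automated) ** 2)))
    return r, rmse

# e.g. r, rmse = validate(tape_hip_width_cm, video_hip_width_cm)
```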

After estimation of the body dimension and activity traits, additional quality control was performed. First, estimates of hip and shoulder width, tail-neck length and pig surface area were set to missing for records with frame-by-frame standard deviation estimates higher than the mean + 3 standard deviations over all records. The thresholds were 10.2 cm for hip distance (132 records), 11.8 cm for shoulder distance (135 records), 20.6 cm for tail-neck length (121 records) and 0.058 m² for pig surface area (96 records). If the standard deviation of the estimated hip widths over frames within one weighing event of a pig was > 8.9 cm, the record was set to missing.

Second, for every individual with at least four records (941 pigs, 6807 records), outliers were determined using a second-order polynomial regression of the variable of interest on age in days. Based on the distribution of the differences between observed and predicted phenotypes for all animals, a threshold for exclusion (record set to missing) was set at three times the standard deviation of the differences. The thresholds were 2.1 cm for hip distance (61 records), 2.2 cm for shoulder distance (58 records), 6.4 cm for tail-neck length (75 records), 0.021 m² for pig surface area (85 records) and 3.7 kg for weight (86 records).
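
A sketch of this outlier screen, assuming a long-format table with one row per weighing record; the column names are illustrative and the study's own implementation was in R.

```python
# Sketch: per pig with >= 4 records, fit a second-order polynomial of the
# trait on age, pool the residuals across all pigs, and flag records whose
# residual exceeds 3 SDs of the pooled residuals (thresholds as reported).
import numpy as np
import pandas as pd

def flag_outliers(df: pd.DataFrame, trait: str) -> pd.Series:
    """df: one row per record, columns ['pig_id', 'age_days', trait].
    Returns a boolean mask; True means the record is set to missing."""
    resid = pd.Series(np.nan, index=df.index)
    for _, g in df.dropna(subset=[trait]).groupby("pig_id"):
        if len(g) < 4:                      # only pigs with >= 4 records
            continue
        coef = np.polyfit(g["age_days"], g[trait], deg=2)
        resid.loc[g.index] = g[trait] - np.polyval(coef, g["age_days"])
    cutoff = 3 * resid.std()                # 3 SD of the pooled residuals
    return resid.abs() > cutoff
```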

The final dataset after data cleaning included 7428 records from 1556 finishing pigs descending from 73 Piétrain sires and 204 crossbred dams. The pedigree comprised 4089 animals; the median pedigree depth was 15 generations (min 10; max 17) for Piétrain sires and 3 (min 0; max 6) for crossbred dams.

We estimated genetic parameters (heritability and genetic correlations) using the blupf90 suite of programs [38]. Genetic variances and heritabilities were estimated with average information REML, implemented in airemlf90 and invoked with the R-package breedR [39] with the options EM-REML 20, "use_yams" and se_covar_function. Genetic parameters were first estimated on the full dataset and thereafter on subsets per weight recording (1 to 8). The first weight recording, for example, corresponds to a dataset of 1176 pigs between 78 and 89 days of age (Table 2). We estimated h² as the proportion of additive genetic variance divided by total variance, whereas the common environmental effect (c²) was estimated as the proportion of variance explained by the random environmental effects (c) divided by total variance.
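
Written out, and assuming the total phenotypic variance here is the sum of the three modeled components (additive genetic, common environmental and residual), the two variance ratios are:

```latex
h^2 = \frac{\sigma^2_a}{\sigma^2_a + \sigma^2_c + \sigma^2_e},
\qquad
c^2 = \frac{\sigma^2_c}{\sigma^2_a + \sigma^2_c + \sigma^2_e}
```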

Genetic correlations (r_g) between traits were estimated using bivariate animal models (airemlf90). Genetic correlations were first calculated between all possible trait combinations using the full dataset. Hereafter, the genetic correlations within traits for all pairwise weighing events were estimated (so two recordings of the same trait were treated as two different traits). By doing this, we can evaluate whether a trait changes genetically over time.

The estimated animal models were of the form:

y = Xb + Za + Wc + e

where y is the vector with phenotypes for the studied trait(s); b is the vector containing the fixed effects (sex, 2 levels; parity of dam, 4 levels) and covariates (age); a is the vector of additive genetic effects (4089 levels); c is the vector of random environmental effects (65 levels); e is the vector of residual effects; and X, Z and W are incidence matrices for, respectively, the fixed effects, random animal effects and random permanent environmental effects. The random environmental effect c is a combination of date of entrance at the fattening farm and weighing date. Every two weeks, a new batch of pigs arrived at the fattening farm. Parity of dams consisted of four classes (1, 2–3, 4–5, 6+).

More here:
Estimating genetics of body dimensions and activity levels in pigs using automated pose estimation | Scientific Reports - Nature.com

Posted in Human Genetics | Comments Off on Estimating genetics of body dimensions and activity levels in pigs using automated pose estimation | Scientific Reports – Nature.com

Amgen (NASDAQ:AMGN) AMGEN ANNOUNCES WEBCAST OF 2022 BANK OF AMERICA MERRILL LYNCH GLOBAL HEALTHCARE CON – Benzinga

Posted: at 1:04 am

THOUSAND OAKS, Calif., Sept. 12, 2022 /PRNewswire/ -- Amgen (NASDAQ:AMGN) will present at Bank of America Merrill Lynch's 2022 Global Healthcare Conference at 4:55 a.m. ET on Thursday, Sept. 15, 2022. Peter H. Griffith, executive vice president and chief financial officer at Amgen, will present at the conference. The webcast will be broadcast over the internet simultaneously and will be available to members of the news media, investors and the general public.

The webcast, as with other selected presentations regarding developments in Amgen's business given by management at certain investor and medical conferences, can be found on Amgen's website, http://www.amgen.com, under Investors. Information regarding presentation times, webcast availability and webcast links is noted on Amgen's Investor Relations Events Calendar. The webcast will be archived and available for replay for at least 90 days after the event.

About Amgen

Amgen is committed to unlocking the potential of biology for patients suffering from serious illnesses by discovering, developing, manufacturing and delivering innovative human therapeutics. This approach begins by using tools like advanced human genetics to unravel the complexities of disease and understand the fundamentals of human biology.

Amgen focuses on areas of high unmet medical need and leverages its expertise to strive for solutions that improve health outcomes and dramatically improve people's lives. A biotechnology pioneer since 1980, Amgen has grown to be one of the world's leading independent biotechnology companies, has reached millions of patients around the world and is developing a pipeline of medicines with breakaway potential.

Amgen is one of the 30 companies that comprise the Dow Jones Industrial Average and is also part of the Nasdaq-100 index. In 2021, Amgen was named one of the 25 World's Best Workplaces by Fortune and Great Place to Work and one of the 100 most sustainable companies in the world by Barron's.

For more information, visit www.amgen.com and follow us on www.twitter.com/amgen.

CONTACT: Amgen, Thousand Oaks
Megan Fox, 805-447-1423 (media)
Jessica Akopyan, 805-447-0974 (media)
Arvind Sood, 805-447-1060 (investors)

SOURCE Amgen

Read more from the original source:
Amgen (NASDAQ:AMGN) AMGEN ANNOUNCES WEBCAST OF 2022 BANK OF AMERICA MERRILL LYNCH GLOBAL HEALTHCARE CON - Benzinga

Posted in Human Genetics | Comments Off on Amgen (NASDAQ:AMGN) AMGEN ANNOUNCES WEBCAST OF 2022 BANK OF AMERICA MERRILL LYNCH GLOBAL HEALTHCARE CON – Benzinga

Searching the skies for the building blocks of life in the universe – Modern Diplomacy

Posted: at 1:04 am

BY GARETH WILLMER

Game theory mathematics is used to predict outcomes in conflict situations. Now it is being adapted through big data to resolve highly contentious issues between people and the environment.

Game theory is a mathematical concept that aims to predict outcomes and solutions to an issue in which parties with conflicting, overlapping or mixed interests interact.

In theory, the game will bring everyone towards an optimal solution or equilibrium. It promises a scientific approach to understanding how people make decisions and reach compromises in real-world situations.
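
To make the equilibrium idea concrete, here is a toy, hand-built example; the payoffs are invented for illustration and are not from the article or from any of the projects it describes. In a two-player land-use game, a cell of the payoff matrix is a Nash equilibrium when neither party can improve its own payoff by unilaterally changing its action.

```python
# Toy illustration of a Nash equilibrium (invented payoffs): a farmer and a
# conservation agency each choose 'intensify' or 'share' for a plot of land.
# A cell is an equilibrium when neither side gains by switching alone.
import itertools

actions = ["intensify", "share"]
# payoffs[(farmer_action, agency_action)] = (farmer_payoff, agency_payoff)
payoffs = {
    ("intensify", "intensify"): (1, 1),   # conflict: both lose out
    ("intensify", "share"):     (4, 0),
    ("share",     "intensify"): (0, 4),
    ("share",     "share"):     (3, 3),   # cooperation pays both
}

def is_nash(fa: str, aa: str) -> bool:
    f_pay, a_pay = payoffs[(fa, aa)]
    best_f = all(f_pay >= payoffs[(alt, aa)][0] for alt in actions)
    best_a = all(a_pay >= payoffs[(fa, alt)][1] for alt in actions)
    return best_f and best_a

for fa, aa in itertools.product(actions, actions):
    if is_nash(fa, aa):
        print(f"equilibrium: farmer={fa}, agency={aa}, payoffs={payoffs[(fa, aa)]}")
```

In this invented game the only equilibrium is mutual intensification, even though mutual sharing pays both sides more: an equilibrium is a stable outcome, not necessarily the best one, which is exactly the kind of impasse the conflict-management research described below tries to move parties away from.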

Game theory originated in the 1940s in the field of economics. The Oscar-winning movie A Beautiful Mind (2001) is about the life of mathematician John Nash (played by Russell Crowe), who was awarded the 1994 Nobel Prize in Economic Sciences for his work in this area.

Although the concept has been around for many decades, the difference now is the ability to build it into computer-based algorithms, games and apps to apply it more broadly, said Professor Nils Bunnefeld, a social and environmental scientist at the University of Stirling, UK. This is particularly true in the age of big data.

"Game theory as a theoretical idea has long been around to show solutions to conflict problems," he said. "We really see the potential to move this to a computer to make the most of the data that can be collected, but also reach many more people."

Conservation conflicts

Prof Bunnefeld led the EU-backed ConFooBio project, which applied game theory to scenarios where people were in conflict over resources and the environment. His team wanted to develop a model for predicting solutions to conflicts between food security and biodiversity.

"The starting point was that when we have two or more parties at loggerheads, what should we do, for example, with land or natural resources? Should we produce more food? Or should we protect a certain area for biodiversity?" he said.

The team focused on seven case studies, ranging from conflicts involving farmers and conservation of geese in Scotland to ones about elephants and crop raiding in Gabon.

ConFooBio conducted more than 300 game workshops with over 900 people in numerous locations including Gabon, Kenya, Madagascar, Tanzania and Scotland.

Ecological challenges

Prof Bunnefeld realised it was necessary to step back from pure game theory and instead build more complex games that incorporate the ecological challenges the world currently faces, like climate change. It also became necessary to adopt a more people-based approach than initially planned, to better target the games.

Participants included people directly involved in these conflicts, many of whom were very unhappy, said Prof Bunnefeld.

"Through the games, we got high engagement from communities, even from those where conflict is high and people can be reluctant to engage in research. We showed that people are able to solve conflicts when they trust each other and have a say, and when they get adequate payments for conservation efforts."

The team developed a modelling framework to predict wildlife management outcomes amid conflict. Freely available, it has been downloaded thousands of times from the ConFooBio website.

Conservation game

The researchers also created an accessible game about conservation called Crops vs Creatures, in which players decide between a range of options, from shooting creatures to allocating habitat for conservation.

Prof Bunnefeld hopes these types of game become more available on a mainstream basis via app stores, such as one on conflicts in the realm of biodiversity and energy justice in a separate initiative he works on called the Beacon Project. "If you tell people you have an exciting game or you have a complex model, which one are they going to engage with? I think the answer is pretty easy," he said.

"In the ConFooBio project, we've been able to show that our new models and algorithms can adapt to new situations and respond to environmental and social changes," added Prof Bunnefeld. "Our models are useful for suggesting ways of managing conflicts between stakeholders with competing objectives."

Social media dynamics

Another project, Odycceus, harnessed elements of game theory to investigate what social media can tell us about social dynamics and potentially assist in the early detection of emerging social conflicts.

They analysed the language, content and opinions of social media discussions using data tools.

"Such tools are required to analyse the vast amount of information in public discourse," explained Eckehard Olbrich, coordinator of the Odycceus project and a physicist at the Max Planck Institute for Mathematics in the Sciences in Leipzig, Germany.

His work is partially motivated by trying to understand the reasons behind the polarisation of views and the growth of populist movements like far-right organisation Pegida, which was founded in his hometown of Dresden in 2014.

The team created a variety of tools accessible to researchers via an open platform known as Penelope. These included the likes of the Twitter Explorer, which enables researchers to visualise connections between Twitter users and trending topics to help understand how societal debates evolve.

Others included two participatory apps known as the Opinion Observatory and the Opinion Facilitator, which enable people to monitor the dynamics of conflict situations, such as by helping interlink news articles containing related concepts.

Patterns of polarisation

"These tools have already allowed us to get a better insight into patterns of polarisation and understanding different world views," said Olbrich.

He said, for example, that his team managed to develop a model about the effect of social feedback on polarisation that incorporated game-theoretic ideas.

The findings suggested that the formation of polarised groups online was less about the traditional concept of social media bubbles and echo chambers than the way people build their identity by gaining approval from their peers.

He added that connecting the dots between game theory and polarisation could have real-life applications for things like how best to regulate social media.

"In a game-theoretic formulation, you start with the incentives of the players, and they select their actions to maximise their expected utility," he said. "This allows predictions to be made of how people would change their behaviour if you, for instance, regulate social media."

Olbrich added that he hopes such modelling can furnish a better understanding of democracy and debates in the public sphere, as well as indicating to people better ways to participate in public debates. "Then we would have better ways to deal with the conflicts we have and that we have to solve," he said.

But there are also significant challenges in using game theory for real-world situations, explained Olbrich.

Varying outlooks

For example, incorporating cultural differences into game theory has proved difficult because such differences may mean two people have hugely varying ways of looking at a problem.

"The problem with game theory is that it's looking for solutions to the way a problem can be solved," added Prof Bunnefeld.

"Having looked at conflicts over the last few years, to me it is clear that we can't solve conflicts, we can only manage them. Building in factors like climate change and local context is also complex."

But game theory is a useful way to explore models, games and apps for dealing with conflicts, he said. "Game theory is, from its very simple basics to quite complex situations, a good entry point," said Prof Bunnefeld.

"It gives us a framework that you can work through and also captures people's imagination."

Research in this article was funded via the EU's European Research Council and originally published in Horizon, the EU Research and Innovation Magazine.


Read more:
Searching the skies for the building blocks of life in the universe - Modern Diplomacy

Posted in Human Genetics | Comments Off on Searching the skies for the building blocks of life in the universe – Modern Diplomacy

Measurement of lipid flux to advance translational research: evolution of classic methods to the future of precision health | Experimental &…

Posted: at 1:04 am


CAS PubMed Article Google Scholar

Han, X. & Gross, R. W. The foundations and development of lipidomics. J. Lipid Res. 63, 100164 (2022).

CAS PubMed Article Google Scholar

Satapati, S. et al. Using measures of metabolic flux to align screening and clinical development: Avoiding pitfalls to enable translational studies. SLAS Disco. 27, 2028 (2022).

Article Google Scholar

Excerpt from:
Measurement of lipid flux to advance translational research: evolution of classic methods to the future of precision health | Experimental &...

Posted in Human Genetics | Comments Off on Measurement of lipid flux to advance translational research: evolution of classic methods to the future of precision health | Experimental &…

With Graphic Works on Sex and Inequality, a New Show Addresses Artistic Censorship – Artsy

Posted: at 12:55 am

Artists who have faced censorship are taking center stage at Unit London. Sensitive Content, curated by artist Helen Beard and art historians Alayo Akinkugbe and Maria Elena Buszek, presents artworks that have challenged the status quo by raising questions about artistic freedom and foregrounding issues linked to the circulation and suppression of art.

On view through October 16th, the group exhibition examines censorship and artistic freedom from multiple standpoints. The interrogative nature of Sensitive Content expands on social, cultural, and political issues touching upon gender, sexuality, religion, race, and eroticism, among other topics. Featuring 19 artists whose works have fought against the culture of censorship, the show addresses agency, access, and power to encourage viewers to engage in an expanded public discourse.

The personal is political in Sensitive Content. The works of Polly Borland, Micol Hebron, and Emma Shapiro draw attention to sexism's role in the policing and censoring of specific body types, deeming them inherently sexual when unclothed. Feminist themes also emerge in Leah Schrager's Infinity Selfie series (2016) and Caroline Coon's performance piece I AM WHORE (2019). Schrager's digitally manipulated photographs blur the line between model and photographer to question how one is represented and by whom. Meanwhile, in Coon's compelling historical examination of misogynistic tropes, the artist forces the viewer to confront uneasy truths about the violence women still face in today's patriarchal societies.

With artworks depicting erotic and sexual themes that have often been deemed obscene, controversial, or inappropriate, Sensitive Content features pioneers in feminist art, such as Carol Rama, Betty Tompkins, Penny Slinger, and Linder, who prominently incorporate explicit imagery in their practices. In the 1970s, French customs confiscated photorealistic works from Tompkins's Fuck Paintings series, declaring the pieces obscene. Thousands of copies of Slinger's 1978 book Mountain Ecstasy were seized and destroyed by British customs, and Linder's collages had to be published covertly due to ongoing restrictions. Many of the show's artists still frequently battle with the limitations placed on exhibiting and disseminating their work.

One such artist is co-curator Beard, whose radiant paintings depicting female pleasure seduce through vivid and bold graphic shapes. Beard's social media posts of her paintings are frequently removed due to alleged violations of community guidelines. Like Beard, Beverley Onyangunga has often been shadow-banned on social media. Onyangunga's archival photomontages depicting the history of colonial violence remind viewers of the excruciating atrocities that took place from 1885 to 1908 in the Congo Free State, present-day Democratic Republic of the Congo. Under the gruesome, 23-year-long colonial rule of Belgium's King Leopold II, Congolese children and adults were brutalized and denied access to food if they failed to meet their daily rubber quotas.

Onyangunga recalls this period of history in her installation Parts of a Rubber Tree (2022), in which the leaves of a tree are replaced by red rubber gloves. In Onyangunga's photo collage Archive I (2022), a red rubber glove appears again; this time, it occupies the space where a Congolese child's hand was severed. A missionary grips the child's arm, while Black children bear witness to the scene and Leopold II's head and torso peek up from behind them.

Other artists have faced repercussions outside of the digital sphere for the content in their work. Russian activist and performance art group Pussy Riot and Chinese artist Xiao Lu have previously been detained by their respective government authorities for political dissent. Pussy Riot's three artworks in Sensitive Content, all titled Push This Button (2022), feature a call to action followed by a kaomoji: "This button makes you squirt =^.^=", "This button eliminates sexism =^_^=", and "This button neutralizes Vladimir Putin =^.^=". Despite their cutified appearance, the politically charged works are met by viewers with caution.

In Xiao's performance Polar (2016), the artist climbs into a semi-transparent cubicle made of ice. With only a kitchen knife, Xiao repeatedly hacks at her icy confinement, even as she begins to draw blood and stain her surrounding environment. The violent and aggressive subtexts found in Polar are recurring themes in Xiao's transgressive work critiquing the CCP's political and social policies. Perhaps Polar can also be understood as a symbolic pursuit of breaking free from the constraints of a patriarchal society.

Meanwhile, Renee Cox's photograph Yo Mama's Last Supper (1996), which features Cox as Jesus at the center of the composition, surrounded by 11 Black men and a white man (Judas), was deemed sacrilegious and offensive by both the Catholic Church and then-New York City mayor Rudy Giuliani. The latter called for a commission to set decency standards for all publicly funded art. It's worth asking whether the artwork sparked such opposition due to its reinterpretation of a biblical scene or because such artistic license was taken by a Black woman.

Operating as a site for thought-provoking public discourse that welcomes both contemporary and historical artistic acts of resistance, Sensitive Content responds to the complex sociopolitical and cultural mechanisms involved in silencing and suppressing narratives deemed threatening, disruptive, obscene, divergent, or offensive. As the curators stated in the exhibition catalogue, "Ultimately, despite their many differences, the artists in Sensitive Content have a shared commitment to the real over the fake, whether in our politics, interactions or expressions, that binds them more deeply than their works' censorship. This exhibition hopes to honor that courageous common bond." And indeed it does.

Follow this link:
With Graphic Works on Sex and Inequality, a New Show Addresses Artistic Censorship - Artsy

Posted in Censorship | Comments Off on With Graphic Works on Sex and Inequality, a New Show Addresses Artistic Censorship – Artsy

Censorship in DeFi and the Transition to POS: Causes and Consequences – Finance Magnates

Posted: at 12:55 am

Finance Magnates got the opportunity to hear the thoughts of Brian Pasfield, CTO at Fringe Finance, on Ethereum's move to PoS and DeFi's split into 'permitted' and 'non-permitted'; he also shares his vision of where this could lead DeFi in the future.

Q: The transition to PoS is the dawn of the bifurcation of DeFi into 'permitted' and 'non-permitted' DeFi. What are its possible consequences?

Authorities have commenced attacks on the DeFi ecosystem by introducing censorship. The core value proposition of DeFi is censorship resistance, so any implementation of DeFi that enables censorship is not DeFi. Permissioned DeFi = on-chain CeFi, which eliminates all that is valuable about DeFi. Not even composability remains as a benefit, as it poses existential risks for protocols composed with sanctioned/permissionless protocols. And much of DeFi has centralized components, which therefore serve as attack vectors for authorities to coerce censorship.

Q: What are the prospects for DeFi, then?

DeFi's only path is to pursue avenues that assure its censorship resistance. This means removing the reliance on a number of things that are variously characteristic of DeFi today, including doxxed teams, centralized pegged stablecoins, and any notion of PoS, given that PoS introduces a greater attack surface for authorities to enact bribery attacks that can compromise the network.
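
To make the bribery-attack surface concrete, here is a minimal back-of-the-envelope sketch (not from the interview) of what a censoring attacker might budget to outbid honest staking rewards. Every figure is a hypothetical placeholder; the 2/3 threshold is the stake share Ethereum-style finality gadgets require, and real attack economics involve slashing, correlation penalties, and social recovery that this toy model ignores.

```python
# Toy model of the PoS bribery attack surface discussed above.
# All numbers are hypothetical placeholders, not measurements.

total_stake_eth = 30_000_000   # assumed total ETH at stake
eth_price_usd = 1_500          # assumed ETH/USD price
staking_yield = 0.05           # assumed annual yield honest validators earn
finality_share = 2 / 3         # stake share needed to control finality

# A rational validator might defect for a bribe that beats its honest
# staking income, so the attacker's annual budget is bounded below by
# the income of the stake it must corrupt.
bribed_stake = total_stake_eth * finality_share
annual_budget = bribed_stake * eth_price_usd * staking_yield

print(f"stake to corrupt: {bribed_stake:,.0f} ETH")
print(f"annual bribe lower bound: ${annual_budget:,.0f}")
```

Under these made-up inputs the bound is on the order of a billion dollars a year, which is large but within reach of state-level actors, which is the interview's point.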

Q: So why is DeFi so valuable?

Many participants in the DeFi ecosystem do not recognize DeFi's core value proposition of censorship resistance. Many view DeFi as just an additional way to deliver financial services and a way to achieve rapid financial gains. But DeFi is distinct because of its core value proposition. This proposition is valuable to those who have a security mindset, and to those who do not want to be stolen from. A security mindset refers to the notion of personal sovereignty and the recognition that the aims of authorities and some supra-national organizations are all too often not in the people's interests. A good introduction to understanding this can be found in The Prince by Niccolò Machiavelli.

Q: Calls for censorship from the authorities will increase. Does this mean that DeFi projects will soon face new difficulties in obtaining licenses? Will they close more often due to censorship?

Any reference to licenses and DeFi in the same sentence indicates a misunderstanding of what makes DeFi useful. The core value proposition of DeFi is censorship resistance. A truly censorship-resistant DeFi protocol cannot be regulated, as it is not susceptible to state coercion. Any nominally DeFi protocol that does require a license is an example of on-chain CeFi. Given DeFi's core value proposition, by definition, DeFi will not and cannot be regulated by authorities. It is the centralized aspects of current DeFi that are censorable.

Q: Give examples of DeFi projects with centralized aspects. What are their risks?

Examples of DeFi projects with centralized aspects are USD-pegged stablecoins. Ultimately, they rely on meat-space entities that can be and have been coerced by authorities to enact censorship. DeFi will move away from its current love affair with USD-pegged stablecoins because of the attack surface they represent in terms of coercion and censorship by authorities.
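
To illustrate why issuer-administered stablecoins are such an attack surface, here is a minimal sketch of the blacklist pattern that major fiat-backed stablecoin contracts are known to implement; the class and method names are illustrative, not any specific token's API.

```python
# Minimal sketch of a centrally administered stablecoin: one privileged
# key can freeze any balance, which is exactly the lever an authority
# can coerce the issuer to pull.

class CentralizedStablecoin:
    def __init__(self, issuer: str):
        self.issuer = issuer
        self.balances: dict[str, int] = {}
        self.blacklist: set[str] = set()

    def transfer(self, sender: str, recipient: str, amount: int) -> None:
        # The issuer-maintained blacklist gates every transfer.
        if sender in self.blacklist or recipient in self.blacklist:
            raise PermissionError("address blacklisted by issuer")
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount

    def freeze(self, caller: str, address: str) -> None:
        # Only the issuer (a coercible meat-space entity) holds this power.
        if caller != self.issuer:
            raise PermissionError("only the issuer may freeze")
        self.blacklist.add(address)

# Hypothetical usage: a coerced issuer flips one switch...
usdx = CentralizedStablecoin(issuer="issuer")
usdx.balances["alice"] = 100
usdx.freeze("issuer", "alice")
# ...and usdx.transfer("alice", "bob", 50) now raises PermissionError.
```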

Q: Many people strongly hold that PoW is a danger and that a move to PoS is necessary. However, is there a risk of PoS's attack vectors being exploited by vested interests?

Yes, there is a significant risk. PoS bribery attacks will be attempted. DeFi on PoS will then be TradFi, but on a censored blockchain. For humans to unshackle themselves from coercion and censorship and to move to a state of greater freedom, a security mindset is needed. Proper DeFi, with its core value proposition of censorship resistance, is necessary. There are people in the DeFi ecosystem who understand the core value proposition of censorship resistance, and DeFi will find a way. Look for these people and follow their projects.

Q: The industry uses PoS for several reasons: to lower fees, to use less energy, and, it is claimed, to increase security. Is this true, and are there any security issues caused by the adoption of PoS?

Let's analyze the new security issues added by the adoption of PoS. Fees are a function of demand for block space; the market dictates the price. Participants demand the security afforded by the current blockchain and are willing to pay the fees. If participants did not demand it, the price would be lower. And we now have L2s, which increase throughput and correspondingly reduce fees.
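
To illustrate "fees are a function of demand for block space", here is a minimal sketch of an EIP-1559-style base-fee update, the mechanism Ethereum uses to price block space. The gas target and adjustment denominator match Ethereum's published parameters; the starting fee and block sizes are made-up inputs.

```python
# Minimal sketch of an EIP-1559-style base fee update: the fee rises
# when blocks are fuller than the target and falls when they are emptier.

GAS_TARGET = 15_000_000        # target gas per block
MAX_CHANGE_DENOMINATOR = 8     # caps the per-block move at 12.5%

def next_base_fee(base_fee: float, gas_used: int) -> float:
    deviation = (gas_used - GAS_TARGET) / GAS_TARGET
    return base_fee * (1 + deviation / MAX_CHANGE_DENOMINATOR)

fee = 20.0  # starting base fee in gwei (illustrative)
for gas_used in (30_000_000, 30_000_000, 15_000_000, 5_000_000):
    fee = next_base_fee(fee, gas_used)
    print(f"block used {gas_used:>10,} gas -> base fee {fee:6.2f} gwei")
```

Two consecutive full blocks push the base fee up 12.5% each, a block at target leaves it unchanged, and an underfull block pulls it back down: sustained demand, not policy, sets the price.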

The remaining reason for Eth PoS is energy usage. PoW and PoS have different properties; hence, there are trade-offs in moving from PoW to PoS. In particular, PoS represents a greater attack surface for censorship via bribery attacks, which, if successful, could be fatal for the network. If more people were aware of this, they would ask: is the energy-usage matter really as it has been described by untrustworthy supra-nationalists? And if so, is reducing DeFi's energy usage at the price of removing DeFi's core value proposition of censorship resistance worth it?

The solution to this is that (proper) DeFi will find a way to remain uncensorable in the long term. This may or may not be on the Eth blockchain, and likely will not be, given that the current PoS adherents' ideological positions distort their ability to make decisions with the required objectivity.

Q: What's your vision of the future of DeFi?

DeFi is just starting; it is so new. Many DeFi projects have not fully embraced its core value proposition of uncensorability. We're now seeing authorities taking action not only to censor DeFi but also to confiscate assets and take legal action. There is effectively no reason for censored DeFi to exist. DeFi needs to divest itself of its current vulnerabilities to censorship so that it continues to deliver on its core value proposition.

DeFi is one part of the decentralized economy; it's a part of the future decentralized world. A whole new body of legal precedent would evolve in this decentralized space, completely bypassing the distortions of state-based legislation systems. In the areas where it competes with meat-space legacy institutions, the decentralized world will be more efficient and deliver greater prosperity to communities.

Brian Pasfield is the CTO at Fringe Finance, with almost 10 years of expertise in blockchain, cryptocurrency, fintech, and DeFi. He has delivered technically complex projects that leverage his engineering background and keen understanding of industry trends and philosophies. Furthermore, Brian has worked with industry blockchain bodies to lobby for legislation and government policy changes.

Read more here:
Censorship in DeFi and the Transition to POS: Causes and Consequences - Finance Magnates

Posted in Censorship | Comments Off on Censorship in DeFi and the Transition to POS: Causes and Consequences – Finance Magnates
