Monthly Archives: August 2022

Multiverse Computing Introduces a New Version of their Singularity Portfolio Optimization Software – Quantum Computing Report

Posted: August 29, 2022 at 7:51 am

Multiverse Computing Introduces a New Version of their Singularity Portfolio Optimization Software

We reported in August 2021 on a new software program from Multiverse Computing called Singularity. The program has an interesting characteristic: it is implemented as an Excel plug-in, which makes it easy and quick for an inexperienced end user to try without having to learn much about quantum computing. Multiverse has now released an update, Singularity Portfolio Optimization v1.2, that supports a variety of modes, including a Multiverse Hybrid mode, a D-Wave Leap Hybrid mode, and a pure classical solver. The program can also accept a variety of constraints while performing the optimizations, including the investor's level of risk aversion, the resolution of asset allocation, minimum and maximum allowable investment per asset, and others. The portfolio optimizer uses Multiverse's hybrid solver for its core algorithms, and the company indicates it can produce results competitive with classical solvers in a shorter period of time. The program is hardware agnostic and can be used with a variety of different quantum processors as well as quantum-inspired and classical configurations. Additional information about this new version of Singularity is available in a news release posted on the Multiverse website here.


The rest is here:

Multiverse Computing Introduces a New Version of their Singularity Portfolio Optimization Software - Quantum Computing Report

Posted in Quantum Computing | Comments Off on Multiverse Computing Introduces a New Version of their Singularity Portfolio Optimization Software – Quantum Computing Report

Nvidia GTC 2022 is Happening. Here’s What to Expect – Analytics India Magazine

Posted: at 7:51 am

One of the most awaited developer conferences, Nvidia GTC, is just around the corner. Scheduled for September 19-22, the event is expected to bring together thousands of innovators, researchers, thought leaders, and decision-makers to showcase the latest technology innovations in AI, gaming, computer graphics, the metaverse and more. The thought leaders include Turing Award winners Yoshua Bengio, Geoff Hinton and Yann LeCun, among others.

Nvidia GTC will feature a keynote by Nvidia chief Jensen Huang and host over 200 sessions with global business and technology leaders. Huang's keynote will be live-streamed on Tuesday, September 20, at 8:30 PM IST (8 AM PT).

Click here to register.

Nvidia has been a foundation of technology innovation, modern applications and computing platforms. Since its inception in 1993, the company has dedicated itself to the computing arena, starting from enhancing general-purpose computing to revolutionising the gaming and entertainment industry, pioneering GPU-accelerated computing, and later branching out to scientific computing, artificial intelligence, data platforms and, most recently, the metaverse and quantum computing, among others.

The company has dedicated itself to solving problems across the computing arena, from building hardware products and software tools to gaming capabilities and architectures. It aims to help people turn their ideas into reality faster.

One important milestone came around 2009 with the design of the next-generation CUDA architecture, code-named Fermi, with which Nvidia essentially solved the GPU computing puzzle. CUDA, or Compute Unified Device Architecture, is designed to work with programming languages such as C, C++ and Fortran, which makes it easier for developers to use GPU resources effectively. It also supports multiple programming frameworks, including OpenMP, OpenACC and OpenCL, among others.
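To give a sense of what that programming model looks like in practice, here is a minimal, illustrative sketch of a CUDA-style kernel. It is not taken from Nvidia's materials; it uses the Numba library's Python bindings to CUDA (rather than the C, C++ or Fortran interfaces the article mentions), and the array size and launch configuration are arbitrary choices for the example.

```python
# Minimal vector-addition kernel in the CUDA programming model, written with
# Numba's CUDA bindings. Requires an Nvidia GPU and the numba package.
import numpy as np
from numba import cuda

@cuda.jit
def add_vectors(a, b, out):
    i = cuda.grid(1)            # global index of this GPU thread
    if i < out.size:            # guard threads that fall past the array end
        out[i] = a[i] + b[i]

n = 1_000_000
a = np.arange(n, dtype=np.float32)
b = 2.0 * a

# Copy inputs to the GPU and allocate space for the result on the device.
d_a = cuda.to_device(a)
d_b = cuda.to_device(b)
d_out = cuda.device_array_like(a)

# Launch enough 256-thread blocks to cover all n elements.
threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
add_vectors[blocks, threads_per_block](d_a, d_b, d_out)

out = d_out.copy_to_host()
assert np.allclose(out, a + b)
```

Even in this toy sketch, the appeal of the model is visible: the programmer writes ordinary scalar code for a single element, and the GPU runs thousands of such threads in parallel.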

Cut to 2022: the company is replicating CUDA's success in quantum computing, where it recently launched QODA (Quantum Optimized Device Architecture). Last month, the company open-sourced QODA to accelerate quantum research and development across various areas, including health, finance, HPC (high-performance computing) and AI, among others.

There is no stopping Nvidia.

Earlier this month, the company announced a wide range of Metaverse initiatives. The company plans to bridge the gap between AI and the digital world, creating a more realistic Metaverse.

Now, with all of these advancements as a backdrop, Nvidia's GTC, which has been held since 2009, provides a platform for understanding general-purpose computing and the challenges in the field, alongside launches of the forward-looking technologies Nvidia's in-house experts and researchers are working on.

In an interview with Analytics India Magazine, Vishal Dhupar, managing director, Asia South at Nvidia, said Nvidia can synthesise virtual worlds with the physical world because it sits at the intersection of computer graphics, physics and intelligence. "That's what people come to see. That's what people imbibe. That's what people practise. That's why GTC," he added.

Omniverse

Nvidia's Omniverse has been the talk of the town. The platform offers developers collaborative, scalable, multi-GPU, real-time, true-to-reality simulation. The company believes it will revolutionise how people create and develop as individuals and work together as teams, bringing new creative possibilities and efficiency to 3D creators, developers and enterprises.

At GTC, the company will announce various updates, libraries and new tools and applications to create immersive AI chatbots, realistic avatars, and stunning 3D virtual worlds.

Realistic Avatars: Recently, Nvidia announced lifelike avatars that can give an animated human face to the computers people interact with online. This might be similar to MyoSuite, a tool developed by Meta AI researchers that creates realistic musculoskeletal models more efficiently than existing approaches. Given Nvidia's rich history of revolutionising the gaming and entertainment industry, there is reason to expect it to help developers create more realistic, lifelike avatars.

Metaverse bots: Nvidia is most likely to launch new capabilities and AI platforms to develop realistic avatars and characters that would help people navigate the digital world.

3D Rendering models: In March 2022, Nvidia announced Instant NeRF, touted as one of the fastest neural rendering techniques to date, achieving more than 1,000x speedups in some cases. It is a neural rendering model that learns a high-resolution 3D scene in seconds and can render images of that scene in milliseconds.

Last year, Nvidia launched GANverse 3D, which can be imported as an extension in the Nvidia Omniverse to render 3D objects accurately in the virtual world. We can expect new updates and announcements around 3D rendering models at the upcoming GTC.

"Thanks to our network and computing effect, which is taking place because of our accelerated computing capabilities, we can go into our imaginations and make it real, and we can all create our own world and uniformly create many worlds," said Dhupar excitedly, pointing to the multiple possibilities of Omniverse.

He further said that AI has a huge role to play in creating such a 3D world, where machines and bots can write their own software, which humans can drive, and which can later learn on their own and, most importantly, make recommendations and predictions based on interactions in the metaverse.

Nvidia currently offers Omniverse Enterprise, which helps enterprises build 3D design and digital twin workflows with real-time collaboration and true-to-reality simulation. At GTC, there may be announcements of new partnerships showing how companies are leveraging the Omniverse Enterprise platform for use cases ranging from robotic process automation and automobile design to fighting climate change.

Banking on the success of CUDA, which opened up a new type of hardware and programming paradigm, Nvidia is betting big on quantum computing to help develop an ecosystem of hybrid quantum applications running on top of QODA.

Citing the example of CUDA, Dhupar said QODA allows developers to run quantum simulations. "You can write a lot of test applications, and we can get ready when quantum hardware really comes into play," he added, saying that it is much simpler to use than how one would typically operate classical computing.

"It helps quantum computing scientists to write algorithms and test their applications and get to the next level using the GPU, where instead of one or two bits, you can write into hundreds of bits, qubits, and move forward onto it," explained Dhupar.

At GTC, the company is expected to announce some of the latest use cases and updates of its platforms, alongside new partnerships and collaborations to accelerate quantum computing research across the globe.

Previously, Nvidia had said that it would launch BlueField-4 by 2023. The BlueField data processing unit supports the CUDA parallel programming platform and Nvidia AI, turbocharging in-network computer vision.

The company had also announced the launch of Nvidia Grace, its first data centre CPU, an Arm-based processor that will deliver 10x the performance of today's fastest servers on the most complex AI and HPC workloads.

"This is the only company that talks about three processors, the CPU, GPU, and DPU; about accelerating applications across multiple domains; about a recent problem holding you back, and how you create that into a solution that becomes a mega-market," said Dhupar, hinting that the company would announce major hardware and semiconductor chip updates.

At GTC, we can expect the company to launch new hardware for the data centre, including CPUs, GPUs and others, along the lines of x86 architecture and the scale of computing.

In 2019, Nvidia introduced GauGAN, an AI tool that turns sketches into photorealistic landscapes. Of late, there has been a lot of buzz around image generation tools such as Meta's Make-A-Scene, OpenAI's DALL-E 2 and Midjourney, among others. There is a high chance of Nvidia making similar announcements around the release of text-to-image models and platforms.

At last year's GTC, Nvidia announced Nvidia DRIVE powered by Hyperion 8, an end-to-end modular development platform and reference architecture for designing autonomous vehicles (AVs). This includes the NVIDIA DRIVE AGX Orin, DRIVE AGX Pegasus, and DRIVE Hyperion 8.1 Developer Kits, all built on the NVIDIA DRIVE Orin system-on-a-chip (SoC).

Nvidia's Dhupar did not disclose much about NVIDIA DRIVE Hyperion. However, he said there are a lot of things in the works, whether from a computing or a software perspective, and that there would be talks around all of them.

"Every field that we spoke of is going through the greatest technology shift, what people call Web 3.0, some call it the metaverse, and everything that gets done between those aspects is something we should be looking forward to," shared Dhupar.

Continue reading here:

Nvidia GTC 2022 is Happening. Here's What to Expect - Analytics India Magazine

Posted in Quantum Computing | Comments Off on Nvidia GTC 2022 is Happening. Here’s What to Expect – Analytics India Magazine

Cyber Week in Review: August 26, 2022 – Council on Foreign Relations

Posted: at 7:51 am

Facebook and Twitter take down pro-Western influence campaign

Facebook, Instagram, WhatsApp, Twitter, and Telegram disrupted a pro-Western influence campaign focused on promoting U.S. interests abroad, according to a report from Graphika and the Stanford Internet Observatory. The accounts used in the influence operation targeted the Middle East and Central Asia, frequently criticized Russia over the war in Ukraine, and often shared content from U.S. government-affiliated news outlets such as Voice of America and Radio Free Europe. Some of the accounts appear to be part of the Trans-Regional Web Initiative, a propaganda operation run by U.S. Special Operations Command active for over a decade. The campaign is the first publicly known, U.S.-run influence operation on social media. The campaign does not appear to have been very effective, as most posts received only a handful of likes or retweets, and only 19 percent of accounts had more than one thousand followers.

Ransomware gang attacks UK water organization

The ransomware gang Cl0p said it had infected a major water treatment company, South Staffordshire Water, in the United Kingdom. Cl0p first infected South Staffordshire's systems on August 15, although there was some initial confusion as the gang believed it had compromised the systems of a larger utility, Thames Water, which serves most of southeast England. Cl0p did not deploy ransomware on the network, citing ethical concerns, but instead stole data and threatened further consequences unless a ransom was paid. The hackers may have gained access to South Staffordshire's industrial control systems. Attacks on water systems have become increasingly common in recent years, and in some cases these attacks could have caused active harm to civilians.

Lloyd's of London Excludes State-Sponsored Cyberattacks from Insurance


Lloyd's of London, a major insurance market in England, announced that it will not allow insurers to cover catastrophic cyberattacks perpetrated by nation-states as of March 31, 2023. Lloyd's currently defines a catastrophic cyberattack as an attack that will "significantly impair the ability of a state to function or ... that significantly impairs the security capabilities of a state." While some have praised the move toward greater clarity on what will not be covered, others have noted that Lloyd's standard of "catastrophic" is vague and that cyberattacks are often difficult to attribute conclusively to a specific nation-state. In recent years, insurance companies have grappled with how to address major cyberattacks, and, in December 2021, Lloyd's announced the exclusion of nation-state-led attacks from policies held in a small subset of countries (China, France, Japan, Russia, the United Kingdom and the United States), although it appears this exclusion has not been tested yet.

Former Twitter head of security turns whistleblower


Twitter's former head of security, Peiter Zatko, also known as Mudge, filed a whistleblower complaint against the company earlier this week. Zatko made a series of claims about the state of Twitter's security, including that Twitter unknowingly employs agents of foreign nations, that deleted data may still be accessible, and that the loss of a few key data centers could permanently take down the entire site. Zatko also alleged that Twitter's security practices violated an agreement with the Federal Trade Commission that prohibited Twitter from misleading users about its security or privacy practices. Zatko, who in 1997 developed L0phtCrack, a password-recovery tool still in use in an updated form today, is well respected in the cybersecurity community for his work over the past three decades. Zatko's disclosures will likely affect the court case between Twitter and Elon Musk over whether the tech entrepreneur can back out of his bid to buy the company without significant penalty, although experts are divided as to whether Zatko's disclosures will help or hurt Twitter.

Baidu unveils first quantum computer

Chinese internet company Baidu announced on Thursday that it had built its first quantum computer. The computer, dubbed Qianshi, has a ten-qubit processor, significantly behind Google's Sycamore at fifty-four qubits and Zuchongzhi from the University of Science and Technology of China at sixty-six qubits. Baidu said that it had also developed a thirty-six-qubit processor, although it appears that processor has not been used yet. Quantum computing has been a major research focus for China, the United States, and the European Union in recent years, as each has poured billions of dollars into quantum computing research. The Biden administration recently announced a series of initiatives aimed at growing quantum research in the United States.


Read the original:

Cyber Week in Review: August 26, 2022 - Council on Foreign Relations

Posted in Quantum Computing | Comments Off on Cyber Week in Review: August 26, 2022 – Council on Foreign Relations

Freeze Your Eggs: The Worst Advice Ever – The Stream

Posted: at 7:48 am

"This is absolutely appalling, perhaps the worst advice ever," said Ruth Institute President Dr. Jennifer Roback Morse, Ph.D., of actress Mindy Kaling recommending young women have their eggs frozen. "It's wrong on so many levels."

In an interview with the teen fashion magazine Marie Claire, Kaling said paying to have a daughter's eggs frozen is the perfect gift for a co-ed home from college.

"I wish every 19-year-old girl would come home from college and that the gift, instead of buying them jewelry or a vacation or whatever, is that their parents would take them to freeze their eggs ... They could do this once and have all of these eggs for them, for their futures."

Kaling had two children this way in her late 30s and rhapsodizes about the process: "It was the best part of my life."

Morse noted: "As Naomi Schaefer Riley, a senior fellow at the American Enterprise Institute, wrote in the Deseret News, Kaling forgot a few salient details. The procedure is iffy at best. According to the Centers for Disease Control, only about a fifth of these painful and expensive procedures actually result in a live birth. Other sources put the odds at closer to 11%."

"And that's not the worst of it," Morse continued. "It takes the father completely out of the picture. It makes single-parenting the ideal. It overlooks the reality that children raised by single parents are more likely to exhibit any number of social pathologies, from addiction to mental health issues."

"At best, you have children raised without a father's love and direction. At worst, you have another human tragedy. Is it fair to the child? Of course not, but in these situations, it's all about adult desires. Children are a sideline to their story."

"The best advice you can give a 19-year-old woman is to take her time and make the effort to find the type of man she will want to marry and have children with," Morse urged.

But, of course, in a teen fashion magazine, you wouldn't find such uncool advice.

The latest Dr. J Show features Rachel Mastrogiacomo, the victim of rape by a priest during a private Mass. Learn how Rachel was slowly groomed by this highly manipulative man, how she eventually overcame this trauma, and how she now uses her experience to help others who have been in similar situations.

Be warned that this interview is distressing.

Rachel was targeted for a slow, calculated grooming process that took ten months. "I was completely psychologically overpowered by him. There was no way out," Rachel says. "He convinced me that this was God's will. That this was my duty, that my eternal salvation and my sanctity depended on it."

"He weaponized everything. He had figured me out: the deep places in my heart that were wounded," Rachel says of the process. "He told me I will be a bride of Christ and a spiritual mother to priests. He took everything that I really cared about and weaponized it."

Watch to learn more about grooming: what it entails and what to be alert for. Hear also how this now-laicized priest manipulated Rachel's family and friends as well, normalizing his behavior so that everyone around Rachel believed he was a trustworthy man, despite what she told them.

When Rachel broke the story years later by writing an article about it, many women reached out to her saying it was like they were reading their own story, that their abuser was following the same playbook. "There's something systemic going on," Rachel says.

Watch to the end to hear Dr. J explain something everyone can do to help their friends and loved ones dealing with trauma.

Watch this intense interview on YouTube, BitChute, Odysee, or Rumble.

Dr. Morse was interviewed on Helen Roy's podcast, Girlboss, Interrupted, about her book, The Sexual State. Morse exposes the Sexual Revolution's toxic ideologies that are destroying families and society:

The Sexual State answers the questions:

"We need to reassert the differences between men and women," Morse says, as well as the need of children to have a relationship with both their biological parents. "Marriage is the only thing that protects the rights of children."

Watch the full interview here. Get your copy of The Sexual State here.

In a recent interview with Jordan Peterson on Dave Rubin's podcast, Peterson applauded Rubin's decision to manufacture children for himself and his male partner. Dr. Morse and Fr. Rob Jack discuss the inevitably tragic consequences of in vitro fertilization and other forms of third-party reproduction in this podcast.

Listen to learn why efforts to help people self-actualize are morally wrong and an attack on the intrinsic rights of children.

The Ruth Institute is a global non-profit organization, leading an international interfaith coalition to defend the family and build a civilization of love. The Ruth Institute's Founder and President, Dr. Jennifer Roback Morse, is the author of The Sexual State: How Elite Ideologies Are Destroying Lives and Love and Economics: It Takes a Family to Raise a Village. Subscribe to our newsletter and YouTube channel to get all our latest news.

Follow this link:

Freeze Your Eggs: The Worst Advice Ever - The Stream

Posted in Jordan Peterson | Comments Off on Freeze Your Eggs: The Worst Advice Ever – The Stream

Evidently, Biden Does Not Know About the False Positive Risk … – Substack

Posted: at 7:47 am

The fact is that all American citizens who test are still at risk of their own personal lockdown.

So Biden is overly jabbed. Yet he had COVID-19. Again. Or did he? Testing for COVID-19 is a rat's nest, as I've said since April 2020. The first error is equating a positive test result with the presence of the virus. [Stay focused: this is not that hopeless "the virus does not exist" fool's errand that has been repeatedly addressed by me and others.]

The second error is equating the presence of the virus with COVID-19.

The first error is made whenever someone who does not have active viral replication tests positive using any of the tests. According to this flailing attempt by CNN to repair the Biden administration's reputation (there's even going to be a Rose Garden ceremony!), Biden swiped his own nose and 15 minutes later, the test came back positive.

That is almost certainly an antigen test, which is not free from false positive results. Due to the non-specificity of the antigen detection, some COVID-19 antigen tests can light up positive if the patient is infected with other respiratory pathogens. Read the package inserts for details.

The self-inflicted wound that the Biden administration is suffering is inherent in the CNN article, right in the title itself:

The article, which starts off describing Uncle Sniffy as "a fatigued, runny-nosed Joe Biden," reports that

The brutal months that came before had lent the Biden presidency a sense of gloom, fueled by high prices, abysmal approval numbers and swirling questions about the President's ability to lead. Many problems -- like a growing outbreak of monkeypox, the war in Ukraine and shortages of baby formula -- still persist, and a new crisis is emerging with China. Democrats running for office this year are still putting distance between themselves and the President.

There are so many layers here to unpack. I'll leave the political questions to the politicos. But a growing outbreak of monkeypox? So many layers.

This article is about testing, so back to the testing:

At 9:12 a.m. ET, Biden swiped his nose with a cotton swab and hoped for the best. It was officially his sixth day isolating with Covid-19. His symptoms had disappeared. He'd started working out again in the White House gym.

Someone needs to tell Joe how surgeons don and doff masks. Hint: No repeat use.

Biden then tested positive again, after Paxlovid, an outcome that Joe evidently always knew was a possibility because his doctors told him so.

From the Pfizer website:

The FDA has authorized the emergency use of PAXLOVID, an investigational medicine, for the treatment of mild-to-moderate COVID-19 in adults and children (12 years of age and older weighing at least 88 pounds [40 kg]) with a positive test for the virus that causes COVID-19, and who are at high risk for progression to severe COVID-19, including hospitalization or death, under an EUA.

PAXLOVID is investigational because it is still being studied. There is limited information about the safety and effectiveness of using PAXLOVID to treat people with mild-to-moderate COVID-19.

The fact sheet itself starts:

You are being given this Fact Sheet because your healthcare provider believes it is necessary to provide you with PAXLOVID for the treatment of mild-to-moderate coronavirus disease (COVID-19) caused by the SARS-CoV-2 virus. This Fact Sheet contains information to help you understand the risks and benefits of taking the PAXLOVID you have received or may receive.

Pardon? Didn't the website warning say "There is limited information about the safety and effectiveness of using PAXLOVID to treat people with mild-to-moderate COVID-19"?

So how can an HCP believe it is "necessary" for the treatment of mild-to-moderate COVID-19?

Anyway, back to testing: Joe's first test could have been a false positive, and it was self-administered. The article does not say whether the President's sample was sent out for sequencing to confirm it was not a false positive. Or that his second positive test also could have been a false positive.

Some would still have you believe that PCR testing never had ANY false positives; my experience in clinical biomarker development and my reading of the published scientific literature tell me the best estimate of the PCR false positive rate is around 40%. This includes the Marine study that failed to find sequenceable viral DNA in about 40% of the Marines who initially tested positive.

In my written testimony on a restaurant case in Pennsylvania, I provided all of the studies showing PCR false positive rates. I sent my testimony in before I saw the Commonwealth's expert testimony from the state epidemiologist. The state epidemiologist, in her written testimony, had misinformed the court that there were no - zero - clinical false positive results. The judge, for some reason, decided to decline the written testimony from both experts and insisted only on oral testimony, which devolved into an ad hominem attack, leaving the issue of false positives underappreciated by the judge, who ruled the restaurant had to follow state procedures. For my efforts there and as an expert witness in the NVICP, I earned a Wikipedia page that, like the Commonwealth's lawyer and the Special Master of the NVICP, ignores my 20 years of biomedical research experience, including intensive research in biomarker development while on the faculty in the Department of Pathology at the University of Pittsburgh.

The CNN article attempts to portray Biden's self-isolation as the cause of his administration's month from hell, spinning the administration's tailspin into a turn-around because, after Joe tested negative, the White House sent a missile into a house in Kabul, Afghanistan, and Joe asked some questions about the ready-made operation.

The only mention of sequencing was whether they had confirmed the death of the intended target by DNA sequencing.

The fact is that all American citizens who test are still at risk of their own personal lockdown, and Joe Biden and every other American are not being told about the risk - and cost - of the false positive result.

To be clear: I'm not saying people should not test. Some doctors I know think no one should test, just treat yourself if you have symptoms.

I think people should test if they want to - once they know the full risks. You can't take a PCR test result seriously unless you know the cycle threshold cut-off being used to call a positive. And you can't know that risk and make your own personal assessment of the risk/benefit ratio unless you know the FP rate associated with the kit you're considering, and the threshold they will apply to your sample.
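To make that concrete, here is a small, purely illustrative calculation of how a test's false positive rate and the prevalence of infection among those tested combine to determine the chance that any given positive result is real. The numbers are assumptions chosen for the example, not the specifications of any actual kit or the figures cited in this article.

```python
# Illustrative positive-predictive-value calculation (Bayes' rule).
# All inputs below are assumed example values, not data from any real test.
def positive_predictive_value(prevalence, sensitivity, specificity):
    """Probability that a positive result reflects a true infection."""
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * (1 - specificity)
    return true_positives / (true_positives + false_positives)

# Example: 1% of people tested are actually infected, the test catches 95% of
# true infections, and 2% of uninfected people test positive anyway.
ppv = positive_predictive_value(prevalence=0.01, sensitivity=0.95, specificity=0.98)
print(f"Chance a positive result is a true positive: {ppv:.0%}")  # about 32%
```

Even a seemingly small per-test false positive rate can mean that a large share of positive results are false when few of the people being tested are actually infected, which is the point of knowing a kit's error characteristics before interpreting a result.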

I am, once again, pointing out that the #costofthefalsepositives can truly be made zero by sequencing the virus from clinical samples from patients who test positive either by PCR or by antigen test.

How many more times will America suffer a month from hell because a President tests positive for COVID-19 without any assurance against the 40% false positive rate of the PCR test or a false positive due to non-specificity of an antigen test?

Additional information and resources on the false positive catastrophe from Covid-19 testing can be found on Tam Hunts MEDIUM.COM page.


Click on The Vitruvian Man to find out about our super fall course line-up at IPAK-EDU!

View post:

Evidently, Biden Does Not Know About the False Positive Risk ... - Substack

Posted in Rationalism | Comments Off on Evidently, Biden Does Not Know About the False Positive Risk … – Substack

The Jewish and Intellectual Origins of this Famously Non-Jewish Jew – Jewish Journal

Posted: at 7:47 am

Editor's note: Excerpted from the new three-volume set, Theodor Herzl: Zionist Writings, the inaugural publication of The Library of the Jewish People, edited by Gil Troy, to be published this August, marking the 125th anniversary of the First Zionist Congress. This is the second in a series.

Theodor Herzl was born on May 2, 1860, in Pest, Hungary, across the River Danube from Buda. The second child and only son of a successful businessman, Jakob, he was raised to fit in to the elegant, sophisticated society his family and a fraction of his people had fought so hard to enter. But it is too easy to caricature his upbringing as fully emancipated and assimilated. His paternal grandfather, Simon Loeb Herzl, came from Semlin, today's Zemun, now incorporated into Belgrade. There, Simon befriended Rabbi Judah ben Solomon Chai Alkalai. This prominent Sephardic leader was an early Zionist, scarred by the crude antisemitism of the Damascus Blood Libel of 1840, inspired by the old-new Greek War of Independence in the 1820s, and energized by the spiritual and agricultural possibilities of returning the Jews to their natural habitat, their homeland in the Land of Israel. It is plausible that the grandfather conveyed some of those ideas, some of that excitement, to his grandson.

Still, the move from Semlin to Budapest, from poverty to wealth, from intense Jewish living in the ghetto to emancipated European ways in the city, placed the Herzl family at the intersection of many of his era's defining currents.

The 1800s were years of change and of isms. Creative ideas erupted amid the disruptions of industrialization, urbanization, and capitalism. Three defining ideologies were rationalism, liberalism, and nationalism, with each one shaping the next. The Age of Reason, the Enlightenment, science itself: all rose thanks to rationalism. Life was no longer organized around believing in God and serving your king, but around following logic, facts, objective truth. The logic of reason flowed naturally to liberalism, an expansive political ideology rooted in recognizing every individual's inherent rights. Finally, as polities became less God-and-king-centered, nationalism filled in the God-sized hole in many people's hearts. Individuals bonded based on their common heritage, language, ethnicity, or regional pride and needs.

Ideas are not static. In an ideological age rippling with such dramatic changes, the different isms kept colliding and fusing, like atoms becoming molecular compounds. Some combinations proved more stable and constructive than others.

Liberalism combined with nationalism created Americanism, the democratic model wherein individual rights flourished in a collective context yielding the liberal-democratic nation-state. An offshoot of liberalism emphasizing equality more than rights fused with rationalism and created Marxism, although Karl Marx admitted his theories could only be enacted with irrational terror. Marxism with that violent streak, drained of liberalism, became communism, while a hyper-nationalism, rooted in blood-and-soil loyalty, and the kind of Marxist rationalism and totalitarianism also drained of any liberalism, created Nazism.

It is too easy to caricature [Herzl's] upbringing as fully emancipated and assimilated.

A similar impressionistic summary of the Jewish experience would track how the nineteenth century's ideological clashes shaped the major movements and institutions still defining Judaism, from the Reform movement to Zionism, from the modern synagogue to the State of Israel. Judaism and rationalism set off the explosion of scholarship known as the Wissenschaft, while Judaism mixed with liberalism triggered the Reform and Conservative movements' theological inventiveness. In response, ultra-Orthodoxy emerged, hostile to change, essentially subtracting liberalism from Judaism. Modern Orthodoxy synthesized, accepting some liberalism in Judaism and eventually Jewish nationalism, without too much rationalism. And, thanks to Herzl and others, the compound of Judaism and liberalism and nationalism yielded Zionism.

The actual historical process was much messier. It began with the great double-edged sword of European Emancipation. First in the West, then in the East, some Europeans welcomed Jews with equal rights and extraordinary opportunities, liberating many to move to the cities and a few to succeed on legendary scales. Moses Mendelssohn (1729-1786), the Herzl of the Haskala, the Jewish Enlightenment, was a Jew who, as a philosopher, dazzled Berlin. But, unlike Herzl, Mendelssohn was so fluent in Judaism and Hebrew that in 1783 he started translating much of the Bible into High German, adding commentary sporadically too. Mendelssohn epitomized the Haskala ideal of being a full, functioning, literate Jew in the house and a full, functioning, popular man on the street. And, unlike Herzl, Mendelssohn was ugly, infamously so, a walking ghetto stereotype with his crooked back and hooked nose.

Mendelssohn was accepted. Jews, however, realized that Europe's embrace often came at a cost: Jews had to be willing to give up their Jewishness, to fit in so much that many lost their way. Mendelssohn had six children who survived into adulthood; only two remained Jewish. Most disturbing, the Jewish rush into modern European society triggered a backlash, an updated, racist Jew-hatred that became increasingly potent as nationalist demagogues blamed the era's problems on Europe's traditional scapegoat, the Jews.

Rather than being welcomed smoothly into European life, most Jews felt mugged by modernity.

Rather than being welcomed smoothly into European life, most Jews felt mugged by modernity. The complex realities never matched the euphoric hopes of the maskilim, the Enlightened Reformers, that their people would awake from their ghetto-imposed long slumber, as the Russian-Jewish maskil Y. L. Gordon would write in Hebrew in 1866.

Developing Mendelssohn's vision as the pioneering Jewish modernizer, Gordon celebrated the essential bargain Jews like Theodor and his parents accepted. The deal was: be a man when you wander outside and a Jew when at home. In Herzl's household, like so many other bourgeois Jewish homes, the success in looking normal on the streets came at a high Jewish cost, even at home.

For Herzl and his family, Middle European Jews caught in the middle, every educational choice became a marker. Were you looking backward to your traditional past or forward to your enlightened future? Initially, Herzl's parents, Jakob and Jeannette née Diamant, tried doing both. When their son was eight days old, they initiated him into the great identity juggle by giving him a Hebrew name, Binyamin Zeev.

Ultimately, then, Binyamin Zeev Herzl was far more rooted in Judaism and the Jewish struggle of the nineteenth century, than most legends acknowledge.

Professor Gil Troy is the author of The Zionist Ideas and the editor of the three-volume set, Theodor Herzl: Zionist Writings, the inaugural publication of The Library of the Jewish People, to be published this August marking the 125th anniversary of the First Zionist Congress.

See the rest here:

The Jewish and Intellectual Origins of this Famously Non-Jewish Jew - Jewish Journal

Posted in Rationalism | Comments Off on The Jewish and Intellectual Origins of this Famously Non-Jewish Jew – Jewish Journal

Culture, progress and the future: Can the West survive its own myths? – Salon

Posted: at 7:47 am

When I was young, back in the 1970s, I spent two years traveling across the world: by truck with a group through Africa from south to north; in a camper van with a friend through Northern and Eastern Europe and Russia; on foot along most of the south coast of Crete; and by boat, bus, truck and train across Asia to India and Nepal.

The most difficult cultural adjustment I had to make was not to the cultures of other countries, but to my own on my return home to Australia. Many long-term Western travelers have the same experience, shocked in particular by the West's extravagant consumerism. My initial response on flying into Sydney from Bangkok was one of wonder at the orderliness and cleanliness, the abundantly stocked shops, the clear-eyed children, seemingly so healthy and carefree. However, this initial celebration of the material comforts and individual freedoms soon gave way to a growing apprehension about the Western way of life.

In a way I hadn't anticipated, the experience allowed me to view my native culture from the outside; and in ways I hadn't appreciated before, I became aware ours was a flawed and harsh culture. I realized that the Western worldview was not necessarily the truest or best, as I had been brought up and educated to believe, but just one of many, defined and supported by deeply ingrained beliefs and myths like any other.

We in the West tend to see material poverty as synonymous with misery and squalor; yet only with the most abject poverty is this so. Mostly the poorer societies I travelled through had a social cohesion and spiritual richness that I felt the West lacked. We see others as crippled by ignorance and cowed by superstition; we don't see the extent to which we are, in our own ways, oppressed by our rationalism and lack of "superstition" (in a spiritual sense).

There were other elements to my "re-entry trauma" besides the experience of other cultures. My lifestyle, very open in some respects, was closed or contained in others: the consequences of being on the road; and the almost total absence of mass media in my life. The exposure to the counterculture of my fellow travelers, especially in Asia, was another influence.

Over the following decades, as a journalist, researcher and writer, I developed these early insights into an analysis of cultural influences on health and well-being, how we define and measure human progress and development, and what the future holds for our civilization and species. This work is available on my website, including my book, "Well & Good: Morality, Meaning and Happiness," published in 2005.

"Culture" is often understood to mean the arts, or to mean ethnicity and ethnic differences, or to describe a quality of specific institutions, especially when their "cultures" become toxic. In scientific research, culture is a challenging topic, much debated and contested, defined and used differently in different disciplines and even within the same discipline. It can be difficult to pin down cultural qualities to measure their effects, which are often diffuse and pervasive, with complex interactions with other social factors.

In this essay, I use "culture" to refer to the language and accumulated knowledge, beliefs, assumptions and values that are passed between individuals, groups and generations; a system of meanings and symbols that shapes how people see the world and their place in it, and gives meaning and order to their lives; or, more simply, as the knowledge people must possess to function adequately in society.

The dominant discipline in research on population health is epidemiology (although other disciplines also contribute). Over the past few decades, epidemiologists have become more interested in the so-called social determinants of health, with a particular focus on socioeconomic inequality. Research suggests that the greater the inequality, the steeper the gradient in health is (meaning that at any point on the social ladder, people on average have better health than those below them and worse health than those above them), and the poorer people's health is overall.

As anthropologist Ellen Corin argues, "culture" shapes every area of life, defines a worldview that gives meaning to experience and frames how people locate themselves in the world.

I felt cultural factors were being neglected in this literature, however. This is unsurprising: Epidemiology (and science more generally), tends to overlook or underestimate the intangible, abstract and subjective in favor of the tangible, concrete and objective, which are easier to measure. A notable exception in the research was the work in the 1990s of psychologist and anthropologist Ellen Corin, to which I immediately related because of my travel experiences.

In contributions to two books on social determinants of health, Corin argues that culture shapes every area of life, defines a worldview that gives meaning to personal and collective experience, and frames the way people locate themselves within the world, perceive the world and behave in it.

Humans do not live in a purely objective world in which objects and events possess an inherent and objective significance, she says; instead, these things are imbued with meanings that vary with individuals, times and societies, and emerge from a network of associations: "Every aspect of reality is seen embedded within webs of meaning that define a certain worldview and that cannot be studied or understood apart from this collective frame."


As reflected in my own experience, Corin notes that cultural influences are always easier to identify in unfamiliar societies. "As long as one remains within one's own cultural boundaries, the ways of thinking, living, and behaving peculiar to that culture are transparent or invisible; they appear to constitute a natural order that is not itself an object of study. But this impression is an unsupported ethnocentric illusion."

In contrast to this way of thinking about culture, epidemiology understands "culture" mainly in terms of "subcultures" or "difference," especially ethnic and racial difference, and therefore usually as one dimension of socioeconomic status and inequality. Generally speaking, the broader influence of culture on health has been seen as remote and diffuse, pervasive but unspecified. As Corin observes, epidemiology's "categorical" approach to sociocultural factors, which fits comfortably within prevailing scientific paradigms, strips human realities of much of their social context and disregards and dismisses other approaches to social and cultural realities.

I have written many scientific papers discussing culture and health. Perhaps the most influential is a 2006 paper, "Is modern Western culture a health hazard?" published in the International Journal of Epidemiology, together with three commentaries by other researchers. In this paper, I argue that cultural factors such as materialism and individualism are underestimated determinants of population health and well-being in Western societies and that an important and growing cost of our modern way of life is "cultural fraud": the promotion of images and ideals of "the good life" that serve the economy but do not meet psychological needs or reflect social realities.

Research suggests that inequality impacts health through both material and psychosocial processes: In other words, such processes result from differences in material conditions, experiences and resources and from people's position in the social hierarchy and their perceptions of relative disadvantage, which contribute to stress, depression, anxiety, isolation, insecurity, hostility and lack of control over one's life. These qualities affect health directly, and also indirectly by encouraging unhealthy behavior. If factors such as perceptions, expectations and emotions were part of the pathways by which inequality affected health, I argued, research needs to take culture into account, since culture influences these things.

A growing cost of our modern way of life is "cultural fraud": the promotion of images and ideals of "the good life" that serve the economy but do not meet psychological needs or reflect social realities.

Even if we look just at inequality, culture affects the extent to which a society tolerates or even promotes inequality rather than discouraging it. If perceptions of social status influence levels of stress and anxiety, then cultural factors also play a critical role: For example, by amplifying a sense of relative deprivation through media images of "the good life" and celebrity lifestyles that are increasingly beyond the reach of most of us; or moderating that sense by providing alternative cultural models, such as downshifting and simple living, that undermine conventional social comparisons. Culture also influences the social distribution of risk behaviors like smoking and alcohol use.

Culture's impacts are far more pervasive than these effects on inequality, however, penetrating and shaping every facet of life in ways that affect well-being, including meaning, identity, belonging and security. Consider how Western culture construes the self. When I was at school, 60-odd years ago, we were taught that the atom was made up of solid particles, with electrons whizzing around the nucleus like planets orbiting the sun. Similarly, we think of the self as a discrete, biological entity or being. Sociologists talk of modern society as one of "atomized" individuals.

But these days science depicts the atom in quite different terms, as more like a fuzzy cloud of electrical charges. What if we were to see the self like this, as a fuzzy cloud of relational forces and fields? As a self of many relationships, inextricably linking us to other people and other things and entities? Some are close and intense, as in a love affair or within families; some more distant and diffuse, as in a sense of community or place or national or ethnic identity; some maybe are more subtle, but still powerful, as in a spiritual connection or a love of nature.

These relationships can wax and wane, vary in intensity and charge (positive or negative). Importantly, they never end for example, the breakup of a marriage, or the death of a parent or child, does not "end" the relationship, just changes it. Transforming how we see the self in this way as a fuzzy cloud of relationships would change profoundly how we see our relationships to others and the world. It would bring us closer to the way many indigenous peoples see the self, and would alter radically our personal choices and our social and political goals.

A critical consequence of the trends in modern Western culture has been their effect on moral values. Values provide the framework for deciding what is important, true, right and good, and have a central role in defining relationships and meanings, and so in determining well-being.

Most societies have tended to reinforce values that emphasize social obligations and self-restraint, and to discourage those that promote self-indulgence and antisocial behavior. Virtues are concerned with building and maintaining strong, harmonious personal relationships and social attachments, and the strength to endure adversity. Virtues serve to maintain a balance always dynamic, always shifting between individual needs and freedom on one hand, and social stability and order on the other. "Vices," on the other hand, typically involve the unrestrained satisfaction of individual wants and desires, or the capitulation to human weaknesses.

Christianity's seven deadly sins are pride (vanity, self-centeredness), envy, avarice (greed), wrath (anger, violence), gluttony, sloth (laziness, apathy) and lust. Its seven cardinal virtues are faith, hope, charity (compassion), prudence (good sense), temperance (moderation), fortitude (courage, perseverance) and religion (spirituality).

French philosopher André Comte-Sponville, in his 2002 book "A Short Treatise on the Great Virtues," lists these as the most important human virtues: politeness (as the "imitation of virtue," paving the way for true virtue to be learned), fidelity, prudence, temperance, courage, justice, generosity, compassion, mercy, gratitude, humility, simplicity, tolerance, purity, gentleness, good faith, humor and, finally, love (which transcends virtue). He says that a virtuous life is not masochistic or puritanical, but a way of living well and finding love and peace.

Modern Western culture undermines, even reverses, universal values and time-tested wisdom. The result is not so much a collapse of personal morality, but a loss of moral clarity: a heightened moral ambivalence and ambiguity, a tension or dissonance between our personal values and our lifestyles and the institutional values of the organizations we work for, and a deepening cynicism and mistrust toward social institutions, especially government.

Think for a moment about how much of public life, especially as revealed by politics and the mainstream and social media, reflects and promotes the "great virtues" (or, conversely, the vices).

Modern Western culture undermines universal values; the result is not so much a collapse of "personal morality" but a heightened moral ambivalence and ambiguity, a dissonance between our personal values, our lifestyles and the institutional values around us.

Without appropriate cultural reinforcement, we find it harder to do what we believe to be "good"; it takes more effort. Conversely, it becomes easier to justify or rationalize bad behavior. There are positive (reinforcing) feedbacks in the process: Antisocial values weaken personal and social ties, which in turn reduce the "hold" of a moral code on individuals because it is those kinds of ties that give the code its "leverage"; they are a source of "moral fiber."

Values are the foundations of social organization, and any discussion of personal well-being and social functioning must begin here. The sounder the foundations, the less we need to rely on elaborate supporting structures of legislation and regulation. As the 18th-century political philosopher Edmund Burke said, the less control there is from within, the more there must be from without.

Human societies are complex systems, and the management of complexity requires rules that are generic, diffuse, pervasive, flexible and internalized; in other words, they need a strong framework of values. As moral frameworks erode, and our culture becomes more rational, legalistic and technocratic, the more the work of values is supplanted by laws and regulations, which tend to be rigid, specific and externally imposed; they are often a poor or inappropriate substitute.

The apparent harm caused by materialism and individualism raises the question of why these qualities persist and even intensify, a point I discuss in my 2006 paper. Both have conferred benefits to health and well-being in the past, but now appear to have passed a threshold where their rising costs exceed their diminishing benefits. Various forms of institutional practice encourage this cultural "overshoot": Government policy makes sustained economic growth a priority, but leaves the actual content of growth largely up to individuals, whose personal consumption makes the largest contribution to economic growth.

Ever-increasing consumption is not natural or inevitable. It is culturally "manufactured" by a massive and growing media-marketing complex. I cite a figure from Michael Dawson's 2003 book "The Consumer Trap: Big Business Marketing in American Life": At that time, nearly 20 years ago, corporate business in the U.S. spent more than $1 trillion a year on marketing, about twice what Americans spent annually on all levels of education, private and public, from kindergarten through graduate school. That spending includes "macromarketing," a term describing the management of the social environment, particularly public policy, to suit the interests of business.

Government policy makes sustained economic growth a priority, but leaves the actual content of growth largely up to individuals, whose personal consumption makes the largest contribution to economic growth.

While other species have "cultures" in the form of learned behaviors, humans alone require a culture to give us reasons to live, to make life worth living: to give us a sense of purpose, identity and belonging personally, socially and spiritually and a framework of values to guide our actions. There may be many cultural paths we can follow in meeting human needs (as I discuss later). This is the source of our extraordinary diversity and versatility, but it is also a source of danger: We can lose the path altogether, run off the rails.

In my 2006 paper, I argued that Western culture's promotion of images and ideals of "the good life" amounted to cultural fraud, concluding:

To the extent that these images and ideals hold sway over us, they encourage goals and aspirations that are in themselves unhealthy. To the extent that we resist them because they are contrary to our own ethical and social ideals, they are a powerful source of dissonance that is also harmful to health and wellbeing.

Nevertheless, there are reasons for optimism (on this score at least). As Western culture becomes more harmful to health, we are seeing a diminishing "cultural consonance": Increasing numbers of people in Western nations are rejecting this dominant ethic of individual and material self-interest, and making, or trying to make, a comprehensive shift in their worldview, values and ways of life as they seek to close the gap between what they believe and how they live.

This is a driving dynamic behind various countercultural movements such as simple living, downshifting, minimalism and transition movements. We are witnessing parallel processes of cultural decay and renewal, a titanic contest as old ways of thinking about ourselves fail, and new ways of being human struggle for definition and acceptance.

This cultural contest has obvious significance for the notion of progress the belief that life is constantly getting better which is a defining feature of modern Western culture. Another line of my research has been to address this topic, including its cultural and subjective elements. The measures of progress that we use matter: Good measures are a prerequisite for good governance because they are how we judge its success; they also influence how we evaluate our own lives because they affect our values, perceptions and goals. Models and measures both reflect and reinforce how we understand progress: If we believe the wrong thing, we will measure the wrong thing, and if we measure the wrong thing, we will do the wrong thing.

Essentially, we equate progress with modernization. Modernization is a pervasive, complex, multidimensional process that characterizes our times. It includes industrialization, globalization, urbanization, democratization, scientific and technological advance, capitalism, secularism, rationalism, individualism and consumerism. Many of these features are part of the processes of cultural Westernization and material progress (measured as economic growth). This equation of progress and modernization reflects a deep cultural bias.

We equate progress with modernization, and with a specific definition of economic growth. That reflects a deep cultural bias.

Western nations dominate the top rankings of most indices of progress and development, and Western nations are promoted as a model of development for other countries. On the face of it, the equation seems compelling. The UN Development Programme has noted that past decades have seen substantial progress in many aspects of human development. Most people today are healthier, live longer, are more educated and have more access to goods and services, it reports; they also have more power to select leaders, influence public decisions and share knowledge.

Let us notice that indicators focus on those qualities that characterize modernization and that Western culture celebrates as success or improvement, such as material wealth, high life expectancy, education, democratic governance and individual freedom. However valuable these gains are, they do not represent the sum total of what constitutes optimal well-being and quality of life. Emotional, social and spiritual well-being barely register in this view of progress. It is precisely in these areas that progress has become most problematic, especially in rich nations.

Nor does this view of progress adequately integrate the requirements of environmental health and sustainability. This dimension is being addressed in new indices, although not yet adequately. Despite devoting a huge amount of social and political energy to attempting to get the policy settings right, development (at least as currently understood and pursued) and sustainability remain fundamentally irreconcilable. Modernization's benefits are counted, but its costs to well-being are underestimated and downplayed. At best, the qualities being measured under orthodox approaches may be desirable and even necessary, but they are not sufficient. At worst, the measures result in a consistent decline in quality of life, and lead us toward an uncertain and potentially catastrophic future.

Our flawed idea of progress is being challenged by the realities of global threats to humanity, such as climate change and biodiversity loss; pollution of land, air and water; food, water and energy security; global economic crises; nuclear war; and technological anarchy (where technologies become so powerful and develop so rapidly that we lose control over them). Without a deep change in culture, we will not close the gulf between the magnitude of the problems we face and the scale of our responses.

A cultural transformation of this extent is very different from the policy reforms and technological remedies on which public discussions and political debates typically focus. The history of climate-change politics provides a clear example of the "scale anomaly" or "reality gap" between the threat and our response. Politics continues to produce slow, incremental change, while science demands urgent, radical action. The pressure on the political status quo is increasing, but has yet to crack it open; we are still "kicking the can down the road."

This predicament applies across the range of humanity's challenges. These are "existential" in that they both materially and physically threaten human existence, and also undermine people's sense of confidence and certainty about life. Culture is central to resolving the situation, meaning both Western culture in general and the specific institutional cultures of politics and journalism, which concentrate some of the worst aspects of the broader culture (making them more visible).

Cultural factors are one driver behind growing electoral fragmentation and tribalism. A lack of a sense of belonging or social attachment was important to Donald Trump's election as president. Veteran journalist Carl Bernstein (of Watergate fame) observed recently that American democracy had not worked well for decades, and that Trump had ignited what he called a "cold civil war." It was a mistake, he said, "to look at the country just in terms of politics and of media. This is a cultural shift of huge dimension."

In a previous essay in Salon, I argued that a deep and dangerous divide existed in liberal democracies between people's concerns about their lives, their country and their future, and the proclivities and preoccupations of mainstream politics and news media. The cultures of politics and journalism are too short-sighted and narrow-minded to bridge the gulf between what we are actually doing as a society and what we now know we need to do. Adding to this failure is a focus on division and conflict over a multiplicity of discrete issues, which are dealt with in isolation from the totality, complexity and interconnectedness of life. As I concluded in that essay, political debate needs to encourage the conceptual space for a transformation in our worldview, beliefs and values as profound as any in human history.

This cultural transformation can be compared to that in Europe from the Middle Ages to the Enlightenment: from the medieval mind, dominated by religion and the afterlife, to the modern mind, focused on material life here and now. Historian Barbara Tuchman, in "A Distant Mirror: The Calamitous 14th Century," writes that Christianity provided "the matrix and law of medieval life, omnipresent, indeed compulsory." Its insistent principle was that "the life of the spirit and of the afterworld was superior to the here and now, to material life on earth. The rupture of this principle and its replacement by belief in the worth of the individual and of an active life not necessarily focused on God is, in fact, what created the modern world and ended the Middle Ages."

Today, humanity faces another rupture or discontinuity in its view of what it is to be human, and that rupture will profoundly change how we live. Just as it was impossible for the medieval mind to anticipate the modern, so too is it impossible for the modern mind to grasp what might come next. A greater awareness and acknowledgment of the flaws and failings of material progress and modernization, however, can encourage us to think more positively about alternative ways of living that deliver a high quality of life with much lower material consumption and social complexity. Growing and deepening crises will help to precipitate this change.

The modern myth of material progress implies, even insists, that past life was wretched, as expressed in the oft-quoted words of 17th-century philosopher Thomas Hobbes that the life of man in his natural state was "solitary, poor, nasty, brutish, and short." It is true that people were materially poorer and their life expectancy lower in the past, but they often led rich social and spiritual lives, as recent accounts of the quality of life among indigenous Australians show.

Traditional indigenous ways of living were devastated by the arrival of Europeans, but early accounts suggest a life of relative abundance and ease. Culturally speaking, the lesson is that we need to realize and accept that other, quite different and even better ways of making sense of the world and our lives are possible. Furthermore, we need to examine our situation at this fundamental level if we are to have a chance of achieving a higher and sustainable quality of life.

Anthropologist Wade Davis' writing offers an eloquent exposition of this viewpoint. In his books "Light at the Edge of the World: A Journey Through the Realm of Vanishing Cultures" and "The Wayfinders: Why Ancient Wisdom Matters in the Modern World," he urges us to heed the voices of other cultures because these remind us that there are alternatives, "other ways of orienting human beings in social, spiritual, and ecological space." They allow us "to draw inspiration and comfort from the fact that the path we have taken is not the only one available, that our destiny is therefore not indelibly written in a set of choices that demonstrably and scientifically have proven not to be wise." By their very existence, Davis argues, the diverse cultures of the world show we can change, as we know we must, the fundamental manner in which we inhabit this planet.

Davis learned as a student to appreciate and embrace the key revelation of anthropology: the idea that distinct cultures represent unique visions of life itself, morally inspired and inherently right. Cultural beliefs really do generate different realities, separate and utterly distinct from each other, even as they face the same fundamental challenges.

The significance of an esoteric belief lies not in its veracity in some absolute sense but in what it can tell us about a culture, he says. "What matters is the potency of the belief and the manner in which the conviction plays out in the day to day life of a people." A child raised to believe that a mountain is the abode of a protective spirit will be a profoundly different human being from one brought up to believe that a mountain is an inert mass of rock ready to be mined. A child raised to revere forests as a spiritual home will be different from one who believes that they exist to be logged.

Davis cautions that modernity (whether identified as Westernization, globalization, capitalism or democracy) is an expression of cultural values: "It is not some objective force removed from the constraints of culture. And it is certainly not the true and only pulse of history." The Western paradigm, for all its accomplishments, and inspired in so many ways, is not "the paragon of humanity's potential," he writes; "there is no universal progression in the lives and destiny of human beings."

The writer Barry Lopez, in his 2019 book "Horizon," also brings an anthropological perspective to humankind's current state of precarity, "a time when many see little more on the horizon but the suggestion of a dark future":

As time grows short, the necessity to listen attentively to foundational stories other than our own becomes more imperative. Many cultures are still distinguished today by wisdoms not associated with modern technologies but grounded, instead, in an acute awareness of human foibles, of the traps people tend to set for themselves as they enter the ancient labyrinth of hubris or blindly pursue the appeasement of their appetites.

Lopez warns that if we persist in believing that we alone (whatever our culture) are right, and that we have no need to listen to anyone else's stories, we endanger ourselves. "If we remain fearful of human diversity, our potential to evolve into the very thing we most fear to become, our own fatal nemesis, only increases."

Davis and Lopez's warnings take me back to an early 1990s UNESCO project on the futures of cultures, which had as its hypothesis that "cultures and their futures, rather than technological and economic developments, are at the core of humankind's highly uncertain future." A project report noted: "Some of the participants expressed the view that culture may well prove to be the last resort for the salvation of humankind."

The project considered some critical questions about culture. Will economic and technological progress destroy the cultural diversity that is our precious heritage? Will the "meaning systems" of different societies, which have provided their members with a sense of identity, meaning and place in the totality of the universe, be reduced to insignificance by the steamroller effects of mass culture, characterized by electronic media, consumer gadgets, occupational and geographic mobility and globally disseminated role models? Or, on the other hand, will the explosive release of ethnic emotions accompanying political liberation destroy all possibility of both genuine development founded on universal solidarity and community-building across differences? Will we witness a return of local chauvinisms, breeding new wars over boundaries and intercultural discriminations?

Background papers for the UNESCO project proposed two scenarios: one pessimistic, one optimistic. The pessimistic scenario was that cultures and authentic cultural values will be, throughout the world, bastardized or reduced to marginal or ornamental roles in most national societies and regional or local communities because of powerful forces of cultural standardization. These forces are technology, especially media technology; the nature of the modern state, which is bureaucratic, centralizing, legalistic and controlling; and the spread of "managerial organization" as the one best way of making decisions and coordinating actions.

The optimistic scenario was that humanity will advance in global solidarity, with ecological and economic collaboration, as responsible stewards of the cosmos. Numerous vital and authentic cultures will flourish, each proud of its identity while actively rejoicing in differences exhibited by other cultures. Human beings everywhere will nurture a sense of possessing several partial and overlapping identities while recognizing their primary allegiance to the human species. Cultural communities will plunge creatively into their roots and find new ways of being modern and of contributing precious values to the universal human culture now in gestation.

Participants in the UNESCO project appeared to see the pessimistic scenario as more likely, as things stood then (it is perhaps even more likely today), while the optimistic scenario was more an ideal to guide policy.

With culture as with so many other areas of modern life, humanity's destiny hangs in the balance: A dominant culture that is deeply flawed is nevertheless spreading throughout the world. Epitomized by today's global, technocratic, managerial elite, this culture has become hugely powerful, the default setting for running national and world affairs. Yet its failures grow correspondingly more profound, with growing inequality and concentration of wealth and power, growing mistrust of government and other institutions, growing global problems such as climate change. At the same time, ethnic and other "tribal" feelings have become more fervent and exclusive, often fanatical, including in the West. The 20-year war in Afghanistan offers one powerful symbol of this cultural contest.

On the other hand, somewhere beyond this ugly mix, largely hidden by the outdated and dysfunctional cultures of mainstream politics and the news media, through these same dual processes, there is also the potential, the possibility, for the optimistic scenario: a world where rich cultural diversity underpins a new and vital cultural universality.

At least we should hope so. Humanity's fate hangs on the outcome.

Read the rest here:

Culture, progress and the future: Can the West survive its own myths? - Salon

Posted in Rationalism | Comments Off on Culture, progress and the future: Can the West survive its own myths? – Salon

William Brooks: From Western Traditions to Political Indoctrination: A Cultural History of Education – The Epoch Times

Posted: at 7:47 am

Commentary

Parts 1 and 2 of this series can be read here and here.

Over the 19th century, Enlightenment rationalism gradually diminished the influence of the Christian worldview.

Separation of church and state led naturally to the development of public schools and the disconnection of education from traditional religious influence. This was especially the case in some of the most prestigious districts on the continent.

In North America, two distinct educational movements vied for control of the newly developing public school systems. The first was a classical liberal model that was a secularized version of the Western Christian paideia. The second was a progressive model guided mainly by the ideas of 19th-century Utopian socialism and 20th-century Marxism.

In the wake of declining Christian influence, classical scholars argued that students should still have access to the Western canon. This would include the study of languages and literature, science and mathematics, history, the arts, and foreign languages.

This mostly secular vision became commonly known as liberal education. It was intended to bypass denominational conflicts and keep the focus on traditional Western literary and scientific achievement.

Classical liberal teachers sought to pass on knowledge and skills, cultivate the imagination, and develop the capacity for independent thought. Their mission was to prepare young people for mature participation in the civic, cultural, and business affairs of Western democratic societies.

Among the faithful, it was generally thought that moral and religious instruction would continue outside of public education systems, in churches, Sunday schools, and family homes.

Everyone recognized that college-bound students required academic preparation, but classical liberal educators believed that all children should still have instruction in the organizing principles of their society and the varieties of human experience.

Over the final century of Christian cultural hegemony, classical scholars sought to retain schools that would develop the capacity to reason and enrich the lives of young citizens.

But the liberal model wasn't destined to prevail.

In "Battle for the American Mind," Pete Hegseth and David Goodwin contend that early in the 20th century, Western progressives launched a clandestine war against Western civilization.

The authors point to a literal heist of public school systems. They write: "While we were busy staving off Marxist economics and making the world safe for democracy, underneath our noses the Progressives slowly and quietly removed our key ingredient, the Western Christian Paideia, and replaced it with a Paideia of their own."

Marxist intellectuals wrote about the heroic advances of progressive education and how the movement was overcoming outdated teaching practices. Progressives co-opted the very idea of democracy and contended that teachers should help bridge the gap between the school and society. The word "democratic" became a Marxist synonym for "revolutionary."

As early as the mid-19th century, the classic British grammar school model, adopted in early colonial North America, gave way to the educational ideas of European social reformers such as Rousseau, Pestalozzi, Herbart, and Froebel. This changed our perception of the school's purpose, gradually substituting the concentration on literacy and the acquisition of knowledge with active teaching methods and the interests of the child.

Pedagogical experts posed as champions of working-class children. They proposed a differentiated curriculum that would offer a less rigorous and more pragmatic education to children of poorer parents, immigrants, and racial minorities whom they considered less capable of academic achievement.

Progressive policies were attractively packaged in democratic rhetoric. Educators claimed to be liberating young minds from boring traditional instruction and rote learning. "Teach the child, not the subject" became the mantra for student-centered schools.

But it soon became clear that the progressive mission was less about teaching and more about leading students toward a hypercritical view of Western civilization. On Jan. 15, 1987, former Democrat Party presidential hopeful Jesse Jackson and some 500 protesters marched down Palm Drive, Stanford University's grand main entrance, chanting "Hey hey, ho ho, Western civ has got to go."

By the mid-20th century, several classical liberal scholars pushed back. Progressives came under a counterattack from a number of academic quarters. Some argued that the progressive vision was not only profoundly undemocratic, but also harmful to students and the society in which they lived.

In 1953, Canadian historian Hilda Neatby published "So Little for the Mind," a scathing account of progressive reforms undertaken in Canadian public school systems. The respected University of Saskatchewan scholar argued that progressive teaching methods were anti-intellectual, anti-cultural, and amoral, asserting that there was no attempt to exercise, train, and discipline the mind.

In 1961, Columbia University history professor Lawrence A. Cremin wrote a similar critique titled "The Transformation of the School: Progressivism in American Education, 1876-1957." Cremin also describes an anti-intellectual emphasis on non-academic subjects and the questionable teaching methods that had become hallmarks of the progressive movement.

Such books by serious thinkers raised important concerns about the purpose and quality of progressive schooling. Some parents and citizens began to resist and look for private alternatives.

From where I write in Nova Scotia, the 1958 founders of the independent Halifax Grammar School are said to have been inspired by Neatby's case against public progressive education.

Nevertheless, through the persistence of a solidly entrenched educational bureaucracy, most schools on the North American continent eventually returned to progressive policies and practices.

From the dawn of the Age of Aquarius in the 1960s, the odds were heavily stacked against any movement dedicated to the restoration of a classical liberal paideia.

Throughout the 20th century in the United States and Canada, state, provincial, and local governments almost entirely took over responsibility for the delivery of education.

Eventually, all forms of elementary and high school education (Catholic, Protestant, Jewish, public, and private) came under the influence of the progressive model.

Hegseth and Goodwin assert that, as early as the 1920s and 1930s, government accreditation requirements were introduced to validate school diplomas and control transition to postsecondary studies.

"Teachers were certified through education colleges that were designed by progressives. Graduation requirements and diplomas were authorized by states, under progressive education departments. Textbook authors, descended from this professional class of teachers, were trained in the progressive education colleges," Hegseth and Goodwin write.

Progressive schools were fully equipped to separate impressionable young people from the foundational principles of Judeo-Christian, democratic-capitalist, Western culture. Compulsory public education and progressive policy experts gradually replaced the cultural influence of churches, parents, local communities, and classical scholars.

The North American business community paid little attention to what was going on in education. They were focused on free market transactions and wealth production, not cultural transformation.

With a 20th-century public lulled into complacency by advancing technology, economic prosperity, and Utopian visions, neither religious educators nor classical liberal scholars had the capacity to resist what came next.

On top of the havoc wrought by increasingly radical school reforms in the 1960s, more of the same progressive teaching practices were introduced in the 70s.

To this day, progressive faculties of education produce thousands of graduates eager to replace any remaining traditional teachers and advance a new era of social justice education, activist training, and 21st-century woke culture. Agents of secular-progressive governments have become the permanent schoolmasters of North American children.

One of the leading school reformers of the last century was the iconic Columbia University philosopher John Dewey. His ideas influenced educational theory for more than 100 years. We will examine Dewey's worldview more closely in Part 4 of this cultural history of education.

Views expressed in this article are the opinions of the author and do not necessarily reflect the views of The Epoch Times.

William Brooks is a Canadian writer who contributes to The Epoch Times from Halifax, Nova Scotia. He serves on the Editorial Advisory Board of The Civil Conversation for Canada's Civitas Society.

Here is the original post:

William Brooks: From Western Traditions to Political Indoctrination: A Cultural History of Education - The Epoch Times

Posted in Rationalism | Comments Off on William Brooks: From Western Traditions to Political Indoctrination: A Cultural History of Education – The Epoch Times

Attack On Salman Rushdie Manifests Barbarism In The Name Of Religion: Taslima Nasrin – Outlook India

Posted: at 7:47 am

Salman Rushdie did not stay in Iran, the country that declared a price on his head. I have lived in Bangladesh and India, two countries where a price on my head has been set repeatedly. It is in these two countries that I have faced death threats, been physically attacked, had processions taken out against me, had my books banned and my TV serials taken off air. If Rushdie is not safe under police protection in the Western world, there's little left to be imagined about my personal safety. But I am not in favour of trading my views for a safe and placid life out of fear of diatribe or for my personal security. I choose to express my views in spite of the threat to my life, even if none subscribe to them. I stand for my views against religious brutality, for humanism, rationalism, and equal rights for women.

The same Iran that had issued a fatwa against Rushdie in 1989 sentenced Jafar Panahi, the world-famous filmmaker, to six years' imprisonment for criticising the government just a few weeks ago. There is no trace of human rights in Iran after the Islamic revolution. Anyone who has dared to voice uncomfortable truths (criticised the government, sought an end to fundamentalism, demanded equal rights for women) has been tortured mercilessly, jailed and executed. Minority communities, homosexuals, transgenders and socially-conscious people of the arts and sciences are routinely tortured.

The most important requirement for the strangulation of democracy, equal rights for women and the right to speech is to move the State away from secularism and make it believe in a religion, causing it to follow the doctrines of that religion. Any religion will eventually, almost inevitably, walk against democracy, decree against equal rights for women, and violate the right to speech. These problems are more evident in declared Islamic countries.

Things have come to such a pass that minority Islamic groups are capable of producing indoctrinated religious terrorists even in traditionally secular countries that uphold the right to speech. The person who tried to kill Rushdie is one of them. Whether in the minority or majority, Muslim fundamentalists are active everywhere. If young people are bred systematically in religion, it doesn't take long for them to be turned into religious fanatics. The path from fanaticism to terrorism is then easy to traverse.

Religion has given nothing to humans and society apart from bigotry, ignorance, communal hatred and terrorism.

Islamic terrorists don't care about the law of the land; they only follow Allah's laws. Critics of Islam were killed in the 7th century, and they are killed even today in the 21st century. Its soldiers are still carrying out the orders of Allah, who is famous for being the almighty, all-knowing, most beneficent, most gracious, and most merciful. They will continue doing it until Islam is reformed, free speech is allowed, violence is denounced, and the breeding ground for extremism is demolished.

Other religions have evolved, but that's hardly what we can say about Islam. Islam has been exempted from the critical scrutiny that applies to other religions. Islam has not gone through an enlightenment process by which the barbaric, inhuman, unequal, unscientific aspects of religion have been questioned. Other religions could rectify their errors and mistakes, and prohibit barbarity and violence against women. It was possible because those religions were subjected to critical scrutiny. On the other hand, all forms of discrimination against women still continue to exist in Islam.

Islam must be very weak and fragile if Islamists need blood to keep it alive. In this desperate situation, if moderate Muslims do not break their silence and do not protest against jihadis, we will have to conclude that there is no such thing as a moderate Muslim.

No religious scripture is sacred. All religious texts have been created by human beings. The same goes for the Quran. The days of considering a book as holy or as a commandment of the Almighty are long gone. Human beings aren't too far away from watching and experiencing the beginning of the creation of the universe, the Big Bang explosion. Till date, we have not been able to get hold of any proof of something we can call God. Scientific discoveries have time and again proved that religions are nothing but fairy tales. Believing in such fairy tales, enough humans have killed other humans, enough people have harmed others. They need to stop now. Let the backward thoughts of these barbarians who are anti-democracy, anti-individual freedom and anti-freedom of speech be identified, marked and called out.

Let the world, the progress of the world, the future of the world, and above all, humanity, be saved from them.

Most of the declared religious countries in the world are Islamic states. Non-Islamic states are, by and large, secular in nature. Socially aware artists and litterateurs are deprived of human rights in almost all Islamic states. Those who are unscathed have inevitably compromised themselves. India has its own share of myriad problems. It is said that Hindutva is currently in its heyday and Muslims in the country are cornered. Not that there is no truth in what is said, but in spite of all this, two Muslim men in Udaipur did dare to barge into the shop of Kanhaiya Lal and slaughter him in broad daylight. If I were to imagine a Hindu in a similar state of rage against a Muslim in Bangladesh, I cannot imagine him committing a similar act of murder. Such murders are possible only for those who do not care, or who believe deeply that the Almighty above has given them the right to kill.

What happened in the plush surroundings of the newly-opened Lulu Mall in Lucknow was truly fantastic. A few Muslims, without any provocation, settled down to offer namaaz inside the mall. There are some Muslims who consider the entire world to be their personal mosque. Nothing deters them from starting their namaaz anywhere and everywhere. Seeing some Muslims offering namaaz in the mall, a few Hindus scampered together and started reading the Hanuman Chalisa at the same place. The police, dying to react, swung into action and arrested a few of the Hanuman Chalisa readers. Chaos and commotion ensued. The flustered officials of Lulu Mall put up a notice saying, "Any form of religious prayer is prohibited in this mall."

Let such notices, stating "Any form of religious prayer is prohibited here," be put up in all malls, markets, museums, roads, schools, colleges, offices, courts, boats, launches, buses, trains, ships and airplanes. The time for removing religion from all public places is long overdue. Actually, there is no need for religious institutions, prayer halls, and faith schools. Religion has given nothing to society apart from bigotry, ignorance, communal hatred and terrorism. While it is a grave mistake not to keep religion separate from the State, it is an equally big mistake to let religion meddle in politics. The Islam that is brandishing its swords and killing free thinkers and progressive people is a political Islam. As long as political Islam is alive, every free thinker will live with a constant threat to life.

There's no other way left for us but to free the world from terror and make the world a better place to live. There's no option left for us but to ensure every human being's right to expression and protect the safety of every life. If we don't do it even today, we'll all sink into the darkness of an uncertain future. Governments of all nations have to stop using religion for their political interests. Religion should go away from public places for the sake of humanity. The strict separation between state and religion is urgently necessary. Humans have no choice left but to be scientific in their outlook. Fairy tales don't save human beings; rationalism and humanism do.

(This appeared in the print edition as "Barbarism in the Name of Religion")

(Views expressed are personal)

Taslima Nasrin is an award-winning feminist poet, novelist and public intellectual.

See the original post:

Attack On Salman Rushdie Manifests Barbarism In The Name Of Religion: Taslima Nasrin - Outlook India

Posted in Rationalism | Comments Off on Attack On Salman Rushdie Manifests Barbarism In The Name Of Religion: Taslima Nasrin – Outlook India

Britain doesnt need a public holiday to remember the slave trade – The Spectator

Posted: at 7:47 am

A fair number of episodes in the history of this country are frankly best forgotten. The last thing to do with them, one might have thought, would be to memorialise them with bank holidays. Giving people in Britain a day off to mark, say, Cromwell's harrying of Ireland in the 17th century, or the starting of the Boer War in the interest of corporate capital in the 19th, would at the very least raise eyebrows.

Yet yesterday, on Unesco's International Day for the Remembrance of the Slave Trade, black studies academic Kehinde Andrews suggested exactly this in respect of one such event: namely, our involvement in slavery. There was, he said, really nothing more important to Britain's development. We therefore needed a permanent official public holiday to keep its memory alive, preserve a conscience of the horrors of the transatlantic trade, and to remind us of its direct outcome in the form of continued structural racism, and racial economic and health inequality, in Britain today.

Really? This argument deserves a closer look. For one thing, there is some rather odd historical reasoning going on here.

Nothing more important to Britain's development? British involvement in the slave trade lasted about 200 years, until its abolition in 1807 and the final suppression of servitude in 1833. During that time it was largely colonial (and not everywhere: Ontario, for example, legislated to free its slaves in 1793). Although some Britons may have owned slaves or shares in slaving businesses, in mainland Britain its status was always dubious, both legally and socially. Against Lord Hardwicke's insistence in 1749 that planters could rest assured that any slaves they brought here remained unfree, we have Lord Mansfield's words in 1772 in the great Somerset habeas corpus case that in this country the restraint of a slave was odious, and since it was not allowed or approved by the law of England, therefore the black must be discharged. Even if slave-run plantations contributed some of the capital for Britain's industrial development, can anyone seriously see this as the most important feature of nearly a thousand years of British history from the Norman Conquest to 2022?

Again, it's all very well to cite Marxist historian Eric Williams's view that we can take no credit for abolition because by 1833 slavery had ceased to make economic sense and abolishing it was therefore financially rational. True, he was probably right on the economics: indeed, Adam Smith had said roughly this in the Wealth of Nations in 1776. But to imply from this that the suppression owed everything to economics and nothing to decent moral sensibility is, unless you are a fairly crude economic determinist, somewhat extreme. There was a large moral side to the abolitionist campaign; furthermore, parliamentarians had to be persuaded to vote for abolition, and not all politicians, even in the 19th century, thought exclusively in money terms. It is also a bit difficult to see how this argument could be applied to the use of the West Africa Squadron to suppress the trade after abolition, where there is no obvious British financial self-interest to be found.

Moreover, while racism does undoubtedly exist against black and West Indian people in Britain today, the argument that it is somehow structural and the product of slavery is by no means obvious. Large-scale immigration started only in the 1950s, 120 years after abolition. Unless one believes in some kind of mystical collective folk memory lasting for four generations or so, that West Indians should be seen by white people in some sense as would-be slaves after such a time is implausible. In other words, the institution of slavery may explain the presence of people of African descent in this country, but it is hard put to it to explain the prejudice they encounter.

Unfortunately, to a select group of the initiated, rationalism of this kind about racism cuts little if any ice. Instead, obsession with the past institution of slavery and its perceived consequences today, together with the modern intellectual edifices of postmodernism and anti-racism, is increasingly morphing into a cult. We have what is close to a new religion, something seen as outside and beyond secular intellectual processes.

Indeed, there is an intriguing parallel. One hundred and seventy years ago, Edward Pusey and the 19th-century Oxford Movement saw scripture not as a basis for logical argument but as a support for spontaneous faith inspired by the church fathers. Today anti-racist cultists think in much the same way about history. Listen to members' (especially black members') lived experience, they argue, and you will see the light. Do not ask if history shows that Britain is a hotbed of slavery-derived racism inherent in its very structure, but instead accept that it is, and then ask how history supports this view. Only then will you (especially if you are white) be able to accept your collective guilt and work towards allyship with the oppressed. At this point, for the initiated, everything falls into place. If we are all indeed marked with ineradicable racial guilt, what better than to set up an anti-racist day of penitence, in the same way as Christians mark Good Friday?

From the rest of us, the answer must be simple. We should not join this miserabilist cult. We should continue to question its assumptions from a rationalist standpoint. For that matter, we might even go one better. Here's a nice contrary idea. Why not have a new public holiday, but make it a day of genuine celebration? An obvious candidate would be not some dreary Unesco remembrance day, but 1 August, the anniversary of the date when the Slavery Abolition Act 1833 came into effect. It might even help all of us, black and white alike, to celebrate freedom.

More here:

Britain doesnt need a public holiday to remember the slave trade - The Spectator

Posted in Rationalism | Comments Off on Britain doesnt need a public holiday to remember the slave trade – The Spectator