Daily Archives: February 15, 2017

TMS Jazz Band Puts on Red Hot Performance at Prudential Center – TAPinto.net

Posted: February 15, 2017 at 9:26 pm

NEWARK, NJ -- The Terrill Middle School Jazz Band rocked the main concourse of "The Rock" during the first intermission of Tuesday night's game between the Colorado Avalanche and the host New Jersey Devils.

The band, composed of 7th and 8th graders, played rock classics including "Louie, Louie" by the Kingsmen, "Jump" by Van Halen, "25 or 6 to 4" by Chicago, "In the Midnight Hour" by Wilson Pickett, and "A Hard Day's Night" by The Beatles. (See video below.)

"They were great. Members of the Devils' brass came overto watch us," said John Gillick, Director of the Terrill Middle School Jazz Band."We'll definitely be coming back next year."


On the ice, the Devils jumped out to a 1-0 first-period lead on a goal by Pavel Zacha. Andy Greene and Kyle Palmieri added tallies for the home team, while Mark Barberio and Mikhail Grigorenko scored for the Avalanche. Cory Schneider made 28 saves for the Devils in the victory.

TAPintoSPF.net is Scotch Plains-Fanwood's only free daily paper. Sign up for our daily eNews and follow us on Facebook and Twitter @SPF_TAP.

See more here:

TMS Jazz Band Puts on Red Hot Performance at Prudential Center - TAPinto.net

Posted in Tms | Comments Off on TMS Jazz Band Puts on Red Hot Performance at Prudential Center – TAPinto.net

The Edap Tms SA (EDAP) Stock Rating Decreased by the Zacks Investment Research – DailyQuint

Posted: at 9:26 pm

Edap Tms S.A. (NASDAQ:EDAP) was downgraded by Zacks Investment Research from a hold rating to a sell rating in a research note issued to investors on Tuesday.

According to Zacks, EDAP TMS S.A. develops, produces, markets and distributes minimally invasive medical devices, primarily for the treatment of urological diseases. They currently produce and market devices for treatment of benign prostate hyperplasia and urinary tract stones. They are also developing a third range of products for minimally invasive destruction of certain types of tumors.

Separately, TheStreet upgraded Edap Tms from a sell rating to a hold rating in a research note on Tuesday, September 27th.

Edap Tms (NASDAQ:EDAP) traded up 1.83% during midday trading on Tuesday, reaching $3.34. 39,022 shares of the company were exchanged. The company has a market capitalization of $95.95 million, a price-to-earnings ratio of 8.46 and a beta of 1.29. The company has a 50 day moving average of $3.15 and a 200-day moving average of $3.03. Edap Tms has a 52-week low of $2.43 and a 52-week high of $4.80.

An institutional investor recently raised its position in Edap Tms stock. Wells Fargo & Company MN increased its stake in Edap Tms S.A. (NASDAQ:EDAP) by 1.0% during the third quarter, according to its most recent Form 13F filing with the SEC. The fund owned 193,655 shares of the company's stock after buying an additional 2,000 shares during the period. Wells Fargo & Company MN owned approximately 0.67% of Edap Tms, worth $562,000, at the end of the most recent reporting period. Hedge funds and other institutional investors own 9.56% of the company's stock.

About Edap Tms

EDAP TMS SA (EDAP) is a holding company engaged in developing and marketing the Ablatherm and Focal One devices. The Company operates two divisions: High Intensity Focused Ultrasound (HIFU) and Urology Devices and Services (UDS) (including lithotripsy activities). The Company is developing HIFU technology for the treatment of certain other types of tumors.

Get a free copy of the Zacks research report on Edap Tms (EDAP)

For more information about research offerings from Zacks Investment Research, visit Zacks.com

More:

The Edap Tms SA (EDAP) Stock Rating Decreased by the Zacks Investment Research - DailyQuint

Posted in Tms | Comments Off on The Edap Tms SA (EDAP) Stock Rating Decreased by the Zacks Investment Research – DailyQuint

Memristor Research Highlights Neuromorphic Device Future – The Next Platform

Posted: at 9:25 pm

February 15, 2017 Jeffrey Burt

Much of the talk around artificial intelligence these days focuses on software efforts (various algorithms and neural networks) and on hardware devices such as custom ASICs for those neural networks and chips like GPUs and FPGAs that can help the development of reprogrammable systems. A vast array of well-known names in the industry, from Google and Facebook to Nvidia, Intel, IBM, and Qualcomm, is pushing hard in this direction, and those and other organizations are making significant gains thanks to new AI methods such as deep learning.

All of this development is happening at a time when the stakes appear higher than ever for future deep learning hardware. One of the forthcoming exascale machines is mandated to sport a novel architecture (although what that means exactly is still up for debate), and companies like Intel are suddenly talking with renewed vigor about their own internal efforts on neuromorphic processors.

The focus on such AI efforts has turned attention away from work that has been underway for years on developing neuromorphic processors: essentially, tiny chips that work in a fashion similar to the human brain, complete with technologies that mimic synapses and neurons. As we've outlined at The Next Platform, there are myriad projects underway to develop such neuromorphic computing capabilities. IBM, Hewlett Packard Enterprise (with its work on memristors), Qualcomm (through its Brain Corporation venture), and other tech vendors are making pushes in that direction, while national laboratories like Oak Ridge in Tennessee and universities like MIT and Stanford (with its NeuroGrid project) also have efforts underway. Such work also has the backing of federal government programs, such as DARPA's SyNAPSE and UPSIDE (Unconventional Processing of Signals for Intelligent Data Exploitation) and the National Science Foundation.

Another institution working on neuromorphic processor technology is the University of Michigan's Electrical Engineering and Computer Science department, in an effort led by Professor Wei Lu. Lu's group is focusing on memristors, two-terminal devices that are essentially resistors with memory, retaining stored data even when powered off, which can act like synapses in computers built to work like the human brain and drive machine learning. We've talked about the growing interest in memristors for use in developing computer systems that can mimic the human brain.

Lu's group created a nanoscale memristor that mimics a synapse by using a mixture of silicon and silver housed between a pair of electrodes. Silver ions in the mixture are controlled by the voltage applied to the memristor, changing its conductance state, similar to how synaptic connections between neurons rise and fall based on when the neurons fire off electrical pulses. (In the human brain, there are about 10 billion neurons, each connected to other neurons via roughly 10,000 synapses.)
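To make the voltage-to-conductance analogy concrete, here is a toy sketch in Python of a memristive synapse whose conductance is nudged up or down by voltage pulses and clamped to a physical range. The update rule, constants, and names here are illustrative assumptions for this article, not the actual device physics of Lu's silicon-silver memristor.

```python
# Toy memristive-synapse model: conductance drifts with applied voltage pulses.
# All values and the linear update rule are assumptions for illustration only.

def update_conductance(g, voltage, g_min=0.0, g_max=1.0, rate=0.1):
    """Nudge conductance up for positive pulses and down for negative ones,
    clamped to the device's physical range, analogous to a synapse
    strengthening or weakening with activity."""
    g_new = g + rate * voltage
    return max(g_min, min(g_max, g_new))

# Usage: a train of voltage pulses gradually potentiates, then depresses, the synapse.
g = 0.5
for pulse in [+1, +1, +1, -1, -1]:
    g = update_conductance(g, pulse)
    print(f"pulse {pulse:+d} -> conductance {g:.2f}")
```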

Neuromorphic computing proponents like Lu believe that building such brain-like computers will be the key, going forward, to developing systems that are smaller, faster and more efficient. During a talk last year at the International Conference for Advanced Neurotechnology, Lu noted the accomplishment of Google's AlphaGo program, but pointed out that it had to run on a system powered by 1,202 CPUs and 176 GPUs. He also pointed out that it was designed for a specific task (to learn and master Go) and that doing so took three weeks of training and some 340 million training repetitions. Such large compute needs and narrow task orientation are among the weaknesses of driving AI in software, he said. AlphaGo's win was an example of brute force: an inefficient computer using a lot of power (more than the human brain consumes), designed for a specific job, and requiring a long period of training. He also pointed to IBM's BlueGene/P supercomputer at Argonne National Lab that was used to simulate a cat's brain. It used 147,456 CPUs and 144TB of memory to create a simulation that ran 83 times slower than a real cat's brain.

"Once again, this is because they tried to emulate this system in software," he said. "We don't have the efficient hardware to emulate these biological systems. So the idea is that if we have the hardware, then we can also implement some of the rules or features we learn in biology. Not only will we make computers faster, but also you can use it to [interface] with biological systems to enhance our brain functions."

"We're not trying to do it in software. We're actually trying to build, as a fundamental device on hardware, a computer network very similar to the biological neuro-network."

His group is doing that through the use of memristor synapses and CMOS components that work like neurons, built on what Lu described as a crossbar electrical circuit. The crossbar network is comparable to biological systems in the way it operates. One advantage such a system has over traditional computers is the synapse-like way memristors operate; traditional computers are limited by the separation between the CPU and memory.

Such a change could have a significant impact on a $6 billion memory industry that is looking at what comes after flash, he said. Lu's team introduced its concept in 2010, and he is now a cofounder of Crossbar, a ReRAM company with $85 million in venture capital backing that was founded that same year and is working to commercialize what the University of Michigan team developed. He said in 2016 that the company had already developed products for several customers. The company last month announced it is sampling embedded 40nm ReRAM manufactured by Semiconductor Manufacturing International Corp. (SMIC), with plans to come out with a 28nm version in the first half of the year.

Categories: Analyze, Compute

Tags: Neuromorphic


Read more from the original source:

Memristor Research Highlights Neuromorphic Device Future - The Next Platform

Posted in Neurotechnology | Comments Off on Memristor Research Highlights Neuromorphic Device Future – The Next Platform

Political Correctness Is An Absolute Must | Time.com

Posted: at 9:24 pm

Donald Trump, holding a photo of himself beside, as he might say, a "dog." (Photo: Sara D. Davis/Getty Images)

The Republican Convention has barely begun, and the party has already made clear its primary political foe. Of course potshots will be taken at the "mainstream media," liberals and Hillary Clinton. But what did several of last night's convention speakers, from Duck Dynasty's Willie Robertson to Real World's Sean Duffy, regard as the real enemy? Political correctness.

You might have heard: America is plagued by "political correctness run amok." We were told this by Donald Trump's former campaign manager, Corey Lewandowski, when he tried to defend his old boss for tweeting an anti-Semitic Internet meme depicting a Star of David atop a pile of cash. The origins of that meme were recently discovered to be a message board of neo-Nazis and white supremacists who presumably agree with Lewandowski. After all, they titled their message board, "Politically Incorrect."

We were told by Republicans, after the hideous, hate-fueled mass shooting by an ISIS-idolizing lunatic in Orlando, that easy access to guns was not even partly to blame. Then what was? Political correctness! According to the logic of a top NRA official, who was widely parroted by Republican lawmakers, the Obama administration's political correctness prevented anything from being done about the shooter's racist ramblings.

When the elephant ate its own tail, and members of his own party panned Trump for exploiting the tragedy with offensive and egomaniacal tweets, we were told the criticism was misplaced. The real culprit? "We can't afford to be politically correct anymore," said Trump.

Political correctness has been a whipping boy of the right wing for decades, and lately Trump is cracking the whip with abandon. He recently told a group of evangelical leaders that they shouldn't pray for President Obama because "We can't be politically correct and say we pray for all of our leaders, because all of your leaders are selling Christianity down the tubes." (Never mind that Trump places prayer within the scope of self-interested transactions.) Remember his response to Fox host Megyn Kelly when she asked him about his temperament after calling some women "dogs" and "fat pigs"? It was: "I think the big problem this country has is being politically correct." After being skewered by all sides for racist comments about a federal judge? "We have to stop being so politically correct in this country."

If you're like many Americans, you might have been persuaded that political correctness is one of our country's primary problems. Trump badly wants you to believe this, but you'd be wrong to do so. Trump is effectively positioning himself as the anti-PC candidate. Whereas Hillary Clinton thinks and speaks in the strategic, and sometimes subtle, language of diplomacy, Trump explicitly presents himself as undiplomatic and politically incorrect. In doing so, he is cheapening and polarizing our political debates and, more important, he is making our country less safe.

You might think politicians speak in too much coded language, designed to cloak their true positions and to avoid offending anyone. But let's be clear: The opposite of political correctness is not unvarnished truth-telling. It is political expression that is careless toward beliefs and attitudes different from one's own. In its more extreme forms, it is incivility, indecency or vulgarity. These are the true alternatives to political correctness. These are the traits that Trump tacitly touts when he criticizes political correctness. And these are the essential attributes of Trump's candidacy.

This is not the first time our political discourse has been crass. When he traveled to the United States fifty years after the nation gained its independence, the French writer Alexis de Tocqueville noticed a "vulgar turn of mind" among American journalists. Journalists back in France often wrote in an eloquent and lofty manner but, according to Tocqueville, the typical American journalist made "an open and coarse appeal to the passions of the populace; and he habitually abandons the principles of political science to assail the characters of individuals." Sound familiar? This vulgarity might have been characteristic of that era's journalists, who brazenly competed for readers and hadn't yet developed common standards of professionalism and ethics. But it wasn't characteristic of the types of Americans who sought the nation's highest political office.

Trump's vulgarity is so vivid, in part, because it contrasts so starkly with Barack Obama's civility and cool-headedness. I predict that the more Trump debases our political climate with his brand of political incorrectness, the more we will come to appreciate the qualities our president embodies. Regular Obama critic David Brooks recently praised the president for his "ethos of integrity, humanity, good manners and elegance." Yet when the president challenges us to disagree without being disagreeable and to be careful not to conflate an entire religion with the hateful ideology that seeks to exploit and debase that religion, we watch as his detractors accuse him of political correctness.

You probably heard the accusations: Obama is pussyfooting around the phrase "radical Islam" because he'd rather protect the feelings of terrorists than the lives of Americans. Or something like that. On one hand, the intense scrutiny of the president's language reveals a conspicuous lack of substantive criticism of the president's foreign policy. As President Obama wondered aloud in a recent press conference, "What exactly would using this label accomplish? Would it make ISIL less committed to killing more Americans? Would it bring in more allies? Is there a military strategy that is served by this?" Of course not. It is, as the president said, a distraction, a political talking point, not a strategy.

But on the other hand, we are wise to focus on the language used in the critically important issue of knowing who our enemies are and who they are not. This is an issue that has the greatest political consequences. It is a political issue on which we need to be correct. And yet in that press conference, the president himself dismissed political correctness, underscoring the concept's status as a universal pariah, even as he defended his terminology. Obama explained that "the reason that I am careful about how I describe this threat has nothing to do with political correctness and everything to do with defeating extremism."

Just as no serious firefighter would actually fight fire with fire, we can't fight the extremist language of foreign adversaries (and the insecurity and simplemindedness that propel it) with our own extremist language, insecurity and simplemindedness. It would be geopolitically incorrect, if you will, to do so. It would alienate our allies and motivate our adversaries.

After all, as conservative foreign policy expert Eli Lake has pointed out, our biggest allies in the Middle East are people in countries, such as Egypt and Saudi Arabia, whose brand of Islam strikes American sensibilities as "radical." After special forces raided his compound, Osama bin Laden's notebooks revealed that al Qaeda recruiting activities were disabled because, according to Bin Laden, Obama administration officials had largely stopped using the phrase "the war on terror," not wanting to provoke Muslims. Nothing would help ISIL's recruiting strategy more than an American president lumping together, rather than drawing a distinction between, terrorists and the world's billion and a half Muslims.

Conservatives might tell us Obama is politically correct and Trump tells it like it is. But when it comes to the debate over the phrase "radical Islam," Obama is playing chess and Trump is playing dodgeball. If politics is about strategy, political correctness is arming oneself with a sound strategy, while political incorrectness is strategic recklessness.

Many on the left think conservatives demonize political correctness because they resent having to suppress their own prejudices. That might be true for some. But as someone who teaches a college class on political rhetoric, I've come to appreciate that anti-PC attitudes are part of a longer tradition of suspicion toward carefully calibrated language. Throughout history, our species has tended to distrust people who have a knack for political oratory. Part of this stems from the fact that most people are not good public speakers, while at the same time most people have an affinity for people who are like them. This is something psychologists call "homophily," and it is the reason so many of us tend to want to vote for somebody we'd "like to have a beer with" rather than someone smarter than us.

Conservative politicians who criticize Obama and political correctness understand that eloquence is often perceived less as a mark of intelligence and personal style and more as a product of artifice and self-indulgence. This is why they can muster up the backhanded compliment that Obama is a good speaker or a gifted orator.

Why do we hate political correctness so much? Our suspicion of sensitive political language goes back to ancient Greece, when the sophists got a bad rap for going around Athens training wealthy kids to become more talented speakers so they could win votes or dodge prison time. Plato famously distrusted rhetoric, although his student Aristotle would rehabilitate its reputation as an essentially virtuous endeavor. Political correctness, in which public officials are careful to avoid language that alienates or offends, requires a certain type of expressive competence. In the 2016 presidential campaign, Trump has critiqued this expressive competence while being wholly unequipped with it.

But political correctness is a longstanding American tradition and a deeply rooted value. Our country's founders placed a premium on the ability to persuasively articulate opposing viewpoints. They rejected government censorship precisely because they trusted that individuals could and would regulate themselves in our proverbial free marketplace of ideas. They didn't prohibit offensive speech because they believed truth lost its vigor unless confronted with falsehoods, and tolerance lost its social acceptance unless it could stand in contrast with ugly prejudices. They knew the value of an idea lay in its ability to gain favor in debates, which should be, in Supreme Court Justice William Brennan's words, "uninhibited, robust, and wide-open." Trump can say what he will about Muslims and Mexicans, but thoughtful journalists and pundits can and should say what they will about Trump.

If you are one of the many Americans who think political correctness is a detriment to politically vibrant debates in this country, you have it all backwards: People who use politically correct language aren't trying to stifle insensitive speech. They're simply trying to out-compete that speech in a free and open exchange.

Every time Trump says something that's ugly or false and then claims political correctness is "the big problem this country has" and something we "can't afford," he's basically blaming this free marketplace itself. He's petulantly arguing with the umpire. He's blaming you and me, the public, for exercising the freedom to decide which ideas are good or bad. In the end, many of you don't like or want what he's peddling. You reject his racist tirades and narcissistic antics. You support common-sense gun legislation that would help prevent another terrorist hate crime like the one that occurred in Orlando. You reject praying for political leaders based on those leaders' party affiliations. And you don't think women deserve to be compared to "pigs" or "dogs" by people seeking our country's highest office. I happen to think you're correct, politically.

Mark Hannah was a staffer on the John Kerry and Barack Obama presidential campaigns and is the author of the new book The Best Worst President: What the Right Gets Wrong About Barack Obama. He teaches at NYU and The New School.

Excerpt from:

Political Correctness Is An Absolute Must | Time.com

Posted in Political Correctness | Comments Off on Political Correctness Is An Absolute Must | Time.com

Letter to the editor: Political correctness has influenced minds – Post Register

Posted: at 9:24 pm

Letter to the editor: Political correctness has influenced minds
Post Register
Inherent in the output of some of the favored, perennial guest writers, is how much political correctness has influenced the minds of many. Much of the radicalism that has attended the election is based on programmed ignorance and/or misinformation in ...

Follow this link:

Letter to the editor: Political correctness has influenced minds - Post Register

Posted in Political Correctness | Comments Off on Letter to the editor: Political correctness has influenced minds – Post Register

Trump, eugenics, and the historical precedent for his anti-Muslim travel ban – Daily Maverick

Posted: at 9:23 pm

Eugenics, which was endorsed by politicians and scientists across the ideological spectrum, sought to improve and strengthen human populations by means of compulsory sterilisation and restrictive immigration policies. The US was the leader in eugenics in the 1920s, but soon the Nazi state would take over this mantle.

What is less well known is that eugenics also provided pro-Nazi America First propagandists, such as the famous aviator Charles Lindbergh, with the scientific evidence needed to demand drastic measures to protect the superior Nordic, Germanic and Anglo-Saxon genes of Western Europeans. The US advocates of immigration restrictions drew on H.H. Goddard's 1912 study, which used intelligence tests to identify the "feebleminded" among immigrants arriving on Ellis Island.

Such studies allowed eugenics activists such as Charles Davenport and Madison Grant to successfully lobby the US Congress to introduce these immigration restrictions. Eugenics went into sharp decline soon after this 1924 legislative victory. A decade later, the Nazi regime used eugenics to justify racial laws to protect pure Aryan genetic stock.

In his history of scientific racism in America, The Legacy of Malthus, Allan Chase claims that these country quotas prevented an estimated 6 million southern, central and eastern Europeans from entering the US between 1924 and 1939. As Stephen Jay Gould concludes in The Mismeasure of Man: "We know what happened to many who wanted to leave but had no place to go. The pathways to destruction are often indirect, but ideas can [be] agents as sure as guns and bombs."

Trump's presidential campaign seems to have borrowed from Lindbergh's "America First" rhetoric, which the latter deployed as the leading spokesman of the isolationist America First campaign in 1940. In an interview in the 1939 edition of Reader's Digest, Lindbergh had referred to a metaphorical "Western Wall" to protect white Americans from the infiltration of foreign blood: "It is time to turn from our quarrels and to build our White ramparts again. This alliance with foreign races means nothing but death to us. It is our turn to guard our heritage from Mongol and Persian and Moor, before we become engulfed in a limitless foreign sea. Our civilisation depends on a united strength among ourselves; on strength too great for foreign armies to challenge; on a Western Wall of race and arms which can hold back either a Genghis Khan or the infiltration of inferior blood; on an English fleet, a German air force, a French army, an American nation, standing together as guardians of our common heritage, sharing strength, dividing influence." As the by now very familiar refrain goes -- history repeats itself, first as farce, then as tragedy. With Trump it is wall-to-wall tragicomedy.

It was the tragic history of the Holocaust that prompted Mark Hetfield, the chief executive of the Jewish refugee programme HIAS, to observe recently that "it is a deep and tragic irony that Donald Trump is slamming the door in the faces of refugees right before International Holocaust Remembrance Day." This was especially disturbing since the entire refugee convention came out of the Holocaust and the failure of the international community to protect Jews and survivors. Trump antagonised Jews and Holocaust survivors further when he omitted to mention Jews in his public statement on International Holocaust Remembrance Day. Both Holocaust amnesia and denial seemed to converge in Trump's enactment of the Muslim ban. Yet some do insist on remembering.

In commemoration of International Holocaust Remembrance Day in January this year, Russell Neiss, a 33-year-old grandson of Holocaust survivors, set up a Twitter account to automatically post the names and photographs of German Jewish refugees who were on board the St Louis in May 1939, when the majority of passengers were refused entry into the US. The 937 passengers had left Hamburg on 13 May 1939. After being refused entry to the US, their ship was forced to return to Europe, where 532 passengers were later transported to various concentration camps and 254 were murdered; those 254 are the names that are tweeted, at a rate of one every five minutes, for 21 hours.

Neiss, who builds apps and interactive technology for Jewish education, came up with the idea as International Holocaust Remembrance Day was approaching. At the time, he was aware that there were other name-reading Twitter bots such as the Every Three Minutes account, which uses the fact that a person was sold into slavery every three minutes in the antebellum US. He was also aware of a bot that reads the names of the St. Louis victims based on data from the United States Holocaust Memorial Museum.

In addition to the reading of the names, Neiss included photographs in these tweets. One of the photographs in this stream of tweets is that of a small, smiling boy dressed all in white. This photograph has a standardised caption: "My name is Joachim Hirsch. The US turned me away at the border. I was murdered in Auschwitz." One of the countless responses to these tweets was from the Democratic Party's Elizabeth Warren, who declared that Trump's order restricting immigrants from seven Muslim countries and refusing admission of Syrian refugees was a betrayal of American values. Under Trump, immigration restrictions are rooted not in early twentieth-century eugenics ideas about "feebleminded" foreigners, but rather in the conflation of Islam and terrorism. So, how did we get from Nazi eugenics to the Muslim ban?

My 2016 book, Letters of Stone: From Nazi Germany to South Africa, is a Holocaust family memoir that tells the story of how eugenics-influenced immigration policies resulted in Jews being unable to escape Nazi-occupied Europe. The book is based on one hundred letters my father received from his parents and siblings who were trapped in Berlin. The letters were sent from 1936, when my father arrived in South Africa, until 1943, when his parents and siblings were deported from Berlin to Auschwitz and Riga. My grandmother's letters to my father, Herbert Leopold Robinski, and his younger brother Arthur, who had managed to escape to Northern Rhodesia in 1938, are mostly about the immense difficulties facing German Jews who desperately wanted to leave.

Although I wrote Letters of Stone in the shadow of the Syrian refugee crisis in Europe, I never imagined the possibility of Trump's Mexican wall, the Muslim travel ban and the closing of US borders to Syrian refugees. Neither did I realise then that a US president would resuscitate Lindbergh's 1940 America First campaign and inspire far-right movements in Europe. While my book focuses on how the Nazi state used eugenics to justify its persecution and murder of Jews, Roma, Sinti, homosexuals, the disabled and other "racially inferior" groups, US immigration policy in the first half of the twentieth century relied upon eugenics to justify shutting its doors to unwanted foreigners. Now the Trump administration is using the imperatives of national security to justify its Muslim ban. So how did we get here?

Whereas most histories of Nazism tend to be confined to Europe, Letters of Stone draws attention to its transnational roots. Hannah Arendt's concept of the "boomerang effect" shows how the seeds of Nazi racial hygiene, as well as later US immigration policies, were planted in the far-flung fertile soils of the colonies. In 1913, the German physical anthropologist and anatomist Eugen Fischer published his ethnographic study of racial mixing among the Rehoboth Basters in German South West Africa. Hitler praised the book after reading it in a Munich prison in 1923. The trajectory of Fischer's professional career reveals how scientific findings incubated in the human laboratories of German South West Africa later rebounded into the heartland of Europe. By the mid-1930s, Fischer had become one of the Nazis' most senior racial scientists, and from 1929 to 1942 he was director of the prestigious Kaiser Wilhelm Institute for Anthropology, Human Heredity, and Eugenics in Berlin. It was there that Fischer's Rehoboth study and his Berlin institute became intimately entangled with Mengele's experiments in Auschwitz as well as Nazi racial classifications of Jews, Roma and Sinti.

Letters of Stone tells the story of the desperate attempts by my grandmother and my father to get the family out of Germany. It is also about the moral indifference of immigration law in the face of human catastrophe. We now witness Trump's travel bans, which, in the name of national security, demonstrate a similar indifference to the human suffering of refugees from Syria and other countries engulfed in war and violence. Trump's Muslim ban re-enacts an especially dark period in America's past, when Lindbergh was the leader of the pro-Nazi America First movement. In 2004, Philip Roth published The Plot Against America, an alternative history in which Franklin D. Roosevelt is defeated by Lindbergh in the 1940 presidential election. While Roth's book is fictional, the rise to power of Trump has made it frighteningly prophetic. Roth's novel implies that there has always been a dangerous undercurrent of chauvinistic patriotism and fascism embedded within conservative American politics.

German Jewish scholars such as Theodor Adorno and Max Horkheimer, who were exiled in the United States following the rise of Nazism, also identified this potential for fascism and authoritarianism in America. Whereas Trump's medium for his America First messaging is Twitter, Adorno and Horkheimer were concerned about the fascist aesthetics of the US culture industry. In 1940, Lindbergh was the spokesperson for the America First movement; now, almost 80 years later, Trump and Bannon promise to Make America Great Again, resuscitating once more this dangerous American brand of populism. Bannon's valorisation of apocalyptic war and destruction as the ideological furnace for forging a return to traditional white American values has eerie echoes of Nazism and other catastrophic forms of fascism. Just as economic depression in Germany paved the way for Hitler, so too have neoliberalism and growing economic inequality in the US created the conditions for Trump's rise to power. Trump's particular brand of Islamophobic populism may not look exactly like Nazism, but its logic certainly mirrors Lindbergh's pro-Nazi America First movement and his calls for a "Western Wall" to keep foreigners out.

In recent weeks, political activists and media commentators have stressed parallels between the refusal to allow European Jews to enter the US in the 1930s and Trump's Muslim ban. In a YouTube video produced by UNICEF, an elderly German Jewish refugee and Holocaust survivor speaks about how, as a small boy, he became a stateless refugee fleeing Nazi terror; sitting right next to him, a small boy describes his own terrifying flight from the war in Syria.

As UNICEF's description of the video states: "80 years apart, these two refugees have more in common than you'd think." Similarly, in an article published in the Independent on 27 January 2017, the journalist Peter Walker writes that many Holocaust survivors find Donald Trump's refugee ban tragically similar to what happened in the 1930s. What has not been mentioned much is the history of eugenics-inspired immigration restrictions and how early twentieth-century ideas about dangerous foreigners have re-entered American public consciousness. This is a reminder of how immigration policies continue to be shaped by histories of racism and by scientific studies that were incubated in the human laboratories of the colonies. DM

Professor Steven Robins is with the Department of Sociology & Social Anthropology, University of Stellenbosch.

Photo: US President Donald J. Trump waves outside the entrance to the West Wing after seeing off Israeli Prime Minister Benjamin Netanyahu (not pictured) following their meeting at the White House in Washington, DC, USA, 15 February 2017. This is the first official meeting of the two leaders since President Trump has taken office. EPA/MICHAEL REYNOLDS

See the original post here:

Trump, eugenics, and the historical precedent for his anti-Muslim travel ban - Daily Maverick

Posted in Eugenics | Comments Off on Trump, eugenics, and the historical precedent for his anti-Muslim travel ban – Daily Maverick

Why Google’s Spanner Database Won’t Do As Well As Its Clone – The Next Platform

Posted: at 9:23 pm

February 15, 2017 Timothy Prickett Morgan

Google has proven time and again it is on the extreme bleeding edge of invention when it comes to scale out architectures that make supercomputers look like toys. But what would the world look like if the search engine giant had started selling capacity on its vast infrastructure back in 2005, before Amazon Web Services launched, and then shortly thereafter started selling capacity on its high level platform services? And what if it had open sourced these technologies, as it has done with the Kubernetes container controller?

The world would surely be different, and the reason it is not is that there is a lesson to be learned, one that applies equally well to traditional HPC systems for simulation and modeling as to the Web application, analytics, and transactional systems forged by hyperscalers and cloud builders. And the lesson is this: Making a tool or system that was created for a specific task more general purpose and enterprise-grade, meaning mere mortals (not just Site Reliability Engineers at hyperscalers) can make it work and keep it working, is very, very hard. And just because something scales up and out does not mean that it scales down, as it needs to do to be appropriate for enterprises.

That, in a nutshell, is why it has taken nearly ten years since Google first started development of Spanner, and five years from when Google released its paper on this globe-spanning, distributed SQL database, for it to become available as a service on Google's Cloud Platform public cloud, aimed at more generic workloads than its own AdWords and Google Play.

If this were easy, Google would have long since done it, or someone else cloning Google's ideas would have, and thus relational databases that provide high availability, horizontal scalability, and transactional consistency on a vast scale would be normal. They are not, and that is why the availability of Spanner on Cloud Platform is a big deal.

It would have been bigger news if Google had open sourced Spanner or some tool derived from it, much as it has done with the guts of its Borg cluster and container controller through the Kubernetes project. That may yet happen, since Cockroach Labs, the New York City startup, is cloning Spanner much as Yahoo cloned Google's MapReduce to create Hadoop, and much as the HBase and Cassandra NoSQL databases were derived from ideas in Google's BigTable NoSQL database.

To put it bluntly, it would have been more interesting to see Google endorse CockroachDB and support it on Cloud Platform, creating an open source community as well as a cloud service for its Cloud Platform customers. But, as far as we know, it did not do that. (We will catch up with the Cockroach Labs folks, who all came from Google, to see what they think about all this.) And we think that the groundswell of support for Kubernetes, which Google open sourced and let go, is a great example of how to build a community with momentum very fast.

For all we know, Google will eventually embrace CockroachDB as a service on Cloud Platform not just for external customers but for internal Google workloads as well, much as is starting to happen with Kubernetes jobs running on Cloud Platform through the Container Engine service among Googlers.

Back in 2007, Google was frustrated by the limitations of its Megastore and BigTable NoSQL databases, which were fine in that they provided horizontal scalability and reasonably fast performance, but Google also wanted these data services to behave more like traditional relational databases and to be geographically distributed, for high availability and for maximum throughput on a set of global applications that also ran geographically. And so it embarked on a means of taking BigTable, which had been created back in 2004 to house data for Google's eponymous search engine as well as Gmail and other services, and allowing it to span global distances while still being usable as a single database by Google's developers, who could not care less how a database or datastore is architected and implemented so long as it gets closer and closer to the SQL-based relational database that is the foundation of enterprise computing.

And, by the way, that means a pairing of relational data models and database schemas with the SQL query language that was invented by IBM nearly forty years ago and cloned by Oracle, Sybase, Informix, and anyone else you can think of, including Postgres and MySQL. Moreover, IBM has been running clustered databases on its mainframes for as long as we can remember; they are called Parallel Sysplexes, and they can be locally clustered as well as geographically distributed, running a cluster of DB2 database instances as if there were one giant, logical database. Just like Spanner. Google databases like Spanner may dwarf those that can be implemented on IBM mainframes, but Google was not the first company to come up with this stuff, contrary to what Silicon Valley may believe.

With any relational database, the big problem when many users (be they people or applications) share the database is deciding who has access to the data and who can change it. There are very sophisticated timestamping and locking mechanisms for deciding who has the right to change data and what that data is; these are the so-called ACID properties of databases. Google luminary Eric Brewer, who is vice president of infrastructure at Google and who helped create many of the data services at the search engine giant, coined the CAP Theorem back in 1998, and the ideas were developed by the database community in the following years. The gist of the CAP Theorem is that all distributed databases have to worry about three things (consistency, availability, and partition tolerance) and, no matter what you do, you can have no more than two of these properties fully implemented at any time in the datastore or database. Brewer explained this theorem in some detail in a blog post related to the Cloud Spanner service Google has just launched, and also explained that the theorem is about having 100 percent of two of these properties, and that in the real world, as with NoSQL and NewSQL databases, the real issue is how close you can get to 100 percent on all three and still have a workable, usable database that is reliable enough to run enterprise applications.

With Spanner, after a decade of work, Google has been able to achieve this. (You can read all about Spanner in the paper that Google released back in October 2012.) Part of the reason Google can do this is that it has developed a sophisticated timestamping scheme for the globally distributed parts of Spanner that creates a kind of universal and internally consistent time, synchronized by Google's own network and not dependent on outside services like the Network Time Protocol (NTP) that servers use to stay relatively in sync. Google needed finer-grained control of timestamping with Spanner, so it came up with a scheme based on atomic clocks and GPS receivers in its datacenters that could provide a kind of superclock spanning all of its datacenters, ordering transactions across the distributed systems. This feature, called TrueTime by Google, is neat, but the real thing that makes Google's Spanner work at the speed and scale that it does is the internal Google network that lashes those datacenters to the same heartbeat of time as it passes.
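To illustrate the role that bounded clock uncertainty plays here, below is a minimal Python sketch of a Spanner-style commit wait, assuming a TrueTime-like API that returns an interval guaranteed to contain the true time. The names (TTInterval, true_time_now, EPSILON_SECONDS) and the 7 ms uncertainty figure are assumptions for illustration, not Google's actual API or measured bounds.

```python
import time
from dataclasses import dataclass

EPSILON_SECONDS = 0.007  # assumed clock uncertainty bound (~7 ms), illustrative only

@dataclass
class TTInterval:
    earliest: float
    latest: float

def true_time_now() -> TTInterval:
    """Return an interval assumed to contain the true absolute time."""
    t = time.time()
    return TTInterval(t - EPSILON_SECONDS, t + EPSILON_SECONDS)

def commit_wait(commit_ts: float) -> None:
    """Block until commit_ts is definitely in the past on every clock,
    so no later transaction can be assigned an earlier timestamp."""
    while true_time_now().earliest <= commit_ts:
        time.sleep(0.001)

# Usage: pick the latest possible current time as the commit timestamp,
# then wait out the uncertainty before making the write visible.
commit_ts = true_time_now().latest
commit_wait(commit_ts)
print(f"write visible at commit timestamp {commit_ts:.6f}")
```

The design point the sketch tries to capture is that the smaller the uncertainty bound the network and clocks can guarantee, the shorter the commit wait, which is why the quality of the underlying network matters as much as the clocks themselves.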

Brewer said as much in a white paper that was published about Spanner and TrueTime in conjunction with the Cloud Spanner service this week.

Many assume that Spanner somehow gets around CAP via its use of TrueTime, which is a service that enables the use of globally synchronized clocks. Although remarkable, TrueTime does not significantly help achieve CA; its actual value is covered below. To the extent there is anything special, it is really Google's wide-area network, plus many years of operational improvements, that greatly limit partitions in practice, and thus enable high availability.

The CA here refers to Consistency and Availability, and these are possible because Google has a very high-throughput, global fiber optic network linking its datacenters, with at least three links between the datacenters and the network backbone, called B1. This means that Spanner partitions that are being written to, and that are trying to replicate data to other Spanner partitions running around the Google facilities, have many paths to reach each other and eventually get all of the data synchronized, with "eventually" being a matter of tens of milliseconds: not tens of nanoseconds, like a port-to-port hop on a very fast switch, and not hundreds of milliseconds, which is the time it takes for a human being to notice an application moving too slowly.

The important thing about Spanner is that it is a database with SQL semantics that allows reads without any locking of the database and massive scalability on local Spanner slices to thousands (and we would guess tens of thousands) of server nodes, with very fast replication on a global scale to many different Spanner slices. When we pressed Google about the local and global scalability limits of the Cloud Spanner service, a Google spokesperson said: "Technically speaking, there are no limits to Cloud Spanner's scale."

Ahem. If we had a dollar for every time someone told us that. . . . What we know from the original paper is that Spanner was designed to, in theory, scale across millions of machines across hundreds of datacenters and juggle trillions of rows of data in its database. What Google has done in practice, that is another thing.

We also asked how many geographically distributed copies of the database are provided through the Cloud Spanner service, and this was the reply: "Everything is handled automatically, but customers have full view into where their data is located via our UI/menu."

We will seek to get better answers to these and other questions.

The other neat thing about the paper that Brewer released this week is that it provides some availability data for Spanner as it runs inside of Google, and this chart counts incidents (unexpected things that happened) rather than failures (times when Spanner itself was unavailable). Incidents can cause failures, but not always, and Google claims that Spanner is available more than 99.999 percent (so-called 5 9s) of the time.

As you can see from the chart above, the most frequent cause of incidents relating to Spanner running internally was user error, such as overloading the system or not configuring something correctly; in such cases, only that user is affected and everyone else using Spanner is blissfully unaware of the issue. (Not my circus, not my monkeys.) The cluster incidents, which made up 12.1 percent of Spanner incidents, were when servers or datacenter power or other components crashed, and often a Site Reliability Engineer is needed to fix something here. The operator incidents are when SREs do something wrong, and yes, that happens. The bugs are true software errors, presumably in Spanner code as well as applications, and Brewer said that the two biggest outages (in time and impact) were related to such software errors. Networking errors for Spanner are when the network goes kaplooey, and they usually caused datacenters or regions with Spanner nodes to be cut off from the rest of the Spanner cluster. To be a CA system in the CAP Theorem categorization, the A has to be pretty good, which means network partitions cannot be a frequent problem.

With under 8 percent of Spanner failures being due to network and partition issues, and with north of 5 9s availability, you can make a pretty good argument that Spanner, and therefore the Cloud Spanner service, is a pretty good fuzzy CA database, not just hewing to the CP definition that both Spanner and CockroachDB technically fall under.

The internal Spanner at Google underlies hundreds of its applications, holds petabytes of capacity, and churns through tens of millions of queries per second, and it is obviously battle-tested enough for Google to trust other applications on it besides its own.

At the moment, Cloud Spanner is only available as a beta service to Cloud Platform customers, and Google is not talking about a timeline for when it will be generally available, but we expect a lot more detail at the Next 17 conference that Google is hosting in early March. What we know for sure is that Google is aiming Cloud Spanner at customers who are, like it was a decade ago, frustrated by MySQL databases that are chopped up into shards, which is what Google was using at the time as its relational datastore, as well as at those who have chosen the Postgres path once Oracle bought MySQL. The important thing is that Spanner, and now Cloud Spanner, supports distributed transactions, schemas, and DDL statements as well as the SQL queries and JDBC drivers that are commonly used in the enterprise to tickle databases. Cloud Spanner has client libraries for the popular languages out there, including Java, Node.js, Go, and Python.

As is the case with any cloud data service out there, putting data in is free, but moving it around different regions is not and neither would be downloading it off Cloud Platform to another service or a private datacenter, should that be necessary.

Categories: Cloud, Hyperscale, Store

Tags: BigTable, Cloud Platform, Cloud Spanner, Google, Megastore, Spanner


More:

Why Google's Spanner Database Won't Do As Well As Its Clone - The Next Platform

Posted in Cloning | Comments Off on Why Google’s Spanner Database Won’t Do As Well As Its Clone – The Next Platform

Pokemon Go Adds 80 Generation 2 Pokemon, New Evolution Items This Week – IGN

Posted: at 9:22 pm


Niantic and The Pokemon Company have announced that more than 80 new Pokemon are headed to Pokemon Go this week.

The new Pokemon come from the Johto region, originally introduced in Pokemon Gold and Silver, and can be encountered in the wild starting this week. Niantic is also adding new Evolution items for evolving Pokemon, as well as new purchasable outfit and accessory options for customizing your trainer.


New berries will also be introduced to aid in catching Pokemon. The new Nanab Berry will slow the movements of wild Pokemon, while the Pinap Berry will double the amount of candy earned from catching a Pokemon if the next ball thrown yields a successful catch. The new berries join the Razz Berries that were already in the game, which can be fed to a Pokemon to make it slightly easier to catch.

While a full list of new Pokemon isn't available yet, Niantic specifically mentioned that Chikorita, Cyndaquil, and Totodile will be among the new additions. The Pokemon coming this week join the initial set of generation 2 Pokemon introduced to Pokemon Go in December, which included Togepi, Togetic, Pichu, Elekid, Smoochum, Magby, Igglybuff, and Cleffa.

A few of the new Pokemon and new berry types on display.

Today's announcement arrives as the Pokemon Go Valentine's Day event comes to a close, ending a week of double candy rewards and extended six-hour Lure Modules.

The news also ends months of speculation about the full Johto Pokedex appearing in Pokemon Go, following details datamined from previous updates. Additional features found from datamining, including shiny Pokemon variants, have not yet been officially announced.

For much more on Pokemon Go, see IGN's Pokemon Go wiki guide, including a list of original Pokemon that evolve in generation two.

Andrew is IGN's executive editor of news and currently has a full Pokedex for the United States and Europe. You can find him rambling about Persona and cute animals on Twitter.

More here:

Pokemon Go Adds 80 Generation 2 Pokemon, New Evolution Items This Week - IGN

Posted in Evolution | Comments Off on Pokemon Go Adds 80 Generation 2 Pokemon, New Evolution Items This Week – IGN

Eye Evolution: The Waiting Is the Hardest Part – Discovery Institute

Posted: at 9:22 pm

Without calling it a series, I've written several articles recently that followed a logical path. In the first, I described the distinction between incremental innovation and radical innovation. I also outlined the commonalities and differences between intelligent design and theistic evolution (TE) as approaches to biology. In a follow-up, I applied the concepts from the first article to the proposed evolution of the vertebrate eye, demonstrating that it could not have occurred without intelligent direction. That's mainly because the majority of steps required for the addition of a lens are disadvantageous in isolation, so selective pressures would have operated in opposition to the evolutionary process.

Let's now consider the challenge of waiting times -- the minimum time required for hypothesized evolutionary transformations, such as the development of the camera eye, to occur through undirected processes. Even if the selective pressures were favorable, the required timescales are far longer for sufficient numbers of coordinated mutations to accumulate than the maximum time available, as determined by the fossil record. Of special interest is the proposed cooption of crystallin proteins, which give the lens its refractive properties. Seemingly, one of the easiest evolutionary steps should be producing these proteins in the lens, for some of them are already used for other purposes. The main hurdle would simply be altering the regulatory regions of the first borrowed crystallin gene, so it binds to the correct set of transcription factors (TFs). The lens protein could then be produced in the fiber cells in sufficient quantities at the right time in development.

However, the cooption process is far more challenging than it might at first appear. It requires regions in the gene to bind to at least four new transcription factors. This alteration would involve numerous mutations creating the four corresponding DNA binding sites known as transcription factor binding sites (TFBS). As I mentioned in the previous article, the earliest lens should have closely resembled lenses of vertebrates today, so this lower estimate is almost certainly accurate.

A typical binding site involved in lens construction consists of a DNA sequence ranging from roughly 7 (e.g., SOX2) to 15 (e.g., Pax6) base pairs, so four TFBS would likely correspond to over 30 base pairs. One could think of these DNA sequences like the launch codes to a missile; they must be correct before the protein can be properly manufactured. The lower bound of 30 base pairs can be divided by a factor of 3 to compensate for sequence redundancies, flexibility in where in the DNA sequences start, and the fact that roughly one quarter of the bases would be correct purely by chance. This extremely conservative estimate indicates that over 10 mutations would be required to generate a proper sequence. All but the final mutation would be neutral.

We can now calculate the likelihood of sufficient mutations occurring in 10 million generations. The mutation rate for a specific base pair in complex animals is typically estimated to correspond to a probability of around 1 in 100 million per generation. The chance of a given mutation occurring in 10 million generations is then 1 in 10. Therefore, the chance of 10 coordinated mutations appearing on the same DNA strand works out to much less than 1 in 10 billion. No potential precursor to a vertebrate with a lens would have had an effective population large enough to acquire the needed mutations. For comparison, the effective population size estimate used for Drosophila melanogaster can be in the low millions. If the generation time were even as low as one year, a crystallin could not be coopted even in 10 million years, which is the time required for the appearance of most known phyla in the Cambrian explosion.
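As a sanity check on the arithmetic, the short Python snippet below simply multiplies out the figures stated above (a 1-in-100-million per-base mutation rate, 10 million generations, and 10 coordinated mutations). The variable names are mine, and this is only the article's own back-of-the-envelope estimate, not a population-genetics model.

```python
# Reproduce the article's stated figures; these inputs are the author's
# assumptions, not independently established values.

per_base_rate = 1e-8          # mutation probability per base pair per generation
generations = 1e7             # generations available (10 million)
required_mutations = 10       # coordinated mutations needed on one strand

p_single = per_base_rate * generations   # ~0.1 chance per base over that span
p_all = p_single ** required_mutations   # ~1e-10, about 1 in 10 billion

print(f"P(one base mutates) ~ {p_single:g}")
print(f"P(all {required_mutations} coordinated mutations) ~ {p_all:g}")
```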

Moreover, this step is only one of hundreds required to produce a lens. Researchers have identified numerous TFs essential to lens development in vertebrates, and each has its own set of TFBS, which integrate into a complex developmental regulatory gene network. If only one connection were wired incorrectly, the eye in the vast majority of cases would not form properly, resulting in impaired vision. In addition, the lens is only one component of the eye, which is only one part of the visual system. The obvious conclusion is that, in the timeframe allowed by the fossil record, the reengineering to produce the vertebrate visual system would require foresight and deliberate coordination. Those are the hallmarks of design.

Biologists have claimed to produce viable scenarios for the evolution of several other complex systems. What all these stories share is that they ignore crucial details and lack careful analysis of feasibility. When we examine these issues in detail, the stories collapse for the same reasons that the one about the eye does: First, the selective pressures oppose transitions between key proposed stages. Second, the required timescales are vastly longer than what is available.

For biologists, rigorously evaluating evolutionary narratives has become fully possible only in the past several decades due to advances in molecular and developmental biology. Meanwhile, with breakthroughs in computer engineering, information theory, and nanotechnology, parallels between biological and human engineered systems are increasingly evident. These developments are making the intelligent design framework essential for scientific advancement. They also create new opportunities for ID proponents and theistic evolutionists to collaborate.

Proponents of TE want to push materialistic explanations for biological systems as far as possible, as science demands. ID advocates would not disagree with them on that. No one wants to trigger the design filter prematurely. So theistic evolutionists should join us in considering what the modern evolutionary synthesis, with its auxiliary hypotheses such as niche construction and epigenetic inheritance, can explain. We should all continue to examine how insights from evolution may benefit cancer research, epidemiology, and other fields.

ID researchers, meanwhile, can examine the limits of purely materialistic processes, and we invite theistic evolutionists to do likewise. These combined efforts will help to define in greater detail what Michael Behe calls the edge of evolution. This understanding would also help advance research on cancer treatments, antibiotic protocols, and more. At the same time, ID proponents can help identify how principles and insights from engineering may advance biological research and related applications.

Many theistic evolutionists recognize that the appearance of design is real (but then, so does Richard Dawkins). This insight, at least, should inform their research. In contrast, anti-theistic evolutionists are biased against recognizing the benefits of design thinking. As a result, in studying life they have stumbled upon close parallels to human engineering but acknowledged them only begrudgingly. ID, on the other hand, expects these parallels and is unsurprised to find them. A classic example is how researchers, misled by evolutionary thinking, dismissed a large portion of the human genome as "junk" DNA instead of anticipating that it would function as a genomic operating system.

TE researchers do not need to immediately agree with ID researchers on whether any particular feature of life is the result of primary design or secondary causes. They can still work together to best serve the cause of genuine science, and I hope they will do so more in the future.

See the rest here:

Eye Evolution: The Waiting Is the Hardest Part - Discovery Institute

Posted in Evolution | Comments Off on Eye Evolution: The Waiting Is the Hardest Part – Discovery Institute

PSG Hammering Signals the End for Luis Enrique-Led Evolution at Barcelona – Bleacher Report

Posted: at 9:22 pm

CHRISTOPHE SIMON/Getty Images

Tim Collins, Featured Columnist, February 15, 2017

Thomas Meunier looked up, and all he saw was empty space. For 70 minutes, he and his Paris Saint-Germain team-mates had seen little else, so he put his head down and ran and ran and ran, all the way from right-back to the other penalty area where Edinson Cavani was waiting for the baton, poised to blitz the final leg.

Thrashing the ball into the net, Cavani set off, first toward the corner flag and then past team-mates, beyond his own bench and past opposition manager Luis Enrique, covering more distance with more speed than every Barcelona player on the night combined, to embrace those in the stands at the other end.

In the background, the scoreboard read 4-0. It may as well have read "The End."

Cavani's goal was the nadir in a nightmare for Barcelona, but it was also so much more. This was the goal and the brutal treatment the Catalans have been trending toward all season. Every warning that's been dealt, every concern that's been voiced, they'd all fixated on a moment and a night such as this: one that had felt as though it was coming, one when the consequences of drift would crystallise.

Even if the extent of Tuesday's hammering at Parc des Princes was surprising, the nature of the performance wasn't. For those who've watched Barcelona closely this season, this wasn't anything new. Instead, it was more of the same; only the strength of the opponent was different.

Watching PSG harass and trample the Catalans was essentially the maxed-out version of the type of contest we'd seen a handful of times before. Rewind to the clash with Celta Vigo in October and you'll see all the same themes; rewind to the games against Valencia, Manchester City, Sevilla, Real Sociedad and Real Betis and you'll see them, too.

On Wednesday morning, the cover of Catalonia-based Sport read, "This is not Barca." You knew what it meant. In a broader sense, this isn't them: the identity, the philosophy, the strength of the collective. But Sport's cover was also wrong: this is what this Barca have become.

It is Luis Enrique of course who has steered Barcelona down this path. The period of evolution led by the Asturian since 2014 has been both necessary and highly successful, reaping a treble in his first season and a domestic double last term. But evolution has now become regression, with the process having gone beyond the outer limit of its effectiveness and the team having moved too far along the spectrum. Tuesday signals the end of such a shift.

"It is difficult," the Barcelona boss said afterwards. "They were superior to us from the start. It was a disastrous night for us in which we were clearly inferior.There's not much more to say. PSG did what we expected them to do and produced their best version and we were at our poorest."

Nowhere was that more evident on Tuesday than in midfield. Once the cornerstone of Barcelona's dominance, the central third at Parc des Princes was the area of the game's greatest discrepancy. Marco Verratti was sublime for the hosts, the conductor behind the athletic enforcers in Adrien Rabiot and Blaise Matuidi.

That trio swamped Sergio Busquets and rendered an underdone Andres Iniesta irrelevant. Gone, then, was the control so characteristic of Barcelona: the command of possession, the metronomic quality of the ball movement, the domination of territory, the suffocation of the opponent.

The effects of that were felt everywhere. Angel Di Maria attacked the space between a besieged midfield and a backtracking defensive line, Julian Draxler tormented an exposed Sergi Roberto and the vaunted front three had no supply line.

"I was there for Barca's 5-0 win over Real Madrid and was left with a similar facial expression right after it as they have now," Di Maria told beIN Sports (h/t Marca). "Surely, Barcelona have been finished."

The collapse, though, of Barcelona's central foundation is more consequence than cause. The erosion of the club's midfield supremacy has been the casualty of the Luis Enrique-led evolution as the team's definition has changed, with the emphasis moving to the forwards.

In Lucho's first season, the club's march to a treble was due to the calibration of Lionel Messi, Luis Suarez and Neymar falling into place largely within the existing framework. There was a degree of compromise, as the manager's desire for explosiveness was balanced against the existing structure. But since then, that shift has continued unabated, taking Barcelona away from what they were and to where they are now. Luis Enrique will pay for that.

"[Johan]Cruyff built the cathedral. It is our job to maintain it," Pep Guardiola once said. The problem is not the cathedral; it's still there. The problem is that they've drifted too far from their own religion.

For that, Barcelona's players will have to take their share of responsibility. But they'll also get their chance to make amends. Luis Enrique likely won't.

The man who played at the Camp Nou for eight years around the turn of the century has a contract that expires in June, and he has been non-committal all season on his future. Tense and often prickly, the Barca boss has regularly exuded the feeling he's tired of the demands, tired of the scrutiny and political swirl. His position consumes even the greats, and you sense that strain has taken its toll.

At the beginning of the campaign, this writer suggested that this season would present stiffer challenges to the 46-year-old: "Just as testing will be the necessity to continue feeding his players' drive. Astutely, he'll need to keep pushing his stars, challenging them, appealing relentlessly to them as competitors for another year after already doing so for two."

It's this that's seemingly escaped him, and it's not unusual. The great Hungarian manager Bela Guttmann used to argue that the third season was the point at which methods grew stale, messages lost their punch and at which opponents worked out the riddle. "The third season," went his famous line, "is fatal." And so it looks to be proving.

Back in November, when Barcelona were ambushed and run over by Manchester City, Sport likened the Luis Enrique incarnation to the way Liam Gallagher once described Oasis: "Like a Ferrari: Great to look at. Great to drive. And it'll f--king spin out of control every now and again."

Under the club's current manager, years one and two were full of great driving. In year three, they've been gradually losing the back end before entering a high-speed spin on Tuesday. There's a pole not far in the distance.

Almost certainly heading out of the Champions League, Barcelona are three weeks from their earliest European exit in a decade. They're also one point back of Real Madrid in La Liga despite having played two more games.

"Desastre," said Mundo Deportivo on Wednesday. Sport added: "Shipwrecked without a manager." It's not quite true, but it likely soon will be.

Continue reading here:

PSG Hammering Signals the End for Luis Enrique-Led Evolution at Barcelona - Bleacher Report

Posted in Evolution | Comments Off on PSG Hammering Signals the End for Luis Enrique-Led Evolution at Barcelona – Bleacher Report