In the WeChat, TikTok US shutdown order, TikTok gets Nov. 12 stay, keeping it up through the US election and Oracle dealmaking – TechCrunch

The U.S. Commerce Department has now announced the details of how it will enforce the shutdown of TikTok and WeChat in the country, after announcing in August plans to do so by September 20 over national security concerns. The enforcement is structured around two dates: September 20 and November 12. Neither app, nor updates to either, will be distributed in U.S. app stores as of September 20. But TikTok specifically gets an extension on how it operates until November 12.

That not only keeps it up until after the November 3 U.S. election, but leaves the door open for it to complete a complicated deal with Oracle and partners to take control of its U.S. operations without an interruption in service.

That timing, plus the statement from Commerce Secretary Wilbur Ross, underscores the strong political current running through the news.

"Today's actions prove once again that President Trump will do everything in his power to guarantee our national security and protect Americans from the threats of the Chinese Communist Party," said Ross in a statement. "At the President's direction, we have taken significant action to combat China's malicious collection of American citizens' personal data, while promoting our national values, democratic rules-based norms, and aggressive enforcement of U.S. laws and regulations."

The first part of the action, starting September 20, has to do with halting all new app distribution for both WeChat (owned by Tencent) and TikTok (owned by ByteDance). In other words, no new downloads of either app as of September 20, and no new updates. It also forbids "any provision of services through the WeChat mobile application for the purpose of transferring funds or processing payments within the U.S.", that is, all payments.

WeChat's owner Tencent has issued a response to the announcement:

"We are reviewing the latest announcement from the Department of Commerce restricting the use of WeChat by U.S. users," said a spokesperson. "WeChat was designed to serve international users outside of mainland China and has always incorporated the highest standards of user privacy and data security. Following the initial executive order on August 6, we have engaged in extensive discussions with the U.S. government, and have put forward a comprehensive proposal to address its concerns. The restrictions announced today are unfortunate, but given our desire to provide ongoing services to our users in the U.S., for whom WeChat is an important communication tool, we will continue to discuss with the government and other stakeholders in the U.S. ways to achieve a long-term solution."

It seems that the different dates might mean that those who already have TikTok installed by September 20 should still be able to continue using it, even if you're an iPhone user updating to iOS 14.

We write "might" because the government hasn't provided technical detail around how it plans to implement its rules.

From that date, WeChat also will not, seemingly, work at all, with the U.S. forbidding "any provision of internet hosting services enabling the functioning or optimization" of the app, "any provision of content delivery, internet transit or peering services," and "any provision of constituent code, functions or services" of the app.

Notably, TikTok will not face the same operational roadblock on September 20.

TikTok has until November 12 before those rules come into play. That is to say, if you have TikTok downloaded by September 20, you will still be able to use it.

The date is important for a couple of reasons. First, it leaves the app up and running in the run-up to the U.S. presidential election. Many have said that Trump shutting down the popular app, which has some 100 million users in the U.S. and a much wider following in pop culture (TikToks are shared on national television and across a lot of other social channels), could hurt him with younger voters. Whether or not that really would have been the case, this seems to have knocked that problem out of his re-election calculus.

Second, it leaves the door open for Oracle, Walmart and the rest of the consortium negotiating to take over the operation of TikTok to seal their deal without any interruption in service. The app's roughly 100 million users in the country are similar in number to its user base in Europe.

The story around the deal has been changing by the day, shifting from an outright acquisition, to one where Oracle might control the data in the app but not the source code, to licensing the source code too, to getting China's approval as well as that of the U.S., and other permutations. The most recent developments have included the idea of a public listing and even the possibility of Instagram co-founder Kevin Systrom taking over as the new CEO.

Ironic, then, that one of the more outspoken tech leaders around this latest development has been Adam Mosseri, the current head of Instagram, who has been tweeting his thoughts about the wider implications for other big tech companies.

(We've been in a tit-for-tat war around apps and the freedom to operate them across national boundaries for some time already, and many countries with national firewalls have long decided that there is absolutely nothing wrong with prohibiting some apps from other countries if they feel those apps compromise their national security.)

The U.S. Department of Commerce decisions are in line with an executive order signed by President Trump on August 6, which put ByteDance and Tencent, the respective owners of TikTok and WeChat, on notice of the government's intention to block access to their products over purported concerns about national security.

That executive order precipitated the last few weeks of feverish dealmaking to avoid a shutdown of TikTok, and those discussions remain ongoing and unfinalized. As of today, Oracle and, it appears, Walmart are still negotiating with the White House, Treasury Department and ByteDance to come to a deal that will be acceptable to the president. China also has the authority to approve a sale of TikTok.

Over the last few weeks, the administration has promoted a policy known as "Clean Network," designed to eliminate foreign interference in the applications and cloud infrastructure that power American technology.

That policy calls for the removal of certain apps, data sovereignty measures to onshore American user data to the United States, mobile network infrastructure built from "clean" equipment, and a host of other measures to create a "clean" computing environment for U.S. citizens. While those policies are generally written broadly, their clear target has been China, judging from speeches by administration officials.

TikTok and WeChat are not the only app removals announced overnight. In India, one of the most popular payment apps in the country, Paytm, has been removed from Google's Play Store for repeat policy violations. The app has tens of millions of monthly users. In late June, the country also announced a list of 59 apps developed by Chinese companies that would be banned, including TikTok.

Such national fights over the future of technology have increasingly come to a head as tech drives a larger segment of the global economy and increasingly becomes intertwined with competing national interests.


Chinese firm amasses trove of open-source data on influential Canadians – The Globe and Mail

The office of Shenzhen Zhenhua Data Information Technology Co. Ltd., a small Chinese technology company that is building tools to process the world's open-source information about influential people.

Nathan VanderKlippe/The Globe and Mail

On the 14th floor of a tower filled with cramped workspaces in Shenzhen, China's high-tech hub, a small military contractor is building tools to track politicians, aerospace entrepreneurs, scholars and other influential people around the world, including thousands in Canada.

The office of Shenzhen Zhenhua Data Information Technology Co. Ltd. is modest. When a Globe and Mail reporter visited, three people sat at desks in what appeared to be a converted studio apartment, with the bathroom door open. One person was entering code into a computer, while another worked on a PowerPoint presentation. The third person said the company has a work force of more than 30 employees. And although she said Zhenhua is expanding, that's small even by startup standards in China.

Its ambitions, though, extend far beyond its small real estate footprint. It is building tools to process the world's open-source information about influential people, culled from Twitter, criminal records, LinkedIn posts, YouTube videos and more, into data that can be analyzed and used by universities, companies, government actors and the Chinese military.


"Our client base is a bit special," the woman said.

The Globe and a consortium of international journalists have accessed an early copy of the companys Overseas Key Information Database (OKIDB), which shows the type of information Zhenhua is collecting for use in China, including records on small-town mayors in Western Canada, where Chinese diplomats have sought to curry favour.

The company claims to have built tools to manipulate content on Twitter, WhatsApp and other platforms, including Facebook, which says it has banned Zhenhua from gathering data on its platform.

The company declined an interview request, saying it was "not convenient" to disclose trade secrets. Its website became inaccessible after The Globe visited its office, which is located in a government-backed business incubator building across the street from an investigative centre for the local Public Security Bureau, all a short drive from the headquarters of some of China's most important technology companies and civilian military contractors, including Tencent and China Electronics Corp.

The company, led by a former IBM data centre management expert, has also described its work online in job postings, LinkedIn records, blog articles and software patents. One employee described work "mining the business needs of military customers" for overseas data. Before it became inaccessible, Zhenhua's website listed a series of partners that included important military contractors.

It claims to have collected information on more than 2.4 million people and 650,000 organizations from about two billion social-media articles.

Together the documents reveal a Chinese firm with a keen interest in advanced forms of warfare, the structure of the U.S. intelligence apparatus and the use of social media to achieve military victories.


The company has secured a software patent for a "social media account simulation system," a title that connotes a tool for managing networks of fake social-media usernames in ways that emulate human characteristics, making them more effective at spreading messages.

Zhenhua's name translates to "China Revival," a reference to a mantra of President Xi Jinping, who has proclaimed the "great rejuvenation" of the Chinese nation.

"It seems to be collecting information about people who are around things that China would be interested in. The question is whether this is a database of potential targets that could be used by the intelligence services of China to get what they want," said Stephanie Carvin, a former national security analyst who is now an associate professor of international relations at Carleton University.

Prof. Carvin looked at the database on behalf of The Globe and said it wasn't clear whether it was being used by Chinese intelligence or had simply been created by a company hoping to sell it to Chinese intelligence.

But she found it curious that it contained records on people such as Ella-Grace Trudeau, the 11-year-old daughter of Prime Minister Justin Trudeau, and Jeremy Fry, the adult son of long-time MP Hedy Fry. That, Prof. Carvin said, suggested an attempt to learn more not just about the people in power in Canada but about those around them.

"Why have these people in some kind of database? That, to me, is the question that national security agencies in the West have to figure out. That's the thing I worry about," Prof. Carvin said. "Is this an attempt to create a database of targetable individuals? And what are they trying to do with that?"


A version of the Zhenhua OKIDB database analyzed by The Globe contained almost 16,000 entries mentioning Canada.

Its files seem to have been cobbled together from various sources. Some catalogue news stories, including hundreds of Globe articles, while others are archived Facebook posts from U.S. President Donald Trump about trade tariffs. A large portion of the data appears to have been extracted from the business information website Crunchbase and serves as a Rolodex of social-media accounts and contact information for people in all sorts of occupations, from tech executives to university professors. Roughly 70 per cent of the people captured in the data are men.

The database appears to have a special focus on mayors of Western Canadian towns, as well as academics and bureaucrats who focus on international relations.

However, the effort is broader than it is deep.

Jeremy Kirk, an information security analyst who said he gained independent access to the database earlier this year, said he didn't see any sign that it was a tool of an intelligence service.

"This is data that anyone could find through a Google search. So far, none of the data has been linked to a non-public data source. As it stands, it doesn't represent a threat to any country," said Mr. Kirk, who is the executive editor of the Australia-based Information Security Media Group. "But people should be mindful of what they post publicly on the internet, as it could be collected by other countries for commercial gain or intelligence purposes."


The vast majority of the files contain little more than what can be found about the individuals on social-media websites such as Twitter, Facebook and LinkedIn. If the person of interest has a police record, links are included to newspaper stories about their cases.

The mass scraping of data contravenes Facebook's policies, spokeswoman Liz Bourgeois said. "We have banned Shenzhen Zhenhua Data Technology from our platform and sent a cease and desist letter ordering them to stop," she said. LinkedIn does not permit the use of any software that scrapes or copies information from its site, spokesperson Billy Huang said. "If any violation of our user agreement is uncovered or reported, we investigate and take necessary steps to protect our members' information."

The database also contains a shorter list of 3,767 Canadians who have been assigned a grade of 1, 2 or 3. Those assigned a 1 appear to be people of direct influence, such as mayors, MPs or senior civil servants, while those assigned a 2 are often relatives of people in power, such as Mr. Trudeau's daughter and Ms. Fry's son. Those assigned a grade of 3 often have criminal convictions, mostly for economic crimes.

Dozens of current and former MPs dot the list, including new Conservative Party Leader Erin O'Toole, whose file includes a link to the web page of his official parliamentary profile and, like most, a seven-digit ID number.

Others with files assigned a grade of 1 include senior bureaucrats at the Canadian Nuclear Safety Commission, the Canadian Food Inspection Agency, the Treasury Board, the Transportation Safety Board, Export Development Canada, and even the Office of the Privacy Commissioner.

The justice system appears to be another focus of the database, which contains entries on judges up to and including current and former members of the Supreme Court of Canada.


Notable individuals assigned a grade of 3 include former theatre impresario Garth Drabinsky, who was convicted of fraud in 2009, former SNC-Lavalin executive Riadh Ben Aissa, who pleaded guilty to corruption charges in Switzerland before testifying against his superiors in Canada, and Nicola Iammarrone, a former Canada Revenue Agency auditor who pleaded guilty to taking bribes.

It is unclear how often the database is updated, as several names on the list appear to correspond with those of prominent Canadians who have died, in some cases many years ago.

Entries about Canadian criminals feature prominently. The database lists 198 people it says are associated with narcotics, 178 with conspiracy, 162 with fraud and 100 with money laundering. A handful of people are mentioned multiple times, including Gilles Vaillancourt, the former Laval, Que., mayor jailed in 2016 on fraud charges; Amin Mohamed Durrani, jailed after being arrested in the 2006 Toronto anti-terrorism sweep; and Michael Witen, an accountant who was found guilty of defrauding the federal government.

"Zhenhua appears to be a company hoovering up open-source intelligence, and one of the things where there is a lot of open-source intelligence is around criminal records and court records," said Garrett Graff, co-author of "Dawn of the Code War: America's Battle Against Russia, China, and the Rising Global Cyber Threat."

According to database timestamps, all of the Canadian entries analyzed by The Globe were collected in mid-to-late 2018.

Data collection on this scale is not without precedent. The internet made collecting massive amounts of general-purpose data infinitely easier, giving rise to data brokers. Today these companies sell datasets ranging from credit-card purchasing histories to cellphone geolocation data, and their clients rely on the information for everything from tailoring advertising campaigns to calculating credit scores.


But other evidence points to Chinese players attempting to take in large amounts of data. A series of data breaches between 2013 and 2018 attributed to Chinese hackers stole personal information from Marriott hotels, the United States Office of Personnel Management and health insurer Anthem. China routinely denies involvement in hacking.

"Each individual collection of data may be of limited value. But when you begin to layer these databases on top of one another, it provides an arguably unparallelled window into human targeting (backgrounds, personal motivations, personal weaknesses) and provides a roadmap for influencing people," Mr. Graff said.

Zhenhua's data is structured in a way similar to that of Factiva, a research tool from Dow Jones that also catalogues influential people around the world. In fact, the woman at the Zhenhua office likened the company's products to those of Dow Jones and Wind Information, a Chinese provider.

"Some foreign software companies are able to obtain content such as videos, text and music from social media posts. What we can do is to get them all at once," she said. The company describes OKIDB as tracking "people, institutions, connections and relationships." The people include "global leaders and core figures in the fields of military, politics, business, science and technology, media, civil organizations and the like."

Zhenhua's clients are in government, the military, universities and academic institutes, the woman at the company said, adding that they can use the company's technology to conduct "a more detailed analysis of a certain professor." She said the company is not merely a technology provider, as its employees actively work with customers and are based in cities across China, including Nanjing and Wuhan.

Western intelligence services now estimate that China has collected personally identifiable information on 80 per cent of the U.S. population, said Nicholas Eftimiades, a former senior U.S. intelligence officer and China expert who recently published "Chinese Espionage: Operations and Tactics." For a company like Zhenhua, applying artificial intelligence tools to a trove of social-media data can help its customers attain their goals locally, regionally, nationally or commercially, he said.

"You're talking about the ability to influence academics, political leaders ranging from mayors up through senior leaders in a government. It's about influencing them to serve the Chinese Communist Party's desires, their goals."

Online, Zhenhua stresses its military connections. In LinkedIn posts, a senior R&D engineer describes working on a "social media cultivation system" and a "military deployment simulation demonstration system," while a product sales manager discusses "mining military customers' business needs" for overseas data. A job posting seeks a candidate who can manage sales and focus management systems "at the direction of the Party, government, and military."

Zhenhua also lists a series of corporate partners with ties to the security establishment. Wenge Group uses big data and artificial intelligence to aid "smart law enforcement." LSSEC Tech provides encryption tools and IT equipment to national security and military customers and has trained its employees to keep secrets on weaponry research. GTCOM sifts social media to spot the development of "heated public opinion," equipping authorities to minimize the probability of "group incidents." TRS lists the police and the Communist Party as customers for software services that include "online relationship mining," a "public opinion management system" and a "crystal ball" intelligence analysis platform. CHRTC provides urban governance products to the country's security apparatus.

Zhenhua itself has been granted 10 software patents, Chinese records show, for systems that include searching global think tanks, monitoring personnel appointments and removals around the world, gathering real-time telecommunications content and "social-media account simulation." The latter appears to describe a technology to teach a computerized system to better mimic humans on social media.

"The ultimate intent of this sort of thing is to get attention, to stimulate phony online traffic," said Wu Fei, director of the AI Research Centre at Zhejiang University. "When people see content that has already received thousands of likes or comments, the majority of which may be created and stimulated by the system, they would immediately be interested."

Zhenhua said on its website that its system manages multiple social-media accounts belonging to "virtual humans," or bots. "When an assigned task is received, the user can select all social media accounts or part of them to execute the assigned instructions."

Such tools can be used by companies to promote products. But Zhenhua has explained how they could also hold military value.

The company has published extensively on Number 99 Institute, a blog account on the WeChat messaging app. Its articles reveal an interest in the structure and hierarchy of U.S. intelligence agencies, as well as in future forms of conflict. One, which was also posted to the Zhenhua website, describes social media as important tools for "hybrid warfare," explaining how the manipulation of public opinion through social media can be a cost-effective and powerful way to prevail in battle. Social media "can manipulate reality and weaken a country's administrative, social, military or economic power," the company wrote, and can also lead to "internal conflicts, social polarization and radicalization" in a country.



ATO declines to fix code replay flaw within myGovID – ZDNet

The default login option for agents used by the Australian Taxation Office (ATO) is vulnerable to a code replay attack, security researchers Ben Frengley and Vanessa Teague said.

Writing in a blog post, the pair described how an attacker could use a malicious login form to capture user details, which the attacker could then use to log into other accounts held by the myGovID user.

The nub of the attack is that when a myGovID user attempts to log into a site, they are asked to input a four-digit code into the myGovID smartphone app to verify the login -- no passwords are used, and the only identifying piece of information is an email address.

If the attacker can capture an email address, they can use it to start a login at another myGovID service and relay the generated code to the user, who enters it into the myGovID app. Once the code is entered, the user will believe they are logged into a legitimate site, while the attacker is simultaneously logged into their account elsewhere.

The user is not alerted to the other login taking place.
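
To make the flow concrete, here is a minimal sketch of the replay as the researchers describe it, written in Python with hypothetical function and service names; it simulates the message flow only and does not reflect the actual myGovID API.

```python
import random

# A minimal simulation of the code replay attack described above.
# All names here are hypothetical; this models the message flow only.

def mygovid_begin_login(email: str) -> str:
    """Stand-in for the genuine identity service: generates the
    4-digit code the user is asked to type into the myGovID app."""
    return f"{random.randint(0, 9999):04d}"

def phishing_site_capture_email() -> str:
    """Step 1: the attacker's fake login form captures the victim's
    email address -- the only identifying detail the protocol asks for."""
    return "victim@example.com"

def victim_enters_code_into_app(code: str) -> bool:
    """Step 3: the victim sees the relayed code on the fake page and,
    believing it belongs to their own login, approves it in the app."""
    print(f"Victim approves code {code} in the myGovID app")
    return True

# Step 2: the attacker starts a *real* login session at some other
# myGovID-enabled service using the captured email.
email = phishing_site_capture_email()
genuine_code = mygovid_begin_login(email)

# The attacker relays the genuine code back through the phishing page.
# Nothing binds the code to the site the victim believes they are using.
if victim_enters_code_into_app(genuine_code):
    print("Attacker's session is now authenticated as the victim.")
```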

"This attack is detectable by a diligent user who understands the protocol well enough to know that they should only accept 4-digit codes from mygovid.gov.au (and knows how to check for TLS)," the pair wrote.

"However we believe that there are very few users in this category, because it is a counter-intuitive protocol designed to reverse the information flow relative to what users are accustomed to."

The researchers' suggested short-term mitigation is to inform users about which site is requesting a login; for the long term, the pair recommended ditching the framework altogether.

"In the long run, the [Trusted Digital Identity Framework] and all its current implementations should be deprecated and replaced with an open standard such as OpenID Connect or a protocol modelled on that of a nation with an existing secure public key infrastructure such as Belgium or Estonia," they wrote.

"The implementation and design documentation should be openly available to the Australian public to allow for the identification and responsible disclosure of other vulnerabilities.

"We have no reason to believe that this is the only, or the worst, vulnerability in this system. Its complex nature and the desire to hide information makes enforcing and validating correct, secure behaviour close to impossible."

For users, the pair recommended not using myGovID unless unavoidable, and in that case, ensuring they only accept codes from the mygovid.gov.au site.

"This unlikely to work in practice for most users, who will struggle to recognise a secure website with the right URL," they said.

The pair said they informed the Australian Signals Directorate of the issue on August 19, and were told on Friday by the ATO that "they did not intend to change the protocol, at which point we immediately informed them that we would make a warning to users public".

In October, the Digital Transformation Agency said almost 7,000 Australians had created a myGovID.

Also on Monday morning, the ATO announced it has signed a three-year, AU$11.4 million deal with Vocus for managed network services.

"The contract will see Vocus provide up to 230 services across 80 ATO sites, on its fully separated secure network," Vocus said.

"The types of services include IP WAN, internet and data centre connectivity for all existing and future ATO sites."

The contract has three potential two-year extensions.


NVIDIA GeForce Now quietly starts working on Linux as the Avengers come to play – Android Central

If you use or have been following NVIDIA GeForce Now, the cloud gaming platform that delivers PC titles you already own from sources such as Steam and Epic Games to a multitude of devices, the latest development seems to have emerged silently. As spotted by the team at GamingOnLinux, Linux users can now, it seems, access GeForce Now in either Chromium or Google Chrome.

Previously, this tactic involved fudging user agents to make GeForce Now believe you were on a Chromebook, following the launch of the web client for Google's laptops. And it works just fine: I logged in and played some games with no issues on Ubuntu in both browsers. And just to double-check, Firefox still shows an incompatible-device error.
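
For the curious, the old workaround amounted to presenting a Chrome OS-style user-agent string, typically via a browser extension. A minimal illustration of the header mechanics in Python is below; the UA string is a representative Chrome OS example, not a value documented by NVIDIA.

```python
import requests

# A Chrome OS (CrOS) user-agent string of the kind user-agent switcher
# extensions spoofed so GeForce Now treated a Linux desktop as a Chromebook.
CHROMEBOOK_UA = (
    "Mozilla/5.0 (X11; CrOS x86_64 13020.82.0) "
    "AppleWebKit/537.36 (KHTML, like Gecko) "
    "Chrome/84.0.4147.110 Safari/537.36"
)

# Any HTTP client can send the spoofed header; the browser extensions
# simply did this transparently for every request.
response = requests.get(
    "https://play.geforcenow.com",
    headers={"User-Agent": CHROMEBOOK_UA},
)
print(response.status_code)
```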

So, if you've been waiting on Linux support for GeForce Now, definitely go check it out. It also means the platform is usable now on all major desktop operating systems. Mobile access is currently limited to Android.

Along with this bit of good news, NVIDIA also added its regular weekly slew of new games to the GeForce Now library, and while it's a smaller week in numbers, the headline act is a big hitter. Earth's Mightiest Heroes arrive as Marvel's Avengers joins the service, alongside Stick It to the Man, ULTRAKILL and Zero Escape: The Nonary Games.

Other interesting GeForce Now news includes word that Fortnite with RTX enhancements is coming to GeForce Now for Founders members in the weeks after it launches on PC. Buyers of a new NVIDIA RTX 30 Series graphics card can also get a year's access to GeForce Now bundled in, along with a free copy of Ubisoft's forthcoming title, Watch Dogs: Legion. Not too shabby at all.


Microsoft: Windows 10 is hardened with these fuzzing security tools now they’re open source – ZDNet

Microsoft has released a new open-source security tool called Project OneFuzz, a testing framework for Azure that brings together multiple software security testing tools to automate the process of detecting crashes and bugs that could be security issues.

Google's open-source fuzzing bots have helped it detect thousands of bugs in its own software and other open-source software projects. Now Microsoft is releasing its answer to the same challenge for software developers.

Project OneFuzz is available on GitHub under an open-source MIT license like Microsoft's other open-source projects, such as Visual Studio Code, .NET Core and the TypeScript programming language for JavaScript.


Microsoft describes Project OneFuzz as an "extensible fuzz testing framework for Azure".

Fuzzing essentially involves throwing randomly generated, malformed inputs at software until it crashes, potentially revealing security issues but also performance problems.
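
In its most naive form, that loop is only a few lines. Here is a minimal sketch in Python against a hypothetical parser, purely to illustrate the idea; real fuzzers add coverage feedback and input mutation on top of it.

```python
import os

def parse_record(data: bytes) -> None:
    """Hypothetical target: a parser with a lurking bug."""
    if len(data) > 2 and data[0] == 0xFF:
        raise ValueError("malformed header")  # stand-in for a crash

crashes = []
for _ in range(100_000):
    sample = os.urandom(16)          # throw random inputs at the target
    try:
        parse_record(sample)
    except Exception:
        crashes.append(sample)       # a crashing input worth triaging

print(f"{len(crashes)} crashing inputs found")
```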

Google has been a major proponent of the technique, pushing coders and security researchers towards fuzzing utilities and techniques. Its open-source fuzzers include OSS-Fuzz and ClusterFuzz.

OSS-Fuzz is available for developers to download from GitHub and use on their own code. It's also available as a cloud service for select open-source projects.

Microsoft previously announced that it would replace its existing software testing toolset known as Microsoft Security and Risk Detection with the automated, open-source fuzzing tool.

The Redmond company also says it's solving a different and expensive challenge for all businesses that employ software developers, and gives credit to Google for pioneering the technology.

OneFuzz is the same testing framework Microsoft uses to probe Edge, Windows and other products at the company. It's already helped Microsoft harden Windows 10, according to Microsoft.

"Fuzz testing is a highly effective method for increasing the security and reliability of native code it is the gold standard for finding and removing costly, exploitable security flaws," said Microsoft Security's Justin Campbell, a principal security software engineering lead, and Mike Walker, a senior director, special projects management.

"Traditionally, fuzz testing has been a double-edged sword for developers: mandated by the software-development lifecycle, highly effective in finding actionable flaws, yet very complicated to harness, execute, and extract information from.

"That complexity required dedicated security engineering teams to build and operate fuzz-testing capabilities making it very useful but expensive. Enabling developers to perform fuzz testing shifts the discovery of vulnerabilities to earlier in the development lifecycle and simultaneously frees security engineering teams to pursue proactive work."

As Microsoft notes, "recent advancements in the compiler world, open-sourced in LLVM and pioneered by Google, have transformed the security engineering tasks involved in fuzz testing native code".


These advances make it cheaper for developers to bake into continuous build systems processes that once required separately attached tools, according to Microsoft. This includes crash detection, which was previously attached via tools such as Electric Fence; now it can be baked in with ASan (AddressSanitizer).

The same goes for coverage tracking: previously attached tools such as iDNA, DynamoRIO, and Pin are now replaced by the built-in sancov (sanitizer coverage).

"Input harnessing, once accomplished via custom I/O harnesses, can be baked in with libfuzzer's LLVMFuzzerTestOneInput function prototype," Campbell and Walker note.

Microsoft has also been adding experimental support for these features to Visual Studio, so that test binaries can be built directly by the compiler, allowing developers to avoid the need to build them into a continuous integration (CI) or continuous delivery (CD) pipeline. It also helps developers scale fuzzing workloads in the cloud.


The Local Audience Is the Central Audience: As Tourism Tanks Across the US, Museums Pivot to the Visitors in Their Own Backyards – artnet News

Before the shutdown hit New York City museums in March, 70 percent of the Metropolitan Museum of Art's visitors traveled to it from other cities and countries. Now, since the museum's August 29 reopening, that number has dwindled to just 20 percent, leading the Met, and institutions across the globe, to grapple for the first time with what it means to run a museum without tourists.

As social distancing shrinks visitor capacities, ticket sales are down and museums are feeling the financial hit. They are also having to retool programming for their newly local audiences.

Before the shutdown, the Met welcomed anywhere from 15,000 to 25,000 visitors a day (the latter at the peak of summer). Now, capacity is 14,000, and the Met is currently using its timed ticketing system to generate audiences of about 5,000 visitors per day.

With loans and traveling shows curtailed, the museum will be relying heavily on its own collection to create exhibitions, Met director Max Hollein said. Overall, programming will be sharply reduced for the next 17 to 24 months, prompting the museum to explore new juxtapositions and new ways of contextualizing objects in its collection, said Hollein.

The museum's current 150th anniversary exhibition, "Making the Met, 1870-2020," was already in the works long before the current crisis, but it's nonetheless a good example of the kind of show that can be created by drawing on the museum's own resources.

A view of the Metropolitan Museum of Art in April. Photo by Rob Kim/Getty Images.

"We also became more local in the lockdown," Hollein said. "We're focusing on what we have to energize and galvanize us in that direction." To that end, the museum is adding a modern spin to its period rooms display that represents a local New York story. Details about the new installation will be revealed in coming months, but, for now: "It's time we start to imagine a period room not of the past but of today that can reflect our current time and challenge contemporary issues," Hollein said.

Hollein expects that it could be two to three years before tourism approaches its pre-pandemic levels. "The local audience is really the central audience," he said. "It's an audience that has grown up with the institution and comes to you again and again. They have a much closer connection because they enjoy and notice constant changes within the institution. Their level of expectation is higher than, for example, that of a tourist who comes once every few years."

Two friends in face masks sit in front of Claude Monet's paintings at the Metropolitan Museum of Art during its first day open to members since March, on August 27, 2020. Photo by Taylor Hill/Getty Images.

It appears the Met has plenty of company on this path. In June, Vastari, a digital platform that helps arts and cultural institutions source individual objects or entire exhibitions from peer institutions and private collectors, conducted a multiple-choice survey with its members about programming strategy through the end of 2021.

The majority of the 50 respondents were art museums, and 26 of them were based in the US. From this cohort, 52 percent replied that they would focus on organizing shows with their own permanent collections. Another 38 percent said budget restrictions would compel them to book only small exhibitions. Among the museums that answered "other," the most frequent write-in explanation was an intent to focus on their local audiences.

Vastari CEO Bernadine Bröcker Wieder told Artnet News that activity on the platform in the months since the survey has reinforced its findings. Although a few major institutions are planning to move ahead with blockbuster shows under the belief that they can still tempt sufficiently large, yet still socially distant, crowds from a newly local pool of potential visitors, others are pivoting to more modular programming: smaller, nimbler shows devised to directly engage the communities in their immediate surroundings.

"Rather than hire whole touring exhibitions on Vastari, many institutions are now interested in paying for just a few objects from another museum that they can build a marketing campaign around," says Bröcker Wieder. "They still get the draw of a big touring show without the cost of a big touring show."

Vastari is even in the midst of changing its policies and capabilities in response. Prior to the shutdown, the platform only allowed member institutions to offer complete exhibitions to other member institutions, while private collectors could offer individual objects for show. But it is now working to give museums the same flexibility to hire out as little as a single work to their peers, a practical acknowledgment of the new normal's shifting priorities.

"It's not tourism versus local; in many cases it's audience versus curatorial interest," said Adrian Ellis, founder of AEA Consulting, which tracks spending and strategy in the cultural industries. The financial pressure of the past few months, he said, will lead to a more considered approach to programming generally, driven more on balance by considerations of audience than curatorial agendas.

Striking that balance is not exactly a new phenomenon, but Ellis predicts it will likely tip "toward market and toward cost consciousness." As a result, small exhibitions that draw more heavily on permanent collections and less on borrowed objects may become more frequent. "There will also of course be exhibitions exploring the current moment, and our national preoccupations with race and social justice will clearly affect thinking about exhibitions. There will be continuing efforts to broaden audience demographics," he says.

Visitors walk through an exhibition at the newly reopened Whitney Museum. Photo by Angela Weiss/AFP via Getty Images.

Of course, the situation is different for each institution. The Whitney Museum, like the Met, says the overwhelming majority of its visitors are New Yorkers. The museum reopened with a month of pay-what-you-wish admission to welcome back New Yorkers while engaging and supporting the local community. Notably, it is one of the few museums that has not yet had a major disruption in programming. It was able to keep exhibitions on view that were installed before the shutdown, including "Vida Americana: Mexican Muralists Remake American Art, 1925-1945," "Cauleen Smith: Mutualities," and "Agnes Pelton: Desert Transcendentalist," as well as its two collection installations, which were on view prior to the museum's closure.

"We kept these shows installed for the duration of the closure so that they would be on view to welcome visitors back following the reopening," said a museum representative, adding that previously planned exhibitions remain on the schedule as well. "Our programming has not been impacted by the shift in visitor composition."

Jackson Pollock, "Mural" (1943). University of Iowa Stanley Museum of Art, Gift of Peggy Guggenheim, 1959.6. © 2020 The Pollock-Krasner Foundation/Artists Rights Society (ARS), New York.

Another emergent trend in the museum sector's new normal is geographic disparity. The nation's disjointed public-health response to the coronavirus has left the most popular institutions on the East Coast in vastly friendlier scenarios than their counterparts on the West Coast.

When the Guggenheim Museum reopens on October 3 with new exhibitions of Jackson Pollock's famed "Mural" (1943) and related sculptures, all of New York's flagship arts institutions will once again be operational. Caveats apply, of course. According to a Guggenheim spokesperson, the museum will only accommodate a quarter of its normal capacity, with timed tickets and observance of other safety measures required. The Guggenheim also anticipates a 25 percent year-over-year decline in overall attendance in 2020, meaning the institution would host only about 900,000 total visitors by year's end, versus roughly 1.2 million in 2019.

The composition of its audience will also change substantially. Annual visitorship at the museum typically breaks down as roughly half international and half domestic, with nearly 40 percent of US visitors (meaning about 20 percent of all visitors) being New Yorkers. Although the museum does not have a projection for how much more of its overall attendance locals will comprise, it is safe to say that this year will deviate from the usual demographic trend. The Guggenheim is adjusting its communications strategy accordingly by featuring content aimed at New York audiences on its social-media and digital platforms, the spokesperson said.

This approach harmonizes with the one already in practice at the Art Institute of Chicago. Since reopening on July 30, three-quarters of visitors to the encyclopedic museum have been local residents, according to a representative. Normally, this constituency makes up only about 50 percent of attendees. The Art Institute has responded by eliminating advertising targeted at tourists for the time being, as well as shifting the emphasis further toward the Chicago element of the campaigns for its current exhibition, "Monet and Chicago." The show's layout was also modified during the shutdown to optimize traffic flow through the galleries for a socially distanced world.

While the wildly popular art museums above undoubtedly wish their situations were better, their peer institutions in the West would likely love to have the same problems. The Getty Center and the Getty Villa will not welcome visitors until January 2021, according to a spokesperson. A representative for the Los Angeles County Museum of Art, whose website simply designates the campus as "temporarily closed" amid the pandemic and its controversial architectural overhaul, declined to comment for this story. Multiple email inquiries to the Museum of Contemporary Art Los Angeles went unanswered; also "temporarily closed," the institution is scheduled to debut two major new exhibitions in October, but as of publication time, opening dates had yet to be announced.

The atrium of the new Kinder Building at the Museum of Fine Arts, Houston. Photo by Peter Molick, courtesy of MFAH.

To the extent that any institution can operate as it did pre-shutdown, the Museum of Fine Arts, Houston has been remarkably consistent. It reopened to the public on May 23, a full three months ahead of most New York institutions. The date set by museum leaders followed three weeks after the state's governor cleared museums to reopen.

Social distancing protocols are in place, including mask requirements and timed ticket entries, and the museum is currently operating below 25 percent capacity, or 900 visitors across the 14-acre main campus. A representative said the museum does not expect programming to be impacted, as "90 percent of our audience has been and continues to be from the greater Houston area."

One pandemic-related delay is the opening of its new Nancy and Rich Kinder Building, now set for November 21, about three weeks behind schedule. The Kinder Building will present works from the museum's international collections of modern and contemporary art, opening with the first comprehensive installation drawn from its collections of Latin American and Latino art.

Still, the Houston museum's experience remains an outlier at this stage. As conditions on the ground continue to evolve in the months ahead, the only certainty is that arts institutions will need to keep their creative problem-solving skills sharp. The crisis and its effect on the museum-going public have already forced several of the US's best-attended public collections to reappraise everything from programming, to communications, to operations. And as Bröcker Wieder of Vastari sees it, in some cases this soul-searching was overdue.

"I grew up in a country with a lot of hurricanes," she says. "After a hurricane, everything is chopped down. Maybe this is time for a spring cleaning in museums."


Girls on the Run of Snohomish County launches virtual fall season – My Edmonds News

Girls on the Run virtual programming will include physical activity and social-emotional learning. (Photo courtesy Girls on the Run)

Girls on the Run of Snohomish County (GOTRSnoCo) has announced the launch of a special virtual fall season.

GOTRSnoCo is a leader in delivering an evidence-based life-skills curriculum to girls of all abilities through a program that creatively integrates running and movement. With more than 45 sites across the county, the organization has served more than 1,600 girls since it was founded in 2015. For the 2020 fall season, GOTRSnoCo is offering 100% virtual programming for girls in 3rd-8th grades to accommodate the changing and unpredictable school schedules due to the pandemic. Registration for the eight-week season is now open at http://www.GirlsontheRunSnoCo.org.

"Our staff and coaches are ready to bring critical social-emotional programming to girls at a time when they need it the most," said Megan Wolfe, executive director of GOTRSnoCo. "We have adapted based on the recommendations of local health officials and decisions of local governments and school districts. Our virtual programming makes it possible for girls to stay active and connected despite the pandemic."

Virtual fall programming is delivered by trained coaches in a safe virtual space, with lessons that mirror the in-person Girls on the Run for younger girls or Heart & Sole program for older girls. Virtual programming will include physical activity and social-emotional learning, providing girls with an opportunity to still build meaningful connections with their peers and caring adult role models.

Volunteer coaches will receive the training and materials required to provide girls a safe, trauma-sensitive space to learn valuable life lessons and be active.

Added Wolfe: "Together, we will find a way to motivate girls to nurture their physical and emotional health, no matter the circumstances."



You've Heard Of Computer-Aided Design. What About Computer-Aided Biology? – Forbes


In the early days of the semiconductor industry, integrated circuits were designed by one or two engineers with slide-rules, hand-drawn on paper, and then given to a lithographer to print onto silicon wafers. As circuits became more complex, blueprints gave way to software. These digitally represented designs were much more than a reproduction of a pencil sketch: productivity, design quality, and communication all improved rapidly thanks to softwares ability to codify desired behaviors into actionable layouts, while also allowing for easy, iterative design improvements.

Today, large teams of engineers design circuits using high-level languages that automate the process, and chip layouts more detailed than a street map of the entire U.S. can be generated automatically. The result has been a revolution in engineering and design, manifesting itself as Moore's Law and the Information Age itself.

Today, a similar revolution is happening in biology, most notably in the field of synthetic biology. And comparisons between computer-aided design (CAD) and computer-aided biology (CAB) are hardly accidental.

In recent years, automation has revolutionized how we do biology: driving down the cost of sequencing, facilitating open-source science, and pushing screening and many other processes towards higher throughput. In parallel, this trend has pushed biological experimentation into the realm of big data, where the inherent complexity of biology is finally beginning to be codified in the form of large datasets from increasingly optimized experimentation.

However, the engineering and synthetic biology world has not quite been able to harness and systematize these developments into a sustainable positive feedback loop. Single-factor experiments remain the norm because automation has scaled at the level of tools, in the form of liquid-handling robots or electronic lab notebook technology, for example, but not at the foundational level of expanding and enhancing experiments to enable effective data integration and iterative design that captures the multivariate complexity of biology.

Tim Fell, CEO of Synthace

"It takes weeks and weeks to program robots, and it's not good for a different combination of factors with a completely different experiment the next day," says Tim Fell, CEO of Synthace and a member of the United Kingdom's Synthetic Biology Leadership Council. "It's important to look at manufacturing holistically so we simultaneously can bring wonderfully flexible automation to inflexible hardware and enter this new frontier of computer-aided biology."

Synthace is a bioprocess-turned-software company founded in 2011, and it wants to accelerate the inevitable equivalent shift in biology by making high-throughput experiments easy and ubiquitous. For Fell, automating biology and silicon chip production, at their highest levels, are essentially the same thing.

"These shifts are both about making the physical digital and the manual automated," he says. "They're both digital tooling to help these processes along. And it's digital tooling we need to leverage tools such as machine learning to unravel biological complexity faster, and with that better insight, to close loops of iterative design," he posits.

The cornerstone of Synthace's engineering biology endeavors is Antha, a cloud-based software platform for automating and improving the success rate, efficiency, and scalability of biological processes by connecting together all the hardware in a lab. Unlike most other platforms that digitize biology, Synthace still has a lab, a key differentiating factor that facilitates essential validation of new workflows.

To Fell, this is the key to enabling everything from easily tweakable out-of-the-box protocols for processes as simple as PCR (the same technique used for COVID-19 testing), to automated data aggregation that creates structured datasets for high-dimensional statistical learning, to iterative multifactorial experiments and the optimization of protein- and gene-based bioprocessing. "Our customers don't want software," he explains. "They want a biological outcome that produces reliable biological outputs."

This technology is the framework of the company's two white papers: "Computer-Aided Biology: The Metadata Responsibility," which highlights the crucial responsibility to define, capture, and combine metadata at the point of creation to facilitate deeper analysis downstream, and "Computer-Aided Biology: Delivering biotechnology in the 21st century." This philosophy has not gone unnoticed: Synthace's major partners and customers include Merck, Oxford Biomedica, Dow, Microsoft Station B, Tecan, and Syngenta, with work ranging from vectors for CAR-T cancer therapies to the optimization of liquid handling robots.

The value of such a mindset becomes apparent when considering the myriad intertwined pathways that accompany almost any biological phenomenon: there is seldom just one protein involved. Multiple proteins and pathways need to be screened and understood against the backdrop of innumerable other cellular components. That kind of experimental design requires intense scalability and organization.

"We can only do this if we codify biological experiments in an unambiguous way, akin to standards of CAD," Fell explains. "To do that, you have your examples of what needs to be defined, then you build experimental blocks, and then you pass your parameter set. That gives you the structure and context to be able to use your data downstream."
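
As a toy illustration of what codifying a multifactorial experiment can look like, the Python sketch below enumerates a full-factorial design over three factors. The factor names are hypothetical, and this is not Antha's actual interface.

```python
from itertools import product

# Hypothetical factors for a protein-expression experiment.
factors = {
    "temperature_c": [25, 30, 37],
    "inducer_mm":    [0.1, 0.5, 1.0],
    "media":         ["LB", "TB"],
}

# Full-factorial design: every combination of every factor level,
# rather than varying one factor at a time.
names = list(factors)
runs = [dict(zip(names, levels)) for levels in product(*factors.values())]

print(f"{len(runs)} runs")   # 3 * 3 * 2 = 18 conditions
for run in runs[:3]:
    print(run)               # e.g. one row of a robot worklist
```

Each dictionary in `runs` is an unambiguous, machine-readable description of one experimental condition, which is the structure that lets downstream analysis tools consume the resulting data with full context.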

Synthace's approach has gained notable traction within a variety of communities, drawing on its own team with broadly interdisciplinary skillsets to push the technology forward. "Multifactoring [screening beyond just one variable] turns cynics into evangelists in one experiment," Fell remarks. "There's no going back. You need these higher-order interactions to extract true insights."

Judging by the investing team behind Synthace, many agree. Chairman of the Board Bob Widerhold, an integrated circuit veteran of Bell Labs and later Cadence, is incredibly optimistic about the future of computer-aided biology. "Cadence turned out to be the leader in the [integrated circuit] space and a multi-billion-dollar company. I see the exact same scenario playing out 40 years later in biology with the formation of a computer-aided biology industry, and I hope to help Synthace be the Cadence of the biology industry," he says.

Widerhold sees this change as not only inevitable but also crucial. "The same shift that happened in the early '80s in the semiconductor industry, designing every aspect of a microprocessor on a computer to handle increasing complexity, needs to happen in the biology industry," he asserts. "I believe this shift will usher in a period of incredible exponential progress in biology and enable biology to reach its full potential to change the world in a very positive way."

Hermann Hauser, Co-Founder of Amadeus Capital Partners

Hermann Hauser, co-founder of Amadeus Capital Partners, with successes including acquisitions by Microsoft, Illumina, and Nvidia, emphatically agrees on biology's potential and the progress needed to make a promising future a reality. "Synthetic biology is the future of biology," Hauser suggests, "but biology needs standardized lab procedures to produce replicable results the way microprocessors standardized instruction sets for programming. This will make it possible to program biology."

These are grand visions of where the engineering biology industry is headed next, but many moving parts will need to mature to realize them. "Everyone has a part to play in this ecosystem," Fell declares, "and ours is in experiment execution." While Synthace hopes to find enthusiastic advocates in other companies to evangelize their overarching missions, the company will continue to drive the field forward at its most basic, foundational level: automating and abstracting biology to scale experimentation into the next frontier.

Subscribe to my weekly synthetic biology newsletter. Thank you to Aishani Aatresh for additional research and reporting in this article. I'm the founder of SynBioBeta, and some of the companies that I write about are sponsors of the SynBioBeta 2020 Global Synthetic Biology Summit.


Meet GitOps, the key to launching effective software releases in the cloud-native era – SiliconANGLE News

The automation story behind DevOps centers on CI/CD, the continuous integration and continuous deployment that results in working code ready for production.

Deployment isn't the end of the process, however. Releasing code is the missing step: putting new software in front of customers and end-users while ensuring it meets the ongoing objectives of the business.

Achieving the customer centricity and rapid deployments of CI/CD is difficult enough with traditional on-premises and cloud environments. But when deploying to Kubernetes-powered cloud-native environments, the massive scale and ephemerality of the operational environment require an end-to-end rethink of how to release software into production and operate it once it's there.

While most enterprises are currently in the midst of ramping up their Kubernetes deployments, certain industries, telecommunications in particular, are already looking ahead to the need for unprecedented scale.

As part of the 5G buildout, telcos are standing up small data centers at cell towers and points of presence. But "small" is a misleading adjective, since these data centers are essentially clouds in their own right, potentially running hundreds of thousands or even millions of Kubernetes clusters each.

From the perspective of the telco business, product managers want the ability to roll out new services to customers in sophisticated, dynamic ways. They may want to roll out new capabilities to a small group of customers, and then expand the deployment over time. They may have geographically specific offerings. Or perhaps they will delineate different service categories by compliance restrictions.

Furthermore, the telcos represent the tip of the spear. Many industries, from banking to automotive to media, are also looking to leverage similar capabilities to drive market share and customer value.

The list of possible variations in service offerings that such enterprises might want to roll out to different segments of their respective customer bases is extensive. Similarly, the scale of their technical infrastructures, as well as of the personnel supporting them, goes well beyond their requirements from a mere handful of years ago.

On the one hand, this explosive growth in business demand for ephemerality and scale is driving the exceptionally rapid maturation of the Kubernetes ecosystem.

On the other hand, all this cutting-edge technology actually has to work. And that's where cloud-native operations fits in.

Cloud-native computing takes the established infrastructure-as-code principle and extends it to model-driven, configuration-based infrastructure. Cloud-native also leverages the shift-left, immutable-infrastructure principle, as well as favoring extensibility over customizability, itself a model-driven practice.
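
To make the model-driven idea concrete, here is a minimal Python sketch of what a declarative, configuration-based model of a deployment might look like. The field names are hypothetical, not any vendor's schema: the team edits the desired-state model, and tooling reconciles reality to match it, rather than scripting imperative steps.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class ServiceSpec:
        # Desired state: what should be running, not how to get there.
        name: str
        image: str          # container image tag
        replicas: int
        environment: str    # "staging", "production", ...

    def diff(desired: ServiceSpec, actual: ServiceSpec) -> dict:
        """Report which fields drifted from the desired state."""
        return {
            field: (getattr(desired, field), getattr(actual, field))
            for field in ("image", "replicas", "environment")
            if getattr(desired, field) != getattr(actual, field)
        }

    desired = ServiceSpec("checkout", "checkout:1.4.2", replicas=5, environment="production")
    actual = ServiceSpec("checkout", "checkout:1.4.1", replicas=3, environment="production")
    print(diff(desired, actual))  # {'image': ('checkout:1.4.2', 'checkout:1.4.1'), 'replicas': (5, 3)}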

Although a model-driven, configuration-based approach to software deployment is necessary for achieving the goals of cloud-native computing, it is not sufficient to handle the scale and ephemerality characteristic of deployed software in the cloud-native context.

Software teams must extend such configurability to production environments in a way that expects and deals with ongoing change in production. To this end, canary deployments, blue/green rollouts, automated rollbacks and other techniques are necessary to both deal with and take advantage of ongoing, often unpredictable change in production environments.
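
As a concrete illustration of those techniques, here is a minimal canary-rollout sketch in Python. All function names and thresholds are hypothetical stand-ins, not any vendor's API: shift traffic to the new version in steps, watch an error-rate metric, and roll back automatically when the canary misbehaves.

    import time

    CANARY_STEPS = [5, 25, 50, 100]   # percent of traffic on the new version
    ERROR_BUDGET = 0.01               # tolerate up to 1% request errors

    def canary_rollout(set_traffic_split, error_rate, rollback, soak_seconds=60):
        # The three callables are platform hooks; stand-ins here, not a vendor API.
        for percent in CANARY_STEPS:
            set_traffic_split(percent)
            time.sleep(soak_seconds)      # let metrics accumulate before judging
            if error_rate() > ERROR_BUDGET:
                rollback()                # automated rollback, no human in the loop
                return False
        return True                       # new version now takes 100% of traffic

    # Example wiring with stand-ins; a real platform supplies these hooks.
    ok = canary_rollout(
        set_traffic_split=lambda pct: print(f"{pct}% of traffic on canary"),
        error_rate=lambda: 0.002,         # pretend metric: 0.2% errors
        rollback=lambda: print("rolling back"),
        soak_seconds=0,
    )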

Abstracting across different production environments is also an important challenge. Whether it be different public clouds, different Kubernetes distributions, or hybrid IT challenges that mix cloud and on-premises environments (perhaps for compliance reasons), cloud-native release orchestration must abstract such differences in order to provide a coherent, configuration-based approach to automating deployments across such variations.

Dependency management is also essential. Whether it be dependencies among individual microservices, or dependencies upon APIs that provide access to other types of software components, it's important that unexpected dependencies don't break the deployment, even when individual components are ephemeral.
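
One common way to keep unexpected dependencies from breaking a deployment is to derive a safe rollout order from the declared dependency graph, and to refuse to deploy at all when that graph contains a cycle. A minimal sketch using Python's standard-library graphlib module; the service names are hypothetical:

    from graphlib import TopologicalSorter, CycleError

    # Each service maps to the services it depends on.
    dependencies = {
        "frontend": {"cart", "catalog"},
        "cart": {"database"},
        "catalog": {"database"},
        "database": set(),
    }

    try:
        order = list(TopologicalSorter(dependencies).static_order())
        print(order)  # e.g. ['database', 'cart', 'catalog', 'frontend']
    except CycleError as err:
        # A dependency cycle means no safe rollout order exists; fail early.
        print("cannot deploy:", err)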

Finally, software teams must be able to deal with unprecedented scale. Kubernetes itself is built to scale, with an architecture that deploys microservices into containers, containers into pods, and pods into clusters. But clusters aren't enough.

Enterprises are already working through the intricacies of multicluster Kubernetes deployments. Software teams must also consider groups of clusters and then fleets of groups of clusters. Such fleets would typically cover multiple regions or data centers, bringing additional challenges of massive scale to the cloud-native party.
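
As a mental model, and nothing more, that hierarchy of clusters, groups and fleets can be pictured as nested collections over which operations fan out. The Python sketch below uses entirely hypothetical names:

    from dataclasses import dataclass, field

    @dataclass
    class Cluster:
        name: str
        region: str

    @dataclass
    class ClusterGroup:
        name: str
        clusters: list[Cluster] = field(default_factory=list)

    @dataclass
    class Fleet:
        name: str
        groups: list[ClusterGroup] = field(default_factory=list)

        def apply(self, change: str) -> None:
            # Fan a change out to every cluster in every group of the fleet.
            for group in self.groups:
                for cluster in group.clusters:
                    print(f"applying {change!r} to {cluster.name} ({cluster.region})")

    fleet = Fleet("edge", [ClusterGroup("us-east", [Cluster("tower-0001", "us-east-1")])])
    fleet.apply("ingress-config v7")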

In a useful oversimplification, the cloud-native community has boiled down everything organizations need to do to get Kubernetes running in full production into three "days."

Day 0 is the planning day. Day 1 is when you roll out Kubernetes and the rest of your cloud-native ecosystem. Day 2 represents full operations at scale.

Dividing such a complex, interconnected set of tasks into three discrete days highlights one important fact: Day 2 has so far gotten short shrift. To provide adequate attention to Day 2 issues, the community has coined a term: GitOps.

GitOps is a cloud-native model for operations that takes into account all the concepts this article has covered so far, including model-driven, configuration-based deployments onto immutable infrastructure that supports dynamic production environments at scale.

GitOps gets its name from Git, the hugely popular open-source source code management (SCM) tool. Yet, although SCM is primarily focused on the pre-release parts of the software lifecycle, GitOps focuses more on the Ops than the Git.

GitOps extends the Git-oriented best practices of the software development world to ops, aligning with the configuration-based approach necessary for cloud-native operations: only now, the team uses Git to manage and deploy the configurations as well as the source code.
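
In practice, the core GitOps mechanic is a reconciliation loop that treats the Git repository as the single source of truth: watch for new commits, then converge the running environment on whatever the repository declares. Below is a bare-bones sketch that assumes the git and kubectl command-line tools are on the PATH and that a local clone of a Kubernetes-manifest repository sits at a hypothetical path; real GitOps tools are far more robust than this.

    import subprocess
    import time

    REPO_DIR = "/srv/gitops/config-repo"  # hypothetical local clone of the config repo

    def git_head() -> str:
        return subprocess.run(
            ["git", "-C", REPO_DIR, "rev-parse", "HEAD"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()

    def reconcile_loop(poll_seconds: int = 30) -> None:
        last_applied = None
        while True:
            subprocess.run(["git", "-C", REPO_DIR, "pull", "--ff-only"], check=True)
            head = git_head()
            if head != last_applied:
                # Converge the cluster on the state declared in the repo.
                subprocess.run(["kubectl", "apply", "-R", "-f", REPO_DIR], check=True)
                last_applied = head
            time.sleep(poll_seconds)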

Such an approach promises to work at scale, even at the fleet level, since GitOps is well-qualified to abstract all the various differences among environments, deployments and configurations necessary to deal with ephemeral software assets at scale.

GitOps also promises a new approach to software governance that resolves bottlenecks. In traditional software development (including Agile), a quality gate or a change control board review requirement can stop a software deployment dead in its tracks. Instead, GitOps abstracts the policies that lead to such slowdowns, empowering organizations to better leverage automation to deliver adequate software governance at speed.

The beating heart of cloud-native computing is open-source software, so it's only logical that open-source projects are spearheading efforts in cloud-native operations.

For instance, Argo CD is a declarative, GitOps-centric CD tool for Kubernetes. Similarly, Tekton is a flexible open-source framework for creating CI/CD systems, allowing developers to build, test and deploy across cloud providers and on-premises systems.

In many ways, however, such projects are only pieces of the cloud-native operations puzzle, and it falls to the vendors to put the pieces together. To begin with, a number of vendors tout the model-driven, configuration-based approach. Here are a few examples.

Digital.ai Software Inc., for example, takes a model-driven, scalable approach that makes changes simple to make and to propagate to all environments. With Digital.ai, developers don't need to maintain complicated scripts or workflows for each deployment instance.

Octopus Deploy Pty Ltd. follows a similar approach, with model-driven ops configuration that provides simple configuration abstractions across heterogeneous environments, for example, on-premises as well as in the cloud.

With Octopus, instead of writing separate scripts for each environment, developers can put those scripts into Octopus and parametrize them, creating an abstracted configuration representation. Instead of separate CI/CD tooling, ops tooling and runbook automation, Octopus provides one deployment tool across all tools, environments and platforms.
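
The general idea behind such parametrization, shown here generically rather than as Octopus's actual API, is that one deployment routine consumes per-environment parameters instead of being copied for each environment:

    # One deployment routine, many environments: only the parameters vary.
    ENVIRONMENTS = {
        "staging":    {"replicas": 2, "url": "https://staging.example.com", "debug": True},
        "production": {"replicas": 8, "url": "https://www.example.com", "debug": False},
    }

    def deploy(app: str, env: str) -> None:
        params = ENVIRONMENTS[env]
        print(f"deploying {app} to {env}: "
              f"{params['replicas']} replicas at {params['url']}, debug={params['debug']}")

    deploy("storefront", "staging")
    deploy("storefront", "production")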

Similar to Octopus, ShuttleOps Inc. encapsulates a host of connectors and its own coded application and infrastructure configurations under the covers, parametrizing them as steps in the pipeline workflow. It then reports results to the orchestration and management tools of choice.

CircleCI (Circle Internet Services Inc.) and Cloudbees Inc. are two other vendors that represent a full deployment via declarative configuration files.

Many vendors also resolve the interdependencies among microservices (as well as other components) in production. Cloud66 Inc. enables developers and architects to define service dependencies in an abstracted but deterministic fashion. Those dependencies define the workflows that operations must manage.

Cloud66 can then tell developers when they need a new version of a particular piece of software in order to resolve such dependencies, and it also tells operators what they need to do to support it.

Harness Inc. offers what it calls a continuous delivery abstraction model that uses templates to eliminate dependencies. The CDAM resolves the impact of upstream and downstream microservices dependencies with automatic rollbacks.

Several vendors pull together the cloud-native operations story with a GitOps offering.

At WeaveWorks Inc., GitOps is context-aware, leading to a model of the entire system that represents its desired state. WeaveWorks supports multiple variations, for example, a custom platform as a service running on-premises, as part of the same comprehensive model. WeaveWorks leverages a distributed database for configurations that supports potentially millions of clusters and works in high-latency and occasionally disconnected environments.

GitLab Inc. is another vendor with explicit GitOps support. GitLab offers a single platform that takes an infrastructure-as-code approach, defining configurations and policies as code while leveraging automation to apply changes with Git merge requests.

This automation support in GitLab resolves many governance issues, as it leads to approvals with fewer bottlenecks. GitLab's GitOps strategy is all about automation, for example, automated rollbacks. GitLab also supports release evidence, which gives an audit trail of everything included in each release, along with associated metadata.

D2IQ Inc. touts its own flavor of GitOps that it calls GitNative, which combines GitOps and Kubernetes-native CI/CD. The goal is to maximize speed, scale and quality via full-lifecycle Git automation, from DevOps to GitOps to GitNative.

D2IQ takes an immutable-infrastructure approach that leverages Kubernetes APIs and primitives. Its platform is both serverless and stateless, and it also works on-premises. D2IQ leverages both the Argo CD and Tekton open-source projects.

A final GitOps-centric vendor is Codefresh Inc., which uses Git as the single source of truth, automating and securing pull requests and deployments. It handles source code provenance and supports multiple regions.

Where the rubber hits the road with Day 2 Kubernetes deployments is whether they will handle massive scale: scale on the order of millions of clusters.

Several vendors tout such capabilities. WeaveWorks offers cluster management that runs on the customer's choice of managed Kubernetes platform, plus application management, including release automation and progressive CD that scales to fleets.

Vamp.io BV leverages Kubernetes-based environments to provide release orchestration for applications that consist of large numbers of ephemeral microservices. This vendor offers release orchestration for DevOps that fully automates releases, including A/B testing, fine-grained segmentation and multitenant releases.

Rancher Labs Inc., soon to be part of SUSE, offers GitOps at scale. It deals well with large numbers of heterogeneous nodes, including clusters, cluster groups and fleets. D2IQ also touts a single pane of glass for managing fleets of Kubernetes clusters.

A few vendors are also tackling the difficult challenge of ensuring that code in production continues to meet the business need even when that code is inherently dynamic and ephemeral. I call this capability intent-based operations.

On this list: the Keptn open-source project from Dynatrace LLC. Keptn produces a remediation file that automates the remediation of code in production as it drifts from its intended purpose. This remediation also allows for graceful failure in an automated fashion.

Keptn validates whether a particular remediation action works and, if not, it tries another one. Dynatrace calls this automated iterative approach to remediation micro-operations.
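
In spirit (this sketch does not reproduce Keptn's actual remediation file format), such micro-operations amount to trying an ordered list of remediation actions, validating after each one, and failing gracefully if nothing restores the intended behavior:

    def remediate(actions, is_healthy, graceful_degrade):
        """Try remediation actions in order until the service meets its intent.

        actions: list of (name, callable) pairs; is_healthy and graceful_degrade
        are supplied by the platform. All are stand-ins, not Keptn's API."""
        for name, action in actions:
            action()
            if is_healthy():   # validate: did this remediation actually work?
                return name
        graceful_degrade()     # automated graceful failure as a last resort
        return None

    # Example wiring with stand-in actions.
    result = remediate(
        actions=[("restart-pod", lambda: print("restarting pod")),
                 ("scale-up", lambda: print("adding replicas"))],
        is_healthy=lambda: True,
        graceful_degrade=lambda: print("degrading gracefully"),
    )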

Harness's GitOps approach also includes continuous verification across performance, quality and revenue, with automatic rollbacks, another example of intent-based operations.

Finally, Vamp leverages metrics from production traffic to provide continuous validation, ensuring released code meets requirements on a continual basis.

It is tempting for anyone in a traditional enterprise to look at the massive scale and ephemerality of cloud-native deployments and wonder whether their organizations would ever need software that follows such patterns, which are so dramatically different from most of the software they're familiar with in today's enterprise environments.

While it's true that industry needs will vary, and individual companies will face different challenges from their competitors, no one should be too confident that the Day 2 vision this article lays out won't apply to them.

Remember, if a technical capability becomes available that improves the ability of certain organizations to roll out differentiated products and services that meet customer needs, then their competition must also leverage similar capabilities or risk becoming uncompetitive and, in the end, failing to survive.

In other words, cloud-native computing is here. It's already delivering massive scale and ephemerality to enterprises that are leveraging such capabilities to deliver differentiated products and services to their respective markets. If your organization doesn't jump on this bandwagon as well, and quickly, your future is in question. Don't be left behind.

Jason Bloomberg is founder and president of Intellyx, which publishes the Cloud-Native Computing Poster and advises business leaders and technology vendors on their digital transformation strategies. He wrote this article for SiliconANGLE. (* Disclosure: At the time of writing, Digital.ai and Dynatrace are former Intellyx customers. None of the other organizations mentioned in this article is an Intellyx customer.)


Read more:

Meet GitOps, the key to launching effective software releases in the cloud-native era - SiliconANGLE News

Microsoft Teams: Now you can use it with GitHub in this new public beta – ZDNet

Microsoft-owned GitHub has announced the public beta of a new GitHub integration with Microsoft Teams.

The public beta means developers using GitHub now have the option of adding the GitHub app to Microsoft Teams, just as they've been able to do with the Slack chat app for several years.

GitHub and Slack teamed up in 2018 to bring GitHub to Slack to make it easier for teams to track GitHub activity in Slack channels.

SEE: Top 100+ tips for telecommuters and managers (free PDF) (TechRepublic)

The GitHub and Microsoft Teams integration, which is maintained by GitHub, offers similar functionality to the Slack integration, but for Teams channels.

"The GitHub integration for Microsoft Teams gives you and your teams full visibility into your GitHub projects right in your Teams channels, where you generate ideas, triage issues and collaborate with other teams to move projects forward," GitHub explains.

GitHub users can install the GitHub preview app from the Microsoft Teams app store within the Teams app. Users need to link their GitHub and Teams accounts by authenticating to GitHub using an @github sign-in command.

GitHub for Teams allows users to track and create new commits, pull requests, issues, status updates, comments and code reviews.

GitHub users can subscribe to, and unsubscribe from, notifications for an organization's or a repository's activity to keep notifications relevant.
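
By way of illustration, the channel commands take roughly the following form; the repository name here is hypothetical, and the exact syntax may differ from this sketch:

    @github signin
    @github subscribe octo-org/octo-repo
    @github unsubscribe octo-org/octo-repo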

GitHub highlights a feature that lets users 'unfurl' GitHub links to give others in a Microsoft Teams channel more information when they share links to GitHub activities, such as pull requests.

The app groups notifications for pull requests and issues under a parent card as replies. The parent card shows the latest of these issues along with information about the title, assignees, reviewers, labels and checks.

SEE: GitHub: Our upgrade to programming language Ruby 2.7 fixes over 11,000 issues

The GitHub and Teams integration should be good news for the portion of GitHub's 30 million developer users who also rely on Teams for collaboration.

Microsoft meanwhile has been busy releasing new features for Microsoft Teams, which as of April had 75 million daily active users. The latest feature it released for Teams was the new Lists app, which offers Teams users a spreadsheet format with a focus on collaboration and completing tasks.

Continued here:

Microsoft Teams: Now you can use it with GitHub in this new public beta - ZDNet