Fairwinds Adds Open Source Interface for Kubernetes Backup and Recovery – Container Journal

Fairwinds, a provider of managed IT services, has launched an open source project that layers a user interface on top of the VolumeSnapshot application programming interface (API) that is available in beta on the latest release of Kubernetes.

Robert Brennan, director of open source at Fairwinds, says Fairwinds Gemini will make it easier for IT administrators to automate backups on a customizable, fine-grained schedule, as well as to restore specific backups and delete stale ones. As the number of snapshots an organization creates increases, it's relatively easy for those snapshots to pile up over time, he notes.

Given the ephemeral nature of containers, there's a desire to run snapshots more frequently to capture the application environment at a specific time. However, the cost of those snapshots can add up in cloud computing environments, he says.

The VolumeSnapshot API is an extension of the Container Storage Interface (CSI) through which IT teams attach external storage systems to Kubernetes clusters. As databases are deployed more frequently on Kubernetes clusters, it becomes more important to automate the backup and recovery of that data in the event a Kubernetes cluster suddenly becomes unavailable.
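For reference, a VolumeSnapshot object built against that beta API looks roughly like the following sketch (expressed here as a Python dict; the snapshot, PVC, and snapshot-class names are placeholder assumptions, not from the article):

```python
import json

# Sketch of a VolumeSnapshot manifest against the v1beta1 snapshot API.
# All names below (demo-snapshot, my-pvc, csi-hostpath-snapclass) are
# illustrative placeholders.
volume_snapshot = {
    "apiVersion": "snapshot.storage.k8s.io/v1beta1",
    "kind": "VolumeSnapshot",
    "metadata": {"name": "demo-snapshot"},
    "spec": {
        "volumeSnapshotClassName": "csi-hostpath-snapclass",
        # The snapshot is taken from an existing PersistentVolumeClaim
        "source": {"persistentVolumeClaimName": "my-pvc"},
    },
}

print(json.dumps(volume_snapshot, indent=2))
```

A tool like Gemini layers scheduling and retention on top of objects of this shape, so administrators do not have to create and prune them by hand.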

IT teams should also remember to test their ability to recover from backups, because there are any number of reasons a backup file may become corrupted, including the injection of malware that encrypts the very data organizations expect to be pristine when recovering from a ransomware attack.

Interest in deploying stateful applications on Kubernetes clusters is rising because IT teams are looking to streamline the management of stateless and stateful applications on the same platform, rather than relying on legacy external storage systems to capture the state of applications running on a Kubernetes cluster.

Fairwinds Gemini is the fifth open source project launched by Fairwinds pertaining to Kubernetes. The other four are Nova, a tool for monitoring Helm charts; Astro, a tool for managing instances of Datadog monitoring tools for Kubernetes clusters; Pluto, a tool for discovering deprecated Kubernetes APIs; and Polaris, a tool that evaluates Kubernetes configurations based on best DevOps practices.

Despite the downturn in the economy brought on by the COVID-19 pandemic, interest in deploying cloud-native applications on Kubernetes clusters is on the rise. Organizations want to build and deploy applications today that will stand the test of time rather than continuing to build monolithic applications that eventually would have to be modernized anyway. The issue many organizations have today is they lack the internal expertise required to build and deploy those cloud-native applications, notes Brennan.

During an economic downturn, the adoption of open source technology rises. IT organizations would rather reduce commercial license fees than cut IT headcount. The challenge they all face now is managing the rate of open source innovation, which is now occurring faster than many of them can absorb on their own.


Bill Gates says Tesla Semi and electric airplanes will probably never work, and he is wrong – Electrek.co

Bill Gates has thrown some cold water on the Tesla Semi project and recent comments from Elon Musk about the possibility of commercial electric airplanes.

Is he right?

In recent years, Gates has focused on using his fortune to try to fix major problems in the world.

He has been getting more attention lately due to his early warnings, before the COVID-19 crisis, that the world was not ready for a pandemic.

Now, he is using his platform to issue a similar warning about climate change:

Earlier this month, I wrote about how COVID-19 is a cautionary tale for climate change. There's no doubt that we have experienced terrible suffering and economic hardship over the last several months. But as hard as it is to imagine right now, when we're still in the middle of the pandemic, climate change has the potential to be even more devastating.

In a new blog post, he emphasizes the need to electrify transport in order to address climate change.

However, he made some controversial comments about which segments of transportation will go electric.

Gates does believe that passenger vehicles are going to be electrified, and they already are to a degree:

Plus, increased competition in the market means there are more choices available to customers than ever before, from compact sedans to sleek sports cars. You'll even be able to buy an all-electric pickup truck soon thanks to legacy companies like GM and Ford and new carmakers like Rivian and Bollinger.

The Microsoft founder, who used to drive a Model X, snubbed Tesla and its Cybertruck when mentioning electric pickup trucks.

No big deal, but Gates' comments get more controversial when he claims that electric semi-trucks, like the Tesla Semi, and electric jets will probably never happen:

The problem is that batteries are big and heavy. The more weight you're trying to move, the more batteries you need to power the vehicle. But the more batteries you use, the more weight you add, and the more power you need. Even with big breakthroughs in battery technology, electric vehicles will probably never be a practical solution for things like 18-wheelers, cargo ships, and passenger jets. Electricity works when you need to cover short distances, but we need a different solution for heavy, long-haul vehicles.

That's despite several electric semi-truck programs, like the Tesla Semi and the Freightliner eCascadia, being quite far along.

As for electric airplanes, Tesla CEO Elon Musk has been predicting that they would become viable once batteries reach an energy density of 400 Wh/kg, which many battery manufacturers are currently working on.
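To see why energy density is the crux of the argument, here is a rough back-of-the-envelope calculation (a sketch only; the pack size and density figures below are illustrative assumptions, not vehicle specifications):

```python
# How battery pack mass scales with cell energy density.
# All figures are rough assumptions for illustration.
def pack_mass_kg(energy_kwh: float, density_wh_per_kg: float) -> float:
    """Battery mass required to store energy_kwh at a given energy density."""
    return energy_kwh * 1000 / density_wh_per_kg

energy_needed = 1000  # kWh; a hypothetical long-haul semi-truck pack
# 180 Wh/kg is a rough present-day figure; 400 and 500 Wh/kg are the
# thresholds Musk and QuantumScape are respectively targeting.
for density in (180, 400, 500):
    print(f"{density} Wh/kg -> {pack_mass_kg(energy_needed, density):,.0f} kg")
```

The same hypothetical pack drops from several tonnes to a fraction of that as density rises, which is why the 400-500 Wh/kg targets matter so much to the feasibility debate.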

Instead, Gates suggests biofuels as a potential alternative to batteries for those segments of transportation.

I won't pretend to be smarter than Bill Gates, but I'd like him to revisit these comments because I think he is not looking at the problem the right way.

First of all, it's not all about batteries being big and heavy.

If you look at the problem through that lens, you assume that we have reached the efficiency limit for all those types of vehicles (trucks, cargo ships, and planes).

I don't think that's true.

With the electrification of passenger cars, manufacturers have doubled down on their efforts to improve efficiency in order to use fewer batteries in their vehicles, and they have found plenty of room for improvement.

As we start to electrify trucks and planes, we will likely find new efficiency improvements because necessity is the mother of invention.

We just didnt have that need until now.

But if you want to focus on the batteries, I find it kind of crazy that he doesn't believe they will improve enough to enable electric trucks and planes.

Especially when you consider that he is a major investor in QuantumScape, which claims that its technology is going to enable 500 Wh/kg batteries:

That would be more than enough to enable long-range electric 18-wheelers and even commercial jet planes.

What am I missing? Let me know what you think in the comment section below.


The programming language that does not stop growing and that you may be interested in learning – Checkersaga

Tiobe's August index throws up a curious surprise: an exponential increase in the use of the R programming language, driven by the urgent search for a COVID-19 vaccine.

There are many programming languages on the market, so it is essential to have some kind of index of their popularity and use, so that professionals and students in programming can choose more wisely; the Tiobe index has a lot to say about that.

Tiobe is one of the most popular indexes in the programming market. It measures the popularity of languages based on searches in the main internet search engines, the availability of jobs, and the number of qualified engineers and programmers using each language.

Well, R may not sound like much to you, but it is the protagonist of the latest Tiobe index of the most popular programming languages, for August 2020. R has just climbed to eighth position, a considerable feat considering that only a year ago it sat in twentieth position among the most used programming languages.

The R programming language is free and open source, focused on statistical computing and graphics. Tiobe attributes its exponential growth over the last 12 months to universities and research institutes moving away from commercial, classical statistical languages like SAS and Stata to embrace R and other open source languages like Python. Likewise, many engineers and researchers are using R in data science work aimed at finding a vaccine for COVID-19.

Learning a programming language is not easy and can take many years, so knowing how to specialize in the right one, and one we enjoy at the same time, can help secure a job.

This means that R could end up being named Tiobe's programming language of 2020, an award given to the language that achieves the biggest ratings increase over the course of 12 months.

The rest of the positions in the index have hardly changed: the C programming language dominates, followed by Java and Python, while languages such as Go, Swift, and SQL are fighting to enter the Top 10.

Indexes like these are useful for programmers who want to specialize in the languages with the greatest demand for employment.

[Via: TechRepublic]


Build and Deploy .Net Core WebAPI Container to Amazon EKS using CDK & cdk8s – idk.dev

In this blog, we will leverage the development capabilities of the CDK for Kubernetes framework, also known as cdk8s, along with the AWS Cloud Development Kit (AWS CDK) framework to provision infrastructure through AWS CloudFormation.

cdk8s is an open-source software development framework for defining Kubernetes applications and reusable abstractions using familiar programming languages and rich object-oriented APIs. cdk8s apps synthesize into standard Kubernetes manifests, which can be applied to any Kubernetes cluster. cdk8s lets you define applications using TypeScript, JavaScript, and Python. In this blog we will use Python.
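As an illustration of what such a chart synthesizes, the sketch below builds, in plain Python, the kind of Service and Deployment manifests a simple cdk8s "webservice" produces. This is not the repository's actual main.py; the names, image reference, and ports are placeholder assumptions.

```python
# Sketch: the Service + Deployment manifests a cdk8s webservice chart
# would synthesize. Names, image, and ports are illustrative placeholders.
def webservice_manifests(name: str, image: str, port: int, replicas: int = 2):
    labels = {"app": name}
    deployment = {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {"containers": [
                    {"name": name, "image": image,
                     "ports": [{"containerPort": port}]}
                ]},
            },
        },
    }
    service = {
        "apiVersion": "v1",
        "kind": "Service",
        "metadata": {"name": name},
        "spec": {
            # LoadBalancer type is what exposes the pods via an ELB on EKS
            "type": "LoadBalancer",
            "selector": labels,
            "ports": [{"port": 80, "targetPort": port}],
        },
    }
    return [deployment, service]

# Hypothetical ECR image reference; substitute your own account and region.
manifests = webservice_manifests(
    "todo-webapi",
    "<account>.dkr.ecr.<region>.amazonaws.com/todo-app:latest",
    80,
)
```

Running cdk8s synth turns the chart into exactly this sort of YAML, which kubectl then applies to the cluster.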

The AWS CDK is an open source software development framework to model and provision your cloud application resources using familiar programming languages, including TypeScript, JavaScript, Python, C# and Java.

For the solution in this blog, we will use C# for the infrastructure code. Completing this walkthrough successfully takes about a couple of hours (including installing prerequisites), so plan accordingly.

Let's get started!

At a high level, we will:

Creating the infrastructure described above will result in charges beyond free tier. So, review the pricing section below for service-specific details and make sure to clean up the built infrastructure to avoid any recurring cost.

The GitHub source code includes a cdk8s folder from which the .NET application (the Docker container WebAPI in ECR) is deployed and run in the Kubernetes cluster. The cdk folder contains the AWS Cloud Development Kit (CDK) solution (C# .NET Core) that builds the infrastructure where the webapi (.NET Core Web API) is packaged, built as an artifact, and pushed to AWS ECR. The .NET sample project uses the AWS SDK and MySQL data packages to connect to and interact with an Amazon Aurora database. The exposed Web API endpoint handles HTTP calls (GET & POST) to add and retrieve TODOs. The end user can use any HTTP tool, such as curl, or UI tools like the Google Chrome ARC REST Client or Postman, to validate the changes.

We will use Docker Containers to deploy the Microsoft .NET Web API. The following are required to setup your development environment:

To provision the infrastructure (and services) and deploy the application, we will start by cloning the sample code from the aws-samples repo on GitHub, run the installation scripts (included in the sample code) to set up the infrastructure, and deploy the webapi to your AWS account. We will review and test the application, and finally clean up the resources (basically tearing down what you provisioned).

$ git clone https://github.com/aws-samples/aws-cdk-k8s-dotnet-todo

The git source provided above has cdk, webapi, and cdk8s folders. webapi has the necessary .NET Web API solution. We will use the AWS CDK commands to build the infrastructure and deploy the webapi into EKS. The cdk8s code provided (in Python) defines our Kubernetes chart, which creates a webservice (a k8s Service and Deployment).

Once the code is downloaded, take a moment to see how CDK provides a simpler way to spin up infrastructure using C# code. You may use Visual Studio Code or your favorite IDE to open the folder aws-cdk-k8s-dotnet-todo. Open the file /aws-cdk-k8s-dotnet-todo/cdk/src/EksCdk/EksCdkStack.cs; the code there (a snippet from the GitHub solution) spins up a VPC with the required CIDR and number of availability zones. Similarly, open the file /aws-cdk-k8s-dotnet-todo/cdk8s/main.py; the snippet there creates a Kubernetes chart and a webservice.

NOTE: Make sure to replace with your AWS account number (where you are trying to deploy/run this application).

main.py is called by cdk8s.yaml when cdk8s synth is invoked (by run_cdk8s.sh). Windows users may have to change the name to main.py instead of .main.py in the cdk8s.yaml


Scripts provided

A run_infra.sh script/bash file is provided as part of the code base. Make sure to replace the placeholder with your AWS account number (where you are trying to deploy/run this application). The script creates the CDK infrastructure and pushes the WebAPI into ECR. Additionally, it updates the kubeconfig for the newly created cluster.

If you would like to perform these steps manually, follow the steps below.

Step 1: Steps to build CDK

The above CLI will produce output similar to the sample below. Copy and execute it in the command line; this updates your kubeconfig to connect to the EKS control plane.

The following is a sample only:

EksCdkStack.cdkeksConfigCommand415D5239 = aws eks update-kubeconfig --name cdkeksDB67CD5C-34ca1ef8aef7463c80c3517cc12737da --region $REGION --role-arn arn:aws:iam::$ACCOUNT_NUMBER:role/EksCdkStack-AdminRole38563C57-57FLB39DWVJR

Step 2: Steps to Build and push WebAPI into ECR (todo-app ECR repository created as part of above CDK infrastructure)

Make sure to update your region and account number above

Step 3: Steps to create Kubernetes service and pods using cdk8s

After this runs, review the generated cdk8s.k8s.yaml. cdk8s creates the Kubernetes YAML needed for deploying and loading the image from ECR. A sample is provided below.

In this case, the generated yaml has a Kubernetes service & a deployment.

Once the Kubernetes objects are created, you can see the created pods and services as below. NOTE: It can take some time for the ELB to come up with the deployment.

The .NET code provided(cdk/src/EksCdk/Program.cs) creates the EksCdkStack as coded. Based on the name provided, a CloudFormation stack is built. You will be able to see this new stack in AWS Console > CloudFormation.

Stack creation creates close to 44 resources within a new VPC. Some of them are provided here below for your reference.

At the end of this step, you will create the Amazon Aurora DB table and the EKS Cluster exposed with a Classic LoadBalancer where the .NET Core Web API is deployed & exposed to the outside world. The output of the stack returns the following:

Once the above CloudFormation stack is created successfully, take a moment to identify the major components. Here is the infrastructure you'd have created:

Using CDK constructs, we have built the above infrastructure and integrated the solution with a public load balancer. The output of this stack gives the API URLs for health check and API validation. As you'll notice, by defining the solution using CDK you were able to:

Using the cdk8s chart, we were able to generate the needed Kubernetes deployment and service YAML. The generated YAML is applied to the EKS cluster and exposed using the classic load balancer.

Let's test the TODO API using any REST API tool, such as Postman or the Chrome extensions ARC and RestMan.

Set the request header Content-Type: application/json. Sample request: {"Task": "Deploying WebAPI in K8s","Status": "WIP"}
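If you prefer scripting the check, the same POST can be sketched with Python's standard library. The endpoint URL and path below are placeholders, not from the repository; substitute the load balancer address from the stack output.

```python
import json
import urllib.request

# Placeholder endpoint; replace <elb-address> with the load balancer
# address from the CloudFormation stack output. The /api/todo path is
# an assumption for illustration.
url = "http://<elb-address>/api/todo"

body = json.dumps({"Task": "Deploying WebAPI in K8s", "Status": "WIP"}).encode()
req = urllib.request.Request(
    url,
    data=body,
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment once the stack is up
```

A matching GET against the same URL (without a body) would list the stored TODOs.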

Run the cleanup.sh script to delete the created infrastructure.

If you would like to do this manually, make sure the following resources are deleted before performing the delete/destroy:

Cleanup can also be done using the CLI commands below:

As you can see, we were able to deploy an ASP.NET Core Web API application that uses various AWS services. In this post, we went through the steps and approach for deploying Microsoft .NET Core application code as containers, with infrastructure as code using CDK, and for deploying the Kubernetes services and pods using cdk8s. cdk8s+ is a library built on top of cdk8s: a rich, intent-based class library for using the core Kubernetes API. It includes handcrafted constructs that map to native Kubernetes objects and expose a richer API with reduced complexity. You can check out more cdk8s examples, patterns, AWS EKS architectures, and intent-driven APIs using cdk8s+ for Kubernetes objects.

We encourage you to try this example and see for yourself how the overall application design works within AWS. Then it will just be a matter of replacing your current applications (Web API, MVC, or other Microsoft .NET Core applications), packaging them as Docker containers, and letting Amazon EKS manage the applications efficiently.

If you have any questions/feedback about this blog please provide your comments below!


The future of school may be outdoors, even after the pandemic – CBC.ca

It's a five-minute walk from the nearest road to the wooden sign that announces the site of the Guelph Outdoor School. There, in a clearing in the woods, is a registration table, the only visible infrastructure.

On a sunny August weekday in Guelph, Ont., dozens of kids find their way down the path, equipped with hats and bug spray: everything they'll need for a full day outdoors. Cohorted into groups of 10, they play games, trek along a series of well-worn paths, study found bird bones, and learn things like how to tell which plant is Queen Anne's lace and which is poisonous water hemlock.

This is summer camp, but the Guelph Outdoor School runs similar programs year-round. In the past the full-day fall and winter programs have been more of a niche attraction, for students aged four to 14 with enough stamina to brave the wilderness in January. Many were homeschooled or had a special arrangement with their regular school to attend once or twice a week.

In 2020, though, with fresh air seen as a way to lower the risk of COVID-19 transmission, more parents are seeing the value in moving their kids outside through programming like this.

"The phone is [ringing] off the hook and I can't even keep track," said Chris Green, a former classroom teacher who started the outdoor school eight years ago.

He and his team have added seven new programs this year, all of which have been filling up. They've also partnered with a local Montessori school to offer a full-time option, where around 30 kids, split into two groups, will spend half the day in a classroom and the other half outdoors.

"For me, it's always made sense to have kids outside," Green said. "And now it makes double the sense, because it has now shifted from an educational and developmental initiative, to a kind of preventative public health initiative."

Even those who were already converts to the school's philosophy are thinking differently about its value.

Cheryl Cadogan's 13-year-old son, David, normally attends programming there one day a week during the school year. But this year, Cadogan said, their family has been on heightened alert since her partner is immunocompromised.

"It's not safe for us as a family to have him go back to school," she said.

David will instead take his Grade 8 classes online, while also spending a few days a week at the outdoor school.

Cadogan said she knows there's still a risk, but she is heeding the words of Dr. Anthony Fauci, head of the U.S. National Institute of Allergy and Infectious Diseases, who has said that outdoors is better than indoors.

Indeed, the appeal of open-air activities during the COVID-19 pandemic is rooted in science. Dr. Linsey Marr of Virginia Tech studies how viruses spread through the air. She said COVID-19 transmission by air is happening: "there's really no question anymore."

When asked why there's a lower risk of transmission outside, she recommended picturing a smoker. Outside, she said, the exhaled smoke "rapidly disperses throughout the atmosphere and becomes very dilute." Indoors, on the other hand, it gets "trapped."

While masks, physical distancing and proper ventilation can go a long way to help curb the spread of the virus in schools, Dr. Marr said she would seize upon "any opportunity that there is to move an activity outdoors."

The Toronto District School Board (TDSB) is trying to increase those opportunities for its students, encouraging teachers to take classes outside whenever possible this year. But schools that don't have a forest on their property will need to think differently about using the space beyond their doors.

David Hawker-Budlovsky is the Central Coordinating Principal for outdoor education at the TDSB. While it won't be possible for many large downtown schools to have full-day outdoor programming, he said teachers will be able to schedule time in the yard, while staggering entries and exits to maintain physical distance.

Teachers and students will have to get used to "traveling around and using the community as classroom as well," he said. Ideas range from reading aloud to a class in the yard, to teaching about climate change in a nearby ravine, or learning about local history while walking around the neighbourhood.

Hawker-Budlovsky said there will be challenges, and admitted the plan has skeptics. But he's excited about the idea of getting kids outside more often.

"I think what's really important is to be able to look at this [with] an open mind, be creative and be as flexible as possible," he said.

Open-mindedness will certainly be a valuable trait for those holding open-air classes in the Canadian winter. But according to Pamela Gibson, a former teacher who now consults on sustainability and outdoor education with Learning for a Sustainable Future (LSF), students and teachers can get past it.

"There is no bad weather," she said. "There are just bad clothes." Over time, she said, people can learn how to prepare themselves for those less-than-perfect forecasts.

In the early 2000s, as a teacher at Belfountain Public School in Caledon, Ont., Gibson began experimenting with open-air class time. The idea was initially spurred by a group of parents looking for ways for their kids to spend more time outside on the 10-acre property surrounding the school.

At first, she said, "we had the usual kids that hung around the doors and really felt uncomfortable. But as time went on, we [didn't] have those door hangers anymore."

Outdoor learning has become so ingrained there, she said students will sometimes spend two-thirds of their days in the yard or out in the community, working on class projects.

Teachers looking to adopt similar programs elsewhere, she said, will have to be creative. But from the Belfountain experience, even a tree can be looked to as a "possible source of curriculum."

Gibson suggested educators ask themselves, "What's the math in that tree? What's the science in that tree? Where are the arts in that tree?" She believes it's all there.

Holding classes outside in the community is not only possible, Gibson said, but is "crucial," even beyond the pandemic. Curriculum, she said, is "supposed to be what children need to function in the world, not just inside the building [and] not just inside their homes."

With the spectre of COVID-19 pushing educators to look differently at their classrooms, Gibson said, there's "an opportunity for great change," and perhaps even a chance to improve the system for the future.


Activision edges out Sony and Nintendo in August's TV ad spend – VentureBeat

Gaming brands upped their outlay on TV advertising in August by 26.66% compared to July, for an estimated spend of $22.5 million. There was almost a three-way tie for top-spending brands, with Activision edging out longtime chart leader PlayStation. In total, 11 brands aired 43 spots over 5,000 times, resulting in 1.1 billion TV ad impressions. Aside from Nintendo, each of the top brands targeted sports programming, especially NBA and MLB games, for ads during the month.

GamesBeat has partnered with iSpot.tv, the always-on TV ad measurement and attribution platform, to bring you a monthly report on how gaming brands are spending. The results below are for the top five gaming-industry brands in August, ranked by estimated national TV ad spend.

Activision spent an estimated $6.2 million airing a single spot for Call of Duty: Warzone, User Reviews, 627 times, resulting in 215.3 million TV ad impressions. The brand prioritized reaching a sports-loving audience: Top programming by outlay included the NBA, NHL, and MLB, while top networks included TNT, NBC Sports, and Fox.

PlayStation takes second place with an estimated spend of $5.8 million on four ads that ran 754 times, generating 214.7 million TV ad impressions. Most of the spend and impressions occurred in the second half of the month. The spot with the biggest spend (estimated at $3.8 million) was Cannot Be Controlled, promoting the Marvel's Avengers game. ESPN, Adult Swim, and Comedy Central were three of the networks with the biggest outlay, while top programming included MLB, NBA, and South Park.

At No. 3: Nintendo, with an estimated spend of $4.9 million on 20 commercials that aired over 1,900 times, resulting in 355.8 million TV ad impressions. The top spot by spend (estimated at $677,351) was She's My Favorite: Animal Crossing. Programs with the biggest outlay included SpongeBob SquarePants, The Loud House, and The Amazing World of Gumball; top networks included Nick, Cartoon Network, and Bravo.

Fourth place goes to Crystal Dynamics, which hadn't advertised on TV at all this year until August 20. The brand spent an estimated $3.2 million airing two ads, both for the Marvel's Avengers game, 397 times, generating 116.6 million TV ad impressions. It's Time to Assemble had the biggest outlay, an estimated $1.8 million. Three of the top programs by spend were the NBA, South Park, and MLB; top networks included ESPN, Adult Swim, and Comedy Central.

Rounding out the ranking is MLB Advanced Media Video Games with an estimated outlay of $825,253 on two spots that aired 323 times, resulting in 49.5 million TV ad impressions. Home Runs, advertising R.B.I. Baseball 20, had the most spend (estimated at $729,251). Most of its outlay went to MLB games, but Ancient Top 10 and Baseball Tonight: Sunday Night Countdown were also in the mix. On the network side of things, the brand prioritized Fox Sports 1, ESPN, and Fox.

For more about iSpot's attention and conversion analytics, visit iSpot.tv.


All you need to know about the Indian AI Stack – MediaNama.com

A committee under the Department of Telecommunications has released a draft framework for the Indian Artificial Intelligence Stack, which seeks to remove impediments to AI deployment. It essentially proposes a six-layered stack, with each layer handling different functions, including consent gathering, storage, and AI/Machine Learning (AI/ML) analytics. Once developed, this stack will be structured across all sectors and address, among other things, data protection, data minimisation, open algorithm frameworks, defined data structures, trustworthiness and digital rights, and data federation (a single database source for front-end applications). The paper also noted that there is no uniform definition of AI.

This committee, the AI Standardisation Committee, had in October last year invited papers on artificial intelligence addressing different aspects of AI, such as functional network architecture, AI architecture, and the data structures required. At the time, the DoT said that as the proliferation of AI increases, there is a need to develop an Indian AI stack to bring interoperability, among other things. Here is a summary of the draft Indian AI Stack; comments can be emailed to aigroup-dot@gov.in or diradmnap-dot@gov.in until October 3.

The stack will be made up of five main horizontal layers, and one vertical layer:

This is the root layer of the Indian AI stack, over which the entire AI functionality is built. The layer will ensure the setting up of a common data controller and will involve multi-cloud scenarios covering both private and public clouds. This is where the infrastructure for data collection will be defined. The multilayer cloud services model will define the relations between cloud service models and the other functional layers:

This layer will define the protocols and interfaces for storing hot data, cold data, and warm data (all three defined below). The paper called this the most important layer in the stack, regardless of the size and type of data, since value can only be derived from data once it is processed, and data can only be processed efficiently when it is stored properly. It is important to store data safely for a very long time, while managing all factors of seasonality and trends and ensuring that it is easily accessible and shareable on any device, the paper said.

The paper has created three subcategories of data depending on the relevance of data and its usability:

Categories of data

This layer, through a set of defined protocols and templates, ensures an open algorithm framework. The AI/ML process could be Natural Language Processing (NLP), deep learning, or neural networks. This layer will also define data analytics, which includes data engineering, focusing on practical applications of data collection and analysis, apart from scaling and data ingestion. Technology mapping and rule execution will also be part of this layer.

The paper acknowledged the need for a proper data protection framework: the compute layer involves analysis that mines vast troves of personal data to find correlations, which are then used for various computations. This raises various privacy issues, as well as broader issues of lack of due process, discrimination, and consumer protection.

The data so collected can shed light on most aspects of individuals' lives. It can also provide information on their interactions and patterns of movement across physical and networked spaces, and even on their personalities. The mining of such large troves of data to seek out new correlations creates many potential uses for Big Personal Data. Hence, there is a need to define proper data protection mechanisms in this layer, along with suitable data encryption and minimisation. (from the paper)

The compute layer will also define a new way to build and deploy enterprise service-oriented architectures, along with providing a transparent computing architecture over which the industry could develop its own analytics. It will have to provide for a distinction between public, shared, and private data sources, so that machine learning algorithms can be applied against the relevant data fields.

The report also said that the NITI Aayog has proposed an AI-specific cloud compute infrastructure, which will facilitate research and solution development using high-performance and high-throughput AI-specific supercomputing technologies. The broad specifications for this proposed cloud controller architecture may include:

Proposed architecture of AI specific controller

The paper described this as a purpose-built layer through which software and applications can be hosted and executed as a service. This layer will also support various backend services for the processing of data, and will provide a proper service framework for the AI engine to function. It will also keep track of all transactions across the stack, helping in logging and auditing activities.

This layer will define the end-customer experience through defined data structures and proper interfaces and protocols. It will have to support a proper consent framework for access to data by or for the customer. Consent can be given for individual data fields or for collective fields. This layer will also host gateway services. Typically, different tiers of consent will be made available to accommodate different tiers of permissions, the paper said.
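The tiered-consent idea can be sketched as a simple access check at the gateway. The tier labels, field names, and class below are hypothetical illustrations of the concept, not anything specified in the draft paper:

```python
# Hypothetical sketch of tiered consent: a user grants consent per field, or
# collectively for a group of fields, and the gateway checks the held tier
# against the tier an access request requires.

CONSENT_TIERS = ("none", "aggregate_only", "full")  # illustrative tier labels

class ConsentRegistry:
    def __init__(self):
        self._grants: dict[str, str] = {}  # field name -> granted tier

    def grant(self, field: str, tier: str) -> None:
        if tier not in CONSENT_TIERS:
            raise ValueError(f"unknown tier: {tier}")
        self._grants[field] = tier

    def grant_collective(self, fields: list[str], tier: str) -> None:
        # consent for a collective group of fields at once
        for f in fields:
            self.grant(f, tier)

    def allows(self, field: str, required: str) -> bool:
        # a higher tier implies every lower tier
        held = self._grants.get(field, "none")
        return CONSENT_TIERS.index(held) >= CONSENT_TIERS.index(required)

registry = ConsentRegistry()
registry.grant("location", "aggregate_only")
registry.grant_collective(["name", "email"], "full")

assert registry.allows("name", "full")
assert registry.allows("location", "aggregate_only")
assert not registry.allows("location", "full")
assert not registry.allows("health", "aggregate_only")  # no consent recorded
```

The ordering of the tier tuple is what makes the check one comparison: "full" consent automatically satisfies an "aggregate_only" requirement, while absent consent defaults to the most restrictive tier.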

This layer also needs to ensure that ethical standards are followed to protect digital rights. In the absence of a clear data protection law in the country, the EU's General Data Protection Regulation (GDPR) or similar laws can be applied. This will serve as an interim measure until Indian laws are formalised, the paper said.

This layer will ensure the process of security and governance for all five preceding horizontal layers. There will be an overwhelming flow of data through the stack, which is why there is a need to ensure encryption at different levels, the paper said. This may require setting up the ability to handle multiple queries in an encrypted environment, among other things. Cryptographic support is also an important dimension of the security layer, the paper said.

Why this layer is important, per the paper: data aggregated, transmitted, stored, and used by various stakeholders may increase the potential for discriminatory practices and pose substantial privacy and cybersecurity challenges. The data processed and stored in many cases includes geolocation information, product-identifying data, and personal information related to use or owner identity, such as biometric data, health information, or smart-home metrics.

Data storage in backend systems can present challenges in protecting data from cyberattacks. In addition to personal-information privacy concerns, there could be data used in system operation, which may not typically be personal information. Cyber attackers could misuse these data by compromising data availability or changing data, causing data integrity issues, and use big data insights to reinforce or create discriminatory outcomes. When data is not available, causing a system to fail, it can result in damage: for example, a smart home's furnace overheats, or an individual's medical device cannot function when required. (from the paper)

What the proposed AI stack looks like
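The stack's overall composition, as described in this summary, can be sketched as a small enumeration. The identifiers and role descriptions below are paraphrased labels for illustration, not official names from the draft paper:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Layer:
    name: str
    role: str

# Hypothetical enumeration of the draft stack, paraphrased from the summary:
# five horizontal layers plus one vertical layer cutting across all of them.
HORIZONTAL_LAYERS = [
    Layer("infrastructure", "root layer: common data controller, multi-cloud data collection"),
    Layer("storage", "protocols and interfaces for hot, warm, and cold data"),
    Layer("compute", "AI/ML processing, data analytics, rule execution"),
    Layer("application", "hosting software as a service, transaction logging and auditing"),
    Layer("data_exchange", "customer experience, consent framework, gateway services"),
]

VERTICAL_LAYERS = [
    Layer("security_governance", "encryption and governance spanning all horizontal layers"),
]

assert len(HORIZONTAL_LAYERS) == 5 and len(VERTICAL_LAYERS) == 1
```

Modelling the vertical layer separately mirrors the paper's framing: security and governance are not one stage in the pipeline but a concern applied across every horizontal layer.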

The report also lists the key benefits of this proposed AI stack.

This is how the paper proposes data would flow through the stack:

Proposed AI flowchart

In AI, the thrust is on how efficiently data is used, the paper said, noting that if the data is garbage, then the output will be too. For example, if programmers or AI trainers transfer their biases to AI, the system will become biased, the paper said. There is a need to evolve ethical standards, trustworthiness, and a consent framework to get data validation from users, the paper suggested.

The risks of passive adoption of AI that automates human decision-making are also severe. Such delegation can lead to harmful, unintended consequences, especially when it involves sensitive decisions or tasks and excludes human supervision, the paper said. It cited Microsoft's Twitter chatbot Tay as an example of what can happen when garbage data is fed into an AI system: Tay started tweeting racist and misogynistic remarks within 24 hours of its launch.

Need for openness in AI algorithms: The paper said it was necessary to have an open AI algorithm framework, along with clearly defined data structures. It referenced how the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) software, used by some US courts to predict the likelihood of recidivism in criminal defendants, was demonstrated to be biased, with the proprietary nature of its AI black box compounding the problem.

As AI learns to address societal problems, it also develops its own hidden biases. The self-learning nature of AI means that the distorted data the AI discovers in search engines, perhaps based on unconscious and institutional biases and other prejudices, is codified into a matrix that will make decisions for years to come. In the pursuit of being the best at its task, the AI may make decisions it considers the most effective or efficient for its given objective, but because of the wrong data, it becomes unfair to humans, the report said.

Need to centrally control data: Right after the paper made a pitch for openness in AI algorithms, it proposed that the data fed into the AI system should be controlled centrally. The data from which the AI learns can itself be flawed or biased, leading to flawed automated AI decisions. This is certainly not the intention of algorithmised decision-making, which is perhaps a good-faith attempt to remove unbridled discretion and its inherent biases. There is thus a need to ensure that the data is centrally controlled, including by using a single cloud controller or multiple cloud controllers, the report said.

Proper storage frameworks for AI: An important factor in aiding biases in AI systems is contamination of data, per the paper, which includes missing information, inconsistent data, or simply errors. This could be because of unstructured storage of data. Thus, there is a need to ensure proper storage frameworks for AI, it said.

Changing the culture of coders and developers: There is a need to change the culture so that coders and developers themselves recognise the harmful and consequential implications of biases, the paper said, adding that this goes beyond standardisation of the type of algorithmic code and focuses on the programmers of the code. Since much coding is outsourced, this would place the onus on the company developing the software product to enforce such standards. Such a comprehensive approach would tackle the problem across the industry as a whole, and enable AI software to make fair decisions based on unbiased data, in a transparent manner, it added.

In the near future, AI will have huge implications for the country's security, its economic activities, and society. The risks are unpredictable and unprecedented. Therefore, it is imperative for all countries, including India, to develop a stack that fits into a standard model which protects customers, users, business establishments, and the government.

Economic impact: AI will have a major impact on mainly four sectors, per the paper: manufacturing industries, professional services, financial services, and wholesale and retail. The paper also charted out how AI could be used in some specific sectors. For instance, in healthcare, it said that in rural areas, which suffer from limited availability of healthcare professionals and facilities, AI could be used for diagnostics, personalised treatment, early identification of potential pandemics, and imaging diagnostics, among other things.

Similarly, in the banking and financial services sector, AI can be used for things like the development of credit scores through analysis of bank history or social media data, and fraud analytics for proactive monitoring and prevention of various instances of fraud, money laundering, and malpractice, as well as the prediction of potential risks, according to the report.

Uses for the government: For governments, for example, cybersecurity attacks can be rectified within hours rather than months, and national spending patterns can be monitored in real time to instantly gauge inflation levels while collecting indirect taxes.

Excerpt from:
All you need to know about the Indian AI Stack - MediaNama.com

Sudbury to hold first drive-in concert – Sherwood Park News

Despite there being no summer festival in July, the Northern Lights Festival Boréal team has been steadily planning new ways to bring live music experiences to Sudbury.

Last week, the organization was thrilled to announce NLFB #49, a diverse and exciting presentation of festival programming in alternative formats.

The long-running music and arts festival has been re-introducing live music in ways that are safe, responsible, and fun. This special festival #49 programming culminates in the region's first-ever drive-in concert event, featuring some past festival favourites as well as a few new faces.

The Sept. 19 concert will include Canadian roots-pop icon Serena Ryder, dynamic songwriter/performer Hawksley Workman, Toronto roots/folk/soul artist Julian Taylor, as well as locals Martine Fortin and Maxwell Jos.

The event will take place in partnership with Horizon Drive-in, at the New Sudbury Centre parking lot (1349 Lasalle Blvd.). Tickets are available online only at nlfb.ca/tickets.

Ryder is an artist adored by fans, peers, and critics alike, in part due to her raw and earnest songwriting and beautifully electric live performances. She has received numerous accolades, including six prestigious Juno Awards, a MuchMusic Video Award for Stompa, and a Canadian Screen Award for Achievement in Music: Original Song.

Before her chart-smashing album Harmony (2013), she also enjoyed success with previous releases If Your Memory Serves You Well (2007) and Is It O.K. (2009), achieving Gold-selling status.

In 2012, her single Weak in the Knees also achieved Gold certification. Ryder's Christmas Kisses was named one of the Top 5 Christmas records of 2018 by Rolling Stone. She has also received the 2018 Margaret Trudeau Mental Health Advocacy Award and has been the face of the Bell Let's Talk campaign for multiple years.

A staple of the Canadian arts scene for almost 20 years, Hawksley Workman boasts a catalogue of 15 solo releases showcasing his now signature spectrum of sonic influence, from cabaret to electro-pop to anthemic rock and plenty in between.

The accolades amassed include JUNO nods and wins and widespread critical acclaim. As a producer, his fingerprints grace releases by Juno and Polaris Prize nominees, and winners like Tegan and Sara, Sarah Slean, Serena Ryder, Hey Rosetta!, and Great Big Sea.

He's also penned melodies with a myriad of artists, from Oscar-winning Marion Cotillard (La Vie en Rose, Inception) to French rock icon Johnny Hallyday.

Hawksley's touring career has seen him play nearly a thousand shows worldwide. He's headlined prestigious venues like Massey Hall in Toronto and The Olympia in Paris, and opened for heroes Morrissey, David Bowie, and The Cure.

Julian Taylor doesn't fit in a box. He never has, and more power to him. A Toronto music scene staple and a musical chameleon, Taylor is used to shaking it up over the course of 10 albums in the last two decades.

Of West Indian and Mohawk descent, Taylor first made his name as the frontman of Staggered Crossing, a Canadian rock radio staple in the early 2000s. These days, however, the soulful singer/guitarist might be on stage one night playing with his eponymous band, spilling out electrified rhythm-and-blues glory, and the next he'll be performing at a folk festival, delivering a captivating solo singer-songwriter set.

Martine Fortin is a bilingual singer-songwriter from Sudbury and a past winner of NLFB's annual Meltdown Competition. Her music is a blend of pop, jazz, blues, soul, and rock, combined with intimate, introspective lyrics and moving piano melodies. She will perform a few of her songs near the start of the evening.

Walking the line between country and folk, Maxwell Jos's songs draw from his experience growing up both in the North, on Lake Superior, and in southern Illinois. Anxiety, growing pains, and some good old-fashioned storytelling are key elements of his tunes. He will open the event by sharing a few of these songs.

Gates open at 6 p.m., and vehicles are asked to arrive at that time to ensure vehicle placement before showtime. Tickets are $30 in advance and $40 at the gate.

Due to safety protocols around COVID-19 and general health and safety, concert-goers must remain in their vehicles during the show. For any questions regarding tickets, protocols, or the event in general, contact the NLFB team at marketing@nlfb.ca or 705-674-5512.

sud.editorial@sunmedia.ca

Twitter: @SudburyStar

Visit link:
Sudbury to hold first drive-in concert - Sherwood Park News

Shadow banning and its role in modern day censorship – Cherwell Online

It is no secret that algorithms dominate our online social lives. It is not as if we aren't making our own decisions about who we talk to or what media we consume, but it would be wilfully ignorant to ignore how systems have been programmed to categorise, collect, and suggest data based on our likes and follows. This exposes us to content, people, and ideas that we would not have found on our own, but it raises the question: how much control do these systems have in restricting what we see?

This brings us to shadow banning.

Shadow banning is the decision of a social media platform to partially or wholly obstruct a person's content from being interacted with. Preventing new people from finding your content in search, ensuring you do not appear under hashtags, or limiting how often you are suggested as a person to follow are just a few ways this can be achieved. Platforms such as Instagram and TikTok rarely acknowledge claims of this nature, instead pointing to their right to remove posts that do not align with their Community Guidelines, and noting that agreeing to use the platform means consenting to their power to do so.

In the grand scheme of things, having your videos taken down or fewer people finding and engaging with your content is not the greatest detriment to the world, but there is a significant pattern to who is being shadow banned. TikTok's community guidelines claim to scrap videos created to facilitate harm to others, yet within those guidelines the platform makes an effort to reiterate that it allows educational, historical, satirical, artistic, and other content that can be clearly identified as counterspeech or that aims to raise awareness of the harm caused by dangerous individuals and/or organisations. This, and the platform's statement of support for the Black Lives Matter movement, will come as a surprise, especially to the many black creators who have seen their engagement rates fall and their videos taken down on the app.

Instagram has shown itself to be just as complicit in this: there has been significant backlash from sex workers, sex educators, and often queer-inclusive, sex-positive spaces on the app. Chante Joseph, in her Guardian piece, exposed the grey area, not as clearly defined as Instagram's no-nudity policy, in which administrators can flag content as sexually suggestive. Many people argue that this is necessary to ensure children are not exposed to inappropriate content; but rather than parents taking accountability, or social media platforms at least attempting to introduce any form of age restriction, the onus is placed on creators. Consider, for example, LGBTQIA+ creators: their accounts provide information that young people who may not have even come out to themselves would otherwise be unable to access, letting them process and understand their feelings in a healthy space that wasn't available just a decade ago. In essence, these guidelines about what a person is allowed to share are being defined by some arbitrary moral standard in which discussions of sex, specifically those outside the realm of the heteronormative, are something to be protected from, even though there are very few spaces that allow for them in real life either.

Instagram, Twitter, TikTok, Facebook: all are often steeped in a reputation for being superficial and resting on the self-gratification of people wanting to be seen (which isn't in itself a bad thing), but beyond that, they can be used to share ideas, political thoughts, and knowledge. So when black creators attempting to inform the masses are restricted from sharing information, or when sex workers' messages on misogyny are inaccessible because their page is considered too sexually suggestive (a term not defined and therefore difficult to avoid), the silence is deafening. Shadow banning is a threat to us because it maintains the illusion of control. Yet the whole idea is synonymous with censorship and the obstruction of information. Further, this obstruction is dictated by what platforms see as appropriate, so the power we assumed we had in our voices can still be silenced.

Go here to see the original:

Shadow banning and its role in modern day censorship - Cherwell Online

What’s the state of quantum computing? Led by IBM & Amazon it’s developing rapidly – WRAL Tech Wire

Editor's note: Stephanie Long is a Senior Analyst with Technology Business Research.

HAMPTON, N.H. Like IBM did with its Selectric typewriters in the 1960s, the company is successfully weaving its quantum computing thread through myriad aspects of the greater quantum ecosystem, underpinned by strategic sponsorships and the inclusion of partners in the IBM Quantum Experience.

Amazon Web Services (AWS) is pushing back on this approach by offering a vendor-agnostic view of quantum cloud computing.

Academia has also thrown its hat into the ring with ongoing innovation and advancements in quantum computing.

The competitive landscape of quantum computing has begun to take on the look and feel of the early classical computing world; however, the modern industry has addressed the mistakes made with classical computing, and therefore progress can be more formulaic and swift.

August 2020 developments are starting to tie pieces of investments together to show a glimpse of when the post-quantum world may come, and as advancements continue the future state appears closer on the horizon than previously thought.

Duke joins $115M program to focus on development of quantum computing

If you would like more detailed information about the quantum computing market, please inquire about TBR's Quantum Computing Market Landscape, a semiannual deep dive into the quantum computing market. Our most recent version, which focused on services, was released in June. Look for our next iteration in December, focused on middleware.

(C) TBR

Follow this link:
What's the state of quantum computing? Led by IBM & Amazon it's developing rapidly - WRAL Tech Wire