Best Programming Languages That Will Be In-Demand in 2021 – Analytics Insight

With the dynamic nature of the technology landscape, new technologies emerge every day. This rapid emergence is fuelling demand for new skills and knowledge among today's workforce across industries. The year 2020 saw surging demand for programmers and developers in diverse industries. The programming and developer communities are growing faster than ever as organizations rely on software to run their businesses efficiently. Various new programming languages are emerging that suit developers who want to grow their careers.

While programming languages are set to generate more job opportunities, we have compiled a list of the best programming languages that will be in-demand in 2021.

Python is one of the most widely adopted programming languages. Fast and easy to learn, it is commonly used to build scalable web applications. Python provides rich library support, has a large developer community, and offers a great starting platform for beginners.

JavaScript is one of the most powerful and flexible programming languages of the web. It powers the dynamic behavior of most websites. Many businesses today use Node.js, a JavaScript-based runtime environment that allows developers to use JavaScript for server-side scripting to build dynamic web page content before the page is sent to a user's web browser.

C++ is one of the most efficient and flexible programming languages, built to support object-oriented programming. It has rich built-in libraries and is widely used to develop desktop and web applications on Windows, Linux, Unix, and Mac.

Despite losing ground in language rankings year over year, Java remains among the most popular programming languages. It is a class-based, general-purpose language designed to let developers write once, run anywhere: compiled Java code can run on all platforms that support Java without recompilation.

The R programming language provides excellent built-in libraries for developing powerful machine learning algorithms. The language is used for general statistical computing as well as graphics, and it runs seamlessly on various operating systems. Major technology companies such as Google, Facebook, and Uber use R. With the growth of data science and machine learning, this language will bring new career opportunities in the future.

Developed and maintained by Microsoft, TypeScript is an open-source, object-oriented language. It may be used to create JavaScript applications for both client-side and server-side execution. TypeScript catches errors at compile time so that programmers can fix them before running the code. It adds static types and supports object-oriented features such as classes, interfaces, and enums, allowing JavaScript to be used at scale.

The Go language, built at Google in 2007 by Robert Griesemer, Rob Pike, and Ken Thompson, is a statically typed language with syntax similar to C. Go was designed to improve programming productivity in an era of multicore, networked machines and large codebases. Because it provides features such as garbage collection, type safety, and more, companies like Uber, Twitch, and Google work with Go.

PHP is another open-source server-side scripting programming language that is used for website development. As it is easier to learn, this language is highly recommended for beginners. PHP can be used for various programming tasks outside of the web context, such as standalone graphical applications and robotic drone control.


TechGirlz Leads SXSW EDU Online Panel on Surviving and Thriving Through a Pandemic – Daily American Online

PHILADELPHIA, Feb. 1, 2021 /PRNewswire/ -- Leaders from four nonprofit groups devoted to inspiring young women to pursue careers in technology and lead healthy, balanced lifestyles will explain how their organizations became stronger during the pandemic when they speak this March at SXSW EDU Online, the annual event that fosters innovation and learning within the education industry.

"Revelations from Our Pandemic Pivots" will feature representatives from TechGirlz, Philly Tech Sistas, Black Girls Code and Girls on the Run.

"There is strength and wisdom in working to lift one another through difficult times," said Amy Cliett, director of TechGirlz, a program of Creating IT Futures. "While TechGirlz learned plenty of lessons by managing our pandemic pivot, it is important for us, and others, to hear and learn from other organizations who faced similar challenges."

Cliett, who will anchor the panel discussion, will be joined by Isis Miller, community and events manager for Black Girls Code, Lauren Psimaris, director of development for Girls on the Run for Montgomery, Delaware and Chester counties in Pennsylvania, and Ashley Turner, founder of Philly Tech Sistas.

Here is the session description for the highly coveted speaking slot at SXSW EDU:

For four groups that inspire young women to pursue tech careers and healthy, balanced lifestyles, the pandemic was more than an operational challenge. It was an existential crisis. How do you comply with COVID-19 lockdown and distancing protocols when direct, in-person interaction between students, instructors and mentors is the backbone of your organization's mission? Join a panel of executives from TechGirlz, Black Girls Code, Tech Sistas and Girls on the Run who have some answers to share.

"We are proud to share this virtual stage with organizations that not only faced similar challenges but also managed to not just survive, but thrive," Cliett added. "The remarkable pivots and achievements of these four non-profits during a global pandemic, especially given limited resources, have produced lessons that can be applied to any organization, non-profit or for-profit, on how to creatively manage through a crisis."

Information on attending SXSW EDU and the schedule of sessions can be found at http://www.sxswedu.com.

About TechGirlz

TechGirlz is a nonprofit program of Creating IT Futures that fosters a love for technology in middle school girls. Our free, open-source technology courses can be used by anyone to inspire curiosity, impart confidence and build community as the foundation for the application of technology throughout a girl's career and life. TechShopz courses have been taught by volunteer instructors in several states and four countries to tens of thousands of girls. To learn more or find out how you can participate, please visit http://www.techgirlz.org/.

About Creating IT Futures

Founded in 1998 by CompTIA, Creating IT Futures is a 501(c)(3) charity with the mission of helping populations under-represented in the information technology industry and individuals who are lacking in opportunity to prepare for, secure, and be successful in IT careers. Learn more at http://www.creatingITfutures.org.

About Black Girls Code

Black Girls CODE is devoted to showing the world that black girls can code and do so much more. By reaching out to the community through workshops and after school programs, Black Girls CODE introduces computer coding lessons to young girls from underrepresented communities in programming languages such as Scratch or Ruby on Rails. Black Girls CODE has set out to prove to the world that girls of every color have the skills to become the programmers of tomorrow. Learn more at http://blackgirlscode.org/.

About Girls on the Run of Southeastern Suburban, PA

Girls on the Run is a transformational, physical activity-based positive youth development program for girls in the third through eighth grades. The girls are taught life skills through dynamic interactive lessons and running games. The program culminates with the girls being physically and emotionally prepared to complete a celebratory 5K running event. The goal of the program is to unleash confidence through accomplishment while establishing a lifetime appreciation of health and fitness. Girls on the Run of Southeastern Suburban, Pa. is an independent council of Girls on the Run International, which has a network of 200+ locations across the United States and Canada. For more information, visit http://www.gotrpa.org.

About Philly Tech Sistas

Philly Tech Sistas is an organization aimed at helping women of color gain technical and professional skills in order to work, thrive, and move up in the tech industry. We do this by providing intro to programming workshops and professional development events that build leadership, confidence and community. Our vision is to partner with tech companies throughout the Philadelphia area to help bridge the diversity, equity and inclusion gap by providing a greater pipeline of diverse talent. Learn more at http://www.phillytechsistas.org/.

Press Contact:

Gloria Bell

Events and Marketing Manager

TechGirlz

gloria@techgirlz.org / 267-909-2308

View original content to download multimedia: http://www.prnewswire.com/news-releases/techgirlz-leads-sxsw-edu-online-panel-on-surviving-and-thriving-through-a-pandemic-301218688.html

SOURCE Creating IT Futures


Use Amazon EMR with Apache Airflow to simplify processes – TechTarget

Amazon EMR is a managed AWS service used to create and run Apache Spark or Apache Hadoop big data clusters at massive scale on AWS instances. IT teams that want to cut costs on those clusters can do so with another open source project -- Apache Airflow.

Airflow is a workflow orchestration tool that defines and runs big data pipeline jobs. It works with many tools and services, such as Hadoop and Snowflake, a data warehousing service. It also works with AWS products, including Amazon EMR, the Amazon Redshift data warehouse, Amazon S3 object storage and Amazon S3 Glacier, a long-term data archive.

Amazon EMR clusters can rack up significant expenses, especially if the supporting instances are left running while idle. Airflow can start and take down those clusters, which helps control costs and surge capacity.

Airflow, and its companion product Genie -- a job orchestration engine developed by Netflix -- run jobs by bundling JAR files, Python code and configuration data into metadata, which creates a feedback loop to monitor for issues. This process is simpler than using the spark-submit script or Yarn queues in Hadoop, which offer a wide array of configuration options and require an understanding of elements like Yarn, Hadoop's resource manager.

Therefore, while IT teams don't need Airflow specifically -- all the tools it installs are open source -- it might reduce costs if the organization uses Airflow to install and tear down those applications. Otherwise, Amazon EMR users would have to worry about charges for the idle resources, as well as the costs of a big data engineer and the time and effort required to write and debug scripts.

Let's take a closer look at Amazon EMR and Airflow to see if they fit your organization's big data needs.

Figure 1 shows the configuration wizard for Amazon EMR. It installs some of the tools normally used with Spark and Hadoop, such as Yarn, Apache Pig, Apache Mahout (a machine learning tool), Apache Zeppelin and Jupyter.

The name EMR is an amalgamation of Elastic and MapReduce. Elastic refers to the elastic cluster hosted on Amazon EC2. Apache MapReduce is both a programming paradigm and a set of Java SDKs -- in particular, two Java classes: Mapper and Reducer.

These run MapReduce operations and then optionally save the results to the Hadoop Distributed File System (HDFS).

Amazon EMR supports multiple big data frameworks, including newer options such as Apache Spark, which performs the same tasks as Hadoop but more efficiently.

Mapping, common to most programming languages, means to apply some function to every element of a collection of data. Reduce means to count, sum or otherwise condense that mapped data into a smaller result.
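As a generic illustration -- plain Python, not EMR- or Hadoop-specific code -- the two steps look like this:

```python
from functools import reduce

data = [1, 2, 3, 4, 5]

# Map step: apply a function to every element of the collection.
squared = list(map(lambda x: x * x, data))

# Reduce step: collapse the mapped values into one summary value.
total = reduce(lambda acc, x: acc + x, squared, 0)

print(squared)  # [1, 4, 9, 16, 25]
print(total)    # 55
```

Hadoop and Spark apply the same two ideas, but distribute the map and reduce work across a cluster instead of a single process.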

To illustrate what this means, the first programming example for MapReduce that most engineers are introduced to is the WordCount program.

The WordCount program performs both mapping and reducing: the map step emits a tuple (wordX, 1) for every occurrence of a word, and the reduce step sums those tuples to count occurrences. So, if a text contains wordX 10 times, the result is the tuple (wordX, 10).

Figure 2 illustrates the process of the WordCount program, applied to three short sentences.

The first step, map, emits a pair for every occurrence of a given word, and the reduce step aggregates those pairs until we are left with succinct tuples: (James, 3); (hit, 1); (ball, 2); and (the, 3).
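A minimal WordCount can be sketched in plain Python. The three sentences below are stand-ins chosen to reproduce the tuples above (the article's original sentences are not shown), and the function is an illustration of the map/sort/reduce pattern, not Hadoop code:

```python
from itertools import groupby

def word_count(text: str) -> dict[str, int]:
    # Map step: emit a (word, 1) pair for every word occurrence.
    pairs = [(w, 1) for w in text.lower().replace(".", "").split()]
    # Shuffle/sort step: bring identical words together.
    pairs.sort(key=lambda p: p[0])
    # Reduce step: sum the counts within each group of identical words.
    return {word: sum(n for _, n in grp)
            for word, grp in groupby(pairs, key=lambda p: p[0])}

text = "James hit the ball. James caught the ball. James ran to the base."
counts = word_count(text)
print(counts["james"], counts["hit"], counts["ball"], counts["the"])  # 3 1 2 3
```

In a real Hadoop or Spark job, the map, sort and reduce phases each run in parallel across the cluster rather than in one process.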

The WordCount program is far from exciting, but it is useful. Hadoop and Spark run these operations on large and messy data sets, such as records from the SAP transactional inventory system. And because Hadoop and Spark can scale without limit, so can this WordCount approach -- meaning it can spread the load across servers. IT professionals can feed this new, reduced data set into a reporting system or a predictive model.

MapReduce and Hadoop are the original use cases for EMR, but they aren't the only ones.

Java code, for example, is notoriously verbose. So Apache Pig often accompanies EMR deployments; it lets IT pros write MapReduce operations in Pig Latin, a scripting language that is shorter and simpler than Java. Apache Hive, a data warehousing package that offers the SQL-like HiveQL, is similar.

EMR also can host Zeppelin and Jupyter notebooks. These are webpages in which IT teams write code; they support graphics and many programming languages. For example, admins can write Python code to run machine learning models against data stored in Hadoop or Spark.

Airflow is easy to install, but Amazon EMR requires more steps -- which is, itself, one reason to use Airflow. However, AWS makes Amazon EMR cluster creation easier the second time, as it saves a script that runs with the AWS command-line interface.

To install Airflow, source a Python environment -- for example, source py372/bin/activate if using virtualenv -- and then install the Airflow Python package with pip.

Next, create a user.

Then start the web server interface, using any available port.
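The three steps above can be sketched as commands. The user details and port here are placeholders, and exact flags may differ by Airflow version:

```shell
# Install the Airflow package into the active Python environment.
pip install apache-airflow

# Create an admin user for the web interface (Airflow 2.x CLI).
airflow users create --username admin --firstname Jane --lastname Doe \
    --role Admin --email admin@example.com

# Start the web server on any available port.
airflow webserver --port 8080
```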

The Airflow Git repository includes a code example that runs Python code on Spark to calculate the number pi. It shows how IT admins can package a Python program and run it on a Spark cluster.
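The repository excerpt itself is not reproduced here. As a hedged sketch, the numeric core of Spark's pi example is a Monte Carlo estimate, which looks roughly like this in plain Python, without Spark:

```python
import random

def estimate_pi(samples: int, seed: int = 42) -> float:
    """Monte Carlo estimate: the fraction of random points in the unit
    square that land inside the quarter circle approaches pi/4."""
    rng = random.Random(seed)
    inside = sum(
        1 for _ in range(samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4.0 * inside / samples

print(estimate_pi(200_000))  # close to 3.14
```

In the Airflow example, logic like this is packaged as a script and submitted to a Spark cluster by a DAG task rather than run locally.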


January Student of the Month: Chantal Shirley Newsroom – Kirkwood Report

What is your background?: I grew up as an army brat moving around quite a bit as a child and teenager. I come from a family of hard workers and service members. My mother works in education, my father is a retired Army Officer, and my younger brother is currently serving in the Air Force. I also have men in my extended family that served in WWII, including a great-great-uncle, Jimmie Wheeler, who served as a Tuskegee Airman. I spent my formative years mostly in South Florida and Arizona, and high school years in Georgia. My previous education and career have taken me to places like New Hampshire and Las Vegas, Nevada. I found myself in Iowa after my partner, Dr. Roberts, started working for the University of Iowa, developing curriculum, managing the observatories, and doing outreach work.

What brought you to Kirkwood and why?: I was unhappy in my previous career and found myself limited to unsatisfying employment outside of it. For a few years, I was interested in software development, but my previous schooling and professional experiences limited my ability to enter the profession. After deciding that I would like to go back to school and make a career change, I began looking at Kirkwood. The face-to-face instruction, small classroom size, and Computer Software Development program ultimately led me to the decision that I should apply. Seeing Kirkwood's offerings was the first time I encountered a feasible route to a new career path that I was passionate about and that my previous degrees could not offer me.

What is your program of study and what interests you about it?: I am a second-year student in the Computer Software Development program, specializing in Java Programming and .NET Development. I am fascinated by things like the development process, software architecture, design models, prototyping, DevOps, and application development. The program has transformed my entire way of thinking and conceptualizing ideas. It is incredibly rigorous, and the challenges that I encounter are always intriguing and rewarding.

Are you involved in anything else on campus? If so, what and why?: When time permits, I have enjoyed participating in the Cyber Defense Club, STEM Club, Phi Theta Kappa, and LSAMP. Through these affiliations, I have participated in competitions, listened to fascinating lectures, and attended conferences.

Are you involved in anything off of campus?: Off-campus, through the support and guidance of the Career Services office, I was able to network and obtain employment as a part-time Software Quality Assurance Tester at the beginning of my enrollment in the Computer Software Development Program. This part-time employment has allowed me to gain real industry experience alongside my coursework at Kirkwood.

What do you do for fun?: For fun, I enjoy camping with my partner and dog, playing complicated board games with friends and neighbors, working on open-source software and personal applications, playing Dungeons & Dragons with friends and family, and cooking and baking at home.

Where do you see yourself in five years?: In five years, I see myself developing software and making architectural decisions as a Full-Stack software engineer. I would love to work for an organization with a mission that gives back to society. Moreover, I envision myself contributing to open-source software projects, volunteering in software-related projects, and developing software independently.


The Open Infrastructure Foundation announces its first board – ZDNet

For over a decade, the OpenStack Foundation oversaw the open-source OpenStack Infrastructure-as-a-Service (IaaS) cloud. Over time, the OpenStack umbrella covered more open-source projects. So, in October 2020, the Foundation transformed into a new organization: the Open Infrastructure Foundation (OIF). Now, it has announced its first board to help direct its members and their cloud-oriented open-source projects into the 2020s and beyond.

While a good deal smaller than the leading corporate open-source organization, The Linux Foundation, OIF has found a nice niche for itself in covering OpenStack-related open-source software. The OIF will continue to oversee OpenStack. But, with its 100,000 community members, it will also help direct such cloud-friendly projects as Airship, Kata Containers, Magma, OpenInfra Labs, OpenStack, StarlingX, and Zuul.

The new group's platinum members are Ant Group, AT&T, Ericsson, FiberHome, Huawei, Red Hat, Wind River, and Tencent. They are joined by Facebook, which just became a top-level member. Altogether the OIF has more than 60 corporate members.

The bulk of the organizational work will be done by the OIF's 27-member board. Its members include Amar Padmanabhan, Facebook software engineer; Xu Wang, Ant Group senior staff engineer; and Daniel Becker, Red Hat's senior director of engineering. Allison Randal, the well-known open-source strategist and Perl Foundation leader, will serve as the board chair.

After her election, Randal said:

"Open infrastructure promotes 'innovation and choice on the Internet,' as Mozilla is fond of saying. The open-source nature of the projects hosted at the OpenInfra Foundation -- as well as the projects with which we openly collaborate -- create economic opportunity around the world. It is an important proof point that modern, open-source development can be funded by corporate interests but guided by the technical governance of individual contributors. I'm humbled by those who put their confidence in me to lead the board, and I'm energized by the opportunities before us to help define the next decade of open infrastructure."

Mark Collier, the OIF's COO, added:

"It's exciting to see the open infrastructure movement grow at such a rapid pace, as evidenced by having more platinum members than we've ever had before, more OpenInfra community members encompassing infrastructure experts who operate some of the largest infrastructures in the world, like Ant Group and Facebook, and new open-source software being created like Magma. All of these trends point to the start of a decade of people and companies investing in open infrastructure that's just getting started, and we want to invite everyone to join us as we build open source communities who write software that runs in production."


Does Deplatforming Work? Big Tech And The ‘Censorship’ Debate : Consider This from NPR – NPR

In the week after Donald Trump incited a deadly riot in Washington, D.C., Twitter banned more than 70,000 users, including the former president himself. (Justin Sullivan/Getty Images)

Removing disinformation and users who spread it can come at a cost for web hosts and social media platforms. But studies indicate "deplatforming" does stem the flow of disinformation.

Kate Starbird with the University of Washington explains why it's easier to see the effects of deplatforming in the short-term. And NPR's Shannon Bond looks at how one growing social media site is dealing with new attention and new challenges.

Additional reporting in this episode from NPR's Bobby Allyn, who's reported on the removal of Parler by Amazon Web Services.

In participating regions, you'll also hear a local news segment that will help you make sense of what's going on in your community.

Email us at considerthis@npr.org.

This episode was produced by Brianna Scott and Brent Baughman. It was edited by Lee Hale with help from Shannon Bond and Wynne Davis. Our executive producer is Cara Tallo.


Amazon And Twitter Deplatforming Parler And Trump; Is It Legal? – Forbes

Contributing Author: Bryan Sullivan

Amazon, Apple and Google have all booted Parler from their platforms in a span of a little more than 24 hours.

The shocking events of the Trump-fueled January 6, 2021 Capitol riot had very visible and, in some cases, unexpected consequences. Within days of the violent Capitol insurrection, some social media and web service platforms suspended the accounts of Donald Trump and many of the supporters who participated in inciting and organizing the Capitol breach. The sweeping suspensions included over 70,000 Twitter accounts, as well as the banning of the free-speech social media platform Parler by Amazon Web Services, Apple, and Google. Right-wing megaphones are now ringing loudly, with cries of censorship and free speech. But have anyone's rights been violated? The simple answer is no.

In some instances, the suspensions were a surprise -- most social media platforms had bent over backwards NOT to remove Trump's or his followers' accounts and posts, slowly and reluctantly adopting content disclaimers and warnings. The reaction was far more expected, as many conservative voices began to rail against social media and tech giants. Among the claims: that this is a violation of free speech, and that a private company restricting access to its services is somehow either illegal or even unconstitutional. Neither of these is true.

The First Amendment to the Constitution of the United States of America is one sentence and, fittingly, 45 words. It restricts only what Congress may do as far as making a law that infringes on freedom of speech. A private company cannot violate this amendment by disallowing access to its platform(s). More importantly, in the case of extremism, deplatforming has been shown to be an effective tool, including in a 2016 Brookings Institution study on ISIS. Notably, the vocal minority who are decrying their deplatforming had no problem with the deplatforming of ISIS, and likely wouldn't have an issue if Congresswoman Alexandria Ocasio-Cortez and other members of the Squad were deplatformed.

Users agree to (usually boilerplate) contracts when signing up for a social media, hosting, or other digital service and courts have held that these are enforceable. These Terms Of Use/Service can be many pages long, and almost always insulate the company from its users, including an explicit right to terminate services for conduct that the companies determine in their sole discretion is dangerous. Companies are well within their rights when enforcing these contracts, which are voluntarily entered into as a condition of use. In the case of both AWS and Twitter, the terms are clearly indicated and presented to the user to read and accept at the point of signup. As discussed in the Second Circuit decision in Specht, a decision written by now-Supreme Court Justice Sonia Sotomayor, this distinction, having the terms clearly indicated, prominently featured, and affirmatively assented to, is the threshold for enforceability. AWS and Twitter certainly meet this requirement.

Twitter cited two inflammatory Trump tweets in its decision to remove his account. One of Twitter's rules states: "You may not threaten violence against an individual or a group of people. We also prohibit the glorification of violence." Trump violated Twitter's glorification-of-violence policy on January 8, 2021, two days after the Capitol was stormed: "The 75,000,000 great American Patriots who voted for me, AMERICA FIRST, and MAKE AMERICA GREAT AGAIN, will have a GIANT VOICE long into the future. They will not be disrespected or treated unfairly in any way, shape or form!!!" (Capitalization in original.) He later tweeted: "To all of those who have asked, I will not be going to the Inauguration on January 20th."

In this photo illustration, a notification from Twitter appears on a tweet by U.S. President Donald Trump that the social media platform says violated its policy, on May 29, 2020 in San Anselmo, California.

While these tweets are relatively tame compared to some of Trump's other, far more boisterous tweets, Twitter still rightfully pointed to its Terms of Service and the global context in which these tweets were made. Twitter stated: "Due to the ongoing tensions in the United States, and an uptick in the global conversation in regards to the people who violently stormed the Capitol on January 6, 2021, these two Tweets must be read in the context of broader events in the country and the ways in which the President's statements can be mobilized by different audiences, including to incite violence, as well as in the context of the pattern of behavior from this account in recent weeks. After assessing the language in these Tweets against our Glorification of Violence policy, we have determined that these Tweets are in violation of the Glorification of Violence Policy and the user @realDonaldTrump should be immediately permanently suspended from the service."

Parler's agreement with Amazon Web Services (AWS) falls under two different agreements that are included in any new account signup: a Customer Agreement and an Acceptable Use Policy. In addition to the right to modify or change the terms (and services) with applicable notice, the AWS Customer Agreement includes Section 4, on the responsibilities of the account holder (in this case, Parler). Section 4.2 states: "You will ensure that Your Content and your and End Users' use of Your Content or the Service Offerings will not violate any of the Policies or any applicable law." This question of violating any policies or applicable law is of the utmost importance in addressing Parler's dismissal from the hosting platform. Section 8.2(c) of the Customer Agreement goes on to say that the accepting party represents that none of "Your Content or End Users' use of Your Content or the Service Offerings will violate the Acceptable Use Policy."

That Acceptable Use Policy begins as follows:

No Illegal, Harmful, or Offensive Use or Content

You may not use, or encourage, promote, facilitate or instruct others to use, the Services or AWS Site for any illegal, harmful, fraudulent, infringing or offensive use, or to transmit, store, display, distribute or otherwise make available content that is illegal, harmful, fraudulent, infringing or offensive.

Certainly inciting, organizing, and then attempting to cover up an armed insurrection and forceful seizure of a government building, with the intent to detain, harass, and harm people must qualify as a violation of these terms. Moreover, it has been reported that various conversations on Parler involved people openly advocating the overthrow of President Biden and alluded to the assassination of Democratic politicians and Republican politicians who were not supportive enough of Trump.

Parler's lawsuit, which accuses AWS of acting on political animus and violating antitrust law by cutting off its service, has been dismissed by Amazon in a statement as having no merit. "It is clear that there is significant content on Parler that encourages and incites violence against others, and that Parler is unable or unwilling to promptly identify and remove this content, which is a violation of our terms of service," the company said. As for an antitrust claim, Parler could argue that the tech giants conspired, but the conspiracy would have to be aimed at overall market disruption, not at targeting a single company.

It is understandable that Trump, Parler, et al. would attempt to take legal routes, regardless of viability. Rejection sucks. But in this case it is legal and just.

Bryan Sullivan, Partner at Early Sullivan Wright Gizer & McRae, advises and represents his clients as a legal strategist in all their business affairs. He has significant experience on the litigation and appeals side of the practice, as well as with entertainment and intellectual property contracts, investment and financing agreements, and corporate structure documents on the dealmaking side.


Georgetown University Discusses The Great Deplatforming: Removing Trump From Social Media – Forbes

The suspended Twitter account of U.S. President Donald Trump. (Photo Illustration by Justin Sullivan/Getty Images)

Shortly after the attack on the U.S. Capitol on January 6, 2021, a number of social media companies, starting with Twitter, kicked President Donald Trump off of their platforms. Following Twitter, Facebook banned Trump from its platforms, including Instagram, and then other social media companies, including YouTube, followed suit. Within a few days, Trump found himself with no digital means of getting his word out.

Of course, he wasn't muted; as President he could call a press conference any time he wished, and he could be confident that the media would attend. But like many politicians who had learned about the power of unfiltered access to the internet, Trump had grown used to being able to have his say whenever he wanted. Now, through the actions of a couple of large companies, he couldn't get his word out in the way he wished.

This raised questions in many circles at the time, and those questions haven't gone away in the time since. What are the implications of silencing a sitting President? Is it legal or ethical to shut off access to those platforms?

Deplatforming

To answer those questions, the Georgetown University Law Center held a panel discussion with four of its top legal experts to examine the question of deplatforming a sitting President.

The first question, whether it's legal for the companies to remove a sitting President from a publicly available social media platform such as Twitter or Facebook, was answered at the beginning of the panel discussion by moderator Hillary Brill, acting director of Georgetown Law's Institute for Technology Law and Policy. Brill acknowledged the outcry by many, when it happened, that the bans were somehow a limitation on free speech, but pointed out that the First Amendment to the U.S. Constitution only protects free speech against limitations imposed by the government. Twitter, Facebook, and the other companies are private companies, so there is no First Amendment issue regarding their actions to block the President from posting on their services.

White-Gravenor Hall at Georgetown University, Washington, DC.

But that didn't mean there weren't concerns. Professor Erin Carroll, who teaches on communications, technology, and the press, said that she was concerned about the power of big tech and the lack of transparency. "When you clear away disinformation, will there be truth behind it?" she asked.

Unfortunately, there may not be. Carroll pointed out that when Trump and his sympathizers were booted from the mainstream social media, they moved to other platforms, such as Telegram and Signal, which are messaging services where law enforcement has little access, and Gab, which makes little attempt to control the content of messages. Another social media site, Parler, initially benefited as a sort of home away from home for refugees from Twitter, but sponsors didn't like its lack of moderation, and Amazon, which was hosting the service, refused to continue carrying it. That effectively killed Parler.

Speech doesn't go away

"Speech doesn't go away," Carroll said, "it just finds other places."

According to Professor David Vladeck, the A.B. Chettle Chair in Civil Procedure at Georgetown Law and former head of the FTC Consumer Protection Bureau, much of the issue surrounding the removal of someone such as Trump from a platform is rooted in Section 230 of the Communications Decency Act. He said that Section 230 enables a lot of the problems, giving very broad immunity for publishing harmful or defamatory information. While he doubts that Section 230, which protects internet providers from liability for material that others post on their sites, will be repealed, he thinks it's likely to be changed. He noted that former President Trump's desire to repeal that section was based in his lack of understanding of what it did. In effect, he said, repeal would have allowed much greater control over what he posted, rather than less.

Twitter CEO Jack Dorsey testifies remotely during a hearing to discuss reforming Section 230 of the Communications Decency Act (Photo by Greg Nash/POOL/AFP via Getty Images)

That then raised the question of just how online content should be controlled. Professor Anupam Chander, who teaches communications and technology law, suggested that changing Section 230 to bring more content moderation might not be a good thing. It could lead to a "Disneyfied universe," he said, one in which no negative information exists.

Transparency needed

Instead, Carroll said that what's needed is for the industry to adopt greater transparency in how it makes decisions. She said that when new rules, such as a revision of Section 230, are written, they need to be written by people who understand them, and who understand the way online services such as Twitter and Facebook work.

"How do we have policies that promote facts versus propaganda?" Carroll asked. She suggested that there needs to be some accountability for who makes decisions such as deplatforming a President.

So far, however, there seems to be no obvious answer to the question of when, or whether, to deplatform a President (or anyone else, for that matter). But it appeared clear that the first step should be to update current legislation to at least reflect how these services work, and to make sure that there's transparency.


De-platforming Is a Fix, But Only a Short-Term One – Just Security

Though we are in the midst of a techno-reckoning that has been neatly packaged as the Great Deplatforming, the measures taken by individual companies following the attack on the Capitol have been varyingly radical. Applying their now-customary methods of content moderation to high-profile U.S. users, major social media companies like Facebook and Twitter took action against many on the right, including President Donald Trump, Representative Marjorie Taylor Greene, and 70,000 accounts linked with QAnon, for inciting violence and violating terms of service. More atypically, those operating the previously mostly invisible digital infrastructure that platforms like Facebook and Twitter are built on also displayed their power, taking down Parler, a free-speech alternative to Twitter. Apple, after issuing a short-fuse warning Parler couldn't possibly heed, removed Parler from its App Store, Google removed the social network from its Google Play store, and Amazon Web Services (AWS) took down its service entirely. These emergency measures should not be taken lightly, but they have reportedly greatly reduced the spread of election disinformation online. While new technologies and technology companies have contributed to, perhaps even caused, some of the problems that led to the events of Jan. 6, Pandora's box has been opened. Conspiracy theorists and disinformation remain in the world and online. At the end of the day, solving that problem will require more than just a technical solution.

The App Store and Google Play store are the gatekeepers to hundreds of millions of American smartphones. Despite the fact that Parler had previously been warned about its bad behavior, the landscape shifted quickly after the events of Jan. 6. When Parler was taken off the App Store and Google Play store, would-be users were no longer able to download it onto their phones, and the company was restricted from pushing updates to its app. While the statistics for Parler are not public, 93 percent of video views on Twitter come from mobile. When Google and Apple remove an app from their respective stores, they do not quite issue a platform like Parler a death sentence, but such actions are extremely damaging to a network's ability to grow and maintain a user base.

While Apple and Google can create serious or even fatal business problems, a web hosting company can spark an immediate crisis for a hosted service should it pull the plug. AWS provides computing and data storage for much of the internet; in 2019, the company maintained about 45 percent of the internet's cloud infrastructure. When the company abruptly canceled Parler's contract after the Capitol attack, the website went down. But death on the internet is short-lived. About a week after being taken down, Parler registered its domain with Epik, a company known for hosting far-right content, and announced, "We will resolve any challenge before us and plan to welcome all of you back soon." It is possible that Parler will follow in the footsteps of the far-right social network Gab and host its own servers at an undisclosed data center. This lifeline is more expensive and more difficult to set up and maintain.

The internet has many entry points for the moderation of content. Anything providing for the movement of bits could theoretically become a selective barrier: with varying degrees of precision, a cloud provider, wifi router maker, ISP, or owner of a copper fiber wire can determine what passes across some portion of the internet, whether through literal technical intervention or use of its broader leverage to compel an application provider, who relies upon it to function, to take action. But the deeper into the infrastructure one goes (further removed from specific instances of communication between people and the specific apps they're using), the rarer and more significant a deplatforming becomes. AWS justified the takedown of Parler by noting that its acceptable use policy states that users may not host certain content, including content that "violate[s] the rights of others" or that "may be harmful to others." Nor may they use the service in a way that poses a security risk to the Service Offerings or any third party. These broad terms are seldom enforced outside of services engaged in outright fraud or hacking against their own users.

In its opposition to Parler's application for a temporary restraining order on the basis of breach of contract and antitrust claims, AWS noted that it reported to Parler dozens of examples of content that encouraged violence, including calls to hang public officials, kill Black and Jewish people, and shoot police officers in the head. Crucially, on Parler, hatred did not lurk on the fringes, and calls to violence weren't isolated voices lost in a sea of content. Instead, it was common to come across posts and comments calling for violence or civil war. And this "dozens" figure cited by AWS is only a fraction of a fraction of the content that incited violence.

As part of my research, I joined Parler in early November, right before the election. It was clear that the network facilitated and bred hatred and disinformation. Its ecosystem was immediately evident. A prominent figure, such as Republican Senator Ted Cruz or Fox News commentator Sean Hannity, would post a provocative article, usually with no direct incitement to violence. However, their post, or "Parley," was meat for the piranhas. For instance, on both Parler and Twitter, Cruz posted an article from the Washington Examiner titled "Graphic warning: Reported Black Lives Matter counterprotesters sucker punch and stomp on man leaving DC Trump rally," commenting, "Why is the media ignoring this? Why are Dems silent?" The response to the posts on the two platforms was drastically different.

On Twitter, many of the responses are critical of the article, noting that the video was edited to remove the beginning of the incident, where the man who is eventually sucker punched shoves a man to the ground and kicks him as other Trump supporters shout "kill them." Some even condone the sucker punch, with someone tweeting "[...] he got exactly what he deserved. Make better choices, then." Other commenters bash the media and liberals, tweeting "They blame the victim and make up lies," or, "Because they've gotten away with it since it started. [...] There are no politicians that will hold them accountable." A small minority foreshadow future violence: "We are only going to put up with the crap for a little while [...] We will stand and fight and have no mercy on them."

But, on Parler, the comments were far more extreme. The comments on the platform read: "[...] We have the 2nd amendment on our side. Put a damn mask on to cover you[r] identi[ty] lock and load, and start cleaning the streets of this vial filth," and "[If] I see any BLM or ANTIFA and I am going to pull my gun and start shooting! F[***] those a[******] communist f[****]!"

Even more dangerously, conservatives used Twitter and Parler differently. For instance, Sean Hannity posted on Parler that "Antifa and Stalinist Sympathizers Disguised in Trump Gear Identified in DC Protests," a claim based on an article that is false. There is no such reciprocal post on his Twitter account. Politicians and commentators can additionally post different messages on different social media. As journalist Nick Martin noted, Representative Paul Gosar (R-Ariz.) seemingly parleyed in support of the Jan. 6 raid of the Capitol while simultaneously tweeting a soft condemnation of it.

Thus, when it comes to political purpose, Parler served two main roles for those on the right: it was a community, a safe space to express and consume disinformation and radical viewpoints, and it provided a forum to collect and re-interpret mainstream messages. To hear a dog whistle, you need a dog's ear, after all. Parler is not the first network to have served these functions. It won't be the last, either. In fact, even without the internet, Parler can be replaced.

Without Parler or Twitter, disinformation and hatred, coded or overt, will continue to be broadcast. The Trump White House and Fox News were, by some researchers' findings, the largest spreaders of fake news. Even without Twitter or the White House, Trump will retain a spotlight, if only through right-wing news organizations. Plus, the U.S. Congress now has at least two members who have publicly supported QAnon, including Marjorie Taylor Greene, who in 2017 expressed her belief that having Trump as president provided the country with a once-in-a-lifetime opportunity to take "[the] global cabal of satan-worshipping pedophiles" out. Hannity no longer has his Parler, but he still spewed disinformation about Antifa masquerading as pro-Trumpers at the Capitol on his cable show, which averages 4.5 million viewers a night.

And without Parler or Twitter, disinformation and hatred, coded or overt, from radical elites will continue to be noted and interpreted. As recently as this summer, QAnon Facebook groups had millions of members. When these groups were shut down, many moved to Parler, a platform that consisted of adults who consensually joined it, presumably to have discussions like the ones they were having and to consume content like what they were seeing. With Parler gone and Twitter and Facebook cracking down on conspiracy theories, millions have downloaded the encrypted messaging apps Signal and Telegram, the latter of which allows groups of up to 200,000 people. This doesn't absolve Parler of responsibility, nor does it mean that taking action is pointless. But conspiracy theorists won't disappear; they'll migrate.

In the wake of this, the question arises: what is there to be done? Already, Big Tech has answered that question in its own way, with Apple, Google, and AWS taking aggressive measures to disable the platform. The moves they made were probably the right ones, at least in the short term, but problems remain. Millions of Americans believe the big lie that the 2020 election was stolen, a problem for which there is no technical solution. At a certain point, the question about what to do with Parler is only part of the broader one about how society should cope with the fact that segments of the population are living in different realities. And that's a far trickier problem, one that, when the dust settles and platforms are unable to reasonably cite the imminent threat of violence, we will have to solve.


The Great Deplatforming: Can Digital Platforms Be Trusted As Guardians of Free Speech? – ProMarket

Online social media platforms accepted from Congress in 1996 the role of moderating content. The Great Deplatforming that occurred after January 6 was less a silent coup than a good-faith effort to purge online platforms of toxic content.

After former President Trump and many of the extremist followers he goaded were removed from a variety of online platforms (most notably Twitter, Facebook, and YouTube, as well as Reddit), many saw the subsequent silence as a welcome relief. But is that, as Luigi Zingales posits, an emotional reaction to a wrong that is being used to justify something that, at least in the long term, is much worse?

The Great Deplatforming (aka the Night of Short Fingers) has exposed the fact that a great deal of political discourse is occurring on private, for-profit internet platforms. Can these platforms be trusted as our guardians of free speech?

Recognizing that "the Internet and other interactive computer services offer a forum for a true diversity of political discourse, unique opportunities for cultural development, and myriad avenues for intellectual activity," and with the intent of preserving "the vibrant and competitive free market that presently exists for the Internet and other interactive computer services, unfettered by Federal or State regulation," Congress granted immunity in 1996 to interactive computer services (the equivalent of Twitter, Facebook, YouTube, and other online social media platforms) for any information published by the platforms' users. Without this immunity, social media platforms would simply not exist. The potential liability for the larger platforms arising from the content of the millions upon millions of posts would be far too great a risk.

Congress also granted social media platforms immunity for "any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected." This provision gives platforms an incentive to moderate content to weed out objectionable and illegal content. It can be a vile world out there, as evidenced by some of the hateful, harmful, and obscene comments that are posted in online forums. As noted by Cloudflare founder Matthew Prince, "What we didn't anticipate was that there are just truly awful human beings in the world." The situation has devolved to such an extent that experts now recognize that some human moderators suffer from post-traumatic stress disorder-like symptoms.

What do we do when the moderation we have encouraged social media platforms to conduct is applied to what some consider political speech? Many fear that platforms have difficulty differentiating between racist and/or extremist posts advocating violence against individuals or institutions based on political views, and simple political opinion. Indeed, Facebook's own executives acknowledge that the company's algorithm-generated recommendations were responsible for the growth of extremism on its platform. Concerns have also been raised that, with bad intent, some platforms refuse to make that differentiation to promote their executives' own political agendas. Social media platforms have also been accused of moderating such political speech with a bias toward a particular party.

One approach is to amend Section 230 of the Communications Decency Act, the law that provides internet social media platforms their immunity. Republicans introduced five bills in the 2019-2020 Congressional session calling for amendments to, as well as the full repeal of, Section 230. For example, Senator Josh Hawley (R-Mo.), claiming a lack of politically neutral content on social media platforms, sponsored the Ending Support for Internet Censorship Act. Under the bill as introduced, online social media platforms with 30 million or more active monthly users in the US (or 300 million or more active monthly users worldwide, or more than $500 million in global annual revenue) would have to obtain an immunity certification from the FTC every two years. The social media platform would be denied a certification and lose Section 230 immunity if it was determined to be moderating in a politically biased manner, which would include disproportionately restricting or promoting access to, or the availability of, information from a political party, political candidate, or political viewpoint. As one article noted, Hawley wants to stop internet censorship by censoring the internet, not to mention regulating political speech.

The Parler lawsuit amplifies a point made by one commentator: "If you are so toxic that companies don't want to do business with you, that's on you. Not them."

In May 2020, President Trump signed an Executive Order claiming online platforms are invoking "inconsistent, irrational, and groundless justifications to censor or otherwise restrict Americans' speech," and stating that online platforms should lose their immunity because, rather than removing objectionable content in good faith as required under the law, they are engaging in deceptive actions by stifling viewpoints with which they disagree. The Executive Order called on the FCC to propose new regulations to clarify immunity under Section 230 and for the FTC to investigate online platforms for deceptive acts. Both Tim Wu and Hal Singer have provided cogent arguments on ProMarket as to why these are doomed approaches.

Online moderation is enforced through each platform's terms of service (TOS), which each user must agree to before being able to post to the respective platform. After the January 6 siege of the U.S. Capitol, the tech companies that deplatformed a large number of users, including former President Trump, and deleted tens of thousands of posts did so on the basis of users violating their respective TOS. For example, Amazon Web Services (AWS) stopped hosting Parler because of Parler's alleged violations of the AWS TOS, which include an Acceptable Use Policy. Parler's subsequent lawsuit against AWS could have served as a bellwether case for the application of TOS to regulate speech. Unfortunately, the case Parler presented is appallingly weak (one lawyer referred to it as a "lolsuit"). For example, the AWS-Parler hosting agreement clearly gives AWS the power to immediately suspend and terminate a client that has violated AWS's Acceptable Use Policy.

In its swift denial of a preliminary injunction in this case, the court also noted the lack of any evidence that Twitter and AWS acted together, either intentionally or at all, to restrain Parler's business.

Parler's causes of action continue with a claim that Twitter was given preferential treatment by AWS because similar content appeared on Twitter, for which Twitter's account was not suspended, while Parler's account was terminated. There are two issues with this assertion: first, we return to the role of moderation. Twitter actively (though, to some degree, imperfectly) moderates the content on its platform. In contrast, Parler's home page stated its users could "[s]peak freely and express yourself openly, without fear of being deplatformed for your views."

Even more damaging to Parler's assertion is the fact that the evidence in the case demonstrates that AWS doesn't even host Twitter on its platform and therefore did not have any ability to suspend Twitter's account even if it wanted to. But the Parler lawsuit amplifies a point made by one commentator: "If you are so toxic that companies don't want to do business with you, that's on you. Not them" (which would seem to apply to Zingales's justification for Simon & Schuster's cancellation of Senator Josh Hawley's book deal).

In denying Parler's request for a preliminary injunction that would order AWS to restore service to Parler pending the full hearing, the court rejected any suggestion that "the public interest favors requiring AWS to host the incendiary speech that the record shows some of Parler's users have engaged in." While this ruling does not end the case, it does substantiate the weakness of at least some of Parler's arguments.

The corporate owners of the social media platforms are permitted in our free enterprise system to set the terms under which content may be posted or removed from the platforms.

Although the AWS-Parler case before the court will ultimately resolve this particular dispute, the underlying issues will remain far from resolved regardless of the outcome of the case. The explosion of social media over the last ten years, and its supplanting of traditional media to a large degree, has created a new and untested playing field for public discourse. Some of the issues raised are similar in scope, if not size, to the issues our courts have dealt with in the past. The corporate owners of the social media platforms are permitted in our free enterprise system to set the terms under which content may be posted or removed from the platforms. As non-government actors, the First Amendment's freedom of speech protections do not apply to the speech of a private company, a fact confirmed by the court in denying Parler's preliminary injunction. The audience of the social media platforms at issue, however, has grown exponentially. While deplatforming will, at least temporarily, silence those voices promoting violence on specific platforms, the long-lasting implications are less clear.

While it is true that Trump lost the popular vote in the 2020 election by 7 million votes, the fact remains that 74.2 million Americans did cast their votes for Trump. Once Twitter permanently banned Trump from its platform, and many surmised that he would join the less restrictive Parler, nearly one million people downloaded the Parler app from Apple and Google before it was removed from those stores and Parler was suspended from AWS. Moving the conversation off of mainstream social media and driving it onto less balanced platforms whose subscribers are more homogeneous, much to the dismay of even Parler's CEO, John Matze, encourages an echo chamber of ideas in smaller encrypted platforms that are more difficult to monitor and potentially amplifies the most angry and passionate voices.

If the touted exodus of conservatives from Twitter to other platforms that they view as more welcoming comes to fruition, could our public discourse become even more divided, with opposing viewpoints feeding upon their own biases rather than potentially being tempered by responses and dialogue with each other? An informed public exposed to conflicting opinions is the best chance for resolving political differences. These issues are important and warrant further discussion, but the current termination of Parler from the AWS platform does not seem to raise serious concern that public discourse is truly harmed. There are alternatives to AWS, and in fact, Parler already seems to have found one that will bring it back. While there is a significant amount of talk of conservatives leaving Twitter, there seems to be little evidence that this has happened on a large scale. For the moment, at least, the largest social media platforms seem to be retaining users on both sides of the political spectrum, which we believe is good for democracy.
