The Prometheus League
Breaking News and Updates
Category Archives: Cloud Computing
Software for Open Networking in the Cloud (SONiC) Moves to the Linux Foundation – PR Newswire
Posted: April 15, 2022 at 12:47 pm
Leading open source network operating system enabling disaggregation for data centers, now hosted by the Linux Foundation to enable neutral governance in a software ecosystem
SAN FRANCISCO, April 14, 2022 /PRNewswire/ -- Today, the Linux Foundation, the nonprofit organization enabling mass innovation through open source, announced that Software for Open Networking in the Cloud (SONiC), an open source network operating system, is now part of the Linux Foundation. The Linux Foundation provides a venue for continued ecosystem and developer growth and diversity, as well as collaboration across the open source networking stack.
"We are pleased to welcome SONiC to the Linux Foundation family of open networking projects," said Arpit Joshipura, general manager, Networking, Edge, and IoT, the Linux Foundation. "SONiC is a leader in open source data center NOS deployments, and we're looking forward to growing its developer community."
The Linux Foundation will primarily focus on the software component of SONiC and continue to partner with the Open Compute Project (OCP) on aligning hardware and specifications like SAI.
"Microsoft founded SONiC to bring high reliability and fast innovation to the routers in Azure cloud data centers. We created it as open source so the entire networking ecosystem would grow stronger. SONiC already runs on millions of ports in the networks of cloud scalers, enterprises, and fintechs. The SONiC project is thrilled to be joining the Linux Foundation to take the community to its next jump in scale, participation, and usage," said Dave Maltz, Technical Fellow and Corporate Vice President, Microsoft Azure Networking.
About SONiC
Created by Microsoft for its Azure data centers, SONiC is an open source network operating system (NOS) based on Linux that runs on over 100 different switches from multiple vendors and ASICs. It offers a full suite of network functionality, like BGP and RDMA, that has been production-hardened in the data centers of some of the largest cloud service providers. It offers teams the flexibility to create the network solutions they need while leveraging the collective strength of a large ecosystem and community.
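As context for readers unfamiliar with SONiC: the switch's state is driven by a central JSON configuration database (config_db.json). The sketch below builds a minimal, hypothetical fragment of such a configuration in Python; the table names (PORT, BGP_NEIGHBOR) follow SONiC's documented schema, but the port, speed, and peer values are invented for illustration only.

```python
import json

# A minimal, hypothetical config_db.json fragment for a SONiC switch.
# Table names follow SONiC's schema; all values are illustrative.
config_db = {
    "PORT": {
        "Ethernet0": {
            "alias": "fortyGigE0/0",   # front-panel name (illustrative)
            "speed": "40000",          # link speed in Mb/s
            "admin_status": "up",
        }
    },
    "BGP_NEIGHBOR": {
        "10.0.0.1": {
            "asn": "65100",            # peer AS number (illustrative)
            "name": "LEAF-1",
        }
    },
}

# Serialize the fragment the way it would appear on disk.
print(json.dumps(config_db, indent=2))
```

On a real device this JSON is loaded into a Redis-backed database that the SONiC daemons watch; the fragment above is only meant to convey the shape of the configuration.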
Existing Ecosystem
SONiC brings a strong existing ecosystem, with premier members including Alibaba, Broadcom, Dell, Google, Intel, Microsoft, NVIDIA and 50+ global partners.
The SONiC community will host its first hackathon later this year. Stay tuned for details and registration information.
More information about SONiC, including how to join, is available at SONiC (azure.github.io).
Support from Key Stakeholders & Customers
Alibaba
"This is a big milestone for the SONiC community. After joining the Linux Foundation, the SONiC community will play a much more important role in the networking ecosystem," said Dennis Cai, Head of Network Infrastructure, Alibaba Cloud. "Congratulations! As one of the pioneering SONiC users and contributors, Alibaba Cloud has widely deployed SONiC- based whitebox switches in our data centers, edge computing cloud, P4- based network gateways, and will extend the deployment to Wide Area Networks. With modern network OS design and operation- friendly features, we already gained tremendous value from the large-scale deployments. Alibaba is committed to the SONiC community, and will continue bringing our large-scale deployment best practices to the community, such as open hardware specs , network in-band telemetry, high performance networking, and network resiliency features, SRv6, etc."
Broadcom
"Large hyperscalers agree that merchant silicon, hardware independence, and open source protocol and management stack are essential for running their data center networks. Broadcom has wholeheartedly supported this vision with leading-edge, predictable silicon execution and contributions to the SONiC project. We are excited to see the SONiC initiative join the Linux Foundation and look forward to working with the streamlined ecosystem to drive the data center and hyperscale needs of the future," said Mohammad Hanif, senior director of engineering, Core Switching Group, Broadcom.
Dell Technologies
"We believe SONiC will continue its accelerated adoption into the modern data center, delivering the scale, flexibility and programmability needed to run enterprise-level networks," said Dave Lincoln, vice president of product management at Dell Technologies. "As a leading SONiC contributor, we see the advantages it brings to the supporting open source community and customers. As we continue the drive to take open-source-based solutions mainstream, we look forward to working with the Linux Foundation and its supporting communities to drive SONIC's development and adoption."
eBay
"eBay operates a large-scale network infrastructure to support its growing global business. eBay cares about the openness and quality of NOS to operate its network infrastructure. eBay is an active participant in the SONiC community and deploys SONiC at scale in its infrastructure. eBay is excited to see this next step of growth of the SONiC community," said Parantap Lahiri, vice president, Network and Datacenter Engineering at eBay.
EPFL
"At EPFL, we have been looking for a vendor neutral and flexible NOS that can provide HaaS capabilities for our Private Cloud Environment. SONiC OS provides us the solution we have been looking for in our Data Centre, allowing us to migrate to a powerful and modern Data Centre network. We are looking forward to this next phase in the SONiC community," said Julien Demierre, Network and System architect at EPFL.
"We believe moving SONiC to the Linux Foundation is very important as it will further enhance collaboration across the open source network, community and ecosystem. Google has more than a decade of experience in SDN; our data centers and WAN are exclusively SDN controlled, and we are excited to have helped bring SDN capabilities to SONiC . We fully support the move to the LF and intend to continue making significant upstream contributions to drive feature velocity and make it easier for operators to realize the benefits of SDN with PINS/SONiC and P4," said Dan Lenoski, vice president, Engineering, Network Infrastructure, Google.
Intel
"Intel has a strong history of working with SONiC and the Linux Foundation to help to propel innovation in an open, cooperative environment where ideas are shared and iterated. We continually promote open collaboration, encompassing open-source technologies such as the Infrastructure Programmer Developer Kit and P4 integrated networking stack (PINS), using Intel Xeon Scalable processors, Infrastructure Processing Units and Tofino Intelligent Fabric Processors as base hardware," said Ed Doe, vice president and general manager, Switch and Fabric Group at Intel. "Joining the Linux Foundation will help SONiC to flourish, and in turn create greater benefit for cloud service providers, network operators and enterprises to create customized network solutions and transform data-intensive workloads from data center to the edge."
NVIDIA
"This is an important milestone for SONiC and the community behind it," said Amit Katz, vice president of Ethernet Switches at NVIDIA. "NVIDIA is committed to supporting the community version of SONiC that is 100 percent open source, enabling data center operators to control the code inside their cloud fabrics, accelerated by state-of-the-art platforms with SONiC support, such as NVIDIA's Spectrum family of switches."
Open Compute Project
"The Open Compute Project Foundation is pleased to continue its collaboration with SONIC as part of the OCP's new hardware software co-design strategy. The open source SONiC Network Operating System is enabling rapid innovation across the network ecosystem, and it began with the definition of the Switch Abstraction Interface (SAI) at OCP. Hardware software co-design focuses on software that requires intimate knowledge of the hardware to drive maximum hardware performance, and speed time-to-market for hardware where system performance and ecological footprint can be highly dependent on software and hardware interactions," said George Tchaparian, CEO Open Compute Project Foundation.
About the Linux Foundation
The Linux Foundation is the organization of choice for the world's top developers and companies to build ecosystems that accelerate open technology development and commercial adoption. Together with the worldwide open source community, it is solving the hardest technology problems by creating the largest shared technology investment in history. Founded in 2000, The Linux Foundation today provides tools, training and events to scale any open source project, which together deliver an economic impact not achievable by any one company. More information can be found at http://www.linuxfoundation.org.
The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see our trademark usage page: https://www.linuxfoundation.org/trademark-usage. Linux is a registered trademark of Linus Torvalds.
Media Contact: Jill Lovato, The Linux Foundation, [emailprotected]
SOURCE The Linux Foundation
Go here to read the rest:
Software for Open Networking in the Cloud (SONiC) Moves to the Linux Foundation - PR Newswire
Posted in Cloud Computing
Comments Off on Software for Open Networking in the Cloud (SONiC) Moves to the Linux Foundation – PR Newswire
Google Distributed Cloud: Who Will It Benefit? – ITPro Today
Posted: at 12:47 pm
Wouldn't it be great if you could take a public cloud platform like Google Cloud and deploy its services in your own data center, or even on edge devices?
Well, you can, using Google Distributed Cloud, one of the newest offerings in Google's cloud services portfolio.
Related: Will Hybrid Cloud Save You Money? Predict Your Hybrid Cloud Costs
Here's how Google Distributed Cloud works, which use cases it targets, and why you may (or may not) want to use it as part of an edge or hybrid cloud strategy.
What Is Google Distributed Cloud?
Related: The Pros and Cons of Kubernetes-Based Hybrid Cloud
Google Distributed Cloud is a suite of cloud products from Google designed primarily to support the deployment of public cloud services at the "edge."
Thus, Google Distributed Cloud isn't a specific platform or service so much as a set of tools and services that you can use in a variety of ways.
Most of the services built into Google Distributed Cloud come from Google's standard public cloud platform (such as Anthos), so there's nothing really brand-new about Google Distributed Cloud from a technical perspective. It's mostly the way that Google is packaging the services to enable edge and hybrid cloud use cases that makes Google Distributed Cloud unique.
Google announced its Distributed Cloud portfolio in October 2021. The offering currently remains in preview mode, and there have been few real-world deployments of the platform to date. A deployment by Bell Canada, announced in February 2022, is one of the first.
Google Distributed Cloud works by allowing users to extend public cloud services that are hosted on Google Cloud Platform to private servers, internet of things (IoT) devices, or other infrastructure. In other words, you can use Distributed Cloud to manage infrastructure that you own, as opposed to infrastructure owned by a cloud provider like Google, using many of the same tools and services that Google makes available to its public cloud customers.
In this way, Distributed Cloud is similar to offerings such as AWS Outposts and Azure Arc, which also extend public cloud functionality into private infrastructure.
Currently, Google Distributed Cloud is designed to run on four types of infrastructure setup.
This means Google Distributed Cloud can operate on basically any infrastructure, including conventional data centers and less orthodox environments, like networks of IoT devices.
If Google Distributed Cloud sounds like a hybrid cloud platform, it's because it basically is. Extending public cloud services like those of Google Cloud into private infrastructure would certainly qualify as a hybrid cloud deployment by most definitions.
Notably, however, Google is not calling the platform a hybrid cloud solution. Instead, Google is using language that centers on "edge" and "distributed" infrastructure.
That's probably because Google already markets Anthos (which, again, is integrated into the Distributed Cloud portfolio but which exists as a stand-alone product, too) as its main hybrid cloud solution.
Since Distributed Cloud is also based in part on Anthos, you could argue that the main difference between Distributed Cloud and a hybrid cloud platform is marketing and branding, not technology. And indeed, to a significant extent, Distributed Cloud seems to be a reflection of Google's efforts to position itself as a leader in the edge computing market above all else.
It's understandable why Google would choose to brand Distributed Cloud as something different from a hybrid cloud platform, even if it's technically really not that different. With Distributed Cloud, Google is in a stronger position to cater to use cases like running network functions on telco infrastructure or managing edge IoT device deployments, which aren't usually the focus of conventional hybrid cloud platforms.
Ultimately, Google Distributed Cloud is likely to become a product that is very important in certain narrow niches, but that most companies won't use.
In verticals such as telco, or for businesses with large IoT infrastructures, Google Distributed Cloud offers an easy way of managing large, distributed networks of devices. It's not the only solution of its kind; you could also manage distributed infrastructures using most Kubernetes distributions, for example, or via proprietary services like Azure IoT Edge. But the fact that Google Distributed Cloud is based on Google Cloud services will give it an advantage among customers who are already invested in the Google Cloud ecosystem.
That said, companies that just want to run a conventional hybrid cloud, meaning one that extends public cloud services to private servers or data centers without edge infrastructure in the mix, aren't likely to benefit from Google Distributed Cloud. They should choose a more traditional hybrid cloud platform, like Anthos or a similar offering from a different public cloud provider.
Continue reading here:
Google Distributed Cloud: Who Will It Benefit? - ITPro Today
Posted in Cloud Computing
Comments Off on Google Distributed Cloud: Who Will It Benefit? – ITPro Today
Unveiling the Potential Relationship between IoT and Cloud Computing – IoT For All
Posted: April 9, 2022 at 4:05 am
Today, if we look around, we find that IoT, the Internet of Things, is reshaping our daily lives, both at home and in the workplace. It has been 20 years since the concept first shook up the tech world, and since then it has offered solutions that have made everything more seamless.
From smart Fitbits to the Amazon Echo or Google Home, most people today use connected smart devices and wearables to monitor health metrics like heart rate, calories, and daily activity. In fact, some use them to manage heating, lighting, and home security, and to re-order household staples when supplies run low.
The digital changes occurring everywhere, at home or business, in hospitals, buildings, or the entire town, show that IoT is growing at a breakneck speed.
As per research conducted by Juniper Research, the number of IoT connections is set to increase from 35 billion in 2020 to 83 billion by 2024, an increase of around 130 percent over four years, as enterprise IoT users extend their IoT ecosystems to improve operational efficiency and generate real-time insights.
The pandemic proved that digitization has become necessary in every sector. Companies have had to double down on digital transformation projects, and technologies like IoT are the only way to make that possible. Embracing these technologies is the only way to improve customer service, automate processes and tasks, track assets, detect existing loopholes, and re-invent and renovate existing business models. But the success of IoT is not possible without cloud computing. The cloud offers far more than just the connectivity that IoT devices depend on: IoT devices rely on the cloud to store essential data in one central location where it can be easily accessed, managed, and distributed in real time.
When it comes to the relationship between IoT and cloud computing, here are four significant benefits that can compel organizations to use clouds to unleash the full potential of their IoT devices.
The cloud can assist organizations in overcoming the technical and cost hurdles that come with deploying an IoT solution.
The cloud eliminates the requirement to set up physical servers, deploy databases, configure networks, manage connections, or handle other infrastructure tasks. It makes it fast and easy to spin up virtual servers, launch databases, and create the data pipelines needed to operate an IoT solution.
On-premises IoT network infrastructure, by contrast, requires a lot of hardware and time-consuming configuration effort to make sure things run properly; implementing a cloud-powered IoT system is significantly more streamlined.
For instance, scaling up the number of IoT-enabled devices just requires leasing another virtual server or more cloud space.
In the same way, cloud services can easily streamline remote device lifecycle management, delivering a 360-degree view of the device infrastructure along with tools that automate over-the-air setup and updates of software and firmware.
IoT devices are essential for both consumers and enterprises because of the information they provide. But they become even more useful when they communicate with each other.
For instance, a connected thermostat can communicate with a smart refrigerator to increase or decrease the temperature. A connected microcontroller can analyze sensor data and predict when preventive maintenance is needed, reducing the chances of damage.
The cloud helps here by streamlining and optimizing machine-to-machine communication and facilitating it across interfaces. With increased interactions between many connected devices and the immense volumes of data generated, organizations will have to find a cost-effective way to store, process, and access data from their IoT solutions.
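The thermostat-and-refrigerator scenario boils down to publish/subscribe messaging between devices. The sketch below models that pattern with a tiny in-process broker in Python; a real deployment would use a managed cloud message service (for example, MQTT through a cloud broker), and all topic and device names here are invented for illustration.

```python
from collections import defaultdict

class Broker:
    """Tiny in-process stand-in for a cloud pub/sub message broker."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Deliver the message to every device subscribed to this topic.
        for callback in self.subscribers[topic]:
            callback(message)

broker = Broker()
actions = []

# A "smart refrigerator" reacts to temperature readings from a thermostat.
def refrigerator(reading):
    if reading["celsius"] > 8.0:
        actions.append("increase cooling")
    else:
        actions.append("hold")

broker.subscribe("home/thermostat", refrigerator)

# The "thermostat" publishes readings; the broker routes them.
broker.publish("home/thermostat", {"celsius": 9.5})
broker.publish("home/thermostat", {"celsius": 4.2})

print(actions)  # → ['increase cooling', 'hold']
```

The cloud's role in the real version of this pattern is to host the broker, buffer messages when devices are offline, and fan readings out to however many consumers subscribe.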
In addition, they need to be able to scale up to manage peaks in demand, or to extend the infrastructure to handle extra functionality whenever they add more features to their IoT solution.
An IoT solution generates immense amounts of data, so built-in management tools and processing capabilities that support the effective and efficient transfer of data between devices make the process far easier. The cloud also offers a hosting platform for Big Data and data analytics at a significantly lower cost.
Data generated by IoT devices can be stored and processed on a cloud server and accessed at any time, from any place, without infrastructure or networking issues. Likewise, data can be collected remotely and in real time from devices located anywhere, in any time zone.
Sometimes, interoperability issues hamper the ability of enterprises to link or integrate data generated by IoT devices with other data sources.
Adding a cloud layer can help link applications and seamlessly integrate all the data sources so they can be analyzed, regardless of origin.
The cloud can also help organizations streamline the integration of their IoT solution with smart products developed by third parties, generating additional value for users.
Security has been a much-discussed concern, as security lapses and failures to update IoT devices have created a gateway for cybercriminals. Cloud platforms can support enterprises in improving and strengthening security in two ways.
Firstly, as we already know, cloud providers can make it simple to undertake regular software and firmware updates, signed with digital certificates that ensure users these updates are safe and authorized.
Secondly, cloud platforms help initiate customized client- and server-side encryption that protects data flowing through the IoT ecosystem, even when it is at rest in the database. Many cloud service providers offer 24/7 monitoring to minimize the risk of a security breach.
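The signed-update mechanism mentioned above can be sketched concretely: before applying an over-the-air update, the device checks the payload against a signature computed by the cloud side. The minimal example below uses an HMAC from Python's standard library as a stand-in; real deployments use public-key certificates rather than a shared secret, and the key and firmware bytes here are purely illustrative.

```python
import hashlib
import hmac

SECRET = b"device-provisioning-key"  # illustrative; real systems use PKI certificates

def sign(firmware: bytes) -> str:
    """Cloud side: attach a signature to a firmware image."""
    return hmac.new(SECRET, firmware, hashlib.sha256).hexdigest()

def verify(firmware: bytes, signature: str) -> bool:
    """Device side: accept the update only if the signature matches."""
    expected = hmac.new(SECRET, firmware, hashlib.sha256).hexdigest()
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(expected, signature)

image = b"firmware-v2.1-image-bytes"
good_sig = sign(image)

print(verify(image, good_sig))              # True: authorized update is accepted
print(verify(b"tampered image", good_sig))  # False: altered payload is rejected
```

The point of the pattern is that a device never installs bytes it cannot authenticate, which is exactly the guarantee the cloud-managed update services described above provide at scale.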
Many organizations are embracing IoT technologies, and those still reliant on traditional infrastructure will find themselves at a loss.
By adopting the cloud, a powerhouse for IoT device communication and storage, organizations will experience better connectivity rates and improved ROI.
Embracing a hybrid cloud approach lets IT teams establish the right mix of hosting options, allowing them to manage rapid rollout and enablement while getting the most out of IoT devices and securing a better IoT strategy, without investing much time, money, and effort in developing costly infrastructure.
For organizations that have decided to extend their IoT ambitions, the cloud can help develop IoT products faster, easily manage and handle all the generated data, secure the IoT ecosystem, and establish better integration with existing systems and other IoT devices. This means cloud computing will be the key to unlocking a faster time to market, with greater flexibility and added lifetime value for a successful and profitable IoT deployment.
Read more:
Unveiling the Potential Relationship between IoT and Cloud Computing - IoT For All
Cloud Computing Now IT’s Default Tech Are You in the Game? – Channel Futures
Posted: at 4:05 am
This week's cloud news roundup also features takeaways from Prosimo, Google Cloud and NetApp.
Cloud computing is turning into the most in-demand technology in all its different facets. Think SaaS, PaaS, IaaS, everything as a service. New research from Foundry (formerly IDG Communications) highlights the momentum as organizations adopt more cloud services and infrastructure. Partners will want to take note of the end-user demand, given the channel's growing involvement in guiding, deploying and managing customers' cloud environments. (We'll be talking about cloud computing opportunities for smaller partners, April 11-14, at the Channel Partners Conference & Expo.)
Along similar lines, Prosimo, a vendor built by Viptela's cofounders, has a new cloud computing platform partners will want to investigate. It's geared toward resellers and managed services providers.
Next up, you probably know that Google Cloud is a founding member of the new Data Cloud Alliance. We covered that a couple of days ago, but here we offer a few more details. Channel partners should benefit, albeit in a trickle-down kind of way.
Finally, NetApp, which bought CloudCheckr, just announced another acquisition. Find out which cloud computing company it's buying to add to its Spot by NetApp portfolio (where CloudCheckr now resides).
It's all in the slideshow above.
More:
Cloud Computing Now IT's Default Tech Are You in the Game? - Channel Futures
Software-as-a-Service Rules the Cloud – DARKReading
Posted: at 4:05 am
The majority of respondents in a new survey on cloud adoption say they use software-as-a-service (SaaS). Eight in 10 (80%) IT professionals at organizations that employ cloud computing have implemented SaaS.
That's according to "State of the Cloud: A Security Perspective," a report from Dark Reading released in March. The research surveyed decision-makers with IT job titles at organizations that use cloud services. Those organizations include companies of all sizes from a variety of industries, including technology, healthcare, financial services, manufacturing, and education.
The overall picture from the survey is one of a robust remote workforce environment. While SaaS was by far the most widely embraced cloud application, almost half of respondents also reported using infrastructure-as-a-service (49%) and platform-as-a-service (47%). Other everyday work solutions were rarer, including desktop-as-a-service (19%) and containers-as-a-service (14%).
One of the obstacles to implementing cloud computing more widely is concern about security, the main focus of this report. Many respondents say they already use cloud tools to secure their data and networks. One-quarter (25%) of respondents use disaster-recovery-as-a-service to help bounce back from cyberattacks and natural disasters. Almost one in five (18%) implement security-as-a-service. Only 10% of cloud users run secure access service edge (SASE) defense, however, which means that area is still wide open for competition.
For more data and insights, download the full report.
Cloud computing spending is growing again and there’s more to come – ZDNet
Posted: at 4:05 am
Businesses around the world spent $21.1 billion on cloud infrastructure services in the fourth quarter (Q4) of 2021, signaling a rebound in spending on cloud storage and compute power.
Spending on cloud infrastructure was up 13.5% in the quarter year on year to $21.1 billion, according to tech research firm IDC.
The previous quarter saw spending on cloud infrastructure reach $18.6 billion after a remarkable year-on-year decline of 1.9% in Q2 2021, which was the first time in seven quarters that spending on cloud decreased.
Cloud spending rose as businesses and governments across the world embarked on major digital transformation projects over the last two years. The big winners are the big three cloud players: Amazon Web Services (AWS), Google Cloud, and Microsoft with Azure and its other cloud services, like Office 365.
"This marked the second consecutive quarter of year-over-year growth as supply chain constraints have depleted vendor inventories over the past several quarters. As backlogs continue to grow, pent-up demand bodes well for future growth as long as the economy stays healthy, and supply catches up to demand," IDC said of Q4 2021.
Over the whole of 2021, spending on cloud was 8.8% higher than in 2020, reaching a total of $73.9 billion for the year.
Enterprise spending on traditional IT grew too, but not as fast as cloud spending. Enterprises invested $17.2 billion in non-cloud infrastructure, up 1.5% year on year in Q4 2021, according to IDC. It marks the fourth consecutive quarter of growth in traditional IT spending, which grew 4.2% over 2020 to $59.6 billion for the year.
Cloud giants are trying to gain an edge on each other in all sorts of ways. Last week a Google Cloud survey argued that government workers were worried that reliance on Microsoft's products was undermining the government's cybersecurity; Microsoft didn't agree.
IDC forecasts that firms will spend $90.0 billion on cloud infrastructure in 2022, up 21.7% compared to 2021. The biggest loser is traditional IT spending by organizations that maintain their own infrastructure: spending on non-cloud infrastructure is expected to decline 0.3% to $59.4 billion.
However, IDC reckons that spending on shared cloud infrastructure will grow 25.5% year over year to $64.5 billion in 2022, while spending on dedicated cloud infrastructure is expected to grow 13.1% to $25.4 billion.
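The IDC figures quoted above are internally consistent, and that can be checked with a little arithmetic: the shared and dedicated forecasts should sum to roughly the 2022 total, and the 2021 and 2022 totals imply the stated growth rate (small differences are rounding in the source).

```python
# Figures from the IDC forecasts quoted in the article, in $ billions.
total_2021 = 73.9
total_2022 = 90.0
shared_2022 = 64.5
dedicated_2022 = 25.4

# Shared + dedicated should roughly reproduce the quoted 2022 total of $90.0B.
print(round(shared_2022 + dedicated_2022, 1))  # 89.9, consistent with $90.0B after rounding

# Implied year-over-year growth; the article quotes 21.7%.
growth = 100 * (total_2022 / total_2021 - 1)
print(round(growth, 1))  # about 21.8, matching the quoted 21.7% up to rounding
```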
See the article here:
Cloud computing spending is growing again and there's more to come - ZDNet
Importance of cloud computing highlighted at investment summit – Gulf Today
Posted: at 4:05 am
Top dignitaries and participants during the investment summit in Riyadh.
Gulf Today Report
Sir Anthony Ritossa's 18th Global Family Office Investment Summit was held recently in Riyadh in the presence of Prince Abdulaziz Bin Faisal Bin Abdulmajeed Al Saud.
The event hosted more than 400 world leaders who share similar values regarding the importance of tailoring solutions to the world's greatest challenges and making an impact through investment, positive action and personal commitment.
Royal families, global business leaders, leading entrepreneurs, and private investors were guests at the event. The previous summits have raised $2.8 billion in investments for leading global start-ups, entrepreneurs, funds, and philanthropic endeavors.
As a VIP guest at the event, Ethernity CLOUD presented the latest developments in its network for private, confidential data processing.
In light of today's focus on Web 3.0, the applicability of Ethernity CLOUD is limitless, and long-term use cases include healthcare, military, and government applications. The company's technology is decentralised, confidential, anonymous, and highly available, which makes it an ideal solution for the fast-growing confidential computing market, estimated to reach $52 billion in the next three years, according to Everest Group market research.
"Our processes are transparent, and the data is confidential - while in storage, in transit, and during processing. With data being at the core of the digital world and privacy demand on the rise, Ethernity CLOUD provides a valuable network where data can be processed in a private and confidential manner," said Iosif Peterfi, Founder and CEO of Ethernity CLOUD.
Because Ethernity CLOUD uses blockchain technology, activity on the network is fully transparent and takes place in a trustless environment, preventing foul play. And because the network is composed of independent CPUs (nodes), downtime is virtually reduced to zero. Essentially, Ethernity CLOUD provides a way for business entities to run dApps in a private and confidential manner, while also providing a passive income source for node operators.
Importantly, the company is also environmentally oriented and takes proper measures to offset carbon emissions. Through a partnership with Green Ant, a project which plants trees in Thailand and mints an NFT for each tree that they plant, Ethernity CLOUD is committed to becoming carbon negative.
Iosif Peterfi speaks during the summit.
"As a former security guy, having worked with the US Department of Defense, I recognise that blockchain is the building block that is the answer to a lot of our current issues. At Ethernity CLOUD, data privacy is considered a human right. That is our mission. Blockchain opened the door to make our project feasible; blockchain for us is about decentralisation," said Peterfi.
"Ethernity CLOUD is revolutionising the cloud computing industry with its fair ecosystem, where everyone's right to privacy is fully protected and integrity is ensured by the blockchain itself. Data is protected from abusive cloud providers' activities, ensuring fair, decentralised and truly private operations. We are honoured to have had the team join us in Riyadh," said Sir Anthony Ritossa, Chairman of Ritossa Family Office.
Read the rest here:
Importance of cloud computing highlighted at investment summit - Gulf Today
Google Finally Gets The Edge Computing Strategy Right With Distributed Cloud Edge – Forbes
Announced at the Google Cloud Next '21 conference, Google Distributed Cloud (GDC) plays a critical role in the success of Anthos by making it relevant to telecom operators and enterprise customers. Google Distributed Cloud Edge, part of GDC, aims to make Anthos the foundation for running 5G infrastructure and modern workloads such as AI and analytics.
Recently, Google announced the general availability of GDC Edge by sharing the details of the hardware configuration and the requirements.
In its initial form, GDC Edge runs on two form factors - a rack-based configuration and the GDC Edge appliance. Let's take a closer look at these choices.
This configuration targets telecom operators and communication service providers (CSPs) for running 5G core and radio access networks (RAN). The CSPs can expose the same infrastructure to their end customers for running workloads like AI inference that need ultra-low latency.
The location where the rack-based hardware runs is designated as a Distributed Cloud Edge Zone. Each zone runs on dedicated hardware that Google provides, deploys, operates, and maintains. The hardware consists of six servers, and two top-of-rack (ToR) switches connecting the servers to the local network. In terms of storage, each physical server comes with 4TiB disks. The gross weight of a typical rack is 900lbs or 408kg. The Distributed Cloud Edge rack arrives pre-configured with the hardware, network, and Google Cloud settings specified when it was ordered.
Once a DCE zone is fully configured, customers can group one or more servers from the rack to create a NodePool. Each node of the NodePool acts as a Kubernetes worker node connected to the Kubernetes control plane running in the nearest Google Cloud region.
This distributed topology gives Google the flexibility to upgrade, patch, and manage the Kubernetes infrastructure with minimal disruption to customer workloads. It allows DCE to benefit from a secure and highly available control plane without taking up the processing capacity on the nodes.
Google took a unique approach to edge computing by moving the worker nodes to the edge while running the control plane in the cloud. This is very similar to how Google manages GKE, except that the worker nodes are a part of the NodePool deployed at the edge.
The clusters running on DCE may be connected to the Anthos management plane to gain better control over deployments and configuration.
A secure VPN tunnel connects the local Distributed Cloud Edge infrastructure to a virtual private cloud (VPC) configured within Google Cloud. Workloads running at the edge can access Google Compute Engine resources deployed in the same VPC.
The rack-based configuration demands connectivity to the Google Cloud at all times. Since it runs in a controlled environment in a CSP facility, meeting this requirement is not a challenge.
Once the clusters are provisioned on the DCE infrastructure, they can be treated like other Kubernetes clusters. It is also possible to provision and run virtual machines based on kubevirt within the same environment.
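Because provisioned DCE clusters behave like any other Kubernetes cluster, steering a workload onto the edge nodes follows the standard Kubernetes pattern of node labels and selectors. A minimal sketch (the `edge-zone` node label and the container image are hypothetical illustrations, not part of Google's documented configuration):

```yaml
# Illustrative Deployment pinned to edge worker nodes via an assumed node label.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: inference-at-edge
spec:
  replicas: 2
  selector:
    matchLabels:
      app: inference-at-edge
  template:
    metadata:
      labels:
        app: inference-at-edge
    spec:
      nodeSelector:
        edge-zone: us-east-dce          # assumed label on the DCE NodePool's nodes
      containers:
      - name: inference
        image: gcr.io/example-project/inference:latest   # placeholder image
```

Applied with `kubectl apply -f` against the edge cluster's kubeconfig, exactly as for a cloud-hosted cluster.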
CSPs from the United States, Canada, France, Germany, Italy, Netherlands, Spain, Finland, and the United Kingdom can order rack-based infrastructure from Google.
The GDC Edge Appliance is a Google Cloud-managed, secure, high-performance appliance for edge locations. It provides local storage, ML inference, data transformation, and export functionality.
According to Google, GDC Edge Appliances are ideal for use cases where bandwidth and latency limitations prevent organizations from processing the data from devices like cameras and sensors back in cloud data centers. These appliances simplify data collection, analytics, and processing at remote locations where copious amounts of data coming from these devices need to be processed quickly and stored securely.
The Edge Appliance targets enterprises from the manufacturing, supply chain, healthcare, and automotive verticals with low-latency and high throughput requirements.
GDC Edge Appliance
Each appliance comes with a 16-core CPU, 64GB RAM, an NVIDIA T4 GPU, and 3.6TB of usable storage. It has a pair of 10 Gigabit and 1 Gigabit Ethernet ports. With its 1U rack-mount form factor, it supports either horizontal or vertical orientation.
The Edge Appliance is essentially a storage transfer device that can also run a Kubernetes cluster and AI inference workloads. With ample storage capacity, customers can use it as a cloud storage gateway.
For all practical purposes, the Edge Appliance is a managed device running Anthos clusters on bare metal. Customers follow the same workflow as installing and configuring Anthos in bare metal environments.
Unlike the rack-based configuration, the clusters run both the control plane and the worker nodes locally on the appliance, but they are registered with the Anthos management plane running in the nearest Google Cloud region. This configuration makes it possible to run the edge appliance in an offline, air-gapped environment with only intermittent connectivity to the cloud.
Analysis and Takeaways
With Anthos and GDC, Google defined a comprehensive multicloud, hybrid, and edge computing strategy. GDC Edge targets CSPs and enterprises through purpose-built hardware offerings.
Telecom operators need a reliable, modern platform to run 5G infrastructure. Google is positioning Anthos as the cloud native, reliable platform for running the containerized network functions (CNFs) required for 5G Core and Radio Access Networks (RAN). By delivering a combined managed hardware (rack-based GDC Edge) and software (Anthos) stack, Google wants to enable CSPs to offer 5G Multi-Access Edge Computing (MEC) to enterprises. It has partnered with AT&T, Reliance JIO, TELUS, Indosat Ooredoo, and more recently with Bell Canada and Verizon, to run 5G infrastructure.
Google's approach to delivering 5G MEC differs from Amazon's and Microsoft's. Both AWS and Azure have 5G-based zones that act as extensions to their data center footprint. AWS Wavelength and Azure Private MEC enable customers to run workloads in the nearest edge location, managed by a CSP. Both Amazon and Microsoft are partnering with telecom providers such as AT&T, Verizon and Vodafone to offer hyperlocal edge zones.
Google is betting big on Anthos as the fabric to run 5G MEC. It's partnering with leading telcos worldwide to help them build 5G infrastructure on its proven Anthos-based cloud native stack. Though Google may offer a competitor to AWS Wavelength and Azure Private MEC in the future, its current strategy is to push GDC Edge as the preferred 5G MEC platform. This approach puts the CSP front and center in its edge computing strategy.
Google has finally responded to Azure Stack HCI and AWS Outposts with the GDC Edge Appliance. It's targeting enterprises that need a modern, cloud native platform to run data-driven, compute-intensive workloads at the edge. Unlike the rack-based configuration, the edge appliance may be deployed in remote locations with intermittent connectivity.
With Anthos as the cornerstone, Google's Distributed Cloud strategy looks promising. It is aiming to win the enterprise edge as well as the telco edge with purpose-built hardware offerings. Google finally has a viable competitor for AWS Wavelength, AWS Outposts, Azure Edge Zones, and Azure Stack.
View original post here:
Google Finally Gets The Edge Computing Strategy Right With Distributed Cloud Edge - Forbes
How to combine the power of cloud and edge computing – Raconteur
Like companies all around the world, US fast-food chain Taco Bell responded to the pandemic's commercial impact by accelerating its shift to the cloud. As customers' traditional patterns of restaurant and drive-through consumption changed rapidly and permanently to include kiosk, mobile and web ordering, often through third-party delivery services, Taco Bell moved the remainder of its group IT to cloud services.
But this 100% cloud-based approach stops at the restaurant door. Given that many of its 7,000 outlets don't have fast and/or reliable internet connections, the company has recognised the limitations of the public cloud model and augmented its approach with edge computing. This set-up enables the company to process data near the physical point at which it is created, with only a periodic requirement to feed the most valuable material back to the cloud and receive updates from it.
Taco Bell is just one of thousands of firms seeking to exploit the fast-evolving and much-hyped distributed IT capability that edge computing can offer.
"Edge computing is getting so much attention now because organisations have accepted that there are things that cloud does poorly," observes Bob Gill, vice-president of research at Gartner and the founder of the consultancy's edge research community.
Issues of latency (time-lag) and limited bandwidth when moving data are key potential weaknesses of the centralised cloud model. These drive a clear distinction between the use cases for cloud and edge computing. But the edge is also a focus for many organisations because they want to add intelligence to much of the equipment that sits within their operations and to apply AI-powered automation at those endpoints.
Early adopters include manufacturers implementing edge computing in their plants as part of their Industry 4.0 plans; logistics groups seeking to give some autonomy to dispersed assets; healthcare providers with medical equipment scattered across hospitals; and energy companies operating widely dispersed generation facilities.
"For such applications to be viable and efficient, their data must be processed as close to the point of origin or consumption as possible," says George Elissaios, director of product management at Amazon Web Services. "With edge computing, these applications can have lower latency, faster response times and give end customers a better experience." Edge computing can also aid interconnectivity by reducing the amount of data that needs to be backhauled to datacentres.
In some ways, the emergence of edge computing represents a new topology for IT. So says Paul Savill, global practice leader for networking and edge computing at Kyndryl, the provider of managed infrastructure services that was recently spun out of IBM.
"Companies are looking at the edge as a third landing spot for their data and applications. It's a new tier between the public cloud and the intelligence at an end device - a robot, say," he explains.
But most organisations don't expect their edge and cloud implementations to exist as distinct entities. Rather, they want to find ways to blend the scalability and flexibility they have achieved with the cloud with the responsiveness and autonomy of internet-of-things (IoT) and satellite processors installed at the edge.
Gill believes that cloud and edge are "pure yin and yang. Each does things the other doesn't do well. When put together effectively, they are highly symbiotic."
They will need to be, as more and more intelligence is moved to the edge. More than 75 billion smart digital devices will be deployed worldwide by 2025, according to projections by research group IHS Markit. And it is neither desirable nor realistic for all of these to be interacting continuously with the cloud.
"When you start to add in multiple devices, you see a vast increase in the volume, velocity and variety of the data they generate," says Greg Hanson, vice-president of data management company Informatica in EMEA and Latin America. "You simply can't keep moving all of that data into a central point without incurring a significant cost and becoming reliant on network bandwidth and infrastructure."
In such situations, edge IT performs a vital data-thinning function. Satellite processors sitting close to the end points filter out the most valuable material, collate it and dispatch it to the cloud periodically for heavyweight analysis, the training of machine-learning algorithms and longer-term storage. Processors at the edge can also apply data security and privacy rules locally to ensure regulatory compliance.
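The data-thinning function described above can be sketched in a few lines: readings are filtered locally and only the batch that clears a significance threshold is forwarded upstream. A minimal, illustrative Python sketch (the threshold and the `forward_to_cloud` hook are assumptions for illustration, not a real API):

```python
from typing import Callable

def thin_readings(readings: list[float],
                  threshold: float,
                  forward_to_cloud: Callable[[list[float]], None]) -> int:
    """Keep only readings whose magnitude exceeds `threshold`, batch them,
    and forward the batch upstream. Returns the number of readings dropped."""
    significant = [r for r in readings if abs(r) > threshold]
    if significant:
        forward_to_cloud(significant)   # periodic upload of valuable data only
    return len(readings) - len(significant)

# Example: five sensor readings; only the spikes leave the edge.
uploaded: list[float] = []
dropped = thin_readings([0.1, 9.7, 0.2, 8.4, 0.05],
                        threshold=1.0,
                        forward_to_cloud=uploaded.extend)
print(uploaded, dropped)  # [9.7, 8.4] 3
```

In a real deployment, compliance filtering (stripping personal data before upload, as the paragraph notes) would slot into the same local step.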
Gill notes that edge computing has shifted quickly from concept and hype to successful implementations. In many vertical industries, it is generating revenue, saving money, improving safety, enhancing the customer experience and enabling entirely new applications and data models.
Before achieving such gains, many edge pioneers are likely to have surmounted numerous significant challenges. Given that the technology is immature, there are few widely accepted standards that businesses can apply to it. This means that they're often faced with an overwhelmingly wide range of designs for tech ranging from sensors and operating systems to software stacks and data management methods.
Such complexity is reflected in a widespread shortage of specialist expertise. As Savill notes: "Many companies don't have all the skills they need to roll out edge computing. They're short of people with real competence in the orchestration of these distributed application architectures."
The goal may be to blend cloud and edge seamlessly into a unified model, but the starting points can be very different. There are two fundamentally different, though not totally contradictory, schools of thought, according to Gill. The "cloud out" perspective, favoured by big cloud service providers such as Amazon, Microsoft and Google, views the edge as an extension of the cloud model that extends the capabilities of their products.
The other approach is known as "edge in". In this case, organisations develop edge-native applications that occasionally reach up to the cloud to, say, pass data on to train a machine-learning algorithm.
Adherents of either approach are seeing significant returns on their investments when they get it right.
"We may be in the early phase of exploiting that combination of IoT, edge and cloud, but the capabilities enabling these distributed architectures - the software control and orchestration tools and the integration capabilities - have already reached the point where they're highly effective," Savill reports. "Some companies that are figuring this out are seeing operational savings of 30% to 40% compared with more traditional configurations."
In doing so, they are also heralding a large-scale resurgence of the edifice that cloud helped to tear down: on-premises IT, albeit in a different form.
"In the next 10 to 20 years, the on-premises profile for most companies will not be servers," Elissaios predicts. "It will be connected devices - and billions of them."
Follow this link:
How to combine the power of cloud and edge computing - Raconteur
Will cloud computing be Canada's next big military procurement? Here's what to know – Global News
Ask most Canadians what the military needs next, and cloud computing might not be the first thing that jumps to mind.
But modernizing how Canadian security officials manage increasingly massive troves of data could be among the most important decisions of the coming years and federal officials have confirmed to Global News that preliminary work is underway.
"Militaries are reflective of the societies they live in, and a lot of the development of how we're going to fight wars in the future is stuff that we see in society today, which is large amounts of data management," said Richard Shimooka, a senior fellow at the Macdonald-Laurier Institute.
"It's taking huge amounts of information and organizing and storing it away, and then actually applying them to conduct operations."
Canadian national security agencies and the military sit atop troves of data that need to be continually tracked, assessed and managed in order to support the operations carried out to protect the country's interests.
Increasingly, though, those reams of data aren't being stored just in filing cabinets or basements or bunkers. They sit in the cloud - the digital ether that most Canadians likely know best as the safe haven for backing up old family photos or for syncing information between multiple devices.
As the amorphous nature of cyber warfare and cyber conflict have demonstrated over recent years, being able to gather, interpret, share and act on digital information is already a critical part of how militaries and national security agencies do their jobs in the 21st century.
Yet modernization has been a slow march for Canadian security actors, including the Canadian Forces.
"Some of our systems and processes date back to the '50s. So [there is] crazy potential to upgrade that with not even modern practices, but to catch up to the 2010s," said Dave Perry, vice-president of the Canadian Global Affairs Institute and an expert in Canadian defence policy.
"It was a massive accomplishment to start using [Microsoft] Office 365 in recent years."
U.S. military cloud contracts are worth billions
Speculation about whether Canada could look toward a cloud computing contract comes amid plans south of the border to award a multibillion-dollar contract later this year for the Department of Defense.
Last summer, the U.S. Defense Department announced plans to award a contract in April 2022 for what it now calls the Joint Warfighting Cloud Capability.
That initiative aims to bring multiple American IT providers into a contract to provide cloud computing services for the military, and it replaces a single-vendor program planned under the former Trump administration that was known as JEDI, the Joint Enterprise Defense Infrastructure project.
Last month, the Pentagon announced the JWCC contract won't be awarded until December 2022.
Microsoft and Amazon are believed to be frontrunners for different parts of that deal, while Google, Oracle and IBM have also expressed interest.
Some of those firms are now also lobbying Canadian officials to get similar contracts in place here.
Which firms are lobbying Canadian officials?
Google, IBM, Oracle and Microsoft did not have any lobbying listings with national security officials in recent months, although all list cloud computing among their broader lobbying interests with other departments, including Treasury Board Secretariat, Justice Canada, and Natural Resources.
Amazon Web Services does have recent records filed disclosing lobbying with national security agencies and officials, one of its listed interests being "seeking contracts with multiple government departments and institutions with regards to Amazon Cloud based solutions and related support services."
Story continues below advertisement
The web giant also has job postings up for working on its push to get cloud computing into Canadian government departments, including an account manager. That role is tasked with increasing adoption of Amazon Web Services by developing strategic accounts within Canada's Federal Government National Security sector.
According to lobbyist filings, Eric Gales, president of the Canadian branch, had meetings with Michael Power, chief of staff to Defence Minister Anita Anand, on Feb. 19, 2022, and one day earlier had met with the acting assistant deputy minister of Shared Services Canada, Scott Davis.
He also met with Sami Khoury, head of the Canadian Centre for Cyber Security, on Nov. 17, 2021.
The Canadian Centre for Cyber Security is part of the Communications Security Establishment, Canada's signals intelligence agency and the body tasked with protecting the Government of Canada's IT networks.
A spokesperson for the CSE confirmed early work on the matter is underway.
"The evolving information technology (IT) world is moving to cloud-based services. We are aware that our closest allies have, or are acquiring, classified cloud capabilities, and we continue to engage in conversations with them on security requirements to maintain interoperability," Evan Koronewski said.
"The Government of Canada's security and intelligence community is engaged in preliminary research, exploring the requirements for classified cloud services."
He added officials are exploring security requirements with the Treasury Board Secretariat, Shared Services Canada, and the Department of National Defence.
A spokesperson for the latter also confirmed that the military is working on incorporating more cloud capabilities, though not yet for classified material.
"We recognize that cloud computing offers key benefits in terms of IT efficiency," said Dan Le Bouthillier.
"DND/CAF is building its cloud capacity and has adopted a Multi-cloud Strategy with multiple vendors, namely Microsoft, Amazon Web Services, and Google."
He added the goal is to strike the right balance between agility and security.
The website for Shared Services Canada, which handles IT services for government departments, states there are framework agreements for cloud computing in place with eight providers: Google Cloud, ServiceNow, IBM Cloud, Oracle, ThinkOn, Microsoft and Amazon Web Services.
Those will let departments contract cloud services as they need through those providers.
The U.S. military cloud computing contract is valued at US$9 billion, or about C$11.2 billion.
It's not clear how much a similar solution for national security agencies here could cost.
Both Prime Minister Justin Trudeau and Defence Minister Anita Anand have suggested in recent weeks that the government is weighing an increase to defence spending, moving it closer to the NATO target, which aims to see all members of the military alliance spend at least two per cent of GDP on defence.
Canada's current defence spending sits at 1.39 per cent of GDP.
Hitting the two per cent target would require approximately $16 billion more.
That would be above the increases currently projected under the government's 2017 plan to boost defence spending, which will see it rise to $32.7 billion by 2026/27 from $18.9 billion in 2016/17.
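The figures quoted above imply a rough size for the economy they are measured against, which makes for a useful back-of-the-envelope check: if closing the gap between 1.39 and two per cent of GDP costs about $16 billion, GDP works out to roughly $2.6 trillion. A quick illustrative calculation (figures in billions of Canadian dollars, rounded as in the article):

```python
current_share = 0.0139   # defence spending as a share of GDP
target_share = 0.02      # NATO target
extra_needed = 16.0      # additional billions required to reach the target

# Implied GDP: the gap between the two shares accounts for the extra spending.
implied_gdp = extra_needed / (target_share - current_share)
print(round(implied_gdp))  # ≈ 2623 billion, i.e. about $2.6 trillion
```

This is only a consistency check on the article's own numbers; NATO's accounting of what counts as defence spending differs from some domestic budget figures.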
More here:
Will cloud computing be Canada's next big military procurement? Here's what to know - Global News