
Category Archives: Cloud Computing

Bare Metal Performance in the Cloud – HPCwire

Posted: August 14, 2021 at 1:00 am

High Performance Computing (HPC) is known as a domain where applications are well-optimized to get the highest performance possible on a platform. Unsurprisingly, a common question when moving a workload to AWS is what performance difference there may be from an existing on-premises bare metal platform. This blog will show that the performance differential between bare metal instances and instances that use the AWS Nitro hypervisor is negligible for the evaluated HPC workloads.

The AWS Nitro System is a combination of purpose-built hardware and software designed to provide performance and security. Recent generation instances, including instance families popular with HPC workloads such as c5, c5n, m5zn, c6gn, and many others, are based on the Nitro System. As shown in Figure 1, the AWS Nitro System is composed of three main components: Nitro cards, the Nitro security chip, and the Nitro hypervisor. Nitro cards provide controllers for the VPC data plane (network access), Amazon Elastic Block Store (Amazon EBS) access, and instance storage (local NVMe), as well as overall coordination for the host. Offloading these capabilities to the Nitro cards removes the need to use host processor resources to implement these functions and also offers security benefits. The Nitro security chip provides a hardware root of trust and secure boot, among other features that help with system security. The Nitro hypervisor is a lightweight hypervisor that manages memory and CPU allocation.

With this design, the host system no longer has direct access to AWS resources. Only the hardened Nitro cards can access other resources, and each of those cards provides software-defined hardware devices that are the only access points from the host device. With I/O accesses handled by the Nitro cards, the last component, the Nitro hypervisor, can be lightweight and have minimal impact on workloads running on the host. The Nitro hypervisor contains only the necessary functions, with a design goal of being quiescent, which means it should never activate unless it is doing work for an instance that requested it. This also means there are no background tasks running and consuming resources when they are not needed.

Figure 1. The AWS Nitro System building blocks.

The Nitro System architecture also allows AWS to offer instances that provide direct access to the bare metal of the host. Since their initial introduction in 2017, many instance families offer *.metal variants, which provide direct access to the underlying hardware with no hypervisor. As in the case where the Nitro hypervisor is used, the Nitro cards are still the only access points to resources outside of the host. These instances are most commonly used for workloads that cannot run in a virtualized environment due to licensing requirements, or those that need specific hardware features only available through direct access.

With both bare metal instances and Nitro virtualized instances available, it is possible to directly compare HPC application performance on bare metal with performance on the AWS Nitro hypervisor.

You can read the full blog to see how different HPC applications perform on Amazon EC2 instances with AWS Nitro hypervisor vs. bare metal instances.

Reminder: You can learn a lot from AWS HPC engineers by subscribing to the HPC Tech Short YouTube channel, and following the AWS HPC Blog channel.

See original here:

Bare Metal Performance in the Cloud - HPCwire


Samsung Has Its Own AI-Designed Chip. Soon, Others Will Too – WIRED

Posted: at 1:00 am

Samsung is using artificial intelligence to automate the insanely complex and subtle process of designing cutting-edge computer chips.

The South Korean giant is one of the first chipmakers to use AI to create its chips. Samsung is using AI features in new software from Synopsys, a leading chip design software firm used by many companies. "What you're seeing here is the first of a real commercial processor design with AI," says Aart de Geus, the chairman and co-CEO of Synopsys.

Others, including Google and Nvidia, have talked about designing chips with AI. But Synopsys' tool, called DSO.ai, may prove the most far-reaching because Synopsys works with dozens of companies. The tool has the potential to accelerate semiconductor development and unlock novel chip designs, according to industry watchers.

Synopsys has another valuable asset for crafting AI-designed chips: years of cutting-edge semiconductor designs that can be used to train an AI algorithm.

A spokesperson for Samsung confirms that the company is using Synopsys' AI software to design its Exynos chips, which are used in smartphones, including its own branded handsets, as well as other gadgets. Samsung unveiled its newest smartphone, a foldable device called the Galaxy Z Fold3, earlier this week. The company did not confirm whether the AI-designed chips have gone into production yet, or what products they may appear in.

Across the industry, AI appears to be changing the way chips are made.

A Google research paper published in June described using AI to arrange the components on the Tensor chips that it uses to train and run AI programs in its data centers. Google's next smartphone, the Pixel 6, will feature a custom chip manufactured by Samsung. A Google spokesperson declined to say whether AI helped design the smartphone chip.

"AI lends itself to these problems that have gotten massively complex."

Mike Demler, senior analyst, Linley Group

Chipmakers including Nvidia and IBM are also dabbling in AI-driven chip design. Other makers of chip-design software, including Cadence, a competitor to Synopsys, are also developing AI tools to aid with mapping out the blueprints for a new chip.

Mike Demler, a senior analyst at the Linley Group who tracks chip design software, says artificial intelligence is well suited to arranging billions of transistors across a chip. "It lends itself to these problems that have gotten massively complex," he says. "It will just become a standard part of the computational tool kit."

Using AI tends to be expensive, Demler says, because it requires a lot of cloud computing power to train a powerful algorithm. But he expects it to become more accessible as the cost of computing drops and models become more efficient. He adds that many tasks involved in chip design cannot be automated, so expert designers are still needed.

Modern microprocessors are incredibly complex, featuring multiple components that need to be combined effectively. Sketching out a new chip design normally requires weeks of painstaking effort as well as decades of experience. The best chip designers employ an instinctive understanding of how different decisions will affect each step of the design process. That understanding cannot easily be written into computer code, but some of the same skill can be captured using machine learning.

The AI approach used by Synopsys, as well as by Google, Nvidia, and IBM, uses a machine-learning technique called reinforcement learning to work out the design of a chip. Reinforcement learning involves training an algorithm to perform a task through reward or punishment, and it has proven an effective way of capturing subtle and hard-to-codify human judgment.

The method can automatically draw up the basics of a design, including the placement of components and how to wire them together, by trying different designs in simulation and learning which ones produce the best results. This can speed the process of designing a chip and allow an engineer to experiment with novel designs more efficiently. In a June blog post, Synopsys said one North American manufacturer of integrated circuits had improved the performance of a chip by 15 percent using the software.
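To make that idea concrete, here is a toy sketch, in Python, of a "try candidate designs in simulation and keep what scores best" loop of the kind described above. It is an illustration only: the blocks, nets, grid, and scoring function are invented for the example, and real tools such as DSO.ai use reinforcement learning over vastly larger design spaces rather than the simple random search shown here.

```python
# Toy illustration: search for a block placement that minimizes total
# wire length, keeping the best candidate found so far.
import random

random.seed(0)

BLOCKS = ["cpu", "cache", "dsp", "io"]                 # invented example blocks
NETS = [("cpu", "cache"), ("cpu", "io"), ("dsp", "cache")]  # connected pairs
GRID = 4                                               # place blocks on a 4x4 grid

def random_placement():
    # Assign each block a distinct cell on the grid.
    cells = random.sample([(x, y) for x in range(GRID) for y in range(GRID)],
                          len(BLOCKS))
    return dict(zip(BLOCKS, cells))

def wirelength(placement):
    # Simulated "quality" score: total Manhattan distance between connected blocks.
    return sum(abs(placement[a][0] - placement[b][0]) +
               abs(placement[a][1] - placement[b][1]) for a, b in NETS)

best = random_placement()
for _ in range(1000):                  # try many candidate designs in "simulation"
    candidate = random_placement()
    if wirelength(candidate) < wirelength(best):
        best = candidate               # keep the better-scoring design

print("best placement:", best, "wirelength:", wirelength(best))
```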

Most famously, reinforcement learning was used by DeepMind, a Google subsidiary, in 2016 to develop AlphaGo, a program capable of mastering the board game Go well enough to defeat a world-class Go player.

Continued here:

Samsung Has Its Own AI-Designed Chip. Soon, Others Will Too - WIRED


Cloud Computing: Its Always Sunny in the Cloud – IEEE Spectrum

Posted: July 29, 2021 at 8:59 pm

This is part of IEEE Spectrum's special report: Top 11 Technologies of the Decade

Illustration: Frank Chimero

Just 18 years ago the Internet was in its infancy, a mere playground for tech-savvy frontiersmen who knew how to search a directory and FTP a file. Then in 1993 it hit puberty, when the Web's graphical browsers and clickable hyperlinks began to attract a wider audience. Finally, in the 2000s, it came of age, with blogs, tweets, and social networking dizzying billions of ever more naive users with relentless waves of information, entertainment, and gossip.

This, the adulthood of the Internet, has come about for many reasons, all of them supporting a single conceptual advance: We've cut clean through the barrier between hardware and software. And it's deeply personal. Videos of our most embarrassing moments, e-mails detailing our deepest heartaches, and every digit of our bank accounts, social security numbers, and credit cards are splintered into thousands of servers controlled by dozens (hundreds?) of companies.

Welcome to cloud computing. We've been catapulted into this nebulous state by the powerful convergence of widespread broadband access, the profusion of mobile devices enabling near-constant Internet connectivity, and hundreds of innovations that have made data centers much easier to build and run. For most of us, physical storage may well become obsolete in the next few years. We can now run intensive computing tasks on someone else's servers cheaply, or even for free. If this all sounds a lot like time-sharing on a mainframe, you're right. But this time it's accessible to all, and it's more than a little addictive.

The seduction of the business world began first, in 2000, when Salesforce.com started hosting software for interacting with customers that a client could rebrand as its own. Customers' personal details, of course, went straight into Salesforce's databases. Since then, hundreds of companies have turned their old physical products into virtual services or invented new ones by harnessing the potential of cloud computing.

Consumers were tempted four years later, when Google offered them their gateway drug: Gmail, a free online e-mail service with unprecedented amounts of storage space. The bargain had Faustian overtones (store your e-mail with us for free, and in exchange we'll unleash creepy bots to scan your prose), but the illusion of infinite storage proved too thoroughly enthralling. This was Google, after all: big, brawny, able to warp space and time.

Gmail's infinite storage was a start. But the program's developers also made use of a handy new feature. Now they could roll out updates whenever they pleased, guaranteeing that Gmail users were all in sync without having to visit a Web site to download and install an update. The same principle applied to the collaborative editing tools of Google Docs, which moved users' documents into the browser with no need for backups to a hard drive. "Six years ago, before the launch of Docs, office productivity on the Web wasn't even an idea," recalls Rajen Sheth, a product manager at Google.

Docs thus took a first, tentative bite out of such package software products as Microsoft Office. Soon hundreds of companies were nibbling away.

Adding new features and fixing glitches, it turned out, could be a fluid and invisible process. Indeed, sites like the photo storage service Flickr and the blog platform WordPress continually seep out new products, features, and fixes. Scraping software off individual hard drives and running it in anonymous data centers obliterated the old, plodding cycles of product releases and patches.

In 2008, Google took a step back from software and launched App Engine. For next to nothing, Google now lets its users upload Java or Python code that is then modified to run swiftly on any desired number of machines. Anyone with a zany idea for a Web application could test it out on Google's servers with minimal financial risk. Let's say your Web app explodes in popularity: App Engine will sense the spike and swiftly increase your computing ration.
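As a rough sketch of what "uploading code" to such a platform looks like, the snippet below shows a minimal Python web handler of the kind App Engine's standard environment can run and scale automatically. The framework choice (Flask) and the handler contents are illustrative assumptions rather than details from the article; the platform also expects a small configuration file alongside the code.

```python
# Minimal sketch of a Python web app that a platform like App Engine can
# scale out automatically as traffic grows.
from flask import Flask

app = Flask(__name__)

@app.route("/")
def hello():
    # The platform adds or removes instances serving this handler
    # as request volume changes; the code itself stays the same.
    return "Hello from a scaled-out web app!"

if __name__ == "__main__":
    # Local testing only; in production the platform runs the app for you.
    app.run(host="127.0.0.1", port=8080)
```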

With App Engine, Google began dabbling in a space already dominated by another massive player, Amazon.com. No longer the placid bookstore most customers may have assumed it to be, in 2000 Amazon had begun to use its sales platform to host the Web sites of other companies, such as the budget retailer Target. In 2006 came rentable data storage, followed by a smorgasbord of instances, essentially slices of a server available in dozens of shapes and sizes. (Not satisfied? Fine: The CPU of an instance, which Amazon calls a compute unit, is equivalent to that of a 1.0- to 1.2-gigahertz 2007 Opteron or 2007 Xeon processor.)

To get a flavor of the options, for as little as about US $0.03 an hour, you can bid on unused instances in Amazon's cloud. As long as your bid exceeds a price set by Amazon, that spare capacity is yours. At the higher end, around $2.28 per hour can get you a "quadruple extra large" instance with 68 gigabytes of memory, 1690 GB of storage, and a veritable bounty of 26 compute units.
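For readers curious what requesting that spare capacity looks like in code, here is a minimal sketch using the boto3 SDK. The instance type, price ceiling, and AMI ID are placeholders, not values from the article, and today's Spot market has evolved well beyond the 2011-era prices and instance sizes quoted above.

```python
# Minimal sketch: requesting spare-capacity ("Spot") instances with boto3.
# All identifiers below are placeholders; valid AWS credentials are required.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.request_spot_instances(
    SpotPrice="0.03",            # maximum price you are willing to pay per hour
    InstanceCount=1,
    LaunchSpecification={
        "ImageId": "ami-0123456789abcdef0",   # hypothetical AMI ID
        "InstanceType": "t3.micro",
    },
)
print(response["SpotInstanceRequests"][0]["SpotInstanceRequestId"])
```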

In a sense, the cloud environment makes it easier to just get things done. The price of running 10 servers for 1000 hours is identical to running 1000 machines for 10 hours, a flexibility that doesn't exist in most corporate server rooms. "These are unglamorous, heavy-lifting tasks that are the price of admission for doing what your customers value," says Adam Selipsky, a vice president at Amazon Web Services.

As unglamorous as an electric utility, some might say. Indeed, Amazon's cloud services are as close as we've gotten to the 50-year-old dream of utility computing, in which processing is treated like power. Users pay for what they use and don't install their own generating capacity. The idea of every company running its own generators seems ludicrous, and some would argue that computing should be viewed the same way.

Selling instances, of course, is nothing like selling paperbacks, toasters, or DVDs. Where Google's business model revolves around collecting the world's digital assets, Amazon has more of a split personality, one that has led to some odd relationships. To help sell movies, for example, Amazon now streams video on demand, much like companies such as Netflix. Netflix, however, also uses Amazon's servers to stream its movies. In other words, Amazon's servers are so cheap and useful that even its competitors can't stay away. But to understand what's truly fueling the addiction to the cloud, you'll need to glance a bit farther back in time.

COMPANY TO WATCH: F-Secure, Helsinki, Finland

F-Secure Corp. uses the cloud to protect the cloud. Its global network of servers detects malicious software and distributes protective updates in minutes. To assess a threat, it uses the Internet itself: A widely available application is more likely to be safe than a unique file.

FUN FACT: Transmitting a terabyte of data from Boston to San Francisco can take a week. So the impatient are returning to an old idea, Sneakernet: Put your data on a disc, take it to FedEx, and get it to a data center in a day.

FUN FACT: Dude, where are my bits? In the growing obfuscation of who's responsible for what data, Amazon recently deployed its storefront platform on privacy-challenged Facebook for the first time. The irresistible business case? Selling Pampers diapers.

In the mid-1990s, a handful of computer science graduate students at Stanford University became interested in technologies that IBM had developed in the 1960s and '70s to let multiple users share a single machine. By the 1980s, when cheap servers and desktop computers began to supplant mainframe computers, those virtualization techniques had fallen out of favor.

The students applied some of those dusty old ideas to PCs running Microsoft Windows and Linux. They built what's called a hypervisor, a layer of software that goes between hardware and other higher-level software structures, deciding which of them will get how much access to CPU, storage, and memory. "We called it Disco, another great idea from the '70s ready to make a comeback," recalls Stephen Herrod, who was one of the students.

They realized that virtualization could address many of the problems that had begun to plague the IT industry. For one thing, servers commonly operated at as little as a tenth of their capacity, according to International Data Corp., because key applications each had a dedicated server. It was a way of limiting vulnerabilities because true disaster-proofing was essentially unaffordable.

So the students spawned a start-up, VMware. They started by emulating an Intel x86 microprocessor's behavior in software. But those early attempts didn't always work smoothly. "When you mess up an emulation and then run Windows 95 on top of it, you sometimes get funny results," Herrod, now VMware's chief technology officer, recalls. They'd wait an hour for the operating system to boot up, only to see the Windows graphics rendered upside down or all reds displayed as purple. But slowly they figured out how to emulate first the processor, then the video cards and network cards. Finally they had a software version of a PC: a virtual machine.

Next they set out to load multiple virtual machines on one piece of hardware, allowing them to run several operating systems on a single machine. Armed with these techniques, VMware began helping its customers consolidate their data centers on an almost epic scale, shrinking 500 servers down to 20. "You literally go up to a server, suck the brains out of it, and plop it on a virtual machine, with no disruption to how you run the application or what it looks like," Herrod says.

Also useful was an automated process that could switch out the underlying hardware that supported an up-and-running virtual machine, allowing it to move from, say, a Dell machine to an HP server. This was the essence of load balancing: if one server started failing or got too choked up with virtual machines, they could move off, eliminating a potential bottleneck.

You might think that the virtual machines would run far more slowly than the underlying hardware, but the engineers solved the problem with a trick that separates mundane from privileged computing tasks. When the virtual machines sharing a single server execute routine commands, those computations all run on the bare metal, mixed together with their neighbors' tasks in a computational salad bowl. Only when the virtual machine needs to perform a more confidential task, such as accessing the network, does the processing retreat back into its walled-off software alcove, where the calculating continues, bento-box style.

Those speedy transitions would not have been possible were it not for another key trend: the consolidation of life into an Intel world. Back in virtualization's early days, a major goal was to implement foreign architectures on whatever hardware was at hand, say, by emulating a PowerPC on a Sun Microsystems workstation. Virtualization then had two functions, to silo data and to translate commands for the underlying hardware. With microprocessor architectures standardized around the x86, just about any server is now compatible with every other, eliminating the tedious translation step.

VMware no longer has a monopoly on virtualization (a nice open-source option exists as well), but it can take credit for developing much of the master idea. With computers sliced up into anywhere between 5 and 100 flexible, versatile virtual machines, users can claim exactly the computing capacity they need at any given moment. Adding more units or cutting back is simple and immediate. The now-routine tasks of cloning virtual machines and distributing them through multiple data centers make for easy backups. And at a few cents per CPU-hour, cloud computing can be cheap as dirt.

So will all computing move into the cloud? Well, not every bit. Some will stay down here, on Earth, where every roofing tile and toothbrush seems fated to have a microprocessor of its own.

But for you and me, the days of disconnecting and holing up with one's hard drive are gone. IT managers, too, will surely see their hardware babysitting duties continue to shrink. Cloud providers have argued their case well to small-time operations with unimpressive computing needs and university researchers with massive data sets to crunch through. But those vendors still need to convince Fortune 500 companies that cloud computing isn't just for start-ups and biology professors short on cash. They need a few more examples like Netflix to prove that mucking around in the server room is a choice, not a necessity.

And we may just need more assurances that our data will always be safe. Data could migrate across national borders, becoming susceptible to an unfriendly regime's weak human rights laws. A cloud vendor might go out of business, change its pricing, be acquired by an archrival, or get wiped out by a hurricane. To protect themselves, cloud dwellers will want their data to be able to transfer smoothly from cloud to cloud. Right now, it does not.

The true test of the cloud, then, may emerge in the next generation of court cases, where the murky details of consumer protections and data ownership in a cloud-based world will eventually be hashed out. That's when we'll grasp the repercussions of our new addiction, and when we may finally learn exactly how the dream of the Internet, in which all the world's computers function as one, might also be a nightmare.

For all of IEEE Spectrum's Top 11 Technologies of the Decade, visit the special report.

Read the original:

Cloud Computing: Its Always Sunny in the Cloud - IEEE Spectrum


Cloud Computing Impact On The Gaming Industry | Invision Game Community – Invision Game Community

Posted: at 8:59 pm

Cloud computing is the instant, remote access to computing systems and resources without being actively involved in managing infrastructure. It's a data center made accessible to many users over the Internet. Anyone with access rights can interact with the cloud and retrieve, manage, and download information from anywhere around the world.

Cloud computing services are provided on a pay-as-you-go basis and come with a range of essential features that users can enjoy.

The gaming industry is openly embracing cloud computing technology and is also implementing Gaming as a Service (GaaS). The tremendous processing power of cloud computing enables users to stream video games directly to their devices and run them from remote servers. The cloud handles all the processing requirements for the device. You don't need next-generation hardware to enjoy the latest games.

You can now stream gaming content from a network server. GaaS capabilities are supplied in different forms: local rendering GaaS, remote rendering GaaS, and cognitive resource allocation GaaS.

All you need is low latency and a large bandwidth with minimum response time and high-quality video output. There are many models for the provision of cloud gaming, including a monthly subscription that gives access to an entire library of games, or paying per game you request.

The high costs of gaming equipment usually present shortfalls in the gaming experience. Especially now that cloud computing is deeply established, it's become more expensive to set up shop with physical games.

The number of gamers, plus the total time spent playing and watching video games online, has been rising over the years. An Entertainment Software Association (ESA) report indicates that around 64% of adults in the U.S. regularly play video games.

The scope of cloud computing in the gaming industry has enormous potential to expand. Today, video gaming actively engages about 2.8 billion people worldwide, a number expected to soar beyond 3 billion by 2023. The entire video game industry is on the verge of reaching $189.3 billion in revenues in 2021, while the global gaming market is estimated to reach a value of $256.97 billion by 2025.

Cloud computing is resolving many of the computing challenges faced by both gamers and gaming companies. Hence, it's not surprising that companies like Google and Microsoft decided to launch cloud gaming services (Google Stadia and Project xCloud).

Some realists look beyond the hype and argue that the Internet presents limitations regarding processing speed. But the coming years could bring significant changes and solutions to latency and processing problems.

Ongoing developments are driving us closer to faster adoption of cloud gaming services. The complete rollout of 5G technology will speed up the power of cloud computing and drive further adoption.

Microsoft established Project xCloud, which aims to enhance the gaming experience across multiple devices, and it launched cloud gaming (beta) for Xbox Game Pass Ultimate members in 2020.

Sony initially made its cloud computing gaming debut in 2014 when it launched PlayStation Now, having acquired a leading interactive cloud gaming company back in 2012. It successfully established its place in the world of cloud-based gaming. Although Sony remained unchallenged for years, there are now more companies expanding investments into the field.

Google has invested in the development of Stadia, a video game platform designed to provide instant access to video games regardless of the screen type, giving you the capacity to play 4K games on your TV without a console.

You stream the games through a browser on a laptop or your phone.

EA established Project Atlas in 2018 to leverage cloud computing and artificial intelligence in enabling game developers to access services at optimal capacity, with an easy-to-use experience.

Amazon, the leading cloud service provider, also launched a cloud computing gaming service, Luna, which harnesses the extensive cloud capacity of AWS. Amazon is also establishing a new gaming channel in collaboration with global video game developer Ubisoft. You can access the massive library of games by subscription.

Nvidia has been actively building cloud gaming solutions for many years. The evidence of its research and development lies in the release of GeForce Now. In February 2020, GeForce Now became accessible to everyone.

Nvidia also collaborated with Tencent to establish PC cloud gaming in China.


See the original post here:

Cloud Computing Impact On The Gaming Industry | Invision Game Community - Invision Game Community


Amazon Web Services is getting ready to retire one of its oldest cloud computing services – ZDNet

Posted: at 8:59 pm

In the coming months, Amazon Web Services (AWS) will shut down one of its oldest cloud computing infrastructure services, EC2-Classic, and is warning remaining users to move off the service to avoid application downtime.

"EC2-Classic has served us well, but we're going to give it a gold watch and a well-deserved sendoff,"writes AWS evangelist Jeff Barr.

EC2-Classic arrived with the original release of Amazon EC2, but it was not supported for accounts created after April 2013, at which point users were required to launch EC2 instances in a virtual private cloud (VPC) -- a logically separated section of AWS.

With EC2-Classic, instances run in a single, flat network that is shared with other customers. EC2-Classic required public IP addresses made available at the time, or tunneling, to communicate with AWS resources in a VPC.

There are some deadlines coming up for any business still on EC2-Classic, but Barr says the process will be gradual.

"Rest assured that we are going to make this as smooth and as non-disruptive as possible. We are not planning to disrupt any workloads and we are giving you plenty of lead time so that you can plan, test, and perform your migration," he notes.

Key dates to keep in mind are October 30, 2021, and August 15, 2022.

On October 30, AWS will disable EC2-Classic in Regions for AWS accounts that have no active EC2-Classic resources in the region. On that date, AWS will also stop selling 1-year and 3-year Reserved Instances for EC2-Classic.

By August 15, 2022, AWS reckons all migrations will be done and that all EC2-Classic resources will have been extinguished from AWS accounts.

AWS has flagged several key resources that EC2-Classic customers will need to keep an eye on as they plan their migration.

It could be tricky finding all services dependent on EC2-Classic resources, so AWS has released the EC2 Classic Resource Finder script to help locate EC2-Classic resources in an account.
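The official Resource Finder is a script published by AWS that covers many resource types. As a simplified illustration of the underlying idea only, the boto3 sketch below flags instances that lack a VpcId, since instances launched in EC2-Classic have no VPC association; region handling and the broader resource coverage of the real tool are omitted.

```python
# Simplified stand-in for the idea behind the EC2 Classic Resource Finder:
# list instances in the current region that have no VPC association.
import boto3

ec2 = boto3.client("ec2")

classic_instances = []
paginator = ec2.get_paginator("describe_instances")
for page in paginator.paginate():
    for reservation in page["Reservations"]:
        for instance in reservation["Instances"]:
            if "VpcId" not in instance:   # no VpcId => likely EC2-Classic
                classic_instances.append(instance["InstanceId"])

print("Likely EC2-Classic instances:", classic_instances or "none found")
```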

It's also offering the AWS Application Migration Service (AWS MGN) to help customers migrate instances and databases from EC2-Classic to VPC.

EC2-Classic customers should note that disabling it in a region is meant to be a "one-way door", but Barr says users can contact AWS Support if they need to re-enable EC2-Classic for a region.

View post:

Amazon Web Services is getting ready to retire one of its oldest cloud computing services - ZDNet


Students will benefit from new cloud computing pathway – Brunswick News

Posted: at 8:59 pm

Technology has changed many aspects of the way we live. It's changed everything from how we communicate with each other to how we shop for goods and services.

It would make sense that as our technological society continues to move forward, new tech will infiltrate the workplace. That means workers will need to learn new skills to stay ahead of the ever-evolving technological landscape. A recent announcement from the State Board of Education shows how schools in Georgia are working to make sure today's students have the opportunity to learn these skills.

The state board recently approved a recommendation from State School Superintendent Richard Woods to add a new career pathway in cloud computing, according to a report from Capitol Beat News Service. Three courses (introduction to software technology, computer science principles and cloud computing) will be a part of the pathway.

A lot of people have probably heard of the term cloud computing, but they may not know what it entails. In general, the term refers to delivering services through the internet such as data storage. When you back up your photos or data to the cloud, you are using a system built off the skills students will learn in this pathway.

Adding this pathway as an option for high schoolers in the state is a no-brainer. Cloud computing is one of the most in-demand hard skills employers are looking for, according to professional networking and employment website LinkedIn. In fact, Capitol Beat reported that there are currently more than 4,000 cloud computing-related job openings in the state.

The curriculum for the courses is also being developed with feedback from some of the biggest technology firms in the world, such as Amazon Web Services, Google and Microsoft. Students will get the chance to learn cloud computing skills from a program designed with input from the firms most responsible for the leaps in technology we use every day.

Students who start down this pathway could one day come up with the next great technological invention. Even if they don't become the next Bill Gates, they will have the skills to find a job in a field that could keep growing as we become even more technologically advanced.

The goal of high school is not only to educate and assist the development of our youth, but also to make sure they have the best possible chance to succeed when they graduate.

This cloud computing pathway is just another tool to help complete the mission.

Excerpt from:

Students will benefit from new cloud computing pathway - Brunswick News


2021 Thematic Research into Cloud Computing in Healthcare – Featuring Amazon, Microsoft and Google Among Others – ResearchAndMarkets.com – Business…

Posted: at 8:59 pm

DUBLIN--(BUSINESS WIRE)--The "Cloud Computing in Healthcare, 2021 Update - Thematic Research" report has been added to ResearchAndMarkets.com's offering.

Healthcare providers are extremely cost-conscious because they are under constant pressure to improve patient care while maintaining profitability. Cloud solutions support this by reducing the costs of in-house IT infrastructure. Cloud computing also greatly reduces the time required to deploy software, which can take months in on-premises deployments. Cloud software deployment and updates can be conducted remotely and typically very quickly, so employees can spend less time waiting and be more productive. Major categories of cloud solutions include infrastructure as a service (IaaS), platform as a service (PaaS), and software as a service (SaaS).

In the healthcare industry, there is much concern for protecting patients' personally identifiable information (PII) as records include all forms of personal data, including name, patient number, addresses, next of kin, and detailed health information. When data privacy is breached, a healthcare company faces legal liability and penalties for regulatory noncompliance. Healthcare is one of the biggest targets of ransomware attacks, in which hackers infect a computer system and demand payment to restore it. One of the best-known examples is the 2017 WannaCry attack, which affected over 200,000 computers in over 150 countries, costing $126m in the UK alone and up to $7.9bn globally. As cloud adoption grows, developers are increasingly aware of security risks and how to combat them. Security measures for cloud include minimizing attacks, controlling logins, and improving data encryption.
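As a small illustration of the "improving data encryption" measure mentioned above, the sketch below encrypts a patient record before it is written to storage, using the third-party Python cryptography package. The field names and key handling are simplified assumptions; a production system would keep keys in a managed key store rather than generating them inline.

```python
# Minimal sketch: encrypting a patient record at rest before storage.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice, retrieved from a key management service
fernet = Fernet(key)

record = b'{"patient_number": "12345", "name": "Jane Doe"}'  # invented example fields
ciphertext = fernet.encrypt(record)  # this is what gets written to storage

assert fernet.decrypt(ciphertext) == record
print("encrypted record:", ciphertext[:32], "...")
```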

This report explores the theme of cloud computing in healthcare, through coverage of healthcare challenges, players, market size and more.


For more information about this report visit https://www.researchandmarkets.com/r/vbq26w

Link:

2021 Thematic Research into Cloud Computing in Healthcare - Featuring Amazon, Microsoft and Google Among Others - ResearchAndMarkets.com - Business...


Now is the time for a federal cloud modernization moonshot – Brookings Institution

Posted: at 8:59 pm

Now is the time to launch a Federal Cloud Modernization moonshot to modernize all practical legacy civilian IT systems within a decade. COVID vividly demonstrated the importance of our IT systems to a resilient and robust economy. Yet from security breaches to delayed tax processing, the weaknesses of government IT systems are well known.

In ITIF's report "Secrets from Cloud Computing's First Stage," I show how cloud computing offers a better way to modernize federal IT. This will bring improved citizen services, lower operating costs, and, as repeated security breaches highlight the need for, better cybersecurity. The initiative should be led by the Federal Chief Information Officer (CIO) Council, the White House Office of Management and Budget (OMB), and the Federal Chief Technology Officer (CTO), with deep engagement by agencies and support from Congress. The Technology Modernization Fund (TMF) can serve as a starting point, with IT modernization funding targeting $10 billion a year.

This will involve modernizing thousands of systems. We will need to develop new agile migration methodologies and stand up migration factories. The U.S. Digital Service can play an important role here. Leadership must also rally the federal IT industry partner community to implement system migrations at scale. It's not just funding: the initiative needs a robust program office and careful governance. Ten years is arguably too long, but even so this will be a challenge to achieve; once it starts showing success, lessons learned should be applied to modernize state and local government IT.

Cloud better enables the government missions and programs that the American public depends on. Cloud computing is a powerful platform that provides hundreds of IT services with a common architecture, security model, development tools, and management approach. This now provides a better way to automate and scale modernization in a more repeatable fashion. Cloud computing has 31% lower operational costs than comparable on-premises infrastructure, and even greater savings when people and downtime costs are included. Elsewhere, I show that cloud is a more flexible and automated system that enables rapid changes to new demands. This provides better, more reliable citizen services, whether they be innovative public-facing websites, faster payment processing, or veterans' health care scheduling. Moreover, cloud provides substantially stronger security that is built into the platform by design. While the recent cybersecurity executive order makes important process and policy changes, the systems and code still need to be modernized.

The initiative should be led by the Federal CIO Council, OMB, and the Federal CTO. Agency and department leadership, in addition to CIOs, need to be deeply engaged to support IT modernization. DHS Cybersecurity and Infrastructure Security Agency should be an integral partner. Congress will need to support funding and will expect transparency. The CIO Council should set a baseline of systems to modernize and then set measurable, agency-specific outcome goals such as the number of target applications, servers, and petabytes of data moved, cost savings, and priority programs supported. Federal CIOs will need to prioritize all major systems and provide plans to move them in smaller stages, learning along the way. A cloud modernization moonshot program office is crucial to manage the program and should issue public progress reports at least bi-annually, in addition to managing ongoing performance metrics, timelines, and cost savings.

The U.S. federal government is the largest technology buyer, spending well over $100 billion a year on IT. However, we need to get out of the trap where annual appropriations only pay for ongoing operations, leaving little funds to move to lower-cost, more capable systems. As Congress and the President negotiate an infrastructure modernization package, digital infrastructure and Federal IT should be included. The federal Technology Modernization Fund provides funding and expertise to upgrade and transform IT systems. Now is the time to build on the lessons learned and scale it. Funding should be increased to roughly $10 billion a year, or roughly 10% of the federal IT budget. This is substantial but would place the government at the low end of the target share of IT spending dedicated to modernization. The TMF repayment requirement should be aggressively lowered for moonshot projects, with more funding for the most important systems. The OMB, in consultation with Congress, should develop criteria for funding. Funding can prioritize target public domains including health, education, security, and benefit payments and fraud.

The federal government relies on federal IT-focused companies to provide IT services, and private industry will be integral to moving thousands of systems to the cloud. New migration methodologies will be needed to move workloads at this scale, with attention to related mission work-flows and governance. The U.S. Digital Service has an important role to play here. Migration factories that move IT systems and data in more standardized, repeatable processes will be needed. Moving to the public cloud should be the desired default choice due to its better cost, operational flexibility, and agility. However, private clouds for sensitive data and on-premises modernization remain options where appropriate. They should require specific justification and approval, with criteria developed by OMB and the Federal CIO council.

Earlier Cloud First and Cloud Smart policies helped start the federal move to cloud. Ten years later, it's time to build on them with additional action. The goal is ambitious. Yet the federal Data Center Optimization Initiative, for example, targeted closing ~10% of federal data center square footage a year, and included goals such as cost savings, server utilization, and energy efficiency. For sure there will be setbacks along the way. But the initiative should learn from these and course-correct. A parallel initiative at the Department of Defense for national security systems could follow. Lessons from the federal level should then be applied to a state and local government modernization initiative. We are moving to a digital economy to generate growth, resiliency, and improve social opportunities. A robust government IT capability is integral to this progress.

Amazon is a general, unrestricted donor to the Brookings Institution. The findings, interpretations, and conclusions posted in this piece are solely those of the authors and not influenced by any donation.

Follow this link:

Now is the time for a federal cloud modernization moonshot - Brookings Institution


Microsoft beats Wall Street expectations on soaring demand for cloud computing – City A.M.

Posted: at 8:59 pm

Microsoft beat earnings expectations and posted a rise in revenue for the fourth quarter, as the ongoing shift to remote working bolstered demand for its cloud computing services.

Microsoft reported a 21 per cent rise in revenue to $46.2bn in the three months ending 30 June, beating analysts' consensus estimates of $44.24bn, according to IBES data from Refinitiv.

Operating income jumped by 42 per cent over the period to $19.1bn and net income increased by 47 per cent to $16.5bn.

The computing giant reported earnings per share increased by almost half to $2.17, ahead of analysts' expectations of $1.92.

The shift to remote working has seen demand for Microsoft's cloud services soar, and today it said revenue in its intelligent cloud division jumped 30 per cent to $17.4bn over the quarter, with 51 per cent growth in its Azure service.

"We are innovating across the technology stack to help organizations drive new levels of tech intensity across their business," said Satya Nadella, chairman and chief executive officer of Microsoft.

"Our results show that when we execute well and meet customers' needs in differentiated ways in large and growing markets, we generate growth, as we've seen in our commercial cloud and in new franchises we've built, including gaming, security, and LinkedIn, all of which surpassed $10 billion in annual revenue over the past three years."

Microsoft shares were down 2 per cent on the news.

"Microsoft's products look set to generate reliable cash flows for years to come," said Steve Clayton, a fund manager at Hargreaves Lansdown.

"Their Azure cloud computing division is the number 2 global player and is growing like Topsy. Microsoft may be huge, but it is still growing at pace, as these figures demonstrate so clearly."

"Few things are as valuable as cash generative businesses with dominant market positions in growing markets. Microsoft fits the bill perfectly, especially with a rising proportion of its revenues coming from recurring sources, like Office 365," Clayton said.

The stock dip in after-hours trading was not surprising, Clayton added, given that some investors may have been looking for even higher growth rates from Azure and the wider Nasdaq market had already taken a 1.2 per cent pasting today.

Here is the original post:

Microsoft beats Wall Street expectations on soaring demand for cloud computing - City A.M.


Swizznet Selected as Sage Strategic Cloud Hosting Provider for Construction and Real Estate Industry in the United States – Webster County Citizen

Posted: at 8:59 pm

CHESTERFIELD, Mo., July 29, 2021 /PRNewswire-PRWeb/ --Swizznet, a cloud-based hosting solutions company for small- and medium-sized businesses, is pleased to announce it has been named a Sage Partner Cloud provider for the commercial real estate industry in the United States. The new partnership means that Sage clients can have an easier transition to the cloud with Swizznet and can keep the products they currently use.

Sage is the global market leader for technology that provides small- and medium-sized businesses with the visibility, flexibility and efficiency to manage finances, operations and people. The company's Partner Cloud program, which launched in December 2020, enables select partners to become managed services providers for their customers.

The program in the United States includes Sage 100 and Sage 300, as well as Sage 100 Contractor and Sage 300 Construction and Real Estate (CRE).

"Moving to the cloud is no longer a question of 'if,' it's a matter of 'when' for real estate and construction firms that want to grow and succeed," said Bob Hollander, President and Chief Executive Officer of Swizznet. "We're thrilled to help companies smoothly and seamlessly transition to the cloud with Sage's business solutions and support them as they evolve and compete."

As a Sage Partner Cloud provider, Swizznet offers construction and real estate firms in the United States the tools, expertise and resources needed to customize and deploy Sage's business management solutions on the Microsoft Azure platform.

"As the demand for cloud solutions in the construction industry has increased, we want to provide our customers with a flexible option to move their current Sage solutions to the cloud at their own pace, without disruption," said Dustin Stephens, vice president of Sage Construction and Real Estate. "We are pleased to have our trusted partner Swizznet join the Sage Partner Cloud program to deploy Sage 300 Construction and Real Estate and Sage 100 Contractor in the cloud."

Swizznet's relationship with Sage began in 2014, first as an authorized hosting partner and later becoming a development partner for Sage construction and real estate.

Swizznet offers hosting solutions that empower businesses to free themselves from in-house infrastructure and IT so that they can connect and collaborate from any computer or device. The company is a Sage Partner Cloud program member, an Intuit-authorized commercial hosting provider and a Microsoft cloud solution provider. Swizznet offers an on-demand marketplace, using the latest cloud computing technology and tools to provide a superior user experience and deliver the fastest, most secure and reliable cloud access to Sage and QuickBooks desktop applications. The company is committed to providing clients with 100% US-based, 24/7/365 Obsessive Support and service for the ultimate cloud accounting solution. For more information, visit https://www.swizznet.com.

Sage is the global market leader for technology that provides small and medium businesses with the visibility, flexibility and efficiency to manage finances, operations and people. With our partners, Sage is trusted by millions of customers worldwide to deliver the best cloud technology and support. Our years of experience mean that our colleagues and partners understand how to serve our customers and communities through the good, and more challenging times. We are here to help, with practical advice, solutions, expertise and insight. For more information, visit http://www.sage.com/.

Read more from the original source:

Swizznet Selected as Sage Strategic Cloud Hosting Provider for Construction and Real Estate Industry in the United States - Webster County Citizen

