
Category Archives: Cloud Computing

Microsoft cloud storage: is OneDrive or Azure right for your business? – ITProPortal

Posted: September 27, 2021 at 5:14 pm

Microsoft is one of the best cloud storage providers, and offers some of the best cloud storage for business too. But before you can dive into Microsoft cloud storage, you have to choose between two different products: OneDrive and Azure.

Microsoft OneDrive is a file storage service that integrates with the Microsoft 365 productivity suite. Employees can collaborate in real time on documents in either web or desktop versions of apps like Word, Excel, and PowerPoint. Plus, employees get their own cloud storage vault where they can keep their files.

Microsoft Azure, on the other hand, is a comprehensive cloud hosting service that enables you to store files, run servers in the cloud, and much more. It's best suited for developing software, running big data analyses, or handling massive databases for the apps your business runs on. Azure storage is more expensive than OneDrive, and doesn't integrate with Microsoft 365 apps.

In this guide, we'll cover how OneDrive and Azure work with productivity tools like Microsoft 365, how file sharing and collaboration work, and how data is stored in the cloud on the two platforms. Whereas OneDrive is best for businesses that simply want to enable cloud storage and file sharing using Microsoft 365 apps, Azure is best for businesses that want a scalable and highly advanced cloud computing platform.

We compared Microsoft OneDrive and Azure not just on their file storage capabilities, but also on the totality of what these two services are capable of. We'll cover the following topics:

It's relatively easy to get started with OneDrive. You can add employees to your business account using their email addresses, and each employee gets their own dashboard. The dashboard is accessible on the web, as a desktop app, or as a mobile app.

OneDrive also integrates with the Windows File Explorer, enabling you to access your OneDrive cloud storage alongside files stored locally on your computer.

Azure is a bit more complicated to set up. Once you sign up, you can access the dashboard, called the Azure Portal, over the web or through desktop and mobile apps.

The Azure Portal starts out empty, but there's a list of all the services available in Azure on the left-hand side of the screen. You can add modules from this list to the Portal to build a custom dashboard and access Azure's tools. You'll need to find the Azure Active Directory module and use that to add employees to your business's Azure deployment.

Within the Portal, you can set up custom dashboards that display your virtual machines, storage systems, and more. To create a new cloud storage space, you can browse the available storage types in the left-hand services menu, and launch a new storage container from there.

It's tricky to compare pricing between OneDrive and Azure because they use completely different pricing schemes. OneDrive charges a flat monthly fee per user, starting from $5 a month for 1TB of storage per user.

Azure operates on a pay-as-you-go basis. You're charged based on the amount of data you have stored, what type of cloud servers your data is stored on, and how often you make changes to your files. To give a rough estimate, storing 1TB of data with Azure typically costs around $20 a month.

You can also reserve storage with Azure for up to three years at a time. This results in a larger upfront bill, but Microsoft offers discounts of up to 38% for these extended contracts.
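As a back-of-the-envelope comparison of the two pricing models described above, the sketch below uses the article's rough figures ($5 per user per month for OneDrive, roughly $20 per terabyte per month for Azure, and up to 38% off for reserved capacity). These are illustrative estimates, not official rate-card prices:

```python
# Illustrative monthly-cost sketch using the article's rough figures
# (assumed estimates, not official Microsoft pricing).
ONEDRIVE_PER_USER = 5.00        # flat $/user/month, includes 1TB per user
AZURE_PER_TB = 20.00            # rough pay-as-you-go $/TB/month estimate
AZURE_RESERVED_DISCOUNT = 0.38  # up to 38% off for multi-year reservations

def onedrive_cost(users: int) -> float:
    """Flat per-user pricing: cost scales with headcount, not data volume."""
    return users * ONEDRIVE_PER_USER

def azure_cost(tb_stored: float, reserved: bool = False) -> float:
    """Usage-based pricing: cost scales with data stored; reservations discount the rate."""
    rate = AZURE_PER_TB * (1 - AZURE_RESERVED_DISCOUNT) if reserved else AZURE_PER_TB
    return tb_stored * rate

print(onedrive_cost(10))              # 10 users (up to 10TB total): $50/month
print(azure_cost(10))                 # 10TB pay-as-you-go: ~$200/month
print(azure_cost(10, reserved=True))  # 10TB reserved: ~$124/month
```

The crossover is clear: for teams that actually use their per-user terabyte, OneDrive's flat fee is far cheaper per terabyte, while Azure's model only pays off when you need its compute and flexibility rather than raw storage.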

One of the major differences between OneDrive and Azure is how they integrate with Microsoft 365, Microsoft's productivity suite, which includes apps like Word, Excel, PowerPoint, Teams, and Outlook.

OneDrive is baked into all Microsoft 365 apps. In any of these apps, you have the option to save files directly to your OneDrive storage, and you can set up syncing so that changes are saved to the cloud in real time.

Even better, it's possible for multiple people to work on a file simultaneously if it's stored in the cloud. That's true whether you want to use Microsoft's online office apps or the desktop versions of apps like Word and Excel. So, multiple employees can collaborate on a document without running the risk that multiple divergent copies will be created.

Azure, on the other hand, doesn't offer special integrations with Microsoft 365 apps. You'll need to save files created in Microsoft 365 to a storage container in Azure manually. Alternatively, you can work on a Microsoft 365 deployment inside the Azure cloud, and save files directly to Azure's storage containers, but this requires you to first set up a virtual machine in the cloud.

In addition, you must be online at all times to use Microsoft 365 apps with Azure, whereas you can work offline with OneDrive, and files will save to the cloud automatically when you reconnect.

Microsoft Azure is a comprehensive cloud computing platform, not just an online cloud storage service. In fact, cloud storage in Azure is designed to interface with the virtual machines, workflows, and software development environments you create on the platform. This enables you to run big data analyses or to access information databases when running applications in the cloud.

Some of the features Azure offers in this respect include virtual Windows and Linux machines with scalable computing power, premade AI models for analyzing data, and developer tools for building apps in the cloud. Azure also offers a content delivery network and tools for enabling single sign-on for your employees.

OneDrive, in contrast, is simply a cloud storage service. You can move files around in the cloud, but that's it. Any computing tasks that cannot be done using the online Office 365 suite must be done on your local network.

The process for backing up files is somewhat different between OneDrive and Azure.

With OneDrive, there are no choices that you need to make about how your files are backed up. Microsoft automatically stores your data in multiple data centers for redundancy. Files are saved in hot storage, meaning that they can be accessed from anywhere in the world immediately.

OneDrive has a service level agreement (SLA) of 99.9%, meaning that Microsoft commits to no more than roughly 43 minutes of downtime a month.

With Azure, you have a number of choices to make about how to store your data. Azure offers the Files module for storing file libraries, the Blob module for storing unstructured data and SQL databases, and the Tables module for storing NoSQL databases. On top of that, you'll need to decide whether your data is stored "hot" for active use or "cold" for archival purposes.

You also get to choose the primary data center region your data is stored in with Azure. Of course, all data is backed up to multiple data centers for redundancy. Azure comes with an SLA of 99.99%, which allows for only about four and a half minutes of downtime per month.
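The downtime figures implied by an uptime SLA are simple arithmetic. A quick sanity check, assuming a 30-day month (actual SLA terms define the measurement window and exclusions in their own fine print):

```python
# Maximum downtime implied by an availability SLA, assuming a 30-day month.
def allowed_downtime_minutes(sla_percent: float,
                             period_minutes: float = 30 * 24 * 60) -> float:
    """Return the downtime (minutes) permitted by an uptime SLA over the period."""
    return (1 - sla_percent / 100) * period_minutes

print(allowed_downtime_minutes(99.9))   # OneDrive's 99.9%: about 43.2 minutes/month
print(allowed_downtime_minutes(99.99))  # Azure's 99.99%: about 4.3 minutes/month
```

Each extra "nine" of availability cuts the permitted downtime by a factor of ten, which is why the jump from 99.9% to 99.99% matters for latency-sensitive workloads.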

Both OneDrive and Azure enable you to share files both inside and outside your organization.

With OneDrive, file sharing is relatively simple. You can select any file or group of files and share them via a link or invitation. In addition, you can password-protect shared files or set an expiration date on sharing links. Administrators have the option to limit sharing of certain types of files outside your organization.

With Azure, files are by default accessible to anyone with access to your business's Azure Portal. It is not possible to create personal, employee-specific file storage systems within Azure. However, you can control access to files using the Azure Active Directory module. This module also lets you invite guest users from outside your business to access specific files for collaboration.

If OneDrive and Azure aren't a fit for your business, there are plenty of alternatives to choose from. However, you'll still need to make the same decision as to whether your business needs a cloud storage platform like OneDrive or a cloud computing platform like Azure.

OneDrive's main competitor is Google Drive, which is part of Google Workspace. Just as OneDrive integrates with Microsoft 365, so Google Drive integrates with Google productivity apps like Docs, Slides, Gmail, and Calendar. However, it's worth pointing out that Google Workspace only includes web-based apps, whereas Microsoft 365 apps are available in both web and desktop versions.

Google Workspace plans start at $6 per user a month, and include 30GB per user, so you won't get as much storage per dollar as with OneDrive. The main reason to go with Google Workspace is if you prefer Google's productivity suite to Microsoft's. You can find out more about this cloud storage service in our Google Drive review.

An alternative to Azure for cloud computing is Amazon Web Services (AWS). AWS is years ahead of Azure in deploying advanced computing resources, including quantum computing applications. It also offers ultra-cheap storage through AWS Glacier, a service for long-term storage of rarely accessed data. For businesses with a lot of data, AWS can be cheaper than Azure, and offers a wider variety of big data analysis pipelines.

OneDrive and Azure are very different Microsoft cloud storage platforms that serve different purposes. OneDrive is a file storage platform that integrates seamlessly with Microsoft 365 productivity apps. Azure is a cloud computing platform that's designed to facilitate big data analysis, software development, and cloud server deployment.

For businesses that simply want to store files in the cloud to promote collaboration, and reduce dependence on physical hard drives, OneDrive is likely to be the better choice. It's very easy to use, and if your business uses Microsoft 365 apps like Word, Excel, Outlook, or Teams, you likely already have access to OneDrive. OneDrive makes file sharing simple, and employees can even edit files at the same time.

For businesses that currently operate in the cloud or want to take advantage of cloud computing, Azure may make more sense. While it's more complicated to set up than OneDrive, Azure makes your data available to applications and workflows running in the cloud. With Azure, you can flexibly access as much computing power as you need to analyze big data or develop your own custom applications.

OneDrive has almost everything you want in a cloud storage platform. It's affordable and highly secure, with robust encryption frameworks. Business customers also get access to a wide range of compliance and auditing capabilities...Deep Microsoft 365 integration makes OneDrive perfect for working online, and it is our top pick for businesses wanting a premium digital workspace and communication ecosystem. Score: 4/5

Choosing between Microsoft OneDrive and Azure can be a fork in the road for your business. Ultimately, which is better comes down to whether you just want cloud storage for business or a full-fledged cloud hosting service. If you're thinking about migrating your entire business to the cloud, make sure you set yourself up with a roadmap for success.


Stocks making the biggest moves in the premarket: Alphabet, Tesla, Gores Guggenheim and more – CNBC

Posted: at 5:14 pm

Take a look at some of the biggest movers in the premarket:

Alphabet (GOOGL) Alphabet's Google unit will cut the commissions it collects on third-party software sales in its Cloud Marketplace, according to a person familiar with the matter who spoke to CNBC. Google will now collect just 3% of sales, down from the prior 20%.

Tesla (TSLA) Tesla rolled out a software update that allows customers to request access to its Full Self-Driving beta software. Access will be granted to Tesla drivers who get a sufficiently high safety score.

Gores Guggenheim (GGPI) The special purpose acquisition company will take electric car maker Polestar public through a merger, at a valuation of $20 billion including debt. Polestar is controlled by car maker Volvo and its parent Zhejiang Geely Holding Group. Gores rose 2.4% in premarket trading.

Acceleron Pharma (XLRN) Acceleron is in talks to be acquired by an unidentified large pharmaceutical company for about $180 per share, according to people familiar with the matter who spoke to Bloomberg. Bristol-Myers Squibb (BMY) is considered one potential candidate, as it already owns an 11.5% stake in Acceleron.

Box (BOX) Box was upgraded to "market outperform" from "market perform" at JMP Securities, which cited the cloud computing company's execution among other factors. Box added 2.2% in the premarket.

Altice USA (ATUS) The broadband and video company was downgraded to "neutral" from "outperform" at Credit Suisse, which notes the likely short-term negative impact from an aggressive fiber buildout strategy. Altice USA slid 1.8% in premarket action.

Toyota Motor (TM) The automaker's shares rose 1.3% in the premarket after the company said it had completed a 25.8 million share buyback.

Best Buy (BBY) The electronics retailer was named a "top idea" at Piper Sandler, which is enthusiastic about the upcoming rollout of Best Buy's new "Best Buy Total Tech" membership program.

Gannett (GCI) The USA Today publisher said it was seeking to refinance up to $550 million in senior secured debt. Gannett said its plan was subject to market conditions and that there is no assurance it will be able to execute the refinancing.


Infrastructure and ‘talent’ why Amazon selected Auckland for $7.5b investment – Stuff.co.nz

Posted: at 5:14 pm

Amazon Web Services decided to base cloud computing data centres in Auckland because of the city's telecommunications connections and skilled workforce, its country manager says.

Amazon's cloud computing arm, Amazon Web Services (AWS), will spend $7.5 billion over 15 years building world-class computing infrastructure in Auckland, the company announced on Thursday.

AWS New Zealand country manager Tim Dacombe-Bird said the country would join 25 other territories in which the company had established cloud computing data centres.

The company would build a cluster of at least three data centres in the city, he said.


Explaining the decision to locate all the infrastructure in Auckland, rather than elsewhere in the country, Dacombe-Bird said AWS took a number of factors into consideration when picking sites.

Its three availability zones needed to be geographically isolated with separate infrastructure connections but still close enough together that there wasn't a lot of latency between them, he said.

"So we need internet transfer points and we need the networking or telecommunications infrastructure, and we were able to find that in Auckland."

"The other thing we need is the availability of talent," he said. "Auckland has a large pool of top-quality technology talent."

The $7.5b investment would cover the construction and fitting out of the data centres and salaries, Dacombe-Bird said.

A proportion of the spending would be on imported computer equipment.

But AWS estimated the investment would create 1000 jobs and contribute $10.8b to New Zealand's GDP over the next 15 years.


Microsoft is also investing in cloud computing data centres in New Zealand, although there is no indication that will be on the same scale.

AWS's other major global cloud-computing rival, Google, has instead elected to continue servicing customers in the region from Sydney and Melbourne.

Remi Galasso, co-founder of Datagrid, which plans to build a large data centre with associated subsea cable infrastructure near Invercargill at a cost of about $700m, said AWS's investment was great news for the country.

"New Zealand is definitely becoming attractive to cloud providers," he said.

"Auckland is obviously the first chosen location, but I am sure other data centre locations will flourish in the near future, especially the ones with access to renewable energy."


Dacombe-Bird said AWS had not sought or been offered any government incentives for its investment.

Minister for the Digital Economy and Communications David Clark said the investment demonstrated the high level of confidence the international business community has in backing New Zealand's economy.

"Cloud-based technologies are generally accepted now as being the way to work and innovate digitally," he said.

"This will create job opportunities for industries like our construction sector, and bring long-term benefits as we see the ICT sector and local innovators significantly grow into the future."

National Party digital economy and communications spokeswoman Melissa Lee also welcomed the investment, describing it as "a massive confidence boost to our digital economy".

Cloud computing data centres are used in part to house the computers running common software applications, for example by Google to run its Google Apps and by Microsoft for Office 365.

In AWS's case, its data centres are used by thousands of third-party software companies such as Xero to provide their services to customers.

They are also used by businesses to run their own particular software applications and by technology start-ups and others to develop new IT services, for example by training machine-learning algorithms.

The Auckland data centres will also be available to companies outside the country that want to use them as a back-up, Dacombe-Bird said.

Aside from the injection into the economy from building the facilities, data centres can also make software applications slightly faster to run in the countries where they are built, by shortening the distance between computer users and the online services they are accessing.

Dacombe-Bird said some AWS customers also wanted to have the facilities locally for privacy reasons, so that their customers' data need not leave the country.

AWS's nearest data-centre facilities at the moment are in Sydney.


Dacombe-Bird said the facilities AWS was building would start to come online from 2024.

The investment would "unleash further innovation, drive greater productivity, increase our skilled workforce, and truly position New Zealand at the forefront of digital commerce for generations of Kiwis to come", he said.

Discussions with power companies for the electricity to supply the data centres were well-advanced, he said.

AWS has committed to power all its operations, including its data centres, with 100 per cent renewable energy by 2025.

Simon McKenzie, chief executive of Auckland lines company Vector, which has a partnership with AWS, said the investment was "excellent news for AWS but more importantly New Zealand".

"The scale and nature of this AWS investment and commitment to New Zealand will encourage and build confidence for increased levels of innovation across all sectors," he said.

"A thriving tech sector benefits us all. It adds diversity to our economy, job opportunities, and keeps talent on our shores, while allowing companies to scale globally from New Zealand."


Alex Burke, chief executive of Dunedin-based education technology company Education Perfect, which uses AWS to host its cloud software, said AWS's investment would help it attract top tech talent to New Zealand and encourage young people to pursue careers in technology.

"We will look to build on the 200 people we have in the business already by hiring an additional 100-plus engineers over this upcoming year," he said.

"It's important to state that this investment, whilst in Auckland, will positively impact the whole country," he said.

But AWS's investment would appear to have mixed implications for some local companies that already play in the technology infrastructure space, including Spark.

Spark chief financial officer Stefan Knight said in April that its revenues from providing IT and managed services grew 10 per cent last year to $1.1b, making up 31 per cent of its overall sales.

Spark customer director Grant McBeath welcomed AWS's investment, saying Spark was well-positioned to support customers with their cloud, security, and managed service needs as AWS grew its offering in New Zealand.

"We believe that hybrid cloud, private and public options, serve businesses well and we can work with our customers to identify the optimal mix and manage migrations," he said.


Cloud computing transformation in banking risk – McKinsey

Posted: September 20, 2021 at 8:42 am

On May 11, 1997, when IBM's chess-playing supercomputer Deep Blue beat Garry Kasparov, the reigning world champion, it became apparent how computers could make judgments on par with the sharpest human minds. To process the 10^123 possible moves in a game (far more than the 10^82 atoms in the observable universe), Deep Blue had been programmed to proxy human judgment, using 4,000 positions and 700,000 grandmaster games. Nineteen years later, AlphaGo, a computer program developed by DeepMind Technologies, defeated Lee Sedol, one of the world's best players at Go, a game with as many as 10^360 possible moves. AlphaGo used a combination of machine learning and neural network algorithms and was trained using 30 million moves. Today, AlphaZero, a successor, is considered the world's leading player of both Go and chess. It trains itself.

For years, however, this kind of computing power was difficult for most organizations to obtain. That is no longer the case, thanks to an equally dramatic change: the move from owned systems (such as the dedicated hardware of the chess and Go champions) to public cloud-based computing, giving users everywhere instant access to computing power and storage. Many businesses are embracing cloud-based software as a game changer that lets them process vast amounts of data, run new methods of advanced analytics, and benefit from more flexible technology setups. Despite rapid growth in spending (the top three cloud service providers reached $100 billion in combined revenue in 2020), cloud infrastructure still represents a small fraction of the $2.4 trillion global market for enterprise IT services. A recent research effort by McKinsey Digital foresees more than $1 trillion in run-rate EBITDA (earnings before interest, taxes, depreciation, and amortization) across Fortune 500 companies in 2030 from rejuvenating existing operations, innovating in business processes, and pioneering new businesses.

Despite the potential, banking has been slower than other sectors to adopt the cloud. Most banks find it difficult to give up their legacy on-premises applications, with only a few exceptions of early adopters like Capital One, which started a migration to the Amazon Web Services (AWS) cloud in 2012 and closed the last of its eight on-premises data centers in November 2020.

Now attitudes are starting to change. Some in the banking regulatory community are taking a more open stance toward cloud in financial services, considering the enhanced transparency, monitoring tools, and security features of cloud computing. For example, Don Anderson, chief information officer (CIO) at the Federal Reserve Bank of Boston, noted in 2019 that ignoring the cloud "may even introduce new security vulnerabilities as on-premises vendors discontinue support for their products". On the other hand, regulators continue to issue guidance that highlights the key risks of cloud computing to individual institutions and to the stability of broader financial systems. In a recent report, the Bank of England noted that since the start of 2020, financial institutions "have accelerated their plans to scale up their reliance on CSPs" (cloud service providers), and that the resulting concentration among a small number of cloud providers "could pose risks to financial stability". Other concerns pointed out by regulators relate to information security and the need to build cloud-appropriate risk management frameworks as an integral part of cloud migrations.

Among banking activities, one of the biggest areas of opportunity for cloud computing is risk management, both for financial risks (such as credit, market, and liquidity) and nonfinancial risks (cybersecurity, fraud, financial crime). At a time when risk management leaders are being asked to process greater amounts of data in shorter amounts of time, often amid budget and staff constraints, cloud computing could unlock considerable benefits. It can help risk teams react rapidly to changes in the external environment and dive deeper into the analytics life cycle (exhibit) to better understand the drivers of risk, all without major capital expenditures.

Exhibit

First movers are already employing cloud-based solutions in both financial and nonfinancial risk use cases. They are, for instance, deploying them to run large and complex daily and intraday liquidity risk calculations, do close monitoring of peer-to-peer payments and mobile banking transactions, improve regulatory compliance, and get smarter about the identification of money-laundering activity. Since the pricing model for many cloud providers is flexible and usage-based, cloud computing also provides economic benefits. Chief risk officers (CROs) only pay for what they use and can scale up for the surge-based computing needs of certain risk analytics activities, enabling them to shift their technology cost model from capital expense to operating expense.

By adopting cloud computing, CROs could better address four historically intractable risk management challenges: the need to process much more data, the need for more powerful processing systems, the complexity of analytics required to compete, and the greater challenges these all present to today's systems developers.

To make effective risk decisions, financial institutions have always had to convert information into insights, but todays data requirements are massive. Not only do banks gather data in greater volumes, but it comes from multiple sources and in multiple formats. For example, to assess sales conduct risk, banks are using point-of-sale transactions, sales-force performance data, consumer complaint feedback, and a range of other sources.

Developing insights from so much data is all the more difficult because financial institutions often house their information in disconnected systems and then govern these systems through different processes. This siloed approach makes it hard to integrate internal and external sources of information and develop a complete and unified view of risks. As a result, teams can miss useful insights.

With cloud-based solutions, risk management teams have the potential to easily and quickly integrate many different data sources and systems. Some solutions have standardized, easy-to-use web-based interfaces, which eliminate the need for specialized configurations between a bank's systems and those of a third party. American Express, for instance, introduced its cloud-based Cornerstone data ecosystem several years ago to share capabilities and data across functions and geographies.

Processing the large data sets needed for sophisticated advanced analytics and machine-learning models requires heavy loads of computing power, especially when multiple legacy systems are involved. Banking risk management functions seldom have access to such high levels of computing power, and resource barriers prevent teams from simply adding more servers.

Cloud deployments offer a more flexible model, pioneered by AWS elastic-computing capabilities, giving teams access to on-demand increases in computing power and a library of easy-to-deploy tools. Today those capabilities are changing the way banking's risk function operates. One leading bank was able to multiply the processing power dedicated to the Monte Carlo simulations it uses for trading risk projections, running them in a matter of hours as opposed to multiple days, according to executives at Microsoft Azure. At one investment bank, the relaxation of legacy computing capacity constraints enabled by cloud computing resulted in a significant increase in the use of analytics: more experimentation by trading strategy teams and adoption of new types of analyses they could not have tried before (e.g., modeling certain types of interest rate volatility directly rather than assuming it).
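Monte Carlo simulations of the kind described above estimate risk by sampling many hypothetical market scenarios; the compute cost grows with the number of scenarios, which is exactly the workload that elastic cloud capacity absorbs. A minimal, self-contained sketch of a one-day value-at-risk (VaR) calculation, where the normal-returns assumption and every parameter are illustrative rather than any bank's actual model:

```python
import random

def monte_carlo_var(portfolio_value: float, mu: float, sigma: float,
                    confidence: float = 0.99, n_scenarios: int = 100_000,
                    seed: int = 42) -> float:
    """One-day VaR via Monte Carlo: simulate daily returns, then read off
    the loss exceeded in only (1 - confidence) of the scenarios."""
    rng = random.Random(seed)
    losses = sorted(-portfolio_value * rng.gauss(mu, sigma)
                    for _ in range(n_scenarios))
    return losses[int(confidence * n_scenarios)]

# Illustrative parameters: $1M portfolio, 0.05% mean daily return, 1% daily volatility.
var99 = monte_carlo_var(1_000_000, mu=0.0005, sigma=0.01, confidence=0.99)
var95 = monte_carlo_var(1_000_000, mu=0.0005, sigma=0.01, confidence=0.95)
print(f"99% one-day VaR: ${var99:,.0f}")
print(f"95% one-day VaR: ${var95:,.0f}")
```

For these toy parameters the 99% VaR lands around 2.3% of portfolio value; real trading-risk runs replace the normal draw with full revaluation of thousands of positions across correlated risk factors, which is why scenario counts, and compute bills, explode.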

Similarly, a global systemically important bank had been assessing global liquidity by running its model 14 hours a day. After the bank switched to a cloud-based solution, the time dropped to less than three hours, allowing risk teams to do more frequent analyses and iterations, incorporate additional data, and make quicker business and balance sheet strategy decisions, according to executives at Google Cloud.

Migrating to a cloud-enabled platform can also streamline upgrades. Instead of spending substantial time and effort configuring new upgrades and capabilities on disconnected legacy systems, risk teams let their technology partners handle both the software and hardware upgrades. This reduces ongoing technology operating costs and minimizes the risk of obsolescence in an age of rapid evolution.

In recent years, cloud-based providers of risk management solutions have developed a wide variety of innovative, user-friendly, out-of-the-box automation tools, such as data drift analysis, critical misconfiguration alerts, and digital forensics tools. Utilizing such technology frees up risk analysts time to focus on what they do best. Instead of spending time configuring tools and technology, they can move quickly to develop sophisticated models and alert mechanisms. Barclays freed up time for its risk analysts by working with a cloud-based provider to improve its automation process for granting transaction risk analysis exemptions for merchants.

Executing risk management in the cloud also makes it easier for teams to recalibrate and manage their models and set up new tests. Cloud-based infrastructure can be continuously fed with real-time data, something beyond the capabilities of many legacy systems. This makes models more accurate and precise, and helps analysts quickly make data-based decisions on their effectiveness.

HSBC, for instance, uses cloud computing services to look for money-laundering and other criminal activity in a completely new way. By mapping networks of connections between people and companies, the bank's Global Social Network Analytics platform lets risk management teams find suspicious transactions that previously were identifiable only by humans. Using cloud services, another bank detected a data breach and found the individual responsible within two weeks, while at a competitor, the detection and apprehension of the same breach took over a year, according to executives at Google Cloud.
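Network-based detection of the kind HSBC describes can be pictured as graph traversal: starting from a known-suspicious account, walk the transaction graph to surface counterparties that rules scoring each account in isolation would miss. A toy sketch, in which the account names and transfer edges are invented purely for illustration:

```python
from collections import deque

# Toy transaction graph: account -> accounts it has sent money to (invented data).
transfers = {
    "acct_A": ["acct_B", "acct_C"],
    "acct_B": ["acct_D"],
    "acct_C": [],
    "acct_D": ["acct_E"],
    "acct_E": [],
    "acct_F": ["acct_G"],  # disconnected from the flagged account
    "acct_G": [],
}

def linked_accounts(flagged: str, max_hops: int = 2) -> set:
    """Breadth-first search: every account within max_hops transfers of a flagged one."""
    seen, queue = {flagged}, deque([(flagged, 0)])
    while queue:
        acct, hops = queue.popleft()
        if hops == max_hops:
            continue  # don't expand past the hop limit
        for neighbour in transfers.get(acct, []):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append((neighbour, hops + 1))
    return seen - {flagged}

print(sorted(linked_accounts("acct_A")))  # ['acct_B', 'acct_C', 'acct_D']
```

Production systems operate on billions of edges with entity resolution and risk scoring layered on top, which is the part that demands cloud-scale graph processing rather than an in-memory dictionary.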

The flexibility and connectivity of cloud-based environments can have a meaningful impact not only on the productivity of risk analysts but also on the developers who create and maintain the models that identify, measure, and mitigate risks. After moving to the cloud, developers often report significant improvement in key performance metrics, including improvements in release frequency, lead time to deploy, and mean time to recover. Commerzbank, for instance, says its developers use cloud services to follow a continuous integration and delivery (CI/CD) approach, enabling them to do code updates much more seamlessly and easily.

Finally, the impact of cloud-based solutions extends beyond the risk function, since their ease of use makes robust risk identification and assessment tools more accessible to business units, which are the first line of defense. This allows for a better understanding of risks and a sense of ownership for risk decisions. Loan officers, for instance, can stress test loan portfolios or simulate the performance of a loan before approving it, enabling a deeper awareness of risk-return trade-offs.

While the potential benefits of cloud computing are substantial, so are the challenges of migrating risk management systems and activities from on premises to the cloud. CROs must plan for managing complexity, investing the necessary resources, and meeting needs for new capabilities and culture.

For the most part, risk systems are not stand-alone; they thread through the banks core applications and processes. As a result, moving risk applications to the cloud may have implications for other systems and ultimately require the reconfiguration of other applications. Thus, the migration journey for risk applications needs to be designed as part of the broader enterprise migration, which will involve hundreds of applications in total. Some companies have established a private cloud in which computing resources are hosted on a network used by only one organization and located within their own data center. Others have opted for a hybrid between this approach and the public cloud hosted by a major provider.

Migrating to the cloud can have a significant impact on financial statements. Although the legacy technology systems on which banks often operate carry maintenance costs, their depreciation expenses are minimal. While most cloud providers offer incentives for multiyear commitments that can offset near-term migration costs, substantial expenses will still hit the P&L. Therefore, investments needed for cloud migration and the subsequent operating costs must be carefully planned and sequenced over time to manage their financial impact.

The skills required to migrate and operate in the cloud include a much heavier focus on engineering and data science than is needed for on-premises computing. This kind of talent is difficult to recruit and even harder to retain, especially amid currently high attrition rates. In addition, the culture of teams working in the cloud is faster moving, more adaptable, and more focused on rapid delivery. Risk functions at banks will need to adjust their operating model to enable this new culture while keeping the rigor, control, and governance required for risk management activities.

Given these challenges, migrating to the cloud isn't an express trip. Instead, for most risk leaders, it is a multistage journey that will need planning and execution within the broader cloud strategy of the entire organization. As our colleagues have pointed out recently, companies that adopt cloud have to manage their overall strategy and business case, adoption in each business domain, and the construction of foundational capabilities that enable security and scale, all in concert.

CROs and other risk leaders have an important role driving adoption across the risk domain, but also can influence the overall strategy and business case, and need to help scope the foundational capabilities required, in particular when it comes to security and controls. Three actions can help guide the cloud adoption journey for risk management:

The common error of scattering tests and use cases throughout multiple domains will not create the momentum delivered by a deep dive into one or two major domains, whether consumer credit risk, trading risk, or consumer fraud. This is because the migration of data and tech to a cloud provider is often the toughest challenge. Once a single use case is complete for a given domain, it's easier to develop additional use cases in parallel.

A transition to cloud-based risk management offers too many benefits for risk leaders to ignore. For banks, cloud computing is quickly becoming an imperative. Those that do not migrate their systems and capabilities could lose the ability to innovate quickly and respond effectively to the competitive pressures and increasing number of risks facing banks. The many decisions to make along the journey can paralyze firms, but a focus on the key issues and a prudent approach to implementation can help risk managers think several moves ahead on the chessboard.

See the article here:

Cloud computing transformation in banking risk - McKinsey

Posted in Cloud Computing | Comments Off on Cloud computing transformation in banking risk – McKinsey

Cloud computing is the new frontier for companies looking to get ahead of Google – Digiday

Posted: at 8:42 am

Editor's Note: This story is part of a 10-part series that examines life after the third-party cookie. Visit this interactive graphic outlining the full series here.

At the end of August, Erin Yasgar concluded that Google really wanted her to attend its upcoming cloud conference.

Yasgar, a practice lead for marketing and agency strategy at Prohaska Consulting, was being pestered in that way sought-after people will find familiar: Getting multiple emails a week, reminding her of the date, just checking in.

This felt unusual. The cloud side of Google's business had previously shown little interest in agencies or marketers in general. "Google is actively courting the marketers [for its cloud business]," Yasgar said.

As the media industry begins to turn itself upside down before Google ends its support of third-party cookies, large cloud tech companies including Google, Amazon and Microsoft are using the upheaval as an opportunity to try and grow their cloud businesses.

In taking third-party cookies out of digital advertising's equation, Google has forced advertisers and publishers alike to focus on their own first-party data to an unprecedented degree. Without cookies to fall back on for targeting or measurement, they will need to bring more data into more clean rooms for matching. As the importance of e-commerce continues to swell, marketers will need to do more to tie their ad data to other kinds of data typically walled off in different corners of their organizations. And as advertisers and publishers reshuffle their always-on ad spending strategies, the need to use artificial intelligence to spot patterns will grow too.
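The clean-room matching mentioned above usually works by both parties hashing their customer identifiers under a shared scheme, so records can be joined without exchanging raw data. This is a minimal sketch of that idea; the email addresses and the salt value are hypothetical, and production clean rooms add much stronger privacy controls than a bare salted hash.

```python
# Sketch of first-party data matching of the kind clean rooms automate:
# each side hashes its normalized customer identifiers with an agreed salt,
# then only the hashed sets are compared. All data below is hypothetical.
import hashlib

def hashed_ids(emails, salt="shared-clean-room-salt"):
    # Normalize (trim whitespace, lowercase) before hashing so that
    # "Bob@Example.com " and "bob@example.com" match.
    return {hashlib.sha256((salt + e.strip().lower()).encode()).hexdigest()
            for e in emails}

advertiser = hashed_ids(["ana@example.com", "bob@example.com"])
publisher  = hashed_ids(["Bob@Example.com ", "cat@example.com"])

# The intersection is the matched audience; neither side learns the
# other's non-matching records from the hashes alone.
overlap = advertiser & publisher
print(len(overlap))  # 1
```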

Signs of these growing needs are already visible. For example, searches for customer data platform software more than doubled last year, according to data from the business software marketplace G2.

But at the most developed end of the spectrum, most of these needs can be solved by cloud infrastructure and computing services. And its largest purveyors are ready to pounce.


"As you get into the billions and above in revenue, it's becoming one of the highest priorities," said David Novak, the co-chair of the data and marketing practice at Prophet. Those largest firms, Novak said, are driving a very tech-centric view of how to gain customer understanding that they didn't have a few years ago.

"They [marketers] never had to understand unknown, or anonymous people. They were just buying return on ad spend through an agency."

Today, Novak and Yasgar see large organizations instead focused on trying to drive personalized interactions with their customers across many disparate surfaces; sharing data not just across their own different parts internally but with growing numbers of strategic partners externally.

All of that makes the need for strong cloud infrastructure more important.

"That need for the flexibility [of data], and the tech that CMOs need to deal with the identity graphs, cloud computing supports all of it," Yasgar said. "All the infrastructure that's needed, what both sides of the coin need, is absolutely set to grow."

While the cloud providers are most interested in the largest marketers and media companies at the moment, this shift seems likely to spread to smaller publishers too. Some providers, such as Amazon, are positioning themselves in a way that could hook even mid- and long-tail publishers into their clouds.


In the great cloud battle, media would be a minor theater of war. Cloud computing is already a $100 billion business for Amazon, Google and Microsoft combined, and it is growing by more than $30 billion annually, said Timothy K. Horan, a cloud and communications analyst at Oppenheimer.

Securing the largest advertisers and publishers might amount to a single-digit percentage of their cloud revenues. "Maybe it's hundreds of millions of revenue in a couple years," Horan said. "They're always pursuing these kinds of things, but it's a minor product [in that context]."

But even if each individual marketer is small potatoes to the cloud purveyors themselves, committing to one kind of infrastructure or another represents a profound decision for most of their customers.

"We're talking about something that is incredibly complex," Yasgar said. "There's always that decision of, 'Do I go all in on one vendor, or do I go best in class and match in my stack?'"

Viewed from one angle, all of that complexity, as well as the significant amount of time and investment needed to commit to a cloud provider and strategy, could create the kind of oligopoly that currently rules the digital advertising market: If marketers have to work more closely with one another, the incentive to use shared cloud infrastructure grows strong enough that it may force the largest marketers and publishers and agencies into the hands of Google, Amazon and Microsoft, freezing out independent services such as Snowflake.

But others say that complexity may have the opposite effect, essentially buying time for competitors, which have made separate in-roads with marketers, to develop their cloud offerings to better suit these larger needs.

"It could give them [Adobe and Salesforce] a dark horse position," Novak said. "They are already working with marketers, hand in hand."

For now, smaller marketers and publishers are mostly relying on agencies or less sophisticated software to handle these problems. But if you squint while looking at the moves that cloud providers such as Amazon are making, it's possible to see them as part of the foundation of a much larger cloud-based services business for them.

The company's publisher services division, APS, already provides a server-side tool that allows publishers to manage ad requests. It is currently building an ad tech services marketplace inside APS as well, which would allow publishers to run many of the revenue-boosting (but often site-slowing) services on Amazon servers, rather than their own.

That basket of services could represent the strong foundation of a product that most publishers would have to at least consider subscribing to, sources say. It could sweeten the deal further by using its dominant market position to secure preferential pricing for APS customers.

Amazon may not wind up building out that infrastructure to target smaller publishers. But the shadow of Amazon's cloud ambitions already colors the conversations it has, say sources at multiple publishers who work with Amazon, who asked not to be identified while discussing a key business partner.

"All they care about is their cloud business," one said.


Cloud Exchange: Leveraging cloud data and migration in the public sector healthcare arena – Federal News Network

Posted: at 8:42 am

Cloud computing is rapidly advancing health care on three important fronts.

First, according to Mathew Soltis, vice president of Cloud Solutions at GDIT, is how it's improving health care research. The cloud enables aggregation and sharing of data in a given research domain, speeding visualization and data analysis, and therefore the development of remedies.

"NIH is a good example of some data sharing here around COVID. They have a commons platform, where researchers can get together and share data in a common standard, a common infrastructure," Soltis said at the Federal News Network Cloud Exchange. Earlier, the exchange of hard drives or snail-like downloads simply held things up, he said. Now cloud deployments are accelerating research outcomes.

The second area of cloud-induced improvement, Soltis said, is how it eases compliance with a myriad of health-domain requirements, such as privacy protection under HIPAA. Basic cybersecurity also improves under a well-crafted cloud implementation, he added.

"What we see now is a lot of the cloud providers and the service providers have raised the standards," Soltis said. "So if you want to use data or access infrastructure in the cloud, it's now HIPAA compliant or can support your HIPAA outcomes. That's advancing the cyber posture of a lot of customers."

Third is how the cloud is enabling new modes of remote health care delivery and thereby improving outcomes. For example, cloud-hosted data related to electronic health records aids EHR interoperability and access from anywhere, Soltis said.

Taking advantage of cloud computing comes with challenges, Soltis pointed out. A central question is how to handle and govern the large amounts of data that characterize health information.

"When we talk to federal customers, over 40% of them are experiencing issues with data migration," Soltis said. Beyond any technical challenges lie the questions of data ownership, who is going to use the data, who in sharing arrangements pays cloud costs, and who's responsible for backups.

Key, Soltis said, is having an official role to own, manage, charter, chaperone and govern the data. Namely, the chief data officer.

But the CDO organization must partner with other stakeholders. "When the CDO sits at the same level as technology and the mission owner and the application owner, that provides the best outcome," Soltis said.

On the technical side lie questions of how to design architecture for data lakes, dealing with mixed structured and unstructured data, and the most economical ways to tier the storage setup.
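The storage-tiering question can be sketched in a few lines: route data to cheaper tiers the longer it goes unaccessed. The tier names and day thresholds below are purely illustrative assumptions, not any agency's or vendor's actual policy; real lifecycle rules also weigh retrieval latency and compliance retention requirements.

```python
# Hypothetical sketch of economical storage tiering: assign records to hot,
# warm, or cold storage by how recently they were accessed. Thresholds and
# tier names are illustrative only.
from datetime import date

def pick_tier(last_access: date, today: date) -> str:
    age = (today - last_access).days
    if age <= 30:
        return "hot"   # frequently read data on fast, expensive storage
    if age <= 365:
        return "warm"  # infrequent access, cheaper storage class
    return "cold"      # archival tier, cheapest per GB

today = date(2021, 9, 27)
print(pick_tier(date(2021, 9, 20), today))  # hot
print(pick_tier(date(2020, 1, 5), today))   # cold
```

In practice a rule like this runs as an automated lifecycle policy rather than application code, but the economics it encodes are the same.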

"If that infrastructure is in place with your architecture, you can then do some higher level activities like machine learning, artificial intelligence and data visualization," Soltis said.

Success depends on getting those pieces in place, he said, but additional strategies can provide shortcuts for healthcare agencies.

For instance, look at how practitioners in other domains have done it.

"Sometimes in parallel industries, things like high performance computing, anomaly detection, image recognition, there may be other agencies or organizations doing this," Soltis said. "Some of the cross-government communities have been really successful here, looking at those use cases, where you can apply that technology and that solution to healthcare data."

The specific strategy leading organizations take, he added, takes data as the primary focus, and moves data to the cloud first ahead of applications.

Contractors can help too. Soltis said GDIT's secure, cloud-native data reference architecture applies in many circumstances. He cited the Indian Health Service, which is modernizing its EHR system to incorporate the cloud.


New Study Finds Salesforce Economy Will Create 9.3 Million Jobs and $1.6 Trillion in New Business Revenues by 2026 – PRNewswire

Posted: at 8:42 am

SAN FRANCISCO, Sept. 20, 2021 /PRNewswire/ -- Salesforce (NYSE: CRM), the global leader in CRM, today announced a new study from IDC that finds Salesforce and its ecosystem of partners will create 9.3 million new jobs and $1.6 trillion in new business revenues worldwide by 2026. The study also finds that Salesforce is driving immense growth for its partner ecosystem, which will make $6.19 for every $1 Salesforce makes by 2026.

Building digital HQs helps solve for urgent transformation needs. IDC forecasts [1] that cloud-related technologies will account for 27% of digital transformation IT spending this year, growing to 37% in 2026, as businesses focus on establishing digital HQs to deliver customer and employee success from anywhere. Remote work, contactless customer engagement, and sustainability efforts are becoming more prevalent than ever, and IDC expects this trend will only continue.

As more companies build out digital HQs to support an increasingly remote workforce, Salesforce technologies have helped its customers adjust to uncertainty, enabling remote work and remote contact with customers, and making it possible to develop new products in weeks, not months [2].

IDC also conducted a survey of 525 enterprises across eight countries on cloud deployment and the benefits and challenges of cloud computing. Of the 74% of survey respondents who say their organizations have a formal digital transformation strategy, 97% rate cloud computing as important to that strategy. The survey also found that Salesforce solutions have enabled:

Salesforce technologies can also help companies plan for a more sustainable future. IDC forecasts that from 2021 to 2024, migration from on-premise software to the cloud could reduce as much as 1 billion metric tons of CO2 [3]. Salesforce itself has set a goal of pursuing 100% renewable energy for its global operations by 2022, and currently delivers a carbon-neutral cloud to all its customers. And, according to IDC's customer survey, 39% of Salesforce customers surveyed look to Salesforce as a source of support in reaching their own sustainability objectives.

Salesforce partner ecosystem helps drive worldwide acceleration of growth. IDC predicts that the use of Salesforce and its ecosystem's cloud services will generate $308 billion in the customer base this year and more than double that in 2026, at $724 billion. Today, the ecosystem of Salesforce partners delivering cloud services to customers is five times as big as Salesforce itself, and will be more than six times as big in 2026. The study also found that 2026 ecosystem revenues are forecast to be 3.5 times those in 2020.

"The Salesforce partner ecosystem extends the power of Salesforce to companies of all sizes, across industries and helps make customer success possible," said Tyler Prince, EVP, Alliances & Channels, Salesforce. "As Salesforce grows, so do our partners and we are committed to providing our expanding partner ecosystem with the tools needed to succeed in the jobs of the future."

Salesforce paves pathways to help unlock career opportunities in the Salesforce Economy. 23% of new jobs created in the Salesforce customer base this year leverage significant digital skills such as using automation tools, the Internet of Things (IoT), and other complex applications. Trailhead, Salesforce's free online learning platform, and its Trailblazer Community, which accelerates this learning through peer-to-peer knowledge sharing and support, empower anyone to learn digital skills for the growing Salesforce economy.

Salesforce is also helping companies navigate their digital transformations through these platforms; 84% of survey respondents at companies using Trailhead say it's important to their organization's deployment of cloud solutions, and 44% said it's "critically" so.

"For Salesforce, it's not only about creating new technology and career opportunities; we have to pave pathways to these new jobs," said Kris Lande, SVP, Trailblazer Ecosystem, Salesforce. "We've made it our mission to empower people with the tools they need to build dynamic careers, companies, and communities with Salesforce, and thrive in a digital-first world."

How Salesforce is creating jobs to fuel the Salesforce Economy. Salesforce has a number of programs and initiatives to help create the jobs of the future and to fill them with well-equipped candidates:

What is the Salesforce Economy? IDC defines "The Salesforce Economy" as the footprint of Salesforce and its partner ecosystem on the economy at large. This includes the revenues and jobs directly generated in the Salesforce customer base from the use of Salesforce and its partners' cloud services, as well as jobs created indirectly in the economy by local spending by direct employees and Salesforce and its partners themselves.

Salesforce's multi-faceted partner ecosystem is a driving force behind the Salesforce Economy's massive growth:

Additional Resources

IDC Methodology. The Salesforce Economic Impact Model is an extension to IDC's IT Economic Impact Model. It estimates Salesforce's current and future share of the benefits to the general economy generated by cloud computing, and it also estimates the size of the ecosystem supporting Salesforce using IDC's market research on the ratio of spending on professional services to cloud subscriptions; the ratio of sales of hardware, software, and networking to spending on public and private cloud computing; and the ratio of spending on application development tools to applications developed.

Note that the ecosystem may include companies that are not formal business partners of Salesforce but that nevertheless sell products or services associated with the Salesforce implementations.

IDC White Paper, sponsored by Salesforce, "The Salesforce Economic Impact," doc #US48214821, September 20, 2021

[1] IDC's WW Spending Guide on Digital Transformation, 2021
[2] The Impact of Digital Transformation During Times of Change, July 2020
[3] IDC Press Release, Cloud Computing Could Eliminate a Billion Metric Tons of CO2 Emission Over the Next Four Years, and Possibly More, According to a New IDC Forecast, March 2021

About Salesforce. Salesforce is the global leader in Customer Relationship Management (CRM), bringing companies closer to their customers in the digital age. Founded in 1999, Salesforce enables companies of every size and industry to take advantage of powerful technologies (cloud, mobile, social, internet of things, artificial intelligence, voice and blockchain) to create a 360-degree view of their customers. For more information about Salesforce (NYSE: CRM), visit: http://www.salesforce.com.

SOURCE Salesforce

http://www.salesforce.com


Why we need green hosting and higher density services now more than ever – TechRadar

Posted: at 8:42 am

It would be easy to adopt a fatalistic attitude in the wake of the IPCC's report into climate change, that the changes necessary are too big for any individual or enterprise to tackle. But the lesson that should be learned is that while the crisis is dire, we can and should make every effort to limit our effects on the environment.

This includes hosting. We think of cloud computing as emission-less; everything happens out of sight. But it isn't emission-less at all: cloud computing now accounts for 2% of global CO2 emissions.

That might seem small, but another way to think about it is that for every 50 tonnes of CO2 emitted, one tonne comes from online hosting.

But there are reasons to embrace green hosting that go beyond just saving the planet.

One simple way to reduce emissions, though it's not comprehensive, is to use money as a proxy: if you're spending money on something, it is most likely contributing to emissions.

Reducing spending can be a reasonable target for any business that wants to emit less. One way to approach this is through the use of high-density services.

One common feature of modern life is that we use tools that are designed for a far more powerful job. We commute or take short shopping trips in vehicles designed to travel way over the speed limit, deal with extreme off-road conditions, or move tonnes of stuff. We wear jackets that claim to deal with arctic temperatures when it starts to get a little autumnal, and sports gear for elite athletes when out for a light jog.

Computers are no different, whether it's personal computing - a top-of-the-range gaming PC used to idly browse the web, say - or the equivalent in the enterprise. Computers often have boring lives given how powerful their processors are, the size of their storage, and their high-speed internet connections.

Often, they are given a small task that they only need to do part of the time, or with a fraction of their processing power. This unused computing power of deployed computers could be used more efficiently by increasing service density - simply put, using fewer computers to do the same tasks.

With hosting, service density is key to reducing the carbon footprint of any workload and to the energy efficiency of a data center. Every application has resources allocated to it whenever it needs them; when those resources sit unused, they can be allocated to another application or service.
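The consolidation idea can be illustrated with a toy bin-packing sketch: pack workloads onto as few servers as possible. The capacity and workload figures are hypothetical, and real schedulers (Kubernetes, for instance) weigh CPU, memory, and affinity together, but the density gain is the same in principle.

```python
# Toy illustration of increasing service density via first-fit-decreasing
# bin packing: place each workload on the first server with enough spare
# capacity, opening a new server only when none fits. Figures are hypothetical.

SERVER_CAPACITY = 100  # arbitrary CPU units per server

workloads = [12, 35, 8, 50, 22, 9, 41, 15]  # peak CPU units per service

def consolidate(workloads, capacity):
    servers = []  # each entry is the remaining capacity of one server
    for w in sorted(workloads, reverse=True):
        for i, free in enumerate(servers):
            if w <= free:
                servers[i] -= w
                break
        else:
            servers.append(capacity - w)  # spin up a new server
    return len(servers)

dedicated = len(workloads)  # one server per service: 8 machines
dense = consolidate(workloads, SERVER_CAPACITY)
print(dedicated, dense)  # 8 2
```

Eight dedicated machines collapse to two well-utilized ones, which is the mechanism behind the cost and emissions savings described in this article.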

We've seen this work in action. In one case, a university was able to replace sites hosted on 2,000 devices scattered across its campus with dense, containerised hosting, creating an immediate 30% reduction in hosting costs.

Using fewer computers means both saving money and cutting emissions through reduced energy use - not to mention reduced e-waste when components need to be replaced.

Businesses are keen to do more to reduce their environmental impact, but green hosting is often quite far down the list of things being considered. Hosting is very much out of sight, unlike, for example, the waste that a business might produce.

Green hosting has the potential to be an easy win, reducing emissions and saving costs at the same time. Many businesses and organisations have built their IT in an ad-hoc and DIY manner, leading to inefficiencies - not just in density but in how IT and development teams work with this infrastructure.

Consolidation and higher density can mean increased efficiencies in these teams, too. There is also the opportunity to consolidate infrastructure into a single bill, and even move to a more efficient data center if this is possible - moving cloud to the Nordics can mean even greater savings in emissions.

When businesses are looking for ways to be more eco-friendly, whether it's buying offsets, considering more hybrid working, or any number of disruptive or expensive options, it's important to consider the quick wins.

Green hosting may be less visible than many other options, but it has the potential to both slash emissions and save money. Every business should consider it.


Leveraging cloud computing capabilities can help organizations reduce their carbon footprint – Express Computer

Posted: at 8:42 am

As per a report by McKinsey & Company, migration of assets to the cloud globally became one of the key business priorities during Covid-19. In 2020, among the many factors that contributed to the sustenance of a larger ecosystem, technology, and particularly cloud adoption, played an instrumental role. Since the pandemic, business models have pivoted to cater to new-normal consumer needs like online shopping, increased demand for video streaming, doorstep healthcare facilities, online education, and much more. The need for robust yet efficient cloud computing has thus become relevant and meaningful in the overall consumer experience matrix.

Businesses are increasingly adopting cloud technologies for functional benefits such as pay-as-you-go pricing models, flexibility to scale, security, agility, mobility, data as an asset, collaboration, quality control, disaster recovery, loss prevention, automated software updates, competitive advantage and, last but not least, sustainability. Cloud's popularity grows as it facilitates intelligent technologies and other tech-intensive solutions, in lieu of on-premise deployments that could be vulnerable to dynamic environmental and business requirements.

Amidst this, while technology proliferation is positive for growth and modern innovations, there is a need to make its impact less intrusive to the environment. Datacenters are core to our technological needs, but they consume a lot of electricity, which is not limited to computing but extends to cooling the heat generated by computing equipment, thereby resulting in CO2 emissions. As responsible corporates and communities, dedicated attempts need to be made to draw electricity from renewable sources such as solar and wind. While it takes effort and investment to go carbon neutral, it does pay off.

According to a forecast from International Data Corp. (IDC) released in March 2021, continued adoption of cloud computing could prevent the emission of more than 1 billion metric tons of carbon dioxide (CO2) from 2021 through 2024. Asia Pacific regions in particular utilize coal for much of their power generation across datacenters and account for significant CO2 emissions.

Cloud computing's aggregated compute resource is a key driver in reducing carbon emissions, as the framework can efficiently utilize power capacity, optimize cooling, leverage the most power-efficient servers, and increase server utilization rates. Alongside the switch to renewable sources of energy, cloud infrastructure is inherently well suited to address energy efficiency because:

Efficient resource management: the pay-as-you-go model in cloud computing encourages individual users to utilize services judiciously, thereby reducing wastage.

It reduces carbon emissions from multiple physical servers: virtualization allows cloud solutions to be delivered from a single server that can run multiple operating systems simultaneously.

In an automated environment, users can operate at higher utilization ratios, and consolidation reduces demand on the physical infrastructure.

The cloud is also unaffected by multiple users and organizations accessing its common infrastructure, as automation can balance workloads and minimize the requirement for additional infrastructure or resources.

Modern and efficient cloud data centers are taking the idea of Green IT forward in a meaningful way, saving not just the environment but also building a more robust ecosystem.

The differentiated ability to shift IT service workloads, virtually to any location in the world also creates an opportunity to enable greater usage of any available renewable sources of energy of that location.

Sustainability is frequently viewed from an operational point of view, with environmental goals treated as a cost center, a risk, or a compliance requirement. Green datacenters and sustainable cloud infrastructure go beyond the business; they are incredible opportunities to give back to the communities where we operate.

Carbon emissions can be reduced if datacenters are designed for sustainability, which starts with shifting to cleaner, renewable sources of energy like wind and solar power and using LED lighting across datacenters. An efficient data center diverts energy towards running the IT equipment rather than cooling the environment where it resides.
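The split between power that reaches IT equipment and power spent on cooling and other overhead is commonly measured by Power Usage Effectiveness (PUE), the ratio of total facility power to IT equipment power. The article does not name the metric, and the facility figures below are illustrative:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: 1.0 is ideal (all power reaches IT gear)."""
    return total_facility_kw / it_equipment_kw

# A legacy facility may spend as much on cooling as on computing;
# an efficient one keeps overhead to a small fraction of IT load.
legacy = pue(total_facility_kw=2000, it_equipment_kw=1000)
efficient = pue(total_facility_kw=1120, it_equipment_kw=1000)

print(f"legacy PUE: {legacy:.2f}, efficient PUE: {efficient:.2f}")
# legacy PUE: 2.00, efficient PUE: 1.12
```

A lower PUE means more of every kilowatt-hour does useful work, which is exactly the "energy diverted towards running the IT equipment" described above.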

Businesses in several countries are taking the lead in shifting their IT systems to cloud centers and are deriving immense value from the exercise. They are not only able to tackle the problem of fluctuations in the electricity supply but also add value to the overall brand image and reputation as an environmentally conscious entity among stakeholders. Businesses may take measures to become carbon neutral through carbon offset efforts, or by designing data centers with efficiency and environmental protection as the guiding principles, helping to accelerate sustainability goals.

Authored by AS Rajgopal, MD & CEO, NxtGen Infinite Datacentre

If you have an interesting article / experience / case study to share, please get in touch with us at [emailprotected]

Read more from the original source:

Leveraging cloud computing capabilities can help organizations reduce their carbon footprint - Express Computer

Posted in Cloud Computing | Comments Off on Leveraging cloud computing capabilities can help organizations reduce their carbon footprint – Express Computer

DGT Releases Result for 1st Batch of Advanced Diploma (Vocational) in IT, Networking, Cloud Computing – News18

Posted: at 8:42 am

The Directorate General of Training (DGT), Ministry of Skill Development and Entrepreneurship, has announced the results of the first batch (2018-20) of the advanced diploma (vocational) in IT, networking, and cloud computing.

The first batch included 19 trainees, of which 14 cleared the exam. In addition, 18 trainees have been offered placements at IBM and its channel partners. Vurukuti Pavan Kumar from NSTI Hyderabad secured the top rank, while Vinod Kumar K V from the Bengaluru branch placed second and Dusa Srilekha third.

The course began in 2018 at the two National Skill Training Institutes (NSTI) in Hyderabad and Bangalore on a pilot basis, but was expanded to 16 NSTIs in 2019. The course is approved by the National Council for Vocational Training (NCVT) as a level 6 National Skills Qualification Framework (NSQF) program.

The two-year course includes industry-relevant courses on hardware maintenance, web development, cloud-based development and deployment, analytics, and soft skills training.

In the first year, there are five core modules of 320 hours each, which are credit-based, independent, and focused on employment skills. In the second year, the trainee selects two of three elective modules of 320 hours each and completes 800 hours of paid on-the-job training supported by IBM, which, from the third batch onwards, also provides a monthly stipend to each trainee for the remaining duration of training (1.5 years).
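Summing the module hours stated above gives the total contact time for the two-year course (a quick arithmetic check, assuming a trainee takes exactly the stated modules):

```python
core_modules = 5 * 320   # first year: five 320-hour core modules
electives = 2 * 320      # second year: two of three 320-hour electives
on_the_job = 800         # second year: paid on-the-job training with IBM

total_hours = core_modules + electives + on_the_job
print(total_hours)  # 3040 hours over the two-year course
```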

"With the pandemic forcing companies to rapidly adopt new-age technology solutions to run their businesses, demand for the right skills in artificial intelligence, analytics, cloud computing, cybersecurity, etc. is on the rise. A 2020 IBV study states that 6 out of 10 companies plan to accelerate their digital transformation efforts, but inadequate skillsets are one of the biggest hurdles to their progress," said Manoj Balachandran, CSR Leader, IBM India/South Asia.


Here is the original post:

DGT Releases Result for 1st Batch of Advanced Diploma (Vocational) in IT, Networking, Cloud Computing - News18

Posted in Cloud Computing | Comments Off on DGT Releases Result for 1st Batch of Advanced Diploma (Vocational) in IT, Networking, Cloud Computing – News18
