Texas Schools Team with Amazon for Cloud Computing Jobs – Government Technology

(TNS) University of Houston student Valerie Smith sat in the front row of her classroom, drawing and labeling boxes with fluorescent colored pens. But what look like doodles on a page in her notebook are in fact notes to help her remember some of the basics of cloud computing, a newer technological venture for many companies and Texas educational institutions.

Cloud computing, a practice that allows the storage, management and processing of data through remote servers on the Internet rather than a local or physical computer (think Google Drive), is becoming more and more common, and education leaders want to prepare students for related job opportunities as more companies migrate their systems to the cloud.

State officials and education and workforce leaders announced in September a partnership with Amazon's IT service management company, Amazon Web Services, that will bring cloud computing education to K-12 schools and colleges across Texas. The program launched in the Dallas, Irving and Houston independent school districts, as well as at three four-year universities and 22 community colleges. The colleges in the Houston region include Prairie View A&M University, Houston Community College and Lone Star College.

The courses, via Amazon Web Services' AWS Educate, will provide universities with the tools to train professors and support cloud computing learning for students by building computer and data-related skills through additional curriculum and degree programs. Students ages 14 and up will have access to a self-paced, no-cost curriculum as well as training and job boards.

Tony Moore, chief information officer at Prairie View, said it's often difficult for university curriculum to keep up with the pace of technology.

"Most of the computer science courses that are taught are still teaching older curriculum," Moore said. "But the 21st century workforce has changed."

Cloud technology is the wave of the future, experts say, and companies are embracing the technology. In a report earlier this year, nonprofit Cloud Security Alliance said 69 percent of more than 200 organizations surveyed stated that they are migrating data for their management software applications to the cloud.

The worldwide public cloud services market is projected to grow 17.5 percent in 2019 to $214.3 billion, up from $182.4 billion in 2018, according to Gartner, a Connecticut-based research and advisory company. The firm's research shows that more than a third of organizations see cloud investments as a top-three investing priority, and that by the end of 2019, more than 30 percent of technology providers' new software investments will shift from cloud-first to cloud-only.
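As a quick sanity check, Gartner's figures are internally consistent: 17.5 percent growth on $182.4 billion does land at roughly $214.3 billion.

```python
# Verify the reported growth figure: 17.5% on top of $182.4 billion.
base_2018 = 182.4            # 2018 market size, $ billions
growth_rate = 0.175          # projected 2019 growth
forecast_2019 = base_2018 * (1 + growth_rate)
print(round(forecast_2019, 1))  # 214.3, matching Gartner's projection
```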

High demand, short supply

There's already a high demand, but a short supply, of employees with cloud computing skills, according to the Texas Workforce Commission.

This year in Texas alone, there's been a 62 percent increase in the number of jobs listing Amazon Web Services or AWS as a required skill, with more than 32,000 job openings, according to Wanted Analytics. But many of those jobs remain vacant.

Officials see the demand as a benefit for students, but only if they're prepared.

"Cloud computing has created a sort of disruption. There's not enough resources for companies to hire folks who have that experience. This sort of gives (students) an opportunity to learn cloud technology and be workforce ready when they graduate," said Moore, who added that Prairie View A&M will use AWS to launch a certificate program at the historically black college in the hope of enhancing its current offerings and giving students hands-on cloud experience.

Start training early

Texas Workforce Commissioner Julian Alvarez said the high demand for cloud computing-related skills is also inspiring high schools and colleges to adjust their curriculum to teach these subjects.

The whole initiative is to align the education to the needs of the industry with the hope of achieving the Texas Higher Education Coordinating Board's goal of equipping 25- to 34-year-olds with a post-secondary education or credentials that will lead to gainful employment or entry-level positions, he said.

Alvarez said the workforce commission and its partnerships are working to prepare students by beginning education as early as middle school. The commission has put out a $4 million grant for a pilot program that allows local workforce boards to help students get as much information as possible about potential career options, and separately launched a coding camp for girls. Both initiatives aim to help ensure that women and people of color, groups often underrepresented in STEM, are given opportunities in an industry that is largely white and male-dominated.

HCC Chancellor Cesar Maldonado said the community college system launched an associate of applied science degree in partnership with Amazon this fall. The school is actively looking to develop a seamless pathway to a bachelor's degree in computer and information sciences at the University of Houston-Downtown, and HCC officials are working to move the college's own IT infrastructure to the cloud, he said.

"Tradition is changing," Maldonado said.

Amazon at UH

Jose Martinez, an assistant professor in computer information systems at UH, helped launch AWS programming about two and a half years ago in the school's College of Technology.

Since then, Martinez, the only professor certified to teach AWS skills at UH, has developed a cloud computing track for the bachelor's degree in computer information systems, with at least three courses focused solely on aspects of cloud computing, including infrastructure and architecture, he said. UH also offers classes that allow IT professionals to earn certifications.

Although Martinez's classes are based on the resources that AWS provides, he aims to help students explore training on other cloud providers like Microsoft and Google so they are well-versed and prepared for any system they might encounter in a future job.

Smith, a UH senior studying computer science, said she was interested in learning AWS after working with two companies that had competing cloud computing programs. She wanted to learn all of the programs and decided to take an Amazon Web Services course. She said the UH content was relevant to the technological demands she experienced at her job.

"It's difficult sometimes to find courses that are staying up to date with technology, and this is one of the courses that pushes those boundaries and is pushing to stay up to date," Smith said. "Companies are now using it and we're fortunate to be (learning it)."

Martinez has advised his students to bring up their hands-on cloud computing training during every job interview. "When the industry and hiring managers see that these students have cloud skills, they get excited because these are skills that aren't necessarily out there in the market," he said.

Developing more opportunities where students can get experience in addition to in-class and lab work is crucial, Martinez said. So on Friday, Nov. 22, UH will host its inaugural Cloudathon. Students from more than 30 universities around Texas and surrounding states are invited to put their cloud computing knowledge to the test in a one-day competition. The top three student teams will win cash prizes.

Martinez said he hopes the competition will provide students with an experience they can list on their resumes and that preparation for the advent of the cloud will only expand.

"We need to embrace this," Martinez said.

©2019 the Houston Chronicle. Distributed by Tribune Content Agency, LLC.


Regulators begin probe into Google-Ascension cloud computing deal: WSJ – Reuters

FILE PHOTO: A sign is pictured outside a Google office near the company's headquarters in Mountain View, California, U.S., May 8, 2019. REUTERS/Dave Paresh/File Photo

(Reuters) - A U.S. federal regulator has initiated an investigation into a cloud computing deal between Alphabet Inc's Google and Ascension Health [ASCNH.UL], which would give Google access to detailed health information of millions of patients, the Wall Street Journal reported on Tuesday.

The Office for Civil Rights in the Department of Health and Human Services will look into the data collection to ensure the partnership is in compliance with the Health Insurance Portability and Accountability Act (HIPAA), which safeguards medical information, the Journal said (on.wsj.com/2NGPPQX).

On Monday, Google said patient data cannot and will not be combined with any Google consumer data.

Google did not immediately reply to Reuters' request for comment.

(This story corrects abbreviation in second paragraph to HIPAA)

Reporting by Abhishek Manikandan in Bengaluru; Editing by Christopher Cushing


Cloud computing: SaaS, IaaS or PaaS – which is growing fastest? – ZDNet

As cloud computing continues to eat up traditional tech spending, businesses are beginning to change where they spend their money.

The worldwide public cloud services market is forecast to grow 17% in 2020 to a total of $266.4 billion, up from $227.8 billion in 2019, according to tech analyst Gartner.

MUST READ: What is cloud computing? Everything you need to know about the cloud, explained

"Cloud computing adoption has now become mainstream," said Sid Nag, research vice president at Gartner. "That means higher spending on cloud, but also higher expectations from cloud buyers as to what they will get for their money."

According to Gartner, Software as a Service (SaaS) will remain the largest market segment: SaaS is forecast to grow to $116 billion next year, up from $99.5 billion in 2019.

The second-largest sector is cloud Infrastructure as a Service (IaaS), which will reach $50 billion in 2020. IaaS is forecast to grow 24% year over year, the highest growth rate across all market segments, which Gartner said was the result of data centre consolidation. That's because modern applications and workloads -- many of which are cloud applications themselves -- now require infrastructure at a scale that traditional data centres cannot meet.

Cloud computing was listed among the top three areas where most global CIOs will increase their investment next year, Gartner said: "As organisations increase their reliance on cloud technologies, IT teams are rushing to embrace cloud-built applications and relocate existing digital assets."

SEE: Cloud v. data center decision (ZDNet special report) | Download the report as a PDF (TechRepublic)

However, as the use of cloud computing goes mainstream, the landscape will become increasingly sophisticated and competitive -- so much so that customers will need help with managing multiple cloud suppliers and applications.

By 2022, Gartner said, up to 60% of organisations will use an external service provider's cloud managed service offering -- twice the number in 2018.

"Cloud-native capabilities, application services, multicloud and hybrid cloud comprise a diverse cloud ecosystem that will be important differentiators for technology product managers. Demand for strategic cloud service outcomes signals an organisational shift toward digital business outcomes," Nag said.


Adobe Stock: Is The Cloud Computing Leader Ready For Another Leg Up? – Investor’s Business Daily

Adobe stock gets the nod in today's IBD 50 Stocks To Watch as it sets up in a new base with a growth story that's still very much intact.

Adobe (ADBE) continues to deliver impressive growth for a company with a market capitalization of nearly $141 billion. Its five-year annualized earnings growth rate is 46%, with a sales growth rate of 22%.

CEO Shantanu Narayen shook things up when he took over the reins in 2007. Initially, his focus was on digital media and marketing services, but he was also instrumental in Adobe's transformation into a full-service enterprise cloud provider.

In 2013, Adobe released Creative Cloud to take the place of Creative Suite, a group of graphic design, video editing and web development applications. Instead of paying a one-time fee of $1,800, customers paid $50 a month for Creative Cloud, or $19 a month for a single application.

Fast forward to today, and Adobe is widely viewed as one of the cloud leaders.

With a trailing price-to-earnings ratio of 39 and a forward P-E of 30, Adobe might seem like a pricey stock valuation-wise. But it's warranted due to a consistent record of strong earnings and sales growth.

The company's latest earnings report in September revealed another quarter of exceptional growth. Adjusted earnings rose 18% from the year-ago quarter. Sales rose 24% to $2.83 billion.

For its current fiscal year 2019, analysts are modeling profit of $7.84 a share, which would be up 16% from 2018. For 2020, look for growth to accelerate, up 24%.

Adobe's next earnings report is due next month, on or around Dec. 12. According to Zacks, look for adjusted profit to be up 23.5% to $2.26 a share. Look for sales to increase nearly 21% to $2.97 billion.

Adobe gapped above its 50-day moving average on Nov. 5, rising 4% in heavy volume. At its annual design conference, Adobe introduced Photoshop for iPad, the Fresco drawing app for Windows and an AI-powered Photoshop camera app for smartphones.

Currently, Adobe stock is forming a shallow cup base with a buy point for now of 313.21, 10 cents above its July 19 intraday high.

But an earlier entry could be seen if Adobe forms a handle area, where the last remaining sellers get shaken out of the stock in preparation for a breakout attempt. An ideal handle shows a gentle pullback in light volume.

Despite a huge price gain in recent years, Adobe's latest base is considered early stage. That's because a double-bottom pattern that formed in the second half of last year served to reset the base count.

Follow Ken Shreve on Twitter @IBD_Shreve for more stock market analysis and insight.



UPDATE 1-Regulators begin probe into Google-Ascension cloud computing deal -WSJ – Reuters

(Adds Google's response)

Nov 12 (Reuters) - A U.S. federal regulator has initiated an investigation into a cloud computing deal between Alphabet Inc's Google and Ascension Health, which would give Google access to detailed health information of millions of patients, the Wall Street Journal reported on Tuesday.

"We are happy to cooperate with any questions about the project," Google said in a blog post later on Tuesday, regarding the federal inquiry.

The Office for Civil Rights in the Department of Health and Human Services will look into the data collection to ensure the partnership is in compliance with the Health Insurance Portability and Accountability Act (HIPAA), which safeguards medical information, the Journal said (on.wsj.com/2NGPPQX).

On Monday, Google said patient data cannot and will not be combined with any Google consumer data.

Reporting by Abhishek Manikandan in Bengaluru; Additional reporting by Maria Ponnezhath; Editing by Christopher Cushing and Uttaresh.V


France and Germany outline their plan to boost European cloud computing sector – Data Economy

The company is hosting record numbers at this week's annual VMworld customer and partner fest in Barcelona, and its expanding cloud ecosystem is even sunnier than the weather here, with plenty of new alliances and services springing up.

The numbers registering for this year's VMworld show are said to be up about 2,000 on last year's event, taking the total to around 14,000, probably reflecting the number of acquisitions the vendor has made over the last year and the number of new products it has launched helped by these captures.

For instance, the Carbon Black acquisition in the security field, just completed, brought into the fold an extra 6,000 customers and 5,000 partners alone. And the firm now has the full set of major cloud players integrated into its offerings with its recently sealed alliance with Oracle, to go with its established relationships with AWS, Google, Microsoft and IBM.

At his keynote this morning, Pat Gelsinger, VMware CEO, said: "Technologists who master multi-cloud will own the next decade." And as far as he is concerned, the integration of Kubernetes with the firm's own vSphere and NSX cloud control and management systems around containers is key. But it's not simple, he said; it's more like jazz improvisation, rather than simple container orchestration.

On the cloud front, Gelsinger announced that VMware's Cloud Foundation services will now also be available in AWS data centre facilities in Sweden for the first time. And when it comes to the edge, the company has moved to support what it calls the Telco Cloud.

Gelsinger said the telcos were creating the biggest edge opportunity with their roll-out of 5G, which would create new microservices that had to be supported across end-user smartphones and the telcos' own core infrastructure. Involving NSX, the vendor has launched Project Maestro to help support Telco Cloud orchestration. He said around 100 telcos globally were already involved in the effort.

On the data security side, it was also announced that the VMware App Defense and Vulnerability Management products were being merged with seven modules acquired through the Carbon Black acquisition to offer various services to customers. In positive news for the security offering, Dell has just chosen Carbon Black as a preferred solution for all its endpoint products, including its full laptop range.

The highlight of the new techie offerings promoted at VMworld was a more detailed outline of the Tanzu set of products to build, run and manage apps in a multi-cloud environment. On the build side, the recently announced Pivotal acquisition is set to be closed by the end of the year, and it will play a central role in Project Galleon to deliver an app catalogue fast with greater security in enterprises.

For running apps, Project Pacific builds Kubernetes into VMware vSphere to provide an easier and speedier cloud orchestration solution. And for managing apps and tasks, Tanzu Mission Control delivers greater control in an enterprise- and cloud-neutral environment. All these solutions are now in beta.

These announcements and others generally went down well with delegates, and the other cloud data management players will now be looking to see how they can get a little more action by strengthening their place in the VMware ecosystem. As usual, all the big players are here with stands, including NetApp, Veritas, Veeam, Rubrik, Cohesity Networks, Druva, Dell, IBM, HPE, Fujitsu and Hitachi, along with others.


Cloud Computing in Healthcare Market 2019| In-depth Analysis by Regions, Production and Consumption by Market Size, and Forecast to 2026 | Research…

The report study researched by Research Industry US gives comprehensive knowledge and valuable insights about the Global Cloud Computing in Healthcare Market. In addition, the study attempts to deliver significant and detailed insights into the current market prospect and emerging growth scenarios. The report on the Cloud Computing in Healthcare Market also emphasizes market players as well as new entrants in the market landscape. The extensive research will help well-established as well as emerging players to set up their business strategies and achieve their short-term and long-term goals. The report also adds significant details arising from the evaluation of the scope of the regions and where the key participants should head to find potential growth opportunities in the future.

The Cloud Computing in Healthcare market research report presents a detailed analysis based on the thorough research of the overall market, particularly on questions that border on the market size, growth scenario, potential opportunities, operation landscape, trend analysis, and competitive analysis of Cloud Computing in Healthcare Market. This research is conducted to understand the current landscape of the market, especially in 2019. This will shape the future of the market and foresee the extent of competition in the market. This report will also help all the manufacturers and investors to have a better understanding of the direction in which the market is headed.

Request For Free Sample Report (Kindly Use Your Business/Corporate Email Id to Get Priority): http://researchindustry.us/report/global-cloud-computing-in-healthcare-market-ric/507893/request-sample

Leading players operating in the market include

Microsoft, International Business Machines (IBM), Dell, ORACLE, Carestream Health, Merge Healthcare, GE Healthcare, Athenahealth, Agfa-Gevaert, CareCloud

Our researchers have taken into account significant aspects of the vendor landscape such as strategy framework, company market positioning, and competitive environment for providing detailed competitive analysis of the global Cloud Computing in Healthcare market. For company profiling, they studied strategic initiatives, product benchmarking, and financial performance of leading players included for the research study.

Through statistical analysis, the Cloud Computing in Healthcare Market report depicts the global Cloud Computing in Healthcare industry, including capacity, production, production value, cost/profit, supply/demand, and global import/export. The total market is further divided by type/application, by country, and by company for the competitive landscape analysis. The report delivers a straightforward overview of the industry, including its definition, applications and manufacturing technology.

The report also analyses global markets including growth trends, business opportunities, investment plans, and expert opinions. It covers all data on the global and regional markets, including notable and future patterns of market demand. The report then estimates market development trends of the Cloud Computing in Healthcare industry through the forecast period to 2025.

Market Segmentation

By Types, the Cloud Computing in Healthcare Market can be Split into:

Hardware, Software, Services

By Applications, the Cloud Computing in Healthcare Market can be Split into:

Hospital, Clinics, Others

Enquire Here For Queries Or Report Customization: http://researchindustry.us/report/global-cloud-computing-in-healthcare-market-ric/507893/request-customization

Report Objectives:

Analyzing the size of the global Cloud Computing in Healthcare market on the basis of value and volume.

Accurately calculating the market segments, consumption, and other dynamic factors of different sections of the global Cloud Computing in Healthcare market.

Determining the key dynamics of the global Cloud Computing in Healthcare market.

Highlighting significant trends of the global Cloud Computing in Healthcare market in terms of manufacture, revenue, and sales.

Deeply summarizing top players of the global Cloud Computing in Healthcare market and showing how they compete in the industry.

Studying industry product pricing, processes and costs, and various trends related to them.

Displaying the performance of different regions and countries in the global Cloud Computing in Healthcare market.

If you need specific information which is not currently available in the scope of the report, we will provide it to you as a part of customization. To know more, please send us your inquiry (help@researchindustry.us).

Get In Touch!

Navale ICON IT Park,

Office No. 407, 4th Floor, Mumbai Banglore Highway, Narhe, Pune

Maharashtra 411041

Phone: +91-020-67320403

Email: help@researchindustry.us

Read this article:

Cloud Computing in Healthcare Market 2019| In-depth Analysis by Regions, Production and Consumption by Market Size, and Forecast to 2026 | Research...

Task force on artificial intelligence hearing: AI and the evolution of cloud computing – key testimony on the risks, challenges and opportunities -…

On October 18, 2019, the Task Force on Artificial Intelligence, a task force within the House Financial Services Committee (FSC), held a hearing titled "AI and the Evolution of Cloud Computing: Evaluating How Financial Data is Stored, Protected, and Maintained by Cloud Providers." In a memorandum published before the hearing, the FSC noted that financial institutions have adopted cloud computing for non-core purposes (e.g., human resources, customer relationship management, etc.) while exercising caution when migrating core services and activities (e.g., payments and retail banking). However, the memorandum notes that over the next five to 10 years, the expectation is that banks will move more core functions to the cloud. The FSC notes that AI is a component of cloud computing because it helps streamline tasks, improves how data is managed and provides real-time cyber defense.

Financial institutions that use cloud computing and cloud service providers (CSPs) have legal compliance obligations when they use cloud computing to perform both non-core and core functions. For example, federal regulators require CSPs to meet the same regulatory requirements as if the financial institution performed the activities (e.g., complying with the Bank Service Company Act or the Gramm-Leach-Bliley Act (GLBA)). As the FSC memorandum notes, examiners from the Federal Reserve recently visited a large CSP, and the CSP balked when the Federal Reserve asked it to provide additional information after the on-site examination. Further, the CSP sought clarity from the Federal Reserve on how it would use and store that information and who would have access to it. Therefore, the concerns over data privacy run both ways. As numerous witnesses in the hearing and members of the FSC noted, greater clarity from regulators regarding the use of CSPs by financial institutions would be beneficial. This echoed a 2018 Treasury report on Nonbank Financials, Fintech, and Innovation, which noted that "[f]inancial services firms face several regulatory challenges related to the adoption of cloud, driven in large part by a regulatory regime that has yet to be sufficiently modernized to accommodate cloud and other innovative technologies."

The hearing addressed these compliance issues as well as issues related to consolidation, privacy and security. Below is a summary of the participants presentations.

The question and answer session that followed repeatedly focused on security issues posed by the use of CSPs, including whether and how CSPs can be better trained to understand the financial regulatory requirements imposed on their financial institution clients. Another concern mentioned was the difficulty associated with attribution when an error or breach occurs with a CSP (from the perspective of who may have been at fault and who actually committed the act; Ms. Broussard noted that AI is useful in helping identify and protect against known vulnerabilities but that it struggles with unknown unknowns). Finally, near the end of the question and answer session, Mr. Benda noted the difficulties associated with the need to comply with both state laws, which can vary, sometimes significantly, in their requirements, and federal laws, and requested that one harmonized approach be adopted so that banks do not have to answer to 51 masters.

This was the third hearing of the Task Force on Artificial Intelligence. You can watch the full hearing here.


The emergence of edge computing – Financier Worldwide

"Only when the information has been processed and refined is the data sent to the cloud, if at all," the company continues. Edge computing is becoming more and more relevant with the growing popularity of the Internet of Things (IoT).

Testifying to this is a 2018 CB Insights report, "What Is Edge Computing?", which notes edge computing's particular relevance in the realm of autonomous vehicles. "An autonomous vehicle is essentially a large, high-powered computer on wheels that collects data through a multitude of sensors," says the report. "For these vehicles to operate safely and reliably, they need to respond to their surroundings right away. Any lag in processing speed can be deadly."

Pros and cons

What makes edge computing an effective data processing option is that it creates an environment that acts as a hybrid between cloud and local processing, combining the key attributes of both.

"The cloud brings flexibility, scalability and services decoupled from hardware to improve end-user experience," says Dalia Adib, edge computing practice lead at STL Partners. "But processing locally reduces the amount of data traversing through the network, decreases bandwidth costs and ensures latency is kept to a minimum to support mission critical services. As data privacy concerns grow, edge computing makes it easier for businesses to manage their data and avoid it being stored in remote public clouds."

While businesses that adopt edge computing are looking for enhanced application performance and reduced costs, they must also be aware of its limitations.

"The challenge is that there are different edges, including device edge, on-premises and network edge, and each has its pros and cons," explains Ms Adib. "The network edge can dramatically decrease latency to 5 to 10 milliseconds, but could potentially be more costly for a developer to use than the cloud. It is more difficult to benefit from economies of scale and the number of edge locations is limited, at least in the short term. In reality, latency is affected by many factors, including hardware (processing power), location, application and architecture, among others, and determining an IT architecture is not a straightforward decision."

Mainstream adoption

While the adoption of edge computing is generally viewed as being at an early stage, its burgeoning status as a transformative business operation suggests it is on the cusp of mainstream adoption. Indeed, according to the International Data Corporation (IDC), by 2022, 40 percent of companies' cloud deployments will include edge computing.

"The edge computing market is not new; we already have applications running on various edges, such as device edge, on-premise edge and CDN edge," says Ms Adib. "For example, Android phones run machine learning (ML) models on the device to adapt the keyboard based on user behaviour and only aggregated data is sent back to the data centre."
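The on-device pattern Ms Adib describes (process raw data locally, send back only an aggregate) can be sketched in a few lines; the function name below is illustrative, not part of any real edge SDK.

```python
# Edge pattern: keep raw readings on the device, transmit only a summary.
# summarize_on_device is a hypothetical name, not a real API.

def summarize_on_device(readings):
    """Reduce raw sensor readings to a compact summary before upload."""
    if not readings:
        return {"count": 0, "mean": None, "max": None}
    return {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "max": max(readings),
    }

raw = [21.0, 21.5, 22.0, 21.5]       # raw data stays local
payload = summarize_on_device(raw)   # only this small dict crosses the network
print(payload)  # {'count': 4, 'mean': 21.5, 'max': 22.0}
```

The bandwidth saving is the point: four floats (or four million) reduce to one fixed-size summary before anything leaves the device.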

"Manufacturers are using edge computing at their production facilities to run IoT applications, for applications like predictive maintenance," she continues. "What we will see in the next five years is growth in network edge: edge servers running at locations on the telecoms network. This is a nascent market and very few operators have announced any live, commercial deployments yet."

A potentially powerful force for the future of IT and business, edge computing, despite its history, is still viewed as a new paradigm by many: a mechanism to reduce latency and streamline data traffic which may, in time, replace cloud computing as the favoured data storage solution.

Financier Worldwide

Continued here:

The emergence of edge computing - Financier Worldwide

What is Cloud Computing? – Amazon Web Services

Whether you are running applications that share photos with millions of mobile users or you're supporting the critical operations of your business, a cloud services platform provides rapid access to flexible and low-cost IT resources. With cloud computing, you don't need to make large upfront investments in hardware and spend a lot of time on the heavy lifting of managing that hardware. Instead, you can provision exactly the right type and size of computing resources you need to power your newest bright idea or operate your IT department. You can access as many resources as you need, almost instantly, and only pay for what you use.
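The "only pay for what you use" argument comes down to simple arithmetic. The sketch below is illustrative only; the upfront cost and hourly rate are invented assumptions, not AWS prices.

```python
# Illustrative sketch: upfront hardware purchase vs. on-demand,
# pay-per-use pricing. All figures are invented for the example.

UPFRONT_SERVER_COST = 10_000.0   # buy-and-own, paid once (assumption)
ON_DEMAND_PER_HOUR = 0.50        # pay only while running (assumption)

def on_demand_cost(hours_used):
    """Cost of renting capacity only for the hours actually used."""
    return hours_used * ON_DEMAND_PER_HOUR

def cheaper_option(hours_used):
    """Which option costs less for this usage level?"""
    return "on-demand" if on_demand_cost(hours_used) < UPFRONT_SERVER_COST else "buy"

# A bursty workload running 2 hours a day for a year:
hours = 2 * 365
print(on_demand_cost(hours))   # 365.0
print(cheaper_option(hours))   # on-demand
```

For intermittent workloads the rental model wins easily; only at sustained, near-constant utilization does owning the hardware pay off, which is exactly the trade-off the paragraph describes.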

Cloud computing provides a simple way to access servers, storage, databases and a broad set of application services over the Internet. A cloud services platform such as Amazon Web Services owns and maintains the network-connected hardware required for these application services, while you provision and use what you need via a web application.

Read the rest here:

What is Cloud Computing? - Amazon Web Services

Cloud computing: A complete guide | IBM

Enterprises eager to undergo digital transformations and modernize their applications are quick to see the value of adopting a cloud computing platform. They are increasingly finding business agility or cost savings by renting software. Each cloud computing service and deployment model type provides you with different levels of control, flexibility and management. Therefore, it's important to understand the differences between them.

Common convention points to public cloud as the delivery model of choice; but, when considering the right architecture of cloud computing for your applications and workloads, you must begin by addressing the unique needs of your business.

This can include many factors, such as government regulations, security, performance, data residency, service levels, time to market, architecture complexity, skills and preventing vendor lock-in. Add in the need to incorporate the emerging technologies, and you can see why IT leaders are challenging the notion that cloud computing migration is easy.

At first glance, the types of cloud computing seem simple: public, private or a hybrid mix of both. In reality, the choices are many. Public cloud can include shared, dedicated and bare metal delivery models. Fully and partially managed clouds are also options. And, in some cases, especially for existing applications where architectures are too complex to move or the cost-benefit ratio is not optimal, cloud may not be the right choice.

The right model depends on your workload. You should understand the pluses and minuses of each cloud deployment model and take a methodical approach to determining which workloads to move to which type of cloud for the maximum benefit.
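The "methodical approach" to model selection that IBM recommends can be made concrete as a weighted scoring matrix. This is a hypothetical sketch: the criteria, weights and 1-to-5 scores below are invented for illustration, not IBM's methodology.

```python
# Hypothetical sketch: scoring cloud deployment models against
# weighted business criteria. All weights and scores are invented.

WEIGHTS = {"control": 0.5, "cost": 0.3, "time_to_market": 0.2}

SCORES = {  # 1 (poor) to 5 (excellent), per illustrative assumption
    "public":  {"control": 2, "cost": 5, "time_to_market": 5},
    "private": {"control": 5, "cost": 2, "time_to_market": 2},
    "hybrid":  {"control": 4, "cost": 4, "time_to_market": 3},
}

def weighted_score(model):
    """Sum of criterion scores weighted by business priorities."""
    return sum(WEIGHTS[c] * s for c, s in SCORES[model].items())

best = max(SCORES, key=weighted_score)
print(best, round(weighted_score(best), 2))
```

Changing the weights changes the winner, which is the point: for a workload that prizes control above all, the same matrix would favour private cloud.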

Dive deeper into specific cloud service and deployment models, cloud computing architecture and cloud computing examples

See the original post:

Cloud computing: A complete guide | IBM

Cloud Computing – Yahoo

Background

What's all the fluff about cloud computing? There are plenty of reasons it's the most talked-about trend in technology, starting with the fact that it helps reduce the up-front capital needed to build IT infrastructure and develop software. Cloud services are so appealing that the total market is expected to nearly triple from 2010 to 2016. (Yep, you read that right.)

Of course, technology companies have clamored to add cloud computing to their repertoires, leading to lots of M&A activity. Software and Internet deals represented 57% of transactions closed in 2012, a figure that has grown steadily over the last two years. All of which leaves the cloud looking like a lot more than a passing storm.

We identified US-listed stocks and American Depository Receipts of companies that are engaged in activities relevant to this watchlist's theme. We then filtered out companies that have a share price of less than .00 or a market capitalization less than 00 million, and excluded illiquid stocks by screening companies for liquidity i.e. average bid-ask spreads, dollar volume traded etc. Finally the proprietary Motif Optimization Engine determined the constituent stocks. Learn more about how we select our watchlists.

Motif is an online brokerage built on thematic portfolios of up to 30 stocks and ETFs. Founded in 2010 by Hardeep Walia, Motif combines complex proprietary algorithms with skilled advisers to develop these thematic portfolios. Learn more about our team.

First, we determined each company's percentage of total revenue derived from this watchlist's theme. Second, we applied a pure-play factor to give greater relative weight to companies that derive a higher percentage of their revenue from this theme. Finally, we weighted each company by its market capitalization adjusted for revenue exposure to the theme.

More details on how we build and weight watchlists are available here.

Go here to see the original:

Cloud Computing - Yahoo

cloud computing

Making Design Thinking real

I was hired into a multidisciplinary corporate strategy team set up by Hasso Plattner, the chairman of SAP's supervisory board and the only co-founder still with the company. The team's mission was to help SAP embrace design thinking in how it built products and processes, as well as in how it worked with customers. It was the best multidisciplinary team one could imagine being part of. We were multidisciplinary to a fault; I used to joke that my team members and I had nothing in common. I am proud to have been part of this journey and the impact we helped achieve. Over the years we managed to take the quotation marks off "design thinking", making it a default mindset and philosophy in all parts of SAP. It was a testament to the fact that any bold and audacious mission starts with a few simple steps and can be accomplished if there is a small, passionate team behind it striving to make an impact.

Be part of the foundation of something disruptive

Being part of the Office of the CEO, I worked with two CEOs, Henning and Leo, and their respective executive management teams. This was by far the best learning experience of my life. I got an opportunity to work across lines of business and got first-hand exposure to intricate parts of SAP's business. As part of the corporate strategy team I also got an opportunity to work on the Business Objects post-merger integration, especially the joint product vision. Some of that work led to the foundation of one of the most disruptive products SAP later released, SAP HANA.

Fuel the insane growth of SAP HANA

HANA just happened to SAP. The market and competition were not expecting us to do anything in this space. Most people at SAP didn't realize its full potential, and most customers didn't believe it could actually help them. I don't blame them. HANA was such a radically foreign concept that it created skepticism and enthusiasm at the same time. I took on many different roles and worked extensively with various parts of the organization and SAP's customers to explore, identify, and realize breakthrough scenarios that exploited the unique and differentiating aspects of HANA.

HANA's value was perceived as helping customers do things better, cheaper, and faster. But I was on an orthogonal, and rather difficult, mission: to help our customers do things they could not have done before, or could not even have imagined doing.

I was fortunate enough to contribute significantly to the early adoption of HANA (zero to a billion dollars in revenue in three years), which also went on to become the fastest growing product in SAP's history. I got a chance to work closely with Vishal Sikka, the CTO of SAP and informally known as the father of HANA, on this endeavor and on many other things. It was also a pleasure to work with some of the most prominent global SAP customers, industry leaders who taught me a lot about their business.

Incubate a completely new class of data-first cloud solutions

As HANA started to become a foundation and platform for everything we built at SAP, my team took on a customer-driven, part-accelerator and part-incubator role: to further leverage the most differentiating aspects of the platform and combine them with machine learning and AI to help build new, greenfield, data-first cloud solutions that reimagined enterprise scenarios. These solutions created the potential for more sustaining revenue in the days to come.

Practice the General Manager model with a start-up mindset

A true General Manager model is rare or non-existent at SAP (and at many other ISVs), but we implemented that model in my team, where I was empowered to run all the functions (engineering, design, product management, product marketing, and business development) and assumed overall P&L responsibility for the team. The buck stopped with me, and as a team we could make swift business decisions. The team members also felt a strong purpose in how their work helped SAP. Oftentimes, people would come up to me and say, "So your team is like a start-up." I would politely tell them that calling my team a start-up would be a great disservice to all the real start-ups out there. However, I tried very hard for us to embrace the start-up culture: small, tight teams; experimentation; rewarding effort and not just the outcome; being mission- and purpose-driven to a fault; breaking things to make them work; insanely short project timelines; and a mid- to long-term vision with focused, short-term, extremely agile execution. And we leveraged the biggest asset SAP has: its customers.

Be part of a transformative journey

I was fortunate to witness SAP's successful transformation into a cloud company without compromising on margins or revenue, and the HANA-led in-memory revolution that not only paved the path for a completely new category of software but also produced the fastest growing product in SAP's history. These kinds of things simply don't happen to everyone, and I was fortunate to be part of this journey. I have tremendous respect for SAP as a company and for its leaders, especially the CEO Bill McDermott, in what the company has achieved. I'm thankful to all the people who helped and mentored me, and, more importantly, believed in me.

Looking forward to not doing anything, at least for a short period of time

At times, such a long and fast-paced journey somewhat desensitizes you to the real world. I want to slow down, take a step back, and rethink how the current technology storm in Silicon Valley will disrupt the world again, as it always has, and how I can be part of that journey once more. There are also personal projects I have been putting off for a while that I want to tend to. I'm hoping a short break will help me reenergize and see the world differently. When I broke this news to my mom she didn't freak out. I must have made the right decision!

I want to disconnect to reconnect.

I am looking forward to doing away with my rush-hour commute on 101 for a while, to smell the proverbial roses. I won't miss 6 AM conference calls, but I will certainly miss those cute self-driving Google cars on the streets of Palo Alto. They always remind me of why the valley is such a great place. For a product person, a technology enthusiast, and a generalist like me who has experienced and practiced all three sides of building software (feasibility, viability, and desirability), the valley is full of promise and immense excitement. In the coming days I am hoping to learn from friends and thought leaders, which will eventually lead me to my next tour of duty.

About the picture: I was on a hiking trip to four national parks a few years ago when I took this picture, standing in the middle of a road inside Death Valley National Park. The C-curve on a rather straight road is the only place on that long stretch where you can get cell phone reception. Even short hiking trips have helped me gain a new perspective on work and life.

Read more here:

cloud computing

Cloud computing – Simple English Wikipedia, the free encyclopedia

In computer science, cloud computing describes a type of outsourcing of computer services, similar to the way in which electricity supply is outsourced. Users can simply use it. They do not need to worry where the electricity is from, how it is made, or transported. Every month, they pay for what they consumed.

The idea behind cloud computing is similar: the user can simply use storage, computing power, or specially crafted development environments, without having to worry how these work internally. Cloud computing is usually Internet-based computing. The "cloud" is a metaphor for the Internet, based on how the Internet is depicted in computer network diagrams; it is an abstraction hiding the complex infrastructure of the Internet.[1] It is a type of computing in which IT-related capabilities are provided as a service,[2] allowing users to access technology-enabled services from the Internet ("in the cloud")[3] without knowledge of, or control over, the technologies behind these servers, which can lead to ethical and legal issues.[4]

According to a paper published by IEEE Internet Computing in 2008 "Cloud Computing is a paradigm in which information is permanently stored in servers on the Internet and cached temporarily on clients that include computers, laptops, handhelds, sensors, etc."[5] This concept was first introduced by Cynthia Carter of DataNet, Inc. (https://www.slideshare.net/slideshow/embed_code/key/oQ67EY4it49b0s).

Cloud computing is a general concept that utilizes software as a service (SaaS), such as Web 2.0 and other technology trends, all of which depend on the Internet for satisfying users' needs. For example, Google Apps provides common business applications online that are accessed from a web browser, while the software and data are stored on the Internet servers.

Cloud computing is often confused with other ideas:

Cloud computing often uses grid computing, has autonomic characteristics and is billed like utilities, but cloud computing can be seen as a natural next step from the grid-utility model.[8] Some successful cloud architectures have little or no centralised infrastructure or billing systems including peer-to-peer networks like BitTorrent and Skype.[9]

The majority of cloud computing infrastructure currently consists of reliable services delivered through data centers that are built on computer and storage virtualization technologies. The services are accessible anywhere in the world, with The Cloud appearing as a single point of access for all the computing needs of consumers. Commercial offerings need to meet the quality of service requirements of customers and typically offer service level agreements.[10] Open standards and open source software are also critical to the growth of cloud computing.[11]

Customers generally do not own the infrastructure or know all the details about it; they are merely accessing or renting it, so they can consume resources as a service and may end up paying for capacity they do not need rather than for what they actually use. Many cloud computing providers use the utility computing model, which is analogous to how traditional public utilities like electricity are consumed, while others bill on a subscription basis. By sharing consumable and "intangible" computing power between multiple "tenants", utilization rates can be improved (as servers are not left idle), which can reduce costs significantly while increasing the speed of application development.
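The multi-tenancy argument above can be sketched numerically: because tenants' peaks rarely coincide, shared infrastructure needs less total capacity than the sum of each tenant's dedicated provisioning. The workload figures below are invented for illustration.

```python
# Sketch of the multi-tenancy argument: aggregated demand smooths out
# individual peaks, so shared servers sit idle less often.
# All workload numbers are invented assumptions.

SERVER_CAPACITY = 100  # arbitrary load units per server

def servers_needed(peak_load):
    """Each tenant alone must provision for its own peak (ceiling division)."""
    return -(-peak_load // SERVER_CAPACITY)

# Three tenants whose peaks happen at different times of day.
tenant_peaks = [90, 80, 70]
dedicated = sum(servers_needed(p) for p in tenant_peaks)

# Shared infrastructure only needs to cover the combined *simultaneous*
# peak, which (by assumption here) is lower than the sum of the peaks.
combined_peak = 150
shared = servers_needed(combined_peak)

print(dedicated, shared)  # 3 2
```

Three dedicated servers become two shared ones, and utilization of the shared pool rises accordingly; that headroom saving is where the cost reduction in the paragraph comes from.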

A side effect of this approach is that "computer capacity rises dramatically" as customers do not have to engineer for peak loads.[12] Adoption has been enabled by "increased high-speed bandwidth" which makes it possible to receive the same response times from centralized infrastructure at other sites.

Cloud computing is being driven by providers including Google, Amazon.com, and Yahoo! as well as traditional vendors including IBM, Intel,[13] Microsoft[14] and SAP.[15] It can be adopted by all kinds of users, be they individuals or large enterprises. Most internet users are currently using cloud services, even if they do not realize it. Webmail, for example, is a cloud service, as are Facebook and Wikipedia, contact list synchronization and online data backups.

The Cloud[16] is a metaphor for the Internet,[17] or more generally components and services which are managed by others.[1]

The underlying concept dates back to 1960, when John McCarthy expressed his opinion that "computation may someday be organized as a public utility", and the term "cloud" was already in commercial use in the early 1990s to refer to large ATM networks.[18] By the turn of the 21st century, cloud computing solutions had started to appear on the market,[19] though most of the focus at this time was on software as a service.

Amazon.com played a key role in the development of cloud computing when upgrading their data centers after the dot-com bubble and providing access to their systems by way of Amazon Web Services in 2002 on a utility computing basis. They found the new cloud architecture resulted in significant internal efficiency improvements.[20]

2007 saw increased activity, with Google, IBM and a number of universities starting large-scale cloud computing research projects,[21] around the time the term started gaining popularity in the mainstream press. By mid-2008 it was a hot topic, and numerous cloud computing events had been scheduled.[22]

In August 2008 Gartner observed that "organizations are switching from company-owned hardware and software assets to per-use service-based models" and that the "projected shift to cloud computing will result in dramatic growth in IT products in some areas and in significant reductions in other areas".[23]

Clouds cross many country borders and "may be the ultimate form of globalisation".[24] As such, it is the subject of complex geopolitical issues: providers must satisfy many legal restrictions in order to deliver service to a global market. This dates back to the early days of the Internet, when libertarian thinkers felt that "cyberspace was a distinct place calling for laws and legal institutions of its own"; author Neal Stephenson envisaged this as a tiny island data haven in his classic science-fiction novel Cryptonomicon.[24]

Although there have been efforts to harmonize the legal environment (such as US-EU Safe Harbor), providers like Amazon Web Services usually deal with international markets (typically the United States and European Union) by deploying local infrastructure and allowing customers to select their countries.[25] However, there are still concerns about security and privacy for individuals at various governmental levels (for example, the USA PATRIOT Act, the use of national security letters, and Title II of the Electronic Communications Privacy Act, the Stored Communications Act).

In March 2007, Dell applied to trademark the term "cloud computing" in the United States. It received a "Notice of Allowance" in July 2008, which was subsequently canceled on August 6, resulting in a formal rejection of the trademark application less than a week later.

In November 2007, the Free Software Foundation released the Affero General Public License (abbreviated as Affero GPL and AGPL), a version of GPLv3 designed to close a perceived legal loophole associated with free software designed to be run over a network, particularly software as a service. Under the AGPL, application service providers are required to release any changes they make to AGPL-licensed open source code.

Cloud architecture[26] is the systems architecture of the software systems involved in the delivery of cloud computing (e.g. hardware, software) as designed by a cloud architect who typically works for a cloud integrator. It typically involves multiple cloud components communicating with each other over application programming interfaces (usually web services).[27]

This is very similar to the Unix philosophy of having multiple programs doing one thing well and working together over universal interfaces. Complexity is controlled and the resulting systems are more manageable than their monolithic counterparts.

Cloud architecture extends to the client where web browsers and/or software applications are used to access cloud applications.

Cloud storage architecture is loosely coupled where metadata operations are centralized enabling the data nodes to scale into the hundreds, each independently delivering data to applications or users.

A cloud application leverages The Cloud model of software architecture, often eliminating the need to install and run the application on the customer's own computer, thus reducing software maintenance, ongoing operations, and support. For example:

A cloud client is computer hardware and/or computer software which relies on The Cloud for application delivery, or which is specifically designed for delivery of cloud services, and which is in either case essentially useless without a Cloud.[33] For example:

Cloud infrastructure (e.g. Infrastructure as a service) is the delivery of computer infrastructure (typically a platform virtualization environment) as a service.[41] For example:

A cloud platform (e.g. Platform as a service) (the delivery of a computing platform and/or solution stack as a service) [42] facilitates deployment of applications without the cost and complexity of buying and managing the underlying hardware and software layers.[43] For example:

A cloud service (e.g. Web Service) is "software system[s] designed to support interoperable machine-to-machine interaction over a network"[44] which may be accessed by other cloud computing components, software (e.g. Software plus services) or end users directly.[45] For example:

Cloud storage is the delivery of data storage as a service (including database-like services), often billed on a utility computing basis (e.g. per gigabyte per month).[46] For example:
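Utility-style billing for storage (per gigabyte per month) can be sketched with a short proration routine. The rate and usage figures below are invented assumptions, not any vendor's real prices.

```python
# Sketch of utility-style storage billing (per gigabyte per month),
# prorated over a 30-day month. All figures are invented assumptions.

RATE_PER_GB_MONTH = 0.02  # assumed price, not a real vendor rate

def monthly_bill(gb_days):
    """Bill from (gigabytes, days-stored) pairs, prorated over 30 days."""
    return sum(gb * days / 30 * RATE_PER_GB_MONTH for gb, days in gb_days)

# 500 GB stored all month plus 300 GB kept for only 10 days:
bill = monthly_bill([(500, 30), (300, 10)])
print(round(bill, 2))  # 12.0
```

The short-lived 300 GB costs a third of what it would cost for a full month, which is the "pay only for what you use" property the article attributes to utility billing.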

Traditional storage vendors have recently begun to offer their own flavor of cloud storage, sometimes in conjunction with their existing software products (e.g. Symantec's Online Storage for Backup Exec). Others focus on providing a new kind of back-end storage optimally designed for delivering cloud storage (EMC's Atmos), categorically known as Cloud Optimized Storage.

A cloud computing provider or cloud computing service provider owns and operates cloud computing systems to serve someone else. Usually this requires building and managing new data centers. Some organisations get some of the benefits of cloud computing by becoming "internal" cloud providers and servicing themselves, though they do not benefit from the same economies of scale and still have to engineer for peak loads. The barrier to entry is also significantly higher: capital expenditure is required, and billing and management create some overhead. However, significant operational efficiency and agility advantages can be achieved even by small organizations, and server consolidation and virtualization rollouts are already in progress.[47] Amazon.com was the first such provider, modernising its data centers which, like most computer networks, were using as little as 10% of their capacity at any one time, just to leave room for occasional spikes. This allowed small, fast-moving groups to add new features faster and more easily, and Amazon went on to open the system up to outsiders as Amazon Web Services in 2002 on a utility computing basis.[20]

The companies listed in the Components section are providers.

A user is a consumer of cloud computing.[33] The privacy of users in cloud computing has become of increasing concern.[48][49] The rights of users are also an issue, which is being addressed via a community effort to create a bill of rights (currently in draft).[50][51]

A vendor sells products and services that facilitate the delivery, adoption and use of cloud computing.[52] For example:

A cloud standard is one of a number of existing (typically lightweight) open standards that have facilitated the growth of cloud computing, including:[57]

Read the original here:

Cloud computing - Simple English Wikipedia, the free encyclopedia

Cloud computing | computer science | Britannica.com

Cloud computing, method of running application software and storing related data in central computer systems and providing customers or other users access to them through the Internet.

The origin of the expression cloud computing is obscure, but it appears to derive from the practice of using drawings of stylized clouds to denote networks in diagrams of computing and communications systems. The term came into popular use in 2008, though the practice of providing remote access to computing functions through networks dates back to the mainframe time-sharing systems of the 1960s and 1970s. In his 1966 book The Challenge of the Computer Utility, the Canadian electrical engineer Douglas F. Parkhill predicted that the computer industry would come to resemble a public utility in which many remotely located users are connected via communication links to a central computing facility.

For decades, efforts to create large-scale computer utilities were frustrated by constraints on the capacity of telecommunications networks such as the telephone system. It was cheaper and easier for companies and other organizations to store data and run applications on private computing systems maintained within their own facilities.

The constraints on network capacity began to be removed in the 1990s when telecommunications companies invested in high-capacity fibre-optic networks in response to the rapidly growing use of the Internet as a shared network for exchanging information. In the late 1990s, a number of companies, called application service providers (ASPs), were founded to supply computer applications to companies over the Internet. Most of the early ASPs failed, but their model of supplying applications remotely became popular a decade later, when it was renamed cloud computing.

Cloud computing encompasses a number of different services. One set of services, sometimes called software as a service (SaaS), involves the supply of a discrete application to outside users. The application can be geared either to business users (such as an accounting application) or to consumers (such as an application for storing and sharing personal photographs). Another set of services, variously called utility computing, grid computing, and hardware as a service (HaaS), involves the provision of computer processing and data storage to outside users, who are able to run their own applications and store their own data on the remote system. A third set of services, sometimes called platform as a service (PaaS), involves the supply of remote computing capacity along with a set of software-development tools for use by outside software programmers.

Early pioneers of cloud computing include Salesforce.com, which supplies a popular business application for managing sales and marketing efforts; Google, Inc., which in addition to its search engine supplies an array of applications, known as Google Apps, to consumers and businesses; and Amazon Web Services, a division of online retailer Amazon.com, which offers access to its computing system to Web-site developers and other companies and individuals. Cloud computing also underpins popular social networks and other online media sites such as Facebook, MySpace, and Twitter. Traditional software companies, including Microsoft Corporation, Apple Inc., Intuit Inc., and Oracle Corporation, have also introduced cloud applications.

Cloud-computing companies either charge users for their services, through subscriptions and usage fees, or provide free access to the services and charge companies for placing advertisements in the services. Because the profitability of cloud services tends to be much lower than the profitability of selling or licensing hardware components and software programs, it is viewed as a potential threat to the businesses of many traditional computing companies.

Construction of the large data centres that run cloud-computing services often requires investments of hundreds of millions of dollars. The centres typically contain thousands of server computers networked together into parallel-processing or grid-computing systems. The centres also often employ sophisticated virtualization technologies, which allow computer systems to be divided into many virtual machines that can be rented temporarily to customers. Because of their intensive use of electricity, the centres are often located near hydroelectric dams or other sources of cheap and plentiful electric power.
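The virtualization idea described above, dividing one physical system into many virtual machines rented temporarily to customers, can be sketched as a toy allocator. This is purely illustrative; the capacity units and first-come-first-served policy are assumptions for the example.

```python
# Toy sketch of virtualization: one physical server's capacity is
# carved into virtual machines rented to different customers.
# Sizes are arbitrary units invented for the example.

PHYSICAL_CAPACITY = 64  # e.g. GB of RAM on one host (assumption)

def allocate(requests, capacity=PHYSICAL_CAPACITY):
    """Grant VM requests first-come-first-served until the host is full.
    Returns (granted, rejected) lists of (customer, size) pairs."""
    granted, rejected, used = [], [], 0
    for customer, size in requests:
        if used + size <= capacity:
            granted.append((customer, size))
            used += size
        else:
            rejected.append((customer, size))
    return granted, rejected

granted, rejected = allocate([("a", 32), ("b", 16), ("c", 32), ("d", 8)])
print([c for c, _ in granted])   # customers whose VMs fit on the host
print([c for c, _ in rejected])  # overflow that a real cloud would place elsewhere
```

A real hypervisor scheduler is far more sophisticated, but the principle is the same: slicing one machine's capacity among many temporary tenants is what makes short-term rental of computing economically viable.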

Because cloud computing involves the storage of often sensitive personal or commercial information in central database systems run by third parties, it raises concerns about data privacy and security as well as the transmission of data across national boundaries. It also stirs fears about the eventual creation of data monopolies or oligopolies. Some believe that cloud computing will, like other public utilities, come to be heavily regulated by governments.

Follow this link:

Cloud computing | computer science | Britannica.com

Cloud computing – Simple English Wikipedia, the free …

In computer science, cloud computing describes a type of outsourcing of computer services, similar to the way in which electricity supply is outsourced. Users can simply use it. They do not need to worry where the electricity is from, how it is made, or transported. Every month, they pay for what they consumed.

The idea behind cloud computing is similar: The user can simply use storage, computing power, or specially crafted development environments, without having to worry how these work internally. Cloud computing is usually Internet-based computing. The cloud is a metaphor for the Internet based on how the internet is described in computer network diagrams; which means it is an abstraction hiding the complex infrastructure of the internet.[1] It is a type of computing in which IT-related capabilities are provided as a service,[2] allowing users to access technology-enabled services from the Internet ("in the cloud")[3] without knowledge of, or control over the technologies behind these servers, which can lead to ethical and legal issues.[4]

According to a paper published by IEEE Internet Computing in 2008 "Cloud Computing is a paradigm in which information is permanently stored in servers on the Internet and cached temporarily on clients that include computers, laptops, handhelds, sensors, etc."[5] This concept was first introduced by Cynthia Carter of DataNet, Inc. (https://www.slideshare.net/slideshow/embed_code/key/oQ67EY4it49b0s).

Cloud computing is a general concept that utilizes software as a service (SaaS), such as Web 2.0 and )other technology trends, all of which depend on the Internet for satisfying users' needs. For example, Google Apps provides common business applications online that are accessed from a web browser, while the software and data are stored on the Internet servers.

Cloud computing is often confused with other ideas:

Cloud computing often uses grid computing, has autonomic characteristics and is billed like utilities, but cloud computing can be seen as a natural next step from the grid-utility model.[8] Some successful cloud architectures have little or no centralised infrastructure or billing systems including peer-to-peer networks like BitTorrent and Skype.[9]

The majority of cloud computing infrastructure currently consists of reliable services delivered through data centers that are built on computer and storage virtualization technologies. The services are accessible anywhere in the world, with The Cloud appearing as a single point of access for all the computing needs of consumers. Commercial offerings need to meet the quality of service requirements of customers and typically offer service level agreements.[10] Open standards and open source software are also critical to the growth of cloud computing.[11]

Customers generally do not own the infrastructure and may not know all of its details; they access or rent it, consuming resources as a service and paying for what they actually use rather than for capacity they do not need. Many cloud computing providers use the utility computing model, which is analogous to how traditional public utilities like electricity are consumed, while others bill on a subscription basis. By sharing consumable and "intangible" computing power between multiple "tenants", utilization rates can be improved (as servers are not left idle), which can reduce costs significantly while increasing the speed of application development.
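
The billing contrast described above can be made concrete with a minimal sketch (not from the article; all rates and usage figures below are hypothetical): utility-style pay-per-use versus owning enough capacity to cover the peak.

```python
# Hedged sketch: utility (pay-per-use) billing vs. provisioning for peak.
# Every number here is a made-up illustration, not a real price.

def utility_cost(hours_used, rate_per_hour):
    """Utility model: pay only for the compute hours actually consumed."""
    return hours_used * rate_per_hour

def peak_provisioned_cost(peak_servers, monthly_cost_per_server):
    """Ownership model: pay for enough servers to cover the peak load,
    whether or not they sit idle."""
    return peak_servers * monthly_cost_per_server

# A workload that needs 10 servers at peak but averages only 2 servers'
# worth of work across a 720-hour month:
on_demand = utility_cost(hours_used=2 * 720, rate_per_hour=0.25)
owned = peak_provisioned_cost(peak_servers=10, monthly_cost_per_server=72.0)
print(on_demand, owned)  # 360.0 720.0 -- the idle half is the saving
```

The gap between the two numbers is exactly the idle capacity the multi-tenant model lets providers share.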

A side effect of this approach is that "computer capacity rises dramatically" as customers do not have to engineer for peak loads.[12] Adoption has been enabled by "increased high-speed bandwidth" which makes it possible to receive the same response times from centralized infrastructure at other sites.

Cloud computing is being driven by providers including Google, Amazon.com, and Yahoo! as well as traditional vendors including IBM, Intel,[13] Microsoft[14] and SAP.[15] It can be adopted by all kinds of users, be they individuals or large enterprises. Most Internet users are currently using cloud services, even if they do not realize it: webmail, for example, is a cloud service, as are Facebook, Wikipedia, contact list synchronization, and online data backups.

The Cloud[16] is a metaphor for the Internet,[17] or more generally components and services which are managed by others.[1]

The underlying concept dates back to 1960 when John McCarthy expressed his opinion that "computation may someday be organized as a public utility" and the term Cloud was already in commercial use in the early 1990s to refer to large ATM networks.[18] By the turn of the 21st century, cloud computing solutions had started to appear on the market,[19] though most of the focus at this time was on Software as a service.

Amazon.com played a key role in the development of cloud computing when, upgrading its data centers after the dot-com bubble, it provided access to its systems by way of Amazon Web Services in 2002 on a utility computing basis. Amazon found the new cloud architecture resulted in significant internal efficiency improvements.[20]

2007 saw increased activity, with Google, IBM, and a number of universities starting large-scale cloud computing research projects,[21] around the time the term started gaining popularity in the mainstream press. It was a hot topic by mid-2008, and numerous cloud computing events had been scheduled.[22]

In August 2008 Gartner observed that "organizations are switching from company-owned hardware and software assets to per-use service-based models" and that the "projected shift to cloud computing will result in dramatic growth in IT products in some areas and in significant reductions in other areas".[23]

Clouds cross many country borders and "may be the ultimate form of globalisation".[24] As such, cloud computing is the subject of complex geopolitical issues: providers must satisfy many legal restrictions in order to deliver service to a global market. The debate dates back to the early days of the Internet, when libertarian thinkers felt that "cyberspace was a distinct place calling for laws and legal institutions of its own"; author Neal Stephenson envisaged something similar in the form of a tiny island data haven in his science-fiction classic Cryptonomicon.[24]

Although there have been efforts to harmonize the legal environment (such as the US-EU Safe Harbor framework), providers like Amazon Web Services usually deal with international markets (typically the United States and the European Union) by deploying local infrastructure and allowing customers to select their countries.[25] However, there are still concerns about security and privacy for individuals at various governmental levels (for example, the USA PATRIOT Act, the use of national security letters, and Title II of the Electronic Communications Privacy Act, the Stored Communications Act).

In March 2007, Dell applied to trademark the term "cloud computing" in the United States. It received a "Notice of Allowance" in July 2008, which was subsequently canceled on August 6, resulting in a formal rejection of the trademark application less than a week later.

In November 2007, the Free Software Foundation released the Affero General Public License (abbreviated Affero GPL or AGPL), a version of GPLv3 designed to close a perceived legal loophole associated with free software designed to be run over a network, particularly software as a service. Under the AGPL, application service providers are required to release any changes they make to AGPL-licensed source code.

Cloud architecture[26] is the systems architecture of the software systems involved in the delivery of cloud computing (e.g. hardware, software) as designed by a cloud architect who typically works for a cloud integrator. It typically involves multiple cloud components communicating with each other over application programming interfaces (usually web services).[27]

This is very similar to the Unix philosophy of having multiple programs doing one thing well and working together over universal interfaces. Complexity is controlled and the resulting systems are more manageable than their monolithic counterparts.
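
The analogy can be sketched in code (an entirely illustrative toy, not from the article): three small components that each do one thing well, composed over a single uniform interface, much as cloud components compose over web APIs.

```python
# Illustrative sketch of small components composed over one uniform
# interface -- here, plain dicts in and dicts out.

def normalize(record):
    """Component 1: lowercase the text field."""
    return {**record, "text": record["text"].lower()}

def tokenize(record):
    """Component 2: split the text into words."""
    return {**record, "tokens": record["text"].split()}

def count(record):
    """Component 3: attach a word count."""
    return {**record, "count": len(record["tokens"])}

def pipeline(record, components):
    """Compose components the way cloud services compose over APIs:
    each one consumes and produces the same shape of message."""
    for component in components:
        record = component(record)
    return record

result = pipeline({"text": "Cloud Computing In Plain English"},
                  [normalize, tokenize, count])
print(result["count"])  # 5
```

Because every component speaks the same interface, each one can be replaced or rehosted independently, which is the manageability gain over a monolith.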

Cloud architecture extends to the client where web browsers and/or software applications are used to access cloud applications.

Cloud storage architecture is loosely coupled where metadata operations are centralized enabling the data nodes to scale into the hundreds, each independently delivering data to applications or users.
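
As an illustration of that split between centralized metadata and independent data nodes, here is a hedged sketch; the node names and the hashing scheme are assumptions for illustration, not details from the article.

```python
import hashlib

# Sketch of a loosely coupled storage design: a small, centralized
# metadata layer decides *where* each object lives, while independent
# data nodes hold the bytes. Node names below are hypothetical.

NODES = [f"datanode-{i:02d}" for i in range(100)]  # "into the hundreds"

def node_for(key, nodes=NODES):
    """Centralized metadata operation: map an object key to a data node.
    A stable hash keeps placement deterministic across lookups."""
    digest = hashlib.sha256(key.encode()).hexdigest()
    return nodes[int(digest, 16) % len(nodes)]

# Each data node then serves its objects independently of the others;
# the metadata layer is only consulted to locate them.
print(node_for("reports/q3.pdf") == node_for("reports/q3.pdf"))  # True
```

Keeping only the placement decision centralized is what lets the data path scale out: the nodes never need to coordinate with each other to serve reads.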

A cloud application leverages The Cloud model of software architecture, often eliminating the need to install and run the application on the customer's own computer, thus reducing software maintenance, ongoing operations, and support.

A cloud client is computer hardware and/or computer software which relies on The Cloud for application delivery, or which is specifically designed for delivery of cloud services, and which is in either case essentially useless without a Cloud.[33]

Cloud infrastructure (e.g. Infrastructure as a Service) is the delivery of computer infrastructure (typically a platform virtualization environment) as a service.[41]

A cloud platform (e.g. Platform as a Service, the delivery of a computing platform and/or solution stack as a service)[42] facilitates deployment of applications without the cost and complexity of buying and managing the underlying hardware and software layers.[43]

A cloud service (e.g. Web Service) is a "software system designed to support interoperable machine-to-machine interaction over a network"[44] which may be accessed by other cloud computing components, software (e.g. Software plus services) or end users directly.[45]

Cloud storage is the delivery of data storage as a service (including database-like services), often billed on a utility computing basis (e.g. per gigabyte per month).[46]

Traditional storage vendors have recently begun to offer their own flavor of cloud storage, sometimes in conjunction with their existing software products (e.g. Symantec's Online Storage for Backup Exec). Others focus on providing a new kind of back-end storage optimally designed for delivering cloud storage (EMC's Atmos), categorically known as Cloud Optimized Storage.

A cloud computing provider or cloud computing service provider owns and operates cloud computing systems to deliver service to third parties. Usually this requires building and managing new data centers. Some organisations get some of the benefits of cloud computing by becoming "internal" cloud providers and servicing themselves, though they do not benefit from the same economies of scale and still have to engineer for peak loads. The barrier to entry is also significantly higher, with capital expenditure required, and billing and management create some overhead. Nevertheless, significant operational efficiency and agility advantages can be achieved even by small organizations, and server consolidation and virtualization rollouts are already in progress.[47] Amazon.com was the first such provider, modernising its data centers which, like most computer networks, were using as little as 10% of their capacity at any one time just to leave room for occasional spikes. This allowed small, fast-moving groups to add new features faster and more easily, and Amazon went on to open the platform to outsiders as Amazon Web Services in 2002 on a utility computing basis.[20]

The companies listed in the Components section are providers.

A user is a consumer of cloud computing.[33] The privacy of users in cloud computing has become of increasing concern.[48][49] The rights of users are also an issue, which is being addressed via a community effort to create a bill of rights (currently in draft).[50][51]

A vendor sells products and services that facilitate the delivery, adoption and use of cloud computing.[52]

A cloud standard is one of a number of existing (typically lightweight) open standards that have facilitated the growth of cloud computing.[57]


Cloud computing - Simple English Wikipedia, the free ...

What is cloud computing? | IBM

Enterprises eager to undergo digital transformations and modernize their applications are quick to see the value of adopting a cloud computing platform, and they are increasingly finding business agility or cost savings by renting software. Each cloud computing service and deployment model provides different levels of control, flexibility, and management, so it's important to understand the differences between them.

Common convention points to public cloud as the delivery model of choice, but when considering the right architecture of cloud computing for your applications and workloads, you must begin by addressing the unique needs of your business.

These needs can include many factors, such as government regulations, security, performance, data residency, service levels, time to market, architecture complexity, skills, and preventing vendor lock-in. Add in the need to incorporate emerging technologies, and you can see why IT leaders are challenging the notion that cloud computing migration is easy.

At first glance, the types of cloud computing seem simple: public, private or a hybrid mix of both. In reality, the choices are many. Public cloud can include shared, dedicated and bare metal delivery models. Fully and partially managed clouds are also options. And, in some cases, especially for existing applications where architectures are too complex to move or the cost-benefit ratio is not optimal, cloud may not be the right choice.

The right model depends on your workload. You should understand the pluses and minuses of each cloud deployment model and take a methodical approach to determining which workloads to move to which type of cloud for the maximum benefit.



Benefits of cloud computing | IBM Cloud

If you are considering adopting cloud technologies and practices, you will receive a ton of different guidance about the benefits you might see.

Infrastructure and workloads

Many companies position the low initial costs and pay-as-you-go attributes as a very significant cost savings. They'll note the considerable cost of building and operating data centers and argue for avoiding that to save money. The numbers can get astronomical depending on how you calculate them.

SaaS and cloud dev platforms

A software-as-a-service provider may discuss the savings from paying for application access versus purchasing off-the-shelf software. Software providers will add those "cloud attribute" benefits to the specifics of their software. Recently, there has been more discussion regarding the savings that cloud-based platforms can offer developers.

Speed and productivity

How much is it worth to your business if you can get a new application up and running in 30 hours rather than six to nine months? Likewise, the generic "staff productivity" doesn't do justice to the capabilities that cloud dashboards, real-time statistics and active analytics can bring to reducing administration burden. How much does a person hour cost your company?

Risk exposure

I like to think of this simply. What is the impact if you are wrong?

When the negative impact to trying new things is low, meaning that the risk is low, you will try many more things. The more you attempt, the more successes you will have.

If you asked me how to benefit from adopting cloud services, my first question would be, "Which services?" Every user and every organization is going to get a different set of benefits. The most important thing I can suggest is to think across the spectrum. Evaluate the potential savings, but also think about the soft benefits: improved productivity, more speed and lowered risk.

As hockey great Wayne Gretzky observed, you will miss 100 percent of the shots you don't take. How much of a benefit is it to take your shot?


Cloud Computing Trends: 2018 State of the Cloud Survey

In January 2018, RightScale conducted its seventh annual State of the Cloud Survey of the latest cloud computing trends, with a focus on infrastructure-as-a-service and platform-as-a-service.

Both public and private cloud adoption grew in 2018, with larger enterprises increasing their focus on public cloud. AWS is no longer the runaway leader as Azure has grown rapidly and is now a close second, especially among enterprise users. New to the survey this year is data on the large and growing spend on public cloud, which has driven cost optimization to the top of companies' 2018 priority list. To gain control of growing spend, enterprise cloud teams are taking a stronger cloud governance role, including managing costs.

The State of the Cloud Survey is the largest survey on the use of cloud infrastructure that is focused on cloud buyers and users, as opposed to cloud vendors. Their answers provide a comprehensive perspective on the state of the cloud today.

The survey asked 997 IT professionals about their adoption of cloud infrastructure and related technologies. Fifty-three percent of the respondents represented enterprises with more than 1,000 employees. The margin of error is 3.08 percent.

We highlight several key findings from the survey in this blog post. For the complete survey results, download the RightScale 2018 State of the Cloud Report.

Multi-Cloud Is the Preferred Strategy Among Enterprises

96 Percent of Respondents Use Cloud

More Enterprises Are Prioritizing Public Cloud in 2018

Organizations Leverage Almost 5 Clouds

Serverless Is the Top-Growing Extended Cloud Service

Enterprise Public Cloud Spend Is Significant and Growing Quickly

Enterprise Central IT Teams Shift Role to Governance and Brokering Cloud

Significant Wasted Cloud Spend Makes Optimizing Costs the Top Initiative

Container Use Is Up: Docker Is Used Most Broadly While Kubernetes Grows Quickly

Use of Configuration Tools Grows, with Ansible Showing Strongest Growth

Azure Continues to Grow Quickly and Reduce the AWS Lead, Especially Among Enterprises

Private Cloud Adoption Grows Across the Board

AWS Leads in Users with 50+ VMs While Azure Grows Its Footprint Faster

How AWS, Azure, Google Cloud, and IBM Cloud Stack Up Among Enterprises

In the 12 months since the last State of the Cloud Survey, a multi-cloud strategy remains the preference among enterprises, even as the percentage of enterprises using multiple clouds dropped slightly to 81 percent from 85 percent in 2017. Those planning a hybrid cloud strategy fell to 51 percent (from 58 percent in 2017). However, there was a slight increase in the number of enterprises using multiple public clouds or multiple private clouds.

Both public and private cloud adoption have increased in the last year. The number of respondents now adopting public cloud is 92 percent, up from 89 percent in 2017, while the number of respondents now adopting private cloud is 75 percent, up from 72 percent in 2017. As a result, the overall portion of respondents using at least one public or private cloud is now 96 percent.

Among enterprises, the central IT team is typically tasked with assembling a hybrid portfolio of supported clouds. This year, many more enterprises see public cloud as their top priority, up from 29 percent in 2017 to 38 percent in 2018. Hybrid cloud still leads the to-do list, but has decreased as a top priority for enterprises, declining from 50 percent in 2017 to 45 percent in 2018.

Only 8 percent of enterprises are focusing on building a private cloud, and 9 percent see their top priority as using a hosted private cloud.

On average, survey respondents are using 4.8 clouds across both public and private. Respondents are already running applications in 3.1 clouds and experimenting with 1.7 more.

A significant number of public cloud users are now leveraging services beyond just the basic compute, storage, and network services. Year over year, serverless was the top-growing extended cloud service with a 75 percent increase over 2017 (12 to 21 percent adoption). Container-as-a-service was the second highest growth rate at 36 percent (14 to 19 percent adoption). DBaaS SQL and DBaaS NoSQL were third and fourth (26 and 22 percent growth rates, respectively), but achieved this growth starting from a much larger base of use, with 35 and 23 percent adoption, respectively, in 2017.
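
The growth-rate figures above are simple relative changes in adoption share; a short helper reproduces the arithmetic the survey uses.

```python
# Reproduces the survey's growth-rate arithmetic: relative change in the
# adoption percentage, rounded to the nearest whole percent.

def growth_rate(old_pct, new_pct):
    """Percentage growth from an old adoption share to a new one."""
    return round((new_pct - old_pct) / old_pct * 100)

print(growth_rate(12, 21))  # 75 -- serverless, 12% -> 21% adoption
print(growth_rate(14, 19))  # 36 -- container-as-a-service
```

Note that a large growth rate on a small base (serverless) can represent fewer new users than a modest rate on a large base, which is why the paragraph above calls out the DBaaS figures separately.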

As use of public cloud has grown, so has the amount of spend. Public cloud spend is quickly becoming a significant new line item in IT budgets, especially among larger companies. Among all respondents, 13 percent spend at least $6 million annually on public cloud while 30 percent are spending at least $1.2 million per year. Among enterprises the spend is even higher, with 26 percent exceeding $6 million per year and more than half (52 percent) above $1.2 million per year.

Enterprises are not only using a lot of public cloud, but also planning to rapidly grow public cloud spend. Twenty percent of enterprises will more than double their public cloud spend in 2018, while 71 percent will grow spend at least 20 percent.

SMBs generally have fewer workloads overall and, as a result, smaller cloud bills (half spend under $120 thousand per year). However, 13 percent of SMBs still exceed $1.2 million in annual spend.

In contrast, private cloud use will grow more slowly for all sizes of organization. Only 7 percent of each group (enterprises and SMBs) is planning to double its use in 2018. Fewer than half of enterprises (47 percent) and 35 percent of SMBs plan to grow private cloud use by more than 20 percent.

As companies adopt cloud-first strategies, they are increasingly creating a centralized cloud team or a Center of Excellence for cloud. These teams provide centralized controls, tools, and best practices to help accelerate the use of cloud while reducing costs and risk.

Overall, 44 percent of companies already have a central cloud team. Enterprises have an even stronger need for centralized governance within their larger organizations: 57 percent of enterprises already have a central cloud team with another 24 percent planning one.

In 2018 we see enterprise central IT taking a stronger cloud governance role in advising on which applications move to cloud (69 percent in 2018 vs. 63 percent in 2017), managing costs (64 percent in 2018 vs. 55 percent in 2017), setting policies (60 percent in 2018 vs. 58 percent in 2017), and brokering cloud services (60 percent in 2018 vs. 54 percent in 2017).

Even though managing cloud costs is a top challenge, cloud users underestimate the amount of wasted cloud spend. Respondents estimate 30 percent waste, while RightScale has measured actual waste at 35 percent.
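
To put RightScale's measured 35 percent waste rate in dollar terms at the spend brackets the survey reports, the arithmetic is just multiplication:

```python
# Dollar impact of the 35% measured waste rate at the survey's spend
# brackets. Integer arithmetic keeps the results exact.

MEASURED_WASTE_PCT = 35

def wasted_dollars(annual_spend):
    """Estimated annual waste at the measured waste rate."""
    return annual_spend * MEASURED_WASTE_PCT // 100

print(wasted_dollars(6_000_000))  # 2100000 -- $2.1M of a $6M annual bill
print(wasted_dollars(1_200_000))  # 420000
```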

With significant wasted cloud spend, organizations are focusing on gaining control of costs. Optimizing cloud costs is the top initiative for the second year in a row, increasing from 53 percent of respondents in 2017 to 58 percent in 2018.

Despite an increased focus on cloud cost management, only a minority of companies have begun to implement automated policies to optimize cloud costs, such as shutting down unused workloads or selecting lower-cost cloud or regions. This represents an opportunity for increased efficiency and increased savings, since manual policies are difficult to monitor and enforce.
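
A minimal sketch of what such an automated policy might look like, assuming hourly CPU metrics are available from the provider's monitoring API; the threshold, window, and workload records below are all hypothetical:

```python
# Hedged sketch of an automated cost policy: flag workloads whose CPU
# never rises above a threshold for a full observation window, so they
# can be stopped or resized. All values here are assumptions.

IDLE_CPU_PCT = 5.0     # assumed idleness threshold
WINDOW_HOURS = 24 * 7  # assumed observation window: one week

def idle_workloads(metrics):
    """metrics: {workload_id: list of hourly average CPU percentages}.
    Returns the ids that stayed idle for the whole window."""
    idle = []
    for workload_id, samples in metrics.items():
        window = samples[-WINDOW_HOURS:]
        if len(window) == WINDOW_HOURS and max(window) < IDLE_CPU_PCT:
            idle.append(workload_id)
    return idle

metrics = {
    "web-frontend": [40.0] * WINDOW_HOURS,  # busy: keep running
    "old-test-env": [1.5] * WINDOW_HOURS,   # idle all week: flag it
}
print(idle_workloads(metrics))  # ['old-test-env']
```

A real implementation would feed the flagged ids to the provider's stop or resize API; the point of automating the policy is exactly that manual versions of this check are hard to monitor and enforce.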

Docker adoption increased to 49 percent from 35 percent last year (a growth rate of 40 percent). Kubernetes, a container orchestration tool that leverages Docker, saw the fastest growth, almost doubling to reach 27 percent adoption.

Many users also choose container-as-a-service offerings from the public cloud providers.

The AWS container service (ECS/EKS) followed close behind Docker with 44 percent adoption (26 percent growth rate). Azure Container Service adoption reached 20 percent due to a strong growth of 82 percent, and Google Container Engine also grew strongly (75 percent) to reach adoption of 14 percent.

As part of adopting DevOps processes, companies often choose to implement configuration management tools that allow them to standardize and automate deployment and configuration of servers and applications. Among all respondents, Ansible and Chef are tied with 36 percent adoption each, followed by Puppet at 34 percent adoption.

Ansible showed the strongest growth since last year, up 71 percent in adoption. Chef grew 29 percent and Puppet grew 21 percent.

In 2018, AWS continues to lead in public cloud adoption, but other public clouds are growing more quickly. Azure in particular is now nipping at the heels of AWS, especially in larger companies.

Overall, 64 percent of respondents currently run applications in AWS, up from 57 percent in 2017 (a 12 percent growth rate).

Among enterprises, Azure did even better. Azure increased adoption significantly from 43 percent to 58 percent (35 percent growth rate) while AWS adoption in this group increased from 59 percent to 68 percent (15 percent growth rate). Among other cloud providers that were included in the survey last year, all saw increased adoption this year with Oracle growing fastest from 5 to 10 percent (100 percent growth rate), IBM Cloud from 10 to 15 percent (50 percent growth rate), and Google from 15 to 19 percent (27 percent growth rate).

Enterprise respondents with future projects (the combination of experimenting and planning to use) show the most interest in Google (41 percent).

In contrast to last year's survey, when we saw private cloud adoption flatten, the 2018 survey shows that adoption of private cloud increased across all providers.

Overall, VMware vSphere continues to lead with 50 percent adoption, up significantly from last year (42 percent). This includes respondents who view their vSphere environment as a private cloud whether or not it meets the accepted definition of cloud computing. OpenStack (24 percent), VMware vCloud Director (24 percent), Microsoft System Center (23 percent), and bare metal (22 percent) were all neck and neck. Azure Stack was in the sixth slot, but showed the highest percentage of respondents that were experimenting or planning to use the technology.

The cloud adoption numbers cited previously indicate the number of respondents that are running any workloads in a particular cloud. However, it is also important to look at the number of workloads or VMs that are running in each cloud. The following charts show the number of VMs being run across the top public and private clouds.

Among all respondents, 15 percent have more than 1,000 VMs in vSphere, as compared to 10 percent in AWS.

However, AWS leads in respondents with more than 50 VMs (47 percent for AWS vs. 37 percent for VMware). In third position, Azure shows stronger growth, increasing its share of respondents with more than 50 VMs from 21 to 29 percent.

While public cloud found its initial success in small forward-thinking organizations, over the past few years the battle has now shifted to larger enterprises. AWS has been moving quickly to address the needs of enterprises, and Microsoft has been working to bring its enterprise relationships to Azure. Google and IBM are also focusing on growing their infrastructure-as-a-service lines of business and continue to increase adoption.

The following public cloud scorecard provides a quick snapshot showing that AWS still maintains a lead among enterprises with the highest percentage adoption and largest VM footprint of the top public cloud providers. However, Azure is showing strength by growing much more quickly on already solid adoption numbers. IBM and Google are growing strongly as well but on a smaller base of users.

The 2018 State of the Cloud Survey shows that multi-cloud remains the preferred strategy. Almost every organization is using cloud at some level, with both public and private cloud adoption growing. On average, companies are using or experimenting with nearly five public and private clouds, with a majority of workloads now running in cloud.

However, public cloud is increasingly becoming the top focus among enterprises and, as a result, public cloud use is growing more quickly with the addition of new customers, an increase in workloads, and an increase in the number of services used.

This expansion in cloud use is driving public cloud spend higher, with large increases expected in 2018. Cost was the number one cloud challenge for intermediate and advanced cloud users. As a result, optimizing spend continues to be the top initiative for 2018, as even more organizations turn their attention to cost optimization. There is still much room for improvement: 35 percent of cloud spend is wasted due to inefficiencies, and few organizations have yet implemented automated policies to help address these issues.

Enterprise central IT teams are taking a stronger role in cloud adoption, creating central cloud teams or a Center of Excellence. The role of these central teams is focused on cost management and governance as well as advising business units on workloads that should move to cloud. However, business units seek stronger autonomy, except in the area of cost optimization where they look to the central IT team for assistance.

The use of DevOps continues to increase, driving further adoption of container and configuration tools. Docker grew strongly again this year, and Kubernetes showed even stronger growth as a container orchestration solution. Many users are also adopting container-as-a-service offerings from AWS, Azure, and Google.

AWS still leads in public cloud adoption but Azure continues to grow more quickly and gains ground, especially with enterprise customers. Among enterprise cloud beginners, Azure is slightly ahead of AWS. Google maintains the third position, and VMware Cloud on AWS did well in its first year of availability. Adoption of Oracle Cloud is still small, but is growing well in the enterprise.

Cloud provider revenue is driven not just by adoption (percentage of companies using the cloud), but also the number of workloads (VMs) deployed, and the use of other extended cloud services.

Respondents continue to run more VMs in AWS than in other public clouds. However, Azure is growing quickly here as well, reducing AWS's lead.

VMware vSphere continues to lead as a private cloud option (both in adoption and number of VMs), followed by VMware vCloud Director. OpenStack is third, but Azure Stack (in sixth place) stands out with the strongest interest level.

Download the RightScale 2018 State of the Cloud Report for the complete survey results.

Use of Charts and Data In This Report

We encourage the re-use of data, charts, and text published in this report under the terms of this Creative Commons Attribution 4.0 International License. You are free to share and make commercial use of this work as long as you attribute the RightScale 2018 State of the Cloud Report as stipulated in the terms of the license.


Cloud computing: Hardware & Software Security: Online …

Examples of cloud computing include Software as a Service, Platform as a Service, and Infrastructure as a Service. Generally, cloud computing services are run outside the walls of the customer organization, on a vendor's infrastructure with vendor maintenance.

Although cloud-like services can be internal (e.g., IU's Intelligent Infrastructure), this document refers exclusively to cloud services provided by third-party vendors over a network connection where at least part of the service resides outside the institution, regardless of whether those services are offered freely to the public or privately to paying or registered users.

Cloud computing represents an externalization of information technology applications and infrastructure beyond an organization's data center walls. In the university context, cloud computing may be thought of as extra-campus or above-campus computing.

Cloud services are often available "on demand," and utilize an infrastructure shared by the vendor's customers. While some offer a flat fee model or consumption-based pricing, other cloud services are offered at no cost.

Within the university, the confidentiality, integrity, availability, use control, and accountability of institutional data and services are expected to be ensured by a suite of physical, technical, and administrative safeguards proportional to the sensitivity and criticality (i.e., risk) of those information assets and services.

These safeguards help protect the reputation of the university and reduce institutional exposure to legal and compliance risks. Much of the challenge in approaching cloud computing involves determining whether a service vendor has adequate safeguards in place commensurate with the value and risk associated with assets and services involved.

Once the high-level challenges are understood, the next step is to consider the risks and determine whether or how to appropriately mitigate those risks in the context of the proposed information and/or service.

The above factors should not be taken to suggest that cloud computing has no potential benefits, but rather that the benefits must be balanced against the risks involved when evaluating the use of cloud computing services.

Cloud computing services are similar to traditional outsourcing and can be approached analogously while accounting for their unique risks/benefits. The following recommendations and strategies are intended to assist units in their approach to evaluating the prudence and feasibility of leveraging cloud services.
