
Category Archives: Cloud Computing

Cloud-Computing Business Lifts Oracle’s Profit — 2nd Update – Fox Business

Posted: June 22, 2017 at 5:46 am

Oracle Corp.'s stock hasn't kept pace with some cloud rivals for years as the software company lagged behind in transitioning its business to the cloud.

That may have begun to change Wednesday after Oracle reported earnings that topped Wall Street's modest forecasts, sending the stock up more than 10% in after hours trading.

The Redwood City, Calif., company said its fiscal fourth-quarter net rose 15% to $3.23 billion, or 76 cents a share, from $2.81 billion, or 66 cents a share, a year earlier. The company said adjusted per-share earnings, which commonly exclude stock-based compensation and other items, were 89 cents.

Revenue rose 2.8% to $10.89 billion.

According to estimates gathered by S&P Global Market Intelligence, analysts expected Oracle to earn 78 cents a share on an adjusted basis, on revenue of $10.45 billion.

Analysts were particularly impressed with Oracle's success in bringing new customers to its web-based, on-demand computing services. Annually recurring revenue, or ARR, from these new customers hit $855 million in the quarter and topped $2 billion for the year, the company said.


"It's the best quarter we have ever had," Oracle co-Chief Executive Mark Hurd said during a conference call with analysts. "We had a goal of $2 billion in ARR; we finished with nearly $2.1 billion. Next year, we will sell more."

At the same time, Oracle is altering the way it reports on its cloud business. The company is mixing its nascent infrastructure-as-a-service business, where it provides computing resources and storage on demand, with its more tenured business of selling access to app-management and data analytics tools, called platform-as-a-service.

In its fiscal fourth quarter, Oracle posted solid results in its cloud-infrastructure business, where it competes against leaders Amazon.com Inc., Microsoft Corp. and Alphabet Inc.'s Google. Revenue from the business rose 23% to $208 million.

The company's platform-as-a-service business, combined with its other cloud business that sells access to applications -- known as software-as-a-service -- saw revenue climb 67% to $1.15 billion in the quarter ended May 31.

On a call with analysts, co-CEO Safra Catz said Oracle combined results from its platform and infrastructure cloud businesses because "synergies and cross-selling between these two businesses is very high."

Combining results from the two businesses will make it harder to measure Oracle's success in the cloud-infrastructure market. Larry Ellison, Oracle's co-founder and executive chairman, made building the company's cloud-infrastructure business a key mission, saying last summer "Amazon's lead is over" after introducing Oracle's latest technology for the market.

Amazon, though, continues to pull away. Its Amazon Web Services unit, whose net sales come largely from its cloud-infrastructure business, grew 43% in the most recent quarter to $3.66 billion.

To keep pace with rivals in the cloud-infrastructure market, Oracle will need to meaningfully expand its capital spending and operating expenses, Stifel Nicolaus & Co. analyst Brad Reback recently wrote in a report.

Amazon, Microsoft and Google spent a combined $31.54 billion in 2016 on capital expenditures and leases, much of that on data centers to deliver cloud-infrastructure services.

Oracle spent $2.02 billion on capital expenditures in its fiscal year, up from $1.19 billion a year earlier. That, in part, led to operating margins of 34%, compared with 43% in the previous fiscal year. The company has said it doesn't believe it needs to spend as much as rivals to catch up, arguing its technology is superior.

Growth in Oracle's entire cloud business is outpacing the decline in its legacy business of selling licenses for software that customers run on their own servers.

The cloud business grew $502 million year-over-year while Oracle's new software-license revenue fell $140 million. It is the fourth-consecutive quarter in which Oracle's cloud-revenue gains outpaced declines in its legacy software business.

Overall, revenue from new software licenses fell 5% to $2.63 billion.

The biggest piece of Oracle's software business remains its massive software-license updates and product-support operations. That segment generated $4.9 billion in revenue, a 2% gain from a year earlier.

Write to Jay Greene at Jay.Greene@wsj.com

(END) Dow Jones Newswires

June 21, 2017 19:11 ET (23:11 GMT)

See the article here:

Cloud-Computing Business Lifts Oracle's Profit -- 2nd Update - Fox Business

Posted in Cloud Computing | Comments Off on Cloud-Computing Business Lifts Oracle’s Profit — 2nd Update – Fox Business

Cloud Computing and Digital Divide 2.0 – CircleID

Posted: at 5:46 am

Internet connectivity is the great enabler of the 21st century global economy. Studies worldwide unequivocally link increases in Internet penetration rates and expansion of Internet infrastructure to improved education, employment rates, and overall GDP development. Over the next decade, the Internet will reinvent itself yet again in ways we can only imagine today, and cloud computing will be the primary operating platform of this revolution.

But not for everyone. Worldwide, the estimated Internet penetration rate ranges between 44% and 50%, much of it via mobile devices, which are less productive than desktop workstations. Overall, Internet penetration rates in developed countries stand at over twice those of underdeveloped economies. For many, high-quality Internet services are simply cost-prohibitive. Low-quality infrastructure and devices, unreliable connectivity, and low data rates relegate millions to a global online underclass that lacks the resources and skills necessary to participate more fully in the global economy. First recognized as early as the 1990s, these persistent quantitative inequities in overall availability, usability, and the like demarcate a world of Internet "haves" and "have-nots" known commonly as the "Digital Divide".

In the decade to come, cloud computing and computational capacity and storage as a service will transform the global economy in ways more substantial than the initial Internet revolution. Public data will become its own public resource that will drive smart cities, improve business processes, and enable innovation across multiple sectors. As the instrumented, data-driven world gathers momentum, well-postured economies will begin to make qualitative leaps ahead of others, creating an even greater chasm between the haves and have-nots that we will call Digital Divide 2.0.

At one end of the chasm are modern information-driven economies that will exploit the foundational technologies of the initial Internet revolution to propel their economies forward as never before. In particular, cloud technologies will unleash new capabilities to innovate, collaborate and manage complex data sets that will facilitate start-ups, create new jobs, and improve public governance.

Meanwhile, many in the developing world will continue to struggle with the quantitative inequities of the first Digital Divide. Developing economies will very likely continue to make some progress; however, their inability to rapidly bridge the Internet capacity gap will inhibit them from fully participating in the emerging, instrumented economies of the developed world. Failing to keep pace, these economies will continue to face the perennial problems of lack of investment, lack of transparency within public institutions, and a persistent departure of talent to more developed economies.

In the early 1990s, there was much sloganeering and some real public policy in the United States regarding the development of "information superhighways" that would connect schools and libraries nationwide. Information sharing across educational institutions provided the critical mass for launching today's emerging information economy. However, implementation was uneven, and since that time there remain winners and losers, both nationally and globally.

As cloud computing emerges as the principal operating platform for the next-generation information economy, we are again challenged by many of the same questions from two decades ago: who will benefit most from the upcoming revolution? Will progress be limited solely to wealthy urban and suburban centers, already hard-wired with the necessary high-capacity infrastructure, and flush with raw, university-educated talent? Will poorer and rural economies be left to fall that much further behind?

Not necessarily. Industry experts and economists worldwide broadly recognize the tremendous latent economic value of cloud. Clever public-private partnerships in cloud adoption are reinvigorating and transforming municipalities. Shaping public policy begins with recognizing the transformative power of this technology and the role it can play in enabling a wide range of economic sectors.

Now is the time for public sector authorities, private enterprise, and global thought leaders to develop creative approaches to ensure some level of equity in global information technology access. Engagement now may help avoid repeating and exacerbating the original Digital Divide and posture cloud computing as a global enabler, rather than a global divider.

By Michael McMahon, Director, Cyber Strategy and Analysis at Innovative Analytics & Training

Related topics: Access Providers, Broadband, Cloud Computing, Data Center, Policy & Regulation


Cloud first – Philippine Star

Posted: June 21, 2017 at 4:49 am

Last January, the Department of Information and Communications Technology (DICT) issued a circular addressed to both the national and local government prescribing the Philippine government's Cloud First Policy.

The policy is aimed at reducing the cost of government information and communications technology (ICT), increasing employee productivity, and developing better citizen online services through the use of cloud computing technology.

Various governments, such as the United States, Australia and the United Kingdom, have adopted similar cloud-first policies.

So what is cloud computing?

The DICT defines cloud computing as "a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction."

"Its characteristics include on-demand self-service, broad network access, resource pooling, rapid elasticity and measured service," the department explained.


What are the benefits that can be derived from using cloud computing technology? DICT says these are inter-agency collaboration, operational continuity and business recovery, faster deployment of services, greater budget control and decreased spending on legacy infrastructure.

The initial DICT GovCloud infrastructure was set up in 2013 by the DOST ICT Office as part of the Integrated Government Philippines (iGovPhil) Project, which aims to provide cloud infrastructure access to government agencies.

And then, to pursue its cloud-first policy, the government relaunched the Government Cloud, or GovCloud, initiative last March.

The DICT awarded the P373-million build-operate-transfer contract for a complete cloud solution to the Vibal Group, a cloud and education technology company formerly known as a book publisher.

Vibal said GovCloud would use a hybrid cloud strategy employing both private and public cloud, adding that the creation of a private in-country data center would ensure data security, while the off-premise public cloud would make online information and services readily available to government agencies.

The company then partnered with a number of technology firms, including Microsoft, for the cloud undertaking. Vibal and Microsoft have been official partners since 2012, when Vibal made its interactive e-books compatible with Windows.

Microsoft managing director Karrie Ilagan said their strength and commitment to security, privacy and transparency would empower the government to achieve the best for its citizens.

While cloud computing improves efficiency and productivity and enables better citizen services, security remains paramount, especially with the advent of state-sponsored cyberattacks and cyber-espionage.

DICT launched the National Cybersecurity Plan 2022 just last month in a bid to protect every single user of the internet in the country. This, of course, is timely, especially since the Philippines is among the top 10 countries for malware threats.

With the increasing incidence of cyber espionage and cyberattacks initiated by nation-states, there is now a call for a Digital Geneva Convention, whereby governments should commit to avoiding attacking citizens, critical infrastructure and the private sector; reporting vulnerabilities rather than stockpiling, selling or exploiting them; pledging to aid in the containment and recovery from cyberattacks; and creating a trusted national and global IT infrastructure.

Microsoft offers what it calls a secure, trusted cloud which it emphasized is the most important value that it provides compared to other vendors.

Describing its trusted cloud, Microsoft says it helps protect data and has the most comprehensive compliance coverage worldwide, including solutions for compliance with the Data Privacy Act of the Philippines; protects major IT systems reliably with Microsoft Disaster Recovery; and offers the most IT flexibility with a truly consistent hybrid cloud.

To show its commitment to a secure, trusted cloud, Microsoft has signed the ICT for Shared Prosperity Technology Manifesto with the DICT. It identifies national challenges and issues that need to be addressed, and key technology pillars that can help in championing and driving economic progress in the country.

Microsoft earlier announced that it would continue to invest $1 billion yearly in cybersecurity research and development in the coming years. The amount excludes any acquisitions the company may make in the sector.

The cloud has allowed companies like Microsoft to create much more sophisticated tools to guard against increasingly cunning attackers. Instead of having to manage their own security, companies also now tap cloud service providers like Microsoft to keep their data secure.

Microsoft has what it calls Enterprise Mobility + Security, which allows its clients to get identity-driven protection against today's attacks.

Its Azure product is said to have the most comprehensive compliance coverage and to be the most trusted cloud for US government institutions.

With the Data Privacy Act of 2012 in force, Microsoft says it has already designed Azure with industry-leading security measures and privacy policies to safeguard data in the cloud, including the categories of personal data identified by the act.

There is also Microsoft Dynamics 365, a set of intelligent cloud applications that connect data across sources.

Microsoft explains that its cloud product combines the company's current customer relationship management and enterprise resource planning cloud services into a single service, and includes new, purpose-built applications to help manage specific business functions.

At the end of the day, it is the citizenry who will decide whether or not the government's new policies and programs on ICT have improved the delivery of public services.

For comments, e-mail at philstarhiddenagenda@yahoo.com


Chinese tech giant Alibaba joins key open-source cloud computing foundation – GeekWire

Posted: June 19, 2017 at 7:45 pm


Kicking off a week in which it plans to encourage American businesses to invest in China, Alibaba Group announced plans to give something back to the cloud computing community: Alibaba Cloud is now a member of the Cloud Native Computing Foundation.

The Chinese internet giant plans to join the CNCF as a Gold member, putting it on the same level as rival Tencent. The CNCF, which is working to improve adoption of modern cloud-native software development technologies without setting standards, said in a statement that it was looking forward to more open-source contributions from the international cloud community.

Alibaba may not be a household name in the U.S., unless your household sells servers or enterprise computing technology. Nearly half a billion people, mostly in China, use one of Alibaba's many services, from e-commerce to streaming video, and Intel has dubbed the company one of its "super seven" data center customers. The company is holding an event in Detroit this week with founder and executive chairman Jack Ma to pitch China as a source of new revenue for American businesses.

Alibaba Cloud is the leading cloud computing service in China, although it does face competition there from Amazon Web Services and Microsoft. On a global basis, it trails AWS, Microsoft, and Google by some margin, but Gartner's latest Magic Quadrant report ranked it above more established U.S. cloud services like IBM and Oracle, based on its belief in Alibaba's ability to execute its cloud strategy.

It's definitely a significant addition for the CNCF, which now has a second source of cloud computing expertise in China through which to promote its member projects, most notably the Kubernetes container-orchestration project.


Microsoft Could Surpass Amazon in Cloud Computing This Year (AMZN, MSFT) – Investopedia

Posted: at 7:45 pm

Amazon (AMZN) may have long been the leader in cloud computing with its Amazon Web Services, but that may change later this year as Microsoft Corp. (MSFT) is finally able to surpass it.

That's according to Pacific Crest Securities, which late Friday said Azure, the cloud computing unit of Microsoft, could have more revenue than its main rival for the first time in 2017. In a research note covered by The Street, analyst Brent Bracelin predicted that Microsoft's rise would, for the first time in 10 years, transition the company from a cloud laggard to a cloud leader. Bracelin said he came to this conclusion after analyzing the 60 biggest cloud computing companies, which revealed a market poised for primetime with lots of growth opportunities. The analyst predicts spending on cloud initiatives could explode to $239 billion within five years, with the Redmond, Washington software giant benefiting the most from the growth. Bracelin pointed to what he called "unmatched product depth and breadth" in software as a service, platform as a service and infrastructure as a service as the main reasons. (See also: Microsoft's Azure Cloud Revenue Estimated at $3B.)

While Microsoft still makes the lion's share of its money through software sales, its cloud computing business continues to grow. For the three months ended in March, the company said revenue in its Intelligent Cloud business came in at $6.8 billion, up 11% from a year ago and up 12% on a constant currency basis. During the third quarter, the company said server products and cloud services revenue increased 15%, driven by Azure cloud revenue growth of 93%. "Our results this quarter reflect the trust customers are placing in the Microsoft Cloud," Microsoft chief executive officer Satya Nadella said at the time. "From large multi-nationals to small and medium businesses to non-profits all over the world, organizations are using Microsoft's cloud platforms to power their digital transformation." (See also: Credit Suisse Bullish on Microsoft Cloud Business)

Pacific Crest isn't writing off Amazon completely in the cloud market, despite predicting the rise of its main rival and even though AWS revenue growth has declined for seven quarters in a row. Bracelin said the business appears to be picking up in the second quarter, with sequential revenue growth of 9% forecast. "We remain bullish on the five year prospects for AWS and are encouraged by 2Q cloud activity picking up," Bracelin wrote. "However, investor optimism is partially reflected in the valuation."


7 Tips for Securely Moving Data to the Cloud – Government Technology (blog)

Posted: at 7:45 pm

A few years back, an unmistakable trend emerged: cloud computing was growing, both in the percentage of organizations adopting cloud solutions and in the amount and type of data being placed in the cloud.

Earlier this year, I highlighted research that made it clear that trust and risks are both growing in government clouds. Since that time, many readers have asked for more specific guidance about moving more data to the cloud in the public and private sectors. I was asked: What are the right cloud questions?

Questions like: Where are we heading with our sensitive data? Will cloud computing continue to dominate the global landscape? These are key questions that surface on a regular basis.

The forecast for the computer industry is mostly cloudy. Here are some of the recent numbers:

Back at the end of last year, The Motley Fool reported "10 Cloud Computing Stats That Will Blow You Away," and the last three listed are especially intriguing to me.

IoT, Other Trends and the Cloud

And while it is true that the Internet of Things (IoT) has taken over the mantle as the hottest trend in technology, the reality is that "the Internet of Things and digital transformation have driven the adoption of cloud computing technology in business organizations," according to U.S.-based cloud infrastructure firm Nutanix.

This article from CxO Today lays out the case that the cloud remains the most disruptive force in the tech world today. Why?

While premises-based IT software and tools have their own advantages, the global trend is toward cloud-based applications, since they offer more connectivity and functionality than legacy systems. Moreover, enterprises are naturally gravitating toward the cloud because the technology is reasonably reliable and affordable, and provides them access to new and emergent technologies as well as high-end skills. The cloud boom is also propelled by the fact that enterprises are trying to improve performance and productivity over the long term. Looking at the tremendous response to cloud services, several IT companies are designing applications meant solely for pure cloud play.

Other experts say that several overlapping trends are colliding as "the edge is eating the cloud."

Overcoming Fears in the Cloud

And yet, there are plenty of enterprises that continue to have significant concerns regarding cloud computing contracts. Kleiner Perkins' Mary Meeker highlighted the fact that cloud buyers are kicking the tires of multiple vendors while becoming more concerned about vendor lock-in.

Also, technology leaders often move to the cloud to save money, but CFOs are now telling IT shops to cut costs in the cloud, fearing that resources are being wasted.

Also, while overall trust in cloud infrastructure is higher, new concerns are rising about application security delivered through the cloud.

My 7 Tips for Moving Data into the Cloud

So what can technology and security leaders do to protect their data that is moving to the cloud?

Here are seven recommendations that can help you through the journey. Note that the first four items are largely best practices about your current data situation and options before your data moves.

1) Know your data. I mean, really know what is happening now before you move the data. Think of the analogy of doing a house cleaning and organizing what you own before putting things in storage to sell your house.

If you don't want to catalog everything (which is a mistake), at least know where the most important data is. Who is doing what regarding the cloud already? What data is sensitive? This is your "as is" data inventory situation, with known protections of current data. And don't forget shadow IT. There are plenty of vendor organizations that can help you through this process.
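For illustration, this inventory step can be sketched as a small script that ranks datasets so the least sensitive move to the cloud first. Every dataset name, owner, location and sensitivity tier below is invented for the example:

```python
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    owner: str         # who is accountable for this data today
    location: str      # current system of record
    sensitivity: str   # "public", "internal", or "sensitive"

def migration_order(inventory):
    """Sort datasets so the least sensitive candidates migrate first."""
    rank = {"public": 0, "internal": 1, "sensitive": 2}
    return sorted(inventory, key=lambda d: rank[d.sensitivity])

inventory = [
    Dataset("payroll", "HR", "on-prem-db1", "sensitive"),
    Dataset("press-releases", "Comms", "web-cms", "public"),
    Dataset("org-chart", "HR", "intranet", "internal"),
]

ordered = migration_order(inventory)  # press-releases, org-chart, payroll
```

Even a toy catalog like this forces the "who owns it, where is it, how sensitive is it" questions to be answered explicitly before anything moves.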

2) Have a defined and enforced data life cycle policy. You need to know what data is being collected by your business processes, where it goes, who is accountable (now) and what policies are in force.

Ask: Is there appropriate training happening now? Is it working? What policies are in place to govern the movement of your data? For example, my good friend and Delaware CSO Elayne Starkey does a great job in this policy area. You can visit this Web portal for examples: https://dti.delaware.gov/information/standards-policies.shtml
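One way to make such a policy enforceable rather than aspirational is to encode it as data that systems can check mechanically. This is only a sketch; the data classes and retention periods are hypothetical, not recommendations:

```python
# Retention periods in days, per data class (illustrative values only).
RETENTION_POLICY = {
    "audit-logs": 365 * 7,   # keep seven years
    "session-data": 30,
    "backups": 90,
}

def is_expired(data_class, age_days, policy=RETENTION_POLICY):
    """True when a record of this class has exceeded its retention period
    and should be disposed of under the life cycle policy."""
    return age_days > policy[data_class]
```

A scheduled job could then call `is_expired` across stores, so the policy document and the actual disposal behavior cannot silently drift apart.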

3) Know your cloud options: Private, public, hybrid or community cloud? This simple step often gets confusing, in my experience, because some staff mix these terms up with the public-sector and private-sector definitions, wrongly thinking that a private cloud means a private-sector-owned cloud.

Here are some basic cloud definitions to ponder with your architecture team:

Private Cloud: The organization chooses to have its own cloud, where the resource pooling is done by the organization itself (a single-organization cloud). It may or may not be on premises (in your own data centers).

Public Cloud: Multiple tenants pool resources on the same shared infrastructure.

Pros: Easily consumable, and consumers can provision resources themselves.

Cons: Consumers do not get the same level of isolation as in a private cloud.

Community Cloud: The cloud is shared by different organizations, usually unified by a common community, sharing the underlying infrastructure (halfway between private and public); smaller organizations pool resources with one another. For example, some state and local government organizations share email hosting with other U.S. state and local governments only.

Hybrid: A mixture of private and public; e.g., an organization might want the elasticity and cost-effectiveness of the public cloud while keeping certain applications in a private cloud.
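The four models above can be boiled down to a rough rule-of-thumb helper. This is purely illustrative; a real decision also weighs cost, compliance, and available skills:

```python
def suggest_deployment(shared_with_peers, needs_isolation, needs_elastic_burst):
    """Crude mapping of requirements to a cloud deployment model."""
    if needs_isolation and needs_elastic_burst:
        return "hybrid"      # keep sensitive apps private, burst to public
    if needs_isolation:
        return "private"
    if shared_with_peers:
        return "community"   # e.g., agencies pooling email hosting
    return "public"
```

For instance, an agency needing both strong isolation and elastic capacity lands on "hybrid", matching the trade-off described above.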

4) Understand and clearly articulate your Identity and Access Management (IAM) roles, responsibilities and demarcation points for your data. Who owns the data? Who are the custodians? Who has access? Who can add, delete or modify the data? Really (not just on paper)? How will this change with your cloud provider?

Build a system administration list, and insist on rigorous compliance certifications. Incorporate appropriate IAM from the outset, ideally based on roles, especially for administration duties. When you move to the cloud, the customers, not the provider, are responsible for defining who can do what within their cloud environments. Your compliance requirements will likely dictate what your future architecture in the cloud will look like. Note that these staff may need background checks, a process to update lists (for new employees and staff who leave) and segregation of duties as defined by your auditors.
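A minimal sketch of the role-based idea, with made-up roles and a deny-by-default check (real IAM systems add groups, policies, and audit trails on top of this core):

```python
# Hypothetical role-to-permission mapping for a data store.
ROLE_PERMISSIONS = {
    "owner":     {"read", "write", "delete", "grant"},
    "custodian": {"read", "write"},
    "auditor":   {"read"},
}

def can(role, action):
    """Deny by default: unknown roles or actions get no access."""
    return action in ROLE_PERMISSIONS.get(role, set())
```

The point of the deny-by-default lookup is that a new hire or an unlisted role has no access until someone deliberately grants it, which is exactly the list-maintenance discipline described above.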

5) Apply encryption thinking end to end: data at rest and data in transit. We could do an entirely separate blog on encryption, since a recent (and scary) report says there is no encryption on 82 percent of public cloud databases. Here are a few points to consider: Who controls and has access to the encryption keys? What data is truly being encrypted, and when? Only sensitive data? All data?
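The key-control question can be illustrated with a toy sketch of envelope encryption: a fresh data key encrypts each object, and a customer-held master key wraps the data key. The XOR "cipher" below is a deliberately trivial placeholder so the example stays self-contained; real systems use AES-GCM or similar from a vetted cryptography library:

```python
import os

def toy_cipher(data: bytes, key: bytes) -> bytes:
    """XOR stand-in for a real cipher; applying it twice decrypts.
    NOT real cryptography -- illustration only."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def envelope_encrypt(plaintext: bytes, master_key: bytes):
    data_key = os.urandom(16)                        # fresh key per object
    ciphertext = toy_cipher(plaintext, data_key)
    wrapped_key = toy_cipher(data_key, master_key)   # only master key unwraps
    return ciphertext, wrapped_key

def envelope_decrypt(ciphertext: bytes, wrapped_key: bytes, master_key: bytes):
    data_key = toy_cipher(wrapped_key, master_key)
    return toy_cipher(ciphertext, data_key)
```

The structure makes the ownership question concrete: whoever holds the master key controls access to everything, which is why "who controls the keys" belongs in the cloud contract, not just the architecture diagram.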

6) Test your controls. Once you move the data, your cloud solution vulnerability testing should be rigorous and ongoing and include penetration testing. Ask: How do you truly know your data is safe? What tools do you have to see your data in the cloud environment? How transparent is this ongoing process?

The cloud service provider should employ industry-leading vulnerability and incident response tools. Such tools enable fully automated security assessments that can test for system weaknesses and dramatically shorten the time between critical security audits from yearly or quarterly to monthly, weekly, or even daily.

You can decide how often a vulnerability assessment is required, varying from device to device and from network to network. Scans can be scheduled or performed on demand.
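Because the cadence can vary from device to device, one simple way to manage it is a per-class schedule check like the sketch below; the device classes and intervals are illustrative, not recommendations:

```python
from datetime import date, timedelta

# Days between scans, per device class (made-up values for illustration).
SCAN_INTERVALS = {
    "internet-facing": 1,
    "internal-server": 7,
    "workstation": 30,
}

def scan_due(device_class, last_scan, today):
    """True when a device is due for its next vulnerability scan."""
    interval = timedelta(days=SCAN_INTERVALS[device_class])
    return today - last_scan >= interval
```

A nightly job can walk the asset list, call `scan_due` for each device, and queue on-demand scans only where the schedule requires them.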

7) Back up all data in a distinct fault domain.

Gartner recommends: "To spread risk most effectively, back up all data in a fault domain distinct from where it resides in production." Some cloud providers offer backup capabilities as an extra-cost option, but that isn't a substitute for proper backups. Customers, not cloud providers, are responsible for determining appropriate replication strategies, as well as maintaining backups.
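The fault-domain rule can be expressed as a simple placement check; the region names and domain groupings below are made up for the example:

```python
# Hypothetical mapping of regions to fault domains.
FAULT_DOMAINS = {
    "us-east-1": "east",
    "us-east-2": "east",
    "us-west-1": "west",
}

def pick_backup_region(production_region, regions=FAULT_DOMAINS):
    """Return a region whose fault domain differs from production's,
    so one outage cannot take out both the data and its backup."""
    prod_domain = regions[production_region]
    for region, domain in regions.items():
        if domain != prod_domain:
            return region
    raise ValueError("no distinct fault domain available")
```

Note that "us-east-2" would be rejected as a backup target for "us-east-1" here: a different region is not enough if both sit in the same fault domain.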

Final Thoughts

No doubt, managing your data in the cloud is a complex and ongoing challenge that includes many other pieces beyond these seven items. From contract provisions to measuring costs incurred for the services to overall administration functions, the essential data duties listed are generally not for technology or contracts professionals lacking real experience.

Nevertheless, all organizations that move data into and out of cloud providers' data centers are constantly going through this data analysis process. Just because you moved sensitive data into the cloud five years ago for one business area does not mean that new business areas can skip these steps.

If you are in a large enterprise, you may want to consider adding a cloud computing project management office (PMO) to manage vendor engagement and ensure the implementation of best practices across all business areas.

And don't just fall for the typical line: "I know XYZ company (Amazon or Microsoft or Google or fill in the blank) is better at overall security than we are, so just stop asking questions." Yes, these companies are good at what they do, but there are always trade-offs.

You must trust but verify your cloud service, because you own the data. Remember, you can outsource the function, but not the responsibility.


Pressing Tech Issue: Enterprise Software Vs. Cloud Computing? – Credit Union Times

Posted: June 17, 2017 at 2:37 pm

One of Robert Frost's most popular poems contains more than a few parallels with what insurance technology executives are grappling with as they look at systems in the cloud compared with systems housed within their own organizations. Consider this classic verse:

"Two roads diverged in a wood, and I...

I took the one less traveled by,

And that has made all the difference."

Certainly there are many who are opting for the less-traveled SaaS road, and others who prefer the other road, commonly called enterprise.

Within the insurance industry, cloud technologies have been successfully deployed in ancillary areas of the organization such as Human Resources, Accounting, e-mail, and other non-core areas of the business. Typically, those core applications such as policy administration, agent commissions, rating, billing, claims, and agent and customer portals have been firmly entrenched in enterprise or on-premises applications.

However, with the success of cloud-based software in those non-mission-critical areas, SaaS systems are becoming the favored choice for deployment in certain core insurance areas. But for those core tasks that are truly mission critical, have deep integration requirements and, importantly, are processor intensive, IT executives are taking a go-slow approach before they commit to putting those systems or business processes into the cloud.

Why the concern? The short answer is that enterprise software is "owned" by the insurance carrier, and the risk of a data breach of sensitive information is relatively low when the application is housed behind the insurance company's firewall. Insurance companies are huge repositories of customers' personal information. And that information is entrusted to the insurance company with the expectation that it will remain private and confidential.

In short, enterprise software deployments merit a certain kind of security that is hard to duplicate in a cloud-based system.

Another aspect to consider is processing horsepower. Saving and retrieving data such as we see in popular CRM systems like Salesforce.com is not particularly processor intensive. Tasks with intensive calculation requirements, such as commissions and bonus calculation, are another matter. These systems can often have more than a hundred integration points both up- and downstream, and managing them in the cloud is a major concern to many insurers.

According to recent research from Novarica, the key driver for carriers adopting SaaS systems was "the speed of deployment and the ability to sunset current applications."

Among the common drivers for carriers to adopt SaaS systems, according to Novarica, were standardization paired with release management, which reduces support costs and ultimately lowers the cost of ownership. However, that standardization (call it efficiency) is largely a trade-off between having key business processes undifferentiated from competitors on the same SaaS application and having a custom-designed application that preserves competitive differentiation.

Large companies see being able to differentiate from competitors as a key advantage of the on-premises model. Additionally, large companies have very large IT staffs that are capable of implementing and managing new applications.

Cost is clearly another factor in making SaaS a viable choice for many core insurance applications. For mid-tier and smaller insurance organizations, the advantages of SaaS are clear:

No infrastructure costs;

Software is on a subscription model that includes maintenance and upgrades; and

Provisioning is very easy.

With SaaS, a smaller insurance company can readily compete with the 'big guys.' While some simple back-of-the-napkin analysis can show advantages for SaaS, the analysis is really an apples-to-oranges comparison. A more detailed look at cost and a few other items show that cost may not be the main concern.

You may not appreciate the importance of some of the items buried in the fine print of SaaS solution provider contracts. Items such as transaction volume, number of processes allowed per month, data storage fees, data transformation costs and other metered limits can result in significant additional fees levied by the vendor to keep the subscription in compliance.

If you don't understand and carefully quantify each item in the SaaS agreement, fees can easily double or triple, but you might not realize the impact until the solution is implemented and in full production and you receive your first over-usage invoice.
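To see how quickly those fine-print items compound, here is a back-of-the-napkin model of a metered SaaS bill. Every rate, limit, and dollar figure below is invented for illustration; real contracts vary widely.

```python
def monthly_saas_bill(base_fee, txn_count, txn_included, txn_overage_rate,
                      storage_gb, storage_included_gb, storage_rate):
    """Base subscription plus metered overages -- the items often buried in fine print."""
    txn_overage = max(0, txn_count - txn_included) * txn_overage_rate
    storage_overage = max(0.0, storage_gb - storage_included_gb) * storage_rate
    return base_fee + txn_overage + storage_overage

# The quoted price looks like $5,000/month while usage stays inside the included limits...
quoted = monthly_saas_bill(5000, 100_000, 100_000, 0.02, 500, 500, 0.25)

# ...but if transaction volume quadruples and storage quadruples, the bill
# more than doubles, and the first over-usage invoice is the surprise.
actual = monthly_saas_bill(5000, 400_000, 100_000, 0.02, 2000, 500, 0.25)
```

Quantifying each metered line item before signing, as the article advises, is the only way to know which of these two numbers you will actually pay.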

To get a full assessment of hosted versus on-premises, factors such as implementation, customization, upgrade cycles, support and maintenance, security, scalability, and integration(s) must be understood. For example, implementing a SaaS application is relatively easy, since it uses a ready-made platform that has already been provisioned, while on-premises applications take resources, equipment, and time to set up a new environment. In essence, the financial choice is whether the new system will tap the operating expense budget or the capital expense budget.

The key in assessing the advantages and disadvantages of SaaS or on-premises is one common to all technology acquisitions: the vendor. At the outset, the absolute key requirement is that the vendor has extensive experience within insurance technology. Many vendors purport to have deep domain experience in insurance. From what I've observed, however, in many applications sold to insurance companies, vendors are very likely taking a horizontal application and providing some level of uniqueness that makes it salable to insurance companies. This is very common in CRM and commissions applications, where vendors have created hundreds of applications for everything from managing sales to managing human resources to managing inventory. Vendors will claim insurance expertise, but a look under the hood will usually reveal an application that was built for, say, telecommunications or pharmaceuticals and verticalized to make it acceptable to insurance carriers and distributors. It's the old "one-size-fits-all" mentality.

Where the rubber hits the road in vendor selection is in looking at a vendor's expertise in integration and security. As experienced insurance IT managers are aware, insurance infrastructure can be a hodgepodge of technologies and applications that run the gamut from fairly modern to legacy. A vendor that doesn't have a track record of successful implementations with a variety of technologies is one that probably shouldn't be on your short list. As a starting point, look for applications with embedded integration platforms that you (not the SaaS IT/support team) will have full access to. The same can be said regarding the privacy and security of data and personal and private information.

Insurance carriers are very aware of the security implications of SaaS, where security is dependent on the vendor. A corollary to the vendor's experience in integrations is the vendor's experience in implementing fixes to the software or migrating existing clients to new versions. Again, vendors that have dozens of satisfied clients are more likely to have the experience and talent to become a credible business partner. One more tip on vendor selection:

Ask for a report detailing system outages for the last two years that shows the nature of the outage, core issue and time to resolution. If the vendor refuses to deliver this document, think again about adding them to your short list.

Some large vendors in our space have recently dropped their on-premises solutions and 'gone all in' for the cloud. It might be safer to go with a vendor that can provide cloud or on-premises solutions, leaving the final hosting decision in your hands. You can always migrate to the cloud later if you're not comfortable with the change. The choice between the cloud and on-premises is very much like choosing between the two paths that 'diverged in the wood.'

There are certainly advantages to each alternative, but ultimately the key driver is whether the vendor can accommodate both software delivery models, on-premises and SaaS. Vendors that have the capability to work with clients with unique requirements that mandate enterprise software or SaaS are vendors that have the overall experience to help you choose which path to take.

John Sarich is an industry analyst and VP of Strategy at VUE Software. He is a senior solutions architect, strategic consultant and business advisor with over 25 years of insurance industry experience. He can be reached at John.Sarich@VUESoftware.com.

Originally published on PropertyCasualty360. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.

See the rest here:

Pressing Tech Issue: Enterprise Software Vs. Cloud Computing? - Credit Union Times


From mobile phones to cloud computing – Media India Group

Posted: at 2:37 pm

While the newer versions of the good old mobile phone are now becoming storehouses of data, cloud computing, content creation and data cataloging are gaining popularity among young Indians, who are pursuing them with rigour, armed with the latest technology and devices. We trace their geeky rituals, and here is what we learn.

Several electronics makers, engineers and tech aspirants have started focusing on personal computing devices, their data-recording capabilities and tools. This has formed an aggressive market which caters to growing data and content needs. With a growth in human resources in India and increasing acceptance of a digitally driven world, the country's youth has come up with unique means to help create suitable ideas for a progressive future. Content creators now have mediums that cater to the specific needs of data generation. This has become possible due to the easy availability of services such as online data-storage drives and seamless access to such data houses.

With the introduction of smartphones in India, the tech-savvy Indian youth has been provided with a better device to make use of their tech skills and showcase them online. "A combination of skills and an ambition for excellence, along with the innate human curiosity to innovate, are the qualities which are always needed to create a special family-like working environment. Companies which are on the lookout for individuals to take their ideas forward like to care about their creation. Since a particular smartphone will be used by several others, the companies also take care of their users, like you and me. Indian gadget bloggers have shown tremendous effort in combining these two key characters," says Darren Paul Miller, a designer and editor at HTCBro, an online forum which creates relevant data about technology giants and their product designs. "Forums such as these have markedly become an important mile-marker for every computer engineering aspirant. Documenting any and every progress step-by-step provides insight into employment and other work-environment behaviour," he adds.

The turn of the century has invigorated enthusiasm among the Indian youth to talk about a fresh outlook, devices and technology. Innovative inventions and future prototypes have always been on a mind that thinks of technology. Nobody wants to put their toys away so soon. Computers, smartphones and new inventions based on pure imagination are gripping the attention of fast-working minds once more. "It is like being a kid again with the coming of glass-based block smartphones. Imagining a device in your head is easy. What is not easy is its final design and how to get it to a functional level," says Shimon Das, 20, editor-in-chief, Droid-Now. "My love for mobile phones, and Android in particular, started when I got my first smartphone, the HTC Wildfire S." There is no restricting the youth when it comes to taking to devices of their own. The gear of almost every techie marks their willingness to perform better than the world-class computer engineer.

When it comes to surfing across social media platforms like YouTube, Indians have also strived for future content generation and documented the swift adoption of technology in daily life. @Geeky Ranjit is a rapidly growing YouTube channel in which an Indian talks about and reliably advises on upcoming tech specifications and electronic appliances. The channel discusses smartphones and how to use them effectively in daily life, because it is not every day people realise that a smartphone is not just a symbol of your livelihood; it is a brilliant tool for personal ambitions and future projects. "I am in my late 30s and I have been working with computers for over 25 years now, and being a resident geek I have lots to share." Ranjit has been making electronics and guide videos and uploading them to YouTube for six years now.

Cataloguing ideas and putting them on the internet is allowing readers, viewers and eager techies to make note of what needs to be done. With the help of user-friendly tech, reliable planning and protocol, Indians are taking huge strides in solving technology-, data- and content-related situations.

Visit link:

From mobile phones to cloud computing - Media India Group


Alibaba to enter European cloud computing market in mid-2017 – Air Cargo World (registration)

Posted: June 16, 2017 at 3:54 pm

As it seeks to expand its global reach outside China, Alibaba Cloud announced that it will introduce MaxCompute to Europe's US$18.9 billion cloud computing market in the second half of 2017.

The E.U. cloud market is smaller than, say, its U.S. counterpart, but it is already populated by the likes of Amazon, Salesforce, Microsoft and IBM, setting the stage for competition. Alibaba said it hopes to get a piece of this action by convincing E.U. businesses that its artificial intelligence (AI) features can "unlock the immense value of their data, using a highly secure and scalable cloud infrastructure and AI programs," said Wanli Min, an Alibaba Cloud scientist responsible for AI and data mining.

The Chinese cloud provider's AI program is already popular in China, where it has been applied to easing traffic congestion, diagnosing disease through medical imaging, and even predicting the winners of reality-show contests.

MaxCompute intends to deliver these same services to the E.U. market, using advanced AI and deep-learning technologies and algorithms for data storage, modeling and analytics.

Alibaba Cloud opened its first European data center in Germany in late 2016. The company has not revealed what its E.U. customer base looks like, but said it is in discussions with companies in Europe about using MaxCompute.


View original post here:

Alibaba to enter European cloud computing market in mid-2017 - Air Cargo World (registration)


IoT apps and event-driven computing reshape cloud services – TechTarget

Posted: June 15, 2017 at 9:44 pm

Tools are always shaped by their uses. When cloud computing first came on the scene, it was a form of hosted virtualization, and its goal was to look like a bare-metal server.

Infrastructure as a service (IaaS) shaped the earliest cloud services, and it still dominates public cloud as well as the private cloud software market. Even so, that doesn't mean it's going to be the source of future cloud opportunity.

Cloud providers are always strategizing for the future, and their plans reveal an important -- and already underway -- shift. Every major public cloud provider has added services to process events. In particular, providers are adding features to help developers build applications for the internet of things (IoT). Could these be the basis for the most transformational set of applications to come along since the internet itself?

Legacy applications follow a pattern that's decades old: Work comes to the applications that support it. In traditional cloud computing, users pay for the processing resources they use. The terms differ, but it's essentially a lease of virtual infrastructure. This is a direct mirror of what happens in a data center -- a server farm is loaded with applications and transactions are routed to the correct server in the pool. This approach is great where work is persistent, as in the case of a retail banking application that runs continuously.

Event-driven and IoT apps change this critical notion of persistence. An event can pop up anywhere, at any time. It would be wasteful, perhaps prohibitively wasteful, to dedicate an IaaS instance to wait around for an event. Or the instance might reside within a data center halfway around the world from where the event occurs. If all possible event sources were matched with traditional cloud hosting points, most would sit idle much of the time, doing little but running up costs.

The reason why there's a specific right or wrong place to process events is simple: delay. Most events have a specific response-time expectation. Imagine a machine that triggers spray paint when an item passes a sensor. Picture a self-driving vehicle that approaches a changing traffic light.

The information flow between an event and the receipt of the appropriate response is called a control loop. Most events require a short control loop, which means that their processes need to be close to the point of the event. That's the problem with control loops that force event-handling processes to disperse out toward the cloud edge -- and multiply in numbers.

It's easy to see how the scarcity of events at a given point creates a problem of cloud efficiency and pricing for traditional cloud computing. It's also possible to have too many events. The cloud can allow for cloud bursting, or scaling capacity by spinning up multiple copies of an application component on demand, but it's not that easy.

Few applications written to run on a bare-metal server can seamlessly scale or replace failed instances. Those cloud capabilities aren't common in data centers, where legacy applications run. Moving the applications to the cloud doesn't add the features necessary to scale applications, either.

Multiple copies of an application component require load balancing, and many applications were not designed to allow any copy of a component to handle any event or request. Applications that work by assuming a string of requests in context can't work if half the string goes to one copy of the application and the other half to another. How do we make IoT apps scalable and resilient? They have to be rewritten.
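The rewrite the article calls for usually means moving per-session context out of the application process, so that any replica behind the load balancer can serve any request. A minimal sketch of the idea, where a plain dict stands in for an external shared state service such as Redis (all names here are illustrative):

```python
# "store" simulates an external, shared state service. If this history
# lived in one replica's memory instead, a request routed to a different
# replica would lose the session context -- the failure mode described above.
store = {}

def handle_request(replica_id, session_id, item):
    """Any replica can serve any request because context lives in the store."""
    history = store.setdefault(session_id, [])
    history.append(item)
    return {"served_by": replica_id, "items_so_far": len(history)}

# Two different replicas serve halves of the same session correctly:
r1 = handle_request("replica-a", "sess-42", "first")
r2 = handle_request("replica-b", "sess-42", "second")
```

With the state externalized, spinning up another copy of the component under load, or replacing a failed one, needs no coordination between replicas.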

Developers are doing just that, and big cloud providers are responding. In particular, they all see the same IoT-and-event future for the cloud. They have been steadily enhancing their cloud offerings to prepare for that future. Not only do the cloud giants offer special web services to manage IoT devices and connections, they now provide tools to support the kind of programming that IoT apps will demand.

The functional or lambda style of programming doesn't allow an application or component to store data between uses. As a result, all instances of the component can process an event. Cloud providers now offer functional or microservice support instead of simply providing infrastructure, platform or software as a service, because a function cloud is very different.

Where is your function hosted in a function cloud? Everywhere. Nowhere. Functions are activated anywhere they're needed -- when they're needed -- and you pay when you use one. Function clouds for IoT, or any kind of event-driven computing, represent the ultimate flexibility and agility. They also demand that users take care to establish policies on just how much function hosting they are willing to pay for, a decision they'll have to make based on the combination of cost and those pesky control-loop lengths.
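Concretely, a function-cloud workload is just a stateless entry point invoked once per event. This sketch is loosely modeled on the `handler(event, context)` convention that AWS Lambda uses for Python functions; the event shape and field names are assumptions for illustration:

```python
def handler(event, context=None):
    """Stateless event handler: everything needed to respond arrives in the event."""
    reading = event.get("sensor_reading", 0.0)
    threshold = event.get("threshold", 50.0)
    # No module-level state is read or written, so the platform may run this
    # anywhere, any number of times, and bill only for the invocations made.
    return {"alarm": reading > threshold, "reading": reading}

# A single simulated invocation, as the platform would perform it per event:
result = handler({"sensor_reading": 72.5, "threshold": 50.0})
```

Because nothing persists between calls, the provider is free to place each invocation wherever capacity and control-loop length dictate, which is exactly the "everywhere, nowhere" hosting the article describes.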

Amazon has even allowed for the possibility that IoT will demand cloud applications that migrate outside the cloud. Their Amazon Web Services (AWS) Greengrass platform is a software and middleware framework that lets users execute AWS-compatible functions on their own hardware. This capability will let IoT users do some local processing of events to keep those control loops short, but still host deeper, less time-critical functions in the AWS cloud.

The old cloud model made you pay for hosting instances. In the function cloud, you don't host instances in the usual way. You have extemporaneous execution of functions, as needed. This is what gives rise to the pay-as-you-go or serverless description of the function cloud, but that's short of the mark. You could price any cloud computing service, running any application, on a usage basis, but that doesn't make those cloud services scalable or easily optimized. Without these features, serverless is just a pricing strategy.

Developers will have to make changes in applications to accommodate IoT and function clouds. Almost every new program or service stores information, and this makes it difficult to scale. The rule of functional programming is stateless, meaning that the output you get from a process is based only on what you provide as input. There are even programming languages designed to enforce stateless behavior on developers; it's not second nature.
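The stateless rule can be seen in a short contrast. Both functions below are invented for illustration: the stateful version's output depends on hidden history, so replicas that each saw half the calls would disagree, while the pure version's output depends only on its input, making every instance interchangeable:

```python
# Stateful: the result depends on how many times THIS copy was called.
_count = 0

def stateful_tag(event):
    global _count
    _count += 1
    return f"{event}-{_count}"

# Stateless (pure): the result is a function of the input alone, so any
# instance, on any host, produces the same answer for the same input.
def stateless_tag(event, sequence_number):
    return f"{event}-{sequence_number}"
```

Pushing the sequence number into the input is the small but habit-changing rewrite that functional-programming discipline demands of developers.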

The notion of the function cloud is likely to accelerate a trend that's already started in response to the use of mobile devices and implementation of BYOD policies. Companies have found that they are creating application components designed to format information for mobile devices, interface with apps written for a variety of mobile platforms and provide consistent support from back-end applications often running in data centers.

These forces combine to create a two-tier model of an application. The device-handling front tier lives in the cloud and takes advantage of the cloud's ability to distribute applications globally. The cloud part then creates traditional transactions for the core business applications, wherever they are.

IoT is even more distributed than mobile workers, and some IoT events need short control loops. As a result, cloud hosting of the front-end part of applications could see explosive growth. That puts pressures on the off-ramp of this kind of two-tier application structure because many events might generate many transactions. These transactions can overwhelm core business applications. Cloud providers are working on this, too. Microsoft, for example, has a cloud-distributed version of the service bus typically used to feed business applications with work.
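The two-tier flow above can be sketched as a cloud-hosted front tier that normalizes per-device payloads into a uniform transaction shape, with a queue standing in for the distributed service bus that feeds the core business application. Function names and payload fields are hypothetical:

```python
from queue import Queue

service_bus = Queue()  # stand-in for a distributed service bus

def front_tier(device_event):
    """Cloud tier: normalize varied device payloads into one transaction shape."""
    txn = {
        "device": device_event["id"],
        "type": device_event.get("kind", "telemetry"),
        "value": device_event["value"],
    }
    service_bus.put(txn)  # hand off to the core business application
    return txn

def core_application():
    """Back-end tier: drains uniform transactions from the bus at its own pace."""
    processed = []
    while not service_bus.empty():
        processed.append(service_bus.get())
    return processed

front_tier({"id": "sensor-9", "value": 72.5})
front_tier({"id": "sensor-3", "kind": "alarm", "value": 99.0})
batch = core_application()
```

The bus is what absorbs the pressure at the "off-ramp": a burst of events becomes a backlog of transactions rather than a direct overload of the core application.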


Given that IoT is in its infancy -- and cloud IoT is even younger -- it's easy to wonder why cloud providers are already offering IoT features. There are three reasons. First, IoT could radically increase IT spending, and cloud providers want to grab some of that as potential new revenue. Second, IoT isn't the only thing that generates events. A lot of mobile worker interaction, for example, looks like event processing. Finally, functional programming techniques are being promoted for every kind of processing. IoT apps demand them. Developer tools and conferences are already describing how functional programming techniques can make programs better and more maintainable.

If you're writing functions for any reason, isn't using a function cloud inevitable?

That's the big question that every cloud provider and cloud user needs to think about. Fully scalable applications -- ones that can increase or decrease capacity under load and repair themselves by simply loading another copy -- are very useful to businesses. The functional programming techniques developed for IoT apps, and the function clouds to support those techniques, will remake programs.

Tools are defined by their uses, remember? Well, users are already seeing the cloud of the future in event processing, and IoT will accelerate that trend. IoT's potential to generate events over a wide area, in large numbers, while demanding short control loops will revolutionize cloud use.


See the article here:

IoT apps and event-driven computing reshape cloud services - TechTarget

