7 Tips for Securely Moving Data to the Cloud – Government Technology (blog)

A few years back, an unmistakable trend emerged: cloud computing was growing, both in the percentage of organizations adopting cloud solutions and in the amount and type of data being placed in the cloud.

Earlier this year, I highlighted research that made it clear that trust and risks are both growing in government clouds. Since that time, many readers have asked for more specific guidance about moving more data to the cloud in the public and private sectors. I was asked: What are the right cloud questions?

Questions like: Where are we heading with our sensitive data? Will cloud computing continue to dominate the global landscape? These are key questions that surface on a regular basis.

The forecast for the computer industry is mostly cloudy. Here are some of the recent numbers:

Back at the end of last year, The Motley Fool reported 10 Cloud Computing Stats That Will Blow You Away, and the last three listed are especially intriguing to me.

IoT, Other Trends and the Cloud

And while it is true that the Internet of Things (IoT) has taken over the mantle as the hottest trend in technology, the reality is that IoT and digital transformation have driven the adoption of cloud computing in business organizations, according to U.S.-based cloud infrastructure firm Nutanix.

This article from CxO Today lays out the case that the cloud remains the most disruptive force in the tech world today. Why?

While premises-based IT software and tools have their own advantages, the global trend is toward cloud-based applications, since they offer more connectivity and functionality than legacy systems. Moreover, enterprises are naturally gravitating toward the cloud because the technology is reasonably reliable and affordable, and it provides access to new and emergent technologies as well as high-end skills. The cloud boom is also propelled by the fact that enterprises are trying to improve performance and productivity over the long term. Looking at the tremendous response to cloud services, several IT companies are designing applications meant solely for pure cloud play.

Other experts say that several overlapping trends are colliding as "the edge is eating the cloud."

Overcoming Fears in the Cloud

And yet, there are plenty of enterprises that continue to have significant concerns regarding cloud computing contracts. Kleiner Perkins' Mary Meeker highlighted the fact that cloud buyers are kicking the tires of multiple vendors while becoming more concerned about vendor lock-in.

Also, technology leaders often move to the cloud to save money, but CFOs are now telling IT shops to cut costs in the cloud, fearing that resources are being wasted.

Also, while overall trust in cloud infrastructure is higher, new concerns are rising about application security delivered through the cloud.

My 7 Tips for Moving Data into the Cloud

So what can technology and security leaders do to protect their data that is moving to the cloud?

Here are seven recommendations that can help you through the journey. Note that the first four items are largely best practices about your current data situation and options before your data moves.

1) Know your data. I mean, really know what is happening now, before you move the data. Think about the analogy of doing a house cleaning and organizing what you own before putting things in storage to sell your house.

If you don't want to catalog everything (which is a mistake), at least know where the most important data is. Who is doing what regarding the cloud already? What data is sensitive? This is your as-is data inventory: the current data and its known protections. And don't forget shadow IT. There are plenty of vendor organizations that can help you through this process.
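
If some of this data already sits in a cloud platform such as AWS, even a small script can jump-start the as-is inventory. Here is a minimal sketch in Python using boto3 (the AWS SDK); the "classification" tag key is my own assumed convention, so swap in whatever tagging scheme your organization actually uses.

```python
# Minimal as-is inventory sketch: list S3 buckets and their data
# classification tags. The "classification" tag key is a hypothetical
# convention, not an AWS requirement.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        tags = s3.get_bucket_tagging(Bucket=name)["TagSet"]
        tag_map = {t["Key"]: t["Value"] for t in tags}
    except ClientError:
        tag_map = {}  # bucket has no tags at all: flag it for review
    print(name, "->", tag_map.get("classification", "UNCLASSIFIED"))
```

Untagged or unclassified buckets are exactly the shadow IT candidates worth chasing down first.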

2) Have a defined and enforced data life cycle policy. You need to know what data is being collected by your business processes, where it goes, who is accountable (now) and what policies are in force.

Ask: Is there appropriate training happening now? Is it working? What policies are in place to govern the movement of your data? For example, my good friend, Delaware CSO Elayne Starkey, does a great job with policies in this area. You can visit this Web portal for examples: https://dti.delaware.gov/information/standards-policies.shtml

3) Know your cloud options: Private, public, hybrid or community cloud? This simple step often gets confusing, in my experience, because some staff mix these terms up with public-sector and private-sector definitions, wrongly thinking that a private cloud means a private-sector-owned cloud.

Here are some basic cloud definitions to ponder with your architecture team:

Private cloud: The organization chooses to have its own cloud, where the resource pooling is done by the organization itself (a single-organization cloud). It may or may not be on premises (in your own data centers).

Public cloud: Different tenants pool resources on the same shared infrastructure.

Pros: It is easily consumable, and the consumer can provision resources on demand.

Cons: The consumer will not get the same level of isolation as with a private cloud.

Community cloud: A cloud shared by different organizations that are usually unified by a common community and share the underlying infrastructure (halfway between private and public); it lets smaller organizations pool resources. For example, some state and local government organizations share email hosting with other state and local governments in the U.S. only.

Hybrid: A mixture of both private and public; for example, an organization might want the elasticity and cost-effectiveness of the public cloud while keeping certain applications in a private cloud.

4) Understand and clearly articulate your Identity and Access Management (IAM) roles, responsibilities and demarcation points for your data. Who owns the data? Who are the custodians? Who has access? Who can add, delete or modify the data? Really (not just on paper)? How will this change with your cloud provider?

Build a system administration list. Insist on rigorous compliance certifications. Incorporate appropriate IAM from the outset, ideally based on roles, especially for administration duties. When you move to the cloud, the customers, not the provider, are responsible for defining who can do what within their cloud environments. Your compliance requirements will likely dictate what your future architecture in the cloud will look like. Note that these staff may need background checks, a process to update lists (for new employees and staff who leave) and segregation of duties as defined by your auditors.
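
To make that system administration list concrete, here is a hedged sketch, again in Python with boto3, that pulls every IAM user directly holding AWS's managed AdministratorAccess policy. It is a starting point only: it ignores group- and role-based grants, and your own policy names will differ.

```python
# Build a draft "system administration list" from AWS IAM.
# Only checks policies attached directly to users; extend for groups/roles.
import boto3

iam = boto3.client("iam")
admins = []

for page in iam.get_paginator("list_users").paginate():
    for user in page["Users"]:
        policies = iam.list_attached_user_policies(UserName=user["UserName"])
        names = {p["PolicyName"] for p in policies["AttachedPolicies"]}
        if "AdministratorAccess" in names:  # AWS's managed admin policy
            admins.append(user["UserName"])

print("Admins to background-check and periodically re-certify:", admins)
```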

5) Apply encryption thinking end to end: data at rest and data in transit. We could do an entirely separate blog on encryption, since a recent (and scary) report says there is no encryption on 82 percent of public cloud databases. Here are a few points to consider: Who controls and has access to the encryption keys? What data is truly being encrypted, and when? Only sensitive data? All data?
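
As one small illustration of the end-to-end idea, consider this boto3 sketch: the SDK talks to AWS over HTTPS by default (covering data in transit), and the request asks S3 to encrypt the stored object under a KMS key (covering data at rest). The bucket name and key ID are placeholders, and who controls that key is exactly the question raised above.

```python
# Sketch: encrypt an object at rest with a KMS key; boto3 itself uses
# HTTPS endpoints, covering the in-transit leg. Names are placeholders.
import boto3

s3 = boto3.client("s3")

s3.put_object(
    Bucket="my-sensitive-bucket",                        # placeholder
    Key="records/2017/report.csv",
    Body=b"...data...",
    ServerSideEncryption="aws:kms",                      # at-rest encryption
    SSEKMSKeyId="1234abcd-12ab-34cd-56ef-1234567890ab",  # who holds this key?
)
```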

6) Test your controls. Once you move the data, your cloud solution vulnerability testing should be rigorous and ongoing and include penetration testing. Ask: How do you truly know your data is safe? What tools do you have to see your data in the cloud environment? How transparent is this ongoing process?

The cloud service provider should employ industry-leading vulnerability and incident response tools. Such tools enable fully automated security assessments that can test for system weaknesses and dramatically shorten the time between critical security audits from yearly or quarterly to monthly, weekly or even daily.

You can decide how often a vulnerability assessment is required, varying from device to device and from network to network. Scans can be scheduled or performed on demand.
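
The scheduling logic itself can be very simple. Below is a toy Python sketch of per-device scan cadences; the scan() stub stands in for whatever scanner or service you actually use, and the intervals are illustrative only. An on-demand scan is just a direct call to scan().

```python
# Toy per-device scan scheduler; replace scan() with a real scanner call.
import time

def scan(target):
    print(f"scanning {target} ...")  # invoke your scanning tool here

# Hypothetical cadences, in seconds: daily for the frontend, weekly for HR.
targets = {"web-frontend": 24 * 3600, "hr-database": 7 * 24 * 3600}
last_run = {t: 0.0 for t in targets}

while True:
    now = time.time()
    for target, interval in targets.items():
        if now - last_run[target] >= interval:
            scan(target)
            last_run[target] = now
    time.sleep(60)
```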

7) Back up all data in a distinct fault domain.

Gartner recommends: To spread risk most effectively, back up all data in a fault domain distinct from where it resides in production. Some cloud providers offer backup capabilities as an extra-cost option, but that isn't a substitute for proper backups. Customers, not cloud providers, are responsible for determining appropriate replication strategies, as well as maintaining backups.
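
As a rough illustration of Gartner's advice, here is a minimal boto3 sketch that copies production objects from a bucket in one region to a backup bucket in another region, i.e., a distinct fault domain. Bucket names and regions are placeholders; a real setup would add versioning, lifecycle rules and integrity checks.

```python
# Sketch: copy every object to a backup bucket in a different region.
import boto3

SRC_BUCKET = "prod-data-us-east-1"    # production fault domain (placeholder)
DST_BUCKET = "backup-data-us-west-2"  # distinct fault domain (placeholder)

src = boto3.client("s3", region_name="us-east-1")
dst = boto3.client("s3", region_name="us-west-2")

for page in src.get_paginator("list_objects_v2").paginate(Bucket=SRC_BUCKET):
    for obj in page.get("Contents", []):
        dst.copy_object(
            Bucket=DST_BUCKET,
            Key=obj["Key"],
            CopySource={"Bucket": SRC_BUCKET, "Key": obj["Key"]},
        )
```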

Final Thoughts

No doubt, managing your data in the cloud is a complex and ongoing challenge that includes many other pieces beyond these seven items. From contract provisions to measuring costs incurred for the services to overall administration functions, the essential data duties listed here are generally not for technology professionals or contract pros who lack real experience.

Nevertheless, all organizations that move data into and out of cloud providers' data centers are constantly going through this data analysis process. Just because you moved sensitive data into the cloud five years ago for one business area does not mean that new business areas can skip these steps.

If you are in a large enterprise, you may want to consider adding a cloud computing project management office (PMO) to manage vendor engagement and ensure the implementation of best practices across all business areas.

And don't just fall for the typical line: "I know XYZ company (Amazon or Microsoft or Google or fill-in-the-blank) is better at overall security than we are, so just stop asking questions." Yes, these companies are good at what they do, but there are always trade-offs.

You must trust but verify your cloud service because you own the data. Remember, you can outsource the function, but not the responsibility.

Pressing Tech Issue: Enterprise Software Vs. Cloud Computing? – Credit Union Times

One of Robert Frost's most popular poems contains more than a few parallels with what insurance technology executives are grappling with as they look at systems in the cloud compared with systems housed within their own organizations. Consider this classic verse:

"Two roads diverged in a wood, and I...

I took the one less traveled by,

And that has made all the difference."

Certainly there are many who are opting for the less-traveled SaaS road, and others who prefer the other road commonly called enterprise.

Within the insurance industry, cloud technologies have been successfully deployed in ancillary areas of the organization such as Human Resources, Accounting, e-mail, and other non-core areas of the business. Typically, those core applications such as policy administration, agent commissions, rating, billing, claims, and agent and customer portals have been firmly entrenched in enterprise or on-premises applications.

However, with the success of cloud-based software in those non-mission-critical areas, SaaS systems are becoming the favored choice for deployment in certain core insurance areas. But for those core tasks that are truly mission-critical, have deep integration requirements and, importantly, are processor-intensive, IT executives are taking a go-slow approach before they commit to putting those systems or business processes into the cloud.

Why the concern? The short answer is that enterprise software is "owned" by the insurance carrier, and the risk of a data breach of sensitive information is relatively low when the application is housed behind the insurance company's firewall. Insurance companies are huge repositories of customers' personal information. And that information is entrusted to the insurance company with the expectation that it will remain private and confidential.

In short, enterprise software deployments merit a certain kind of security that is hard to duplicate in a cloud-based system.

Another aspect to consider is processing horsepower. Saving and retrieving data such as we see in popular CRM systems like Salesforce.com is not particularly processor intensive. Tasks with intensive calculation requirements, such as commissions and bonus calculation, are another matter. These systems can often have more than a hundred integration points both up- and downstream, and managing them in the cloud is a major concern to many insurers.

According to recent research from Novarica, the key driver for carriers adopting SaaS systems was "the speed of deployment and the ability to sunset current applications."

Among the common drivers for carriers to adopt SaaS systems, according to Novarica, were standardization paired with release management, which reduces support costs and ultimately lowers the cost of ownership. However, that standardization, call it efficiency, is largely a trade-off between having key business processes undifferentiated from competitors on the same SaaS application and having a custom-designed application that preserves competitive differentiation.

Large companies see being able to differentiate from competitors as a key advantage of the on-premises model. Additionally, large companies have very large IT staffs that are capable of implementing and managing new applications.

Cost is clearly another factor in making SaaS a viable choice for many core insurance applications. For mid-tier and smaller insurance organizations, the advantages of SaaS are clear:

No infrastructure costs;

Software is on a subscription model that includes maintenance and upgrades; and

Provisioning is very easy.

With SaaS, a smaller insurance company can readily compete with the 'big guys.' While some simple back-of-the-napkin analysis can show advantages for SaaS, the analysis is really an apples-to-oranges comparison. A more detailed look at cost and a few other items show that cost may not be the main concern.

You may not appreciate the importance of some of the items buried in the fine print of SaaS solution provider contracts. Items such as transaction volume, number of processes allowed per month, data storage fees, data transformation costs and other items can result in significant additional fees levied by the vendor that must be met for subscription compliance.

If you don't understand and carefully quantify each item in the SaaS agreement, fees can easily double or triple, but you might not realize the impact until the solution is implemented and in full production and you receive your first over-usage invoice.
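
A quick back-of-the-napkin model shows how fast those line items compound. Every number below is hypothetical; the real ones live in your contract's fine print.

```python
# Hypothetical SaaS overage model; substitute figures from your contract.
base_subscription = 10_000        # USD per month, sticker price
included_transactions = 500_000   # transactions covered by the subscription
overage_per_1k = 45.0             # USD per 1,000 transactions over the cap
storage_fee_per_gb = 0.80         # USD per GB-month beyond included storage

actual_transactions = 1_400_000   # what production actually does
extra_storage_gb = 900

overage = max(0, actual_transactions - included_transactions) / 1000 * overage_per_1k
storage = extra_storage_gb * storage_fee_per_gb
total = base_subscription + overage + storage
print(f"effective monthly cost: ${total:,.0f} "
      f"({total / base_subscription:.1f}x sticker price)")
```

With these made-up figures, the first full-production invoice lands at roughly five times the subscription price, which is precisely the surprise the fine print hides.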

To get a full assessment of hosted versus on-premises, factors such as implementation, customization, upgrade cycles, support and maintenance, security, scalability and integration(s) must be understood. For example, implementing a SaaS application is relatively easy, since it uses a ready-made platform that has already been provisioned, while on-premises applications take resources, equipment and time to set up a new environment. In essence, the financial choice is whether the new system will tap the operating expense budget or the capital expense budget.

The key in assessing the advantages and disadvantages of SaaS or on-premises is one that is common to all technology acquisitions: the vendor. At the outset, the absolute key requirement is that the vendor has extensive experience within insurance technology. There are many vendors that purport to have deep domain experience in insurance. From what I've observed, however, in many applications sold to insurance companies, vendors are very likely taking a horizontal application and providing some level of uniqueness that makes it salable to insurance companies. This is very common in CRM and commissions applications, where vendors have created hundreds of applications for everything from managing sales to managing human resources to managing inventory. Vendors will claim insurance expertise, but a look under the hood will usually reveal an application that was built for, say, telecommunications or pharmaceuticals and verticalized to make it acceptable to insurance carriers and distributors. It's the old "one-size-fits-all" mentality.

Where the rubber hits the road in vendor selection is in looking at a vendor's expertise in integration and security. As experienced insurance IT managers are aware, insurance infrastructure can be a hodgepodge of technologies and applications that run the gamut from fairly modern to legacy. A vendor that doesn't have a track record of successful implementations with a variety of technologies is one that probably shouldn't be on your short list. As a starting point, look for applications with embedded integration platforms that you (not the SaaS IT/support team) will have full access to. The same thing can be said regarding the privacy and security of data and personal and private information.

Insurance carriers are very aware of the security implications of SaaS, where security is dependent on the vendor. A corollary to the vendor's experience in integrations is the vendor's experience in implementing fixes to the software or migrating existing clients to new versions. Again, vendors that have dozens of satisfied clients are more likely to have the experience and talent to become a credible business partner. One more tip on vendor selection:

Ask for a report detailing system outages for the last two years that shows the nature of the outage, core issue and time to resolution. If the vendor refuses to deliver this document, think again about adding them to your short list.

Some large vendors in our space have recently dropped their on-premises solutions and 'gone all in' for the cloud. It might be safer to go with a vendor that can provide cloud or on-premises solutions, leaving the final hosting decision in your hands. You can always migrate to the cloud later if you're not comfortable with the change now. The choice between the cloud and on-premises is very much like choosing between the two paths that 'diverged in the wood.'

There are certainly advantages to each alternative, but ultimately the key driver is whether the vendor can accommodate both software delivery models, on-premises and SaaS. Vendors that have the capability to work with clients with unique requirements that mandate enterprise software or SaaS are vendors that have the overall experience to help you choose which path to take.

John Sarich is an industry analyst and VP of Strategy at VUE Software. He is a senior solutions architect, strategic consultant and business advisor with over 25 years of insurance industry experience. He can be reached at John.Sarich@VUESoftware.com.

Originally published on PropertyCasualty360.

Alibaba to enter European cloud computing market in mid-2017 – Air Cargo World (registration)

As it seeks to expand its global reach outside China, Alibaba Cloud announced that it will introduce MaxCompute to Europe's US$18.9 billion cloud computing market in the second half of 2017.

The E.U. cloud market is smaller than, say, its U.S. counterpart, but it is already populated by the likes of Amazon, Salesforce, Microsoft and IBM, setting the stage for competition. Alibaba said it hopes to get a piece of this action by convincing E.U. businesses that its artificial intelligence (AI) features can "unlock the immense value of their data, using a highly secure and scalable cloud infrastructure and AI programs," said Wanli Min, an Alibaba Cloud scientist responsible for AI and data mining.

The Chinese cloud provider's AI program is already popular in China, where it has been applied to easing traffic congestion, diagnosing disease through medical imaging, and even predicting the winners of reality show contests.

MaxCompute intends to deliver these same services to the E.U. market, using advanced AI and deep-learning technologies and algorithms for data storage, modeling and analytics.

Alibaba Cloud opened its first European data center in Germany in late 2016. The company has not revealed what its E.U. customer base looks like, but said it is in discussions with companies in Europe about using MaxCompute.

Alibaba corporate headquarters

Amazon Still Leads Cloud Rankings, But Competition Is Coming On Strong – Fortune

When it comes to major cloud providers, there are three top contenders, and then everyone else.

That was reflected once again in market research firm Gartner's latest annual scorecard of the top public cloud providers, which showed Amazon Web Services, Microsoft and Google retaining the top three spots once again.

Below the top three there were a few surprises.

These so-called public cloud companies amass pools of computers, storage and networking that they rent to business customers who don't want to spend more money running their own data centers. The Gartner (IT) report, released Thursday, is closely watched by businesses pondering a shift to cloud computing, and by the vendors themselves, who use it for marketing.

Here are a few takeaways.

Amazon Web Services, which is expected to bring in $14 billion in revenue this year, remains the biggest player by far. It also leads the pack based on Gartner's measure of its strategic vision (based on the services it plans to provide customers over time) and its ability to actually deliver those services. For what it's worth, AWS has topped this list since at least 2013, which is no surprise, given that it was first to market 11 years ago.

AWS offers the widest selection of services overall, again in part because it's been in the market longest, but also because it's shown little sign of slowing down. On the flip side, buying and implementing AWS services often requires a lot of technical expertise, which is not necessarily a good thing. Some joke that even figuring out an AWS bill can require a graduate degree.

For the past two years, Gartner estimated that Amazon (AMZN) ran more than ten times the computing capacity of the next 14 cloud providers combined. There was no mention of a similar measure this year, and the report itself did not break out market share figures.

To get an indication of that, Synergy Research published a report in February estimating that Amazon's "dominant" 40% market share was flat year over year as measured across several cloud computing categories. Microsoft (MSFT), Google (GOOG) and IBM together accounted for 23% of the market across those same categories. Microsoft and Google, in particular, showed year-over-year growth at the expense of smaller competitors, according to Synergy.

But back to Gartner: The report stated that No. 2 Microsoft remains the cloud of choice for many businesses that have run Microsoft software in their own data centers for years. Gartner estimates that Azure revenue will be nearly $3 billion this year. (Microsoft itself lumps Azure revenue in with other non-cloud products, so it's hard to see how much the cloud service alone makes.)

Microsoft's large, experienced sales force is a boon for its cloud efforts. However, for a company that has grown up selling to and supporting corporate customers, it is surprising that some Gartner clients cited tech support and training issues.

Google rounds out the big three. The Internet search and ad giant has invested heavily in the Google Cloud Platform and improved its sales approach to appeal to business customers in bigger companies as well as new startups that tend to be cloud-oriented. But Google still lacks some key cloud features, compared to AWS, Gartner said.

As for the next tier of providers, two were new to Gartner's rankings. Chinese retail giant Alibaba (BABA), which is backing its Aliyun public cloud both in and outside of China, made its first appearance, as did Oracle (ORCL), the business software giant that arrived late to the public cloud market.

IBM also had a bigger presence in part because Gartner now looks at IBM's combined Bluemix and SoftLayer businesses, Gartner vice president and distinguished analyst Lydia Leong told Fortune. Bluemix is cloud technology for building business software while SoftLayer is more basic data center infrastructure that IBM acquired a few years ago. Last year, Gartner focused only on SoftLayer.

Shadow raises $57 million for its cloud computing service for … – TechCrunch

French startup Shadow, also known as Blade, just raised a Series A round of $57.1 million (€51 million). Shadow thinks your next computer is going to be in a data center. Your existing phones, laptops and Shadow's own device act as a thin client, a window into your virtual machine running on a beefy server in a data center near you.

Shadow had already raised $14.6 million from around 20 business angels. The same ones invested again, starting with Nick Suppipat, Pierre Kosciusko-Morizet and Michael Benabou.

I've already written a bit about Shadow. The startup is running thousands of virtual machines on 800 server-grade Intel Xeon processors, with a dedicated Nvidia GTX 1070 for each user. It's only available in France for now.

In my own testing, it works quite well already even though you can feel that the service is still a bit young. The only issue is that you currently need a speedy fiber connection, which still limits the market quite a bit in France.

You can get your personal instance for around $32.70 per month (€30). This isn't just a gaming platform; you get a full Windows 10 virtual machine. So far, 3,500 people have been using the service, as the company has been accepting new users in batches every other month.

With today's funding round, the company plans to accept a lot more customers. "The first thing we're going to change is that we were relying on a pre-order system," co-founder and CEO Emmanuel Freund said. "We have a lot more demand compared to our offering output. We're going to switch to an instant ordering system."

Eventually, you'll be able to sign up to Shadow and use your Shadow computer the next day. This is going to be challenging, as Shadow will need to keep up with the demand and roll out enough servers so that new servers are always available.

Shadow already said that it isn't in the business of looking at your data. You control your Windows instance, you can encrypt your data and Shadow doesn't have your Windows password. But the company said that it plans to provide its own encryption system and write stronger terms of service when it comes to privacy.

These users have spent an average of 2.5 hours per day on the service over the last 30 days. By targeting gamers, Shadow has been focusing on heavy PC users, so this number isn't that surprising.

But the startup doesn't plan to stop there. Up next, Shadow wants to sell instances through B2B channels and target less powerful needs. Eventually, Shadow wants to replace computers in your office or your grandparents' computer. Those servers probably aren't going to have a big Nvidia GPU, but it's going to bring the next big wave of users.

The startup wants to attract 100,000 clients by the end of 2018. Shadow is going to expand to the U.K. and Germany in 2017, with other European countries following suit. For each geographical expansion, the startup needs to find new data centers and sign peering deals with telecom companies around Europe.

The company is also going to open an office in Palo Alto so that it can talk with American partners, such as server makers and Microsoft. And I'm sure that the company will need a ton of capital to buy new servers and expand its infrastructure.

While cloud computing for end users has been a wild dream for years, internet connections may have become fast enough to turn this into something that you can actually use. Shadow plans to take advantage of that.

‘Sweden is heaven for cloud computing’: Amazon Nordic chief – The … – The Local Sweden

Darren Mowry of Amazon Web Services. Photo: AWS

The head of Nordic operations for Amazon Web Services (AWS) has spelled out exactly why the US cloud computing giant chose to locate three state-of-the-art data centers in Sweden.

In April, it emerged that AWS planned to open a new infrastructure region for its cloud computing services in the Stockholm region in 2018.

Sweden's enterprise and innovation minister Mikael Damberg hailed the deal as "huge" for Sweden.

"They could do that wherever in the world, but chose to do it here," he added.

Now the man responsible for expanding AWS's cloud services operations in Sweden, American Darren Mowry, has disclosed the reasoning behind his company's decision to invest in Sweden.

"Sweden truly does have it all," Mowry writes in a blog post published on the Data Centers by Sweden website.

But there's more to it than that.

Read his full explanation behind the AWS investment here.

This article was produced by The Local Client Studio and sponsored by Data Centers by Sweden.

Amazon.com to open second government cloud-computing region … – The Seattle Times

The cloud-computing unit of Amazon.com said the new AWS GovCloud Region, which can include one or more data centers, is expected to open in 2018. It will be located on the East Coast.

Amazon Web Services is launching a second cloud-computing region dedicated to hosting sensitive data and workloads from the U.S. government and regulated industries.

The cloud-computing unit of Amazon.com said Tuesday that the new AWS GovCloud Region, which can include one or more data centers, is expected to open in 2018. It will be located on the East Coast.

Amazon launched its first region dedicated to sensitive computing needs by U.S. government agencies and their contractors in 2011. Located on the West Coast, it was designed to meet tough compliance requirements and remain isolated from other parts of the public cloud.

Indonesia banks have yet to implement cloud computing – Jakarta Post

As a country that is experiencing exponential growth in data volume, Indonesia and its banking system have yet to fully implement cloud computing technologies due to regulation barriers and a lack of decent infrastructure.

"Major banks in Indonesia, most of which are our clients, have 10 to 40 million customers with hundreds of millions of transactions every day,"IT solution provider Teradata Indonesia president director Erwin Z Achir said in Jakarta on Monday.

Banking services are among the data giants that generate terabytes of data every day, and they have yet to move to cloud computing technology, a type of internet-based computing in which different services are delivered to an organization's computers and devices through the internet.

Under Government Regulation No. 82/2012 on the Management of Electronic Transactions and Systems, data and disaster recovery centers for public services must be located within Indonesia, meaning that Indonesian banks must store their customers' data in the country.

Therefore, Indonesian banks that previously operated data centers located overseas must repatriate their information.

According to a 2014 survey by IDC Financial Insights on data centers, most Indonesian banks expect a 10 to 20 percent data volume growth rate per year.

Meanwhile, Fajar Muniandy, Teradata chief solution architect, said moving to the cloud has yet to become an option for the company's clients because of their massive amounts of data.

He added that cloud computing was currently being adopted by startups and smaller companies because they were building their systems from scratch.

Alibaba Cloud announces launch of data centres in India and Indonesia – Cloud Tech

Alibaba Cloud has announced plans to expand to data centres in India and Indonesia, as well as a connectivity partnership with Tata Communications.

The move to Mumbai and Jakarta goes alongside the recent expansion to Malaysia, with the company saying in press materials that it will significantly increase its computing resources in Asia. The Indonesian data centre will be the first such development from an international cloud company, according to Alibaba, while the expansion will take the Chinese firm's total number of data centre locations to 17, including China, Australia, Germany, Japan, Hong Kong, Singapore, the UAE and the US.

"I believe Alibaba Cloud, as the only global cloud services provider originating from Asia, is uniquely positioned with cultural and contextual advantages to provide innovative data intelligence and computing capabilities to customers in this region," said Simon Hu, Alibaba senior vice president and president of Alibaba Cloud, in a statement.

"Establishing data centres in India and Indonesia will further strengthen our position in the region and across the globe."

Elsewhere, Alibaba Cloud and Tata Communications announced a partnership to give customers from more than 150 countries, India in particular, greater connectivity through the companies' ExpressConnect and IZO Private Connect products, respectively.

"We are confident that the partnership between Alibaba Cloud and Tata Communications will assist both of us to become true digital transformation partners for our customers, empowering them to expand to new geographies, boost productivity, safeguard their business against threats, and take customer experience to the next level," said Genius Wong, Tata president of global network, cloud and data centre services. "We look forward to offering more global organisations connectivity to Alibaba Cloud and to strengthening our presence in the Chinese market."

According to the most recent analysis from the Asia Cloud Computing Association (ACCA), published in April last year, Indonesia and India ranked #11 and #12 out of 14 nations respectively, with both countries scoring particularly low on international connectivity (1.8 and 1.7 respectively) and data centre risk (2.7 and 1.9). The report noted India's clear challenge in securing reliable access to cloud services and building the infrastructure for a huge population, while citing the need for Indonesia to implement a coordinated plan to tackle the needs of the digital economy for cloud readiness.

The Risks and Perquisites of Cloud Computing – DATAQUEST

Cloud technology is catching up in India and the considerations for adopting it have evolved too. Customers today are looking at deploying public and private cloud capabilities in one infrastructure. In fact, according to a recent Gartner report, the public cloud services market in India is estimated to grow at 38% in 2017 to $1.81 billion.

Infrastructure as a service (IaaS), projected to grow 49.2% in 2017, will be the highest growth driver, followed by software as a service (SaaS) at 33% and platform as a service (PaaS) at 32.1%. This trend is a significant indicator that the migration of applications and workloads from on-premises data centres to the cloud, along with the development of cloud-ready and cloud-native applications, is triggering immense growth. While adoption may be on the rise, it also comes with challenges, and companies need to consider aspects like security, migration to new technologies and training for staff.

With this, the debate surrounding the security of cloud computing, specifically whether data is more secure in the cloud or not, has for the most part been settled. A growing number of organizations now view the cloud as secure, and in many cases more so than an on-premises deployment. Beyond that, as each of the public cloud vendors points out, security in the cloud is a shared responsibility, with the organization, as the application owner, responsible for protecting applications, the OS, supporting infrastructure and other assets running in the cloud.

From a security standpoint, public cloud vendors' management consoles are a key weak point and consequently an attractive target for an attacker, often via a phishing attempt. It is therefore important to lock down and secure privileged credentials in a digital vault to protect the management console. The enterprise's responsibilities, specifically the functions above the hypervisor, include securing the privileged credentials used by applications and scripts accessing other applications and assets, such as the enterprise's customer database.

Unfortunately, these credentials are all too often hardcoded. This is a particularly troubling vulnerability, as there can be a large number of hardcoded credentials used throughout cloud and hybrid environments. Hardcoding and embedding credentials in code or a script can initially make them easy to use, but this represents a significant vulnerability because attackers or malicious insiders can also easily access them, especially if the credentials are in clear text. Even worse, when credentials are hardcoded or locally stored, they are nearly impossible to rotate, further making them a static and easy target for attackers.

The Risk Is Real

As part of the DevOps process, developers often share source code they've developed on code repositories such as GitHub. While it's part of the DevOps process, it's an all-too-common example of how embedded passwords and credentials can become public if they're hardcoded. Even if the code is only saved in the enterprise's internal code repositories, those passwords and credentials can easily be accessed by other developers and used either inadvertently or maliciously. It also becomes difficult, if not impossible, to fully identify which applications or scripts are interacting with other applications and other enterprise assets.

In the past, these mistakes might not have been so risky, exploitable and damaging within an on-premises environment. However, in a cloud environment, because of the rapid pace of change, the ability to quickly scale and the tools being used, these vulnerabilities are amplified and can pose unacceptable levels of risk.

To minimize risk and follow best practices, enterprises should avoid hardcoding passwords and credentials used by applications and scripts and instead secure credentials in a digital vault and rotate them according to policy. With this approach, just like with human users, enterprises can assign unique credentials to each application, code image or script, and then track, monitor and control access. IT administrators will know which applications access resources such as a customer database. Also, when an application or script is retired, the administrator or script can simply turn off the credentials.
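
As a sketch of what this looks like in practice, here is one way an application can fetch a credential from a central vault at run time instead of hardcoding it. I use HashiCorp Vault's hvac Python client as an example; the vault address, token delivery mechanism and secret path are all placeholders for your own setup.

```python
# Fetch a database password from a central vault at run time.
# Assumes a KV v2 secrets engine at the default "secret" mount; the URL,
# token source and path are placeholders.
import os
import hvac

client = hvac.Client(
    url="https://vault.example.internal:8200",  # placeholder address
    token=os.environ["VAULT_TOKEN"],            # injected, never hardcoded
)

secret = client.secrets.kv.v2.read_secret_version(path="apps/billing/db")
db_password = secret["data"]["data"]["password"]
# Use db_password to connect; the vault rotates it centrally per policy.
```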

A core business benefit of cloud is elasticity: the ability to easily and instantaneously scale up and scale down the number of compute instances or virtual servers to meet the needs of the business at a specific point in time. With on-demand cloud computing, the business only pays for the compute, storage and other resources it uses. No human intervention is required. The cloud automation tools are either built in as a capability of the public cloud vendors' offerings, such as AWS Auto Scaling, or are part of the orchestration and automation tools used with DevOps, such as Puppet, Chef, Ansible, etc.

On-demand computing in the cloud, enabled by the automation tools, is a huge business benefit, but it also presents challenges and new potential risks: when new application instances are created and launched, they need privileges and credentials to access resources. The automation tools can provide the credentials, but these credentials also need to be secured.

Consequently, when a new application instance is created, as the compute environment dynamically scales, a best practice is to immediately secure the permissions and credentials assigned to the new instance in the secure digital vault. This ensures that the credentials can immediately be monitored, managed, secured and rotated according to policy. When the compute instances are retired, the associated credentials can also be removed. This is achieved with integrations between the various automation tools and the secure digital vault.
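
One platform-native variant of the same idea, sketched below with boto3, is to launch each new instance with an IAM instance profile so it receives short-lived credentials from the platform rather than carrying static keys. This complements rather than replaces the vault integration described above; the AMI and profile names are placeholders.

```python
# Launch an instance that gets temporary credentials via an instance
# profile, so no static keys ride along in the image or user data.
import boto3

ec2 = boto3.client("ec2")

ec2.run_instances(
    ImageId="ami-0123456789abcdef0",                 # placeholder AMI
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
    IamInstanceProfile={"Name": "app-server-role"},  # placeholder profile
)
```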

Whether the enterprise is fully in the cloud with IaaS or PaaS or is migrating to the cloud, it is critical to ensure applications, scripts and other assets use secure passwords and privileged credentials to access other applications and assets in the cloud.

Terry Crews Is On Crackdown 3 Trailer, No Cloud Computing For Single Player – EconoTimes

Crackdown 3 (BagoGames/Flickr)

Crackdown 3 is one of the most highly anticipated games in the Xbox One's lineup, not least of all because it's one of the few exclusive titles coming to the marginalized console. Microsoft released a trailer for the game that comes with the obligatory explosions and a considerable selection of firearms. It also features gaming community favorite Terry Crews. Unfortunately, it's not all good news, especially on the single-player front.

The last time Crackdown 3 made an official appearance was back in 2015, when Microsoft provided a demo of the game. The recent trailer, which consisted of many things that went bang and boom, did a good job of making it up to fans. Retaining its cel-shaded, neon theme, it's still the Crackdown of old, The Verge reports.

Opening the trailer is movie star Terry Crews, the Old Spice spokesperson himself. After a brief yet intense monologue, viewers are shown some gameplay, which includes a ton of jumping with jetpacks and blowing people away.

The game is scheduled for launch on November 7th for the Xbox One and Windows. This makes for a relatively short waiting period before gamers can start knocking down buildings in multiplayer. Speaking of which, this is where the bad news comes in.

Back in 2014, Microsoft announced that the game would feature cloud computing aspects in order to make the environment destructible. All well and good, but the company recently clarified that this was only for the multiplayer mode.

For single-player, gamers will not be able to enjoy as much of the destruction. Then again, the game might more than make up for that by absolutely slamming players with a huge amount of content and enemies to destroy as a member of the elite forces that cracks down on crime.

What's more, the game is coming out at the same time as the newly unveiled Xbox One X, Kotaku reports. Crackdown 3 would be a great testbed for bringing out the full power of the console.

China’s cloud industry moving to new era with emergence of unicorns – TechNode (blog)

Just a few years ago, billion-level funding would be beyond the imagination of Chinese cloud computing companies. But now it is becoming more and more tangible as the market matures.

QingCloud, a leading player in the field, is announcing the largest-ever funding in the industry so far. The cloud computing platform made it public that it has secured Series D funding worth RMB 1.08 billion (around US$158 million). The current round adds to a US$2 million Series A in 2012, a US$20 million Series B in 2013 and US$100 million in 2016. The company confirmed with TechNode that it has IPO plans, but declined to offer more details. The firm is reportedly removing its VIE structure to prepare for a local listing.

The massive round is from a consortium of investors, including China Merchants Securities International and China Merchants Zhiyuan Capital Investment (two wholly owned subsidiaries of China's top securities trading and brokerage firm, China Merchants Securities), Riverhead Capital Investment Management, CICC Jiatai Fund and China Oceanwide Holdings Group. Existing investors Lightspeed China Partners and Bluerun Ventures also participated.

QingCloud founding team (L-R): Spencer Lin, Richard Huang, Reno Gan (Image credit: QingCloud)

QingCloud's funding isn't an isolated case. It marks the latest in a series of venture investments in this sector, which has bumped several companies in the vertical to unicorn status recently.

Two companies in the arena received similar-sized backings in June alone. Cloud and big data solution provider Dt Dream received an RMB 750 million Series A round led by Alibaba and Everbright Industry Capital Management. Another Alibaba-backed cloud computing startup, Cloudcare, received a nearly RMB 1 billion Series C round led by FOSUN Group and Sequoia Capital China. Tencent-backed UCloud completed an RMB 960 million Series D round earlier this year.

Among the companies that have landed billion-level RMB funding, Dt Dream is the only one that announced unicorn status with over US$ 1 billion valuation. This may shed light on the valuations of the other companies, which have received similar size or higher funding.

Behind the investment frenzy is the huge potential of this market. A report from research institute CCID shows that China's cloud computing market surged 41.7% year over year to RMB 279.7 billion in 2016, forecasting that this figure would reach RMB 570.64 billion by 2019 with an annual growth rate of over 20%.

The emergence of several unicorns over a relatively short period of time signifies a deeper change in the market. In line with the "second-half era" proposition put forward by Meituan-Dianping CEO Wang Xing, the cloud computing startup pointed out that China's cloud computing market is also entering a special transition point for a new period. While cloud computing platforms were used only for non-core businesses by financial clients like banks, insurance and securities companies in the first-half era, they will find wider application in the new era.

Co-founded by IBM alumni Richard Huang, Reno Gan and Spencer Lin, the company launched the QingCloud platform in July 2013. It now operates 24 data centers, of which 10 are run independently and 14 through partnerships, providing services to over 70,000 enterprise customers.

Emma Lee is a Shanghai-based tech writer covering startups and tech happenings in China and Asia in general. Reach her at lixin@technode.com.

Cloud Computing Companies Move Into Medical Diagnosis (GOOG, IBM) – Investopedia

Your next medical diagnosis could come from a cloud-based machine learning system. According to a Bloomberg report, Alphabet Inc. (GOOG) subsidiary Google is gearing up to provide "Diagnostics-as-a-Service" capabilities through its cloud division. The service will analyze reams of patient and disease data to diagnose patients and, possibly, recommend appropriate drugs for treatment. German cancer specialist Alacris Theranostics GmbH is already working with Google's cloud division to carry out virtual clinical trials and virtual patient modeling. It uses these models to design drug therapies for patients. (See also: Google Creates New Cloud Group to Take On Amazon and Microsoft.)

Google is not the only cloud company targeting the healthcare industry. International Business Machines Corporation's (IBM) Watson, which uses a mix of artificial intelligence and cloud computing on the back end, analyzed medical data and images pertaining to 1,000 cancer patients last year and returned diagnoses that concurred with a human doctor's assessment in 99 percent of all cases. Amazon.com, Inc. (AMZN), which is a leader in cloud computing, lists genomic sequencing as one of the most prominent use cases of its service on its site. Last year, the National Cancer Institute announced a collaboration with Microsoft Corporation (MSFT) and Amazon to analyze cancer genomes and enable secure collaboration between researchers using the companies' cloud services. (See also: Top Medical & Healthcare Software Companies.)

Healthcare spending on cloud services reached $3.73 billion in 2015 and is expected to increase to $9.5 billion by 2020. Primary use cases for this spending were data storage, email and software systems that increase efficiency. For example, telemedicine is rapidly gaining ground as a means to cut down on redundant costs associated with doctor visits for minor ailments. Medical diagnosis using cloud computing is a relatively new use case.

And it might be a while before the diagnostic use case becomes a reality. This is because such diagnoses require healthcare providers to release critical data to cloud computing companies. A mix of regulatory and competitive-advantage considerations may prevent them from doing so. The Bloomberg article quotes an analyst who says that medical data are likely to remain "locked up" with healthcare providers for the "foreseeable future." (See also: Investing in the Healthcare Sector.)

Why isn’t Cloud Computing in the 2017 Belmont Stakes? – FanSided

May 20, 2017; Baltimore, MD, USA; Javier Castellano aboard Cloud Computing (2) races Julien R. Leparoux aboard Classic Empire (5) during the 142nd running of the Preakness Stakes at Pimlico Race Course. Mandatory Credit: Patrick McDermott-USA TODAY Sports

Cloud Computing was not on the radar of many people coming into the Preakness Stakes. He opened at 30-1. And though his odds drastically improved to 13-1 leading up to post time, he was still considered a longshot. Yet he came out of nowhere and was able to overtake Classic Empire in the final stretch to win at Pimlico Race Course.

However, the win for Cloud Computing eliminated the chance of a Triple Crown winner in 2017. Thus, the Belmont Stakes didn't hold the same meaning for him or for Derby winner Always Dreaming. As such, neither Always Dreaming nor Cloud Computing will be running on Saturday in the 2017 Belmont Stakes.

But not running solely because there's no shot at the Triple Crown seems a bit petty. Is that the whole reason why Cloud Computing isn't running in the 2017 Belmont Stakes? The short answer is, of course, no.

Though many people only pay attention to horse racing during the Triple Crown races, it's actually quite a long season. There are numerous races with big purses for the winner throughout the summer. Thus, a lot of trainers and owners are interested in seeing their horses do well in those races to complete the season.

What's more, the Belmont Stakes is a notoriously grueling race. We've seen former Triple Crown hopefuls win the Derby and Preakness only to come up short at the Belmont because of the length of the race.

Thus, with the prestige of the Triple Crown not on the line, it makes sense that Cloud Computing's team would rather focus on the summer than on such a long race. It's unfortunate and takes away some of the drama, but it makes sense in the long run.

Virtualization admin? Pivot — pivot now — to a cloud computing career – TechTarget

For those virtualization admins hiding under a virtual rock regarding cloud, I have news for you. Your job isn't safe. No one can put the cloud genie back in the bottle. Cloud computing is here to stay, and virtualization admins need to shift focus to keep up with tomorrow's jobs.

The move to cloud is already happening at all levels, from the smallest through to the largest businesses. Cloud and microservices mark a new iteration of change that is as disruptive as the original arrival of virtualization with VMware -- if not more so.

Virtualization has two phases: consolidation and abstraction.

In the beginning, virtualization's goal was more efficient use of underutilized hardware. Rarely do servers consume all the resources allocated to them. Virtualization admins could reclaim these lost resources and vastly reduce wasted capacity.

In phase two, virtualization developed advanced functions such as live storage motion or migration, high availability and fault tolerance. These virtualization capabilities address the issues that arise when several machines share one physical piece of hardware. Automation arrived and made server deployment simple and straightforward.

I argue that this virtualization adoption curve peaked a few years ago -- we are now moving to the next iteration, and you'll need to follow a cloud computing career path to come along.

Even once-conservative technology adopters, such as financial institutions, are jumping on board with the third wave of virtualization.

There is a thirst to cut costs, and automation allows massive cost cuts. There will be job losses. No virtualization admin should think it will never happen to them. You are fooling yourself. Fewer staff means fewer medical plans and pensions to support. It is not hard to see why the cloud appeals to the bottom line.

There will not be enough cloud computing careers to go around based on old virtualization working practices, such as in a phase one scenario.

Consider virtual machine orchestration. In early-phase virtualization environments, VMs still required some level of administration action, such as deployment from a template, to accompany automated steps. Tools such as VMware vRealize Automation or Platform9's managed vSphere stack enable an approved user to request a VM, customized to their specifications, and have it deployed within 10 minutes with no administrator interactions. Larger companies used to have several virtualization admins whose jobs purely entailed VM creation and deployment. Within a year or two, that job role disappeared.

Virtual machines are now moving to cattle status, meaning they're disposable commodities. To scale applications, organizations adopt automation tools that deploy new VMs. It's quicker to deploy another instance of a machine than to troubleshoot a broken one.
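To make the cattle mindset concrete, here is a minimal sketch, assuming an AWS environment and the boto3 SDK; the AMI ID, instance type and tags are hypothetical placeholders. Instead of troubleshooting a broken VM, automation destroys it and stamps out an identical replacement from a known-good image:

```python
# Hypothetical "cattle, not pets" sketch using boto3 (the AWS SDK for Python).
# The AMI ID, instance type and tags are placeholder assumptions.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

def replace_instance(instance_id: str, ami_id: str = "ami-0123456789abcdef0") -> str:
    """Terminate a misbehaving VM and launch a fresh copy from a known-good image."""
    # Don't troubleshoot the broken instance; throw it away.
    ec2.terminate_instances(InstanceIds=[instance_id])

    # Stamp out an identical replacement from the template image.
    response = ec2.run_instances(
        ImageId=ami_id,
        InstanceType="t3.micro",
        MinCount=1,
        MaxCount=1,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "role", "Value": "web"}],
        }],
    )
    return response["Instances"][0]["InstanceId"]
```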

DevOps does away with manual work; manual deployment is the exact opposite of how DevOps is supposed to work. A key tenet of DevOps is that tasks performed more than once in the same way should be scripted so the IT platform does the action itself.


Platform as a service reduces the workload. Workloads that used to be custom-built and based on infrastructure as a service are now provided as a service for consumption by developers and companies. Examples include the larger cloud vendors offering secure and highly available database hosting that organizations consume without any effort to build and manage the underlying database infrastructure. Little to no database admin input required. No server admin required either.
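To illustrate how little operational surface remains, consuming a managed database typically collapses to connection details and queries. A minimal sketch, assuming a hypothetical managed PostgreSQL endpoint and the psycopg2 driver; the hostname and credentials are placeholders:

```python
# Consuming a hypothetical managed PostgreSQL service: connect and query.
# The provider runs the servers, replication, backups and patching.
import psycopg2

conn = psycopg2.connect(
    host="orders-db.managed.example.com",  # placeholder endpoint from the provider
    port=5432,
    dbname="orders",
    user="app_user",
    password="fetch-from-a-secrets-manager",  # never hardcode in real code
)
with conn.cursor() as cur:
    cur.execute("SELECT count(*) FROM orders;")
    print("order count:", cur.fetchone()[0])
conn.close()
```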

The complexity hasn't gone away -- it has just changed. Management complexity moved from the VMs to orchestration and scaling. Virtualization elements such as high availability and disaster recovery (DR) lost importance, while the IT industry turned its attention to microservices that are scalable, redundant and can be spun up and down at will. Automation means little to no hands-on intervention. For example, you can spin up a cloud infrastructure from a single PowerShell script.
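The article mentions a single PowerShell script; as an equally minimal illustration in Python, here is a sketch of that spin-up/spin-down pattern using the Docker SDK. It assumes a local Docker daemon, and the nginx image stands in for your own microservice image:

```python
# Sketch of disposable microservice instances via the Docker SDK for Python.
# Assumes a reachable Docker daemon; "nginx:alpine" stands in for your image.
import docker

client = docker.from_env()

# Scale out: start three identical, disposable service instances.
replicas = [
    client.containers.run("nginx:alpine", detach=True, name=f"web-{i}")
    for i in range(3)
]
print(f"running {len(replicas)} replicas")

# Scale in: throw them away just as cheaply as they were created.
for container in replicas:
    container.stop()
    container.remove()
```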

Classic DR locations are now costly relics. Cloud affects virtualization in secondary ways. For example, businesses are used to having one primary data center and a DR setup in another data center. Given a relatively modern application set, the company's infrastructure can restart in its entirety in the cloud in the event of a disaster. Modern DR management products, such as Zerto and Acronis, eliminate the costly secondary data center, allowing businesses to prepopulate and configure DR setups in the cloud.

This is the reality for virtualization admins, and the only future is in a cloud computing career. Over time, more applications are built cloud-first to save money from the start; the old, immovable on-site applications go the way of pagers and typewriters.

The reality is that most virtualization admin roles as we know them will vastly shrink or become outmoded over the next decade. A virtual data center requires far fewer staff, and with automation and scripting, a single administrator can manage massive numbers of servers.

There is still time to retool and get on a cloud computing career path. Virtualization admins are luckier than most. While the technology itself may change, these administrators have skills that easily translate to the popular cloud and DevOps arena.

This doesn't mean becoming a code guru or programmer, but a virtualization admin will need a deep understanding of architectures and tools such as Docker for containerization, Chef for configuration management and Kubernetes for container orchestration to become a DevOps admin. Learn multiple scripting languages and investigate hyper-converged infrastructure for cloud hosting.
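A gentle first exercise along that path: use the official Kubernetes Python client to inventory a cluster. This sketch assumes a reachable cluster and a standard kubeconfig (a minikube home lab is enough):

```python
# List every pod in the cluster -- a virtualization admin's "show all VMs"
# equivalent. Assumes `pip install kubernetes` and a working ~/.kube/config.
from kubernetes import client, config

config.load_kube_config()  # read credentials from the local kubeconfig
v1 = client.CoreV1Api()

for pod in v1.list_pod_for_all_namespaces(watch=False).items:
    print(f"{pod.metadata.namespace}\t{pod.metadata.name}\t{pod.status.phase}")
```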

The warning signs are there, fellow admins. It is just a case of doing something about it while you still can.


Excerpt from:

Virtualization admin? Pivot -- pivot now -- to a cloud computing career - TechTarget

Real Estate Weekly: Digital Realty Becomes A Cloud Computing Giant – Seeking Alpha

Weekly Review

The REIT ETF indexes (VNQ and IYR) finished the week lower by 0.3% as the 10-year yield climbed 7bps following the UK elections. The S&P 500 (NYSEARCA:SPY) gained 0.3%. The homebuilder ETFs (XHB and ITB) were lower by 1.0% on the week. The commercial construction ETF (NYSEARCA:PKB) gained 0.2%.

(Hoya Capital Real Estate, Performance as of 12pm Friday)

Across other areas of the real estate sector, mortgage REITs (NYSEARCA:REM) finished the week higher by 0.5% and the international real estate ETF (NASDAQ:VNQI) declined 0.6%. The 10-Year Treasury yield (NYSEARCA:IEF) gained 7 bps on the week, recovering from YTD low yields earlier this week.

REITs are now higher by 0.8% YTD on a price basis and by roughly 3% on a total-return basis. The sector divergences are quite significant: the Data Center sector has surged 24% while the retail-focused REITs have fallen double digits. REITs ended 2016 with a total return of roughly 9%, below the sector's 20-year average annual return of 12%.

REITWeek Recap

This week was NAREIT's annual REITWeek conference in New York City, the biggest industry conference of the year. We listened to about 25 presentations across all the major REIT sectors.

We came away with a slightly more positive outlook on the REIT sector as a whole. Retail REITs were unquestionably the major focus for many investors. The bifurcation between high-quality and low-quality retail space has intensified. High-quality retail space in desirable locations continues to perform very well and, in many cases, the apparel downsizing has actually been a net positive, as the vacated space has been put to more productive and higher-traffic uses. We detailed our judgments in "Short Squeeze May Send Mall REITs Surging."

We also published, "Obamacare Uncertainty Remains A Drag On Healthcare REITs," our update on the Healthcare REIT sector. We discussed that healthcare REITs have outperformed over the past quarter, but this outperformance is entirely attributable to plunging interest rates. Healthcare REITs are up 8% as the 10-year yield fell 45bps. Hospitals and skilled nursing REITs, the sub-sectors most exposed to changes in healthcare policy, continue to trade at substantial discounts as Obamacare crumbles and its replacement appears politically infeasible. While much of the media focus is on drug prices, labor costs are the true drivers of healthcare inflation. This is a structural allocation-of-resources issue within the American education system.

Finally, we also published our Net Lease update, "Retail Contagion Continues To Trouble Net Lease REITs," where we discussed that, despite the significant decline in interest rates over the past quarter, net lease REITs have badly underperformed the broader REIT indexes, a worrying development for the sector. Net lease REITs are the most yield-sensitive REIT sector, but these REITs have not acted as bond proxies so far this year. Investors have been rudely reminded of the significant retail exposures of these names. Credit issues with key tenants at Spirit Capital have dragged down the entire Net Lease sector. More than other REIT sectors, net lease REITs depend on their cost-of-capital advantage for acquisition-fueled growth. Spirit's credit issues may have meaningfully impaired the sector's competitive advantage.

Arguably the most significant piece of REIT news this week actually came after the conference, as Digital Realty (NYSE:DLR) announced a merger with DuPont Fabros (NYSE:DFT) to form a data center giant that appears better fortified to go head-to-head with the public cloud providers, Google (NASDAQ:GOOG) and Amazon (NASDAQ:AMZN). While demand has continued to be robust, outstripping supply, pricing power has been a concern among investors as companies increasingly use public cloud solutions rather than their own server racks in the data center. In many cases, the public and private clouds are both located in these REIT data centers, but the rent per megawatt is lower when, for example, Amazon is the tenant rather than an individual mid-sized company. We think consolidation is the right move. We will write a full report on it early next week.

The six best performing REITs on the week were DuPont Fabros, LaSalle Hotels (NYSE:LHO), Diamondrock (NYSE:DRH), Pebblebrook (NYSE:PEB), Sunstone (NYSE:SHO), and CoreSite (NYSE:COR).

The six worst performers on the week were Care Capital (NYSE:CCP), National Retail Properties (NYSE:NNN), Store Capital (NYSE:STOR), Realty Income (NYSE:O), Digital Realty, and CubeSmart (NYSE:CUBE).

Economic Data

Every week, we like to dive deeper into the economic data that directly impacts real estate.

(Hoya Capital Real Estate, HousingWire)

Home Prices Continue To Rise As Mortgage Rates Continue To Fall

CoreLogic's Home Price Index showed a 6.9% YoY rise in home prices in April, a slight deceleration from the 7.1% YoY rise in March. "Mortgage rates in April dipped back to their lowest level since November of last year, spurring home-buying activity," said Dr. Frank Nothaft, chief economist for CoreLogic. "In some metro areas, there has been a bidding frenzy as multiple contracts are placed on a single home. This has led home-price growth to outpace rent gains. Nationally, home prices were up 6.9 percent over the last year, while rent growth for single-family rental homes recorded a 3 percent rise through April, according to the CoreLogic Single-Family Rental Index."

Zillow's April Case-Shiller forecast sees a 5.6% rise in home prices for April. Home price appreciation has reaccelerated in recent months after showing signs of slowing in early 2017 as mortgage rates shot up nearly 100bps from the summer 2016 lows. All else equal, lower mortgage rates lead to higher home prices.

Bottom Line

REITs fell 0.3% on the week as the 10-year yield climbed 7 bps. Hotel and data center REITs were the best performers. This week was the annual REITWeek conference in NYC. We came away with a more positive outlook on the REIT sector as a whole, especially on higher quality retail space.

Apartments and hotels have been upside surprises this year and have defied the headwinds from higher supply. Demand has been robust in both sectors and has largely offset higher supply. Digital Realty will merge with DuPont Fabros to form the largest data center REIT. Consolidation will allow these REITs to command better pricing power with the public cloud providers.



Here is the original post:

Real Estate Weekly: Digital Realty Becomes A Cloud Computing Giant - Seeking Alpha

Edge Computing Is New Cloud Computing Tech Investors Should Track – GuruFocus.com

The cloud computing industry is still in its early stages of adoption. In 2016, the Infrastructure as a Service (IaaS) segment recorded just $22 billion in annual revenues. Considering the hundreds of billions of dollars the IT industry spends every year, it is very clear that IaaS still has a long way to go.

The Software as a Service (SaaS) segment is a bit older, and the model has now become the preferred method for software delivery. Microsoft has done an excellent job of ditching its old annual licensing model for SaaS, and the success of Office 365, its lead SaaS product, is ample validation of that. Oracle is targeting $10 billion in annual revenues from SaaS over the next few years.

With these and thousands of other companies in the fray, the SaaS segment is expected to continue its double-digit growth over the next several years.

"The cloud software market reached $48.8 billion in revenue in 2014, representing a 24.4% year-over-year growth rate. IDC expects cloud software will grow to surpass $112.8 billion by 2019 at a compound annual growth rate (CAGR) of 18.3%. SaaS delivery will significantly outpace traditional software product delivery, growing nearly five times faster than the traditional software market and becoming a significant growth driver to all functional software markets. By 2019, the cloud software model will account for $1 of every $4.59 spent on software." (IDC)
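Those figures hang together: compounding the 2014 base at the stated CAGR for five years lands almost exactly on the 2019 projection, as a quick back-of-the-envelope check shows:

```python
# Sanity check of the IDC forecast: $48.8B (2014) at an 18.3% CAGR to 2019.
base_2014 = 48.8              # cloud software revenue in 2014, $B
cagr = 0.183                  # compound annual growth rate
years = 2019 - 2014

projection = base_2014 * (1 + cagr) ** years
print(f"${projection:.1f}B")  # ~$113.1B, in line with the $112.8B forecast
```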

As businesses around the world slowly warmed up to the idea of third-party-managed infrastructure services (IaaS) and software products delivered over the cloud (SaaS), the segment piqued the interest of all the major tech players. Early entrants Amazon (NASDAQ:AMZN) and Microsoft (NASDAQ:MSFT) have already crossed $13 billion in trailing 12-month revenues, while IBM has, so far, kept pace. Google is still working toward earning its bona fides in the cloud game by building data centers and adding features and services, while Oracle is slowly building out its IaaS portfolio as well.

But Amazon and Microsoft, the lead players in the cloud story, have now made it clear that they are already on their way to embracing the next level of cloud. Microsoft CEO Satya Nadella made a huge announcement during the recent Microsoft Build 2017 developer conference that the company's cloud strategy is moving toward edge computing:

"It has been barely four years since Microsoft CEO Satya Nadella announced the company's 'Mobile First, Cloud First' strategy. Instead of basking in the glory of newfound success in cloud, Microsoft CEO Satya Nadella has now announced that the time has come to move on from a 'Mobile First, Cloud First' strategy toward a more cloud-focused 'Intelligent Cloud, Intelligent Edge' strategy." (1reddrop)

Amazon made its edge computing/IoT-focused software, Amazon Greengrass, publicly available Thursday, making it loud and clear that they, too, are in the race to move computing closer to the edge.

But most investors in the stock market have barely begun to discover cloud computing, so here's a little primer on the new wave of cloud computing.

What is edge computing?

In cloud computing, the processing power is always centralized. Data has to travel from a device to servers, where it gets processed; the output is then pushed back to the device. Edge computing moves these heavy processing tasks, or as many of them as possible, closer to the point of origin, hence the word "edge."

This reduces the distance data needs to travel, cutting latency and reliance on internet connections. The result is faster, more reliable decision making at the edge.
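To make that concrete, here is a toy sketch (our illustration, not any vendor's SDK) of what moving processing to the edge often looks like: raw sensor samples are reduced locally, and only a compact summary crosses the network. The function names and the transport are placeholders:

```python
# Toy edge-computing sketch: summarize locally, upload only the result.
def summarize_window(readings):
    """Reduce a window of raw samples to the few numbers the cloud needs."""
    return {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "max": max(readings),
    }

def upload_to_cloud(summary):
    """Placeholder for a real HTTPS/MQTT upload call."""
    print("uploading:", summary)

# One small summary crosses the network instead of 1,000 raw samples.
window = [20.1 + 0.001 * i for i in range(1000)]  # simulated sensor readings
upload_to_cloud(summarize_window(window))
```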

Among its application areas is artificial intelligence, or AI: edge computing completely transforms the way AI can be applied to various scenarios.

That's probably an oversimplified description of edge computing, but it's enough at this point to understand that the "edge" part of the equation takes the "computing" part of cloud computing away from massive data centers and brings it closer to the connected devices themselves.

There are several obvious advantages to adopting an edge computing model over a traditional cloud computing one, but the segment itself is in very early stages of its development. Edge computing also needs a robust IoT and AI device ecosystem to make its impact felt in full force. By moving in early on this new paradigm in cloud computing, Amazon and Microsoft, the top two cloud companies, have once again moved the cloud goal posts and significantly raised the bar.

Their competitors now have to invest sufficient resources, time and money if they want to keep Amazon and Microsoft in check. And given that Google and Oracle are only now finding their footing in the cloud computing segment itself, expanding their cloud offerings while also working on edge computing and IoT technologies will be an extremely difficult task to execute.

By putting a clear moat around their businesses, Amazon and Microsoft are further differentiating themselves from the now-crowded cloud computing space. Microsoft is even going so far as to redefine its very vision for the future of cloud computing and the direction that the company's cloud push is going to take.

Why do investors need to know this? Because these are the moves that will take Microsoft and Amazon from their annual cloud revenues of $13 billion to twice, thrice that and beyond. It's not something of which a serious tech investor can afford to be unaware.


View original post here:

Edge Computing Is New Cloud Computing Tech Investors Should Track - GuruFocus.com

The benefits of cloud computing, Rust 1.18, and intelligent tracking prevention in WebKit: SD Times news digest … – SDTimes.com

The cloud is no longer an afterthought; it is a competitive advantage. According to a new Insight-sponsored report by Harvard Business Review Analytic Services, businesses are turning to the cloud for agility, data capabilities, customer and user experiences as well as cost savings.

"A company's IT environment should work for them by enabling them to both run and innovate. Large and small to mid-sized companies need to focus on managing and modernizing their IT infrastructure, so that it becomes a transformative part of their business that can directly improve results," said David Lewerke, director, hybrid cloud consulting practice at Insight. "While we knew there were a number of benefits, we wanted to better understand from respondents exactly how cloud systems were impacting their business outcomes."

The report found 42% of businesses use a hybrid cloud approach, 40% host their systems in a private cloud, and 13% host in a public cloud. Other reported benefits of cloud adoption included faster time to market, the ability to manage security, and the ability to mitigate risk.

Rust 1.18 released
The latest version of the systems programming language Rust has been released with new improvements, cleanups and features. The biggest change in Rust 1.18 is an update to the latest edition of "The Rust Programming Language" book, which is being written in the open on GitHub. Version 1.18 features the first draft of the second edition, with 19 of its 20 draft chapters available.

Other features include an expansion of the pub keyword, library stabilizations, and cargo features.

More information is available here.

WebKit's intelligent tracking prevention feature
WebKit, an open source web browser engine, is introducing a new feature to limit cross-site tracking. The Intelligent Tracking Prevention feature limits cookies and other website data so that users can once again trust that privacy-sensitive data about their web activity stays under their control.

"The success of the web as a platform relies on user trust. Many users feel that trust is broken when they are being tracked and privacy-sensitive data about their web activity is acquired for purposes that they never agreed to," John Wilander, security engineer for WebKit, wrote in a post.
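As a rough mental model of the behavior the WebKit post describes (our simplified sketch and assumption, not WebKit's actual implementation or thresholds), cookies from domains classified as having cross-site tracking ability are treated according to how recently the user interacted with the site:

```python
# Simplified, assumed model of Intelligent Tracking Prevention's cookie
# handling -- an illustration only, not WebKit's real code or thresholds.
from datetime import datetime, timedelta

def cookie_treatment(has_tracking_ability, last_interaction, now=None):
    """Decide how a domain's cookies are handled in third-party contexts."""
    now = now or datetime.utcnow()
    if not has_tracking_ability:
        return "unaffected"                       # ordinary sites keep cookies
    if last_interaction is None or now - last_interaction > timedelta(days=30):
        return "purge cookies and website data"   # no recent engagement
    if now - last_interaction <= timedelta(hours=24):
        return "cookies usable as third party"    # recent sign-in style use
    return "cookies limited to first-party use"
```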

Read more from the original source:

The benefits of cloud computing, Rust 1.18, and intelligent tracking prevention in WebKit: SD Times news digest ... - SDTimes.com

Growing Patent Claim Risks in Cloud Computing – Lexology (registration)

This blog develops the themes of our February piece on cloud availability risks from software patent claims. It shows how the patent cloudscape is changing; how patent assertion entities (PAEs) are increasingly active in Europe as well as in the USA; and how cloud service providers (CSPs) are starting to respond in their contract terms.

With increasingly recognised benefits of security, flexibility and reliability, cloud computing continues to carry all before it. Its aggregation of massive processing power also heralds deep, connected and transformative innovation in our daily lives. Intellectual property (IP) is at the centre of this wave of innovation, and an increasingly fierce battleground, as the current high-profile dispute between Alphabet's Waymo and Uber over core autonomous vehicle technology shows.[1]

You might think that the cloud, built and running on shared environments and public standards, would be a safe space from intrusive IP disputes. But the evidence is mounting that the cloud is proving attractive for PAEs (businesses that litigate their patents but generally don't otherwise use their patented technology). And whilst cloud users are increasingly aware of the importance of security and privacy, cloud IP risks are now equally important but still somewhat overlooked: many enterprises don't yet have complete clarity on their IP litigation strategy or IP innovation strategy, especially in a global context.

There are persuasive reasons for cloud customers to focus more on patent risks. PwC, in its most recent (May 2017) Patent Litigation Study,[2] notes that damages awards for PAEs are almost four times greater than for other patent claimants, and that damages awards at trial in patent disputes continue to rise.

Europe is quickly becoming a key jurisdiction for patent enforcement: the European Patent Office granted 96,000 patents in 2016,[3] up 40% from 2015, and the Unitary Patent along with EU-wide injunctions will soon be a reality.[4]

The cloud computing patent landscape is also developing rapidly. Cloud patent families are well known in areas such as file storage and protocols, but other areas like Fintech[5] are also growing quickly.

PAEs are acquiring cloud computing patents at a rapid pace according to IPlytics, an IP intelligence provider,[6] who note that:

PAEs often acquire patents in technological areas that will likely become strategically important for future markets.

This is borne out in a European Commission report on PAEs in Europe[7] which (on page 26) cites findings that:[8]

PAEs are overwhelmingly involved in the litigation of German and UK patents related to computer and telecommunications technology [and that] these findings are consistent with existing evidence on the activity of US PAEs, which also tend to enforce high-tech patents at a disproportionately high frequency, especially software patents.

Part of the attraction for PAEs is that patent infringement is increasingly easy to detect in the cloud: detailed documentation, APIs and open source code (the software that powers much of the cloud) are readily available, and can be read and analysed by anyone, making the cloud a soft target.

As the economic importance of the cloud rises, cloud customers make increasingly interesting targets for PAEs: customers generally don't have the same level of expertise in cloud tech as CSPs, have a greater incentive to settle, are less prepared to fight an IP battle, and have little incentive to solve an IP issue for others. Contrast this with the position of the CSP, which will want to avoid an IP threat becoming an issue across its customer base.

A measure of this growing cloud patent claim risk is the evolving approach of the largest global CSPs to this issue in their cloud service agreements.

Microsoft has taken an early lead through its recently announced Azure IP Advantage[9] programme, with uncapped indemnification for its Azure cloud services, including open source incorporated in its services, and 10,000 patents (7,500 currently, 2,500 to come) that Microsoft is sharing with customers that consume Azure services.

Google, in its Cloud Platform Terms of Service,[10] seeks (at section 14.2) to exclude open source software entirely from its IP infringement indemnification, a big carve-out given the importance of open source in the cloud environment.

In the Amazon Web Services (AWS) Customer Agreement,[11] the effect of section 10 is that AWS does not offer in its standard terms any IP protection at all for its services. Section 8.5 is an unusual IP non-assert term that requires the customer not to assert, itself or through others, any IP claim regarding the AWS services it has used. The clause continues without limit in time after the agreement has ended; to the extent it could be said to amount to a patent no-challenge clause, it could be problematic in Europe under EU competition law, for example.

The fact that all the largest CSPs are starting to address cloud patent risk expressly in their contract terms is perhaps the most compelling evidence that this PAE-fuelled risk is becoming increasingly relevant and material. Cloud customers, and their regulators in regulated sectors, should take note as well.

Continued here:

Growing Patent Claim Risks in Cloud Computing - Lexology (registration)