
Category Archives: Cloud Computing

Keying Longshot Cloud Computing in the Preakness – America’s Best Racing

Posted: May 18, 2017 at 3:05 pm

On Saturday, racing fans across the country will turn their attention to the $1.5 million, Grade 1 Preakness Stakes at Pimlico Race Course in Maryland, the second leg of the Triple Crown. The race will also draw the attention of handicappers and bettors hoping to make a nice score on one of racing's biggest days.

Naturally, all eyes will be focused on #4 Always Dreaming, winner of the Kentucky Derby two weeks ago. Trained by Todd Pletcher, the son of Bodemeister is 4-for-4 this year and hasn't been seriously challenged during that timeframe; furthermore, his Derby win was achieved in eye-catching fashion, as he tracked a solid early pace before taking command to win easily.

Being a speed horse, Always Dreaming is perfectly suited to the Preakness Stakes, which has a tendency to favor horses racing on or near the lead. If he repeats his Derby performance, Always Dreaming will be very tough to beat, though there are a few reasons to consider playing against him. For one, Always Dreaming received a pretty clean trip in the Derby, avoiding trouble at the start and staying clear of traffic while racing near the rail, which may have been the best part of the track. He'll also be a very heavy favorite in the wagering (perhaps 3-5 or 4-5), which means that playing him to win won't be very appealing.

Always Dreaming could also face a serious challenge from #5 Classic Empire, who finished fourth in the Kentucky Derby after a troubled start left him farther off the pace than usual. With a clean run, Classic Empire might have finished much better in the Derby (he actually ran about 9 lengths farther than Always Dreaming), and as the reigning champion 2-year-old, his talent is undeniable. Prior to the Kentucky Derby, he overcame a tough trip to win the Grade 1 Arkansas Derby with a solid late rally, which marked the third Grade 1 victory of his career. He's very versatile in terms of running style, and might be just reaching his peak after missing a race and some training during the winter. Expect to see him much closer to the lead in the Preakness, which should give him every chance to run down Always Dreaming in the homestretch.

One longshot that I would strongly consider is #2 Cloud Computing, a lightly raced colt trained by Chad Brown. Cloud Computing was late getting to the races and didn't debut until Feb. 11, when he won a maiden race sprinting three-quarters of a mile at Aqueduct while defeating the next-out winner Mineralogy. Off of that solid effort, Cloud Computing made his stakes debut in the March 4 Gotham Stakes, where he finished a strong second despite his lack of experience.

Cloud Computing auditioned for a potential Kentucky Derby run when he contested the Grade 1 Wood Memorial on April 8, but a slow start left him off the pace while racing over a track that favored front-runners. Under the circumstances, he had little chance to catch the leaders, but he did well to finish a clear third.

By skipping the Kentucky Derby to await the Preakness, Cloud Computing has had plenty of time to prepare for what will be his toughest race to date. And while it's hard to say if he really wants to run this far, his pedigree suggests that the Preakness distance is within his capabilities.

Cloud Computing may also benefit from meeting a field that doesn't appear to have much speed on paper. In fact, according to Brisnet pace figures (which attempt to quantify early speed), Cloud Computing is the most consistently fast horse in the Preakness field. I think he has a very big chance to finish in the trifecta, possibly even splitting Always Dreaming and Classic Empire for a spot in the exacta.

Since Cloud Computing's morning-line odds are solid (12-1), let's key him in our wagers to try to boost the potential payoffs, while also considering the speedy Arkansas Derby runner-up #10 Conquest Mo Money on one ticket.

Wagering Strategy on a $20 Budget

$4 exacta: 4,5 with 4,5 ($8)

$3 exacta: 4,5 with 2 ($6)

$2 trifecta: 4,5 with 4,5 with 2 ($4)

$1 exacta: 2 with 4,5 ($2)

Wagering Strategy on a $30 Budget

$5 exacta: 4,5 with 4,5 ($10)

$4 exacta: 4,5 with 2 ($8)

$2 exacta: 2 with 4,5 ($4)

$2 trifecta: 4,5 with 4,5 with 2,10 ($8)
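The ticket prices above are just combination counts: a part-wheel costs the base bet times the number of ways to fill the finishing positions with distinct horses. A quick sketch in Python (a hypothetical helper, not an official tote calculator) reproduces each price on the $30 budget:

```python
from itertools import product

def ticket_cost(base, *legs):
    """Cost of a part-wheel ticket: the base bet times the number of
    combinations in which no horse fills two positions."""
    combos = [c for c in product(*legs) if len(set(c)) == len(c)]
    return base * len(combos)

# The $30-budget tickets above:
print(ticket_cost(5, [4, 5], [4, 5]))           # $5 exacta: 4,5 with 4,5 -> 10
print(ticket_cost(4, [4, 5], [2]))              # $4 exacta: 4,5 with 2   -> 8
print(ticket_cost(2, [2], [4, 5]))              # $2 exacta: 2 with 4,5   -> 4
print(ticket_cost(2, [4, 5], [4, 5], [2, 10]))  # $2 trifecta             -> 8
```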

Good luck, and enjoy the race!


Cloud computing – Simple English Wikipedia, the free encyclopedia

Posted: May 17, 2017 at 2:27 am

In computer science, cloud computing describes a type of outsourcing of computer services, similar to the way in which electricity supply is outsourced. Users can simply use it; they do not need to worry about where the electricity comes from, how it is made, or how it is transported. Every month, they pay only for what they consumed.
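The electricity analogy can be made concrete with a toy metered-billing sketch; the rates and resource names below are invented for illustration, not any provider's actual pricing:

```python
# Metered billing: the consumer pays per unit used, whether the
# "unit" is a kilowatt-hour of electricity or a CPU-hour of compute.
# Both rates here are hypothetical.
RATE_PER_KWH = 0.12
RATE_PER_CPU_HOUR = 0.05

def monthly_bill(units_consumed, rate_per_unit):
    """Charge only for measured consumption, as a utility would."""
    return round(units_consumed * rate_per_unit, 2)

print(monthly_bill(300, RATE_PER_KWH))       # 300 kWh of electricity -> 36.0
print(monthly_bill(720, RATE_PER_CPU_HOUR))  # one CPU for a month    -> 36.0
```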

The idea behind cloud computing is similar: the user can simply use storage, computing power, or specially crafted development environments without having to worry about how these work internally. Cloud computing is usually Internet-based computing. The cloud is a metaphor for the Internet, based on how the Internet is depicted in computer network diagrams; in other words, it is an abstraction hiding the complex infrastructure of the Internet.[1] It is a style of computing in which IT-related capabilities are provided "as a service",[2] allowing users to access technology-enabled services from the Internet ("in the cloud")[3] without knowledge of, or control over, the technologies behind these servers.[4]

According to a paper published by IEEE Internet Computing in 2008 "Cloud Computing is a paradigm in which information is permanently stored in servers on the Internet and cached temporarily on clients that include computers, laptops, handhelds, sensors, etc."[5]

Cloud computing is a general concept that incorporates software as a service (SaaS), Web 2.0, and other technology trends, all of which depend on the Internet to satisfy users' needs. For example, Google Apps provides common business applications online that are accessed from a web browser, while the software and data are stored on Internet servers.

Cloud computing is often confused with other ideas:

Many cloud computing deployments are powered by grids, have autonomic characteristics, and are billed like utilities, but cloud computing can be seen as a natural next step from the grid-utility model.[8] Some successful cloud architectures have little or no centralised infrastructure or billing systems, including peer-to-peer networks like BitTorrent and Skype.[9]

The majority of cloud computing infrastructure currently consists of reliable services delivered through data centers that are built on computer and storage virtualization technologies. The services are accessible anywhere in the world, with The Cloud appearing as a single point of access for all the computing needs of consumers. Commercial offerings need to meet the quality of service requirements of customers and typically offer service level agreements.[10] Open standards and open source software are also critical to the growth of cloud computing.[11]

As customers generally do not own the infrastructure or know all of its details, they are essentially accessing or renting it, consuming resources as a service and paying for what they use rather than for capacity they do not need. Many cloud computing providers use the utility computing model, which is analogous to how traditional public utilities like electricity are consumed, while others bill on a subscription basis. By sharing consumable and "intangible" computing power between multiple "tenants", utilization rates can be improved (as servers are not left idle), which can reduce costs significantly while increasing the speed of application development.

A side effect of this approach is that "computer capacity rises dramatically" as customers do not have to engineer for peak loads.[12] Adoption has been enabled by "increased high-speed bandwidth" which makes it possible to receive the same response times from centralized infrastructure at other sites.

Cloud computing is being driven by providers including Google, Amazon.com, and Yahoo! as well as traditional vendors including IBM, Intel,[13] Microsoft[14] and SAP.[15] It can be adopted by all kinds of users, be they individuals or large enterprises. Most Internet users are currently using cloud services, even if they do not realize it. Webmail, for example, is a cloud service, as are Facebook, Wikipedia, contact list synchronization, and online data backups.

The Cloud[16] is a metaphor for the Internet,[17] or more generally components and services which are managed by others.[1]

The underlying concept dates back to 1960 when John McCarthy expressed his opinion that "computation may someday be organized as a public utility" and the term Cloud was already in commercial use in the early 1990s to refer to large ATM networks.[18] By the turn of the 21st century, cloud computing solutions had started to appear on the market,[19] though most of the focus at this time was on Software as a service.

Amazon.com played a key role in the development of cloud computing when upgrading their data centers after the dot-com bubble and providing access to their systems by way of Amazon Web Services in 2002 on a utility computing basis. They found the new cloud architecture resulted in significant internal efficiency improvements.[20]

2007 saw increased activity, including Google, IBM, and a number of universities starting large-scale cloud computing research projects,[21] around the time the term started gaining popularity in the mainstream press. It was a hot topic by mid-2008, and numerous cloud computing events had been scheduled.[22]

In August 2008 Gartner observed that "organizations are switching from company-owned hardware and software assets to per-use service-based models" and that the "projected shift to cloud computing will result in dramatic growth in IT products in some areas and in significant reductions in other areas".[23]

Clouds cross many country borders and "may be the ultimate form of globalisation".[24] As such it is the subject of complex geopolitical issues, whereby providers must satisfy many legal restrictions in order to deliver service to a global market. This dates back to the early days of the Internet, where libertarian thinkers felt that "cyberspace was a distinct place calling for laws and legal institutions of its own"; author Neal Stephenson envisaged this as a tiny island data haven in his science-fiction classic novel Cryptonomicon.[24]

Although there have been efforts to harmonize the legal environment (such as the US-EU Safe Harbor), providers like Amazon Web Services usually deal with international markets (typically the United States and European Union) by deploying local infrastructure and allowing customers to select their countries.[25] However, there are still concerns about security and privacy for individuals at various governmental levels (for example, the USA PATRIOT Act, the use of national security letters, and Title II of the Electronic Communications Privacy Act, the Stored Communications Act).

In March 2007, Dell applied to trademark the term "cloud computing" in the United States. It received a "Notice of Allowance" in July 2008, which was subsequently canceled on August 6, resulting in a formal rejection of the trademark application less than a week later.

In November 2007, the Free Software Foundation released the Affero General Public License (abbreviated as Affero GPL or AGPL), a version of GPLv3 designed to close a perceived legal loophole associated with free software designed to be run over a network, particularly software as a service. Under the AGPL, application service providers are required to release any changes they make to AGPL-licensed source code.

Cloud architecture[26] is the systems architecture of the software systems involved in the delivery of cloud computing (e.g. hardware, software) as designed by a cloud architect who typically works for a cloud integrator. It typically involves multiple cloud components communicating with each other over application programming interfaces (usually web services).[27]

This is very similar to the Unix philosophy of having multiple programs doing one thing well and working together over universal interfaces. Complexity is controlled and the resulting systems are more manageable than their monolithic counterparts.

Cloud architecture extends to the client where web browsers and/or software applications are used to access cloud applications.

Cloud storage architecture is loosely coupled where metadata operations are centralized enabling the data nodes to scale into the hundreds, each independently delivering data to applications or users.
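That loose coupling can be sketched in a few lines: a central metadata function decides which data node holds each object, while the nodes themselves serve data independently. The node names and hash-based placement below are illustrative assumptions, not any particular product's design:

```python
import hashlib

# Hypothetical data nodes; in a real system these could number
# in the hundreds, as the text notes.
DATA_NODES = ["node-a", "node-b", "node-c"]

def place(object_key):
    """Centralized metadata decision: map an object key to a data node."""
    digest = int(hashlib.sha256(object_key.encode()).hexdigest(), 16)
    return DATA_NODES[digest % len(DATA_NODES)]

# The metadata service records placements; clients then read
# directly from the assigned node, so the data path scales out.
metadata = {key: place(key) for key in ["invoice.pdf", "photo.jpg"]}
```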

A cloud application leverages The Cloud in its software architecture, often eliminating the need to install and run the application on the customer's own computer, thus reducing software maintenance, ongoing operations, and support. For example:

A cloud client is computer hardware and/or computer software which relies on The Cloud for application delivery, or which is specifically designed for delivery of cloud services, and which is in either case essentially useless without a Cloud.[33] For example:

Cloud infrastructure (e.g. Infrastructure as a service) is the delivery of computer infrastructure (typically a platform virtualization environment) as a service.[41] For example:

A cloud platform (e.g. Platform as a service) (the delivery of a computing platform and/or solution stack as a service) [42] facilitates deployment of applications without the cost and complexity of buying and managing the underlying hardware and software layers.[43] For example:

A cloud service (e.g. Web Service) is "software system[s] designed to support interoperable machine-to-machine interaction over a network"[44] which may be accessed by other cloud computing components, software (e.g. Software plus services) or end users directly.[45] For example:

Cloud storage is the delivery of data storage as a service (including database-like services), often billed on a utility computing basis (e.g. per gigabyte per month).[46] For example:

Traditional storage vendors have recently begun to offer their own flavor of cloud storage, sometimes in conjunction with their existing software products (e.g. Symantec's Online Storage for Backup Exec). Others focus on providing a new kind of back-end storage optimally designed for delivering cloud storage (EMC's Atmos), categorically known as Cloud Optimized Storage.

A cloud computing provider or cloud computing service provider owns and operates cloud computing systems to serve someone else. Usually this requires building and managing new data centers. Some organisations get some of the benefits of cloud computing by becoming "internal" cloud providers and servicing themselves, though they do not benefit from the same economies of scale and still have to engineer for peak loads. The barrier to entry is also significantly higher: capital expenditure is required, and billing and management create some overhead. However, significant operational efficiency and agility advantages can be achieved even by small organizations, and server consolidation and virtualization rollouts are already in progress.[47] Amazon.com was the first such provider, modernising its data centers which, like most computer networks, were using as little as 10% of their capacity at any one time just to leave room for occasional spikes. This allowed small, fast-moving groups to add new features faster and more easily, and Amazon went on to open its platform to outsiders as Amazon Web Services in 2002 on a utility computing basis.[20]

The companies listed in the Components section are providers.

A user is a consumer of cloud computing.[33] The privacy of users in cloud computing has become of increasing concern.[48][49] The rights of users are also an issue, which is being addressed via a community effort to create a bill of rights (currently in draft).[50][51]

A vendor sells products and services that facilitate the delivery, adoption and use of cloud computing.[52] For example:

A cloud standard is one of a number of existing (typically lightweight) open standards that have facilitated the growth of cloud computing, including:[57]


How telecom is shifting its strategy to support cloud computing – SiliconANGLE (blog)

Posted: at 2:27 am

Cloud computing has fundamentally expanded the realm of possibilities organizations can accomplish with technology. While a lot of focus has been placed on cloud technology and data architecture advancements, the underlying telecommunications infrastructure is also seeing a shift in strategies to support the latest trends in cloud computing.

Cisco Systems Inc., known for its hardware infrastructure deployments, is helping drive this shift. Ian Wells (pictured, left), distinguished engineer, cloud and platform services at Cisco Systems Inc., and Jerome Tollet (pictured, right), distinguished engineer, Chief Technology and Architecture Office at Cisco Systems, are two of the company's team members spearheading this initiative.

Wells and Tollet spoke with host Stu Miniman (@stu) and guest host John Troyer (@jtroyer) of theCUBE, SiliconANGLE Media's mobile live streaming studio, during OpenStack Summit in Boston, Massachusetts. They discussed their technical perspectives on virtualization and cloud computing. (*Disclosure below.)

Of all the advances in telecommunications infrastructure, the most important technology for advancing cloud computing is Network Function Virtualization, according to Tollet. "NFV is becoming a first-class citizen for this community. At the beginning, people were kind of ignoring NFV; it was all about cloud. Now it's becoming quite the opposite," he said.

NFV is the term used to describe the virtualization of functions that historically have run on physical hardware used for things like intrusion detection and routing. As the adoption rate for NFV rises, so does the demand for more features, which can create bottlenecks in development.

"On the networking side, it's always, 'I'd like more functionality.' You'll hear people talk about service chaining. MPLS [Multiprotocol Label Switching] comes up quite regularly, which is really integration with the rest of the service provider network," Wells said. "We have a ways to go to really address the kind of general-purpose model that would suit everyone."

Tollet also brought up a very interesting point about the redundancy and overhead associated with a completely virtualized system.

"Think in terms of two containers sitting on the same virtual compute node. Why do you need to create a packet? Why do you need to do crypto? Why do you need to do virtual LAN when the two applications are sitting on the same compute node?" Tollet said. "We have imported into the virtual world all of the concepts we have used in the physical world ... now I think we can do something a bit more efficient."

Watch the complete video interview below, and be sure to check out more of SiliconANGLE's and theCUBE's independent editorial coverage of OpenStack Summit 2017 Boston. (* Disclosure: Cisco Systems Inc. sponsored this OpenStack Summit segment on SiliconANGLE Media's theCUBE. Neither Cisco Systems nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)


Benefit-risk ‘tipping point’ for cloud computing now passed, says … – Out-Law.com

Posted: at 2:27 am

The Depository Trust & Clearing Corporation (DTCC), which already hosts some applications in the cloud, said cloud computing has now "moved past a tipping point", offering greater benefits and fewer risks than traditional outsourcing arrangements.

Financial services and technology law expert Luke Scanlon of Pinsent Masons described DTCC's move as a sign that the barriers that dissuade many financial firms from utilising cloud-based solutions are diminishing.

"The DTCC, after a period of testing and detailed analysis, have here highlighted that some of the traditional reasoning as to why cloud services present significant risk, such as concerns around security, is no longer valid," Scanlon said.

"In 2017 we are certainly seeing a maturing of the discussion and more and more of a focus on the few remaining regulatory sticking points to cloud adoption, together with the practical concerns around achieving the levels of availability necessary to operate the core systems of financial institutions and utilities, liability and exit arrangements," he said.

In a new white paper it has published, which contained its strategy to leverage the cloud, the DTCC explained why it will move more of its applications and services into the cloud.

"DTCC has been leveraging cloud services for almost five years and believes the cloud represents a viable alternative to corporate data centres," it said. "The maturation, expanded offerings and enormous scale of the technology, resolve the privacy and security challenges of cyber-threats, potential flash crash type market disruptions and the cost challenges facing many financial firms today."

"DTCC believes cloud computing has moved past a tipping point, prompting the firm to pursue a strategy of building a cloud ecosystem with partner vendors that support best practices and standards. DTCC is taking this step because it is confident that the security, scalability, resiliency, recoverability and cost of applications in the cloud are better than almost any private enterprise could achieve on its own," it said.

"DTCC also believes that business services, delivered by applications written to take advantage of the infinite resources, resiliency, and global reach of the cloud, have a significant advantage over legacy applications using traditional models in private data centres. We believe that gap will continue to widen over time," the firm said.

DTCC said it plans to work with regulators to ensure that its cloud-based operations are compliant with "the highest and strictest levels of recommended controls and best practices" it is subject to.

Earlier this year, seven main hurdles to banks' adoption of cloud-based services were highlighted in a joint report by Pinsent Masons and UK banking industry body the BBA.


Cloud Computing puts in work for Preakness before deluge – Daily Racing Form

Posted: at 2:27 am


Photo: Michael Amoruso. Cloud Computing ran second to J Boys Echo in the Grade 3 Gotham Stakes.

ELMONT, N.Y. - Before Mother Nature poured buckets of water on Long Island on Saturday morning, trainer Chad Brown was able to get in Cloud Computing's last major workout before next Saturday's 142nd Preakness Stakes at Pimlico.

With steady rain falling at Belmont Park shortly after 5:30 a.m. Saturday, Cloud Computing worked a half-mile in 48.56 seconds over the training track that would be considered fast. Working by himself, Cloud Computing went in splits of 12.43 seconds for the opening eighth, 24.47 for the quarter, and got his final quarter in 24.09 without too much encouragement from exercise rider Peter Roman. He galloped out five furlongs in 1:01.69.
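The fractions reported above are internally consistent: the two quarter-mile splits add up to the half-mile time.

```python
# Workout fractions from the report above, in seconds.
first_quarter = 24.47   # includes the 12.43 opening eighth
final_quarter = 24.09
half_mile = 48.56

# The two quarter splits should sum to the half-mile time.
assert abs((first_quarter + final_quarter) - half_mile) < 0.01
```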

"The horse breezed well," Brown said. "He went a good half, out five. I thought he did it real, real well. He's moving sound and strong. I'm real happy with this horse. If he comes out of it in good shape, we'll be on to Maryland."

Cloud Computing, a son of Maclean's Music owned by Seth Klarman and William Lawrence, has a win from three starts. That victory came going six furlongs over Aqueduct's inner track on Feb. 11. He wheeled back three weeks later in the Grade 3 Gotham and ran a respectable second behind J Boys Echo after chasing a fairly hot pace. Cloud Computing then had a wide trip when the inside part of Aqueduct's main track was the preferred spot as he finished third behind Irish War Cry and Battalion Runner in the Grade 2 Wood Memorial.

Cloud Computing did earn enough points to run in the Kentucky Derby, but Brown and his owners felt it was too much too soon for the horse. Cloud Computing needed a chip removed from a front ankle last summer, which is why he didn't run at age 2.

"I feel very comfortable that we gave him the six weeks from the Wood," Brown said. "I see a horse that's really doing well."

Javier Castellano will ride Cloud Computing in the Preakness. Brown said he anticipates shipping Cloud Computing to Baltimore on Tuesday.

Term of Art works

At Santa Anita on Saturday morning, Term of Art, one of the outsiders in the Preakness, worked six furlongs in 1:13.80 while outfitted in blinkers, which trainer Doug O'Neill said he will wear in the second leg of the Triple Crown.

Term of Art has scored both of his wins in blinkers, but he has not worn them in his last three starts, including a seventh-place finish in his last start, the Santa Anita Derby.

On Saturday, with exercise rider Amir Cedeno up, Term of Art worked by himself.

"He worked great," O'Neill said. "The track was demanding, safe but slow. I'm very happy. We know he's a longshot, but he's doing well."

O'Neill said Term of Art would fly to Baltimore from California on Tuesday. He will be the only horse O'Neill has at Pimlico next weekend.

O'Neill won the Preakness in 2012 with I'll Have Another, his first Kentucky Derby winner. Last year, he finished third in the Preakness with Derby winner Nyquist.

Always Dreaming, the Derby winner, has been at Pimlico since last Tuesday and on Saturday galloped 1 1/4 miles on a sloppy, sealed track. The wet track forced the postponement of a scheduled work for Royal Mo, who traveled with Always Dreaming to Pimlico on Tuesday. He could work Sunday or wait until Monday.

Gunnevera, seventh in the Derby, was scheduled to arrive at Pimlico on Saturday after a van ride from Churchill Downs.

additional reporting by Jay Privman


Achieving compliance in the cloud – CSO Online

Posted: at 2:27 am

More and more organizations are moving towards cloud technologies for scalability, cost reduction, and new service offerings. In this short article we will review cloud basics and look at auditing for compliance challenges in the cloud.

"Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model is composed of five essential characteristics, three service models, and four deployment models." - The NIST 800-145 Definition of Cloud Computing

Let's review the deployment models:

Public Cloud - Cloud computing services from vendors that can be accessed across the internet or a private network, using systems in one or more data centers, shared among multiple customers, with varying degrees of data privacy control.

Private Cloud - Computing architectures modeled after Public Clouds, yet built, managed, and used internally by an enterprise; uses a shared services model with variable usage of a common pool of virtualized computing resources. Data is controlled within the enterprise.

Hybrid Cloud - A mix of vendor cloud services, internal cloud computing architectures, and classic IT infrastructure, forming a hybrid model that uses the best-of-breed technologies to meet specific needs.

Community Cloud - The cloud infrastructure is shared by several organizations and supports a specific community that has shared concerns (for example, mission, objectives, security requirements, policy, and compliance considerations). It may be managed by the organizations or a third party, and may exist on-premise or off-premise.

Service Delivery Models:

Infrastructure as a Service (IaaS) - Delivers computer infrastructure, typically a platform virtualization environment as a service. Service is typically billed on a utility computing basis and amount of resources consumed.

Platform as a Service (PaaS) - Delivers a computing platform as a service. It facilitates deployment of applications while limiting or reducing the cost and complexity of buying and managing the underlying hardware and software layers.

Software as a Service (SaaS) - Delivers software as a service over the internet, avoiding the need to install and run the application on the customer's own computers and simplifying maintenance and support.

So now that we have reviewed the basics of deployment and service delivery, what does it mean to be compliant in the cloud versus on a traditional perimeter-based corporate network? We also have to consider the business sector and its compliance model, and sometimes these are mixed. For example, in healthcare it is HIPAA compliance we are trying to achieve, in the credit card retail environment it is PCI DSS, and in government it is FISMA or the NIST Cybersecurity Framework. Of course, healthcare organizations also accept credit cards, creating mixed compliance obligations.

It's important to know where responsibility lies when working in the cloud. As we move from IaaS to PaaS and finally to SaaS, the cloud vendor is responsible for more: in SaaS they deliver everything, while in IaaS they deliver the least, leaving the rest as your responsibility. The more they provide, the more control you give up.
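That shifting boundary can be written down as a simple table. The layer names below are a common illustration of the split, not an official taxonomy, and the exact boundary varies by provider:

```python
# Who manages each layer under the three service delivery models:
# (provider-managed layers, customer-managed layers).
RESPONSIBILITY = {
    "IaaS": (["hardware", "virtualization"],
             ["os", "runtime", "application", "data"]),
    "PaaS": (["hardware", "virtualization", "os", "runtime"],
             ["application", "data"]),
    "SaaS": (["hardware", "virtualization", "os", "runtime", "application"],
             ["data"]),
}

def customer_duties(model):
    """Return the layers the customer is still responsible for."""
    return RESPONSIBILITY[model][1]

# Moving from IaaS to SaaS, the customer's list shrinks: the more
# the vendor provides, the more control you give up.
print(customer_duties("IaaS"))  # ['os', 'runtime', 'application', 'data']
print(customer_duties("SaaS"))  # ['data']
```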

Some real challenges in working with a cloud environment include understanding its scope, determining whether your current risk assessment approach works in the cloud, and establishing how audit trails are maintained there.

The key is to take a risk-based approach and recognize that cloud-based risk is different. For example, the concept of a perimeter doesn't make sense anymore in a multi-tenant environment. For service delivery risk, we must evaluate virtualization, SaaS, PaaS, and IaaS risk.

Then we need to look at deployment risk, business model risk and security risk just to name a few.

What we really need now is a map; this is getting confusing, right? Deloitte's Cloud Computing Risk Intelligence Map is very helpful.

Take a look at data management in the cloud risk map. Notice that for data usage we have a lack of clear ownership of cloud-generated data, and unauthorized access or inappropriate use of sensitive data, personal data, or intellectual property. These are real-world issues with cloud computing because you don't have full control, especially in a SaaS environment. At the same time, you must be able to apply the deployment and service delivery models to your actual compliance framework, such as HIPAA, PCI DSS, or FISMA.

SOC 1 is for service organization assessments that impact financial reporting, so let's look at SOC 2 and SOC 3. SOC 2 is geared towards technology companies and allows the incorporation of other frameworks into the SOC 2 report. A SOC 2 assessment uses the Trust Services Principles (TSP) framework from the American Institute of Certified Public Accountants (AICPA) to evaluate a service organization's internal controls against the prescribed set of Common Criteria found in the TSPs.

SOC 2 assessments cover a wide range of controls, including operational, technical and information security controls. SOC 3, also known as SysTrust/WebTrust or Trust Services, is broader based and also comes from the AICPA. Here we are really talking about e-commerce compliance: SOC 3 covers e-commerce web servers and the systems that interconnect and support e-commerce business platforms.

Trust Services are a set of professional attestation and advisory services based on a core set of principles and criteria that address the risks and opportunities of IT-enabled systems and privacy programs. The principles and related criteria used by practitioners in the performance of Trust Services engagements are security, availability, processing integrity, confidentiality and privacy.
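
As a rough illustration of how those principles get used in practice (the principle names are from the AICPA TSPs, but the control names mapped to them here are invented placeholders, not prescribed criteria), a readiness gap check might look like:

```python
# The five AICPA Trust Services principles; the example controls
# mapped to each are hypothetical placeholders, not prescribed criteria.
TRUST_PRINCIPLES = ["security", "availability", "processing integrity",
                    "confidentiality", "privacy"]

implemented_controls = {
    "security": ["firewall policy", "access reviews"],
    "availability": ["DR plan"],
    "confidentiality": ["data classification"],
}

def readiness_gaps(controls: dict) -> list[str]:
    """Principles with no documented controls yet."""
    return [p for p in TRUST_PRINCIPLES if not controls.get(p)]

print(readiness_gaps(implemented_controls))  # principles still uncovered
```

An auditor would of course evaluate controls against the detailed Common Criteria, not a checklist this coarse; the sketch only shows the shape of the mapping.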

In cloud environments, multiple parties' data and services can exist on a single physical platform running virtual services for its customers. This creates several problems for security, compliance and audit, including:

Limited ability to control data and applications

Limited knowledge and no visibility into the degree of segmentation and security controls between those collocated virtual resources

Audit and control of data in the public cloud with no visibility into the provider's systems and controls. Even in a privately managed private cloud, multi-tenancy exists at many layers, including storage, application, database, operating platform and hypervisor-based infrastructure. In other words, shared hosts, data centers and networks can exist between the same or different organizations or internal business units. As such, it is critical that network segmentation is implemented securely, with the ability to monitor for anomalies across virtual network boundaries.
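
A toy sketch of that monitoring idea (the VM names, tenant labels and flow records below are invented for illustration; real telemetry would come from the hypervisor or virtual switch): flag any observed flow that crosses a tenant boundary.

```python
# Hypothetical mapping of virtual machines to tenants.
vm_tenant = {"vm-a1": "tenant-A", "vm-a2": "tenant-A", "vm-b1": "tenant-B"}

# Hypothetical observed flows (source VM, destination VM).
flows = [("vm-a1", "vm-a2"), ("vm-a1", "vm-b1")]

def cross_tenant_flows(flows, vm_tenant):
    """Return flows whose endpoints belong to different tenants."""
    return [(src, dst) for src, dst in flows
            if vm_tenant[src] != vm_tenant[dst]]

alerts = cross_tenant_flows(flows, vm_tenant)
print(alerts)  # any flow here crosses the expected isolation boundary
```

Production tools verify isolation at far larger scale and across many layers, but the core question is the same: did traffic cross a boundary the segmentation design says it never should?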

The auditee, in this case the cloud provider or consumer, is required to produce compliance reports to prove that its security measures are protecting its assets from compromise. Several open source and commercial tools on the market, including security information and event management (SIEM) and GRC tools, can generate compliance reports on a periodic and/or on-demand basis.
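
To make the reporting idea concrete, here is a minimal sketch of an on-demand compliance summary built from event logs (the event format and the control IDs such as "AC-2" are invented for illustration; real SIEM and GRC tools consume their own log schemas):

```python
# Minimal sketch of an on-demand compliance report from event logs.
from collections import Counter

# Hypothetical control-check events as a SIEM might collect them.
events = [
    {"time": "2017-05-10T08:00", "control": "AC-2", "result": "pass"},
    {"time": "2017-05-10T09:30", "control": "AC-2", "result": "fail"},
    {"time": "2017-05-11T10:00", "control": "AU-6", "result": "pass"},
]

def compliance_report(events):
    """Summarize pass/fail counts per control, SIEM-report style."""
    summary = {}
    for e in events:
        counts = summary.setdefault(e["control"], Counter())
        counts[e["result"]] += 1
    return {ctrl: dict(cnt) for ctrl, cnt in summary.items()}

print(compliance_report(events))
```

The value of such a report to an auditor is less the counts themselves than the fact that it can be regenerated on demand from raw evidence.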

In cloud environments, it's important to know what differs between an onsite local computing environment and a cloud service provider, who holds which responsibility, and to capture all of this in a service-level agreement and system security plan. Nothing can be assumed. The fact that you are sharing a cloud environment to gain growth and on-demand scalability means you must understand the issues that come with sharing.

Just as renting out a room in your house changes your security and privacy, so too does sharing cloud computing resources. The NIST and Cloud Security Alliance standards are essential for managing the ever-changing and complex cloud environment. In both local and cloud environments we are managing risk; in the cloud it is simply more complex, shared and dynamic.

For further reading on cloud virtual machine issues, I recommend a paper titled "TenantGuard: Scalable Runtime Verification of Cloud-Wide, VM-Level Network Isolation."

Additional resources: NIST SP 800-53, NIST SP 800-144, NIST SP 800-30, the Deloitte Cloud Computing Risk Intelligence Map, ZCloud, the Cloud Security Alliance Cloud Controls Matrix, the ISACA Cloud Computing Audit Program, and FedRAMP (the Federal Risk and Authorization Management Program).

References: SANS, Deloitte


More:

Achieving compliance in the cloud - CSO Online


Boston schools CIO Mark Racine takes hybrid approach to cloud computing – EdScoop News

Posted: at 2:27 am

The district is also developing a single sign-on platform to better integrate applications and data.

With a district of nearly 60,000 students and a mix of traditional, charter and pilot schools, CIO Mark Racine is always looking for ways to make educational technology go farther for the faculty and families of Boston Public Schools.

Like many CIOs, Racine has his eye on cloud computing as the future of data management.

But with limited funding preventing an immediate full-on move to the cloud, Racine and his infrastructure team are still banking on a hybrid approach, he said in a recent interview with EdScoop. The approach provides scaling opportunities to relieve stress on the network, especially at certain high-traffic points during the school year.

He likened it to "the 1-800-Flowers approach, the way flower companies will need to scale up for Valentine's Day, and then come back inside," Racine said.

"We would move to the cloud tomorrow if we could," he said.


Among other edtech initiatives, Racine said he and his 50-member IT team have also invested heavily in single sign-on technology, geared towards increasing connectivity across the district.

The technology is also aimed at building toward greater data integration. "The platform will take authentication to all kinds of different learning apps, and allow us to take our Ed-Fi database and scale that data to all educational platforms as well," he said.

"When an educational technology platform is working well in a classroom or school, we want to be able to bring that up to 130 buildings," he said.

Another big initiative underway for Boston Public Schools, according to Racine, is finding the best way to support the district's school choice program.

Boston schools offer parents the flexibility to walk into a family resource center, explore all the schools that are available to them, learn about the educational programming that's in that building, and then be able to make a choice on where they want to send their child.

The ultimate goal of this, as Racine says, is to "eliminate the amount of lost-learning time" through integrating technology into school choice programs.

Ryan Johnston contributed to this report.

Continue reading here:

Boston schools CIO Mark Racine takes hybrid approach to cloud computing - EdScoop News


IBM Announces The Defense Calculator And A Cloud Computing Service – Forbes

Posted: May 14, 2017 at 6:21 pm


This week's milestones in the history of technology include the world's first analog computer, experimenting with the Web via TV in pre-Web days, the birth of the ITU, and IBM's first commercially available scientific computer and first computing ...

Read the rest here:

IBM Announces The Defense Calculator And A Cloud Computing Service - Forbes


Cloud Computing, Term of Art Complete Preakness Works – BloodHorse.com

Posted: at 6:21 pm

Klaravich Stables and William Lawrence's Cloud Computing tuned up for his expected run in the May 20 Preakness Stakes (G1) with a half-mile breeze in :48.85 over the Belmont Park training track May 13.

Under exercise rider Peter Roman, Cloud Computing beat the worst of the rain by coming out just after the training track opened at 5:30 a.m. The son of Maclean's Music posted the second-fastest drill of 32 moves at the distance.

"He breezed very well, galloped out super, and came back good so far," said trainer Chad Brown. "That's his last piece of work and if he comes out of it well he'll be on to Baltimore on Tuesday."

Unraced as a juvenile, Cloud Computing has made three starts this season, with his most recent outing being a third-place finish in the April 8 Wood Memorial presented by NYRA Bets (G2). The dark bay colt previously ran second to J Boys Echo in the March 4 Gotham Stakes (G3) after he broke his maiden at first asking going six furlongs at Aqueduct Racetrack Feb. 11.

On the opposite coast, fellow Preakness hopeful Term of Art also completed his last serious work before shipping to Baltimore. He worked six furlongs in 1:13 4/5 at Santa Anita Park Saturday.

Calumet Farm's Term of Art is slated to leave Santa Anita May 16 for the middle leg of the Triple Crown. The Doug O'Neill-trained son of Tiznow was most recently seventh in the Santa Anita Derby (G1) but captured the Cecil B. DeMille Stakes (G3) at Del Mar last November.

Read the original post:

Cloud Computing, Term of Art Complete Preakness Works - BloodHorse.com


Trump signs cybersecurity executive order, mandating a move to cloud computing – GeekWire

Posted: at 6:21 pm

The White House plan to address cybersecurity is taking shape. (White House / Pho.to / GeekWire Graphic)

President Donald Trump today signed a long-awaited executive order aimed at beefing up cybersecurity at federal government agencies with a shift of computer capabilities to the cloud as a key part of the strategy.

"We've got to move to the cloud and try to protect ourselves instead of fracturing our security posture," Homeland Security Adviser Tom Bossert told reporters during a White House briefing.

The executive order gives the lead role in managing the cloud shift to the director of the White House's newly established American Technology Council, which is due to meet for the first time next month.

Although the council's full roster of members has not yet been announced, the director is said to be Chris Liddell, who formerly served as chief financial officer at Microsoft and General Motors.

Some agencies already have begun shifting data resources to cloud computing services, including Amazon Web Services and Microsoft Azure. Carson Sweet, CTO and co-founder of San Francisco-based CloudPassage, said the emphasis on the cloud makes sense and builds on a trend that began during the Obama administration.

"The question now will be how well the administration does with identifying and eliminating the obstructions agencies are facing as they consider adopting cloud / shared services," Sweet told GeekWire in an email.

The executive order also calls upon all federal agencies to implement the NIST Cybersecurity Framework, a set of best practices developed by the National Institute of Standards and Technology for the information technology industry. And it calls on Cabinet secretaries to develop plans to protect critical infrastructure, ranging from utilities to the health care system to the financial system.

Bossert said the measures build on the efforts made by the Obama administration. "A lot of progress was made in the last administration, but not nearly enough," he said.

As an example of past failures, Bossert pointed to 2015's data breach at the Office of Personnel Management, which exposed millions of sensitive employment records to hackers. He said such records are the "crown jewels" of the government's data assets and require enhanced protection.

Bossert noted that Trump's budget blueprint sets aside $1.5 billion for cybersecurity.

Back in January, Trump vowed to come up with a major report on hacking defense within 90 days, but some observers said the executive order didn't meet the target.

Drew Mitnick, policy counsel at Access Now, said in a statement that the measures will serve as "incremental changes to existing policies," while the Trump administration "has otherwise either ignored or undermined pressing digital security threats internet users face."

"The action does not touch several critical areas, like the insecurity of Internet of Things devices, data breaches, or vulnerability disclosure," Mitnick said.

During the briefing, one reporter asked whether shifting the federal government's data to the cloud might heighten rather than reduce cybersecurity risks. Bossert said it's better to centralize risk, rather than having 190 federal agencies come up with separate measures.

"I don't think that's a wise risk," Bossert said.

Another reporter asked whether concerns over Russia's online meddling with last year's presidential campaign had any effect on the executive order.

"The Russians are not our only adversary," Bossert replied. "The Russians, the Chinese, the Iranians, other nation-states are motivated to use cybersecurity and cyber tools to attack our people and our governments and their data. And that's something we can no longer abide."

He declined to say what type of cyber attack might constitute an act of war, other than to say that "if somebody does something to the United States of America that we can't tolerate, we will act."

Trump was reportedly on the verge of signing an executive order on cybersecurity back in January, but held off. Bossert said there was nothing unusual behind the delay. He noted that between then and now, the White House had the chance to lay out a budget blueprint and announce the formation of the technology council, two developments that set the stage for the executive order.

Bossert also acknowledged that some tech companies expressed concerns that they'd be compelled to take actions to head off distributed denial-of-service attacks, also known as botnet attacks. He emphasized today that the anti-botnet initiative would be voluntary.

The executive order calls on Commerce Secretary Wilbur Ross and Homeland Security Secretary John Kelly to file a preliminary report on the anti-botnet campaign within 240 days.

Bossert declined to confirm a claim that federal computers are hit by tens of thousands of hacking attempts daily, but he acknowledged that attempted data break-ins and successful intrusions are on the rise.

"The trend line is going in the wrong direction," he told reporters.

Correction for 1:50 p.m. PT May 13: An earlier version of this report incorrectly referred to Chris Liddell as the former chief technology officer of Microsoft and GM. He has served as chief financial officer for those and other companies.

Continue reading here:

Trump signs cybersecurity executive order, mandating a move to cloud computing - GeekWire

